Sample records for carlo code egsnrc

  1. Monte Carlo dose calculations of beta-emitting sources for intravascular brachytherapy: a comparison between EGS4, EGSnrc, and MCNP.

    PubMed

    Wang, R; Li, X A

    2001-02-01

    The dose parameters for the beta-particle emitting 90Sr/90Y source for intravascular brachytherapy (IVBT) have been calculated by different investigators. At larger distances from the source, noticeable differences are seen in these parameters calculated using different Monte Carlo codes. The purpose of this work is to quantify as well as to understand these differences. We have compared a series of calculations using the EGS4, EGSnrc, and MCNP Monte Carlo codes. Data calculated and compared include the depth dose curve for a broad parallel beam of electrons, and radial dose distributions for point electron sources (monoenergetic or polyenergetic) and for a real 90Sr/90Y source. For the 90Sr/90Y source, the doses at the reference position (2 mm radial distance) calculated by the three codes agree within 2%. However, the differences between the doses calculated by the three codes can be over 20% in the radial distance range of interest in IVBT. The difference increases with radial distance from the source, and reaches 30% at the tail of the dose curve. These differences may be partially attributed to the different multiple scattering theories and Monte Carlo models for electron transport adopted in these three codes. Doses calculated by the EGSnrc code are more accurate than those by EGS4. The two calculations agree within 5% for radial distances <6 mm.

  2. Monte Carlo dose calculation using a cell processor based PlayStation 3 system

    NASA Astrophysics Data System (ADS)

    Chow, James C. L.; Lam, Phil; Jaffray, David A.

    2012-02-01

    This study investigates the performance of the EGSnrc computer code coupled with Cell-based hardware in Monte Carlo simulation of radiation dose in radiotherapy. Performance evaluations of two processor-intensive functions, namely HOWNEAR and RANMAR_GET in the EGSnrc code, were carried out based on the 20-80 rule (Pareto principle). The execution speeds of the two functions were measured with the profiler gprof, which reports the number of executions and the total time spent in each function. A testing architecture designed for the Cell processor was implemented in the evaluation using a PlayStation 3 (PS3) system. The evaluation results show that the algorithms examined are readily parallelizable on the Cell platform, provided that an architectural change of the EGSnrc code is made. However, as the EGSnrc performance was limited by the PowerPC Processing Element in the PS3, a PC coupled with graphics processing units (GPGPU) may provide a more viable avenue for acceleration.
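
    The profiling step described above can be sketched outside EGSnrc itself. Below is a minimal Python sketch (not from the paper) of applying the 20-80 rule to a gprof-style flat profile to shortlist hotspot functions such as HOWNEAR and RANMAR_GET; the column layout and the sample numbers are assumptions about a typical gprof text report.

```python
# pareto_hotspots.py -- pick the functions that account for ~80% of self time
# from a gprof flat profile. The column layout ("% time", "cumulative seconds",
# "self seconds", "calls", ..., "name") is the usual gprof text format, but
# treat it as an assumption and adapt the parsing to your own report.

def pareto_hotspots(flat_profile_text, cutoff=80.0):
    hotspots, total = [], 0.0
    for line in flat_profile_text.splitlines():
        fields = line.split()
        # a data row starts with a percentage, e.g. "35.11  0.34  0.34 ... hownear_"
        if not fields or not fields[0].replace('.', '', 1).isdigit():
            continue
        percent, name = float(fields[0]), fields[-1]
        hotspots.append((name, percent))
        total += percent
        if total >= cutoff:          # 20-80 rule: stop once ~80% is covered
            break
    return hotspots

if __name__ == "__main__":
    sample = """\
 35.11      0.34     0.34   212340     0.00     0.00  hownear_
 28.40      0.62     0.28   512300     0.00     0.00  ranmar_get_
 10.02      0.72     0.10    90210     0.00     0.00  msdist_
"""
    for name, pct in pareto_hotspots(sample):
        print(f"{name:15s} {pct:5.1f}% of self time")
```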

  3. Comment on ‘egs_brachy: a versatile and fast Monte Carlo code for brachytherapy’

    NASA Astrophysics Data System (ADS)

    Yegin, Gultekin

    2018-02-01

    In a recent paper, Chamberland et al (2016 Phys. Med. Biol. 61 8214) develop a new Monte Carlo code called egs_brachy for brachytherapy treatments. It is based on EGSnrc and written in the C++ programming language. In order to benchmark the egs_brachy code, the authors use it in various test case scenarios in which complex geometry conditions exist. Another EGSnrc-based brachytherapy dose calculation engine, BrachyDose, is used for dose comparisons. The authors fail to prove that egs_brachy can produce reasonable dose values for brachytherapy sources in a given medium. The dose comparisons in the paper are erroneous and misleading. egs_brachy should not be used in any further research studies unless and until all the potential bugs are fixed in the code.

  4. Monte Carlo modelling the dosimetric effects of electrode material on diamond detectors.

    PubMed

    Baluti, Florentina; Deloar, Hossain M; Lansley, Stuart P; Meyer, Juergen

    2015-03-01

    Diamond detectors for radiation dosimetry were modelled using the EGSnrc Monte Carlo code to investigate the influence of electrode material and detector orientation on the absorbed dose. The small dimensions of the electrode/diamond/electrode detector structure required very thin voxels and the use of non-standard DOSXYZnrc Monte Carlo model parameters. The interface phenomena were investigated by simulating a 6 MV beam and detectors with different electrode materials, namely Al, Ag, Cu and Au, with thicknesses of 0.1 µm for the electrodes and 0.1 mm for the diamond, in both perpendicular and parallel detector orientations with respect to the incident beam. The smallest perturbations were observed for the parallel detector orientation and Al electrodes (Z = 13). In summary, the EGSnrc Monte Carlo code is well suited for modelling small detector geometries. The Monte Carlo model developed is a useful tool to investigate the dosimetric effects caused by different electrode materials. To minimise perturbations caused by the detector electrodes, it is recommended that the electrodes be made from a low-atomic-number material and placed parallel to the beam direction.

  5. Monte Carlo dose calculations in homogeneous media and at interfaces: a comparison between GEPTS, EGSnrc, MCNP, and measurements.

    PubMed

    Chibani, Omar; Li, X Allen

    2002-05-01

    Three Monte Carlo photon/electron transport codes (GEPTS, EGSnrc, and MCNP) are benchmarked against dose measurements in homogeneous (both low- and high-Z) media as well as at interfaces. A brief overview of the physical models used by each code for photon and electron (positron) transport is given. Absolute calorimetric dose measurements for 0.5 and 1 MeV electron beams incident on homogeneous and multilayer media are compared with the predictions of the three codes. Comparison with dose measurements in two-layer media exposed to a 60Co gamma source is also performed. In addition, comparisons between the codes (including the EGS4 code) are done for (a) 0.05 to 10 MeV electron beams and positron point sources in lead, (b) high-energy photons (10 and 20 MeV) irradiating a multilayer phantom (water/steel/air), and (c) simulation of a 90Sr/90Y brachytherapy source. A good agreement is observed between the calorimetric electron dose measurements and the predictions of GEPTS and EGSnrc in both homogeneous and multilayer media. MCNP outputs are found to be dependent on the energy-indexing method (Default/ITS style). This dependence is significant in homogeneous media as well as at interfaces. MCNP(ITS) fits the experimental data more closely than MCNP(DEF), except for the case of Be. At low energy (0.05 and 0.1 MeV), MCNP(ITS) dose distributions in lead show higher maxima in comparison with GEPTS and EGSnrc. EGS4 produces too-penetrating electron dose distributions in high-Z media, especially at low energy (<0.1 MeV). For positrons, differences between GEPTS and EGSnrc are observed in lead because GEPTS distinguishes positrons from electrons in both its elastic multiple scattering and bremsstrahlung emission models. For the 60Co source, quite good agreement between calculations and measurements is observed, within the experimental uncertainty. For the other cases (10 and 20 MeV photon sources and the 90Sr/90Y beta source), a good agreement is found between the three codes. In conclusion, differences between GEPTS and EGSnrc results are found to be very small for almost all media and energies studied. MCNP results depend significantly on the electron energy-indexing method.

  6. Consistency evaluation between EGSnrc and Geant4 charged particle transport in an equilibrium magnetic field.

    PubMed

    Yang, Y M; Bednarz, B

    2013-02-21

    Following the proposal by several groups to integrate magnetic resonance imaging (MRI) with radiation therapy, much attention has been afforded to examining the impact of strong (on the order of a Tesla) transverse magnetic fields on photon dose distributions. The effect of the magnetic field on dose distributions must be considered in order to take full advantage of the benefits of real-time intra-fraction imaging. In this investigation, we compared the handling of particle transport in magnetic fields between two Monte Carlo codes, EGSnrc and Geant4, to analyze various aspects of their electromagnetic transport algorithms; both codes are well-benchmarked for medical physics applications in the absence of magnetic fields. A water-air-water slab phantom and a water-lung-water slab phantom were used to highlight dose perturbations near high- and low-density interfaces. We have implemented a method of calculating the Lorentz force in EGSnrc based on theoretical models in literature, and show very good consistency between the two Monte Carlo codes. This investigation further demonstrates the importance of accurate dosimetry for MRI-guided radiation therapy (MRIgRT), and facilitates the integration of a ViewRay MRIgRT system in the University of Wisconsin-Madison's Radiation Oncology Department.

  7. Consistency evaluation between EGSnrc and Geant4 charged particle transport in an equilibrium magnetic field

    NASA Astrophysics Data System (ADS)

    Yang, Y. M.; Bednarz, B.

    2013-02-01

    Following the proposal by several groups to integrate magnetic resonance imaging (MRI) with radiation therapy, much attention has been afforded to examining the impact of strong (on the order of a Tesla) transverse magnetic fields on photon dose distributions. The effect of the magnetic field on dose distributions must be considered in order to take full advantage of the benefits of real-time intra-fraction imaging. In this investigation, we compared the handling of particle transport in magnetic fields between two Monte Carlo codes, EGSnrc and Geant4, to analyze various aspects of their electromagnetic transport algorithms; both codes are well-benchmarked for medical physics applications in the absence of magnetic fields. A water-air-water slab phantom and a water-lung-water slab phantom were used to highlight dose perturbations near high- and low-density interfaces. We have implemented a method of calculating the Lorentz force in EGSnrc based on theoretical models in literature, and show very good consistency between the two Monte Carlo codes. This investigation further demonstrates the importance of accurate dosimetry for MRI-guided radiation therapy (MRIgRT), and facilitates the integration of a ViewRay MRIgRT system in the University of Wisconsin-Madison's Radiation Oncology Department.

  8. Comparison of normal tissue dose calculation methods for epidemiological studies of radiotherapy patients.

    PubMed

    Mille, Matthew M; Jung, Jae Won; Lee, Choonik; Kuzmin, Gleb A; Lee, Choonsik

    2018-06-01

    Radiation dosimetry is an essential input for epidemiological studies of radiotherapy patients aimed at quantifying the dose-response relationship of late-term morbidity and mortality. Individualised organ dose must be estimated for all tissues of interest located in-field, near-field, or out-of-field. Whereas conventional measurement approaches are limited to points in water or anthropomorphic phantoms, computational approaches using patient images or human phantoms offer greater flexibility and can provide more detailed three-dimensional dose information. In the current study, we systematically compared four different dose calculation algorithms so that dosimetrists and epidemiologists can better understand the advantages and limitations of the various approaches at their disposal. The four dose calculation algorithms considered were as follows: the (1) Analytical Anisotropic Algorithm (AAA) and (2) Acuros XB algorithm (Acuros XB), as implemented in the Eclipse treatment planning system (TPS); (3) a Monte Carlo radiation transport code, EGSnrc; and (4) an accelerated Monte Carlo code, the x-ray Voxel Monte Carlo (XVMC). The four algorithms were compared in terms of their accuracy and appropriateness in the context of dose reconstruction for epidemiological investigations. Accuracy in peripheral dose was evaluated first by benchmarking the calculated dose profiles against measurements in a homogeneous water phantom. Additional simulations in a heterogeneous cylinder phantom evaluated the performance of the algorithms in the presence of tissue heterogeneity. In general, we found that the algorithms contained within the commercial TPS (AAA and Acuros XB) were fast and accurate in-field or near-field, but not acceptable out-of-field. Therefore, the TPS is best suited for epidemiological studies involving large cohorts and where the organs of interest are located in-field or partially in-field. The EGSnrc and XVMC codes showed excellent agreement with measurements both in-field and out-of-field. The EGSnrc code was the most accurate dosimetry approach, but was too slow to be used for large-scale epidemiological cohorts. The XVMC code showed similar accuracy to EGSnrc, but was significantly faster, and thus epidemiological applications seem feasible, especially when the organs of interest reside far away from the field edge.

  9. WE-DE-201-05: Evaluation of a Windowless Extrapolation Chamber Design and Monte Carlo Based Corrections for the Calibration of Ophthalmic Applicators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hansen, J; Culberson, W; DeWerd, L

    Purpose: To test the validity of a windowless extrapolation chamber used to measure surface dose rate from planar ophthalmic applicators and to compare different Monte Carlo based codes for deriving correction factors. Methods: Dose rate measurements were performed using a windowless, planar extrapolation chamber with a 90Sr/90Y Tracerlab RA-1 ophthalmic applicator previously calibrated at the National Institute of Standards and Technology (NIST). Capacitance measurements were performed to estimate the initial air gap width between the source face and collecting electrode. Current was measured as a function of air gap, and Bragg-Gray cavity theory was used to calculate the absorbed dose rate to water. To determine correction factors for backscatter, divergence, and attenuation from the Mylar entrance window found in the NIST extrapolation chamber, both EGSnrc Monte Carlo user code and Monte Carlo N-Particle Transport Code (MCNP) were utilized. Simulation results were compared with experimental current readings from the windowless extrapolation chamber as a function of air gap. Additionally, measured dose rate values were compared with the expected result from the NIST source calibration to test the validity of the windowless chamber design. Results: Better agreement was seen between EGSnrc simulated dose results and experimental current readings at very small air gaps (<100 µm) for the windowless extrapolation chamber, while MCNP results demonstrated divergence at these small gap widths. Three separate dose rate measurements were performed with the RA-1 applicator. The average observed difference from the expected result based on the NIST calibration was −1.88% with a statistical standard deviation of 0.39% (k=1). Conclusion: EGSnrc user code will be used during future work to derive correction factors for extrapolation chamber measurements. Additionally, experiment results suggest that an entrance window is not needed in order for an extrapolation chamber to provide accurate dose rate measurements for a planar ophthalmic applicator.
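
    For orientation, the Bragg-Gray evaluation mentioned in the Methods can be sketched as follows: the dose rate to water is obtained from the slope of ionization current versus air-gap width, together with the collecting area, the air density, W/e, and a water-to-air stopping-power ratio. The sketch below is a hedged illustration with placeholder numbers; it is not the NIST or UW analysis code.

```python
# Minimal sketch of the extrapolation-chamber evaluation implied by
# Bragg-Gray cavity theory: the dose rate to water follows from the slope of
# ionization current versus air-gap width.  All numbers below are
# placeholders, not values from the abstract.
import numpy as np

W_OVER_E = 33.97          # J/C, mean energy per ion pair in dry air
RHO_AIR  = 1.196e-3       # g/cm^3 at reference conditions (assumed)
SW_AIR   = 1.12           # water-to-air mass stopping-power ratio (assumed)

def dose_rate_to_water(gaps_cm, currents_A, area_cm2):
    """Gy/s from a linear fit of current vs. air-gap width (Bragg-Gray)."""
    slope, _ = np.polyfit(gaps_cm, currents_A, 1)    # dI/dL in A/cm
    mass_per_unit_gap = RHO_AIR * 1e-3 * area_cm2    # kg of air per cm of gap
    return W_OVER_E * SW_AIR * slope / mass_per_unit_gap

# illustrative readings: current grows roughly linearly with the gap
gaps = np.array([0.01, 0.02, 0.03, 0.05])              # cm
currents = np.array([0.52, 1.01, 1.55, 2.54]) * 1e-12  # A
print(f"{dose_rate_to_water(gaps, currents, area_cm2=0.7854):.3e} Gy/s")
```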

  10. Spiral computed tomography phase-space source model in the BEAMnrc/EGSnrc Monte Carlo system: implementation and validation.

    PubMed

    Kim, Sangroh; Yoshizumi, Terry T; Yin, Fang-Fang; Chetty, Indrin J

    2013-04-21

    Currently, the BEAMnrc/EGSnrc Monte Carlo (MC) system does not provide a spiral CT source model for the simulation of spiral CT scanning. We developed and validated a spiral CT phase-space source model in the BEAMnrc/EGSnrc system. The spiral phase-space source model was implemented in the DOSXYZnrc user code of the BEAMnrc/EGSnrc system by analyzing the geometry of the spiral CT scan: scan range, initial angle, rotational direction, pitch, slice thickness, etc. Table movement was simulated by changing the coordinates of the isocenter as a function of beam angle. Some parameters such as pitch, slice thickness and translation per rotation were also incorporated into the model to make the new phase-space source model, designed specifically for spiral CT scan simulations. The source model was hard-coded by modifying the 'ISource = 8: Phase-Space Source Incident from Multiple Directions' in the srcxyznrc.mortran and dosxyznrc.mortran files in the DOSXYZnrc user code. In order to verify the implementation, spiral CT scans were simulated in a CT dose index phantom using the validated x-ray tube model of a commercial CT simulator for both the original multi-direction source (ISOURCE = 8) and the new phase-space source model in the DOSXYZnrc system. Then the acquired 2D and 3D dose distributions were analyzed with respect to the input parameters for various pitch values. In addition, surface-dose profiles were also measured for a patient CT scan protocol using radiochromic film and were compared with the MC simulations. The new phase-space source model was found to simulate spiral CT scanning accurately in a single simulation run. It also produced a dose distribution equivalent to that of the ISOURCE = 8 model for the same CT scan parameters. The MC-simulated surface profiles were well matched to the film measurements, overall within 10%. The new spiral CT phase-space source model was implemented in the BEAMnrc/EGSnrc system. This work will be beneficial in estimating the spiral CT scan dose in the BEAMnrc/EGSnrc system.
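
    The table-movement bookkeeping described above (isocenter coordinates advanced with gantry angle, with translation per rotation set by pitch and collimation) can be illustrated with a short hedged sketch; the function and parameter names below are illustrative and are not DOSXYZnrc input variables.

```python
# Hedged sketch of the table-movement bookkeeping described in the abstract:
# the isocenter is shifted along z as a function of the accumulated gantry
# angle, with translation per rotation = pitch * nominal beam collimation.
# Variable names and defaults are illustrative, not DOSXYZnrc inputs.

def spiral_isocenters(z_start_cm, scan_length_cm, pitch, collimation_cm,
                      angular_step_deg=2.0, initial_angle_deg=0.0,
                      direction=+1):
    """Yield (gantry_angle_deg, isocenter_z_cm) pairs for one spiral scan."""
    translation_per_rotation = pitch * collimation_cm
    n_rotations = scan_length_cm / translation_per_rotation
    n_steps = int(round(n_rotations * 360.0 / angular_step_deg))
    for i in range(n_steps + 1):
        angle = (initial_angle_deg + direction * i * angular_step_deg) % 360.0
        z = z_start_cm + (i * angular_step_deg / 360.0) * translation_per_rotation
        yield angle, z

# e.g. 10 cm scan range, pitch 1.0, 1 cm collimation -> 10 rotations
for angle, z in list(spiral_isocenters(0.0, 10.0, 1.0, 1.0))[:5]:
    print(f"gantry {angle:6.1f} deg  isocenter z = {z:5.2f} cm")
```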

  11. Spiral computed tomography phase-space source model in the BEAMnrc/EGSnrc Monte Carlo system: implementation and validation

    NASA Astrophysics Data System (ADS)

    Kim, Sangroh; Yoshizumi, Terry T.; Yin, Fang-Fang; Chetty, Indrin J.

    2013-04-01

    Currently, the BEAMnrc/EGSnrc Monte Carlo (MC) system does not provide a spiral CT source model for the simulation of spiral CT scanning. We developed and validated a spiral CT phase-space source model in the BEAMnrc/EGSnrc system. The spiral phase-space source model was implemented in the DOSXYZnrc user code of the BEAMnrc/EGSnrc system by analyzing the geometry of spiral CT scan—scan range, initial angle, rotational direction, pitch, slice thickness, etc. Table movement was simulated by changing the coordinates of the isocenter as a function of beam angles. Some parameters such as pitch, slice thickness and translation per rotation were also incorporated into the model to make the new phase-space source model, designed specifically for spiral CT scan simulations. The source model was hard-coded by modifying the ‘ISource = 8: Phase-Space Source Incident from Multiple Directions’ in the srcxyznrc.mortran and dosxyznrc.mortran files in the DOSXYZnrc user code. In order to verify the implementation, spiral CT scans were simulated in a CT dose index phantom using the validated x-ray tube model of a commercial CT simulator for both the original multi-direction source (ISOURCE = 8) and the new phase-space source model in the DOSXYZnrc system. Then the acquired 2D and 3D dose distributions were analyzed with respect to the input parameters for various pitch values. In addition, surface-dose profiles were also measured for a patient CT scan protocol using radiochromic film and were compared with the MC simulations. The new phase-space source model was found to simulate the spiral CT scanning in a single simulation run accurately. It also produced the equivalent dose distribution of the ISOURCE = 8 model for the same CT scan parameters. The MC-simulated surface profiles were well matched to the film measurement overall within 10%. The new spiral CT phase-space source model was implemented in the BEAMnrc/EGSnrc system. This work will be beneficial in estimating the spiral CT scan dose in the BEAMnrc/EGSnrc system.

  12. Effect of the multiple scattering of electrons in Monte Carlo simulation of LINACS.

    PubMed

    Vilches, Manuel; García-Pareja, Salvador; Guerrero, Rafael; Anguiano, Marta; Lallena, Antonio M

    2008-01-01

    Results obtained from Monte Carlo simulations of the transport of electrons in thin slabs of dense material media and air slabs with different widths are analyzed. Various general-purpose Monte Carlo codes have been used: PENELOPE, GEANT3, GEANT4, EGSnrc, MCNPX. Non-negligible differences are found between the angular and radial distributions behind the slabs. The effects of these differences on the depth doses measured in water are also discussed.

  13. Charged particle transport in magnetic fields in EGSnrc.

    PubMed

    Malkov, V N; Rogers, D W O

    2016-07-01

    The aim of this work is to accurately and efficiently implement charged particle transport in a magnetic field in EGSnrc and to validate the code for use in phantom and ion chamber simulations. The effect of the magnetic field on the particle motion and position is determined using one- and three-point numerical integrations of the Lorentz force on the charged particle and is added to the condensed history calculation performed by the EGSnrc PRESTA-II algorithm. The code is tested with a Fano test adapted for the presence of magnetic fields. The code is compatible with all EGSnrc-based applications, including egs++. Ion chamber calculations are compared to experimental measurements and the effect of the code on efficiency and timing is determined. Agreement with the Fano test's theoretical value is obtained at the 0.1% level for large step sizes and in magnetic fields as strong as 5 T. The NE2571 dose calculations achieve agreement with experiment within 0.5% up to 1 T, beyond which deviations up to 1.2% are observed. Uniform air gaps of 0.5 and 1 mm and a misalignment of the incoming photon beam with the magnetic field are found to produce variations in the normalized dose on the order of 1%. These findings necessitate a clear definition of all experimental conditions to allow for accurate Monte Carlo simulations. It is found that ion chamber simulation times are increased by only 38%, and a 10 × 10 × 6 cm3 water phantom with (3 mm)3 voxels experiences a 48% increase in simulation time as compared to the default EGSnrc with no magnetic field. The incorporation of the effect of magnetic fields in EGSnrc provides the capability to calculate high-accuracy ion chamber and phantom doses for use in MRI-radiation systems. Further, the effect of apparently insignificant experimental details is found to be accentuated by the presence of the magnetic field.
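
    As a rough illustration of the transport modification described above, the sketch below applies a single Euler-like (one-point) Lorentz-force deflection to an electron direction over one step in a uniform magnetic field. It is a hedged, simplified illustration of the idea, not the EGSnrc/PRESTA-II implementation.

```python
# A minimal one-point (Euler-like) sketch of deflecting an electron's
# direction in a uniform magnetic field over a condensed-history step:
#   du/ds = (q/|p|) (u x B)  with |p| held constant over the step.
# This illustrates the idea only; it is not the EGSnrc/PRESTA-II algorithm.
import numpy as np

E_CHARGE = 1.602176634e-19     # C
ME_C2_MEV = 0.51099895         # electron rest energy, MeV
MEV_TO_J = 1.602176634e-13

def momentum_si(kinetic_mev):
    """Electron momentum |p| in kg m/s from its kinetic energy in MeV."""
    e_tot = kinetic_mev + ME_C2_MEV
    pc_mev = np.sqrt(e_tot**2 - ME_C2_MEV**2)
    return pc_mev * MEV_TO_J / 2.99792458e8

def deflect(direction, b_field_tesla, step_m, kinetic_mev, charge=-E_CHARGE):
    """One Euler step of the magnetic deflection; returns a new unit vector."""
    u = np.asarray(direction, dtype=float)
    du = (charge / momentum_si(kinetic_mev)) * np.cross(u, b_field_tesla) * step_m
    u_new = u + du
    return u_new / np.linalg.norm(u_new)    # renormalise after the linear step

# 6 MeV electron moving along +z in a 1.5 T field along +x, 1 mm step
print(deflect([0.0, 0.0, 1.0], np.array([1.5, 0.0, 0.0]), 1e-3, 6.0))
```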

  14. Study of the impact of artificial articulations on the dose distribution under medical irradiation

    NASA Astrophysics Data System (ADS)

    Buffard, E.; Gschwind, R.; Makovicka, L.; Martin, E.; Meunier, C.; David, C.

    2005-02-01

    Perturbations due to the presence of high density heterogeneities in the body are not correctly taken into account in the Treatment Planning Systems currently available for external radiotherapy. For this reason, the accuracy of the dose distribution calculations has to be improved by using Monte Carlo simulations. In a previous study, we established a theoretical model by using the Monte Carlo code EGSnrc [I. Kawrakow, D.W.O. Rogers, The EGSnrc code system: MC simulation of electron and photon transport. Technical Report PIRS-701, NRCC, Ottawa, Canada, 2000] in order to obtain the dose distributions around simple heterogeneities. These simulations were then validated by experimental results obtained with thermoluminescent dosemeters and an ionisation chamber. The influence of samples composed of hip prosthesis materials (titanium alloy and steel) and of a bone substitute was notably studied. A more complex model was then developed with the Monte Carlo code BEAMnrc [D.W.O. Rogers, C.M. MA, G.X. Ding, B. Walters, D. Sheikh-Bagheri, G.G. Zhang, BEAMnrc Users Manual. NRC Report PPIRS 509(a) rev F, 2001] in order to take into account the hip prosthesis geometry. The simulation results were compared to experimental measurements performed in a water phantom, in the case of a standard treatment of a pelvic cancer for one of the beams passing through the implant. These results have shown the great influence of the prostheses on the dose distribution.

  15. Multiple scattering of 13 and 20 MeV electrons by thin foils: a Monte Carlo study with GEANT, Geant4, and PENELOPE.

    PubMed

    Vilches, M; García-Pareja, S; Guerrero, R; Anguiano, M; Lallena, A M

    2009-09-01

    In this work, recent results from experiments and simulations (with EGSnrc) performed by Ross et al. [Med. Phys. 35, 4121-4131 (2008)] on electron scattering by foils of different materials and thicknesses are compared to those obtained using several Monte Carlo codes. Three codes have been used: GEANT (version 3.21), Geant4 (version 9.1, patch03), and PENELOPE (version 2006). In the case of PENELOPE, mixed and fully detailed simulations have been carried out. Transverse dose distributions in air have been obtained in order to compare with measurements. The detailed PENELOPE simulations show excellent agreement with experiment. The calculations performed with GEANT and PENELOPE (mixed) agree with experiment within 3% except for the Be foil. In the case of Geant4, the distributions are 5% narrower compared to the experimental ones, though the agreement is very good for the Be foil. The transverse dose distribution in water obtained with PENELOPE (mixed) is 4% wider than that calculated by Ross et al. using EGSnrc and 1% narrower than the transverse dose distribution in air considered in the experiment. All the codes give a reasonable agreement (within 5%) with the experimental results for all the materials and thicknesses studied.

  16. Modelling of an Orthovoltage X-ray Therapy Unit with the EGSnrc Monte Carlo Package

    NASA Astrophysics Data System (ADS)

    Knöös, Tommy; Rosenschöld, Per Munck Af; Wieslander, Elinore

    2007-06-01

    Simulations with the EGSnrc code package of an orthovoltage x-ray machine have been performed. The BEAMnrc code was used to transport electrons, produce x-ray photons in the target, and transport these photons through the treatment machine down to the exit level of the applicator. Further transport in water or CT-based phantoms was facilitated by the DOSXYZnrc code. Phase-space files were scored with BEAMnrc and analysed regarding the energy spectra at the end of the applicator. Tuning of simulation parameters was based on the half-value layer quantity for the beams in either Al or Cu. Calculated depth dose and profile curves have been compared against measurements and show good agreement except at shallow depths. The MC model tested in this study can be used for various dosimetric studies as well as for generating a library of typical treatment cases that can serve as both educational material and guidance in clinical practice.
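
    The half-value-layer tuning mentioned above can be illustrated with a hedged sketch: given a scored spectrum and attenuation data, find the Al thickness that halves the air kerma by bisection. The spectrum and coefficient values below are placeholders, not data from the study.

```python
# Hedged sketch of extracting a half-value layer (HVL) from a scored photon
# spectrum: find the Al thickness that halves the air kerma of the beam.
# The spectrum and the coefficient tables below are illustrative placeholders;
# real values would come from the BEAMnrc phase space and standard tables.
import numpy as np

def air_kerma(fluence, energies_mev, mu_en_over_rho_air, mu_al_cm, t_cm):
    """Relative air kerma behind t_cm of aluminium (narrow-beam geometry)."""
    return np.sum(fluence * energies_mev * mu_en_over_rho_air
                  * np.exp(-mu_al_cm * t_cm))

def hvl_al(fluence, energies, mu_en_air, mu_al, t_lo=0.0, t_hi=2.0):
    """Bisection for the thickness that halves the unattenuated kerma."""
    k0 = air_kerma(fluence, energies, mu_en_air, mu_al, 0.0)
    for _ in range(60):
        t_mid = 0.5 * (t_lo + t_hi)
        if air_kerma(fluence, energies, mu_en_air, mu_al, t_mid) > 0.5 * k0:
            t_lo = t_mid
        else:
            t_hi = t_mid
    return 0.5 * (t_lo + t_hi)

# three-bin toy spectrum (MeV), with made-up coefficients
energies = np.array([0.03, 0.06, 0.10])
fluence = np.array([0.5, 0.3, 0.2])
mu_en_air = np.array([0.15, 0.03, 0.025])   # cm^2/g, placeholder
mu_al = np.array([3.0, 0.75, 0.46])         # 1/cm, placeholder
print(f"HVL ~ {hvl_al(fluence, energies, mu_en_air, mu_al):.3f} cm Al")
```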

  17. Evaluation of dosimetric properties of shielding disk used in intraoperative electron radiotherapy: A Monte Carlo study.

    PubMed

    Robatjazi, Mostafa; Baghani, Hamid Reza; Mahdavic, Seied Rabi; Felici, Giuseppe

    2018-05-01

    A shielding disk is used in IOERT procedures to absorb radiation behind the target and protect underlying healthy tissues. Setup variations of the shielding disk can affect the corresponding in-vivo dose distribution. In this study, the changes in dosimetric parameters due to disk setup variations are evaluated using the EGSnrc Monte Carlo (MC) code. The results can help the treatment team decide on the level of accuracy required in the setup procedure and the dose delivered to the target volume during IOERT. Copyright © 2018 Elsevier Ltd. All rights reserved.

  18. Validation of a commercial TPS based on the VMC(++) Monte Carlo code for electron beams: commissioning and dosimetric comparison with EGSnrc in homogeneous and heterogeneous phantoms.

    PubMed

    Ferretti, A; Martignano, A; Simonato, F; Paiusco, M

    2014-02-01

    The aim of the present work was the validation of the VMC(++) Monte Carlo (MC) engine implemented in the Oncentra Masterplan (OMTPS) and used to calculate the dose distribution produced by the electron beams (energy 5-12 MeV) generated by the linear accelerator (linac) Primus (Siemens), shaped by a digital variable applicator (DEVA). The BEAMnrc/DOSXYZnrc (EGSnrc package) MC model of the linac head was used as a benchmark. Commissioning results for both MC codes were evaluated by means of 1D Gamma Analysis (2%, 2 mm), calculated with a home-made Matlab (The MathWorks) program, comparing the calculations with the measured profiles. The results of the commissioning of OMTPS were good [average gamma index (γ) > 97%]; some mismatches were found with large beams (size ≥ 15 cm). The optimization of the BEAMnrc model required increasing the beam exit window to match the calculated and measured profiles (final average γ > 98%). Then OMTPS dose distribution maps were compared with DOSXYZnrc with a 2D Gamma Analysis (3%, 3 mm), in 3 virtual water phantoms: (a) with an air step, (b) with an air insert, and (c) with a bone insert. The OMTPS and EGSnrc dose distributions for the air-water step phantom were in very high agreement (γ ∼ 99%), while for the heterogeneous phantoms there were differences of about 9% in the air insert and of about 10-15% in the bone region. This is due to the Masterplan implementation of VMC(++), which reports the dose as "dose to water", instead of "dose to medium". Copyright © 2013 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
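
    A brute-force 1D gamma analysis of the kind used for commissioning (e.g. 2%, 2 mm) can be sketched in a few lines; the sketch below uses global normalisation and toy profiles, and it is not the home-made Matlab tool mentioned in the abstract.

```python
# Minimal brute-force sketch of a 1D global gamma analysis (e.g. 2%, 2 mm)
# of a calculated profile against a measured one.  This is an illustration of
# the test, not the Matlab program described in the abstract.
import numpy as np

def gamma_1d(x_ref, d_ref, x_eval, d_eval, dose_tol=0.02, dist_tol_mm=2.0):
    """Return the gamma index at every reference point (global normalisation)."""
    d_norm = dose_tol * d_ref.max()
    gammas = np.empty_like(d_ref)
    for i, (xr, dr) in enumerate(zip(x_ref, d_ref)):
        dist2 = ((x_eval - xr) / dist_tol_mm) ** 2
        dose2 = ((d_eval - dr) / d_norm) ** 2
        gammas[i] = np.sqrt(np.min(dist2 + dose2))
    return gammas

# toy profiles: the evaluated profile is slightly shifted and scaled
x = np.linspace(-50, 50, 201)                       # mm
ref = np.exp(-(x / 30.0) ** 8)                      # flat-ish field
ev = 1.01 * np.exp(-((x - 0.5) / 30.0) ** 8)
g = gamma_1d(x, ref, x, ev)
print(f"gamma pass rate (gamma <= 1): {100.0 * np.mean(g <= 1.0):.1f}%")
```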

  19. Comparison of measured and Monte Carlo calculated dose distributions in inhomogeneous phantoms in clinical electron beams

    NASA Astrophysics Data System (ADS)

    Doucet, R.; Olivares, M.; DeBlois, F.; Podgorsak, E. B.; Kawrakow, I.; Seuntjens, J.

    2003-08-01

    Calculations of dose distributions in heterogeneous phantoms in clinical electron beams, carried out using the fast voxel Monte Carlo (MC) system XVMC and the conventional MC code EGSnrc, were compared with measurements. Irradiations were performed using the 9 MeV and 15 MeV beams from a Varian Clinac-18 accelerator with a 10 × 10 cm2 applicator and an SSD of 100 cm. Depth doses were measured with thermoluminescent dosimetry techniques (TLD 700) in phantoms consisting of slabs of Solid WaterTM (SW) and bone and slabs of SW and lung tissue-equivalent materials. Lateral profiles in water were measured using an electron diode at different depths behind one and two immersed aluminium rods. The accelerator was modelled using the EGS4/BEAM system and optimized phase-space files were used as input to the EGSnrc and the XVMC calculations. Also, for the XVMC, an experiment-based beam model was used. All measurements were corrected by the EGSnrc-calculated stopping power ratios. Overall, there is excellent agreement between the corrected experimental and the two MC dose distributions. Small remaining discrepancies may be due to the non-equivalence between physical and simulated tissue-equivalent materials and to detector fluence perturbation effect correction factors that were calculated for the 9 MeV beam at selected depths in the heterogeneous phantoms.

  20. Comparison of measured and Monte Carlo calculated dose distributions in inhomogeneous phantoms in clinical electron beams.

    PubMed

    Doucet, R; Olivares, M; DeBlois, F; Podgorsak, E B; Kawrakow, I; Seuntjens, J

    2003-08-07

    Calculations of dose distributions in heterogeneous phantoms in clinical electron beams, carried out using the fast voxel Monte Carlo (MC) system XVMC and the conventional MC code EGSnrc, were compared with measurements. Irradiations were performed using the 9 MeV and 15 MeV beams from a Varian Clinac-18 accelerator with a 10 x 10 cm2 applicator and an SSD of 100 cm. Depth doses were measured with thermoluminescent dosimetry techniques (TLD 700) in phantoms consisting of slabs of Solid Water (SW) and bone and slabs of SW and lung tissue-equivalent materials. Lateral profiles in water were measured using an electron diode at different depths behind one and two immersed aluminium rods. The accelerator was modelled using the EGS4/BEAM system and optimized phase-space files were used as input to the EGSnrc and the XVMC calculations. Also, for the XVMC, an experiment-based beam model was used. All measurements were corrected by the EGSnrc-calculated stopping power ratios. Overall, there is excellent agreement between the corrected experimental and the two MC dose distributions. Small remaining discrepancies may be due to the non-equivalence between physical and simulated tissue-equivalent materials and to detector fluence perturbation effect correction factors that were calculated for the 9 MeV beam at selected depths in the heterogeneous phantoms.

  1. Internal dosimetry through GATE simulations of preclinical radiotherapy using a melanin-targeting ligand

    NASA Astrophysics Data System (ADS)

    Perrot, Y.; Degoul, F.; Auzeloux, P.; Bonnet, M.; Cachin, F.; Chezal, J. M.; Donnarieix, D.; Labarre, P.; Moins, N.; Papon, J.; Rbah-Vidal, L.; Vidal, A.; Miot-Noirault, E.; Maigne, L.

    2014-05-01

    The GATE Monte Carlo simulation platform based on the Geant4 toolkit is under constant improvement for dosimetric calculations. In this study, we explore its use for the dosimetry of the preclinical targeted radiotherapy of melanoma using a new specific melanin-targeting radiotracer labeled with iodine 131. Calculated absorbed fractions and S values for spheres and murine models (digital and CT-scan-based mouse phantoms) are compared between GATE and EGSnrc Monte Carlo codes considering monoenergetic electrons and the detailed energy spectrum of iodine 131. The behavior of Geant4 standard and low energy models is also tested. Following the different authors’ guidelines concerning the parameterization of electron physics models, this study demonstrates an agreement of 1.2% and 1.5% with EGSnrc, respectively, for the calculation of S values for small spheres and mouse phantoms. S values calculated with GATE are then used to compute the dose distribution in organs of interest using the activity distribution in mouse phantoms. This study gives the dosimetric data required for the translation of the new treatment to the clinic.
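
    The S-value bookkeeping behind this kind of dosimetry follows the MIRD scheme: the absorbed dose to a target organ is the sum, over source organs, of the time-integrated activity multiplied by the corresponding S value. The sketch below illustrates that sum with placeholder organs, activities and S values; it does not use the paper's mouse-phantom data.

```python
# Hedged sketch of the MIRD-style bookkeeping behind S values:
#     D(target) = sum over sources of  A_tilde(source) * S(target <- source)
# The organ names, time-integrated activities and S values below are
# illustrative placeholders, not the paper's mouse-phantom data.

S_VALUES_GY_PER_BQ_S = {            # S(target <- source)
    ("tumour", "tumour"): 2.0e-10,
    ("tumour", "liver"):  3.0e-13,
    ("liver",  "liver"):  8.0e-12,
    ("liver",  "tumour"): 3.0e-13,
}

TIME_INTEGRATED_ACTIVITY_BQ_S = {   # A_tilde per source organ
    "tumour": 5.0e8,
    "liver":  2.0e9,
}

def absorbed_dose(target):
    """Absorbed dose to `target` in Gy, summed over all source organs."""
    return sum(a_tilde * S_VALUES_GY_PER_BQ_S.get((target, source), 0.0)
               for source, a_tilde in TIME_INTEGRATED_ACTIVITY_BQ_S.items())

for organ in ("tumour", "liver"):
    print(f"D({organ}) = {absorbed_dose(organ):.3e} Gy")
```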

  2. A preliminary study of in-house Monte Carlo simulations: an integrated Monte Carlo verification system.

    PubMed

    Mukumoto, Nobutaka; Tsujii, Katsutomo; Saito, Susumu; Yasunaga, Masayoshi; Takegawa, Hideki; Yamamoto, Tokihiro; Numasaki, Hodaka; Teshima, Teruki

    2009-10-01

    The aim of this work was to develop an infrastructure for an integrated Monte Carlo verification system (MCVS) to verify the accuracy of conventional dose calculations, which often fail to accurately predict dose distributions, mainly due to inhomogeneities in the patient's anatomy, for example, in lung and bone. The MCVS consists of a graphical user interface (GUI) based on a computational environment for radiotherapy research (CERR) with MATLAB language. The MCVS GUI acts as an interface between the MCVS and a commercial treatment planning system to import the treatment plan, create MC input files, and analyze MC output dose files. The MCVS uses the EGSnrc MC codes, which include EGSnrc/BEAMnrc to simulate the treatment head and EGSnrc/DOSXYZnrc to calculate the dose distributions in the patient/phantom. In order to improve computation time without approximations, an in-house cluster system was constructed. The phase-space data of a 6-MV photon beam from a Varian Clinac unit were developed and used to establish several benchmarks under homogeneous conditions. The MC results agreed with the ionization chamber measurements to within 1%. The MCVS GUI could import and display the radiotherapy treatment plan created by the MC method and various treatment planning systems, in formats such as RTOG and DICOM-RT. Dose distributions could be analyzed by using dose profiles and dose volume histograms and compared on the same platform. With the cluster system, calculation time improved in line with the increase in the number of central processing units (CPUs), at a computation efficiency of more than 98%. Development of the MCVS was successful for performing MC simulations and analyzing dose distributions.

  3. egs_brachy: a versatile and fast Monte Carlo code for brachytherapy

    NASA Astrophysics Data System (ADS)

    Chamberland, Marc J. P.; Taylor, Randle E. P.; Rogers, D. W. O.; Thomson, Rowan M.

    2016-12-01

    egs_brachy is a versatile and fast Monte Carlo (MC) code for brachytherapy applications. It is based on the EGSnrc code system, enabling simulation of photons and electrons. Complex geometries are modelled using the EGSnrc C++ class library and egs_brachy includes a library of geometry models for many brachytherapy sources, in addition to eye plaques and applicators. Several simulation efficiency enhancing features are implemented in the code. egs_brachy is benchmarked by comparing TG-43 source parameters of three source models to previously published values. 3D dose distributions calculated with egs_brachy are also compared to ones obtained with the BrachyDose code. Well-defined simulations are used to characterize the effectiveness of many efficiency improving techniques, both as an indication of the usefulness of each technique and to find optimal strategies. Efficiencies and calculation times are characterized through single source simulations and simulations of idealized and typical treatments using various efficiency improving techniques. In general, egs_brachy shows agreement within uncertainties with previously published TG-43 source parameter values. 3D dose distributions from egs_brachy and BrachyDose agree at the sub-percent level. Efficiencies vary with radionuclide and source type, number of sources, phantom media, and voxel size. The combined effects of efficiency-improving techniques in egs_brachy lead to short calculation times: simulations approximating prostate and breast permanent implant (both with (2 mm)3 voxels) and eye plaque (with (1 mm)3 voxels) treatments take between 13 and 39 s, on a single 2.5 GHz Intel Xeon E5-2680 v3 processor core, to achieve 2% average statistical uncertainty on doses within the PTV. egs_brachy will be released as free and open source software to the research community.
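
    For context, the TG-43 source parameters benchmarked here enter dose calculations through the TG-43 formalism; a hedged sketch of the point-source approximation is shown below, with an illustrative dose-rate constant and radial dose function rather than published consensus data.

```python
# Hedged sketch of the TG-43 point-source approximation that underlies the
# source parameters benchmarked by egs_brachy:
#     D_dot(r) = S_K * Lambda * (r0/r)^2 * g_P(r) * phi_an(r)
# The radial dose function samples and the anisotropy factor below are
# illustrative placeholders, not published consensus data.
import numpy as np

R0_CM = 1.0                      # TG-43 reference distance

def dose_rate(r_cm, air_kerma_strength, dose_rate_constant,
              g_radii, g_values, phi_an=1.0):
    """Dose rate (cGy/h per unit S_K) at distance r_cm, point-source form."""
    g_r = np.interp(r_cm, g_radii, g_values)     # radial dose function g_P(r)
    geometry = (R0_CM / r_cm) ** 2               # inverse-square geometry factor
    return air_kerma_strength * dose_rate_constant * geometry * g_r * phi_an

g_radii = np.array([0.5, 1.0, 2.0, 3.0, 5.0])    # cm
g_values = np.array([1.04, 1.00, 0.87, 0.73, 0.48])
for r in (0.5, 1.0, 2.0, 5.0):
    d = dose_rate(r, air_kerma_strength=1.0, dose_rate_constant=0.965,
                  g_radii=g_radii, g_values=g_values)
    print(f"r = {r:3.1f} cm  ->  {d:7.4f} cGy/h per U")
```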

  4. egs_brachy: a versatile and fast Monte Carlo code for brachytherapy.

    PubMed

    Chamberland, Marc J P; Taylor, Randle E P; Rogers, D W O; Thomson, Rowan M

    2016-12-07

    egs_brachy is a versatile and fast Monte Carlo (MC) code for brachytherapy applications. It is based on the EGSnrc code system, enabling simulation of photons and electrons. Complex geometries are modelled using the EGSnrc C++ class library and egs_brachy includes a library of geometry models for many brachytherapy sources, in addition to eye plaques and applicators. Several simulation efficiency enhancing features are implemented in the code. egs_brachy is benchmarked by comparing TG-43 source parameters of three source models to previously published values. 3D dose distributions calculated with egs_brachy are also compared to ones obtained with the BrachyDose code. Well-defined simulations are used to characterize the effectiveness of many efficiency improving techniques, both as an indication of the usefulness of each technique and to find optimal strategies. Efficiencies and calculation times are characterized through single source simulations and simulations of idealized and typical treatments using various efficiency improving techniques. In general, egs_brachy shows agreement within uncertainties with previously published TG-43 source parameter values. 3D dose distributions from egs_brachy and BrachyDose agree at the sub-percent level. Efficiencies vary with radionuclide and source type, number of sources, phantom media, and voxel size. The combined effects of efficiency-improving techniques in egs_brachy lead to short calculation times: simulations approximating prostate and breast permanent implant (both with (2 mm)3 voxels) and eye plaque (with (1 mm)3 voxels) treatments take between 13 and 39 s, on a single 2.5 GHz Intel Xeon E5-2680 v3 processor core, to achieve 2% average statistical uncertainty on doses within the PTV. egs_brachy will be released as free and open source software to the research community.

  5. Comparison of GATE/GEANT4 with EGSnrc and MCNP for electron dose calculations at energies between 15 keV and 20 MeV.

    PubMed

    Maigne, L; Perrot, Y; Schaart, D R; Donnarieix, D; Breton, V

    2011-02-07

    The GATE Monte Carlo simulation platform based on the GEANT4 toolkit has come into widespread use for simulating positron emission tomography (PET) and single photon emission computed tomography (SPECT) imaging devices. Here, we explore its use for calculating electron dose distributions in water. Mono-energetic electron dose point kernels and pencil beam kernels in water are calculated for different energies between 15 keV and 20 MeV by means of GATE 6.0, which makes use of the GEANT4 version 9.2 Standard Electromagnetic Physics Package. The results are compared to the well-validated codes EGSnrc and MCNP4C. It is shown that recent improvements made to the GEANT4/GATE software result in significantly better agreement with the other codes. We furthermore illustrate several issues of general interest to GATE and GEANT4 users who wish to perform accurate simulations involving electrons. Provided that the electron step size is sufficiently restricted, GATE 6.0 and EGSnrc dose point kernels are shown to agree to within less than 3% of the maximum dose between 50 keV and 4 MeV, while pencil beam kernels are found to agree to within less than 4% of the maximum dose between 15 keV and 20 MeV.

  6. SU-E-T-169: Evaluation of Oncentra TPS for Nasopharynx Brachy Using Patient Specific Voxel Phantom and EGSnrc

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hadad, K; Zoherhvand, M; Faghihi, R

    2014-06-01

    Purpose: Nasopharynx carcinoma (NPC) treatment is being carried out using Ir-192 HDR seeds in Mehdieh Hospital in Hamadan, Iran. The Oncentra™ TPS is based on an optimized TG-43 formalism which disregards heterogeneity in the treatment area. Due to abundant heterogeneity in the head and neck, comparison of the Oncentra™ TPS dose evaluation and an accurate dose calculation method in NPC brachytherapy is the objective of this study. Methods: CT DICOMs of a patient with NPC obtained from Mehdieh Hospital were used to create a 3D voxel phantom with the CTCREATE utility of the EGSnrc code package. The voxel phantom together with the Ir-192 HDR brachytherapy source were the input to DOSXYZnrc to calculate the 3D dose distribution. The sources were incorporated as type 6 sources in DOSXYZnrc and their dwell times were taken into account in the final dose calculations. Results: A direct comparison between isodoses as well as DVHs for the GTV, PTV and CTV obtained by Oncentra™ and the EGSnrc Monte Carlo code is made. EGSnrc results are obtained using 5×10⁹ histories to reduce the statistical error below 1% in the GTV and 5% in the 5% dose areas. The standard ICRP700 cross section library is employed in the DOSXYZnrc dose calculation. Conclusion: A direct relationship between increased dose differences and increased material density (hence heterogeneity) is observed when isodose contours of the TPS and DOSXYZnrc are compared. Regarding the point dose calculations, the differences range from 1.2% in the PTV to 5.6% for cavity regions and 7.8% for bone regions. While the Oncentra™ TPS overestimates the dose in cavities, it tends to underestimate dose depositions within bones.

  7. Application for internal dosimetry using biokinetic distribution of photons based on nuclear medicine images.

    PubMed

    Leal Neto, Viriato; Vieira, José Wilson; Lima, Fernando Roberto de Andrade

    2014-01-01

    This article presents a way to obtain estimates of dose in patients submitted to radiotherapy, based on the analysis of regions of interest on nuclear medicine images. A software called DoRadIo (Dosimetria das Radiações Ionizantes [Ionizing Radiation Dosimetry]) was developed to receive information about source organs and target organs, generating graphical and numerical results. The nuclear medicine images utilized in the present study were obtained from catalogs provided by medical physicists. The simulations were performed with computational exposure models consisting of voxel phantoms coupled with the Monte Carlo EGSnrc code. The software was developed with Microsoft Visual Studio 2010 Service Pack, using the Windows Presentation Foundation project template and the C# programming language. With the mentioned tools, the authors obtained the file for optimization of Monte Carlo simulations using the EGSnrc; organization and compaction of dosimetry results with all radioactive sources; selection of regions of interest; evaluation of grayscale intensity in regions of interest; the file of weighted sources; and, finally, all the charts and numerical results. The user interface may be adapted for use in clinical nuclear medicine as a computer-aided tool to estimate the administered activity.

  8. Monte Carlo derivation of filtered tungsten anode X-ray spectra for dose computation in digital mammography.

    PubMed

    Paixão, Lucas; Oliveira, Bruno Beraldo; Viloria, Carolina; de Oliveira, Marcio Alves; Teixeira, Maria Helena Araújo; Nogueira, Maria do Socorro

    2015-01-01

    The objective was to derive filtered tungsten X-ray spectra used in digital mammography systems by means of Monte Carlo simulations. Filtered spectra for a rhodium filter were obtained for tube potentials between 26 and 32 kV. The half-value layers (HVL) of the simulated filtered spectra were compared with those obtained experimentally with a solid state detector Unfors model 8202031-H Xi R/F & MAM Detector Platinum and 8201023-C Xi Base unit Platinum Plus w mAs in a Hologic Selenia Dimensions system using a direct radiography mode. Calculated HVL values showed good agreement with those obtained experimentally. The greatest relative difference between the Monte Carlo calculated HVL values and the experimental HVL values was 4%. The results show that the filtered tungsten anode X-ray spectra and the EGSnrc Monte Carlo code can be used for mean glandular dose determination in mammography.

  9. Monte Carlo derivation of filtered tungsten anode X-ray spectra for dose computation in digital mammography*

    PubMed Central

    Paixão, Lucas; Oliveira, Bruno Beraldo; Viloria, Carolina; de Oliveira, Marcio Alves; Teixeira, Maria Helena Araújo; Nogueira, Maria do Socorro

    2015-01-01

    Objective Derive filtered tungsten X-ray spectra used in digital mammography systems by means of Monte Carlo simulations. Materials and Methods Filtered spectra for a rhodium filter were obtained for tube potentials between 26 and 32 kV. The half-value layers (HVL) of the simulated filtered spectra were compared with those obtained experimentally with a solid state detector Unfors model 8202031-H Xi R/F & MAM Detector Platinum and 8201023-C Xi Base unit Platinum Plus w mAs in a Hologic Selenia Dimensions system using a direct radiography mode. Results Calculated HVL values showed good agreement with those obtained experimentally. The greatest relative difference between the Monte Carlo calculated HVL values and the experimental HVL values was 4%. Conclusion The results show that the filtered tungsten anode X-ray spectra and the EGSnrc Monte Carlo code can be used for mean glandular dose determination in mammography. PMID:26811553

  10. Monte Carlo investigation of positron annihilation in medical positron emission tomography

    NASA Astrophysics Data System (ADS)

    Chin, P. W.; Spyrou, N. M.

    2007-09-01

    A number of Monte Carlo codes are available for simulating positron emission tomography (PET); however, their physics approximations differ. A number of radiation processes are deemed negligible, some without rigorous investigation. Some PET literature quantifies approximations as valid without citing the data source. The radiation source is the first step in Monte Carlo simulations; for some codes this is a pair of 511 keV photons emitted 180° apart, not polyenergetic positrons with radiation histories of their own. Without prior assumptions, we investigated electron-positron annihilation under clinical PET conditions. Just before annihilation, we tallied the positron energy and position. Right after annihilation, we tallied the energy and separation angle of the photon pairs. When comparing PET textbooks with theory, PENELOPE and EGSnrc, only the latter three agreed. From 10⁶ radiation histories, a positron source of 15O in a chest phantom annihilated at energies as high as 1.58 MeV, producing photons with energies of 0.30-2.20 MeV, 79-180° apart. From 10⁶ radiation histories, an 18F positron source in a head phantom annihilated at energies as high as 0.56 MeV, producing 0.33-1.18 MeV photons 109-180° apart. 2.5% and 0.8% of annihilation events occurred in flight in the chest and head phantoms, respectively. PET textbooks typically either do not mention any deviation from 180°, or state a deviation of 0.25° or 0.5°. Our findings are founded on the well-established Heitler cross-sections and relativistic kinematics, both adopted unanimously by PENELOPE, EGSnrc and GEANT4. Our results highlight the effects of annihilation in flight, a process sometimes forgotten within the PET community.
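
    As a quick, hedged sanity check on the quoted maxima (assuming annihilation on an electron at rest), energy conservation for two-photon annihilation in flight gives

```latex
% Energy conservation for in-flight two-photon annihilation of a positron
% with kinetic energy T_+ on an electron assumed at rest:
E_{\gamma 1} + E_{\gamma 2} = T_{+} + 2 m_e c^2
\approx 1.58\,\mathrm{MeV} + 1.022\,\mathrm{MeV}
\approx 2.60\,\mathrm{MeV}
```

    so a photon of about 2.2 MeV paired with a lower-energy partner is consistent with a positron annihilating in flight at 1.58 MeV.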

  11. Application for internal dosimetry using biokinetic distribution of photons based on nuclear medicine images*

    PubMed Central

    Leal Neto, Viriato; Vieira, José Wilson; Lima, Fernando Roberto de Andrade

    2014-01-01

    Objective This article presents a way to obtain estimates of dose in patients submitted to radiotherapy, based on the analysis of regions of interest on nuclear medicine images. Materials and Methods A software called DoRadIo (Dosimetria das Radiações Ionizantes [Ionizing Radiation Dosimetry]) was developed to receive information about source organs and target organs, generating graphical and numerical results. The nuclear medicine images utilized in the present study were obtained from catalogs provided by medical physicists. The simulations were performed with computational exposure models consisting of voxel phantoms coupled with the Monte Carlo EGSnrc code. The software was developed with Microsoft Visual Studio 2010 Service Pack, using the Windows Presentation Foundation project template and the C# programming language. Results With the mentioned tools, the authors obtained the file for optimization of Monte Carlo simulations using the EGSnrc; organization and compaction of dosimetry results with all radioactive sources; selection of regions of interest; evaluation of grayscale intensity in regions of interest; the file of weighted sources; and, finally, all the charts and numerical results. Conclusion The user interface may be adapted for use in clinical nuclear medicine as a computer-aided tool to estimate the administered activity. PMID:25741101

  12. A Monte Carlo calculation model of electronic portal imaging device for transit dosimetry through heterogeneous media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoon, Jihyung; Jung, Jae Won, E-mail: jungj@ecu.edu; Kim, Jong Oh

    2016-05-15

    Purpose: To develop and evaluate a fast Monte Carlo (MC) dose calculation model of an electronic portal imaging device (EPID) based on effective atomic number modeling in the XVMC code. Methods: A previously developed EPID model, based on the XVMC code with density scaling of the EPID structures, was modified by additionally considering the effective atomic number (Zeff) of each structure and adopting a phase space file from the EGSnrc code. The model was tested under various homogeneous and heterogeneous phantoms and field sizes by comparing the calculations of the model with measurements in the EPID. In order to better evaluate the model, the performance of the XVMC code was separately tested by comparing calculated dose to water with ion chamber (IC) array measurements in the plane of the EPID. Results: In the EPID plane, calculated dose to water by the code showed agreement with IC measurements within 1.8%. The difference was averaged across the in-field regions of the acquired profiles for all field sizes and phantoms. The maximum point difference was 2.8%, affected by proximity of the maximum points to the penumbra and by MC noise. The EPID model showed agreement with measured EPID images within 1.3%. The maximum point difference was 1.9%. The difference dropped from the higher value of the code by employing a calibration that is dependent on field sizes and thicknesses for the conversion of calculated images to measured images. Thanks to the Zeff correction, the EPID model showed a linear trend of the calibration factors, unlike that of the density-only-scaled model. The phase space file from the EGSnrc code sharpened penumbra profiles significantly, improving agreement of calculated profiles with measured profiles. Conclusions: Demonstrating high accuracy, the EPID model with the associated calibration system may be used for in vivo dosimetry of radiation therapy. Through this study, a MC model of the EPID has been developed, and its performance has been rigorously investigated for transit dosimetry.
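
    The effective atomic number used for the EPID structures can be defined in several ways; the sketch below uses one common power-law (Mayneord-type) definition as a hedged illustration. The exponent and the example composition are assumptions, and the paper does not state which definition its Zeff modeling uses.

```python
# Hedged sketch of one common power-law definition of the effective atomic
# number (Mayneord form), Z_eff = (sum_i a_i Z_i^2.94)^(1/2.94), where a_i is
# the fraction of electrons contributed by element i.  The exponent and the
# example composition are illustrative assumptions.

def z_eff(elements, exponent=2.94):
    """elements: list of (Z, A, mass_fraction) tuples."""
    electrons = [(z, w * z / a) for z, a, w in elements]   # relative electron count
    total = sum(n for _, n in electrons)
    return sum((n / total) * z ** exponent for z, n in electrons) ** (1.0 / exponent)

# water, H2O: mass fractions ~11.2% H, ~88.8% O
water = [(1, 1.008, 0.112), (8, 15.999, 0.888)]
print(f"Z_eff(water) ~ {z_eff(water):.2f}")   # expected around 7.4
```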

  13. SU-F-T-370: A Fast Monte Carlo Dose Engine for Gamma Knife

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Song, T; Zhou, L; Li, Y

    2016-06-15

    Purpose: To develop a fast Monte Carlo dose calculation algorithm for Gamma Knife. Methods: To make the simulation more efficient, we implemented the track repeating technique on GPU. We first use EGSnrc to pre-calculate the photon and secondary electron tracks in water from the two mono-energy photons of 60Co. The total photon mean free paths for different materials and energies are obtained from NIST. During simulation, each entire photon track was first loaded to shared memory for each block; the incident original photon was then split into Nthread sub-photons, each thread transporting one sub-photon, and the Russian roulette technique was applied for scattered and bremsstrahlung photons. The resultant electrons from photon interactions are simulated by repeating the recorded electron tracks. The electron step length is stretched/shrunk proportionally based on the local density and stopping power ratios of the local material. Energy deposition in a voxel is proportional to the fraction of the equivalent step length in that voxel. To evaluate its accuracy, dose deposition in a 300 mm × 300 mm × 300 mm water phantom is calculated and compared to EGSnrc results. Results: Both PDD and OAR showed great agreement (within 0.5%) between our dose engine result and the EGSnrc result. Every simulation takes less than 1 min, a reduction of up to ∼40 times compared to EGSnrc simulations. Conclusion: We have successfully developed a fast Monte Carlo dose engine for Gamma Knife.
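
    The step-length stretching described above is the core of track repeating: a step recorded in water is rescaled by the local density and a stopping-power ratio before being replayed in another medium. The sketch below is a hedged illustration with made-up material data, not the GPU implementation.

```python
# Hedged sketch of the step-length scaling used in track-repeating Monte
# Carlo: an electron step recorded in water is stretched or shrunk in another
# medium according to the local density and the stopping-power ratio.  The
# ratios below are illustrative placeholders, not the paper's lookup tables.

# mass stopping-power ratio water/medium (assumed roughly energy independent here)
STOPPING_POWER_RATIO_W_OVER_M = {"water": 1.00, "lung": 1.00, "bone": 0.92}
DENSITY_G_CM3 = {"water": 1.00, "lung": 0.26, "bone": 1.85}

def repeated_step_length(step_in_water_cm, medium):
    """Equivalent geometric step in `medium` for a step recorded in water."""
    density_scale = DENSITY_G_CM3["water"] / DENSITY_G_CM3[medium]
    return step_in_water_cm * density_scale * STOPPING_POWER_RATIO_W_OVER_M[medium]

for medium in ("water", "lung", "bone"):
    mm = 10 * repeated_step_length(0.1, medium)   # 1 mm water step, result in mm
    print(f"1 mm water step -> {mm:.2f} mm in {medium}")
```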

  14. MCMEG: Simulations of both PDD and TPR for 6 MV LINAC photon beam using different MC codes

    NASA Astrophysics Data System (ADS)

    Fonseca, T. C. F.; Mendes, B. M.; Lacerda, M. A. S.; Silva, L. A. C.; Paixão, L.; Bastos, F. M.; Ramirez, J. V.; Junior, J. P. R.

    2017-11-01

    The Monte Carlo Modelling Expert Group (MCMEG) is an expert network specializing in Monte Carlo radiation transport and in modelling and simulation applied to the radiation protection and dosimetry research field. For its first inter-comparison task the group launched an exercise to model and simulate a 6 MV LINAC photon beam using the Monte Carlo codes available within their laboratories and to validate the simulated results by comparing them with experimental measurements carried out at the National Cancer Institute (INCA) in Rio de Janeiro, Brazil. The experimental measurements were performed using an ionization chamber with calibration traceable to a Secondary Standard Dosimetry Laboratory (SSDL). The detector was immersed in a water phantom at different depths and was irradiated with a radiation field size of 10×10 cm2. This exposure setup was used to determine the dosimetric parameters Percentage Depth Dose (PDD) and Tissue Phantom Ratio (TPR). The validation process compares the MC calculated results to the experimentally measured PDD20,10 and TPR20,10. Simulations were performed reproducing the experimental TPR20,10 quality index, which provides a satisfactory description of both the PDD curve and the transverse profiles at the two depths measured. This paper reports in detail the modelling process using the MCNPX, MCNP6, EGSnrc and PENELOPE Monte Carlo codes, the source and tally descriptions, the validation processes and the results.
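
    The two beam-quality indices mentioned above can be computed from a depth-dose curve; the sketch below does this for toy data and also applies the empirical PDD20,10-to-TPR20,10 conversion quoted in IAEA TRS-398 for 10×10 cm2 fields. The depth-dose samples are placeholders, not the INCA measurements.

```python
# Hedged sketch of the two beam-quality indices used in the inter-comparison:
# PDD20,10 is the ratio of depth doses at 20 cm and 10 cm in an SSD setup, and
# TPR20,10 can be measured directly or estimated from PDD20,10 via the
# empirical relation quoted in IAEA TRS-398.  The depth-dose samples are toy
# numbers, not the INCA measurements.
import numpy as np

def pdd_20_10(depths_cm, doses):
    d = np.interp([10.0, 20.0], depths_cm, doses)
    return d[1] / d[0]

def tpr_20_10_from_pdd(pdd2010):
    """Empirical conversion for 10x10 cm2 fields (TRS-398)."""
    return 1.2661 * pdd2010 - 0.0595

depths = np.array([1.5, 5.0, 10.0, 15.0, 20.0])     # cm
doses = np.array([100.0, 86.0, 67.0, 52.0, 40.0])   # relative depth dose
ratio = pdd_20_10(depths, doses)
print(f"PDD20,10 = {ratio:.3f}  ->  TPR20,10 ~ {tpr_20_10_from_pdd(ratio):.3f}")
```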

  15. Simulation of the Mg(Ar) ionization chamber currents by different Monte Carlo codes in benchmark gamma fields

    NASA Astrophysics Data System (ADS)

    Lin, Yi-Chun; Liu, Yuan-Hao; Nievaart, Sander; Chen, Yen-Fu; Wu, Shu-Wei; Chou, Wen-Tsae; Jiang, Shiang-Huei

    2011-10-01

    High-energy photon (over 10 MeV) and neutron beams adopted in radiobiology and radiotherapy always produce mixed neutron/gamma-ray fields. Mg(Ar) ionization chambers are commonly applied to determine the gamma-ray dose because of their insensitivity to neutrons. Nowadays, many perturbation corrections for accurate dose estimation and many treatment planning systems are based on the Monte Carlo technique. The Monte Carlo codes EGSnrc, FLUKA, GEANT4, MCNP5, and MCNPX were used to evaluate energy-dependent response functions of the Exradin M2 Mg(Ar) ionization chamber to a parallel photon beam with mono-energies from 20 keV to 20 MeV. For validation, measurements were carefully performed in well-defined fields: (a) a primary M-100 X-ray calibration field, (b) a primary 60Co calibration beam, and (c) 6-MV and (d) 10-MV therapeutic beams in hospital. In the energy region below 100 keV, MCNP5 and MCNPX both had lower responses than the other codes. For energies above 1 MeV, the MCNP ITS mode closely resembled the other three codes and the differences were within 5%. Compared to the measured currents, MCNP5 and MCNPX in ITS mode showed excellent agreement for the 60Co and 10-MV beams, but in the X-ray energy region the deviations reached 17%. This work gives better insight into the performance of different Monte Carlo codes in photon-electron transport calculations. For mixed-field dosimetry applications such as BNCT, MCNP with ITS mode is recognized by this work as the most suitable tool.

  16. SU-E-T-552: Monte Carlo Calculation of Correction Factors for a Free-Air Ionization Chamber in Support of a National Air-Kerma Standard for Electronic Brachytherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mille, M; Bergstrom, P

    2015-06-15

    Purpose: To use Monte Carlo radiation transport methods to calculate correction factors for a free-air ionization chamber in support of a national air-kerma standard for low-energy, miniature x-ray sources used for electronic brachytherapy (eBx). Methods: The NIST is establishing a calibration service for well-type ionization chambers used to characterize the strength of eBx sources prior to clinical use. The calibration approach involves establishing the well-chamber’s response to an eBx source whose air-kerma rate at a 50 cm distance is determined through a primary measurement performed using the Lamperti free-air ionization chamber. However, the free-air chamber measurements of charge or current can only be related to the reference air-kerma standard after applying several corrections, some of which are best determined via Monte Carlo simulation. To this end, a detailed geometric model of the Lamperti chamber was developed in the EGSnrc code based on the engineering drawings of the instrument. The egs_fac user code in EGSnrc was then used to calculate energy-dependent correction factors which account for missing or undesired ionization arising from effects such as: (1) attenuation and scatter of the x-rays in air; (2) primary electrons escaping the charge collection region; (3) lack of charged particle equilibrium; (4) atomic fluorescence and bremsstrahlung radiation. Results: Energy-dependent correction factors were calculated assuming a monoenergetic point source with the photon energy ranging from 2 keV to 60 keV in 2 keV increments. Sufficient photon histories were simulated so that the Monte Carlo statistical uncertainty of the correction factors was less than 0.01%. The correction factors for a specific eBx source will be determined by integrating these tabulated results over its measured x-ray spectrum. Conclusion: The correction factors calculated in this work are important for establishing a national standard for eBx which will help ensure that dose is accurately and consistently delivered to patients.
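    The final step described above, folding the tabulated monoenergetic correction factors with a measured spectrum, amounts to a weighted average. The sketch below assumes an air-kerma weighting (fluence × E × μen/ρ), and all arrays are made-up stand-ins for the tabulated corrections and a measured eBx spectrum.

```python
import numpy as np

def spectrum_averaged_correction(energy, fluence, muen_over_rho_air, k_mono):
    # weight each energy bin by its contribution to air kerma
    weights = fluence * energy * muen_over_rho_air
    return np.sum(weights * k_mono) / np.sum(weights)

energy = np.linspace(2, 60, 30)                        # keV grid (2 keV style steps)
fluence = np.exp(-0.5 * ((energy - 30) / 10) ** 2)     # made-up eBx-like spectrum
muen_air = 1.0 / energy ** 2.8                         # crude stand-in for (mu_en/rho)_air
k_mono = 1.0 + 0.002 * (energy / 60)                   # made-up monoenergetic corrections
print(spectrum_averaged_correction(energy, fluence, muen_air, k_mono))
```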

  17. Sci-Sat AM: Radiation Dosimetry and Practical Therapy Solutions - 05: Not all geometries are equivalent for magnetic field Fano cavity tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malkov, Victor N.; Rogers, David W.O.

    The coupling of MRI and radiation treatment systems for the application of magnetic resonance guided radiation therapy necessitates a reliable magnetic-field-capable Monte Carlo (MC) code. In addition to the influence of the magnetic field on dose distributions, the question of proper calibration has arisen due to the several percent variation of ion chamber and solid state detector responses in magnetic fields when compared to the 0 T case (Reynolds et al., Med Phys, 2013). In the absence of a magnetic field, EGSnrc has been shown to pass the Fano cavity test (a rigorous benchmarking tool of MC codes) at the 0.1% level (Kawrakow, Med. Phys., 2000), and similar results should be required of magnetic-field-capable MC algorithms. To properly test such developing MC codes, the Fano cavity theorem has been adapted to function in a magnetic field (Bouchard et al., PMB, 2015). In this work, the Fano cavity test is applied in slab and ion-chamber-like geometries to test the transport options of an implemented magnetic field algorithm in EGSnrc. Results show that the deviation of the MC dose from the expected Fano cavity theory value is highly sensitive to the choice of geometry, and the ion chamber geometry appears to pass the test more easily than larger slab geometries. As magnetic field MC codes begin to be used for dose simulations and correction factor calculations, care must be taken to apply the most rigorous Fano test geometries to ensure the reliability of such algorithms.

  18. Roos and NACP-02 ion chamber perturbations and water-air stopping-power ratios for clinical electron beams for energies from 4 to 22 MeV

    NASA Astrophysics Data System (ADS)

    Bailey, M.; Shipley, D. R.; Manning, J. W.

    2015-02-01

    Empirical fits are developed for depth-compensated wall- and cavity-replacement perturbations in the PTW Roos 34001 and IBA / Scanditronix NACP-02 parallel-plate ionisation chambers, for electron beam qualities from 4 to 22 MeV for depths up to approximately 1.1 × R50,D. These are based on calculations using the Monte Carlo radiation transport code EGSnrc and its user codes with a full simulation of the linac treatment head modelled using BEAMnrc. These fits are used with calculated restricted stopping-power ratios between air and water to match measured depth-dose distributions in water from an Elekta Synergy clinical linear accelerator at the UK National Physical Laboratory. Results compare well with those from recent publications and from the IPEM 2003 electron beam radiotherapy Code of Practice.

  19. Monte Carlo calculations of kQ, the beam quality conversion factor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Muir, B. R.; Rogers, D. W. O.

    2010-11-15

    Purpose: To use EGSnrc Monte Carlo simulations to directly calculate beam quality conversion factors, kQ, for 32 cylindrical ionization chambers over a range of beam qualities and to quantify the effect of systematic uncertainties on Monte Carlo calculations of kQ. These factors are required to use the TG-51 or TRS-398 clinical dosimetry protocols for calibrating external radiotherapy beams. Methods: Ionization chambers are modeled either from blueprints or manufacturers' user's manuals. The dose-to-air in the chamber is calculated using the EGSnrc user-code egs_chamber using 11 different tabulated clinical photon spectra for the incident beams. The dose to a small volume of water is also calculated in the absence of the chamber at the midpoint of the chamber on its central axis. Using a simple equation, kQ is calculated from these quantities under the assumption that W/e is constant with energy and compared to TG-51 protocol and measured values. Results: Polynomial fits to the Monte Carlo calculated kQ factors as a function of beam quality expressed as %dd(10)x and TPR20,10 are given for each ionization chamber. Differences are explained between Monte Carlo calculated values and values from the TG-51 protocol or calculated using the computer program used for TG-51 calculations. Systematic uncertainties in calculated kQ values are analyzed and amount to a maximum of one standard deviation uncertainty of 0.99% if one assumes that photon cross-section uncertainties are uncorrelated and 0.63% if they are assumed correlated. The largest components of the uncertainty are the constancy of W/e and the uncertainty in the cross-section for photons in water. Conclusions: It is now possible to calculate kQ directly using Monte Carlo simulations. Monte Carlo calculations for most ionization chambers give results which are comparable to TG-51 values. Discrepancies can be explained using individual Monte Carlo calculations of various correction factors which are more accurate than previously used values. For small ionization chambers with central electrodes composed of high-Z materials, the effect of the central electrode is much larger than that for the aluminum electrodes in Farmer chambers.
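    A minimal sketch of the "simple equation" mentioned in the Methods, assuming (as the abstract states) that W/e is constant with energy: kQ follows from ratios of calculated doses at the beam quality of interest and at 60Co. The dose values below are illustrative placeholders, not egs_chamber output.

```python
def k_q(d_water_q, d_air_q, d_water_co, d_air_co):
    """kQ = [Dwater/Dair]_Q / [Dwater/Dair]_60Co, assuming constant W/e."""
    return (d_water_q / d_air_q) / (d_water_co / d_air_co)

# hypothetical Monte Carlo outputs (Gy per incident particle)
print(k_q(d_water_q=2.61e-16, d_air_q=2.66e-16,
          d_water_co=3.05e-16, d_air_co=3.07e-16))
```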

  20. An experimental MOSFET approach to characterize 192Ir HDR source anisotropy.

    PubMed

    Toye, W C; Das, K R; Todd, S P; Kenny, M B; Franich, R D; Johnston, P N

    2007-09-07

    The dose anisotropy around a 192Ir HDR source in a water phantom has been measured using MOSFETs as relative dosimeters. In addition, modeling using the EGSnrc code has been performed to provide a complete dose distribution consistent with the MOSFET measurements. Doses around the Nucletron 'classic' 192Ir HDR source were measured for a range of radial distances from 5 to 30 mm within a 40 × 30 × 30 cm3 water phantom, using a TN-RD-50 MOSFET dosimetry system with an active area of 0.2 mm by 0.2 mm. For each successive measurement a linear stepper capable of movement in intervals of 0.0125 mm re-positioned the MOSFET at the required radial distance, while a rotational stepper enabled angular displacement of the source at intervals of 0.9 degrees. The source-dosimeter arrangement within the water phantom was modeled using the standardized cylindrical geometry of the DOSRZnrc user code. In general, the measured relative anisotropy at each radial distance from 5 mm to 30 mm is in good agreement with the EGSnrc simulations, benchmark Monte Carlo simulations and TLD measurements where they exist. The experimental approach, employing a MOSFET detection system of small size, high spatial resolution and fast read-out capability, provided a practical means of determining the dose anisotropy around an HDR source.

  1. Investigating the effect of a magnetic field on dose distributions at phantom-air interfaces using PRESAGE® 3D dosimeter and Monte Carlo simulations

    NASA Astrophysics Data System (ADS)

    Costa, Filipa; Doran, Simon J.; Hanson, Ian M.; Nill, Simeon; Billas, Ilias; Shipley, David; Duane, Simon; Adamovics, John; Oelfke, Uwe

    2018-03-01

    Dosimetric quality assurance (QA) of the new Elekta Unity (MR-linac) will differ from the QA performed on a conventional linac due to the constant magnetic field, which creates an electron return effect (ERE). In this work we aim to validate PRESAGE® dosimetry in a transverse magnetic field, and to assess its use to validate the research version of the Monaco TPS of the MR-linac. Cylindrical samples of PRESAGE® 3D dosimeter separated by an air gap were irradiated with a cobalt-60 unit while placed between the poles of an electromagnet at 0.5 T and 1.5 T. This set-up was simulated with the EGSnrc/Cavity Monte Carlo (MC) code and relative dose distributions were compared with measurements using 1D and 2D gamma criteria of 3% and 1.5 mm. The irradiation conditions were adapted for the MR-linac and compared with Monaco TPS simulations. Measured and EGSnrc/Cavity simulated profiles showed good agreement with a gamma passing rate of 99.9% for 0.5 T and 99.8% for 1.5 T. Measurements on the MR-linac also compared well with Monaco TPS simulations, with a gamma passing rate of 98.4% at 1.5 T. The results demonstrate that PRESAGE® can accurately measure dose and detect the ERE, encouraging its use as a QA tool to validate the Monaco TPS of the MR-linac for clinically relevant dose distributions at tissue-air boundaries.
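    The comparison metric used here is the gamma index. The following is a minimal 1-D sketch with the 3%/1.5 mm criteria quoted in the abstract, assuming a global dose normalization and using made-up profiles rather than the measured data.

```python
import numpy as np

def gamma_1d(x_eval, d_eval, x_ref, d_ref, dose_crit=0.03, dist_crit_mm=1.5):
    """Brute-force 1-D gamma index (global normalization to the reference maximum)."""
    d_norm = dose_crit * d_ref.max()
    gammas = np.empty_like(d_eval)
    for i, (xe, de) in enumerate(zip(x_eval, d_eval)):
        cap_gamma = np.sqrt(((x_ref - xe) / dist_crit_mm) ** 2 +
                            ((d_ref - de) / d_norm) ** 2)
        gammas[i] = cap_gamma.min()
    return gammas

x = np.linspace(-30, 30, 121)                # mm
ref = np.exp(-(x / 18.0) ** 4)               # made-up reference profile
meas = ref * (1 + 0.01 * np.sin(x / 5.0))    # slightly perturbed "measurement"
g = gamma_1d(x, meas, x, ref)
print(f"gamma pass rate: {100 * np.mean(g <= 1):.1f}%")
```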

  2. Sci—Fri PM: Topics — 01: A Monte Carlo model of a miniature low-energy x-ray tube using EGSnrc

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Watson, P; Seuntjens, J

    The INTRABEAM system (Carl Zeiss, Oberkochen, Germany) is a miniature x-ray generator for use in intraoperative radiotherapy and brachytherapy. The device accelerates electrons to up to 50 keV, which are then steered down an evacuated needle probe to strike a thin gold target. For accurate dosimetry of the INTRABEAM system, it is important that the photon spectrum be well understood. Measurements based on air kerma are heavily impacted by the photon spectrum, particularly at low photon energies, due to the large photoelectric contribution to the mass energy-absorption coefficient of air. While low-energy photons have little clinical significance at treatment depths, they may have a large effect on air-kerma measurements. In this work, we have developed an EGSnrc-based Monte Carlo (MC) model of the Zeiss INTRABEAM system to study the source photon spectra and half-value layers (HVLs) of the bare probe and with various spherical applicators. HVLs were calculated using the analytical attenuation of air-kerma spectra. The calculated bare probe spectrum was compared with simulated and measured results taken from the literature. Differences in the L-line energies of gold were found between the spectra predicted by EGSnrc and Geant4. This is due to M and N shell averaging during atomic transitions in EGSnrc. The calculated HVLs of the bare probe and spherical applicators are consistent with measured values reported in the literature.
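    The HVL step ("analytical attenuation of air-kerma spectra") can be sketched as a root search on absorber thickness, as below. The spectrum and the aluminium attenuation and air absorption coefficients are crude stand-ins, not the INTRABEAM spectrum or NIST data.

```python
import numpy as np

def air_kerma(energy, fluence, t_mm, mu_al_per_mm, muen_air):
    """Air kerma of the spectrum after passing through t_mm of aluminium."""
    return np.sum(fluence * np.exp(-mu_al_per_mm * t_mm) * energy * muen_air)

def hvl(energy, fluence, mu_al_per_mm, muen_air, t_max=10.0):
    """First half-value layer (mm Al) found by bisection on thickness."""
    k0 = air_kerma(energy, fluence, 0.0, mu_al_per_mm, muen_air)
    lo, hi = 0.0, t_max
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if air_kerma(energy, fluence, mid, mu_al_per_mm, muen_air) > 0.5 * k0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

energy = np.linspace(5, 50, 46)                    # keV
fluence = np.exp(-0.5 * ((energy - 25) / 8) ** 2)  # made-up 50 kVp-like spectrum
mu_al = 5.0 / energy ** 1.5                        # stand-in attenuation per mm Al
muen_air = 1.0 / energy ** 2.5                     # stand-in (mu_en/rho) for air
print(f"HVL ≈ {hvl(energy, fluence, mu_al, muen_air):.2f} mm Al")
```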

  3. Dosimetric evaluation of the clinical implementation of the first commercial IMRT Monte Carlo treatment planning system at 6 MV

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heath, Emily; Seuntjens, Jan; Sheikh-Bagheri, Daryoush

    2004-10-01

    In this work we dosimetrically evaluated the clinical implementation of a commercial Monte Carlo treatment planning system (PEREGRINE, North American Scientific, Cranberry Township, PA) intended for quality assurance (QA) of intensity modulated radiation therapy treatment plans. Dose profiles calculated in homogeneous and heterogeneous phantoms using this system were compared to both measurements and simulations using the EGSnrc Monte Carlo code for the 6 MV beam of a Varian CL21EX linear accelerator. For simple jaw-defined fields, the calculations agree with measurements in homogeneous phantoms within 2% of the dose at dmax, with the exception of the buildup region, where the calculations overestimate the dose by up to 8%. In heterogeneous lung and bone phantoms the agreement is within 3% on average, and up to 5% for a 1×1 cm2 field. We tested two consecutive implementations of the MLC model. After matching the calculated and measured MLC leakage, simulations of static and dynamic MLC-defined fields using the most recent MLC model agreed to within 2% with measurements.

  4. An Approach in Radiation Therapy Treatment Planning: A Fast, GPU-Based Monte Carlo Method.

    PubMed

    Karbalaee, Mojtaba; Shahbazi-Gahrouei, Daryoush; Tavakoli, Mohammad B

    2017-01-01

    Accurate and fast radiation dose calculation is essential for successful radiotherapy. The aim of this study was to implement a new graphics processing unit (GPU) based radiation therapy treatment planning system for accurate and fast dose calculation in radiotherapy centers. A program was written to run in parallel on the GPU, and the code was validated against EGSnrc/DOSXYZnrc. Moreover, a semi-automatic, rotary, asymmetric phantom was designed and produced using bone-, lung-, and soft-tissue-equivalent materials. All measurements were performed using a MapCHECK dosimeter. The accuracy of the code was validated using the experimental data obtained from the anthropomorphic phantom as the gold standard. The findings showed that, compared with DOSXYZnrc in the virtual phantom, most of the voxels (>95%) satisfied a <3% dose-difference or 3 mm distance-to-agreement (DTA) criterion. Moreover, in the anthropomorphic phantom, compared to the MapCHECK dose measurements, a <5% dose-difference or 5 mm DTA was observed. The fast calculation speed and high accuracy of the GPU-based Monte Carlo method may be useful in routine radiation therapy centers as the core component of a treatment planning verification system.

  5. TU-D-209-02: A Backscatter Point Spread Function for Entrance Skin Dose Determination

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vijayan, S; Xiong, Z; Shankar, A

    Purpose: To determine the distribution of backscattered radiation to the skin resulting from a non-uniform distribution of primary radiation through convolution with a backscatter point spread function (PSF). Methods: A backscatter PSF is determined using Monte Carlo simulation of a 1 mm primary beam incident on a 30 × 30 cm, 20 cm thick PMMA phantom using EGSnrc software. A primary profile is similarly obtained without the phantom, and the difference from the total provides the backscatter profile. This scatter PSF characterizes the backscatter spread for a “point” primary interaction and can be convolved with the entrance primary dose distribution to obtain the total entrance skin dose. The backscatter PSF was integrated into the skin dose tracking system (DTS), a graphical utility for displaying the color-coded skin dose distribution on a 3D graphic of the patient during interventional fluoroscopic procedures. The backscatter convolution method was validated for the non-uniform beam resulting from the use of an ROI attenuator. The ROI attenuator is a copper sheet with about 20% primary transmission (0.7 mm thick) containing a circular aperture; this attenuator is placed in the beam to reduce dose in the periphery while maintaining full dose in the region of interest. The DTS-calculated primary plus backscatter distribution is compared to that measured with GafChromic film and that calculated using EGSnrc Monte Carlo software. Results: The PSF convolution method used in the DTS software was able to account for the spread of backscatter from the ROI region to the region under the attenuator. The skin dose distribution determined using the DTS with the ROI attenuator was in good agreement with the distributions measured with GafChromic film and determined by Monte Carlo simulation. Conclusion: The PSF convolution technique provides an accurate alternative for entrance skin dose determination with non-uniform primary x-ray beams. Partial support from NIH Grant R01-EB002873 and Toshiba Medical Systems Corp.
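    The convolution step itself is straightforward; a sketch using SciPy is shown below, with a synthetic ROI-attenuator primary-dose map and a made-up backscatter kernel standing in for the Monte Carlo derived PSF.

```python
import numpy as np
from scipy.signal import fftconvolve

def total_entrance_dose(primary, backscatter_psf):
    """Total entrance dose = primary + primary convolved with the backscatter PSF."""
    backscatter = fftconvolve(primary, backscatter_psf, mode="same")
    return primary + backscatter

# non-uniform primary beam: full dose inside a circular ROI, ~20% under the attenuator
y, x = np.mgrid[-64:64, -64:64]
primary = np.where(x ** 2 + y ** 2 < 20 ** 2, 1.0, 0.2)
# made-up radially decreasing kernel, normalized to a ~0.3 total backscatter fraction
psf = np.exp(-(x ** 2 + y ** 2) / (2 * 15.0 ** 2))
psf *= 0.3 / psf.sum()
dose = total_entrance_dose(primary, psf)
print(f"peak total / peak primary = {dose.max() / primary.max():.2f}")
```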

  6. NOTE: MCDE: a new Monte Carlo dose engine for IMRT

    NASA Astrophysics Data System (ADS)

    Reynaert, N.; DeSmedt, B.; Coghe, M.; Paelinck, L.; Van Duyse, B.; DeGersem, W.; DeWagter, C.; DeNeve, W.; Thierens, H.

    2004-07-01

    A new accurate Monte Carlo code for IMRT dose computations, MCDE (Monte Carlo dose engine), is introduced. MCDE is based on BEAMnrc/DOSXYZnrc and consequently on the accurate EGSnrc electron transport. DOSXYZnrc is reprogrammed as a component module for BEAMnrc. In this way both codes are interconnected elegantly while maintaining the BEAM structure, and only minimal changes to BEAMnrc.mortran are necessary. The treatment head of the Elekta SLiplus linear accelerator is modelled in detail. CT grids consisting of up to 200 slices of 512 × 512 voxels can be introduced and up to 100 beams can be handled simultaneously. The beams and CT data are imported from the treatment planning system GRATIS via a DICOM interface. To enable the handling of up to 50 × 10^6 voxels the system was programmed in Fortran95 with dynamic memory management. All region-dependent arrays (dose, statistics, transport arrays) were redefined. A scoring grid was introduced and superimposed on the geometry grid, to be able to limit the number of scoring voxels. The whole system uses approximately 200 MB of RAM and runs on a PC cluster consisting of 38 1.0 GHz processors. A set of in-house scripts handles the parallelization and centralization of the Monte Carlo calculations on a server. As an illustration of MCDE, a clinical example is discussed and compared with collapsed cone convolution calculations. At present, the system is still rather slow and is intended to be a tool for reliable verification of IMRT treatment planning in the presence of tissue inhomogeneities such as air cavities.

  7. Poster - 18: New features in EGSnrc for photon cross sections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ali, Elsayed; Mainegra-Hing, Ernesto; Rogers, David

    2016-08-15

    Purpose: To implement two new features in the EGSnrc Monte Carlo system. The first is an option to account for photonuclear attenuation, which can contribute a few percent to the total cross section at the higher end of the energy range of interest to medical physics. The second is an option to use exact NIST XCOM photon cross sections. Methods: For the first feature, the photonuclear total cross sections are generated from the IAEA evaluated data. In the current, first-order implementation, after a photonuclear event, there is no energy deposition or secondary particle generation. The implementation is validated against deterministic calculations and experimental measurements of transmission signals. For the second feature, before this work, if the user explicitly requested XCOM photon cross sections, EGSnrc still used its own internal incoherent scattering cross sections. These differ by up to 2% from XCOM data between 30 keV and 40 MeV. After this work, exact XCOM incoherent scattering cross sections are an available option. Minor interpolation artifacts in pair and triplet XCOM cross sections are also addressed. The default for photon cross sections in EGSnrc is XCOM except for the new incoherent scattering cross sections, which have to be explicitly requested. The photonuclear, incoherent, pair and triplet data from this work are available for elements and compounds for photon energies from 1 keV to 100 GeV. Results: Both features are implemented and validated in EGSnrc. Conclusions: The two features are part of the standard EGSnrc distribution as of version 4.2.3.2.

  8. [Dosimetric evaluation of eye lens shields in computed tomography examinations -- measurements and Monte Carlo simulations].

    PubMed

    Wulff, Jorg; Keil, Boris; Auvanis, Diyala; Heverhagen, Johannes T; Klose, Klaus Jochen; Zink, Klemens

    2008-01-01

    The present study investigates eye lens shields of different composition for use in computed tomography examinations. Measurements with thermoluminescent dosimeters in a simple cylindrical water-filled phantom were performed, as well as Monte Carlo simulations of an equivalent geometry. Besides a conventional shield made of bismuth-coated latex, a new shield with a mixture of metallic components was analyzed. The new material leads to a greater dose reduction than the bismuth shield. Measured and Monte Carlo simulated dose reductions are in good agreement and amount to 34% for the bismuth shield and 46% for the new material. The EGSnrc code system was used for the simulations, and a new application, CTDOSPP, was developed to simulate the computed tomography examination. The investigation shows that satisfactory agreement between simulation and measurement could only be achieved for the chosen geometries when transport of secondary electrons was accounted for in the simulation. The scattered radiation produced in the shield by fluorescent photons was analyzed; it is larger for the new material because of the lower atomic number of its metallic components.

  9. Evidence for using Monte Carlo calculated wall attenuation and scatter correction factors for three styles of graphite-walled ion chamber.

    PubMed

    McCaffrey, J P; Mainegra-Hing, E; Kawrakow, I; Shortt, K R; Rogers, D W O

    2004-06-21

    The basic equation for establishing a 60Co air-kerma standard based on a cavity ionization chamber includes a wall correction term that corrects for the attenuation and scatter of photons in the chamber wall. For over a decade, the validity of the wall correction terms determined by extrapolation methods (KwKcep) has been strongly challenged by Monte Carlo (MC) calculation methods (Kwall). Using the linear extrapolation method with experimental data, KwKcep was determined in this study for three different styles of primary-standard-grade graphite ionization chamber: cylindrical, spherical and plane-parallel. For measurements taken with the same 60Co source, the air-kerma rates for these three chambers, determined using extrapolated KwKcep values, differed by up to 2%. The MC code EGSnrc was used to calculate the values of Kwall for these three chambers. Use of the calculated Kwall values gave air-kerma rates that agreed within 0.3%. The accuracy of this code was affirmed by its reliability in modelling the complex structure of the response curve obtained by rotation of the non-rotationally symmetric plane-parallel chamber. These results demonstrate that the linear extrapolation technique leads to errors in the determination of air kerma.
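    For context, the linear extrapolation technique that this work calls into question can be sketched as follows: the normalized chamber reading measured with added wall thicknesses is fit with a straight line and extrapolated to zero wall, and the ratio of the intercept to the reading at the nominal wall thickness gives an extrapolated wall correction. The numbers below are illustrative only, and the exact definition of the correction varies between laboratories.

```python
import numpy as np

# hypothetical readings with increasing total graphite wall thickness
thickness = np.array([0.5, 1.0, 1.5, 2.0, 2.5])            # g/cm^2
reading = np.array([0.990, 0.980, 0.971, 0.961, 0.952])    # normalized ionization

slope, intercept = np.polyfit(thickness, reading, 1)       # straight-line fit
nominal = 0.5                                               # hypothetical nominal wall
i_nominal = np.interp(nominal, thickness, reading)
k_wall_extrap = intercept / i_nominal                       # scales reading to zero wall
print(f"zero-wall intercept = {intercept:.4f}, extrapolated correction = {k_wall_extrap:.4f}")
```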

  10. SU-F-J-146: Experimental Validation of 6 MV Photon PDD in Parallel Magnetic Field Calculated by EGSnrc

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghila, A; Steciw, S; Fallone, B

    Purpose: Integrated linac-MR systems are uniquely suited for real-time tumor tracking during radiation treatment. Understanding the magnetic field dose effects and incorporating them in treatment planning is paramount for linac-MR clinical implementation. We experimentally validated the EGSnrc dose calculations in the presence of a magnetic field parallel to the radiation beam direction. Methods: Two cylindrical bore electromagnets produced a 0.21 T magnetic field parallel to the central axis of a 6 MV photon beam. A parallel plate ion chamber was used to measure the PDD in a polystyrene phantom, placed inside the bore in two setups: phantom top surface coinciding with the magnet bore center (183 cm SSD), and with the magnet bore’s top surface (170 cm SSD). We measured the field of the magnet at several points and included the exact dimensions of the coils to generate a 3D magnetic field map in a finite element model. BEAMnrc and DOSXYZnrc simulated the PDD experiments in a parallel magnetic field (i.e. with the 3D magnetic field included) and with no magnetic field. Results: With the phantom surface at the top of the electromagnet, the surface dose increased by 10% (compared to no magnetic field), due to electrons being focused by the smaller fringe fields of the electromagnet. With the phantom surface at the bore center, the surface dose increased by 30%, since an extra 13 cm of air column was in a relatively high magnetic field (>0.13 T) in the magnet bore. The EGSnrc Monte Carlo code correctly calculated the radiation dose with and without the magnetic field, and all points passed the 2%/2 mm gamma criterion when the ion chamber’s entrance window and air cavity were included in the simulated phantom. Conclusion: A parallel magnetic field increases the surface and buildup dose during irradiation. The EGSnrc package can model these magnetic field dose effects accurately. Dr. Fallone is a co-founder and CEO of MagnetTx Oncology Solutions (under discussions to license Alberta bi-planar linac MR for commercialization).

  11. Comparison of Flattening Filter (FF) and Flattening-Filter-Free (FFF) 6 MV photon beam characteristics for small field dosimetry using EGSnrc Monte Carlo code

    NASA Astrophysics Data System (ADS)

    Sangeetha, S.; Sureka, C. S.

    2017-06-01

    The present study compares the characteristics of Varian Clinac 600 C/D flattened and unflattened 6 MV photon beams for small field dosimetry using EGSnrc Monte Carlo simulation, since small field dosimetry is considered one of the most crucial and challenging tasks in radiation dosimetry. Simulations of the 6 MV photon beam of a Varian Clinac 600 C/D medical linear accelerator operating in Flattening Filter (FF) and Flattening-Filter-Free (FFF) modes were performed using the EGSnrc Monte Carlo user codes (BEAMnrc and DOSXYZnrc) to calculate the beam characteristics by an educated trial-and-error method. These include percentage depth dose, lateral beam profile, dose rate delivery, photon energy spectra, photon beam uniformity, out-of-field dose, surface dose, penumbral dose and output factor for small fields (0.5×0.5 cm2 to 4×4 cm2), compared with magna-field sizes (5×5 cm2 to 40×40 cm2) at various depths. The results showed that the optimized beam energy and full-width-half-maximum value for both small field and magna-field dosimetry were 5.7 MeV and 0.13 cm for both FF and FFF beams. The depth of dose maximum for small field sizes deviates minimally for both FF and FFF beams, as for magna-fields. At depths greater than dmax, FFF beams show a steeper dose fall-off in the exponential region than FF beams, and this deviation increases with field size. The shape of the lateral beam profiles of FF and FFF beams remains similar for field sizes smaller than 4×4 cm2, whereas it differs for magna-fields. The dose rate delivery of FFF beams shows a prominent, roughly two-fold increase for both small field and magna-field sizes. The surface dose of FFF beams was higher than that of FF beams for small fields, but lower for magna-fields. The out-of-field dose reduction increases with increasing field size. It is also observed that the photon energy spectrum increases with field size for the FFF beam mode. Finally, the output factors for FFF beams were relatively low for small field sizes compared to FF beams, but higher for magna-field sizes. From this study, it is concluded that, compared to FF beams, FFF beams show minimal deviations in the treatment field region relative to the normal tissue region for small field dosimetry. The most prominent result is that the shape of the beam profile remains similar for FF and FFF beams at small field sizes, which supports more accurate treatment planning in IMRT (Intensity-Modulated Radiation Therapy), IGAT (Image-Guided Adaptive Radiation Therapy), SBRT (Stereotactic Body Radiation Therapy), SRS (Stereotactic Radiosurgery), and tomotherapy techniques, where a homogeneous dose is not necessary. On the whole, determining the dosimetric beam characteristics of the Varian linac using Monte Carlo simulation provides accurate dose calculations that can serve as clinical golden-beam data.

  12. MO-FG-BRA-01: 4D Monte Carlo Simulations for Verification of Dose Delivered to a Moving Anatomy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gholampourkashi, S; Cygler, J E.; The Ottawa Hospital Cancer Centre, Ottawa, ON

    Purpose: To validate 4D Monte Carlo (MC) simulations of dose delivery by an Elekta Agility linear accelerator to a moving phantom. Methods: Monte Carlo simulations were performed using the 4DdefDOSXYZnrc/EGSnrc user code, which samples a new geometry for each incident particle and calculates the dose in a continuously moving anatomy. A Quasar respiratory motion phantom with a lung insert containing a 3 cm diameter tumor was used for dose measurements on an Elekta Agility linac with the phantom in stationary and moving states. The dose to the center of the tumor was measured using calibrated EBT3 film and the RADPOS 4D dosimetry system. A VMAT plan covering the tumor was created on the static CT scan of the phantom using Monaco V.5.10.02. A validated BEAMnrc model of our Elekta Agility linac was used for Monte Carlo simulations on stationary and moving anatomies. To compare the planned and delivered doses, linac log files recorded during measurements were used for the simulations. For 4D simulations, deformation vectors that modeled the rigid translation of the lung insert were generated as input to the 4DdefDOSXYZnrc code, as well as the phantom motion trace recorded with RADPOS during the measurements. Results: Monte Carlo simulations and film measurements were found to agree within 2 mm/2% for 97.7% of points in the film in the static phantom and 95.5% in the moving phantom. Dose values based on film and RADPOS measurements are within 2% of each other and within 2σ of experimental uncertainties with respect to simulations. Conclusion: Our 4D Monte Carlo simulation using the defDOSXYZnrc code accurately calculates the dose delivered to a moving anatomy. Future work will focus on further investigation of VMAT delivery on a moving phantom to improve the agreement between simulation and measurements, as well as establishing the accuracy of our method in a deforming anatomy. This work was supported by the Ontario Consortium of Adaptive Interventions in Radiation Oncology (OCAIRO), funded by the Ontario Research Fund Research Excellence program.

  13. Study of solid-conversion gaseous detector based on GEM for high energy X-ray industrial CT.

    PubMed

    Zhou, Rifeng; Zhou, Yaling

    2014-01-01

    General gaseous ionization detectors are not suitable for high-energy X-ray industrial computed tomography (HEICT) because of their inherent limitations, especially low detection efficiency and large volume. The goal of this study was to investigate a new type of gaseous detector to solve these problems. The novel detector uses a metal foil as an X-ray converter to improve the conversion efficiency, and a Gas Electron Multiplier (hereinafter "GEM") as the electron amplifier to reduce its volume. The detection mechanism and signal formation of the detector are discussed in detail. The conversion efficiency was calculated using the EGSnrc Monte Carlo code, and the photon transport and secondary electron avalanche in the detector were simulated with the Maxwell and Garfield codes. The results indicate that this detector has higher conversion efficiency as well as a smaller volume. Theoretically, this kind of detector could be a strong candidate for replacing conventional detectors in HEICT.

  14. TU-AB-BRC-08: Egs-brachy, a Fast and Versatile Monte Carlo Code for Brachytherapy Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chamberland, M; Taylor, R; Rogers, D

    2016-06-15

    Purpose: To introduce egs-brachy, a new, fast, and versatile Monte Carlo code for brachytherapy applications. Methods: egs-brachy is an EGSnrc user-code based on the EGSnrc C++ class library (egs++). Complex phantom, applicator, and source model geometries are built using the egs++ geometry module. egs-brachy uses a tracklength estimator to score collision kerma in voxels. Interaction, spectrum, energy fluence, and phase space scoring are also implemented. Phase space sources and particle recycling may be used to improve simulation efficiency. HDR treatments (e.g. stepping a source through dwell positions) can be simulated. Standard brachytherapy seeds, as well as electron and miniature x-ray tube sources, are fully modelled. Variance reduction techniques for electron source simulations are implemented (bremsstrahlung cross section enhancement, uniform bremsstrahlung splitting, and Russian roulette). TG-43 parameters of seeds are computed and compared to published values. Example simulations of various treatments are carried out on a single 2.5 GHz Intel Xeon E5-2680 v3 processor core. Results: TG-43 parameters calculated with egs-brachy show excellent agreement with published values. Using a phase space source, 2% average statistical uncertainty in the PTV ((2 mm)3 voxels) can be achieved in 10 s for 100 125I or 103Pd seeds in a 36.2 cm3 prostate PTV, 31 s for 64 103Pd seeds in a 64 cm3 breast PTV, and 56 s for a miniature x-ray tube in a 27 cm3 breast PTV. Comparable uncertainty is reached in 12 s in a (1 mm)3 water voxel 5 mm away from a COMS 16 mm eye plaque with 13 103Pd seeds. Conclusion: The accuracy of egs-brachy has been demonstrated through benchmarking calculations. Calculation times are sufficiently fast to allow full MC simulations for routine treatment planning for diverse brachytherapy treatments (LDR, HDR, miniature x-ray tube). egs-brachy will be available as free and open-source software to the medical physics research community. This work is partially funded by the Canada Research Chairs program, the Natural Sciences and Engineering Research Council of Canada, and the Ontario Ministry of Research and Innovation (Ontario Early Researcher Award).
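    As a reminder of what two of the reported TG-43 quantities are, the sketch below computes a dose-rate constant and a point-source radial dose function from hypothetical Monte Carlo outputs; it is not egs-brachy code and the numbers are illustrative only.

```python
import numpy as np

def dose_rate_constant(dose_rate_r0, air_kerma_strength):
    """TG-43 dose-rate constant: dose rate at r0 = 1 cm, theta0 = 90 deg, per unit Sk."""
    return dose_rate_r0 / air_kerma_strength          # cGy h^-1 U^-1

def radial_dose_function(r, dose_rate, r0=1.0):
    """g(r) with a point-source geometry function G(r) proportional to 1/r^2."""
    d_r0 = np.interp(r0, r, dose_rate)
    return (dose_rate * r ** 2) / (d_r0 * r0 ** 2)

r = np.array([0.5, 1.0, 2.0, 3.0, 5.0])               # cm
dose = np.array([4.1, 1.0, 0.23, 0.095, 0.028])       # made-up dose rates per history
print(dose_rate_constant(dose_rate_r0=1.0, air_kerma_strength=0.92))
print(radial_dose_function(r, dose))
```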

  15. EDITORIAL: Special section: Selected papers from the Third European Workshop on Monte Carlo Treatment Planning (MCTP2012)

    NASA Astrophysics Data System (ADS)

    Spezi, Emiliano; Leal, Antonio

    2013-04-01

    The Third European Workshop on Monte Carlo Treatment Planning (MCTP2012) was held from 15-18 May, 2012 in Seville, Spain. The event was organized by the Universidad de Sevilla with the support of the European Workgroup on Monte Carlo Treatment Planning (EWG-MCTP). MCTP2012 followed two successful meetings, one held in Ghent (Belgium) in 2006 (Reynaert 2007) and one in Cardiff (UK) in 2009 (Spezi 2010). The recurrence of these workshops, together with successful events held in parallel by McGill University in Montreal (Seuntjens et al 2012), shows consolidated interest from the scientific community in Monte Carlo (MC) treatment planning. The workshop was attended by a total of 90 participants, mainly coming from a medical physics background. A total of 48 oral presentations and 15 posters were delivered in specific scientific sessions including dosimetry, code development, imaging, modelling of photon and electron radiation transport, external beam radiation therapy, nuclear medicine, brachytherapy and hadrontherapy. A copy of the programme is available on the workshop's website (www.mctp2012.com). In this special section of Physics in Medicine and Biology we report six papers that were selected following the journal's rigorous peer review procedure. These papers provide a good cross section of the areas of application of MC in treatment planning that were discussed at MCTP2012. Czarnecki and Zink (2013) and Wagner et al (2013) present the results of their work in small field dosimetry. Czarnecki and Zink (2013) studied field size and detector dependent correction factors for diodes and ion chambers within a clinical 6 MV photon beam generated by a Siemens linear accelerator. Their modelling work based on the BEAMnrc/EGSnrc codes and experimental measurements revealed that unshielded diodes were the best choice for small field dosimetry because of their independence from the electron beam spot size and their correction factors close to unity. Wagner et al (2013) investigated the recombination effect in liquid ionization chambers for stereotactic radiotherapy, a field of increasing importance in external beam radiotherapy. They modelled both the radiation source (Cyberknife unit) and the detector with the BEAMnrc/EGSnrc codes and quantified the dependence of the response of this type of detector on factors such as the volume effect and the electrode. They also recommended that these dependences be accounted for in measurements involving small fields. In the field of external beam radiotherapy, Chakarova et al (2013) showed how total body irradiation (TBI) could be improved by simulating patient treatments with MC. In particular, BEAMnrc/EGSnrc based simulations highlighted the importance of optimizing individual compensators for TBI treatments. In the same area of application, Mairani et al (2013) reported on a new tool for treatment planning in proton therapy based on the FLUKA MC code. The software, used to model both the proton therapy beam and the patient anatomy, supports single-field and multiple-field optimization and can be used to optimize physical and relative biological effectiveness (RBE)-weighted dose distributions, using both constant and variable RBE models. In the field of nuclear medicine, Marcatili et al (2013) presented RAYDOSE, a Geant4-based code specifically developed for applications in molecular radiotherapy (MRT).
RAYDOSE has been designed to work in MRT trials using sequential positron emission tomography (PET) or single-photon emission tomography (SPECT) imaging to model patient-specific time-dependent metabolic uptake and to calculate the total 3D dose distribution. The code was validated through experimental measurements in homogeneous and heterogeneous phantoms. Finally, in the field of code development, Miras et al (2013) reported on CloudMC, a Windows Azure-based application for the parallelization of MC calculations in a dynamic cluster environment. Although the performance of CloudMC has been tested with the PENELOPE MC code, the authors report that the software has been designed to be independent of the type of MC code, provided that the simulation meets a number of operational criteria. We wish to thank Elekta/CMS Inc., the University of Seville, the Junta of Andalusia and the European Regional Development Fund for their financial support. We would also like to acknowledge the members of EWG-MCTP for their help in peer-reviewing all the abstracts, and all the invited speakers who kindly agreed to deliver keynote presentations in their area of expertise. A final word of thanks to our colleagues who worked on the reviewing process of the papers selected for this special section and to the IOP Publishing staff who made it possible. MCTP2012 was accredited by the European Federation of Organisations for Medical Physics as a CPD event for medical physicists. Emiliano Spezi and Antonio Leal, Guest Editors. References: Chakarova R, Müntzing K, Krantz M, Hedin E and Hertzman S 2013 Monte Carlo optimization of total body irradiation in a phantom and patient geometry Phys. Med. Biol. 58 2461-69 Czarnecki D and Zink K 2013 Monte Carlo calculated correction factors for diodes and ion chambers in small photon fields Phys. Med. Biol. 58 2431-44 Mairani A, Böhlen T T, Schiavi A, Tessonnier T, Molinelli S, Brons S, Battistoni G, Parodi K and Patera V 2013 A Monte Carlo-based treatment planning tool for proton therapy Phys. Med. Biol. 58 2471-90 Marcatili S, Pettinato C, Daniels S, Lewis G, Edwards P, Fanti S and Spezi E 2013 Development and validation of RAYDOSE: a Geant4 based application for molecular radiotherapy Phys. Med. Biol. 58 2491-508 Miras H, Jiménez R, Miras C and Gomà C 2013 CloudMC: A cloud computing application for Monte Carlo simulation Phys. Med. Biol. 58 N125-33 Reynaert N 2007 First European Workshop on Monte Carlo Treatment Planning J. Phys.: Conf. Ser. 74 011001 Seuntjens J, Beaulieu L, El Naqa I and Després P 2012 Special section: Selected papers from the Fourth International Workshop on Recent Advances in Monte Carlo Techniques for Radiation Therapy Phys. Med. Biol. 57 (11) E01 Spezi E 2010 Special section: Selected papers from the Second European Workshop on Monte Carlo Treatment Planning (MCTP2009) Phys. Med. Biol. 55 (16) E01 Wagner A, Crop F, Lacornerie T, Vandevelde F and Reynaert N 2013 Use of a liquid ionization chamber for stereotactic radiotherapy dosimetry Phys. Med. Biol. 58 2445-59

  16. SU-F-T-364: Monte Carlo-Dose Verification of Volumetric Modulated Arc Therapy Plans Using AAPM TG-119 Test Patterns

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Onizuka, R; Araki, F; Ohno, T

    2016-06-15

    Purpose: To investigate Monte Carlo (MC)-based dose verification of VMAT plans created by a treatment planning system (TPS). Methods: The AAPM TG-119 test structure set was used for VMAT plans in Pinnacle3 (convolution/superposition), using a Synergy radiation head with a 6 MV beam and the Agility MLC. The Synergy was simulated with the EGSnrc/BEAMnrc code, and VMAT dose distributions were calculated with the EGSnrc/DOSXYZnrc code under the same irradiation conditions as the TPS. The VMAT dose distributions of the TPS and MC were compared with those of EBT3 film by 2-D gamma analysis with ±3%/3 mm criteria and a threshold of 30% of the prescribed dose. VMAT dose distributions of the TPS and MC were also compared by DVHs and by 3-D gamma analysis with ±3%/3 mm criteria and a threshold of 10%, and 3-D passing rates for PTVs and OARs were analyzed. Results: TPS dose distributions differed from those of film, especially for Head & neck. The dose difference between the TPS and film results from the limited calculation accuracy for complex MLC motion, such as the tongue-and-groove effect. In contrast, MC dose distributions were in good agreement with those of film. This is because MC can fully model the MLC configuration and accurately reproduce the MLC motion between control points in VMAT plans. D95 of the PTV for Prostate, Head & neck, C-shaped, and Multi Target was 97.2%, 98.1%, 101.6%, and 99.7% for the TPS and 95.7%, 96.0%, 100.6%, and 99.1% for MC, respectively. Similarly, 3-D gamma passing rates of each PTV for TPS vs. MC were 100%, 89.5%, 99.7%, and 100%, respectively. The 3-D passing rates of the TPS were reduced for complex VMAT fields such as Head & neck because the MLCs are not modeled completely in the TPS. Conclusion: MC-calculated VMAT dose distributions are useful for 3-D dose verification of VMAT plans produced by a TPS.

  17. Percentage depth dose evaluation in heterogeneous media using thermoluminescent dosimetry

    PubMed Central

    da Rosa, L.A.R.; Campos, L.T.; Alves, V.G.L.; Batista, D.V.S.; Facure, A.

    2010-01-01

    The purpose of this study is to investigate the influence of lung heterogeneity inside a soft tissue phantom on percentage depth dose (PDD). PDD curves were obtained experimentally using LiF:Mg,Ti (TLD‐100) thermoluminescent detectors and applying the Eclipse treatment planning system algorithms Batho, modified Batho (M‐Batho or BMod), equivalent TAR (E‐TAR or EQTAR), and anisotropic analytical algorithm (AAA) for a 15 MV photon beam and field sizes of 1×1, 2×2, 5×5, and 10×10 cm2. Monte Carlo simulations were performed using the DOSRZnrc user code of EGSnrc. The experimental results agree with Monte Carlo simulations for all irradiation field sizes. Comparisons with Monte Carlo calculations show that the AAA algorithm provides the best simulations of PDD curves for all field sizes investigated. However, even this algorithm cannot accurately predict PDD values in the lung for field sizes of 1×1 and 2×2 cm2. An overdosage in the lung of about 40% and 20% is calculated by the AAA algorithm close to the soft tissue/lung interface for 1×1 and 2×2 cm2 field sizes, respectively. It was demonstrated that differences of 100% between Monte Carlo results and the Batho, modified Batho, and equivalent TAR algorithms may exist inside the lung region for the 1×1 cm2 field. PACS number: 87.55.kd

  18. An efficient Monte Carlo-based algorithm for scatter correction in keV cone-beam CT

    NASA Astrophysics Data System (ADS)

    Poludniowski, G.; Evans, P. M.; Hansen, V. N.; Webb, S.

    2009-06-01

    A new method is proposed for scatter-correction of cone-beam CT images. A coarse reconstruction is used in initial iteration steps. Modelling of the x-ray tube spectra and detector response are included in the algorithm. Photon diffusion inside the imaging subject is calculated using the Monte Carlo method. Photon scoring at the detector is calculated using forced detection to a fixed set of node points. The scatter profiles are then obtained by linear interpolation. The algorithm is referred to as the coarse reconstruction and fixed detection (CRFD) technique. Scatter predictions are quantitatively validated against a widely used general-purpose Monte Carlo code: BEAMnrc/EGSnrc (NRCC, Canada). Agreement is excellent. The CRFD algorithm was applied to projection data acquired with a Synergy XVI CBCT unit (Elekta Limited, Crawley, UK), using RANDO and Catphan phantoms (The Phantom Laboratory, Salem NY, USA). The algorithm was shown to be effective in removing scatter-induced artefacts from CBCT images, and took as little as 2 min on a desktop PC. Image uniformity was greatly improved as was CT-number accuracy in reconstructions. This latter improvement was less marked where the expected CT-number of a material was very different to the background material in which it was embedded.

  19. SU-F-T-12: Monte Carlo Dosimetry of the 60Co Bebig High Dose Rate Source for Brachytherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campos, L T; Almeida, C E V de

    Purpose: The purpose of this work is to obtain the dosimetry parameters of the BEBIG 60Co high-dose-rate brachytherapy source in accordance with the AAPM TG-43U1 formalism using Monte Carlo calculations. The geometric design and material details of the source were provided by the manufacturer and were used to define the Monte Carlo geometry. Methods: The dosimetry studies included the calculation of the air kerma strength Sk, the collision kerma in water along the transverse axis in an unbounded phantom, the dose rate constant and the radial dose function. The Monte Carlo code system used was EGSnrc with a new cavity code, which is a part of EGS++ and allows calculation of the radial dose function around the source. The XCOM photon cross-section library was used. Variance reduction techniques were used to speed up the calculation and to considerably reduce the computing time. To obtain the dose rate distributions of the source in an unbounded liquid water phantom, the source was immersed at the center of a cubic phantom of 100 cm3. Results: The obtained dose rate constant for the BEBIG 60Co source was 1.108±0.001 cGy h-1 U-1, which is consistent with the values in the literature. The radial dose functions were compared with the values of the consensus data set in the literature, and they are consistent with the published data for this energy range. Conclusion: The dose rate constant is consistent with the results of Granero et al. and Selvam and Bhola within 1%. Dose rate data are compared to the GEANT4 and DOSRZnrc Monte Carlo codes. However, the radial dose function differs by up to 10% for points very near the source on the transverse axis because of the high-energy photons of 60Co, which cause electronic disequilibrium at the interface between the source capsule and the liquid water for distances up to 1 cm.

  20. [Benchmark experiment to verify radiation transport calculations for dosimetry in radiation therapy].

    PubMed

    Renner, Franziska

    2016-09-01

    Monte Carlo simulations are regarded as the most accurate method of solving complex problems in the field of dosimetry and radiation transport. In (external) radiation therapy they are increasingly used for the calculation of dose distributions during treatment planning. In comparison to other algorithms for the calculation of dose distributions, Monte Carlo methods have the capability of improving the accuracy of dose calculations - especially under complex circumstances (e.g. consideration of inhomogeneities). However, there is a lack of knowledge of how accurate the results of Monte Carlo calculations are on an absolute basis. A practical verification of the calculations can be performed by direct comparison with the results of a benchmark experiment. This work presents such a benchmark experiment and compares its results (with detailed consideration of measurement uncertainty) with the results of Monte Carlo calculations using the well-established Monte Carlo code EGSnrc. The experiment was designed to have parallels to external beam radiation therapy with respect to the type and energy of the radiation, the materials used and the kind of dose measurement. Because the properties of the beam have to be well known in order to compare the results of the experiment and the simulation on an absolute basis, the benchmark experiment was performed using the research electron accelerator of the Physikalisch-Technische Bundesanstalt (PTB), whose beam was accurately characterized in advance. The benchmark experiment and the corresponding Monte Carlo simulations were carried out for two different types of ionization chambers and the results were compared. Considering the uncertainty, which is about 0.7 % for the experimental values and about 1.0 % for the Monte Carlo simulation, the results of the simulation and the experiment coincide. Copyright © 2015. Published by Elsevier GmbH.

  1. Optimisation of 12 MeV electron beam simulation using variance reduction technique

    NASA Astrophysics Data System (ADS)

    Jayamani, J.; Termizi, N. A. S. Mohd; Kamarulzaman, F. N. Mohd; Aziz, M. Z. Abdul

    2017-05-01

    Monte Carlo (MC) simulation for electron beam radiotherapy consumes a long computation time. Variance reduction techniques (VRT) were implemented in the MC calculation to reduce this time. This work focused on optimisation of the VRT parameters, namely electron range rejection and the number of particle histories. The EGSnrc MC code system was used to simulate (BEAMnrc code) and validate (DOSXYZnrc code) the Siemens Primus linear accelerator model without VRT. The validated MC model simulation was then repeated applying electron range rejection, controlled by a global electron cut-off energy of 1, 2 and 5 MeV, using 20 × 10^7 particle histories. The 5 MeV range rejection produced the fastest MC simulation, with a 50% reduction in computation time compared to the non-VRT simulation. The 5 MeV electron range rejection was therefore used in the particle-history analysis, which ranged from 7.5 × 10^7 to 20 × 10^7 histories. In this study, with a 5 MeV electron cut-off and 10 × 10^7 particle histories, the simulation was four times faster than the non-VRT calculation with only 1% deviation. Proper understanding and use of VRT can significantly reduce MC electron beam calculation times while preserving accuracy.
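    The range-rejection idea being tuned here can be summarized by a small decision rule. The sketch below is a generic illustration of the technique, not EGSnrc source code: a charged particle whose energy is below the global cut-off and whose residual range cannot reach the nearest region boundary is terminated and its energy deposited locally.

```python
def range_reject(energy_mev, residual_range_cm, dist_to_boundary_cm, esave_global_mev):
    """Return True if the electron history can be terminated with local energy deposition."""
    below_cutoff = energy_mev < esave_global_mev
    cannot_escape = residual_range_cm < dist_to_boundary_cm
    return below_cutoff and cannot_escape

# example: a 1.8 MeV electron with 0.7 cm residual range, 1.5 cm from the nearest
# boundary, using the 5 MeV global cut-off that gave the fastest simulation here
if range_reject(1.8, 0.7, 1.5, 5.0):
    print("terminate history and deposit 1.8 MeV locally")
```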

  2. Accuracy Evaluation of Oncentra™ TPS in HDR Brachytherapy of Nasopharynx Cancer Using EGSnrc Monte Carlo Code.

    PubMed

    Hadad, K; Zohrevand, M; Faghihi, R; Sedighi Pashaki, A

    2015-03-01

    HDR brachytherapy is one of the commonest methods of nasopharyngeal cancer treatment. In this method, depending on how advanced the tumor is, a dose of 2 to 6 Gy is prescribed as intracavitary brachytherapy. Due to the high dose rate and the tumor location, accuracy evaluation of the treatment planning system (TPS) is particularly important. Common methods used in TPS dosimetry are based on computations in a homogeneous phantom. Heterogeneous phantoms, especially patient-specific voxel phantoms, can increase dosimetric accuracy. In this study, using CT images taken from a patient and ctcreate, which is a part of the DOSXYZnrc computational code, a patient-specific phantom was made. The dose distribution was calculated with DOSXYZnrc and compared with that of the TPS. Also, by extracting the absorbed dose of the voxels in the treatment volume, dose-volume histograms (DVHs) were plotted and compared with the Oncentra™ TPS DVHs. The results from the calculations were compared with data from the Oncentra™ treatment planning system, and it was observed that the TPS calculation predicts a lower dose in areas near the source and a higher dose in areas far from the source relative to the MC code. The absorbed dose values in the voxels also showed that the TPS-reported D90 value is 40% higher than that of the Monte Carlo method. Today, most treatment planning systems use the TG-43 protocol. This protocol may result in errors such as neglecting tissue heterogeneity, scattered radiation, and applicator attenuation. Due to these errors, the AAPM has emphasized departing from the TG-43 protocol and moving toward the new brachytherapy protocol TG-186, in which a patient-specific phantom is used and heterogeneities are taken into account in the dosimetry.
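    The DVH comparison described above reduces to simple operations on the voxel doses inside the treatment volume. A minimal sketch with made-up voxel doses is shown below, taking D90 as the minimum dose covering the hottest 90% of the volume.

```python
import numpy as np

def cumulative_dvh(voxel_doses, dose_axis):
    """Fraction of the volume receiving at least each dose level."""
    return np.array([(voxel_doses >= d).mean() for d in dose_axis])

def d90(voxel_doses):
    """Minimum dose received by the hottest 90% of the volume."""
    return np.percentile(voxel_doses, 10.0)

doses = np.random.gamma(shape=20.0, scale=0.25, size=10000)   # made-up voxel doses (Gy)
axis = np.linspace(0, doses.max(), 100)
dvh = cumulative_dvh(doses, axis)
print(f"D90 = {d90(doses):.2f} Gy")
```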

  3. Accuracy Evaluation of Oncentra™ TPS in HDR Brachytherapy of Nasopharynx Cancer Using EGSnrc Monte Carlo Code

    PubMed Central

    Hadad, K.; Zohrevand, M.; Faghihi, R.; Sedighi Pashaki, A.

    2015-01-01

    Background: HDR brachytherapy is one of the commonest methods of nasopharyngeal cancer treatment. In this method, depending on how advanced the tumor is, a dose of 2 to 6 Gy is prescribed as intracavitary brachytherapy. Due to the high dose rate and the tumor location, accuracy evaluation of the treatment planning system (TPS) is particularly important. Common methods used in TPS dosimetry are based on computations in a homogeneous phantom. Heterogeneous phantoms, especially patient-specific voxel phantoms, can increase dosimetric accuracy. Materials and Methods: In this study, a patient-specific phantom was made using CT images taken from a patient and ctcreate, which is part of the DOSXYZnrc computational code. The dose distribution was calculated with DOSXYZnrc and compared with that of the TPS. Also, by extracting the absorbed dose of the voxels in the treatment volume, dose-volume histograms (DVHs) were plotted and compared with the Oncentra™ TPS DVHs. Results: The results of the calculations were compared with data from the Oncentra™ treatment planning system, and it was observed that the TPS calculation predicts a lower dose in areas near the source and a higher dose in areas far from the source relative to the MC code. The absorbed dose values in the voxels also showed that the D90 value reported by the TPS is 40% higher than that obtained with the Monte Carlo method. Conclusion: Today, most treatment planning systems use the TG-43 protocol. This protocol may result in errors such as neglecting tissue heterogeneity, scattered radiation and applicator attenuation. Due to these errors, the AAPM has emphasized departing from the TG-43 protocol and moving toward the new brachytherapy protocol TG-186, in which a patient-specific phantom is used and heterogeneities are accounted for in the dosimetry. PMID:25973408

  4. TU-D-209-05: Automatic Calculation of Organ and Effective Dose for CBCT and Interventional Fluoroscopic Procedures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiong, Z; Vijayan, S; Oines, A

    Purpose: To compare PCXMC and EGSnrc calculated organ and effective radiation doses from cone-beam computed tomography (CBCT) and interventional fluoroscopically-guided procedures using automatic exposure-event grouping. Methods: For CBCT, we used PCXMC20Rotation.exe to automatically calculate the doses and compared the results to those calculated using EGSnrc with the Zubal patient phantom. For interventional procedures, we used the dose tracking system (DTS), which we previously developed, to produce a log file of all geometry and exposure parameters for every x-ray pulse during a procedure; the data in the log file are input into PCXMC and EGSnrc for dose calculation. A MATLAB program reads data from the log files and groups similar exposures to reduce calculation time. The definition files are then automatically generated in the format used by PCXMC and EGSnrc. Processing is done at the end of the procedure after all exposures are completed. Results: For the Toshiba Infinix CBCT LCI-Middle-Abdominal protocol, most organ doses calculated with PCXMC20Rotation closely matched those calculated with EGSnrc. The effective doses were 33.77 mSv with PCXMC20Rotation and 32.46 mSv with EGSnrc. For a simulated interventional cardiac procedure, similar close agreement in organ dose was obtained between the two codes; the effective doses were 12.02 mSv with PCXMC and 11.35 mSv with EGSnrc. The calculations can be completed on a PC without manual intervention in less than 15 minutes with PCXMC and in about 10 hours with EGSnrc, depending on the level of data grouping and accuracy desired. Conclusion: Effective dose and most organ doses in CBCT and interventional radiology calculated by PCXMC closely match those calculated by EGSnrc. Data grouping, which can be done automatically, makes the calculation time with PCXMC on a standard PC acceptable. This capability expands the dose information that can be provided by the DTS. Partial support from NIH Grant R01-EB002873 and Toshiba Medical Systems Corp.
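
    The grouping step described above can be pictured as binning exposure events that share nearly identical geometry and technique parameters and accumulating their tube output, so that one dose calculation stands in for many pulses. The sketch below is only an illustration of that idea; the record fields (kvp, angles, mas) are hypothetical and do not reflect the actual DTS log format.

    ```python
    # Minimal sketch: grouping similar exposure events so one Monte Carlo
    # or PCXMC run can represent many x-ray pulses.  Field names are
    # hypothetical, not the real DTS log file format.
    from collections import defaultdict

    def group_key(event, kvp_bin=2.0, angle_bin=5.0):
        """Quantize the parameters that drive the dose geometry."""
        return (
            round(event["kvp"] / kvp_bin) * kvp_bin,
            round(event["primary_angle"] / angle_bin) * angle_bin,
            round(event["secondary_angle"] / angle_bin) * angle_bin,
            event["field_size_cm"],
        )

    def group_exposures(events):
        groups = defaultdict(lambda: {"mas": 0.0, "n_pulses": 0})
        for ev in events:
            g = groups[group_key(ev)]
            g["mas"] += ev["mas"]        # accumulate tube output for the group
            g["n_pulses"] += 1
        return dict(groups)

    events = [
        {"kvp": 80.4, "primary_angle": 30.1, "secondary_angle": 0.0,
         "field_size_cm": 20, "mas": 1.2},
        {"kvp": 80.9, "primary_angle": 31.7, "secondary_angle": 0.0,
         "field_size_cm": 20, "mas": 1.3},
    ]
    print(group_exposures(events))
    ```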

  5. The influence of Monte Carlo source parameters on detector design and dose perturbation in small field dosimetry

    NASA Astrophysics Data System (ADS)

    Charles, P. H.; Crowe, S. B.; Kairn, T.; Knight, R.; Hill, B.; Kenny, J.; Langton, C. M.; Trapp, J. V.

    2014-03-01

    To obtain accurate Monte Carlo simulations of small radiation fields, it is important to model the initial source parameters (electron energy and spot size) accurately. However, recent studies have shown that small field dosimetry correction factors are insensitive to these parameters. The aim of this work is to extend this concept to test whether these parameters affect dose perturbations in general, which is important for detector design and for calculating perturbation correction factors. The EGSnrc C++ user code cavity was used for all simulations. Varying amounts of air between 0 and 2 mm were deliberately introduced upstream of a diode and the dose perturbation caused by the air was quantified. These simulations were then repeated using a range of initial electron energies (5.5 to 7.0 MeV) and electron spot sizes (0.7 to 2.2 FWHM). The resultant dose perturbations were large. For example, 2 mm of air caused a dose reduction of up to 31% when simulated with a 6 mm field size. However, these values did not vary by more than 2% when simulated across the full range of source parameters tested. If a detector is modified by the introduction of air, one can be confident that the response of the detector will be the same across all similar linear accelerators, and Monte Carlo modelling of each individual machine is not required.
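
    The quantity tracked above is essentially the ratio of detector dose with and without the air gap, evaluated for each combination of source energy and spot size. A minimal sketch of that tabulation is shown below; the dose model is a toy stand-in for the EGSnrc runs and the numbers are placeholders, not the paper's results.

    ```python
    # Minimal sketch: air-gap dose perturbation expressed as the ratio
    # D(with air) / D(no air), tabulated over source parameters.
    # The "simulated_dose" function is a toy stand-in, not EGSnrc output.
    import itertools

    energies_mev = [5.5, 6.0, 6.5, 7.0]
    spot_sizes_fwhm = [0.7, 1.2, 2.2]

    def simulated_dose(energy_mev, spot_fwhm, air_mm):
        """Placeholder for a detector dose returned by a cavity simulation."""
        return 1.0 - 0.15 * air_mm * (1.0 + 0.01 * (energy_mev - 6.0))

    for e, s in itertools.product(energies_mev, spot_sizes_fwhm):
        ratio = simulated_dose(e, s, air_mm=2.0) / simulated_dose(e, s, air_mm=0.0)
        print(f"E = {e:.1f} MeV, spot = {s:.1f}: perturbation = {ratio:.3f}")
    ```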

  6. SU-E-T-525: Ionization Chamber Perturbation in Flattening Filter Free Beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Czarnecki, D; Voigts-Rhetz, P von; Zink, K

    2015-06-15

    Purpose: Changing the characteristics of a photon beam by mechanically removing the flattening filter may impact the dose response of ionization chambers. Thus, perturbation factors of cylindrical ionization chambers in conventional and flattening filter free photon beams were calculated by Monte Carlo simulations. Methods: The EGSnrc/BEAMnrc code system was used for all Monte Carlo calculations. BEAMnrc models of nine different linear accelerators with and without flattening filter were used to create realistic photon sources. Monte Carlo based calculations to determine the fluence perturbations due to the presence of the chamber components, the different material of the sensitive volume (air instead of water) as well as the volume effect were performed with the user code egs_chamber. Results: Stem, central electrode, wall, density and volume perturbation factors for linear accelerators with and without flattening filter were calculated as a function of the beam quality specifier TPR20/10. For the perturbations caused by the components of the ionization chamber and the sensitive volume, no bias between the perturbation factors as a function of TPR20/10 for flattening filter free beams and conventional linear accelerators could be observed. Conclusion: The results indicate that the well-known small bias between the beam quality correction factors as a function of TPR20/10 for flattening filter free and conventional linear accelerators is not caused by the geometry of the detector but rather by the material of the sensitive volume. This suggests that the bias for flattening filter free photon fields is caused only by the different material of the sensitive volume (air instead of water).

  7. G4DARI: Geant4/GATE based Monte Carlo simulation interface for dosimetry calculation in radiotherapy.

    PubMed

    Slimani, Faiçal A A; Hamdi, Mahdjoub; Bentourkia, M'hamed

    2018-05-01

    Monte Carlo (MC) simulation is widely recognized as an important technique for studying the physics of particle interactions in nuclear medicine and radiation therapy. Different codes dedicated to dosimetry applications are widely used today in research and in clinical applications, such as MCNP, EGSnrc and Geant4. However, although such codes have made the physics easier, the programming remains a tedious task even for physicists familiar with computer programming. In this paper we report the development of a new interface, GEANT4 Dose And Radiation Interactions (G4DARI), based on GEANT4 for absorbed dose calculation and for particle tracking in humans, small animals and complex phantoms. The calculation of the absorbed dose is performed based on 3D CT human or animal images in DICOM format, on images of phantoms, or on solid volumes made of any pure or composite material specified by its molecular formula. G4DARI offers menus to the user and tabs to be filled with values or chemical formulas. The interface is described and, as an application, we show results obtained in a lung tumor in a digital mouse irradiated with seven energy beams, and in a patient with glioblastoma irradiated with five photon beams. In conclusion, G4DARI can be used easily by any researcher without the need to be familiar with computer programming, and it will be freely available as an application package. Copyright © 2018 Elsevier Ltd. All rights reserved.

  8. An assessment of the efficiency of methods for measurement of the computed tomography dose index (CTDI) for cone beam (CBCT) dosimetry by Monte Carlo simulation.

    PubMed

    Abuhaimed, Abdullah; J Martin, Colin; Sankaralingam, Marimuthu; J Gentle, David; McJury, Mark

    2014-11-07

    The IEC has introduced a practical approach to overcome shortcomings of the CTDI100 for measurements on the wide beams employed for cone beam (CBCT) scans. This study evaluated the efficiency of this approach (CTDIIEC) for different arrangements using Monte Carlo simulation techniques, and compared the efficiency of CTDIIEC to that of CTDI100 for CBCT. The Monte Carlo EGSnrc/BEAMnrc and EGSnrc/DOSXYZnrc codes were used to simulate the kV imaging system mounted on a Varian TrueBeam linear accelerator. The Monte Carlo model was benchmarked against experimental measurements and good agreement was shown. Standard PMMA head and body phantoms with lengths 150, 600, and 900 mm were simulated. Beam widths studied ranged from 20-300 mm, and four scanning protocols using two acquisition modes were utilized. The efficiency values were calculated at the centre (εc) and periphery (εp) of the phantoms and for the weighted CTDI (εw). The efficiency values for CTDI100 were approximately constant for beam widths 20-40 mm, where εc(CTDI100), εp(CTDI100), and εw(CTDI100) were 74.7 ± 0.6%, 84.6 ± 0.3%, and 80.9 ± 0.4% for the head phantom and 59.7 ± 0.3%, 82.1 ± 0.3%, and 74.9 ± 0.3% for the body phantom, respectively. When the beam width increased beyond 40 mm, ε(CTDI100) values fell steadily, reaching ~30% at a beam width of 300 mm. In contrast, the efficiency of the CTDIIEC was approximately constant over all beam widths, demonstrating its suitability for assessment of CBCT. εc(CTDIIEC), εp(CTDIIEC), and εw(CTDIIEC) were 76.1 ± 0.9%, 85.9 ± 1.0%, and 82.2 ± 0.9% for the head phantom and 60.6 ± 0.7%, 82.8 ± 0.8%, and 75.8 ± 0.7% for the body phantom, respectively, within 2% of the ε(CTDI100) values for narrower beam widths. CTDI100,w and CTDIIEC,w underestimate CTDI∞,w by ~55% and ~18% for the head phantom and by ~56% and ~24% for the body phantom, respectively, using a clinical beam width of 198 mm. The CTDIIEC approach addresses the dependency of efficiency on beam width successfully, and correction factors have been derived to allow calculation of CTDI∞.
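
    For reference, the weighted CTDI combines the centre and periphery values in the standard way (one third centre plus two thirds periphery), and the efficiency quoted above is the ratio of the finite-integration-length CTDI (CTDI100 or CTDIIEC) to the converged CTDI∞. A minimal sketch of that arithmetic, with placeholder numbers rather than the study's data, is:

    ```python
    # Minimal sketch: weighted CTDI and CTDI efficiency.
    # All dose values are placeholders, not results from the study.

    def ctdi_weighted(ctdi_centre, ctdi_periphery):
        """Standard weighting: 1/3 centre + 2/3 periphery."""
        return ctdi_centre / 3.0 + 2.0 * ctdi_periphery / 3.0

    def efficiency(ctdi_finite, ctdi_infinity):
        """Fraction of the infinite-phantom CTDI captured by the finite integral."""
        return ctdi_finite / ctdi_infinity

    ctdi100_w = ctdi_weighted(10.0, 14.0)    # mGy, placeholder
    ctdi_inf_w = ctdi_weighted(22.0, 28.0)   # mGy, placeholder
    print(f"CTDI100,w = {ctdi100_w:.1f} mGy, "
          f"efficiency = {100.0 * efficiency(ctdi100_w, ctdi_inf_w):.0f}%")
    ```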

  9. The effect of low-energy electrons on the response of ion chambers to ionizing photon beams

    NASA Astrophysics Data System (ADS)

    La Russa, Daniel J.

    Cavity ionization chambers are one of the most popular and widely used devices for quantifying ionizing photon beams. This popularity originates from the precision of these devices and the relative ease with which ionization measurements are converted to quantities of interest in therapeutic radiology or radiation protection, collectively referred to as radiation dosimetry. The formalisms used for these conversions, known as cavity theory, make several assumptions about the electron spectrum in the low-energy range resulting from the incident photon beam. These electrons often account for a significant fraction of the ion chamber response. An inadequate treatment of low-energy electrons can therefore significantly affect calculated quantities of interest. This thesis sets out to investigate the effect of low-energy electrons on (1) the use of Spencer-Attix cavity theory with 60Co beams; and (2) the standard temperature-pressure correction factor, PTP, used to relate the measured ionization to a set of reference temperature and pressure conditions for vented ion chambers. Problems with the PTP correction are shown to arise when it is used with kilovoltage x rays, where ionization measurements are due primarily to electrons that do not have enough energy to cross the cavity. A combination of measurements and Monte Carlo calculations using the EGSnrc Monte Carlo code demonstrates the breakdown of PTP in these situations when used with non-air-equivalent chambers. The extent of the breakdown is shown to depend on cavity size, the energy of the incident photons, and the composition of the chamber. In the worst case, the standard PTP factor overcorrects the response of an aluminum chamber by ≈12% at an air density typical of Mexico City. The response of a more common graphite-walled chamber with similar dimensions at the same air density is undercorrected by ≈2%. The EGSnrc Monte Carlo code is also used to investigate Spencer-Attix cavity theory as it is used in the formalism to determine the air kerma for a 60Co beam. Following a comparison with measurements in the literature, the air kerma formalism is shown to require a fluence correction factor, Kfl, to ensure the accuracy of the formalism regardless of chamber composition and cavity size. The need for such a correction stems from the fact that the cavity clearly distorts the fluence for mismatched cavity and wall materials, and from the inability to select the appropriate "cut-off" energy, Delta, in the Spencer-Attix stopping-power ratio. A discussion of this issue is followed by detailed calculations of Kfl values for several of the graphite ionization chambers used at national metrology institutes, which range between 0.9999 and 0.9994 with a one standard deviation uncertainty of ±0.0002.
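
    The temperature-pressure correction in question is the standard normalization of a vented chamber reading to reference conditions, PTP = [(273.2 + T)/(273.2 + T0)] × (P0/P). The short sketch below assumes reference conditions of 22 °C and 101.325 kPa (conventions differ between standards) and illustrates the size of the correction at low ambient pressure.

    ```python
    # Minimal sketch: standard temperature-pressure correction for a
    # vented ionization chamber.  Reference conditions of 22 degC and
    # 101.325 kPa are assumed here; other standards use slightly
    # different reference values.

    def p_tp(temp_c, pressure_kpa, ref_temp_c=22.0, ref_pressure_kpa=101.325):
        return ((273.2 + temp_c) / (273.2 + ref_temp_c)) * (ref_pressure_kpa / pressure_kpa)

    # Example: low ambient pressure typical of a high-altitude location.
    print(f"P_TP = {p_tp(temp_c=20.0, pressure_kpa=78.0):.3f}")
    ```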

  10. SU-F-T-366: Dosimetric Parameters Enhancement of 120-Leaf Millennium MLC Using EGSnrc and IAEA Phase-Space Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haddad, K; Alopoor, H

    Purpose: Recently, multileaf collimators (MLCs) have become an important part of LINAC collimation systems because they reduce the treatment planning time and improve the conformity. Important factors that affect MLC collimation performance are the leaf material composition and thickness. In this study, we investigate the main dosimetric parameters of the 120-leaf Millennium MLC, including the dose at the buildup point, the physical penumbra, and the average and end leaf leakages. The effects of the leaf geometry and density on these parameters are evaluated. Methods: From the EGSnrc Monte Carlo code, the BEAMnrc and DOSXYZnrc modules are used to evaluate the dosimetric parameters of a water phantom exposed to a Varian xi at 100 cm SSD. Using IAEA phase-space data just above the MLC (Z = 46 cm) and BEAMnrc, new phase-space data at Z = 52 cm are produced for the modified 120-leaf Millennium MLC. The MLC is modified both in leaf thickness and in material composition. The EGSgui code generates the 521ICRU library for tungsten alloys. DOSXYZnrc with the new phase space evaluates the dose distribution in a water phantom of 60 × 60 × 20 cm^3 with a voxel size of 4 × 4 × 2 mm^3. Using the DOSXYZnrc dose distributions for the open and closed beams together with the leakage definitions, the end leakage, average leakage and physical penumbra are evaluated. Results: A new MLC with improved dosimetric parameters is proposed. The physical penumbra for the proposed MLC is 4.7 mm compared to 5.16 mm for the Millennium. The average leakage in our design is reduced to 1.16% compared to 1.73% for the Millennium, and the end leaf leakage of the suggested design is also reduced to 4.86% compared to 7.26% for the Millennium. Conclusion: The results show that the proposed MLC with enhanced dosimetric parameters could improve the conformity of treatment planning.
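
    The penumbra and leakage figures above come from straightforward post-processing of open- and closed-leaf dose profiles: the physical penumbra is the lateral distance between the 80% and 20% dose levels at a field edge, and the leakage is the closed-leaf dose expressed as a fraction of the open-field dose. A minimal sketch of that post-processing, using synthetic profiles rather than the simulated ones, is:

    ```python
    # Minimal sketch: physical penumbra (80%-20% distance) and leakage
    # from open/closed MLC dose profiles.  The profiles are synthetic.
    import numpy as np

    x_mm = np.linspace(-40.0, 40.0, 801)
    open_profile = 1.0 / (1.0 + np.exp((np.abs(x_mm) - 20.0) / 2.0))  # toy open field
    closed_profile = 0.015 * open_profile                             # toy leakage

    def penumbra_mm(x, profile, lo=0.2, hi=0.8):
        """Distance between the 80% and 20% levels on the right-hand field edge."""
        norm = profile / profile.max()
        edge = x > 0
        # Reverse so the interpolation abscissa (dose level) is increasing.
        x_hi = np.interp(hi, norm[edge][::-1], x[edge][::-1])
        x_lo = np.interp(lo, norm[edge][::-1], x[edge][::-1])
        return abs(x_lo - x_hi)

    leakage_pct = 100.0 * closed_profile.max() / open_profile.max()
    print(f"penumbra = {penumbra_mm(x_mm, open_profile):.2f} mm, "
          f"leakage = {leakage_pct:.2f}%")
    ```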

  11. SU-E-T-222: Computational Optimization of Monte Carlo Simulation On 4D Treatment Planning Using the Cloud Computing Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chow, J

    Purpose: This study evaluated the efficiency of 4D lung radiation treatment planning using Monte Carlo simulation on the cloud. The EGSnrc Monte Carlo code was used for dose calculation on the 4D-CT image set. Methods: A 4D lung radiation treatment plan was created by the DOSCTP linked to the cloud, based on the Amazon Elastic Compute Cloud platform. Dose calculation was carried out by Monte Carlo simulation on the 4D-CT image set on the cloud, and the results were sent to the FFD4D image deformation program for dose reconstruction. The dependence of the computing time for the treatment plan on the number of compute nodes was optimized with variations of the number of CT image sets in the breathing cycle and the dose reconstruction time of the FFD4D. Results: It is found that the dependence of computing time on the number of compute nodes was affected by the diminishing return of the number of nodes used in the Monte Carlo simulation. Moreover, the performance of the 4D treatment planning could be optimized by using fewer than 10 compute nodes on the cloud. The effects of the number of image sets and the dose reconstruction time on the dependence of computing time on the number of nodes were not significant when more than 15 compute nodes were used in the Monte Carlo simulations. Conclusion: The issue of the long computing time of a 4D treatment plan, which requires Monte Carlo dose calculations in all CT image sets in the breathing cycle, can be solved using cloud computing technology. It is concluded that the optimized number of compute nodes selected for simulation should be between 5 and 15, as the dependence of computing time on the number of nodes is significant in this range.
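
    The diminishing return with node count is the familiar behaviour of parallel speedup when a serial component (here, for example, dose reconstruction) remains. By Amdahl's law the speedup with N nodes is S(N) = 1 / (f + (1 - f)/N), where f is the serial fraction. A small illustrative sketch follows; the serial fraction is a placeholder, not a value from the study.

    ```python
    # Minimal sketch: Amdahl-style speedup illustrating diminishing
    # returns with the number of compute nodes.  The serial fraction
    # is a placeholder, not a value reported in the study.

    def speedup(n_nodes, serial_fraction):
        return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_nodes)

    for n in (1, 5, 10, 15, 20):
        print(f"{n:2d} nodes -> speedup {speedup(n, serial_fraction=0.08):.2f}x")
    ```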

  12. TH-AB-201-09 [Medical Physics, Jun 2016, v. 43(6)]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mirzakhanian, L; Benmakhlouf, H; Seuntjens, J

    2016-06-15

    Purpose: To determine the k_(Qmsr,Q)^(fmsr,fref) factor, introduced in the small field formalism, for five common chamber types used in the calibration of the Leksell Gamma-Knife Perfexion model, over a range of different phantom electron densities. Methods: Five chamber types, the Exradin-A16, A14SL, A14, A1SL and IBA-CC04, are modeled in the EGSnrc and PENELOPE Monte Carlo codes using the blueprints provided by the manufacturers. The chambers are placed in a previously proposed water-filled phantom and in four 16-cm diameter spherical phantoms made of liquid water, Solid Water, ABS and polystyrene. The dose to the cavity of the chambers and to a small water volume are calculated using the EGSnrc/PENELOPE codes. The calculations are performed over a range of phantom electron densities for two chamber orientations. Using the calculated dose ratios in the reference and machine-specific reference fields, the k_(Qmsr,Q)^(fmsr,fref) factor can be determined. Results: When the chambers are placed along the symmetry axis of the collimator block (z-axis), the CC04 requires the smallest correction, followed by the A1SL and A16. However, when the detectors are placed perpendicular to the z-axis, the A14SL needs the smallest and the A16 the largest correction. Moreover, an increase in the phantom electron density results in a linear increase in k_(Qmsr,Q)^(fmsr,fref). Depending on the chamber, the agreement between this study and a previous study varies between 0.05%-0.70% for liquid water, 0.07%-0.85% for Solid Water and 0.00%-0.60% for ABS phantoms. After applying the EGSnrc-calculated k_(Qmsr,Q)^(fmsr,fref) factors for the A16 to the previously measured dose rates in liquid water, Solid Water and ABS, normalized to the dose rate measured with the TG-21 protocol and the ABS phantom, the dose-rate ratios are found to be 1.004±0.002, 0.996±0.002 and 0.998±0.002 (3σ), respectively. Conclusion: Knowing the electron density of the phantoms, the calculated k_(Qmsr,Q)^(fmsr,fref) values in this work will enable users to apply the appropriate correction for their own specific phantom material. LM acknowledges partial support by the CREATE Medical Physics Research Training Network grant of the Natural Sciences and Engineering Research Council (Grant number: 432290)
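
    In the small-field formalism referred to here, the factor is the ratio of (dose to water)/(dose to the detector cavity) evaluated in the machine-specific reference field to the same ratio in the conventional reference field. A minimal sketch of that ratio, with placeholder Monte Carlo dose values rather than the study's results, is:

    ```python
    # Minimal sketch: k_{Qmsr,Q}^{fmsr,fref} from Monte Carlo dose ratios
    # in the machine-specific reference (msr) and conventional reference
    # fields.  The dose values are placeholders, not study results.

    def k_msr(d_water_msr, d_cavity_msr, d_water_ref, d_cavity_ref):
        """(Dw/Ddet) in the msr field divided by (Dw/Ddet) in the reference field."""
        return (d_water_msr / d_cavity_msr) / (d_water_ref / d_cavity_ref)

    print(f"k = {k_msr(1.000, 0.985, 1.000, 0.990):.4f}")
    ```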

  13. Monte Carlo calculated doses to treatment volumes and organs at risk for permanent implant lung brachytherapy

    NASA Astrophysics Data System (ADS)

    Sutherland, J. G. H.; Furutani, K. M.; Thomson, R. M.

    2013-10-01

    Iodine-125 (125I) and Caesium-131 (131Cs) brachytherapy have been used with sublobar resection to treat stage I non-small cell lung cancer, and other radionuclides, 169Yb and 103Pd, are considered for these treatments. This work investigates the dosimetry of permanent implant lung brachytherapy for a range of source energies and various implant sites in the lung. Doses are calculated with Monte Carlo methods in a patient CT-derived computational phantom using the EGSnrc user-code BrachyDose. Calculations are performed for 103Pd, 125I and 131Cs seeds and for 50 and 100 keV point sources at 17 implant positions. Doses to treatment volumes, ipsilateral lung, aorta, and heart are determined and compared to those determined using the TG-43 approach. Considerable variation with source energy and differences between model-based and TG-43 doses are found for both treatment volumes and organs. Doses to the heart and aorta generally increase with increasing source energy. TG-43 underestimates the dose to the heart and aorta for all implants except those nearest to these organs, where the dose is overestimated. Results suggest that model-based dose calculations are crucial for selecting prescription doses, comparing clinical endpoints, and studying radiobiological effects for permanent implant lung brachytherapy.

  14. A full Monte Carlo simulation of the YAP-PEM prototype for breast tumor detection

    NASA Astrophysics Data System (ADS)

    Motta, A.; Righi, S.; Del Guerra, A.; Belcari, N.; Vaiano, A.; De Domenico, G.; Zavattini, G.; Campanini, R.; Lanconelli, N.; Riccardi, A.

    2004-07-01

    A prototype for Positron Emission Mammography, the YAP-PEM, is under development within a collaboration of the Italian Universities of Pisa, Ferrara, and Bologna. The aim is to detect breast lesions of 5 mm in diameter with a specific activity ratio of 10:1 between the cancer and the breast tissue. The YAP-PEM is composed of two stationary detection heads of 6×6 cm^2, each composed of a matrix of 30×30 YAP:Ce finger crystals of 2×2×30 mm^3. The EGSnrc Monte Carlo code has been used to simulate several characteristics of the prototype. A fast EM algorithm has been adapted to reconstruct all of the collected lines of flight, also at large incidence angles, achieving 3D positioning capability of the lesion in the FOV. The role of breast compression has been studied. The study shows that a 5 mm diameter tumor of 37 kBq/cm^3 (1 μCi/cm^3), embedded in active breast tissue with a 10:1 tumor/background specific activity ratio, is detected in 10 min with a signal-to-noise ratio of 8.7±1.0. Two hot lesions in the active breast phantom are clearly visible in the reconstructed image.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ureba, A.; Salguero, F. J.; Barbeiro, A. R.

    Purpose: The authors present a hybrid direct multileaf collimator (MLC) aperture optimization model exclusively based on sequencing of patient imaging data, to be implemented on a Monte Carlo treatment planning system (MC-TPS) to allow the explicit radiation transport simulation of advanced radiotherapy treatments with optimal results in times efficient for clinical practice. Methods: The planning system (called CARMEN) is a full MC-TPS, controlled through a MATLAB interface, which is based on the sequencing of a novel map, called a "biophysical" map, which is generated from enhanced image data of patients to achieve a set of segments that are actually deliverable. In order to reduce the required computation time, the conventional fluence map has been replaced by the biophysical map, which is sequenced to provide direct apertures that are later weighted by means of an optimization algorithm based on linear programming. A ray-casting algorithm throughout the patient CT assembles information about the structures found, the mass thickness crossed, as well as PET values. Data are recorded to generate a biophysical map for each gantry angle. These maps are the input files for a home-made sequencer developed to take into account the interactions of photons and electrons with the MLC. For each linac (Axesse of Elekta and Primus of Siemens) and beam energy studied (6, 9, 12, 15 MeV and 6 MV), phase space files were simulated with the EGSnrc/BEAMnrc code. The dose calculation in the patient was carried out with the BEAMDOSE code. This code is a modified version of EGSnrc/DOSXYZnrc able to calculate the beamlet dose in order to combine the beamlets with different weights during the optimization process. Results: Three complex radiotherapy treatments were selected to check the reliability of CARMEN in situations where the MC calculation can offer an added value: a head-and-neck case (Case I) with three targets delineated on PET/CT images and a demanding dose-escalation; a partial breast irradiation case (Case II) solved with photon and electron modulated beams (IMRT + MERT); and a prostatic bed case (Case III) with a pronounced concave-shaped PTV treated using volumetric modulated arc therapy. In the three cases, the required target prescription doses and constraints on organs at risk were fulfilled in a time short enough to allow routine clinical implementation. The quality assurance protocol followed to check the CARMEN system showed a high agreement with the experimental measurements. Conclusions: A Monte Carlo treatment planning model exclusively based on maps generated from patient imaging data has been presented. The sequencing of these maps allows deliverable apertures to be obtained, which are weighted for modulation under a linear programming formulation. The model is able to solve complex radiotherapy treatments with high accuracy in an efficient computation time.

  16. MCTP system model based on linear programming optimization of apertures obtained from sequencing patient image data maps.

    PubMed

    Ureba, A; Salguero, F J; Barbeiro, A R; Jimenez-Ortega, E; Baeza, J A; Miras, H; Linares, R; Perucha, M; Leal, A

    2014-08-01

    The authors present a hybrid direct multileaf collimator (MLC) aperture optimization model exclusively based on sequencing of patient imaging data, to be implemented on a Monte Carlo treatment planning system (MC-TPS) to allow the explicit radiation transport simulation of advanced radiotherapy treatments with optimal results in times efficient for clinical practice. The planning system (called CARMEN) is a full MC-TPS, controlled through a MATLAB interface, which is based on the sequencing of a novel map, called a "biophysical" map, which is generated from enhanced image data of patients to achieve a set of segments that are actually deliverable. In order to reduce the required computation time, the conventional fluence map has been replaced by the biophysical map, which is sequenced to provide direct apertures that are later weighted by means of an optimization algorithm based on linear programming. A ray-casting algorithm throughout the patient CT assembles information about the structures found, the mass thickness crossed, as well as PET values. Data are recorded to generate a biophysical map for each gantry angle. These maps are the input files for a home-made sequencer developed to take into account the interactions of photons and electrons with the MLC. For each linac (Axesse of Elekta and Primus of Siemens) and beam energy studied (6, 9, 12, 15 MeV and 6 MV), phase space files were simulated with the EGSnrc/BEAMnrc code. The dose calculation in the patient was carried out with the BEAMDOSE code. This code is a modified version of EGSnrc/DOSXYZnrc able to calculate the beamlet dose in order to combine the beamlets with different weights during the optimization process. Three complex radiotherapy treatments were selected to check the reliability of CARMEN in situations where the MC calculation can offer an added value: a head-and-neck case (Case I) with three targets delineated on PET/CT images and a demanding dose-escalation; a partial breast irradiation case (Case II) solved with photon and electron modulated beams (IMRT + MERT); and a prostatic bed case (Case III) with a pronounced concave-shaped PTV treated using volumetric modulated arc therapy. In the three cases, the required target prescription doses and constraints on organs at risk were fulfilled in a time short enough to allow routine clinical implementation. The quality assurance protocol followed to check the CARMEN system showed a high agreement with the experimental measurements. A Monte Carlo treatment planning model exclusively based on maps generated from patient imaging data has been presented. The sequencing of these maps allows deliverable apertures to be obtained, which are weighted for modulation under a linear programming formulation. The model is able to solve complex radiotherapy treatments with high accuracy in an efficient computation time.
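
    The weighting step described above amounts to a linear program: given precomputed segment dose contributions to target and organ-at-risk voxels, find non-negative segment weights that satisfy the prescription and the constraints. The sketch below is a generic formulation of that idea with placeholder matrices and limits, not the actual CARMEN optimization model.

    ```python
    # Minimal sketch: weighting deliverable apertures by linear programming.
    # Generic formulation with placeholder data, not the CARMEN model.
    import numpy as np
    from scipy.optimize import linprog

    rng = np.random.default_rng(1)
    n_segments, n_target_vox, n_oar_vox = 8, 50, 30

    D_target = rng.uniform(0.5, 1.5, (n_target_vox, n_segments))  # Gy per unit weight
    D_oar = rng.uniform(0.0, 0.6, (n_oar_vox, n_segments))

    prescription, oar_limit = 2.0, 1.5  # Gy, placeholders

    # Minimize total weight subject to: target voxels >= prescription,
    # OAR voxels <= limit, weights >= 0.  linprog expects A_ub x <= b_ub.
    c = np.ones(n_segments)
    A_ub = np.vstack([-D_target, D_oar])
    b_ub = np.concatenate([-prescription * np.ones(n_target_vox),
                           oar_limit * np.ones(n_oar_vox)])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * n_segments)
    if res.success:
        print("segment weights:", np.round(res.x, 3))
    else:
        print("no feasible weighting under these placeholder constraints")
    ```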

  17. Comparison of TG-43 dosimetric parameters of brachytherapy sources obtained by three different versions of MCNP codes.

    PubMed

    Zaker, Neda; Zehtabian, Mehdi; Sina, Sedigheh; Koontz, Craig; Meigooni, Ali S

    2016-03-08

    Monte Carlo simulations are widely used for calculation of the dosimetric parameters of brachytherapy sources. MCNP4C2, MCNP5, MCNPX, EGS4, EGSnrc, PTRAN, and GEANT4 are among the most commonly used codes in this field. Each of these codes utilizes a cross-sectional library for the purpose of simulating different elements and materials with complex chemical compositions. The accuracies of the final outcomes of these simulations are very sensitive to the accuracies of the cross-sectional libraries. Several investigators have shown that inaccuracies of some of the cross section files have led to errors in 125I and 103Pd parameters. The purpose of this study is to compare the dosimetric parameters of sample brachytherapy sources, calculated with three different versions of the MCNP code - MCNP4C, MCNP5, and MCNPX. In these simulations for each source type, the source and phantom geometries, as well as the number of the photons, were kept identical, thus eliminating the possible uncertainties. The results of these investigations indicate that for low-energy sources such as 125I and 103Pd there are discrepancies in gL(r) values. Discrepancies up to 21.7% and 28% are observed between MCNP4C and other codes at a distance of 6 cm for 103Pd and 10 cm for 125I from the source, respectively. However, for higher energy sources, the discrepancies in gL(r) values are less than 1.1% for 192Ir and less than 1.2% for 137Cs between the three codes.
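
    The radial dose function compared here is the TG-43 quantity g_L(r) = [D(r, θ0) G_L(r0, θ0)] / [D(r0, θ0) G_L(r, θ0)], where G_L is the line-source geometry function and the reference point is r0 = 1 cm on the transverse axis (θ0 = 90°). A minimal sketch of that normalization from a table of transverse-axis doses is given below; the source length and dose values are placeholders, not data from the study.

    ```python
    # Minimal sketch: TG-43 radial dose function g_L(r) from transverse-axis
    # doses using the line-source geometry function.  Source length and
    # dose values are placeholders, not study data.
    import numpy as np

    L = 0.35   # cm, active source length (placeholder)
    r0 = 1.0   # cm, TG-43 reference distance

    def geometry_factor_line(r, L):
        """Line-source geometry function on the transverse axis (theta = 90 deg)."""
        beta = 2.0 * np.arctan(L / (2.0 * r))   # angle subtended by the source line
        return beta / (L * r)

    r = np.array([0.5, 1.0, 2.0, 5.0, 10.0])          # cm
    dose = np.array([4.1, 1.0, 0.24, 0.035, 0.008])    # arbitrary units, placeholders

    g_L = (dose / dose[r == r0]) * (geometry_factor_line(r0, L) / geometry_factor_line(r, L))
    print(np.round(g_L, 3))
    ```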

  18. SU-E-T-171: Evaluation of the Analytical Anisotropic Algorithm in a Small Finger Joint Phantom Using Monte Carlo Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chow, J; Owrangi, A; Jiang, R

    2014-06-01

    Purpose: This study investigated the performance of the anisotropic analytical algorithm (AAA) in dose calculation in radiotherapy involving a small finger joint. Monte Carlo simulation (EGSnrc code) was used in this dosimetric evaluation. Methods: A heterogeneous finger joint phantom containing a vertical water layer (bone joint or cartilage) sandwiched by two bones with dimensions 2 × 2 × 2 cm^3 was irradiated by 6 MV photon beams (field size = 4 × 4 cm^2). The central beam axis was along the length of the bone joint and the isocenter was set to the center of the joint. The joint width and beam angle were varied from 0.5-2 mm and 0°-15°, respectively. Depth doses were calculated using the AAA and DOSXYZnrc. For dosimetric comparison and normalization, dose calculations were repeated in a water phantom using the same beam geometry. Results: Our AAA and Monte Carlo results showed that the AAA underestimated the joint doses by 10%-20% and could not predict the variation of the joint dose with changes of joint width and beam angle. The calculated bone dose enhancement for the AAA was lower than for Monte Carlo, and the depth of maximum dose for the phantom was smaller than that for the water phantom. From the Monte Carlo results, there was a decrease of joint dose as its width increased. This reflects that the smaller the joint width, the more bone scatter contributed to the depth dose. Moreover, the joint dose was found to decrease slightly with an increase of beam angle. Conclusion: The AAA could not handle variations of joint dose well with changes of joint width and beam angle based on our finger joint phantom. Monte Carlo results showed that the joint dose decreased with increases of joint width and beam angle. This dosimetry comparison should be useful to radiation staff in radiotherapy involving small bone joints.

  19. SU-F-J-14: Kilovoltage Cone-Beam CT Dose Estimation of Varian On-Board Imager Using GMctdospp Monte Carlo Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, S; Rangaraj, D

    2016-06-15

    Purpose: Although cone-beam CT (CBCT) imaging has become popular in radiation oncology, estimating its imaging dose is still challenging. The goal of this study is to assess kilovoltage CBCT doses using GMctdospp, an EGSnrc based Monte Carlo (MC) framework. Methods: Two Varian OBI x-ray tube models were implemented in the GMctdospp framework of the EGSnrc MC system. The x-ray spectrum of the 125 kVp CBCT beam was acquired from an EGSnrc/BEAMnrc simulation and validated against IPEM report 78. The spectrum was then utilized as an input spectrum in the GMctdospp dose calculations. Both the full and half bowtie pre-filters of the OBI system were created using the egs-prism module. The x-ray tube MC models were verified by comparing calculated dosimetric profiles (lateral and depth) to ion chamber measurements for a static x-ray beam irradiating a cuboid water phantom. Abdominal CBCT imaging doses were simulated in the GMctdospp framework using a 5-year-old anthropomorphic phantom. The organ doses and effective dose (ED) from the framework were assessed and compared to MOSFET measurements and convolution/superposition (CS) dose calculations. Results: The lateral and depth dose profiles in the cuboid water phantom were well matched within 6% except in a few areas - the left shoulder of the half bowtie lateral profile and the surface of the water phantom. The organ doses and ED from the MC framework were found to be close to the MOSFET measurements and CS calculations, within 2 cGy and 5 mSv, respectively. Conclusion: This study implemented and validated Varian OBI x-ray tube models in the GMctdospp MC framework using a cuboid water phantom, and CBCT imaging doses were also evaluated in a 5-year-old anthropomorphic phantom. In a future study, various CBCT imaging protocols will be implemented and validated, and patient CT images will consequently be used to estimate the CBCT imaging doses in patients.

  20. A Monte Carlo investigation of lung brachytherapy treatment planning

    NASA Astrophysics Data System (ADS)

    Sutherland, J. G. H.; Furutani, K. M.; Thomson, R. M.

    2013-07-01

    Iodine-125 (125I) and Caesium-131 (131Cs) brachytherapy have been used in conjunction with sublobar resection to reduce the local recurrence of stage I non-small cell lung cancer compared with resection alone. Treatment planning for this procedure is typically performed using only a seed activity nomogram or look-up table to determine seed strand spacing for the implanted mesh. Since the post-implant seed geometry is difficult to predict, the nomogram is calculated using the TG-43 formalism for seeds in a planar geometry. In this work, the EGSnrc user-code BrachyDose is used to recalculate nomograms using a variety of tissue models for 125I and 131Cs seeds. Calculated prescription doses are compared to those calculated using TG-43. Additionally, patient CT and contour data are used to generate virtual implants to study the effects that post-implant deformation and patient-specific tissue heterogeneity have on perturbing nomogram-derived dose distributions. Differences of up to 25% in calculated prescription dose are found between TG-43 and Monte Carlo calculations with the TG-43 formalism underestimating prescription doses in general. Differences between the TG-43 formalism and Monte Carlo calculated prescription doses are greater for 125I than for 131Cs seeds. Dose distributions are found to change significantly based on implant deformation and tissues surrounding implants for patient-specific virtual implants. Results suggest that accounting for seed grid deformation and the effects of non-water media, at least approximately, are likely required to reliably predict dose distributions in lung brachytherapy patients.

  1. Comparison of TG‐43 dosimetric parameters of brachytherapy sources obtained by three different versions of MCNP codes

    PubMed Central

    Zaker, Neda; Sina, Sedigheh; Koontz, Craig; Meigooni, Ali S.

    2016-01-01

    Monte Carlo simulations are widely used for calculation of the dosimetric parameters of brachytherapy sources. MCNP4C2, MCNP5, MCNPX, EGS4, EGSnrc, PTRAN, and GEANT4 are among the most commonly used codes in this field. Each of these codes utilizes a cross-sectional library for the purpose of simulating different elements and materials with complex chemical compositions. The accuracies of the final outcomes of these simulations are very sensitive to the accuracies of the cross-sectional libraries. Several investigators have shown that inaccuracies of some of the cross section files have led to errors in 125I and 103Pd parameters. The purpose of this study is to compare the dosimetric parameters of sample brachytherapy sources, calculated with three different versions of the MCNP code - MCNP4C, MCNP5, and MCNPX. In these simulations for each source type, the source and phantom geometries, as well as the number of the photons, were kept identical, thus eliminating the possible uncertainties. The results of these investigations indicate that for low-energy sources such as 125I and 103Pd there are discrepancies in gL(r) values. Discrepancies up to 21.7% and 28% are observed between MCNP4C and other codes at a distance of 6 cm for 103Pd and 10 cm for 125I from the source, respectively. However, for higher energy sources, the discrepancies in gL(r) values are less than 1.1% for 192Ir and less than 1.2% for 137Cs between the three codes. PACS number(s): 87.56.bg PMID:27074460

  2. Monte Carlo investigation of backscatter point spread function for x-ray imaging examinations

    NASA Astrophysics Data System (ADS)

    Xiong, Zhenyu; Vijayan, Sarath; Rudin, Stephen; Bednarek, Daniel R.

    2017-03-01

    X-ray imaging examinations, especially complex interventions, may result in relatively high doses to the patient's skin, potentially inducing skin injuries. A method was developed to determine the skin-dose distribution for non-uniform x-ray beams by convolving the backscatter point-spread-function (PSF) with the primary-dose distribution to generate the backscatter distribution that, when added to the primary dose, gives the total-dose distribution. This technique was incorporated in the dose-tracking system (DTS), which provides a real-time color-coded 3D mapping of skin dose during fluoroscopic procedures. The aim of this work is to investigate the variation of the backscatter PSF with different parameters. The backscatter PSF of a 1-mm x-ray beam was generated with the EGSnrc Monte Carlo code for different x-ray beam energies, different soft-tissue thicknesses above bone, different bone thicknesses and different entrance-beam angles, as well as for different locations on the SK-150 anthropomorphic head phantom. The results show a reduction of the peak scatter-to-primary dose ratio of 48% when the x-ray beam energy is increased from 40 keV to 120 keV. The backscatter dose was reduced when bone was beneath the soft tissue layer, and this reduction increased with thinner soft tissue and thicker bone layers. The backscatter factor increased by about 21% as the angle of incidence of the beam with the entrance surface decreased from 90° (perpendicular) to 30°. The backscatter PSF differed for different locations on the SK-150 phantom by up to 15%. The results of this study can be used to improve the accuracy of dose calculation when using PSF convolution in the DTS.
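
    The total-dose construction described above is a convolution of the primary-dose map with the backscatter PSF, after which the primary dose is added back in. A minimal sketch with scipy follows; the PSF shape, its integral and the primary map are synthetic placeholders, not the Monte Carlo PSF from this work.

    ```python
    # Minimal sketch: total skin dose = primary + (primary convolved with a
    # backscatter point spread function).  PSF and primary map are synthetic.
    import numpy as np
    from scipy.signal import fftconvolve

    # Synthetic primary-dose map on a 1 mm grid: a uniform 100 x 100 mm field.
    primary = np.zeros((256, 256))
    primary[78:178, 78:178] = 1.0

    # Synthetic backscatter PSF, normalized so its integral equals an assumed
    # total backscatter-to-primary ratio.
    y, x = np.mgrid[-64:64, -64:64]
    psf = np.exp(-np.hypot(x, y) / 15.0)
    psf *= 0.35 / psf.sum()          # placeholder backscatter-to-primary ratio

    backscatter = fftconvolve(primary, psf, mode="same")
    total = primary + backscatter
    print(f"centre total/primary = {total[128, 128] / primary[128, 128]:.2f}")
    ```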

  3. SiliPET: An ultra-high resolution design of a small animal PET scanner based on stacks of double-sided silicon strip detector

    NASA Astrophysics Data System (ADS)

    Di Domenico, Giovanni; Zavattini, Guido; Cesca, Nicola; Auricchio, Natalia; Andritschke, Robert; Schopper, Florian; Kanbach, Gottfried

    2007-02-01

    We investigated with Monte Carlo simulations, using the EGSnrcMP code, the capabilities of a small animal PET scanner based on four stacks of double-sided silicon strip detectors. Each stack consists of 40 silicon detectors with dimensions of 60×60×1 mm^3 and 128 orthogonal strips on each side. Two coordinates of the interaction are given by the strips, whereas the third coordinate is given by the detector number in the stack. The stacks are arranged to form a box of 5×5×6 cm^3 with the minor sides open; the box represents the minimal FOV of the scanner. The performance parameters of the SiliPET scanner have been estimated, giving a (positron range limited) spatial resolution of 0.52 mm FWHM and an absolute sensitivity of 5.1% at the center of the system. Preliminary results of a proof-of-principle measurement done with the MEGA advanced Compton imager using a ≈1 mm diameter 22Na source showed a focal ray tracing FWHM of 1 mm.

  4. An EGSnrc Monte Carlo study of the microionization chamber for reference dosimetry of narrow irregular IMRT beamlets.

    PubMed

    Capote, Roberto; Sánchez-Doblado, Francisco; Leal, Antonio; Lagares, Juan Ignacio; Arráns, Rafael; Hartmann, Günther H

    2004-09-01

    Intensity modulated radiation therapy (IMRT) has evolved toward the use of many small radiation fields, or "beamlets," to increase the resolution of the intensity map. The size of the smaller beamlets can typically be about 1-5 cm^2. Therefore small ionization chambers (IC) with sensitive volumes ≤ 0.1 cm^3 are generally used for dose verification of IMRT treatments. The dosimetry of these narrow photon beams pertains to the so-called nonreference conditions for beam calibration. The use of ion chambers for such narrow beams remains questionable due to the lack of electron equilibrium in most of the field. The present contribution aims to estimate, by the Monte Carlo (MC) method, the total correction needed to convert the charge measured by the IBA-Wellhöfer NAC007 micro IC in such a radiation field to the absolute dose to water. A detailed geometrical simulation of the microionization chamber was performed. The ion chamber was always positioned at a 10 cm depth in water, parallel to the beam axis. The delivered doses to the air and water cavity were calculated using the CAVRZnrc EGSnrc user code. The 6 MV phase spaces for the Primus Clinac (Siemens), used as input to the CAVRZnrc code, were derived by BEAM/EGS4 modeling of the treatment head of the machine along with the multileaf collimator [Sánchez-Doblado et al., Phys. Med. Biol. 48, 2081-2099 (2003)] and contrasted with experimental measurements. Dose calculations were carried out for two irradiation geometries, namely the reference 10×10 cm^2 field and an irregular (approximately 2×2 cm^2) IMRT beamlet. The dose measured by the ion chamber is estimated by MC simulation as the dose averaged over the air cavity inside the ion chamber (Dair). The absorbed dose to water is derived as the dose deposited inside the same volume, in the same geometrical position, filled and surrounded by water (Dwater) in the absence of the ionization chamber. Therefore, the Dwater/Dair dose ratio is a direct MC estimation of the total correction factor needed to convert the absorbed dose in air to the absorbed dose to water. The dose ratio was calculated for several chamber positions, starting from the penumbra region around the beamlet, along the two diagonals crossing the radiation field. For this quantity, differences from 0 up to 3% are observed between the dose ratio values obtained within the small irregular IMRT beamlet and the dose ratio derived for the reference 10×10 cm^2 field. Greater differences from the reference value, up to 9%, were obtained in the penumbra region of the small IMRT beamlet.

  5. Poster — Thur Eve — 48: Dosimetric dependence on bone backscatter in orthovoltage radiotherapy: A Monte Carlo photon fluence spectral study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chow, J; Grigor, G

    This study investigated the dosimetric impact of bone backscatter in orthovoltage radiotherapy. Monte Carlo simulations were used to calculate depth doses and photon fluence spectra using the EGSnrc-based code. An inhomogeneous bone phantom containing a thin water layer (1-3 mm) on top of a bone (1 cm), mimicking the treatment sites of forehead, chest wall and kneecap, was irradiated by the 220 kVp photon beam produced by the Gulmay D3225 x-ray machine. Percentage depth doses and photon energy spectra were determined using Monte Carlo simulations. The results of the percentage depth doses showed that the maximum bone dose was about 210-230% larger than the surface dose in the phantoms with different water thicknesses. The surface dose was found to increase from 2.3 to 3.5% when the distance between the phantom surface and the bone was increased from 1 to 3 mm. This increase of the surface dose on top of a bone was due to the increase of the photon fluence intensity, resulting from the bone backscatter in the energy range of 30-120 keV, as the water thickness was increased. This was also supported by the increase of the intensity of the photon energy spectral curves at the phantom and bone surface as the water thickness was increased. It is concluded that if the bone inhomogeneity is not considered during dose prescription at the sites of forehead, chest wall and kneecap with a soft tissue thickness of 1-3 mm, there would be an uncertainty in the dose delivery.

  6. Implementation and validation of collapsed cone superposition for radiopharmaceutical dosimetry of photon emitters

    NASA Astrophysics Data System (ADS)

    Sanchez-Garcia, Manuel; Gardin, Isabelle; Lebtahi, Rachida; Dieudonné, Arnaud

    2015-10-01

    Two collapsed cone (CC) superposition algorithms have been implemented for radiopharmaceutical dosimetry of photon emitters. The straight CC (SCC) superposition method uses a water energy deposition kernel (EDKw) for each of the electron, positron and photon components, while the primary and scatter CC (PSCC) superposition method uses different EDKw for primary and once-scattered photons. PSCC was implemented only for photons originating from the nucleus, precluding its application to positron emitters. The EDKw are linearly scaled by radiological distance, taking into account tissue density heterogeneities. The implementation was tested on 100, 300 and 600 keV mono-energetic photons and on 18F, 99mTc, 131I and 177Lu. The kernels were generated using the Monte Carlo codes MCNP and EGSnrc. The validation was performed on 6 phantoms representing interfaces between soft tissue, lung and bone. The figures of merit were the γ (3%, 3 mm) and γ (5%, 5 mm) criteria corresponding to the comparison of 80 absorbed dose (AD) points per phantom between the Monte Carlo simulations and the CC algorithms. PSCC gave better results than SCC for the lowest photon energy (100 keV). For the 3 isotopes computed with PSCC, the percentage of AD points satisfying the γ (5%, 5 mm) criterion was always over 99%. Results with SCC were still good but worse, since at least 97% of the AD values verified the γ (5%, 5 mm) criterion, except for a value of 57% for 99mTc with the lung/bone interface. The CC superposition method for radiopharmaceutical dosimetry is a good alternative to Monte Carlo simulations while reducing computation complexity.

  7. Implementation and validation of collapsed cone superposition for radiopharmaceutical dosimetry of photon emitters.

    PubMed

    Sanchez-Garcia, Manuel; Gardin, Isabelle; Lebtahi, Rachida; Dieudonné, Arnaud

    2015-10-21

    Two collapsed cone (CC) superposition algorithms have been implemented for radiopharmaceutical dosimetry of photon emitters. The straight CC (SCC) superposition method uses a water energy deposition kernel (EDKw) for each of the electron, positron and photon components, while the primary and scatter CC (PSCC) superposition method uses different EDKw for primary and once-scattered photons. PSCC was implemented only for photons originating from the nucleus, precluding its application to positron emitters. The EDKw are linearly scaled by radiological distance, taking into account tissue density heterogeneities. The implementation was tested on 100, 300 and 600 keV mono-energetic photons and on (18)F, (99m)Tc, (131)I and (177)Lu. The kernels were generated using the Monte Carlo codes MCNP and EGSnrc. The validation was performed on 6 phantoms representing interfaces between soft tissue, lung and bone. The figures of merit were the γ (3%, 3 mm) and γ (5%, 5 mm) criteria corresponding to the comparison of 80 absorbed dose (AD) points per phantom between the Monte Carlo simulations and the CC algorithms. PSCC gave better results than SCC for the lowest photon energy (100 keV). For the 3 isotopes computed with PSCC, the percentage of AD points satisfying the γ (5%, 5 mm) criterion was always over 99%. Results with SCC were still good but worse, since at least 97% of the AD values verified the γ (5%, 5 mm) criterion, except for a value of 57% for (99m)Tc with the lung/bone interface. The CC superposition method for radiopharmaceutical dosimetry is a good alternative to Monte Carlo simulations while reducing computation complexity.
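
    The density scaling mentioned above means the kernel is evaluated not at the geometric distance but at the radiological distance, i.e. the density-weighted path length along the ray. A minimal sketch of that scaling for a single ray through a heterogeneous voxel row follows; the densities and the kernel shape are placeholders, not the MCNP/EGSnrc kernels of this work.

    ```python
    # Minimal sketch: evaluating a water energy deposition kernel at the
    # radiological distance along one ray through heterogeneous voxels.
    # Densities and kernel shape are placeholders.
    import numpy as np

    step_cm = 0.2
    rel_density = np.array([1.0, 1.0, 0.26, 0.26, 0.26, 1.3, 1.0])  # water/lung/bone-like

    # Radiological and geometric distances to the far edge of each voxel.
    radiological_cm = np.cumsum(rel_density * step_cm)
    geometric_cm = np.arange(1, rel_density.size + 1) * step_cm

    def kernel_water(r_cm):
        """Stand-in for a water energy deposition kernel value at distance r."""
        return np.exp(-r_cm / 3.0) / np.maximum(r_cm, 1e-3) ** 2

    # Heterogeneity handling: look up the water kernel at the radiological
    # distance instead of the geometric one.
    ratio = kernel_water(radiological_cm) / kernel_water(geometric_cm)
    print(np.round(ratio, 3))
    ```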

  8. SU-E-T-285: Dose Variation at Bone in Small-Animal Irradiation: A Monte Carlo Study Using Monoenergetic Photon Beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vuong, A; Chow, J

    Purpose: The aim of this study is to investigate the variation of bone dose with photon beam energy (keV-MeV) in small-animal irradiation. The dosimetry of homogeneous and inhomogeneous phantoms based on the same mouse computed tomography image set was calculated using DOSCTP and DOSXYZnrc, based on the EGSnrc Monte Carlo code. Methods: Monte Carlo simulations for the homogeneous and inhomogeneous mouse phantoms irradiated by a 360 degree photon arc were carried out. Mean doses of the bone tissue in the irradiated volumes were calculated at various photon beam energies, ranging from 50 keV to 1.25 MeV. The effect of bone inhomogeneity was examined through the Inhomogeneous Correction Factor (ICF), a dose ratio of the inhomogeneous to the homogeneous medium. Results: From our Monte Carlo results, a higher mean bone dose and ICF were found when using kilovoltage photon beams compared to megavoltage. For beam energies ranging from 50 keV to 200 keV, the bone dose was found to be maximum at 50 keV and decreased significantly from 2.6 Gy to 0.55 Gy when 2 Gy was delivered at the center of the phantom (isocenter). Similarly, the ICF was found to decrease from 4.5 to 1 when the photon beam energy was increased from 50 keV to 200 keV. The mean bone dose and ICF remained at about 0.5 Gy and 1, respectively, from 200 keV to 1.25 MeV with insignificant variation. Conclusion: It is concluded that to avoid a high bone dose in small-animal irradiation, a photon beam energy higher than 200 keV should be used, for which the ICF is close to one and the bone dose is comparable to that of megavoltage beams, where the photoelectric effect is not dominant.

  9. Epp: A C++ EGSnrc user code for x-ray imaging and scattering simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lippuner, Jonas; Elbakri, Idris A.; Cui Congwu

    2011-03-15

    Purpose: Easy particle propagation (Epp) is a user code for the EGSnrc code package based on the C++ class library egspp. A main feature of egspp (and Epp) is the ability to use analytical objects to construct simulation geometries. The authors developed Epp to facilitate the simulation of x-ray imaging geometries, especially in the case of scatter studies. While direct use of egspp requires knowledge of C++, Epp requires no programming experience. Methods: Epp's features include calculation of dose deposited in a voxelized phantom and photon propagation to a user-defined imaging plane. Projection images of primary, single Rayleigh scattered, single Compton scattered, and multiple scattered photons may be generated. Epp input files can be nested, allowing for the construction of complex simulation geometries from more basic components. To demonstrate the imaging features of Epp, the authors simulate 38 keV x rays from a point source propagating through a water cylinder 12 cm in diameter, using both analytical and voxelized representations of the cylinder. The simulation generates projection images of primary and scattered photons at a user-defined imaging plane. The authors also simulate dose scoring in the voxelized version of the phantom in both Epp and DOSXYZnrc and examine the accuracy of Epp using the Kawrakow-Fippel test. Results: The results of the imaging simulations with Epp using voxelized and analytical descriptions of the water cylinder agree within 1%. The results of the Kawrakow-Fippel test suggest good agreement between Epp and DOSXYZnrc. Conclusions: Epp provides the user with useful features, including the ability to build complex geometries from simpler ones and the ability to generate images of scattered and primary photons. There is no inherent computational time saving arising from Epp, except for those arising from egspp's ability to use analytical representations of simulation geometries. Epp agrees with DOSXYZnrc in dose calculation, since they are both based on the well-validated standard EGSnrc radiation transport physics model.

  10. A Monte Carlo study of the energy spectra and transmission characteristics of scattered radiation from x-ray computed tomography.

    PubMed

    Platten, David John

    2014-06-01

    Existing data used to calculate the barrier transmission of scattered radiation from computed tomography (CT) are based on primary beam CT energy spectra. This study uses the EGSnrc Monte Carlo system and the Epp user code to determine the energy spectra of CT scatter from four different primary CT beams passing through an ICRP 110 male reference phantom. Each scatter spectrum was used as a broad-beam x-ray source in transmission simulations through seventeen thicknesses of lead (0.00-3.50 mm). A fit of the transmission data against lead thickness was performed to obtain the α, β and γ parameters for each spectrum. The mean energies of the scatter spectra were up to 12.3 keV lower than that of the primary spectrum. For 120 kVp scatter beams the transmission through lead was at least 50% less than predicted by existing data for thicknesses of 1.5 mm and greater; at least 30% less transmission was seen for 140 kVp scatter beams. This work has shown that the mean energy and half-value layer of CT scatter spectra are lower than those of the corresponding primary beam. The transmission of CT scatter radiation through lead is lower than that calculated with currently available data. Using the data from this work will result in less lead shielding being required for CT scanner installations.
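
    The α, β and γ parameters referred to above are those of the standard Archer broad-beam transmission model, B(x) = [(1 + β/α) e^(αγx) - β/α]^(-1/γ). A minimal sketch of fitting that model to transmission-versus-thickness data follows; the data points and starting values are placeholders, not the simulated transmission results of this study.

    ```python
    # Minimal sketch: fitting the Archer transmission model
    # B(x) = [(1 + b/a) * exp(a*g*x) - b/a] ** (-1/g)
    # to broad-beam transmission data.  Data points are placeholders.
    import numpy as np
    from scipy.optimize import curve_fit

    def archer(x_mm, alpha, beta, gamma):
        return ((1.0 + beta / alpha) * np.exp(alpha * gamma * x_mm)
                - beta / alpha) ** (-1.0 / gamma)

    thickness_mm = np.array([0.0, 0.25, 0.5, 1.0, 1.5, 2.0, 3.0])
    transmission = np.array([1.0, 0.13, 0.041, 0.0087, 0.0025, 0.0008, 0.0001])

    popt, _ = curve_fit(archer, thickness_mm, transmission, p0=(2.0, 12.0, 0.6))
    alpha, beta, gamma = popt
    print(f"alpha = {alpha:.2f}, beta = {beta:.2f}, gamma = {gamma:.2f}")
    ```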

  11. Monte Carlo simulations and benchmark measurements on the response of TE(TE) and Mg(Ar) ionization chambers in photon, electron and neutron beams

    NASA Astrophysics Data System (ADS)

    Lin, Yi-Chun; Huang, Tseng-Te; Liu, Yuan-Hao; Chen, Wei-Lin; Chen, Yen-Fu; Wu, Shu-Wei; Nievaart, Sander; Jiang, Shiang-Huei

    2015-06-01

    The paired ionization chambers (ICs) technique is commonly employed to determine neutron and photon doses in radiology or radiotherapy neutron beams, where the neutron dose depends strongly on the accuracy of the accompanying high-energy photon dose. An important issue in the dose derivation is evaluating the photon and electron response functions of the two commercially available ionization chambers, denoted TE(TE) and Mg(Ar), used in our reactor-based epithermal neutron beam. Nowadays, most perturbation corrections for accurate dose determination and many treatment planning systems are based on the Monte Carlo technique. We used the general-purpose Monte Carlo codes MCNP5, EGSnrc, FLUKA and GEANT4 for benchmark verification among the codes and against carefully measured values, for a precise estimation of chamber current from the absorbed dose rate of the cavity gas. Energy-dependent response functions of the two chambers were also calculated in parallel beams of mono-energetic photons and electrons from 20 keV to 20 MeV, using both simple spherical and detailed IC models. The measurements were performed in the well-defined (a) four primary M-80, M-100, M-120 and M-150 X-ray calibration fields, (b) primary 60Co calibration beam, (c) 6 MV and 10 MV photon and (d) 6 MeV and 18 MeV electron hospital LINAC beams, and (e) BNCT clinical trials neutron beam. For the TE(TE) chamber, all codes were almost identical over the whole photon energy range. For the Mg(Ar) chamber, MCNP5 showed a lower response than the other codes for photon energies below 0.1 MeV and a similar response above 0.2 MeV (agreeing within 5% in the simple spherical model). With increasing electron energy, the response difference between MCNP5 and the other codes became larger in both chambers. Compared with the measured currents, MCNP5 agreed with the measurement data within 5% for the 60Co, 6 MV, 10 MV, 6 MeV and 18 MeV LINAC beams, but for the Mg(Ar) chamber the deviations reached 7.8-16.5% for X-ray beams below 120 kVp. For the BNCT doses of particular interest in this study, where the low-energy photon contribution is small enough to ignore, the MCNP model is recognized as the most suitable for simulating the broad photon-electron and neutron energy-distributed responses of the paired ICs. MCNP also provides the best prediction for BNCT source adjustment based on the detector's neutron and photon responses.
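
    As background to the paired-chamber technique described above, the photon and neutron dose components are conventionally obtained by solving two simultaneous response equations; a generic sketch (the symbols are illustrative and not taken from the paper):

        M_{TE} = h_{TE} D_\gamma + k_{TE} D_n
        M_{Mg} = h_{Mg} D_\gamma + k_{Mg} D_n

    where M are the chamber readings, h and k the photon and neutron sensitivities of each chamber, and D_\gamma, D_n the photon and neutron doses; the Monte Carlo calculated response functions enter through h and k, which is why their accuracy strongly affects the derived doses.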

  12. Commissioning of a Varian Clinac iX 6 MV photon beam using Monte Carlo simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dirgayussa, I Gde Eka, E-mail: ekadirgayussa@gmail.com; Yani, Sitti; Haryanto, Freddy, E-mail: freddy@fi.itb.ac.id

    2015-09-30

    Monte Carlo modelling of a linear accelerator is the first and most important step in Monte Carlo dose calculations in radiotherapy. Monte Carlo is considered today to be the most accurate and detailed calculation method in different fields of medical physics. In this research, we developed a photon beam model of a Varian Clinac iX 6 MV equipped with a Millennium MLC 120 for dose calculation purposes, using the BEAMnrc/DOSXYZnrc Monte Carlo system based on the underlying EGSnrc particle transport code. The Monte Carlo commissioning of the linac head was carried out in stages: designing the head model using BEAMnrc, characterizing the model using BEAMDP, and analyzing the differences between simulation and measurement using DOSXYZnrc. In the first step, to reduce simulation time, a virtual treatment head was built in two parts (a patient-dependent component and a patient-independent component). The incident electron energy was varied over 6.1 MeV, 6.2 MeV, 6.3 MeV, 6.4 MeV, and 6.6 MeV, and the FWHM (full width at half maximum) of the source was 1 mm. The phase-space file from the virtual model was characterized using BEAMDP. Percent depth doses (PDDs) and beam profiles at 10 cm depth calculated with DOSXYZnrc in a water phantom were compared with measurements. The commissioning was considered complete when the difference between measured and calculated relative depth-dose data along the central axis and dose profiles at 10 cm depth was ≤ 5%. The effect of beam width on percentage depth doses and beam profiles was studied. Results of the virtual model were in close agreement with measurements for an incident electron energy of 6.4 MeV. Our results showed that the photon beam width could be tuned using the large-field beam profile at the depth of maximum dose. The Monte Carlo model developed in this study accurately represents the Varian Clinac iX with the Millennium MLC 120 and can be used for reliable patient dose calculations. In this commissioning process, the dose-difference criteria for the PDD and dose profiles were achieved using an incident electron energy of 6.4 MeV.
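
    A minimal sketch of the acceptance test described above (hypothetical helper code, not part of BEAMnrc or DOSXYZnrc), comparing a simulated depth-dose curve with measurement against a local percent-difference tolerance such as the 5% criterion quoted in the abstract:

        #include <algorithm>
        #include <cmath>
        #include <cstdio>
        #include <vector>

        // Worst local percent difference between measured and simulated
        // relative depth-dose values; points with non-positive measured
        // dose are skipped.
        double worstLocalDifference(const std::vector<double>& measured,
                                    const std::vector<double>& simulated) {
            double worst = 0.0;
            const std::size_t n = std::min(measured.size(), simulated.size());
            for (std::size_t i = 0; i < n; ++i) {
                if (measured[i] <= 0.0) continue;
                const double diff =
                    100.0 * std::fabs(simulated[i] - measured[i]) / measured[i];
                worst = std::max(worst, diff);
            }
            return worst;
        }

        int main() {
            // Toy relative depth-dose data (percent of maximum), illustration only.
            const std::vector<double> measured  = {95.0, 100.0, 92.0, 80.0, 65.0};
            const std::vector<double> simulated = {96.1, 100.0, 91.0, 81.5, 64.2};
            const double worst = worstLocalDifference(measured, simulated);
            std::printf("worst local difference: %.2f%% -> %s\n",
                        worst, worst <= 5.0 ? "accepted" : "rejected");
            return 0;
        }

    In practice the same comparison would be repeated for each candidate incident electron energy (6.1-6.6 MeV) and the energy giving the best agreement retained, which is how the 6.4 MeV value above was selected.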

  13. Monte Carlo simulation of a compact microbeam radiotherapy system based on carbon nanotube field emission technology.

    PubMed

    Schreiber, Eric C; Chang, Sha X

    2012-08-01

    Microbeam radiation therapy (MRT) is an experimental radiotherapy technique that has shown potent antitumor effects with minimal damage to normal tissue in animal studies. This unique form of radiation is currently only produced in a few large synchrotron accelerator research facilities in the world. To promote widespread translational research on this promising treatment technology we have proposed and are in the initial development stages of a compact MRT system that is based on carbon nanotube field emission x-ray technology. We report on a Monte Carlo based feasibility study of the compact MRT system design. Monte Carlo calculations were performed using EGSnrc-based codes. The proposed small animal research MRT device design includes carbon nanotube cathodes shaped to match the corresponding MRT collimator apertures, a common reflection anode with filter, and an MRT collimator. Each collimator aperture is sized to deliver a beam width ranging from 30 to 200 μm at 18.6 cm source-to-axis distance. Design parameters studied with Monte Carlo include electron energy, cathode design, anode angle, filtration, and collimator design. Calculations were performed for single and multibeam configurations. Increasing the energy from 100 kVp to 160 kVp increased the photon fluence through the collimator by a factor of 1.7. Both energies produced a largely uniform fluence along the long dimension of the microbeam, with 5% decreases in intensity near the edges. The isocentric dose rate for 160 kVp was calculated to be 700 Gy/min/A in the center of a 3 cm diameter target. Scatter contributions resulting from collimator size were found to produce only small (<7%) changes in the dose rate for field widths greater than 50 μm. Dose vs depth was weakly dependent on filtration material. The peak-to-valley ratio varied from 10 to 100 as the separation between adjacent microbeams varied from 150 to 1000 μm. Monte Carlo simulations demonstrate that the proposed compact MRT system design is capable of delivering a sufficient dose rate and peak-to-valley ratio for small animal MRT studies.
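
    For context, the peak-to-valley dose ratio quoted above follows the usual definition for microbeam arrays,

        PVDR = \frac{D_{peak}}{D_{valley}},

    where D_peak is the dose on a microbeam axis and D_valley the dose midway between adjacent microbeams; the reported range of 10 to 100 corresponds to microbeam separations of 150 to 1000 μm.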

  14. Influence of ion chamber response on in-air profile measurements in megavoltage photon beams.

    PubMed

    Tonkopi, E; McEwen, M R; Walters, B R B; Kawrakow, I

    2005-09-01

    This article presents an investigation of the influence of the ion chamber response, including buildup caps, on the measurement of in-air off-axis ratio (OAR) profiles in megavoltage photon beams using Monte Carlo simulations with the EGSnrc system. Two new techniques for the calculation of OAR profiles are presented. Results of the Monte Carlo simulations are compared to measurements performed in 6, 10 and 25 MV photon beams produced by an Elekta Precise linac and shown to agree within the experimental and simulation uncertainties. Comparisons with calculated in-air kerma profiles demonstrate that using a plastic mini phantom gives more accurate air-kerma measurements than using high-Z material buildup caps and that the variation of chamber response with distance from the central axis must be taken into account.

  15. Enhancement of natural background gamma-radiation dose around uranium microparticles in the human body.

    PubMed

    Pattison, John E; Hugtenburg, Richard P; Green, Stuart

    2010-04-06

    Ongoing controversy surrounds the adverse health effects of the use of depleted uranium (DU) munitions. The biological effects of gamma-radiation arise from the direct or indirect interaction between secondary electrons and the DNA of living cells. The probability of the absorption of X-rays and gamma-rays with energies below about 200 keV by particles of high atomic number is proportional to the third to fourth power of the atomic number. In such a case, the more heavily ionizing low-energy recoil electrons are preferentially produced; these cause dose enhancement in the immediate vicinity of the particles. It has been claimed that upon exposure to naturally occurring background gamma-radiation, particles of DU in the human body would produce dose enhancement by a factor of 500-1000, thereby contributing a significant radiation dose in addition to the dose received from the inherent radioactivity of the DU. In this study, we used the Monte Carlo code EGSnrc to accurately estimate the likely maximum dose enhancement arising from the presence of micrometre-sized uranium particles in the body. We found that although the dose enhancement is significant, of the order of 1-10, it is considerably smaller than that suggested previously.

  16. MCNP6.1 simulations for low-energy atomic relaxation: Code-to-code comparison with GATEv7.2, PENELOPE2014, and EGSnrc

    NASA Astrophysics Data System (ADS)

    Jung, Seongmoon; Sung, Wonmo; Lee, Jaegi; Ye, Sung-Joon

    2018-01-01

    Emerging radiological applications of gold nanoparticles demand low-energy electron/photon transport calculations that include the details of the atomic relaxation process. Recently, MCNP® version 6.1 (MCNP6.1) was released with extended cross-sections for low-energy electrons/photons, subshell photoelectric cross-sections, and more detailed atomic relaxation data than previous versions. However, the atomic relaxation process of MCNP6.1 with its new physics library (eprdata12), which is based on the Evaluated Atomic Data Library (EADL), has not yet been fully tested. In this study, MCNP6.1 was compared with GATEv7.2, PENELOPE2014, and EGSnrc, which have often been used to simulate low-energy atomic relaxation processes. The simulations were performed to acquire both photon and electron spectra produced by interactions of 15 keV electrons or photons with a 10-nm-thick gold nano-slab. The photon-induced fluorescence X-rays from MCNP6.1 fairly agreed with those from GATEv7.2 and PENELOPE2014, while the electron-induced fluorescence X-rays of the four codes showed varying degrees of discrepancy. The photon-induced Auger electrons simulated by MCNP6.1 and GATEv7.2 coincided. The recent release of MCNP6.1 with eprdata12 can be used to simulate the photon-induced atomic relaxation.

  17. Determination of output factor for 6 MV small photon beam: comparison between Monte Carlo simulation technique and microDiamond detector

    NASA Astrophysics Data System (ADS)

    Krongkietlearts, K.; Tangboonduangjit, P.; Paisangittisakul, N.

    2016-03-01

    In order to improve cancer patients' quality of life, radiation techniques are constantly evolving. In particular, the two modern techniques of intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT) are quite promising. They comprise many small beams (beamlets) with various intensities to achieve the intended radiation dose to the tumor and minimal dose to the nearby normal tissue. This study investigates whether the microDiamond detector (PTW), a synthetic single-crystal diamond detector, is suitable for small-field output factor measurements. The results were compared with those measured by the stereotactic field detector (SFD) and with Monte Carlo simulation (EGSnrc/BEAMnrc/DOSXYZ). The Monte Carlo simulation was calibrated using the percentage depth dose and dose profile measured by the photon field detector (PFD) for a 10×10 cm² field size at 100 cm SSD. The calculated and measured values are consistent, differing by no more than 1%. The output factors obtained from the microDiamond detector were compared with those from the SFD and the Monte Carlo simulation, and the results show percentage differences of less than 2%.
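
    For reference, the output factor compared above is the usual ratio of the dose (or detector reading) for the small field to that for the reference field under otherwise identical conditions; a sketch of the definition:

        OF(A) = \frac{D(A, z_{ref})}{D(10 \times 10\ \mathrm{cm^2}, z_{ref})}

    where A is the small field of interest and both doses are taken at the same reference depth, source-to-surface distance and monitor units.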

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Danielle; Siegbahn, Albert; Fallone, Gin

    Purpose: The BioMedical Imaging and Therapy (BMIT) beamlines at the Canadian Light Source offer the opportunity for investigating novel imaging and therapy applications of synchrotron radiation. A necessary component in advancing this research, and in progressing toward clinical applications, is the availability of accurate dosimetry that is traceable to a standards institution. However, dosimetry in this setting is challenging. These beams are typically small, non-uniform, and highly intense. This work describes air kerma rate measurements on a BMIT beamline using a free-air ionization chamber (FAC). Methods: The measurements were taken at the 05B1-1 beamline (∼8-100 keV) for several beam qualities with mean energies between 20.0 and 84.0 keV. The Victoreen Model 480 cylindrical FAC, with a specially fabricated 0.52 mm diameter aperture, was used to measure air kerma rates. The required correction factors were determined using a variety of methods: tabulated data, measurements, theoretical calculations and Monte Carlo simulations (EGSnrc user code egs_fac). Results: The measured air kerma rates ranged from 0.270 Gy/min (± 13.6%) to 312 Gy/min (± 2.7%). At lower energies (low filtration), the most impactful correction factors were those for ion recombination and for x-ray attenuation. Conclusions: These measurements marked the first absolute dosimetry performed at the BMIT beamlines. The experimental and Monte Carlo methods developed will allow air kerma rates to be measured under other experimental conditions, provide a benchmark to which other dosimeters will be compared, and provide a reference for imaging and therapy research programs on this beamline.
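
    As a reminder of how a free-air chamber realizes air kerma, the measured ionization current is converted to an air kerma rate along the lines of (generic notation, not the authors'):

        \dot{K}_{air} = \frac{I}{\rho_{air} V} \cdot \frac{\bar{W}}{e} \cdot \frac{1}{1 - \bar{g}} \cdot \prod_i k_i

    where I is the ionization current, ρ_air·V the mass of air in the measuring volume defined by the aperture, W̄/e the mean energy expended per ion pair in air, ḡ the radiative-loss fraction, and k_i the correction factors (e.g. ion recombination and x-ray attenuation) whose determination is described above.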

  19. Estimation of absorbed doses from paediatric cone-beam CT scans: MOSFET measurements and Monte Carlo simulations.

    PubMed

    Kim, Sangroh; Yoshizumi, Terry T; Toncheva, Greta; Frush, Donald P; Yin, Fang-Fang

    2010-03-01

    The purpose of this study was to establish a dose estimation tool with Monte Carlo (MC) simulations. A 5-y-old paediatric anthropomorphic phantom was computed tomography (CT) scanned to create a voxelised phantom and used as an input for the abdominal cone-beam CT in a BEAMnrc/EGSnrc MC system. An X-ray tube model of the Varian On-Board Imager® was built in the MC system. To validate the model, the absorbed doses at each organ location for standard-dose and low-dose modes were measured in the physical phantom with MOSFET detectors; effective doses were also calculated. The MC simulations were comparable to the MOSFET measurements. This voxelised phantom approach could produce a more accurate dose estimation than the stylised phantom method. This model can be easily applied to multi-detector CT dosimetry.

  20. A GPU OpenCL based cross-platform Monte Carlo dose calculation engine (goMC)

    NASA Astrophysics Data System (ADS)

    Tian, Zhen; Shi, Feng; Folkerts, Michael; Qin, Nan; Jiang, Steve B.; Jia, Xun

    2015-09-01

    Monte Carlo (MC) simulation has been recognized as the most accurate dose calculation method for radiotherapy. However, the extremely long computation time impedes its clinical application. Recently, a lot of effort has been made to realize fast MC dose calculation on graphics processing units (GPUs). However, most of the GPU-based MC dose engines have been developed under NVidia’s CUDA environment. This limits the code portability to other platforms, hindering the introduction of GPU-based MC simulations to clinical practice. The objective of this paper is to develop a GPU OpenCL based cross-platform MC dose engine named goMC with coupled photon-electron simulation for external photon and electron radiotherapy in the MeV energy range. Compared to our previously developed GPU-based MC code named gDPM (Jia et al 2012 Phys. Med. Biol. 57 7783-97), goMC has two major differences. First, it was developed under the OpenCL environment for high code portability and hence could be run not only on different GPU cards but also on CPU platforms. Second, we adopted the electron transport model used in the EGSnrc MC package and PENELOPE’s random hinge method in our new dose engine, instead of the dose planning method employed in gDPM. Dose distributions were calculated for a 15 MeV electron beam and a 6 MV photon beam in a homogeneous water phantom, a water-bone-lung-water slab phantom and a half-slab phantom. Satisfactory agreement between the two MC dose engines goMC and gDPM was observed in all cases. The average dose differences in the regions that received a dose higher than 10% of the maximum dose were 0.48-0.53% for the electron beam cases and 0.15-0.17% for the photon beam cases. In terms of efficiency, goMC was ~4-16% slower than gDPM when running on the same NVidia TITAN card for all the cases we tested, due to both the different electron transport models and the different development environments. The code portability of our new dose engine goMC was validated by successfully running it on a variety of different computing devices including an NVidia GPU card, two AMD GPU cards and an Intel CPU processor. Computational efficiency among these platforms was compared.

  1. A GPU OpenCL based cross-platform Monte Carlo dose calculation engine (goMC).

    PubMed

    Tian, Zhen; Shi, Feng; Folkerts, Michael; Qin, Nan; Jiang, Steve B; Jia, Xun

    2015-10-07

    Monte Carlo (MC) simulation has been recognized as the most accurate dose calculation method for radiotherapy. However, the extremely long computation time impedes its clinical application. Recently, a lot of effort has been made to realize fast MC dose calculation on graphics processing units (GPUs). However, most of the GPU-based MC dose engines have been developed under NVidia's CUDA environment. This limits the code portability to other platforms, hindering the introduction of GPU-based MC simulations to clinical practice. The objective of this paper is to develop a GPU OpenCL based cross-platform MC dose engine named goMC with coupled photon-electron simulation for external photon and electron radiotherapy in the MeV energy range. Compared to our previously developed GPU-based MC code named gDPM (Jia et al 2012 Phys. Med. Biol. 57 7783-97), goMC has two major differences. First, it was developed under the OpenCL environment for high code portability and hence could be run not only on different GPU cards but also on CPU platforms. Second, we adopted the electron transport model used in the EGSnrc MC package and PENELOPE's random hinge method in our new dose engine, instead of the dose planning method employed in gDPM. Dose distributions were calculated for a 15 MeV electron beam and a 6 MV photon beam in a homogeneous water phantom, a water-bone-lung-water slab phantom and a half-slab phantom. Satisfactory agreement between the two MC dose engines goMC and gDPM was observed in all cases. The average dose differences in the regions that received a dose higher than 10% of the maximum dose were 0.48-0.53% for the electron beam cases and 0.15-0.17% for the photon beam cases. In terms of efficiency, goMC was ~4-16% slower than gDPM when running on the same NVidia TITAN card for all the cases we tested, due to both the different electron transport models and the different development environments. The code portability of our new dose engine goMC was validated by successfully running it on a variety of different computing devices including an NVidia GPU card, two AMD GPU cards and an Intel CPU processor. Computational efficiency among these platforms was compared.

  2. Air-kerma strength determination of a miniature x-ray source for brachytherapy applications

    NASA Astrophysics Data System (ADS)

    Davis, Stephen D.

    A miniature x-ray source has been developed by Xoft Inc. for high dose-rate brachytherapy treatments. The source is contained in a 5.4 mm diameter water-cooling catheter. The source voltage can be adjusted from 40 kV to 50 kV and the beam current is adjustable up to 300 μA. Electrons are accelerated toward a tungsten-coated anode to produce a lightly-filtered bremsstrahlung photon spectrum. The sources were initially used for early-stage breast cancer treatment using a balloon applicator. More recently, Xoft Inc. has developed vaginal and surface applicators. The miniature x-ray sources have been characterized using a modification of the American Association of Physicists in Medicine Task Group No. 43 formalism normally used for radioactive brachytherapy sources. Primary measurements of air kerma were performed using free-air ionization chambers at the University of Wisconsin (UW) and the National Institute of Standards and Technology (NIST). The measurements at UW were used to calibrate a well-type ionization chamber for clinical verification of source strength. Accurate knowledge of the emitted photon spectrum was necessary to calculate the corrections required to determine air-kerma strength, defined in vacuo. Theoretical predictions of the photon spectrum were calculated using three separate Monte Carlo codes: MCNP5, EGSnrc, and PENELOPE. Each code used different implementations of the underlying radiological physics. Benchmark studies were performed to investigate these differences in detail. The most important variation among the codes was found to be the calculation of fluorescence photon production following electron-induced vacancies in the L shell of tungsten atoms. The low-energy tungsten L-shell fluorescence photons have little clinical significance at the treatment distance, but could have a large impact on air-kerma measurements. Calculated photon spectra were compared to spectra measured with high-purity germanium spectroscopy systems at both UW and NIST. The effects of escaped germanium fluorescence photons and Compton-scattered photons were taken into account for the UW measurements. The photon spectrum calculated using the PENELOPE Monte Carlo code had the best agreement with the spectrum measured at NIST. Corrections were applied to the free-air chamber measurements to arrive at an air-kerma strength determination for the miniature x-ray sources.

  3. Evaluation of backscatter dose from internal lead shielding in clinical electron beams using EGSnrc Monte Carlo simulations.

    PubMed

    De Vries, Rowen J; Marsh, Steven

    2015-11-08

    Internal lead shielding is utilized during superficial electron beam treatments of the head and neck, such as lip carcinoma. Methods for predicting backscattered dose include the use of empirical equations or performing physical measurements. The accuracy of these empirical equations required verification for the local electron beams. In this study, a Monte Carlo model of a Siemens Artiste linac was developed for 6, 9, 12, and 15 MeV electron beams using the EGSnrc MC package. The model was verified against physical measurements to an accuracy of better than 2% and 2 mm. Multiple MC simulations of lead interfaces at different depths, corresponding to mean electron energies in the range of 0.2-14 MeV at the interfaces, were performed to calculate electron backscatter values. The simulated electron backscatter was compared with current empirical equations to ascertain their accuracy. The major finding was that the current set of backscatter equations does not accurately predict electron backscatter, particularly in the lower energies region. A new equation was derived which enables estimation of electron backscatter factor at any depth upstream from the interface for the local treatment machines. The derived equation agreed to within 1.5% of the MC simulated electron backscatter at the lead interface and upstream positions. Verification of the equation was performed by comparing to measurements of the electron backscatter factor using Gafchromic EBT2 film. These results show a mean value of 0.997 ± 0.022 (1σ) relative to the predicted values of electron backscatter. The new empirical equation presented can accurately estimate electron backscatter factor from lead shielding in the range of 0.2 to 14 MeV for the local linacs.
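
    For clarity, the electron backscatter factor reported above is conventionally the ratio of the dose with the lead interface in place to the dose at the same point in the homogeneous phantom, i.e.

        EBF(d) = \frac{D_{lead}(d)}{D_{homogeneous}(d)}

    evaluated at a distance d upstream of the interface; the empirical equations under test, and the new one derived here, parametrize this ratio as a function of d and of the mean electron energy at the interface.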

  4. Evaluation of backscatter dose from internal lead shielding in clinical electron beams using EGSnrc Monte Carlo simulations

    PubMed Central

    Marsh, Steven

    2015-01-01

    Internal lead shielding is utilized during superficial electron beam treatments of the head and neck, such as lip carcinoma. Methods for predicting backscattered dose include the use of empirical equations or performing physical measurements. The accuracy of these empirical equations required verification for the local electron beams. In this study, a Monte Carlo model of a Siemens Artiste linac was developed for 6, 9, 12, and 15 MeV electron beams using the EGSnrc MC package. The model was verified against physical measurements to an accuracy of better than 2% and 2 mm. Multiple MC simulations of lead interfaces at different depths, corresponding to mean electron energies in the range of 0.2–14 MeV at the interfaces, were performed to calculate electron backscatter values. The simulated electron backscatter was compared with current empirical equations to ascertain their accuracy. The major finding was that the current set of backscatter equations does not accurately predict electron backscatter, particularly in the lower energies region. A new equation was derived which enables estimation of electron backscatter factor at any depth upstream from the interface for the local treatment machines. The derived equation agreed to within 1.5% of the MC simulated electron backscatter at the lead interface and upstream positions. Verification of the equation was performed by comparing to measurements of the electron backscatter factor using Gafchromic EBT2 film. These results show a mean value of 0.997±0.022 (1σ) relative to the predicted values of electron backscatter. The new empirical equation presented can accurately estimate electron backscatter factor from lead shielding in the range of 0.2 to 14 MeV for the local linacs. PACS numbers: 87.53.Bn, 87.55.K‐, 87.56.bd PMID:26699566

  5. Characterisation of mega-voltage electron pencil beam dose distributions: viability of a measurement-based approach.

    PubMed

    Barnes, M P; Ebert, M A

    2008-03-01

    The concept of electron pencil-beam dose distributions is central to pencil-beam algorithms used in electron beam radiotherapy treatment planning. The Hogstrom algorithm, which is a common algorithm for electron treatment planning, models large electron field dose distributions by the superposition of a series of pencil beam dose distributions. This means that the accurate characterisation of an electron pencil beam is essential for the accuracy of the dose algorithm. The aim of this study was to evaluate a measurement-based approach for obtaining electron pencil-beam dose distributions. The primary incentive for the study was the accurate calculation of dose distributions for narrow fields, as traditional electron algorithms are generally inaccurate for such geometries. Kodak X-Omat radiographic film was used in a solid water phantom to measure the dose distribution of circular 12 MeV beams from a Varian 21EX linear accelerator. Measurements were made for beams of diameter 1.5, 2, 4, 8, 16 and 32 mm. A blocked-field technique was used to subtract photon contamination in the beam. The "error function" derived from Fermi-Eyges Multiple Coulomb Scattering (MCS) theory for corresponding square fields was used to fit the resulting dose distributions so that extrapolation down to a pencil beam distribution could be made. The Monte Carlo codes BEAM and EGSnrc were used to simulate the experimental arrangement. The 8 mm beam dose distribution was also measured with TLD-100 microcubes. Agreement between film, TLD and Monte Carlo simulation results was found to be consistent with the spatial resolution used. The study has shown that it is possible to extrapolate narrow electron beam dose distributions down to a pencil beam dose distribution using the error function. However, due to experimental uncertainties and measurement difficulties, Monte Carlo is recommended as the method of choice for characterising electron pencil-beam dose distributions.
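
    A sketch of the error-function fit referred to above (the standard small-angle Fermi-Eyges result, written in generic notation): for a field of half-width a whose Gaussian pencil-beam spread at depth z is σ(z), the lateral dose profile is

        D(x, z) \propto \frac{1}{2}\left[ \operatorname{erf}\!\left(\frac{a - x}{\sqrt{2}\,\sigma(z)}\right) + \operatorname{erf}\!\left(\frac{a + x}{\sqrt{2}\,\sigma(z)}\right) \right]

    so fitting the measured dose distributions for σ(z) allows extrapolation to the a → 0 limit, i.e. the Gaussian pencil-beam dose distribution.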

  6. SU-E-T-627: Precision Modelling of the Leaf-Bank Rotation in Elekta’s Agility MLC: Is It Necessary?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vujicic, M; Belec, J; Heath, E

    Purpose: To demonstrate the method used to determine the leaf bank rotation angle (LBROT) as a parameter for modeling the Elekta Agility multi-leaf collimator (MLC) for Monte Carlo simulations and to evaluate the clinical impact of LBROT. Methods: A detailed model of an Elekta Infinity linac including an Agility MLC was built using the EGSnrc/BEAMnrc Monte Carlo code. The Agility 160-leaf MLC is modelled using the MLCE component module which allows for leaf bank rotation using the parameter LBROT. A precise value of LBROT is obtained by comparing measured and simulated profiles of a specific field, which has leaves arranged in a repeated pattern such that one leaf is opened and the adjacent one is closed. Profile measurements from an Agility linac are taken with Gafchromic film, and an ion chamber is used to set the absolute dose. The measurements are compared to Monte Carlo (MC) simulations and the LBROT is adjusted until a match is found. The clinical impact of LBROT is evaluated by observing how an MC dose calculation changes with LBROT. A clinical Stereotactic Body Radiation Treatment (SBRT) plan is calculated using BEAMnrc/DOSXYZnrc simulations with different input values for LBROT. Results: Using the method outlined above, the LBROT is determined to be 9±1 mrad. Differences as high as 4% are observed in a clinical SBRT plan between the extreme case (LBROT not modeled) and the nominal case. Conclusion: In small-field radiation therapy treatment planning, it is important to properly account for LBROT as an input parameter for MC dose calculations with the Agility MLC. More work is ongoing to elucidate the observed differences by determining the contributions from transmission dose, change in field size, and source occlusion, which are all dependent on LBROT. This work was supported by OCAIRO (Ontario Consortium of Adaptive Interventions in Radiation Oncology), funded by the Ontario Research Fund.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Y; Liu, B; Liang, B

    Purpose: The current CyberKnife treatment planning system (TPS) provides two dose calculation algorithms: Ray-tracing and Monte Carlo. The Ray-tracing algorithm is fast but less accurate, and cannot handle the irregular fields enabled by the multi-leaf collimator system recently introduced with the CyberKnife M6 system. The Monte Carlo method has well-known accuracy, but the current version still takes a long time to complete dose calculations. The purpose of this paper is to develop a GPU-based fast C/S dose engine for the CyberKnife system to achieve both accuracy and efficiency. Methods: The TERMA distribution from a poly-energetic source was calculated in the beam's-eye-view coordinate system, which is GPU friendly and has linear complexity. The dose distribution was then computed by inversely collecting the energy depositions from all TERMA points along 192 collapsed-cone directions. An EGSnrc user code was used to pre-calculate energy deposition kernels (EDKs) for a series of mono-energetic photons. The energy spectrum was reconstructed from the measured tissue maximum ratio (TMR) curve, and the TERMA-averaged cumulative kernels were then calculated. Beam hardening parameters and intensity profiles were optimized based on measurement data from the CyberKnife system. Results: The differences between measured and calculated TMR are less than 1% for all collimators except in the build-up regions. The calculated profiles also showed good agreement with the measured doses within 1% except in the penumbra regions. The developed C/S dose engine was also used to evaluate four clinical CyberKnife treatment plans; the results showed better dose calculation accuracy than the Ray-tracing algorithm when compared against the Monte Carlo method for heterogeneous cases. The dose calculation time is a few seconds per beam, depending on the collimator size and dose calculation grid. Conclusion: A GPU-based C/S dose engine has been developed for the CyberKnife system; it proved to be efficient and accurate for clinical purposes and can be easily implemented in the TPS.
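
    As a reminder of the convolution/superposition model underlying the engine described above, dose is the superposition of TERMA with pre-computed energy deposition kernels, and the collapsed-cone approximation restricts kernel transport to a discrete set of directions (192 here); schematically, in generic notation:

        D(\mathbf{r}) = \int T(\mathbf{r}')\, A(\mathbf{r} - \mathbf{r}')\, d^3 r'

    with the kernel A collapsed onto the cone axes so that energy released along each axis is deposited only in the voxels that the axis traverses, which reduces the computation to line integrals that map well onto GPU threads.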

  8. Study the sensitivity of dose calculation in prism treatment planning system using Monte Carlo simulation of 6 MeV electron beam

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hardiansyah, D.; Haryanto, F.; Male, S.

    2014-09-30

    Prism is a non-commercial Radiotherapy Treatment Planning System (RTPS) developed by Ira J. Kalet from Washington University. An inhomogeneity factor is included in the Prism TPS dose calculation. The aim of this study is to investigate the sensitivity of dose calculation in Prism using Monte Carlo simulation. A phase-space source from the linear accelerator (LINAC) head is used in the Monte Carlo simulation. To achieve this aim, the Prism dose calculation is compared with EGSnrc Monte Carlo simulation. Percentage depth dose (PDD) and R50 from both calculations are compared. BEAMnrc was used to simulate electron transport in the LINAC head and to produce a phase-space file. This file is used as DOSXYZnrc input to simulate electron transport in the phantom. The study started with a commissioning process in a water phantom, in which the Monte Carlo simulation was adjusted to match the Prism RTPS. The commissioning result was then used for the inhomogeneity phantom study. The physical parameters of the inhomogeneity phantom varied in this study are the density, location and thickness of the tissue. The commissioning result showed that the optimum energy of the Monte Carlo simulation for the 6 MeV electron beam is 6.8 MeV. The commissioning used R50 and the PDD with practical range (Rp) as references. In the inhomogeneity study, the average deviation over the region of interest for all cases is below 5%. Based on ICRU recommendations, Prism is well able to calculate the radiation dose in inhomogeneous tissue.

  9. Technical Note: Effect of explicit M and N-shell atomic transitions on a low-energy x-ray source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Watson, Peter G. F., E-mail: peter.watson@mail.mcgill.ca; Seuntjens, Jan

    Purpose: In EGSnrc, atomic transitions to and from the M and N-shells are treated in an average way by default. This approach is justified because the energy difference between explicit and average M and N-shell binding energies is less than 1 keV, which for most applications can be considered negligible. However, for simulations of low energy x-ray sources on thin, high-Z targets, characteristic x-rays can make up a significant portion of the source spectra. As of release V4-2.4.0, EGSnrc has included an option to enable a more complete algorithm of all atomic transitions available in the EADL compilation. In this paper, the effect of M and N-shell averaging on the calculation of half-value layer (HVL) and relative depth dose (RDD) curve of a 50 kVp intraoperative x-ray tube with a thin gold target was investigated. Methods: A 50 kVp miniature x-ray source with a gold target (The INTRABEAM System, Carl Zeiss, Germany) was modeled with the EGSnrc user code cavity, both with and without M and N-shell averaging. From photon fluence spectra simulations, the source HVLs were determined analytically. The same source model was then used with egs_chamber to calculate RDD curves in water. Results: A 4% increase of HVL was reported when accounting for explicit M and N-shell transitions, and up to a 9% decrease in local relative dose for normalization at 3 mm depth in water. Conclusions: The EGSnrc default of using averaged M and N-shell binding energies has an observable effect on the HVL and RDD of a low energy x-ray source with a high-Z target. For accurate modeling of this class of devices, explicit atomic transitions should be included.
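
    For context, determining the HVL "analytically" from a simulated fluence spectrum amounts to finding the attenuator thickness t_HVL that halves the air kerma; a sketch in generic notation:

        \sum_i \Phi_i\, E_i\, (\mu_{en}/\rho)_{air}(E_i)\, e^{-\mu(E_i)\, t_{HVL}} = \frac{1}{2} \sum_i \Phi_i\, E_i\, (\mu_{en}/\rho)_{air}(E_i)

    where Φ_i is the photon fluence in energy bin E_i and μ is the linear attenuation coefficient of the HVL material; changes in the low-energy characteristic x-ray content change the spectral weighting and hence the HVL, consistent with the 4% effect reported.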

  10. Development and validation of a BEAMnrc component module for a miniature multileaf collimator.

    PubMed

    Doerner, E; Hartmann, G H

    2012-05-21

    A new component module (CM) named mini multileaf collimator (mMLC) was developed for the Monte Carlo code BEAMnrc. It models the geometry of the add-on miniature multileaf collimator ModuLeaf (MRC Systems GmbH, Heidelberg, Germany, now part of Siemens, Erlangen, Germany). The new CM is partly based on the existing CM called DYNVMLC. The development was performed using a modified EGSnrc platform which enables us to work in the Microsoft Visual Studio environment. In order to validate the new CM, the PRIMUS linac with 6 MV x-rays (Siemens OCS, Concord, CA, USA) equipped with the ModuLeaf mMLC was modelled. Validation was performed by two methods: (a) a ray-tracing method to check the correct geometry of the multileaf collimator (MLC) and (b) a comparison of calculated and measured results of the following dosimetrical parameters: output factors, dose profiles, field edge position, penumbra, MLC interleaf leakage and transmission values. Excellent agreement was found for all parameters. It was found, in particular, that the relationship between leaf position and field edge, which depends on the shape of the leaf ends, can be investigated with higher accuracy by this new CM than by measurements, demonstrating the usefulness of the new CM.

  11. Development and validation of a BEAMnrc component module for a miniature multileaf collimator

    NASA Astrophysics Data System (ADS)

    Doerner, E.; Hartmann, G. H.

    2012-05-01

    A new component module (CM) named mini multileaf collimator (mMLC) was developed for the Monte Carlo code BEAMnrc. It models the geometry of the add-on miniature multileaf collimator ModuLeaf (MRC Systems GmbH, Heidelberg, Germany, now part of Siemens, Erlangen, Germany). The new CM is partly based on the existing CM called DYNVMLC. The development was performed using a modified EGSnrc platform which enables us to work in the Microsoft Visual Studio environment. In order to validate the new CM, the PRIMUS linac with 6 MV x-rays (Siemens OCS, Concord, CA, USA) equipped with the ModuLeaf mMLC was modelled. Validation was performed by two methods: (a) a ray-tracing method to check the correct geometry of the multileaf collimator (MLC) and (b) a comparison of calculated and measured results of the following dosimetrical parameters: output factors, dose profiles, field edge position, penumbra, MLC interleaf leakage and transmission values. Excellent agreement was found for all parameters. It was found, in particular, that the relationship between leaf position and field edge, which depends on the shape of the leaf ends, can be investigated with higher accuracy by this new CM than by measurements, demonstrating the usefulness of the new CM.

  12. Enhancement of natural background gamma-radiation dose around uranium microparticles in the human body

    PubMed Central

    Pattison, John E.; Hugtenburg, Richard P.; Green, Stuart

    2010-01-01

    Ongoing controversy surrounds the adverse health effects of the use of depleted uranium (DU) munitions. The biological effects of gamma-radiation arise from the direct or indirect interaction between secondary electrons and the DNA of living cells. The probability of the absorption of X-rays and gamma-rays with energies below about 200 keV by particles of high atomic number is proportional to the third to fourth power of the atomic number. In such a case, the more heavily ionizing low-energy recoil electrons are preferentially produced; these cause dose enhancement in the immediate vicinity of the particles. It has been claimed that upon exposure to naturally occurring background gamma-radiation, particles of DU in the human body would produce dose enhancement by a factor of 500–1000, thereby contributing a significant radiation dose in addition to the dose received from the inherent radioactivity of the DU. In this study, we used the Monte Carlo code EGSnrc to accurately estimate the likely maximum dose enhancement arising from the presence of micrometre-sized uranium particles in the body. We found that although the dose enhancement is significant, of the order of 1–10, it is considerably smaller than that suggested previously. PMID:19776147

  13. Sci-Thur PM: YIS - 07: Monte Carlo simulations to obtain several parameters required for electron beam dosimetry.

    PubMed

    Muir, B; Rogers, D; McEwen, M

    2012-07-01

    When current dosimetry protocols were written, electron beam data were limited and had uncertainties that were unacceptable for reference dosimetry. Protocols for high-energy reference dosimetry are currently being updated leading to considerable interest in accurate electron beam data. To this end, Monte Carlo simulations using the EGSnrc user-code egs_chamber are performed to extract relevant data for reference beam dosimetry. Calculations of the absorbed dose to water and the absorbed dose to the gas in realistic ion chamber models are performed as a function of depth in water for cobalt-60 and high-energy electron beams between 4 and 22 MeV. These calculations are used to extract several of the parameters required for electron beam dosimetry - the beam quality specifier, R50, beam quality conversion factors, kQ and kR50, the electron quality conversion factor, k'R50, the photon-electron conversion factor, kecal, and ion chamber perturbation factors, PQ. The method used has the advantage that many important parameters can be extracted as a function of depth instead of determination at only the reference depth as has typically been done. Results obtained here are in good agreement with measured and other calculated results. The photon-electron conversion factors obtained for a Farmer-type NE2571 and plane-parallel PTW Roos, IBA NACP-02 and Exradin A11 chambers are 0.903, 0.896, 0.894 and 0.906, respectively. These typically differ by less than 0.7% from the contentious TG-51 values but have much smaller systematic uncertainties. These results are valuable for reference dosimetry of high-energy electron beams. © 2012 American Association of Physicists in Medicine.
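
    As a sketch of how such factors follow from the calculated dose ratios (standard formalism, generic notation): with D_w the dose to water and D_gas the dose to the chamber gas at the reference depth, the beam quality conversion factor is approximately

        k_Q \approx \frac{(D_w / D_{gas})_Q}{(D_w / D_{gas})_{^{60}\mathrm{Co}}}

    assuming W/e is independent of beam quality; kecal is this ratio evaluated at the reference electron quality Qecal, and k'R50 relates other electron beam qualities to Qecal. Scoring both doses as a function of depth is what allows these factors to be extracted at any depth rather than only at the reference depth.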

  14. Room scatter effects in Total Skin Electron Irradiation: Monte Carlo simulation study.

    PubMed

    Nevelsky, Alexander; Borzov, Egor; Daniel, Shahar; Bar-Deroma, Raquel

    2017-01-01

    Total Skin Electron Irradiation (TSEI) is a complex technique which usually involves the use of large electron fields and the dual-field approach. In this situation, many electrons scattered from the treatment room floor are produced. However, no investigations of the effect of scattered electrons in TSEI treatments have been reported. The purpose of this work was to study the contribution of floor scattered electrons to skin dose during TSEI treatment using Monte Carlo (MC) simulations. All MC simulations were performed with the EGSnrc code. The influence of beam energy, dual-field angle, and floor material on the contribution of floor scatter was investigated. The spectrum of the scattered electrons was calculated. Measurements of dose profile were performed in order to verify the MC calculations. Floor scatter dependency on the floor material was observed (at 20 cm from the floor, the scatter contribution was about 21%, 18%, 15%, and 12% for iron, concrete, PVC, and water, respectively). Although total dose profiles exhibited slight variation as functions of beam energy and dual-field angle, no dependence of the floor scatter contribution on the beam energy or dual-field angle was found. The spectrum of the scattered electrons was almost uniform between a few hundred keV and 4 MeV, and then decreased linearly to 6 MeV. For the TSEI technique, the dose contribution due to the electrons scattered from the room floor may be clinically significant and should be taken into account during design and commissioning phases. MC calculations can be used for this task. © 2017 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.

  15. Dependences of mucosal dose on photon beams in head-and-neck intensity-modulated radiation therapy: a Monte Carlo study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chow, James C.L., E-mail: james.chow@rmp.uhn.on.ca; Department of Radiation Oncology, University of Toronto, Toronto, Ontario; Department of Physics, Ryerson University, Toronto, Ontario

    2012-07-01

    Dependences of mucosal dose in the oral or nasal cavity on the beam energy, beam angle, multibeam configuration, and mucosal thickness were studied for small photon fields using Monte Carlo simulations (EGSnrc-based code), which were validated by measurements. Cylindrical mucosa phantoms (mucosal thickness = 1, 2, and 3 mm) with and without the bone and air inhomogeneities were irradiated by the 6- and 18-MV photon beams (field size = 1 × 1 cm²) with gantry angles equal to 0°, 90°, and 180°, and multibeam configurations using 2, 4, and 8 photon beams in different orientations around the phantom. Doses along the central beam axis in the mucosal tissue were calculated. The mucosal surface doses were found to decrease slightly (1% for the 6-MV photon beam and 3% for the 18-MV beam) with an increase of mucosal thickness from 1-3 mm, when the beam angle is 0°. The variation of mucosal surface dose with its thickness became insignificant when the beam angle was changed to 180°, but the dose at the bone-mucosa interface was found to increase (28% for the 6-MV photon beam and 20% for the 18-MV beam) with the mucosal thickness. For different multibeam configurations, the dependence of mucosal dose on its thickness became insignificant when the number of photon beams around the mucosal tissue was increased. The mucosal dose with bone varied with the beam energy, beam angle, multibeam configuration and mucosal thickness for a small segmental photon field. These dosimetric variations are important to consider in improving the treatment strategy, so that mucosal complications in head-and-neck intensity-modulated radiation therapy can be minimized.

  16. Poster - Thurs Eve-23: Effect of lung density and geometry variation on inhomogeneity correction algorithms: A Monte Carlo dosimetry evaluation.

    PubMed

    Chow, J; Leung, M; Van Dyk, J

    2008-07-01

    This study provides new information on the evaluation of lung dose calculation algorithms as a function of the relative electron density of lung, ρe,lung. Doses calculated using the collapsed cone convolution (CCC) and adaptive convolution (AC) algorithms in lung with the Pinnacle³ system were compared to those calculated using Monte Carlo (MC) simulation (EGSnrc-based code). Three groups of lung phantoms, namely "Slab", "Column" and "Cube", with different ρe,lung (0.05-0.7), positions, volumes and shapes of lung in water were used. 6 and 18 MV photon beams with 4×4 and 10×10 cm² field sizes produced by a Varian 21EX linac were used in the MC dose calculations. Results show that the CCC algorithm agrees well with AC to within ±1% for doses calculated in the lung phantoms, indicating that AC, with 3-4 times less computing time than CCC, is a good substitute for the CCC method. Comparing the CCC and AC with MC, dose deviations are found when ρe,lung is ≤0.1-0.3. The degree of deviation depends on the photon beam energy and field size, and is relatively large when high-energy photon beams with small fields are used. For the penumbra widths (20%-80%), the CCC and AC agree well with MC for the "Slab" and "Cube" phantoms with the lung volumes at the central beam axis (CAX). However, deviations >2 mm occur in the "Column" phantoms, with two lung volumes separated by a water column along the CAX, using the 18 MV (4×4 cm²) photon beams with ρe,lung ≤0.1. © 2008 American Association of Physicists in Medicine.

  17. Modified COMS Plaques for 125I and 103Pd Iris Melanoma Brachytherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomson, Rowan M., E-mail: rthomson@physics.carleton.c; Furutani, Keith M.; Pulido, Jose S.

    2010-11-15

    Purpose: Novel plaques are used to treat iris melanoma at the Mayo Clinic Rochester. The plaques are a modification of the Collaborative Ocular Melanoma Study (COMS) 22 mm plaque design with a gold alloy backing, outer lip, and silicone polymer insert. An inner lip surrounds a 10 mm diameter cutout region at the plaque center. Plaques span 360°, 270°, and 180° arcs. This article describes dosimetry for these plaques and others used in the treatment of anterior eye melanomas. Methods and Materials: The EGSnrc user-code BrachyDose is used to perform Monte Carlo simulations. Plaques and seeds are fully modeled. Three-dimensional dose distributions for different plaque models, TG-43 calculations, and 125I (model 6711) and 103Pd (model 200) seeds are compared via depth-dose curves, tabulation of doses at points of interest, and isodose contours. Results: Doses at points of interest differ by up to 70% from TG-43 calculations. The inner lip reduces corneal doses. Matching plaque arc length to tumor extent reduces doses to eye regions outside the treatment area. Maintaining the same prescription dose, 103Pd offers lower doses to critical structures than 125I, with the exception of the sclera adjacent to the plaque. Conclusion: The Mayo Clinic plaques offer several advantages for anterior eye tumor treatments. Doses to regions outside the treatment area are significantly reduced. Doses differ considerably from TG-43 predictions, illustrating the importance of complete Monte Carlo simulations. Calculations take a few minutes on a single CPU, making BrachyDose sufficiently fast for routine clinical treatment planning.
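
    For reference, the TG-43 values that the plaque doses are compared against follow the standard AAPM TG-43 formalism,

        \dot{D}(r, \theta) = S_K\, \Lambda\, \frac{G_L(r, \theta)}{G_L(r_0, \theta_0)}\, g_L(r)\, F(r, \theta),

    with S_K the air-kerma strength, Λ the dose-rate constant, G_L the line-source geometry function, g_L the radial dose function and F the 2D anisotropy function; because this formalism assumes an unbounded water medium, it cannot account for the gold backing, lips and silicone insert, which is why the full Monte Carlo doses differ from it by up to 70%.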

  18. Parallel processing implementation for the coupled transport of photons and electrons using OpenMP

    NASA Astrophysics Data System (ADS)

    Doerner, Edgardo

    2016-05-01

    In this work the use of OpenMP to implement the parallel processing of the Monte Carlo (MC) simulation of the coupled transport of photons and electrons is presented. This implementation was carried out using a modified EGSnrc platform which enables the use of the Microsoft Visual Studio 2013 (VS2013) environment, together with the development tools available in the Intel Parallel Studio XE 2015 (XE2015). The performance study of this new implementation was carried out on a desktop PC with a multi-core CPU, taking as a reference the performance of the original platform. The results were satisfactory, both in terms of scalability and parallelization efficiency.
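
    A generic illustration of this parallelization strategy (a sketch, not EGSnrc source code): independent particle histories map naturally onto an OpenMP work-sharing loop, with a separate random number stream per thread and a reduction over the scored quantity.

        #include <cmath>
        #include <cstdio>
        #include <omp.h>
        #include <random>

        int main() {
            const long nHistories = 10000000;  // independent particle histories
            double score = 0.0;                // e.g. energy deposited in a region

            #pragma omp parallel reduction(+:score)
            {
                // One random number stream per thread keeps histories independent.
                std::mt19937_64 rng(1234u + static_cast<unsigned>(omp_get_thread_num()));
                std::uniform_real_distribution<double> u(0.0, 1.0);

                #pragma omp for
                for (long i = 0; i < nHistories; ++i) {
                    // Toy "history": sample an exponential path length as a
                    // stand-in for the real coupled photon-electron transport.
                    score += -std::log(1.0 - u(rng));
                }
            }
            std::printf("mean score per history = %f\n", score / nHistories);
            return 0;
        }

    Built with an OpenMP-enabled compiler (e.g. g++ -fopenmp), the run time relative to a single-thread run gives the speed-up and parallel efficiency of the kind studied above.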

  19. Re-evaluation of the correction factors for the GROVEX

    NASA Astrophysics Data System (ADS)

    Ketelhut, Steffen; Meier, Markus

    2018-04-01

    The GROVEX (GROssVolumige EXtrapolationskammer, large-volume extrapolation chamber) is the primary standard for the dosimetry of low-dose-rate interstitial brachytherapy at the Physikalisch-Technische Bundesanstalt (PTB). In the course of setup modifications and re-measuring of several dimensions, the correction factors have been re-evaluated in this work. The correction factors for scatter and attenuation have been recalculated using the Monte Carlo software package EGSnrc, and a new expression has been found for the divergence correction. The obtained results decrease the measured reference air kerma rate by approximately 0.9% for the representative example of a seed of type Bebig I25.S16C. This lies within the expanded uncertainty (k = 2).

  20. SU-G-206-05: A Comparison of Head Phantoms Used for Dose Determination in Imaging Procedures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiong, Z; Vijayan, S; Kilian-Meneghin, J

    Purpose: To determine similarities and differences between various head phantoms that might be used for dose measurements in diagnostic imaging procedures. Methods: We chose four frequently used anthropomorphic head phantoms (SK-150, PBU-50, RS-240T and Alderson Rando), a computational patient phantom (Zubal) and the CTDI head phantom for comparison in our study. We did a CT scan of the head phantoms using the same protocol and compared their dimensions and CT numbers. The scan data were used to calculate dose values for each of the phantoms using EGSnrc Monte Carlo software. An .egsphant file was constructed to describe these phantoms using a Visual C++ program for DOSXYZnrc/EGSnrc simulation. The lens dose was calculated for a simulated CBCT scan using DOSXYZnrc/EGSnrc and the calculated doses were validated with measurements using Gafchromic film and an ionization chamber. Similar calculations and measurements were made for PA radiography to investigate the attenuation and backscatter differences between these phantoms. We used the Zubal phantom as the standard for comparison since it was developed based on a CT scan of a patient. Results: The lens dose for the Alderson Rando phantom is around 9% different from that of the Zubal phantom, while the lens dose for the PBU-50 phantom was about 50% higher, possibly because its skull thickness and the densities of bone and soft tissue are lower than anthropometric values. The lens dose for the CTDI phantom is about 500% higher because of its totally different structure. The entrance dose profiles are similar for the five anthropomorphic phantoms, while that for the CTDI phantom was distinctly different. Conclusion: The CTDI and PBU-50 head phantoms have substantially larger lens dose estimates in CBCT. The other four head phantoms have similar entrance dose with backscatter and hence should be preferred for dose measurements in imaging procedures of the head. Partial support from NIH Grant R01-EB002873 and Toshiba Medical Systems Corp.

  1. Dose reduction of scattered photons from concrete walls lined with lead: Implications for improvement in design of megavoltage radiation therapy facility mazes.

    PubMed

    Al-Affan, I A M; Hugtenburg, R P; Bari, D S; Al-Saleh, W M; Piliero, M; Evans, S; Al-Hasan, M; Al-Zughul, B; Al-Kharouf, S; Ghaith, A

    2015-02-01

    This study explores the possibility of using lead to cover part of the radiation therapy facility maze walls in order to absorb low energy photons and reduce the total dose at the maze entrance of radiation therapy rooms. Experiments and Monte Carlo simulations were utilized to establish the possibility of using high-Z materials to cover the concrete walls of the maze in order to reduce the dose of the scattered photons at the maze entrance. The dose of the backscattered photons from a concrete wall was measured for various scattering angles. The dose was also calculated by the FLUKA and EGSnrc Monte Carlo codes. The FLUKA code was also used to simulate an existing radiotherapy room to study the effect of multiple scattering when adding lead to cover the concrete walls of the maze. Monoenergetic photons were used to represent the main components of the x ray spectrum up to 10 MV. It was observed that when the concrete wall was covered with just 2 mm of lead, the measured dose rate at all backscattering angles was reduced by 20% for photons of energy comparable to Co-60 emissions and 70% for Cs-137 emissions. The simulations with FLUKA and EGS showed that the reduction in the dose was potentially even higher when lead was added. One explanation for the reduction is the increased absorption of backscattered photons due to the photoelectric interaction in lead. The results also showed that adding 2 mm lead to the concrete walls and floor of the maze reduced the dose at the maze entrance by up to 90%. This novel proposal of covering part or the entire maze walls with a few millimeters of lead would have a direct implication for the design of radiation therapy facilities and would assist in upgrading the design of some mazes, especially those in facilities with limited space where the maze length cannot be extended to sufficiently reduce the dose. © 2015 American Association of Physicists in Medicine.

  2. Dose reduction of scattered photons from concrete walls lined with lead: Implications for improvement in design of megavoltage radiation therapy facility mazes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Al-Affan, I. A. M., E-mail: info@medphys-environment.co.uk; Hugtenburg, R. P.; Piliero, M.

    Purpose: This study explores the possibility of using lead to cover part of the radiation therapy facility maze walls in order to absorb low energy photons and reduce the total dose at the maze entrance of radiation therapy rooms. Methods: Experiments and Monte Carlo simulations were utilized to establish the possibility of using high-Z materials to cover the concrete walls of the maze in order to reduce the dose of the scattered photons at the maze entrance. The dose of the backscattered photons from a concrete wall was measured for various scattering angles. The dose was also calculated by the FLUKA and EGSnrc Monte Carlo codes. The FLUKA code was also used to simulate an existing radiotherapy room to study the effect of multiple scattering when adding lead to cover the concrete walls of the maze. Monoenergetic photons were used to represent the main components of the x ray spectrum up to 10 MV. Results: It was observed that when the concrete wall was covered with just 2 mm of lead, the measured dose rate at all backscattering angles was reduced by 20% for photons of energy comparable to Co-60 emissions and 70% for Cs-137 emissions. The simulations with FLUKA and EGS showed that the reduction in the dose was potentially even higher when lead was added. One explanation for the reduction is the increased absorption of backscattered photons due to the photoelectric interaction in lead. The results also showed that adding 2 mm lead to the concrete walls and floor of the maze reduced the dose at the maze entrance by up to 90%. Conclusions: This novel proposal of covering part or the entire maze walls with a few millimeters of lead would have a direct implication for the design of radiation therapy facilities and would assist in upgrading the design of some mazes, especially those in facilities with limited space where the maze length cannot be extended to sufficiently reduce the dose.

  3. Latent uncertainties of the precalculated track Monte Carlo method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Renaud, Marc-André; Seuntjens, Jan; Roberge, David

    Purpose: While significant progress has been made in speeding up Monte Carlo (MC) dose calculation methods, they remain too time-consuming for the purpose of inverse planning. To achieve clinically usable calculation speeds, a precalculated Monte Carlo (PMC) algorithm for proton and electron transport was developed to run on graphics processing units (GPUs). The algorithm utilizes pregenerated particle track data from conventional MC codes for different materials such as water, bone, and lung to produce dose distributions in voxelized phantoms. While PMC methods have been described in the past, an explicit quantification of the latent uncertainty arising from the limited number of unique tracks in the pregenerated track bank is missing from the literature. With a proper uncertainty analysis, an optimal number of tracks in the pregenerated track bank can be selected for a desired dose calculation uncertainty. Methods: Particle tracks were pregenerated for electrons and protons using EGSnrc and GEANT4 and saved in a database. The PMC algorithm for track selection, rotation, and transport was implemented on the Compute Unified Device Architecture (CUDA) 4.0 programming framework. PMC dose distributions were calculated in a variety of media and compared to benchmark dose distributions simulated from the corresponding general-purpose MC codes in the same conditions. A latent uncertainty metric was defined and analysis was performed by varying the pregenerated track bank size and the number of simulated primary particle histories and comparing dose values to a “ground truth” benchmark dose distribution calculated to 0.04% average uncertainty in voxels with dose greater than 20% of D{sub max}. Efficiency metrics were calculated against benchmark MC codes on a single CPU core with no variance reduction. Results: Dose distributions generated using PMC and benchmark MC codes were compared and found to be within 2% of each other in voxels with dose values greater than 20% of the maximum dose. In proton calculations, a small (≤1 mm) distance-to-agreement error was observed at the Bragg peak. Latent uncertainty was characterized for electrons and found to follow a Poisson distribution with the number of unique tracks per energy. A track bank of 12 energies and 60000 unique tracks per pregenerated energy in water had a size of 2.4 GB and achieved a latent uncertainty of approximately 1% at an optimal efficiency gain over DOSXYZnrc. Larger track banks produced a lower latent uncertainty at the cost of increased memory consumption. Using an NVIDIA GTX 590, efficiency analysis showed an 807× efficiency increase over DOSXYZnrc for 16 MeV electrons in water and 508× for 16 MeV electrons in bone. Conclusions: The PMC method can calculate dose distributions for electrons and protons to a statistical uncertainty of 1% with a large efficiency gain over conventional MC codes. Before performing clinical dose calculations, models to calculate dose contributions from uncharged particles must be implemented. Following the successful implementation of these models, the PMC method will be evaluated as a candidate for inverse planning of modulated electron radiation therapy and scanned proton beams.
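
    As an illustrative aside (not the authors' exact metric), the behaviour reported above, a statistical component that falls with the number of simulated histories N_h and a latent component governed by the number of unique pregenerated tracks N_t, can be summarised as

```latex
\sigma_{\mathrm{total}}^{2} \;\approx\; \frac{a}{N_{h}} \;+\; \frac{b}{N_{t}},
```

    where a and b are case-dependent constants introduced here only for illustration; the 1/N_t term mirrors the Poisson-like dependence on the number of unique tracks noted in the abstract, and explains why enlarging the track bank lowers the latent uncertainty floor at the cost of memory.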

  4. Latent uncertainties of the precalculated track Monte Carlo method.

    PubMed

    Renaud, Marc-André; Roberge, David; Seuntjens, Jan

    2015-01-01

    While significant progress has been made in speeding up Monte Carlo (MC) dose calculation methods, they remain too time-consuming for the purpose of inverse planning. To achieve clinically usable calculation speeds, a precalculated Monte Carlo (PMC) algorithm for proton and electron transport was developed to run on graphics processing units (GPUs). The algorithm utilizes pregenerated particle track data from conventional MC codes for different materials such as water, bone, and lung to produce dose distributions in voxelized phantoms. While PMC methods have been described in the past, an explicit quantification of the latent uncertainty arising from the limited number of unique tracks in the pregenerated track bank is missing from the literature. With a proper uncertainty analysis, an optimal number of tracks in the pregenerated track bank can be selected for a desired dose calculation uncertainty. Particle tracks were pregenerated for electrons and protons using EGSnrc and GEANT4 and saved in a database. The PMC algorithm for track selection, rotation, and transport was implemented on the Compute Unified Device Architecture (CUDA) 4.0 programming framework. PMC dose distributions were calculated in a variety of media and compared to benchmark dose distributions simulated from the corresponding general-purpose MC codes in the same conditions. A latent uncertainty metric was defined and analysis was performed by varying the pregenerated track bank size and the number of simulated primary particle histories and comparing dose values to a "ground truth" benchmark dose distribution calculated to 0.04% average uncertainty in voxels with dose greater than 20% of Dmax. Efficiency metrics were calculated against benchmark MC codes on a single CPU core with no variance reduction. Dose distributions generated using PMC and benchmark MC codes were compared and found to be within 2% of each other in voxels with dose values greater than 20% of the maximum dose. In proton calculations, a small (≤ 1 mm) distance-to-agreement error was observed at the Bragg peak. Latent uncertainty was characterized for electrons and found to follow a Poisson distribution with the number of unique tracks per energy. A track bank of 12 energies and 60000 unique tracks per pregenerated energy in water had a size of 2.4 GB and achieved a latent uncertainty of approximately 1% at an optimal efficiency gain over DOSXYZnrc. Larger track banks produced a lower latent uncertainty at the cost of increased memory consumption. Using an NVIDIA GTX 590, efficiency analysis showed an 807× efficiency increase over DOSXYZnrc for 16 MeV electrons in water and 508× for 16 MeV electrons in bone. The PMC method can calculate dose distributions for electrons and protons to a statistical uncertainty of 1% with a large efficiency gain over conventional MC codes. Before performing clinical dose calculations, models to calculate dose contributions from uncharged particles must be implemented. Following the successful implementation of these models, the PMC method will be evaluated as a candidate for inverse planning of modulated electron radiation therapy and scanned proton beams.

  5. Dosimetric impact of dual-energy CT tissue segmentation for low-energy prostate brachytherapy: a Monte Carlo study

    NASA Astrophysics Data System (ADS)

    Remy, Charlotte; Lalonde, Arthur; Béliveau-Nadeau, Dominic; Carrier, Jean-François; Bouchard, Hugo

    2018-01-01

    The purpose of this study is to evaluate the impact of a novel tissue characterization method using dual-energy over single-energy computed tomography (DECT and SECT) on Monte Carlo (MC) dose calculations for low-dose rate (LDR) prostate brachytherapy performed in a patient-like geometry. A virtual patient geometry is created using contours from a real patient pelvis CT scan, where known elemental compositions and varying densities are overwritten in each voxel. A second phantom is made with additional calcifications. Both phantoms are the ground truth with which all results are compared. Simulated CT images are generated from them using attenuation coefficients taken from the XCOM database with a 100 kVp spectrum for SECT and 80 and 140Sn kVp for DECT. Tissue segmentation for Monte Carlo dose calculation is made using a stoichiometric calibration method for the simulated SECT images. For the DECT images, Bayesian eigentissue decomposition is used. An LDR prostate brachytherapy plan is defined with 125I sources and then calculated using the EGSnrc user code BrachyDose for each case. Dose distributions and dose-volume histograms (DVH) are compared to ground truth to assess the accuracy of tissue segmentation. For noiseless images, DECT-based tissue segmentation outperforms the SECT procedure with a root mean square (RMS) error on relative dose differences of 2.39% versus 7.77%, respectively, and provides DVHs closest to the reference DVHs for all tissues. For a medium level of CT noise, Bayesian eigentissue decomposition still performs better on the overall dose calculation as the RMS error is found to be 7.83% compared to 9.15% for SECT. Both methods give a similar DVH for the prostate while the DECT segmentation remains more accurate for organs at risk and in the presence of calcifications, with RMS errors of less than 5% within the calcifications versus up to 154% for SECT. In a patient-like geometry, DECT-based tissue segmentation provides dose distributions with the highest accuracy and the least bias compared to SECT. When imaging noise is considered, the benefits of DECT are noticeable when significant calcifications are present within the prostate.
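
    For reference, a root mean square (RMS) error on relative dose differences of the kind quoted above is commonly defined over the N scored voxels as (the authors' exact normalisation is not reproduced here)

```latex
\mathrm{RMS} \;=\; \sqrt{\frac{1}{N}\sum_{i=1}^{N}\left(\frac{D_{i}-D_{i}^{\mathrm{ref}}}{D_{i}^{\mathrm{ref}}}\right)^{2}},
```

    where D_i is the dose in voxel i of the SECT- or DECT-segmented model and D_i^ref the corresponding ground-truth dose.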

  6. Comparison of film measurements and Monte Carlo simulations of dose delivered with very high-energy electron beams in a polystyrene phantom

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bazalova-Carter, Magdalena; Liu, Michael; Palma, Bianey

    2015-04-15

    Purpose: To measure radiation dose in a water-equivalent medium from very high-energy electron (VHEE) beams and make comparisons to Monte Carlo (MC) simulation results. Methods: Dose in a polystyrene phantom delivered by an experimental VHEE beam line was measured with Gafchromic films for three 50 MeV and two 70 MeV Gaussian beams of 4.0–6.9 mm FWHM and compared to corresponding MC-simulated dose distributions. MC dose in the polystyrene phantom was calculated with the EGSnrc/BEAMnrc and DOSXYZnrc codes based on the experimental setup. Additionally, the effect of 2% beam energy measurement uncertainty and possible non-zero beam angular spread on MC dose distributions was evaluated. Results: MC simulated percentage depth dose (PDD) curves agreed with measurements within 4% for all beam sizes at both 50 and 70 MeV VHEE beams. Central axis PDD at 8 cm depth ranged from 14% to 19% for the 5.4–6.9 mm 50 MeV beams and it ranged from 14% to 18% for the 4.0–4.5 mm 70 MeV beams. MC simulated relative beam profiles of regularly shaped Gaussian beams evaluated at depths of 0.64 to 7.46 cm agreed with measurements to within 5%. A 2% beam energy uncertainty and 0.286° beam angular spread corresponded to a maximum 3.0% and 3.8% difference in depth dose curves of the 50 and 70 MeV electron beams, respectively. Absolute dose differences between MC simulations and film measurements of regularly shaped Gaussian beams were between 10% and 42%. Conclusions: The authors demonstrate that relative dose distributions for VHEE beams of 50–70 MeV can be measured with Gafchromic films and modeled with Monte Carlo simulations to an accuracy of 5%. The reported absolute dose differences likely caused by imperfect beam steering and subsequent charge loss revealed the importance of accurate VHEE beam control and diagnostics.

  7. Evaluation of an analytic linear Boltzmann transport equation solver for high-density inhomogeneities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lloyd, S. A. M.; Ansbacher, W.; Department of Physics and Astronomy, University of Victoria, Victoria, British Columbia V8W 3P6

    2013-01-15

    Purpose: Acuros external beam (Acuros XB) is a novel dose calculation algorithm implemented through the ECLIPSE treatment planning system. The algorithm finds a deterministic solution to the linear Boltzmann transport equation, the same equation commonly solved stochastically by Monte Carlo methods. This work is an evaluation of Acuros XB, by comparison with Monte Carlo, for dose calculation applications involving high-density materials. Existing non-Monte Carlo clinical dose calculation algorithms, such as the analytic anisotropic algorithm (AAA), do not accurately model dose perturbations due to increased electron scatter within high-density volumes. Methods: Acuros XB, AAA, and EGSnrc based Monte Carlo are used to calculate dose distributions from 18 MV and 6 MV photon beams delivered to a cubic water phantom containing a rectangular high density (4.0-8.0 g/cm{sup 3}) volume at its center. The algorithms are also used to recalculate a clinical prostate treatment plan involving a unilateral hip prosthesis, originally evaluated using AAA. These results are compared graphically and numerically using gamma-index analysis. Radio-chromic film measurements are presented to augment Monte Carlo and Acuros XB dose perturbation data. Results: Using a 2% and 1 mm gamma-analysis, between 91.3% and 96.8% of Acuros XB dose voxels containing greater than 50% of the normalized dose were in agreement with Monte Carlo data for virtual phantoms involving 18 MV and 6 MV photons, stainless steel and titanium alloy implants and for on-axis and oblique field delivery. A similar gamma-analysis of AAA against Monte Carlo data showed between 80.8% and 87.3% agreement. Comparing Acuros XB and AAA evaluations of a clinical prostate patient plan involving a unilateral hip prosthesis, Acuros XB showed good overall agreement with Monte Carlo while AAA underestimated dose on the upstream medial surface of the prosthesis due to electron scatter from the high-density material. Film measurements support the dose perturbations demonstrated by Monte Carlo and Acuros XB data. Conclusions: Acuros XB is shown to perform as well as Monte Carlo methods and better than existing clinical algorithms for dose calculations involving high-density volumes.
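
    For readers unfamiliar with the 2%/1 mm criterion used above, the gamma index of Low et al. compares a calculated dose D_c at position r_c against a reference dose D_m at r_m using a dose tolerance ΔD and a distance-to-agreement tolerance Δd:

```latex
\gamma(\mathbf{r}_{m}) \;=\; \min_{\mathbf{r}_{c}}
\sqrt{\frac{\lvert \mathbf{r}_{c}-\mathbf{r}_{m}\rvert^{2}}{\Delta d^{2}}
      + \frac{\left[D_{c}(\mathbf{r}_{c})-D_{m}(\mathbf{r}_{m})\right]^{2}}{\Delta D^{2}}},
```

    with a voxel counted as passing when γ ≤ 1; in this evaluation ΔD = 2% and Δd = 1 mm.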

  8. Sci—Fri PM: Dosimetry—05: Megavoltage electron backscatter: EGSnrc results versus 21 experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ali, E. S. M.; The Ottawa Hospital Cancer Centre, Ottawa; Buchenberg, W.

    2014-08-15

    The accuracy of electron backscatter calculations at megavoltage energies is important for many medical physics applications. In this study, EGSnrc calculations of megavoltage electron backscatter (1–22 MeV) are performed and compared to the data from 21 experiments published between 1954 and 1993 for 25 single elements with atomic numbers from 3 to 92. Typical experimental uncertainties are 15%. For EGSnrc simulations, an ideal detector is assumed, and the most accurate electron physics options are employed, for a combined statistical and systematic uncertainty of 3%. The quantities compared are the backscatter coefficient and the energy spectra (in the backward hemisphere and at specific detector locations). For the backscatter coefficient, the overall agreement is within ±2% in the absolute value of the backscatter coefficient (in per cent), and within 11% of the individual backscatter values. EGSnrc results are systematically on the higher end of the spread of the experimental data, which could be partially from systematic experimental errors discussed in the literature. For the energy spectra, reasonable agreement between simulations and experiments is observed, although there are significant variations in the experimental data. At the lower end of the spectra, simulations are higher than some experimental data, which could be due to reduced experimental sensitivity to lower energy electrons and/or over-estimation by EGSnrc for backscattered secondary electrons. In conclusion, overall good agreement is observed between EGSnrc backscatter calculations and experimental measurements for megavoltage electrons. There is a need for high quality experimental data for the energy spectra of backscattered electrons.

  9. Water equivalency evaluation of PRESAGE® dosimeters for dosimetry of Cs-137 and Ir-192 brachytherapy sources

    NASA Astrophysics Data System (ADS)

    Gorjiara, Tina; Hill, Robin; Kuncic, Zdenka; Baldock, Clive

    2010-11-01

    A major challenge in brachytherapy dosimetry is the measurement of steep dose gradients. This can be achieved with a high spatial resolution three dimensional (3D) dosimeter. PRESAGE® is a polyurethane-based dosimeter which is suitable for 3D dosimetry. Since an ideal dosimeter is radiologically water equivalent, we have investigated the relative dose response of three different PRESAGE® formulations, two with a lower chloride and bromide content than the original one, for Cs-137 and Ir-192 brachytherapy sources. Doses were calculated using the EGSnrc Monte Carlo package. Our results indicate that PRESAGE® dosimeters are suitable for relative dose measurement of Cs-137 and Ir-192 brachytherapy sources and the lower halogen content PRESAGE® dosimeters are more water equivalent than the original formulation.

  10. Monte Carlo Modeling of the Initial Radiation Emitted by a Nuclear Device in the National Capital Region

    DTIC Science & Technology

    2013-07-01

    ... also simulated in the models. Data was derived from calculations using the three-dimensional Monte Carlo radiation transport code MCNP (Monte Carlo N-Particle). An 'input deck' was prepared for MCNP, a general-purpose code designed to simulate neutron, photon, and electron transport.

  11. SU-E-T-796: Variation of Surface Photon Energy Spectra On Bone Heterogeneity and Beam Obliquity Between Flattened and Unflattened Beam

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chow, J; Owrangi, A; Grigorov, G

    Purpose: This study investigates the spectra of surface photon energy and energy fluence in the bone heterogeneity and beam obliquity using flattened and unflattened photon beams. The spectra were calculated in a bone and water phantom using Monte Carlo simulation (the EGSnrc code). Methods: Spectra of energy, energy fluence and mean energy of the 6 MV flattened and unflattened photon beams (field size = 10 × 10 cm{sup 2}) produced by a Varian TrueBEAM linear accelerator were calculated at the surfaces of a bone and water phantom using Monte Carlo simulations. The spectral calculations were repeated with the beam angles turned from 0° to 15°, 30° and 45° in the phantoms. Results: It is found that the unflattened photon beams contained more photons in the low-energy range of 0 – 2 MeV than the flattened beams with a flattening filter. Compared to the water phantom, both the flattened and unflattened beams had slightly fewer photons in the energy range < 0.4 MeV when a bone layer of 1 cm is present under the phantom surface. This shows that the presence of the bone decreased the low-energy photons backscattered to the phantom surface. When the photon beams were rotated from 0° to 45°, the number of photons and the mean photon energy increased with the beam angle. This is because both the flattened and unflattened beams became more hardened when the beam angle increased. With the bone heterogeneity, the mean energies of both photon beams increased correspondingly. This is due to the absorption of low-energy photons by the bone, resulting in more significant beam hardening. Conclusion: The photon spectral information is important in studies on the patient’s surface dose enhancement when using unflattened photon beams in radiotherapy.

  12. SU-F-T-360: Dosimetric Impacts On the Mucosa and Bone in Radiotherapy with Unflattened Photon Beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chow, J; Owrangi, A

    Purpose: This study investigated the dosimetric impacts on the mucosa and bone when using the unflattened photon beams in radiotherapy. Dose calculations were carried out by Monte Carlo simulation. Methods: Heterogeneous phantoms containing water (soft tissue and mucosa), air and bone, with mucosa thicknesses varying from 0.5 – 3 mm were irradiated by the 6 MV unflattened and flattened photon beams (field size = 10 × 10 cm{sup 2}), produced by a Varian TrueBEAM linear accelerator. The photon energy spectra of the beams, mean bone and mucosal doses with different mucosa thicknesses were calculated using the EGSnrc Monte Carlo code. Results: It is found that the flattened photon beams had higher mean bone doses (1.3% and 2% for the upper and lower bone with respect to the phantom geometry, respectively) than the unflattened beams, and the mean bone doses of both beams did not vary significantly with the mucosa thickness. Similarly, flattened photon beams had higher mucosal doses (0.9% and 1.6% for the upper and lower mucosa, respectively) than the unflattened beams. This is due to the larger slope of the depth dose for the unflattened photon beams compared to the flattened. The mucosal doses of both beams were found to increase with the mucosa thickness. Moreover, the mucosal dose differences between the unflattened and flattened beams increased with the mucosa thickness. For the photon energy spectra at the mucosal layers, it is found that the unflattened photon beams contained a larger portion of low-energy photons than the flattened beams. The photon energy spectra did not change significantly with the mucosa thickness. Conclusion: It is concluded that the mucosal and bone doses for the unflattened photon beams were not more than 2% lower than those for the flattened beams, even though the flattening filter free beams contained a larger portion of low-energy photons than the flattened beams.

  13. Magnetic field influences on the lateral dose response functions of photon-beam detectors: MC study of wall-less water-filled detectors with various densities.

    PubMed

    Looe, Hui Khee; Delfs, Björn; Poppinga, Daniela; Harder, Dietrich; Poppe, Björn

    2017-06-21

    The distortion of detector reading profiles across photon beams in the presence of magnetic fields is a developing subject of clinical photon-beam dosimetry. The underlying modification by the Lorentz force of a detector's lateral dose response function (the convolution kernel transforming the true cross-beam dose profile in water into the detector reading profile) is here studied for the first time. The three basic convolution kernels, the photon fluence response function, the dose deposition kernel, and the lateral dose response function, of wall-less cylindrical detectors filled with water of low, normal and enhanced density are shown by Monte Carlo simulation to be distorted in the prevailing direction of the Lorentz force. The asymmetric shape changes of these convolution kernels in a water medium and in magnetic fields of up to 1.5 T are confined to the lower millimetre range, and they depend on the photon beam quality, the magnetic flux density and the detector's density. The impact of this distortion on detector reading profiles is demonstrated using a narrow photon beam profile. For clinical applications it appears favourable that the magnetic-flux-density-dependent distortion of the lateral dose response function, as far as secondary electron transport is concerned, vanishes in the case of water-equivalent detectors of normal water density. Secondary-electron history backtracing is used to elucidate the spatial distribution of the photon interactions that give rise, either directly or via scattered photons further downstream, to the secondary electrons contributing to the detector's signal, and the lateral shift of this distribution due to the Lorentz force. Electron history backtracing also serves to illustrate the correct treatment of the influences of the Lorentz force in the EGSnrc Monte Carlo code applied in this study.
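
    The convolution relation described above can be written explicitly (one-dimensional form, following the abstract's wording) as

```latex
M(x) \;=\; \int K(x - x')\, D_{\mathrm{w}}(x')\, \mathrm{d}x',
```

    where D_w is the true cross-beam dose profile in water, K the detector's lateral dose response function and M the resulting detector reading profile; the study examines how the Lorentz force distorts K.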

  14. Comparison of air-kerma strength determinations for HDR (192)Ir sources.

    PubMed

    Rasmussen, Brian E; Davis, Stephen D; Schmidt, Cal R; Micka, John A; Dewerd, Larry A

    2011-12-01

    To perform a comparison of the interim air-kerma strength standard for high dose rate (HDR) (192)Ir brachytherapy sources maintained by the University of Wisconsin Accredited Dosimetry Calibration Laboratory (UWADCL) with measurements of the various source models using modified techniques from the literature. The current interim standard was established by Goetsch et al. in 1991 and has remained unchanged to date. The improved, laser-aligned seven-distance apparatus of the University of Wisconsin Medical Radiation Research Center (UWMRRC) was used to perform air-kerma strength measurements of five different HDR (192)Ir source models. The results of these measurements were compared with those from well chambers traceable to the original standard. Alternative methodologies for interpolating the (192)Ir air-kerma calibration coefficient from the NIST air-kerma standards at (137)Cs and 250 kVp x rays (M250) were investigated and intercompared. As part of the interpolation method comparison, the Monte Carlo code EGSnrc was used to calculate updated values of A(wall) for the Exradin A3 chamber used for air-kerma strength measurements. The effects of air attenuation and scatter, room scatter, as well as the solution method were investigated in detail. The average measurements when using the inverse N(K) interpolation method for the Classic Nucletron, Nucletron microSelectron, VariSource VS2000, GammaMed Plus, and Flexisource were found to be 0.47%, -0.10%, -1.13%, -0.20%, and 0.89% different than the existing standard, respectively. A further investigation of the differences observed between the sources was performed using MCNP5 Monte Carlo simulations of each source model inside a full model of an HDR 1000 Plus well chamber. Although the differences between the source models were found to be statistically significant, the equally weighted average difference between the seven-distance measurements and the well chambers was 0.01%, confirming that it is not necessary to update the current standard maintained at the UWADCL.
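
    For orientation, the 1991 Goetsch-style interim standard referred to above is usually written as an interpolation of the wall-corrected air-kerma calibration coefficients measured at 137Cs and 250 kVp x rays (textbook form shown here; the alternative 'inverse N_K' variant investigated by the authors is not reproduced):

```latex
\left(A_{\mathrm{w}} N_{K}\right)_{^{192}\mathrm{Ir}}
\;=\; \tfrac{1}{2}\left[\left(A_{\mathrm{w}} N_{K}\right)_{^{137}\mathrm{Cs}}
      + \left(A_{\mathrm{w}} N_{K}\right)_{250\,\mathrm{kVp}}\right],
```

    where N_K is the air-kerma calibration coefficient and A_w the chamber wall attenuation and scatter correction at each beam quality.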

  15. Development of virtual patient models for permanent implant brachytherapy Monte Carlo dose calculations: interdependence of CT image artifact mitigation and tissue assignment.

    PubMed

    Miksys, N; Xu, C; Beaulieu, L; Thomson, R M

    2015-08-07

    This work investigates and compares CT image metallic artifact reduction (MAR) methods and tissue assignment schemes (TAS) for the development of virtual patient models for permanent implant brachytherapy Monte Carlo (MC) dose calculations. Four MAR techniques are investigated to mitigate seed artifacts from post-implant CT images of a homogeneous phantom and eight prostate patients: a raw sinogram approach using the original CT scanner data and three methods (simple threshold replacement (STR), 3D median filter, and virtual sinogram) requiring only the reconstructed CT image. Virtual patient models are developed using six TAS ranging from the AAPM-ESTRO-ABG TG-186 basic approach of assigning uniform density tissues (resulting in a model not dependent on MAR) to more complex models assigning prostate, calcification, and mixtures of prostate and calcification using CT-derived densities. The EGSnrc user-code BrachyDose is employed to calculate dose distributions. All four MAR methods eliminate bright seed spot artifacts, and the image-based methods provide comparable mitigation of artifacts compared with the raw sinogram approach. However, each MAR technique has limitations: STR is unable to mitigate low CT number artifacts, the median filter blurs the image which challenges the preservation of tissue heterogeneities, and both sinogram approaches introduce new streaks. Large local dose differences are generally due to differences in voxel tissue-type rather than mass density. The largest differences in target dose metrics (D90, V100, V150), which are over 50% lower than for the other models, occur when uncorrected CT images are used with TAS that consider calcifications. Metrics found using models which include calcifications are generally a few percent lower than for prostate-only models. Generally, metrics from any MAR method and any TAS which considers calcifications agree within 6%. Overall, the studied MAR methods and TAS show promise for further retrospective MC dose calculation studies for various permanent implant brachytherapy treatments.
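
    The two image-based MAR approaches named above (simple threshold replacement and a 3D median filter) can be sketched in a few lines. The Python/SciPy version below is purely illustrative, with assumed HU thresholds and filter size, and is not the authors' implementation:

```python
import numpy as np
from scipy.ndimage import median_filter

def simple_threshold_replacement(ct, seed_mask, hu_replace=40.0, hu_bright=1500.0):
    """Replace bright seed voxels and residual bright streaks with a soft-tissue HU value."""
    out = ct.astype(float).copy()
    out[seed_mask] = hu_replace        # remove the seeds themselves
    out[out > hu_bright] = hu_replace  # clip remaining bright artifacts
    return out

def median_filter_mar(ct, size=3):
    """3D median filter: suppresses streaks but blurs genuine heterogeneities."""
    return median_filter(ct, size=size)

# toy volume with a synthetic bright 'seed'
ct = np.random.normal(40.0, 15.0, size=(32, 64, 64))
seed_mask = np.zeros(ct.shape, dtype=bool)
seed_mask[16, 30:33, 30:33] = True
ct[seed_mask] = 3000.0

ct_str = simple_threshold_replacement(ct, seed_mask)
ct_med = median_filter_mar(ct)
```

    The trade-off noted in the abstract is visible even in this toy version: thresholding only touches bright voxels (and cannot fix low CT number artifacts), while the median filter also smooths genuine tissue heterogeneities.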

  16. Mathematical modelling of scanner-specific bowtie filters for Monte Carlo CT dosimetry

    NASA Astrophysics Data System (ADS)

    Kramer, R.; Cassola, V. F.; Andrade, M. E. A.; de Araújo, M. W. C.; Brenner, D. J.; Khoury, H. J.

    2017-02-01

    The purpose of bowtie filters in CT scanners is to homogenize the x-ray intensity measured by the detectors in order to improve the image quality and at the same time to reduce the dose to the patient because of the preferential filtering near the periphery of the fan beam. For CT dosimetry, especially for Monte Carlo calculations of organ and tissue absorbed doses to patients, it is important to take the effect of bowtie filters into account. However, material composition and dimensions of these filters are proprietary. Consequently, a method for bowtie filter simulation independent of access to proprietary data and/or to a specific scanner would be of interest to many researchers involved in CT dosimetry. This study presents such a method based on the weighted computed tomography dose index, CTDIw, defined in two cylindrical PMMA phantoms of 16 cm and 32 cm diameter. With an EGSnrc-based Monte Carlo (MC) code, ratios CTDIw/CTDI100,a were calculated for a specific CT scanner using PMMA bowtie filter models based on sigmoid Boltzmann functions combined with a scanner filter factor (SFF) which is modified during calculations until the calculated MC CTDIw/CTDI100,a matches ratios CTDIw/CTDI100,a determined by measurements or found in publications for that specific scanner. Once the scanner-specific value for an SFF has been found, the bowtie filter algorithm can be used in any MC code to perform CT dosimetry for that specific scanner. The bowtie filter model proposed here was validated for CTDIw/CTDI100,a considering 11 different CT scanners and for CTDI100,c, CTDI100,p and their ratio considering 4 different CT scanners. Additionally, comparisons were made for lateral dose profiles free in air and using computational anthropomorphic phantoms. CTDIw/CTDI100,a determined with this new method agreed on average within 0.89% (max. 3.4%) and 1.64% (max. 4.5%) with corresponding data published by CTDosimetry (www.impactscan.org) for the CTDI HEAD and BODY phantoms, respectively. Comparison with results calculated using proprietary data for the PHILIPS Brilliance 64 scanner showed agreement on average within 2.5% (max. 5.8%) and with data measured for that scanner within 2.1% (max. 3.7%). Ratios of CTDI100,c/CTDI100,p for this study and corresponding data published by CTDosimetry (www.impactscan.org) agree on average within about 11% (max. 28.6%). Lateral dose profiles calculated with the proposed bowtie filter and with proprietary data agreed within 2% (max. 5.9%), and both calculated data sets agreed within 5.4% (max. 11.2%) with measured results. Application of the proposed bowtie filter and of the exactly modelled filter to human phantom Monte Carlo calculations shows agreement on average within less than 5% (max. 7.9%) for organ and tissue absorbed doses.
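
    As a rough illustration of the parameterisation idea (the published functional form and the exact role of the scanner filter factor are not reproduced here), a sigmoid Boltzmann function can describe how the equivalent PMMA thickness of a bowtie filter grows from the centre towards the periphery of the fan beam:

```latex
t(\theta) \;=\; t_{2} + \frac{t_{1}-t_{2}}{1+\exp\!\left(\frac{\lvert\theta\rvert-\theta_{0}}{\Delta\theta}\right)},
```

    where θ is the fan angle, t_1 and t_2 the central and peripheral thickness limits, and θ_0 and Δθ shape parameters that, together with the scanner filter factor, are adjusted until the Monte Carlo CTDIw/CTDI100,a matches the measured or published ratio for the scanner of interest.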

  17. Comparison of space radiation calculations for deterministic and Monte Carlo transport codes

    NASA Astrophysics Data System (ADS)

    Lin, Zi-Wei; Adams, James; Barghouty, Abdulnasser; Randeniya, Sharmalee; Tripathi, Ram; Watts, John; Yepes, Pablo

    For space radiation protection of astronauts or electronic equipment, it is necessary to develop and use accurate radiation transport codes. Radiation transport codes include deterministic codes, such as HZETRN from NASA and UPROP from the Naval Research Laboratory, and Monte Carlo codes such as FLUKA, the Geant4 toolkit and HETC-HEDS. The deterministic codes and Monte Carlo codes complement each other in that deterministic codes are very fast while Monte Carlo codes are more elaborate. Therefore it is important to investigate how well the results of deterministic codes compare with those of Monte Carlo transport codes and where they differ. In this study we evaluate these different codes in their space radiation applications by comparing their output results in the same given space radiation environments, shielding geometry and material. Typical space radiation environments such as the 1977 solar minimum galactic cosmic ray environment are used as the well-defined input, and simple geometries made of aluminum, water and/or polyethylene are used to represent the shielding material. We then compare various outputs of these codes, such as the dose-depth curves and the flux spectra of different fragments and other secondary particles. These comparisons enable us to learn more about the main differences between these space radiation transport codes. At the same time, they help us to learn the qualitative and quantitative features that these transport codes have in common.

  18. JADA: a graphical user interface for comprehensive internal dose assessment in nuclear medicine.

    PubMed

    Grimes, Joshua; Uribe, Carlos; Celler, Anna

    2013-07-01

    The main objective of this work was to design a comprehensive dosimetry package that would keep all aspects of internal dose calculation within the framework of a single software environment and that would be applicable for a variety of dose calculation approaches. Our MATLAB-based graphical user interface (GUI) can be used for processing data obtained using pure planar, pure SPECT, or hybrid planar/SPECT imaging. Time-activity data for source regions are obtained using a set of tools that allow the user to reconstruct SPECT images, load images, coregister a series of planar images, and to perform two-dimensional and three-dimensional image segmentation. Curve fits are applied to the acquired time-activity data to construct time-activity curves, which are then integrated to obtain time-integrated activity coefficients. Subsequently, dose estimates are made using one of three methods. The organ level dose calculation subGUI calculates mean organ doses that are equivalent to dose assessment performed by OLINDA/EXM. Voxelized dose calculation options, which include the voxel S value approach and Monte Carlo simulation using the EGSnrc user code DOSXYZnrc, are available within the process 3D image data subGUI. The developed internal dosimetry software package provides an assortment of tools for every step in the dose calculation process, eliminating the need for manual data transfer between programs. This saves time and minimizes user errors, while offering a versatility that can be used to efficiently perform patient-specific internal dose calculations in a variety of clinical situations.
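
    The curve-fit-and-integrate step described above can be illustrated with a mono-exponential model. The Python/SciPy sketch below uses made-up time-activity samples and an assumed administered activity; it is not part of the JADA package:

```python
import numpy as np
from scipy.optimize import curve_fit

def mono_exp(t, A0, lam):
    """A(t) = A0 * exp(-lam * t), with lam the effective clearance constant (1/h)."""
    return A0 * np.exp(-lam * t)

# hypothetical time-activity samples for one source region (hours, MBq)
t = np.array([1.0, 4.0, 24.0, 48.0, 72.0])
A = np.array([95.0, 80.0, 35.0, 14.0, 6.0])

(A0, lam), _ = curve_fit(mono_exp, t, A, p0=(100.0, 0.03))

# time-integrated activity (MBq*h), integrating the fitted curve from 0 to infinity
tia = A0 / lam
administered = 100.0  # MBq, assumed
tia_coefficient = tia / administered  # time-integrated activity coefficient (h)
print(f"TIA = {tia:.1f} MBq h, coefficient = {tia_coefficient:.2f} h")
```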

  19. Measurement of multiple scattering of 13 and 20 MeV electrons by thin foils

    PubMed Central

    Ross, C. K.; McEwen, M. R.; McDonald, A. F.; Cojocaru, C. D.; Faddegon, B. A.

    2008-01-01

    To model the transport of electrons through material requires knowledge of how the electrons lose energy and scatter. Theoretical models are used to describe electron energy loss and scatter and these models are supported by a limited amount of measured data. The purpose of this work was to obtain additional data that can be used to test models of electron scattering. Measurements were carried out using 13 and 20 MeV pencil beams of electrons produced by the National Research Council of Canada research accelerator. The electron fluence was measured at several angular positions from 0° to 9° for scattering foils of different thicknesses and with atomic numbers ranging from 4 to 79. The angle, θ1∕e, at which the fluence has decreased to 1∕e of its value on the central axis was used to characterize the distributions. Measured values of θ1∕e ranged from 1.5° to 8° with a typical uncertainty of about 1%. Distributions calculated using the EGSnrc Monte Carlo code were compared to the measured distributions. In general, the calculated distributions are narrower than the measured ones. Typically, the difference between the measured and calculated values of θ1∕e is about 1.5%, with the maximum difference being 4%. The measured and calculated distributions are related through a simple scaling of the angle, indicating that they have the same shape. No significant trends with atomic number were observed. PMID:18841865
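
    The 1/e-width characterisation used above is straightforward to reproduce. The sketch below (illustrative Python, with a synthetic, roughly Gaussian angular distribution standing in for measured data) interpolates the angle at which the relative fluence drops to 1/e of its central-axis value:

```python
import numpy as np

def theta_1_over_e(theta_deg, fluence):
    """Angle at which the relative fluence falls to 1/e of its central-axis value."""
    rel = np.asarray(fluence, dtype=float) / fluence[0]
    target = 1.0 / np.e
    idx = np.argmax(rel < target)  # first sampled angle below 1/e
    # linear interpolation between the two samples bracketing 1/e
    x0, x1 = theta_deg[idx - 1], theta_deg[idx]
    y0, y1 = rel[idx - 1], rel[idx]
    return x0 + (target - y0) * (x1 - x0) / (y1 - y0)

# hypothetical scan of relative fluence versus angle (degrees)
angles = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0])
fluence = np.exp(-(angles / 4.0) ** 2)  # roughly Gaussian angular spread
print(f"theta_1/e = {theta_1_over_e(angles, fluence):.2f} deg")
```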

  20. SU-F-I-13: Correction Factor Computations for the NIST Ritz Free Air Chamber for Medium-Energy X Rays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bergstrom, P

    Purpose: The National Institute of Standards and Technology (NIST) uses 3 free-air chambers to establish primary standards for radiation dosimetry at x-ray energies. For medium-energy x rays, the Ritz free-air chamber is the main measurement device. In order to convert the charge or current collected by the chamber to the radiation quantities air kerma or air kerma rate, a number of correction factors specific to the chamber must be applied. Methods: We used the Monte Carlo codes EGSnrc and PENELOPE. Results: Among these correction factors are the diaphragm correction (which accounts for interactions of photons from the x-ray source in the beam-defining diaphragm of the chamber), the scatter correction (which accounts for the effects of photons scattered out of the primary beam), the electron-loss correction (which accounts for electrons that only partially expend their energy in the collection region), the fluorescence correction (which accounts for ionization due to reabsorption of fluorescence photons) and the bremsstrahlung correction (which accounts for the reabsorption of bremsstrahlung photons). We have computed monoenergetic corrections for the NIST Ritz chamber for the 1 cm, 3 cm and 7 cm collection plates. Conclusion: We find good agreement with others' results for the 7 cm plate. The data used to obtain these correction factors will be used to establish air kerma and its uncertainty in the standard NIST x-ray beams.
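
    For context, the corrections listed above enter the usual free-air-chamber relation between collected charge and air kerma, quoted here in its generic textbook form rather than NIST's exact formulation:

```latex
K_{\mathrm{air}} \;=\; \frac{Q}{\rho_{\mathrm{air}} V}\;\frac{W_{\mathrm{air}}}{e}\;\frac{1}{1-\bar{g}}\;\prod_{i} k_{i},
```

    where Q is the collected charge, ρ_air V the mass of air in the measuring volume defined by the diaphragm aperture and collecting-plate length, W_air/e the mean energy expended per unit charge released in dry air, ḡ the radiative-loss fraction, and k_i the correction factors (diaphragm, scatter, electron loss, fluorescence, bremsstrahlung) computed in this work.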

  1. SU-E-T-101: Determination and Comparison of Correction Factors Obtained for TLDs in Small Field Lung Heterogenous Phantom Using Acuros XB and EGSnrc

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soh, R; Lee, J; Harianto, F

    Purpose: To determine and compare the correction factors obtained for TLDs in a 2 × 2cm{sup 2} small field in a lung heterogeneous phantom using Acuros XB (AXB) and EGSnrc. Methods: This study will simulate the correction factors due to the perturbation of TLD-100 chips (Harshaw/Thermoscientific, 3 × 3 × 0.9mm{sup 3}, 2.64g/cm{sup 3}) in a small field lung medium for Stereotactic Body Radiation Therapy (SBRT). A physical lung phantom was simulated by a 14cm thick composite cork phantom (0.27g/cm{sup 3}, HU:-743 ± 11) sandwiched between 4cm thick Plastic Water (CIRS, Norfolk). Composite cork has been shown to be a good lung substitute material for dosimetric studies. A 6MV photon beam from a Varian Clinac iX (Varian Medical Systems, Palo Alto, CA) with field size 2 × 2cm{sup 2} was simulated. Depth dose profiles were obtained from the Eclipse treatment planning system Acuros XB (AXB) and independently from DOSxyznrc, EGSnrc. Correction factors were calculated as the ratio of unperturbed to perturbed dose. Since AXB has limitations in simulating actual material compositions, EGSnrc will also simulate the AXB-based material composition for comparison to the actual lung phantom. Results: TLD-100, with its finite size and relatively high density, causes significant perturbation in a 2 × 2cm{sup 2} small field in a low-density lung phantom. Correction factors calculated by both EGSnrc and AXB were found to be as low as 0.9. It is expected that the correction factor obtained by EGSnrc will be more accurate as it is able to simulate the actual phantom material compositions. AXB has a limited material library, therefore it only approximates the composition of TLD, composite cork and Plastic Water, contributing to uncertainties in TLD correction factors. Conclusion: It is expected that the correction factors obtained by EGSnrc will be more accurate. Studies will be done to investigate the correction factors for higher energies where perturbation may be more pronounced.

  2. Dosimetric response of variable-size cavities in photon-irradiated media and the behaviour of the Spencer-Attix cavity integral with increasing Δ.

    PubMed

    Kumar, Sudhir; Deshpande, Deepak D; Nahum, Alan E

    2016-04-07

    Cavity theory is fundamental to understanding and predicting dosimeter response. Conventional cavity theories have been shown to be consistent with one another by deriving the electron (+positron) and photon fluence spectra with the FLURZnrc user-code (EGSnrc Monte-Carlo system) in large volumes under quasi-CPE for photon beams of 1 MeV and 10 MeV in three materials (water, aluminium and copper) and then using these fluence spectra to evaluate and then inter-compare the Bragg-Gray, Spencer-Attix and 'large photon' 'cavity integrals'. The behaviour of the 'Spencer-Attix dose' (aka restricted cema), D_S-A(Δ), in a 1-MeV photon field in water has been investigated for a wide range of values of the cavity-size parameter Δ: D_S-A(Δ) decreases far below the Monte-Carlo dose (D_MC) for Δ greater than ≈ 30 keV due to secondary electrons with starting energies below Δ not being 'counted'. We show that for a quasi-scatter-free geometry D_S-A(Δ)/D_MC is closely equal to the proportion of energy transferred to Compton electrons with initial (kinetic) energies above Δ, derived from the Klein-Nishina (K-N) differential cross section. D_S-A(Δ)/D_MC can be used to estimate the maximum size of a detector behaving as a Bragg-Gray cavity in a photon-irradiated medium as a function of photon-beam quality (under quasi CPE) e.g. a typical air-filled ion chamber is 'Bragg-Gray' at (monoenergetic) beam energies ≥ 260 keV. Finally, by varying the density of a silicon cavity (of 2.26 mm diameter and 2.0 mm thickness) in water, the response of different cavity 'sizes' was simulated; the Monte-Carlo-derived ratio D_w/D_Si for 6 MV and 15 MV photons varied from very close to the Spencer-Attix value at 'gas' densities, agreed well with Burlin cavity theory as ρ increased, and approached large photon behaviour for ρ ≈ 10 g cm(-3). The estimate of Δ for the Si cavity was improved by incorporating a Monte-Carlo-derived correction for electron 'detours'. Excellent agreement was obtained between the Burlin 'd' factor for the Si cavity and D_S-A(Δ)/D_MC at different (detour-corrected) Δ, thereby suggesting a further application for the D_S-A(Δ)/D_MC ratio.
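
    For reference, the Spencer-Attix cavity integral (restricted cema) evaluated in this record has the standard form, including the track-end term:

```latex
D_{\mathrm{S\text{-}A}}(\Delta) \;=\; \int_{\Delta}^{E_{\max}} \Phi_{E}\left(\frac{L_{\Delta}(E)}{\rho}\right)\mathrm{d}E
\;+\; \Phi(\Delta)\left(\frac{S(\Delta)}{\rho}\right)\Delta,
```

    where Φ_E is the fluence, differential in energy, of electrons (and positrons), L_Δ/ρ the restricted mass collision stopping power with cut-off Δ, and the final term accounts for the track ends of electrons slowing down below Δ.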

  3. Verification and Validation of Monte Carlo n-Particle Code 6 (MCNP6) with Neutron Protection Factor Measurements of an Iron Box

    DTIC Science & Technology

    2014-03-27

    Thesis (AFIT-ENP-14-M-05): Verification and Validation of Monte Carlo N-Particle Code 6 (MCNP6) with Neutron Protection Factor Measurements of an Iron Box. Presented to the Faculty, Department of Engineering [...]. Distribution Statement A: approved for public release; distribution unlimited.

  4. Estimation of computed tomography dose index in cone beam computed tomography: MOSFET measurements and Monte Carlo simulations.

    PubMed

    Kim, Sangroh; Yoshizumi, Terry; Toncheva, Greta; Yoo, Sua; Yin, Fang-Fang; Frush, Donald

    2010-05-01

    To address the lack of an accurate dose estimation method in cone beam computed tomography (CBCT), we performed point dose metal oxide semiconductor field-effect transistor (MOSFET) measurements and Monte Carlo (MC) simulations. A Varian On-Board Imager (OBI) was employed to measure point doses in the polymethyl methacrylate (PMMA) CT phantoms with MOSFETs for standard and low dose modes. An MC model of the OBI x-ray tube was developed using the BEAMnrc/EGSnrc MC system and validated by the half value layer, x-ray spectrum and lateral and depth dose profiles. We compared the weighted computed tomography dose index (CTDIw) between MOSFET measurements and MC simulations. The CTDIw was found to be 8.39 cGy for the head scan and 4.58 cGy for the body scan from the MOSFET measurements in standard dose mode, and 1.89 cGy for the head and 1.11 cGy for the body in low dose mode, respectively. The CTDIw from MC agreed with the MOSFET measurements to within 5%. In conclusion, an MC model for Varian CBCT has been established and this approach may be easily extended from the CBCT geometry to multi-detector CT geometry.
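
    The weighted CTDI compared above combines measurements at the centre and periphery of the standard PMMA phantom in the usual way:

```latex
\mathrm{CTDI}_{100} = \frac{1}{N\,T}\int_{-50\,\mathrm{mm}}^{+50\,\mathrm{mm}} D(z)\,\mathrm{d}z,
\qquad
\mathrm{CTDI}_{w} = \tfrac{1}{3}\,\mathrm{CTDI}_{100,\mathrm{c}} + \tfrac{2}{3}\,\mathrm{CTDI}_{100,\mathrm{p}},
```

    where D(z) is the dose profile along the rotation axis, N the number of simultaneously acquired slices, T the nominal slice thickness, and the subscripts c and p denote the central and peripheral positions in the phantom.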

  5. Organ doses can be estimated from the computed tomography (CT) dose index for cone-beam CT on radiotherapy equipment.

    PubMed

    Martin, Colin J; Abuhaimed, Abdullah; Sankaralingam, Marimuthu; Metwaly, Mohamed; Gentle, David J

    2016-06-01

    Cone beam computed tomography (CBCT) systems are fitted to radiotherapy linear accelerators and used for patient positioning prior to treatment by image guided radiotherapy (IGRT). Radiotherapists' and radiographers' knowledge of doses to organs from CBCT imaging is limited. The weighted CT dose index for a reference beam of width 20 mm (CTDIw,ref) is displayed on Varian CBCT imaging equipment known as an On-Board Imager (OBI) linked to the Truebeam linear accelerator. This has the potential to provide an indication of organ doses. This knowledge would be helpful for guidance of radiotherapy clinicians preparing treatments. Monte Carlo simulations of imaging protocols for head, thorax and pelvic scans have been performed using EGSnrc/BEAMnrc, EGSnrc/DOSXYZnrc, and ICRP reference computational male and female phantoms to derive the mean absorbed doses to organs and tissues, which have been compared with values for the CTDIw,ref displayed on the CBCT scanner console. Substantial variations in dose were observed between male and female phantoms. Nevertheless, the CTDIw,ref gave doses within ±21% for the stomach and liver in thorax scans and 2 × CTDIw,ref can be used as a measure of doses to breast, lung and oesophagus. The CTDIw,ref could provide indications of doses to the brain for head scans, and the colon for pelvic scans. It is proposed that knowledge of the link between CTDIw for CBCT should be promoted and included in the training of radiotherapy staff.

  6. Full 3D visualization tool-kit for Monte Carlo and deterministic transport codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frambati, S.; Frignani, M.

    2012-07-01

    We propose a package of tools capable of translating the geometric inputs and outputs of many Monte Carlo and deterministic radiation transport codes into open source file formats. These tools are aimed at bridging the gap between trusted, widely-used radiation analysis codes and very powerful, more recent and commonly used visualization software, thus supporting the design process and helping with shielding optimization. Three main lines of development were followed: mesh-based analysis of Monte Carlo codes, mesh-based analysis of deterministic codes and Monte Carlo surface meshing. The developed kit is considered a powerful and cost-effective tool in the computer-aided design for radiation transport code users of the nuclear world, and in particular in the fields of core design and radiation analysis. (authors)

  7. A photon spectrometric dose-rate constant determination for the Advantage Pd-103 brachytherapy source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Zhe Jay; Bongiorni, Paul; Nath, Ravinder

    Purpose: Although several dosimetric characterizations using Monte Carlo simulation and thermoluminescent dosimetry (TLD) have been reported for the new Advantage Pd-103 source (IsoAid, LLC, Port Richey, FL), no AAPM consensus value has been established for the dosimetric parameters of the source. The aim of this work was to perform an additional dose-rate constant ({Lambda}) determination using a recently established photon spectrometry technique (PST) that is independent of the published TLD and Monte Carlo techniques. Methods: Three Model IAPD-103A Advantage Pd-103 sources were used in this study. The relative photon energy spectrum emitted by each source along the transverse axis was measured using a high-resolution germanium spectrometer designed for low-energy photons. For each source, the dose-rate constant was determined from its emitted energy spectrum. The PST-determined dose-rate constant ({sub PST}{Lambda}) was then compared to those determined by TLD ({sub TLD}{Lambda}) and Monte Carlo ({sub MC}{Lambda}) techniques. A likely consensus {Lambda} value was estimated as the arithmetic mean of the average {Lambda} values determined by each of three different techniques. Results: The average {sub PST}{Lambda} value for the three Advantage sources was found to be (0.676{+-}0.026) cGyh{sup -1} U{sup -1}. Intersource variation in {sub PST}{Lambda} was less than 0.01%. The {sub PST}{Lambda} was within 2% of the reported {sub MC}{Lambda} values determined by PTRAN, EGSnrc, and MCNP5 codes. It was 3.4% lower than the reported {sub TLD}{Lambda}. A likely consensus {Lambda} value was estimated to be (0.688{+-}0.026) cGyh{sup -1} U{sup -1}, similar to the AAPM consensus values recommended currently for the Theragenics (Buford, GA) Model 200 (0.686{+-}0.033) cGyh{sup -1} U{sup -1}, the NASI (Chatsworth, CA) Model MED3633 (0.688{+-}0.033) cGyh{sup -1} U{sup -1}, and the Best Medical (Springfield, VA) Model 2335 (0.685{+-}0.033) cGyh{sup -1} U{sup -1} {sup 103}Pd sources. Conclusions: An independent {Lambda} determination has been performed for the Advantage Pd-103 source. The {sub PST}{Lambda} obtained in this work provides additional information needed for establishing a more accurate consensus {Lambda} value for the Advantage Pd-103 source.
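
    The dose-rate constant determined above is the standard TG-43 quantity

```latex
\Lambda \;=\; \frac{\dot{D}(r_{0},\theta_{0})}{S_{K}}, \qquad r_{0}=1\,\mathrm{cm},\;\; \theta_{0}=90^{\circ},
```

    i.e. the dose rate to water at 1 cm on the transverse axis per unit air-kerma strength S_K, which is why the values are quoted in cGy h^-1 U^-1.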

  8. Patient-specific CT dosimetry calculation: a feasibility study.

    PubMed

    Fearon, Thomas; Xie, Huchen; Cheng, Jason Y; Ning, Holly; Zhuge, Ying; Miller, Robert W

    2011-11-15

    Current estimation of radiation dose from computed tomography (CT) scans on patients has relied on the measurement of the Computed Tomography Dose Index (CTDI) in standard cylindrical phantoms, and calculations based on mathematical representations of "standard man". Radiation dose to both adult and pediatric patients from a CT scan has been a concern, as noted in recent reports. The purpose of this study was to investigate the feasibility of adapting a radiation treatment planning system (RTPS) to provide patient-specific CT dosimetry. A radiation treatment planning system was modified to calculate patient-specific CT dose distributions, which can be represented by dose at specific points within an organ of interest, as well as organ dose-volumes (after image segmentation) for a GE Light Speed Ultra Plus CT scanner. The RTPS calculation algorithm is based on a semi-empirical, measured correction-based algorithm, which has been well established in the radiotherapy community. Digital representations of the physical phantoms (virtual phantoms) were acquired with the GE CT scanner in axial mode. Thermoluminescent dosimeter (TLD) measurements in pediatric anthropomorphic phantoms were utilized to validate the dose at specific points within organs of interest relative to RTPS calculations and Monte Carlo simulations of the same virtual phantoms (digital representations). Congruence of the calculated and measured point doses for the same physical anthropomorphic phantom geometry was used to verify the feasibility of the method. The RTPS algorithm can be extended to calculate the organ dose by calculating a dose distribution point-by-point for a designated volume. Electron Gamma Shower (EGSnrc) codes for radiation transport calculations developed by the National Research Council of Canada (NRCC) were utilized to perform the Monte Carlo (MC) simulation. In general, the RTPS and MC dose calculations are within 10% of the TLD measurements for the infant and child chest scans. With respect to the dose comparisons for the head, the RTPS dose calculations are slightly higher (10%-20%) than the TLD measurements, while the MC results were within 10% of the TLD measurements. The advantage of the algebraic dose calculation engine of the RTPS is a substantially reduced computation time (minutes vs. days) relative to Monte Carlo calculations, as well as providing patient-specific dose estimation. It also provides the basis for a more elaborate reporting of dosimetric results, such as patient-specific organ dose-volumes after image segmentation.

  9. Considerations of MCNP Monte Carlo code to be used as a radiotherapy treatment planning tool.

    PubMed

    Juste, B; Miro, R; Gallardo, S; Verdu, G; Santos, A

    2005-01-01

    The present work has simulated the photon and electron transport in a Theratron 780® (MDS Nordion) 60Co radiotherapy unit, using the Monte Carlo transport code MCNP (Monte Carlo N-Particle). This project mainly explains the different methodologies used to speed up calculations in order to apply this code efficiently in radiotherapy treatment planning.

  10. Generation of a novel phase-space-based cylindrical dose kernel for IMRT optimization.

    PubMed

    Zhong, Hualiang; Chetty, Indrin J

    2012-05-01

    Improving dose calculation accuracy is crucial in intensity-modulated radiation therapy (IMRT). We have developed a method for generating a phase-space-based dose kernel for IMRT planning of lung cancer patients. Particle transport in the linear accelerator treatment head of a 21EX, 6 MV photon beam (Varian Medical Systems, Palo Alto, CA) was simulated using the EGSnrc/BEAMnrc code system. The phase space information was recorded under the secondary jaws. Each particle in the phase space file was associated with a beamlet whose index was calculated and saved in the particle's LATCH variable. The DOSXYZnrc code was modified to accumulate the energy deposited by each particle based on its beamlet index. Furthermore, the central axis of each beamlet was calculated from the orientation of all the particles in this beamlet. A cylinder was then defined around the central axis so that only the energy deposited within the cylinder was counted. A look-up table was established for each cylinder during the tallying process. The efficiency and accuracy of the cylindrical beamlet energy deposition approach was evaluated using a treatment plan developed on a simulated lung phantom. Profile and percentage depth doses computed in a water phantom for an open, square field size were within 1.5% of measurements. Dose optimized with the cylindrical dose kernel was found to be within 0.6% of that computed with the nontruncated 3D kernel. The cylindrical truncation reduced optimization time by approximately 80%. A method for generating a phase-space-based dose kernel, using a truncated cylinder for scoring dose, in beamlet-based optimization of lung treatment planning was developed and found to be in good agreement with the standard, nontruncated scoring approach. Compared to previous techniques, our method significantly reduces computational time and memory requirements, which may be useful for Monte-Carlo-based 4D IMRT or IMAT treatment planning.
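
    The cylinder-truncated scoring described above reduces, per deposition event, to a point-to-axis distance test. The Python sketch below illustrates that geometry test on arbitrary voxel data; it does not reproduce the modified DOSXYZnrc tallying code, the LATCH bookkeeping or the look-up tables:

```python
import numpy as np

def distance_to_axis(points, origin, direction):
    """Perpendicular distance of each point from a line (the beamlet central axis)."""
    d = direction / np.linalg.norm(direction)
    v = points - origin
    along = v @ d
    return np.linalg.norm(v - np.outer(along, d), axis=1)

def tally_beamlet_cylinder(voxel_centres, edep, beamlet_origin, beamlet_dir, radius):
    """Keep only the energy deposited inside a cylinder around the beamlet axis."""
    r = distance_to_axis(voxel_centres, beamlet_origin, beamlet_dir)
    return np.where(r <= radius, edep, 0.0)

# toy example: random voxel centres and energies, one beamlet directed along +z
rng = np.random.default_rng(0)
centres = rng.uniform(-5.0, 5.0, size=(1000, 3))
edep = rng.uniform(0.0, 1.0, size=1000)
kernel = tally_beamlet_cylinder(centres, edep, np.zeros(3), np.array([0.0, 0.0, 1.0]), radius=2.0)
```

    In the actual implementation the beamlet index stored in each particle's LATCH variable selects which cylinder and look-up table receive the deposited energy.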

  11. Poster — Thur Eve — 21: Off-axis dose perturbation effects in water in a 5 × 5 cm{sup 2} 18 MV photon beam for the PTW microLion and Exradin A1SL ionization chambers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Grady, K; Davis, S D; Papaconstadopoulos, P

    2014-08-15

    A PTW microLion liquid ionization chamber and an Exradin A1SL air-filled ionization chamber have been modeled using the egs-chamber user code of the EGSnrc system to determine their perturbation effects in water in a 5 × 5 cm{sup 2} 18 MV photon beam. A model of the Varian CL21EX linear accelerator was constructed using the BEAMnrc Monte Carlo code, and was validated by comparing measured PDDs and profiles from the microLion and A1SL chambers to calculated results that included chamber models. Measured PDDs for a 5 × 5 cm{sup 2} field for the microLion chamber agreed with calculations to within 1.5% beyond a depth of 0.5 cm, and the A1SL PDDs agreed within 1.0% beyond 1.0 cm. Measured and calculated profiles at 10 cm depth agreed within 1.0% for both chambers inside the field, and within 4.0% near the field edge. Local percent differences increased up to 15% at 4 cm outside the field. The ratio of dose to water in the absence of the chamber relative to dose in the chamber's active volume as a function of off-axis distance was calculated using the egs-chamber correlated sampling technique. The dose ratio was nearly constant inside the field and consistent with the stopping power ratios of water to detector material, but varied up to 3.3% near the field edge and 5.2% at 4 cm outside the field. Once these perturbation effects are fully characterized for more field sizes and detectors, they could be applied to clinical water tank measurements for improved dosimetric accuracy.

  12. Nuclide Depletion Capabilities in the Shift Monte Carlo Code

    DOE PAGES

    Davidson, Gregory G.; Pandya, Tara M.; Johnson, Seth R.; ...

    2017-12-21

    A new depletion capability has been developed in the Exnihilo radiation transport code suite. This capability enables massively parallel domain-decomposed coupling between the Shift continuous-energy Monte Carlo solver and the nuclide depletion solvers in ORIGEN to perform high-performance Monte Carlo depletion calculations. This paper describes this new depletion capability and discusses its various features, including a multi-level parallel decomposition, high-order transport-depletion coupling, and energy-integrated power renormalization. Several test problems are presented to validate the new capability against other Monte Carlo depletion codes, and the parallel performance of the new capability is analyzed.

  13. Vectorized Monte Carlo methods for reactor lattice analysis

    NASA Technical Reports Server (NTRS)

    Brown, F. B.

    1984-01-01

    Some of the new computational methods and equivalent mathematical representations of physics models used in the MCV code, a vectorized continuous-energy Monte Carlo code for use on the CYBER-205 computer, are discussed. While the principal application of MCV is the neutronics analysis of repeating reactor lattices, the new methods used in MCV should be generally useful for vectorizing Monte Carlo for other applications. For background, a brief overview of the vector processing features of the CYBER-205 is included, followed by a discussion of the fundamentals of Monte Carlo vectorization. The physics models used in the MCV vectorized Monte Carlo code are then summarized. The new methods used in scattering analysis are presented along with details of several key, highly specialized computational routines. Finally, speedups relative to CDC-7600 scalar Monte Carlo are discussed.

  14. On the effective point of measurement in megavoltage photon beams.

    PubMed

    Kawrakow, Iwan

    2006-06-01

    This paper presents a numerical investigation of the effective point of measurement of thimble ionization chambers in megavoltage photon beams using Monte Carlo simulations with the EGSNRC system. It is shown that the effective point of measurement for relative photon beam dosimetry depends on every detail of the chamber design, including the cavity length, the mass density of the wall material, and the size of the central electrode, in addition to the cavity radius. Moreover, the effective point of measurement also depends on the beam quality and the field size. The paper therefore argues that the upstream shift of 0.6 times the cavity radius, recommended in current dosimetry protocols, is inadequate for accurate relative photon beam dosimetry, particularly in the build-up region. On the other hand, once the effective point of measurement is selected appropriately, measured depth-ionization curves can be equated to measured depth-dose curves for all depths within +/- 0.5%.
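
    As a simple numerical illustration of the shift discussed above: current protocols place the effective point of measurement 0.6 times the cavity radius upstream of the chamber centre, so each depth-ionization reading is reassigned to a shallower depth before being interpreted as depth dose. The snippet below is only a sketch of that bookkeeping; the cavity radius and depths are illustrative, and 0.6 is exactly the protocol value whose adequacy the paper questions.

        def shift_depth_ionization(depths_cm, cavity_radius_cm, shift_factor=0.6):
            """Apply the conventional upstream shift of the effective point of measurement:
            a reading taken with the chamber centre at depth z is attributed to depth
            z - shift_factor * cavity_radius."""
            return [z - shift_factor * cavity_radius_cm for z in depths_cm]

        # Example: a cavity radius of 0.3 cm gives a 0.18 cm upstream shift.
        print(shift_depth_ionization([1.5, 5.0, 10.0], cavity_radius_cm=0.3))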

  15. Prompt Radiation Protection Factors

    DTIC Science & Technology

    2018-02-01

    ... radiation was performed using the three-dimensional Monte Carlo radiation transport code MCNP (Monte Carlo N-Particle) and the evaluation of the protection factors (ratio of dose in the open to ... by detonation of a nuclear device have placed renewed emphasis on evaluation of the consequences in case of such an event. The Defense Threat ...

  16. Lens of the eye dose calculation for neuro-interventional procedures and CBCT scans of the head

    NASA Astrophysics Data System (ADS)

    Xiong, Zhenyu; Vijayan, Sarath; Rana, Vijay; Jain, Amit; Rudin, Stephen; Bednarek, Daniel R.

    2016-03-01

    The aim of this work is to develop a method to calculate lens dose for fluoroscopically-guided neuro-interventional procedures and for CBCT scans of the head. EGSnrc Monte Carlo software is used to determine the dose to the lens of the eye for the projection geometry and exposure parameters used in these procedures. This information is provided by a digital CAN bus on the Toshiba Infinix C-Arm system and is saved in a log file by the real-time skin-dose tracking system (DTS) we previously developed. The x-ray beam spectra on this machine were simulated using BEAMnrc. These spectra were compared to those determined by SpekCalc and validated through measured percent-depth-dose (PDD) curves and half-value-layer (HVL) measurements. We simulated CBCT procedures in DOSXYZnrc for a CTDI head phantom and compared the surface dose distribution with that measured with Gafchromic film, and also for an SK150 head phantom and compared the lens dose with that measured with an ionization chamber. Both methods demonstrated good agreement. Organ dose calculated for a simulated neuro-interventional procedure using DOSXYZnrc with the Zubal CT voxel phantom agreed within 10% with that calculated by the PCXMC code for most organs. To calculate the lens dose in a neuro-interventional procedure, we developed a library of normalized lens dose values for different projection angles and kVp values. The total lens dose is then calculated by summing the values over all beam projections and can be included on the DTS report at the end of the procedure.
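
    The final accumulation step described above (summing pre-computed, normalized lens dose values over all beam projections recorded in the procedure log) can be sketched as follows. This is a hypothetical illustration: the log fields, the normalization per unit mAs, and the exact-match library lookup are assumptions for demonstration, not the authors' DTS implementation.

        def total_lens_dose(log_entries, dose_library):
            """Sum lens dose over all recorded projections.

            log_entries: list of dicts with 'angle_deg', 'kvp' and 'mas' per exposure.
            dose_library: dict keyed by (angle_deg, kvp) -> lens dose per mAs (mGy/mAs)."""
            total = 0.0
            for entry in log_entries:
                key = (entry['angle_deg'], entry['kvp'])
                total += dose_library[key] * entry['mas']   # normalized value times exposure
            return total

        # Example with made-up library values and two exposures:
        library = {(0, 80): 2.0e-3, (90, 80): 3.5e-3}        # mGy per mAs (illustrative)
        log = [{'angle_deg': 0, 'kvp': 80, 'mas': 120},
               {'angle_deg': 90, 'kvp': 80, 'mas': 80}]
        print(total_lens_dose(log, library))                 # 0.52 mGy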

  17. Characterization of a fiber-coupled Al2O3:C luminescence dosimetry system for online in vivo dose verification during 192Ir brachytherapy.

    PubMed

    Andersen, Claus E; Nielsen, Søren Kynde; Greilich, Steffen; Helt-Hansen, Jakob; Lindegaard, Jacob Christian; Tanderup, Kari

    2009-03-01

    A prototype of a new dose-verification system has been developed to facilitate prevention and identification of dose delivery errors in remotely afterloaded brachytherapy. The system allows for automatic online in vivo dosimetry directly in the tumor region using small passive detector probes that fit into applicators such as standard needles or catheters. The system measures the absorbed dose rate (0.1 s time resolution) and total absorbed dose on the basis of radioluminescence (RL) and optically stimulated luminescence (OSL) from aluminum oxide crystals attached to optical fiber cables (1 mm outer diameter). The system was tested in the range from 0 to 4 Gy using a solid-water phantom, a Varian GammaMed Plus 192Ir PDR afterloader, and dosimetry probes inserted into stainless-steel brachytherapy needles. The calibrated system was found to be linear in the tested dose range. The reproducibility (one standard deviation) for RL and OSL measurements was 1.3%. The measured depth-dose profiles agreed well with the theoretical expectations computed with the EGSNRC Monte Carlo code, suggesting that the energy dependence for the dosimeter probes (relative to water) is less than 6% for source-to-probe distances in the range of 2-50 mm. Under certain conditions, the RL signal could be greatly disturbed by the so-called stem signal (i.e., unwanted light generated in the fiber cable upon irradiation). The OSL signal is not subject to this source of error. The tested system appears to be adequate for in vivo brachytherapy dosimetry.

  18. Monte Carlo calculation of dose rate conversion factors for external exposure to photon emitters in soil

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clovas, A.; Zanthos, S.; Antonopoulos-Domis, M.

    2000-03-01

    The dose rate conversion factors {dot D}{sub CF} (absorbed dose rate in air per unit activity per unit of soil mass, nGy h{sup {minus}1} per Bq kg{sup {minus}1}) are calculated 1 m above ground for photon emitters of natural radionuclides uniformly distributed in the soil. Three Monte Carlo codes are used: (1) the MCNP code of Los Alamos; (2) the GEANT code of CERN; and (3) a Monte Carlo code developed in the Nuclear Technology Laboratory of the Aristotle University of Thessaloniki. The accuracy of the Monte Carlo results is tested by the comparison of the unscattered flux obtained by the three Monte Carlo codes with an independent straightforward calculation. All codes, and particularly MCNP, accurately calculate the absorbed dose rate in air due to the unscattered radiation. For the total radiation (unscattered plus scattered) the {dot D}{sub CF} values calculated from the three codes are in very good agreement with one another. The comparison between these results and the results deduced previously by other authors indicates good agreement (less than 15% difference) for photon energies above 1,500 keV. In contrast, the agreement is not as good (difference of 20-30%) for the low energy photons.

  19. Sci-Sat AM: Radiation Dosimetry and Practical Therapy Solutions - 11: Commissioning of a system for the measurement of electron stopping powers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McEwen, Malcolm; Roy, Timothy; Tessier, Frederic

    Purpose: To develop the techniques required to experimentally determine electron stopping powers for application in primary standards and dosimetry protocols. Method and Materials: A large-volume HPGe detector system (>80% efficiency) was commissioned for the measurement of high energy (5–35 MeV) electron beams. As a proof of principle the system was used with a Y-90/Sr-90 radioactive source. Thin plates of absorbing material (<0.1 g cm-2) were then placed between the source and detector and the emerging electron spectrum was acquired. The full experimental geometry was modelled using the EGSnrc package to validate the detector design, optimize the experimental setup and compare measured and calculated spectra. Results: The biggest challenge using a beta source was to identify a robust spectral parameter to determine for each measurement. An end-point-fitting routine was used to determine the maximum energy, Emax, of the beta spectrum for each absorber thickness t. The parameter dEmax/dt is related to the electron stopping power and the same routine was applied to both measured and simulated spectra. Although the standard uncertainty in dEmax/dt was of the order of 5%, by taking the ratio of measured and Monte Carlo values for dEmax/dt the uncertainty of the fitting routine was eliminated and the uncertainty was reduced to less than 2%. The agreement between measurement and simulation was within this uncertainty estimate. Conclusion: The investigation confirmed the experimental approach and demonstrated that EGSnrc could accurately determine correction factors that will be required for the final measurement setup in a linac beam.
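
    The end-point analysis described above reduces every measured or simulated spectrum to a single maximum energy Emax and then differentiates with respect to absorber thickness. A hedged sketch of that step is given below; the straight-line fit over thickness and the measured-to-Monte-Carlo ratio are assumptions about the form of the analysis, not the laboratory's actual fitting routine, and all numbers are placeholders.

        import numpy as np

        def d_emax_dt(thicknesses_gcm2, emax_mev):
            """Slope of end-point energy versus absorber areal thickness (MeV per g/cm2),
            from a straight-line least-squares fit."""
            slope, _intercept = np.polyfit(thicknesses_gcm2, emax_mev, 1)
            return slope

        # Ratio of measured to simulated slopes; the common fitting bias largely cancels.
        t = [0.00, 0.02, 0.04, 0.06, 0.08]               # g/cm2 (illustrative)
        emax_meas = [2.27, 2.23, 2.19, 2.16, 2.12]       # MeV (illustrative)
        emax_mc = [2.28, 2.24, 2.20, 2.16, 2.13]         # MeV (illustrative)
        print(d_emax_dt(t, emax_meas) / d_emax_dt(t, emax_mc))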

  20. WE-AB-204-11: Development of a Nuclear Medicine Dosimetry Module for the GPU-Based Monte Carlo Code ARCHER

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, T; Lin, H; Xu, X

    Purpose: To develop a nuclear medicine dosimetry module for the GPU-based Monte Carlo code ARCHER. Methods: We have developed a nuclear medicine dosimetry module for the fast Monte Carlo code ARCHER. The coupled electron-photon Monte Carlo transport kernel included in ARCHER is built upon the Dose Planning Method code (DPM). The developed module manages the radioactive decay simulation by consecutively tracking several types of radiation on a per-disintegration basis using the statistical sampling method. Optimization techniques such as persistent threads and prefetching are studied and implemented. The developed module is verified against the VIDA code, which is based on the Geant4 toolkit and has previously been verified against OLINDA/EXM. A voxelized geometry is used in the preliminary test: a sphere made of ICRP soft tissue is surrounded by a box filled with water. Uniform activity distribution of I-131 is assumed in the sphere. Results: The self-absorption dose factors (mGy/MBqs) of the sphere with varying diameters are calculated by ARCHER and VIDA respectively. ARCHER’s results are in agreement with VIDA’s, which were obtained from a previous publication. VIDA takes hours of CPU time to finish the computation, while it takes ARCHER 4.31 seconds for the 12.4-cm uniform activity sphere case. For a fairer CPU-GPU comparison, more effort will be made to eliminate the algorithmic differences. Conclusion: The coupled electron-photon Monte Carlo code ARCHER has been extended to radioactive decay simulation for nuclear medicine dosimetry. The developed code exhibits good performance in our preliminary test. The GPU-based Monte Carlo code is developed with grant support from the National Institute of Biomedical Imaging and Bioengineering through an R01 grant (R01EB015478).

  1. Heterogeneous multiscale Monte Carlo simulations for gold nanoparticle radiosensitization.

    PubMed

    Martinov, Martin P; Thomson, Rowan M

    2017-02-01

    To introduce the heterogeneous multiscale (HetMS) model for Monte Carlo simulations of gold nanoparticle dose-enhanced radiation therapy (GNPT), a model characterized by its varying levels of detail on different length scales within a single phantom; to apply the HetMS model in two different scenarios relevant for GNPT and to compare computed results with others published. The HetMS model is implemented using an extended version of the EGSnrc user-code egs_chamber; the extended code is tested and verified via comparisons with recently published data from independent GNP simulations. Two distinct scenarios for the HetMS model are then considered: (a) monoenergetic photon beams (20 keV to 1 MeV) incident on a cylinder (1 cm radius, 3 cm length); (b) isotropic point source (brachytherapy source spectra) at the center of a 2.5 cm radius sphere with gold nanoparticles (GNPs) diffusing outwards from the center. Dose enhancement factors (DEFs) are compared for different source energies, depths in phantom, gold concentrations, GNP sizes, and modeling assumptions, as well as with independently published values. Simulation efficiencies are investigated. The HetMS MC simulations account for the competing effects of photon fluence perturbation (due to gold in the scatter media) coupled with enhanced local energy deposition (due to modeling discrete GNPs within subvolumes). DEFs are most sensitive to these effects for the lower source energies, varying with distance from the source; DEFs below unity (i.e., dose decreases, not enhancements) can occur at energies relevant for brachytherapy. For example, in the cylinder scenario, the 20 keV photon source has a DEF of 3.1 near the phantom's surface, decreasing to less than unity by 0.7 cm depth (for 20 mg/g). Compared to discrete modeling of GNPs throughout the gold-containing (treatment) volume, efficiencies are enhanced by up to a factor of 122 with the HetMS approach. For the spherical phantom, DEFs vary with time for diffusion, radionuclide, and radius; DEFs differ considerably from those computed using a widely applied analytic approach. By combining geometric models of varying complexity on different length scales within a single simulation, the HetMS model can effectively account for both macroscopic and microscopic effects which must both be considered for accurate computation of energy deposition and DEFs for GNPT. Efficiency gains with the HetMS approach enable diverse calculations which would otherwise be prohibitively long. The HetMS model may be extended to diverse scenarios relevant for GNPT, providing further avenues for research and development. © 2016 American Association of Physicists in Medicine.

  2. SU-E-I-28: Evaluating the Organ Dose From Computed Tomography Using Monte Carlo Calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ono, T; Araki, F

    Purpose: To evaluate organ doses from computed tomography (CT) using Monte Carlo (MC) calculations. Methods: A Philips Brilliance CT scanner (64 slice) was simulated using the GMctdospp software (IMPS, Germany), which is based on the EGSnrc user code. The X-ray spectra and a bowtie filter for MC simulations were determined to coincide with measurements of half-value layer (HVL) and off-center ratio (OCR) profile in air. The MC dose was calibrated from absorbed dose measurements using a Farmer chamber and a cylindrical water phantom. The dose distribution from CT was calculated using patient CT images, and organ doses were evaluated from dose-volume histograms. Results: The HVLs of Al at 80, 100, and 120 kV were 6.3, 7.7, and 8.7 mm, respectively. The calculated HVLs agreed with measurements within 0.3%. The calculated and measured OCR profiles agreed within 3%. For adult head scans (CTDIvol = 51.4 mGy), mean doses for brain stem, eye, and eye lens were 23.2, 34.2, and 37.6 mGy, respectively. For pediatric head scans (CTDIvol = 35.6 mGy), mean doses for brain stem, eye, and eye lens were 19.3, 24.5, and 26.8 mGy, respectively. For adult chest scans (CTDIvol = 19.0 mGy), mean doses for lung, heart, and spinal cord were 21.1, 22.0, and 15.5 mGy, respectively. For adult abdominal scans (CTDIvol = 14.4 mGy), the mean doses for kidney, liver, pancreas, spleen, and spinal cord were 17.4, 16.5, 16.8, 16.8, and 13.1 mGy, respectively. For pediatric abdominal scans (CTDIvol = 6.76 mGy), mean doses for kidney, liver, pancreas, spleen, and spinal cord were 8.24, 8.90, 8.17, 8.31, and 6.73 mGy, respectively. In head scans, organ doses were considerably different from CTDIvol values. Conclusion: MC dose distributions calculated by using patient CT images are useful to evaluate organ doses absorbed by individual patients.
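
    The organ-dose step described above (evaluating mean organ doses from the Monte Carlo dose grid after segmentation) amounts to averaging the dose over the voxels belonging to each organ. The sketch below assumes a 3D dose array and boolean organ masks; it illustrates the bookkeeping only and is not part of the GMctdospp code.

        import numpy as np

        def mean_organ_doses(dose_grid, organ_masks):
            """dose_grid: 3D array of absorbed dose (mGy); organ_masks: dict of boolean
            arrays with the same shape, one per segmented organ."""
            return {name: float(dose_grid[mask].mean()) for name, mask in organ_masks.items()}

        # Illustrative two-voxel 'organs' on a toy grid:
        dose = np.array([[[20.0, 24.0], [18.0, 30.0]]])
        masks = {'eye lens': np.array([[[True, True], [False, False]]]),
                 'brain stem': np.array([[[False, False], [True, True]]])}
        print(mean_organ_doses(dose, masks))    # {'eye lens': 22.0, 'brain stem': 24.0}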

  3. Variations in energy spectra and water-to-material stopping-power ratios in three-dimensional conformal and intensity-modulated photon fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jang, Si Young; Liu, H. Helen; Mohan, Radhe

    Because of complex dose distributions and dose gradients that are created in three-dimensional conformal radiotherapy (3D-CRT) and intensity-modulated radiation therapy (IMRT), photon- and electron-energy spectra might change significantly with spatial locations and doses. This study examined variations in photon- and electron-energy spectra in 3D-CRT and IMRT photon fields. The effects of spectral variations on water-to-material stopping-power ratios used in Monte Carlo treatment planning systems and the responses of energy-dependent dosimeters, such as thermoluminescent dosimeters (TLDs) and radiographic films were further studied. The EGSnrc Monte Carlo code was used to simulate megavoltage 3D-CRT and IMRT photon fields. The photon- and electron-energy spectra were calculated in 3D water phantoms and anthropomorphic phantoms based on the fluence scored in voxel grids. We then obtained the water-to-material stopping-power ratios in the local voxels using the Spencer-Attix cavity theory. Changes in the responses of films and TLDs were estimated based on the calculated local energy spectra and published data on the dosimeter energy dependency. Results showed that the photon-energy spectra strongly depended on spatial positions and doses in both the 3D-CRT and IMRT fields. The relative fraction of low-energy photons (<100 keV) increased inversely with the photon dose in low-dose regions of the fields. A similar but smaller effect was observed for electrons in the phantoms. The maximum variation of the water-to-material stopping-power ratio over the range of calculated dose for both 3D-CRT and IMRT was negligible (<1.0%) for ICRU tissue, cortical bone, and soft bone and less than 3.6% for dry air and lung. Because of spectral softening at low doses, radiographic films in the phantoms could over-respond to dose by more than 30%, whereas the over-response of TLDs was less than 10%. Thus, spatial variations of the photon- and electron-energy spectra should be considered as important factors in 3D-CRT and IMRT dosimetry.
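
    The stopping-power-ratio step described above weights mass collision stopping powers by the local electron fluence spectrum. The fragment below shows only the basic fluence-weighted ratio (a Bragg-Gray-style average); it deliberately omits the low-energy cutoff and track-end term of the full Spencer-Attix formulation used in the study, and the spectra and stopping-power tables are placeholders to be supplied by the user.

        import numpy as np

        def fluence_weighted_spr(energies, fluence, s_water, s_medium):
            """Water-to-medium stopping-power ratio from an electron fluence spectrum.
            energies, fluence: the electron spectrum; s_water, s_medium: mass collision
            stopping powers tabulated at the same energies."""
            numerator = np.trapz(fluence * s_water, energies)
            denominator = np.trapz(fluence * s_medium, energies)
            return numerator / denominator

    A softening of the spectrum shifts this ratio only slightly for tissue-like media, consistent with the sub-1% variation reported above, while low-density materials such as dry air and lung show the larger changes.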

  4. TU-H-CAMPUS-IeP1-04: Combined Organ Dose for Digital Subtraction Angiography and Computed Tomography Using Monte Carlo Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sakabe, D; Ohno, T; Araki, F

    Purpose: The purpose of this study was to evaluate the combined organ dose from digital subtraction angiography (DSA) and computed tomography (CT) using Monte Carlo (MC) simulation in abdominal interventional procedures. Methods: The organ doses for DSA and CT were obtained with MC simulation and actual measurements using fluorescent-glass dosimeters at 7 abdominal locations in an Alderson-Rando phantom. DSA was performed from three directions: posterior-anterior (PA), right anterior oblique (RAO), and left anterior oblique (LAO). The organ dose from MC simulation was compared with actual radiation dose measurements. Calculations for the MC simulation were carried out with the GMctdospp (IMPS, Germany) software based on the EGSnrc MC code. Finally, the combined organ dose for DSA and CT was calculated from the MC simulation using the X-ray conditions of a patient with a diagnosis of hepatocellular carcinoma. Results: For DSA from the PA direction, the organ doses for the actual measurements and MC simulation were 2.2 and 2.4 mGy/100 mAs at the liver, respectively, and 3.0 and 3.1 mGy/100 mAs at the spinal cord, while for CT, the organ doses were 15.2 and 15.1 mGy/100 mAs at the liver, and 14.6 and 13.5 mGy/100 mAs at the spinal cord. The maximum difference in organ dose between the actual measurements and the MC simulation was 11.0% for the spleen at PA, 8.2% for the spinal cord at RAO, and 6.1% for the left kidney at LAO with DSA, and 9.3% for the stomach with CT. The combined organ dose (4 DSAs and 6 CT scans) under actual patient conditions was found to be 197.4 mGy for the liver and 205.1 mGy for the spinal cord. Conclusion: Our method makes it possible to accurately assess the organ dose to patients in abdominal interventional procedures combining DSA and CT.

  5. SU-E-T-608: Perturbation Corrections for Alanine Dosimeters in Different Phantom Materials in High-Energy Photon Beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Voigts-Rhetz, P von; Czarnecki, D; Anton, M

    Purpose: Alanine dosimeters are often used for in vivo dosimetry in radiation therapy. In a Monte Carlo study the influence of 20 different surrounding/phantom materials on alanine dosimeters was investigated. The investigations were performed in high-energy photon beams, covering the whole range from {sup 60}Co up to 25 MV-X. The aim of the study is the introduction of a perturbation correction k{sub env} for alanine dosimeters accounting for the environmental material. Methods: The influence of different surrounding materials on the response of alanine dosimeters was investigated with Monte Carlo simulations using the EGSnrc code. The photon source was adapted with BEAMnrc to a {sup 60}Co unit and an Elekta (E{sub nom}=6, 10, 25 MV-X) linear accelerator. Different tissue-equivalent materials ranging from cortical bone to lung were investigated. In addition to available phantom materials, some material compositions were taken and scaled to different electron densities. The depth of the alanine detectors within the different phantom materials corresponds to 5 cm depth in water, i.e. the depth is scaled according to the electron density (n{sub e}/n{sub e,w}) of the corresponding phantom material. The dose was scored within the detector volume once for an alanine/paraffin mixture and once for a liquid water voxel. The relative response, the ratio of the absorbed dose to alanine to the absorbed dose to water, was calculated and compared to the corresponding ratio under reference conditions. Results: For each beam quality the relative response r and the environment correction factor k{sub env} were calculated: k{sub env} = 0.9991 + 0.0049 × ((n{sub e}/n{sub e,w}) − 0.7659){sup 3}. Conclusion: A perturbation correction factor k{sub env} accounting for the phantom environment has been introduced. The response of the alanine dosimeter can be considered independent of the surrounding material for relative electron densities (n{sub e}/n{sub e,w}) between 1 and 1.4. For denser materials such as bone or much less dense surroundings such as lung, a small correction would be appropriate.
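
    The fitted correction quoted in the results above can be evaluated directly. The snippet below simply implements that published fit as a function of the relative electron density; the example densities are illustrative, and the fit should only be trusted over the range of materials studied by the authors.

        def k_env(rel_electron_density):
            """Environment correction for alanine dosimeters, using the fit quoted above:
            k_env = 0.9991 + 0.0049 * ((n_e/n_e,w) - 0.7659)**3."""
            return 0.9991 + 0.0049 * (rel_electron_density - 0.7659) ** 3

        # Example values: water-like (1.0) and a denser, bone-like environment (1.7).
        print(k_env(1.0), k_env(1.7))   # both within a few tenths of a percent of unity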

  6. MO-F-CAMPUS-I-02: Accuracy in Converting the Average Breast Dose Into the Mean Glandular Dose (MGD) Using the F-Factor in Cone Beam Breast CT- a Monte Carlo Study Using Homogeneous and Quasi-Homogeneous Phantoms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lai, C; Zhong, Y; Wang, T

    2015-06-15

    Purpose: To investigate the accuracy in estimating the mean glandular dose (MGD) for homogeneous breast phantoms by converting from the average breast dose using the F-factor in cone beam breast CT. Methods: EGSnrc-based Monte Carlo codes were used to estimate the MGDs. Hemi-ellipsoids 13 cm in diameter and 10 cm high were used to simulate pendant-geometry breasts. Two different types of hemi-ellipsoidal models were employed: voxels in quasi-homogeneous phantoms were designed as either adipose or glandular tissue, while voxels in homogeneous phantoms were designed as a mixture of adipose and glandular tissue. Breast compositions of 25% and 50% volume glandular fractions (VGFs), defined as the ratio of glandular tissue voxels to entire breast voxels in the quasi-homogeneous phantoms, were studied. These VGFs were converted into glandular fractions by weight and used to construct the corresponding homogeneous phantoms. 80 kVp x-rays with a mean energy of 47 keV were used in the simulation. A total of 10{sup 9} photons were used to image the phantoms and the energies deposited in the phantom voxels were tallied. Breast doses in homogeneous phantoms were averaged over all voxels and then used to calculate the MGDs using the F-factors evaluated at the mean energy of the x-rays. The MGDs for quasi-homogeneous phantoms were computed directly by averaging the doses over all glandular tissue voxels. The MGDs estimated for the two types of phantoms were normalized to the free-in-air dose at the iso-center and compared. Results: The normalized MGDs were 0.756 and 0.732 mGy/mGy for the 25% and 50% VGF homogeneous breasts and 0.761 and 0.733 mGy/mGy for the corresponding quasi-homogeneous breasts, respectively. The MGDs estimated for the two types of phantoms agreed within 1% in this study. Conclusion: MGDs for homogeneous breast models may be adequately estimated by converting from the average breast dose using the F-factor.

  7. SU-F-T-361: Dose Enhancement Due to Nanoparticle Addition in Skin Radiotherapy: A Monte Carlo Study Using Kilovoltage Photon Beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zheng, X; Chow, J

    Purpose: This study investigated the dose enhancement due to the addition of nanoparticles of different types and concentrations in skin radiotherapy using kilovoltage photon beams. Methods: An inhomogeneous water phantom (15×15×10 cm{sup 3}) containing a skin target layer (0.5–5 mm) loaded with different concentrations (3–40 mg/ml) of nanoparticles (Au, Pt, I, Ag and Fe{sub 2}O{sub 3}) was irradiated by 105 and 220 kVp photon beams produced by a Gulmay D3225 Orthovoltage unit. A circular cone of 5-cm diameter and a source-to-surface distance of 20 cm were used. Doses in the skin target layer with and without the nanoparticles were calculated using Monte Carlo simulation (the EGSnrc code) through the macroscopic approach. The dose enhancement ratio (DER), defined as the ratio of the dose at the target with nanoparticle addition to the dose without addition, was calculated for each nanoparticle type and concentration and for different target thicknesses. Results: For Au nanoparticles, the dependence of the DER on target thickness for the 220 kVp photon beams was not significant. However, the DER for Au nanoparticles was found to decrease with increasing target thickness when the nanoparticle concentration was increased from 18 to 40 mg/ml using the 105 kVp photon beams. For a nanoparticle concentration of 40 mg/ml, the DER variation with target thickness was not significant for the 220 kVp photon beams, but the DER was found to decrease with target thickness when the lower-energy (105 kVp) photon beam was used. The DER was found to increase with increasing nanoparticle concentration, and for the same target thickness the rate of increase was higher for nanoparticles of higher atomic number, with the exception of I and Ag. Conclusion: It is concluded that nanoparticle addition can result in dose enhancement in kilovoltage skin radiotherapy. Moreover, the DER is related to the photon beam energy, target thickness, and the atomic number and concentration of the nanoparticles.
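
    The dose enhancement ratio defined above is a straightforward ratio of target doses with and without nanoparticle loading, evaluated per nanoparticle type, concentration, and target thickness. A minimal sketch with made-up dose values:

        def dose_enhancement_ratio(dose_with_np, dose_without_np):
            """DER = target dose with nanoparticle addition / dose without addition."""
            return dose_with_np / dose_without_np

        # Illustrative target doses (arbitrary units) for one nanoparticle type:
        dose_plain = 1.00
        doses_with = {3: 1.05, 40: 1.62}    # keyed by concentration in mg/ml
        for conc, d in doses_with.items():
            print(conc, 'mg/ml ->', round(dose_enhancement_ratio(d, dose_plain), 2))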

  8. TH-A-19A-04: Latent Uncertainties and Performance of a GPU-Implemented Pre-Calculated Track Monte Carlo Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Renaud, M; Seuntjens, J; Roberge, D

    Purpose: To assess the performance and uncertainty of a pre-calculated Monte Carlo (PMC) algorithm for proton and electron transport running on graphics processing units (GPU). While PMC methods have been described in the past, an explicit quantification of the latent uncertainty arising from recycling a limited number of tracks in the pre-generated track bank is missing from the literature. With a proper uncertainty analysis, an optimal pre-generated track bank size can be selected for a desired dose calculation uncertainty. Methods: Particle tracks were pre-generated for electrons and protons using EGSnrc and GEANT4, respectively. The PMC algorithm for track transport was implemented on the CUDA programming framework. GPU-PMC dose distributions were compared to benchmark dose distributions simulated using general-purpose MC codes in the same conditions. A latent uncertainty analysis was performed by comparing GPU-PMC dose values to a “ground truth” benchmark while varying the track bank size and primary particle histories. Results: GPU-PMC dose distributions and benchmark doses were within 1% of each other in voxels with dose greater than 50% of Dmax. In proton calculations, a submillimeter distance-to-agreement error was observed at the Bragg peak. Latent uncertainty followed a Poisson distribution with the number of tracks per energy (TPE), and a track bank of 20,000 TPE produced a latent uncertainty of approximately 1%. Efficiency analysis showed a 937× and 508× gain over a single processor core running DOSXYZnrc for 16 MeV electrons in water and bone, respectively. Conclusion: The GPU-PMC method can calculate dose distributions for electrons and protons to a statistical uncertainty below 1%. The track bank size necessary to achieve an optimal efficiency can be tuned based on the desired uncertainty. Coupled with a model to calculate dose contributions from uncharged particles, GPU-PMC is a candidate for inverse planning of modulated electron radiotherapy and scanned proton beams. This work was supported in part by FRSQ-MSSS (Grant No. 22090), NSERC RG (Grant No. 432290) and CIHR MOP (Grant No. MOP-211360).
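
    The latent-uncertainty behaviour reported above (a Poisson-like dependence on the number of tracks per energy, with roughly 1% at 20,000 TPE) suggests a simple 1/sqrt(N) rule of thumb for sizing the track bank. The helper below is a back-of-the-envelope sketch anchored to the figure quoted in the abstract, not the authors' full analysis.

        import math

        def track_bank_size_for(target_uncertainty, ref_tpe=20000, ref_uncertainty=0.01):
            """Estimate tracks-per-energy needed for a target latent uncertainty, assuming
            the uncertainty scales as 1/sqrt(TPE) and is about 1% at 20,000 TPE."""
            return math.ceil(ref_tpe * (ref_uncertainty / target_uncertainty) ** 2)

        print(track_bank_size_for(0.005))   # roughly 80,000 TPE for ~0.5% latent uncertainty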

  9. SU-E-J-08: Comparison of Unintended Radiation Doses to Organs at Risk Resulting From the Out-Of-Field Therapeutic Beams and From Image-Guidance X-Ray Procedures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ding, G; Wang, L

    Purpose: The unintended radiation dose to organs at risk (OAR) can come from imaging guidance procedures as well as from leakage and scatter of therapeutic beams. This study compares the imaging dose with the unintended out-of-field therapeutic dose to patient sensitive organs. Methods: The Monte Carlo EGSnrc user codes, BEAMnrc and DOSXYZnrc, were used to simulate kV X-ray sources from imaging devices as well as the therapeutic IMRT/VMAT beams and to calculate doses to the target and OARs on patient treatment planning CT images. The accuracy of the Monte Carlo simulations was benchmarked against measurements in phantoms. The dose-volume histogram was utilized in analyzing the patient organ doses. Results: The dose resulting from Standard Head kV-CBCT scans to bone and soft tissues ranges from 0.7 to 1.1 cGy and from 0.03 to 0.3 cGy, respectively. The dose resulting from Thorax scans on the chest to bone and soft tissues ranges from 1.1 to 1.8 cGy and from 0.3 to 0.6 cGy, respectively. The dose resulting from Pelvis scans on the abdomen to bone and soft tissues ranges from 3.2 to 4.2 cGy and from 1.2 to 2.2 cGy, respectively. The out-of-field doses to OAR are sensitive to the distance between the treated target and the OAR. For a typical Head-and-Neck IMRT/VMAT treatment the out-of-field doses to the eyes are 1–3% of the target dose, or 2–6 cGy per fraction. Conclusion: The imaging doses to OAR are predictable based on the imaging protocols used when OARs are within the imaged volume and can be estimated and accounted for by using tabulated values. The unintended out-of-field doses are proportional to the target dose, strongly depend on the distance between the treated target and the OAR, and are generally higher compared to the imaging dose. This work was partially supported by Varian research grant VUMC40590.

  10. Criticality Calculations with MCNP6 - Practical Lectures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B.; Rising, Michael Evan; Alwin, Jennifer Louise

    2016-11-29

    These slides are used to teach MCNP (Monte Carlo N-Particle) usage to nuclear criticality safety analysts. The following are the lecture topics: course information, introduction, MCNP basics, criticality calculations, advanced geometry, tallies, adjoint-weighted tallies and sensitivities, physics and nuclear data, parameter studies, NCS validation I, NCS validation II, NCS validation III, case study 1 - solution tanks, case study 2 - fuel vault, case study 3 - B&W core, case study 4 - simple TRIGA, case study 5 - fissile mat. vault, criticality accident alarm systems. After completion of this course, you should be able to: Develop an input model for MCNP; Describe how cross section data impact Monte Carlo and deterministic codes; Describe the importance of validation of computer codes and how it is accomplished; Describe the methodology supporting Monte Carlo codes and deterministic codes; Describe pitfalls of Monte Carlo calculations; Discuss the strengths and weaknesses of Monte Carlo and discrete ordinates codes. The diffusion theory model is not strictly valid for treating fissile systems in which neutron absorption, voids, and/or material boundaries are present; in the context of these limitations, identify a fissile system for which a diffusion theory solution would be adequate.

  11. Simulation of Nuclear Reactor Kinetics by the Monte Carlo Method

    NASA Astrophysics Data System (ADS)

    Gomin, E. A.; Davidenko, V. D.; Zinchenko, A. S.; Kharchenko, I. K.

    2017-12-01

    The KIR computer code intended for calculations of nuclear reactor kinetics using the Monte Carlo method is described. The algorithm implemented in the code is described in detail. Some results of test calculations are given.

  12. Monte Carlo tests of the ELIPGRID-PC algorithm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davidson, J.R.

    1995-04-01

    The standard tool for calculating the probability of detecting pockets of contamination, called hot spots, has been the ELIPGRID computer code of Singer and Wickman. The ELIPGRID-PC program has recently made this algorithm available for an IBM{reg_sign} PC. However, no known independent validation of the ELIPGRID algorithm exists. This document describes a Monte Carlo simulation-based validation of a modified version of the ELIPGRID-PC code. The modified ELIPGRID-PC code is shown to match Monte Carlo-calculated hot-spot detection probabilities to within {plus_minus}0.5% for 319 out of 320 test cases. The one exception, a very thin elliptical hot spot located within a rectangular sampling grid, differed from the Monte Carlo-calculated probability by about 1%. These results provide confidence in the ability of the modified ELIPGRID-PC code to accurately predict hot-spot detection probabilities within an acceptable range of error.
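
    The validation described above amounts to estimating, by repeated random placement, the probability that a buried elliptical hot spot is intersected by at least one node of a regular sampling grid, and comparing that estimate with the ELIPGRID-PC value. The sketch below is a simplified Monte Carlo of that geometric probability for a circular hot spot on a square grid; a full reproduction would add the ellipse shape, orientation, and rectangular grids, so this is illustrative only, not the validation code used in the report.

        import random

        def detection_probability(radius, grid_spacing, n_trials=100000, seed=1):
            """Monte Carlo estimate of the chance that a circular hot spot of the given
            radius, dropped uniformly at random, covers at least one node of a square
            sampling grid with the given spacing."""
            rng = random.Random(seed)
            hits = 0
            for _ in range(n_trials):
                # By symmetry, only the offset of the centre within one grid cell matters.
                cx = rng.uniform(0.0, grid_spacing)
                cy = rng.uniform(0.0, grid_spacing)
                # The hot spot is detected iff the nearest grid node lies inside it.
                nx = round(cx / grid_spacing) * grid_spacing
                ny = round(cy / grid_spacing) * grid_spacing
                if (cx - nx) ** 2 + (cy - ny) ** 2 <= radius ** 2:
                    hits += 1
            return hits / n_trials

        # A hot spot with radius equal to half the grid spacing is hit ~78.5% of the time.
        print(detection_probability(radius=0.5, grid_spacing=1.0))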

  13. Use of Fluka to Create Dose Calculations

    NASA Technical Reports Server (NTRS)

    Lee, Kerry T.; Barzilla, Janet; Townsend, Lawrence; Brittingham, John

    2012-01-01

    Monte Carlo codes provide an effective means of modeling three-dimensional radiation transport; however, their use is both time- and resource-intensive. The creation of a lookup table or parameterization from Monte Carlo simulation allows users to perform calculations with Monte Carlo results without replicating lengthy calculations. The FLUKA Monte Carlo transport code was used to develop lookup tables and parameterizations for data resulting from the penetration of layers of aluminum, polyethylene, and water with areal densities ranging from 0 to 100 g/cm^2. Heavy charged ions from Z=1 to Z=26 with energies from 0.1 to 10 GeV/nucleon were simulated. Dose, dose equivalent, and fluence as a function of particle identity, energy, and scattering angle were examined at various depths. Calculations were compared against well-known results and against the results of other deterministic and Monte Carlo codes. Results will be presented.

  14. Parallel CARLOS-3D code development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Putnam, J.M.; Kotulski, J.D.

    1996-02-01

    CARLOS-3D is a three-dimensional scattering code which was developed under the sponsorship of the Electromagnetic Code Consortium, and is currently used by over 80 aerospace companies and government agencies. The code has been extensively validated and runs on both serial workstations and parallel supercomputers such as the Intel Paragon. CARLOS-3D is a three-dimensional surface integral equation scattering code based on a Galerkin method of moments formulation employing Rao-Wilton-Glisson roof-top basis functions for triangular faceted surfaces. Fully arbitrary 3D geometries composed of multiple conducting and homogeneous bulk dielectric materials can be modeled. This presentation describes some of the extensions to the CARLOS-3D code, and how the operator structure of the code facilitated these improvements. Body of revolution (BOR) and two-dimensional geometries were incorporated by simply including new input routines, and the appropriate Galerkin matrix operator routines. Some additional modifications were required in the combined field integral equation matrix generation routine due to the symmetric nature of the BOR and 2D operators. Quadrilateral patched surfaces with linear roof-top basis functions were also implemented in the same manner. Quadrilateral facets and triangular facets can be used in combination to more efficiently model geometries with both large smooth surfaces and surfaces with fine detail such as gaps and cracks. Since the parallel implementation in CARLOS-3D is at a high level, these changes were independent of the computer platform being used. This approach minimizes code maintenance, while providing capabilities with little additional effort. Results are presented showing the performance and accuracy of the code for some large scattering problems. Comparisons between triangular faceted and quadrilateral faceted geometry representations will be shown for some complex scatterers.

  15. Patient‐specific CT dosimetry calculation: a feasibility study

    PubMed Central

    Xie, Huchen; Cheng, Jason Y.; Ning, Holly; Zhuge, Ying; Miller, Robert W.

    2011-01-01

    Current estimation of radiation dose from computed tomography (CT) scans on patients has relied on the measurement of Computed Tomography Dose Index (CTDI) in standard cylindrical phantoms, and calculations based on mathematical representations of “standard man”. Radiation dose to both adult and pediatric patients from a CT scan has been a concern, as noted in recent reports. The purpose of this study was to investigate the feasibility of adapting a radiation treatment planning system (RTPS) to provide patient‐specific CT dosimetry. A radiation treatment planning system was modified to calculate patient‐specific CT dose distributions, which can be represented by dose at specific points within an organ of interest, as well as organ dose‐volumes (after image segmentation) for a GE Light Speed Ultra Plus CT scanner. The RTPS calculation algorithm is based on a semi‐empirical, measured correction‐based algorithm, which has been well established in the radiotherapy community. Digital representations of the physical phantoms (virtual phantom) were acquired with the GE CT scanner in axial mode. Thermoluminescent dosimeter (TLD) measurements in pediatric anthropomorphic phantoms were utilized to validate the dose at specific points within organs of interest relative to RTPS calculations and Monte Carlo simulations of the same virtual phantoms (digital representation). Congruence of the calculated and measured point doses for the same physical anthropomorphic phantom geometry was used to verify the feasibility of the method. The RTPS algorithm can be extended to calculate the organ dose by calculating a dose distribution point‐by‐point for a designated volume. Electron Gamma Shower (EGSnrc) codes for radiation transport calculations developed by the National Research Council of Canada (NRCC) were utilized to perform the Monte Carlo (MC) simulation. In general, the RTPS and MC dose calculations are within 10% of the TLD measurements for the infant and child chest scans. With respect to the dose comparisons for the head, the RTPS dose calculations are slightly higher (10%–20%) than the TLD measurements, while the MC results were within 10% of the TLD measurements. The advantage of the algebraic dose calculation engine of the RTPS is a substantially reduced computation time (minutes vs. days) relative to Monte Carlo calculations, as well as providing patient‐specific dose estimation. It also provides the basis for a more elaborate reporting of dosimetric results, such as patient‐specific organ dose volumes after image segmentation. PACS numbers: 87.55.D‐, 87.57.Q‐, 87.53.Bn, 87.55.K‐ PMID:22089016

  16. A hyperboloid representation of the bone-marrow interface within 3D NMR images of trabecular bone: applications to skeletal dosimetry

    NASA Astrophysics Data System (ADS)

    Rajon, D. A.; Shah, A. P.; Watchman, C. J.; Brindle, J. M.; Bolch, W. E.

    2003-06-01

    Recent advances in physical models of skeletal dosimetry utilize high-resolution NMR microscopy images of trabecular bone. These images are coupled to radiation transport codes to assess energy deposition within active bone marrow irradiated by bone- or marrow-incorporated radionuclides. Recent studies have demonstrated that the rectangular shape of image voxels is responsible for cross-region (bone-to-marrow) absorbed fraction errors of up to 50% for very low-energy electrons (<50 keV). In this study, a new hyperboloid adaptation of the marching cube (MC) image-visualization algorithm is implemented within 3D digital images of trabecular bone to better define the bone-marrow interface, and thus reduce voxel effects in the assessment of cross-region absorbed fractions. To test the method, a mathematical sample of trabecular bone was constructed, composed of a random distribution of spherical marrow cavities, and subsequently coupled to the EGSnrc radiation code to generate reference values for the energy deposition in marrow or bone. Next, digital images of the bone model were constructed over a range of simulated image resolutions, and coupled to EGSnrc using the hyperboloid MC (HMC) algorithm. For the radionuclides 33P, 117mSn, 131I and 153Sm, values of S(marrow←bone) estimated using voxel models of trabecular bone were shown to have relative errors of 10%, 9%, <1% and <1% at a voxel size of 150 µm. At a voxel size of 60 µm, these errors were 6%, 5%, <1% and <1%, respectively. When the HMC model was applied during particle transport, the relative errors on S(marrow←bone) for these same radionuclides were reduced to 7%, 6%, <1% and <1% at a voxel size of 150 µm, and to 2%, 2%, <1% and <1% at a voxel size of 60 µm. The technique was also applied to a real NMR image of human trabecular bone with a similar demonstration of reductions in dosimetry errors.

  17. Validation of GPU-accelerated superposition-convolution dose computations for the Small Animal Radiation Research Platform.

    PubMed

    Cho, Nathan; Tsiamas, Panagiotis; Velarde, Esteban; Tryggestad, Erik; Jacques, Robert; Berbeco, Ross; McNutt, Todd; Kazanzides, Peter; Wong, John

    2018-05-01

    The Small Animal Radiation Research Platform (SARRP) has been developed for conformal microirradiation with on-board cone beam CT (CBCT) guidance. The graphics processing unit (GPU)-accelerated Superposition-Convolution (SC) method for dose computation has been integrated into the treatment planning system (TPS) for SARRP. This paper describes the validation of the SC method at kilovoltage energies by comparing with EBT2 film measurements and Monte Carlo (MC) simulations. MC data were simulated with the EGSnrc code using 3 × 10^8 to 1.5 × 10^9 histories, while 21 photon energy bins were used to model the 220 kVp x-rays in the SC method. Various types of phantoms including plastic water, cork, graphite, and aluminum were used to encompass the range of densities of mouse organs. For the comparison, percentage depth dose (PDD) curves from SC, MC, and film measurements were analyzed. Cross beam (x,y) dosimetric profiles of SC and film measurements are also presented. Correction factors (CFz) to convert SC to MC dose-to-medium are derived from the SC and MC simulations in homogeneous phantoms of aluminum and graphite to improve the estimation. The SC method produces dose values that are within 5% of film measurements and MC simulations in the flat regions of the profile. The dose is less accurate at the edges, due to factors such as geometric uncertainties of film placement and differences in dose calculation grids. The GPU-accelerated Superposition-Convolution dose computation method was successfully validated with EBT2 film measurements and MC calculations. The SC method offers much faster computation speed than MC and provides calculations of both dose-to-water in medium and dose-to-medium in medium. © 2018 American Association of Physicists in Medicine.
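
    The depth-dependent correction mentioned above can be applied by interpolating the tabulated CFz values at the depth of interest and scaling the SC dose accordingly, as in the short sketch below; the depths and factors shown are placeholders, not the published table.

        import numpy as np

        def apply_cfz(depth_cm, sc_dose, cfz_depths_cm, cfz_values):
            """Scale a Superposition-Convolution dose by a depth-interpolated CFz to
            approximate the Monte Carlo dose-to-medium."""
            return np.interp(depth_cm, cfz_depths_cm, cfz_values) * sc_dose

        # Placeholder table for a dense medium (illustrative numbers only):
        depths = [0.5, 1.0, 2.0, 3.0]
        factors = [1.08, 1.07, 1.06, 1.05]
        print(apply_cfz(1.5, 2.00, depths, factors))   # ~2.13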

  18. Organ dose conversion coefficients for voxel models of the reference male and female from idealized photon exposures

    NASA Astrophysics Data System (ADS)

    Schlattl, H.; Zankl, M.; Petoussi-Henss, N.

    2007-04-01

    A new series of organ equivalent dose conversion coefficients for whole body external photon exposure is presented for a standardized pair of human voxel models, called Rex and Regina. Irradiations from broad parallel beams in antero-posterior, postero-anterior, left- and right-side lateral directions as well as from a 360° rotational source have been performed numerically by the Monte Carlo transport code EGSnrc. Dose conversion coefficients from an isotropically distributed source were also computed. The voxel models Rex and Regina, originating from real patient CT data, comply in body and organ dimensions with the currently valid reference values given by the International Commission on Radiological Protection (ICRP) for the average Caucasian man and woman, respectively. While the equivalent dose conversion coefficients of many organs are in quite good agreement with the reference values of ICRP Publication 74, for some organs and certain geometries the discrepancies amount to 30% or more. Differences between the sexes are of the same order, with mostly higher dose conversion coefficients in the smaller female model. However, much smaller deviations from the ICRP values are observed for the resulting effective dose conversion coefficients. With the still valid definition for the effective dose (ICRP Publication 60), the greatest change appears in lateral exposures, with a decrease in the new models of at most 9%. However, when the modified definition of the effective dose as suggested by an ICRP draft is applied, the largest deviation from the current reference values is obtained in postero-anterior geometry, with a reduction of the effective dose conversion coefficient by at most 12%.
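
    For context on the last point above: the effective dose is the tissue-weighted sum of organ equivalent doses, E = sum over tissues of w_T × H_T, so changing the weighting scheme (ICRP Publication 60 versus the later draft) changes E even where the organ coefficients themselves are unchanged. The fragment below shows only that weighting step; the weights listed are a truncated, illustrative subset rather than a complete ICRP set, and the organ doses are placeholders.

        def effective_dose(organ_equivalent_doses, tissue_weights):
            """E = sum_T w_T * H_T; a complete weight set should sum to 1."""
            return sum(tissue_weights[t] * h for t, h in organ_equivalent_doses.items())

        # Illustrative, incomplete weights and organ equivalent doses (arbitrary units):
        w = {'lung': 0.12, 'stomach': 0.12, 'liver': 0.05}   # subset only
        h = {'lung': 4.1, 'stomach': 4.4, 'liver': 4.0}
        print(effective_dose(h, w))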

  19. Sci-Thur PM – Brachytherapy 01: Fast brachytherapy dose calculations: Characterization of egs-brachy features to enhance simulation efficiency

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chamberland, Marc; Taylor, Randle E.P.; Rogers, Da

    2016-08-15

    Purpose: egs-brachy is a fast, new EGSnrc user-code for brachytherapy applications. This study characterizes egs-brachy features that enhance simulation efficiency. Methods: Calculations are performed to characterize efficiency gains from various features. Simulations include radionuclide and miniature x-ray tube sources in water phantoms and idealized prostate, breast, and eye plaque treatments. Features characterized include voxel indexing of sources to reduce boundary checks during radiation transport, scoring collision kerma via a tracklength estimator, recycling photons emitted from sources, and using phase space data to initiate simulations. Bremsstrahlung cross section enhancement (BCSE), uniform bremsstrahlung splitting (UBS), and Russian Roulette (RR) are considered for electronic brachytherapy. Results: Efficiency is enhanced by a factor of up to 300 using tracklength versus interaction scoring of collision kerma, and by up to 2.7 and 2.6 using phase space sources and particle recycling, respectively, compared to simulations in which particles are initiated within sources. On a single 2.5 GHz Intel Xeon E5-2680 processor core, simulations approximating prostate and breast permanent implant ((2 mm){sup 3} voxels) and eye plaque ((1 mm){sup 3}) treatments take as little as 9 s (prostate, eye) and up to 31 s (breast) to achieve 2% statistical uncertainty on doses within the PTV. For electronic brachytherapy, BCSE, UBS, and RR enhance efficiency by a factor >2000, compared to a factor of >10{sup 4} using a phase space source. Conclusion: egs-brachy features provide substantial efficiency gains, resulting in calculation times sufficiently fast for full Monte Carlo simulations for routine brachytherapy treatment planning.
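
    The tracklength scoring of collision kerma mentioned above replaces interaction-based energy scoring with a contribution from every photon step crossing a voxel: each step adds weight × track length × E × (μ_en/ρ)(E), divided by the voxel volume. The sketch below shows that accumulation in isolation; the photon-step records and the μ_en/ρ lookup are placeholders, not egs-brachy internals.

        def score_collision_kerma(photon_steps, mu_en_over_rho, voxel_volume_cm3):
            """Tracklength estimator of collision kerma in one voxel (MeV per g).
            photon_steps: iterable of (weight, energy_MeV, tracklength_cm) for photon steps
            crossing the voxel; mu_en_over_rho: callable giving the mass energy-absorption
            coefficient (cm2/g) at a given energy."""
            kerma = 0.0
            for w, e_mev, length_cm in photon_steps:
                # fluence contribution (w * L / V) times E * mu_en/rho
                kerma += w * length_cm * e_mev * mu_en_over_rho(e_mev) / voxel_volume_cm3
            return kerma

    Because every photon step contributes, far fewer histories are needed per voxel than with interaction scoring, which is the origin of the factor-of-up-to-300 efficiency gain quoted above.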

  20. Absolute dosimetry on a dynamically scanned sample for synchrotron radiotherapy using graphite calorimetry and ionization chambers

    NASA Astrophysics Data System (ADS)

    Lye, J. E.; Harty, P. D.; Butler, D. J.; Crosbie, J. C.; Livingstone, J.; Poole, C. M.; Ramanathan, G.; Wright, T.; Stevenson, A. W.

    2016-06-01

    The absolute dose delivered to a dynamically scanned sample in the Imaging and Medical Beamline (IMBL) on the Australian Synchrotron was measured with a graphite calorimeter anticipated to be established as a primary standard for synchrotron dosimetry. The calorimetry was compared to measurements using a free-air chamber (FAC), a PTW 31 014 Pinpoint ionization chamber, and a PTW 34 001 Roos ionization chamber. The IMBL beam height is limited to approximately 2 mm. To produce clinically useful beams of a few centimetres the beam must be scanned in the vertical direction. In practice it is the patient/detector that is scanned and the scanning velocity defines the dose that is delivered. The calorimeter, FAC, and Roos chamber measure the dose area product which is then converted to central axis dose with the scanned beam area derived from Monte Carlo (MC) simulations and film measurements. The Pinpoint chamber measures the central axis dose directly and does not require beam area measurements. The calorimeter and FAC measure dose from first principles. The calorimetry requires conversion of the measured absorbed dose to graphite to absorbed dose to water using MC calculations with the EGSnrc code. Air kerma measurements from the free air chamber were converted to absorbed dose to water using the AAPM TG-61 protocol. The two ionization chambers are secondary standards requiring calibration with kilovoltage x-ray tubes. The Roos and Pinpoint chambers were calibrated against the Australian primary standard for air kerma at the Australian Radiation Protection and Nuclear Safety Agency (ARPANSA). Agreement of order 2% or better was obtained between the calorimetry and ionization chambers. The FAC measured a dose 3-5% higher than the calorimetry, within the stated uncertainties.
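
    The conversion mentioned above, from a dose area product measured on a scanned sample to a central-axis dose, divides by the effective scanned beam area obtained from the Monte Carlo model and the film measurements. A one-line sketch with placeholder numbers:

        def central_axis_dose(dose_area_product_gy_cm2, effective_beam_area_cm2):
            """Central-axis dose (Gy) from a dose area product and the effective scanned
            beam area derived from Monte Carlo simulations and film."""
            return dose_area_product_gy_cm2 / effective_beam_area_cm2

        print(central_axis_dose(12.0, 6.0))   # illustrative: 2.0 Gy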

  1. Sci-Sat AM: Radiation Dosimetry and Practical Therapy Solutions - 06: Investigation of an absorbed dose to water formalism for a miniature low-energy x-ray source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Watson, Peter; Seuntjens, Jan

    Purpose: We present a formalism for calculating the absorbed dose to water from a miniature x-ray source (the INTRABEAM system, Carl Zeiss), using a parallel-plate ionization chamber calibrated in terms of air kerma. Monte Carlo calculations were performed to derive a chamber conversion factor (C{sub Q}) from reference air kerma to dose to water for the INTRABEAM. C{sub Q} was investigated as a function of depth in water, and compared with the manufacturer’s reported value. The effect of chamber air cavity dimension tolerance was also investigated. Methods: Air kerma (A{sub k}) from a reference beam was calculated using the EGSnrc user code cavity. Using egs-chamber, a model of a PTW 34013 parallel-plate ionization chamber was created according to manufacturer specifications. The dose to the chamber air cavity (D{sub gas}) was simulated both in air (with the reference beam) and in water (with the INTRABEAM source). Dose to a small water voxel (D{sub w}) was also calculated. C{sub Q} was derived from these quantities. Results: C{sub Q} was found to vary by up to 15% (1.30 vs 1.11) between chamber dimension extremes. The agreement between chamber C{sub Q} values was found to improve with increasing depth in water. However, in all cases investigated, C{sub Q} was larger than the manufacturer-reported value of 1.054. Conclusions: Our results show that cavity dimension tolerance has a significant effect on C{sub Q}, with differences as large as 15%. In all cases considered, C{sub Q} was found to be larger than the reported value of 1.054. This suggests that the recommended calculation underestimates the dose to water.
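
    One plausible way to assemble C_Q from the four Monte Carlo quantities named above is sketched below. The abstract does not give the explicit combination, so this particular formula (dose to water per unit cavity dose in the INTRABEAM beam, divided by air kerma per unit cavity dose in the reference beam) is an assumption made for illustration only, not necessarily the authors' definition.

        def chamber_conversion_factor(d_water, d_gas_intrabeam, a_k_ref, d_gas_ref):
            """Hypothetical assembly of C_Q = (D_w / D_gas,INTRABEAM) / (A_k / D_gas,ref),
            i.e. the factor taking an air-kerma-calibrated reading to dose to water."""
            return (d_water / d_gas_intrabeam) / (a_k_ref / d_gas_ref)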

  2. Skeletal dosimetry in the MAX06 and the FAX06 phantoms for external exposure to photons based on vertebral 3D-microCT images

    NASA Astrophysics Data System (ADS)

    Kramer, R.; Khoury, H. J.; Vieira, J. W.; Kawrakow, I.

    2006-12-01

    3D-microCT images of vertebral bodies from three different individuals have been segmented into trabecular bone, bone marrow and bone surface cells (BSC), and then introduced into the spongiosa voxels of the MAX06 and the FAX06 phantoms, in order to calculate the equivalent dose to the red bone marrow (RBM) and the BSC in the marrow cavities of trabecular bone with the EGSnrc Monte Carlo code from whole-body exposure to external photon radiation. The MAX06 and the FAX06 phantoms consist of about 150 million 1.2 mm cubic voxels each, a part of which are spongiosa voxels surrounded by cortical bone. In order to use the segmented 3D-microCT images for skeletal dosimetry, spongiosa voxels in the MAX06 and the FAX06 phantom were replaced at runtime by so-called micro matrices representing segmented trabecular bone, marrow and BSC in 17.65, 30 and 60 µm cubic voxels. The 3D-microCT image-based RBM and BSC equivalent doses for external exposure to photons presented here for the first time for complete human skeletons are in agreement with the results calculated with the three correction factor method and the fluence-to-dose response functions for the same phantoms taking into account the conceptual differences between the different methods. Additionally the microCT image-based results have been compared with corresponding data from earlier studies for other human phantoms. This article is dedicated to Prof. Dr Guenter Drexler from the Laboratório de Ciências Radiológicas, State University of Rio de Janeiro, on the occasion of his 70th birthday.

  3. The influence of neutron contamination on dosimetry in external photon beam radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horst, Felix, E-mail: felix.ernst.horst@kmub.thm.de; Czarnecki, Damian; Zink, Klemens

    Purpose: Photon fields with energies above ∼7 MeV are contaminated by neutrons due to photonuclear reactions. Their influence on dosimetry, although considered to be very low, is widely unexplored. Methods: In this work, Monte Carlo based investigations into this issue performed with FLUKA and EGSnrc are presented. A typical linac head in 18 MV-X mode was modeled equivalently within both codes; EGSnrc was used for the photon simulation and FLUKA for the neutron production and transport. Water depth dose profiles and the response of different detectors (Farmer chamber, TLD-100, TLD-600H, and TLD-700H chips) at five representative depths were simulated, and the neutrons' impact (neutron absorbed dose relative to photon absorbed dose) was calculated. To account for the neutrons' influence, a theoretically required correction factor was defined and calculated for the five representative water depths. Results: The neutrons' impact on the absorbed dose to water was found to be below 0.1% for all depths, and their impact on the response of the Farmer chamber and the TLD-700H chip was even smaller. For the TLD-100 and TLD-600H chips it was up to 0.3% and 0.7%, respectively. The theoretical correction factors to be applied to absorbed dose to water values measured with these four detectors at a depth different from the reference/calibration depth were found to be below 0.05% for the Farmer chamber and the TLD-700H chip, but up to 0.15% and 0.35% for the TLD-100 and TLD-600H chips, respectively. In thermoluminescence dosimetry the neutrons' influence (and therefore the additional measurement inaccuracy) was higher for TLD materials with a high 6Li fraction, such as TLD-100 and TLD-600H, as a result of the thermal neutron capture reaction on 6Li. Conclusions: The impact of photoneutrons on the absorbed dose to water and on the response of a typical ionization chamber as well as three different types of TLD chips was quantified and, as expected, found to be very low relative to that of the primary photons. For most practical purposes the neutrons' influence on dosimetry can be neglected, while for highly precise absolute thermoluminescence dosimetry in high-energy photon fields the use of TLD-700H (<0.03% 6Li) instead of the commonly used TLD-100 (7.4% 6Li), or even the extra neutron-sensitive TLD-600H (95.6% 6Li), is recommended due to the additional measurement inaccuracy for TLD materials with a high 6Li fraction.

  4. The influence of neutron contamination on dosimetry in external photon beam radiotherapy.

    PubMed

    Horst, Felix; Czarnecki, Damian; Zink, Klemens

    2015-11-01

    Photon fields with energies above ∼7 MeV are contaminated by neutrons due to photonuclear reactions. Their influence on dosimetry, although considered to be very low, is widely unexplored. In this work, Monte Carlo based investigations into this issue performed with FLUKA and EGSnrc are presented. A typical linac head in 18 MV-X mode was modeled equivalently within both codes; EGSnrc was used for the photon simulation and FLUKA for the neutron production and transport. Water depth dose profiles and the response of different detectors (Farmer chamber, TLD-100, TLD-600H, and TLD-700H chips) at five representative depths were simulated, and the neutrons' impact (neutron absorbed dose relative to photon absorbed dose) was calculated. To account for the neutrons' influence, a theoretically required correction factor was defined and calculated for the five representative water depths. The neutrons' impact on the absorbed dose to water was found to be below 0.1% for all depths, and their impact on the response of the Farmer chamber and the TLD-700H chip was even smaller. For the TLD-100 and TLD-600H chips it was up to 0.3% and 0.7%, respectively. The theoretical correction factors to be applied to absorbed dose to water values measured with these four detectors at a depth different from the reference/calibration depth were found to be below 0.05% for the Farmer chamber and the TLD-700H chip, but up to 0.15% and 0.35% for the TLD-100 and TLD-600H chips, respectively. In thermoluminescence dosimetry the neutrons' influence (and therefore the additional measurement inaccuracy) was higher for TLD materials with a high 6Li fraction, such as TLD-100 and TLD-600H, as a result of the thermal neutron capture reaction on 6Li. The impact of photoneutrons on the absorbed dose to water and on the response of a typical ionization chamber as well as three different types of TLD chips was quantified and, as expected, found to be very low relative to that of the primary photons. For most practical purposes the neutrons' influence on dosimetry can be neglected, while for highly precise absolute thermoluminescence dosimetry in high-energy photon fields the use of TLD-700H (<0.03% 6Li) instead of the commonly used TLD-100 (7.4% 6Li), or even the extra neutron-sensitive TLD-600H (95.6% 6Li), is recommended due to the additional measurement inaccuracy for TLD materials with a high 6Li fraction.

  5. SKIRT: The design of a suite of input models for Monte Carlo radiative transfer simulations

    NASA Astrophysics Data System (ADS)

    Baes, M.; Camps, P.

    2015-09-01

    The Monte Carlo method is the most popular technique to perform radiative transfer simulations in a general 3D geometry. The algorithms behind and acceleration techniques for Monte Carlo radiative transfer are discussed extensively in the literature, and many different Monte Carlo codes are publicly available. By contrast, the design of a suite of components that can be used for the distribution of sources and sinks in radiative transfer codes has received very little attention. The availability of such models, with different degrees of complexity, has many benefits. For example, they can serve as toy models to test new physical ingredients, or as parameterised models for inverse radiative transfer fitting. For 3D Monte Carlo codes, this requires algorithms to efficiently generate random positions from 3D density distributions. We describe the design of a flexible suite of components for the Monte Carlo radiative transfer code SKIRT. The design is based on a combination of basic building blocks (which can be either analytical toy models or numerical models defined on grids or a set of particles) and the extensive use of decorators that combine and alter these building blocks into more complex structures. For a number of decorators, e.g. those that add spiral structure or clumpiness, we provide a detailed description of the algorithms that can be used to generate random positions. Advantages of this decorator-based design include code transparency, the avoidance of code duplication, and an increase in code maintainability. Moreover, since decorators can be chained without problems, very complex models can easily be constructed out of simple building blocks. Finally, based on a number of test simulations, we demonstrate that our design using customised random position generators is superior to a simpler design based on a generic black-box random position generator.
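
    The decorator idea above lends itself to a compact illustration. The sketch below (Python rather than SKIRT's C++, with invented class names) shows a basic building block that can draw random positions from its own density, and a clumpiness decorator that wraps any component and can itself be chained; it is a minimal sketch of the pattern, not SKIRT code.

      import math
      import random

      class PlummerSphere:
          """Basic building block: a spherical toy model with an analytic position sampler."""
          def __init__(self, a=1.0):
              self.a = a

          def random_position(self):
              # Invert the enclosed-mass fraction M(<r)/M = (1 + (a/r)^2)^(-3/2).
              u = min(max(random.random(), 1e-12), 1.0 - 1e-12)
              r = self.a / math.sqrt(u ** (-2.0 / 3.0) - 1.0)
              cos_t = random.uniform(-1.0, 1.0)
              sin_t = math.sqrt(1.0 - cos_t ** 2)
              phi = random.uniform(0.0, 2.0 * math.pi)
              return (r * sin_t * math.cos(phi), r * sin_t * math.sin(phi), r * cos_t)

      class ClumpyDecorator:
          """Decorator: relocates a fraction of the mass into point-like clumps."""
          def __init__(self, component, clump_fraction=0.3, n_clumps=100):
              self.component = component
              self.clump_fraction = clump_fraction
              self.clumps = [component.random_position() for _ in range(n_clumps)]

          def random_position(self):
              if random.random() < self.clump_fraction:
                  return random.choice(self.clumps)        # clumpy part
              return self.component.random_position()       # smooth part delegates to the wrapped block

      # Decorators can be chained to build complex structures out of simple building blocks.
      model = ClumpyDecorator(PlummerSphere(a=2.0), clump_fraction=0.25)
      positions = [model.random_position() for _ in range(10_000)]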

  6. NRMC - A GPU code for N-Reverse Monte Carlo modeling of fluids in confined media

    NASA Astrophysics Data System (ADS)

    Sánchez-Gil, Vicente; Noya, Eva G.; Lomba, Enrique

    2017-08-01

    NRMC is a parallel code for performing N-Reverse Monte Carlo modeling of fluids in confined media [V. Sánchez-Gil, E.G. Noya, E. Lomba, J. Chem. Phys. 140 (2014) 024504]. This method is an extension of the usual Reverse Monte Carlo method to obtain structural models of confined fluids compatible with experimental diffraction patterns, specifically designed to overcome the problem of slow diffusion that can appear under conditions of tight confinement. Most of the computational time in N-Reverse Monte Carlo modeling is spent in the evaluation of the structure factor for each trial configuration, a calculation that can be easily parallelized. Implementing the structure factor evaluation in NVIDIA® CUDA, so that the code can be run on GPUs, leads to a speed-up of up to two orders of magnitude.
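
    The bottleneck named above, re-evaluating the structure factor for every trial configuration, is easy to express in vectorised form. The NumPy sketch below shows the per-wave-vector sum that a GPU kernel would parallelise; it is a generic illustration, not the NRMC CUDA implementation.

      import numpy as np

      def structure_factor(positions, q_vectors):
          """S(q) = |sum_j exp(i q . r_j)|^2 / N for each wave vector q.

          positions : (N, 3) array of particle coordinates
          q_vectors : (M, 3) array of scattering wave vectors
          """
          n = len(positions)
          phases = q_vectors @ positions.T             # (M, N) array of q . r_j
          rho_q = np.exp(1j * phases).sum(axis=1)      # collective density mode for each q
          return np.abs(rho_q) ** 2 / n

      # Toy usage: one Reverse Monte Carlo trial move would displace a particle and
      # re-evaluate S(q) to compare against the experimental diffraction pattern.
      rng = np.random.default_rng(0)
      positions = rng.uniform(0.0, 10.0, size=(500, 3))
      q_vectors = rng.normal(size=(64, 3))
      s_q = structure_factor(positions, q_vectors)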

  7. The Monte Carlo photoionization and moving-mesh radiation hydrodynamics code CMACIONIZE

    NASA Astrophysics Data System (ADS)

    Vandenbroucke, B.; Wood, K.

    2018-04-01

    We present the public Monte Carlo photoionization and moving-mesh radiation hydrodynamics code CMACIONIZE, which can be used to simulate the self-consistent evolution of HII regions surrounding young O and B stars, or other sources of ionizing radiation. The code combines a Monte Carlo photoionization algorithm that uses a complex mix of hydrogen, helium and several coolants in order to self-consistently solve for the ionization and temperature balance at any given time, with a standard first-order hydrodynamics scheme. The code can be run as a post-processing tool to get the line emission from an existing simulation snapshot, but can also be used to run full radiation hydrodynamical simulations. Both the radiation transfer and the hydrodynamics are implemented in a general way that is independent of the grid structure used to discretize the system, allowing the code to be run both as a standard fixed-grid code and as a moving-mesh code.

  8. SU-F-303-15: Ion Chamber Dose Response in Magnetic Fields as a Function of Incident Photon Energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malkov, V. N.; Rogers, D. W. O.

    2015-06-15

    Purpose: In considering the continued development of synergetic MRI-radiation therapy machines, we seek to quantify the variability of ion chamber response per unit dose in the presence of magnetic fields of varying strength as a function of incident photon beam quality and geometric configuration. Methods: To account for the effect of magnetic fields on the trajectory of charged particles, a new algorithm was introduced into the EGSnrc Monte Carlo code. In the egs_chamber user code the dose to the cavity of an NE2571 ion chamber is calculated in two configurations, in 0 to 2 T magnetic fields, with an incoming parallel 10×10 cm^2 photon beam with energies ranging between 0.5 MeV and 8 MeV. In the first, the photon beam is incident on the long axis of the ion chamber (config-1); in the second, the beam is parallel to the long axis and incident from the conical end of the chamber (config-2). For both, the magnetic field is perpendicular to the direction of the beam and to the long axis of the chamber. Results: The ion chamber response per unit dose to water at the same point is determined as a function of magnetic field and is normalized to the 0 T case for each incoming photon energy. For both configurations, accurate modeling of the ion chamber yielded closer agreement with the experimental results obtained by Meijsing et al. (2009). Config-1 yields a gradual increase in response with increasing field strength, to a maximum of 13.4% and 1.4% for 1 MeV and 8 MeV photon beams, respectively. Config-2 produced a decrease in response of up to 6% and 13% for 0.5 MeV and 8 MeV beams, respectively. Conclusion: These results provide further support for ion chamber calibration in MRI-radiotherapy coupled systems and demonstrate a noticeable energy dependence for clinically relevant fields.
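
    The underlying physics of the added algorithm, a charged-particle step being bent by the Lorentz force, can be sketched as a rotation of the direction vector by step length over gyroradius. The Python snippet below is only that textbook rotation with placeholder values; it is not the condensed-history algorithm actually implemented in EGSnrc.

      import numpy as np

      C_LIGHT = 299792458.0        # m/s
      E_REST_MEV = 0.511           # electron rest energy, MeV
      Q_E = 1.602176634e-19        # elementary charge, C

      def bend_direction(u, kinetic_mev, b_field_tesla, step_m):
          """Rotate a unit direction u over a short step in a uniform magnetic field."""
          u = np.asarray(u, dtype=float)
          b = np.asarray(b_field_tesla, dtype=float)
          b_mag = np.linalg.norm(b)
          if b_mag == 0.0:
              return u
          gamma = 1.0 + kinetic_mev / E_REST_MEV
          beta = np.sqrt(1.0 - 1.0 / gamma**2)
          p_si = gamma * beta * E_REST_MEV * 1e6 * Q_E / C_LIGHT   # momentum, kg m/s
          theta = Q_E * b_mag * step_m / p_si      # bend angle = step length / gyroradius
          axis = b / b_mag                         # rotation axis for a negative charge (electron)
          # Rodrigues rotation of u about axis by theta
          u_rot = (u * np.cos(theta) + np.cross(axis, u) * np.sin(theta)
                   + axis * np.dot(axis, u) * (1.0 - np.cos(theta)))
          return u_rot / np.linalg.norm(u_rot)

      # A 1 MeV electron stepping 1 mm across a 1.5 T field is deflected by roughly 0.3 rad.
      new_dir = bend_direction([1.0, 0.0, 0.0], kinetic_mev=1.0, b_field_tesla=[0.0, 0.0, 1.5], step_m=1e-3)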

  9. SU-G-JeP3-06: Lower KV Image Doses Are Expected From a Limited-Angle Intra-Fractional Verification (LIVE) System for SBRT Treatments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ding, G; Yin, F; Ren, L

    Purpose: In order to track tumor movement for patient positioning verification during arc treatment delivery, or in between 3D/IMRT beams for stereotactic body radiation therapy (SBRT), limited-angle kV projection acquisition simultaneously with arc treatment delivery, or in between static treatment beams as the gantry moves to the next beam angle, has been proposed. The purpose of this study is to estimate the additional imaging dose resulting from multiple tomosynthesis acquisitions in between static treatment beams and to compare it with that of a conventional kV-CBCT acquisition. Methods: The kV imaging system integrated into Varian TrueBeam accelerators was modeled using the EGSnrc Monte Carlo user code BEAMnrc, and the DOSXYZnrc code was used for the dose calculations. The simulated realistic kV beams from the Varian TrueBeam OBI 1.5 system were used to calculate dose to the patient based on CT images. Organ doses were analyzed using DVHs. The imaging dose to the patient resulting from realistic multiple tomosynthesis acquisitions, each with a 25–30 degree kV source rotation, between six treatment beam gantry angles was studied. Results: For a typical lung SBRT treatment delivery, the kV imaging dose from the sum of six realistic tomosynthesis acquisitions, each with a 25–30 degree x-ray source rotation between the six treatment beam gantry angles, was much lower (20–50%) than that from a single CBCT image acquisition. Conclusion: This work indicates that the kV imaging in the proposed Limited-angle Intra-fractional Verification (LIVE) system for SBRT treatments adds a negligible imaging dose. It is worth noting that the MV imaging dose caused by MV projection acquisition in between static beams in LIVE can be minimized by restricting the imaging to the target region and reducing the number of projections acquired. For arc treatments, MV imaging acquisition in LIVE does not add additional imaging dose, as the MV images are acquired directly from the treatment beams during treatment.

  10. SU-G-201-08: Energy Response of Thermoluminescent Microcube Dosimeters in Water for Kilovoltage X-Ray Beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Di Maso, L; Lawless, M; Culberson, W

    Purpose: To characterize the energy dependence of TLD-100 microcubes in water at kilovoltage energies. Methods: TLD-100 microcubes with dimensions of (1 × 1 × 1) mm^3 were irradiated with kilovoltage x-rays in a custom-built thin-window liquid water phantom. The TLD-100 microcubes were held in Virtual Water™ probes and aligned at a 2 cm depth in water. Irradiations were performed using M-series x-ray beams with energies ranging from 50 to 250 kVp and normalized to a 60Co beam located at the UWADCL. Simulations using the EGSnrc Monte Carlo Code System were performed to model the x-ray beams, the 60Co beam, the water phantom and the dosimeters in the phantom. The egs_chamber user code was used to tally the dose to the TLDs and the dose to water. The measurements and calculations were used to determine the intrinsic energy dependence, the absorbed-dose energy dependence, and the absorbed-dose sensitivity. These values were compared to TLD-100 chips with dimensions of (3.2 × 0.9 × 0.9) mm^3. Results: The measured TLD-100 microcube response per dose to water among all investigated x-ray energies had a maximum percent difference of 61% relative to 60Co. The simulated ratio of dose to water to dose to TLD had a maximum percent difference of 29% relative to 60Co. The ratio of dose to TLD to the TLD output had a maximum percent difference of 13% relative to 60Co. The absorbed-dose sensitivity differed by at most 15% from the value of 1.41 that was used. Conclusion: These results confirm that differences in beam quality have a significant effect on TLD response when irradiated in water. These results also indicate a difference in TLD-100 response between microcube and chip geometries. The intrinsic energy dependence and the absorbed-dose energy dependence deviated by up to 10% between TLD-100 microcubes and chips.

  11. TU-H-BRC-03: Evaluation of Very High-Energy Electron (VHEE) Beams in Comparison to VMAT and PBS Treatment Plans

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schueler, E; Loo, B; Maxim, P

    2016-06-15

    Purpose: The aim of this study was to evaluate the performance of very high-energy electron (VHEE) beams in comparison to clinically delivered treatment plans generated with volumetric modulated arc therapy (VMAT) and proton pencil beam scanning (PBS) technology. Methods: Three clinical cases were selected (prostate, lung, and pediatric CNS). The VHEE plans were calculated with the EGSnrc Monte Carlo code, and pencil beam doses were calculated using the DOSXYZnrc MC code for 100 and 200 MeV beams. Treatment plans with VHEE, VMAT, and PBS were optimized in a research version of RayStation using an in-house script in order to minimize operator bias between the different techniques. Results: For the prostate cancer case, the PBS plan showed lower mean organ-at-risk (OAR) doses compared to the other modalities. An exception was the femoral heads, due to the lateral beam arrangements. The VMAT plan showed lower mean doses to the rectum and the bladder compared to the 100 MeV VHEE plan. The lung cancer case showed minor differences between the three modalities; however, the PBS plan showed a lower contralateral lung dose. The pediatric CNS case showed better conformity and a lower spinal cord dose for the 100 MeV VHEE plan. For all cases, the 200 MeV VHEE plans were found to be similar to or better than the 100 MeV VHEE plans. Conclusion: The present study showed that VHEE plans are similar or superior to VMAT plans, with reduced mean OAR dose and increased target conformity, for a variety of clinical cases. With increased VHEE energy, better conformity and even larger reductions in mean OAR doses can be achieved. Funding: DoD, Award#:W81XWH-13-1-0165, Weston Havens Foundation, Bio-X (Stanford University), the Office of the Dean of the Medical School, the Office of the Provost (Stanford University), and the Swedish Childhood Cancer Foundation. BL and PM are founders of TibaRay, Inc. BL and PM have received research grants from Varian and RaySearch Laboratory.

  12. Capabilities overview of the MORET 5 Monte Carlo code

    NASA Astrophysics Data System (ADS)

    Cochet, B.; Jinaphanh, A.; Heulers, L.; Jacquet, O.

    2014-06-01

    The MORET code is a simulation tool that solves the transport equation for neutrons using the Monte Carlo method. It allows users to model complex three-dimensional geometrical configurations, describe the materials, and define their own tallies in order to analyse the results. The MORET code was initially designed to perform calculations for criticality safety assessments. New features have been introduced in the MORET 5 code to expand its use to reactor applications. This paper presents an overview of the MORET 5 code capabilities, going through the description of materials, the geometry modelling, the transport simulation and the definition of the outputs.

  13. Extensions of the MCNP5 and TRIPOLI4 Monte Carlo Codes for Transient Reactor Analysis

    NASA Astrophysics Data System (ADS)

    Hoogenboom, J. Eduard; Sjenitzer, Bart L.

    2014-06-01

    To simulate reactor transients for safety analysis with the Monte Carlo method, the generation and decay of delayed neutron precursors has been implemented in the MCNP5 and TRIPOLI4 general-purpose Monte Carlo codes. Important new variance reduction techniques, like the forced decay of precursors in each time interval and the branchless collision method, are included to obtain reasonable statistics for the power production per time interval. For the simulation of practical reactor transients, the feedback effect from the thermal-hydraulics must also be included. This requires coupling the Monte Carlo code with a thermal-hydraulics (TH) code, which provides the temperature distribution in the reactor; the temperature affects the neutron transport via the cross section data. The TH code also provides the coolant density distribution in the reactor, which directly influences the neutron transport. Different techniques for this coupling are discussed. As a demonstration, a 3×3 mini fuel assembly with a moving control rod is considered for MCNP5, and a mini core consisting of 3×3 PWR fuel assemblies with control rods and burnable poisons for TRIPOLI4. Results are shown for reactor transients due to control rod movement or withdrawal. The TRIPOLI4 transient calculation is started at low power and includes thermal-hydraulic feedback. The power rises by about 10 decades and finally stabilises at a much higher level than the initial power. The examples demonstrate that the modified Monte Carlo codes are capable of performing correct transient calculations, taking into account all geometrical and cross section detail.
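
    The coupling described above is essentially an operator-splitting loop: transport one time interval, hand the power map to the thermal-hydraulics solver, and fold the returned temperatures and coolant densities back into the cross-section data. The sketch below uses invented stand-in functions (not MCNP5 or TRIPOLI4 APIs) purely to show where the data flows.

      # Hypothetical Monte Carlo / thermal-hydraulics coupling loop; all three
      # functions are placeholders standing in for the real solvers.

      def run_monte_carlo_step(state, dt):
          # Stand-in for the transient Monte Carlo solve (with forced precursor decay,
          # branchless collisions, ...): returns the power produced per region.
          return {region: 1.0 for region in state["regions"]}

      def run_th_step(power_map, dt):
          # Stand-in for the TH solve: returns fuel temperatures and coolant densities.
          temperatures = {r: 600.0 + 50.0 * p for r, p in power_map.items()}   # K
          densities = {r: 0.72 for r in power_map}                             # g/cm^3
          return temperatures, densities

      def update_cross_sections(state, temperatures, densities):
          # Stand-in for Doppler broadening and density updates of the nuclear data.
          state["fuel_temperature"] = temperatures
          state["coolant_density"] = densities
          return state

      state = {"regions": ["assembly_1", "assembly_2"], "fuel_temperature": {}, "coolant_density": {}}
      for _ in range(10):                       # ten time intervals of the transient
          power = run_monte_carlo_step(state, dt=0.01)
          temperatures, densities = run_th_step(power, dt=0.01)
          state = update_cross_sections(state, temperatures, densities)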

  14. Monte Carlo reference data sets for imaging research: Executive summary of the report of AAPM Research Committee Task Group 195.

    PubMed

    Sechopoulos, Ioannis; Ali, Elsayed S M; Badal, Andreu; Badano, Aldo; Boone, John M; Kyprianou, Iacovos S; Mainegra-Hing, Ernesto; McMillan, Kyle L; McNitt-Gray, Michael F; Rogers, D W O; Samei, Ehsan; Turner, Adam C

    2015-10-01

    The use of Monte Carlo simulations in diagnostic medical imaging research is widespread due to its flexibility and ability to estimate quantities that are challenging to measure empirically. However, any new Monte Carlo simulation code needs to be validated before it can be used reliably. The type and degree of validation required depends on the goals of the research project, but, typically, such validation involves either comparison of simulation results to physical measurements or to previously published results obtained with established Monte Carlo codes. The former is complicated due to nuances of experimental conditions and uncertainty, while the latter is challenging due to typical graphical presentation and lack of simulation details in previous publications. In addition, entering the field of Monte Carlo simulations in general involves a steep learning curve. It is not a simple task to learn how to program and interpret a Monte Carlo simulation, even when using one of the publicly available code packages. This Task Group report provides a common reference for benchmarking Monte Carlo simulations across a range of Monte Carlo codes and simulation scenarios. In the report, all simulation conditions are provided for six different Monte Carlo simulation cases that involve common x-ray based imaging research areas. The results obtained for the six cases using four publicly available Monte Carlo software packages are included in tabular form. In addition to a full description of all simulation conditions and results, a discussion and comparison of results among the Monte Carlo packages and the lessons learned during the compilation of these results are included. This abridged version of the report includes only an introductory description of the six cases and a brief example of the results of one of the cases. This work provides an investigator the necessary information to benchmark his/her Monte Carlo simulation software against the reference cases included here before performing his/her own novel research. In addition, an investigator entering the field of Monte Carlo simulations can use these descriptions and results as a self-teaching tool to ensure that he/she is able to perform a specific simulation correctly. Finally, educators can assign these cases as learning projects as part of course objectives or training programs.

  15. (U) Introduction to Monte Carlo Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hungerford, Aimee L.

    2017-03-20

    Monte Carlo methods are very valuable for representing solutions to particle transport problems. Here we describe a “cook book” approach to handling the terms in a transport equation using Monte Carlo methods. Focus is on the mechanics of a numerical Monte Carlo code, rather than the mathematical foundations of the method.
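
    A representative "cook book" ingredient is sampling the streaming and collision terms directly: draw the distance to the next collision from the exponential attenuation law, then pick the interaction in proportion to the partial cross sections. The Python snippet below shows that single step with made-up cross sections; it is an illustration in the spirit of the report, not code from it.

      import math
      import random

      SIGMA = {"scatter": 0.30, "capture": 0.05, "fission": 0.10}   # macroscopic cross sections, 1/cm (made up)

      def sample_path_length(sigma_total):
          # Distance to next collision: p(s) = sigma_total * exp(-sigma_total * s)
          return -math.log(1.0 - random.random()) / sigma_total

      def sample_interaction(sigma):
          # Choose the interaction type with probability sigma_x / sigma_total
          xi = random.random() * sum(sigma.values())
          running = 0.0
          for name, value in sigma.items():
              running += value
              if xi <= running:
                  return name
          return name   # guard against floating-point round-off

      sigma_total = sum(SIGMA.values())
      distance = sample_path_length(sigma_total)
      event = sample_interaction(SIGMA)
      print(f"collision after {distance:.2f} cm: {event}")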

  16. Improvements of MCOR: A Monte Carlo depletion code system for fuel assembly reference calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tippayakul, C.; Ivanov, K.; Misu, S.

    2006-07-01

    This paper presents the improvements of MCOR, a Monte Carlo depletion code system for fuel assembly reference calculations. The improvement of MCOR was initiated through cooperation between Penn State University and AREVA NP to enhance the original Penn State University MCOR version so that it can be used as a new Monte Carlo depletion analysis tool. Essentially, a new depletion module using KORIGEN replaces the existing ORIGEN-S depletion module in MCOR. Furthermore, online burnup cross section generation by the Monte Carlo calculation is implemented in the improved version instead of using a burnup cross section library pre-generated by a transport code. Other code features have also been added to make the new MCOR version easier to use. In addition, this paper presents comparisons of the results of the original and the improved MCOR versions against CASMO-4 and OCTOPUS. It was observed in the comparisons that there were quite significant improvements of the results in terms of k_inf, fission rate distributions and isotopic contents. (authors)

  17. MCNP (Monte Carlo Neutron Photon) capabilities for nuclear well logging calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Forster, R.A.; Little, R.C.; Briesmeister, J.F.

    The Los Alamos Radiation Transport Code System (LARTCS) consists of state-of-the-art Monte Carlo and discrete ordinates transport codes and data libraries. The general-purpose continuous-energy Monte Carlo code MCNP (Monte Carlo Neutron Photon), part of the LARTCS, provides a computational predictive capability for many applications of interest to the nuclear well logging community. The generalized three-dimensional geometry of MCNP is well suited for borehole-tool models. SABRINA, another component of the LARTCS, is a graphics code that can be used to interactively create a complex MCNP geometry. Users can define many source and tally characteristics with standard MCNP features. The time-dependent capability of the code is essential when modeling pulsed sources. Problems with neutrons, photons, and electrons as either single particles or coupled particles can be calculated with MCNP. The physics of neutron and photon transport and interactions is modeled in detail using the latest available cross-section data. A rich collection of variance reduction features can greatly increase the efficiency of a calculation. MCNP is written in FORTRAN 77 and has been run on a variety of computer systems, from scientific workstations to supercomputers. The next production version of MCNP will include features such as continuous-energy electron transport and a multitasking option. Areas of ongoing research of interest to the well logging community include angle biasing, adaptive Monte Carlo, improved discrete ordinates capabilities, and discrete ordinates/Monte Carlo hybrid development. Los Alamos has requested approval from the Department of Energy to create a Radiation Transport Computational Facility under its User Facility Program to increase external interactions with industry, universities, and other government organizations. 21 refs.

  18. Depletion Calculations Based on Perturbations. Application to the Study of a Rep-Like Assembly at Beginning of Cycle with TRIPOLI-4®.

    NASA Astrophysics Data System (ADS)

    Dieudonne, Cyril; Dumonteil, Eric; Malvagi, Fausto; M'Backé Diop, Cheikh

    2014-06-01

    For several years now, Monte Carlo burnup/depletion codes have appeared which couple a Monte Carlo code, simulating the neutron transport, to deterministic methods that handle the medium depletion due to the neutron flux. Solving the Boltzmann and Bateman equations in such a way makes it possible to track fine 3-dimensional effects and to get rid of the multi-group hypotheses made by deterministic solvers. The counterpart is the prohibitive calculation time due to the Monte Carlo solver being called at each time step. In this paper we present a methodology to avoid these repetitive and time-expensive Monte Carlo simulations and to replace them by perturbation calculations: indeed, the different burnup steps may be seen as perturbations of the isotopic concentrations of an initial Monte Carlo simulation. We first present this method and provide details of the perturbative technique used, namely correlated sampling. We then discuss the implementation of this method in the TRIPOLI-4® code, as well as the precise calculation scheme able to bring a significant speed-up of the depletion calculation. Finally, this technique is used to calculate the depletion of a REP-like assembly, studied at the beginning of its cycle. After validating the method against a reference calculation, we show that it can speed up standard Monte Carlo depletion codes by nearly an order of magnitude.
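
    The key idea of correlated sampling, scoring a perturbed configuration with the same particle histories by carrying a likelihood-ratio weight, can be shown on a deliberately trivial example. The sketch below estimates transmission through a purely absorbing slab for a nominal and a perturbed cross section from a single set of histories; the numbers are made up and the example is far simpler than a depletion calculation.

      import math
      import random

      SIGMA_NOMINAL = 0.50     # 1/cm (made up)
      SIGMA_PERTURBED = 0.55   # 1/cm, e.g. a slightly modified composition (made up)
      THICKNESS = 5.0          # cm
      N_HISTORIES = 200_000

      transmitted_nominal = 0.0
      transmitted_perturbed = 0.0
      for _ in range(N_HISTORIES):
          s = -math.log(1.0 - random.random()) / SIGMA_NOMINAL   # distance to absorption
          if s > THICKNESS:
              transmitted_nominal += 1.0
              # Likelihood ratio of "crosses the slab uncollided" under the perturbed cross section
              transmitted_perturbed += math.exp(-(SIGMA_PERTURBED - SIGMA_NOMINAL) * THICKNESS)

      print("nominal   transmission:", transmitted_nominal / N_HISTORIES)    # ~ exp(-0.50*5) = 0.082
      print("perturbed transmission:", transmitted_perturbed / N_HISTORIES)  # ~ exp(-0.55*5) = 0.064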

  19. PyMercury: Interactive Python for the Mercury Monte Carlo Particle Transport Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Iandola, F N; O'Brien, M J; Procassini, R J

    2010-11-29

    Monte Carlo particle transport applications are often written in low-level languages (C/C++) for optimal performance on clusters and supercomputers. However, this development approach often sacrifices straightforward usability and testing in the interest of fast application performance. To improve usability, some high-performance computing applications employ mixed-language programming with high-level and low-level languages. In this study, we consider the benefits of incorporating an interactive Python interface into a Monte Carlo application. With PyMercury, a new Python extension to the Mercury general-purpose Monte Carlo particle transport code, we improve application usability without diminishing performance. In two case studies, we illustrate how PyMercury improves usability and simplifies testing and validation in a Monte Carlo application. In short, PyMercury demonstrates the value of interactive Python for Monte Carlo particle transport applications. In the future, we expect interactive Python to play an increasingly significant role in Monte Carlo usage and testing.

  20. SUPREM-DSMC: A New Scalable, Parallel, Reacting, Multidimensional Direct Simulation Monte Carlo Flow Code

    NASA Technical Reports Server (NTRS)

    Campbell, David; Wysong, Ingrid; Kaplan, Carolyn; Mott, David; Wadsworth, Dean; VanGilder, Douglas

    2000-01-01

    An AFRL/NRL team has recently been selected to develop a scalable, parallel, reacting, multidimensional (SUPREM) Direct Simulation Monte Carlo (DSMC) code for the DoD user community under the High Performance Computing Modernization Office (HPCMO) Common High Performance Computing Software Support Initiative (CHSSI). This paper will introduce the JANNAF Exhaust Plume community to this three-year development effort and present the overall goals, schedule, and current status of this new code.

  1. MCNP capabilities for nuclear well logging calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Forster, R.A.; Little, R.C.; Briesmeister, J.F.

    The Los Alamos Radiation Transport Code System (LARTCS) consists of state-of-the-art Monte Carlo and discrete ordinates transport codes and data libraries. This paper discusses how the general-purpose continuous-energy Monte Carlo code MCNP (Monte Carlo Neutron Photon), part of the LARTCS, provides a computational predictive capability for many applications of interest to the nuclear well logging community. The generalized three-dimensional geometry of MCNP is well suited for borehole-tool models. SABRINA, another component of the LARTCS, is a graphics code that can be used to interactively create a complex MCNP geometry. Users can define many source and tally characteristics with standard MCNP features. The time-dependent capability of the code is essential when modeling pulsed sources. Problems with neutrons, photons, and electrons as either single particles or coupled particles can be calculated with MCNP. The physics of neutron and photon transport and interactions is modeled in detail using the latest available cross-section data.

  2. Preliminary estimates of nucleon fluxes in a water target exposed to solar-flare protons: BRYNTRN versus Monte Carlo code

    NASA Technical Reports Server (NTRS)

    Shinn, Judy L.; Wilson, John W.; Lone, M. A.; Wong, P. Y.; Costen, Robert C.

    1994-01-01

    A baryon transport code (BRYNTRN) has previously been verified using available Monte Carlo results for a solar-flare spectrum as the reference. Excellent results were obtained, but the comparisons were limited to the available data on dose and dose equivalent for moderate penetration studies that involve minor contributions from secondary neutrons. To further verify the code, the secondary energy spectra of protons and neutrons are calculated using BRYNTRN and LAHET (Los Alamos High-Energy Transport code, which is a Monte Carlo code). These calculations are compared for three locations within a water slab exposed to the February 1956 solar-proton spectrum. Reasonable agreement was obtained when various considerations related to the calculational techniques and their limitations were taken into account. Although the Monte Carlo results are preliminary, it appears that the neutron albedo, which is not currently treated in BRYNTRN, might be a cause for the large discrepancy seen at small penetration depths. It also appears that the nonelastic neutron production cross sections in BRYNTRN may underestimate the number of neutrons produced in proton collisions with energies below 200 MeV. The notion that the poor energy resolution in BRYNTRN may cause a large truncation error in neutron elastic scattering requires further study.

  3. Experimental benchmarking of a Monte Carlo dose simulation code for pediatric CT

    NASA Astrophysics Data System (ADS)

    Li, Xiang; Samei, Ehsan; Yoshizumi, Terry; Colsher, James G.; Jones, Robert P.; Frush, Donald P.

    2007-03-01

    In recent years, there has been a desire to reduce CT radiation dose to children because of their susceptibility and prolonged risk for cancer induction. Concerns arise, however, as to the impact of dose reduction on image quality and thus potentially on diagnostic accuracy. To study the dose and image quality relationship, we are developing a simulation code to calculate organ dose in pediatric CT patients. To benchmark this code, a cylindrical phantom was built to represent a pediatric torso, which allows measurements of dose distributions from its center to its periphery. Dose distributions for axial CT scans were measured on a 64-slice multidetector CT (MDCT) scanner (GE Healthcare, Chalfont St. Giles, UK). The same measurements were simulated using a Monte Carlo code (PENELOPE, Universitat de Barcelona) with the applicable CT geometry including bowtie filter. The deviations between simulated and measured dose values were generally within 5%. To our knowledge, this work is one of the first attempts to compare measured radial dose distributions on a cylindrical phantom with Monte Carlo simulated results. It provides a simple and effective method for benchmarking organ dose simulation codes and demonstrates the potential of Monte Carlo simulation for investigating the relationship between dose and image quality for pediatric CT patients.

  4. SU-E-J-205: Monte Carlo Modeling of Ultrasound Probes for Real-Time Ultrasound Image-Guided Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hristov, D; Schlosser, J; Bazalova, M

    2014-06-01

    Purpose: To quantify the effect of ultrasound (US) probe beam attenuation for radiation therapy delivered under real-time US image guidance by means of Monte Carlo (MC) simulations. Methods: MC models of two Philips US probes, an X6-1 matrix-array transducer and a C5-2 curved-array transducer, were built based on their CT images in the EGSnrc BEAMnrc and DOSXYZnrc codes. Due to the metal parts, the probes were scanned in a Tomotherapy machine with a 3.5 MV beam. Mass densities in the probes were assigned based on an electron density calibration phantom consisting of cylinders with mass densities between 0.2 and 8.0 g/cm^3. Beam attenuation due to the probes was measured in a solid water phantom for 6 MV and 15 MV 15×15 cm^2 beams delivered on a Varian Trilogy linear accelerator. The dose was measured with the PTW-729 ionization chamber array at two depths and compared to MC simulations. The extreme-case beam attenuation expected in robotic US image-guided radiotherapy, with the probes in the upright position, was quantified by means of MC simulations. Results: The 3.5 MV CT number to mass density calibration curve was found to be linear with R^2 > 0.99. The maximum mass densities were 4.6 and 4.2 g/cm^3 in the C5-2 and X6-1 probes, respectively. Gamma analysis of the simulated and measured doses revealed that over 98% of measurement points passed the 3%/3mm criteria for both probes and both measurement depths. The extreme attenuation for probes in the upright position was found to be 25% and 31% for the C5-2 and X6-1 probes, respectively, for both 6 and 15 MV beams at 10 cm depth. Conclusion: MC models of two US probes used for real-time image guidance during radiotherapy have been built. As a result, radiotherapy treatment planning with the imaging probes in place can now be performed. J Schlosser is an employee of SoniTrack Systems, Inc. D Hristov has financial interest in SoniTrack Systems, Inc.

  5. Force field development with GOMC, a fast new Monte Carlo molecular simulation code

    NASA Astrophysics Data System (ADS)

    Mick, Jason Richard

    In this work GOMC (GPU Optimized Monte Carlo), a new fast, flexible, and free molecular Monte Carlo code for the simulation of atomistic chemical systems, is presented. The results of a large Lennard-Jonesium simulation in the Gibbs ensemble are presented. Force fields developed using the code are also presented. To fit the models, a quantitative fitting process using a scoring function and heat maps is outlined. The presented n-6 force fields include force fields for noble gases and branched alkanes. These force fields are shown to be the most accurate LJ or n-6 force fields to date for these compounds, capable of reproducing pure fluid behavior and binary mixture behavior to a high degree of accuracy.
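
    The abstract does not give the exact scoring function, so the sketch below only illustrates the general workflow: scan a grid of (epsilon, sigma) pairs, score each against reference data, and the resulting matrix is the heat map that guides the fit. simulate_density() is a crude analytic stand-in for a GOMC Gibbs-ensemble run, and all numbers are placeholders.

      import numpy as np

      T_REF = np.array([90.0, 100.0, 110.0])      # K, placeholder state points
      RHO_REF = np.array([1.38, 1.31, 1.24])      # g/cm^3, placeholder reference liquid densities

      def simulate_density(epsilon, sigma, temperatures):
          # Stand-in for a Monte Carlo simulation: a crude analytic surrogate.
          return 1.6 * (sigma / 3.4) ** -3 * np.exp(-temperatures / (12.0 * epsilon))

      def score(epsilon, sigma):
          # Generic scoring function: RMS relative error against the reference data.
          rho = simulate_density(epsilon, sigma, T_REF)
          return float(np.sqrt(np.mean(((rho - RHO_REF) / RHO_REF) ** 2)))

      eps_grid = np.linspace(100.0, 140.0, 21)    # K
      sig_grid = np.linspace(3.2, 3.6, 21)        # Angstrom
      heat_map = np.array([[score(e, s) for s in sig_grid] for e in eps_grid])

      best = np.unravel_index(np.argmin(heat_map), heat_map.shape)
      print("best (epsilon, sigma):", eps_grid[best[0]], sig_grid[best[1]])
      # heat_map can be rendered with matplotlib's imshow to visualise the fit landscape.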

  6. Monte Carlo Simulation of a Segmented Detector for Low-Energy Electron Antineutrinos

    NASA Astrophysics Data System (ADS)

    Qomi, H. Akhtari; Safari, M. J.; Davani, F. Abbasi

    2017-11-01

    Detection of low-energy electron antineutrinos is of importance for several purposes, such as ex-vessel reactor monitoring, neutrino oscillation studies, etc. The inverse beta decay (IBD) is the interaction responsible for the detection mechanism in (organic) plastic scintillation detectors. Here, a detailed study is presented dealing with the radiation and optical transport simulation of a typical segmented antineutrino detector with the Monte Carlo method, using the MCNPX and FLUKA codes. This study shows different aspects of the detector, benefiting from the inherent capabilities of the Monte Carlo simulation codes.

  7. Tally and geometry definition influence on the computing time in radiotherapy treatment planning with MCNP Monte Carlo code.

    PubMed

    Juste, B; Miro, R; Gallardo, S; Santos, A; Verdu, G

    2006-01-01

    The present work simulates the photon and electron transport in a Theratron 780 (MDS Nordion) 60Co radiotherapy unit using the Monte Carlo transport code MCNP (Monte Carlo N-Particle), version 5. In order to become computationally more efficient with a view to practical radiotherapy treatment planning, this work focuses mainly on the analysis of the dose results and on the computing time required by the different tallies applied in the model to speed up the calculations.

  8. EUPDF: An Eulerian-Based Monte Carlo Probability Density Function (PDF) Solver. User's Manual

    NASA Technical Reports Server (NTRS)

    Raju, M. S.

    1998-01-01

    EUPDF is an Eulerian-based Monte Carlo PDF solver developed for application with sprays, combustion, parallel computing and unstructured grids. It is designed to be massively parallel and could easily be coupled with any existing gas-phase flow and spray solvers. The solver accommodates the use of an unstructured mesh with mixed elements of triangular, quadrilateral, and/or tetrahedral type. The manual provides the user with the coding required to couple the PDF code to any given flow code, and a basic understanding of the EUPDF code structure as well as the models involved in the PDF formulation. The source code of EUPDF will be available with the release of the National Combustion Code (NCC) as a complete package.

  9. Radiation dose enhancement in skin therapy with nanoparticle addition: A Monte Carlo study on kilovoltage photon and megavoltage electron beams

    PubMed Central

    Zheng, Xiao J; Chow, James C L

    2017-01-01

    AIM To investigate the dose enhancement due to the incorporation of nanoparticles in skin therapy using kilovoltage (kV) photon and megavoltage (MV) electron beams. Monte Carlo simulations were used to predict the dose enhancement when different types and concentrations of nanoparticles were added to skin target layers of varying thickness. METHODS Clinical kV photon beams (105 and 220 kVp) and MV electron beams (4 and 6 MeV), produced by a Gulmay D3225 orthovoltage unit and a Varian 21 EX linear accelerator, were simulated using the EGSnrc Monte Carlo code. Doses at skin target layers with thicknesses ranging from 0.5 to 5 mm for the photon beams and 0.5 to 10 mm for the electron beams were determined. The skin target layer was loaded with Au, Pt, I, Ag and Fe2O3 nanoparticles at concentrations ranging from 3 to 40 mg/mL. The dose enhancement ratio (DER), defined as the dose at the target layer with nanoparticle addition divided by the dose at the layer without nanoparticle addition, was calculated for each nanoparticle type, nanoparticle concentration and target layer thickness. RESULTS It was found that among all nanoparticles, Au had the highest DER (5.2-6.3) when irradiated with kV photon beams. The dependence of the DER on the target layer thickness was not significant for the 220 kVp photon beam, but it was for the 105 kVp beam at Au nanoparticle concentrations higher than 18 mg/mL. For the other nanoparticles, the DER depended on the atomic number of the nanoparticle and the energy spectrum of the photon beams. All nanoparticles showed an increase of DER with nanoparticle concentration during the photon beam irradiations, regardless of thickness. For electron beams, the Au nanoparticles were found to have the highest DER (1.01-1.08) when the beam energy was 4 MeV, but this was drastically lower than the DER values found using photon beams. The DER was also found to be affected by the depth of maximum dose of the electron beam and the target thickness. For the other nanoparticles with lower atomic number, DERs in the range of 0.99-1.02 were found using the 4 and 6 MeV electron beams. CONCLUSION In nanoparticle-enhanced skin therapy, Au nanoparticle addition can achieve the highest dose enhancement with 105 kVp photon beams. Electron beams, while popular for skin therapy, did not produce dose enhancements as high as kV photon beams. Additionally, the DER is dependent on nanoparticle type, nanoparticle concentration, skin target thickness and the energies of the photon and electron beams. PMID:28298966
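
    The dose enhancement ratio defined above is simply the ratio of two matched Monte Carlo scores; a minimal sketch with placeholder doses:

      # DER = dose to the target layer with nanoparticles / dose to the same layer without them.
      # The two dose values below are placeholders, not results from the study.
      dose_with_nanoparticles = 4.1e-14      # Gy per incident particle (placeholder)
      dose_without_nanoparticles = 7.5e-15   # Gy per incident particle (placeholder)

      der = dose_with_nanoparticles / dose_without_nanoparticles
      print(f"DER = {der:.2f}")   # ~5.5, of the same order as the kV-photon values quoted above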

  10. Monte Carlo calculated correction factors for diodes and ion chambers in small photon fields.

    PubMed

    Czarnecki, D; Zink, K

    2013-04-21

    The application of small photon fields in modern radiotherapy requires the determination of total scatter factors S_cp or field factors Ω^{f_clin,f_msr}_{Q_clin,Q_msr} with high precision. Both quantities require knowledge of the field-size- and detector-dependent correction factor k^{f_clin,f_msr}_{Q_clin,Q_msr} (hereafter simply k). The aim of this study is the determination of k for different types of detectors in a clinical 6 MV photon beam of a Siemens KD linear accelerator. The EGSnrc Monte Carlo code was used to calculate the dose to water and the dose to different detectors to determine the field factor as well as the correction factor for different small square field sizes. In addition, the mean water-to-air stopping-power ratio and the ratio of the mean energy absorption coefficients for the relevant materials were calculated for different small field sizes. As the beam source, a Monte Carlo based model of a Siemens KD linear accelerator was used. The results show that in the case of ionization chambers the detector volume has the largest impact on k; this perturbation may contribute up to 50% to the correction factor. Field-dependent changes in stopping-power ratios are negligible. The magnitude of k is of the order of 1.2 at a field size of 1 × 1 cm^2 for the large-volume ion chamber PTW31010 and is still in the range 1.05-1.07 for the PinPoint chambers PTW31014 and PTW31016. For the diode detectors included in this study (PTW60016, PTW60017), the correction factor deviates by no more than 2% from unity for field sizes between 10 × 10 and 1 × 1 cm^2, but below this field size there is a steep decrease of k below unity, i.e. a strong overestimation of the dose. Besides the field-size and detector dependence, the results reveal a clear dependence of the correction factor on the accelerator geometry for field sizes below 1 × 1 cm^2, i.e. on the beam spot size of the primary electrons hitting the target. This effect is especially pronounced for the ionization chambers. In conclusion, comparing all detectors, the unshielded diode PTW60017 is highly recommended for small-field dosimetry, since its correction factor k is closest to unity in small fields and largely independent of the electron beam spot size.
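
    The correction factor is assembled from the four Monte Carlo doses named above: dose to water and dose to the detector, each scored in the clinical (small) field and in the machine-specific reference (msr) field. The snippet below shows that standard combination with placeholder numbers, not values from the study.

      # k^{f_clin,f_msr}_{Q_clin,Q_msr} = (D_w / D_det)|_clin / (D_w / D_det)|_msr
      D_w_clin, D_det_clin = 6.45e-17, 5.30e-17   # Gy/history in a 1 x 1 cm^2 field (placeholders)
      D_w_msr, D_det_msr = 8.10e-17, 7.95e-17     # Gy/history in the 10 x 10 cm^2 reference field (placeholders)

      k = (D_w_clin / D_det_clin) / (D_w_msr / D_det_msr)
      print(f"k = {k:.3f}")   # ~1.19, of the order quoted above for a large-volume chamber at 1 x 1 cm^2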

  11. ME(SSY)**2: Monte Carlo Code for Star Cluster Simulations

    NASA Astrophysics Data System (ADS)

    Freitag, Marc Dewi

    2013-02-01

    ME(SSY)**2 stands for "Monte-carlo Experiments with Spherically SYmmetric Stellar SYstems." This code simulates the long-term evolution of spherical clusters of stars; it was devised specifically to treat dense galactic nuclei. It is based on the pioneering Monte Carlo scheme proposed by Hénon in the 1970s and includes all relevant physical ingredients (two-body relaxation, stellar mass spectrum, collisions, tidal disruption, etc.). It is essentially a Monte Carlo resolution of the Fokker-Planck equation. It can cope with any stellar mass spectrum or velocity distribution. Being a particle-based method, it also allows one to take stellar collisions into account in a very realistic way. This unique code, featuring the most important physical processes, allows million-particle simulations spanning a Hubble time in a few CPU days on standard personal computers and provides a wealth of data rivalled only by N-body simulations. The current version of the software requires the use of routines from the "Numerical Recipes in Fortran 77" (http://www.nrbook.com/a/bookfpdf.php).

  12. LLNL Mercury Project Trinity Open Science Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dawson, Shawn A.

    The Mercury Monte Carlo particle transport code is used to simulate the transport of radiation through urban environments. These challenging calculations include complicated geometries and require significant computational resources to complete. In the proposed Trinity Open Science calculations, I will investigate computer science aspects of the code which are relevant to convergence of the simulation quantities with increasing Monte Carlo particle counts.

  13. A Monte-Carlo Benchmark of TRIPOLI-4® and MCNP on ITER neutronics

    NASA Astrophysics Data System (ADS)

    Blanchet, David; Pénéliau, Yannick; Eschbach, Romain; Fontaine, Bruno; Cantone, Bruno; Ferlet, Marc; Gauthier, Eric; Guillon, Christophe; Letellier, Laurent; Proust, Maxime; Mota, Fernando; Palermo, Iole; Rios, Luis; Guern, Frédéric Le; Kocan, Martin; Reichle, Roger

    2017-09-01

    Radiation protection and shielding studies are often based on the extensive use of 3D Monte-Carlo neutron and photon transport simulations. The ITER organization hence recommends the use of the MCNP-5 code (version 1.60), in association with the FENDL-2.1 neutron cross section data library, specifically dedicated to fusion applications. The MCNP reference model of the ITER tokamak, the 'C-lite', is being continuously developed and improved. This article proposes to develop an alternative model, equivalent to the 'C-lite', for the Monte-Carlo code TRIPOLI-4®. A benchmark study is defined to test this new model. Since one of the most critical areas for ITER neutronics analysis concerns the assessment of radiation levels and Shutdown Dose Rates (SDDR) behind the Equatorial Port Plugs (EPP), the benchmark is conducted to compare the neutron flux through the EPP. This problem is quite challenging given the complex geometry and the strong neutron flux attenuation, ranging from 10^14 down to 10^8 n·cm^-2·s^-1. Such a code-to-code comparison provides independent validation of the Monte-Carlo simulations, improving the confidence in neutronics results.

  14. AREVA Developments for an Efficient and Reliable use of Monte Carlo codes for Radiation Transport Applications

    NASA Astrophysics Data System (ADS)

    Chapoutier, Nicolas; Mollier, François; Nolin, Guillaume; Culioli, Matthieu; Mace, Jean-Reynald

    2017-09-01

    In the context of the rise of Monte Carlo transport calculations for all kinds of applications, AREVA recently improved its suite of engineering tools in order to produce efficient Monte Carlo workflows. Monte Carlo codes, such as MCNP or TRIPOLI, are recognized as reference codes for dealing with a large range of radiation transport problems. However, the inherent drawbacks of these codes (laborious input file creation and long computation times) contrast with the maturity of the treatment of the physical phenomena. The goal of the recent AREVA developments was to reach an efficiency similar to that of other mature engineering disciplines such as finite element analysis (e.g. structural or fluid dynamics). Among the main objectives, the creation of a graphical user interface offering CAD tools for geometry creation and other graphical features dedicated to the radiation field (source definition, tally definition) has been achieved. Computation times are drastically reduced compared to a few years ago thanks to the use of massively parallel runs and, above all, the implementation of hybrid variance reduction techniques. Engineering teams are now able to deliver much more prompt support to nuclear projects dealing with reactors or fuel cycle facilities, from the conceptual phase to decommissioning.

  15. Accelerating execution of the integrated TIGER series Monte Carlo radiation transport codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, L.M.; Hochstedler, R.D.

    1997-02-01

    Execution of the integrated TIGER series (ITS) of coupled electron/photon Monte Carlo radiation transport codes has been accelerated by modifying the FORTRAN source code for more efficient computation. Each member code of ITS was benchmarked and profiled with a specific test case that directed the acceleration effort toward the most computationally intensive subroutines. Techniques for accelerating these subroutines included replacing linear search algorithms with binary versions, replacing the pseudo-random number generator, reducing program memory allocation, and proofing the input files for geometrical redundancies. All techniques produced identical or statistically similar results to the original code. Final benchmark timing of the accelerated code resulted in speed-up factors of 2.00 for TIGER (the one-dimensional slab geometry code), 1.74 for CYLTRAN (the two-dimensional cylindrical geometry code), and 1.90 for ACCEPT (the arbitrary three-dimensional geometry code).
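
    One of the named techniques, replacing a linear search with a binary search, typically applies to locating the energy interval in a sorted cross-section grid. The sketch below shows the two equivalent lookups in Python (the ITS codes themselves are FORTRAN); the grid values are placeholders.

      import bisect

      energy_grid = [1e-3, 1e-2, 1e-1, 1.0, 5.0, 10.0, 20.0]   # MeV, placeholder grid (sorted)

      def find_interval_linear(grid, energy):
          # O(n) scan, as in the original code path
          for i in range(len(grid) - 1):
              if grid[i] <= energy < grid[i + 1]:
                  return i
          return len(grid) - 2

      def find_interval_binary(grid, energy):
          # O(log n) lookup via bisection
          i = bisect.bisect_right(grid, energy) - 1
          return min(max(i, 0), len(grid) - 2)

      # Both versions return the same interval index, which is the acceptance criterion
      # used above ("identical or statistically similar results").
      assert all(find_interval_linear(energy_grid, e) == find_interval_binary(energy_grid, e)
                 for e in (0.005, 0.3, 4.2, 15.0))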

  16. Energetic properties' investigation of removing flattening filter at phantom surface: Monte Carlo study using BEAMnrc code, DOSXYZnrc code and BEAMDP code

    NASA Astrophysics Data System (ADS)

    Bencheikh, Mohamed; Maghnouj, Abdelmajid; Tajmouati, Jaouad

    2017-11-01

    The Monte Carlo method is considered the most accurate method for dose calculation in radiotherapy and for beam characterization studies. In this study, the Varian Clinac 2100 medical linear accelerator was modelled with and without the flattening filter (FF). The objective was to determine the impact of the flattening filter on the energy properties of particles at the phantom surface in terms of energy fluence, mean energy, and energy fluence distribution. The Monte Carlo codes used in this study were BEAMnrc for simulating the linac head, DOSXYZnrc for simulating the absorbed dose in a water phantom, and BEAMDP for extracting the energy properties. The field size was 10 × 10 cm^2, the simulated photon beam energy was 6 MV and the SSD was 100 cm. The Monte Carlo geometry was validated by a gamma index acceptance rate of 99% for PDDs and 98% for dose profiles; the gamma criteria were 3% for dose difference and 3 mm for distance to agreement. Without the FF, the electron contribution increased by more than 300% in energy fluence, almost 14% in mean energy and 1900% in energy fluence distribution, whereas the photon contribution increased by 50% in energy fluence, almost 18% in mean energy and almost 35% in energy fluence distribution. Removing the flattening filter therefore increases the electron contamination energy relative to the photon energy; this study can contribute to the development of flattening-filter-free configurations in future linacs.
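
    The 3%/3 mm gamma criterion used for the validation can be written compactly for a 1D dose profile: for each reference point, take the minimum over evaluated points of the combined dose-difference/distance metric and count the fraction with gamma <= 1. The NumPy sketch below uses toy depth-dose curves and a global dose-difference normalisation; it is a generic illustration, not the evaluation tool used in the study.

      import numpy as np

      def gamma_1d(x_ref, d_ref, x_eval, d_eval, dose_diff=0.03, dta_mm=3.0):
          """Gamma value at each reference point (global dose difference, brute-force search)."""
          d_norm = dose_diff * d_ref.max()
          gammas = []
          for xr, dr in zip(x_ref, d_ref):
              capital_gamma_sq = ((d_eval - dr) / d_norm) ** 2 + ((x_eval - xr) / dta_mm) ** 2
              gammas.append(np.sqrt(capital_gamma_sq.min()))
          return np.array(gammas)

      depth = np.linspace(0.0, 100.0, 201)            # mm
      pdd_reference = 100.0 * np.exp(-0.030 * depth)  # toy percentage depth dose
      pdd_evaluated = 100.0 * np.exp(-0.031 * depth)  # toy "simulation" to be validated

      gamma = gamma_1d(depth, pdd_reference, depth, pdd_evaluated)
      print(f"gamma pass rate: {100.0 * np.mean(gamma <= 1.0):.1f}%")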

  17. Advanced Computational Methods for Monte Carlo Calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B.

    This course is intended for graduate students who already have a basic understanding of Monte Carlo methods. It focuses on advanced topics that may be needed for thesis research, for developing new state-of-the-art methods, or for working with modern production Monte Carlo codes.

  18. Recent advances and future prospects for Monte Carlo

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B

    2010-01-01

    The history of Monte Carlo methods is closely linked to that of computers: the first known Monte Carlo program was written in 1947 for the ENIAC; a pre-release of the first Fortran compiler was used for Monte Carlo in 1957; Monte Carlo codes were adapted to vector computers in the 1980s, clusters and parallel computers in the 1990s, and teraflop systems in the 2000s. Recent advances include hierarchical parallelism, combining threaded calculations on multicore processors with message passing among different nodes. With the advances in computing, Monte Carlo codes have evolved with new capabilities and new ways of use. Production codes such as MCNP, MVP, MONK, TRIPOLI and SCALE are now 20-30 years old (or more) and are very rich in advanced features. The former 'method of last resort' has now become the first choice for many applications. Calculations are now routinely performed on office computers, not just on supercomputers. Current research and development efforts are investigating the use of Monte Carlo methods on FPGAs, GPUs, and many-core processors. Other far-reaching research is exploring ways to adapt Monte Carlo methods to future exaflop systems that may have 1M or more concurrent computational processes.

  19. Preliminary results of 3D dose calculations with MCNP-4B code from a SPECT image.

    PubMed

    Rodríguez Gual, M; Lima, F F; Sospedra Alfonso, R; González González, J; Calderón Marín, C

    2004-01-01

    Interface software was developed to generate the input file for the Monte Carlo MCNP-4B code from medical images in Interfile format version 3.3. The software was tested using a spherical phantom of tomography slices with a known cumulated activity distribution in Interfile format, generated with the IMAGAMMA medical image processing system. The 3D dose calculation obtained with the Monte Carlo MCNP-4B code was compared with the voxel S factor method. The results show a relative error between the two methods of less than 1%.
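    The voxel S factor method used as the benchmark here computes absorbed dose as a discrete convolution of the cumulated-activity map with a precomputed voxel S kernel. The sketch below shows only that convolution step; the activity map and kernel are made-up placeholders (real MIRD voxel S values depend on the radionuclide and voxel size and are not reproduced here).

```python
import numpy as np
from scipy.ndimage import convolve

# Hypothetical cumulated-activity map (arbitrary units): a uniform sphere.
shape = (32, 32, 32)
zz, yy, xx = np.indices(shape)
activity = np.where((xx - 16) ** 2 + (yy - 16) ** 2 + (zz - 16) ** 2 <= 8 ** 2, 1.0, 0.0)

# Hypothetical voxel S kernel: self-dose at the centre, falling off with distance.
kz, ky, kx = np.indices((5, 5, 5))
s_kernel = 1.0 / (1.0 + (kx - 2) ** 2 + (ky - 2) ** 2 + (kz - 2) ** 2)
s_kernel /= s_kernel.sum()   # arbitrary normalisation for this sketch

# Voxel S value method: dose = cumulated activity convolved with the S kernel.
dose = convolve(activity, s_kernel, mode="constant", cval=0.0)
print("maximum voxel dose (arbitrary units): %.3f" % dose.max())
```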

  20. PRELIMINARY COUPLING OF THE MONTE CARLO CODE OPENMC AND THE MULTIPHYSICS OBJECT-ORIENTED SIMULATION ENVIRONMENT (MOOSE) FOR ANALYZING DOPPLER FEEDBACK IN MONTE CARLO SIMULATIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matthew Ellis; Derek Gaston; Benoit Forget

    In recent years the use of Monte Carlo methods for modeling reactors has become feasible due to the increasing availability of massively parallel computer systems. One of the primary challenges yet to be fully resolved, however, is the efficient and accurate inclusion of multiphysics feedback in Monte Carlo simulations. The research in this paper presents a preliminary coupling of the open source Monte Carlo code OpenMC with the open source Multiphysics Object-Oriented Simulation Environment (MOOSE). The coupling of OpenMC and MOOSE will be used to investigate efficient and accurate numerical methods needed to include multiphysics feedback in Monte Carlo codes. An investigation into the sensitivity of Doppler feedback to fuel temperature approximations using a two-dimensional 17x17 PWR fuel assembly is presented in this paper. The results show a functioning multiphysics coupling between OpenMC and MOOSE. The coupling utilizes Functional Expansion Tallies to accurately and efficiently transfer pin power distributions tallied in OpenMC to unstructured finite element meshes used in MOOSE. The two-dimensional PWR fuel assembly case also demonstrates that, for a simplified model, the pin-by-pin Doppler feedback can be adequately replicated by scaling a representative pin based on pin relative powers.
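    Functional Expansion Tallies (FETs) represent a tallied distribution as coefficients of an orthogonal basis, typically Legendre polynomials, which can then be evaluated at arbitrary points of a finite element mesh. The snippet below is only a schematic, one-dimensional illustration of that reconstruction step; the coefficients are invented and the actual OpenMC/MOOSE transfer is multi-dimensional and more involved.

```python
import numpy as np
from numpy.polynomial import legendre

# Hypothetical FET coefficients a_n = integral of f(x) * P_n(x) dx on x in [-1, 1].
coeffs = np.array([1.00, 0.00, -0.15, 0.00, 0.02])

def fet_reconstruct(a, x):
    """Evaluate f(x) ~ sum_n a_n * (2n + 1) / 2 * P_n(x)."""
    weighted = a * (2 * np.arange(a.size) + 1) / 2.0
    return legendre.legval(x, weighted)

# Evaluate the reconstructed pin-power shape at the centroids of a 1-D mesh.
centroids = np.linspace(-1.0, 1.0, 17)
print(np.round(fet_reconstruct(coeffs, centroids), 4))
```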

  1. WARP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bergmann, Ryan M.; Rowland, Kelly L.

    2017-04-12

    WARP, which can stand for "Weaving All the Random Particles," is a three-dimensional (3D) continuous energy Monte Carlo neutron transport code developed at UC Berkeley to efficiently execute on NVIDIA graphics processing unit (GPU) platforms. WARP accelerates Monte Carlo simulations while preserving the benefits of using the Monte Carlo method, namely, that very few physical and geometrical simplifications are applied. WARP is able to calculate multiplication factors, neutron flux distributions (in both space and energy), and fission source distributions for time-independent neutron transport problems. It can run in either criticality or fixed source mode, but fixed source mode is currently not robust, optimized, or maintained in the newest version. WARP can transport neutrons in unrestricted arrangements of parallelepipeds, hexagonal prisms, cylinders, and spheres. The goal of developing WARP is to investigate algorithms that can grow into a full-featured, continuous energy, Monte Carlo neutron transport code that is accelerated by running on GPUs. The crux of the effort is to make Monte Carlo calculations faster while producing accurate results. Modern supercomputers are commonly being built with GPU coprocessor cards in their nodes to increase their computational efficiency and performance. GPUs execute efficiently on data-parallel problems, but most CPU codes, including those for Monte Carlo neutral particle transport, are predominantly task-parallel. WARP uses a data-parallel neutron transport algorithm to take advantage of the computing power GPUs offer.

  2. Morse Monte Carlo Radiation Transport Code System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Emmett, M.B.

    1975-02-01

    The report contains sections with descriptions of the MORSE and PICTURE codes, input descriptions, sample problems, derivations of the physical equations, and explanations of the various error messages. The MORSE code is a multipurpose neutron and gamma-ray transport Monte Carlo code. Time dependence for both shielding and criticality problems is provided. General three-dimensional geometry may be used with an albedo option available at any material surface. The PICTURE code provides aid in preparing correct input data for the combinatorial geometry package CG. It provides a printed view of arbitrary two-dimensional slices through the geometry. By inspecting these pictures one may determine if the geometry specified by the input cards is indeed the desired geometry. 23 refs. (WRF)

  3. Portable LQCD Monte Carlo code using OpenACC

    NASA Astrophysics Data System (ADS)

    Bonati, Claudio; Calore, Enrico; Coscetti, Simone; D'Elia, Massimo; Mesiti, Michele; Negro, Francesco; Fabio Schifano, Sebastiano; Silvi, Giorgio; Tripiccione, Raffaele

    2018-03-01

    Varying from multi-core CPU processors to many-core GPUs, the present scenario of HPC architectures is extremely heterogeneous. In this context, code portability is increasingly important for easy maintainability of applications; this is relevant in scientific computing where code changes are numerous and frequent. In this talk we present the design and optimization of a state-of-the-art production level LQCD Monte Carlo application, using the OpenACC directives model. OpenACC aims to abstract parallel programming to a descriptive level, where programmers do not need to specify the mapping of the code on the target machine. We describe the OpenACC implementation and show that the same code is able to target different architectures, including state-of-the-art CPUs and GPUs.

  4. High-Fidelity Coupled Monte-Carlo/Thermal-Hydraulics Calculations

    NASA Astrophysics Data System (ADS)

    Ivanov, Aleksandar; Sanchez, Victor; Ivanov, Kostadin

    2014-06-01

    Monte Carlo methods have been used as reference reactor physics calculation tools worldwide. The advance in computer technology allows the calculation of detailed flux distributions in both space and energy. In most cases, however, those calculations are done under the assumption of homogeneous material density and temperature distributions. The aim of this work is to develop a consistent methodology for providing realistic three-dimensional thermal-hydraulic distributions by coupling the in-house developed sub-channel code SUBCHANFLOW with the standard Monte-Carlo transport code MCNP. In addition to the innovative technique of on-the-fly material definition, a flux-based weight-window technique has been introduced to improve both the magnitude and the distribution of the relative errors. Finally, a coupled code system for the simulation of steady-state reactor physics problems has been developed. Besides the problem of effective feedback data interchange between the codes, the treatment of temperature dependence of the continuous energy nuclear data has been investigated.

  5. Positron follow-up in liquid water: I. A new Monte Carlo track-structure code.

    PubMed

    Champion, C; Le Loirec, C

    2006-04-07

    When biological matter is irradiated by charged particles, a wide variety of interactions occur, which lead to a deep modification of the cellular environment. To understand the fine structure of the microscopic distribution of energy deposits, Monte Carlo event-by-event simulations are particularly suitable. However, the development of these track-structure codes needs accurate interaction cross sections for all the electronic processes: ionization, excitation, positronium formation and even elastic scattering. Under these conditions, we have recently developed a Monte Carlo code for positrons in water, the latter being commonly used to simulate the biological medium. All the processes are studied in detail via theoretical differential and total cross-section calculations performed by using partial wave methods. Comparisons with existing theoretical and experimental data in terms of stopping powers, mean energy transfers and ranges show very good agreements. Moreover, thanks to the theoretical description of positronium formation, we have access, for the first time, to the complete kinematics of the electron capture process. Then, the present Monte Carlo code is able to describe the detailed positronium history, which will provide useful information for medical imaging (like positron emission tomography) where improvements are needed to define with the best accuracy the tumoural volumes.

  6. Validation of a personalized dosimetric evaluation tool (Oedipe) for targeted radiotherapy based on the Monte Carlo MCNPX code

    NASA Astrophysics Data System (ADS)

    Chiavassa, S.; Aubineau-Lanièce, I.; Bitar, A.; Lisbona, A.; Barbet, J.; Franck, D.; Jourdain, J. R.; Bardiès, M.

    2006-02-01

    Dosimetric studies are necessary for all patients treated with targeted radiotherapy. In order to attain the precision required, we have developed Oedipe, a dosimetric tool based on the MCNPX Monte Carlo code. The anatomy of each patient is considered in the form of a voxel-based geometry created using computed tomography (CT) images or magnetic resonance imaging (MRI). Oedipe enables dosimetry studies to be carried out at the voxel scale. Validation of the results obtained by comparison with existing methods is complex because there are multiple sources of variation: calculation methods (different Monte Carlo codes, point kernel), patient representations (model or specific) and geometry definitions (mathematical or voxel-based). In this paper, we validate Oedipe by taking each of these parameters into account independently. Monte Carlo methodology requires long calculation times, particularly in the case of voxel-based geometries, and this is one of the limits of personalized dosimetric methods. However, our results show that the use of voxel-based geometry as opposed to a mathematically defined geometry decreases the calculation time two-fold, due to an optimization of the MCNPX2.5e code. It is therefore possible to envisage the use of Oedipe for personalized dosimetry in the clinical context of targeted radiotherapy.

  7. Investigation of practical approaches to evaluating cumulative dose for cone beam computed tomography (CBCT) from standard CT dosimetry measurements: a Monte Carlo study.

    PubMed

    Abuhaimed, Abdullah; Martin, Colin J; Sankaralingam, Marimuthu; Gentle, David J

    2015-07-21

    A function called Gx(L) was introduced by the International Commission on Radiation Units and Measurements (ICRU) Report-87 to facilitate measurement of cumulative dose for CT scans within long phantoms as recommended by the American Association of Physicists in Medicine (AAPM) TG-111. The Gx(L) function is equal to the ratio of the cumulative dose at the middle of a CT scan to the volume weighted CTDI (CTDIvol), and was investigated for conventional multi-slice CT scanners operating with a moving table. As the stationary table mode, which is the basis for cone beam CT (CBCT) scans, differs from that used for conventional CT scans, the aim of this study was to investigate the extension of the Gx(L) function to CBCT scans. An On-Board Imager (OBI) system integrated with a TrueBeam linac was simulated with Monte Carlo EGSnrc/BEAMnrc, and the absorbed dose was calculated within PMMA, polyethylene (PE), and water head and body phantoms using EGSnrc/DOSXYZnrc, where the PE body phantom emulated the ICRU/AAPM phantom. Beams of width 40-500 mm and beam qualities at tube potentials of 80-140 kV were studied. Application of a modified function of beam width (W) termed Gx(W), for which the cumulative dose for CBCT scans f(0) is normalized to the weighted CTDI (CTDIw) for a reference beam of width 40 mm, was investigated as a possible option. However, differences were found in Gx(W) with tube potential, especially for body phantoms, and these were considered to be due to differences in geometry between wide beams used for CBCT scans and those for conventional CT. Therefore, a modified function Gx(W)100 has been proposed, taking the form of values of f(0) at each position in a long phantom, normalized with respect to dose indices f100(150)x measured with a 100 mm pencil ionization chamber within standard 150 mm PMMA phantoms, using the same scanning parameters, beam widths and positions within the phantom. f100(150)x averages the dose resulting from a CBCT scan over the 100 mm chamber length. Like the Gx(L) function, the Gx(W)100 function showed only a weak dependence on tube potential at most positions for the phantoms studied. The results were fitted to polynomial equations from which f(0) within the longer PMMA, PE, or water phantoms can be evaluated from measurements of f100(150)x. Comparisons with other studies suggest that these functions may be suitable for application to any CT or CBCT scan acquired with stationary table mode.
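    In practice, the proposed factor lets the cumulative dose f(0) in a long phantom be estimated from a standard 100 mm pencil-chamber measurement, f100(150)x, by multiplying by Gx(W)100 evaluated at the CBCT beam width. The sketch below shows that bookkeeping with made-up polynomial coefficients; the published fits for each phantom, position and beam width are not reproduced here.

```python
import numpy as np

# Hypothetical polynomial fit of Gx(W)100 versus beam width W (mm) at one position.
# The real coefficients are tabulated in the paper per phantom and position.
gx_w100 = np.poly1d([1.0e-6, -4.0e-4, 0.05, 0.9])

def cumulative_dose_f0(f100_150x_mGy, beam_width_mm):
    """Estimate f(0) in the long phantom from a standard 150 mm PMMA measurement."""
    return gx_w100(beam_width_mm) * f100_150x_mGy

# Example: a hypothetical 198 mm wide CBCT beam and a measured f100(150)x of 10 mGy.
print("f(0) ~ %.1f mGy" % cumulative_dose_f0(10.0, 198.0))
```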

  8. Performance and accuracy of criticality calculations performed using WARP – A framework for continuous energy Monte Carlo neutron transport in general 3D geometries on GPUs

    DOE PAGES

    Bergmann, Ryan M.; Rowland, Kelly L.; Radnović, Nikola; ...

    2017-05-01

    In this companion paper to "Algorithmic Choices in WARP - A Framework for Continuous Energy Monte Carlo Neutron Transport in General 3D Geometries on GPUs" (doi:10.1016/j.anucene.2014.10.039), the WARP Monte Carlo neutron transport framework for graphics processing units (GPUs) is benchmarked against production-level central processing unit (CPU) Monte Carlo neutron transport codes for both performance and accuracy. We compare neutron flux spectra, multiplication factors, runtimes, speedup factors, and costs of various GPU and CPU platforms running either WARP, Serpent 2.1.24, or MCNP 6.1. WARP compares well with the results of the production-level codes, and it is shown that on the newest hardware considered, GPU platforms running WARP are between 0.8 and 7.6 times as fast as CPU platforms running production codes. Also, the GPU platforms running WARP were between 15% and 50% as expensive to purchase and between 80% and 90% as expensive to operate as equivalent CPU platforms performing at an equal simulation rate.

  10. Treating voxel geometries in radiation protection dosimetry with a patched version of the Monte Carlo codes MCNP and MCNPX.

    PubMed

    Burn, K W; Daffara, C; Gualdrini, G; Pierantoni, M; Ferrari, P

    2007-01-01

    The question of Monte Carlo simulation of radiation transport in voxel geometries is addressed. Patched versions of the MCNP and MCNPX codes are developed aimed at transporting radiation both in the standard geometry mode and in the voxel geometry treatment. The patched code reads an unformatted FORTRAN file derived from DICOM format data and uses special subroutines to handle voxel-to-voxel radiation transport. The various phases of the development of the methodology are discussed together with the new input options. Examples are given of employment of the code in internal and external dosimetry and comparisons with results from other groups are reported.

  11. Verification of Three Dimensional Triangular Prismatic Discrete Ordinates Transport Code ENSEMBLE-TRIZ by Comparison with Monte Carlo Code GMVP

    NASA Astrophysics Data System (ADS)

    Homma, Yuto; Moriwaki, Hiroyuki; Ohki, Shigeo; Ikeda, Kazumi

    2014-06-01

    This paper deals with verification of the three-dimensional triangular prismatic discrete ordinates transport calculation code ENSEMBLE-TRIZ by comparison with the multi-group Monte Carlo calculation code GMVP in a large fast breeder reactor. The reactor is a 750 MWe sodium-cooled reactor. Nuclear characteristics are calculated at the beginning of cycle of an initial core and at the beginning and end of cycle of an equilibrium core. According to the calculations, the differences between the two methodologies are smaller than 0.0002 Δk in the multiplication factor, about 1% (relative) in the control rod reactivity, and 1% in the sodium void reactivity.

  12. Monte Carlo Calculations of Polarized Microwave Radiation Emerging from Cloud Structures

    NASA Technical Reports Server (NTRS)

    Kummerow, Christian; Roberti, Laura

    1998-01-01

    The last decade has seen tremendous growth in cloud dynamical and microphysical models that are able to simulate storms and storm systems with very high spatial resolution, typically of the order of a few kilometers. The fairly realistic distributions of cloud and hydrometeor properties that these models generate have in turn led to a renewed interest in the three-dimensional microwave radiative transfer modeling needed to understand the effect of cloud and rainfall inhomogeneities upon microwave observations. Monte Carlo methods, and particularly backwards Monte Carlo methods, have shown themselves to be very desirable due to the quick convergence of the solutions. Unfortunately, backwards Monte Carlo methods are not well suited to treat polarized radiation. This study reviews the existing Monte Carlo methods and presents a new polarized Monte Carlo radiative transfer code. The code is based on a forward scheme but uses aliasing techniques to keep the computational requirements equivalent to the backwards solution. Radiative transfer computations have been performed using a microphysical-dynamical cloud model and the results are presented together with the algorithm description.

  13. A modern Monte Carlo investigation of the TG-43 dosimetry parameters for an {sup 125}I seed already having AAPM consensus data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aryal, Prakash; Molloy, Janelle A.; Rivard, Mark J., E-mail: mark.j.rivard@gmail.com

    2014-02-15

    Purpose: To investigate potential causes for differences in TG-43 brachytherapy dosimetry parameters in the existent literature for the model IAI-125A 125I seed and to propose new standard dosimetry parameters. Methods: The MCNP5 code was used for Monte Carlo (MC) simulations. Sensitivity of dose distributions, and subsequently TG-43 dosimetry parameters, was explored to reproduce historical methods upon which American Association of Physicists in Medicine (AAPM) consensus data are based. Twelve simulation conditions varying 125I coating thickness, coating mass density, photon interaction cross-section library, and photon emission spectrum were examined. Results: Varying 125I coating thickness, coating mass density, photon cross-section library, and photon emission spectrum for the model IAI-125A seed changed the dose-rate constant by up to 0.9%, about 1%, about 3%, and 3%, respectively, in comparison to the proposed standard value of 0.922 cGy h^-1 U^-1. The dose-rate constant values by Solberg et al. ["Dosimetric parameters of three new solid core 125I brachytherapy sources," J. Appl. Clin. Med. Phys. 3, 119-134 (2002)], Meigooni et al. ["Experimental and theoretical determination of dosimetric characteristics of IsoAid ADVANTAGE™ 125I brachytherapy source," Med. Phys. 29, 2152-2158 (2002)], and Taylor and Rogers ["An EGSnrc Monte Carlo-calculated database of TG-43 parameters," Med. Phys. 35, 4228-4241 (2008)] for the model IAI-125A seed and Kennedy et al. ["Experimental and Monte Carlo determination of the TG-43 dosimetric parameters for the model 9011 THINSeed™ brachytherapy source," Med. Phys. 37, 1681-1688 (2010)] for the model 6711 seed were +4.3% (0.962 cGy h^-1 U^-1), +6.2% (0.98 cGy h^-1 U^-1), +0.3% (0.925 cGy h^-1 U^-1), and -0.2% (0.921 cGy h^-1 U^-1), respectively, in comparison to the proposed standard value. Differences in the radial dose functions between the current study and both Solberg et al. and Meigooni et al. were <10% for r ≤ 5 cm, and increased for r > 5 cm with a maximum difference of 29% at r = 9 cm. In comparison to Taylor and Rogers, these differences were lower (maximum of 2% at r = 9 cm). For the similarly designed model 6711 125I seed, differences did not exceed 0.5% for 0.5 ≤ r ≤ 10 cm. Radial dose function values varied by 1% as coating thickness and coating density were changed. Varying the cross-section library and source spectrum altered the radial dose function by 25% and 12%, respectively, but these differences occurred at r = 10 cm where the dose rates were very low. The 2D anisotropy function results were most similar to those of Solberg et al. and most different to those of Meigooni et al. The observed order of simulation condition variables from most to least important for influencing the 2D anisotropy function was spectrum, coating thickness, coating density, and cross-section library. Conclusions: Several MC radiation transport codes are available for calculation of the TG-43 dosimetry parameters for brachytherapy seeds. The physics models in these codes and their related cross-section libraries have been updated and improved since publication of the 2007 AAPM TG-43U1S1 report. Results using modern data indicated statistically significant differences in these dosimetry parameters in comparison to data recommended in the TG-43U1S1 report. Therefore, it seems that professional societies such as the AAPM should consider reevaluating the consensus data for this and other seeds and establishing a process of regular evaluations in which consensus data are based upon methods that remain state-of-the-art.

  14. Monte Carlo Techniques for Nuclear Systems - Theory Lectures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B.

    These are lecture notes for a Monte Carlo class given at the University of New Mexico. The following topics are covered: course information; nuclear eng. review & MC; random numbers and sampling; computational geometry; collision physics; tallies and statistics; eigenvalue calculations I; eigenvalue calculations II; eigenvalue calculations III; variance reduction; parallel Monte Carlo; parameter studies; fission matrix and higher eigenmodes; Doppler broadening; Monte Carlo depletion; HTGR modeling; coupled MC and T/H calculations; fission energy deposition. Solving particle transport problems with the Monte Carlo method is simple - just simulate the particle behavior. The devil is in the details, however. These lectures provide a balanced approach to the theory and practice of Monte Carlo simulation codes. The first lectures provide an overview of Monte Carlo simulation methods, covering the transport equation, random sampling, computational geometry, collision physics, and statistics. The next lectures focus on the state-of-the-art in Monte Carlo criticality simulations, covering the theory of eigenvalue calculations, convergence analysis, dominance ratio calculations, bias in Keff and tallies, bias in uncertainties, a case study of a realistic calculation, and Wielandt acceleration techniques. The remaining lectures cover advanced topics, including HTGR modeling and stochastic geometry, temperature dependence, fission energy deposition, depletion calculations, parallel calculations, and parameter studies. This portion of the class focuses on using MCNP to perform criticality calculations for reactor physics and criticality safety applications. It is an intermediate level class, intended for those with at least some familiarity with MCNP. Class examples provide hands-on experience at running the code, plotting both geometry and results, and understanding the code output. The class includes lectures and hands-on computer use for a variety of Monte Carlo calculations. Beginning MCNP users are encouraged to review LA-UR-09-00380, "Criticality Calculations with MCNP: A Primer (3rd Edition)" (available at http://mcnp.lanl.gov under "Reference Collection") prior to the class. No Monte Carlo class can be complete without having students write their own simple Monte Carlo routines for basic random sampling, use of the random number generator, and simplified particle transport simulation.
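    As a flavour of the "simple Monte Carlo routines" students are asked to write, here is a minimal, hedged Python sketch of analog neutron transport through a homogeneous 1-D slab: path lengths are sampled from an exponential distribution and each collision is either an absorption or an isotropic scatter. The cross sections and slab thickness are arbitrary illustrative values, not class data.

```python
import math
import random

def transmit_slab(thickness_cm, sigma_t, sigma_s, n_particles=100_000, seed=1):
    """Fraction of neutrons transmitted through a homogeneous slab (analog MC)."""
    rng = random.Random(seed)
    transmitted = 0
    for _ in range(n_particles):
        x, mu = 0.0, 1.0                                 # start at left face, moving right
        while True:
            s = -math.log(1.0 - rng.random()) / sigma_t  # free flight ~ Exp(sigma_t)
            x += s * mu
            if x >= thickness_cm:
                transmitted += 1                          # leaked out the far side
                break
            if x < 0.0:
                break                                     # leaked back out the near side
            if rng.random() < sigma_s / sigma_t:
                mu = 2.0 * rng.random() - 1.0             # isotropic scatter: new cosine
            else:
                break                                     # absorbed
    return transmitted / n_particles

# Arbitrary data: 10 cm slab, total cross section 0.5 /cm, scattering ratio 0.8.
print("transmission ~ %.4f" % transmit_slab(10.0, 0.5, 0.4))
```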

  15. Performance Study of Monte Carlo Codes on Xeon Phi Coprocessors — Testing MCNP 6.1 and Profiling ARCHER Geometry Module on the FS7ONNi Problem

    NASA Astrophysics Data System (ADS)

    Liu, Tianyu; Wolfe, Noah; Lin, Hui; Zieb, Kris; Ji, Wei; Caracappa, Peter; Carothers, Christopher; Xu, X. George

    2017-09-01

    This paper contains two parts revolving around Monte Carlo transport simulation on Intel Many Integrated Core coprocessors (MIC, also known as Xeon Phi). (1) MCNP 6.1 was recompiled into multithreading (OpenMP) and multiprocessing (MPI) forms respectively without modification to the source code. The new codes were tested on a 60-core 5110P MIC. The test case was FS7ONNi, a radiation shielding problem used in MCNP's verification and validation suite. It was observed that both codes became slower on the MIC than on a 6-core X5650 CPU, by a factor of 4 for the MPI code and, abnormally, 20 for the OpenMP code, and both exhibited limited capability of strong scaling. (2) We have recently added a Constructive Solid Geometry (CSG) module to our ARCHER code to provide better support for geometry modelling in radiation shielding simulation. The functions of this module are frequently called in the particle random walk process. To identify the performance bottleneck we developed a CSG proxy application and profiled the code using the geometry data from FS7ONNi. The profiling data showed that the code was primarily memory latency bound on the MIC. This study suggests that despite low initial porting effort, Monte Carlo codes do not naturally lend themselves to the MIC platform, just as they do not to GPUs, and that the memory latency problem needs to be addressed in order to achieve decent performance gain.

  16. TU-F-CAMPUS-T-04: Variations in Nominally Identical Small Fields From Photon Jaw Reproducibility and Associated Effects On Small Field Dosimetric Parameters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Muir, B R; McEwen, M R

    2015-06-15

    Purpose: To investigate uncertainties in small field output factors and detector specific correction factors from variations in field size for nominally identical fields using measurements and Monte Carlo simulations. Methods: Repeated measurements of small field output factors are made with the Exradin W1 (plastic scintillation detector) and the PTW microDiamond (synthetic diamond detector) in beams from the Elekta Precise linear accelerator. We investigate corrections for a 0.6x0.6 cm^2 nominal field size shaped with secondary photon jaws at 100 cm source to surface distance (SSD). Measurements of small field profiles are made in a water phantom at 10 cm depth using both detectors and are subsequently used for accurate detector positioning. Supplementary Monte Carlo simulations with EGSnrc are used to calculate the absorbed dose to the detector and absorbed dose to water under the same conditions when varying field size. The jaws in the BEAMnrc model of the accelerator are varied by a reasonable amount to investigate the same situation without the influence of measurement uncertainties (such as detector positioning or variation in beam output). Results: For both detectors, small field output factor measurements differ by up to 11% when repeated measurements are made in nominally identical 0.6x0.6 cm^2 fields. Variations in the FWHM of measured profiles are consistent with field size variations reported by the accelerator. Monte Carlo simulations of the dose to detector vary by up to 16% under worst case variations in field size. These variations are also present in calculations of absorbed dose to water. However, calculated detector specific correction factors are within 1% when varying field size because of cancellation of effects. Conclusion: Clinical physicists should be aware of potentially significant uncertainties in measured output factors required for dosimetry of small fields due to field size variations for nominally identical fields.

  17. Estimating the uncertainty of calculated out-of-field organ dose from a commercial treatment planning system.

    PubMed

    Wang, Lilie; Ding, George X

    2018-06-12

    Therapeutic radiation to cancer patients is accompanied by unintended radiation to organs outside the treatment field. It is known that model-based dose algorithms have limitations in calculating the out-of-field doses. This study evaluated the out-of-field dose calculated by the Varian Eclipse treatment planning system (v.11 with AAA algorithm) in realistic treatment plans with the goal of estimating the uncertainties of calculated organ doses. Photon beam phase-space files for the TrueBeam linear accelerator were provided by Varian. These were used as incident sources in EGSnrc Monte Carlo simulations of radiation transport through the downstream jaws and MLC. Dynamic movements of the MLC leaves were fully modeled based on treatment plans using IMRT or VMAT techniques. The Monte Carlo calculated out-of-field doses were then compared with those calculated by Eclipse. The dose comparisons were performed for different beam energies and treatment sites, including head-and-neck, lung, and pelvis. For 6 MV (FF/FFF), 10 MV (FF/FFF), and 15 MV (FF) beams, Eclipse underestimated out-of-field local doses by 30%-50% compared with Monte Carlo calculations when the local dose was <1% of the prescribed dose. The accuracy of out-of-field dose calculations using Eclipse is improved when collimator jaws are set at the smallest possible aperture for MLC openings. The Eclipse system consistently underestimates out-of-field dose by a factor of 2 for all beam energies studied at local dose levels of less than 1% of the prescribed dose. These findings are useful in providing information on the uncertainties of out-of-field organ doses calculated by the Eclipse treatment planning system. © 2018 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.

  18. Monte Carlo simulation of proton track structure in biological matter

    DOE PAGES

    Quinto, Michele A.; Monti, Juan M.; Weck, Philippe F.; ...

    2017-05-25

    Here, understanding the radiation-induced effects at the cellular and subcellular levels remains crucial for predicting the evolution of irradiated biological matter. In this context, Monte Carlo track-structure simulations have rapidly emerged among the most suitable and powerful tools. However, most existing Monte Carlo track-structure codes rely heavily on the use of semi-empirical cross sections as well as water as a surrogate for biological matter. In the current work, we report on the up-to-date version of our homemade Monte Carlo code TILDA-V – devoted to the modeling of the slowing-down of 10 keV–100 MeV protons in both water and DNA – where the main collisional processes are described by means of an extensive set of ab initio differential and total cross sections.

  20. Acceleration of Monte Carlo simulation of photon migration in complex heterogeneous media using Intel many-integrated core architecture.

    PubMed

    Gorshkov, Anton V; Kirillin, Mikhail Yu

    2015-08-01

    Over two decades, the Monte Carlo technique has become a gold standard in simulation of light propagation in turbid media, including biotissues. Technological solutions provide further advances of this technique. The Intel Xeon Phi coprocessor is a new type of accelerator for highly parallel general purpose computing, which allows execution of a wide range of applications without substantial code modification. We present a technical approach of porting our previously developed Monte Carlo (MC) code for simulation of light transport in tissues to the Intel Xeon Phi coprocessor. We show that employing the accelerator allows reducing computational time of MC simulation and obtaining simulation speed-up comparable to GPU. We demonstrate the performance of the developed code for simulation of light transport in the human head and determination of the measurement volume in near-infrared spectroscopy brain sensing.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vuong, A; Chow, J

    Purpose: This study investigated the surface dose variation in preclinical small-animal irradiation when monoenergetic photon beams with energies ranging from 50 keV to 1.25 MeV were used. Methods: Inhomogeneous, homogeneous, and bone-tissue homogeneous mouse phantoms based on the same CT image set were used. The homogeneous and bone-tissue homogeneous phantoms were created with the relative electron density of all voxels and of only the bone voxels of the mouse, respectively, overridden to one. Monte Carlo simulation using an EGSnrc-based code was used to calculate the surface dose when the phantoms were irradiated by a 360° photon arc with energies ranging from 50 keV to 1.25 MeV. The mean surface doses of the three phantoms were calculated. In addition, the surface doses from partial arcs, 45°–315°, 125°–225°, 45°–125° and 225°–315°, covering the anterior, posterior, right lateral and left lateral regions of the mouse, were determined using different photon beam energies. Results: When the prescribed dose at the isocenter of the mouse was 2 Gy, the maximum mean surface doses, found for the 50-keV photon beams, were 0.358 Gy, 0.363 Gy and 0.350 Gy for the inhomogeneous, homogeneous and bone-tissue homogeneous mouse phantoms, respectively. The mean surface dose of the mouse was found to decrease with increasing photon beam energy. Regarding surface dose in different orientations, the lateral regions of the mouse received a lower dose than the anterior and posterior regions. This may be due to greater beam attenuation along the horizontal (left-right) axis than along the vertical (anterior-posterior) axis of the mouse. Conclusion: It is concluded that consideration of phantom inhomogeneity in the dose calculation resulted in a lower mean surface dose of the mouse. The mean surface dose also decreased with increasing photon beam energy in the kilovoltage range.

  2. SU-E-T-439: An Improved Formula of Scatter-To-Primary Ratio for Photon Dose Calculation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, T

    2014-06-01

    Purpose: Scatter-to-primary ratio (SPR) is an important dosimetric quantity that describes the contribution from scattered photons in an external photon beam. The purpose of this study is to develop an improved analytical formula to describe SPR as a function of circular field size (r) and depth (d) using Monte Carlo (MC) simulation. Methods: MC simulation was performed for the Mohan photon spectra (Co-60, 4, 6, 10, 15, 23 MV) using the EGSnrc code. Point-spread scatter dose kernels in water are generated. The scatter-to-primary ratio (SPR) is also calculated using MC simulation as a function of field size for circular fields with radius r and depth d. The doses from forward-scatter and backscatter photons are calculated using a convolution of the point-spread scatter dose kernel, accounting for scatter photons originating upstream (z' < d) or downstream (z' > d) of the depth of interest, d, respectively, where z' is the location of the scatter photons. The depth dependence of the ratio of the forward-scatter and backscatter doses is determined as a function of depth and field size. Results: We are able to improve the existing 3-parameter (a, w, d0) empirical formula for SPR by introducing depth dependence for one of the parameters, d0, which becomes 0 at deeper depths. The depth dependence of d0 can be directly calculated as the ratio of backscatter-to-forward-scatter doses for otherwise the same field and depth. With the improved empirical formula, we can fit the SPR for all megavoltage photon beams to within 2%. The existing 3-parameter formula cannot fit the SPR data for Co-60 to better than 3.1%. Conclusion: An improved empirical formula is developed to fit the SPR for all megavoltage photon energies to within 2%.
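    The abstract does not give the explicit functional form of the 3-parameter fit, so the snippet below is only a generic illustration of the fitting workflow: tabulated Monte Carlo SPR values at one depth are fitted to a hypothetical saturating form with scipy, and the worst-case residual is reported. Both the model and the data are invented placeholders.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical SPR model versus circular-field radius r at fixed depth.
# The actual (a, w, d0) formula of the study is not reproduced here.
def spr_model(r, a, w, r0):
    return a * (1.0 - np.exp(-(r + r0) / w))

# Made-up "Monte Carlo" SPR data for one beam quality and depth.
r_cm = np.array([1, 2, 4, 6, 8, 10, 12, 15], dtype=float)
spr_mc = np.array([0.08, 0.14, 0.23, 0.29, 0.33, 0.36, 0.38, 0.40])

popt, _ = curve_fit(spr_model, r_cm, spr_mc, p0=[0.45, 5.0, 0.5])
worst = 100 * np.max(np.abs(spr_model(r_cm, *popt) - spr_mc) / spr_mc)
print("fitted (a, w, r0):", np.round(popt, 3), " worst residual: %.1f%%" % worst)
```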

  3. Sci-Thur AM: YIS – 04: Stopping power-to-Cherenkov power ratios and beam quality specification for clinical Cherenkov emission dosimetry of electrons: beam-specific effects and experimental validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zlateva, Yana; Seuntjens, Jan; El Naqa, Issam

    Purpose: To advance towards clinical Cherenkov emission (CE)-based dosimetry by investigating beam-specific effects on Monte Carlo-calculated electron-beam stopping power-to-CE power ratios (SCRs), addressing electron beam quality specification in terms of CE, and validating simulations with measurements. Methods: The EGSnrc user code SPRRZnrc, used to calculate Spencer-Attix stopping-power ratios, was modified to instead calculate SCRs. SCRs were calculated for 6- to 22-MeV clinical electron beams from Varian TrueBeam, Clinac 21EX, and Clinac 2100C/D accelerators. Experiments were performed with a 20-MeV electron beam from a Varian TrueBeam accelerator, using a diffraction grating spectrometer with optical fiber input and a cooled back-illuminated CCD. A fluorophore was dissolved in the water to remove CE signal anisotropy. Results: It was found that angular spread of the incident beam has little effect on the SCR (≤ 0.3% at d_max), while both the electron spectrum and photon contamination increase the SCR at shallow depths and decrease it at large depths. A universal data fit of R50 in terms of C50 (50% CE depth) revealed a strong linear dependence (R^2 > 0.9999). The SCR was fit with a Burns-type equation (R^2 = 0.9974, NRMSD = 0.5%). Below-threshold incident radiation was found to have minimal effect on beam quality specification (< 0.1%). Experiments and simulations were in good agreement. Conclusions: Our findings confirm the feasibility of the proposed CE dosimetry method, contingent on computation of SCRs from additional accelerators and on further experimental validation. This work constitutes an important step towards clinical high-resolution out-of-beam CE dosimetry.

  4. SU‐C‐105‐05: Reference Dosimetry of High‐Energy Electron Beams with a Farmer‐Type Ionization Chamber

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Muir, B; Rogers, D

    2013-06-15

    Purpose: To investigate gradient effects and provide Monte Carlo calculated beam quality conversion factors to characterize the Farmer-type NE2571 ion chamber for high-energy reference dosimetry of clinical electron beams. Methods: The EGSnrc code system is used to calculate the absorbed dose to water and to the gas in a fully modeled NE2571 chamber as a function of depth in a water phantom. Electron beams incident on the surface of the phantom are modeled using realistic BEAMnrc accelerator simulations and electron beam spectra. Beam quality conversion factors are determined using calculated doses to water and to air in the chamber in high-energy electron beams and in a cobalt-60 reference field. Calculated water-to-air stopping power ratios are employed for investigation of the overall ion chamber perturbation factor. Results: An upstream shift of 0.3–0.4 times the chamber radius, r_cav, both minimizes the variation of the overall ion chamber perturbation factor with depth and reduces the difference between the beam quality specifier (R50) calculated using ion chamber simulations and that obtained with simulations of dose-to-water in the phantom. Beam quality conversion factors are obtained at the reference depth and gradient effects are optimized using a shift of 0.2 r_cav. The photon-electron conversion factor, k_ecal, amounts to 0.906 when gradient effects are minimized using the shift established here and 0.903 if no shift of the data is used. Systematic uncertainties in beam quality conversion factors are investigated and amount to between 0.4% and 1.1% depending on the assumptions used. Conclusion: The calculations obtained in this work characterize the use of an NE2571 ion chamber for reference dosimetry of high-energy electron beams. These results will be useful as the AAPM continues to review its reference dosimetry protocols.

  5. The perturbation correction factors for cylindrical ionization chambers in high-energy photon beams.

    PubMed

    Yoshiyama, Fumiaki; Araki, Fujio; Ono, Takeshi

    2010-07-01

    In this study, we calculated perturbation correction factors for cylindrical ionization chambers in high-energy photon beams by using Monte Carlo simulations. We modeled four Farmer-type cylindrical chambers with the EGSnrc/Cavity code and calculated the cavity or electron fluence correction factor, P_cav, the displacement correction factor, P_dis, the wall correction factor, P_wall, the stem correction factor, P_stem, the central electrode correction factor, P_cel, and the overall perturbation correction factor, P_Q. The calculated P_dis values for PTW30010/30013 chambers were 0.9967 +/- 0.0017, 0.9983 +/- 0.0019, and 0.9980 +/- 0.0019 for 60Co, 4 MV, and 10 MV photon beams, respectively. The value for the 60Co beam was about 1.0% higher than the value of 0.988 recommended by the IAEA TRS-398 protocol. The P_dis values showed a substantial discrepancy compared to those of IAEA TRS-398 and AAPM TG-51 at all photon energies. The P_wall values ranged from 0.9994 +/- 0.0020 to 1.0031 +/- 0.0020 for PTW30010 and from 0.9961 +/- 0.0018 to 0.9991 +/- 0.0017 for PTW30011/30012 over the range 60Co-10 MV. The P_wall values for PTW30011/30012 were around 0.3% lower than those of IAEA TRS-398. Also, the chamber responses with and without a 1 mm PMMA water-proofing sleeve agreed within their combined uncertainty. The calculated P_stem values ranged from 0.9945 +/- 0.0014 to 0.9965 +/- 0.0014, but this factor is not considered in current dosimetry protocols; the values showed no significant dependence on beam quality. P_cel for a 1 mm aluminum electrode agreed within 0.3% with that of IAEA TRS-398. The overall perturbation factors agreed within 0.4% with those of IAEA TRS-398.
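    The component factors reported above are conventionally combined multiplicatively into the overall perturbation factor P_Q. Below is a minimal sketch of that combination, with the relative Monte Carlo uncertainties added in quadrature under an independence assumption; the numbers are illustrative placeholders in the range quoted above, not a specific chamber/beam result from the paper.

```python
import math

# Assumed decomposition: P_Q = P_cav * P_dis * P_wall * P_stem * P_cel.
factors = {              # (value, absolute uncertainty) -- illustrative only
    "P_cav":  (1.0000, 0.0015),
    "P_dis":  (0.9967, 0.0017),
    "P_wall": (1.0010, 0.0020),
    "P_stem": (0.9955, 0.0014),
    "P_cel":  (0.9970, 0.0010),
}

p_q = math.prod(v for v, _ in factors.values())
rel_u = math.sqrt(sum((u / v) ** 2 for v, u in factors.values()))  # quadrature sum
print("P_Q = %.4f +/- %.4f" % (p_q, p_q * rel_u))
```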

  6. SU-F-207-01: Comparison of Beam Characteristics and Organ Dose From Four Commercial Multidetector Computed Tomography Scanners

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ohno, T; Araki, F

    2015-06-15

    Purpose: To compare dosimetric properties and patient organ doses from four commercial multidetector CT (MDCT) scanners using Monte Carlo (MC) simulation based on the absorbed dose measured using a Farmer chamber and cylindrical water phantoms according to AAPM TG-111. Methods: Four commercial MDCT scanners were modeled using GMctdospp (IMPS, Germany) based on the EGSnrc user code. The incident photon spectrum and bowtie filter for the MC simulations were determined so that calculated values of the aluminum half-value layer (Al-HVL) and off-center ratio (OCR) profile in air agreed with measured values. The MC dose was calibrated from absorbed dose measurements using a Farmer chamber and cylindrical water phantoms. The dose distributions of head, chest, and abdominal scans were calculated using patient CT images, and mean organ doses were evaluated from dose volume histograms. Results: The HVLs at 120 kVp of the Brilliance, LightSpeed, Aquilion, and SOMATOM were 9.1, 7.5, 7.2, and 8.7 mm, respectively. The calculated Al-HVLs agreed with measurements within 0.3%. The calculated and measured OCR profiles agreed within 5%. For adult head scans, mean doses to the eye lens from Brilliance, LightSpeed, Aquilion, and SOMATOM were 21.7, 38.5, 47.2 and 28.4 mGy, respectively. For chest scans, mean doses to the lung from Brilliance, LightSpeed, Aquilion, and SOMATOM were 21.1, 26.1, 35.3 and 24.0 mGy, respectively. For adult abdominal scans, the mean doses to the liver from Brilliance, LightSpeed, Aquilion, and SOMATOM were 16.5, 21.3, 22.7, and 18.0 mGy, respectively. The absorbed doses increased with decreasing Al-HVL. The organ doses from the Aquilion were about two times greater than those from the Brilliance in head scans. Conclusion: MC dose distributions based on absorbed dose measurements in a cylindrical water phantom are useful for evaluating individual patient organ doses.

  7. SU-E-T-510: Calculation of High Resolution and Material-Specific Photon Energy Deposition Kernels.

    PubMed

    Huang, J; Childress, N; Kry, S

    2012-06-01

    To calculate photon energy deposition kernels (EDKs) used for convolution/superposition dose calculation at a higher resolution than the original Mackie et al. 1988 kernels, and to calculate material-specific kernels that describe how energy is transported and deposited by secondary particles when the incident photon interacts in a material other than water. The high resolution EDKs for various incident photon energies were generated using the EGSnrc user-code EDKnrc, which forces incident photons to interact at the center of a 60 cm radius sphere of water. The simulation geometry is essentially the same as the original Mackie calculation but with a greater number of scoring voxels (48 radial, 144 angular bins). For the material-specific EDKs, incident photons were forced to interact at the center of a 1 mm radius sphere of material (lung, cortical bone, silver, or titanium) surrounded by a 60 cm radius water sphere, using the original scoring voxel geometry implemented by Mackie et al. 1988 (24 radial, 48 angular bins). Our Monte Carlo-calculated high resolution EDKs showed excellent agreement with the Mackie kernels, with our kernels providing more information about energy deposition close to the interaction site. Furthermore, our EDKs resulted in smoother dose deposition functions due to the finer resolution and greater number of simulation histories. The material-specific EDK results show that the angular distribution of energy deposition is different for incident photons interacting in different materials. Calculated from the angular dose distribution for 300 keV incident photons, the expected polar angle for dose deposition, ⟨θ⟩, is 28.6° for water, 33.3° for lung, 36.0° for cortical bone, 44.6° for titanium, and 58.1° for silver, showing a dependence on the material in which the primary photon interacts. These high resolution and material-specific EDKs have implications for convolution/superposition dose calculations in heterogeneous patient geometries, especially at material interfaces. © 2012 American Association of Physicists in Medicine.
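    The "expected polar angle" quoted for each material is simply the dose-weighted mean of the polar angle over the kernel's angular bins. A minimal sketch of that average with an invented, forward-peaked angular dose distribution (the actual EDK data are not reproduced here):

```python
import numpy as np

def expected_polar_angle_deg(theta_deg, dose_per_bin):
    """Dose-weighted mean polar angle of an energy deposition kernel."""
    theta_deg = np.asarray(theta_deg, float)
    dose_per_bin = np.asarray(dose_per_bin, float)
    return float(np.sum(theta_deg * dose_per_bin) / np.sum(dose_per_bin))

# Illustrative 48-bin angular dose distribution (3.75 degree bins over 0-180 deg).
theta = np.linspace(1.875, 178.125, 48)   # bin centres in degrees
dose = np.exp(-theta / 40.0)              # made-up forward-peaked deposition
print("<theta> = %.1f deg" % expected_polar_angle_deg(theta, dose))
```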

  8. SU-E-T-609: Perturbation Effects of Pedicle Screws On Radiotherapy Dose Distributions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bar-Deroma, R; Borzov, E; Nevelsky, A

    2015-06-15

    Purpose: Radiation therapy in conjunction with surgical implant fixation is a common combined treatment in the case of bone metastases. However, the metal implants generally used in orthopedic fixation perturb radiation dose distributions. Carbon-Fiber Reinforced (CFR) PEEK material has recently been introduced for the production of intramedullary screws and plates. Gold powder can be added to the CFR-PEEK material in order to enhance visibility of the screws during intraoperative imaging procedures. In this work, we investigated the perturbation effects of pedicle screws made of CFR-PEEK, CFR-PEEK with added gold powder (CFR-PEEK-AU) and titanium (Ti) on radiotherapy dose distributions. Methods: Monte Carlo (MC) simulations were performed using the EGSnrc code package for 6 MV beams with 10×10 fields at SSD=100 cm. By means of MC simulations, dose distributions around titanium, CFR-PEEK and CFR-PEEK-AU screws (manufactured by Carbo-Fix Orthopedics LTD, Israel) placed in a water phantom were calculated. The screw axis was either parallel or perpendicular to the beam axis. Dose perturbation (relative to the dose in a homogeneous water phantom) was assessed. Results: The maximum overdose due to backscatter was 10% for the Ti screws, 5% for the CFR-PEEK-AU screws and effectively zero for the CFR-PEEK screws. The maximum underdose due to attenuation was 25% for the Ti screws, 15% for the CFR-PEEK-AU screws and 5% for the CFR-PEEK screws. Conclusion: Titanium screws introduce the largest distortion of the radiation dose distribution. The gold powder added to the CFR-PEEK material improves visibility at the cost of increased dose perturbation. CFR-PEEK screws caused minimal alteration of the dose distribution. This can decrease possible over- and underdosing of adjacent tissue and thus favorably influence treatment efficiency. The use of such implants has a potential clinical advantage in the treatment of neoplastic bone disease.

  9. An iterative three-dimensional electron density imaging algorithm using uncollimated compton scattered x rays from a polyenergetic primary pencil beam.

    PubMed

    Van Uytven, Eric; Pistorius, Stephen; Gordon, Richard

    2007-01-01

    X-ray film-screen mammography is currently the gold standard for detecting breast cancer. However, one disadvantage is that it projects a three-dimensional (3D) object onto a two-dimensional (2D) image, reducing contrast between small lesions and layers of normal tissue. Another limitation is its reduced sensitivity in women with mammographically dense breasts. Computed tomography (CT) produces a 3D image yet has had a limited role in mammography due to its relatively high dose, low resolution, and low contrast. As a first step towards implementing quantitative 3D mammography, which may improve the ability to detect and characterize breast tumors, we have developed an analytical technique that can use Compton scatter to obtain 3D information about an object from a single projection. Imaging a material with a pencil beam of polychromatic x rays produces a characteristic scattered photon spectrum at each point on the detector plane. A comparable distribution may be calculated using a known incident x-ray energy spectrum, beam shape, and an initial estimate of the object's 3D mass attenuation and electron density. Our iterative minimization algorithm changes the initially arbitrary electron density voxel matrix to reduce the differences between the analytically predicted and experimentally measured spectra at each point on the detector plane. The simulated electron density converges to that of the object as the differences are minimized. The reconstruction algorithm has been validated using simulated data produced by the EGSnrc Monte Carlo code system. We applied the imaging algorithm to a cylindrically symmetric breast tissue phantom containing multiple inhomogeneities. A preliminary ROC analysis yields scores greater than 0.96, which indicates that, under the described simplifying conditions, this approach shows promise in identifying and localizing inhomogeneities which simulate 0.5 mm calcifications, with an image voxel resolution of 0.25 cm and at a dose comparable to mammography.
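    Viewed abstractly, the reconstruction described above is an iterative least-squares update of the voxel electron densities driven by the mismatch between predicted and measured scatter spectra. The sketch below shows only that outer loop, with a stand-in linear forward model; the real forward model (polychromatic Compton scatter through a 3-D phantom) is far more involved and is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in linear forward model: predicted spectra = A @ density.
n_voxels, n_measurements = 50, 200
A = rng.random((n_measurements, n_voxels))
true_density = rng.random(n_voxels)
measured = A @ true_density

density = np.zeros(n_voxels)              # initially arbitrary estimate
step = 1.0 / np.linalg.norm(A, 2) ** 2    # safe gradient-descent step size
for _ in range(500):
    residual = A @ density - measured     # predicted minus measured spectra
    density -= step * (A.T @ residual)    # reduce the squared mismatch
    density = np.clip(density, 0.0, None) # electron densities stay non-negative

print("RMS density error: %.4f" % np.sqrt(np.mean((density - true_density) ** 2)))
```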

  10. Measurement-based model of a wide-bore CT scanner for Monte Carlo dosimetric calculations with GMCTdospp software.

    PubMed

    Skrzyński, Witold

    2014-11-01

    The aim of this work was to create a model of a wide-bore Siemens Somatom Sensation Open CT scanner for use with GMCTdospp, an EGSnrc-based software tool dedicated to Monte Carlo calculations of dose in CT examinations. The method was based on matching the spectrum and filtration to the half value layer (HVL) and dose profile, and thus was similar to the method of Turner et al. (Med. Phys. 36, pp. 2154-2164). Input data on unfiltered beam spectra were taken from two sources: the TASMIP model and IPEM Report 78. Two sources of HVL data were also used, namely measurements and documentation. The dose profile along the fan-beam was measured with Gafchromic RTQA-1010 (QA+) film. A two-component model of filtration was assumed: a bow-tie filter made of aluminum with 0.5 mm thickness on the central axis, and a flat filter made of one of four materials: aluminum, graphite, lead, or titanium. Good agreement between calculations and measurements was obtained for models based on the measured values of HVL. Doses calculated with GMCTdospp differed from the doses measured with a pencil ion chamber placed in a PMMA phantom by less than 5%, and the root mean square difference for four tube potentials and three positions in the phantom did not exceed 2.5%. The differences for models based on HVL values from documentation exceeded 10%. Models based on TASMIP spectra and IPEM78 spectra performed equally well. Copyright © 2014 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
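    The model-matching step amounts to adjusting the assumed spectrum and filtration until the computed aluminium HVL (and dose profile) agrees with measurement. Below is a minimal sketch of the HVL computation for a given spectrum; the spectrum and attenuation coefficients are rough illustrative values, not TASMIP/IPEM 78 or NIST data.

```python
import numpy as np
from scipy.optimize import brentq

# Illustrative filtered spectrum (relative fluence per bin, keV) and approximate
# linear attenuation coefficients of aluminium (1/mm) at those energies.
energy_keV   = np.array([40.0, 60.0, 80.0, 100.0, 120.0])
fluence      = np.array([0.10, 0.30, 0.35, 0.20, 0.05])
mu_al_per_mm = np.array([0.155, 0.075, 0.055, 0.046, 0.041])

def transmission(t_mm):
    """Fluence-energy-weighted transmitted fraction behind t_mm of aluminium."""
    w = fluence * energy_keV                       # crude kerma-like weighting
    return np.sum(w * np.exp(-mu_al_per_mm * t_mm)) / np.sum(w)

# HVL: aluminium thickness that halves the weighted signal.
hvl_mm = brentq(lambda t: transmission(t) - 0.5, 0.0, 50.0)
print("computed Al HVL ~ %.1f mm" % hvl_mm)
```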

  11. Common radiation analysis model for 75,000 pound thrust NERVA engine (1137400E)

    NASA Technical Reports Server (NTRS)

    Warman, E. A.; Lindsey, B. A.

    1972-01-01

    The mathematical model and sources of radiation used for the radiation analysis and shielding activities in support of the design of the 1137400E version of the 75,000 lbs thrust NERVA engine are presented. The nuclear subsystem (NSS) and non-nuclear components are discussed. The geometrical model for the NSS is two dimensional as required for the DOT discrete ordinates computer code or for an azimuthally symmetrical three dimensional Point Kernel or Monte Carlo code. The geometrical model for the non-nuclear components is three dimensional in the FASTER geometry format. This geometry routine is inherent in the ANSC versions of the QAD and GGG Point Kernel programs and the COHORT Monte Carlo program. Data are included pertaining to a pressure vessel surface radiation source data tape which has been used as the basis for starting ANSC analyses with the DASH code to bridge into the COHORT Monte Carlo code using the WANL supplied DOT angular flux leakage data. In addition to the model descriptions and sources of radiation, the methods of analyses are briefly described.

  12. Supernova Light Curves and Spectra from Two Different Codes: Supernu and Phoenix

    NASA Astrophysics Data System (ADS)

    Van Rossum, Daniel R; Wollaeger, Ryan T

    2014-08-01

    The observed similarities between light curve shapes from Type Ia supernovae, and in particular the correlation of light curve shape and brightness, have been actively studied for more than two decades. In recent years, hydrodynamic simulations of white dwarf explosions have advanced greatly, and multiple mechanisms that could potentially produce Type Ia supernovae have been explored in detail. The question of which of the proposed mechanisms is (or are) realized in nature remains challenging to answer, but detailed synthetic light curves and spectra from explosion simulations are very helpful and important guidelines towards answering it. We present results from a newly developed radiation transport code, Supernu. Supernu solves the supernova radiation transfer problem using a novel technique based on a hybrid between Implicit Monte Carlo and Discrete Diffusion Monte Carlo. This technique enhances the efficiency with respect to traditional Implicit Monte Carlo codes and thus lends itself well to multi-dimensional simulations. We show direct comparisons of light curves and spectra from Type Ia simulations with Supernu versus the legacy Phoenix code.

  13. A collision history-based approach to Sensitivity/Perturbation calculations in the continuous energy Monte Carlo code SERPENT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giuseppe Palmiotti

    In this work, the implementation of a collision history-based approach to sensitivity/perturbation calculations in the Monte Carlo code SERPENT is discussed. The proposed methods allow the calculation of the effects of nuclear data perturbation on several response functions: the effective multiplication factor, reaction rate ratios and bilinear ratios (e.g., effective kinetics parameters). SERPENT results are compared to ERANOS and TSUNAMI Generalized Perturbation Theory calculations for two fast metallic systems and for a PWR pin-cell benchmark. New methods for the calculation of sensitivities to angular scattering distributions are also presented, which adopt fully continuous (in energy and angle) Monte Carlo estimators.

  14. A comparison of the COG and MCNP codes in computational neutron capture therapy modeling, Part I: boron neutron capture therapy models.

    PubMed

    Culbertson, C N; Wangerin, K; Ghandourah, E; Jevremovic, T

    2005-08-01

    The goal of this study was to evaluate the COG Monte Carlo radiation transport code, developed and tested by Lawrence Livermore National Laboratory, for neutron capture therapy related modeling. A boron neutron capture therapy model was analyzed comparing COG calculational results to results from the widely used MCNP4B (Monte Carlo N-Particle) transport code. The approach for computing neutron fluence rate and each dose component relevant in boron neutron capture therapy is described, and calculated values are shown in detail. The differences between the COG and MCNP predictions are qualified and quantified. The differences are generally small and suggest that the COG code can be applied for BNCT research related problems.

  15. Path Toward a Unified Geometry for Radiation Transport

    NASA Astrophysics Data System (ADS)

    Lee, Kerry

    The Direct Accelerated Geometry for Radiation Analysis and Design (DAGRAD) element of the RadWorks Project under Advanced Exploration Systems (AES) within the Space Technology Mission Directorate (STMD) of NASA will enable new designs and concepts of operation for radiation risk assessment, mitigation and protection. This element is designed to produce a solution that will allow NASA to calculate the transport of space radiation through complex CAD models using the state-of-the-art analytic and Monte Carlo radiation transport codes. Due to the inherent hazard of astronaut and spacecraft exposure to ionizing radiation in low-Earth orbit (LEO) or in deep space, risk analyses must be performed for all crew vehicles and habitats. Incorporating these analyses into the design process can minimize the mass needed solely for radiation protection. Transport of the radiation fields as they pass through shielding and body materials can be simulated using Monte Carlo techniques or described by the Boltzmann equation, which is obtained by balancing changes in particle fluxes as they traverse a small volume of material with the gains and losses caused by atomic and nuclear collisions. Deterministic codes that solve the Boltzmann transport equation, such as HZETRN (high charge and energy transport code developed by NASA LaRC), are generally computationally faster than Monte Carlo codes such as FLUKA, GEANT4, MCNP(X) or PHITS; however, they are currently limited to transport in one dimension, which poorly represents the secondary light ion and neutron radiation fields. NASA currently uses HZETRN space radiation transport software, both because it is computationally efficient and because proven methods have been developed for using this software to analyze complex geometries. Although Monte Carlo codes describe the relevant physics in a fully three-dimensional manner, their computational costs have thus far prevented their widespread use for analysis of complex CAD models, leading to the creation and maintenance of toolkit specific simplistic geometry models. The work presented here builds on the Direct Accelerated Geometry Monte Carlo (DAGMC) toolkit developed for use with the Monte Carlo N-Particle (MCNP) transport code. The work-flow for doing radiation transport on CAD models using MCNP and FLUKA has been demonstrated and the results of analyses on realistic spacecraft/habitats will be presented. Future work is planned that will further automate this process and enable the use of multiple radiation transport codes on identical geometry models imported from CAD. This effort will enhance the modeling tools used by NASA to accurately evaluate the astronaut space radiation risk and accurately determine the protection provided by as-designed exploration mission vehicles and habitats.

  16. Million-body star cluster simulations: comparisons between Monte Carlo and direct N-body

    NASA Astrophysics Data System (ADS)

    Rodriguez, Carl L.; Morscher, Meagan; Wang, Long; Chatterjee, Sourav; Rasio, Frederic A.; Spurzem, Rainer

    2016-12-01

    We present the first detailed comparison between million-body globular cluster simulations computed with a Hénon-type Monte Carlo code, CMC, and a direct N-body code, NBODY6++GPU. Both simulations start from an identical cluster model with 10^6 particles, and include all of the relevant physics needed to treat the system in a highly realistic way. With the two codes `frozen' (no fine-tuning of any free parameters or internal algorithms of the codes) we find good agreement in the overall evolution of the two models. Furthermore, we find that in both models, large numbers of stellar-mass black holes (>1000) are retained for 12 Gyr. Thus, the very accurate direct N-body approach confirms recent predictions that black holes can be retained in present-day, old globular clusters. We find only minor disagreements between the two models and attribute these to the small-N dynamics driving the evolution of the cluster core for which the Monte Carlo assumptions are less ideal. Based on the overwhelming general agreement between the two models computed using these vastly different techniques, we conclude that our Monte Carlo approach, which is more approximate, but dramatically faster compared to the direct N-body, is capable of producing an accurate description of the long-term evolution of massive globular clusters even when the clusters contain large populations of stellar-mass black holes.

  17. Monte Carlo simulations for angular and spatial distributions in therapeutic-energy proton beams

    NASA Astrophysics Data System (ADS)

    Lin, Yi-Chun; Pan, C. Y.; Chiang, K. J.; Yuan, M. C.; Chu, C. H.; Tsai, Y. W.; Teng, P. K.; Lin, C. H.; Chao, T. C.; Lee, C. C.; Tung, C. J.; Chen, A. E.

    2017-11-01

    The purpose of this study is to compare the angular and spatial distributions of therapeutic-energy proton beams obtained from the FLUKA, GEANT4 and MCNP6 Monte Carlo codes. The Monte Carlo simulations of proton beams passing through two thin targets and a water phantom were investigated to compare the primary and secondary proton fluence distributions and dosimetric differences among these codes. The angular fluence distributions, central axis depth-dose profiles, and lateral distributions of the Bragg peak cross-field were calculated to compare the proton angular and spatial distributions and energy deposition. Benchmark verifications from three different Monte Carlo simulations could be used to evaluate the residual proton fluence for the mean range and to estimate the depth and lateral dose distributions and the characteristic depths and lengths along the central axis as the physical indices corresponding to the evaluation of treatment effectiveness. The results showed a general agreement among codes, except that some deviations were found in the penumbra region. These calculated results are also particularly helpful for understanding primary and secondary proton components for stray radiation calculation and reference proton standard determination, as well as for determining lateral dose distribution performance in proton small-field dosimetry. By demonstrating these calculations, this work could serve as a guide to the recent field of Monte Carlo methods for therapeutic-energy protons.

  18. Rigorous-two-Steps scheme of TRIPOLI-4® Monte Carlo code validation for shutdown dose rate calculation

    NASA Astrophysics Data System (ADS)

    Jaboulay, Jean-Charles; Brun, Emeric; Hugot, François-Xavier; Huynh, Tan-Dat; Malouch, Fadhel; Mancusi, Davide; Tsilanizara, Aime

    2017-09-01

    After fission or fusion reactor shutdown, the activated structures emit decay photons. For maintenance operations, the radiation dose map must be established in the reactor building. Several calculation schemes have been developed to calculate the shutdown dose rate. These schemes are widely developed for fusion applications, and more precisely for the ITER tokamak. This paper presents the rigorous-two-steps scheme implemented at CEA. It is based on the TRIPOLI-4® Monte Carlo code and the inventory code MENDEL. The ITER shutdown dose rate benchmark has been carried out, and the results are in good agreement with those of the other participants.
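
    A schematic of a rigorous-two-steps chain is sketched below (this is not the TRIPOLI-4/MENDEL implementation): a neutron-transport step supplies a region-averaged flux, an activation/decay step gives the photon source after cooling, and a second transport step (here collapsed into an unshielded point-kernel estimate) gives the dose rate. Every number in the sketch is an assumed placeholder.

        import math

        # Schematic R2S chain with placeholder data (single activation product, Co-60-like).
        phi_n     = 1.0e14            # n/cm^2/s, from the neutron transport step (assumed)
        sigma_act = 1.0e-24           # cm^2, activation cross section (assumed)
        n_atoms   = 5.0e23            # target atoms in the component (assumed)
        half_life = 5.27 * 3.156e7    # s
        lam       = math.log(2.0) / half_life

        t_irr, t_cool = 2.0 * 3.156e7, 30.0 * 86400.0   # 2 y irradiation, 30 d cooling

        # Build-up during irradiation, then decay during cooling.
        activity = phi_n * sigma_act * n_atoms * (1.0 - math.exp(-lam * t_irr)) * math.exp(-lam * t_cool)

        # Second step reduced to an unshielded point-kernel dose-rate estimate at distance r.
        E_gamma, yield_per_decay = 1.25, 2.0            # MeV, photons per decay
        r_cm = 100.0
        flux_gamma  = activity * yield_per_decay / (4.0 * math.pi * r_cm**2)
        kerma_coeff = E_gamma * 0.0266 * 1.602e-10      # Gy cm^2 per photon (mu_en/rho ~ 0.0266 cm^2/g)
        print(f"activity ~ {activity:.3e} Bq, dose rate ~ {flux_gamma * kerma_coeff * 3600:.3e} Gy/h")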

  19. Validating the performance of correlated fission multiplicity implementation in radiation transport codes with subcritical neutron multiplication benchmark experiments

    DOE PAGES

    Arthur, Jennifer; Bahran, Rian; Hutchinson, Jesson; ...

    2018-06-14

    Historically, radiation transport codes have uncorrelated fission emissions. In reality, the particles emitted by both spontaneous and induced fissions are correlated in time, energy, angle, and multiplicity. This work validates the performance of various current Monte Carlo codes that take into account the underlying correlated physics of fission neutrons, specifically neutron multiplicity distributions. The performance of 4 Monte Carlo codes - MCNP®6.2, MCNP®6.2/FREYA, MCNP®6.2/CGMF, and PoliMi - was assessed using neutron multiplicity benchmark experiments. In addition, MCNP®6.2 simulations were run using JEFF-3.2 and JENDL-4.0, rather than ENDF/B-VII.1, data for 239Pu and 240Pu. The sensitive benchmark parameters that in this work represent the performance of each correlated fission multiplicity Monte Carlo code include the singles rate, the doubles rate, leakage multiplication, and Feynman histograms. Although it is difficult to determine which radiation transport code shows the best overall performance in simulating subcritical neutron multiplication inference benchmark measurements, it is clear that correlations exist between the underlying nuclear data utilized by (or generated by) the various codes, and the correlated neutron observables of interest. This could prove useful in nuclear data validation and evaluation applications, in which a particular moment of the neutron multiplicity distribution is of more interest than the other moments. It is also quite clear that, because transport is handled by MCNP®6.2 in 3 of the 4 codes, with the 4th code (PoliMi) being based on an older version of MCNP®, the differences in correlated neutron observables of interest are most likely due to the treatment of fission event generation in each of the different codes, as opposed to the radiation transport.
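
    For readers unfamiliar with the listed observables, the sketch below shows how a Feynman histogram point (excess variance-to-mean ratio Y) and a singles rate are obtained from a list of detection times; the pulse train here is synthetic and uncorrelated, so Y should come out near zero, unlike the correlated fission chains measured in the benchmarks.

        import numpy as np

        rng = np.random.default_rng(1)

        # Synthetic, uncorrelated pulse train (seconds); real analyses use measured or simulated times.
        t_end = 100.0
        times = np.sort(rng.uniform(0.0, t_end, size=50_000))

        def feynman_Y(times, gate):
            """Excess variance-to-mean ratio Y = var/mean - 1 for a given gate width."""
            counts = np.histogram(times, bins=np.arange(0.0, t_end + gate, gate))[0]
            return counts.var() / counts.mean() - 1.0

        # A Poisson (uncorrelated) train gives Y ~ 0; correlated chains give Y > 0 that grows
        # and saturates with gate width, which is what the Feynman histogram displays.
        for gate in (1e-4, 1e-3, 1e-2):
            print(f"gate {gate:.0e} s : Y = {feynman_Y(times, gate):+.4f}")

        print(f"singles rate ~ {len(times) / t_end:.1f} counts/s")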

  20. Comparative Dosimetric Estimates of a 25 keV Electron Micro-beam with three Monte Carlo Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mainardi, Enrico; Donahue, Richard J.; Blakely, Eleanor A.

    2002-09-11

    The calculations presented compare the different performances of the three Monte Carlo codes PENELOPE-1999, MCNP-4C and PITS for the evaluation of dose profiles from a 25 keV electron micro-beam traversing individual cells. The overall model of a cell is a water cylinder equivalent for the three codes but with a different internal scoring geometry: hollow cylinders for PENELOPE and MCNP, whereas spheres are used for the PITS code. A cylindrical cell geometry with scoring volumes in the shape of hollow cylinders was initially selected for PENELOPE and MCNP because of its superior simulation of the actual shape and dimensions of a cell and for its improved computer-time efficiency compared to spherical internal volumes. Some of the transfer points and energy transfers that constitute a radiation track may actually fall in the space between spheres, outside the spherical scoring volumes. This internal geometry, along with the PENELOPE algorithm, drastically reduced the computer time when using this code compared with event-by-event Monte Carlo codes like PITS. This preliminary work has been important to address dosimetric estimates at low electron energies. It demonstrates that codes like PENELOPE can be used for dose evaluation, even with such small geometries and energies involved, which are far below the normal use for which the code was created. Further work (initiated in Summer 2002) is still needed, however, to create a user-code for PENELOPE that allows uniform comparison of exact cell geometries, integral volumes and also microdosimetric scoring quantities, a field where track-structure codes like PITS, written for this purpose, are believed to be superior.

  1. Validating the performance of correlated fission multiplicity implementation in radiation transport codes with subcritical neutron multiplication benchmark experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arthur, Jennifer; Bahran, Rian; Hutchinson, Jesson

    Historically, radiation transport codes have uncorrelated fission emissions. In reality, the particles emitted by both spontaneous and induced fissions are correlated in time, energy, angle, and multiplicity. This work validates the performance of various current Monte Carlo codes that take into account the underlying correlated physics of fission neutrons, specifically neutron multiplicity distributions. The performance of 4 Monte Carlo codes - MCNP®6.2, MCNP®6.2/FREYA, MCNP®6.2/CGMF, and PoliMi - was assessed using neutron multiplicity benchmark experiments. In addition, MCNP®6.2 simulations were run using JEFF-3.2 and JENDL-4.0, rather than ENDF/B-VII.1, data for 239Pu and 240Pu. The sensitive benchmark parameters that in this work represent the performance of each correlated fission multiplicity Monte Carlo code include the singles rate, the doubles rate, leakage multiplication, and Feynman histograms. Although it is difficult to determine which radiation transport code shows the best overall performance in simulating subcritical neutron multiplication inference benchmark measurements, it is clear that correlations exist between the underlying nuclear data utilized by (or generated by) the various codes, and the correlated neutron observables of interest. This could prove useful in nuclear data validation and evaluation applications, in which a particular moment of the neutron multiplicity distribution is of more interest than the other moments. It is also quite clear that, because transport is handled by MCNP®6.2 in 3 of the 4 codes, with the 4th code (PoliMi) being based on an older version of MCNP®, the differences in correlated neutron observables of interest are most likely due to the treatment of fission event generation in each of the different codes, as opposed to the radiation transport.

  2. Analysis of Naval Ammunition Stock Positioning

    DTIC Science & Technology

    2015-12-01

    A Monte Carlo simulation model was developed to simulate the expected cost and delivery performance of positioning ammunition stockpiles at coastal Navy facilities, with the simulation determining assigned probabilities for site-to-site locations. Subject terms: supply chain management, Monte Carlo simulation, risk, delivery performance, stock positioning.

  3. Benchmarking and validation of a Geant4-SHADOW Monte Carlo simulation for dose calculations in microbeam radiation therapy.

    PubMed

    Cornelius, Iwan; Guatelli, Susanna; Fournier, Pauline; Crosbie, Jeffrey C; Sanchez Del Rio, Manuel; Bräuer-Krisch, Elke; Rosenfeld, Anatoly; Lerch, Michael

    2014-05-01

    Microbeam radiation therapy (MRT) is a synchrotron-based radiotherapy modality that uses high-intensity beams of spatially fractionated radiation to treat tumours. The rapid evolution of MRT towards clinical trials demands accurate treatment planning systems (TPS), as well as independent tools for the verification of TPS calculated dose distributions in order to ensure patient safety and treatment efficacy. Monte Carlo computer simulation represents the most accurate method of dose calculation in patient geometries and is best suited for the purpose of TPS verification. A Monte Carlo model of the ID17 biomedical beamline at the European Synchrotron Radiation Facility has been developed, including recent modifications, using the Geant4 Monte Carlo toolkit interfaced with the SHADOW X-ray optics and ray-tracing libraries. The code was benchmarked by simulating dose profiles in water-equivalent phantoms subject to irradiation by broad-beam (without spatial fractionation) and microbeam (with spatial fractionation) fields, and comparing against those calculated with a previous model of the beamline developed using the PENELOPE code. Validation against additional experimental dose profiles in water-equivalent phantoms subject to broad-beam irradiation was also performed. Good agreement between codes was observed, with the exception of out-of-field doses and toward the field edge for larger field sizes. Microbeam results showed good agreement between both codes and experimental results within uncertainties. Results of the experimental validation showed agreement for different beamline configurations. The asymmetry in the out-of-field dose profiles due to polarization effects was also investigated, yielding important information for the treatment planning process in MRT. This work represents an important step in the development of a Monte Carlo-based independent verification tool for treatment planning in MRT.

  4. Application of the A.C. Admittance Technique to Double Layer Studies on Polycrystalline Gold Electrodes

    DTIC Science & Technology

    1992-02-24

    A detailed examination of the dependence of the a.c. admittance ... Subject terms: double layer at gold/solution interface, a.c. admittance techniques, constant phase element model. Department of Chemistry, University of California, Davis, CA 95616, USA; on leave from the Instituto de Fisica e Quimica de Sao Carlos, USP, Sao Carlos, SP 13560, Brazil.

  5. General Monte Carlo reliability simulation code including common mode failures and HARP fault/error-handling

    NASA Technical Reports Server (NTRS)

    Platt, M. E.; Lewis, E. E.; Boehm, F.

    1991-01-01

    A Monte Carlo Fortran computer program was developed that uses two variance-reduction techniques for computing system reliability, applicable to very large, highly reliable fault-tolerant systems. The program is consistent with the hybrid automated reliability predictor (HARP) code, which employs behavioral decomposition and complex fault/error-handling models. This new capability, called MC-HARP, efficiently solves reliability models with non-constant failure rates (Weibull). Common-mode failure modeling is also a specialty of the code.
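
    As a toy illustration of Monte Carlo reliability simulation with non-constant (Weibull) failure rates, and not of MC-HARP's variance-reduction scheme or fault-handling models, the sketch below estimates the failure probability of a 2-out-of-3 system by plain sampling; all parameters are assumed.

        import numpy as np

        rng = np.random.default_rng(7)

        # 2-out-of-3 redundant system observed over a mission time; component lifetimes are Weibull.
        shape, scale = 1.5, 2.0e4     # Weibull shape and scale (hours), assumed
        mission_time = 1.0e4          # hours
        n_trials = 200_000

        failure_times  = scale * rng.weibull(shape, size=(n_trials, 3))
        working_at_end = (failure_times > mission_time).sum(axis=1)
        unreliability  = np.mean(working_at_end < 2)   # system fails if fewer than 2 components survive

        stderr = np.sqrt(unreliability * (1 - unreliability) / n_trials)
        print(f"P(system failure by {mission_time:.0f} h) = {unreliability:.4e} +/- {stderr:.1e}")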

  6. The Serpent Monte Carlo Code: Status, Development and Applications in 2013

    NASA Astrophysics Data System (ADS)

    Leppänen, Jaakko; Pusa, Maria; Viitanen, Tuomas; Valtavirta, Ville; Kaltiaisenaho, Toni

    2014-06-01

    The Serpent Monte Carlo reactor physics burnup calculation code has been developed at VTT Technical Research Centre of Finland since 2004, and is currently used in 100 universities and research organizations around the world. This paper presents the brief history of the project, together with the currently available methods and capabilities and plans for future work. Typical user applications are introduced in the form of a summary review on Serpent-related publications over the past few years.

  7. Monte Carlo dosimetry for 103Pd, 125I, and 131Cs ocular brachytherapy with various plaque models using an eye phantom

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lesperance, Marielle; Martinov, M.; Thomson, R. M., E-mail: rthomson@physics.carleton.ca

    Purpose: To investigate dosimetry for ocular brachytherapy for a range of eye plaque models containing 103Pd, 125I, or 131Cs seeds with model-based dose calculations. Methods: Five representative plaque models are developed based on a literature review and are compared to the standardized COMS plaque, including plaques consisting of a stainless steel backing and acrylic insert, and gold alloy backings with: short collimating lips and acrylic insert, no lips and silicone polymer insert, no lips and a thin acrylic layer, and individual collimating slots for each seed within the backing and no insert. Monte Carlo simulations are performed using the EGSnrc user-code BrachyDose for single and multiple seed configurations for the plaques in water and within an eye model (including nonwater media). Simulations under TG-43 assumptions are also performed, i.e., with the same seed configurations in water, neglecting interseed and plaque effects. Maximum and average doses to ocular structures as well as isodose contours are compared for simulations of each radionuclide within the plaque models. Results: The presence of the plaque affects the dose distribution substantially along the plaque axis for both single seed and multiseed simulations of each plaque design in water. Of all the plaque models, the COMS plaque generally has the largest effect on the dose distribution in water along the plaque axis. Differences between doses for single and multiple seed configurations vary between plaque models and radionuclides. Collimation is most substantial for the plaque with individual collimating slots. For plaques in the full eye model, average dose in the tumor region differs from those for the TG-43 simulations by up to 10% for 125I and 131Cs, and up to 17% for 103Pd, and in the lens region by up to 29% for 125I, 34% for 103Pd, and 28% for 131Cs. For the same prescription dose to the tumor apex, the lowest doses to critical ocular structures are generally delivered with plaques containing 103Pd seeds. Conclusions: The combined effects of ocular and plaque media on dose are significant and vary with plaque model and radionuclide, suggesting the importance of model-based dose calculations employing accurate ocular and plaque media and geometries for eye plaque brachytherapy.
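
    The "TG-43 assumptions" the abstract compares against reduce, for a point source, to the dose-rate equation D(r) = S_K * Lambda * (r0/r)^2 * g(r) * phi_an(r). The sketch below evaluates that equation with placeholder radial dose function and anisotropy values rather than published consensus seed data; plaque backings, inserts and ocular media are, by construction, not represented.

        import numpy as np

        # Point-source TG-43 dose rate with placeholder tables (not consensus data for any real seed).
        S_K    = 1.0        # air-kerma strength, U (= cGy cm^2/h)
        LAMBDA = 1.018      # dose-rate constant, cGy/(h*U), illustrative
        r0     = 1.0        # reference distance, cm

        r_tab   = np.array([0.5, 1.0, 2.0, 3.0, 5.0])        # cm
        g_tab   = np.array([1.05, 1.00, 0.85, 0.70, 0.45])   # radial dose function (placeholder)
        phi_tab = np.array([0.97, 0.96, 0.95, 0.94, 0.93])   # anisotropy factor (placeholder)

        def dose_rate(r):
            g   = np.interp(r, r_tab, g_tab)
            phi = np.interp(r, r_tab, phi_tab)
            return S_K * LAMBDA * (r0 / r) ** 2 * g * phi    # cGy/h

        for r in (0.5, 1.0, 2.0, 5.0):
            print(f"r = {r:.1f} cm : {dose_rate(r):.3f} cGy/h per unit S_K")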

  8. Low-energy photons in high-energy photon fields--Monte Carlo generated spectra and a new descriptive parameter.

    PubMed

    Chofor, Ndimofor; Harder, Dietrich; Willborn, Kay; Rühmann, Antje; Poppe, Björn

    2011-09-01

    The varying low-energy contribution to the photon spectra at points within and around radiotherapy photon fields is associated with variations in the responses of non-water-equivalent dosimeters and in the water-to-material dose conversion factors for tissues such as the red bone marrow. In addition, the presence of low-energy photons in the photon spectrum enhances the RBE in general and in particular for the induction of second malignancies. The present study discusses the general rules valid for the low-energy spectral component of radiotherapeutic photon beams at points within and in the periphery of the treatment field, taking as an example the Siemens Primus linear accelerator at 6 MV and 15 MV. The photon spectra at these points and their typical variations due to the target system, attenuation, single and multiple Compton scattering, are described by the Monte Carlo method, using the code BEAMnrc/EGSnrc. A survey of the role of low energy photons in the spectra within and around radiotherapy fields is presented. In addition to the spectra, some data compression has proven useful to support the overview of the behaviour of the low-energy component. A characteristic indicator of the presence of low-energy photons is the dose fraction attributable to photons with energies not exceeding 200 keV, termed P(D)(200 keV). Its values are calculated for different depths and lateral positions within a water phantom. For a pencil beam of 6 or 15 MV primary photons in water, the radial distribution of P(D)(200 keV) is bell-shaped, with a wide-ranging exponential tail of half value 6 to 7 cm. The P(D)(200 keV) value obtained on the central axis of a photon field shows an approximately proportional increase with field size. Out-of-field P(D)(200 keV) values are up to an order of magnitude higher than on the central axis for the same irradiation depth. The 2D pattern of P(D)(200 keV) for a radiotherapy field visualizes the regions, e.g. at the field margin, where changes of detector responses and dose conversion factors, as well as increases of the RBE, have to be anticipated. The parameter P(D)(200 keV) can also be used as guidance supporting the selection of a calibration geometry suitable for radiation dosimeters to be used in small radiation fields. Copyright © 2011. Published by Elsevier GmbH.
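
    A minimal sketch of how an indicator like P(D)(200 keV) can be evaluated from a scored photon fluence spectrum is given below, using collision kerma as the dose proxy; the spectrum and the water mass energy-absorption coefficients are rough placeholder values, not the BEAMnrc/EGSnrc-scored spectra of the study.

        import numpy as np

        # Fraction of (kerma-approximated) dose delivered by photons with E <= 200 keV.
        E_MeV   = np.array([0.05, 0.10, 0.20, 0.50, 1.0, 2.0, 5.0])       # bin centres
        fluence = np.array([2.0, 3.0, 2.5, 4.0, 5.0, 3.0, 0.5])           # photons per bin (arbitrary)
        muen_w  = np.array([0.0419, 0.0255, 0.0297, 0.0330, 0.0309, 0.0260, 0.0191])  # cm^2/g, ~water

        dose_per_bin = fluence * E_MeV * muen_w
        P_D_200 = dose_per_bin[E_MeV <= 0.2].sum() / dose_per_bin.sum()
        print(f"P_D(200 keV) = {P_D_200:.3f}")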

  9. Monte Carlo study and design of system for implementation of Rotational Total Skin Electron Irradiation technique

    NASA Astrophysics Data System (ADS)

    Ansari, M.; Abbasi Davani, F.; Lamehi Rashti, M.; Monadi, Sh.; Emami, H.

    2018-05-01

    The total skin electron irradiation (TSEI) technique is used in the treatment of mycosis fungoides. The implementation of this technique requires non-standard measurements and complex dosimetry methods. Depending on the linear accelerator (Linac) type, bunker size, room dimensions and dosimetry equipment, the design of instruments for appropriate setup and implementation of TSEI varies between radiation therapy centers. The studies reported in this article provide an introduction to implementing this method for the first time in Iran, and the results can be used by centers with similar specifications worldwide. This article determines the electron beam characteristics of TSEI for the only electron accelerator located at the radiation center of the Seyed Alshohada Hospital of Isfahan (NEPTUN 10PC) by performing Monte Carlo simulations with EGSnrc-based codes (BEAMnrc and DOSXYZnrc). For the best uniformity of the vertical profile, the optimal gantry angle was determined at SSD = 350 cm. The effect of the degrader plane, located at a distance of 20 cm from the patient surface, was evaluated in terms of the energy reduction of the beam, the opening of the electron beam field and the homogeneity of the dose distribution. The transversal dose distribution from the whole treatment with the Stanford technique (six dual fields) and the rotational technique was simulated in a CT-based anthropomorphic phantom. Also, the percentage depth dose in the head, neck, thorax, abdomen and legs was obtained for both methods. The simulation results show that a 20° angle between the horizontal and the beam central axis is optimal to provide the best vertical dose uniformity. The mean energy decreases from 6.1 MeV (at the exit window) to 3 MeV (at the treatment surface) when a degrader 0.8 cm thick is placed in front of the treatment plane. The FWHM of the angular distribution of the electron beam increases from 15° at SSD = 100 cm to more than 30° at the treatment surface after traversing the PMMA degrader. The MC-calculated percentage depth dose curves in different organs of the anthropomorphic phantom for RTSEI indicate that the depth of maximum dose is at the surface of the phantom and the 80% isodose curve is formed at a depth of less than 4 mm. The results also show that, with the degrader plane in front of the patient, the degree of dose homogeneity for the Stanford and rotational techniques is the same.

  10. Metallic artifact mitigation and organ-constrained tissue assignment for Monte Carlo calculations of permanent implant lung brachytherapy.

    PubMed

    Sutherland, J G H; Miksys, N; Furutani, K M; Thomson, R M

    2014-01-01

    To investigate methods of generating accurate patient-specific computational phantoms for the Monte Carlo calculation of lung brachytherapy patient dose distributions. Four metallic artifact mitigation methods are applied to six lung brachytherapy patient computed tomography (CT) images: simple threshold replacement (STR) identifies high CT values in the vicinity of the seeds and replaces them with estimated true values; fan beam virtual sinogram replaces artifact-affected values in a virtual sinogram and performs a filtered back-projection to generate a corrected image; 3D median filter replaces voxel values that differ from the median value in a region of interest surrounding the voxel and then applies a second filter to reduce noise; and a combination of fan beam virtual sinogram and STR. Computational phantoms are generated from artifact-corrected and uncorrected images using several tissue assignment schemes: both lung-contour constrained and unconstrained global schemes are considered. Voxel mass densities are assigned based on voxel CT number or using the nominal tissue mass densities. Dose distributions are calculated using the EGSnrc user-code BrachyDose for (125)I, (103)Pd, and (131)Cs seeds and are compared directly as well as through dose volume histograms and dose metrics for target volumes surrounding surgical sutures. Metallic artifact mitigation techniques vary in ability to reduce artifacts while preserving tissue detail. Notably, images corrected with the fan beam virtual sinogram have reduced artifacts but residual artifacts near sources remain requiring additional use of STR; the 3D median filter removes artifacts but simultaneously removes detail in lung and bone. Doses vary considerably between computational phantoms with the largest differences arising from artifact-affected voxels assigned to bone in the vicinity of the seeds. Consequently, when metallic artifact reduction and constrained tissue assignment within lung contours are employed in generated phantoms, this erroneous assignment is reduced, generally resulting in higher doses. Lung-constrained tissue assignment also results in increased doses in regions of interest due to a reduction in the erroneous assignment of adipose to voxels within lung contours. Differences in dose metrics calculated for different computational phantoms are sensitive to radionuclide photon spectra with the largest differences for (103)Pd seeds and smallest but still considerable differences for (131)Cs seeds. Despite producing differences in CT images, dose metrics calculated using the STR, fan beam + STR, and 3D median filter techniques produce similar dose metrics. Results suggest that the accuracy of dose distributions for permanent implant lung brachytherapy is improved by applying lung-constrained tissue assignment schemes to metallic artifact corrected images.
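
    Two of the corrections named above can be illustrated on a synthetic volume (this is not the study's implementation, and the thresholds, radii and HU values are invented): a simple-threshold-replacement-like step that overwrites high values near known seed positions, and a 3-D median filter.

        import numpy as np
        from scipy.ndimage import median_filter

        rng = np.random.default_rng(0)

        # Synthetic CT volume (HU) with bright artifact-affected neighbourhoods around three "seeds".
        ct = rng.normal(-750.0, 30.0, size=(40, 40, 40))          # lung-like background
        seeds = [(10, 10, 10), (20, 20, 20), (30, 28, 15)]
        for z, y, x in seeds:
            ct[z-2:z+3, y-2:y+3, x-2:x+3] += 2500.0

        def threshold_replace(vol, seeds, radius=4, threshold=300.0, fill=None):
            """STR-like step: replace high values within a radius of each seed by an estimated tissue value."""
            out, fill = vol.copy(), (np.median(vol) if fill is None else fill)
            zz, yy, xx = np.indices(vol.shape)
            for z, y, x in seeds:
                near = (zz - z)**2 + (yy - y)**2 + (xx - x)**2 <= radius**2
                out[near & (out > threshold)] = fill
            return out

        ct_str = threshold_replace(ct, seeds)
        ct_med = median_filter(ct, size=3)                        # 3-D median filter alternative

        for name, vol in (("raw", ct), ("STR", ct_str), ("median", ct_med)):
            print(f"{name:>6}: max HU = {vol.max():8.1f}, mean HU = {vol.mean():8.1f}")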

  11. Deriving detector-specific correction factors for rectangular small fields using a scintillator detector.

    PubMed

    Qin, Yujiao; Zhong, Hualiang; Wen, Ning; Snyder, Karen; Huang, Yimei; Chetty, Indrin J

    2016-11-08

    The goal of this study was to investigate small field output factors (OFs) for flattening filter-free (FFF) beams on a dedicated stereotactic linear accelerator-based system. From this data, the collimator exchange effect was quantified, and detector-specific correction factors were generated. Output factors for 16 jaw-collimated small fields (from 0.5 to 2 cm) were measured using five different detectors including an ion chamber (CC01), a stereotactic field diode (SFD), a diode detector (Edge), Gafchromic film (EBT3), and a plastic scintillator detector (PSD, W1). Chamber, diodes, and PSD measurements were performed in a Wellhofer water tank, while films were irradiated in solid water at 100 cm source-to-surface distance and 10 cm depth. The collimator exchange effect was quantified for rectangular fields. Monte Carlo (MC) simulations of the measured configurations were also performed using the EGSnrc/DOSXYZnrc code. Output factors measured by the PSD and verified against film and MC calculations were chosen as the benchmark measurements. Compared with the plastic scintillator detector (PSD), the small-volume ion chamber (CC01) underestimated output factors by an average of -1.0% ± 4.9% (max. = -11.7% for the 0.5 × 0.5 cm2 square field). The stereotactic diode (SFD) overestimated output factors by 2.5% ± 0.4% (max. = 3.3% for the 0.5 × 1 cm2 rectangular field). The other diode detector (Edge) also overestimated the OFs by an average of 4.2% ± 0.9% (max. = 6.0% for the 1 × 1 cm2 square field). Gafchromic film (EBT3) measurements and MC calculations agreed with the scintillator detector measurements within 0.6% ± 1.8% and 1.2% ± 1.5%, respectively. Across all the X and Y jaw combinations, the average collimator exchange effect was computed: 1.4% ± 1.1% (CC01), 5.8% ± 5.4% (SFD), 5.1% ± 4.8% (Edge diode), 3.5% ± 5.0% (Monte Carlo), 3.8% ± 4.7% (film), and 5.5% ± 5.1% (PSD). Small field detectors should be used with caution and with a clear understanding of their behaviors, especially for FFF beams and small, elongated fields. The scintillator detector exhibited good agreement against Gafchromic film measurements and MC simulations over the range of field sizes studied. The collimator exchange effect was found to be important at these small field sizes. Detector-specific correction factors were computed using the scintillator measurements as the benchmark. © 2016 The Authors.
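
    The arithmetic behind the reported quantities is simple enough to show with invented numbers (these are not the measured values): a detector-specific correction factor rescales a detector's output factor to the PSD benchmark, and the collimator exchange effect compares rectangular fields whose X and Y jaw settings are swapped.

        # Illustrative numbers only, for a hypothetical 0.5 x 1 cm^2 jaw-collimated field.
        of_psd   = 0.620    # benchmark output factor (plastic scintillator detector)
        of_diode = 0.640    # diode over-response in small fields
        of_cc01  = 0.585    # small-volume chamber under-response (volume averaging)

        k_diode = of_psd / of_diode   # detector-specific output correction factor
        k_cc01  = of_psd / of_cc01
        print(f"k(diode) = {k_diode:.3f}, k(CC01) = {k_cc01:.3f}")

        # Collimator exchange effect: same nominal field area, X and Y jaw settings swapped.
        of_x05_y20 = 0.655            # X = 0.5 cm, Y = 2 cm
        of_x20_y05 = 0.620            # X = 2 cm,  Y = 0.5 cm
        exchange = 100.0 * (of_x05_y20 - of_x20_y05) / (0.5 * (of_x05_y20 + of_x20_y05))
        print(f"collimator exchange effect = {exchange:.1f} %")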

  12. Physical models, cross sections, and numerical approximations used in MCNP and GEANT4 Monte Carlo codes for photon and electron absorbed fraction calculation.

    PubMed

    Yoriyaz, Hélio; Moralles, Maurício; Siqueira, Paulo de Tarso Dalledone; Guimarães, Carla da Costa; Cintra, Felipe Belonsi; dos Santos, Adimir

    2009-11-01

    Radiopharmaceutical applications in nuclear medicine require a detailed dosimetric estimate of the radiation energy delivered to human tissues. Over the past years, several publications have addressed the problem of internal dose estimation in volumes of several sizes considering photon and electron sources. Most of them used Monte Carlo radiation transport codes. Despite the widespread use of these codes, owing to the variety of resources and capabilities they offer for dose calculations, several aspects such as the physical models, cross sections, and numerical approximations used in the simulations still remain objects of study. An accurate dose estimate depends on the correct selection of a set of simulation options that should be carefully chosen. This article presents an analysis of several simulation options provided by two of the most widely used codes: MCNP and GEANT4. For this purpose, comparisons of absorbed fraction estimates obtained with different physical models, cross sections, and numerical approximations are presented for spheres of several sizes composed of five different biological tissues. Considerable discrepancies have been found in some cases, not only between the different codes but also between different cross sections and algorithms in the same code. The maximum differences found between the two codes are 5.0% and 10%, respectively, for photons and electrons. Even for problems as simple as spheres and uniform radiation sources, the set of parameters chosen by any Monte Carlo code significantly affects the final results of a simulation, demonstrating the importance of the correct choice of parameters in the simulation.

  13. Parallel Grand Canonical Monte Carlo (ParaGrandMC) Simulation Code

    NASA Technical Reports Server (NTRS)

    Yamakov, Vesselin I.

    2016-01-01

    This report provides an overview of the Parallel Grand Canonical Monte Carlo (ParaGrandMC) simulation code. This is a highly scalable parallel FORTRAN code for simulating the thermodynamic evolution of metal alloy systems at the atomic level, and predicting the thermodynamic state, phase diagram, chemical composition and mechanical properties. The code is designed to simulate multi-component alloy systems, predict solid-state phase transformations such as austenite-martensite transformations, precipitate formation, recrystallization, capillary effects at interfaces, surface absorption, etc., which can aid the design of novel metallic alloys. While the software is mainly tailored for modeling metal alloys, it can also be used for other types of solid-state systems, and to some degree for liquid or gaseous systems, including multiphase systems forming solid-liquid-gas interfaces.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hiller, Mauritius M.; Veinot, Kenneth G.; Easterly, Clay E.

    In this study, methods are addressed to reduce the computational time to compute organ-dose rate coefficients using Monte Carlo techniques. Several variance reduction techniques are compared including the reciprocity method, importance sampling, weight windows and the use of the ADVANTG software package. For low-energy photons, the runtime was reduced by a factor of 10^5 when using the reciprocity method for kerma computation for immersion of a phantom in contaminated water. This is particularly significant since impractically long simulation times are required to achieve reasonable statistical uncertainties in organ dose for low-energy photons in this source medium and geometry. Although the MCNP Monte Carlo code is used in this paper, the reciprocity technique can be used equally well with other Monte Carlo codes.

  15. The Monte Carlo code MCPTV--Monte Carlo dose calculation in radiation therapy with carbon ions.

    PubMed

    Karg, Juergen; Speer, Stefan; Schmidt, Manfred; Mueller, Reinhold

    2010-07-07

    The Monte Carlo code MCPTV is presented. MCPTV is designed for dose calculation in treatment planning in radiation therapy with particles and especially carbon ions. MCPTV has a voxel-based concept and can perform a fast calculation of the dose distribution on patient CT data. Material and density information from CT are taken into account. Electromagnetic and nuclear interactions are implemented. Furthermore the algorithm gives information about the particle spectra and the energy deposition in each voxel. This can be used to calculate the relative biological effectiveness (RBE) for each voxel. Depth dose distributions are compared to experimental data giving good agreement. A clinical example is shown to demonstrate the capabilities of the MCPTV dose calculation.

  16. Parallel Monte Carlo transport modeling in the context of a time-dependent, three-dimensional multi-physics code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Procassini, R.J.

    1997-12-31

    The fine-scale, multi-space resolution that is envisioned for accurate simulations of complex weapons systems in three spatial dimensions implies flop-rate and memory-storage requirements that will only be obtained in the near future through the use of parallel computational techniques. Since the Monte Carlo transport models in these simulations usually stress both of these computational resources, they are prime candidates for parallelization. The MONACO Monte Carlo transport package, which is currently under development at LLNL, will utilize two types of parallelism within the context of a multi-physics design code: decomposition of the spatial domain across processors (spatial parallelism) and distribution of particles in a given spatial subdomain across additional processors (particle parallelism). This implementation of the package will utilize explicit data communication between domains (message passing). Such a parallel implementation of a Monte Carlo transport model will result in non-deterministic communication patterns. The communication of particles between subdomains during a Monte Carlo time step may require a significant level of effort to achieve a high parallel efficiency.

  17. Light transport feature for SCINFUL.

    PubMed

    Etaati, G R; Ghal-Eh, N

    2008-03-01

    An extended version of the scintillator response function prediction code SCINFUL has been developed by incorporating PHOTRACK, a Monte Carlo light transport code. Comparisons of calculated and experimental results for organic scintillators exposed to neutrons show that the extended code improves the predictive capability of SCINFUL.

  18. SPAMCART: a code for smoothed particle Monte Carlo radiative transfer

    NASA Astrophysics Data System (ADS)

    Lomax, O.; Whitworth, A. P.

    2016-10-01

    We present a code for generating synthetic spectral energy distributions and intensity maps from smoothed particle hydrodynamics simulation snapshots. The code is based on the Lucy Monte Carlo radiative transfer method, i.e. it follows discrete luminosity packets as they propagate through a density field, and then uses their trajectories to compute the radiative equilibrium temperature of the ambient dust. The sources can be extended and/or embedded, and discrete and/or diffuse. The density is not mapped onto a grid, and therefore the calculation is performed at exactly the same resolution as the hydrodynamics. We present two example calculations using this method. First, we demonstrate that the code strictly adheres to Kirchhoff's law of radiation. Secondly, we present synthetic intensity maps and spectra of an embedded protostellar multiple system. The algorithm uses data structures that are already constructed for other purposes in modern particle codes. It is therefore relatively simple to implement.

  19. The specific purpose Monte Carlo code McENL for simulating the response of epithermal neutron lifetime well logging tools

    NASA Astrophysics Data System (ADS)

    Prettyman, T. H.; Gardner, R. P.; Verghese, K.

    1993-08-01

    A new specific purpose Monte Carlo code called McENL for modeling the time response of epithermal neutron lifetime tools is described. The weight windows technique, employing splitting and Russian roulette, is used with an automated importance function based on the solution of an adjoint diffusion model to improve the code efficiency. Complete composition and density correlated sampling is also included in the code, and can be used to study the effect on tool response of small variations in the formation, borehole, or logging tool composition and density. An illustration of the latter application is given for the density of a thermal neutron filter. McENL was benchmarked against test-pit data for the Mobil pulsed neutron porosity tool and was found to be very accurate. Results of the experimental validation and details of code performance are presented.
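
    As a generic illustration of the weight-window technique mentioned above (splitting plus Russian roulette), and not of McENL's adjoint-diffusion-based importance function, the sketch below applies a fixed window to a particle's statistical weight; the bounds and survival weight are arbitrary.

        import random

        # Generic weight-window check: a particle whose weight leaves the window is split or rouletted.
        W_LOW, W_HIGH, W_SURVIVE = 0.25, 1.0, 0.5   # window bounds and roulette survival weight (arbitrary)

        def apply_weight_window(weight):
            """Return the list of weights of the particle(s) that continue transport."""
            if weight > W_HIGH:
                n = int(weight / W_HIGH) + 1            # split into n copies of equal weight
                return [weight / n] * n
            if weight < W_LOW:
                if random.random() < weight / W_SURVIVE:
                    return [W_SURVIVE]                  # survives roulette with boosted weight
                return []                               # killed; expected weight is preserved
            return [weight]

        random.seed(2)
        for w in (3.2, 0.6, 0.05):
            print(f"incoming weight {w:>4}: continuing weights {apply_weight_window(w)}")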

  20. Monte Carlo Particle Lists: MCPL

    NASA Astrophysics Data System (ADS)

    Kittelmann, T.; Klinkby, E.; Knudsen, E. B.; Willendrup, P.; Cai, X. X.; Kanaki, K.

    2017-09-01

    A binary format with lists of particle state information, for interchanging particles between various Monte Carlo simulation applications, is presented. Portable C code for file manipulation is made available to the scientific community, along with converters and plugins for several popular simulation packages.
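
    Purely to illustrate the idea of a fixed-width binary particle-state record, and emphatically not the actual MCPL file layout or API, the sketch below packs and unpacks position, direction, energy, weight and a PDG particle code with Python's struct module.

        import struct

        # Generic fixed-width record (hypothetical layout): x,y,z, ux,uy,uz, energy, weight, PDG code.
        RECORD = struct.Struct("<3d3ddd i")

        def pack_particle(pos, direction, energy, weight, pdg):
            return RECORD.pack(*pos, *direction, energy, weight, pdg)

        def unpack_particle(buf):
            *vals, pdg = RECORD.unpack(buf)
            return {"pos": vals[0:3], "dir": vals[3:6], "E": vals[6], "w": vals[7], "pdg": pdg}

        buf = pack_particle((0.0, 0.0, 10.0), (0.0, 0.0, -1.0), 2.0, 1.0, 2112)   # a 2 MeV neutron
        print(RECORD.size, "bytes per record ->", unpack_particle(buf))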

  1. Angular and radial dependence of the energy response factor for LiF-TLD micro-rods in a 125I permanent implant source.

    PubMed

    Mobit, Paul; Badragan, Iulian

    2006-01-01

    EGSnrc Monte Carlo simulations were used to calculate the angular and radial dependence of the energy response factor for LiF-thermoluminescence dosemeters (TLDs) irradiated with a commercially available (125)I permanent brachytherapy source. The LiF-TLDs were modelled as cylindrical micro-rods of length 6 mm and with diameters of 1 mm and 5 mm. The results show that for a LiF-TLD micro-rod of 1 mm diameter, the energy response relative to (60)Co gamma rays is 1.406 +/- 0.3% for a polar angle of 90 degrees and radial distance of 1.0 cm. When the diameter of the micro-rod is increased from 1 to 5 mm, the energy response decreases to 1.32 +/- 0.3% at the same point. The variation with position of the energy response factor is not >5% in a 6 cm x 6 cm x 6 cm calculation grid for the 5 mm diameter micro-rod. The results show that there is a change in the photon spectrum with angle and radial distance, which causes the variation of the energy response.

  2. Cellular dosimetry calculations for Strontium-90 using Monte Carlo code PENELOPE.

    PubMed

    Hocine, Nora; Farlay, Delphine; Boivin, Georges; Franck, Didier; Agarande, Michelle

    2014-11-01

    To improve risk assessments associated with chronic exposure to Strontium-90 (Sr-90), for both the environment and human health, it is necessary to know the energy distribution in specific cells or tissues. Monte Carlo (MC) simulation codes are extremely useful tools for calculating deposited energy. The present work was focused on the validation of the MC code PENetration and Energy LOss of Positrons and Electrons (PENELOPE) and the assessment of the dose distribution to bone marrow cells from a point Sr-90 source located within the cortical bone. S-values (absorbed dose per unit cumulated activity) were calculated with Monte Carlo simulations using PENELOPE and Monte Carlo N-Particle eXtended (MCNPX). The cytoplasm, nucleus, cell surface, mouse femur bone and Sr-90 radiation source were simulated. Cells are assumed to be spherical, with the radii of the cell and cell nucleus ranging from 2 to 10 μm. The Sr-90 source is assumed to be uniformly distributed in the cell nucleus, cytoplasm and cell surface. The S-values calculated with PENELOPE agreed very well with the MCNPX results and the Medical Internal Radiation Dose (MIRD) values, with relative deviations of less than 4.5%. The dose distribution to mouse bone marrow cells showed that the cells localized near the cortical part received the maximum dose. The MC code PENELOPE may prove useful for cellular dosimetry involving radiation transport through materials other than water, or for complex distributions of radionuclides and geometries.

  3. Comparison of Space Radiation Calculations from Deterministic and Monte Carlo Transport Codes

    NASA Technical Reports Server (NTRS)

    Adams, J. H.; Lin, Z. W.; Nasser, A. F.; Randeniya, S.; Tripathi, r. K.; Watts, J. W.; Yepes, P.

    2010-01-01

    The presentation outline includes motivation, the radiation transport codes considered (HZETRN, UPROP, FLUKA, GEANT4), the space radiation cases considered (SPE and GCR), results for slab geometry, results for spherical geometry, and a summary.

  4. An update on the BQCD Hybrid Monte Carlo program

    NASA Astrophysics Data System (ADS)

    Haar, Taylor Ryan; Nakamura, Yoshifumi; Stüben, Hinnerk

    2018-03-01

    We present an update of BQCD, our Hybrid Monte Carlo program for simulating lattice QCD. BQCD is one of the main production codes of the QCDSF collaboration and is used by CSSM and in some Japanese finite temperature and finite density projects. Since the first publication of the code at Lattice 2010 the program has been extended in various ways. New features of the code include: dynamical QED, action modification in order to compute matrix elements by using Feynman-Hellmann theory, more trace measurements (like Tr(D^-n) for K, cSW and chemical potential reweighting), a more flexible integration scheme, polynomial filtering, term-splitting for RHMC, and a portable implementation of performance-critical parts employing SIMD.

  5. A method for radiological characterization based on fluence conversion coefficients

    NASA Astrophysics Data System (ADS)

    Froeschl, Robert

    2018-06-01

    Radiological characterization of components in accelerator environments is often required to ensure adequate radiation protection during maintenance, transport and handling, as well as for the selection of the proper disposal pathway. The relevant quantities are typically weighted sums of specific activities with radionuclide-specific weighting coefficients. Traditional methods based on Monte Carlo simulations either tally radionuclide creation events or score the particle fluences in the regions of interest and weight them off-line with radionuclide production cross sections. The presented method bases the radiological characterization on a set of fluence conversion coefficients. For a given irradiation profile and cool-down time, and given radionuclide production cross sections, material composition and radionuclide-specific weighting coefficients, a set of particle-type- and energy-dependent fluence conversion coefficients is computed. These fluence conversion coefficients can then be used in a Monte Carlo transport code to perform on-line weighting and directly obtain the desired radiological characterization, either by using built-in multiplier features, such as in the PHITS code, or by writing a dedicated user routine, such as for the FLUKA code. The presented method has been validated against the standard event-based methods directly available in Monte Carlo transport codes.
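
    The on-line weighting step reduces to a contraction of a scored fluence spectrum with pre-computed coefficients; the sketch below shows that arithmetic with invented cross sections, weighting coefficients and build-up/decay factors (none of them taken from the paper).

        import numpy as np

        # Off-line: c(E) = sum_k w_k * f_k * N_target * sigma_k(E)
        # On-line : characterization index = sum over bins of Phi(E) * c(E)
        E_bins  = np.array([0.1, 1.0, 10.0, 100.0])                    # MeV, bin centres
        sigma_k = np.array([[1.0e-27, 5.0e-27, 2.0e-26, 1.0e-26],      # nuclide 1 production xs, cm^2
                            [2.0e-28, 1.0e-27, 8.0e-27, 6.0e-27]])     # nuclide 2
        w_k     = np.array([4.0e-6, 1.2e-5])      # radionuclide-specific weighting coefficients (assumed)
        f_k     = np.array([0.8, 0.3])            # build-up x decay factors for the irradiation profile
        N_target = 1.0e22                         # target atoms per cm^3 of material (assumed)

        coeff = (w_k * f_k) @ sigma_k * N_target  # fluence conversion coefficient per energy bin

        phi = np.array([1.0e3, 5.0e3, 2.0e3, 1.0e2])   # scored fluence spectrum (cm^-2 per primary)
        print("characterization index per primary:", float(phi @ coeff))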

  6. Monte Carlo simulation of β γ coincidence system using plastic scintillators in 4π geometry

    NASA Astrophysics Data System (ADS)

    Dias, M. S.; Piuvezam-Filho, H.; Baccarelli, A. M.; Takeda, M. N.; Koskinas, M. F.

    2007-09-01

    A modified version of a Monte Carlo code called Esquema, developed at the Nuclear Metrology Laboratory at IPEN, São Paulo, Brazil, has been applied to simulate a 4πβ(PS)-γ coincidence system designed for primary radionuclide standardisation. This system consists of a plastic scintillator in 4π geometry, for alpha or electron detection, coupled to a NaI(Tl) counter for gamma-ray detection. The response curves for monoenergetic electrons and photons had been calculated previously with the PENELOPE code and were applied as input data to the Esquema code. The latter code simulates all the disintegration processes, from the precursor nucleus to the ground state of the daughter radionuclide. As a result, the curve of the observed disintegration rate as a function of the beta efficiency parameter can be simulated. A least-squares fit between the experimental activity values and the Monte Carlo calculation provided the actual radioactive source activity, without the need for conventional extrapolation procedures. Application of this methodology to 60Co and 133Ba radioactive sources is presented and showed results in good agreement with a conventional proportional counter 4πβ(PC)-γ coincidence system.

  7. A Monte Carlo simulation framework for electron beam dose calculations using Varian phase space files for TrueBeam Linacs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rodrigues, Anna; Yin, Fang-Fang; Wu, Qiuwen, E-mail: Qiuwen.Wu@Duke.edu

    2015-05-15

    Purpose: To develop a framework for accurate electron Monte Carlo dose calculation. In this study, comprehensive validations of vendor provided electron beam phase space files for Varian TrueBeam Linacs against measurement data are presented. Methods: In this framework, the Monte Carlo generated phase space files were provided by the vendor and used as input to the downstream plan-specific simulations including jaws, electron applicators, and water phantom computed in the EGSnrc environment. The phase space files were generated based on open field commissioning data. A subset of electron energies of 6, 9, 12, 16, and 20 MeV and open and collimated field sizes 3 × 3, 4 × 4, 5 × 5, 6 × 6, 10 × 10, 15 × 15, 20 × 20, and 25 × 25 cm^2 were evaluated. Measurements acquired with a CC13 cylindrical ionization chamber and electron diode detector and simulations from this framework were compared for a water phantom geometry. The evaluation metrics include percent depth dose, orthogonal and diagonal profiles at depths R100, R50, Rp, and Rp+ for standard and extended source-to-surface distances (SSD), as well as cone and cut-out output factors. Results: Agreement for the percent depth dose and orthogonal profiles between measurement and Monte Carlo was generally within 2% or 1 mm. The largest discrepancies were observed within depths of 5 mm from phantom surface. Differences in field size, penumbra, and flatness for the orthogonal profiles at depths R100, R50, and Rp were within 1 mm, 1 mm, and 2%, respectively. Orthogonal profiles at SSDs of 100 and 120 cm showed the same level of agreement. Cone and cut-out output factors agreed well with maximum differences within 2.5% for 6 MeV and 1% for all other energies. Cone output factors at extended SSDs of 105, 110, 115, and 120 cm exhibited similar levels of agreement. Conclusions: We have presented a Monte Carlo simulation framework for electron beam dose calculations for Varian TrueBeam Linacs. Electron beam energies of 6 to 20 MeV for open and collimated field sizes from 3 × 3 to 25 × 25 cm^2 were studied and results were compared to the measurement data with excellent agreement. Application of this framework can thus be used as the platform for treatment planning of dynamic electron arc radiotherapy and other advanced dynamic techniques with electron beams.

  8. A study on the suitability of the PTW microDiamond detector for kilovoltage x-ray beam dosimetry.

    PubMed

    Damodar, Joshita; Odgers, David; Pope, Dane; Hill, Robin

    2018-05-01

    Kilovoltage x-ray beams are widely used in treating skin cancers and in biological irradiators. In this work, we have evaluated four dosimeters (ionization chambers and solid state detectors) for their suitability for relative dosimetry of kilovoltage x-ray beams in the energy range of 50-280 kVp. The solid state detectors, which have not been investigated with low-energy x-rays, were the PTW 60019 microDiamond synthetic diamond detector and the PTW 60012 diode. The two ionization chambers used were the PTW Advanced Markus parallel plate chamber and the PTW PinPoint small volume chamber. For each of the dosimeters, percentage depth doses were measured in water over the full range of x-ray beams and for field sizes ranging from 2 cm diameter to 12 × 12 cm. In addition, depth doses were measured for a narrow aperture (7 mm diameter) using the PTW microDiamond detector. For comparison, the measured data were compared with Monte Carlo calculated doses using the EGSnrc Monte Carlo package. The depth dose results indicate that the Advanced Markus parallel plate and PinPoint ionization chambers were suitable for depth dose measurements in this beam quality range with an uncertainty of less than 3%, including in the regions closer to the surface of the water, as compared with Monte Carlo depth dose data for all six energy beams. The response of the PTW Diode E detector was accurate to within 4% for all field sizes in the energy range of 50-125 kVp but showed larger variations, of up to 12%, for higher energies with the 12 × 12 cm field size. In comparison, the microDiamond detector had good agreement over all energies for both smaller and larger field sizes, generally within 1%, as compared to the Advanced Markus chamber and Monte Carlo calculations. The only exceptions were in measuring the dose at the surface of the water phantom, where larger differences were found. For the 7 mm diameter field, the agreement between the microDiamond detector and Monte Carlo calculations was better than 1% except at the surface. Based on these results, the PTW microDiamond detector has been shown to be a suitable detector for relative dosimetry of low energy x-ray beams over a wide range of x-ray beam energies. Copyright © 2018 Elsevier Ltd. All rights reserved.

  9. A Monte Carlo simulation framework for electron beam dose calculations using Varian phase space files for TrueBeam Linacs.

    PubMed

    Rodrigues, Anna; Sawkey, Daren; Yin, Fang-Fang; Wu, Qiuwen

    2015-05-01

    To develop a framework for accurate electron Monte Carlo dose calculation. In this study, comprehensive validations of vendor provided electron beam phase space files for Varian TrueBeam Linacs against measurement data are presented. In this framework, the Monte Carlo generated phase space files were provided by the vendor and used as input to the downstream plan-specific simulations including jaws, electron applicators, and water phantom computed in the EGSnrc environment. The phase space files were generated based on open field commissioning data. A subset of electron energies of 6, 9, 12, 16, and 20 MeV and open and collimated field sizes 3 × 3, 4 × 4, 5 × 5, 6 × 6, 10 × 10, 15 × 15, 20 × 20, and 25 × 25 cm(2) were evaluated. Measurements acquired with a CC13 cylindrical ionization chamber and electron diode detector and simulations from this framework were compared for a water phantom geometry. The evaluation metrics include percent depth dose, orthogonal and diagonal profiles at depths R100, R50, Rp, and Rp+ for standard and extended source-to-surface distances (SSD), as well as cone and cut-out output factors. Agreement for the percent depth dose and orthogonal profiles between measurement and Monte Carlo was generally within 2% or 1 mm. The largest discrepancies were observed within depths of 5 mm from phantom surface. Differences in field size, penumbra, and flatness for the orthogonal profiles at depths R100, R50, and Rp were within 1 mm, 1 mm, and 2%, respectively. Orthogonal profiles at SSDs of 100 and 120 cm showed the same level of agreement. Cone and cut-out output factors agreed well with maximum differences within 2.5% for 6 MeV and 1% for all other energies. Cone output factors at extended SSDs of 105, 110, 115, and 120 cm exhibited similar levels of agreement. We have presented a Monte Carlo simulation framework for electron beam dose calculations for Varian TrueBeam Linacs. Electron beam energies of 6 to 20 MeV for open and collimated field sizes from 3 × 3 to 25 × 25 cm(2) were studied and results were compared to the measurement data with excellent agreement. Application of this framework can thus be used as the platform for treatment planning of dynamic electron arc radiotherapy and other advanced dynamic techniques with electron beams.

  10. Path Toward a Unified Geometry for Radiation Transport

    NASA Technical Reports Server (NTRS)

    Lee, Kerry; Barzilla, Janet; Davis, Andrew; Zachmann

    2014-01-01

    The Direct Accelerated Geometry for Radiation Analysis and Design (DAGRAD) element of the RadWorks Project under Advanced Exploration Systems (AES) within the Space Technology Mission Directorate (STMD) of NASA will enable new designs and concepts of operation for radiation risk assessment, mitigation and protection. This element is designed to produce a solution that will allow NASA to calculate the transport of space radiation through complex computer-aided design (CAD) models using state-of-the-art analytic and Monte Carlo radiation transport codes. Due to the inherent hazard of astronaut and spacecraft exposure to ionizing radiation in low-Earth orbit (LEO) or in deep space, risk analyses must be performed for all crew vehicles and habitats. Incorporating these analyses into the design process can minimize the mass needed solely for radiation protection. Transport of the radiation fields as they pass through shielding and body materials can be simulated using Monte Carlo techniques or described by the Boltzmann equation, which is obtained by balancing changes in particle fluxes as they traverse a small volume of material with the gains and losses caused by atomic and nuclear collisions. Deterministic codes that solve the Boltzmann transport equation, such as HZETRN [the high charge and energy transport code developed by NASA Langley Research Center (LaRC)], are generally computationally faster than Monte Carlo codes such as FLUKA, GEANT4, MCNP(X) or PHITS; however, they are currently limited to transport in one dimension, which poorly represents the secondary light ion and neutron radiation fields. NASA currently uses the HZETRN space radiation transport software, both because it is computationally efficient and because proven methods have been developed for using this software to analyze complex geometries. Although Monte Carlo codes describe the relevant physics in a fully three-dimensional manner, their computational costs have thus far prevented their widespread use for analysis of complex CAD models, leading to the creation and maintenance of toolkit-specific simplistic geometry models. The work presented here builds on the Direct Accelerated Geometry Monte Carlo (DAGMC) toolkit developed for use with the Monte Carlo N-Particle (MCNP) transport code. The workflow for achieving radiation transport on CAD models using MCNP and FLUKA has been demonstrated, and the results of analyses on realistic spacecraft/habitats will be presented. Future work is planned that will further automate this process and enable the use of multiple radiation transport codes on identical geometry models imported from CAD. This effort will enhance the modeling tools used by NASA to accurately evaluate the astronaut space radiation risk and accurately determine the protection provided by as-designed exploration mission vehicles and habitats.

  11. Stopping power and dose calculations with analytical and Monte Carlo methods for protons and prompt gamma range verification

    NASA Astrophysics Data System (ADS)

    Usta, Metin; Tufan, Mustafa Çağatay; Aydın, Güral; Bozkurt, Ahmet

    2018-07-01

    In this study, we have performed calculations of stopping power, depth dose, and range verification for proton beams using the dielectric and Bethe-Bloch theories and the FLUKA, Geant4 and MCNPX Monte Carlo codes. As analytical approaches, the Drude model was applied for the dielectric theory and the effective charge approach with Roothaan-Hartree-Fock charge densities was used in the Bethe theory. In the simulations, different setup parameters were selected to evaluate the performance of the three distinct Monte Carlo codes. Lung and breast tissues were investigated, as they are associated with the most common types of cancer worldwide. The results were compared with each other and with the available data in the literature. In addition, the obtained results were verified against prompt gamma range data. For both stopping power values and depth-dose distributions, the Monte Carlo values were found to give better results than the analytical ones, while the results that agree best with the ICRU data in terms of stopping power are those of the effective charge approach among the analytical methods and of the FLUKA code among the MC packages. In the depth dose distributions of the examined tissues, although the Bragg curves from the Monte Carlo codes almost overlap, the analytical ones show significant deviations that become more pronounced with increasing energy. Verification against prompt gamma photon results was attempted for 100-200 MeV protons, which are regarded as important for proton therapy. The analytical results agree within 2%-5% and the Monte Carlo values within 0%-2% when compared with those of the prompt gammas.
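
    For orientation, the simplest analytical baseline in this family is the first-order Bethe formula. The sketch below evaluates the Bethe mass stopping power of water for protons, assuming Z/A = 0.5551 and a mean excitation energy I = 75 eV, and omitting shell and density corrections; it is a stand-in for, not a reproduction of, the effective-charge calculations described in the abstract.

```python
import math

# Physical constants
K = 0.307075          # 4*pi*N_A*r_e^2*m_e*c^2, in MeV mol^-1 cm^2
ME_C2 = 0.511         # electron rest energy, MeV
MP_C2 = 938.272       # proton rest energy, MeV

def bethe_water(T_mev, z=1, Z_over_A=0.5551, I_ev=75.0):
    """Mass stopping power (MeV cm^2/g) of water for a proton of kinetic
    energy T_mev, using the first-order Bethe formula with the heavy-particle
    approximation T_max ~ 2 m_e c^2 beta^2 gamma^2 (valid well above ~1 MeV;
    no shell or density corrections)."""
    gamma = 1.0 + T_mev / MP_C2
    beta2 = 1.0 - 1.0 / gamma**2
    I = I_ev * 1e-6   # convert eV -> MeV
    arg = 2.0 * ME_C2 * beta2 * gamma**2 / I
    return K * z**2 * Z_over_A / beta2 * (math.log(arg) - beta2)

for T in (10.0, 50.0, 100.0, 200.0):
    print(f"{T:6.0f} MeV : {bethe_water(T):7.3f} MeV cm^2/g")
```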

  12. Benchmarking PARTISN with Analog Monte Carlo: Moments of the Neutron Number and the Cumulative Fission Number Probability Distributions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Rourke, Patrick Francis

    The purpose of this report is to provide the reader with an understanding of how a Monte Carlo neutron transport code was written, developed, and evolved to calculate the probability distribution functions (PDFs) and their moments for the neutron number at a final time as well as the cumulative fission number, along with introducing several basic Monte Carlo concepts.
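
    As an illustration of the kind of analog calculation described above, the sketch below estimates the first two moments of the neutron number at a census time for a zero-dimensional branching model; the collision rate and the capture/fission probabilities are made up, and this is not the code discussed in the report.

```python
import random

def neutron_count_at(t_final, lam=1.0, p_fission=0.4, nu=2, rng=random):
    """One analog history of a zero-dimensional branching process: each
    neutron suffers a collision after an exponential flight time (rate lam);
    the collision is either capture or fission producing `nu` neutrons.
    Returns the number of neutrons alive at t_final."""
    stack = [rng.expovariate(lam)]   # collision times of live neutrons
    survivors = 0
    while stack:
        t = stack.pop()
        if t >= t_final:
            survivors += 1           # still in flight at the census time
            continue
        if rng.random() < p_fission:
            for _ in range(nu):      # fission: nu new neutrons
                stack.append(t + rng.expovariate(lam))
        # else: capture, the neutron disappears
    return survivors

histories = 20000
counts = [neutron_count_at(3.0) for _ in range(histories)]
m1 = sum(counts) / histories
m2 = sum(c * c for c in counts) / histories
print(f"first moment <N> = {m1:.3f},  second moment <N^2> = {m2:.3f}")
```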

  13. Optimization of beam shaping assembly based on D-T neutron generator and dose evaluation for BNCT

    NASA Astrophysics Data System (ADS)

    Naeem, Hamza; Chen, Chaobin; Zheng, Huaqing; Song, Jing

    2017-04-01

    The feasibility of developing an epithermal neutron beam for a boron neutron capture therapy (BNCT) facility based on a high intensity D-T fusion neutron generator (HINEG) is investigated in this study using the Monte Carlo code SuperMC (Super Monte Carlo simulation program for nuclear and radiation process). The Monte Carlo code SuperMC is used to determine and optimize the final configuration of the beam shaping assembly (BSA). The optimal BSA design is a cylindrical geometry consisting of a natural uranium sphere (14 cm) as a neutron multiplier, AlF3 and TiF3 as moderators (20 cm each), Cd (1 mm) as a thermal neutron filter, Bi (5 cm) as a gamma shield, and Pb as a reflector and collimator to guide neutrons towards the exit window. The epithermal neutron beam flux of the proposed model is 5.73 × 10⁹ n/cm²·s, and the other dosimetric parameters for BNCT reported in IAEA-TECDOC-1223 have been verified. The phantom dose analysis shows that the designed BSA is accurate, efficient and suitable for BNCT applications. Thus, the Monte Carlo code SuperMC is concluded to be capable of simulating the BSA and the dose calculation for BNCT, and a high epithermal flux can be achieved using the proposed BSA.

  14. On determining dose rate constants spectroscopically.

    PubMed

    Rodriguez, M; Rogers, D W O

    2013-01-01

    To investigate several aspects of the Chen and Nath spectroscopic method of determining the dose rate constants of (125)I and (103)Pd seeds [Z. Chen and R. Nath, Phys. Med. Biol. 55, 6089-6104 (2010)] including the accuracy of using a line or dual-point source approximation as done in their method, and the accuracy of ignoring the effects of the scattered photons in the spectra. Additionally, the authors investigate the accuracy of the literature's many different spectra for bare, i.e., unencapsulated (125)I and (103)Pd sources. Spectra generated by 14 (125)I and 6 (103)Pd seeds were calculated in vacuo at 10 cm from the source in a 2.7 × 2.7 × 0.05 cm(3) voxel using the EGSnrc BrachyDose Monte Carlo code. Calculated spectra used the initial photon spectra recommended by AAPM's TG-43U1 and NCRP (National Council of Radiation Protection and Measurements) Report 58 for the (125)I seeds, or TG-43U1 and NNDC(2000) (National Nuclear Data Center, 2000) for (103)Pd seeds. The emitted spectra were treated as coming from a line or dual-point source in a Monte Carlo simulation to calculate the dose rate constant. The TG-43U1 definition of the dose rate constant was used. These calculations were performed using the full spectrum including scattered photons or using only the main peaks in the spectrum as done experimentally. Statistical uncertainties on the air kerma/history and the dose rate/history were ≤0.2%. The dose rate constants were also calculated using Monte Carlo simulations of the full seed model. The ratio of the intensity of the 31 keV line relative to that of the main peak in (125)I spectra is, on average, 6.8% higher when calculated with the NCRP Report 58 initial spectrum vs that calculated with TG-43U1 initial spectrum. The (103)Pd spectra exhibit an average 6.2% decrease in the 22.9 keV line relative to the main peak when calculated with the TG-43U1 rather than the NNDC(2000) initial spectrum. The measured values from three different investigations are in much better agreement with the calculations using the NCRP Report 58 and NNDC(2000) initial spectra with average discrepancies of 0.9% and 1.7% for the (125)I and (103)Pd seeds, respectively. However, there are no differences in the calculated TG-43U1 brachytherapy parameters using either initial spectrum in both cases. Similarly, there were no differences outside the statistical uncertainties of 0.1% or 0.2%, in the average energy, air kerma/history, dose rate/history, and dose rate constant when calculated using either the full photon spectrum or the main-peaks-only spectrum. Our calculated dose rate constants based on using the calculated on-axis spectrum and a line or dual-point source model are in excellent agreement (0.5% on average) with the values of Chen and Nath, verifying the accuracy of their more approximate method of going from the spectrum to the dose rate constant. However, the dose rate constants based on full seed models differ by between +4.6% and -1.5% from those based on the line or dual-point source approximations. These results suggest that the main value of spectroscopic measurements is to verify full Monte Carlo models of the seeds by comparison to the calculated spectra.

  15. Benchmarking study of the MCNP code against cold critical experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sitaraman, S.

    1991-01-01

    The purpose of this study was to benchmark the widely used Monte Carlo code MCNP against a set of cold critical experiments with a view to using the code as a means of independently verifying the performance of faster but less accurate Monte Carlo and deterministic codes. The experiments simulated consisted of both fast and thermal criticals as well as fuel in a variety of chemical forms. A standard set of benchmark cold critical experiments was modeled. These included the two fast experiments, GODIVA and JEZEBEL, the TRX metallic uranium thermal experiments, the Babcock and Wilcox oxide and mixed oxide experiments, and the Oak Ridge National Laboratory (ORNL) and Pacific Northwest Laboratory (PNL) nitrate solution experiments. The principal case studied was a small critical experiment that was performed with boiling water reactor bundles.

  16. CMacIonize: Monte Carlo photoionisation and moving-mesh radiation hydrodynamics

    NASA Astrophysics Data System (ADS)

    Vandenbroucke, Bert; Wood, Kenneth

    2018-02-01

    CMacIonize simulates the self-consistent evolution of HII regions surrounding young O and B stars, or other sources of ionizing radiation. The code combines a Monte Carlo photoionization algorithm that uses a complex mix of hydrogen, helium and several coolants in order to self-consistently solve for the ionization and temperature balance at any given time, with a standard first order hydrodynamics scheme. The code can be run as a post-processing tool to get the line emission from an existing simulation snapshot, but can also be used to run full radiation hydrodynamical simulations. Both the radiation transfer and the hydrodynamics are implemented in a general way that is independent of the grid structure that is used to discretize the system, allowing it to be run both as a standard fixed grid code and also as a moving-mesh code.

  17. The effect of ambient pressure on well chamber response: Monte Carlo calculated results for the HDR 1000 plus.

    PubMed

    Bohm, Tim D; Griffin, Sheridan L; DeLuca, Paul M; DeWerd, Larry A

    2005-04-01

    The determination of the air kerma strength of a brachytherapy seed is necessary for effective treatment planning. Well ionization chambers are used on site at therapy clinics to determine the air kerma strength of seeds. In this work, the response of the Standard Imaging HDR 1000 Plus well chamber to ambient pressure is examined using Monte Carlo calculations. The experimental work examining the response of this chamber as well as other chambers is presented in a companion paper. The Monte Carlo results show that for low-energy photon sources, the application of the standard temperature-pressure (PTP) correction factor produces an over-response at the reduced air densities/pressures corresponding to high elevations. With photon sources of 20 to 40 keV, the normalized PTP-corrected chamber response is as much as 10% to 20% over unity for air densities/pressures corresponding to an elevation of 3048 m (10000 ft) above sea level. At air densities corresponding to an elevation of 1524 m (5000 ft), the normalized PTP-corrected chamber response is 5% to 10% over unity for these photon sources. With higher-energy photon sources (>100 keV), the normalized PTP-corrected chamber response is near unity. For low-energy beta sources of 0.25 to 0.50 MeV, the normalized PTP-corrected chamber response is as much as 4% to 12% over unity for air densities/pressures corresponding to an elevation of 3048 m (10000 ft) above sea level. Higher-energy beta sources (>0.75 MeV) have a normalized PTP-corrected chamber response near unity. Comparing calculated and measured chamber responses for common 103Pd- and 125I-based brachytherapy seeds shows agreement to within 2.7% and 1.9%, respectively. Comparing MCNP-calculated chamber responses with EGSnrc-calculated chamber responses shows agreement to within 3.1% at photon energies of 20 to 40 keV. We conclude that Monte Carlo transport calculations accurately model the response of this well chamber. Further, applying the standard PTP correction factor for this well chamber is insufficient to account for the change in chamber response with air pressure for low-energy (<100 keV) photon and low-energy (<0.75 MeV) beta sources.
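
    The PTP factor discussed above is the usual ideal-gas air-density correction. A minimal sketch, assuming reference conditions of 22 °C and 101.325 kPa (these values are protocol-dependent) and an illustrative station pressure of roughly 70 kPa near 3048 m:

```python
def p_tp(temp_c, pressure_kpa, ref_temp_c=22.0, ref_pressure_kpa=101.325):
    """Standard temperature-pressure correction for a vented ion chamber:
    scales the reading to the reference air density (ideal-gas behaviour)."""
    return ((273.2 + temp_c) / (273.2 + ref_temp_c)) * (ref_pressure_kpa / pressure_kpa)

# Illustrative high-elevation conditions (assumed, not from the paper)
print(f"P_TP at 20 C and 70 kPa: {p_tp(20.0, 70.0):.3f}")
```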

  18. Comparison of Geant4-DNA simulation of S-values with other Monte Carlo codes

    NASA Astrophysics Data System (ADS)

    André, T.; Morini, F.; Karamitros, M.; Delorme, R.; Le Loirec, C.; Campos, L.; Champion, C.; Groetz, J.-E.; Fromm, M.; Bordage, M.-C.; Perrot, Y.; Barberet, Ph.; Bernal, M. A.; Brown, J. M. C.; Deleuze, M. S.; Francis, Z.; Ivanchenko, V.; Mascialino, B.; Zacharatou, C.; Bardiès, M.; Incerti, S.

    2014-01-01

    Monte Carlo simulations of S-values have been carried out with the Geant4-DNA extension of the Geant4 toolkit. The S-values have been simulated for monoenergetic electrons with energies ranging from 0.1 keV up to 20 keV, in liquid water spheres (for four radii, chosen between 10 nm and 1 μm), and for electrons emitted by five isotopes of iodine (131, 132, 133, 134 and 135), in liquid water spheres of varying radius (from 15 μm up to 250 μm). The results have been compared to those obtained from other Monte Carlo codes and from other published data. The Kolmogorov-Smirnov test confirmed the statistical compatibility of all simulation results.
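
    A two-sample Kolmogorov-Smirnov comparison of the kind mentioned above can be carried out with scipy; in the sketch below the two S-value arrays are synthetic stand-ins, not data from the paper.

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(42)

# Hypothetical S-value sets (e.g. Gy per decay) from two Monte Carlo codes
# for the same source energies and sphere radii; here just correlated noise.
s_code_a = np.abs(rng.normal(1.0e-3, 1.0e-4, size=40))
s_code_b = s_code_a * rng.normal(1.0, 0.02, size=40)   # ~2% scatter

stat, p_value = ks_2samp(s_code_a, s_code_b)
print(f"KS statistic = {stat:.3f}, p-value = {p_value:.3f}")
# A large p-value indicates statistical compatibility: no evidence that the
# two sets are drawn from different distributions.
```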

  19. Reducing statistical uncertainties in simulated organ doses of phantoms immersed in water

    DOE PAGES

    Hiller, Mauritius M.; Veinot, Kenneth G.; Easterly, Clay E.; ...

    2016-08-13

    In this study, methods to reduce the computational time needed to compute organ-dose rate coefficients using Monte Carlo techniques are addressed. Several variance reduction techniques are compared, including the reciprocity method, importance sampling, weight windows and the use of the ADVANTG software package. For low-energy photons, the runtime was reduced by a factor of 10⁵ when using the reciprocity method for kerma computation for immersion of a phantom in contaminated water. This is particularly significant since impractically long simulation times are required to achieve reasonable statistical uncertainties in organ dose for low-energy photons in this source medium and geometry. Although the MCNP Monte Carlo code is used in this paper, the reciprocity technique can be used equally well with other Monte Carlo codes.
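
    None of the specific techniques named above (reciprocity, ADVANTG-generated weight windows) is reproduced here, but the sketch below illustrates the general variance-reduction idea with simple path-length biasing for a deep-penetration transmission estimate; all coefficients are illustrative.

```python
import math, random

MU = 1.0        # total attenuation coefficient (1/cm), illustrative
THICK = 15.0    # slab thickness in mean free paths -> exp(-15) ~ 3e-7

def analog(n, rng=random):
    """Analog estimate: score 1 only if the sampled free path crosses the slab."""
    hits = sum(1 for _ in range(n) if rng.expovariate(MU) > THICK)
    return hits / n

def biased(n, mu_star=0.2, rng=random):
    """Sample the free path from a stretched exponential (rate mu_star < MU)
    and carry the likelihood ratio p(x)/q(x) as a statistical weight so the
    tally stays unbiased."""
    total = 0.0
    for _ in range(n):
        x = rng.expovariate(mu_star)
        if x > THICK:
            total += (MU / mu_star) * math.exp(-(MU - mu_star) * x)
    return total / n

n = 200_000
print(f"exact       : {math.exp(-MU * THICK):.3e}")
print(f"analog      : {analog(n):.3e}")     # almost always 0 at this depth
print(f"importance  : {biased(n):.3e}")
```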

  20. Applying Quantum Monte Carlo to the Electronic Structure Problem

    NASA Astrophysics Data System (ADS)

    Powell, Andrew D.; Dawes, Richard

    2016-06-01

    Two distinct types of Quantum Monte Carlo (QMC) calculations are applied to electronic structure problems such as calculating potential energy curves and producing benchmark values for reaction barriers. First, Variational and Diffusion Monte Carlo (VMC and DMC) methods using a trial wavefunction subject to the fixed node approximation were tested using the CASINO code.[1] Next, Full Configuration Interaction Quantum Monte Carlo (FCIQMC), along with its initiator extension (i-FCIQMC) were tested using the NECI code.[2] FCIQMC seeks the FCI energy for a specific basis set. At a reduced cost, the efficient i-FCIQMC method can be applied to systems in which the standard FCIQMC approach proves to be too costly. Since all of these methods are statistical approaches, uncertainties (error-bars) are introduced for each calculated energy. This study tests the performance of the methods relative to traditional quantum chemistry for some benchmark systems. References: [1] R. J. Needs et al., J. Phys.: Condensed Matter 22, 023201 (2010). [2] G. H. Booth et al., J. Chem. Phys. 131, 054106 (2009).

  1. Monte Carlo simulation of liver cancer treatment with 166Ho-loaded glass microspheres

    NASA Astrophysics Data System (ADS)

    da Costa Guimarães, Carla; Moralles, Maurício; Roberto Martinelli, José

    2014-02-01

    Microspheres loaded with pure beta-emitter radioisotopes are used in the treatment of some types of liver cancer. The Instituto de Pesquisas Energéticas e Nucleares (IPEN) is developing 166Ho-loaded glass microspheres as an alternative to the commercially available 90Y microspheres. This work describes the implementation of a Monte Carlo code to simulate both the irradiation effects and the imaging of 166Ho and 90Y sources localized in different parts of the liver. Results obtained with the code and perspectives for the future are discussed.

  2. Skyshine radiation from a pressurized water reactor containment dome

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peng, W.H.

    1986-06-01

    The radiation dose rates resulting from airborne activities inside a postaccident pressurized water reactor containment are calculated by a combined discrete ordinates/Monte Carlo method. The calculated total dose rates and the skyshine component are presented as a function of distance from the containment at three different elevations for various gamma-ray source energies. The one-dimensional discrete ordinates code ANISN is used to approximate the skyshine dose rates from the hemispherical dome, and the results compare favorably with more rigorous results calculated by a three-dimensional Monte Carlo code.

  3. ScintSim1: A new Monte Carlo simulation code for transport of optical photons in 2D arrays of scintillation detectors

    PubMed Central

    Mosleh-Shirazi, Mohammad Amin; Zarrini-Monfared, Zinat; Karbasi, Sareh; Zamani, Ali

    2014-01-01

    Two-dimensional (2D) arrays of thick segmented scintillators are of interest as X-ray detectors for both 2D and 3D image-guided radiotherapy (IGRT). Their detection process involves ionizing radiation energy deposition followed by production and transport of optical photons. Only a very limited number of optical Monte Carlo simulation models exist, which has limited the number of modeling studies that have considered both stages of the detection process. We present ScintSim1, an in-house optical Monte Carlo simulation code for 2D arrays of scintillation crystals, developed in the MATLAB programming environment. The code was rewritten and revised based on an existing program for single-element detectors, with the additional capability to model 2D arrays of elements with configurable dimensions, material, etc. The code generates and follows each optical photon history through the detector element (and, in case of cross-talk, the surrounding ones) until it reaches a configurable receptor, or is attenuated. The new model was verified by testing against relevant theoretically known behaviors or quantities and the results of a validated single-element model. For both sets of comparisons, the discrepancies in the calculated quantities were all <1%. The results validate the accuracy of the new code, which is a useful tool in scintillation detector optimization. PMID:24600168

  4. ScintSim1: A new Monte Carlo simulation code for transport of optical photons in 2D arrays of scintillation detectors.

    PubMed

    Mosleh-Shirazi, Mohammad Amin; Zarrini-Monfared, Zinat; Karbasi, Sareh; Zamani, Ali

    2014-01-01

    Two-dimensional (2D) arrays of thick segmented scintillators are of interest as X-ray detectors for both 2D and 3D image-guided radiotherapy (IGRT). Their detection process involves ionizing radiation energy deposition followed by production and transport of optical photons. Only a very limited number of optical Monte Carlo simulation models exist, which has limited the number of modeling studies that have considered both stages of the detection process. We present ScintSim1, an in-house optical Monte Carlo simulation code for 2D arrays of scintillation crystals, developed in the MATLAB programming environment. The code was rewritten and revised based on an existing program for single-element detectors, with the additional capability to model 2D arrays of elements with configurable dimensions, material, etc. The code generates and follows each optical photon history through the detector element (and, in case of cross-talk, the surrounding ones) until it reaches a configurable receptor, or is attenuated. The new model was verified by testing against relevant theoretically known behaviors or quantities and the results of a validated single-element model. For both sets of comparisons, the discrepancies in the calculated quantities were all <1%. The results validate the accuracy of the new code, which is a useful tool in scintillation detector optimization.
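
    ScintSim1 itself is a MATLAB code and is not reproduced here; the sketch below only illustrates the general pattern of optical-photon tracking in a single rectangular element (isotropic scattering, absorption, escape through a face), with invented dimensions and coefficients.

```python
import math, random

def track_photon(size=(3.0, 3.0, 10.0), mu_scat=0.5, mu_abs=0.02, rng=random):
    """Follow one optical photon inside a rectangular scintillator element
    (dimensions in mm, coefficients in 1/mm) until it is absorbed or leaves
    through a face. Returns ('absorbed' | 'escaped', final position)."""
    x, y, z = size[0] / 2, size[1] / 2, 0.0          # start on the entrance face
    mu_tot = mu_scat + mu_abs
    while True:
        # isotropic emission / scattering direction
        cos_t = 2.0 * rng.random() - 1.0
        sin_t = math.sqrt(1.0 - cos_t * cos_t)
        phi = 2.0 * math.pi * rng.random()
        ux, uy, uz = sin_t * math.cos(phi), sin_t * math.sin(phi), cos_t
        step = rng.expovariate(mu_tot)               # free path to next event
        x, y, z = x + step * ux, y + step * uy, z + step * uz
        if not (0 <= x <= size[0] and 0 <= y <= size[1] and 0 <= z <= size[2]):
            return "escaped", (x, y, z)
        if rng.random() < mu_abs / mu_tot:           # absorption vs. scatter
            return "absorbed", (x, y, z)

n = 10000
escaped = sum(1 for _ in range(n) if track_photon()[0] == "escaped")
print(f"escape fraction: {escaped / n:.3f}")
```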

  5. The EPQ Code System for Simulating the Thermal Response of Plasma-Facing Components to High-Energy Electron Impact

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ward, Robert Cameron; Steiner, Don

    2004-06-15

    The generation of runaway electrons during a thermal plasma disruption is a concern for the safe and economical operation of a tokamak power system. Runaway electrons have high energy, 10 to 300 MeV, and may potentially cause extensive damage to plasma-facing components (PFCs) through large temperature increases, melting of metallic components, surface erosion, and possible burnout of coolant tubes. The EPQ code system was developed to simulate the thermal response of PFCs to a runaway electron impact. The EPQ code system consists of several parts: UNIX scripts that control the operation of an electron-photon Monte Carlo code to calculate the interaction of the runaway electrons with the plasma-facing materials; a finite difference code to calculate the thermal response, melting, and surface erosion of the materials; a code to process, scale, transform, and convert the electron Monte Carlo data to volumetric heating rates for use in the thermal code; and several minor and auxiliary codes for the manipulation and postprocessing of the data. The electron-photon Monte Carlo code used was Electron-Gamma-Shower (EGS), developed and maintained by the National Research Council of Canada. The Quick-Therm-Two-Dimensional-Nonlinear (QTTN) thermal code solves the two-dimensional cylindrical modified heat conduction equation using the Quickest third-order accurate and stable explicit finite difference method and is capable of tracking melting or surface erosion. The EPQ code system was validated using a series of analytical solutions and simulations of experiments. The verification of the QTTN thermal code against analytical solutions shows that the code with the Quickest method is better than 99.9% accurate. The benchmarking of the EPQ code system and QTTN against experiments showed that QTTN's erosion tracking method is accurate within 30% and that EPQ is able to predict the occurrence of melting within the proper time constraints. QTTN and EPQ are thus verified and validated as able to calculate the temperature distribution, phase change, and surface erosion successfully.

  6. Monte Carlo simulations of {sup 3}He ion physical characteristics in a water phantom and evaluation of radiobiological effectiveness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taleei, Reza; Guan, Fada; Peeler, Chris

    Purpose: ³He ions may hold great potential for clinical therapy because of both their physical and biological properties. In this study, the authors investigated the physical properties, i.e., the depth-dose curves from primary and secondary particles, and the energy distributions of helium (³He) ions. A relative biological effectiveness (RBE) model was applied to assess the biological effectiveness on survival of multiple cell lines. Methods: In light of the lack of experimental measurements and cross sections, the authors used Monte Carlo methods to study the energy deposition of ³He ions. The transport of ³He ions in water was simulated by using three Monte Carlo codes (FLUKA, GEANT4, and MCNPX) for incident beams with Gaussian energy distributions with average energies of 527 and 699 MeV and a full width at half maximum of 3.3 MeV in both cases. The RBE of each was evaluated by using the repair-misrepair-fixation model. In all of the simulations with each of the three Monte Carlo codes, the same geometry and primary beam parameters were used. Results: Energy deposition as a function of depth and energy spectra with high resolution was calculated on the central axis of the beam. Secondary proton dose from the primary ³He beams was predicted quite differently by the three Monte Carlo systems. The predictions differed by as much as a factor of 2. Microdosimetric parameters such as the dose mean lineal energy (y_D), frequency mean lineal energy (y_F), and frequency mean specific energy (z_F) were used to characterize the radiation beam quality at four depths of the Bragg curve. Calculated RBE values were close to 1 at the entrance, reached on average 1.8 and 1.6 for prostate and head and neck cancer cell lines at the Bragg peak for both energies, but showed some variations between the different Monte Carlo codes. Conclusions: Although the Monte Carlo codes provided different results in energy deposition and especially in secondary particle production (most of the differences between the three codes were observed close to the Bragg peak, where the energy spectrum broadens), the results in terms of RBE were generally similar.

  7. On the utility of graphics cards to perform massively parallel simulation of advanced Monte Carlo methods

    PubMed Central

    Lee, Anthony; Yau, Christopher; Giles, Michael B.; Doucet, Arnaud; Holmes, Christopher C.

    2011-01-01

    We present a case-study on the utility of graphics cards to perform massively parallel simulation of advanced Monte Carlo methods. Graphics cards, containing multiple Graphics Processing Units (GPUs), are self-contained parallel computational devices that can be housed in conventional desktop and laptop computers and can be thought of as prototypes of the next generation of many-core processors. For certain classes of population-based Monte Carlo algorithms they offer massively parallel simulation, with the added advantage over conventional distributed multi-core processors that they are cheap, easily accessible, easy to maintain, easy to code, dedicated local devices with low power consumption. On a canonical set of stochastic simulation examples including population-based Markov chain Monte Carlo methods and Sequential Monte Carlo methods, we find speedups of 35- to 500-fold over conventional single-threaded computer code. Our findings suggest that GPUs have the potential to facilitate the growth of statistical modelling into complex data rich domains through the availability of cheap and accessible many-core computation. We believe the speedup we observe should motivate wider use of parallelizable simulation methods and greater methodological attention to their design. PMID:22003276

  8. Development of the MCNPX depletion capability: A Monte Carlo linked depletion method that automates the coupling between MCNPX and CINDER90 for high fidelity burnup calculations

    NASA Astrophysics Data System (ADS)

    Fensin, Michael Lorne

    Monte Carlo-linked depletion methods have gained recent interest due to the ability to more accurately model complex 3-dimensional geometries and to better track the evolution of the temporal nuclide inventory by simulating the actual physical process utilizing continuous energy coefficients. The integration of CINDER90 into the MCNPX Monte Carlo radiation transport code provides a high-fidelity, completely self-contained Monte-Carlo-linked depletion capability in a well established, widely accepted Monte Carlo radiation transport code that is compatible with most nuclear criticality (KCODE) particle tracking features in MCNPX. MCNPX depletion tracks all necessary reaction rates and follows as many isotopes as cross section data permit in order to achieve a highly accurate temporal nuclide inventory solution. This work chronicles relevant nuclear history, surveys current methodologies of depletion theory, details the methodology applied in MCNPX, and provides benchmark results for three independent OECD/NEA benchmarks. Relevant nuclear history, from the Oklo reactor two billion years ago to the current major United States nuclear fuel cycle development programs, is addressed in order to supply the motivation for the development of this technology. A survey of current reaction rate and temporal nuclide inventory techniques is then provided to offer justification for the depletion strategy applied within MCNPX. The MCNPX depletion strategy is then dissected and each code feature is detailed, chronicling the methodology development from the original linking of MONTEBURNS and MCNP to the most recent public release of the integrated capability (MCNPX 2.6.F). Calculation results for the OECD/NEA Phase IB benchmark, the H. B. Robinson benchmark, and the OECD/NEA Phase IVB benchmark are then provided. The acceptable results of these calculations offer sufficient confidence in the predictive capability of the MCNPX depletion method. This capability sets up a significant foundation, in a well established and supported radiation transport code, for further development of a Monte Carlo-linked depletion methodology, which is essential to the future development of advanced reactor technologies that exceed the limitations of current deterministic based methods.

  9. Advances in Monte-Carlo code TRIPOLI-4®'s treatment of the electromagnetic cascade

    NASA Astrophysics Data System (ADS)

    Mancusi, Davide; Bonin, Alice; Hugot, François-Xavier; Malouch, Fadhel

    2018-01-01

    TRIPOLI-4® is a Monte-Carlo particle-transport code developed at CEA-Saclay (France) that is employed in the domains of nuclear-reactor physics, criticality-safety, shielding/radiation protection and nuclear instrumentation. The goal of this paper is to report on current developments, validation and verification made in TRIPOLI-4 in the electron/positron/photon sector. The new capabilities and improvements concern refinements to the electron transport algorithm, the introduction of a charge-deposition score, the new thick-target bremsstrahlung option, the upgrade of the bremsstrahlung model and the improvement of electron angular straggling at low energy. The importance of each of the developments above is illustrated by comparisons with calculations performed with other codes and with experimental data.

  10. Calculation of response matrix of CaSO4:Dy based neutron dosimeter using Monte Carlo code FLUKA and measurement of 241Am-Be spectra

    NASA Astrophysics Data System (ADS)

    Chatterjee, S.; Bakshi, A. K.; Tripathy, S. P.

    2010-09-01

    The response matrix for a CaSO4:Dy based neutron dosimeter was generated using the Monte Carlo code FLUKA in the energy range from thermal to 20 MeV for a set of eight Bonner spheres of diameter 3-12″, including the bare one. The response of the neutron dosimeter was measured for the above set of spheres for a 241Am-Be neutron source covered with 2 mm of lead. An analytical expression for the response function was devised as a function of sphere mass. Using the Frascati Unfolding Iteration Tool (FRUIT) unfolding code, the neutron spectrum of 241Am-Be was unfolded and compared with the standard IAEA spectrum.

  11. McSKY: A hybrid Monte-Carlo line-beam code for shielded gamma skyshine calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shultis, J.K.; Faw, R.E.; Stedry, M.H.

    1994-07-01

    McSKY evaluates the skyshine dose from an isotropic, monoenergetic, point photon source collimated into either a vertical cone or a vertical structure with an N-sided polygon cross section. The code assumes an overhead shield of two materials, though the user can specify zero shield thickness for an unshielded calculation. The code uses a Monte-Carlo algorithm to evaluate transport through the source shields and the integral line source to describe photon transport through the atmosphere. The source energy must be between 0.02 and 100 MeV. For heavily shielded sources with energies above 20 MeV, McSKY results must be used cautiously, especially at detector locations near the source.

  12. MUFFSgenMC: An Open Source MUon Flexible Framework for Spectral GENeration for Monte Carlo Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chatzidakis, Stylianos; Greulich, Christopher

    A cosmic ray Muon Flexible Framework for Spectral GENeration for Monte Carlo Applications (MUFFSgenMC) has been developed to support state-of-the-art cosmic ray muon tomographic applications. The flexible framework allows for easy and fast creation of source terms for popular Monte Carlo applications like GEANT4 and MCNP. This code framework simplifies the process of simulations used for cosmic ray muon tomography.

  13. SU-E-T-254: Optimization of GATE and PHITS Monte Carlo Code Parameters for Uniform Scanning Proton Beam Based On Simulation with FLUKA General-Purpose Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurosu, K; Department of Medical Physics ' Engineering, Osaka University Graduate School of Medicine, Osaka; Takashina, M

    Purpose: Monte Carlo codes are becoming important tools for proton beam dosimetry. However, the relationships between the customizing parameters and the percentage depth dose (PDD) of the GATE and PHITS codes have not been reported; here they are studied for PDD and proton range in comparison with the FLUKA code and experimental data. Methods: The beam delivery system of the Indiana University Health Proton Therapy Center was modeled for the uniform scanning beam in FLUKA and transferred identically into GATE and PHITS. This computational model was built from the blueprint and validated with the commissioning data. The three parameters evaluated were the maximum step size, the cut-off energy, and the physics and transport model. The dependence of the PDDs on the customizing parameters was compared with the published results of previous studies. Results: The optimal parameters for the simulation of the whole beam delivery system were defined by referring to the calculation results obtained with each parameter. Although the PDDs from FLUKA and the experimental data show good agreement, those of GATE and PHITS obtained with our optimal parameters show a minor discrepancy. The measured proton range R90 was 269.37 mm, compared to calculated ranges of 269.63 mm, 268.96 mm, and 270.85 mm with FLUKA, GATE and PHITS, respectively. Conclusion: We evaluated the dependence of the PDDs obtained with the GATE and PHITS Monte Carlo general-purpose codes on the customizing parameters by using the whole computational model of the treatment nozzle. The optimal parameters for the simulation were then defined by referring to the calculation results. The physics model, particle transport mechanics and the different geometry-based descriptions need accurate customization in the three simulation codes to agree with experimental data for artifact-free Monte Carlo simulation. This study was supported by Grants-in-Aid for Cancer Research (H22-3rd Term Cancer Control-General-043) from the Ministry of Health, Labor and Welfare of Japan, Grants-in-Aid for Scientific Research (No. 23791419), and the JSPS Core-to-Core program (No. 23003). The authors have no conflict of interest.

  14. TU-AB-BRC-10: Modeling of Radiotherapy Linac Source Terms Using ARCHER Monte Carlo Code: Performance Comparison of GPU and MIC Computing Accelerators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, T; Lin, H; Xu, X

    Purpose: (1) To perform phase space (PS) based source modeling for Tomotherapy and Varian TrueBeam 6 MV Linacs, (2) to examine the accuracy and performance of the ARCHER Monte Carlo code on a heterogeneous computing platform with Many Integrated Core coprocessors (MIC, aka Xeon Phi) and GPUs, and (3) to explore software micro-optimization methods. Methods: The patient-specific source of the Tomotherapy and Varian TrueBeam Linacs was modeled using the PS approach. For the helical Tomotherapy case, the PS data were calculated in our previous study (Su et al. 2014 41(7) Medical Physics). For the single-view Varian TrueBeam case, we analytically derived them from the raw patient-independent PS data in IAEA’s database, partial geometry information of the jaw and MLC, as well as the fluence map. The phantom was generated from DICOM images. The Monte Carlo simulation was performed by the ARCHER-MIC and GPU codes, which were benchmarked against a modified parallel DPM code. Software micro-optimization was systematically conducted, and was focused on SIMD vectorization of tight for-loops and data prefetch, with the ultimate goal of increasing 512-bit register utilization and reducing memory access latency. Results: Dose calculation was performed for two clinical cases, a Tomotherapy-based prostate cancer treatment and a TrueBeam-based left breast treatment. ARCHER was verified against the DPM code. The statistical uncertainty of the dose to the PTV was less than 1%. Using double precision, the total wall time of the multithreaded CPU code on a X5650 CPU was 339 seconds for the Tomotherapy case and 131 seconds for the TrueBeam case, while on three 5110P MICs it was reduced to 79 and 59 seconds, respectively. The single-precision GPU code on a K40 GPU took 45 seconds for the Tomotherapy dose calculation. Conclusion: We have extended ARCHER, the MIC and GPU-based Monte Carlo dose engine, to Tomotherapy and TrueBeam dose calculations.

  15. Diagnosing Undersampling Biases in Monte Carlo Eigenvalue and Flux Tally Estimates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perfetti, Christopher M.; Rearden, Bradley T.; Marshall, William J.

    2017-02-08

    This study focuses on understanding the phenomenon in Monte Carlo simulations known as undersampling, in which Monte Carlo tally estimates may not encounter a sufficient number of particles during each generation to obtain unbiased tally estimates. Steady-state Monte Carlo simulations were performed using the KENO Monte Carlo tools within the SCALE code system for models of several burnup credit applications with varying degrees of spatial and isotopic complexity, and the incidence and impact of undersampling on eigenvalue and flux estimates were examined. Using an inadequate number of particle histories in each generation was found to produce a maximum bias of ~100 pcm in eigenvalue estimates and biases that exceeded 10% in fuel pin flux tally estimates. Having quantified the potential magnitude of undersampling biases in eigenvalue and flux tally estimates in these systems, this study then investigated whether Markov chain Monte Carlo convergence metrics could be integrated into Monte Carlo simulations to predict the onset and magnitude of undersampling biases. Five potential metrics for identifying undersampling biases were implemented in the SCALE code system and evaluated for their ability to predict undersampling biases by comparing the test metric scores with the observed undersampling biases. Of the five convergence metrics investigated, three (the Heidelberger-Welch relative half-width, the Gelman-Rubin $\hat{R}_c$ diagnostic, and tally entropy) showed the potential to accurately predict the behavior of undersampling biases in the responses examined.
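
    One of the three metrics singled out above is the Gelman-Rubin diagnostic. A minimal sketch of the standard R-hat computation applied to per-generation tally chains follows; the chains here are synthetic, not output of the SCALE implementation.

```python
import numpy as np

def gelman_rubin(chains):
    """chains: 2D array (n_chains, n_samples) of tally estimates per generation.
    Returns the potential scale reduction factor R-hat; values near 1 suggest
    the chains have converged to the same distribution."""
    m, n = chains.shape
    chain_means = chains.mean(axis=1)
    W = chains.var(axis=1, ddof=1).mean()       # mean within-chain variance
    B = n * chain_means.var(ddof=1)             # between-chain variance
    var_hat = (n - 1) / n * W + B / n           # pooled variance estimate
    return np.sqrt(var_hat / W)

rng = np.random.default_rng(0)
# Hypothetical flux-tally chains from 4 independent runs, 500 generations each
chains = rng.normal(loc=1.0, scale=0.05, size=(4, 500))
print(f"R-hat = {gelman_rubin(chains):.4f}")
```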

  16. Building Process Improvement Business Cases Using Bayesian Belief Networks and Monte Carlo Simulation

    DTIC Science & Technology

    2009-07-01

    …simulation. The pilot described in this paper used this two-step approach within a Define, Measure, Analyze, Improve, and Control (DMAIC) framework to… Keywords: Bayesian belief networks, BBN, Monte Carlo simulation, DMAIC, Six Sigma, business case.

  17. METHES: A Monte Carlo collision code for the simulation of electron transport in low temperature plasmas

    NASA Astrophysics Data System (ADS)

    Rabie, M.; Franck, C. M.

    2016-06-01

    We present a freely available MATLAB code for the simulation of electron transport in arbitrary gas mixtures in the presence of uniform electric fields. For steady-state electron transport, the program provides the transport coefficients, reaction rates and the electron energy distribution function. The program uses established Monte Carlo techniques and is compatible with the electron scattering cross section files from the open-access Plasma Data Exchange Project LXCat. The code is written in object-oriented design, allowing the tracing and visualization of the spatiotemporal evolution of electron swarms and the temporal development of the mean energy and the electron number due to attachment and/or ionization processes. We benchmark our code with well-known model gases as well as the real gases argon, N2, O2, CF4, SF6 and mixtures of N2 and O2.
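
    A standard ingredient of such electron-swarm Monte Carlo codes is null-collision (constant-majorant) sampling of the free-flight time when the collision frequency depends on energy. The sketch below uses a toy collision-frequency model rather than LXCat cross sections and is only an illustration of the sampling step, not of METHES itself.

```python
import math, random

NU_MAX = 5.0e12            # constant majorant of the collision frequency, 1/s (assumed)

def nu(energy_ev):
    """Toy energy-dependent total collision frequency (1/s)."""
    return 4.0e12 * energy_ev / (energy_ev + 5.0)

def time_to_real_collision(energy_ev, rng=random):
    """Null-collision sampling: draw candidate event times with the constant
    rate NU_MAX and accept each candidate as a real collision with
    probability nu(energy)/NU_MAX; rejected candidates are 'null' collisions."""
    t = 0.0
    while True:
        t += rng.expovariate(NU_MAX)
        if rng.random() < nu(energy_ev) / NU_MAX:
            return t

samples = [time_to_real_collision(2.0) for _ in range(100_000)]
mean_t = sum(samples) / len(samples)
print(f"mean free time at 2 eV: {mean_t:.3e} s (expected {1.0 / nu(2.0):.3e} s)")
```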

  18. A parallel Monte Carlo code for planar and SPECT imaging: implementation, verification and applications in (131)I SPECT.

    PubMed

    Dewaraja, Yuni K; Ljungberg, Michael; Majumdar, Amitava; Bose, Abhijit; Koral, Kenneth F

    2002-02-01

    This paper reports the implementation of the SIMIND Monte Carlo code on an IBM SP2 distributed memory parallel computer. Basic aspects of running Monte Carlo particle transport calculations on parallel architectures are described. Our parallelization is based on equally partitioning photons among the processors and uses the Message Passing Interface (MPI) library for interprocessor communication and the Scalable Parallel Random Number Generator (SPRNG) to generate uncorrelated random number streams. These parallelization techniques are also applicable to other distributed memory architectures. A linear increase in computing speed with the number of processors is demonstrated for up to 32 processors. This speed-up is especially significant in Single Photon Emission Computed Tomography (SPECT) simulations involving higher energy photon emitters, where explicit modeling of the phantom and collimator is required. For (131)I, the accuracy of the parallel code is demonstrated by comparing simulated and experimental SPECT images from a heart/thorax phantom. Clinically realistic SPECT simulations using the voxel-man phantom are carried out to assess scatter and attenuation correction.
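
    The same parallelization pattern (equal partitioning of photon histories across processors, independent random streams, results gathered by reduction) can be sketched with mpi4py; the toy scoring kernel below is not SIMIND, and numpy's spawned seed sequences stand in for SPRNG.

```python
# Run with e.g.:  mpiexec -n 4 python parallel_histories.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

TOTAL_HISTORIES = 1_000_000
local_n = TOTAL_HISTORIES // size + (1 if rank < TOTAL_HISTORIES % size else 0)

# Independent, uncorrelated stream per rank (stand-in for SPRNG)
rng = np.random.default_rng(np.random.SeedSequence(12345).spawn(size)[rank])

# Toy "history": score 1 if a photon survives a slab of 5 mean free paths
paths = rng.exponential(scale=1.0, size=local_n)
local_score = np.count_nonzero(paths > 5.0)

total_score = comm.reduce(local_score, op=MPI.SUM, root=0)
if rank == 0:
    print(f"transmission estimate: {total_score / TOTAL_HISTORIES:.5f}"
          f"  (exact {np.exp(-5.0):.5f})")
```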

  19. Multi-D Full Boltzmann Neutrino Hydrodynamic Simulations in Core Collapse Supernovae and their detailed comparison with Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Nagakura, Hiroki; Richers, Sherwood; Ott, Christian; Iwakami, Wakana; Furusawa, Shun; Sumiyoshi, Kohsuke; Yamada, Shoichi

    2017-01-01

    We have developed a multi-dimensional radiation-hydrodynamics code which solves the first-principles Boltzmann equation for neutrino transport. It is currently applicable specifically to core-collapse supernovae (CCSNe), but we will extend its applicability to further extreme phenomena such as black hole formation and the coalescence of double neutron stars. In this meeting, I will discuss two things: (1) a detailed comparison with a Monte Carlo neutrino transport code and (2) axisymmetric CCSNe simulations. Project (1) gives us confidence in our code. The Monte Carlo code has been developed by the Caltech group and is specialized to obtain a steady state. Within the CCSNe community, this is the first attempt to compare two different methods for multi-dimensional neutrino transport, and I will show the results of this comparison. For project (2), I focus in particular on the properties of the neutrino distribution function in the semi-transparent region, where only a first-principles Boltzmann solver can appropriately handle the neutrino transport. In addition to these analyses, I will also discuss the "explodability" provided by the neutrino heating mechanism.

  20. Computing Temperatures in Optically Thick Protoplanetary Disks

    NASA Technical Reports Server (NTRS)

    Capuder, Lawrence F.. Jr.

    2011-01-01

    We worked with a Monte Carlo radiative transfer code to simulate the transfer of energy through protoplanetary disks, where planet formation occurs. The code tracks photons from the star into the disk, through scattering, absorption and re-emission, until they escape to infinity. High optical depths in the disk interior dominate the computation time because it takes the photon packet many interactions to get out of the region. Regions of high optical depth also receive few photons and therefore do not have well-estimated temperatures. We applied a modified random walk (MRW) approximation to treat high optical depths and speed up the Monte Carlo calculations. The MRW is implemented by calculating the average number of interactions the photon packet will undergo in diffusing within a single cell of the spatial grid and then updating the packet position, packet frequencies, and local radiation absorption rate appropriately. The MRW approximation was then tested for accuracy and speed against the original code. We determined that the MRW provides accurate answers for Monte Carlo radiative transfer simulations. The speed gained from using the MRW is shown to be proportional to the disk mass.
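
    Not the MRW algorithm itself, but a small demonstration of why it is needed: for an analog isotropic random walk, the number of scatterings before escape from a uniform region grows roughly with the square of its optical depth, which is what makes optically thick cells so expensive. All parameters below are illustrative.

```python
import math, random

def steps_to_escape(tau, rng=random):
    """Isotropic random walk from the centre of a uniform sphere of radial
    optical depth tau (unit mean free path, radius = tau). Returns the
    number of scatterings before the photon leaves the sphere."""
    x = y = z = 0.0
    steps = 0
    while x * x + y * y + z * z < tau * tau:
        cos_t = 2.0 * rng.random() - 1.0
        sin_t = math.sqrt(1.0 - cos_t * cos_t)
        phi = 2.0 * math.pi * rng.random()
        step = rng.expovariate(1.0)
        x += step * sin_t * math.cos(phi)
        y += step * sin_t * math.sin(phi)
        z += step * cos_t
        steps += 1
    return steps

for tau in (5, 10, 20, 40):
    n = 400
    mean_steps = sum(steps_to_escape(tau) for _ in range(n)) / n
    print(f"tau = {tau:3d}: ~{mean_steps:8.1f} scatterings per escaping photon")
```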

  1. LLNL Mercury Project Trinity Open Science Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brantley, Patrick; Dawson, Shawn; McKinley, Scott

    2016-04-20

    The Mercury Monte Carlo particle transport code developed at Lawrence Livermore National Laboratory (LLNL) is used to simulate the transport of radiation through urban environments. These challenging calculations include complicated geometries and require significant computational resources to complete. As a result, a question arises as to the level of convergence of the calculations with Monte Carlo simulation particle count. In the Trinity Open Science calculations, one main focus was to investigate convergence of the relevant simulation quantities with Monte Carlo particle count in order to assess the current simulation methodology. Both for this application space and for more general applicability, we also investigated the impact of code algorithms on parallel scaling on the Trinity machine, as well as the utilization of the Trinity DataWarp burst buffer technology in Mercury via the LLNL Scalable Checkpoint/Restart (SCR) library.

  2. Validation of a vector version of the 6S radiative transfer code for atmospheric correction of satellite data. Part I: Path radiance

    NASA Astrophysics Data System (ADS)

    Kotchenova, Svetlana Y.; Vermote, Eric F.; Matarrese, Raffaella; Klemm, Frank J., Jr.

    2006-09-01

    A vector version of the 6S (Second Simulation of a Satellite Signal in the Solar Spectrum) radiative transfer code (6SV1), which enables accounting for radiation polarization, has been developed and validated against a Monte Carlo code, Coulson's tabulated values, and MOBY (Marine Optical Buoy System) water-leaving reflectance measurements. The developed code was also tested against the scalar codes SHARM, DISORT, and MODTRAN to evaluate its performance in scalar mode and the influence of polarization. The obtained results have shown a good agreement of 0.7% in comparison with the Monte Carlo code, 0.2% for Coulson's tabulated values, and 0.001-0.002 for the 400-550 nm region for the MOBY reflectances. Ignoring the effects of polarization led to large errors in calculated top-of-atmosphere reflectances: more than 10% for a molecular atmosphere and up to 5% for an aerosol atmosphere. This new version of 6S is intended to replace the previous scalar version used for calculation of lookup tables in the MODIS (Moderate Resolution Imaging Spectroradiometer) atmospheric correction algorithm.

  3. Validation of a vector version of the 6S radiative transfer code for atmospheric correction of satellite data. Part I: path radiance.

    PubMed

    Kotchenova, Svetlana Y; Vermote, Eric F; Matarrese, Raffaella; Klemm, Frank J

    2006-09-10

    A vector version of the 6S (Second Simulation of a Satellite Signal in the Solar Spectrum) radiative transfer code (6SV1), which enables accounting for radiation polarization, has been developed and validated against a Monte Carlo code, Coulson's tabulated values, and MOBY (Marine Optical Buoy System) water-leaving reflectance measurements. The developed code was also tested against the scalar codes SHARM, DISORT, and MODTRAN to evaluate its performance in scalar mode and the influence of polarization. The obtained results have shown a good agreement of 0.7% in comparison with the Monte Carlo code, 0.2% for Coulson's tabulated values, and 0.001-0.002 for the 400-550 nm region for the MOBY reflectances. Ignoring the effects of polarization led to large errors in calculated top-of-atmosphere reflectances: more than 10% for a molecular atmosphere and up to 5% for an aerosol atmosphere. This new version of 6S is intended to replace the previous scalar version used for calculation of lookup tables in the MODIS (Moderate Resolution Imaging Spectroradiometer) atmospheric correction algorithm.

  4. Combined experimental and Monte Carlo verification of brachytherapy plans for vaginal applicators

    NASA Astrophysics Data System (ADS)

    Sloboda, Ron S.; Wang, Ruqing

    1998-12-01

    Dose rates in a phantom around a shielded and an unshielded vaginal applicator containing Selectron low-dose-rate sources were determined by experiment and Monte Carlo simulation. Measurements were performed with thermoluminescent dosimeters in a white polystyrene phantom using an experimental protocol geared for precision. Calculations for the same set-up were done using a version of the EGS4 Monte Carlo code system modified for brachytherapy applications into which a new combinatorial geometry package developed by Bielajew was recently incorporated. Measured dose rates agree with Monte Carlo estimates to within 5% (1 SD) for the unshielded applicator, while highlighting some experimental uncertainties for the shielded applicator. Monte Carlo calculations were also done to determine a value for the effective transmission of the shield required for clinical treatment planning, and to estimate the dose rate in water at points in axial and sagittal planes transecting the shielded applicator. Comparison with dose rates generated by the planning system indicates that agreement is better than 5% (1 SD) at most positions. The precision thermoluminescent dosimetry protocol and modified Monte Carlo code are effective complementary tools for brachytherapy applicator dosimetry.

  5. TU-AB-BRC-05: Creation of a Monte Carlo TrueBeam Model by Reproducing Varian Phase Space Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O’Grady, K; Davis, S; Seuntjens, J

    Purpose: To create a Varian TrueBeam 6 MV FFF Monte Carlo model using BEAMnrc/EGSnrc that accurately reproduces the Varian representative dataset, followed by tuning the model’s source parameters to accurately reproduce in-house measurements. Methods: A BEAMnrc TrueBeam model for 6 MV FFF has been created by modifying a validated 6 MV Varian CL21EX model. Geometric dimensions and materials were adjusted in a trial and error approach to match the fluence and spectra of TrueBeam phase spaces output by the Varian VirtuaLinac. Once the model’s phase space matched Varian’s counterpart using the default source parameters, it was validated to match 10 × 10 cm² Varian representative data obtained with the IBA CC13. The source parameters were then tuned to match in-house 5 × 5 cm² PTW microDiamond measurements. All dose to water simulations included detector models to include the effects of volume averaging and the non-water equivalence of the chamber materials, allowing for more accurate source parameter selection. Results: The Varian phase space spectra and fluence were matched with excellent agreement. The in-house model’s PDD agreement with CC13 TrueBeam representative data was within 0.9% local percent difference beyond the first 3 mm. Profile agreement at 10 cm depth was within 0.9% local percent difference and 1.3 mm distance-to-agreement in the central axis and penumbra regions, respectively. Once the source parameters were tuned, PDD agreement with microDiamond measurements was within 0.9% local percent difference beyond 2 mm. The microDiamond profile agreement at 10 cm depth was within 0.6% local percent difference and 0.4 mm distance-to-agreement in the central axis and penumbra regions, respectively. Conclusion: An accurate in-house Monte Carlo model of the Varian TrueBeam was achieved independently of the Varian phase space solution and was tuned to in-house measurements. KO acknowledges partial support by the CREATE Medical Physics Research Training Network grant of the Natural Sciences and Engineering Research Council (Grant number: 432290).

  6. FitSKIRT: genetic algorithms to automatically fit dusty galaxies with a Monte Carlo radiative transfer code

    NASA Astrophysics Data System (ADS)

    De Geyter, G.; Baes, M.; Fritz, J.; Camps, P.

    2013-02-01

    We present FitSKIRT, a method to efficiently fit radiative transfer models to UV/optical images of dusty galaxies. These images have the advantage that they have better spatial resolution compared to FIR/submm data. FitSKIRT uses the GAlib genetic algorithm library to optimize the output of the SKIRT Monte Carlo radiative transfer code. Genetic algorithms prove to be a valuable tool in handling the multi-dimensional search space as well as the noise induced by the random nature of the Monte Carlo radiative transfer code. FitSKIRT is tested on artificial images of a simulated edge-on spiral galaxy, where we gradually increase the number of fitted parameters. We find that we can recover all model parameters, even if all 11 model parameters are left unconstrained. Finally, we apply the FitSKIRT code to a V-band image of the edge-on spiral galaxy NGC 4013. This galaxy has been modeled previously by other authors using different combinations of radiative transfer codes and optimization methods. Given the different models and techniques and the complexity and degeneracies in the parameter space, we find reasonable agreement between the different models. We conclude that the FitSKIRT method allows comparison between different models and geometries in a quantitative manner and minimizes the need of human intervention and biasing. The high level of automation makes it an ideal tool to use on larger sets of observed data.

  7. Solution of the Burnett equations for hypersonic flows near the continuum limit

    NASA Technical Reports Server (NTRS)

    Imlay, Scott T.

    1992-01-01

    The INCA code, a three-dimensional Navier-Stokes code for analysis of hypersonic flowfields, was modified to analyze the lower reaches of the continuum transition regime, where the Navier-Stokes equations become inaccurate and Monte Carlo methods become too computationally expensive. The two-dimensional Burnett equations and the three-dimensional rotational energy transport equation were added to the code, and one- and two-dimensional calculations were performed. For the structure of normal shock waves, the Burnett equations give consistently better results than the Navier-Stokes equations and compare reasonably well with Monte Carlo methods. For two-dimensional flow of nitrogen past a circular cylinder, the Burnett equations predict the total drag reasonably well. Care must be taken, however, not to exceed the range of validity of the Burnett equations.

  8. Monte Carlo simulation of ion-neutral charge exchange collisions and grid erosion in an ion thruster

    NASA Technical Reports Server (NTRS)

    Peng, Xiaohang; Ruyten, Wilhelmus M.; Keefer, Dennis

    1991-01-01

    A combined particle-in-cell (PIC)/Monte Carlo simulation model has been developed in which the Monte Carlo method is used to simulate the ion-neutral charge exchange collisions. It is noted that a number of features were reproduced correctly by this code, but that its assumption of two-dimensional axisymmetry for a single set of grid apertures precluded the reproduction of the most characteristic feature of actual test data; namely, the concentrated grid erosion at the geometric center of the hexagonal aperture array. The first results of a three-dimensional code, which takes into account the hexagonal symmetry of the grid, are presented. It is shown that, with this code, the experimentally observed erosion patterns are reproduced correctly, demonstrating explicitly the concentration of sputtering between apertures.

  9. Accuracy and convergence of coupled finite-volume/Monte Carlo codes for plasma edge simulations of nuclear fusion reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghoos, K., E-mail: kristel.ghoos@kuleuven.be; Dekeyser, W.; Samaey, G.

    2016-10-01

    The plasma and neutral transport in the plasma edge of a nuclear fusion reactor is usually simulated using coupled finite volume (FV)/Monte Carlo (MC) codes. However, under conditions of future reactors like ITER and DEMO, convergence issues become apparent. This paper examines the convergence behaviour and the numerical error contributions with a simplified FV/MC model for three coupling techniques: Correlated Sampling, Random Noise and Robbins Monro. Also, practical procedures to estimate the errors in complex codes are proposed. Moreover, first results with more complex models show that an order of magnitude speedup can be achieved without any loss in accuracy by making use of averaging in the Random Noise coupling technique.
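
    The benefit of averaging in a noisy coupling scheme can be illustrated with a one-line fixed-point problem. The sketch below is a minimal stand-in, not the authors' FV/MC model: it iterates a map whose right-hand side is evaluated with Monte Carlo-like noise and compares the last iterate with the average of the post-transient iterates.

        # Minimal stand-in (not the authors' FV/MC model): a fixed-point iteration
        # x <- F(x) whose right-hand side carries Monte Carlo noise, as in a
        # "Random Noise" style coupling, followed by averaging of the post-transient
        # iterates to suppress the statistical error at no extra particle cost.
        import random

        def F_noisy(x, n_particles=100):
            exact = 0.5 * x + 1.0                       # invented coupled map, fixed point x = 2
            return exact + random.gauss(0.0, 1.0 / n_particles ** 0.5)

        x, history = 0.0, []
        for it in range(2000):
            x = F_noisy(x)
            if it >= 500:                               # discard the initial transient
                history.append(x)

        print("last iterate:", x)
        print("averaged    :", sum(history) / len(history))   # markedly less noisy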

  10. Scaling GDL for Multi-cores to Process Planck HFI Beams Monte Carlo on HPC

    NASA Astrophysics Data System (ADS)

    Coulais, A.; Schellens, M.; Duvert, G.; Park, J.; Arabas, S.; Erard, S.; Roudier, G.; Hivon, E.; Mottet, S.; Laurent, B.; Pinter, M.; Kasradze, N.; Ayad, M.

    2014-05-01

    After reviewing the major progress made in GDL (now at version 0.9.4) on performance and plotting capabilities since the ADASS XXI paper (Coulais et al. 2012), we detail how a large code for Planck HFI beams Monte Carlo was successfully transposed from IDL to GDL on HPC.

  11. A Novel Implementation of Massively Parallel Three Dimensional Monte Carlo Radiation Transport

    NASA Astrophysics Data System (ADS)

    Robinson, P. B.; Peterson, J. D. L.

    2005-12-01

    The goal of our summer project was to implement the difference formulation for radiation transport into Cosmos++, a multidimensional, massively parallel, magnetohydrodynamics code for astrophysical applications (Peter Anninos - AX). The difference formulation is a new method for Symbolic Implicit Monte Carlo thermal transport (Brooks and Szöke - PAT). Formerly, simultaneous implementation of fully implicit Monte Carlo radiation transport in multiple dimensions on multiple processors had not been convincingly demonstrated. We found that a combination of the difference formulation and the inherent structure of Cosmos++ makes such an implementation both accurate and straightforward. We developed a "nearly nearest neighbor physics" technique to allow each processor to work independently, even with a fully implicit code. This technique, coupled with the increased accuracy of an implicit Monte Carlo solution and the efficiency of parallel computing systems, allows us to demonstrate the possibility of massively parallel thermal transport. This work was performed under the auspices of the U.S. Department of Energy by University of California Lawrence Livermore National Laboratory under contract No. W-7405-Eng-48

  12. Accelerating Monte Carlo simulations of photon transport in a voxelized geometry using a massively parallel graphics processing unit.

    PubMed

    Badal, Andreu; Badano, Aldo

    2009-11-01

    Monte Carlo simulations of radiation transport are computationally intensive and may require long computing times. The authors introduce a new paradigm for the acceleration of Monte Carlo simulations: the use of a graphics processing unit (GPU) as the main computing device instead of a central processing unit (CPU). A GPU-based Monte Carlo code that simulates photon transport in a voxelized geometry with the accurate physics models from PENELOPE has been developed using the CUDA™ programming model (NVIDIA Corporation, Santa Clara, CA). An outline of the new code and a sample x-ray imaging simulation with an anthropomorphic phantom are presented. A remarkable 27-fold speed-up factor was obtained using a GPU compared to a single core CPU. The reported results show that GPUs are currently a good alternative to CPUs for the simulation of radiation transport. Since the performance of GPUs is currently increasing at a faster pace than that of CPUs, the advantages of GPU-based software are likely to be more pronounced in the future.
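
    For readers unfamiliar with the structure of such codes, the sketch below spells out the random walk that each GPU thread effectively executes for one photon history in a voxelized geometry. The physics is deliberately simplified (fixed attenuation coefficient, isotropic scattering, crude energy loss) and all constants are invented; the actual code uses the detailed PENELOPE interaction models.

        # Sketch of the per-photon random walk each GPU thread executes in a
        # voxelized geometry. The physics is deliberately toy, not the PENELOPE
        # models used by the authors, and every constant below is an assumption.
        import math, random

        NVOX, VOXSIZE = 50, 0.2          # 50^3 voxels, 2 mm each (cm)
        MU_TOTAL, ALBEDO = 0.5, 0.8      # total attenuation (1/cm), scatter probability
        dose = {}                        # sparse energy-deposition tally

        def run_photon(energy=0.06):     # 60 keV starting energy, arbitrary
            x = y = z = NVOX * VOXSIZE / 2.0
            u, v, w = 0.0, 0.0, 1.0
            while True:
                step = -math.log(1.0 - random.random()) / MU_TOTAL      # free path (cm)
                x, y, z = x + step * u, y + step * v, z + step * w
                i, j, k = math.floor(x / VOXSIZE), math.floor(y / VOXSIZE), math.floor(z / VOXSIZE)
                if not (0 <= i < NVOX and 0 <= j < NVOX and 0 <= k < NVOX):
                    return                                              # escaped the phantom
                if random.random() > ALBEDO:                            # absorbed
                    dose[(i, j, k)] = dose.get((i, j, k), 0.0) + energy
                    return
                deposit = 0.3 * energy                                  # crude 30% loss per scatter
                dose[(i, j, k)] = dose.get((i, j, k), 0.0) + deposit
                energy -= deposit
                w = 2.0 * random.random() - 1.0                         # isotropic new direction
                phi = 2.0 * math.pi * random.random()
                sin_t = math.sqrt(1.0 - w * w)
                u, v = sin_t * math.cos(phi), sin_t * math.sin(phi)

        for _ in range(10000):
            run_photon()
        print("voxels with deposited energy:", len(dose))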

  13. An empirical approach to estimate near-infra-red photon propagation and optically induced drug release in brain tissues

    NASA Astrophysics Data System (ADS)

    Prabhu Verleker, Akshay; Fang, Qianqian; Choi, Mi-Ran; Clare, Susan; Stantz, Keith M.

    2015-03-01

    The purpose of this study is to develop an alternate empirical approach to estimate near-infra-red (NIR) photon propagation and quantify optically induced drug release in brain metastasis, without relying on computationally expensive Monte Carlo techniques (gold standard). Targeted drug delivery with optically induced drug release is a noninvasive means to treat cancers and metastasis. This study is part of a larger project to treat brain metastasis by delivering lapatinib-drug-nanocomplexes and activating NIR-induced drug release. The empirical model was developed using a weighted approach to estimate photon scattering in tissues and calibrated using a GPU-based 3D Monte Carlo. The empirical model was developed and tested against Monte Carlo in optical brain phantoms for pencil beams (width 1 mm) and broad beams (width 10 mm). The empirical algorithm was tested against the Monte Carlo for different albedos along with the diffusion equation, and in simulated brain phantoms resembling white matter (μs' = 8.25 mm⁻¹, μa = 0.005 mm⁻¹) and gray matter (μs' = 2.45 mm⁻¹, μa = 0.035 mm⁻¹) at a wavelength of 800 nm. The goodness of fit between the two models was determined using the coefficient of determination (R-squared analysis). Preliminary results show the empirical algorithm matches the Monte Carlo simulated fluence over a wide range of albedo (0.7 to 0.99), while the diffusion equation fails for lower albedo. The photon fluence generated by the empirical code matched the Monte Carlo in homogeneous phantoms (R² = 0.99). While the GPU-based Monte Carlo achieved 300× acceleration compared to earlier CPU-based models, the empirical code is 700× faster than the Monte Carlo for a typical super-Gaussian laser beam.
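
    The comparison metric quoted above is the ordinary coefficient of determination; the short sketch below shows the calculation for two stand-in fluence arrays (the data here are invented, not the phantom results).

        # R-squared comparison between an "empirical" and a Monte Carlo fluence;
        # the arrays are invented stand-ins for illustration only.
        import numpy as np

        mc_fluence = np.exp(-np.linspace(0.0, 5.0, 100))                  # reference (gold standard)
        emp_fluence = mc_fluence * (1.0 + 0.01 * np.random.randn(100))    # empirical estimate

        ss_res = np.sum((mc_fluence - emp_fluence) ** 2)
        ss_tot = np.sum((mc_fluence - mc_fluence.mean()) ** 2)
        print("R^2 =", 1.0 - ss_res / ss_tot)     # values near 1 indicate close agreement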

  14. PEPSI — a Monte Carlo generator for polarized leptoproduction

    NASA Astrophysics Data System (ADS)

    Mankiewicz, L.; Schäfer, A.; Veltri, M.

    1992-09-01

    We describe PEPSI (Polarized Electron Proton Scattering Interactions), a Monte Carlo program for polarized deep inelastic leptoproduction mediated by electromagnetic interaction, and explain how to use it. The code is a modification of the LEPTO 4.3 Lund Monte Carlo for unpolarized scattering. The hard virtual gamma-parton scattering is generated according to the polarization-dependent QCD cross-section at first order in α_S. PEPSI requires the standard polarization-independent JETSET routines to simulate the fragmentation into final hadrons.

  15. The radiation fields around a proton therapy facility: A comparison of Monte Carlo simulations

    NASA Astrophysics Data System (ADS)

    Ottaviano, G.; Picardi, L.; Pillon, M.; Ronsivalle, C.; Sandri, S.

    2014-02-01

    A proton therapy test facility with a beam current lower than 10 nA on average, and an energy up to 150 MeV, is planned to be sited at the Frascati ENEA Research Center, in Italy. The accelerator is composed of a sequence of linear sections. The first one is a commercial 7 MeV proton linac, from which the beam is injected into a SCDTL (Side Coupled Drift Tube Linac) structure reaching the energy of 52 MeV. A conventional CCL (Coupled Cavity Linac) with side coupling cavities then completes the accelerator. The linear structure has the important advantage that the main radiation losses during the acceleration process occur to protons with energy below 20 MeV, with a consequent low production of neutrons and secondary radiation. From the radiation protection point of view, the source of radiation for this facility is then almost completely located at the final target. Physical and geometrical models of the device have been developed and implemented into radiation transport computer codes based on the Monte Carlo method. The purpose is the assessment of the radiation field around the main source in support of the safety analysis. For the assessment, independent researchers used two different Monte Carlo computer codes, FLUKA (FLUktuierende KAskade) and MCNPX (Monte Carlo N-Particle eXtended). Both are general-purpose tools for calculations of particle transport and interactions with matter, covering an extended range of applications including proton beam analysis. Nevertheless, each one utilizes its own nuclear cross-section libraries and uses specific physics models for particle types and energies. The models implemented into the codes are described and the results are presented. The differences between the two calculations are reported and discussed, pointing out the disadvantages and advantages of each code in this specific application.

  16. Development of Monte Carlo based real-time treatment planning system with fast calculation algorithm for boron neutron capture therapy.

    PubMed

    Takada, Kenta; Kumada, Hiroaki; Liem, Peng Hong; Sakurai, Hideyuki; Sakae, Takeji

    2016-12-01

    We simulated the effect of patient displacement on organ doses in boron neutron capture therapy (BNCT). In addition, we developed a faster calculation algorithm (NCT high-speed) to simulate irradiation more efficiently. We simulated dose evaluation for the standard irradiation position (reference position) using a head phantom. Cases were assumed where the patient body is shifted in lateral directions compared to the reference position, as well as in the direction away from the irradiation aperture. For three neutron groups (thermal, epithermal, and fast), flux distributions were calculated using NCT high-speed with a voxelized homogeneous phantom. The three groups of neutron fluxes were calculated for the same conditions with a Monte Carlo code, and the calculated results were compared. In the evaluations of body movements, there were no significant differences even with shifts of up to 9 mm in the lateral directions. However, the dose decreased by about 10% with shifts of 9 mm in the direction away from the irradiation aperture. When comparing both calculations from the phantom surface to a depth of 3 cm, the maximum differences between the fluxes calculated by NCT high-speed and those calculated by the Monte Carlo code were 10% for thermal neutrons and 18% for epithermal neutrons. The time required by the NCT high-speed code was about one-tenth of that of the Monte Carlo calculation. In the evaluation, the longitudinal displacement has a considerable effect on the organ doses. We also achieved faster calculation of the depth distribution of thermal neutron flux using the NCT high-speed calculation code. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  17. Fast GPU-based Monte Carlo simulations for LDR prostate brachytherapy.

    PubMed

    Bonenfant, Éric; Magnoux, Vincent; Hissoiny, Sami; Ozell, Benoît; Beaulieu, Luc; Després, Philippe

    2015-07-07

    The aim of this study was to evaluate the potential of bGPUMCD, a Monte Carlo algorithm executed on Graphics Processing Units (GPUs), for fast dose calculations in permanent prostate implant dosimetry. It also aimed to validate a low dose rate brachytherapy source in terms of TG-43 metrics and to use this source to compute dose distributions for permanent prostate implants in very short times. The physics of bGPUMCD was reviewed and extended to include Rayleigh scattering and fluorescence from photoelectric interactions for all materials involved. The radial and anisotropy functions were obtained for the Nucletron SelectSeed in TG-43 conditions. These functions were compared to those found in the MD Anderson Imaging and Radiation Oncology Core brachytherapy source registry, which are considered the TG-43 reference values. After appropriate calibration of the source, permanent prostate implant dose distributions were calculated for four patients and compared to an already validated Geant4 algorithm. The radial function calculated from bGPUMCD showed excellent agreement (differences within 1.3%) with TG-43 accepted values. The anisotropy functions at r = 1 cm and r = 4 cm were within 2% of TG-43 values for angles over 17.5°. For permanent prostate implants, Monte Carlo-based dose distributions with a statistical uncertainty of 1% or less for the target volume were obtained in 30 s or less for 1 × 1 × 1 mm³ calculation grids. Dosimetric indices were very similar (within 2.7%) to those obtained with a validated, independent Monte Carlo code (Geant4) performing the calculations for the same cases in a much longer time (tens of minutes to more than an hour). bGPUMCD is a promising code that makes it possible to envision the use of Monte Carlo techniques in a clinical environment, with sub-minute execution times on a standard workstation. Future work will explore the use of this code with an inverse planning method to provide a complete Monte Carlo-based planning solution.
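
    For context, the radial dose function g_L(r) and anisotropy function F(r,θ) referred to above enter the dose rate through the standard AAPM TG-43 line-source formalism, which can be written as

        \dot{D}(r,\theta) = S_K \,\Lambda\, \frac{G_L(r,\theta)}{G_L(r_0,\theta_0)}\, g_L(r)\, F(r,\theta),
        \qquad r_0 = 1\ \mathrm{cm},\ \theta_0 = 90^{\circ}

    This is the standard formalism the TG-43 metrics refer to, shown here for context rather than taken from the paper itself.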

  18. Fast GPU-based Monte Carlo simulations for LDR prostate brachytherapy

    NASA Astrophysics Data System (ADS)

    Bonenfant, Éric; Magnoux, Vincent; Hissoiny, Sami; Ozell, Benoît; Beaulieu, Luc; Després, Philippe

    2015-07-01

    The aim of this study was to evaluate the potential of bGPUMCD, a Monte Carlo algorithm executed on Graphics Processing Units (GPUs), for fast dose calculations in permanent prostate implant dosimetry. It also aimed to validate a low dose rate brachytherapy source in terms of TG-43 metrics and to use this source to compute dose distributions for permanent prostate implants in very short times. The physics of bGPUMCD was reviewed and extended to include Rayleigh scattering and fluorescence from photoelectric interactions for all materials involved. The radial and anisotropy functions were obtained for the Nucletron SelectSeed in TG-43 conditions. These functions were compared to those found in the MD Anderson Imaging and Radiation Oncology Core brachytherapy source registry, which are considered the TG-43 reference values. After appropriate calibration of the source, permanent prostate implant dose distributions were calculated for four patients and compared to an already validated Geant4 algorithm. The radial function calculated from bGPUMCD showed excellent agreement (differences within 1.3%) with TG-43 accepted values. The anisotropy functions at r = 1 cm and r = 4 cm were within 2% of TG-43 values for angles over 17.5°. For permanent prostate implants, Monte Carlo-based dose distributions with a statistical uncertainty of 1% or less for the target volume were obtained in 30 s or less for 1 × 1 × 1 mm³ calculation grids. Dosimetric indices were very similar (within 2.7%) to those obtained with a validated, independent Monte Carlo code (Geant4) performing the calculations for the same cases in a much longer time (tens of minutes to more than an hour). bGPUMCD is a promising code that makes it possible to envision the use of Monte Carlo techniques in a clinical environment, with sub-minute execution times on a standard workstation. Future work will explore the use of this code with an inverse planning method to provide a complete Monte Carlo-based planning solution.

  19. Development of a new multi-modal Monte-Carlo radiotherapy planning system.

    PubMed

    Kumada, H; Nakamura, T; Komeda, M; Matsumura, A

    2009-07-01

    A new multi-modal Monte-Carlo radiotherapy planning system (development code: JCDS-FX) is under development at the Japan Atomic Energy Agency. This system builds on fundamental technologies of JCDS applied to actual boron neutron capture therapy (BNCT) trials in JRR-4. One of the features of JCDS-FX is that PHITS is applied to the particle transport calculation. PHITS is a multi-purpose particle Monte-Carlo transport code. Hence, the application of PHITS enables evaluation of the total doses given to a patient by a combined-modality therapy. Moreover, JCDS-FX with PHITS can be used for the study of accelerator-based BNCT. To verify the calculation accuracy of JCDS-FX, dose evaluations for neutron irradiation of a cylindrical water phantom and for an actual clinical trial were performed, and the results were compared with calculations by JCDS with MCNP. The verification results demonstrated that JCDS-FX is applicable to practical BNCT treatment planning.

  20. Implementation, capabilities, and benchmarking of Shift, a massively parallel Monte Carlo radiation transport code

    DOE PAGES

    Pandya, Tara M.; Johnson, Seth R.; Evans, Thomas M.; ...

    2015-12-21

    This paper discusses the implementation, capabilities, and validation of Shift, a massively parallel Monte Carlo radiation transport package developed and maintained at Oak Ridge National Laboratory. It has been developed to scale well from laptop to small computing clusters to advanced supercomputers. Special features of Shift include hybrid capabilities for variance reduction such as CADIS and FW-CADIS, and advanced parallel decomposition and tally methods optimized for scalability on supercomputing architectures. Shift has been validated and verified against various reactor physics benchmarks and compares well to other state-of-the-art Monte Carlo radiation transport codes such as MCNP5, CE KENO-VI, and OpenMC. Some specific benchmarks used for verification and validation include the CASL VERA criticality test suite and several Westinghouse AP1000® problems. These benchmark and scaling studies show promising results.

  1. Design and optimization of a portable LQCD Monte Carlo code using OpenACC

    NASA Astrophysics Data System (ADS)

    Bonati, Claudio; Coscetti, Simone; D'Elia, Massimo; Mesiti, Michele; Negro, Francesco; Calore, Enrico; Schifano, Sebastiano Fabio; Silvi, Giorgio; Tripiccione, Raffaele

    The present panorama of HPC architectures is extremely heterogeneous, ranging from traditional multi-core CPU processors, supporting a wide class of applications but delivering moderate computing performance, to many-core Graphics Processing Units (GPUs), exploiting aggressive data-parallelism and delivering higher performance for streaming computing applications. In this scenario, code portability (and performance portability) becomes necessary for easy maintainability of applications; this is very relevant in scientific computing, where code changes are very frequent, making it tedious and error-prone to keep different code versions aligned. In this work, we present the design and optimization of a state-of-the-art production-level LQCD Monte Carlo application, using the directive-based OpenACC programming model. OpenACC abstracts parallel programming to a descriptive level, relieving programmers from specifying how codes should be mapped onto the target architecture. We describe the implementation of a code fully written in OpenACC, and show that we are able to target several different architectures, including state-of-the-art traditional CPUs and GPUs, with the same code. We also measure performance, evaluating the computing efficiency of our OpenACC code on several architectures, comparing with GPU-specific implementations and showing that a good level of performance portability can be reached.

  2. Track structure in radiation biology: theory and applications.

    PubMed

    Nikjoo, H; Uehara, S; Wilson, W E; Hoshi, M; Goodhead, D T

    1998-04-01

    A brief review is presented of the basic concepts in track structure, and the relative merits of various theoretical approaches adopted in Monte-Carlo track-structure codes are examined. In the second part of the paper, a formal cluster analysis is introduced to calculate cluster-distance distributions. Total experimental ionization cross-sections were least-squares fitted and compared with the calculation by various theoretical methods. The Monte-Carlo track-structure code Kurbuc was used to examine and compare the spectrum of the secondary electrons generated by using functions given by Born-Bethe, Jain-Khare, Gryzinsky, Kim-Rudd, Mott and Vriens' theories. The cluster analysis in track structure was carried out using the k-means method and Hartigan algorithm. Data are presented on experimental and calculated total ionization cross-sections: inverse mean free path (IMFP) as a function of electron energy used in Monte-Carlo track-structure codes; the spectrum of secondary electrons generated by different functions for 500 eV primary electrons; cluster analysis for 4 MeV and 20 MeV alpha-particles in terms of the frequency of total cluster energy to the root-mean-square (rms) radius of the cluster and differential distance distributions for a pair of clusters; and finally relative frequency distribution for energy deposited in DNA, single-strand breaks and double-strand breaks for 10 MeV/u protons, alpha-particles and carbon ions. There are a number of Monte-Carlo track-structure codes that have been developed independently, and the bench-marking presented in this paper allows a better choice of the theoretical method adopted in a track-structure code to be made. A systematic bench-marking of cross-sections and spectra of the secondary electrons shows differences between the codes at the atomic level, but such differences are not significant in biophysical modelling at the macromolecular level. Clustered-damage evaluation shows that a substantial proportion of dose (~30%) is deposited by low-energy electrons; the majority of DNA damage lesions are of simple type; the complexity of damage increases with increased LET, while the total yield of strand breaks remains constant; and at high LET values nearly 70% of all double-strand breaks are of complex type.

  3. Improved Convergence Rate of Multi-Group Scattering Moment Tallies for Monte Carlo Neutron Transport Codes

    NASA Astrophysics Data System (ADS)

    Nelson, Adam

    Multi-group scattering moment matrices are critical to the solution of the multi-group form of the neutron transport equation, as they are responsible for describing the change in direction and energy of neutrons. These matrices, however, are difficult to calculate correctly from the measured nuclear data with both deterministic and stochastic methods. Calculating these parameters when using deterministic methods requires a set of assumptions which do not hold true in all conditions. These quantities can be calculated accurately with stochastic methods; however, doing so is computationally expensive due to the poor efficiency of tallying scattering moment matrices. This work presents an improved method of obtaining multi-group scattering moment matrices from a Monte Carlo neutron transport code. This improved method of tallying the scattering moment matrices is based on recognizing that all of the outgoing particle information is known a priori and can be taken advantage of to increase the tallying efficiency (therefore reducing the uncertainty) of the stochastically integrated tallies. In this scheme, the complete outgoing probability distribution is tallied, supplying every element of the scattering moment matrices with its share of data. In addition to reducing the uncertainty, this method allows for the use of a track-length estimation process, potentially offering even further improvement to the tallying efficiency. Unfortunately, to produce the needed distributions, the probability functions themselves must undergo an integration over the outgoing energy and scattering angle dimensions. This integration is too costly to perform during the Monte Carlo simulation itself and therefore must be performed in advance by way of a pre-processing code. The new method increases the information obtained from tally events and therefore has a significantly higher efficiency than the currently used techniques. The improved method has been implemented in a code system containing a new pre-processor code, NDPP, and a Monte Carlo neutron transport code, OpenMC. This method is then tested in a pin cell problem and a larger problem designed to accentuate the importance of scattering moment matrices. These tests show that accuracy was retained while the figure-of-merit for generating scattering moment matrices and fission energy spectra was significantly improved.
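
    To make the tallied quantity concrete, the sketch below accumulates a toy analog (collision-based) estimate of multi-group Legendre scattering moments. This is the conventional estimator whose poor efficiency motivates the improved scheme, not the NDPP/OpenMC implementation itself, and the group structure and sampled collisions are invented.

        # Toy analog (collision-based) tally of multi-group Legendre scattering
        # moments: the conventional estimator, NOT the NDPP/OpenMC implementation.
        # Groups and the sampled collisions are invented.
        import numpy as np
        from numpy.polynomial.legendre import legval

        NGROUPS, NMOMENTS, NCOLLISIONS = 3, 4, 100_000
        tally = np.zeros((NMOMENTS, NGROUPS, NGROUPS))      # indexed [l, g_in, g_out]
        rng = np.random.default_rng(1)

        for _ in range(NCOLLISIONS):
            g_in = rng.integers(NGROUPS)                    # incoming energy group
            g_out = rng.integers(NGROUPS)                   # sampled outgoing group
            mu = rng.uniform(-1.0, 1.0)                     # sampled scattering cosine
            for l in range(NMOMENTS):
                coeffs = np.zeros(l + 1)
                coeffs[l] = 1.0                             # selects the Legendre polynomial P_l
                tally[l, g_in, g_out] += legval(mu, coeffs) # unit-weight P_l(mu) contribution

        print(tally[0] / NCOLLISIONS)    # l = 0 block approximates the group-transfer probabilities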

  4. Poster — Thur Eve — 53: Novel Technique for the Measurement of Ultra-Superficial Doses Using Gafchromic Film

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marcos, M.; Devic, S.

    2014-08-15

    Purpose: Dose build-up and electron contamination are two closely related quantities with important implications in radiotherapy, yet they are quite difficult to measure with great certainty. We present a novel technique for measuring ultra-superficial doses. Method and Materials: We used Gafchromic EBT-3 film, which has an effective point of measurement of roughly 153 microns (effective depth in water). By peeling off one of the polyester layers, the active layer becomes the top layer and we obtain a film with an effective point of measurement of 15 microns (effective depth in water). A film calibration was performed using a 180 kVp orthovoltage beam. Since the active layer of the film may have been compressed or perturbed during the peeling of the clear polyester, we use a triple-channel film calibration technique to minimize the effects of non-uniformity in the active layer. We measured surface doses of orthovoltage beams with lead cutouts in place to introduce contaminant photoelectrons. Results: Our measurements show that the dose enhancement near the edges of the lead, relative to the central axis, ranged from about 125% for 6 cm diameter cutouts up to 170% for 2 cm diameter cutouts, which was within 5% of our EGSnrc-based Monte Carlo simulations.

  5. Review of Hybrid (Deterministic/Monte Carlo) Radiation Transport Methods, Codes, and Applications at Oak Ridge National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wagner, John C; Peplow, Douglas E.; Mosher, Scott W

    2011-01-01

    This paper provides a review of the hybrid (Monte Carlo/deterministic) radiation transport methods and codes used at the Oak Ridge National Laboratory and examples of their application for increasing the efficiency of real-world, fixed-source Monte Carlo analyses. The two principal hybrid methods are (1) Consistent Adjoint Driven Importance Sampling (CADIS) for optimization of a localized detector (tally) region (e.g., flux, dose, or reaction rate at a particular location) and (2) Forward Weighted CADIS (FW-CADIS) for optimizing distributions (e.g., mesh tallies over all or part of the problem space) or multiple localized detector regions (e.g., simultaneous optimization of two or more localized tally regions). The two methods have been implemented and automated in both the MAVRIC sequence of SCALE 6 and ADVANTG, a code that works with the MCNP code. As implemented, the methods utilize the results of approximate, fast-running 3-D discrete ordinates transport calculations (with the Denovo code) to generate consistent space- and energy-dependent source and transport (weight windows) biasing parameters. These methods and codes have been applied to many relevant and challenging problems, including calculations of PWR ex-core thermal detector response, dose rates throughout an entire PWR facility, site boundary dose from arrays of commercial spent fuel storage casks, radiation fields for criticality accident alarm system placement, and detector response for special nuclear material detection scenarios and nuclear well-logging tools. Substantial computational speed-ups, generally O(10²-10⁴), have been realized for all applications to date. This paper provides a brief review of the methods, their implementation, results of their application, and current development activities, as well as a considerable list of references for readers seeking more information about the methods and/or their applications.

  6. MO-F-CAMPUS-J-01: Effect of Iodine Contrast Agent Concentration On Cerebrovascular Dose for Synchrotron Radiation Microangiography Based On a Simple Mouse Head Model and a Voxel Mouse Head Phantom

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, H; Jing, J; Xie, C

    Purpose: To find effective methods to mitigate radiation injury in synchrotron radiation microangiography (SRA) by Monte Carlo simulation. Methods: A one-dimensional mouse head model and a segmented voxel mouse head phantom were simulated with the EGSnrc/DOSXYZnrc code to investigate the dose enhancement effect of the iodine contrast agent irradiated by a monochromatic synchrotron radiation (SR) source. The influence of iodine concentration (IC), vessel width and depth, the presence or absence of skull layer protection, and the incident x-ray energy were simulated. The dose enhancement effect and the absolute dose based on the segmented voxel mouse head phantom were evaluated. Results: The dose enhancement ratio depends little on the irradiation depth, but strongly on the IC, increasing linearly with IC. The skull layer protection cannot be ignored in SRA; a 700 µm thick skull could decrease the dose by 10%. The incident x-ray energy can significantly affect the dose: compared to the dose at 33.2 keV for 50 mgI/ml, the dose at 32.7 keV decreases by 38%, whereas the dose at 33.7 keV increases by 69.2%, and the variation strengthens further with increasing IC. The segmented voxel mouse head phantom also showed that the average dose enhancement effect and the maximal voxel dose per photon depend little on the iodine voxel volume ratio, but strongly on the IC. Conclusion: To decrease dose damage in SRA, the high-Z contrast agent should be used as sparingly as possible, and local irradiation of the injection site immediately after contrast agent injection should be avoided. Fragile vessels containing iodine should not be irradiated at close range. Avoiding irradiation through regions with no or thin skull, or appending thin equivalent material from outside for protection, is also helpful. As long as SRA image quality is ensured, the incident x-ray energy should be kept as low as possible.

  7. FASH and MASH: female and male adult human phantoms based on polygon mesh surfaces: II. Dosimetric calculations

    NASA Astrophysics Data System (ADS)

    Kramer, R.; Cassola, V. F.; Khoury, H. J.; Vieira, J. W.; de Melo Lima, V. J.; Robson Brown, K.

    2010-01-01

    Female and male adult human phantoms, called FASH (Female Adult meSH) and MASH (Male Adult meSH), have been developed in the first part of this study using 3D animation software and anatomical atlases to replace the image-based FAX06 and the MAX06 voxel phantoms. 3D modelling methods allow for phantom development independent from medical images of patients, volunteers or cadavers. The second part of this study investigates the dosimetric implications for organ and tissue equivalent doses due to the anatomical differences between the new and the old phantoms. These differences are mainly caused by the supine position of human bodies during scanning in order to acquire digital images for voxel phantom development. Compared to an upright standing person, in image-based voxel phantoms organs are often coronally shifted towards the head and sometimes the sagittal diameter of the trunk is reduced by a gravitational change of the fat distribution. In addition, volumes of adipose and muscle tissue shielding internal organs are sometimes too small, because adaptation of organ volumes to ICRP-based organ masses often occurs at the expense of general soft tissues, such as adipose, muscle or unspecified soft tissue. These effects have dosimetric consequences, especially for partial body exposure, such as in x-ray diagnosis, but also for whole body external exposure and for internal exposure. Using the EGSnrc Monte Carlo code, internal and external exposure to photons and electrons has been simulated with both pairs of phantoms. The results show differences between organ and tissue equivalent doses for the upright standing FASH/MASH and the image-based supine FAX06/MAX06 phantoms of up to 80% for external exposure and up to 100% for internal exposure. Similar differences were found for external exposure between FASH/MASH and REGINA/REX, the reference voxel phantoms of the International Commission on Radiological Protection. Comparison of effective doses for external photon exposure showed good agreement between FASH/MASH and REGINA/REX, but large differences between FASH/MASH and the mesh-based RPI_AM and the RPI_AF phantoms, developed at the Rensselaer Polytechnic Institute (RPI).

  8. SU-C-201-07: Towards Clinical Cherenkov Emission Dosimetry: Stopping Power-To-Cherenkov Power Ratios and Beam Quality Specification of Clinical Electron Beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zlateva, Y; Seuntjens, J; El Naqa, I

    Purpose: We propose a Cherenkov emission (CE)-based reference dosimetry method, which in contrast to ionization chamber-based dosimetry employs spectrum-averaged electron restricted mass collision stopping power-to-Cherenkov power ratios (SCRs), and we examine Monte Carlo-calculated SCRs and beam quality specification of clinical electron beams. Methods: The EGSnrc user code SPRRZnrc was modified to compute SCRs instead of stopping-power ratios (single medium: water; cut-off: CE threshold (observing Spencer-Attix conditions); CE power: Frank-Tamm). SCRs are calculated with BEAMnrc for realistic electron beams with nominal energies of 6–22 MeV from three Varian accelerators (TrueBeam, Clinac 21EX, Clinac 2100C/D) and for mono-energetic beams of energies equal to the mean electron energy at the water surface. Sources of deviation between clinical and mono-energetic SCRs are analyzed quantitatively. A universal fit for the beam-quality index R₅₀ in terms of the depth of 50% CE, C₅₀, is carried out. Results: SCRs at the reference depth are overestimated by mono-energetic values by up to 0.2% for a 6-MeV beam and underestimated by up to 2.3% for a 22-MeV beam. The variation is mainly due to the clinical beam spectrum and photon contamination. Beam angular spread has a small effect across all depths and energies. The influence of the electron spectrum becomes increasingly significant at large depths, while at shallow depths and high beam energies photon contamination is predominant (up to 2.0%). The universal data fit reveals a strong linear correlation between R₅₀ and C₅₀ (ρ > 0.99999). Conclusion: CE is inherent to radiotherapy beams and can be detected outside the beam with available optical technologies, which makes it an ideal candidate for out-of-beam high-resolution 3D dosimetry. Successful clinical implementation of CE dosimetry hinges on the development of robust protocols for converting measured CE to radiation dose. Our findings constitute a key step towards clinical CE dosimetry.

  9. SU-F-T-577: Comparison of Small Field Dosimetry Measurements in Fields Shaped with Conical Applicators On Two Different Accelerating Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Muir, B; McEwen, M; Belec, J

    2016-06-15

    Purpose: To investigate small field dosimetry measurements and associated uncertainties when conical applicators are used to shape treatment fields from two different accelerating systems. Methods: Output factor measurements are made in water in beams from the CyberKnife radiosurgery system, which uses conical applicators to shape fields from a (flattening filter-free) 6 MV beam, and in a 6 MV beam from the Elekta Precise linear accelerator (with flattening filter) with BrainLab external conical applicators fitted to shape the field. The measurements use various detectors: (i) an Exradin A16 ion chamber, (ii) two Exradin W1 plastic scintillation detectors, (iii) a Sun Nuclear Edge diode, and (iv) two PTW microDiamond synthetic diamond detectors. Profiles are used for accurate detector positioning and to specify field size (FWHM). Output factor measurements are corrected with detector-specific correction factors taken from the literature where available and/or from Monte Carlo simulations using the EGSnrc code system. Results: Differences in measurements of up to 1.7% are observed with a given detector type in the same beam (i.e., intra-detector variability). Corrected results from different detectors in the same beam (inter-detector differences) show deviations up to 3%. Combining data for all detectors and comparing results from the two accelerators results in a 5.9% maximum difference for the smallest field sizes (FWHM = 5.2–5.6 mm), well outside the combined uncertainties (∼1% for the smallest beams) and/or differences among detectors. This suggests that the FWHM of a measured profile is not a good specifier to compare results from different small fields with the same nominal energy. Conclusion: Large differences in results for both intra-detector variability and inter-detector differences suggest potentially high uncertainties in detector-specific correction factors. Differences between the results measured in circular fields from different accelerating systems provide insight into sources of variability in small field dosimetric measurements reported in the literature.

  10. Intercomparison of Monte Carlo radiation transport codes to model TEPC response in low-energy neutron and gamma-ray fields.

    PubMed

    Ali, F; Waker, A J; Waller, E J

    2014-10-01

    Tissue-equivalent proportional counters (TEPC) can potentially be used as a portable and personal dosemeter in mixed neutron and gamma-ray fields, but what hinders this use is their typically large physical size. To formulate compact TEPC designs, the use of a Monte Carlo transport code is necessary to predict the performance of compact designs in these fields. To perform this modelling, three candidate codes were assessed: MCNPX 2.7.E, FLUKA 2011.2 and PHITS 2.24. In each code, benchmark simulations were performed involving the irradiation of a 5-in. TEPC with monoenergetic neutron fields and a 4-in. wall-less TEPC with monoenergetic gamma-ray fields. The frequency and dose mean lineal energies and dose distributions calculated from each code were compared with experimentally determined data. For the neutron benchmark simulations, PHITS produces data closest to the experimental values and for the gamma-ray benchmark simulations, FLUKA yields data closest to the experimentally determined quantities. © The Author 2013. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  11. Optimization of the Monte Carlo code for modeling of photon migration in tissue.

    PubMed

    Zołek, Norbert S; Liebert, Adam; Maniewski, Roman

    2006-10-01

    The Monte Carlo method is frequently used to simulate light transport in turbid media because of its simplicity and flexibility, allowing analysis of complicated geometrical structures. Monte Carlo simulations are, however, time-consuming because of the necessity to track the paths of individual photons. The time-consuming computation is mainly associated with the calculation of the logarithmic and trigonometric functions as well as the generation of pseudo-random numbers. In this paper, the Monte Carlo algorithm was developed and optimized by approximating the logarithmic and trigonometric functions. The approximations were based on polynomial and rational functions, and the errors of these approximations are less than 1% of the values of the original functions. The proposed algorithm was verified by simulations of the time-resolved reflectance at several source-detector separations. The results of the calculation using the approximated algorithm were compared with those of the Monte Carlo simulations obtained with an exact computation of the logarithmic and trigonometric functions as well as with the solution of the diffusion equation. The errors of the moments of the simulated distributions of times of flight of photons (total number of photons, mean time of flight and variance) are less than 2% for a range of optical properties typical of living tissues. The proposed approximated algorithm speeds up the Monte Carlo simulations by a factor of 4. The developed code can be used on parallel machines, allowing for further acceleration.
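
    The flavour of the optimization can be conveyed with a small numerical experiment: pre-fit a polynomial to one of the transcendental functions used in the photon-stepping loop and check its accuracy over the sampled range. The degree-7 cosine fit below is only an illustration; the paper's polynomial and rational approximations (and their error targets) are different.

        # Pre-fit a polynomial to the cosine over the sampled angular range and
        # check its worst-case error; an illustration of the idea, not the
        # paper's actual approximations.
        import numpy as np

        grid = np.linspace(0.0, 2.0 * np.pi, 2001)
        approx_cos = np.poly1d(np.polyfit(grid, np.cos(grid), 7))   # one-off pre-computation

        phi = np.random.uniform(0.0, 2.0 * np.pi, 1_000_000)        # azimuthal scattering angles
        print("max abs error:", np.max(np.abs(approx_cos(phi) - np.cos(phi))))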

  12. Verification of unfold error estimates in the UFO code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fehl, D.L.; Biggs, F.

    Spectral unfolding is an inverse mathematical operation which attempts to obtain spectral source information from a set of tabulated response functions and data measurements. Several unfold algorithms have appeared over the past 30 years; among them is the UFO (UnFold Operator) code. In addition to an unfolded spectrum, UFO also estimates the unfold uncertainty (error); this estimate was checked by running the code in a Monte Carlo fashion with prescribed data distributions (Gaussian deviates). In the problem studied, data were simulated from an arbitrarily chosen blackbody spectrum (10 keV) and a set of overlapping response functions. The data were assumed to have an imprecision of 5% (standard deviation). 100 random data sets were generated. The built-in estimate of unfold uncertainty agreed with the Monte Carlo estimate to within the statistical resolution of this relatively small sample size (95% confidence level). A possible 10% bias between the two methods was unresolved. The Monte Carlo technique is also useful in underdetermined problems, for which the error matrix method does not apply. UFO has been applied to the diagnosis of low energy x rays emitted by Z-Pinch and ion-beam driven hohlraums.
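
    The Monte Carlo error estimate described above can be sketched in a few lines: perturb the data with 5% Gaussian deviates, unfold each realization, and take the spread of the solutions as the uncertainty. In the sketch the response matrix and test spectrum are invented, and a plain least-squares inversion stands in for the UFO unfold algorithm.

        # Monte Carlo unfold-uncertainty sketch: invented response matrix and
        # test spectrum; least-squares inversion stands in for UFO.
        import numpy as np

        rng = np.random.default_rng(0)
        n_chan, n_bins = 12, 8
        response = rng.uniform(0.0, 1.0, (n_chan, n_bins))       # tabulated response functions
        true_spectrum = np.exp(-np.arange(n_bins) / 3.0)         # arbitrary test spectrum
        data0 = response @ true_spectrum

        unfolds = []
        for _ in range(100):                                     # 100 random data sets, as in the study
            data = data0 * (1.0 + 0.05 * rng.standard_normal(n_chan))   # 5% imprecision
            solution, *_ = np.linalg.lstsq(response, data, rcond=None)
            unfolds.append(solution)

        unfolds = np.array(unfolds)
        print("unfolded mean :", unfolds.mean(axis=0).round(3))
        print("MC uncertainty:", unfolds.std(axis=0).round(3))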

  13. Monte Carlo track structure for radiation biology and space applications

    NASA Technical Reports Server (NTRS)

    Nikjoo, H.; Uehara, S.; Khvostunov, I. G.; Cucinotta, F. A.; Wilson, W. E.; Goodhead, D. T.

    2001-01-01

    Over the past two decades, event-by-event Monte Carlo track structure codes have increasingly been used for biophysical modelling and radiotherapy. The advent of these codes has helped to shed light on many aspects of microdosimetry and the mechanism of damage by ionising radiation in the cell. These codes have continuously been modified to include new improved cross sections and computational techniques. This paper provides a summary of input data for ionizations, excitations and elastic scattering cross sections for event-by-event Monte Carlo track structure simulations for electrons and ions in the form of parametric equations, which makes it easy to reproduce the data. Stopping power and radial distribution of dose are presented for ions and compared with experimental data. A model is described for simulation of the full slowing down of proton tracks in water in the range 1 keV to 1 MeV. Modelling and calculations are presented for the response of a TEPC proportional counter irradiated with 5 MeV alpha-particles. Distributions are presented for the wall and wall-less counters. The data show the contribution of indirect effects to the lineal energy distribution for the wall counter responses even at such a low ion energy.

  14. Validation of a Monte Carlo code system for grid evaluation with interference effect on Rayleigh scattering

    NASA Astrophysics Data System (ADS)

    Zhou, Abel; White, Graeme L.; Davidson, Rob

    2018-02-01

    Anti-scatter grids are commonly used in x-ray imaging systems to reduce the scatter radiation reaching the image receptor. Anti-scatter grid performance can be simulated and validated through the use of Monte Carlo (MC) methods. Our recently reported work has modified existing MC codes, resulting in improved performance when simulating x-ray imaging. The aim of this work is to validate the transmission of x-ray photons in grids from the recently reported new MC codes against experimental results and results previously reported in other literature. The results of this work show that the scatter-to-primary ratio (SPR) and the transmissions of primary (Tp), scatter (Ts), and total (Tt) radiation determined using this new MC code system agree strongly with the experimental results and the results reported in the literature. Tp, Ts, Tt, and SPR determined in this new MC simulation code system are valid. These results also show that the interference effect on Rayleigh scattering should not be neglected in the evaluation of either mammographic or general grids. Our new MC simulation code system has been shown to be valid and can be used for analysing and evaluating grid designs.

  15. Monte Carlo calculations of initial energies of electrons in water irradiated by photons with energies up to 1GeV.

    PubMed

    Todo, A S; Hiromoto, G; Turner, J E; Hamm, R N; Wright, H A

    1982-12-01

    Previous calculations of the initial energies of electrons produced in water irradiated by photons are extended to 1 GeV by including pair and triplet production. Calculations were performed with the Monte Carlo computer code PHOEL-3, which replaces the earlier code, PHOEL-2. Tables of initial electron energies are presented for single interactions of monoenergetic photons at a number of energies from 10 keV to 1 GeV. These tables can be used to compute kerma in water irradiated by photons with arbitrary energy spectra up to 1 GeV. In addition, separate tables of Compton- and pair-electron spectra are given over this energy range. The code PHOEL-3 is available from the Radiation Shielding Information Center, Oak Ridge National Laboratory, Oak Ridge, TN 37830.
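
    The kerma computation alluded to above uses the standard (textbook) relation between the photon fluence spectrum and the mass energy-transfer coefficient,

        K = \int \Phi_E(E)\, E\, \frac{\mu_{\mathrm{tr}}(E)}{\rho}\, \mathrm{d}E

    where the tabulated initial-electron-energy spectra effectively provide the mean energy transferred to electrons per interaction; this definition is shown for context and is not quoted from the paper.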

  16. A 3DHZETRN Code in a Spherical Uniform Sphere with Monte Carlo Verification

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Slaba, Tony C.; Badavi, Francis F.; Reddell, Brandon D.; Bahadori, Amir A.

    2014-01-01

    The computationally efficient HZETRN code has been used in recent trade studies for lunar and Martian exploration and is currently being used in the engineering development of the next generation of space vehicles, habitats, and extravehicular activity equipment. A new version (3DHZETRN), capable of transporting high charge (Z) and energy (HZE) ions and light ions (including neutrons) under space-like boundary conditions with enhanced neutron and light ion propagation, is under development. In the present report, new algorithms for light ion and neutron propagation with well-defined convergence criteria in 3D objects are developed and tested against Monte Carlo simulations to verify the solution methodology. The code will be available through the software system OLTARIS for shield design and validation and provides a basis for personal computer software capable of space shield analysis and optimization.

  17. Kinetic Monte Carlo simulation of dopant-defect systems under submicrosecond laser thermal processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fisicaro, G.; Pelaz, Lourdes; Lopez, P.

    2012-11-06

    An innovative Kinetic Monte Carlo (KMC) code has been developed, which governs the post-implant kinetics of the defect system in the extremely far-from-equilibrium conditions caused by laser irradiation close to the liquid-solid interface. It considers defect diffusion, annihilation and clustering. The code properly implements, consistently with the stochastic formalism, the fast-varying local event rates related to the evolution of the thermal field T(r,t). This feature of our numerical method represents an important advancement with respect to current state-of-the-art KMC codes. The reduction of the implantation damage and its reorganization into defect aggregates are studied as a function of the process conditions. Phosphorus activation efficiency, experimentally determined in similar conditions, has been related to the emerging damage scenario.
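
    A bare-bones kinetic Monte Carlo step with temperature-driven, time-dependent rates is sketched below to make the kind of update concrete. The event set, activation energies, prefactor and temperature pulse are invented, and the naive re-evaluation of rates at each event is much cruder than the consistent treatment described above.

        # Bare-bones KMC with time-dependent rates; all physical parameters are
        # invented and the rate handling is deliberately naive.
        import math, random

        KB = 8.617e-5                                    # Boltzmann constant (eV/K)

        def temperature(t):                              # hypothetical sub-microsecond laser pulse
            return 300.0 + 1200.0 * math.exp(-((t - 5e-8) / 2e-8) ** 2)

        def rates(t):
            kT = KB * temperature(t)
            return {"diffuse": 1e12 * math.exp(-0.25 / kT),
                    "annihilate": 1e12 * math.exp(-0.45 / kT)}

        t, counts = 0.0, {"diffuse": 0, "annihilate": 0}
        while t < 2e-7:                                  # 200 ns of simulated time
            current = rates(t)
            total = sum(current.values())
            t += -math.log(1.0 - random.random()) / total   # waiting time to the next event
            pick, running = random.random() * total, 0.0
            for name, rate in current.items():
                running += rate
                if pick <= running:
                    counts[name] += 1
                    break

        print(counts)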

  18. DNA strand breaks induced by electrons simulated with Nanodosimetry Monte Carlo Simulation Code: NASIC.

    PubMed

    Li, Junli; Li, Chunyan; Qiu, Rui; Yan, Congchong; Xie, Wenzhang; Wu, Zhen; Zeng, Zhi; Tung, Chuanjong

    2015-09-01

    Monte Carlo simulation is a powerful tool to investigate the details of radiation biological damage at the molecular level. In this paper, a Monte Carlo code called NASIC (Nanodosimetry Monte Carlo Simulation Code) was developed. It includes a physical module, a pre-chemical module, a chemical module, a geometric module and a DNA damage module. The physical module can simulate physical tracks of low-energy electrons in liquid water event by event. More than one set of inelastic cross sections was calculated by applying the dielectric function method of Emfietzoglou's optical-data treatments, with different optical data sets and dispersion models. In the pre-chemical module, the ionised and excited water molecules undergo dissociation processes. In the chemical module, the produced radiolytic chemical species diffuse and react. In the geometric module, an atomic model of 46 chromatin fibres in a spherical nucleus of a human lymphocyte was established. In the DNA damage module, the direct damage induced by the energy depositions of the electrons and the indirect damage induced by the radiolytic chemical species were calculated. The parameters should be adjusted so that the simulation results agree with the experimental results. In this paper, the influence of the inelastic cross sections and the vibrational excitation reaction on the parameters and the DNA strand break yields was studied. Further work on NASIC is underway. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  19. Testing of Error-Correcting Sparse Permutation Channel Codes

    NASA Technical Reports Server (NTRS)

    Shcheglov, Kirill, V.; Orlov, Sergei S.

    2008-01-01

    A computer program performs Monte Carlo direct numerical simulations for testing sparse permutation channel codes, which offer strong error-correction capabilities at high code rates and are considered especially suitable for storage of digital data in holographic and volume memories. A word in a code of this type is characterized by, among other things, a sparseness parameter (M) and a fixed number (K) of 1 or "on" bits in a channel block length of N.

  20. Metallic artifact mitigation and organ-constrained tissue assignment for Monte Carlo calculations of permanent implant lung brachytherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sutherland, J. G. H.; Miksys, N.; Thomson, R. M., E-mail: rthomson@physics.carleton.ca

    2014-01-15

    Purpose: To investigate methods of generating accurate patient-specific computational phantoms for the Monte Carlo calculation of lung brachytherapy patient dose distributions. Methods: Four metallic artifact mitigation methods are applied to six lung brachytherapy patient computed tomography (CT) images: simple threshold replacement (STR) identifies high CT values in the vicinity of the seeds and replaces them with estimated true values; fan beam virtual sinogram replaces artifact-affected values in a virtual sinogram and performs a filtered back-projection to generate a corrected image; 3D median filter replaces voxel values that differ from the median value in a region of interest surrounding the voxel and then applies a second filter to reduce noise; and a combination of fan beam virtual sinogram and STR. Computational phantoms are generated from artifact-corrected and uncorrected images using several tissue assignment schemes: both lung-contour constrained and unconstrained global schemes are considered. Voxel mass densities are assigned based on voxel CT number or using the nominal tissue mass densities. Dose distributions are calculated using the EGSnrc user-code BrachyDose for ¹²⁵I, ¹⁰³Pd, and ¹³¹Cs seeds and are compared directly as well as through dose volume histograms and dose metrics for target volumes surrounding surgical sutures. Results: Metallic artifact mitigation techniques vary in their ability to reduce artifacts while preserving tissue detail. Notably, images corrected with the fan beam virtual sinogram have reduced artifacts, but residual artifacts near sources remain, requiring additional use of STR; the 3D median filter removes artifacts but simultaneously removes detail in lung and bone. Doses vary considerably between computational phantoms, with the largest differences arising from artifact-affected voxels assigned to bone in the vicinity of the seeds. Consequently, when metallic artifact reduction and constrained tissue assignment within lung contours are employed in generated phantoms, this erroneous assignment is reduced, generally resulting in higher doses. Lung-constrained tissue assignment also results in increased doses in regions of interest due to a reduction in the erroneous assignment of adipose to voxels within lung contours. Differences in dose metrics calculated for different computational phantoms are sensitive to radionuclide photon spectra, with the largest differences for ¹⁰³Pd seeds and the smallest, but still considerable, differences for ¹³¹Cs seeds. Conclusions: Despite producing differences in CT images, dose metrics calculated using the STR, fan beam + STR, and 3D median filter techniques are similar. Results suggest that the accuracy of dose distributions for permanent implant lung brachytherapy is improved by applying lung-constrained tissue assignment schemes to metallic artifact corrected images.
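
    Of the four mitigation methods, simple threshold replacement is the easiest to make concrete; the sketch below replaces artifact-bright CT numbers within a small neighbourhood of each seed with a nominal soft-tissue value. The threshold, radius and replacement value are assumed for illustration and are not the parameters used in the study.

        # Sketch of simple threshold replacement (STR): CT numbers above a
        # threshold within a small radius of each seed are replaced by an
        # estimated soft-tissue value. All numbers and shapes are assumptions.
        import numpy as np

        def simple_threshold_replacement(ct, seed_voxels, threshold=300.0,
                                         radius=5, replacement=40.0):
            """ct: 3D array of CT numbers (HU); seed_voxels: list of (i, j, k) indices."""
            corrected = ct.copy()
            for i, j, k in seed_voxels:
                window = tuple(slice(max(c - radius, 0), c + radius + 1) for c in (i, j, k))
                region = corrected[window]                   # view into the corrected array
                region[region > threshold] = replacement     # replace artifact-bright voxels
            return corrected

        ct = np.random.normal(-700.0, 50.0, (40, 40, 40))    # toy lung-like background
        ct[20, 20, 18:23] = 3000.0                           # bright streak around a "seed"
        fixed = simple_threshold_replacement(ct, [(20, 20, 20)])
        print(ct.max(), fixed.max())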

  1. Mesh-based Monte Carlo code for fluorescence modeling in complex tissues with irregular boundaries

    NASA Astrophysics Data System (ADS)

    Wilson, Robert H.; Chen, Leng-Chun; Lloyd, William; Kuo, Shiuhyang; Marcelo, Cynthia; Feinberg, Stephen E.; Mycek, Mary-Ann

    2011-07-01

    There is a growing need for the development of computational models that can account for complex tissue morphology in simulations of photon propagation. We describe the development and validation of a user-friendly, MATLAB-based Monte Carlo code that uses analytically-defined surface meshes to model heterogeneous tissue geometry. The code can use information from non-linear optical microscopy images to discriminate the fluorescence photons (from endogenous or exogenous fluorophores) detected from different layers of complex turbid media. We present a specific application of modeling a layered human tissue-engineered construct (Ex Vivo Produced Oral Mucosa Equivalent, EVPOME) designed for use in repair of oral tissue following surgery. Second-harmonic generation microscopic imaging of an EVPOME construct (oral keratinocytes atop a scaffold coated with human type IV collagen) was employed to determine an approximate analytical expression for the complex shape of the interface between the two layers. This expression can then be inserted into the code to correct the simulated fluorescence for the effect of the irregular tissue geometry.

  2. Object-Oriented/Data-Oriented Design of a Direct Simulation Monte Carlo Algorithm

    NASA Technical Reports Server (NTRS)

    Liechty, Derek S.

    2014-01-01

    Over the past decade, there has been much progress towards improved phenomenological modeling and algorithmic updates for the direct simulation Monte Carlo (DSMC) method, which provides a probabilistic physical simulation of gas flows. These improvements have largely been based on the work of the originator of the DSMC method, Graeme Bird. Of primary importance are improved chemistry, internal energy, and physics modeling and a reduction in time to solution. These allow for an expanded range of possible solutions in altitude and velocity space. NASA's current production code, the DSMC Analysis Code (DAC), is well-established and based on Bird's 1994 algorithms written in Fortran 77, and has proven difficult to upgrade. A new DSMC code is being developed in the C++ programming language using object-oriented and data-oriented design paradigms to facilitate the inclusion of the recent improvements and future development activities. The development efforts on the new code, the Multiphysics Algorithm with Particles (MAP), are described, and performance comparisons are made with DAC.

  3. Comparison of EGS4 and MCNP Monte Carlo codes when calculating radiotherapy depth doses.

    PubMed

    Love, P A; Lewis, D G; Al-Affan, I A; Smith, C W

    1998-05-01

    The Monte Carlo codes EGS4 and MCNP have been compared when calculating radiotherapy depth doses in water. The aims of the work were to study (i) the differences between calculated depth doses in water for a range of monoenergetic photon energies and (ii) the relative efficiency of the two codes for different electron transport energy cut-offs. The depth doses from the two codes agree with each other within the statistical uncertainties of the calculations (1-2%). The relative depth doses also agree with data tabulated in the British Journal of Radiology Supplement 25. A discrepancy in the dose build-up region may be attributed to the different electron transport algorithms used by EGS4 and MCNP. This discrepancy is considerably reduced when the improved electron transport routines are used in the latest (4B) version of MCNP. Timing calculations show that EGS4 is at least 50% faster than MCNP for the geometries used in the simulations.

  4. Three-dimensional Monte-Carlo simulation of gamma-ray scattering and production in the atmosphere

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morris, D.J.

    1989-05-15

    Monte Carlo codes have been developed to simulate gamma-ray scattering and production in the atmosphere. The scattering code simulates interactions of low-energy gamma rays (20 to several hundred keV) from an astronomical point source in the atmosphere; a modified code also simulates scattering in a spacecraft. Four incident spectra, typical of gamma-ray bursts, solar flares, and the Crab pulsar, and 511 keV line radiation have been studied. These simulations are consistent with observations of solar flare radiation scattered from the atmosphere. The production code simulates the interactions of cosmic rays which produce high-energy (above 10 MeV) photons and electrons. It has been used to calculate gamma-ray and electron albedo intensities at Palestine, Texas and at the equator; the results agree with observations in most respects. With minor modifications this code can be used to calculate intensities of other high-energy particles. Both codes are fully three-dimensional, incorporating a curved atmosphere; the production code also incorporates the variation with both zenith and azimuth of the incident cosmic-ray intensity due to geomagnetic effects. These effects are clearly reflected in the calculated albedo by intensity contrasts between the horizon and nadir, and between the east and west horizons.

  5. Overview of Recent Radiation Transport Code Comparisons for Space Applications

    NASA Astrophysics Data System (ADS)

    Townsend, Lawrence

    Recent advances in radiation transport code development for space applications have resulted in various comparisons of code predictions for a variety of scenarios and codes. Comparisons among both Monte Carlo and deterministic codes have been made and published by vari-ous groups and collaborations, including comparisons involving, but not limited to HZETRN, HETC-HEDS, FLUKA, GEANT, PHITS, and MCNPX. In this work, an overview of recent code prediction inter-comparisons, including comparisons to available experimental data, is presented and discussed, with emphases on those areas of agreement and disagreement among the various code predictions and published data.

  6. Metis: A Pure Metropolis Markov Chain Monte Carlo Bayesian Inference Library

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bates, Cameron Russell; Mckigney, Edward Allen

    The use of Bayesian inference in data analysis has become the standard for large scientific experiments [1, 2]. The Monte Carlo Codes Group (XCP-3) at Los Alamos has developed a simple set of algorithms, currently implemented in C++ and Python, to easily perform flat-prior Markov Chain Monte Carlo Bayesian inference with pure Metropolis sampling. These implementations are designed to be user friendly and extensible for customization based on specific application requirements. This document describes the algorithmic choices made and presents two use cases.
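
    The flat-prior pure Metropolis idea is compact enough to sketch directly; the Python below is a minimal illustration only (the Gaussian likelihood and proposal width are assumptions, and this is not the Metis API).

    # Minimal flat-prior (uniform-prior) pure Metropolis sampler.
    # The Gaussian log-likelihood and proposal width are illustrative only.
    import numpy as np

    def log_likelihood(theta, data, sigma=1.0):
        return -0.5 * np.sum((data - theta) ** 2) / sigma**2

    def metropolis(data, n_steps=5000, step=0.5, theta0=0.0, seed=0):
        rng = np.random.default_rng(seed)
        chain = np.empty(n_steps)
        theta, logl = theta0, log_likelihood(theta0, data)
        for i in range(n_steps):
            proposal = theta + step * rng.normal()
            logl_new = log_likelihood(proposal, data)
            # Flat prior: acceptance depends only on the likelihood ratio.
            if np.log(rng.random()) < logl_new - logl:
                theta, logl = proposal, logl_new
            chain[i] = theta
        return chain

    data = np.random.default_rng(1).normal(3.0, 1.0, size=50)
    chain = metropolis(data)
    print(chain[1000:].mean())   # posterior mean ~ sample mean of the data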

  7. Monte Carlo simulation of Ising models by multispin coding on a vector computer

    NASA Astrophysics Data System (ADS)

    Wansleben, Stephan; Zabolitzky, John G.; Kalle, Claus

    1984-11-01

    Rebbi's efficient multispin coding algorithm for Ising models is combined with the use of the vector computer CDC Cyber 205. A speed of 21.2 million updates per second is reached. This is comparable to that obtained by special-purpose computers.

  8. Radiation Transport Tools for Space Applications: A Review

    NASA Technical Reports Server (NTRS)

    Jun, Insoo; Evans, Robin; Cherng, Michael; Kang, Shawn

    2008-01-01

    This slide presentation contains a brief discussion of nuclear transport codes widely used in the space radiation community for shielding and scientific analyses. Seven radiation transport codes are addressed. The two general methods (i.e., the Monte Carlo method and the deterministic method) are briefly reviewed.

  9. Use of single scatter electron monte carlo transport for medical radiation sciences

    DOEpatents

    Svatos, Michelle M.

    2001-01-01

    The single scatter Monte Carlo code CREEP models precise microscopic interactions of electrons with matter to enhance physical understanding of radiation sciences. It is designed to simulate electrons in any medium, including materials important for biological studies. It simulates each interaction individually by sampling from a library which contains accurate information over a broad range of energies.

  10. Stochastic Analysis of Orbital Lifetimes of Spacecraft

    NASA Technical Reports Server (NTRS)

    Sasamoto, Washito; Goodliff, Kandyce; Cornelius, David

    2008-01-01

    A document discusses (1) a Monte-Carlo-based methodology for probabilistic prediction and analysis of orbital lifetimes of spacecraft and (2) Orbital Lifetime Monte Carlo (OLMC)--a Fortran computer program, consisting of a previously developed long-term orbit-propagator integrated with a Monte Carlo engine. OLMC enables modeling of variances of key physical parameters that affect orbital lifetimes through the use of probability distributions. These parameters include altitude, speed, and flight-path angle at insertion into orbit; solar flux; and launch delays. The products of OLMC are predicted lifetimes (durations above specified minimum altitudes) for the number of user-specified cases. Histograms generated from such predictions can be used to determine the probabilities that spacecraft will satisfy lifetime requirements. The document discusses uncertainties that affect modeling of orbital lifetimes. Issues of repeatability, smoothness of distributions, and code run time are considered for the purpose of establishing values of code-specific parameters and number of Monte Carlo runs. Results from test cases are interpreted as demonstrating that solar-flux predictions are primary sources of variations in predicted lifetimes. Therefore, it is concluded, multiple sets of predictions should be utilized to fully characterize the lifetime range of a spacecraft.

  11. Accelerating Monte Carlo simulations of photon transport in a voxelized geometry using a massively parallel graphics processing unit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Badal, Andreu; Badano, Aldo

    Purpose: It is a known fact that Monte Carlo simulations of radiation transport are computationally intensive and may require long computing times. The authors introduce a new paradigm for the acceleration of Monte Carlo simulations: The use of a graphics processing unit (GPU) as the main computing device instead of a central processing unit (CPU). Methods: A GPU-based Monte Carlo code that simulates photon transport in a voxelized geometry with the accurate physics models from PENELOPE has been developed using the CUDA programming model (NVIDIA Corporation, Santa Clara, CA). Results: An outline of the new code and a sample x-ray imaging simulation with an anthropomorphic phantom are presented. A remarkable 27-fold speed up factor was obtained using a GPU compared to a single core CPU. Conclusions: The reported results show that GPUs are currently a good alternative to CPUs for the simulation of radiation transport. Since the performance of GPUs is currently increasing at a faster pace than that of CPUs, the advantages of GPU-based software are likely to be more pronounced in the future.

  12. Monte Carlo method for calculating the radiation skyshine produced by electron accelerators

    NASA Astrophysics Data System (ADS)

    Kong, Chaocheng; Li, Quanfeng; Chen, Huaibi; Du, Taibin; Cheng, Cheng; Tang, Chuanxiang; Zhu, Li; Zhang, Hui; Pei, Zhigang; Ming, Shenjin

    2005-06-01

    Using the MCNP4C Monte Carlo code, the X-ray skyshine produced by 9 MeV, 15 MeV and 21 MeV electron linear accelerators was calculated with a new two-step method combined with the split and roulette variance reduction technique. Results of the Monte Carlo simulation, the empirical formulas used for skyshine calculation and the dose measurements were analyzed and compared. In conclusion, the skyshine dose measurements agreed reasonably with the results computed by the Monte Carlo method, but deviated from computational results given by the empirical formulas. The effect on the skyshine dose caused by different structures of the accelerator head is also discussed in this paper.

  13. Use of the ETA-1 reactor for the validation of the multi-group APOLLO2-MORET 5 code and the Monte Carlo continuous energy MORET 5 code

    NASA Astrophysics Data System (ADS)

    Leclaire, N.; Cochet, B.; Le Dauphin, F. X.; Haeck, W.; Jacquet, O.

    2014-06-01

    The present paper aims at providing experimental validation for the use of the MORET 5 code for advanced reactor concepts involving thorium and heavy water. It therefore constitutes an opportunity to test and improve the thermal-scattering data of heavy water and also to test the recent implementation of probability tables in the MORET 5 code.

  14. A Monte Carlo Code for Relativistic Radiation Transport Around Kerr Black Holes

    NASA Technical Reports Server (NTRS)

    Schnittman, Jeremy David; Krolik, Julian H.

    2013-01-01

    We present a new code for radiation transport around Kerr black holes, including arbitrary emission and absorption mechanisms, as well as electron scattering and polarization. The code is particularly useful for analyzing accretion flows made up of optically thick disks and optically thin coronae. We give a detailed description of the methods employed in the code and also present results from a number of numerical tests to assess its accuracy and convergence.

  15. Performance analysis of a parallel Monte Carlo code for simulating solar radiative transfer in cloudy atmospheres using CUDA-enabled NVIDIA GPU

    NASA Astrophysics Data System (ADS)

    Russkova, Tatiana V.

    2017-11-01

    One tool to improve the performance of Monte Carlo methods for numerical simulation of light transport in the Earth's atmosphere is parallel technology. A new algorithm oriented to parallel execution on the CUDA-enabled NVIDIA graphics processor is discussed. The efficiency of parallelization is analyzed on the basis of calculating the upward and downward fluxes of solar radiation in both vertically homogeneous and inhomogeneous models of the atmosphere. The results of testing the new code under various atmospheric conditions, including continuous single-layered and multilayered clouds and selective molecular absorption, are presented. The results of testing the code using video cards with different compute capability are analyzed. It is shown that the changeover of computing from conventional PCs to the architecture of graphics processors gives more than a hundredfold increase in performance and fully reveals the capabilities of the technology used.

  16. Monte Carlo simulations in Nuclear Medicine

    NASA Astrophysics Data System (ADS)

    Loudos, George K.

    2007-11-01

    Molecular imaging technologies provide unique abilities to localise signs of disease before symptoms appear, assist in drug testing, optimize and personalize therapy, and assess the efficacy of treatment regimes for different types of cancer. Monte Carlo simulation packages are used as an important tool for the optimal design of detector systems. In addition, they have demonstrated potential to improve image quality and acquisition protocols. Many general-purpose (MCNP, Geant4, etc.) or dedicated codes (SimSET, etc.) have been developed aiming to provide accurate and fast results. Special emphasis will be given to the GATE toolkit. The GATE code, currently under development by the OpenGATE collaboration, is the most accurate and promising code for performing realistic simulations. The purpose of this article is to introduce the non-expert reader to the current status of MC simulations in nuclear medicine, briefly provide examples of currently simulated systems, and present future challenges that include the simulation of clinical studies and dosimetry applications.

  17. Molecular dynamics and dynamic Monte-Carlo simulation of irradiation damage with focused ion beams

    NASA Astrophysics Data System (ADS)

    Ohya, Kaoru

    2017-03-01

    The focused ion beam (FIB) has become an important tool for micro- and nanostructuring of samples such as milling, deposition and imaging. However, this leads to damage of the surface on the nanometer scale from implanted projectile ions and recoiled material atoms. It is therefore important to investigate each kind of damage quantitatively. We present a dynamic Monte-Carlo (MC) simulation code to simulate the morphological and compositional changes of a multilayered sample under ion irradiation and a molecular dynamics (MD) simulation code to simulate dose-dependent changes in the backscattering-ion (BSI)/secondary-electron (SE) yields of a crystalline sample. Recent progress in the codes for research to simulate the surface morphology and Mo/Si layers intermixing in an EUV lithography mask irradiated with FIBs, and the crystalline orientation effect on BSI and SE yields relating to the channeling contrast in scanning ion microscopes, is also presented.

  18. Implementation and verification of nuclear interactions in a Monte-Carlo code for the Procom-ProGam proton therapy planning system

    NASA Astrophysics Data System (ADS)

    Kostyuchenko, V. I.; Makarova, A. S.; Ryazantsev, O. B.; Samarin, S. I.; Uglov, A. S.

    2014-06-01

    A great breakthrough in proton therapy has happened in the new century: several tens of dedicated centers are now operating throughout the world and their number increases every year. An important component of proton therapy is a treatment planning system. To make calculations faster, these systems usually use analytical methods whose reliability and accuracy do not allow the advantages of this treatment method to be exploited to the full extent. Predictions by the Monte Carlo (MC) method are a "gold" standard for the verification of calculations with these systems. At the Institute of Experimental and Theoretical Physics (ITEP), which is one of the oldest proton therapy centers in the world, an MC code is an integral part of the treatment planning system. This code, which is called IThMC, was developed by scientists from RFNC-VNIITF (Snezhinsk) under ISTC Project 3563.

  19. Parallelization of KENO-Va Monte Carlo code

    NASA Astrophysics Data System (ADS)

    Ramón, Javier; Peña, Jorge

    1995-07-01

    KENO-Va is a code integrated within the SCALE system developed by Oak Ridge that solves the transport equation through the Monte Carlo method. It is being used at the Consejo de Seguridad Nuclear (CSN) to perform criticality calculations for fuel storage pools and shipping casks. Two parallel versions of the code have been generated: one for shared-memory machines and another for distributed-memory systems using the message-passing interface PVM. In both versions the neutrons of each generation are tracked in parallel. In order to preserve the reproducibility of the results in both versions, advanced seeds for random numbers were used. The CONVEX C3440 with four processors and shared memory at CSN was used to implement the shared-memory version. An FDDI network of six HP9000/735 workstations was employed to implement the message-passing version using proprietary PVM. The speedup obtained was 3.6 in both cases.
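
    The reproducibility issue mentioned above (results must not depend on how histories are split among processors) is commonly solved by giving every history its own deterministically derived random stream. A minimal sketch of that idea follows, using NumPy's counter-based Philox generator as a stand-in for KENO-Va's advanced-seed scheme.

    # Illustration of reproducible parallel random streams: each neutron history
    # gets its own deterministic stream regardless of which worker tracks it.
    # NumPy's Philox generator stands in for KENO-Va's "advanced seed" scheme.
    import numpy as np

    def history_stream(base_seed, history_index):
        """Independent, reproducible stream for one neutron history."""
        return np.random.Generator(np.random.Philox(key=base_seed).jumped(history_index))

    # Two "workers" tracking the same history index draw identical numbers,
    # so results do not depend on how histories are distributed among processors.
    a = history_stream(12345, history_index=7).random(3)
    b = history_stream(12345, history_index=7).random(3)
    assert np.allclose(a, b)
    print(a)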

  20. Portable multi-node LQCD Monte Carlo simulations using OpenACC

    NASA Astrophysics Data System (ADS)

    Bonati, Claudio; Calore, Enrico; D'Elia, Massimo; Mesiti, Michele; Negro, Francesco; Sanfilippo, Francesco; Schifano, Sebastiano Fabio; Silvi, Giorgio; Tripiccione, Raffaele

    This paper describes a state-of-the-art parallel Lattice QCD Monte Carlo code for staggered fermions, purposely designed to be portable across different computer architectures, including GPUs and commodity CPUs. Portability is achieved using the OpenACC parallel programming model, used to develop a code that can be compiled for several processor architectures. The paper focuses on parallelization across multiple computing nodes, using OpenACC to manage parallelism within a node and OpenMPI to manage parallelism among nodes. We first discuss the available strategies for maximizing performance, then describe selected relevant details of the code, and finally measure the level of performance and scaling performance that we are able to achieve. The work focuses mainly on GPUs, which offer a significantly higher level of performance for this application, but also compares with results measured on other processors.

  1. Experimental measurements with Monte Carlo corrections and theoretical calculations of neutron inelastic scattering cross section of 115In

    NASA Astrophysics Data System (ADS)

    Wang, Chao; Xiao, Jun; Luo, Xiaobing

    2016-10-01

    The neutron inelastic scattering cross section of 115In has been measured by the activation technique at neutron energies of 2.95, 3.94, and 5.24 MeV with the neutron capture cross sections of 197Au as an internal standard. The effects of multiple scattering and flux attenuation were corrected using the Monte Carlo code GEANT4. Based on the experimental values, the 115In neutron inelastic scattering cross section data were theoretically calculated between 1 and 15 MeV with the TALYS software code; the theoretical results of this study are in reasonable agreement with the available experimental results.

  2. Track-structure simulations for charged particles.

    PubMed

    Dingfelder, Michael

    2012-11-01

    Monte Carlo track-structure simulations provide a detailed and accurate picture of radiation transport of charged particles through condensed matter of biological interest. Liquid water serves as a surrogate for soft tissue and is used in most Monte Carlo track-structure codes. Basic theories of radiation transport and track-structure simulations are discussed and differences compared to condensed history codes highlighted. Interaction cross sections for electrons, protons, alpha particles, and light and heavy ions are required input data for track-structure simulations. Different calculation methods, including the plane-wave Born approximation, the dielectric theory, and semi-empirical approaches are presented using liquid water as a target. Low-energy electron transport and light ion transport are discussed as areas of special interest.

  3. Coupled reactors analysis: New needs and advances using Monte Carlo methodology

    DOE PAGES

    Aufiero, M.; Palmiotti, G.; Salvatores, M.; ...

    2016-08-20

    Coupled reactors and the coupling features of large or heterogeneous core reactors can be investigated with the Avery theory, which allows a physics understanding of the main features of these systems. However, the complex geometries that are often encountered in association with coupled reactors require a detailed geometry description that can be easily provided by modern Monte Carlo (MC) codes. This implies a MC calculation of the coupling parameters defined by Avery and of the sensitivity coefficients that allow further detailed physics analysis. The results presented in this paper show that the MC code SERPENT has been successfully modified to meet the required capabilities.

  4. PBMC: Pre-conditioned Backward Monte Carlo code for radiative transport in planetary atmospheres

    NASA Astrophysics Data System (ADS)

    García Muñoz, A.; Mills, F. P.

    2017-08-01

    PBMC (Pre-Conditioned Backward Monte Carlo) solves the vector Radiative Transport Equation (vRTE) and can be applied to planetary atmospheres irradiated from above. The code builds the solution by simulating the photon trajectories from the detector towards the radiation source, i.e. in the reverse order of the actual photon displacements. In accounting for the polarization in the sampling of photon propagation directions and pre-conditioning the scattering matrix with information from the scattering matrices of prior (in the BMC integration order) photon collisions, PBMC avoids the unstable and biased solutions of classical BMC algorithms for conservative, optically-thick, strongly-polarizing media such as Rayleigh atmospheres.

  5. Organ and effective doses in newborn patients during helical multislice computed tomography examination

    NASA Astrophysics Data System (ADS)

    Staton, Robert J.; Lee, Choonik; Lee, Choonsik; Williams, Matt D.; Hintenlang, David E.; Arreola, Manuel M.; Williams, Jonathon L.; Bolch, Wesley E.

    2006-10-01

    In this study, two computational phantoms of the newborn patient were used to assess individual organ doses and effective doses delivered during head, chest, abdomen, pelvis, and torso examinations using the Siemens SOMATOM Sensation 16 helical multi-slice computed tomography (MSCT) scanner. The stylized phantom used to model the patient anatomy was the revised ORNL newborn phantom by Han et al (2006 Health Phys.90 337). The tomographic phantom used in the study was that developed by Nipper et al (2002 Phys. Med. Biol. 47 3143) as recently revised by Staton et al (2006 Med. Phys. 33 3283). The stylized model was implemented within the MCNP5 radiation transport code, while the tomographic phantom was incorporated within the EGSnrc code. In both codes, the x-ray source was modelled as a fan beam originating from the focal spot at a fan angle of 52° and a focal-spot-to-axis distance of 57 cm. The helical path of the source was explicitly modelled based on variations in collimator setting (12 mm or 24 mm), detector pitch and scan length. Tube potentials of 80, 100 and 120 kVp were considered in this study. Beam profile data were acquired using radiological film measurements on a 16 cm PMMA phantom, which yielded effective beam widths of 14.7 mm and 26.8 mm for collimator settings of 12 mm and 24 mm, respectively. Values of absolute organ absorbed dose were determined via the use of normalization factors defined as the ratio of the CTDI100 measured in-phantom and that determined by Monte Carlo simulation of the PMMA phantom and ion chamber. Across various technique factors, effective dose differences between the stylized and tomographic phantoms ranged from +2% to +9% for head exams, -4% to -2% for chest exams, +8% to +24% for abdominal exams, -16% to -12% for pelvic exams and -7% to 0% for chest-abdomen-pelvis (CAP) exams. In many cases, however, relatively close agreement in effective dose was accomplished at the expense of compensating errors in individual organ dose. Per cent differences in organ dose between the stylized and tomographic phantoms at 120 kVp and 12 mm collimator setting ranged from -25% (skin) to +164% (muscle) for head exams, -92% (thyroid) to +98% (ovaries) for chest exams, -144% (uterus) to +112% (ovaries) for abdominal exams, -98% (SI wall) to +20% (thymus) for pelvic exams and -60% (extrathoracic airways) to +13% (ovaries) for CAP exams. Better agreement was seen between the two phantom types for organs entirely within the scan field. In these cases, corresponding per cent differences in organ absorbed dose did not vary more than 17%. For all scans, the effective dose was found to range approximately 1-13 mSv across the scan parameters and scan regions. The largest effective dose occurred for CAP scans at 120 kVp.
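
    The normalization described above can be written compactly: the Monte Carlo organ tally is scaled by the ratio of the measured CTDI100 to the CTDI100 obtained from the Monte Carlo simulation of the same PMMA phantom and ion chamber geometry. A minimal sketch with made-up numbers follows (variable names are illustrative; the quantities must share a common per-photon or per-mAs basis).

    # Sketch of converting a Monte Carlo organ tally to absolute organ dose by
    # normalizing with the measured-to-simulated CTDI100 ratio. Numbers and
    # variable names are illustrative only.
    def absolute_organ_dose(d_organ_mc, ctdi100_measured, ctdi100_mc):
        """Scale the simulated organ tally by the CTDI100 normalization factor."""
        normalization = ctdi100_measured / ctdi100_mc
        return d_organ_mc * normalization

    # Example with made-up values: a simulated organ tally of 2.1e-14 Gy/photon,
    # a measured CTDI100 of 8.5 mGy, and a simulated CTDI100 of 7.9e-14 Gy/photon.
    print(absolute_organ_dose(2.1e-14, 8.5e-3, 7.9e-14))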

  6. Neutron Deep Penetration Calculations in Light Water with Monte Carlo TRIPOLI-4® Variance Reduction Techniques

    NASA Astrophysics Data System (ADS)

    Lee, Yi-Kang

    2017-09-01

    Nuclear decommissioning takes place in several stages due to the radioactivity in the reactor structure materials. A good estimation of the neutron activation products distributed in the reactor structure materials obviously impacts the decommissioning planning and the low-level radioactive waste management. The continuous-energy Monte Carlo radiation transport code TRIPOLI-4 has been applied to radiation protection and shielding analyses. To enhance the TRIPOLI-4 application in nuclear decommissioning activities, both experimental and computational benchmarks are being performed. To calculate the neutron activation of the shielding and structure materials of nuclear facilities, the 3D neutron flux map and energy spectra must first be determined. To perform this type of neutron deep penetration calculation with a Monte Carlo transport code, variance reduction techniques are necessary in order to reduce the uncertainty of the neutron activation estimation. In this study, variance reduction options of the TRIPOLI-4 code were used on the NAIADE 1 light water shielding benchmark. This benchmark document is available from the OECD/NEA SINBAD shielding benchmark database. From this benchmark database, a simplified NAIADE 1 water shielding model was first proposed in this work in order to make the code validation easier. Determination of the fission neutron transport was performed in light water for penetration up to 50 cm for fast neutrons and up to about 180 cm for thermal neutrons. Measurement and calculation results were benchmarked. Variance reduction options and their performance were discussed and compared.

  7. Verification of unfold error estimates in the unfold operator code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fehl, D.L.; Biggs, F.

    Spectral unfolding is an inverse mathematical operation that attempts to obtain spectral source information from a set of response functions and data measurements. Several unfold algorithms have appeared over the past 30 years; among them is the unfold operator (UFO) code written at Sandia National Laboratories. In addition to an unfolded spectrum, the UFO code also estimates the unfold uncertainty (error) induced by estimated random uncertainties in the data. In UFO the unfold uncertainty is obtained from the error matrix. This built-in estimate has now been compared to error estimates obtained by running the code in a Monte Carlo fashion with prescribed data distributions (Gaussian deviates). In the test problem studied, data were simulated from an arbitrarily chosen blackbody spectrum (10 keV) and a set of overlapping response functions. The data were assumed to have an imprecision of 5% (standard deviation). One hundred random data sets were generated. The built-in estimate of unfold uncertainty agreed with the Monte Carlo estimate to within the statistical resolution of this relatively small sample size (95% confidence level). A possible 10% bias between the two methods was unresolved. The Monte Carlo technique is also useful in underdetermined problems, for which the error matrix method does not apply. UFO has been applied to the diagnosis of low energy x rays emitted by Z-pinch and ion-beam driven hohlraums. © 1997 American Institute of Physics.
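
    The cross-check described above is straightforward to reproduce in miniature: perturb the data with 5% Gaussian deviates, re-unfold each set, and compare the empirical spread with the covariance propagated through the unfold. The sketch below uses a toy pseudo-inverse "unfold" in place of UFO; the response matrix and spectrum are made up.

    # Toy reproduction of the Monte Carlo uncertainty cross-check: perturb the data
    # with 5% Gaussian deviates many times, re-unfold each set, and compare the
    # empirical spread with the error propagated through the (pseudo-)inverse.
    # The least-squares "unfold" here stands in for the UFO algorithm.
    import numpy as np

    rng = np.random.default_rng(0)
    n_chan, n_bins = 12, 8
    R = rng.uniform(0.0, 1.0, size=(n_chan, n_bins))      # toy response functions
    true_spec = np.exp(-np.arange(n_bins) / 3.0)           # toy source spectrum
    data0 = R @ true_spec
    sigma = 0.05 * data0                                   # 5% standard deviation

    # "Built-in" style estimate: linear error propagation through the pseudo-inverse.
    R_pinv = np.linalg.pinv(R)
    cov_builtin = R_pinv @ np.diag(sigma**2) @ R_pinv.T

    # Monte Carlo estimate: 100 perturbed data sets, each re-unfolded.
    unfolds = np.array([R_pinv @ (data0 + sigma * rng.normal(size=n_chan))
                        for _ in range(100)])
    cov_mc = np.cov(unfolds, rowvar=False)

    print(np.sqrt(np.diag(cov_builtin)))
    print(np.sqrt(np.diag(cov_mc)))   # should agree within sampling resolution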

  8. ITS version 5.0 : the integrated TIGER series of coupled electron/photon Monte Carlo transport codes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Franke, Brian Claude; Kensek, Ronald Patrick; Laub, Thomas William

    ITS is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. Our goal has been to simultaneously maximize operational simplicity and physical accuracy. Through a set of preprocessor directives, the user selects one of the many ITS codes. The ease with which the makefile system is applied combines with an input scheme based on order-independent descriptive keywords that makes maximum use of defaults and internal error checking to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Physical rigor is provided by employing accurate cross sections, sampling distributions, and physical models for describing the production and transport of the electron/photon cascade from 1.0 GeV down to 1.0 keV. The availability of source code permits the more sophisticated user to tailor the codes to specific applications and to extend the capabilities of the codes to more complex applications. Version 5.0, the latest version of ITS, contains (1) improvements to the ITS 3.0 continuous-energy codes, (2) multigroup codes with adjoint transport capabilities, and (3) parallel implementations of all ITS codes. Moreover, the general user friendliness of the software has been enhanced through increased internal error checking and improved code portability.

  9. Nuclear reaction measurements on tissue-equivalent materials and GEANT4 Monte Carlo simulations for hadrontherapy

    NASA Astrophysics Data System (ADS)

    De Napoli, M.; Romano, F.; D'Urso, D.; Licciardello, T.; Agodi, C.; Candiano, G.; Cappuzzello, F.; Cirrone, G. A. P.; Cuttone, G.; Musumarra, A.; Pandola, L.; Scuderi, V.

    2014-12-01

    When a carbon beam interacts with human tissues, many secondary fragments are produced in the tumor region and the surrounding healthy tissues. Therefore, in hadrontherapy precise dose calculations require Monte Carlo tools equipped with complex nuclear reaction models. To get realistic predictions, however, simulation codes must be validated against experimental results; the wider the dataset is, the more finely the models can be tuned. Since no fragmentation data for tissue-equivalent materials at Fermi energies are available in the literature, we measured secondary fragments produced by the interaction of a 55.6 MeV u-1 12C beam with thick muscle and cortical bone targets. Three reaction models used by the Geant4 Monte Carlo code, the Binary Light Ions Cascade, the Quantum Molecular Dynamic and the Liege Intranuclear Cascade, have been benchmarked against the collected data. In this work we present the experimental results and discuss the predictive power of the above-mentioned models.

  10. Comparison of UWCC MOX fuel measurements to MCNP-REN calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abhold, M.; Baker, M.; Jie, R.

    1998-12-31

    The development of neutron coincidence counting has greatly improved the accuracy and versatility of neutron-based techniques to assay fissile materials. Today, the shift register analyzer connected to either a passive or active neutron detector is widely used by both domestic and international safeguards organizations. The continued development of these techniques and detectors makes extensive use of predictions of detector response through the use of Monte Carlo techniques in conjunction with the point reactor model. Unfortunately, the point reactor model, as it is currently used, fails to accurately predict detector response in highly multiplying media such as mixed-oxide (MOX) light water reactor fuel assemblies. For this reason, efforts have been made to modify the currently used Monte Carlo codes and to develop new analytical methods so that this model is not required to predict detector response. The authors describe their efforts to modify a widely used Monte Carlo code for this purpose and also compare calculational results with experimental measurements.

  11. Coupled Monte Carlo neutronics and thermal hydraulics for power reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bernnat, W.; Buck, M.; Mattes, M.

    The availability of high performance computing resources enables more and more the use of detailed Monte Carlo models even for full core power reactors. The detailed structure of the core can be described by lattices, modeled by so-called repeated structures, e.g. in Monte Carlo codes such as MCNP5 or MCNPX. For cores with mainly uniform material compositions and fuel and moderator temperatures, there is no problem in constructing core models. However, when the material composition and the temperatures vary strongly, a huge number of different material cells must be described, which complicates the input and in many cases exceeds code or memory limits. The second problem arises with the preparation of corresponding temperature-dependent cross sections and thermal scattering laws. Only if these problems are solved is a realistic coupling of Monte Carlo neutronics with an appropriate thermal-hydraulics model possible. In this paper a method for the treatment of detailed material and temperature distributions in MCNP5 is described, based on user-specified internal functions which assign distinct elements of the core cells to material specifications (e.g. water density) and temperatures from a thermal-hydraulics code. The core grid itself can be described with a uniform material specification. The temperature dependency of cross sections and thermal neutron scattering laws is taken into account by interpolation, requiring only a limited number of data sets generated for different temperatures. Applications will be shown for the stationary part of the Purdue PWR benchmark using ATHLET for thermal-hydraulics and for a generic Modular High Temperature reactor using THERMIX for thermal-hydraulics. (authors)
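
    The temperature treatment described above interpolates cross sections between a limited number of precomputed temperature sets. A minimal sketch follows, assuming simple linear interpolation in temperature between two bracketing sets (the actual interpolation law and the MCNP5 user routine are not specified here).

    # Minimal sketch of assigning a cell temperature from a thermal-hydraulics
    # field and interpolating cross sections between two bracketing precomputed
    # temperature sets. Linear interpolation in T is an assumption; other laws
    # (e.g. in sqrt(T)) are equally plausible.
    import numpy as np

    xs_tables = {600.0: np.array([2.10, 1.05, 0.52]),   # toy group cross sections at 600 K
                 900.0: np.array([2.25, 1.12, 0.49])}   # and at 900 K

    def interpolate_xs(temperature, tables=xs_tables):
        (t_lo, xs_lo), (t_hi, xs_hi) = sorted(tables.items())
        t = np.clip(temperature, t_lo, t_hi)
        w = (t - t_lo) / (t_hi - t_lo)
        return (1.0 - w) * xs_lo + w * xs_hi

    # Example: a fuel cell whose thermal-hydraulics solution gives 750 K.
    print(interpolate_xs(750.0))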

  12. Standardizing Methods for Weapons Accuracy and Effectiveness Evaluation

    DTIC Science & Technology

    2014-06-01

    [The retrieved excerpt consists only of table-of-contents and figure-list fragments; the recoverable topics are a Monte Carlo approach, the expected value theorem, a PHIT/PNM methodology, MATLAB code appendices (SR_CDF_DATA, GE_EXTRACT, PHIT/PNM), and Normal/double-Normal fits to test data.]

  13. Space Applications of the FLUKA Monte-Carlo Code: Lunar and Planetary Exploration

    NASA Technical Reports Server (NTRS)

    Anderson, V.; Ballarini, F.; Battistoni, G.; Campanella, M.; Carboni, M.; Cerutti, F.; Elkhayari, N.; Empl, A.; Fasso, A.; Ferrari, A.; hide

    2004-01-01

    NASA has recognized the need for making additional heavy-ion collision measurements at the U.S. Brookhaven National Laboratory in order to support further improvement of several particle physics transport-code models for space exploration applications. FLUKA has been identified as one of these codes and we will review the nature and status of this investigation as it relates to high-energy heavy-ion physics.

  14. Calculated criticality for 235U/graphite systems using the VIM Monte Carlo code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collins, P.J.; Grasseschi, G.L.; Olsen, D.N.

    1992-01-01

    Calculations for highly enriched uranium and graphite systems gained renewed interest recently for the new production modular high-temperature gas-cooled reactor (MHTGR). Experiments to validate the physics calculations for these systems are being prepared for the Transient Reactor Test Facility (TREAT) reactor at Argonne National Laboratory (ANL-West) and in the Compact Nuclear Power Source facility at Los Alamos National Laboratory. The continuous-energy Monte Carlo code VIM, or equivalently the MCNP code, can utilize fully detailed models of the MHTGR and serve as benchmarks for the approximate multigroup methods necessary in full reactor calculations. Validation of these codes and their associated nuclear data did not exist for highly enriched 235U/graphite systems. Experimental data, used in the development of more approximate methods, dates back to the 1960s. The authors have selected two independent sets of experiments for calculation with the VIM code. The carbon-to-uranium (C/U) ratios encompass the range from 2,000, representative of the new production MHTGR, to 10,000 in the fuel of TREAT. Calculations used the ENDF/B-V data.

  15. Design Analysis of SNS Target Station Biological Shielding Monolith with Proton Power Uprate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bekar, Kursat B.; Ibrahim, Ahmad M.

    2017-05-01

    This report documents the analysis of the dose rate in the experiment area outside the Spallation Neutron Source (SNS) target station shielding monolith with a proton beam energy of 1.3 GeV. The analysis implemented a coupled three-dimensional (3D)/two-dimensional (2D) approach that used both the Monte Carlo N-Particle Extended (MCNPX) 3D Monte Carlo code and the Discrete Ordinates Transport (DORT) two-dimensional deterministic code. The analysis with a proton beam energy of 1.3 GeV showed that the dose rate in continuously occupied areas on the lateral surface outside the SNS target station shielding monolith is less than 0.25 mrem/h, which complies with the SNS facility design objective. However, the methods and codes used in this analysis are out of date and unsupported, and the 2D approximation of the target shielding monolith does not accurately represent the geometry. We recommend that this analysis be updated with modern codes and libraries such as ADVANTG or SHIFT. These codes have demonstrated very high efficiency in performing full 3D radiation shielding analyses of similar and even more difficult problems.

  16. COCOA code for creating mock observations of star cluster models

    NASA Astrophysics Data System (ADS)

    Askar, Abbas; Giersz, Mirek; Pych, Wojciech; Dalessandro, Emanuele

    2018-04-01

    We introduce and present results from the COCOA (Cluster simulatiOn Comparison with ObservAtions) code, which has been developed to create idealized mock photometric observations using results from numerical simulations of star cluster evolution. COCOA is able to present the output of realistic numerical simulations of star clusters carried out using Monte Carlo or N-body codes in a way that is useful for direct comparison with photometric observations. In this paper, we describe the COCOA code and demonstrate its different applications by utilizing globular cluster (GC) models simulated with the MOCCA (MOnte Carlo Cluster simulAtor) code. COCOA is used to synthetically observe these different GC models with optical telescopes, perform point spread function photometry, and subsequently produce observed colour-magnitude diagrams. We also use COCOA to compare the results from synthetic observations of a cluster model that has the same age and metallicity as the Galactic GC NGC 2808 with observations of the same cluster carried out with a 2.2 m optical telescope. We find that COCOA can effectively simulate realistic observations and recover photometric data. COCOA has numerous scientific applications that may be helpful for both theoreticians and observers who work on star clusters. Plans for further improving and developing the code are also discussed in this paper.

  17. Monte carlo simulations of the n_TOF lead spallation target with the Geant4 toolkit: A benchmark study

    NASA Astrophysics Data System (ADS)

    Lerendegui-Marco, J.; Cortés-Giraldo, M. A.; Guerrero, C.; Quesada, J. M.; Meo, S. Lo; Massimi, C.; Barbagallo, M.; Colonna, N.; Mancussi, D.; Mingrone, F.; Sabaté-Gilarte, M.; Vannini, G.; Vlachoudis, V.; Aberle, O.; Andrzejewski, J.; Audouin, L.; Bacak, M.; Balibrea, J.; Bečvář, F.; Berthoumieux, E.; Billowes, J.; Bosnar, D.; Brown, A.; Caamaño, M.; Calviño, F.; Calviani, M.; Cano-Ott, D.; Cardella, R.; Casanovas, A.; Cerutti, F.; Chen, Y. H.; Chiaveri, E.; Cortés, G.; Cosentino, L.; Damone, L. A.; Diakaki, M.; Domingo-Pardo, C.; Dressler, R.; Dupont, E.; Durán, I.; Fernández-Domínguez, B.; Ferrari, A.; Ferreira, P.; Finocchiaro, P.; Göbel, K.; Gómez-Hornillos, M. B.; García, A. R.; Gawlik, A.; Gilardoni, S.; Glodariu, T.; Gonçalves, I. F.; González, E.; Griesmayer, E.; Gunsing, F.; Harada, H.; Heinitz, S.; Heyse, J.; Jenkins, D. G.; Jericha, E.; Käppeler, F.; Kadi, Y.; Kalamara, A.; Kavrigin, P.; Kimura, A.; Kivel, N.; Kokkoris, M.; Krtička, M.; Kurtulgil, D.; Leal-Cidoncha, E.; Lederer, C.; Leeb, H.; Lonsdale, S. J.; Macina, D.; Marganiec, J.; Martínez, T.; Masi, A.; Mastinu, P.; Mastromarco, M.; Maugeri, E. A.; Mazzone, A.; Mendoza, E.; Mengoni, A.; Milazzo, P. M.; Musumarra, A.; Negret, A.; Nolte, R.; Oprea, A.; Patronis, N.; Pavlik, A.; Perkowski, J.; Porras, I.; Praena, J.; Radeck, D.; Rauscher, T.; Reifarth, R.; Rout, P. C.; Rubbia, C.; Ryan, J. A.; Saxena, A.; Schillebeeckx, P.; Schumann, D.; Smith, A. G.; Sosnin, N. V.; Stamatopoulos, A.; Tagliente, G.; Tain, J. L.; Tarifeño-Saldivia, A.; Tassan-Got, L.; Valenta, S.; Variale, V.; Vaz, P.; Ventura, A.; Vlastou, R.; Wallner, A.; Warren, S.; Woods, P. J.; Wright, T.; Žugec, P.

    2017-09-01

    Monte Carlo (MC) simulations are an essential tool to determine fundamental features of a neutron beam, such as the neutron flux or the γ-ray background, that sometimes can not be measured or at least not in every position or energy range. Until recently, the most widely used MC codes in this field had been MCNPX and FLUKA. However, the Geant4 toolkit has also become a competitive code for the transport of neutrons after the development of the native Geant4 format for neutron data libraries, G4NDL. In this context, we present the Geant4 simulations of the neutron spallation target of the n_TOF facility at CERN, done with version 10.1.1 of the toolkit. The first goal was the validation of the intra-nuclear cascade models implemented in the code using, as benchmark, the characteristics of the neutron beam measured at the first experimental area (EAR1), especially the neutron flux and energy distribution, and the time distribution of neutrons of equal kinetic energy, the so-called Resolution Function. The second goal was the development of a Monte Carlo tool aimed to provide useful calculations for both the analysis and planning of the upcoming measurements at the new experimental area (EAR2) of the facility.

  18. Voxel2MCNP: a framework for modeling, simulation and evaluation of radiation transport scenarios for Monte Carlo codes.

    PubMed

    Pölz, Stefan; Laubersheimer, Sven; Eberhardt, Jakob S; Harrendorf, Marco A; Keck, Thomas; Benzler, Andreas; Breustedt, Bastian

    2013-08-21

    The basic idea of Voxel2MCNP is to provide a framework supporting users in modeling radiation transport scenarios using voxel phantoms and other geometric models, generating corresponding input for the Monte Carlo code MCNPX, and evaluating simulation output. Applications at Karlsruhe Institute of Technology are primarily whole and partial body counter calibration and calculation of dose conversion coefficients. A new generic data model describing data related to radiation transport, including phantom and detector geometries and their properties, sources, tallies and materials, has been developed. It is modular and generally independent of the targeted Monte Carlo code. The data model has been implemented as an XML-based file format to facilitate data exchange, and integrated with Voxel2MCNP to provide a common interface for modeling, visualization, and evaluation of data. Also, extensions to allow compatibility with several file formats, such as ENSDF for nuclear structure properties and radioactive decay data, SimpleGeo for solid geometry modeling, ImageJ for voxel lattices, and MCNPX's MCTAL for simulation results, have been added. The framework is presented and discussed in this paper, and example workflows for body counter calibration and the calculation of dose conversion coefficients are given to illustrate its application.

  19. Monte Carlo simulations versus experimental measurements in a small animal PET system. A comparison in the NEMA NU 4-2008 framework

    NASA Astrophysics Data System (ADS)

    Popota, F. D.; Aguiar, P.; España, S.; Lois, C.; Udias, J. M.; Ros, D.; Pavia, J.; Gispert, J. D.

    2015-01-01

    In this work a comparison between experimental and simulated data using GATE and PeneloPET Monte Carlo simulation packages is presented. All simulated setups, as well as the experimental measurements, followed exactly the guidelines of the NEMA NU 4-2008 standards using the microPET R4 scanner. The comparison was focused on spatial resolution, sensitivity, scatter fraction and counting rates performance. Both GATE and PeneloPET showed reasonable agreement for the spatial resolution when compared to experimental measurements, although they lead to slight underestimations for the points close to the edge. High accuracy was obtained between experiments and simulations of the system’s sensitivity and scatter fraction for an energy window of 350-650 keV, as well as for the counting rate simulations. The latter was the most complicated test to perform since each code demands different specifications for the characterization of the system’s dead time. Although simulated and experimental results were in excellent agreement for both simulation codes, PeneloPET demanded more information about the behavior of the real data acquisition system. To our knowledge, this constitutes the first validation of these Monte Carlo codes for the full NEMA NU 4-2008 standards for small animal PET imaging systems.

  20. Monte Carlo simulations versus experimental measurements in a small animal PET system. A comparison in the NEMA NU 4-2008 framework.

    PubMed

    Popota, F D; Aguiar, P; España, S; Lois, C; Udias, J M; Ros, D; Pavia, J; Gispert, J D

    2015-01-07

    In this work a comparison between experimental and simulated data using GATE and PeneloPET Monte Carlo simulation packages is presented. All simulated setups, as well as the experimental measurements, followed exactly the guidelines of the NEMA NU 4-2008 standards using the microPET R4 scanner. The comparison was focused on spatial resolution, sensitivity, scatter fraction and counting rates performance. Both GATE and PeneloPET showed reasonable agreement for the spatial resolution when compared to experimental measurements, although they lead to slight underestimations for the points close to the edge. High accuracy was obtained between experiments and simulations of the system's sensitivity and scatter fraction for an energy window of 350-650 keV, as well as for the counting rate simulations. The latter was the most complicated test to perform since each code demands different specifications for the characterization of the system's dead time. Although simulated and experimental results were in excellent agreement for both simulation codes, PeneloPET demanded more information about the behavior of the real data acquisition system. To our knowledge, this constitutes the first validation of these Monte Carlo codes for the full NEMA NU 4-2008 standards for small animal PET imaging systems.

  1. Synthetic neutron camera and spectrometer in JET based on AFSI-ASCOT simulations

    NASA Astrophysics Data System (ADS)

    Sirén, P.; Varje, J.; Weisen, H.; Koskela, T.; contributors, JET

    2017-09-01

    The ASCOT Fusion Source Integrator (AFSI) has been used to calculate neutron production rates and spectra corresponding to the JET 19-channel neutron camera (KN3) and the time-of-flight spectrometer (TOFOR) as ideal diagnostics, without detector-related effects. AFSI calculates fusion product distributions in 4D, based on Monte Carlo integration from arbitrary reactant distribution functions. The distribution functions were calculated by the ASCOT Monte Carlo particle orbit following code for thermal, NBI and ICRH particle reactions. Fusion cross-sections were defined based on the Bosch-Hale model and both DD and DT reactions have been included. Neutrons generated by AFSI-ASCOT simulations have already been applied as a neutron source of the Serpent neutron transport code in ITER studies. Additionally, AFSI has been selected to be a main tool as the fusion product generator in the complete analysis calculation chain: ASCOT - AFSI - SERPENT (neutron and gamma transport Monte Carlo code) - APROS (system and power plant modelling code), which encompasses the plasma as an energy source, heat deposition in plant structures as well as cooling and balance-of-plant in DEMO applications and other reactor relevant analyses. This conference paper presents the first results and validation of the AFSI DD fusion model for different auxiliary heating scenarios (NBI, ICRH) with very different fast particle distribution functions. Both calculated quantities (production rates and spectra) have been compared with experimental data from KN3 and synthetic spectrometer data from ControlRoom code. No unexplained differences have been observed. In future work, AFSI will be extended for synthetic gamma diagnostics and additionally, AFSI will be used as part of the neutron transport calculation chain to model real diagnostics instead of ideal synthetic diagnostics for quantitative benchmarking.
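
    AFSI's Monte Carlo integration of fusion source rates from reactant distribution functions can be illustrated in a much-reduced form: sample reactant velocities, form the relative speed, and average sigma(v_rel) * v_rel. In the sketch below, the isotropic Maxwellian reactants and the placeholder cross section are assumptions; AFSI itself uses ASCOT distribution functions and the Bosch-Hale parametrization.

    # Much-reduced sketch of Monte Carlo integration of a fusion reactivity from
    # two reactant velocity distributions. The Maxwellian reactants and the
    # placeholder cross section below are assumptions, not AFSI's models.
    import numpy as np

    def placeholder_sigma(v_rel):
        """Hypothetical cross section (m^2) rising smoothly with relative speed."""
        return 1e-28 * (v_rel / 1e6) ** 2 / (1.0 + (v_rel / 3e6) ** 2)

    def reactivity(T1_kev, T2_kev, m1, m2, n_samples=200_000, seed=0):
        """Estimate <sigma * v> (m^3/s) for Maxwellian reactants at T1, T2 (keV)."""
        kev = 1.602e-16  # J per keV
        rng = np.random.default_rng(seed)
        v1 = rng.normal(scale=np.sqrt(T1_kev * kev / m1), size=(n_samples, 3))
        v2 = rng.normal(scale=np.sqrt(T2_kev * kev / m2), size=(n_samples, 3))
        v_rel = np.linalg.norm(v1 - v2, axis=1)
        return np.mean(placeholder_sigma(v_rel) * v_rel)

    m_d = 3.344e-27  # deuteron mass, kg
    print(reactivity(10.0, 10.0, m_d, m_d))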

  2. Design and implementation of coded aperture coherent scatter spectral imaging of cancerous and healthy breast tissue samples

    PubMed Central

    Lakshmanan, Manu N.; Greenberg, Joel A.; Samei, Ehsan; Kapadia, Anuj J.

    2016-01-01

    Abstract. A scatter imaging technique for the differentiation of cancerous and healthy breast tissue in a heterogeneous sample is introduced in this work. Such a technique has potential utility in intraoperative margin assessment during lumpectomy procedures. In this work, we investigate the feasibility of the imaging method for tumor classification using Monte Carlo simulations and physical experiments. The coded aperture coherent scatter spectral imaging technique was used to reconstruct three-dimensional (3-D) images of breast tissue samples acquired through a single-position snapshot acquisition, without rotation as is required in coherent scatter computed tomography. We perform a quantitative assessment of the accuracy of the cancerous voxel classification using Monte Carlo simulations of the imaging system; describe our experimental implementation of coded aperture scatter imaging; show the reconstructed images of the breast tissue samples; and present segmentations of the 3-D images in order to identify the cancerous and healthy tissue in the samples. From the Monte Carlo simulations, we find that coded aperture scatter imaging is able to reconstruct images of the samples and identify the distribution of cancerous and healthy tissues (i.e., fibroglandular, adipose, or a mix of the two) inside them with a cancerous voxel identification sensitivity, specificity, and accuracy of 92.4%, 91.9%, and 92.0%, respectively. From the experimental results, we find that the technique is able to identify cancerous and healthy tissue samples and reconstruct differential coherent scatter cross sections that are highly correlated with those measured by other groups using x-ray diffraction. Coded aperture scatter imaging has the potential to provide scatter images that automatically differentiate cancerous and healthy tissue inside samples within a time on the order of a minute per slice. PMID:26962543

  3. Design and implementation of coded aperture coherent scatter spectral imaging of cancerous and healthy breast tissue samples.

    PubMed

    Lakshmanan, Manu N; Greenberg, Joel A; Samei, Ehsan; Kapadia, Anuj J

    2016-01-01

    A scatter imaging technique for the differentiation of cancerous and healthy breast tissue in a heterogeneous sample is introduced in this work. Such a technique has potential utility in intraoperative margin assessment during lumpectomy procedures. In this work, we investigate the feasibility of the imaging method for tumor classification using Monte Carlo simulations and physical experiments. The coded aperture coherent scatter spectral imaging technique was used to reconstruct three-dimensional (3-D) images of breast tissue samples acquired through a single-position snapshot acquisition, without rotation as is required in coherent scatter computed tomography. We perform a quantitative assessment of the accuracy of the cancerous voxel classification using Monte Carlo simulations of the imaging system; describe our experimental implementation of coded aperture scatter imaging; show the reconstructed images of the breast tissue samples; and present segmentations of the 3-D images in order to identify the cancerous and healthy tissue in the samples. From the Monte Carlo simulations, we find that coded aperture scatter imaging is able to reconstruct images of the samples and identify the distribution of cancerous and healthy tissues (i.e., fibroglandular, adipose, or a mix of the two) inside them with a cancerous voxel identification sensitivity, specificity, and accuracy of 92.4%, 91.9%, and 92.0%, respectively. From the experimental results, we find that the technique is able to identify cancerous and healthy tissue samples and reconstruct differential coherent scatter cross sections that are highly correlated with those measured by other groups using x-ray diffraction. Coded aperture scatter imaging has the potential to provide scatter images that automatically differentiate cancerous and healthy tissue inside samples within a time on the order of a minute per slice.
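
    The classification figures quoted above follow directly from a voxel-level confusion matrix; the short sketch below shows the bookkeeping (the label arrays are made up).

    # Sensitivity, specificity, and accuracy from a voxel-level confusion matrix,
    # as used to quote the cancerous-voxel classification performance above.
    # The ground-truth and predicted label arrays here are made up.
    import numpy as np

    def classification_metrics(truth, predicted):
        tp = np.sum((truth == 1) & (predicted == 1))
        tn = np.sum((truth == 0) & (predicted == 0))
        fp = np.sum((truth == 0) & (predicted == 1))
        fn = np.sum((truth == 1) & (predicted == 0))
        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        accuracy = (tp + tn) / (tp + tn + fp + fn)
        return sensitivity, specificity, accuracy

    truth = np.array([1, 1, 0, 0, 1, 0, 0, 1])       # 1 = cancerous voxel
    predicted = np.array([1, 0, 0, 0, 1, 1, 0, 1])
    print(classification_metrics(truth, predicted))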

  4. SU-E-T-590: Optimizing Magnetic Field Strengths with Matlab for An Ion-Optic System in Particle Therapy Consisting of Two Quadrupole Magnets for Subsequent Simulations with the Monte-Carlo Code FLUKA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baumann, K; Weber, U; Simeonov, Y

    Purpose: The aim of this study was to optimize the magnetic field strengths of two quadrupole magnets in a particle therapy facility in order to obtain a beam quality suitable for spot beam scanning. Methods: The particle transport through an ion-optic system of a particle therapy facility, consisting of the beam tube, two quadrupole magnets and a beam monitor system, was calculated with the help of Matlab by using matrices that solve the equation of motion of a charged particle in a magnetic field and a field-free region, respectively. The magnetic field strengths were optimized in order to obtain a circular and thin beam spot at the iso-center of the therapy facility. These optimized field strengths were subsequently transferred to the Monte-Carlo code FLUKA and the transport of 80 MeV/u C12-ions through this ion-optic system was calculated by using a user-routine to implement magnetic fields. The fluence along the beam axis and at the iso-center was evaluated. Results: The magnetic field strengths could be optimized by using Matlab and transferred to the Monte-Carlo code FLUKA. The implementation via a user-routine was successful. Analysis of the fluence pattern along the beam axis reproduced the characteristic focusing and de-focusing effects of the quadrupole magnets. Furthermore, the beam spot at the iso-center was circular and significantly thinner compared to an unfocused beam. Conclusion: In this study a Matlab tool was developed to optimize magnetic field strengths for an ion-optic system consisting of two quadrupole magnets as part of a particle therapy facility. These magnetic field strengths could subsequently be transferred to and implemented in the Monte-Carlo code FLUKA to simulate the particle transport through this optimized ion-optic system.
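
    The matrix formalism referred to above propagates the transverse phase-space coordinates (x, x') through drift and quadrupole transfer matrices. A minimal Python sketch follows; the lengths, quadrupole strengths, and the example beamline layout are illustrative values, not the facility's.

    # Sketch of transfer-matrix particle transport through drifts and two
    # quadrupoles in one transverse plane (the other plane is handled with the
    # signs of k reversed). Lengths, strengths, and layout are illustrative.
    import numpy as np

    def drift(length):
        return np.array([[1.0, length], [0.0, 1.0]])

    def quad(k, length):
        """Thick-quadrupole matrix; k > 0 focuses, k < 0 defocuses (k in 1/m^2)."""
        if k > 0:
            s = np.sqrt(k)
            return np.array([[np.cos(s * length), np.sin(s * length) / s],
                             [-s * np.sin(s * length), np.cos(s * length)]])
        s = np.sqrt(-k)
        return np.array([[np.cosh(s * length), np.sinh(s * length) / s],
                         [s * np.sinh(s * length), np.cosh(s * length)]])

    # Beamline in beam direction: 2.0 m drift, focusing quad, 0.5 m drift,
    # defocusing quad, 1.5 m drift to the iso-center; matrices compose right-to-left.
    M = drift(1.5) @ quad(-2.0, 0.3) @ drift(0.5) @ quad(3.0, 0.3) @ drift(2.0)
    x0 = np.array([0.002, 0.001])      # 2 mm offset, 1 mrad divergence
    print(M @ x0)                      # (x, x') at the iso-center in this plane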

  5. MC3: Multi-core Markov-chain Monte Carlo code

    NASA Astrophysics Data System (ADS)

    Cubillos, Patricio; Harrington, Joseph; Lust, Nate; Foster, AJ; Stemm, Madison; Loredo, Tom; Stevenson, Kevin; Campo, Chris; Hardin, Matt; Hardy, Ryan

    2016-10-01

    MC3 (Multi-core Markov-chain Monte Carlo) is a Bayesian statistics tool that can be executed from the shell prompt or interactively through the Python interpreter with single- or multiple-CPU parallel computing. It offers Markov-chain Monte Carlo (MCMC) posterior-distribution sampling for several algorithms, Levenberg-Marquardt least-squares optimization, and uniform non-informative, Jeffreys non-informative, or Gaussian-informative priors. MC3 can share the same value among multiple parameters and fix the value of parameters to constant values, and offers Gelman-Rubin convergence testing and correlated-noise estimation with time-averaging or wavelet-based likelihood estimation methods.
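
    MC3 exposes its own Python API, which is not reproduced here; the following is only a generic random-walk Metropolis sampler illustrating the MCMC posterior-sampling idea with a Gaussian-informative prior. All function and variable names are hypothetical.

```python
import numpy as np

def log_post(theta, data, sigma, prior_mu=0.0, prior_sd=10.0):
    """Gaussian likelihood for a constant model plus a Gaussian-informative prior."""
    log_like = -0.5 * np.sum(((data - theta) / sigma) ** 2)
    log_prior = -0.5 * ((theta - prior_mu) / prior_sd) ** 2
    return log_like + log_prior

def metropolis(data, sigma, n_steps=20000, step=0.1, seed=1):
    rng = np.random.default_rng(seed)
    chain = np.empty(n_steps)
    theta = data.mean()                           # start near the data
    lp = log_post(theta, data, sigma)
    for i in range(n_steps):
        prop = theta + step * rng.normal()        # random-walk proposal
        lp_prop = log_post(prop, data, sigma)
        if np.log(rng.random()) < lp_prop - lp:   # Metropolis acceptance rule
            theta, lp = prop, lp_prop
        chain[i] = theta
    return chain

rng = np.random.default_rng(0)
data = 3.0 + 0.5 * rng.normal(size=50)
chain = metropolis(data, sigma=0.5)
print(chain[5000:].mean(), chain[5000:].std())    # posterior mean and spread
```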

  6. Monte Carlo Simulation of Nonlinear Radiation Induced Plasmas. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Wang, B. S.

    1972-01-01

    A Monte Carlo simulation model for radiation induced plasmas with nonlinear properties due to recombination was developed, employing a piecewise-linearized predictor-corrector iterative technique. Several important variance reduction techniques were developed and incorporated into the model, including an antithetic variates technique. This approach is especially efficient for plasma systems with inhomogeneous media, multiple dimensions, and irregular boundaries. The Monte Carlo code developed has been applied to the determination of the electron energy distribution function and related parameters for a noble gas plasma created by alpha-particle irradiation. The characteristics of the radiation induced plasma involved are given.
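
    As an illustration of the antithetic-variates idea mentioned above (and not of the thesis code itself), the toy estimator below pairs each uniform sample u with 1 - u; the negative correlation within each pair reduces the variance of the mean for a monotone integrand.

```python
import numpy as np

rng = np.random.default_rng(0)
f = lambda u: np.exp(u)          # toy integrand on [0, 1]; exact integral = e - 1
n = 10000

# Plain Monte Carlo estimate
u = rng.random(n)
plain = f(u)

# Antithetic variates: average f(u) with f(1 - u) for each sample
u = rng.random(n // 2)
anti = 0.5 * (f(u) + f(1.0 - u))

print("plain     :", plain.mean(), "+/-", plain.std(ddof=1) / np.sqrt(plain.size))
print("antithetic:", anti.mean(), "+/-", anti.std(ddof=1) / np.sqrt(anti.size))
```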

  7. SU-F-I-53: Coded Aperture Coherent Scatter Spectral Imaging of the Breast: A Monte Carlo Evaluation of Absorbed Dose

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morris, R; Lakshmanan, M; Fong, G

    Purpose: Coherent scatter based imaging has shown improved contrast and molecular specificity over conventional digital mammography; however, the biological risks have not been quantified due to a lack of accurate information on absorbed dose. This study intends to characterize the dose distribution and average glandular dose from coded aperture coherent scatter spectral imaging of the breast. The dose deposited in the breast from this new diagnostic imaging modality has not yet been quantitatively evaluated. Here, various digitized anthropomorphic phantoms are tested in a Monte Carlo simulation to evaluate the absorbed dose distribution and average glandular dose using clinically feasible scan protocols. Methods: Geant4 Monte Carlo radiation transport simulation software is used to replicate the coded aperture coherent scatter spectral imaging system. Energy-sensitive, photon-counting detectors are used to characterize the x-ray beam spectra for various imaging protocols. These input spectra are cross-validated with the results from XSPECT, a commercially available application that yields x-ray tube specific spectra for the operating parameters employed. XSPECT is also used to determine the appropriate number of photons emitted per mAs of tube current at a given kVp tube potential. With the implementation of the XCAT digital anthropomorphic breast phantom library, a variety of breast sizes with differing anatomical structure are evaluated. Simulations were performed with and without compression of the breast for dose comparison. Results: Through the Monte Carlo evaluation of a diverse population of breast types imaged under real-world scan conditions, a clinically relevant average glandular dose for this new imaging modality is extrapolated. Conclusion: With access to the physical coherent scatter imaging system used in the simulation, the results of this Monte Carlo study may be used to directly influence the future development of the modality to keep breast dose to a minimum while still maintaining clinically viable image quality.

  8. SolTrace | Concentrating Solar Power | NREL

    Science.gov Websites

    SolTrace is available as an NREL packaged distribution or from source code at the SolTrace open source project website. The code uses Monte-Carlo ray-tracing methodology to model concentrating solar power systems.

  9. Application of a Java-based, univel geometry, neutral particle Monte Carlo code to the searchlight problem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Charles A. Wemple; Joshua J. Cogliati

    2005-04-01

    A univel geometry, neutral particle Monte Carlo transport code, written entirely in the Java programming language, is under development for medical radiotherapy applications. The code uses ENDF-VI based continuous energy cross section data in a flexible XML format. Full neutron-photon coupling, including detailed photon production and photonuclear reactions, is included. Charged particle equilibrium is assumed within the patient model so that detailed transport of electrons produced by photon interactions may be neglected. External beam and internal distributed source descriptions for mixed neutron-photon sources are allowed. Flux and dose tallies are performed on a univel basis. A four-tap, shift-register-sequence random number generator is used. Initial verification and validation testing of the basic neutron transport routines is underway. The searchlight problem was chosen as a suitable first application because of the simplicity of the physical model. Results show excellent agreement with analytic solutions. Computation times for similar numbers of histories are comparable to other neutron MC codes written in C and FORTRAN.
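
    The abstract mentions a four-tap, shift-register-sequence random number generator. A compact generalized-feedback-shift-register sketch of that family is shown below, using the well-known four-tap rule with lags (471, 1586, 6988, 9689); the seeding scheme and word size are illustrative choices and not those of the Java code.

```python
import numpy as np

class FourTapGFSR:
    """Generalized feedback shift register: x[n] = x[n-A] ^ x[n-B] ^ x[n-C] ^ x[n-D]."""
    TAPS = (471, 1586, 6988, 9689)    # four-tap rule; the last lag is the register length

    def __init__(self, seed=12345):
        # Seed the register with another generator (an arbitrary choice for this sketch)
        rng = np.random.default_rng(seed)
        self.state = [int(v) for v in
                      rng.integers(0, 2**32, size=self.TAPS[-1], dtype=np.uint64)]
        self.n = 0

    def next_u32(self):
        a, b, c, d = self.TAPS
        s, n = self.state, self.n
        # Read the four lagged words (the oldest one occupies the slot being recycled)
        x = s[(n - a) % d] ^ s[(n - b) % d] ^ s[(n - c) % d] ^ s[(n - d) % d]
        s[n % d] = x
        self.n += 1
        return x

    def random(self):
        return self.next_u32() / 2**32            # uniform in [0, 1)

gen = FourTapGFSR()
print([round(gen.random(), 4) for _ in range(5)])
```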

  10. Monte Carlo and discrete-ordinate simulations of spectral radiances in a coupled air-tissue system.

    PubMed

    Hestenes, Kjersti; Nielsen, Kristian P; Zhao, Lu; Stamnes, Jakob J; Stamnes, Knut

    2007-04-20

    We perform a detailed comparison study of Monte Carlo (MC) simulations and discrete-ordinate radiative-transfer (DISORT) calculations of spectral radiances in a 1D coupled air-tissue (CAT) system consisting of horizontal plane-parallel layers. The MC and DISORT models have the same physical basis, including coupling between the air and the tissue, and we use the same air and tissue input parameters for both codes. We find excellent agreement between radiances obtained with the two codes, both above and in the tissue. Our tests cover typical optical properties of skin tissue at the 280, 540, and 650 nm wavelengths. The normalized volume scattering function for internal structures in the skin is represented by the one-parameter Henyey-Greenstein function for large particles and the Rayleigh scattering function for small particles. The CAT-DISORT code is found to be approximately 1000 times faster than the CAT-MC code. We also show that the spectral radiance field is strongly dependent on the inherent optical properties of the skin tissue.
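
    The one-parameter Henyey-Greenstein function mentioned above has a closed-form inverse CDF for the scattering cosine, which is how Monte Carlo codes typically sample it. A small illustrative sampler (not taken from the CAT-MC code) follows.

```python
import numpy as np

def sample_hg_cosine(g, u):
    """Inverse-CDF sample of the Henyey-Greenstein scattering cosine.

    g : asymmetry parameter (-1 < g < 1); u : uniform random numbers in [0, 1).
    """
    if abs(g) < 1e-6:
        return 1.0 - 2.0 * u                       # isotropic (Rayleigh-like) limit
    frac = (1.0 - g * g) / (1.0 - g + 2.0 * g * u)
    return (1.0 + g * g - frac * frac) / (2.0 * g)

rng = np.random.default_rng(0)
u = rng.random(100000)
mu = sample_hg_cosine(0.9, u)   # strongly forward-peaked, typical of large scatterers
print("mean cosine (should be close to g):", mu.mean())
```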

  11. Dosimetric comparison of Monte Carlo codes (EGS4, MCNP, MCNPX) considering external and internal exposures of the Zubal phantom to electron and photon sources.

    PubMed

    Chiavassa, S; Lemosquet, A; Aubineau-Lanièce, I; de Carlan, L; Clairand, I; Ferrer, L; Bardiès, M; Franck, D; Zankl, M

    2005-01-01

    This paper aims at comparing dosimetric assessments performed with three Monte Carlo codes: EGS4, MCNP4c2 and MCNPX2.5e, using a realistic voxel phantom, namely the Zubal phantom, in two configurations of exposure. The first one deals with an external irradiation corresponding to the example of a radiological accident. The results are obtained using the EGS4 and the MCNP4c2 codes and expressed in terms of the mean absorbed dose (in Gy per source particle) for brain, lungs, liver and spleen. The second one deals with an internal exposure corresponding to the treatment of a medullary thyroid cancer by 131I-labelled radiopharmaceutical. The results are obtained by EGS4 and MCNPX2.5e and compared in terms of S-values (expressed in mGy per kBq and per hour) for liver, kidney, whole body and thyroid. The results of these two studies are presented and differences between the codes are analysed and discussed.

  12. A comparison between EGS4 and MCNP computer modeling of an in vivo X-ray fluorescence system.

    PubMed

    Al-Ghorabie, F H; Natto, S S; Al-Lyhiani, S H

    2001-03-01

    The Monte Carlo computer codes EGS4 and MCNP were used to develop a theoretical model of a 180-degree geometry in vivo X-ray fluorescence system for the measurement of platinum concentration in head and neck tumors. The model included specification of the photon source, collimators, phantoms and detector. Theoretical results were compared and evaluated against X-ray fluorescence data obtained experimentally from an existing system developed by the Swansea In Vivo Analysis and Cancer Research Group. The EGS4 results agreed well with the MCNP results. However, agreement between the measured spectral shape obtained using the experimental X-ray fluorescence system and the simulated spectral shape obtained using the two Monte Carlo codes was relatively poor. The main reason for the disagreement arises from the basic assumptions that the two codes use in their calculations. Both codes assume a "free" electron model for Compton interactions. This assumption underestimates the results and invalidates direct comparison between predicted and experimental spectra.

  13. Comparisons between MCNP, EGS4 and experiment for clinical electron beams.

    PubMed

    Jeraj, R; Keall, P J; Ostwald, P M

    1999-03-01

    Understanding the limitations of Monte Carlo codes is essential in order to avoid systematic errors in simulations, and to suggest further improvement of the codes. MCNP and EGS4, Monte Carlo codes commonly used in medical physics, were compared and evaluated against electron depth dose data and experimental backscatter results obtained using clinical radiotherapy beams. Different physical models and algorithms used in the codes give significantly different depth dose curves and electron backscattering factors. The default version of MCNP calculates electron depth dose curves which are too penetrating. The MCNP results agree better with experiment if the ITS-style energy-indexing algorithm is used. EGS4 underpredicts electron backscattering for high-Z materials. The results slightly improve if optimal PRESTA-I parameters are used. MCNP simulates backscattering well even for high-Z materials. To conclude the comparison, a timing study was performed. EGS4 is generally faster than MCNP and use of a large number of scoring voxels dramatically slows down the MCNP calculation. However, use of a large number of geometry voxels in MCNP only slightly affects the speed of the calculation.

  14. MCNP Version 6.2 Release Notes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Werner, Christopher John; Bull, Jeffrey S.; Solomon, C. J.

    Monte Carlo N-Particle or MCNP® is a general-purpose Monte Carlo radiation-transport code designed to track many particle types over broad ranges of energies. MCNP Version 6.2 follows the MCNP6.1.1 beta version and has been released in order to provide the radiation transport community with the latest feature developments and bug fixes for MCNP. Since the last release of MCNP, major work has been conducted to improve the code base, add features, and provide tools that facilitate ease of use of MCNP version 6.2 as well as the analysis of results. These release notes serve as a general guide for the new/improved physics, source, data, tallies, unstructured mesh, code enhancements and tools. For more detailed information on each of the topics, please refer to the appropriate references or the user manual, which can be found at http://mcnp.lanl.gov. This release of MCNP version 6.2 contains 39 new features in addition to 172 bug fixes and code enhancements. There are still some 33 known issues with which users should familiarize themselves (see the Appendix).

  15. Status of the Space Radiation Monte Carlo Simulation Based on FLUKA and ROOT

    NASA Technical Reports Server (NTRS)

    Andersen, Victor; Carminati, Federico; Empl, Anton; Ferrari, Alfredo; Pinsky, Lawrence; Sala, Paola; Wilson, Thomas L.

    2002-01-01

    The NASA-funded project reported on at the first IWSSRR in Arona to develop a Monte-Carlo simulation program for use in simulating the space radiation environment based on the FLUKA and ROOT codes is well into its second year of development, and considerable progress has been made. The general tasks required to achieve the final goals include the addition of heavy-ion interactions into the FLUKA code and the provision of a ROOT-based interface to FLUKA. The most significant progress to date includes the incorporation of the DPMJET event generator code within FLUKA to handle heavy-ion interactions for incident projectile energies greater than 3GeV/A. The ongoing effort intends to extend the treatment of these interactions down to 10 MeV, and at present two alternative approaches are being explored. The ROOT interface is being pursued in conjunction with the CERN LHC ALICE software team through an adaptation of their existing AliROOT software. As a check on the validity of the code, a simulation of the recent data taken by the ATIC experiment is underway.

  16. Rotating and translating anthropomorphic head voxel models to establish an horizontal Frankfort plane for dental CBCT Monte Carlo simulations: a dose comparison study

    NASA Astrophysics Data System (ADS)

    Stratis, A.; Zhang, G.; Jacobs, R.; Bogaerts, R.; Bosmans, H.

    2016-12-01

    In order to carry out Monte Carlo (MC) dosimetry studies, voxel phantoms modeling human anatomy, built through organ-based segmentation of CT image data sets, are applied to simulation frameworks. The resulting voxel phantoms preserve the patient CT acquisition geometry; in the case of head voxel models built upon head CT images, the head support with which CT scanners are equipped introduces an inclination to the head, and hence to the head voxel model. In dental cone beam CT (CBCT) imaging, patients are always positioned in such a way that the Frankfort line is horizontal, implying that there is no head inclination. The orientation of the head is important, as it influences the distance of critical radiosensitive organs like the thyroid and the esophagus from the x-ray tube. This work aims to propose a procedure to adjust head voxel phantom orientation, and to investigate the impact of head inclination on organ doses in dental CBCT MC dosimetry studies. The female adult ICRP phantom and three in-house-built paediatric voxel phantoms were used in this study. An EGSnrc MC framework was employed to simulate two commonly used protocols: standard-resolution scans on a Morita Accuitomo 170 dental CBCT scanner (FOVs: 60 × 60 mm2 and 80 × 80 mm2), and a 3D Teeth protocol (FOV: 100 × 90 mm2) on a Planmeca Promax 3D MAX scanner. Analysis of the results revealed large differences in absorbed doses for radiosensitive organs between the original and the geometrically corrected voxel models of this study, ranging from -45.6% to 39.3%. Therefore, accurate dental CBCT MC dose calculations require geometrical adjustments to be applied to head voxel models.
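
    A geometrical correction of this kind can be sketched as a rotation of the labelled voxel array about the left-right axis. The snippet below uses scipy.ndimage.rotate with nearest-neighbour interpolation so that discrete organ IDs are preserved; the tilt angle, axis ordering, and phantom are all hypothetical and not taken from the study.

```python
import numpy as np
from scipy import ndimage

def level_frankfort_plane(labels, tilt_deg, axes=(0, 2)):
    """Rotate a labelled head voxel model so that the Frankfort line becomes horizontal.

    labels   : 3-D integer array of organ IDs (assumed z, y, x ordering)
    tilt_deg : head inclination introduced by the CT head support, in degrees
    axes     : plane of rotation (here the sagittal z-x plane, an assumption)
    """
    return ndimage.rotate(labels, angle=-tilt_deg, axes=axes,
                          reshape=True, order=0,     # order=0 keeps discrete organ IDs
                          mode="constant", cval=0)   # pad with label 0 (air/vacuum)

# Hypothetical phantom: a 64^3 label volume with a 12 degree inclination to remove
phantom = np.zeros((64, 64, 64), dtype=np.int16)
phantom[20:40, 25:45, 25:45] = 1
corrected = level_frankfort_plane(phantom, tilt_deg=12.0)
print(phantom.shape, "->", corrected.shape)
```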

  17. Massively parallelized Monte Carlo software to calculate the light propagation in arbitrarily shaped 3D turbid media

    NASA Astrophysics Data System (ADS)

    Zoller, Christian; Hohmann, Ansgar; Ertl, Thomas; Kienle, Alwin

    2017-07-01

    The Monte Carlo method is often referred as the gold standard to calculate the light propagation in turbid media [1]. Especially for complex shaped geometries where no analytical solutions are available the Monte Carlo method becomes very important [1, 2]. In this work a Monte Carlo software is presented, to simulate the light propagation in complex shaped geometries. To improve the simulation time the code is based on OpenCL such that graphics cards can be used as well as other computing devices. Within the software an illumination concept is presented to realize easily all kinds of light sources, like spatial frequency domain (SFD), optical fibers or Gaussian beam profiles. Moreover different objects, which are not connected to each other, can be considered simultaneously, without any additional preprocessing. This Monte Carlo software can be used for many applications. In this work the transmission spectrum of a tooth and the color reconstruction of a virtual object are shown, using results from the Monte Carlo software.

  18. Review of Hybrid (Deterministic/Monte Carlo) Radiation Transport Methods, Codes, and Applications at Oak Ridge National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wagner, John C; Peplow, Douglas E.; Mosher, Scott W

    2010-01-01

    This paper provides a review of the hybrid (Monte Carlo/deterministic) radiation transport methods and codes used at the Oak Ridge National Laboratory and examples of their application for increasing the efficiency of real-world, fixed-source Monte Carlo analyses. The two principal hybrid methods are (1) Consistent Adjoint Driven Importance Sampling (CADIS) for optimization of a localized detector (tally) region (e.g., flux, dose, or reaction rate at a particular location) and (2) Forward Weighted CADIS (FW-CADIS) for optimizing distributions (e.g., mesh tallies over all or part of the problem space) or multiple localized detector regions (e.g., simultaneous optimization of two or more localized tally regions). The two methods have been implemented and automated in both the MAVRIC sequence of SCALE 6 and ADVANTG, a code that works with the MCNP code. As implemented, the methods utilize the results of approximate, fast-running 3-D discrete ordinates transport calculations (with the Denovo code) to generate consistent space- and energy-dependent source and transport (weight windows) biasing parameters. These methods and codes have been applied to many relevant and challenging problems, including calculations of PWR ex-core thermal detector response, dose rates throughout an entire PWR facility, site boundary dose from arrays of commercial spent fuel storage casks, radiation fields for criticality accident alarm system placement, and detector response for special nuclear material detection scenarios and nuclear well-logging tools. Substantial computational speed-ups, generally O(10^2-10^4), have been realized for all applications to date. This paper provides a brief review of the methods, their implementation, results of their application, and current development activities, as well as a considerable list of references for readers seeking more information about the methods and/or their applications.
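
    The weight-window parameters produced by CADIS/FW-CADIS drive a standard splitting/Russian-roulette game during particle tracking. The toy function below illustrates only that generic game, not the MAVRIC or ADVANTG implementation; the window bounds and particle weights are arbitrary.

```python
import numpy as np

def apply_weight_window(weight, w_low, w_high, rng):
    """Generic weight-window game: split above the window, roulette below it.

    Returns a list of surviving particle weights (possibly empty).
    """
    if weight > w_high:                          # split into n copies within the window
        n = int(np.ceil(weight / w_high))
        return [weight / n] * n
    if weight < w_low:                           # Russian roulette
        survival = weight / w_low
        return [w_low] if rng.random() < survival else []
    return [weight]                              # inside the window: unchanged

rng = np.random.default_rng(0)
for w in (5.0, 0.02, 0.5):
    print(w, "->", apply_weight_window(w, w_low=0.25, w_high=1.0, rng=rng))
```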

  19. TH-A-19A-11: Validation of GPU-Based Monte Carlo Code (gPMC) Versus Fully Implemented Monte Carlo Code (TOPAS) for Proton Radiation Therapy: Clinical Cases Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giantsoudi, D; Schuemann, J; Dowdell, S

    Purpose: For proton radiation therapy, Monte Carlo simulation (MCS) methods are recognized as the gold-standard dose calculation approach. Although previously unrealistic due to limitations in available computing power, GPU-based applications allow MCS of proton treatment fields to be performed in routine clinical use, on time scales comparable to that of conventional pencil-beam algorithms. This study focuses on validating the results of our GPU-based code (gPMC) versus a fully implemented proton therapy MCS code (TOPAS) for clinical patient cases. Methods: Two treatment sites were selected to provide clinical cases for this study: head-and-neck cases due to anatomical geometrical complexity (air cavities and density heterogeneities), making dose calculation very challenging, and prostate cases due to the higher proton energies used and the close proximity of the treatment target to sensitive organs at risk. Both gPMC and TOPAS methods were used to calculate 3-dimensional dose distributions for all patients in this study. Comparisons were performed based on target coverage indices (mean dose, V90 and D90) and gamma index distributions for 2% of the prescription dose and 2 mm. Results: For seven out of eight studied cases, mean target dose, V90 and D90 differed by less than 2% between the TOPAS and gPMC dose distributions. Gamma index analysis for all prostate patients resulted in a passing rate of more than 99% of voxels in the target. Four out of five head-and-neck cases showed a target gamma index passing rate of more than 99%, with the fifth having a passing rate of 93%. Conclusion: Our current work showed excellent agreement between our GPU-based MCS code and a fully implemented proton therapy MC code for a group of dosimetrically challenging patient cases.
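
    The 2%/2 mm gamma analysis used above can be illustrated with a brute-force one-dimensional sketch: each evaluated point is compared against all reference points through a combined dose-difference and distance-to-agreement metric. Clinical gamma analysis is three-dimensional and normalizes to the prescription dose; the toy below simplifies both and is not the study's analysis code.

```python
import numpy as np

def gamma_index_1d(x_ref, d_ref, x_eval, d_eval, dose_crit=0.02, dist_mm=2.0, d_norm=None):
    """Brute-force 1-D gamma index with global normalization."""
    d_norm = d_ref.max() if d_norm is None else d_norm
    gammas = np.empty_like(d_eval)
    for i, (x, d) in enumerate(zip(x_eval, d_eval)):
        dd = (d - d_ref) / (dose_crit * d_norm)      # dose-difference term
        dx = (x - x_ref) / dist_mm                   # distance-to-agreement term
        gammas[i] = np.sqrt(dd ** 2 + dx ** 2).min()
    return gammas

x = np.linspace(0, 100, 201)                         # positions in mm
ref = np.exp(-((x - 50) / 20) ** 2)                  # toy reference dose profile
ev = np.exp(-((x - 50.5) / 20) ** 2) * 1.01          # slightly shifted, rescaled copy
g = gamma_index_1d(x, ref, x, ev)
print("passing rate:", np.mean(g <= 1.0) * 100, "%")
```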

  20. Monte Carlo calculation of the atmospheric antinucleon flux

    NASA Astrophysics Data System (ADS)

    Djemil, T.; Attallah, R.; Capdevielle, J. N.

    2009-12-01

    The atmospheric antiproton and antineutron energy spectra are calculated at float altitude using the CORSIKA package in a three-dimensional Monte Carlo simulation. The hadronic interaction is treated by the FLUKA code below 80 GeV/nucleon and NEXUS elsewhere. The solar modulation which is described by the force field theory and the geomagnetic effects are taken into account. The numerical results are compared with the BESS-2001 experimental data.

  1. Monte Carlo simulation of Alaska wolf survival

    NASA Astrophysics Data System (ADS)

    Feingold, S. J.

    1996-02-01

    Alaskan wolves live in a harsh climate and are hunted intensively. Penna's biological aging code, using Monte Carlo methods, has been adapted to simulate wolf survival. It was run on the case in which hunting causes the disruption of wolves' social structure. Social disruption was shown to increase the number of deaths occurring at a given level of hunting. For high levels of social disruption, the population did not survive.
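
    Penna's bit-string aging model, adapted above to wolf survival, is compact enough to sketch: each individual carries a genome whose set bits are deleterious mutations that become active at the corresponding age, and death occurs once a threshold is exceeded, with an extra random hunting mortality added here. All parameter values below are arbitrary illustration choices, not those of the study.

```python
import numpy as np

# Penna bit-string aging model with an extra hunting mortality (all parameters are toys)
GENOME_BITS, DEATH_THRESH, BIRTH_AGE = 32, 3, 5
HUNT_PROB, N0, STEPS, CAPACITY = 0.05, 2000, 200, 10000
rng = np.random.default_rng(0)

genomes = np.zeros(N0, dtype=np.uint32)   # bit i set = deleterious mutation active from age i
ages = np.zeros(N0, dtype=np.int32)

def active_mutations(genomes, ages):
    """Number of deleterious mutations already switched on for each individual."""
    counts = np.zeros(len(genomes), dtype=np.int32)
    for i in range(GENOME_BITS):
        counts += ((genomes >> np.uint32(i)) & np.uint32(1)).astype(np.int32) * (ages >= i)
    return counts

for _ in range(STEPS):
    ages += 1
    alive = rng.random(len(genomes)) < 1.0 - len(genomes) / CAPACITY   # Verhulst factor
    alive &= active_mutations(genomes, ages) < DEATH_THRESH            # genetic deaths
    alive &= rng.random(len(genomes)) > HUNT_PROB                      # hunting mortality
    genomes, ages = genomes[alive], ages[alive]
    parents = genomes[ages >= BIRTH_AGE]          # each adult has one offspring per step
    if len(parents):
        flips = np.uint32(1) << rng.integers(0, GENOME_BITS, size=len(parents), dtype=np.uint32)
        genomes = np.concatenate([genomes, parents | flips])           # one new mutation each
        ages = np.concatenate([ages, np.zeros(len(parents), dtype=np.int32)])

print("surviving population after", STEPS, "steps:", len(genomes))
```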

  2. OBJECT KINETIC MONTE CARLO SIMULATIONS OF MICROSTRUCTURE EVOLUTION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nandipati, Giridhar; Setyawan, Wahyu; Heinisch, Howard L.

    2013-09-30

    The objective is to report the development of the flexible object kinetic Monte Carlo (OKMC) simulation code KSOME (kinetic simulation of microstructure evolution) which can be used to simulate microstructure evolution of complex systems under irradiation. In this report we briefly describe the capabilities of KSOME and present preliminary results for short term annealing of single cascades in tungsten at various primary-knock-on atom (PKA) energies and temperatures.

  3. SABRINA: an interactive three-dimensional geometry-modeling program for MCNP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    West, J.T. III

    SABRINA is a fully interactive three-dimensional geometry-modeling program for MCNP, a Los Alamos Monte Carlo code for neutron and photon transport. In SABRINA, a user constructs either body geometry or surface geometry models and debugs spatial descriptions for the resulting objects. This enhanced capability significantly reduces effort in constructing and debugging complicated three-dimensional geometry models for Monte Carlo analysis. 2 refs., 33 figs.

  4. QMCPACK: an open source ab initio quantum Monte Carlo package for the electronic structure of atoms, molecules and solids

    NASA Astrophysics Data System (ADS)

    Kim, Jeongnim; Baczewski, Andrew D.; Beaudet, Todd D.; Benali, Anouar; Chandler Bennett, M.; Berrill, Mark A.; Blunt, Nick S.; Josué Landinez Borda, Edgar; Casula, Michele; Ceperley, David M.; Chiesa, Simone; Clark, Bryan K.; Clay, Raymond C., III; Delaney, Kris T.; Dewing, Mark; Esler, Kenneth P.; Hao, Hongxia; Heinonen, Olle; Kent, Paul R. C.; Krogel, Jaron T.; Kylänpää, Ilkka; Li, Ying Wai; Lopez, M. Graham; Luo, Ye; Malone, Fionn D.; Martin, Richard M.; Mathuriya, Amrita; McMinis, Jeremy; Melton, Cody A.; Mitas, Lubos; Morales, Miguel A.; Neuscamman, Eric; Parker, William D.; Pineda Flores, Sergio D.; Romero, Nichols A.; Rubenstein, Brenda M.; Shea, Jacqueline A. R.; Shin, Hyeondeok; Shulenburger, Luke; Tillack, Andreas F.; Townsend, Joshua P.; Tubman, Norm M.; Van Der Goetz, Brett; Vincent, Jordan E.; ChangMo Yang, D.; Yang, Yubo; Zhang, Shuai; Zhao, Luning

    2018-05-01

    QMCPACK is an open source quantum Monte Carlo package for ab initio electronic structure calculations. It supports calculations of metallic and insulating solids, molecules, atoms, and some model Hamiltonians. Implemented real space quantum Monte Carlo algorithms include variational, diffusion, and reptation Monte Carlo. QMCPACK uses Slater–Jastrow type trial wavefunctions in conjunction with a sophisticated optimizer capable of optimizing tens of thousands of parameters. The orbital space auxiliary-field quantum Monte Carlo method is also implemented, enabling cross validation between different highly accurate methods. The code is specifically optimized for calculations with large numbers of electrons on the latest high performance computing architectures, including multicore central processing unit and graphical processing unit systems. We detail the program’s capabilities, outline its structure, and give examples of its use in current research calculations. The package is available at http://qmcpack.org.

  5. QMCPACK: an open source ab initio quantum Monte Carlo package for the electronic structure of atoms, molecules and solids.

    PubMed

    Kim, Jeongnim; Baczewski, Andrew T; Beaudet, Todd D; Benali, Anouar; Bennett, M Chandler; Berrill, Mark A; Blunt, Nick S; Borda, Edgar Josué Landinez; Casula, Michele; Ceperley, David M; Chiesa, Simone; Clark, Bryan K; Clay, Raymond C; Delaney, Kris T; Dewing, Mark; Esler, Kenneth P; Hao, Hongxia; Heinonen, Olle; Kent, Paul R C; Krogel, Jaron T; Kylänpää, Ilkka; Li, Ying Wai; Lopez, M Graham; Luo, Ye; Malone, Fionn D; Martin, Richard M; Mathuriya, Amrita; McMinis, Jeremy; Melton, Cody A; Mitas, Lubos; Morales, Miguel A; Neuscamman, Eric; Parker, William D; Pineda Flores, Sergio D; Romero, Nichols A; Rubenstein, Brenda M; Shea, Jacqueline A R; Shin, Hyeondeok; Shulenburger, Luke; Tillack, Andreas F; Townsend, Joshua P; Tubman, Norm M; Van Der Goetz, Brett; Vincent, Jordan E; Yang, D ChangMo; Yang, Yubo; Zhang, Shuai; Zhao, Luning

    2018-05-16

    QMCPACK is an open source quantum Monte Carlo package for ab initio electronic structure calculations. It supports calculations of metallic and insulating solids, molecules, atoms, and some model Hamiltonians. Implemented real space quantum Monte Carlo algorithms include variational, diffusion, and reptation Monte Carlo. QMCPACK uses Slater-Jastrow type trial wavefunctions in conjunction with a sophisticated optimizer capable of optimizing tens of thousands of parameters. The orbital space auxiliary-field quantum Monte Carlo method is also implemented, enabling cross validation between different highly accurate methods. The code is specifically optimized for calculations with large numbers of electrons on the latest high performance computing architectures, including multicore central processing unit and graphical processing unit systems. We detail the program's capabilities, outline its structure, and give examples of its use in current research calculations. The package is available at http://qmcpack.org.

  6. [Series: Medical Applications of the PHITS Code (2): Acceleration by Parallel Computing].

    PubMed

    Furuta, Takuya; Sato, Tatsuhiko

    2015-01-01

    Time-consuming Monte Carlo dose calculations have become feasible owing to the development of computer technology. However, recent progress stems from the emergence of multi-core high-performance computers, so parallel computing has become key to achieving good performance from software programs. The Monte Carlo simulation code PHITS contains two parallel computing functions: distributed-memory parallelization using the message passing interface (MPI) protocol and shared-memory parallelization using open multi-processing (OpenMP) directives. Users can choose between the two functions according to their needs. This paper explains the two functions together with their advantages and disadvantages. Some test applications are also provided to show their performance on a typical multi-core high-performance workstation.

  7. Development of a multi-modal Monte-Carlo radiation treatment planning system combined with PHITS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumada, Hiroaki; Nakamura, Takemi; Komeda, Masao

    A new multi-modal Monte-Carlo radiation treatment planning system is under development at the Japan Atomic Energy Agency (JAEA). This system (development code: JCDS-FX) builds on the fundamental technologies of JCDS. JCDS was developed by JAEA to perform treatment planning for boron neutron capture therapy (BNCT), which is being conducted at JRR-4 at JAEA. JCDS has many advantages based on practical accomplishments in actual clinical trials of BNCT at JRR-4, and these advantages have been carried over to JCDS-FX. One of the features of JCDS-FX is that PHITS is used for the particle transport calculation. PHITS is a multipurpose particle Monte-Carlo transport code, so its application makes it possible to evaluate doses not only for BNCT but also for several other radiotherapies such as proton therapy. To verify the calculation accuracy of JCDS-FX with PHITS for BNCT, treatment planning for an actual BNCT treatment conducted at JRR-4 was performed retrospectively. The verification results demonstrated that the new system is applicable to BNCT clinical trials in practical use. Within the framework of R&D for laser-driven proton therapy, we have begun studying the application of JCDS-FX combined with PHITS to proton therapy in addition to BNCT. Several features and performance results of the new multi-modal Monte-Carlo radiotherapy planning system are presented.

  8. CPMC-Lab: A MATLAB package for Constrained Path Monte Carlo calculations

    NASA Astrophysics Data System (ADS)

    Nguyen, Huy; Shi, Hao; Xu, Jie; Zhang, Shiwei

    2014-12-01

    We describe CPMC-Lab, a MATLAB program for the constrained-path and phaseless auxiliary-field Monte Carlo methods. These methods have allowed applications ranging from the study of strongly correlated models, such as the Hubbard model, to ab initio calculations in molecules and solids. The present package implements the full ground-state constrained-path Monte Carlo (CPMC) method in MATLAB with a graphical interface, using the Hubbard model as an example. The package can perform calculations in finite supercells in any dimensions, under periodic or twist boundary conditions. Importance sampling and all other algorithmic details of a total energy calculation are included and illustrated. This open-source tool allows users to experiment with various model and run parameters and visualize the results. It provides a direct and interactive environment to learn the method and study the code with minimal overhead for setup. Furthermore, the package can be easily generalized for auxiliary-field quantum Monte Carlo (AFQMC) calculations in many other models for correlated electron systems, and can serve as a template for developing a production code for AFQMC total energy calculations in real materials. Several illustrative studies are carried out in one- and two-dimensional lattices on total energy, kinetic energy, potential energy, and charge- and spin-gaps.

  9. Reconstruction of Human Monte Carlo Geometry from Segmented Images

    NASA Astrophysics Data System (ADS)

    Zhao, Kai; Cheng, Mengyun; Fan, Yanchang; Wang, Wen; Long, Pengcheng; Wu, Yican

    2014-06-01

    Human computational phantoms have been used extensively for scientific experimental analysis and experimental simulation. This article presents a method for human geometry reconstruction from a series of segmented images of a Chinese visible human dataset. The phantom geometry can describe the detailed structure of an organ and can be converted into the input files of Monte Carlo codes for dose calculation. A whole-body computational phantom of a Chinese adult female, named Rad-HUMAN and containing about 28.8 billion voxels, has been established by the FDS Team. For convenient processing, different organs in the images were segmented with different RGB colors and the voxels were assigned positions in the dataset. For refinement, the positions were first sampled. Although the large number of voxels inside an organ are three-dimensionally adjacent, there was no thorough merging method available to reduce the number of cells needed to describe an organ. In this study, the voxels on the organ surface were taken into account in the merging, which produces fewer cells for the organs. At the same time, an index-based sorting algorithm was put forward to enhance the merging speed. Finally, Rad-HUMAN, which includes a total of 46 organs and tissues, was described by cuboids in the Monte Carlo geometry for the simulation. The Monte Carlo geometry was constructed directly from the segmented images and the voxels were merged exhaustively. Each organ geometry model was constructed without ambiguity or self-crossing, and its geometry information can represent the accurate appearance and precise interior structure of the organs. The constructed geometry, which largely retains the original shape of the organs, can easily be written to the input files of different Monte Carlo codes such as MCNP. Its universality was demonstrated and its high performance was experimentally verified.

  10. Status of the Monte Carlo library least-squares (MCLLS) approach for non-linear radiation analyzer problems

    NASA Astrophysics Data System (ADS)

    Gardner, Robin P.; Xu, Libai

    2009-10-01

    The Center for Engineering Applications of Radioisotopes (CEAR) has been working for over a decade on the Monte Carlo library least-squares (MCLLS) approach for treating non-linear radiation analyzer problems including: (1) prompt gamma-ray neutron activation analysis (PGNAA) for bulk analysis, (2) energy-dispersive X-ray fluorescence (EDXRF) analyzers, and (3) carbon/oxygen tool analysis in oil well logging. This approach essentially consists of using Monte Carlo simulation to generate the libraries of all the elements to be analyzed plus any other required background libraries. These libraries are then used in the linear library least-squares (LLS) approach with unknown sample spectra to analyze for all elements in the sample. Iterations of this are used until the LLS values agree with the composition used to generate the libraries. The current status of the methods (and topics) necessary to implement the MCLLS approach is reported. This includes: (1) the Monte Carlo codes such as CEARXRF, CEARCPG, and CEARCO for forward generation of the necessary elemental library spectra for the LLS calculation for X-ray fluorescence, neutron capture prompt gamma-ray analyzers, and carbon/oxygen tools; (2) the correction of spectral pulse pile-up (PPU) distortion by Monte Carlo simulation with the code CEARIPPU; (3) generation of detector response functions (DRF) for detectors with linear and non-linear responses for Monte Carlo simulation of pulse-height spectra; and (4) the use of the differential operator (DO) technique to make the necessary iterations for non-linear responses practical. In addition to commonly analyzed single spectra, coincidence spectra or even two-dimensional (2-D) coincidence spectra can also be used in the MCLLS approach and may provide more accurate results.
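
    The linear library least-squares (LLS) step at the core of the MCLLS approach amounts to fitting an unknown spectrum as a non-negative combination of Monte Carlo generated library spectra. The sketch below uses synthetic Gaussian "libraries" and scipy.optimize.nnls; it illustrates a single LLS pass under these assumptions, not the CEAR codes themselves.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)
channels = np.arange(256)

def peak(center, width, amp=1.0):
    """Toy Gaussian peak standing in for a Monte Carlo generated library spectrum."""
    return amp * np.exp(-0.5 * ((channels - center) / width) ** 2)

# Elemental library spectra plus a flat background library (columns of the design matrix)
libraries = np.column_stack([peak(60, 5), peak(120, 8), peak(200, 6),
                             np.ones_like(channels, float)])
true_amounts = np.array([3.0, 1.5, 0.7, 0.2])

# "Unknown" measured spectrum = linear mix of the libraries plus Poisson counting noise
measured = rng.poisson(libraries @ true_amounts * 100) / 100.0

# Linear library least-squares with non-negativity (one iteration of the MCLLS loop)
amounts, residual = nnls(libraries, measured)
print("fitted amounts:", np.round(amounts, 3))
```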

  11. A Deterministic Transport Code for Space Environment Electrons

    NASA Technical Reports Server (NTRS)

    Nealy, John E.; Chang, C. K.; Norman, Ryan B.; Blattnig, Steve R.; Badavi, Francis F.; Adamczyk, Anne M.

    2010-01-01

    A deterministic computational procedure has been developed to describe transport of space environment electrons in various shield media. This code is an upgrade and extension of an earlier electron code. Whereas the former code was formulated on the basis of parametric functions derived from limited laboratory data, the present code utilizes well established theoretical representations to describe the relevant interactions and transport processes. The shield material specification has been made more general, as have the pertinent cross sections. A combined mean free path and average trajectory approach has been used in the transport formalism. Comparisons with Monte Carlo calculations are presented.

  12. Benchmark test of transport calculations of gold and nickel activation with implications for neutron kerma at Hiroshima.

    PubMed

    Hoshi, M; Hiraoka, M; Hayakawa, N; Sawada, S; Munaka, M; Kuramoto, A; Oka, T; Iwatani, K; Shizuma, K; Hasai, H

    1992-11-01

    A benchmark test of the Monte Carlo neutron and photon transport code system (MCNP) was performed using a 252Cf fission neutron source to validate the use of the code for the energy spectrum analyses of Hiroshima atomic bomb neutrons. Nuclear data libraries used in the Monte Carlo neutron and photon transport code calculation were ENDF/B-III, ENDF/B-IV, LASL-SUB, and ENDL-73. The neutron moderators used were granite (the main component of which is SiO2, with a small fraction of hydrogen), Newlight [polyethylene with 3.7% boron (natural)], ammonium chloride (NH4Cl), and water (H2O). Each moderator was 65 cm thick. The neutron detectors were gold and nickel foils, which were used to detect thermal and epithermal neutrons (4.9 eV) and fast neutrons (> 0.5 MeV), respectively. Measured activity data from neutron-irradiated gold and nickel foils in these moderators decreased to about 1/1,000th or 1/10,000th, which correspond to about 1,500 m ground distance from the hypocenter in Hiroshima. For both gold and nickel detectors, the measured activities and the calculated values agreed within 10%. The slopes of the depth-yield relations in each moderator, except granite, were similar for neutrons detected by the gold and nickel foils. From the results of these studies, the Monte Carlo neutron and photon transport code was verified to be accurate enough for use with the elements hydrogen, carbon, nitrogen, oxygen, silicon, chlorine, and cadmium, and for the incident 252Cf fission spectrum neutrons.

  13. CGRO Guest Investigator Program

    NASA Technical Reports Server (NTRS)

    Begelman, Mitchell C.

    1997-01-01

    The following are highlights from the research supported by this grant: (1) Theory of gamma-ray blazars: We studied the theory of gamma-ray blazars, being among the first investigators to propose that the GeV emission arises from Comptonization of diffuse radiation surrounding the jet, rather than from the synchrotron-self-Compton mechanism. In related work, we uncovered possible connections between the mechanisms of gamma-ray blazars and those of intraday radio variability, and have conducted a general study of the role of Compton radiation drag on the dynamics of relativistic jets. (2) A Nonlinear Monte Carlo code for gamma-ray spectrum formation: We developed, tested, and applied the first Nonlinear Monte Carlo (NLMC) code for simulating gamma-ray production and transfer under much more general (and realistic) conditions than are accessible with other techniques. The present version of the code is designed to simulate conditions thought to be present in active galactic nuclei and certain types of X-ray binaries, and includes the physics needed to model thermal and nonthermal electron-positron pair cascades. Unlike traditional Monte-Carlo techniques, our method can accurately handle highly non-linear systems in which the radiation and particle backgrounds must be determined self-consistently and in which the particle energies span many orders of magnitude. Unlike models based on kinetic equations, our code can handle arbitrary source geometries and relativistic kinematic effects. In its first important application following testing, we showed that popular semi-analytic accretion disk corona models for Seyfert spectra are seriously in error, and demonstrated how the spectra can be simulated if the disk is sparsely covered by localized 'flares'.

  14. Error threshold for color codes and random three-body Ising models.

    PubMed

    Katzgraber, Helmut G; Bombin, H; Martin-Delgado, M A

    2009-08-28

    We study the error threshold of color codes, a class of topological quantum codes that allow a direct implementation of quantum Clifford gates suitable for entanglement distillation, teleportation, and fault-tolerant quantum computation. We map the error-correction process onto a statistical mechanical random three-body Ising model and study its phase diagram via Monte Carlo simulations. The obtained error threshold of p(c) = 0.109(2) is very close to that of Kitaev's toric code, showing that enhanced computational capabilities do not necessarily imply lower resistance to noise.

  15. KEWPIE: A dynamical cascade code for decaying excited compound nuclei

    NASA Astrophysics Data System (ADS)

    Bouriquet, Bertrand; Abe, Yasuhisa; Boilley, David

    2004-05-01

    A new dynamical cascade code for decaying hot nuclei is proposed and specially adapted to the synthesis of super-heavy nuclei. In such a case, the channel of interest is the tiny fraction of events that decay through particle emission; the code therefore avoids classical Monte-Carlo methods and proposes a new numerical scheme. The time dependence is explicitly taken into account in order to cope with the fact that the fission decay rate might not be constant. The code allows the evaluation of both statistical and dynamical observables. Results are successfully compared to experimental data.

  16. Monte Carlo N Particle code - Dose distribution of clinical electron beams in inhomogeneous phantoms

    PubMed Central

    Nedaie, H. A.; Mosleh-Shirazi, M. A.; Allahverdi, M.

    2013-01-01

    Electron dose distributions calculated using the currently available analytical methods can be associated with large uncertainties. The Monte Carlo method is the most accurate method for dose calculation in electron beams. Most of the clinical electron beam simulation studies have been performed using non-MCNP (Monte Carlo N-Particle) codes. Given the differences between Monte Carlo codes, this work aims to evaluate the accuracy of MCNP4C-simulated electron dose distributions in a homogeneous phantom and around inhomogeneities. Different types of phantoms ranging in complexity were used; namely, a homogeneous water phantom and phantoms made of polymethyl methacrylate slabs containing different-sized, low- and high-density inserts of heterogeneous materials. Electron beams with 8 and 15 MeV nominal energy generated by an Elekta Synergy linear accelerator were investigated. Measurements were performed for a 10 cm × 10 cm applicator at a source-to-surface distance of 100 cm. Individual parts of the beam-defining system were introduced into the simulation one at a time in order to show their effect on depth doses. In contrast to the first scattering foil, the secondary scattering foil, X and Y jaws and applicator provide up to 5% of the dose. A 2%/2 mm agreement between MCNP and measurements was found in the homogeneous phantom; in the presence of heterogeneities, agreement was in the range of 1-3%, generally within 2% of the measurements for both energies in a "complex" phantom. A full-component simulation is necessary in order to obtain a realistic model of the beam. The MCNP4C results agree well with the measured electron dose distributions. PMID:23533162

  17. Unfolding linac photon spectra and incident electron energies from experimental transmission data, with direct independent validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ali, E. S. M.; McEwen, M. R.; Rogers, D. W. O.

    2012-11-15

    Purpose: In a recent computational study, an improved physics-based approach was proposed for unfolding linac photon spectra and incident electron energies from transmission data. In this approach, energy differentiation is improved by simultaneously using transmission data for multiple attenuators and detectors, and the unfolding robustness is improved by using a four-parameter functional form to describe the photon spectrum. The purpose of the current study is to validate this approach experimentally, and to demonstrate its application on a typical clinical linac. Methods: The validation makes use of the recent transmission measurements performed on the Vickers research linac of National Research Council Canada. For this linac, the photon spectra were previously measured using a NaI detector, and the incident electron parameters are independently known. The transmission data are for eight beams in the range 10-30 MV using thick Be, Al and Pb bremsstrahlung targets. To demonstrate the approach on a typical clinical linac, new measurements are performed on an Elekta Precise linac for 6, 10 and 25 MV beams. The different experimental setups are modeled using EGSnrc, with the newly added photonuclear attenuation included. Results: For the validation on the research linac, the 95% confidence bounds of the unfolded spectra fall within the noise of the NaI data. The unfolded spectra agree with the EGSnrc spectra (calculated using independently known electron parameters) with RMS energy fluence deviations of 4.5%. The accuracy of unfolding the incident electron energy is shown to be approximately 3%. A transmission cutoff of only 10% is suitable for accurate unfolding, provided that the other components of the proposed approach are implemented. For the demonstration on a clinical linac, the unfolded incident electron energies and their 68% confidence bounds for the 6, 10 and 25 MV beams are 6.1 ± 0.1, 9.3 ± 0.1, and 19.3 ± 0.2 MeV, respectively. The unfolded spectra for the clinical linac agree with the EGSnrc spectra (calculated using the unfolded electron energies) with RMS energy fluence deviations of 3.7%. The corresponding measured and EGSnrc-calculated transmission data agree within 1.5%, where the typical transmission measurement uncertainty on the clinical linac is 0.4% (not including the uncertainties on the incident electron parameters). Conclusions: The approach proposed in an earlier study for unfolding photon spectra and incident electron energies from transmission data is accurate and practical for clinical use.

  18. Effect of improved TLD dosimetry on the determination of dose rate constants for {sup 125}I and {sup 103}Pd brachytherapy seeds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rodriguez, M., E-mail: manuel.rodriguez@rmp.uhn.ca; Rogers, D. W. O.

    Purpose: To more accurately account for the relative intrinsic energy dependence and relative absorbed-dose energy dependence of TLDs when used to measure dose rate constants (DRCs) for 125I and 103Pd brachytherapy seeds, to thereby establish revised “measured values” for all seeds and compare the revised values with Monte Carlo and consensus values. Methods: The relative absorbed-dose energy dependence, f^rel, for TLDs and the phantom correction, P_phant, are calculated for 125I and 103Pd seeds using the EGSnrc BrachyDose and DOSXYZnrc codes. The original energy dependence and phantom corrections applied to DRC measurements are replaced by calculated (f^rel)^-1 and P_phant values for 24 different seed models. By comparing the modified measured DRCs to the MC values, an appropriate relative intrinsic energy dependence, k_bq^rel, is determined. The new P_phant values and relative absorbed-dose sensitivities, S_AD^rel, calculated as the product of (f^rel)^-1 and (k_bq^rel)^-1, are used to individually revise the measured DRCs for comparison with Monte Carlo calculated values and TG-43U1 or TG-43U1S1 consensus values. Results: In general, f^rel is sensitive to the energy spectra and models of the brachytherapy seeds. Values may vary up to 8.4% among 125I and 103Pd seed models and common TLD shapes. P_phant values depend primarily on the isotope used. Deduced (k_bq^rel)^-1 values are 1.074 ± 0.015 and 1.084 ± 0.026 for 125I and 103Pd seeds, respectively. For (1 mm)^3 chips, this implies an overall absorbed-dose sensitivity relative to 60Co or 6 MV calibrations of 1.51 ± 1% and 1.47 ± 2% for 125I and 103Pd seeds, respectively, as opposed to the widely used value of 1.41. Values of P_phant calculated here have much lower statistical uncertainties than literature values, but systematic uncertainties from density and composition uncertainties are significant. Using these revised values with the literature’s DRC measurements, the average discrepancies between revised measured values and Monte Carlo values are 1.2% and 0.2% for 125I and 103Pd seeds, respectively, compared to average discrepancies for the original measured values of 4.8%. On average, the revised measured values are 4.3% and 5.9% lower than the original measured values for 103Pd and 125I seeds, respectively. The average of revised DRCs and Monte Carlo values is 3.8% and 2.8% lower for 125I and 103Pd seeds, respectively, than the consensus values in TG-43U1 or TG-43U1S1. Conclusions: This work shows that f^rel is TLD shape and seed model dependent, suggesting a need to update the generalized energy response dependence, i.e., relative absorbed-dose sensitivity, measured 25 years ago and applied often to DRC measurements of 125I and 103Pd brachytherapy seeds. The intrinsic energy dependence for LiF TLDs deduced here is consistent with previous dosimetry studies and emphasizes the need to revise the DRC consensus values reported by TG-43U1 or TG-43U1S1.

  19. On determining dose rate constants spectroscopically

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rodriguez, M.; Rogers, D. W. O.

    2013-01-15

    Purpose: To investigate several aspects of the Chen and Nath spectroscopic method of determining the dose rate constants of 125I and 103Pd seeds [Z. Chen and R. Nath, Phys. Med. Biol. 55, 6089-6104 (2010)], including the accuracy of using a line or dual-point source approximation as done in their method, and the accuracy of ignoring the effects of the scattered photons in the spectra. Additionally, the authors investigate the accuracy of the literature's many different spectra for bare, i.e., unencapsulated 125I and 103Pd sources. Methods: Spectra generated by 14 125I and 6 103Pd seeds were calculated in vacuo at 10 cm from the source in a 2.7 × 2.7 × 0.05 cm^3 voxel using the EGSnrc BrachyDose Monte Carlo code. Calculated spectra used the initial photon spectra recommended by AAPM's TG-43U1 and NCRP (National Council of Radiation Protection and Measurements) Report 58 for the 125I seeds, or TG-43U1 and NNDC(2000) (National Nuclear Data Center, 2000) for 103Pd seeds. The emitted spectra were treated as coming from a line or dual-point source in a Monte Carlo simulation to calculate the dose rate constant. The TG-43U1 definition of the dose rate constant was used. These calculations were performed using the full spectrum including scattered photons or using only the main peaks in the spectrum as done experimentally. Statistical uncertainties on the air kerma/history and the dose rate/history were ≤0.2%. The dose rate constants were also calculated using Monte Carlo simulations of the full seed model. Results: The ratio of the intensity of the 31 keV line relative to that of the main peak in 125I spectra is, on average, 6.8% higher when calculated with the NCRP Report 58 initial spectrum vs that calculated with the TG-43U1 initial spectrum. The 103Pd spectra exhibit an average 6.2% decrease in the 22.9 keV line relative to the main peak when calculated with the TG-43U1 rather than the NNDC(2000) initial spectrum. The measured values from three different investigations are in much better agreement with the calculations using the NCRP Report 58 and NNDC(2000) initial spectra, with average discrepancies of 0.9% and 1.7% for the 125I and 103Pd seeds, respectively. However, there are no differences in the calculated TG-43U1 brachytherapy parameters using either initial spectrum in both cases. Similarly, there were no differences outside the statistical uncertainties of 0.1% or 0.2% in the average energy, air kerma/history, dose rate/history, and dose rate constant when calculated using either the full photon spectrum or the main-peaks-only spectrum. Conclusions: Our calculated dose rate constants based on using the calculated on-axis spectrum and a line or dual-point source model are in excellent agreement (0.5% on average) with the values of Chen and Nath, verifying the accuracy of their more approximate method of going from the spectrum to the dose rate constant. However, the dose rate constants based on full seed models differ by between +4.6% and -1.5% from those based on the line or dual-point source approximations. These results suggest that the main value of spectroscopic measurements is to verify full Monte Carlo models of the seeds by comparison to the calculated spectra.

  20. Fixed forced detection for fast SPECT Monte-Carlo simulation

    NASA Astrophysics Data System (ADS)

    Cajgfinger, T.; Rit, S.; Létang, J. M.; Halty, A.; Sarrut, D.

    2018-03-01

    Monte-Carlo simulations of SPECT images are notoriously slow to converge due to the large ratio between the number of photons emitted and detected in the collimator. This work proposes a method to accelerate the simulations based on fixed forced detection (FFD) combined with an analytical response of the detector. FFD is based on a Monte-Carlo simulation but forces the detection of a photon in each detector pixel weighted by the probability of emission (or scattering) and transmission to this pixel. The method was evaluated with numerical phantoms and on patient images. We obtained differences with analog Monte Carlo lower than the statistical uncertainty. The overall computing time gain can reach up to five orders of magnitude. Source code and examples are available in the Gate V8.0 release.
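
    The essence of forced detection is that, at each emission (or scattering) vertex, a weighted contribution is scored in every detector pixel, the weight being the probability of emission toward that pixel times the transmission along the ray. The geometry-free sketch below illustrates that weight for an isotropic emission vertex; the solid angles and attenuation path integrals are made-up inputs, and the actual Gate implementation is considerably more involved.

```python
import numpy as np

def forced_detection_weights(solid_angles, mu_lengths):
    """Per-pixel forced-detection weight for one isotropic emission vertex.

    solid_angles : solid angle (sr) subtended by each detector pixel at the vertex
    mu_lengths   : line integral of the attenuation coefficient along the ray to each pixel
    """
    p_emit = solid_angles / (4.0 * np.pi)        # probability of emission toward the pixel
    p_transmit = np.exp(-mu_lengths)             # survival (no interaction) along the ray
    return p_emit * p_transmit

# Toy 8-pixel detector row: equal solid angles, increasing attenuation paths
solid_angles = np.full(8, 1e-3)
mu_lengths = np.linspace(0.5, 3.0, 8)
w = forced_detection_weights(solid_angles, mu_lengths)
print(np.round(w, 8))
```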

  1. Fixed forced detection for fast SPECT Monte-Carlo simulation.

    PubMed

    Cajgfinger, T; Rit, S; Létang, J M; Halty, A; Sarrut, D

    2018-03-02

    Monte-Carlo simulations of SPECT images are notoriously slow to converge due to the large ratio between the number of photons emitted and detected in the collimator. This work proposes a method to accelerate the simulations based on fixed forced detection (FFD) combined with an analytical response of the detector. FFD is based on a Monte-Carlo simulation but forces the detection of a photon in each detector pixel weighted by the probability of emission (or scattering) and transmission to this pixel. The method was evaluated with numerical phantoms and on patient images. We obtained differences with analog Monte Carlo lower than the statistical uncertainty. The overall computing time gain can reach up to five orders of magnitude. Source code and examples are available in the Gate V8.0 release.

  2. On the uncertainties of photon mass energy-absorption coefficients and their ratios for radiation dosimetry

    NASA Astrophysics Data System (ADS)

    Andreo, Pedro; Burns, David T.; Salvat, Francesc

    2012-04-01

    A systematic analysis of the available data has been carried out for mass energy-absorption coefficients and their ratios for air, graphite and water for photon energies between 1 keV and 2 MeV, using representative kilovoltage x-ray spectra for mammography and diagnostic radiology below 100 kV, and for 192Ir and 60Co gamma-ray spectra. The aim of this work was to establish ‘an envelope of uncertainty’ based on the spread of the available data. Type A uncertainties were determined from the results of Monte Carlo (MC) calculations with the PENELOPE and EGSnrc systems, yielding mean values for µen/ρ with a given statistical standard uncertainty. Type B estimates were based on two groupings. The first grouping consisted of MC calculations based on a similar implementation but using different data and/or approximations. The second grouping was formed by various datasets, obtained by different authors or methods using the same or different basic data, and with different implementations (analytical, MC-based, or a combination of the two); these datasets were the compilations of NIST, Hubbell, Johns-Cunningham, Attix and Higgins, plus MC calculations with PENELOPE and EGSnrc. The combined standard uncertainty, uc, for the µen/ρ values for the mammography x-ray spectra is 2.5%, decreasing gradually to 1.6% for kilovoltage x-ray spectra up to 100 kV. For 60Co and 192Ir, uc is approximately 0.1%. The Type B uncertainty analysis for the ratios of µen/ρ values includes four methods of analysis and concludes that for the present data the assumption that the data interval represents 95% confidence limits is a good compromise. For the mammography x-ray spectra, the combined standard uncertainties of (µen/ρ)graphite,air and (µen/ρ)graphite,water are 1.5%, and 0.5% for (µen/ρ)water,air, decreasing gradually down to uc = 0.1% for the three µen/ρ ratios for the gamma-ray spectra. The present estimates are shown to coincide well with those of Hubbell (1977 Rad. Res. 70 58-81), except for the lowest energy range (radiodiagnostic) where it is concluded that current databases and their systematic analysis represent an improvement over the older Hubbell estimations. The results for (µen/ρ)graphite,air for the gamma-ray dosimetry range are moderately higher than those of Seltzer and Bergstrom (2005 private communication).
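
    The combined standard uncertainties quoted above follow the usual quadrature rule u_c = sqrt(u_A^2 + sum of u_B,i^2). A one-line numerical illustration with made-up component values (not the paper's actual uncertainty budget) is shown below.

```python
import numpy as np

# Hypothetical uncertainty budget for one mu_en/rho value (all in %, standard uncertainties)
type_a = 0.05                  # Type A: statistical, from the Monte Carlo calculation
type_b = [1.5, 2.0]            # Type B: spread between codes/datasets, illustrative values
u_c = np.sqrt(type_a ** 2 + np.sum(np.square(type_b)))
print(f"combined standard uncertainty u_c = {u_c:.2f} %")
```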

  3. Validation of the analytical methods in the LWR code BOXER for gadolinium-loaded fuel pins

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paratte, J.M.; Arkuszewski, J.J.; Kamboj, B.K.

    1990-01-01

    Due to the very high absorption occurring in gadolinium-loaded fuel pins, calculations of lattices with such pins present are a demanding test of the analysis methods in light water reactor (LWR) cell and assembly codes. Considerable effort has, therefore, been devoted to the validation of code methods for gadolinia fuel. The goal of the work reported in this paper is to check the analysis methods in the LWR cell/assembly code BOXER and its associated cross-section processing code ETOBOX, by comparison of BOXER results with those from a very accurate Monte Carlo calculation for a gadolinium benchmark problem. Initial results of such a comparison have been previously reported. However, the Monte Carlo calculations, done with the MCNP code, were performed at Los Alamos National Laboratory using ENDF/B-V data, while the BOXER calculations were performed at the Paul Scherrer Institute using JEF-1 nuclear data. This difference in the basic nuclear data used for the two calculations, caused by the restricted nature of these evaluated data files, led to associated uncertainties in a comparison of the results for methods validation. In the joint investigations at the Georgia Institute of Technology and PSI, such uncertainty in this comparison was eliminated by using ENDF/B-V data for BOXER calculations at Georgia Tech.

  4. Methodology comparison for gamma-heating calculations in material-testing reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lemaire, M.; Vaglio-Gaudard, C.; Lyoussi, A.

    2015-07-01

    The Jules Horowitz Reactor (JHR) is a Material-Testing Reactor (MTR) under construction in the south of France at CEA Cadarache (French Alternative Energies and Atomic Energy Commission). It will typically host about 20 simultaneous irradiation experiments in the core and in the beryllium reflector. These experiments will help us better understand the complex phenomena occurring during the accelerated ageing of materials and the irradiation of nuclear fuels. Gamma heating, i.e. photon energy deposition, is mainly responsible for temperature rise in non-fuelled zones of nuclear reactors, including JHR internal structures and irradiation devices. As temperature is a key parameter for physical models describing the behavior of material, accurate control of temperature, and hence gamma heating, is required in irradiation devices and samples in order to perform an advanced, suitable analysis of future experimental results. From a broader point of view, JHR global attractiveness as an MTR depends on its ability to monitor experimental parameters with high accuracy, including gamma heating. Strict control of temperature levels is also necessary in terms of safety. As JHR structures are warmed up by gamma heating, they must be appropriately cooled down to prevent creep deformation or melting. Cooling-power sizing is based on calculated levels of gamma heating in the JHR. Due to these safety concerns, accurate calculation of gamma heating with well-controlled bias and associated uncertainty as low as possible is all the more important. There are two main kinds of calculation bias: bias coming from nuclear data on the one hand and bias coming from physical approximations assumed by computer codes and by the general calculation route on the other hand. The former must be determined by comparison between calculation and experimental data; the latter by calculation comparisons between codes and between methodologies. In this presentation, we focus on this latter kind of bias. Nuclear heating is represented by the physical quantity called absorbed dose (energy deposition induced by particle-matter interactions, divided by mass). Its calculation with Monte Carlo codes is possible but computationally expensive as it requires transport simulation of charged particles, along with neutrons and photons. For that reason, the calculation of another physical quantity, called KERMA, is often preferred, as KERMA calculation with Monte Carlo codes only requires transport of neutral particles. However, KERMA is only an estimator of the absorbed dose and many conditions must be fulfilled for KERMA to be equal to absorbed dose, including the so-called condition of electronic equilibrium. Also, Monte Carlo computations of absorbed dose still present some physical approximations, even though there is only a limited number of them. Some of these approximations are linked to the way Monte Carlo codes handle the transport simulation of charged particles and the productive and destructive interactions between photons, electrons and positrons. There exists a huge variety of electromagnetic shower models which tackle this topic. Differences in the implementation of these models can lead to discrepancies in calculated values of absorbed dose between different Monte Carlo codes. The order of magnitude of such potential discrepancies should be quantified for JHR gamma-heating calculations. We consequently present a two-pronged plan.
In a first phase, we intend to perform comparative absorbed dose/KERMA Monte Carlo calculations in the JHR. This way, we will study the presence or absence of electronic equilibrium in the different JHR structures and experimental devices and we will give recommendations for the choice of KERMA or absorbed dose when calculating gamma heating in the JHR. In a second phase, we intend to perform comparative TRIPOLI4/MCNP absorbed dose calculations in a simplified JHR-representative geometry. For this comparison, we will use the same nuclear data library for both codes (the European library JEFF3.1.1 and photon library EPDL97) so as to isolate the effects of electromagnetic shower models on absorbed dose calculation. This way, we hope to get insightful feedback on these models and their implementation in Monte Carlo codes. (authors)
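
    For reference, the KERMA discussed above is obtained from the photon fluence spectrum and the mass energy-transfer coefficient, and the absorbed dose approaches the collision KERMA only under charged-particle (electronic) equilibrium; these are standard dosimetry relations, not results of the JHR study:

        K = \int \Phi_E(E)\, E\, \frac{\mu_{\mathrm{tr}}(E)}{\rho}\, \mathrm{d}E,
        \qquad
        K_{\mathrm{col}} = \int \Phi_E(E)\, E\, \frac{\mu_{\mathrm{en}}(E)}{\rho}\, \mathrm{d}E,
        \qquad
        D \simeq K_{\mathrm{col}} \quad \text{(electronic equilibrium, negligible radiative losses)}.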

  5. Chemical application of diffusion quantum Monte Carlo

    NASA Technical Reports Server (NTRS)

    Reynolds, P. J.; Lester, W. A., Jr.

    1984-01-01

    The diffusion quantum Monte Carlo (QMC) method gives a stochastic solution to the Schroedinger equation. This approach is receiving increasing attention in chemical applications as a result of its high accuracy. However, reducing statistical uncertainty remains a priority because chemical effects are often obtained as small differences of large numbers. As an example, the singlet-triplet splitting of the energy of the methylene molecule CH2 is given. The QMC algorithm was implemented on the CYBER 205, first as a direct transcription of the algorithm running on the VAX 11/780, and second by explicitly writing vector code for all loops longer than a crossover length C. The speed of the codes relative to one another as a function of C, and relative to the VAX, are discussed. The computational time dependence obtained versus the number of basis functions is discussed and this is compared with that obtained from traditional quantum chemistry codes and that obtained from traditional computer architectures.

  6. A fully-implicit Particle-In-Cell Monte Carlo Collision code for the simulation of inductively coupled plasmas

    NASA Astrophysics Data System (ADS)

    Mattei, S.; Nishida, K.; Onai, M.; Lettry, J.; Tran, M. Q.; Hatayama, A.

    2017-12-01

    We present a fully-implicit electromagnetic Particle-In-Cell Monte Carlo collision code, called NINJA, written for the simulation of inductively coupled plasmas. NINJA employs a kinetic enslaved Jacobian-Free Newton Krylov method to solve self-consistently the interaction between the electromagnetic field generated by the radio-frequency coil and the plasma response. The simulated plasma includes a kinetic description of charged and neutral species as well as the collision processes between them. The algorithm allows simulations with cell sizes much larger than the Debye length and time steps in excess of the Courant-Friedrichs-Lewy condition whilst preserving the conservation of the total energy. The code is applied to the simulation of the plasma discharge of the Linac4 H- ion source at CERN. Simulation results of plasma density, temperature and EEDF are discussed and compared with optical emission spectroscopy measurements. A systematic study of the energy conservation as a function of the numerical parameters is presented.
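
    To make the stability constraints mentioned in this abstract concrete, the snippet below evaluates the electron Debye length and the light-wave CFL time-step limit that an explicit electromagnetic PIC scheme would have to respect; the fully-implicit scheme described above is designed to exceed both. The plasma parameters are illustrative values, not those of the Linac4 source.

        import math

        EPS0 = 8.8541878128e-12   # vacuum permittivity [F/m]
        QE = 1.602176634e-19      # elementary charge [C]
        C = 2.99792458e8          # speed of light [m/s]

        def debye_length(n_e, t_e_ev):
            """Electron Debye length: sqrt(eps0 * kB*Te / (n_e * e^2)), with Te given in eV."""
            return math.sqrt(EPS0 * t_e_ev * QE / (n_e * QE**2))

        def cfl_time_step(dx):
            """1-D light-wave CFL limit for an explicit EM-PIC scheme: dt <= dx / c."""
            return dx / C

        ld = debye_length(n_e=1e18, t_e_ev=5.0)   # roughly 17 micrometres
        dt = cfl_time_step(dx=1e-3)               # roughly 3.3 picoseconds for a 1 mm cell
        print(f"Debye length = {ld:.2e} m, CFL time step = {dt:.2e} s")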

  7. Dynamic Monte Carlo simulations of radiatively accelerated GRB fireballs

    NASA Astrophysics Data System (ADS)

    Chhotray, Atul; Lazzati, Davide

    2018-05-01

    We present a novel Dynamic Monte Carlo code (DynaMo code) that self-consistently simulates the Compton-scattering-driven dynamic evolution of a plasma. We use the DynaMo code to investigate the time-dependent expansion and acceleration of dissipationless gamma-ray burst fireballs by varying their initial opacities and baryonic content. We study the opacity and energy density evolution of an initially optically thick, radiation-dominated fireball across its entire phase space - in particular during the R_ph < R_sat regime. Our results reveal new phases of fireball evolution: a transition phase with a radial extent of several orders of magnitude - the fireball transitions from Γ ∝ R to Γ ∝ R^0, a post-photospheric acceleration phase - where fireballs accelerate beyond the photosphere and a Thomson-dominated acceleration phase - characterized by slow acceleration of optically thick, matter-dominated fireballs due to Thomson scattering. We quantify the new phases by providing analytical expressions of Lorentz factor evolution, which will be useful for deriving jet parameters.

  8. High-Throughput Characterization of Porous Materials Using Graphics Processing Units

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Jihan; Martin, Richard L.; Rübel, Oliver

    We have developed a high-throughput graphics processing unit (GPU) code that can characterize a large database of crystalline porous materials. In our algorithm, the GPU is utilized to accelerate energy grid calculations where the grid values represent interactions (i.e., Lennard-Jones + Coulomb potentials) between gas molecules (i.e., CH4 and CO2) and the material's framework atoms. Using a parallel flood fill CPU algorithm, inaccessible regions inside the framework structures are identified and blocked based on their energy profiles. Finally, we compute the Henry coefficients and heats of adsorption through statistical Widom insertion Monte Carlo moves in the domain restricted to the accessible space. The code offers significant speedup over a single core CPU code and allows us to characterize a set of porous materials at least an order of magnitude larger than ones considered in earlier studies. For structures selected from such a prescreening algorithm, full adsorption isotherms can be calculated by conducting multiple grand canonical Monte Carlo simulations concurrently within the GPU.
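
    The Widom insertion step mentioned above can be sketched in a few lines: the Henry coefficient is proportional to the Boltzmann-weighted average of the insertion energy over the accessible grid points (the proportionality prefactor depends on the unit convention and framework density and is omitted here). The energy grid below is a random toy array, not a real framework.

        import numpy as np

        def widom_boltzmann_average(energy_grid_kjmol, accessible_mask, temperature=298.0):
            """Average of exp(-beta * U) over accessible grid points; the Henry coefficient
            is proportional to this quantity."""
            r_gas = 8.314462618e-3                      # kJ/(mol K)
            beta = 1.0 / (r_gas * temperature)
            u = energy_grid_kjmol[accessible_mask]
            return np.mean(np.exp(-beta * u))

        rng = np.random.default_rng(0)
        grid = rng.normal(loc=-5.0, scale=3.0, size=(32, 32, 32))   # hypothetical energies [kJ/mol]
        mask = grid < 50.0                                          # points flagged as accessible
        print(widom_boltzmann_average(grid, mask))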

  9. A comparative study of history-based versus vectorized Monte Carlo methods in the GPU/CUDA environment for a simple neutron eigenvalue problem

    NASA Astrophysics Data System (ADS)

    Liu, Tianyu; Du, Xining; Ji, Wei; Xu, X. George; Brown, Forrest B.

    2014-06-01

    For nuclear reactor analysis such as the neutron eigenvalue calculations, the time consuming Monte Carlo (MC) simulations can be accelerated by using graphics processing units (GPUs). However, traditional MC methods are often history-based, and their performance on GPUs is affected significantly by the thread divergence problem. In this paper we describe the development of a newly designed event-based vectorized MC algorithm for solving the neutron eigenvalue problem. The code was implemented using NVIDIA's Compute Unified Device Architecture (CUDA), and tested on a NVIDIA Tesla M2090 GPU card. We found that although the vectorized MC algorithm greatly reduces the occurrence of thread divergence thus enhancing the warp execution efficiency, the overall simulation speed is roughly ten times slower than the history-based MC code on GPUs. Profiling results suggest that the slow speed is probably due to the memory access latency caused by the large amount of global memory transactions. Possible solutions to improve the code efficiency are discussed.
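
    The contrast between the two algorithm families can be sketched with a toy absorption/scattering model: a history-based loop follows one particle through all of its events (divergent control flow), whereas an event-based loop processes one event for a whole batch of particles at a time, which is the pattern that maps onto SIMD/GPU hardware. This is a schematic illustration, not the authors' CUDA code.

        import numpy as np

        rng = np.random.default_rng(1)

        def history_based(n_particles, absorb_prob=0.3):
            """Follow each particle until absorption, one history at a time."""
            counts = []
            for _ in range(n_particles):
                n = 0
                while rng.random() > absorb_prob:      # particle scatters and survives
                    n += 1
                counts.append(n)
            return np.array(counts)

        def event_based(n_particles, absorb_prob=0.3):
            """Process the whole batch one event step at a time (uniform work per step)."""
            alive = np.ones(n_particles, dtype=bool)
            counts = np.zeros(n_particles, dtype=int)
            while alive.any():
                absorbed = rng.random(n_particles) <= absorb_prob
                counts[alive & ~absorbed] += 1
                alive &= ~absorbed
            return counts

        # both estimate the same mean number of scatters, (1 - p) / p ~ 2.33
        print(history_based(10000).mean(), event_based(10000).mean())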

  10. Fast quantum Monte Carlo on a GPU

    NASA Astrophysics Data System (ADS)

    Lutsyshyn, Y.

    2015-02-01

    We present a scheme for the parallelization of quantum Monte Carlo method on graphical processing units, focusing on variational Monte Carlo simulation of bosonic systems. We use asynchronous execution schemes with shared memory persistence, and obtain an excellent utilization of the accelerator. The CUDA code is provided along with a package that simulates liquid helium-4. The program was benchmarked on several models of Nvidia GPU, including Fermi GTX560 and M2090, and the Kepler architecture K20 GPU. Special optimization was developed for the Kepler cards, including placement of data structures in the register space of the Kepler GPUs. Kepler-specific optimization is discussed.

  11. Improved radial dose function estimation using current version MCNP Monte-Carlo simulation: Model 6711 and ISC3500 125I brachytherapy sources.

    PubMed

    Duggan, Dennis M

    2004-12-01

    Improved cross-sections in a new version of the Monte-Carlo N-particle (MCNP) code may eliminate discrepancies between radial dose functions (as defined by American Association of Physicists in Medicine Task Group 43) derived from Monte-Carlo simulations of low-energy photon-emitting brachytherapy sources and those from measurements on the same sources with thermoluminescent dosimeters. This is demonstrated for two 125I brachytherapy seed models, the Implant Sciences Model ISC3500 (I-Plant) and the Amersham Health Model 6711, by simulating their radial dose functions with two versions of MCNP, 4c2 and 5.
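
    For context, the TG-43 radial dose function that these simulations estimate is defined (line-source approximation) as

        g_L(r) = \frac{\dot{D}(r,\theta_0)}{\dot{D}(r_0,\theta_0)}
                 \cdot \frac{G_L(r_0,\theta_0)}{G_L(r,\theta_0)},

    with reference point r_0 = 1 cm, θ_0 = π/2, and G_L the line-source geometry function.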

  12. A preliminary Monte Carlo study for the treatment head of a carbon-ion radiotherapy facility using TOPAS

    NASA Astrophysics Data System (ADS)

    Liu, Hongdong; Zhang, Lian; Chen, Zhi; Liu, Xinguo; Dai, Zhongying; Li, Qiang; Xu, Xie George

    2017-09-01

    In medical physics it is desirable to have a Monte Carlo code that is less complex, reliable yet flexible for dose verification, optimization, and component design. TOPAS is a newly developed Monte Carlo simulation tool which combines extensive radiation physics libraries available in the Geant4 code, easy-to-use geometry and support for visualization. Although TOPAS has been widely tested and verified in simulations of proton therapy, there has been no reported application for carbon ion therapy. To evaluate the feasibility and accuracy of TOPAS simulations for carbon ion therapy, a licensed TOPAS code (version 3_0_p1) was used to carry out a dosimetric study of therapeutic carbon ions. Results of depth dose profiles based on different physics models have been obtained and compared with the measurements. It is found that the G4QMD model is at least as accurate as the TOPAS default BIC physics model for carbon ions, but when the energy is increased to relatively high levels such as 400 MeV/u, the G4QMD model shows preferable performance. Also, simulations of special components used in the treatment head at the Institute of Modern Physics facility were conducted to investigate the spread-out Bragg peak (SOBP) dose distribution in water. The physical dose of the SOBP in water was found to be consistent with the design aim of the 6 cm ridge filter.

  13. GUINEVERE experiment: Kinetic analysis of some reactivity measurement methods by deterministic and Monte Carlo codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bianchini, G.; Burgio, N.; Carta, M.

    The GUINEVERE experiment (Generation of Uninterrupted Intense Neutrons at the lead Venus Reactor) is an experimental program in support of the ADS technology presently carried out at SCK-CEN in Mol (Belgium). In the experiment a modified lay-out of the original thermal VENUS critical facility is coupled to an accelerator, built by the French body CNRS in Grenoble, working in both continuous and pulsed mode and delivering 14 MeV neutrons by bombardment of deuterons on a tritium-target. The modified lay-out of the facility consists of a fast subcritical core made of 30% U-235 enriched metallic Uranium in a lead matrix. Several off-line and on-line reactivity measurement techniques will be investigated during the experimental campaign. This report is focused on the simulation by deterministic (ERANOS French code) and Monte Carlo (MCNPX US code) calculations of three reactivity measurement techniques, Slope (α-fitting), Area-ratio and Source-jerk, applied to a GUINEVERE subcritical configuration (namely SC1). The inferred reactivity, in dollar units, by the Area-ratio method shows an overall agreement between the two deterministic and Monte Carlo computational approaches, whereas the MCNPX Source-jerk results are affected by large uncertainties and allow only partial conclusions about the comparison. Finally, no particular spatial dependence of the results is observed in the case of the GUINEVERE SC1 subcritical configuration. (authors)
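
    As background for the Area-ratio technique mentioned above, the Sjöstrand (area) method infers the reactivity in dollar units from the prompt and delayed areas of the detector response to a pulsed source; this is the standard relation, quoted here for orientation rather than taken from the GUINEVERE analysis:

        \frac{\rho}{\beta_{\mathrm{eff}}} = -\,\frac{A_p}{A_d},

    where A_p is the area under the prompt-decay part of the response and A_d the area under the delayed-neutron background.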

  14. Implementation of new physics models for low energy electrons in liquid water in Geant4-DNA.

    PubMed

    Bordage, M C; Bordes, J; Edel, S; Terrissol, M; Franceries, X; Bardiès, M; Lampe, N; Incerti, S

    2016-12-01

    A new alternative set of elastic and inelastic cross sections has been added to the very low energy extension of the Geant4 Monte Carlo simulation toolkit, Geant4-DNA, for the simulation of electron interactions in liquid water. These cross sections have been obtained from the CPA100 Monte Carlo track structure code, which has been a reference in the microdosimetry community for many years. They are compared to the default Geant4-DNA cross sections and show better agreement with published data. In order to verify the correct implementation of the CPA100 cross section models in Geant4-DNA, simulations of the number of interactions and ranges were performed using Geant4-DNA with this new set of models, and the results were compared with corresponding results from the original CPA100 code. Good agreement is observed between the implementations, with relative differences lower than 1% regardless of the incident electron energy. Useful quantities related to the deposited energy at the scale of the cell or the organ of interest for internal dosimetry, like dose point kernels, are also calculated using these new physics models. They are compared with results obtained using the well-known Penelope Monte Carlo code. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  15. A new Monte Carlo code for light transport in biological tissue.

    PubMed

    Torres-García, Eugenio; Oros-Pantoja, Rigoberto; Aranda-Lara, Liliana; Vieyra-Reyes, Patricia

    2018-04-01

    The aim of this work was to develop an event-by-event Monte Carlo code for light transport (called MCLTmx) to identify and quantify ballistic, diffuse, and absorbed photons, as well as their interaction coordinates inside the biological tissue. The mean free path length was computed between two interactions for scattering or absorption processes, and if necessary scatter angles were calculated, until the photon disappeared or left the region of interest. A three-layer array (air-tissue-air) was used, forming a semi-infinite sandwich. The light source was placed at (0,0,0), emitting towards (0,0,1). The input data were: refractive indices, target thickness (0.02, 0.05, 0.1, 0.5, and 1 cm), number of particle histories, and λ, from which the code calculated: anisotropy, scattering, and absorption coefficients. Validation shows differences of less than 0.1% compared with results reported in the literature. The MCLTmx code discriminates between ballistic and diffuse photons, and inside biological tissue it calculates: specular reflection, diffuse reflection, ballistic transmission, diffuse transmission and absorption, and all parameters dependent on wavelength and thickness. The MCLTmx code can be useful for light transport inside any medium by changing the parameters that describe the new medium: anisotropy, dispersion and attenuation coefficients, and refractive indices for a specific wavelength.
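
    The two core sampling steps of an event-by-event light-transport code of this kind are the free-path draw from the total attenuation coefficient and the scattering-angle draw from a phase function. The sketch below uses the Henyey-Greenstein phase function, a common choice in tissue optics; the abstract does not state which phase function MCLTmx actually implements, so treat this purely as a generic illustration.

        import numpy as np

        def sample_free_path(mu_t, rng):
            """Distance to the next interaction for total attenuation coefficient mu_t [1/cm]."""
            return -np.log(rng.random()) / mu_t

        def sample_hg_cos_theta(g, rng):
            """Cosine of the scattering angle from the Henyey-Greenstein phase function."""
            xi = rng.random()
            if abs(g) < 1e-6:
                return 2.0 * xi - 1.0                          # isotropic limit
            s = (1.0 - g * g) / (1.0 - g + 2.0 * g * xi)
            return (1.0 + g * g - s * s) / (2.0 * g)

        rng = np.random.default_rng(2)
        mu_a, mu_s, g = 0.1, 10.0, 0.9                         # hypothetical tissue-like values [1/cm]
        step = sample_free_path(mu_a + mu_s, rng)
        scattered = rng.random() < mu_s / (mu_a + mu_s)        # otherwise the photon is absorbed
        print(step, scattered, sample_hg_cos_theta(g, rng) if scattered else None)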

  16. ITS Version 6 : the integrated TIGER series of coupled electron/photon Monte Carlo transport codes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Franke, Brian Claude; Kensek, Ronald Patrick; Laub, Thomas William

    2008-04-01

    ITS is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. Our goal has been to simultaneously maximize operational simplicity and physical accuracy. Through a set of preprocessor directives, the user selects one of the many ITS codes. The ease with which the makefile system is applied combines with an input scheme based on order-independent descriptive keywords that makes maximum use of defaults and internal error checking to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Physical rigor is provided by employing accurate cross sections, sampling distributions, and physical models for describing the production and transport of the electron/photon cascade from 1.0 GeV down to 1.0 keV. The availability of source code permits the more sophisticated user to tailor the codes to specific applications and to extend the capabilities of the codes to more complex applications. Version 6, the latest version of ITS, contains (1) improvements to the ITS 5.0 codes, and (2) conversion to Fortran 90. The general user friendliness of the software has been enhanced through memory allocation to reduce the need for users to modify and recompile the code.

  17. A Deep Penetration Problem Calculation Using AETIUS: An Easy Modeling Discrete Ordinates Transport Code UsIng Unstructured Tetrahedral Mesh, Shared Memory Parallel

    NASA Astrophysics Data System (ADS)

    KIM, Jong Woon; LEE, Young-Ouk

    2017-09-01

    As computing power gets better and better, computer codes that use a deterministic method seem to be less useful than those using the Monte Carlo method. In addition, users do not like to think about space, angle, and energy discretization for deterministic codes. However, a deterministic method is still powerful in that we can obtain a solution of the flux throughout the problem, particularly when particles can barely penetrate, such as in a deep penetration problem with small detection volumes. Recently, a new state-of-the-art discrete-ordinates code, ATTILA, was developed and has been widely used in several applications. ATTILA provides the capabilities to solve geometrically complex 3-D transport problems by using an unstructured tetrahedral mesh. Since 2009, we have been developing our own code by benchmarking ATTILA. AETIUS is a discrete ordinates code that uses an unstructured tetrahedral mesh, like ATTILA. For pre- and post-processing, Gmsh is used to generate an unstructured tetrahedral mesh by importing a CAD file (*.step) and to visualize the calculation results of AETIUS. Using a CAD tool, the geometry can be modeled very easily. In this paper, we give a brief overview of AETIUS and provide numerical results from both AETIUS and a Monte Carlo code, MCNP5, in a deep penetration problem with small detection volumes. The results demonstrate the effectiveness and efficiency of AETIUS for such calculations.

  18. Constraining physical parameters of ultra-fast outflows in PDS 456 with Monte Carlo simulations

    NASA Astrophysics Data System (ADS)

    Hagino, K.; Odaka, H.; Done, C.; Gandhi, P.; Takahashi, T.

    2014-07-01

    Deep absorption lines with extremely high velocity of ˜0.3c observed in PDS 456 spectra strongly indicate the existence of ultra-fast outflows (UFOs). However, the launching and acceleration mechanisms of UFOs are still uncertain. One possible way to solve this is to constrain physical parameters as a function of distance from the source. In order to study the spatial dependence of parameters, it is essential to adopt 3-dimensional Monte Carlo simulations that treat radiation transfer in arbitrary geometry. We have developed a new simulation code for X-ray radiation reprocessed in an AGN outflow. Our code implements radiative transfer in a 3-dimensional biconical disk wind geometry, based on the Monte Carlo simulation framework MONACO (Watanabe et al. 2006, Odaka et al. 2011). Our simulations reproduce the FeXXV and FeXXVI absorption features seen in the spectra. Also, broad Fe emission lines, which reflect the geometry and viewing angle, are successfully reproduced. By comparing the simulated spectra with Suzaku data, we obtained constraints on physical parameters. We discuss launching and acceleration mechanisms of UFOs in PDS 456 based on our analysis.

  19. SU-E-T-493: Accelerated Monte Carlo Methods for Photon Dosimetry Using a Dual-GPU System and CUDA.

    PubMed

    Liu, T; Ding, A; Xu, X

    2012-06-01

    To develop a Graphics Processing Unit (GPU) based Monte Carlo (MC) code that accelerates dose calculations on a dual-GPU system. We simulated a clinical case of prostate cancer treatment. A voxelized abdomen phantom derived from 120 CT slices was used containing 218×126×60 voxels, and a GE LightSpeed 16-MDCT scanner was modeled. A CPU version of the MC code was first developed in C++ and tested on an Intel Xeon X5660 2.8 GHz CPU, then translated into a GPU version using CUDA C 4.1 and run on a dual Tesla M2090 GPU system. The code featured automatic assignment of simulation tasks to multiple GPUs, as well as accurate calculation of energy- and material-dependent cross-sections. Double-precision floating point format was used for accuracy. Doses to the rectum, prostate, bladder and femoral heads were calculated. When running on a single GPU, the MC GPU code was found to be 19 times faster than the CPU code and 42 times faster than MCNPX. These speedup factors were doubled on the dual-GPU system. The dose results were benchmarked against MCNPX and a maximum difference of 1% was observed when the relative error was kept below 0.1%. A GPU-based MC code was developed for dose calculations using detailed patient and CT scanner models. Efficiency and accuracy were both guaranteed in this code. Scalability of the code was confirmed on the dual-GPU system. © 2012 American Association of Physicists in Medicine.

  20. Gravitational microlensing of gamma-ray bursts

    NASA Technical Reports Server (NTRS)

    Mao, Shude

    1993-01-01

    A Monte Carlo code is developed to calculate gravitational microlensing in three dimensions when the lensing optical depth is low or moderate (not greater than 0.25). The code calculates positions of microimages and time delays between the microimages. The majority of lensed gamma-ray bursts should show a simple double-burst structure, as predicted by a single point mass lens model. A small fraction should show complicated multiple events due to the collective effects of several point masses (black holes). Cosmological models with a significant fraction of mass density in massive compact objects can be tested by searching for microlensing events in the current BATSE data. Our catalog generated by 10,000 Monte Carlo models is accessible through the computer network. The catalog can be used to take realistic selection effects into account.

  1. A Monte Carlo code for the fragmentation of polarized quarks

    NASA Astrophysics Data System (ADS)

    Kerbizi, A.; Artru, X.; Belghobsi, Z.; Bradamante, F.; Martin, A.

    2017-12-01

    We describe a Monte Carlo code for the fragmentation of polarized quarks into pseudoscalar mesons. The quark jet is generated by iteration of the splitting q → h + q′, where q and q′ indicate quarks and h a hadron. The splitting function describing the energy sharing between q′ and h is calculated on the basis of the Symmetric Lund Model, where the quark spin is introduced through spin matrices as foreseen in the 3P0 mechanism. A complex mass parameter is introduced for the parametrisation of the Collins effect. The results for the Collins analysing power and the comparison with the Collins asymmetries measured by the COMPASS collaboration are presented. For the first time preliminary results on the simulated azimuthal asymmetry due to the Boer-Mulders function are also given.

  2. Investigation of some possible changes in Am-Be neutron source configuration in order to increase the thermal neutron flux using Monte Carlo code

    NASA Astrophysics Data System (ADS)

    Basiri, H.; Tavakoli-Anbaran, H.

    2018-01-01

    The Am-Be neutron source is based on the (α, n) reaction and generates neutrons in the energy range of 0-11 MeV. Since thermal neutrons are widely used in different fields, in this work we investigate how to improve the source configuration in order to increase the thermal flux. The suggested changes include a spherical moderator instead of the common cylindrical geometry, a reflector layer and an appropriate selection of materials in order to achieve the maximum thermal flux. All calculations were done using the MCNP Monte Carlo code. Our final results indicated that a spherical paraffin moderator and a beryllium reflector layer can efficiently increase the thermal neutron flux of the Am-Be source.

  3. Tool for Rapid Analysis of Monte Carlo Simulations

    NASA Technical Reports Server (NTRS)

    Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.

    2013-01-01

    Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The first version of this tool was a serial code and the current version is a parallel code, which has greatly increased the analysis capabilities. This paper describes the new implementation of this analysis tool on a graphical processing unit, and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.

  4. SU-E-T-561: Monte Carlo-Based Organ Dose Reconstruction Using Pre-Contoured Human Model for Hodgkins Lymphoma Patients Treated by Cobalt-60 External Beam Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jung, J; Pelletier, C; Lee, C

    Purpose: Organ doses for the Hodgkin’s lymphoma patients treated with cobalt-60 radiation were estimated using an anthropomorphic model and Monte Carlo modeling. Methods: A cobalt-60 treatment unit modeled in the BEAMnrc Monte Carlo code was used to produce phase space data. The Monte Carlo simulation was verified with percent depth dose measurement in water at various field sizes. Radiation transport through the lung blocks was modeled by adjusting the weights of phase space data. We imported a precontoured adult female hybrid model and generated a treatment plan. The adjusted phase space data and the human model were imported to the XVMC Monte Carlo code for dose calculation. The organ mean doses were estimated and dose volume histograms were plotted. Results: The percent depth dose agreement between measurement and calculation in a water phantom was within 2% for all field sizes. The mean organ doses of heart, left breast, right breast, and spleen for the selected case were 44.3, 24.1, 14.6 and 3.4 Gy, respectively, with a midline prescription dose of 40.0 Gy. Conclusion: Organ doses were estimated for the patient group whose three-dimensional images are not available. This development may open the door to more accurate dose reconstruction and estimates of uncertainties in secondary cancer risk for Hodgkin’s lymphoma patients. This work was partially supported by the intramural research program of the National Institutes of Health, National Cancer Institute, Division of Cancer Epidemiology and Genetics.

  5. Performance Comparison of Orthogonal and Quasi-orthogonal Codes in Quasi-Synchronous Cellular CDMA Communication

    NASA Astrophysics Data System (ADS)

    Jos, Sujit; Kumar, Preetam; Chakrabarti, Saswat

    Orthogonal and quasi-orthogonal codes are an integral part of any DS-CDMA based cellular system. Orthogonal codes are ideal for use in perfectly synchronous scenarios such as downlink cellular communication. Quasi-orthogonal codes are preferred over orthogonal codes in the uplink communication where perfect synchronization cannot be achieved. In this paper, we attempt to compare orthogonal and quasi-orthogonal codes in the presence of timing synchronization error. This will give insight into the synchronization demands in DS-CDMA systems employing the two classes of sequences. The synchronization error considered is smaller than the chip duration. Monte-Carlo simulations have been carried out to verify the analytical and numerical results.
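
    The sensitivity to sub-chip timing error discussed above can be illustrated with Walsh-Hadamard (orthogonal) spreading codes: their cross-correlation is exactly zero only at perfect alignment and generally becomes non-zero once one user is delayed by a fraction of a chip. The sketch below oversamples the chips to emulate a fractional-chip offset; it illustrates the effect only and is not a reproduction of the paper's simulation.

        import numpy as np
        from scipy.linalg import hadamard

        def crosscorr_with_offset(code_a, code_b, oversample=16, offset_samples=0):
            """Normalized cross-correlation after delaying code_b by offset_samples
            (offset_samples < oversample corresponds to a sub-chip timing error)."""
            a = np.repeat(code_a, oversample).astype(float)
            b = np.roll(np.repeat(code_b, oversample).astype(float), offset_samples)
            return float(np.dot(a, b)) / len(a)

        H = hadamard(16)                                          # 16 Walsh-Hadamard codes of length 16
        c1, c2 = H[5], H[6]
        print(crosscorr_with_offset(c1, c2, offset_samples=0))    # 0.0: perfectly orthogonal
        print(crosscorr_with_offset(c1, c2, offset_samples=4))    # 0.125: quarter-chip offset breaks orthogonality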

  6. Economic Education within the BME Research Community: Rejoinder to "Identifying Research Topic Development in Business and Management Education Research Using Legitimation Code Theory"

    ERIC Educational Resources Information Center

    Asarta, Carlos J.

    2016-01-01

    Carlos Asarta comments here that Arbaugh, Fornaciari, and Hwang (2016) are to be commended for their work ("Identifying Research Topic Development in Business and Management Education Research Using Legitimation Code Theory" "Journal of Management Education," Dec 2016, see EJ1118407). Asarta says that they make several…

  7. Computation of a Canadian SCWR unit cell with deterministic and Monte Carlo codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harrisson, G.; Marleau, G.

    2012-07-01

    The Canadian SCWR has the potential to achieve the goals that the generation IV nuclear reactors must meet. As part of the optimization process for this design concept, lattice cell calculations are routinely performed using deterministic codes. In this study, the first step (self-shielding treatment) of the computation scheme developed with the deterministic code DRAGON for the Canadian SCWR has been validated. Some options available in the module responsible for the resonance self-shielding calculation in DRAGON 3.06 and different microscopic cross section libraries based on the ENDF/B-VII.0 evaluated nuclear data file have been tested and compared to a reference calculation performed with the Monte Carlo code SERPENT under the same conditions. Compared to SERPENT, DRAGON underestimates the infinite multiplication factor in all cases. In general, the original Stammler model with the Livolant-Jeanpierre approximations is the most appropriate self-shielding option to use in this case study. In addition, the 89-group WIMS-AECL library for slightly enriched uranium and the 172-group WLUP library for a mixture of plutonium and thorium give the most consistent results with those of SERPENT. (authors)

  8. The FLUKA Code: An Overview

    NASA Technical Reports Server (NTRS)

    Ballarini, F.; Battistoni, G.; Campanella, M.; Carboni, M.; Cerutti, F.; Empl, A.; Fasso, A.; Ferrari, A.; Gadioli, E.; Garzelli, M. V.; hide

    2006-01-01

    FLUKA is a multipurpose Monte Carlo code which can transport a variety of particles over a wide energy range in complex geometries. The code is a joint project of INFN and CERN: part of its development is also supported by the University of Houston and NASA. FLUKA is successfully applied in several fields, including, but not limited to, particle physics, cosmic ray physics, dosimetry, radioprotection, hadron therapy, space radiation, accelerator design and neutronics. The code is the standard tool used at CERN for dosimetry, radioprotection and beam-machine interaction studies. Here we give a glimpse into the code physics models with particular emphasis on the hadronic and nuclear sector.

  9. SPIDERMAN: Fast code to simulate secondary transits and phase curves

    NASA Astrophysics Data System (ADS)

    Louden, Tom; Kreidberg, Laura

    2017-11-01

    SPIDERMAN calculates exoplanet phase curves and secondary eclipses with arbitrary surface brightness distributions in two dimensions. The code uses a geometrical algorithm to compute exactly the area of the sections of the disc of the planet that are occulted by the star. Approximately 1000 models can be generated per second in typical use, which makes Markov Chain Monte Carlo analyses practicable. The code is modular and allows comparison of the effect of multiple different brightness distributions for a dataset.

  10. Muon simulation codes MUSIC and MUSUN for underground physics

    NASA Astrophysics Data System (ADS)

    Kudryavtsev, V. A.

    2009-03-01

    The paper describes two Monte Carlo codes dedicated to muon simulations: MUSIC (MUon SImulation Code) and MUSUN (MUon Simulations UNderground). MUSIC is a package for muon transport through matter. It is particularly useful for propagating muons through large thicknesses of rock or water, for instance from the surface down to an underground/underwater laboratory. MUSUN is designed to use the results of muon transport through rock/water to generate muons in or around an underground laboratory, taking into account their energy spectrum and angular distribution.

  11. [Medical Applications of the PHITS Code I: Recent Improvements and Biological Dose Estimation Model].

    PubMed

    Sato, Tatsuhiko; Furuta, Takuya; Hashimoto, Shintaro; Kuga, Naoya

    2015-01-01

    PHITS is a general purpose Monte Carlo particle transport simulation code developed through the collaboration of several institutes, mainly in Japan. It can analyze the motion of nearly all types of radiation over wide energy ranges in 3-dimensional matter. It has been used for various applications including medical physics. This paper reviews the recent improvements of the code, together with the biological dose estimation method developed on the basis of the microdosimetric function implemented in PHITS.

  12. Hot zero power reactor calculations using the Insilico code

    DOE PAGES

    Hamilton, Steven P.; Evans, Thomas M.; Davidson, Gregory G.; ...

    2016-03-18

    In this paper we describe the reactor physics simulation capabilities of the insilico code. A description of the various capabilities of the code is provided, including detailed discussion of the geometry, meshing, cross section processing, and neutron transport options. Numerical results demonstrate that the insilico SPN solver with pin-homogenized cross section generation is capable of delivering highly accurate full-core simulation of various PWR problems. Comparison to both Monte Carlo calculations and measured plant data is provided.

  13. Some computer graphical user interfaces in radiation therapy.

    PubMed

    Chow, James C L

    2016-03-28

    In this review, five graphical user interfaces (GUIs) used in radiation therapy practice and research are introduced. They are: (1) the treatment time calculator, superficial X-ray treatment time calculator (SUPCALC), used in superficial X-ray radiation therapy; (2) the monitor unit calculator, electron monitor unit calculator (EMUC), used in electron radiation therapy; (3) the multileaf collimator machine file creator, sliding window intensity modulated radiotherapy (SWIMRT), used in generating fluence maps for research and quality assurance in intensity modulated radiation therapy; (4) the treatment planning system, DOSCTP, used in the calculation of 3D dose distribution using Monte Carlo simulation; and (5) the monitor unit calculator, photon beam monitor unit calculator (PMUC), used in photon beam radiation therapy. One common feature of these GUIs is that all user-friendly interfaces are linked to complex formulas and algorithms based on various theories, which the user does not need to understand. In that case, the user only needs to input the required information with help from graphical elements in order to produce the desired results. SUPCALC is a superficial radiation treatment time calculator using the GUI technique to provide a convenient way for the radiation therapist to calculate the treatment time, and to keep a record for the skin cancer patient. EMUC is an electron monitor unit calculator for electron radiation therapy. Instead of doing a hand calculation based on pre-determined dosimetric tables, the clinical user only needs to input the required drawing of the electron field in a computer graphics file format, the prescription dose, and the beam parameters to EMUC to calculate the required monitor units for the electron beam treatment. EMUC is based on a semi-experimental sector-integration algorithm. SWIMRT is a multileaf collimator machine file creator to generate a fluence map produced by a medical linear accelerator. This machine file controls the multileaf collimator to deliver intensity modulated beams for a specific fluence map used in quality assurance or research. DOSCTP is a treatment planning system using computed tomography images. Radiation beams (photon or electron) with different energies and field sizes produced by a linear accelerator can be placed in different positions to irradiate the tumour in the patient. DOSCTP is linked to a Monte Carlo simulation engine using an EGSnrc-based code, so that the 3D dose distribution can be determined accurately for radiation therapy. Moreover, DOSCTP can be used for treatment planning of patients or small animals. PMUC is a GUI for calculation of the monitor units based on the prescription dose of the patient in photon beam radiation therapy. The calculation is based on dose corrections for changes in photon beam energy, treatment depth, field size, jaw position, beam axis, treatment distance and beam modifiers. All GUIs mentioned in this review were written either in Microsoft Visual Basic.net or with a MATLAB GUI development tool called GUIDE. In addition, all GUIs were verified and tested using measurements to ensure their accuracy was up to clinically acceptable levels for implementation.

  14. An unbiased Hessian representation for Monte Carlo PDFs.

    PubMed

    Carrazza, Stefano; Forte, Stefano; Kassabov, Zahari; Latorre, José Ignacio; Rojo, Juan

    We develop a methodology for the construction of a Hessian representation of Monte Carlo sets of parton distributions, based on the use of a subset of the Monte Carlo PDF replicas as an unbiased linear basis, and of a genetic algorithm for the determination of the optimal basis. We validate the methodology by first showing that it faithfully reproduces a native Monte Carlo PDF set (NNPDF3.0), and then that, when applied to a Hessian PDF set (MMHT14) which was transformed into a Monte Carlo set, it gives back the starting PDFs with minimal information loss. We then show that, when applied to a large Monte Carlo PDF set obtained as a combination of several underlying sets, the methodology leads to a Hessian representation in terms of a rather smaller set of parameters (MC-H PDFs), thereby providing an alternative implementation of the recently suggested Meta-PDF idea and a Hessian version of the recently suggested PDF compression algorithm (CMC-PDFs). The mc2hessian conversion code is made publicly available together with (through LHAPDF6) a Hessian representation of the NNPDF3.0 set, and the MC-H PDF set.

  15. NOTE: Monte Carlo evaluation of kerma in an HDR brachytherapy bunker

    NASA Astrophysics Data System (ADS)

    Pérez-Calatayud, J.; Granero, D.; Ballester, F.; Casal, E.; Crispin, V.; Puchades, V.; León, A.; Verdú, G.

    2004-12-01

    In recent years, the use of high dose rate (HDR) after-loader machines has greatly increased due to the shift from traditional Cs-137/Ir-192 low dose rate (LDR) to HDR brachytherapy. The method used to calculate the required concrete and, where appropriate, lead shielding in the door is based on analytical methods provided by documents published by the ICRP, the IAEA and the NCRP. The purpose of this study is to perform a more realistic kerma evaluation at the entrance maze door of an HDR bunker using the Monte Carlo code GEANT4. The Monte Carlo results were validated experimentally. The spectrum at the maze entrance door, obtained with Monte Carlo, has an average energy of about 110 keV, maintaining a similar value along the length of the maze. Comparison of the analytical results with the Monte Carlo ones shows that the values obtained using the albedo coefficient from the ICRP document more closely match those given by the Monte Carlo method, although the maximum value given by the MC calculations is 30% greater.

  16. Continuous Energy Photon Transport Implementation in MCATK

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Terry R.; Trahan, Travis John; Sweezy, Jeremy Ed

    2016-10-31

    The Monte Carlo Application ToolKit (MCATK) code development team has implemented Monte Carlo photon transport into the MCATK software suite. The current particle transport capabilities in MCATK, which process the tracking and collision physics, have been extended to enable tracking of photons using the same continuous energy approximation. We describe the four photoatomic processes implemented, which are coherent scattering, incoherent scattering, pair-production, and photoelectric absorption. The accompanying background, implementation, and verification of these processes will be presented.

  17. Monte Carlo simulation of X-ray imaging and spectroscopy experiments using quadric geometry and variance reduction techniques

    NASA Astrophysics Data System (ADS)

    Golosio, Bruno; Schoonjans, Tom; Brunetti, Antonio; Oliva, Piernicola; Masala, Giovanni Luca

    2014-03-01

    The simulation of X-ray imaging experiments is often performed using deterministic codes, which can be relatively fast and easy to use. However, such codes are generally not suitable for the simulation of even slightly more complex experimental conditions, involving, for instance, first-order or higher-order scattering, X-ray fluorescence emissions, or more complex geometries, particularly for experiments that combine spatial resolution with spectral information. In such cases, simulations are often performed using codes based on the Monte Carlo method. In a simple Monte Carlo approach, the interaction position of an X-ray photon and the state of the photon after an interaction are obtained simply according to the theoretical probability distributions. This approach may be quite inefficient because the final channels of interest may include only a limited region of space or photons produced by a rare interaction, e.g., fluorescent emission from elements with very low concentrations. In the field of X-ray fluorescence spectroscopy, this problem has been solved by combining the Monte Carlo method with variance reduction techniques, which can reduce the computation time by several orders of magnitude. In this work, we present a C++ code for the general simulation of X-ray imaging and spectroscopy experiments, based on the application of the Monte Carlo method in combination with variance reduction techniques, with a description of sample geometry based on quadric surfaces. We describe the benefits of the object-oriented approach in terms of code maintenance, the flexibility of the program for the simulation of different experimental conditions and the possibility of easily adding new modules. Sample applications in the fields of X-ray imaging and X-ray spectroscopy are discussed.
    Catalogue identifier: AERO_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AERO_v1_0.html
    Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland
    Licensing provisions: GNU General Public License version 3
    No. of lines in distributed program, including test data, etc.: 83617
    No. of bytes in distributed program, including test data, etc.: 1038160
    Distribution format: tar.gz
    Programming language: C++
    Computer: Tested on several PCs and on Mac.
    Operating system: Linux, Mac OS X, Windows (native and cygwin).
    RAM: It is dependent on the input data but usually between 1 and 10 MB.
    Classification: 2.5, 21.1
    External routines: XrayLib (https://github.com/tschoonj/xraylib/wiki)
    Nature of problem: Simulation of a wide range of X-ray imaging and spectroscopy experiments using different types of sources and detectors.
    Solution method: XRMC is a versatile program that is useful for the simulation of a wide range of X-ray imaging and spectroscopy experiments. It enables the simulation of monochromatic and polychromatic X-ray sources, with unpolarised or partially/completely polarised radiation. Single-element detectors as well as two-dimensional pixel detectors can be used in the simulations, with several acquisition options. In the current version of the program, the sample is modelled by combining convex three-dimensional objects demarcated by quadric surfaces, such as planes, ellipsoids and cylinders. The Monte Carlo approach makes XRMC able to accurately simulate X-ray photon transport and interactions with matter up to any order of interaction. The differential cross-sections and all other quantities related to the interaction processes (photoelectric absorption, fluorescence emission, elastic and inelastic scattering) are computed using the xraylib software library, which is currently the most complete and up-to-date software library for X-ray parameters. The use of variance reduction techniques makes XRMC able to reduce the simulation time by several orders of magnitude compared to other general-purpose Monte Carlo simulation programs.
    Running time: It is dependent on the complexity of the simulation. For the examples distributed with the code, it ranges from less than 1 s to a few minutes.

  18. Evaluation of the DRAGON code for VHTR design analysis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taiwo, T. A.; Kim, T. K.; Nuclear Engineering Division

    2006-01-12

    This letter report summarizes three activities that were undertaken in FY 2005 to gather information on the DRAGON code and to perform limited evaluations of the code performance when used in the analysis of the Very High Temperature Reactor (VHTR) designs. These activities include: (1) Use of the code to model the fuel elements of the helium-cooled and liquid-salt-cooled VHTR designs. Results were compared to those from another deterministic lattice code (WIMS8) and a Monte Carlo code (MCNP). (2) The preliminary assessment of the nuclear data library currently used with the code and libraries that have been provided by the IAEA WIMS-D4 Library Update Project (WLUP). (3) A DRAGON workshop held to discuss the code capabilities for modeling the VHTR.

  19. Pion and electromagnetic contribution to dose: Comparisons of HZETRN to Monte Carlo results and ISS data

    NASA Astrophysics Data System (ADS)

    Slaba, Tony C.; Blattnig, Steve R.; Reddell, Brandon; Bahadori, Amir; Norman, Ryan B.; Badavi, Francis F.

    2013-07-01

    Recent work has indicated that pion production and the associated electromagnetic (EM) cascade may be an important contribution to the total astronaut exposure in space. Recent extensions to the deterministic space radiation transport code, HZETRN, allow the production and transport of pions, muons, electrons, positrons, and photons. In this paper, the extended code is compared to the Monte Carlo codes, Geant4, PHITS, and FLUKA, in slab geometries exposed to galactic cosmic ray (GCR) boundary conditions. While improvements in the HZETRN transport formalism for the new particles are needed, it is shown that reasonable agreement on dose is found at larger shielding thicknesses commonly found on the International Space Station (ISS). Finally, the extended code is compared to ISS data on a minute-by-minute basis over a seven day period in 2001. The impact of pion/EM production on exposure estimates and validation results is clearly shown. The Badhwar-O'Neill (BO) 2004 and 2010 models are used to generate the GCR boundary condition at each time-step allowing the impact of environmental model improvements on validation results to be quantified as well. It is found that the updated BO2010 model noticeably reduces overall exposure estimates from the BO2004 model, and the additional production mechanisms in HZETRN provide some compensation. It is shown that the overestimates provided by the BO2004 GCR model in previous validation studies led to deflated uncertainty estimates for environmental, physics, and transport models, and allowed an important physical interaction (π/EM) to be overlooked in model development. Despite the additional π/EM production mechanisms in HZETRN, a systematic under-prediction of total dose is observed in comparison to Monte Carlo results and measured data.

  20. Accelerated GPU based SPECT Monte Carlo simulations.

    PubMed

    Garcia, Marie-Paule; Bert, Julien; Benoit, Didier; Bardiès, Manuel; Visvikis, Dimitris

    2016-06-07

    Monte Carlo (MC) modelling is widely used in the field of single photon emission computed tomography (SPECT) as it is a reliable technique to simulate very high quality scans. This technique provides very accurate modelling of the radiation transport and particle interactions in a heterogeneous medium. Various MC codes exist for nuclear medicine imaging simulations. Recently, new strategies exploiting the computing capabilities of graphical processing units (GPU) have been proposed. This work aims at evaluating the accuracy of such GPU implementation strategies in comparison to standard MC codes in the context of SPECT imaging. GATE was considered the reference MC toolkit and used to evaluate the performance of newly developed GPU Geant4-based Monte Carlo simulation (GGEMS) modules for SPECT imaging. Radioisotopes with different photon energies were used with these various CPU and GPU Geant4-based MC codes in order to assess the best strategy for each configuration. Three different isotopes were considered: 99mTc, 111In and 131I, using a low energy high resolution (LEHR) collimator, a medium energy general purpose (MEGP) collimator and a high energy general purpose (HEGP) collimator respectively. Point source, uniform source, cylindrical phantom and anthropomorphic phantom acquisitions were simulated using a model of the GE Infinia II 3/8" gamma camera. Both simulation platforms yielded a similar system sensitivity and image statistical quality for the various combinations. The overall acceleration factor between GATE and GGEMS platform derived from the same cylindrical phantom acquisition was between 18 and 27 for the different radioisotopes. Besides, a full MC simulation using an anthropomorphic phantom showed the full potential of the GGEMS platform, with a resulting acceleration factor up to 71. The good agreement with reference codes and the acceleration factors obtained support the use of GPU implementation strategies for improving computational efficiency of SPECT imaging simulations.

  1. Extension of applicable neutron energy of DARWIN up to 1 GeV.

    PubMed

    Satoh, D; Sato, T; Endo, A; Matsufuji, N; Takada, M

    2007-01-01

    The radiation-dose monitor, DARWIN, needs a set of response functions of the liquid organic scintillator to assess a neutron dose. SCINFUL-QMD is a Monte Carlo based computer code to evaluate the response functions. In order to improve the accuracy of the code, a new light-output function based on the experimental data was developed for the production and transport of protons, deuterons, tritons, 3He nuclei and alpha particles, and incorporated into the code. The applicable energy of DARWIN was extended to 1 GeV using the response functions calculated by the modified SCINFUL-QMD code.

  2. BRYNTRN: A baryon transport model

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Townsend, Lawrence W.; Nealy, John E.; Chun, Sang Y.; Hong, B. S.; Buck, Warren W.; Lamkin, S. L.; Ganapol, Barry D.; Khan, Ferdous; Cucinotta, Francis A.

    1989-01-01

    The development of an interaction data base and a numerical solution to the transport of baryons through an arbitrary shield material based on a straight ahead approximation of the Boltzmann equation are described. The code is most accurate for continuous energy boundary values, but gives reasonable results for discrete spectra at the boundary using even a relatively coarse energy grid (30 points) and large spatial increments (1 cm in H2O). The resulting computer code is self-contained, efficient and ready to use. The code requires only a very small fraction of the computer resources required for Monte Carlo codes.
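
    The record above describes a marching solution of the straight-ahead Boltzmann equation on a coarse energy grid with centimetre-scale spatial steps. The sketch below is a minimal, hedged illustration of that kind of slab-by-slab marching for a single particle species with a purely absorbing, placeholder cross section and no downscatter source; it is not the BRYNTRN interaction database or numerics.

    # Minimal sketch of straight-ahead attenuation on a coarse energy grid.
    # The removal cross section below is a placeholder, not BRYNTRN data.
    import numpy as np

    def march(phi0, sigma, dx, n_steps):
        """Attenuate a boundary flux phi0(E) through n_steps slabs of thickness dx (cm)."""
        phi = phi0.copy()
        history = [phi.copy()]
        for _ in range(n_steps):
            phi = phi * np.exp(-sigma * dx)   # straight-ahead removal only, no source term
            history.append(phi.copy())
        return np.array(history)

    if __name__ == "__main__":
        energies = np.linspace(10.0, 300.0, 30)            # 30-point grid, as in the abstract
        phi0 = np.exp(-energies / 100.0)                   # hypothetical boundary spectrum
        sigma = 0.02 + 0.05 * np.exp(-energies / 50.0)     # hypothetical removal cross section (1/cm)
        depth_profiles = march(phi0, sigma, dx=1.0, n_steps=30)  # 1 cm steps in water
        print("flux at 30 cm, lowest/highest energy bins:",
              depth_profiles[-1][0], depth_profiles[-1][-1])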

  3. A GPU-accelerated Monte Carlo dose calculation platform and its application toward validating an MRI-guided radiation therapy beam model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Yuhe; Mazur, Thomas R.; Green, Olga

    Purpose: The clinical commissioning of IMRT subject to a magnetic field is challenging. The purpose of this work is to develop a GPU-accelerated Monte Carlo dose calculation platform based on PENELOPE and then use the platform to validate a vendor-provided MRIdian head model toward quality assurance of clinical IMRT treatment plans subject to a 0.35 T magnetic field. Methods: PENELOPE was first translated from FORTRAN to C++ and the result was confirmed to produce equivalent results to the original code. The C++ code was then adapted to CUDA in a workflow optimized for GPU architecture. The original code was expanded to include voxelized transport with Woodcock tracking, faster electron/positron propagation in a magnetic field, and several features that make gPENELOPE highly user-friendly. Moreover, the vendor-provided MRIdian head model was incorporated into the code in an effort to apply gPENELOPE as both an accurate and rapid dose validation system. A set of experimental measurements were performed on the MRIdian system to examine the accuracy of both the head model and gPENELOPE. Ultimately, gPENELOPE was applied toward independent validation of patient doses calculated by MRIdian’s KMC. Results: An acceleration factor of 152 was achieved in comparison to the original single-thread FORTRAN implementation with the original accuracy being preserved. For 16 treatment plans including stomach (4), lung (2), liver (3), adrenal gland (2), pancreas (2), spleen (1), mediastinum (1), and breast (1), the MRIdian dose calculation engine agrees with gPENELOPE with a mean gamma passing rate of 99.1% ± 0.6% (2%/2 mm). Conclusions: A Monte Carlo simulation platform was developed based on a GPU-accelerated version of PENELOPE. This platform was used to validate that both the vendor-provided head model and fast Monte Carlo engine used by the MRIdian system are accurate in modeling radiation transport in a patient using 2%/2 mm gamma criteria. Future applications of this platform will include dose validation and accumulation, IMRT optimization, and dosimetry system modeling for next generation MR-IGRT systems.
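
    The gPENELOPE description above mentions voxelized transport with Woodcock tracking. The following is a minimal one-dimensional sketch of the Woodcock (delta-tracking) idea, sampling flight distances against a majorant cross section and rejecting virtual collisions; the voxel cross sections and geometry are hypothetical, and this is not the gPENELOPE implementation.

    # Minimal sketch of Woodcock (delta) tracking through a voxelized medium.
    # Cross sections and geometry are hypothetical placeholders.
    import math
    import random

    def woodcock_track(x0, voxel_sigma, voxel_size, x_max, rng=random.random):
        """Sample the position of the next *real* interaction along +x.

        voxel_sigma : list of total macroscopic cross sections per voxel (1/cm)
        voxel_size  : voxel width (cm)
        Returns the interaction position, or None if the particle escapes.
        """
        sigma_max = max(voxel_sigma)                      # majorant cross section
        x = x0
        while True:
            x += -1.0 / sigma_max * math.log(rng())       # flight to tentative collision
            if x >= x_max:
                return None                               # escaped the geometry
            sigma_here = voxel_sigma[int(x // voxel_size)]
            if rng() < sigma_here / sigma_max:            # accept as a real collision
                return x
            # otherwise it was a virtual (delta) collision: keep flying

    if __name__ == "__main__":
        random.seed(1)
        sigmas = [0.05, 0.20, 0.08, 0.30, 0.10]           # hypothetical voxel cross sections
        hits = [woodcock_track(0.0, sigmas, voxel_size=1.0, x_max=5.0) for _ in range(5)]
        print(hits)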

  4. A GPU-accelerated Monte Carlo dose calculation platform and its application toward validating an MRI-guided radiation therapy beam model

    PubMed Central

    Wang, Yuhe; Mazur, Thomas R.; Green, Olga; Hu, Yanle; Li, Hua; Rodriguez, Vivian; Wooten, H. Omar; Yang, Deshan; Zhao, Tianyu; Mutic, Sasa; Li, H. Harold

    2016-01-01

    Purpose: The clinical commissioning of IMRT subject to a magnetic field is challenging. The purpose of this work is to develop a GPU-accelerated Monte Carlo dose calculation platform based on PENELOPE and then use the platform to validate a vendor-provided MRIdian head model toward quality assurance of clinical IMRT treatment plans subject to a 0.35 T magnetic field. Methods: PENELOPE was first translated from FORTRAN to C++ and the result was confirmed to produce equivalent results to the original code. The C++ code was then adapted to CUDA in a workflow optimized for GPU architecture. The original code was expanded to include voxelized transport with Woodcock tracking, faster electron/positron propagation in a magnetic field, and several features that make gPENELOPE highly user-friendly. Moreover, the vendor-provided MRIdian head model was incorporated into the code in an effort to apply gPENELOPE as both an accurate and rapid dose validation system. A set of experimental measurements were performed on the MRIdian system to examine the accuracy of both the head model and gPENELOPE. Ultimately, gPENELOPE was applied toward independent validation of patient doses calculated by MRIdian’s KMC. Results: An acceleration factor of 152 was achieved in comparison to the original single-thread FORTRAN implementation with the original accuracy being preserved. For 16 treatment plans including stomach (4), lung (2), liver (3), adrenal gland (2), pancreas (2), spleen (1), mediastinum (1), and breast (1), the MRIdian dose calculation engine agrees with gPENELOPE with a mean gamma passing rate of 99.1% ± 0.6% (2%/2 mm). Conclusions: A Monte Carlo simulation platform was developed based on a GPU-accelerated version of PENELOPE. This platform was used to validate that both the vendor-provided head model and fast Monte Carlo engine used by the MRIdian system are accurate in modeling radiation transport in a patient using 2%/2 mm gamma criteria. Future applications of this platform will include dose validation and accumulation, IMRT optimization, and dosimetry system modeling for next generation MR-IGRT systems. PMID:27370123

  5. A GPU-accelerated Monte Carlo dose calculation platform and its application toward validating an MRI-guided radiation therapy beam model.

    PubMed

    Wang, Yuhe; Mazur, Thomas R; Green, Olga; Hu, Yanle; Li, Hua; Rodriguez, Vivian; Wooten, H Omar; Yang, Deshan; Zhao, Tianyu; Mutic, Sasa; Li, H Harold

    2016-07-01

    The clinical commissioning of IMRT subject to a magnetic field is challenging. The purpose of this work is to develop a GPU-accelerated Monte Carlo dose calculation platform based on PENELOPE and then use the platform to validate a vendor-provided MRIdian head model toward quality assurance of clinical IMRT treatment plans subject to a 0.35 T magnetic field. PENELOPE was first translated from FORTRAN to C++ and the result was confirmed to produce equivalent results to the original code. The C++ code was then adapted to CUDA in a workflow optimized for GPU architecture. The original code was expanded to include voxelized transport with Woodcock tracking, faster electron/positron propagation in a magnetic field, and several features that make gPENELOPE highly user-friendly. Moreover, the vendor-provided MRIdian head model was incorporated into the code in an effort to apply gPENELOPE as both an accurate and rapid dose validation system. A set of experimental measurements were performed on the MRIdian system to examine the accuracy of both the head model and gPENELOPE. Ultimately, gPENELOPE was applied toward independent validation of patient doses calculated by MRIdian's KMC. An acceleration factor of 152 was achieved in comparison to the original single-thread FORTRAN implementation with the original accuracy being preserved. For 16 treatment plans including stomach (4), lung (2), liver (3), adrenal gland (2), pancreas (2), spleen (1), mediastinum (1), and breast (1), the MRIdian dose calculation engine agrees with gPENELOPE with a mean gamma passing rate of 99.1% ± 0.6% (2%/2 mm). A Monte Carlo simulation platform was developed based on a GPU-accelerated version of PENELOPE. This platform was used to validate that both the vendor-provided head model and fast Monte Carlo engine used by the MRIdian system are accurate in modeling radiation transport in a patient using 2%/2 mm gamma criteria. Future applications of this platform will include dose validation and accumulation, IMRT optimization, and dosimetry system modeling for next generation MR-IGRT systems.
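
    The three records above quote agreement as a mean gamma passing rate at 2%/2 mm. As a hedged illustration of how such a passing rate is formed, the sketch below computes a simplified one-dimensional global gamma index for two hypothetical dose profiles; it is not the analysis pipeline actually used for MRIdian QA.

    # Simplified 1-D global gamma-index sketch (2%/2 mm criteria).
    # Profiles below are hypothetical; illustration only.
    import numpy as np

    def gamma_pass_rate(ref, ref_x, eval_dose, eval_x, dose_crit=0.02, dist_crit=2.0):
        """Return the fraction of reference points with gamma <= 1 (global normalization)."""
        d_norm = dose_crit * ref.max()
        passed = 0
        for xr, dr in zip(ref_x, ref):
            dd = (eval_dose - dr) / d_norm            # dose-difference term
            dx = (eval_x - xr) / dist_crit            # distance-to-agreement term (mm)
            gamma = np.sqrt(dd ** 2 + dx ** 2).min()  # minimum over evaluated points
            passed += gamma <= 1.0
        return passed / len(ref)

    if __name__ == "__main__":
        x = np.linspace(0.0, 100.0, 201)                   # positions in mm
        reference = np.exp(-((x - 50.0) / 15.0) ** 2)      # hypothetical profile
        evaluated = 1.01 * np.exp(-((x - 50.5) / 15.0) ** 2)
        print(f"passing rate: {100 * gamma_pass_rate(reference, x, evaluated, x):.1f}%")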

  6. Coupled multi-group neutron photon transport for the simulation of high-resolution gamma-ray spectroscopy applications

    NASA Astrophysics Data System (ADS)

    Burns, Kimberly Ann

    The accurate and efficient simulation of coupled neutron-photon problems is necessary for several important radiation detection applications. Examples include the detection of nuclear threats concealed in cargo containers and prompt gamma neutron activation analysis for nondestructive determination of elemental composition of unknown samples. In these applications, high-resolution gamma-ray spectrometers are used to preserve as much information as possible about the emitted photon flux, which consists of both continuum and characteristic gamma rays with discrete energies. Monte Carlo transport is the most commonly used modeling tool for this type of problem, but computational times for many problems can be prohibitive. This work explores the use of coupled Monte Carlo-deterministic methods for the simulation of neutron-induced photons for high-resolution gamma-ray spectroscopy applications. RAdiation Detection Scenario Analysis Toolbox (RADSAT), a code which couples deterministic and Monte Carlo transport to perform radiation detection scenario analysis in three dimensions [1], was used as the building block for the methods derived in this work. RADSAT was capable of performing coupled deterministic-Monte Carlo simulations for gamma-only and neutron-only problems. The purpose of this work was to develop the methodology necessary to perform coupled neutron-photon calculations and add this capability to RADSAT. Performing coupled neutron-photon calculations requires four main steps: the deterministic neutron transport calculation, the neutron-induced photon spectrum calculation, the deterministic photon transport calculation, and the Monte Carlo detector response calculation. The necessary requirements for each of these steps were determined. A major challenge in utilizing multigroup deterministic transport methods for neutron-photon problems was maintaining the discrete neutron-induced photon signatures throughout the simulation. Existing coupled neutron-photon cross-section libraries and the methods used to produce neutron-induced photons were unsuitable for high-resolution gamma-ray spectroscopy applications. Central to this work was the development of a method for generating multigroup neutron-photon cross-sections in a way that separates the discrete and continuum photon emissions so the neutron-induced photon signatures were preserved. The RADSAT-NG cross-section library was developed as a specialized multigroup neutron-photon cross-section set for the simulation of high-resolution gamma-ray spectroscopy applications. The methodology and cross sections were tested using code-to-code comparison with MCNP5 [2] and NJOY [3]. A simple benchmark geometry was used for all cases compared with MCNP. The geometry consists of a cubical sample with a 252Cf neutron source on one side and a HPGe gamma-ray spectrometer on the opposing side. Different materials were examined in the cubical sample: polyethylene (C2H4), P, N, O, and Fe. The cross sections for each of the materials were compared to cross sections collapsed using NJOY. Comparisons of the volume-averaged neutron flux within the sample, volume-averaged photon flux within the detector, and high-purity gamma-ray spectrometer response (only for polyethylene) were completed using RADSAT and MCNP. The code-to-code comparisons show promising results for the coupled Monte Carlo-deterministic method. 
The RADSAT-NG cross-section production method showed good agreement with NJOY for all materials considered, although additional work is needed in the resonance region and in the lowest and highest energy bins, where some cross-section discrepancies existed; the overall shape and magnitude of the two methods agreed elsewhere. For the volume-averaged photon flux within the detector, the five most intense lines typically agree to within approximately 5% of the MCNP-calculated flux for all materials considered. The agreement in the code-to-code comparison cases demonstrates a proof-of-concept of the method for use in RADSAT for coupled neutron-photon problems in high-resolution gamma-ray spectroscopy applications. One of the primary motivators for using the coupled method over a pure Monte Carlo method is the potential for significantly lower computational times. For the code-to-code comparison cases, the run times for RADSAT were approximately 25 to 500 times shorter than for MCNP, as shown in Table 1. This assumed a 40 mCi 252Cf neutron source and 600 seconds of "real-world" measurement time. The only variance reduction technique implemented in the MCNP calculation was forward biasing of the source toward the sample target. Improved MCNP runtimes could be achieved with the addition of more advanced variance reduction techniques.
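
    Central to the work above is collapsing fine-group cross-section data into a broad-group library while preserving discrete photon lines. The sketch below shows only the generic flux-weighted collapse step, sigma_G = sum_g(sigma_g * phi_g) / sum_g(phi_g); the fine-group values are placeholders, not RADSAT-NG or NJOY data.

    # Hedged sketch of a flux-weighted multigroup collapse.
    # Fine-group cross sections and weighting spectrum are placeholders.
    import numpy as np

    def collapse(sigma_fine, phi_fine, group_edges):
        """Collapse fine-group cross sections into broad groups defined by index edges."""
        broad = []
        for lo, hi in zip(group_edges[:-1], group_edges[1:]):
            w = phi_fine[lo:hi]
            broad.append(np.sum(sigma_fine[lo:hi] * w) / np.sum(w))
        return np.array(broad)

    if __name__ == "__main__":
        sigma_fine = np.array([4.0, 3.5, 3.0, 2.5, 2.2, 2.0, 1.8, 1.5])  # barns, hypothetical
        phi_fine   = np.array([0.1, 0.3, 0.5, 0.8, 1.0, 0.9, 0.6, 0.2])  # weighting spectrum
        edges = [0, 3, 6, 8]                                             # three broad groups
        print(collapse(sigma_fine, phi_fine, edges))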

  7. Monte Carlo MCNP-4B-based absorbed dose distribution estimates for patient-specific dosimetry.

    PubMed

    Yoriyaz, H; Stabin, M G; dos Santos, A

    2001-04-01

    This study was intended to verify the capability of the Monte Carlo MCNP-4B code to evaluate spatial dose distribution based on information gathered from CT or SPECT. A new three-dimensional (3D) dose calculation approach for internal emitter use in radioimmunotherapy (RIT) was developed using the Monte Carlo MCNP-4B code as the photon and electron transport engine. It was shown that the MCNP-4B computer code can be used with voxel-based anatomic and physiologic data to provide 3D dose distributions. This study showed that the MCNP-4B code can be used to develop a treatment planning system that will provide such information in a timely manner, if dose reporting is suitably optimized. If each organ is divided into small regions where the average energy deposition is calculated with a typical volume of 0.4 cm(3), regional dose distributions can be provided with reasonable central processing unit times (on the order of 12-24 h on a 200-MHz personal computer or modest workstation). Further efforts to provide semiautomated region identification (segmentation) and improvement of marrow dose calculations are needed to supply a complete system for RIT. It is envisioned that all such efforts will continue to develop and that internal dose calculations may soon be brought to a similar level of accuracy, detail, and robustness as is commonly expected in external dose treatment planning. For this study we developed a code with a user-friendly interface that works on several nuclear medicine imaging platforms and provides timely patient-specific dose information to the physician and medical physicist. Future therapy with internal emitters should use a 3D dose calculation approach, which represents a significant advance over dose information provided by the standard geometric phantoms used for more than 20 y (which permit reporting of only average organ doses for certain standardized individuals).

  8. A Monte Carlo model system for core analysis and epithermal neutron beam design at the Washington State University Radiation Center

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burns, T.D. Jr.

    1996-05-01

    The Monte Carlo Model System (MCMS) for the Washington State University (WSU) Radiation Center provides a means through which core criticality and power distributions can be calculated, as well as providing a method for neutron and photon transport necessary for BNCT epithermal neutron beam design. The computational code used in this Model System is MCNP4A. The geometric capability of this Monte Carlo code allows the WSU system to be modeled very accurately. A working knowledge of the MCNP4A neutron transport code increases the flexibility of the Model System and is recommended; however, the eigenvalue/power density problems can be run with little direct knowledge of MCNP4A. Neutron and photon particle transport require more experience with the MCNP4A code. The Model System consists of two coupled subsystems: the Core Analysis and Source Plane Generator Model (CASP), and the BeamPort Shell Particle Transport Model (BSPT). The CASP Model incorporates the S(α, β) thermal treatment, and is run as a criticality problem yielding the system eigenvalue (k_eff), the core power distribution, and an implicit surface source for subsequent particle transport in the BSPT Model. The BSPT Model uses the source plane generated by a CASP run to transport particles through the thermal column beamport. The user can create filter arrangements in the beamport and then calculate characteristics necessary for assessing the BNCT potential of a given filter arrangement. Examples of the characteristics to be calculated are: neutron fluxes, neutron currents, fast neutron KERMAs and gamma KERMAs. The MCMS is a useful tool for the WSU system. Those unfamiliar with the MCNP4A code can use the MCMS transparently for core analysis, while more experienced users will find the particle transport capabilities very powerful for BNCT filter design.
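
    Among the beamport characteristics listed above are neutron and gamma KERMAs. As a hedged illustration, the sketch below folds a multigroup flux with per-group kerma coefficients to obtain a KERMA rate; the group structure, fluxes and coefficients are invented for the example and are not MCMS output.

    # Hedged sketch: KERMA rate from a multigroup flux and kerma coefficients.
    # All numbers are placeholders, not MCMS or MCNP4A results.
    import numpy as np

    def kerma_rate(phi, kerma_coeff):
        """KERMA rate (Gy/s) = sum over groups of flux (n/cm^2/s) * kerma factor (Gy cm^2)."""
        return float(np.dot(phi, kerma_coeff))

    if __name__ == "__main__":
        phi = np.array([1.0e8, 5.0e7, 2.0e7])               # group fluxes, hypothetical
        kf_tissue = np.array([2.0e-15, 8.0e-15, 3.0e-14])   # Gy cm^2 per group, hypothetical
        print(f"tissue KERMA rate: {kerma_rate(phi, kf_tissue):.3e} Gy/s")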

  9. hybrid\\scriptsize{{MANTIS}}: a CPU-GPU Monte Carlo method for modeling indirect x-ray detectors with columnar scintillators

    NASA Astrophysics Data System (ADS)

    Sharma, Diksha; Badal, Andreu; Badano, Aldo

    2012-04-01

    The computational modeling of medical imaging systems often requires obtaining a large number of simulated images with low statistical uncertainty, which translates into prohibitive computing times. We describe a novel hybrid approach for Monte Carlo simulations that maximizes utilization of CPUs and GPUs in modern workstations. We apply the method to the modeling of indirect x-ray detectors using a new and improved version of the code MANTIS, an open source software tool used for the Monte Carlo simulations of indirect x-ray imagers. We first describe a GPU implementation of the physics and geometry models in fastDETECT2 (the optical transport model) and a serial CPU version of the same code. We discuss its new features, like on-the-fly column geometry and columnar crosstalk, in relation to the MANTIS code, and point out areas where our model provides more flexibility for the modeling of realistic columnar structures in large area detectors. Second, we modify PENELOPE (the open source software package that handles the x-ray and electron transport in MANTIS) to allow direct output of location and energy deposited during x-ray and electron interactions occurring within the scintillator. This information is then handled by optical transport routines in fastDETECT2. A load balancer dynamically allocates optical transport showers to the GPU and CPU computing cores. Our hybridMANTIS approach achieves a significant speed-up factor of 627 when compared to MANTIS and of 35 when compared to the same code running only in a CPU instead of a GPU. Using hybridMANTIS, we successfully hide hours of optical transport time by running it in parallel with the x-ray and electron transport, thus shifting the computational bottleneck from optical to x-ray transport. The new code requires much less memory than MANTIS and, as a result, allows us to efficiently simulate large area detectors.
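
    A key element described above is a load balancer that dynamically allocates optical-transport showers to GPU and CPU cores. The toy sketch below illustrates that scheduling idea with a shared work queue drained by two worker threads of different simulated speeds; it is not the hybridMANTIS code, and the per-item timings are invented.

    # Toy sketch of dynamic load balancing: a shared queue of "showers" drained
    # by a fast (GPU-like) and a slow (CPU-like) worker. Timings are simulated.
    import queue
    import threading
    import time

    def worker(name, work_q, per_item_seconds, completed):
        while True:
            try:
                shower = work_q.get_nowait()
            except queue.Empty:
                return
            time.sleep(per_item_seconds)          # stand-in for transporting one shower
            completed.append((name, shower))
            work_q.task_done()

    if __name__ == "__main__":
        work_q = queue.Queue()
        for shower_id in range(40):
            work_q.put(shower_id)
        completed = []
        threads = [
            threading.Thread(target=worker, args=("gpu", work_q, 0.001, completed)),
            threading.Thread(target=worker, args=("cpu", work_q, 0.010, completed)),
        ]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
        gpu_count = sum(1 for name, _ in completed if name == "gpu")
        print(f"GPU handled {gpu_count}/40 showers, CPU handled {40 - gpu_count}/40")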

  10. Monte Carlo charged-particle tracking and energy deposition on a Lagrangian mesh.

    PubMed

    Yuan, J; Moses, G A; McKenty, P W

    2005-10-01

    A Monte Carlo algorithm for alpha particle tracking and energy deposition on a cylindrical computational mesh in a Lagrangian hydrodynamics code used for inertial confinement fusion (ICF) simulations is presented. The straight line approximation is used to follow propagation of "Monte Carlo particles" which represent collections of alpha particles generated from thermonuclear deuterium-tritium (DT) reactions. Energy deposition in the plasma is modeled by the continuous slowing down approximation. The scheme addresses various aspects arising in the coupling of Monte Carlo tracking with Lagrangian hydrodynamics, such as non-orthogonal, severely distorted mesh cells, particle relocation on the moving mesh and particle relocation after rezoning. A comparison with the flux-limited multi-group diffusion transport method is presented for a polar direct drive target design for the National Ignition Facility. Simulations show that the Monte Carlo transport method predicts earlier ignition and a higher hot-spot temperature than the diffusion method. Nearly linear speed-up is achieved for multi-processor parallel simulations.
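
    The algorithm above combines straight-line tracking with the continuous slowing down approximation. The sketch below illustrates that combination on a simple one-dimensional mesh: a particle crosses cells in order and deposits dE/dx times the path length in each. The stopping-power model and mesh are hypothetical, not the Lagrangian-hydrodynamics coupling described in the record.

    # Minimal sketch: straight-line tracking with continuous slowing down on a
    # 1-D mesh. Stopping power and cell sizes are placeholders.
    def deposit_along_track(e0_mev, cell_edges_cm, stopping_power):
        """Return energy deposited (MeV) in each cell for a particle starting at x=0."""
        edep = [0.0] * (len(cell_edges_cm) - 1)
        energy = e0_mev
        for i, (lo, hi) in enumerate(zip(cell_edges_cm[:-1], cell_edges_cm[1:])):
            if energy <= 0.0:
                break
            de = stopping_power(energy) * (hi - lo)   # CSDA loss over the cell
            de = min(de, energy)
            edep[i] = de
            energy -= de
        return edep

    if __name__ == "__main__":
        edges = [0.001 * i for i in range(11)]                 # ten 10-micron cells
        sp = lambda e: 800.0 / max(e, 0.1)                     # toy dE/dx in MeV/cm
        deposits = deposit_along_track(3.5, edges, sp)         # 3.5 MeV DT alpha
        print(["%.3f" % d for d in deposits], "sum =", round(sum(deposits), 3))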

  11. QMCPACK : an open source ab initio quantum Monte Carlo package for the electronic structure of atoms, molecules and solids

    DOE PAGES

    Kim, Jeongnim; Baczewski, Andrew T.; Beaudet, Todd D.; ...

    2018-04-19

    QMCPACK is an open source quantum Monte Carlo package for ab-initio electronic structure calculations. It supports calculations of metallic and insulating solids, molecules, atoms, and some model Hamiltonians. Implemented real space quantum Monte Carlo algorithms include variational, diffusion, and reptation Monte Carlo. QMCPACK uses Slater-Jastrow type trial wave functions in conjunction with a sophisticated optimizer capable of optimizing tens of thousands of parameters. The orbital space auxiliary field quantum Monte Carlo method is also implemented, enabling cross validation between different highly accurate methods. The code is specifically optimized for calculations with large numbers of electrons on the latest high performance computing architectures, including multicore central processing unit (CPU) and graphical processing unit (GPU) systems. We detail the program’s capabilities, outline its structure, and give examples of its use in current research calculations. The package is available at http://www.qmcpack.org.
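
    As a hedged illustration of the real-space variational Monte Carlo idea mentioned above, the toy sketch below uses Metropolis sampling of |psi|^2 for a hydrogen atom with trial wavefunction psi = exp(-alpha*r) and averages the local energy. It is a teaching example, not QMCPACK's algorithms or optimizer.

    # Toy variational Monte Carlo for hydrogen with psi = exp(-alpha*r).
    # Illustration only; not QMCPACK's implementation.
    import math
    import random

    def local_energy(r, alpha):
        # E_L = -alpha^2/2 + (alpha - 1)/r for psi = exp(-alpha*r)
        return -0.5 * alpha ** 2 + (alpha - 1.0) / r

    def vmc_energy(alpha, n_steps=200_000, step=0.5, seed=1):
        rng = random.Random(seed)
        pos = [0.5, 0.5, 0.5]
        r = math.sqrt(sum(x * x for x in pos))
        e_sum, n_acc = 0.0, 0
        for _ in range(n_steps):
            trial = [x + step * (rng.random() - 0.5) for x in pos]
            r_trial = math.sqrt(sum(x * x for x in trial))
            # Metropolis acceptance with probability |psi_trial / psi|^2
            if rng.random() < math.exp(-2.0 * alpha * (r_trial - r)):
                pos, r = trial, r_trial
                n_acc += 1
            e_sum += local_energy(r, alpha)
        return e_sum / n_steps, n_acc / n_steps

    if __name__ == "__main__":
        for alpha in (0.8, 1.0, 1.2):
            energy, acc = vmc_energy(alpha)
            print(f"alpha = {alpha:.1f}: <E> = {energy:.4f} Ha (acceptance {acc:.2f})")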

  12. Collision of Physics and Software in the Monte Carlo Application Toolkit (MCATK)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sweezy, Jeremy Ed

    2016-01-21

    The topic is presented in a series of slides organized as follows: MCATK overview, development strategy, available algorithms, problem modeling (sources, geometry, data, tallies), parallelism, miscellaneous tools/features, example MCATK application, recent areas of research, and summary and future work. MCATK is a C++ component-based Monte Carlo neutron-gamma transport software library with continuous energy neutron and photon transport. Designed to build specialized applications and to provide new functionality in existing general-purpose Monte Carlo codes like MCNP, it reads ACE formatted nuclear data generated by NJOY. The motivation behind MCATK was to reduce costs. MCATK physics involves continuous energy neutron and gamma transport with multi-temperature treatment, static eigenvalue (k_eff and α) algorithms, a time-dependent algorithm, and fission chain algorithms. MCATK geometry includes mesh geometries and solid body geometries. MCATK provides verified, unit-tested Monte Carlo components, flexibility in Monte Carlo application development, and numerous tools such as geometry and cross section plotters.

  13. QMCPACK : an open source ab initio quantum Monte Carlo package for the electronic structure of atoms, molecules and solids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Jeongnim; Baczewski, Andrew T.; Beaudet, Todd D.

    QMCPACK is an open source quantum Monte Carlo package for ab-initio electronic structure calculations. It supports calculations of metallic and insulating solids, molecules, atoms, and some model Hamiltonians. Implemented real space quantum Monte Carlo algorithms include variational, diffusion, and reptation Monte Carlo. QMCPACK uses Slater-Jastrow type trial wave functions in conjunction with a sophisticated optimizer capable of optimizing tens of thousands of parameters. The orbital space auxiliary field quantum Monte Carlo method is also implemented, enabling cross validation between different highly accurate methods. The code is specifically optimized for calculations with large numbers of electrons on the latest high performance computing architectures, including multicore central processing unit (CPU) and graphical processing unit (GPU) systems. We detail the program’s capabilities, outline its structure, and give examples of its use in current research calculations. The package is available at http://www.qmcpack.org.

  14. Experimental and Monte Carlo studies of fluence corrections for graphite calorimetry in low- and high-energy clinical proton beams.

    PubMed

    Lourenço, Ana; Thomas, Russell; Bouchard, Hugo; Kacperek, Andrzej; Vondracek, Vladimir; Royle, Gary; Palmans, Hugo

    2016-07-01

    The aim of this study was to determine fluence corrections necessary to convert absorbed dose to graphite, measured by graphite calorimetry, to absorbed dose to water. Fluence corrections were obtained from experiments and Monte Carlo simulations in low- and high-energy proton beams. Fluence corrections were calculated to account for the difference in fluence between water and graphite at equivalent depths. Measurements were performed with narrow proton beams. Plane-parallel-plate ionization chambers with a large collecting area compared to the beam diameter were used to intercept the whole beam. High- and low-energy proton beams were provided by a scanning and double scattering delivery system, respectively. A mathematical formalism was established to relate fluence corrections derived from Monte Carlo simulations, using the FLUKA code [A. Ferrari et al., "FLUKA: A multi-particle transport code," in CERN 2005-10, INFN/TC 05/11, SLAC-R-773 (2005) and T. T. Böhlen et al., "The FLUKA Code: Developments and challenges for high energy and medical applications," Nucl. Data Sheets 120, 211-214 (2014)], to partial fluence corrections measured experimentally. A good agreement was found between the partial fluence corrections derived by Monte Carlo simulations and those determined experimentally. For a high-energy beam of 180 MeV, the fluence corrections from Monte Carlo simulations were found to increase from 0.99 to 1.04 with depth. In the case of a low-energy beam of 60 MeV, the magnitude of fluence corrections was approximately 0.99 at all depths when calculated in the sensitive area of the chamber used in the experiments. Fluence correction calculations were also performed for a larger area and found to increase from 0.99 at the surface to 1.01 at greater depths. Fluence corrections obtained experimentally are partial fluence corrections because they account for differences in the primary and part of the secondary particle fluence. A correction factor, F(d), has been established to relate fluence corrections defined theoretically to partial fluence corrections derived experimentally. The findings presented here are also relevant to water and tissue-equivalent-plastic materials given their carbon content.
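
    The study above builds fluence corrections from the fluence difference between water and graphite at equivalent depths. The sketch below illustrates one plausible form of that bookkeeping, a depth-by-depth ratio of water fluence to graphite fluence evaluated at a water-equivalent depth; the depth scaling and fluence profiles are hypothetical, not the FLUKA-derived data of the paper.

    # Hedged sketch: fluence correction as a ratio of fluences at equivalent depths.
    # Depth scaling and profiles are placeholders.
    import numpy as np

    def fluence_correction(depth_w, phi_water, depth_g, phi_graphite, wer):
        """k_fl(d) = phi_water(d) / phi_graphite(d / wer), with d the depth in water (cm).

        wer is the water-equivalent ratio used to map a water depth onto graphite.
        """
        phi_g_at_equiv = np.interp(depth_w / wer, depth_g, phi_graphite)
        return phi_water / phi_g_at_equiv

    if __name__ == "__main__":
        d_w = np.linspace(0.0, 20.0, 41)        # cm of water
        d_g = np.linspace(0.0, 12.0, 41)        # cm of graphite
        phi_w = 1.0 - 0.002 * d_w               # hypothetical slow falloff in water
        phi_g = 1.0 - 0.004 * d_g               # hypothetical falloff in graphite
        k_fl = fluence_correction(d_w, phi_w, d_g, phi_g, wer=1.7)
        print(f"k_fl at surface: {k_fl[0]:.3f}, at 18 cm: {k_fl[int(18 / 0.5)]:.3f}")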

  15. Monte Carlo and analytical calculations for characterization of gas bremsstrahlung in ILSF insertion devices

    NASA Astrophysics Data System (ADS)

    Salimi, E.; Rahighi, J.; Sardari, D.; Mahdavi, S. R.; Lamehi Rachti, M.

    2014-12-01

    Gas bremsstrahlung is generated in high energy electron storage rings through the interaction of the electron beam with residual gas molecules in the vacuum chamber. In this paper, Monte Carlo calculations have been performed to evaluate the radiation hazard due to gas bremsstrahlung in the Iranian Light Source Facility (ILSF) insertion devices. Shutter/stopper dimensions are determined, and the dose rate from photoneutrons produced via the giant resonance photonuclear reaction inside the shutter/stopper is also obtained. Other characteristics of gas bremsstrahlung, such as photon fluence, energy spectrum, angular distribution and equivalent dose in a tissue equivalent phantom, have also been investigated with the FLUKA Monte Carlo code.

  16. Monte Carlo study of four dimensional binary hard hypersphere mixtures

    NASA Astrophysics Data System (ADS)

    Bishop, Marvin; Whitlock, Paula A.

    2012-01-01

    A multithreaded Monte Carlo code was used to study the properties of binary mixtures of hard hyperspheres in four dimensions. The ratios of the diameters of the hyperspheres examined were 0.4, 0.5, 0.6, and 0.8. Many total densities of the binary mixtures were investigated. The pair correlation functions and the equations of state were determined and compared with other simulation results and theoretical predictions. At lower diameter ratios the pair correlation functions of the mixture agree with the pair correlation function of a one component fluid at an appropriately scaled density. The theoretical results for the equation of state compare well to the Monte Carlo calculations for all but the highest densities studied.
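
    The simulations above rely on hard-hypersphere Monte Carlo moves in four dimensions. The sketch below shows a minimal single-particle Metropolis sweep with a minimum-image overlap test for a binary mixture in a periodic 4D box; the density, sizes and the crude initialization are placeholders, and this is not the multithreaded code used in the study.

    # Minimal sketch: hard-hypersphere overlap test and single-particle Metropolis
    # sweep in 4D with periodic boundaries. Parameters are placeholders.
    import numpy as np

    def overlaps(positions, i, trial, diameters, box):
        """True if particle i placed at 'trial' overlaps any other particle (minimum image)."""
        delta = positions - trial
        delta -= box * np.round(delta / box)            # minimum-image convention
        dist2 = np.sum(delta ** 2, axis=1)
        min_sep = 0.5 * (diameters + diameters[i])      # contact distance for each pair
        dist2[i] = np.inf                               # ignore self
        return np.any(dist2 < min_sep ** 2)

    def sweep(positions, diameters, box, max_step, rng):
        accepted = 0
        for i in range(len(positions)):
            trial = (positions[i] + rng.uniform(-max_step, max_step, size=4)) % box
            if not overlaps(positions, i, trial, diameters, box):
                positions[i] = trial
                accepted += 1
        return accepted / len(positions)

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        box, n = 10.0, 64
        diameters = np.where(rng.random(n) < 0.5, 1.0, 0.5)   # binary mixture, ratio 0.5
        positions = rng.uniform(0.0, box, size=(n, 4))
        for i in range(n):                                     # crude overlap-free start
            while overlaps(positions, i, positions[i], diameters, box):
                positions[i] = rng.uniform(0.0, box, size=4)
        print("acceptance:", sweep(positions, diameters, box, max_step=0.2, rng=rng))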

  17. Origins of the changing detector response in small megavoltage photon radiation fields.

    PubMed

    Fenwick, John D; Georgiou, Georgios; Rowbottom, Carl G; Underwood, Tracy S A; Kumar, Sudhir; Nahum, Alan E

    2018-06-08

    Differences in detector response between measured small fields, f_clin, and wider reference fields, f_msr, can be overcome by using output correction factors k_{Qclin,Qmsr}^{fclin,fmsr} (written simply k below) or by designing detectors with field-size invariant responses. The changing response in small fields is caused by perturbations of the electron fluence within the detector sensitive volume. For solid-state detectors, it has recently been suggested that these perturbations might be caused by the non-water-equivalent effective atomic numbers Z of detector materials, rather than by their non-water-like densities. Using the EGSnrc Monte Carlo code we have analyzed the response of a PTW 60017 diode detector in a 6 MV beam, calculating the k correction factor from computed doses absorbed by water and by the detector sensitive volume in 0.5 × 0.5 and 4 × 4 cm^2 fields. In addition to the 'real' detector, fully modelled according to the manufacturer's blue-prints, we calculated doses and k factors for a 'Z → water' detector variant in which mass stopping-powers and microscopic interaction coefficients were set to those of water while preserving real material densities, and for a 'density → 1' variant in which densities were set to 1 g cm^-3, leaving mass stopping-powers and interaction coefficients at real levels. k equalled 0.910 ± 0.005 (2 standard deviations) for the real detector, was insignificantly different at 0.912 ± 0.005 for the 'Z → water' variant, but equalled 1.012 ± 0.006 for the 'density → 1' variant. For the 60017 diode in a 6 MV beam, then, k was determined primarily by the detector's density rather than its atomic composition. Further calculations showed this remained the case in a 15 MV beam. Interestingly, the sensitive volume electron fluence was perturbed more by detector atomic composition than by density; however, the density-dependent perturbation varied with field size, whereas the Z-dependent perturbation was relatively constant, little affecting k.
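
    The correction factor discussed above is formed from Monte Carlo doses scored in water and in the detector sensitive volume for the small and reference fields. The sketch below shows that ratio-of-ratios structure together with a quadrature combination of independent relative uncertainties; the dose values and uncertainties are placeholders, not the paper's EGSnrc results.

    # Sketch of an output correction factor k = (Dw/Ddet)_clin / (Dw/Ddet)_msr
    # with quadrature propagation of independent relative uncertainties.
    # Dose values below are placeholders; structure only.
    import math

    def output_correction(dw_clin, ddet_clin, dw_msr, ddet_msr):
        return (dw_clin / ddet_clin) / (dw_msr / ddet_msr)

    def combined_rel_unc(*rel_uncs):
        return math.sqrt(sum(u ** 2 for u in rel_uncs))

    if __name__ == "__main__":
        # Hypothetical scored doses (arbitrary units) and 1-sigma relative uncertainties.
        k = output_correction(dw_clin=0.912, ddet_clin=1.010, dw_msr=1.000, ddet_msr=1.008)
        u = combined_rel_unc(0.001, 0.001, 0.001, 0.001)
        print(f"k = {k:.3f} +/- {k * u:.3f} (1 sigma)")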

  18. SU-F-T-667: Development and Validation of Dose Calculation for An Open-Source KV Treatment Planning System for Small Animal Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prajapati, S; Mo, X; Bednarz, B

    Purpose: An open-source, convolution/superposition-based kV treatment planning system (TPS) was developed for small animal radiotherapy from a previously existing in-house MV-TPS. It is flexible and applicable to both step-and-shoot and helical tomotherapy treatment delivery. As part of the initial commissioning process, the dose calculation from the kV-TPS was compared with measurements and Monte Carlo (MC) simulations. Methods: High resolution, low energy kernels were simulated using the EGSnrc user code EDKnrc, which was used as input to the kV-TPS together with an MC-simulated x-ray beam spectrum. The Blue Water™ homogeneous phantom (with film inserts) and heterogeneous phantom (with film and TLD inserts) were fabricated. The phantom was placed at 100 cm SSD and irradiated with a 250 kVp beam for 10 min with a 1.1 cm × 1.1 cm open field (at 100 cm) created by a newly designed binary micro-MLC assembly positioned at 90 cm SSD. Gafchromic™ EBT3 film was calibrated in-phantom following AAPM TG-61 guidelines and used for measurements at 5 different depths in the phantom. Calibrated TLD-100s were obtained from ADCL. EGS and MCNP5 simulations were used to model the experimental irradiation setup and calculate dose in the phantom. Results: Using the homogeneous phantom, the dose difference between film and kV-TPS was calculated: mean (x)=0.9%; maximum difference (MD)=3.1%; standard deviation (σ)=1.1%. Dose difference between MCNP5 and kV-TPS was: x=1.5%; MD=4.6%; σ=1.9%. Dose difference between EGS and kV-TPS was: x=0.8%; MD=1.9%; σ=0.8%. Using the heterogeneous phantom, dose difference between film and kV-TPS was: x=2.6%; MD=3%; σ=1.1%; and dose difference between TLD and kV-TPS was: x=2.9%; MD=6.4%; σ=2.5%. Conclusion: The in-house, open-source kV-TPS dose calculation agreed within 5% with measurements and MC simulations in both homogeneous and heterogeneous phantoms. The dose calculation system of the kV-TPS is validated as part of the initial commissioning process for small animal radiotherapy. The kV-TPS has the potential for accurate dose calculation for kV treatment or imaging modalities.

  19. Dedicated high dose rate 192Ir brachytherapy radiation fields for in vitro cell exposures at variable source-target cell distances: killing of mammalian cells depends on temporal dose rate fluctuation

    NASA Astrophysics Data System (ADS)

    Veigel, Cornelia; Hartmann, Günther H.; Fritz, Peter; Debus, Jürgen; Weber, Klaus-Josef

    2017-02-01

    Afterloading brachytherapy is conducted by the stepwise movement of a radioactive source through surgically implanted applicator tubes, where calculated dwell times at predefined dwell positions optimize spatial dose delivery with respect to a planned dose level. The temporal exposure pattern exhibits drastic fluctuations in dose rate at a given coordinate and within a single treatment session because of the discontinuous and repeated source movement into the target volume. This could potentially affect biological response. Therefore, mammalian cells were exposed as monolayers to a high dose rate 192Ir source by utilizing a dedicated irradiation device in which the distance between a planar array of radioactive source positions and the plane of the cell monolayer could be varied from 2.5 mm to 40 mm, thus varying the dose rate pattern for any chosen total dose. The Gammamed IIi afterloading system equipped with a nominal 370 GBq (10 Ci) 192Ir source was used to irradiate V79 Chinese hamster lung fibroblasts in both confluent and exponential growth phase with doses up to 12 Gy (at room temperature, total exposure not exceeding 1 h). For comparison, V79 cells were also exposed to 6 MV x-rays from a clinical linear accelerator (dose rate of 2.5 Gy min-1). As the biological endpoint, cell survival was determined by a standard colony forming assay. Dose measurements were conducted with a diamond detector (sensitive area 7.3 mm2), calibrated by means of 60Co radiation. Additionally, dose delivery was simulated by Monte Carlo calculations using the EGSnrc code system. The calculated secondary electron fluence spectra at the cell location did not indicate a significant change of radiation quality (i.e. higher linear energy transfer) at the lower distances. Clonogenic cell survival curves obtained after brachytherapy exhibited an altered biological response compared to x-rays, characterized by a significant reduction of the survival curve shoulder when dose rate fluctuations were high. Therefore, also on the time scale of the present investigation, cellular effects of radiation are not invariant to the temporal pattern of dose rate. We propose that with high dose rate variation the cells activate their DNA damage response less efficiently than after continuous irradiation.

  20. The central electrode correction factor for high-Z electrodes in small ionization chambers.

    PubMed

    Muir, B R; Rogers, D W O

    2011-02-01

    Recent Monte Carlo calculations of beam quality conversion factors for ion chambers that use high-Z electrodes [B. R. Muir and D. W. O. Rogers, Med. Phys. 37, 5939-5950 (2010)] have shown large deviations of kQ values from values calculated using the same techniques as the TG-51 and TRS-398 protocols. This report investigates the central electrode correction factor, Pcel, for these chambers. Ionization chambers are modeled and Pcel is calculated using the EGSnrc user code egs_chamber for three cases: in photon and electron beams under reference conditions; as a function of distance from an iridium-192 point source in a water phantom; and as a function of depth in a water phantom on which a 200 kVp x-ray source or 6 MV beam is incident. In photon beams, differences of up to 3% between Pcel calculations for a chamber with a high-Z electrode and those used by TG-51 for a 1 mm diameter aluminum electrode are observed. The central electrode correction factor for a given value of the beam quality specifier is different depending on the amount of filtration of the photon beam. However, in an unfiltered 6 MV beam, Pcel varies by only 0.3% for a chamber with a high-Z electrode as the depth is varied from 1 to 20 cm in water. The difference between Pcel calculations for chambers with high-Z electrodes and TG-51 values for a chamber with an aluminum electrode is up to 0.45% in electron beams. The central electrode correction, which is roughly proportional to the chamber's absorbed-dose sensitivity, is found to be large and variable as a function of distance for chambers with high-Z and aluminum electrodes in low-energy photon fields. In this work, ionization chambers that employ high-Z electrodes have been shown to be problematic in various situations. For beam quality conversion factors, the ratio of Pcel in a beam quality Q to that in a Co-60 beam is required; for some chambers, kQ is significantly different from current dosimetry protocol values because of central electrode effects. It would be best for manufacturers to avoid producing ion chambers that use high-Z electrodes.
