Use of FLUKA to Create Dose Calculations
NASA Technical Reports Server (NTRS)
Lee, Kerry T.; Barzilla, Janet; Townsend, Lawrence; Brittingham, John
2012-01-01
Monte Carlo codes provide an effective means of modeling three-dimensional radiation transport; however, their use is both time- and resource-intensive. The creation of a lookup table or parameterization from Monte Carlo simulation allows users to perform calculations with Monte Carlo results without replicating lengthy calculations. The FLUKA Monte Carlo transport code was used to develop lookup tables and parameterizations for data resulting from the penetration of layers of aluminum, polyethylene, and water with areal densities ranging from 0 to 100 g/cm^2. Heavy charged-ion radiation, including ions from Z=1 to Z=26 with energies from 0.1 to 10 GeV/nucleon, was simulated. Dose, dose equivalent, and fluence as a function of particle identity, energy, and scattering angle were examined at various depths. Calculations were compared against well-known results and against the results of other deterministic and Monte Carlo codes. Results will be presented.
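A lookup table built from such FLUKA runs is typically queried by interpolation rather than by rerunning the transport code. The sketch below is a minimal illustration of that idea, not the authors' implementation: the grid spacing, the placeholder dose values and the helper name dose_lookup are all assumptions, and a regular grid in ion charge, energy and areal density is presumed.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Hypothetical grid: ion charge Z, kinetic energy (GeV/nucleon), areal density (g/cm^2).
charges = np.arange(1, 27)                    # Z = 1 ... 26
energies = np.geomspace(0.1, 10.0, 21)        # 0.1 - 10 GeV/nucleon
depths = np.linspace(0.0, 100.0, 51)          # 0 - 100 g/cm^2 of shielding

# Placeholder for dose values tabulated from the Monte Carlo runs (arbitrary units);
# in practice this array would be filled from the scored FLUKA output.
dose_table = np.random.rand(len(charges), len(energies), len(depths))

# Interpolator over the regular (Z, E, depth) grid.
dose_lookup = RegularGridInterpolator((charges, energies, depths), dose_table)

# Query the table for a 2 GeV/nucleon Z=8 ion behind 35 g/cm^2 of shielding.
print(dose_lookup((8, 2.0, 35.0)))
```

In practice the table would be filled from the scored FLUKA output, with the interpolation order chosen to match the smoothness of the tabulated quantity.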
Status of the Space Radiation Monte Carlo Simulation Based on FLUKA and ROOT
NASA Technical Reports Server (NTRS)
Andersen, Victor; Carminati, Federico; Empl, Anton; Ferrari, Alfredo; Pinsky, Lawrence; Sala, Paola; Wilson, Thomas L.
2002-01-01
The NASA-funded project reported on at the first IWSSRR in Arona to develop a Monte-Carlo simulation program for use in simulating the space radiation environment based on the FLUKA and ROOT codes is well into its second year of development, and considerable progress has been made. The general tasks required to achieve the final goals include the addition of heavy-ion interactions into the FLUKA code and the provision of a ROOT-based interface to FLUKA. The most significant progress to date includes the incorporation of the DPMJET event generator code within FLUKA to handle heavy-ion interactions for incident projectile energies greater than 3 GeV/A. The ongoing effort intends to extend the treatment of these interactions down to 10 MeV, and at present two alternative approaches are being explored. The ROOT interface is being pursued in conjunction with the CERN LHC ALICE software team through an adaptation of their existing AliROOT software. As a check on the validity of the code, a simulation of the recent data taken by the ATIC experiment is underway.
FLUKA simulation studies on in-phantom dosimetric parameters of a LINAC-based BNCT
NASA Astrophysics Data System (ADS)
Ghal-Eh, N.; Goudarzi, H.; Rahmani, F.
2017-12-01
The Monte Carlo simulation code FLUKA, version 2011.2c.5, has been used to estimate the in-phantom dosimetric parameters for use in BNCT studies. The in-phantom parameters of a typical Snyder head phantom, which are necessary information prior to any clinical treatment, have been calculated with both the FLUKA and MCNPX codes, which exhibit promising agreement. The results confirm that FLUKA can be regarded as a good alternative to MCNPX in BNCT dosimetry simulations.
Use of the FLUKA Monte Carlo code for 3D patient-specific dosimetry on PET-CT and SPECT-CT images
Botta, F; Mairani, A; Hobbs, R F; Vergara Gil, A; Pacilio, M; Parodi, K; Cremonesi, M; Coca Pérez, M A; Di Dia, A; Ferrari, M; Guerriero, F; Battistoni, G; Pedroli, G; Paganelli, G; Torres Aroche, L A; Sgouros, G
2014-01-01
Patient-specific absorbed dose calculation for nuclear medicine therapy is a topic of increasing interest. 3D dosimetry at the voxel level is one of the major improvements for the development of more accurate calculation techniques, as compared to the standard dosimetry at the organ level. This study aims to use the FLUKA Monte Carlo code to perform patient-specific 3D dosimetry through direct Monte Carlo simulation on PET-CT and SPECT-CT images. To this aim, dedicated routines were developed in the FLUKA environment. Two sets of simulations were performed on model and phantom images. Firstly, the correct handling of PET and SPECT images was tested under the assumption of a homogeneous water medium by comparing FLUKA results with those obtained with the voxel kernel convolution method and with other Monte Carlo-based tools developed for the same purpose (the EGS-based 3D-RD software and the MCNP5-based MCID). Afterwards, the correct integration of the PET/SPECT and CT information was tested, performing direct simulations on PET/CT images for both homogeneous (water) and non-homogeneous (water with air, lung and bone inserts) phantoms. Comparison was performed with the other Monte Carlo tools performing direct simulation as well. The absorbed dose maps were compared at the voxel level. In the case of homogeneous water, by simulating 10^8 primary particles a 2% average difference with respect to the kernel convolution method was achieved; such difference was lower than the statistical uncertainty affecting the FLUKA results. The agreement with the other tools was within 3–4%, partially ascribable to the differences among the simulation algorithms. Including the CT-based density map, the average difference was always within 4% irrespective of the medium (water, air, bone), except for a maximum 6% value when comparing FLUKA and 3D-RD in air. The results confirmed that the routines were properly developed, opening the way for the use of FLUKA for patient-specific, image-based dosimetry in nuclear medicine. PMID:24200697
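The voxel-level comparison described above boils down to an average relative difference between two absorbed dose maps, restricted to voxels that received a non-negligible dose. A minimal sketch of such a comparison is given below; the array names, the synthetic maps and the 10% dose threshold are illustrative assumptions rather than part of the published method.

```python
import numpy as np

def mean_relative_difference(dose_ref, dose_test, threshold=0.1):
    """Average relative difference (in %) between two 3D dose maps,
    restricted to voxels above a fraction of the maximum reference dose."""
    mask = dose_ref > threshold * dose_ref.max()
    rel_diff = (dose_test[mask] - dose_ref[mask]) / dose_ref[mask]
    return 100.0 * np.mean(np.abs(rel_diff))

# Synthetic maps standing in for, e.g., kernel-convolution and Monte Carlo results.
reference = np.random.rand(64, 64, 64) + 0.1
test = reference * (1.0 + 0.02 * np.random.randn(64, 64, 64))
print(f"average voxel difference: {mean_relative_difference(reference, test):.2f} %")
```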
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurosu, K; Department of Medical Physics & Engineering, Osaka University Graduate School of Medicine, Osaka; Takashina, M
Purpose: Monte Carlo codes are becoming important tools for proton beam dosimetry. However, the relationships between the customizing parameters and the percentage depth dose (PDD) of the GATE and PHITS codes have not been reported; these are studied here for the PDD and the proton range, in comparison with the FLUKA code and experimental data. Methods: The beam delivery system of the Indiana University Health Proton Therapy Center was modeled for the uniform scanning beam in FLUKA and transferred identically into GATE and PHITS. This computational model was built from the blueprint and validated with the commissioning data. The three parameters evaluated are the maximum step size, the cut-off energy, and the physical and transport model. The dependence of the PDDs on the customizing parameters was compared with the published results of previous studies. Results: The optimal parameters for the simulation of the whole beam delivery system were defined by referring to the calculation results obtained with each parameter. Although the PDDs from FLUKA and the experimental data show good agreement, those of GATE and PHITS obtained with our optimal parameters show a minor discrepancy. The measured proton range R90 was 269.37 mm, compared to calculated ranges of 269.63 mm, 268.96 mm, and 270.85 mm with FLUKA, GATE and PHITS, respectively. Conclusion: We evaluated the dependence of the PDDs obtained with the GATE and PHITS general-purpose Monte Carlo codes on the customizing parameters by using the whole computational model of the treatment nozzle. The optimal parameters for the simulation were then defined by referring to the calculation results. The physical model, particle transport mechanics and the different geometry-based descriptions need accurate customization in the three simulation codes to agree with experimental data for artifact-free Monte Carlo simulation. This study was supported by Grants-in-Aid for Cancer Research (H22-3rd Term Cancer Control-General-043) from the Ministry of Health, Labor and Welfare of Japan, Grants-in-Aid for Scientific Research (No. 23791419), and the JSPS Core-to-Core program (No. 23003). The authors have no conflict of interest.
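The quoted range R90 is conventionally read off the distal fall-off of a PDD curve as the depth at which the dose drops to 90% of its maximum. The snippet below is a generic sketch of that extraction, assuming a monotonically falling distal edge; the toy curve is invented and the function is not taken from the study itself.

```python
import numpy as np

def distal_r90(depth_mm, dose):
    """Depth (mm) at which the dose falls to 90% of its maximum, distal to the Bragg peak."""
    dose = np.asarray(dose, dtype=float) / np.max(dose)
    i_peak = int(np.argmax(dose))
    distal_d, distal_dose = depth_mm[i_peak:], dose[i_peak:]
    # Interpolate on the falling edge (dose decreases with depth beyond the peak).
    return float(np.interp(0.9, distal_dose[::-1], distal_d[::-1]))

# Toy PDD with a Bragg-peak-like shape, for demonstration only.
z = np.linspace(0.0, 300.0, 601)
pdd = 0.3 + 0.7 * np.exp(-((z - 269.0) / 6.0) ** 2)
print(f"R90 = {distal_r90(z, pdd):.2f} mm")
```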
NASA Astrophysics Data System (ADS)
Chatterjee, S.; Bakshi, A. K.; Tripathy, S. P.
2010-09-01
The response matrix for a CaSO4:Dy-based neutron dosimeter was generated using the Monte Carlo code FLUKA in the energy range from thermal to 20 MeV for a set of eight Bonner spheres of diameter 3-12″, including the bare one. The response of the neutron dosimeter was measured for the above set of spheres for a 241Am-Be neutron source covered with 2 mm of lead. An analytical expression for the response function was devised as a function of sphere mass. Using the Frascati Unfolding Iteration Tool (FRUIT) unfolding code, the neutron spectrum of 241Am-Be was unfolded and compared with the standard IAEA spectrum.
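In any Bonner-sphere unfolding scheme the measured readings M are related to the neutron spectrum φ by the folding relation M_i = Σ_j R_ij φ_j, where R is the response matrix generated here with FLUKA; the unfolding code iterates on φ until the folded readings reproduce the measurements. The sketch below only illustrates this folding step with invented array contents; it is not the FRUIT algorithm.

```python
import numpy as np

# Hypothetical response matrix: 8 spheres (rows) x 60 energy bins (columns),
# of the kind produced by the FLUKA calculations described above.
n_spheres, n_bins = 8, 60
response = np.random.rand(n_spheres, n_bins)          # reading per unit fluence
spectrum_guess = np.ones(n_bins) / n_bins             # trial 241Am-Be spectrum (a.u.)

# Folding step used inside any iterative unfolding scheme.
predicted_readings = response @ spectrum_guess

# The unfolding code adjusts spectrum_guess until the folded readings
# match the measured detector readings within uncertainties.
measured_readings = predicted_readings * (1.0 + 0.05 * np.random.randn(n_spheres))
chi2 = np.sum((predicted_readings - measured_readings) ** 2 / measured_readings)
print(predicted_readings, chi2)
```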
Space Applications of the FLUKA Monte-Carlo Code: Lunar and Planetary Exploration
NASA Technical Reports Server (NTRS)
Anderson, V.; Ballarini, F.; Battistoni, G.; Campanella, M.; Carboni, M.; Cerutti, F.; Elkhayari, N.; Empl, A.; Fasso, A.; Ferrari, A.;
2004-01-01
NASA has recognized the need for making additional heavy-ion collision measurements at the U.S. Brookhaven National Laboratory in order to support further improvement of several particle physics transport-code models for space exploration applications. FLUKA has been identified as one of these codes and we will review the nature and status of this investigation as it relates to high-energy heavy-ion physics.
Monte Carlo Simulation of a Segmented Detector for Low-Energy Electron Antineutrinos
NASA Astrophysics Data System (ADS)
Qomi, H. Akhtari; Safari, M. J.; Davani, F. Abbasi
2017-11-01
Detection of low-energy electron antineutrinos is of importance for several purposes, such as ex-vessel reactor monitoring, neutrino oscillation studies, etc. The inverse beta decay (IBD) is the interaction that is responsible for the detection mechanism in (organic) plastic scintillation detectors. Here, a detailed study will be presented dealing with the radiation and optical transport simulation of a typical segmented antineutrino detector with the Monte Carlo method using the MCNPX and FLUKA codes. This study shows different aspects of the detector, benefiting from the inherent capabilities of the Monte Carlo simulation codes.
NASA Technical Reports Server (NTRS)
Reddell, Brandon
2015-01-01
Designing hardware to operate in the space radiation environment is a very difficult and costly activity. Ground-based particle accelerators can be used to test for exposure to the radiation environment, one species at a time; however, the actual space environment cannot be duplicated because of the range of energies and the isotropic nature of space radiation. The FLUKA Monte Carlo code is an integrated physics package based at CERN that has been under development for the last 40+ years and includes the most up-to-date fundamental physics theory and particle physics data. This work presents an overview of FLUKA and how it has been used in conjunction with ground-based radiation testing for NASA to improve our understanding of secondary particle environments resulting from the interaction of space radiation with matter.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Botta, F; Di Dia, A; Pedroli, G
The calculation of patient-specific dose distribution can be achieved by Monte Carlo simulations or by analytical methods. In this study, the fluka Monte Carlo code has been considered for use in nuclear medicine dosimetry. Up to now, fluka has mainly been dedicated to other fields, namely high energy physics, radiation protection, and hadrontherapy. When first employing a Monte Carlo code for nuclear medicine dosimetry, its results concerning electron transport at energies typical of nuclear medicine applications need to be verified. This is commonly achieved by means of calculation of a representative parameter and comparison with reference data. The dose point kernel (DPK), quantifying the energy deposition all around a point isotropic source, is often the one. Methods: fluka DPKs have been calculated in both water and compact bone for monoenergetic electrons (10 keV–3 MeV) and for beta emitting isotopes commonly used for therapy (89Sr, 90Y, 131I, 153Sm, 177Lu, 186Re, and 188Re). Point isotropic sources have been simulated at the center of a water (bone) sphere, and the deposited energy has been tallied in concentric shells. fluka outcomes have been compared to penelope v.2008 results, calculated in this study as well. Moreover, in the case of monoenergetic electrons in water, comparison with data from the literature (etran, geant4, mcnpx) has been done. Maximum percentage differences within 0.8·RCSDA and 0.9·RCSDA for monoenergetic electrons (RCSDA being the continuous slowing down approximation range) and within 0.8·X90 and 0.9·X90 for isotopes (X90 being the radius of the sphere in which 90% of the emitted energy is absorbed) have been computed, together with the average percentage difference within 0.9·RCSDA and 0.9·X90 for electrons and isotopes, respectively. Results: Concerning monoenergetic electrons, within 0.8·RCSDA (where 90%-97% of the particle energy is deposited), fluka and penelope agree mostly within 7%, except for 10 and 20 keV electrons (12% in water, 8.3% in bone). The discrepancies between fluka and the other codes are of the same order of magnitude as those observed when comparing the other codes among themselves, which can be attributed to the different simulation algorithms. When considering the beta spectra, discrepancies notably reduce: within 0.9·X90, fluka and penelope differ by less than 1% in water and less than 2% in bone for any of the isotopes considered here. Complete data of fluka DPKs are given as Supplementary Material as a tool to perform dosimetry by analytical point kernel convolution. Conclusions: fluka provides reliable results when transporting electrons in the low energy range, proving to be an adequate tool for nuclear medicine dosimetry.
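A dose point kernel is obtained from such a tally by dividing the energy deposited in each concentric shell by the shell mass; the result is often rescaled against r/R_CSDA (electrons) or r/X90 (isotopes). The snippet below is a generic post-processing sketch under those assumptions; the shell energies are placeholders standing in for the scored output, and the function name is hypothetical.

```python
import numpy as np

def dose_point_kernel(shell_edges_cm, e_dep_mev, density_g_cm3=1.0):
    """Absorbed dose per decay in concentric spherical shells around a point source."""
    r_in, r_out = shell_edges_cm[:-1], shell_edges_cm[1:]
    shell_mass_g = density_g_cm3 * 4.0 / 3.0 * np.pi * (r_out**3 - r_in**3)
    mev_to_j = 1.602176634e-13
    return e_dep_mev * mev_to_j / (shell_mass_g * 1.0e-3)   # Gy per decay

# Placeholder tally: energy (MeV) deposited per decay in 50 shells out to 2 mm.
edges = np.linspace(0.0, 0.2, 51)                 # cm
energy_per_shell = np.random.rand(50) * 1e-3      # stands in for the scored energies
dpk = dose_point_kernel(edges, energy_per_shell)
r_mid = 0.5 * (edges[:-1] + edges[1:])
print(np.column_stack((r_mid, dpk))[:5])
```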
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baumann, K; Weber, U; Simeonov, Y
Purpose: The aim of this study was to optimize the magnetic field strengths of two quadrupole magnets in a particle therapy facility in order to obtain a beam quality suitable for spot beam scanning. Methods: The particle transport through an ion-optic system of a particle therapy facility, consisting of the beam tube, two quadrupole magnets and a beam monitor system, was calculated with the help of Matlab by using matrices that solve the equation of motion of a charged particle in a magnetic field and in a field-free region, respectively. The magnetic field strengths were optimized in order to obtain a circular and thin beam spot at the iso-center of the therapy facility. These optimized field strengths were subsequently transferred to the Monte Carlo code FLUKA, and the transport of 80 MeV/u 12C ions through this ion-optic system was calculated by using a user routine to implement magnetic fields. The fluence along the beam axis and at the iso-center was evaluated. Results: The magnetic field strengths could be optimized by using Matlab and transferred to the Monte Carlo code FLUKA. The implementation via a user routine was successful. Analyzing the fluence pattern along the beam axis, the characteristic focusing and de-focusing effects of the quadrupole magnets could be reproduced. Furthermore, the beam spot at the iso-center was circular and significantly thinner compared to an unfocused beam. Conclusion: In this study a Matlab tool was developed to optimize magnetic field strengths for an ion-optic system consisting of two quadrupole magnets as part of a particle therapy facility. These magnetic field strengths could subsequently be transferred to and implemented in the Monte Carlo code FLUKA to simulate the particle transport through this optimized ion-optic system.
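In such a matrix description a drift of length L and a quadrupole of strength k act on the transverse phase-space vector (x, x') through 2x2 transfer matrices, and the beam size at the iso-center follows from the product of the matrices along the line. The code below is a minimal, generic sketch of this formalism (focusing and defocusing planes of a thick-lens quadrupole); the element lengths and strengths are arbitrary and do not describe the facility studied here.

```python
import numpy as np

def drift(L):
    """Field-free drift of length L (m)."""
    return np.array([[1.0, L], [0.0, 1.0]])

def quad_focusing(k, l):
    """Thick-lens quadrupole, focusing plane; k in 1/m^2, l in m."""
    sk = np.sqrt(k)
    return np.array([[np.cos(sk * l),        np.sin(sk * l) / sk],
                     [-sk * np.sin(sk * l),  np.cos(sk * l)]])

def quad_defocusing(k, l):
    """Thick-lens quadrupole, defocusing plane."""
    sk = np.sqrt(k)
    return np.array([[np.cosh(sk * l),       np.sinh(sk * l) / sk],
                     [sk * np.sinh(sk * l),  np.cosh(sk * l)]])

# Arbitrary example lattice; the rightmost matrix is traversed first.
M = drift(1.5) @ quad_defocusing(2.0, 0.3) @ drift(0.5) @ quad_focusing(2.2, 0.3) @ drift(1.0)

x0 = np.array([0.002, 0.001])     # initial offset 2 mm, divergence 1 mrad
print("(x, x') at iso-center:", M @ x0)
```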
NASA Technical Reports Server (NTRS)
Ballarini, F.; Battistoni, G.; Campanella, M.; Carboni, M.; Cerutti, F.; Empl, A.; Fasso, A.; Ferrari, A.; Gadioli, E.; Garzelli, M. V.;
2006-01-01
FLUKA is a multipurpose Monte Carlo code which can transport a variety of particles over a wide energy range in complex geometries. The code is a joint project of INFN and CERN: part of its development is also supported by the University of Houston and NASA. FLUKA is successfully applied in several fields, including, but not limited to, particle physics, cosmic ray physics, dosimetry, radioprotection, hadron therapy, space radiation, accelerator design and neutronics. The code is the standard tool used at CERN for dosimetry, radioprotection and beam-machine interaction studies. Here we give a glimpse into the code physics models with particular emphasis on the hadronic and nuclear sector.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cho, S; Shin, E H; Kim, J
2015-06-15
Purpose: To evaluate the shielding wall design to protect patients, staff and members of the general public from secondary neutrons, using a simple analytic solution and the multi-Monte Carlo codes MCNPX, ANISN and FLUKA. Methods: Analytical and multi-Monte Carlo calculations were performed for the proton facility (Sumitomo Heavy Industry Ltd.) at Samsung Medical Center in Korea. The NCRP-144 analytical evaluation methods, which produce conservative estimates of the dose equivalent values for the shielding, were used for the analytical evaluations. Then, the radiation transport was simulated with the multi-Monte Carlo codes. The neutron dose at each evaluation point was obtained from the product of the simulated value and the neutron dose coefficient introduced in ICRP-74. Results: The evaluation points of the accelerator control room and the control room entrance are mainly influenced by the point of the proton beam loss. The neutron dose equivalent at the accelerator control room evaluation point is 0.651, 1.530, 0.912 and 0.943 mSv/yr, and at the entrance of the cyclotron room it is 0.465, 0.790, 0.522 and 0.453 mSv/yr, as calculated by the NCRP-144 formalism, ANISN, FLUKA and MCNP, respectively. Most of the MCNPX and FLUKA results, which used the complicated geometry, showed smaller values than the ANISN results. Conclusion: The neutron shielding for a proton therapy facility has been evaluated by the analytic model and multi-Monte Carlo methods. We confirmed the adequacy of the shielding for the areas readily accessible to people when the proton facility is operated.
Comparison of Space Radiation Calculations from Deterministic and Monte Carlo Transport Codes
NASA Technical Reports Server (NTRS)
Adams, J. H.; Lin, Z. W.; Nasser, A. F.; Randeniya, S.; Tripathi, r. K.; Watts, J. W.; Yepes, P.
2010-01-01
The presentation outline includes motivation, the radiation transport codes being considered, the space radiation cases being considered, results for slab geometry, results for spherical geometry, and a summary. Topics include the main physics in the radiation transport codes (HZETRN, UPROP, FLUKA, GEANT4), slab geometry, solar particle events (SPE) and galactic cosmic rays (GCR).
NASA Astrophysics Data System (ADS)
Tessonnier, T.; Mairani, A.; Brons, S.; Sala, P.; Cerutti, F.; Ferrari, A.; Haberer, T.; Debus, J.; Parodi, K.
2017-08-01
In the field of particle therapy helium ion beams could offer an alternative for radiotherapy treatments, owing to their interesting physical and biological properties intermediate between protons and carbon ions. We present in this work the comparisons and validations of the Monte Carlo FLUKA code against in-depth dosimetric measurements acquired at the Heidelberg Ion Beam Therapy Center (HIT). Depth dose distributions in water with and without ripple filter, lateral profiles at different depths in water and a spread-out Bragg peak were investigated. After experimentally-driven tuning of the less well-known initial beam characteristics in vacuum (beam lateral size and momentum spread) and simulation parameters (water ionization potential), comparisons of depth dose distributions were performed between simulations and measurements, which showed overall good agreement with range differences below 0.1 mm and dose-weighted average dose-differences below 2.3% throughout the entire energy range. Comparisons of lateral dose profiles showed differences in full-width-half-maximum lower than 0.7 mm. Measurements of the spread-out Bragg peak indicated differences with simulations below 1% in the high dose regions and 3% in all other regions, with a range difference less than 0.5 mm. Despite the promising results, some discrepancies between simulations and measurements were observed, particularly at high energies. These differences were attributed to an underestimation of dose contributions from secondary particles at large angles, as seen in a triple Gaussian parametrization of the lateral profiles along the depth. However, the results allowed us to validate FLUKA simulations against measurements, confirming its suitability for 4He ion beam modeling in preparation for clinical establishment at HIT. Future activities building on this work will include treatment plan comparisons using validated biological models between proton and helium ions, either within a Monte Carlo treatment planning engine based on the same FLUKA code, or an independent analytical planning system fed with a validated database of inputs calculated with FLUKA.
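Lateral dose profiles of ion beams are often parametrized as a sum of Gaussians, with the wider components absorbing the low-dose halo produced by secondaries scattered to large angles. The following sketch fits such a triple-Gaussian model to an invented profile; it is a generic example and not the parametrization actually used in the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def triple_gaussian(x, a1, s1, a2, s2, a3, s3):
    """Sum of three zero-centred Gaussians describing core, mid and halo components."""
    return (a1 * np.exp(-0.5 * (x / s1) ** 2)
            + a2 * np.exp(-0.5 * (x / s2) ** 2)
            + a3 * np.exp(-0.5 * (x / s3) ** 2))

# Invented lateral profile (mm, relative dose) standing in for a measurement.
x = np.linspace(-40.0, 40.0, 161)
profile = triple_gaussian(x, 1.0, 3.0, 0.05, 8.0, 0.01, 20.0)
profile += 0.0005 * np.random.randn(x.size)

p0 = (1.0, 2.0, 0.1, 6.0, 0.01, 15.0)                 # initial guesses
popt, _ = curve_fit(triple_gaussian, x, profile, p0=p0)
print("fitted sigmas (mm):", popt[1], popt[3], popt[5])
```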
NASA Astrophysics Data System (ADS)
Usta, Metin; Tufan, Mustafa Çağatay; Aydın, Güral; Bozkurt, Ahmet
2018-07-01
In this study, we have performed calculations of stopping power, depth dose, and range verification for proton beams using the dielectric and Bethe-Bloch theories and the FLUKA, Geant4 and MCNPX Monte Carlo codes. As analytical approaches, the Drude model was applied for the dielectric theory, and the effective charge approach with Roothaan-Hartree-Fock charge densities was used in the Bethe theory. In the simulations, different setup parameters were selected to evaluate the performance of the three distinct Monte Carlo codes. The lung and breast tissues investigated are associated with the most common types of cancer throughout the world. The results were compared with each other and with the available data in the literature. In addition, the obtained results were verified with prompt gamma range data. In both stopping power values and depth-dose distributions, it was found that the Monte Carlo values give better results compared with the analytical ones, while the results that agree best with ICRU data in terms of stopping power are those of the effective charge approach among the analytical methods and of the FLUKA code among the MC packages. In the depth dose distributions of the examined tissues, although the Bragg curves for Monte Carlo almost overlap, the analytical ones show significant deviations that become more pronounced with increasing energy. Verifications with the results of prompt gamma photons were attempted for 100-200 MeV protons, which are regarded as important for proton therapy. The analytical results are within 2%-5% and the Monte Carlo values are within 0%-2% as compared with those of the prompt gammas.
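For orientation, a simplified Bethe mass stopping power (neglecting shell and density-effect corrections, and taking T_max ≈ 2 m_e c² β²γ²) already reproduces tabulated proton stopping powers in water to within a few percent at therapeutic energies. The sketch below implements that simplified formula; the material constants for water (Z/A and I) are assumed values, not taken from the paper.

```python
import numpy as np

# Simplified Bethe mass stopping power for protons (no shell or density corrections).
K = 0.307075          # MeV cm^2 / mol
ME_C2 = 0.5109989     # electron rest energy, MeV
MP_C2 = 938.272       # proton rest energy, MeV

def bethe_mass_stopping_power(T_mev, Z, A, I_ev):
    """-dE/d(rho*x) in MeV cm^2/g for a proton of kinetic energy T_mev."""
    gamma = 1.0 + T_mev / MP_C2
    beta2 = 1.0 - 1.0 / gamma**2
    I_mev = I_ev * 1.0e-6
    arg = 2.0 * ME_C2 * beta2 * gamma**2 / I_mev
    return K * (Z / A) / beta2 * (np.log(arg) - beta2)

# 150 MeV protons in water (assumed Z = 10 electrons per molecule, A = 18.0153 g/mol, I = 75 eV);
# the result is close to 5.4 MeV cm^2/g.
print(bethe_mass_stopping_power(150.0, Z=10.0, A=18.0153, I_ev=75.0))
```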
Monte Carlo calculation of the atmospheric antinucleon flux
NASA Astrophysics Data System (ADS)
Djemil, T.; Attallah, R.; Capdevielle, J. N.
2009-12-01
The atmospheric antiproton and antineutron energy spectra are calculated at float altitude using the CORSIKA package in a three-dimensional Monte Carlo simulation. The hadronic interaction is treated by the FLUKA code below 80 GeV/nucleon and by NEXUS elsewhere. Solar modulation, which is described by the force-field theory, and geomagnetic effects are taken into account. The numerical results are compared with the BESS-2001 experimental data.
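The force-field approximation mentioned above relates the modulated spectrum at 1 AU to the local interstellar spectrum (LIS) through a single modulation potential φ: J(E) = J_LIS(E+Φ) · E(E+2mc²)/[(E+Φ)(E+Φ+2mc²)], with Φ = (Z/A)eφ. The sketch below applies this standard relation for protons; the power-law LIS and the value of φ are placeholders, not those used in the study.

```python
import numpy as np

MP_C2 = 938.272   # proton rest energy, MeV

def lis_proton(E_kin_mev):
    """Placeholder local interstellar proton spectrum (arbitrary units)."""
    E_tot = E_kin_mev + MP_C2
    return 1.0e4 * E_tot ** -2.7

def force_field(E_kin_mev, phi_mv, lis=lis_proton):
    """Modulated spectrum at 1 AU in the force-field approximation (protons, Z/A = 1)."""
    Phi = phi_mv                 # MeV for singly charged protons
    E_is = E_kin_mev + Phi       # kinetic energy the particle had in interstellar space
    factor = (E_kin_mev * (E_kin_mev + 2.0 * MP_C2)) / (E_is * (E_is + 2.0 * MP_C2))
    return lis(E_is) * factor

E = np.geomspace(100.0, 1.0e5, 5)         # kinetic energy, MeV
print(force_field(E, phi_mv=550.0))       # phi ~ 550 MV, an assumed modulation level
```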
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baumann, K; Weber, U; Simeonov, Y
2015-06-15
Purpose: The aim of this study was to analyze the modulating, broadening effect on the Bragg peak due to heterogeneous geometries, such as multi-wire chambers, in the beam path of a particle therapy beam line. The effect was described by a mathematical model which was implemented in the Monte Carlo code FLUKA via user routines, in order to reduce the computation time for the simulations. Methods: The depth dose curve of 80 MeV/u 12C ions in a water phantom was calculated using the Monte Carlo code FLUKA (reference curve). The modulating effect on this dose distribution behind eleven mesh-like foils (periodicity ∼80 microns), occurring in a typical set of multi-wire and dose chambers, was mathematically described by optimizing a normal distribution such that the reference curve convolved with this distribution equals the modulated dose curve. This distribution describes a displacement in water and was transformed into a probability distribution of the thickness of the eleven foils using the water-equivalent thickness of the foil material. From this distribution the thickness distribution of a single foil was determined inversely. In FLUKA the heterogeneous foils were replaced by homogeneous foils, and a user routine was programmed that varies the thickness of the homogeneous foils for each simulated particle according to this distribution. Results: Using the mathematical model and the user routine in FLUKA, the broadening effect could be reproduced exactly when replacing the heterogeneous foils by homogeneous ones. The computation time was reduced by 90 percent. Conclusion: In this study the broadening effect on the Bragg peak due to heterogeneous structures was analyzed, described by a mathematical model and implemented in FLUKA via user routines. Applying these routines, the computing time was reduced by 90 percent. The developed tool can be used for any heterogeneous structure with dimensions of microns to millimeters, in principle even for organic materials like lung tissue.
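The model described above amounts to convolving the unperturbed depth-dose curve with a normal distribution of water-equivalent displacements. A generic sketch of that convolution step is given below; the Bragg curve and the distribution width are invented for illustration and the edge handling is simplistic.

```python
import numpy as np

def convolve_depth_dose(z_mm, dose, sigma_mm):
    """Convolve a depth-dose curve with a zero-mean normal displacement distribution."""
    dz = z_mm[1] - z_mm[0]
    shifts = np.arange(-4.0 * sigma_mm, 4.0 * sigma_mm + dz, dz)
    weights = np.exp(-0.5 * (shifts / sigma_mm) ** 2)
    weights /= weights.sum()
    modulated = np.zeros_like(dose)
    for w, s in zip(weights, shifts):
        # Shift the curve by s and accumulate with its Gaussian weight.
        modulated += w * np.interp(z_mm - s, z_mm, dose)
    return modulated

# Invented sharp Bragg curve and a 0.3 mm r.m.s. water-equivalent displacement.
z = np.linspace(0.0, 40.0, 801)
bragg = 0.3 + 0.7 * np.exp(-0.5 * ((z - 32.0) / 0.4) ** 2)
print(convolve_depth_dose(z, bragg, sigma_mm=0.3).max())
```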
Ali, F; Waker, A J; Waller, E J
2014-10-01
Tissue-equivalent proportional counters (TEPC) can potentially be used as portable and personal dosemeters in mixed neutron and gamma-ray fields, but what hinders this use is their typically large physical size. To formulate compact TEPC designs, the use of a Monte Carlo transport code is necessary to predict the performance of compact designs in these fields. To perform this modelling, three candidate codes were assessed: MCNPX 2.7.E, FLUKA 2011.2 and PHITS 2.24. In each code, benchmark simulations were performed involving the irradiation of a 5-in. TEPC with monoenergetic neutron fields and a 4-in. wall-less TEPC with monoenergetic gamma-ray fields. The frequency and dose mean lineal energies and dose distributions calculated from each code were compared with experimentally determined data. For the neutron benchmark simulations, PHITS produces data closest to the experimental values and for the gamma-ray benchmark simulations, FLUKA yields data closest to the experimentally determined quantities.
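The frequency- and dose-mean lineal energies compared here are the moments of the single-event distribution f(y): ȳ_F = Σ y f(y) / Σ f(y) and ȳ_D = Σ y² f(y) / Σ y f(y). The short sketch below computes them from a binned spectrum, treating f(y) as counts per bin; the example spectrum is synthetic.

```python
import numpy as np

def mean_lineal_energies(y_kev_um, counts):
    """Frequency-mean and dose-mean lineal energy from a binned event-size spectrum."""
    y = np.asarray(y_kev_um, dtype=float)
    f = np.asarray(counts, dtype=float)       # interpreted as counts per bin
    y_f = np.sum(y * f) / np.sum(f)
    y_d = np.sum(y**2 * f) / np.sum(y * f)
    return y_f, y_d

# Synthetic event-size spectrum (keV/um) for illustration only.
y_bins = np.geomspace(0.1, 1000.0, 200)
counts = np.exp(-0.5 * ((np.log(y_bins) - np.log(5.0)) / 1.0) ** 2)
print("y_F = %.2f keV/um, y_D = %.2f keV/um" % mean_lineal_energies(y_bins, counts))
```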
Comparison of space radiation calculations for deterministic and Monte Carlo transport codes
NASA Astrophysics Data System (ADS)
Lin, Zi-Wei; Adams, James; Barghouty, Abdulnasser; Randeniya, Sharmalee; Tripathi, Ram; Watts, John; Yepes, Pablo
For space radiation protection of astronauts or electronic equipment, it is necessary to develop and use accurate radiation transport codes. Radiation transport codes include deterministic codes, such as HZETRN from NASA and UPROP from the Naval Research Laboratory, and Monte Carlo codes such as FLUKA, the Geant4 toolkit and HETC-HEDS. The deterministic codes and Monte Carlo codes complement each other in that deterministic codes are very fast while Monte Carlo codes are more elaborate. Therefore it is important to investigate how well the results of deterministic codes compare with those of Monte Carlo transport codes and where they differ. In this study we evaluate these different codes in their space radiation applications by comparing their output results in the same given space radiation environments, shielding geometry and material. Typical space radiation environments such as the 1977 solar minimum galactic cosmic ray environment are used as the well-defined input, and simple geometries made of aluminum, water and/or polyethylene are used to represent the shielding material. We then compare various outputs of these codes, such as the dose-depth curves and the flux spectra of different fragments and other secondary particles. These comparisons enable us to learn more about the main differences between these space radiation transport codes. At the same time, they help us to learn the qualitative and quantitative features that these transport codes have in common.
Lourenço, Ana; Thomas, Russell; Bouchard, Hugo; Kacperek, Andrzej; Vondracek, Vladimir; Royle, Gary; Palmans, Hugo
2016-07-01
The aim of this study was to determine fluence corrections necessary to convert absorbed dose to graphite, measured by graphite calorimetry, to absorbed dose to water. Fluence corrections were obtained from experiments and Monte Carlo simulations in low- and high-energy proton beams. Fluence corrections were calculated to account for the difference in fluence between water and graphite at equivalent depths. Measurements were performed with narrow proton beams. Plane-parallel-plate ionization chambers with a large collecting area compared to the beam diameter were used to intercept the whole beam. High- and low-energy proton beams were provided by a scanning and double scattering delivery system, respectively. A mathematical formalism was established to relate fluence corrections derived from Monte Carlo simulations, using the fluka code [A. Ferrari et al., "fluka: A multi-particle transport code," in CERN 2005-10, INFN/TC 05/11, SLAC-R-773 (2005) and T. T. Böhlen et al., "The fluka Code: Developments and challenges for high energy and medical applications," Nucl. Data Sheets 120, 211-214 (2014)], to partial fluence corrections measured experimentally. A good agreement was found between the partial fluence corrections derived by Monte Carlo simulations and those determined experimentally. For a high-energy beam of 180 MeV, the fluence corrections from Monte Carlo simulations were found to increase from 0.99 to 1.04 with depth. In the case of a low-energy beam of 60 MeV, the magnitude of fluence corrections was approximately 0.99 at all depths when calculated in the sensitive area of the chamber used in the experiments. Fluence correction calculations were also performed for a larger area and found to increase from 0.99 at the surface to 1.01 at greater depths. Fluence corrections obtained experimentally are partial fluence corrections because they account for differences in the primary and part of the secondary particle fluence. A correction factor, F(d), has been established to relate fluence corrections defined theoretically to partial fluence corrections derived experimentally. The findings presented here are also relevant to water and tissue-equivalent-plastic materials given their carbon content.
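A fluence correction factor of this kind is, in essence, the ratio of charged-particle fluence in water to that in graphite evaluated at water-equivalent depths. The snippet below shows that bookkeeping on invented depth-fluence tables; scaling graphite depths by a constant range ratio is an assumption made for illustration and is not the exact formalism of the paper.

```python
import numpy as np

def fluence_correction(depth_w_cm, fluence_water, depth_g_cm, fluence_graphite,
                       range_ratio_w_over_g):
    """k_fl(d): water-to-graphite fluence ratio at water-equivalent depths."""
    # Convert graphite depths to their water-equivalent depths.
    d_weq = np.asarray(depth_g_cm) * range_ratio_w_over_g
    phi_g_at_weq = np.interp(depth_w_cm, d_weq, fluence_graphite)
    return np.asarray(fluence_water) / phi_g_at_weq

# Invented depth-fluence tables standing in for Monte Carlo scoring.
d_w = np.linspace(0.0, 20.0, 41)                     # cm of water
phi_w = 1.0 - 0.01 * d_w                             # slowly attenuating primaries
d_g = np.linspace(0.0, 12.0, 41)                     # cm of graphite
phi_g = 1.0 - 0.017 * d_g
print(fluence_correction(d_w, phi_w, d_g, phi_g, range_ratio_w_over_g=20.0 / 12.0))
```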
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taleei, R; Qin, N; Jiang, S
2016-06-15
Purpose: Biological treatment plan optimization is of great interest for proton therapy. It requires extensive Monte Carlo (MC) simulations to compute physical dose and biological quantities. Recently, a gPMC package was developed for rapid MC dose calculations on a GPU platform. This work investigated its suitability for proton therapy biological optimization in terms of accuracy and efficiency. Methods: We performed simulations of a proton pencil beam with energies of 75, 150 and 225 MeV in a homogeneous water phantom using gPMC and FLUKA. Physical dose and energy spectra for each ion type on the central beam axis were scored. Relative Biological Effectiveness (RBE) was calculated using the repair-misrepair-fixation model. Microdosimetry calculations were performed using Monte Carlo Damage Simulation (MCDS). Results: Ranges computed by the two codes agreed within 1 mm. The physical dose difference was less than 2.5% at the Bragg peak. RBE-weighted dose agreed within 5% at the Bragg peak. Differences in microdosimetric quantities such as dose average lineal energy transfer and specific energy were < 10%. The simulation time per source particle with FLUKA was 0.0018 sec, while gPMC was ∼600 times faster. Conclusion: Physical dose computed by FLUKA and gPMC were in good agreement. The RBE differences along the central axis were small, and the RBE-weighted dose difference was found to be acceptable. The combined accuracy and efficiency makes gPMC suitable for proton therapy biological optimization.
Radiation Protection Considerations
NASA Astrophysics Data System (ADS)
Adorisio, C.; Roesler, S.; Urscheler, C.; Vincke, H.
This chapter summarizes the legal Radiation Protection (RP) framework to be considered in the design of HiLumi LHC. It details design limits, constraints and dose objectives, and explains how the As Low As Reasonably Achievable (ALARA) approach is formalized at CERN. Furthermore, features of the FLUKA Monte Carlo code that are of relevance for RP studies are summarized. Results of FLUKA simulations for residual dose rates during Long Shutdown 1 (LS1) are compared to measurements, demonstrating good agreement and providing evidence for the accuracy of FLUKA predictions for future shutdowns. Finally, an outlook for the residual dose rate evolution until LS3 is given.
NASA Astrophysics Data System (ADS)
Iliopoulou, E.; Bamidis, P.; Brugger, M.; Froeschl, R.; Infantino, A.; Kajimoto, T.; Nakao, N.; Roesler, S.; Sanami, T.; Siountas, A.
2018-03-01
The CERN High Energy AcceleRator Mixed field facility (CHARM) is located in the CERN Proton Synchrotron (PS) East Experimental Area. The facility receives a pulsed proton beam from the CERN PS with a beam momentum of 24 GeV/c, with 5·10^11 protons per pulse, a pulse length of 350 ms and a maximum average beam intensity of 6.7·10^10 p/s, which then impacts on the CHARM target. The shielding of the CHARM facility also includes the CERN Shielding Benchmark Facility (CSBF), situated laterally above the target. This facility consists of 80 cm of cast iron and 360 cm of concrete, with barite concrete in some places. Activation samples of bismuth and aluminium were placed in the CSBF and in the CHARM access corridor in July 2015. Monte Carlo simulations with the FLUKA code have been performed to estimate the specific production yields for these samples. The results estimated by FLUKA Monte Carlo simulations are compared to activation measurements of these samples. The comparison between FLUKA simulations and the measured values from γ-spectrometry gives agreement to better than a factor of 2.
NASA Astrophysics Data System (ADS)
Salimi, E.; Rahighi, J.; Sardari, D.; Mahdavi, S. R.; Lamehi Rachti, M.
2014-12-01
Gas bremsstrahlung is generated in high energy electron storage rings through the interaction of the electron beam with the residual gas molecules in the vacuum chamber. In this paper, Monte Carlo calculations have been performed to evaluate the radiation hazard due to gas bremsstrahlung in the Iranian Light Source Facility (ILSF) insertion devices. The shutter/stopper dimensions are determined, and the dose rate from photoneutrons produced via the giant resonance photonuclear reaction taking place inside the shutter/stopper is also obtained. Other characteristics of gas bremsstrahlung, such as photon fluence, energy spectrum, angular distribution and equivalent dose in a tissue-equivalent phantom, have also been investigated with the FLUKA Monte Carlo code.
NASA Astrophysics Data System (ADS)
Lourenço, A.; Shipley, D.; Wellock, N.; Thomas, R.; Bouchard, H.; Kacperek, A.; Fracchiolla, F.; Lorentini, S.; Schwarz, M.; MacDougall, N.; Royle, G.; Palmans, H.
2017-05-01
The aim of this work was to evaluate the water-equivalence of new trial plastics designed specifically for light-ion beam dosimetry as well as commercially available plastics in clinical proton beams. The water-equivalence of materials was tested by computing a plastic-to-water conversion factor, H_pl,w. Trial materials were characterized experimentally in 60 MeV and 226 MeV un-modulated proton beams and the results were compared with Monte Carlo simulations using the FLUKA code. For the high-energy beam, a comparison between the trial plastics and various commercial plastics was also performed using the FLUKA and Geant4 Monte Carlo codes. Experimental information was obtained from laterally integrated depth-dose ionization chamber measurements in water, with and without plastic slabs of variable thickness in front of the water phantom. Fluence correction factors, k_fl, between water and various materials were also derived using the Monte Carlo method. For the 60 MeV proton beam, H_pl,w and k_fl factors were within 1% of unity for all trial plastics. For the 226 MeV proton beam, experimental H_pl,w values deviated from unity by a maximum of about 1% for the three trial plastics, and the experimental results showed no advantage regarding which of the plastics was the most equivalent to water. Different magnitudes of corrections were found between Geant4 and FLUKA for the various materials, due mainly to the use of different nonelastic nuclear data. Nevertheless, for the 226 MeV proton beam, H_pl,w correction factors were within 2% of unity for all the materials. Considering the results from the two Monte Carlo codes, PMMA and trial plastic #3 had the smallest H_pl,w values, with maximum deviations from unity of 1%; however, the PMMA range differed by 16% from that of water. Overall, k_fl factors deviated more from unity than H_pl,w factors and could amount to a few percent for some materials.
ActiWiz 3 – an overview of the latest developments and their application
NASA Astrophysics Data System (ADS)
Vincke, H.; Theis, C.
2018-06-01
In 2011 the ActiWiz code was developed at CERN in order to optimize the choice of materials for accelerator equipment from a radiological point of view. Since then the code has been extended to allow for calculating complete nuclide inventories and to provide evaluations with respect to radiotoxicity, inhalation doses, etc. Until now the software included only pre-defined radiation environments for CERN's high-energy proton accelerators, which were based on FLUKA Monte Carlo calculations. Eventually the decision was taken to invest in a major revamping of the code. Starting with version 3 the software is no longer limited to pre-defined radiation fields but within a few seconds it can also treat arbitrary environments for which fluence spectra are available. This has become possible due to the use of ~100 CPU years' worth of FLUKA Monte Carlo simulations as well as the JEFF cross-section library for neutrons < 20 MeV. The latest code version also allowed for the efficient inclusion of 42 additional radiation environments of the LHC experiments as well as considerably more flexibility in view of characterizing also waste from CERN's Large Electron Positron collider (LEP). New fully integrated analysis functionalities, such as automatic evaluation of difficult-to-measure nuclides and rapid assessment of the temporal evolution of quantities like radiotoxicity or dose rates, make the software a powerful tool for characterization complementary to general-purpose MC codes like FLUKA. In this paper an overview of the capabilities is given using recent examples from the domain of waste characterization as well as operational radiation protection.
Path Toward a Unified Geometry for Radiation Transport
NASA Astrophysics Data System (ADS)
Lee, Kerry
The Direct Accelerated Geometry for Radiation Analysis and Design (DAGRAD) element of the RadWorks Project under Advanced Exploration Systems (AES) within the Space Technology Mission Directorate (STMD) of NASA will enable new designs and concepts of operation for radiation risk assessment, mitigation and protection. This element is designed to produce a solution that will allow NASA to calculate the transport of space radiation through complex CAD models using the state-of-the-art analytic and Monte Carlo radiation transport codes. Due to the inherent hazard of astronaut and spacecraft exposure to ionizing radiation in low-Earth orbit (LEO) or in deep space, risk analyses must be performed for all crew vehicles and habitats. Incorporating these analyses into the design process can minimize the mass needed solely for radiation protection. Transport of the radiation fields as they pass through shielding and body materials can be simulated using Monte Carlo techniques or described by the Boltzmann equation, which is obtained by balancing changes in particle fluxes as they traverse a small volume of material with the gains and losses caused by atomic and nuclear collisions. Deterministic codes that solve the Boltzmann transport equation, such as HZETRN (high charge and energy transport code developed by NASA LaRC), are generally computationally faster than Monte Carlo codes such as FLUKA, GEANT4, MCNP(X) or PHITS; however, they are currently limited to transport in one dimension, which poorly represents the secondary light ion and neutron radiation fields. NASA currently uses HZETRN space radiation transport software, both because it is computationally efficient and because proven methods have been developed for using this software to analyze complex geometries. Although Monte Carlo codes describe the relevant physics in a fully three-dimensional manner, their computational costs have thus far prevented their widespread use for analysis of complex CAD models, leading to the creation and maintenance of toolkit specific simplistic geometry models. The work presented here builds on the Direct Accelerated Geometry Monte Carlo (DAGMC) toolkit developed for use with the Monte Carlo N-Particle (MCNP) transport code. The work-flow for doing radiation transport on CAD models using MCNP and FLUKA has been demonstrated and the results of analyses on realistic spacecraft/habitats will be presented. Future work is planned that will further automate this process and enable the use of multiple radiation transport codes on identical geometry models imported from CAD. This effort will enhance the modeling tools used by NASA to accurately evaluate the astronaut space radiation risk and accurately determine the protection provided by as-designed exploration mission vehicles and habitats.
Monte Carlo Methods in Materials Science Based on FLUKA and ROOT
NASA Technical Reports Server (NTRS)
Pinsky, Lawrence; Wilson, Thomas; Empl, Anton; Andersen, Victor
2003-01-01
A comprehensive understanding of mitigation measures for space radiation protection necessarily involves the relevant fields of nuclear physics and particle transport modeling. One method of modeling the interaction of radiation traversing matter is Monte Carlo analysis, a subject that has been evolving since the very advent of nuclear reactors and particle accelerators in experimental physics. Countermeasures for radiation protection from neutrons near nuclear reactors, for example, were an early application, and Monte Carlo methods were quickly adapted to this general field of investigation. The project discussed here is concerned with taking the latest tools and technology in Monte Carlo analysis and adapting them to space applications such as radiation shielding design for spacecraft, as well as investigating how next-generation Monte Carlo codes can complement the existing analytical methods currently used by NASA. We have chosen to employ the Monte Carlo program known as FLUKA (a legacy acronym based on the German for FLUctuating KAscade) to simulate all of the particle transport, and the CERN-developed graphical-interface object-oriented analysis software called ROOT. One aspect of space radiation analysis for which the Monte Carlo codes are particularly suited is the study of secondary radiation produced as albedo in the vicinity of the structural geometry involved. This broad goal of simulating space radiation transport through the relevant materials employing the FLUKA code necessarily requires the addition of the capability to simulate all heavy-ion interactions from 10 MeV/A up to the highest conceivable energies. For all energies above 3 GeV/A the Dual Parton Model (DPM) is currently used, although the possible improvement of the DPMJET event generator for energies of 3-30 GeV/A is being considered. One of the major tasks still facing us is the provision for heavy-ion interactions below 3 GeV/A. The ROOT interface is being developed in conjunction with the CERN ALICE (A Large Ion Collider Experiment) software team through an adaptation of their existing AliROOT (ALICE Using ROOT) architecture. In order to check our progress against actual data, we have chosen to simulate the ATIC (Advanced Thin Ionization Calorimeter) cosmic-ray astrophysics balloon payload as well as neutron fluences in the Mir spacecraft. This paper contains a summary of the status of this project and a roadmap to its successful completion.
A Detailed FLUKA-2005 Monte Carlo Simulation for the ATIC Detector
NASA Technical Reports Server (NTRS)
Gunasingha, R. M.; Fazely, A. R.; Adams, J. H.; Ahn, H. S.; Bashindzhagyan, G. L.; Batkov, K. E.; Chang, J.; Christl, M.; Ganel, O.; Guzik, T. G.
2006-01-01
We have performed a detailed Monte Carlo (MC) calculation for the Advanced Thin Ionization Calorimeter (ATIC) detector using the MC code FLUKA-2005, which is capable of simulating particles up to 10 PeV. The ATIC detector has completed two successful balloon flights from McMurdo, Antarctica, lasting a total of more than 35 days. ATIC is designed as a multiple, long-duration balloon flight investigation of the cosmic ray spectra from below 50 GeV to near 100 TeV total energy, using a fully active Bismuth Germanate (BGO) calorimeter. It is equipped with a large mosaic of silicon detector pixels capable of charge identification, and as a particle tracking system, three projective layers of x-y scintillator hodoscopes were employed above, in the middle of, and below a 0.75 nuclear interaction length graphite target. Our calculations are part of an analysis package studying both the A- and energy-dependences of different nuclei interacting with the ATIC detector. The MC simulates the responses of different components of the detector, such as the Si matrix, the scintillator hodoscopes and the BGO calorimeter, to various nuclei. We also show comparisons of the FLUKA-2005 MC calculations with a GEANT calculation and with data for protons, He and CNO.
Monte Carlo simulations for angular and spatial distributions in therapeutic-energy proton beams
NASA Astrophysics Data System (ADS)
Lin, Yi-Chun; Pan, C. Y.; Chiang, K. J.; Yuan, M. C.; Chu, C. H.; Tsai, Y. W.; Teng, P. K.; Lin, C. H.; Chao, T. C.; Lee, C. C.; Tung, C. J.; Chen, A. E.
2017-11-01
The purpose of this study is to compare the angular and spatial distributions of therapeutic-energy proton beams obtained from the FLUKA, GEANT4 and MCNP6 Monte Carlo codes. The Monte Carlo simulations of proton beams passing through two thin targets and a water phantom were investigated to compare the primary and secondary proton fluence distributions and dosimetric differences among these codes. The angular fluence distributions, central axis depth-dose profiles, and lateral distributions of the Bragg peak cross-field were calculated to compare the proton angular and spatial distributions and energy deposition. Benchmark verifications from three different Monte Carlo simulations could be used to evaluate the residual proton fluence for the mean range and to estimate the depth and lateral dose distributions and the characteristic depths and lengths along the central axis as the physical indices corresponding to the evaluation of treatment effectiveness. The results showed a general agreement among codes, except that some deviations were found in the penumbra region. These calculated results are also particularly helpful for understanding primary and secondary proton components for stray radiation calculation and reference proton standard determination, as well as for determining lateral dose distribution performance in proton small-field dosimetry. By demonstrating these calculations, this work could serve as a guide to the recent field of Monte Carlo methods for therapeutic-energy protons.
NASA Astrophysics Data System (ADS)
Kurosu, Keita; Das, Indra J.; Moskvin, Vadim P.
2016-01-01
Spot scanning, owing to its superior dose-shaping capability, provides unsurpassed dose conformity, in particular for complex targets. However, the robustness of the delivered dose distribution and prescription has to be verified. Monte Carlo (MC) simulation has the potential to generate significant advantages for high-precision particle therapy, especially for media containing inhomogeneities. However, the choice of computational parameters in the MC simulation codes GATE, PHITS and FLUKA, which has previously been examined for the uniform scanning proton beam, needs to be evaluated for spot scanning. This means that the relationship between the input parameters and the calculation results should be carefully scrutinized. The objective of this study was, therefore, to determine the optimal parameters for the spot scanning proton beam for both the GATE and PHITS codes by using data from a FLUKA simulation as a reference. The proton beam scanning system of the Indiana University Health Proton Therapy Center was modeled in FLUKA, and the geometry was subsequently and identically transferred to GATE and PHITS. Although the beam transport is managed by the spot scanning system, the spot location is always set at the center of a water phantom of 600 × 600 × 300 mm3, which is placed after the treatment nozzle. The percentage depth dose (PDD) is computed along the central axis using 0.5 × 0.5 × 0.5 mm3 voxels in the water phantom. The PDDs and the proton ranges obtained with several computational parameters are then compared to those of FLUKA, and optimal parameters are determined from the accuracy of the proton range, suppressed dose deviation, and computational time minimization. Our results indicate that the optimized parameters are different from those for uniform scanning, suggesting that a gold standard for setting computational parameters for any proton therapy application cannot be determined consistently, since the impact of the parameter settings depends on the proton irradiation technique. We therefore conclude that customization parameters must be set with reference to the optimized parameters of the corresponding irradiation technique in order to render them useful for achieving artifact-free MC simulation for use in computational experiments and clinical treatments.
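Determining "optimal" parameters in this sense amounts to scanning candidate settings, scoring each by its range and dose deviation from the reference calculation, and keeping the fastest setting that stays within tolerance. The loop below sketches that selection logic with invented candidate values and tolerances; the parameter names and numbers are not tied to actual GATE or PHITS settings.

```python
# Hypothetical candidates: (max step size in mm, cut-off energy in keV, CPU time in h,
# absolute range deviation in mm, mean PDD deviation in %) versus the reference calculation.
candidates = [
    (1.0, 100.0,  2.1, 0.9, 2.5),
    (0.5, 100.0,  3.4, 0.4, 1.2),
    (0.5,  10.0,  5.8, 0.3, 1.0),
    (0.1,  10.0, 21.0, 0.3, 0.9),
]

RANGE_TOL_MM = 0.5    # acceptance criteria (assumed for illustration)
DOSE_TOL_PCT = 1.5

acceptable = [c for c in candidates if c[3] <= RANGE_TOL_MM and c[4] <= DOSE_TOL_PCT]
best = min(acceptable, key=lambda c: c[2])     # fastest setting that meets both tolerances
print("chosen (step, cut-off, hours):", best[:3])
```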
Path Toward a Unified Geometry for Radiation Transport
NASA Technical Reports Server (NTRS)
Lee, Kerry; Barzilla, Janet; Davis, Andrew; Zachmann
2014-01-01
The Direct Accelerated Geometry for Radiation Analysis and Design (DAGRAD) element of the RadWorks Project under Advanced Exploration Systems (AES) within the Space Technology Mission Directorate (STMD) of NASA will enable new designs and concepts of operation for radiation risk assessment, mitigation and protection. This element is designed to produce a solution that will allow NASA to calculate the transport of space radiation through complex computer-aided design (CAD) models using the state-of-the-art analytic and Monte Carlo radiation transport codes. Due to the inherent hazard of astronaut and spacecraft exposure to ionizing radiation in low-Earth orbit (LEO) or in deep space, risk analyses must be performed for all crew vehicles and habitats. Incorporating these analyses into the design process can minimize the mass needed solely for radiation protection. Transport of the radiation fields as they pass through shielding and body materials can be simulated using Monte Carlo techniques or described by the Boltzmann equation, which is obtained by balancing changes in particle fluxes as they traverse a small volume of material with the gains and losses caused by atomic and nuclear collisions. Deterministic codes that solve the Boltzmann transport equation, such as HZETRN [high charge and energy transport code developed by NASA Langley Research Center (LaRC)], are generally computationally faster than Monte Carlo codes such as FLUKA, GEANT4, MCNP(X) or PHITS; however, they are currently limited to transport in one dimension, which poorly represents the secondary light ion and neutron radiation fields. NASA currently uses the HZETRN space radiation transport software, both because it is computationally efficient and because proven methods have been developed for using this software to analyze complex geometries. Although Monte Carlo codes describe the relevant physics in a fully three-dimensional manner, their computational costs have thus far prevented their widespread use for analysis of complex CAD models, leading to the creation and maintenance of toolkit-specific simplistic geometry models. The work presented here builds on the Direct Accelerated Geometry Monte Carlo (DAGMC) toolkit developed for use with the Monte Carlo N-Particle (MCNP) transport code. The workflow for achieving radiation transport on CAD models using MCNP and FLUKA has been demonstrated and the results of analyses on realistic spacecraft/habitats will be presented. Future work is planned that will further automate this process and enable the use of multiple radiation transport codes on identical geometry models imported from CAD. This effort will enhance the modeling tools used by NASA to accurately evaluate the astronaut space radiation risk and to accurately determine the protection provided by as-designed exploration mission vehicles and habitats.
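For orientation, the one-dimensional (straight-ahead, continuous-slowing-down) Boltzmann transport equation solved by deterministic codes of the HZETRN type is commonly written as (a schematic textbook form, not a quotation from this report):

\[
\left[\frac{\partial}{\partial x}-\frac{1}{A_j}\frac{\partial}{\partial E}\,\tilde{S}_j(E)+\sigma_j(E)\right]\phi_j(x,E)
\;=\;\sum_{k}\int_{E}^{\infty}\sigma_{jk}(E,E')\,\phi_k(x,E')\,\mathrm{d}E',
\]

where phi_j(x,E) is the fluence of particle type j at depth x and energy E per nucleon, S_j is the stopping power, A_j the mass number, sigma_j the total macroscopic interaction cross section and sigma_jk the cross section for producing type j in collisions of type k; the gain term on the right and the loss terms on the left express the flux balance described above.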
NASA Astrophysics Data System (ADS)
Kurosu, Keita; Takashina, Masaaki; Koizumi, Masahiko; Das, Indra J.; Moskvin, Vadim P.
2014-10-01
Although the three general-purpose Monte Carlo (MC) simulation tools Geant4, FLUKA and PHITS have been used extensively, differences in their calculation results have been reported. The major causes are differences in the implementation of the physics models, the preset value of the ionization potential, and the definition of the maximum step size. In order to achieve artifact-free MC simulations, an optimized parameter list for each simulation system is required. Several authors have already proposed such optimized lists, but those studies were performed with simple systems such as a bare water phantom. Since particle beams undergo transport, nuclear interaction and electromagnetic processes throughout beam delivery, establishing an optimized parameter list for the whole beam delivery system is of major importance. The purpose of this study was to determine the optimized parameter lists for GATE and PHITS using a computational model of a proton treatment nozzle. The simulation was performed with the broad scanning proton beam. The influences of the customizing parameters on the percentage depth dose (PDD) profile and the proton range were investigated by comparison with the results of FLUKA, and the optimal parameters were then determined. The PDD profile and the proton range obtained with our optimized parameter list showed different characteristics from the results obtained with the simple system. This led to the conclusion that the physics models, particle transport mechanics and geometry-based descriptions need accurate customization when planning computational experiments in order to achieve artifact-free MC simulation.
Overview of Recent Radiation Transport Code Comparisons for Space Applications
NASA Astrophysics Data System (ADS)
Townsend, Lawrence
Recent advances in radiation transport code development for space applications have resulted in various comparisons of code predictions for a variety of scenarios and codes. Comparisons among both Monte Carlo and deterministic codes have been made and published by various groups and collaborations, including comparisons involving, but not limited to, HZETRN, HETC-HEDS, FLUKA, GEANT, PHITS, and MCNPX. In this work, an overview of recent code prediction inter-comparisons, including comparisons to available experimental data, is presented and discussed, with emphasis on the areas of agreement and disagreement among the various code predictions and published data.
A method for radiological characterization based on fluence conversion coefficients
NASA Astrophysics Data System (ADS)
Froeschl, Robert
2018-06-01
Radiological characterization of components in accelerator environments is often required to ensure adequate radiation protection during maintenance, transport and handling, as well as for the selection of the proper disposal pathway. The relevant quantities are typically weighted sums of specific activities with radionuclide-specific weighting coefficients. Traditional Monte Carlo methods either score radionuclide creation events directly or score the particle fluences in the regions of interest and weight them off-line with radionuclide production cross sections. The presented method bases the radiological characterization on a set of fluence conversion coefficients. For a given irradiation profile and cool-down time, and for given radionuclide production cross sections, material composition and radionuclide-specific weighting coefficients, a set of particle-type- and energy-dependent fluence conversion coefficients is computed. These fluence conversion coefficients can then be used in a Monte Carlo transport code to perform on-line weighting and directly obtain the desired radiological characterization, either by using built-in multiplier features, as in the PHITS code, or by writing a dedicated user routine, as for the FLUKA code. The presented method has been validated against the standard event-based methods directly available in Monte Carlo transport codes.
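As an illustration of the weighting idea, the sketch below folds scored fluence spectra with precomputed conversion coefficients; the function and data layout are hypothetical and do not represent the FLUKA user routine or the PHITS multiplier feature mentioned above:

```python
import numpy as np

def weighted_sum_of_activities(fluence, conv_coeff):
    """Fold scored fluence spectra with precomputed fluence conversion
    coefficients to obtain the radiological characterization quantity
    (a weighted sum of specific activities for the chosen irradiation
    profile, cool-down time, material and weighting coefficients).

    fluence    : dict {particle: spectrum}, fluence per energy bin
    conv_coeff : dict {particle: coefficients}, one coefficient per bin,
                 already folding production cross sections, buildup,
                 decay and the radionuclide-specific weighting factors
    """
    total = 0.0
    for particle, phi in fluence.items():
        total += float(np.dot(np.asarray(phi, float), np.asarray(conv_coeff[particle], float)))
    return total

# toy example: two particle species, three energy bins (hypothetical numbers)
phi = {"neutron": [1.2e-3, 4.0e-4, 5.0e-5], "proton": [2.0e-5, 1.0e-5, 2.0e-6]}
coeff = {"neutron": [0.1, 0.4, 1.3], "proton": [0.05, 0.2, 0.9]}
print(weighted_sum_of_activities(phi, coeff))
```

Performing this fold on-line, inside the transport code, avoids storing full fluence spectra for every region of interest.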
Investigation of HZETRN 2010 as a Tool for Single Event Effect Qualification of Avionics Systems
NASA Technical Reports Server (NTRS)
Rojdev, Kristina; Atwell, William; Boeder, Paul; Koontz, Steve
2014-01-01
NASA's future human exploration missions are focused on deep space destinations that do not provide a simple emergency return to Earth. In addition, the deep space environment contains a constant background Galactic Cosmic Ray (GCR) radiation exposure, as well as periodic Solar Particle Events (SPEs) that can produce intense amounts of radiation in a short amount of time. Given these conditions, it is important that the avionics systems for deep space human missions are not susceptible to Single Event Effects (SEE) that can occur from radiation interactions with electronic components. The typical approach to minimizing SEE is to use heritage hardware and extensive testing programs, both of which are very costly. Previous work by Koontz et al. [1] utilized an analysis-based method for investigating electronic component susceptibility. In their paper, FLUKA, a Monte Carlo transport code, was used to calculate SEE and single event upset (SEU) rates. This code was then validated against in-flight data. In addition, CREME-96, a deterministic code, was also compared with FLUKA and in-flight data. However, FLUKA has a long run time (on the order of days), and CREME-96 has not been updated in several years. This paper will investigate the use of HZETRN 2010, a deterministic transport code developed at NASA Langley Research Center, as another tool that can be used to analyze SEE and SEU rates. The benefits of using HZETRN over FLUKA and CREME-96 are that it has a very fast run time (on the order of minutes) and has been shown to be of similar accuracy to other deterministic and Monte Carlo codes when considering dose [2, 3, 4]. The 2010 version of HZETRN has an updated treatment of secondary neutrons and thus improved accuracy over previous versions. In this paper, the Linear Energy Transfer (LET) spectra are of interest rather than the total ionizing dose. Therefore, the LET spectra output from HZETRN 2010 will be compared with the FLUKA results and in-flight data to validate HZETRN 2010 as a computational tool for SEE qualification by analysis. Furthermore, extrapolation of these data to interplanetary environments at 1 AU will be investigated to determine whether HZETRN 2010 can be used successfully and confidently for deep space mission analyses.
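For context, SEE/SEU qualification by analysis typically folds the LET spectrum with a device upset cross section measured in ground tests; schematically (a standard rate relation, not a formula taken from this paper):

\[
R_{\mathrm{SEU}} \;=\; \int_{L_{\mathrm{th}}}^{\infty} \sigma_{\mathrm{SEU}}(L)\,\frac{\mathrm{d}\Phi}{\mathrm{d}L}(L)\,\mathrm{d}L,
\]

where dPhi/dL is the differential particle flux in linear energy transfer L, sigma_SEU(L) is the per-device (or per-bit) upset cross section, and L_th is the LET threshold; this is why the LET spectra produced by HZETRN 2010 and FLUKA are the quantities being compared.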
FLUKA simulation of TEPC response to cosmic radiation.
Beck, P; Ferrari, A; Pelliccioni, M; Rollet, S; Villari, R
2005-01-01
Aircrew exposure to cosmic radiation can be assessed by calculations with codes validated by measurements. However, the relationship between doses in the free atmosphere, as calculated by the codes, and the results of measurements performed within an aircraft is still unclear. The response of a tissue-equivalent proportional counter (TEPC) has already been simulated successfully with the Monte Carlo transport code FLUKA. Absorbed dose rate and ambient dose equivalent rate distributions as functions of lineal energy have been simulated for several reference sources and mixed radiation fields, and the agreement between simulation and measurements has been well demonstrated. In order to evaluate the influence of aircraft structures on aircrew exposure assessment, the response of the TEPC in the free atmosphere and on board is now simulated. The calculated results are discussed and compared with other calculations and measurements.
Comparison of Fluka-2006 Monte Carlo Simulation and Flight Data for the ATIC Detector
NASA Technical Reports Server (NTRS)
Gunasingha, R.M.; Fazely, A.R.; Adams, J.H.; Ahn, H.S.; Bashindzhagyan, G.L.; Chang, J.; Christl, M.; Ganel, O.; Guzik, T.G.; Isbert, J.;
2007-01-01
We have performed a detailed Monte Carlo (MC) simulation for the Advanced Thin Ionization Calorimeter (ATIC) detector using the MC code FLUKA-2006, which is capable of simulating particles up to 10 PeV. The ATIC detector has completed two successful balloon flights from McMurdo, Antarctica, lasting a total of more than 35 days. ATIC is designed as a multiple, long-duration balloon flight investigation of the cosmic ray spectra from below 50 GeV to near 100 TeV total energy, using a fully active Bismuth Germanate (BGO) calorimeter. It is equipped with a large mosaic of silicon detector pixels capable of charge identification and, for particle tracking, three projective layers of x-y scintillator hodoscopes, located above, in the middle of and below a 0.75 nuclear interaction length graphite target. Our simulations are part of an analysis package addressing both the nuclear (A) and energy dependences of the response for different nuclei interacting in the ATIC detector. The MC simulates the response of different components of the detector, such as the Si-matrix, the scintillator hodoscopes and the BGO calorimeter, to various nuclei. We present comparisons of the FLUKA-2006 MC calculations with GEANT calculations and with the ATIC CERN data and ATIC flight data.
Benchmark studies of induced radioactivity produced in LHC materials, Part I: Specific activities.
Brugger, M; Khater, H; Mayer, S; Prinz, A; Roesler, S; Ulrici, L; Vincke, H
2005-01-01
Samples of materials which will be used in the LHC machine for shielding and construction components were irradiated in the stray radiation field of the CERN-EU high-energy reference field facility. After irradiation, the specific activities induced in the various samples were analysed with a high-precision gamma spectrometer at various cooling times, allowing identification of isotopes with a wide range of half-lives. Furthermore, the irradiation experiment was simulated in detail with the FLUKA Monte Carlo code. A comparison of measured and calculated specific activities shows good agreement, supporting the use of FLUKA for estimating the level of induced activity in the LHC.
The FLUKA code for space applications: recent developments
NASA Technical Reports Server (NTRS)
Andersen, V.; Ballarini, F.; Battistoni, G.; Campanella, M.; Carboni, M.; Cerutti, F.; Empl, A.; Fasso, A.; Ferrari, A.; Gadioli, E.;
2004-01-01
The FLUKA Monte Carlo transport code is widely used for fundamental research, radioprotection and dosimetry, hybrid nuclear energy systems and cosmic ray calculations. The validity of its physical models has been benchmarked against a variety of experimental data over a wide range of energies, ranging from accelerator data to cosmic ray showers in the Earth's atmosphere. The code is presently undergoing several developments in order to better fit the needs of space applications. The generation of particle spectra according to up-to-date cosmic ray data as well as the effect of the solar and geomagnetic modulation have been implemented and already successfully applied to a variety of problems. The implementation of suitable models for heavy ion nuclear interactions has reached an operational stage. At medium/high energy FLUKA is using the DPMJET model. The major task of incorporating heavy ion interactions from a few GeV/n down to the threshold for inelastic collisions is also progressing, and promising results have been obtained using a modified version of the RQMD-2.4 code. This interim solution is now fully operational, while waiting for the development of new models based on the FLUKA hadron-nucleus interaction code, a newly developed QMD code, and the implementation of the Boltzmann master equation theory for low energy ion interactions. © 2004 COSPAR. Published by Elsevier Ltd. All rights reserved.
Monte Carlo Simulations of Background Spectra in Integral Imager Detectors
NASA Technical Reports Server (NTRS)
Armstrong, T. W.; Colborn, B. L.; Dietz, K. L.; Ramsey, B. D.; Weisskopf, M. C.
1998-01-01
Predictions of the expected gamma-ray backgrounds in the ISGRI (CdTe) and PiCsIT (CsI) detectors on INTEGRAL due to cosmic-ray interactions and the diffuse gamma-ray background have been made using a coupled set of Monte Carlo radiation transport codes (HETC, FLUKA, EGS4, and MORSE) and a detailed, 3-D mass model of the spacecraft and detector assemblies. The simulations include both the prompt background component from induced hadronic and electromagnetic cascades and the delayed component due to emissions from induced radioactivity. Background spectra have been obtained with and without the use of active (BGO) shielding and charged particle rejection to evaluate the effectiveness of anticoincidence counting on background rejection.
NASA Astrophysics Data System (ADS)
Magro, G.; Dahle, T. J.; Molinelli, S.; Ciocca, M.; Fossati, P.; Ferrari, A.; Inaniwa, T.; Matsufuji, N.; Ytre-Hauge, K. S.; Mairani, A.
2017-05-01
Particle therapy facilities often require Monte Carlo (MC) simulations to overcome intrinsic limitations of analytical treatment planning systems (TPS) related to the description of the mixed radiation field and of the beam interaction with tissue inhomogeneities. Some of these uncertainties may affect the computation of effective dose distributions; therefore, particle therapy dedicated MC codes should provide both absorbed and biological doses. Two biophysical models are currently applied clinically in particle therapy: the local effect model (LEM) and the microdosimetric kinetic model (MKM). In this paper, we describe the coupling of the NIRS (National Institute of Radiological Sciences, Japan) clinical dose calculation approach to the FLUKA MC code. We moved from the implementation of the model itself to its application in clinical cases, according to the NIRS approach, where a scaling factor is introduced to rescale the (carbon-equivalent) biological dose to a clinical dose level. A high level of agreement was found with published data by exploring a range of values for the MKM input parameters, while some differences were registered in forward recalculations of NIRS patient plans, mainly attributable to differences with the analytical TPS dose engine (taken as reference) in describing the mixed radiation field (lateral spread and fragmentation). We present a tool which is being used at the Italian National Center for Oncological Hadrontherapy to support the comparison study between the NIRS clinical dose level and the LEM dose specification.
Fiorini, Francesca; Schreuder, Niek; Van den Heuvel, Frank
2018-02-01
Cyclotron-based pencil beam scanning (PBS) proton machines nowadays represent the majority and the most affordable choice for proton therapy facilities; however, their representation in Monte Carlo (MC) codes is more complex than that of passively scattered proton systems or synchrotron-based PBS machines. This is because degraders are used to decrease the energy from the cyclotron maximum energy to the desired energy, resulting in a unique spot size, divergence, and energy spread depending on the amount of degradation. This manuscript outlines a generalized methodology to characterize a cyclotron-based PBS machine in a general-purpose MC code. The code can then be used to generate clinically relevant plans starting from commercial TPS plans. The described beam is produced at the Provision Proton Therapy Center (Knoxville, TN, USA) using cyclotron-based IBA Proteus Plus equipment. We characterized the Provision beam in the FLUKA MC code using the experimental commissioning data. The code was then validated using experimental data in water phantoms for single pencil beams and larger irregular fields. Comparisons with RayStation TPS plans are also presented. Comparisons of experimental, simulated, and planned dose depositions in water show that the same doses are calculated by both programs inside the target areas, while penumbra differences are found at the field edges. These differences are lower for the MC, with a γ(3%-3 mm) index never below 95%. Extensive explanations of how MC codes can be adapted to simulate cyclotron-based scanning proton machines are given, with the aim of using the MC as a TPS verification tool to check and improve clinical plans. For all the tested cases, we showed that dose differences with respect to experimental data are lower for the MC than for the TPS, implying that the created FLUKA beam model is better able to describe the experimental beam. © 2017 The Authors. Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
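For reference, the γ(3%-3 mm) figure quoted above combines a dose-difference criterion with a distance-to-agreement criterion; a minimal 1D sketch of a global gamma evaluation (illustrative NumPy implementation, not the tool used by the authors):

```python
import numpy as np

def gamma_1d(x_ref_mm, d_ref, x_eval_mm, d_eval, dose_crit=0.03, dta_mm=3.0):
    """Simple global 1D gamma index (after Low et al.): for each reference point,
    the minimum combined dose-difference / distance-to-agreement metric over all
    evaluated points.  dose_crit is a fraction of the reference dose maximum."""
    x_ref_mm, d_ref = np.asarray(x_ref_mm, float), np.asarray(d_ref, float)
    x_eval_mm, d_eval = np.asarray(x_eval_mm, float), np.asarray(d_eval, float)
    d_max = d_ref.max()
    gammas = np.empty(d_ref.size)
    for i, (xr, dr) in enumerate(zip(x_ref_mm, d_ref)):
        dose_term = (d_eval - dr) / (dose_crit * d_max)
        dist_term = (x_eval_mm - xr) / dta_mm
        gammas[i] = np.sqrt(dose_term**2 + dist_term**2).min()
    return gammas

# pass rate: fraction of reference points with gamma <= 1
# pass_rate = np.mean(gamma_1d(x_meas, d_meas, x_mc, d_mc) <= 1.0)
```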
Infantino, Angelo; Cicoria, Gianfranco; Lucconi, Giulia; Pancaldi, Davide; Vichi, Sara; Zagni, Federico; Mostacci, Domiziano; Marengo, Mario
2016-12-01
In the planning of a new cyclotron facility, an accurate knowledge of the radiation field around the accelerator is fundamental for the design of shielding and the protection of workers, the general public and the environment. Monte Carlo simulations can be very useful in this process, and their use is constantly increasing. However, few data have been published so far regarding the proper validation of Monte Carlo simulations against experimental measurements, particularly in the energy range of biomedical cyclotrons. In this work a detailed model of an existing installation of a GE PETtrace 16.5 MeV cyclotron was developed using FLUKA. An extensive measurement campaign of the neutron ambient dose equivalent H*(10) at marked positions around the cyclotron was conducted using a neutron rem-counter probe and CR39 neutron detectors. Data from a previous measurement campaign performed by our group using TLDs were also re-evaluated. The FLUKA model was then validated by comparing the results of high-statistics simulations with the experimental data. In 10 out of 12 measurement locations, the FLUKA simulations agreed within uncertainties with all three sets of experimental data; in the remaining two positions, the agreement was with two of the three measurements. Our work provides a quantitative validation of our FLUKA simulation setup and confirms that the Monte Carlo technique can produce accurate results in the energy range of biomedical cyclotrons. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
Simulation of radiation environment for the LHeC detector
NASA Astrophysics Data System (ADS)
Nayaz, Abdullah; Piliçer, Ercan; Joya, Musa
2017-02-01
The detector response and the radiation environment of the Large Hadron electron Collider (LHeC) baseline detector are estimated in order to predict its performance over the lifetime of the project. In this work, the geometry of the LHeC detector, as reported in the LHeC Conceptual Design Report (CDR), was built in the FLUKA Monte Carlo tool in order to simulate the detector response and the radiation environment. For this purpose, electrons and protons with sufficiently high energy were sent isotropically from the interaction point of the detector. As a result, the detector response and radiation background for the LHeC detector, scored with different USRBIN quantities (ENERGY, HADGT20M, ALL-CHAR, ALL-PAR) in FLUKA, are presented.
NASA Astrophysics Data System (ADS)
Lin, Yi-Chun; Liu, Yuan-Hao; Nievaart, Sander; Chen, Yen-Fu; Wu, Shu-Wei; Chou, Wen-Tsae; Jiang, Shiang-Huei
2011-10-01
High energy photon (over 10 MeV) and neutron beams adopted in radiobiology and radiotherapy always produce mixed neutron/gamma-ray fields. Mg(Ar) ionization chambers are commonly applied to determine the gamma-ray dose because of their neutron-insensitive characteristics. Nowadays, many perturbation corrections for accurate dose estimation and many treatment planning systems are based on the Monte Carlo technique. The Monte Carlo codes EGSnrc, FLUKA, GEANT4, MCNP5, and MCNPX were used to evaluate the energy-dependent response functions of the Exradin M2 Mg(Ar) ionization chamber to a parallel photon beam with mono-energies from 20 keV to 20 MeV. For validation, measurements were carefully performed in well-defined fields: (a) a primary M-100 X-ray calibration field, (b) a primary 60Co calibration beam, and (c) 6-MV and (d) 10-MV therapeutic beams in a hospital. In the energy region below 100 keV, MCNP5 and MCNPX both gave lower responses than the other codes. For energies above 1 MeV, the MCNP ITS mode closely resembled the other three codes, with differences within 5%. Compared to the measured currents, MCNP5 and MCNPX using the ITS mode agreed very well for the 60Co and 10-MV beams, but in the X-ray energy region the deviations reached 17%. This work gives better insight into the performance of different Monte Carlo codes in photon-electron transport calculations. Regarding the application to mixed-field dosimetry such as BNCT, MCNP with the ITS mode is recognized by this work as the most suitable tool.
A model for the accurate computation of the lateral scattering of protons in water
NASA Astrophysics Data System (ADS)
Bellinzona, E. V.; Ciocca, M.; Embriaco, A.; Ferrari, A.; Fontana, A.; Mairani, A.; Parodi, K.; Rotondi, A.; Sala, P.; Tessonnier, T.
2016-02-01
A pencil beam model for the calculation of the lateral scattering in water of protons of any therapeutic energy and depth is presented. It is based on the full Molière theory, taking into account the energy loss and the effects of mixtures and compounds. Concerning the electromagnetic part, the model has no free parameters and is in very good agreement with the FLUKA Monte Carlo (MC) code. The effects of the nuclear interactions are parametrized with a two-parameter tail function, adjusted to MC data calculated with FLUKA. The model, after convolution with the beam and the detector response, is in agreement with recent proton data in water from HIT. The model gives results with the same accuracy as the MC codes based on Molière theory, with a much shorter computing time.
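To make the structure of such a model concrete, the sketch below composes a Gaussian electromagnetic core with a slowly decaying nuclear halo term (illustrative only: the exponential halo and all parameter values are placeholders, not the two-parameter tail function fitted in the paper):

```python
import numpy as np

def lateral_profile(r_mm, sigma_em_mm, w_nuc, b_mm):
    """Illustrative radially symmetric pencil-beam profile: a Gaussian core for
    the (Moliere-like) electromagnetic multiple scattering plus a broad,
    slowly decaying nuclear-halo term of relative weight w_nuc."""
    r = np.abs(np.asarray(r_mm, dtype=float))
    core = np.exp(-0.5 * (r / sigma_em_mm) ** 2) / (2.0 * np.pi * sigma_em_mm ** 2)
    halo = np.exp(-r / b_mm) / (2.0 * np.pi * b_mm ** 2)
    return (1.0 - w_nuc) * core + w_nuc * halo   # both terms integrate to 1 over the plane

r = np.linspace(0.0, 60.0, 601)                  # radial distance (mm)
profile = lateral_profile(r, sigma_em_mm=4.0, w_nuc=0.05, b_mm=15.0)
```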
NASA Astrophysics Data System (ADS)
Trinh, N. D.; Fadil, M.; Lewitowicz, M.; Ledoux, X.; Laurent, B.; Thomas, J.-C.; Clerc, T.; Desmezières, V.; Dupuis, M.; Madeline, A.; Dessay, E.; Grinyer, G. F.; Grinyer, J.; Menard, N.; Porée, F.; Achouri, L.; Delaunay, F.; Parlog, M.
2018-07-01
Double differential neutron spectra (energy, angle) originating from a thick natCu target bombarded by a 12 MeV/nucleon 36S16+ beam were measured by the activation method and the Time-of-flight technique at the Grand Accélérateur National d'Ions Lourds (GANIL). A neutron spectrum unfolding algorithm combining the SAND-II iterative method and Monte-Carlo techniques was developed for the analysis of the activation results that cover a wide range of neutron energies. It was implemented into a graphical user interface program, called GanUnfold. The experimental neutron spectra are compared to Monte-Carlo simulations performed using the PHITS and FLUKA codes.
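A minimal sketch of the classic SAND-II multiplicative update that such an unfolding combines with Monte Carlo sampling (illustrative NumPy version under the assumption of a simple response matrix; this is not the GanUnfold implementation):

```python
import numpy as np

def sand_ii_step(phi, sigma, a_meas):
    """One SAND-II iteration: multiplicatively adjust the group fluxes phi
    (length J) so that the calculated reaction rates approach the measured
    activities a_meas (length I).  sigma is the I x J response matrix
    (cross section times bin width); groups with zero response need special
    handling in a real implementation, this is only a minimal sketch."""
    a_calc = sigma @ phi                               # calculated reaction rates
    w = sigma * phi / a_calc[:, None]                  # fractional contributions W_ij
    corr = np.exp((w * np.log(a_meas / a_calc)[:, None]).sum(axis=0) / w.sum(axis=0))
    return phi * corr

def unfold(phi_guess, sigma, a_meas, n_iter=50):
    phi = np.asarray(phi_guess, dtype=float)
    sigma = np.asarray(sigma, dtype=float)
    a_meas = np.asarray(a_meas, dtype=float)
    for _ in range(n_iter):
        phi = sand_ii_step(phi, sigma, a_meas)
    return phi
```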
FLUKA Monte Carlo simulations and benchmark measurements for the LHC beam loss monitors
NASA Astrophysics Data System (ADS)
Sarchiapone, L.; Brugger, M.; Dehning, B.; Kramer, D.; Stockner, M.; Vlachoudis, V.
2007-10-01
One of the crucial elements in terms of machine protection for CERN's Large Hadron Collider (LHC) is its beam loss monitoring (BLM) system. On-line loss measurements must prevent the superconducting magnets from quenching and protect the machine components from damage due to unforeseen critical beam losses. In order to ensure the quality of the BLM design, detailed FLUKA Monte Carlo simulations were performed for the betatron collimation insertion during the final design phase of the LHC. In addition, benchmark measurements were carried out with LHC-type BLMs installed at the CERN-EU high-energy Reference Field facility (CERF). This paper presents results of FLUKA calculations performed for BLMs installed in the collimation region, compares the results of the CERF measurement with FLUKA simulations and evaluates the related uncertainties. This, together with the fact that the CERF source spectra at the respective BLM locations are comparable with those at the LHC, allows the sensitivity of the performed LHC design studies to be assessed.
Estimation of Airborne Radioactivity Induced by 8-GeV-Class Electron LINAC Accelerator.
Asano, Yoshihiro
2017-10-01
Airborne radioactivity induced by high-energy electrons from 6 to 10 GeV is estimated by using analytical methods and the Monte Carlo codes PHITS and FLUKA. Measurements using a gas monitor with a NaI(Tl) scintillator are carried out in air from a dump room at SACLA, an x-ray free-electron laser facility with 7.8-GeV electrons, and are compared to the simulations.
Development of a Space Radiation Monte-Carlo Computer Simulation Based on the FLUKA and ROOT Codes
NASA Technical Reports Server (NTRS)
Pinsky, L. S.; Wilson, T. L.; Ferrari, A.; Sala, Paola; Carminati, F.; Brun, R.
2001-01-01
The radiation environment in space is a complex problem to model. Trying to extrapolate the projections of that environment into all areas of the internal spacecraft geometry is even more daunting. With the support of our CERN colleagues, our research group in Houston is embarking on a project to develop a radiation transport tool that is tailored to the problem of taking the external radiation flux incident on any particular spacecraft and simulating the evolution of that flux through a geometrically accurate model of the spacecraft material. The output will be a prediction of the detailed nature of the resulting internal radiation environment within the spacecraft as well as its secondary albedo. Beyond doing the physics transport of the incident flux, the software tool we are developing will provide a self-contained stand-alone object-oriented analysis and visualization infrastructure. It will also include a graphical user interface and a set of input tools to facilitate the simulation of space missions in terms of nominal radiation models and mission trajectory profiles. The goal of this project is to produce a code that is considerably more accurate and user-friendly than existing Monte-Carlo-based tools for the evaluation of the space radiation environment. Furthermore, the code will be an essential complement to the currently existing analytic codes in the BRYNTRN/HZETRN family for the evaluation of radiation shielding. The code will be directly applicable to the simulation of environments in low Earth orbit, on the lunar surface, on planetary surfaces (including the Earth) and in the interplanetary medium such as on a transit to Mars (and even in the interstellar medium). The software will include modules whose underlying physics base can continue to be enhanced and updated for physics content, as future data become available beyond the timeframe of the initial development now foreseen. This future maintenance will be available from the authors of FLUKA as part of their continuing efforts to support the users of the FLUKA code within the particle physics community. In keeping with the spirit of developing an evolving physics code, we are planning, as part of this project, to participate in the efforts to validate the core FLUKA physics in ground-based accelerator test runs. The emphasis of these test runs will be the physics of greatest interest in the simulation of the space radiation environment. Such a tool will be of great value to planners, designers and operators of future space missions, as well as for the design of the vehicles and habitats to be used on such missions. It will also be of aid to future experiments of various kinds that may be affected at some level by the ambient radiation environment, or in the analysis of hybrid experiment designs that have been discussed for space-based astronomy and astrophysics. The tool will be of value to the Life Sciences personnel involved in the prediction and measurement of radiation doses experienced by the crewmembers on such missions. In addition, the tool will be of great use to the planners of experiments to measure and evaluate the space radiation environment itself. It can likewise be useful in the analysis of safe havens, hazard mitigation plans, and NASA's call for new research in composites, and to NASA engineers modeling the radiation exposure of electronic circuits.
This code will provide an important complementary check on the predictions of analytic codes such as BRYNTRN/HZETRN that are presently used for many similar applications, and which have shortcomings that are more easily overcome with Monte Carlo type simulations. Finally, it is acknowledged that there are similar efforts based around the use of the GEANT4 Monte-Carlo transport code currently under development at CERN. It is our intention to make our software modular and sufficiently flexible to allow the parallel use of either FLUKA or GEANT4 as the physics transport engine.
NASA Astrophysics Data System (ADS)
Ballarini, F.; Biaggi, M.; De Biaggi, L.; Ferrari, A.; Ottolenghi, A.; Panzarasa, A.; Paretzke, H. G.; Pelliccioni, M.; Sala, P.; Scannicchio, D.; Zankl, M.
2004-01-01
Distributions of absorbed dose and DNA clustered damage yields in various organs and tissues following the October 1989 solar particle event (SPE) were calculated by coupling the FLUKA Monte Carlo transport code with two anthropomorphic phantoms (a mathematical model and a voxel model), with the main aim of quantifying the role of the shielding features in modulating organ doses. The phantoms, which were assumed to be in deep space, were inserted into a shielding box of variable thickness and material and were irradiated with the proton spectra of the October 1989 event. Average numbers of DNA lesions per cell in different organs were calculated by adopting a technique already tested in previous works, consisting of integrating into "condensed-history" Monte Carlo transport codes - such as FLUKA - yields of radiobiological damage, either calculated with "event-by-event" track structure simulations, or taken from experimental works available in the literature. More specifically, the yields of "Complex Lesions" (or "CL", defined and calculated as a clustered DNA damage in a previous work) per unit dose and DNA mass (CL Gy-1 Da-1) due to the various beam components, including those derived from nuclear interactions with the shielding and the human body, were integrated in FLUKA. This provided spatial distributions of CL/cell yields in different organs, as well as distributions of absorbed doses. The contributions of primary protons and secondary hadrons were calculated separately, and the simulations were repeated for values of Al shielding thickness ranging between 1 and 20 g/cm2. Slight differences were found between the two phantom types. Skin and eye lenses were found to receive larger doses with respect to internal organs; however, shielding was more effective for skin and lenses. Secondary particles arising from nuclear interactions were found to have a minor role, although their relative contribution was found to be larger for the Complex Lesions than for the absorbed dose, due to their higher LET and thus higher biological effectiveness.
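Since the yields are expressed per unit dose and per unit DNA mass, the expected number of Complex Lesions per cell follows from a simple dimensional relation (shown for orientation; the nominal DNA mass per cell is an assumed figure, not a value given in the abstract):

\[
N_{\mathrm{CL/cell}} \;=\; Y_{\mathrm{CL}}\,[\mathrm{CL\;Gy^{-1}\,Da^{-1}}]\;\times\; D\,[\mathrm{Gy}]\;\times\; M_{\mathrm{DNA}}\,[\mathrm{Da}],
\]

with M_DNA of the order of a few times 10^12 Da for a diploid human cell, so that the CL/cell maps track the product of the local absorbed dose and the LET-dependent lesion yield.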
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moskvin, V; Tsiamas, P; Axente, M
2015-06-15
Purpose: One of the more critical initiating events for reproductive cell death is the creation of a DNA double strand break (DSB). In this study, we present a computationally efficient way to determine spatial variations in the relative biological effectiveness (RBE) of proton therapy beams within the FLUKA Monte Carlo (MC) code. Methods: We used the independently tested Monte Carlo Damage Simulation (MCDS) developed by Stewart and colleagues (Radiat. Res. 176, 587–602, 2011) to estimate the RBE for DSB induction of monoenergetic protons, tritium, deuterium, helium-3, helium-4 ions and delta-electrons. The dose-weighted RBE coefficients were incorporated into FLUKA to determine the equivalent 60Co γ-ray dose for representative proton beams incident on cells in an aerobic and anoxic environment. Results: We found that the proton beam RBE for DSB induction at the tip of the Bragg peak, including primary and secondary particles, is close to 1.2. Furthermore, the RBE increases laterally to the beam axis in the area of the Bragg peak. At the distal edge, the RBE is in the range 1.3–1.4 for cells irradiated under aerobic conditions and may be as large as 1.5–1.8 for cells irradiated under anoxic conditions. Across the plateau region, the recorded RBE for DSB induction is 1.02 for aerobic cells and 1.05 for cells irradiated under anoxic conditions. The contribution to the total effective dose from secondary heavy ions decreases with depth and is higher at shallow depths (e.g., at the surface of the skin). Conclusion: Multiscale simulation of the RBE for DSB induction provides useful insights into spatial variations in proton RBE within pristine Bragg peaks. This methodology is potentially useful for the biological optimization of proton therapy for the treatment of cancer. The study highlights the need to incorporate spatial variations in proton RBE into proton therapy treatment plans.
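A minimal sketch of the kind of per-species weighting this implies (illustrative only: the particle list and RBE values are placeholders, not the MCDS coefficients, and this is not the authors' FLUKA routine):

```python
# Placeholder RBE-for-DSB-induction values per particle species (NOT MCDS output)
RBE_DSB = {"proton": 1.1, "deuteron": 1.2, "triton": 1.25,
           "he3": 1.6, "he4": 1.7, "electron": 1.0}

def rbe_weighted_dose(dose_by_species):
    """Fold per-species absorbed dose contributions (Gy) scored by the transport
    code with species-dependent RBE coefficients to obtain an equivalent
    reference-radiation (e.g. 60Co) dose for the voxel."""
    return sum(RBE_DSB[species] * dose for species, dose in dose_by_species.items())

# e.g. a voxel near the Bragg peak (toy numbers, Gy)
voxel = {"proton": 1.8, "he4": 0.02, "electron": 0.05}
print(rbe_weighted_dose(voxel))
```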
Update On the Status of the FLUKA Monte Carlo Transport Code*
NASA Technical Reports Server (NTRS)
Ferrari, A.; Lorenzo-Sentis, M.; Roesler, S.; Smirnov, G.; Sommerer, F.; Theis, C.; Vlachoudis, V.; Carboni, M.; Mostacci, A.; Pelliccioni, M.
2006-01-01
The FLUKA Monte Carlo transport code is a well-known simulation tool in High Energy Physics. FLUKA is a dynamic tool in the sense that it is being continually updated and improved by the authors. We review the progress achieved since the last CHEP Conference on the physics models, some technical improvements to the code and some recent applications. From the point of view of the physics, improvements have been made with the extension of PEANUT to higher energies for p, n, pi, pbar/nbar and for nbars down to the lowest energies, the addition of the online capability to evolve radioactive products and obtain subsequent dose rates, and the upgrading of the treatment of EM interactions with the elimination of the need to separately prepare preprocessed files. A new coherent photon scattering model, an updated treatment of the photo-electric effect, an improved pair production model, and new photon cross sections from the LLNL Cullen database have been implemented. In the field of nucleus-nucleus interactions, the electromagnetic dissociation of heavy ions has been added, along with the extension of the interaction models for some nuclide pairs to energies below 100 MeV/A using the BME approach, as well as the development of an improved QMD model for intermediate energies. Both DPMJET 2.53 and 3 remain available along with rQMD 2.4 for heavy ion interactions above 100 MeV/A. Technical improvements include the ability to use parentheses in setting up the combinatorial geometry, the introduction of pre-processor directives in the input stream, a new random number generator with full 64 bit randomness, and new routines for mathematical special functions (adapted from SLATEC). Finally, work is progressing on the deployment of a user-friendly GUI input interface as well as a CAD-like geometry creation and visualization tool. On the application front, FLUKA has been used to extensively evaluate the potential space radiation effects on astronauts for future deep space missions, the activation dose for beam target areas, and dose calculations for radiation therapy, as well as being adapted for use in the simulation of events in the ALICE detector at the LHC.
NASA Astrophysics Data System (ADS)
Lerendegui-Marco, J.; Cortés-Giraldo, M. A.; Guerrero, C.; Quesada, J. M.; Meo, S. Lo; Massimi, C.; Barbagallo, M.; Colonna, N.; Mancussi, D.; Mingrone, F.; Sabaté-Gilarte, M.; Vannini, G.; Vlachoudis, V.; Aberle, O.; Andrzejewski, J.; Audouin, L.; Bacak, M.; Balibrea, J.; Bečvář, F.; Berthoumieux, E.; Billowes, J.; Bosnar, D.; Brown, A.; Caamaño, M.; Calviño, F.; Calviani, M.; Cano-Ott, D.; Cardella, R.; Casanovas, A.; Cerutti, F.; Chen, Y. H.; Chiaveri, E.; Cortés, G.; Cosentino, L.; Damone, L. A.; Diakaki, M.; Domingo-Pardo, C.; Dressler, R.; Dupont, E.; Durán, I.; Fernández-Domínguez, B.; Ferrari, A.; Ferreira, P.; Finocchiaro, P.; Göbel, K.; Gómez-Hornillos, M. B.; García, A. R.; Gawlik, A.; Gilardoni, S.; Glodariu, T.; Gonçalves, I. F.; González, E.; Griesmayer, E.; Gunsing, F.; Harada, H.; Heinitz, S.; Heyse, J.; Jenkins, D. G.; Jericha, E.; Käppeler, F.; Kadi, Y.; Kalamara, A.; Kavrigin, P.; Kimura, A.; Kivel, N.; Kokkoris, M.; Krtička, M.; Kurtulgil, D.; Leal-Cidoncha, E.; Lederer, C.; Leeb, H.; Lonsdale, S. J.; Macina, D.; Marganiec, J.; Martínez, T.; Masi, A.; Mastinu, P.; Mastromarco, M.; Maugeri, E. A.; Mazzone, A.; Mendoza, E.; Mengoni, A.; Milazzo, P. M.; Musumarra, A.; Negret, A.; Nolte, R.; Oprea, A.; Patronis, N.; Pavlik, A.; Perkowski, J.; Porras, I.; Praena, J.; Radeck, D.; Rauscher, T.; Reifarth, R.; Rout, P. C.; Rubbia, C.; Ryan, J. A.; Saxena, A.; Schillebeeckx, P.; Schumann, D.; Smith, A. G.; Sosnin, N. V.; Stamatopoulos, A.; Tagliente, G.; Tain, J. L.; Tarifeño-Saldivia, A.; Tassan-Got, L.; Valenta, S.; Variale, V.; Vaz, P.; Ventura, A.; Vlastou, R.; Wallner, A.; Warren, S.; Woods, P. J.; Wright, T.; Žugec, P.
2017-09-01
Monte Carlo (MC) simulations are an essential tool to determine fundamental features of a neutron beam, such as the neutron flux or the γ-ray background, that sometimes cannot be measured, or at least not in every position or energy range. Until recently, the most widely used MC codes in this field had been MCNPX and FLUKA. However, the Geant4 toolkit has also become a competitive code for the transport of neutrons after the development of the native Geant4 format for neutron data libraries, G4NDL. In this context, we present the Geant4 simulations of the neutron spallation target of the n_TOF facility at CERN, done with version 10.1.1 of the toolkit. The first goal was the validation of the intra-nuclear cascade models implemented in the code using, as a benchmark, the characteristics of the neutron beam measured at the first experimental area (EAR1), especially the neutron flux and energy distribution, and the time distribution of neutrons of equal kinetic energy, the so-called Resolution Function. The second goal was the development of a Monte Carlo tool aimed at providing useful calculations for both the analysis and planning of the upcoming measurements at the new experimental area (EAR2) of the facility.
NASA Astrophysics Data System (ADS)
Aygun, Bünyamin; Korkut, Turgay; Karabulut, Abdulhalik
2016-05-01
With the possible depletion of fossil fuels and increasing energy needs, the use of radiation tends to increase, and the security-focused debate about planned nuclear power plants still continues. The objective of this thesis is to limit the spread of radiation from nuclear reactors into the environment. To this end, new shielding materials with higher performance, able to withstand the high radiation levels present during reactor operation, were produced. Additives used in the new shielding materials include iron (Fe), rhenium (Re), nickel (Ni), chromium (Cr), boron (B), copper (Cu), tungsten (W), tantalum (Ta) and boron carbide (B4C). The experimental results indicated that these materials are good shields against gamma rays and neutrons. The powder metallurgy technique was used to produce the new shielding materials. The FLUKA and Geant4 Monte Carlo simulation codes (CERN) and WinXCom were used to determine the percentages of the components included in the high-temperature-resistant, high-level fast neutron and gamma shielding materials. Superalloys were produced, and then experimental fast neutron dose equivalent measurements and gamma radiation absorption measurements of the new shielding materials were carried out. The produced materials have the qualities to be used safely not only in reactors but also in nuclear medicine treatment rooms, for the storage of nuclear waste, in nuclear research laboratories, and against cosmic radiation in space vehicles.
Monte Carlo calculation of the radiation field at aircraft altitudes.
Roesler, S; Heinrich, W; Schraube, H
2002-01-01
Energy spectra of secondary cosmic rays are calculated for aircraft altitudes and a discrete set of solar modulation parameters and rigidity cut-off values covering all possible conditions. The calculations are based on the Monte Carlo code FLUKA and on the most recent information on the interstellar cosmic ray flux including a detailed model of solar modulation. Results are compared to a large variety of experimental data obtained on the ground and aboard aircraft and balloons, such as neutron, proton, and muon spectra and yields of charged particles. Furthermore, particle fluence is converted into ambient dose equivalent and effective dose and the dependence of these quantities on height above sea level, solar modulation, and geographical location is studied. Finally, calculated dose equivalent is compared to results of comprehensive measurements performed aboard aircraft.
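The fluence-to-dose conversion mentioned above is the usual folding of the particle spectra with energy-dependent conversion coefficients; schematically (standard definition, not specific to this paper):

\[
\dot{H}^{*}(10) \;=\; \sum_{i}\int h^{*}_{\Phi,i}(E)\,\frac{\mathrm{d}\dot{\Phi}_{i}}{\mathrm{d}E}(E)\,\mathrm{d}E,
\]

where the sum runs over the particle types i, dPhi_i/dE is the differential fluence rate of particle i and h*_{Phi,i}(E) is its fluence-to-ambient-dose-equivalent conversion coefficient; the analogous fold with effective-dose coefficients yields the effective dose rate.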
Source terms, shielding calculations and soil activation for a medical cyclotron.
Konheiser, J; Naumann, B; Ferrari, A; Brachem, C; Müller, S E
2016-12-01
Calculations of the shielding and estimates of the soil activation for a medical cyclotron are presented in this work. Based on the neutron source term from the 18O(p,n)18F reaction produced by a 28 MeV proton beam, neutron and gamma dose rates outside the building were estimated with the Monte Carlo code MCNP6 (Goorley et al 2012 Nucl. Technol. 180 298-315). The neutron source term was calculated with the MCNP6 and FLUKA (Ferrari et al 2005 INFN/TC_05/11, SLAC-R-773) codes as well as from data supplied by the manufacturer. The MCNP and FLUKA calculations yielded comparable results, while the neutron yield obtained using the manufacturer-supplied information is about a factor of 5 smaller. The difference is attributed to missing channels in the manufacturer-supplied neutron source term, which considers only the 18O(p,n)18F reaction, whereas the MCNP and FLUKA calculations include additional neutron reaction channels. The soil activation calculation was performed using the FLUKA code. The estimated dose rate in the public area, based on the MCNP6 calculations, is about 0.035 µSv h-1 and thus significantly below the reference value of 0.5 µSv h-1 (2011 Strahlenschutzverordnung, 9. Auflage vom 01.11.2011, Bundesanzeiger Verlag). After 5 years of continuous beam operation and a subsequent decay time of 30 d, the activity concentration of the soil is about 0.34 Bq g-1.
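The quoted soil activity after 5 years of irradiation and 30 days of cooling follows the standard buildup-and-decay behaviour of a radionuclide produced at a constant rate; for orientation (a textbook relation, not a formula taken from the paper):

\[
A(t_{\mathrm{irr}},t_{\mathrm{cool}}) \;=\; A_{\mathrm{sat}}\left(1-e^{-\lambda t_{\mathrm{irr}}}\right)e^{-\lambda t_{\mathrm{cool}}},
\]

where A_sat is the saturation activity reached when production and decay balance, lambda is the decay constant of the radionuclide, t_irr the irradiation time and t_cool the cooling time; the total soil activity is the sum of such terms over the radionuclides produced.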
NASA Astrophysics Data System (ADS)
Darafsheh, Arash; Taleei, Reza; Kassaee, Alireza; Finlay, Jarod C.
2017-03-01
We investigated, experimentally and by means of Monte Carlo simulations, the origin of the visible signal responsible for proton therapy dose measurement using bare plastic optical fibers. Experimentally, the fiber optic probe, embedded in tissue-mimicking plastics, was irradiated with a proton beam produced by a proton therapy cyclotron, and luminescence spectroscopy was performed with a CCD-coupled spectrograph to analyze the emission spectrum of the fiber tip. Monte Carlo simulations were performed using the FLUKA Monte Carlo code to stochastically simulate radiation transport, ionizing radiation dose deposition, and the optical emission of Čerenkov radiation. The spectroscopic study of proton-irradiated plastic fibers showed a continuous spectrum with a shape different from that of Čerenkov radiation. The Monte Carlo simulations confirmed that the amount of generated Čerenkov light does not follow the radiation absorbed dose in a medium. Our results show that the origin of the optical signal responsible for proton dose measurement using bare optical fibers is not Čerenkov radiation. Our results point toward a connection between the scintillation of the plastic material of the fiber and the origin of the signal responsible for dose measurement.
Comparison of Transport Codes, HZETRN, HETC and FLUKA, Using 1977 GCR Solar Minimum Spectra
NASA Technical Reports Server (NTRS)
Heinbockel, John H.; Slaba, Tony C.; Tripathi, Ram K.; Blattnig, Steve R.; Norbury, John W.; Badavi, Francis F.; Townsend, Lawrence W.; Handler, Thomas; Gabriel, Tony A.; Pinsky, Lawrence S.;
2009-01-01
The HZETRN deterministic radiation transport code is one of several tools developed to analyze the effects of harmful galactic cosmic rays (GCR) and solar particle events (SPE) on mission planning, astronaut shielding and instrumentation. This paper is a comparison study involving the two Monte Carlo transport codes, HETC-HEDS and FLUKA, and the deterministic transport code HZETRN. Each code is used to transport ions of the 1977 solar minimum GCR spectrum impinging upon a 20 g/cm2 aluminum slab followed by a 30 g/cm2 water slab. This research is part of a systematic effort of verification and validation to quantify the accuracy of HZETRN and determine areas where it can be improved. Comparisons of dose and dose equivalent values at various depths in the water slab are presented in this report, followed by comparisons of the proton fluxes and the forward, backward and total neutron fluxes at various depths in the water slab. The secondary light ion 2H, 3H, 3He and 4He fluxes are also compared.
Induced Radioactivity in Lead Shielding at the National Synchrotron Light Source.
Ghosh, Vinita J; Schaefer, Charles; Kahnhauser, Henry
2017-06-01
The National Synchrotron Light Source (NSLS) at Brookhaven National Laboratory was shut down in September 2014. Lead bricks used as radiological shadow shielding within the accelerator were exposed to stray radiation fields during normal operations. The FLUKA code, a fully integrated Monte Carlo simulation package for the interaction and transport of particles and nuclei in matter, was used to estimate induced radioactivity in this shielding and stainless steel beam pipe from known beam losses. The FLUKA output was processed using MICROSHIELD® to estimate on-contact exposure rates with individually exposed bricks to help design and optimize the radiological survey process. This entire process can be modeled using FLUKA, but use of MICROSHIELD® as a secondary method was chosen because of the project's resource constraints. Due to the compressed schedule and lack of shielding configuration data, simple FLUKA models were developed. FLUKA activity estimates for stainless steel were compared with sampling data to validate results, which show that simple FLUKA models and irradiation geometries can be used to predict radioactivity inventories accurately in exposed materials. During decommissioning 0.1% of the lead bricks were found to have measurable levels of induced radioactivity. Post-processing with MICROSHIELD® provides an acceptable secondary method of estimating residual exposure rates.
Gas bremsstrahlung shielding calculation for first optic enclosure of ILSF medical beamline
NASA Astrophysics Data System (ADS)
Beigzadeh Jalali, H.; Salimi, E.; Rahighi, J.
2016-10-01
Gas bremsstrahlung generated in a high-energy electron storage ring accompanies the synchrotron radiation into the beamlines and strikes the various components of the beamline. In this paper, a radiation shielding calculation for secondary gas bremsstrahlung is performed for the first optics enclosure (FOE) of the medical beamline of the Iranian Light Source Facility (ILSF). The dose equivalent rate (DER) calculation is carried out using the FLUKA Monte Carlo code. A comprehensive study of the DER distribution at the back wall, sides and roof is given.
The radiation fields around a proton therapy facility: A comparison of Monte Carlo simulations
NASA Astrophysics Data System (ADS)
Ottaviano, G.; Picardi, L.; Pillon, M.; Ronsivalle, C.; Sandri, S.
2014-02-01
A proton therapy test facility with a beam current lower than 10 nA on average, and an energy up to 150 MeV, is planned to be sited at the Frascati ENEA Research Center, in Italy. The accelerator is composed of a sequence of linear sections. The first one is a commercial 7 MeV proton linac, from which the beam is injected into a SCDTL (Side Coupled Drift Tube Linac) structure reaching the energy of 52 MeV. Then a conventional CCL (Coupled Cavity Linac) with side coupling cavities completes the accelerator. The linear structure has the important advantage that the main radiation losses during the acceleration process occur for protons with energy below 20 MeV, with a consequently low production of neutrons and secondary radiation. From the radiation protection point of view, the source of radiation for this facility is then almost completely located at the final target. Physical and geometrical models of the device have been developed and implemented into radiation transport computer codes based on the Monte Carlo method. The scope is the assessment of the radiation field around the main source in support of the safety analysis. For the assessment, independent researchers used two different Monte Carlo computer codes, FLUKA (FLUktuierende KAskade) and MCNPX (Monte Carlo N-Particle eXtended). Both are general purpose tools for calculations of particle transport and interactions with matter, covering an extended range of applications including proton beam analysis. Nevertheless, each one uses its own nuclear cross section libraries and specific physics models for particle types and energies. The models implemented into the codes are described and the results are presented. The differences between the two calculations are reported and discussed, pointing out the disadvantages and advantages of each code in this specific application.
Prompt radiation, shielding and induced radioactivity in a high-power 160 MeV proton linac
NASA Astrophysics Data System (ADS)
Magistris, Matteo; Silari, Marco
2006-06-01
CERN is designing a 160 MeV proton linear accelerator, both for a future intensity upgrade of the LHC and as a possible first stage of a 2.2 GeV superconducting proton linac. A first estimate of the required shielding was obtained by means of a simple analytical model. The source terms and the attenuation lengths used in the present study were calculated with the Monte Carlo cascade code FLUKA. Detailed FLUKA simulations were performed to investigate the contribution of neutron skyshine and backscattering to the expected dose rate in the areas around the linac tunnel. An estimate of the induced radioactivity in the magnets, vacuum chamber, the cooling system and the concrete shield was performed. A preliminary thermal study of the beam dump is also discussed.
Coupled Neutron Transport for HZETRN
NASA Technical Reports Server (NTRS)
Slaba, Tony C.; Blattnig, Steve R.
2009-01-01
Exposure estimates inside space vehicles, surface habitats, and high altitude aircraft exposed to space radiation are highly influenced by secondary neutron production. The deterministic transport code HZETRN has been identified as a reliable and efficient tool for such studies, but improvements to the underlying transport models and numerical methods are still necessary. In this paper, the forward-backward (FB) and directionally coupled forward-backward (DC) neutron transport models are derived, numerical methods for the FB model are reviewed, and a computationally efficient numerical solution is presented for the DC model. Both models are compared to the Monte Carlo codes HETC-HEDS, FLUKA, and MCNPX, and the DC model is shown to agree closely with the Monte Carlo results. Finally, it is found in the development of either model that the decoupling of low energy neutrons from the light particle transport procedure adversely affects low energy light ion fluence spectra and exposure quantities. A first order correction is presented to resolve the problem, and it is shown to be both accurate and efficient.
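For background, deterministic codes of the HZETRN family solve the straight-ahead Boltzmann transport equation; the form below is a commonly quoted one from the general HZETRN literature, included here only as context (it is not reproduced from the abstract above), with the FB and DC models differing in how the angular dependence of the neutron source term on the right-hand side is treated:

\[
\left[\frac{\partial}{\partial x}-\frac{1}{A_j}\frac{\partial}{\partial E}\,\tilde{S}_j(E)+\sigma_j(E)\right]\phi_j(x,E)=\sum_k\int_E^{\infty}\sigma_{jk}(E,E')\,\phi_k(x,E')\,dE'
\]

where \(\phi_j(x,E)\) is the fluence of particle type \(j\) at depth \(x\) and energy \(E\), \(\tilde{S}_j\) the stopping power, \(\sigma_j\) the total macroscopic cross section, and \(\sigma_{jk}\) the production cross section of type \(j\) from type \(k\).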
NASA Astrophysics Data System (ADS)
Sunil, C.; Tyagi, Mohit; Biju, K.; Shanbhag, A. A.; Bandyopadhyay, T.
2015-12-01
The scarcity and the high cost of 3He have spurred the use of various detectors for neutron monitoring. A new lithium yttrium borate scintillator developed at BARC has been studied for its use in a neutron rem counter. The scintillator is made of natural lithium and boron, and the yield of reaction products that will generate a signal in a real time detector has been studied with the FLUKA Monte Carlo radiation transport code. A 2 cm thick lead layer introduced to enhance the gamma rejection shows no appreciable change in the shape of the fluence response or in the yield of reaction products. The fluence response, when normalized at the average energy of an Am-Be neutron source, shows promise for use as a rem counter.
NASA Technical Reports Server (NTRS)
Wilson, Thomas L.; Pinsky, Lawrence; Andersen, Victor; Empl, Anton; Lee, Kerry; Smirmov, Georgi; Zapp, Neal; Ferrari, Alfredo; Tsoulou, Katerina; Roesler, Stefan;
2005-01-01
Simulating the space radiation environment with Monte Carlo codes, such as FLUKA, requires the ability to model the interactions of heavy ions as they penetrate spacecraft and crew members' bodies. Monte-Carlo-type transport codes use total interaction cross sections to determine probabilistically when a particular type of interaction has occurred. Then, at that point, a distinct event generator is employed to determine separately the results of that interaction. The space radiation environment contains a full spectrum of radiation types, including relativistic nuclei, which are the most important component for the evaluation of crew doses. Interactions between incident protons and target nuclei in the spacecraft materials and crew members' bodies are well understood. However, the situation is substantially less well understood for incident heavier nuclei (heavy ions). We have been engaged in developing several related heavy ion interaction models based on a Quantum Molecular Dynamics-type approach for energies up through about 5 GeV per nucleon (GeV/A) as part of a NASA Consortium that includes a parallel program of cross section measurements to guide and verify this code development.
Comparison of Radiation Transport Codes, HZETRN, HETC and FLUKA, Using the 1956 Webber SPE Spectrum
NASA Technical Reports Server (NTRS)
Heinbockel, John H.; Slaba, Tony C.; Blattnig, Steve R.; Tripathi, Ram K.; Townsend, Lawrence W.; Handler, Thomas; Gabriel, Tony A.; Pinsky, Lawrence S.; Reddell, Brandon; Clowdsley, Martha S.;
2009-01-01
Protection of astronauts and instrumentation from galactic cosmic rays (GCR) and solar particle events (SPE) in the harsh environment of space is of prime importance in the design of personal shielding, spacecraft, and mission planning. Early entry of radiation constraints into the design process enables optimal shielding strategies, but demands efficient and accurate tools that can be used by design engineers in every phase of an evolving space project. The radiation transport code HZETRN is an efficient tool for analyzing the shielding effectiveness of materials exposed to space radiation. In this paper, HZETRN is compared to the Monte Carlo codes HETC-HEDS and FLUKA for a shield/target configuration comprised of a 20 g/cm^2 aluminum slab in front of a 30 g/cm^2 slab of water exposed to the February 1956 SPE, as modeled by the Webber spectrum. Neutron and proton fluence spectra, as well as dose and dose equivalent values, are compared at various depths in the water target. This study shows that there are many regions where HZETRN agrees with both HETC-HEDS and FLUKA for this shield/target configuration and the SPE environment. However, there are also regions where there are appreciable differences between the three computer codes.
Study on radiation production in the charge stripping section of the RISP linear accelerator
NASA Astrophysics Data System (ADS)
Oh, Joo-Hee; Oranj, Leila Mokhtari; Lee, Hee-Seock; Ko, Seung-Kook
2015-02-01
The linear accelerator of the Rare Isotope Science Project (RISP) accelerates 200 MeV/nucleon 238U ions in multiple charge states. Many kinds of radiation are generated while the primary beam is transported along the beam line. The stripping process using a thin carbon foil leads to a complicated radiation environment at the 90-degree bending section. The charge distribution of 238U ions after the carbon charge stripper was calculated by using the LISE++ program. The estimates of the radiation environment were carried out by using the well-proven Monte Carlo codes PHITS and FLUKA. The tracks of 238U ions in various charge states were identified using the magnetic field subroutine of the PHITS code. The dose distribution caused by U beam losses along those tracks was obtained over the accelerator tunnel. A modified calculation was applied for tracking the multi-charged U beams because PHITS and FLUKA are fundamentally designed to transport fully-ionized ion beams. In this study, the beam loss pattern after the stripping section was observed, and the radiation production by heavy ions was studied. Finally, the performance of the PHITS and FLUKA codes for estimating the radiation production at the stripping section was validated by applying the modified method.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taleei, Reza; Guan, Fada; Peeler, Chris
Purpose: 3He ions may hold great potential for clinical therapy because of both their physical and biological properties. In this study, the authors investigated the physical properties, i.e., the depth-dose curves from primary and secondary particles, and the energy distributions of helium (3He) ions. A relative biological effectiveness (RBE) model was applied to assess the biological effectiveness on survival of multiple cell lines. Methods: In light of the lack of experimental measurements and cross sections, the authors used Monte Carlo methods to study the energy deposition of 3He ions. The transport of 3He ions in water was simulated by using three Monte Carlo codes (FLUKA, GEANT4, and MCNPX) for incident beams with Gaussian energy distributions with average energies of 527 and 699 MeV and a full width at half maximum of 3.3 MeV in both cases. The RBE of each was evaluated by using the repair-misrepair-fixation model. In all of the simulations with each of the three Monte Carlo codes, the same geometry and primary beam parameters were used. Results: Energy deposition as a function of depth and energy spectra with high resolution were calculated on the central axis of the beam. Secondary proton dose from the primary 3He beams was predicted quite differently by the three Monte Carlo systems. The predictions differed by as much as a factor of 2. Microdosimetric parameters such as dose mean lineal energy (y_D), frequency mean lineal energy (y_F), and frequency mean specific energy (z_F) were used to characterize the radiation beam quality at four depths of the Bragg curve. Calculated RBE values were close to 1 at the entrance and reached on average 1.8 and 1.6 for prostate and head and neck cancer cell lines at the Bragg peak for both energies, but showed some variations between the different Monte Carlo codes. Conclusions: Although the Monte Carlo codes provided different results in energy deposition and especially in secondary particle production (most of the differences between the three codes were observed close to the Bragg peak, where the energy spectrum broadens), the results in terms of RBE were generally similar.
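For context, the microdosimetric parameters named above follow the standard ICRU definitions (stated here as general background, not taken from the abstract): with \(f(y)\) the single-event lineal energy distribution,

\[
y_F=\int_0^{\infty} y\,f(y)\,dy,\qquad
y_D=\frac{\int_0^{\infty} y^2 f(y)\,dy}{\int_0^{\infty} y\,f(y)\,dy},\qquad
z_F=\frac{\bar{\ell}}{m}\,y_F,
\]

where \(\bar{\ell}\) and \(m\) are the mean chord length and mass of the sensitive site; for a spherical site of diameter \(d\) and density \(\rho\) this reduces to \(z_F = 4\,y_F/(\pi\rho d^{2})\).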
NASA Technical Reports Server (NTRS)
Atwell, William; Koontz, Steve; Reddell, Brandon; Rojdev, Kristina; Franklin, Jennifer
2010-01-01
Both crew and radiation-sensitive systems, especially electronics, must be protected from the effects of the space radiation environment. One method of mitigating this radiation exposure is to use passive shielding materials. In previous vehicle designs such as the International Space Station (ISS), materials such as aluminum and polyethylene have been used as parasitic shielding to protect crew and electronics from exposure, but these designs add mass and decrease the amount of usable volume inside the vehicle. Thus, it is of interest to understand whether structural materials can also be designed to provide the radiation shielding capability needed for crew and electronics, while still providing weight savings and increased usable volume when compared against previous vehicle shielding designs. In this paper, we present calculations and analysis using the HZETRN (deterministic) and FLUKA (Monte Carlo) codes to investigate the radiation mitigation properties of these structural shielding materials, which include graded-Z and composite materials. This work is also a follow-on to an earlier paper that compared computational results for three radiation transport codes, HZETRN, HETC, and FLUKA, using the February 1956 solar particle event (SPE) spectrum. In the following analysis, we consider the October 1989 Ground Level Enhanced (GLE) SPE as the input source term, based on the Band function fitting method. Using HZETRN and FLUKA, parametric absorbed doses at the center of a hemispherical structure on the lunar surface are calculated for various thicknesses of graded-Z lay-ups and an all-aluminum structure. The HZETRN and FLUKA calculations are compared and are in reasonable (18% to 27%) agreement. Both codes are in agreement with respect to the predicted shielding material performance trends. The results from both HZETRN and FLUKA are analyzed, and the radiation protection properties and potential weight savings of various materials and material lay-ups are compared.
The estimation of background production by cosmic rays in high-energy gamma ray telescopes
NASA Technical Reports Server (NTRS)
Edwards, H. L.; Nolan, P. L.; Lin, Y. C.; Koch, D. G.; Bertsch, D. L.; Fichtel, C. E.; Hartman, R. C.; Hunter, S. D.; Kniffen, D. A.; Hughes, E. B.
1991-01-01
A calculational method for estimating instrumental background in high-energy gamma-ray telescopes, using the hadronic Monte Carlo code FLUKA87, is presented. The method is applied to the SAS-2 and EGRET telescope designs and is also used to explore the level of background to be expected for alternative configurations of the proposed GRITS telescope, which adapts the external fuel tank of a Space Shuttle as a gamma-ray telescope with a very large collecting area. The background produced in proton-beam tests of EGRET is much less than the predicted level. This discrepancy appears to be due to the inability of FLUKA87 to transport evaporation nucleons. It is predicted that the background in EGRET will be no more than 4-10 percent of the extragalactic diffuse gamma radiation.
Use of borated polyethylene to improve low energy response of a prompt gamma based neutron dosimeter
NASA Astrophysics Data System (ADS)
Priyada, P.; Ashwini, U.; Sarkar, P. K.
2016-05-01
The feasibility of using a combined sample of borated polyethylene and normal polyethylene to estimate the neutron ambient dose equivalent from measured prompt gamma emissions is investigated theoretically, to demonstrate improvements in the low energy neutron dose response compared to polyethylene alone. Monte Carlo simulations have been carried out using the FLUKA code to calculate the response of the boron, hydrogen and carbon prompt gamma emissions to monoenergetic neutrons. The weighted least squares method is employed to arrive at the best linear combination of these responses that approximates the ICRP fluence-to-dose conversion coefficients well in the energy range of 10^-8 MeV to 14 MeV. The configuration of the combined system is optimized through FLUKA simulations. The proposed method is validated theoretically with five different workplace neutron spectra, with satisfactory outcome.
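As an illustration of the response-combination step described above, a weighted least squares fit of simulated prompt gamma responses to the ICRP conversion coefficients might look like the following sketch (all arrays are hypothetical placeholders, not data from the paper):

```python
import numpy as np

# Hypothetical inputs: R is an (n_energies x 3) matrix of simulated prompt
# gamma responses per unit neutron fluence (boron, hydrogen, carbon lines),
# h holds the ICRP fluence-to-dose conversion coefficients at the same
# energies, and w is a per-energy least-squares weight. All placeholders.
rng = np.random.default_rng(0)
R = rng.random((50, 3))
h = rng.random(50)
w = np.ones(50)

# Weighted least squares: find c minimizing sum_i w_i * (R_i . c - h_i)^2
sw = np.sqrt(w)
c, *_ = np.linalg.lstsq(R * sw[:, None], h * sw, rcond=None)

dose_like_response = R @ c   # combined response approximating h
print("best linear combination of line responses:", c)
```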
Experimental approach to measure thick target neutron yields induced by heavy ions for shielding
NASA Astrophysics Data System (ADS)
Trinh, N. D.; Fadil, M.; Lewitowicz, M.; Brouillard, C.; Clerc, T.; Damoy, S.; Desmezières, V.; Dessay, E.; Dupuis, M.; Grinyer, G. F.; Grinyer, J.; Jacquot, B.; Ledoux, X.; Madeline, A.; Menard, N.; Michel, M.; Morel, V.; Porée, F.; Rannou, B.; Savalle, A.
2017-09-01
Double differential (angular and energy) neutron distributions were measured using an activation foil technique. Reactions were induced by impinging two low-energy heavy-ion beams accelerated with the GANIL CSS1 cyclotron, 36S (12 MeV/u) and 208Pb (6.25 MeV/u), onto thick natCu targets. Results have been compared to Monte Carlo calculations from two codes (PHITS and FLUKA) for the purpose of benchmarking radiation protection and shielding requirements. This comparison suggests a disagreement between calculations and experiment, particularly for high-energy neutrons.
Benchmark of neutron production cross sections with Monte Carlo codes
NASA Astrophysics Data System (ADS)
Tsai, Pi-En; Lai, Bo-Lun; Heilbronn, Lawrence H.; Sheu, Rong-Jiun
2018-02-01
Aiming to provide critical information in the fields of heavy ion therapy, radiation shielding in space, and facility design for heavy-ion research accelerators, the physics models in three Monte Carlo simulation codes (PHITS, FLUKA, and MCNP6) were systematically benchmarked against fifteen sets of experimental data for neutron production cross sections, which include various combinations of 12C, 20Ne, 40Ar, 84Kr and 132Xe projectiles and natLi, natC, natAl, natCu, and natPb target nuclides at incident energies between 135 MeV/nucleon and 600 MeV/nucleon. For neutron energies above 60% of the specific projectile energy per nucleon, LAQGSM03.03 in MCNP6, JQMD/JQMD-2.0 in PHITS, and RQMD-2.4 in FLUKA all show better agreement with data in heavy-projectile systems than in light-projectile systems, suggesting that the collective properties of projectile nuclei and nucleon interactions in the nucleus should be considered for light projectiles. For intermediate-energy neutrons, whose energies are below 60% of the projectile energy per nucleon and above 20 MeV, FLUKA is likely to overestimate the secondary neutron production, while MCNP6 tends towards underestimation. PHITS with JQMD shows a mild tendency for underestimation, but the JQMD-2.0 model, with a modified physics description for central collisions, generally improves the agreement between data and calculations. For low-energy neutrons (below 20 MeV), which are dominated by the evaporation mechanism, PHITS (which uses GEM linked with JQMD and JQMD-2.0) and FLUKA both tend to overestimate the production cross section, whereas MCNP6 underestimates more systems than it overestimates. For total neutron production cross sections, the trends of the benchmark results over the entire energy range are similar to the trends seen in the dominant energy region. Also, the comparison of GEM coupled with either JQMD or JQMD-2.0 in the PHITS code indicates that the model used to describe the first stage of a nucleus-nucleus collision also affects the low-energy neutron production. Thus, in this case, a proper combination of two physics models is needed to reproduce the measured results. In addition, code users should be aware that certain models consistently produce secondary neutrons within a constant fraction of another model in certain energy regions, which might be correlated to different physics treatments in the different models.
NASA Astrophysics Data System (ADS)
Iliopoulou, E.; Bamidis, P.; Brugger, M.; Froeschl, R.; Infantino, A.; Kajimoto, T.; Nakao, N.; Roesler, S.; Sanami, T.; Siountas, A.; Yashima, H.
2018-06-01
The CERN High energy AcceleRator Mixed field (CHARM) facility is situated in the CERN Proton Synchrotron (PS) East Experimental Area. The facility receives a pulsed proton beam from the CERN PS with a beam momentum of 24 GeV/c, 5·10^11 protons per pulse, a pulse length of 350 ms and a maximum average beam intensity of 6.7·10^10 protons per second. The extracted proton beam impacts on a cylindrical copper target. The shielding of the CHARM facility includes the CERN Shielding Benchmark Facility (CSBF), situated laterally above the target, that allows deep shielding penetration benchmark studies of various shielding materials. This facility was significantly upgraded during the extended technical stop at the beginning of 2016. It now consists of 40 cm of cast iron shielding, a 200 cm long removable sample holder concrete block with 3 inserts for activation samples, and a material test location that is used for the measurement of the attenuation length of different shielding materials as well as for sample activation at different thicknesses of the shielding materials. Activation samples of bismuth, aluminium and indium were placed in the CSBF in September 2016 to characterize the upgraded version of the CSBF. Monte Carlo simulations with the FLUKA code have been performed to estimate the specific production yields of bismuth isotopes (206Bi, 205Bi, 204Bi, 203Bi, 202Bi, 201Bi) from 209Bi, of 24Na from 27Al and of 115mIn from 115In for these samples. The production yields estimated by the FLUKA Monte Carlo simulations are compared to the production yields obtained from γ-spectroscopy measurements of the samples, taking the beam intensity profile into account. The agreement between FLUKA predictions and γ-spectroscopy measurements for the production yields is at the level of a factor of 2.
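The beam intensity profile correction mentioned above is, in the usual activation formalism (written here as general background rather than as the authors' exact expression), applied through

\[
Y=\frac{A_{\mathrm{meas}}\,e^{\lambda t_c}}{\lambda \int_0^{T} I(t)\,e^{-\lambda (T-t)}\,dt},
\]

where \(A_{\mathrm{meas}}\) is the activity measured after a cooling time \(t_c\), \(I(t)\) the recorded proton intensity during an irradiation of length \(T\), \(\lambda\) the decay constant of the isotope of interest, and \(Y\) the resulting production yield per incident proton.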
Neutron Transport Models and Methods for HZETRN and Coupling to Low Energy Light Ion Transport
NASA Technical Reports Server (NTRS)
Blattnig, S.R.; Slaba, T.C.; Heinbockel, J.H.
2008-01-01
Exposure estimates inside space vehicles, surface habitats, and high altitude aircraft exposed to space radiation are highly influenced by secondary neutron production. The deterministic transport code HZETRN has been identified as a reliable and efficient tool for such studies, but improvements to the underlying transport models and numerical methods are still necessary. In this paper, the forward-backward (FB) and directionally coupled forward-backward (DC) neutron transport models are derived, numerical methods for the FB model are reviewed, and a computationally efficient numerical solution is presented for the DC model. Both models are compared to the Monte Carlo codes HETC-HEDS and FLUKA, and the DC model is shown to agree closely with the Monte Carlo results. Finally, it is found in the development of either model that the decoupling of low energy neutrons from the light ion (A<4) transport procedure adversely affects low energy light ion fluence spectra and exposure quantities. A first order correction is presented to resolve the problem, and it is shown to be both accurate and efficient.
NASA Astrophysics Data System (ADS)
Ermis, Elif Ebru
2017-02-01
The photon mass attenuation coefficients of LiF, BaSO4, CaCO3 and CaSO4 thermoluminescent dosimetric compounds at 100, 300, 500, 600, 800, 1,000, 1,500, 2,000, 3,000 and 5,000 keV gamma-ray energies were calculated. For this purpose, the FLUKA Monte Carlo (MC) program, one of the well-known MC codes, was used in this study. Furthermore, the obtained results were analyzed by means of the ROOT program. National Institute of Standards and Technology (NIST) values were used for comparison with the obtained theoretical values, because the mass attenuation values of these compounds could not be found in the literature. The calculated mass attenuation coefficients were in close accordance with the NIST values. As a consequence, FLUKA was successful in calculating the mass attenuation coefficients of these widely used thermoluminescent compounds.
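A hedged sketch of how a mass attenuation coefficient can be extracted from a simulated narrow-beam transmission, via the Beer-Lambert law, is shown below; the transmitted fraction and slab thickness are placeholders, not values from the study, and in practice they would come from scoring the unscattered photons behind the slab:

```python
import numpy as np

# Narrow-beam attenuation: I = I0 * exp(-(mu/rho) * rho * t)
rho = 2.64         # g/cm^3, nominal LiF density (assumed)
t = 1.0            # cm, slab thickness (assumed)
I0, I = 1.0, 0.78  # incident and transmitted primary fluence (placeholders)

mu_over_rho = -np.log(I / I0) / (rho * t)   # cm^2/g
print(f"mass attenuation coefficient: {mu_over_rho:.4f} cm^2/g")
```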
Radon, T; Gutermuth, F; Fehrenbacher, G
2005-01-01
The Gesellschaft für Schwerionenforschung (GSI) is planning a significant expansion of its accelerator facilities. Compared to the present GSI facility, a factor of 100 in primary beam intensities and up to a factor of 10,000 in secondary radioactive beam intensities are key technical goals of the proposal. The second branch of the so-called Facility for Antiproton and Ion Research (FAIR) is the production of antiprotons and their storage in rings and traps. The facility will provide beam energies a factor of approximately 15 higher than presently available at the GSI for all ions, from protons to uranium. The shielding design of the synchrotron SIS 100/300 is shown exemplarily by using Monte Carlo calculations with the FLUKA code. The experimental area serving the investigation of compressed baryonic matter is analysed in the same way. In addition, a dose comparison is made for an experimental area operated with medium energy heavy-ion beams. Here, Monte Carlo calculations are performed by using either heavy-ion primary particles or proton beams with intensities scaled by the mass number of the corresponding heavy-ion beam.
Interactive three-dimensional visualization and creation of geometries for Monte Carlo calculations
NASA Astrophysics Data System (ADS)
Theis, C.; Buchegger, K. H.; Brugger, M.; Forkel-Wirth, D.; Roesler, S.; Vincke, H.
2006-06-01
The implementation of three-dimensional geometries for the simulation of radiation transport problems is a very time-consuming task. Each particle transport code supplies its own scripting language and syntax for creating the geometries. All of them are based on the Constructive Solid Geometry scheme requiring textual description. This makes the creation a tedious and error-prone task, which is especially hard to master for novice users. The Monte Carlo code FLUKA comes with built-in support for creating two-dimensional cross-sections through the geometry and FLUKACAD, a custom-built converter to the commercial Computer Aided Design package AutoCAD, exists for 3D visualization. For other codes, like MCNPX, a couple of different tools are available, but they are often specifically tailored to the particle transport code and its approach used for implementing geometries. Complex constructive solid modeling usually requires very fast and expensive special purpose hardware, which is not widely available. In this paper SimpleGeo is presented, which is an implementation of a generic versatile interactive geometry modeler using off-the-shelf hardware. It is running on Windows, with a Linux version currently under preparation. This paper describes its functionality, which allows for rapid interactive visualization as well as generation of three-dimensional geometries, and also discusses critical issues regarding common CAD systems.
Use of Existing CAD Models for Radiation Shielding Analysis
NASA Technical Reports Server (NTRS)
Lee, K. T.; Barzilla, J. E.; Wilson, P.; Davis, A.; Zachman, J.
2015-01-01
The utility of a radiation exposure analysis depends not only on the accuracy of the underlying particle transport code, but also on the accuracy of the geometric representations of both the vehicle used as radiation shielding mass and the phantom representation of the human form. The current NASA/Space Radiation Analysis Group (SRAG) process to determine crew radiation exposure in a vehicle design incorporates both output from an analytic High Z and Energy Particle Transport (HZETRN) code and the properties (i.e., material thicknesses) of a previously processed drawing. This geometry pre-process can be time-consuming, and the results are less accurate than those determined using a Monte Carlo-based particle transport code. The current work aims to improve this process. Although several Monte Carlo programs (FLUKA, Geant4) are readily available, most use an internal geometry engine. The lack of an interface with the standard CAD formats used by the vehicle designers limits the ability of the user to communicate complex geometries. Translation of native CAD drawings into a format readable by these transport programs is time consuming and prone to error. The Direct Accelerated Geometry-United (DAGU) project is intended to provide an interface between the native vehicle or phantom CAD geometry and multiple particle transport codes to minimize problem setup, computing time and analysis error.
Comparison with simulations to experimental data for photo-neutron reactions using SPring-8 Injector
NASA Astrophysics Data System (ADS)
Asano, Yoshihiro
2017-09-01
Simulations of photo-nuclear reactions using the Monte Carlo codes PHITS and FLUKA have been performed for comparison with data measured at the SPring-8 injector with 250 MeV and 961 MeV electrons. Measured data for Bismuth-206 production due to the photo-nuclear reaction 209Bi(γ,3n)206Bi and the high energy neutron reaction 209Bi(n,4n)206Bi at the beam dumps have been compared with the simulations. Neutron leakage spectra outside the shield wall are also compared between experiments and simulations.
Preliminary Modelling of Radiation Levels at the Fermilab PIP-II Linac
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lari, L.; Cerutti, F.; Esposito, L. S.
PIP-II is Fermilab's flagship project for providing powerful, high-intensity proton beams to the laboratory's experiments. The heart of PIP-II is an 800-MeV superconducting linac accelerator. It will be located in a new tunnel with new service buildings and connected to the present Booster through a new transfer line. To support the design of civil engineering and mechanical integration, this paper provides a preliminary estimation of the radiation levels in the gallery at an operational beam loss limit of 0.1 W/m, by means of Monte Carlo calculations with the FLUKA and MARS15 codes.
Neutron Productions from thin Be target irradiated by 50 MeV/u 238U beam
NASA Astrophysics Data System (ADS)
Lee, Hee-Seock; Oh, Joo-Hee; Jung, Nam-Suk; Oranj, Leila Mokhtari; Nakao, Noriaki; Uwamino, Yoshitomo
2017-09-01
Neutrons generated from a thin beryllium target by a 50 MeV/u 238U beam were measured using activation analysis at 15, 30, 45, and 90 degrees from the beam direction. A 0.085 mm-thick Be stripper of RIBF was used as the neutron generating target. Activation detectors of bismuth, cobalt, and aluminum were placed outside the stripper chamber. The threshold reactions 209Bi(n,xn)210-xBi (x=4-8), 59Co(n,xn)60-xCo (x=2-5), 59Co(n,2nα)54Mn, 27Al(n,α)24Na, and 27Al(n,2nα)22Na were applied to measure the production rates of radionuclides. The neutron spectra were obtained using an unfolding method with the SAND-II code. All production rates and neutron spectra were compared with results calculated using the Monte Carlo codes PHITS and FLUKA. The FLUKA results showed better agreement with the measurements than the PHITS results. The discrepancies between the measurements and the calculations are discussed.
Residual activity evaluation: a benchmark between ANITA, FISPACT, FLUKA and PHITS codes
NASA Astrophysics Data System (ADS)
Firpo, Gabriele; Viberti, Carlo Maria; Ferrari, Anna; Frisoni, Manuela
2017-09-01
The activity of residual nuclides determines the radiation fields during periodic inspections/repairs (maintenance periods) and dismantling operations (decommissioning phase) of accelerator facilities (i.e., medical, industrial, research) and nuclear reactors. Therefore, a correct prediction of material activation allows for more accurate planning of these activities, in line with the ALARA (As Low As Reasonably Achievable) principles. The scope of the present work is to show the results of a comparison of the residual total specific activity at a set of cooling times (from zero up to 10 years after irradiation) as obtained by two analytical (FISPACT and ANITA) and two Monte Carlo (FLUKA and PHITS) codes, making use of their default nuclear data libraries. A set of 40 irradiation scenarios is considered, i.e. neutrons and protons of different energies, ranging from zero to many hundreds of MeV, impinging on pure elements or materials of standard composition typically used in industrial applications (namely, AISI SS316 and Portland concrete). In some cases, experimental results were also available for a more thorough benchmark.
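A minimal sketch of the single-nuclide bookkeeping behind such residual-activity tables is given below, assuming a constant production rate during irradiation and ignoring decay chains (all numerical values are placeholders):

```python
import numpy as np

def residual_activity(prod_rate, half_life, t_irr, t_cool):
    """Activity (Bq) of a single nuclide produced at a constant rate
    prod_rate (nuclei/s) during an irradiation of length t_irr (s),
    after a cooling time t_cool (s). Decay chains are ignored."""
    lam = np.log(2.0) / half_life
    return prod_rate * (1.0 - np.exp(-lam * t_irr)) * np.exp(-lam * t_cool)

YEAR = 3.156e7  # seconds per year
# Placeholder example: Co-60-like half-life, one year of irradiation,
# evaluated at a few of the cooling times used in the benchmark.
for t_cool_y in (0, 1, 5, 10):
    a = residual_activity(1.0e6, 5.27 * YEAR, 1.0 * YEAR, t_cool_y * YEAR)
    print(f"{t_cool_y:>2} y cooling: {a:.3e} Bq")
```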
DOE Office of Scientific and Technical Information (OSTI.GOV)
Englbrecht, F; Parodi, K; Trinkl, S
2016-06-15
Purpose: To simulate the secondary neutron radiation fields produced at different positions during phantom irradiation inside a scanning proton therapy gantry treatment room, and to identify their origin, energy distribution and angular emission as a function of proton beam energy. Methods: The GEANT4 and FLUKA Monte Carlo codes were used to model the relevant parts of the treatment room in a gantry-equipped pencil beam scanning proton therapy facility, including walls, floor, metallic gantry components, patient table and a homogeneous PMMA target. The proton beams were modeled based on experimental beam ranges in water and spot shapes in air. Neutron energy spectra were simulated at 0°, 45°, 90° and 135° relative to the beam axis at 2 m distance from the isocenter, for 11×11 cm^2 fields at 75 MeV, 140 MeV, 200 MeV and at 118 MeV with a 5 cm PMMA range shifter. The total neutron energy distribution was recorded for these four positions and proton energies. Additionally, the room components generating secondary neutrons and their contributions to the total spectrum were identified and quantified. Results: FLUKA and GEANT4 simulated neutron spectra showed good general agreement over the whole energy range of 10^-9 to 10^2 MeV. Comparison of measured spectra with the simulated contributions of the various room components helped to limit the complexity of the room model by identifying the dominant contributions to the secondary neutron spectrum. The iron of the bending magnet and the counterweight were identified as sources of secondary evaporation neutrons, which were lacking in simplified room models. Conclusion: Thorough Monte Carlo simulations have been performed to complement Bonner sphere spectrometry measurements of secondary neutrons in a clinical proton therapy treatment room. Such calculations helped disentangle the origin of secondary neutrons and their dominant contributions to measured spectra, besides providing a useful validation of widely used Monte Carlo packages against experimental data. Cluster of Excellence of the German Research Foundation (DFG) “Munich-Centre for Advanced Photonics (MAP)”.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fasso, A.; Ferrari, A.; Ferrari, A.
In 1974, Nelson, Kase and Svensson published an experimental investigation of muon shielding around SLAC high-energy electron accelerators [1]. They measured the muon fluence and absorbed dose induced by 14 and 18 GeV electron beams hitting a copper/water beam dump and attenuated in thick steel shielding. In their paper, they compared the results with the theoretical models available at that time. In order to compare their experimental results with present model calculations, we use the modern transport Monte Carlo codes MARS15, FLUKA2011 and GEANT4 to model the experimental setup and run simulations. The results are then compared between the codes, and with the SLAC data.
NASA Astrophysics Data System (ADS)
Slaba, Tony C.; Blattnig, Steve R.; Reddell, Brandon; Bahadori, Amir; Norman, Ryan B.; Badavi, Francis F.
2013-07-01
Recent work has indicated that pion production and the associated electromagnetic (EM) cascade may be an important contribution to the total astronaut exposure in space. Recent extensions to the deterministic space radiation transport code, HZETRN, allow the production and transport of pions, muons, electrons, positrons, and photons. In this paper, the extended code is compared to the Monte Carlo codes, Geant4, PHITS, and FLUKA, in slab geometries exposed to galactic cosmic ray (GCR) boundary conditions. While improvements in the HZETRN transport formalism for the new particles are needed, it is shown that reasonable agreement on dose is found at larger shielding thicknesses commonly found on the International Space Station (ISS). Finally, the extended code is compared to ISS data on a minute-by-minute basis over a seven day period in 2001. The impact of pion/EM production on exposure estimates and validation results is clearly shown. The Badhwar-O'Neill (BO) 2004 and 2010 models are used to generate the GCR boundary condition at each time-step allowing the impact of environmental model improvements on validation results to be quantified as well. It is found that the updated BO2010 model noticeably reduces overall exposure estimates from the BO2004 model, and the additional production mechanisms in HZETRN provide some compensation. It is shown that the overestimates provided by the BO2004 GCR model in previous validation studies led to deflated uncertainty estimates for environmental, physics, and transport models, and allowed an important physical interaction (π/EM) to be overlooked in model development. Despite the additional π/EM production mechanisms in HZETRN, a systematic under-prediction of total dose is observed in comparison to Monte Carlo results and measured data.
Al-Affan, I A M; Hugtenburg, R P; Bari, D S; Al-Saleh, W M; Piliero, M; Evans, S; Al-Hasan, M; Al-Zughul, B; Al-Kharouf, S; Ghaith, A
2015-02-01
This study explores the possibility of using lead to cover part of the radiation therapy facility maze walls in order to absorb low energy photons and reduce the total dose at the maze entrance of radiation therapy rooms. Experiments and Monte Carlo simulations were utilized to establish the possibility of using high-Z materials to cover the concrete walls of the maze in order to reduce the dose of the scattered photons at the maze entrance. The dose of the backscattered photons from a concrete wall was measured for various scattering angles. The dose was also calculated by the FLUKA and EGSnrc Monte Carlo codes. The FLUKA code was also used to simulate an existing radiotherapy room to study the effect of multiple scattering when adding lead to cover the concrete walls of the maze. Monoenergetic photons were used to represent the main components of the x ray spectrum up to 10 MV. It was observed that when the concrete wall was covered with just 2 mm of lead, the measured dose rate at all backscattering angles was reduced by 20% for photons of energy comparable to Co-60 emissions and 70% for Cs-137 emissions. The simulations with FLUKA and EGS showed that the reduction in the dose was potentially even higher when lead was added. One explanation for the reduction is the increased absorption of backscattered photons due to the photoelectric interaction in lead. The results also showed that adding 2 mm lead to the concrete walls and floor of the maze reduced the dose at the maze entrance by up to 90%. This novel proposal of covering part or the entire maze walls with a few millimeters of lead would have a direct implication for the design of radiation therapy facilities and would assist in upgrading the design of some mazes, especially those in facilities with limited space where the maze length cannot be extended to sufficiently reduce the dose. © 2015 American Association of Physicists in Medicine.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Al-Affan, I. A. M., E-mail: info@medphys-environment.co.uk; Hugtenburg, R. P.; Piliero, M.
Purpose: This study explores the possibility of using lead to cover part of the radiation therapy facility maze walls in order to absorb low energy photons and reduce the total dose at the maze entrance of radiation therapy rooms. Methods: Experiments and Monte Carlo simulations were utilized to establish the possibility of using high-Z materials to cover the concrete walls of the maze in order to reduce the dose of the scattered photons at the maze entrance. The dose of the backscattered photons from a concrete wall was measured for various scattering angles. The dose was also calculated by the FLUKA and EGSnrc Monte Carlo codes. The FLUKA code was also used to simulate an existing radiotherapy room to study the effect of multiple scattering when adding lead to cover the concrete walls of the maze. Monoenergetic photons were used to represent the main components of the x ray spectrum up to 10 MV. Results: It was observed that when the concrete wall was covered with just 2 mm of lead, the measured dose rate at all backscattering angles was reduced by 20% for photons of energy comparable to Co-60 emissions and 70% for Cs-137 emissions. The simulations with FLUKA and EGS showed that the reduction in the dose was potentially even higher when lead was added. One explanation for the reduction is the increased absorption of backscattered photons due to the photoelectric interaction in lead. The results also showed that adding 2 mm lead to the concrete walls and floor of the maze reduced the dose at the maze entrance by up to 90%. Conclusions: This novel proposal of covering part or the entire maze walls with a few millimeters of lead would have a direct implication for the design of radiation therapy facilities and would assist in upgrading the design of some mazes, especially those in facilities with limited space where the maze length cannot be extended to sufficiently reduce the dose.
NASA Astrophysics Data System (ADS)
Muraro, S.; Battistoni, G.; Belcari, N.; Bisogni, M. G.; Camarlinghi, N.; Cristoforetti, L.; Del Guerra, A.; Ferrari, A.; Fracchiolla, F.; Morrocchi, M.; Righetto, R.; Sala, P.; Schwarz, M.; Sportelli, G.; Topi, A.; Rosso, V.
2017-12-01
Ion beam irradiations can deliver conformal dose distributions minimizing damage to healthy tissues thanks to their characteristic dose profiles. Nevertheless, the location of the Bragg peak can be affected by different sources of range uncertainties: a critical issue is the treatment verification. During the treatment delivery, nuclear interactions between the ions and the irradiated tissues generate β+ emitters: the detection of this activity signal can be used to perform the treatment monitoring if an expected activity distribution is available for comparison. Monte Carlo (MC) codes are widely used in the particle therapy community to evaluate the radiation transport and interaction with matter. In this work, FLUKA MC code was used to simulate the experimental conditions of irradiations performed at the Proton Therapy Center in Trento (IT). Several mono-energetic pencil beams were delivered on phantoms mimicking human tissues. The activity signals were acquired with a PET system (DoPET) based on two planar heads, and designed to be installed along the beam line to acquire data also during the irradiation. Different acquisitions are analyzed and compared with the MC predictions, with a special focus on validating the PET detectors response for activity range verification.
MCNPX simulation of proton dose distribution in homogeneous and CT phantoms
NASA Astrophysics Data System (ADS)
Lee, C. C.; Lee, Y. J.; Tung, C. J.; Cheng, H. W.; Chao, T. C.
2014-02-01
A dose simulation system was constructed based on the MCNPX Monte Carlo package to simulate proton dose distributions in homogeneous and CT phantoms. Conversion from the Hounsfield units of a patient CT image set to the material information necessary for Monte Carlo simulation is based on Schneider's approach. In order to validate this simulation system, an inter-comparison of the depth dose distributions obtained from the MCNPX, GEANT4 and FLUKA codes for a 160 MeV monoenergetic proton beam incident normally on the surface of a homogeneous water phantom was performed. For dose validation within the CT phantom, direct comparison with measurement is infeasible. Instead, this study indirectly compared the 50% ranges (R50%) along the central axis obtained with our system to the NIST CSDA ranges for beams with 160 and 115 MeV energies. The comparison within the homogeneous phantom shows good agreement: differences in the simulated R50% among the three codes are less than 1 mm. For the CT phantom, the MCNPX simulated water equivalent Req,50% values are compatible with the CSDA water equivalent ranges from the NIST database, with differences of 0.7 and 4.1 mm for the 160 and 115 MeV beams, respectively.
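The HU-to-material step can be pictured with the following minimal sketch in the spirit of Schneider's approach; the bin edges and densities are illustrative placeholders only, not the published calibration, which uses a much finer segmentation:

```python
import numpy as np

# Illustrative Hounsfield-unit segmentation (placeholder bins, not the
# published Schneider calibration).
hu_edges = [-1000, -950, -120, 20, 300, 3000]            # HU bin boundaries
materials = ["air", "lung", "soft tissue", "soft bone", "cortical bone"]
densities = [0.0012, 0.26, 1.03, 1.18, 1.85]             # g/cm^3 (nominal)

def hu_to_material(hu):
    """Map a single Hounsfield unit value to a (material, density) pair."""
    idx = int(np.clip(np.digitize(hu, hu_edges) - 1, 0, len(materials) - 1))
    return materials[idx], densities[idx]

print(hu_to_material(-700))   # expected: ('lung', 0.26)
print(hu_to_material(150))    # expected: ('soft bone', 1.18)
```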
A Bonner Sphere Spectrometer with extended response matrix
NASA Astrophysics Data System (ADS)
Birattari, C.; Dimovasili, E.; Mitaroff, A.; Silari, M.
2010-08-01
This paper describes the design, calibration and applications at high-energy accelerators of an extended-range Bonner Sphere neutron Spectrometer (BSS). The BSS was designed with the FLUKA Monte Carlo code, by investigating several combinations of materials and diameters of the moderators for the high-energy channels. The system was calibrated at PTB in Braunschweig, Germany, using monoenergetic neutron beams in the energy range 144 keV-19 MeV. It was subsequently tested with Am-Be source neutrons and in the simulated workplace neutron field at CERF (the CERN-EU high-energy reference field facility). Since 2002, it has been employed for neutron spectral measurements around CERN accelerators.
Radiation Protection Studies for Medical Particle Accelerators using Fluka Monte Carlo Code.
Infantino, Angelo; Cicoria, Gianfranco; Lucconi, Giulia; Pancaldi, Davide; Vichi, Sara; Zagni, Federico; Mostacci, Domiziano; Marengo, Mario
2017-04-01
Radiation protection (RP) in the use of medical cyclotrons involves many aspects, both in routine use and in the decommissioning of a site. Guidelines for site planning and installation, as well as for RP assessment, are given in international documents; however, the latter typically offer analytic methods of calculation of shielding and materials activation, in approximate or idealised geometry set-ups. The availability of Monte Carlo (MC) codes with accurate up-to-date libraries for transport and interaction of neutrons and charged particles at energies below 250 MeV, together with the continuously increasing power of modern computers, makes the systematic use of simulations with realistic geometries possible, yielding equipment- and site-specific evaluation of the source terms, shielding requirements and all quantities relevant to RP at the same time. In this work, the well-known FLUKA MC code was used to simulate different aspects of RP in the use of biomedical accelerators, particularly for the production of medical radioisotopes. In the context of the Young Professionals Award, held at the IRPA 14 conference, only a part of the complete work is presented. In particular, the simulation of the GE PETtrace cyclotron (16.5 MeV) installed at the S. Orsola-Malpighi University Hospital was used to evaluate: the effective dose distribution around the equipment; the effective number of neutrons produced per incident proton and their spectral distribution; the activation of the structure of the cyclotron and the vault walls; and the activation of the ambient air, in particular the production of 41Ar. The simulations were validated, in terms of physical and transport parameters to be used at the energy range of interest, through an extensive measurement campaign of the neutron environmental dose equivalent using a rem-counter and TLD dosemeters. The validated model was then used in the design and the licensing request of a new Positron Emission Tomography facility. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
NASA Astrophysics Data System (ADS)
Sarria, D.
2016-12-01
The field of High Energy Atmospheric Physics (HEAP) includes the study of energetic events related to thunderstorms, such as Terrestrial Gamma-ray Flashes (TGF), associated electron-positron beams (TEB), gamma-ray glows and Thunderstorm Ground Enhancements (TGE). Understanding these phenomena requires accurate models for the interaction of particles with atmospheric air and electromagnetic fields in the <100 MeV energy range. This study is the next step of the work presented in [C. Rutjes et al., 2016], which compared the performance of various codes in the absence of electromagnetic fields. In the first part, we quantify simple but informative test cases of electrons in various electric field profiles. We compare the avalanche length (of the Relativistic Runaway Electron Avalanche (RREA) process), the photon/electron spectra and the spatial scattering. In particular, we test the effect of the low-energy threshold, which was found to be very important [Skeltved et al., 2014]; note that even without a field it was found to be important because of the straggling effect [C. Rutjes et al., 2016]. For this first part, we compare GEANT4 (different flavours), FLUKA and the custom-made code GRRR. In the second part, we test the propagation of these high energy particles in the atmosphere, from production altitude (around 10 km to 18 km) to satellite altitude (600 km). We use a simple and clearly fixed set-up for the atmospheric density, the geomagnetic field, the initial conditions, and the detection conditions of the particles. For this second part, we compare GEANT4 (different flavours), FLUKA/CORSIKA and the custom-made code MC-PEPTITA. References: C. Rutjes et al., 2016. Evaluation of Monte Carlo tools for high energy atmospheric physics. Geosci. Model Dev. Under review. Skeltved, A. B. et al., 2014. Modelling the relativistic runaway electron avalanche and the feedback mechanism with GEANT4. JGRA, doi:10.1002/2014JA020504.
Infantino, Angelo; Valtieri, Lorenzo; Cicoria, Gianfranco; Pancaldi, Davide; Mostacci, Domiziano; Marengo, Mario
2015-12-01
In a medical cyclotron facility, (41)Ar (t1/2 = 109.34 m) is produced by the activation of air due to the neutron flux during irradiation, according to the (40)Ar(n,γ)(41)Ar reaction; this is particularly relevant for the widespread high beam current cyclotrons used for the production of PET radionuclides. While theoretical estimates of (41)Ar production have been published, no data are available on direct experimental measurements for a biomedical cyclotron. In this work, we describe a sampling methodology and report the results of an extensive measurement campaign. Furthermore, the experimental results are compared with Monte Carlo simulations performed with the FLUKA code. To measure the (41)Ar activity, air samples were taken inside the cyclotron bunker in sealed Marinelli beakers during the routine production of (18)F with a 16.5 MeV GE-PETtrace cyclotron; this sampling thus reproduces a situation with no air changes. Sample analysis was performed in a gamma-ray spectrometry system equipped with an HPGe detector. The Monte Carlo assessment of the (41)Ar saturation yield was performed directly using the standard FLUKA score RESNUCLE, and off-line by the convolution of the neutron fluence with cross section data. The average saturation yield of (41)Ar per liter of air, measured by gamma-ray spectrometry, was 3.0 ± 0.6 Bq/µA*dm(3), while the simulations gave 6.9 ± 0.3 Bq/µA*dm(3) in the direct assessment and 6.92 ± 0.22 Bq/µA*dm(3) by the convolution of the neutron fluence with the cross section. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
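As a hedged illustration of how a measured sample activity maps onto a saturation yield of the kind quoted above (all numbers except the half-life are placeholders, not values from the paper):

```python
import numpy as np

half_life_s = 109.34 * 60.0            # 41Ar half-life in seconds
lam = np.log(2.0) / half_life_s

A_meas = 50.0    # Bq measured in the Marinelli beaker (placeholder)
t_irr = 3600.0   # irradiation length in s (placeholder)
t_cool = 600.0   # delay between end of bombardment and counting, s (placeholder)
I_beam = 50.0    # proton beam current in microamperes (placeholder)
V_air = 1.0      # sampled air volume in dm^3 (placeholder)

# Decay-correct back to the end of bombardment, then divide out the
# saturation factor and normalize per unit current and volume.
A_eob = A_meas * np.exp(lam * t_cool)
Y_sat = A_eob / (I_beam * V_air * (1.0 - np.exp(-lam * t_irr)))
print(f"saturation yield: {Y_sat:.2f} Bq/(uA*dm^3)")
```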
Predicting induced radioactivity for the accelerator operations at the Taiwan Photon Source.
Sheu, R J; Jiang, S H
2010-12-01
This study investigates the characteristics of induced radioactivity due to the operations of a 3-GeV electron accelerator at the Taiwan Photon Source (TPS). According to the beam loss analysis, the authors set two representative irradiation conditions for the activation analysis. The FLUKA Monte Carlo code has been used to predict the isotope inventories, residual activities, and remanent dose rates as a function of time. The calculation model itself is simple but conservative for the evaluation of induced radioactivity in a light source facility. This study highlights the importance of beam loss scenarios and demonstrates the great advantage of using FLUKA in comparing the predicted radioactivity with corresponding regulatory limits. The calculated results lead to the conclusion that, due to fairly low electron consumption, the radioactivity induced in the accelerator components and surrounding concrete walls of the TPS is rather moderate and manageable, while the possible activation of air and cooling water in the tunnel and their environmental releases are negligible.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zheng, Y; Singh, H; Islam, M
2014-06-01
Purpose: Output dependence on field size for uniform scanning beams, and the accuracy of treatment planning system (TPS) calculations, are not well studied. The purpose of this work is to investigate the dependence of output on field size for uniform scanning proton beams and compare it among TPS calculation, measurement and Monte Carlo simulation. Methods: Field size dependence was studied using field sizes between 2.5 cm and 10 cm in diameter. The field size factor was studied for a number of proton range and modulation combinations, based on the output at the center of the spread out Bragg peak normalized to a 10 cm diameter field. Three methods were used and compared in this study: 1) TPS calculation, 2) ionization chamber measurement, and 3) Monte Carlo simulation. The XiO TPS (Elekta, St. Louis) was used to calculate the output factor using a pencil beam algorithm; a pinpoint ionization chamber was used for measurements; and the FLUKA code was used for Monte Carlo simulations. Results: The field size factor varied with proton beam parameters, such as range, modulation, and calibration depth, and could decrease by over 10% from a 10 cm to a 3 cm diameter field for a large-range proton beam. The XiO TPS predicted the field size factor relatively well at large field sizes, but could differ from measurements by 5% or more for small fields and large-range beams. Monte Carlo simulations predicted the field size factor within 1.5% of measurements. Conclusion: The output factor can vary largely with field size and needs to be accounted for to achieve accurate proton beam delivery. This is especially important for small-field beams such as in stereotactic proton therapy, where the field size dependence is large and the TPS calculation is inaccurate. Measurements or Monte Carlo simulations are recommended for output determination in such cases.
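A minimal sketch of the normalization described in the Methods is given below; the dose-per-MU readings are placeholders standing in for the chamber measurements or Monte Carlo tallies:

```python
# Field size factor: output at the SOBP center for a given field diameter,
# normalized to the 10 cm diameter reference field. Readings are placeholders.
reference_diameter_cm = 10.0
dose_per_mu = {10.0: 1.000, 7.5: 0.992, 5.0: 0.975, 3.0: 0.930, 2.5: 0.910}

field_size_factor = {
    d: dose_per_mu[d] / dose_per_mu[reference_diameter_cm]
    for d in sorted(dose_per_mu)
}
for d, fsf in field_size_factor.items():
    print(f"{d:>4.1f} cm diameter field: FSF = {fsf:.3f}")
```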
The Application of FLUKA to Dosimetry and Radiation Therapy
NASA Technical Reports Server (NTRS)
Wilson, Thomas L.; Andersen, Victor; Pinsky, Lawrence; Ferrari, Alfredo; Battistoni, Giusenni
2005-01-01
Monte Carlo transport codes like FLUKA are useful for many purposes, and one of those is the simulation of the effects of radiation traversing the human body. In particular, radiation has been used in cancer therapy for a long time, and recently this has been extended to include heavy ion particle beams. The advent of this particular type of therapy has led to the need for increased capabilities in the transport codes used to simulate the detailed nature of the treatment doses to the various tissues that are encountered. This capability is also of interest to NASA because of the nature of the radiation environment in space [1]. While in space, crew members' bodies are continually being traversed by virtually all forms of radiation. In assessing the risk that this exposure causes, heavy ions are of primary importance. These arise both from the primary external space radiation itself and from fragments that result from interactions during the traversal of that radiation through any intervening material, including intervening body tissue itself. Thus the capability to characterize accurately the details of the radiation field within a human body subjected to such external "beams" is of critical importance.
Beck, P; Latocha, M; Dorman, L; Pelliccioni, M; Rollet, S
2007-01-01
As required by the European Directive 96/29/Euratom, radiation exposure due to natural ionizing radiation has to be taken into account at workplaces if the effective dose could exceed 1 mSv per year. An example of workers concerned by this directive is aircraft crew, due to cosmic radiation exposure in the atmosphere. Extensive measurement campaigns on board aircraft have been carried out to assess the ambient dose equivalent. A consortium of European dosimetry institutes within EURADOS WG5 summarized experimental data and results of calculations, together with detailed descriptions of the methods for measurements and calculations. The radiation protection quantity of interest is the effective dose, E (ISO). The comparison of measured and calculated results is done in terms of the operational quantity ambient dose equivalent, H*(10). This paper gives an overview of the EURADOS Aircraft Crew In-Flight Database and presents a new empirical model describing fitting functions for these data. Furthermore, it describes numerical simulations performed with the Monte Carlo code FLUKA-2005 using an updated version of the cosmic radiation primary spectra. The ratio between ambient dose equivalent and effective dose at commercial flight altitudes, calculated with FLUKA-2005, is discussed. Finally, it presents the aviation dosimetry model AVIDOS, based on FLUKA-2005 simulations, for routine dose assessment. The code has been developed by the Austrian Research Centers (ARC) for public usage (http://avidos.healthphysics.at).
NASA Astrophysics Data System (ADS)
Bazo, J.; Rojas, J. M.; Best, S.; Bruna, R.; Endress, E.; Mendoza, P.; Poma, V.; Gago, A. M.
2018-03-01
Samples of two characteristic semiconductor sensor materials, silicon and germanium, have been irradiated with neutrons produced at the RP-10 Nuclear Research Reactor at 4.5 MW. The photon spectra of the resulting radionuclides have been measured with high-resolution gamma spectroscopy, quantifying four radioisotopes (28Al and 29Al for Si; 75Ge and 77Ge for Ge). We have compared the radionuclide production and the emission spectrum data with Monte Carlo simulation results from FLUKA. Thus we have tested FLUKA's low energy neutron library (ENDF/B-VIIR) and decay photon scoring with respect to the activation of these semiconductors. We conclude that FLUKA is capable of predicting relative photon peak amplitudes, for gamma intensities greater than 1%, of produced radionuclides with an average uncertainty of 13%. This work allows us to estimate the corresponding systematic error in neutron activation simulation studies of these sensor materials.
NASA Technical Reports Server (NTRS)
Rojdev, Kristina; Koontz, Steve; Reddell, Brandon; Atwell, William; Boeder, Paul
2015-01-01
An accurate prediction of spacecraft avionics single event effect (SEE) radiation susceptibility is key to ensuring a safe and reliable vehicle. This is particularly important for long-duration deep space missions for human exploration, where there is little or no chance for a quick emergency return to Earth. Monte Carlo nuclear reaction and transport codes such as FLUKA can be used to generate very accurate models of the expected in-flight radiation environment for SEE analyses. A major downside to using a Monte Carlo-based code is that the run times can be very long (on the order of days). A more popular choice for SEE calculations is the CREME96 deterministic code, which offers significantly shorter run times (on the order of seconds). However, CREME96, though fast and easy to use, has not been updated in several years and underestimates secondary particle shower effects in spacecraft structural shielding mass. Another modeling option to consider is the deterministic code HZETRN 2010, which includes updates to address secondary particle shower effects more accurately. This paper builds on previous work by Rojdev et al. to compare the use of HZETRN 2010 against CREME96 as a tool to verify spacecraft avionics system reliability in a space flight SEE environment. This paper will discuss modifications made to HZETRN 2010 to improve its performance for calculating SEE rates and compare results with both in-flight SEE rates and other calculation methods.
A versatile multi-objective FLUKA optimization using Genetic Algorithms
NASA Astrophysics Data System (ADS)
Vlachoudis, Vasilis; Antoniucci, Guido Arnau; Mathot, Serge; Kozlowska, Wioletta Sandra; Vretenar, Maurizio
2017-09-01
Quite often, Monte Carlo simulation studies require a multi-phase-space optimization, a complicated task that relies heavily on the operator's experience and judgment. Examples of such calculations are shielding calculations with stringent constraints on cost, residual dose, material properties and available space, or, in the medical field, optimizing the dose delivered to a patient under a hadron treatment. The present paper describes our implementation inside flair [1], the advanced user interface of FLUKA [2,3], of a multi-objective Genetic Algorithm to facilitate the search for the optimum solution.
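The Python sketch below shows the general shape of such a multi-objective genetic-algorithm loop, applied to a toy shielding problem with material cost and transmitted dose as competing objectives and a simple Pareto (non-dominated) selection. The objective functions, parameter ranges and GA settings are invented for illustration and do not reflect the flair/FLUKA implementation.

import math
import random

def objectives(genes):
    thickness, density = genes
    cost = thickness * density                              # toy material cost proxy
    dose = 100.0 * math.exp(-0.05 * thickness * density)    # toy transmitted dose
    return cost, dose

def dominates(a, b):
    """True if objective vector a is no worse than b everywhere and strictly better somewhere."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(population):
    scored = [(genes, objectives(genes)) for genes in population]
    return [g for g, s in scored
            if not any(dominates(s2, s) for _, s2 in scored if s2 != s)]

def mutate(genes):
    return tuple(max(0.1, g * random.uniform(0.8, 1.2)) for g in genes)

random.seed(1)
population = [(random.uniform(10, 200), random.uniform(1, 10)) for _ in range(40)]
for generation in range(50):
    front = pareto_front(population)
    # elitist refill: keep the non-dominated individuals and mutate copies of them
    population = front + [mutate(random.choice(front)) for _ in range(40 - len(front))]

for genes in pareto_front(population):
    cost, dose = objectives(genes)
    print(f"thickness={genes[0]:6.1f}  density={genes[1]:4.1f}  cost={cost:7.1f}  dose={dose:6.2f}")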
Forward neutron production at the Fermilab Main Injector
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nigmanov, T. S.; Rajaram, D.; Longo, M. J.
2011-01-01
We have measured cross sections for forward neutron production from a variety of targets using proton beams from the Fermilab Main Injector. Measurements were performed for proton beam momenta of 58, 84, and 120 GeV/c. The cross section dependence on the atomic weight (A) of the targets was found to vary as A^α, where α is 0.46 ± 0.06 for a beam momentum of 58 GeV/c and 0.54 ± 0.05 for 120 GeV/c. The cross sections show reasonable agreement with the FLUKA and DPMJET Monte Carlo codes. Comparisons have also been made with the LAQGSM Monte Carlo code.
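A minimal sketch of how an exponent α in a σ ∝ A^α scaling can be extracted by a straight-line fit in log-log space is given below. The (A, σ) pairs are invented placeholders, not the measured Main Injector data.

import numpy as np

# Hypothetical target mass numbers and cross sections (mb); placeholders for illustration only.
A = np.array([9.0, 27.0, 63.5, 207.2])       # e.g. Be, Al, Cu, Pb
sigma = np.array([12.0, 20.0, 31.0, 55.0])

# sigma = c * A**alpha  =>  log(sigma) = alpha * log(A) + log(c)
alpha, log_c = np.polyfit(np.log(A), np.log(sigma), 1)
print(f"fitted exponent alpha = {alpha:.2f}, prefactor c = {np.exp(log_c):.1f} mb")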
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roesler, Stefan
2002-09-19
Energy spectra of high-energy neutrons and neutron time-of-flight spectra were calculated for the setup of experiment T-454, performed with a NE213 liquid scintillator at the Final Focus Test Beam (FFTB) facility at the Stanford Linear Accelerator Center. The neutrons were created by the interaction of a 28.7 GeV electron beam in the aluminum beam dump of the FFTB, which is housed inside a thick steel and concrete shielding. In order to determine the attenuation length of high-energy neutrons, additional concrete shielding of various thicknesses was placed outside the existing shielding. The calculations were performed using the FLUKA interaction and transport code. The energy and time-of-flight were recorded for the location of the detector, allowing a detailed comparison with the experimental data. A generally good description of the data is achieved, adding confidence to the use of FLUKA for the design of shielding for high-energy electron accelerators.
Development of a Space Radiation Monte Carlo Computer Simulation
NASA Technical Reports Server (NTRS)
Pinsky, Lawrence S.
1997-01-01
The ultimate purpose of this effort is to undertake the development of a computer simulation of the radiation environment encountered in spacecraft which is based upon the Monte Carlo technique. The current plan is to adapt and modify a Monte Carlo calculation code known as FLUKA, which is presently used in high energy and heavy ion physics, to simulate the radiation environment present in spacecraft during missions. The initial effort would be directed towards modeling the MIR and Space Shuttle environments, but the long range goal is to develop a program for the accurate prediction of the radiation environment likely to be encountered on future planned endeavors such as the Space Station, a Lunar Return Mission, or a Mars Mission. The longer the mission, especially those which will not have the shielding protection of the earth's magnetic field, the more critical the radiation threat will be. The ultimate goal of this research is to produce a code that will be useful to mission planners and engineers who need to have detailed projections of radiation exposures at specified locations within the spacecraft and for either specific times during the mission or integrated over the entire mission. In concert with the development of the simulation, it is desired to integrate it with a state-of-the-art interactive 3-D graphics-capable analysis package known as ROOT, to allow easy investigation and visualization of the results. The efforts reported on here include the initial development of the program and the demonstration of the efficacy of the technique through a model simulation of the MIR environment. This information was used to write a proposal to obtain follow-on permanent funding for this project.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Darafsheh, A; Kassaee, A; Finlay, J
Purpose: The nature of the background visible light observed during fiber optic dosimetry of proton beams, whether it is due to Cherenkov radiation or not, has been debated in the literature recently. In this work, experimentally and by means of Monte Carlo simulations, we shed light on this problem and investigated the nature of the background visible light observed in fiber optics irradiated with proton beams. Methods: A bare silica optical fiber was embedded in tissue-mimicking phantoms and irradiated with clinical proton beams with energies of 100-225 MeV at the Roberts Proton Therapy Center. Luminescence spectroscopy was performed by a CCD-coupled spectrograph to analyze in detail the emission spectrum of the fiber tip across the visible range of 400-700 nm. Monte Carlo simulation was performed using the FLUKA Monte Carlo code to simulate Cherenkov light and ionizing radiation dose deposition in the fiber. Results: The experimental spectra of the irradiated silica fiber show two distinct peaks at 450 and 650 nm, whose spectral shape is different from that of Cherenkov radiation. We believe that the nature of these peaks is connected to the point defects of silica, including the oxygen-deficiency center (ODC) and the non-bridging oxygen hole center (NBOHC). Monte Carlo simulations confirmed the experimental observation that Cherenkov radiation cannot be solely responsible for such a signal. Conclusion: We showed that Cherenkov radiation is not the dominant visible signal observed in bare fiber optics irradiated with proton beams. We observed two distinct peaks at 450 and 650 nm whose nature is connected with the point defects of the silica fiber, including the oxygen-deficiency center and the non-bridging oxygen hole center.
Development of the 3DHZETRN code for space radiation protection
NASA Astrophysics Data System (ADS)
Wilson, John; Badavi, Francis; Slaba, Tony; Reddell, Brandon; Bahadori, Amir; Singleterry, Robert
Space radiation protection requires computationally efficient shield assessment methods that have been verified and validated. The HZETRN code is the engineering design code used for low Earth orbit dosimetric analysis and astronaut record keeping with end-to-end validation to twenty percent in Space Shuttle and International Space Station operations. HZETRN treated diffusive leakage only at the distal surface limiting its application to systems with a large radius of curvature. A revision of HZETRN that included forward and backward diffusion allowed neutron leakage to be evaluated at both the near and distal surfaces. That revision provided a deterministic code of high computational efficiency that was in substantial agreement with Monte Carlo (MC) codes in flat plates (at least to the degree that MC codes agree among themselves). In the present paper, the 3DHZETRN formalism capable of evaluation in general geometry is described. Benchmarking will help quantify uncertainty with MC codes (Geant4, FLUKA, MCNP6, and PHITS) in simple shapes such as spheres within spherical shells and boxes. Connection of the 3DHZETRN to general geometry will be discussed.
Radiation protection considerations along a radioactive ion beam transport line
NASA Astrophysics Data System (ADS)
Sarchiapone, Lucia; Zafiropoulos, Demetre
2016-09-01
The goal of the SPES project is to produce accelerated radioactive ion beams for physics studies at the "Laboratori Nazionali di Legnaro" (INFN, Italy). This accelerator complex is scheduled to be built by 2016 for effective operation in 2017. Radioactive species are produced in a uranium carbide target by the interaction of 200 μA of protons at 40 MeV. All of the ionized species in the 1+ state come out of the target (ISOL method) and pass through a Wien filter for a first selection and an HMRS (high mass resolution spectrometer). They are then transported by an electrostatic beam line toward a charge state breeder (where the 1+ to n+ multi-ionization takes place) before selection and reacceleration at the already existing superconducting linac. The work concerning dose evaluations, activation calculations, and radiation protection constraints related to the transport of the radioactive ion beam (RIB) from the target to the mass separator is described in this paper. The FLUKA code has been used as a tool for those calculations needing Monte Carlo simulations, in particular for the evaluation of the dose rate due to the presence of the radioactive beam at the selection/interaction points. The time evolution of a radionuclide inventory can be computed online with FLUKA for arbitrary irradiation profiles and decay times. The activity evolution is analytically evaluated through the implementation of the Bateman equations. Furthermore, the generation and transport of decay radiation (limited to gamma, beta- and beta+ emissions) is possible, referring to a dedicated database of decay emissions using mostly information obtained from NNDC, sometimes supplemented with other data and checked for consistency. When the use of Monte Carlo simulations was not feasible, the Bateman equations, or possible simplifications thereof, have been used directly.
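For reference, the Bateman equations mentioned above have a standard analytical solution for a linear decay chain with distinct decay constants; the Python sketch below evaluates it starting from a pure parent population. The half-lives are arbitrary example values, not the SPES radionuclide inventory.

import math

def bateman(n1_0, lambdas, t):
    """Atoms of each member of the chain 1 -> 2 -> ... at time t (distinct decay constants assumed)."""
    result = []
    for k in range(1, len(lambdas) + 1):
        prefactor = n1_0 * math.prod(lambdas[:k - 1])
        total = 0.0
        for i in range(k):
            denom = math.prod(lambdas[j] - lambdas[i] for j in range(k) if j != i)
            total += math.exp(-lambdas[i] * t) / denom
        result.append(prefactor * total)
    return result

half_lives_s = (600.0, 3600.0, 86400.0)                       # example values
lambdas = [math.log(2) / t_half for t_half in half_lives_s]   # decay constants (1/s)
print(bateman(1.0e12, lambdas, t=7200.0))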
Monte Carlo simulations of a low energy proton beamline for radiobiological experiments.
Dahle, Tordis J; Rykkelid, Anne Marit; Stokkevåg, Camilla H; Mairani, Andrea; Görgen, Andreas; Edin, Nina J; Rørvik, Eivind; Fjæra, Lars Fredrik; Malinen, Eirik; Ytre-Hauge, Kristian S
2017-06-01
In order to determine the relative biological effectiveness (RBE) of protons with high accuracy, radiobiological experiments with detailed knowledge of the linear energy transfer (LET) are needed. Cell survival data from high-LET protons are sparse, and experiments with low energy protons to achieve high LET values are therefore required. The aim of this study was to quantify LET distributions from a low energy proton beam by using Monte Carlo (MC) simulations, and to further compare them to a proton beam representing a typical minimum energy available at clinical facilities. A Markus ionization chamber and Gafchromic films were employed for dose measurements in the proton beam at the Oslo Cyclotron Laboratory. Dose profiles were also calculated using the FLUKA MC code, with the MC beam parameters optimized based on comparisons with the measurements. LET spectra and dose-averaged LET (LET_d) were then estimated in FLUKA and compared with the LET calculated for an 80 MeV proton beam. The initial proton energy was determined to be 15.5 MeV, with a Gaussian energy distribution of 0.2% full width at half maximum (FWHM) and a Gaussian lateral spread of 2 mm FWHM. The LET_d increased with depth, from approximately 5 keV/μm at the entrance to approximately 40 keV/μm in the distal dose fall-off. The LET_d values were considerably higher and the LET spectra much narrower than the corresponding spectra from the 80 MeV beam. MC simulations accurately modeled the dose distribution from the proton beam and could be used to estimate the LET at any position in the setup. The setup can be used to study the RBE for protons at high LET_d, which is not achievable in clinical proton therapy facilities.
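As a reminder of how a dose-averaged LET is obtained from a scored LET spectrum, the sketch below weights each LET bin by its dose contribution (proportional to fluence × LET). The spectrum itself is an invented placeholder, not the FLUKA output of the study.

import numpy as np

let_bins = np.array([2.0, 5.0, 10.0, 20.0, 40.0])   # bin-centre LET values (keV/um)
fluence = np.array([1.0, 0.6, 0.3, 0.1, 0.02])      # hypothetical relative fluence per bin

dose_weights = fluence * let_bins                                    # dose contribution of each bin
let_track = np.sum(fluence * let_bins) / np.sum(fluence)             # track-averaged LET
let_dose = np.sum(dose_weights * let_bins) / np.sum(dose_weights)    # dose-averaged LET (LET_d)
print(f"LET_t = {let_track:.1f} keV/um, LET_d = {let_dose:.1f} keV/um")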
Activation of accelerator construction materials by heavy ions
NASA Astrophysics Data System (ADS)
Katrík, P.; Mustafin, E.; Hoffmann, D. H. H.; Pavlovič, M.; Strašík, I.
2015-12-01
Activation data for an aluminum target irradiated by a 200 MeV/u 238U ion beam are presented in this paper. The target was irradiated in the stacked-foil geometry and analyzed using gamma-ray spectroscopy. The purpose of the experiment was to study the role of primary particles, projectile fragments, and target fragments in the activation process using depth profiling of the residual activity. The study provided information on which particles contribute dominantly to the target activation. The experimental data were compared with Monte Carlo simulations by the FLUKA 2011.2c.0 code. This study is part of a research program devoted to the activation of accelerator construction materials by high-energy (⩾200 MeV/u) heavy ions at GSI Darmstadt. The experimental data are needed to validate the computer codes used for simulation of the interaction of swift heavy ions with matter.
Dose response of alanine detectors irradiated with carbon ion beams
DOE Office of Scientific and Technical Information (OSTI.GOV)
Herrmann, Rochus; Jaekel, Oliver; Palmans, Hugo
Purpose: The dose response of the alanine detector shows a dependence on particle energy and type when irradiated with ion beams. The purpose of this study is to investigate the response behavior of the alanine detector in clinical carbon ion beams and compare the results to model predictions. Methods: Alanine detectors have been irradiated with carbon ions with an energy range of 89-400 MeV/u. The relative effectiveness of alanine has been measured in this regime. Pristine and spread out Bragg peak depth-dose curves have been measured with alanine dosimeters. The track structure based alanine response model developed by Hansen and Olsen has been implemented in the Monte Carlo code FLUKA, and calculations were compared to experimental results. Results: Calculations of the relative effectiveness deviate by less than 5% from the measured values for monoenergetic beams. Measured depth-dose curves deviate from predictions in the peak region, most pronouncedly at the distal edge of the peak. Conclusions: The model used and its implementation show good overall agreement for quasi-monoenergetic measurements. Deviations in depth-dose measurements are mainly attributed to uncertainties in the detector geometry implemented in the Monte Carlo simulations.
Monte Carlo simulation of secondary neutron dose for scanning proton therapy using FLUKA
Lee, Chaeyeong; Lee, Sangmin; Lee, Seung-Jae; Song, Hankyeol; Kim, Dae-Hyun; Cho, Sungkoo; Jo, Kwanghyun; Han, Youngyih; Chung, Yong Hyun
2017-01-01
Proton therapy is a rapidly progressing field for cancer treatment. Globally, many proton therapy facilities are being commissioned or are under construction. Secondary neutrons are an important issue during the commissioning process of a proton therapy facility. The purpose of this study is to model and validate the scanning nozzles of the proton therapy system at Samsung Medical Center (SMC) by Monte Carlo simulation for beam commissioning. After the commissioning, the secondary neutron ambient dose from a proton scanning nozzle (Gantry 1) was simulated and measured. This simulation was performed to evaluate beam properties such as the percent depth dose curve, Bragg peak, and distal fall-off, so that they could be verified against measured data. Using the validated beam nozzle, the secondary neutron ambient dose was simulated and then compared with the measured ambient dose from Gantry 1. We calculated the secondary neutron dose at several different points. We demonstrated the validity of modeling a proton scanning nozzle system to evaluate various parameters using FLUKA. The measured secondary neutron ambient dose showed a similar tendency to the simulation results. This work will increase the knowledge necessary for the development of radiation safety technology in medical particle accelerators.
Benchmark studies of induced radioactivity produced in LHC materials, Part II: Remanent dose rates.
Brugger, M; Khater, H; Mayer, S; Prinz, A; Roesler, S; Ulrici, L; Vincke, H
2005-01-01
A new method to estimate remanent dose rates, to be used with the Monte Carlo code FLUKA, was benchmarked against measurements from an experiment that was performed at the CERN-EU high-energy reference field facility. An extensive collection of samples of different materials were placed downstream of, and laterally to, a copper target intercepting a positively charged mixed hadron beam with a momentum of 120 GeV/c. Emphasis was put on the reduction of uncertainties by taking measures such as careful monitoring of the irradiation parameters, using different instruments to measure dose rates, adopting detailed elemental analyses of the irradiated materials and making detailed simulations of the irradiation experiment. The measured and calculated dose rates are in good agreement.
Shielding design for the front end of the CERN SPL.
Magistris, Matteo; Silari, Marco; Vincke, Helmut
2005-01-01
CERN is designing a 2.2-GeV Superconducting Proton Linac (SPL) with a beam power of 4 MW, to be used for the production of a neutrino superbeam. The SPL front end will initially accelerate 2 × 10^14 negative hydrogen ions per second up to an energy of 120 MeV. The FLUKA Monte Carlo code was employed for shielding design. The proposed shielding is a combined iron-concrete structure, which also takes into consideration the required RF wave-guide ducts and access labyrinths to the machine. Two beam-loss scenarios were investigated: (1) constant beam loss of 1 W/m over the whole accelerator length and (2) full beam loss occurring at various locations. A comparison with results based on simplified approaches is also presented.
2013-07-01
Data were derived from calculations using the three-dimensional Monte Carlo radiation transport code MCNP (Monte Carlo N-Particle), a general-purpose code designed to simulate neutron and photon transport.
Investigation of HZETRN 2010 as a Tool for Single Event Effect Qualification of Avionics Systems
NASA Technical Reports Server (NTRS)
Rojdev, Kristina; Koontz, Steve; Atwell, William; Boeder, Paul
2014-01-01
NASA's future missions are focused on long-duration deep space missions for human exploration, which offer no option for a quick emergency return to Earth. The combination of long mission duration with no quick emergency return option leads to unprecedented spacecraft system safety and reliability requirements. It is important that spacecraft avionics systems for human deep space missions are not susceptible to Single Event Effect (SEE) failures caused by space radiation (primarily the continuous galactic cosmic ray background and the occasional solar particle event) interactions with electronic components and systems. SEE effects are typically managed during the design, development, and test (DD&T) phase of spacecraft development by using heritage hardware (if possible) and through extensive component level testing, followed by system level failure analysis tasks that are both time consuming and costly. The ultimate product of the SEE DD&T program is a prediction of spacecraft avionics reliability in the flight environment, produced using various nuclear reaction and transport codes in combination with the component and subsystem level radiation test data. Previous work by Koontz et al. [1] utilized FLUKA, a Monte Carlo nuclear reaction and transport code, to calculate SEE and single event upset (SEU) rates. This code was then validated against in-flight data for a variety of spacecraft and space flight environments. However, FLUKA has a long run-time (on the order of days). CREME96 [2], an easy to use deterministic code offering short run times, was also compared with FLUKA predictions and in-flight data. CREME96, though fast and easy to use, has not been updated in several years and underestimates secondary particle shower effects in spacecraft structural shielding mass. Thus, this paper will investigate the use of HZETRN 2010 [3], a fast and easy to use deterministic transport code, similar to CREME96, that was developed at NASA Langley Research Center primarily for flight crew ionizing radiation dose assessments. HZETRN 2010 includes updates to address secondary particle shower effects more accurately, and might be used as another tool to verify spacecraft avionics system reliability in space flight SEE environments.
NASA Astrophysics Data System (ADS)
Magro, G.; Molinelli, S.; Mairani, A.; Mirandola, A.; Panizza, D.; Russo, S.; Ferrari, A.; Valvo, F.; Fossati, P.; Ciocca, M.
2015-09-01
This study was performed to evaluate the accuracy of a commercial treatment planning system (TPS) in optimising proton pencil beam dose distributions for small targets of different sizes (5-30 mm side) located at increasing depths in water. The TPS analytical algorithm was benchmarked against experimental data and the FLUKA Monte Carlo (MC) code, previously validated for the selected beam-line. We tested the Siemens syngo® TPS plan optimisation module for water cubes, fixing the configurable parameters at clinical standards, with homogeneous target coverage to a 2 Gy (RBE) dose prescription as the unique goal. Plans were delivered and the dose at each volume centre was measured in water with a calibrated PTW Advanced Markus® chamber. An EBT3® film was also positioned at the phantom entrance window for the acquisition of 2D dose maps. Discrepancies between TPS-calculated and MC-simulated values were mainly due to the different lateral spread modelling and were found to be related to the field-to-spot size ratio. The accuracy of the TPS proved to be clinically acceptable in all cases except very small and shallow volumes. In this context, the use of MC to validate TPS results proved to be a reliable procedure for pre-treatment plan verification.
NASA Astrophysics Data System (ADS)
Infantino, Angelo; Marengo, Mario; Baschetti, Serafina; Cicoria, Gianfranco; Longo Vaschetto, Vittorio; Lucconi, Giulia; Massucci, Piera; Vichi, Sara; Zagni, Federico; Mostacci, Domiziano
2015-11-01
Biomedical cyclotrons for the production of Positron Emission Tomography (PET) radionuclides and for radiotherapy with hadrons or ions are widespread and well established in hospitals as well as in industrial facilities and research sites. Guidelines for site planning and installation, as well as for radiation protection assessment, are given in a number of international documents; however, these well-established guides typically offer analytic methods of calculation of both shielding and materials activation, in approximate or idealized geometry set-ups. The availability of Monte Carlo codes with accurate and up-to-date libraries for the transport and interactions of neutrons and charged particles at energies below 250 MeV, together with the continuously increasing power of today's computers, makes the systematic use of simulations with realistic geometries possible, yielding equipment- and site-specific evaluation of the source terms, shielding requirements and all quantities relevant to radiation protection. In this work, the well-known Monte Carlo code FLUKA was used to simulate two representative models of cyclotron for PET radionuclide production, including their targetry, and one type of proton therapy cyclotron including the energy selection system. Simulations yield estimates of various quantities of radiological interest, including the effective dose distribution around the equipment, the effective number of neutrons produced per incident proton and the activation of the target materials, the structure of the cyclotron, the energy degrader, the vault walls and the soil. The model was validated against experimental measurements and comparison with well-established reference data. Neutron ambient dose equivalent H*(10) was measured around a GE PETtrace cyclotron: an average ratio between experimental measurements and simulations of 0.99±0.07 was found. The saturation yield of 18F, produced by the well-known 18O(p,n)18F reaction, was calculated and compared with the IAEA recommended value: a simulation-to-IAEA ratio of 1.01±0.10 was found.
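The saturation yield quoted above is related to the end-of-bombardment activity through the usual build-up factor, as in the hedged sketch below; the yield and beam current are illustrative placeholders, while the 18F half-life is the standard value.

import math

T_HALF_18F_S = 109.77 * 60.0                 # 18F half-life (~109.77 min)
LAMBDA_18F = math.log(2) / T_HALF_18F_S

def eob_activity(sat_yield_gbq_per_ua, current_ua, t_irr_s):
    """End-of-bombardment activity (GBq): A = Y_sat * I * (1 - exp(-lambda * t_irr))."""
    return sat_yield_gbq_per_ua * current_ua * (1.0 - math.exp(-LAMBDA_18F * t_irr_s))

# Hypothetical 18F production run: 9 GBq/uA saturation yield, 40 uA beam, 1 h irradiation.
print(f"A_EOB ~ {eob_activity(9.0, 40.0, 3600.0):.1f} GBq")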
Vectorized Monte Carlo methods for reactor lattice analysis
NASA Technical Reports Server (NTRS)
Brown, F. B.
1984-01-01
Some of the new computational methods and equivalent mathematical representations of physics models used in the MCV code, a vectorized continuous-energy Monte Carlo code for the CYBER-205 computer, are discussed. While the principal application of MCV is the neutronics analysis of repeating reactor lattices, the new methods used in MCV should be generally useful for vectorizing Monte Carlo for other applications. For background, a brief overview of the vector processing features of the CYBER-205 is included, followed by a discussion of the fundamentals of Monte Carlo vectorization. The physics models used in the MCV vectorized Monte Carlo code are then summarized. The new methods used in scattering analysis are presented along with details of several key, highly specialized computational routines. Finally, speedups relative to CDC-7600 scalar Monte Carlo are discussed.
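A hedged NumPy sketch of the event-based idea behind Monte Carlo vectorization is given below: a whole batch of histories is pushed through the "sample flight distance" and "sample collision type" events as array operations on a toy one-dimensional absorber. The cross sections and geometry are arbitrary and unrelated to MCV or the CYBER-205 implementation.

import numpy as np

rng = np.random.default_rng(0)
sigma_t, sigma_a = 1.0, 0.3      # total and absorption macroscopic cross sections (1/cm)
slab_thickness = 5.0             # cm
n_histories = 100_000

x = np.zeros(n_histories)                     # particle positions
alive = np.ones(n_histories, dtype=bool)
transmitted = 0

while alive.any():
    # sample all free-flight distances for the live particles in one array operation
    flight = -np.log(1.0 - rng.random(alive.sum())) / sigma_t
    x[alive] += flight
    escaped = alive & (x > slab_thickness)
    transmitted += escaped.sum()
    alive &= ~escaped
    # vectorized collision sampling: absorbed particles terminate, scattered ones continue
    # (forward-only rod model; directions are ignored in this toy example)
    absorbed = alive & (rng.random(n_histories) < sigma_a / sigma_t)
    alive &= ~absorbed

print(f"transmission fraction ~ {transmitted / n_histories:.4f}")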
Prompt Radiation Protection Factors
2018-02-01
Transport of prompt radiation was performed using the three-dimensional Monte Carlo radiation transport code MCNP (Monte Carlo N-Particle), together with the evaluation of protection factors (the ratio of the dose in the open to the dose at the protected location). Scenarios involving the detonation of a nuclear device have placed renewed emphasis on evaluating the consequences of such an event.
Monte Carlo calculations of positron emitter yields in proton radiotherapy.
Seravalli, E; Robert, C; Bauer, J; Stichelbaut, F; Kurz, C; Smeets, J; Van Ngoc Ty, C; Schaart, D R; Buvat, I; Parodi, K; Verhaegen, F
2012-03-21
Positron emission tomography (PET) is a promising tool for monitoring the three-dimensional dose distribution in charged particle radiotherapy. PET imaging during or shortly after proton treatment is based on the detection of annihilation photons following the β+ decay of radionuclides resulting from nuclear reactions in the irradiated tissue. Therapy monitoring is achieved by comparing the measured spatial distribution of irradiation-induced β+ activity with the predicted distribution based on the treatment plan. The accuracy of the calculated distribution depends on the correctness of the computational models, implemented in the employed Monte Carlo (MC) codes, that describe the interactions of the charged particle beam with matter and the production of radionuclides and secondary particles. However, no well-established theoretical models exist for predicting the nuclear interactions, and so phenomenological models are typically used, based on parameters derived from experimental data. Unfortunately, the experimental data presently available are insufficient to validate such phenomenological hadronic interaction models. Hence, a comparison among the models used by the different MC packages is desirable. In this work, starting from a common geometry, we compare the performance of the MCNPX, GATE and PHITS MC codes in predicting the amount and spatial distribution of proton-induced activity at therapeutic energies, against the already experimentally validated PET modelling based on the FLUKA MC code. In particular, we show how the amount of β+ emitters produced in tissue-like media depends on the physics model and cross-sectional data used to describe the proton nuclear interactions, thus calling for future experimental campaigns aimed at supporting improvements of MC modelling for clinical applications of PET monitoring.
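The comparison described above ultimately concerns the predicted β+ activity as a function of time. A minimal sketch of how per-voxel activity decays after irradiation, summing the main proton-induced emitters, is given below; the produced-nuclide numbers are placeholders, while the half-lives of 11C, 13N and 15O are the well-known values.

import math

half_lives_s = {"11C": 20.33 * 60.0, "13N": 9.97 * 60.0, "15O": 122.2}
produced_atoms = {"11C": 4.0e6, "13N": 1.0e6, "15O": 6.0e6}   # hypothetical per-voxel yields

def voxel_activity(t_s):
    """Predicted beta+ activity (Bq) in one voxel, t_s seconds after the end of irradiation."""
    total = 0.0
    for nuclide, t_half in half_lives_s.items():
        lam = math.log(2) / t_half
        total += lam * produced_atoms[nuclide] * math.exp(-lam * t_s)
    return total

for t in (0, 120, 600, 1800):
    print(f"t = {t:5d} s : {voxel_activity(t):10.1f} Bq")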
Darafsheh, Arash; Taleei, Reza; Kassaee, Alireza; Finlay, Jarod C
2016-11-01
Proton beam dosimetry using bare plastic optical fibers has emerged as a simple approach to dose measurement. The source of the signal in this method has been attributed to Čerenkov radiation. The aim of this work was a phenomenological study of the nature of the visible light responsible for the signal in bare fiber optic dosimetry of proton therapy beams. Plastic fiber optic probes embedded in solid water phantoms were irradiated with proton beams of energies 100, 180, and 225 MeV produced by a proton therapy cyclotron. Luminescence spectroscopy was performed by a CCD-coupled spectrometer. The spectra were acquired at various depths in the phantom to measure the percentage depth dose (PDD) for each beam energy. For comparison, the PDD curves were acquired using a standard multilayer ion chamber device. In order to further analyze the contribution of Čerenkov radiation to the spectra, Monte Carlo simulation was performed using the FLUKA Monte Carlo code to stochastically simulate radiation transport, ionizing radiation dose deposition, and the optical emission of Čerenkov radiation. The measured depth doses using the bare fiber are in agreement with measurements performed by the multilayer ion chamber device, indicating the feasibility of using bare fiber probes for proton beam dosimetry. The spectroscopic study of proton-irradiated fibers showed a continuous spectrum with a shape different from that of Čerenkov radiation. The Monte Carlo simulations confirmed that the amount of generated Čerenkov light does not follow the radiation absorbed dose in the medium. The source of the optical signal responsible for the proton dose measurement using bare optical fibers is not Čerenkov radiation; it is fluorescence of the plastic material of the fiber.
2014-03-27
Verification and Validation of Monte Carlo N-Particle Code 6 (MCNP6) with Neutron Protection Factor Measurements of an Iron Box. Thesis presented to the Faculty, Department of Engineering Physics (AFIT-ENP-14-M-05); approved for public release, distribution unlimited.
Design and spectrum calculation of 4H-SiC thermal neutron detectors using FLUKA and TCAD
NASA Astrophysics Data System (ADS)
Huang, Haili; Tang, Xiaoyan; Guo, Hui; Zhang, Yimen; Zhang, Yimeng; Zhang, Yuming
2016-10-01
SiC is a promising material for neutron detection in harsh environments due to its wide band gap, high displacement threshold energy and high thermal conductivity. To increase the detection efficiency of SiC, a converter such as 6LiF or 10B is introduced. In this paper, pulse-height spectra of a PIN diode with a 6LiF conversion layer exposed to thermal neutrons (0.026 eV) are calculated using TCAD and Monte Carlo simulations. First, the conversion efficiency for thermal neutrons with respect to the thickness of the 6LiF layer was calculated using the FLUKA code, and a maximal efficiency of approximately 5% was achieved. Next, the energy distributions of both 3H and α particles produced by neutron capture in 6Li are analyzed for different ranges of emission angle. Subsequently, the transient pulses generated by the impact of single 3H or α particles are calculated. Finally, pulse-height spectra are obtained with a detector efficiency of 4.53%. Comparisons of the simulated result with experimental data are also presented, and the calculated spectrum shows acceptable similarity to the experimental data. This work should be useful for radiation-sensing applications, especially for SiC detector design.
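A rough back-of-the-envelope model of the converter-thickness trade-off described above (thicker 6LiF layers capture more thermal neutrons, but the reaction products can only escape into the sensor from within their range) is sketched below, considering straight-line transport of the ~2.73 MeV triton only and isotropic emission. The material constants are approximate and the model is not the FLUKA calculation of the paper, although it reproduces the order of magnitude of the ~5% ceiling.

import numpy as np

SIGMA_TH_CM2 = 940e-24          # 6Li thermal (n,alpha) cross section (~940 b)
RHO, MOLAR_MASS = 2.54, 25.0    # enriched 6LiF density (g/cm^3) and molar mass (g/mol), approximate
N_AVOGADRO = 6.022e23
R_TRITON_UM = 33.0              # approximate triton range in LiF (micrometres)

sigma_macro_per_um = RHO * N_AVOGADRO / MOLAR_MASS * SIGMA_TH_CM2 * 1e-4   # 1/um

def detection_efficiency(thickness_um, n_steps=2000):
    """Fraction of incident thermal neutrons whose triton reaches the sensor behind the layer."""
    x = np.linspace(0.0, thickness_um, n_steps)                   # capture depth from the entrance
    capture_density = sigma_macro_per_um * np.exp(-sigma_macro_per_um * x)
    depth_to_sensor = thickness_um - x
    escape_fraction = 0.5 * np.clip(1.0 - depth_to_sensor / R_TRITON_UM, 0.0, None)
    dx = x[1] - x[0]
    return float(np.sum(capture_density * escape_fraction) * dx)

for t in (5, 15, 25, 35, 50):
    print(f"{t:3d} um converter: efficiency ~ {100.0 * detection_efficiency(t):.1f} %")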
Considerations of MCNP Monte Carlo code to be used as a radiotherapy treatment planning tool.
Juste, B; Miro, R; Gallardo, S; Verdu, G; Santos, A
2005-01-01
The present work has simulated the photon and electron transport in a Theratron 780® (MDS Nordion) 60Co radiotherapy unit, using the Monte Carlo transport code MCNP (Monte Carlo N-Particle). This work mainly explains the different methodologies carried out to speed up calculations in order to apply this code efficiently in radiotherapy treatment planning.
Nuclide Depletion Capabilities in the Shift Monte Carlo Code
Davidson, Gregory G.; Pandya, Tara M.; Johnson, Seth R.; ...
2017-12-21
A new depletion capability has been developed in the Exnihilo radiation transport code suite. This capability enables massively parallel domain-decomposed coupling between the Shift continuous-energy Monte Carlo solver and the nuclide depletion solvers in ORIGEN to perform high-performance Monte Carlo depletion calculations. This paper describes this new depletion capability and discusses its various features, including a multi-level parallel decomposition, high-order transport-depletion coupling, and energy-integrated power renormalization. Several test problems are presented to validate the new capability against other Monte Carlo depletion codes, and the parallel performance of the new capability is analyzed.
A FLUKA simulation of the KLOE electromagnetic calorimeter
NASA Astrophysics Data System (ADS)
Di Micco, B.; Branchini, P.; Ferrari, A.; Loffredo, S.; Passeri, A.; Patera, V.
2007-10-01
We present the simulation of the KLOE calorimeter with the FLUKA Monte Carlo program. The response of the detector to electromagnetic showers has been studied and compared with the publicly available KLOE data. The energy and time resolution of the electromagnetic clusters are in good agreement with the data. The simulation has also been used to study a possible improvement of the KLOE calorimeter using multianode photomultipliers. A HAMAMATSU R7600-M16 photomultiplier has been assembled in order to determine the full cross-talk matrix, which has been included in the simulation. The cross-talk matrix takes into account the effects of a realistic photomultiplier's electronics and of its coupling to the active material. The performance of the modified readout has been compared to the standard KLOE configuration.
Shielding and activation calculations around the reactor core for the MYRRHA ADS design
NASA Astrophysics Data System (ADS)
Ferrari, Anna; Mueller, Stefan; Konheiser, J.; Castelliti, D.; Sarotto, M.; Stankovskiy, A.
2017-09-01
In the frame of the FP7 European project MAXSIMA, an extensive simulation study has been carried out to assess the main shielding problems in view of the construction of the MYRRHA accelerator-driven system at SCK·CEN in Mol (Belgium). An innovative method based on the combined use of the two state-of-the-art Monte Carlo codes MCNPX and FLUKA has been used, with the goal of characterizing complex, realistic neutron fields around the core barrel, to be used as source terms in detailed analyses of the radiation fields due to the system in operation and of the coupled residual radiation. The main results of the shielding analysis are presented, as well as the construction of an activation database of all the key structural materials. The results demonstrate a powerful way to analyse shielding and activation problems, with direct and clear implications for the design solutions.
Investigation on demagnetization of Nd2Fe14B permanent magnets induced by irradiation
NASA Astrophysics Data System (ADS)
Li, Zhefu; Jia, Yanyan; Liu, Renduo; Xu, Yuhai; Wang, Guanghong; Xia, Xiaobin
2017-12-01
Nd2Fe14B is an important component of insertion devices, which are used in synchrotron radiation sources, and can be demagnetized by irradiation. In the present study, the Monte Carlo code FLUKA was used to analyze the irradiation field of Nd2Fe14B, and it was confirmed that neutrons are the main particles responsible for demagnetization. Nd2Fe14B permanent magnet samples were irradiated with Ar ions at different doses to simulate neutron irradiation damage. The hysteresis loops were measured using a vibrating sample magnetometer, and the microstructure evolution was characterized by transmission electron microscopy. Moreover, the relationship between them is discussed. The results indicate that the decrease in saturation magnetization is caused by changes in the microstructure. The evolution of single crystals into an amorphous structure is the reason for the demagnetization of Nd2Fe14B permanent magnets at the microscopic level.
Full 3D visualization tool-kit for Monte Carlo and deterministic transport codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frambati, S.; Frignani, M.
2012-07-01
We propose a package of tools capable of translating the geometric inputs and outputs of many Monte Carlo and deterministic radiation transport codes into open source file formats. These tools are aimed at bridging the gap between trusted, widely used radiation analysis codes and very powerful, more recent and commonly used visualization software, thus supporting the design process and helping with shielding optimization. Three main lines of development were followed: mesh-based analysis of Monte Carlo codes, mesh-based analysis of deterministic codes, and Monte Carlo surface meshing. The developed kit is considered a powerful and cost-effective tool for computer-aided design for radiation transport code users in the nuclear field, in particular in core design and radiation analysis.
NASA Astrophysics Data System (ADS)
Agosteo, S.; Birattari, C.; Dimovasili, E.; Foglio Para, A.; Silari, M.; Ulrici, L.; Vincke, H.
2005-02-01
The neutron emission from 50 mm thick copper, silver and lead targets bombarded by a mixed proton/pion beam with a momentum of 40 GeV/c was measured at the CERN Super Proton Synchrotron. The neutron yield and spectral fluence per incident particle on target were measured with an extended range Bonner sphere spectrometer in the angular range 30-135° with respect to the beam direction. Monte Carlo simulations with the FLUKA code were performed to provide a priori information for the unfolding of the experimental data. The spectral fluences show two peaks, an isotropic evaporation component centred at 3 MeV and a high-energy peak around 100-150 MeV. The experimental neutron yields are given in four energy bins: <100 keV, 0.1-20 MeV, 20-500 MeV and 0.5-2 GeV. The total yields show a systematic discrepancy of 30-50%, with a peak of 70% at the largest angles, with respect to the results of the Monte Carlo simulations, which is believed to be mainly due to uncertainties in the beam normalization factor. Analytic expressions are given for the variation of the integral yield as a function of emission angle and target mass number.
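As an illustration of the unfolding step mentioned above, the sketch below solves a toy Bonner-sphere problem with a non-negative least-squares fit. The response matrix and counts are random placeholders, and a real unfolding would fold in the FLUKA-calculated a priori spectrum rather than rely on plain NNLS.

import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(42)
n_spheres, n_energy_bins = 8, 12
response = rng.uniform(0.1, 1.0, size=(n_spheres, n_energy_bins))   # placeholder response matrix
true_spectrum = rng.uniform(0.0, 5.0, size=n_energy_bins)           # placeholder spectral fluence
counts = response @ true_spectrum + rng.normal(0.0, 0.05, size=n_spheres)

unfolded, residual = nnls(response, counts)
print("unfolded spectrum:", np.round(unfolded, 2))
print("residual norm    :", round(residual, 3))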
(U) Introduction to Monte Carlo Methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hungerford, Aimee L.
2017-03-20
Monte Carlo methods are very valuable for representing solutions to particle transport problems. Here we describe a “cook book” approach to handling the terms in a transport equation using Monte Carlo methods. Focus is on the mechanics of a numerical Monte Carlo code, rather than the mathematical foundations of the method.
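In the same "cook book" spirit, a minimal sketch of the mechanics of an analog Monte Carlo history loop (sample a free flight from the exponential distribution, then sample the collision type from the cross-section ratio) is given below for a toy one-dimensional slab; all cross sections and dimensions are arbitrary illustration values.

import math
import random

random.seed(0)
sigma_t, sigma_s = 0.8, 0.5        # total and scattering macroscopic cross sections (1/cm)
slab = 10.0                        # slab thickness (cm)
n_histories = 50_000
tally = {"absorbed": 0, "leaked": 0}

for _ in range(n_histories):
    x, mu = 0.0, 1.0                                             # start on the left face, moving right
    while True:
        x += mu * (-math.log(1.0 - random.random()) / sigma_t)   # free-flight sampling
        if x < 0.0 or x > slab:
            tally["leaked"] += 1
            break
        if random.random() < sigma_s / sigma_t:                  # scatter: sample a new direction cosine
            mu = 2.0 * random.random() - 1.0
        else:                                                    # otherwise the particle is absorbed
            tally["absorbed"] += 1
            break

print({event: count / n_histories for event, count in tally.items()})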
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kamp, Florian; Cabal, Gonzalo; Mairani, Andrea; Parodi, Katia; Wilkens, Jan J.; Carlson, David J.
2015-11-01
Purpose: The physical and biological differences between heavy ions and photons have not been fully exploited and could improve treatment outcomes. In carbon ion therapy, treatment planning must account for physical properties, such as the absorbed dose and nuclear fragmentation, and for differences in the relative biological effectiveness (RBE) of ions compared with photons. We combined the mechanistic repair-misrepair-fixation (RMF) model with Monte Carlo-generated fragmentation spectra for biological optimization of carbon ion treatment plans. Methods and Materials: Relative changes in double-strand break yields and radiosensitivity parameters with particle type and energy were determined using the independently benchmarked Monte Carlo damage simulation and the RMF model to estimate the RBE values for primary carbon ions and secondary fragments. Depth-dependent energy spectra were generated with the Monte Carlo code FLUKA for clinically relevant initial carbon ion energies. The predicted trends in RBE were compared with published experimental data. Biological optimization for carbon ions was implemented in a 3-dimensional research treatment planning tool. Results: We compared the RBE and RBE-weighted dose (RWD) distributions of different carbon ion treatment scenarios with and without nuclear fragments. The inclusion of fragments in the simulations led to smaller RBE predictions. A validation of RMF against measured cell survival data reported in published studies showed reasonable agreement. We calculated and optimized the RWD distributions on patient data and compared the RMF predictions with those from other biological models. The RBE values in an astrocytoma tumor ranged from 2.2 to 4.9 (mean 2.8) for a RWD of 3 Gy(RBE), assuming (α/β)_X = 2 Gy. Conclusions: These studies provide new information to quantify and assess uncertainties in the clinically relevant RBE values for carbon ion therapy based on biophysical mechanisms. We present results from the first biological optimization of carbon ion radiation therapy beams on patient data using a combined RMF and Monte Carlo damage simulation modeling approach. The presented method is advantageous for fast biological optimization.
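For context on quantities such as "3 Gy(RBE) with (α/β)_X = 2 Gy" quoted above, the sketch below evaluates the standard linear-quadratic relation between RBE, physical ion dose and RBE-weighted dose. The ion LQ parameters are invented placeholders and are not outputs of the RMF model.

import math

def rbe(d_ion, alpha_x, beta_x, alpha_ion, beta_ion):
    """RBE at ion dose d_ion: isoeffective photon dose divided by the ion dose."""
    effect = alpha_ion * d_ion + beta_ion * d_ion ** 2        # -ln(survival) for the ion dose
    d_photon = (-alpha_x + math.sqrt(alpha_x ** 2 + 4.0 * beta_x * effect)) / (2.0 * beta_x)
    return d_photon / d_ion

alpha_x, beta_x = 0.10, 0.05        # photon LQ parameters, giving (alpha/beta)_X = 2 Gy
alpha_ion, beta_ion = 0.35, 0.05    # hypothetical carbon-ion LQ parameters
d_ion = 1.1                         # physical ion dose per fraction (Gy), placeholder
r = rbe(d_ion, alpha_x, beta_x, alpha_ion, beta_ion)
print(f"RBE = {r:.2f}, RBE-weighted dose = {r * d_ion:.2f} Gy(RBE)")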
Energy deposition and thermal effects of runaway electrons in ITER-FEAT plasma facing components
NASA Astrophysics Data System (ADS)
Maddaluno, G.; Maruccia, G.; Merola, M.; Rollet, S.
2003-03-01
The profile of energy deposited by runaway electrons (RAEs) of 10 or 50 MeV in International Thermonuclear Experimental Reactor-Fusion Energy Advanced Tokamak (ITER-FEAT) plasma facing components (PFCs) and the subsequent temperature pattern have been calculated by using the Monte Carlo code FLUKA and the finite element heat conduction code ANSYS. The RAE energy deposition density was assumed to be 50 MJ/m^2 and both 10 and 100 ms deposition times were considered. Five different configurations of PFCs were investigated: primary first wall armoured with Be, with and without protecting CFC poloidal limiters, both port limiter first wall options (Be flat tile and CFC monoblock), and divertor baffle first wall, armoured with W. The analysis has shown that for all the configurations but one (port limiter with Be flat tile) the heat sink and the cooling tube beneath the armour are well protected for both RAE energies and for both energy deposition times. On the other hand, extensive melting (W, Be) or sublimation (C) of the surface layer occurs, eventually affecting the PFC lifetime.
Neutron spectrometry with a monolithic silicon telescope.
Agosteo, S; D'Angelo, G; Fazzi, A; Para, A Foglio; Pola, A; Zotto, P
2007-01-01
A neutron spectrometer was set up by coupling a polyethylene converter with a monolithic silicon telescope, consisting of a ΔE and an E stage detector (about 2 and 500 μm thick, respectively). The detection system was irradiated with monoenergetic neutrons at INFN-Laboratori Nazionali di Legnaro (Legnaro, Italy). The maximum detectable energy, imposed by the thickness of the E stage, is about 8 MeV for the present detector. The scatter plots of the energy deposited in the two stages were acquired using two independent electronic chains. The distributions of the recoil protons are well discriminated from those due to secondary electrons for energies above 0.350 MeV. The experimental spectra of the recoil protons were compared with the results of Monte Carlo simulations using the FLUKA code. An analytical model that takes into account the geometrical structure of the silicon telescope was developed, validated and implemented in an unfolding code. The capability of reproducing continuous neutron spectra was investigated by irradiating the detector with neutrons from a thick beryllium target bombarded with protons. The measured spectra were compared with data taken from the literature. Satisfactory agreement was found.
The Energy Spectra of Heavy Nuclei Measured by the ATIC Experiment
NASA Technical Reports Server (NTRS)
Panov, A. D.; Adams, J. H.; Ahn, H. S.; Bashindzhagyan, G. L.; Batkov, K. E.; Chang, J.; Christl, M.; Fazley, A. R.; Ganel, O.; Gunasingha, R. M.
2004-01-01
ATIC (Advanced Thin Ionization Calorimeter) is a balloon-borne experiment to measure the spectra and composition of primary cosmic rays in the region of total energy from 100 GeV to near 100 TeV for Z from 1 to 26. ATIC consists of a pixelated silicon matrix detector to measure charge plus a fully active BGO calorimeter, to measure energy, located below a carbon target interleaved with three layers of scintillator hodoscope. The ATIC instrument had a second (scientific) flight from McMurdo, Antarctica from 12/29/02 to 1/18/03, yielding 20 days of good data. The GEANT 3.21 Monte Carlo code with the QGSM event generator and the FLUKA code with the DPMJET-II event generator were used to convert energy deposition measurements to primary energy. We present the preliminary energy spectra for the abundant elements C, O, Ne, Mg, Si and Fe and compare them with the results of the first (test) flight of ATIC in 2000-01 and with results from the HEAO-3 and CRN experiments.
Interfacing MCNPX and McStas for simulation of neutron transport
NASA Astrophysics Data System (ADS)
Klinkby, Esben; Lauritzen, Bent; Nonbøl, Erik; Kjær Willendrup, Peter; Filges, Uwe; Wohlmuther, Michael; Gallmeier, Franz X.
2013-02-01
Simulations of the target-moderator-reflector system at spallation sources are conventionally carried out using Monte Carlo codes such as MCNPX (Waters et al., 2007 [1]) or FLUKA (Battistoni et al., 2007; Ferrari et al., 2005 [2,3]), whereas simulations of neutron transport from the moderator and of the instrument response are performed by neutron ray-tracing codes such as McStas (Lefmann and Nielsen, 1999; Willendrup et al., 2004, 2011a,b [4-7]). The coupling between the two simulation suites typically consists of providing analytical fits of MCNPX neutron spectra to McStas. This method is generally successful but has limitations, as it does not, for example, allow for re-entry of neutrons into the MCNPX regime. Previous work to resolve such shortcomings includes the introduction of McStas-inspired supermirrors in MCNPX. In the present paper, different approaches to interfacing MCNPX and McStas are presented and applied to a simple test case. The direct coupling between MCNPX and McStas allows for more accurate simulations of, for example, complex moderator geometries, backgrounds, interference between beam-lines, as well as shielding requirements along the neutron guides.
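A hedged sketch of the "analytical fit" style of coupling described above is shown below: a thermal moderator spectrum is fitted with a Maxwellian flux distribution, whose fitted parameters a ray-tracing source component could then reproduce. The binned spectrum is synthetic, not an MCNPX tally.

import numpy as np
from scipy.optimize import curve_fit

def maxwellian_flux(E, phi0, kT):
    """Maxwellian flux density: phi0 * E / kT**2 * exp(-E / kT)."""
    return phi0 * E / kT**2 * np.exp(-E / kT)

E = np.linspace(1e-3, 0.4, 60)                  # energy bins (eV)
true_phi0, true_kT = 1.0e12, 0.025              # synthetic "moderator" parameters
noise = 1.0 + 0.03 * np.random.default_rng(3).normal(size=E.size)
scored = maxwellian_flux(E, true_phi0, true_kT) * noise

(phi0_fit, kT_fit), _ = curve_fit(maxwellian_flux, E, scored, p0=(1e11, 0.05))
print(f"fitted phi0 = {phi0_fit:.3e}, kT = {kT_fit * 1000:.1f} meV")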
Sechopoulos, Ioannis; Ali, Elsayed S M; Badal, Andreu; Badano, Aldo; Boone, John M; Kyprianou, Iacovos S; Mainegra-Hing, Ernesto; McMillan, Kyle L; McNitt-Gray, Michael F; Rogers, D W O; Samei, Ehsan; Turner, Adam C
2015-10-01
The use of Monte Carlo simulations in diagnostic medical imaging research is widespread due to its flexibility and ability to estimate quantities that are challenging to measure empirically. However, any new Monte Carlo simulation code needs to be validated before it can be used reliably. The type and degree of validation required depends on the goals of the research project, but, typically, such validation involves either comparison of simulation results to physical measurements or to previously published results obtained with established Monte Carlo codes. The former is complicated due to nuances of experimental conditions and uncertainty, while the latter is challenging due to typical graphical presentation and lack of simulation details in previous publications. In addition, entering the field of Monte Carlo simulations in general involves a steep learning curve. It is not a simple task to learn how to program and interpret a Monte Carlo simulation, even when using one of the publicly available code packages. This Task Group report provides a common reference for benchmarking Monte Carlo simulations across a range of Monte Carlo codes and simulation scenarios. In the report, all simulation conditions are provided for six different Monte Carlo simulation cases that involve common x-ray based imaging research areas. The results obtained for the six cases using four publicly available Monte Carlo software packages are included in tabular form. In addition to a full description of all simulation conditions and results, a discussion and comparison of results among the Monte Carlo packages and the lessons learned during the compilation of these results are included. This abridged version of the report includes only an introductory description of the six cases and a brief example of the results of one of the cases. This work provides an investigator the necessary information to benchmark his/her Monte Carlo simulation software against the reference cases included here before performing his/her own novel research. In addition, an investigator entering the field of Monte Carlo simulations can use these descriptions and results as a self-teaching tool to ensure that he/she is able to perform a specific simulation correctly. Finally, educators can assign these cases as learning projects as part of course objectives or training programs.
Poster - 40: Treatment Verification of a 3D-printed Eye Phantom for Proton Therapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dunning, Chelsea; Lindsay, Clay; Unick, Nick
Purpose: Ocular melanoma is a form of eye cancer which is often treated using proton therapy. The benefit of the steep proton dose gradient can only be leveraged with accurate patient eye alignment. A treatment-planning program was written to plan on a 3D-printed anatomical eye phantom, which was then irradiated to demonstrate the feasibility of verifying in vivo dosimetry for proton therapy using PET imaging. Methods: A 3D CAD eye model with critical organs was designed and voxelized into the Monte Carlo transport code FLUKA. Proton dose and PET isotope production were simulated for a treatment plan of a test tumour, generated by a 2D treatment-planning program developed using NumPy and proton range tables. Next, a plastic eye phantom was 3D-printed from the CAD model, irradiated at the TRIUMF Proton Therapy facility, and imaged using a PET scanner. Results: The treatment-planning program's prediction of the range setting and modulator wheel was verified in FLUKA to treat the tumour with at least 90% dose coverage for both tissue and plastic. An axial distribution of the PET isotopes was simulated in FLUKA and converted to PET scan counts. Meanwhile, the 3D-printed eye phantom successfully yielded a PET signal. Conclusions: The 2D treatment-planning program can predict the parameters required to sufficiently treat an eye tumour, which was experimentally verified using commercial 3D-printing hardware to manufacture eye phantoms. Comparison between the simulated and measured PET isotope distributions could provide a more realistic test of eye alignment, and a variation of the method using radiographic film is being developed.
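A minimal sketch of the range-table lookup at the heart of a simple 2D eye-treatment planner like the one described above is given below: the beam energy is chosen so that the range covers the distal edge of the tumour, and the modulation spans the proximal-to-distal extent. The range-table values are coarse approximations and the planning logic is an assumption for illustration, not the authors' code.

import numpy as np

# Coarse illustrative proton range-in-water table (energy in MeV, range in mm).
energies = np.array([30.0, 40.0, 50.0, 60.0, 70.0])
ranges_mm = np.array([8.8, 14.8, 22.2, 30.9, 40.8])

def plan(distal_depth_mm, proximal_depth_mm, margin_mm=1.0):
    """Return a beam energy and modulation width covering the requested depth interval."""
    required_range = distal_depth_mm + margin_mm
    energy = float(np.interp(required_range, ranges_mm, energies))
    modulation = (distal_depth_mm - proximal_depth_mm) + margin_mm
    return energy, modulation

energy, modulation = plan(distal_depth_mm=21.0, proximal_depth_mm=14.0)
print(f"beam energy ~ {energy:.1f} MeV, modulation width ~ {modulation:.1f} mm")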
PyMercury: Interactive Python for the Mercury Monte Carlo Particle Transport Code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Iandola, F N; O'Brien, M J; Procassini, R J
2010-11-29
Monte Carlo particle transport applications are often written in low-level languages (C/C++) for optimal performance on clusters and supercomputers. However, this development approach often sacrifices straightforward usability and testing in the interest of fast application performance. To improve usability, some high-performance computing applications employ mixed-language programming with high-level and low-level languages. In this study, we consider the benefits of incorporating an interactive Python interface into a Monte Carlo application. With PyMercury, a new Python extension to the Mercury general-purpose Monte Carlo particle transport code, we improve application usability without diminishing performance. In two case studies, we illustrate how PyMercury improves usability and simplifies testing and validation in a Monte Carlo application. In short, PyMercury demonstrates the value of interactive Python for Monte Carlo particle transport applications. In the future, we expect interactive Python to play an increasingly significant role in Monte Carlo usage and testing.
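The mixed-language pattern described here can be sketched generically: a compiled routine is exposed to Python so that quick parameter sweeps and regression tests can be scripted interactively. The sketch below wraps a function from the C math library purely as a stand-in for a low-level transport kernel; it illustrates the idea only and does not reproduce PyMercury's actual interface.

```python
import ctypes
import ctypes.util

# Wrap a routine from a compiled C library (libm here, used only as a stand-in
# for a low-level transport kernel) so it can be driven interactively from Python.
libm = ctypes.CDLL(ctypes.util.find_library("m"))
libm.cos.argtypes = [ctypes.c_double]
libm.cos.restype = ctypes.c_double

def azimuthal_cosine(xi):
    """Toy 'kernel call': cos(2*pi*xi) for a random azimuthal scattering angle,
    evaluated by the compiled C routine rather than in Python."""
    return libm.cos(2.0 * 3.141592653589793 * xi)

# Interactive use: the kind of quick sweep an embedded Python layer makes cheap to script.
print([round(azimuthal_cosine(x), 3) for x in (0.0, 0.25, 0.5, 0.75)])
```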
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stewart, R; Streitmatter, S; Traneus, E
2016-06-15
Purpose: Validate the implementation of a published RBE model for DSB induction (RBE_DSB) in several general-purpose Monte Carlo (MC) code systems and the RayStation™ treatment planning system (TPS). For protons and other light ions, DSB induction is a critical initiating molecular event that correlates well with the RBE for cell survival. Methods: An efficient algorithm to incorporate information on proton and light ion RBE_DSB from the independently tested Monte Carlo Damage Simulation (MCDS) has now been integrated into MCNP (Stewart et al. PMB 60, 8249–8274, 2015), FLUKA, TOPAS and a research build of the RayStation™ TPS. To cross-validate the RBE_DSB model implementation, LET distributions, depth-dose and lateral (dose and RBE_DSB) profiles for monodirectional monoenergetic (100 to 200 MeV) protons incident on a water phantom are compared. The effects of recoil and secondary ion production (2H+, 3H+, 3He2+, 4He2+), spot size (3 and 10 mm), and transport physics on beam profiles and RBE_DSB are examined. Results: Depth-dose and RBE_DSB profiles among all of the MC models are in excellent agreement using a 1 mm distance criterion (width of a voxel). For a 100 MeV proton beam (10 mm spot), RBE_DSB = 1.2 ± 0.03 (2–3%) at the tip of the Bragg peak and increases to 1.59 ± 0.3 two mm distal to the Bragg peak. RBE_DSB tends to decrease as the kinetic energy of the incident proton increases. Conclusion: The model for proton RBE_DSB has been accurately implemented into FLUKA, MCNP, TOPAS and the RayStation™ TPS. The transport of secondary light ions (Z > 1) has a significant impact on RBE_DSB, especially distal to the Bragg peak, although light ions have a small effect on (dose × RBE_DSB) profiles. The ability to incorporate spatial variations in proton RBE within a TPS creates new opportunities to individualize treatment plans and increase the therapeutic ratio. Dr. Erik Traneus is employed full-time as a Research Scientist at RaySearch Laboratories. The research build of the RayStation used in the study was made available to the University of Washington free of charge. RaySearch Laboratories did not provide any monetary support for the reported studies.
Palmans, H; Al-Sulaiti, L; Andreo, P; Shipley, D; Lühr, A; Bassler, N; Martinkovič, J; Dobrovodský, J; Rossomme, S; Thomas, R A S; Kacperek, A
2013-05-21
The conversion of absorbed dose-to-graphite in a graphite phantom to absorbed dose-to-water in a water phantom is performed using water-to-graphite stopping power ratios. If, however, the charged particle fluence is not equal at equivalent depths in graphite and water, a fluence correction factor, k_fl, is required as well. This is particularly relevant to the derivation of absorbed dose-to-water, the quantity of interest in radiotherapy, from a measurement of absorbed dose-to-graphite obtained with a graphite calorimeter. In this work, fluence correction factors for the conversion from dose-to-graphite in a graphite phantom to dose-to-water in a water phantom for 60 MeV mono-energetic protons were calculated using an analytical model and five different Monte Carlo codes (Geant4, FLUKA, MCNPX, SHIELD-HIT and McPTRAN.MEDIA). In general, the fluence correction factors are found to be close to unity, and the analytical and Monte Carlo codes give consistent values when considering the differences in secondary particle transport. When considering only protons, the fluence correction factors are unity at the surface and increase with depth by 0.5% to 1.5% depending on the code. When the fluence of all charged particles is considered, the fluence correction factor is about 0.5% lower than unity at shallow depths, predominantly due to the contributions from alpha particles, and increases to values above unity near the Bragg peak. Fluence correction factors directly derived from the fluence distributions differential in energy at equivalent depths in water and graphite can be described by k_fl = 0.9964 + 0.0024·z_w-eq with a relative standard uncertainty of 0.2%. Fluence correction factors derived from a ratio of calculated doses at equivalent depths in water and graphite can be described by k_fl = 0.9947 + 0.0024·z_w-eq with a relative standard uncertainty of 0.3%. These results are of direct relevance to graphite calorimetry in low-energy protons, but given that the fluence correction factor is almost solely influenced by non-elastic nuclear interactions, the results are also relevant for plastic phantoms that consist of carbon, oxygen and hydrogen atoms, as well as for soft tissues.
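The two linear fits quoted above are easy to evaluate at any water-equivalent depth; the short example below does so, treating z_w-eq as the water-equivalent depth in the units used by the authors (the abstract does not restate them, so the numerical depths are illustrative only).

```python
def k_fl_fluence(z_w_eq):
    """Fit derived from fluence distributions (relative standard uncertainty 0.2%)."""
    return 0.9964 + 0.0024 * z_w_eq

def k_fl_dose(z_w_eq):
    """Fit derived from ratios of calculated doses (relative standard uncertainty 0.3%)."""
    return 0.9947 + 0.0024 * z_w_eq

# Illustrative depths spanning roughly the ~3 cm range of 60 MeV protons in water.
for z in (0.0, 1.0, 2.0, 3.0):
    print(z, round(k_fl_fluence(z), 4), round(k_fl_dose(z), 4))
```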
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clovas, A.; Zanthos, S.; Antonopoulos-Domis, M.
2000-03-01
The dose rate conversion factors Ḋ_CF (absorbed dose rate in air per unit activity per unit of soil mass, nGy h^-1 per Bq kg^-1) are calculated 1 m above ground for photon emitters of natural radionuclides uniformly distributed in the soil. Three Monte Carlo codes are used: (1) the MCNP code of Los Alamos; (2) the GEANT code of CERN; and (3) a Monte Carlo code developed in the Nuclear Technology Laboratory of the Aristotle University of Thessaloniki. The accuracy of the Monte Carlo results is tested by comparing the unscattered flux obtained by the three Monte Carlo codes with an independent straightforward calculation. All codes, and particularly the MCNP code, calculate accurately the absorbed dose rate in air due to the unscattered radiation. For the total radiation (unscattered plus scattered), the Ḋ_CF values calculated from the three codes are in very good agreement with one another. The comparison between these results and the results deduced previously by other authors indicates good agreement (differences of less than 15%) for photon energies above 1,500 keV. In contrast, the agreement is not as good (differences of 20-30%) for the low energy photons.
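The "independent straightforward calculation" of the unscattered flux is not spelled out in the abstract; a standard textbook benchmark of this kind is the uncollided flux at height h above a half-space containing a uniformly distributed volume source, phi = S_v / (2*mu_soil) * E2(mu_air*h). The sketch below evaluates that expression; the attenuation coefficients and source values are illustrative assumptions, not the authors' inputs.

```python
from scipy.special import expn   # generalized exponential integral E_n

def uncollided_flux(a_mass, rho_soil, photons_per_decay, mu_soil, mu_air, h_cm):
    """Uncollided photon flux (cm^-2 s^-1) at height h above a uniformly
    contaminated half-space: phi = S_v / (2*mu_soil) * E2(mu_air*h)."""
    s_v = a_mass * rho_soil * photons_per_decay   # photons cm^-3 s^-1
    return s_v / (2.0 * mu_soil) * expn(2, mu_air * h_cm)

# Illustrative values, roughly representative of a ~0.6 MeV photon line,
# 1 Bq/kg specific activity (= 1e-3 Bq/g) and a 1.6 g/cm^3 soil.
print(uncollided_flux(a_mass=1e-3, rho_soil=1.6, photons_per_decay=0.85,
                      mu_soil=0.13, mu_air=1.0e-4, h_cm=100.0))
```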
Bassler, Niels; Kantemiris, Ioannis; Karaiskos, Pantelis; Engelke, Julia; Holzscheiter, Michael H; Petersen, Jørgen B
2010-04-01
Antiprotons have been suggested as a possibly superior modality for radiotherapy, due to the energy released when antiprotons annihilate, which enhances the Bragg peak and introduces a high-LET component to the dose. However, concerns have been expressed about the inferior lateral dose distribution caused by the annihilation products. We use the Monte Carlo code FLUKA to generate depth-dose kernels for protons, antiprotons, and carbon ions. Using these, we then build virtual treatment plans optimized according to ICRU recommendations for the different beam modalities, which are then recalculated with FLUKA. Dose-volume histograms generated from these plans can be used to compare the different irradiations. The enhancement in physical and possibly biological dose from annihilating antiprotons can significantly lower the dose in the entrance channel, but only at the expense of a diffuse low-dose background from long-range secondary particles. Lateral dose distributions are improved using active beam delivery methods instead of flat fields. Dose-volume histograms for different treatment scenarios show that antiprotons have the potential to reduce the volume of normal tissue receiving medium to high dose; however, in the low-dose region antiprotons are inferior to both protons and carbon ions. This limits the potential usage to situations where dose to normal tissue must be reduced as much as possible.
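The dose-volume histograms used for such comparisons are straightforward to compute once a 3D dose grid and a structure mask are available; the function below is a generic illustration of a cumulative DVH, not code from the study.

```python
import numpy as np

def cumulative_dvh(dose, mask, n_levels=100):
    """Cumulative DVH: fraction of the masked structure receiving at least each
    dose level, for a 3D dose array and a boolean structure mask."""
    d = dose[mask]
    levels = np.linspace(0.0, d.max(), n_levels)
    volume_fraction = np.array([(d >= lv).mean() for lv in levels])
    return levels, volume_fraction

# Toy example: a random dose grid and a whole-volume 'normal tissue' mask.
rng = np.random.default_rng(0)
dose = rng.random((32, 32, 32))
levels, vf = cumulative_dvh(dose, np.ones_like(dose, dtype=bool))
```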
Criticality Calculations with MCNP6 - Practical Lectures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.; Rising, Michael Evan; Alwin, Jennifer Louise
2016-11-29
These slides are used to teach MCNP (Monte Carlo N-Particle) usage to nuclear criticality safety analysts. The following are the lecture topics: course information, introduction, MCNP basics, criticality calculations, advanced geometry, tallies, adjoint-weighted tallies and sensitivities, physics and nuclear data, parameter studies, NCS validation I, NCS validation II, NCS validation III, case study 1 - solution tanks, case study 2 - fuel vault, case study 3 - B&W core, case study 4 - simple TRIGA, case study 5 - fissile mat. vault, criticality accident alarm systems. After completion of this course, you should be able to: develop an input model for MCNP; describe how cross section data impact Monte Carlo and deterministic codes; describe the importance of validation of computer codes and how it is accomplished; describe the methodology supporting Monte Carlo codes and deterministic codes; describe pitfalls of Monte Carlo calculations; discuss the strengths and weaknesses of Monte Carlo and discrete ordinates codes. The diffusion theory model is not strictly valid for treating fissile systems in which neutron absorption, voids, and/or material boundaries are present; in the context of these limitations, identify a fissile system for which a diffusion theory solution would be adequate.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, T; Lin, H; Xu, X
Purpose: To develop a nuclear medicine dosimetry module for the GPU-based Monte Carlo code ARCHER. Methods: We have developed a nuclear medicine dosimetry module for the fast Monte Carlo code ARCHER. The coupled electron-photon Monte Carlo transport kernel included in ARCHER is built upon the Dose Planning Method code (DPM). The developed module manages the radioactive decay simulation by consecutively tracking several types of radiation on a per-disintegration basis using the statistical sampling method. Optimization techniques such as persistent threads and prefetching are studied and implemented. The developed module is verified against the VIDA code, which is based on the Geant4 toolkit and has previously been verified against OLINDA/EXM. A voxelized geometry is used in the preliminary test: a sphere made of ICRP soft tissue is surrounded by a box filled with water. A uniform activity distribution of I-131 is assumed in the sphere. Results: The self-absorption dose factors (mGy/MBq·s) of the sphere with varying diameters are calculated by ARCHER and VIDA, respectively. ARCHER's results are in agreement with VIDA's, which were obtained from a previous publication. VIDA takes hours of CPU time to finish the computation, while ARCHER takes 4.31 seconds for the 12.4-cm uniform activity sphere case. For a fairer CPU-GPU comparison, more effort will be made to eliminate the algorithmic differences. Conclusion: The coupled electron-photon Monte Carlo code ARCHER has been extended to radioactive decay simulation for nuclear medicine dosimetry. The developed code exhibits good performance in our preliminary test. The GPU-based Monte Carlo code is developed with grant support from the National Institute of Biomedical Imaging and Bioengineering through an R01 grant (R01EB015478).
MATSIM: Development of a Voxel Model of the MATROSHKA Astronaut Dosimetric Phantom
NASA Astrophysics Data System (ADS)
Beck, Peter; Zechner, Andrea; Rollet, Sofia; Berger, Thomas; Bergmann, Robert; Hajek, Michael; Hranitzky, Christian; Latocha, Marcin; Reitz, Günther; Stadtmann, Hannes; Vana, Norbert; Wind, Michael
2011-08-01
The AIT Austrian Institute of Technology coordinates the project MATSIM (MATROSHKA Simulation) in collaboration with the Vienna University of Technology and the German Aerospace Center, to perform FLUKA Monte Carlo simulations of the MATROSHKA numerical phantom irradiated under reference radiation field conditions as well as for the radiation environment at the International Space Station (ISS). MATSIM is carried out as a co-investigation of the ESA ELIPS projects SORD and RADIS (commonly known as MATROSHKA), an international collaboration of more than 18 research institutes and space agencies from all over the world, under the science and project lead of the German Aerospace Center. During MATSIM, a computed tomography scan of the MATROSHKA phantom was converted into a high-resolution 3-dimensional voxel model. The energy imparted and the absorbed dose distribution inside the model are determined for various radiation fields. The major goal of the MATSIM project is the validation of the numerical model under reference radiation conditions and further investigations under the radiation environment at the ISS. In this report we compare depth dose distributions inside the phantom, measured with thermoluminescence detectors (TLDs) and an ionization chamber, with FLUKA Monte Carlo particle transport simulations of 60Co photon exposure. Further reference irradiations with neutrons, protons and heavy ions are planned. The fully validated numerical model MATSIM will provide a perfect tool to assess the radiation exposure to humans during current and future space missions to the ISS, Moon, Mars and beyond.
Recent advances and future prospects for Monte Carlo
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B
2010-01-01
The history of Monte Carlo methods is closely linked to that of computers: the first known Monte Carlo program was written in 1947 for the ENIAC; a pre-release of the first Fortran compiler was used for Monte Carlo in 1957; Monte Carlo codes were adapted to vector computers in the 1980s, clusters and parallel computers in the 1990s, and teraflop systems in the 2000s. Recent advances include hierarchical parallelism, combining threaded calculations on multicore processors with message-passing among different nodes. With the advances in computing, Monte Carlo codes have evolved with new capabilities and new ways of use. Production codes such as MCNP, MVP, MONK, TRIPOLI and SCALE are now 20-30 years old (or more) and are very rich in advanced features. The former 'method of last resort' has now become the first choice for many applications. Calculations are now routinely performed on office computers, not just on supercomputers. Current research and development efforts are investigating the use of Monte Carlo methods on FPGAs, GPUs, and many-core processors. Other far-reaching research is exploring ways to adapt Monte Carlo methods to future exaflop systems that may have 1M or more concurrent computational processes.
Space Radiation Transport Codes: A Comparative Study for Galactic Cosmic Rays Environment
NASA Astrophysics Data System (ADS)
Tripathi, Ram; Wilson, John W.; Townsend, Lawrence W.; Gabriel, Tony; Pinsky, Lawrence S.; Slaba, Tony
For long duration and/or deep space human missions, protection from severe space radiation exposure is a challenging design constraint and may be a potential limiting factor. The space radiation environment consists of galactic cosmic rays (GCR), solar particle events (SPE), and trapped radiation, and includes ions of all the known elements over a very broad energy range. These ions penetrate spacecraft materials, producing nuclear fragments and secondary particles that damage biological tissues, microelectronic devices, and materials. In deep space missions, where the Earth's magnetic field does not provide protection from space radiation, the GCR environment is significantly enhanced due to the absence of geomagnetic cut-off and is a major component of radiation exposure. Accurate risk assessments critically depend on the accuracy of the input information as well as on the radiation transport codes used, and so systematic verification of codes is necessary. In this study, comparisons are made between the deterministic code HZETRN2006 and the Monte Carlo codes HETC-HEDS and FLUKA for an aluminum shield followed by a water target exposed to the 1977 solar minimum GCR spectrum. The interaction and transport of the high-charge ions present in the GCR environment provide a more stringent test for the comparison of the codes. Dose, dose equivalent and flux spectra are compared; details of the comparisons will be discussed, and conclusions will be drawn for future directions.
Simulation of Nuclear Reactor Kinetics by the Monte Carlo Method
NASA Astrophysics Data System (ADS)
Gomin, E. A.; Davidenko, V. D.; Zinchenko, A. S.; Kharchenko, I. K.
2017-12-01
The KIR computer code intended for calculations of nuclear reactor kinetics using the Monte Carlo method is described. The algorithm implemented in the code is described in detail. Some results of test calculations are given.
Monte Carlo tests of the ELIPGRID-PC algorithm
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davidson, J.R.
1995-04-01
The standard tool for calculating the probability of detecting pockets of contamination called hot spots has been the ELIPGRID computer code of Singer and Wickman. The ELIPGRID-PC program has recently made this algorithm available for an IBM® PC. However, no known independent validation of the ELIPGRID algorithm exists. This document describes a Monte Carlo simulation-based validation of a modified version of the ELIPGRID-PC code. The modified ELIPGRID-PC code is shown to match Monte Carlo-calculated hot-spot detection probabilities to within ±0.5% for 319 out of 320 test cases. The one exception, a very thin elliptical hot spot located within a rectangular sampling grid, differed from the Monte Carlo-calculated probability by about 1%. These results provide confidence in the ability of the modified ELIPGRID-PC code to accurately predict hot-spot detection probabilities within an acceptable range of error.
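A Monte Carlo check of this kind can be sketched in a few lines: randomly place (and orient) an elliptical hot spot relative to a rectangular sampling grid and count how often at least one grid node falls inside it. The geometry parameters below are illustrative, not the test cases used in the report.

```python
import numpy as np

def detection_probability(a, b, gx, gy, n_trials=100_000, seed=1):
    """Monte Carlo estimate of the probability that a rectangular sampling grid
    with spacings (gx, gy) hits an elliptical hot spot with semi-axes (a, b)
    whose centre offset and orientation are random relative to the grid."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_trials):
        cx, cy = rng.uniform(0.0, gx), rng.uniform(0.0, gy)   # random centre offset
        phi = rng.uniform(0.0, np.pi)                         # random orientation
        # grid nodes near the centre are sufficient to test for a hit
        X, Y = np.meshgrid(np.arange(-2, 3) * gx - cx, np.arange(-2, 3) * gy - cy)
        u = X * np.cos(phi) + Y * np.sin(phi)                 # rotate into ellipse frame
        v = -X * np.sin(phi) + Y * np.cos(phi)
        if np.any((u / a) ** 2 + (v / b) ** 2 <= 1.0):
            hits += 1
    return hits / n_trials

# e.g. a thin ellipse sampled on a 10 x 8 rectangular grid (arbitrary units)
print(detection_probability(a=5.0, b=0.5, gx=10.0, gy=8.0))
```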
Measurement of antiproton annihilation on Cu, Ag and Au with emulsion films
NASA Astrophysics Data System (ADS)
Aghion, S.; Amsler, C.; Ariga, A.; Ariga, T.; Bonomi, G.; Bräunig, P.; Brusa, R. S.; Cabaret, L.; Caccia, M.; Caravita, R.; Castelli, F.; Cerchiari, G.; Comparat, D.; Consolati, G.; Demetrio, A.; Di Noto, L.; Doser, M.; Ereditato, A.; Evans, C.; Ferragut, R.; Fesel, J.; Fontana, A.; Gerber, S.; Giammarchi, M.; Gligorova, A.; Guatieri, F.; Haider, S.; Hinterberger, A.; Holmestad, H.; Huse, T.; Kawada, J.; Kellerbauer, A.; Kimura, M.; Krasnický, D.; Lagomarsino, V.; Lansonneur, P.; Lebrun, P.; Malbrunot, C.; Mariazzi, S.; Matveev, V.; Mazzotta, Z.; Müller, S. R.; Nebbia, G.; Nedelec, P.; Oberthaler, M.; Pacifico, N.; Pagano, D.; Penasa, L.; Petracek, V.; Pistillo, C.; Prelz, F.; Prevedelli, M.; Ravelli, L.; Rienaecker, B.; RØhne, O. M.; Rotondi, A.; Sacerdoti, M.; Sandaker, H.; Santoro, R.; Scampoli, P.; Simon, M.; Smestad, L.; Sorrentino, F.; Testera, G.; Tietje, I. C.; Vamosi, S.; Vladymyrov, M.; Widmann, E.; Yzombard, P.; Zimmer, C.; Zmeskal, J.; Zurlo, N.
2017-04-01
The characteristics of low energy antiproton annihilations on nuclei (e.g. hadronization and product multiplicities) are not well known, and Monte Carlo simulation packages that use different models provide different descriptions of the annihilation events. In this study, we measured the particle multiplicities resulting from antiproton annihilations on nuclei. The results were compared with predictions obtained using different models in the simulation tools GEANT4 and FLUKA. For this study, we exposed thin targets (Cu, Ag and Au) to a very low energy antiproton beam from CERN's Antiproton Decelerator, exploiting the secondary beamline available in the AEgIS experimental zone. The antiproton annihilation products were detected using emulsion films developed at the Laboratory of High Energy Physics in Bern, where they were analysed at the automatic microscope facility. The fragment multiplicity measured in this study is in good agreement with results obtained with FLUKA simulations for both minimally and heavily ionizing particles.
NASA Astrophysics Data System (ADS)
Infantino, Angelo; Alía, Rubén García; Besana, Maria Ilaria; Brugger, Markus; Cerutti, Francesco
2017-09-01
As part of its post-LHC high energy physics program, CERN is conducting a study for a new proton-proton collider, called the Future Circular Collider (FCC-hh), running at center-of-mass energies of up to 100 TeV in a new 100 km tunnel. The study includes a 90-350 GeV lepton collider (FCC-ee) as well as a lepton-hadron option (FCC-he). In this work, FLUKA Monte Carlo simulation was extensively used to perform a first evaluation of the radiation environment in areas critical for electronics in the FCC-hh tunnel. The model of the tunnel was created based on the original civil engineering studies already performed and further integrated into the existing FLUKA models of the beam line. The radiation levels in critical areas, such as the racks for electronics and cables, power converters, service areas, and local tunnel extensions, were evaluated.
NASA Astrophysics Data System (ADS)
Lin, Yi-Chun; Huang, Tseng-Te; Liu, Yuan-Hao; Chen, Wei-Lin; Chen, Yen-Fu; Wu, Shu-Wei; Nievaart, Sander; Jiang, Shiang-Huei
2015-06-01
The paired ionization chambers (ICs) technique is commonly employed to determine neutron and photon doses in radiology or radiotherapy neutron beams, where the neutron dose shows very strong dependence on the accuracy of the accompanying high energy photon dose. During the dose derivation, it is an important issue to evaluate the photon and electron response functions of the two commercially available ionization chambers, denoted as TE(TE) and Mg(Ar), used in our reactor-based epithermal neutron beam. Nowadays, most perturbation corrections for accurate dose determination and many treatment planning systems are based on the Monte Carlo technique. We used the general-purpose Monte Carlo codes MCNP5, EGSnrc, FLUKA and GEANT4 for benchmark verification among the codes and against carefully measured values, for a precise estimation of the chamber current from the absorbed dose rate of the cavity gas. Also, energy-dependent response functions of the two chambers were calculated in a parallel beam with mono-energies from 20 keV to 20 MeV for photons and electrons, using both an optimal simple spherical IC model and a detailed IC model. The measurements were performed in the well-defined (a) four primary M-80, M-100, M-120 and M-150 X-ray calibration fields, (b) primary 60Co calibration beam, (c) 6 MV and 10 MV photon and (d) 6 MeV and 18 MeV electron LINAC beams in hospital, and (e) BNCT clinical trials neutron beam. For the TE(TE) chamber, all codes were almost identical over the whole photon energy range. For the Mg(Ar) chamber, MCNP5 showed a lower response than the other codes for photon energies below 0.1 MeV and a similar response above 0.2 MeV (agreement within 5% in the simple spherical model). With increasing electron energy, the response difference between MCNP5 and the other codes became larger in both chambers. Compared with the measured currents, MCNP5 agreed with the measurement data within 5% for the 60Co, 6 MV, 10 MV, 6 MeV and 18 MeV LINAC beams, but for the Mg(Ar) chamber the deviations reached 7.8-16.5% for the X-ray beams below 120 kVp. In this study, where we are especially interested in BNCT doses and the low energy photon contribution is small enough to be neglected, the MCNP model is recognized as the most suitable to simulate the broadly distributed photon-electron and neutron energy responses of the paired ICs. Also, MCNP provides the best prediction of BNCT source adjustment by the detector's neutron and photon responses.
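In its simplest textbook form, the paired-chamber technique reduces to solving a 2x2 linear system for the photon and neutron dose components from the two chamber readings and their photon and neutron sensitivities. The sketch below shows that step only; the sensitivity and reading values are illustrative placeholders, not the MC-calculated response functions of this study.

```python
import numpy as np

def separate_doses(r_te, r_mg, k_te, h_te, k_mg, h_mg):
    """Solve the paired-chamber system (textbook form):
         r_te = k_te * D_gamma + h_te * D_n
         r_mg = k_mg * D_gamma + h_mg * D_n
    where k and h are the photon and neutron sensitivities of each chamber."""
    A = np.array([[k_te, h_te],
                  [k_mg, h_mg]])
    d_gamma, d_n = np.linalg.solve(A, np.array([r_te, r_mg]))
    return d_gamma, d_n

# Illustrative readings and sensitivities (TE(TE) responds to both components,
# Mg(Ar) mostly to photons); real values come from calibration and MC response functions.
print(separate_doses(r_te=1.00, r_mg=0.45, k_te=1.0, h_te=0.95, k_mg=1.0, h_mg=0.05))
```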
Nadar, M Y; Akar, D K; Rao, D D; Kulkarni, M S; Pradeepkumar, K S
2015-12-01
Assessment of intake of long-lived actinides via the inhalation pathway is carried out by lung monitoring of radiation workers inside a totally shielded steel room using sensitive detection systems such as a Phoswich and an array of HPGe detectors. In this paper, uncertainties in the lung activity estimation due to positional errors, chest wall thickness (CWT) and detector background variation are evaluated. First, calibration factors (CFs) of the Phoswich and an array of three HPGe detectors are estimated by incorporating the ICRP male thorax voxel phantom and the detectors in the Monte Carlo code FLUKA. CFs are estimated for a uniform source distribution in the lungs of the phantom for various photon energies. The variation in the CFs for positional errors of ±0.5, 1 and 1.5 cm in the horizontal and vertical directions along the chest is studied. The positional errors are also evaluated by resizing the voxel phantom. Combined uncertainties are estimated at different energies using the uncertainties due to CWT, detector positioning, detector background variation of an uncontaminated adult person and counting statistics, in the form of scattering factors (SFs). SFs are found to decrease with increasing energy. With the HPGe array, the highest SF of 1.84 is found at 18 keV; it reduces to 1.36 at 238 keV.
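Scattering factors of this kind are commonly treated as geometric standard deviations, and one widespread convention combines independent components as SF_tot = exp(sqrt(sum((ln SF_i)^2))). The abstract does not spell out the formula actually used, so the sketch below and its component values are illustrative only.

```python
import math

def combined_sf(*components):
    """Combine scattering factors treated as geometric standard deviations:
    SF_tot = exp( sqrt( sum( (ln SF_i)**2 ) ) )."""
    return math.exp(math.sqrt(sum(math.log(sf) ** 2 for sf in components)))

# Illustrative components: CWT, detector positioning, background variation,
# counting statistics.
print(round(combined_sf(1.3, 1.15, 1.10, 1.05), 2))
```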
SU-F-T-656: Monte Carlo Study On Air Activation Around a Medical Electron Linac
DOE Office of Scientific and Technical Information (OSTI.GOV)
Horst, F; GSI Helmholtz Centre for Heavy Ion Research, Darmstadt; Fehrenbacher, G
Purpose: In high energy photon therapy, several radiation protection issues result from photonuclear reactions. The activation of air - directly by photonuclear reactions as well as indirectly by capture of photoneutrons generated inside the linac head - is a major point of concern for the medical staff. The purpose of this study was to estimate the annual effective dose to medical workers due to activated air around a medical high energy electron linac by means of Monte Carlo simulations. Methods: The treatment head of a Varian Clinac in 18 MV-X mode as well as the surrounding concrete bunker were modeled, and the radiation transport was simulated using the Monte Carlo code FLUKA, starting from the primary electron striking the bremsstrahlung target. The activation yields in air from photo-disintegration of O-16 and N-14 nuclei as well as from neutron capture on Ar-40 nuclei were obtained from the simulations. The activation build-up, radioactive decay and air ventilation were studied using a mathematical model. The annual effective dose to workers was estimated by using published isotope-specific conversion factors. Results: The oxygen and nitrogen activation yields were, in contrast to the argon activation yield, found to be field size dependent. The impact of the treatment room ventilation on the different air activation products was investigated and quantified. An estimate with very conservative assumptions gave an annual effective dose to workers of < 1 mSv/a. Conclusion: From the results of this study it can be concluded that the contribution of air activation to the radiation exposure of medical workers should be negligible in modern photon therapy, especially when it is compared to the dose due to prompt neutrons and the activation of heavy solid materials such as the jaws and the collimators inside the linac head.
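The build-up/decay/ventilation model mentioned in the Methods can be illustrated with a simple one-compartment balance, dN/dt = P - (lambda + lambda_vent) N, whose beam-on solution is evaluated below. The production rate, ventilation rate and irradiation time are illustrative assumptions, not values from the study.

```python
import math

def activity_during_irradiation(production_rate, lam, vent_rate, t):
    """Activity A(t) = lam * N(t) of an air activation product with decay constant
    lam (s^-1), constant production rate (atoms/s) and ventilation removal rate
    vent_rate (s^-1), from the one-compartment balance dN/dt = P - (lam + vent)*N."""
    k = lam + vent_rate
    n_atoms = production_rate / k * (1.0 - math.exp(-k * t))
    return lam * n_atoms

# Illustrative: an N-13-like product (half-life ~10 min), 6 air changes per hour,
# 30 minutes of beam-on time.
lam = math.log(2) / (10 * 60)
print(activity_during_irradiation(production_rate=1.0e6, lam=lam,
                                  vent_rate=6.0 / 3600.0, t=30 * 60))
```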
Advanced Computational Methods for Monte Carlo Calculations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.
This course is intended for graduate students who already have a basic understanding of Monte Carlo methods. It focuses on advanced topics that may be needed for thesis research, for developing new state-of-the-art methods, or for working with modern production Monte Carlo codes.
Space Radiation Transport Code Development: 3DHZETRN
NASA Technical Reports Server (NTRS)
Wilson, John W.; Slaba, Tony C.; Badavi, Francis F.; Reddell, Brandon D.; Bahadori, Amir A.
2015-01-01
The space radiation transport code, HZETRN, has been used extensively for research, vehicle design optimization, risk analysis, and related applications. One of the simplifying features of the HZETRN transport formalism is the straight-ahead approximation, wherein all particles are assumed to travel along a common axis. This reduces the governing equation to one spatial dimension allowing enormous simplification and highly efficient computational procedures to be implemented. Despite the physical simplifications, the HZETRN code is widely used for space applications and has been found to agree well with fully 3D Monte Carlo simulations in many circumstances. Recent work has focused on the development of 3D transport corrections for neutrons and light ions (Z < 2) for which the straight-ahead approximation is known to be less accurate. Within the development of 3D corrections, well-defined convergence criteria have been considered, allowing approximation errors at each stage in model development to be quantified. The present level of development assumes the neutron cross sections have an isotropic component treated within N explicit angular directions and a forward component represented by the straight-ahead approximation. The N = 1 solution refers to the straight-ahead treatment, while N = 2 represents the bi-directional model in current use for engineering design. The figure below shows neutrons, protons, and alphas for various values of N at locations in an aluminum sphere exposed to a solar particle event (SPE) spectrum. The neutron fluence converges quickly in simple geometry with N > 14 directions. The improved code, 3DHZETRN, transports neutrons, light ions, and heavy ions under space-like boundary conditions through general geometry while maintaining a high degree of computational efficiency. A brief overview of the 3D transport formalism for neutrons and light ions is given, and extensive benchmarking results with the Monte Carlo codes Geant4, FLUKA, and PHITS are provided for a variety of boundary conditions and geometries. Improvements provided by the 3D corrections are made clear in the comparisons. Developments needed to connect 3DHZETRN to vehicle design and optimization studies will be discussed. Future theoretical development will relax the forward plus isotropic interaction assumption to more general angular dependence.
Design of a finger ring extremity dosemeter based on OSL readout of alpha-Al2O3:C.
Durham, J S; Zhang, X; Payne, F; Akselrod, M S
2002-01-01
A finger-ring dosemeter and reader have been designed that use OSL readout of alpha-Al2O3:C (aluminium oxide). The use of aluminium oxide is important because it allows the sensitive element of the dosemeter to be a very thin layer that reduces the beta and gamma energy dependence to acceptable levels without compromising the required sensitivity for dose measurement. OSL readout allows the ring dosemeter to be interrogated with minimal disassembly. The ring dosemeter consists of three components: aluminium oxide powder for measurement of dose, an aluminium substrate that gives structure to the ring, and an aluminised Mylar cover to prevent the aluminium oxide from exposure to light. The thicknesses of the three components have been optimised for beta response using the Monte Carlo computer code FLUKA. A reader was also designed and developed that allows the dosemeter to be read after removing the Mylar. Future efforts are discussed.
Double-layer neutron shield design as neutron shielding application
NASA Astrophysics Data System (ADS)
Sariyer, Demet; Küçer, Rahmi
2018-02-01
Shield design at particle accelerators and other high energy facilities is mainly driven by high-energy neutrons. The deep penetration of neutrons through massive shields has become a very serious problem. For shielding to be efficient, most of these neutrons should be confined to the shielding volume. If interior space is limited, a multilayer shield of sufficient thickness must be used. Concrete and iron are widely used as multilayer shield materials. A two-layer shield was selected to guarantee radiation safety outside the shield against neutrons generated in interactions at different proton energies. The first layer was one metre of concrete; the second was an iron-containing material (FeB, Fe2B or stainless steel) whose thickness was to be determined. The FLUKA Monte Carlo code was used to model the shield geometry and to calculate the required neutron dose distributions. The resulting two-layer shields show better performance than concrete alone, so the shield design could leave more space in the shielded interior areas.
Geant4 simulation of the CERN-EU high-energy reference field (CERF) facility.
Prokopovich, D A; Reinhard, M I; Cornelius, I M; Rosenfeld, A B
2010-09-01
The CERN-EU high-energy reference field facility is used for testing and calibrating both active and passive radiation dosemeters for radiation protection applications in space and aviation. Through a combination of a primary particle beam, a target and a suitably designed shielding configuration, the facility is able to reproduce the neutron component of the high altitude radiation field relevant to the jet aviation industry. Simulations of the facility using the GEANT4 (GEometry ANd Tracking) toolkit provide an improved understanding of the neutron particle fluence as well as the particle fluence of the other radiation components present. The secondary particle fluence as a function of the primary particle fluence incident on the target and the associated dose equivalent rates were determined at the 20 designated irradiation positions available at the facility. Comparisons of the simulated results with previously published simulations obtained using the FLUKA Monte Carlo code, as well as with experimental results of the neutron fluence obtained with a Bonner sphere spectrometer, are made.
NRMC - A GPU code for N-Reverse Monte Carlo modeling of fluids in confined media
NASA Astrophysics Data System (ADS)
Sánchez-Gil, Vicente; Noya, Eva G.; Lomba, Enrique
2017-08-01
NRMC is a parallel code for performing N-Reverse Monte Carlo modeling of fluids in confined media [V. Sánchez-Gil, E.G. Noya, E. Lomba, J. Chem. Phys. 140 (2014) 024504]. This method is an extension of the usual Reverse Monte Carlo method to obtain structural models of confined fluids compatible with experimental diffraction patterns, specifically designed to overcome the problem of slow diffusion that can appear under conditions of tight confinement. Most of the computational time in N-Reverse Monte Carlo modeling is spent in the evaluation of the structure factor for each trial configuration, a calculation that can be easily parallelized. Implementation of the structure factor evaluation in NVIDIA® CUDA so that the code can be run on GPUs leads to a speed up of up to two orders of magnitude.
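Most of the cost in such a fit lies in the structure factor of each trial configuration. For an isotropic fluid this can be written with the Debye formula, S(Q) = 1 + (2/N) * sum_{i<j} sin(Q r_ij)/(Q r_ij), which is embarrassingly parallel over pairs and Q values. The NumPy version below is a CPU stand-in for the idea only; it does not reproduce the NRMC CUDA kernels.

```python
import numpy as np

def structure_factor_debye(positions, q_values):
    """Isotropic structure factor of a particle configuration via the Debye formula."""
    n = len(positions)
    diff = positions[:, None, :] - positions[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    rij = dist[np.triu_indices(n, k=1)]           # unique pair distances
    s_q = np.empty_like(q_values)
    for k, q in enumerate(q_values):
        x = q * rij
        s_q[k] = 1.0 + (2.0 / n) * np.sum(np.sinc(x / np.pi))  # sinc(x/pi) = sin(x)/x
    return s_q

# Toy configuration: 200 random points in a unit box, a handful of Q values.
rng = np.random.default_rng(0)
print(structure_factor_debye(rng.random((200, 3)), np.linspace(1.0, 10.0, 5)))
```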
Monte Carlo Techniques for Nuclear Systems - Theory Lectures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.
These are lecture notes for a Monte Carlo class given at the University of New Mexico. The following topics are covered: course information; nuclear engineering review & MC; random numbers and sampling; computational geometry; collision physics; tallies and statistics; eigenvalue calculations I; eigenvalue calculations II; eigenvalue calculations III; variance reduction; parallel Monte Carlo; parameter studies; fission matrix and higher eigenmodes; Doppler broadening; Monte Carlo depletion; HTGR modeling; coupled MC and T/H calculations; fission energy deposition. Solving particle transport problems with the Monte Carlo method is simple - just simulate the particle behavior. The devil is in the details, however. These lectures provide a balanced approach to the theory and practice of Monte Carlo simulation codes. The first lectures provide an overview of Monte Carlo simulation methods, covering the transport equation, random sampling, computational geometry, collision physics, and statistics. The next lectures focus on the state-of-the-art in Monte Carlo criticality simulations, covering the theory of eigenvalue calculations, convergence analysis, dominance ratio calculations, bias in Keff and tallies, bias in uncertainties, a case study of a realistic calculation, and Wielandt acceleration techniques. The remaining lectures cover advanced topics, including HTGR modeling and stochastic geometry, temperature dependence, fission energy deposition, depletion calculations, parallel calculations, and parameter studies. This portion of the class focuses on using MCNP to perform criticality calculations for reactor physics and criticality safety applications. It is an intermediate level class, intended for those with at least some familiarity with MCNP. Class examples provide hands-on experience at running the code, plotting both geometry and results, and understanding the code output. The class includes lectures and hands-on computer use for a variety of Monte Carlo calculations. Beginning MCNP users are encouraged to review LA-UR-09-00380, "Criticality Calculations with MCNP: A Primer (3rd Edition)" (available at http://mcnp.lanl.gov under "Reference Collection") prior to the class. No Monte Carlo class can be complete without having students write their own simple Monte Carlo routines for basic random sampling, use of the random number generator, and simplified particle transport simulation.
NASA Astrophysics Data System (ADS)
Spezi, Emiliano; Leal, Antonio
2013-04-01
The Third European Workshop on Monte Carlo Treatment Planning (MCTP2012) was held from 15-18 May, 2012 in Seville, Spain. The event was organized by the Universidad de Sevilla with the support of the European Workgroup on Monte Carlo Treatment Planning (EWG-MCTP). MCTP2012 followed two successful meetings, one held in Ghent (Belgium) in 2006 (Reynaert 2007) and one in Cardiff (UK) in 2009 (Spezi 2010). The recurrence of these workshops, together with successful events held in parallel by McGill University in Montreal (Seuntjens et al 2012), shows consolidated interest from the scientific community in Monte Carlo (MC) treatment planning. The workshop was attended by a total of 90 participants, mainly coming from a medical physics background. A total of 48 oral presentations and 15 posters were delivered in specific scientific sessions including dosimetry, code development, imaging, modelling of photon and electron radiation transport, external beam radiation therapy, nuclear medicine, brachytherapy and hadrontherapy. A copy of the programme is available on the workshop's website (www.mctp2012.com). In this special section of Physics in Medicine and Biology we report six papers that were selected following the journal's rigorous peer review procedure. These papers provide a good cross-section of the areas of application of MC in treatment planning that were discussed at MCTP2012. Czarnecki and Zink (2013) and Wagner et al (2013) present the results of their work in small field dosimetry. Czarnecki and Zink (2013) studied field-size- and detector-dependent correction factors for diodes and ion chambers within a clinical 6 MV photon beam generated by a Siemens linear accelerator. Their modelling work, based on the BEAMnrc/EGSnrc codes and experimental measurements, revealed that unshielded diodes were the best choice for small field dosimetry because of their independence from the electron beam spot size and their correction factor close to unity. Wagner et al (2013) investigated the recombination effect in liquid ionization chambers for stereotactic radiotherapy, a field of increasing importance in external beam radiotherapy. They modelled both the radiation source (a Cyberknife unit) and the detector with the BEAMnrc/EGSnrc codes and quantified the dependence of the response of this type of detector on factors such as the volume effect and the electrode. They also recommended that these dependences be accounted for in measurements involving small fields. In the field of external beam radiotherapy, Chakarova et al (2013) showed how total body irradiation (TBI) could be improved by simulating patient treatments with MC. In particular, BEAMnrc/EGSnrc based simulations highlighted the importance of optimizing individual compensators for TBI treatments. In the same area of application, Mairani et al (2013) reported on a new tool for treatment planning in proton therapy based on the FLUKA MC code. The software, used to model both the proton therapy beam and the patient anatomy, supports single-field and multiple-field optimization and can be used to optimize physical and relative biological effectiveness (RBE)-weighted dose distributions, using both constant and variable RBE models. In the field of nuclear medicine, Marcatili et al (2013) presented RAYDOSE, a Geant4-based code specifically developed for applications in molecular radiotherapy (MRT).
RAYDOSE has been designed to work in MRT trials using sequential positron emission tomography (PET) or single-photon emission computed tomography (SPECT) imaging to model patient-specific, time-dependent metabolic uptake and to calculate the total 3D dose distribution. The code was validated through experimental measurements in homogeneous and heterogeneous phantoms. Finally, in the field of code development, Miras et al (2013) reported on CloudMC, a Windows Azure-based application for the parallelization of MC calculations in a dynamic cluster environment. Although the performance of CloudMC has been tested with the PENELOPE MC code, the authors report that the software has been designed in such a way that it should be independent of the type of MC code, provided that the simulation meets a number of operational criteria. We wish to thank Elekta/CMS Inc., the University of Seville, the Junta of Andalusia and the European Regional Development Fund for their financial support. We would like also to acknowledge the members of EWG-MCTP for their help in peer-reviewing all the abstracts, and all the invited speakers who kindly agreed to deliver keynote presentations in their area of expertise. A final word of thanks to our colleagues who worked on the reviewing process of the papers selected for this special section and to the IOP Publishing staff who made it possible. MCTP2012 was accredited by the European Federation of Organisations for Medical Physics as a CPD event for medical physicists.
Emiliano Spezi and Antonio Leal, Guest Editors
References
Chakarova R, Müntzing K, Krantz M, Hedin E and Hertzman S 2013 Monte Carlo optimization of total body irradiation in a phantom and patient geometry Phys. Med. Biol. 58 2461-69
Czarnecki D and Zink K 2013 Monte Carlo calculated correction factors for diodes and ion chambers in small photon fields Phys. Med. Biol. 58 2431-44
Mairani A, Böhlen T T, Schiavi A, Tessonnier T, Molinelli S, Brons S, Battistoni G, Parodi K and Patera V 2013 A Monte Carlo-based treatment planning tool for proton therapy Phys. Med. Biol. 58 2471-90
Marcatili S, Pettinato C, Daniels S, Lewis G, Edwards P, Fanti S and Spezi E 2013 Development and validation of RAYDOSE: a Geant4 based application for molecular radiotherapy Phys. Med. Biol. 58 2491-508
Miras H, Jiménez R, Miras C and Gomà C 2013 CloudMC: a cloud computing application for Monte Carlo simulation Phys. Med. Biol. 58 N125-33
Reynaert N 2007 First European Workshop on Monte Carlo Treatment Planning J. Phys.: Conf. Ser. 74 011001
Seuntjens J, Beaulieu L, El Naqa I and Després P 2012 Special section: Selected papers from the Fourth International Workshop on Recent Advances in Monte Carlo Techniques for Radiation Therapy Phys. Med. Biol. 57 (11) E01
Spezi E 2010 Special section: Selected papers from the Second European Workshop on Monte Carlo Treatment Planning (MCTP2009) Phys. Med. Biol. 55 (16) E01
Wagner A, Crop F, Lacornerie T, Vandevelde F and Reynaert N 2013 Use of a liquid ionization chamber for stereotactic radiotherapy dosimetry Phys. Med. Biol. 58 2445-59
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ortiz-Ramírez, Pablo, E-mail: rapeitor@ug.uchile.cl; Ruiz, Andrés
Monte Carlo simulation of gamma spectroscopy systems is common practice these days. The most popular codes for doing this are MCNP and Geant4. The intrinsic spatial efficiency method is a general and absolute method to determine the absolute efficiency of a spectroscopy system for any extended source, but it had only been demonstrated experimentally for cylindrical sources. Because preparing sources of arbitrary shape is difficult, the simplest way to do this is by simulating the spectroscopy system and the source. In this work we present the validation of the intrinsic spatial efficiency method for sources with different geometries and for photons with an energy of 661.65 keV. In the simulation the matrix effects (self-attenuation) are not considered; therefore these results are only preliminary. The MC simulation is carried out using the FLUKA code, and the absolute efficiency of the detector is determined using two methods: the statistical count of the Full Energy Peak (FEP) area (the traditional method) and the intrinsic spatial efficiency method. The obtained results show total agreement between the absolute efficiencies determined by the traditional method and the intrinsic spatial efficiency method. The relative bias is less than 1% in all cases.
NASA Astrophysics Data System (ADS)
Dieudonne, Cyril; Dumonteil, Eric; Malvagi, Fausto; M'Backé Diop, Cheikh
2014-06-01
For several years now, Monte Carlo burnup/depletion codes have appeared which couple a Monte Carlo code, used to simulate the neutron transport, to deterministic methods, which handle the medium depletion due to the neutron flux. Solving the Boltzmann and Bateman equations in such a way makes it possible to track fine 3-dimensional effects and to avoid the multi-group approximations made by deterministic solvers. The counterpart is the prohibitive calculation time due to the Monte Carlo solver being called at each time step. In this paper we present a methodology to avoid the repetitive and time-expensive Monte Carlo simulations and to replace them by perturbation calculations: indeed, the different burnup steps may be seen as perturbations of the isotopic concentrations of an initial Monte Carlo simulation. We first present this method and provide details on the perturbative technique used, namely correlated sampling. We then discuss the implementation of this method in the TRIPOLI-4® code, as well as the calculation scheme used to bring an important speed-up of the depletion calculation. Finally, this technique is used to calculate the depletion of a PWR-like assembly, studied at the beginning of its cycle. After having validated the method against a reference calculation, we show that it can speed up standard Monte Carlo depletion codes by nearly an order of magnitude.
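The correlated-sampling idea can be reduced to a weight correction: histories are sampled once with the reference cross sections and re-used for the perturbed composition, with each free flight and each collision contributing a likelihood ratio to the history weight. The sketch below shows that bookkeeping for a single energy group; it is a schematic of the general technique, not the TRIPOLI-4® implementation.

```python
import math

def correlated_weight(path, sigma_ref, sigma_pert):
    """Correlated-sampling weight of one history sampled with reference cross
    sections and re-scored for a perturbed composition.  `path` is a list of
    (flight_length, collided_nuclide) steps, with collided_nuclide=None for the
    final escape flight; sigma_* map nuclide -> macroscopic cross section (cm^-1)."""
    sig_tot_ref = sum(sigma_ref.values())
    sig_tot_pert = sum(sigma_pert.values())
    w = 1.0
    for length, nuclide in path:
        w *= math.exp(-(sig_tot_pert - sig_tot_ref) * length)   # free-flight ratio
        if nuclide is not None:                                   # collision ratio
            w *= sigma_pert[nuclide] / sigma_ref[nuclide]
    return w

# Toy example: a two-nuclide medium whose composition has been slightly depleted.
ref = {"U235": 0.020, "U238": 0.300}
pert = {"U235": 0.018, "U238": 0.300}
print(correlated_weight([(1.5, "U238"), (0.8, "U235"), (2.0, None)], ref, pert))
```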
Monte Carlo Calculations of Polarized Microwave Radiation Emerging from Cloud Structures
NASA Technical Reports Server (NTRS)
Kummerow, Christian; Roberti, Laura
1998-01-01
The last decade has seen tremendous growth in cloud dynamical and microphysical models that are able to simulate storms and storm systems with very high spatial resolution, typically of the order of a few kilometers. The fairly realistic distributions of cloud and hydrometeor properties that these models generate have in turn led to a renewed interest in the three-dimensional microwave radiative transfer modeling needed to understand the effect of cloud and rainfall inhomogeneities upon microwave observations. Monte Carlo methods, and particularly backwards Monte Carlo methods, have shown themselves to be very desirable due to the quick convergence of the solutions. Unfortunately, backwards Monte Carlo methods are not well suited to treat polarized radiation. This study reviews the existing Monte Carlo methods and presents a new polarized Monte Carlo radiative transfer code. The code is based on a forward scheme but uses aliasing techniques to keep the computational requirements equivalent to those of the backwards solution. Radiative transfer computations have been performed using a microphysical-dynamical cloud model and the results are presented together with the algorithm description.
Juste, B; Miro, R; Gallardo, S; Santos, A; Verdu, G
2006-01-01
The present work simulates the photon and electron transport in a Theratron 780 (MDS Nordion) 60Co radiotherapy unit using the Monte Carlo transport code MCNP (Monte Carlo N-Particle), version 5. In order to improve computational efficiency with a view to practical radiotherapy treatment planning, this work focuses mainly on the analysis of the dose results and on the computing time required by the different tallies applied in the model to speed up the calculations.
SKIRT: The design of a suite of input models for Monte Carlo radiative transfer simulations
NASA Astrophysics Data System (ADS)
Baes, M.; Camps, P.
2015-09-01
The Monte Carlo method is the most popular technique to perform radiative transfer simulations in a general 3D geometry. The algorithms behind and acceleration techniques for Monte Carlo radiative transfer are discussed extensively in the literature, and many different Monte Carlo codes are publicly available. By contrast, the design of a suite of components that can be used for the distribution of sources and sinks in radiative transfer codes has received very little attention. The availability of such models, with different degrees of complexity, has many benefits. For example, they can serve as toy models to test new physical ingredients, or as parameterised models for inverse radiative transfer fitting. For 3D Monte Carlo codes, this requires algorithms to efficiently generate random positions from 3D density distributions. We describe the design of a flexible suite of components for the Monte Carlo radiative transfer code SKIRT. The design is based on a combination of basic building blocks (which can be either analytical toy models or numerical models defined on grids or a set of particles) and the extensive use of decorators that combine and alter these building blocks into more complex structures. For a number of decorators, e.g. those that add spiral structure or clumpiness, we provide a detailed description of the algorithms that can be used to generate random positions. Advantages of this decorator-based design include code transparency, the avoidance of code duplication, and an increase in code maintainability. Moreover, since decorators can be chained without problems, very complex models can easily be constructed out of simple building blocks. Finally, based on a number of test simulations, we demonstrate that our design using customised random position generators is superior to a simpler design based on a generic black-box random position generator.
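The decorator idea translates naturally into code: every component exposes the same "draw random positions" interface, and a decorator wraps another component while altering the positions it returns. The toy Python sketch below illustrates the pattern with an exponential disc and a clumpiness decorator; it mimics the design philosophy only and is not the SKIRT C++ implementation.

```python
import numpy as np

class Component:
    """Common interface: every component can draw random positions from its density."""
    def random_positions(self, n, rng):
        raise NotImplementedError

class ExponentialDisc(Component):
    """Analytical building block: exponential radial profile, double-exponential in z."""
    def __init__(self, scale_length, scale_height):
        self.h_r, self.h_z = scale_length, scale_height
    def random_positions(self, n, rng):
        r = rng.gamma(2.0, self.h_r, n)              # p(R) proportional to R exp(-R/h_R)
        phi = rng.uniform(0.0, 2.0 * np.pi, n)
        z = rng.laplace(0.0, self.h_z, n)
        return np.column_stack([r * np.cos(phi), r * np.sin(phi), z])

class ClumpyDecorator(Component):
    """Decorator: relocate a fraction of any component's mass into compact clumps."""
    def __init__(self, base, clump_fraction, clump_radius, n_clumps, rng):
        self.base, self.f, self.sigma = base, clump_fraction, clump_radius
        self.centres = base.random_positions(n_clumps, rng)   # clump centres follow the base density
    def random_positions(self, n, rng):
        pos = self.base.random_positions(n, rng)
        in_clump = rng.random(n) < self.f
        idx = rng.integers(0, len(self.centres), in_clump.sum())
        pos[in_clump] = self.centres[idx] + rng.normal(0.0, self.sigma, (in_clump.sum(), 3))
        return pos

rng = np.random.default_rng(0)
model = ClumpyDecorator(ExponentialDisc(3.0, 0.3), clump_fraction=0.3,
                        clump_radius=0.2, n_clumps=50, rng=rng)
positions = model.random_positions(10_000, rng)
```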
MCNP (Monte Carlo Neutron Photon) capabilities for nuclear well logging calculations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Forster, R.A.; Little, R.C.; Briesmeister, J.F.
The Los Alamos Radiation Transport Code System (LARTCS) consists of state-of-the-art Monte Carlo and discrete ordinates transport codes and data libraries. The general-purpose continuous-energy Monte Carlo code MCNP (Monte Carlo Neutron Photon), part of the LARTCS, provides a computational predictive capability for many applications of interest to the nuclear well logging community. The generalized three-dimensional geometry of MCNP is well suited for borehole-tool models. SABRINA, another component of the LARTCS, is a graphics code that can be used to interactively create a complex MCNP geometry. Users can define many source and tally characteristics with standard MCNP features. The time-dependent capability of the code is essential when modeling pulsed sources. Problems with neutrons, photons, and electrons as either single particles or coupled particles can be calculated with MCNP. The physics of neutron and photon transport and interactions is modeled in detail using the latest available cross-section data. A rich collection of variance reduction features can greatly increase the efficiency of a calculation. MCNP is written in FORTRAN 77 and has been run on a variety of computer systems from scientific workstations to supercomputers. The next production version of MCNP will include features such as continuous-energy electron transport and a multitasking option. Areas of ongoing research of interest to the well logging community include angle biasing, adaptive Monte Carlo, improved discrete ordinates capabilities, and discrete ordinates/Monte Carlo hybrid development. Los Alamos has requested approval by the Department of Energy to create a Radiation Transport Computational Facility under their User Facility Program to increase external interactions with industry, universities, and other government organizations.
NASA Astrophysics Data System (ADS)
Bauer, J.; Unholtz, D.; Kurz, C.; Parodi, K.
2013-08-01
We report on the experimental campaign carried out at the Heidelberg Ion-Beam Therapy Center (HIT) to optimize the Monte Carlo (MC) modelling of proton-induced positron-emitter production. The presented experimental strategy constitutes a pragmatic inverse approach to overcome the known uncertainties in the modelling of positron-emitter production due to the lack of reliable cross-section data for the relevant therapeutic energy range. This work is motivated by the clinical implementation of offline PET/CT-based treatment verification at our facility. Here, the irradiation-induced tissue activation in the patient is monitored shortly after the treatment delivery by means of a commercial PET/CT scanner and compared to a MC simulated activity expectation, derived under the assumption of a correct treatment delivery. At HIT, the MC particle transport and interaction code FLUKA is used for the simulation of the expected positron-emitter yield. For this particular application, the code is coupled to externally provided cross-section data of several proton-induced reactions. Studying experimentally the positron-emitting radionuclide yield in homogeneous phantoms provides access to the fundamental production channels. Therefore, five different materials have been irradiated by monoenergetic proton pencil beams at various energies and the induced β+ activity subsequently acquired with a commercial full-ring PET/CT scanner. With the analysis of dynamically reconstructed PET images, we are able to determine separately the spatial distribution of different radionuclide concentrations at the starting time of the PET scan. The laterally integrated radionuclide yields in depth are used to tune the input cross-section data such that the impact of both the physical production and the imaging process on the various positron-emitter yields is reproduced. The resulting cross-section data sets allow the absolute level of measured β+ activity induced in the investigated targets to be modelled within a few per cent. Moreover, the simulated distal activity fall-off positions, representing the central quantity for treatment monitoring in terms of beam range verification, are found to agree within 0.6 mm with the measurements at different initial beam energies in both homogeneous and heterogeneous targets. Based on work presented at the Third European Workshop on Monte Carlo Treatment Planning (Seville, 15-18 May 2012).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matthew Ellis; Derek Gaston; Benoit Forget
In recent years the use of Monte Carlo methods for modeling reactors has become feasible due to the increasing availability of massively parallel computer systems. One of the primary challenges yet to be fully resolved, however, is the efficient and accurate inclusion of multiphysics feedback in Monte Carlo simulations. The research in this paper presents a preliminary coupling of the open source Monte Carlo code OpenMC with the open source Multiphysics Object-Oriented Simulation Environment (MOOSE). The coupling of OpenMC and MOOSE will be used to investigate efficient and accurate numerical methods needed to include multiphysics feedback in Monte Carlo codes. An investigation into the sensitivity of Doppler feedback to fuel temperature approximations using a two-dimensional 17x17 PWR fuel assembly is presented in this paper. The results show a functioning multiphysics coupling between OpenMC and MOOSE. The coupling utilizes Functional Expansion Tallies to accurately and efficiently transfer pin power distributions tallied in OpenMC to the unstructured finite element meshes used in MOOSE. The two-dimensional PWR fuel assembly case also demonstrates that, for a simplified model, the pin-by-pin Doppler feedback can be adequately replicated by scaling a representative pin based on pin relative powers.
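The data transfer behind a Functional Expansion Tally is the reconstruction of a continuous shape from the expansion coefficients accumulated by the Monte Carlo code; for a 1D Legendre expansion this is f(x) ≈ sum_n (2n+1)/2 · a_n · P_n(x) on x in [-1, 1]. The snippet below shows only that reconstruction step with made-up coefficients; it is not the OpenMC or MOOSE implementation.

```python
import numpy as np
from numpy.polynomial import legendre

def reconstruct_fet(coefficients, x):
    """Rebuild a 1D shape from Legendre FET coefficients a_n = integral f(x) P_n(x) dx,
    using f(x) ~ sum_n (2n+1)/2 * a_n * P_n(x) on x in [-1, 1]."""
    n = np.arange(len(coefficients))
    scaled = (2.0 * n + 1.0) / 2.0 * np.asarray(coefficients, dtype=float)
    return legendre.legval(x, scaled)

# Made-up coefficients for illustration: a flat shape with a small linear tilt.
x = np.linspace(-1.0, 1.0, 11)
print(reconstruct_fet([2.0, 0.1, -0.05], x))
```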
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bergmann, Ryan M.; Rowland, Kelly L.
2017-04-12
WARP, which can stand for "Weaving All the Random Particles," is a three-dimensional (3D) continuous energy Monte Carlo neutron transport code developed at UC Berkeley to execute efficiently on NVIDIA graphics processing unit (GPU) platforms. WARP accelerates Monte Carlo simulations while preserving the benefits of using the Monte Carlo method, namely, that very few physical and geometrical simplifications are applied. WARP is able to calculate multiplication factors, neutron flux distributions (in both space and energy), and fission source distributions for time-independent neutron transport problems. It can run in either criticality or fixed source mode, but fixed source mode is currently not robust, optimized, or maintained in the newest version. WARP can transport neutrons in unrestricted arrangements of parallelepipeds, hexagonal prisms, cylinders, and spheres. The goal of developing WARP is to investigate algorithms that can grow into a full-featured, continuous energy, Monte Carlo neutron transport code that is accelerated by running on GPUs. The crux of the effort is to make Monte Carlo calculations faster while producing accurate results. Modern supercomputers are commonly being built with GPU coprocessor cards in their nodes to increase their computational efficiency and performance. GPUs execute efficiently on data-parallel problems, but most CPU codes, including those for Monte Carlo neutral particle transport, are predominantly task-parallel. WARP uses a data-parallel neutron transport algorithm to take advantage of the computing power GPUs offer.
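The data-parallel idea can be illustrated with a vectorized toy transport loop that advances an entire bank of neutrons at once instead of following one history at a time. This is a minimal 1D sketch with placeholder cross sections, not WARP's GPU algorithm.

```python
# Minimal illustration of the data-parallel idea behind GPU Monte Carlo transport
# (not WARP itself): instead of following one neutron at a time (task-parallel),
# operate on a whole bank of neutrons at once with vectorized array operations.
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000
sigma_t = 0.35                      # macroscopic total cross section (1/cm), placeholder
p_absorb = 0.3                      # absorption probability per collision, placeholder

x = np.zeros(n)                     # 1D positions of the whole neutron bank
alive = np.ones(n, dtype=bool)

for step in range(100):
    if not alive.any():
        break
    # Sample the distance to the next collision for every live neutron simultaneously.
    dist = -np.log(rng.random(alive.sum())) / sigma_t
    x[alive] += dist * rng.choice([-1.0, 1.0], size=alive.sum())
    # Absorb a fraction of the colliding neutrons; the rest scatter and continue.
    absorbed = rng.random(alive.sum()) < p_absorb
    idx = np.flatnonzero(alive)
    alive[idx[absorbed]] = False

print(f"surviving neutrons after {step + 1} steps: {alive.sum()}")
```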
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moskvin, V; Pirlepesov, F; Tsiamas, P
Purpose: This study provides an overview of the design and commissioning of the Monte Carlo (MC) model of the spot-scanning proton therapy nozzle and its implementation for patient plan simulation. Methods: The Hitachi PROBEAT V scanning nozzle was simulated based on vendor specifications using the TOPAS extension of the Geant4 code. FLUKA MC simulation was also utilized to provide supporting data for the main simulation. Validation of the MC model was performed using vendor-provided data and measurements collected during acceptance/commissioning of the proton therapy machine. Actual patient plans using CT-based treatment geometry were simulated and compared to the dose distributions produced by the treatment planning system (Varian Eclipse 13.6) and to patient quality assurance measurements. In-house MATLAB scripts are used for converting DICOM data into TOPAS input files. Results: Comparison analysis of integrated depth doses (IDDs), therapeutic ranges (R90), and spot shapes/sizes at different distances from the isocenter indicates good agreement between MC and measurements. R90 agreement is within 0.15 mm across all energy tunes. IDD and spot shape/size differences are within the statistical error of the simulation (less than 1.5%). The MC-simulated data, validated with physical measurements, were used for the commissioning of the treatment planning system. Patient geometry simulations were conducted based on the Eclipse-produced DICOM plans. Conclusion: The treatment nozzle and standard option beam model were implemented in the TOPAS framework to simulate a highly conformal discrete spot-scanning proton beam system.
NASA Astrophysics Data System (ADS)
Esposito, A.; Frasciello, O.; Pelliccioni, M.
2017-09-01
ELI-NP will be a new international research infrastructure facility for laser-based Nuclear Physics to be built in Magurele, south west of Bucharest, Romania. For the machine to operate as an intense γ-ray source based on Compton back-scattering, electron beams are employed, undergoing a two-stage acceleration to 320 MeV and 740 MeV (and, with an eventual energy upgrade, also to 840 MeV) beam energies. In order to assess the radiation safety issues, concerning the effectiveness of the dumps in absorbing the primary electron beams, the generated prompt radiation field and the residual dose rates coming from the activation of constituent materials, as well as the shielding of the adjacent environments against both prompt and residual radiation fields, an extensive design study by means of Monte Carlo simulations with the FLUKA code was performed for both the low energy 320 MeV and high energy 720 MeV (840 MeV) beam dumps. For the low energy dump we also discuss the rationale for the choice to place it in the building basement, instead of installing it in one of the shielding walls at the machine level, as was originally conceived. The ambient dose equivalent rate constraints, according to the Romanian radiation protection law in force, were 0.1 μSv/h everywhere outside the shielding walls and 1.4 μSv/h outside the high energy dump area. The dumps' placements and layouts are shown to be fully compliant with the dose constraints and environmental impact requirements.
NASA Astrophysics Data System (ADS)
Koontz, S. L.; Atwell, W. A.; Reddell, B.; Rojdev, K.
2010-12-01
In this paper, we report the results of modeling and simulation studies in which the radiation transport code FLUKA (FLUktuierende KAskade) is used to determine the changes in total ionizing dose (TID) and single-event effect (SEE) environments behind aluminum, polyethylene, carbon, and titanium shielding masses when the assumed form (i.e., Band or Exponential) of the solar particle event (SPE) kinetic energy spectrum is changed. FLUKA simulations are fully three dimensional, with an isotropic particle flux incident on a concentric spherical-shell shielding mass and detector structure. FLUKA is a fully integrated and extensively verified Monte Carlo simulation package for the interaction and transport of high-energy particles and nuclei in matter. We report the effects of both energetic primary protons penetrating the shield mass and secondary particle showers caused by energetic primary protons colliding with shielding mass nuclei. SPE heavy ion spectra are not addressed. Our results, in agreement with previous studies, show that use of the Exponential form of the event spectra can seriously underestimate spacecraft SPE TID and SEE environments in some, but not all, shielding mass cases. The SPE spectra investigated are taken from four specific SPEs that produced ground-level events (GLEs) during solar cycle 23 (1997-2008). GLEs are produced by highly energetic solar particle events (ESP), i.e., those that contain significant fluences of 700 MeV to 10 GeV protons. Highly energetic SPEs are implicated in increased rates of spacecraft anomalies and spacecraft failures. High-energy protons interact with Earth's atmosphere via nuclear reactions to produce secondary particles, some of which are neutrons that can be detected at the Earth's surface by the global neutron monitor network. GLEs are one part of the overall SPE resulting from a particular solar flare or coronal mass ejection event on the sun. The ESP part of the particle event, detected by spacecraft, is often associated with the arrival of a "shock front" at Earth some hours after the arrival of the GLE. The specific SPEs used in this analysis are those of: 1) November 6, 1997 - GLE only; 2) July 14-15, 2000 - GLE from the 14th plus ESP from the 15th; 3) November 4-6, 2001 - GLE and ESP from the 4th; and 4) October 28-29, 2003 - GLE and ESP from the 28th plus GLE from the 29th. The corresponding Band and Exponential spectra used in this paper are like those previously reported.
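For reference, the sketch below evaluates the two spectral forms discussed above in rigidity, using the commonly quoted Band double power-law with exponential rollover; the parameter values are placeholders rather than fits to the four GLE events analysed in the paper.

```python
# Hedged sketch comparing the two spectral forms (Band vs Exponential), evaluated in
# rigidity R (GV). The Band double power-law with exponential rollover is written in
# its commonly quoted form; all parameter values are placeholders, not event fits.
import numpy as np

def band_spectrum(r, j0, gamma1, gamma2, r0):
    r_break = (gamma2 - gamma1) * r0
    low = j0 * r**(-gamma1) * np.exp(-r / r0)
    high = j0 * r**(-gamma2) * r_break**(gamma2 - gamma1) * np.exp(gamma1 - gamma2)
    return np.where(r <= r_break, low, high)

def exponential_spectrum(r, j0, r0):
    return j0 * np.exp(-r / r0)

r = np.logspace(-1, 1, 200)                      # 0.1 to 10 GV
jb = band_spectrum(r, j0=1e8, gamma1=1.5, gamma2=4.0, r0=0.5)
je = exponential_spectrum(r, j0=1e8, r0=0.1)

# The high-rigidity tail is where the two forms diverge most, which is what drives
# the differences in shielded TID/SEE environments described above.
i5 = np.searchsorted(r, 5.0)
print(f"Band/Exponential fluence ratio at 5 GV: {jb[i5] / je[i5]:.2e}")
```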
Improvements of MCOR: A Monte Carlo depletion code system for fuel assembly reference calculations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tippayakul, C.; Ivanov, K.; Misu, S.
2006-07-01
This paper presents the improvements of MCOR, a Monte Carlo depletion code system for fuel assembly reference calculations. The improvements of MCOR were initiated by the cooperation between Penn State Univ. and AREVA NP to enhance the original Penn State Univ. MCOR version for use as a new Monte Carlo depletion analysis tool. Essentially, a new depletion module using KORIGEN is utilized to replace the existing ORIGEN-S depletion module in MCOR. Furthermore, online burnup cross section generation by the Monte Carlo calculation is implemented in the improved version instead of using a burnup cross section library pre-generated by a transport code. Other code features have also been added to make the new MCOR version easier to use. In addition, this paper presents comparisons of the results of the original and the improved MCOR versions against CASMO-4 and OCTOPUS. It was observed in the comparisons that there were quite significant improvements of the results in terms of k-infinity, fission rate distributions and isotopic contents. (authors)
New radiation protection calibration facility at CERN.
Brugger, Markus; Carbonez, Pierre; Pozzi, Fabio; Silari, Marco; Vincke, Helmut
2014-10-01
The CERN radiation protection group has designed a new state-of-the-art calibration laboratory to replace the present facility, which is >20 y old. The new laboratory, presently under construction, will be equipped with neutron and gamma sources, as well as an X-ray generator and a beta irradiator. The present work describes the project to design the facility, including the facility placement criteria, the 'point-zero' measurements and the shielding study performed via FLUKA Monte Carlo simulations.
Extensions of the MCNP5 and TRIPOLI4 Monte Carlo Codes for Transient Reactor Analysis
NASA Astrophysics Data System (ADS)
Hoogenboom, J. Eduard; Sjenitzer, Bart L.
2014-06-01
To simulate reactor transients for safety analysis with the Monte Carlo method, the generation and decay of delayed neutron precursors is implemented in the MCNP5 and TRIPOLI4 general purpose Monte Carlo codes. Important new variance reduction techniques, like forced decay of precursors in each time interval and the branchless collision method, are included to obtain reasonable statistics for the power production per time interval. For simulation of practical reactor transients the feedback effect from the thermal-hydraulics must also be included. This requires coupling of the Monte Carlo code with a thermal-hydraulics (TH) code, providing the temperature distribution in the reactor, which affects the neutron transport via the cross section data. The TH code also provides the coolant density distribution in the reactor, directly influencing the neutron transport. Different techniques for this coupling are discussed. As a demonstration, a 3x3 mini fuel assembly with a moving control rod is considered for MCNP5, and a mini core consisting of 3x3 PWR fuel assemblies with control rods and burnable poisons for TRIPOLI4. Results are shown for reactor transients due to control rod movement or withdrawal. The TRIPOLI4 transient calculation is started at low power and includes thermal-hydraulic feedback. The power rises by about 10 decades and the reactor power finally stabilises at a much higher level than the initial one. The examples demonstrate that the modified Monte Carlo codes are capable of performing correct transient calculations, taking into account all geometrical and cross section detail.
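One simple unbiased variant of the forced-decay idea (not necessarily the exact MCNP5/TRIPOLI4 implementation) is sketched below: each precursor is forced to emit a delayed neutron in every time interval, with a compensating statistical weight.

```python
# Hedged sketch of "forced decay of precursors" (one simple unbiased variant, not
# necessarily the published implementation): in every time interval each precursor
# emits one delayed neutron whose statistical weight equals the expected number of
# physical decays the precursor would have produced in that interval.
import numpy as np

rng = np.random.default_rng(1)

def forced_decay(precursor_weight, lam, t_start, t_end):
    """Return (emission_time, delayed_neutron_weight, surviving_precursor_weight)."""
    dt = t_end - t_start
    # Sample the emission time uniformly over the interval ...
    tau = rng.uniform(0.0, dt)
    # ... and compensate with the decay-density weight lambda*exp(-lambda*tau)*dt,
    # whose expectation over the interval equals 1 - exp(-lambda*dt).
    w_neutron = precursor_weight * lam * dt * np.exp(-lam * tau)
    w_precursor = precursor_weight * np.exp(-lam * dt)   # precursor weight after the interval
    return t_start + tau, w_neutron, w_precursor

# Example: a group-1 precursor (lambda ~ 0.0124 1/s) over a 0.1 s time interval.
t_emit, w_n, w_p = forced_decay(1.0, 0.0124, 0.0, 0.1)
print(f"emission at t={t_emit:.3f} s, neutron weight {w_n:.4e}, precursor weight {w_p:.6f}")
```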
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simeonov, Y; Penchev, P; Ringbaek, T Printz
2016-06-15
Purpose: Active raster scanning in particle therapy results in highly conformal dose distributions. Treatment time, however, is relatively long due to the large number of different iso-energy layers used. By using only one energy and the so-called 3D range-modulator, irradiation times of only a few seconds can be achieved, thus making delivery of a homogeneous dose to moving targets (e.g. lung cancer) more reliable. Methods: A 3D range-modulator consisting of many pins with a base area of 2.25 mm² and different lengths was developed and manufactured with a rapid prototyping technique. The form of the 3D range-modulator was optimised for a spherical target volume with 5 cm diameter placed at 25 cm depth in a water phantom. Monte Carlo simulations using the FLUKA package were carried out to evaluate the modulating effect of the 3D range-modulator and simulate the resulting dose distribution. The fine and complicated contour form of the 3D range-modulator was taken into account by a specially programmed user routine. Additionally, FLUKA was extended with the capability of intensity modulated scanning. To verify the simulation results, dose measurements were carried out at the Heidelberg Ion Therapy Center (HIT) with a 400.41 MeV/u 12C beam. Results: The high resolution measurements show that the 3D range-modulator is capable of producing homogeneous 3D conformal dose distributions while significantly reducing irradiation time. The measured dose is in very good agreement with the previously conducted FLUKA simulations, where slight differences were traced back to minor manufacturing deviations from the perfect optimised form. Conclusion: Combined with the advantage of very short treatment times, the 3D range-modulator could be an alternative for treating small to medium sized tumours (e.g. lung metastases) with the same conformity as full raster-scanning treatment. Further simulations and measurements of more complex cases will be conducted to investigate the full potential of the 3D range-modulator.
Analysis of Naval Ammunition Stock Positioning
2015-12-01
A Monte Carlo simulation model was developed to simulate the expected cost and delivery performance of drawing down naval ammunition stockpiles and positioning them at coastal Navy facilities, with site-to-site assignment probabilities determined by the Monte Carlo simulation. Keywords: supply chain management, Monte Carlo simulation, risk, delivery performance, stock positioning.
ME(SSY)**2: Monte Carlo Code for Star Cluster Simulations
NASA Astrophysics Data System (ADS)
Freitag, Marc Dewi
2013-02-01
ME(SSY)**2 stands for "Monte-carlo Experiments with Spherically SYmmetric Stellar SYstems." This code simulates the long-term evolution of spherical clusters of stars; it was devised specifically to treat dense galactic nuclei. It is based on the pioneering Monte Carlo scheme proposed by Hénon in the 1970s and includes all relevant physical ingredients (two-body relaxation, stellar mass spectrum, collisions, tidal disruption, ...). It is basically a Monte Carlo resolution of the Fokker-Planck equation. It can cope with any stellar mass spectrum or velocity distribution. Being a particle-based method, it also allows one to take stellar collisions into account in a very realistic way. This unique code, featuring the most important physical processes, allows million-particle simulations, spanning a Hubble time, in a few CPU days on standard personal computers and provides a wealth of data rivalled only by N-body simulations. The current version of the software requires the use of routines from "Numerical Recipes in Fortran 77" (http://www.nrbook.com/a/bookfpdf.php).
Force field development with GOMC, a fast new Monte Carlo molecular simulation code
NASA Astrophysics Data System (ADS)
Mick, Jason Richard
In this work GOMC (GPU Optimized Monte Carlo), a new fast, flexible, and free molecular Monte Carlo code for the simulation of atomistic chemical systems, is presented. The results of a large Lennard-Jonesium simulation in the Gibbs ensemble are presented. Force fields developed using the code are also presented. To fit the models, a quantitative fitting process is outlined using a scoring function and heat maps. The presented n-6 force fields include force fields for noble gases and branched alkanes. These force fields are shown to be the most accurate LJ or n-6 force fields to date for these compounds, capable of reproducing pure fluid behavior and binary mixture behavior to a high degree of accuracy.
Monte Carlo simulation of proton track structure in biological matter
Quinto, Michele A.; Monti, Juan M.; Weck, Philippe F.; ...
2017-05-25
Here, understanding the radiation-induced effects at the cellular and subcellular levels remains crucial for predicting the evolution of irradiated biological matter. In this context, Monte Carlo track-structure simulations have rapidly emerged among the most suitable and powerful tools. However, most existing Monte Carlo track-structure codes rely heavily on the use of semi-empirical cross sections as well as water as a surrogate for biological matter. In the current work, we report on the up-to-date version of our homemade Monte Carlo code TILDA-V – devoted to the modeling of the slowing-down of 10 keV–100 MeV protons in both water and DNA – where the main collisional processes are described by means of an extensive set of ab initio differential and total cross sections.
NASA Astrophysics Data System (ADS)
Camarlinghi, N.; Sportelli, G.; Battistoni, G.; Belcari, N.; Cecchetti, M.; Cirrone, G. A. P.; Cuttone, G.; Ferretti, S.; Kraan, A.; Retico, A.; Romano, F.; Sala, P.; Straub, K.; Tramontana, A.; Del Guerra, A.; Rosso, V.
2014-04-01
Ion therapy allows the delivery of a highly conformal dose by taking advantage of the sharp depth-dose distribution at the Bragg peak. However, patient positioning errors and anatomical uncertainties can cause dose distortions. To exploit the full potential of ion therapy, an accurate monitoring system of the ion range is needed. Among the proposed methods to monitor the ion range, Positron Emission Tomography (PET) has proven to be the most mature technique, allowing reconstruction of the β+ activity generated in the patient by the nuclear interactions of the ions, which can be acquired during or after the treatment. Taking advantage of the spatial correlation between the positron emitters created along the ion path and the dose distribution, it is possible to reconstruct the ion range. Due to the high singles rates generated during beam extraction, the acquisition of the β+ activity is typically performed after the irradiation (cyclotron) or in between the synchrotron spills. Indeed, for cyclotrons the single-photon rate can be one or more orders of magnitude higher than normal. Therefore, acquiring the activity during the beam irradiation requires a detector with a very short dead time. In this work, the DoPET detector, capable of sustaining the high event rate generated during cyclotron irradiation, is presented. The capability of the system to acquire data during and after the irradiation will be demonstrated by showing the reconstructed activity for different PMMA irradiations performed using clinical dose rates and the 62 MeV proton beam at CATANA-LNS-INFN. The reconstructed activity widths will be compared with the results obtained by simulating the proton beam interaction with the FLUKA Monte Carlo code. The presented data are in good agreement with the FLUKA Monte Carlo.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koren, S; Bragilovski, D; Tafo, A Guemnie
Purpose: To evaluate the clinical feasibility of the IntraBeam intraoperative kV irradiation device for ocular conjunctiva treatments. The IntraBeam system offers a 4.4 mm diameter needle applicator, which is not suitable for treatment of a large surface with limited access. We propose an adaptor that will answer this clinical need and provide initial dosimetry. Methods: The dose distribution of the needle applicator is non-uniform and hence not suitable for treatment of relatively large surfaces. We designed an adapter to the needle applicator that filters the X-rays and produces a conformal dose distribution over the treatment area while shielding surfaces to be spared. Dose distributions were simulated using FLUKA, a fully integrated particle physics Monte Carlo simulation package. Results: We designed a wedge applicator made of a Polythermide window and stainless steel for collimation. We compare the dose distribution to that of the known needle and surface applicators. Conclusion: Initial dosimetry shows the feasibility of this approach. While further refinements to the design may be warranted, the results support construction of a prototype and confirmation of the Monte Carlo dosimetry with measured data.
Absorbed Dose and Dose Equivalent Calculations for Modeling Effective Dose
NASA Technical Reports Server (NTRS)
Welton, Andrew; Lee, Kerry
2010-01-01
While in orbit, astronauts are exposed to a much higher dose of ionizing radiation than when on the ground. It is important to model how shielding designs on spacecraft reduce the radiation effective dose pre-flight and to determine whether or not a danger to humans is presented. However, in order to calculate effective dose, dose equivalent calculations are needed. Dose equivalent takes into account the absorbed dose of radiation and the biological effectiveness of the ionizing radiation. This is important in preventing long-term, stochastic radiation effects in humans spending time in space. Monte Carlo simulations run with the particle transport code FLUKA give absorbed dose and dose equivalent data for relevant shielding. The shielding geometry used in the dose calculations is a layered slab design, consisting of aluminum, polyethylene, and water. Water is used to simulate the soft tissues that compose the human body. The results obtained will provide information on how the shielding performs with many thicknesses of each material in the slab. This makes the results directly applicable to modern spacecraft shielding geometries.
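The dose-equivalent bookkeeping described here amounts to weighting the absorbed dose scored in each LET bin by a quality factor Q(L). The sketch below uses the ICRP-60 Q(L) relationship with placeholder per-bin doses; it is illustrative only, not the FLUKA tally processing used in the study.

```python
# Sketch of the dose-equivalent calculation: absorbed dose scored per LET bin is
# weighted by a quality factor Q(L). The Q(L) relationship is the ICRP Publication 60
# form; the per-bin absorbed doses are placeholder numbers, not FLUKA output.
import numpy as np

def q_icrp60(let_kev_um):
    """ICRP-60 quality factor as a function of unrestricted LET in water (keV/um)."""
    L = np.asarray(let_kev_um, dtype=float)
    return np.where(L < 10.0, 1.0,
           np.where(L <= 100.0, 0.32 * L - 2.2, 300.0 / np.sqrt(L)))

let_bins = np.array([0.5, 5.0, 20.0, 80.0, 150.0, 400.0])   # LET bin centres (keV/um)
dose_gy  = np.array([2e-4, 1e-4, 3e-5, 1e-5, 4e-6, 1e-6])   # absorbed dose per bin (placeholder)

dose_equivalent_sv = np.sum(q_icrp60(let_bins) * dose_gy)
print(f"absorbed dose {dose_gy.sum():.3e} Gy -> dose equivalent {dose_equivalent_sv:.3e} Sv")
```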
DOE Office of Scientific and Technical Information (OSTI.GOV)
A.I. Drozhdin, N.V. Mokhov and M. Huhtinen
1999-04-13
The effect of possible accidental beam loss in the LHC on the IP5 insertion elements and the CMS detector is studied via realistic Monte Carlo simulations. Such beam loss could be the consequence of an unsynchronized abort or, in the worst case, an accidental prefire of one of the abort kicker modules. Simulations with the STRUCT code show that these beam losses would take place in the IP5 inner and outer triplets. MARS simulations of the hadronic and electromagnetic cascades induced in such an event indicate severe heating of the inner triplet quadrupoles. In order to protect the IP5 elements, two methods are proposed: a set of shadow collimators in the outer triplet, and compensation of a prefired module using a special module charged with an opposite voltage (antikicker). The remnants of the accidental beam loss entering the experimental hall have been used as input for FLUKA simulations in the CMS detector. It is shown that it is vital to take measures to reliably protect the expensive CMS tracker components.
Depth profile of production yields of natPb(p, xn) 206,205,204,203,202,201Bi nuclear reactions
NASA Astrophysics Data System (ADS)
Mokhtari Oranj, Leila; Jung, Nam-Suk; Kim, Dong-Hyun; Lee, Arim; Bae, Oryun; Lee, Hee-Seock
2016-11-01
Experimental and simulation studies on the depth profiles of the production yields of natPb(p,xn)206,205,204,203,202,201Bi nuclear reactions were carried out. Irradiation experiments were performed at the high-intensity proton linac facility (KOMAC) in Korea. The targets, irradiated by 100-MeV protons, were arranged in a stack consisting of natural Pb, Al and Au foils and Pb plates. The proton beam intensity was determined by the activation analysis method using the 27Al(p,3p1n)24Na, 197Au(p,p1n)196Au, and 197Au(p,p3n)194Au monitor reactions and also by Gafchromic film dosimetry. The yields of the radionuclides produced in the natPb activation foils and monitor foils were measured with an HPGe spectroscopy system. Monte Carlo simulations were performed with the FLUKA, PHITS/DCHAIN-SP, and MCNPX/FISPACT codes, and the calculated data were compared with the experimental results. A satisfactory agreement was observed between the present experimental data and the simulations.
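For context, the monitor-foil part of such an analysis rests on the thin-target activation equation; the sketch below inverts it for the proton flux using placeholder numbers (it is not the KOMAC data reduction itself).

```python
# Hedged sketch of the monitor-foil activation analysis used to determine the proton
# beam intensity: for a thin foil under constant flux, the end-of-bombardment activity
# is A = N_t * sigma * phi * (1 - exp(-lambda * t_irr)). All numerical inputs below are
# placeholders, not the KOMAC experiment values.
import numpy as np

def beam_flux_from_activity(a_eob_bq, n_target_atoms, sigma_cm2, lam_s, t_irr_s):
    """Invert the thin-target activation equation for the proton flux (p/cm^2/s)."""
    return a_eob_bq / (n_target_atoms * sigma_cm2 * (1.0 - np.exp(-lam_s * t_irr_s)))

lam_na24 = np.log(2.0) / (14.997 * 3600.0)         # 24Na half-life ~15.0 h
phi = beam_flux_from_activity(a_eob_bq=5.0e3,      # placeholder end-of-bombardment activity
                              n_target_atoms=2.0e21,
                              sigma_cm2=10.0e-27,  # placeholder 27Al(p,3p1n)24Na value
                              lam_s=lam_na24,
                              t_irr_s=600.0)
print(f"inferred proton flux: {phi:.3e} p/cm^2/s (placeholder inputs)")
```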
The Monte Carlo photoionization and moving-mesh radiation hydrodynamics code CMACIONIZE
NASA Astrophysics Data System (ADS)
Vandenbroucke, B.; Wood, K.
2018-04-01
We present the public Monte Carlo photoionization and moving-mesh radiation hydrodynamics code CMACIONIZE, which can be used to simulate the self-consistent evolution of HII regions surrounding young O and B stars, or other sources of ionizing radiation. The code combines a Monte Carlo photoionization algorithm that uses a complex mix of hydrogen, helium and several coolants in order to self-consistently solve for the ionization and temperature balance at any given time, with a standard first-order hydrodynamics scheme. The code can be run as a post-processing tool to get the line emission from an existing simulation snapshot, but can also be used to run full radiation hydrodynamical simulations. Both the radiation transfer and the hydrodynamics are implemented in a general way that is independent of the grid structure used to discretize the system, allowing the code to be run either as a standard fixed-grid code or as a moving-mesh code.
LLNL Mercury Project Trinity Open Science Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dawson, Shawn A.
The Mercury Monte Carlo particle transport code is used to simulate the transport of radiation through urban environments. These challenging calculations include complicated geometries and require significant computational resources to complete. In the proposed Trinity Open Science calculations, I will investigate computer science aspects of the code which are relevant to convergence of the simulation quantities with increasing Monte Carlo particle counts.
A Monte-Carlo Benchmark of TRIPOLI-4® and MCNP on ITER neutronics
NASA Astrophysics Data System (ADS)
Blanchet, David; Pénéliau, Yannick; Eschbach, Romain; Fontaine, Bruno; Cantone, Bruno; Ferlet, Marc; Gauthier, Eric; Guillon, Christophe; Letellier, Laurent; Proust, Maxime; Mota, Fernando; Palermo, Iole; Rios, Luis; Guern, Frédéric Le; Kocan, Martin; Reichle, Roger
2017-09-01
Radiation protection and shielding studies are often based on the extensive use of 3D Monte-Carlo neutron and photon transport simulations. The ITER Organization hence recommends the use of the MCNP-5 code (version 1.60), in association with the FENDL-2.1 neutron cross section data library, specifically dedicated to fusion applications. The MCNP reference model of the ITER tokamak, the 'C-lite', is being continuously developed and improved. This article proposes to develop an alternative model, equivalent to the 'C-lite', but for the Monte-Carlo code TRIPOLI-4®. A benchmark study is defined to test this new model. Since one of the most critical areas for ITER neutronics analysis concerns the assessment of radiation levels and Shutdown Dose Rates (SDDR) behind the Equatorial Port Plugs (EPP), the benchmark is conducted to compare the neutron flux through the EPP. This problem is quite challenging with regard to the complex geometry and the strong neutron flux attenuation, ranging from 10^14 down to 10^8 n·cm^-2·s^-1. Such a code-to-code comparison provides independent validation of the Monte-Carlo simulations, improving the confidence in neutronics results.
NASA Astrophysics Data System (ADS)
Chapoutier, Nicolas; Mollier, François; Nolin, Guillaume; Culioli, Matthieu; Mace, Jean-Reynald
2017-09-01
In the context of the rise of Monte Carlo transport calculations for all kinds of applications, AREVA recently improved its suite of engineering tools in order to produce an efficient Monte Carlo workflow. Monte Carlo codes, such as MCNP or TRIPOLI, are recognized as reference codes for dealing with a large range of radiation transport problems. However, the inherent drawbacks of these codes - laborious input file creation and long computation times - contrast with the maturity of their treatment of the physical phenomena. The goal of the recent AREVA developments was to reach an efficiency similar to that of other mature engineering disciplines such as finite element analyses (e.g. structural or fluid dynamics). Among the main objectives, the creation of a graphical user interface offering CAD tools for geometry creation and other graphical features dedicated to the radiation field (source definition, tally definition) has been reached. Computation times are drastically reduced compared to a few years ago thanks to the use of massive parallel runs and, above all, the implementation of hybrid variance reduction techniques. Engineering teams are now able to deliver much more prompt support to any nuclear project dealing with reactors or fuel cycle facilities, from the conceptual phase to decommissioning.
NASA Technical Reports Server (NTRS)
Shinn, Judy L.; Wilson, John W.; Lone, M. A.; Wong, P. Y.; Costen, Robert C.
1994-01-01
A baryon transport code (BRYNTRN) has previously been verified using available Monte Carlo results for a solar-flare spectrum as the reference. Excellent results were obtained, but the comparisons were limited to the available data on dose and dose equivalent for moderate penetration studies that involve minor contributions from secondary neutrons. To further verify the code, the secondary energy spectra of protons and neutrons are calculated using BRYNTRN and LAHET (Los Alamos High-Energy Transport code, which is a Monte Carlo code). These calculations are compared for three locations within a water slab exposed to the February 1956 solar-proton spectrum. Reasonable agreement was obtained when various considerations related to the calculational techniques and their limitations were taken into account. Although the Monte Carlo results are preliminary, it appears that the neutron albedo, which is not currently treated in BRYNTRN, might be a cause for the large discrepancy seen at small penetration depths. It also appears that the nonelastic neutron production cross sections in BRYNTRN may underestimate the number of neutrons produced in proton collisions with energies below 200 MeV. The notion that the poor energy resolution in BRYNTRN may cause a large truncation error in neutron elastic scattering requires further study.
Experimental benchmarking of a Monte Carlo dose simulation code for pediatric CT
NASA Astrophysics Data System (ADS)
Li, Xiang; Samei, Ehsan; Yoshizumi, Terry; Colsher, James G.; Jones, Robert P.; Frush, Donald P.
2007-03-01
In recent years, there has been a desire to reduce CT radiation dose to children because of their susceptibility and prolonged risk for cancer induction. Concerns arise, however, as to the impact of dose reduction on image quality and thus potentially on diagnostic accuracy. To study the dose and image quality relationship, we are developing a simulation code to calculate organ dose in pediatric CT patients. To benchmark this code, a cylindrical phantom was built to represent a pediatric torso, which allows measurements of dose distributions from its center to its periphery. Dose distributions for axial CT scans were measured on a 64-slice multidetector CT (MDCT) scanner (GE Healthcare, Chalfont St. Giles, UK). The same measurements were simulated using a Monte Carlo code (PENELOPE, Universitat de Barcelona) with the applicable CT geometry including bowtie filter. The deviations between simulated and measured dose values were generally within 5%. To our knowledge, this work is one of the first attempts to compare measured radial dose distributions on a cylindrical phantom with Monte Carlo simulated results. It provides a simple and effective method for benchmarking organ dose simulation codes and demonstrates the potential of Monte Carlo simulation for investigating the relationship between dose and image quality for pediatric CT patients.
Monte Carlo modelling the dosimetric effects of electrode material on diamond detectors.
Baluti, Florentina; Deloar, Hossain M; Lansley, Stuart P; Meyer, Juergen
2015-03-01
Diamond detectors for radiation dosimetry were modelled using the EGSnrc Monte Carlo code to investigate the influence of electrode material and detector orientation on the absorbed dose. The small dimensions of the electrode/diamond/electrode detector structure required very thin voxels and the use of non-standard DOSXYZnrc Monte Carlo model parameters. The interface phenomena were investigated by simulating a 6 MV beam and detectors with different electrode materials, namely Al, Ag, Cu and Au, with thicknesses of 0.1 µm for the electrodes and 0.1 mm for the diamond, in both perpendicular and parallel detector orientations with regard to the incident beam. The smallest perturbations were observed for the parallel detector orientation and Al electrodes (Z = 13). In summary, the EGSnrc Monte Carlo code is well suited for modelling small detector geometries. The Monte Carlo model developed is a useful tool to investigate the dosimetric effects caused by different electrode materials. To minimise perturbations caused by the detector electrodes, it is recommended that the electrodes be made from a low-atomic-number material and placed parallel to the beam direction.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Procassini, R.J.
1997-12-31
The fine-scale, multi-space resolution that is envisioned for accurate simulations of complex weapons systems in three spatial dimensions implies flop-rate and memory-storage requirements that will only be obtained in the near future through the use of parallel computational techniques. Since the Monte Carlo transport models in these simulations usually stress both of these computational resources, they are prime candidates for parallelization. The MONACO Monte Carlo transport package, which is currently under development at LLNL, will utilize two types of parallelism within the context of a multi-physics design code: decomposition of the spatial domain across processors (spatial parallelism) and distribution of particles in a given spatial subdomain across additional processors (particle parallelism). This implementation of the package will utilize explicit data communication between domains (message passing). Such a parallel implementation of a Monte Carlo transport model will result in non-deterministic communication patterns. The communication of particles between subdomains during a Monte Carlo time step may require a significant level of effort to achieve a high parallel efficiency.
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Rourke, Patrick Francis
The purpose of this report is to provide the reader with an understanding of how a Monte Carlo neutron transport code was written, developed, and evolved to calculate the probability distribution functions (PDFs) and their moments for the neutron number at a final time as well as the cumulative fission number, along with introducing several basic Monte Carlo concepts.
NASA Technical Reports Server (NTRS)
Campbell, David; Wysong, Ingrid; Kaplan, Carolyn; Mott, David; Wadsworth, Dean; VanGilder, Douglas
2000-01-01
An AFRL/NRL team has recently been selected to develop a scalable, parallel, reacting, multidimensional (SUPREM) Direct Simulation Monte Carlo (DSMC) code for the DoD user community under the High Performance Computing Modernization Office (HPCMO) Common High Performance Computing Software Support Initiative (CHSSI). This paper will introduce the JANNAF Exhaust Plume community to this three-year development effort and present the overall goals, schedule, and current status of this new code.
Wang, R; Li, X A
2001-02-01
The dose parameters for the beta-particle emitting 90Sr/90Y source for intravascular brachytherapy (IVBT) have been calculated by different investigators. At larger distances from the source, noticeable differences are seen in the parameters calculated using different Monte Carlo codes. The purpose of this work is to quantify as well as to understand these differences. We have compared a series of calculations using the EGS4, EGSnrc, and MCNP Monte Carlo codes. Data calculated and compared include the depth dose curve for a broad parallel beam of electrons, and radial dose distributions for point electron sources (monoenergetic or polyenergetic) and for a real 90Sr/90Y source. For the 90Sr/90Y source, the doses at the reference position (2 mm radial distance) calculated by the three codes agree within 2%. However, the differences between the doses calculated by the three codes can be over 20% in the radial distance range of interest in IVBT. The difference increases with radial distance from the source, and reaches 30% at the tail of the dose curve. These differences may be partially attributed to the different multiple scattering theories and Monte Carlo models for electron transport adopted in the three codes. Doses calculated by the EGSnrc code are more accurate than those by EGS4. The two calculations agree within 5% for radial distances <6 mm.
New estimation method of neutron skyshine for a high-energy particle accelerator
NASA Astrophysics Data System (ADS)
Oh, Joo-Hee; Jung, Nam-Suk; Lee, Hee-Seock; Ko, Seung-Kook
2016-09-01
Skyshine is the dominant component of the prompt radiation at off-site locations. Several experimental studies have been carried out to estimate the neutron skyshine at a few accelerator facilities. In this work, the neutron transport from the source location to off-site locations was simulated using the Monte Carlo codes FLUKA and PHITS. The transport paths were classified as skyshine, direct (transport), groundshine and multiple-shine in order to understand the contribution of each path and to develop a general evaluation method. The effect of each path was estimated in terms of the dose at far locations. The neutron dose was calculated using the neutron energy spectra obtained from detectors placed at distances up to 1 km from the accelerator. The highest altitude of the sky region in this simulation was set to 2 km above the floor of the accelerator facility. The initial model of this study was the 10 GeV electron accelerator PAL-XFEL. Different compositions and densities of air, soil and ordinary concrete were applied in the calculation, and their dependences were reviewed. The estimation method used in this study was compared with the well-known methods suggested by Rindi, Stevenson and Stepleton, and also with the simple code SHINE3. The results obtained using this method agreed well with those using Rindi's formula.
NASA Astrophysics Data System (ADS)
Stamatopoulos, A.; Kanellakopoulos, A.; Kalamara, A.; Diakaki, M.; Tsinganis, A.; Kokkoris, M.; Michalopoulou, V.; Axiotis, M.; Lagoyiannis, A.; Vlastou, R.
2018-01-01
The 234U neutron-induced fission cross-section has been measured at incident neutron energies of 452, 550 and 651 keV and 7.5, 8.7 and 10 MeV using the 7Li(p,n) and 2H(d,n) reactions, respectively, relative to the 235U(n,f) and 238U(n,f) reference reactions. The measurement was performed at the neutron beam facility of the National Center for Scientific Research "Demokritos", using a set-up based on Micromegas detectors. The active mass of the actinide samples and the corresponding impurities were determined via α-spectroscopy using a surface barrier silicon detector. The neutron spectra intercepted by the actinide samples were thoroughly studied by coupling the NeuSDesc and MCNP5 codes, taking into account the energy and angular straggling of the primary ion beams in the neutron source targets, in addition to contributions from competing reactions (e.g. deuteron break-up) and neutron scattering in the surrounding materials. Auxiliary Monte Carlo simulations were performed making combined use of the FLUKA and GEF codes, focusing particularly on the determination of the fission fragment detection efficiency. The developed methodology and the final results are presented.
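The ratio method underlying such a measurement can be summarised by the relation below, written here as a simplified sketch that omits correction factors (dead time, parasitic fission from impurities, etc.); C denotes fission counts, N the number of actinide atoms (from α-spectroscopy) and ε the fission-fragment detection efficiency (here obtained from the FLUKA/GEF simulations).

```latex
% Simplified sketch of the relative (ratio) method, corrections omitted.
\sigma_{^{234}\mathrm{U}}(E_n) \;=\; \sigma_{\mathrm{ref}}(E_n)\,
  \frac{C_{234}(E_n)}{C_{\mathrm{ref}}(E_n)}\,
  \frac{N_{\mathrm{ref}}}{N_{234}}\,
  \frac{\varepsilon_{\mathrm{ref}}}{\varepsilon_{234}},
  \qquad \mathrm{ref} \in \{\,^{235}\mathrm{U},\; ^{238}\mathrm{U}\,\}
```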
Radiological Studies for the LCLS Beam Abort System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Santana Leitner, M.; Vollaire, J.; Mao, X.S.
2008-03-25
The Linac Coherent Light Source (LCLS), a pioneering hard x-ray free electron laser, is currently under construction at the Stanford Linear Accelerator Center. It is expected that by 2009 LCLS will deliver laser pulses of unprecedented brightness and short length, which will be used in several forefront research applications. This ambitious project encompasses major radiation protection design challenges, such as the numerous loss sources and the large number of objects to be surveyed. In order to sort these out, the showers from various loss sources have been tracked along a detailed model covering half a mile of the LCLS accelerator by means of the Monte Carlo intranuclear cascade codes FLUKA and MARS15. This article covers the FLUKA studies of heat load, prompt and residual dose, and environmental impact for the LCLS beam abort system.
Bergmann, Ryan M.; Rowland, Kelly L.; Radnović, Nikola; ...
2017-05-01
In this companion paper to "Algorithmic Choices in WARP - A Framework for Continuous Energy Monte Carlo Neutron Transport in General 3D Geometries on GPUs" (doi:10.1016/j.anucene.2014.10.039), the WARP Monte Carlo neutron transport framework for graphics processing units (GPUs) is benchmarked against production-level central processing unit (CPU) Monte Carlo neutron transport codes for both performance and accuracy. We compare neutron flux spectra, multiplication factors, runtimes, speedup factors, and costs of various GPU and CPU platforms running either WARP, Serpent 2.1.24, or MCNP 6.1. WARP compares well with the results of the production-level codes, and it is shown that on the newest hardware considered, GPU platforms running WARP are between 0.8 and 7.6 times as fast as CPU platforms running production codes. Also, the GPU platforms running WARP were between 15% and 50% as expensive to purchase and between 80% and 90% as expensive to operate as equivalent CPU platforms performing at an equal simulation rate.
Diagnosing Undersampling Biases in Monte Carlo Eigenvalue and Flux Tally Estimates
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perfetti, Christopher M.; Rearden, Bradley T.; Marshall, William J.
2017-02-08
Here, this study focuses on understanding the phenomenon in Monte Carlo simulations known as undersampling, in which Monte Carlo tally estimates may not encounter a sufficient number of particles during each generation to obtain unbiased tally estimates. Steady-state Monte Carlo simulations were performed using the KENO Monte Carlo tools within the SCALE code system for models of several burnup credit applications with varying degrees of spatial and isotopic complexity, and the incidence and impact of undersampling on eigenvalue and flux estimates were examined. Using an inadequate number of particle histories in each generation was found to produce a maximum bias of ~100 pcm in eigenvalue estimates and biases that exceeded 10% in fuel pin flux tally estimates. Having quantified the potential magnitude of undersampling biases in eigenvalue and flux tally estimates in these systems, this study then investigated whether Markov Chain Monte Carlo convergence metrics could be integrated into Monte Carlo simulations to predict the onset and magnitude of undersampling biases. Five potential metrics for identifying undersampling biases were implemented in the SCALE code system and evaluated for their ability to predict undersampling biases by comparing the test metric scores with the observed undersampling biases. Finally, of the five convergence metrics that were investigated, three (the Heidelberger-Welch relative half-width, the Gelman-Rubin R-hat diagnostic, and tally entropy) showed the potential to accurately predict the behavior of undersampling biases in the responses examined.
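As an illustration of one of these metrics, the sketch below computes the Gelman-Rubin statistic from per-generation tally estimates of independent replicas; the input "tallies" are synthetic placeholders, not SCALE/KENO output, and the interpretation follows the usual rule of thumb that values near 1 indicate convergence.

```python
# Sketch of the Gelman-Rubin convergence diagnostic applied to per-generation tally
# estimates from independent Monte Carlo replicas. The tallies are synthetic placeholders.
import numpy as np

def gelman_rubin(chains):
    """chains: array of shape (m_chains, n_samples) holding a scalar tally estimate."""
    m, n = chains.shape
    chain_means = chains.mean(axis=1)
    w = chains.var(axis=1, ddof=1).mean()        # mean within-chain variance
    b = n * chain_means.var(ddof=1)              # between-chain variance (scaled)
    var_hat = (n - 1) / n * w + b / n            # pooled variance estimate
    return np.sqrt(var_hat / w)                  # R-hat; values near 1 suggest convergence

rng = np.random.default_rng(7)
tallies = rng.normal(loc=1.0, scale=0.01, size=(4, 500))   # 4 replicas, 500 active generations
print(f"R-hat = {gelman_rubin(tallies):.4f}")
```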
NASA Astrophysics Data System (ADS)
Islam, M. R.; Collums, T. L.; Zheng, Y.; Monson, J.; Benton, E. R.
2013-11-01
The production of secondary neutrons is an undesirable byproduct of proton therapy, and it is important to quantify the contribution from secondary neutrons to the patient dose received outside the treatment volume. The purpose of this study is to investigate the off-axis dose equivalent from secondary neutrons experimentally using CR-39 plastic nuclear track detectors (PNTD) at the ProCure Proton Therapy Center, Oklahoma City, OK. In this experiment, we placed several layers of CR-39 PNTD laterally outside the treatment volume, inside a phantom and in air, at various depths and angles with respect to the primary beam axis. Three different proton beams with maximum energies of 78, 162 and 226 MeV, a 4 cm modulation width, a 5 cm diameter brass aperture, and a small snout located 38 cm from isocenter were used for the entire experiment. Monte Carlo simulations were also performed based on the experimental setup using a simplified snout configuration and the FLUKA Monte Carlo radiation transport code. The measured ratio of secondary neutron dose equivalent to therapeutic primary proton dose (H/D) ranged from 0.3 ± 0.08 mSv/Gy for the 78 MeV proton beam to 37.4 ± 2.42 mSv/Gy for the 226 MeV proton beam. Both experiment and simulation showed a similar decreasing trend in dose equivalent with distance from the central axis, and the magnitudes varied by a factor of about 2 in most locations. H/D was found to increase as the energy of the primary proton beam increased, and higher H/D was observed at 135° compared to 45° and 90°. The overall higher H/D in air indicates the predominance of external neutrons produced in the nozzle rather than inside the body.
NASA Astrophysics Data System (ADS)
Mishev, A. L.; Velinov, P. I. Y.
2014-12-01
In the last few years, essential progress has been achieved in the development of physical models for cosmic ray induced ionization in the atmosphere. The majority of these models are full-target models, i.e. based on Monte Carlo simulation of the electromagnetic-muon-nucleon cascade in the atmosphere. In most cases the contribution of primary protons is highlighted, i.e. the contribution of primary cosmic ray α-particles and heavy nuclei to the atmospheric ionization is neglected or scaled to protons. The development of a cosmic ray induced atmospheric cascade is sensitive to the energy and mass of the primary cosmic ray particle. The largest uncertainties in Monte Carlo simulations of a cascade in the Earth's atmosphere are due to the assumed hadron interaction models, the so-called hadron generators. In the work presented here we compare the ionization yield functions Y for primary cosmic ray nuclei, such as α-particles, Oxygen and Iron nuclei, assuming different hadron interaction models. The computations were carried out with the CORSIKA 6.9 code using the GHEISHA 2002, FLUKA 2011 and UrQMD hadron generators for energies below 80 GeV/nucleon and QGSJET II for energies above 80 GeV/nucleon. The observed differences between the hadron generators are discussed in detail. The influence of different atmospheric parametrizations, namely the US standard atmosphere and its winter and summer profiles, on the ion production rate is studied. Assuming a realistic primary cosmic ray mass composition, the ion production rate is obtained at several rigidity cut-offs - from 1 GV (high latitudes) to 15 GV (equatorial latitudes) - using the various hadron generators. The computations are compared with experimental data. A conclusion concerning the consistency of the hadron generators is stated.
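The quantity being compared can be pictured as follows: the ion production rate at a given atmospheric depth is the primary spectrum folded with the ionization yield function, integrated above the local cut-off energy and summed over species. The sketch below uses crude analytic placeholders for both ingredients; it is not CORSIKA output.

```python
# Sketch of how an ion production rate is assembled from an ionization yield function:
# q(h) = sum_i integral_{E_cut}^{E_max} J_i(E) * Y_i(E, h) dE. The spectrum and yield
# function below are crude analytic placeholders (single species, single depth).
import numpy as np

def primary_spectrum(e_gev_per_nucleon):
    """Placeholder power-law proton spectrum (particles m^-2 s^-1 sr^-1 (GeV/n)^-1)."""
    return 1.8e4 * e_gev_per_nucleon**-2.7

def yield_function(e_gev_per_nucleon):
    """Placeholder yield Y(E) at one fixed atmospheric depth (ion pairs per primary)."""
    return 1.0e5 * e_gev_per_nucleon**1.2

def ion_production_rate(e_cutoff_gev, e_max_gev=100.0, n=2000):
    e = np.logspace(np.log10(e_cutoff_gev), np.log10(e_max_gev), n)
    return np.trapz(primary_spectrum(e) * yield_function(e), e)

# Proton kinetic-energy cut-off from rigidity cut-off: E = sqrt(R^2 + m_p^2) - m_p.
for rc_gv, e_cut in [(1.0, 0.43), (15.0, 14.1)]:
    print(f"Rc = {rc_gv:4.1f} GV -> q ~ {ion_production_rate(e_cut):.3e} (arbitrary units)")
```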
MCNP capabilities for nuclear well logging calculations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Forster, R.A.; Little, R.C.; Briesmeister, J.F.
The Los Alamos Radiation Transport Code System (LARTCS) consists of state-of-the-art Monte Carlo and discrete ordinates transport codes and data libraries. This paper discusses how the general-purpose continuous-energy Monte Carlo code MCNP (Monte Carlo neutron photon), part of the LARTCS, provides a computational predictive capability for many applications of interest to the nuclear well logging community. The generalized three-dimensional geometry of MCNP is well suited for borehole-tool models. SABRINA, another component of the LARTCS, is a graphics code that can be used to interactively create a complex MCNP geometry. Users can define many source and tally characteristics with standard MCNP features. The time-dependent capability of the code is essential when modeling pulsed sources. Problems with neutrons, photons, and electrons as either single particles or coupled particles can be calculated with MCNP. The physics of neutron and photon transport and interactions is modeled in detail using the latest available cross-section data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chatzidakis, Stylianos; Greulich, Christopher
A cosmic ray Muon Flexible Framework for Spectral GENeration for Monte Carlo Applications (MUFFSgenMC) has been developed to support state-of-the-art cosmic ray muon tomographic applications. The flexible framework allows for easy and fast creation of source terms for popular Monte Carlo applications like GEANT4 and MCNP. This code framework simplifies the process of simulations used for cosmic ray muon tomography.
NASA Astrophysics Data System (ADS)
Prabhu Verleker, Akshay; Fang, Qianqian; Choi, Mi-Ran; Clare, Susan; Stantz, Keith M.
2015-03-01
The purpose of this study is to develop an alternative empirical approach to estimate near-infrared (NIR) photon propagation and quantify optically induced drug release in brain metastases, without relying on computationally expensive Monte Carlo techniques (the gold standard). Targeted drug delivery with optically induced drug release is a noninvasive means to treat cancers and metastases. This study is part of a larger project to treat brain metastases by delivering lapatinib-drug nanocomplexes and activating NIR-induced drug release. The empirical model was developed using a weighted approach to estimate photon scattering in tissues and calibrated using a GPU-based 3D Monte Carlo code. The empirical model was developed and tested against Monte Carlo simulations in optical brain phantoms for pencil beams (width 1 mm) and broad beams (width 10 mm). The empirical algorithm was tested against the Monte Carlo code for different albedos, along with the diffusion equation, in simulated brain phantoms resembling white matter (μs'=8.25 mm^-1, μa=0.005 mm^-1) and gray matter (μs'=2.45 mm^-1, μa=0.035 mm^-1) at a wavelength of 800 nm. The goodness of fit between the two models was determined using the coefficient of determination (R-squared analysis). Preliminary results show that the empirical algorithm matches the Monte Carlo simulated fluence over a wide range of albedos (0.7 to 0.99), while the diffusion equation fails at lower albedos. The photon fluence generated by the empirical code matched the Monte Carlo in homogeneous phantoms (R2=0.99). While the GPU-based Monte Carlo achieved 300X acceleration compared to earlier CPU-based models, the empirical code is 700X faster than the Monte Carlo for a typical super-Gaussian laser beam.
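For orientation, the sketch below evaluates the standard infinite-medium diffusion-theory point-source fluence (the approximation that, as noted above, breaks down at low albedo) for the quoted white- and gray-matter optical properties, together with the R-squared metric used to score agreement; it is illustrative only and not the authors' empirical algorithm.

```python
# Sketch: diffusion-theory point-source fluence for the quoted optical properties, plus
# the R-squared metric used to compare fluence distributions. Illustrative only; the
# "Monte Carlo" stand-in is a noisy copy of the diffusion curve, not real MC output.
import numpy as np

def diffusion_fluence(r_mm, mu_a, mu_s_prime):
    """Infinite-medium point-source solution: phi(r) = exp(-mu_eff r) / (4 pi D r)."""
    d = 1.0 / (3.0 * (mu_a + mu_s_prime))
    mu_eff = np.sqrt(3.0 * mu_a * (mu_a + mu_s_prime))
    return np.exp(-mu_eff * r_mm) / (4.0 * np.pi * d * r_mm)

def r_squared(reference, model):
    ss_res = np.sum((reference - model) ** 2)
    ss_tot = np.sum((reference - reference.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

r = np.linspace(1.0, 20.0, 50)                             # radial distance (mm)
white = diffusion_fluence(r, mu_a=0.005, mu_s_prime=8.25)  # reduced albedo ~0.999
gray  = diffusion_fluence(r, mu_a=0.035, mu_s_prime=2.45)  # reduced albedo ~0.986

mc_like = white * (1.0 + 0.02 * np.random.default_rng(0).standard_normal(r.size))
print(f"R^2 against a 2%-noise stand-in for Monte Carlo: {r_squared(mc_like, white):.4f}")
```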
Capabilities overview of the MORET 5 Monte Carlo code
NASA Astrophysics Data System (ADS)
Cochet, B.; Jinaphanh, A.; Heulers, L.; Jacquet, O.
2014-06-01
The MORET code is a simulation tool that solves the transport equation for neutrons using the Monte Carlo method. It allows users to model complex three-dimensional geometrical configurations, describe the materials, and define their own tallies in order to analyse the results. The MORET code was initially designed to perform calculations for criticality safety assessments. New features have been introduced in the MORET 5 code to expand its use to reactor applications. This paper presents an overview of the MORET 5 code capabilities, going through the description of materials, the geometry modelling, the transport simulation and the definition of the outputs.
Forward Neutron Production at the Fermilab Main Injector
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nigmanov, T.S. (Michigan U.); Rajaram, D.
2010-10-01
We have measured cross sections for forward neutron production from a variety of targets using proton beams from the Fermilab Main Injector. Measurements were performed for proton beam momenta of 58 GeV/c, 84 GeV/c, and 120 GeV/c. The cross section dependence on the atomic weight (A) of the targets was found to vary as A^a, where a is 0.46 ± 0.06 for a beam momentum of 58 GeV/c and 0.54 ± 0.05 for 120 GeV/c. The cross sections show reasonable agreement with the FLUKA and DPMJET Monte Carlos. Comparisons have also been made with the LAQGSM Monte Carlo. The MIPP (Main Injector Particle Production) experiment (FNAL E907) [1] acquired data in the Meson Center beam line at Fermilab. The primary purposes of the experiment were to investigate scaling laws in hadron fragmentation [2], to obtain hadron production data for the NuMI (Neutrinos at the Main Injector [3]) target to be used for calculating neutrino fluxes, and to obtain inclusive pion, neutron, and photon production data to facilitate proton radiography [4]. While there is considerable data available on inclusive charged particle production [5], there is little data on neutron production. In this article we present results for forward neutron production using proton beams of 58 GeV/c, 84 GeV/c, and 120 GeV/c on hydrogen, beryllium, carbon, bismuth, and uranium targets, and compare these data with predictions from Monte Carlo simulations.
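The A^a parameterization corresponds to a straight-line fit in log-log space; the sketch below demonstrates such a fit with made-up cross-section values (placeholders only, not MIPP data).

```python
# Sketch of the A^a parameterization: a log-log least-squares fit of cross section
# versus target atomic weight. The sigma values are made-up placeholders chosen only
# to demonstrate the fitting step, not MIPP measurements.
import numpy as np

atomic_weight = np.array([1.008, 9.012, 12.011, 208.98, 238.03])   # H, Be, C, Bi, U
sigma_mb = np.array([32.0, 88.0, 101.0, 372.0, 401.0])             # placeholder values (mb)

slope, intercept = np.polyfit(np.log(atomic_weight), np.log(sigma_mb), 1)
print(f"fitted exponent a = {slope:.2f}, prefactor = {np.exp(intercept):.1f} mb")
```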
DOE Office of Scientific and Technical Information (OSTI.GOV)
Giuseppe Palmiotti
In this work, the implementation of a collision history-based approach to sensitivity/perturbation calculations in the Monte Carlo code SERPENT is discussed. The proposed methods allow the calculation of the effects of nuclear data perturbations on several response functions: the effective multiplication factor, reaction rate ratios and bilinear ratios (e.g., effective kinetics parameters). SERPENT results are compared to ERANOS and TSUNAMI Generalized Perturbation Theory calculations for two fast metallic systems and for a PWR pin-cell benchmark. New methods for the calculation of sensitivities to angular scattering distributions are also presented, which adopt fully continuous (in energy and angle) Monte Carlo estimators.
Monte Carlo Particle Lists: MCPL
NASA Astrophysics Data System (ADS)
Kittelmann, T.; Klinkby, E.; Knudsen, E. B.; Willendrup, P.; Cai, X. X.; Kanaki, K.
2017-09-01
A binary format with lists of particle state information, for interchanging particles between various Monte Carlo simulation applications, is presented. Portable C code for file manipulation is made available to the scientific community, along with converters and plugins for several popular simulation packages.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moskvin, V; Pirlepesov, F; Farr, J
2016-06-15
Purpose: Dose-weighted linear energy transfer (dLET) has been shown to be useful for the analysis of late effects in proton therapy. This study presents the results of testing the dLET concept for intensity modulated proton therapy (IMPT) with a discrete spot scanning beam system without use of an aperture or compensator (AC). Methods: IMPT (no AC) and broad beams (BB) with AC were simulated in the TOPAS and FLUKA code systems. Information from the independently tested Monte Carlo Damage Simulation (MCDS) was integrated into the FLUKA code system to account for spatial variations in the RBE for protons and other light ions, using an endpoint of DNA double strand break (DSB) induction. Results: The proton spectra for IMPT beams at depths beyond the distal edge contain a tail of high energy protons up to 100 MeV. The integral of the tail is comparable to the number of 5–8 MeV protons at the tip of the Bragg peak (BP). The dose averaged energy (dEav) decreases to 7 MeV at the tip of the BP and then increases to about 15 MeV beyond the distal edge. Neutrons produced in the nozzle are two orders of magnitude higher for BB with AC than for IMPT in the low-energy part of the spectra. The dLET values beyond the distal edge of the BP are 5 times larger for IMPT than for BB with the AC. In contrast, negligible differences are seen in the RBE estimates for IMPT and BB with AC beyond the distal edge of the BP. Conclusion: The analysis of late effects in IMPT with spot scanning, and in double scattering or scanning techniques with AC, may require both dLET and RBE as quantitative parameters to characterize effects beyond the distal edge of the BP.
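The dose-weighted (dose-averaged) LET used above is simply the dose-weighted mean of the LET spectrum in a voxel, dLET = Σ D_i LET_i / Σ D_i. A minimal sketch, with hypothetical per-bin values:

```python
import numpy as np

def dose_averaged_let(dose_per_bin, let_per_bin):
    """Dose-weighted (dose-averaged) LET: sum(D_i * LET_i) / sum(D_i)."""
    dose_per_bin = np.asarray(dose_per_bin, dtype=float)
    let_per_bin = np.asarray(let_per_bin, dtype=float)
    return np.sum(dose_per_bin * let_per_bin) / np.sum(dose_per_bin)

# Hypothetical per-energy-bin dose deposits (Gy) and LET values (keV/um) in one voxel
print(dose_averaged_let([0.2, 0.5, 0.3], [1.5, 4.0, 10.0]))
```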
NASA Astrophysics Data System (ADS)
Kramer, S. L.; Ghosh, V. J.; Breitfeller, M.; Wahl, W.
2016-11-01
Third generation high brightness light sources are designed to have low emittance and high current beams, which contribute to higher beam loss rates that will be compensated by Top-Off injection. Shielding for these higher loss rates will be critical to protect users, given the projected higher occupancy factors. Top-Off injection requires a full energy injector, which will demand greater consideration of the potential abnormal beam mis-steering and localized losses that could occur. The high energy electron injection beam produces a significantly higher neutron component dose to the experimental floor than lower energy beam injection and ramped operations. Minimizing this dose will require adequate knowledge of where the mis-steered beam can occur and sufficient EM shielding close to the loss point, in order to attenuate the energy of the particles in the EM shower below the neutron production threshold (<10 MeV), which spreads the incident energy over the bulk shield walls and thereby reduces the dose penetrating them. Designing supplemental shielding near the loss point using the analytic shielding model is shown to be inadequate because of its lack of geometry specification for the EM shower process. Predicting the dose rates outside the tunnel requires a detailed description of the geometry and materials that the beam losses will encounter inside the tunnel. Modern radiation shielding Monte Carlo codes, like FLUKA, can handle this geometric description of the radiation transport process in sufficient detail, allowing accurate predictions of the expected dose rates and the ability to show weaknesses in the design before a high radiation incident occurs. The effort required to adequately define the accelerator geometry for these codes has been greatly reduced with the implementation of the FLAIR graphical interface to FLUKA. This made the shielding process for NSLS-II quite accurate and reliable. The principles used to provide supplemental shielding to the NSLS-II accelerators and the lessons learned from this process are presented.
Positron follow-up in liquid water: I. A new Monte Carlo track-structure code.
Champion, C; Le Loirec, C
2006-04-07
When biological matter is irradiated by charged particles, a wide variety of interactions occur, which lead to a deep modification of the cellular environment. To understand the fine structure of the microscopic distribution of energy deposits, Monte Carlo event-by-event simulations are particularly suitable. However, the development of these track-structure codes needs accurate interaction cross sections for all the electronic processes: ionization, excitation, positronium formation and even elastic scattering. Under these conditions, we have recently developed a Monte Carlo code for positrons in water, the latter being commonly used to simulate the biological medium. All the processes are studied in detail via theoretical differential and total cross-section calculations performed by using partial wave methods. Comparisons with existing theoretical and experimental data in terms of stopping powers, mean energy transfers and ranges show very good agreements. Moreover, thanks to the theoretical description of positronium formation, we have access, for the first time, to the complete kinematics of the electron capture process. Then, the present Monte Carlo code is able to describe the detailed positronium history, which will provide useful information for medical imaging (like positron emission tomography) where improvements are needed to define with the best accuracy the tumoural volumes.
NASA Astrophysics Data System (ADS)
Chiavassa, S.; Aubineau-Lanièce, I.; Bitar, A.; Lisbona, A.; Barbet, J.; Franck, D.; Jourdain, J. R.; Bardiès, M.
2006-02-01
Dosimetric studies are necessary for all patients treated with targeted radiotherapy. In order to attain the precision required, we have developed Oedipe, a dosimetric tool based on the MCNPX Monte Carlo code. The anatomy of each patient is considered in the form of a voxel-based geometry created using computed tomography (CT) images or magnetic resonance imaging (MRI). Oedipe enables dosimetry studies to be carried out at the voxel scale. Validation of the results obtained by comparison with existing methods is complex because there are multiple sources of variation: calculation methods (different Monte Carlo codes, point kernel), patient representations (model or specific) and geometry definitions (mathematical or voxel-based). In this paper, we validate Oedipe by taking each of these parameters into account independently. Monte Carlo methodology requires long calculation times, particularly in the case of voxel-based geometries, and this is one of the limits of personalized dosimetric methods. However, our results show that the use of voxel-based geometry as opposed to a mathematically defined geometry decreases the calculation time two-fold, due to an optimization of the MCNPX2.5e code. It is therefore possible to envisage the use of Oedipe for personalized dosimetry in the clinical context of targeted radiotherapy.
Effect of the multiple scattering of electrons in Monte Carlo simulation of LINACS.
Vilches, Manuel; García-Pareja, Salvador; Guerrero, Rafael; Anguiano, Marta; Lallena, Antonio M
2008-01-01
Results obtained from Monte Carlo simulations of the transport of electrons in thin slabs of dense material media and air slabs with different widths are analyzed. Various general purpose Monte Carlo codes have been used: PENELOPE, GEANT3, GEANT4, EGSNRC, MCNPX. Non-negligible differences between the angular and radial distributions after the slabs have been found. The effects of these differences on the depth doses measured in water are also discussed.
Combined experimental and Monte Carlo verification of brachytherapy plans for vaginal applicators
NASA Astrophysics Data System (ADS)
Sloboda, Ron S.; Wang, Ruqing
1998-12-01
Dose rates in a phantom around a shielded and an unshielded vaginal applicator containing Selectron low-dose-rate sources were determined by experiment and Monte Carlo simulation. Measurements were performed with thermoluminescent dosimeters in a white polystyrene phantom using an experimental protocol geared for precision. Calculations for the same set-up were done using a version of the EGS4 Monte Carlo code system modified for brachytherapy applications into which a new combinatorial geometry package developed by Bielajew was recently incorporated. Measured dose rates agree with Monte Carlo estimates to within 5% (1 SD) for the unshielded applicator, while highlighting some experimental uncertainties for the shielded applicator. Monte Carlo calculations were also done to determine a value for the effective transmission of the shield required for clinical treatment planning, and to estimate the dose rate in water at points in axial and sagittal planes transecting the shielded applicator. Comparison with dose rates generated by the planning system indicates that agreement is better than 5% (1 SD) at most positions. The precision thermoluminescent dosimetry protocol and modified Monte Carlo code are effective complementary tools for brachytherapy applicator dosimetry.
Lee, Anthony; Yau, Christopher; Giles, Michael B.; Doucet, Arnaud; Holmes, Christopher C.
2011-01-01
We present a case-study on the utility of graphics cards to perform massively parallel simulation of advanced Monte Carlo methods. Graphics cards, containing multiple Graphics Processing Units (GPUs), are self-contained parallel computational devices that can be housed in conventional desktop and laptop computers and can be thought of as prototypes of the next generation of many-core processors. For certain classes of population-based Monte Carlo algorithms they offer massively parallel simulation, with the added advantage over conventional distributed multi-core processors that they are cheap, easily accessible, easy to maintain, easy to code, dedicated local devices with low power consumption. On a canonical set of stochastic simulation examples including population-based Markov chain Monte Carlo methods and Sequential Monte Carlo methods, we find speedups from 35 to 500 fold over conventional single-threaded computer code. Our findings suggest that GPUs have the potential to facilitate the growth of statistical modelling into complex data rich domains through the availability of cheap and accessible many-core computation. We believe the speedup we observe should motivate wider use of parallelizable simulation methods and greater methodological attention to their design. PMID:22003276
FLUKA: A Multi-Particle Transport Code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferrari, A.; Sala, P.R.; /CERN /INFN, Milan
2005-12-14
This report describes the 2005 version of the Fluka particle transport code. The first part introduces the basic notions, describes the modular structure of the system, and contains an installation and beginner's guide. The second part complements this initial information with details about the various components of Fluka and how to use them. It concludes with a detailed history and bibliography.
NASA Astrophysics Data System (ADS)
Bencheikh, Mohamed; Maghnouj, Abdelmajid; Tajmouati, Jaouad
2017-11-01
The Monte Carlo calculation method is considered to be the most accurate method for dose calculation in radiotherapy and for beam characterization investigations. In this study, the Varian Clinac 2100 medical linear accelerator with and without flattening filter (FF) was modelled. The objective of this study was to determine the flattening filter impact on particles' energy properties at the phantom surface in terms of energy fluence, mean energy, and energy fluence distribution. The Monte Carlo codes used in this study were the BEAMnrc code for simulating the linac head, the DOSXYZnrc code for simulating the absorbed dose in a water phantom, and BEAMDP for extracting energy properties. The field size was 10 × 10 cm2, the simulated photon beam energy was 6 MV and the SSD was 100 cm. The Monte Carlo geometry was validated by a gamma index acceptance rate of 99% in PDD and 98% in dose profiles; the gamma criteria were 3% for dose difference and 3 mm for distance to agreement. Without the FF, the energy properties changed as follows: the electron contribution increased by more than 300% in energy fluence, almost 14% in mean energy and 1900% in energy fluence distribution, while the photon contribution increased by 50% in energy fluence, almost 18% in mean energy and almost 35% in energy fluence distribution. Removing the flattening filter increases the electron contamination energy relative to the photon energy; this study can contribute to the development of flattening-filter-free configurations in future linacs.
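The 3%/3 mm gamma index criterion used for the validation above combines a dose-difference and a distance-to-agreement test. A minimal 1D sketch, with hypothetical dose profiles and global normalization to the reference maximum:

```python
import numpy as np

def gamma_index_1d(x_ref, d_ref, x_eval, d_eval, dd=0.03, dta=3.0):
    """Global 1D gamma index: dd is the dose criterion (fraction of the
    reference maximum), dta the distance-to-agreement in mm."""
    d_norm = dd * d_ref.max()
    gammas = np.empty_like(d_ref)
    for i, (xr, dr) in enumerate(zip(x_ref, d_ref)):
        dist2 = ((x_eval - xr) / dta) ** 2
        dose2 = ((d_eval - dr) / d_norm) ** 2
        gammas[i] = np.sqrt(np.min(dist2 + dose2))
    return gammas

# Hypothetical profiles (positions in mm, doses in arbitrary units)
x = np.linspace(-50, 50, 201)
ref = np.exp(-(x / 30.0) ** 4)
ev  = 1.01 * np.exp(-((x - 0.5) / 30.0) ** 4)
g = gamma_index_1d(x, ref, x, ev)
print(f"gamma pass rate: {100.0 * np.mean(g <= 1.0):.1f}%")
```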
Space-radiation-induced Photon Luminescence of the Moon
NASA Technical Reports Server (NTRS)
Wilson, Thomas; Lee, Kerry
2008-01-01
We report on the results of a study of the photon luminescence of the Moon induced by Galactic Cosmic Rays (GCRs) and space radiation from the Sun, using the Monte Carlo program FLUKA. The model of the lunar surface is taken to be the chemical composition of soils found at various landing sites during the Apollo and Luna programs, averaged over all such sites to define a generic regolith for the present analysis. This then becomes the target that is bombarded by GCRs and Solar Energetic Particles (SEPs) above 1 keV in FLUKA to determine the photon fluence albedo produced by the Moon's surface in the absence of sunlight and Earthshine. This is to be distinguished from the gamma-ray spectrum produced by the radioactive decay of radiogenic constituents lying in the surface and interior of the Moon. From the photon fluence we derive the spectrum, which can be utilized to examine existing lunar spectral data and to design orbiting instrumentation for measuring various components of the space-radiation-induced photon luminescence present on the Moon.
Applying Quantum Monte Carlo to the Electronic Structure Problem
NASA Astrophysics Data System (ADS)
Powell, Andrew D.; Dawes, Richard
2016-06-01
Two distinct types of Quantum Monte Carlo (QMC) calculations are applied to electronic structure problems such as calculating potential energy curves and producing benchmark values for reaction barriers. First, Variational and Diffusion Monte Carlo (VMC and DMC) methods using a trial wavefunction subject to the fixed node approximation were tested using the CASINO code.[1] Next, Full Configuration Interaction Quantum Monte Carlo (FCIQMC), along with its initiator extension (i-FCIQMC), was tested using the NECI code.[2] FCIQMC seeks the FCI energy for a specific basis set. At a reduced cost, the efficient i-FCIQMC method can be applied to systems in which the standard FCIQMC approach proves to be too costly. Since all of these methods are statistical approaches, uncertainties (error-bars) are introduced for each calculated energy. This study tests the performance of the methods relative to traditional quantum chemistry for some benchmark systems. References: [1] R. J. Needs et al., J. Phys.: Condensed Matter 22, 023201 (2010). [2] G. H. Booth et al., J. Chem. Phys. 131, 054106 (2009).
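As an illustration of the variational Monte Carlo idea (not of the CASINO or NECI codes themselves), the following minimal sketch samples |ψ|² with Metropolis moves for a 1D harmonic oscillator and averages the local energy; the trial wavefunction exp(-αx²) is exact at α = 0.5:

```python
import numpy as np

rng = np.random.default_rng(0)

def local_energy(x, alpha):
    # Trial wavefunction psi = exp(-alpha*x^2) for H = -1/2 d^2/dx^2 + 1/2 x^2
    return alpha + x * x * (0.5 - 2.0 * alpha * alpha)

def vmc_energy(alpha, n_steps=100_000, step=1.0):
    x, e_sum, n_acc = 0.0, 0.0, 0
    for _ in range(n_steps):
        x_new = x + step * rng.uniform(-1.0, 1.0)
        # Metropolis acceptance on |psi|^2 = exp(-2*alpha*x^2)
        if rng.random() < np.exp(-2.0 * alpha * (x_new**2 - x**2)):
            x, n_acc = x_new, n_acc + 1
        e_sum += local_energy(x, alpha)
    return e_sum / n_steps, n_acc / n_steps

for alpha in (0.4, 0.5, 0.6):
    e, acc = vmc_energy(alpha)
    print(f"alpha={alpha:.1f}  <E>={e:.4f}  acceptance={acc:.2f}")
```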
High-Fidelity Coupled Monte-Carlo/Thermal-Hydraulics Calculations
NASA Astrophysics Data System (ADS)
Ivanov, Aleksandar; Sanchez, Victor; Ivanov, Kostadin
2014-06-01
Monte Carlo methods have been used as reference reactor physics calculation tools worldwide. The advance in computer technology allows the calculation of detailed flux distributions in both space and energy. In most cases, however, those calculations are done under the assumption of homogeneous material density and temperature distributions. The aim of this work is to develop a consistent methodology for providing realistic three-dimensional thermal-hydraulic distributions by coupling the in-house developed sub-channel code SUBCHANFLOW with the standard Monte Carlo transport code MCNP. In addition to the innovative technique of on-the-fly material definition, a flux-based weight-window technique has been introduced to improve both the magnitude and the distribution of the relative errors. Finally, a coupled code system for the simulation of steady-state reactor physics problems has been developed. Besides the problem of effective feedback data interchange between the codes, the treatment of the temperature dependence of the continuous energy nuclear data has been investigated.
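Coupling of this kind typically amounts to a fixed-point (Picard) iteration that exchanges power and temperature/density fields between the two solvers until the feedback converges. A minimal sketch with stand-in solver functions (not the actual MCNP or SUBCHANFLOW interfaces; all numbers are illustrative):

```python
import numpy as np

def neutronics_solve(fuel_temp):
    """Stand-in for a Monte Carlo transport run: returns a normalized axial
    power shape that flattens slightly as temperature rises (a crude
    Doppler-feedback caricature)."""
    z = np.linspace(0.0, 1.0, fuel_temp.size)
    shape = np.sin(np.pi * z) / (1.0 + 1e-4 * (fuel_temp - 600.0))
    return shape / shape.mean()

def thermal_hydraulics_solve(power):
    """Stand-in for the sub-channel code: maps local power to fuel temperature."""
    return 600.0 + 300.0 * power

temp = np.full(20, 600.0)          # initial guess, K
for it in range(50):
    power = neutronics_solve(temp)
    new_temp = thermal_hydraulics_solve(power)
    # Under-relaxation stabilizes the Picard iteration
    new_temp = 0.5 * temp + 0.5 * new_temp
    if np.max(np.abs(new_temp - temp)) < 0.1:
        break
    temp = new_temp
print(f"converged after {it + 1} iterations, peak fuel T = {temp.max():.1f} K")
```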
PEPSI — a Monte Carlo generator for polarized leptoproduction
NASA Astrophysics Data System (ADS)
Mankiewicz, L.; Schäfer, A.; Veltri, M.
1992-09-01
We describe PEPSI (Polarized Electron Proton Scattering Interactions), a Monte Carlo program for polarized deep inelastic leptoproduction mediated by electromagnetic interaction, and explain how to use it. The code is a modification of the LEPTO 4.3 Lund Monte Carlo for unpolarized scattering. The hard virtual gamma-parton scattering is generated according to the polarization-dependent QCD cross-section at first order in α_S. PEPSI requires the standard polarization-independent JETSET routines to simulate the fragmentation into final hadrons.
The Monte Carlo code MCPTV--Monte Carlo dose calculation in radiation therapy with carbon ions.
Karg, Juergen; Speer, Stefan; Schmidt, Manfred; Mueller, Reinhold
2010-07-07
The Monte Carlo code MCPTV is presented. MCPTV is designed for dose calculation in treatment planning in radiation therapy with particles and especially carbon ions. MCPTV has a voxel-based concept and can perform a fast calculation of the dose distribution on patient CT data. Material and density information from CT are taken into account. Electromagnetic and nuclear interactions are implemented. Furthermore the algorithm gives information about the particle spectra and the energy deposition in each voxel. This can be used to calculate the relative biological effectiveness (RBE) for each voxel. Depth dose distributions are compared to experimental data giving good agreement. A clinical example is shown to demonstrate the capabilities of the MCPTV dose calculation.
NASA Astrophysics Data System (ADS)
Zoller, Christian; Hohmann, Ansgar; Ertl, Thomas; Kienle, Alwin
2017-07-01
The Monte Carlo method is often referred to as the gold standard for calculating light propagation in turbid media [1]. Especially for complex shaped geometries, where no analytical solutions are available, the Monte Carlo method becomes very important [1, 2]. In this work a Monte Carlo software package is presented to simulate light propagation in complex shaped geometries. To improve the simulation time the code is based on OpenCL, such that graphics cards can be used as well as other computing devices. Within the software an illumination concept is presented to easily realize all kinds of light sources, such as spatial frequency domain (SFD) illumination, optical fibers or Gaussian beam profiles. Moreover, different objects that are not connected to each other can be considered simultaneously, without any additional preprocessing. This Monte Carlo software can be used for many applications. In this work the transmission spectrum of a tooth and the color reconstruction of a virtual object are shown, using results from the Monte Carlo software.
Gorshkov, Anton V; Kirillin, Mikhail Yu
2015-08-01
Over two decades, the Monte Carlo technique has become a gold standard in the simulation of light propagation in turbid media, including biotissues. Technological solutions provide further advances of this technique. The Intel Xeon Phi coprocessor is a new type of accelerator for highly parallel general purpose computing, which allows the execution of a wide range of applications without substantial code modification. We present a technical approach to porting our previously developed Monte Carlo (MC) code for simulation of light transport in tissues to the Intel Xeon Phi coprocessor. We show that employing the accelerator allows reducing the computational time of MC simulation and obtaining a simulation speed-up comparable to GPU. We demonstrate the performance of the developed code for simulation of light transport in the human head and determination of the measurement volume in near-infrared spectroscopy brain sensing.
Preliminary results of 3D dose calculations with MCNP-4B code from a SPECT image.
Rodríguez Gual, M; Lima, F F; Sospedra Alfonso, R; González González, J; Calderón Marín, C
2004-01-01
Interface software was developed to generate the input file to run the Monte Carlo MCNP-4B code from a medical image in Interfile format version 3.3. The software was tested using a spherical phantom of tomography slices with a known cumulated activity distribution in Interfile format, generated with the IMAGAMMA medical image processing system. The 3D dose calculation obtained with the Monte Carlo MCNP-4B code was compared with the voxel S factor method. The results show a relative error between both methods of less than 1%.
Monte Carlo method for calculating the radiation skyshine produced by electron accelerators
NASA Astrophysics Data System (ADS)
Kong, Chaocheng; Li, Quanfeng; Chen, Huaibi; Du, Taibin; Cheng, Cheng; Tang, Chuanxiang; Zhu, Li; Zhang, Hui; Pei, Zhigang; Ming, Shenjin
2005-06-01
Using the MCNP4C Monte Carlo code, the X-ray skyshine produced by 9 MeV, 15 MeV and 21 MeV electron linear accelerators was calculated with a new two-step method combined with the splitting and roulette variance reduction technique. Results of the Monte Carlo simulation, the empirical formulas used for skyshine calculation and the dose measurements were analyzed and compared. In conclusion, the skyshine dose measurements agreed reasonably with the results computed by the Monte Carlo method, but deviated from the computational results given by the empirical formulas. The effect on the skyshine dose caused by different structures of the accelerator head is also discussed in this paper.
The response of a bonner sphere spectrometer to charged hadrons.
Agosteo, S; Dimovasili, E; Fassò, A; Silari, M
2004-01-01
Bonner sphere spectrometers (BSSs) have been employed in neutron spectrometry and dosimetry for many years. Recent developments have seen the addition to a conventional BSS of one or more detectors (moderator plus thermal neutron counter) specifically designed to improve the overall response of the spectrometer to neutrons above 10 MeV. These additional detectors employ a shell of material with a high mass number (such as lead) within the polyethylene moderator, in order to slow down high-energy neutrons via (n,xn) reactions. A BSS can be used to measure neutron spectra both outside accelerator shielding and from an unshielded target. Measurements were recently performed at CERN of the neutron yield and spectral fluence at various angles from unshielded, semi-thick copper, silver and lead targets, bombarded by a mixed proton/pion beam with 40 GeV/c momentum. These experiments have provided evidence that under certain circumstances the use of lead-enriched moderators may present a problem: these detectors were found to have a significant response to the charged hadron component accompanying the neutrons emitted from the target. Conventional polyethylene moderators show a similar but less pronounced behaviour. These secondary hadrons interact with the moderator and generate neutrons, which are in turn detected by the counter. To investigate this effect and determine a correction factor to be applied to the unfolding procedure, a series of Monte Carlo simulations were performed with the FLUKA code. These simulations aimed at determining the response of the BSS to charged hadrons under the specific experimental situation. Following these results, a complete response matrix of the extended BSS to charged pions and protons was calculated with FLUKA. An experimental verification was carried out with a 120 GeV/c hadron beam at the CERF facility at CERN.
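Conceptually, each sphere reading is the fluence folded with a response function, M_i = Σ_j R_ij Φ_j, and the charged-hadron contribution adds to the same readings, so it can be subtracted before unfolding. A minimal sketch with hypothetical (not FLUKA-computed) response values and a simple square-system solve in place of a full unfolding algorithm:

```python
import numpy as np

# Hypothetical response matrix R[i, j]: counts per unit fluence in sphere i
# for neutron energy group j (the real matrices come from FLUKA).
R_neutron = np.array([[2.0, 0.6, 0.1],
                      [1.2, 1.5, 0.7],
                      [0.3, 0.9, 1.8]])
# Hypothetical response of the same spheres to the charged-hadron fluence.
r_charged = np.array([0.05, 0.20, 0.60])

phi_neutron = np.array([1.0e4, 5.0e3, 2.0e3])   # "true" group fluences (cm^-2)
phi_charged = 3.0e3                              # charged-hadron fluence (cm^-2)

# Measured readings contain both contributions
readings = R_neutron @ phi_neutron + r_charged * phi_charged

# Correct the readings before unfolding, then solve the (here square) system
corrected = readings - r_charged * phi_charged
phi_unfolded = np.linalg.solve(R_neutron, corrected)
print(phi_unfolded)   # recovers phi_neutron
```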
Cornelius, Iwan; Guatelli, Susanna; Fournier, Pauline; Crosbie, Jeffrey C; Sanchez Del Rio, Manuel; Bräuer-Krisch, Elke; Rosenfeld, Anatoly; Lerch, Michael
2014-05-01
Microbeam radiation therapy (MRT) is a synchrotron-based radiotherapy modality that uses high-intensity beams of spatially fractionated radiation to treat tumours. The rapid evolution of MRT towards clinical trials demands accurate treatment planning systems (TPS), as well as independent tools for the verification of TPS calculated dose distributions in order to ensure patient safety and treatment efficacy. Monte Carlo computer simulation represents the most accurate method of dose calculation in patient geometries and is best suited for the purpose of TPS verification. A Monte Carlo model of the ID17 biomedical beamline at the European Synchrotron Radiation Facility has been developed, including recent modifications, using the Geant4 Monte Carlo toolkit interfaced with the SHADOW X-ray optics and ray-tracing libraries. The code was benchmarked by simulating dose profiles in water-equivalent phantoms subject to irradiation by broad-beam (without spatial fractionation) and microbeam (with spatial fractionation) fields, and comparing against those calculated with a previous model of the beamline developed using the PENELOPE code. Validation against additional experimental dose profiles in water-equivalent phantoms subject to broad-beam irradiation was also performed. Good agreement between codes was observed, with the exception of out-of-field doses and toward the field edge for larger field sizes. Microbeam results showed good agreement between both codes and experimental results within uncertainties. Results of the experimental validation showed agreement for different beamline configurations. The asymmetry in the out-of-field dose profiles due to polarization effects was also investigated, yielding important information for the treatment planning process in MRT. This work represents an important step in the development of a Monte Carlo-based independent verification tool for treatment planning in MRT.
Optimization of the Monte Carlo code for modeling of photon migration in tissue.
Zołek, Norbert S; Liebert, Adam; Maniewski, Roman
2006-10-01
The Monte Carlo method is frequently used to simulate light transport in turbid media because of its simplicity and flexibility, allowing the analysis of complicated geometrical structures. Monte Carlo simulations are, however, time consuming because of the necessity to track the paths of individual photons. The time-consuming computation is mainly associated with the calculation of the logarithmic and trigonometric functions as well as the generation of pseudo-random numbers. In this paper, the Monte Carlo algorithm was developed and optimized by approximation of the logarithmic and trigonometric functions. The approximations were based on polynomial and rational functions, and the errors of these approximations are less than 1% of the values of the original functions. The proposed algorithm was verified by simulations of the time-resolved reflectance at several source-detector separations. The results of the calculation using the approximated algorithm were compared with those of the Monte Carlo simulations obtained with an exact computation of the logarithm and trigonometric functions, as well as with the solution of the diffusion equation. The errors of the moments of the simulated distributions of times of flight of photons (total number of photons, mean time of flight and variance) are less than 2% for a range of optical properties typical of living tissues. The proposed approximated algorithm speeds up the Monte Carlo simulations by a factor of 4. The developed code can be used on parallel machines, allowing for further acceleration.
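The logarithm enters photon-transport Monte Carlo through free-path sampling, s = -ln(ξ)/μ_t. A minimal sketch of the idea (not the authors' specific polynomial/rational approximations): fit a low-order polynomial to ln on the mantissa range and reconstruct ln(ξ) from the binary exponent, then compare sampled path lengths against the exact logarithm:

```python
import numpy as np

# Fit a low-order polynomial to ln(m) on the mantissa range [0.5, 1], then
# reconstruct ln(xi) from xi = m * 2**e as ln(m) + e*ln(2).
m_grid = np.linspace(0.5, 1.0, 1001)
coeffs = np.polyfit(m_grid, np.log(m_grid), 5)

def fast_log(x):
    m, e = np.frexp(x)                  # x = m * 2**e with m in [0.5, 1)
    return np.polyval(coeffs, m) + e * np.log(2.0)

rng = np.random.default_rng(1)
xi = rng.random(100_000)
mu_t = 10.0                             # total interaction coefficient, 1/cm

s_exact = -np.log(xi) / mu_t            # exact free path lengths
s_fast  = -fast_log(xi) / mu_t          # approximated free path lengths

print(f"mean free path: exact {s_exact.mean():.5f} cm, "
      f"approx {s_fast.mean():.5f} cm, "
      f"max |Delta s| = {np.abs(s_fast - s_exact).max():.2e} cm")
```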
2009-07-01
… simulation. The pilot described in this paper used this two-step approach within a Define, Measure, Analyze, Improve, and Control (DMAIC) framework to … Keywords: … networks, BBN, Monte Carlo simulation, DMAIC, Six Sigma, business case.
Akbari, Mahmoud Reza; Yousefnia, Hassan; Mirrezaei, Ehsan
2014-08-01
Water equivalent ratio (WER) was calculated for different proton energies in polymethyl methacrylate (PMMA), polystyrene (PS) and aluminum (Al) using FLUKA and SRIM codes. The results were compared with analytical, experimental and simulated SEICS code data obtained from the literature. The biggest difference between the codes was 3.19%, 1.9% and 0.67% for Al, PMMA and PS, respectively. FLUKA and SEICS had the greatest agreement (≤0.77% difference for PMMA and ≤1.08% difference for Al, respectively) with the experimental data. Copyright © 2014 Elsevier Ltd. All rights reserved.
Million-body star cluster simulations: comparisons between Monte Carlo and direct N-body
NASA Astrophysics Data System (ADS)
Rodriguez, Carl L.; Morscher, Meagan; Wang, Long; Chatterjee, Sourav; Rasio, Frederic A.; Spurzem, Rainer
2016-12-01
We present the first detailed comparison between million-body globular cluster simulations computed with a Hénon-type Monte Carlo code, CMC, and a direct N-body code, NBODY6++GPU. Both simulations start from an identical cluster model with 10^6 particles, and include all of the relevant physics needed to treat the system in a highly realistic way. With the two codes `frozen' (no fine-tuning of any free parameters or internal algorithms of the codes) we find good agreement in the overall evolution of the two models. Furthermore, we find that in both models, large numbers of stellar-mass black holes (>1000) are retained for 12 Gyr. Thus, the very accurate direct N-body approach confirms recent predictions that black holes can be retained in present-day, old globular clusters. We find only minor disagreements between the two models and attribute these to the small-N dynamics driving the evolution of the cluster core for which the Monte Carlo assumptions are less ideal. Based on the overwhelming general agreement between the two models computed using these vastly different techniques, we conclude that our Monte Carlo approach, which is more approximate, but dramatically faster compared to the direct N-body, is capable of producing an accurate description of the long-term evolution of massive globular clusters even when the clusters contain large populations of stellar-mass black holes.
NASA Astrophysics Data System (ADS)
Kim, Jeongnim; Baczewski, Andrew D.; Beaudet, Todd D.; Benali, Anouar; Chandler Bennett, M.; Berrill, Mark A.; Blunt, Nick S.; Josué Landinez Borda, Edgar; Casula, Michele; Ceperley, David M.; Chiesa, Simone; Clark, Bryan K.; Clay, Raymond C., III; Delaney, Kris T.; Dewing, Mark; Esler, Kenneth P.; Hao, Hongxia; Heinonen, Olle; Kent, Paul R. C.; Krogel, Jaron T.; Kylänpää, Ilkka; Li, Ying Wai; Lopez, M. Graham; Luo, Ye; Malone, Fionn D.; Martin, Richard M.; Mathuriya, Amrita; McMinis, Jeremy; Melton, Cody A.; Mitas, Lubos; Morales, Miguel A.; Neuscamman, Eric; Parker, William D.; Pineda Flores, Sergio D.; Romero, Nichols A.; Rubenstein, Brenda M.; Shea, Jacqueline A. R.; Shin, Hyeondeok; Shulenburger, Luke; Tillack, Andreas F.; Townsend, Joshua P.; Tubman, Norm M.; Van Der Goetz, Brett; Vincent, Jordan E.; ChangMo Yang, D.; Yang, Yubo; Zhang, Shuai; Zhao, Luning
2018-05-01
QMCPACK is an open source quantum Monte Carlo package for ab initio electronic structure calculations. It supports calculations of metallic and insulating solids, molecules, atoms, and some model Hamiltonians. Implemented real space quantum Monte Carlo algorithms include variational, diffusion, and reptation Monte Carlo. QMCPACK uses Slater–Jastrow type trial wavefunctions in conjunction with a sophisticated optimizer capable of optimizing tens of thousands of parameters. The orbital space auxiliary-field quantum Monte Carlo method is also implemented, enabling cross validation between different highly accurate methods. The code is specifically optimized for calculations with large numbers of electrons on the latest high performance computing architectures, including multicore central processing unit and graphical processing unit systems. We detail the program’s capabilities, outline its structure, and give examples of its use in current research calculations. The package is available at http://qmcpack.org.
TH-E-18A-01: Developments in Monte Carlo Methods for Medical Imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Badal, A; Zbijewski, W; Bolch, W
Monte Carlo simulation methods are widely used in medical physics research and are starting to be implemented in clinical applications such as radiation therapy planning systems. Monte Carlo simulations offer the capability to accurately estimate quantities of interest that are challenging to measure experimentally while taking into account the realistic anatomy of an individual patient. Traditionally, practical application of Monte Carlo simulation codes in diagnostic imaging was limited by the need for large computational resources or long execution times. However, recent advancements in high-performance computing hardware, combined with a new generation of Monte Carlo simulation algorithms and novel postprocessing methods, are allowing for the computation of relevant imaging parameters of interest such as patient organ doses and scatter-to-primary ratios in radiographic projections in just a few seconds using affordable computational resources. Programmable Graphics Processing Units (GPUs), for example, provide a convenient, affordable platform for parallelized Monte Carlo executions that yield simulation times on the order of 10^7 x-rays/s. Even with GPU acceleration, however, Monte Carlo simulation times can be prohibitive for routine clinical practice. To reduce simulation times further, variance reduction techniques can be used to alter the probabilistic models underlying the x-ray tracking process, resulting in lower variance in the results without biasing the estimates. Other complementary strategies for further reductions in computation time are denoising of the Monte Carlo estimates and estimating (scoring) the quantity of interest at a sparse set of sampling locations (e.g. at a small number of detector pixels in a scatter simulation) followed by interpolation. Beyond reduction of the computational resources required for performing Monte Carlo simulations in medical imaging, the use of accurate representations of patient anatomy is crucial to the virtual generation of medical images and accurate estimation of radiation dose and other imaging parameters. For this, detailed computational phantoms of the patient anatomy must be utilized and implemented within the radiation transport code. Computational phantoms presently come in one of three format types, and in one of four morphometric categories. Format types include stylized (mathematical equation-based), voxel (segmented CT/MR images), and hybrid (NURBS and polygon mesh surfaces). Morphometric categories include reference (small library of phantoms by age at 50th height/weight percentile), patient-dependent (larger library of phantoms at various combinations of height/weight percentiles), patient-sculpted (phantoms altered to match the patient's unique outer body contour), and finally, patient-specific (an exact representation of the patient with respect to both body contour and internal anatomy). The existence and availability of these phantoms represents a very important advance for the simulation of realistic medical imaging applications using Monte Carlo methods. New Monte Carlo simulation codes need to be thoroughly validated before they can be used to perform novel research. Ideally, the validation process would involve comparison of results with those of an experimental measurement, but accurate replication of experimental conditions can be very challenging. It is very common to validate new Monte Carlo simulations by replicating previously published simulation results of similar experiments.
This process, however, is commonly problematic due to the lack of sufficient information in the published reports of previous work so as to be able to replicate the simulation in detail. To aid in this process, the AAPM Task Group 195 prepared a report in which six different imaging research experiments commonly performed using Monte Carlo simulations are described and their results provided. The simulation conditions of all six cases are provided in full detail, with all necessary data on material composition, source, geometry, scoring and other parameters provided. The results of these simulations when performed with the four most common publicly available Monte Carlo packages are also provided in tabular form. The Task Group 195 Report will be useful for researchers needing to validate their Monte Carlo work, and for trainees needing to learn Monte Carlo simulation methods. In this symposium we will review the recent advancements in highperformance computing hardware enabling the reduction in computational resources needed for Monte Carlo simulations in medical imaging. We will review variance reduction techniques commonly applied in Monte Carlo simulations of medical imaging systems and present implementation strategies for efficient combination of these techniques with GPU acceleration. Trade-offs involved in Monte Carlo acceleration by means of denoising and “sparse sampling” will be discussed. A method for rapid scatter correction in cone-beam CT (<5 min/scan) will be presented as an illustration of the simulation speeds achievable with optimized Monte Carlo simulations. We will also discuss the development, availability, and capability of the various combinations of computational phantoms for Monte Carlo simulation of medical imaging systems. Finally, we will review some examples of experimental validation of Monte Carlo simulations and will present the AAPM Task Group 195 Report. Learning Objectives: Describe the advances in hardware available for performing Monte Carlo simulations in high performance computing environments. Explain variance reduction, denoising and sparse sampling techniques available for reduction of computational time needed for Monte Carlo simulations of medical imaging. List and compare the computational anthropomorphic phantoms currently available for more accurate assessment of medical imaging parameters in Monte Carlo simulations. Describe experimental methods used for validation of Monte Carlo simulations in medical imaging. Describe the AAPM Task Group 195 Report and its use for validation and teaching of Monte Carlo simulations in medical imaging.
Comparison of Geant4-DNA simulation of S-values with other Monte Carlo codes
NASA Astrophysics Data System (ADS)
André, T.; Morini, F.; Karamitros, M.; Delorme, R.; Le Loirec, C.; Campos, L.; Champion, C.; Groetz, J.-E.; Fromm, M.; Bordage, M.-C.; Perrot, Y.; Barberet, Ph.; Bernal, M. A.; Brown, J. M. C.; Deleuze, M. S.; Francis, Z.; Ivanchenko, V.; Mascialino, B.; Zacharatou, C.; Bardiès, M.; Incerti, S.
2014-01-01
Monte Carlo simulations of S-values have been carried out with the Geant4-DNA extension of the Geant4 toolkit. The S-values have been simulated for monoenergetic electrons with energies ranging from 0.1 keV up to 20 keV, in liquid water spheres (for four radii, chosen between 10 nm and 1 μm), and for electrons emitted by five isotopes of iodine (131, 132, 133, 134 and 135), in liquid water spheres of varying radius (from 15 μm up to 250 μm). The results have been compared to those obtained from other Monte Carlo codes and from other published data. The use of the Kolmogorov-Smirnov test has allowed confirming the statistical compatibility of all simulation results.
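The Kolmogorov-Smirnov comparison mentioned above can be reproduced in a few lines with SciPy; here the two samples stand in for outputs of two different Monte Carlo codes (hypothetical data, same underlying distribution by construction):

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(42)

# Two hypothetical sets of per-event energy deposits (keV) in a water sphere,
# standing in for the outputs of two different Monte Carlo codes.
edep_code_a = rng.gamma(shape=2.0, scale=1.5, size=5000)
edep_code_b = rng.gamma(shape=2.0, scale=1.5, size=5000)

stat, p_value = ks_2samp(edep_code_a, edep_code_b)
print(f"KS statistic = {stat:.4f}, p-value = {p_value:.3f}")
# A large p-value means we cannot reject that both codes sample the same distribution.
```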
Reducing statistical uncertainties in simulated organ doses of phantoms immersed in water
Hiller, Mauritius M.; Veinot, Kenneth G.; Easterly, Clay E.; ...
2016-08-13
In this study, methods are addressed to reduce the computational time to compute organ-dose rate coefficients using Monte Carlo techniques. Several variance reduction techniques are compared including the reciprocity method, importance sampling, weight windows and the use of the ADVANTG software package. For low-energy photons, the runtime was reduced by a factor of 10^5 when using the reciprocity method for kerma computation for immersion of a phantom in contaminated water. This is particularly significant since impractically long simulation times are required to achieve reasonable statistical uncertainties in organ dose for low-energy photons in this source medium and geometry. Although the MCNP Monte Carlo code is used in this paper, the reciprocity technique can be used equally well with other Monte Carlo codes.
A new dynamical atmospheric ionizing radiation (AIR) model for epidemiological studies
NASA Technical Reports Server (NTRS)
De Angelis, G.; Clem, J. M.; Goldhagen, P. E.; Wilson, J. W.
2003-01-01
A new Atmospheric Ionizing Radiation (AIR) model is currently being developed for use in radiation dose evaluation in epidemiological studies targeted to atmospheric flight personnel such as civilian airlines crewmembers. The model will allow computing values for biologically relevant parameters, e.g. dose equivalent and effective dose, for individual flights from 1945. Each flight is described by its actual three dimensional flight profile, i.e. geographic coordinates and altitudes varying with time. Solar modulated primary particles are filtered with a new analytical fully angular dependent geomagnetic cut off rigidity model, as a function of latitude, longitude, arrival direction, altitude and time. The particle transport results have been obtained with a technique based on the three-dimensional Monte Carlo transport code FLUKA, with a special procedure to deal with HZE particles. Particle fluxes are transformed into dose-related quantities and then integrated all along the flight path to obtain the overall flight dose. Preliminary validations of the particle transport technique using data from the AIR Project ER-2 flight campaign of measurements are encouraging. Future efforts will deal with modeling of the effects of the aircraft structure as well as inclusion of solar particle events. Published by Elsevier Ltd on behalf of COSPAR.
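The final step described above, integrating dose-related quantities along a flight profile, can be illustrated with a trapezoidal integration over time; the dose-rate function and the profile below are purely illustrative stand-ins, not the AIR model itself:

```python
import numpy as np

def dose_rate(lat_deg, alt_km):
    """Stand-in for the AIR model lookup: effective dose rate in uSv/h as a
    function of latitude and altitude (crude illustrative shape only)."""
    return 0.5 + 0.05 * alt_km * (1.0 + 0.5 * np.abs(np.sin(np.radians(lat_deg))))

# Hypothetical flight profile: time (h), latitude (deg), altitude (km)
t   = np.array([0.0, 0.5, 1.0, 5.0, 7.5, 8.0])
lat = np.array([41.0, 45.0, 50.0, 60.0, 52.0, 49.0])
alt = np.array([0.0, 8.0, 11.0, 11.0, 9.0, 0.0])

rates = dose_rate(lat, alt)
# Trapezoidal integration of the dose rate along the flight path
route_dose = np.sum(0.5 * (rates[1:] + rates[:-1]) * np.diff(t))
print(f"flight effective dose ~ {route_dose:.1f} uSv")
```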
Induced activation studies for the LHC upgrade to High Luminosity LHC
NASA Astrophysics Data System (ADS)
Adorisio, C.; Roesler, S.
2018-06-01
The Large Hadron Collider (LHC) will be upgraded in 2019/2020 to increase its luminosity (rate of collisions) by a factor of five beyond its design value and the integrated luminosity by a factor of ten, in order to maintain scientific progress and exploit its full capacity. The novel machine configuration, called High Luminosity LHC (HL-LHC), will consequently increase the level of activation of its components. The evaluation of the radiological impact of the HL-LHC operation in the Long Straight Sections of the Insertion Region 1 (ATLAS) and Insertion Region 5 (CMS) is presented. Using the Monte Carlo code FLUKA, ambient dose equivalent rate estimations have been performed on the basis of two announced operating scenarios and using the latest available machine layout. The HL-LHC project requires new technical infrastructure with caverns and 300 m long tunnels along the Insertion Regions 1 and 5. The new underground service galleries will be accessible during the operation of the accelerator machine. The radiological risk assessment for the Civil Engineering work foreseen to start excavating the new galleries in the next LHC Long Shutdown and the radiological impact of the machine operation will be discussed.
Radiation protection design for the Super-FRS and SIS100 at the international FAIR facility
NASA Astrophysics Data System (ADS)
Kozlova, Ekaterina; Sokolov, Alexey; Radon, Torsten; Lang, Rupert; Conrad, Inna; Fehrenbacher, Georg; Weick, Helmut; Winkler, Martin
2017-09-01
The new accelerator SIS100 and the Super-FRS will be built at the international Facility for Antiprotons and Ion Research FAIR. The synchrotron SIS100 is a core part of the FAIR facility, which serves for the acceleration of ions such as uranium up to 2.7 GeV/u with intensities of 3×10^11 particles per second, or protons up to 30 GeV with intensities of 5×10^12 particles per second. The Super-FRS is a superconducting fragment separator; it will be able to separate all kinds of nuclear projectile fragments of primary heavy ion beams, including uranium, with energies up to 1.5 GeV/u and intensities up to 3×10^11 particles per second. During operation, activation of several components, especially the production target and the beam catchers, will take place. For the handling of highly activated components it is foreseen to have a hot cell with a connected storage place. All calculations for the optimisation of the shielding design of the SIS100, the Super-FRS and the hot cell were performed using the Monte Carlo code FLUKA; results are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Groetz, J.-E., E-mail: jegroetz@univ-fcomte.fr; Mavon, C.; Fromm, M.
2014-08-15
We report the design of a millimeter-sized parallel plate free-air ionization chamber (IC) aimed at determining the absolute air kerma rate of an ultra-soft X-ray beam (E = 1.5 keV). The size of the IC was determined so that the measurement volume satisfies the condition of charged-particle equilibrium. The correction factors necessary to properly measure the absolute kerma using the IC have been established. Particular attention was given to the determination of the effective mean energy for the 1.5 keV photons using the PENELOPE code. Other correction factors were determined by means of computer simulation (COMSOL™ and FLUKA). Measurements of air kerma rates under specific operating parameters of the lab-bench X-ray source have been performed at various distances from that source and compared to Monte Carlo calculations. We show that the developed ionization chamber makes it possible to determine accurate photon fluence rates in routine work and will provide substantial time savings for future radiobiological experiments based on the use of ultra-soft X-rays.
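The link between photon fluence and air kerma used in such work is K = Φ·E·(μ_tr/ρ); a minimal worked example, with an approximate mass energy-absorption coefficient for 1.5 keV photons in air and a hypothetical fluence, both for illustration only:

```python
# Collision kerma from photon fluence: K = fluence * E * (mu_en/rho)
# Values below are approximate and for illustration only.
E_keV         = 1.5
E_joule       = E_keV * 1e3 * 1.602e-19          # photon energy in J
mu_en_rho_air = 1.2e3 * 0.1                      # ~1.2e3 cm^2/g -> m^2/kg
fluence       = 1.0e10                           # photons per m^2

kerma_gray = fluence * E_joule * mu_en_rho_air
print(f"air kerma ~ {kerma_gray:.3e} Gy")
```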
Single event effects in high-energy accelerators
NASA Astrophysics Data System (ADS)
García Alía, Rubén; Brugger, Markus; Danzeca, Salvatore; Cerutti, Francesco; de Carvalho Saraiva, Joao Pedro; Denz, Reiner; Ferrari, Alfredo; Foro, Lionel L.; Peronnard, Paul; Røed, Ketil; Secondo, Raffaello; Steckert, Jens; Thurel, Yves; Toccafondo, Iacocpo; Uznanski, Slawosz
2017-03-01
The radiation environment encountered at high-energy hadron accelerators strongly differs from the environment relevant for space applications. The mixed-field expected at modern accelerators is composed of charged and neutral hadrons (protons, pions, kaons and neutrons), photons, electrons, positrons and muons, ranging from very low (thermal) energies up to the TeV range. This complex field, which is extensively simulated by Monte Carlo codes (e.g. FLUKA) is due to beam losses in the experimental areas, distributed along the machine (e.g. collimation points) and deriving from the interaction with the residual gas inside the beam pipe. The resulting intensity, energy distribution and proportion of the different particles largely depends on the distance and angle with respect to the interaction point as well as the amount of installed shielding material. Electronics operating in the vicinity of the accelerator will therefore be subject to both cumulative damage from radiation (total ionizing dose, displacement damage) as well as single event effects which can seriously compromise the operation of the machine. This, combined with the extensive use of commercial-off-the-shelf components due to budget, performance and availability reasons, results in the need to carefully characterize the response of the devices and systems to representative radiation conditions.
Metis: A Pure Metropolis Markov Chain Monte Carlo Bayesian Inference Library
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bates, Cameron Russell; Mckigney, Edward Allen
The use of Bayesian inference in data analysis has become the standard for large scientific experiments [1, 2]. The Monte Carlo Codes Group (XCP-3) at Los Alamos has developed a simple set of algorithms currently implemented in C++ and Python to easily perform flat-prior Markov Chain Monte Carlo Bayesian inference with pure Metropolis sampling. These implementations are designed to be user friendly and extensible for customization based on specific application requirements. This document describes the algorithmic choices made and presents two use cases.
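The pure Metropolis scheme referred to above can be written in a few lines; the following is a minimal illustrative sampler (not the Metis library API), using a flat prior and a hypothetical Gaussian likelihood:

```python
import numpy as np

def metropolis(log_posterior, x0, n_samples, step=0.5, seed=0):
    """Minimal pure Metropolis sampler with a symmetric Gaussian proposal."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    logp = log_posterior(x)
    chain = np.empty((n_samples, x.size))
    for i in range(n_samples):
        proposal = x + step * rng.standard_normal(x.size)
        logp_new = log_posterior(proposal)
        if np.log(rng.random()) < logp_new - logp:
            x, logp = proposal, logp_new
        chain[i] = x
    return chain

# Flat prior + Gaussian likelihood around hypothetical data
data = np.array([1.2, 0.9, 1.4, 1.1])
log_post = lambda mu: -0.5 * np.sum((data - mu[0]) ** 2)
samples = metropolis(log_post, x0=[0.0], n_samples=20_000)
print(f"posterior mean ~ {samples[5000:, 0].mean():.3f} (data mean {data.mean():.3f})")
```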
EUPDF: An Eulerian-Based Monte Carlo Probability Density Function (PDF) Solver. User's Manual
NASA Technical Reports Server (NTRS)
Raju, M. S.
1998-01-01
EUPDF is an Eulerian-based Monte Carlo PDF solver developed for application with sprays, combustion, parallel computing and unstructured grids. It is designed to be massively parallel and could easily be coupled with any existing gas-phase flow and spray solvers. The solver accommodates the use of an unstructured mesh with mixed elements of either triangular, quadrilateral, and/or tetrahedral type. The manual provides the user with the coding required to couple the PDF code to any given flow code and a basic understanding of the EUPDF code structure as well as the models involved in the PDF formulation. The source code of EUPDF will be available with the release of the National Combustion Code (NCC) as a complete package.
Optimization of beam shaping assembly based on D-T neutron generator and dose evaluation for BNCT
NASA Astrophysics Data System (ADS)
Naeem, Hamza; Chen, Chaobin; Zheng, Huaqing; Song, Jing
2017-04-01
The feasibility of developing an epithermal neutron beam for a boron neutron capture therapy (BNCT) facility based on a high intensity D-T fusion neutron generator (HINEG) is investigated in this study using the Monte Carlo code SuperMC (Super Monte Carlo simulation program for nuclear and radiation process). The Monte Carlo code SuperMC is used to determine and optimize the final configuration of the beam shaping assembly (BSA). The optimal BSA design is a cylindrical geometry consisting of a natural uranium sphere (14 cm) as a neutron multiplier, AlF3 and TiF3 as moderators (20 cm each), Cd (1 mm) as a thermal neutron filter, Bi (5 cm) as a gamma shield, and Pb as a reflector and collimator to guide neutrons towards the exit window. The epithermal neutron beam flux of the proposed model is 5.73 × 10^9 n/cm^2·s, and the other dosimetric parameters for BNCT reported in IAEA-TECDOC-1223 have been verified. The phantom dose analysis shows that the designed BSA is accurate, efficient and suitable for BNCT applications. Thus, the Monte Carlo code SuperMC is concluded to be capable of simulating the BSA and the dose calculation for BNCT, and a high epithermal flux can be achieved using the proposed BSA.
NASA Technical Reports Server (NTRS)
Platt, M. E.; Lewis, E. E.; Boehm, F.
1991-01-01
A Fortran Monte Carlo computer program was developed that uses two variance reduction techniques for computing system reliability, applicable to solving very large, highly reliable fault-tolerant systems. The program is consistent with the hybrid automated reliability predictor (HARP) code, which employs behavioral decomposition and complex fault-error handling models. This new capability, called MC-HARP, efficiently solves reliability models with non-constant failure rates (Weibull). Common-mode failure modeling is also included.
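The basic idea of Monte Carlo reliability estimation with non-constant (Weibull) failure rates can be illustrated without any of the MC-HARP machinery: sample component failure times and count the trials in which the system structure survives the mission. A minimal sketch for a hypothetical 2-out-of-3 redundant system (all parameters illustrative):

```python
import numpy as np

rng = np.random.default_rng(7)

def system_reliability(mission_time, n_trials=200_000):
    """Monte Carlo estimate of the reliability of a hypothetical 2-out-of-3
    redundant system whose components have Weibull failure times."""
    shape, scale = 1.5, 1.0e4          # non-constant hazard (shape != 1), hours
    # Failure times of the three components in every trial
    t_fail = scale * rng.weibull(shape, size=(n_trials, 3))
    working_at_t = np.sum(t_fail > mission_time, axis=1)
    return np.mean(working_at_t >= 2)

for t in (1.0e3, 5.0e3, 1.0e4):
    print(f"R({t:.0f} h) ~ {system_reliability(t):.4f}")
```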
The Serpent Monte Carlo Code: Status, Development and Applications in 2013
NASA Astrophysics Data System (ADS)
Leppänen, Jaakko; Pusa, Maria; Viitanen, Tuomas; Valtavirta, Ville; Kaltiaisenaho, Toni
2014-06-01
The Serpent Monte Carlo reactor physics burnup calculation code has been developed at VTT Technical Research Centre of Finland since 2004, and is currently used in 100 universities and research organizations around the world. This paper presents the brief history of the project, together with the currently available methods and capabilities and plans for future work. Typical user applications are introduced in the form of a summary review on Serpent-related publications over the past few years.
An unbiased Hessian representation for Monte Carlo PDFs.
Carrazza, Stefano; Forte, Stefano; Kassabov, Zahari; Latorre, José Ignacio; Rojo, Juan
We develop a methodology for the construction of a Hessian representation of Monte Carlo sets of parton distributions, based on the use of a subset of the Monte Carlo PDF replicas as an unbiased linear basis, and of a genetic algorithm for the determination of the optimal basis. We validate the methodology by first showing that it faithfully reproduces a native Monte Carlo PDF set (NNPDF3.0), and then that, if applied to a Hessian PDF set (MMHT14) which was transformed into a Monte Carlo set, it gives back the starting PDFs with minimal information loss. We then show that, when applied to a large Monte Carlo PDF set obtained as a combination of several underlying sets, the methodology leads to a Hessian representation in terms of a rather smaller set of parameters (MC-H PDFs), thereby providing an alternative implementation of the recently suggested Meta-PDF idea and a Hessian version of the recently suggested PDF compression algorithm (CMC-PDFs). The mc2hessian conversion code is made publicly available together with (through LHAPDF6) a Hessian representation of the NNPDF3.0 set, and the MC-H PDF set.
NOTE: Monte Carlo evaluation of kerma in an HDR brachytherapy bunker
NASA Astrophysics Data System (ADS)
Pérez-Calatayud, J.; Granero, D.; Ballester, F.; Casal, E.; Crispin, V.; Puchades, V.; León, A.; Verdú, G.
2004-12-01
In recent years, the use of high dose rate (HDR) after-loader machines has greatly increased due to the shift from traditional Cs-137/Ir-192 low dose rate (LDR) to HDR brachytherapy. The method used to calculate the required concrete and, where appropriate, lead shielding in the door is based on analytical methods provided by documents published by the ICRP, the IAEA and the NCRP. The purpose of this study is to perform a more realistic kerma evaluation at the entrance maze door of an HDR bunker using the Monte Carlo code GEANT4. The Monte Carlo results were validated experimentally. The spectrum at the maze entrance door, obtained with Monte Carlo, has an average energy of about 110 keV, maintaining a similar value along the length of the maze. The comparison of the analytical results with the Monte Carlo ones shows that the results obtained using the albedo coefficient from the ICRP document more closely match those given by the Monte Carlo method, although the maximum value given by the MC calculations is 30% greater.
Scaling GDL for Multi-cores to Process Planck HFI Beams Monte Carlo on HPC
NASA Astrophysics Data System (ADS)
Coulais, A.; Schellens, M.; Duvert, G.; Park, J.; Arabas, S.; Erard, S.; Roudier, G.; Hivon, E.; Mottet, S.; Laurent, B.; Pinter, M.; Kasradze, N.; Ayad, M.
2014-05-01
After reviewing the major progress made in GDL (now at version 0.9.4) on performance and plotting capabilities since the ADASS XXI paper (Coulais et al. 2012), we detail how a large code for the Planck HFI beams Monte Carlo was successfully transposed from IDL to GDL on HPC.
A Novel Implementation of Massively Parallel Three Dimensional Monte Carlo Radiation Transport
NASA Astrophysics Data System (ADS)
Robinson, P. B.; Peterson, J. D. L.
2005-12-01
The goal of our summer project was to implement the difference formulation for radiation transport into Cosmos++, a multidimensional, massively parallel, magnetohydrodynamics code for astrophysical applications (Peter Anninos - AX). The difference formulation is a new method for Symbolic Implicit Monte Carlo thermal transport (Brooks and Szöke - PAT). Previously, simultaneous implementation of fully implicit Monte Carlo radiation transport in multiple dimensions on multiple processors had not been convincingly demonstrated. We found that a combination of the difference formulation and the inherent structure of Cosmos++ makes such an implementation both accurate and straightforward. We developed a "nearly nearest neighbor physics" technique to allow each processor to work independently, even with a fully implicit code. This technique, coupled with the increased accuracy of an implicit Monte Carlo solution and the efficiency of parallel computing systems, allows us to demonstrate the possibility of massively parallel thermal transport. This work was performed under the auspices of the U.S. Department of Energy by University of California Lawrence Livermore National Laboratory under contract No. W-7405-Eng-48.
Badal, Andreu; Badano, Aldo
2009-11-01
It is a known fact that Monte Carlo simulations of radiation transport are computationally intensive and may require long computing times. The authors introduce a new paradigm for the acceleration of Monte Carlo simulations: the use of a graphics processing unit (GPU) as the main computing device instead of a central processing unit (CPU). A GPU-based Monte Carlo code that simulates photon transport in a voxelized geometry with the accurate physics models from PENELOPE has been developed using the CUDA™ programming model (NVIDIA Corporation, Santa Clara, CA). An outline of the new code and a sample x-ray imaging simulation with an anthropomorphic phantom are presented. A remarkable 27-fold speed-up factor was obtained using a GPU compared to a single-core CPU. The reported results show that GPUs are currently a good alternative to CPUs for the simulation of radiation transport. Since the performance of GPUs is currently increasing at a faster pace than that of CPUs, the advantages of GPU-based software are likely to be more pronounced in the future.
NASA Astrophysics Data System (ADS)
Fensin, Michael Lorne
Monte Carlo-linked depletion methods have gained recent interest due to the ability to more accurately model complex 3-dimensional geometries and to better track the evolution of the temporal nuclide inventory by simulating the actual physical process using continuous-energy coefficients. The integration of CINDER90 into the MCNPX Monte Carlo radiation transport code provides a high-fidelity, completely self-contained, Monte-Carlo-linked depletion capability in a well established, widely accepted Monte Carlo radiation transport code that is compatible with most nuclear criticality (KCODE) particle tracking features in MCNPX. MCNPX depletion tracks all necessary reaction rates and follows as many isotopes as cross section data permit in order to achieve a highly accurate temporal nuclide inventory solution. This work chronicles relevant nuclear history, surveys current methodologies of depletion theory, details the methodology applied in MCNPX, and provides benchmark results for three independent OECD/NEA benchmarks. Relevant nuclear history, from the Oklo reactor two billion years ago to the current major United States nuclear fuel cycle development programs, is addressed in order to supply the motivation for the development of this technology. A survey of current reaction rate and temporal nuclide inventory techniques is then provided to offer justification for the depletion strategy applied within MCNPX. The MCNPX depletion strategy is then dissected and each code feature is detailed, chronicling the methodology development from the original linking of MONTEBURNS and MCNP to the most recent public release of the integrated capability (MCNPX 2.6.F). Calculation results of the OECD/NEA Phase IB benchmark, the H. B. Robinson benchmark and the OECD/NEA Phase IVB benchmark are then provided. The acceptable results of these calculations offer sufficient confidence in the predictive capability of the MCNPX depletion method. This capability sets up a significant foundation, in a well established and supported radiation transport code, for further development of a Monte Carlo-linked depletion methodology, which is essential to the future development of advanced reactor technologies that exceed the limitations of current deterministic-based methods.
Common radiation analysis model for 75,000 pound thrust NERVA engine (1137400E)
NASA Technical Reports Server (NTRS)
Warman, E. A.; Lindsey, B. A.
1972-01-01
The mathematical model and sources of radiation used for the radiation analysis and shielding activities in support of the design of the 1137400E version of the 75,000 lb thrust NERVA engine are presented. The nuclear subsystem (NSS) and non-nuclear components are discussed. The geometrical model for the NSS is two dimensional, as required for the DOT discrete ordinates computer code or for an azimuthally symmetric three dimensional Point Kernel or Monte Carlo code. The geometrical model for the non-nuclear components is three dimensional in the FASTER geometry format. This geometry routine is inherent in the ANSC versions of the QAD and GGG Point Kernel programs and the COHORT Monte Carlo program. Data are included pertaining to a pressure vessel surface radiation source data tape which has been used as the basis for starting ANSC analyses with the DASH code to bridge into the COHORT Monte Carlo code using the WANL-supplied DOT angular flux leakage data. In addition to the model descriptions and sources of radiation, the methods of analysis are briefly described.
Supernova Light Curves and Spectra from Two Different Codes: Supernu and Phoenix
NASA Astrophysics Data System (ADS)
Van Rossum, Daniel R; Wollaeger, Ryan T
2014-08-01
The observed similarities between light curve shapes from Type Ia supernovae, and in particular the correlation of light curve shape and brightness, have been actively studied for more than two decades. In recent years, hydrodynamic simulations of white dwarf explosions have advanced greatly, and multiple mechanisms that could potentially produce Type Ia supernovae have been explored in detail. The question of which of the proposed mechanisms is (or are) possibly realized in nature remains challenging to answer, but detailed synthetic light curves and spectra from explosion simulations are very helpful and important guidelines towards answering this question. We present results from a newly developed radiation transport code, Supernu. Supernu solves the supernova radiation transfer problem using a novel technique based on a hybrid between Implicit Monte Carlo and Discrete Diffusion Monte Carlo. This technique enhances the efficiency with respect to traditional Implicit Monte Carlo codes and thus lends itself perfectly to multi-dimensional simulations. We show direct comparisons of light curves and spectra from Type Ia simulations with Supernu versus the legacy Phoenix code.
Fast GPU-based Monte Carlo simulations for LDR prostate brachytherapy.
Bonenfant, Éric; Magnoux, Vincent; Hissoiny, Sami; Ozell, Benoît; Beaulieu, Luc; Després, Philippe
2015-07-07
The aim of this study was to evaluate the potential of bGPUMCD, a Monte Carlo algorithm executed on Graphics Processing Units (GPUs), for fast dose calculations in permanent prostate implant dosimetry. It also aimed to validate a low dose rate brachytherapy source in terms of TG-43 metrics and to use this source to compute dose distributions for permanent prostate implants in very short times. The physics of bGPUMCD was reviewed and extended to include Rayleigh scattering and fluorescence from photoelectric interactions for all materials involved. The radial and anisotropy functions were obtained for the Nucletron SelectSeed in TG-43 conditions. These functions were compared to those found in the MD Anderson Imaging and Radiation Oncology Core brachytherapy source registry, which are considered the TG-43 reference values. After appropriate calibration of the source, permanent prostate implant dose distributions were calculated for four patients and compared to an already validated Geant4 algorithm. The radial function calculated from bGPUMCD showed excellent agreement (differences within 1.3%) with TG-43 accepted values. The anisotropy functions at r = 1 cm and r = 4 cm were within 2% of TG-43 values for angles over 17.5°. For permanent prostate implants, Monte Carlo-based dose distributions with a statistical uncertainty of 1% or less for the target volume were obtained in 30 s or less for 1 × 1 × 1 mm3 calculation grids. Dosimetric indices were very similar (within 2.7%) to those obtained with a validated, independent Monte Carlo code (Geant4) performing the calculations for the same cases in a much longer time (tens of minutes to more than an hour). bGPUMCD is a promising code that allows one to envision the use of Monte Carlo techniques in a clinical environment, with sub-minute execution times on a standard workstation. Future work will explore the use of this code with an inverse planning method to provide a complete Monte Carlo-based planning solution.
Fast GPU-based Monte Carlo simulations for LDR prostate brachytherapy
NASA Astrophysics Data System (ADS)
Bonenfant, Éric; Magnoux, Vincent; Hissoiny, Sami; Ozell, Benoît; Beaulieu, Luc; Després, Philippe
2015-07-01
The aim of this study was to evaluate the potential of bGPUMCD, a Monte Carlo algorithm executed on Graphics Processing Units (GPUs), for fast dose calculations in permanent prostate implant dosimetry. It also aimed to validate a low dose rate brachytherapy source in terms of TG-43 metrics and to use this source to compute dose distributions for permanent prostate implants in very short times. The physics of bGPUMCD was reviewed and extended to include Rayleigh scattering and fluorescence from photoelectric interactions for all materials involved. The radial and anisotropy functions were obtained for the Nucletron SelectSeed in TG-43 conditions. These functions were compared to those found in the MD Anderson Imaging and Radiation Oncology Core brachytherapy source registry, which are considered the TG-43 reference values. After appropriate calibration of the source, permanent prostate implant dose distributions were calculated for four patients and compared to an already validated Geant4 algorithm. The radial function calculated from bGPUMCD showed excellent agreement (differences within 1.3%) with TG-43 accepted values. The anisotropy functions at r = 1 cm and r = 4 cm were within 2% of TG-43 values for angles over 17.5°. For permanent prostate implants, Monte Carlo-based dose distributions with a statistical uncertainty of 1% or less for the target volume were obtained in 30 s or less for 1 × 1 × 1 mm3 calculation grids. Dosimetric indices were very similar (within 2.7%) to those obtained with a validated, independent Monte Carlo code (Geant4) performing the calculations for the same cases in a much longer time (tens of minutes to more than an hour). bGPUMCD is a promising code that allows one to envision the use of Monte Carlo techniques in a clinical environment, with sub-minute execution times on a standard workstation. Future work will explore the use of this code with an inverse planning method to provide a complete Monte Carlo-based planning solution.
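For readers unfamiliar with the TG-43 quantities validated above, a small Python/NumPy sketch of how a radial dose function g_L(r) is formed from transverse-axis dose-rate values; the dose-rate table and source length below are assumed toy numbers, not SelectSeed data:

```python
# Hedged sketch of a TG-43 radial dose function calculation from tabulated
# transverse-axis dose-rate values (synthetic numbers, not bGPUMCD output).
import numpy as np

L = 0.3          # active source length in cm (assumed, seed-like)
r0 = 1.0         # TG-43 reference radius in cm

def G_line(r, L=L):
    """Line-source geometry function at theta = 90 degrees, G_L(r, pi/2)."""
    return 2.0 * np.arctan(L / (2.0 * r)) / (L * r)

r = np.array([0.5, 1.0, 2.0, 3.0, 4.0, 5.0])                  # cm
dose_rate = np.array([4.1, 1.0, 0.23, 0.090, 0.044, 0.024])   # arbitrary units

# g_L(r) = [D(r)/G_L(r)] / [D(r0)/G_L(r0)], normalized to 1 at r0.
g = (dose_rate / G_line(r)) / (dose_rate[r == r0][0] / G_line(r0))
for ri, gi in zip(r, g):
    print(f"r = {ri:4.1f} cm   g_L(r) = {gi:6.3f}")
```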
LLNL Mercury Project Trinity Open Science Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brantley, Patrick; Dawson, Shawn; McKinley, Scott
2016-04-20
The Mercury Monte Carlo particle transport code developed at Lawrence Livermore National Laboratory (LLNL) is used to simulate the transport of radiation through urban environments. These challenging calculations include complicated geometries and require significant computational resources to complete. As a result, a question arises as to the level of convergence of the calculations with Monte Carlo simulation particle count. In the Trinity Open Science calculations, one main focus was to investigate convergence of the relevant simulation quantities with Monte Carlo particle count to assess the current simulation methodology. Both for this application space and for more general applicability, we also investigated the impact of code algorithms on parallel scaling on the Trinity machine, as well as the utilization of the Trinity DataWarp burst buffer technology in Mercury via the LLNL Scalable Checkpoint/Restart (SCR) library.
Cellular dosimetry calculations for Strontium-90 using Monte Carlo code PENELOPE.
Hocine, Nora; Farlay, Delphine; Boivin, Georges; Franck, Didier; Agarande, Michelle
2014-11-01
To improve risk assessments associated with chronic exposure to Strontium-90 (Sr-90), for both the environment and human health, it is necessary to know the energy distribution in specific cells or tissue. Monte Carlo (MC) simulation codes are extremely useful tools for calculating deposited energy. The present work focused on the validation of the MC code PENetration and Energy LOss of Positrons and Electrons (PENELOPE) and the assessment of the dose distribution to bone marrow cells from a point Sr-90 source located within the cortical bone. S-values (absorbed dose per unit cumulated activity) were calculated by Monte Carlo simulation using PENELOPE and Monte Carlo N-Particle eXtended (MCNPX). The cytoplasm, nucleus, cell surface, mouse femur bone and Sr-90 radiation source were simulated. Cells are assumed to be spherical, with the radii of the cell and cell nucleus ranging from 2 to 10 μm. The Sr-90 source is assumed to be uniformly distributed in the cell nucleus, cytoplasm and cell surface. The S-values calculated with PENELOPE agreed very well with the MCNPX results and the Medical Internal Radiation Dose (MIRD) values, with relative deviations of less than 4.5%. The dose distribution to mouse bone marrow cells showed that the cells located near the cortical part received the maximum dose. The MC code PENELOPE may prove useful for cellular dosimetry involving radiation transport through materials other than water, or for complex distributions of radionuclides and geometries.
Morse Monte Carlo Radiation Transport Code System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Emmett, M.B.
1975-02-01
The report contains sections describing the MORSE and PICTURE codes, input descriptions, sample problems, derivations of the physical equations, and explanations of the various error messages. The MORSE code is a multipurpose neutron and gamma-ray transport Monte Carlo code. Time dependence for both shielding and criticality problems is provided. General three-dimensional geometry may be used, with an albedo option available at any material surface. The PICTURE code provides aid in preparing correct input data for the combinatorial geometry package CG. It provides a printed view of arbitrary two-dimensional slices through the geometry. By inspecting these pictures one may determine if the geometry specified by the input cards is indeed the desired geometry. 23 refs. (WRF)
Portable LQCD Monte Carlo code using OpenACC
NASA Astrophysics Data System (ADS)
Bonati, Claudio; Calore, Enrico; Coscetti, Simone; D'Elia, Massimo; Mesiti, Michele; Negro, Francesco; Fabio Schifano, Sebastiano; Silvi, Giorgio; Tripiccione, Raffaele
2018-03-01
Varying from multi-core CPU processors to many-core GPUs, the present scenario of HPC architectures is extremely heterogeneous. In this context, code portability is increasingly important for easy maintainability of applications; this is relevant in scientific computing where code changes are numerous and frequent. In this talk we present the design and optimization of a state-of-the-art production level LQCD Monte Carlo application, using the OpenACC directives model. OpenACC aims to abstract parallel programming to a descriptive level, where programmers do not need to specify the mapping of the code on the target machine. We describe the OpenACC implementation and show that the same code is able to target different architectures, including state-of-the-art CPUs and GPUs.
Burn, K W; Daffara, C; Gualdrini, G; Pierantoni, M; Ferrari, P
2007-01-01
The question of Monte Carlo simulation of radiation transport in voxel geometries is addressed. Patched versions of the MCNP and MCNPX codes were developed to transport radiation both in the standard geometry mode and in the voxel geometry treatment. The patched code reads an unformatted FORTRAN file derived from DICOM-format data and uses special subroutines to handle voxel-to-voxel radiation transport. The various phases of the development of the methodology are discussed, together with the new input options. Examples are given of the employment of the code in internal and external dosimetry, and comparisons with results from other groups are reported.
NASA Astrophysics Data System (ADS)
Homma, Yuto; Moriwaki, Hiroyuki; Ohki, Shigeo; Ikeda, Kazumi
2014-06-01
This paper deals with verification of the three dimensional triangular prismatic discrete ordinates transport calculation code ENSEMBLE-TRIZ by comparison with the multi-group Monte Carlo calculation code GMVP in a large fast breeder reactor. The reactor is a 750 MWe sodium cooled reactor. Nuclear characteristics are calculated at the beginning of cycle of the initial core and at the beginning and end of cycle of the equilibrium core. According to the calculations, the differences between the two methodologies are smaller than 0.0002 Δk in the multiplication factor, about 1% (relative) in the control rod reactivity, and 1% in the sodium void reactivity.
Fixed forced detection for fast SPECT Monte-Carlo simulation
NASA Astrophysics Data System (ADS)
Cajgfinger, T.; Rit, S.; Létang, J. M.; Halty, A.; Sarrut, D.
2018-03-01
Monte-Carlo simulations of SPECT images are notoriously slow to converge due to the large ratio between the number of photons emitted and detected in the collimator. This work proposes a method to accelerate the simulations based on fixed forced detection (FFD) combined with an analytical response of the detector. FFD is based on a Monte-Carlo simulation but forces the detection of a photon in each detector pixel weighted by the probability of emission (or scattering) and transmission to this pixel. The method was evaluated with numerical phantoms and on patient images. We obtained differences with analog Monte Carlo lower than the statistical uncertainty. The overall computing time gain can reach up to five orders of magnitude. Source code and examples are available in the Gate V8.0 release.
Fixed forced detection for fast SPECT Monte-Carlo simulation.
Cajgfinger, T; Rit, S; Létang, J M; Halty, A; Sarrut, D
2018-03-02
Monte-Carlo simulations of SPECT images are notoriously slow to converge due to the large ratio between the number of photons emitted and detected in the collimator. This work proposes a method to accelerate the simulations based on fixed forced detection (FFD) combined with an analytical response of the detector. FFD is based on a Monte-Carlo simulation but forces the detection of a photon in each detector pixel weighted by the probability of emission (or scattering) and transmission to this pixel. The method was evaluated with numerical phantoms and on patient images. We obtained differences with analog Monte Carlo lower than the statistical uncertainty. The overall computing time gain can reach up to five orders of magnitude. Source code and examples are available in the Gate V8.0 release.
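A minimal Python/NumPy sketch of the forced-detection weighting idea underlying the method above (not the Gate implementation): every emitted photon is forced onto every detector pixel with a weight equal to the emission probability toward that pixel times the transmission along the straight-line path. The 2-D geometry, constant attenuation coefficient, and slab thickness are assumptions for illustration.

```python
# Toy fixed-forced-detection tally: analytic weight per pixel instead of waiting
# for an analog photon to reach the detector.
import numpy as np

rng = np.random.default_rng(1)
mu_water = 0.015          # linear attenuation coefficient (1/mm), assumed constant
pixel_size = 4.0          # mm
n_pix = 64
det_y = 300.0             # detector plane at y = 300 mm

pix_x = (np.arange(n_pix) - n_pix / 2 + 0.5) * pixel_size
image = np.zeros(n_pix)

# Sample decay positions inside a simple water object (the "patient").
src = rng.normal(0.0, 20.0, size=(10000, 2))          # (x, y) positions in mm

for x0, y0 in src:
    dx = pix_x - x0
    dy = det_y - y0
    dist = np.hypot(dx, dy)
    # Probability of emission into the pixel's angular acceptance (2-D toy version).
    p_emit = (pixel_size * np.abs(dy) / dist) / (2.0 * np.pi * dist)
    # Transmission through an assumed 100 mm water slab along the oblique path
    # (a crude stand-in for the CT-based attenuation line integral).
    path_in_water = 100.0 * dist / np.abs(dy)
    weight = p_emit * np.exp(-mu_water * path_in_water)
    image += weight

print("peak pixel value:", image.max())
```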
GCR-induced Photon Luminescence of the Moon: The Moon as a CR Detector
NASA Technical Reports Server (NTRS)
Wilson, Thomas L.; Lee, Kerry; Andersen, Vic
2007-01-01
We report on the results of a preliminary study of the GCR-induced photon luminescence of the Moon using the Monte Carlo program FLUKA. The model of the lunar surface is taken to be the chemical composition of soils found at various landing sites during the Apollo and Luna programs, averaged over all such sites to define a generic regolith for the present analysis. This then becomes the target that is bombarded by Galactic Cosmic Rays (GCRs) in FLUKA to determine the photon fluence when there is no sunshine or Earthshine. From the photon fluence we derive the energy spectrum which can be utilized to design an orbiting optical instrument for measuring the GCR-induced luminescence. This is to be distinguished from the gamma-ray spectrum produced by the radioactive decay of its radiogenic constituents lying in the surface and interior. Also, we investigate transient optical flashes from high-energy CRs impacting the lunar surface (boulders and regolith). The goal is to determine to what extent the Moon could be used as a rudimentary CR detector. Meteor impacts on the Moon have been observed for centuries to generate such flashes, so why not CRs?
MC3: Multi-core Markov-chain Monte Carlo code
NASA Astrophysics Data System (ADS)
Cubillos, Patricio; Harrington, Joseph; Lust, Nate; Foster, AJ; Stemm, Madison; Loredo, Tom; Stevenson, Kevin; Campo, Chris; Hardin, Matt; Hardy, Ryan
2016-10-01
MC3 (Multi-core Markov-chain Monte Carlo) is a Bayesian statistics tool that can be executed from the shell prompt or interactively through the Python interpreter with single- or multiple-CPU parallel computing. It offers Markov-chain Monte Carlo (MCMC) posterior-distribution sampling for several algorithms, Levenberg-Marquardt least-squares optimization, and uniform non-informative, Jeffreys non-informative, or Gaussian-informative priors. MC3 can share a single value among multiple parameters, fix parameters to constant values, and offers Gelman-Rubin convergence testing and correlated-noise estimation with time-averaging or wavelet-based likelihood estimation methods.
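A compact illustration of two of the ingredients named above, Metropolis-Hastings sampling and Gelman-Rubin convergence testing, on a toy posterior; this is a generic sketch and does not use the MC3 API:

```python
# Metropolis-Hastings chains plus a Gelman-Rubin R-hat diagnostic (toy posterior).
import numpy as np

rng = np.random.default_rng(2)

def log_post(theta):
    # Toy posterior: standard normal in 2 dimensions.
    return -0.5 * np.sum(theta ** 2)

def run_chain(n_steps, step=0.5, ndim=2):
    chain = np.empty((n_steps, ndim))
    theta = rng.normal(size=ndim)
    lp = log_post(theta)
    for i in range(n_steps):
        prop = theta + step * rng.normal(size=ndim)
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:    # Metropolis acceptance
            theta, lp = prop, lp_prop
        chain[i] = theta
    return chain

chains = np.stack([run_chain(5000) for _ in range(4)])   # shape (m, n, ndim)

# Gelman-Rubin statistic R-hat per parameter.
m, n, _ = chains.shape
means = chains.mean(axis=1)                  # per-chain means
W = chains.var(axis=1, ddof=1).mean(axis=0)  # within-chain variance
B = n * means.var(axis=0, ddof=1)            # between-chain variance
R_hat = np.sqrt(((n - 1) / n * W + B / n) / W)
print("R-hat:", R_hat)                       # values near 1 indicate convergence
```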
Monte Carlo Simulation of Nonlinear Radiation Induced Plasmas. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Wang, B. S.
1972-01-01
A Monte Carlo simulation model for radiation induced plasmas with nonlinear properties due to recombination was developed, employing a piecewise linearized predict-correct iterative technique. Several important variance reduction techniques were developed and incorporated into the model, including an antithetic variates technique. This approach is especially efficient for plasma systems with inhomogeneous media, multiple dimensions, and irregular boundaries. The Monte Carlo code developed has been applied to the determination of the electron energy distribution function and related parameters for a noble gas plasma created by alpha-particle irradiation. The characteristics of the radiation induced plasma involved are given.
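A minimal sketch of the antithetic-variates idea mentioned above, applied to a toy one-dimensional integral rather than the plasma model, to show how pairing each random number u with 1 − u reduces the estimator variance:

```python
# Antithetic variates: negatively correlated sample pairs cancel part of the noise.
import numpy as np

rng = np.random.default_rng(3)
f = lambda u: np.exp(u)          # integrand on [0, 1]; exact integral = e - 1

n = 100_000
u = rng.random(n)

plain = f(u)                                  # standard estimator
antithetic = 0.5 * (f(u) + f(1.0 - u))        # antithetic estimator

print("exact           :", np.e - 1)
print("plain      mean :", plain.mean(), " var of mean:", plain.var() / n)
print("antithetic mean :", antithetic.mean(), " var of mean:", antithetic.var() / n)
```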
Monte Carlo charged-particle tracking and energy deposition on a Lagrangian mesh.
Yuan, J; Moses, G A; McKenty, P W
2005-10-01
A Monte Carlo algorithm for alpha particle tracking and energy deposition on a cylindrical computational mesh in a Lagrangian hydrodynamics code used for inertial confinement fusion (ICF) simulations is presented. The straight line approximation is used to follow the propagation of "Monte Carlo particles", which represent collections of alpha particles generated from thermonuclear deuterium-tritium (DT) reactions. Energy deposition in the plasma is modeled by the continuous slowing down approximation. The scheme addresses various aspects arising in the coupling of Monte Carlo tracking with Lagrangian hydrodynamics, such as non-orthogonal, severely distorted mesh cells, particle relocation on the moving mesh, and particle relocation after rezoning. A comparison with the flux-limited multi-group diffusion transport method is presented for a polar direct drive target design for the National Ignition Facility. Simulations show that the Monte Carlo transport method predicts earlier ignition than the diffusion method and generates a higher hot spot temperature. Nearly linear speed-up is achieved for multi-processor parallel simulations.
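A hedged sketch of the two ingredients named above, straight-line tracking across a mesh and continuous-slowing-down energy deposition; the stopping-power function and the 1-D mesh are assumptions for illustration, not the ICF code's data:

```python
# Straight-line alpha tracking with CSDA energy deposition on a 1-D mesh (toy model).
import numpy as np

def stopping_power(E):
    # Toy dE/dx in MeV/cm that rises as the alpha slows down (not tabulated data).
    return 80.0 / np.sqrt(E + 0.1)

cell_edges = np.linspace(0.0, 1.0, 101)      # 100 cells of 0.01 cm
deposit = np.zeros(cell_edges.size - 1)

E = 3.5                                      # DT alpha birth energy in MeV
x = 0.0
step = 0.001                                 # cm, substep along the straight track
while E > 0.0 and x < cell_edges[-1]:
    dE = min(E, stopping_power(E) * step)    # CSDA energy loss over the substep
    cell = np.searchsorted(cell_edges, x, side="right") - 1
    deposit[cell] += dE                      # tally into the current mesh cell
    E -= dE
    x += step

print("range ~ %.3f cm, total deposited = %.3f MeV" % (x, deposit.sum()))
```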
Kim, Jeongnim; Baczewski, Andrew T.; Beaudet, Todd D.; ...
2018-04-19
QMCPACK is an open source quantum Monte Carlo package for ab-initio electronic structure calculations. It supports calculations of metallic and insulating solids, molecules, atoms, and some model Hamiltonians. Implemented real space quantum Monte Carlo algorithms include variational, diffusion, and reptation Monte Carlo. QMCPACK uses Slater-Jastrow type trial wave functions in conjunction with a sophisticated optimizer capable of optimizing tens of thousands of parameters. The orbital space auxiliary field quantum Monte Carlo method is also implemented, enabling cross validation between different highly accurate methods. The code is specifically optimized for calculations with large numbers of electrons on the latest high performance computing architectures, including multicore central processing unit (CPU) and graphical processing unit (GPU) systems. We detail the program's capabilities, outline its structure, and give examples of its use in current research calculations. The package is available at http://www.qmcpack.org.
Collision of Physics and Software in the Monte Carlo Application Toolkit (MCATK)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sweezy, Jeremy Ed
2016-01-21
The topic is presented in a series of slides organized as follows: MCATK overview, development strategy, available algorithms, problem modeling (sources, geometry, data, tallies), parallelism, miscellaneous tools/features, example MCATK application, recent areas of research, and summary and future work. MCATK is a C++ component-based Monte Carlo neutron-gamma transport software library with continuous energy neutron and photon transport. Designed to build specialized applications and to provide new functionality in existing general-purpose Monte Carlo codes like MCNP, it reads ACE formatted nuclear data generated by NJOY. The motivation behind MCATK was to reduce costs. MCATK physics involves continuous energy neutron and gamma transport with multi-temperature treatment, static eigenvalue (keff and α) algorithms, a time-dependent algorithm, and fission chain algorithms. MCATK geometry includes mesh geometries and solid body geometries. MCATK provides verified, unit-tested Monte Carlo components, flexibility in Monte Carlo application development, and numerous tools such as geometry and cross section plotters.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Jeongnim; Baczewski, Andrew T.; Beaudet, Todd D.
QMCPACK is an open source quantum Monte Carlo package for ab-initio electronic structure calculations. It supports calculations of metallic and insulating solids, molecules, atoms, and some model Hamiltonians. Implemented real space quantum Monte Carlo algorithms include variational, diffusion, and reptation Monte Carlo. QMCPACK uses Slater-Jastrow type trial wave functions in conjunction with a sophisticated optimizer capable of optimizing tens of thousands of parameters. The orbital space auxiliary field quantum Monte Carlo method is also implemented, enabling cross validation between different highly accurate methods. The code is specifically optimized for calculations with large numbers of electrons on the latest high performance computing architectures, including multicore central processing unit (CPU) and graphical processing unit (GPU) systems. We detail the program's capabilities, outline its structure, and give examples of its use in current research calculations. The package is available at http://www.qmcpack.org.
NASA Astrophysics Data System (ADS)
Jaboulay, Jean-Charles; Brun, Emeric; Hugot, François-Xavier; Huynh, Tan-Dat; Malouch, Fadhel; Mancusi, Davide; Tsilanizara, Aime
2017-09-01
After fission or fusion reactor shutdown, the activated structure emits decay photons. For maintenance operations, the radiation dose map must be established in the reactor building. Several calculation schemes have been developed to calculate the shutdown dose rate. These schemes are widely developed for fusion applications and, more precisely, for the ITER tokamak. This paper presents the rigorous-two-steps scheme implemented at CEA. It is based on the TRIPOLI-4® Monte Carlo code and the inventory code MENDEL. The ITER shutdown dose rate benchmark has been carried out, and the results are in good agreement with those of the other participants.
Monte Carlo simulation of liver cancer treatment with 166Ho-loaded glass microspheres
NASA Astrophysics Data System (ADS)
da Costa Guimarães, Carla; Moralles, Maurício; Roberto Martinelli, José
2014-02-01
Microspheres loaded with pure beta-emitter radioisotopes are used in the treatment of some types of liver cancer. The Instituto de Pesquisas Energéticas e Nucleares (IPEN) is developing 166Ho-loaded glass microspheres as an alternative to the commercially available 90Y microspheres. This work describes the implementation of a Monte Carlo code to simulate both the irradiation effects and the imaging of 166Ho and 90Y sources localized in different parts of the liver. Results obtained with the code and perspectives for the future are discussed.
Skyshine radiation from a pressurized water reactor containment dome
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peng, W.H.
1986-06-01
The radiation dose rates resulting from airborne activity inside a post-accident pressurized water reactor containment are calculated by a combined discrete ordinates/Monte Carlo method. The calculated total dose rates and the skyshine component are presented as a function of distance from the containment at three different elevations for various gamma-ray source energies. The one-dimensional discrete ordinates code ANISN is used to approximate the skyshine dose rates from the hemispherical dome, and the results compare favorably with more rigorous results calculated by a three-dimensional Monte Carlo code.
Accelerating execution of the integrated TIGER series Monte Carlo radiation transport codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, L.M.; Hochstedler, R.D.
1997-02-01
Execution of the integrated TIGER series (ITS) of coupled electron/photon Monte Carlo radiation transport codes has been accelerated by modifying the FORTRAN source code for more efficient computation. Each member code of ITS was benchmarked and profiled with a specific test case that directed the acceleration effort toward the most computationally intensive subroutines. Techniques for accelerating these subroutines included replacing linear search algorithms with binary versions, replacing the pseudo-random number generator, reducing program memory allocation, and proofing the input files for geometrical redundancies. All techniques produced identical or statistically similar results to the original code. Final benchmark timing of the accelerated code resulted in speed-up factors of 2.00 for TIGER (the one-dimensional slab geometry code), 1.74 for CYLTRAN (the two-dimensional cylindrical geometry code), and 1.90 for ACCEPT (the arbitrary three-dimensional geometry code).
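A small sketch of the first acceleration technique listed above, replacing a linear scan of a cross-section energy grid with a binary search; the grid and cross-section values are synthetic, and only the lookup strategy (not ITS itself) is illustrated:

```python
# Linear vs. binary search for the energy bin containing E on an ascending grid.
import bisect
import numpy as np

energy_grid = np.linspace(0.01, 20.0, 5000)          # MeV, ascending
xs_table = 1.0 / energy_grid                          # toy cross sections

def lookup_linear(E):
    i = 0
    while i < len(energy_grid) - 2 and energy_grid[i + 1] < E:
        i += 1                                        # O(N) scan
    return xs_table[i]

def lookup_binary(E):
    i = bisect.bisect_right(energy_grid, E) - 1       # O(log N) search
    i = min(max(i, 0), len(energy_grid) - 2)
    return xs_table[i]

E = 13.7
print(lookup_linear(E), lookup_binary(E))             # same cross-section bin
```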
MCMEG: Simulations of both PDD and TPR for 6 MV LINAC photon beam using different MC codes
NASA Astrophysics Data System (ADS)
Fonseca, T. C. F.; Mendes, B. M.; Lacerda, M. A. S.; Silva, L. A. C.; Paixão, L.; Bastos, F. M.; Ramirez, J. V.; Junior, J. P. R.
2017-11-01
The Monte Carlo Modelling Expert Group (MCMEG) is an expert network specializing in Monte Carlo radiation transport modelling and simulation applied to radiation protection and dosimetry research. For its first inter-comparison task, the group launched an exercise to model and simulate a 6 MV LINAC photon beam using the Monte Carlo codes available within their laboratories and to validate the simulated results by comparing them with experimental measurements carried out at the National Cancer Institute (INCA) in Rio de Janeiro, Brazil. The experimental measurements were performed using an ionization chamber with calibration traceable to a Secondary Standard Dosimetry Laboratory (SSDL). The detector was immersed in a water phantom at different depths and was irradiated with a radiation field size of 10×10 cm2. This exposure setup was used to determine the dosimetric parameters Percentage Depth Dose (PDD) and Tissue Phantom Ratio (TPR). The validation process compares the MC calculated results to the experimentally measured PDD20,10 and TPR20,10. Simulations were performed reproducing the experimental TPR20,10 quality index, which provides a satisfactory description of both the PDD curve and the transverse profiles at the two depths measured. This paper reports in detail the modelling process using the MCNPx, MCNP6, EGSnrc and Penelope Monte Carlo codes, the source and tally descriptions, the validation processes and the results.
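A short sketch of how the two beam-quality quantities discussed above are obtained from a central-axis depth-dose curve; the depth-dose values are invented 6 MV-like numbers, and the PDD20,10 to TPR20,10 conversion is the empirical relation used in dosimetry protocols such as IAEA TRS-398:

```python
# PDD20,10 and TPR20,10 from a central-axis depth-dose table (synthetic data).
import numpy as np

depth = np.array([1.5, 5.0, 10.0, 15.0, 20.0])       # cm
dose = np.array([100.0, 86.0, 67.0, 52.0, 39.0])     # % of maximum, 6 MV-like shape

pdd10 = np.interp(10.0, depth, dose)
pdd20 = np.interp(20.0, depth, dose)
pdd20_10 = pdd20 / pdd10

# Empirical conversion (100 cm SSD, 10x10 cm2 field) as given in IAEA TRS-398.
tpr20_10 = 1.2661 * pdd20_10 - 0.0595

print(f"PDD20,10 = {pdd20_10:.3f}   TPR20,10 = {tpr20_10:.3f}")
```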
NASA Astrophysics Data System (ADS)
Tessonnier, T.; Böhlen, T. T.; Ceruti, F.; Ferrari, A.; Sala, P.; Brons, S.; Haberer, T.; Debus, J.; Parodi, K.; Mairani, A.
2017-08-01
The introduction of ‘new’ ion species in particle therapy needs to be supported by a thorough assessment of their dosimetric properties and by treatment planning comparisons with clinically used proton and carbon ion beams. In addition to the latter two ions, helium and oxygen ion beams are foreseen at the Heidelberg Ion Beam Therapy Center (HIT) as potential assets for improving clinical outcomes in the near future. We present in this study a dosimetric validation of a FLUKA-based Monte Carlo treatment planning tool (MCTP) for protons, helium, carbon and oxygen ions for spread-out Bragg peaks in water. The comparisons between the ions show the dosimetric advantages of helium and heavier ion beams in terms of their distal and lateral fall-offs with respect to protons, reducing the lateral size of the region receiving 50% of the planned dose up to 12 mm. However, carbon and oxygen ions showed significant doses beyond the target due to the higher fragmentation tail compared to lighter ions (p and He), up to 25%. The Monte Carlo predictions were found to be in excellent geometrical agreement with the measurements, with deviations below 1 mm for all parameters investigated such as target and lateral size as well as distal fall-offs. Measured and simulated absolute dose values agreed within about 2.5% on the overall dose distributions. The MCTP tool, which supports the usage of multiple state-of-the-art relative biological effectiveness models, will provide a solid engine for treatment planning comparisons at HIT.
Development of a new multi-modal Monte-Carlo radiotherapy planning system.
Kumada, H; Nakamura, T; Komeda, M; Matsumura, A
2009-07-01
A new multi-modal Monte-Carlo radiotherapy planning system (development code: JCDS-FX) is under development at the Japan Atomic Energy Agency. This system builds on fundamental technologies of JCDS applied to actual boron neutron capture therapy (BNCT) trials in JRR-4. One of the features of JCDS-FX is that PHITS is used for the particle transport calculation. PHITS is a multi-purpose particle Monte-Carlo transport code, so its application enables evaluation of the total dose given to a patient by combined-modality therapy. Moreover, JCDS-FX with PHITS can be used for the study of accelerator-based BNCT. To verify the calculation accuracy of JCDS-FX, dose evaluations for neutron irradiation of a cylindrical water phantom and for an actual clinical trial were performed, and the results were compared with calculations by JCDS with MCNP. The verification results demonstrated that JCDS-FX is applicable to BNCT treatment planning in practical use.
Pandya, Tara M.; Johnson, Seth R.; Evans, Thomas M.; ...
2015-12-21
This paper discusses the implementation, capabilities, and validation of Shift, a massively parallel Monte Carlo radiation transport package developed and maintained at Oak Ridge National Laboratory. It has been developed to scale well from laptop to small computing clusters to advanced supercomputers. Special features of Shift include hybrid capabilities for variance reduction such as CADIS and FW-CADIS, and advanced parallel decomposition and tally methods optimized for scalability on supercomputing architectures. Shift has been validated and verified against various reactor physics benchmarks and compares well to other state-of-the-art Monte Carlo radiation transport codes such as MCNP5, CE KENO-VI, and OpenMC. Some specific benchmarks used for verification and validation include the CASL VERA criticality test suite and several Westinghouse AP1000® problems. These benchmark and scaling studies show promising results.
Reconstruction of Human Monte Carlo Geometry from Segmented Images
NASA Astrophysics Data System (ADS)
Zhao, Kai; Cheng, Mengyun; Fan, Yanchang; Wang, Wen; Long, Pengcheng; Wu, Yican
2014-06-01
Human computational phantoms have been used extensively for scientific experimental analysis and experimental simulation. This article presents a method for reconstructing human geometry from a series of segmented images of a Chinese visible human dataset. The phantom geometry describes the detailed structure of each organ and can be converted into the input files of Monte Carlo codes for dose calculation. A whole-body computational phantom of a Chinese adult female, named Rad-HUMAN and comprising about 28.8 billion voxels, has been established by the FDS Team. For convenient processing, different organs in the images were segmented with different RGB colors and the voxels were assigned positions within the dataset. For refinement, the positions were first downsampled. Although the large number of voxels inside an organ are three-dimensionally adjacent, no thorough merging method existed to reduce the number of cells needed to describe the organ. In this study, the voxels on the organ surface were taken into account during merging, which yields fewer cells per organ, and an index-based sorting algorithm was introduced to increase the merging speed. Finally, Rad-HUMAN, which includes a total of 46 organs and tissues, was described by cuboids in the Monte Carlo geometry for the simulation. The Monte Carlo geometry was constructed directly from the segmented images and the voxels were merged exhaustively. Each organ geometry model was constructed without ambiguity or self-crossing, and its geometry information represents the accurate appearance and precise interior structure of the organs. The constructed geometry, which largely retains the original shape of the organs, can easily be written to the input files of different Monte Carlo codes such as MCNP. Its generality and high performance were verified experimentally.
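A simplified sketch of the voxel-merging idea described above, reduced to run-length merging of identically labeled voxels along one axis of a toy 2-D slice; the real algorithm's surface-aware treatment, index-based sorting, and 3-D handling are omitted:

```python
# Merge runs of identically labeled voxels along rows into boxes (toy 2-D version).
import numpy as np

labels = np.array([[0, 1, 1, 1, 0],
                   [2, 2, 1, 1, 0],
                   [2, 2, 2, 0, 0]])          # organ IDs on a small slice; 0 = background

boxes = []                                    # (organ_id, row, col_start, col_stop)
for r, row in enumerate(labels):
    start = 0
    for c in range(1, len(row) + 1):
        if c == len(row) or row[c] != row[start]:
            if row[start] != 0:               # background voxels are not exported
                boxes.append((int(row[start]), r, start, c - 1))
            start = c

for organ, r, c0, c1 in boxes:
    print(f"organ {organ}: row {r}, columns {c0}-{c1}")
```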
NASA Astrophysics Data System (ADS)
Gardner, Robin P.; Xu, Libai
2009-10-01
The Center for Engineering Applications of Radioisotopes (CEAR) has been working for over a decade on the Monte Carlo library least-squares (MCLLS) approach for treating non-linear radiation analyzer problems including: (1) prompt gamma-ray neutron activation analysis (PGNAA) for bulk analysis, (2) energy-dispersive X-ray fluorescence (EDXRF) analyzers, and (3) carbon/oxygen tool analysis in oil well logging. This approach essentially consists of using Monte Carlo simulation to generate the libraries of all the elements to be analyzed plus any other required background libraries. These libraries are then used in the linear library least-squares (LLS) approach with unknown sample spectra to analyze for all elements in the sample. Iterations of this are used until the LLS values agree with the composition used to generate the libraries. The current status of the methods (and topics) necessary to implement the MCLLS approach is reported. This includes: (1) the Monte Carlo codes such as CEARXRF, CEARCPG, and CEARCO for forward generation of the necessary elemental library spectra for the LLS calculation for X-ray fluorescence, neutron capture prompt gamma-ray analyzers, and carbon/oxygen tools; (2) the correction of spectral pulse pile-up (PPU) distortion by Monte Carlo simulation with the code CEARIPPU; (3) generation of detector response functions (DRF) for detectors with linear and non-linear responses for Monte Carlo simulation of pulse-height spectra; and (4) the use of the differential operator (DO) technique to make the necessary iterations for non-linear responses practical. In addition to commonly analyzed single spectra, coincidence spectra or even two-dimensional (2-D) coincidence spectra can also be used in the MCLLS approach and may provide more accurate results.
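A minimal sketch of the linear library least-squares (LLS) step at the core of the MCLLS approach described above: an unknown spectrum is fitted as a linear combination of per-element library spectra. The Gaussian libraries below are synthetic placeholders, not CEAR-generated libraries:

```python
# Fit an unknown pulse-height spectrum as a linear combination of library spectra.
import numpy as np

channels = np.arange(512)
gauss = lambda c, s: np.exp(-0.5 * ((channels - c) / s) ** 2)

# Library spectra (one per element) as the columns of the design matrix A.
A = np.column_stack([gauss(100, 8), gauss(230, 10), gauss(380, 12)])

true_amounts = np.array([5.0, 2.0, 7.5])
rng = np.random.default_rng(4)
unknown = A @ true_amounts + rng.normal(0.0, 0.05, channels.size)  # "measured" spectrum

fitted, *_ = np.linalg.lstsq(A, unknown, rcond=None)
print("fitted elemental amounts:", fitted)   # close to [5.0, 2.0, 7.5]
```

In the full MCLLS approach these libraries would be regenerated by Monte Carlo and the fit iterated until the assumed and fitted compositions agree.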
Use of single scatter electron monte carlo transport for medical radiation sciences
Svatos, Michelle M.
2001-01-01
The single scatter Monte Carlo code CREEP models precise microscopic interactions of electrons with matter to enhance physical understanding of radiation sciences. It is designed to simulate electrons in any medium, including materials important for biological studies. It simulates each interaction individually by sampling from a library which contains accurate information over a broad range of energies.
Stochastic Analysis of Orbital Lifetimes of Spacecraft
NASA Technical Reports Server (NTRS)
Sasamoto, Washito; Goodliff, Kandyce; Cornelius, David
2008-01-01
A document discusses (1) a Monte-Carlo-based methodology for probabilistic prediction and analysis of orbital lifetimes of spacecraft and (2) Orbital Lifetime Monte Carlo (OLMC)--a Fortran computer program, consisting of a previously developed long-term orbit-propagator integrated with a Monte Carlo engine. OLMC enables modeling of variances of key physical parameters that affect orbital lifetimes through the use of probability distributions. These parameters include altitude, speed, and flight-path angle at insertion into orbit; solar flux; and launch delays. The products of OLMC are predicted lifetimes (durations above specified minimum altitudes) for the number of user-specified cases. Histograms generated from such predictions can be used to determine the probabilities that spacecraft will satisfy lifetime requirements. The document discusses uncertainties that affect modeling of orbital lifetimes. Issues of repeatability, smoothness of distributions, and code run time are considered for the purpose of establishing values of code-specific parameters and number of Monte Carlo runs. Results from test cases are interpreted as demonstrating that solar-flux predictions are primary sources of variations in predicted lifetimes. Therefore, it is concluded, multiple sets of predictions should be utilized to fully characterize the lifetime range of a spacecraft.
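A hedged sketch of the Monte Carlo wrapper described above; the lifetime_model function is a purely hypothetical placeholder for the long-term orbit propagator, and only the parameter-sampling and histogram structure is illustrated:

```python
# Monte Carlo sampling of insertion and environment parameters to build a
# lifetime histogram (the "propagator" is a toy placeholder, not OLMC).
import numpy as np

rng = np.random.default_rng(5)

def lifetime_model(altitude_km, solar_flux):
    # Hypothetical placeholder: lifetime grows with altitude, shrinks with solar activity.
    return 0.12 * (altitude_km - 300.0) ** 1.5 / (solar_flux / 150.0)

n_cases = 10_000
altitude = rng.normal(500.0, 10.0, n_cases)         # insertion altitude, km
flux = rng.lognormal(np.log(150.0), 0.25, n_cases)  # solar-flux proxy

lifetimes = lifetime_model(altitude, flux)           # days (toy units)
requirement = 300.0                                  # assumed lifetime requirement, days
p_meet = np.mean(lifetimes >= requirement)

hist, edges = np.histogram(lifetimes, bins=30)
peak = np.argmax(hist)
print(f"most likely lifetime bin: {edges[peak]:.0f}-{edges[peak + 1]:.0f} days")
print(f"P(lifetime >= {requirement:.0f} days) = {p_meet:.3f}")
```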
DOE Office of Scientific and Technical Information (OSTI.GOV)
Badal, Andreu; Badano, Aldo
Purpose: It is a known fact that Monte Carlo simulations of radiation transport are computationally intensive and may require long computing times. The authors introduce a new paradigm for the acceleration of Monte Carlo simulations: the use of a graphics processing unit (GPU) as the main computing device instead of a central processing unit (CPU). Methods: A GPU-based Monte Carlo code that simulates photon transport in a voxelized geometry with the accurate physics models from PENELOPE has been developed using the CUDA programming model (NVIDIA Corporation, Santa Clara, CA). Results: An outline of the new code and a sample x-ray imaging simulation with an anthropomorphic phantom are presented. A remarkable 27-fold speed-up factor was obtained using a GPU compared to a single-core CPU. Conclusions: The reported results show that GPUs are currently a good alternative to CPUs for the simulation of radiation transport. Since the performance of GPUs is currently increasing at a faster pace than that of CPUs, the advantages of GPU-based software are likely to be more pronounced in the future.
Culbertson, C N; Wangerin, K; Ghandourah, E; Jevremovic, T
2005-08-01
The goal of this study was to evaluate the COG Monte Carlo radiation transport code, developed and tested by Lawrence Livermore National Laboratory, for neutron capture therapy related modeling. A boron neutron capture therapy model was analyzed comparing COG calculational results to results from the widely used MCNP4B (Monte Carlo N-Particle) transport code. The approach for computing neutron fluence rate and each dose component relevant in boron neutron capture therapy is described, and calculated values are shown in detail. The differences between the COG and MCNP predictions are qualified and quantified. The differences are generally small and suggest that the COG code can be applied for BNCT research related problems.
Monte Carlo simulation of a β-γ coincidence system using plastic scintillators in 4π geometry
NASA Astrophysics Data System (ADS)
Dias, M. S.; Piuvezam-Filho, H.; Baccarelli, A. M.; Takeda, M. N.; Koskinas, M. F.
2007-09-01
A modified version of a Monte Carlo code called Esquema, developed at the Nuclear Metrology Laboratory at IPEN, São Paulo, Brazil, has been applied to simulating a 4πβ(PS)-γ coincidence system designed for primary radionuclide standardisation. This system consists of a plastic scintillator in 4π geometry, for alpha or electron detection, coupled to a NaI(Tl) counter for gamma-ray detection. The response curves for monoenergetic electrons and photons had been calculated previously with the PENELOPE code and were applied as input data to Esquema. The latter code simulates all the disintegration processes, from the precursor nucleus to the ground state of the daughter radionuclide. As a result, the curve of the observed disintegration rate as a function of the beta efficiency parameter can be simulated. A least-squares fit between the experimental activity values and the Monte Carlo calculation provided the actual radioactive source activity, without the need for conventional extrapolation procedures. Application of this methodology to 60Co and 133Ba radioactive sources is presented, and the results showed good agreement with a conventional proportional counter 4πβ(PC)-γ coincidence system.
Kramer, S. L.; Ghosh, V. J.; Breitfeller, M.; ...
2016-08-10
Third generation high brightness light sources are designed to have low emittance and high current beams, which contribute to higher beam loss rates that will be compensated by Top-Off injection. Shielding for these higher loss rates will be critical to protect the projected higher occupancy factors for the users. Top-Off injection requires a full energy injector, which will demand greater consideration of the potential abnormal beam miss-steering and localized losses that could occur. The high energy electron injection beam produces a significantly higher neutron component dose to the experimental floor than a lower energy injection beam and ramped operations. Minimizing this dose will require adequate knowledge of where the miss-steered beam can occur and sufficient EM shielding close to the loss point, in order to attenuate the energy of the particles in the EM shower below the neutron production threshold (<10 MeV), which will spread the incident energy on the bulk shield walls and thereby the dose penetrating the shield walls. Designing supplemental shielding near the loss point using the analytic shielding model is shown to be inadequate because of its lack of geometry specification for the EM shower process. Predicting the dose rates outside the tunnel requires a detailed description of the geometry and materials that the beam losses will encounter inside the tunnel. Modern radiation shielding Monte Carlo codes, like FLUKA, can handle this geometric description of the radiation transport process in sufficient detail, allowing accurate predictions of the dose rates expected and the ability to show weaknesses in the design before a high radiation incident occurs. The effort required to adequately define the accelerator geometry for these codes has been greatly reduced with the implementation of the graphical interface FLAIR to FLUKA. This made the effective shielding process for NSLS-II quite accurate and reliable. The principles used to provide supplemental shielding to the NSLS-II accelerators and the lessons learned from this process are presented.
Tessonnier, Thomas; Marcelos, Tiago; Mairani, Andrea; Brons, Stephan; Parodi, Katia
2015-01-01
In the field of radiation therapy, accurate and robust dose calculation is required. For this purpose, precise modeling of the irradiation system and reliable computational platforms are needed. At the Heidelberg Ion Therapy Center (HIT), the beamline has already been modeled in the FLUKA Monte Carlo (MC) code. However, this model was kept confidential for disclosure reasons and was not available to any external team. The main goal of this study was to efficiently create phase space (PS) files for proton and carbon ion beams, for all energies and foci available at HIT. A PS records the characteristics of each particle (charge, mass, energy, coordinates, direction cosines, generation) at a certain position along the beam path. To achieve this goal while keeping a reasonable data size and maintaining the required calculation accuracy, we developed a new approach to beam PS generation with the MC code FLUKA. The generated PSs were obtained using an infinitely narrow beam and recording the desired quantities after the last element of the beamline, with a discrimination between primaries and secondaries. In this way, a unique PS can be used for each energy to accommodate the different foci by combining the narrow-beam scenario with a random sampling of its theoretical Gaussian beam in vacuum. The PS can also reproduce the different patterns from the delivery system when properly combined with the beam scanning information. MC simulations using the PSs have been compared to simulations including the full beamline geometry and have been found to be in very good agreement for several cases (depth dose distributions, lateral dose profiles), with relative dose differences below 0.5%. This approach has also been compared with measured data for ion beams with different energies and foci, resulting in very satisfactory agreement. Hence, the proposed approach was able to fulfill the different requirements and has demonstrated its capability for application to clinical treatment fields. It also offers a powerful tool to perform investigations on the contribution of primary and secondary particles produced in the beamline. These PSs are already made available to external teams upon request, to support interpretation of their measurements.
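A small sketch of the focus-sampling step described above: a phase space recorded for an infinitely narrow beam is reused for a finite focus by adding transverse offsets drawn from the nominal Gaussian spot. The record layout, beam energy, and spot size below are assumptions, not HIT beam data:

```python
# Reuse a narrow-beam phase space for a finite Gaussian focus by random offsets.
import numpy as np

rng = np.random.default_rng(6)

# Narrow-beam phase-space records: columns x, y, u, v, w, E (toy content).
n = 5
ps_narrow = np.zeros((n, 6))
ps_narrow[:, 4] = 1.0          # direction cosine w = 1 (along the beam axis)
ps_narrow[:, 5] = 150.0        # kinetic energy in MeV (assumed)

sigma_focus_mm = 4.0 / 2.355   # focus quoted as FWHM = 4 mm -> Gaussian sigma

ps_focus = ps_narrow.copy()
ps_focus[:, 0] += rng.normal(0.0, sigma_focus_mm, n)   # x offset per particle
ps_focus[:, 1] += rng.normal(0.0, sigma_focus_mm, n)   # y offset per particle
print(ps_focus[:, :2])
```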
Large Hadron Collider at CERN: Beams generating high-energy-density matter.
Tahir, N A; Schmidt, R; Shutov, A; Lomonosov, I V; Piriz, A R; Hoffmann, D H H; Deutsch, C; Fortov, V E
2009-04-01
This paper presents numerical simulations that have been carried out to study the thermodynamic and hydrodynamic responses of a solid copper cylindrical target that is facially irradiated along its axis by one of the two Large Hadron Collider (LHC) 7 TeV/c proton beams. The energy deposition by protons in solid copper has been calculated using an established particle interaction and Monte Carlo code, FLUKA, which is capable of simulating all components of the particle cascades in matter, up to multi-TeV energies. These data have been used as input to a sophisticated two-dimensional hydrodynamic computer code, BIG2, that has been employed to study this problem. The prime purpose of these investigations was to assess the damage caused to the equipment if the entire LHC beam is lost at a single place. The FLUKA calculations show that the energy of the protons will be deposited in solid copper within about 1 m, assuming constant material parameters. Nevertheless, our hydrodynamic simulations have shown that the energy deposition region will extend to a length of about 35 m over the beam duration. This is due to the fact that the first few tens of bunches deposit sufficient energy to generate high pressure, which drives an outgoing radial shock wave. Shock propagation leads to a continuous reduction in the density at the target center, which allows the protons delivered in subsequent bunches to penetrate deeper and deeper into the target. This phenomenon has also been seen in the case of heavy-ion heated targets [N. A. Tahir, A. Kozyreva, P. Spiller, D. H. H. Hoffmann, and A. Shutov, Phys. Rev. E 63, 036407 (2001)]. This effect needs to be considered in the design of a sacrificial beam stopper. These simulations have also shown that the target is severely damaged and is converted into a huge sample of high-energy-density (HED) matter. In fact, the inner part of the target is transformed into a strongly coupled plasma with fairly uniform physical conditions. This work, therefore, has suggested an additional very important application of the LHC, namely, studies of HED states in matter.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Darafsheh, A; Kassaee, A; Finlay, J
Purpose: Range verification in proton therapy is of great importance. Cherenkov light follows the photon and electron energy deposition in a water phantom. The purpose of this study is to investigate the connection between Cherenkov light generation and radiation absorbed dose in a water phantom irradiated with proton beams. Methods: Monte Carlo simulation was performed with the FLUKA Monte Carlo code to stochastically simulate radiation transport, ionizing radiation dose deposition, and Cherenkov radiation in water phantoms. The simulations were performed for proton beams with energies in the range 50–600 MeV to cover a wide range of proton energies. Results: The mechanism of Cherenkov light production depends on the initial energy of the protons. For protons with energies of 50–400 MeV, below the threshold (∼483 MeV in water) for Cherenkov light production directly by the incident protons, Cherenkov light is produced mainly by the secondary electrons liberated in Coulomb interactions with the incident protons. For proton beams with energies above 500 MeV, at the initial depths where the incident protons have energies above the Cherenkov production threshold, the light has higher intensity. As the slowing-down process results in lower energy protons at larger depths in the water phantom, there is a knee point in the curve of Cherenkov light versus depth, due to the switch of the Cherenkov light production mechanism from primary protons to secondary electrons. At the end of the depth dose curve the Cherenkov light intensity does not follow the dose peak because of the lack of high energy protons to produce Cherenkov light either directly or through secondary electrons. Conclusion: In contrast to photon and electron beams, Cherenkov light generation induced by proton beams does not follow the proton energy deposition, especially close to the end of the proton range near the Bragg peak.
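A quick check of the ∼483 MeV threshold quoted above, using the standard Cherenkov condition β > 1/n; with an assumed refractive index n = 1.33 the proton threshold comes out near 485 MeV, consistent with the value in the abstract (the exact figure depends on the index used):

```python
# Cherenkov threshold kinetic energy: T = m c^2 (1 / sqrt(1 - 1/n^2) - 1).
import math

def cherenkov_threshold(rest_mass_mev, n):
    gamma = 1.0 / math.sqrt(1.0 - 1.0 / n ** 2)   # gamma at which beta = 1/n
    return rest_mass_mev * (gamma - 1.0)

m_p, m_e = 938.272, 0.511      # rest masses in MeV/c^2
n_water = 1.33                 # assumed refractive index of water

print("proton   threshold: %.0f MeV" % cherenkov_threshold(m_p, n_water))   # ~485 MeV
print("electron threshold: %.3f MeV" % cherenkov_threshold(m_e, n_water))   # ~0.26 MeV
```

The low electron threshold is why secondary electrons dominate the Cherenkov signal for clinical proton energies.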
MontePython 3: Parameter inference code for cosmology
NASA Astrophysics Data System (ADS)
Brinckmann, Thejs; Lesgourgues, Julien; Audren, Benjamin; Benabed, Karim; Prunet, Simon
2018-05-01
MontePython 3 provides numerous ways to explore parameter space using Monte Carlo Markov Chain (MCMC) sampling, including Metropolis-Hastings, Nested Sampling, Cosmo Hammer, and a Fisher sampling method. This improved version of the Monte Python (ascl:1307.002) parameter inference code for cosmology offers new ingredients that improve the performance of Metropolis-Hastings sampling, speeding up convergence and offering significant time improvements in difficult runs. Additional likelihoods and plotting options are available, as are post-processing algorithms such as importance sampling and adding derived parameters.
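A generic sketch of the importance-sampling post-processing mentioned above (not the MontePython implementation): chain samples drawn under one likelihood are reweighted to approximate the posterior under a modified likelihood, avoiding a full re-run:

```python
# Importance reweighting of existing chain samples for a modified likelihood.
import numpy as np

rng = np.random.default_rng(7)

def loglike_old(theta):
    return -0.5 * (theta - 0.0) ** 2          # original (toy) Gaussian likelihood

def loglike_new(theta):
    return -0.5 * (theta - 0.3) ** 2 / 0.8    # slightly shifted/tightened likelihood

chain = rng.normal(0.0, 1.0, 20_000)          # samples from the old posterior (toy)
log_w = loglike_new(chain) - loglike_old(chain)
w = np.exp(log_w - log_w.max())               # stabilized importance weights
w /= w.sum()

mean_new = np.sum(w * chain)
eff_n = 1.0 / np.sum(w ** 2)                  # effective sample size diagnostic
print(f"reweighted mean = {mean_new:.3f}, effective samples = {eff_n:.0f}")
```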
Fast quantum Monte Carlo on a GPU
NASA Astrophysics Data System (ADS)
Lutsyshyn, Y.
2015-02-01
We present a scheme for the parallelization of the quantum Monte Carlo method on graphical processing units, focusing on variational Monte Carlo simulation of bosonic systems. We use asynchronous execution schemes with shared memory persistence, and obtain excellent utilization of the accelerator. The CUDA code is provided along with a package that simulates liquid helium-4. The program was benchmarked on several models of Nvidia GPU, including the Fermi GTX560 and M2090 and the Kepler-architecture K20. Special optimization was developed for the Kepler cards, including placement of data structures in the register space of the Kepler GPUs. This Kepler-specific optimization is discussed.
Duggan, Dennis M
2004-12-01
Improved cross-sections in a new version of the Monte Carlo N-Particle (MCNP) code may eliminate discrepancies between radial dose functions (as defined by the American Association of Physicists in Medicine Task Group 43) derived from Monte Carlo simulations of low-energy photon-emitting brachytherapy sources and those from measurements on the same sources with thermoluminescent dosimeters. This is demonstrated for two 125I brachytherapy seed models, the Implant Sciences Model ISC3500 (I-Plant) and the Amersham Health Model 6711, by simulating their radial dose functions with two versions of MCNP, 4c2 and 5.
Dewaraja, Yuni K; Ljungberg, Michael; Majumdar, Amitava; Bose, Abhijit; Koral, Kenneth F
2002-02-01
This paper reports the implementation of the SIMIND Monte Carlo code on an IBM SP2 distributed memory parallel computer. Basic aspects of running Monte Carlo particle transport calculations on parallel architectures are described. Our parallelization is based on equally partitioning photons among the processors and uses the Message Passing Interface (MPI) library for interprocessor communication and the Scalable Parallel Random Number Generator (SPRNG) to generate uncorrelated random number streams. These parallelization techniques are also applicable to other distributed memory architectures. A linear increase in computing speed with the number of processors is demonstrated for up to 32 processors. This speed-up is especially significant in Single Photon Emission Computed Tomography (SPECT) simulations involving higher energy photon emitters, where explicit modeling of the phantom and collimator is required. For (131)I, the accuracy of the parallel code is demonstrated by comparing simulated and experimental SPECT images from a heart/thorax phantom. Clinically realistic SPECT simulations using the voxel-man phantom are carried out to assess scatter and attenuation correction.
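The parallelization strategy described above, equal partitioning of photon histories across processors with independent random-number streams and a final reduction of the tallies, can be sketched with mpi4py. The toy "physics" below merely scores a random path length, and NumPy's seeded per-rank generators stand in for SPRNG; none of this is the SIMIND code itself.

```python
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

N_TOTAL = 1_000_000                                   # total photon histories (toy value)
n_local = N_TOTAL // size + (rank < N_TOTAL % size)   # equal partitioning across ranks

rng = np.random.default_rng([rank, 12345])            # independent stream per rank

def track_photon(rng):
    # placeholder physics: score an exponentially distributed path length
    return -np.log(rng.random())

local_tally = sum(track_photon(rng) for _ in range(n_local))
total_tally = comm.reduce(local_tally, op=MPI.SUM, root=0)  # gather tallies on rank 0
if rank == 0:
    print("mean score per history:", total_tally / N_TOTAL)
```

Run with, e.g., `mpiexec -n 8 python photon_mpi.py`; since the histories are independent, the wall time should drop roughly linearly with the number of ranks, which is the behavior the abstract reports.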
NASA Astrophysics Data System (ADS)
Nagakura, Hiroki; Richers, Sherwood; Ott, Christian; Iwakami, Wakana; Furusawa, Shun; Sumiyoshi, Kohsuke; Yamada, Shoichi
2017-01-01
We have developed a multi-d radiation-hydrodynamics code which solves the first-principles Boltzmann equation for neutrino transport. It is currently applicable specifically to core-collapse supernovae (CCSNe), but we will extend its applicability to further extreme phenomena such as black hole formation and the coalescence of double neutron stars. In this meeting, I will discuss two things: (1) a detailed comparison with a Monte Carlo neutrino transport code and (2) axisymmetric CCSNe simulations. Project (1) gives us confidence in our code. The Monte Carlo code has been developed by the Caltech group and is specialized to obtaining a steady state. Within the CCSNe community, this is the first attempt to compare two different methods for multi-d neutrino transport. I will show the results of this comparison. For project (2), I focus in particular on the properties of the neutrino distribution function in the semi-transparent region, where only a first-principles Boltzmann solver can appropriately handle the neutrino transport. In addition to these analyses, I will also discuss the ``explodability'' by the neutrino heating mechanism.
Computing Temperatures in Optically Thick Protoplanetary Disks
NASA Technical Reports Server (NTRS)
Capuder, Lawrence F., Jr.
2011-01-01
We worked with a Monte Carlo radiative transfer code to simulate the transfer of energy through protoplanetary disks, where planet formation occurs. The code tracks photons from the star into the disk, through scattering, absorption and re-emission, until they escape to infinity. High optical depths in the disk interior dominate the computation time because it takes a photon packet many interactions to get out of the region. Regions of high optical depth also receive few photons and therefore do not have well-estimated temperatures. We applied a modified random walk (MRW) approximation to treat high optical depths and speed up the Monte Carlo calculations. The MRW is implemented by calculating the average number of interactions the photon packet will undergo in diffusing within a single cell of the spatial grid and then updating the packet position, packet frequencies, and local radiation absorption rate appropriately. The MRW approximation was then tested for accuracy and speed against the original code. We determined that MRW provides accurate answers for Monte Carlo radiative transfer simulations. The speed gained from using MRW is shown to be proportional to the disk mass.
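The bookkeeping step mentioned above is estimating how many interactions a packet needs to diffuse out of an optically thick cell. The crude diffusion-scaling estimate below (number of scatterings roughly τ²) is illustrative only, not the actual MRW prescription implemented in the code, but it shows why explicit scattering becomes prohibitive at high optical depth.

```python
def mrw_interaction_estimate(tau_cell):
    """Rough diffusion-scaling estimate of the number of scatterings needed for a
    photon packet to random-walk across a cell of optical depth tau_cell.
    (Illustrative assumption only; the production MRW scheme replaces this whole
    sequence of scatterings with a single diffusion-based update of position,
    frequency and local absorption rate.)"""
    return tau_cell ** 2

for tau in (10, 100, 1000):
    print(f"tau = {tau:5d}  ->  ~{mrw_interaction_estimate(tau):.0e} explicit interactions")
```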
An update on the BQCD Hybrid Monte Carlo program
NASA Astrophysics Data System (ADS)
Haar, Taylor Ryan; Nakamura, Yoshifumi; Stüben, Hinnerk
2018-03-01
We present an update of BQCD, our Hybrid Monte Carlo program for simulating lattice QCD. BQCD is one of the main production codes of the QCDSF collaboration and is used by CSSM and in some Japanese finite temperature and finite density projects. Since the first publication of the code at Lattice 2010, the program has been extended in various ways. New features of the code include: dynamical QED, action modification in order to compute matrix elements by using the Feynman-Hellmann theorem, more trace measurements (like Tr(D^-n) for K, cSW and chemical potential reweighting), a more flexible integration scheme, polynomial filtering, term-splitting for RHMC, and a portable implementation of performance-critical parts employing SIMD.
MO-FG-BRA-01: 4D Monte Carlo Simulations for Verification of Dose Delivered to a Moving Anatomy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gholampourkashi, S; Cygler, J E.; The Ottawa Hospital Cancer Centre, Ottawa, ON
Purpose: To validate 4D Monte Carlo (MC) simulations of dose delivery by an Elekta Agility linear accelerator to a moving phantom. Methods: Monte Carlo simulations were performed using the 4DdefDOSXYZnrc/EGSnrc user code, which samples a new geometry for each incident particle and calculates the dose in a continuously moving anatomy. A Quasar respiratory motion phantom with a lung insert containing a 3 cm diameter tumor was used for dose measurements on an Elekta Agility linac with the phantom in stationary and moving states. Dose to the center of the tumor was measured using calibrated EBT3 film and the RADPOS 4D dosimetry system. A VMAT plan covering the tumor was created on the static CT scan of the phantom using Monaco V.5.10.02. A validated BEAMnrc model of our Elekta Agility linac was used for Monte Carlo simulations on stationary and moving anatomies. To compare the planned and delivered doses, linac log files recorded during measurements were used for the simulations. For 4D simulations, deformation vectors that modeled the rigid translation of the lung insert were generated as input to the 4DdefDOSXYZnrc code, as well as the phantom motion trace recorded with RADPOS during the measurements. Results: Monte Carlo simulations and film measurements were found to agree within 2 mm/2% for 97.7% of points in the film in the static phantom and 95.5% in the moving phantom. Dose values based on film and RADPOS measurements are within 2% of each other and within 2σ of the experimental uncertainties with respect to the simulations. Conclusion: Our 4D Monte Carlo simulation using the defDOSXYZnrc code accurately calculates dose delivered to a moving anatomy. Future work will focus on further investigation of VMAT delivery on a moving phantom to improve the agreement between simulation and measurements, as well as establishing the accuracy of our method in a deforming anatomy. This work was supported by the Ontario Consortium of Adaptive Interventions in Radiation Oncology (OCAIRO), funded by the Ontario Research Fund Research Excellence program.
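The 2 mm/2% agreement quoted above is a gamma-index criterion. A minimal 1D global-gamma evaluation is sketched below under simplifying assumptions (uniform dose grid, global normalization, toy Gaussian profiles); it is not the clinical implementation used in the study, only an illustration of the metric.

```python
import numpy as np

def gamma_1d(dose_eval, dose_ref, x, dta_mm=2.0, dd_frac=0.02):
    """Global 1D gamma index: for each reference point, find the minimum combined
    dose-difference / distance-to-agreement metric over all evaluated points."""
    d_norm = dd_frac * dose_ref.max()              # global dose-difference criterion
    gamma = np.empty_like(dose_ref)
    for i, (xr, dr) in enumerate(zip(x, dose_ref)):
        dist2 = ((x - xr) / dta_mm) ** 2
        dose2 = ((dose_eval - dr) / d_norm) ** 2
        gamma[i] = np.sqrt((dist2 + dose2).min())
    return gamma

x = np.linspace(0, 100, 201)                       # positions in mm
ref = np.exp(-((x - 50) / 15) ** 2)                # toy reference profile
ev = np.exp(-((x - 51) / 15) ** 2)                 # toy "measured" profile, 1 mm shift
g = gamma_1d(ev, ref, x)
print("gamma pass rate (gamma <= 1):", np.mean(g <= 1.0))
```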
Bolding, Simon R.; Cleveland, Mathew Allen; Morel, Jim E.
2016-10-21
In this paper, we have implemented a new high-order low-order (HOLO) algorithm for solving thermal radiative transfer problems. The low-order (LO) system is based on the spatial and angular moments of the transport equation and a linear-discontinuous finite-element spatial representation, producing equations similar to the standard S2 equations. The LO solver is fully implicit in time and efficiently resolves the nonlinear temperature dependence at each time step. The high-order (HO) solver utilizes exponentially convergent Monte Carlo (ECMC) to give a globally accurate solution for the angular intensity to a fixed-source pure-absorber transport problem. This global solution is used to compute consistency terms, which require the HO and LO solutions to converge toward the same solution. The use of ECMC allows for the efficient reduction of statistical noise in the Monte Carlo solution, reducing inaccuracies introduced through the LO consistency terms. Finally, we compare results with an implicit Monte Carlo code for one-dimensional gray test problems and demonstrate the efficiency of ECMC over standard Monte Carlo in this HOLO algorithm.
Monte Carlo capabilities of the SCALE code system
Rearden, Bradley T.; Petrie, Jr., Lester M.; Peplow, Douglas E.; ...
2014-09-12
SCALE is a broadly used suite of tools for nuclear systems modeling and simulation that provides comprehensive, verified and validated, user-friendly capabilities for criticality safety, reactor physics, radiation shielding, and sensitivity and uncertainty analysis. For more than 30 years, regulators, licensees, and research institutions around the world have used SCALE for nuclear safety analysis and design. SCALE provides a “plug-and-play” framework that includes three deterministic and three Monte Carlo radiation transport solvers that can be selected based on the desired solution, including hybrid deterministic/Monte Carlo simulations. SCALE includes the latest nuclear data libraries for continuous-energy and multigroup radiation transport as well as activation, depletion, and decay calculations. SCALE’s graphical user interfaces assist with accurate system modeling, visualization, and convenient access to desired results. SCALE 6.2 will provide several new capabilities and significant improvements in many existing features, especially with expanded continuous-energy Monte Carlo capabilities for criticality safety, shielding, depletion, and sensitivity and uncertainty analysis. Finally, an overview of the Monte Carlo capabilities of SCALE is provided here, with emphasis on new features for SCALE 6.2.
Paixão, Lucas; Oliveira, Bruno Beraldo; Viloria, Carolina; de Oliveira, Marcio Alves; Teixeira, Maria Helena Araújo; Nogueira, Maria do Socorro
2015-01-01
Objective: To derive filtered tungsten X-ray spectra used in digital mammography systems by means of Monte Carlo simulations. Materials and Methods: Filtered spectra for a rhodium filter were obtained for tube potentials between 26 and 32 kV. The half-value layers (HVLs) of the simulated filtered spectra were compared with those obtained experimentally with a solid state detector (Unfors model 8202031-H Xi R/F & MAM Detector Platinum and 8201023-C Xi Base unit Platinum Plus w mAs) in a Hologic Selenia Dimensions system using direct radiography mode. Results: The calculated HVL values showed good agreement with those obtained experimentally. The greatest relative difference between the Monte Carlo calculated HVL values and the experimental HVL values was 4%. Conclusion: The results show that the filtered tungsten anode X-ray spectra and the EGSnrc Monte Carlo code can be used for mean glandular dose determination in mammography. PMID:26811553
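For context, the half-value layer of a polyenergetic spectrum is the aluminium thickness that halves the air kerma. The sketch below shows how an HVL could be extracted from a binned spectrum by bisection; the spectrum, attenuation coefficients and kerma weights are made-up placeholders, not the simulated W/Rh spectra of this study.

```python
import numpy as np

# placeholder spectrum: relative fluence per energy bin (keV); NOT the simulated W/Rh spectrum
energies = np.array([10.0, 15.0, 20.0, 25.0, 30.0])     # keV
fluence  = np.array([0.05, 0.30, 0.40, 0.20, 0.05])
mu_al    = np.array([7.0, 2.0, 0.95, 0.55, 0.35])       # assumed linear attenuation of Al (1/mm)
muen_air = np.array([1.00, 0.35, 0.18, 0.10, 0.07])     # assumed air-kerma weighting factors

def air_kerma(t_mm):
    """Air kerma behind t_mm of aluminium: attenuated fluence x energy x absorption weight."""
    return float(np.sum(fluence * np.exp(-mu_al * t_mm) * energies * muen_air))

# bisection for the thickness that halves the kerma, i.e. the HVL
target, lo, hi = 0.5 * air_kerma(0.0), 0.0, 5.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if air_kerma(mid) > target else (lo, mid)
print(f"HVL ~ {0.5 * (lo + hi):.2f} mm Al")
```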
NASA Astrophysics Data System (ADS)
Liu, Tianyu; Wolfe, Noah; Lin, Hui; Zieb, Kris; Ji, Wei; Caracappa, Peter; Carothers, Christopher; Xu, X. George
2017-09-01
This paper contains two parts revolving around Monte Carlo transport simulation on Intel Many Integrated Core coprocessors (MIC, also known as Xeon Phi). (1) MCNP 6.1 was recompiled into multithreading (OpenMP) and multiprocessing (MPI) forms, respectively, without modification to the source code. The new codes were tested on a 60-core 5110P MIC. The test case was FS7ONNi, a radiation shielding problem used in MCNP's verification and validation suite. It was observed that both codes became slower on the MIC than on a 6-core X5650 CPU, by a factor of 4 for the MPI code and, abnormally, 20 for the OpenMP code, and that both exhibited limited strong-scaling capability. (2) We have recently added a Constructive Solid Geometry (CSG) module to our ARCHER code to provide better support for geometry modelling in radiation shielding simulation. The functions of this module are frequently called in the particle random walk process. To identify the performance bottleneck we developed a CSG proxy application and profiled the code using the geometry data from FS7ONNi. The profiling data showed that the code was primarily memory-latency bound on the MIC. This study suggests that, despite the low initial porting effort, Monte Carlo codes do not naturally lend themselves to the MIC platform, just as they do not to GPUs, and that the memory latency problem needs to be addressed in order to achieve a decent performance gain.
Fission matrix-based Monte Carlo criticality analysis of fuel storage pools
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farlotti, M.; Ecole Polytechnique, Palaiseau, F 91128; Larsen, E. W.
2013-07-01
Standard Monte Carlo transport procedures experience difficulties in solving criticality problems in fuel storage pools. Because of the strong neutron absorption between fuel assemblies, source convergence can be very slow, leading to incorrect estimates of the eigenvalue and the eigenfunction. This study examines an alternative fission matrix-based Monte Carlo transport method that takes advantage of the geometry of a storage pool to overcome this difficulty. The method uses Monte Carlo transport to build (essentially) a fission matrix, which is then used to calculate the criticality and the critical flux. This method was tested using a test code on a simple problem containing 8 assemblies in a square pool. The standard Monte Carlo method gave the expected eigenfunction in 5 cases out of 10, while the fission matrix method gave the expected eigenfunction in all 10 cases. In addition, the fission matrix method provides an estimate of the error in the eigenvalue and the eigenfunction, and it allows the user to control this error by running an adequate number of cycles. Because of these advantages, the fission matrix method yields a higher confidence in the results than standard Monte Carlo. We also discuss potential improvements of the method, including the potential for variance reduction techniques.
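Once the fission matrix F has been tallied, the criticality problem reduces to finding its dominant eigenpair: the eigenvalue estimates k-effective and the eigenvector gives the fission source shape. A minimal power-iteration sketch is shown below, with a made-up 8x8 matrix standing in for the tallied one (not the actual test code of the study).

```python
import numpy as np

rng = np.random.default_rng(1)
n = 8                                                    # e.g. one matrix cell per assembly
F = rng.random((n, n)) * 0.1 + np.diag(np.full(n, 1.0))  # placeholder "tallied" fission matrix

def power_iteration(F, tol=1e-10, max_iter=1000):
    """Dominant eigenvalue (k-eff estimate) and eigenvector (fission source shape) of F."""
    s = np.ones(F.shape[0]) / F.shape[0]                 # flat initial source guess
    k = 1.0
    for _ in range(max_iter):
        s_new = F @ s                                    # one generation: F acting on the source
        k_new = s_new.sum() / s.sum()                    # eigenvalue estimate
        s_new /= s_new.sum()                             # renormalize the source
        if abs(k_new - k) < tol:
            break
        s, k = s_new, k_new
    return k_new, s_new

k_eff, source = power_iteration(F)
print("k estimate:", k_eff)
```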
Li, Junli; Li, Chunyan; Qiu, Rui; Yan, Congchong; Xie, Wenzhang; Wu, Zhen; Zeng, Zhi; Tung, Chuanjong
2015-09-01
The method of Monte Carlo simulation is a powerful tool for investigating the details of radiation biological damage at the molecular level. In this paper, a Monte Carlo code called NASIC (Nanodosimetry Monte Carlo Simulation Code) was developed. It includes a physical module, a pre-chemical module, a chemical module, a geometric module and a DNA damage module. The physical module can simulate the physical tracks of low-energy electrons in liquid water event by event. More than one set of inelastic cross sections was calculated by applying the dielectric function method of Emfietzoglou's optical-data treatments, with different optical data sets and dispersion models. In the pre-chemical module, the ionised and excited water molecules undergo dissociation processes. In the chemical module, the produced radiolytic chemical species diffuse and react. In the geometric module, an atomic model of 46 chromatin fibres in a spherical nucleus of a human lymphocyte was established. In the DNA damage module, the direct damages induced by the energy depositions of the electrons and the indirect damages induced by the radiolytic chemical species were calculated. The parameters were adjusted to make the simulation results agree with the experimental results. In this paper, the influence of the inelastic cross sections and of the vibrational excitation reaction on the parameters and on the DNA strand break yields was studied. Further work on NASIC is underway.
Comment on ‘egs_brachy: a versatile and fast Monte Carlo code for brachytherapy’
NASA Astrophysics Data System (ADS)
Yegin, Gultekin
2018-02-01
In a recent paper, Chamberland et al (2016 Phys. Med. Biol. 61 8214) develop a new Monte Carlo code called egs_brachy for brachytherapy treatments. It is based on EGSnrc and written in the C++ programming language. In order to benchmark the egs_brachy code, the authors use it in various test case scenarios in which complex geometry conditions exist. Another EGSnrc-based brachytherapy dose calculation engine, BrachyDose, is used for dose comparisons. The authors fail to prove that egs_brachy can produce reasonable dose values for brachytherapy sources in a given medium. The dose comparisons in the paper are erroneous and misleading. egs_brachy should not be used in any further research studies unless and until all the potential bugs are fixed in the code.
Parallel Grand Canonical Monte Carlo (ParaGrandMC) Simulation Code
NASA Technical Reports Server (NTRS)
Yamakov, Vesselin I.
2016-01-01
This report provides an overview of the Parallel Grand Canonical Monte Carlo (ParaGrandMC) simulation code. This is a highly scalable parallel FORTRAN code for simulating the thermodynamic evolution of metal alloy systems at the atomic level, and predicting the thermodynamic state, phase diagram, chemical composition and mechanical properties. The code is designed to simulate multi-component alloy systems, predict solid-state phase transformations such as austenite-martensite transformations, precipitate formation, recrystallization, capillary effects at interfaces, surface absorption, etc., which can aid the design of novel metallic alloys. While the software is mainly tailored for modeling metal alloys, it can also be used for other types of solid-state systems, and to some degree for liquid or gaseous systems, including multiphase systems forming solid-liquid-gas interfaces.
Radiological Environmental Protection for LCLS-II High Power Operation
NASA Astrophysics Data System (ADS)
Liu, James; Blaha, Jan; Cimeno, Maranda; Mao, Stan; Nicolas, Ludovic; Rokni, Sayed; Santana, Mario; Tran, Henry
2017-09-01
The LCLS-II superconducting electron accelerator at SLAC plans to operate at up to 4 GeV and 240 kW average power, which would create higher radiological impacts, particularly near beam loss points such as beam dumps and halo collimators. The main hazards to the public and environment include direct or skyshine radiation, effluent of radioactive air such as 13N, 15O and 41Ar, and activation of groundwater creating tritium. These hazards were evaluated using analytic methods and the FLUKA Monte Carlo code. The controls (mainly extensive bulk shielding and local shielding around high-loss points) and monitoring (neutron/photon detectors with detection capabilities below natural background at the site boundary, site-wide radioactive air monitors, and groundwater wells) were designed to meet U.S. DOE and EPA, as well as SLAC, requirements. The radiological design and controls for the LCW systems [including concrete housing shielding for 15O and 11C circulating in the LCW; 7Be and erosion/corrosion products (22Na, 54Mn, 60Co, 65Zn, etc.) captured in resin and filters; leak detection and containment of LCW with 3H and its waste water discharge; and explosion from H2 build-up in the surge tank and release of radionuclides] associated with the high-power beam dumps are also presented.
High and low energy gamma beam dump designs for the gamma beam delivery system at ELI-NP
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yasin, Zafar, E-mail: zafar.yasin@eli-np.ro; Matei, Catalin; Ur, Calin A.
The Extreme Light Infrastructure - Nuclear Physics (ELI-NP) is under construction in Magurele, Bucharest, Romania. The facility will use two 10 PW lasers and a high-intensity, narrow-bandwidth gamma beam for stand-alone and combined laser-gamma experiments. The accurate estimation of particle doses and their restriction within the limits for both personnel and the general public is very important in the design phase of any nuclear facility. In the present work, Monte Carlo simulations are performed using FLUKA and MCNPX to design the 19.4 and 4 MeV gamma beam dumps along with the shielding of the experimental areas. Dose rate contour plots from both FLUKA and MCNPX, along with numerical values of doses in experimental area E8 of the facility, are presented. The calculated doses are within the permissible limits. Furthermore, a reasonable agreement between the two codes enhances our confidence in using one or both of them for future calculations in beam dump designs, radiation shielding, radioactive inventory, and other calculations related to radiation protection. Residual dose rate and residual activity calculations are also performed for the high-energy beam dump, and their effect is negligible in comparison to contributions from prompt radiation.
NASA Astrophysics Data System (ADS)
Kang, D.; Apel, W. D.; Arteaga-Velazquez, J. C.; Bekk, K.; Bertaina, M.; Blümer, J.; Bozdog, H.; Brancus, I. M.; Cantoni, E.; Chiavassa, A.; Cossavella, F.; Daumiller, K.; de Souza, V.; Di Pierro, F.; Doll, P.; Engel, R.; Engler, J.; Finger, M.; Fuchs, B.; Fuhrmann, D.; Gils, H. J.; Glasstetter, R.; Grupen, C.; Haungs, A.; Heck, D.; Hörandel, J. R.; Huber, D.; Huege, T.; Kampert, K.-H.; Klages, H. O.; Link, K.; Łuczak, P.; Ludwig, M.; Mathes, H. J.; Mayer, H. J.; Melissas, M.; Milke, J.; Morello, C.; Oehlschläger, J.; Ostapchenko, S.; Palmieri, N.; Petcu, M.; Pierog, T.; Rebel, H.; Roth, M.; Schieler, H.; Schoo, S.; Schroder, F.; Sima, O.; Toma, G.; Trinchero, G. C.; Ulrich, H.; Weindl, A.; Wochele, J.; Wommer, M.; Zabierowski, J.
2013-02-01
KASCADE-Grande is a large detector array for observations of the energy spectrum as well as the chemical composition of cosmic ray air showers up to primary energies of 1 EeV. The multi-detector arrangement allows the measurement of the electromagnetic and muonic components of individual air showers. In this analysis, the reconstruction of the all-particle energy spectrum is based on the size spectra of the charged particle component. The energy is calibrated using Monte Carlo simulations performed with CORSIKA and the high-energy interaction models QGSJet, EPOS and SIBYLL. In all cases FLUKA has been used as the low-energy interaction model. In this contribution, the spectra resulting from the different hadronic interaction models will be compared and discussed.
Monte Carlo simulation of Alaska wolf survival
NASA Astrophysics Data System (ADS)
Feingold, S. J.
1996-02-01
Alaskan wolves live in a harsh climate and are hunted intensively. Penna's biological aging code, using Monte Carlo methods, has been adapted to simulate wolf survival. It was run on the case in which hunting causes the disruption of wolves' social structure. Social disruption was shown to increase the number of deaths occurring at a given level of hunting. For high levels of social disruption, the population did not survive.
OBJECT KINETIC MONTE CARLO SIMULATIONS OF MICROSTRUCTURE EVOLUTION
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nandipati, Giridhar; Setyawan, Wahyu; Heinisch, Howard L.
2013-09-30
The objective is to report the development of the flexible object kinetic Monte Carlo (OKMC) simulation code KSOME (kinetic simulation of microstructure evolution), which can be used to simulate the microstructure evolution of complex systems under irradiation. In this report we briefly describe the capabilities of KSOME and present preliminary results for short-term annealing of single cascades in tungsten at various primary knock-on atom (PKA) energies and temperatures.
SABRINA: an interactive three-dimensional geometry-modeling program for MCNP
DOE Office of Scientific and Technical Information (OSTI.GOV)
West, J.T. III
SABRINA is a fully interactive three-dimensional geometry-modeling program for MCNP, a Los Alamos Monte Carlo code for neutron and photon transport. In SABRINA, a user constructs either body geometry or surface geometry models and debugs spatial descriptions for the resulting objects. This enhanced capability significantly reduces effort in constructing and debugging complicated three-dimensional geometry models for Monte Carlo analysis.
Development of a multi-modal Monte-Carlo radiation treatment planning system combined with PHITS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kumada, Hiroaki; Nakamura, Takemi; Komeda, Masao
A new multi-modal Monte Carlo radiation treatment planning system is under development at the Japan Atomic Energy Agency. This system (development code: JCDS-FX) builds on the fundamental technologies of JCDS. JCDS was developed by JAEA to perform treatment planning for boron neutron capture therapy (BNCT), which is conducted at JRR-4 at JAEA. JCDS has many advantages based on practical accomplishments in actual clinical trials of BNCT at JRR-4, and these advantages have been carried over to JCDS-FX. One of the features of JCDS-FX is that PHITS has been applied to the particle transport calculation. PHITS is a multipurpose particle Monte Carlo transport code; its application therefore enables dose evaluation not only for BNCT but also for several other radiotherapies such as proton therapy. To verify the calculation accuracy of JCDS-FX with PHITS for BNCT, treatment planning of an actual BNCT session conducted at JRR-4 was performed retrospectively. The verification results demonstrated that the new system is applicable to BNCT clinical trials in practical use. Within the framework of R&D for laser-driven proton therapy, we have begun studying the application of JCDS-FX combined with PHITS to proton therapy in addition to BNCT. Several features and performance results of the new multi-modal Monte Carlo radiotherapy planning system are presented.
CPMC-Lab: A MATLAB package for Constrained Path Monte Carlo calculations
NASA Astrophysics Data System (ADS)
Nguyen, Huy; Shi, Hao; Xu, Jie; Zhang, Shiwei
2014-12-01
We describe CPMC-Lab, a MATLAB program for the constrained-path and phaseless auxiliary-field Monte Carlo methods. These methods have allowed applications ranging from the study of strongly correlated models, such as the Hubbard model, to ab initio calculations in molecules and solids. The present package implements the full ground-state constrained-path Monte Carlo (CPMC) method in MATLAB with a graphical interface, using the Hubbard model as an example. The package can perform calculations in finite supercells in any dimensions, under periodic or twist boundary conditions. Importance sampling and all other algorithmic details of a total energy calculation are included and illustrated. This open-source tool allows users to experiment with various model and run parameters and visualize the results. It provides a direct and interactive environment to learn the method and study the code with minimal overhead for setup. Furthermore, the package can be easily generalized for auxiliary-field quantum Monte Carlo (AFQMC) calculations in many other models for correlated electron systems, and can serve as a template for developing a production code for AFQMC total energy calculations in real materials. Several illustrative studies are carried out in one- and two-dimensional lattices on total energy, kinetic energy, potential energy, and charge- and spin-gaps.
Takada, Kenta; Kumada, Hiroaki; Liem, Peng Hong; Sakurai, Hideyuki; Sakae, Takeji
2016-12-01
We simulated the effect of patient displacement on organ doses in boron neutron capture therapy (BNCT). In addition, we developed a faster calculation algorithm (NCT high-speed) to simulate irradiation more efficiently. We simulated dose evaluation for the standard irradiation position (reference position) using a head phantom. Cases were assumed where the patient body is shifted in the lateral directions relative to the reference position, as well as in the direction away from the irradiation aperture. For three neutron groups (thermal, epithermal, and fast), the flux distributions were calculated using NCT high-speed with a voxelized homogeneous phantom. The same three group fluxes were calculated under the same conditions with a Monte Carlo code, and the results were compared. In the evaluations of body movements, there were no significant differences even with shifts of up to 9 mm in the lateral directions. However, the dose decreased by about 10% with shifts of 9 mm in the direction away from the irradiation aperture. When comparing both calculations from the phantom surface down to a depth of 3 cm, the maximum differences between the fluxes calculated by NCT high-speed and those calculated by the Monte Carlo code were 10% for thermal neutrons and 18% for epithermal neutrons. The time required by the NCT high-speed code was about one-tenth that of the Monte Carlo calculation. In the evaluation, the longitudinal displacement has a considerable effect on the organ doses. We also achieved faster calculation of the depth distribution of the thermal neutron flux using the NCT high-speed calculation code.
Monte Carlo study of four dimensional binary hard hypersphere mixtures
NASA Astrophysics Data System (ADS)
Bishop, Marvin; Whitlock, Paula A.
2012-01-01
A multithreaded Monte Carlo code was used to study the properties of binary mixtures of hard hyperspheres in four dimensions. The ratios of the diameters of the hyperspheres examined were 0.4, 0.5, 0.6, and 0.8. Many total densities of the binary mixtures were investigated. The pair correlation functions and the equations of state were determined and compared with other simulation results and theoretical predictions. At lower diameter ratios the pair correlation functions of the mixture agree with the pair correlation function of a one component fluid at an appropriately scaled density. The theoretical results for the equation of state compare well to the Monte Carlo calculations for all but the highest densities studied.
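A minimal version of the elementary Monte Carlo move for such a system, a trial displacement accepted only if no pair of hard hyperspheres overlaps, is sketched below in 4D with minimum-image periodic boundaries. The particle number, box size and step size are toy values chosen for illustration; a production code would start from a non-overlapping configuration, equilibrate, and use cell lists for efficiency.

```python
import numpy as np

rng = np.random.default_rng(2)
DIM, N, L = 4, 32, 4.0                                   # 4D box, toy size
sigma = np.where(np.arange(N) < N // 2, 1.0, 0.5)        # binary mixture of diameters

pos = rng.random((N, DIM)) * L                           # random start (may contain overlaps;
                                                         # a real run would start from a lattice)

def overlaps(pos, i, trial):
    """True if particle i placed at 'trial' overlaps any other hypersphere (minimum image)."""
    d = pos - trial
    d -= L * np.round(d / L)                             # periodic boundary conditions
    r2 = np.sum(d * d, axis=1)
    r2[i] = np.inf                                       # ignore self
    contact = 0.5 * (sigma + sigma[i])                   # additive contact distances
    return np.any(r2 < contact ** 2)

accepted = 0
for _ in range(10000):
    i = rng.integers(N)
    trial = (pos[i] + 0.1 * (rng.random(DIM) - 0.5)) % L
    if not overlaps(pos, i, trial):                      # hard-core Metropolis rule
        pos[i] = trial
        accepted += 1
print("acceptance ratio:", accepted / 10000)
```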
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jung, J; Pelletier, C; Lee, C
Purpose: Organ doses for Hodgkin’s lymphoma patients treated with cobalt-60 radiation were estimated using an anthropomorphic model and Monte Carlo modeling. Methods: A cobalt-60 treatment unit modeled in the BEAMnrc Monte Carlo code was used to produce phase space data. The Monte Carlo simulation was verified with percent depth dose measurements in water at various field sizes. Radiation transport through the lung blocks was modeled by adjusting the weights of the phase space data. We imported a precontoured adult female hybrid model and generated a treatment plan. The adjusted phase space data and the human model were imported into the XVMC Monte Carlo code for dose calculation. The organ mean doses were estimated and dose volume histograms were plotted. Results: The percent depth dose agreement between measurement and calculation in the water phantom was within 2% for all field sizes. The mean organ doses of the heart, left breast, right breast, and spleen for the selected case were 44.3, 24.1, 14.6 and 3.4 Gy, respectively, with a midline prescription dose of 40.0 Gy. Conclusion: Organ doses were estimated for the patient group whose three-dimensional images are not available. This development may open the door to more accurate dose reconstruction and estimates of uncertainties in secondary cancer risk for Hodgkin’s lymphoma patients. This work was partially supported by the intramural research program of the National Institutes of Health, National Cancer Institute, Division of Cancer Epidemiology and Genetics.
Advances in Monte-Carlo code TRIPOLI-4®'s treatment of the electromagnetic cascade
NASA Astrophysics Data System (ADS)
Mancusi, Davide; Bonin, Alice; Hugot, François-Xavier; Malouch, Fadhel
2018-01-01
TRIPOLI-4® is a Monte-Carlo particle-transport code developed at CEA-Saclay (France) that is employed in the domains of nuclear-reactor physics, criticality-safety, shielding/radiation protection and nuclear instrumentation. The goal of this paper is to report on current developments, validation and verification made in TRIPOLI-4 in the electron/positron/photon sector. The new capabilities and improvements concern refinements to the electron transport algorithm, the introduction of a charge-deposition score, the new thick-target bremsstrahlung option, the upgrade of the bremsstrahlung model and the improvement of electron angular straggling at low energy. The importance of each of the developments above is illustrated by comparisons with calculations performed with other codes and with experimental data.
McSKY: A hybrid Monte-Carlo line-beam code for shielded gamma skyshine calculations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shultis, J.K.; Faw, R.E.; Stedry, M.H.
1994-07-01
McSKY evaluates the skyshine dose from an isotropic, monoenergetic, point photon source collimated into either a vertical cone or a vertical structure with an N-sided polygonal cross section. The code assumes an overhead shield of two materials, though the user can specify zero shield thickness for an unshielded calculation. The code uses a Monte Carlo algorithm to evaluate transport through the source shields and the integral line source to describe photon transport through the atmosphere. The source energy must be between 0.02 and 100 MeV. For heavily shielded sources with energies above 20 MeV, McSKY results must be used cautiously, especially at detector locations near the source.
NASA Astrophysics Data System (ADS)
De Napoli, M.; Romano, F.; D'Urso, D.; Licciardello, T.; Agodi, C.; Candiano, G.; Cappuzzello, F.; Cirrone, G. A. P.; Cuttone, G.; Musumarra, A.; Pandola, L.; Scuderi, V.
2014-12-01
When a carbon beam interacts with human tissues, many secondary fragments are produced in the tumor region and the surrounding healthy tissues. Therefore, in hadrontherapy precise dose calculations require Monte Carlo tools equipped with complex nuclear reaction models. To get realistic predictions, however, simulation codes must be validated against experimental results; the wider the dataset is, the more finely the models can be tuned. Since no fragmentation data for tissue-equivalent materials at Fermi energies are available in the literature, we measured the secondary fragments produced by the interaction of a 55.6 MeV/u 12C beam with thick muscle and cortical bone targets. Three reaction models used by the Geant4 Monte Carlo code, the Binary Light Ion Cascade, the Quantum Molecular Dynamics and the Liège Intranuclear Cascade, have been benchmarked against the collected data. In this work we present the experimental results and discuss the predictive power of the above-mentioned models.
Comparison of UWCC MOX fuel measurements to MCNP-REN calculations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abhold, M.; Baker, M.; Jie, R.
1998-12-31
The development of neutron coincidence counting has greatly improved the accuracy and versatility of neutron-based techniques to assay fissile materials. Today, the shift register analyzer connected to either a passive or active neutron detector is widely used by both domestic and international safeguards organizations. The continued development of these techniques and detectors makes extensive use of predictions of detector response obtained with Monte Carlo techniques in conjunction with the point reactor model. Unfortunately, the point reactor model, as it is currently used, fails to accurately predict detector response in highly multiplying media such as mixed-oxide (MOX) light water reactor fuel assemblies. For this reason, efforts have been made to modify the currently used Monte Carlo codes and to develop new analytical methods so that this model is not required to predict detector response. The authors describe their efforts to modify a widely used Monte Carlo code for this purpose and also compare calculational results with experimental measurements.
Mosleh-Shirazi, Mohammad Amin; Zarrini-Monfared, Zinat; Karbasi, Sareh; Zamani, Ali
2014-01-01
Two-dimensional (2D) arrays of thick segmented scintillators are of interest as X-ray detectors for both 2D and 3D image-guided radiotherapy (IGRT). Their detection process involves ionizing radiation energy deposition followed by the production and transport of optical photons. Only a very limited number of optical Monte Carlo simulation models exist, which has limited the number of modeling studies that have considered both stages of the detection process. We present ScintSim1, an in-house optical Monte Carlo simulation code for 2D arrays of scintillation crystals, developed in the MATLAB programming environment. The code was rewritten and revised based on an existing program for single-element detectors, with the additional capability to model 2D arrays of elements with configurable dimensions, material, etc. The code generates and follows each optical photon history through the detector element (and, in case of cross-talk, the surrounding ones) until it reaches a configurable receptor or is attenuated. The new model was verified by testing against relevant theoretically known behaviors or quantities and against the results of a validated single-element model. For both sets of comparisons, the discrepancies in the calculated quantities were all <1%. The results validate the accuracy of the new code, which is a useful tool in scintillation detector optimization. PMID:24600168
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morris, R; Lakshmanan, M; Fong, G
Purpose: Coherent scatter based imaging has shown improved contrast and molecular specificity over conventional digital mammography; however, the biological risks have not been quantified due to a lack of accurate information on absorbed dose. This study intends to characterize the dose distribution and average glandular dose from coded aperture coherent scatter spectral imaging of the breast. The dose deposited in the breast from this new diagnostic imaging modality has not yet been quantitatively evaluated. Here, various digitized anthropomorphic phantoms are tested in a Monte Carlo simulation to evaluate the absorbed dose distribution and average glandular dose using clinically feasible scan protocols. Methods: The Geant4 Monte Carlo radiation transport simulation software is used to replicate the coded aperture coherent scatter spectral imaging system. Energy-sensitive, photon-counting detectors are used to characterize the x-ray beam spectra for the various imaging protocols. These input spectra are cross-validated with the results from XSPECT, a commercially available application that yields x-ray tube-specific spectra for the operating parameters employed. XSPECT is also used to determine the appropriate number of photons emitted per mAs of tube current at a given kVp tube potential. With the implementation of the XCAT digital anthropomorphic breast phantom library, a variety of breast sizes with differing anatomical structure are evaluated. Simulations were performed with and without compression of the breast for dose comparison. Results: Through the Monte Carlo evaluation of a diverse population of breast types imaged under real-world scan conditions, a clinically relevant average glandular dose for this new imaging modality is extrapolated. Conclusion: With access to the physical coherent scatter imaging system used in the simulation, the results of this Monte Carlo study may be used to directly influence the future development of the modality to keep breast dose to a minimum while still maintaining clinically viable image quality.
SPAMCART: a code for smoothed particle Monte Carlo radiative transfer
NASA Astrophysics Data System (ADS)
Lomax, O.; Whitworth, A. P.
2016-10-01
We present a code for generating synthetic spectral energy distributions and intensity maps from smoothed particle hydrodynamics simulation snapshots. The code is based on the Lucy Monte Carlo radiative transfer method, i.e. it follows discrete luminosity packets as they propagate through a density field and then uses their trajectories to compute the radiative equilibrium temperature of the ambient dust. The sources can be extended and/or embedded, and discrete and/or diffuse. The density is not mapped onto a grid, and therefore the calculation is performed at exactly the same resolution as the hydrodynamics. We present two example calculations using this method. First, we demonstrate that the code strictly adheres to Kirchhoff's law of radiation. Secondly, we present synthetic intensity maps and spectra of an embedded protostellar multiple system. The algorithm uses data structures that are already constructed for other purposes in modern particle codes. It is therefore relatively simple to implement.
NASA Astrophysics Data System (ADS)
Prettyman, T. H.; Gardner, R. P.; Verghese, K.
1993-08-01
A new specific purpose Monte Carlo code called McENL for modeling the time response of epithermal neutron lifetime tools is described. The weight windows technique, employing splitting and Russian roulette, is used with an automated importance function based on the solution of an adjoint diffusion model to improve the code efficiency. Complete composition and density correlated sampling is also included in the code, and can be used to study the effect on tool response of small variations in the formation, borehole, or logging tool composition and density. An illustration of the latter application is given for the density of a thermal neutron filter. McENL was benchmarked against test-pit data for the Mobil pulsed neutron porosity tool and was found to be very accurate. Results of the experimental validation and details of code performance are presented.
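The weight-window game referred to above can be illustrated compactly: particles whose statistical weight falls below the window are rouletted, and particles above it are split. The sketch below is a generic illustration of that variance-reduction step with arbitrary window parameters, not McENL's actual implementation or its adjoint-based importance function.

```python
import random

def apply_weight_window(weight, w_low, w_high, w_survive):
    """Return the list of weights that replace one particle after the weight-window game."""
    if weight < w_low:                         # Russian roulette
        if random.random() < weight / w_survive:
            return [w_survive]                 # survives with boosted weight (expected weight preserved)
        return []                              # killed
    if weight > w_high:                        # splitting
        n = int(weight / w_high) + 1
        return [weight / n] * n                # n copies; total weight conserved exactly
    return [weight]                            # inside the window: unchanged

random.seed(0)
print(apply_weight_window(0.01, 0.1, 1.0, 0.5))   # most likely killed
print(apply_weight_window(3.7, 0.1, 1.0, 0.5))    # split into 4 copies of weight 0.925
```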
Yoriyaz, Hélio; Moralles, Maurício; Siqueira, Paulo de Tarso Dalledone; Guimarães, Carla da Costa; Cintra, Felipe Belonsi; dos Santos, Adimir
2009-11-01
Radiopharmaceutical applications in nuclear medicine require a detailed dosimetric estimate of the radiation energy delivered to the human tissues. Over the past years, several publications have addressed the problem of internal dose estimation in volumes of various sizes considering photon and electron sources. Most of them used Monte Carlo radiation transport codes. Despite the widespread use of these codes due to the variety of resources and capabilities they offer for carrying out dose calculations, several aspects such as the physical models, cross sections, and numerical approximations used in the simulations still remain objects of study. Accurate dose estimation depends on the correct selection of a set of simulation options that should be carefully chosen. This article presents an analysis of several simulation options provided by two of the most widely used codes worldwide: MCNP and GEANT4. For this purpose, comparisons of absorbed fraction estimates obtained with different physical models, cross sections, and numerical approximations are presented for spheres of several sizes composed of five different biological tissues. Considerable discrepancies have been found in some cases, not only between the different codes but also between different cross sections and algorithms in the same code. The maximum differences found between the two codes are 5.0% and 10%, respectively, for photons and electrons. Even for problems as simple as spheres and uniform radiation sources, the set of parameters chosen by any Monte Carlo code significantly affects the final results of a simulation, demonstrating the importance of the correct choice of parameters in the simulation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mille, M; Lee, C; Failla, G
Purpose: To use the Attila deterministic solver as a supplement to Monte Carlo for calculating out-of-field organ dose in support of epidemiological studies looking at the risks of second cancers. Supplemental dosimetry tools are needed to speed up dose calculations for studies involving large-scale patient cohorts. Methods: Attila is a multi-group discrete ordinates code which can solve the 3D photon-electron coupled linear Boltzmann radiation transport equation on a finite-element mesh. Dose is computed by multiplying the calculated particle flux in each mesh element by a medium-specific energy deposition cross-section. The out-of-field dosimetry capability of Attila is investigated by comparing average organ dose to that calculated by Monte Carlo simulation. The test scenario consists of a 6 MV external beam treatment of a female patient with a tumor in the left breast. The patient is simulated by a whole-body adult reference female computational phantom. Monte Carlo simulations were performed using MCNP6 and XVMC. Attila can export a tetrahedral mesh for MCNP6, allowing for a direct comparison between the two codes. The Attila and Monte Carlo methods were also compared in terms of calculation speed and complexity of simulation setup. A key prerequisite for this work was the modeling of a Varian Clinac 2100 linear accelerator. Results: The solid mesh of the torso part of the adult female phantom for the Attila calculation was prepared using the CAD software SpaceClaim. Preliminary calculations suggest that Attila is a user-friendly software package which shows great promise for our intended application. Computational performance is related to the number of tetrahedral elements included in the Attila calculation. Conclusion: Attila is being explored as a supplement to the conventional Monte Carlo radiation transport approach for performing retrospective patient dosimetry. The goal is for the dosimetry to be sufficiently accurate for use in retrospective epidemiological investigations.
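The dose recipe quoted above, multiplying the flux in each mesh element by a medium-specific energy-deposition cross-section, is just a group-wise sum. The sketch below uses made-up 4-group fluxes and flux-to-dose coefficients purely to show the bookkeeping; it is not Attila's data or group structure.

```python
import numpy as np

# placeholder 4-group data for one mesh element (assumed values, illustration only)
phi = np.array([2.0e9, 1.5e9, 8.0e8, 3.0e8])             # group fluxes (cm^-2 s^-1)
kappa = np.array([1.1e-10, 2.3e-10, 4.0e-10, 9.0e-10])   # energy-deposition coefficients (Gy cm^2)

dose_rate = float(np.dot(phi, kappa))                    # D = sum over groups of phi_g * kappa_g
print(f"dose rate in this element: {dose_rate:.3e} Gy/s")
```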
A Monte Carlo method using octree structure in photon and electron transport
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ogawa, K.; Maeda, S.
Most of the early Monte Carlo calculations in medical physics were used to calculate absorbed dose distributions and detector responses and efficiencies. Recently, data acquisition in Single Photon Emission CT (SPECT) has been simulated by Monte Carlo methods to evaluate the scattered photons generated in the human body and the collimator. Monte Carlo simulations of SPECT data acquisition are generally based on the transport of photons only, because the photons being simulated are of low energy and therefore the bremsstrahlung production by the generated electrons is negligible. Since the transport calculation of photons without electrons is much simpler than that with electrons, high-speed simulation is possible in a simple object with one medium. Here, the object description is important for performing photon and/or electron transport with a Monte Carlo method efficiently. The authors propose a new description method using an octree representation of an object. Thus, even if the boundaries of each medium are represented accurately, high-speed calculation of photon transport can be accomplished because the number of voxels is much smaller than in the voxel-based approach, which represents an object as a union of voxels of the same size. This Monte Carlo code using the octree representation of an object first establishes the simulation geometry by reading an octree string, which is produced by forming an octree structure from a set of serial sections of the object before the simulation; it then transports photons in this geometry. Using the code, if the user simply prepares a set of serial sections for the object in which he or she wants to simulate photon trajectories, the simulation can be performed automatically using the suboptimal geometry simplified by the octree representation, without constructing the optimal geometry by hand.
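As a rough illustration of the octree idea described above, uniform regions collapsed into single leaf nodes and heterogeneous regions subdivided into eight children, here is a toy octree builder over a 3D voxel array. It is not the authors' code: it considers only the medium index and assumes a cubic, power-of-two volume.

```python
import numpy as np

class OctreeNode:
    def __init__(self, origin, size, medium=None, children=None):
        self.origin = origin          # (x, y, z) corner index of the cube
        self.size = size              # edge length in voxels (power of two)
        self.medium = medium          # medium index if the cube is uniform (leaf)
        self.children = children      # list of 8 sub-nodes otherwise

def build_octree(vol, origin=(0, 0, 0), size=None):
    """Collapse uniform cubes into leaves; split mixed cubes into 8 children."""
    if size is None:
        size = vol.shape[0]           # assumes a cubic, power-of-two volume
    x, y, z = origin
    block = vol[x:x + size, y:y + size, z:z + size]
    if block.min() == block.max():    # uniform medium -> single leaf node
        return OctreeNode(origin, size, medium=int(block.flat[0]))
    half = size // 2
    children = [build_octree(vol, (x + dx, y + dy, z + dz), half)
                for dx in (0, half) for dy in (0, half) for dz in (0, half)]
    return OctreeNode(origin, size, children=children)

def count_leaves(node):
    return 1 if node.children is None else sum(count_leaves(c) for c in node.children)

vol = np.zeros((8, 8, 8), dtype=int)  # water everywhere ...
vol[2:4, 2:4, 2:4] = 1                # ... with a small second-medium insert
tree = build_octree(vol)
print("octree leaves:", count_leaves(tree), "vs", vol.size, "voxels")
```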
Verification of unfold error estimates in the UFO code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fehl, D.L.; Biggs, F.
Spectral unfolding is an inverse mathematical operation which attempts to obtain spectral source information from a set of tabulated response functions and data measurements. Several unfold algorithms have appeared over the past 30 years; among them is the UFO (UnFold Operator) code. In addition to an unfolded spectrum, UFO also estimates the unfold uncertainty (error) induced by running the code in a Monte Carlo fashion with prescribed data distributions (Gaussian deviates). In the problem studied, data were simulated from an arbitrarily chosen blackbody spectrum (10 keV) and a set of overlapping response functions. The data were assumed to have an imprecision of 5% (standard deviation), and 100 random data sets were generated. The built-in estimate of unfold uncertainty agreed with the Monte Carlo estimate to within the statistical resolution of this relatively small sample size (95% confidence level). A possible 10% bias between the two methods was unresolved. The Monte Carlo technique is also useful in underdetermined problems, for which the error matrix method does not apply. UFO has been applied to the diagnosis of low-energy x rays emitted by Z-pinch and ion-beam driven hohlraums.
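The Monte Carlo error estimate described above amounts to repeatedly perturbing the measured data with Gaussian deviates and re-running the unfold, then taking the spread of the unfolded spectra. The schematic below substitutes a simple ridge-regularized least-squares unfold for UFO's actual algorithm, and the response matrix and spectrum are placeholders.

```python
import numpy as np

rng = np.random.default_rng(3)

n_chan, n_bins = 12, 8
R = rng.random((n_chan, n_bins))                 # placeholder response functions
true_spec = np.exp(-np.arange(n_bins) / 3.0)     # placeholder "true" spectrum
data = R @ true_spec                             # noiseless synthetic measurements

def unfold(d, R, ridge=1e-3):
    """Stand-in unfold: ridge-regularized least squares (UFO uses its own algorithm)."""
    A = R.T @ R + ridge * np.eye(R.shape[1])
    return np.linalg.solve(A, R.T @ d)

# Monte Carlo propagation of a 5% (1 sigma) data imprecision through the unfold
trials = np.array([unfold(data * (1 + 0.05 * rng.standard_normal(n_chan)), R)
                   for _ in range(100)])
spec_mean = trials.mean(axis=0)                  # central unfolded spectrum
spec_err = trials.std(axis=0)                    # Monte Carlo unfold uncertainty
print(np.round(spec_mean, 3))
print(np.round(spec_err, 3))
```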
Monte Carlo track structure for radiation biology and space applications
NASA Technical Reports Server (NTRS)
Nikjoo, H.; Uehara, S.; Khvostunov, I. G.; Cucinotta, F. A.; Wilson, W. E.; Goodhead, D. T.
2001-01-01
Over the past two decades, event-by-event Monte Carlo track structure codes have increasingly been used for biophysical modelling and radiotherapy. The advent of these codes has helped shed light on many aspects of microdosimetry and on the mechanisms of damage by ionising radiation in the cell. These codes have continuously been modified to include new, improved cross sections and computational techniques. This paper provides a summary of the input data for ionization, excitation and elastic scattering cross sections for event-by-event Monte Carlo track structure simulations for electrons and ions, in the form of parametric equations, which makes it easy to reproduce the data. Stopping powers and radial distributions of dose are presented for ions and compared with experimental data. A model is described for the simulation of the full slowing down of proton tracks in water in the range 1 keV to 1 MeV. Modelling and calculations are presented for the response of a TEPC proportional counter irradiated with 5 MeV alpha-particles. Distributions are presented for walled and wall-less counters. The data show the contribution of indirect effects to the lineal energy distribution for the walled counter responses even at such a low ion energy.
Benchmarking study of the MCNP code against cold critical experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sitaraman, S.
1991-01-01
The purpose of this study was to benchmark the widely used Monte Carlo code MCNP against a set of cold critical experiments with a view to using the code as a means of independently verifying the performance of faster but less accurate Monte Carlo and deterministic codes. The experiments simulated consisted of both fast and thermal criticals as well as fuel in a variety of chemical forms. A standard set of benchmark cold critical experiments was modeled. These included the two fast experiments, GODIVA and JEZEBEL, the TRX metallic uranium thermal experiments, the Babcock and Wilcox oxide and mixed oxide experiments, and the Oak Ridge National Laboratory (ORNL) and Pacific Northwest Laboratory (PNL) nitrate solution experiments. The principal case studied was a small critical experiment that was performed with boiling water reactor bundles.
CMacIonize: Monte Carlo photoionisation and moving-mesh radiation hydrodynamics
NASA Astrophysics Data System (ADS)
Vandenbroucke, Bert; Wood, Kenneth
2018-02-01
CMacIonize simulates the self-consistent evolution of HII regions surrounding young O and B stars, or other sources of ionizing radiation. The code combines a Monte Carlo photoionization algorithm that uses a complex mix of hydrogen, helium and several coolants in order to self-consistently solve for the ionization and temperature balance at any given time, with a standard first order hydrodynamics scheme. The code can be run as a post-processing tool to get the line emission from an existing simulation snapshot, but can also be used to run full radiation hydrodynamical simulations. Both the radiation transfer and the hydrodynamics are implemented in a general way that is independent of the grid structure that is used to discretize the system, allowing it to be run both as a standard fixed grid code and also as a moving-mesh code.
Arthur, Jennifer; Bahran, Rian; Hutchinson, Jesson; ...
2018-06-14
Historically, radiation transport codes have treated fission emissions as uncorrelated. In reality, the particles emitted by both spontaneous and induced fissions are correlated in time, energy, angle, and multiplicity. This work validates the performance of various current Monte Carlo codes that take into account the underlying correlated physics of fission neutrons, specifically neutron multiplicity distributions. The performance of 4 Monte Carlo codes - MCNP®6.2, MCNP®6.2/FREYA, MCNP®6.2/CGMF, and PoliMi - was assessed using neutron multiplicity benchmark experiments. In addition, MCNP®6.2 simulations were run using JEFF-3.2 and JENDL-4.0, rather than ENDF/B-VII.1, data for 239Pu and 240Pu. The sensitive benchmark parameters that in this work represent the performance of each correlated fission multiplicity Monte Carlo code include the singles rate, the doubles rate, leakage multiplication, and Feynman histograms. Although it is difficult to determine which radiation transport code shows the best overall performance in simulating subcritical neutron multiplication inference benchmark measurements, it is clear that correlations exist between the underlying nuclear data utilized by (or generated by) the various codes and the correlated neutron observables of interest. This could prove useful in nuclear data validation and evaluation applications, in which a particular moment of the neutron multiplicity distribution is of more interest than the other moments. It is also quite clear that, because transport is handled by MCNP®6.2 in 3 of the 4 codes, with the 4th code (PoliMi) being based on an older version of MCNP®, the differences in correlated neutron observables of interest are most likely due to the treatment of fission event generation in each of the different codes, as opposed to the radiation transport.
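One of the benchmark observables mentioned, the Feynman histogram, is built by counting detected neutrons in fixed time gates; the excess variance-to-mean ratio (Feynman-Y) then quantifies the correlated multiplicity. The hedged sketch below runs on a synthetic, uncorrelated Poisson pulse train, so Y should be near zero; real fission-chain data would show a positive, gate-width-dependent Y.

```python
import numpy as np

rng = np.random.default_rng(4)

# synthetic detection times: a pure Poisson (uncorrelated) pulse train, in seconds
rate, t_max = 1.0e4, 10.0
times = np.sort(rng.uniform(0.0, t_max, rng.poisson(rate * t_max)))

def feynman_y(times, gate_width, t_max):
    """Histogram counts per time gate and return the excess variance-to-mean ratio."""
    edges = np.arange(0.0, t_max, gate_width)
    counts, _ = np.histogram(times, bins=edges)
    m = counts.mean()
    return counts.var() / m - 1.0          # Y = var/mean - 1 (zero for a pure Poisson source)

for gate in (1e-5, 1e-4, 1e-3):
    print(f"gate {gate:.0e} s : Y = {feynman_y(times, gate, t_max):+.4f}")
```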
Comparative Dosimetric Estimates of a 25 keV Electron Micro-beam with three Monte Carlo Codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mainardi, Enrico; Donahue, Richard J.; Blakely, Eleanor A.
2002-09-11
The calculations presented compare the performance of the three Monte Carlo codes PENELOPE-1999, MCNP-4C and PITS for the evaluation of dose profiles from a 25 keV electron micro-beam traversing individual cells. The overall model of a cell is a water cylinder equivalent for the three codes but with a different internal scoring geometry: hollow cylinders for PENELOPE and MCNP, whereas spheres are used for the PITS code. A cylindrical cell geometry with scoring volumes in the shape of hollow cylinders was initially selected for PENELOPE and MCNP because it better represents the actual shape and dimensions of a cell and improves computer-time efficiency compared to spherical internal volumes. Some of the energy-transfer points that constitute a radiation track may actually fall in the space between spheres and would therefore lie outside the spherical scoring volumes. This internal geometry, along with the PENELOPE algorithm, drastically reduced the computer time when using this code compared with event-by-event Monte Carlo codes like PITS. This preliminary work has been important for addressing dosimetric estimates at low electron energies. It demonstrates that codes like PENELOPE can be used for dose evaluation even with such small geometries and low energies, which are far below the normal use for which the code was created. Further work (initiated in Summer 2002) is still needed, however, to create a user-code for PENELOPE that allows uniform comparison of exact cell geometries, integral volumes and also microdosimetric scoring quantities, a field where track-structure codes like PITS, written for this purpose, are believed to be superior.
Coupled Monte Carlo neutronics and thermal hydraulics for power reactors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bernnat, W.; Buck, M.; Mattes, M.
The availability of high performance computing resources increasingly enables the use of detailed Monte Carlo models even for full core power reactors. The detailed structure of the core can be described by lattices, modeled by so-called repeated structures in Monte Carlo codes such as MCNP5 or MCNPX. For cores with mainly uniform material compositions, fuel and moderator temperatures, constructing core models poses no problem. However, when the material composition and the temperatures vary strongly, a huge number of different material cells must be described, which complicates the input and in many cases exceeds code or memory limits. A second problem arises with the preparation of the corresponding temperature-dependent cross sections and thermal scattering laws. Only if these problems are solved is a realistic coupling of Monte Carlo neutronics with an appropriate thermal-hydraulics model possible. In this paper a method for the treatment of detailed material and temperature distributions in MCNP5 is described, based on user-specified internal functions which assign distinct elements of the core cells to material specifications (e.g. water density) and temperatures from a thermal-hydraulics code. The core grid itself can be described with a uniform material specification. The temperature dependency of cross sections and thermal neutron scattering laws is taken into account by interpolation, requiring only a limited number of data sets generated for different temperatures. Applications will be shown for the stationary part of the Purdue PWR benchmark using ATHLET for thermal-hydraulics and for a generic Modular High Temperature Reactor using THERMIX for thermal-hydraulics. (authors)
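The temperature interpolation of cross sections described above can be pictured with the following sketch. It assumes, purely for illustration, a linear interpolation in the square root of temperature between two pre-broadened datasets on a common energy grid; the interpolation actually used in the paper's MCNP5 treatment may differ in detail, and the numbers below are made up.

```python
import numpy as np

def interp_xs_sqrtT(xs_lo, xs_hi, T_lo, T_hi, T):
    """Estimate a cross section at temperature T by linear interpolation in
    sqrt(T) between two datasets Doppler-broadened at T_lo and T_hi."""
    f = (np.sqrt(T) - np.sqrt(T_lo)) / (np.sqrt(T_hi) - np.sqrt(T_lo))
    return (1.0 - f) * np.asarray(xs_lo) + f * np.asarray(xs_hi)

# Illustrative use: datasets broadened at 600 K and 900 K, queried at 750 K.
xs_600 = np.array([10.2, 4.1, 1.3])   # barns, on some shared energy grid
xs_900 = np.array([10.6, 4.4, 1.5])
print(interp_xs_sqrtT(xs_600, xs_900, 600.0, 900.0, 750.0))
```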
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, T; Lin, H; Xu, X
Purpose: (1) To perform phase space (PS) based source modeling for Tomotherapy and Varian TrueBeam 6 MV Linacs, (2) to examine the accuracy and performance of the ARCHER Monte Carlo code on a heterogeneous computing platform with Many Integrated Core coprocessors (MIC, aka Xeon Phi) and GPUs, and (3) to explore the software micro-optimization methods. Methods: The patient-specific source of the Tomotherapy and Varian TrueBeam Linacs was modeled using the PS approach. For the helical Tomotherapy case, the PS data were calculated in our previous study (Su et al. 2014 41(7) Medical Physics). For the single-view Varian TrueBeam case, we analytically derived them from the raw patient-independent PS data in IAEA’s database, partial geometry information of the jaw and MLC, as well as the fluence map. The phantom was generated from DICOM images. The Monte Carlo simulation was performed by the ARCHER-MIC and GPU codes, which were benchmarked against a modified parallel DPM code. Software micro-optimization was systematically conducted and focused on SIMD vectorization of tight for-loops and data prefetch, with the ultimate goal of increasing 512-bit register utilization and reducing memory access latency. Results: Dose calculation was performed for two clinical cases, a Tomotherapy-based prostate cancer treatment and a TrueBeam-based left breast treatment. ARCHER was verified against the DPM code. The statistical uncertainty of the dose to the PTV was less than 1%. Using double precision, the total wall time of the multithreaded CPU code on a X5650 CPU was 339 seconds for the Tomotherapy case and 131 seconds for the TrueBeam case, while on three 5110P MICs it was reduced to 79 and 59 seconds, respectively. The single-precision GPU code on a K40 GPU took 45 seconds for the Tomotherapy dose calculation. Conclusion: We have extended ARCHER, the MIC- and GPU-based Monte Carlo dose engine, to Tomotherapy and TrueBeam dose calculations.
Light transport feature for SCINFUL.
Etaati, G R; Ghal-Eh, N
2008-03-01
An extended version of the scintillator response function prediction code SCINFUL has been developed by incorporating PHOTRACK, a Monte Carlo light transport code. Comparisons of calculated and experimental results for organic scintillators exposed to neutrons show that the extended code improves the predictive capability of SCINFUL.
NASA Astrophysics Data System (ADS)
Tsinganis, A.; Kokkoris, M.; Vlastou, R.; Kalamara, A.; Stamatopoulos, A.; Kanellakopoulos, A.; Lagoyannis, A.; Axiotis, M.
2017-09-01
Accurate data on neutron-induced fission cross-sections of actinides are essential for the design of advanced nuclear reactors based either on fast neutron spectra or alternative fuel cycles, as well as for the reduction of safety margins of existing and future conventional facilities. The fission cross-section of 234U was measured at incident neutron energies of 560 and 660 keV and 7.5 MeV with a setup based on `microbulk' Micromegas detectors and the same samples previously used for the measurement performed at the CERN n_TOF facility (Karadimos et al., 2014). The 235U fission cross-section was used as reference. The (quasi-)monoenergetic neutron beams were produced via the 7Li(p,n) and the 2H(d,n) reactions at the neutron beam facility of the Institute of Nuclear and Particle Physics at the `Demokritos' National Centre for Scientific Research. A detailed study of the neutron spectra produced in the targets and intercepted by the samples was performed coupling the NeuSDesc and MCNPX codes, taking into account the energy spread, energy loss and angular straggling of the beam ions in the target assemblies, as well as contributions from competing reactions and neutron scattering in the experimental setup. Auxiliary Monte-Carlo simulations were performed with the FLUKA code to study the behaviour of the detectors, focusing particularly on the reproduction of the pulse height spectra of α-particles and fission fragments (using distributions produced with the GEF code) for the evaluation of the detector efficiency. An overview of the developed methodology and preliminary results are presented.
NASA Astrophysics Data System (ADS)
Allaf, M. Athari; Shahriari, M.; Sohrabpour, M.
2004-04-01
A new method using Monte Carlo source simulation of interference reactions in neutron activation analysis experiments has been developed. The neutron spectrum at the sample location has been simulated using the Monte Carlo code MCNP and the contributions of different elements to produce a specified gamma line have been determined. The produced response matrix has been used to measure peak areas and the sample masses of the elements of interest. A number of benchmark experiments have been performed and the calculated results verified against known values. The good agreement obtained between the calculated and known values suggests that this technique may be useful for the elimination of interference reactions in neutron activation analysis.
Data decomposition of Monte Carlo particle transport simulations via tally servers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Romano, Paul K.; Siegel, Andrew R.; Forget, Benoit
An algorithm for decomposing large tally data in Monte Carlo particle transport simulations is developed, analyzed, and implemented in a continuous-energy Monte Carlo code, OpenMC. The algorithm is based on a non-overlapping decomposition of compute nodes into tracking processors and tally servers. The former are used to simulate the movement of particles through the domain while the latter continuously receive and update tally data. A performance model for this approach is developed, suggesting that, for a range of parameters relevant to LWR analysis, the tally server algorithm should perform with minimal overhead on contemporary supercomputers. An implementation of the algorithm in OpenMC is then tested on the Intrepid and Titan supercomputers, supporting the key predictions of the model over a wide range of parameters. We thus conclude that the tally server algorithm is a successful approach to circumventing classical on-node memory constraints en route to unprecedentedly detailed Monte Carlo reactor simulations.
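A minimal sketch of the tracking-processor/tally-server split is given below. It is not the OpenMC implementation; it only mimics the shape of the algorithm with Python multiprocessing, a made-up one-million-bin tally and random "transport", to show how trackers buffer scores and ship them in batches to a server process that owns the full tally array.

```python
import multiprocessing as mp
import numpy as np

N_BINS = 1_000_000        # stand-in for a large tally mesh
BATCH = 10_000            # scores buffered before shipping to the server

def tracker(n_hist, q, seed):
    """Simulate particle histories; buffer scores and send them in batches."""
    rng = np.random.default_rng(seed)
    idx, val = [], []
    for _ in range(n_hist):
        idx.append(rng.integers(N_BINS))   # stand-in "transport": random bin
        val.append(rng.exponential())      # and a random score
        if len(idx) >= BATCH:
            q.put((np.array(idx), np.array(val)))
            idx, val = [], []
    if idx:
        q.put((np.array(idx), np.array(val)))
    q.put(None)                            # this tracker is done

def tally_server(q, n_trackers, out):
    """Own the full tally array; accumulate batches as they arrive."""
    tally = np.zeros(N_BINS)
    done = 0
    while done < n_trackers:
        msg = q.get()
        if msg is None:
            done += 1
        else:
            np.add.at(tally, msg[0], msg[1])
    out.put(tally.sum())                   # report a summary back

if __name__ == "__main__":
    q, out = mp.Queue(), mp.Queue()
    server = mp.Process(target=tally_server, args=(q, 4, out))
    trackers = [mp.Process(target=tracker, args=(50_000, q, r)) for r in range(4)]
    server.start()
    for p in trackers:
        p.start()
    for p in trackers:
        p.join()
    print("total scored weight:", out.get())
    server.join()
```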
Using hybrid implicit Monte Carlo diffusion to simulate gray radiation hydrodynamics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cleveland, Mathew A., E-mail: cleveland7@llnl.gov; Gentile, Nick
This work describes how to couple a hybrid Implicit Monte Carlo Diffusion (HIMCD) method with a Lagrangian hydrodynamics code to evaluate the coupled radiation hydrodynamics equations. This HIMCD method dynamically applies Implicit Monte Carlo Diffusion (IMD) [1] to regions of a problem that are opaque and diffusive while applying standard Implicit Monte Carlo (IMC) [2] to regions where the diffusion approximation is invalid. We show that this method significantly improves the computational efficiency as compared to a standard IMC/Hydrodynamics solver, when optically thick diffusive material is present, while maintaining accuracy. Two test cases are used to demonstrate the accuracy and performance of HIMCD as compared to IMC and IMD. The first is the Lowrie semi-analytic diffusive shock [3]. The second is a simple test case where the source radiation streams through optically thin material and heats a thick diffusive region of material, causing it to rapidly expand. We found that HIMCD proves to be accurate, robust, and computationally efficient for these test problems.
Solution of the Burnett equations for hypersonic flows near the continuum limit
NASA Technical Reports Server (NTRS)
Imlay, Scott T.
1992-01-01
The INCA code, a three-dimensional Navier-Stokes code for analysis of hypersonic flowfields, was modified to analyze the lower reaches of the continuum transition regime, where the Navier-Stokes equations become inaccurate and Monte Carlo methods become too computationally expensive. The two-dimensional Burnett equations and the three-dimensional rotational energy transport equation were added to the code and one- and two-dimensional calculations were performed. For the structure of normal shock waves, the Burnett equations give consistently better results than Navier-Stokes equations and compare reasonably well with Monte Carlo methods. For two-dimensional flow of Nitrogen past a circular cylinder the Burnett equations predict the total drag reasonably well. Care must be taken, however, not to exceed the range of validity of the Burnett equations.
Monte Carlo simulation of ion-neutral charge exchange collisions and grid erosion in an ion thruster
NASA Technical Reports Server (NTRS)
Peng, Xiaohang; Ruyten, Wilhelmus M.; Keefer, Dennis
1991-01-01
A combined particle-in-cell (PIC)/Monte Carlo simulation model has been developed in which the PIC method is used to simulate the charge exchange collisions. It is noted that a number of features were reproduced correctly by this code, but that its assumption of two-dimensional axisymmetry for a single set of grid apertures precluded the reproduction of the most characteristic feature of actual test data; namely, the concentrated grid erosion at the geometric center of the hexagonal aperture array. The first results of a three-dimensional code, which takes into account the hexagonal symmetry of the grid, are presented. It is shown that, with this code, the experimentally observed erosion patterns are reproduced correctly, demonstrating explicitly the concentration of sputtering between apertures.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ghoos, K., E-mail: kristel.ghoos@kuleuven.be; Dekeyser, W.; Samaey, G.
2016-10-01
The plasma and neutral transport in the plasma edge of a nuclear fusion reactor is usually simulated using coupled finite volume (FV)/Monte Carlo (MC) codes. However, under conditions of future reactors like ITER and DEMO, convergence issues become apparent. This paper examines the convergence behaviour and the numerical error contributions with a simplified FV/MC model for three coupling techniques: Correlated Sampling, Random Noise and Robbins Monro. Also, practical procedures to estimate the errors in complex codes are proposed. Moreover, first results with more complex models show that an order of magnitude speedup can be achieved without any loss in accuracy by making use of averaging in the Random Noise coupling technique.
Monte Carlo simulation of energy-dispersive x-ray fluorescence and applications
NASA Astrophysics Data System (ADS)
Li, Fusheng
Four key components with regards to Monte Carlo Library Least Squares (MCLLS) have been developed by the author. These include: a comprehensive and accurate Monte Carlo simulation code - CEARXRF5 with Differential Operators (DO) and coincidence sampling, Detector Response Function (DRF), an integrated Monte Carlo - Library Least-Squares (MCLLS) Graphical User Interface (GUI) visualization System (MCLLSPro) and a new reproducible and flexible benchmark experiment setup. All these developments or upgrades enable the MCLLS approach to be a useful and powerful tool for a tremendous variety of elemental analysis applications. CEARXRF, a comprehensive and accurate Monte Carlo code for simulating the total and individual library spectral responses of all elements, has been recently upgraded to version 5 by the author. The new version has several key improvements: input file format fully compatible with MCNP5, a new efficient general geometry tracking code, versatile source definitions, various variance reduction techniques (e.g. weight window mesh and splitting, stratifying sampling, etc.), a new cross section data storage and accessing method which improves the simulation speed by a factor of four and new cross section data, upgraded differential operators (DO) calculation capability, and also an updated coincidence sampling scheme which including K-L and L-L coincidence X-Rays, while keeping all the capabilities of the previous version. The new Differential Operators method is powerful for measurement sensitivity study and system optimization. For our Monte Carlo EDXRF elemental analysis system, it becomes an important technique for quantifying the matrix effect in near real time when combined with the MCLLS approach. An integrated visualization GUI system has been developed by the author to perform elemental analysis using iterated Library Least-Squares method for various samples when an initial guess is provided. This software was built on the Borland C++ Builder platform and has a user-friendly interface to accomplish all qualitative and quantitative tasks easily. That is to say, the software enables users to run the forward Monte Carlo simulation (if necessary) or use previously calculated Monte Carlo library spectra to obtain the sample elemental composition estimation within a minute. The GUI software is easy to use with user-friendly features and has the capability to accomplish all related tasks in a visualization environment. It can be a powerful tool for EDXRF analysts. A reproducible experiment setup has been built and experiments have been performed to benchmark the system. Two types of Standard Reference Materials (SRM), stainless steel samples from National Institute of Standards and Technology (NIST) and aluminum alloy samples from Alcoa Inc., with certified elemental compositions, are tested with this reproducible prototype system using a 109Cd radioisotope source (20mCi) and a liquid nitrogen cooled Si(Li) detector. The results show excellent agreement between the calculated sample compositions and their reference values and the approach is very fast.
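The core of the library least-squares step can be illustrated with a short sketch: the measured spectrum is modeled as a linear combination of per-element library spectra produced by the forward Monte Carlo run, and the element weights are obtained by least squares. The spectra and dimensions below are fabricated, and the real MCLLS approach iterates this fit with updated Monte Carlo libraries, which the sketch omits.

```python
import numpy as np

def library_least_squares(measured, libraries):
    """Fit a measured spectrum as a linear combination of library spectra.

    measured  : (n_channels,) measured counts
    libraries : (n_channels, n_elements) simulated per-element responses
    Returns the element weights (proportional to amounts, before calibration).
    """
    weights, *_ = np.linalg.lstsq(libraries, measured, rcond=None)
    return weights

# Illustrative use with fabricated spectra (3 elements, 256 channels):
rng = np.random.default_rng(1)
libs = rng.random((256, 3))
true_w = np.array([2.0, 0.5, 1.2])
meas = libs @ true_w + rng.normal(0.0, 0.01, 256)   # "measured" spectrum + noise
print(library_least_squares(meas, libs))            # ~ [2.0, 0.5, 1.2]
```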
Characterizing a proton beam scanning system for Monte Carlo dose calculation in patients
NASA Astrophysics Data System (ADS)
Grassberger, C.; Lomax, Anthony; Paganetti, H.
2015-01-01
The presented work has two goals. First, to demonstrate the feasibility of accurately characterizing a proton radiation field at treatment head exit for Monte Carlo dose calculation of active scanning patient treatments. Second, to show that this characterization can be done based on measured depth dose curves and spot size alone, without consideration of the exact treatment head delivery system. This is demonstrated through calibration of a Monte Carlo code to the specific beam lines of two institutions, Massachusetts General Hospital (MGH) and Paul Scherrer Institute (PSI). Comparison of simulations modeling the full treatment head at MGH to ones employing a parameterized phase space of protons at treatment head exit reveals the adequacy of the method for patient simulations. The secondary particle production in the treatment head is typically below 0.2% of primary fluence, except for low-energy electrons (<0.6 MeV for 230 MeV protons), whose contribution to skin dose is negligible. However, there is significant difference between the two methods in the low-dose penumbra, making full treatment head simulations necessary to study out-of-field effects such as secondary cancer induction. To calibrate the Monte Carlo code to measurements in a water phantom, we use an analytical Bragg peak model to extract the range-dependent energy spread at the two institutions, as this quantity is usually not available through measurements. Comparison of the measured with the simulated depth dose curves demonstrates agreement within 0.5 mm over the entire energy range. Subsequently, we simulate three patient treatments with varying anatomical complexity (liver, head and neck and lung) to give an example how this approach can be employed to investigate site-specific discrepancies between treatment planning system and Monte Carlo simulations.
The Metropolis Monte Carlo method with CUDA enabled Graphic Processing Units
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hall, Clifford; School of Physics, Astronomy, and Computational Sciences, George Mason University, 4400 University Dr., Fairfax, VA 22030; Ji, Weixiao
2014-02-01
We present a CPU–GPU system for runtime acceleration of large molecular simulations using GPU computation and memory swaps. The memory architecture of the GPU can be used both as container for simulation data stored on the graphics card and as floating-point code target, providing an effective means for the manipulation of atomistic or molecular data on the GPU. To fully take advantage of this mechanism, efficient GPU realizations of algorithms used to perform atomistic and molecular simulations are essential. Our system implements a versatile molecular engine, including inter-molecule interactions and orientational variables for performing the Metropolis Monte Carlo (MMC) algorithm, which is one type of Markov chain Monte Carlo. By combining memory objects with floating-point code fragments we have implemented an MMC parallel engine that entirely avoids the communication time of molecular data at runtime. Our runtime acceleration system is a forerunner of a new class of CPU–GPU algorithms exploiting memory concepts combined with threading for avoiding bus bandwidth and communication. The testbed molecular system used here is a condensed phase system of oligopyrrole chains. A benchmark shows a size scaling speedup of 60 for systems with 210,000 pyrrole monomers. Our implementation can easily be combined with MPI to connect in parallel several CPU–GPU duets. Highlights: We parallelize the Metropolis Monte Carlo (MMC) algorithm on one CPU–GPU duet. The Adaptive Tempering Monte Carlo employs MMC and profits from this CPU–GPU implementation. Our benchmark shows a size scaling-up speedup of 62 for systems with 225,000 particles. The testbed involves a polymeric system of oligopyrroles in the condensed phase. The CPU–GPU parallelization includes dipole–dipole and Mie–Jones classic potentials.
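The Metropolis Monte Carlo move that the engine parallelizes follows the standard accept/reject rule, sketched below for a toy one-dimensional harmonic "molecule". The energy function, step size and temperature are illustrative, and the GPU memory-swap machinery of the paper is not represented.

```python
import numpy as np

def metropolis_step(x, energy, beta, step, rng):
    """One Metropolis move: propose a displacement, accept with
    probability min(1, exp(-beta * dE))."""
    x_new = x + rng.normal(scale=step, size=x.shape)
    dE = energy(x_new) - energy(x)
    if dE <= 0.0 or rng.random() < np.exp(-beta * dE):
        return x_new, True
    return x, False

# Illustrative use: sample a 1D harmonic coordinate, expect <x^2> ~ 1 at beta=1.
rng = np.random.default_rng(2)
energy = lambda x: 0.5 * np.sum(x**2)
x, accepted, samples = np.zeros(1), 0, []
for _ in range(20_000):
    x, ok = metropolis_step(x, energy, beta=1.0, step=1.0, rng=rng)
    accepted += ok
    samples.append(x[0])
print("acceptance:", accepted / len(samples),
      " <x^2>:", np.mean(np.square(samples[2000:])))
```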
NASA Astrophysics Data System (ADS)
Croce, Olivier; Hachem, Sabet; Franchisseur, Eric; Marcié, Serge; Gérard, Jean-Pierre; Bordy, Jean-Marc
2012-06-01
This paper presents a dosimetric study concerning the system named "Papillon 50" used in the department of radiotherapy of the Centre Antoine-Lacassagne, Nice, France. The machine provides a 50 kVp X-ray beam, currently used to treat rectal cancers. The system can be mounted with various applicators of different diameters or shapes. These applicators can be fixed over the main rod tube of the unit in order to deliver the prescribed absorbed dose into the tumor with an optimal distribution. We have analyzed depth dose curves and dose profiles for the naked tube and for a set of three applicators. Dose measurements were made with an ionization chamber (PTW type 23342) and Gafchromic films (EBT2). We have also compared the measurements with simulations performed using the Monte Carlo code PENELOPE. Simulations were performed with a detailed geometrical description of the experimental setup and with sufficient statistics. The simulation results are in accordance with the experimental measurements and provide an accurate evaluation of the dose delivered. The depths of the 50% isodose in water for the various applicators are 4.0, 6.0, 6.6 and 7.1 mm. The Monte Carlo PENELOPE simulations are in accordance with the measurements for a 50 kV X-ray system. Simulations are able to confirm the measurements provided by Gafchromic films or ionization chambers. Results also demonstrate that Monte Carlo simulations could be helpful to validate the future applicators designed for other localizations such as breast or skin cancers. Furthermore, Monte Carlo simulations could be a reliable alternative for a rapid evaluation of the dose delivered by such a system that uses multiple designs of applicators.
Continuous Energy Photon Transport Implementation in MCATK
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Terry R.; Trahan, Travis John; Sweezy, Jeremy Ed
2016-10-31
The Monte Carlo Application ToolKit (MCATK) code development team has implemented Monte Carlo photon transport into the MCATK software suite. The current particle transport capabilities in MCATK, which process the tracking and collision physics, have been extended to enable tracking of photons using the same continuous energy approximation. We describe the four photoatomic processes implemented, which are coherent scattering, incoherent scattering, pair-production, and photoelectric absorption. The accompanying background, implementation, and verification of these processes will be presented.
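Selecting which of the four photoatomic processes occurs at a collision reduces to sampling from the partial cross sections at the current photon energy. The sketch below is a generic illustration of that step with made-up cross-section values; it does not reproduce MCATK's data structures or continuous-energy lookup.

```python
import numpy as np

def sample_photon_interaction(xs, rng):
    """Pick which photoatomic process occurs at a collision, given the
    partial cross sections at the current photon energy."""
    names = ("coherent", "incoherent", "pair_production", "photoelectric")
    sigma = np.array([xs[n] for n in names])
    return rng.choice(names, p=sigma / sigma.sum())

# Illustrative partial cross sections (arbitrary units) at some energy:
rng = np.random.default_rng(3)
xs = {"coherent": 0.05, "incoherent": 0.60,
      "pair_production": 0.0, "photoelectric": 0.35}
counts = {n: 0 for n in xs}
for _ in range(10_000):
    counts[sample_photon_interaction(xs, rng)] += 1
print(counts)
```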
GCR-Induced Photon Luminescence of the Moon
NASA Technical Reports Server (NTRS)
Lee, K. T.; Wilson, T. L.
2008-01-01
It is shown that the Moon has a ubiquitous photon luminescence induced by Galactic cosmic-rays (GCRs), using the Monte Carlo particle-physics program FLUKA. Both the fluence and the flux of the radiation can be determined by this method, but only the fluence will be presented here. This is in addition to thermal radiation emitted due to the Moon's internal temperature and radioactivity. This study is a follow-up to an earlier discussion [1] that addressed several misconceptions regarding Moonshine in the Earth-Moon system (Figure 1) and predicted this effect. There also exists a related x-ray fluorescence induced by solar energetic particles (SEPs, <350 MeV) and solar photons at lower x-ray energies, although this latter fluorescence was studied on Apollo 15 and 16 [2-5], Lunar Prospector [6], and even EGRET [7].
NASA Astrophysics Data System (ADS)
Rabie, M.; Franck, C. M.
2016-06-01
We present a freely available MATLAB code for the simulation of electron transport in arbitrary gas mixtures in the presence of uniform electric fields. For steady-state electron transport, the program provides the transport coefficients, reaction rates and the electron energy distribution function. The program uses established Monte Carlo techniques and is compatible with the electron scattering cross section files from the open-access Plasma Data Exchange Project LXCat. The code is written in object-oriented design, allowing the tracing and visualization of the spatiotemporal evolution of electron swarms and the temporal development of the mean energy and the electron number due to attachment and/or ionization processes. We benchmark our code with well-known model gases as well as the real gases argon, N2, O2, CF4, SF6 and mixtures of N2 and O2.
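A common way to advance electrons between collisions in such swarm codes is the null-collision technique: free-flight times are drawn from a constant trial collision frequency, and a drawn collision is accepted with probability equal to the ratio of the actual to the trial frequency. The Python sketch below (not the MATLAB code of the paper) illustrates only that sampling step, with made-up rates and a speed-preserving isotropic elastic scatter; the real code's energy-dependent cross sections and LXCat input are not represented.

```python
import numpy as np

def free_flight_time(nu_max, rng):
    """Sample the time to the next (real or null) collision from a constant
    trial collision frequency nu_max."""
    return -np.log(rng.random()) / nu_max

def isotropic(rng):
    """Random unit vector, uniform on the sphere."""
    mu = 2.0 * rng.random() - 1.0
    phi = 2.0 * np.pi * rng.random()
    s = np.sqrt(1.0 - mu * mu)
    return np.array([s * np.cos(phi), s * np.sin(phi), mu])

# Toy drift loop in a uniform field with a constant (made-up) real rate.
rng = np.random.default_rng(4)
q_over_m = -1.759e11          # electron charge-to-mass ratio (C/kg)
E_field = 100.0               # V/m along z
nu_max, nu_real = 1.0e9, 0.5e9
v = np.zeros(3)
for _ in range(1000):
    dt = free_flight_time(nu_max, rng)
    v[2] += q_over_m * E_field * dt            # accelerate along the field
    if rng.random() < nu_real / nu_max:        # real collision, else null
        v = np.linalg.norm(v) * isotropic(rng)  # isotropic elastic scatter
print("final speed (m/s):", np.linalg.norm(v))
```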
NASA Astrophysics Data System (ADS)
De Geyter, G.; Baes, M.; Fritz, J.; Camps, P.
2013-02-01
We present FitSKIRT, a method to efficiently fit radiative transfer models to UV/optical images of dusty galaxies. These images have the advantage that they have better spatial resolution compared to FIR/submm data. FitSKIRT uses the GAlib genetic algorithm library to optimize the output of the SKIRT Monte Carlo radiative transfer code. Genetic algorithms prove to be a valuable tool in handling the multi-dimensional search space as well as the noise induced by the random nature of the Monte Carlo radiative transfer code. FitSKIRT is tested on artificial images of a simulated edge-on spiral galaxy, where we gradually increase the number of fitted parameters. We find that we can recover all model parameters, even if all 11 model parameters are left unconstrained. Finally, we apply the FitSKIRT code to a V-band image of the edge-on spiral galaxy NGC 4013. This galaxy has been modeled previously by other authors using different combinations of radiative transfer codes and optimization methods. Given the different models and techniques and the complexity and degeneracies in the parameter space, we find reasonable agreement between the different models. We conclude that the FitSKIRT method allows comparison between different models and geometries in a quantitative manner and minimizes the need of human intervention and biasing. The high level of automation makes it an ideal tool to use on larger sets of observed data.
NASA Astrophysics Data System (ADS)
Wang, Chao; Xiao, Jun; Luo, Xiaobing
2016-10-01
The neutron inelastic scattering cross section of 115In has been measured by the activation technique at neutron energies of 2.95, 3.94, and 5.24 MeV, with the neutron capture cross section of 197Au as an internal standard. The effects of multiple scattering and flux attenuation were corrected using the Monte Carlo code GEANT4. Based on the experimental values, the 115In neutron inelastic scattering cross sections were theoretically calculated between 1 and 15 MeV with the TALYS code; the theoretical results of this study are in reasonable agreement with the available experimental results.
Track-structure simulations for charged particles.
Dingfelder, Michael
2012-11-01
Monte Carlo track-structure simulations provide a detailed and accurate picture of radiation transport of charged particles through condensed matter of biological interest. Liquid water serves as a surrogate for soft tissue and is used in most Monte Carlo track-structure codes. Basic theories of radiation transport and track-structure simulations are discussed and differences compared to condensed history codes highlighted. Interaction cross sections for electrons, protons, alpha particles, and light and heavy ions are required input data for track-structure simulations. Different calculation methods, including the plane-wave Born approximation, the dielectric theory, and semi-empirical approaches are presented using liquid water as a target. Low-energy electron transport and light ion transport are discussed as areas of special interest.
Coupled reactors analysis: New needs and advances using Monte Carlo methodology
Aufiero, M.; Palmiotti, G.; Salvatores, M.; ...
2016-08-20
Coupled reactors and the coupling features of large or heterogeneous core reactors can be investigated with the Avery theory that allows a physics understanding of the main features of these systems. However, the complex geometries that are often encountered in association with coupled reactors require a detailed geometry description that can be easily provided by modern Monte Carlo (MC) codes. This implies an MC calculation of the coupling parameters defined by Avery and of the sensitivity coefficients that allow further detailed physics analysis. The results presented in this paper show that the MC code SERPENT has been successfully modified to meet the required capabilities.
PBMC: Pre-conditioned Backward Monte Carlo code for radiative transport in planetary atmospheres
NASA Astrophysics Data System (ADS)
García Muñoz, A.; Mills, F. P.
2017-08-01
PBMC (Pre-Conditioned Backward Monte Carlo) solves the vector Radiative Transport Equation (vRTE) and can be applied to planetary atmospheres irradiated from above. The code builds the solution by simulating the photon trajectories from the detector towards the radiation source, i.e. in the reverse order of the actual photon displacements. In accounting for the polarization in the sampling of photon propagation directions and pre-conditioning the scattering matrix with information from the scattering matrices of prior (in the BMC integration order) photon collisions, PBMC avoids the unstable and biased solutions of classical BMC algorithms for conservative, optically-thick, strongly-polarizing media such as Rayleigh atmospheres.
Track structure in radiation biology: theory and applications.
Nikjoo, H; Uehara, S; Wilson, W E; Hoshi, M; Goodhead, D T
1998-04-01
A brief review is presented of the basic concepts in track structure and the relative merits of various theoretical approaches adopted in Monte-Carlo track-structure codes are examined. In the second part of the paper, a formal cluster analysis is introduced to calculate cluster-distance distributions. Total experimental ionization cross-sections were least-square fitted and compared with the calculation by various theoretical methods. Monte-Carlo track-structure code Kurbuc was used to examine and compare the spectrum of the secondary electrons generated by using functions given by Born-Bethe, Jain-Khare, Gryzinsky, Kim-Rudd, Mott and Vriens' theories. The cluster analysis in track structure was carried out using the k-means method and Hartigan algorithm. Data are presented on experimental and calculated total ionization cross-sections: inverse mean free path (IMFP) as a function of electron energy used in Monte-Carlo track-structure codes; the spectrum of secondary electrons generated by different functions for 500 eV primary electrons; cluster analysis for 4 MeV and 20 MeV alpha-particles in terms of the frequency of total cluster energy to the root-mean-square (rms) radius of the cluster and differential distance distributions for a pair of clusters; and finally relative frequency distribution for energy deposited in DNA, single-strand break and double-strand breaks for 10 MeV/u protons, alpha-particles and carbon ions. There are a number of Monte-Carlo track-structure codes that have been developed independently and the bench-marking presented in this paper allows a better choice of the theoretical method adopted in a track-structure code to be made. A systematic bench-marking of cross-sections and spectra of the secondary electrons shows differences between the codes at atomic level, but such differences are not significant in biophysical modelling at the macromolecular level. Clustered-damage evaluation shows: that a substantial proportion of dose (~30%) is deposited by low-energy electrons; the majority of DNA damage lesions are of simple type; the complexity of damage increases with increased LET, while the total yield of strand breaks remains constant; and at high LET values nearly 70% of all double-strand breaks are of complex type.
NASA Astrophysics Data System (ADS)
Nelson, Adam
Multi-group scattering moment matrices are critical to the solution of the multi-group form of the neutron transport equation, as they are responsible for describing the change in direction and energy of neutrons. These matrices, however, are difficult to correctly calculate from the measured nuclear data with both deterministic and stochastic methods. Calculating these parameters when using deterministic methods requires a set of assumptions which do not hold true in all conditions. These quantities can be calculated accurately with stochastic methods, however doing so is computationally expensive due to the poor efficiency of tallying scattering moment matrices. This work presents an improved method of obtaining multi-group scattering moment matrices from a Monte Carlo neutron transport code. This improved method of tallying the scattering moment matrices is based on recognizing that all of the outgoing particle information is known a priori and can be taken advantage of to increase the tallying efficiency (therefore reducing the uncertainty) of the stochastically integrated tallies. In this scheme, the complete outgoing probability distribution is tallied, supplying every one of the scattering moment matrices elements with its share of data. In addition to reducing the uncertainty, this method allows for the use of a track-length estimation process potentially offering even further improvement to the tallying efficiency. Unfortunately, to produce the needed distributions, the probability functions themselves must undergo an integration over the outgoing energy and scattering angle dimensions. This integration is too costly to perform during the Monte Carlo simulation itself and therefore must be performed in advance by way of a pre-processing code. The new method increases the information obtained from tally events and therefore has a significantly higher efficiency than the currently used techniques. The improved method has been implemented in a code system containing a new pre-processor code, NDPP, and a Monte Carlo neutron transport code, OpenMC. This method is then tested in a pin cell problem and a larger problem designed to accentuate the importance of scattering moment matrices. These tests show that accuracy was retained while the figure-of-merit for generating scattering moment matrices and fission energy spectra was significantly improved.
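For orientation, a conventional analog tally of multi-group scattering moments accumulates weight times P_l(mu) at each scattering event, per incoming and outgoing group. The sketch below shows that baseline estimator with fabricated events; the thesis's improved method instead integrates the full known outgoing distribution (via the pre-processor) rather than relying on the single sampled outcome, which the sketch does not attempt.

```python
import numpy as np
from numpy.polynomial.legendre import legval

def tally_scatter_moment(tally, g_in, g_out, mu, weight, order):
    """Analog tally: at each scattering event add w * P_l(mu) to
    element (l, g_in, g_out) for l = 0..order."""
    for l in range(order + 1):
        coeffs = np.zeros(l + 1)
        coeffs[l] = 1.0                     # selects the Legendre polynomial P_l
        tally[l, g_in, g_out] += weight * legval(mu, coeffs)

# Illustrative use: 2 groups, P0-P3 moments, fabricated scattering events.
rng = np.random.default_rng(5)
order, n_groups = 3, 2
tally = np.zeros((order + 1, n_groups, n_groups))
for _ in range(10_000):
    g_in = rng.integers(n_groups)
    g_out = rng.integers(n_groups)
    mu = 2.0 * rng.random() - 1.0           # stand-in for the sampled cosine
    tally_scatter_moment(tally, g_in, g_out, mu, 1.0, order)
print(tally[0])                             # P0 moment ~ group-to-group counts
```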
Renner, Franziska
2016-09-01
Monte Carlo simulations are regarded as the most accurate method of solving complex problems in the field of dosimetry and radiation transport. In (external) radiation therapy they are increasingly used for the calculation of dose distributions during treatment planning. In comparison to other algorithms for the calculation of dose distributions, Monte Carlo methods have the capability of improving the accuracy of dose calculations - especially under complex circumstances (e.g. consideration of inhomogeneities). However, there is a lack of knowledge of how accurate the results of Monte Carlo calculations are on an absolute basis. A practical verification of the calculations can be performed by direct comparison with the results of a benchmark experiment. This work presents such a benchmark experiment and compares its results (with detailed consideration of measurement uncertainty) with the results of Monte Carlo calculations using the well-established Monte Carlo code EGSnrc. The experiment was designed to have parallels to external beam radiation therapy with respect to the type and energy of the radiation, the materials used and the kind of dose measurement. Because the properties of the beam have to be well known in order to compare the results of the experiment and the simulation on an absolute basis, the benchmark experiment was performed using the research electron accelerator of the Physikalisch-Technische Bundesanstalt (PTB), whose beam was accurately characterized in advance. The benchmark experiment and the corresponding Monte Carlo simulations were carried out for two different types of ionization chambers and the results were compared. Considering the uncertainty, which is about 0.7 % for the experimental values and about 1.0 % for the Monte Carlo simulation, the results of the simulation and the experiment coincide. Copyright © 2015. Published by Elsevier GmbH.
Mukhopadhyay, Nitai D; Sampson, Andrew J; Deniz, Daniel; Alm Carlsson, Gudrun; Williamson, Jeffrey; Malusek, Alexandr
2012-01-01
Correlated sampling Monte Carlo methods can shorten computing times in brachytherapy treatment planning. Monte Carlo efficiency is typically estimated via efficiency gain, defined as the reduction in computing time by correlated sampling relative to conventional Monte Carlo methods when equal statistical uncertainties have been achieved. The determination of the efficiency gain uncertainty arising from random effects, however, is not a straightforward task specially when the error distribution is non-normal. The purpose of this study is to evaluate the applicability of the F distribution and standardized uncertainty propagation methods (widely used in metrology to estimate uncertainty of physical measurements) for predicting confidence intervals about efficiency gain estimates derived from single Monte Carlo runs using fixed-collision correlated sampling in a simplified brachytherapy geometry. A bootstrap based algorithm was used to simulate the probability distribution of the efficiency gain estimates and the shortest 95% confidence interval was estimated from this distribution. It was found that the corresponding relative uncertainty was as large as 37% for this particular problem. The uncertainty propagation framework predicted confidence intervals reasonably well; however its main disadvantage was that uncertainties of input quantities had to be calculated in a separate run via a Monte Carlo method. The F distribution noticeably underestimated the confidence interval. These discrepancies were influenced by several photons with large statistical weights which made extremely large contributions to the scored absorbed dose difference. The mechanism of acquiring high statistical weights in the fixed-collision correlated sampling method was explained and a mitigation strategy was proposed. Copyright © 2011 Elsevier Ltd. All rights reserved.
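The bootstrap idea used to obtain a confidence interval for the efficiency gain can be sketched as follows. The example resamples fabricated per-history scores and uses a simple variance ratio as a stand-in for the gain; the paper's actual estimator also involves computing times and the scored dose differences of the fixed-collision correlated sampling scheme, which are not modeled here.

```python
import numpy as np

def bootstrap_gain_ci(scores_corr, scores_conv, n_boot=5000, alpha=0.05, rng=None):
    """Bootstrap a confidence interval for a gain-like variance ratio
    between a correlated-sampling run and a conventional run."""
    rng = rng or np.random.default_rng()
    gains = []
    for _ in range(n_boot):
        a = rng.choice(scores_corr, size=scores_corr.size, replace=True)
        b = rng.choice(scores_conv, size=scores_conv.size, replace=True)
        gains.append(np.var(b, ddof=1) / np.var(a, ddof=1))
    lo, hi = np.quantile(gains, [alpha / 2.0, 1.0 - alpha / 2.0])
    return lo, hi

# Illustrative use with fabricated per-history scores:
rng = np.random.default_rng(6)
corr = rng.normal(1.0, 0.05, 2000)     # correlated sampling: small spread
conv = rng.normal(1.0, 0.20, 2000)     # conventional: larger spread
print(bootstrap_gain_ci(corr, conv, rng=rng))
```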
Monte Carlo simulation of MOSFET dosimeter for electron backscatter using the GEANT4 code.
Chow, James C L; Leung, Michael K K
2008-06-01
The aim of this study is to investigate the influence of the body of the metal-oxide-semiconductor field effect transistor (MOSFET) dosimeter in measuring the electron backscatter from lead. The electron backscatter factor (EBF), which is defined as the ratio of dose at the tissue-lead interface to the dose at the same point without the presence of backscatter, was calculated by the Monte Carlo simulation using the GEANT4 code. Electron beams with energies of 4, 6, 9, and 12 MeV were used in the simulation. It was found that in the presence of the MOSFET body, the EBFs were underestimated by about 2%-0.9% for electron beam energies of 4-12 MeV, respectively. The trend of the decrease of EBF with an increase of electron energy can be explained by the small MOSFET dosimeter, mainly made of epoxy and silicon, not only attenuated the electron fluence of the electron beam from upstream, but also the electron backscatter generated by the lead underneath the dosimeter. However, this variation of the EBF underestimation is within the same order of the statistical uncertainties as the Monte Carlo simulations, which ranged from 1.3% to 0.8% for the electron energies of 4-12 MeV, due to the small dosimetric volume. Such small EBF deviation is therefore insignificant when the uncertainty of the Monte Carlo simulation is taken into account. Corresponding measurements were carried out and uncertainties compared to Monte Carlo results were within +/- 2%. Spectra of energy deposited by the backscattered electrons in dosimetric volumes with and without the lead and MOSFET were determined by Monte Carlo simulations. It was found that in both cases, when the MOSFET body is either present or absent in the simulation, deviations of electron energy spectra with and without the lead decrease with an increase of the electron beam energy. Moreover, the softer spectrum of the backscattered electron when lead is present can result in a reduction of the MOSFET response due to stronger recombination in the SiO2 gate. It is concluded that the MOSFET dosimeter performed well for measuring the electron backscatter from lead using electron beams. The uncertainty of EBF determined by comparing the results of Monte Carlo simulations and measurements is well within the accuracy of the MOSFET dosimeter (< +/- 4.2%) provided by the manufacturer.
Constraining physical parameters of ultra-fast outflows in PDS 456 with Monte Carlo simulations
NASA Astrophysics Data System (ADS)
Hagino, K.; Odaka, H.; Done, C.; Gandhi, P.; Takahashi, T.
2014-07-01
Deep absorption lines with extremely high velocity of ˜0.3c observed in PDS 456 spectra strongly indicate the existence of ultra-fast outflows (UFOs). However, the launching and acceleration mechanisms of UFOs are still uncertain. One possible way to solve this is to constrain physical parameters as a function of distance from the source. In order to study the spatial dependence of parameters, it is essential to adopt 3-dimensional Monte Carlo simulations that treat radiation transfer in arbitrary geometry. We have developed a new simulation code of X-ray radiation reprocessed in AGN outflow. Our code implements radiative transfer in 3-dimensional biconical disk wind geometry, based on Monte Carlo simulation framework called MONACO (Watanabe et al. 2006, Odaka et al. 2011). Our simulations reproduce FeXXV and FeXXVI absorption features seen in the spectra. Also, broad Fe emission lines, which reflects the geometry and viewing angle, is successfully reproduced. By comparing the simulated spectra with Suzaku data, we obtained constraints on physical parameters. We discuss launching and acceleration mechanisms of UFOs in PDS 456 based on our analysis.
NASA Astrophysics Data System (ADS)
Makarevich, K. O.; Minenko, V. F.; Verenich, K. A.; Kuten, S. A.
2016-05-01
This work is dedicated to modeling dental radiographic examinations to assess the absorbed doses of patients and effective doses. For simulating X-ray spectra, the TASMIP empirical model is used. Doses are assessed on the basis of the Monte Carlo method by using MCNP code for voxel phantoms of ICRP. The results of the assessment of doses to individual organs and effective doses for different types of dental examinations and features of X-ray tube are presented.
Calculation of self-shielding factor for neutron activation experiments using GEANT4 and MCNP
DOE Office of Scientific and Technical Information (OSTI.GOV)
Romero–Barrientos, Jaime, E-mail: jaromero@ing.uchile.cl; Universidad de Chile, DFI, Facultad de Ciencias Físicas Y Matemáticas, Avenida Blanco Encalada 2008, Santiago; Molina, F.
2016-07-07
The neutron self-shielding factor G as a function of the neutron energy was obtained for 14 pure metallic samples in 1000 isolethargic energy bins from 1×10⁻⁵ eV to 2×10⁷ eV using Monte Carlo simulations in GEANT4 and MCNP6. The comparison of these two Monte Carlo codes shows small differences in the final self-shielding factor, mostly due to the different cross section databases that each program uses.
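For a simple foil the self-shielding factor has a textbook analytic limit, G = (1 - exp(-Σt·d))/(Σt·d) for a beam at normal incidence, which full Monte Carlo results of the kind described above can be checked against in simple geometries. The sketch below evaluates only that approximation with made-up macroscopic cross sections; it is not the GEANT4/MCNP6 calculation of the paper.

```python
import numpy as np

def self_shielding_slab(sigma_tot_per_cm, thickness_cm):
    """Slab approximation G = (1 - exp(-x)) / x with x = Sigma_t * d,
    for a beam hitting a foil at normal incidence."""
    x = np.asarray(sigma_tot_per_cm) * thickness_cm
    return np.where(x > 1e-12, (1.0 - np.exp(-x)) / np.maximum(x, 1e-12), 1.0)

# Illustrative use: macroscopic total cross sections (1/cm) on a coarse grid.
sigma = np.array([0.05, 0.5, 5.0, 50.0])
print(self_shielding_slab(sigma, thickness_cm=0.1))
```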
Toward centrality determination at NICA/MPD
NASA Astrophysics Data System (ADS)
Galoyan, A. S.; Uzhinsky, V. V.
2017-03-01
Geometrical properties of nucleus-nucleus interactions at various centralities are calculated for the NICA energy range. A modified version of the Glauber Monte Carlo simulation code has been used for the calculations. It is shown that the geometrical properties of nucleus-nucleus interactions at the energies 5 - 10 GeV (NICA/MPD) and at energy 200 GeV (RHIC) are quite close to each other. A possible determination of centrality at NICA/MPD experiment using calculations of various Monte Carlo event generators are considered.
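The geometrical quantities behind centrality, the number of participants and of binary collisions, come from a Glauber Monte Carlo of the kind modified for this work. The sketch below is a generic, simplified version: nucleon positions are sampled from a Woods-Saxon distribution and the usual black-disk overlap criterion is applied in the transverse plane. The Au radius and diffuseness and the nucleon-nucleon cross section are rough stand-ins, not the values used by the authors.

```python
import numpy as np

def sample_nucleus(A, R, a, rng):
    """Sample A nucleon positions from a Woods-Saxon density
    (rejection in r, isotropic in angle).  R, a in fm."""
    pos, r_max = [], R + 5.0 * a
    while len(pos) < A:
        r = rng.uniform(0.0, r_max)
        if rng.random() < (r / r_max) ** 2 / (1.0 + np.exp((r - R) / a)):
            mu = rng.uniform(-1.0, 1.0)
            phi = rng.uniform(0.0, 2.0 * np.pi)
            s = np.sqrt(1.0 - mu * mu)
            pos.append(r * np.array([s * np.cos(phi), s * np.sin(phi), mu]))
    return np.array(pos)

def npart_ncoll(A_pos, B_pos, b, sigma_nn_fm2):
    """Participants and binary collisions at impact parameter b (fm),
    black-disk criterion: collide if transverse distance^2 < sigma/pi."""
    d2 = sigma_nn_fm2 / np.pi
    A_t = A_pos[:, :2] + np.array([b / 2.0, 0.0])
    B_t = B_pos[:, :2] - np.array([b / 2.0, 0.0])
    dist2 = ((A_t[:, None, :] - B_t[None, :, :]) ** 2).sum(axis=2)
    hits = dist2 < d2
    n_coll = int(hits.sum())
    n_part = int(hits.any(axis=1).sum() + hits.any(axis=0).sum())
    return n_part, n_coll

# Illustrative Au+Au event; sigma_NN ~ 3 fm^2 (30 mb) is a rough stand-in.
rng = np.random.default_rng(7)
A = sample_nucleus(197, R=6.38, a=0.535, rng=rng)
B = sample_nucleus(197, R=6.38, a=0.535, rng=rng)
print(npart_ncoll(A, B, b=6.0, sigma_nn_fm2=3.0))
```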
Monte Carlo N Particle code - Dose distribution of clinical electron beams in inhomogeneous phantoms
Nedaie, H. A.; Mosleh-Shirazi, M. A.; Allahverdi, M.
2013-01-01
Electron dose distributions calculated using the currently available analytical methods can be associated with large uncertainties. The Monte Carlo method is the most accurate method for dose calculation in electron beams. Most of the clinical electron beam simulation studies have been performed using non- MCNP [Monte Carlo N Particle] codes. Given the differences between Monte Carlo codes, this work aims to evaluate the accuracy of MCNP4C-simulated electron dose distributions in a homogenous phantom and around inhomogeneities. Different types of phantoms ranging in complexity were used; namely, a homogeneous water phantom and phantoms made of polymethyl methacrylate slabs containing different-sized, low- and high-density inserts of heterogeneous materials. Electron beams with 8 and 15 MeV nominal energy generated by an Elekta Synergy linear accelerator were investigated. Measurements were performed for a 10 cm × 10 cm applicator at a source-to-surface distance of 100 cm. Individual parts of the beam-defining system were introduced into the simulation one at a time in order to show their effect on depth doses. In contrast to the first scattering foil, the secondary scattering foil, X and Y jaws and applicator provide up to 5% of the dose. A 2%/2 mm agreement between MCNP and measurements was found in the homogenous phantom, and in the presence of heterogeneities in the range of 1-3%, being generally within 2% of the measurements for both energies in a "complex" phantom. A full-component simulation is necessary in order to obtain a realistic model of the beam. The MCNP4C results agree well with the measured electron dose distributions. PMID:23533162
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mein, S; Gunasingha, R; Nolan, M
Purpose: X-PACT is an experimental cancer therapy where kV x-rays are used to photo-activate anti-cancer therapeutics through phosphor intermediaries (phosphors that absorb x-rays and re-radiate as UV light). Clinical trials in pet dogs are currently underway (NC State College of Veterinary Medicine) and an essential component is the ability to model the kV dose in these dogs. Here we report the commissioning and characterization of a Monte Carlo (MC) treatment planning simulation tool to calculate X-PACT radiation doses in canine trials. Methods: The FLUKA multi-particle MC simulation package was used to simulate a standard X-PACT radiation treatment beam of 80 kVp with the Varian OBI x-ray source geometry. The beam quality was verified by comparing measured and simulated attenuation of the beam by various thicknesses of aluminum (2–4.6 mm) under narrow beam conditions (HVL). The beam parameters at commissioning were then corroborated using MC, characterized and verified with empirically collected commissioning data, including percent depth dose curves (PDD), back-scatter factors (BSF), collimator scatter factors, the heel effect, etc. All simulations were conducted for N=30M histories at M=100 iterations. Results: HVL and PDD simulation data agreed with an average percent error of 2.42%±0.33 and 6.03%±1.58, respectively. The mean square error (MSE) values for HVL and PDD (0.07% and 0.50%) were low, as expected; however, longer simulations are required to validate convergence to the expected values. Qualitatively, pre- and post-filtration source spectra matched well with 80 kVp references generated via the SPEKTR software. Further validation of the commissioning data simulation is underway in preparation for first-time 3D dose calculations with canine CBCT data. Conclusion: We have prepared a Monte Carlo simulation capable of accurate dose calculation for use with ongoing X-PACT canine clinical trials. Preliminary results show good agreement with measured data and hold promise for accurate quantification of dose for this novel psoralen X-ray therapy. Funding Support, Disclosures, & Conflict of Interest: The Monte Carlo simulation work was not funded; Drs. Adamson & Oldham have received funding from Immunolight LLC for X-PACT research.
Pölz, Stefan; Laubersheimer, Sven; Eberhardt, Jakob S; Harrendorf, Marco A; Keck, Thomas; Benzler, Andreas; Breustedt, Bastian
2013-08-21
The basic idea of Voxel2MCNP is to provide a framework supporting users in modeling radiation transport scenarios using voxel phantoms and other geometric models, generating corresponding input for the Monte Carlo code MCNPX, and evaluating simulation output. Applications at Karlsruhe Institute of Technology are primarily whole and partial body counter calibration and calculation of dose conversion coefficients. A new generic data model describing data related to radiation transport, including phantom and detector geometries and their properties, sources, tallies and materials, has been developed. It is modular and generally independent of the targeted Monte Carlo code. The data model has been implemented as an XML-based file format to facilitate data exchange, and integrated with Voxel2MCNP to provide a common interface for modeling, visualization, and evaluation of data. Also, extensions to allow compatibility with several file formats, such as ENSDF for nuclear structure properties and radioactive decay data, SimpleGeo for solid geometry modeling, ImageJ for voxel lattices, and MCNPX's MCTAL for simulation results have been added. The framework is presented and discussed in this paper and example workflows for body counter calibration and calculation of dose conversion coefficients is given to illustrate its application.
NASA Astrophysics Data System (ADS)
Popota, F. D.; Aguiar, P.; España, S.; Lois, C.; Udias, J. M.; Ros, D.; Pavia, J.; Gispert, J. D.
2015-01-01
In this work a comparison between experimental and simulated data using GATE and PeneloPET Monte Carlo simulation packages is presented. All simulated setups, as well as the experimental measurements, followed exactly the guidelines of the NEMA NU 4-2008 standards using the microPET R4 scanner. The comparison was focused on spatial resolution, sensitivity, scatter fraction and counting rates performance. Both GATE and PeneloPET showed reasonable agreement for the spatial resolution when compared to experimental measurements, although they lead to slight underestimations for the points close to the edge. High accuracy was obtained between experiments and simulations of the system’s sensitivity and scatter fraction for an energy window of 350-650 keV, as well as for the counting rate simulations. The latter was the most complicated test to perform since each code demands different specifications for the characterization of the system’s dead time. Although simulated and experimental results were in excellent agreement for both simulation codes, PeneloPET demanded more information about the behavior of the real data acquisition system. To our knowledge, this constitutes the first validation of these Monte Carlo codes for the full NEMA NU 4-2008 standards for small animal PET imaging systems.
Hybrid Monte Carlo/deterministic methods for radiation shielding problems
NASA Astrophysics Data System (ADS)
Becker, Troy L.
For the past few decades, the most common type of deep-penetration (shielding) problem simulated using Monte Carlo methods has been the source-detector problem, in which a response is calculated at a single location in space. Traditionally, the nonanalog Monte Carlo methods used to solve these problems have required significant user input to generate and sufficiently optimize the biasing parameters necessary to obtain a statistically reliable solution. It has been demonstrated that this laborious task can be replaced by automated processes that rely on a deterministic adjoint solution to set the biasing parameters (the so-called hybrid methods). The increase in computational power over recent years has also led to interest in obtaining the solution in a region of space much larger than a point detector. In this thesis, we propose two methods for solving problems ranging from source-detector problems to more global calculations: weight windows and the Transform approach. These techniques employ some of the same biasing elements that have been used previously; however, the fundamental difference is that here the biasing techniques are used as elements of a comprehensive tool set to distribute Monte Carlo particles in a user-specified way. The weight window achieves the user-specified Monte Carlo particle distribution by imposing a particular weight window on the system, without altering the particle physics. The Transform approach introduces a transform into the neutron transport equation, which results in a complete modification of the particle physics to produce the user-specified Monte Carlo distribution. These methods are tested in a three-dimensional multigroup Monte Carlo code. For a basic shielding problem and a more realistic one, these methods adequately solved source-detector problems and more global calculations. Furthermore, they confirmed that theoretical Monte Carlo particle distributions correspond to the simulated ones, implying that these methods can be used to achieve user-specified Monte Carlo distributions. Overall, the Transform approach performed more efficiently than the weight window methods, but it performed much more efficiently for source-detector problems than for global problems.
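As an aside on the weight-window element used above, the standard check splits particles whose statistical weight exceeds the window and plays Russian roulette on those below it; the following minimal Python sketch illustrates the idea (the bounds, survival weight and 2x window ratio are illustrative assumptions, not values from the thesis).
    import random

    def apply_weight_window(weight, w_lower, w_survive, rng=random.random):
        """Minimal sketch of a weight-window check (illustrative only).
        Returns the list of particle weights that continue transport."""
        w_upper = 2.0 * w_survive            # assumed upper bound; real codes choose their own ratio
        if weight > w_upper:                 # split heavy particles
            n = int(weight / w_survive) + 1
            return [weight / n] * n
        if weight < w_lower:                 # Russian roulette for light particles
            if rng() < weight / w_survive:
                return [w_survive]           # survivor carries the target weight
            return []                        # particle killed
        return [weight]                      # inside the window: unchanged

    # Example: a particle of weight 5.0 entering a window centred on w_survive = 1.0
    print(apply_weight_window(5.0, w_lower=0.5, w_survive=1.0))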
Todo, A S; Hiromoto, G; Turner, J E; Hamm, R N; Wright, H A
1982-12-01
Previous calculations of the initial energies of electrons produced in water irradiated by photons are extended to 1 GeV by including pair and triplet production. Calculations were performed with the Monte Carlo computer code PHOEL-3, which replaces the earlier code, PHOEL-2. Tables of initial electron energies are presented for single interactions of monoenergetic photons at a number of energies from 10 keV to 1 GeV. These tables can be used to compute kerma in water irradiated by photons with arbitrary energy spectra up to 1 GeV. In addition, separate tables of Compton- and pair-electron spectra are given over this energy range. The code PHOEL-3 is available from the Radiation Shielding Information Center, Oak Ridge National Laboratory, Oak Ridge, TN 37830.
A 3DHZETRN Code in a Spherical Uniform Sphere with Monte Carlo Verification
NASA Technical Reports Server (NTRS)
Wilson, John W.; Slaba, Tony C.; Badavi, Francis F.; Reddell, Brandon D.; Bahadori, Amir A.
2014-01-01
The computationally efficient HZETRN code has been used in recent trade studies for lunar and Martian exploration and is currently being used in the engineering development of the next generation of space vehicles, habitats, and extravehicular activity equipment. A new version (3DHZETRN) capable of transporting high charge (Z) and energy (HZE) ions and light ions (including neutrons) under space-like boundary conditions with enhanced neutron and light ion propagation is under development. In the present report, new algorithms for light ion and neutron propagation with well-defined convergence criteria in 3D objects are developed and tested against Monte Carlo simulations to verify the solution methodology. The code will be available through the software system OLTARIS for shield design and validation, and it provides a basis for personal computer software capable of space shield analysis and optimization.
Kinetic Monte Carlo simulation of dopant-defect systems under submicrosecond laser thermal processes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fisicaro, G.; Pelaz, Lourdes; Lopez, P.
2012-11-06
An innovative Kinetic Monte Carlo (KMC) code has been developed, which governs the post-implant kinetics of the defect system under the extremely far-from-equilibrium conditions caused by laser irradiation close to the liquid-solid interface. It considers defect diffusion, annihilation and clustering. The code properly implements, consistently with the stochastic formalism, the fast-varying local event rates related to the evolution of the thermal field T(r,t). This feature of our numerical method represents an important advancement with respect to current state-of-the-art KMC codes. The reduction of the implantation damage and its reorganization into defect aggregates are studied as a function of the process conditions. Phosphorus activation efficiency, experimentally determined in similar conditions, has been related to the emerging damage scenario.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hansen, J; Culberson, W; DeWerd, L
Purpose: To test the validity of a windowless extrapolation chamber used to measure surface dose rate from planar ophthalmic applicators and to compare different Monte Carlo based codes for deriving correction factors. Methods: Dose rate measurements were performed using a windowless, planar extrapolation chamber with a 90Sr/90Y Tracerlab RA-1 ophthalmic applicator previously calibrated at the National Institute of Standards and Technology (NIST). Capacitance measurements were performed to estimate the initial air gap width between the source face and collecting electrode. Current was measured as a function of air gap, and Bragg-Gray cavity theory was used to calculate the absorbed dose rate to water. To determine correction factors for backscatter, divergence, and attenuation from the Mylar entrance window found in the NIST extrapolation chamber, both the EGSnrc Monte Carlo user code and the Monte Carlo N-Particle Transport Code (MCNP) were utilized. Simulation results were compared with experimental current readings from the windowless extrapolation chamber as a function of air gap. Additionally, measured dose rate values were compared with the expected result from the NIST source calibration to test the validity of the windowless chamber design. Results: Better agreement was seen between EGSnrc simulated dose results and experimental current readings at very small air gaps (<100 µm) for the windowless extrapolation chamber, while MCNP results demonstrated divergence at these small gap widths. Three separate dose rate measurements were performed with the RA-1 applicator. The average observed difference from the expected result based on the NIST calibration was −1.88% with a statistical standard deviation of 0.39% (k=1). Conclusion: EGSnrc user code will be used during future work to derive correction factors for extrapolation chamber measurements. Additionally, experiment results suggest that an entrance window is not needed in order for an extrapolation chamber to provide accurate dose rate measurements for a planar ophthalmic applicator.
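For orientation, the Bragg-Gray conversion invoked above is usually written for an extrapolation chamber as the zero-gap limit of the current-versus-gap slope (a hedged sketch with standard symbols, not the authors' exact formulation):
    \dot{D}_{\mathrm{w}} = \left(\frac{\bar{W}}{e}\right)\, \bar{s}_{\mathrm{w,air}}\,
    \frac{1}{\rho_{\mathrm{air}} A}\, \left.\frac{\mathrm{d}I}{\mathrm{d}\ell}\right|_{\ell \to 0}\, \prod_i k_i
where W̄/e is the mean energy expended per ion pair in air, s̄_w,air the water-to-air stopping-power ratio, ρ_air the air density, A the collecting-electrode area, dI/dℓ the slope of ionization current versus air-gap width extrapolated to zero gap, and k_i the correction factors (backscatter, divergence, entrance window) that the simulations above are used to derive.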
SU-F-T-12: Monte Carlo Dosimetry of the 60Co Bebig High Dose Rate Source for Brachytherapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Campos, L T; Almeida, C E V de
Purpose: The purpose of this work is to obtain the dosimetry parameters in accordance with the AAPM TG-43U1 formalism with Monte Carlo calculations for the BEBIG 60Co high-dose-rate brachytherapy source. The geometric design and material details of the source were provided by the manufacturer and were used to define the Monte Carlo geometry. Methods: The dosimetry studies included the calculation of the air kerma strength Sk, collision kerma in water along the transverse axis with an unbounded phantom, dose rate constant and radial dose function. The Monte Carlo code system that was used was EGSnrc with a new cavity code, which is a part of EGS++ that allows calculating the radial dose function around the source. The XCOM photon cross-section library was used. Variance reduction techniques were used to speed up the calculation and to considerably reduce the computer time. To obtain the dose rate distributions of the source in an unbounded liquid water phantom, the source was immersed at the center of a cube phantom of 100 cm3. Results: The obtained dose rate constant for the BEBIG 60Co source was 1.108±0.001 cGy h-1 U-1, which is consistent with the values in the literature. The radial dose functions were compared with the values of the consensus data set in the literature, and they are consistent with the published data for this energy range. Conclusion: The dose rate constant is consistent with the results of Granero et al. and Selvam and Bhola within 1%. Dose rate data are compared to the GEANT4 and DOSRZnrc Monte Carlo codes. However, the radial dose function differs by up to 10% for points very near the source on the transverse axis because of the high-energy photons from 60Co, which cause an electronic disequilibrium at the interface between the source capsule and the liquid water for distances up to 1 cm.
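For reference, the quantities listed above enter the AAPM TG-43U1 line-source dose-rate equation:
    \dot{D}(r,\theta) = S_K\, \Lambda\, \frac{G_L(r,\theta)}{G_L(r_0,\theta_0)}\, g_L(r)\, F(r,\theta),
    \qquad r_0 = 1\ \mathrm{cm},\ \theta_0 = \pi/2
with S_K the air-kerma strength, Λ the dose-rate constant, G_L the line-source geometry function, g_L(r) the radial dose function and F(r,θ) the 2D anisotropy function.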
Wiklund, Kristin; Olivera, Gustavo H; Brahme, Anders; Lind, Bengt K
2008-07-01
To speed up dose calculation, an analytical pencil-beam method has been developed to calculate the mean radial dose distributions due to secondary electrons that are set in motion by light ions in water. For comparison, radial dose profiles calculated using a Monte Carlo technique have also been determined. An accurate comparison of the resulting radial dose profiles of the Bragg peak for 1H+, 4He2+ and 6Li3+ ions has been performed. The double differential cross sections for secondary electron production were calculated using the continuous distorted wave-eikonal initial state method (CDW-EIS). For the secondary electrons that are generated, the radial dose distribution for the analytical case is based on the generalized Gaussian pencil-beam method and the central axis depth-dose distributions are calculated using the Monte Carlo code PENELOPE. In the Monte Carlo case, the PENELOPE code was used to calculate the whole radial dose profile based on CDW data. The present pencil-beam and Monte Carlo calculations agree well at all radii. A radial dose profile that is shallower at small radii and steeper at large radii than the conventional 1/r² is clearly seen with both the Monte Carlo and pencil-beam methods. As expected, since the projectile velocities are the same, the dose profiles of Bragg-peak ions of 0.5 MeV 1H+, 2 MeV 4He2+ and 3 MeV 6Li3+ are almost the same, with about 30% more delta electrons in the sub-keV range from 4He2+ and 6Li3+ compared to 1H+. A similar behavior is also seen for 1 MeV 1H+, 4 MeV 4He2+ and 6 MeV 6Li3+, all classically expected to have the same secondary electron cross sections. The results are promising and indicate a fast and accurate way of calculating the mean radial dose profile.
Radiation doses in volume-of-interest breast computed tomography—A Monte Carlo simulation study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lai, Chao-Jen, E-mail: cjlai3711@gmail.com; Zhong, Yuncheng; Yi, Ying
2015-06-15
Purpose: Cone beam breast computed tomography (breast CT) with true three-dimensional, nearly isotropic spatial resolution has been developed and investigated over the past decade to overcome the problem of lesions overlapping with breast anatomical structures on two-dimensional mammographic images. However, the ability of breast CT to detect small objects, such as tissue structure edges and small calcifications, is limited. To resolve this problem, the authors proposed and developed a volume-of-interest (VOI) breast CT technique to image a small VOI using a higher radiation dose to improve that region's visibility. In this study, the authors performed Monte Carlo simulations to estimate average breast dose and average glandular dose (AGD) for the VOI breast CT technique. Methods: Electron-Gamma-Shower system code-based Monte Carlo codes were used to simulate breast CT. The Monte Carlo codes were validated using physical measurements of air kerma ratios and point doses in phantoms with an ion chamber and optically stimulated luminescence dosimeters. The validated full cone x-ray source was then collimated to simulate half cone beam x-rays to image digital pendant-geometry, hemi-ellipsoidal, homogeneous breast phantoms and to estimate breast doses with full field scans. Hemi-ellipsoidal homogeneous phantoms 13 cm in diameter and 10 cm long were used to simulate median breasts. Breast compositions of 25% and 50% volumetric glandular fractions (VGFs) were used to investigate the influence on breast dose. The simulated half cone beam x-rays were then collimated to a narrow x-ray beam with an area of 2.5 × 2.5 cm² field of view at the isocenter plane to perform VOI field scans. The Monte Carlo results for the full field scans and the VOI field scans were then used to estimate the AGD for the VOI breast CT technique. Results: The ratios of the air kerma ratios and dose measurement results from the Monte Carlo simulation to those from the physical measurements were 0.97 ± 0.03 and 1.10 ± 0.13, respectively, indicating that the accuracy of the Monte Carlo simulation was adequate. The normalized AGD with VOI field scans was substantially reduced, by a factor of about 2 over the VOI region and by a factor of 18 over the entire breast, for both 25% and 50% VGF simulated breasts compared with the normalized AGD with full field scans. The normalized AGD for the VOI breast CT technique can be kept the same as or lower than that for a full field scan with the exposure level for the VOI field scan increased by a factor of as much as 12. Conclusions: The authors' Monte Carlo estimates of normalized AGDs for the VOI breast CT technique show that this technique can be used to markedly increase the dose to the VOI, and thus the visibility of that region, without increasing the dose to the breast as a whole. The results of this investigation should be helpful for those interested in using the VOI breast CT technique to image small calcifications while limiting dose.
Monte Carlo Methodology Serves Up a Software Success
NASA Technical Reports Server (NTRS)
2003-01-01
Widely used for the modeling of gas flows through the computation of the motion and collisions of representative molecules, the Direct Simulation Monte Carlo method has become the gold standard for producing research and engineering predictions in the field of rarefied gas dynamics. Direct Simulation Monte Carlo was first introduced in the early 1960s by Dr. Graeme Bird, a professor at the University of Sydney, Australia. It has since proved to be a valuable tool to the aerospace and defense industries in providing design and operational support data, as well as flight data analysis. In 2002, NASA brought to the forefront a software product that maintains the same basic physics formulation of Dr. Bird's method, but provides effective modeling of complex, three-dimensional, real vehicle simulations and parallel processing capabilities to handle additional computational requirements, especially in areas where computational fluid dynamics (CFD) is not applicable. NASA's Direct Simulation Monte Carlo Analysis Code (DAC) software package is now considered the Agency's premier high-fidelity simulation tool for predicting vehicle aerodynamics and aerothermodynamic environments in rarefied, or low-density, gas flows.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bolding, Simon R.; Cleveland, Mathew Allen; Morel, Jim E.
In this paper, we have implemented a new high-order low-order (HOLO) algorithm for solving thermal radiative transfer problems. The low-order (LO) system is based on the spatial and angular moments of the transport equation and a linear-discontinuous finite-element spatial representation, producing equations similar to the standard S2 equations. The LO solver is fully implicit in time and efficiently resolves the nonlinear temperature dependence at each time step. The high-order (HO) solver utilizes exponentially convergent Monte Carlo (ECMC) to give a globally accurate solution for the angular intensity to a fixed-source pure-absorber transport problem. This global solution is used to compute consistency terms, which require the HO and LO solutions to converge toward the same solution. The use of ECMC allows for the efficient reduction of statistical noise in the Monte Carlo solution, reducing inaccuracies introduced through the LO consistency terms. Finally, we compare results with an implicit Monte Carlo code for one-dimensional gray test problems and demonstrate the efficiency of ECMC over standard Monte Carlo in this HOLO algorithm.
A Modified Monte Carlo Method for Carrier Transport in Germanium, Free of Isotropic Rates
NASA Astrophysics Data System (ADS)
Sundqvist, Kyle
2010-03-01
We present a new method for carrier transport simulation, relevant for high-purity germanium ⟨100⟩ at a temperature of 40 mK. In this system, the scattering of electrons and holes is dominated by spontaneous phonon emission. Free carriers are always out of equilibrium with the lattice. We must also properly account for directional effects due to band structure, but there are many cautions in the literature about treating germanium in particular. These objections arise because the germanium electron system is anisotropic to an extreme degree, while standard Monte Carlo algorithms maintain a reliance on isotropic, integrated rates. We re-examine Fermi's Golden Rule to produce a Monte Carlo method free of isotropic rates. Traditional Monte Carlo codes implement particle scattering based on an isotropically averaged rate, followed by a separate selection of the particle's final state via a momentum-dependent probability. In our method, the kernel of Fermi's Golden Rule produces analytical, bivariate rates which allow for the simultaneous choice of scatter and final state selection. Energy and momentum are automatically conserved. We compare our results to experimental data.
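For context, the transition rate in question is the standard Fermi's Golden Rule expression; in a scheme that avoids isotropic averaging, the kernel is kept differential in the final carrier state rather than integrated over it (an illustrative form, not the authors' notation):
    \Gamma_{i \to f} = \frac{2\pi}{\hbar}\, \bigl|\langle f\,|\,H'\,|\,i \rangle\bigr|^{2}\,
    \delta\!\left(E_f - E_i + \hbar\omega_{\mathbf{q}}\right)
where H' is the carrier-phonon coupling and ħω_q the energy of the spontaneously emitted phonon; sampling this kernel directly yields the scattering event and the final state in one step, which is the bivariate-rate idea described above.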
Time-resolved optically stimulated luminescence of Al2O3:C for ion beam therapy dosimetry
NASA Astrophysics Data System (ADS)
Yukihara, Eduardo G.; Doull, Brandon A.; Ahmed, Md; Brons, Stephan; Tessonnier, Thomas; Jäkel, Oliver; Greilich, Steffen
2015-09-01
The objective of this study was to characterize the time-resolved (TR) optically stimulated luminescence (OSL) from Al2O3:C detectors and investigate methodologies to improve the accuracy of these detectors in ion beam therapy dosimetry, addressing the reduction in relative response to high linear energy transfer (LET) particles common to solid-state detectors. Al2O3:C OSL detectors (OSLDs) were exposed to proton, 4He, 12C and 16O beams in 22 particle/energy combinations and read using a custom-built TR-OSL reader. The OSL response r_OSL, relative to 60Co gamma dose to water, and the ratio between the UV and blue OSL emission bands of Al2O3:C (UV/blue ratio) were determined as a function of the LET. Monte-Carlo simulations with the multi-purpose interaction and transport code FLUKA were used to estimate the absorbed doses and particle energy spectra in the different irradiation conditions. The OSL response r_OSL varied from 0.980 (0.73 keV μm-1) to 0.288 (120.8 keV μm-1). The OSL UV/blue ratio varied by a factor of two in the investigated LET range, but the variation for 12C beams was only 11%. OSLDs were also irradiated at different depths of carbon ion spread-out Bragg peaks (SOBPs), where it was shown that doses could be obtained with an accuracy of ±2.0% at the entrance channel and within the SOBP using correction factors calculated based on the OSL responses obtained in this study. The UV/blue ratio did not allow accurate estimation of the dose-averaged LET for 12C SOBPs, although the values obtained can be explained with the data obtained in this study and the additional information provided by the Monte-Carlo simulations. The results demonstrate that accurate OSLD dosimetry can be performed in ion beam therapy using appropriate corrections for the OSL response.
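The relative OSL response quoted here is conventionally defined as the OSL signal per unit dose for the ion beam normalized to that for the reference 60Co gamma field (a standard definition, restated for clarity):
    r_{\mathrm{OSL}} = \frac{\left(S_{\mathrm{OSL}}/D_{\mathrm{w}}\right)_{\mathrm{ion}}}
                            {\left(S_{\mathrm{OSL}}/D_{\mathrm{w}}\right)_{^{60}\mathrm{Co}}}
where S_OSL is the measured OSL signal and D_w the absorbed dose to water; values below unity reflect the reduced luminescence efficiency at high LET that the correction factors above compensate for.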
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holm, Elizabeth A.
2002-03-28
This code is a FORTRAN code for three-dimensional Monte Carlo Potts Model (MCPM) recrystallization and grain growth. A continuum grain structure is mapped onto a three-dimensional lattice. The mapping procedure is analogous to color bitmapping the grain structure; grains are clusters of pixels (sites) of the same color (spin). The total system energy is given by the Potts Hamiltonian and the kinetics of grain growth are determined through a Monte Carlo technique with a nonconserved order parameter (Glauber dynamics). The code can be compiled and run on UNIX/Linux platforms.
[Series: Medical Applications of the PHITS Code (2): Acceleration by Parallel Computing].
Furuta, Takuya; Sato, Tatsuhiko
2015-01-01
Time-consuming Monte Carlo dose calculation has become feasible owing to the development of computer technology. However, much of this recent progress is due to the emergence of multi-core high-performance computers, so parallel computing is key to achieving good performance of software programs. The Monte Carlo simulation code PHITS contains two parallel computing functions: distributed-memory parallelization using the message passing interface (MPI) protocol and shared-memory parallelization using open multi-processing (OpenMP) directives. Users can choose between the two functions according to their needs. This paper explains the two functions together with their advantages and disadvantages. Some test applications are also provided to show their performance on a typical multi-core high-performance workstation.
Calculation of out-of-field dose distribution in carbon-ion radiotherapy by Monte Carlo simulation.
Yonai, Shunsuke; Matsufuji, Naruhiro; Namba, Masao
2012-08-01
Recent radiotherapy technologies, including carbon-ion radiotherapy, can improve the dose concentration in the target volume, thereby reducing not only side effects in organs at risk but also the secondary cancer risk within or near the irradiation field. However, the secondary cancer risk in the low-dose region is considered to be non-negligible, especially for younger patients. To achieve a dose estimation of the whole body of each patient receiving carbon-ion radiotherapy, which is essential for risk assessment and epidemiological studies, Monte Carlo simulation plays an important role because the treatment planning system can provide the dose distribution only in/near the irradiation field and the measured data are limited. However, validation of Monte Carlo simulations is necessary. The primary purpose of this study was to establish a calculation method using the Monte Carlo code to estimate the dose and quality factor in the body and to validate the proposed method by comparison with experimental data. Furthermore, we show the distributions of dose equivalent in a phantom and identify the partial contribution of each radiation type. We proposed a calculation method based on a Monte Carlo simulation using the PHITS code to estimate absorbed dose, dose equivalent, and dose-averaged quality factor by using the Q(L)-L relationship based on the ICRP 60 recommendation. The values obtained by this method in modeling the passive beam line at the Heavy-Ion Medical Accelerator in Chiba were compared with our previously measured data. It was shown that our calculation model can estimate the measured value within a factor of 2, which included not only the uncertainty of this calculation method but also those regarding the assumptions of the geometrical modeling and the PHITS code. Also, we showed the differences in the doses and the partial contributions of each radiation type between passive and active carbon-ion beams using this calculation method. These results indicated that it is essentially important to include the dose from secondary neutrons in the assessment of the secondary cancer risk of patients receiving carbon-ion radiotherapy with active as well as passive beams. We established a calculation method with a Monte Carlo simulation to estimate the distribution of dose equivalent in the body as a first step toward routine risk assessment and an epidemiological study of carbon-ion radiotherapy at NIRS. This method has the advantage of being verifiable by measurement.
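The Q(L)-L relationship referred to above is the piecewise function recommended in ICRP Publication 60 (L is the unrestricted LET in water, in keV/μm):
    Q(L) =
    \begin{cases}
      1, & L < 10 \\
      0.32\,L - 2.2, & 10 \le L \le 100 \\
      300/\sqrt{L}, & L > 100
    \end{cases}
and the dose-averaged quality factor follows by weighting Q(L) with the absorbed-dose distribution in L.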
NASA Astrophysics Data System (ADS)
Yan, Qiang; Shao, Lin
2017-03-01
Current popular Monte Carlo simulation codes for simulating electron bombardment in solids focus primarily on electron trajectories, instead of electron-induced displacements. Here we report a Monte Carlo simulation code, DEEPER (damage creation and particle transport in matter), developed for calculating 3-D distributions of displacements produced by electrons of incident energies up to 900 MeV. Electron elastic scattering is calculated by using full-Mott cross sections for high accuracy, and primary knock-on atom (PKA)-induced damage cascades are modeled using the ZBL potential. We compare and show large differences in the 3-D distributions of displacements and electrons in electron-irradiated Fe. The distributions of total displacements are similar to those of PKAs at low electron energies, but they are substantially different for higher-energy electrons due to the shifting of the PKA energy spectra towards higher energies. The study is important for evaluating electron-induced radiation damage, for applications using high-flux electron beams to intentionally introduce defects, and for the use of electron analysis beams for microstructural characterization of nuclear materials.
NASA Astrophysics Data System (ADS)
Lee, Yi-Kang
2017-09-01
Nuclear decommissioning takes place in several stages due to the radioactivity in the reactor structure materials. A good estimation of the neutron activation products distributed in the reactor structure materials has an obvious impact on decommissioning planning and on low-level radioactive waste management. The continuous-energy Monte Carlo radiation transport code TRIPOLI-4 has been applied to radiation protection and shielding analyses. To enhance the TRIPOLI-4 application in nuclear decommissioning activities, both experimental and computational benchmarks are being performed. To calculate the neutron activation of the shielding and structure materials of nuclear facilities, the 3D neutron flux map and energy spectra must first be determined. To perform this type of deep-penetration neutron calculation with a Monte Carlo transport code, variance reduction techniques are necessary in order to reduce the uncertainty of the neutron activation estimation. In this study, variance reduction options of the TRIPOLI-4 code were used on the NAIADE 1 light water shielding benchmark. This benchmark document is available from the OECD/NEA SINBAD shielding benchmark database. From this benchmark database, a simplified NAIADE 1 water shielding model was first proposed in this work in order to make the code validation easier. Determination of the fission neutron transport was performed in light water for penetration up to 50 cm for fast neutrons and up to about 180 cm for thermal neutrons. Measurement and calculation results were benchmarked. Variance reduction options and their performance were discussed and compared.
Verification of unfold error estimates in the unfold operator code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fehl, D.L.; Biggs, F.
Spectral unfolding is an inverse mathematical operation that attempts to obtain spectral source information from a set of response functions and data measurements. Several unfold algorithms have appeared over the past 30 years; among them is the unfold operator (UFO) code written at Sandia National Laboratories. In addition to an unfolded spectrum, the UFO code also estimates the unfold uncertainty (error) induced by estimated random uncertainties in the data. In UFO the unfold uncertainty is obtained from the error matrix. This built-in estimate has now been compared to error estimates obtained by running the code in a Monte Carlo fashion with prescribed data distributions (Gaussian deviates). In the test problem studied, data were simulated from an arbitrarily chosen blackbody spectrum (10 keV) and a set of overlapping response functions. The data were assumed to have an imprecision of 5% (standard deviation). One hundred random data sets were generated. The built-in estimate of unfold uncertainty agreed with the Monte Carlo estimate to within the statistical resolution of this relatively small sample size (95% confidence level). A possible 10% bias between the two methods was unresolved. The Monte Carlo technique is also useful in underdetermined problems, for which the error matrix method does not apply. UFO has been applied to the diagnosis of low-energy x rays emitted by Z-pinch and ion-beam driven hohlraums. © 1997 American Institute of Physics.
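A minimal Python sketch of the Monte Carlo error check described above: the measured data are repeatedly perturbed with 5% Gaussian deviates, each perturbed set is unfolded, and the bin-by-bin spread of the unfolded spectra is compared with the built-in error-matrix estimate. The unfold routine passed in below is a stand-in, not the UFO operator itself.
    import numpy as np

    def monte_carlo_unfold_error(data, unfold, rel_sigma=0.05, n_trials=100, seed=0):
        """Propagate Gaussian data noise through an unfold to estimate its uncertainty."""
        rng = np.random.default_rng(seed)
        trials = [unfold(data * (1.0 + rel_sigma * rng.standard_normal(data.shape)))
                  for _ in range(n_trials)]
        trials = np.asarray(trials)
        return trials.mean(axis=0), trials.std(axis=0, ddof=1)   # spectrum and its spread

    # Toy usage with a linear "unfold" (pseudo-inverse of an assumed response matrix R)
    R = np.array([[1.0, 0.5], [0.2, 1.0]])
    unfold = lambda d: np.linalg.pinv(R) @ d
    mean_spec, sigma_spec = monte_carlo_unfold_error(np.array([2.0, 1.5]), unfold)
    print(mean_spec, sigma_spec)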
Cosmogenic Secondary Radiation from a Nearby Supernova
NASA Astrophysics Data System (ADS)
Overholt, Andrew
2017-01-01
Increasing evidence has been found for multiple supernovae within 100 pc of the solar system. Supernovae produce large amounts of cosmic rays which, upon striking Earth's atmosphere, produce a cascade of secondary particles. Among these cosmic ray secondaries are neutrons and muons, which penetrate far within the atmosphere to sea level and even below sea level. Muons and neutrons are both forms of ionizing radiation, which has been linked to increases in cancer, congenital malformations, and other maladies. This work focuses on the impact of muons, as they penetrate into ocean water and affect the lowest levels of the aquatic food chain. We have used Monte Carlo simulations (CORSIKA, MCNPX, and FLUKA) to determine the ionizing radiation dose due to cosmic ray secondaries. This information shows that, although most astrophysical events do not supply the radiation flux necessary to prove dangerous, there may be other impacts such as an increase in mutation rate.
Poster — Thur Eve — 14: Improving Tissue Segmentation for Monte Carlo Dose Calculation using DECT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Di Salvio, A.; Bedwani, S.; Carrier, J-F.
2014-08-15
Purpose: To improve Monte Carlo dose calculation accuracy through a new tissue segmentation technique with dual energy CT (DECT). Methods: Electron density (ED) and effective atomic number (EAN) can be extracted directly from DECT data with a stoichiometric calibration method. Images are acquired with Monte Carlo CT projections using the user code egs-cbct and reconstructed using an FDK backprojection algorithm. Calibration is performed using projections of a numerical RMI phantom. A weighted parameter algorithm then uses both EAN and ED to assign materials to voxels from DECT simulated images. This new method is compared to a standard tissue characterization from single energy CT (SECT) data using a segmented calibrated Hounsfield unit (HU) to ED curve. Both methods are compared to the reference numerical head phantom. Monte Carlo simulations on uniform phantoms of different tissues using dosxyz-nrc show discrepancies in depth-dose distributions. Results: Both SECT and DECT segmentation methods show similar performance assigning soft tissues. Performance is however improved with DECT in regions with higher density, such as bones, where it assigns materials correctly 8% more often than segmentation with SECT, considering the same set of tissues and simulated clinical CT images, i.e. including noise and reconstruction artifacts. Furthermore, Monte Carlo results indicate that kV photon beam depth-dose distributions can double between two tissues of density higher than muscle. Conclusions: A direct acquisition of ED and the added information of EAN with DECT data improves tissue segmentation and increases the accuracy of Monte Carlo dose calculation in kV photon beams.
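A minimal sketch of the kind of weighted (ED, EAN) nearest-tissue assignment described above; the reference tissue values, weights and distance metric below are illustrative assumptions, not the authors' calibration.
    import numpy as np

    # Illustrative reference tissues: (electron density relative to water, effective atomic number)
    TISSUES = {
        "adipose": (0.95, 6.3),
        "muscle":  (1.04, 7.6),
        "bone":    (1.45, 12.3),
    }

    def assign_material(ed, ean, w_ed=1.0, w_ean=0.25):
        """Assign the nearest reference tissue to a voxel using a weighted
        squared distance in (ED, EAN) space (weights are assumptions)."""
        best, best_d = None, np.inf
        for name, (ed_ref, ean_ref) in TISSUES.items():
            d = w_ed * (ed - ed_ref) ** 2 + w_ean * (ean - ean_ref) ** 2
            if d < best_d:
                best, best_d = name, d
        return best

    print(assign_material(1.40, 11.8))   # -> 'bone'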
MATSIM -The Development and Validation of a Numerical Voxel Model based on the MATROSHKA Phantom
NASA Astrophysics Data System (ADS)
Beck, Peter; Rollet, Sofia; Berger, Thomas; Bergmann, Robert; Hajek, Michael; Latocha, Marcin; Vana, Norbert; Zechner, Andrea; Reitz, Guenther
The AIT Austrian Institute of Technology coordinates the project MATSIM (MATROSHKA Simulation) in collaboration with the Vienna University of Technology and the German Aerospace Center. The aim of the project is to develop a voxel-based model of the MATROSHKA anthropomorphic torso used at the International Space Station (ISS) as a foundation for performing Monte Carlo high-energy particle transport simulations for different irradiation conditions. Funded by the Austrian Space Applications Programme (ASAP), MATSIM is a co-investigation with the European Space Agency (ESA) ELIPS project MATROSHKA, an international collaboration of more than 18 research institutes and space agencies from all over the world, under the science and project lead of the German Aerospace Center. The MATROSHKA facility is designed to determine the radiation exposure of an astronaut onboard the ISS, especially during an extravehicular activity. The numerical model developed in the frame of MATSIM is validated by reference measurements. In this report we give an overview of the model development and compare photon and neutron irradiations of the detector-equipped phantom torso with Monte Carlo simulations using FLUKA. Exposure to Co-60 photons was realized in the standard irradiation laboratory at Seibersdorf, while investigations with neutrons were performed at the thermal column of the Vienna TRIGA Mark-II reactor. The phantom was loaded with passive thermoluminescence dosimeters. In addition, first results of the calculated dose distribution within the torso are presented for a simulated exposure in low-Earth orbit.
NASA Astrophysics Data System (ADS)
Kotchenova, Svetlana Y.; Vermote, Eric F.; Matarrese, Raffaella; Klemm, Frank J., Jr.
2006-09-01
A vector version of the 6S (Second Simulation of a Satellite Signal in the Solar Spectrum) radiative transfer code (6SV1), which enables accounting for radiation polarization, has been developed and validated against a Monte Carlo code, Coulson's tabulated values, and MOBY (Marine Optical Buoy System) water-leaving reflectance measurements. The developed code was also tested against the scalar codes SHARM, DISORT, and MODTRAN to evaluate its performance in scalar mode and the influence of polarization. The obtained results have shown a good agreement of 0.7% in comparison with the Monte Carlo code, 0.2% for Coulson's tabulated values, and 0.001-0.002 for the 400-550 nm region for the MOBY reflectances. Ignoring the effects of polarization led to large errors in calculated top-of-atmosphere reflectances: more than 10% for a molecular atmosphere and up to 5% for an aerosol atmosphere. This new version of 6S is intended to replace the previous scalar version used for calculation of lookup tables in the MODIS (Moderate Resolution Imaging Spectroradiometer) atmospheric correction algorithm.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wagner, John C; Peplow, Douglas E.; Mosher, Scott W
2011-01-01
This paper provides a review of the hybrid (Monte Carlo/deterministic) radiation transport methods and codes used at the Oak Ridge National Laboratory and examples of their application for increasing the efficiency of real-world, fixed-source Monte Carlo analyses. The two principal hybrid methods are (1) Consistent Adjoint Driven Importance Sampling (CADIS) for optimization of a localized detector (tally) region (e.g., flux, dose, or reaction rate at a particular location) and (2) Forward Weighted CADIS (FW-CADIS) for optimizing distributions (e.g., mesh tallies over all or part of the problem space) or multiple localized detector regions (e.g., simultaneous optimization of two or more localized tally regions). The two methods have been implemented and automated in both the MAVRIC sequence of SCALE 6 and ADVANTG, a code that works with the MCNP code. As implemented, the methods utilize the results of approximate, fast-running 3-D discrete ordinates transport calculations (with the Denovo code) to generate consistent space- and energy-dependent source and transport (weight windows) biasing parameters. These methods and codes have been applied to many relevant and challenging problems, including calculations of PWR ex-core thermal detector response, dose rates throughout an entire PWR facility, site boundary dose from arrays of commercial spent fuel storage casks, radiation fields for criticality accident alarm system placement, and detector response for special nuclear material detection scenarios and nuclear well-logging tools. Substantial computational speed-ups, generally O(10^2-10^4), have been realized for all applications to date. This paper provides a brief review of the methods, their implementation, results of their application, and current development activities, as well as a considerable list of references for readers seeking more information about the methods and/or their applications.
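For reference, the central CADIS relations as commonly written in the hybrid-methods literature: the deterministic adjoint scalar flux φ†(r,E) defines both the biased source and the weight-window targets,
    R = \int \phi^{\dagger}(\mathbf{r},E)\, q(\mathbf{r},E)\, \mathrm{d}\mathbf{r}\, \mathrm{d}E,
    \qquad
    \hat{q}(\mathbf{r},E) = \frac{\phi^{\dagger}(\mathbf{r},E)\, q(\mathbf{r},E)}{R},
    \qquad
    \bar{w}(\mathbf{r},E) = \frac{R}{\phi^{\dagger}(\mathbf{r},E)},
so source particles are born with weights that already sit at the weight-window centres, keeping the transported population nearly uniform in importance; FW-CADIS generalizes the adjoint source so that a distribution, rather than a single tally, is optimized.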
NASA Astrophysics Data System (ADS)
Russkova, Tatiana V.
2017-11-01
One tool for improving the performance of Monte Carlo methods for numerical simulation of light transport in the Earth's atmosphere is parallel technology. A new algorithm oriented to parallel execution on CUDA-enabled NVIDIA graphics processors is discussed. The efficiency of parallelization is analyzed on the basis of calculating the upward and downward fluxes of solar radiation in both vertically homogeneous and inhomogeneous models of the atmosphere. The results of testing the new code under various atmospheric conditions, including continuous single-layered and multilayered clouds and selective molecular absorption, are presented. The results of testing the code using video cards with different compute capabilities are analyzed. It is shown that the changeover of computing from conventional PCs to the architecture of graphics processors gives more than a hundredfold increase in performance and fully reveals the capabilities of the technology used.
Monte Carlo simulations in Nuclear Medicine
NASA Astrophysics Data System (ADS)
Loudos, George K.
2007-11-01
Molecular imaging technologies provide unique abilities to localise signs of disease before symptoms appear, assist in drug testing, optimize and personalize therapy, and assess the efficacy of treatment regimes for different types of cancer. Monte Carlo simulation packages are used as an important tool for the optimal design of detector systems. In addition, they have demonstrated potential to improve image quality and acquisition protocols. Many general-purpose codes (MCNP, Geant4, etc.) and dedicated codes (SimSET, etc.) have been developed, aiming to provide accurate and fast results. Special emphasis will be given to the GATE toolkit. The GATE code, currently under development by the OpenGATE collaboration, is the most accurate and promising code for performing realistic simulations. The purpose of this article is to introduce the non-expert reader to the current status of MC simulations in nuclear medicine, briefly provide examples of currently simulated systems, and present future challenges that include simulation of clinical studies and dosimetry applications.
Molecular dynamics and dynamic Monte-Carlo simulation of irradiation damage with focused ion beams
NASA Astrophysics Data System (ADS)
Ohya, Kaoru
2017-03-01
The focused ion beam (FIB) has become an important tool for micro- and nanostructuring of samples, including milling, deposition and imaging. However, this leads to damage of the surface on the nanometer scale from implanted projectile ions and recoiled material atoms. It is therefore important to investigate each kind of damage quantitatively. We present a dynamic Monte-Carlo (MC) simulation code to simulate the morphological and compositional changes of a multilayered sample under ion irradiation and a molecular dynamics (MD) simulation code to simulate dose-dependent changes in the backscattering-ion (BSI) and secondary-electron (SE) yields of a crystalline sample. Recent progress with these codes in simulating the surface morphology and Mo/Si layer intermixing of an EUV lithography mask irradiated with FIBs, and the crystalline-orientation effect on BSI and SE yields related to channeling contrast in scanning ion microscopes, is also presented.
NASA Astrophysics Data System (ADS)
Kostyuchenko, V. I.; Makarova, A. S.; Ryazantsev, O. B.; Samarin, S. I.; Uglov, A. S.
2014-06-01
A great breakthrough in proton therapy has occurred in the new century: several tens of dedicated centers are now operated throughout the world, and their number increases every year. An important component of proton therapy is the treatment planning system. To make calculations faster, these systems usually use analytical methods whose reliability and accuracy do not allow the advantages of this method of treatment to be exploited to the full extent. Predictions by the Monte Carlo (MC) method are a "gold" standard for the verification of calculations with these systems. At the Institute of Theoretical and Experimental Physics (ITEP), which is one of the oldest proton therapy centers in the world, an MC code is an integral part of the treatment planning system. This code, called IThMC, was developed by scientists from RFNC-VNIITF (Snezhinsk) under ISTC Project 3563.
Parallelization of KENO-Va Monte Carlo code
NASA Astrophysics Data System (ADS)
Ramón, Javier; Peña, Jorge
1995-07-01
KENO-Va is a code integrated within the SCALE system developed by Oak Ridge National Laboratory that solves the transport equation by the Monte Carlo method. It is being used at the Consejo de Seguridad Nuclear (CSN) to perform criticality calculations for fuel storage pools and shipping casks. Two parallel versions of the code have been generated: one for shared-memory machines and another for distributed-memory systems using the message-passing interface PVM. In both versions the neutrons of each generation are tracked in parallel. In order to preserve the reproducibility of the results in both versions, advanced seeds for random numbers were used. The CONVEX C3440 with four processors and shared memory at CSN was used to implement the shared-memory version. An FDDI network of six HP9000/735 workstations was employed to implement the message-passing version using proprietary PVM. The speedup obtained was 3.6 in both cases.
Portable multi-node LQCD Monte Carlo simulations using OpenACC
NASA Astrophysics Data System (ADS)
Bonati, Claudio; Calore, Enrico; D'Elia, Massimo; Mesiti, Michele; Negro, Francesco; Sanfilippo, Francesco; Schifano, Sebastiano Fabio; Silvi, Giorgio; Tripiccione, Raffaele
This paper describes a state-of-the-art parallel Lattice QCD Monte Carlo code for staggered fermions, purposely designed to be portable across different computer architectures, including GPUs and commodity CPUs. Portability is achieved using the OpenACC parallel programming model, used to develop a code that can be compiled for several processor architectures. The paper focuses on parallelization on multiple computing nodes, using OpenACC to manage parallelism within a node and OpenMPI to manage parallelism among the nodes. We first discuss the available strategies for maximizing performance, then describe selected relevant details of the code, and finally measure the level of performance and scaling performance that we are able to achieve. The work focuses mainly on GPUs, which offer a significantly higher level of performance for this application, but also compares with results measured on other processors.
Monte Carlo technique for very large Ising models
NASA Astrophysics Data System (ADS)
Kalle, C.; Winkelmann, V.
1982-08-01
Rebbi's multispin coding technique is improved and applied to the kinetic Ising model with size 600×600×600. We give the central part of our computer program (for a CDC Cyber 76), which will be helpful also in a simulation of smaller systems, and describe the other tricks necessary to go to large lattices. The magnetization M at T = 1.4 T_c is found to decay asymptotically as exp(-t/2.90) if t is measured in Monte Carlo steps per spin, and M(t = 0) = 1 initially.
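The program above relies on multispin coding (many spins packed into one machine word); for orientation, the underlying single-spin-flip Metropolis dynamics that it accelerates can be sketched in plain (unpacked) Python as follows, with the lattice size scaled down for illustration.
    import numpy as np

    def metropolis_sweep(spins, beta, rng):
        """One Monte Carlo sweep (one attempted flip per spin) of the 3D Ising
        model with J = 1 and periodic boundary conditions."""
        L = spins.shape[0]
        for _ in range(spins.size):
            x, y, z = rng.integers(0, L, size=3)
            nn = (spins[(x + 1) % L, y, z] + spins[x - 1, y, z] +
                  spins[x, (y + 1) % L, z] + spins[x, y - 1, z] +
                  spins[x, y, (z + 1) % L] + spins[x, y, z - 1])
            dE = 2.0 * spins[x, y, z] * nn
            if dE <= 0.0 or rng.random() < np.exp(-beta * dE):
                spins[x, y, z] *= -1

    # Magnetization decay from the fully ordered state, M(t = 0) = 1
    rng = np.random.default_rng(1)
    L, T = 16, 1.4 * 4.51          # 3D Ising T_c is roughly 4.51 J/k_B
    spins = np.ones((L, L, L), dtype=np.int8)
    for t in range(1, 11):
        metropolis_sweep(spins, beta=1.0 / T, rng=rng)
        print(t, spins.mean())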
NASA Astrophysics Data System (ADS)
Cramer, S. N.; Roussin, R. W.
1981-11-01
A Monte Carlo analysis of a time-dependent neutron and secondary gamma-ray integral experiment on a thick concrete and steel shield is presented. The energy range covered in the analysis is 15-2 MeV for neutron source energies. The multigroup MORSE code was used with the VITAMIN C 171-36 neutron-gamma-ray cross-section data set. Both neutron and gamma-ray count rates and unfolded energy spectra are presented and compared, with good general agreement, with experimental results.
A highly optimized vectorized code for Monte Carlo simulations of SU(3) lattice gauge theories
NASA Technical Reports Server (NTRS)
Barkai, D.; Moriarty, K. J. M.; Rebbi, C.
1984-01-01
New methods are introduced for improving the performance of the vectorized Monte Carlo SU(3) lattice gauge theory algorithm using the CDC CYBER 205. Structure, algorithm and programming considerations are discussed. The performance achieved for a 16^4 lattice on a 2-pipe system may be phrased in terms of the link update time or overall MFLOPS rates. For 32-bit arithmetic, it is 36.3 microseconds/link for 8 hits per iteration (40.9 microseconds for 10 hits) or 101.5 MFLOPS.
Monte Carlo study of the effective Sherman function for electron polarimetry
NASA Astrophysics Data System (ADS)
Drągowski, M.; Włodarczyk, M.; Weber, G.; Ciborowski, J.; Enders, J.; Fritzsche, Y.; Poliszczuk, A.
2016-12-01
The PEBSI Monte Carlo simulation was upgraded for use in electron Mott polarimetry. The description of Mott scattering was improved and polarisation transfer in Møller scattering was included in the code. Improved agreement was achieved between the simulation and available experimental data for a 100 keV polarised electron beam scattering off gold foils of various thicknesses. The dependence of the effective Sherman function on scattering angle and target thickness, as well as a method for finding optimal conditions for Mott polarimetry measurements, were analysed.
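For reference, in Mott polarimetry the effective Sherman function S_eff relates the measured left-right counting asymmetry to the beam polarisation (standard relation, restated for clarity):
    A = \frac{N_L - N_R}{N_L + N_R} = S_{\mathrm{eff}}\, P
    \quad\Longrightarrow\quad
    P = \frac{1}{S_{\mathrm{eff}}}\, \frac{N_L - N_R}{N_L + N_R}
where N_L and N_R are counts in detectors placed symmetrically left and right of the beam; the simulation above supplies S_eff as a function of scattering angle and gold-foil thickness.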
Mesh-based Monte Carlo code for fluorescence modeling in complex tissues with irregular boundaries
NASA Astrophysics Data System (ADS)
Wilson, Robert H.; Chen, Leng-Chun; Lloyd, William; Kuo, Shiuhyang; Marcelo, Cynthia; Feinberg, Stephen E.; Mycek, Mary-Ann
2011-07-01
There is a growing need for the development of computational models that can account for complex tissue morphology in simulations of photon propagation. We describe the development and validation of a user-friendly, MATLAB-based Monte Carlo code that uses analytically-defined surface meshes to model heterogeneous tissue geometry. The code can use information from non-linear optical microscopy images to discriminate the fluorescence photons (from endogenous or exogenous fluorophores) detected from different layers of complex turbid media. We present a specific application of modeling a layered human tissue-engineered construct (Ex Vivo Produced Oral Mucosa Equivalent, EVPOME) designed for use in repair of oral tissue following surgery. Second-harmonic generation microscopic imaging of an EVPOME construct (oral keratinocytes atop a scaffold coated with human type IV collagen) was employed to determine an approximate analytical expression for the complex shape of the interface between the two layers. This expression can then be inserted into the code to correct the simulated fluorescence for the effect of the irregular tissue geometry.
Object-Oriented/Data-Oriented Design of a Direct Simulation Monte Carlo Algorithm
NASA Technical Reports Server (NTRS)
Liechty, Derek S.
2014-01-01
Over the past decade, there has been much progress towards improved phenomenological modeling and algorithmic updates for the direct simulation Monte Carlo (DSMC) method, which provides a probabilistic physical simulation of gas flows. These improvements have largely been based on the work of the originator of the DSMC method, Graeme Bird. Of primary importance are improved chemistry, internal energy, and physics modeling and a reduction in time to solution. These allow for an expanded range of possible solutions in altitude and velocity space. NASA's current production code, the DSMC Analysis Code (DAC), is well established, is based on Bird's 1994 algorithms written in Fortran 77, and has proven difficult to upgrade. A new DSMC code is being developed in the C++ programming language using object-oriented and data-oriented design paradigms to facilitate the inclusion of the recent improvements and future development activities. The development efforts on the new code, the Multiphysics Algorithm with Particles (MAP), are described, and performance comparisons are made with DAC.
Comparison of EGS4 and MCNP Monte Carlo codes when calculating radiotherapy depth doses.
Love, P A; Lewis, D G; Al-Affan, I A; Smith, C W
1998-05-01
The Monte Carlo codes EGS4 and MCNP have been compared when calculating radiotherapy depth doses in water. The aims of the work were to study (i) the differences between calculated depth doses in water for a range of monoenergetic photon energies and (ii) the relative efficiency of the two codes for different electron transport energy cut-offs. The depth doses from the two codes agree with each other within the statistical uncertainties of the calculations (1-2%). The relative depth doses also agree with data tabulated in the British Journal of Radiology Supplement 25. A discrepancy in the dose build-up region may be attributed to the different electron transport algorithms used by EGS4 and MCNP. This discrepancy is considerably reduced when the improved electron transport routines are used in the latest (4B) version of MCNP. Timing calculations show that EGS4 is at least 50% faster than MCNP for the geometries used in the simulations.
NASA Astrophysics Data System (ADS)
Bottaini, C.; Mirão, J.; Figuereido, M.; Candeias, A.; Brunetti, A.; Schiavon, N.
2015-01-01
Energy dispersive X-ray fluorescence (EDXRF) is a well-known technique for non-destructive and in situ analysis of archaeological artifacts, in terms of both qualitative and quantitative elemental composition, because of its rapidity and non-destructiveness. In this study EDXRF and realistic Monte Carlo simulation using the X-ray Monte Carlo (XRMC) code package have been combined to characterize a Cu-based bowl from the Iron Age burial of Fareleira 3 (Southern Portugal). The artifact displays a multilayered structure made up of three distinct layers: a) alloy substrate; b) green oxidized corrosion patina; and c) brownish carbonate soil-derived crust. To assess the reliability of Monte Carlo simulation in reproducing the composition of the bulk metal of the object without resorting to potentially damaging removal of the patina and crust, portable EDXRF analysis was performed on cleaned and patina/crust-coated areas of the artifact. The patina has been characterized by micro X-ray Diffractometry (μXRD) and Back-Scattered Scanning Electron Microscopy + Energy Dispersive Spectroscopy (BSEM + EDS). Results indicate that the EDXRF/Monte Carlo protocol is well suited when a two-layered model is considered, whereas in areas where the patina + crust surface coating is too thick, X-rays from the alloy substrate are not able to exit the sample.
CloudMC: a cloud computing application for Monte Carlo simulation.
Miras, H; Jiménez, R; Miras, C; Gomà, C
2013-04-21
This work presents CloudMC, a cloud computing application developed in Windows Azure®, the Microsoft® cloud platform, for the parallelization of Monte Carlo simulations in a dynamic virtual cluster. CloudMC is a web application designed to be independent of the Monte Carlo code in which the simulations are based; the simulations just need to be of the form: input files → executable → output files. To study the performance of CloudMC in Windows Azure®, Monte Carlo simulations with penelope were performed on different instance (virtual machine) sizes and for different numbers of instances. The instance size was found to have no effect on the simulation runtime. It was also found that the decrease in time with the number of instances followed Amdahl's law, with a slight deviation due to the increase in the fraction of non-parallelizable time with increasing number of instances. A simulation that would have required 30 h of CPU on a single instance was completed in 48.6 min when executed on 64 instances in parallel (speedup of 37×). Furthermore, the use of cloud computing for parallel computing offers some advantages over conventional clusters: high accessibility, scalability and pay per usage. Therefore, it is strongly believed that cloud computing will play an important role in making Monte Carlo dose calculation a reality in future clinical practice.
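As an illustration of the Amdahl's-law behaviour described above, the short Python sketch below infers the parallel fraction implied by the reported 30 h single-instance run and the 48.6 min run on 64 instances, and then predicts the speedup at another instance count. It is only a back-of-the-envelope aid, not part of CloudMC, and the extrapolation assumes ideal Amdahl scaling.

    # Illustrative estimate (not the paper's code): infer the parallel fraction p
    # from the reported runtimes and predict speedup via Amdahl's law.

    def amdahl_speedup(p, n):
        """Amdahl's law: speedup on n instances with parallel fraction p."""
        return 1.0 / ((1.0 - p) + p / n)

    t1 = 30 * 60.0          # single-instance time, minutes
    t64 = 48.6              # 64-instance time, minutes
    observed = t1 / t64     # ~37x, as quoted in the abstract

    # Solve observed = 1 / ((1 - p) + p/64) for p
    n = 64
    p = (1.0 - 1.0 / observed) / (1.0 - 1.0 / n)
    print(f"observed speedup ~ {observed:.1f}x, implied parallel fraction p ~ {p:.4f}")
    print(f"Amdahl prediction at n=128: {amdahl_speedup(p, 128):.1f}x")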
NASA Astrophysics Data System (ADS)
Shepherd, James J.; López Ríos, Pablo; Needs, Richard J.; Drummond, Neil D.; Mohr, Jennifer A.-F.; Booth, George H.; Grüneis, Andreas; Kresse, Georg; Alavi, Ali
2013-03-01
Full configuration interaction quantum Monte Carlo (FCIQMC) [1] and its initiator adaptation [2] allow for exact solutions to the Schrödinger equation to be obtained within a finite-basis wavefunction ansatz. In this talk, we explore an application of FCIQMC to the homogeneous electron gas (HEG). In particular we use these exact finite-basis energies to compare with approximate quantum chemical calculations from the VASP code [3]. After removing the basis set incompleteness error by extrapolation [4,5], we compare our energies with state-of-the-art diffusion Monte Carlo calculations from the CASINO package [6]. Using a combined approach of the two quantum Monte Carlo methods, we present the highest-accuracy thermodynamic (infinite-particle) limit energies for the HEG achieved to date. [1] G. H. Booth, A. Thom, and A. Alavi, J. Chem. Phys. 131, 054106 (2009). [2] D. Cleland, G. H. Booth, and A. Alavi, J. Chem. Phys. 132, 041103 (2010). [3] www.vasp.at (2012). [4] J. J. Shepherd, A. Grüneis, G. H. Booth, G. Kresse, and A. Alavi, Phys. Rev. B 86, 035111 (2012). [5] J. J. Shepherd, G. H. Booth, and A. Alavi, J. Chem. Phys. 136, 244101 (2012). [6] R. Needs, M. Towler, N. Drummond, and P. L. Ríos, J. Phys.: Condensed Matter 22, 023201 (2010).
NASA Astrophysics Data System (ADS)
Schiavon, Nick; de Palmas, Anna; Bulla, Claudio; Piga, Giampaolo; Brunetti, Antonio
2016-09-01
A spectrometric protocol combining Energy Dispersive X-Ray Fluorescence Spectrometry with Monte Carlo simulations of experimental spectra using the XRMC code package has been applied for the first time to characterize the elemental composition of a series of famous Iron Age small-scale archaeological bronze replicas of ships (known as the "Navicelle") from the Nuragic civilization in Sardinia, Italy. The proposed protocol is a useful, nondestructive and fast analytical tool for Cultural Heritage samples. In the Monte Carlo simulations, each sample was modeled as a multilayered object composed of two or three layers depending on the sample: when all are present, the three layers are the original bronze substrate, the surface corrosion patina and an outermost protective layer (Paraloid) applied during past restorations. The Monte Carlo simulations were able to account for the presence of the patina/corrosion layer as well as the presence of the Paraloid protective layer. They also accounted for the roughness effect commonly found at the surface of corroded metal archaeological artifacts. In this respect, the Monte Carlo simulation approach adopted here was, to the best of our knowledge, unique and enabled the determination of the bronze alloy composition together with the thickness of the surface layers without the need to previously remove the surface patinas, a process potentially threatening the preservation of precious archaeological/artistic artifacts for future generations.
NASA Astrophysics Data System (ADS)
Liu, Hongdong; Zhang, Lian; Chen, Zhi; Liu, Xinguo; Dai, Zhongying; Li, Qiang; Xu, Xie George
2017-09-01
In medical physics it is desirable to have a Monte Carlo code that is less complex and reliable, yet flexible, for dose verification, optimization, and component design. TOPAS is a newly developed Monte Carlo simulation tool which combines the extensive radiation physics libraries available in the Geant4 code, easy-to-use geometry, and support for visualization. Although TOPAS has been widely tested and verified in simulations of proton therapy, there has been no reported application for carbon ion therapy. To evaluate the feasibility and accuracy of TOPAS simulations for carbon ion therapy, a licensed TOPAS code (version 3_0_p1) was used to carry out a dosimetric study of therapeutic carbon ions. Results of depth dose profiles based on different physics models have been obtained and compared with measurements. It is found that the G4QMD model is at least as accurate as the TOPAS default BIC physics model for carbon ions, but when the energy is increased to relatively high levels such as 400 MeV/u, the G4QMD model shows preferable performance. Also, simulations of special components used in the treatment head at the Institute of Modern Physics facility were conducted to investigate the spread-out Bragg peak (SOBP) dose distribution in water. The physical dose in water of the SOBP was found to be consistent with the aim of the 6 cm ridge filter.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bianchini, G.; Burgio, N.; Carta, M.
The GUINEVERE experiment (Generation of Uninterrupted Intense Neutrons at the lead Venus Reactor) is an experimental program in support of the ADS technology presently carried out at SCK-CEN in Mol (Belgium). In the experiment a modified lay-out of the original thermal VENUS critical facility is coupled to an accelerator, built by the French body CNRS in Grenoble, working in both continuous and pulsed mode and delivering 14 MeV neutrons by bombardment of deuterons on a tritium target. The modified lay-out of the facility consists of a fast subcritical core made of 30% U-235 enriched metallic Uranium in a lead matrix. Several off-line and on-line reactivity measurement techniques will be investigated during the experimental campaign. This report is focused on the simulation by deterministic (ERANOS French code) and Monte Carlo (MCNPX US code) calculations of three reactivity measurement techniques, Slope (α-fitting), Area-ratio and Source-jerk, applied to a GUINEVERE subcritical configuration (namely SC1). The reactivity inferred by the Area-ratio method, in dollar units, shows an overall agreement between the deterministic and Monte Carlo computational approaches, whereas the MCNPX Source-jerk results are affected by large uncertainties and allow only partial conclusions about the comparison. Finally, no particular spatial dependence of the results is observed in the case of the GUINEVERE SC1 subcritical configuration. (authors)
Implementation of new physics models for low energy electrons in liquid water in Geant4-DNA.
Bordage, M C; Bordes, J; Edel, S; Terrissol, M; Franceries, X; Bardiès, M; Lampe, N; Incerti, S
2016-12-01
A new alternative set of elastic and inelastic cross sections has been added to the very low energy extension of the Geant4 Monte Carlo simulation toolkit, Geant4-DNA, for the simulation of electron interactions in liquid water. These cross sections have been obtained from the CPA100 Monte Carlo track structure code, which has been a reference in the microdosimetry community for many years. They are compared to the default Geant4-DNA cross sections and show better agreement with published data. In order to verify the correct implementation of the CPA100 cross section models in Geant4-DNA, simulations of the number of interactions and ranges were performed using Geant4-DNA with this new set of models, and the results were compared with corresponding results from the original CPA100 code. Good agreement is observed between the implementations, with relative differences lower than 1% regardless of the incident electron energy. Useful quantities related to the deposited energy at the scale of the cell or the organ of interest for internal dosimetry, like dose point kernels, are also calculated using these new physics models. They are compared with results obtained using the well-known Penelope Monte Carlo code. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Gao, Wanbao; Raeside, David E.
1997-12-01
Dose distributions that result from treating a patient with orthovoltage beams are best determined with a treatment planning system that uses the Monte Carlo method, and such systems are not readily available. In the present work, the Monte Carlo method was used to develop a computer code for determining absorbed dose distributions in orthovoltage radiation therapy. The code was used in planning treatment of a patient with a neuroendocrine carcinoma of the maxillary sinus. Two lateral high-energy photon beams supplemented by an anterior orthovoltage photon beam were utilized in the treatment plan. For the clinical case and radiation beams considered, a reasonably uniform dose distribution is achieved within the target volume, while the dose to the lens of each eye is 4-8% of the prescribed dose. Therefore, an orthovoltage photon beam, when properly filtered and optimally combined with megavoltage beams, can be effective in the treatment of cancers below the skin, provided that accurate treatment planning is carried out to establish with accuracy and precision the doses to critical structures.
Implementation of the direct S(α,β) method in the KENO Monte Carlo code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hart, Shane W. D.; Maldonado, G. Ivan
The Monte Carlo code KENO contains thermal scattering data for a wide variety of thermal moderators. These data are processed from Evaluated Nuclear Data Files (ENDF) by AMPX and stored as double differential probability distribution functions. The method examined in this study uses S(α,β) probability distribution functions derived from the ENDF data files directly instead of being converted to double differential cross sections. This allows the size of the cross section data on the disk to be reduced substantially. KENO has also been updated to allow interpolation in temperature on these data so that problems can be run at any temperature. Results are shown for several simplified problems for a variety of moderators. In addition, benchmark models based on the KRITZ reactor in Sweden were run, and the results are compared with the previous versions of KENO without the direct S(α,β) method. Results from the direct S(α,β) method compare favorably with the original results obtained using the double differential cross sections. Finally, sampling the data increases the run-time of the Monte Carlo calculation, but memory usage is decreased substantially.
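To make the general idea of sampling directly from tabulated probability distribution functions, with interpolation in temperature, more concrete, the Python sketch below interpolates two hypothetical tabulated distributions in temperature and inverts the resulting CDF with a uniform random number. The grids, distributions and temperatures are invented for illustration; this is not the KENO/AMPX implementation.

    import numpy as np

    # Minimal sketch (not KENO/AMPX code): sample an outgoing value from a
    # tabulated probability distribution, with linear interpolation in
    # temperature between two tabulated temperatures.

    rng = np.random.default_rng(42)

    beta = np.linspace(0.0, 10.0, 101)            # hypothetical tabulation grid
    pdf_T1 = np.exp(-beta / 1.0)                  # hypothetical pdf at T1
    pdf_T2 = np.exp(-beta / 1.5)                  # hypothetical pdf at T2

    def cdf(pdf, grid):
        """Cumulative distribution on the grid (trapezoidal rule), normalized to 1."""
        c = np.concatenate(([0.0], np.cumsum(0.5 * (pdf[1:] + pdf[:-1]) * np.diff(grid))))
        return c / c[-1]

    def sample_beta(T, T1=300.0, T2=600.0, n=1):
        """Interpolate the tabulated CDFs in temperature, then invert with a random number."""
        f = (T - T1) / (T2 - T1)
        c = (1.0 - f) * cdf(pdf_T1, beta) + f * cdf(pdf_T2, beta)
        xi = rng.random(n)
        return np.interp(xi, c, beta)             # inverse-CDF lookup on the grid

    print(sample_beta(450.0, n=5))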
NASA Astrophysics Data System (ADS)
Cohen, R. E.; Driver, K.; Wu, Z.; Militzer, B.; Rios, P. L.; Towler, M.; Needs, R.
2009-03-01
We have used diffusion quantum Monte Carlo (DMC) with the CASINO code, together with thermal free energies from phonons computed using density functional perturbation theory (DFPT) with the ABINIT code, to obtain phase transition curves and thermal equations of state of silica phases under pressure. We obtain excellent agreement with experiments for the metastable phase transition from quartz to stishovite. The local density approximation (LDA) incorrectly gives stishovite as the ground state. The generalized gradient approximation (GGA) correctly gives quartz as the ground state, but does worse than LDA for the equations of state. DMC, variational quantum Monte Carlo (VMC), and DFT all give good results for the ferroelastic transition of stishovite to the CaCl2 structure, and LDA or the WC exchange correlation potentials give good results within a given silica phase. The δV and δH for the transition from the CaCl2 structure to α-PbO2 are small, giving uncertainty in the theoretical transition pressure. It is interesting that DFT has trouble with silica transitions, although the electronic structures of silica are insulating, simple closed-shell with ionic/covalent bonding. The errors in DFT seem to arise from not precisely describing the ion sizes.
Bahreyni Toossi, M T; Moradi, H; Zare, H
2008-01-01
In this work, the general purpose Monte Carlo N-particle radiation transport computer code (MCNP-4C) was used for the simulation of X-ray spectra in diagnostic radiology. The electron's path in the target was followed until its energy was reduced to 10 keV. A user-friendly interface named 'diagnostic X-ray spectra by Monte Carlo simulation (DXRaySMCS)' was developed to facilitate the application of the MCNP-4C code for diagnostic radiology spectrum prediction. The program provides a user-friendly interface for: (i) modifying the MCNP input file, (ii) launching the MCNP program to simulate electron and photon transport and (iii) processing the MCNP output file to yield a summary of the results (relative photon number per energy bin). In this article, the development and characteristics of DXRaySMCS are outlined. As part of the validation process, output spectra for 46 diagnostic radiology system settings produced by DXRaySMCS were compared with the corresponding IPEM78 spectra. Generally, there is good agreement between the two sets of spectra. No statistically significant differences have been observed between the IPEM78 reported spectra and the simulated spectra generated in this study.
Khajeh, Masoud; Safigholi, Habib
2015-01-01
A miniature X-ray source has been optimized for electronic brachytherapy. The cooling fluid for this device is water. Unlike radionuclide brachytherapy sources, this source is able to operate at variable voltages and currents to match the dose with the tumor depth. First, Monte Carlo (MC) optimization was performed on the tungsten target-buffer layer thicknesses versus energy such that X-ray attenuation was minimized. A second optimization was performed on the selection of the anode shape, based on the Monte Carlo-calculated in-water TG-43U1 anisotropy function. This optimization was carried out to bring the dose anisotropy function closer to unity at all angles from 0° to 170°. Three anode shapes including cylindrical, spherical, and conical were considered. Moreover, the optimal target-buffer shape and different nozzle shapes for electronic brachytherapy were evaluated with a Computational Fluid Dynamics (CFD) code. The characterization criteria for the CFD were the minimum temperature on the anode and cooling water, and the pressure loss from inlet to outlet. The optimal anode was conical in shape with a conical nozzle. Finally, the TG-43U1 parameters of the optimal source were compared with the literature. PMID:26966563
NASA Astrophysics Data System (ADS)
Golosio, Bruno; Schoonjans, Tom; Brunetti, Antonio; Oliva, Piernicola; Masala, Giovanni Luca
2014-03-01
The simulation of X-ray imaging experiments is often performed using deterministic codes, which can be relatively fast and easy to use. However, such codes are generally not suitable for the simulation of even slightly more complex experimental conditions, involving, for instance, first-order or higher-order scattering, X-ray fluorescence emissions, or more complex geometries, particularly for experiments that combine spatial resolution with spectral information. In such cases, simulations are often performed using codes based on the Monte Carlo method. In a simple Monte Carlo approach, the interaction position of an X-ray photon and the state of the photon after an interaction are obtained simply according to the theoretical probability distributions. This approach may be quite inefficient because the final channels of interest may include only a limited region of space or photons produced by a rare interaction, e.g., fluorescent emission from elements with very low concentrations. In the field of X-ray fluorescence spectroscopy, this problem has been solved by combining the Monte Carlo method with variance reduction techniques, which can reduce the computation time by several orders of magnitude. In this work, we present a C++ code for the general simulation of X-ray imaging and spectroscopy experiments, based on the application of the Monte Carlo method in combination with variance reduction techniques, with a description of sample geometry based on quadric surfaces. We describe the benefits of the object-oriented approach in terms of code maintenance, the flexibility of the program for the simulation of different experimental conditions and the possibility of easily adding new modules. Sample applications in the fields of X-ray imaging and X-ray spectroscopy are discussed.
Catalogue identifier: AERO_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AERO_v1_0.html
Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland
Licensing provisions: GNU General Public License version 3
No. of lines in distributed program, including test data, etc.: 83617
No. of bytes in distributed program, including test data, etc.: 1038160
Distribution format: tar.gz
Programming language: C++
Computer: Tested on several PCs and on Mac.
Operating system: Linux, Mac OS X, Windows (native and cygwin).
RAM: Dependent on the input data, but usually between 1 and 10 MB.
Classification: 2.5, 21.1.
External routines: XrayLib (https://github.com/tschoonj/xraylib/wiki)
Nature of problem: Simulation of a wide range of X-ray imaging and spectroscopy experiments using different types of sources and detectors.
Solution method: XRMC is a versatile program that is useful for the simulation of a wide range of X-ray imaging and spectroscopy experiments. It enables the simulation of monochromatic and polychromatic X-ray sources, with unpolarised or partially/completely polarised radiation. Single-element detectors as well as two-dimensional pixel detectors can be used in the simulations, with several acquisition options. In the current version of the program, the sample is modelled by combining convex three-dimensional objects demarcated by quadric surfaces, such as planes, ellipsoids and cylinders. The Monte Carlo approach makes XRMC able to accurately simulate X-ray photon transport and interactions with matter up to any order of interaction.
The differential cross-sections and all other quantities related to the interaction processes (photoelectric absorption, fluorescence emission, elastic and inelastic scattering) are computed using the xraylib software library, which is currently the most complete and up-to-date software library for X-ray parameters. The use of variance reduction techniques makes XRMC able to reduce the simulation time by several orders of magnitude compared to other general-purpose Monte Carlo simulation programs. Running time: It is dependent on the complexity of the simulation. For the examples distributed with the code, it ranges from less than 1 s to a few minutes.
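As one concrete example of the kind of variance-reduction technique mentioned above, the sketch below forces a photon to interact inside a slab of thickness L and carries the corresponding interaction probability as a statistical weight, instead of letting most histories escape without scoring. The attenuation coefficient and geometry are placeholders; this is a generic illustration of the forced-interaction idea, not XRMC code.

    import numpy as np

    # Illustrative sketch of one common variance-reduction trick (forced
    # interaction), not XRMC's implementation: sample interaction depths
    # confined to a slab [0, L] and weight each history by the probability
    # that an interaction actually occurs within the slab.

    rng = np.random.default_rng(0)

    def forced_interaction_depth(mu, L, n):
        """Sample forced interaction depths in [0, L] for attenuation coefficient mu (1/cm)."""
        p_int = 1.0 - np.exp(-mu * L)             # probability of interacting within the slab
        xi = rng.random(n)
        depth = -np.log(1.0 - xi * p_int) / mu    # inverse CDF of the truncated exponential
        weight = np.full(n, p_int)                # statistical weight carried by each history
        return depth, weight

    d, w = forced_interaction_depth(mu=0.2, L=1.0, n=5)
    print(d, w[0])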
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ward, Robert Cameron; Steiner, Don
2004-06-15
The generation of runaway electrons during a thermal plasma disruption is a concern for the safe and economical operation of a tokamak power system. Runaway electrons have high energy, 10 to 300 MeV, and may potentially cause extensive damage to plasma-facing components (PFCs) through large temperature increases, melting of metallic components, surface erosion, and possible burnout of coolant tubes. The EPQ code system was developed to simulate the thermal response of PFCs to a runaway electron impact. The EPQ code system consists of several parts: UNIX scripts that control the operation of an electron-photon Monte Carlo code to calculate the interaction of the runaway electrons with the plasma-facing materials; a finite difference code to calculate the thermal response, melting, and surface erosion of the materials; a code to process, scale, transform, and convert the electron Monte Carlo data to volumetric heating rates for use in the thermal code; and several minor and auxiliary codes for the manipulation and postprocessing of the data. The electron-photon Monte Carlo code used was Electron-Gamma-Shower (EGS), developed and maintained by the National Research Council of Canada. The Quick-Therm-Two-Dimensional-Nonlinear (QTTN) thermal code solves the two-dimensional cylindrical modified heat conduction equation using the Quickest third-order accurate and stable explicit finite difference method and is capable of tracking melting or surface erosion. The EPQ code system is validated using a series of analytical solutions and simulations of experiments. The verification of the QTTN thermal code with analytical solutions shows that the code with the Quickest method is better than 99.9% accurate. The benchmarking of the EPQ code system and QTTN versus experiments showed that QTTN's erosion tracking method is accurate within 30% and that EPQ is able to predict the occurrence of melting within the proper time constraints. QTTN and EPQ are verified and validated as able to calculate the temperature distribution, phase change, and surface erosion successfully.
Monte Carlo simulation of Ising models by multispin coding on a vector computer
NASA Astrophysics Data System (ADS)
Wansleben, Stephan; Zabolitzky, John G.; Kalle, Claus
1984-11-01
Rebbi's efficient multispin coding algorithm for Ising models is combined with the use of the vector computer CDC Cyber 205. A speed of 21.2 million updates per second is reached. This is comparable to that obtained by special-purpose computers.
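The essence of multispin coding is that many Ising spins are packed as bits into a single machine word, so one logical operation processes many spins at once. The Python sketch below illustrates only the packing and the bond-counting step, using XOR and a population count; it is not Rebbi's vectorized update algorithm, and the word layout is hypothetical.

    import random

    # Minimal sketch of the multispin-coding idea: 64 Ising spins (0/1 bits)
    # are packed into one 64-bit word, so a single XOR compares 64 neighbor
    # pairs at once and a population count gives the number of unsatisfied
    # (antiparallel) bonds.

    random.seed(1)
    words = [random.getrandbits(64) for _ in range(4)]   # 4 words = 256 spins

    def unsatisfied_bonds(a, b):
        """Count antiparallel neighbor pairs between two packed spin words."""
        return bin(a ^ b).count("1")   # XOR marks differing spins; popcount sums them

    total = sum(unsatisfied_bonds(words[i], words[(i + 1) % len(words)])
                for i in range(len(words)))
    print("antiparallel bonds between neighboring words:", total)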
Radiation Transport Tools for Space Applications: A Review
NASA Technical Reports Server (NTRS)
Jun, Insoo; Evans, Robin; Cherng, Michael; Kang, Shawn
2008-01-01
This slide presentation contains a brief discussion of nuclear transport codes widely used in the space radiation community for shielding and scientific analyses. Seven radiation transport codes are addressed. The two general methods (i.e., the Monte Carlo method and the deterministic method) are briefly reviewed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chow, J
Purpose: This study evaluated the efficiency of 4D lung radiation treatment planning using Monte Carlo simulation on the cloud. The EGSnrc Monte Carlo code was used in dose calculation on the 4D-CT image set. Methods: The 4D lung radiation treatment plan was created by DOSCTP linked to the cloud, based on the Amazon Elastic Compute Cloud platform. Dose calculation was carried out by Monte Carlo simulation on the 4D-CT image set on the cloud, and results were sent to the FFD4D image deformation program for dose reconstruction. The dependence of the computing time for the treatment plan on the number of compute nodes was optimized with variations of the number of CT image sets in the breathing cycle and the dose reconstruction time of the FFD4D. Results: It is found that the dependence of computing time on the number of compute nodes was affected by the diminishing returns of the number of nodes used in the Monte Carlo simulation. Moreover, the performance of the 4D treatment planning could be optimized by using fewer than 10 compute nodes on the cloud. The effects of the number of image sets and the dose reconstruction time on the dependence of computing time on the number of nodes were not significant when more than 15 compute nodes were used in the Monte Carlo simulations. Conclusion: The issue of long computing times in 4D treatment planning, which requires Monte Carlo dose calculations in all CT image sets in the breathing cycle, can be solved using cloud computing technology. It is concluded that the optimal number of compute nodes selected for simulation should be between 5 and 15, as the dependence of computing time on the number of nodes is significant.
Monte Carlo verification of radiotherapy treatments with CloudMC.
Miras, Hector; Jiménez, Rubén; Perales, Álvaro; Terrón, José Antonio; Bertolet, Alejandro; Ortiz, Antonio; Macías, José
2018-06-27
A new implementation has been made on CloudMC, a cloud-based platform presented in a previous work, in order to provide services for radiotherapy treatment verification by means of Monte Carlo in a fast, easy and economical way. A description of the architecture of the application and the new developments implemented is presented together with the results of the tests carried out to validate its performance. CloudMC has been developed over the Microsoft Azure cloud. It is based on a map/reduce implementation for the distribution of Monte Carlo calculations over a dynamic cluster of virtual machines in order to reduce calculation time. CloudMC has been updated with new methods to read and process the information related to radiotherapy treatment verification: CT image set, treatment plan, structures and dose distribution files in DICOM format. Some tests have been designed in order to determine, for the different tasks, the most suitable type of virtual machine from those available in Azure. Finally, the performance of Monte Carlo verification in CloudMC is studied through three real cases that involve different treatment techniques, linac models and Monte Carlo codes. Considering computational and economic factors, D1_v2 and G1 virtual machines were selected as the default type for the Worker Roles and the Reducer Role respectively. Calculation times up to 33 min and costs of 16 € were achieved for the verification cases presented when a statistical uncertainty below 2% (2σ) was required. The costs were reduced to 3-6 € when uncertainty requirements were relaxed to 4%. Advantages like high computational power, scalability, easy access and a pay-per-usage model make Monte Carlo cloud-based solutions, like the one presented in this work, an important step towards solving the long-standing problem of truly introducing Monte Carlo algorithms into the daily routine of the radiotherapy planning process.
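The cost figures quoted above are consistent with the usual 1/sqrt(N) scaling of Monte Carlo statistical uncertainty: relaxing the target from 2% to 4% requires roughly four times fewer histories, so a pay-per-usage cost drops accordingly. The sketch below only illustrates this scaling; the 16 € reference value is taken from the abstract, and the exact proportionality is an idealizing assumption.

    # Back-of-the-envelope check (not CloudMC code): assume cost ~ number of
    # histories N, and statistical uncertainty sigma ~ 1/sqrt(N), so
    # cost ~ 1/sigma^2.

    def scaled_cost(cost_ref, sigma_ref, sigma_target):
        """Cost scaling under the assumption cost ~ N ~ 1/sigma^2."""
        return cost_ref * (sigma_ref / sigma_target) ** 2

    print(scaled_cost(cost_ref=16.0, sigma_ref=0.02, sigma_target=0.04))  # ~4 EUR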
Gravitational microlensing of gamma-ray bursts
NASA Technical Reports Server (NTRS)
Mao, Shude
1993-01-01
A Monte Carlo code is developed to calculate gravitational microlensing in three dimensions when the lensing optical depth is low or moderate (not greater than 0.25). The code calculates positions of microimages and time delays between the microimages. The majority of lensed gamma-ray bursts should show a simple double-burst structure, as predicted by a single point mass lens model. A small fraction should show complicated multiple events due to the collective effects of several point masses (black holes). Cosmological models with a significant fraction of mass density in massive compact objects can be tested by searching for microlensing events in the current BATSE data. Our catalog generated by 10,000 Monte Carlo models is accessible through the computer network. The catalog can be used to take realistic selection effects into account.
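For reference, the double-burst signature predicted by a single point-mass lens follows from the standard point-lens relations for the image magnifications and the time delay between the two microimages. The sketch below evaluates these textbook formulas for an arbitrary impact parameter; the lens mass and redshift used here are illustrative assumptions, and this is not the Monte Carlo code described in the abstract.

    import math

    # Standard point-mass-lens relations (textbook sketch, not the paper's code):
    # for a source at impact parameter u = beta/theta_E, the two microimages have
    # magnifications mu_plus/mu_minus and a mass-dependent time delay.

    def point_lens(u, M_solar=1.0e6, z_lens=0.5):
        s = math.sqrt(u * u + 4.0)
        mu_plus = 0.5 * (u * u + 2.0) / (u * s) + 0.5
        mu_minus = 0.5 * (u * u + 2.0) / (u * s) - 0.5
        # 4GM/c^3 is ~1.97e-5 s per solar mass
        dt = 1.97e-5 * M_solar * (1.0 + z_lens) * (0.5 * u * s + math.log((s + u) / (s - u)))
        return mu_plus, mu_minus, dt

    mp, mm, dt = point_lens(u=0.5)
    print(f"flux ratio ~ {mp / mm:.2f}, time delay ~ {dt:.1f} s")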
A Monte Carlo code for the fragmentation of polarized quarks
NASA Astrophysics Data System (ADS)
Kerbizi, A.; Artru, X.; Belghobsi, Z.; Bradamante, F.; Martin, A.
2017-12-01
We describe a Monte Carlo code for the fragmentation of polarized quarks into pseudoscalar mesons. The quark jet is generated by iteration of the splitting q → h + q′, where q and q′ indicate quarks and h a hadron. The splitting function describing the energy sharing between q′ and h is calculated on the basis of the Symmetric Lund Model, where the quark spin is introduced through spin matrices as foreseen in the ³P₀ mechanism. A complex mass parameter is introduced for the parametrisation of the Collins effect. The results for the Collins analysing power and the comparison with the Collins asymmetries measured by the COMPASS collaboration are presented. For the first time, preliminary results on the simulated azimuthal asymmetry due to the Boer-Mulders function are also given.
NASA Astrophysics Data System (ADS)
Basiri, H.; Tavakoli-Anbaran, H.
2018-01-01
The Am-Be neutron source is based on the (α, n) reaction and generates neutrons in the energy range of 0-11 MeV. Since thermal neutrons are widely used in different fields, in this work we investigate how to improve the source configuration in order to increase the thermal flux. The suggested changes include a spherical moderator instead of the common cylindrical geometry, a reflector layer, and an appropriate selection of materials in order to achieve the maximum thermal flux. All calculations were done using the MCNP Monte Carlo code. Our final results indicated that a spherical paraffin moderator and a beryllium reflector layer can efficiently increase the thermal neutron flux of the Am-Be source.
Tool for Rapid Analysis of Monte Carlo Simulations
NASA Technical Reports Server (NTRS)
Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.
2013-01-01
Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The first version of this tool was a serial code and the current version is a parallel code, which has greatly increased the analysis capabilities. This paper describes the new implementation of this analysis tool on a graphical processing unit, and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.
DEVELOPMENT OF A MULTIMODAL MONTE CARLO BASED TREATMENT PLANNING SYSTEM.
Kumada, Hiroaki; Takada, Kenta; Sakurai, Yoshinori; Suzuki, Minoru; Takata, Takushi; Sakurai, Hideyuki; Matsumura, Akira; Sakae, Takeji
2017-10-26
To establish boron neutron capture therapy (BNCT), the University of Tsukuba is developing a treatment device and peripheral devices required in BNCT, such as a treatment planning system. We are developing a new multimodal Monte Carlo based treatment planning system (developing code: Tsukuba Plan). Tsukuba Plan allows for dose estimation in proton therapy, X-ray therapy and heavy ion therapy in addition to BNCT because the system employs PHITS as the Monte Carlo dose calculation engine. Regarding BNCT, several verifications of the system are being carried out for its practical usage. The verification results demonstrate that Tsukuba Plan allows for accurate estimation of thermal neutron flux and gamma-ray dose as fundamental radiations of dosimetry in BNCT. In addition to the practical use of Tsukuba Plan in BNCT, we are investigating its application to other radiation therapies. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
NASA Astrophysics Data System (ADS)
Sboev, A. G.; Ilyashenko, A. S.; Vetrova, O. A.
1997-02-01
The method of buckling evaluation, realized in the Monte Carlo code MCS, is described. This method was applied to the calculational analysis of the well-known light water experiments TRX-1 and TRX-2. The analysis of this comparison shows that there is no agreement among Monte Carlo calculations obtained in different ways: the MCS calculations with given experimental bucklings; the MCS calculations with given bucklings evaluated on the basis of full-core MCS direct simulations; the full-core MCNP and MCS direct simulations; and the MCNP and MCS calculations where the results of cell calculations are corrected by coefficients taking into account the leakage from the core. The buckling values evaluated by full-core MCS calculations also differed from the experimental ones, especially in the case of TRX-1, where this difference corresponded to a 0.5 percent increase in the Keff value.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beer, M.
1980-12-01
The maximum likelihood method for the multivariate normal distribution is applied to the case of several individual eigenvalues. Correlated Monte Carlo estimates of the eigenvalue are assumed to follow this prescription and aspects of the assumption are examined. Monte Carlo cell calculations using the SAM-CE and VIM codes for the TRX-1 and TRX-2 benchmark reactors, and SAM-CE full core results are analyzed with this method. Variance reductions of a few percent to a factor of 2 are obtained from maximum likelihood estimation as compared with the simple average and the minimum variance individual eigenvalue. The numerical results verify that the use of sample variances and correlation coefficients in place of the corresponding population statistics still leads to nearly minimum variance estimation for a sufficient number of histories and aggregates.
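For a multivariate normal model, the estimator referred to above reduces to the minimum-variance linear combination of correlated estimates, with weights w = C⁻¹1 / (1ᵀC⁻¹1) for covariance matrix C. The sketch below applies that formula to invented numbers; it is not the SAM-CE/VIM analysis itself, and the estimates and covariance are purely hypothetical.

    import numpy as np

    # Illustrative minimum-variance (maximum-likelihood) combination of
    # correlated eigenvalue estimates x with covariance matrix C.

    x = np.array([1.002, 0.998, 1.005])                 # hypothetical k-eff estimates
    C = np.array([[4.0, 1.5, 1.0],
                  [1.5, 3.0, 0.8],
                  [1.0, 0.8, 5.0]]) * 1e-6              # hypothetical covariance matrix

    ones = np.ones(len(x))
    Cinv1 = np.linalg.solve(C, ones)                    # C^{-1} 1
    w = Cinv1 / (ones @ Cinv1)                          # weights sum to 1
    k_hat = w @ x                                       # combined estimate
    var_hat = 1.0 / (ones @ Cinv1)                      # variance of the combination
    print(f"combined estimate {k_hat:.5f} +/- {np.sqrt(var_hat):.5f}, weights {w}")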
Fast Monte Carlo-assisted simulation of cloudy Earth backgrounds
NASA Astrophysics Data System (ADS)
Adler-Golden, Steven; Richtsmeier, Steven C.; Berk, Alexander; Duff, James W.
2012-11-01
A calculation method has been developed for rapidly synthesizing radiometrically accurate ultraviolet through long-wavelength infrared spectral imagery of the Earth for arbitrary locations and cloud fields. The method combines cloud-free surface reflectance imagery with cloud radiance images calculated from a first-principles 3-D radiation transport model. The MCScene Monte Carlo code [1-4] is used to build a cloud image library; a data fusion method is incorporated to speed convergence. The surface and cloud images are combined with an upper atmospheric description with the aid of solar and thermal radiation transport equations that account for atmospheric inhomogeneity. The method enables a wide variety of sensor and sun locations, cloud fields, and surfaces to be combined on-the-fly, and provides hyperspectral wavelength resolution with minimal computational effort. The simulations agree very well with much more time-consuming direct Monte Carlo calculations of the same scene.
Tringe, J. W.; Ileri, N.; Levie, H. W.; ...
2015-08-01
We use Molecular Dynamics and Monte Carlo simulations to examine molecular transport phenomena in nanochannels, explaining the four-orders-of-magnitude difference in wheat germ agglutinin (WGA) protein diffusion rates observed by fluorescence correlation spectroscopy (FCS) and by direct imaging of fluorescently-labeled proteins. We first use the ESPResSo Molecular Dynamics code to estimate the surface transport distance for neutral and charged proteins. We then employ a Monte Carlo model to calculate the paths of protein molecules on surfaces and in the bulk liquid transport medium. Our results show that the transport characteristics depend strongly on the degree of molecular surface coverage. Atomic force microscope characterization of surfaces exposed to WGA proteins for 1000 s shows large protein aggregates consistent with the predicted coverage. These calculations and experiments provide useful insight into the details of molecular motion in confined geometries.
Simulation of radiation damping in rings, using stepwise ray-tracing methods
Meot, F.
2015-06-26
The ray-tracing code Zgoubi computes particle trajectories in arbitrary magnetic and/or electric field maps or analytical field models. It includes a built-in fitting procedure, spin tracking, and many Monte Carlo processes. The accuracy of the integration method makes it an efficient tool for multi-turn tracking in periodic machines. Energy loss by synchrotron radiation, based on Monte Carlo techniques, had been introduced in Zgoubi in the early 2000s for studies regarding the linear collider beam delivery system. However, only recently has this Monte Carlo tool been used for systematic beam dynamics and spin diffusion studies in rings, including the eRHIC electron-ion collider project at the Brookhaven National Laboratory. Some beam dynamics aspects of this recent use of Zgoubi capabilities, including considerations of accuracy as well as further benchmarking in the presence of synchrotron radiation in rings, are reported here.
Monte Carlo Simulations of Arterial Imaging with Optical Coherence Tomography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amendt, P.; Estabrook, K.; Everett, M.
2000-02-01
The laser-tissue interaction code LATIS [London et al., Appl. Optics 36, 9068 (1998)] is used to analyze photon scattering histories representative of optical coherence tomography (OCT) experiments performed at Lawrence Livermore National Laboratory. Monte Carlo photonics with Henyey-Greenstein anisotropic scattering is implemented and used to simulate signal discrimination of intravascular structure. An analytic model is developed and used to obtain a scaling law relation for optimization of the OCT signal and to validate the Monte Carlo photonics. The appropriateness of the Henyey-Greenstein phase function is studied by direct comparison with more detailed Mie scattering theory using an ensemble of spherical dielectric scatterers. Modest differences are found between the two prescriptions for describing photon angular scattering in tissue. In particular, the Mie scattering phase functions provide less overall reflectance signal but more signal contrast compared to the Henyey-Greenstein formulation.
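For context, the Henyey-Greenstein phase function used in the study has a simple closed-form inverse CDF, which is part of what makes it attractive for Monte Carlo photonics. The sketch below samples scattering cosines with the standard formula; the anisotropy value g = 0.9 is a typical tissue-like assumption, and the code is a generic illustration rather than part of LATIS.

    import numpy as np

    # Minimal sketch: draw scattering cosines from the Henyey-Greenstein phase
    # function with anisotropy g, using the standard inverse-CDF formula.

    rng = np.random.default_rng(3)

    def sample_hg_costheta(g, n):
        xi = rng.random(n)
        if abs(g) < 1e-6:
            return 2.0 * xi - 1.0                       # isotropic limit
        t = (1.0 - g * g) / (1.0 - g + 2.0 * g * xi)
        return (1.0 + g * g - t * t) / (2.0 * g)

    mu = sample_hg_costheta(0.9, 100000)
    print("mean cos(theta) ~", mu.mean(), "(should approach g)")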
NOTE: MCDE: a new Monte Carlo dose engine for IMRT
NASA Astrophysics Data System (ADS)
Reynaert, N.; DeSmedt, B.; Coghe, M.; Paelinck, L.; Van Duyse, B.; DeGersem, W.; DeWagter, C.; DeNeve, W.; Thierens, H.
2004-07-01
A new accurate Monte Carlo code for IMRT dose computations, MCDE (Monte Carlo dose engine), is introduced. MCDE is based on BEAMnrc/DOSXYZnrc and consequently the accurate EGSnrc electron transport. DOSXYZnrc is reprogrammed as a component module for BEAMnrc. In this way both codes are interconnected elegantly, while maintaining the BEAM structure, and only minimal changes to BEAMnrc.mortran are necessary. The treatment head of the Elekta SLiplus linear accelerator is modelled in detail. CT grids consisting of up to 200 slices of 512 × 512 voxels can be introduced and up to 100 beams can be handled simultaneously. The beams and CT data are imported from the treatment planning system GRATIS via a DICOM interface. To enable the handling of up to 50 × 10⁶ voxels the system was programmed in Fortran95 to enable dynamic memory management. All region-dependent arrays (dose, statistics, transport arrays) were redefined. A scoring grid was introduced and superimposed on the geometry grid, to be able to limit the number of scoring voxels. The whole system uses approximately 200 MB of RAM and runs on a PC cluster consisting of 38 1.0 GHz processors. A set of in-house scripts handles the parallelization and the centralization of the Monte Carlo calculations on a server. As an illustration of MCDE, a clinical example is discussed and compared with collapsed cone convolution calculations. At present, the system is still rather slow and is intended to be a tool for reliable verification of IMRT treatment planning in the presence of tissue inhomogeneities such as air cavities.
NASA Astrophysics Data System (ADS)
Rodriguez, M.; Brualla, L.
2018-04-01
Monte Carlo simulation of radiation transport is computationally demanding when reasonably low statistical uncertainties of the estimated quantities are required. Therefore, it can benefit to a large extent from high-performance computing. This work is aimed at assessing the performance of the first generation of the many-integrated-core architecture (MIC) Xeon Phi coprocessor with respect to that of a CPU consisting of a double 12-core Xeon processor in Monte Carlo simulation of coupled electron-photon showers. The comparison was made in two ways: first, through a suite of basic tests including parallel versions of the random number generators Mersenne Twister and a modified implementation of RANECU. These tests were intended to establish a baseline comparison between both devices. Second, through the pDPM code developed in this work. pDPM is a parallel version of the Dose Planning Method (DPM) program for fast Monte Carlo simulation of radiation transport in voxelized geometries. A variety of techniques aimed at obtaining large scalability on the Xeon Phi were implemented in pDPM. Maximum scalabilities of 84.2× and 107.5× were obtained on the Xeon Phi for simulations of electron and photon beams, respectively. Nevertheless, in none of the tests involving radiation transport did the Xeon Phi perform better than the CPU. The disadvantage of the Xeon Phi with respect to the CPU is due to the low performance of the single core of the former. A single core of the Xeon Phi was more than 10 times less efficient than a single core of the CPU for all radiation transport simulations.
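For readers unfamiliar with RANECU, the generator named above is L'Ecuyer's combined multiplicative congruential generator; a plain serial form of its recurrence, with the conventional constants, is sketched below. The paper uses a modified, parallel implementation, so this sketch only conveys the underlying recurrence and is not that implementation.

    # Textbook form of L'Ecuyer's RANECU combined generator (serial sketch).

    class Ranecu:
        def __init__(self, seed1=12345, seed2=67890):
            self.s1, self.s2 = seed1, seed2

        def next(self):
            # First multiplicative congruential generator, modulus 2147483563
            k = self.s1 // 53668
            self.s1 = 40014 * (self.s1 - k * 53668) - k * 12211
            if self.s1 < 0:
                self.s1 += 2147483563
            # Second generator, modulus 2147483399
            k = self.s2 // 52774
            self.s2 = 40692 * (self.s2 - k * 52774) - k * 3791
            if self.s2 < 0:
                self.s2 += 2147483399
            # Combine the two streams
            z = self.s1 - self.s2
            if z < 1:
                z += 2147483562
            return z * 4.656613057e-10   # uniform deviate in (0, 1)

    rng = Ranecu()
    print([round(rng.next(), 6) for _ in range(3)])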
Methodology comparison for gamma-heating calculations in material-testing reactors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lemaire, M.; Vaglio-Gaudard, C.; Lyoussi, A.
2015-07-01
The Jules Horowitz Reactor (JHR) is a Material-Testing Reactor (MTR) under construction in the south of France at CEA Cadarache (French Alternative Energies and Atomic Energy Commission). It will typically host about 20 simultaneous irradiation experiments in the core and in the beryllium reflector. These experiments will help us better understand the complex phenomena occurring during the accelerated ageing of materials and the irradiation of nuclear fuels. Gamma heating, i.e. photon energy deposition, is mainly responsible for temperature rise in non-fuelled zones of nuclear reactors, including JHR internal structures and irradiation devices. As temperature is a key parameter for physical models describing the behavior of materials, accurate control of temperature, and hence gamma heating, is required in irradiation devices and samples in order to perform an advanced, suitable analysis of future experimental results. From a broader point of view, JHR global attractiveness as an MTR depends on its ability to monitor experimental parameters with high accuracy, including gamma heating. Strict control of temperature levels is also necessary in terms of safety. As JHR structures are warmed up by gamma heating, they must be appropriately cooled down to prevent creep deformation or melting. Cooling-power sizing is based on calculated levels of gamma heating in the JHR. Due to these safety concerns, accurate calculation of gamma heating with well-controlled bias and associated uncertainty as low as possible is all the more important. There are two main kinds of calculation bias: bias coming from nuclear data on the one hand and bias coming from physical approximations assumed by computer codes and by the general calculation route on the other hand. The former must be determined by comparison between calculation and experimental data; the latter by calculation comparisons between codes and between methodologies. In this presentation, we focus on this latter kind of bias. Nuclear heating is represented by the physical quantity called absorbed dose (energy deposition induced by particle-matter interactions, divided by mass). Its calculation with Monte Carlo codes is possible but computationally expensive as it requires transport simulation of charged particles, along with neutrons and photons. For that reason, the calculation of another physical quantity, called KERMA, is often preferred, as KERMA calculation with Monte Carlo codes only requires transport of neutral particles. However, KERMA is only an estimator of the absorbed dose and many conditions must be fulfilled for KERMA to be equal to absorbed dose, including the so-called condition of electronic equilibrium. Also, Monte Carlo computations of absorbed dose still present some physical approximations, even though there is only a limited number of them. Some of these approximations are linked to the way Monte Carlo codes handle the transport simulation of charged particles and the productive and destructive interactions between photons, electrons and positrons. There exists a huge variety of electromagnetic shower models which tackle this topic. Differences in the implementation of these models can lead to discrepancies in calculated values of absorbed dose between different Monte Carlo codes. The order of magnitude of such potential discrepancies should be quantified for JHR gamma-heating calculations. We consequently present a two-pronged plan.
In a first phase, we intend to perform comparative absorbed dose / KERMA Monte Carlo calculations in the JHR. This way, we will study the presence or absence of electronic equilibrium in the different JHR structures and experimental devices, and we will give recommendations for the choice of KERMA or absorbed dose when calculating gamma heating in the JHR. In a second phase, we intend to perform comparative TRIPOLI4 / MCNP absorbed dose calculations in a simplified JHR-representative geometry. For this comparison, we will use the same nuclear data library for both codes (the European library JEFF3.1.1 and photon library EPDL97) so as to isolate the effects of electromagnetic shower models on the absorbed dose calculation. This way, we hope to get insightful feedback on these models and their implementation in Monte Carlo codes. (authors)
Souris, Kevin; Lee, John Aldo; Sterpin, Edmond
2016-04-01
Accuracy in proton therapy treatment planning can be improved using Monte Carlo (MC) simulations. However, the long computation time of such methods hinders their use in clinical routine. This work aims to develop a fast multipurpose Monte Carlo simulation tool for proton therapy using massively parallel central processing unit (CPU) architectures. A new Monte Carlo, called MCsquare (many-core Monte Carlo), has been designed and optimized for the last generation of Intel Xeon processors and Intel Xeon Phi coprocessors. These massively parallel architectures offer the flexibility and the computational power suitable to MC methods. The class-II condensed history algorithm of MCsquare provides a fast and yet accurate method of simulating heavy charged particles such as protons, deuterons, and alphas inside voxelized geometries. Hard ionizations, with energy losses above a user-specified threshold, are simulated individually while soft events are regrouped in a multiple scattering theory. Elastic and inelastic nuclear interactions are sampled from ICRU 63 differential cross sections, thereby allowing for the computation of prompt gamma emission profiles. MCsquare has been benchmarked with the gate/geant4 Monte Carlo application for homogeneous and heterogeneous geometries. Comparisons with gate/geant4 for various geometries show deviations within 2%-1 mm. In spite of the limited memory bandwidth of the coprocessor, the simulation time is below 25 s for 10⁷ primary 200 MeV protons in average soft tissues using all Xeon Phi and CPU resources embedded in a single desktop unit. MCsquare exploits the flexibility of CPU architectures to provide a multipurpose MC simulation tool. Optimized code enables the use of accurate MC calculation within a reasonable computation time, adequate for clinical practice. MCsquare also simulates prompt gamma emission and can thus also be used for in vivo range verification.
Synthetic neutron camera and spectrometer in JET based on AFSI-ASCOT simulations
NASA Astrophysics Data System (ADS)
Sirén, P.; Varje, J.; Weisen, H.; Koskela, T.; contributors, JET
2017-09-01
The ASCOT Fusion Source Integrator (AFSI) has been used to calculate neutron production rates and spectra corresponding to the JET 19-channel neutron camera (KN3) and the time-of-flight spectrometer (TOFOR) as ideal diagnostics, without detector-related effects. AFSI calculates fusion product distributions in 4D, based on Monte Carlo integration from arbitrary reactant distribution functions. The distribution functions were calculated by the ASCOT Monte Carlo particle orbit following code for thermal, NBI and ICRH particle reactions. Fusion cross-sections were defined based on the Bosch-Hale model and both DD and DT reactions have been included. Neutrons generated by AFSI-ASCOT simulations have already been applied as a neutron source of the Serpent neutron transport code in ITER studies. Additionally, AFSI has been selected to be a main tool as the fusion product generator in the complete analysis calculation chain: ASCOT - AFSI - SERPENT (neutron and gamma transport Monte Carlo code) - APROS (system and power plant modelling code), which encompasses the plasma as an energy source, heat deposition in plant structures as well as cooling and balance-of-plant in DEMO applications and other reactor relevant analyses. This conference paper presents the first results and validation of the AFSI DD fusion model for different auxiliary heating scenarios (NBI, ICRH) with very different fast particle distribution functions. Both calculated quantities (production rates and spectra) have been compared with experimental data from KN3 and synthetic spectrometer data from ControlRoom code. No unexplained differences have been observed. In future work, AFSI will be extended for synthetic gamma diagnostics and additionally, AFSI will be used as part of the neutron transport calculation chain to model real diagnostics instead of ideal synthetic diagnostics for quantitative benchmarking.
Lakshmanan, Manu N.; Greenberg, Joel A.; Samei, Ehsan; Kapadia, Anuj J.
2016-01-01
A scatter imaging technique for the differentiation of cancerous and healthy breast tissue in a heterogeneous sample is introduced in this work. Such a technique has potential utility in intraoperative margin assessment during lumpectomy procedures. In this work, we investigate the feasibility of the imaging method for tumor classification using Monte Carlo simulations and physical experiments. The coded aperture coherent scatter spectral imaging technique was used to reconstruct three-dimensional (3-D) images of breast tissue samples acquired through a single-position snapshot acquisition, without rotation as is required in coherent scatter computed tomography. We perform a quantitative assessment of the accuracy of the cancerous voxel classification using Monte Carlo simulations of the imaging system; describe our experimental implementation of coded aperture scatter imaging; show the reconstructed images of the breast tissue samples; and present segmentations of the 3-D images in order to identify the cancerous and healthy tissue in the samples. From the Monte Carlo simulations, we find that coded aperture scatter imaging is able to reconstruct images of the samples and identify the distribution of cancerous and healthy tissues (i.e., fibroglandular, adipose, or a mix of the two) inside them with a cancerous voxel identification sensitivity, specificity, and accuracy of 92.4%, 91.9%, and 92.0%, respectively. From the experimental results, we find that the technique is able to identify cancerous and healthy tissue samples and reconstruct differential coherent scatter cross sections that are highly correlated with those measured by other groups using x-ray diffraction. Coded aperture scatter imaging has the potential to provide scatter images that automatically differentiate cancerous and healthy tissue inside samples within a time on the order of a minute per slice. PMID:26962543
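As a reminder of how the voxel-level figures quoted above are defined, the short sketch below computes sensitivity, specificity and accuracy from a confusion matrix of cancerous-versus-healthy classifications. The counts are hypothetical and chosen only to give numbers of the same order as those reported; they are not the study's data.

    # Illustrative definitions of the classification metrics quoted in the abstract.

    def classification_metrics(tp, tn, fp, fn):
        """Sensitivity, specificity and accuracy from confusion-matrix counts."""
        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        accuracy = (tp + tn) / (tp + tn + fp + fn)
        return sensitivity, specificity, accuracy

    print(classification_metrics(tp=924, tn=919, fp=81, fn=76))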
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nakao, N.; /SLAC; Taniguchi, S.
Neutron energy spectra were measured behind the lateral shield of the CERF (CERN-EU High Energy Reference Field) facility at CERN with a 120 GeV/c positive hadron beam (a mixture of mainly protons and pions) on a cylindrical copper target (7-cm diameter by 50-cm long). An NE213 organic liquid scintillator (12.7-cm diameter by 12.7-cm long) was located at various longitudinal positions behind shields of 80- and 160-cm thick concrete and 40-cm thick iron. The measurement locations cover an angular range with respect to the beam axis between 13° and 133°. Neutron energy spectra in the energy range between 32 MeV and 380 MeV were obtained by unfolding the measured pulse height spectra with the detector response functions, which have been verified in the neutron energy range up to 380 MeV in separate experiments. Since the source term and experimental geometry in this experiment are well characterized and simple, and results are given in the form of energy spectra, these experimental results are very useful as benchmark data to check the accuracy of simulation codes and nuclear data. Monte Carlo simulations of the experimental set-up were performed with the FLUKA, MARS and PHITS codes. Simulated spectra for the 80-cm thick concrete often agree within the experimental uncertainties. On the other hand, for the 160-cm thick concrete and the iron shield, differences are generally larger than the experimental uncertainties, yet within a factor of 2. Based on source term simulations, the observed discrepancies among simulations of spectra outside the shield can be partially explained by differences in the high-energy hadron production in the copper target.
NASA Astrophysics Data System (ADS)
Yeh, Peter C. Y.; Lee, C. C.; Chao, T. C.; Tung, C. J.
2017-11-01
Intensity-modulated radiation therapy is an effective treatment modality for nasopharyngeal carcinoma. One important aspect of this cancer treatment is the need for an accurate dose algorithm dealing with the complex air/bone/tissue interface in the head-neck region, so as to achieve cure without radiation-induced toxicities. The Acuros XB algorithm explicitly solves the linear Boltzmann transport equation in voxelized volumes to account for tissue heterogeneities such as lungs, bone, air, and soft tissues in the treatment field receiving radiotherapy. With a single-beam setup in phantoms, this algorithm has already been demonstrated to achieve accuracy comparable to that of Monte Carlo simulations. In the present study, five nasopharyngeal carcinoma patients treated with intensity-modulated radiation therapy were examined for their dose distributions calculated using the Acuros XB in the planning target volume and the organs at risk. Corresponding results of Monte Carlo simulations were computed from the electronic portal image data and the BEAMnrc/DOSXYZnrc code. Analysis of dose distributions in terms of the clinical indices indicated that the Acuros XB achieved accuracy comparable to that of Monte Carlo simulations and better than that of the anisotropic analytical algorithm for dose calculations in real patients.
Quantum Monte Carlo Endstation for Petascale Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lubos Mitas
2011-01-26
NCSU research group has been focused on accomplishing the key goals of this initiative: establishing new generation of quantum Monte Carlo (QMC) computational tools as a part of Endstation petaflop initiative for use at the DOE ORNL computational facilities and for use by computational electronic structure community at large; carrying out high accuracy quantum Monte Carlo demonstration projects in application of these tools to the forefront electronic structure problems in molecular and solid systems; expanding the impact of QMC methods and approaches; explaining and enhancing the impact of these advanced computational approaches. In particular, we have developed quantum Monte Carlo code (QWalk, www.qwalk.org) which was significantly expanded and optimized using funds from this support and at present has become an actively used tool in the petascale regime by ORNL researchers and beyond. These developments have been built upon efforts undertaken by the PI's group and collaborators over the period of the last decade. The code was optimized and tested extensively on a number of parallel architectures including the petaflop ORNL Jaguar machine. We have developed and redesigned a number of code modules such as evaluation of wave functions and orbitals, calculations of pfaffians and introduction of backflow coordinates together with overall organization of the code and random walker distribution over multicore architectures. We have addressed several bottlenecks such as load balancing and verified efficiency and accuracy of the calculations with the other groups of the Endstation team. The QWalk package contains about 50,000 lines of high quality object-oriented C++ and also includes interfaces to data files from other conventional electronic structure codes such as Gamess, Gaussian, Crystal and others. This grant supported the PI for one month during summers, a full-time postdoc and partially three graduate students over the grant period; it has resulted in 13 published papers, 15 invited talks and lectures nationally and internationally. My former graduate student and postdoc Dr. Michal Bajdich, who was supported by this grant, is currently a postdoc with ORNL in the group of Dr. F. Reboredo and Dr. P. Kent and is using the developed tools in a number of DOE projects. The QWalk package has become a truly important research tool used by the electronic structure community and has attracted several new developers in other research groups. Our tools use several types of correlated wavefunction approaches (variational, diffusion and reptation methods) and large-scale optimization methods for wavefunctions, and enable the calculation of energy differences such as cohesion and electronic gaps as well as densities and other properties; using multiple runs one can obtain equations of state for given structures and beyond. Our codes use efficient numerical and Monte Carlo strategies (high accuracy numerical orbitals, multi-reference wave functions, highly accurate correlation factors, pairing orbitals, force biased and correlated sampling Monte Carlo), are robustly parallelized and run very efficiently on tens of thousands of cores. Our demonstration applications were focused on challenging research problems in several fields of materials science such as transition metal solids. We note that our study of FeO solid was the first QMC calculation of transition metal oxides at high pressures.
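For readers unfamiliar with the basic machinery underlying codes such as QWalk, the toy sketch below illustrates variational Monte Carlo with Metropolis sampling for a 1D harmonic oscillator and a Gaussian trial wavefunction; it is a pedagogical stand-in, not part of the QWalk package.

```python
# Toy variational Monte Carlo: H = -1/2 d^2/dx^2 + 1/2 x^2 with trial
# wavefunction psi(x) = exp(-alpha x^2). The local energy is
# E_L(x) = alpha + x^2 (1/2 - 2 alpha^2); alpha = 1/2 is exact (E = 0.5).
import numpy as np

def vmc_energy(alpha, n_steps=50_000, step=1.0, seed=0):
    rng = np.random.default_rng(seed)
    x, energies = 0.0, []
    for _ in range(n_steps):
        x_new = x + step * rng.uniform(-1.0, 1.0)
        # Metropolis acceptance with probability |psi(x_new)|^2 / |psi(x)|^2
        if rng.random() < np.exp(-2.0 * alpha * (x_new**2 - x**2)):
            x = x_new
        energies.append(alpha + x**2 * (0.5 - 2.0 * alpha**2))
    return np.mean(energies)

print(vmc_energy(0.5))   # exact trial wavefunction: energy 0.5
print(vmc_energy(0.4))   # non-optimal alpha gives a higher variational energy
```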
Hoshi, M; Hiraoka, M; Hayakawa, N; Sawada, S; Munaka, M; Kuramoto, A; Oka, T; Iwatani, K; Shizuma, K; Hasai, H
1992-11-01
A benchmark test of the Monte Carlo neutron and photon transport code system (MCNP) was performed using a 252Cf fission neutron source to validate the use of the code for the energy spectrum analyses of Hiroshima atomic bomb neutrons. Nuclear data libraries used in the Monte Carlo neutron and photon transport code calculation were ENDF/B-III, ENDF/B-IV, LASL-SUB, and ENDL-73. The neutron moderators used were granite (the main component of which is SiO2, with a small fraction of hydrogen), Newlight [polyethylene with 3.7% boron (natural)], ammonium chloride (NH4Cl), and water (H2O). Each moderator was 65 cm thick. The neutron detectors were gold and nickel foils, which were used to detect thermal and epithermal neutrons (4.9 eV) and fast neutrons (> 0.5 MeV), respectively. Measured activity data from neutron-irradiated gold and nickel foils in these moderators decreased to about 1/1,000th or 1/10,000th, which correspond to about 1,500 m ground distance from the hypocenter in Hiroshima. For both gold and nickel detectors, the measured activities and the calculated values agreed within 10%. The slopes of the depth-yield relations in each moderator, except granite, were similar for neutrons detected by the gold and nickel foils. From the results of these studies, the Monte Carlo neutron and photon transport code was verified to be accurate enough for use with the elements hydrogen, carbon, nitrogen, oxygen, silicon, chlorine, and cadmium, and for the incident 252Cf fission spectrum neutrons.
CGRO Guest Investigator Program
NASA Technical Reports Server (NTRS)
Begelman, Mitchell C.
1997-01-01
The following are highlights from the research supported by this grant: (1) Theory of gamma-ray blazars: We studied the theory of gamma-ray blazars, being among the first investigators to propose that the GeV emission arises from Comptonization of diffuse radiation surrounding the jet, rather than from the synchrotron-self-Compton mechanism. In related work, we uncovered possible connections between the mechanisms of gamma-ray blazars and those of intraday radio variability, and have conducted a general study of the role of Compton radiation drag on the dynamics of relativistic jets. (2) A Nonlinear Monte Carlo code for gamma-ray spectrum formation: We developed, tested, and applied the first Nonlinear Monte Carlo (NLMC) code for simulating gamma-ray production and transfer under much more general (and realistic) conditions than are accessible with other techniques. The present version of the code is designed to simulate conditions thought to be present in active galactic nuclei and certain types of X-ray binaries, and includes the physics needed to model thermal and nonthermal electron-positron pair cascades. Unlike traditional Monte-Carlo techniques, our method can accurately handle highly non-linear systems in which the radiation and particle backgrounds must be determined self-consistently and in which the particle energies span many orders of magnitude. Unlike models based on kinetic equations, our code can handle arbitrary source geometries and relativistic kinematic effects. In its first important application following testing, we showed that popular semi-analytic accretion disk corona models for Seyfert spectra are seriously in error, and demonstrated how the spectra can be simulated if the disk is sparsely covered by localized 'flares'.
NASA Astrophysics Data System (ADS)
Verbeke, Jérôme M.; Petit, Odile; Chebboubi, Abdelhazize; Litaize, Olivier
2018-01-01
Fission modeling in general-purpose Monte Carlo transport codes often relies on average nuclear data provided by international evaluation libraries. As such, only average fission multiplicities are available and correlations between fission neutrons and photons are missing. Whereas uncorrelated fission physics is usually sufficient for standard reactor core and radiation shielding calculations, correlated fission secondaries are required for specialized nuclear instrumentation and detector modeling. For coincidence counting detector optimization for instance, precise simulation of fission neutrons and photons that remain correlated in time from birth to detection is essential. New developments were recently integrated into the Monte Carlo transport code TRIPOLI-4 to model fission physics more precisely, the purpose being to access event-by-event fission events from two different fission models: FREYA and FIFRELIN. TRIPOLI-4 simulations can now be performed, either by connecting via an API to the LLNL fission library including FREYA, or by reading external fission event data files produced by FIFRELIN beforehand. These new capabilities enable us to easily compare results from Monte Carlo transport calculations using the two fission models in a nuclear instrumentation application. In the first part of this paper, broad underlying principles of the two fission models are recalled. We then present experimental measurements of neutron angular correlations for 252Cf(sf) and 240Pu(sf). The correlations were measured for several neutron kinetic energy thresholds. In the latter part of the paper, simulation results are compared to experimental data. Spontaneous fissions in 252Cf and 240Pu are modeled by FREYA or FIFRELIN. Emitted neutrons and photons are subsequently transported to an array of scintillators by TRIPOLI-4 in analog mode to preserve their correlations. Angular correlations between fission neutrons obtained independently from these TRIPOLI-4 simulations, using either FREYA or FIFRELIN, are compared to experimental results. For 240Pu(sf), the measured correlations were used to tune the model parameters.
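The neutron-neutron angular correlations discussed above are typically built from event-by-event output; the hedged sketch below shows the bookkeeping for an opening-angle distribution using randomly generated isotropic toy events rather than actual FREYA or FIFRELIN output.

```python
# Sketch: opening-angle (cos theta) distribution between fission-neutron pairs
# within each event. The events here are synthetic and isotropic, so the
# resulting histogram is flat; real fission data show anisotropy from the
# fragment kinematics.
import numpy as np

rng = np.random.default_rng(1)

def toy_event(mean_multiplicity=3.76):
    """Return unit direction vectors for the neutrons of one toy fission event."""
    n = rng.poisson(mean_multiplicity)
    v = rng.normal(size=(n, 3))
    return v / np.linalg.norm(v, axis=1, keepdims=True) if n else v

def nn_cosines(events):
    """cos(theta) for all neutron pairs within each event."""
    cos = []
    for dirs in events:
        for i in range(len(dirs)):
            for j in range(i + 1, len(dirs)):
                cos.append(float(dirs[i] @ dirs[j]))
    return np.array(cos)

events = [toy_event() for _ in range(10_000)]
hist, edges = np.histogram(nn_cosines(events), bins=20, range=(-1.0, 1.0))
print(hist)
```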
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Yuhe; Mazur, Thomas R.; Green, Olga
Purpose: The clinical commissioning of IMRT subject to a magnetic field is challenging. The purpose of this work is to develop a GPU-accelerated Monte Carlo dose calculation platform based on PENELOPE and then use the platform to validate a vendor-provided MRIdian head model toward quality assurance of clinical IMRT treatment plans subject to a 0.35 T magnetic field. Methods: PENELOPE was first translated from FORTRAN to C++ and the result was confirmed to produce equivalent results to the original code. The C++ code was then adapted to CUDA in a workflow optimized for GPU architecture. The original code was expanded to include voxelized transport with Woodcock tracking, faster electron/positron propagation in a magnetic field, and several features that make gPENELOPE highly user-friendly. Moreover, the vendor-provided MRIdian head model was incorporated into the code in an effort to apply gPENELOPE as both an accurate and rapid dose validation system. A set of experimental measurements were performed on the MRIdian system to examine the accuracy of both the head model and gPENELOPE. Ultimately, gPENELOPE was applied toward independent validation of patient doses calculated by MRIdian’s KMC. Results: An acceleration factor of 152 was achieved in comparison to the original single-thread FORTRAN implementation with the original accuracy being preserved. For 16 treatment plans including stomach (4), lung (2), liver (3), adrenal gland (2), pancreas (2), spleen (1), mediastinum (1), and breast (1), the MRIdian dose calculation engine agrees with gPENELOPE with a mean gamma passing rate of 99.1% ± 0.6% (2%/2 mm). Conclusions: A Monte Carlo simulation platform was developed based on a GPU-accelerated version of PENELOPE. This platform was used to validate that both the vendor-provided head model and fast Monte Carlo engine used by the MRIdian system are accurate in modeling radiation transport in a patient using 2%/2 mm gamma criteria. Future applications of this platform will include dose validation and accumulation, IMRT optimization, and dosimetry system modeling for next generation MR-IGRT systems.
Wang, Yuhe; Mazur, Thomas R.; Green, Olga; Hu, Yanle; Li, Hua; Rodriguez, Vivian; Wooten, H. Omar; Yang, Deshan; Zhao, Tianyu; Mutic, Sasa; Li, H. Harold
2016-01-01
Purpose: The clinical commissioning of IMRT subject to a magnetic field is challenging. The purpose of this work is to develop a GPU-accelerated Monte Carlo dose calculation platform based on PENELOPE and then use the platform to validate a vendor-provided MRIdian head model toward quality assurance of clinical IMRT treatment plans subject to a 0.35 T magnetic field. Methods: PENELOPE was first translated from FORTRAN to C++ and the result was confirmed to produce equivalent results to the original code. The C++ code was then adapted to CUDA in a workflow optimized for GPU architecture. The original code was expanded to include voxelized transport with Woodcock tracking, faster electron/positron propagation in a magnetic field, and several features that make gPENELOPE highly user-friendly. Moreover, the vendor-provided MRIdian head model was incorporated into the code in an effort to apply gPENELOPE as both an accurate and rapid dose validation system. A set of experimental measurements were performed on the MRIdian system to examine the accuracy of both the head model and gPENELOPE. Ultimately, gPENELOPE was applied toward independent validation of patient doses calculated by MRIdian’s KMC. Results: An acceleration factor of 152 was achieved in comparison to the original single-thread FORTRAN implementation with the original accuracy being preserved. For 16 treatment plans including stomach (4), lung (2), liver (3), adrenal gland (2), pancreas (2), spleen (1), mediastinum (1), and breast (1), the MRIdian dose calculation engine agrees with gPENELOPE with a mean gamma passing rate of 99.1% ± 0.6% (2%/2 mm). Conclusions: A Monte Carlo simulation platform was developed based on a GPU-accelerated version of PENELOPE. This platform was used to validate that both the vendor-provided head model and fast Monte Carlo engine used by the MRIdian system are accurate in modeling radiation transport in a patient using 2%/2 mm gamma criteria. Future applications of this platform will include dose validation and accumulation, IMRT optimization, and dosimetry system modeling for next generation MR-IGRT systems. PMID:27370123
Wang, Yuhe; Mazur, Thomas R; Green, Olga; Hu, Yanle; Li, Hua; Rodriguez, Vivian; Wooten, H Omar; Yang, Deshan; Zhao, Tianyu; Mutic, Sasa; Li, H Harold
2016-07-01
The clinical commissioning of IMRT subject to a magnetic field is challenging. The purpose of this work is to develop a GPU-accelerated Monte Carlo dose calculation platform based on PENELOPE and then use the platform to validate a vendor-provided MRIdian head model toward quality assurance of clinical IMRT treatment plans subject to a 0.35 T magnetic field. PENELOPE was first translated from FORTRAN to C++ and the result was confirmed to produce equivalent results to the original code. The C++ code was then adapted to CUDA in a workflow optimized for GPU architecture. The original code was expanded to include voxelized transport with Woodcock tracking, faster electron/positron propagation in a magnetic field, and several features that make gPENELOPE highly user-friendly. Moreover, the vendor-provided MRIdian head model was incorporated into the code in an effort to apply gPENELOPE as both an accurate and rapid dose validation system. A set of experimental measurements were performed on the MRIdian system to examine the accuracy of both the head model and gPENELOPE. Ultimately, gPENELOPE was applied toward independent validation of patient doses calculated by MRIdian's KMC. An acceleration factor of 152 was achieved in comparison to the original single-thread FORTRAN implementation with the original accuracy being preserved. For 16 treatment plans including stomach (4), lung (2), liver (3), adrenal gland (2), pancreas (2), spleen (1), mediastinum (1), and breast (1), the MRIdian dose calculation engine agrees with gPENELOPE with a mean gamma passing rate of 99.1% ± 0.6% (2%/2 mm). A Monte Carlo simulation platform was developed based on a GPU-accelerated version of PENELOPE. This platform was used to validate that both the vendor-provided head model and fast Monte Carlo engine used by the MRIdian system are accurate in modeling radiation transport in a patient using 2%/2 mm gamma criteria. Future applications of this platform will include dose validation and accumulation, IMRT optimization, and dosimetry system modeling for next generation MR-IGRT systems.
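The 2%/2 mm gamma passing rate quoted in the gPENELOPE/MRIdian abstracts above can be illustrated with a simplified one-dimensional global gamma computation; clinical gamma analysis is three-dimensional and includes interpolation and dose thresholds, so the sketch below is schematic only, with synthetic dose profiles.

```python
# Simplified 1D global gamma index (2%/2 mm) on toy dose profiles.
import numpy as np

def gamma_1d(x, dose_ref, dose_eval, dd=0.02, dta=2.0):
    """Return the gamma value at each reference point (global dose criterion)."""
    dmax = dose_ref.max()
    gammas = np.empty_like(dose_ref)
    for i, (xi, di) in enumerate(zip(x, dose_ref)):
        term_dose = (dose_eval - di) / (dd * dmax)   # dose difference term
        term_dist = (x - xi) / dta                   # distance-to-agreement term
        gammas[i] = np.sqrt(term_dose**2 + term_dist**2).min()
    return gammas

x = np.linspace(0.0, 100.0, 201)                     # positions in mm
dose_ref = np.exp(-((x - 50.0) / 20.0) ** 2)         # toy reference profile
dose_eval = np.exp(-((x - 50.5) / 20.0) ** 2)        # toy evaluated profile, shifted 0.5 mm
g = gamma_1d(x, dose_ref, dose_eval)
print(f"gamma passing rate: {np.mean(g <= 1.0):.1%}")
```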
Direct simulation Monte Carlo method for gas flows in micro-channels with bends with added curvature
NASA Astrophysics Data System (ADS)
Tisovský, Tomáš; Vít, Tomáš
Gas flows in micro-channels are simulated using dsmcFOAM, an open source Direct Simulation Monte Carlo (DSMC) code for general application to rarefied gas flows, written within the framework of the open source C++ toolbox OpenFOAM. The aim of this paper is to investigate the flow in a micro-channel bend with added curvature. Results are compared with flows in a channel without added curvature and in an equivalent straight channel. Effects of micro-channel bends were already thoroughly investigated by White et al. The geometry proposed by White is also used here for reference.
New Tools to Prepare ACE Cross-section Files for MCNP Analytic Test Problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.
Monte Carlo calculations using one-group cross sections, multigroup cross sections, or simple continuous energy cross sections are often used to: (1) verify production codes against known analytical solutions, (2) verify new methods and algorithms that do not involve detailed collision physics, (3) compare Monte Carlo calculation methods with deterministic methods, and (4) teach fundamentals to students. In this work we describe two new tools for preparing the ACE cross-section files to be used by MCNP® for these analytic test problems, simple_ace.pl and simple_ace_mg.pl.
Resonant scattering experiments with radioactive nuclear beams - Recent results and future plans
DOE Office of Scientific and Technical Information (OSTI.GOV)
Teranishi, T.; Sakaguchi, S.; Uesaka, T.
2013-04-19
Resonant scattering with low-energy radioactive nuclear beams of E < 5 MeV/u has been studied at CRIB of CNS and at RIPS of RIKEN. As an extension to the present experimental technique, we will install an advanced polarized proton target for resonant scattering experiments. A Monte-Carlo simulation was performed to study the feasibility of future experiments with the polarized target. In the Monte-Carlo simulation, excitation functions and analyzing powers were calculated using a newly developed R-matrix calculation code. A project of a small-scale radioactive beam facility at Kyushu University is also briefly described.
Monte Carlo errors with less errors
NASA Astrophysics Data System (ADS)
Wolff, Ulli; Alpha Collaboration
2004-01-01
We explain in detail how to estimate mean values and assess statistical errors for arbitrary functions of elementary observables in Monte Carlo simulations. The method is to estimate and sum the relevant autocorrelation functions, which is argued to produce more certain error estimates than binning techniques and hence to help toward a better exploitation of expensive simulations. An effective integrated autocorrelation time is computed which is suitable to benchmark efficiencies of simulation algorithms with regard to specific observables of interest. A Matlab code is offered for download that implements the method. It can also combine independent runs (replica), allowing their consistency to be judged.
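A stripped-down version of the autocorrelation-based error estimate described above is sketched below; the published Matlab implementation additionally handles derived observables and replica, which are omitted here, and the chain used is a synthetic AR(1) series rather than real simulation data.

```python
# Simplified integrated autocorrelation time with automatic windowing (W ~ c*tau),
# applied to a toy AR(1) "Monte Carlo" chain with known tau_int ≈ 9.5.
import numpy as np

def tau_int(data, c=6.0):
    a = np.asarray(data, dtype=float)
    a -= a.mean()
    n = len(a)
    var = np.mean(a * a)
    tau = 0.5
    for w in range(1, n // 2):
        rho = np.mean(a[:n - w] * a[w:]) / var   # normalized autocorrelation at lag w
        tau += rho
        if w >= c * tau:                         # self-consistent window cut-off
            break
    return tau

rng = np.random.default_rng(0)
x, chain = 0.0, []
for _ in range(100_000):
    x = 0.9 * x + rng.normal()
    chain.append(x)

tau = tau_int(chain)
err = np.sqrt(2.0 * tau / len(chain) * np.var(chain))   # error of the mean
print(f"tau_int ≈ {tau:.1f}, error of mean ≈ {err:.4f}")
```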
NASA Astrophysics Data System (ADS)
Krása, A.; Majerle, M.; Krízek, F.; Wagner, V.; Kugler, A.; Svoboda, O.; Henzl, V.; Henzlová, D.; Adam, J.; Caloun, P.; Kalinnikov, V. G.; Krivopustov, M. I.; Stegailov, V. I.; Tsoupko-Sitnikov, V. M.
2006-05-01
Relativistic protons with energies 0.7-1.5 GeV interacting with a thick, cylindrical, lead target, surrounded by a uranium blanket and a polyethylene moderator, produced spallation neutrons. The spatial and energetic distributions of the produced neutron field were measured by the Activation Analysis Method using Al, Au, Bi, and Co radio-chemical sensors. The experimental yields of isotopes induced in the sensors were compared with Monte-Carlo calculations performed with the MCNPX 2.4.0 code.
Robatjazi, Mostafa; Baghani, Hamid Reza; Mahdavic, Seied Rabi; Felici, Giuseppe
2018-05-01
A shielding disk is used in IOERT procedures to absorb radiation behind the target and protect underlying healthy tissues. Setup variation of the shielding disk can affect the corresponding in-vivo dose distribution. In this study, the changes in dosimetric parameters due to disk setup variations are evaluated using the EGSnrc Monte Carlo (MC) code. The results can help the treatment team decide on the level of accuracy required in the setup procedure and on the dose delivered to the target volume during IOERT.
A novel Monte Carlo algorithm for simulating crystals with McStas
NASA Astrophysics Data System (ADS)
Alianelli, L.; Sánchez del Río, M.; Felici, R.; Andersen, K. H.; Farhi, E.
2004-07-01
We developed an original Monte Carlo algorithm for the simulation of Bragg diffraction by mosaic, bent and gradient crystals. It has practical applications, as it can be used for simulating imperfect crystals (monochromators, analyzers and perhaps samples) in neutron ray-tracing packages such as McStas. The code described here provides a detailed description of the particle interaction with the microscopic homogeneous regions composing the crystal; it can therefore also be used for the calculation of quantities of conceptual interest, such as multiple scattering, or for the interpretation of experiments aimed at characterizing crystals, such as diffraction topographs.
Present Status and Extensions of the Monte Carlo Performance Benchmark
NASA Astrophysics Data System (ADS)
Hoogenboom, J. Eduard; Petrovic, Bojan; Martin, William R.
2014-06-01
The NEA Monte Carlo Performance benchmark started in 2011 aiming to monitor over the years the abilities to perform a full-size Monte Carlo reactor core calculation with a detailed power production for each fuel pin with axial distribution. This paper gives an overview of the contributed results thus far. It shows that reaching a statistical accuracy of 1 % for most of the small fuel zones requires about 100 billion neutron histories. The efficiency of parallel execution of Monte Carlo codes on a large number of processor cores shows clear limitations for computer clusters with common type computer nodes. However, using true supercomputers the speedup of parallel calculations is increasing up to large numbers of processor cores. More experience is needed from calculations on true supercomputers using large numbers of processors in order to predict if the requested calculations can be done in a short time. As the specifications of the reactor geometry for this benchmark test are well suited for further investigations of full-core Monte Carlo calculations and a need is felt for testing other issues than its computational performance, proposals are presented for extending the benchmark to a suite of benchmark problems for evaluating fission source convergence for a system with a high dominance ratio, for coupling with thermal-hydraulics calculations to evaluate the use of different temperatures and coolant densities and to study the correctness and effectiveness of burnup calculations. Moreover, other contemporary proposals for a full-core calculation with realistic geometry and material composition will be discussed.
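The quoted figure of roughly 100 billion histories for 1% per-pin statistics follows from the usual 1/sqrt(N) scaling of Monte Carlo uncertainties; the pilot-run numbers in the sketch below are assumed for illustration only and are not taken from the benchmark specification.

```python
# Scale a pilot run to a target relative error using sigma ∝ 1/sqrt(N).
def histories_needed(sigma_target, sigma_obs, n_obs):
    """Histories required to reach sigma_target, given a pilot run of n_obs
    histories with observed relative error sigma_obs."""
    return n_obs * (sigma_obs / sigma_target) ** 2

# Hypothetical pilot run: 1e9 histories giving 10% relative error in a small fuel zone.
print(f"{histories_needed(0.01, 0.10, 1e9):.2e} histories")   # -> 1e11, i.e. ~100 billion
```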
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cleveland, Mathew A., E-mail: cleveland7@llnl.gov; Brunner, Thomas A.; Gentile, Nicholas A.
2013-10-15
We describe and compare different approaches for achieving numerical reproducibility in photon Monte Carlo simulations. Reproducibility is desirable for code verification, testing, and debugging. Parallelism creates a unique problem for achieving reproducibility in Monte Carlo simulations because it changes the order in which values are summed. This is a numerical problem because double precision arithmetic is not associative. Parallel Monte Carlo, both domain replicated and decomposed simulations, will run their particles in a different order during different runs of the same simulation because of the non-reproducibility of communication between processors. In addition, runs of the same simulation using different domain decompositions will also result in particles being simulated in a different order. In [1], a way of eliminating non-associative accumulations using integer tallies was described. This approach successfully achieves reproducibility at the cost of lost accuracy by rounding double precision numbers to fewer significant digits. This integer approach, and other extended and reduced precision reproducibility techniques, are described and compared in this work. Increased precision alone is not enough to ensure reproducibility of photon Monte Carlo simulations. Non-arbitrary precision approaches require a varying degree of rounding to achieve reproducibility. For the problems investigated in this work, double precision global accuracy was achievable by using 100 bits of precision or greater on all unordered sums, which were subsequently rounded to double precision at the end of every time-step.
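The reproducibility problem and the integer-tally workaround described above can be demonstrated in a few lines: floating-point sums depend on summation order, whereas fixed-point integer accumulation does not. The scale factor below is an arbitrary choice for illustration, not the value used in any production code.

```python
# Order-dependence of floating-point sums vs. order-independent integer tallies.
import random

random.seed(0)
scores = [random.uniform(0.0, 1.0) * 10.0 ** random.randint(-8, 8) for _ in range(100_000)]
shuffled = random.sample(scores, len(scores))        # same values, different order

print(sum(scores) - sum(shuffled))                   # generally nonzero: order matters

SCALE = 2 ** 40                                      # fixed-point resolution (arbitrary choice)
def fixed_point_sum(values):
    # Rounding to fixed point loses some accuracy, but the integer sum is exact
    # and therefore independent of summation order.
    return sum(int(round(v * SCALE)) for v in values) / SCALE

print(fixed_point_sum(scores) - fixed_point_sum(shuffled))   # exactly zero
```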
NASA Astrophysics Data System (ADS)
Zhou, Abel; White, Graeme L.; Davidson, Rob
2018-02-01
Anti-scatter grids are commonly used in x-ray imaging systems to reduce scatter radiation reaching the image receptor. Anti-scatter grid performance can be simulated and validated using Monte Carlo (MC) methods. Our recently reported work has modified existing MC codes, resulting in improved performance when simulating x-ray imaging. The aim of this work is to validate the transmission of x-ray photons in grids from the recently reported new MC codes against experimental results and results previously reported in other literature. The results of this work show that the scatter-to-primary ratio (SPR) and the transmissions of primary (Tp), scatter (Ts), and total (Tt) radiation determined using this new MC code system have strong agreement with the experimental results and the results reported in the literature. Tp, Ts, Tt, and SPR determined in this new MC simulation code system are valid. These results also show that the interference effect on Rayleigh scattering should not be neglected in the evaluation of either mammographic or general grids. Our new MC simulation code system has been shown to be valid and can be used for analysing and evaluating the designs of grids.
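For reference, the grid metrics discussed above follow directly from tallied primary and scatter photon counts with and without the grid in place; the counts in the sketch below are hypothetical, not results from the cited simulations.

```python
# Grid transmission factors and scatter-to-primary ratios from tallied counts.
def grid_metrics(p_no_grid, s_no_grid, p_grid, s_grid):
    tp = p_grid / p_no_grid                              # primary transmission Tp
    ts = s_grid / s_no_grid                              # scatter transmission Ts
    tt = (p_grid + s_grid) / (p_no_grid + s_no_grid)     # total transmission Tt
    spr_no_grid = s_no_grid / p_no_grid                  # SPR without grid
    spr_grid = s_grid / p_grid                           # SPR with grid
    return tp, ts, tt, spr_no_grid, spr_grid

# Hypothetical tallies
print(grid_metrics(p_no_grid=1.0e6, s_no_grid=8.0e5, p_grid=7.2e5, s_grid=1.0e5))
```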
Design and optimization of a portable LQCD Monte Carlo code using OpenACC
NASA Astrophysics Data System (ADS)
Bonati, Claudio; Coscetti, Simone; D'Elia, Massimo; Mesiti, Michele; Negro, Francesco; Calore, Enrico; Schifano, Sebastiano Fabio; Silvi, Giorgio; Tripiccione, Raffaele
The present panorama of HPC architectures is extremely heterogeneous, ranging from traditional multi-core CPU processors, supporting a wide class of applications but delivering moderate computing performance, to many-core Graphics Processor Units (GPUs), exploiting aggressive data-parallelism and delivering higher performances for streaming computing applications. In this scenario, code portability (and performance portability) becomes necessary for easy maintainability of applications; this is very relevant in scientific computing where code changes are very frequent, making it tedious and prone to error to keep different code versions aligned. In this work, we present the design and optimization of a state-of-the-art production-level LQCD Monte Carlo application, using the directive-based OpenACC programming model. OpenACC abstracts parallel programming to a descriptive level, relieving programmers from specifying how codes should be mapped onto the target architecture. We describe the implementation of a code fully written in OpenACC, and show that we are able to target several different architectures, including state-of-the-art traditional CPUs and GPUs, with the same code. We also measure performance, evaluating the computing efficiency of our OpenACC code on several architectures, comparing with GPU-specific implementations and showing that a good level of performance-portability can be reached.
PyMC: Bayesian Stochastic Modelling in Python
Patil, Anand; Huard, David; Fonnesbeck, Christopher J.
2010-01-01
This user guide describes a Python package, PyMC, that allows users to efficiently code a probabilistic model and draw samples from its posterior distribution using Markov chain Monte Carlo techniques. PMID:21603108
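A minimal PyMC model is sketched below for orientation; note that it uses the syntax of current PyMC releases, which differs from the 2010 API described in the user guide, and the data are synthetic.

```python
# Minimal Bayesian model: infer the mean and standard deviation of Gaussian data.
import numpy as np
import pymc as pm

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.5, size=100)      # synthetic observations

with pm.Model():
    mu = pm.Normal("mu", mu=0.0, sigma=10.0)         # prior on the mean
    sigma = pm.HalfNormal("sigma", sigma=5.0)        # prior on the std. deviation
    pm.Normal("obs", mu=mu, sigma=sigma, observed=data)
    idata = pm.sample(1000, tune=1000, chains=2)     # MCMC sampling of the posterior

print(idata.posterior["mu"].mean().item())           # posterior mean of mu
```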
An improved target velocity sampling algorithm for free gas elastic scattering
DOE Office of Scientific and Technical Information (OSTI.GOV)
Romano, Paul K.; Walsh, Jonathan A.
We present an improved algorithm for sampling the target velocity when simulating elastic scattering in a Monte Carlo neutron transport code that correctly accounts for the energy dependence of the scattering cross section. The algorithm samples the relative velocity directly, thereby avoiding a potentially inefficient rejection step based on the ratio of cross sections. Here, we have shown that this algorithm requires only one rejection step, whereas other methods of similar accuracy require two rejection steps. The method was verified against stochastic and deterministic reference results for upscattering percentages in 238U. Simulations of a light water reactor pin cell problem demonstrate that using this algorithm results in a 3% or less penalty in performance when compared with an approximate method that is used in most production Monte Carlo codes.
An improved target velocity sampling algorithm for free gas elastic scattering
Romano, Paul K.; Walsh, Jonathan A.
2018-02-03
We present an improved algorithm for sampling the target velocity when simulating elastic scattering in a Monte Carlo neutron transport code that correctly accounts for the energy dependence of the scattering cross section. The algorithm samples the relative velocity directly, thereby avoiding a potentially inefficient rejection step based on the ratio of cross sections. Here, we have shown that this algorithm requires only one rejection step, whereas other methods of similar accuracy require two rejection steps. The method was verified against stochastic and deterministic reference results for upscattering percentages in 238U. Simulations of a light water reactor pin cell problem demonstrate that using this algorithm results in a 3% or less penalty in performance when compared with an approximate method that is used in most production Monte Carlo codes.
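For context, the sketch below implements the conventional free-gas target-velocity sampling that the abstract contrasts with: a constant-cross-section kernel sampled with one kinematic rejection step, plus a second rejection on the cross-section ratio to restore the energy dependence. The improved algorithm that samples the relative velocity directly is not reproduced here, and the toy cross section and parameter values are assumptions for illustration.

```python
# Conventional two-rejection free-gas sampling of the target speed V and the
# cosine mu between the neutron and target velocities. The collision density is
# proportional to sigma(v_r) * v_r * Maxwellian(V), with v_r the relative speed.
import math
import random

def sample_target_velocity(v_n, kT_over_m, sigma, sigma_max):
    mean_speed = 2.0 * math.sqrt(2.0 * kT_over_m / math.pi)   # <V> of the Maxwellian
    while True:
        # Envelope proportional to (v_n + V) * Maxwellian(V), with mu uniform.
        if random.random() < v_n / (v_n + mean_speed):
            # Branch 1: V from the Maxwellian speed distribution (norm of 3 Gaussians).
            comps = [random.gauss(0.0, math.sqrt(kT_over_m)) for _ in range(3)]
            V = math.sqrt(sum(c * c for c in comps))
        else:
            # Branch 2: V from a density proportional to V * Maxwellian(V).
            u1, u2 = 1.0 - random.random(), 1.0 - random.random()
            V = math.sqrt(-2.0 * kT_over_m * (math.log(u1) + math.log(u2)))
        mu = 2.0 * random.random() - 1.0
        v_r = math.sqrt(v_n * v_n + V * V - 2.0 * v_n * V * mu)
        if random.random() > v_r / (v_n + V):            # rejection 1: kinematic weight
            continue
        if random.random() > sigma(v_r) / sigma_max:     # rejection 2: cross-section ratio
            continue
        return V, mu

# Toy usage: a smooth resonance-like cross section with a known maximum (assumed values).
sigma = lambda v: 1.0 + 0.5 * math.exp(-(v - 2.0) ** 2)
print(sample_target_velocity(v_n=3.0, kT_over_m=1.0, sigma=sigma, sigma_max=1.5))
```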
Features of MCNP6 Relevant to Medical Radiation Physics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hughes, H. Grady III; Goorley, John T.
2012-08-29
MCNP (Monte Carlo N-Particle) is a general-purpose Monte Carlo code for simulating the transport of neutrons, photons, electrons, positrons, and more recently other fundamental particles and heavy ions. Over many years MCNP has found a wide range of applications in many different fields, including medical radiation physics. In this presentation we will describe and illustrate a number of significant recently-developed features in the current version of the code, MCNP6, having particular utility for medical physics. Among these are major extensions of the ability to simulate large, complex geometries, improvement in memory requirements and speed for large lattices, introduction of mesh-based isotopic reaction tallies, advances in radiography simulation, expanded variance-reduction capabilities, especially for pulse-height tallies, and a large number of enhancements in photon/electron transport.
Dose coefficients for liver chemoembolisation procedures using Monte Carlo code.
Karavasilis, E; Dimitriadis, A; Gonis, H; Pappas, P; Georgiou, E; Yakoumakis, E
2016-12-01
The aim of the present study is the estimation of radiation burden during liver chemoembolisation procedures. Organ dose and effective dose conversion factors, normalised to dose-area product (DAP), were estimated for chemoembolisation procedures using a Monte Carlo transport code in conjunction with an adult mathematical phantom. Exposure data from 32 patients were used to determine the exposure projections for the simulations. Equivalent organ (HT) and effective (E) doses were estimated using individual DAP values. The organs receiving the highest doses during these exams were the lumbar spine, liver and kidneys. The mean effective dose conversion factor was 1.4 Sv Gy^-1 m^-2. Dose conversion factors can be useful for estimating patient-specific radiation burden during chemoembolisation procedures.
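Applying the DAP-normalised coefficient reported above is a one-line calculation; the DAP value in the sketch below is hypothetical.

```python
# Effective dose estimate from a measured dose-area product (DAP).
E_PER_DAP = 1.4          # Sv Gy^-1 m^-2, mean conversion factor quoted in the abstract
dap = 0.015              # Gy m^2 (= 150 Gy cm^2), hypothetical DAP for one procedure
print(f"Estimated effective dose: {E_PER_DAP * dap * 1e3:.1f} mSv")
```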
Study of the impact of artificial articulations on the dose distribution under medical irradiation
NASA Astrophysics Data System (ADS)
Buffard, E.; Gschwind, R.; Makovicka, L.; Martin, E.; Meunier, C.; David, C.
2005-02-01
Perturbations due to the presence of high density heterogeneities in the body are not correctly taken into account in the Treatment Planning Systems currently available for external radiotherapy. For this reason, the accuracy of the dose distribution calculations has to be improved by using Monte Carlo simulations. In a previous study, we established a theoretical model by using the Monte Carlo code EGSnrc [I. Kawrakow, D.W.O. Rogers, The EGSnrc code system: MC simulation of electron and photon transport. Technical Report PIRS-701, NRCC, Ottawa, Canada, 2000] in order to obtain the dose distributions around simple heterogeneities. These simulations were then validated by experimental results obtained with thermoluminescent dosemeters and an ionisation chamber. The influence of samples composed of hip prosthesis materials (titanium alloy and steel) and of a bone substitute was notably studied. A more complex model was then developed with the Monte Carlo code BEAMnrc [D.W.O. Rogers, C.M. MA, G.X. Ding, B. Walters, D. Sheikh-Bagheri, G.G. Zhang, BEAMnrc Users Manual. NRC Report PPIRS 509(a) rev F, 2001] in order to take into account the hip prosthesis geometry. The simulation results were compared to experimental measurements performed in a water phantom, in the case of a standard treatment of a pelvic cancer for one of the beams passing through the implant. These results have shown the strong influence of the prostheses on the dose distribution.
SU-F-T-657: In-Room Neutron Dose From High Energy Photon Beams
DOE Office of Scientific and Technical Information (OSTI.GOV)
Christ, D; Ding, G
Purpose: To estimate neutron dose inside the treatment room from photodisintegration events in high energy photon beams using Monte Carlo simulations and experimental measurements. Methods: The Monte Carlo code MCNP6 was used for the simulations. An Eberline ESP-1 Smart Portable Neutron Detector was used to measure neutron dose. A water phantom was centered at isocenter on the treatment couch, and the detector was placed near the phantom. A Varian 2100EX linear accelerator delivered an 18 MV open field photon beam to the phantom at 400 MU/min, and a camera captured the detector readings. The experimental setup was modeled in the Monte Carlo simulation. The source was modeled for two extreme cases: a) hemispherical photon source emitting from the target and b) cone source with an angle of the primary collimator cone. The model includes the target, primary collimator, flattening filter, secondary collimators, water phantom, detector and concrete walls. Energy deposition tallies were measured for neutrons in the detector and for photons at the center of the phantom. Results: For an 18 MV beam with an open 10 cm by 10 cm field and the gantry at 180°, the Monte Carlo simulations predict the neutron dose in the detector to be 0.11% of the photon dose in the water phantom for case a) and 0.01% for case b). The measured neutron dose is 0.04% of the photon dose. Considering the range of neutron dose predicted by Monte Carlo simulations, the calculated results are in good agreement with measurements. Conclusion: We calculated in-room neutron dose by using Monte Carlo techniques, and the predicted neutron dose is confirmed by experimental measurements. If we remodel the source as an electron beam hitting the target for a more accurate representation of the bremsstrahlung fluence, it is feasible that the Monte Carlo simulations can be used to help in shielding designs.
NASA Astrophysics Data System (ADS)
Lee, A.; Jung, N. S.; Mokhtari Oranj, L.; Lee, H. S.
2018-06-01
The leakage of radioactive materials generated at particle accelerator facilities is one of the important issues from the viewpoint of radiation safety. In this study, fire and flooding at particle accelerator facilities were considered as non-radiation disasters that can result in the leakage of radioactive materials. To analyse the expected effects of each disaster, a case study on fire and flooding at particle accelerator facilities was carried out, combining an investigation of the properties of the materials of interest present in the accelerator tunnel with an estimation of their activity. Five major materials in the tunnel were investigated: dust, insulators, concrete, metals and paints. The activation levels of the materials of concern were calculated using several Monte Carlo codes (MCNPX 2.7+SP-FISPACT 2007, FLUKA 2011.4c and PHITS 2.64+DCHAIN-SP 2001). The impact weight to the environment was estimated for different beam particles (electron, proton, carbon and uranium) and different beam energies (100, 430, 600 and 1000 MeV/nucleon). Taking into account the leakage paths of radioactive materials due to fire and flooding, the activation levels of the selected materials and the impacts to the environment were evaluated. In the case of flooding, dust, concrete and metal were found to be of major concern. In the case of a fire event, dust, insulators and paint were the major concerns. As expected, the influence of normal fire and flooding at electron accelerator facilities would be relatively low in both cases.
NASA Astrophysics Data System (ADS)
Horst, Felix; Fehrenbacher, Georg; Radon, Torsten; Kozlova, Ekaterina; Rosmej, Olga; Czarnecki, Damian; Schrenk, Oliver; Breckow, Joachim; Zink, Klemens
2015-05-01
This work presents a thermoluminescence dosimetry based method for the measurement of bremsstrahlung spectra in the energy range from 30 keV to 100 MeV, resolved into ten different energy intervals, and for photon ambient dosimetry in ultrashort pulsed radiation fields such as those generated during operation of the PHELIX laser at the GSI Helmholtzzentrum für Schwerionenforschung. The method is a routine-oriented development based on a multi-filter technique; the data analysis takes around 1 h. The spectral information is obtained by unfolding the response of ten thermoluminescence dosimeters, stacked with absorbers of different materials and thicknesses so that each has a different response function to photon radiation. These response functions were simulated with the Monte Carlo code FLUKA. An algorithm was developed to unfold bremsstrahlung spectra from the readings of the ten dosimeters. The method has been validated by measurements at a clinical electron linear accelerator (6 MV and 18 MV bremsstrahlung). First measurements at the PHELIX laser system were carried out in December 2013 and January 2014. Spectra with photon energies up to 10 MeV and mean energies up to 420 keV were observed at laser intensities around 10^19 W/cm^2 on a titanium foil target. The measurement results imply that the steel walls of the target chamber might be an additional bright x-ray source.
Simulated Response of a Tissue-equivalent Proportional Counter on the Surface of Mars.
Northum, Jeremy D; Guetersloh, Stephen B; Braby, Leslie A; Ford, John R
2015-10-01
Uncertainties persist regarding the assessment of the carcinogenic risk associated with galactic cosmic ray (GCR) exposure during a mission to Mars. The GCR spectrum peaks in the range of 300 MeV n^-1 to 700 MeV n^-1 and is comprised of elemental ions from H to Ni. While Fe ions represent only 0.03% of the GCR spectrum in terms of particle abundance, they are responsible for nearly 30% of the dose equivalent in free space. Because of this, radiation biology studies focusing on understanding the biological effects of GCR exposure generally use Fe ions. Acting as a thin shield, the Martian atmosphere alters the GCR spectrum in a manner that significantly reduces the importance of Fe ions. Additionally, albedo particles emanating from the regolith complicate the radiation environment. The present study uses the Monte Carlo code FLUKA to simulate the response of a tissue-equivalent proportional counter on the surface of Mars to produce dosimetry quantities and microdosimetry distributions. The dose equivalent rate on the surface of Mars was found to be 0.18 Sv y^-1 with an average quality factor of 2.9 and a dose mean lineal energy of 18.4 keV μm^-1. Additionally, albedo neutrons were found to account for 25% of the dose equivalent. It is anticipated that these data will provide relevant starting points for use in future risk assessment and mission planning studies.
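The microdosimetric means and the dose-equivalent relation mentioned above can be written out explicitly; the event sizes in the sketch below are synthetic and do not represent the simulated Mars-surface spectrum.

```python
# Frequency-mean and dose-mean lineal energy from a toy list of single-event
# lineal energies y (keV/um), plus the relation H = Q * D between dose
# equivalent, mean quality factor, and absorbed dose.
import numpy as np

rng = np.random.default_rng(0)
y = rng.lognormal(mean=1.0, sigma=1.0, size=100_000)   # synthetic event sizes, keV/um

y_f = y.mean()                      # frequency-mean lineal energy
y_d = (y ** 2).mean() / y.mean()    # dose-mean lineal energy
print(f"y_F = {y_f:.2f} keV/um, y_D = {y_d:.2f} keV/um")

# Absorbed dose rate implied by the abstract's H = 0.18 Sv/y and Q = 2.9.
D_rate = 0.18 / 2.9                 # Gy/y
print(f"H = Q * D = {2.9 * D_rate:.2f} Sv/y")
```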