Science.gov

Sample records for carlo photon transport

  1. Condensed history Monte Carlo methods for photon transport problems

    PubMed Central

    Bhan, Katherine; Spanier, Jerome

    2007-01-01

    We study methods for accelerating Monte Carlo simulations that retain most of the accuracy of conventional Monte Carlo algorithms. These methods – called Condensed History (CH) methods – have been very successfully used to model the transport of ionizing radiation in turbid systems. Our primary objective is to determine whether or not such methods might apply equally well to the transport of photons in biological tissue. In an attempt to unify the derivations, we invoke results obtained first by Lewis, Goudsmit and Saunderson and later improved by Larsen and Tolar. We outline how two of the most promising of the CH models – one based on satisfying certain similarity relations and the second making use of a scattering phase function that permits only discrete directional changes – can be developed using these approaches. The main idea is to exploit the connection between the space-angle moments of the radiance and the angular moments of the scattering phase function. We compare the results obtained when the two CH models studied are used to simulate an idealized tissue transport problem. The numerical results support our findings based on the theoretical derivations and suggest that CH models should play a useful role in modeling light-tissue interactions. PMID:18548128
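
    One of the CH models above rests on similarity relations between angular moments of the scattering phase function. Below is a minimal, hedged sketch of the first-order similarity transformation; the tissue-like coefficient values are assumptions for illustration, not numbers from the paper.

```python
# Minimal sketch (not the authors' code) of the first-order similarity relation
# underlying one class of condensed-history models: a forward-peaked phase function
# with scattering coefficient mu_s and anisotropy g = <cos(theta)> is replaced by a
# less peaked one with a reduced coefficient that preserves the first angular moment
# of the scattering operator, mu_s' = mu_s * (1 - g).
def reduced_scattering(mu_s, g):
    """Similarity-transformed (reduced) scattering coefficient."""
    return mu_s * (1.0 - g)

# Illustrative tissue-like values (assumed for this example only).
mu_s = 10.0   # scattering coefficient, mm^-1
g = 0.9       # anisotropy factor
print(reduced_scattering(mu_s, g))   # approx. 1.0 mm^-1
```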

  2. Modelling photon transport in non-uniform media for SPECT with a vectorized Monte Carlo code.

    PubMed

    Smith, M F

    1993-10-01

    A vectorized Monte Carlo code has been developed for modelling photon transport in non-uniform media for single-photon-emission computed tomography (SPECT). The code is designed to compute photon detection kernels, which are used to build system matrices for simulating SPECT projection data acquisition and for use in matrix-based image reconstruction. Non-uniform attenuating and scattering regions are constructed from simple three-dimensional geometric shapes, in which the density and mass attenuation coefficients are individually specified. On a Stellar GS1000 computer, Monte Carlo simulations are performed between 1.6 and 2.0 times faster when the vector processor is utilized than when computations are performed in scalar mode. Projection data acquired with a clinical SPECT gamma camera for a line source in a non-uniform thorax phantom are well modelled by Monte Carlo simulations. The vectorized Monte Carlo code was used to simulate a 99Tcm SPECT myocardial perfusion study, and compensations for non-uniform attenuation and the detection of scattered photons improve activity estimation. The speed increase due to vectorization makes Monte Carlo simulation more attractive as a tool for modelling photon transport in non-uniform media for SPECT. PMID:8248288
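
    The detection kernels above depend on line integrals of the attenuation coefficient through the non-uniform phantom. The sketch below shows that single ingredient; the coefficients are illustrative assumptions, not values from the paper.

```python
import math

# Hedged sketch (not the paper's code): survival probability of a photon traversing
# a sequence of homogeneous regions, the basic quantity entering attenuation-dependent
# detection kernels. Each (mu, d) pair is a linear attenuation coefficient (cm^-1)
# and the path length through that region (cm).
def attenuation_factor(segments):
    return math.exp(-sum(mu * d for mu, d in segments))

# Example: 2 cm of lung-like material and 3 cm of soft tissue (assumed coefficients).
print(attenuation_factor([(0.04, 2.0), (0.15, 3.0)]))
```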

  3. Photons, Electrons and Positrons Transport in 3D by Monte Carlo Techniques

    SciTech Connect

    2014-12-01

    Version 04 FOTELP-2014 is a new compact general purpose version of the previous FOTELP-2K6 code designed to simulate the transport of photons, electrons and positrons through three-dimensional material and source geometries by Monte Carlo techniques, using the subroutine package PENGEOM from the PENELOPE code under Linux-based and Windows OS. This new version includes the routine ELMAG for electron and positron transport simulation in electric and magnetic fields, a RESUME option and a routine TIMER for obtaining the starting random number and for measuring the time of simulation.

  4. Photons, Electrons and Positrons Transport in 3D by Monte Carlo Techniques

    Energy Science and Technology Software Center (ESTSC)

    2014-12-01

    Version 04 FOTELP-2014 is a new compact general purpose version of the previous FOTELP-2K6 code designed to simulate the transport of photons, electrons and positrons through three-dimensional material and source geometries by Monte Carlo techniques, using the subroutine package PENGEOM from the PENELOPE code under Linux-based and Windows OS. This new version includes the routine ELMAG for electron and positron transport simulation in electric and magnetic fields, a RESUME option and a routine TIMER for obtaining the starting random number and for measuring the time of simulation.

  5. Macro-step Monte Carlo Methods and their Applications in Proton Radiotherapy and Optical Photon Transport

    NASA Astrophysics Data System (ADS)

    Jacqmin, Dustin J.

    Monte Carlo modeling of radiation transport is considered the gold standard for radiotherapy dose calculations. However, highly accurate Monte Carlo calculations are very time consuming and the use of Monte Carlo dose calculation methods is often not practical in clinical settings. With this in mind, a variation on the Monte Carlo method called macro Monte Carlo (MMC) was developed in the 1990s for electron beam radiotherapy dose calculations. To accelerate the simulation process, the electron MMC method used larger step sizes in regions of the simulation geometry where the size of the region was large relative to the size of a typical Monte Carlo step. These large steps were pre-computed using conventional Monte Carlo simulations and stored in a database featuring many step sizes and materials. The database was loaded into memory by a custom electron MMC code and used to transport electrons quickly through a heterogeneous absorbing geometry. The purpose of this thesis work was to apply the same techniques to proton radiotherapy dose calculation and light propagation Monte Carlo simulations. First, the MMC method was implemented for proton radiotherapy dose calculations. A database composed of pre-computed steps was created using MCNPX for many materials and beam energies. The database was used by a custom proton MMC code called PMMC to transport protons through a heterogeneous absorbing geometry. The PMMC code was tested against MCNPX for a number of different proton beam energies and geometries and proved to be accurate and much more efficient. The MMC method was also implemented for light propagation Monte Carlo simulations. The widely accepted Monte Carlo for multilayered media (MCML) code was modified to incorporate the MMC method. The original MCML uses basic scattering and absorption physics to transport optical photons through multilayered geometries. The MMC version of MCML was tested against the original MCML code using a number of different geometries and
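
    The MMC idea summarized above amounts to replacing many small condensed-history steps with a single sampled "macro step" drawn from a precomputed database. The sketch below is schematic only; the database entries, bin structure, and function names are placeholders, not the PMMC implementation.

```python
import random

# Conceptual sketch of a macro Monte Carlo (MMC) step lookup (not the PMMC code).
# In practice the database of step outcomes would be generated with a conventional
# Monte Carlo code such as MCNPX for many materials and energies.
MACRO_DB = {
    # (material, energy bin in MeV): samples of (energy loss in MeV, deflection in rad)
    ("water", 100): [(2.1, 0.020), (2.3, 0.030), (1.9, 0.015)],
}

def take_macro_step(material, energy):
    samples = MACRO_DB[(material, int(round(energy)))]
    d_e, d_theta = random.choice(samples)   # sample one precomputed step outcome
    return energy - d_e, d_theta            # (new energy, polar deflection)

print(take_macro_step("water", 100.2))
```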

  6. Monte Carlo photon transport on vector and parallel supercomputers: Final report

    SciTech Connect

    Martin, W.R.; Nowak, P.F.

    1986-12-01

    The University of Michigan has been investigating the implementation of vectorized and parallelized Monte Carlo algorithms for the analysis of photon transport in an inertially-confined fusion (ICF) plasma. The goal of this work is to develop and test Monte Carlo algorithms for vector/parallel supercomputers such as the Cray X-MP and Cray-2. Previous effort has resulted in the development of a vectorized photon transport code, named VPHOT, and a companion scalar code, named SPHOT, that performs the same analysis and is used for comparative purposes to assess the performance of the vectorized algorithm. A test problem, denoted the ICF test problem, has been created and tested with the VPHOT and SPHOT codes. By comparison with a reference LLNL calculation of the ICF test problem, the VPHOT/SPHOT codes have been verified to predict the correct results. Performance results with VPHOT versus SPHOT and the reference LLNL code have been reported previously and indicate that speedups in the range of 6 to 12 can be achieved with the vectorized algorithm versus the conventional scalar algorithm on the Cray X-MP. This report summarizes the progress made during the last year to continue the investigation of vectorized Monte Carlo (parameter studies, alternative vectorized algorithm, alternative target machines) and to extend the work into the area of parallel processing. 5 refs.
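
    The speedup from vectorization comes from processing the same event for a whole bank of photons at once rather than following one history at a time. The sketch below illustrates that event-based pattern, with NumPy standing in for the Cray vector hardware; the interaction coefficient is an assumed value.

```python
import numpy as np

# Illustrative event-based (vectorized) photon flight in the spirit of VPHOT,
# not its actual source: distances to the next collision are sampled for the
# entire photon bank in one vector operation.
rng = np.random.default_rng(0)

mu_t = 0.2                               # total interaction coefficient, cm^-1 (assumed)
directions = rng.normal(size=(10_000, 3))
directions /= np.linalg.norm(directions, axis=1, keepdims=True)
positions = np.zeros((10_000, 3))

# One vectorized flight event for every photon in the bank.
distances = -np.log(rng.random(10_000)) / mu_t
positions += directions * distances[:, None]
print(positions.mean(axis=0))            # near zero by symmetry
```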

  7. Fast Monte Carlo Electron-Photon Transport Method and Application in Accurate Radiotherapy

    NASA Astrophysics Data System (ADS)

    Hao, Lijuan; Sun, Guangyao; Zheng, Huaqing; Song, Jing; Chen, Zhenping; Li, Gui

    2014-06-01

    The Monte Carlo (MC) method is the most accurate computational method for dose calculation, but its wide application in clinical accurate radiotherapy is hindered by its slow convergence and long computation times. In MC dose calculation research, the main task is to speed up computation while maintaining high precision. The purpose of this paper is to increase the calculation speed of the MC method for electron-photon transport while preserving high precision, and ultimately to reduce the time for accurate radiotherapy dose calculation on an ordinary computer to the level of several hours, which meets the requirement of clinical dose verification. Based on the existing Super Monte Carlo Simulation Program (SuperMC), developed by the FDS Team, a fast MC method for coupled electron-photon transport is presented, with focus on two aspects: first, by simplifying and optimizing the physical model of electron-photon transport, the calculation speed is increased with only a slight reduction in accuracy; second, a variety of MC acceleration methods are used, for example, reusing information obtained in previous calculations to avoid repeated simulation of particles with identical histories, and applying appropriate variance reduction techniques to accelerate the convergence rate of the MC method. The fast MC method was tested on a number of simple physical models and clinical cases, including nasopharyngeal carcinoma, peripheral lung tumor, and cervical carcinoma. The results show that the fast MC method for electron-photon transport is fast enough to meet the requirement of clinical accurate radiotherapy dose verification. The method will later be applied to the Accurate/Advanced Radiation Therapy System (ARTS) as an MC dose verification module.

  8. TART97 a coupled neutron-photon 3-D, combinatorial geometry Monte Carlo transport code

    SciTech Connect

    Cullen, D.E.

    1997-11-22

    TART97 is a coupled neutron-photon, 3 Dimensional, combinatorial geometry, time dependent Monte Carlo transport code. This code can run on any modern computer. It is a complete system to assist you with input preparation, running Monte Carlo calculations, and analysis of output results. TART97 is also incredibly FAST; if you have used similar codes, you will be amazed at how fast this code is compared to other similar codes. Use of the entire system can save you a great deal of time and energy. TART97 is distributed on CD. This CD contains on-line documentation for all codes included in the system, the codes configured to run on a variety of computers, and many example problems that you can use to familiarize yourself with the system. TART97 completely supersedes all older versions of TART, and it is strongly recommended that users only use the most recent version of TART97 and its data files.

  9. MCNP: a general Monte Carlo code for neutron and photon transport

    SciTech Connect

    Forster, R.A.; Godfrey, T.N.K.

    1985-01-01

    MCNP is a very general Monte Carlo neutron photon transport code system with approximately 250 person years of Group X-6 code development invested. It is extremely portable, user-oriented, and a true production code as it is used about 60 Cray hours per month by about 150 Los Alamos users. It has as its data base the best cross-section evaluations available. MCNP contains state-of-the-art traditional and adaptive Monte Carlo techniques to be applied to the solution of an ever-increasing number of problems. Excellent user-oriented documentation is available for all facets of the MCNP code system. Many useful and important variants of MCNP exist for special applications. The Radiation Shielding Information Center (RSIC) in Oak Ridge, Tennessee is the contact point for worldwide MCNP code and documentation distribution. A much improved MCNP Version 3A will be available in the fall of 1985, along with new and improved documentation. Future directions in MCNP development will change the meaning of MCNP to Monte Carlo N Particle where N particle varieties will be transported.

  10. Comparison of Monte Carlo collimator transport methods for photon treatment planning in radiotherapy

    SciTech Connect

    Schmidhalter, D.; Manser, P.; Frei, D.; Volken, W.; Fix, M. K.

    2010-02-15

    Purpose: The aim of this work was a Monte Carlo (MC) based investigation of the impact of different radiation transport methods in collimators of a linear accelerator on photon beam characteristics, dose distributions, and efficiency. It is thereby investigated whether different simplifications of the radiation transport can be used in some clinical situations in order to save calculation time. Methods: Within the Swiss Monte Carlo Plan, a GUI-based framework for photon MC treatment planning, different MC methods are available for the radiation transport through the collimators [secondary jaws and multileaf collimator (MLC)]: EGSnrc (reference), VMC++, and Pin (an in-house developed MC code). Additional nonfull transport methods were implemented in order to provide different complexity levels for the MC simulation: considering collimator attenuation only, considering Compton scatter only or just the first Compton process, and considering the collimators as totally absorbing. Furthermore, either a simple or an exact geometry of the collimators can be selected for the absorbing or attenuation method. Phase spaces directly above and dose distributions in a water phantom are analyzed for academic and clinical treatment fields using 6 and 15 MV beams, including intensity modulated radiation therapy with dynamic MLC. Results: For all MC transport methods, differences in the radial mean energy and radial energy fluence are within 1% inside the geometric field. Below the collimators, the energy fluence is underestimated for nonfull MC transport methods ranging from 5% for Compton to 100% for Absorbing. Gamma analysis using EGSnrc calculated doses as reference shows that the percentage of voxels fulfilling a 1%/1 mm criterion is at least 98% when using the VMC++, Compton, or first Compton transport methods. When using the methods Pin, Transmission, Flat-Transmission, Flat-Absorbing or Absorbing, the mean value of points fulfilling this criterion over all tested cases is 97
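
    As a hedged illustration of two of the simplified collimator transport options compared above (not the Swiss Monte Carlo Plan implementation; the attenuation coefficient is an assumed value):

```python
import math

# "Absorbing" terminates any photon that intersects the collimator; "Transmission"
# keeps it with a weight reduced by exponential attenuation along the intersected
# thickness. Generic sketch, not code from the paper.
MU_COLLIMATOR = 0.9   # cm^-1, illustrative value for tungsten at megavoltage energies

def transport_through_collimator(weight, path_cm, method="transmission"):
    if path_cm <= 0.0:                 # photon misses the collimator
        return weight
    if method == "absorbing":
        return 0.0                     # photon terminated
    if method == "transmission":
        return weight * math.exp(-MU_COLLIMATOR * path_cm)
    raise ValueError(method)

print(transport_through_collimator(1.0, 0.5, "transmission"))
print(transport_through_collimator(1.0, 0.5, "absorbing"))
```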

  11. ITS Version 6: the integrated TIGER series of coupled electron/photon Monte Carlo transport codes.

    SciTech Connect

    Franke, Brian Claude; Kensek, Ronald Patrick; Laub, Thomas William

    2008-04-01

    ITS is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. Our goal has been to simultaneously maximize operational simplicity and physical accuracy. Through a set of preprocessor directives, the user selects one of the many ITS codes. The ease with which the makefile system is applied combines with an input scheme based on order-independent descriptive keywords that makes maximum use of defaults and internal error checking to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Physical rigor is provided by employing accurate cross sections, sampling distributions, and physical models for describing the production and transport of the electron/photon cascade from 1.0 GeV down to 1.0 keV. The availability of source code permits the more sophisticated user to tailor the codes to specific applications and to extend the capabilities of the codes to more complex applications. Version 6, the latest version of ITS, contains (1) improvements to the ITS 5.0 codes, and (2) conversion to Fortran 90. The general user friendliness of the software has been enhanced through memory allocation to reduce the need for users to modify and recompile the code.

  12. SU-E-T-558: Monte Carlo Photon Transport Simulations On GPU with Quadric Geometry

    SciTech Connect

    Chi, Y; Tian, Z; Jiang, S; Jia, X

    2015-06-15

    Purpose: Monte Carlo simulation on GPUs has experienced rapid advancement over the past few years, and tremendous accelerations have been achieved. Yet existing packages have been developed only for voxelized geometry. In some applications, e.g. radioactive seed modeling, simulations in more complicated geometry are needed. This abstract reports our initial efforts towards developing a quadric geometry module aiming at expanding the application scope of GPU-based MC simulations. Methods: We defined the simulation geometry as consisting of a number of homogeneous bodies, each specified by its material composition and limiting surfaces characterized by quadric functions. A tree data structure was utilized to define the geometric relationship between different bodies. We modified our GPU-based photon MC transport package to incorporate this geometry. Specifically, geometry parameters were loaded into the GPU’s shared memory for fast access. Geometry functions were rewritten to enable the identification of the body that contains the current particle location via a fast searching algorithm based on the tree data structure. Results: We tested our package in an example problem of HDR-brachytherapy dose calculation for a shielded cylinder. The dose under the quadric geometry and that under the voxelized geometry agreed in 94.2% of the voxels within the 20% isodose line based on a statistical t-test (95% confidence level), where the reference dose was defined to be the one at 0.5 cm away from the cylinder surface. It took 243 sec to transport 100 million source photons in this quadric geometry on an NVIDIA Titan GPU card. Compared with the simulation time of 99.6 sec in the voxelized geometry, including the quadric geometry reduced efficiency due to the more complicated geometry-related computations. Conclusion: Our GPU-based MC package has been extended to support photon transport simulation in quadric geometry. Satisfactory accuracy was observed with a reduced efficiency. Developments for charged
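
    A minimal sketch of the quadric-geometry bookkeeping described above, written generically rather than taken from the GPU package: a body is the region where each of its limiting quadrics takes a prescribed sign.

```python
# Each limiting surface is Q(x,y,z) = A x^2 + B y^2 + C z^2 + D x + E y + F z + G;
# a point belongs to a body if every (quadric, sign) constraint is satisfied.
def quadric(coeffs, p):
    A, B, C, D, E, F, G = coeffs
    x, y, z = p
    return A*x*x + B*y*y + C*z*z + D*x + E*y + F*z + G

# Example body (assumed, for illustration): a cylinder of radius 1 about the z axis,
# limited to |z| <= 2.
BODY = [
    ((1, 1, 0, 0, 0, 0, -1.0), -1),   # x^2 + y^2 - 1 <= 0
    ((0, 0, 1, 0, 0, 0, -4.0), -1),   # z^2 - 4 <= 0
]

def inside(body, p):
    return all((quadric(c, p) <= 0) == (sign < 0) for c, sign in body)

print(inside(BODY, (0.5, 0.0, 1.0)))   # True
print(inside(BODY, (1.5, 0.0, 0.0)))   # False
```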

  13. penORNL: a parallel Monte Carlo photon and electron transport package using PENELOPE

    SciTech Connect

    Bekar, Kursat B.; Miller, Thomas Martin; Patton, Bruce W.; Weber, Charles F.

    2015-01-01

    The parallel Monte Carlo photon and electron transport code package penORNL was developed at Oak Ridge National Laboratory to enable advanced scanning electron microscope (SEM) simulations on high-performance computing systems. This paper discusses the implementations, capabilities and parallel performance of the new code package. penORNL uses PENELOPE for its physics calculations and provides all available PENELOPE features to the users, as well as some new features including source definitions specifically developed for SEM simulations, a pulse-height tally capability for detailed simulations of gamma and x-ray detectors, and a modified interaction forcing mechanism to enable accurate energy deposition calculations. The parallel performance of penORNL was extensively tested with several model problems, and very good linear parallel scaling was observed with up to 512 processors. penORNL, along with its new features, will be available for SEM simulations upon completion of the new pulse-height tally implementation.

  14. Space applications of the MITS electron-photon Monte Carlo transport code system

    SciTech Connect

    Kensek, R.P.; Lorence, L.J.; Halbleib, J.A.; Morel, J.E.

    1996-07-01

    The MITS multigroup/continuous-energy electron-photon Monte Carlo transport code system has matured to the point that it is capable of addressing more realistic three-dimensional adjoint applications. It is first employed to efficiently predict point doses as a function of source energy for simple three-dimensional experimental geometries exposed to simulated uniform isotropic planar sources of monoenergetic electrons up to 4.0 MeV. Results are in very good agreement with experimental data. It is then used to efficiently simulate dose to a detector in a subsystem of a GPS satellite due to its natural electron environment, employing a relatively complex model of the satellite. The capability for survivability analysis of space systems is demonstrated, and results are obtained with and without variance reduction.

  15. Ant colony algorithm implementation in electron and photon Monte Carlo transport: Application to the commissioning of radiosurgery photon beams

    SciTech Connect

    Garcia-Pareja, S.; Galan, P.; Manzano, F.; Brualla, L.; Lallena, A. M.

    2010-07-15

    Purpose: In this work, the authors describe an approach which has been developed to drive the application of different variance-reduction techniques to the Monte Carlo simulation of photon and electron transport in clinical accelerators. Methods: The new approach considers the following techniques: Russian roulette, splitting, a modified version of the directional bremsstrahlung splitting, and the azimuthal particle redistribution. Their application is controlled by an ant colony algorithm based on an importance map. Results: The procedure has been applied to radiosurgery beams. Specifically, the authors have calculated depth-dose profiles, off-axis ratios, and output factors, quantities usually considered in the commissioning of these beams. The agreement between Monte Carlo results and the corresponding measurements is within approximately 3%/0.3 mm for the central axis percentage depth dose and the dose profiles. The importance map generated in the calculation can be used to discuss simulation details in the different parts of the geometry in a simple way. The simulation CPU times are comparable to those needed within other approaches common in this field. Conclusions: The new approach is competitive with those previously used in this kind of problem (PSF generation or source models) and has some practical advantages that make it a good tool for simulating radiation transport in problems where the quantities of interest are difficult to obtain because of low statistics.
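
    The importance map is used to steer standard splitting and Russian roulette decisions as particles move between regions of different importance. The sketch below is generic textbook logic, not the authors' ant colony implementation.

```python
import random

# Importance-driven splitting / Russian roulette that preserves the expected weight.
def adjust_population(weight, importance_here, importance_next):
    ratio = importance_next / importance_here
    if ratio >= 1.0:                          # more important region: split
        n = int(ratio)
        if random.random() < ratio - n:
            n += 1
        return [weight / ratio] * n
    if random.random() < ratio:               # less important region: roulette
        return [weight / ratio]
    return []                                 # particle killed

print(adjust_population(1.0, importance_here=1.0, importance_next=3.0))
print(adjust_population(1.0, importance_here=4.0, importance_next=1.0))
```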

  16. Coupled Neutron-Photon, 3-D, Combinatorial Geometry, Time Dependent, Monte Carlo Transport Code System.

    Energy Science and Technology Software Center (ESTSC)

    2013-06-24

    Version 07 TART2012 is a coupled neutron-photon Monte Carlo transport code designed to use three-dimensional (3-D) combinatorial geometry. Neutron and/or photon sources as well as neutron induced photon production can be tracked. It is a complete system to assist you with input preparation, running Monte Carlo calculations, and analysis of output results. TART2012 is also incredibly FAST; if you have used similar codes, you will be amazed at how fast this code is compared to other similar codes. Use of the entire system can save you a great deal of time and energy. TART2012 extends the general utility of the code to even more areas of application than available in previous releases by concentrating on improving the physics, particularly with regard to improved treatment of neutron fission, resonance self-shielding, molecular binding, and extending input options used by the code. Several utilities are included for creating input files and displaying TART results and data. TART2012 uses the latest ENDF/B-VI, Release 8, data. New for TART2012 is the use of continuous energy neutron cross sections, in addition to its traditional multigroup cross sections. For neutron interaction, the data are derived using ENDF-ENDL2005 and include both continuous energy cross sections and 700 group neutron data derived using a combination of ENDF/B-VI, Release 8, and ENDL data. The 700 group structure extends from 10^-5 eV up to 1 GeV. Presently nuclear data are only available up to 20 MeV, so that only 616 of the groups are currently used. For photon interaction, 701 point photon data were derived using the Livermore EPDL97 file. The new 701 point structure extends from 100 eV up to 1 GeV, and is currently used over this entire energy range. TART2012 completely supersedes all older versions of TART, and it is strongly recommended that one use only the most recent version of TART2012 and its data files. Check author’s homepage for related information: http

  17. A Coupled Neutron-Photon 3-D Combinatorial Geometry Monte Carlo Transport Code

    Energy Science and Technology Software Center (ESTSC)

    1998-06-12

    TART97 is a coupled neutron-photon, 3 dimensional, combinatorial geometry, time dependent Monte Carlo transport code. This code can run on any modern computer. It is a complete system to assist you with input preparation, running Monte Carlo calculations, and analysis of output results. TART97 is also incredibly fast: if you have used similar codes, you will be amazed at how fast this code is compared to other similar codes. Use of the entire system can save you a great deal of time and energy. TART97 is distributed on CD. This CD contains on-line documentation for all codes included in the system, the codes configured to run on a variety of computers, and many example problems that you can use to familiarize yourself with the system. TART97 completely supersedes all older versions of TART, and it is strongly recommended that users only use the most recent version of TART97 and its data files.

  18. Code System for Monte Carlo Simulation of Electron and Photon Transport.

    SciTech Connect

    2015-07-01

    Version 01 PENELOPE performs Monte Carlo simulation of coupled electron-photon transport in arbitrary materials and complex quadric geometries. A mixed procedure is used for the simulation of electron and positron interactions (elastic scattering, inelastic scattering and bremsstrahlung emission), in which ‘hard’ events (i.e. those with deflection angle and/or energy loss larger than pre-selected cutoffs) are simulated in a detailed way, while ‘soft’ interactions are calculated from multiple scattering approaches. Photon interactions (Rayleigh scattering, Compton scattering, photoelectric effect and electron-positron pair production) and positron annihilation are simulated in a detailed way. PENELOPE reads the required physical information about each material (which includes tables of physical properties, interaction cross sections, relaxation data, etc.) from the input material data file. The material data file is created by means of the auxiliary program MATERIAL, which extracts atomic interaction data from the database of ASCII files. PENELOPE mailing list archives and additional information about the code can be found at http://www.nea.fr/lists/penelope.html. See Abstract for additional features.
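
    A schematic illustration of the mixed (hard/soft) scheme described above; the cutoff names and values below are illustrative assumptions, not PENELOPE's actual input parameters.

```python
# Mixed (class II) bookkeeping sketch: interactions above the user-chosen energy-loss
# or deflection cutoffs are simulated individually ("hard"); everything below is folded
# into a condensed multiple-scattering step ("soft"). Not PENELOPE source code.
ENERGY_LOSS_CUTOFF = 0.01   # MeV (assumed value)
DEFLECTION_CUTOFF = 0.05    # rad (assumed value)

def classify_interaction(energy_loss, deflection):
    if energy_loss > ENERGY_LOSS_CUTOFF or deflection > DEFLECTION_CUTOFF:
        return "hard: sample this collision in detail"
    return "soft: fold into the multiple-scattering step"

print(classify_interaction(0.2, 0.01))
print(classify_interaction(0.001, 0.01))
```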

  19. Code System for Monte Carlo Simulation of Electron and Photon Transport.

    Energy Science and Technology Software Center (ESTSC)

    2015-07-01

    Version 01 PENELOPE performs Monte Carlo simulation of coupled electron-photon transport in arbitrary materials and complex quadric geometries. A mixed procedure is used for the simulation of electron and positron interactions (elastic scattering, inelastic scattering and bremsstrahlung emission), in which ‘hard’ events (i.e. those with deflection angle and/or energy loss larger than pre-selected cutoffs) are simulated in a detailed way, while ‘soft’ interactions are calculated from multiple scattering approaches. Photon interactions (Rayleigh scattering, Compton scattering, photoelectric effect and electron-positron pair production) and positron annihilation are simulated in a detailed way. PENELOPE reads the required physical information about each material (which includes tables of physical properties, interaction cross sections, relaxation data, etc.) from the input material data file. The material data file is created by means of the auxiliary program MATERIAL, which extracts atomic interaction data from the database of ASCII files. PENELOPE mailing list archives and additional information about the code can be found at http://www.nea.fr/lists/penelope.html. See Abstract for additional features.

  20. Integrated TIGER Series of Coupled Electron/Photon Monte Carlo Transport Codes System.

    Energy Science and Technology Software Center (ESTSC)

    2012-11-30

    Version: 00 Distribution is restricted to US Government Agencies and Their Contractors Only. The Integrated Tiger Series (ITS) is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. The goal has been to simultaneously maximize operational simplicity and physical accuracy. Through a set of preprocessor directives, the user selects one of the many ITS codes. The ease with which the makefile system is applied combines with an input scheme based on order-independent descriptive keywords that makes maximum use of defaults and internal error checking to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Physical rigor is provided by employing accurate cross sections, sampling distributions, and physical models for describing the production and transport of the electron/photon cascade from 1.0 GeV down to 1.0 keV. The availability of source code permits the more sophisticated user to tailor the codes to specific applications and to extend the capabilities of the codes to more complex applications. Version 6, the latest version of ITS, contains (1) improvements to the ITS 5.0 codes, and (2) conversion to Fortran 95. The general user friendliness of the software has been enhanced through memory allocation to reduce the need for users to modify and recompile the code.

  1. Integrated TIGER Series of Coupled Electron/Photon Monte Carlo Transport Codes System.

    SciTech Connect

    VALDEZ, GREG D.

    2012-11-30

    Version: 00 Distribution is restricted to US Government Agencies and Their Contractors Only. The Integrated Tiger Series (ITS) is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. The goal has been to simultaneously maximize operational simplicity and physical accuracy. Through a set of preprocessor directives, the user selects one of the many ITS codes. The ease with which the makefile system is applied combines with an input scheme based on order-independent descriptive keywords that makes maximum use of defaults and internal error checking to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Physical rigor is provided by employing accurate cross sections, sampling distributions, and physical models for describing the production and transport of the electron/photon cascade from 1.0 GeV down to 1.0 keV. The availability of source code permits the more sophisticated user to tailor the codes to specific applications and to extend the capabilities of the codes to more complex applications. Version 6, the latest version of ITS, contains (1) improvements to the ITS 5.0 codes, and (2) conversion to Fortran 95. The general user friendliness of the software has been enhanced through memory allocation to reduce the need for users to modify and recompile the code.

  2. Qualitative Simulation of Photon Transport in Free Space Based on Monte Carlo Method and Its Parallel Implementation

    PubMed Central

    Chen, Xueli; Gao, Xinbo; Qu, Xiaochao; Chen, Duofang; Ma, Bin; Wang, Lin; Peng, Kuan; Liang, Jimin; Tian, Jie

    2010-01-01

    During the past decade, the Monte Carlo method has found wide application in optical imaging for simulating the photon transport process inside tissues. However, this method has not yet been effectively extended to the simulation of free-space photon transport. In this paper, a uniform framework for noncontact optical imaging is proposed based on the Monte Carlo method, consisting of the simulation of photon transport both in tissues and in free space. Specifically, the simplification theory of the lens system is utilized to model the camera lens in the optical imaging system, and the Monte Carlo method is employed to describe the energy transformation from the tissue surface to the CCD camera. The focusing effect of the camera lens is also considered to establish the relationship between corresponding points on the tissue surface and the CCD camera. Furthermore, a parallel version of the framework is realized, making the simulation much more convenient and efficient. The feasibility of the uniform framework and the effectiveness of the parallel version are demonstrated with a cylindrical phantom based on real experimental results. PMID:20689705
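
    The lens simplification mentioned above can be pictured with the ideal thin-lens mapping between a point on the tissue surface and its image on the CCD plane. The sketch below uses assumed distances and is not the authors' camera model.

```python
# Ideal thin lens: 1/f = 1/d_obj + 1/d_img, lateral magnification m = -d_img/d_obj.
def thin_lens_image(x_obj, d_obj, f):
    d_img = 1.0 / (1.0 / f - 1.0 / d_obj)
    return -d_img / d_obj * x_obj, d_img      # (image-plane position, image distance)

# Example: a surface point 5 mm off-axis, 200 mm from a 50 mm lens (illustrative values).
print(thin_lens_image(x_obj=5.0, d_obj=200.0, f=50.0))
```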

  3. Development of a GPU-based Monte Carlo dose calculation code for coupled electron-photon transport.

    PubMed

    Jia, Xun; Gu, Xuejun; Sempau, Josep; Choi, Dongju; Majumdar, Amitava; Jiang, Steve B

    2010-06-01

    Monte Carlo simulation is the most accurate method for absorbed dose calculations in radiotherapy. Its efficiency still requires improvement for routine clinical applications, especially for online adaptive radiotherapy. In this paper, we report our recent development on a GPU-based Monte Carlo dose calculation code for coupled electron-photon transport. We have implemented the dose planning method (DPM) Monte Carlo dose calculation package (Sempau et al 2000 Phys. Med. Biol. 45 2263-91) on the GPU architecture under the CUDA platform. The implementation has been tested with respect to the original sequential DPM code on the CPU in phantoms with water-lung-water or water-bone-water slab geometry. A 20 MeV mono-energetic electron point source or a 6 MV photon point source is used in our validation. The results demonstrate adequate accuracy of our GPU implementation for both electron and photon beams in the radiotherapy energy range. Speed-up factors of about 5.0-6.6 times have been observed, using an NVIDIA Tesla C1060 GPU card against a 2.27 GHz Intel Xeon CPU processor. PMID:20463376

  4. Application of discrete ordinates and Monte Carlo methods to transport of photons from environmental sources

    SciTech Connect

    Ryman, J.C.; Eckerman, K.F.; Shultis, J.K.; Faw, R.E.; Dillman, L.T.

    1996-04-01

    Federal Guidance Report No. 12 tabulates dose coefficients for external exposure to photons and electrons emitted by radionuclides distributed in air, water, and soil. Although the dose coefficients of this report are based on previously developed dosimetric methodologies, they are derived from new, detailed calculations of energy and angular distributions of the radiations incident on the body and the transport of these radiations within the body. Effort was devoted to expanding the information available for assessment of radiation dose from radionuclides distributed on or below the surface of the ground. A companion paper (External Exposure to Radionuclides in Air, Water, and Soil) discusses the significance of the new tabulations of coefficients and provides detailed comparisons to previously published values. This paper discusses details of the photon transport calculations.

  5. TART98 a coupled neutron-photon 3-D, combinatorial geometry time dependent Monte Carlo Transport code

    SciTech Connect

    Cullen, D E

    1998-11-22

    TART98 is a coupled neutron-photon, 3 Dimensional, combinatorial geometry, time dependent Monte Carlo radiation transport code. This code can run on any modern computer. It is a complete system to assist you with input preparation, running Monte Carlo calculations, and analysis of output results. TART98 is also incredibly FAST; if you have used similar codes, you will be amazed at how fast this code is compared to other similar codes. Use of the entire system can save you a great deal of time and energy. TART98 is distributed on CD. This CD contains on-line documentation for all codes included in the system, the codes configured to run on a variety of computers, and many example problems that you can use to familiarize yourself with the system. TART98 completely supersedes all older versions of TART, and it is strongly recommended that users only use the most recent version of TART98 and its data files.

  6. Accelerating Monte Carlo simulations of photon transport in a voxelized geometry using a massively parallel graphics processing unit

    SciTech Connect

    Badal, Andreu; Badano, Aldo

    2009-11-15

    Purpose: It is a known fact that Monte Carlo simulations of radiation transport are computationally intensive and may require long computing times. The authors introduce a new paradigm for the acceleration of Monte Carlo simulations: The use of a graphics processing unit (GPU) as the main computing device instead of a central processing unit (CPU). Methods: A GPU-based Monte Carlo code that simulates photon transport in a voxelized geometry with the accurate physics models from PENELOPE has been developed using the CUDA programming model (NVIDIA Corporation, Santa Clara, CA). Results: An outline of the new code and a sample x-ray imaging simulation with an anthropomorphic phantom are presented. A remarkable 27-fold speed up factor was obtained using a GPU compared to a single core CPU. Conclusions: The reported results show that GPUs are currently a good alternative to CPUs for the simulation of radiation transport. Since the performance of GPUs is currently increasing at a faster pace than that of CPUs, the advantages of GPU-based software are likely to be more pronounced in the future.

  7. Modeling of an industrial environment: external dose calculations based on Monte Carlo simulations of photon transport.

    PubMed

    Kis, Zoltán; Eged, Katalin; Voigt, Gabriele; Meckbach, Reinhard; Müller, Heinz

    2004-02-01

    External gamma exposures from radionuclides deposited on surfaces usually result in the major contribution to the total dose to the public living in urban-industrial environments. The aim of the paper is to give an example for a calculation of the collective and averted collective dose due to the contamination and decontamination of deposition surfaces in a complex environment based on the results of Monte Carlo simulations. The shielding effects of the structures in complex and realistic industrial environments (where productive and/or commercial activity is carried out) were computed by the use of the Monte Carlo method. Several types of deposition areas (walls, roofs, windows, streets, lawn) were considered. Moreover, this paper gives a summary of the time dependence of the source strengths relative to a reference surface and a short overview of the mechanical and chemical intervention techniques which can be applied in this area. An exposure scenario was designed based on a survey of average German and Hungarian supermarkets. In the first part of the paper the air kermas per photon per unit area due to each specific deposition area contaminated by 137Cs were determined at several arbitrary locations in the whole environment relative to a reference value of 8.39 × 10^-4 pGy per gamma m^-2. The calculations provide the possibility to assess the whole contribution of a specific deposition area to the collective dose, separately. According to the current results, the roof and the paved area contribute the largest part (approximately 92%) to the total dose in the first year taking into account the relative contamination of the deposition areas. When integrating over 10 or 50 y, these two surfaces remain the most important contributors as well but the ratio will increasingly be shifted in favor of the roof. The decontamination of the roof and the paved area results in about 80-90% of the total averted collective dose in each calculated time period (1, 10, 50 y

  8. ITS version 5.0: the integrated TIGER series of coupled electron/photon Monte Carlo transport codes.

    SciTech Connect

    Franke, Brian Claude; Kensek, Ronald Patrick; Laub, Thomas William

    2004-06-01

    ITS is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. Our goal has been to simultaneously maximize operational simplicity and physical accuracy. Through a set of preprocessor directives, the user selects one of the many ITS codes. The ease with which the makefile system is applied combines with an input scheme based on order-independent descriptive keywords that makes maximum use of defaults and internal error checking to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Physical rigor is provided by employing accurate cross sections, sampling distributions, and physical models for describing the production and transport of the electron/photon cascade from 1.0 GeV down to 1.0 keV. The availability of source code permits the more sophisticated user to tailor the codes to specific applications and to extend the capabilities of the codes to more complex applications. Version 5.0, the latest version of ITS, contains (1) improvements to the ITS 3.0 continuous-energy codes, (2) multigroup codes with adjoint transport capabilities, and (3) parallel implementations of all ITS codes. Moreover, the general user friendliness of the software has been enhanced through increased internal error checking and improved code portability.

  9. Monte Carlo electron-photon transport using GPUs as an accelerator: Results for a water-aluminum-water phantom

    SciTech Connect

    Su, L.; Du, X.; Liu, T.; Xu, X. G.

    2013-07-01

    An electron-photon coupled Monte Carlo code ARCHER - Accelerated Radiation-transport Computations in Heterogeneous Environments - is being developed at Rensselaer Polytechnic Institute as a software test bed for emerging heterogeneous high performance computers that utilize accelerators such as GPUs. In this paper, the preliminary results of code development and testing are presented. The electron transport in media was modeled using the class-II condensed history method. The electron energy considered ranges from a few hundred keV to 30 MeV. Moller scattering and bremsstrahlung processes above a preset energy were explicitly modeled. Energy loss below that threshold was accounted for using the Continuously Slowing Down Approximation (CSDA). Photon transport was dealt with using the delta tracking method. Photoelectric effect, Compton scattering and pair production were modeled. Voxelised geometry was supported. A serial ARCHER-CPU was first written in C++. The code was then ported to the GPU platform using CUDA C. The hardware involved a desktop PC with an Intel Xeon X5660 CPU and six NVIDIA Tesla M2090 GPUs. ARCHER was tested for a case of a 20 MeV electron beam incident perpendicularly on a water-aluminum-water phantom. The depth and lateral dose profiles were found to agree with results obtained from well tested MC codes. Using six GPU cards, 6×10^6 histories of electrons were simulated within 2 seconds. In comparison, the same case running the EGSnrc and MCNPX codes required 1645 seconds and 9213 seconds, respectively, on a CPU with a single core used. (authors)
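
    The delta tracking (Woodcock tracking) used for photon flights in ARCHER can be sketched generically as follows; the slab geometry and coefficients are illustrative assumptions, not taken from the paper.

```python
import math
import random

# Delta tracking: sample flight distances with the majorant interaction coefficient of
# the whole geometry and accept a collision with probability mu(local)/mu_majorant, so
# no boundary-crossing logic is needed in a heterogeneous voxel phantom. Generic sketch.
def delta_track(position, direction, mu_of, mu_majorant):
    while True:
        s = -math.log(random.random()) / mu_majorant
        position = [p + s * d for p, d in zip(position, direction)]
        if random.random() < mu_of(position) / mu_majorant:
            return position                    # accepted (real) collision site

# Toy medium: water-like slab (0.07 cm^-1) with an aluminum-like insert (0.2 cm^-1)
# between x = 2 and x = 4 cm; coefficients are illustrative only.
mu = lambda p: 0.2 if 2.0 < p[0] < 4.0 else 0.07
print(delta_track([0.0, 0.0, 0.0], [1.0, 0.0, 0.0], mu, mu_majorant=0.2))
```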

  10. Electron-photon transport using the EGS4 (Electron Gamma Shower) Monte Carlo Code

    SciTech Connect

    Nelson, W.R.; Hirayama, H.; Rogers, D.W.O.

    1986-01-01

    The EGS (Electron Gamma Shower) code system was formally introduced in 1978 as a package, most commonly referred to as EGS3. It was designed to simulate electromagnetic cascades in various geometries and at energies up to a few thousand gigaelectron volts and down to cutoff kinetic energies of 0.1 MeV (photons) and 1 MeV (electrons). There have been many requests to extend EGS3 down to lower energies and this is a major, but not the only, reason for creating EGS4, which is now available for general distribution and is the subject of this presentation. A summary is given of the main features of the EGS4 code system, including statements about the physics that has been put into it and what can be realistically simulated. 6 refs.

  11. Development and Implementation of Photonuclear Cross-Section Data for Mutually Coupled Neutron-Photon Transport Calculations in the Monte Carlo N-Particle (MCNP) Radiation Transport Code

    SciTech Connect

    Morgan C. White

    2000-07-01

    The fundamental motivation for the research presented in this dissertation was the need to develop a more accurate prediction method for characterization of mixed radiation fields around medical electron accelerators (MEAs). Specifically, a model is developed for simulation of neutron and other particle production from photonuclear reactions and incorporated in the Monte Carlo N-Particle (MCNP) radiation transport code. This extension of the capability within the MCNP code provides for the more accurate assessment of the mixed radiation fields. The Nuclear Theory and Applications group of the Los Alamos National Laboratory has recently provided first-of-a-kind evaluated photonuclear data for a select group of isotopes. These data provide the reaction probabilities as functions of incident photon energy with angular and energy distribution information for all reaction products. The availability of these data is the cornerstone of the new methodology for state-of-the-art mutually coupled photon-neutron transport simulations. The dissertation includes details of the model development and implementation necessary to use the new photonuclear data within MCNP simulations. A new data format has been developed to include tabular photonuclear data. Data are processed from the Evaluated Nuclear Data Format (ENDF) to the new class 'u' A Compact ENDF (ACE) format using a standalone processing code. MCNP modifications have been completed to enable Monte Carlo sampling of photonuclear reactions. Note that both neutron and gamma production are included in the present model. The new capability has been subjected to extensive verification and validation (V&V) testing. Verification testing has established the expected basic functionality. Two validation projects were undertaken. First, comparisons were made to benchmark data from literature. These calculations demonstrate the accuracy of the new data and transport routines to better than 25 percent. Second, the ability to

  12. A Monte Carlo study of high-energy photon transport in matter: application for multiple scattering investigation in Compton spectroscopy

    PubMed Central

    Brancewicz, Marek; Itou, Masayoshi; Sakurai, Yoshiharu

    2016-01-01

    The first results of multiple scattering simulations of polarized high-energy X-rays for Compton experiments using a new Monte Carlo program, MUSCAT, are presented. The program is developed to follow the restrictions of real experimental geometries. The new simulation algorithm not only uses the well known photon splitting and interaction forcing methods, but is also upgraded with a new propagation separation method and is highly vectorized. In this paper, a detailed description of the new simulation algorithm is given. The code is verified by comparison with the previous experimental and simulation results by the ESRF group and new restricted geometry experiments carried out at SPring-8. PMID:26698070
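
    As a hedged sketch of the interaction forcing technique mentioned above (generic textbook logic, not the MUSCAT implementation): a photon crossing an optically thin sample is forced to interact inside it, and its statistical weight is scaled by the true interaction probability.

```python
import math
import random

# Force a collision within optical thickness tau by sampling from the exponential
# distribution truncated to [0, tau]; the returned weight factor 1 - exp(-tau) keeps
# the estimate unbiased.
def forced_interaction_depth(tau):
    p_interact = 1.0 - math.exp(-tau)
    depth = -math.log(1.0 - random.random() * p_interact)
    return depth, p_interact                   # (forced depth, weight multiplier)

print(forced_interaction_depth(tau=0.05))
```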

  13. Simulation of the full-core pin-model by JMCT Monte Carlo neutron-photon transport code

    SciTech Connect

    Li, D.; Li, G.; Zhang, B.; Shu, L.; Shangguan, D.; Ma, Y.; Hu, Z.

    2013-07-01

    With the number of cells exceeding a million, the number of tallies exceeding a hundred million, and the number of particle histories exceeding ten billion, the simulation of the full-core pin-by-pin model has become a real challenge for computers and computational methods. On the other hand, the memory required by the model exceeds the limit of a single CPU, so spatial domain and data decomposition must be considered. JMCT (J Monte Carlo Transport code) has successfully fulfilled the simulation of the full-core pin-by-pin model by domain decomposition and nested parallel computation. The k_eff and flux of each cell are obtained. (authors)

  14. ScintSim1: A new Monte Carlo simulation code for transport of optical photons in 2D arrays of scintillation detectors.

    PubMed

    Mosleh-Shirazi, Mohammad Amin; Zarrini-Monfared, Zinat; Karbasi, Sareh; Zamani, Ali

    2014-01-01

    Two-dimensional (2D) arrays of thick segmented scintillators are of interest as X-ray detectors for both 2D and 3D image-guided radiotherapy (IGRT). Their detection process involves ionizing radiation energy deposition followed by production and transport of optical photons. Only a very limited number of optical Monte Carlo simulation models exist, which has limited the number of modeling studies that have considered both stages of the detection process. We present ScintSim1, an in-house optical Monte Carlo simulation code for 2D arrays of scintillation crystals, developed in the MATLAB programming environment. The code was rewritten and revised based on an existing program for single-element detectors, with the additional capability to model 2D arrays of elements with configurable dimensions, material, etc. The code generates and follows each optical photon history through the detector element (and, in the case of cross-talk, the surrounding ones) until it reaches a configurable receptor, or is attenuated. The new model was verified by testing against relevant theoretically known behaviors or quantities and the results of a validated single-element model. For both sets of comparisons, the discrepancies in the calculated quantities were all <1%. The results validate the accuracy of the new code, which is a useful tool in scintillation detector optimization. PMID:24600168
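
    A toy version of the per-photon history loop described above, not the ScintSim1 MATLAB code; the optical coefficients and the one-dimensional treatment are simplifying assumptions.

```python
import math
import random

# Follow an optical photon through a scintillator element until it is absorbed,
# escapes back through the entrance face, or reaches the exit face, which stands in
# for the configurable receptor plane.
MU_ABS, MU_SCAT = 0.01, 0.5     # mm^-1, illustrative optical coefficients
THICKNESS = 10.0                # mm

def follow_photon():
    z, uz = 0.0, 1.0            # start at the entrance face, heading into the element
    while True:
        z += uz * (-math.log(random.random()) / (MU_ABS + MU_SCAT))
        if z >= THICKNESS:
            return "detected"
        if z <= 0.0:
            return "escaped"
        if random.random() < MU_ABS / (MU_ABS + MU_SCAT):
            return "absorbed"
        uz = random.uniform(-1.0, 1.0)   # isotropic redirection (1D slab toy)

print(sum(follow_photon() == "detected" for _ in range(10_000)) / 10_000)
```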

  15. ScintSim1: A new Monte Carlo simulation code for transport of optical photons in 2D arrays of scintillation detectors

    PubMed Central

    Mosleh-Shirazi, Mohammad Amin; Zarrini-Monfared, Zinat; Karbasi, Sareh; Zamani, Ali

    2014-01-01

    Two-dimensional (2D) arrays of thick segmented scintillators are of interest as X-ray detectors for both 2D and 3D image-guided radiotherapy (IGRT). Their detection process involves ionizing radiation energy deposition followed by production and transport of optical photons. Only a very limited number of optical Monte Carlo simulation models exist, which has limited the number of modeling studies that have considered both stages of the detection process. We present ScintSim1, an in-house optical Monte Carlo simulation code for 2D arrays of scintillation crystals, developed in the MATLAB programming environment. The code was rewritten and revised based on an existing program for single-element detectors, with the additional capability to model 2D arrays of elements with configurable dimensions, material, etc. The code generates and follows each optical photon history through the detector element (and, in the case of cross-talk, the surrounding ones) until it reaches a configurable receptor, or is attenuated. The new model was verified by testing against relevant theoretically known behaviors or quantities and the results of a validated single-element model. For both sets of comparisons, the discrepancies in the calculated quantities were all <1%. The results validate the accuracy of the new code, which is a useful tool in scintillation detector optimization. PMID:24600168

  16. General Purpose Monte Carlo Codes for Neutron and Photon Transport Calculations based on Continuous Energy and Multigroup Methods.

    Energy Science and Technology Software Center (ESTSC)

    2008-02-29

    Version 00 (1) Problems to be solved: MVP/GMVP II can solve eigenvalue and fixed-source problems. The multigroup code GMVP can solve forward and adjoint problems for neutron, photon and neutron-photon coupled transport. The continuous-energy code MVP can solve only the forward problems. Both codes can also perform time-dependent calculations. (2) Geometry description: MVP/GMVP employs combinatorial geometry to describe the calculation geometry. It describes spatial regions by the combination of the 3-dimensional objects (BODIes). Currently, the following objects (BODIes) can be used: BODIes with linear surfaces (half space, parallelepiped, right parallelepiped, wedge, right hexagonal prism); BODIes with a quadratic surface and linear surfaces (cylinder, sphere, truncated right cone, truncated elliptic cone, ellipsoid by rotation, general ellipsoid); and arbitrary quadratic surfaces and the torus. The rectangular and hexagonal lattice geometry can be used to describe the repeated geometry. Furthermore, the statistical geometry model is available to treat coated fuel particles or pebbles for high temperature reactors. (3) Particle sources: The various forms of energy-, angle-, space- and time-dependent distribution functions can be specified. See Abstract for more detail.

  17. Verification of external exposure assessment for the upper Techa riverside by luminescence measurements and Monte Carlo photon transport modeling.

    PubMed

    Taranenko, V; Meckbach, R; Degteva, M O; Bougrov, N G; Göksu, Y; Vorobiova, M I; Jacob, P

    2003-04-01

    An area located in the Southern Urals was contaminated in 1949-1956 as a result of radioactive waste releases into the Techa river by the Mayak Production Association. The external dose reconstruction of the Techa river dosimetry system (TRDS-2000) for the exposed population is based on an assessment of dose rates in air (DRA) obtained by modeling transport and deposition of radionuclides along the river for the time before 1952 and by gamma dose rate measurements since 1952. The aim of this paper is to contribute to a verification of the TRDS-2000 external dose assessment. Absorbed doses in bricks from a 130-year-old building in the heavily exposed Metlino settlement were measured by a luminescence technique. By the autumn of 1956 the population of Metlino had been evacuated, and then a water reservoir was created at the village location, which led to a change in the radioactive source geometry. Radiation transport calculations for assumed environmental sources before and since 1957 were performed with the MCNP Monte Carlo code. In combination with TRDS-2000 estimates for annual dose rates in air at the shore of the Techa river for the period 1949-1956 and contemporary dose rate in air measurements, absorbed doses in bricks were calculated. These calculations were performed deterministically with best estimates of the modeling parameters and stochastically by propagating uncertainty distributions through the calculation scheme. Assessed doses in bricks were found to be consistent with measured values within the uncertainty bounds, while their best estimates were approximately 15% lower than the luminescence measurements. PMID:12687379

  18. Monte Carlo Transport for Electron Thermal Transport

    NASA Astrophysics Data System (ADS)

    Chenhall, Jeffrey; Cao, Duc; Moses, Gregory

    2015-11-01

    The iSNB (implicit Schurtz-Nicolai-Busquet) multigroup electron thermal transport method of Cao et al. is adapted into a Monte Carlo transport method in order to better model the effects of non-local behavior. The end goal is a hybrid transport-diffusion method that combines Monte Carlo transport with a discrete diffusion Monte Carlo (DDMC) method. The hybrid method will combine the efficiency of a diffusion method in short mean free path regions with the accuracy of a transport method in long mean free path regions. The Monte Carlo nature of the approach allows the algorithm to be massively parallelized. Work to date on the method will be presented. This work was supported by Sandia National Laboratory - Albuquerque and the University of Rochester Laboratory for Laser Energetics.

  19. ITS version 5.0: the integrated TIGER series of coupled electron/photon Monte Carlo transport codes with CAD geometry.

    SciTech Connect

    Franke, Brian Claude; Kensek, Ronald Patrick; Laub, Thomas William

    2005-09-01

    ITS is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. Our goal has been to simultaneously maximize operational simplicity and physical accuracy. Through a set of preprocessor directives, the user selects one of the many ITS codes. The ease with which the makefile system is applied combines with an input scheme based on order-independent descriptive keywords that makes maximum use of defaults and internal error checking to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Physical rigor is provided by employing accurate cross sections, sampling distributions, and physical models for describing the production and transport of the electron/photon cascade from 1.0 GeV down to 1.0 keV. The availability of source code permits the more sophisticated user to tailor the codes to specific applications and to extend the capabilities of the codes to more complex applications. Version 5.0, the latest version of ITS, contains (1) improvements to the ITS 3.0 continuous-energy codes, (2) multigroup codes with adjoint transport capabilities, (3) parallel implementations of all ITS codes, (4) a general purpose geometry engine for linking with CAD or other geometry formats, and (5) the Cholla facet geometry library. Moreover, the general user friendliness of the software has been enhanced through increased internal error checking and improved code portability.

  20. A transported probability density function/photon Monte Carlo method for high-temperature oxy-natural gas combustion with spectral gas and wall radiation

    NASA Astrophysics Data System (ADS)

    Zhao, X. Y.; Haworth, D. C.; Ren, T.; Modest, M. F.

    2013-04-01

    A computational fluid dynamics model for high-temperature oxy-natural gas combustion is developed and exercised. The model features detailed gas-phase chemistry and radiation treatments (a photon Monte Carlo method with line-by-line spectral resolution for gas and wall radiation - PMC/LBL) and a transported probability density function (PDF) method to account for turbulent fluctuations in composition and temperature. The model is first validated for a 0.8 MW oxy-natural gas furnace, and the level of agreement between model and experiment is found to be at least as good as any that has been published earlier. Next, simulations are performed with systematic model variations to provide insight into the roles of individual physical processes and their interplay in high-temperature oxy-fuel combustion. This includes variations in the chemical mechanism and the radiation model, and comparisons of results obtained with versus without the PDF method to isolate and quantify the effects of turbulence-chemistry interactions and turbulence-radiation interactions. In this combustion environment, it is found to be important to account for the interconversion of CO and CO2, and radiation plays a dominant role. The PMC/LBL model allows the effects of molecular gas radiation and wall radiation to be clearly separated and quantified. Radiation and chemistry are tightly coupled through the temperature, and correct temperature prediction is required for correct prediction of the CO/CO2 ratio. Turbulence-chemistry interactions influence the computed flame structure and mean CO levels. Strong local effects of turbulence-radiation interactions are found in the flame, but the net influence of TRI on computed mean temperature and species profiles is small. The ultimate goal of this research is to simulate high-temperature oxy-coal combustion, where accurate treatments of chemistry, radiation and turbulence-chemistry-particle-radiation interactions will be even more important.

  1. TOPICAL REVIEW: Monte Carlo modelling of external radiotherapy photon beams

    NASA Astrophysics Data System (ADS)

    Verhaegen, Frank; Seuntjens, Jan

    2003-11-01

    An essential requirement for successful radiation therapy is that the discrepancies between dose distributions calculated at the treatment planning stage and those delivered to the patient are minimized. An important component in the treatment planning process is the accurate calculation of dose distributions. The most accurate way to do this is by Monte Carlo calculation of particle transport, first in the geometry of the external or internal source followed by tracking the transport and energy deposition in the tissues of interest. Additionally, Monte Carlo simulations allow one to investigate the influence of source components on beams of a particular type and their contaminant particles. Since the mid 1990s, there has been an enormous increase in Monte Carlo studies dealing specifically with the subject of the present review, i.e., external photon beam Monte Carlo calculations, aided by the advent of new codes and fast computers. The foundations for this work were laid from the late 1970s until the early 1990s. In this paper we will review the progress made in this field over the last 25 years. The review will be focused mainly on Monte Carlo modelling of linear accelerator treatment heads but sections will also be devoted to kilovoltage x-ray units and 60Co teletherapy sources.

  2. Monte Carlo modelling of external radiotherapy photon beams.

    PubMed

    Verhaegen, Frank; Seuntjens, Jan

    2003-11-01

    An essential requirement for successful radiation therapy is that the discrepancies between dose distributions calculated at the treatment planning stage and those delivered to the patient are minimized. An important component in the treatment planning process is the accurate calculation of dose distributions. The most accurate way to do this is by Monte Carlo calculation of particle transport, first in the geometry of the external or internal source followed by tracking the transport and energy deposition in the tissues of interest. Additionally, Monte Carlo simulations allow one to investigate the influence of source components on beams of a particular type and their contaminant particles. Since the mid 1990s, there has been an enormous increase in Monte Carlo studies dealing specifically with the subject of the present review, i.e., external photon beam Monte Carlo calculations, aided by the advent of new codes and fast computers. The foundations for this work were laid from the late 1970s until the early 1990s. In this paper we will review the progress made in this field over the last 25 years. The review will be focused mainly on Monte Carlo modelling of linear accelerator treatment heads but sections will also be devoted to kilovoltage x-ray units and 60Co teletherapy sources. PMID:14653555

  3. RCP01 - A Monte Carlo program for solving neutron and photon transport problems in three-dimensional geometry with detailed energy description and depletion capability

    SciTech Connect

    Ondis, L.A., II; Tyburski, L.J.; Moskowitz, B.S.

    2000-03-01

    The RCP01 Monte Carlo program is used to analyze many geometries of interest in nuclear design and analysis of light water moderated reactors such as the core in its pressure vessel with complex piping arrangement, fuel storage arrays, shipping and container arrangements, and neutron detector configurations. Written in FORTRAN and in use on a variety of computers, it is capable of estimating steady state neutron or photon reaction rates and neutron multiplication factors. The energy range covered in neutron calculations is that relevant to the fission process and subsequent slowing-down and thermalization, i.e., 20 MeV to 0 eV. The same energy range is covered for photon calculations.

  4. THE MCNPX MONTE CARLO RADIATION TRANSPORT CODE

    SciTech Connect

    WATERS, LAURIE S.; MCKINNEY, GREGG W.; DURKEE, JOE W.; FENSIN, MICHAEL L.; JAMES, MICHAEL R.; JOHNS, RUSSELL C.; PELOWITZ, DENISE B.

    2007-01-10

    MCNPX (Monte Carlo N-Particle eXtended) is a general-purpose Monte Carlo radiation transport code with three-dimensional geometry and continuous-energy transport of 34 particles and light ions. It contains flexible source and tally options, interactive graphics, and support for both sequential and multi-processing computer platforms. MCNPX is based on MCNP4B, and has been upgraded to most MCNP5 capabilities. MCNP is a highly stable code tracking neutrons, photons and electrons, and using evaluated nuclear data libraries for low-energy interaction probabilities. MCNPX has extended this base to a comprehensive set of particles and light ions, with heavy ion transport in development. Models have been included to calculate interaction probabilities when libraries are not available. Recent additions focus on the time evolution of residual nuclei decay, allowing calculation of transmutation and delayed particle emission. MCNPX is now a code of great dynamic range, and the excellent neutronics capabilities allow new opportunities to simulate devices of interest to experimental particle physics, particularly calorimetry. This paper describes the capabilities of the current MCNPX version 2.6.C, and also discusses ongoing code development.

  5. Improved geometry representations for Monte Carlo radiation transport.

    SciTech Connect

    Martin, Matthew Ryan

    2004-08-01

    ITS (Integrated Tiger Series) permits a state-of-the-art Monte Carlo solution of linear time-integrated coupled electron/photon radiation transport problems with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. ITS allows designers to predict product performance in radiation environments.

  6. Automated Monte Carlo biasing for photon-generated electrons near surfaces.

    SciTech Connect

    Franke, Brian Claude; Crawford, Martin James; Kensek, Ronald Patrick

    2009-09-01

    This report describes efforts to automate the biasing of coupled electron-photon Monte Carlo particle transport calculations. The approach was based on weight-windows biasing. Weight-window settings were determined using adjoint-flux Monte Carlo calculations. A variety of algorithms were investigated for adaptivity of the Monte Carlo tallies. Tree data structures were used to investigate spatial partitioning. Functional-expansion tallies were used to investigate higher-order spatial representations.
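
    The weight-window game that these settings feed is itself standard: particles whose weight rises above the window are split into copies inside it, and particles whose weight falls below it play Russian roulette. A minimal sketch of that game alone (Python; illustrative only, and it does not reproduce the adjoint-derived window settings or tally adaptivity described in the report):

      import random

      def apply_weight_window(weight, w_low, w_high, w_survive=None):
          # Return the list of particle weights left after the weight-window game.
          # Above the window: split into pieces that land inside the window.
          # Below the window: Russian roulette, preserving the expected weight.
          if w_survive is None:
              w_survive = 0.5 * (w_low + w_high)
          if weight > w_high:
              n = int(weight / w_survive) + 1
              return [weight / n] * n
          if weight < w_low:
              return [w_survive] if random.random() < weight / w_survive else []
          return [weight]

      print(apply_weight_window(4.0, 0.5, 2.0))    # split into in-window copies
      print(apply_weight_window(0.01, 0.5, 2.0))   # usually rouletted away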

  7. Fast photon-boundary intersection computation for Monte Carlo simulation of photon migration

    NASA Astrophysics Data System (ADS)

    Zhao, Xiaofen; Liu, Hongyan; Zhang, Bin; Liu, Fei; Luo, Jianwen; Bai, Jing

    2013-01-01

    The Monte Carlo (MC) method is generally used as a "gold standard" technique to simulate photon transport in biomedical optics. However, it is quite time-consuming, since abundant photon propagations need to be simulated in order to achieve an accurate result. In the case of complicated geometry, the computation speed is bound up with the calculation of the intersection between the photon transmission path and the media boundary. The ray-triangle-based method is often used to calculate the photon-boundary intersection in shape-based MC simulation of light propagation, but it is still relatively time-consuming. We present a fast way to determine the photon-boundary intersection. Triangle meshes are used to describe the boundary structure. A line segment instead of a ray is used to check whether there exists a photon-boundary intersection, as the next location of the photon in light transport is determined by the step size. Results suggest that by simply replacing the conventional ray-triangle-based method with the proposed line segment-triangle-based method, the MC simulation of light propagation in the mouse model can be sped up by more than 35%.
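
    A segment-triangle test of the kind described can be written as a small variation on the familiar Moller-Trumbore ray-triangle algorithm, simply restricting the hit parameter to the photon's current step (0 <= t <= 1). The sketch below (Python with NumPy) is an illustration of that idea, not the authors' implementation.

      import numpy as np

      def segment_triangle_intersect(p0, p1, v0, v1, v2, eps=1e-12):
          # Moller-Trumbore test for the segment p0->p1 against triangle (v0, v1, v2).
          # Returns the hit point, or None if the step ends before the boundary
          # or misses the triangle.
          d = p1 - p0                      # step vector; its length is the step size
          e1, e2 = v1 - v0, v2 - v0
          h = np.cross(d, e2)
          a = np.dot(e1, h)
          if abs(a) < eps:                 # step parallel to the triangle plane
              return None
          f = 1.0 / a
          s = p0 - v0
          u = f * np.dot(s, h)
          if u < 0.0 or u > 1.0:
              return None
          q = np.cross(s, e1)
          v = f * np.dot(d, q)
          if v < 0.0 or u + v > 1.0:
              return None
          t = f * np.dot(e2, q)
          return p0 + t * d if 0.0 <= t <= 1.0 else None

      # A step that crosses the z = 0 triangle versus one that stops short of it.
      tri = (np.array([0., 0., 0.]), np.array([1., 0., 0.]), np.array([0., 1., 0.]))
      print(segment_triangle_intersect(np.array([.2, .2, -1.]), np.array([.2, .2, 1.]), *tri))
      print(segment_triangle_intersect(np.array([.2, .2, -1.]), np.array([.2, .2, -.5]), *tri))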

  8. Recent advances in the Mercury Monte Carlo particle transport code

    SciTech Connect

    Brantley, P. S.; Dawson, S. A.; McKinley, M. S.; O'Brien, M. J.; Stevens, D. E.; Beck, B. R.; Jurgenson, E. D.; Ebbers, C. A.; Hall, J. M.

    2013-07-01

    We review recent physics and computational science advances in the Mercury Monte Carlo particle transport code under development at Lawrence Livermore National Laboratory. We describe recent efforts to enable a nuclear resonance fluorescence capability in the Mercury photon transport. We also describe recent work to implement a probability of extinction capability into Mercury. We review the results of current parallel scaling and threading efforts that enable the code to run on millions of MPI processes. (authors)

  9. Coupled electron-photon radiation transport

    SciTech Connect

    Lorence, L.; Kensek, R.P.; Valdez, G.D.; Drumm, C.R.; Fan, W.C.; Powell, J.L.

    2000-01-17

    Massively-parallel computers allow detailed 3D radiation transport simulations to be performed to analyze the response of complex systems to radiation. This has recently been demonstrated with the coupled electron-photon Monte Carlo code, ITS. To enable such calculations, the combinatorial geometry capability of ITS was improved. For greater geometrical flexibility, a version of ITS is under development that can track particles in CAD geometries. Deterministic radiation transport codes that utilize an unstructured spatial mesh are also being devised. For electron transport, the authors are investigating second-order forms of the transport equations which, when discretized, yield symmetric positive definite matrices. A novel parallelization strategy, simultaneously solving for spatial and angular unknowns, has been applied to the even- and odd-parity forms of the transport equation on a 2D unstructured spatial mesh. Another second-order form, the self-adjoint angular flux transport equation, also shows promise for electron transport.

  10. Monte Carlo Ion Transport Analysis Code.

    Energy Science and Technology Software Center (ESTSC)

    2009-04-15

    Version: 00 TRIPOS is a versatile Monte Carlo ion transport analysis code. It has been applied to the treatment of both surface and bulk radiation effects. The media considered is composed of multilayer polyatomic materials.

  11. Monte Carlo simulation of photon-induced air showers

    NASA Astrophysics Data System (ADS)

    D'Ettorre Piazzoli, B.; di Sciascio, G.

    1994-05-01

    The EPAS code (Electron Photon-induced Air Showers) is a three-dimensional Monte Carlo simulation developed to study the properties of extensive air showers (EAS) generated by the interaction of high energy photons (or electrons) in the atmosphere. Results of the present simulation concern the longitudinal, lateral, temporal and angular distributions of electrons in atmospheric cascades initiated by photons of energies up to 10^3 TeV.

  12. Analytic treatment of source photon emission times to reduce noise in implicit Monte Carlo calculations

    SciTech Connect

    Trahan, Travis J.; Gentile, Nicholas A.

    2012-09-10

    Statistical uncertainty is inherent to any Monte Carlo simulation of radiation transport problems. In space-angle-frequency independent radiative transfer calculations, the uncertainty in the solution is entirely due to random sampling of source photon emission times. We have developed a modification to the Implicit Monte Carlo algorithm that eliminates noise due to sampling of the emission time of source photons. In problems that are independent of space, angle, and energy, the new algorithm generates a smooth solution, while a standard implicit Monte Carlo solution is noisy. For space- and angle-dependent problems, the new algorithm exhibits reduced noise relative to standard implicit Monte Carlo in some cases, and comparable noise in all other cases. In conclusion, the improvements are limited to short time scales; over long time scales, noise due to random sampling of spatial and angular variables tends to dominate the noise reduction from the new algorithm.
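
    The gain can be illustrated with a toy grey, infinite-medium problem: the end-of-step energy contributed by photons emitted uniformly over a timestep has a closed-form expectation, so sampling the emission time only adds noise. The sketch below (Python; the source rate, opacity and timestep are arbitrary placeholders, and this is not the paper's algorithm) compares the sampled and analytic treatments.

      import math, random

      S, sigma_a, c, dt = 1.0, 2.0, 1.0, 0.5   # emission rate, opacity, speed, timestep

      def sampled_census_energy(n_photons):
          # Standard approach: sample each photon's emission time uniformly in the
          # step and attenuate it over the remaining time; noisy for small n_photons.
          w = S * dt / n_photons
          total = 0.0
          for _ in range(n_photons):
              t_emit = random.uniform(0.0, dt)
              total += w * math.exp(-sigma_a * c * (dt - t_emit))
          return total

      # Analytic emission-time treatment: integrate the attenuated emission exactly.
      analytic = S * (1.0 - math.exp(-sigma_a * c * dt)) / (sigma_a * c)

      print(sampled_census_energy(100), analytic)   # the sampled value scatters
      print(sampled_census_energy(100), analytic)   # around the noise-free result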

  13. An efficient framework for photon Monte Carlo treatment planning.

    PubMed

    Fix, Michael K; Manser, Peter; Frei, Daniel; Volken, Werner; Mini, Roberto; Born, Ernst J

    2007-10-01

    Currently, photon Monte Carlo treatment planning (MCTP) for a patient stored in the patient database of a treatment planning system (TPS) can usually only be performed using a cumbersome multi-step procedure in which many user interactions are needed. This means automation is needed for use in clinical routine. In addition, because of the long computing time in MCTP, optimization of the MC calculations is essential. For these purposes a new graphical user interface (GUI)-based photon MC environment has been developed, resulting in a very flexible framework. In this way, appropriate MC transport methods are assigned to different geometric regions while still benefiting from the features included in the TPS. In order to provide a flexible MC environment, the MC particle transport has been divided into different parts: the source, the beam modifiers and the patient. The source part includes the phase-space source, source models and full MC transport through the treatment head. The beam modifier part consists of one module for each beam modifier. To simulate the radiation transport through each individual beam modifier, one out of three full MC transport codes can be selected independently. Additionally, for each beam modifier a simple or an exact geometry can be chosen. Thereby, different complexity levels of radiation transport are applied during the simulation. For the patient dose calculation, two different MC codes are available. A special plug-in in Eclipse, providing all necessary information by means of Dicom streams, was used to start the developed MC GUI. The implementation of this framework separates the MC transport from the geometry, and the modules pass the particles in memory; hence, no files are used as the interface. The implementation is realized for 6 and 15 MV beams of a Varian Clinac 2300 C/D. Several applications demonstrate the usefulness of the framework. Apart from applications dealing with the beam modifiers, two patient cases are shown. Thereby

  14. Transport in Sawtooth photonic lattices

    NASA Astrophysics Data System (ADS)

    Weimann, Steffen; Morales-Inostroza, Luis; Real, Bastián; Cantillano, Camilo; Szameit, Alexander; Vicencio, Rodrigo A.

    2016-06-01

    We investigate, theoretically and experimentally, a photonic realization of a Sawtooth lattice. This special lattice exhibits two spectral bands, with one of them experiencing a complete collapse to a highly degenerate flat band for a special set of inter-site coupling constants. We report the observation of different transport regimes, including strong transport inhibition due to the appearance of the non-diffractive flat band. Moreover, we excite localized Shockley surface states, residing in the gap between the two linear bands.

  15. Clinical implementation of the Peregrine Monte Carlo dose calculations system for photon beam therapy

    SciTech Connect

    Albright, N; Bergstrom, P M; Daly, T P; Descalle, M; Garrett, D; House, R K; Knapp, D K; May, S; Patterson, R W; Siantar, C L; Verhey, L; Walling, R S; Welczorek, D

    1999-07-01

    PEREGRINE is a 3D Monte Carlo dose calculation system designed to serve as a dose calculation engine for clinical radiation therapy treatment planning systems. Taking advantage of recent advances in low-cost computer hardware, modern multiprocessor architectures and optimized Monte Carlo transport algorithms, PEREGRINE performs mm-resolution Monte Carlo calculations in times that are reasonable for clinical use. PEREGRINE has been developed to simulate radiation therapy for several source types, including photons, electrons, neutrons and protons, for both teletherapy and brachytherapy. However the work described in this paper is limited to linear accelerator-based megavoltage photon therapy. Here we assess the accuracy, reliability, and added value of 3D Monte Carlo transport for photon therapy treatment planning. Comparisons with clinical measurements in homogeneous and heterogeneous phantoms demonstrate PEREGRINE's accuracy. Studies with variable tissue composition demonstrate the importance of material assignment on the overall dose distribution. Detailed analysis of Monte Carlo results provides new information for radiation research by expanding the set of observables.

  16. Scalable Domain Decomposed Monte Carlo Particle Transport

    SciTech Connect

    O'Brien, Matthew Joseph

    2013-12-05

    In this dissertation, we present the parallel algorithms necessary to run domain decomposed Monte Carlo particle transport on large numbers of processors (millions of processors). Previous algorithms were not scalable, and the parallel overhead became more computationally costly than the numerical simulation.

  17. Evaluation of bremsstrahlung contribution to photon transport in coupled photon-electron problems

    NASA Astrophysics Data System (ADS)

    Fernández, Jorge E.; Scot, Viviana; Di Giulio, Eugenio; Salvat, Francesc

    2015-11-01

    The most accurate description of the radiation field in x-ray spectrometry requires the modeling of coupled photon-electron transport. Compton scattering and the photoelectric effect actually produce electrons as secondary particles which contribute to the photon field through conversion mechanisms like bremsstrahlung (which produces a continuous photon energy spectrum) and inner-shell impact ionization (ISII) (which gives characteristic lines). The solution of the coupled problem is time consuming because the electrons interact continuously and therefore, the number of electron collisions to be considered is always very high. This complex problem is frequently simplified by neglecting the contributions of the secondary electrons. Recent works (Fernández et al., 2013; Fernández et al., 2014) have shown the possibility to include a separately computed coupled photon-electron contribution like ISII in a photon calculation for improving such a crude approximation while preserving the speed of the pure photon transport model. By means of a similar approach and the Monte Carlo code PENELOPE (coupled photon-electron Monte Carlo), the bremsstrahlung contribution is characterized in this work. The angular distribution of the photons due to bremsstrahlung can be safely considered as isotropic, with the point of emission located at the same place of the photon collision. A new photon kernel describing the bremsstrahlung contribution is introduced: it can be included in photon transport codes (deterministic or Monte Carlo) with a minimal effort. A data library to describe the energy dependence of the bremsstrahlung emission has been generated for all elements Z=1-92 in the energy range 1-150 keV. The bremsstrahlung energy distribution for an arbitrary energy is obtained by interpolating in the database. A comparison between a PENELOPE direct simulation and the interpolated distribution using the data base shows an almost perfect agreement. The use of the data base increases
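
    The interpolation step described above is conceptually simple: with spectra tabulated on a grid of electron energies, the distribution at an arbitrary energy can be built bin by bin from the bracketing table entries. A sketch of that step (Python with NumPy; the table values below are placeholders, not data from the library discussed in the paper):

      import numpy as np

      # Hypothetical tabulated bremsstrahlung energy distributions: one normalized
      # spectrum (over photon-energy bins) per tabulated electron energy in keV.
      table_energies = np.array([10.0, 50.0, 100.0, 150.0])
      table_spectra = np.array([
          [0.70, 0.25, 0.05, 0.00],
          [0.45, 0.30, 0.20, 0.05],
          [0.35, 0.30, 0.25, 0.10],
          [0.30, 0.28, 0.27, 0.15],
      ])

      def bremsstrahlung_spectrum(e_keV):
          # Linearly interpolate the tabulated spectra, bin by bin, at e_keV,
          # then renormalize so the interpolated distribution sums to one.
          spectrum = np.array([np.interp(e_keV, table_energies, table_spectra[:, j])
                               for j in range(table_spectra.shape[1])])
          return spectrum / spectrum.sum()

      print(bremsstrahlung_spectrum(75.0))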

  18. A NEW MONTE CARLO METHOD FOR TIME-DEPENDENT NEUTRINO RADIATION TRANSPORT

    SciTech Connect

    Abdikamalov, Ernazar; Ott, Christian D.; O'Connor, Evan; Burrows, Adam; Dolence, Joshua C.; Loeffler, Frank; Schnetter, Erik

    2012-08-20

    Monte Carlo approaches to radiation transport have several attractive properties such as simplicity of implementation, high accuracy, and good parallel scaling. Moreover, Monte Carlo methods can handle complicated geometries and are relatively easy to extend to multiple spatial dimensions, which makes them potentially interesting in modeling complex multi-dimensional astrophysical phenomena such as core-collapse supernovae. The aim of this paper is to explore Monte Carlo methods for modeling neutrino transport in core-collapse supernovae. We generalize the Implicit Monte Carlo photon transport scheme of Fleck and Cummings and gray discrete-diffusion scheme of Densmore et al. to energy-, time-, and velocity-dependent neutrino transport. Using our 1D spherically-symmetric implementation, we show that, similar to the photon transport case, the implicit scheme enables significantly larger timesteps compared with explicit time discretization, without sacrificing accuracy, while the discrete-diffusion method leads to significant speed-ups at high optical depth. Our results suggest that a combination of spectral, velocity-dependent, Implicit Monte Carlo and discrete-diffusion Monte Carlo methods represents a robust approach for use in neutrino transport calculations in core-collapse supernovae. Our velocity-dependent scheme can easily be adapted to photon transport.
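
    For orientation, the Fleck and Cummings scheme being generalized here controls effective scattering through the Fleck factor; in its usual grey form (quoted from the standard Implicit Monte Carlo literature rather than from this paper, so the exact symbols should be treated as an assumption) it reads, in LaTeX,

      f = \frac{1}{1 + \alpha \beta c \,\Delta t\, \sigma_P},
      \qquad
      \beta = \frac{4 a T^3}{\rho c_v},

    where \sigma_P is the Planck mean opacity, \alpha a time-centering parameter, a the radiation constant and \rho c_v the volumetric heat capacity; a fraction (1 - f) of absorptions is then treated as effective scattering within the step, which is what permits the large implicit timesteps mentioned above.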

  19. A quantum photonic dissipative transport theory

    NASA Astrophysics Data System (ADS)

    Lei, Chan U.; Zhang, Wei-Min

    2012-05-01

    In this paper, a quantum transport theory for describing photonic dissipative transport dynamics in nanophotonics is developed. The nanophotonic devices concerned in this paper consist of on-chip all-optical integrated circuits incorporating photonic bandgap waveguides and driven resonators embedded in nanostructured photonic crystals. The photonic transport through waveguides is entirely determined from the exact master equation of the driven resonators, which is obtained by explicitly eliminating all the degrees of freedom of the waveguides (treated as reservoirs). Back-reactions from the reservoirs are fully taken into account. The relation between the driven photonic dynamics and photocurrents is obtained explicitly. The non-Markovian memory structure and quantum decoherence dynamics in photonic transport can then be fully addressed. As an illustration, the theory is utilized to study the transport dynamics of a photonic transistor consisting of a nanocavity coupled to two waveguides in photonic crystals. The controllability of photonic transport through the external driven field is demonstrated.

  20. Applications of the Monte Carlo radiation transport toolkit at LLNL

    NASA Astrophysics Data System (ADS)

    Sale, Kenneth E.; Bergstrom, Paul M., Jr.; Buck, Richard M.; Cullen, Dermot; Fujino, D.; Hartmann-Siantar, Christine

    1999-09-01

    Modern Monte Carlo radiation transport codes can be applied to model most applications of radiation, from optical to TeV photons, from thermal neutrons to heavy ions. Simulations can include any desired level of detail in three-dimensional geometries using the right level of detail in the reaction physics. The technology areas to which we have applied these codes include medical applications, defense, safety and security programs, nuclear safeguards and industrial and research system design and control. The main reason such applications are interesting is that by using these tools substantial savings of time and effort (i.e. money) can be realized. In addition it is possible to separate out and investigate computationally effects which cannot be isolated and studied in experiments. In model calculations, just as in real life, one must take care in order to get the correct answer to the right question. Advancing computing technology allows extensions of Monte Carlo applications in two directions. First, as computers become more powerful more problems can be accurately modeled. Second, as computing power becomes cheaper Monte Carlo methods become more widely accessible. An overview of the set of Monte Carlo radiation transport tools in use at LLNL will be presented along with a few examples of applications and future directions.

  1. Monte Carlo simulation of photon way in clinical laser therapy

    NASA Astrophysics Data System (ADS)

    Ionita, Iulian; Voitcu, Gabriel

    2011-07-01

    The multiple scattering of light can increase the efficiency of laser therapy of inflammatory diseases by enlarging the treated area. Light absorption is essential for treatment, while scattering dominates. Multiple scattering effects must be introduced using the Monte Carlo method for modeling light transport in tissue and, finally, for calculating the optical parameters. Diffuse reflectance measurements were made on highly concentrated live leukocyte suspensions under conditions similar to in-vivo measurements. The results were compared with the values determined by MC calculations, and the latter were adjusted to match the specified values of diffuse reflectance. The principal idea of MC simulations applied to absorption and scattering phenomena is to follow the optical path of a photon through the turbid medium. The concentrated live-cell suspension is a compromise between the homogeneous layer assumed in the MC model and the light-cell interaction of in-vivo experiments. In this way the MC simulation allows us to compute the absorption coefficient. The values of the optical parameters, derived from the simulation by best fitting of the measured reflectance, were used to determine the effective cross section. Thus we can compute the absorbed radiation dose at the cellular level.
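
    The photon-following loop referred to here is the standard one: draw an exponential step length from the total interaction coefficient, deposit part of the packet weight according to the single-scattering albedo, and sample a new direction from a phase function such as Henyey-Greenstein. A minimal sketch for a semi-infinite medium (Python; the optical coefficients are placeholders, not the leukocyte-suspension values fitted in the paper):

      import numpy as np

      rng = np.random.default_rng(0)
      mu_a, mu_s, g = 0.1, 10.0, 0.9          # absorption, scattering (1/mm), anisotropy
      mu_t = mu_a + mu_s
      albedo = mu_s / mu_t

      def hg_cosine():
          # Sample the cosine of the polar scattering angle from Henyey-Greenstein.
          if abs(g) < 1e-6:
              return 2.0 * rng.random() - 1.0
          s = (1.0 - g * g) / (1.0 - g + 2.0 * g * rng.random())
          return (1.0 + g * g - s * s) / (2.0 * g)

      def run_photon():
          # One photon packet launched straight into the medium occupying z > 0.
          pos = np.zeros(3)
          direction = np.array([0.0, 0.0, 1.0])
          weight = 1.0
          while weight > 1e-3:                           # production codes would add roulette
              step = -np.log(1.0 - rng.random()) / mu_t  # exponential free path
              pos = pos + step * direction
              if pos[2] < 0.0:                           # back through the surface: reflectance
                  return weight
              weight *= albedo                           # the remainder is absorbed locally
              cos_t = hg_cosine()
              sin_t = np.sqrt(max(0.0, 1.0 - cos_t * cos_t))
              phi = 2.0 * np.pi * rng.random()
              w = direction                              # rotate about the current direction
              a = np.array([1.0, 0.0, 0.0]) if abs(w[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
              u = np.cross(w, a); u /= np.linalg.norm(u)
              v = np.cross(w, u)
              direction = sin_t * np.cos(phi) * u + sin_t * np.sin(phi) * v + cos_t * w
          return 0.0

      reflectance = np.mean([run_photon() for _ in range(5000)])
      print("diffuse reflectance ~", round(float(reflectance), 3))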

  2. MCNP (Monte Carlo Neutron Photon) capabilities for nuclear well logging calculations

    SciTech Connect

    Forster, R.A.; Little, R.C.; Briesmeister, J.F.

    1989-01-01

    The Los Alamos Radiation Transport Code System (LARTCS) consists of state-of-the-art Monte Carlo and discrete ordinates transport codes and data libraries. The general-purpose continuous-energy Monte Carlo code MCNP (Monte Carlo Neutron Photon), part of the LARTCS, provides a computational predictive capability for many applications of interest to the nuclear well logging community. The generalized three-dimensional geometry of MCNP is well suited for borehole-tool models. SABRINA, another component of the LARTCS, is a graphics code that can be used to interactively create a complex MCNP geometry. Users can define many source and tally characteristics with standard MCNP features. The time-dependent capability of the code is essential when modeling pulsed sources. Problems with neutrons, photons, and electrons as either single or coupled particles can be calculated with MCNP. The physics of neutron and photon transport and interactions is modeled in detail using the latest available cross-section data. A rich collection of variance reduction features can greatly increase the efficiency of a calculation. MCNP is written in FORTRAN 77 and has been run on a variety of computer systems from scientific workstations to supercomputers. The next production version of MCNP will include features such as continuous-energy electron transport and a multitasking option. Areas of ongoing research of interest to the well logging community include angle biasing, adaptive Monte Carlo, improved discrete ordinates capabilities, and discrete ordinates/Monte Carlo hybrid development. Los Alamos has requested approval by the Department of Energy to create a Radiation Transport Computational Facility under their User Facility Program to increase external interactions with industry, universities, and other government organizations. 21 refs.

  3. SABRINA - An interactive geometry modeler for MCNP (Monte Carlo Neutron Photon)

    SciTech Connect

    West, J.T.; Murphy, J.

    1988-01-01

    SABRINA is an interactive three-dimensional geometry modeler developed to produce complicated models for the Los Alamos Monte Carlo Neutron Photon program MCNP. SABRINA produces line drawings and color-shaded drawings for a wide variety of interactive graphics terminals. It is used as a geometry preprocessor in model development and as a Monte Carlo particle-track postprocessor in the visualization of complicated particle transport problems. SABRINA is written in Fortran 77 and is based on the Los Alamos Common Graphics System, CGS. 5 refs., 2 figs.

  4. Coupling Photon Monte Carlo Simulation and CAD Software. Application to X-ray Nondestructive Evaluation

    NASA Astrophysics Data System (ADS)

    Tabary, J.; Glière, A.

    A Monte Carlo radiation transport simulation program, EGS Nova, and a Computer Aided Design software package, BRL-CAD, have been coupled within the framework of Sindbad, a Nondestructive Evaluation (NDE) simulation system. In its current state, the program is very valuable in an NDE laboratory context, as it helps simulate the images due to the uncollided and scattered photon fluxes in a single NDE software environment, without having to switch to a Monte Carlo code parameter set. Numerical validations show good agreement with EGS4-computed and published data. As the program's major drawback is the execution time, computational efficiency improvements are foreseen.

  5. Hybrid Monte-Carlo method for simulating neutron and photon radiography

    NASA Astrophysics Data System (ADS)

    Wang, Han; Tang, Vincent

    2013-11-01

    We present a Hybrid Monte-Carlo method (HMCM) for simulating neutron and photon radiographs. HMCM utilizes the combination of a Monte-Carlo particle simulation for calculating incident film radiation and a statistical post-processing routine to simulate film noise. Since the method relies on MCNP for transport calculations, it is easily generalized to most non-destructive evaluation (NDE) simulations. We verify the method's accuracy through ASTM International's E592-99 publication, Standard Guide to Obtainable Equivalent Penetrameter Sensitivity for Radiography of Steel Plates [1]. Potential uses for the method include characterizing alternative radiological sources and simulating NDE radiographs.

  6. Monte Carlo method for photon heating using temperature-dependent optical properties.

    PubMed

    Slade, Adam Broadbent; Aguilar, Guillermo

    2015-02-01

    The Monte Carlo method for photon transport is often used to predict the volumetric heating that an optical source will induce inside a tissue or material. This method relies on constant (with respect to temperature) optical properties, specifically the coefficients of scattering and absorption. In reality, optical coefficients are typically temperature-dependent, leading to error in simulation results. The purpose of this study is to develop a method that can incorporate variable properties and accurately simulate systems where the temperature will greatly vary, such as in the case of laser-thawing of frozen tissues. A numerical simulation was developed that utilizes the Monte Carlo method for photon transport to simulate the thermal response of a system that allows temperature-dependent optical and thermal properties. This was done by combining traditional Monte Carlo photon transport with a heat transfer simulation to provide a feedback loop that selects local properties based on current temperatures, for each moment in time. Additionally, photon steps are segmented to accurately obtain path lengths within a homogenous (but not isothermal) material. Validation of the simulation was done using comparisons to established Monte Carlo simulations using constant properties, and a comparison to the Beer-Lambert law for temperature-variable properties. The simulation is able to accurately predict the thermal response of a system whose properties can vary with temperature. The difference in results between variable-property and constant property methods for the representative system of laser-heated silicon can become larger than 100K. This simulation will return more accurate results of optical irradiation absorption in a material which undergoes a large change in temperature. This increased accuracy in simulated results leads to better thermal predictions in living tissues and can provide enhanced planning and improved experimental and procedural outcomes. PMID
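
    The essential feedback loop described above, re-evaluating the optical coefficients from the current temperature field before each transport step, can be outlined in a few lines. In the sketch below (Python) the property functions, heat capacity and deposition step are placeholders; in particular, deposit_photons stands in for a full Monte Carlo sweep and is not the paper's method.

      import numpy as np

      def mu_a(T):   # absorption coefficient versus temperature (placeholder model)
          return 0.5 + 0.002 * (T - 300.0)

      def mu_s(T):   # scattering coefficient versus temperature (placeholder model)
          return np.maximum(1.0, 10.0 - 0.01 * (T - 300.0))

      def deposit_photons(mua, mus, source_w):
          # Stand-in for the Monte Carlo sweep: deposit energy in proportion to the
          # local absorption fraction (a real code would track photon paths).
          return source_w * mua / (mua + mus)

      def simulate(n_steps, n_cells, dt, source_w=1.0, heat_capacity=0.05):
          # Alternate photon deposition and a heat update, re-evaluating the
          # temperature-dependent coefficients from the *current* temperatures.
          T = np.full(n_cells, 300.0)                   # kelvin
          for _ in range(n_steps):
              absorbed = deposit_photons(mu_a(T), mu_s(T), source_w)
              T += absorbed * dt / heat_capacity        # local heating only, no conduction
          return T

      print(simulate(n_steps=100, n_cells=5, dt=1.0))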

  7. Photon beam characterization and modelling for Monte Carlo treatment planning

    NASA Astrophysics Data System (ADS)

    Deng, Jun; Jiang, Steve B.; Kapur, Ajay; Li, Jinsheng; Pawlicki, Todd; Ma, C.-M.

    2000-02-01

    Photon beams of 4, 6 and 15 MV from Varian Clinac 2100C and 2300C/D accelerators were simulated using the EGS4/BEAM code system. The accelerators were modelled as a combination of component modules (CMs) consisting of a target, primary collimator, exit window, flattening filter, monitor chamber, secondary collimator, ring collimator, photon jaws and protection window. A full phase space file was scored directly above the upper photon jaws and analysed using beam data processing software, BEAMDP, to derive the beam characteristics, such as planar fluence, angular distribution, energy spectrum and the fractional contributions of each individual CM. A multiple-source model has been further developed to reconstruct the original phase space. Separate sources were created with accurate source intensity, energy, fluence and angular distributions for the target, primary collimator and flattening filter. Good agreement (within 2%) between the Monte Carlo calculations with the source model and those with the original phase space was achieved in the dose distributions for field sizes of 4 cm × 4 cm to 40 cm × 40 cm at source surface distances (SSDs) of 80-120 cm. The dose distributions in lung and bone heterogeneous phantoms have also been found to be in good agreement (within 2%) for 4, 6 and 15 MV photon beams for various field sizes between the Monte Carlo calculations with the source model and those with the original phase space.

  8. Energy Modulated Photon Radiotherapy: A Monte Carlo Feasibility Study.

    PubMed

    Zhang, Ying; Feng, Yuanming; Ming, Xin; Deng, Jun

    2016-01-01

    A novel treatment modality termed energy modulated photon radiotherapy (EMXRT) was investigated. The first step of EMXRT was to determine beam energy for each gantry angle/anatomy configuration from a pool of photon energy beams (2 to 10 MV) with a newly developed energy selector. An inverse planning system using gradient search algorithm was then employed to optimize photon beam intensity of various beam energies based on presimulated Monte Carlo pencil beam dose distributions in patient anatomy. Finally, 3D dose distributions in six patients of different tumor sites were simulated with Monte Carlo method and compared between EMXRT plans and clinical IMRT plans. Compared to current IMRT technique, the proposed EMXRT method could offer a better paradigm for the radiotherapy of lung cancers and pediatric brain tumors in terms of normal tissue sparing and integral dose. For prostate, head and neck, spine, and thyroid lesions, the EMXRT plans were generally comparable to the IMRT plans. Our feasibility study indicated that lower energy (<6 MV) photon beams could be considered in modern radiotherapy treatment planning to achieve a more personalized care for individual patient with dosimetric gains. PMID:26977413

  9. Energy Modulated Photon Radiotherapy: A Monte Carlo Feasibility Study

    PubMed Central

    Zhang, Ying; Feng, Yuanming; Ming, Xin

    2016-01-01

    A novel treatment modality termed energy modulated photon radiotherapy (EMXRT) was investigated. The first step of EMXRT was to determine beam energy for each gantry angle/anatomy configuration from a pool of photon energy beams (2 to 10 MV) with a newly developed energy selector. An inverse planning system using gradient search algorithm was then employed to optimize photon beam intensity of various beam energies based on presimulated Monte Carlo pencil beam dose distributions in patient anatomy. Finally, 3D dose distributions in six patients of different tumor sites were simulated with Monte Carlo method and compared between EMXRT plans and clinical IMRT plans. Compared to current IMRT technique, the proposed EMXRT method could offer a better paradigm for the radiotherapy of lung cancers and pediatric brain tumors in terms of normal tissue sparing and integral dose. For prostate, head and neck, spine, and thyroid lesions, the EMXRT plans were generally comparable to the IMRT plans. Our feasibility study indicated that lower energy (<6 MV) photon beams could be considered in modern radiotherapy treatment planning to achieve a more personalized care for individual patient with dosimetric gains. PMID:26977413

  10. Monte Carlo treatment planning for photon and electron beams

    NASA Astrophysics Data System (ADS)

    Reynaert, N.; van der Marck, S. C.; Schaart, D. R.; Van der Zee, W.; Van Vliet-Vroegindeweij, C.; Tomsej, M.; Jansen, J.; Heijmen, B.; Coghe, M.; De Wagter, C.

    2007-04-01

    During the last few decades, accuracy in photon and electron radiotherapy has increased substantially. This is partly due to enhanced linear accelerator technology, providing more flexibility in field definition (e.g. the usage of computer-controlled dynamic multileaf collimators), which led to intensity modulated radiotherapy (IMRT). Important improvements have also been made in the treatment planning process, more specifically in the dose calculations. Originally, dose calculations relied heavily on analytic, semi-analytic and empirical algorithms. The more accurate convolution/superposition codes use pre-calculated Monte Carlo dose "kernels" partly accounting for tissue density heterogeneities. It is generally recognized that the Monte Carlo method is able to increase accuracy even further. Since the second half of the 1990s, several Monte Carlo dose engines for radiotherapy treatment planning have been introduced. To enable the use of a Monte Carlo treatment planning (MCTP) dose engine in clinical circumstances, approximations have been introduced to limit the calculation time. In this paper, the literature on MCTP is reviewed, focussing on patient modeling, approximations in linear accelerator modeling and variance reduction techniques. An overview of published comparisons between MC dose engines and conventional dose calculations is provided for phantom studies and clinical examples, evaluating the added value of MCTP in the clinic. An overview of existing Monte Carlo dose engines and commercial MCTP systems is presented and some specific issues concerning the commissioning of a MCTP system are discussed.

  11. Shield weight optimization using Monte Carlo transport calculations

    NASA Technical Reports Server (NTRS)

    Jordan, T. M.; Wohl, M. L.

    1972-01-01

    Outlines are given of the theory used in the FASTER-3 Monte Carlo computer program for the transport of neutrons and gamma rays in complex geometries. The code has the additional capability of calculating the minimum-weight layered unit-shield configuration that will meet a specified dose-rate constraint. It includes the treatment of geometric regions bounded by quadratic and quadric surfaces, with multiple radiation sources that have a specified space, angle, and energy dependence. The program calculates, using importance sampling, the resulting number and energy fluxes at specified point, surface, and volume detectors. Results are presented for sample problems involving primary neutron and both primary and secondary photon transport in a spherical reactor shield configuration. These results include the optimization of the shield configuration.

  12. Monte Carlo radiation transport & parallelism

    SciTech Connect

    Cox, L. J.; Post, S. E.

    2002-01-01

    This talk summarizes the main aspects of the LANL ASCI Eolus project and its major unclassified code project, MCNP. The MCNP code provides state-of-the-art Monte Carlo radiation transport to approximately 3000 users world-wide. Almost all hardware platforms are supported because we strictly adhere to the FORTRAN-90/95 standard. For parallel processing, MCNP uses a mixture of OpenMP combined with either MPI or PVM (shared and distributed memory). This talk summarizes our experiences on various platforms using MPI with and without OpenMP. These platforms include PC-Windows, Intel-LINUX, BlueMountain, Frost, ASCI-Q and others.

  13. Monte Carlo simulation for the transport beamline

    SciTech Connect

    Romano, F.; Cuttone, G.; Jia, S. B.; Varisano, A.; Attili, A.; Marchetto, F.; Russo, G.; Cirrone, G. A. P.; Schillaci, F.; Scuderi, V.; Carpinelli, M.

    2013-07-26

    In the framework of the ELIMED project, Monte Carlo (MC) simulations are widely used to study the physical transport of charged particles generated by laser-target interactions and to preliminarily evaluate fluence and dose distributions. An energy selection system and the experimental setup for the TARANIS laser facility in Belfast (UK) have already been simulated with the GEANT4 (GEometry ANd Tracking) MC toolkit. Preliminary results are reported here. Future developments are planned to implement MC-based 3D treatment planning in order to optimize the number of shots and the dose delivery.

  14. Scalable Domain Decomposed Monte Carlo Particle Transport

    NASA Astrophysics Data System (ADS)

    O'Brien, Matthew Joseph

    In this dissertation, we present the parallel algorithms necessary to run domain decomposed Monte Carlo particle transport on large numbers of processors (millions of processors). Previous algorithms were not scalable, and the parallel overhead became more computationally costly than the numerical simulation. The main algorithms we consider are: • Domain decomposition of constructive solid geometry: enables extremely large calculations in which the background geometry is too large to fit in the memory of a single computational node. • Load Balancing: keeps the workload per processor as even as possible so the calculation runs efficiently. • Global Particle Find: if particles are on the wrong processor, globally resolve their locations to the correct processor based on particle coordinate and background domain. • Visualizing constructive solid geometry, sourcing particles, deciding that particle streaming communication is completed and spatial redecomposition. These algorithms are some of the most important parallel algorithms required for domain decomposed Monte Carlo particle transport. We demonstrate that our previous algorithms were not scalable, prove that our new algorithms are scalable, and run some of the algorithms up to 2 million MPI processes on the Sequoia supercomputer.
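
    In the simplest setting, the global particle find reduces to mapping each particle's coordinates onto the spatial decomposition and handing the particle to the owning rank. The schematic lookup below (Python) assumes a uniform Cartesian block decomposition; the production algorithm described in the dissertation must handle general domain shapes and is considerably more involved.

      import numpy as np

      def owning_rank(position, lo, hi, ranks_per_axis):
          # Map a particle position to the MPI rank owning that spatial block,
          # assuming a uniform Cartesian decomposition of the box [lo, hi].
          frac = (np.asarray(position) - lo) / (hi - lo)
          idx = np.clip((frac * ranks_per_axis).astype(int), 0, ranks_per_axis - 1)
          nx, ny, nz = ranks_per_axis
          return int(idx[0] + nx * (idx[1] + ny * idx[2]))   # lexicographic rank id

      lo, hi = np.zeros(3), np.array([100.0, 100.0, 100.0])
      ranks = np.array([4, 4, 2])                            # 32 ranks in total
      print(owning_rank([12.5, 80.0, 55.0], lo, hi, ranks))  # -> rank 28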

  15. Calculation of radiation therapy dose using all particle Monte Carlo transport

    DOEpatents

    Chandler, W.P.; Hartmann-Siantar, C.L.; Rathkopf, J.A.

    1999-02-09

    The actual radiation dose absorbed in the body is calculated using three-dimensional Monte Carlo transport. Neutrons, protons, deuterons, tritons, helium-3, alpha particles, photons, electrons, and positrons are transported in a completely coupled manner, using this Monte Carlo All-Particle Method (MCAPM). The major elements of the invention include: computer hardware, user description of the patient, description of the radiation source, physical databases, Monte Carlo transport, and output of dose distributions. This facilitated the estimation of dose distributions on a Cartesian grid for neutrons, photons, electrons, positrons, and heavy charged-particles incident on any biological target, with resolutions ranging from microns to centimeters. Calculations can be extended to estimate dose distributions on general-geometry (non-Cartesian) grids for biological and/or non-biological media. 57 figs.

  16. Calculation of radiation therapy dose using all particle Monte Carlo transport

    DOEpatents

    Chandler, William P.; Hartmann-Siantar, Christine L.; Rathkopf, James A.

    1999-01-01

    The actual radiation dose absorbed in the body is calculated using three-dimensional Monte Carlo transport. Neutrons, protons, deuterons, tritons, helium-3, alpha particles, photons, electrons, and positrons are transported in a completely coupled manner, using this Monte Carlo All-Particle Method (MCAPM). The major elements of the invention include: computer hardware, user description of the patient, description of the radiation source, physical databases, Monte Carlo transport, and output of dose distributions. This facilitated the estimation of dose distributions on a Cartesian grid for neutrons, photons, electrons, positrons, and heavy charged-particles incident on any biological target, with resolutions ranging from microns to centimeters. Calculations can be extended to estimate dose distributions on general-geometry (non-Cartesian) grids for biological and/or non-biological media.

  17. Advantages of Analytical Transformations in Monte Carlo Methods for Radiation Transport

    SciTech Connect

    McKinley, M S; Brooks III, E D; Daffin, F

    2004-12-13

    Monte Carlo methods for radiation transport typically attempt to solve an integral by directly sampling analog or weighted particles, which are treated as physical entities. Improvements to the methods involve better sampling, probability games or physical intuition about the problem. We show that significant improvements can be achieved by recasting the equations with an analytical transform to solve for new, non-physical entities or fields. This paper looks at one such transform, the difference formulation for thermal photon transport, showing a significant advantage for Monte Carlo solution of the equations for time dependent transport. Other related areas are discussed that may also realize significant benefits from similar analytical transformations.

  18. Benchmarking of Proton Transport in Super Monte Carlo Simulation Program

    NASA Astrophysics Data System (ADS)

    Wang, Yongfeng; Li, Gui; Song, Jing; Zheng, Huaqing; Sun, Guangyao; Hao, Lijuan; Wu, Yican

    2014-06-01

    The Monte Carlo (MC) method has traditionally been applied in nuclear design and analysis due to its capability of dealing with complicated geometries and multi-dimensional physics problems as well as obtaining accurate results. The Super Monte Carlo Simulation Program (SuperMC) is developed by the FDS Team in China for fusion, fission, and other nuclear applications. Simulations of radiation transport, isotope burn-up, material activation, radiation dose, and biological damage can be performed using SuperMC. Complicated geometries and the whole physical process of various types of particles over a broad energy range can be handled well. Bi-directional automatic conversion between general CAD models and fully formed input files of SuperMC is supported by MCAM, which is a CAD/image-based automatic modeling program for neutronics and radiation transport simulation. Mixed visualization of dynamical 3D datasets and geometry models is supported by RVIS, which is a nuclear radiation virtual simulation and assessment system. Continuous-energy cross-section data from the hybrid evaluated nuclear data library HENDL are utilized to support the simulations. Neutronics calculations of fixed-source and criticality design parameters for reactors of complex geometry and material distribution, based on the transport of neutrons and photons, were achieved in the former version of SuperMC. Recently, proton transport has also been integrated into SuperMC in the energy region up to 10 GeV. The physical processes considered for proton transport include electromagnetic processes and hadronic processes. The electromagnetic processes include ionization, multiple scattering, bremsstrahlung, and pair production. Public evaluated data from HENDL are used in some electromagnetic processes. In hadronic physics, the Bertini intra-nuclear cascade model with excitons, a preequilibrium model, a nucleus explosion model, a fission model, and an evaporation model are incorporated to treat the intermediate energy nuclear

  19. Vertical Photon Transport in Cloud Remote Sensing Problems

    NASA Technical Reports Server (NTRS)

    Platnick, S.

    1999-01-01

    Photon transport in plane-parallel, vertically inhomogeneous clouds is investigated and applied to cloud remote sensing techniques that use solar reflectance or transmittance measurements for retrieving droplet effective radius. Transport is couched in terms of weighting functions which approximate the relative contribution of individual layers to the overall retrieval. Two vertical weightings are investigated, including one based on the average number of scatterings encountered by reflected and transmitted photons in any given layer. A simpler vertical weighting based on the maximum penetration of reflected photons proves useful for solar reflectance measurements. These weighting functions are highly dependent on droplet absorption and solar/viewing geometry. A superposition technique, using adding/doubling radiative transfer procedures, is derived to accurately determine both weightings, avoiding time consuming Monte Carlo methods. Superposition calculations are made for a variety of geometries and cloud models, and selected results are compared with Monte Carlo calculations. Effective radius retrievals from modeled vertically inhomogeneous liquid water clouds are then made using the standard near-infrared bands, and compared with size estimates based on the proposed weighting functions. Agreement between the two methods is generally within several tenths of a micrometer, much better than expected retrieval accuracy. Though the emphasis is on photon transport in clouds, the derived weightings can be applied to any multiple scattering plane-parallel radiative transfer problem, including arbitrary combinations of cloud, aerosol, and gas layers.

  20. Approximation for Horizontal Photon Transport in Cloud Remote Sensing Problems

    NASA Technical Reports Server (NTRS)

    Platnick, Steven

    1999-01-01

    The effect of horizontal photon transport within real-world clouds can be of consequence to remote sensing problems based on plane-parallel cloud models. An analytic approximation for the root-mean-square horizontal displacement of reflected and transmitted photons relative to the incident cloud-top location is derived from random walk theory. The resulting formula is a function of the average number of photon scatterings, and particle asymmetry parameter and single scattering albedo. In turn, the average number of scatterings can be determined from efficient adding/doubling radiative transfer procedures. The approximation is applied to liquid water clouds for typical remote sensing solar spectral bands, involving both conservative and non-conservative scattering. Results compare well with Monte Carlo calculations. Though the emphasis is on horizontal photon transport in terrestrial clouds, the derived approximation is applicable to any multiple scattering plane-parallel radiative transfer problem. The complete horizontal transport probability distribution can be described with an analytic distribution specified by the root-mean-square and average displacement values. However, it is shown empirically that the average displacement can be reasonably inferred from the root-mean-square value. An estimate for the horizontal transport distribution can then be made from the root-mean-square photon displacement alone.
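
    The scaling behind approximations of this kind can be checked with a generic random walk: for isotropically re-oriented steps of exponentially distributed length, the mean-square horizontal displacement grows linearly with the number of scatterings. The toy check below (Python with NumPy) verifies that scaling only; it is not the formula derived in the paper, which also involves the asymmetry parameter and single scattering albedo.

      import numpy as np

      rng = np.random.default_rng(1)

      def rms_horizontal_displacement(n_scatterings, mfp, n_photons=20000):
          # Horizontal RMS displacement of isotropic random walkers after a fixed
          # number of steps with exponentially distributed lengths (mean mfp).
          disp = np.zeros((n_photons, 2))
          for _ in range(n_scatterings):
              step = rng.exponential(mfp, n_photons)
              cos_t = rng.uniform(-1.0, 1.0, n_photons)
              sin_t = np.sqrt(1.0 - cos_t**2)
              phi = rng.uniform(0.0, 2.0 * np.pi, n_photons)
              disp[:, 0] += step * sin_t * np.cos(phi)
              disp[:, 1] += step * sin_t * np.sin(phi)
          return np.sqrt((disp**2).sum(axis=1).mean())

      for n in (4, 16, 64):
          # Per step, E[l^2] = 2*mfp^2 and E[sin^2(theta)] = 2/3, so the
          # mean-square horizontal displacement is (4/3)*n*mfp^2.
          print(n, round(float(rms_horizontal_displacement(n, 1.0)), 2),
                round(float(np.sqrt(4.0 * n / 3.0)), 2))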

  1. Fiber transport of spatially entangled photons

    NASA Astrophysics Data System (ADS)

    Löffler, W.; Eliel, E. R.; Woerdman, J. P.; Euser, T. G.; Scharrer, M.; Russell, P.

    2012-03-01

    High-dimensional entangled photon pairs are interesting for quantum information and cryptography: compared to the well-known 2D polarization case, the stronger non-local quantum correlations could improve noise resistance or security, and the larger amount of information per photon increases the available bandwidth. One implementation is to use entanglement in the spatial degree of freedom of twin photons created by spontaneous parametric down-conversion, which is equivalent to orbital angular momentum entanglement; this has been proven to be an excellent model system. The use of optical fiber technology for distribution of such photons has only very recently been demonstrated in practice and is of fundamental and applied interest. It poses a big challenge compared to the established time- and frequency-domain methods: for spatially entangled photons, fiber transport requires the use of multimode fibers, and mode coupling and intermodal dispersion therein must be minimized so as not to destroy the spatial quantum correlations. We demonstrate that these shortcomings of conventional multimode fibers can be overcome by using a hollow-core photonic crystal fiber, which follows the paradigm of mimicking free-space transport as closely as possible, and we are able to confirm entanglement of the fiber-transported photons. Fiber transport of spatially entangled photons is still largely unexplored; we therefore discuss the main complications, the interplay of intermodal dispersion and mode mixing and the influence of external stress and core deformations, and consider the pros and cons of various fiber types.

  2. Photon spectra calculation for an Elekta linac beam using experimental scatter measurements and Monte Carlo techniques.

    PubMed

    Juste, B; Miro, R; Campayo, J M; Diez, S; Verdu, G

    2008-01-01

    The present work is centered on reconstructing, by means of a scatter analysis method, the primary beam photon spectrum of a linear accelerator. This technique is based on irradiating the isocenter of a rectangular methacrylate block placed at 100 cm from the surface and measuring the scattered particles around the plastic at several specific positions with different scatter angles. The MCNP5 Monte Carlo code has been used to simulate the transport of mono-energetic beams and to register the scatter measurements after interaction with the attenuator. The measured ionization values allow the spectrum to be calculated as the sum of mono-energetic individual energy bins using the Schiff bremsstrahlung model. The measurements have been made on an Elekta Precise linac using a 6 MeV photon beam. Relative depth-dose and profile curves calculated in a water phantom using the reconstructed spectrum agree with experimentally measured dose data to within 3%. PMID:19163410
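
    As a rough illustration of the unfolding step described above, the spectrum can be recovered by writing each measured scatter reading as a weighted sum of Monte Carlo simulated mono-energetic responses and solving a non-negative least-squares problem. This is only a sketch of the idea, not the authors' implementation; the response matrix, measurement vector, and bin structure below are placeholders.

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical data layout:
#   R[i, j] = simulated detector reading at scatter position i for a
#             mono-energetic beam in energy bin j (e.g. from MCNP5 runs)
#   m[i]    = measured ionization reading at scatter position i
rng = np.random.default_rng(0)
n_positions, n_bins = 12, 8
R = rng.random((n_positions, n_bins))          # placeholder response matrix
true_w = np.linspace(1.0, 0.1, n_bins)         # placeholder "true" spectrum weights
m = R @ true_w                                 # synthetic measurements

# Non-negative least squares: find bin weights w >= 0 minimizing ||R w - m||
w, residual = nnls(R, m)
spectrum = w / w.sum()                         # normalized spectral weight per energy bin
print(spectrum, residual)
```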

  3. Transport of photons produced by lightning in clouds

    NASA Technical Reports Server (NTRS)

    Solakiewicz, Richard

    1991-01-01

    The optical effects of the light produced by lightning are of interest to atmospheric scientists for a number of reasons. Two techniques are mentioned which are used to explain the nature of these effects: Monte Carlo simulation and an equivalent medium approach. In the Monte Carlo approach, paths of individual photons are simulated; a photon is said to be scattered if it escapes the cloud, otherwise it is absorbed. In the equivalent medium approach, the cloud is replaced by a single obstacle whose properties are specified by bulk parameters obtained by methods due to Twersky. Herein, Boltzmann transport theory is used to obtain photon intensities. The photons are treated like a Lorentz gas. Only elastic scattering is considered and gravitational effects are neglected. Water droplets comprising a cuboidal cloud are assumed to be spherical and homogeneous. Furthermore, it is assumed that the distribution of droplets in the cloud is uniform and that scattering by air molecules is negligible. The time dependence and five-dimensional nature of this problem make it particularly difficult; neither analytic nor numerical solutions are known.

  4. Applications of the COG multiparticle Monte Carlo transport code to simulated imaging of complex objects

    SciTech Connect

    Buck, R M; Hall, J M

    1999-06-01

    COG is a major multiparticle simulation code in the LLNL Monte Carlo radiation transport toolkit. It was designed to solve deep-penetration radiation shielding problems in arbitrarily complex 3D geometries, involving coupled transport of photons, neutrons, and electrons. COG was written to provide as much accuracy as the underlying cross-sections will allow, and has a number of variance-reduction features to speed computations. Recently COG has been applied to the simulation of high-resolution radiographs of complex objects and the evaluation of contraband detection schemes. In this paper we will give a brief description of the capabilities of the COG transport code and show several examples of neutron and gamma-ray imaging simulations. Keywords: Monte Carlo, radiation transport, simulated radiography, nonintrusive inspection, neutron imaging.

  5. Commissioning of a Varian Clinac iX 6 MV photon beam using Monte Carlo simulation

    SciTech Connect

    Dirgayussa, I Gde Eka; Yani, Sitti; Haryanto, Freddy; Rhani, M. Fahdillah

    2015-09-30

    Monte Carlo modelling of a linear accelerator is the first and most important step in Monte Carlo dose calculations in radiotherapy. Monte Carlo is considered today to be the most accurate and detailed calculation method in different fields of medical physics. In this research, we developed a photon beam model for a Varian Clinac iX 6 MV equipped with a Millennium MLC 120 for dose calculation purposes, using the BEAMnrc/DOSXYZnrc Monte Carlo system based on the underlying EGSnrc particle transport code. The Monte Carlo commissioning of the linac head was divided into stages: designing the linac head model using BEAMnrc, characterizing this model using BEAMDP, and analyzing the differences between simulation and measurement data using DOSXYZnrc. In the first step, to reduce simulation time, a virtual treatment head was built in two parts (a patient-dependent component and a patient-independent component). The incident electron energy was varied over 6.1, 6.2, 6.3, 6.4, and 6.6 MeV, and the FWHM (full width at half maximum) of the source was 1 mm. The phase-space file from the virtual model was characterized using BEAMDP. Percent depth doses (PDDs) and beam profiles at a depth of 10 cm, calculated with DOSXYZnrc in a water phantom, were compared with measurements. This process is considered complete when the difference between measured and calculated relative depth-dose data along the central axis and the dose profile at a depth of 10 cm is ≤ 5%. The effect of beam width on percentage depth doses and beam profiles was studied. Results of the virtual model were in close agreement with measurements at an incident electron energy of 6.4 MeV. Our results showed that the photon beam width could be tuned using the large-field beam profile at the depth of maximum dose. The Monte Carlo model developed in this study accurately represents the Varian Clinac iX with the Millennium MLC 120 leaf collimator and can be used for reliable patient dose calculations. In this commissioning process, the good
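
    A minimal sketch of the ≤ 5% acceptance test described above for the central-axis depth-dose comparison; the curves and function names below are hypothetical, and a full commissioning comparison would also include profile and gamma-index analyses.

```python
import numpy as np

def pdd_within_tolerance(depth_meas, dose_meas, depth_calc, dose_calc, tol=0.05):
    """Check whether a calculated PDD agrees with measurement within a relative tolerance.

    Both curves are normalized to their maxima and the calculation is interpolated
    onto the measurement depths before comparison.
    """
    dose_meas = np.asarray(dose_meas) / np.max(dose_meas)
    dose_calc = np.interp(depth_meas, depth_calc, np.asarray(dose_calc) / np.max(dose_calc))
    diff = np.abs(dose_calc - dose_meas)          # difference as a fraction of the maximum dose
    return bool(np.all(diff <= tol)), float(diff.max())

# Example with synthetic curves
z = np.linspace(0, 300, 61)                       # depth in mm
meas = np.exp(-(z - 15) ** 2 / 800) + 0.8 * np.exp(-z / 180)
calc = meas * (1 + 0.01 * np.sin(z / 40))         # a slightly perturbed "calculation"
ok, max_diff = pdd_within_tolerance(z, meas, z, calc)
print(ok, round(100 * max_diff, 2), "% maximum difference")
```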

  6. Commissioning of a Varian Clinac iX 6 MV photon beam using Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Dirgayussa, I. Gde Eka; Yani, Sitti; Rhani, M. Fahdillah; Haryanto, Freddy

    2015-09-01

    Monte Carlo modelling of a linear accelerator is the first and most important step in Monte Carlo dose calculations in radiotherapy. Monte Carlo is considered today to be the most accurate and detailed calculation method in different fields of medical physics. In this research, we developed a photon beam model for a Varian Clinac iX 6 MV equipped with a Millennium MLC 120 for dose calculation purposes, using the BEAMnrc/DOSXYZnrc Monte Carlo system based on the underlying EGSnrc particle transport code. The Monte Carlo commissioning of the linac head was divided into stages: designing the linac head model using BEAMnrc, characterizing this model using BEAMDP, and analyzing the differences between simulation and measurement data using DOSXYZnrc. In the first step, to reduce simulation time, a virtual treatment head was built in two parts (a patient-dependent component and a patient-independent component). The incident electron energy was varied over 6.1, 6.2, 6.3, 6.4, and 6.6 MeV, and the FWHM (full width at half maximum) of the source was 1 mm. The phase-space file from the virtual model was characterized using BEAMDP. Percent depth doses (PDDs) and beam profiles at a depth of 10 cm, calculated with DOSXYZnrc in a water phantom, were compared with measurements. This process is considered complete when the difference between measured and calculated relative depth-dose data along the central axis and the dose profile at a depth of 10 cm is ≤ 5%. The effect of beam width on percentage depth doses and beam profiles was studied. Results of the virtual model were in close agreement with measurements at an incident electron energy of 6.4 MeV. Our results showed that the photon beam width could be tuned using the large-field beam profile at the depth of maximum dose. The Monte Carlo model developed in this study accurately represents the Varian Clinac iX with the Millennium MLC 120 leaf collimator and can be used for reliable patient dose calculations. In this commissioning process, the good criteria of dose

  7. Photonic sensor applications in transportation security

    NASA Astrophysics Data System (ADS)

    Krohn, David A.

    2007-09-01

    There is a broad range of security sensing applications in transportation that can be facilitated by using fiber optic sensors and photonic sensor integrated wireless systems. Many of these vital assets are under constant threat of being attacked. It is important to realize that the threats come not just from terrorism but also from an aging and often neglected infrastructure. To specifically address transportation security, photonic sensors fall into two categories: fixed point monitoring and mobile tracking. In fixed point monitoring, the sensors monitor bridge and tunnel structural health and environmental problems such as toxic gases in a tunnel. Mobile tracking sensors are being designed to track cargo such as shipboard cargo containers and trucks. Mobile tracking sensor systems have multifunctional sensor requirements, including intrusion (tampering), biochemical, radiation and explosives detection. This paper reviews the state of the art of photonic sensor technologies and their ability to meet the challenges of transportation security.

  8. Review of Monte Carlo modeling of light transport in tissues.

    PubMed

    Zhu, Caigang; Liu, Quan

    2013-05-01

    A general survey is provided on the capability of Monte Carlo (MC) modeling in tissue optics, while paying special attention to the recent progress in the development of methods for speeding up MC simulations. The principles of MC modeling for the simulation of light transport in tissues, which include the general procedure of tracking an individual photon packet, common light-tissue interactions that can be simulated, frequently used tissue models, common contact/noncontact illumination and detection setups, and the treatment of time-resolved and frequency-domain optical measurements, are briefly described to help interested readers achieve a quick start. Following that, a variety of methods for speeding up MC simulations, which include scaling methods, perturbation methods, hybrid methods, variance reduction techniques, parallel computation, and special methods for fluorescence simulations, as well as their respective advantages and disadvantages, are discussed. Then the applications of MC methods in tissue optics, laser Doppler flowmetry, photodynamic therapy, optical coherence tomography, and diffuse optical tomography are briefly surveyed. Finally, the potential directions for the future development of the MC method in tissue optics are discussed. PMID:23698318
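
    To make the "general procedure of tracking an individual photon packet" concrete, the following is a heavily simplified, self-contained sketch in the spirit of weighted photon-packet codes (infinite homogeneous medium, Henyey-Greenstein sampling, Russian roulette). The optical properties are assumed values, the direction update omits the proper rotation about the incoming direction, and the code is not taken from any of the packages reviewed in the paper.

```python
import numpy as np

MU_A, MU_S, G = 0.1, 10.0, 0.9      # absorption, scattering (1/mm), anisotropy (assumed values)
MU_T = MU_A + MU_S
rng = np.random.default_rng(1)

def hg_cos_theta():
    """Sample the scattering-angle cosine from the Henyey-Greenstein phase function."""
    if G == 0.0:
        return 2 * rng.random() - 1
    tmp = (1 - G**2) / (1 - G + 2 * G * rng.random())
    return (1 + G**2 - tmp**2) / (2 * G)

def run_packet():
    """Track one photon packet in an infinite homogeneous medium; return absorbed weight."""
    pos = np.zeros(3)
    direction = np.array([0.0, 0.0, 1.0])
    weight, absorbed = 1.0, 0.0
    while weight > 0:
        step = -np.log(rng.random()) / MU_T          # sample the free path length
        pos += step * direction
        absorbed += weight * MU_A / MU_T             # deposit the absorbed fraction of the weight
        weight *= MU_S / MU_T
        # Scatter: sample new angles (NOTE: simplification, ignores the incoming direction;
        # a full code rotates the sampled angles about the previous direction).
        ct, phi = hg_cos_theta(), 2 * np.pi * rng.random()
        st = np.sqrt(max(0.0, 1 - ct**2))
        direction = np.array([st * np.cos(phi), st * np.sin(phi), ct])
        if weight < 1e-4:                            # Russian roulette termination
            if rng.random() < 0.1:
                weight /= 0.1
            else:
                weight = 0.0
    return absorbed

print(np.mean([run_packet() for _ in range(200)]))   # should approach 1 (all weight eventually absorbed)
```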

  9. A finite element approach for modeling photon transport in tissue.

    PubMed

    Arridge, S R; Schweiger, M; Hiraoka, M; Delpy, D T

    1993-01-01

    The use of optical radiation in medical physics is important in several fields for both treatment and diagnosis. In all cases an analytic and computable model of the propagation of radiation in tissue is essential for a meaningful interpretation of the procedures. A finite element method (FEM) for deriving the photon density inside an object, and the photon flux at its boundary, assuming that the photon transport model is the diffusion approximation to the radiative transfer equation, is introduced herein. Results from the model for a particular case are given: the calculation of the boundary flux as a function of time resulting from a delta-function input to a two-dimensional circle (equivalent to a line source in an infinite cylinder) with homogeneous scattering and absorption properties. This models the temporal point spread function of interest in near-infrared spectroscopy and imaging. The convergence of the FEM results is demonstrated, as the resolution of the mesh is increased, to the analytical expression for the Green's function for this system. The diffusion approximation is very commonly adopted as appropriate for cases which are scattering dominated, i.e., where μs > μa, and results from other workers have compared it to alternative models. In this article a high degree of agreement with a Monte Carlo method is demonstrated. The principal advantage of the FE method is its speed. It is in all ways as flexible as Monte Carlo methods and in addition can produce the photon density everywhere, as well as the flux on the boundary. One disadvantage is that there is no means of deriving individual photon histories. PMID:8497214
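
    For reference, the diffusion approximation referred to above is commonly written in the following time-domain form; the notation (Φ, κ, μa, μs', q0) is generic, and the source term and boundary conditions used by the authors may differ.

```latex
% Time-domain diffusion approximation to the radiative transfer equation
% (generic notation: \Phi photon density / fluence rate, q_0 isotropic source,
%  \mu_a absorption, \mu_s' reduced scattering coefficient, c speed of light in the medium)
\[
  \frac{1}{c}\,\frac{\partial \Phi(\mathbf{r},t)}{\partial t}
  \;-\; \nabla \cdot \bigl[\kappa(\mathbf{r})\,\nabla \Phi(\mathbf{r},t)\bigr]
  \;+\; \mu_a(\mathbf{r})\,\Phi(\mathbf{r},t)
  \;=\; q_0(\mathbf{r},t),
  \qquad
  \kappa = \frac{1}{3\,(\mu_a + \mu_s')} .
\]
```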

  10. COMPARISON OF MONTE CARLO METHODS FOR NONLINEAR RADIATION TRANSPORT

    SciTech Connect

    W. R. MARTIN; F. B. BROWN

    2001-03-01

    Five Monte Carlo methods for solving the nonlinear thermal radiation transport equations are compared. The methods include the well-known Implicit Monte Carlo method (IMC) developed by Fleck and Cummings, an alternative to IMC developed by Carter and Forest, an "exact" method recently developed by Ahrens and Larsen, and two methods recently proposed by Martin and Brown. The five Monte Carlo methods are developed and applied to the radiation transport equation in a medium assuming local thermodynamic equilibrium. Conservation of energy is derived and used to define appropriate material energy update equations for each of the methods. Details of the Monte Carlo implementation are presented, both for the random walk simulation and the material energy update. Simulation results for all five methods are obtained for two infinite medium test problems and a 1-D test problem, all of which have analytical solutions. Conclusions regarding the relative merits of the various schemes are presented.
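
    As background for the IMC scheme mentioned above, the effective scattering in the Fleck-Cummings method is governed by the Fleck factor, given here in its common textbook form; the symbol conventions are generic and may differ from those used in the paper.

```latex
% Fleck factor used in Implicit Monte Carlo (standard textbook form; notation generic):
%   \alpha   - time-centering parameter (typically 1)
%   \beta    = 4 a T^3 / (\rho c_v), ratio of radiation to material heat capacity
%   \sigma_P - Planck-mean opacity, c - speed of light, \Delta t - time step
\[
  f \;=\; \frac{1}{1 + \alpha\,\beta\,c\,\Delta t\,\sigma_P},
\]
% so a fraction f of each absorption is treated as true absorption and (1 - f) as
% effective (isotropically re-emitted) scattering within the time step.
```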

  11. Monte Carlo simulation of photon buildup factors for shielding materials in radiotherapy x-ray facilities

    SciTech Connect

    Karoui, Mohamed Karim; Kharrati, Hedi

    2013-07-15

    Purpose: This paper presents the results of a series of calculations to determine buildup factors for ordinary concrete, baryte concrete, lead, steel, and iron in broad beam geometry for photon energies from 0.125 to 25.125 MeV at 0.250 MeV intervals. Methods: The Monte Carlo N-Particle radiation transport computer code has been used to determine the buildup factors for the studied shielding materials. Results: The computation of the primary broad beams using the buildup factor data was done for nine published megavoltage photon beam spectra ranging from 4 to 25 MV in nominal energy, representing linacs made by the three major manufacturers. The first tenth-value layer and the equilibrium tenth-value layer are calculated from the broad beam transmission for these nine primary megavoltage photon beam spectra. Conclusions: The results, compared with published data, show the ability of these buildup factor data to predict shielding transmission curves for the primary radiation beam. Therefore, the buildup factor data can be combined with primary, scatter, and leakage x-ray spectra to compute broad beam transmission for barriers in radiotherapy x-ray facility shielding.
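
    The combination of buildup factor data with a primary spectrum described in the conclusions can be sketched as follows; the spectrum, attenuation coefficients, and toy buildup-factor functions are placeholders, not the paper's data.

```python
import numpy as np

def broad_beam_transmission(thickness_cm, energies_MeV, fluence, mu_cm1, buildup):
    """Broad-beam transmission of a polyenergetic spectrum through a slab.

    T(x) = sum_E w(E) * B(E, mu*x) * exp(-mu(E)*x) / sum_E w(E),
    where B is the buildup factor and w(E) the spectral weight per energy bin.
    """
    num = 0.0
    for E, w, mu, B in zip(energies_MeV, fluence, mu_cm1, buildup):
        mfp = mu * thickness_cm                     # number of mean free paths in the barrier
        num += w * B(mfp) * np.exp(-mfp)
    return num / np.sum(fluence)

# Placeholder 3-bin "spectrum" and linear-in-mfp buildup factors, for illustration only
energies = [0.5, 1.0, 2.0]                          # MeV
weights  = [0.5, 0.3, 0.2]                          # relative fluence per bin
mu       = [0.20, 0.15, 0.10]                       # linear attenuation coefficients, 1/cm (placeholders)
B_funcs  = [lambda m, k=k: 1.0 + k * m for k in (0.8, 0.6, 0.4)]  # toy buildup factors

for x in (5.0, 10.0, 20.0):                         # barrier thickness, cm
    print(x, broad_beam_transmission(x, energies, weights, mu, B_funcs))
```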

  12. Shift: A Massively Parallel Monte Carlo Radiation Transport Package

    SciTech Connect

    Pandya, Tara M; Johnson, Seth R; Davidson, Gregory G; Evans, Thomas M; Hamilton, Steven P

    2015-01-01

    This paper discusses the massively-parallel Monte Carlo radiation transport package, Shift, developed at Oak Ridge National Laboratory. It reviews the capabilities, implementation, and parallel performance of this code package. Scaling results demonstrate very good strong and weak scaling behavior of the implemented algorithms. Benchmark results from various reactor problems show that Shift results compare well to other contemporary Monte Carlo codes and experimental results.

  13. Calculation of photon pulse height distribution using deterministic and Monte Carlo methods

    NASA Astrophysics Data System (ADS)

    Akhavan, Azadeh; Vosoughi, Naser

    2015-12-01

    Radiation transport techniques used in radiation detection systems fall into one of two categories, namely probabilistic and deterministic. While probabilistic methods are typically used for pulse height distribution simulation by recreating the behavior of each individual particle, the deterministic approach, which approximates the macroscopic behavior of particles by solution of the Boltzmann transport equation, is being developed because of its potential advantages in computational efficiency for complex radiation detection problems. In the current work, the linear transport equation is solved using two methods: the collided components of the scalar flux algorithm, which is applied by iterating on the scattering source, and the ANISN deterministic computer code. This approach is presented in one dimension with anisotropic scattering orders up to P8 and angular quadrature orders up to S16. Also, the multi-group gamma cross-section library required for this numerical transport simulation is generated in an appropriate discrete form. Finally, photon pulse height distributions are indirectly calculated by deterministic methods and compare favorably with those from Monte Carlo based codes, namely MCNPX and FLUKA.
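
    The "iterating on the scattering source" step mentioned above corresponds to classical source iteration; in a one-speed, isotropically scattering form (generic notation, shown only to fix ideas) it reads:

```latex
% Source iteration for the linear transport equation
% (one-speed, isotropic scattering; \psi angular flux, \phi scalar flux, q fixed source)
\[
  \boldsymbol{\Omega}\cdot\nabla \psi^{(l+1)} + \sigma_t\,\psi^{(l+1)}
  \;=\; \frac{\sigma_s}{4\pi}\,\phi^{(l)} + \frac{q}{4\pi},
  \qquad
  \phi^{(l)}(\mathbf{r}) = \int_{4\pi} \psi^{(l)}(\mathbf{r},\boldsymbol{\Omega}')\,\mathrm{d}\Omega',
\]
% each iterate adding one more collided component to the scalar flux.
```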

  14. Concurrent Monte Carlo transport and fluence optimization with fluence adjusting scalable transport Monte Carlo

    PubMed Central

    Svatos, M.; Zankowski, C.; Bednarz, B.

    2016-01-01

    Purpose: The future of radiation therapy will require advanced inverse planning solutions to support single-arc, multiple-arc, and “4π” delivery modes, which present unique challenges in finding an optimal treatment plan over a vast search space, while still preserving dosimetric accuracy. The successful clinical implementation of such methods would benefit from Monte Carlo (MC) based dose calculation methods, which can offer improvements in dosimetric accuracy when compared to deterministic methods. The standard method for MC based treatment planning optimization leverages the accuracy of the MC dose calculation and the efficiency of well-developed optimization methods by precalculating the fluence-to-dose relationship within a patient with MC methods and subsequently optimizing the fluence weights. However, the sequential nature of this implementation is computationally time-consuming and memory intensive. Methods to reduce the overhead of the MC precalculation have been explored in the past, demonstrating promising reductions of the computational time overhead, but with limited impact on the memory overhead due to the sequential nature of the dose calculation and fluence optimization. The authors propose an entirely new form of “concurrent” Monte Carlo treatment plan optimization: a platform which optimizes the fluence during the dose calculation, reduces wasted computation time spent on beamlets that contribute weakly to the final dose distribution, and requires only a low memory footprint to function. In this initial investigation, the authors explore the key theoretical and practical considerations of optimizing fluence in such a manner. Methods: The authors present a novel derivation and implementation of a gradient descent algorithm that allows for optimization during MC particle transport, based on highly stochastic information generated through particle transport of very few histories. A gradient rescaling and renormalization algorithm, and the
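
    To illustrate the kind of update being described, the following is a deliberately generic stochastic-gradient sketch that adjusts non-negative fluence weights from noisy beamlet-dose estimates. It is not the authors' algorithm (their gradient rescaling and renormalization scheme is not reproduced here); the beamlet-dose matrix, prescription, learning rate, and quadratic objective are all placeholders.

```python
import numpy as np

rng = np.random.default_rng(2)
n_beamlets, n_voxels = 50, 200
D = rng.random((n_voxels, n_beamlets)) * 0.02         # placeholder beamlet-dose matrix
d_target = np.full(n_voxels, 1.0)                     # placeholder prescription per voxel

w = np.ones(n_beamlets)                               # fluence weights, kept non-negative
lr = 0.5
for it in range(2000):
    # "Noisy" dose estimate: only a random subset of beamlets is sampled each step,
    # mimicking the highly stochastic information available during particle transport.
    sample = rng.choice(n_beamlets, size=10, replace=False)
    dose = (D[:, sample] * (n_beamlets / 10)) @ w[sample]   # unbiased estimate of D @ w
    grad = 2 * D.T @ (dose - d_target)                      # gradient of ||D w - d_target||^2
    w -= lr * grad / n_voxels
    np.clip(w, 0.0, None, out=w)                            # keep fluence weights non-negative

print(float(np.mean((D @ w - d_target) ** 2)))              # residual mean-squared dose error
```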

  15. MORSE Monte Carlo radiation transport code system

    SciTech Connect

    Emmett, M.B.

    1983-02-01

    This report is an addendum to the MORSE report, ORNL-4972, originally published in 1975. This addendum contains descriptions of several modifications to the MORSE Monte Carlo Code, replacement pages containing corrections, Part II of the report which was previously unpublished, and a new Table of Contents. The modifications include a Klein-Nishina estimator for gamma rays. Use of such an estimator required changing the cross section routines to process pair production and Compton scattering cross sections directly from ENDF tapes and writing a new version of subroutine RELCOL. Another modification is the use of free-form input for the SAMBO analysis data. This required changing subroutine SCORIN and adding the new subroutine RFRE. References are updated, and errors in the original report have been corrected. (WHK)

  16. Response of thermoluminescent dosimeters to photons simulated with the Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Moralles, M.; Guimarães, C. C.; Okuno, E.

    2005-06-01

    Personal monitors composed of thermoluminescent dosimeters (TLDs) made of natural fluorite (CaF2:NaCl) and lithium fluoride (Harshaw TLD-100) were exposed to gamma and X rays of different qualities. The GEANT4 radiation transport Monte Carlo toolkit was employed to calculate the energy-depth deposition profile in the TLDs. X-ray spectra of the ISO/4037-1 narrow-spectrum series, with peak voltage (kVp) values in the range 20-300 kV, were obtained by simulating a Philips MG-450 X-ray tube with the recommended filters. A realistic photon distribution of a 60Co radiotherapy source was taken from results of Monte Carlo simulations found in the literature. Comparison between simulated and experimental results revealed that the attenuation of emitted light in the readout process of the fluorite dosimeter must be taken into account, while this effect is negligible for lithium fluoride. Differences between results obtained by heating the dosimeter from the irradiated side and from the opposite side allowed the determination of the light attenuation coefficient for CaF2:NaCl (mass proportion 60:40) as 2.2 mm^-1.

  17. Modelling of electron contamination in clinical photon beams for Monte Carlo dose calculation

    NASA Astrophysics Data System (ADS)

    Yang, J.; Li, J. S.; Qin, L.; Xiong, W.; Ma, C.-M.

    2004-06-01

    The purpose of this work is to model electron contamination in clinical photon beams and to commission the source model using measured data for Monte Carlo treatment planning. In this work, a planar source is used to represent the contaminant electrons at a plane above the upper jaws. The source size depends on the dimensions of the field size at the isocentre. The energy spectra of the contaminant electrons are predetermined using Monte Carlo simulations for photon beams from different clinical accelerators. A 'random creep' method is employed to derive the weight of the electron contamination source by matching Monte Carlo calculated monoenergetic photon and electron percent depth-dose (PDD) curves with measured PDD curves. We have integrated this electron contamination source into a previously developed multiple-source model and validated the model for photon beams from Siemens PRIMUS accelerators. The EGS4-based Monte Carlo user codes BEAM and MCSIM were used for linac head simulation and dose calculation. The Monte Carlo calculated dose distributions were compared with measured data. Our results showed good agreement (less than 2% or 2 mm) for 6, 10 and 18 MV photon beams.

  18. Monte Carlo Nucleon Meson Transport Code System.

    Energy Science and Technology Software Center (ESTSC)

    2000-11-17

    Version 00 NMTC/JAERI97 is an upgraded version of the code system NMTC/JAERI, which was developed in 1982 at JAERI and is based on the CCC-161/NMTC code system. NMTC/JAERI97 simulates high energy nuclear reactions and nucleon-meson transport processes.

  19. Monte Carlo simulation of photon migration in a cloud computing environment with MapReduce.

    PubMed

    Pratx, Guillem; Xing, Lei

    2011-12-01

    Monte Carlo simulation is considered the most reliable method for modeling photon migration in heterogeneous media. However, its widespread use is hindered by the high computational cost. The purpose of this work is to report on our implementation of a simple MapReduce method for performing fault-tolerant Monte Carlo computations in a massively-parallel cloud computing environment. We ported the MC321 Monte Carlo package to Hadoop, an open-source MapReduce framework. In this implementation, Map tasks compute photon histories in parallel while a Reduce task scores photon absorption. The distributed implementation was evaluated on a commercial compute cloud. The simulation time was found to be linearly dependent on the number of photons and inversely proportional to the number of nodes. For a cluster size of 240 nodes, the simulation of 100 billion photon histories took 22 min, a 1258 × speed-up compared to the single-threaded Monte Carlo program. The overall computational throughput was 85,178 photon histories per node per second, with a latency of 100 s. The distributed simulation produced the same output as the original implementation and was resilient to hardware failure: the correctness of the simulation was unaffected by the shutdown of 50% of the nodes. PMID:22191916
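
    The Map/Reduce split described above can be illustrated with a toy stand-in that uses Python's built-in multiprocessing instead of Hadoop and a trivial per-interaction absorption/escape model instead of MC321; only the structure (photon histories in the map step, absorption tally in the reduce step) mirrors the paper.

```python
import numpy as np
from multiprocessing import Pool

P_ABSORB, P_ESCAPE = 0.01, 0.02               # toy per-interaction probabilities (assumed)

def map_task(args):
    """'Map': track a batch of photon histories, tally how many end in absorption."""
    n_photons, seed = args
    rng = np.random.default_rng(seed)
    absorbed = 0
    for _ in range(n_photons):
        while True:
            u = rng.random()
            if u < P_ABSORB:                  # history terminates by absorption
                absorbed += 1
                break
            if u < P_ABSORB + P_ESCAPE:       # history terminates by escaping the medium
                break
            # otherwise the photon scatters and the loop continues
    return absorbed

def reduce_task(partials):
    """'Reduce': score total photon absorption across all map outputs."""
    return sum(partials)

if __name__ == "__main__":
    batches = [(1000, seed) for seed in range(8)]        # 8 parallel "map" tasks
    with Pool(4) as pool:
        partials = pool.map(map_task, batches)
    print("total absorbed:", reduce_task(partials))      # expect roughly 1/3 of all histories
```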

  20. Monte Carlo simulation of photon migration in a cloud computing environment with MapReduce

    NASA Astrophysics Data System (ADS)

    Pratx, Guillem; Xing, Lei

    2011-12-01

    Monte Carlo simulation is considered the most reliable method for modeling photon migration in heterogeneous media. However, its widespread use is hindered by the high computational cost. The purpose of this work is to report on our implementation of a simple MapReduce method for performing fault-tolerant Monte Carlo computations in a massively-parallel cloud computing environment. We ported the MC321 Monte Carlo package to Hadoop, an open-source MapReduce framework. In this implementation, Map tasks compute photon histories in parallel while a Reduce task scores photon absorption. The distributed implementation was evaluated on a commercial compute cloud. The simulation time was found to be linearly dependent on the number of photons and inversely proportional to the number of nodes. For a cluster size of 240 nodes, the simulation of 100 billion photon histories took 22 min, a 1258 × speed-up compared to the single-threaded Monte Carlo program. The overall computational throughput was 85,178 photon histories per node per second, with a latency of 100 s. The distributed simulation produced the same output as the original implementation and was resilient to hardware failure: the correctness of the simulation was unaffected by the shutdown of 50% of the nodes.

  1. Monte Carlo simulation of photon migration in a cloud computing environment with MapReduce

    PubMed Central

    Pratx, Guillem; Xing, Lei

    2011-01-01

    Monte Carlo simulation is considered the most reliable method for modeling photon migration in heterogeneous media. However, its widespread use is hindered by the high computational cost. The purpose of this work is to report on our implementation of a simple MapReduce method for performing fault-tolerant Monte Carlo computations in a massively-parallel cloud computing environment. We ported the MC321 Monte Carlo package to Hadoop, an open-source MapReduce framework. In this implementation, Map tasks compute photon histories in parallel while a Reduce task scores photon absorption. The distributed implementation was evaluated on a commercial compute cloud. The simulation time was found to be linearly dependent on the number of photons and inversely proportional to the number of nodes. For a cluster size of 240 nodes, the simulation of 100 billion photon histories took 22 min, a 1258 × speed-up compared to the single-threaded Monte Carlo program. The overall computational throughput was 85,178 photon histories per node per second, with a latency of 100 s. The distributed simulation produced the same output as the original implementation and was resilient to hardware failure: the correctness of the simulation was unaffected by the shutdown of 50% of the nodes. PMID:22191916

  2. Monte Carlo source model for photon beam radiotherapy: photon source characteristics

    SciTech Connect

    Fix, Michael K.; Keall, Paul J.; Dawson, Kathryn; Siebers, Jeffrey V.

    2004-11-01

    A major barrier to widespread clinical implementation of Monte Carlo dose calculation is the difficulty in characterizing the radiation source within a generalized source model. This work aims to develop a generalized three-component source model (target, primary collimator, flattening filter) for 6- and 18-MV photon beams that match full phase-space data (PSD). Subsource-by-subsource comparison of dose distributions, using either source PSD or the source model as input, allows accurate source characterization and has the potential to ease the commissioning procedure, since it is possible to obtain information about which subsource needs to be tuned. This source model is unique in that, compared to previous source models, it retains additional correlations among PS variables, which improves accuracy at nonstandard source-to-surface distances (SSDs). In our study, three-dimensional (3D) dose calculations were performed for SSDs ranging from 50 to 200 cm and for field sizes from 1x1 to 30x30 cm², as well as a 10x10 cm² field 5 cm off axis in each direction. The 3D dose distributions, using either full PSD or the source model as input, were compared in terms of dose difference and distance to agreement. With this model, over 99% of the voxels agreed within ±1% or 1 mm for the target, within 2% or 2 mm for the primary collimator, and within ±2.5% or 2 mm for the flattening filter in all cases studied. For the dose distributions, 99% of the dose voxels agreed within 1% or 1 mm when the combined source model--including a charged particle source and the full PSD as input--was used. The accurate and general characterization of each photon source and knowledge of the subsource dose distributions should facilitate source model commissioning procedures by allowing the histogram distributions representing the subsources to be scaled and tuned.

  3. Monte Carlo Assessments of Absorbed Doses to the Hands of Radiopharmaceutical Workers Due to Photon Emitters

    SciTech Connect

    Ilas, Dan; Eckerman, Keith F; Karagiannis, Harriet

    2009-01-01

    This paper describes the characterization of radiation doses to the hands of nuclear medicine technicians resulting from the handling of radiopharmaceuticals. Radiation monitoring using ring dosimeters indicates that finger dosimeters that are used to show compliance with applicable regulations may overestimate or underestimate radiation doses to the skin depending on the nature of the particular procedure and the radionuclide being handled. To better understand the parameters governing the absorbed dose distributions, a detailed model of the hands was created and used in Monte Carlo simulations of selected nuclear medicine procedures. Simulations of realistic configurations typical for workers handling radiopharmaceuticals were performed for a range of energies of the source photons. The lack of charged-particle equilibrium necessitated full photon-electron coupled transport calculations. The results show that the dose to different regions of the fingers can differ substantially from dosimeter readings when dosimeters are located at the base of the finger. We tried to identify consistent patterns that relate the actual dose to the dosimeter readings. These patterns depend on the specific work conditions and can be used to better assess the absorbed dose to different regions of the exposed skin.

  4. Specific absorbed fractions of electrons and photons for Rad-HUMAN phantom using Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Wang, Wen; Cheng, Meng-Yun; Long, Peng-Cheng; Hu, Li-Qin

    2015-07-01

    The specific absorbed fractions (SAF) for self- and cross-irradiation are effective tools for the internal dose estimation of inhalation and ingestion intakes of radionuclides. A set of SAFs for photons and electrons was calculated using the Rad-HUMAN phantom, a computational voxel phantom of a Chinese adult female created from the color photographic images of the Chinese Visible Human (CVH) data set by the FDS Team. The model can represent most anatomical characteristics of the Chinese adult female and can be taken as an individual phantom to investigate differences in internal dose with respect to Caucasians. In this study, the emission of mono-energetic photons and electrons with energies from 10 keV to 4 MeV was simulated using the Monte Carlo particle transport code MCNP. Results were compared with the values from the ICRP reference and ORNL models. The results showed that the SAFs from the Rad-HUMAN phantom have similar trends but are larger than those from the other two models. The differences were due to the racial and anatomical differences in organ mass and inter-organ distance. The SAFs based on the Rad-HUMAN phantom provide accurate and reliable data for internal radiation dose calculations for Chinese females. Supported by the Strategic Priority Research Program of the Chinese Academy of Sciences (XDA03040000), the National Natural Science Foundation of China (910266004, 11305205, 11305203) and the National Special Program for ITER (2014GB112001)
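
    For context, the specific absorbed fraction used above has the standard definition below; the notation is generic and the source/target labels are illustrative.

```latex
% Specific absorbed fraction for a target region T irradiated by a source region S:
%   \phi(T \leftarrow S) - fraction of the energy emitted in S that is absorbed in T
%   m_T                  - mass of the target region
\[
  \Phi(T \leftarrow S) \;=\; \frac{\phi(T \leftarrow S)}{m_T}
  \;=\; \frac{E_{\mathrm{absorbed\ in}\ T}}{E_{\mathrm{emitted\ in}\ S}\; m_T}
  \qquad [\mathrm{kg}^{-1}],
\]
% so a Monte Carlo estimate is the energy deposited in T per simulated source
% particle, divided by the emitted energy per particle and by m_T.
```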

  5. Monte Carlo Simulation of Light Transport in Tissue, Beta Version

    Energy Science and Technology Software Center (ESTSC)

    2003-12-09

    Understanding light-tissue interaction is fundamental in the field of Biomedical Optics. It has important implications for both therapeutic and diagnostic technologies. In this program, light transport in scattering tissue is modeled by absorption and scattering events as each photon travels through the tissue. The path of each photon is determined statistically by calculating the probabilities of scattering and absorption. Other measured quantities are total reflected light, total transmitted light, and total heat absorbed.

  6. Performance analysis of the Monte Carlo code MCNP4A for photon-based radiotherapy applications

    SciTech Connect

    DeMarco, J.J.; Solberg, T.D.; Wallace, R.E.; Smathers, J.B.

    1995-12-31

    The Los Alamos code MCNP4A (Monte Carlo N-Particle version 4A) is currently used to simulate a variety of problems ranging from nuclear reactor analysis to boron neutron capture therapy. This study is designed to evaluate MCNP4A as the dose calculation system for photon-based radiotherapy applications. A graphical user interface, MCNPRT (MCNP Radiation Therapy), has been developed which automatically sets up the geometry and photon source requirements for three-dimensional simulations using computed tomography (CT) data. Preliminary results suggest the code is capable of calculating satisfactory dose distributions in a variety of simulated homogeneous and heterogeneous phantoms. The major drawback of this dosimetry system is the amount of time required to obtain a statistically significant answer. MCNPRT allows the user to analyze the performance of MCNP4A as a function of material, geometry resolution, and MCNP4A photon and electron physics parameters. A typical simulation geometry consists of a 10 MV photon point source incident on a 15 x 15 x 15 cm³ phantom composed of water voxels ranging in size from 10 x 10 x 10 mm³ to 2 x 2 x 2 mm³. As the voxel size is decreased, a larger percentage of time is spent tracking photons through the voxelized geometry as opposed to the secondary electrons. A PRPR patch file is under development that will optimize photon transport within the simulation phantom specifically for radiotherapy applications. MCNP4A also supports parallel processing capabilities via the Parallel Virtual Machine (PVM) message passing system. A dedicated network of five SUN SPARC2 processors produced a wall-clock speedup of 4.4 based on a simulation phantom containing 5 x 5 x 5 mm³ water voxels. The code was also tested on the 80-node IBM RS/6000 cluster at the Maui High Performance Computing Center (MHPCC). A non-dedicated system of 75 processors produced a wall-clock speedup of 29 relative to one SUN SPARC2 computer.

  7. Efficient, Automated Monte Carlo Methods for Radiation Transport

    PubMed Central

    Kong, Rong; Ambrose, Martin; Spanier, Jerome

    2012-01-01

    Monte Carlo simulations provide an indispensable model for solving radiative transport problems, but their slow convergence inhibits their use as an everyday computational tool. In this paper, we present two new ideas for accelerating the convergence of Monte Carlo algorithms, based upon an efficient algorithm that couples simulations of forward and adjoint transport equations. Forward random walks are first processed in stages, each using a fixed sample size, and information from stage k is used to alter the sampling and weighting procedure in stage k + 1. This produces rapid geometric convergence and accounts for dramatic gains in the efficiency of the forward computation. In case still greater accuracy is required in the forward solution, information from an adjoint simulation can be added to extend the geometric learning of the forward solution. The resulting new approach should find widespread use when fast, accurate simulations of the transport equation are needed. PMID:23226872

  8. Monte Carlo simulation and experimental measurement of a nonspectroscopic radiation portal monitor for photon detection efficiencies of internally deposited radionuclides

    NASA Astrophysics Data System (ADS)

    Carey, Matthew Glen

    Particle transport of radionuclide photons using the Monte Carlo N-Particle computer code can be used to determine a portal monitor's photon detection efficiency, in units of counts per photon, for internally deposited radionuclides. Good agreement has been found with experimental results for radionuclides that emit higher-energy photons, such as Cs-137 and Co-60. Detection efficiency for radionuclides that emit lower-energy photons, such as Am-241, depends greatly on the effective discriminator energy level of the portal monitor as well as on any attenuating material between the source and detectors. This evaluation uses a chi-square approach to determine the best-fit discriminator level of a non-spectroscopic portal monitor when the effective discriminator level, in units of energy, is not known. Internal detection efficiencies were evaluated experimentally using an anthropomorphic phantom with NIST-traceable sources at various internal locations, and by simulation using MCNP5. The results of this research show that MCNP5 can be an effective tool for simulation of photon detection efficiencies, given a known discriminator level, for internally and externally deposited radionuclides. In addition, MCNP5 can be used for bounding personnel doses from either internally or externally deposited mixtures of radionuclides.
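
    A minimal sketch of the chi-square search described above; the arrays, threshold grid, and single-efficiency comparison are hypothetical, whereas a real analysis would use the MCNP5 pulse-height results binned in energy and several measured source/geometry combinations.

```python
import numpy as np

def best_discriminator(energy_bins_keV, sim_spectrum, thresholds_keV,
                       measured_eff, measured_sigma, photons_per_decay):
    """Scan candidate discriminator energies and return the chi-square-minimizing one.

    For each threshold, the simulated efficiency is the spectrum integral above the
    threshold divided by the number of source photons.
    """
    chi2 = []
    for thr in thresholds_keV:
        sim_eff = sim_spectrum[energy_bins_keV >= thr].sum() / photons_per_decay
        chi2.append(((sim_eff - measured_eff) / measured_sigma) ** 2)
    chi2 = np.array(chi2)
    return thresholds_keV[int(np.argmin(chi2))], chi2

# Placeholder example with a toy simulated spectrum
bins = np.arange(0, 700, 10.0)                           # keV
spectrum = np.exp(-bins / 200.0)                         # toy simulated counts per bin
thr_grid = np.arange(20, 200, 5.0)
best, chi2 = best_discriminator(bins, spectrum, thr_grid,
                                measured_eff=0.9, measured_sigma=0.05,
                                photons_per_decay=spectrum.sum() / 0.9)
print("best-fit discriminator (keV):", best)
```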

  9. Overview and applications of the Monte Carlo radiation transport kit at LLNL

    SciTech Connect

    Sale, K E

    1999-06-23

    Modern Monte Carlo radiation transport codes can be applied to model most applications of radiation, from optical to TeV photons, from thermal neutrons to heavy ions. Simulations can include any desired level of detail in three-dimensional geometries using the right level of detail in the reaction physics. The technology areas to which we have applied these codes include medical applications, defense, safety and security programs, nuclear safeguards, and industrial and research system design and control. The main reason such applications are interesting is that by using these tools substantial savings of time and effort (i.e. money) can be realized. In addition it is possible to separate out and investigate computationally effects which cannot be isolated and studied in experiments. In model calculations, just as in real life, one must take care in order to get the correct answer to the right question. Advancing computing technology allows extensions of Monte Carlo applications in two directions. First, as computers become more powerful, more problems can be accurately modeled. Second, as computing power becomes cheaper, Monte Carlo methods become more widely accessible. An overview of the set of Monte Carlo radiation transport tools in use at LLNL will be presented, along with a few examples of applications and future directions.

  10. Comparison of RTPS and Monte Carlo dose distributions in heterogeneous phantoms for photon beams.

    PubMed

    Nakaguchi, Yuji; Araki, Fujio; Maruyama, Masato; Fukuda, Shogo

    2010-04-20

    The purpose of this study was to compare dose distributions from three different RTPS with those from Monte Carlo (MC) calculations and measurements in heterogeneous phantoms for photon beams. This study used four algorithms from the RTPS: AAA (analytical anisotropic algorithm) implemented in the Eclipse (Varian Medical Systems) treatment planning system, CC (collapsed cone) superposition from Pinnacle (Philips), and MGS (multigrid superposition) and FFT (fast Fourier transform) convolution from XiO (CMS). The dose distributions from these algorithms were compared with those from MC and measurements in a set of heterogeneous phantoms. Eclipse/AAA underestimated the dose inside the lung region for the low energies of 4 and 6 MV. This is because Eclipse/AAA does not adequately account for the scaling of the pencil-beam spread (lateral electron transport) based on changes in the electron density at low photon energies. The dose distributions from Pinnacle/CC and XiO/MGS agree closely with those of MC and measurements at low photon energies, but the errors increase at the higher energy of 15 MV, especially for a small field of 3x3 cm². The FFT convolution greatly overestimated the dose inside the lung slab compared to MC. The dose distributions from the superposition algorithms agree closely with those from MC as well as with measured values at 4 and 6 MV. The dose errors for Eclipse/AAA are larger in lung-model phantoms for 4 and 6 MV. It is necessary to use algorithms comparable to superposition for accurate dose calculations in heterogeneous regions. PMID:20625219

  11. Topologically robust transport of entangled photons in a 2D photonic system.

    PubMed

    Mittal, Sunil; Orre, Venkata Vikram; Hafezi, Mohammad

    2016-07-11

    We theoretically study the transport of time-bin entangled photon pairs in a two-dimensional topological photonic system of coupled ring resonators. This system implements the integer quantum Hall model using a synthetic gauge field and exhibits topologically robust edge states. We show that the transport through edge states preserves temporal correlations of entangled photons whereas bulk transport does not preserve these correlations and can lead to significant unwanted temporal bunching or anti-bunching of photons. We study the effect of disorder on the quantum transport properties; while the edge transport remains robust, bulk transport is very susceptible, and in the limit of strong disorder, bulk states become localized. We show that this localization is manifested as an enhanced bunching/anti-bunching of photons. This topologically robust transport of correlations through edge states could enable robust on-chip quantum communication channels and delay lines for information encoded in temporal correlations of photons. PMID:27410836

  12. GPU-accelerated object-oriented Monte Carlo modeling of photon migration in turbid media

    NASA Astrophysics Data System (ADS)

    Doronin, Alex; Meglinski, Igor

    2010-10-01

    Due to the recent intense developments in lasers and optical technologies, a number of novel revolutionary imaging and photonic-based diagnostic modalities have arisen. Utilizing various features of light, these techniques provide new practical solutions in a range of biomedical, environmental and industrial applications. Conceptual engineering design of new optical diagnostic systems requires a clear understanding of the light-tissue interaction and the peculiarities of optical radiation propagation therein. The description of photon migration within random media is based on radiative transfer theory, which forms the basis of Monte Carlo modelling of light propagation in complex turbid media such as biological tissues. In the current presentation, as a further development of the Monte Carlo technique, we introduce a novel Object-Oriented Programming (OOP) paradigm accelerated by a Graphics Processing Unit that provides an opportunity to increase the performance of standard Monte Carlo simulation by more than 100 times.

  13. GPU-accelerated object-oriented Monte Carlo modeling of photon migration in turbid media

    NASA Astrophysics Data System (ADS)

    Doronin, Alex; Meglinski, Igor

    2011-03-01

    Due to the recent intense developments in lasers and optical technologies, a number of novel revolutionary imaging and photonic-based diagnostic modalities have arisen. Utilizing various features of light, these techniques provide new practical solutions in a range of biomedical, environmental and industrial applications. Conceptual engineering design of new optical diagnostic systems requires a clear understanding of the light-tissue interaction and the peculiarities of optical radiation propagation therein. The description of photon migration within random media is based on radiative transfer theory, which forms the basis of Monte Carlo modelling of light propagation in complex turbid media such as biological tissues. In the current presentation, as a further development of the Monte Carlo technique, we introduce a novel Object-Oriented Programming (OOP) paradigm accelerated by a Graphics Processing Unit that provides an opportunity to increase the performance of standard Monte Carlo simulation by more than 100 times.

  14. Monte Carlo simulations of charge transport in heterogeneous organic semiconductors

    NASA Astrophysics Data System (ADS)

    Aung, Pyie Phyo; Khanal, Kiran; Luettmer-Strathmann, Jutta

    2015-03-01

    The efficiency of organic solar cells depends on the morphology and electronic properties of the active layer. Research teams have been experimenting with different conducting materials to achieve more efficient solar panels. In this work, we perform Monte Carlo simulations to study charge transport in heterogeneous materials. We have developed a coarse-grained lattice model of polymeric photovoltaics and use it to generate active layers with ordered and disordered regions. We determine carrier mobilities for a range of conditions to investigate the effect of the morphology on charge transport.

  15. Monte Carlo Modeling of Photon Interrogation Methods for Characterization of Special Nuclear Material

    SciTech Connect

    Pozzi, Sara A; Downar, Thomas J; Padovani, Enrico; Clarke, Shaun D

    2006-01-01

    This work illustrates a methodology based on photon interrogation and coincidence counting for determining the characteristics of fissile material. The feasibility of the proposed methods was demonstrated using a Monte Carlo code system to simulate the full statistics of the neutron and photon field generated by the photon interrogation of fissile and non-fissile materials. Time correlation functions between detectors were simulated for photon beam-on and photon beam-off operation. In the latter case, the correlation signal is obtained via delayed neutrons from photofission, which induce further fission chains in the nuclear material. An analysis methodology was demonstrated based on features selected from the simulated correlation functions and on the use of artificial neural networks. We show that the methodology can reliably differentiate between highly enriched uranium and plutonium. Furthermore, the mass of the material can be determined with a relative error of about 12%. Keywords: MCNP, MCNP-PoliMi, Artificial neural network, Correlation measurement, Photofission

  16. Monte Carlo based beam model using a photon MLC for modulated electron radiotherapy

    SciTech Connect

    Henzen, D.; Manser, P.; Frei, D.; Volken, W.; Born, E. J.; Vetterli, D.; Chatelain, C.; Fix, M. K.; Neuenschwander, H.; Stampanoni, M. F. M.

    2014-02-15

    Purpose: Modulated electron radiotherapy (MERT) promises sparing of organs at risk for certain tumor sites. Any implementation of MERT treatment planning requires an accurate beam model. The aim of this work is the development of a beam model which reconstructs electron fields shaped using the Millennium photon multileaf collimator (MLC) (Varian Medical Systems, Inc., Palo Alto, CA) for a Varian linear accelerator (linac). Methods: This beam model is divided into an analytical part (two photon and two electron sources) and a Monte Carlo (MC) transport through the MLC. For dose calculation purposes the beam model has been coupled with a macro MC dose calculation algorithm. The commissioning process requires a set of measurements and precalculated MC input. The beam model has been commissioned at a source to surface distance of 70 cm for a Clinac 23EX (Varian Medical Systems, Inc., Palo Alto, CA) and a TrueBeam linac (Varian Medical Systems, Inc., Palo Alto, CA). For validation purposes, measured and calculated depth dose curves and dose profiles are compared for four different MLC shaped electron fields and all available energies. Furthermore, a measured two-dimensional dose distribution for patched segments consisting of three 18 MeV segments, three 12 MeV segments, and a 9 MeV segment is compared with corresponding dose calculations. Finally, measured and calculated two-dimensional dose distributions are compared for a circular segment encompassed with a C-shaped segment. Results: For 15 × 34, 5 × 5, and 2 × 2 cm² fields, differences between water phantom measurements and calculations using the beam model coupled with the macro MC dose calculation algorithm are generally within 2% of the maximal dose value or 2 mm distance to agreement (DTA) for all electron beam energies. For a more complex MLC pattern, differences between measurements and calculations are generally within 3% of the maximal dose value or 3 mm DTA for all electron beam energies. For the

  17. Monte Carlo radiation transport: A revolution in science

    SciTech Connect

    Hendricks, J.

    1993-04-01

    When Enrico Fermi, Stan Ulam, Nicholas Metropolis, John von Neumann, and Robert Richtmyer invented the Monte Carlo method fifty years ago, little could they imagine the far-flung consequences, the international applications, and the revolution in science epitomized by their abstract mathematical method. The Monte Carlo method is used in a wide variety of fields to solve exact computational models approximately by statistical sampling. It is an alternative to traditional physics modeling methods which solve approximate computational models exactly by deterministic methods. Modern computers and improved methods, such as variance reduction, have enhanced the method to the point of enabling a true predictive capability in areas such as radiation or particle transport. This predictive capability has contributed to a radical change in the way science is done: design and understanding come from computations built upon experiments rather than being limited to experiments, and the computer codes doing the computations have become the repository for physics knowledge. The MCNP Monte Carlo computer code effort at Los Alamos is an example of this revolution. Physicians unfamiliar with physics details can design cancer treatments using physics buried in the MCNP computer code. Hazardous environments and hypothetical accidents can be explored. Many other fields, from underground oil well exploration to aerospace, from physics research to energy production, from safety to bulk materials processing, benefit from MCNP, the Monte Carlo method, and the revolution in science.

  18. Modeling parameterized geometry in GPU-based Monte Carlo particle transport simulation for radiotherapy

    NASA Astrophysics Data System (ADS)

    Chi, Yujie; Tian, Zhen; Jia, Xun

    2016-08-01

    Monte Carlo (MC) particle transport simulation on a graphics-processing unit (GPU) platform has been extensively studied recently due to the efficiency advantage achieved via massive parallelization. Almost all of the existing GPU-based MC packages were developed for voxelized geometry, which has limited the application scope of these packages. The purpose of this paper is to develop a module to model parametric geometry and integrate it into GPU-based MC simulations. In our module, each continuous region was defined by its bounding surfaces, which were parameterized by quadratic functions. Particle navigation functions in this geometry were developed. The module was incorporated into two previously developed GPU-based MC packages and was tested on two example problems: (1) low-energy photon transport simulation in a brachytherapy case with a shielded cylinder applicator and (2) MeV coupled photon/electron transport simulation in a phantom containing several inserts of different shapes. In both cases, the calculated dose distributions agreed well with those calculated in the corresponding voxelized geometry. The averaged dose differences were 1.03% and 0.29%, respectively. We also used the developed package to perform simulations of a Varian VS 2000 brachytherapy source and generated a phase-space file. The computation time under the parameterized geometry depended on the memory location storing the geometry data. When the data were stored in the GPU’s shared memory, the highest computational speed was achieved. Incorporation of parameterized geometry yielded a computation time that was ~3 times that of the corresponding voxelized geometry. We also developed a strategy to use an auxiliary index array to reduce the frequency of geometry calculations and hence improve efficiency. With this strategy, the computational time ranged from 1.75 to 2.03 times that of the voxelized geometry for coupled photon/electron transport, depending on the voxel dimension of the auxiliary index array, and in 0
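
    The particle-navigation step for quadratic bounding surfaces reduces to solving a quadratic in the flight distance; the sketch below (not the authors' GPU code) finds the distance to a generic implicit quadric f(x) = xᵀQx + P·x + R = 0.

```python
import numpy as np

def distance_to_quadric(pos, direction, Q, P, R, eps=1e-9):
    """Smallest positive flight distance t with f(pos + t*dir) = 0 for the quadric
    f(x) = x^T Q x + P . x + R; returns np.inf if the surface is not hit."""
    a = direction @ Q @ direction
    b = 2.0 * pos @ Q @ direction + P @ direction
    c = pos @ Q @ pos + P @ pos + R
    if abs(a) < eps:                                   # degenerate (planar) case
        return -c / b if b != 0 and -c / b > eps else np.inf
    disc = b * b - 4 * a * c
    if disc < 0:
        return np.inf                                  # flight line misses the surface
    sqrt_disc = np.sqrt(disc)
    for t in sorted(((-b - sqrt_disc) / (2 * a), (-b + sqrt_disc) / (2 * a))):
        if t > eps:
            return t
    return np.inf

# Example: unit sphere x^2 + y^2 + z^2 - 1 = 0, particle at the origin moving along +z
Q = np.eye(3); P = np.zeros(3); R = -1.0
print(distance_to_quadric(np.zeros(3), np.array([0.0, 0.0, 1.0]), Q, P, R))  # -> 1.0
```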

  19. Modeling parameterized geometry in GPU-based Monte Carlo particle transport simulation for radiotherapy.

    PubMed

    Chi, Yujie; Tian, Zhen; Jia, Xun

    2016-08-01

    Monte Carlo (MC) particle transport simulation on a graphics-processing unit (GPU) platform has been extensively studied recently due to the efficiency advantage achieved via massive parallelization. Almost all of the existing GPU-based MC packages were developed for voxelized geometry, which has limited the application scope of these packages. The purpose of this paper is to develop a module to model parametric geometry and integrate it into GPU-based MC simulations. In our module, each continuous region was defined by its bounding surfaces, which were parameterized by quadratic functions. Particle navigation functions in this geometry were developed. The module was incorporated into two previously developed GPU-based MC packages and was tested on two example problems: (1) low-energy photon transport simulation in a brachytherapy case with a shielded cylinder applicator and (2) MeV coupled photon/electron transport simulation in a phantom containing several inserts of different shapes. In both cases, the calculated dose distributions agreed well with those calculated in the corresponding voxelized geometry. The averaged dose differences were 1.03% and 0.29%, respectively. We also used the developed package to perform simulations of a Varian VS 2000 brachytherapy source and generated a phase-space file. The computation time under the parameterized geometry depended on the memory location storing the geometry data. When the data were stored in the GPU's shared memory, the highest computational speed was achieved. Incorporation of parameterized geometry yielded a computation time that was ~3 times that of the corresponding voxelized geometry. We also developed a strategy to use an auxiliary index array to reduce the frequency of geometry calculations and hence improve efficiency. With this strategy, the computational time ranged from 1.75 to 2.03 times that of the voxelized geometry for coupled photon/electron transport, depending on the voxel dimension of the auxiliary index array, and in 0

  20. Composition PDF/photon Monte Carlo modeling of moderately sooting turbulent jet flames

    SciTech Connect

    Mehta, R.S.; Haworth, D.C.; Modest, M.F.

    2010-05-15

    A comprehensive model for luminous turbulent flames is presented. The model features detailed chemistry, radiation and soot models and state-of-the-art closures for turbulence-chemistry interactions and turbulence-radiation interactions. A transported probability density function (PDF) method is used to capture the effects of turbulent fluctuations in composition and temperature. The PDF method is extended to include soot formation. Spectral gas and soot radiation is modeled using a (particle-based) photon Monte Carlo method coupled with the PDF method, thereby capturing both emission and absorption turbulence-radiation interactions. An important element of this work is that the gas-phase chemistry and soot models that have been thoroughly validated across a wide range of laminar flames are used in turbulent flame simulations without modification. Six turbulent jet flames are simulated with Reynolds numbers varying from 6700 to 15,000, two fuel types (pure ethylene, 90% methane-10% ethylene blend) and different oxygen concentrations in the oxidizer stream (from 21% O2 to 55% O2). All simulations are carried out with a single set of physical and numerical parameters (model constants). Uniformly good agreement between measured and computed mean temperatures, mean soot volume fractions and (where available) radiative fluxes is found across all flames. This demonstrates that with the combination of a systematic approach and state-of-the-art physical models and numerical algorithms, it is possible to simulate a broad range of luminous turbulent flames with a single model.

  1. Monte Carlo simulation of photon buildup factors for shielding materials in diagnostic x-ray facilities

    SciTech Connect

    Kharrati, Hedi; Agrebi, Amel; Karoui, Mohamed Karim

    2012-10-15

    Purpose: A simulation of buildup factors for ordinary concrete, steel, lead, plate glass, lead glass, and gypsum wallboard in broad beam geometry for photon energies from 10 keV to 150 keV at 5 keV intervals is presented. Methods: The Monte Carlo N-Particle radiation transport computer code has been used to determine the buildup factors for the studied shielding materials. Results: An example illustrating the use of the obtained buildup factor data to compute broad beam transmission for tube potentials of 70, 100, 120, and 140 kVp is given. The half value layer, the tenth value layer, and the equilibrium tenth value layer are calculated from the broad beam transmission for these tube potentials. Conclusions: The obtained values, compared with those calculated from published data, show the ability of these data to predict shielding transmission curves. Therefore, the buildup factor data can be combined with primary, scatter, and leakage x-ray spectra to provide a computationally based solution to broad beam transmission for barriers in shielding x-ray facilities.
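
    As a rough illustration of how tabulated buildup factors enter a broad-beam calculation, the sketch below folds a kerma-weighted spectrum with B(E, μx)·exp(−μx) and solves for the half-value layer by root finding. The energy grid, attenuation coefficients, spectral weights and the linear buildup model are placeholder assumptions, not the paper's Monte Carlo data.

```python
import numpy as np
from scipy.optimize import brentq

# Hypothetical tabulated data for one material (illustrative numbers only):
E_keV = np.array([30., 60., 100., 150.])        # photon energies
mu_cm = np.array([1.20, 0.60, 0.45, 0.38])      # linear attenuation, 1/cm
w     = np.array([0.2, 0.4, 0.3, 0.1])          # kerma-weighted spectrum

def buildup(E, mfp):
    """Placeholder buildup factor B(E, mu*x); in practice one would
    interpolate the Monte Carlo tables published in the paper."""
    return 1.0 + 0.8 * mfp                      # crude linear model

def broad_beam_transmission(x_cm):
    """Spectrum-averaged broad-beam transmission through thickness x_cm."""
    mfp = mu_cm * x_cm
    return np.sum(w * buildup(E_keV, mfp) * np.exp(-mfp)) / np.sum(w)

# Half-value layer: the thickness at which the transmission drops to 0.5.
hvl = brentq(lambda x: broad_beam_transmission(x) - 0.5, 1e-6, 50.0)
print(f"HVL ~ {hvl:.2f} cm (illustrative data)")
```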

  2. SIMIND Monte Carlo simulation of a single photon emission CT

    PubMed Central

    Bahreyni Toossi, M. T.; Islamian, J. Pirayesh; Momennezhad, M.; Ljungberg, M.; Naseri, S. H.

    2010-01-01

    In this study, we simulated a Siemens E.CAM SPECT system using the SIMIND Monte Carlo program to obtain its experimental characterization in terms of energy resolution, sensitivity, spatial resolution and imaging of phantoms using 99mTc. The experimental and simulation data for SPECT imaging were acquired from a point source and a Jaszczak phantom. Verification of the simulation was done by comparing two sets of images and related data obtained from the actual and simulated systems. Image quality was assessed by comparing image contrast and resolution. Simulated and measured energy spectra (with or without a collimator) and spatial resolution from point sources in air were compared. The resulting energy spectra show similar peaks at the 140 keV gamma energy of 99mTc. The FWHM was calculated as 14.01 keV for the simulation and 13.80 keV for the experimental data, corresponding to energy resolutions of 10.01% and 9.86%, respectively, compared with the specified 9.9% for both systems. Sensitivities of the real and virtual gamma cameras were calculated as 85.11 and 85.39 cps/MBq, respectively. The energy spectra of the simulated and real gamma cameras were matched. Images obtained from the Jaszczak phantom, experimentally and by simulation, showed similar contrast and resolution. SIMIND Monte Carlo could successfully simulate the Siemens E.CAM gamma camera. The results validate the use of the simulated system for further investigation, including modification, planning, and developing a SPECT system to improve the quality of images. PMID:20177569
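
    The quoted energy resolutions follow directly from the FWHM values at the 140 keV photopeak, using the usual definition R = 100 × FWHM / E_peak; a one-line check:

```python
def energy_resolution(fwhm_keV, peak_keV=140.0):
    """Percent energy resolution at the 99mTc photopeak."""
    return 100.0 * fwhm_keV / peak_keV

print(f"simulation:  {energy_resolution(14.01):.2f} %")   # ~10.01 %
print(f"measurement: {energy_resolution(13.80):.2f} %")   # ~9.86 %
```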

  3. Analytical band Monte Carlo analysis of electron transport in silicene

    NASA Astrophysics Data System (ADS)

    Yeoh, K. H.; Ong, D. S.; Ooi, C. H. Raymond; Yong, T. K.; Lim, S. K.

    2016-06-01

    An analytical band Monte Carlo (AMC) model with linear energy band dispersion has been developed to study electron transport in suspended silicene and in silicene on an aluminium oxide (Al2O3) substrate. We calibrated our model against full band Monte Carlo (FMC) results by matching the velocity-field curve. Using this model, we find that the combined effects of charge impurity scattering and surface optical phonon scattering can degrade the electron mobility to about 400 cm2 V‑1 s‑1, beyond which the mobility becomes less sensitive to changes in the substrate charge impurities and surface optical phonons. We also found that the further reduction of mobility to ∼100 cm2 V‑1 s‑1 demonstrated experimentally by Tao et al (2015 Nat. Nanotechnol. 10 227) can only be explained by the renormalization of the Fermi velocity due to interaction with the Al2O3 substrate.

  4. Monte Carlo calculation of dose rate conversion factors for external exposure to photon emitters in soil.

    PubMed

    Clouvas, A; Xanthos, S; Antonopoulos-Domis, M; Silva, J

    2000-03-01

    The dose rate conversion factors D(CF) (absorbed dose rate in air per unit activity per unit of soil mass, nGy h(-1) per Bq kg(-1)) are calculated 1 m above ground for photon emitters of natural radionuclides uniformly distributed in the soil. Three Monte Carlo codes are used: 1) the MCNP code of Los Alamos; 2) the GEANT code of CERN; and 3) a Monte Carlo code developed in the Nuclear Technology Laboratory of the Aristotle University of Thessaloniki. The accuracy of the Monte Carlo results is tested by comparing the unscattered flux obtained by the three Monte Carlo codes with an independent straightforward calculation. All codes, and MCNP in particular, accurately calculate the absorbed dose rate in air due to unscattered radiation. For the total radiation (unscattered plus scattered), the D(CF) values calculated by the three codes are in very good agreement with one another. Comparison between these results and results deduced previously by other authors indicates good agreement (less than 15% difference) for photon energies above 1,500 keV. In contrast, the agreement is poorer (differences of 20-30%) for low-energy photons. PMID:10688452
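
    The "independent straightforward calculation" of the unscattered flux can be reproduced in essence by integrating an attenuated point kernel over the uniformly contaminated half-space below the detector. The sketch below does this numerically for a detector 1 m above ground; the attenuation coefficients and unit source strength are rough illustrative values, not those used in the paper.

```python
import numpy as np
from scipy.integrate import dblquad

# Illustrative parameters (approximate values, not the paper's data):
h    = 100.0      # detector height above ground, cm
mu_s = 0.083      # soil linear attenuation around 1.46 MeV, 1/cm (approx.)
mu_a = 6.5e-5     # air linear attenuation around 1.46 MeV, 1/cm (approx.)
S_v  = 1.0        # photons / (cm^3 s), uniform volume source in the soil

def integrand(rho, z):
    """Unscattered-flux contribution of a ring source element at depth z
    and radial distance rho from the detector axis."""
    r  = np.hypot(rho, h + z)
    rs = r * z / (h + z)          # path length travelled in soil
    ra = r * h / (h + z)          # path length travelled in air
    return S_v * rho * np.exp(-mu_s * rs - mu_a * ra) / (2.0 * r**2)

# Truncate the half-space at depths/radii where attenuation makes further
# contributions negligible.
flux, _ = dblquad(integrand, 0.0, 100.0, 0.0, 1.0e4)
print(f"unscattered flux ~ {flux:.3e} photons / (cm^2 s)")
```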

  5. Parallel Monte Carlo Synthetic Acceleration methods for discrete transport problems

    NASA Astrophysics Data System (ADS)

    Slattery, Stuart R.

    This work researches and develops Monte Carlo Synthetic Acceleration (MCSA) methods as a new class of solution techniques for discrete neutron transport and fluid flow problems. Monte Carlo Synthetic Acceleration methods use a traditional Monte Carlo process to approximate the solution to the discrete problem as a means of accelerating traditional fixed-point methods. To apply these methods to neutronics and fluid flow and determine their feasibility on modern hardware, three complementary research and development exercises are performed. First, solutions to the SPN discretization of the linear Boltzmann neutron transport equation are obtained using MCSA, with a difficult criticality calculation for a light water reactor fuel assembly used as the driving problem. To enable MCSA as a solution technique, a group of modern preconditioning strategies is researched. Compared to conventional Krylov methods, MCSA demonstrated improved iterative performance, converging in fewer iterations than GMRES with the same preconditioning. Second, solutions to the compressible Navier-Stokes equations were obtained by developing the Forward-Automated Newton-MCSA (FANM) method for nonlinear systems based on Newton's method. Three difficult fluid benchmark problems in both convective and driven flow regimes were used to drive the research and development of the method. For 8 out of 12 benchmark cases, it was found that FANM had better iterative performance than the Newton-Krylov method, converging the nonlinear residual in fewer linear solver iterations with the same preconditioning. Third, a new domain-decomposed algorithm to parallelize MCSA, aimed at leveraging leadership-class computing facilities, was developed by utilizing parallel strategies from the radiation transport community. The new algorithm utilizes the Multiple-Set Overlapping-Domain strategy in an attempt to reduce parallel overhead and add a natural element of replication to the algorithm. It
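
    A deliberately small sketch of the MCSA idea: a Richardson iteration whose residual equation is solved approximately by forward Neumann-Ulam random walks. The 3×3 system, walk counts and termination rule below are illustrative assumptions, far removed from the preconditioned SPN and Navier-Stokes systems studied in the thesis.

```python
import numpy as np
rng = np.random.default_rng(7)

def neumann_ulam(H, b, n_walks=500):
    """Forward Neumann-Ulam estimate of (I - H)^{-1} b, valid when the row
    sums of |H| are < 1.  Each walk moves j -> k with probability |H[j, k]|
    and terminates otherwise, accumulating signed weights times b."""
    n = len(b)
    P = np.abs(H)
    x = np.zeros(n)
    for i in range(n):
        total = 0.0
        for _ in range(n_walks):
            j, w = i, 1.0
            total += w * b[j]
            while rng.random() < P[j].sum():
                k = rng.choice(n, p=P[j] / P[j].sum())
                w *= np.sign(H[j, k])          # weight ratio H/|H|
                total += w * b[k]
                j = k
        x[i] = total / n_walks
    return x

def mcsa(A, b, iters=10):
    """MCSA sketch: a Richardson step followed by a Monte Carlo solve of the
    residual equation (I - H) delta = r, with H = I - A."""
    H = np.eye(len(b)) - A
    x = np.zeros_like(b)
    for _ in range(iters):
        r = b - A @ x
        x = x + r + neumann_ulam(H, b - A @ (x + r))
    return x

A = np.array([[1.0, -0.2, 0.0],
              [-0.2, 1.0, -0.2],
              [0.0, -0.2, 1.0]])
b = np.array([1.0, 2.0, 3.0])
print(mcsa(A, b), np.linalg.solve(A, b))
```

    Because the correction term is a statistical estimate, the MCSA result agrees with the direct solve only to within Monte Carlo noise; the thesis's contribution lies in preconditioning and parallelizing this basic scheme for realistic systems.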

  6. Monte Carlo simulation of secondary radiation exposure from high-energy photon therapy using an anthropomorphic phantom.

    PubMed

    Frankl, Matthias; Macián-Juan, Rafael

    2016-03-01

    The development of intensity-modulated radiotherapy treatments delivering large amounts of monitor units (MUs) recently raised concern about higher risks for secondary malignancies. In this study, optimised combinations of several variance reduction techniques (VRTs) have been implemented in order to achieve high precision in Monte Carlo (MC) radiation transport simulations and in the calculation of in- and out-of-field photon and neutron dose-equivalent distributions in an anthropomorphic phantom using MCNPX, v.2.7. The computer model included a Varian Clinac 2100C treatment head and a high-resolution head phantom. By means of the applied VRTs, a relative uncertainty for the photon dose-equivalent distribution of <1 % in-field and of 15 % on average over the rest of the phantom could be obtained. Neutron dose equivalent, caused by photonuclear reactions in the linear accelerator components at photon energies above approximately 8 MeV, has been calculated. The relative uncertainty, calculated for each voxel, could be kept below 5 % on average over all voxels of the phantom. Thus, a very detailed neutron dose distribution could be obtained. The achieved precision now allows a far better estimation of both photon and especially neutron doses out-of-field, where neutrons can become the predominant component of secondary radiation. PMID:26311702

  7. Current status of the PSG Monte Carlo neutron transport code

    SciTech Connect

    Leppaenen, J.

    2006-07-01

    PSG is a new Monte Carlo neutron transport code, developed at the Technical Research Centre of Finland (VTT). The code is mainly intended for fuel assembly-level reactor physics calculations, such as group constant generation for deterministic reactor simulator codes. This paper presents the current status of the project and the essential capabilities of the code. Although the main application of PSG is in lattice calculations, the geometry is not restricted to two dimensions. This paper presents the validation of PSG against the experimental results of the three-dimensional MOX-fuelled VENUS-2 reactor dosimetry benchmark. (authors)

  8. Modeling photon transport in transabdominal fetal oximetry

    NASA Astrophysics Data System (ADS)

    Jacques, Steven L.; Ramanujam, Nirmala; Vishnoi, Gargi; Choe, Regine; Chance, Britton

    2000-07-01

    The possibility of optical oximetry of the blood in the fetal brain, measured across the maternal abdomen just prior to birth, is under investigation. Such measurements could detect fetal distress prior to birth and aid in the clinical decision regarding Cesarean section. This paper uses a perturbation method to model photon transport through an 8-cm-diam fetal brain located at a constant 2.5 cm below a curved maternal abdominal surface with an air/tissue boundary. In the simulation, a near-infrared light source delivers light to the abdomen and a detector is positioned up to 10 cm from the source along the arc of the abdominal surface. The light transport [W/cm2 fluence rate per W incident power] collected at the 10 cm position is Tm equals 2.2 X 10-6 cm-2 if the fetal brain has the same optical properties as the mother and Tf equals 1.0 X 10-6 cm-2 for an optically perturbing fetal brain with typical brain optical properties. The perturbation P equals (Tf - Tm)/Tm is -53% due to the fetal brain. The model illustrates the challenge and feasibility of transabdominal oximetry of the fetal brain.

  9. Low-energy photons in high-energy photon fields--Monte Carlo generated spectra and a new descriptive parameter.

    PubMed

    Chofor, Ndimofor; Harder, Dietrich; Willborn, Kay; Rühmann, Antje; Poppe, Björn

    2011-09-01

    The varying low-energy contribution to the photon spectra at points within and around radiotherapy photon fields is associated with variations in the responses of non-water-equivalent dosimeters and in the water-to-material dose conversion factors for tissues such as the red bone marrow. In addition, the presence of low-energy photons in the photon spectrum enhances the RBE in general and in particular for the induction of second malignancies. The present study discusses the general rules valid for the low-energy spectral component of radiotherapeutic photon beams at points within and in the periphery of the treatment field, taking as an example the Siemens Primus linear accelerator at 6 MV and 15 MV. The photon spectra at these points and their typical variations due to the target system, attenuation, and single and multiple Compton scattering are described by the Monte Carlo method, using the code BEAMnrc/EGSnrc. A survey of the role of low-energy photons in the spectra within and around radiotherapy fields is presented. In addition to the spectra, some data compression has proven useful to support the overview of the behaviour of the low-energy component. A characteristic indicator of the presence of low-energy photons is the dose fraction attributable to photons with energies not exceeding 200 keV, termed P(D)(200 keV). Its values are calculated for different depths and lateral positions within a water phantom. For a pencil beam of 6 or 15 MV primary photons in water, the radial distribution of P(D)(200 keV) is bell-shaped, with a wide-ranging exponential tail of half value 6 to 7 cm. The P(D)(200 keV) value obtained on the central axis of a photon field increases approximately in proportion to field size. Out-of-field P(D)(200 keV) values are up to an order of magnitude higher than on the central axis at the same irradiation depth. The 2D pattern of P(D)(200 keV) for a radiotherapy field visualizes the regions, e.g. at the field margin, where changes of
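
    P(D)(200 keV) is simply the fraction of dose (approximated here by collision kerma) contributed by spectral bins at or below 200 keV. A sketch with a made-up fluence spectrum and approximate water mass energy-absorption coefficients:

```python
import numpy as np

def p_d_200(E_keV, fluence, mu_en_over_rho):
    """Fraction of the collision kerma delivered by photons with
    E <= 200 keV, from a fluence spectrum and mass energy-absorption
    coefficients tabulated on the same energy grid."""
    dose_per_bin = fluence * E_keV * mu_en_over_rho
    low = E_keV <= 200.0
    return dose_per_bin[low].sum() / dose_per_bin.sum()

# Illustrative (made-up) spectrum on a coarse grid:
E   = np.array([50., 150., 500., 2000., 6000.])            # keV
phi = np.array([0.5, 0.8, 1.0, 0.7, 0.2])                   # relative fluence
mu  = np.array([0.0419, 0.0276, 0.0330, 0.0261, 0.0181])    # cm^2/g, water (approx.)
print(f"P_D(200 keV) ~ {p_d_200(E, phi, mu):.3f}")
```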

  10. A high-order photon Monte Carlo method for radiative transfer in direct numerical simulation

    SciTech Connect

    Wu, Y.; Modest, M.F.; Haworth, D.C. . E-mail: dch12@psu.edu

    2007-05-01

    A high-order photon Monte Carlo method is developed to solve the radiative transfer equation. The statistical and discretization errors of the computed radiative heat flux and radiation source term are isolated and quantified. Up to sixth-order spatial accuracy is demonstrated for the radiative heat flux, and up to fourth-order accuracy for the radiation source term. This demonstrates the compatibility of the method with high-fidelity direct numerical simulation (DNS) for chemically reacting flows. The method is applied to address radiative heat transfer in a one-dimensional laminar premixed flame and a statistically one-dimensional turbulent premixed flame. Modifications of the flame structure with radiation are noted in both cases, and the effects of turbulence/radiation interactions on the local reaction zone structure are revealed for the turbulent flame. Computational issues in using a photon Monte Carlo method for DNS of turbulent reacting flows are discussed.

  11. The macro response Monte Carlo method for electron transport

    SciTech Connect

    Svatos, M M

    1998-09-01

    The main goal of this thesis was to prove the feasibility of basing electron depth dose calculations in a phantom on first-principles single scatter physics, in an amount of time that is equal to or better than current electron Monte Carlo methods. The Macro Response Monte Carlo (MRMC) method achieves run times that are on the order of conventional electron transport methods such as condensed history, with the potential to be much faster. This is possible because MRMC is a Local-to-Global method, meaning the problem is broken down into two separate transport calculations. The first stage is a local (in this case, single scatter) calculation, which generates probability distribution functions (PDFs) to describe the electron's energy, position and trajectory after leaving the local geometry, a small sphere or "kugel." A number of local kugel calculations were run for calcium and carbon, creating a library of kugel data sets over a range of incident energies (0.25 MeV - 8 MeV) and sizes (0.025 cm to 0.1 cm in radius). The second transport stage is a global calculation, where steps that conform to the size of the kugels in the library are taken through the global geometry. For each step, the appropriate PDFs from the MRMC library are sampled to determine the electron's new energy, position and trajectory. The electron is immediately advanced to the end of the step and then chooses another kugel to sample, which continues until transport is completed. The MRMC global stepping code was benchmarked as a series of subroutines inside of the Peregrine Monte Carlo code. It was compared to Peregrine's class II condensed history electron transport package, EGS4, and MCNP for depth dose in simple phantoms having density inhomogeneities. Since the kugels completed in the library were of relatively small size, the zoning of the phantoms was scaled down from a clinical size, so that the energy deposition algorithms for spreading dose across 5-10 zones per kugel could be tested. Most
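
    The Local-to-Global stepping described above can be caricatured as a loop that repeatedly samples precomputed kugel PDFs for energy loss and deflection and advances the electron one kugel at a time. The sampling routine below is a hypothetical stand-in for the MRMC library, not the actual single-scatter distributions.

```python
import numpy as np
rng = np.random.default_rng(1)

def sample_kugel(E_MeV, radius_cm):
    """Hypothetical stand-in for the precomputed kugel PDFs: returns an
    energy loss and a polar deflection for one kugel-sized step.  In the
    real MRMC library these are tabulated from single-scatter runs."""
    dE    = min(E_MeV, rng.gamma(2.0, 0.02 * radius_cm / 0.05))
    theta = abs(rng.normal(0.0, 0.3))
    return dE, theta

def mrmc_track(E0=6.0, radius=0.05, cutoff=0.25):
    """Global stepping loop: advance the electron one kugel at a time,
    sampling its exit energy and direction and scoring the energy loss."""
    E, z, cos_t, history = E0, 0.0, 1.0, []
    while E > cutoff:
        dE, theta = sample_kugel(E, radius)
        z     += radius * cos_t               # advance along current direction
        cos_t *= np.cos(theta)                # crude 1D direction bookkeeping
        E     -= dE
        history.append((z, dE))
    return history

steps = mrmc_track()
print(f"{len(steps)} kugel steps, ~{sum(dE for _, dE in steps):.2f} MeV deposited")
```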

  12. Optimization of Monte Carlo transport simulations in stochastic media

    SciTech Connect

    Liang, C.; Ji, W.

    2012-07-01

    This paper presents an accurate and efficient approach to optimize radiation transport simulations in a stochastic medium of high heterogeneity, like the Very High Temperature Gas-cooled Reactor (VHTR) configurations packed with TRISO fuel particles. Based on a fast nearest neighbor search algorithm, a modified fast Random Sequential Addition (RSA) method is first developed to speed up the generation of the stochastic media systems packed with both mono-sized and poly-sized spheres. A fast neutron tracking method is then developed to optimize the next sphere boundary search in the radiation transport procedure. In order to investigate their accuracy and efficiency, the developed sphere packing and neutron tracking methods are implemented into an in-house continuous energy Monte Carlo code to solve an eigenvalue problem in VHTR unit cells. Comparison with the MCNP benchmark calculations for the same problem indicates that the new methods show considerably higher computational efficiency. (authors)
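
    The core of a grid-accelerated Random Sequential Addition scheme is that each trial sphere is tested for overlap only against spheres stored in neighbouring cells of a uniform grid, rather than against every sphere placed so far. A minimal sketch (arbitrary radius, box size and packing target, nothing like the real TRISO-scale problem):

```python
import numpy as np
rng = np.random.default_rng(0)

def rsa_pack(n_target, radius, box, max_tries=200000):
    """Random Sequential Addition of equal spheres in a cubic box, using a
    uniform grid (cell list) so each overlap test only visits the 27
    neighbouring cells instead of every placed sphere."""
    cell = 2.0 * radius
    grid = {}                          # (i, j, k) -> list of sphere centres
    centres, tries = [], 0
    while len(centres) < n_target and tries < max_tries:
        tries += 1
        p = rng.uniform(radius, box - radius, size=3)
        ci, cj, ck = (p // cell).astype(int)
        overlap = any(
            np.sum((p - q) ** 2) < (2.0 * radius) ** 2
            for di in (-1, 0, 1) for dj in (-1, 0, 1) for dk in (-1, 0, 1)
            for q in grid.get((ci + di, cj + dj, ck + dk), ())
        )
        if not overlap:
            centres.append(p)
            grid.setdefault((ci, cj, ck), []).append(p)
    return centres

print(len(rsa_pack(n_target=2000, radius=0.04, box=2.0)), "spheres placed")
```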

  13. The macro response Monte Carlo method for electron transport

    NASA Astrophysics Data System (ADS)

    Svatos, Michelle Marie

    1998-10-01

    This thesis proves the feasibility of basing depth dose calculations for electron radiotherapy on first-principles single scatter physics, in an amount of time that is comparable to or better than current electron Monte Carlo methods. The Macro Response Monte Carlo (MRMC) method achieves run times that have potential to be much faster than conventional electron transport methods such as condensed history. This is possible because MRMC is a Local-to-Global method, meaning the problem is broken down into two separate transport calculations. The first stage is a local, single scatter calculation, which generates probability distribution functions (PDFs) to describe the electron's energy, position and trajectory after leaving the local geometry, a small sphere or 'kugel'. A number of local kugel calculations were run for calcium and carbon, creating a library of kugel data sets over a range of incident energies (0.25 MeV-8 MeV) and sizes (0.025 cm to 0.1 cm in radius). The second transport stage is a global calculation, where steps that conform to the size of the kugels in the library are taken through the global geometry, which in this case is a CT (computed tomography) scan of a patient or phantom. For each step, the appropriate PDFs from the MRMC library are sampled to determine the electron's new energy, position and trajectory. The electron is immediately advanced to the end of the step and then chooses another kugel to sample, which continues until transport is completed. The MRMC global stepping code was benchmarked as a series of subroutines inside of the Peregrine Monte Carlo code against EGS4 and MCNP for depth dose in simple phantoms having density inhomogeneities. The energy deposition algorithms for spreading dose across 5-10 zones per kugel were tested. Most resulting depth dose calculations were within 2-3% of well-benchmarked codes, with one excursion to 4%. This thesis shows that the concept of using single scatter-based physics in clinical radiation

  14. Achieving nonreciprocal unidirectional single-photon quantum transport using the photonic Aharonov-Bohm effect.

    PubMed

    Yuan, Luqi; Xu, Shanshan; Fan, Shanhui

    2015-11-15

    We show that nonreciprocal unidirectional single-photon quantum transport can be achieved with the photonic Aharonov-Bohm effect. The system consists of a 1D waveguide coupling to two three-level atoms of the V-type. The two atoms, in addition, are each driven by an external coherent field. We show that the phase of the external coherent field provides a gauge potential for the photon states. With a proper choice of the phase difference between the two coherent fields, the transport of a single photon can exhibit unity contrast in its transmissions for the two propagation directions. PMID:26565819

  15. Radiation Transport for Explosive Outflows: A Multigroup Hybrid Monte Carlo Method

    NASA Astrophysics Data System (ADS)

    Wollaeger, Ryan T.; van Rossum, Daniel R.; Graziani, Carlo; Couch, Sean M.; Jordan, George C., IV; Lamb, Donald Q.; Moses, Gregory A.

    2013-12-01

    We explore Implicit Monte Carlo (IMC) and discrete diffusion Monte Carlo (DDMC) for radiation transport in high-velocity outflows with structured opacity. The IMC method is a stochastic computational technique for nonlinear radiation transport. IMC is partially implicit in time and may suffer in efficiency when tracking MC particles through optically thick materials. DDMC accelerates IMC in diffusive domains. Abdikamalov extended IMC and DDMC to multigroup, velocity-dependent transport with the intent of modeling neutrino dynamics in core-collapse supernovae. Densmore has also formulated a multifrequency extension to the originally gray DDMC method. We rigorously formulate IMC and DDMC over a high-velocity Lagrangian grid for possible application to photon transport in the post-explosion phase of Type Ia supernovae. This formulation includes an analysis that yields an additional factor in the standard IMC-to-DDMC spatial interface condition. To our knowledge the new boundary condition is distinct from others presented in prior DDMC literature. The method is suitable for a variety of opacity distributions and may be applied to semi-relativistic radiation transport in simple fluids and geometries. Additionally, we test the code, called SuperNu, using an analytic solution having static material, as well as with a manufactured solution for moving material with structured opacities. Finally, we demonstrate with a simple source and 10 group logarithmic wavelength grid that IMC-DDMC performs better than pure IMC in terms of accuracy and speed when there are large disparities between the magnitudes of opacities in adjacent groups. We also present and test our implementation of the new boundary condition.

  16. Acceleration of a Monte Carlo radiation transport code

    SciTech Connect

    Hochstedler, R.D.; Smith, L.M.

    1996-03-01

    Execution time for the Integrated TIGER Series (ITS) Monte Carlo radiation transport code has been reduced by careful re-coding of computationally intensive subroutines. Three test cases for the TIGER (1-D slab geometry), CYLTRAN (2-D cylindrical geometry), and ACCEPT (3-D arbitrary geometry) codes were identified and used to benchmark and profile program execution. Based upon these results, sixteen top time-consuming subroutines were examined and nine of them modified to accelerate computations with equivalent numerical output to the original. The results obtained via this study indicate that speedup factors of 1.90 for the TIGER code, 1.67 for the CYLTRAN code, and 1.11 for the ACCEPT code are achievable. © 1996 American Institute of Physics.

  17. Analysis of Light Transport Features in Stone Fruits Using Monte Carlo Simulation

    PubMed Central

    Ding, Chizhu; Shi, Shuning; Chen, Jianjun; Wei, Wei; Tan, Zuojun

    2015-01-01

    The propagation of light in stone fruit tissue was modeled using the Monte Carlo (MC) method. Peaches were used as the representative model of stone fruits. The effects of the fruit core and the skin on light transport features in the peaches were assessed. It is suggested that the skin, flesh and core should be separately considered with different parameters to accurately simulate light propagation in intact stone fruit. The detection efficiency was evaluated by the percentage of effective photons and the detection sensitivity of the flesh tissue. The fruit skin decreases the detection efficiency, especially in the region close to the incident point. The choices of the source-detector distance, detection angle and source intensity were discussed. Accurate MC simulations may result in better insight into light propagation in stone fruit and aid in achieving the optimal fruit quality inspection without extensive experimental measurements. PMID:26469695

  18. Accelerating execution of the integrated TIGER series Monte Carlo radiation transport codes

    SciTech Connect

    Smith, L.M.; Hochstedler, R.D.

    1997-02-01

    Execution of the integrated TIGER series (ITS) of coupled electron/photon Monte Carlo radiation transport codes has been accelerated by modifying the FORTRAN source code for more efficient computation. Each member code of ITS was benchmarked and profiled with a specific test case that directed the acceleration effort toward the most computationally intensive subroutines. Techniques for accelerating these subroutines included replacing linear search algorithms with binary versions, replacing the pseudo-random number generator, reducing program memory allocation, and proofing the input files for geometrical redundancies. All techniques produced identical or statistically similar results to the original code. Final benchmark timing of the accelerated code resulted in speed-up factors of 2.00 for TIGER (the one-dimensional slab geometry code), 1.74 for CYLTRAN (the two-dimensional cylindrical geometry code), and 1.90 for ACCEPT (the arbitrary three-dimensional geometry code).
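
    One of the listed techniques, replacing linear searches with binary versions, typically targets the energy-grid lookup performed before every cross-section interpolation. A small before/after sketch (illustrative grid, not ITS data):

```python
from bisect import bisect_right

# Illustrative cross-section energy grid (MeV) and values (barns):
energy_grid = [0.01, 0.1, 0.5, 1.0, 5.0, 10.0]
sigma       = [60.0, 20.0, 8.0, 6.0, 3.0, 2.0]

def lookup_linear(E):
    """Original pattern: scan the grid from the start (O(n))."""
    i = 0
    while i < len(energy_grid) - 2 and energy_grid[i + 1] <= E:
        i += 1
    return i

def lookup_binary(E):
    """Accelerated pattern: binary search (O(log n)) returning the same index."""
    return min(max(bisect_right(energy_grid, E) - 1, 0), len(energy_grid) - 2)

for E in (0.05, 0.75, 9.0):
    assert lookup_linear(E) == lookup_binary(E)
    i = lookup_binary(E)
    # linear interpolation of the cross section on the bracketing interval
    f = (E - energy_grid[i]) / (energy_grid[i + 1] - energy_grid[i])
    print(E, sigma[i] + f * (sigma[i + 1] - sigma[i]))
```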

  19. Electron transport in magnetrons by a posteriori Monte Carlo simulations

    NASA Astrophysics Data System (ADS)

    Costin, C.; Minea, T. M.; Popa, G.

    2014-02-01

    Electron transport across magnetic barriers is crucial in all magnetized plasmas. It governs not only the plasma parameters in the volume, but also the fluxes of charged particles towards the electrodes and walls. It is particularly important in high-power impulse magnetron sputtering (HiPIMS) reactors, influencing the quality of the deposited thin films, since this type of discharge is characterized by an increased ionization fraction of the sputtered material. Transport coefficients of electron clouds released both from the cathode and from several locations in the discharge volume are calculated for a HiPIMS discharge with pre-ionization operated in argon at 0.67 Pa and for very short pulses (few µs) using the a posteriori Monte Carlo simulation technique. For this type of discharge electron transport is characterized by strong temporal and spatial dependence. Both drift velocity and diffusion coefficient depend on the releasing position of the electron cloud. They exhibit minimum values at the centre of the race-track for the secondary electrons released from the cathode. The diffusion coefficient of the same electrons increases from 2 to 4 times when the cathode voltage is doubled, in the first 1.5 µs of the pulse. These parameters are discussed with respect to empirical Bohm diffusion.

  20. Monte Carlo Particle Transport Capability for Inertial Confinement Fusion Applications

    SciTech Connect

    Brantley, P S; Stuart, L M

    2006-11-06

    A time-dependent massively-parallel Monte Carlo particle transport calculational module (ParticleMC) for inertial confinement fusion (ICF) applications is described. The ParticleMC package is designed with the long-term goal of transporting neutrons, charged particles, and gamma rays created during the simulation of ICF targets and surrounding materials, although currently the package treats neutrons and gamma rays. Neutrons created during thermonuclear burn provide a source of neutrons to the ParticleMC package. Other user-defined sources of particles are also available. The module is used within the context of a hydrodynamics client code, and the particle tracking is performed on the same computational mesh as used in the broader simulation. The module uses domain-decomposition and the MPI message passing interface to achieve parallel scaling for large numbers of computational cells. The Doppler effects of bulk hydrodynamic motion and the thermal effects due to the high temperatures encountered in ICF plasmas are directly included in the simulation. Numerical results for a three-dimensional benchmark test problem are presented in 3D XYZ geometry as a verification of the basic transport capability. In the full paper, additional numerical results including a prototype ICF simulation will be presented.

  1. Monte Carlo study of photon fields from a flattening filter-free clinical accelerator

    SciTech Connect

    Vassiliev, Oleg N.; Titt, Uwe; Kry, Stephen F.; Poenisch, Falk; Gillin, Michael T.; Mohan, Radhe

    2006-04-15

    In conventional clinical linear accelerators, the flattening filter scatters and absorbs a large fraction of primary photons. Increasing the beam-on time, which also increases the out-of-field exposure to patients, compensates for the reduction in photon fluence. In recent years, intensity modulated radiation therapy has been introduced, yielding better dose distributions than conventional three-dimensional conformal therapy. The drawback of this method is the further increase in beam-on time. An accelerator with the flattening filter removed, which would increase photon fluence greatly, could deliver considerably higher dose rates. The objective of the present study is to investigate the dosimetric properties of 6 and 18 MV photon beams from an accelerator without a flattening filter. The dosimetric data were generated using the Monte Carlo programs BEAMnrc and DOSXYZnrc. The accelerator model was based on the Varian Clinac 2100 design. We compared depth doses, dose rates, lateral profiles, doses outside collimation, and total and collimator scatter factors for an accelerator with and without a flattening filter. The study showed that removing the filter increased the dose rate on the central axis by a factor of 2.31 (6 MV) and 5.45 (18 MV) at a given target current. Because the flattening filter is a major source of head scatter photons, its removal from the beam line could reduce the out-of-field dose.

  2. A deterministic computational model for the two dimensional electron and photon transport

    NASA Astrophysics Data System (ADS)

    Badavi, Francis F.; Nealy, John E.

    2014-12-01

    A deterministic (non-statistical) two-dimensional (2D) computational model describing the transport of electrons and photons typical of the space radiation environment in various shield media is described. The 2D formalism is cast into a code that extends a previously developed one-dimensional (1D) deterministic electron and photon transport code. The goal of both the 1D and 2D codes is to satisfy engineering design applications (i.e. rapid analysis) while maintaining an accurate physics-based representation of electron and photon transport in the space environment. Both the 1D and 2D transport codes use established theoretical representations to describe the relevant collisional and radiative interactions and transport processes. In the 2D version, the shield material specifications are more general, requiring only the pertinent cross sections. In the 2D model, the computational field is specified in terms of a distance of traverse z along an axial direction together with a variable distribution of deflection (i.e. polar) angles θ where -π/2<θ<π/2, and corresponding symmetry is assumed for the range of azimuth angles (0<φ<2π). In the transport formalism, a combined mean-free-path and average trajectory approach is used. For candidate shielding materials, using the trapped electron radiation environments at low Earth orbit (LEO), geosynchronous orbit (GEO) and the Jupiter moon Europa, verification of the 2D formalism against the 1D code and an existing Monte Carlo code is presented.

  3. Parallelization of a Monte Carlo particle transport simulation code

    NASA Astrophysics Data System (ADS)

    Hadjidoukas, P.; Bousis, C.; Emfietzoglou, D.

    2010-05-01

    We have developed a high performance version of the Monte Carlo particle transport simulation code MC4. The original application code, developed in Visual Basic for Applications (VBA) for Microsoft Excel, was first rewritten in the C programming language to improve code portability. Several pseudo-random number generators have also been integrated and studied. The new MC4 version was then parallelized for shared- and distributed-memory multiprocessor systems using the Message Passing Interface. Two parallel pseudo-random number generator libraries (SPRNG and DCMT) have been seamlessly integrated. The performance speedup of parallel MC4 has been studied on a variety of parallel computing architectures, including an Intel Xeon server with 4 dual-core processors, a Sun cluster consisting of 16 nodes of 2 dual-core AMD Opteron processors and a 200 dual-processor HP cluster. For large problem sizes, which are limited only by the physical memory of the multiprocessor server, the speedup results are almost linear on all systems. We have validated the parallel implementation against the serial VBA and C implementations using the same random number generator. Our experimental results on the transport and energy loss of electrons in a water medium show that the serial and parallel codes are equivalent in accuracy. The present improvements allow the study of higher particle energies with the use of more accurate physical models, and improve statistics, since more particle tracks can be simulated in a short response time.

  4. Photon transport through a nanohole by a moving atom

    NASA Astrophysics Data System (ADS)

    Afanasiev, A. E.; Melentiev, P. N.; Kuzin, A. A.; Kalatskiy, A. Yu; Balykin, V. I.

    2016-05-01

    We have proposed and investigated, for the first time, an efficient way of transporting photons through a subwavelength hole by means of a moving atom. The transfer mechanism is based on the reduction of the wave packet of a single photon due to its absorption by an atom and, correspondingly, its localization in a volume smaller than both the radiation wavelength and the nanohole size. The scheme realizes the transformation of a single-photon single-mode wave packet of the laser light into a single-photon multimode wave packet in free space.

  5. Dissipationless electron transport in photon-dressed nanostructures.

    PubMed

    Kibis, O V

    2011-09-01

    It is shown that the electron coupling to photons in field-dressed nanostructures can result in the ground electron-photon state with a nonzero electric current. Since the current is associated with the ground state, it flows without the Joule heating of the nanostructure and is nondissipative. Such a dissipationless electron transport can be realized in strongly coupled electron-photon systems with the broken time-reversal symmetry--particularly, in quantum rings and chiral nanostructures dressed by circularly polarized photons. PMID:21981519

  6. Few-photon transport in low-dimensional systems

    SciTech Connect

    Longo, Paolo; Schmitteckert, Peter; Busch, Kurt

    2011-06-15

    We analyze the role of quantum interference effects induced by an embedded two-level system on the photon transport properties in waveguiding structures that exhibit cutoffs (band edges) in their dispersion relation. In particular, we demonstrate that these systems invariably exhibit single-particle photon-atom bound states and strong effective nonlinear responses on the few-photon level. Based on this, we find that the properties of these photon-atom bound states may be tuned via the underlying dispersion relation and that their occupation can be controlled via multiparticle scattering processes. This opens an interesting route for controlling photon transport properties in a number of solid-state-based quantum optical systems and the realization of corresponding functional elements and devices.

  7. Electron transport through a quantum dot assisted by cavity photons

    NASA Astrophysics Data System (ADS)

    Abdullah, Nzar Rauf; Tang, Chi-Shung; Manolescu, Andrei; Gudmundsson, Vidar

    2013-11-01

    We investigate transient transport of electrons through a single quantum dot controlled by a plunger gate. The dot is embedded in a finite wire with length Lx assumed to lie along the x-direction with a parabolic confinement in the y-direction. The quantum wire, originally with hard-wall confinement at its ends, ±Lx/2, is weakly coupled at t = 0 to left and right leads acting as external electron reservoirs. The central system, the dot and the finite wire, is strongly coupled to a single cavity photon mode. A non-Markovian density-matrix formalism is employed to take into account the full electron-photon interaction in the transient regime. In the absence of a photon cavity, a resonant current peak can be found by tuning the plunger-gate voltage to lift a many-body state of the system into the source-drain bias window. In the presence of an x-polarized photon field, additional side peaks can be found due to photon-assisted transport. By appropriately tuning the plunger-gate voltage, the electrons in the left lead are allowed to undergo coherent inelastic scattering to a two-photon state above the bias window if initially one photon was present in the cavity. However, this photon-assisted feature is suppressed in the case of a y-polarized photon field due to the anisotropy of our system caused by its geometry.

  8. Electron transport through a quantum dot assisted by cavity photons.

    PubMed

    Abdullah, Nzar Rauf; Tang, Chi-Shung; Manolescu, Andrei; Gudmundsson, Vidar

    2013-11-20

    We investigate transient transport of electrons through a single quantum dot controlled by a plunger gate. The dot is embedded in a finite wire with length Lx assumed to lie along the x-direction with a parabolic confinement in the y-direction. The quantum wire, originally with hard-wall confinement at its ends, ±Lx/2, is weakly coupled at t = 0 to left and right leads acting as external electron reservoirs. The central system, the dot and the finite wire, is strongly coupled to a single cavity photon mode. A non-Markovian density-matrix formalism is employed to take into account the full electron-photon interaction in the transient regime. In the absence of a photon cavity, a resonant current peak can be found by tuning the plunger-gate voltage to lift a many-body state of the system into the source-drain bias window. In the presence of an x-polarized photon field, additional side peaks can be found due to photon-assisted transport. By appropriately tuning the plunger-gate voltage, the electrons in the left lead are allowed to undergo coherent inelastic scattering to a two-photon state above the bias window if initially one photon was present in the cavity. However, this photon-assisted feature is suppressed in the case of a y-polarized photon field due to the anisotropy of our system caused by its geometry. PMID:24132041

  9. A Fano cavity test for Monte Carlo proton transport algorithms

    SciTech Connect

    Sterpin, Edmond; Sorriaux, Jefferson; Souris, Kevin; Vynckier, Stefaan; Bouchard, Hugo

    2014-01-15

    Purpose: In the scope of reference dosimetry of radiotherapy beams, Monte Carlo (MC) simulations are widely used to compute ionization chamber dose response accurately. Uncertainties related to the transport algorithm can be verified by performing self-consistency tests, i.e., the so-called “Fano cavity test.” The Fano cavity test is based on the Fano theorem, which states that under charged particle equilibrium conditions, the charged particle fluence is independent of the mass density of the media as long as the cross-sections are uniform. Such tests have not been performed yet for MC codes simulating proton transport. The objectives of this study are to design a new Fano cavity test for proton MC and to implement the methodology in two MC codes: Geant4 and PENELOPE extended to protons (PENH). Methods: The new Fano test is designed to evaluate the accuracy of proton transport. Virtual particles with an energy of E0 and a mass macroscopic cross section of Σ/ρ are transported, having the ability to generate protons with kinetic energy E0 and to be restored after each interaction, thus providing proton equilibrium. To perform the test, the authors use a simplified simulation model and rigorously demonstrate that the computed cavity dose per incident fluence must equal ΣE0/ρ, as expected in classic Fano tests. The implementation of the test is performed in Geant4 and PENH. The geometry used for testing is a 10 × 10 cm2 parallel virtual field and a cavity (2 × 2 × 0.2 cm3 in size) in a water phantom with dimensions large enough to ensure proton equilibrium. Results: For conservative user-defined simulation parameters (leading to small step sizes), both Geant4 and PENH pass the Fano cavity test within 0.1%. However, differences of 0.6% and 0.7% were observed for PENH and Geant4, respectively, using larger step sizes. For PENH, the difference is attributed to the random-hinge method that introduces an artificial energy
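
    The pass/fail criterion of the proposed test reduces to comparing the Monte Carlo cavity dose per unit incident fluence with the analytic expectation ΣE0/ρ. A toy check with hypothetical numbers:

```python
def fano_expectation(sigma_over_rho, E0):
    """Expected cavity dose per unit incident fluence under Fano conditions:
    (Sigma/rho) * E0  (e.g. MeV cm^2 / g for a fluence in 1/cm^2)."""
    return sigma_over_rho * E0

def fano_deviation(mc_dose_per_fluence, sigma_over_rho, E0):
    """Relative deviation of a Monte Carlo result from the Fano expectation."""
    return mc_dose_per_fluence / fano_expectation(sigma_over_rho, E0) - 1.0

# Hypothetical numbers: Sigma/rho = 0.02 cm^2/g, E0 = 100 MeV, and a Monte
# Carlo cavity dose 0.08% above the expectation would pass a 0.1% criterion.
print(f"{100.0 * fano_deviation(2.0016, 0.02, 100.0):+.2f} %")
```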

  10. Coupled Deterministic-Monte Carlo Transport for Radiation Portal Modeling

    SciTech Connect

    Smith, Leon E.; Miller, Erin A.; Wittman, Richard S.; Shaver, Mark W.

    2008-01-14

    Radiation portal monitors are being deployed, both domestically and internationally, to detect illicit movement of radiological materials concealed in cargo. Evaluation of the current and next generations of these radiation portal monitor (RPM) technologies is an ongoing process. 'Injection studies' that superimpose, computationally, the signature from threat materials onto empirical vehicle profiles collected at ports of entry, are often a component of the RPM evaluation process. However, measurement of realistic threat devices can be both expensive and time-consuming. Radiation transport methods that can predict the response of radiation detection sensors with high fidelity, and do so rapidly enough to allow the modeling of many different threat-source configurations, are a cornerstone of reliable evaluation results. Monte Carlo methods have been the primary tool of the detection community for these kinds of calculations, in no small part because they are particularly effective for calculating pulse-height spectra in gamma-ray spectrometers. However, computational times for problems with a high degree of scattering and absorption can be extremely long. Deterministic codes that discretize the transport in space, angle, and energy offer potential advantages in computational efficiency for these same kinds of problems, but the pulse-height calculations needed to predict gamma-ray spectrometer response are not readily accessible. These complementary strengths for radiation detection scenarios suggest that coupling Monte Carlo and deterministic methods could be beneficial in terms of computational efficiency. Pacific Northwest National Laboratory and its collaborators are developing a RAdiation Detection Scenario Analysis Toolbox (RADSAT) founded on this coupling approach. The deterministic core of RADSAT is Attila, a three-dimensional, tetrahedral-mesh code originally developed by Los Alamos National Laboratory, and since expanded and refined by Transpire, Inc. [1

  11. Optimizing light transport in scintillation crystals for time-of-flight PET: an experimental and optical Monte Carlo simulation study

    PubMed Central

    Berg, Eric; Roncali, Emilie; Cherry, Simon R.

    2015-01-01

    Achieving excellent timing resolution in gamma ray detectors is crucial in several applications such as medical imaging with time-of-flight positron emission tomography (TOF-PET). Although many factors impact the overall system timing resolution, the statistical nature of scintillation light, including photon production and transport in the crystal to the photodetector, is typically the limiting factor for modern scintillation detectors. In this study, we investigated the impact of surface treatment, in particular, roughening select areas of otherwise polished crystals, on light transport and timing resolution. A custom Monte Carlo photon tracking tool was used to gain insight into changes in light collection and timing resolution that were observed experimentally: select roughening configurations increased the light collection up to 25% and improved timing resolution by 15% compared to crystals with all polished surfaces. Simulations showed that partial surface roughening caused a greater number of photons to be reflected towards the photodetector and increased the initial rate of photoelectron production. This study provides a simple method to improve timing resolution and light collection in scintillator-based gamma ray detectors, a topic of high importance in the field of TOF-PET. Additionally, we demonstrated utility of our Monte Carlo simulation tool to accurately predict the effect of altering crystal surfaces on light collection and timing resolution. PMID:26114040
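
    A toy order-statistic model conveys why collecting more photons earlier improves timing: each event's timestamp is taken from the k-th detected photon, and the coincidence resolving time follows from the spread of timestamp differences between two detectors. All parameters below (photon yield, decay constant, jitter widths) are illustrative assumptions, not values fitted to the crystals studied.

```python
import numpy as np
rng = np.random.default_rng(3)

def event_timestamp(n_detected=3000, tau_decay=40.0,
                    transit_sigma=0.10, spad_sigma=0.12, k=5):
    """Toy model of one 511 keV event (times in ns): exponential
    scintillation decay plus Gaussian light-transport and photodetector
    jitter; the event time is the k-th earliest detected photon."""
    t = (rng.exponential(tau_decay, n_detected)
         + rng.normal(0.0, transit_sigma, n_detected)    # transport spread
         + rng.normal(0.0, spad_sigma, n_detected))      # photodetector jitter
    return np.sort(t)[k - 1]

# Coincidence resolving time (FWHM) between two identical detectors.
diffs = np.array([event_timestamp() - event_timestamp() for _ in range(2000)])
print(f"toy CRT ~ {2.355 * diffs.std() * 1e3:.0f} ps FWHM")
```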

  12. Controlling single-photon transport with three-level quantum dots in photonic crystals

    NASA Astrophysics Data System (ADS)

    Yan, Cong-Hua; Jia, Wen-Zhi; Wei, Lian-Fu

    2014-03-01

    We investigate how to control single-photon transport along the photonic crystal waveguide with the recent experimentally demonstrated artificial atoms [i.e., Λ-type quantum dots (QDs)] [S. G. Carter et al., Nat. Photon. 7, 329 (2013), 10.1038/nphoton.2013.41] in an all-optical way. Adopting full quantum theory in real space, we analytically calculate the transport coefficients of single photons scattered by a Λ-type QD embedded in single- and two-mode photonic crystal cavities (PCCs), respectively. Our numerical results clearly show that the photonic transmission properties can be exactly manipulated by adjusting the coupling strengths of waveguide-cavity and QD-cavity interactions. Specifically, for the PCC with two degenerate orthogonal polarization modes coupled to a Λ-type QD with two degenerate ground states, we find that the photonic transmission spectra show three Rabi-splitting dips and the present system could serve as single-photon polarization beam splitters. The feasibility of our proposal with the current photonic crystal technique is also discussed.

  13. Generalized free-space diffuse photon transport model based on the influence analysis of a camera lens diaphragm.

    PubMed

    Chen, Xueli; Gao, Xinbo; Qu, Xiaochao; Chen, Duofang; Ma, Xiaopeng; Liang, Jimin; Tian, Jie

    2010-10-10

    The camera lens diaphragm is an important component in a noncontact optical imaging system and has a crucial influence on the images registered on the CCD camera. However, this influence has not been taken into account in existing free-space photon transport models. To model the photon transport process more accurately, a generalized free-space photon transport model is proposed. It combines Lambertian source theory with an analysis of the influence of the camera lens diaphragm to simulate the photon transport process in free space. In addition, the radiance theorem is adopted to establish the energy relationship between the virtual detector and the CCD camera. The accuracy and feasibility of the proposed model are validated with a Monte-Carlo-based free-space photon transport model and a physical phantom experiment. A comparison study with our previous hybrid radiosity-radiance theorem based model demonstrates the improved performance and potential of the proposed model for simulating the photon transport process in free space. PMID:20935713

  14. Identifying key surface parameters for optical photon transport in GEANT4/GATE simulations.

    PubMed

    Nilsson, Jenny; Cuplov, Vesna; Isaksson, Mats

    2015-09-01

    For a scintillator used for spectrometry, the generation, transport and detection of optical photons have a great impact on the energy spectrum resolution. A complete Monte Carlo model of a scintillator includes coupled ionizing-particle and optical-photon transport, which can be simulated with the GEANT4 code. The GEANT4 surface parameters control the physics processes an optical photon undergoes when reaching the surface of a volume. In this work the impact of each surface parameter on the optical transport was studied by looking at the optical spectrum: the number of detected optical photons per ionizing source particle from a large plastic scintillator, i.e. the output signal. All simulations were performed using GATE v6.2 (GEANT4 Application for Tomographic Emission). The surface parameter finish (polished, ground, front-painted or back-painted) showed the greatest impact on the optical spectrum, whereas the surface parameter σ(α), which controls the surface roughness, had a relatively small impact. It was also shown how the surface parameters reflectivity and reflectivity type (specular spike, specular lobe, Lambertian and backscatter) changed the optical spectrum depending on the probability of reflection and the combination of reflectivity types. A change in the optical spectrum will ultimately have an impact on a simulated energy spectrum. By studying the optical spectra presented in this work, a GEANT4 user can predict the shift in an optical spectrum caused by the alteration of a specific surface parameter. PMID:26046519

  15. Neutron contamination of Varian Clinac iX 10 MV photon beam using Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Yani, S.; Tursinah, R.; Rhani, M. F.; Soh, R. C. X.; Haryanto, F.; Arif, I.

    2016-03-01

    High-energy medical accelerators are commonly used in radiotherapy to increase the effectiveness of treatments. Neutrons can be emitted from a medical accelerator when X-rays interact with its components, an issue that has drawn the attention of many researchers. Neutron contamination causes problems such as degraded image resolution and additional radiation protection concerns for patients and radiation oncologists. This study concerns the simulation of the neutron contamination emitted from a Varian Clinac iX 10 MV using a Monte Carlo code system. Because the neutron production process is very complex, Monte Carlo simulation with the MCNPX code system was carried out to study this contamination. The design of this medical accelerator was modelled based on the actual materials and geometry. The maximum energies of photons and neutrons in the scoring plane were 10.5 and 2.239 MeV, respectively. The number and energy of the particles produced depend on the depth and distance from the beam axis. These results indicate that the neutron contamination produced by the 10 MV photon beam of this linac in a typical treatment is not negligible.

  16. Monte Carlo photon beam modeling and commissioning for radiotherapy dose calculation algorithm.

    PubMed

    Toutaoui, A; Ait chikh, S; Khelassi-Toutaoui, N; Hattali, B

    2014-11-01

    The aim of the present work was a Monte Carlo verification of the Multi-grid superposition (MGS) dose calculation algorithm implemented in the CMS XiO (Elekta) treatment planning system and used to calculate the dose distribution produced by photon beams generated by the linear accelerator (linac) Siemens Primus. The BEAMnrc/DOSXYZnrc (EGSnrc package) Monte Carlo model of the linac head was used as a benchmark. In the first part of the work, the BEAMnrc was used for the commissioning of a 6 MV photon beam and to optimize the linac description to fit the experimental data. In the second part, the MGS dose distributions were compared with DOSXYZnrc using relative dose error comparison and γ-index analysis (2%/2 mm, 3%/3 mm), in different dosimetric test cases. Results show good agreement between simulated and calculated dose in homogeneous media for square and rectangular symmetric fields. The γ-index analysis confirmed that for most cases the MGS model and EGSnrc doses are within 3% or 3 mm. PMID:24947967
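
    The γ-index criterion quoted above combines a dose-difference tolerance with a distance-to-agreement tolerance. A simple 1D, globally normalized implementation on synthetic profiles (not the XiO/EGSnrc data) looks like this:

```python
import numpy as np

def gamma_index_1d(x_mm, d_ref, d_eval, dose_tol=0.03, dta_mm=3.0):
    """Global 1D gamma index (e.g. 3%/3 mm): for every reference point,
    search all evaluated points for the minimum combined dose/distance
    discrepancy; gamma <= 1 counts as a pass."""
    d_max = d_ref.max()
    gammas = np.empty_like(d_ref)
    for i, (xi, di) in enumerate(zip(x_mm, d_ref)):
        dist  = (x_mm - xi) / dta_mm
        ddiff = (d_eval - di) / (dose_tol * d_max)
        gammas[i] = np.sqrt(dist**2 + ddiff**2).min()
    return gammas

x   = np.linspace(0, 100, 201)                    # positions in mm
ref = np.exp(-((x - 50) / 20) ** 2)               # synthetic reference profile
eva = np.exp(-((x - 50.8) / 20) ** 2) * 1.01      # shifted/scaled evaluation
g = gamma_index_1d(x, ref, eva)
print(f"pass rate: {100 * np.mean(g <= 1.0):.1f} %")
```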

  17. Selected organ dose conversion coefficients for external photons calculated using ICRP adult voxel phantoms and Monte Carlo code FLUKA.

    PubMed

    Patni, H K; Nadar, M Y; Akar, D K; Bhati, S; Sarkar, P K

    2011-11-01

    The adult reference male and female computational voxel phantoms recommended by ICRP are adapted into the Monte Carlo transport code FLUKA. The FLUKA code is then utilised for computation of dose conversion coefficients (DCCs) expressed in absorbed dose per air kerma free-in-air for colon, lungs, stomach wall, breast, gonads, urinary bladder, oesophagus, liver and thyroid due to a broad parallel beam of mono-energetic photons impinging in anterior-posterior and posterior-anterior directions in the energy range of 15 keV-10 MeV. The computed DCCs of colon, lungs, stomach wall and breast are found to be in good agreement with the results published in ICRP publication 110. The present work thus validates the use of FLUKA code in computation of organ DCCs for photons using ICRP adult voxel phantoms. Further, the DCCs for gonads, urinary bladder, oesophagus, liver and thyroid are evaluated and compared with results published in ICRP 74 in the above-mentioned energy range and geometries. Significant differences in DCCs are observed for breast, testis and thyroid above 1 MeV, and for most of the organs at energies below 60 keV in comparison with the results published in ICRP 74. The DCCs of female voxel phantom were found to be higher in comparison with male phantom for almost all organs in both the geometries. PMID:21147784

  18. Monte Carlo impurity transport modeling in the DIII-D transport

    SciTech Connect

    Evans, T.E.; Finkenthal, D.F.

    1998-04-01

    A description of the carbon transport and sputtering physics contained in the Monte Carlo Impurity (MCI) transport code is given. Examples of statistically significant carbon transport pathways are examined using MCI's unique tracking visualizer, and a mechanism for enhanced carbon accumulation on the high field side of the divertor chamber is discussed. Comparisons between carbon emissions calculated with MCI and those measured in the DIII-D tokamak are described. Good qualitative agreement is found between 2D carbon emission patterns calculated with MCI and experimentally measured carbon patterns. While uncertainties in the sputtering physics, atomic data, and transport models have made quantitative comparisons with experiments more difficult, recent results using a physics-based model for physical and chemical sputtering have yielded simulations with about 50% of the total carbon radiation measured in the divertor. These results and plans for future improvement in the physics models and atomic data are discussed.

  19. Parallel processing implementation for the coupled transport of photons and electrons using OpenMP

    NASA Astrophysics Data System (ADS)

    Doerner, Edgardo

    2016-05-01

    In this work the use of OpenMP to implement the parallel processing of the Monte Carlo (MC) simulation of the coupled transport of photons and electrons is presented. This implementation was carried out using a modified EGSnrc platform which enables the use of the Microsoft Visual Studio 2013 (VS2013) environment, together with the development tools available in Intel Parallel Studio XE 2015 (XE2015). The performance of this new implementation was studied on a desktop PC with a multi-core CPU, taking the performance of the original platform as a reference. The results were satisfactory, both in terms of scalability and parallelization efficiency.

  20. Status of the MORSE multigroup Monte Carlo radiation transport code

    SciTech Connect

    Emmett, M.B.

    1993-06-01

    There are two versions of the MORSE multigroup Monte Carlo radiation transport computer code system at Oak Ridge National Laboratory. MORSE-CGA is the most well-known and has undergone extensive use for many years. MORSE-SGC was originally developed in about 1980 in order to restructure the cross-section handling and thereby save storage. However, with the advent of new computer systems having much larger storage capacity, that aspect of SGC has become unnecessary. Both versions use data from multigroup cross-section libraries, although in somewhat different formats. MORSE-SGC is the version of MORSE that is part of the SCALE system, but it can also be run stand-alone. Both CGA and SGC use the Multiple Array System (MARS) geometry package. In the last six months the main focus of the work on these two versions has been on making them operational on workstations, in particular, the IBM RISC 6000 family. A new version of SCALE for workstations is being released to the Radiation Shielding Information Center (RSIC). MORSE-CGA, Version 2.0, is also being released to RSIC. Both SGC and CGA have undergone other revisions recently. This paper reports on the current status of the MORSE code system.

  1. Monte Carlo simulations of the vacuum performance of differential pumps at the Advanced Photon Source

    NASA Astrophysics Data System (ADS)

    Liu, C.; Shu, D.; Kuzay, T. M.; Kersevan, R.

    1996-09-01

    Monte Carlo computer simulations have been successfully applied in the design of vacuum systems. These simulations allow the user to check the vacuum performance without the need of making a prototype of the vacuum system. In this paper we demonstrate the effectiveness and aptitude of these simulations in the design of differential pumps for synchrotron radiation beamlines. Eventually a good number of the beamline front ends at the Advanced Photon Source (APS) will use differential pumps to protect the synchrotron storage ring vacuum. A Monte Carlo computer program is used to calculate the molecular flow transmission and pressure distribution across the differential pump. A differential pump system, which consists of two 170 l/s ion pumps with three conductance-limiting apertures, was previously tested on an APS insertion-device beamline front end. Pressure distribution measurements using controlled leaks demonstrated a pressure difference of over two decades across the differential pump. A new differential pump utilizes a fixed mask between two 170 l/s ion pumps. The fixed mask, which has a conical channel with a small cross section of 4.5×4.5 mm² in the far end, is used in the beamline to confine the photon beam. Monte Carlo simulations indicate that this configuration with the fixed mask significantly improves the pressure reduction capability of the differential pump, to ~3×10⁻⁵, within the operational range from ~10⁻⁴ to 10⁻¹⁰ Torr. The lower end of pressure is limited by outgassing from front-end components and the higher end by the pumping ability of the ion pump.
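
    The molecular-flow calculation referenced above can be illustrated with a test-particle Monte Carlo: molecules enter an aperture with a cosine-law angular distribution, reflect diffusely from the walls, and are counted if they leave through the far end. The sketch below estimates the transmission probability of a single cylindrical channel under free molecular flow; it is a generic illustration (not the APS program), and the radius, length and particle count are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

def cosine_direction(normal, t1, t2):
    """Sample a direction from the cosine (Lambert) law about a local unit normal."""
    phi = 2.0 * np.pi * rng.random()
    mu = np.sqrt(rng.random())               # cos(theta) for a cosine-law flux
    s = np.sqrt(1.0 - mu * mu)
    return mu * normal + s * np.cos(phi) * t1 + s * np.sin(phi) * t2

def transmission_probability(radius, length, n_mol=50_000):
    """Free-molecular transmission probability of a cylindrical tube
    with diffusely reflecting walls (test-particle Monte Carlo)."""
    transmitted = 0
    ez = np.array([0.0, 0.0, 1.0])
    for _ in range(n_mol):
        # Enter uniformly over the inlet disc at z = 0 with a cosine-law direction.
        r = radius * np.sqrt(rng.random())
        a = 2.0 * np.pi * rng.random()
        pos = np.array([r * np.cos(a), r * np.sin(a), 0.0])
        d = cosine_direction(ez, np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0]))
        while True:
            # Distance along d to the cylindrical wall x^2 + y^2 = radius^2.
            A = d[0] ** 2 + d[1] ** 2
            if A < 1e-12:                     # travelling parallel to the axis
                transmitted += d[2] > 0.0
                break
            B = pos[0] * d[0] + pos[1] * d[1]
            C = pos[0] ** 2 + pos[1] ** 2 - radius ** 2
            t_wall = (-B + np.sqrt(max(B * B - A * C, 0.0))) / A
            z_hit = pos[2] + t_wall * d[2]
            if z_hit >= length:               # crosses the outlet plane before the wall
                transmitted += 1
                break
            if z_hit <= 0.0:                  # returns through the inlet first
                break
            # Diffuse re-emission from the wall about the inward surface normal.
            pos = pos + t_wall * d
            n = np.array([-pos[0], -pos[1], 0.0]) / radius
            t1 = np.array([-n[1], n[0], 0.0])
            d = cosine_direction(n, t1, ez)
    return transmitted / n_mol

print(transmission_probability(radius=1.0, length=1.0))   # ~0.67 for L/R = 1
```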

  2. Photonic quantum transport in a nonlinear optical fiber

    NASA Astrophysics Data System (ADS)

    Hafezi, M.; Chang, D. E.; Gritsev, V.; Demler, E. A.; Lukin, M. D.

    2011-06-01

    We theoretically study the transmission of few-photon quantum fields through a strongly nonlinear optical medium. We develop a general approach to investigate nonequilibrium quantum transport of bosonic fields through a finite-size nonlinear medium and apply it to a recently demonstrated experimental system where cold atoms are loaded in a hollow-core optical fiber. We show that when the interaction between photons is effectively repulsive, the system acts as a single-photon switch. In the case of attractive interaction, the system can exhibit either antibunching or bunching, associated with the resonant excitation of bound states of photons by the input field. These effects can be observed by probing statistics of photons transmitted through the nonlinear fiber.

  3. Robust light transport in non-Hermitian photonic lattices.

    PubMed

    Longhi, Stefano; Gatti, Davide; Della Valle, Giuseppe

    2015-01-01

    Combating the effects of disorder on light transport in micro- and nano-integrated photonic devices is of major importance from both fundamental and applied viewpoints. In ordinary waveguides, imperfections and disorder cause unwanted back-reflections, which hinder large-scale optical integration. Topological photonic structures, a new class of optical systems inspired by quantum Hall effect and topological insulators, can realize robust transport via topologically-protected unidirectional edge modes. Such waveguides are realized by the introduction of synthetic gauge fields for photons in a two-dimensional structure, which break time reversal symmetry and enable one-way guiding at the edge of the medium. Here we suggest a different route toward robust transport of light in lower-dimensional (1D) photonic lattices, in which time reversal symmetry is broken because of the non-Hermitian nature of transport. While a forward propagating mode in the lattice is amplified, the corresponding backward propagating mode is damped, thus resulting in an asymmetric transport insensitive to disorder or imperfections in the structure. Non-Hermitian asymmetric transport can occur in tight-binding lattices with an imaginary gauge field via a non-Hermitian delocalization transition, and in periodically-driven superlattices. The possibility to observe non-Hermitian delocalization is suggested using an engineered coupled-resonator optical waveguide (CROW) structure. PMID:26314932

  4. Robust light transport in non-Hermitian photonic lattices

    PubMed Central

    Longhi, Stefano; Gatti, Davide; Valle, Giuseppe Della

    2015-01-01

    Combating the effects of disorder on light transport in micro- and nano-integrated photonic devices is of major importance from both fundamental and applied viewpoints. In ordinary waveguides, imperfections and disorder cause unwanted back-reflections, which hinder large-scale optical integration. Topological photonic structures, a new class of optical systems inspired by quantum Hall effect and topological insulators, can realize robust transport via topologically-protected unidirectional edge modes. Such waveguides are realized by the introduction of synthetic gauge fields for photons in a two-dimensional structure, which break time reversal symmetry and enable one-way guiding at the edge of the medium. Here we suggest a different route toward robust transport of light in lower-dimensional (1D) photonic lattices, in which time reversal symmetry is broken because of the non-Hermitian nature of transport. While a forward propagating mode in the lattice is amplified, the corresponding backward propagating mode is damped, thus resulting in an asymmetric transport insensitive to disorder or imperfections in the structure. Non-Hermitian asymmetric transport can occur in tight-binding lattices with an imaginary gauge field via a non-Hermitian delocalization transition, and in periodically-driven superlattices. The possibility to observe non-Hermitian delocalization is suggested using an engineered coupled-resonator optical waveguide (CROW) structure. PMID:26314932
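
    The imaginary gauge field invoked here can be illustrated with a Hatano-Nelson-type tight-binding chain whose forward and backward hoppings are J·exp(+h) and J·exp(-h). A gauge transformation maps it onto a Hermitian disordered chain, so the ratio of the end-to-end Green's function elements in the two directions is exp(2h(N-1)) independently of the on-site disorder, which is one way to see the robustness of the asymmetric transport. The sketch below checks this numerically; it is a generic tight-binding illustration with arbitrary parameters, not the CROW design proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(7)
N, J, h, W = 20, 1.0, 0.1, 0.5        # sites, hopping, imaginary gauge field, disorder strength

# Hatano-Nelson chain: asymmetric hoppings J*exp(+h) and J*exp(-h), random on-site energies.
H = np.zeros((N, N), dtype=complex)
for n in range(N - 1):
    H[n + 1, n] = J * np.exp(+h)       # forward (amplified) hopping
    H[n, n + 1] = J * np.exp(-h)       # backward (damped) hopping
H += np.diag(W * (rng.random(N) - 0.5))

# Retarded Green's function at energy E; the corner elements measure
# end-to-end propagation in the two directions.
E, eta = 0.0, 1e-2
G = np.linalg.inv((E + 1j * eta) * np.eye(N) - H)
print("forward / backward amplitude ratio:", abs(G[N - 1, 0]) / abs(G[0, N - 1]))
print("expected exp(2h(N-1))             :", np.exp(2 * h * (N - 1)))
```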

  5. Monte Carlo calculation based on hydrogen composition of the tissue for MV photon radiotherapy.

    PubMed

    Demol, Benjamin; Viard, Romain; Reynaert, Nick

    2015-01-01

    The purpose of this study was to demonstrate that Monte Carlo treatment planning systems require tissue characterization (density and composition) as a function of CT number. A discrete set of tissue classes with a specific composition is introduced. In the current work we demonstrate that, for megavoltage photon radiotherapy, only the hydrogen content of the different tissues is of interest. This conclusion might have an impact on MRI-based dose calculations and on MVCT calibration using tissue substitutes. A stoichiometric calibration was performed, grouping tissues with similar atomic composition into 15 dosimetrically equivalent subsets. To demonstrate the importance of hydrogen, a new scheme was derived, with correct hydrogen content, complemented by oxygen (all elements differing from hydrogen are replaced by oxygen). Mass attenuation coefficients and mass stopping powers for this scheme were calculated and compared to the original scheme. Twenty-five CyberKnife treatment plans were recalculated by an in-house developed Monte Carlo system using tissue density and hydrogen content derived from the CT images. The results were compared to Monte Carlo simulations using the original stoichiometric calibration. Between 300 keV and 3 MeV, the relative difference of mass attenuation coefficients is under 1% within all subsets. Between 10 keV and 20 MeV, the relative difference of mass stopping powers goes up to 5% in hard bone and remains below 2% for all other tissue subsets. Dose-volume histograms (DVHs) of the treatment plans present no visual difference between the two schemes. Relative differences of dose indexes D98, D95, D50, D05, D02, and Dmean were analyzed and a distribution centered around zero and of standard deviation below 2% (3 σ) was established. On the other hand, once the hydrogen content is slightly modified, important dose differences are obtained. Monte Carlo dose planning in the field of megavoltage photon radiotherapy is fully achievable using only the tissue density and hydrogen content derived from the CT images.
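
    The "hydrogen plus oxygen" scheme can be reproduced with the standard mixture rule, (μ/ρ)_tissue = Σ_i w_i (μ/ρ)_i, applied once to the full elemental composition and once to a surrogate that keeps the hydrogen mass fraction and assigns the remainder to oxygen. The sketch below shows only that bookkeeping; the elemental coefficients and the tissue composition are placeholder values and should be taken from tabulations such as NIST XCOM for real use.

```python
# Mixture-rule comparison of a full tissue composition with its
# "hydrogen + oxygen" surrogate.  Coefficients below are placeholders,
# NOT tabulated data; substitute NIST XCOM values at the energy of interest.

def mass_attenuation(composition, mu_rho_elements):
    """(mu/rho) of a mixture from mass fractions and elemental coefficients."""
    return sum(w * mu_rho_elements[el] for el, w in composition.items())

# Placeholder elemental (mu/rho) values at some megavoltage energy [cm^2/g]
mu_rho = {"H": 0.126, "C": 0.063, "N": 0.064, "O": 0.064, "Ca": 0.066}

# Placeholder soft-tissue mass fractions
tissue = {"H": 0.105, "C": 0.120, "N": 0.035, "O": 0.735, "Ca": 0.005}

# Surrogate: keep the hydrogen fraction, replace everything else by oxygen
surrogate = {"H": tissue["H"], "O": 1.0 - tissue["H"]}

full = mass_attenuation(tissue, mu_rho)
simple = mass_attenuation(surrogate, mu_rho)
print(f"full composition : {full:.4f} cm^2/g")
print(f"H + O surrogate  : {simple:.4f} cm^2/g  ({100*(simple/full - 1):+.2f} %)")
```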

  6. The difference of scoring dose to water or tissues in Monte Carlo dose calculations for low energy brachytherapy photon sources

    SciTech Connect

    Landry, Guillaume; Reniers, Brigitte; Pignol, Jean-Philippe; Beaulieu, Luc; Verhaegen, Frank

    2011-03-15

    Purpose: The goal of this work is to compare D_m,m (radiation transported in medium; dose scored in medium) and D_w,m (radiation transported in medium; dose scored in water) obtained from Monte Carlo (MC) simulations for a subset of human tissues of interest in low energy photon brachytherapy. Using low dose rate seeds and an electronic brachytherapy source (EBS), the authors quantify the large cavity theory conversion factors required. The authors also assess whether applying large cavity theory utilizing the sources' initial photon spectra and average photon energy induces errors related to spatial spectral variations. First, ideal spherical geometries were investigated, followed by clinical brachytherapy LDR seed implants for breast and prostate cancer patients. Methods: Two types of dose calculations are performed with the GEANT4 MC code. (1) For several human tissues, dose profiles are obtained in spherical geometries centered on four types of low energy brachytherapy sources: ¹²⁵I, ¹⁰³Pd, and ¹³¹Cs seeds, as well as an EBS operating at 50 kV. Ratios of D_w,m over D_m,m are evaluated in the 0-6 cm range. In addition to mean tissue composition, compositions corresponding to one standard deviation from the mean are also studied. (2) Four clinical breast (using ¹⁰³Pd) and prostate (using ¹²⁵I) brachytherapy seed implants are considered. MC dose calculations are performed based on postimplant CT scans using prostate and breast tissue compositions. PTV D_90 values are compared for D_w,m and D_m,m. Results: (1) Differences (D_w,m/D_m,m-1) of -3% to 70% are observed for the investigated tissues. For a given tissue, D_w,m/D_m,m is similar for all sources within 4% and does not vary more than 2% with distance due to very moderate spectral shifts. Variations of tissue composition about the assumed mean composition influence the conversion factors up to 38%. (2) The ratio of D_90(w
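
    For low-energy photons the large-cavity conversion between dose-to-medium and dose-to-water reduces to the ratio of mass energy-absorption coefficients averaged over the local photon fluence spectrum, D_w,m/D_m,m = Σ Φ(E)·E·(μ_en/ρ)_w / Σ Φ(E)·E·(μ_en/ρ)_m. The sketch below shows that spectrum average only; the spectrum and the μ_en/ρ arrays are placeholders, not GEANT4 output or tabulated data.

```python
import numpy as np

# Large-cavity conversion factor D_w,m / D_m,m as a fluence-spectrum average
# of the water-to-medium ratio of mass energy-absorption coefficients.
# All arrays below are placeholders; use tabulated (mu_en/rho) data in practice.

energies = np.array([0.02, 0.025, 0.03, 0.035])      # MeV
fluence  = np.array([0.2, 0.4, 0.3, 0.1])            # relative photon fluence spectrum
muen_w   = np.array([0.54, 0.27, 0.15, 0.10])        # (mu_en/rho) water  [cm^2/g], placeholder
muen_m   = np.array([0.60, 0.30, 0.17, 0.11])        # (mu_en/rho) tissue [cm^2/g], placeholder

collision_kerma_w = np.sum(fluence * energies * muen_w)
collision_kerma_m = np.sum(fluence * energies * muen_m)
print("D_w,m / D_m,m ≈", collision_kerma_w / collision_kerma_m)
```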

  7. Simulating photon scattering effects in structurally detailed ventricular models using a Monte Carlo approach

    PubMed Central

    Bishop, Martin J.; Plank, Gernot

    2014-01-01

    Light scattering during optical imaging of electrical activation within the heart is known to significantly distort the optically-recorded action potential (AP) upstroke, as well as affecting the magnitude of the measured response of ventricular tissue to strong electric shocks. Modeling approaches based on the photon diffusion equation have recently been instrumental in quantifying and helping to understand the origin of the resulting distortion. However, they are unable to faithfully represent regions of non-scattering media, such as small cavities within the myocardium which are filled with perfusate during experiments. Stochastic Monte Carlo (MC) approaches allow simulation and tracking of individual photon “packets” as they propagate through tissue with differing scattering properties. Here, we present a novel application of the MC method of photon scattering simulation, applied for the first time to the simulation of cardiac optical mapping signals within unstructured, tetrahedral, finite element computational ventricular models. The method faithfully allows simulation of optical signals over highly-detailed, anatomically-complex MR-based models, including representations of fine-scale anatomy and intramural cavities. We show that the optical action potential upstroke is more prolonged close to large subepicardial vessels than further away from them, at times having a distinct “humped” morphology. Furthermore, we uncover a novel mechanism by which photon scattering effects around vessel cavities interact with “virtual-electrode” regions of strong de-/hyper-polarized tissue surrounding cavities during shocks, significantly reducing the apparent optically-measured epicardial polarization. We therefore demonstrate the importance of this novel optical mapping simulation approach along with highly anatomically-detailed models to fully investigate electrophysiological phenomena driven by fine-scale structural heterogeneity. PMID:25309442
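
    The core update in a photon-packet Monte Carlo of this kind is an exponentially sampled free path, survival weighting for absorption, and a Henyey-Greenstein sample for the new scattering direction. The sketch below shows that update in an unbounded homogeneous medium with a toy penetration-depth tally; the optical properties are arbitrary illustrative values, and this is not the authors' cardiac optical-mapping code.

```python
import numpy as np

rng = np.random.default_rng(3)
mu_a, mu_s, g = 0.5, 10.0, 0.9           # absorption, scattering [1/cm], anisotropy (illustrative)
mu_t = mu_a + mu_s

def hg_cos_theta():
    """Sample the scattering-angle cosine from the Henyey-Greenstein phase function."""
    u = rng.random()
    if g == 0.0:
        return 2.0 * u - 1.0
    tmp = (1.0 - g * g) / (1.0 - g + 2.0 * g * u)
    return (1.0 + g * g - tmp * tmp) / (2.0 * g)

def scatter(u):
    """Rotate the direction cosines u = (ux, uy, uz) by a Henyey-Greenstein polar angle."""
    ct, psi = hg_cos_theta(), 2.0 * np.pi * rng.random()
    st = np.sqrt(max(0.0, 1.0 - ct * ct))
    ux, uy, uz = u
    if abs(uz) > 0.99999:                 # nearly along z: simple rotation
        return np.array([st * np.cos(psi), st * np.sin(psi), np.sign(uz) * ct])
    d = np.sqrt(1.0 - uz * uz)
    return np.array([
        st * (ux * uz * np.cos(psi) - uy * np.sin(psi)) / d + ux * ct,
        st * (uy * uz * np.cos(psi) + ux * np.sin(psi)) / d + uy * ct,
        -st * np.cos(psi) * d + uz * ct,
    ])

def mean_depth(n_packets=1000):
    """Toy tally: average maximum depth reached before the packet weight is exhausted."""
    acc = 0.0
    for _ in range(n_packets):
        pos, u, w, zmax = np.zeros(3), np.array([0.0, 0.0, 1.0]), 1.0, 0.0
        while w > 1e-3:                              # crude cutoff (roulette in practice)
            pos = pos + u * (-np.log(rng.random()) / mu_t)   # exponential free path
            zmax = max(zmax, pos[2])
            w *= mu_s / mu_t                                  # survival (albedo) weighting
            u = scatter(u)
        acc += zmax
    return acc / n_packets

print(f"mean maximum depth ≈ {mean_depth():.3f} cm")
```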

  8. Monte Carlo-based revised values of dose rate constants at discrete photon energies

    PubMed Central

    Selvam, T. Palani; Shrivastava, Vandana; Chourasiya, Ghanashyam; Babu, D. Appala Raju

    2014-01-01

    Absorbed dose rate to water at 0.2 cm and 1 cm due to a point isotropic photon source as a function of photon energy is calculated using the EDKnrc user-code of the EGSnrc Monte Carlo system. This code system utilized the widely used XCOM photon cross-section dataset for the calculation of absorbed dose to water. Using the above dose rates, dose rate constants are calculated. The air-kerma strength Sk needed for deriving the dose rate constant is based on the mass-energy absorption coefficient compilations of Hubbell and Seltzer published in the year 1995. A comparison of absorbed dose rates in water at the above distances to the published values reflects the differences in the photon cross-section datasets in the low-energy region (the difference is up to 2% in dose rate values at 1 cm in the energy range 30–50 keV and up to 4% at 0.2 cm at 30 keV). A maximum difference of about 8% is observed in the dose rate value at 0.2 cm at 1.75 MeV when compared to the published value. Sk calculations based on the compilation of Hubbell and Seltzer show a difference of up to 2.5% in the low-energy region (20–50 keV) when compared to the published values. The deviations observed in the values of dose rate and Sk affect the values of dose rate constants by up to 3%. PMID:24600166

  9. Effect of transverse magnetic fields on dose distribution and RBE of photon beams: comparing PENELOPE and EGS4 Monte Carlo codes

    NASA Astrophysics Data System (ADS)

    Nettelbeck, H.; Takacs, G. J.; Rosenfeld, A. B.

    2008-09-01

    The application of a strong transverse magnetic field to a volume undergoing irradiation by a photon beam can produce localized regions of dose enhancement and dose reduction. This study uses the PENELOPE Monte Carlo code to investigate the effect of a slice of uniform transverse magnetic field on a photon beam using different magnetic field strengths and photon beam energies. The maximum and minimum dose yields obtained in the regions of dose enhancement and dose reduction are compared to those obtained with the EGS4 Monte Carlo code in a study by Li et al (2001), who investigated the effect of a slice of uniform transverse magnetic field (1 to 20 Tesla) applied to high-energy photon beams. PENELOPE simulations yielded maximum dose enhancements and dose reductions of as much as 111% and 77%, respectively, with most results within 6% of the EGS4 results. Further PENELOPE simulations were performed with the Sheikh-Bagheri and Rogers (2002) input spectra for 6, 10 and 15 MV photon beams, yielding results within 4% of those obtained with the Mohan et al (1985) spectra. Small discrepancies between a few of the EGS4 and PENELOPE results prompted an investigation into the influence of the PENELOPE elastic scattering parameters C1 and C2 and low-energy electron and photon transport cut-offs. Repeating the simulations with smaller scoring bins improved the resolution of the regions of dose enhancement and dose reduction, especially near the magnetic field boundaries where the dose deposition can abruptly increase or decrease. This study also investigates the effect of a magnetic field on the low-energy electron spectrum that may correspond to a change in the radiobiological effectiveness (RBE). Simulations show that the increase in dose is achieved predominantly through the lower energy electron population.
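
    The essential effect of the transverse field on the secondary electrons is a rotation of the momentum direction between interactions: over a step of length s the direction turns by α = s/r_g about the field, with gyroradius r_g = p/(qB). The sketch below evaluates that kinematics for a relativistic electron; it is not the PENELOPE or EGS4 field-transport algorithm, and the energy, field strength and step length are arbitrary.

```python
import numpy as np

# Rotation of a relativistic electron's direction over one step in a uniform B field.
M_E = 0.511          # electron rest energy [MeV]
C = 299.792458e6     # speed of light [m/s]

def gyroradius(kinetic_mev, b_tesla):
    """Gyroradius r = p / (e B) for an electron, in metres."""
    e_total = kinetic_mev + M_E
    p_mev_c = np.sqrt(e_total**2 - M_E**2)          # momentum in MeV/c
    p_si = p_mev_c * 1.602176634e-13 / C            # kg m/s
    return p_si / (1.602176634e-19 * b_tesla)

def rotate_about(u, axis, alpha):
    """Rodrigues rotation of unit vector u by angle alpha about a unit axis.
    (The sense of rotation depends on the sign of the charge; only the magnitude is shown.)"""
    return (u * np.cos(alpha)
            + np.cross(axis, u) * np.sin(alpha)
            + axis * np.dot(axis, u) * (1.0 - np.cos(alpha)))

# Example: 1 MeV electron, 1.5 T transverse field, 1 mm step.
u = np.array([0.0, 0.0, 1.0])                 # initial direction (along the beam axis)
b_hat = np.array([1.0, 0.0, 0.0])             # field direction
step = 1.0e-3                                  # m
alpha = step / gyroradius(1.0, 1.5)            # turning angle over the step
print("gyroradius [mm]:", 1e3 * gyroradius(1.0, 1.5))
print("new direction  :", rotate_about(u, b_hat, alpha))
```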

  10. A Monte Carlo simulation for predicting photon return from sodium laser guide star

    NASA Astrophysics Data System (ADS)

    Feng, Lu; Kibblewhite, Edward; Jin, Kai; Xue, Suijian; Shen, Zhixia; Bo, Yong; Zuo, Junwei; Wei, Kai

    2015-10-01

    The sodium laser guide star is an ideal source for astronomical adaptive optics systems correcting wave-front aberrations caused by atmospheric turbulence. However, even a compact, high-quality sodium laser with power above 20 W, which is costly and difficult to manufacture, does not guarantee a sufficiently bright guide star, owing to the physics of the sodium atom in the atmosphere. It would therefore be helpful if a prediction tool could estimate the photon-generating performance for arbitrary laser output formats before an actual laser is designed. Based on rate equations, we developed Monte Carlo simulation software that can be used to predict sodium laser guide star generation for arbitrary laser formats. In this paper, we describe the model of our simulation and its implementation, and present comparisons with field test data.

  11. Study on Photon Transport Problem Based on the Platform of Molecular Optical Simulation Environment

    PubMed Central

    Peng, Kuan; Gao, Xinbo; Liang, Jimin; Qu, Xiaochao; Ren, Nunu; Chen, Xueli; Ma, Bin; Tian, Jie

    2010-01-01

    As an important molecular imaging modality, optical imaging has attracted increasing attention in recent years. Since physical experiments are usually complicated and expensive, research methods based on simulation platforms have received extensive attention. We developed a simulation platform named Molecular Optical Simulation Environment (MOSE) to simulate photon transport in both biological tissues and free space for optical imaging based on noncontact measurement. In this platform, the Monte Carlo (MC) method and the hybrid radiosity-radiance theorem are used to simulate photon transport in biological tissues and free space, respectively, so both contact and noncontact measurement modes of optical imaging can be simulated properly. In addition, a parallelization strategy for the MC method is employed to improve the computational efficiency. In this paper, we study the photon transport problems in both biological tissues and free space using MOSE. The results are compared with Tracepro, the simplified spherical harmonics method (SPn), and physical measurement to verify the performance of our study method on both accuracy and efficiency. PMID:20445737

  12. Study on photon transport problem based on the platform of molecular optical simulation environment.

    PubMed

    Peng, Kuan; Gao, Xinbo; Liang, Jimin; Qu, Xiaochao; Ren, Nunu; Chen, Xueli; Ma, Bin; Tian, Jie

    2010-01-01

    As an important molecular imaging modality, optical imaging has attracted increasing attention in recent years. Since physical experiments are usually complicated and expensive, research methods based on simulation platforms have received extensive attention. We developed a simulation platform named Molecular Optical Simulation Environment (MOSE) to simulate photon transport in both biological tissues and free space for optical imaging based on noncontact measurement. In this platform, the Monte Carlo (MC) method and the hybrid radiosity-radiance theorem are used to simulate photon transport in biological tissues and free space, respectively, so both contact and noncontact measurement modes of optical imaging can be simulated properly. In addition, a parallelization strategy for the MC method is employed to improve the computational efficiency. In this paper, we study the photon transport problems in both biological tissues and free space using MOSE. The results are compared with Tracepro, the simplified spherical harmonics method (SPn), and physical measurement to verify the performance of our study method on both accuracy and efficiency. PMID:20445737

  13. MULTIDIMENSIONAL COUPLED PHOTON-ELECTRON TRANSPORT SIMULATIONS USING NEUTRAL PARTICLE SN CODES

    SciTech Connect

    Ilas, Dan; Williams, Mark L; Peplow, Douglas E.; Kirk, Bernadette Lugue

    2008-01-01

    During the past two years, a study was under way at ORNL to assess the suitability of the popular SN neutral particle codes ANISN, DORT and TORT for coupled photon-electron calculations specific to external beam therapy in medical physics applications. The CEPXS-BFP code was used to generate the cross sections. The computational tests were performed on phantoms typical of those used in medical physics for external beam therapy, with materials simulated by water at different densities, and the comparisons were made against Monte Carlo simulations that served as benchmarks. Although the results for one-dimensional calculations were encouraging, it appeared that the higher-dimensional transport codes had fundamental difficulties in handling the electron transport. The results of two-dimensional simulations using the code DORT with an S16 fully symmetric quadrature set agree fairly well with the reference Monte Carlo results, but not well enough for clinical applications. While the photon fluxes are in better agreement (generally, within less than 5% of the reference), the discrepancy increases, sometimes very significantly, for the electron fluxes. The paper, however, focuses on the results obtained with the three-dimensional code TORT, which had convergence difficulties for the electron groups. Numerical instabilities occurred in these groups. These instabilities were more pronounced for problems with a higher degree of anisotropy.

  14. Evaluation of Electron Contamination in Cancer Treatment with Megavoltage Photon Beams: Monte Carlo Study

    PubMed Central

    Seif, F.; Bayatiani, M. R.

    2015-01-01

    Background Megavoltage beams used in radiotherapy are contaminated with secondary electrons. Different parts of the linac head and the air above the patient act as sources of this contamination. This contamination can increase damage to skin and subcutaneous tissue during radiotherapy. Monte Carlo simulation is an accurate method for dose calculation in medical dosimetry and has an important role in the optimization of linac head materials. The aim of this study was to calculate the electron contamination of a Varian linac. Materials and Method The 6 MV photon beam of a Varian 2100 C/D linac was simulated with the Monte Carlo code MCNPX, based on the manufacturer's specifications. The validation was done by comparing the calculated depth dose and profiles of the simulation with dosimetry measurements in a water phantom (error less than 2%). The percentage depth doses (PDDs), profiles and contamination electron energy spectrum were calculated for different therapeutic field sizes (5×5 to 40×40 cm²) for both linacs. Results The dose of electron contamination was observed to rise with increasing field size. The contribution of the secondary contamination electrons to the surface dose ranged from 6% for 5×5 cm² to 27% for 40×40 cm² fields. Conclusion Based on the results, the effect of electron contamination on patient surface dose cannot be ignored, so knowledge of the electron contamination is important in clinical dosimetry. It must be calculated for each machine and considered in Treatment Planning Systems. PMID:25973409

  15. FZ2MC: A Tool for Monte Carlo Transport Code Geometry Manipulation

    SciTech Connect

    Hackel, B M; Nielsen Jr., D E; Procassini, R J

    2009-02-25

    The process of creating and validating combinatorial geometry representations of complex systems for use in Monte Carlo transport simulations can be both time consuming and error prone. To simplify this process, a tool has been developed which employs extensions of the Form-Z commercial solid modeling tool. The resultant FZ2MC (Form-Z to Monte Carlo) tool permits users to create, modify and validate Monte Carlo geometry and material composition input data. Plugin modules that export this data to an input file, as well as parse data from existing input files, have been developed for several Monte Carlo codes. The FZ2MC tool is envisioned as a 'universal' tool for the manipulation of Monte Carlo geometry and material data. To this end, collaboration on the development of plug-in modules for additional Monte Carlo codes is desired.

  16. Transport properties of pseudospin-1 photons (Presentation Recording)

    NASA Astrophysics Data System (ADS)

    Chan, Che Ting; Fang, Anan; Zhang, Zhao-Qing; Louie, Steven G.

    2015-09-01

    Pseudospin is of central importance in governing many unusual transport properties of graphene and other artificial systems which have pseudospins of 1/2. These unconventional transport properties are manifested in phenomena such as Klein tunneling, and collimation of electron beams in one-dimensional external potentials. Here we show that in certain photonic crystals (PCs) exhibiting conical dispersions at the center of Brillouin zone, the eigenstates near the "Dirac-like point" can be described by an effective spin-orbit Hamiltonian with a pseudospin of 1. This effective Hamiltonian describes within a unified framework the wave propagations in both positive and negative refractive index media which correspond to the upper and lower conical bands respectively. Different from a Berry phase of π for the Dirac cone of pseudospin-1/2 systems, the Berry phase for the Dirac-like cone turns out to be zero from this pseudospin-1 Hamiltonian. In addition, we found that a change of length scale of the PC can shift the Dirac-like cone rigidly up or down in frequency with its group velocity unchanged, hence mimicking a gate voltage in graphene and allowing for a simple mechanism to control the flow of pseudospin-1 photons. As a photonic analogue of electron potential, the length-scale induced Dirac-like point shift is effectively a photonic potential within the effective pseudospin-1 Hamiltonian description. At the interface of two different potentials, the 3-component spinor gives rise to distinct boundary conditions which do not require each component of the wave function to be continuous, leading to new wave transport behaviors as shown in Klein tunneling and supercollimation. For example, the Klein tunneling of pseudospin-1 photons is much less anisotropic with reference to the incident angle than that of pseudospin-1/2 electrons, and collimation can be more robust with pseudospin-1 than pseudospin-1/2. The special wave transport properties of pseudospin-1 photons

  17. FASTER 3: A generalized-geometry Monte Carlo computer program for the transport of neutrons and gamma rays. Volume 2: Users manual

    NASA Technical Reports Server (NTRS)

    Jordan, T. M.

    1970-01-01

    A description of the FASTER-III program for Monte Carlo calculation of photon and neutron transport in complex geometries is presented. Major revisions include the capability of calculating minimum weight shield configurations for primary and secondary radiation and optimal importance sampling parameters. The program description includes a users manual describing the preparation of input data cards, the printout from a sample problem including the data card images, definitions of Fortran variables, the program logic, and the control cards required to run on the IBM 7094, IBM 360, UNIVAC 1108 and CDC 6600 computers.

  18. Extension of the Integrated Tiger Series (ITS) of electron-photon Monte Carlo codes to 100 GeV

    SciTech Connect

    Miller, S.G.

    1988-08-01

    Version 2.1 of the Integrated Tiger Series (ITS) of electron-photon Monte Carlo codes was modified to extend their ability to model interactions up to 100 GeV. Benchmarks against experimental results conducted at 10 and 15 GeV confirm the accuracy of the extended codes. 12 refs., 2 figs., 2 tabs.

  19. Detector-selection technique for Monte Carlo transport in azimuthally symmetric geometries

    SciTech Connect

    Hoffman, T.J.; Tang, J.S.; Parks, C.V.

    1982-01-01

    Many radiation transport problems contain geometric symmetries which are not exploited in obtaining their Monte Carlo solutions. An important class of problems is that in which the geometry is symmetric about an axis. These problems arise in the analyses of a reactor core or shield, spent fuel shipping casks, tanks containing radioactive solutions, radiation transport in the atmosphere (air-over-ground problems), etc. Although amenable to deterministic solution, such problems can often be solved more efficiently and accurately with the Monte Carlo method. For this class of problems, a technique is described in this paper which significantly reduces the variance of the Monte Carlo-calculated effect of interest at point detectors.

  20. Comparative analysis of discrete and continuous absorption weighting estimators used in Monte Carlo simulations of radiative transport in turbid media.

    PubMed

    Hayakawa, Carole K; Spanier, Jerome; Venugopalan, Vasan

    2014-02-01

    We examine the relative error of Monte Carlo simulations of radiative transport that employ two commonly used estimators that account for absorption differently, either discretely, at interaction points, or continuously, between interaction points. We provide a rigorous derivation of these discrete and continuous absorption weighting estimators within a stochastic model that we show to be equivalent to an analytic model, based on the radiative transport equation (RTE). We establish that both absorption weighting estimators are unbiased and, therefore, converge to the solution of the RTE. An analysis of spatially resolved reflectance predictions provided by these two estimators reveals no advantage to either in cases of highly scattering and highly anisotropic media. However, for moderate to highly absorbing media or isotropically scattering media, the discrete estimator provides smaller errors at proximal source locations while the continuous estimator provides smaller errors at distal locations. The origin of these differing variance characteristics can be understood through examination of the distribution of exiting photon weights. PMID:24562029

  1. Comparative analysis of discrete and continuous absorption weighting estimators used in Monte Carlo simulations of radiative transport in turbid media

    PubMed Central

    Hayakawa, Carole K.; Spanier, Jerome; Venugopalan, Vasan

    2014-01-01

    We examine the relative error of Monte Carlo simulations of radiative transport that employ two commonly used estimators that account for absorption differently, either discretely, at interaction points, or continuously, between interaction points. We provide a rigorous derivation of these discrete and continuous absorption weighting estimators within a stochastic model that we show to be equivalent to an analytic model, based on the radiative transport equation (RTE). We establish that both absorption weighting estimators are unbiased and, therefore, converge to the solution of the RTE. An analysis of spatially resolved reflectance predictions provided by these two estimators reveals no advantage to either in cases of highly scattering and highly anisotropic media. However, for moderate to highly absorbing media or isotropically scattering media, the discrete estimator provides smaller errors at proximal source locations while the continuous estimator provides smaller errors at distal locations. The origin of these differing variance characteristics can be understood through examination of the distribution of exiting photon weights. PMID:24562029
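
    In one common formulation of these estimators, the discrete scheme samples free paths with the total coefficient μ_t and multiplies the packet weight by the single-scattering albedo μ_s/μ_t at every collision, while the continuous scheme samples paths with μ_s alone and attenuates the weight by exp(-μ_a·ℓ) along each path segment; both are unbiased for the same tally. The sketch below compares the two for total diffuse reflectance from a semi-infinite, isotropically scattering half-space; the optical coefficients and the weight cutoff (standing in for Russian roulette) are arbitrary choices, and this is not the authors' code.

```python
import numpy as np

mu_s, mu_a = 9.0, 1.0                    # scattering / absorption coefficients [1/cm], arbitrary
mu_t = mu_s + mu_a

def reflectance(n_photons, continuous, seed=0):
    """Total diffuse reflectance of a semi-infinite isotropically scattering
    half-space, with absorption handled discretely or continuously."""
    rng = np.random.default_rng(seed)
    mu_free = mu_s if continuous else mu_t
    tally = 0.0
    for _ in range(n_photons):
        z, uz, w = 0.0, 1.0, 1.0                        # enter at z = 0 heading inward
        while w > 1e-4:                                  # crude cutoff (roulette in practice)
            s = -np.log(rng.random()) / mu_free          # sampled free path
            if uz < 0.0 and z + uz * s < 0.0:            # escapes before the next collision
                if continuous:
                    w *= np.exp(-mu_a * (-z / uz))       # attenuate over the partial path
                tally += w
                break
            z += uz * s
            if continuous:
                w *= np.exp(-mu_a * s)                   # continuous absorption weighting
            else:
                w *= mu_s / mu_t                         # discrete (albedo) weighting
            uz = 2.0 * rng.random() - 1.0                # isotropic scattering: new z-cosine
    return tally / n_photons

print("discrete  :", reflectance(5000, continuous=False))
print("continuous:", reflectance(5000, continuous=True))
```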

  2. SU-E-J-69: Iterative Deconvolution of the Initial Photon Fluence for EPID Dosimetry: A Monte Carlo Based Study

    SciTech Connect

    Czarnecki, D; Voigts-Rhetz, P von; Shishechian, D Uchimura; Zink, K

    2015-06-15

    Purpose: Developing a fast and accurate calculation model to reconstruct the applied photon fluence from an external photon radiation therapy treatment based on an image recorded by an electronic portal image device (EPID). Methods: To reconstruct the initial photon fluence, the 2D EPID image was corrected for scatter from the patient/phantom and EPID to generate the transmitted primary photon fluence. This was done by an iterative deconvolution using precalculated point spread functions (PSFs). The transmitted primary photon fluence was then backprojected through the patient/phantom geometry considering linear attenuation to recover the initial photon fluence applied for the treatment. The calculation model was verified using Monte Carlo simulations performed with the EGSnrc code system. EPID images were produced by calculating the dose deposition in the EPID from a 6 MV photon beam irradiating a water phantom with air and bone inhomogeneities and the ICRP anthropomorphic voxel phantom. Results: The initial photon fluence was reconstructed using a single PSF and position-dependent PSFs which depend on the radiological thickness of the irradiated object. Applying position-dependent point spread functions, the mean uncertainty of the reconstructed initial photon fluence could be reduced from 1.13% to 0.13%. Conclusion: This study presents a calculation model for fluence reconstruction from EPID images. The results show a clear advantage when position-dependent PSFs are used for the iterative reconstruction. The basic work of a reconstruction method was established and further evaluations must be made in an experimental study.
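
    The iterative deconvolution step can be written compactly: each iteration convolves the current estimate of the primary fluence with the point spread function and corrects the estimate by the residual, a Van Cittert-type update. The one-dimensional sketch below illustrates that update rule only; the PSF and the "measurement" are synthetic, and the position-dependent kernels of the abstract are not modelled.

```python
import numpy as np

def van_cittert_deconvolve(measured, psf, n_iter=100, relax=1.0):
    """Iteratively estimate the primary fluence f from measured = f * psf."""
    estimate = measured.copy()
    for _ in range(n_iter):
        reblurred = np.convolve(estimate, psf, mode="same")
        estimate = estimate + relax * (measured - reblurred)   # residual correction
    return estimate

# Synthetic 1D test: a "primary fluence" profile blurred by a normalised Gaussian PSF.
x = np.linspace(-10, 10, 401)
primary = (np.abs(x) < 5).astype(float) * (1.0 + 0.1 * np.cos(x))   # field with some structure
psf = np.exp(-0.5 * (x / 1.5) ** 2)
psf /= psf.sum()
measured = np.convolve(primary, psf, mode="same")

recovered = van_cittert_deconvolve(measured, psf)
rms = lambda a: np.sqrt(np.mean(a ** 2))
print("rms error before deconvolution:", rms(measured - primary))
print("rms error after  deconvolution:", rms(recovered - primary))
```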

  3. Photon-Inhibited Topological Transport in Quantum Well Heterostructures

    NASA Astrophysics Data System (ADS)

    Farrell, Aaron; Pereg-Barnea, T.

    2015-09-01

    Here we provide a picture of transport in quantum well heterostructures with a periodic driving field in terms of a probabilistic occupation of the topologically protected edge states in the system. This is done by generalizing methods from the field of photon-assisted tunneling. We show that the time dependent field dresses the underlying Hamiltonian of the heterostructure and splits the system into sidebands. Each of these sidebands is occupied with a certain probability which depends on the drive frequency and strength. This leads to a reduction in the topological transport signatures of the system because of the probability to absorb or emit a photon. Therefore when the voltage is tuned to the bulk gap the conductance is smaller than the expected 2e²/h. We refer to this as photon-inhibited topological transport. Nevertheless, the edge modes reveal their topological origin in the robustness of the edge conductance to disorder and changes in model parameters. In this work the analogy with photon-assisted tunneling allows us to interpret the calculated conductivity and explain the sum rule observed by Kundu and Seradjeh.
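
    In the standard photon-assisted-tunneling (Tien-Gordon) picture that such treatments build on, a harmonic drive of amplitude V_ac and frequency ω distributes transport over sidebands occupied with probabilities P_n = J_n(eV_ac/ħω)², which sum to one. The sketch below evaluates those weights; it is the textbook picture, not the authors' Floquet calculation for the heterostructure, and the drive strength is arbitrary.

```python
import numpy as np
from scipy.special import jv   # Bessel functions of the first kind

def sideband_weights(alpha, n_max=20):
    """Tien-Gordon sideband occupation probabilities P_n = J_n(alpha)^2,
    where alpha = e*V_ac / (hbar*omega) is the dimensionless drive strength."""
    n = np.arange(-n_max, n_max + 1)
    return n, jv(n, alpha) ** 2

n, p = sideband_weights(alpha=1.5)
print("sum rule  sum_n P_n =", p.sum())          # -> 1 (to numerical accuracy)
print("weight in the n = 0 band:", p[n == 0][0])
```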

  4. Monte Carlo calculations of correction factors for plastic phantoms in clinical photon and electron beam dosimetry.

    PubMed

    Araki, Fujio; Hanyu, Yuji; Fukuoka, Miyoko; Matsumoto, Kenji; Okumura, Masahiko; Oguchi, Hiroshi

    2009-07-01

    The purpose of this study is to calculate correction factors for plastic water (PW) and plastic water diagnostic-therapy (PWDT) phantoms in clinical photon and electron beam dosimetry using the EGSnrc Monte Carlo code system. A water-to-plastic ionization conversion factor k(pl) for PW and PWDT was computed for several commonly used Farmer-type ionization chambers with different wall materials in the range of 4-18 MV photon beams. For electron beams, a depth-scaling factor c(pl) and a chamber-dependent fluence correction factor h(pl) for both phantoms were also calculated in combination with NACP-02 and Roos plane-parallel ionization chambers in the range of 4-18 MeV. The h(pl) values for the plane-parallel chambers were evaluated from the electron fluence correction factor phi(pl)w and wall correction factors P(wall,w) and P(wall,pl) for a combination of water or plastic materials. The calculated k(pl) and h(pl) values were verified by comparison with the measured values. A set of k(pl) values computed for the Farmer-type chambers was equal to unity within 0.5% for PW and PWDT in photon beams. The k(pl) values also agreed within their combined uncertainty with the measured data. For electron beams, the c(pl) values computed for PW and PWDT were from 0.998 to 1.000 and from 0.992 to 0.997, respectively, in the range of 4-18 MeV. The phi(pl)w values for PW and PWDT were from 0.998 to 1.001 and from 1.004 to 1.001, respectively, at a reference depth in the range of 4-18 MeV. The difference in P(wall) between water and plastic materials for the plane-parallel chambers was 0.8% at a maximum. Finally, h(pl) values evaluated for plastic materials were equal to unity within 0.6% for NACP-02 and Roos chambers. The h(pl) values also agreed within their combined uncertainty with the measured data. The absorbed dose to water from ionization chamber measurements in PW and PWDT plastic materials corresponds to that in water within 1%. Both phantoms can thus be used as a substitute for water in clinical photon and electron beam dosimetry.

  5. ACCELERATING FUSION REACTOR NEUTRONICS MODELING BY AUTOMATIC COUPLING OF HYBRID MONTE CARLO/DETERMINISTIC TRANSPORT ON CAD GEOMETRY

    SciTech Connect

    Biondo, Elliott D; Ibrahim, Ahmad M; Mosher, Scott W; Grove, Robert E

    2015-01-01

    Detailed radiation transport calculations are necessary for many aspects of the design of fusion energy systems (FES) such as ensuring occupational safety, assessing the activation of system components for waste disposal, and maintaining cryogenic temperatures within superconducting magnets. Hybrid Monte Carlo (MC)/deterministic techniques are necessary for this analysis because FES are large, heavily shielded, and contain streaming paths that can only be resolved with MC. The tremendous complexity of FES necessitates the use of CAD geometry for design and analysis. Previous ITER analysis has required the translation of CAD geometry to MCNP5 form in order to use the AutomateD VAriaNce reducTion Generator (ADVANTG) for hybrid MC/deterministic transport. In this work, ADVANTG was modified to support CAD geometry, allowing hybrid MC/deterministic transport to be done automatically and eliminating the need for this translation step. This was done by adding a new ray tracing routine to ADVANTG for CAD geometries using the Direct Accelerated Geometry Monte Carlo (DAGMC) software library. This new capability is demonstrated with a prompt dose rate calculation for an ITER computational benchmark problem using both the Consistent Adjoint Driven Importance Sampling (CADIS) method and the Forward-Weighted (FW)-CADIS method. The variance reduction parameters produced by ADVANTG are shown to be the same using CAD geometry and standard MCNP5 geometry. Significant speedups were observed for both neutrons (as high as a factor of 7.1) and photons (as high as a factor of 59.6).
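
    In the CADIS scheme referenced here, the deterministic adjoint (importance) solution is used both to bias the source and to set weight-window centres so that particle weight times importance stays roughly constant. The sketch below shows only that bookkeeping on a one-dimensional grid; the source and adjoint arrays are illustrative placeholders, not ADVANTG or Denovo output.

```python
import numpy as np

# CADIS source biasing and weight-window centres from a deterministic adjoint flux.
# Arrays are illustrative placeholders on a 1D spatial grid.

source  = np.array([0.6, 0.3, 0.1, 0.0, 0.0])        # normalised forward source pdf per cell
adjoint = np.array([1e-3, 1e-2, 1e-1, 1.0, 10.0])    # adjoint flux (importance) per cell

response = np.sum(source * adjoint)                  # estimated detector response R
biased_source = source * adjoint / response          # CADIS biased source pdf (sums to 1)
window_centres = response / adjoint                  # particles born from the biased source
                                                     # start with weight = local window centre
print("biased source pdf  :", biased_source)
print("weight-window ctrs :", window_centres)
```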

  6. Monte Carlo simulation of small electron fields collimated by the integrated photon MLC

    NASA Astrophysics Data System (ADS)

    Mihaljevic, Josip; Soukup, Martin; Dohm, Oliver; Alber, Markus

    2011-02-01

    In this study, a Monte Carlo (MC)-based beam model for an ELEKTA linear accelerator was established. The beam model is based on the EGSnrc Monte Carlo code, whereby electron beams with nominal energies of 10, 12 and 15 MeV were considered. For collimation of the electron beam, only the integrated photon multi-leaf-collimators (MLCs) were used. No additional secondary or tertiary add-ons like applicators, cutouts or dedicated electron MLCs were included. The source parameters of the initial electron beam were derived semi-automatically from measurements of depth-dose curves and lateral profiles in a water phantom. A routine to determine the initial electron energy spectra was developed which fits a Gaussian spectrum to the most prominent features of depth-dose curves. The comparisons of calculated and measured depth-dose curves demonstrated agreement within 1%/1 mm. The source divergence angle of initial electrons was fitted to lateral dose profiles beyond the range of electrons, where the imparted dose is mainly due to bremsstrahlung produced in the scattering foils. For accurate modelling of narrow beam segments, the influence of air density on dose calculation was studied. The air density for simulations was adjusted to local values (433 m above sea level) and compared with the standard air supplied by the ICRU data set. The results indicate that the air density is an influential parameter for dose calculations. Furthermore, the default value of the BEAMnrc parameter 'skin depth' for the boundary crossing algorithm was found to be inadequate for the modelling of small electron fields. A higher value for this parameter eliminated discrepancies in too broad dose profiles and an increased dose along the central axis. The beam model was validated with measurements, whereby an agreement mostly within 3%/3 mm was found.

  7. Monte Carlo Simulation Of H⁻ Ion Transport

    SciTech Connect

    Diomede, P.; Longo, S.; Capitelli, M.

    2009-03-12

    In this work we study in detail the kinetics of H⁻ ion swarms in velocity space: this provides a useful contrast to the usual literature in the field, where device features in configuration space are often included in detail but kinetic distributions are only marginally considered. To this aim a Monte Carlo model is applied, which includes several collision processes of H⁻ ions with neutral particles as well as Coulomb collisions with positive ions. We characterize the full velocity distribution, i.e. including its anisotropy, for different values of E/N, the atomic fraction and the H⁺ mole fraction, which makes our results of interest for both source modeling and beam formation. A simple analytical theory for highly dissociated hydrogen is formulated and checked by Monte Carlo calculations.

  8. Analysis of neutron and photon response of a TLD-ALBEDO personal dosemeter on an ISO slab phantom using TRIPOLI-4.3 Monte Carlo code.

    PubMed

    Lee, Y K

    2005-01-01

    TRIPOLI-4.3 Monte Carlo transport code has been used to evaluate the QUADOS (Quality Assurance of Computational Tools for Dosimetry) problem P4, neutron and photon response of an albedo-type thermoluminescence personal dosemeter (TLD) located on an ISO slab phantom. Two enriched ⁶LiF and two ⁷LiF TLD chips were used and they were protected, in front or behind, with a boron-loaded dosemeter-holder. Neutron response of the four chips was determined by counting ⁶Li(n,t)⁴He events using ENDF/B-VI.4 library and photon response by estimating absorbed dose (MeV g⁻¹). Ten neutron energies from thermal to 20 MeV and six photon energies from 33 keV to 1.25 MeV were used to study the energy dependence. The fraction of the neutron and photon response owing to phantom backscatter has also been investigated. Detailed TRIPOLI-4.3 solutions are presented and compared with MCNP-4C calculations. PMID:16381740

  9. PyMercury: Interactive Python for the Mercury Monte Carlo Particle Transport Code

    SciTech Connect

    Iandola, F N; O'Brien, M J; Procassini, R J

    2010-11-29

    Monte Carlo particle transport applications are often written in low-level languages (C/C++) for optimal performance on clusters and supercomputers. However, this development approach often sacrifices straightforward usability and testing in the interest of fast application performance. To improve usability, some high-performance computing applications employ mixed-language programming with high-level and low-level languages. In this study, we consider the benefits of incorporating an interactive Python interface into a Monte Carlo application. With PyMercury, a new Python extension to the Mercury general-purpose Monte Carlo particle transport code, we improve application usability without diminishing performance. In two case studies, we illustrate how PyMercury improves usability and simplifies testing and validation in a Monte Carlo application. In short, PyMercury demonstrates the value of interactive Python for Monte Carlo particle transport applications. In the future, we expect interactive Python to play an increasingly significant role in Monte Carlo usage and testing.

  10. Multidimensional electron-photon transport with standard discrete ordinates codes

    SciTech Connect

    Drumm, C.R.

    1995-12-31

    A method is described for generating electron cross sections that are compatible with standard discrete ordinates codes without modification. There are many advantages of using an established discrete ordinates solver, e.g., immediately available adjoint capability. Coupled electron-photon transport capability is needed for many applications, including the modeling of the response of electronics components to space and man-made radiation environments. The cross sections have been successfully used in the DORT, TWODANT and TORT discrete ordinates codes. The cross sections are shown to provide accurate and efficient solutions to certain multidimensional electron-photon transport problems.

  11. LDRD project 151362: low energy electron-photon transport.

    SciTech Connect

    Kensek, Ronald Patrick; Hjalmarson, Harold Paul; Magyar, Rudolph J.; Bondi, Robert James; Crawford, Martin James

    2013-09-01

    At sufficiently high energies, the wavelengths of electrons and photons are short enough to only interact with one atom at a time, leading to the popular “independent-atom approximation”. We attempted to incorporate atomic structure in the generation of cross sections (which embody the modeled physics) to improve transport at lower energies. We document our successes and failures. This was a three-year LDRD project. The core team consisted of a radiation-transport expert, a solid-state physicist, and two DFT experts.

  12. Source of statistical noises in the Monte Carlo sampling techniques for coherently scattered photons

    PubMed Central

    Muhammad, Wazir; Lee, Sang Hoon

    2013-01-01

    Detailed comparisons of the predictions of the Relativistic Form Factors (RFFs) and Modified Form Factors (MFFs) and their advantages and shortcomings in calculating elastic scattering cross sections can be found in the literature. However, the issues related to their implementation in the Monte Carlo (MC) sampling for coherently scattered photons are still under discussion. Secondly, the linear interpolation technique (LIT) is a popular method to draw the integrated values of squared RFFs/MFFs over squared momentum transfer. In the current study, the role/issues of RFFs/MFFs and LIT in the MC sampling for coherent scattering were analyzed. The results showed that the relative probability density curves sampled on the basis of MFFs are unable to reveal any extra scientific information as both the RFFs and MFFs produced the same MC sampled curves. Furthermore, no relationship was established between the multiple small peaks and irregular step shapes (i.e. statistical noise) in the PDFs and either RFFs or MFFs. In fact, the noise in the PDFs appeared due to the use of LIT. The density of the noise depends upon the interval length between two consecutive points in the input data table and has no physical basis. The probability density function curves became smoother as the interval lengths were decreased. In conclusion, this statistical noise can be efficiently removed by introducing more data points in the data tables. PMID:22984278
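
    The noise mechanism described here is easy to reproduce: when the cumulative integral of the squared form factor is tabulated on a coarse grid and inverted by linear interpolation, the sampled momentum-transfer density inherits step-like artifacts that shrink as the table is refined. The sketch below uses an arbitrary smooth stand-in for the squared form factor rather than real RFF/MFF data; the grid sizes and sample counts are also arbitrary.

```python
import numpy as np

def build_table(n_points, q2_max=100.0):
    """Tabulate A(q2) = integral of a smooth stand-in for FF^2 over [0, q2]."""
    q2 = np.linspace(0.0, q2_max, n_points)
    ff2 = 1.0 / (1.0 + q2) ** 2                      # illustrative stand-in, not real form factors
    a = np.concatenate(([0.0], np.cumsum(0.5 * (ff2[1:] + ff2[:-1]) * np.diff(q2))))
    return q2, a

def sample_q2(n_samples, q2_grid, a_grid, rng):
    """Inverse-CDF sampling of q2 with linear interpolation of the table."""
    u = rng.random(n_samples) * a_grid[-1]
    return np.interp(u, a_grid, q2_grid)

rng = np.random.default_rng(0)
for n_points in (8, 64, 1024):                       # coarser tables -> stronger step artifacts
    q2_grid, a_grid = build_table(n_points)
    samples = sample_q2(200_000, q2_grid, a_grid, rng)
    counts, edges = np.histogram(samples, bins=200, range=(0.0, 20.0))
    width = edges[1] - edges[0]
    est_pdf = counts / (len(samples) * width)
    centers = 0.5 * (edges[1:] + edges[:-1])
    true_pdf = (1.0 / (1.0 + centers) ** 2) / (100.0 / 101.0)   # analytic normalisation over [0, 100]
    print(f"{n_points:5d} table points -> mean |pdf error| {np.mean(np.abs(est_pdf - true_pdf)):.5f}")
```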

  13. Estimation of crosstalk in LED fNIRS by photon propagation Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Iwano, Takayuki; Umeyama, Shinji

    2015-12-01

    fNIRS (functional near-infrared spectroscopy) can measure brain activity non-invasively and has advantages such as low cost and portability. While conventional fNIRS has used laser light, LED-based fNIRS is becoming common. With LEDs, the equipment can be less expensive and more portable. LED light, however, has a wider illumination spectrum than laser light, which may change the crosstalk between the calculated concentration changes of oxygenated and deoxygenated hemoglobin. The crosstalk is caused by differences in light path length in the head tissues, which depend on the wavelengths used. We conducted Monte Carlo simulations of photon propagation in the tissue layers of the head (scalp, skull, CSF, gray matter, and white matter) to estimate the light path length in each layer. Based on the estimated path lengths, the crosstalk in fNIRS using LED light was calculated. Our results showed that LED light increases the crosstalk more than laser light does for certain combinations of wavelengths. Even in such cases, the crosstalk introduced by LED light can be effectively suppressed by replacing the extinction coefficients used in the hemoglobin calculation with their weighted averages over the illumination spectrum.
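
    Crosstalk of this kind has a compact algebraic form: with the modified Beer-Lambert law, ΔOD(λ) = [ε_HbO(λ)·ΔC_HbO + ε_HbR(λ)·ΔC_HbR]·L(λ), concentrations are recovered by inverting a 2×2 system, and any mismatch between the true wavelength-dependent path lengths L(λ) and the ones assumed in the inversion leaks a pure oxy-hemoglobin change into an apparent deoxy change. The sketch below demonstrates that leakage with placeholder extinction coefficients and path lengths, not values from the paper's simulations.

```python
import numpy as np

# Modified Beer-Lambert crosstalk demo (all numbers are illustrative placeholders).
wavelengths = [760, 850]                          # nm
eps = np.array([[0.15, 0.39],                     # rows: wavelengths; cols: [HbO, HbR]
                [0.28, 0.18]])                    # "extinction coefficients", placeholder units
L_true    = np.array([6.0, 5.2])                  # true partial path lengths at each wavelength
L_assumed = np.array([5.6, 5.6])                  # path lengths assumed in the reconstruction

true_dc = np.array([1.0, 0.0])                    # pure oxy-hemoglobin change
d_od = (eps @ true_dc) * L_true                   # simulated optical-density changes

A = eps * L_assumed[:, None]                      # forward matrix assumed by the reconstruction
est_dc = np.linalg.solve(A, d_od)
print("estimated [dHbO, dHbR]:", est_dc)
print("crosstalk into HbR    : {:.1f} %".format(100.0 * est_dc[1] / est_dc[0]))
```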

  14. Peer-to-peer Monte Carlo simulation of photon migration in topical applications of biomedical optics.

    PubMed

    Doronin, Alexander; Meglinski, Igor

    2012-09-01

    In the framework of further development of the unified approach of photon migration in complex turbid media, such as biological tissues, we present a peer-to-peer (P2P) Monte Carlo (MC) code. Object-oriented programming is used for generalization of the MC model for multipurpose use in various applications of biomedical optics. The online user interface providing multiuser access is developed using modern web technologies, such as Microsoft Silverlight and ASP.NET. The emerging P2P network, utilizing computers with different types of compute unified device architecture-capable graphics processing units (GPUs), is applied for acceleration and to overcome the limitations imposed by multiuser access in the online MC computational tool. The developed P2P MC was validated by comparing the results of simulation of diffuse reflectance and fluence rate distribution for a semi-infinite scattering medium with known analytical results, results of the adding-doubling method, and with other GPU-based MC techniques developed in the past. The best speedup of processing multiuser requests in a range of 4 to 35 s was achieved using single-precision computing, and the double-precision computing for floating-point arithmetic operations provides higher accuracy. PMID:23085901

  15. Peer-to-peer Monte Carlo simulation of photon migration in topical applications of biomedical optics

    NASA Astrophysics Data System (ADS)

    Doronin, Alexander; Meglinski, Igor

    2012-09-01

    In the framework of further development of the unified approach of photon migration in complex turbid media, such as biological tissues, we present a peer-to-peer (P2P) Monte Carlo (MC) code. Object-oriented programming is used for generalization of the MC model for multipurpose use in various applications of biomedical optics. The online user interface providing multiuser access is developed using modern web technologies, such as Microsoft Silverlight and ASP.NET. The emerging P2P network, utilizing computers with different types of compute unified device architecture-capable graphics processing units (GPUs), is applied for acceleration and to overcome the limitations imposed by multiuser access in the online MC computational tool. The developed P2P MC was validated by comparing the results of simulation of diffuse reflectance and fluence rate distribution for a semi-infinite scattering medium with known analytical results, results of the adding-doubling method, and with other GPU-based MC techniques developed in the past. The best speedup of processing multiuser requests in a range of 4 to 35 s was achieved using single-precision computing, and the double-precision computing for floating-point arithmetic operations provides higher accuracy.

  16. A fully coupled Monte Carlo/discrete ordinates solution to the neutron transport equation. Final report

    SciTech Connect

    Filippone, W.L.; Baker, R.S.

    1990-12-31

    The neutron transport equation is solved by a hybrid method that iteratively couples regions where deterministic (S_N) and stochastic (Monte Carlo) methods are applied. Unlike previous hybrid methods, the Monte Carlo and S_N regions are fully coupled in the sense that no assumption is made about geometrical separation or decoupling. The hybrid method provides a new means of solving problems involving both optically thick and optically thin regions that neither Monte Carlo nor S_N is well suited for by themselves. The fully coupled Monte Carlo/S_N technique consists of defining spatial and/or energy regions of a problem in which either a Monte Carlo calculation or an S_N calculation is to be performed. The Monte Carlo region may comprise the entire spatial region for selected energy groups, or may consist of a rectangular area that is either completely or partially embedded in an arbitrary S_N region. The Monte Carlo and S_N regions are then connected through the common angular boundary fluxes, which are determined iteratively using the response matrix technique, and volumetric sources. The hybrid method has been implemented in the S_N code TWODANT by adding special-purpose Monte Carlo subroutines to calculate the response matrices and volumetric sources, and linkage subroutines to carry out the interface flux iterations. The common angular boundary fluxes are included in the S_N code as interior boundary sources, leaving the logic for the solution of the transport flux unchanged, while, with minor modifications, the diffusion synthetic accelerator remains effective in accelerating S_N calculations. The special-purpose Monte Carlo routines used are essentially analog, with few variance reduction techniques employed. However, the routines have been successfully vectorized, with approximately a factor of five increase in speed over the non-vectorized version.

  17. MC-PEPTITA: A Monte Carlo model for Photon, Electron and Positron Tracking In Terrestrial Atmosphere—Application for a terrestrial gamma ray flash

    NASA Astrophysics Data System (ADS)

    Sarria, D.; Blelly, P.-L.; Forme, F.

    2015-05-01

    Terrestrial gamma ray flashes are natural bursts of X and gamma rays, correlated to thunderstorms, that are likely to be produced at an altitude of about 10 to 20 km. After the emission, the flux of gamma rays is filtered and altered by the atmosphere and a small part of it may be detected by a satellite on low Earth orbit (RHESSI or Fermi, for example). Thus, only a residual part of the initial burst can be measured and most of the flux is made of scattered primary photons and of secondary emitted electrons, positrons, and photons. Trying to get information on the initial flux from the measurement is a very complex inverse problem, which can only be tackled by the use of a numerical model solving the transport of these high-energy particles. For this purpose, we developed a numerical Monte Carlo model which solves the transport in the atmosphere of both relativistic electrons/positrons and X/gamma rays. It makes it possible to track the photons, electrons, and positrons in the whole Earth environment (considering the atmosphere and the magnetic field) to get information on what affects the transport of the particles from the source region to the altitude of the satellite. We first present the MC-PEPTITA model, and then we validate it by comparison with a benchmark GEANT4 simulation with similar settings. Then, we show the results of a simulation close to Fermi event number 091214 in order to discuss some important properties of the photons and electrons/positrons that are reaching satellite altitude.

  18. Physical models, cross sections, and numerical approximations used in MCNP and GEANT4 Monte Carlo codes for photon and electron absorbed fraction calculation

    SciTech Connect

    Yoriyaz, Helio; Moralles, Mauricio; Tarso Dalledone Siqueira, Paulo de; Costa Guimaraes, Carla da; Belonsi Cintra, Felipe; Santos, Adimir dos

    2009-11-15

    Purpose: Radiopharmaceutical applications in nuclear medicine require a detailed dosimetry estimate of the radiation energy delivered to the human tissues. Over the past years, several publications addressed the problem of internal dose estimate in volumes of several sizes considering photon and electron sources. Most of them used Monte Carlo radiation transport codes. Despite the widespread use of these codes due to the variety of resources and potentials they offered to carry out dose calculations, several aspects like physical models, cross sections, and numerical approximations used in the simulations still remain an object of study. Accurate dose estimate depends on the correct selection of a set of simulation options that should be carefully chosen. This article presents an analysis of several simulation options provided by two of the most used codes worldwide: MCNP and GEANT4. Methods: For this purpose, comparisons of absorbed fraction estimates obtained with different physical models, cross sections, and numerical approximations are presented for spheres of several sizes and composed of five different biological tissues. Results: Considerable discrepancies have been found in some cases not only between the different codes but also between different cross sections and algorithms in the same code. Maximum differences found between the two codes are 5.0% and 10%, respectively, for photons and electrons. Conclusion: Even for simple problems such as spheres and uniform radiation sources, the set of parameters chosen by any Monte Carlo code significantly affects the final results of a simulation, demonstrating the importance of the correct choice of parameters in the simulation.

  19. FASTER 3: A generalized-geometry Monte Carlo computer program for the transport of neutrons and gamma rays. Volume 1: Summary report

    NASA Technical Reports Server (NTRS)

    Jordan, T. M.

    1970-01-01

    The theory used in FASTER-III, a Monte Carlo computer program for the transport of neutrons and gamma rays in complex geometries, is outlined. The program includes the treatment of geometric regions bounded by quadratic and quadric surfaces with multiple radiation sources which have specified space, angle, and energy dependence. The program calculates, using importance sampling, the resulting number and energy fluxes at specified point, surface, and volume detectors. It can also calculate minimum weight shield configuration meeting a specified dose rate constraint. Results are presented for sample problems involving primary neutron, and primary and secondary photon, transport in a spherical reactor shield configuration.

  20. A hybrid Monte Carlo model for the energy response functions of X-ray photon counting detectors

    NASA Astrophysics Data System (ADS)

    Wu, Dufan; Xu, Xiaofei; Zhang, Li; Wang, Sen

    2016-09-01

    In photon counting computed tomography (CT), it is vital to know the energy response functions of the detector for noise estimation and system optimization. Empirical methods lack flexibility and Monte Carlo simulations require too much knowledge of the detector. In this paper, we proposed a hybrid Monte Carlo model for the energy response functions of photon counting detectors in X-ray medical applications. GEANT4 was used to model the energy deposition of X-rays in the detector. Then numerical models were used to describe the process of charge sharing, anti-charge sharing and spectral broadening, which were too complicated to be included in the Monte Carlo model. Several free parameters were introduced in the numerical models, and they could be calibrated from experimental measurements such as X-ray fluorescence from metal elements. The method was used to model the energy response function of an XCounter Flite X1 photon counting detector. The parameters of the model were calibrated with fluorescence measurements. The model was further tested against measured spectra of a VJ X-ray source to validate its feasibility and accuracy.
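
    The spectral-broadening stage of such a hybrid model can be illustrated as below: a Monte-Carlo-computed deposited-energy histogram is convolved with an energy-dependent Gaussian whose resolution parameters play the role of the free parameters calibrated from fluorescence data. The resolution model sigma(E) = a + b*sqrt(E) and every number here are assumptions for illustration, not the authors' calibrated values.

```python
import numpy as np

def broaden(energies_keV, deposited_counts, a=1.5, b=0.05):
    """Apply Gaussian broadening with sigma(E) = a + b*sqrt(E) [keV] (assumed model)."""
    energies_keV = np.asarray(energies_keV, dtype=float)
    counts = np.asarray(deposited_counts, dtype=float)
    sigma = a + b * np.sqrt(energies_keV)            # assumed resolution parameters
    # response matrix: column j spreads the counts of bin j over all bins
    diff = energies_keV[:, None] - energies_keV[None, :]
    kernel = np.exp(-0.5 * (diff / sigma[None, :]) ** 2)
    kernel /= kernel.sum(axis=0, keepdims=True)      # conserve counts per input bin
    return kernel @ counts

# toy example: an ideal monoenergetic 60 keV deposition peak on a 1 keV grid
e_axis = np.arange(1.0, 121.0)
raw = np.zeros_like(e_axis)
raw[59] = 1.0e4                                      # all counts deposited at 60 keV
print("broadened peak height:", broaden(e_axis, raw).max())
```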

  1. ITS: The Integrated TIGER Series of electron/photon transport codes - Version 3.0

    SciTech Connect

    Halbleib, J. A.; Kensek, R. P.; Valdez, G. D.; Seltzer, S. M.; Berger, M. J.

    1992-08-01

    This paper reports on the ITS system, which is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. Version 3.0 is a major upgrade of the system with important improvements in the physical model, variance reduction, I/O, and user friendliness. Improvements to the cross-section generator include the replacement of Born-approximation bremsstrahlung cross sections with the results of numerical phase-shift calculations, the addition of coherent scattering and binding effects in incoherent scattering, an upgrade of collisional and radiative stopping powers, and a complete rewrite to Fortran 77 standards emphasizing Block-IF structure.

  2. A hybrid transport-diffusion Monte Carlo method for frequency-dependent radiative-transfer simulations

    SciTech Connect

    Densmore, Jeffery D.; Thompson, Kelly G.; Urbatsch, Todd J.

    2012-08-15

    Discrete Diffusion Monte Carlo (DDMC) is a technique for increasing the efficiency of Implicit Monte Carlo radiative-transfer simulations in optically thick media. In DDMC, particles take discrete steps between spatial cells according to a discretized diffusion equation. Each discrete step replaces many smaller Monte Carlo steps, thus improving the efficiency of the simulation. In this paper, we present an extension of DDMC for frequency-dependent radiative transfer. We base our new DDMC method on a frequency-integrated diffusion equation for frequencies below a specified threshold, as optical thickness is typically a decreasing function of frequency. Above this threshold we employ standard Monte Carlo, which results in a hybrid transport-diffusion scheme. With a set of frequency-dependent test problems, we confirm the accuracy and increased efficiency of our new DDMC method.
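
    The bookkeeping behind such a hybrid scheme can be sketched as follows: frequency groups whose opacity makes a cell optically thick are flagged for discrete-diffusion treatment, while the remaining groups use standard Monte Carlo. The power-law opacity model and the thickness criterion below are assumptions for illustration, not the authors' parameters.

```python
import numpy as np

def split_groups(nu_bounds, sigma_ref, nu_ref, dx, thick_criterion=5.0):
    """Return a boolean mask: True where a frequency group should use DDMC."""
    nu_mid = 0.5 * (nu_bounds[:-1] + nu_bounds[1:])
    sigma = sigma_ref * (nu_ref / nu_mid) ** 3      # opacity falls with frequency (assumed)
    return sigma * dx > thick_criterion             # optical thickness per cell

nu_edges = np.logspace(-1, 2, 31)   # 30 frequency groups (arbitrary units)
ddmc_mask = split_groups(nu_edges, sigma_ref=50.0, nu_ref=1.0, dx=0.1)
print(f"{ddmc_mask.sum()} low-frequency groups -> DDMC, "
      f"{(~ddmc_mask).sum()} high-frequency groups -> standard MC")
```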

  3. Dosimetric dependences of bone heterogeneity and beam angle on the unflattened and flattened photon beams: A Monte Carlo comparison

    NASA Astrophysics Data System (ADS)

    Chow, James C. L.; Owrangi, Amir M.

    2014-08-01

    The variations of depth and surface dose with bone heterogeneity and beam angle were compared between unflattened and flattened photon beams using Monte Carlo simulations. Phase-space files of the 6 MV photon beams with field size of 10×10 cm2 were generated with and without the flattening filter based on a Varian TrueBeam linac. Depth and surface doses were calculated in bone and water phantoms using Monte Carlo simulations (the EGSnrc-based code). Dose calculations were repeated with angles of the unflattened and flattened beams turned from 0° to 15°, 30°, 45°, 60°, 75° and 90° in the bone and water phantoms. Monte Carlo results of depth doses showed that compared to the flattened beam the unflattened photon beam had a higher dose in the build-up region but lower dose beyond the depth of maximum dose. Dose ratios of the unflattened to flattened beams were calculated in the range of 1.6-2.6 with beam angle varying from 0° to 90° in water. Similar results were found in the bone phantom. In addition, surface doses about 2.5 times higher were found with beam angles equal to 0° and 15° in the bone and water phantoms. However, the surface dose deviation between the unflattened and flattened beams became smaller with increasing beam angle. Dose enhancements due to the bone backscatter were also found at the water-bone and bone-water interfaces for both the unflattened and flattened beams in the bone phantom. With the Monte Carlo beams cross-calibrated to the monitor unit, the variations of depth and surface dose with bone heterogeneity and beam angle were investigated and compared. For the unflattened and flattened photon beams, the surface dose and the range of depth dose ratios (unflattened to flattened beam) decreased with increasing beam angle. The dosimetric comparison in this study is useful in understanding the characteristics of the unflattened photon beam with respect to depth and surface dose in the presence of bone heterogeneity.

  4. Monte Carlo Photon Modeling to Explore the Dependence of Snow Bidirectional Reflectance on Grain Shape and Size

    NASA Astrophysics Data System (ADS)

    Schneider, A. M.; Flanner, M.; Yang, P.; Yi, B.; Huang, X.; Feldman, D.

    2015-12-01

    The spectral albedo of a snow-covered surface is sensitive to effective snow grain size. Snow metamorphism, then, affects the strength of surface albedo feedback and changes the radiative energy budget of the planet. The Near-Infrared Emitting Reflectance Dome (NERD) is an instrument in development designed to measure snow effective radius from in situ bidirectional reflectance factors (BRFs) by illuminating a surface with nadir positioned light emitting diodes centered around 1.30 and 1.55 microns. Better understanding the dependences of BRFs on snow grain shape and size is imperative to constraining measurements taken by the NERD. Here, we use the Monte Carlo method for photon transport to explore BRFs of snow surfaces of different shapes and sizes. In addition to assuming spherical grains and using Mie theory, we incorporate into the model the scattering phase functions and other single scattering properties of the following nine aspherical grain shapes: hexagonal columns, plates, hollow columns, droxtals, hollow bullet rosettes, solid bullet rosettes, 8-element column aggregates, 5-element plate aggregates, and 10-element plate aggregates. We present the simulated BRFs of homogeneous snow surfaces for these ten shape habits and show their spectral variability for a wide range of effective radii. Initial findings using Mie theory indicate that surfaces of spherical particles exhibit rather Lambertian reflectance for the two incident wavelengths used in the NERD and show a monotonically decreasing trend in black-sky albedo with increasing effective radius. These results are consistent with previous studies and also demonstrate good agreement with models using the two-stream approximation.
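
    As a generic illustration of how a photon-transport Monte Carlo code turns a phase function into new propagation directions, the sketch below samples scattering-angle cosines from the analytic Henyey-Greenstein function by inverse transform. It is only a stand-in: the study above uses Mie theory and tabulated single-scattering properties of specific ice-crystal habits rather than this form, and the asymmetry parameter is assumed.

```python
import numpy as np

def sample_hg_cosine(g, rng, n):
    """Sample n scattering-angle cosines from the Henyey-Greenstein phase function."""
    xi = rng.random(n)
    if abs(g) < 1e-6:
        return 2.0 * xi - 1.0                      # isotropic limit
    frac = (1.0 - g * g) / (1.0 - g + 2.0 * g * xi)
    return (1.0 + g * g - frac * frac) / (2.0 * g)

rng = np.random.default_rng(7)
mu = sample_hg_cosine(0.89, rng, 1_000_000)        # snow-like asymmetry (assumed value)
print("mean scattering cosine:", mu.mean())         # should be close to g = 0.89
```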

  5. Verification measurements and clinical evaluation of the iPlan RT Monte Carlo dose algorithm for 6 MV photon energy

    NASA Astrophysics Data System (ADS)

    Petoukhova, A. L.; van Wingerden, K.; Wiggenraad, R. G. J.; van de Vaart, P. J. M.; van Egmond, J.; Franken, E. M.; van Santvoort, J. P. C.

    2010-08-01

    This study presents data for verification of the iPlan RT Monte Carlo (MC) dose algorithm (BrainLAB, Feldkirchen, Germany). MC calculations were compared with pencil beam (PB) calculations and verification measurements in phantoms with lung-equivalent material, air cavities or bone-equivalent material to mimic head and neck and thorax and in an Alderson anthropomorphic phantom. Dosimetric accuracy of MC for the micro-multileaf collimator (MLC) simulation was tested in a homogeneous phantom. All measurements were performed using an ionization chamber and Kodak EDR2 films with Novalis 6 MV photon beams. Dose distributions measured with film and calculated with MC in the homogeneous phantom are in excellent agreement for oval, C and squiggle-shaped fields and for a clinical IMRT plan. For a field with completely closed MLC, MC is much closer to the experimental result than the PB calculations. For fields larger than the dimensions of the inhomogeneities the MC calculations show excellent agreement (within 3%/1 mm) with the experimental data. MC calculations in the anthropomorphic phantom show good agreement with measurements for conformal beam plans and reasonable agreement for dynamic conformal arc and IMRT plans. For 6 head and neck and 15 lung patients a comparison of the MC plan with the PB plan was performed. Our results demonstrate that MC is able to accurately predict the dose in the presence of inhomogeneities typical for head and neck and thorax regions with reasonable calculation times (5-20 min). Lateral electron transport was well reproduced in MC calculations. We are planning to implement MC calculations for head and neck and lung cancer patients.

  6. Monte Carlo study of photon beams from medical linear accelerators: Optimization, benchmark and spectra

    NASA Astrophysics Data System (ADS)

    Sheikh-Bagheri, Daryoush

    1999-12-01

    BEAM is a general purpose EGS4 user code for simulating radiotherapy sources (Rogers et al. Med. Phys. 22, 503-524, 1995). The BEAM code is optimized by first minimizing unnecessary electron transport (a factor of 3 improvement in efficiency). The uniform bremsstrahlung splitting (UBS) technique is assessed and found to be 4 times more efficient. The Russian Roulette technique used in conjunction with UBS is substantially modified to make simulations additionally 2 times more efficient. Finally, a novel and robust technique, called selective bremsstrahlung splitting (SBS), is developed and shown to improve the efficiency of photon beam simulations by an additional factor of 3-4, depending on the end-point considered. The optimized BEAM code is benchmarked by comparing calculated and measured ionization distributions in water from the 10 and 20 MV photon beams of the NRCC linac. Unlike previous calculations, the incident e- energy is known independently to 1%, the entire extra-focal radiation is simulated and e- contamination is accounted for. Both beams use clinical jaws, whose dimensions are accurately measured, and which are set for a 10 x 10 cm2 field at 110 cm. At both energies, the calculated and the measured values of ionization on the central-axis in the buildup region agree within 1% of maximum dose. The agreement is well within statistics elsewhere on the central-axis. Ionization profiles match within 1% of maximum dose, except at the geometrical edges of the field, where the disagreement is up to 5% of dose maximum. Causes for this discrepancy are discussed. The benchmarked BEAM code is then used to simulate beams from the major commercial medical linear accelerators. The off-axis factors are matched within statistical uncertainties, for most of the beams at the 1 σ level and for all at the 2 σ level. The calculated and measured depth-dose data agree within 1% (local dose), at about 1% (1 σ level) statistics, at all depths past
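
    The variance-reduction moves mentioned above build on two standard primitives, particle splitting and Russian roulette, sketched generically below. This is not the BEAM/SBS algorithm itself; the splitting number and survival probability are arbitrary illustrative choices.

```python
import random

def split(particle, n_split):
    """Replace one particle by n_split copies, each carrying 1/n_split of the weight."""
    w = particle["weight"] / n_split
    return [dict(particle, weight=w) for _ in range(n_split)]

def russian_roulette(particle, survival_prob, rng=random):
    """Kill the particle with probability 1 - p; survivors are up-weighted by 1/p."""
    if rng.random() < survival_prob:
        particle["weight"] /= survival_prob
        return particle
    return None          # history terminated; the expected weight is unchanged

photon = {"energy_MeV": 6.0, "weight": 1.0}
daughters = split(photon, 20)                        # e.g. uniform splitting by 20
survivors = [p for p in (russian_roulette(d, 0.1) for d in daughters) if p]
print(len(survivors), "survivors, total weight",
      sum(p["weight"] for p in survivors))           # expected total weight stays 1.0
```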

  7. Monte Carlo calculated correction factors for diodes and ion chambers in small photon fields

    NASA Astrophysics Data System (ADS)

    Czarnecki, D.; Zink, K.

    2013-04-01

    The application of small photon fields in modern radiotherapy requires the determination of total scatter factors $S_{cp}$ or field factors $\Omega^{f_{clin},f_{msr}}_{Q_{clin},Q_{msr}}$ with high precision. Both quantities require the knowledge of the field-size-dependent and detector-dependent correction factor $k^{f_{clin},f_{msr}}_{Q_{clin},Q_{msr}}$. The aim of this study is the determination of the correction factor $k^{f_{clin},f_{msr}}_{Q_{clin},Q_{msr}}$ for different types of detectors in a clinical 6 MV photon beam of a Siemens KD linear accelerator. The EGSnrc Monte Carlo code was used to calculate the dose to water and the dose to different detectors to determine the field factor as well as the mentioned correction factor for different small square field sizes. Besides this, the mean water to air stopping power ratio as well as the ratio of the mean energy absorption coefficients for the relevant materials was calculated for different small field sizes. As the beam source, a Monte Carlo based model of a Siemens KD linear accelerator was used. The results show that in the case of ionization chambers the detector volume has the largest impact on the correction factor $k^{f_{clin},f_{msr}}_{Q_{clin},Q_{msr}}$; this perturbation may contribute up to 50% to the correction factor. Field-dependent changes in stopping-power ratios are negligible. The magnitude of $k^{f_{clin},f_{msr}}_{Q_{clin},Q_{msr}}$ is of the order of 1.2 at a field size of 1 × 1 cm2 for the large volume ion chamber PTW31010 and is still in the range of 1.05-1.07 for the PinPoint chambers PTW31014 and PTW31016. For the diode detectors included in this study (PTW60016, PTW 60017), the correction factor deviates no more than 2% from unity in field sizes between 10 × 10 and 1 × 1 cm2, but below this field size there is a steep decrease of $k^{f_{clin},f_{msr}}_{Q_{clin},Q_{msr}}$ below unity, i.e. a strong overestimation of dose. Besides the field size and detector dependence, the results
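
    In the small-field formalism this correction factor is commonly evaluated from Monte Carlo doses as the ratio of dose-to-water over dose-to-detector in the clinical field to the same ratio in the machine-specific reference field. The sketch below only performs that arithmetic; every dose value is a made-up placeholder, not a result of this study.

```python
# k = (D_water / D_detector)|_clinical field  /  (D_water / D_detector)|_reference field
def k_correction(d_w_clin, d_det_clin, d_w_msr, d_det_msr):
    return (d_w_clin / d_det_clin) / (d_w_msr / d_det_msr)

# hypothetical Monte Carlo doses (Gy per source particle) for a 1 x 1 cm2 field
print(round(k_correction(d_w_clin=2.10e-16, d_det_clin=1.79e-16,
                         d_w_msr=2.45e-16, d_det_msr=2.43e-16), 3))
```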

  8. Control of photon transport properties in nanocomposite nanowires

    NASA Astrophysics Data System (ADS)

    Moffa, M.; Fasano, V.; Camposeo, A.; Persano, L.; Pisignano, D.

    2016-02-01

    Active nanowires and nanofibers can be realized by the electric-field induced stretching of polymer solutions with sufficient molecular entanglements. The resulting nanomaterials are attracting increasing attention in view of their application in a wide variety of fields, including optoelectronics, photonics, energy harvesting, nanoelectronics, and microelectromechanical systems. Realizing nanocomposite nanofibers is especially interesting in this respect. In particular, methods suitable for embedding inorganic nanocrystals in electrified jets and then in active fiber systems allow for controlling light-scattering and refractive index properties in the realized fibrous materials. We here report on the design, realization, and morphological and spectroscopic characterization of new species of active, composite nanowires and nanofibers for nanophotonics. We focus on the properties of light confinement and photon transport along the nanowire longitudinal axis, and on how these depend on nanoparticle incorporation. Optical loss mechanisms and their influence on device design and performance are also presented and discussed.

  9. Full 3D visualization tool-kit for Monte Carlo and deterministic transport codes

    SciTech Connect

    Frambati, S.; Frignani, M.

    2012-07-01

    We propose a package of tools capable of translating the geometric inputs and outputs of many Monte Carlo and deterministic radiation transport codes into open source file formats. These tools are aimed at bridging the gap between trusted, widely-used radiation analysis codes and very powerful, more recent and commonly used visualization software, thus supporting the design process and helping with shielding optimization. Three main lines of development were followed: mesh-based analysis of Monte Carlo codes, mesh-based analysis of deterministic codes and Monte Carlo surface meshing. The developed kit is considered a powerful and cost-effective tool in the computer-aided design for radiation transport code users of the nuclear world, and in particular in the fields of core design and radiation analysis. (authors)

  10. Direct calibration in megavoltage photon beams using Monte Carlo conversion factor: validation and clinical implications.

    PubMed

    Wright, Tracy; Lye, Jessica E; Ramanathan, Ganesan; Harty, Peter D; Oliver, Chris; Webb, David V; Butler, Duncan J

    2015-01-21

    The Australian Radiation Protection and Nuclear Safety Agency (ARPANSA) has established a method for ionisation chamber calibrations using megavoltage photon reference beams. The new method will reduce the calibration uncertainty compared to a (60)Co calibration combined with the TRS-398 energy correction factor. The calibration method employs a graphite calorimeter and a Monte Carlo (MC) conversion factor to convert the absolute dose to graphite to absorbed dose to water. EGSnrc is used to model the linac head and doses in the calorimeter and water phantom. The linac model is validated by comparing measured and modelled PDDs and profiles. The relative standard uncertainties in the calibration factors at the ARPANSA beam qualities were found to be 0.47% at 6 MV, 0.51% at 10 MV and 0.46% for the 18 MV beam. A comparison with the Bureau International des Poids et Mesures (BIPM) as part of the key comparison BIPM.RI(I)-K6 gave results of 0.9965(55), 0.9924(60) and 0.9932(59) for the 6, 10 and 18 MV beams, respectively, with all beams within 1σ of the participant average. The measured kQ values for an NE2571 Farmer chamber were found to be lower than those in TRS-398 but are consistent with published measured and modelled values. Users can expect a shift in the calibration factor at user energies of an NE2571 chamber between 0.4-1.1% across the range of calibration energies compared to the current calibration method. PMID:25565406

  11. Direct calibration in megavoltage photon beams using Monte Carlo conversion factor: validation and clinical implications

    NASA Astrophysics Data System (ADS)

    Wright, Tracy; Lye, Jessica E.; Ramanathan, Ganesan; Harty, Peter D.; Oliver, Chris; Webb, David V.; Butler, Duncan J.

    2015-01-01

    The Australian Radiation Protection and Nuclear Safety Agency (ARPANSA) has established a method for ionisation chamber calibrations using megavoltage photon reference beams. The new method will reduce the calibration uncertainty compared to a 60Co calibration combined with the TRS-398 energy correction factor. The calibration method employs a graphite calorimeter and a Monte Carlo (MC) conversion factor to convert the absolute dose to graphite to absorbed dose to water. EGSnrc is used to model the linac head and doses in the calorimeter and water phantom. The linac model is validated by comparing measured and modelled PDDs and profiles. The relative standard uncertainties in the calibration factors at the ARPANSA beam qualities were found to be 0.47% at 6 MV, 0.51% at 10 MV and 0.46% for the 18 MV beam. A comparison with the Bureau International des Poids et Mesures (BIPM) as part of the key comparison BIPM.RI(I)-K6 gave results of 0.9965(55), 0.9924(60) and 0.9932(59) for the 6, 10 and 18 MV beams, respectively, with all beams within 1σ of the participant average. The measured kQ values for an NE2571 Farmer chamber were found to be lower than those in TRS-398 but are consistent with published measured and modelled values. Users can expect a shift in the calibration factor at user energies of an NE2571 chamber between 0.4-1.1% across the range of calibration energies compared to the current calibration method.

  12. General Purpose Monte Carlo Multigroup Neutron and Gamma-Ray Transport Code System. We recommend C00474/ALLCP/02 MORSE-CGA.

    Energy Science and Technology Software Center (ESTSC)

    1991-08-01

    Version: 00 The original MORSE code was a multipurpose neutron and gamma-ray transport Monte Carlo code. It was designed as a tool for solving most shielding problems. Through the use of multigroup cross sections, the solution of neutron, gamma-ray, or coupled neutron-gamma-ray problems could be obtained in either the forward or adjoint mode. Time dependence for both shielding and criticality problems is provided. General three-dimensional geometry could be used with an albedo option available at any material surface. Isotropic or anisotropic scattering up to a P16 expansion of the angular distribution was allowed. MORSE-CG incorporated the Mathematical Applications, Inc. (MAGI) combinatorial geometry routines. MORSE-B modifies the Monte Carlo neutron and photon transport computer code MORSE-CG by adding routines which allow various flexible options.

  13. Update On the Status of the FLUKA Monte Carlo Transport Code*

    NASA Technical Reports Server (NTRS)

    Ferrari, A.; Lorenzo-Sentis, M.; Roesler, S.; Smirnov, G.; Sommerer, F.; Theis, C.; Vlachoudis, V.; Carboni, M.; Mostacci, A.; Pelliccioni, M.

    2006-01-01

    The FLUKA Monte Carlo transport code is a well-known simulation tool in High Energy Physics. FLUKA is a dynamic tool in the sense that it is being continually updated and improved by the authors. We review the progress achieved since the last CHEP Conference on the physics models, some technical improvements to the code and some recent applications. From the point of view of the physics, improvements have been made with the extension of PEANUT to higher energies for p, n, pi, pbar/nbar and for nbars down to the lowest energies, the addition of the online capability to evolve radioactive products and get subsequent dose rates, upgrading of the treatment of EM interactions with the elimination of the need to separately prepare preprocessed files. A new coherent photon scattering model, an updated treatment of the photo-electric effect, an improved pair production model, new photon cross sections from the LLNL Cullen database have been implemented. In the field of nucleus-nucleus interactions the electromagnetic dissociation of heavy ions has been added along with the extension of the interaction models for some nuclide pairs to energies below 100 MeV/A using the BME approach, as well as the development of an improved QMD model for intermediate energies. Both DPMJET 2.53 and 3 remain available along with rQMD 2.4 for heavy ion interactions above 100 MeV/A. Technical improvements include the ability to use parentheses in setting up the combinatorial geometry, the introduction of pre-processor directives in the input stream, a new random number generator with full 64 bit randomness, new routines for mathematical special functions (adapted from SLATEC). Finally, work is progressing on the deployment of a user-friendly GUI input interface as well as a CAD-like geometry creation and visualization tool. On the application front, FLUKA has been used to extensively evaluate the potential space radiation effects on astronauts for future deep space missions, the activation

  14. Time series analysis of Monte Carlo neutron transport calculations

    NASA Astrophysics Data System (ADS)

    Nease, Brian Robert

    A time series based approach is applied to the Monte Carlo (MC) fission source distribution to calculate the non-fundamental mode eigenvalues of the system. The approach applies Principal Oscillation Patterns (POPs) to the fission source distribution, transforming the problem into a simple autoregressive order one (AR(1)) process. Proof is provided that the stationary MC process is linear to first order approximation, which is a requirement for the application of POPs. The autocorrelation coefficient of the resulting AR(1) process corresponds to the ratio of the desired mode eigenvalue to the fundamental mode eigenvalue. All modern k-eigenvalue MC codes calculate the fundamental mode eigenvalue, so the desired mode eigenvalue can be easily determined. The strength of this approach is contrasted against the Fission Matrix method (FMM) in terms of accuracy versus computer memory constraints. Multi-dimensional problems are considered since the approach has strong potential for use in reactor analysis, and the implementation of the method into production codes is discussed. Lastly, the appearance of complex eigenvalues is investigated and solutions are provided.
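
    The final step of the approach can be illustrated with synthetic data: once the fission source has been projected onto a principal oscillation pattern, the series behaves as an AR(1) process whose lag-1 autocorrelation equals the ratio of the higher-mode eigenvalue to the fundamental one. Below, a synthetic AR(1) series stands in for the projected Monte Carlo data, and the fundamental eigenvalue is an assumed number.

```python
import numpy as np

rng = np.random.default_rng(3)
true_ratio = 0.85                       # assumed k1/k0 used to generate the synthetic series
x = np.empty(20_000)
x[0] = 0.0
noise = rng.normal(scale=1.0, size=len(x))
for i in range(1, len(x)):
    x[i] = true_ratio * x[i - 1] + noise[i]   # AR(1) surrogate for the projected source

xc = x - x.mean()
rho1 = np.dot(xc[1:], xc[:-1]) / np.dot(xc, xc)   # estimated lag-1 autocorrelation
k0 = 1.002                              # fundamental eigenvalue from the MC run (assumed)
print("estimated higher-mode eigenvalue:", rho1 * k0)   # close to 0.85 * 1.002
```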

  15. Using Nuclear Theory, Data and Uncertainties in Monte Carlo Transport Applications

    SciTech Connect

    Rising, Michael Evan

    2015-11-03

    These are slides for a presentation on using nuclear theory, data and uncertainties in Monte Carlo transport applications. The following topics are covered: nuclear data (experimental data versus theoretical models, data evaluation and uncertainty quantification), fission multiplicity models (fixed source applications, criticality calculations), uncertainties and their impact (integral quantities, sensitivity analysis, uncertainty propagation).

  16. Hypersensitive Transport in Photonic Crystals with Accidental Spatial Degeneracies

    NASA Astrophysics Data System (ADS)

    Makri, Eleana; Smith, Kyle; Chabanov, Andrey; Vitebskiy, Ilya; Kottos, Tsampikos

    2016-02-01

    A localized mode in a photonic layered structure can develop nodal points (nodal planes), where the oscillating electric field is negligible. Placing a thin metallic layer at such a nodal point results in the phenomenon of induced transmission. Here we demonstrate that if the nodal point is not a point of symmetry, then even a tiny alteration of the permittivity in the vicinity of the metallic layer drastically suppresses the localized mode along with the resonant transmission. This renders the layered structure highly reflective within a broad frequency range. Applications of this hypersensitive transport for optical and microwave limiting and switching are discussed.

  17. Efficient photon transport in positron emission tomography simulations using VMC++

    NASA Astrophysics Data System (ADS)

    Kawrakow, I.; Mitev, K.; Gerganov, G.; Madzhunkov, J.; Kirov, A.

    2008-02-01

    vmcPET, a VMC++ based fast code for simulating photon transport through the patient geometry for use in positron emission tomography related calculations, is presented. vmcPET is shown to be between 250 and 425 times faster than GATE in completely analog mode and up to 50000 times faster when using advanced variance reduction techniques. Excellent agreement between vmcPET and EGSnrc and GATE benchmarks is found. vmcPET is coupled to GATE via phase-space files of particles emerging from the patient geometry.

  18. Hypersensitive Transport in Photonic Crystals with Accidental Spatial Degeneracies

    PubMed Central

    Makri, Eleana; Smith, Kyle; Chabanov, Andrey; Vitebskiy, Ilya; Kottos, Tsampikos

    2016-01-01

    A localized mode in a photonic layered structure can develop nodal points (nodal planes), where the oscillating electric field is negligible. Placing a thin metallic layer at such a nodal point results in the phenomenon of induced transmission. Here we demonstrate that if the nodal point is not a point of symmetry, then even a tiny alteration of the permittivity in the vicinity of the metallic layer drastically suppresses the localized mode along with the resonant transmission. This renders the layered structure highly reflective within a broad frequency range. Applications of this hypersensitive transport for optical and microwave limiting and switching are discussed. PMID:26903232

  19. Hypersensitive Transport in Photonic Crystals with Accidental Spatial Degeneracies.

    PubMed

    Makri, Eleana; Smith, Kyle; Chabanov, Andrey; Vitebskiy, Ilya; Kottos, Tsampikos

    2016-01-01

    A localized mode in a photonic layered structure can develop nodal points (nodal planes), where the oscillating electric field is negligible. Placing a thin metallic layer at such a nodal point results in the phenomenon of induced transmission. Here we demonstrate that if the nodal point is not a point of symmetry, then even a tiny alteration of the permittivity in the vicinity of the metallic layer drastically suppresses the localized mode along with the resonant transmission. This renders the layered structure highly reflective within a broad frequency range. Applications of this hypersensitive transport for optical and microwave limiting and switching are discussed. PMID:26903232

  20. Radiative transport in fluorescence-enhanced frequency domain photon migration.

    PubMed

    Rasmussen, John C; Joshi, Amit; Pan, Tianshu; Wareing, Todd; McGhee, John; Sevick-Muraca, Eva M

    2006-12-01

    Small animal optical tomography has significant potential for streamlining drug discovery and pre-clinical investigation of drug candidates. However, accurate modeling of photon propagation in small animal volumes is critical for obtaining quantitatively accurate tomographic images. Herein we present solutions from a robust fluorescence-enhanced, frequency domain radiative transport equation (RTE) solver with unique attributes that facilitate its deployment within tomographic algorithms. Specifically, the coupled equations describing time-dependent excitation and emission light transport are solved using discrete ordinates (SN) angular differencing along with linear discontinuous finite-element spatial differencing on unstructured tetrahedral grids. Source iteration in conjunction with diffusion synthetic acceleration is used to iteratively solve the resulting system of equations. This RTE solver can accurately and efficiently predict ballistic as well as diffusion-limited transport regimes, which could simultaneously exist in small animals. Furthermore, the solver provides accurate solutions on unstructured, tetrahedral grids with relatively large element sizes as compared to commonly employed solvers that use step differencing. The predictions of the solver are validated by a series of frequency-domain, phantom measurements with optical properties ranging from diffusion limited to transport limited propagation. Our results demonstrate that the RTE solution consistently matches measurements made under both diffusion- and transport-limited conditions. This work demonstrates the use of an appropriate RTE solver for deployment in small animal optical tomography. PMID:17278821

  1. Hadronic Monte Carlo Transport: A Very Personal View

    NASA Astrophysics Data System (ADS)

    Prael, R. E.

    Much to the disappointment of many, our distinguished speaker for the initial plenary session has been unable to attend our conference. I was prevailed upon by the conference organization to present a talk which, as prescribed, will be of a historical nature, but as the title describes, will also be a very personal view. Perhaps the opinions expressed will find sympathy with my associates around the world who have devoted their efforts to and found some satisfaction with providing the code tools for radiation transport to a large, and occasionally anxious, community of users.

  2. A Two-Dimensional Monte Carlo Code System for Linear Neutron Transport Calculations.

    Energy Science and Technology Software Center (ESTSC)

    1980-04-24

    Version 00 KIM (k-infinite-Monte Carlo) solves the steady-state linear neutron transport equation for a fixed source problem or, by successive fixed-source runs, for the eigenvalue problem, in a two-dimensional infinite thermal reactor lattice using the Monte Carlo method. In addition to the combinatorial description of domains, the program allows complex configurations to be represented by a discrete set of points whereby the calculation speed is greatly improved. Configurations are described as the result of overlays of elementary figures over a basic domain.

  3. Coupling Deterministic and Monte Carlo Transport Methods for the Simulation of Gamma-Ray Spectroscopy Scenarios

    SciTech Connect

    Smith, Leon E.; Gesh, Christopher J.; Pagh, Richard T.; Miller, Erin A.; Shaver, Mark W.; Ashbaker, Eric D.; Batdorf, Michael T.; Ellis, J. E.; Kaye, William R.; McConn, Ronald J.; Meriwether, George H.; Ressler, Jennifer J.; Valsan, Andrei B.; Wareing, Todd A.

    2008-10-31

    Radiation transport modeling methods used in the radiation detection community fall into one of two broad categories: stochastic (Monte Carlo) and deterministic. Monte Carlo methods are typically the tool of choice for simulating gamma-ray spectrometers operating in homeland and national security settings (e.g. portal monitoring of vehicles or isotope identification using handheld devices), but deterministic codes that discretize the linear Boltzmann transport equation in space, angle, and energy offer potential advantages in computational efficiency for many complex radiation detection problems. This paper describes the development of a scenario simulation framework based on deterministic algorithms. Key challenges include: formulating methods to automatically define an energy group structure that can support modeling of gamma-ray spectrometers ranging from low to high resolution; combining deterministic transport algorithms (e.g. ray-tracing and discrete ordinates) to mitigate ray effects for a wide range of problem types; and developing efficient and accurate methods to calculate gamma-ray spectrometer response functions from the deterministic angular flux solutions. The software framework aimed at addressing these challenges is described and results from test problems that compare coupled deterministic-Monte Carlo methods and purely Monte Carlo approaches are provided.

  4. Implicitly causality enforced solution of multidimensional transient photon transport equation.

    PubMed

    Handapangoda, Chintha C; Premaratne, Malin

    2009-12-21

    A novel method for solving the multidimensional transient photon transport equation for laser pulse propagation in biological tissue is presented. A Laguerre expansion is used to represent the time dependency of the incident short pulse. Owing to the intrinsic causal nature of Laguerre functions, our technique automatically preserves the causality constraints of the transient signal. This expansion of the radiance using a Laguerre basis transforms the transient photon transport equation to its steady-state version. The resulting equations are solved with the discrete ordinates method using a finite volume approach. Our method therefore enables one to handle general anisotropic, inhomogeneous media using a single formulation but with an added degree of flexibility owing to the ability to invoke higher-order approximations of discrete ordinate quadrature sets. Compared with existing strategies, this method thus offers the advantage of representing the intensity with high accuracy, minimizing numerical dispersion and false propagation errors. The application of the method to one-, two- and three-dimensional geometries is provided. PMID:20052050
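
    The time-handling idea can be sketched independently of the transport solver: expand a transient pulse on the causal Laguerre basis phi_n(t) = L_n(t) exp(-t/2) and check that a truncated series reproduces it. The pulse shape, time scaling, and truncation order below are illustrative assumptions, not the authors' settings.

```python
import numpy as np
from numpy.polynomial import laguerre

def laguerre_coeffs(f, n_terms, t):
    """Project f(t) onto the first n_terms orthonormal Laguerre functions."""
    dt = t[1] - t[0]
    coeffs = np.empty(n_terms)
    for n in range(n_terms):
        Ln = laguerre.lagval(t, [0.0] * n + [1.0])       # Laguerre polynomial L_n(t)
        phi = Ln * np.exp(-t / 2.0)                      # orthonormal Laguerre function
        coeffs[n] = np.sum(f(t) * phi) * dt              # simple quadrature on a uniform grid
    return coeffs

def reconstruct(coeffs, t):
    out = np.zeros_like(t)
    for n, c in enumerate(coeffs):
        out += c * laguerre.lagval(t, [0.0] * n + [1.0]) * np.exp(-t / 2.0)
    return out

pulse = lambda t: np.exp(-0.5 * (t - 4.0) ** 2)          # short incident pulse (assumed shape)
t = np.linspace(0.0, 40.0, 4001)
c = laguerre_coeffs(pulse, 30, t)
print("max reconstruction error:", np.max(np.abs(reconstruct(c, t) - pulse(t))))
```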

  5. Multidimensional electron-photon transport with standard discrete ordinates codes

    SciTech Connect

    Drumm, C.R.

    1997-04-01

    A method is described for generating electron cross sections that are compatible with standard discrete ordinates codes without modification. There are many advantages of using an established discrete ordinates solver, e.g. immediately available adjoint capability. Coupled electron-photon transport capability is needed for many applications, including the modeling of the response of electronic components to space and man-made radiation environments. The cross sections have been successfully used in the DORT, TWODANT and TORT discrete ordinates codes. The cross sections are shown to provide accurate and efficient solutions to certain multidimensional electron-photon transport problems. The key to the method is a simultaneous solution of the continuous-slowing-down (CSD) portion and elastic-scattering portion of the scattering source by the Goudsmit-Saunderson theory. The resulting multigroup-Legendre cross sections are much smaller than the true scattering cross sections that they represent. Under certain conditions, the cross sections are guaranteed positive and converge with a low-order Legendre expansion.

  6. Accurate and efficient radiation transport in optically thick media -- by means of the Symbolic Implicit Monte Carlo method in the difference formulation

    SciTech Connect

    Szoke, A; Brooks, E D; McKinley, M; Daffin, F

    2005-03-30

    The equations of radiation transport for thermal photons are notoriously difficult to solve in thick media without resorting to asymptotic approximations such as the diffusion limit. One source of this difficulty is that in thick, absorbing media thermal emission is almost completely balanced by strong absorption. In a previous publication [SB03], the photon transport equation was written in terms of the deviation of the specific intensity from the local equilibrium field. We called the new form of the equations the difference formulation. The difference formulation is rigorously equivalent to the original transport equation. It is particularly advantageous in thick media, where the radiation field approaches local equilibrium and the deviations from the Planck distribution are small. The difference formulation for photon transport also clarifies the diffusion limit. In this paper, the transport equation is solved by the Symbolic Implicit Monte Carlo (SIMC) method and a comparison is made between the standard formulation and the difference formulation. The SIMC method is easily adapted to the derivative source terms of the difference formulation, and a remarkable reduction in noise is obtained when the difference formulation is applied to problems involving thick media.

  7. Lorentz force correction to the Boltzmann radiation transport equation and its implications for Monte Carlo algorithms.

    PubMed

    Bouchard, Hugo; Bielajew, Alex

    2015-07-01

    To establish a theoretical framework for generalizing Monte Carlo transport algorithms by adding external electromagnetic fields to the Boltzmann radiation transport equation in a rigorous and consistent fashion. Using first principles, the Boltzmann radiation transport equation is modified by adding a term describing the variation of the particle distribution due to the Lorentz force. The implications of this new equation are evaluated by investigating the validity of Fano's theorem. Additionally, Lewis' approach to multiple scattering theory in infinite homogeneous media is redefined to account for the presence of external electromagnetic fields. The equation is modified and yields a description consistent with the deterministic laws of motion as well as probabilistic methods of solution. The time-independent Boltzmann radiation transport equation is generalized to account for the electromagnetic forces in an additional operator similar to the interaction term. Fano's and Lewis' approaches are stated in this new equation. Fano's theorem is found not to apply in the presence of electromagnetic fields. Lewis' theory for electron multiple scattering and moments, accounting for the coupling between the Lorentz force and multiple elastic scattering, is found. However, further investigation is required to develop useful algorithms for Monte Carlo and deterministic transport methods. To test the accuracy of Monte Carlo transport algorithms in the presence of electromagnetic fields, the Fano cavity test, as currently defined, cannot be applied. Therefore, new tests must be designed for this specific application. A multiple scattering theory that accurately couples the Lorentz force with elastic scattering could improve Monte Carlo efficiency. The present study proposes a new theoretical framework to develop such algorithms. PMID:26061045
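
    For orientation, the deterministic motion that the added Lorentz-force operator describes is what a charged particle does between collisions in external fields; the sketch below integrates it for a non-relativistic electron with the standard Boris scheme. This is an illustration of the underlying physics, not the paper's formalism, and the field values and time step are arbitrary assumptions.

```python
import numpy as np

Q, M = -1.602176634e-19, 9.1093837015e-31      # electron charge [C] and mass [kg]

def boris_step(x, v, e_field, b_field, dt):
    """Advance position and velocity by one step under the Lorentz force (Boris scheme)."""
    qmdt2 = Q * dt / (2.0 * M)
    v_minus = v + qmdt2 * e_field                        # first half electric kick
    t = qmdt2 * b_field                                  # rotation vector
    s = 2.0 * t / (1.0 + np.dot(t, t))
    v_plus = v_minus + np.cross(v_minus + np.cross(v_minus, t), s)  # magnetic rotation
    v_new = v_plus + qmdt2 * e_field                     # second half electric kick
    return x + v_new * dt, v_new

x = np.zeros(3)
v = np.array([1.0e7, 0.0, 0.0])                # 1e7 m/s, well below c (non-relativistic)
E = np.array([0.0, 0.0, 0.0])                  # V/m (assumed)
B = np.array([0.0, 0.0, 1.0e-3])               # T, uniform field (assumed)
dt = 1.0e-11                                   # s
for _ in range(1000):
    x, v = boris_step(x, v, E, B, dt)
print("final |v| (should remain 1e7 m/s, since B does no work):", np.linalg.norm(v))
```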

  8. Lorentz force correction to the Boltzmann radiation transport equation and its implications for Monte Carlo algorithms

    NASA Astrophysics Data System (ADS)

    Bouchard, Hugo; Bielajew, Alex

    2015-07-01

    To establish a theoretical framework for generalizing Monte Carlo transport algorithms by adding external electromagnetic fields to the Boltzmann radiation transport equation in a rigorous and consistent fashion. Using first principles, the Boltzmann radiation transport equation is modified by adding a term describing the variation of the particle distribution due to the Lorentz force. The implications of this new equation are evaluated by investigating the validity of Fano’s theorem. Additionally, Lewis’ approach to multiple scattering theory in infinite homogeneous media is redefined to account for the presence of external electromagnetic fields. The equation is modified and yields a description consistent with the deterministic laws of motion as well as probabilistic methods of solution. The time-independent Boltzmann radiation transport equation is generalized to account for the electromagnetic forces in an additional operator similar to the interaction term. Fano’s and Lewis’ approaches are stated in this new equation. Fano’s theorem is found not to apply in the presence of electromagnetic fields. Lewis’ theory for electron multiple scattering and moments, accounting for the coupling between the Lorentz force and multiple elastic scattering, is found. However, further investigation is required to develop useful algorithms for Monte Carlo and deterministic transport methods. To test the accuracy of Monte Carlo transport algorithms in the presence of electromagnetic fields, the Fano cavity test, as currently defined, cannot be applied. Therefore, new tests must be designed for this specific application. A multiple scattering theory that accurately couples the Lorentz force with elastic scattering could improve Monte Carlo efficiency. The present study proposes a new theoretical framework to develop such algorithms.

  9. A Hybrid Monte Carlo-Deterministic Method for Global Binary Stochastic Medium Transport Problems

    SciTech Connect

    Keady, K P; Brantley, P

    2010-03-04

    Global deep-penetration transport problems are difficult to solve using traditional Monte Carlo techniques. In these problems, the scalar flux distribution is desired at all points in the spatial domain (global nature), and the scalar flux typically drops by several orders of magnitude across the problem (deep-penetration nature). As a result, few particle histories may reach certain regions of the domain, producing a relatively large variance in tallies in those regions. Implicit capture (also known as survival biasing or absorption suppression) can be used to increase the efficiency of the Monte Carlo transport algorithm to some degree. A hybrid Monte Carlo-deterministic technique has previously been developed by Cooper and Larsen to reduce variance in global problems by distributing particles more evenly throughout the spatial domain. This hybrid method uses an approximate deterministic estimate of the forward scalar flux distribution to automatically generate weight windows for the Monte Carlo transport simulation, avoiding the necessity for the code user to specify the weight window parameters. In a binary stochastic medium, the material properties at a given spatial location are known only statistically. The most common approach to solving particle transport problems involving binary stochastic media is to use the atomic mix (AM) approximation in which the transport problem is solved using ensemble-averaged material properties. The most ubiquitous deterministic model developed specifically for solving binary stochastic media transport problems is the Levermore-Pomraning (L-P) model. Zimmerman and Adams proposed a Monte Carlo algorithm (Algorithm A) that solves the Levermore-Pomraning equations and another Monte Carlo algorithm (Algorithm B) that is more accurate as a result of improved local material realization modeling. Recent benchmark studies have shown that Algorithm B is often significantly more accurate than Algorithm A (and therefore the L-P model

  10. Selection of voxel size and photon number in voxel-based Monte Carlo method: criteria and applications

    NASA Astrophysics Data System (ADS)

    Li, Dong; Chen, Bin; Ran, Wei Yu; Wang, Guo Xiang; Wu, Wen Juan

    2015-09-01

    The voxel-based Monte Carlo method (VMC) is now a gold standard in the simulation of light propagation in turbid media. For complex tissue structures, however, the computational cost will be higher when small voxels are used to improve the smoothness of tissue interfaces and a large number of photons are used to obtain accurate results. To reduce computational cost, criteria were proposed to determine the voxel size and photon number in 3-dimensional VMC simulations with acceptable accuracy and computation time. The selection of the voxel size can be expressed as a function of tissue geometry and optical properties. The photon number should be at least 5 times the total voxel number. These criteria are further applied in developing a photon ray splitting scheme with a local grid refinement technique to reduce the computational cost for a nonuniform tissue structure with significantly varying optical properties. In the proposed technique, a nonuniform refined grid system is used, where fine grids are used for the tissue with high absorption and complex geometry, and coarse grids are used for the other part. In this technique, the total photon number is selected based on the voxel size of the coarse grid. Furthermore, the photon-splitting scheme is developed to satisfy the statistical accuracy requirement for the dense grid area. Results show that the local grid refinement technique with the photon ray splitting scheme can accelerate the computation by 7.6 times (reducing the time consumption from 17.5 to 2.3 h) in the simulation of laser light energy deposition in skin tissue that contains port wine stain lesions.
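
    The photon-number rule quoted above (at least 5 times the total voxel number) is easy to encode; the voxel-size choice itself depends on geometry and optical properties in a way the abstract does not spell out, so only the counting rule appears in the sketch below, with an assumed skin-scale grid.

```python
def min_photon_number(domain_size_mm, voxel_size_mm, factor=5):
    """Return (total voxel count, recommended minimum photon number) per the 5x rule."""
    nx, ny, nz = (int(round(d / voxel_size_mm)) for d in domain_size_mm)
    n_voxels = nx * ny * nz
    return n_voxels, factor * n_voxels

# assumed 20 x 20 x 10 mm tissue block with 0.1 mm voxels (illustrative only)
voxels, photons = min_photon_number(domain_size_mm=(20.0, 20.0, 10.0),
                                    voxel_size_mm=0.1)
print(f"{voxels:,} voxels -> launch at least {photons:,} photons")
```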

  11. Selection of voxel size and photon number in voxel-based Monte Carlo method: criteria and applications.

    PubMed

    Li, Dong; Chen, Bin; Ran, Wei Yu; Wang, Guo Xiang; Wu, Wen Juan

    2015-01-01

    The voxel-based Monte Carlo method (VMC) is now a gold standard in the simulation of light propagation in turbid media. For complex tissue structures, however, the computational cost will be higher when small voxels are used to improve the smoothness of tissue interfaces and a large number of photons are used to obtain accurate results. To reduce computational cost, criteria were proposed to determine the voxel size and photon number in 3-dimensional VMC simulations with acceptable accuracy and computation time. The selection of the voxel size can be expressed as a function of tissue geometry and optical properties. The photon number should be at least 5 times the total voxel number. These criteria are further applied in developing a photon ray splitting scheme with a local grid refinement technique to reduce the computational cost for a nonuniform tissue structure with significantly varying optical properties. In the proposed technique, a nonuniform refined grid system is used, where fine grids are used for the tissue with high absorption and complex geometry, and coarse grids are used for the other part. In this technique, the total photon number is selected based on the voxel size of the coarse grid. Furthermore, the photon-splitting scheme is developed to satisfy the statistical accuracy requirement for the dense grid area. Results show that the local grid refinement technique with the photon ray splitting scheme can accelerate the computation by 7.6 times (reducing the time consumption from 17.5 to 2.3 h) in the simulation of laser light energy deposition in skin tissue that contains port wine stain lesions. PMID:26417866

  12. Dosimetric impact of monoenergetic photon beams in the small-animal irradiation with inhomogeneities: A Monte Carlo evaluation

    NASA Astrophysics Data System (ADS)

    Chow, James C. L.

    2013-05-01

    This study investigated the variations of the dose and dose distribution in a small-animal irradiation due to the photon beam energy and presence of inhomogeneity. Based on the same mouse computed tomography image set, three Monte Carlo phantoms, namely inhomogeneous, homogeneous and bone-tissue phantoms, were used in this study. These phantoms were generated by overriding the relative electron density of no voxels (inhomogeneous), all voxels (homogeneous) and the bone voxels (bone-tissue) to one. 360° photon arcs with beam energies of 50-1250 kV were used in mouse irradiations. Doses in the above phantoms were calculated using the EGSnrc-based DOSXYZnrc code through the DOSCTP. It was found that the dose conformity increased with the increase of the photon beam energy from the kV to MV range. For the inhomogeneous mouse phantom, increasing the photon beam energy from 50 kV to 1250 kV increased the dose deposited at the isocenter by a factor of about 21. For the bone dose enhancement, the mean dose was 1.4 times higher when the bone inhomogeneity was not neglected using the 50 kV photon beams in the mouse irradiation. Bone dose enhancement affecting the mean dose in the mouse irradiation can be found for photon beams in the energy range of 50-200 kV, and the dose enhancement decreases with an increase of the beam energy. Moreover, the MV photon beam has a higher dose at the isocenter, and a better dose conformity compared to the kV beam.

  13. Monte Carlo simulation on pre-clinical irradiation: A heterogeneous phantom study on monoenergetic kilovoltage photon beams

    NASA Astrophysics Data System (ADS)

    Chow, James C. L.

    2012-10-01

    This study investigated radiation dose variations in pre-clinical irradiation due to the photon beam energy and presence of tissue heterogeneity. Based on the same mouse computed tomography image dataset, three phantoms, namely heterogeneous, homogeneous and bone homogeneous, were used. These phantoms were generated by overriding the relative electron density of no voxels (heterogeneous), all voxels (homogeneous) and the bone voxels (bone homogeneous) to one. 360° photon arcs with beam energies of 50-1250 keV were used in mouse irradiations. Doses in the above phantoms were calculated using the EGSnrc-based DOSXYZnrc code through the DOSCTP. Monte Carlo simulations were carried out in parallel using multiple nodes in a high-performance computing cluster. It was found that the dose conformity increased with the increase of the photon beam energy from the keV to MeV range. For the heterogeneous mouse phantom, increasing the photon beam energy from 50 keV to 1250 keV increased the dose deposited at the isocenter by a factor of seven. For the bone dose enhancement, the mean dose was 2.7 times higher when the bone heterogeneity was not neglected using the 50 keV photon beams in the mouse irradiation. Bone dose enhancement affecting the mean dose was found for photon beams in the energy range of 50-200 keV, and the dose enhancement decreased with an increase of the beam energy. Moreover, the MeV photon beam had a higher dose at the isocenter, and a better dose conformity compared to the keV beam.

  14. High-resolution Monte Carlo simulation of flow and conservative transport in heterogeneous porous media 2. Transport results

    USGS Publications Warehouse

    Naff, R.L.; Haley, D.F.; Sudicky, E.A.

    1998-01-01

    In this, the second of two papers concerned with the use of numerical simulation to examine flow and transport parameters in heterogeneous porous media via Monte Carlo methods, results from the transport aspect of these simulations are reported on. Transport simulations contained herein assume a finite pulse input of conservative tracer, and the numerical technique endeavors to realistically simulate tracer spreading as the cloud moves through a heterogeneous medium. Medium heterogeneity is limited to the hydraulic conductivity field, and generation of this field assumes that the hydraulic-conductivity process is second-order stationary. Methods of estimating cloud moments, and the interpretation of these moments, are discussed. Techniques for estimation of large-time macrodispersivities from cloud second-moment data, and for the approximation of the standard errors associated with these macrodispersivities, are also presented. These moment and macrodispersivity estimation techniques were applied to tracer clouds resulting from transport scenarios generated by specific Monte Carlo simulations. Where feasible, moments and macrodispersivities resulting from the Monte Carlo simulations are compared with first- and second-order perturbation analyses. Some limited results concerning the possible ergodic nature of these simulations, and the presence of non-Gaussian behavior of the mean cloud, are reported on as well.
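
    The moment-based estimate can be illustrated with a synthetic cloud: track the second central moment of tracer particle positions in time and convert its growth rate to a longitudinal macrodispersivity via A_L = (d sigma_x^2/dt) / (2 v). The advective random walk below, with a known dispersion coefficient, stands in for the Monte Carlo flow-field results of the study.

```python
import numpy as np

rng = np.random.default_rng(5)
v, D_true, dt, n_steps, n_particles = 1.0, 0.05, 0.1, 400, 20_000  # assumed test values

x = np.zeros(n_particles)
times, second_moments = [], []
for step in range(1, n_steps + 1):
    # advection plus Gaussian dispersion step with variance 2*D*dt
    x += v * dt + rng.normal(scale=np.sqrt(2.0 * D_true * dt), size=n_particles)
    times.append(step * dt)
    second_moments.append(np.var(x))              # second central moment of the cloud

slope = np.polyfit(times, second_moments, 1)[0]   # d(sigma^2)/dt at large time
print("estimated macrodispersivity A_L:", slope / (2.0 * v), "(true value:", D_true / v, ")")
```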

  15. Data decomposition of Monte Carlo particle transport simulations via tally servers

    SciTech Connect

    Romano, Paul K.; Siegel, Andrew R.; Forget, Benoit; Smith, Kord

    2013-11-01

    An algorithm for decomposing large tally data in Monte Carlo particle transport simulations is developed, analyzed, and implemented in a continuous-energy Monte Carlo code, OpenMC. The algorithm is based on a non-overlapping decomposition of compute nodes into tracking processors and tally servers. The former are used to simulate the movement of particles through the domain while the latter continuously receive and update tally data. A performance model for this approach is developed, suggesting that, for a range of parameters relevant to LWR analysis, the tally server algorithm should perform with minimal overhead on contemporary supercomputers. An implementation of the algorithm in OpenMC is then tested on the Intrepid and Titan supercomputers, supporting the key predictions of the model over a wide range of parameters. We thus conclude that the tally server algorithm is a successful approach to circumventing classical on-node memory constraints en route to unprecedentedly detailed Monte Carlo reactor simulations.
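
    The decomposition pattern can be caricatured in a few lines: tracking processes simulate histories and ship (cell, score) messages to a server process that owns all tally memory. Python multiprocessing stands in for MPI here, the "physics" is a dummy random walk, and none of this reflects the OpenMC implementation details.

```python
import multiprocessing as mp
import random

N_CELLS = 1000

def tally_server(queue, result):
    """Receive (cell, score) messages and accumulate them; the server owns all tallies."""
    tallies = [0.0] * N_CELLS
    while True:
        msg = queue.get()
        if msg is None:                      # sentinel: all trackers have finished
            break
        cell, score = msg
        tallies[cell] += score
    result.put(tallies)

def tracker(queue, n_histories, seed):
    """Dummy tracking processor: fake collision sites and scores, shipped to the server."""
    rng = random.Random(seed)
    for _ in range(n_histories):
        queue.put((rng.randrange(N_CELLS), rng.random()))

if __name__ == "__main__":
    q, out = mp.Queue(), mp.Queue()
    server = mp.Process(target=tally_server, args=(q, out))
    server.start()
    workers = [mp.Process(target=tracker, args=(q, 5000, s)) for s in range(4)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    q.put(None)                              # tell the server no more scores are coming
    totals = out.get()
    server.join()
    print("total scored weight:", sum(totals))
```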

  16. Simulation of diffuse photon migration in tissue by a Monte Carlo method derived from the optical scattering of spheroids.

    PubMed

    Hart, Vern P; Doyle, Timothy E

    2013-09-01

    A Monte Carlo method was derived from the optical scattering properties of spheroidal particles and used for modeling diffuse photon migration in biological tissue. The spheroidal scattering solution used a separation of variables approach and numerical calculation of the light intensity as a function of the scattering angle. A Monte Carlo algorithm was then developed which utilized the scattering solution to determine successive photon trajectories in a three-dimensional simulation of optical diffusion and resultant scattering intensities in virtual tissue. Monte Carlo simulations using isotropic randomization, Henyey-Greenstein phase functions, and spherical Mie scattering were additionally developed and used for comparison to the spheroidal method. Intensity profiles extracted from diffusion simulations showed that the four models differed significantly. The depth of scattering extinction varied widely among the four models, with the isotropic, spherical, spheroidal, and phase function models displaying total extinction at depths of 3.62, 2.83, 3.28, and 1.95 cm, respectively. The results suggest that advanced scattering simulations could be used as a diagnostic tool by distinguishing specific cellular structures in the diffused signal. For example, simulations could be used to detect large concentrations of deformed cell nuclei indicative of early stage cancer. The presented technique is proposed to be a more physical description of photon migration than existing phase function methods. This is attributed to the spheroidal structure of highly scattering mitochondria and elongation of the cell nucleus, which occurs in the initial phases of certain cancers. The potential applications of the model and its importance to diffusive imaging techniques are discussed. PMID:24085080
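
    The step that links a numerically computed phase function to the Monte Carlo walk can be sketched generically: tabulate intensity versus scattering angle, build its sin(theta)-weighted cumulative distribution, and sample angles by inverse transform. A Henyey-Greenstein curve stands in below for the spheroidal solution, which would be supplied as a table of the same form; the anisotropy value is assumed.

```python
import numpy as np

rng = np.random.default_rng(11)
theta = np.linspace(0.0, np.pi, 1801)                     # tabulation grid
g = 0.9                                                   # assumed anisotropy of the table
p = (1 - g**2) / (1 + g**2 - 2 * g * np.cos(theta)) ** 1.5  # tabulated scattering intensity

pdf = p * np.sin(theta)                                   # probability density per unit theta
cdf = np.cumsum(pdf)
cdf = (cdf - cdf[0]) / (cdf[-1] - cdf[0])                 # normalise the CDF to [0, 1]

def sample_theta(n):
    """Draw n scattering angles from the tabulated phase function by inverse transform."""
    return np.interp(rng.random(n), cdf, theta)

angles = sample_theta(500_000)
print("mean scattering cosine:", np.cos(angles).mean())   # should be near g = 0.9
```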

  17. Modeling bioluminescent photon transport in tissue based on Radiosity-diffusion model

    NASA Astrophysics Data System (ADS)

    Sun, Li; Wang, Pu; Tian, Jie; Zhang, Bo; Han, Dong; Yang, Xin

    2010-03-01

    Bioluminescence tomography (BLT) is one of the most important non-invasive optical molecular imaging modalities, and the model used for bioluminescent photon propagation plays a significant role in BLT studies. Because of its high computational efficiency, the diffusion approximation (DA) is generally applied in bioluminescence tomography. However, the diffusion equation is valid only in highly scattering, weakly absorbing regions and fails in non-scattering or low-scattering tissues, such as a cyst in the breast, the cerebrospinal fluid (CSF) layer of the brain, and the synovial fluid layer in the joints. In this paper, a hybrid radiosity-diffusion model is proposed to handle non-scattering regions embedded in diffusing domains. The hybrid method incorporates a priori information about the geometry of the non-scattering regions, which can be acquired by magnetic resonance imaging (MRI) or x-ray computed tomography (CT). The model is then implemented with a finite element method (FEM) to retain high computational efficiency. Finally, we demonstrate that the method is comparable with the Monte Carlo (MC) method, which is regarded as a 'gold standard' for photon transport simulation.
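
    For reference, the steady-state diffusion approximation used in the diffusing part of such hybrid models is commonly written as follows (standard form; the paper's boundary and radiosity coupling terms are not reproduced here):

      -\nabla \cdot \bigl( D(\mathbf{r}) \, \nabla \Phi(\mathbf{r}) \bigr)
        + \mu_a(\mathbf{r}) \, \Phi(\mathbf{r}) = S(\mathbf{r}),
      \qquad
      D = \frac{1}{3\,(\mu_a + \mu_s')},

    where Φ is the photon fluence rate, μ_a the absorption coefficient, μ_s' the reduced scattering coefficient, and S the bioluminescent source density.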

  18. A hybrid transport-diffusion method for Monte Carlo radiative-transfer simulations

    SciTech Connect

    Densmore, Jeffery D., E-mail: jdd@lanl.gov; Urbatsch, Todd J., E-mail: tmonster@lanl.gov; Evans, Thomas M., E-mail: tme@lanl.gov; Buksas, Michael W., E-mail: mwbuksas@lanl.gov

    2007-03-20

    Discrete Diffusion Monte Carlo (DDMC) is a technique for increasing the efficiency of Monte Carlo particle-transport simulations in diffusive media. If standard Monte Carlo is used in such media, particle histories will consist of many small steps, resulting in a computationally expensive calculation. In DDMC, particles take discrete steps between spatial cells according to a discretized diffusion equation. Each discrete step replaces many small Monte Carlo steps, thus increasing the efficiency of the simulation. In addition, given that DDMC is based on a diffusion equation, it should produce accurate solutions if used judiciously. In practice, DDMC is combined with standard Monte Carlo to form a hybrid transport-diffusion method that can accurately simulate problems with both diffusive and non-diffusive regions. In this paper, we extend previously developed DDMC techniques in several ways that improve the accuracy and utility of DDMC for nonlinear, time-dependent, radiative-transfer calculations. The use of DDMC in these types of problems is advantageous since, due to the underlying linearizations, optically thick regions appear to be diffusive. First, we employ a diffusion equation that is discretized in space but is continuous in time. Not only is this methodology theoretically more accurate than temporally discretized DDMC techniques, but it also has the benefit that a particle's time is always known. Thus, there is no ambiguity regarding what time to assign a particle that leaves an optically thick region (where DDMC is used) and begins transporting by standard Monte Carlo in an optically thin region. Also, we treat the interface between optically thick and optically thin regions with an improved method, based on the asymptotic diffusion-limit boundary condition, that can produce accurate results regardless of the angular distribution of the incident Monte Carlo particles. Finally, we develop a technique for estimating radiation momentum deposition during the
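
    A toy sketch of the hybrid selection idea follows: a cell is handed to the diffusion treatment only when it is optically thick and scattering-dominated; otherwise standard Monte Carlo tracking is used. The threshold values and function name are purely illustrative and are not the criteria derived in the paper.

      def use_ddmc(sigma_t, cell_width, scattering_ratio,
                   tau_min=5.0, c_min=0.99):
          # illustrative thresholds only: the cell is many mean free paths
          # across and its collisions are dominated by scattering
          return sigma_t * cell_width > tau_min and scattering_ratio > c_min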

  19. A Novel Implementation of Massively Parallel Three Dimensional Monte Carlo Radiation Transport

    NASA Astrophysics Data System (ADS)

    Robinson, P. B.; Peterson, J. D. L.

    2005-12-01

    The goal of our summer project was to implement the difference formulation for radiation transport into Cosmos++, a multidimensional, massively parallel, magnetohydrodynamics code for astrophysical applications (Peter Anninos - AX). The difference formulation is a new method for Symbolic Implicit Monte Carlo thermal transport (Brooks and Szöke - PAT). Previously, a simultaneous implementation of fully implicit Monte Carlo radiation transport in multiple dimensions on multiple processors had not been convincingly demonstrated. We found that the combination of the difference formulation and the inherent structure of Cosmos++ makes such an implementation both accurate and straightforward. We developed a "nearly nearest neighbor physics" technique to allow each processor to work independently, even with a fully implicit code. This technique, coupled with the increased accuracy of an implicit Monte Carlo solution and the efficiency of parallel computing systems, allows us to demonstrate the possibility of massively parallel thermal transport. This work was performed under the auspices of the U.S. Department of Energy by University of California Lawrence Livermore National Laboratory under contract No. W-7405-Eng-48.

  20. Dosimetric variation due to the photon beam energy in the small-animal irradiation: A Monte Carlo study

    SciTech Connect

    Chow, James C. L.; Leung, Michael K. K.; Lindsay, Patricia E.; Jaffray, David A.

    2010-10-15

    Purpose: The impact of photon beam energy and tissue heterogeneities on dose distributions and dosimetric characteristics such as point dose, mean dose, and maximum dose was investigated in the context of small-animal irradiation using Monte Carlo simulations based on the EGSnrc code. Methods: Three Monte Carlo mouse phantoms, namely heterogeneous, homogeneous, and bone homogeneous, were generated based on the same mouse computed tomography image set. These phantoms were generated by overriding the tissue type of none of the voxels (heterogeneous), all voxels (homogeneous), and only the bone voxels (bone homogeneous) to that of soft tissue. Phase space files of the 100 and 225 kVp photon beams based on a small-animal irradiator (XRad225Cx, Precision X-Ray Inc., North Branford, CT) were generated using BEAMnrc. A 360 deg. photon arc was simulated, and three-dimensional (3D) dose calculations were carried out using the DOSXYZnrc code through DOSCTP in the above three phantoms. For comparison, the 3D dose distributions, dose profiles, mean, maximum, and point doses at different locations such as the isocenter, lung, rib, and spine were determined in the three phantoms. Results: The dose gradient resulting from the 225 kVp arc was found to be steeper than that from the 100 kVp arc. The mean dose was found to be 1.29 and 1.14 times higher for the heterogeneous phantom than the mean dose in the homogeneous phantom using the 100 and 225 kVp photon arcs, respectively. The bone doses (rib and spine) in the heterogeneous mouse phantom were about five (100 kVp) and three (225 kVp) times higher than in the homogeneous phantom. However, the lung dose did not vary significantly among the heterogeneous, homogeneous, and bone homogeneous phantoms for the 225 kVp compared to the 100 kVp photon beams. Conclusions: A significant bone dose enhancement was found when the 100 and 225 kVp photon beams were used in small-animal irradiation. This dosimetric effect, due to

  1. Radiation dose measurements and Monte Carlo calculations for neutron and photon reactions in a human head phantom for accelerator-based boron neutron capture therapy

    NASA Astrophysics Data System (ADS)

    Kim, Don-Soo

    Dose measurements and radiation transport calculations were performed to investigate the interactions within the human brain of fast neutrons, slow neutrons, thermal neutrons, and photons associated with accelerator-based boron neutron capture therapy (ABNCT). To estimate the overall dose to the human brain, it is necessary to distinguish the doses from the different radiation sources. Using organic scintillators, human head phantom and detector assemblies were designed, constructed, and tested to determine the most appropriate dose estimation system for discriminating the dose due to the different radiation sources, to be ultimately incorporated into a human head phantom used for dose measurements in ABNCT. Monoenergetic and continuous-energy neutrons were generated via the 7Li(p,n)7Be reaction in a metallic lithium target near the reaction threshold using the 5.5 MV Van de Graaff accelerator at the University of Massachusetts Lowell. A human head phantom was built to measure and distinguish, simultaneously during neutron irradiation, the doses that result from proton recoils induced by fast neutrons, from alpha particles and recoil lithium nuclei produced in the 10B(n,alpha)7Li reaction, and from photons generated in the 7Li accelerator target as well as inside the head phantom through various nuclear reactions. The phantom consists of two main parts to estimate the dose to tumor as well as the dose to healthy tissue: a 3.22 cm3 boron-loaded plastic scintillator that simulates a boron-containing tumor inside the brain and a 2664 cm3 cylindrical liquid scintillator that represents the surrounding healthy tissue in the head. The Monte Carlo code MCNPX(TM) was used to simulate the radiation transport of neutrons and photons and was extended to investigate the effects of neutrons and other radiation on the brain at various depths.

  2. Correlated histogram representation of Monte Carlo derived medical accelerator photon-output phase space

    DOEpatents

    Schach Von Wittenau, Alexis E.

    2003-01-01

    A method is provided to represent the calculated phase space of photons emanating from medical accelerators used in photon teletherapy. The method reproduces the energy distributions and trajectories of the photons originating in the bremsstrahlung target and of photons scattered by components within the accelerator head. The method reproduces the energy and directional information from sources up to several centimeters in radial extent, so it is expected to generalize well to accelerators made by different manufacturers. The method is computationally both fast and efficient, with an overall sampling efficiency of 80% or higher for most field sizes. The computational cost is independent of the number of beams used in the treatment plan.
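
    The general idea of sampling from correlated histograms can be sketched as follows: draw an energy bin from the marginal distribution, then draw the angle from the conditional histogram of that energy bin. The bin layout, names, and uniform-within-bin choice are assumptions made for illustration, not the patented representation itself.

      import numpy as np

      def sample_energy_angle(joint_counts, e_edges, a_edges, rng):
          # joint_counts[i, j]: counts for energy bin i and angle bin j
          p_e = joint_counts.sum(axis=1).astype(float)
          i = rng.choice(len(p_e), p=p_e / p_e.sum())          # energy bin from the marginal
          p_a = joint_counts[i].astype(float)
          j = rng.choice(len(p_a), p=p_a / p_a.sum())          # angle bin from the conditional
          energy = rng.uniform(e_edges[i], e_edges[i + 1])     # uniform within the bin
          angle = rng.uniform(a_edges[j], a_edges[j + 1])
          return energy, angle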

  3. Evaluation of path-history-based fluorescence Monte Carlo method for photon migration in heterogeneous media.

    PubMed

    Jiang, Xu; Deng, Yong; Luo, Zhaoyang; Wang, Kan; Lian, Lichao; Yang, Xiaoquan; Meglinski, Igor; Luo, Qingming

    2014-12-29

    The path-history-based fluorescence Monte Carlo method used for fluorescence tomography image reconstruction has attracted increasing attention. In this paper, we first validate the standard fluorescence Monte Carlo (sfMC) method through experiments on a cylindrical phantom. Then, we describe a path-history-based decoupled fluorescence Monte Carlo (dfMC) method, analyze different perturbation fluorescence Monte Carlo (pfMC) methods, and compare the calculation accuracy and computational efficiency of the dfMC and pfMC methods using the sfMC method as a reference. The results show that the dfMC method is more accurate and efficient than the pfMC method in heterogeneous media. PMID:25607163
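
    For context, the conventional perturbation Monte Carlo reweighting of a stored photon path when the optical properties of a region are changed takes the generic form sketched below; the variable names are illustrative, and this is the textbook pMC weight rather than the specific dfMC/pfMC variants compared in the paper.

      import numpy as np

      def pmc_weight(n_scatter, path_length, mus_ref, mua_ref, mus_new, mua_new):
          # rescale each collision by the scattering-coefficient ratio and
          # correct the path attenuation for the change in total coefficient
          mut_ref = mus_ref + mua_ref
          mut_new = mus_new + mua_new
          return (mus_new / mus_ref) ** n_scatter * np.exp(-(mut_new - mut_ref) * path_length)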

  4. Acceleration of Monte Carlo simulation of photon migration in complex heterogeneous media using Intel many-integrated core architecture.

    PubMed

    Gorshkov, Anton V; Kirillin, Mikhail Yu

    2015-08-01

    Over the past two decades, the Monte Carlo technique has become a gold standard for the simulation of light propagation in turbid media, including biotissues. Technological solutions provide further advances of this technique. The Intel Xeon Phi coprocessor is a new type of accelerator for highly parallel general-purpose computing, which allows execution of a wide range of applications without substantial code modification. We present a technical approach to porting our previously developed Monte Carlo (MC) code for the simulation of light transport in tissues to the Intel Xeon Phi coprocessor. We show that employing the accelerator reduces the computational time of MC simulation and yields a simulation speed-up comparable to that of a GPU. We demonstrate the performance of the developed code for the simulation of light transport in the human head and the determination of the measurement volume in near-infrared spectroscopy brain sensing. PMID:26249663

  5. Implementation, capabilities, and benchmarking of Shift, a massively parallel Monte Carlo radiation transport code

    DOE PAGES Beta

    Pandya, Tara M.; Johnson, Seth R.; Evans, Thomas M.; Davidson, Gregory G.; Hamilton, Steven P.; Godfrey, Andrew T.

    2015-12-21

    This paper discusses the implementation, capabilities, and validation of Shift, a massively parallel Monte Carlo radiation transport package developed and maintained at Oak Ridge National Laboratory. It has been developed to scale well from laptop to small computing clusters to advanced supercomputers. Special features of Shift include hybrid capabilities for variance reduction such as CADIS and FW-CADIS, and advanced parallel decomposition and tally methods optimized for scalability on supercomputing architectures. Shift has been validated and verified against various reactor physics benchmarks and compares well to other state-of-the-art Monte Carlo radiation transport codes such as MCNP5, CE KENO-VI, and OpenMC. Some specific benchmarks used for verification and validation include the CASL VERA criticality test suite and several Westinghouse AP1000® problems. These benchmark and scaling studies show promising results.

  7. Minimizing the cost of splitting in Monte Carlo radiation transport simulation

    SciTech Connect

    Juzaitis, R.J.

    1980-10-01

    A deterministic analysis of the computational cost associated with geometric splitting/Russian roulette in Monte Carlo radiation transport calculations is presented. Appropriate integro-differential equations are developed for the first and second moments of the Monte Carlo tally as well as time per particle history, given that splitting with Russian roulette takes place at one (or several) internal surfaces of the geometry. The equations are solved using a standard S_n (discrete ordinates) solution technique, allowing for the prediction of computer cost (formulated as the product of sample variance and time per particle history, σ_s²τ_p) associated with a given set of splitting parameters. Optimum splitting surface locations and splitting ratios are determined. Benefits of such an analysis are particularly noteworthy for transport problems in which splitting is apt to be extensively employed (e.g., deep penetration calculations).
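
    In this notation, the quantity being minimized is the usual Monte Carlo cost, i.e., the inverse figure of merit; a common rule of thumb (not necessarily the deterministic optimum derived in the paper) sets the splitting ratio across a surface to the ratio of the adjacent-region importances:

      \text{cost} \;\propto\; \sigma_s^{2}\,\tau_p \;=\; \frac{1}{\text{FOM}},
      \qquad
      n_{k \to k+1} \;\approx\; \frac{I_{k+1}}{I_{k}},

    where I_k denotes the importance of region k.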

  8. Light transport and lasing in complex photonic structures

    NASA Astrophysics Data System (ADS)

    Liew, Seng Fatt

    Complex photonic structures refer to composite optical materials whose dielectric constant varies on length scales comparable to optical wavelengths. Light propagation in such heterogeneous composites differs greatly from that in homogeneous media owing to the scattering of light in all directions. Interference of these scattered light waves gives rise to many fascinating phenomena, and this has been a fast-growing research area, both for its fundamental physics and for its practical applications. In this thesis, we have investigated the optical properties of photonic structures with different degrees of order, ranging from periodic to random. The first part of this thesis consists of numerical studies of the photonic band gap (PBG) effect in structures from 1D to 3D. From these studies, we have observed that the PBG effect in a 1D photonic crystal is robust against uncorrelated disorder due to preservation of long-range positional order. However, in higher dimensions, short-range positional order alone is sufficient to form PBGs in 2D and 3D photonic amorphous structures (PASs). We have identified several parameters, including dielectric filling fraction and degree of order, that can be tuned to create a broad isotropic PBG. The largest PBG is produced by dielectric networks due to the local uniformity of their dielectric constant distribution. In addition, we also show that deterministic aperiodic structures (DASs) such as the golden-angle spiral and topological defect structures can support a wide PBG, and their optical resonances contain unexpected features compared to those in photonic crystals. Another growing research field based on complex photonic structures is the study of structural color in animals and plants. Previous studies have shown that non-iridescent color can be generated from PASs via single or double scattering. For better understanding of the coloration mechanisms, we have measured the wavelength-dependent scattering length from the biomimetic samples. Our

  9. A General-Purpose Monte Carlo Gamma-Ray Transport Code System for Minicomputers.

    Energy Science and Technology Software Center (ESTSC)

    1981-08-27

    Version 00 The OGRE code system was designed to calculate, by Monte Carlo methods, any quantity related to gamma-ray transport. The system is represented by two codes which treat slab geometry. OGRE-P1 computes the dose on one side of a slab for a source on the other side, and HOTONE computes energy deposition in addition. The source may be monodirectional, isotropic, or cosine distributed.

  10. Modular, object-oriented redesign of a large-scale Monte Carlo neutron transport program

    SciTech Connect

    Moskowitz, B.S.

    2000-02-01

    This paper describes the modular, object-oriented redesign of a large-scale Monte Carlo neutron transport program. This effort represents a complete 'white sheet of paper' rewrite of the code. In this paper, the motivation driving this project, the design objectives for the new version of the program, and the design choices and their consequences will be discussed. The design itself will also be described, including the important subsystems as well as the key classes within those subsystems.
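
    Since the abstract does not name the actual subsystems, the sketch below is a purely hypothetical illustration of the kind of decomposition such a redesign implies, separating geometry, physics, and tallying behind small class interfaces around a particle-history loop.

      import random
      from dataclasses import dataclass

      # Hypothetical class layout only; the real subsystem and class names are
      # described in the paper, not here.

      @dataclass
      class Particle:
          x: float = 0.0
          mu: float = 1.0                              # direction cosine
          alive: bool = True

      class SlabGeometry:
          def __init__(self, thickness): self.thickness = thickness
          def inside(self, p): return 0.0 <= p.x <= self.thickness

      class Physics:
          def __init__(self, sigma_t, scatter_ratio):
              self.sigma_t, self.scatter_ratio = sigma_t, scatter_ratio
          def flight_distance(self): return random.expovariate(self.sigma_t)
          def collide(self, p):
              if random.random() < self.scatter_ratio:
                  p.mu = random.uniform(-1.0, 1.0)     # isotropic scatter
              else:
                  p.alive = False                      # absorbed

      class LeakageTally:
          def __init__(self): self.count = 0
          def score(self, p): self.count += 1

      def history(geom, phys, tally):
          # follow one particle from birth to absorption or leakage
          p = Particle()
          while p.alive:
              p.x += p.mu * phys.flight_distance()
              if not geom.inside(p):
                  tally.score(p)
                  break
              phys.collide(p)

      geom, phys, tally = SlabGeometry(5.0), Physics(1.0, 0.5), LeakageTally()
      for _ in range(1000):
          history(geom, phys, tally)
      print("leakage fraction:", tally.count / 1000)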