Sample records for dosimetry computer codes

  1. Sci—Thur AM: YIS - 03: irtGPUMCD: a new GPU-calculated dosimetry code for {sup 177}Lu-octreotate radionuclide therapy of neuroendocrine tumors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Montégiani, Jean-François; Gaudin, Émilie; Després, Philippe

    2014-08-15

    In peptide receptor radionuclide therapy (PRRT), large inter-patient variability in absorbed radiation dose per administered activity mandates individualized dosimetry to evaluate therapeutic efficacy and toxicity. We created a reliable GPU-calculated dosimetry code (irtGPUMCD) and assessed {sup 177}Lu-octreotate renal dosimetry in eight patients (4 cycles of approximately 7.4 GBq). irtGPUMCD was derived from a brachytherapy dosimetry code (bGPUMCD), which was adapted to {sup 177}Lu PRRT dosimetry. Serial quantitative single-photon emission computed tomography (SPECT) images were obtained from three SPECT/CT acquisitions performed at 4, 24 and 72 hours after {sup 177}Lu-octreotate administration, and registered with non-rigid deformation of CT volumes, to obtain the 4D quantitative {sup 177}Lu-octreotate biodistribution. Local energy deposition from the β disintegrations was assumed. Using Monte Carlo gamma photon transport, irtGPUMCD computed the dose rate at each time point. The average kidney absorbed dose was obtained from 1-cm{sup 3} VOI dose-rate samples on each cortex, subjected to a biexponential curve fit. Integration of this time-dose rate curve yielded the renal absorbed dose. The mean renal dose per administered activity was 0.48 ± 0.13 Gy/GBq (range: 0.30–0.71 Gy/GBq). Comparison with another PRRT dosimetry code (VRAK: Voxelized Registration and Kinetics) showed fair agreement with irtGPUMCD (mean difference 11.4 ± 6.8%, range: 3.3–26.2%). These results suggest that the irtGPUMCD code could be used to personalize administered activity in PRRT, which could improve clinical outcomes by maximizing per-cycle tumor doses without exceeding the tolerable renal dose.
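The time-integration step described above (dose-rate samples at 4, 24 and 72 h, curve fit, integration to infinity) can be sketched in a few lines. The abstract uses a biexponential fit; the sketch below uses a simpler, commonly seen stand-in: trapezoids between the measured samples plus a mono-exponential tail fitted to the last two points. All numerical values are illustrative, not the study's data.

```python
import math

def absorbed_dose(times_h, rates_gy_per_h):
    """Integrate a sampled dose-rate curve from the first sample to infinity.

    Trapezoids between samples; mono-exponential tail from the last two
    samples (a simplification of the biexponential fit in the abstract).
    """
    auc = 0.0
    for i in range(len(times_h) - 1):
        auc += 0.5 * (rates_gy_per_h[i] + rates_gy_per_h[i + 1]) \
                   * (times_h[i + 1] - times_h[i])
    # Effective decay constant from the last two samples
    lam = math.log(rates_gy_per_h[-2] / rates_gy_per_h[-1]) \
        / (times_h[-1] - times_h[-2])
    auc += rates_gy_per_h[-1] / lam   # analytic tail integral to infinity
    return auc
```

For example, `absorbed_dose([4, 24, 72], [r1, r2, r3])` with dose rates in Gy/h returns an absorbed dose in Gy; dividing by the administered activity gives the Gy/GBq figure quoted in the abstract.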

  2. Items Supporting the Hanford Internal Dosimetry Program Implementation of the IMBA Computer Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carbaugh, Eugene H.; Bihl, Donald E.

    2008-01-07

    The Hanford Internal Dosimetry Program has adopted the computer code IMBA (Integrated Modules for Bioassay Analysis) as its primary code for bioassay data evaluation and dose assessment using methodologies of ICRP Publications 60, 66, 67, 68, and 78. The adoption of this code was part of the implementation plan for the June 8, 2007 amendments to 10 CFR 835. This information release includes action items unique to IMBA that were required by PNNL quality assurance standards for implementation of safety software. Copies of the IMBA software verification test plan and the outline of the briefing given to new users are also included.

  3. Reactor Dosimetry Applications Using RAPTOR-M3G: A New Parallel 3-D Radiation Transport Code

    NASA Astrophysics Data System (ADS)

    Longoni, Gianluca; Anderson, Stanwood L.

    2009-08-01

    The numerical solution of the Linearized Boltzmann Equation (LBE) via the Discrete Ordinates method (SN) requires extensive computational resources for large 3-D neutron and gamma transport applications due to the concurrent discretization of the angular, spatial, and energy domains. This paper discusses the development of RAPTOR-M3G (RApid Parallel Transport Of Radiation - Multiple 3D Geometries), a new 3-D parallel radiation transport code, and its application to the calculation of ex-vessel neutron dosimetry responses in the cavity of a commercial 2-loop Pressurized Water Reactor (PWR). RAPTOR-M3G is based on domain decomposition algorithms, where the spatial and angular domains are allocated and processed on multi-processor computer architectures. As compared to traditional single-processor applications, this approach reduces the computational load as well as the memory requirement per processor, yielding an efficient solution methodology for large 3-D problems. Measured neutron dosimetry responses in the reactor cavity air gap will be compared to the RAPTOR-M3G predictions. This paper is organized as follows: Section 1 discusses the RAPTOR-M3G methodology; Section 2 describes the 2-loop PWR model and the numerical results obtained. Section 3 addresses the parallel performance of the code, and Section 4 concludes this paper with final remarks and future work.
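The discrete-ordinates discretization that RAPTOR-M3G parallelizes can be illustrated with a toy 1-D S2 solver: diamond-difference sweeps over two ordinates with source iteration on the scattering term. This is purely illustrative (RAPTOR-M3G solves the full 3-D problem on parallel architectures); all cross sections and dimensions below are made-up numbers.

```python
def s2_slab(n=100, length=10.0, sig_t=1.0, sig_s=0.5, q=1.0, tol=1e-10):
    """Toy 1-D S2 discrete-ordinates solver: diamond difference + source iteration.

    Vacuum boundaries, uniform isotropic source q. Returns the scalar flux
    per cell, the total boundary leakage, and the cell width.
    """
    dx = length / n
    mu0 = 0.5773502691896258            # S2 Gauss-Legendre ordinate
    phi = [0.0] * n
    for _ in range(10000):
        phi_new = [0.0] * n
        leak = 0.0
        for mu in (mu0, -mu0):          # ordinate weights are 1 each (sum = 2)
            order = range(n) if mu > 0 else range(n - 1, -1, -1)
            psi = 0.0                   # incoming angular flux (vacuum boundary)
            for i in order:
                src = 0.5 * (sig_s * phi[i] + q)    # isotropic source, per unit mu
                a = abs(mu) / dx
                psi_out = ((a - 0.5 * sig_t) * psi + src) / (a + 0.5 * sig_t)
                phi_new[i] += 0.5 * (psi + psi_out)  # weight * cell-average flux
                psi = psi_out
            leak += abs(mu) * psi       # outgoing current at the exit face
        done = max(abs(x - y) for x, y in zip(phi_new, phi)) < tol
        phi = phi_new
        if done:
            break
    return phi, leak, dx
```

The converged solution satisfies particle balance (source = absorption + leakage), which is the standard sanity check for a conservative scheme like diamond difference; spatial domain decomposition, as in RAPTOR-M3G, amounts to distributing such sweeps over subdomains with flux exchange at their interfaces.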

  4. WE-AB-204-11: Development of a Nuclear Medicine Dosimetry Module for the GPU-Based Monte Carlo Code ARCHER

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, T; Lin, H; Xu, X

    Purpose: To develop a nuclear medicine dosimetry module for the GPU-based Monte Carlo code ARCHER. Methods: We have developed a nuclear medicine dosimetry module for the fast Monte Carlo code ARCHER. The coupled electron-photon Monte Carlo transport kernel included in ARCHER is built upon the Dose Planning Method code (DPM). The developed module manages the radioactive decay simulation by consecutively tracking several types of radiation on a per-disintegration basis using the statistical sampling method. Optimization techniques such as persistent threads and prefetching are studied and implemented. The developed module is verified against the VIDA code, which is based on the Geant4 toolkit and has previously been verified against OLINDA/EXM. A voxelized geometry is used in the preliminary test: a sphere made of ICRP soft tissue is surrounded by a box filled with water. Uniform activity distribution of I-131 is assumed in the sphere. Results: The self-absorption dose factors (mGy/(MBq s)) of the sphere with varying diameters are calculated by ARCHER and VIDA respectively. ARCHER's results are in agreement with VIDA's, which were obtained from a previous publication. VIDA takes hours of CPU time to finish the computation, while it takes ARCHER 4.31 seconds for the 12.4-cm uniform activity sphere case. For a fairer CPU-GPU comparison, more effort will be made to eliminate the algorithmic differences. Conclusion: The coupled electron-photon Monte Carlo code ARCHER has been extended to radioactive decay simulation for nuclear medicine dosimetry. The developed code exhibits good performance in our preliminary test. The GPU-based Monte Carlo code is developed with grant support from the National Institute of Biomedical Imaging and Bioengineering through an R01 grant (R01EB015478).
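The per-disintegration statistical sampling the module performs can be sketched as follows: for each decay, sample which radiations are emitted from the nuclide's emission yields. The sketch below uses a deliberately simplified I-131 scheme (only the principal 364 keV gamma line, with an approximate 81.5% yield, and a crude stand-in for the beta spectrum); a real code samples the full decay scheme and then transports each particle.

```python
import random

# Simplified I-131 emission data (illustrative only; the real decay scheme
# has many more lines -- yields below are approximate literature values).
GAMMA_364_KEV_YIELD = 0.815    # principal gamma line, 0.364 MeV
BETA_MEAN_MEV = 0.192          # approximate mean beta energy per decay

def sample_decay(rng):
    """Sample the radiations emitted by one I-131 disintegration."""
    # Crude beta-spectrum stand-in: exponential with the correct mean energy
    emissions = [("beta", rng.expovariate(1.0 / BETA_MEAN_MEV))]
    if rng.random() < GAMMA_364_KEV_YIELD:
        emissions.append(("gamma", 0.364))
    return emissions
```

Each sampled particle would then be handed to the coupled electron-photon transport kernel; summing deposited energy over many disintegrations and dividing by the cumulated activity yields the self-absorption dose factor quoted above.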

  5. Evaluating the performance of two neutron spectrum unfolding codes based on iterative procedures and artificial neural networks

    NASA Astrophysics Data System (ADS)

    Ortiz-Rodríguez, J. M.; Reyes Alfaro, A.; Reyes Haro, A.; Solís Sánches, L. O.; Miranda, R. Castañeda; Cervantes Viramontes, J. M.; Vega-Carrillo, H. R.

    2013-07-01

    In this work the performance of two neutron spectrum unfolding codes based on iterative procedures and artificial neural networks is evaluated. The first code, based on traditional iterative procedures and called Neutron spectrometry and dosimetry from the Universidad Autonoma de Zacatecas (NSDUAZ), uses the SPUNIT iterative algorithm and was designed to unfold the neutron spectrum and calculate 15 dosimetric quantities and the response of 7 IAEA survey meters. The main feature of this code is the automated selection of the initial guess spectrum through a compendium of neutron spectra compiled by the IAEA. The second code, known as Neutron spectrometry and dosimetry with artificial neural networks (NSDann), was designed using neural network technology. The artificial intelligence approach of a neural network does not solve mathematical equations: using the knowledge stored in the synaptic weights of a properly trained network, the code is capable of unfolding the neutron spectrum and simultaneously calculating 15 dosimetric quantities, requiring as input only the count rates measured with a Bonner sphere system. The NSDUAZ and NSDann codes are similar in that they follow the same simple and intuitive user philosophy and were designed with a graphical interface in the LabVIEW programming environment. Both codes unfold the neutron spectrum expressed in 60 energy bins, calculate 15 dosimetric quantities and generate a full report in HTML format. They differ in that the NSDUAZ code was designed using classical iterative approaches and needs an initial guess spectrum to initiate the iterative procedure; in NSDUAZ, a programming routine was designed to calculate 7 IAEA instrument survey meters using the fluence-to-dose conversion coefficients. The NSDann code uses artificial neural networks, through the synaptic weights of a properly trained network, to solve the ill-conditioned equation system of the neutron spectrometry problem. Contrary to iterative procedures, the neural network approach makes it possible to reduce the number of count rates used to unfold the neutron spectrum. To evaluate these codes, a computer tool called the Neutron Spectrometry and Dosimetry computer tool was designed. The results obtained with this package are shown. The codes mentioned here are freely available upon request from the authors.
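The iterative unfolding the first code performs can be illustrated with a multiplicative update scheme of the kind used by MLEM-style algorithms: starting from a guess spectrum, forward-fold it through the Bonner-sphere response matrix and correct each energy bin by the ratio of measured to predicted counts. This is a generic sketch, not the exact SPUNIT algorithm, and the response matrix and counts below are hypothetical.

```python
def unfold(response, counts, n_iter=2000):
    """Iterative multiplicative spectrum unfolding (MLEM-style sketch).

    response : m x n matrix, detector i response to unit fluence in bin j
    counts   : m measured count rates
    Returns an unfolded n-bin spectrum (non-negative by construction).
    """
    n_bins = len(response[0])
    n_det = len(counts)
    phi = [1.0] * n_bins    # flat initial guess spectrum
    for _ in range(n_iter):
        predicted = [sum(response[i][j] * phi[j] for j in range(n_bins))
                     for i in range(n_det)]
        phi = [phi[j]
               * sum(response[i][j] * counts[i] / predicted[i] for i in range(n_det))
               / sum(response[i][j] for i in range(n_det))
               for j in range(n_bins)]
    return phi
```

In a real code the initial guess would come from a spectrum compendium (as NSDUAZ does via the IAEA compilation) rather than a flat spectrum, which matters because the problem is under-determined: few detectors, many energy bins.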

  6. Development and application of a complex numerical model and software for the computation of dose conversion factors for radon progenies.

    PubMed

    Farkas, Árpád; Balásházy, Imre

    2015-04-01

    Advances in epidemiological health risk estimates in recent years have made a more exact determination of the dose conversion factors associated with radon progeny inhalation possible. The enhancement of computational power and the development of numerical techniques allow computing dose conversion factors with increasing reliability. The objective of this study was to develop an integrated model and software, based on a self-developed airway deposition code, the authors' own bronchial dosimetry model and the computational methods accepted by the International Commission on Radiological Protection (ICRP), to calculate dose conversion coefficients for different exposure conditions. The model was tested by applying it to exposure and breathing conditions characteristic of mines and homes. The dose conversion factors were 8 and 16 mSv WLM(-1) for homes and mines when applying a stochastic deposition model combined with the ICRP dosimetry model (named the PM-A model), and 9 and 17 mSv WLM(-1) when applying the same deposition model combined with the authors' bronchial dosimetry model and the ICRP bronchiolar and alveolar-interstitial dosimetry model (called the PM-B model). User-friendly software for the computation of dose conversion factors has also been developed. The software allows one to compute conversion factors for a large range of exposure and breathing parameters and to perform sensitivity analyses. © The Author 2014. Published by Oxford University Press. All rights reserved.

  7. Monte Carlo treatment planning for molecular targeted radiotherapy within the MINERVA system

    NASA Astrophysics Data System (ADS)

    Lehmann, Joerg; Hartmann Siantar, Christine; Wessol, Daniel E.; Wemple, Charles A.; Nigg, David; Cogliati, Josh; Daly, Tom; Descalle, Marie-Anne; Flickinger, Terry; Pletcher, David; DeNardo, Gerald

    2005-03-01

    The aim of this project is to extend accurate and patient-specific treatment planning to new treatment modalities, such as molecular targeted radiation therapy, incorporating previously crafted and proven Monte Carlo and deterministic computation methods. A flexible software environment is being created that allows planning radiation treatment for these new modalities and combining different forms of radiation treatment with consideration of biological effects. The system uses common input interfaces, medical image sets for the definition of patient geometry, and dose reporting protocols. Previously, the Idaho National Engineering and Environmental Laboratory (INEEL), Montana State University (MSU) and Lawrence Livermore National Laboratory (LLNL) had accrued experience in the development and application of Monte Carlo based, three-dimensional, computational dosimetry and treatment planning tools for radiotherapy in several specialized areas. In particular, INEEL and MSU have developed computational dosimetry systems for neutron radiotherapy and neutron capture therapy, while LLNL has developed the PEREGRINE computational system for external beam photon-electron therapy. Building on that experience, the INEEL and MSU are developing the MINERVA (modality inclusive environment for radiotherapeutic variable analysis) software system as a general framework for computational dosimetry and treatment planning for a variety of emerging forms of radiotherapy. In collaboration with this development, LLNL has extended its PEREGRINE code to accommodate internal sources for molecular targeted radiotherapy (MTR), and has interfaced it with the plugin architecture of MINERVA. Results from the extended PEREGRINE code have been compared to published data from other codes, and found to be in general agreement (EGS4: 2%, MCNP: 10%) (Descalle et al 2003 Cancer Biother. Radiopharm. 18 71-9). The code is currently being benchmarked against experimental data. 
The interpatient variability of the drug pharmacokinetics in MTR can only be properly accounted for by image-based, patient-specific treatment planning, as has been common in external beam radiation therapy for many years. MINERVA offers 3D Monte Carlo-based MTR treatment planning as its first integrated operational capability. The new MINERVA system will ultimately incorporate capabilities for a comprehensive list of radiation therapies. In progress are modules for external beam photon-electron therapy and boron neutron capture therapy (BNCT). Brachytherapy and proton therapy are planned. Through the open application programming interface (API), other groups can add their own modules and share them with the community.

  8. SU-F-T-111: Investigation of the Attila Deterministic Solver as a Supplement to Monte Carlo for Calculating Out-Of-Field Radiotherapy Dose

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mille, M; Lee, C; Failla, G

    Purpose: To use the Attila deterministic solver as a supplement to Monte Carlo for calculating out-of-field organ dose in support of epidemiological studies of the risks of second cancers. Supplemental dosimetry tools are needed to speed up dose calculations for studies involving large-scale patient cohorts. Methods: Attila is a multi-group discrete ordinates code which can solve the 3D photon-electron coupled linear Boltzmann radiation transport equation on a finite-element mesh. Dose is computed by multiplying the calculated particle flux in each mesh element by a medium-specific energy deposition cross-section. The out-of-field dosimetry capability of Attila is investigated by comparing average organ dose to that calculated by Monte Carlo simulation. The test scenario consists of a 6 MV external beam treatment of a female patient with a tumor in the left breast. The patient is simulated by a whole-body adult reference female computational phantom. Monte Carlo simulations were performed using MCNP6 and XVMC. Attila can export a tetrahedral mesh for MCNP6, allowing for a direct comparison between the two codes. The Attila and Monte Carlo methods were also compared in terms of calculation speed and complexity of simulation setup. A key prerequisite for this work was the modeling of a Varian Clinac 2100 linear accelerator. Results: The solid mesh of the torso part of the adult female phantom for the Attila calculation was prepared using the CAD software SpaceClaim. Preliminary calculations suggest that Attila is user-friendly software which shows great promise for our intended application. Computational performance is related to the number of tetrahedral elements included in the Attila calculation. Conclusion: Attila is being explored as a supplement to the conventional Monte Carlo radiation transport approach for performing retrospective patient dosimetry. The goal is for the dosimetry to be sufficiently accurate for use in retrospective epidemiological investigations.
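The dose-computation step described in the abstract (flux in each mesh element times a medium-specific energy-deposition coefficient, summed over energy groups) is a simple fold. The sketch below assumes hypothetical group fluxes, materials, and coefficients; it is the generic multigroup operation, not Attila's implementation.

```python
def mesh_dose(flux, material_of, kerma):
    """Per-element dose from multigroup fluxes.

    flux        : per-element list of group fluxes
    material_of : per-element material name
    kerma       : material -> per-group energy-deposition coefficients
    dose_e = sum_g flux[e][g] * kerma[material_of[e]][g]
    """
    return [sum(f * k for f, k in zip(flux[e], kerma[material_of[e]]))
            for e in range(len(flux))]
```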

  9. A dosimetry study comparing NCS report-5, IAEA TRS-381, AAPM TG-51 and IAEA TRS-398 in three clinical electron beam energies

    NASA Astrophysics Data System (ADS)

    Palmans, Hugo; Nafaa, Laila; de Patoul, Nathalie; Denis, Jean-Marc; Tomsej, Milan; Vynckier, Stefaan

    2003-05-01

    New codes of practice for reference dosimetry in clinical high-energy photon and electron beams have been published recently, to replace the air kerma based codes of practice that have determined the dosimetry of these beams for the past twenty years. In the present work, we compared dosimetry based on the two most widespread absorbed dose based recommendations (AAPM TG-51 and IAEA TRS-398) with two air kerma based recommendations (NCS report-5 and IAEA TRS-381). Measurements were performed in three clinical electron beam energies using two NE2571-type cylindrical chambers, two Markus-type plane-parallel chambers and two NACP-02-type plane-parallel chambers. Dosimetry based on direct calibrations of all chambers in 60Co was investigated, as well as dosimetry based on cross-calibrations of plane-parallel chambers against a cylindrical chamber in a high-energy electron beam. Furthermore, 60Co perturbation factors for plane-parallel chambers were derived. It is shown that the use of 60Co calibration factors could result in deviations of more than 2% for plane-parallel chambers between the old and new codes of practice, whereas the use of cross-calibration factors, which is the first recommendation in the new codes, reduces the differences to less than 0.8% for all situations investigated here. The results thus show that neither the chamber-to-chamber variations, nor the obtained absolute dose values are significantly altered by changing from air kerma based dosimetry to absorbed dose based dosimetry when using calibration factors obtained from the Laboratory for Standard Dosimetry, Ghent, Belgium. The values of the 60Co perturbation factor for plane-parallel chambers (katt·km for the air kerma based and pwall for the absorbed dose based codes of practice) that are obtained from comparing the results based on 60Co calibrations and cross-calibrations are in agreement, within the experimental uncertainties, with the results from other investigators.
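The cross-calibration the new codes recommend assigns the plane-parallel chamber a calibration coefficient from simultaneous (or sequential, same-conditions) measurements with a calibrated cylindrical chamber in a high-energy electron beam. A TRS-398-style sketch, with all readings and coefficients hypothetical:

```python
def cross_calibration_coefficient(m_cyl, n_dw_cyl, k_q_cyl, m_pp):
    """Cross-calibrate a plane-parallel chamber against a cylindrical chamber
    in a high-energy electron beam (TRS-398-style):

        N_Dw_pp = (M_cyl * N_Dw_cyl * k_Q_cyl) / M_pp

    where both readings M are corrected to the same reference conditions,
    N_Dw_cyl is the cylindrical chamber's 60Co calibration coefficient and
    k_Q_cyl its beam quality correction for the cross-calibration beam.
    """
    return m_cyl * n_dw_cyl * k_q_cyl / m_pp
```

Using this coefficient instead of a direct 60Co calibration of the plane-parallel chamber is what reduces the inter-protocol differences to below 0.8% in the abstract's results, since the poorly known 60Co wall perturbation of the plane-parallel chamber cancels out.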

  10. Internal dosimetry with the Monte Carlo code GATE: validation using the ICRP/ICRU female reference computational model

    NASA Astrophysics Data System (ADS)

    Villoing, Daphnée; Marcatili, Sara; Garcia, Marie-Paule; Bardiès, Manuel

    2017-03-01

    The purpose of this work was to validate GATE-based clinical scale absorbed dose calculations in nuclear medicine dosimetry. GATE (version 6.2) and MCNPX (version 2.7.a) were used to derive dosimetric parameters (absorbed fractions, specific absorbed fractions and S-values) for the reference female computational model proposed by the International Commission on Radiological Protection in ICRP report 110. Monoenergetic photons and electrons (from 50 keV to 2 MeV) and four isotopes currently used in nuclear medicine (fluorine-18, lutetium-177, iodine-131 and yttrium-90) were investigated. Absorbed fractions, specific absorbed fractions and S-values were generated with GATE and MCNPX for 12 regions of interest in the ICRP 110 female computational model, thereby leading to 144 source/target pair configurations. Relative differences between GATE and MCNPX obtained in specific configurations (self-irradiation or cross-irradiation) are presented. Relative differences in absorbed fractions, specific absorbed fractions or S-values are below 10%, and in most cases less than 5%. Dosimetric results generated with GATE for the 12 volumes of interest are available as supplemental data. GATE can be safely used for radiopharmaceutical dosimetry at the clinical scale. This makes GATE a viable option for Monte Carlo modelling of both imaging and absorbed dose in nuclear medicine.
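The dosimetric parameters compared above are related through the MIRD formalism: an S value is the absorbed dose to a target region per decay in a source region, obtained from the energy emitted per decay and the absorbed fraction. A minimal sketch (emission data and masses are placeholders, not values from the paper):

```python
MEV_TO_J = 1.602176634e-13  # exact CODATA conversion

def s_value(energies_mev_per_decay, absorbed_fractions, target_mass_kg):
    """MIRD-style S value in Gy per decay:

        S(target <- source) = sum_i Delta_i * phi_i / m_target

    Delta_i : mean energy of emission i per decay (here MeV, converted to J)
    phi_i   : absorbed fraction of emission i in the target region
    """
    return sum(e * MEV_TO_J * af
               for e, af in zip(energies_mev_per_decay, absorbed_fractions)) \
        / target_mass_kg
```

Dividing the absorbed fraction by the target mass gives the specific absorbed fraction, the other quantity tabulated in the comparison; Monte Carlo codes such as GATE and MCNPX estimate the phi_i by tallying deposited energy.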

  11. Evaluating the performance of two neutron spectrum unfolding codes based on iterative procedures and artificial neural networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ortiz-Rodriguez, J. M.; Reyes Alfaro, A.; Reyes Haro, A.

    In this work the performance of two neutron spectrum unfolding codes based on iterative procedures and artificial neural networks is evaluated. The first one code based on traditional iterative procedures and called Neutron spectrometry and dosimetry from the Universidad Autonoma de Zacatecas (NSDUAZ) use the SPUNIT iterative algorithm and was designed to unfold neutron spectrum and calculate 15 dosimetric quantities and 7 IAEA survey meters. The main feature of this code is the automated selection of the initial guess spectrum trough a compendium of neutron spectrum compiled by the IAEA. The second one code known as Neutron spectrometry and dosimetrymore » with artificial neural networks (NDSann) is a code designed using neural nets technology. The artificial intelligence approach of neural net does not solve mathematical equations. By using the knowledge stored at synaptic weights on a neural net properly trained, the code is capable to unfold neutron spectrum and to simultaneously calculate 15 dosimetric quantities, needing as entrance data, only the rate counts measured with a Bonner spheres system. Similarities of both NSDUAZ and NSDann codes are: they follow the same easy and intuitive user's philosophy and were designed in a graphical interface under the LabVIEW programming environment. Both codes unfold the neutron spectrum expressed in 60 energy bins, calculate 15 dosimetric quantities and generate a full report in HTML format. Differences of these codes are: NSDUAZ code was designed using classical iterative approaches and needs an initial guess spectrum in order to initiate the iterative procedure. In NSDUAZ, a programming routine was designed to calculate 7 IAEA instrument survey meters using the fluence-dose conversion coefficients. NSDann code use artificial neural networks for solving the ill-conditioned equation system of neutron spectrometry problem through synaptic weights of a properly trained neural network. 
Contrary to iterative procedures, in neural net approach it is possible to reduce the rate counts used to unfold the neutron spectrum. To evaluate these codes a computer tool called Neutron Spectrometry and dosimetry computer tool was designed. The results obtained with this package are showed. The codes here mentioned are freely available upon request to the authors.« less

  12. Detour factors in water and plastic phantoms and their use for range and depth scaling in electron-beam dosimetry.

    PubMed

    Fernández-Varea, J M; Andreo, P; Tabata, T

    1996-07-01

    Average penetration depths and detour factors of 1-50 MeV electrons in water and plastic materials have been computed by means of analytical calculation, within the continuous-slowing-down approximation and including multiple scattering, and using the Monte Carlo codes ITS and PENELOPE. Results are compared to detour factors from alternative definitions previously proposed in the literature. Different procedures used in low-energy electron-beam dosimetry to convert ranges and depths measured in plastic phantoms into water-equivalent ranges and depths are analysed. A new simple and accurate scaling method, based on Monte Carlo-derived ratios of average electron penetration depths and thus incorporating the effect of multiple scattering, is presented. Data are given for most plastics used in electron-beam dosimetry together with a fit which extends the method to any other low-Z plastic material. A study of scaled depth-dose curves and mean energies as a function of depth for some plastics of common usage shows that the method improves the consistency and results of other scaling procedures in dosimetry with electron beams at therapeutic energies.
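The scaling method described above converts a depth measured in a plastic phantom into a water-equivalent depth via the ratio of average electron penetration depths in the two media. A sketch of the operation (the ratio below is a placeholder, not one of the paper's Monte Carlo-derived values):

```python
def water_equivalent_depth(depth_plastic_cm, rbar_water_cm, rbar_plastic_cm):
    """Scale a depth measured in plastic to water using the ratio of average
    electron penetration depths (which, unlike CSDA-range ratios, includes
    the effect of multiple scattering). Ratio values are illustrative.
    """
    return depth_plastic_cm * (rbar_water_cm / rbar_plastic_cm)
```

The same ratio would be applied to measured ranges; the paper's contribution is supplying Monte Carlo-derived average penetration depths (and a fit for arbitrary low-Z plastics) to feed such a scaling.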

  13. Application for internal dosimetry using biokinetic distribution of photons based on nuclear medicine images.

    PubMed

    Leal Neto, Viriato; Vieira, José Wilson; Lima, Fernando Roberto de Andrade

    2014-01-01

    This article presents a way to obtain dose estimates for patients submitted to radiotherapy based on the analysis of regions of interest in nuclear medicine images. A software tool called DoRadIo (Dosimetria das Radiações Ionizantes [Ionizing Radiation Dosimetry]) was developed to receive information about source organs and target organs, generating graphical and numerical results. The nuclear medicine images utilized in the present study were obtained from catalogs provided by medical physicists. The simulations were performed with computational exposure models consisting of voxel phantoms coupled with the EGSnrc Monte Carlo code. The software was developed with the Microsoft Visual Studio 2010 Service Pack and the Windows Presentation Foundation project template for the C# programming language. With these tools, the authors obtained the file for optimization of Monte Carlo simulations using EGSnrc; organization and compaction of dosimetry results for all radioactive sources; selection of regions of interest; evaluation of grayscale intensity in regions of interest; the file of weighted sources; and, finally, all the charts and numerical results. The user interface may be adapted for use in clinical nuclear medicine as a computer-aided tool to estimate the administered activity.

  14. Computer Aided Dosimetry and Verification of Exposure to Radiation

    NASA Astrophysics Data System (ADS)

    Waller, Edward; Stodilka, Robert Z.; Leach, Karen E.; Lalonde, Louise

    2002-06-01

    In the timeframe following the September 11th attacks on the United States, increased emphasis has been placed on Chemical, Biological, Radiological and Nuclear (CBRN) preparedness. Of prime importance is rapid field assessment of potential radiation exposure of Canadian Forces field personnel. This work set up a framework for generating an 'expert' computer system for aiding field personnel in determining the extent of radiation insult to military personnel. Data were gathered by a review of the available literature, discussions with medical and health physics personnel having hands-on experience with radiation accident victims, and from the experience of the principal investigator. Flow charts and generic data fusion algorithms were developed. Relationships between known exposure parameters, patient interview and history, clinical symptoms, clinical work-ups, physical dosimetry, biological dosimetry, and dose reconstruction as critical data indicators were investigated. The data obtained were examined in terms of information theory. A main goal was to determine how best to generate an adaptive model (i.e., when more data become available, how the prediction is improved). Consideration was given to the determination of predictive algorithms for health outcome. In addition, the concept of coding an expert medical treatment advisor system was developed.

  15. Validation of a personalized dosimetric evaluation tool (Oedipe) for targeted radiotherapy based on the Monte Carlo MCNPX code

    NASA Astrophysics Data System (ADS)

    Chiavassa, S.; Aubineau-Lanièce, I.; Bitar, A.; Lisbona, A.; Barbet, J.; Franck, D.; Jourdain, J. R.; Bardiès, M.

    2006-02-01

    Dosimetric studies are necessary for all patients treated with targeted radiotherapy. In order to attain the precision required, we have developed Oedipe, a dosimetric tool based on the MCNPX Monte Carlo code. The anatomy of each patient is considered in the form of a voxel-based geometry created using computed tomography (CT) images or magnetic resonance imaging (MRI). Oedipe enables dosimetry studies to be carried out at the voxel scale. Validation of the results obtained by comparison with existing methods is complex because there are multiple sources of variation: calculation methods (different Monte Carlo codes, point kernel), patient representations (model or specific) and geometry definitions (mathematical or voxel-based). In this paper, we validate Oedipe by taking each of these parameters into account independently. Monte Carlo methodology requires long calculation times, particularly in the case of voxel-based geometries, and this is one of the limits of personalized dosimetric methods. However, our results show that the use of voxel-based geometry as opposed to a mathematically defined geometry decreases the calculation time two-fold, due to an optimization of the MCNPX2.5e code. It is therefore possible to envisage the use of Oedipe for personalized dosimetry in the clinical context of targeted radiotherapy.

  16. Analysis of dosimetry from the H.B. Robinson unit 2 pressure vessel benchmark using RAPTOR-M3G and ALPAN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fischer, G.A.

    2011-07-01

    Document available in abstract form only, full text of document follows: The dosimetry from the H. B. Robinson Unit 2 Pressure Vessel Benchmark is analyzed with a suite of Westinghouse-developed codes and data libraries. The radiation transport from the reactor core to the surveillance capsule and ex-vessel locations is performed by RAPTOR-M3G, a parallel deterministic radiation transport code that calculates high-resolution neutron flux information in three dimensions. The cross-section library used in this analysis is the ALPAN library, an Evaluated Nuclear Data File (ENDF)/B-VII.0-based library designed for reactor dosimetry and fluence analysis applications. Dosimetry is evaluated with the industry-standard SNLRML reactor dosimetry cross-section data library. (authors)

  17. Application for internal dosimetry using biokinetic distribution of photons based on nuclear medicine images*

    PubMed Central

    Leal Neto, Viriato; Vieira, José Wilson; Lima, Fernando Roberto de Andrade

    2014-01-01

    Objective This article presents a way to obtain dose estimates for patients submitted to radiotherapy based on the analysis of regions of interest in nuclear medicine images. Materials and Methods A software tool called DoRadIo (Dosimetria das Radiações Ionizantes [Ionizing Radiation Dosimetry]) was developed to receive information about source organs and target organs, generating graphical and numerical results. The nuclear medicine images utilized in the present study were obtained from catalogs provided by medical physicists. The simulations were performed with computational exposure models consisting of voxel phantoms coupled with the EGSnrc Monte Carlo code. The software was developed with the Microsoft Visual Studio 2010 Service Pack and the Windows Presentation Foundation project template for the C# programming language. Results With these tools, the authors obtained the file for optimization of Monte Carlo simulations using EGSnrc; organization and compaction of dosimetry results for all radioactive sources; selection of regions of interest; evaluation of grayscale intensity in regions of interest; the file of weighted sources; and, finally, all the charts and numerical results. Conclusion The user interface may be adapted for use in clinical nuclear medicine as a computer-aided tool to estimate the administered activity. PMID:25741101

  18. A test of the IAEA code of practice for absorbed dose determination in photon and electron beams

    NASA Astrophysics Data System (ADS)

    Leitner, Arnold; Tiefenboeck, Wilhelm; Witzani, Josef; Strachotinsky, Christian

    1990-12-01

    The IAEA (International Atomic Energy Agency) code of practice TRS 277 gives recommendations for absorbed dose determination in high-energy photon and electron beams based on the use of ionization chambers calibrated in terms of exposure or air kerma. The scope of the work was to test the code for cobalt-60 gamma radiation and for several radiation qualities at four different types of electron accelerators, and to compare the ionization chamber dosimetry with ferrous sulphate dosimetry. The results show agreement between the two methods within about one per cent for all the investigated qualities. In addition, the response of the TLD capsules of the IAEA/WHO TL dosimetry service was determined.

  19. Least-Squares Neutron Spectral Adjustment with STAYSL PNNL

    NASA Astrophysics Data System (ADS)

    Greenwood, L. R.; Johnson, C. D.

    2016-02-01

    The STAYSL PNNL computer code, a descendant of the STAY'SL code [1], performs neutron spectral adjustment of a starting neutron spectrum, applying a least squares method to determine adjustments based on saturated activation rates, neutron cross sections from evaluated nuclear data libraries, and all associated covariances. STAYSL PNNL is provided as part of a comprehensive suite of programs [2], where additional tools in the suite are used for assembling a set of nuclear data libraries and determining all required corrections to the measured data to determine saturated activation rates. Neutron cross section and covariance data are taken from the International Reactor Dosimetry File (IRDF-2002) [3], which was sponsored by the International Atomic Energy Agency (IAEA), though work is planned to update to data from the IAEA's International Reactor Dosimetry and Fusion File (IRDFF) [4]. The nuclear data and associated covariances are extracted from IRDF-2002 using the third-party NJOY99 computer code [5]. The NJpp translation code converts the extracted data into a library data array format suitable for use as input to STAYSL PNNL. The software suite also includes three utilities to calculate corrections to measured activation rates. Neutron self-shielding corrections are calculated as a function of neutron energy with the SHIELD code and are applied to the group cross sections prior to spectral adjustment, thus making the corrections independent of the neutron spectrum. The SigPhi Calculator is a Microsoft Excel spreadsheet used for calculating saturated activation rates from raw gamma activities by applying corrections for gamma self-absorption, neutron burn-up, and the irradiation history. Gamma self-absorption and neutron burn-up corrections are calculated (iteratively in the case of the burn-up) within the SigPhi Calculator spreadsheet. 
The irradiation history corrections are calculated using the BCF computer code and are inserted into the SigPhi Calculator workbook for use in correcting the measured activities. Output from the SigPhi Calculator is automatically produced, and consists of a portion of the STAYSL PNNL input file data that is required to run the spectral adjustment calculations. Within STAYSL PNNL, the least-squares process is performed in one step, without iteration, and provides rapid results on PC platforms. STAYSL PNNL creates multiple output files with tabulated results, data suitable for plotting, and data formatted for use in subsequent radiation damage calculations using the SPECTER computer code (which is not included in the STAYSL PNNL suite). All components of the software suite have undergone extensive testing and validation prior to release and test cases are provided with the package.
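
    The core spectral-adjustment step can be illustrated with a one-step generalized least-squares update; this is a textbook GLS sketch on a hypothetical three-group spectrum, not the actual STAYSL PNNL implementation:

```python
import numpy as np

def gls_adjust(phi0, c_phi, s, a, c_a):
    """One-step generalized least-squares adjustment of a group-flux spectrum.

    phi0 : prior group fluxes (n,);  c_phi : prior flux covariance (n, n)
    s    : cross-section response matrix (m, n)
    a    : measured saturated activation rates (m,)
    c_a  : measurement covariance (m, m)
    """
    resid = a - s @ phi0                           # rate residuals
    gain = c_phi @ s.T @ np.linalg.inv(s @ c_phi @ s.T + c_a)
    phi = phi0 + gain @ resid                      # adjusted spectrum
    c_post = c_phi - gain @ s @ c_phi              # reduced covariance
    return phi, c_post

# toy 3-group spectrum monitored by 2 activation reactions
phi0 = np.array([1.0, 2.0, 1.5])
c_phi = np.diag((0.2 * phi0) ** 2)                 # 20% prior uncertainties
s = np.array([[0.5, 0.1, 0.0],
              [0.0, 0.3, 0.8]])
a = s @ np.array([1.1, 1.9, 1.6])                  # "measured" rates
phi, c_post = gls_adjust(phi0, c_phi, s, a, np.diag([1e-4, 1e-4]))
```

    The one-step, non-iterative character of this update mirrors the description above: the adjusted spectrum and its reduced covariance follow directly from the prior covariance, the response matrix, and the measurement covariance.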

  20. Identification of Trends into Dose Calculations for Astronauts through Performing Sensitivity Analysis on Calculational Models Used by the Radiation Health Office

    NASA Technical Reports Server (NTRS)

    Adams, Thomas; VanBaalen, Mary

    2009-01-01

    The Radiation Health Office (RHO) determines each astronaut's cancer risk using models that relate that risk to the radiation dose received during spaceflight missions. The baryon transport code (BRYNTRN), the high charge (Z) and energy transport code (HZETRN), and computer risk models are used to determine the effective dose received by astronauts in Low Earth Orbit (LEO). This code uses an approximation of the Boltzmann transport equation. The purpose of the project is to run this code for various International Space Station (ISS) flight parameters in order to gain a better understanding of how the code responds to different scenarios. The project will determine how variations in one set of parameters, such as the point in the solar cycle and the altitude, can affect the radiation exposure of astronauts during ISS missions. This project will benefit NASA by improving mission dosimetry.

  1. Accuracy Evaluation of Oncentra™ TPS in HDR Brachytherapy of Nasopharynx Cancer Using EGSnrc Monte Carlo Code.

    PubMed

    Hadad, K; Zohrevand, M; Faghihi, R; Sedighi Pashaki, A

    2015-03-01

    HDR brachytherapy is one of the most common methods of nasopharyngeal cancer treatment. In this method, depending on how advanced the tumor is, a dose of 2 to 6 Gy is prescribed as intracavitary brachytherapy. Due to the high dose rate and the tumor location, accuracy evaluation of the treatment planning system (TPS) is particularly important. Common methods used in TPS dosimetry are based on computations in a homogeneous phantom. Heterogeneous phantoms, especially patient-specific voxel phantoms, can increase dosimetric accuracy. In this study, a patient-specific phantom was made using CT images taken from a patient and ctcreate, which is a part of the DOSXYZnrc computational code. The dose distribution was calculated with DOSXYZnrc and compared with that of the TPS. Also, by extracting the absorbed dose of the voxels in the treatment volume, dose-volume histograms (DVHs) were plotted and compared with the Oncentra™ TPS DVHs. The results from the calculations were compared with data from the Oncentra™ treatment planning system, and it was observed that the TPS calculation predicts a lower dose in areas near the source and a higher dose in areas far from the source relative to the MC code. The absorbed dose values in the voxels also showed that the D90 value reported by the TPS is 40% higher than that of the Monte Carlo method. Today, most treatment planning systems use the TG-43 protocol. This protocol may result in errors such as neglecting tissue heterogeneity, scattered radiation, and applicator attenuation. Due to these errors, the AAPM has emphasized departing from the TG-43 protocol and moving toward the new brachytherapy protocol TG-186, in which a patient-specific phantom is used and heterogeneities are accounted for in the dosimetry.

  2. Accuracy Evaluation of Oncentra™ TPS in HDR Brachytherapy of Nasopharynx Cancer Using EGSnrc Monte Carlo Code

    PubMed Central

    Hadad, K.; Zohrevand, M.; Faghihi, R.; Sedighi Pashaki, A.

    2015-01-01

    Background HDR brachytherapy is one of the most common methods of nasopharyngeal cancer treatment. In this method, depending on how advanced the tumor is, a dose of 2 to 6 Gy is prescribed as intracavitary brachytherapy. Due to the high dose rate and the tumor location, accuracy evaluation of the treatment planning system (TPS) is particularly important. Common methods used in TPS dosimetry are based on computations in a homogeneous phantom. Heterogeneous phantoms, especially patient-specific voxel phantoms, can increase dosimetric accuracy. Materials and Methods In this study, a patient-specific phantom was made using CT images taken from a patient and ctcreate, which is a part of the DOSXYZnrc computational code. The dose distribution was calculated with DOSXYZnrc and compared with that of the TPS. Also, by extracting the absorbed dose of the voxels in the treatment volume, dose-volume histograms (DVHs) were plotted and compared with the Oncentra™ TPS DVHs. Results The results from the calculations were compared with data from the Oncentra™ treatment planning system, and it was observed that the TPS calculation predicts a lower dose in areas near the source and a higher dose in areas far from the source relative to the MC code. The absorbed dose values in the voxels also showed that the D90 value reported by the TPS is 40% higher than that of the Monte Carlo method. Conclusion Today, most treatment planning systems use the TG-43 protocol. This protocol may result in errors such as neglecting tissue heterogeneity, scattered radiation, and applicator attenuation. Due to these errors, the AAPM has emphasized departing from the TG-43 protocol and moving toward the new brachytherapy protocol TG-186, in which a patient-specific phantom is used and heterogeneities are accounted for in the dosimetry. PMID:25973408
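
    The D90 comparison above is read off a cumulative DVH. A minimal sketch, using synthetic voxel doses rather than the patient data of this study:

```python
import numpy as np

def cumulative_dvh(voxel_doses, n_levels=100):
    """Cumulative DVH: fraction of the volume receiving at least each dose level."""
    levels = np.linspace(0.0, voxel_doses.max(), n_levels)
    volume_frac = np.array([(voxel_doses >= d).mean() for d in levels])
    return levels, volume_frac

def d_x(voxel_doses, x_percent):
    """D_x: the minimum dose received by the hottest x% of the volume,
    e.g. D90 is the dose covering 90% of the volume."""
    return np.percentile(voxel_doses, 100.0 - x_percent)

rng = np.random.default_rng(0)
doses_gy = rng.gamma(shape=8.0, scale=0.5, size=10_000)  # synthetic voxel doses
levels, vf = cumulative_dvh(doses_gy)
d90 = d_x(doses_gy, 90.0)
```

    D90 is simply the 10th percentile of the voxel dose distribution, which is why systematic dose differences between a TPS and a Monte Carlo calculation show up directly in this metric.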

  3. The FLUKA Code: An Overview

    NASA Technical Reports Server (NTRS)

    Ballarini, F.; Battistoni, G.; Campanella, M.; Carboni, M.; Cerutti, F.; Empl, A.; Fasso, A.; Ferrari, A.; Gadioli, E.; Garzelli, M. V.

    2006-01-01

    FLUKA is a multipurpose Monte Carlo code which can transport a variety of particles over a wide energy range in complex geometries. The code is a joint project of INFN and CERN: part of its development is also supported by the University of Houston and NASA. FLUKA is successfully applied in several fields, including, but not limited to, particle physics, cosmic ray physics, dosimetry, radioprotection, hadron therapy, space radiation, accelerator design and neutronics. The code is the standard tool used at CERN for dosimetry, radioprotection and beam-machine interaction studies. Here we give a glimpse into the code's physics models, with particular emphasis on the hadronic and nuclear sector.

  4. A Dynamic/Anisotropic Low Earth Orbit (LEO) Ionizing Radiation Model

    NASA Technical Reports Server (NTRS)

    Badavi, Francis F.; West, Katie J.; Nealy, John E.; Wilson, John W.; Abrahms, Briana L.; Luetke, Nathan J.

    2006-01-01

    The International Space Station (ISS) provides the proving ground for future long-duration human activities in space. Ionizing radiation measurements aboard the ISS form the ideal tool for the experimental validation of ionizing radiation environmental models, nuclear transport code algorithms, and nuclear reaction cross sections. Indeed, prior measurements on the Space Transportation System (STS; Shuttle) provided vital information impacting both environmental model and nuclear transport code development by requiring dynamic models of the Low Earth Orbit (LEO) environment. Previous studies using Computer Aided Design (CAD) models of the evolving ISS configurations together with Thermo-Luminescent Detector (TLD) area monitors demonstrated that computational dosimetry requires environmental models with accurate non-isotropic as well as dynamic behavior, detailed information on rack loading, and an accurate 6-degree-of-freedom (DOF) description of ISS trajectory and orientation.

  5. Dosimetry applications in GATE Monte Carlo toolkit.

    PubMed

    Papadimitroulas, Panagiotis

    2017-09-01

    Monte Carlo (MC) simulations are a well-established method for studying physical processes in medical physics. The purpose of this review is to present GATE dosimetry applications for simulated diagnostic and therapeutic protocols. There is a significant need for accurate quantification of the absorbed dose in several specific applications, such as preclinical and pediatric studies. GATE is an open-source MC toolkit for simulating imaging, radiotherapy (RT) and dosimetry applications in a user-friendly environment, which is well validated and widely accepted by the scientific community. In RT applications, during treatment planning, it is essential to accurately assess the deposited energy and the absorbed dose per tissue/organ of interest, as well as the local statistical uncertainty. Several types of realistic dosimetric applications are described, including molecular imaging, radio-immunotherapy, radiotherapy and brachytherapy. GATE has been used efficiently in several applications, such as dose point kernels, S-values and brachytherapy parameters, and has been compared against various MC codes that have been considered standard tools for decades. Furthermore, the presented studies show reliable modeling of particle beams when comparing experimental with simulated data. Examples of different dosimetric protocols are reported for individualized dosimetry and for simulations combining imaging and therapy dose monitoring, with the use of modern computational phantoms. Personalization of medical protocols can be achieved by combining GATE MC simulations with anthropomorphic computational models and clinical anatomical data. This is a review study covering several dosimetric applications of GATE and the different tools used for modeling realistic clinical acquisitions with accurate dose assessment. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
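
    The S-values mentioned above enter dose estimates through the MIRD schema, D = Ã · S, where Ã is the time-integrated activity in a source region. A minimal sketch with hypothetical numbers (the S-value and effective half-life below are illustrative assumptions, not tabulated values):

```python
import numpy as np

def time_integrated_activity_bq_s(a0_bq, t_eff_hours):
    """Integral of A0 * exp(-ln2 * t / T_eff) from 0 to infinity (Bq*s),
    i.e. mono-exponential effective clearance."""
    return a0_bq * (t_eff_hours * 3600.0) / np.log(2.0)

def mird_dose_gy(a0_bq, t_eff_hours, s_value_gy_per_bq_s):
    """MIRD-schema mean absorbed dose: D = A_tilde * S."""
    return time_integrated_activity_bq_s(a0_bq, t_eff_hours) * s_value_gy_per_bq_s

# illustration: 100 MBq in a source region, 48 h effective half-life,
# S = 2e-14 Gy per Bq*s (hypothetical)
d_gy = mird_dose_gy(1e8, 48.0, 2e-14)
```

    In practice the S-value is the Monte Carlo product (simulated energy deposited in a target per decay in a source), while Ã comes from serial activity measurements.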

  6. Computer Aided Dosimetry and Verification of Exposure to Radiation

    DTIC Science & Technology

    2002-06-01

    Defence Research and Development Canada / Recherche et développement pour la défense Canada. Computer Aided Dosimetry and Verification of Exposure to Radiation. Edward Waller, SAIC Canada; Robert Z. Stodilka, Radiation Effects Group, Space Systems. Recoverable fragments of this record list an event matrix, hematopoietic indicators (absolute and relative blood counts), and dosimetry methods (TLD, EPD, quantitative radiation survey, whole-body counting).

  7. All about MAX: a male adult voxel phantom for Monte Carlo calculations in radiation protection dosimetry

    NASA Astrophysics Data System (ADS)

    Kramer, R.; Vieira, J. W.; Khoury, H. J.; Lima, F. R. A.; Fuelle, D.

    2003-05-01

    The MAX (Male Adult voXel) phantom has been developed from existing segmented images of a male adult body, in order to achieve a representation as close as possible to the anatomical properties of the reference adult male specified by the ICRP. The study describes the adjustments of the soft-tissue organ masses, a new dosimetric model for the skin, a new model for skeletal dosimetry and a computational exposure model based on coupling the MAX phantom with the EGS4 Monte Carlo code. Conversion coefficients between the equivalent dose to the red bone marrow, as well as the MAX effective dose, and the air kerma free-in-air are presented for external photon irradiation from the front and from the back, respectively, and compared with similar data from other human phantoms.

  8. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 2: Appendices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.

    1998-04-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on internal dosimetry, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  9. VVER-440 and VVER-1000 reactor dosimetry benchmark - BUGLE-96 versus ALPAN VII.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duo, J. I.

    2011-07-01

    Document available in abstract form only. Analytical results of the vodo-vodyanoi energetichesky reactor (VVER)-440 and VVER-1000 reactor dosimetry benchmarks developed from engineering mockups at the Nuclear Research Institute Rez LR-0 reactor are discussed. These benchmarks provide accurate determination of radiation field parameters in the vicinity of, and over the thickness of, the reactor pressure vessel. Measurements are compared to results calculated with two sets of tools: the TORT discrete ordinates code with the BUGLE-96 cross-section library versus the newly developed Westinghouse RAPTOR-M3G with ALPAN VII.0. The parallel code RAPTOR-M3G enables detailed neutron distributions in energy and space to be obtained in reduced computational time. The ALPAN VII.0 cross-section library is based on ENDF/B-VII.0 and is designed for reactor dosimetry applications. It uses a unique broad group structure to enhance resolution in the thermal-neutron-energy range compared to other analogous libraries. The comparison of fast neutron (E > 0.5 MeV) results shows good agreement (within 10%) between the BUGLE-96 and ALPAN VII.0 libraries. Furthermore, the results compare well with analogous results from participants of the REDOS program (2005). Finally, the analytical results for fast neutrons agree within 15% with the measurements for most locations in all three mockups. In general, however, the analytical results underestimate the attenuation through the reactor pressure vessel thickness compared to the measurements. (authors)

  10. Calculation of electron and isotopes dose point kernels with fluka Monte Carlo code for dosimetry in nuclear medicine therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Botta, F; Di Dia, A; Pedroli, G

    The calculation of patient-specific dose distributions can be achieved by Monte Carlo simulations or by analytical methods. In this study, the fluka Monte Carlo code has been considered for use in nuclear medicine dosimetry. Up to now, fluka has mainly been dedicated to other fields, namely high energy physics, radiation protection, and hadrontherapy. When first employing a Monte Carlo code for nuclear medicine dosimetry, its results concerning electron transport at energies typical of nuclear medicine applications need to be verified. This is commonly achieved by calculating a representative parameter and comparing it with reference data; the dose point kernel (DPK), which quantifies the energy deposition all around a point isotropic source, is often chosen. Methods: fluka DPKs have been calculated in both water and compact bone for monoenergetic electrons (10 keV–3 MeV) and for beta-emitting isotopes commonly used for therapy (89Sr, 90Y, 131I, 153Sm, 177Lu, 186Re, and 188Re). Point isotropic sources have been simulated at the center of a water (bone) sphere, and the deposited energy has been tallied in concentric shells. fluka outcomes have been compared to penelope v.2008 results, calculated in this study as well. Moreover, in the case of monoenergetic electrons in water, a comparison with data from the literature (etran, geant4, mcnpx) has been done.
    Maximum percentage differences within 0.8·RCSDA and 0.9·RCSDA for monoenergetic electrons (RCSDA being the continuous-slowing-down-approximation range) and within 0.8·X90 and 0.9·X90 for isotopes (X90 being the radius of the sphere in which 90% of the emitted energy is absorbed) have been computed, together with the average percentage difference within 0.9·RCSDA and 0.9·X90 for electrons and isotopes, respectively. Results: Concerning monoenergetic electrons, within 0.8·RCSDA (where 90%–97% of the particle energy is deposited), fluka and penelope agree mostly within 7%, except for 10 and 20 keV electrons (12% in water, 8.3% in bone). The discrepancies between fluka and the other codes are of the same order of magnitude as those observed when comparing the other codes among themselves, and can be attributed to the different simulation algorithms. When considering the beta spectra, the discrepancies are notably reduced: within 0.9·X90, fluka and penelope differ by less than 1% in water and by less than 2% in bone for all of the isotopes considered here. Complete fluka DPK data are given as Supplementary Material as a tool to perform dosimetry by analytical point-kernel convolution. Conclusions: fluka provides reliable results when transporting electrons in the low energy range, proving to be an adequate tool for nuclear medicine dosimetry.
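
    The concentric-shell tally used to build the DPKs can be sketched as follows; the deposition events here are synthetic stand-ins for Monte Carlo output, not fluka results:

```python
import numpy as np

def shell_dpk(radii_cm, edep_mev, r_max_cm, n_shells=50):
    """Tally deposited energy in concentric spherical shells around a point
    isotropic source; returns shell edges and dose per shell (MeV/g, assuming
    unit-density water)."""
    edges = np.linspace(0.0, r_max_cm, n_shells + 1)
    e_shell, _ = np.histogram(radii_cm, bins=edges, weights=edep_mev)
    shell_mass_g = 4.0 / 3.0 * np.pi * (edges[1:]**3 - edges[:-1]**3)
    return edges, e_shell / shell_mass_g

# synthetic deposition events standing in for Monte Carlo output
rng = np.random.default_rng(1)
radii = rng.exponential(scale=0.05, size=200_000)   # deposition radius (cm)
edep = np.full_like(radii, 1e-3)                    # deposited energy (MeV)
edges, dpk = shell_dpk(radii, edep, r_max_cm=0.4)
```

    Dividing the tallied energy by each shell's mass converts the histogram into a dose kernel that can later be convolved analytically with an activity distribution.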

  11. Treating voxel geometries in radiation protection dosimetry with a patched version of the Monte Carlo codes MCNP and MCNPX.

    PubMed

    Burn, K W; Daffara, C; Gualdrini, G; Pierantoni, M; Ferrari, P

    2007-01-01

    The question of Monte Carlo simulation of radiation transport in voxel geometries is addressed. Patched versions of the MCNP and MCNPX codes have been developed that transport radiation both in the standard geometry mode and in the voxel geometry treatment. The patched code reads an unformatted FORTRAN file derived from DICOM-format data and uses special subroutines to handle voxel-to-voxel radiation transport. The various phases of the development of the methodology are discussed, together with the new input options. Examples are given of the employment of the code in internal and external dosimetry, and comparisons with results from other groups are reported.

  12. G4DARI: Geant4/GATE based Monte Carlo simulation interface for dosimetry calculation in radiotherapy.

    PubMed

    Slimani, Faiçal A A; Hamdi, Mahdjoub; Bentourkia, M'hamed

    2018-05-01

    Monte Carlo (MC) simulation is widely recognized as an important technique for studying the physics of particle interactions in nuclear medicine and radiation therapy. There are different codes dedicated to dosimetry applications and widely used today in research or in clinical applications, such as MCNP, EGSnrc and Geant4. However, while such codes make the physics easier, the programming remains a tedious task even for physicists familiar with computer programming. In this paper we report the development of a new interface, GEANT4 Dose And Radiation Interactions (G4DARI), based on GEANT4 for absorbed dose calculation and for particle tracking in humans, small animals and complex phantoms. The calculation of the absorbed dose is performed based on 3D CT human or animal images in DICOM format, on images of phantoms, or on solid volumes which can be made from any pure or composite material specified by its molecular formula. G4DARI offers menus to the user and tabs to be filled with values or chemical formulas. The interface is described and, as an application, we show results obtained in a lung tumor in a digital mouse irradiated with seven energy beams, and in a patient with glioblastoma irradiated with five photon beams. In conclusion, G4DARI can be easily used by any researcher without the need to be familiar with computer programming, and it will be freely available as an application package. Copyright © 2018 Elsevier Ltd. All rights reserved.

  13. Development of probabilistic internal dosimetry computer code

    NASA Astrophysics Data System (ADS)

    Noh, Siwan; Kwon, Tae-Eun; Lee, Jai-Ki

    2017-02-01

    Internal radiation dose assessment involves biokinetic models, the corresponding parameters, measured data, and many assumptions. Every component considered in the internal dose assessment has its own uncertainty, which is propagated into the intake activity and internal dose estimates. For research or scientific purposes, and for retrospective dose reconstruction for accident scenarios occurring in workplaces having a large quantity of unsealed radionuclides, such as nuclear power plants, nuclear fuel cycle facilities, and facilities in which nuclear medicine is practiced, a quantitative uncertainty assessment of the internal dose is often required. However, no calculation tools or computer codes that incorporate all the relevant processes and their corresponding uncertainties, i.e., from the measured data to the committed dose, are available. Thus, the objective of the present study is to develop an integrated probabilistic internal-dose-assessment computer code. First, the uncertainty components in internal dosimetry are identified, and quantitative uncertainty data are collected. Then, an uncertainty database is established for each component. In order to propagate these uncertainties in an internal dose assessment, a probabilistic internal-dose-assessment system that employs Bayesian and Monte Carlo methods was constructed. Based on this system, we developed a probabilistic internal-dose-assessment code using MATLAB so as to estimate the dose distributions from the measured data with uncertainty. Using the developed code, we calculated the internal dose distribution and statistical values (e.g., the 2.5th, 5th, 50th (median), 95th, and 97.5th percentiles) for three sample scenarios. On the basis of the distributions, we performed a sensitivity analysis to determine the influence of each component on the resulting dose in order to identify the major component of the uncertainty in a bioassay. The results of this study can be applied to various situations.
    In cases of severe internal exposure, the causation probability of a deterministic health effect can be derived from the dose distribution, and a high statistical value (e.g., the 95th percentile of the distribution) can be used to determine the appropriate intervention. The distribution-based sensitivity analysis can also be used to quantify the contribution of each factor to the dose uncertainty, which is essential information for reducing and optimizing the uncertainty in the internal dose assessment. Therefore, the present study can contribute to retrospective dose assessment for accidental internal exposure scenarios, as well as to internal dose monitoring optimization and uncertainty reduction.
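
    The Monte Carlo propagation from measured data to a dose distribution with reported percentiles can be sketched as follows; the distributions, spreads, and coefficient values are illustrative assumptions, not the code's actual uncertainty database:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# hypothetical uncertain inputs (distributions and spreads are illustrative):
measured_bq = rng.normal(500.0, 25.0, n)                 # bioassay measurement
irf = 0.3 * rng.lognormal(0.0, np.log(1.3), n)           # intake retention fraction
dose_coeff = 2e-8 * rng.lognormal(0.0, np.log(1.5), n)   # committed dose per Bq intake

intake_bq = measured_bq / irf            # back-calculated intake distribution
dose_sv = intake_bq * dose_coeff         # committed-dose distribution

pct = np.percentile(dose_sv, [2.5, 5.0, 50.0, 95.0, 97.5])
```

    Reporting a high percentile of `dose_sv` rather than a point estimate is what allows a conservative intervention decision of the kind described above.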

  14. Monte Carlo modeling of a conventional X-ray computed tomography scanner for gel dosimetry purposes.

    PubMed

    Hayati, Homa; Mesbahi, Asghar; Nazarpoor, Mahmood

    2016-01-01

    Our purpose in the current study was to model an X-ray CT scanner with the Monte Carlo (MC) method for gel dosimetry. In this study, a conventional CT scanner with a single detector array was modeled using the MCNPX MC code. The MC-calculated photon fluence in the detector arrays was used for image reconstruction of a simple water phantom as well as of a polyacrylamide polymer gel (PAG) used for radiation therapy. Image reconstruction was performed by the filtered back-projection method with a Hann filter and the spline interpolation method. Using the MC results, we obtained the dose-response curve for images of the irradiated gel at different absorbed doses. A spatial resolution of about 2 mm was found for our simulated MC model. The MC-based CT images of the PAG gel showed a reliable increase in the CT number with increasing absorbed dose for the studied gel. Also, our results showed that the current MC model of a CT scanner can be used for further studies on the parameters that influence the usability and reliability of results, such as the photon energy spectra and exposure techniques in X-ray CT gel dosimetry.

  15. Implementation and validation of collapsed cone superposition for radiopharmaceutical dosimetry of photon emitters

    NASA Astrophysics Data System (ADS)

    Sanchez-Garcia, Manuel; Gardin, Isabelle; Lebtahi, Rachida; Dieudonné, Arnaud

    2015-10-01

    Two collapsed cone (CC) superposition algorithms have been implemented for radiopharmaceutical dosimetry of photon emitters. The straight CC (SCC) superposition method uses a water energy deposition kernel (EDKw) for each of the electron, positron and photon components, while the primary and scatter CC (PSCC) superposition method uses different EDKw for primary and once-scattered photons. PSCC was implemented only for photons originating from the nucleus, precluding its application to positron emitters. The EDKw are linearly scaled by radiological distance, taking into account tissue density heterogeneities. The implementation was tested on 100, 300 and 600 keV mono-energetic photons and on 18F, 99mTc, 131I and 177Lu. The kernels were generated using the Monte Carlo codes MCNP and EGSnrc. The validation was performed on 6 phantoms representing interfaces between soft tissue, lung and bone. The figures of merit were the γ (3%, 3 mm) and γ (5%, 5 mm) criteria, evaluated by comparing the computations at 80 absorbed dose (AD) points per phantom between Monte Carlo simulations and the CC algorithms. PSCC gave better results than SCC for the lowest photon energy (100 keV). For the 3 isotopes computed with PSCC, the percentage of AD points satisfying the γ (5%, 5 mm) criterion was always over 99%. SCC gave good but somewhat worse results: at least 97% of the AD values satisfied the γ (5%, 5 mm) criterion, except for a value of 57% for 99mTc with the lung/bone interface. The CC superposition method for radiopharmaceutical dosimetry is a good alternative to Monte Carlo simulations while reducing computational complexity.

  16. Implementation and validation of collapsed cone superposition for radiopharmaceutical dosimetry of photon emitters.

    PubMed

    Sanchez-Garcia, Manuel; Gardin, Isabelle; Lebtahi, Rachida; Dieudonné, Arnaud

    2015-10-21

    Two collapsed cone (CC) superposition algorithms have been implemented for radiopharmaceutical dosimetry of photon emitters. The straight CC (SCC) superposition method uses a water energy deposition kernel (EDKw) for each of the electron, positron and photon components, while the primary and scatter CC (PSCC) superposition method uses different EDKw for primary and once-scattered photons. PSCC was implemented only for photons originating from the nucleus, precluding its application to positron emitters. The EDKw are linearly scaled by radiological distance, taking into account tissue density heterogeneities. The implementation was tested on 100, 300 and 600 keV mono-energetic photons and on (18)F, (99m)Tc, (131)I and (177)Lu. The kernels were generated using the Monte Carlo codes MCNP and EGSnrc. The validation was performed on 6 phantoms representing interfaces between soft tissue, lung and bone. The figures of merit were the γ (3%, 3 mm) and γ (5%, 5 mm) criteria, evaluated by comparing the computations at 80 absorbed dose (AD) points per phantom between Monte Carlo simulations and the CC algorithms. PSCC gave better results than SCC for the lowest photon energy (100 keV). For the 3 isotopes computed with PSCC, the percentage of AD points satisfying the γ (5%, 5 mm) criterion was always over 99%. SCC gave good but somewhat worse results: at least 97% of the AD values satisfied the γ (5%, 5 mm) criterion, except for a value of 57% for (99m)Tc with the lung/bone interface. The CC superposition method for radiopharmaceutical dosimetry is a good alternative to Monte Carlo simulations while reducing computational complexity.
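
    The γ (dose-difference / distance-to-agreement) figure of merit can be sketched in one dimension; the depth-dose curves below are hypothetical, and a clinical implementation would operate on 3D dose grids:

```python
import numpy as np

def gamma_analysis(ref_pos_mm, ref_dose, eval_pos_mm, eval_dose,
                   dose_crit=0.03, dist_crit_mm=3.0):
    """Global 1D gamma analysis: for each reference point, gamma is the minimum
    over evaluated points of sqrt((dose diff / DD)^2 + (distance / DTA)^2),
    with the dose difference normalized to the reference maximum."""
    d_max = ref_dose.max()
    gammas = np.empty(len(ref_dose))
    for i, (r, d) in enumerate(zip(ref_pos_mm, ref_dose)):
        dd = (eval_dose - d) / (dose_crit * d_max)
        dx = (eval_pos_mm - r) / dist_crit_mm
        gammas[i] = np.sqrt(dd**2 + dx**2).min()
    return gammas, (gammas <= 1.0).mean()   # gamma <= 1 counts as a pass

# illustrative depth-dose curves: evaluation shifted by 0.5 mm, rescaled by 1%
x = np.linspace(0.0, 100.0, 201)                  # positions (mm)
ref = np.exp(-x / 40.0)
ev = 1.01 * np.exp(-(x - 0.5) / 40.0)
g, pass_rate = gamma_analysis(x, ref, x, ev, dose_crit=0.03, dist_crit_mm=3.0)
```

    A small spatial shift combined with a small dose rescaling still passes, which is exactly why the γ metric is preferred over a pointwise dose difference at steep gradients.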

  17. Development of Monte Carlo simulations to provide scanner-specific organ dose coefficients for contemporary CT

    NASA Astrophysics Data System (ADS)

    Jansen, Jan T. M.; Shrimpton, Paul C.

    2016-07-01

    The ImPACT (imaging performance assessment of CT scanners) CT patient dosimetry calculator is still used world-wide to estimate organ and effective doses (E) for computed tomography (CT) examinations, although the tool is based on Monte Carlo calculations reflecting practice in the early 1990s. Subsequent developments in CT scanners, definitions of E, anthropomorphic phantoms, computers and radiation transport codes have all fuelled an urgent need for updated organ dose conversion factors for contemporary CT. A new system for such simulations has been developed and satisfactorily tested. Benchmark comparisons of normalised organ doses presently derived for three old scanners (General Electric 9800, Philips Tomoscan LX and Siemens Somatom DRH) are within 5% of published values. Moreover, calculated normalised values of the CT Dose Index for these scanners are in reasonable agreement (within measurement and computational uncertainties of ±6% and ±1%, respectively) with reported standard measurements. Organ dose coefficients calculated for a contemporary CT scanner (Siemens Somatom Sensation 16) demonstrate potential deviations of up to around 30% from the surrogate values presently assumed (through a scanner-matching process) when using the ImPACT CT Dosimetry tool for newer scanners. Also, illustrative estimates of E for some typical examinations and a range of anthropomorphic phantoms demonstrate the significant differences (by some tens of percent) that can arise when changing from the previously adopted stylised mathematical phantom to the voxel phantoms presently recommended by the International Commission on Radiological Protection (ICRP), and when following the 2007 ICRP recommendations (updated from 1990) concerning tissue weighting factors. Further simulations with the validated dosimetry system will provide updated series of dose coefficients for a wide range of contemporary scanners.
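
    The effective dose mentioned above combines organ equivalent doses with the 2007 (ICRP Publication 103) tissue weighting factors, E = Σ_T w_T H_T. A minimal sketch (the organ doses are hypothetical, not values from this study):

```python
# ICRP Publication 103 tissue weighting factors w_T (they sum to 1.0)
W_T = {
    "red bone marrow": 0.12, "colon": 0.12, "lung": 0.12, "stomach": 0.12,
    "breast": 0.12, "remainder tissues": 0.12,
    "gonads": 0.08,
    "bladder": 0.04, "oesophagus": 0.04, "liver": 0.04, "thyroid": 0.04,
    "bone surface": 0.01, "brain": 0.01, "salivary glands": 0.01, "skin": 0.01,
}

def effective_dose_sv(equivalent_doses_sv):
    """E = sum_T w_T * H_T; tissues absent from the input contribute zero."""
    return sum(w * equivalent_doses_sv.get(t, 0.0) for t, w in W_T.items())

# hypothetical equivalent doses (Sv) for illustration only
h_t = {"lung": 20e-3, "breast": 18e-3, "red bone marrow": 8e-3,
       "stomach": 6e-3, "thyroid": 4e-3, "oesophagus": 10e-3}
e_sv = effective_dose_sv(h_t)
```

    Because the 2007 factors differ from the 1990 (ICRP 60) set, re-weighting the same organ doses changes E, which is one source of the differences of some tens of percent noted above.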

  18. The computation of ICRP dose coefficients for intakes of radionuclides with PLEIADES: biokinetic aspects.

    PubMed

    Fell, T P

    2007-01-01

The ICRP has published dose coefficients for the ingestion or inhalation of radionuclides in a series of reports covering intakes by workers and members of the public, including children and pregnant or lactating women. The calculation of these coefficients conveniently divides into two distinct parts: the biokinetic and the dosimetric. This paper gives a brief summary of the methods used to solve the biokinetic problem in the generation of dose coefficients on behalf of the ICRP, as implemented in the Health Protection Agency's internal dosimetry code PLEIADES.
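The biokinetic problem described above amounts to solving a system of first-order transfer equations between compartments, with the physical decay constant added to every removal rate. A minimal two-compartment sketch; the transfer rates are illustrative placeholders, not PLEIADES or ICRP model values:

```python
import math

# Illustrative rate constants (per day); lam_phys is for I-131 (T1/2 = 8.02 d)
lam_phys = math.log(2) / 8.02   # physical decay
k12 = 1.5                       # transfer, blood -> thyroid (illustrative)
k1x = 20.0                      # excretion from blood (illustrative)
k2x = math.log(2) / 80.0        # biological loss from thyroid (illustrative)

def integrate(q1, q2, t_end, dt=1e-4):
    """Euler integration of dq/dt = M q, with decay folded into each rate."""
    t = 0.0
    while t < t_end:
        dq1 = -(k12 + k1x + lam_phys) * q1          # losses from compartment 1
        dq2 = k12 * q1 - (k2x + lam_phys) * q2      # gain from 1, losses from 2
        q1 += dq1 * dt
        q2 += dq2 * dt
        t += dt
    return q1, q2

# Activity fractions remaining one day after a unit intake into compartment 1
q1, q2 = integrate(1.0, 0.0, t_end=1.0)
```

Production codes solve such systems analytically or with stiff ODE solvers over many more compartments; the explicit Euler loop here is only to make the structure visible.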

  19. Multi-resolution voxel phantom modeling: a high-resolution eye model for computational dosimetry

    NASA Astrophysics Data System (ADS)

    Caracappa, Peter F.; Rhodes, Ashley; Fiedler, Derek

    2014-09-01

Voxel models of the human body are commonly used for simulating radiation dose with a Monte Carlo radiation transport code. Due to memory limitations, the voxel resolution of these computational phantoms is typically too coarse to accurately represent the dimensions of small features such as the eye. The recently reduced recommended dose limit for the lens of the eye, a radiosensitive tissue with a significant concern for cataract formation, has lent increased importance to understanding the dose to this tissue. A high-resolution eye model is constructed using physiological data for the dimensions of radiosensitive tissues, and combined with an existing set of whole-body models to form a multi-resolution voxel phantom, which is used with the MCNPX code to calculate radiation dose from various exposure types. This phantom provides an accurate representation of radiation transport through the structures of the eye. Two alternative methods of including a high-resolution eye model within an existing whole-body model are developed, and the accuracy and performance of each method are compared against existing computational phantoms.

  20. Characterization and Simulation of a New Design Parallel-Plate Ionization Chamber for CT Dosimetry at Calibration Laboratories

    NASA Astrophysics Data System (ADS)

    Perini, Ana P.; Neves, Lucio P.; Maia, Ana F.; Caldas, Linda V. E.

    2013-12-01

In this work, a new extended-length parallel-plate ionization chamber was tested in the standard radiation qualities for computed tomography, established according to the half-value layers defined in the IEC 61267 standard, at the Calibration Laboratory of the Instituto de Pesquisas Energéticas e Nucleares (IPEN). The experimental characterization was made following the recommendations of the IEC 61674 standard. The experimental results obtained with the ionization chamber studied in this work were compared to those obtained with a commercial pencil ionization chamber, showing good agreement. With the PENELOPE Monte Carlo code, simulations were undertaken to evaluate the influence of the cables, insulator, PMMA body, collecting electrode, guard ring and screws, as well as of different materials and geometrical arrangements, on the energy deposited in the sensitive volume of the ionization chamber. The maximum influence observed was 13.3%, for the collecting electrode; regarding the use of different materials and designs, the substitutions showed that the original design was the most suitable configuration. The experimental and simulated results obtained in this work show that this ionization chamber has appropriate characteristics for use at calibration laboratories, for dosimetry in standard computed tomography and diagnostic radiology quality beams.

  1. Calculation of electron and isotopes dose point kernels with fluka Monte Carlo code for dosimetry in nuclear medicine therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Botta, F.; Mairani, A.; Battistoni, G.

Purpose: The calculation of patient-specific dose distributions can be achieved by Monte Carlo simulations or by analytical methods. In this study, the fluka Monte Carlo code has been considered for use in nuclear medicine dosimetry. Up to now, fluka has mainly been dedicated to other fields, namely high energy physics, radiation protection, and hadrontherapy. When first employing a Monte Carlo code for nuclear medicine dosimetry, its results concerning electron transport at energies typical of nuclear medicine applications need to be verified. This is commonly achieved by calculating a representative parameter and comparing it with reference data. The dose point kernel (DPK), which quantifies the energy deposition around an isotropic point source, is often chosen for this purpose. Methods: fluka DPKs have been calculated in both water and compact bone for monoenergetic electrons (10 keV-3 MeV) and for beta-emitting isotopes commonly used for therapy (89Sr, 90Y, 131I, 153Sm, 177Lu, 186Re, and 188Re). Isotropic point sources have been simulated at the center of a water (bone) sphere, and the deposited energy has been tallied in concentric shells. fluka outcomes have been compared to penelope v.2008 results, calculated in this study as well. Moreover, in the case of monoenergetic electrons in water, a comparison with data from the literature (etran, geant4, mcnpx) has been made. Maximum percentage differences within 0.8·RCSDA and 0.9·RCSDA for monoenergetic electrons (RCSDA being the continuous slowing down approximation range) and within 0.8·X90 and 0.9·X90 for isotopes (X90 being the radius of the sphere in which 90% of the emitted energy is absorbed) have been computed, together with the average percentage difference within 0.9·RCSDA and 0.9·X90 for electrons and isotopes, respectively. 
Results: For monoenergetic electrons, within 0.8·RCSDA (where 90%-97% of the particle energy is deposited), fluka and penelope agree mostly within 7%, except for 10 and 20 keV electrons (12% in water, 8.3% in bone). The discrepancies between fluka and the other codes are of the same order of magnitude as those observed when comparing the other codes among themselves, and can be attributed to the different simulation algorithms. When considering the beta spectra, the discrepancies are notably reduced: within 0.9·X90, fluka and penelope differ by less than 1% in water and less than 2% in bone for all the isotopes considered here. Complete fluka DPK data are given as Supplementary Material as a tool to perform dosimetry by analytical point-kernel convolution. Conclusions: fluka provides reliable results when transporting electrons in the low energy range, proving to be an adequate tool for nuclear medicine dosimetry.
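The X90 quantity used in this record, the radius of the sphere absorbing 90% of the emitted energy, can be extracted directly from energy tallied in concentric shells. A sketch with a synthetic radial deposition profile, not actual fluka output:

```python
import math

def x90(radii_cm, edep_per_shell):
    """Radius enclosing 90% of the deposited energy, by linear interpolation
    of the cumulative energy fraction over the shell tallies."""
    total = sum(edep_per_shell)
    cum = 0.0
    prev_r, prev_f = 0.0, 0.0
    for r, e in zip(radii_cm, edep_per_shell):
        f = (cum + e) / total
        if f >= 0.9:
            # interpolate between (prev_r, prev_f) and (r, f)
            return prev_r + (0.9 - prev_f) * (r - prev_r) / (f - prev_f)
        cum += e
        prev_r, prev_f = r, f
    return radii_cm[-1]

# Synthetic tally: exponential fall-off of deposited energy with radius
radii = [0.01 * i for i in range(1, 101)]    # shell outer radii (cm)
edep = [math.exp(-5.0 * r) for r in radii]   # arbitrary units
r90 = x90(radii, edep)
```

Scaled DPKs are then reported as a function of r/X90 (or r/RCSDA for monoenergetic electrons), which is what makes kernels from different codes directly comparable.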

  2. Evaluation of the accuracy of mono-energetic electron and beta-emitting isotope dose-point kernels using particle and heavy ion transport code system: PHITS.

    PubMed

    Shiiba, Takuro; Kuga, Naoya; Kuroiwa, Yasuyoshi; Sato, Tatsuhiko

    2017-10-01

We assessed the accuracy of mono-energetic electron and beta-emitting isotope dose-point kernels (DPKs) calculated using the particle and heavy ion transport code system (PHITS) for patient-specific dosimetry in targeted radionuclide therapy (TRT) and compared our data with published data. All mono-energetic and beta-emitting isotope DPKs calculated using PHITS, both in water and in compact bone, were in good agreement with those reported in the literature using other MC codes. PHITS provided reliable mono-energetic electron and beta-emitting isotope scaled DPKs for patient-specific dosimetry.

  3. Gamma irradiator dose mapping simulation using the MCNP code and benchmarking with dosimetry.

    PubMed

    Sohrabpour, M; Hassanzadeh, M; Shahriari, M; Sharifzadeh, M

    2002-10-01

The Monte Carlo transport code MCNP has been applied to simulate the dose rate distribution in the IR-136 gamma irradiator system. Isodose curves, cumulative dose values, and system design data such as throughputs, over-dose ratios, and efficiencies have been simulated as functions of product density. Simulated isodose curves and cumulative dose values were compared with dosimetry values obtained using polymethyl methacrylate, Fricke, ethanol-chlorobenzene, and potassium dichromate dosimeters. The resulting system design data were also found to agree well with the system manufacturer's data. MCNP has thus been found to be an effective transport code for handling various dose mapping exercises for gamma irradiators.

  4. The internal dosimetry code PLEIADES.

    PubMed

    Fell, T P; Phipps, A W; Smith, T J

    2007-01-01

The International Commission on Radiological Protection (ICRP) has published dose coefficients for the ingestion or inhalation of radionuclides in a series of reports covering intakes by workers and members of the public, including children and pregnant or lactating women. The calculation of these coefficients divides naturally into two distinct parts: the biokinetic and the dosimetric. This paper describes in detail the methods used to solve the biokinetic problem in the generation of dose coefficients on behalf of the ICRP, as implemented in the Health Protection Agency's internal dosimetry code PLEIADES. A summary of the dosimetric treatment is included.

  5. Development, validation, and implementation of a patient-specific Monte Carlo 3D internal dosimetry platform

    NASA Astrophysics Data System (ADS)

    Besemer, Abigail E.

Targeted radionuclide therapy is emerging as an attractive treatment option for a broad spectrum of tumor types because it has the potential to simultaneously eradicate both the primary tumor site and the metastatic disease throughout the body. Patient-specific absorbed dose calculations for radionuclide therapies are important for reducing the risk of normal tissue complications and optimizing tumor response. However, the only FDA-approved software for internal dosimetry calculates doses based on the MIRD methodology, which estimates mean organ doses using activity-to-dose scaling factors tabulated from standard phantom geometries. Despite the improved dosimetric accuracy afforded by direct Monte Carlo dosimetry methods, these methods are not widely used in routine clinical practice because of the complexity of implementation, lack of relevant standard protocols, and longer dose calculation times. The main goal of this work was to develop a Monte Carlo internal dosimetry platform in order to (1) calculate patient-specific voxelized dose distributions in a clinically feasible time frame, (2) examine and quantify the dosimetric impact of various parameters and methodologies used in 3D internal dosimetry methods, and (3) develop a multi-criteria treatment planning optimization framework for multi-radiopharmaceutical combination therapies. This platform utilizes serial PET/CT or SPECT/CT images to calculate voxelized 3D internal dose distributions with the Monte Carlo code Geant4. Dosimetry can be computed for any diagnostic or therapeutic radiopharmaceutical and for both pre-clinical and clinical applications. In this work, the platform's dosimetry calculations were successfully validated against previously published reference dose values calculated in standard phantoms for a variety of radionuclides, over a wide range of photon and electron energies, and for many different organs and tumor sizes. 
Retrospective dosimetry was also calculated for various pre-clinical and clinical patients, and large dosimetric differences were found between conventional organ-level methods and the patient-specific voxelized methods described in this work. The dosimetric impact of various steps in the 3D voxelized dosimetry process was evaluated, including quantitative imaging acquisition, image coregistration, voxel resampling, ROI contouring, CT-based material segmentation, and pharmacokinetic fitting. Finally, a multi-objective treatment planning optimization framework was developed for multi-radiopharmaceutical combination therapies.
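The pharmacokinetic-fitting step mentioned above typically ends with analytic integration of the fitted time-activity curve: for a biexponential A(t) = A1·exp(-λ1·t) + A2·exp(-λ2·t), the time-integrated (cumulated) activity is simply A1/λ1 + A2/λ2. A sketch with illustrative fitted parameters, not patient data:

```python
import math

# Illustrative biexponential fit parameters (amplitudes in GBq, rates in 1/h)
A1, l1 = 0.8, 0.05   # fast clearance component
A2, l2 = 0.2, 0.01   # slow clearance component

def activity(t):
    """Fitted time-activity curve A(t)."""
    return A1 * math.exp(-l1 * t) + A2 * math.exp(-l2 * t)

# Analytic time-integrated activity (GBq*h): integral of A(t) from 0 to infinity
a_tilde = A1 / l1 + A2 / l2

# Numerical cross-check: trapezoidal rule over a window long enough that the
# remaining tail is negligible
dt, t_end = 0.1, 2000.0
n = int(t_end / dt)
num = sum(0.5 * (activity(i * dt) + activity((i + 1) * dt)) * dt for i in range(n))
```

The cumulated activity, multiplied by an S-value or folded with a voxel dose kernel, yields the absorbed dose; the fitting itself (here assumed already done) is usually a nonlinear least-squares step over the serial imaging time points.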

  6. Radioactive decay data tables: A handbook of decay data for application to radiation dosimetry and radiological assessments

    NASA Astrophysics Data System (ADS)

    Kocher, D. C.; Smith, J. S.

Decay data are presented for approximately 500 radionuclides, including those occurring naturally in the environment, those of potential importance in routine or accidental releases from the nuclear fuel cycle, those of current interest in nuclear medicine and fusion reactor technology, and some of those of interest to Committee 2 of the International Commission on Radiological Protection for the estimation of annual limits on intake via inhalation and ingestion for occupationally exposed individuals. The handbook describes the physical processes involved in radioactive decay that produce the different types of radiation observed, the methods used to prepare the decay data sets for each radionuclide in the format of the computerized Evaluated Nuclear Structure Data File, the tables of radioactive decay data themselves, and the MEDLIST computer code used to produce the tables. Applications of the data to problems of interest in radiation dosimetry and radiological assessments are considered, as is the calculation of the activity of a daughter radionuclide relative to the activity of its parent in a radioactive decay chain.
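The parent-daughter activity calculation mentioned at the end of the abstract is the two-member Bateman solution. A sketch using the familiar Mo-99/Tc-99m pair, ignoring the ~88% branching fraction for simplicity:

```python
import math

t_half_parent = 65.94    # h, Mo-99
t_half_daughter = 6.01   # h, Tc-99m
l1 = math.log(2) / t_half_parent    # parent decay constant
l2 = math.log(2) / t_half_daughter  # daughter decay constant

def parent_activity(a1_0, t):
    """Parent activity A1(t) = A1(0) * exp(-l1*t)."""
    return a1_0 * math.exp(-l1 * t)

def daughter_activity(a1_0, t):
    """Two-member Bateman solution for the daughter activity, assuming no
    daughter present at t = 0:
    A2(t) = A1(0) * l2/(l2-l1) * (exp(-l1*t) - exp(-l2*t))."""
    return a1_0 * l2 / (l2 - l1) * (math.exp(-l1 * t) - math.exp(-l2 * t))
```

At times long compared with the daughter half-life, the activity ratio A2/A1 approaches the transient-equilibrium value l2/(l2-l1), which is why a Tc-99m generator is milked on a roughly daily schedule.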

  7. Confirmation of a realistic reactor model for BNCT dosimetry at the TRIGA Mainz

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ziegner, Markus, E-mail: Markus.Ziegner.fl@ait.ac.at; Schmitz, Tobias; Hampel, Gabriele

    2014-11-01

Purpose: In order to build up a reliable dose monitoring system for boron neutron capture therapy (BNCT) applications at the TRIGA reactor in Mainz, a computer model of the entire reactor was established, simulating the radiation field by means of the Monte Carlo method. The impact of different source definition techniques was compared and the model was validated by experimental fluence and dose determinations. Methods: The depletion calculation code ORIGEN2 was used to compute the burn-up and relevant material composition of each burned fuel element from the day of first reactor operation to the current core. The material composition of the current core was used in a MCNP5 model of the initial core developed earlier. To perform calculations for the region outside the reactor core, the model was expanded to include the thermal column and compared with the previously established ATTILA model. Subsequently, the computational model was simplified in order to reduce the calculation time. Both simulation models were validated by experiments with different setups, using alanine dosimetry and gold activation measurements with two different types of phantoms. Results: The MCNP5-simulated neutron spectrum and source strength are found to be in good agreement with the previous ATTILA model, whereas the photon production is much lower. Both MCNP5 simulation models predict all experimental dose values with an accuracy of about 5%. The simulations reveal that a Teflon environment favorably reduces the gamma dose component as compared to a polymethyl methacrylate phantom. Conclusions: A computer model for BNCT dosimetry was established, allowing the prediction of dosimetric quantities without further calibration and within a reasonable computation time for clinical applications. The good agreement between the MCNP5 simulations and experiments demonstrates that the ATTILA model overestimates the gamma dose contribution. 
The detailed model can be used for the planning of structural modifications in the thermal column irradiation channel or the use of irradiation sites other than the thermal column, e.g., the beam tubes.

  8. Energy absorption buildup factors, exposure buildup factors and Kerma for optically stimulated luminescence materials and their tissue equivalence for radiation dosimetry

    NASA Astrophysics Data System (ADS)

    Singh, Vishwanath P.; Badiger, N. M.

    2014-11-01

Optically stimulated luminescence (OSL) materials are sensitive dosimetric materials used for precise and accurate dose measurement for low-energy ionizing radiation. Low-dose measurement capability with improved sensitivity makes these dosimeters very useful for diagnostic imaging, personnel monitoring and environmental radiation dosimetry. Gamma-ray energy absorption buildup factors and exposure buildup factors were computed for OSL materials using the five-parameter Geometric Progression (G-P) fitting method in the energy range 0.015-15 MeV for penetration depths up to 40 mean free paths. The computed energy absorption buildup factor and exposure buildup factor values were studied as functions of penetration depth and incident photon energy. Effective atomic numbers, kerma relative to air and tissue equivalence of the selected OSL materials were computed and compared with those of water, PMMA and ICRU standard tissues. The buildup factors and kerma relative to air were found to depend on the effective atomic number. The buildup factors determined in the present work should be useful in radiation dosimetry, medical diagnostics and therapy, space dosimetry, accident dosimetry and personnel monitoring.
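The five-parameter G-P fitting method used above expresses the buildup factor at penetration depth x (in mean free paths) through the parameters (b, c, a, Xk, d). A sketch of the standard G-P form; the parameter values below are illustrative only, since real values are tabulated per material and photon energy:

```python
import math

def gp_buildup(x, b, c, a, Xk, d):
    """G-P fit: B(x) = 1 + (b-1)(K^x - 1)/(K - 1) for K != 1, else 1 + (b-1)x,
    with the depth-dependent dose-multiplication factor
    K(x) = c*x^a + d*(tanh(x/Xk - 2) - tanh(-2)) / (1 - tanh(-2))."""
    K = c * x ** a + d * (math.tanh(x / Xk - 2.0) - math.tanh(-2.0)) / (1.0 - math.tanh(-2.0))
    if abs(K - 1.0) < 1e-9:
        return 1.0 + (b - 1.0) * x
    return 1.0 + (b - 1.0) * (K ** x - 1.0) / (K - 1.0)

# Illustrative parameter set (order of magnitude only, not tabulated data)
b, c, a, Xk, d = 2.0, 1.3, -0.05, 15.0, 0.02
```

A useful sanity check built into the form: at x = 1 mean free path, B(1) = b regardless of the other parameters, which is how the parameter b is defined in the G-P tabulations.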

  9. History of dose specification in Brachytherapy: From Threshold Erythema Dose to Computational Dosimetry

    NASA Astrophysics Data System (ADS)

    Williamson, Jeffrey F.

    2006-09-01

This paper briefly reviews the evolution of brachytherapy dosimetry from 1900 to the present. Dosimetric practices in brachytherapy fall into three distinct eras: During the era of biological dosimetry (1900-1938), radium pioneers could only specify Ra-226 and Rn-222 implants in terms of the mass of radium encapsulated within the implanted sources. Due to the high energy of its emitted gamma rays and the long range of its secondary electrons in air, free-air chambers could not be used to quantify the output of Ra-226 sources in terms of exposure. Biological dosimetry, most prominently the threshold erythema dose, gained currency as a means of intercomparing radium treatments with exposure-calibrated orthovoltage x-ray units. The classical dosimetry era (1940-1980) began with the successful exposure standardization of Ra-226 sources by Bragg-Gray cavity chambers. Classical dose-computation algorithms, based upon 1-D buildup factor measurements and point-source superposition computational algorithms, were able to accommodate artificial radionuclides such as Co-60, Ir-192, and Cs-137. The quantitative dosimetry era (1980-present) arose in response to the increasing utilization of low-energy K-capture radionuclides such as I-125 and Pd-103, for which classical approaches could not be expected to estimate accurate doses. This led to intensive development of both experimental (largely TLD-100) and Monte Carlo dosimetry techniques, along with more accurate air-kerma strength standards. As a result of extensive benchmarking and intercomparison of these different methods, single-seed low-energy radionuclide dose distributions are now known with a total uncertainty of 3%-5%.

  10. Characterization of a fiber-coupled Al2O3:C luminescence dosimetry system for online in vivo dose verification during 192Ir brachytherapy.

    PubMed

    Andersen, Claus E; Nielsen, Søren Kynde; Greilich, Steffen; Helt-Hansen, Jakob; Lindegaard, Jacob Christian; Tanderup, Kari

    2009-03-01

A prototype of a new dose-verification system has been developed to facilitate prevention and identification of dose delivery errors in remotely afterloaded brachytherapy. The system allows for automatic online in vivo dosimetry directly in the tumor region using small passive detector probes that fit into applicators such as standard needles or catheters. The system measures the absorbed dose rate (0.1 s time resolution) and total absorbed dose on the basis of radioluminescence (RL) and optically stimulated luminescence (OSL) from aluminum oxide crystals attached to optical fiber cables (1 mm outer diameter). The system was tested in the range from 0 to 4 Gy using a solid-water phantom, a Varian GammaMed Plus 192Ir PDR afterloader, and dosimetry probes inserted into stainless-steel brachytherapy needles. The calibrated system was found to be linear in the tested dose range. The reproducibility (one standard deviation) for RL and OSL measurements was 1.3%. The measured depth-dose profiles agreed well with the theoretical expectations computed with the EGSnrc Monte Carlo code, suggesting that the energy dependence for the dosimeter probes (relative to water) is less than 6% for source-to-probe distances in the range of 2-50 mm. Under certain conditions, the RL signal could be greatly disturbed by the so-called stem signal (i.e., unwanted light generated in the fiber cable upon irradiation). The OSL signal is not subject to this source of error. The tested system appears to be adequate for in vivo brachytherapy dosimetry.

  11. Retrospective dosimetry analyses of reactor vessel cladding samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greenwood, L. R.; Soderquist, C. Z.; Fero, A. H.

    2011-07-01

Reactor pressure vessel cladding samples from Ringhals Units 3 and 4 in Sweden were analyzed using retrospective reactor dosimetry techniques. The objective was to provide the best estimates of the neutron fluence for comparison with neutron transport calculations. A total of 51 stainless steel samples, consisting of chips weighing approximately 100 to 200 mg, were removed from selected locations around the pressure vessel and sent to Pacific Northwest National Laboratory for analysis. The samples were fully characterized and analyzed for radioactive isotopes, with special interest in the presence of Nb-93m. The RPV cladding retrospective dosimetry results will be combined with a re-evaluation of the surveillance capsule dosimetry and with ex-vessel neutron dosimetry results to form a comprehensive 3D comparison of measurements to calculations performed with a 3D deterministic transport code.

  12. WE-B-207-02: CT Lung Cancer Screening and the Medical Physicist: A Dosimetry Summary of CT Participants in the National Lung Cancer Screening Trial (NLST)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, C.

    2015-06-15

The US National Lung Screening Trial (NLST) was a multi-center randomized, controlled trial comparing low-dose CT (LDCT) with posterior-anterior (PA) chest x-ray (CXR) in screening older, current and former heavy smokers for early detection of lung cancer. Recruitment was launched in September 2002 and ended in April 2004, when 53,454 participants had been randomized in equal proportions at 33 screening sites. Funded by the National Cancer Institute, this trial demonstrated that LDCT screening reduced lung cancer mortality. The US Preventive Services Task Force (USPSTF) cited NLST findings and conclusions in its deliberations and analysis of lung cancer screening. Under the 2010 Patient Protection and Affordable Care Act, the USPSTF's favorable recommendation regarding lung cancer CT screening assisted in obtaining third-party payer coverage for screening. The objective of this session is to provide an introduction to the NLST and the trial findings, in addition to a comprehensive review of the dosimetry investigations and assessments completed using individual NLST participant CT and CXR examinations. Session presentations will review and discuss the findings of two independent assessments: a CXR assessment and a CT investigation calculating individual organ dosimetry values. The CXR assessment reviewed a total of 73,733 chest x-ray exams performed on 92 chest imaging systems, of which 66,157 participant examinations were used. The CT organ dosimetry investigation collected scan parameters from 23,773 CT examinations, a subset of the 75,133 CT examinations performed using 97 multi-detector CT scanners. Organ dose conversion coefficients were calculated using a Monte Carlo code. An experimentally-validated CT scanner simulation was coupled with 193 adult hybrid computational phantoms representing the height and weight of the current U.S. population. 
The dose to selected organs was calculated using the organ dose library and the abstracted scan parameters. This session will review the results and summarize the individualized doses to major organs, as well as the mean effective dose and CTDIvol estimates, for 66,157 PA chest and 23,773 CT examinations, respectively, using size-dependent computational phantoms coupled with Monte Carlo calculations. Learning Objectives: Review and summarize relevant NLST findings and conclusions. Understand the scope and scale of the NLST specific to participant dosimetry. Provide a comprehensive review of NLST participant dosimetry assessments. Summarize the results of an investigation providing individualized organ dose estimates for NLST participant cohorts.

  13. Computational hybrid anthropometric paediatric phantom library for internal radiation dosimetry

    NASA Astrophysics Data System (ADS)

    Xie, Tianwu; Kuster, Niels; Zaidi, Habib

    2017-04-01

Hybrid computational phantoms combine voxel-based and simplified equation-based modelling approaches to provide unique advantages and more realism for the construction of anthropomorphic models. In this work, a methodology and C++ code are developed to generate hybrid computational phantoms covering statistical distributions of body morphometry in the paediatric population. The paediatric phantoms of the Virtual Population Series (IT'IS Foundation, Switzerland) were modified to match target anthropometric parameters, including body mass, body length, standing height and sitting height/stature ratio, determined from reference databases of the National Centre for Health Statistics and the National Health and Nutrition Examination Survey. The phantoms were selected as representative anchor phantoms for newborns and 1, 2, 5, 10 and 15 year-old children, and were subsequently remodelled to create 1100 female and male phantoms with 10th, 25th, 50th, 75th and 90th percentile body morphometries. Evaluation was performed qualitatively using 3D visualization and quantitatively by analysing internal organ masses. Overall, the newly generated phantoms appear very reasonable and representative of the main characteristics of the paediatric population at various ages and for different genders, body sizes and sitting stature ratios. The mass of internal organs increases with height and body mass. The comparison of organ masses of the heart, kidney, liver, lung and spleen with published autopsy and ICRP reference data for children demonstrated that they follow the same trend when correlated with age. The constructed hybrid computational phantom library opens up the prospect of comprehensive radiation dosimetry calculations and risk assessment for the paediatric population of different age groups and diverse anthropometric parameters.

  14. SU-F-T-12: Monte Carlo Dosimetry of the 60Co Bebig High Dose Rate Source for Brachytherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campos, L T; Almeida, C E V de

Purpose: The purpose of this work is to obtain the dosimetry parameters of the BEBIG 60Co high-dose-rate brachytherapy source with Monte Carlo calculations, in accordance with the AAPM TG-43U1 formalism. The geometric design and material details of the source were provided by the manufacturer and were used to define the Monte Carlo geometry. Methods: The dosimetry studies included the calculation of the air kerma strength Sk, the collision kerma in water along the transverse axis in an unbounded phantom, the dose rate constant and the radial dose function. The Monte Carlo code system used was EGSnrc with a new cavity code, which is part of EGS++ and allows calculating the radial dose function around the source. The XCOM photon cross-section library was used. Variance reduction techniques were used to speed up the calculation and to considerably reduce the computer time. To obtain the dose rate distributions of the source in an unbounded liquid water phantom, the source was immersed at the center of a cube water phantom of 100 cm3. Results: The dose rate constant obtained for the BEBIG 60Co source was 1.108±0.001 cGy·h⁻¹·U⁻¹, which is consistent with the values in the literature. The radial dose functions were compared with the values of the consensus data set in the literature and are consistent with the published data for this energy range. Conclusion: The dose rate constant is consistent with the results of Granero et al. and Selvam and Bhola within 1%. Dose rate data are compared with the GEANT4 and DOSRZnrc Monte Carlo codes. However, the radial dose function differs by up to 10% for points notably near the source on the transverse axis, because the high-energy photons from 60Co cause an electronic disequilibrium at the interface between the source capsule and the liquid water for distances up to 1 cm.
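The TG-43U1 quantities computed above combine, in the 1-D point-source approximation, as D-dot(r) = Sk·Λ·(r0/r)²·g(r), where Λ is the dose rate constant and g(r) the radial dose function (the geometry factor and 2-D anisotropy function are omitted here). A sketch using the dose rate constant reported in the abstract; the g(r) table is an illustrative placeholder, not consensus data:

```python
LAMBDA = 1.108   # cGy / (h*U), BEBIG 60Co dose rate constant, from the abstract
R0 = 1.0         # TG-43 reference distance (cm)

# Illustrative radial dose function samples (r in cm); g(R0) = 1 by definition
G_TABLE = [(0.5, 1.005), (1.0, 1.000), (2.0, 0.985), (5.0, 0.940), (10.0, 0.860)]

def g(r):
    """Piecewise-linear interpolation of the radial dose function table."""
    if r <= G_TABLE[0][0]:
        return G_TABLE[0][1]
    for (r1, g1), (r2, g2) in zip(G_TABLE, G_TABLE[1:]):
        if r <= r2:
            return g1 + (g2 - g1) * (r - r1) / (r2 - r1)
    return G_TABLE[-1][1]

def dose_rate(sk, r):
    """Water dose rate (cGy/h) at distance r (cm) on the transverse axis,
    for air-kerma strength sk (U), 1-D point-source approximation."""
    return sk * LAMBDA * (R0 / r) ** 2 * g(r)
```

By construction, the dose rate at the reference point (r = 1 cm, per unit air-kerma strength) equals the dose rate constant Λ itself.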

  15. Dose estimation for astronauts using dose conversion coefficients calculated with the PHITS code and the ICRP/ICRU adult reference computational phantoms.

    PubMed

    Sato, Tatsuhiko; Endo, Akira; Sihver, Lembit; Niita, Koji

    2011-03-01

Absorbed-dose and dose-equivalent rates for astronauts were estimated by multiplying fluence-to-dose conversion coefficients, in units of Gy·cm² and Sv·cm² respectively, by cosmic-ray fluxes around spacecraft in units of cm⁻² s⁻¹. The dose conversion coefficients employed in the calculation were evaluated using the general-purpose particle and heavy ion transport code system PHITS coupled to the male and female adult reference computational phantoms, which were released as a common ICRP/ICRU publication. The cosmic-ray fluxes inside and near spacecraft were also calculated by PHITS, using simplified geometries. The accuracy of the obtained absorbed-dose and dose-equivalent rates was verified against various experimental data measured both inside and outside spacecraft. The calculations quantitatively show that the effective doses for astronauts are significantly greater than their corresponding effective dose equivalents, because of the numerical incompatibility between the radiation quality factors and the radiation weighting factors. These results demonstrate the usefulness of dose conversion coefficients in space dosimetry.
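The dose-rate estimate described above is a folding of the cosmic-ray flux spectrum with fluence-to-dose conversion coefficients, summed over energy bins. A sketch with illustrative numbers, not PHITS or ICRP values:

```python
# Energy-binned particle flux (cm^-2 s^-1) and fluence-to-absorbed-dose
# conversion coefficients (pGy cm^2) for the matching bins. Both arrays
# are illustrative placeholders.
flux = [1.2e-1, 8.0e-2, 3.5e-2, 1.0e-2]
coeff = [80.0, 250.0, 600.0, 1500.0]

def dose_rate_pGy_per_s(flux, coeff):
    """Absorbed-dose rate as the bin-wise product of flux and conversion
    coefficient, summed over the spectrum."""
    return sum(f * c for f, c in zip(flux, coeff))

rate = dose_rate_pGy_per_s(flux, coeff)
```

In practice this sum runs over particle species as well as energy, with separate coefficient sets for absorbed dose and dose equivalent; the structure of the calculation is the same.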

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kruger, R.

    The US National Lung Screening Trial (NLST) was a multi-center randomized, controlled trial comparing a low-dose CT (LDCT) to posterior-anterior (PA) chest x-ray (CXR) in screening older, current and former heavy smokers for early detection of lung cancer. Recruitment was launched in September 2002 and ended in April 2004 when 53,454 participants had been randomized at 33 screening sites in equal proportions. Funded by the National Cancer Institute this trial demonstrated that LDCT screening reduced lung cancer mortality. The US Preventive Services Task Force (USPSTF) cited NLST findings and conclusions in its deliberations and analysis of lung cancer screening. Undermore » the 2010 Patient Protection and Affordable Care Act, the USPSTF favorable recommendation regarding lung cancer CT screening assisted in obtaining third-party payers coverage for screening. The objective of this session is to provide an introduction to the NLST and the trial findings, in addition to a comprehensive review of the dosimetry investigations and assessments completed using individual NLST participant CT and CXR examinations. Session presentations will review and discuss the findings of two independent assessments, a CXR assessment and the findings of a CT investigation calculating individual organ dosimetry values. The CXR assessment reviewed a total of 73,733 chest x-ray exams that were performed on 92 chest imaging systems of which 66,157 participant examinations were used. The CT organ dosimetry investigation collected scan parameters from 23,773 CT examinations; a subset of the 75,133 CT examinations performed using 97 multi-detector CT scanners. Organ dose conversion coefficients were calculated using a Monte Carlo code. An experimentally-validated CT scanner simulation was coupled with 193 adult hybrid computational phantoms representing the height and weight of the current U.S. population. 
The dose to selected organs was calculated using the organ dose library and the abstracted scan parameters. This session will review the results and summarize the individualized doses to major organs, as well as the mean effective dose and CTDIvol estimates for the 66,157 PA chest and 23,773 CT examinations, respectively, using size-dependent computational phantoms coupled with Monte Carlo calculations. Learning Objectives: Review and summarize relevant NLST findings and conclusions. Understand the scope and scale of the NLST specific to participant dosimetry. Provide a comprehensive review of NLST participant dosimetry assessments. Summarize the results of an investigation providing individualized organ dose estimates for NLST participant cohorts.

  17. Effects of body habitus on internal radiation dose calculations using the 5-year-old anthropomorphic male models.

    PubMed

    Xie, Tianwu; Kuster, Niels; Zaidi, Habib

    2017-07-13

Computational phantoms are commonly used in internal radiation dosimetry to assess the amount and distribution pattern of energy deposited in various parts of the human body from different internal radiation sources. Radiation dose assessments are commonly performed on predetermined reference computational phantoms, although there is a case for individualized, patient-specific radiation dosimetry. This study aims to evaluate the influence of body habitus on internal dosimetry and to quantify the uncertainties in dose estimation associated with the use of fixed reference models. The 5-year-old IT'IS male phantom was modified to match target anthropometric parameters, including body weight, body height and sitting height/stature ratio (SSR), determined from reference databases, thus enabling the creation of 125 5-year-old habitus-dependent male phantoms with 10th, 25th, 50th, 75th and 90th percentile body morphometries. We evaluated the absorbed fractions and the mean absorbed dose to the target region per unit cumulative activity in the source region (S-values) of F-18 in 46 source regions for the generated 125 anthropomorphic 5-year-old hybrid male phantoms using the Monte Carlo N-Particle eXtended general-purpose Monte Carlo transport code, and calculated the absorbed dose and effective dose of five 18F-labelled radiotracers for children of various habitus. For most organs, the S-value of F-18 presents stronger statistical correlations with body weight, standing height and sitting height than with BMI and SSR. The self-absorbed fraction and self-absorbed S-values of F-18, and the absorbed dose and effective dose of 18F-labelled radiotracers, present the strongest statistical correlations with body weight. For 18F-Amino acids, 18F-Brain receptor substances, 18F-FDG, 18F-L-DOPA and 18F-FBPA, the mean absolute effective dose differences between phantoms of different habitus and fixed reference models are 11.4%, 11.3%, 10.8%, 13.3% and 11.4%, respectively.
Total body weight, standing height and sitting height have considerable effects on human internal dosimetry. Radiation dose calculations for individual subjects using the most closely matched habitus-dependent computational phantom should be considered as an alternative to improve the accuracy of the estimates.
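The MIRD-style S-value that this study computes (mean absorbed dose to a target region per unit cumulated activity in a source region) can be sketched as follows. This is a minimal illustration, not the study's method: the F-18 emission data are rounded textbook values, and the absorbed fractions and target mass are invented for the example.

```python
# Illustrative MIRD-style S-value: S(target <- source) = sum_i Delta_i * phi_i / m_target
# The absorbed fractions and target mass below are hypothetical placeholders.

MEV_TO_J = 1.602176634e-13  # joules per MeV

def s_value(emissions, absorbed_fractions, target_mass_kg):
    """Mean absorbed dose per unit cumulated activity, in Gy/(Bq*s).

    emissions:          list of (mean_energy_MeV, yield_per_decay) pairs
    absorbed_fractions: fraction of each emission's energy absorbed in the target
    """
    dose_per_decay_j = sum(e * y * phi * MEV_TO_J
                           for (e, y), phi in zip(emissions, absorbed_fractions))
    return dose_per_decay_j / target_mass_kg

# F-18: positron (mean energy ~0.250 MeV, yield 0.967) plus two 0.511 MeV
# annihilation photons per positron.
emissions = [(0.250, 0.967), (0.511, 2 * 0.967)]
phi = [1.0, 0.03]   # assume local beta deposition; small photon self-absorption
s = s_value(emissions, phi, target_mass_kg=0.031)  # assumed small-organ mass
```

Because the beta energy is deposited essentially locally while photons mostly escape, self-dose S-values scale roughly inversely with organ mass, which is why body weight dominates the correlations reported above.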

  18. Effects of body habitus on internal radiation dose calculations using the 5-year-old anthropomorphic male models

    NASA Astrophysics Data System (ADS)

    Xie, Tianwu; Kuster, Niels; Zaidi, Habib

    2017-08-01

    Computational phantoms are commonly used in internal radiation dosimetry to assess the amount and distribution pattern of energy deposited in various parts of the human body from different internal radiation sources. Radiation dose assessments are commonly performed on predetermined reference computational phantoms while the argument for individualized patient-specific radiation dosimetry exists. This study aims to evaluate the influence of body habitus on internal dosimetry and to quantify the uncertainties in dose estimation correlated with the use of fixed reference models. The 5-year-old IT’IS male phantom was modified to match target anthropometric parameters, including body weight, body height and sitting height/stature ratio (SSR), determined from reference databases, thus enabling the creation of 125 5-year-old habitus-dependent male phantoms with 10th, 25th, 50th, 75th and 90th percentile body morphometries. We evaluated the absorbed fractions and the mean absorbed dose to the target region per unit cumulative activity in the source region (S-values) of F-18 in 46 source regions for the generated 125 anthropomorphic 5-year-old hybrid male phantoms using the Monte Carlo N-Particle eXtended general purpose Monte Carlo transport code and calculated the absorbed dose and effective dose of five 18F-labelled radiotracers for children of various habitus. For most organs, the S-value of F-18 presents stronger statistical correlations with body weight, standing height and sitting height than BMI and SSR. The self-absorbed fraction and self-absorbed S-values of F-18 and the absorbed dose and effective dose of 18F-labelled radiotracers present with the strongest statistical correlations with body weight. For 18F-Amino acids, 18F-Brain receptor substances, 18F-FDG, 18F-L-DOPA and 18F-FBPA, the mean absolute effective dose differences between phantoms of different habitus and fixed reference models are 11.4%, 11.3%, 10.8%, 13.3% and 11.4%, respectively. 
Total body weight, standing height and sitting height have considerable effects on human internal dosimetry. Radiation dose calculations for individual subjects using the most closely matched habitus-dependent computational phantom should be considered as an alternative to improve the accuracy of the estimates.

  19. ALGEBRA: ALgorithm for the heterogeneous dosimetry based on GEANT4 for BRAchytherapy.

    PubMed

    Afsharpour, H; Landry, G; D'Amours, M; Enger, S; Reniers, B; Poon, E; Carrier, J-F; Verhaegen, F; Beaulieu, L

    2012-06-07

Task group 43 (TG43)-based dosimetry algorithms are efficient for brachytherapy dose calculation in water. However, human tissues have chemical compositions and densities different from water. Moreover, the mutual shielding effect of seeds on each other (interseed attenuation) is neglected in TG43-based dosimetry platforms. The scientific community has expressed the need for an accurate dosimetry platform in brachytherapy. The purpose of this paper is to present ALGEBRA, a Monte Carlo platform for dosimetry in brachytherapy which is sufficiently fast and accurate for clinical and research purposes. ALGEBRA is based on the GEANT4 Monte Carlo code and is capable of handling the DICOM RT standard to recreate a virtual model of the treated site. Here, the performance of ALGEBRA is presented for the special case of LDR brachytherapy in permanent prostate and breast seed implants. However, the algorithm is also capable of handling other treatments such as HDR brachytherapy.
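The water-based TG43 formalism that ALGEBRA improves upon can be sketched with the standard 1D (point-source) dose-rate equation. The radial dose function coefficients and source parameters below are illustrative placeholders, not consensus data for any real seed.

```python
# Hedged sketch of the AAPM TG-43 1D point-source formalism:
#   D(r) = S_K * Lambda * (r0/r)^2 * g(r) * phi_an(r)
# The g(r) polynomial and source parameters are hypothetical, for shape only.
import numpy as np

def tg43_dose_rate_1d(r_cm, air_kerma_strength, dose_rate_constant,
                      g_poly, phi_an=1.0, r0=1.0):
    """Dose rate to water (cGy/h) at distance r_cm from a point source."""
    geometry = (r0 / r_cm) ** 2            # point-source geometry function
    g_r = np.polyval(g_poly, r_cm)         # radial dose function g(r)
    return air_kerma_strength * dose_rate_constant * geometry * g_r * phi_an

# Hypothetical low-energy-seed-like parameters: S_K in U, Lambda in cGy/(h*U)
g_poly = [0.0, -0.0009, -0.08, 1.08]       # illustrative cubic fit for g(r)
rate = tg43_dose_rate_1d(r_cm=2.0, air_kerma_strength=0.5,
                         dose_rate_constant=0.965, g_poly=g_poly)
```

Because every factor here is tabulated for water, the formalism cannot see tissue composition or neighbouring seeds; that is exactly the gap the Monte Carlo platform addresses.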

  20. The work programme of EURADOS on internal and external dosimetry.

    PubMed

    Rühm, W; Bottollier-Depois, J F; Gilvin, P; Harrison, R; Knežević, Ž; Lopez, M A; Tanner, R; Vargas, A; Woda, C

    2018-01-01

Since the early 1980s, the European Radiation Dosimetry Group (EURADOS) has been maintaining a network of institutions interested in the dosimetry of ionising radiation. As of 2017, this network includes more than 70 institutions (research centres, dosimetry services, university institutes, etc.), and the EURADOS database lists more than 500 scientists who contribute to the EURADOS mission, which is to promote research and technical development in dosimetry and its implementation into practice, and to contribute to the harmonisation of dosimetry in Europe and its conformance with international practices. The EURADOS work programme is organised into eight working groups dealing with environmental, computational, internal, and retrospective dosimetry; dosimetry in medical imaging; dosimetry in radiotherapy; dosimetry in high-energy radiation fields; and harmonisation of individual monitoring. Results are published as freely available EURADOS reports and in the peer-reviewed scientific literature. Moreover, EURADOS organises winter schools and training courses on various aspects of radiation dosimetry, and formulates the strategic research needs in dosimetry that are important for Europe. This paper gives an overview of the most important EURADOS activities. More details can be found at www.eurados.org.

  1. Strategic Consolidation of Medical War Reserve Material (WRM) Equipment Unit Type Code (UTC) Assemblages

    DTIC Science & Technology

    2013-03-01

The report notes data-quality anomalies, such as listing 640 deployments in 2003 but only 11 for 2005 and 3 for 2008; the poor data quality was attributed to a lost hard drive. [Recovered table fragments list UTC assemblages including RAD/NUC Dosimetry Equipment (902P), Expeditionary Dental Clinic (9171), and High Altitude Air Drop Mission Support (903A).]

  2. Internal dosimetry through GATE simulations of preclinical radiotherapy using a melanin-targeting ligand

    NASA Astrophysics Data System (ADS)

    Perrot, Y.; Degoul, F.; Auzeloux, P.; Bonnet, M.; Cachin, F.; Chezal, J. M.; Donnarieix, D.; Labarre, P.; Moins, N.; Papon, J.; Rbah-Vidal, L.; Vidal, A.; Miot-Noirault, E.; Maigne, L.

    2014-05-01

    The GATE Monte Carlo simulation platform based on the Geant4 toolkit is under constant improvement for dosimetric calculations. In this study, we explore its use for the dosimetry of the preclinical targeted radiotherapy of melanoma using a new specific melanin-targeting radiotracer labeled with iodine 131. Calculated absorbed fractions and S values for spheres and murine models (digital and CT-scan-based mouse phantoms) are compared between GATE and EGSnrc Monte Carlo codes considering monoenergetic electrons and the detailed energy spectrum of iodine 131. The behavior of Geant4 standard and low energy models is also tested. Following the different authors’ guidelines concerning the parameterization of electron physics models, this study demonstrates an agreement of 1.2% and 1.5% with EGSnrc, respectively, for the calculation of S values for small spheres and mouse phantoms. S values calculated with GATE are then used to compute the dose distribution in organs of interest using the activity distribution in mouse phantoms. This study gives the dosimetric data required for the translation of the new treatment to the clinic.

  3. Development and application of a 3-D geometry/mass model for LDEF satellite ionizing radiation assessments

    NASA Technical Reports Server (NTRS)

    Colborn, B. L.; Armstrong, T. W.

    1992-01-01

A computer model of the three-dimensional geometry and material distributions of the LDEF spacecraft, experiment trays, and, for selected trays, the components of experiments within a tray was developed for use in ionizing radiation assessments. The model is being applied to provide 3-D shielding distributions around radiation dosimeters to aid in data interpretation, particularly in assessing the directional properties of the radiation exposure. The model has also been interfaced with radiation transport codes for 3-D dosimetry response predictions and for calculations related to determining the accuracy of trapped proton and cosmic ray environment models. The methodology used in developing the 3-D LDEF model is described, together with the level of detail incorporated. Currently, the trays modeled in detail are F2, F8, H3, and H12. Applications of the model which are discussed include the 3-D shielding distributions around various dosimeters, the influence of shielding on dosimetry responses, and comparisons of dose predictions based on the present 3-D model versus those from the 1-D geometry model approximations used in initial estimates.

  4. MAGIC polymer gel for dosimetric verification in boron neutron capture therapy

    PubMed Central

    Heikkinen, Sami; Kotiluoto, Petri; Serén, Tom; Seppälä, Tiina; Auterinen, Iiro; Savolainen, Sauli

    2007-01-01

    Radiation‐sensitive polymer gels are among the most promising three‐dimensional dose verification tools developed to date. We tested the normoxic polymer gel dosimeter known by the acronym MAGIC (methacrylic and ascorbic acid in gelatin initiated by copper) to evaluate its use in boron neutron capture therapy (BNCT) dosimetry. We irradiated a large cylindrical gel phantom (diameter: 10 cm; length: 20 cm) in the epithermal neutron beam of the Finnish BNCT facility at the FiR 1 nuclear reactor. Neutron irradiation was simulated with a Monte Carlo radiation transport code MCNP. To compare dose–response, gel samples from the same production batch were also irradiated with 6 MV photons from a medical linear accelerator. Irradiated gel phantoms then underwent magnetic resonance imaging to determine their R2 relaxation rate maps. The measured and normalized dose distribution in the epithermal neutron beam was compared with the dose distribution calculated by computer simulation. The results support the feasibility of using MAGIC gel in BNCT dosimetry. PACS numbers: 87.53.Qc, 87.53.Wz, 87.66.Ff PMID:17592463
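The final step described above, converting MRI-measured R2 relaxation rates into dose, rests on the gel's dose-response calibration, which is commonly treated as approximately linear over the working range. The sketch below illustrates that calibrate-and-invert step; the calibration points are invented for illustration and are not MAGIC gel data from this study.

```python
# Sketch of a polymer-gel dose calibration: R2 (= 1/T2, in 1/s) is assumed
# approximately linear in absorbed dose over the working range. All numbers
# below are hypothetical, for illustration only.
import numpy as np

dose_gy = np.array([0.0, 2.0, 5.0, 10.0, 15.0])   # delivered calibration doses
r2_per_s = np.array([1.8, 2.9, 4.5, 7.3, 10.0])   # measured R2 of each sample

slope, intercept = np.polyfit(dose_gy, r2_per_s, 1)  # fit R2 = intercept + slope*D

def r2_to_dose(r2):
    """Invert the linear calibration to convert an R2 map into dose (Gy)."""
    return (r2 - intercept) / slope

est = r2_to_dose(5.0)   # dose corresponding to a measured R2 of 5.0 1/s
```

In practice the same batch of gel is calibrated (here, with linac photons) and then used to read out the unknown 3D distribution, which is why batch-matched calibration samples matter.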

  5. SU-C-201-07: Towards Clinical Cherenkov Emission Dosimetry: Stopping Power-To-Cherenkov Power Ratios and Beam Quality Specification of Clinical Electron Beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zlateva, Y; Seuntjens, J; El Naqa, I

Purpose: We propose a Cherenkov emission (CE)-based reference dosimetry method which, in contrast to ionization chamber-based dosimetry, employs spectrum-averaged electron restricted mass collision stopping power-to-Cherenkov power ratios (SCRs), and we examine Monte Carlo-calculated SCRs and beam quality specification of clinical electron beams. Methods: The EGSnrc user code SPRRZnrc was modified to compute SCRs instead of stopping-power ratios (single medium: water; cut-off: CE threshold (observing Spencer-Attix conditions); CE power: Frank-Tamm). SCRs are calculated with BEAMnrc for realistic electron beams with nominal energies of 6–22 MeV from three Varian accelerators (TrueBeam, Clinac 21EX, Clinac 2100C/D) and for mono-energetic beams of energies equal to the mean electron energy at the water surface. Sources of deviation between clinical and mono-energetic SCRs are analyzed quantitatively. A universal fit for the beam-quality index R{sub 50} in terms of the depth of 50% CE, C{sub 50}, is carried out. Results: SCRs at reference depth are overestimated by mono-energetic values by up to 0.2% for a 6-MeV beam and underestimated by up to 2.3% for a 22-MeV beam. The variation is mainly due to the clinical beam spectrum and photon contamination. Beam angular spread has a small effect across all depths and energies. The influence of the electron spectrum becomes increasingly significant at large depths, while at shallow depths and high beam energies photon contamination is predominant (up to 2.0%). The universal data fit reveals a strong linear correlation between R{sub 50} and C{sub 50} (ρ > 0.99999). Conclusion: CE is inherent to radiotherapy beams and can be detected outside the beam with available optical technologies, which makes it an ideal candidate for out-of-beam high-resolution 3D dosimetry. Successful clinical implementation of CE dosimetry hinges on the development of robust protocols for converting measured CE to radiation dose.
Our findings constitute a key step towards clinical CE dosimetry.
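The Cherenkov physics underlying the SCR concept above can be sketched from first principles: an electron radiates only above the threshold where its speed exceeds c/n, and the Frank-Tamm relation gives the photon yield per unit path length. This is a textbook-level sketch, not the modified SPRRZnrc calculation; dispersion in water is neglected, which is a simplifying assumption.

```python
# Cherenkov threshold and approximate Frank-Tamm photon yield for electrons
# in water. Dispersion is ignored (constant n), an explicit simplification.
import math

M_E_MEV = 0.5109989      # electron rest energy, MeV
ALPHA = 1.0 / 137.036    # fine-structure constant
N_WATER = 1.33           # refractive index of water (assumed constant)

def cherenkov_threshold_mev(n=N_WATER):
    """Kinetic energy above which an electron emits Cherenkov light."""
    gamma = 1.0 / math.sqrt(1.0 - 1.0 / n**2)
    return M_E_MEV * (gamma - 1.0)   # ~0.26 MeV in water

def frank_tamm_photons_per_cm(kinetic_mev, lam1_nm=400.0, lam2_nm=700.0, n=N_WATER):
    """Approximate Cherenkov photons per cm of path between two wavelengths."""
    gamma = 1.0 + kinetic_mev / M_E_MEV
    beta2 = 1.0 - 1.0 / gamma**2
    sin2_theta = 1.0 - 1.0 / (beta2 * n**2)   # sin^2 of the Cherenkov angle
    if sin2_theta <= 0.0:
        return 0.0                            # below threshold: no emission
    return 2.0 * math.pi * ALPHA * sin2_theta * (
        1.0 / (lam1_nm * 1e-7) - 1.0 / (lam2_nm * 1e-7))  # wavelengths in cm

t_th = cherenkov_threshold_mev()
```

The existence of this threshold is why the SCR cut-off is set at the CE threshold rather than at an arbitrary delta-ray cut-off.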

  6. IPEM guidelines on dosimeter systems for use as transfer instruments between the UK primary dosimetry standards laboratory (NPL) and radiotherapy centres1

    NASA Astrophysics Data System (ADS)

    Morgan, A. M.; Aird, E. G. A.; Aukett, R. J.; Duane, S.; Jenkins, N. H.; Mayles, W. P. M.; Moretti, C.; Thwaites, D. I.

    2000-09-01

United Kingdom dosimetry codes of practice have traditionally specified one electrometer for use as a secondary standard, namely the Nuclear Enterprises (NE) 2560 NPL secondary standard therapy level exposure meter. The NE2560 will become obsolete in the foreseeable future. This report provides guidelines to assist physicists following the United Kingdom dosimetry codes of practice in selecting an electrometer to replace the NE2560 when necessary. Using an internationally accepted standard (BS EN 60731:1997) as a basis, estimated error analyses demonstrate that the uncertainty (one standard deviation) in a charge measurement associated with the NE2560 alone is approximately 0.3% under specified conditions. Following a review of manufacturers' literature, it is considered that modern electrometers should be capable of equalling this performance. Additional constructional and operational requirements, not specified in the international standard but considered essential in a modern electrometer to be used as a secondary standard, are presented.

  7. Computational Thermodynamics Analysis of Vaporizing Fuel Droplets in the Human Upper Airways

    NASA Astrophysics Data System (ADS)

    Zhang, Zhe; Kleinstreuer, Clement

    The detailed knowledge of air flow structures as well as particle transport and deposition in the human lung for typical inhalation flow rates is an important precursor for dosimetry-and-health-effect studies of toxic particles as well as for targeted drug delivery of therapeutic aerosols. Focusing on highly toxic JP-8 fuel aerosols, 3-D airflow and fluid-particle thermodynamics in a human upper airway model starting from mouth to Generation G3 (G0 is the trachea) are simulated using a user-enhanced and experimentally validated finite-volume code. The temperature distributions and their effects on airflow structures, fuel vapor deposition and droplet motion/evaporation are discussed. The computational results show that the thermal effect on vapor deposition is minor, but it may greatly affect droplet deposition in human airways.
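The droplet evaporation discussed above is often summarized, in reduced form, by the classical d-squared law, in which the squared droplet diameter shrinks linearly in time. The sketch below uses that standard reduced model with invented parameters; it is not the paper's coupled finite-volume thermodynamics, and the rate constant is not a JP-8 value.

```python
# Minimal d^2-law sketch of droplet evaporation: d(t)^2 = d0^2 - K*t.
# The initial diameter and evaporation-rate constant are hypothetical.
import math

def droplet_diameter_um(d0_um, k_um2_per_ms, t_ms):
    """Droplet diameter (um) at time t under the d^2 law; 0 once evaporated."""
    d_squared = d0_um**2 - k_um2_per_ms * t_ms
    return math.sqrt(d_squared) if d_squared > 0.0 else 0.0

d0 = 10.0                   # initial diameter, micrometres (assumed)
k = 2.0                     # evaporation-rate constant, um^2/ms (assumed)
lifetime_ms = d0**2 / k     # time for complete evaporation under this model
```

Because the lifetime scales with d0 squared, small inhaled droplets may evaporate before depositing, shifting dose from droplet deposition to vapor uptake, consistent with the airway-temperature effects the study examines.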

  8. Development of a patient-specific dosimetry estimation system in nuclear medicine examination

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, H. H.; Dong, S. L.; Yang, H. J.

    2011-07-01

The purpose of this study is to develop a patient-specific dosimetry estimation system for nuclear medicine examinations using a SimSET-based Monte Carlo code. We added a dose deposition routine to store the energy deposited by photons during their flights in SimSET and developed a user-friendly interface for reading PET and CT images. Dose calculated on the ORNL phantom was used to validate the accuracy of the system. The S values for {sup 99m}Tc, {sup 18}F and {sup 131}I obtained by the system were compared to those from the MCNP4C code and OLINDA. The ratios of S values computed by this system to those obtained with OLINDA for various organs ranged from 0.93 to 1.18, which is comparable to the range obtained with the MCNP4C code (0.94 to 1.20). The average ratios of S values were 0.99{+-}0.04, 1.03{+-}0.05, and 1.00{+-}0.07 for the isotopes {sup 131}I, {sup 18}F, and {sup 99m}Tc, respectively. SimSET simulations ran about two times faster than MCNP4C for the various isotopes. A 3D dose calculation was also performed on a patient data set from a PET/CT examination using this system. Results from the patient data showed that the estimated S values using this system differed only slightly from those of OLINDA for the ORNL phantom. In conclusion, this system can generate patient-specific dose distributions and display isodose curves on top of the anatomic structure through a friendly graphic user interface. It may also provide a useful tool to establish an appropriate dose-reduction strategy for patients in nuclear medicine environments. (authors)

  9. TestDose: A nuclear medicine software based on Monte Carlo modeling for generating gamma camera acquisitions and dosimetry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garcia, Marie-Paule, E-mail: marie-paule.garcia@univ-brest.fr; Villoing, Daphnée; McKay, Erin

Purpose: The TestDose platform was developed to generate scintigraphic imaging protocols and associated dosimetry by Monte Carlo modeling. TestDose is part of a broader project (www.dositest.com) whose aim is to identify the biases induced by different clinical dosimetry protocols. Methods: The TestDose software allows handling the whole pipeline from virtual patient generation to resulting planar and SPECT images and dosimetry calculations. The originality of the approach relies on the implementation of functional segmentation for the anthropomorphic model representing a virtual patient. Two anthropomorphic models are currently available: 4D XCAT and ICRP 110. A pharmacokinetic model describes the biodistribution of a given radiopharmaceutical in each defined compartment at various time-points. The Monte Carlo simulation toolkit GATE offers the possibility to accurately simulate scintigraphic images and absorbed doses in volumes of interest. The TestDose platform relies on GATE to reproduce precisely any imaging protocol and to provide reference dosimetry. For image generation, TestDose stores the user's imaging requirements and automatically generates command files used as input for GATE. Each compartment is simulated only once and the resulting output is weighted using pharmacokinetic data. Resulting compartment projections are aggregated to obtain the final image. For dosimetry computation, emission data are stored in the platform database and relevant GATE input files are generated for the virtual patient model and associated pharmacokinetics. Results: Two samples of software runs are given to demonstrate the potential of TestDose. A clinical imaging protocol for the Octreoscan™ therapeutic treatment was implemented using the 4D XCAT model. Whole-body "step and shoot" acquisitions at different times postinjection and one SPECT acquisition were generated within reasonable computation times.
Based on the same Octreoscan™ kinetics, a dosimetry computation performed on the ICRP 110 model is also presented. Conclusions: The proposed platform offers a generic framework to implement any scintigraphic imaging protocols and voxel/organ-based dosimetry computation. Thanks to the modular nature of TestDose, other imaging modalities could be supported in the future such as positron emission tomography.

  10. TestDose: A nuclear medicine software based on Monte Carlo modeling for generating gamma camera acquisitions and dosimetry.

    PubMed

    Garcia, Marie-Paule; Villoing, Daphnée; McKay, Erin; Ferrer, Ludovic; Cremonesi, Marta; Botta, Francesca; Ferrari, Mahila; Bardiès, Manuel

    2015-12-01

The TestDose platform was developed to generate scintigraphic imaging protocols and associated dosimetry by Monte Carlo modeling. TestDose is part of a broader project (www.dositest.com) whose aim is to identify the biases induced by different clinical dosimetry protocols. The TestDose software allows handling the whole pipeline from virtual patient generation to resulting planar and SPECT images and dosimetry calculations. The originality of the approach relies on the implementation of functional segmentation for the anthropomorphic model representing a virtual patient. Two anthropomorphic models are currently available: 4D XCAT and ICRP 110. A pharmacokinetic model describes the biodistribution of a given radiopharmaceutical in each defined compartment at various time-points. The Monte Carlo simulation toolkit GATE offers the possibility to accurately simulate scintigraphic images and absorbed doses in volumes of interest. The TestDose platform relies on GATE to reproduce precisely any imaging protocol and to provide reference dosimetry. For image generation, TestDose stores the user's imaging requirements and automatically generates command files used as input for GATE. Each compartment is simulated only once and the resulting output is weighted using pharmacokinetic data. Resulting compartment projections are aggregated to obtain the final image. For dosimetry computation, emission data are stored in the platform database and relevant GATE input files are generated for the virtual patient model and associated pharmacokinetics. Two samples of software runs are given to demonstrate the potential of TestDose. A clinical imaging protocol for the Octreoscan™ therapeutic treatment was implemented using the 4D XCAT model. Whole-body "step and shoot" acquisitions at different times postinjection and one SPECT acquisition were generated within reasonable computation times.
Based on the same Octreoscan™ kinetics, a dosimetry computation performed on the ICRP 110 model is also presented. The proposed platform offers a generic framework to implement any scintigraphic imaging protocols and voxel/organ-based dosimetry computation. Thanks to the modular nature of TestDose, other imaging modalities could be supported in the future such as positron emission tomography.
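The key efficiency idea in TestDose, simulating each compartment only once and weighting the stored output by pharmacokinetic data, can be sketched as below. The compartment names, image shapes, and biexponential time-activity parameters are all assumptions made for illustration; this is not TestDose code.

```python
# Sketch of "simulate once, weight by kinetics": unit-activity projections are
# combined at any time-point using each compartment's time-activity curve.
# Compartments, shapes, and kinetic parameters below are hypothetical.
import numpy as np

def time_activity(t_h, a1, lam1, a2, lam2):
    """Biexponential time-activity curve A(t) for one compartment (MBq)."""
    return a1 * np.exp(-lam1 * t_h) + a2 * np.exp(-lam2 * t_h)

# One unit-activity projection per compartment, simulated once (stand-ins here).
rng = np.random.default_rng(0)
projections = {"kidneys": rng.random((64, 64)), "liver": rng.random((64, 64))}
kinetics = {"kidneys": (50.0, 0.15, 20.0, 0.01),   # (a1, lam1, a2, lam2)
            "liver":   (30.0, 0.25, 10.0, 0.02)}

def image_at(t_h):
    """Aggregate the weighted compartment projections into one image."""
    return sum(time_activity(t_h, *kinetics[name]) * proj
               for name, proj in projections.items())

img_24h = image_at(24.0)   # e.g. a 24 h post-injection planar view
```

The payoff is that adding a new acquisition time-point costs only a weighted sum, not a new Monte Carlo run per compartment.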

  11. Numerical Analysis of Organ Doses Delivered During Computed Tomography Examinations Using Japanese Adult Phantoms with the WAZA-ARI Dosimetry System.

    PubMed

    Takahashi, Fumiaki; Sato, Kaoru; Endo, Akira; Ono, Koji; Ban, Nobuhiko; Hasegawa, Takayuki; Katsunuma, Yasushi; Yoshitake, Takayasu; Kai, Michiaki

    2015-08-01

A dosimetry system for computed tomography (CT) examinations, named WAZA-ARI, is being developed to accurately assess radiation doses to patients in Japan. For dose calculations in WAZA-ARI, organ doses were numerically analyzed using average adult Japanese male (JM) and female (JF) phantoms with the Particle and Heavy Ion Transport code System (PHITS). Experimental studies clarified the energy distribution of emitted photons and the dose profiles on the table for some multi-detector row CT (MDCT) devices. Numerical analyses using a source model in PHITS could specifically take into account the emission of x rays from the tube to the table, with attenuation of photons through a beam-shaping filter, for each MDCT device based on the experimental results. The source model was validated by measuring the CT dose index (CTDI). Numerical analyses with PHITS revealed a concordance of organ doses with the body sizes of the JM and JF phantoms. The organ doses in the JM phantoms were compared with data obtained using previously developed systems. In addition, the dose calculations in WAZA-ARI were verified against previously reported results based on realistic NUBAS phantoms and against radiation dose measurements using a physical Japanese model (THRA1 phantom). The results imply that numerical analyses using the Japanese phantoms and specified source models can give reasonable dose estimates for MDCT devices for typical Japanese adults.
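The CTDI validation mentioned above rests on the standard weighted and volume CTDI definitions, which can be written down directly. The chamber readings and pitch in the example are invented for illustration; only the formulas themselves are standard.

```python
# Standard CT dose index combinations used when validating a CT source model:
#   CTDIw   = (1/3) * CTDI100,center + (2/3) * CTDI100,periphery
#   CTDIvol = CTDIw / pitch          (helical scans)
# The readings and pitch below are hypothetical example values.

def ctdi_w(center_mgy, periphery_mgy):
    """Weighted CTDI (mGy): one third centre plus two thirds periphery."""
    return center_mgy / 3.0 + 2.0 * periphery_mgy / 3.0

def ctdi_vol(ctdi_w_mgy, pitch):
    """Volume CTDI (mGy) for a helical scan: weighted CTDI over pitch."""
    return ctdi_w_mgy / pitch

w = ctdi_w(center_mgy=10.0, periphery_mgy=16.0)   # = 14.0 mGy
v = ctdi_vol(w, pitch=1.2)
```

Agreement between measured and simulated CTDI values of this kind is what justifies using the PHITS source model for the organ-dose calculations.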

  12. Use of computer code for dose distribution studies in a 60Co industrial irradiator

    NASA Astrophysics Data System (ADS)

    Piña-Villalpando, G.; Sloan, D. P.

    1995-09-01

This paper presents a benchmark comparison between calculated and experimental absorbed dose values for a typical product in a 60Co industrial irradiator located at ININ, México. The irradiator is a two-level, two-layer system with an overlapping product configuration and an activity of around 300 kCi. Experimental values were obtained from routine dosimetry using red acrylic pellets. The typical product was packages of Petri dishes with an apparent density of 0.13 g/cm3, chosen for its uniform size, large quantity and low density. The minimum dose was fixed at 15 kGy. Calculated values were obtained from the QAD-CGGP code. This code uses a point-kernel technique; build-up factor fitting was done by geometric progression, and combinatorial geometry is used for system description. The main modifications to the code were related to source simulation: point sources were used instead of pencil sources, and an energy spectrum and anisotropic emission spectrum were included. For the maximum dose, the calculated value (18.2 kGy) was 8% higher than the experimental average value (16.8 kGy); for the minimum dose, the calculated value (13.8 kGy) was about 3% lower than the experimental average value (14.3 kGy).
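The point-kernel technique named above combines inverse-square geometry, exponential attenuation, and a build-up factor. The sketch below shows that combination for a single point source; the attenuation data, build-up value, and source strength are rounded illustrative numbers, not the QAD-CGGP inputs used in the study.

```python
# Minimal point-kernel gamma dose-rate sketch in the spirit of QAD-CGGP:
#   D = B * [S * exp(-mu*r) / (4*pi*r^2)] * E * (mu_en/rho)
# All numeric inputs below are illustrative, not data from the paper.
import math

MEV_TO_J = 1.602176634e-13   # joules per MeV

def point_kernel_dose_rate(photons_per_s, energy_mev, r_cm,
                           mu_cm, mu_en_rho_cm2_g, buildup):
    """Absorbed dose rate (Gy/s) at distance r from an isotropic point source."""
    fluence = photons_per_s * math.exp(-mu_cm * r_cm) / (4.0 * math.pi * r_cm**2)
    # cm^2/g * g/kg conversion (x1000) turns J/(g*s) into Gy/s
    return buildup * fluence * energy_mev * MEV_TO_J * mu_en_rho_cm2_g * 1000.0

# Co-60-like case with water-like attenuation (rounded, assumed values):
mu = 0.063        # linear attenuation coefficient, 1/cm
mu_en = 0.0296    # mass energy-absorption coefficient, cm^2/g
b = 3.0           # build-up factor at a few mean free paths (assumed)
rate = point_kernel_dose_rate(3.7e10, 1.25, r_cm=50.0,
                              mu_cm=mu, mu_en_rho_cm2_g=mu_en, buildup=b)
```

A full code such as QAD-CGGP sums many such kernels over a discretized source and ray-traces the attenuation path through the combinatorial geometry; the sketch shows only one kernel term.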

  13. An Eye Model for Computational Dosimetry Using A Multi-Scale Voxel Phantom

    NASA Astrophysics Data System (ADS)

    Caracappa, Peter F.; Rhodes, Ashley; Fiedler, Derek

    2014-06-01

The lens of the eye is a radiosensitive tissue, with cataract formation being the major concern. Recently reduced recommended dose limits for the lens of the eye have made understanding the dose to this tissue increasingly important. Due to memory limitations, the voxel resolution of computational phantoms used for radiation dose calculations is too coarse to accurately represent the dimensions of the eye. A revised eye model is constructed using physiological data for the dimensions of radiosensitive tissues and is then transformed into a high-resolution voxel model. This eye model is combined with an existing set of whole-body models to form a multi-scale voxel phantom, which is used with the MCNPX code to calculate radiation dose from various exposure types. This phantom provides an accurate representation of radiation transport through the structures of the eye. Two alternate methods of including a high-resolution eye model within an existing whole-body model are developed. The accuracy and performance of each method is compared against existing computational phantoms.

  14. Scattered Dose Calculations and Measurements in a Life-Like Mouse Phantom

    PubMed Central

    Welch, David; Turner, Leah; Speiser, Michael; Randers-Pehrson, Gerhard; Brenner, David J.

    2017-01-01

    Anatomically accurate phantoms are useful tools for radiation dosimetry studies. In this work, we demonstrate the construction of a new generation of life-like mouse phantoms in which the methods have been generalized to be applicable to the fabrication of any small animal. The mouse phantoms, with built-in density inhomogeneity, exhibit different scattering behavior dependent on where the radiation is delivered. Computer models of the mouse phantoms and a small animal irradiation platform were devised in Monte Carlo N-Particle code (MCNP). A baseline test replicating the irradiation system in a computational model shows minimal differences from experimental results from 50 Gy down to 0.1 Gy. We observe excellent agreement between scattered dose measurements and simulation results from X-ray irradiations focused at either the lung or the abdomen within our phantoms. This study demonstrates the utility of our mouse phantoms as measurement tools with the goal of using our phantoms to verify complex computational models. PMID:28140787

  15. Comparison of normal tissue dose calculation methods for epidemiological studies of radiotherapy patients.

    PubMed

    Mille, Matthew M; Jung, Jae Won; Lee, Choonik; Kuzmin, Gleb A; Lee, Choonsik

    2018-06-01

Radiation dosimetry is an essential input for epidemiological studies of radiotherapy patients aimed at quantifying the dose-response relationship of late-term morbidity and mortality. Individualised organ dose must be estimated for all tissues of interest located in-field, near-field, or out-of-field. Whereas conventional measurement approaches are limited to points in water or anthropomorphic phantoms, computational approaches using patient images or human phantoms offer greater flexibility and can provide more detailed three-dimensional dose information. In the current study, we systematically compared four different dose calculation algorithms so that dosimetrists and epidemiologists can better understand the advantages and limitations of the various approaches at their disposal. The four dose calculation algorithms considered were as follows: (1) the Analytical Anisotropic Algorithm (AAA) and (2) the Acuros XB algorithm, as implemented in the Eclipse treatment planning system (TPS); (3) a Monte Carlo radiation transport code, EGSnrc; and (4) an accelerated Monte Carlo code, the x-ray Voxel Monte Carlo (XVMC). The four algorithms were compared in terms of their accuracy and appropriateness in the context of dose reconstruction for epidemiological investigations. Accuracy in peripheral dose was evaluated first by benchmarking the calculated dose profiles against measurements in a homogeneous water phantom. Additional simulations in a heterogeneous cylinder phantom evaluated the performance of the algorithms in the presence of tissue heterogeneity. In general, we found that the algorithms contained within the commercial TPS (AAA and Acuros XB) were fast and accurate in-field or near-field, but not acceptable out-of-field. Therefore, the TPS is best suited for epidemiological studies involving large cohorts and where the organs of interest are located in-field or partially in-field.
The EGSnrc and XVMC codes showed excellent agreement with measurements both in-field and out-of-field. The EGSnrc code was the most accurate dosimetry approach, but was too slow to be used for large-scale epidemiological cohorts. The XVMC code showed similar accuracy to EGSnrc, but was significantly faster, and thus epidemiological applications seem feasible, especially when the organs of interest reside far away from the field edge.
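The benchmarking step described above (calculated dose profiles against water-phantom measurements) is, at its simplest, a point-by-point tolerance test. A sketch of that comparison; the profiles and the 5% criterion below are invented for illustration, not taken from the study:

```python
# Illustrative sketch: compare a calculated peripheral dose profile
# against measurement point-by-point using the local percent difference,
# and report the fraction of points within a tolerance.

def pass_rate(calc, meas, tol_pct=5.0):
    """Fraction of points where |calc - meas| / meas <= tol_pct percent."""
    ok = sum(1 for c, m in zip(calc, meas)
             if abs(c - m) / m * 100.0 <= tol_pct)
    return ok / len(meas)

meas = [100.0, 60.0, 10.0, 1.0, 0.5]   # cGy, moving into the periphery
calc = [99.0, 61.0, 10.4, 1.3, 0.5]    # hypothetical algorithm output
r = pass_rate(calc, meas)              # 4 of 5 points within 5%
```

Out-of-field points carry very small absolute doses, which is why a local (relative) criterion like this one stresses exactly the region where the TPS algorithms were found wanting.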

  16. Monte Carlo Simulations Comparing the Response of a Novel Hemispherical TEPC to Existing Spherical and Cylindrical TEPCs for Neutron Monitoring and Dosimetry.

    PubMed

    Broughton, David P; Waker, Anthony J

    2017-05-01

Neutron dosimetry in reactor fields is currently conducted mainly with unwieldy flux monitors. Tissue Equivalent Proportional Counters (TEPCs) have been shown to have the potential to improve the accuracy of neutron dosimetry in these fields, and Multi-Element Tissue Equivalent Proportional Counters (METEPCs) could reduce the size of the instrumentation required to do so. The complexity of current METEPC designs has inhibited their use beyond research. This work proposes a novel hemispherical counter with a wireless anode ball in place of the traditional anode wire as a possible way of simplifying manufacturing. The hemispherical METEPC element was analyzed as a single TEPC to first demonstrate the potential of this new design by evaluating its performance relative to the reference spherical TEPC design and a single element from a cylindrical METEPC. Energy deposition simulations were conducted using the Monte Carlo code PHITS for both monoenergetic 2.5 MeV neutrons and the D2O-moderated Cf-252 reference neutron spectrum. In these neutron fields, the hemispherical counter appears to be a good alternative to the reference spherical geometry, performing slightly better than the cylindrical counter, which tends to under-read H*(10) at the lower neutron energies of the D2O-moderated Cf-252 field. These computational results are promising, and if follow-up experimental work demonstrates that the hemispherical counter works as anticipated, it will be ready to be incorporated into an METEPC design.
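The response comparison above ultimately rests on moments of the simulated single-event spectrum. As an illustrative sketch (the toy spectrum and its binning are invented, not taken from the paper), the standard frequency-mean and dose-mean lineal energies reduce to two weighted sums:

```python
# Illustrative microdosimetric reduction of a TEPC event-size spectrum.
# f[i] is the number of events with lineal energy y[i] (keV/um).

def mean_lineal_energies(y, f):
    """Return (y_F, y_D): frequency-mean and dose-mean lineal energy."""
    n = sum(f)                                       # total event count
    first = sum(yi * fi for yi, fi in zip(y, f))     # sum of y*f
    second = sum(yi * yi * fi for yi, fi in zip(y, f))
    y_f = first / n        # frequency-mean: average event size
    y_d = second / first   # dose-mean: each event weighted by its energy
    return y_f, y_d

# toy spectrum
y = [1.0, 2.0, 4.0]
f = [10, 5, 1]
yf, yd = mean_lineal_energies(y, f)
```

The dose-mean value weights rare, large events more heavily, which is why counters with different shapes can agree in fluence yet differ in their dose-equivalent response.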

  17. Experimental verification of bremsstrahlung production and dosimetry predictions for 15.5 MeV electrons

    NASA Astrophysics Data System (ADS)

    Sanford, T. W. L.; Beutler, D. E.; Halbleib, J. A.; Knott, D. P.

    1991-12-01

    The radiation produced by a 15.5-MeV monoenergetic electron beam incident on optimized and nonoptimized bremsstrahlung targets is characterized using the ITS Monte Carlo code and measurements with equilibrated and nonequilibrated TLD dosimetry. Comparisons between calculations and measurements verify the calculations and demonstrate that the code can be used to predict both bremsstrahlung production and TLD response for radiation fields that are characteristic of those produced by pulsed simulators of gamma rays. The comparisons provide independent confirmation of the validity of the TLD calibration for photon fields characteristic of gamma-ray simulators. The empirical Martin equation, which is often used to calculate radiation dose from optimized bremsstrahlung targets, is examined, and its range of validity is established.

  18. Development of a computer code to calculate the distribution of radionuclides within the human body by the biokinetic models of the ICRP.

    PubMed

    Matsumoto, Masaki; Yamanaka, Tsuneyasu; Hayakawa, Nobuhiro; Iwai, Satoshi; Sugiura, Nobuyuki

    2015-03-01

This paper describes the Basic Radionuclide vAlue for Internal Dosimetry (BRAID) code, which was developed to calculate the time-dependent activity distribution in each organ and tissue characterised by the biokinetic compartmental models provided by the International Commission on Radiological Protection (ICRP). Translocation from one compartment to the next is taken to be governed by first-order kinetics, formulated as a system of first-order differential equations. In the source program of this code, mass-balance conservation equations are solved to describe the transfer of a radionuclide between compartments. The code can evaluate the radioactivity of nuclides in any organ or tissue without modification of the source program. It can also easily handle revisions of a biokinetic model, or a model uniquely defined by the user, because all information on the biokinetic model structure is imported from an input file. Sample calculations were performed with the ICRP model, and the results were compared with analytic solutions of simple models. The results suggest that this code is sufficiently accurate for dose estimation and for the interpretation of monitoring data.
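The first-order kinetics that BRAID solves can be sketched with a forward-Euler integrator. The three-compartment chain and all rate constants below are invented for illustration (a production code would use a stiff ODE solver and read the model structure from an input file, as the abstract describes):

```python
# Sketch of first-order compartmental kinetics:
# dA_i/dt = sum_j k[j][i]*A_j - (sum_j k[i][j] + lam) * A_i
# where k[i][j] is the transfer rate from compartment i to j (per day)
# and lam is the physical decay constant (per day). All values invented.

def step(A, k, lam, dt):
    """One forward-Euler step of the compartmental balance equations."""
    n = len(A)
    dA = [0.0] * n
    for i in range(n):
        out = sum(k[i]) + lam          # everything leaving compartment i
        dA[i] -= out * A[i]
        for j in range(n):
            dA[j] += k[i][j] * A[i]    # transfers arriving in j
    return [a + dt * d for a, d in zip(A, dA)]

# hypothetical chain: blood -> liver -> excretion, with radioactive decay
k = [[0.0, 0.5, 0.0],    # blood
     [0.0, 0.0, 0.2],    # liver
     [0.0, 0.0, 0.0]]    # excreted (sink)
lam = 0.1
A = [1.0, 0.0, 0.0]      # all activity starts in blood
for _ in range(10000):
    A = step(A, k, lam, dt=0.001)      # simulate 10 days
```

Because the transfers only move activity between compartments, the total activity decays exactly at the physical decay rate, which is a useful conservation check on any such solver.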

  19. WE-B-207-00: CT Lung Cancer Screening Part 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    2015-06-15

The US National Lung Screening Trial (NLST) was a multi-center randomized, controlled trial comparing low-dose CT (LDCT) to posterior-anterior (PA) chest x-ray (CXR) in screening older, current and former heavy smokers for early detection of lung cancer. Recruitment was launched in September 2002 and ended in April 2004, when 53,454 participants had been randomized in equal proportions at 33 screening sites. Funded by the National Cancer Institute, this trial demonstrated that LDCT screening reduced lung cancer mortality. The US Preventive Services Task Force (USPSTF) cited NLST findings and conclusions in its deliberations and analysis of lung cancer screening. Under the 2010 Patient Protection and Affordable Care Act, the favorable USPSTF recommendation regarding lung cancer CT screening assisted in obtaining third-party payer coverage for screening. The objective of this session is to provide an introduction to the NLST and the trial findings, in addition to a comprehensive review of the dosimetry investigations and assessments completed using individual NLST participant CT and CXR examinations. Session presentations will review and discuss the findings of two independent assessments: a CXR assessment, and a CT investigation calculating individual organ dosimetry values. The CXR assessment reviewed a total of 73,733 chest x-ray exams performed on 92 chest imaging systems, of which 66,157 participant examinations were used. The CT organ dosimetry investigation collected scan parameters from 23,773 CT examinations, a subset of the 75,133 CT examinations performed using 97 multi-detector CT scanners. Organ dose conversion coefficients were calculated using a Monte Carlo code. An experimentally validated CT scanner simulation was coupled with 193 adult hybrid computational phantoms representing the height and weight of the current U.S. population.
The dose to selected organs was calculated using the organ dose library and the abstracted scan parameters. This session will review the results and summarize the individualized doses to major organs, as well as the mean effective dose and CTDIvol estimates, for 66,157 PA chest and 23,773 CT examinations, respectively, using size-dependent computational phantoms coupled with Monte Carlo calculations. Learning Objectives: Review and summarize relevant NLST findings and conclusions. Understand the scope and scale of the NLST specific to participant dosimetry. Provide a comprehensive review of NLST participant dosimetry assessments. Summarize the results of an investigation providing individualized organ dose estimates for NLST participant cohorts.
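Operationally, the organ dose reconstruction described above is a lookup-and-scale: a Monte Carlo-derived coefficient library indexed by phantom and technique, multiplied by each participant's abstracted scan parameters. A hypothetical sketch (the phantom key, organ names and coefficient values are all invented, not NLST data):

```python
# Hypothetical organ-dose lookup: a precomputed Monte Carlo library gives
# organ dose conversion coefficients (here mGy per 100 mAs for a given
# phantom and tube voltage), which are scaled by the abstracted scan
# parameters of each examination. All numbers are invented.

COEFF = {  # (phantom, kVp) -> {organ: mGy per 100 mAs}
    ("adult_m_175cm", 120): {"lung": 1.8, "breast": 1.2, "liver": 1.5},
}

def organ_dose(phantom, kvp, organ, mas):
    """Organ dose in mGy for a scan acquired with `mas` tube current-time."""
    return COEFF[(phantom, kvp)][organ] * mas / 100.0

d = organ_dose("adult_m_175cm", 120, "lung", 50)   # hypothetical 50 mAs scan
```

Precomputing the library is what makes per-participant dosimetry feasible for tens of thousands of examinations: the Monte Carlo cost is paid once per phantom/technique combination, not once per scan.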

  20. Development of the two Korean adult tomographic computational phantoms for organ dosimetry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Choonsik; Lee, Choonik; Park, Sang-Hyun

    2006-02-15

Following the previously developed Korean tomographic phantom, KORMAN, two additional whole-body tomographic phantoms of Korean adult males were developed from magnetic resonance (MR) and computed tomography (CT) images, respectively. Two healthy male volunteers, whose body dimensions were fairly representative of the average Korean adult male, were recruited and scanned for phantom development. Contiguous whole-body MR images were obtained from one subject exclusive of the arms, while whole-body CT images were acquired from the second individual. A total of 29 organs and tissues and 19 skeletal sites were segmented via image manipulation techniques such as gray-level thresholding, region growing, and manual drawing, and each segmented image slice was subsequently reviewed by an experienced radiologist for anatomical accuracy. The resulting phantoms, the MR-based KTMAN-1 (Korean Typical MAN-1) and the CT-based KTMAN-2 (Korean Typical MAN-2), consist of 300x150x344 voxels with a voxel resolution of 2x2x5 mm{sup 3} for both phantoms. Masses of segmented organs and tissues were calculated as the product of a nominal reference density, the voxel volume, and the cumulative number of voxels defining each organ or tissue. These organ masses were then compared with those of both the Asian and the ICRP reference adult male. Organ masses within both KTMAN-1 and KTMAN-2 showed differences within 40% of the Asian and ICRP reference values, with the exception of the skin, gall bladder, and pancreas, which displayed larger differences. The resulting three-dimensional binary file was ported to the Monte Carlo code MCNPX2.4 to calculate organ doses following external irradiation for illustrative purposes.
Colon, lung, liver, and stomach absorbed doses, as well as the effective dose, for idealized photon irradiation geometries (anterior-posterior and right lateral) were determined and then compared with data from two other tomographic phantoms (Asian and Caucasian) and the stylized ORNL phantom. The armless KTMAN-1 can be applied to dosimetry for computed tomography or lateral x-ray examinations, while the whole-body KTMAN-2 can be used for radiation protection dosimetry.
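The organ-mass bookkeeping quoted above is a single product per organ. A sketch on the phantoms' 2 x 2 x 5 mm voxel grid (the densities are typical nominal values and the label array is a toy, not KTMAN data):

```python
# Organ mass = nominal tissue density x voxel volume x number of voxels
# carrying that organ's label. Densities and counts are illustrative.

VOXEL_CM3 = 0.2 * 0.2 * 0.5               # 2 x 2 x 5 mm voxel -> 0.02 cm3
DENSITY = {"liver": 1.05, "lung": 0.26}   # g/cm3, nominal reference values

def organ_mass_g(labels, organ):
    """Mass in grams of `organ` in a flat list of per-voxel labels."""
    n = sum(1 for v in labels if v == organ)
    return DENSITY[organ] * VOXEL_CM3 * n

labels = ["liver"] * 1000 + ["lung"] * 500 + ["air"] * 200
m = organ_mass_g(labels, "liver")         # 1.05 * 0.02 * 1000 = 21.0 g
```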

  1. Dosimetry of Al2O3 optically stimulated luminescent dosimeter at high energy photons and electrons

    NASA Astrophysics Data System (ADS)

    Yusof, M. F. Mohd; Joohari, N. A.; Abdullah, R.; Shukor, N. S. Abd; Kadir, A. B. Abd; Isa, N. Mohd

    2018-01-01

The linearity of Al2O3 OSL dosimeters (OSLDs) was evaluated for dosimetry work with clinical photon and electron beams. The measurements were made at the reference depth zref according to the IAEA TRS-398:2000 code of practice, at 6 and 10 MV photons and 6 and 9 MeV electrons. The measured doses were compared with thermoluminescent dosimeters (TLDs) and an ionization chamber commonly used for dosimetry of high-energy photons and electrons. The results showed that the doses measured with the OSL dosimeters were in good agreement with those reported by the ionization chamber for both high-energy photons and electrons. A reproducibility test also showed excellent consistency of the OSL readings at the same energy levels. The overall results confirm the suitability of OSL dosimeters for dosimetry of high-energy photons and electrons in radiotherapy.

  2. Review and standardization of cell phone exposure calculations using the SAM phantom and anatomically correct head models.

    PubMed

    Beard, Brian B; Kainz, Wolfgang

    2004-10-13

    We reviewed articles using computational RF dosimetry to compare the Specific Anthropomorphic Mannequin (SAM) to anatomically correct models of the human head. Published conclusions based on such comparisons have varied widely. We looked for reasons that might cause apparently similar comparisons to produce dissimilar results. We also looked at the information needed to adequately compare the results of computational RF dosimetry studies. We concluded studies were not comparable because of differences in definitions, models, and methodology. Therefore we propose a protocol, developed by an IEEE standards group, as an initial step in alleviating this problem. The protocol calls for a benchmark validation study comparing the SAM phantom to two anatomically correct models of the human head. It also establishes common definitions and reporting requirements that will increase the comparability of all computational RF dosimetry studies of the human head.

  3. Review and standardization of cell phone exposure calculations using the SAM phantom and anatomically correct head models

    PubMed Central

    Beard, Brian B; Kainz, Wolfgang

    2004-01-01

    We reviewed articles using computational RF dosimetry to compare the Specific Anthropomorphic Mannequin (SAM) to anatomically correct models of the human head. Published conclusions based on such comparisons have varied widely. We looked for reasons that might cause apparently similar comparisons to produce dissimilar results. We also looked at the information needed to adequately compare the results of computational RF dosimetry studies. We concluded studies were not comparable because of differences in definitions, models, and methodology. Therefore we propose a protocol, developed by an IEEE standards group, as an initial step in alleviating this problem. The protocol calls for a benchmark validation study comparing the SAM phantom to two anatomically correct models of the human head. It also establishes common definitions and reporting requirements that will increase the comparability of all computational RF dosimetry studies of the human head. PMID:15482601

  4. Computational high-resolution heart phantoms for medical imaging and dosimetry simulations

    NASA Astrophysics Data System (ADS)

    Gu, Songxiang; Gupta, Rajiv; Kyprianou, Iacovos

    2011-09-01

    Cardiovascular disease in general and coronary artery disease (CAD) in particular, are the leading cause of death worldwide. They are principally diagnosed using either invasive percutaneous transluminal coronary angiograms or non-invasive computed tomography angiograms (CTA). Minimally invasive therapies for CAD such as angioplasty and stenting are rendered under fluoroscopic guidance. Both invasive and non-invasive imaging modalities employ ionizing radiation and there is concern for deterministic and stochastic effects of radiation. Accurate simulation to optimize image quality with minimal radiation dose requires detailed, gender-specific anthropomorphic phantoms with anatomically correct heart and associated vasculature. Such phantoms are currently unavailable. This paper describes an open source heart phantom development platform based on a graphical user interface. Using this platform, we have developed seven high-resolution cardiac/coronary artery phantoms for imaging and dosimetry from seven high-quality CTA datasets. To extract a phantom from a coronary CTA, the relationship between the intensity distribution of the myocardium, the ventricles and the coronary arteries is identified via histogram analysis of the CTA images. By further refining the segmentation using anatomy-specific criteria such as vesselness, connectivity criteria required by the coronary tree and image operations such as active contours, we are able to capture excellent detail within our phantoms. For example, in one of the female heart phantoms, as many as 100 coronary artery branches could be identified. Triangular meshes are fitted to segmented high-resolution CTA data. We have also developed a visualization tool for adding stenotic lesions to the coronaries. The male and female heart phantoms generated so far have been cross-registered and entered in the mesh-based Virtual Family of phantoms with matched age/gender information. 
Any phantom in this family, along with user-defined stenoses, can be used to obtain clinically realistic projection images with the Monte Carlo code penMesh for optimizing imaging and dosimetry.
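The segmentation steps named above (grey-level thresholding followed by region growing under connectivity criteria) can be sketched minimally. The 4-connected grower and the tiny intensity grid below are illustrative only, not the paper's pipeline, which adds vesselness filtering and active contours:

```python
# Illustrative grey-level region growing: starting from a seed pixel,
# accept 4-connected neighbours whose intensity falls inside a threshold
# window, so only voxels connected to the seed are kept.

from collections import deque

def region_grow(img, seed, lo, hi):
    """Return the set of (row, col) pixels connected to `seed` whose
    intensity lies in [lo, hi]."""
    rows, cols = len(img), len(img[0])
    grown, q = set(), deque([seed])
    while q:
        r, c = q.popleft()
        if (r, c) in grown or not (0 <= r < rows and 0 <= c < cols):
            continue
        if not (lo <= img[r][c] <= hi):
            continue
        grown.add((r, c))
        q.extend([(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)])
    return grown

img = [[0, 0, 9, 9],      # toy CTA slice: bright contrast-filled vessel
       [0, 8, 9, 0],
       [0, 0, 8, 0]]
vessel = region_grow(img, seed=(0, 2), lo=8, hi=9)
```

The connectivity requirement is what separates a coronary branch from other bright structures that happen to share the same intensity range.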

  5. SU-E-I-107: Suitability of Various Radiation Detectors Used in Radiation Therapy for X-Ray Dosimetry in Computed Tomography.

    PubMed

    Liebmann, M; Poppe, B; von Boetticher, H

    2012-06-01

Assessment of the suitability for X-ray dosimetry in computed tomography of various ionization chambers, diodes and two-dimensional detector arrays primarily used in radiation therapy. An Oldelft X-ray simulation unit was used to irradiate PTW 60008 and 60012 dosimetry diodes, PTW 23332, 31013, 31010 and 31006 axially symmetric ionization chambers, PTW 23343 and 34001 plane-parallel ionization chambers, the PTW Starcheck and 2D-Array seven29, as well as a prototype Farmer chamber with a copper wall. Peak potential was varied from 50 kV to 125 kV, and beam qualities were quantified through half-value-layer measurements. Energy response was investigated free in air as well as at 2 cm depth in a solid water phantom, and refers to a manufacturer-calibrated PTW 60004 diode for kV dosimetry. The thimble ionization chambers PTW 31010 and 31013, the unencapsulated diode PTW 60012 and the PTW 2D-Array seven29 exhibit an energy response deviation of approximately 10% or lower in the investigated energy region, demonstrating good usability in X-ray dosimetry where higher spatial resolution is needed or rotational irradiations occur. It could be shown that detectors routinely used in radiation therapy are usable in a much lower energy region. Their rotational symmetry is of advantage in computed tomography dosimetry and enables dose-profile as well as point-dose measurements in a suitable phantom for estimation of organ doses. Additionally, the PTW 2D-Array seven29 can give a quick overview of radiation fields in non-rotating tasks.

  6. Dosimetry of gamma chamber blood irradiator using PAGAT gel dosimeter and Monte Carlo simulations

    PubMed Central

    Mohammadyari, Parvin; Zehtabian, Mehdi; Sina, Sedigheh; Tavasoli, Ali Reza

    2014-01-01

    Currently, the use of blood irradiation for inactivating pathogenic microbes in infected blood products and preventing graft‐versus‐host disease (GVHD) in immune suppressed patients is greater than ever before. In these systems, dose distribution and uniformity are two important concepts that should be checked. In this study, dosimetry of the gamma chamber blood irradiator model Gammacell 3000 Elan was performed by several dosimeter methods including thermoluminescence dosimeters (TLD), PAGAT gel dosimetry, and Monte Carlo simulations using MCNP4C code. The gel dosimeter was put inside a glass phantom and the TL dosimeters were placed on its surface, and the phantom was then irradiated for 5 min and 27 sec. The dose values at each point inside the vials were obtained from the magnetic resonance imaging of the phantom. For Monte Carlo simulations, all components of the irradiator were simulated and the dose values in a fine cubical lattice were calculated using tally F6. This study shows that PAGAT gel dosimetry results are in close agreement with the results of TL dosimetry, Monte Carlo simulations, and the results given by the vendor, and the percentage difference between the different methods is less than 4% at different points inside the phantom. According to the results obtained in this study, PAGAT gel dosimetry is a reliable method for dosimetry of the blood irradiator. The major advantage of this kind of dosimetry is that it is capable of 3D dose calculation. PACS number: 87.53.Bn PMID:24423829

  7. Effect of respiratory motion on internal radiation dosimetry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Xie, Tianwu; Zaidi, Habib

Purpose: Estimation of the radiation dose to internal organs is essential for the assessment of radiation risks and benefits to patients undergoing diagnostic and therapeutic nuclear medicine procedures including PET. Respiratory motion induces notable internal organ displacement, which influences the absorbed dose for external exposure to radiation. However, to the authors' knowledge, the effect of respiratory motion on internal radiation dosimetry has never been reported before. Methods: Thirteen computational models representing the adult male at different respiratory phases corresponding to the normal respiratory cycle were generated from the 4D dynamic XCAT phantom. Monte Carlo calculations were performed using the MCNP transport code to estimate the specific absorbed fractions (SAFs) of monoenergetic photons/electrons, the S-values of common positron-emitting radionuclides (C-11, N-13, O-15, F-18, Cu-64, Ga-68, Rb-82, Y-86, and I-124), and the absorbed dose of {sup 18}F-fluorodeoxyglucose ({sup 18}F-FDG) in 28 target regions for both the static (average of dynamic frames) and dynamic phantoms. Results: The self-absorbed dose for most organs/tissues is only slightly influenced by respiratory motion. However, for the lung, the self-absorbed SAF is about 11.5% higher at the peak exhale phase than the peak inhale phase for photon energies above 50 keV. The cross-absorbed dose is clearly affected by respiratory motion for many combinations of source-target pairs. The cross-absorbed S-values for the heart contents irradiating the lung are about 7.5% higher in the peak exhale phase than the peak inhale phase for different positron-emitting radionuclides. For {sup 18}F-FDG, organ absorbed doses are less influenced by respiratory motion. Conclusions: Respiration-induced volume variations of the lungs and the repositioning of internal organs affect the self-absorbed dose of the lungs and cross-absorbed dose between organs in internal radiation dosimetry.
The dynamic anatomical model provides more accurate internal radiation dosimetry estimates for the lungs and abdominal organs based on realistic modeling of respiratory motion. This work also contributes to a better understanding of model-induced uncertainties in internal radiation dosimetry.
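Comparisons of S-values like those above feed absorbed-dose estimates through the standard MIRD sum over source regions. A sketch with invented S-values and time-integrated activities (not the paper's data):

```python
# MIRD-style absorbed-dose estimate:
# D(target) = sum over sources of A_tilde(source) * S(target <- source),
# where A_tilde is the time-integrated activity in the source region.
# All numbers below are invented for illustration.

S = {  # mGy per MBq-s, keyed by (target, source)
    ("lung", "lung"): 2.0e-5,
    ("lung", "heart_contents"): 6.0e-6,
}
A_tilde = {"lung": 1.0e4, "heart_contents": 5.0e3}   # MBq-s

def absorbed_dose(target):
    """Sum the self- and cross-dose contributions to `target`."""
    return sum(A_tilde[src] * s
               for (tgt, src), s in S.items() if tgt == target)

d_lung = absorbed_dose("lung")   # 2e-5*1e4 + 6e-6*5e3 = 0.23 mGy
```

A ~7.5% respiratory shift in the cross term S(lung ← heart contents) propagates into the total dose in proportion to that term's share of the sum, which is why self-dose-dominated organs are less affected.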

  8. APPLICATION OF A FINITE-DIFFERENCE TECHNIQUE TO THE HUMAN RADIOFREQUENCY DOSIMETRY PROBLEM

    EPA Science Inventory

    A powerful finite difference numerical technique has been applied to the human radiofrequency dosimetry problem. The method possesses inherent advantages over the method of moments approach in that its implementation requires much less computer memory. Consequently, it has the ca...
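The memory advantage over the method of moments comes from the locality of the finite-difference update: each cell couples only to its neighbours, so no dense interaction matrix is stored. A minimal 1D FDTD sketch (illustrative only, not the EPA implementation; grid size, source and step count are invented):

```python
# Minimal 1D FDTD update loop: leapfrog E and H updates with a Courant
# factor of 0.5 and a soft Gaussian source. Memory is O(grid size).

import math

def fdtd_1d(n_cells=200, n_steps=400, src=20):
    ez = [0.0] * n_cells          # electric field samples
    hy = [0.0] * n_cells          # magnetic field samples
    for t in range(n_steps):
        for k in range(n_cells - 1):
            hy[k] += 0.5 * (ez[k + 1] - ez[k])        # H from curl of E
        ez[src] += math.exp(-((t - 60.0) / 15.0) ** 2)  # Gaussian source
        for k in range(1, n_cells):
            ez[k] += 0.5 * (hy[k] - hy[k - 1])        # E from curl of H
    return ez

ez = fdtd_1d()
```

Two length-N arrays replace the N-by-N matrix a moment-method solution would assemble, which is the "much less computer memory" point the abstract makes.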

  9. Commissioning dosimetry and in situ dose mapping of a semi-industrial Cobalt-60 gamma-irradiation facility using Fricke and Ceric-cerous dosimetry system and comparison with Monte Carlo simulation data

    NASA Astrophysics Data System (ADS)

    Mortuza, Md Firoz; Lepore, Luigi; Khedkar, Kalpana; Thangam, Saravanan; Nahar, Arifatun; Jamil, Hossen Mohammad; Bandi, Laxminarayan; Alam, Md Khorshed

    2018-03-01

Characterization of a 90 kCi (3330 TBq), semi-industrial, cobalt-60 gamma irradiator was performed through commissioning dosimetry and in-situ dose mapping experiments with Ceric-cerous and Fricke dosimetry systems. Commissioning dosimetry was carried out to determine the distribution pattern of absorbed dose in the irradiation cell and products. To determine the maximum and minimum absorbed dose, overdose ratio and dwell time of the tote boxes, a homogeneous dummy product (rice husk) with a bulk density of 0.13 g/cm3 was used in the box positions of the irradiation chamber. The regions of minimum absorbed dose in the tote boxes were observed in the lower zones of the middle plane, and the maximum absorbed doses were found in the middle position of the front plane. Moreover, as part of the dose mapping, dose rates at the wall positions and some selected strategic positions were also measured so that multiple irradiation programs, especially low-dose research irradiations, can be run simultaneously. In most cases, Monte Carlo simulation data obtained with the Monte Carlo N-Particle eXtended code, version MCNPX 2.7, were found to be in congruence with the experimental values obtained from Ceric-cerous and Fricke dosimetry; however, at positions close to the source, the dose-rate variation between chemical dosimetry and MCNPX was higher than at distant positions.
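The two commissioning figures named above reduce to simple arithmetic on the mapped dose extremes. A sketch with invented values (not the facility's data): the overdose ratio of a tote box is D_max / D_min over the mapped dosimeter positions, and the dwell time follows from the target minimum dose and the dose rate at the coldest spot.

```python
# Uniformity figures from a dose map; all values are illustrative.

def overdose_ratio(doses):
    """Ratio of the hottest to the coldest mapped dose."""
    return max(doses) / min(doses)

def dwell_time_h(target_min_kgy, min_rate_kgy_per_h):
    """Hours a tote must dwell so the coldest spot reaches the target."""
    return target_min_kgy / min_rate_kgy_per_h

dose_map_kgy = [25.1, 27.8, 30.4, 24.6, 29.9]  # mapped positions, kGy
u = overdose_ratio(dose_map_kgy)               # ~1.24
t_h = dwell_time_h(25.0, 2.0)                  # 12.5 h at 2 kGy/h
```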

  10. [Verification of the dose delivered to the patient by means of TLD, SC, PID. What future?].

    PubMed

    Noël, A

    2003-11-01

Among the different possibilities for checking the accuracy of the treatment delivered, only in vivo dosimetry verifies the precision of the dose actually delivered to the patient during treatment. In 1970-1980, Rudén assessed the use of thermoluminescent dosimetry for in vivo measurements at Radiumhemmet in Stockholm. Straightforward in principle but demanding in implementation, thermoluminescent dosimetry has been widely used. Today, thanks to the work of Rikner, semiconductor detectors allow the general implementation of in vivo dosimetry. Tomorrow, electronic portal imaging devices will verify the geometrical patient setup and the dose delivery at the same time. Their implementation remains complex and will require the development of algorithms to compute the exit dose or midplane dose from portal in vivo dosimetry. First clinical results show that portal imaging is an accurate alternative to conventional in vivo dosimetry using diodes.
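One classical way such algorithms combine entrance and exit readings is to take their arithmetic mean as a midplane-dose estimate and compare it against the planned dose with a tolerance. This is a sketch of a common approximation, not the specific portal-dosimetry method the review anticipates; all readings and the 5% tolerance are invented:

```python
# Midplane dose estimated as the arithmetic mean of entrance and exit
# in vivo readings (a classical first-order approximation), with a
# pass/fail check against the planned dose.

def midplane_estimate(d_entrance, d_exit):
    return 0.5 * (d_entrance + d_exit)

def within_tolerance(measured, planned, tol=0.05):
    """True if the relative deviation from the planned dose is <= tol."""
    return abs(measured - planned) / planned <= tol

d_mid = midplane_estimate(2.10, 1.86)        # Gy, hypothetical readings
ok = within_tolerance(d_mid, planned=2.00)   # within 5% of prescription
```

More refined algorithms correct for patient thickness, field size and beam hardening, but the mean-of-entrance-and-exit form shows why two diode readings per field already constrain the dose at depth.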

  11. Mathematical modelling of scanner-specific bowtie filters for Monte Carlo CT dosimetry

    NASA Astrophysics Data System (ADS)

    Kramer, R.; Cassola, V. F.; Andrade, M. E. A.; de Araújo, M. W. C.; Brenner, D. J.; Khoury, H. J.

    2017-02-01

The purpose of bowtie filters in CT scanners is to homogenize the x-ray intensity measured by the detectors in order to improve the image quality and at the same time to reduce the dose to the patient because of the preferential filtering near the periphery of the fan beam. For CT dosimetry, especially for Monte Carlo calculations of organ and tissue absorbed doses to patients, it is important to take the effect of bowtie filters into account. However, the material composition and dimensions of these filters are proprietary. Consequently, a method for bowtie filter simulation independent of access to proprietary data and/or to a specific scanner would be of interest to many researchers involved in CT dosimetry. This study presents such a method based on the weighted computed tomography dose index, CTDIw, defined in two cylindrical PMMA phantoms of 16 cm and 32 cm diameter. With an EGSnrc-based Monte Carlo (MC) code, ratios CTDIw/CTDI100,a were calculated for a specific CT scanner using PMMA bowtie filter models based on sigmoid Boltzmann functions combined with a scanner filter factor (SFF), which is modified during calculations until the calculated MC CTDIw/CTDI100,a matches the ratio CTDIw/CTDI100,a determined by measurements or found in publications for that specific scanner. Once the scanner-specific value of the SFF has been found, the bowtie filter algorithm can be used in any MC code to perform CT dosimetry for that specific scanner. The bowtie filter model proposed here was validated for CTDIw/CTDI100,a considering 11 different CT scanners, and for CTDI100,c, CTDI100,p and their ratio considering 4 different CT scanners. Additionally, comparisons were made for lateral dose profiles free in air and using computational anthropomorphic phantoms. CTDIw/CTDI100,a determined with this new method agreed on average within 0.89% (max. 3.4%) and 1.64% (max.
4.5%) with corresponding data published by CTDosimetry (www.impactscan.org) for the CTDI HEAD and BODY phantoms, respectively. Comparison with results calculated using proprietary data for the PHILIPS Brilliance 64 scanner showed agreement on average within 2.5% (max. 5.8%) and with data measured for that scanner within 2.1% (max. 3.7%). Ratios of CTDI100,c/CTDI100,p for this study and corresponding data published by CTDosimetry (www.impactscan.org) agree on average within about 11% (max. 28.6%). Lateral dose profiles calculated with the proposed bowtie filter and with proprietary data agreed within 2% (max. 5.9%), and both calculated data agreed within 5.4% (max. 11.2%) with measured results. Application of the proposed bowtie filter and of the exactly modelled filter to human phantom Monte Carlo calculations show agreement on the average within less than 5% (max. 7.9%) for organ and tissue absorbed doses.
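The functional form described above can be sketched as follows. Since the true filter parameters are proprietary, every constant here (maximum thickness, transition angle, sigmoid width, attenuation coefficient) is illustrative, standing in for values that would be tuned via the CTDIw/CTDI100,a match:

```python
# Sketch of a sigmoid-Boltzmann bowtie filter model: PMMA thickness is
# thin at the fan-beam centre and rises sigmoidally toward the
# periphery; a scanner filter factor (SFF) scales the profile until the
# calculated CTDIw/CTDI100,a matches the measured ratio. All parameter
# values below are invented.

import math

def pmma_thickness_cm(fan_angle_deg, sff, t_max=3.0, a0=12.0, width=3.0):
    """Filter thickness at a given fan angle (degrees off-axis)."""
    x = abs(fan_angle_deg)
    return sff * t_max / (1.0 + math.exp((a0 - x) / width))

def transmission(fan_angle_deg, sff, mu_pmma=0.25):
    """Primary-beam transmission exp(-mu * t) through the filter."""
    return math.exp(-mu_pmma * pmma_thickness_cm(fan_angle_deg, sff))

t_c = transmission(0.0, sff=1.0)     # centre ray: nearly unattenuated
t_p = transmission(25.0, sff=1.0)    # peripheral ray: strongly filtered
```

Because the SFF multiplies the whole thickness profile, a single scalar fit suffices to adapt the generic sigmoid shape to a specific scanner, which is what makes the method usable without proprietary drawings.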

  12. MAGIC-f Gel in Nuclear Medicine Dosimetry: study in an external beam of Iodine-131

    NASA Astrophysics Data System (ADS)

    Schwarcke, M.; Marques, T.; Garrido, C.; Nicolucci, P.; Baffa, O.

    2010-11-01

The applicability of MAGIC-f gel to nuclear medicine dosimetry was investigated by exposure to a 131I source. A calibration was made to provide known absorbed doses at different positions around the source. The absorbed dose in the gel was compared with a Monte Carlo simulation using the PENELOPE code and with thermoluminescent dosimetry (TLD). From MRI analysis of the gel, an R2-dose sensitivity of 0.23 s^-1 Gy^-1 was obtained. The agreement between the dose-distance curves obtained with Monte Carlo simulation and TLD was better than 97%, and between MAGIC-f and TLD better than 98%. The results show the potential of the polymer gel for applications in nuclear medicine where three-dimensional dose distributions are demanded.
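The calibration underlying the quoted sensitivity is a straight-line fit of the MRI relaxation rate R2 against known dose, with the slope as the R2-dose sensitivity. The sample points below are synthetic, generated with the reported 0.23 s^-1 Gy^-1 slope and an invented intercept:

```python
# Ordinary least-squares fit of R2 = a + b*D for gel calibration vials;
# the slope b is the R2-dose sensitivity. Data points are synthetic.

def linfit(x, y):
    """Least squares for y = a + b*x; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) \
        / sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

dose_gy = [0.0, 2.0, 4.0, 8.0]          # known calibration doses
r2_s = [1.10, 1.56, 2.02, 2.94]         # synthetic: 1.10 + 0.23*D
a, b = linfit(dose_gy, r2_s)            # b recovers the 0.23 sensitivity
```

With the fitted (a, b), the same relation is inverted voxel-by-voxel, D = (R2 - a)/b, to turn the MR image into the 3D dose map the abstract highlights.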

  13. Reactor Dosimetry State of the Art 2008

    NASA Astrophysics Data System (ADS)

    Voorbraak, Wim; Debarberis, Luigi; D'Hondt, Pierre; Wagemans, Jan

    2009-08-01

Oral session 1: Retrospective dosimetry. Retrospective dosimetry of VVER 440 reactor pressure vessel at the 3rd unit of Dukovany NPP / M. Marek ... [et al.]. Retrospective dosimetry study at the RPV of NPP Greifswald unit 1 / J. Konheiser ... [et al.]. Test of prototype detector for retrospective neutron dosimetry of reactor internals and vessel / K. Hayashi ... [et al.]. Neutron doses to the concrete vessel and tendons of a magnox reactor using retrospective dosimetry / D. A. Allen ... [et al.]. A retrospective dosimetry feasibility study for Atucha I / J. Wagemans ... [et al.]. Retrospective reactor dosimetry with zirconium alloy samples in a PWR / L. R. Greenwood and J. P. Foster -- Oral session 2: Experimental techniques. Characterizing the time-dependent components of reactor n/γ environments / P. J. Griffin, S. M. Luker and A. J. Suo-Anttila. Measurements of the recoil-ion response of silicon carbide detectors to fast neutrons / F. H. Ruddy, J. G. Seidel and F. Franceschini. Measurement of the neutron spectrum of the HB-4 cold source at the high flux isotope reactor at Oak Ridge National Laboratory / J. L. Robertson and E. B. Iverson. Feasibility of cavity ring-down laser spectroscopy for dose rate monitoring on nuclear reactor / H. Tomita ... [et al.]. Measuring transistor damage factors in a non-stable defect environment / D. B. King ... [et al.]. Neutron-detection based monitoring of void effects in boiling water reactors / J. Loberg ... [et al.] -- Poster session 1: Power reactor surveillance, retrospective dosimetry, benchmarks and inter-comparisons, adjustment methods, experimental techniques, transport calculations. Improved diagnostics for analysis of a reactor pulse radiation environment / S. M. Luker ... [et al.]. Simulation of the response of silicon carbide fast neutron detectors / F. Franceschini, F. H. Ruddy and B. Petrović. NSV A-3: a computer code for least-squares adjustment of neutron spectra and measured dosimeter responses / J. G.
Williams, A. P. Ribaric and T. Schnauber. Agile high-fidelity MCNP model development techniques for rapid mechanical design iteration / J. A. Kulesza. Extension of Raptor-M3G to r-θ-z geometry for use in reactor dosimetry applications / M. A. Hunter, G. Longoni and S. L. Anderson. In-vessel exposure distributions evaluated with MCNP5 for Atucha II / J. M. Longhino, H. Blaumann and G. Zamonsky. Atucha I nuclear power plant azimuthal ex-vessel flux profile evaluation / J. M. Longhino ... [et al.]. UFTR thermal column characterization and redesign for maximized thermal flux / C. Polit and A. Haghighat. Activation counter using liquid light-guide for dosimetry of neutron burst / M. Hayashi ... [et al.]. Control rod reactivity curves for the annular core research reactor / K. R. DePriest ... [et al.]. Specification of irradiation conditions in VVER-440 surveillance positions / V. Kochkin ... [et al.]. Simulations of Mg-Ar ionisation and TE-TE ionisation chambers with MCNPX in a straightforward gamma and beta irradiation field / S. Nievaart ... [et al.]. The change of austenitic stainless steel elements content in the inner parts of VVER-440 reactor during operation / V. Smutný, J. Hep and P. Novosad. Fast neutron environmental spectrometry using disk activation / G. Lövestam ... [et al.]. Optimization of the neutron activation detector location scheme for VVER-1000 ex-vessel dosimetry / V. N. Bukanov ... [et al.]. Irradiation conditions for surveillance specimens located into plane containers installed in the WWER-1000 reactor of unit 2 of the South-Ukrainian NPP / O. V. Grytsenko, V. N. Bukanov and S. M. Pugach. Conformity between LRO mock-ups and VVERs NPP RPV neutron flux attenuation / S. Belousov, Kr. Ilieva and D. Kirilova. FLUOLE: a new relevant experiment for PWR pressure vessel surveillance / D. Beretz ... [et al.]. Transport of neutrons and photons through the iron and water layers / M. J. Kost'ál ... [et al.].
Condition evaluation of spent nuclear fuel assemblies from the first-generation nuclear-powered submarines by gamma scanning / A. F. Usatyi, L. A. Serdyukova and B. S. Stepennov -- Oral session 3: Power plant surveillance. Upgraded neutron dosimetry procedure for VVER-440 surveillance specimens / V. Kochkin ... [et al.]. Neutron dosimetry on the full-core first generation VVER-440 aimed to reactor support structure load evaluation / P. Borodkin ... [et al.]. Ex-vessel neutron dosimetry programs for PWRs in Korea / C. S. Yoo, B. C. Kim and C. C. Kim. Comparison of irradiation conditions of VVER-1000 reactor pressure vessel and surveillance specimens for various core loadings / V. N. Bukanov ... [et al.]. Re-evaluation of dosimetry in the new surveillance program for the Loviisa 1 VVER-440 reactor / T. Serén -- Oral session 4: Benchmarks, intercomparisons and adjustment methods. Determination of the neutron parameter's uncertainties using the stochastic methods of uncertainty propagation and analysis / G. Grégoire ... [et al.]. Covariance matrices for calculated neutron spectra and measured dosimeter responses / J. G. Williams ... [et al.]. The role of dosimetry at the high flux reactor / S. C. van der Marck ... [et al.]. Calibration of a manganese bath relative to Cf-252 nu-bar / D. M. Gilliam, A. T. Yue and M. Scott Dewey. Major upgrade of the reactor dosimetry interpretation methodology used at the CEA: general principle / C. Destouches ... [et al.] -- Oral session 5: Power plant surveillance. The role of ex-vessel neutron dosimetry in reactor vessel surveillance in South Korea / B.-C. Kim ... [et al.]. Spanish RPV surveillance programmes: lessons learned and current activities / A. Ballesteros and X. Jardí. Atucha I nuclear power plant extended dosimetry and assessment / H. Blaumann ... [et al.]. Monitoring of radiation load of pressure vessels of Russian VVER in compliance with license amendments / G. Borodkin ... [et al.] 
-- Poster session 2: Test reactors, accelerators and advanced systems; cross sections, nuclear data, damage correlations. Two-dimensional mapping of the calculated fission power for the full-size fuel plate experiment irradiated in the advanced test reactor / G. S. Chang and M. A. Lillo. The radiation safety information computational center: a resource for reactor dosimetry software and nuclear data / B. L. Kirk. Irradiated xenon isotopic ratio measurement for failed fuel detection and location in fast reactor / C. Ito, T. Iguchi and H. Harano. Characterization of dosimetry of the BMRR horizontal thimble tubes and broad beam facility / J.-P. Hu, R. N. Reciniello and N. E. Holden. 2007 nuclear data review / N. E. Holden. Further dosimetry studies at the Rhode Island nuclear science / R. N. Reciniello ... [et al.]. Characterization of neutron fields in the experimental fast reactor Joyo MK-III core / S. Maeda ... [et al.]. Measuring [symbol]Li(n, t) and [symbol]B(n, [symbol]) cross sections using the NIST alpha-gamma apparatus / M. S. Dewey ... [et al.]. Improvement of neutron/gamma field evaluation for restart of JMTR / Y. Nagao ... [et al.]. Monitoring of the irradiated neutron fluence in the neutron transmutation doping process of HANARO / M.-S. Kim and S.-J. Park. Training reactor VR-1 neutron spectrum determination / M. Vins, A. Kolros and K. Katovsky. Differential cross sections for gamma-ray production by 14 MeV neutrons on iron and bismuth / V. M. Bondar ... [et al.]. The measurements of the differential elastic neutron cross-sections of carbon for energies from 2 to 133 keV / O. Gritzay ... [et al.]. Determination of neutron spectrum by the dosimetry foil method up to 35 MeV / S. P. Simakov ... [et al.]. Extension of the BGL broad group cross section library / D. Kirilova, S. Belousov and Kr. Ilieva. Measurements of neutron capture cross-section for tantalum at the neutron filtered beams / O. Gritzay and V. Libman. 
Measurements of microscopic data at GELINA in support of dosimetry / S. Kopecky ... [et al.]. Nuclide guide and international chart of nuclides - 2008 / T. Golashvili -- Oral session 6: Test reactors, accelerators and advanced systems. Neutronic analyses in support of the HFIR beamline modifications and lifetime extension / I. Remec and E. D. Blakeman. Characterization of neutron test facilities at Sandia National Laboratories / D. W. Vehar ... [et al.]. LYRA irradiation experiments: neutron metrology and dosimetry / B. Acosta and L. Debarberis. Calculated neutron and gamma-ray spectra across the prismatic very high temperature reactor core / J. W. Sterbentz. Enhancement of irradiation capability of the experimental fast reactor Joyo / S. Maeda ... [et al.]. Neutron spectrum analyses by foil activation method for high-energy proton beams / C. H. Pyeon ... [et al.] -- Oral session 7: Cross sections, nuclear data, damage correlations. Investigation of new reaction cross-section evaluations in order to update and extend the IRDF-2002 reactor dosimetry library / É. M. Zsolnay, H. J. Nolthenius and A. L. Nichols. A novel approach towards DPA calculations / A. Hogenbirk and D. F. Da Cruz. A new ENDF/B-VII.0 based multigroup cross-section library for reactor dosimetry / F. A. Alpan and S. L. Anderson. Activities at the NEA for dosimetry applications / H. Henriksson and I. Kodeli. Validation and verification of covariance data from dosimetry reaction cross-section evaluations / S. Badikov. Status of the neutron cross section standards / A. D. Carlson -- Oral session 8: Transport calculations. A dosimetry assessment for the core restraint of an advanced gas cooled reactor / D. A. Thornton ... [et al.]. Neutron dosimetry study in the region of the support structure of a VVER-1000 type reactor / G. Borodkin ... [et al.]. SNS moderator poison design and experiment validation of the moderator performance / W. Lu ... [et al.]. 
Analysis of OSIRIS in-core surveillance dosimetry for GONDOLE steel irradiation program by using TRIPOLI-4 Monte Carlo code / Y. K. Lee and F. Malouch. Reactor dosimetry applications using RAPTOR-M3G: a new parallel 3-D radiation transport code / G. Longoni and S. L. Anderson.

  14. A Monte Carlo calculation model of electronic portal imaging device for transit dosimetry through heterogeneous media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoon, Jihyung; Jung, Jae Won, E-mail: jungj@ecu.edu; Kim, Jong Oh

    2016-05-15

    Purpose: To develop and evaluate a fast Monte Carlo (MC) dose calculation model of an electronic portal imaging device (EPID) based on effective atomic number modeling in the XVMC code. Methods: A previously developed EPID model, based on the XVMC code with density scaling of the EPID structures, was modified by additionally considering the effective atomic number (Z{sub eff}) of each structure and adopting a phase space file from the EGSnrc code. The model was tested with various homogeneous and heterogeneous phantoms and field sizes by comparing its calculations with EPID measurements. To better evaluate the model, the performance of the XVMC code itself was separately tested by comparing calculated dose to water with ion chamber (IC) array measurements in the plane of the EPID. Results: In the EPID plane, dose to water calculated by the code agreed with IC measurements within 1.8%, where the difference was averaged across the in-field regions of the acquired profiles for all field sizes and phantoms. The maximum point difference was 2.8%, affected by the proximity of the maximum points to the penumbra and by MC noise. The EPID model agreed with measured EPID images within 1.3%, with a maximum point difference of 1.9%. The difference was reduced from the bare-code value by employing a field-size- and thickness-dependent calibration for converting calculated images to measured images. Thanks to the Z{sub eff} correction, the calibration factors of the EPID model showed a linear trend, unlike those of the density-only-scaled model. The phase space file from the EGSnrc code sharpened penumbra profiles significantly, improving the agreement of calculated profiles with measured ones. Conclusions: Demonstrating high accuracy, the EPID model with the associated calibration system may be used for in vivo dosimetry of radiation therapy. Through this study, an MC model of the EPID has been developed and its performance rigorously investigated for transit dosimetry.
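
    The abstract does not spell out how Z{sub eff} is computed; a common choice is the Mayneord power-law effective atomic number. A minimal sketch, assuming that convention (the 2.94 exponent and the water composition below are textbook values, not taken from the paper):

```python
def z_eff(elements, power=2.94):
    """Mayneord power-law effective atomic number.

    elements: list of (Z, mass_fraction, atomic_mass) tuples. Each element
    contributes electrons in proportion to w*Z/A; Z_eff is the power-law
    mean of Z over those fractional electron contributions.
    """
    contrib = [(w * Z / A, Z) for Z, w, A in elements]
    total = sum(c for c, _ in contrib)
    return sum((c / total) * Z ** power for c, Z in contrib) ** (1.0 / power)

# Water: 11.19% H (Z=1, A=1.008) and 88.81% O (Z=8, A=16.00) by mass.
water = [(1, 0.1119, 1.008), (8, 0.8881, 16.00)]
print(round(z_eff(water), 2))  # → 7.42, the textbook value for water
```

The same function applied to each EPID layer's composition would give the per-structure Z{sub eff} values that such a model scales by, alongside the density scaling.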

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williamson, Jeffrey F.

    This paper briefly reviews the evolution of brachytherapy dosimetry from 1900 to the present. Dosimetric practices in brachytherapy fall into three distinct eras: During the era of biological dosimetry (1900-1938), radium pioneers could only specify Ra-226 and Rn-222 implants in terms of the mass of radium encapsulated within the implanted sources. Due to the high energy of its emitted gamma rays and the long range of its secondary electrons in air, free-air chambers could not be used to quantify the output of Ra-226 sources in terms of exposure. Biological dosimetry, most prominently the threshold erythema dose, gained currency as a means of intercomparing radium treatments with exposure-calibrated orthovoltage x-ray units. The classical dosimetry era (1940-1980) began with successful exposure standardization of Ra-226 sources by Bragg-Gray cavity chambers. Classical dose-computation algorithms, based upon 1-D buildup factor measurements and point-source superposition computational algorithms, were able to accommodate artificial radionuclides such as Co-60, Ir-192, and Cs-137. The quantitative dosimetry era (1980-present) arose in response to the increasing utilization of low energy K-capture radionuclides such as I-125 and Pd-103, for which classical approaches could not be expected to estimate correct doses. This led to intensive development of both experimental (largely TLD-100) dosimetry and Monte Carlo dosimetry techniques, along with more accurate air-kerma strength standards. As a result of extensive benchmarking and intercomparison of these different methods, single-seed low-energy radionuclide dose distributions are now known with a total uncertainty of 3%-5%.

  16. Sci-Thur AM: YIS – 04: Stopping power-to-Cherenkov power ratios and beam quality specification for clinical Cherenkov emission dosimetry of electrons: beam-specific effects and experimental validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zlateva, Yana; Seuntjens, Jan; El Naqa, Issam

    Purpose: To advance towards clinical Cherenkov emission (CE)-based dosimetry by investigating beam-specific effects on Monte Carlo-calculated electron-beam stopping power-to-CE power ratios (SCRs), addressing electron beam quality specification in terms of CE, and validating simulations with measurements. Methods: The EGSnrc user code SPRRZnrc, used to calculate Spencer-Attix stopping-power ratios, was modified to instead calculate SCRs. SCRs were calculated for 6- to 22-MeV clinical electron beams from Varian TrueBeam, Clinac 21EX, and Clinac 2100C/D accelerators. Experiments were performed with a 20-MeV electron beam from a Varian TrueBeam accelerator, using a diffraction grating spectrometer with optical fiber input and a cooled back-illuminated CCD. A fluorophore was dissolved in the water to remove CE signal anisotropy. Results: It was found that angular spread of the incident beam has little effect on the SCR (≤ 0.3% at d{sub max}), while both the electron spectrum and photon contamination increase the SCR at shallow depths and decrease it at large depths. A universal data fit of R{sub 50} in terms of C{sub 50} (50% CE depth) revealed a strong linear dependence (R{sup 2} > 0.9999). The SCR was fit with a Burns-type equation (R{sup 2} = 0.9974, NRMSD = 0.5%). Below-threshold incident radiation was found to have minimal effect on beam quality specification (< 0.1%). Experiments and simulations were in good agreement. Conclusions: Our findings confirm the feasibility of the proposed CE dosimetry method, contingent on computation of SCRs from additional accelerators and on further experimental validation. This work constitutes an important step towards clinical high-resolution out-of-beam CE dosimetry.
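
    The beam-quality specification step above amounts to a linear regression of R{sub 50} on C{sub 50}. A minimal sketch with hypothetical depth pairs (the real fit coefficients must come from the authors' Monte Carlo data):

```python
import numpy as np

# Hypothetical (C50, R50) pairs in cm, illustrative only; the actual fit
# coefficients must come from the Monte Carlo SCR calculations.
c50 = np.array([2.2, 3.6, 5.0, 6.4, 7.8])
r50 = np.array([2.3, 3.7, 5.1, 6.5, 7.9])

slope, intercept = np.polyfit(c50, r50, 1)      # R50 = slope*C50 + intercept
pred = slope * c50 + intercept
r_squared = 1.0 - np.sum((r50 - pred) ** 2) / np.sum((r50 - r50.mean()) ** 2)
print(round(slope, 3), round(intercept, 3), round(r_squared, 4))
# → 1.0 0.1 1.0 for this perfectly linear synthetic data
```

With such a fit in hand, a measured C{sub 50} maps directly to a beam-quality index R{sub 50}, which is the sense in which C{sub 50} can serve as a CE-based beam quality specifier.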

  17. Views of Medical Physics in the United Kingdom and Ireland, 1980.

    DTIC Science & Technology

    1981-05-19

    ... as a means of characterization. Other studies include determination of electron dosimetry in bone tissue and a radiological survey of the population dose ... In addition to Ellis, who heads the department, they are: Radiobiology and Dosimetry, Prof. P. R. J. Burch and Dr. A. J. Walker; Medical Electronics and Computing, Dr. F ... Subject terms: absorptiometry, radiation dosimetry, radiotherapy, ultrasound scanning.

  18. Performance of two commercial electron beam algorithms over regions close to the lung-mediastinum interface, against Monte Carlo simulation and point dosimetry in virtual and anthropomorphic phantoms.

    PubMed

    Ojala, J; Hyödynmaa, S; Barańczyk, R; Góra, E; Waligórski, M P R

    2014-03-01

    Electron radiotherapy is applied to treat the chest wall close to the mediastinum. The performance of the GGPB and eMC algorithms implemented in the Varian Eclipse treatment planning system (TPS) was studied in this region for 9 and 16 MeV beams, against Monte Carlo (MC) simulations, point dosimetry in a water phantom and dose distributions calculated in virtual phantoms. For the 16 MeV beam, the accuracy of these algorithms was also compared over the lung-mediastinum interface region of an anthropomorphic phantom, against MC calculations and thermoluminescence dosimetry (TLD). In the phantom with a lung-equivalent slab the results were generally congruent, the eMC results for the 9 MeV beam slightly overestimating the lung dose, and the GGPB results for the 16 MeV beam underestimating the lung dose. Over the lung-mediastinum interface, for 9 and 16 MeV beams, the GGPB code underestimated the lung dose and overestimated the dose in water close to the lung, compared to the congruent eMC and MC results. In the anthropomorphic phantom, results of TLD measurements and MC and eMC calculations agreed, while the GGPB code underestimated the lung dose. Good agreement between TLD measurements and MC calculations attests to the accuracy of "full" MC simulations as a reference for benchmarking TPS codes. Application of the GGPB code in chest wall radiotherapy may result in significant underestimation of the lung dose and overestimation of dose to the mediastinum, affecting plan optimization over volumes close to the lung-mediastinum interface, such as the lung or heart. Copyright © 2013 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  19. SU-F-18C-09: Assessment of OSL Dosimeter Technology in the Validation of a Monte Carlo Radiation Transport Code for CT Dosimetry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carver, D; Kost, S; Pickens, D

    Purpose: To assess the utility of optically stimulated luminescent (OSL) dosimeter technology in calibrating and validating a Monte Carlo radiation transport code for computed tomography (CT). Methods: Exposure data were taken using both a standard CT 100-mm pencil ionization chamber and a series of 150-mm OSL CT dosimeters. Measurements were made at system isocenter in air as well as in standard 16-cm (head) and 32-cm (body) CTDI phantoms at isocenter and at the 12 o'clock positions. Scans were performed on a Philips Brilliance 64 CT scanner for 100 and 120 kVp at 300 mAs with a nominal beam width of 40 mm. A radiation transport code to simulate the CT scanner conditions was developed using the GEANT4 physics toolkit. The imaging geometry and associated parameters were simulated for each ionization chamber and phantom combination. Simulated absorbed doses were compared to both CTDI{sub 100} values determined from the ion chamber and to CTDI{sub 100} values reported from the OSLs. The dose profiles from each simulation were also compared to the physical OSL dose profiles. Results: CTDI{sub 100} values reported by the ion chamber and OSLs are generally in good agreement (average percent difference of 9%), and provide a suitable way to calibrate doses obtained from simulation to real absorbed doses. Simulated and real CTDI{sub 100} values agree to within 10% or less, and the simulated dose profiles also predict the physical profiles reported by the OSLs. Conclusion: Ionization chambers are generally considered the standard for absolute dose measurements. However, OSL dosimeters may also serve as a useful tool with the significant benefit of also assessing the radiation dose profile. This may offer an advantage to those developing simulations for assessing radiation dosimetry such as verification of spatial dose distribution and beam width.
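
    The CTDI{sub 100} quantity compared above is the dose profile integrated over ±50 mm along the scan axis, divided by the nominal total collimation N·T. A minimal sketch of that integral on a simulated profile (the rectangular 40-mm profile below is illustrative, not OSL data):

```python
import numpy as np

def ctdi100(z_mm, dose_mgy, nominal_width_mm):
    """CTDI100: trapezoidal integral of the dose profile D(z) over
    -50..+50 mm, divided by the nominal total collimation N*T."""
    m = (z_mm >= -50.0) & (z_mm <= 50.0)
    z, d = z_mm[m], dose_mgy[m]
    integral = np.sum((d[1:] + d[:-1]) * np.diff(z)) / 2.0
    return integral / nominal_width_mm

# Illustrative rectangular 40-mm-wide profile of 10 mGy (no scatter tails):
z = np.linspace(-75.0, 75.0, 1501)                  # 0.1 mm grid
d = np.where(np.abs(z) <= 20.0, 10.0, 0.0)
print(ctdi100(z, d, 40.0))                          # ≈ 10 mGy
```

A real profile has scatter tails extending beyond the collimated width, which is exactly what the 100-mm integration window (and the 150-mm OSL strips) are meant to capture.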

  20. Comparison of calculated beta- and gamma-ray doses after the Fukushima accident with data from single-grain luminescence retrospective dosimetry of quartz inclusions in a brick sample

    PubMed Central

    Endo, Satoru; Fujii, Keisuke; Kajimoto, Tsuyoshi; Tanaka, Kenichi; Stepanenko, Valeriy; Kolyzhenkov, Timofey; Petukhov, Aleksey; Akhmedova, Umukusum; Bogacheva, Viktoriia

    2018-01-01

    To estimate the beta- and gamma-ray doses in a brick sample taken from Odaka, Minami-Soma City, Fukushima Prefecture, Japan, a Monte Carlo calculation was performed with the Particle and Heavy Ion Transport code System (PHITS). The calculated results were compared with data obtained by single-grain retrospective luminescence dosimetry of quartz inclusions in the brick sample. The calculated result agreed well with the measured data. The dose increase measured at the brick surface was explained by the beta-ray contribution, and the slight slope in the dose profile deeper in the brick was due to the gamma-ray contribution. The skin dose was estimated from the calculated result as 164 mGy over 3 years at the sampling site. PMID:29385528

  1. Comparison of calculated beta- and gamma-ray doses after the Fukushima accident with data from single-grain luminescence retrospective dosimetry of quartz inclusions in a brick sample.

    PubMed

    Endo, Satoru; Fujii, Keisuke; Kajimoto, Tsuyoshi; Tanaka, Kenichi; Stepanenko, Valeriy; Kolyzhenkov, Timofey; Petukhov, Aleksey; Akhmedova, Umukusum; Bogacheva, Viktoriia

    2018-05-01

    To estimate the beta- and gamma-ray doses in a brick sample taken from Odaka, Minami-Soma City, Fukushima Prefecture, Japan, a Monte Carlo calculation was performed with the Particle and Heavy Ion Transport code System (PHITS). The calculated results were compared with data obtained by single-grain retrospective luminescence dosimetry of quartz inclusions in the brick sample. The calculated result agreed well with the measured data. The dose increase measured at the brick surface was explained by the beta-ray contribution, and the slight slope in the dose profile deeper in the brick was due to the gamma-ray contribution. The skin dose was estimated from the calculated result as 164 mGy over 3 years at the sampling site.

  2. Brachytherapy dosimetry of 125I and 103Pd sources using an updated cross section library for the MCNP Monte Carlo transport code.

    PubMed

    Bohm, Tim D; DeLuca, Paul M; DeWerd, Larry A

    2003-04-01

    Permanent implantation of low energy (20-40 keV) photon emitting radioactive seeds to treat prostate cancer is an important treatment option for patients. In order to produce accurate implant brachytherapy treatment plans, the dosimetry of a single source must be well characterized. Monte Carlo based transport calculations can be used for source characterization, but must have up to date cross section libraries to produce accurate dosimetry results. This work benchmarks the MCNP code and its photon cross section library for low energy photon brachytherapy applications. In particular, we calculate the emitted photon spectrum, air kerma, depth dose in water, and radial dose function for both 125I and 103Pd based seeds and compare to other published results. Our results show that MCNP's cross section library differs from recent data primarily in the photoelectric cross section for low energies and low atomic number materials. In water, differences as large as 10% in the photoelectric cross section and 6% in the total cross section occur at 125I and 103Pd photon energies. This leads to differences in the dose rate constant of 3% and 5%, and differences as large as 18% and 20% in the radial dose function for the 125I and 103Pd based seeds, respectively. Using a partially updated photon library, calculations of the dose rate constant and radial dose function agree with other published results. Further, the use of the updated photon library allows us to verify air kerma and depth dose in water calculations performed using MCNP's perturbation feature to simulate updated cross sections. We conclude that in order to most effectively use MCNP for low energy photon brachytherapy applications, we must update its cross section library. Following this update, the MCNP code system will be a very effective tool for low energy photon brachytherapy dosimetry applications.
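
    The radial dose function compared above is the TG-43 quantity g(r), which removes the geometry-function falloff from the transverse-axis dose rate. A sketch under a point-source approximation, with a hypothetical attenuation coefficient (illustrative only, not the paper's MCNP data):

```python
import math

def radial_dose_function(r, dose_rate, r0=1.0):
    """TG-43 radial dose function g(r) under a point-source approximation:
    g(r) = [D(r) / D(r0)] * (r / r0)**2, i.e. G(r) = 1/r**2 divided out,
    normalized to unity at the reference distance r0 (cm)."""
    return (dose_rate(r) / dose_rate(r0)) * (r / r0) ** 2

# Hypothetical transverse-axis dose rate: inverse-square falloff times
# exponential attenuation (mu = 0.1 /cm chosen purely for illustration).
mu = 0.10
dose_rate = lambda r: math.exp(-mu * r) / r ** 2

print(round(radial_dose_function(2.0, dose_rate), 4))  # → 0.9048 (= e**-0.1)
```

Because g(r) isolates attenuation and scatter from geometric falloff, it is exactly the quantity where the abstract's cross-section differences (up to 18-20% at larger radii) become visible.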

  3. Neutron Exposure Parameters for the Dosimetry Capsule in the Heavy-Section Steel Irradiation Program Tenth Irradiation Series

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    C.A. Baldwin; F.B.K. Kam; I. Remec

    1998-10-01

    This report describes the computational methodology for the least-squares adjustment of the dosimetry data from the HSSI 10.0D dosimetry capsule with neutronics calculations. It presents exposure rates at each dosimetry location for the neutron fluence greater than 1.0 MeV, fluence greater than 0.1 MeV, and displacements per atom. Exposure parameter distributions are also described in terms of three-dimensional fitting functions. When fitting functions are used, it is suggested that an uncertainty of 6% (1σ) should be associated with the exposure rate values. The specific activity of each dosimeter at the end of irradiation is listed in the Appendix.
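
    The least-squares adjustment described above combines a calculated spectrum (with its covariance) and measured dosimeter responses. A toy generalized-least-squares (STAY'SL-type) sketch; the two-group numbers below are illustrative only, not the HSSI data:

```python
import numpy as np

def adjust_spectrum(phi0, M, R, r, V):
    """Generalized least-squares (STAY'SL-type) spectrum adjustment.

    phi0: prior group fluxes (n,); M: prior covariance (n, n)
    R: dosimeter response matrix (m, n)
    r: measured responses (m,); V: measurement covariance (m, m)
    Returns adjusted fluxes and their (reduced) covariance.
    """
    S = R @ M @ R.T + V                    # covariance of predicted responses
    K = M @ R.T @ np.linalg.inv(S)         # gain matrix
    phi = phi0 + K @ (r - R @ phi0)        # pull prior toward measurements
    M_adj = M - K @ R @ M                  # adjusted (smaller) covariance
    return phi, M_adj

# Toy two-group problem, illustrative numbers only:
phi0 = np.array([1.0, 1.0])
M = np.diag([0.04, 0.04])                  # 20% prior uncertainty
R = np.array([[1.0, 0.0],                  # dosimeter 1 sees group 1 only
              [1.0, 1.0]])                 # dosimeter 2 sees both groups
r = np.array([1.1, 2.1])                   # measured responses
V = np.diag([1e-4, 1e-4])                  # precise measurements
phi, M_adj = adjust_spectrum(phi0, M, R, r, V)
print(np.round(phi, 2))                    # ≈ [1.1, 1.0]: pulled to the data
```

Production codes work with many energy groups and reaction-rate responses, but the structure is the same: the adjusted fluxes reproduce the measured responses within their uncertainties, and the adjusted covariance shrinks accordingly.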

  4. Dosimetric Considerations in Radioimmunotherapy and Systemic Radionuclide Therapies: A Review

    PubMed Central

    Loke, Kelvin S. H.; Padhy, Ajit K.; Ng, David C. E.; Goh, Anthony S.W.; Divgi, Chaitanya

    2011-01-01

    Radiopharmaceutical therapy, once touted as the “magic bullet” in radiation oncology, is increasingly being used in the treatment of a variety of malignancies, albeit in later disease stages. With ever-increasing public and medical awareness of radiation effects, radiation dosimetry is becoming more important. Dosimetry allows administration of the maximum tolerated radiation dose to the tumor/organ to be treated while limiting radiation to critical organs. Traditional tumor dosimetry involved acquiring pretherapy planar scans and plasma estimates with a diagnostic dose of the intended radiopharmaceutical. New advancements in single photon emission computed tomography and positron emission tomography systems allow semi-quantitative measurements of radiation dosimetry, thus allowing treatments tailored to each individual patient. PMID:22144871

  5. Lung Dosimetry for Radioiodine Treatment Planning in the Case of Diffuse Lung Metastases

    PubMed Central

    Song, Hong; He, Bin; Prideaux, Andrew; Du, Yong; Frey, Eric; Kasecamp, Wayne; Ladenson, Paul W.; Wahl, Richard L.; Sgouros, George

    2010-01-01

    The lungs are the most frequent sites of distant metastasis in differentiated thyroid carcinoma. Radioiodine treatment planning for these patients is usually performed following the Benua–Leeper method, which constrains the administered activity to a 2.96-GBq (80 mCi) whole-body retention at 48 h after administration to prevent lung toxicity in the presence of iodine-avid lung metastases. This limit was derived from clinical experience, and a dosimetric analysis of lung and tumor absorbed dose would be useful to understand the implications of this limit on toxicity and tumor control. Because of highly nonuniform lung density and composition as well as the nonuniform activity distribution when the lungs contain tumor nodules, Monte Carlo dosimetry is required to estimate tumor and normal lung absorbed dose. Reassessment of this toxicity limit is also appropriate in light of the contemporary use of recombinant thyrotropin (thyroid-stimulating hormone, rTSH) to prepare patients for radioiodine therapy. In this work we demonstrated the use of MCNP, a Monte Carlo electron and photon transport code, in a 3-dimensional (3D) imaging-based absorbed dose calculation for tumor and normal lungs. Methods: A pediatric thyroid cancer patient with diffuse lung metastases was administered 37 MBq of 131I after preparation with rTSH. SPECT/CT scans were performed over the chest at 27, 74, and 147 h after tracer administration. The time–activity curve for 131I in the lungs was derived from the whole-body planar imaging and compared with that obtained from the quantitative SPECT methods. Reconstructed and coregistered SPECT/CT images were converted into 3D density and activity probability maps suitable for MCNP4b input. Absorbed dose maps were calculated using electron and photon transport in MCNP4b. Administered activity was estimated on the basis of the maximum tolerated dose (MTD) of 27.25 Gy to the normal lungs. Computational efficiency of the MCNP4b code was studied with a simple segmentation approach. In addition, the Benua–Leeper method was used to estimate the recommended administered activity. The standard dosing plan was modified to account for the weight of this pediatric patient: the 2.96-GBq (80 mCi) whole-body retention was scaled to 2.44 GBq (66 mCi) to give the same dose rate of 43.6 rad/h in the lungs at 48 h. Results: Using the MCNP4b code, both the spatial dose distribution and a dose–volume histogram were obtained for the lungs. An administered activity of 1.72 GBq (46.4 mCi) delivered the putative MTD of 27.25 Gy to the lungs with a tumor absorbed dose of 63.7 Gy. Directly applying the Benua–Leeper method, an administered activity of 3.89 GBq (105.0 mCi) was obtained, resulting in tumor and lung absorbed doses of 144.2 and 61.6 Gy, respectively, when the MCNP-based dosimetry was applied. The voxel-by-voxel calculation time of 4,642.3 h for photon transport was reduced to 16.8 h when the activity maps were segmented into 20 regions. Conclusion: MCNP4b-based, patient-specific 3D dosimetry is feasible and important in the dosimetry of thyroid cancer patients with avid lung metastases that exhibit prolonged retention in the lungs. PMID:17138741
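
    The dose calculation above rests on a lung time–activity curve sampled at 27, 74, and 147 h. A minimal sketch of fitting and integrating such a curve, using a mono-exponential with hypothetical amplitudes (the paper's data may require a biexponential; the absorbed dose would then follow by multiplying the cumulated activity by an S value in the MIRD scheme):

```python
import numpy as np

# Illustrative lung time-activity data on the paper's imaging schedule
# (27, 74, 147 h); the amplitude and half-life below are hypothetical.
t_h = np.array([27.0, 74.0, 147.0])
lam_true = np.log(2) / 50.0                 # 50 h effective half-life
a_mbq = 25.0 * np.exp(-lam_true * t_h)      # activity in the lung VOI

# Mono-exponential fit A(t) = A0*exp(-lam*t) via log-linear least squares.
slope, intercept = np.polyfit(t_h, np.log(a_mbq), 1)
lam, a0 = -slope, np.exp(intercept)

# Time-integrated (cumulated) activity from t=0 to infinity; multiplying
# by an S value (Gy per MBq*h) would give a MIRD-style absorbed dose.
a_cum = a0 / lam
print(round(a0, 1), round(a_cum))           # → 25.0 1803 (MBq, MBq*h)
```

The voxel-level Monte Carlo approach in the paper replaces the single S value with explicit electron and photon transport through the patient's density map, which is why it can resolve the nonuniform lung and nodule doses.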

  6. Dosimetry of nasal uptake of soluble and reactive gases: A first study of inter-human variability (Journal Article)

    EPA Science Inventory

    Anatomically accurate human child and adult nasal tract models will be used in concert with computationally simulated air flow information to investigate the influence of age-related differences in anatomy on inhalation dosimetry in the upper and lower airways. The findings of t...

  7. Development and validation of a GEANT4 radiation transport code for CT dosimetry

    PubMed Central

    Carver, DE; Kost, SD; Fernald, MJ; Lewis, KG; Fraser, ND; Pickens, DR; Price, RR; Stabin, MG

    2014-01-01

    We have created a radiation transport code using the GEANT4 Monte Carlo toolkit to simulate pediatric patients undergoing CT examinations. The focus of this paper is to validate our simulation with real-world physical dosimetry measurements using two independent techniques. Exposure measurements were made with a standard 100-mm CT pencil ionization chamber, and absorbed doses were also measured using optically stimulated luminescent (OSL) dosimeters. Measurements were made in air, a standard 16-cm acrylic head phantom, and a standard 32-cm acrylic body phantom. Physical dose measurements determined from the ionization chamber in air for 100 and 120 kVp beam energies were used to derive photon-fluence calibration factors. Both ion chamber and OSL measurement results provide useful comparisons in the validation of our Monte Carlo simulations. We found that simulated and measured CTDI values were within an overall average of 6% of each other. PMID:25706135

  8. Development and validation of a GEANT4 radiation transport code for CT dosimetry.

    PubMed

    Carver, D E; Kost, S D; Fernald, M J; Lewis, K G; Fraser, N D; Pickens, D R; Price, R R; Stabin, M G

    2015-04-01

    The authors have created a radiation transport code using the GEANT4 Monte Carlo toolkit to simulate pediatric patients undergoing CT examinations. The focus of this paper is to validate their simulation with real-world physical dosimetry measurements using two independent techniques. Exposure measurements were made with a standard 100-mm CT pencil ionization chamber, and absorbed doses were also measured using optically stimulated luminescent (OSL) dosimeters. Measurements were made in air with a standard 16-cm acrylic head phantom and with a standard 32-cm acrylic body phantom. Physical dose measurements determined from the ionization chamber in air for 100 and 120 kVp beam energies were used to derive photon-fluence calibration factors. Both ion chamber and OSL measurement results provide useful comparisons in the validation of the Monte Carlo simulations. It was found that simulated and measured CTDI values were within an overall average of 6% of each other.

  9. Protracted Low-Dose Ionizing Radiation Effects upon Primate Performance

    DTIC Science & Technology

    1977-12-01

    ... dosimetry at the AECL facility. Standard dosimetry techniques were utilized during radiation exposure. In addition, extensive preexposure calibration was conducted ... During each of the epochs, the five basic variables were determined. These calculations were accomplished on an analog computer (Electronics Associates) ...

  10. Methods and computer readable medium for improved radiotherapy dosimetry planning

    DOEpatents

    Wessol, Daniel E.; Frandsen, Michael W.; Wheeler, Floyd J.; Nigg, David W.

    2005-11-15

    Methods and computer readable media are disclosed for ultimately developing a dosimetry plan for a treatment volume irradiated during radiation therapy with a radiation source concentrated internally within a patient or incident from an external beam. The dosimetry plan is available in near "real-time" because of the novel geometric model construction of the treatment volume which in turn allows for rapid calculations to be performed for simulated movements of particles along particle tracks therethrough. The particles are exemplary representations of alpha, beta or gamma emissions emanating from an internal radiation source during various radiotherapies, such as brachytherapy or targeted radionuclide therapy, or they are exemplary representations of high-energy photons, electrons, protons or other ionizing particles incident on the treatment volume from an external source. In a preferred embodiment, a medical image of a treatment volume irradiated during radiotherapy having a plurality of pixels of information is obtained.

  11. Optical computed tomography in PRESAGE® three-dimensional dosimetry: Challenges and prospective.

    PubMed

    Khezerloo, Davood; Nedaie, Hassan Ali; Farhood, Bagher; Zirak, Alireza; Takavar, Abbas; Banaee, Nooshin; Ahmadalidokht, Isa; Kron, Tomas

    2017-01-01

    With the advent of new, complex but precise radiotherapy techniques, the demand for an accurate, feasible three-dimensional (3D) dosimetry system has increased. A 3D dosimetry system should not only give accurate and precise results but should also be feasible, inexpensive, and not time-consuming. Recently, one of the new candidates for 3D dosimetry is optical computed tomography (CT) with a radiochromic dosimeter such as PRESAGE®. Several generations of optical CT scanners have been developed since the 1990s. At the same time, considerable effort has been devoted to introducing robust dosimeters compatible with optical CT scanners. In 2004, the PRESAGE® dosimeter was introduced as a new radiochromic solid-plastic dosimeter. In the past decade, many efforts have been made to enhance optical scanning methods. This article attempts to review and reflect on the results of these investigations.

  12. Monte Carlo MCNP-4B-based absorbed dose distribution estimates for patient-specific dosimetry.

    PubMed

    Yoriyaz, H; Stabin, M G; dos Santos, A

    2001-04-01

This study was intended to verify the capability of the Monte Carlo MCNP-4B code to evaluate spatial dose distributions based on information gathered from CT or SPECT. A new three-dimensional (3D) dose calculation approach for internal emitter use in radioimmunotherapy (RIT) was developed using the Monte Carlo MCNP-4B code as the photon and electron transport engine. It was shown that the MCNP-4B computer code can be used with voxel-based anatomic and physiologic data to provide 3D dose distributions. This study showed that the MCNP-4B code can be used to develop a treatment planning system that will provide such information in a timely manner, if dose reporting is suitably optimized. If each organ is divided into small regions where the average energy deposition is calculated, with a typical volume of 0.4 cm(3), regional dose distributions can be provided with reasonable central processing unit times (on the order of 12-24 h on a 200-MHz personal computer or modest workstation). Further efforts to provide semiautomated region identification (segmentation) and improvement of marrow dose calculations are needed to supply a complete system for RIT. It is envisioned that all such efforts will continue to develop and that internal dose calculations may soon be brought to a similar level of accuracy, detail, and robustness as is commonly expected in external dose treatment planning. For this study we developed a code with a user-friendly interface that works on several nuclear medicine imaging platforms and provides timely patient-specific dose information to the physician and medical physicist. Future therapy with internal emitters should use a 3D dose calculation approach, which represents a significant advance over the dose information provided by the standard geometric phantoms used for more than 20 y (which permit reporting of only average organ doses for certain standardized individuals).
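The regional reporting scheme described above (averaging voxel doses over small regions of roughly 0.4 cm³) can be sketched with a simple block average. This is an illustrative assumption of how such binning might be done, not the MCNP-4B implementation; voxel size and the dose grid are invented.

```python
import numpy as np

def regional_mean_dose(dose, voxel_cm=0.2, region_cm3=0.4):
    """Block-average a 3D voxel dose grid into cubic regions of ~region_cm3.

    Hypothetical sketch: voxel edge length and region volume are
    illustrative, not values taken from the abstract's code.
    """
    n = max(1, round((region_cm3 ** (1 / 3)) / voxel_cm))  # voxels per region edge
    # trim so the grid divides evenly, then average n x n x n blocks
    sx, sy, sz = (s - s % n for s in dose.shape)
    d = dose[:sx, :sy, :sz]
    return d.reshape(sx // n, n, sy // n, n, sz // n, n).mean(axis=(1, 3, 5))

dose = np.ones((8, 8, 8))          # uniform 1 Gy toy grid of 0.2 cm voxels
regions = regional_mean_dose(dose)
print(regions.shape, regions.max())
```

Block-averaging like this is what makes regional reporting cheap relative to per-voxel tallies: the transport cost is unchanged, but the statistics per reported value improve with the number of voxels per region.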

  13. Monte Carlo simulations in radiotherapy dosimetry.

    PubMed

    Andreo, Pedro

    2018-06-27

The use of the Monte Carlo (MC) method in radiotherapy dosimetry has increased almost exponentially in the last decades. Its widespread use in the field has converted this computer simulation technique into a common tool for reference and treatment planning dosimetry calculations. This work reviews the different MC calculations made on dosimetric quantities, like stopping-power ratios and perturbation correction factors required for reference ionization chamber dosimetry, as well as the fully realistic MC simulations currently available of clinical accelerators, detectors and patient treatment planning. Issues raised include the necessity for consistency in the data throughout the entire dosimetry chain in reference dosimetry, and how Bragg-Gray theory breaks down for small photon fields. Both aspects are less critical for MC treatment planning applications, but there are important constraints like tissue characterization and its patient-to-patient variability, which, together with the conversion between dose-to-water and dose-to-tissue, are analysed in detail. Although these constraints are common to all methods and algorithms used in different types of treatment planning systems, they mean that the uncertainties involved in MC treatment planning still remain "uncertain".

  14. SU-E-T-254: Optimization of GATE and PHITS Monte Carlo Code Parameters for Uniform Scanning Proton Beam Based On Simulation with FLUKA General-Purpose Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Kurosu, K; Department of Medical Physics & Engineering, Osaka University Graduate School of Medicine, Osaka; Takashina, M

Purpose: Monte Carlo codes are becoming important tools for proton beam dosimetry. However, the relationships between the customizing parameters of the GATE and PHITS codes and the resulting percentage depth dose (PDD) have not been reported; here they are studied for PDD and proton range in comparison with the FLUKA code and experimental data. Methods: The beam delivery system of the Indiana University Health Proton Therapy Center was modeled for the uniform scanning beam in FLUKA and transferred identically into GATE and PHITS. This computational model was built from the blueprint and validated with the commissioning data. The three parameters evaluated are the maximum step size, the cut-off energy, and the physical and transport model. The dependence of the PDDs on the customizing parameters was compared with the published results of previous studies. Results: The optimal parameters for the simulation of the whole beam delivery system were defined by referring to the calculation results obtained with each parameter. Although the PDDs from FLUKA and the experimental data show good agreement, those of GATE and PHITS obtained with our optimal parameters show a minor discrepancy. The measured proton range R90 was 269.37 mm, compared to calculated ranges of 269.63 mm, 268.96 mm, and 270.85 mm with FLUKA, GATE and PHITS, respectively. Conclusion: We evaluated the dependence of the PDDs obtained with the GATE and PHITS general-purpose Monte Carlo codes on the customizing parameters by using the whole computational model of the treatment nozzle. The optimal parameters for the simulation were then defined by referring to the calculation results. The physical model, particle transport mechanics and the different geometry-based descriptions need accurate customization in all three simulation codes to agree with experimental data for artifact-free Monte Carlo simulation.
This study was supported by Grants-in-Aid for Cancer Research (H22-3rd Term Cancer Control-General-043) from the Ministry of Health, Labor and Welfare of Japan, Grants-in-Aid for Scientific Research (No. 23791419), and the JSPS Core-to-Core program (No. 23003). The authors have no conflict of interest.
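The R90 range quoted above (the depth at which the distal dose falls to 90% of maximum) is typically read off a sampled PDD curve by interpolation. A minimal sketch under invented data follows; the synthetic depth-dose values are not the Indiana University beam data.

```python
import numpy as np

def distal_r90(depth_mm, dose):
    """Depth on the distal falloff where the PDD crosses 90% of maximum.

    Illustrative linear interpolation between the two samples that
    bracket the 90% level beyond the dose maximum.
    """
    dose = np.asarray(dose, float) / np.max(dose)   # normalise to 1.0
    i_max = int(np.argmax(dose))
    for i in range(i_max, len(dose) - 1):
        if dose[i] >= 0.9 > dose[i + 1]:
            f = (dose[i] - 0.9) / (dose[i] - dose[i + 1])
            return depth_mm[i] + f * (depth_mm[i + 1] - depth_mm[i])
    raise ValueError("no distal 90% crossing found")

depth = np.arange(0, 300, 10.0)                       # mm, toy sampling
pdd = np.where(depth < 260, 0.7 + depth / 1000, 0.0)  # toy plateau + sharp falloff
print(round(distal_r90(depth, pdd), 1))
```

The same interpolation idea applies whichever code produced the PDD, which is why R90 is a convenient quantity for comparing FLUKA, GATE and PHITS against measurement.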

  15. CERN IRRADIATION FACILITIES.

    PubMed

    Pozzi, Fabio; Garcia Alia, Ruben; Brugger, Markus; Carbonez, Pierre; Danzeca, Salvatore; Gkotse, Blerina; Richard Jaekel, Martin; Ravotti, Federico; Silari, Marco; Tali, Maris

    2017-09-28

CERN provides unique irradiation facilities for applications in dosimetry, metrology, intercomparison of radiation protection devices, benchmarking of Monte Carlo codes and radiation damage studies of electronics. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  16. A neutron spectrum unfolding computer code based on artificial neural networks

    NASA Astrophysics Data System (ADS)

    Ortiz-Rodríguez, J. M.; Reyes Alfaro, A.; Reyes Haro, A.; Cervantes Viramontes, J. M.; Vega-Carrillo, H. R.

    2014-02-01

The Bonner Spheres Spectrometer consists of a thermal neutron sensor placed at the center of a number of moderating polyethylene spheres of different diameters. From the measured readings, information can be derived about the spectrum of the neutron field where the measurements were made. Disadvantages of the Bonner system are the weight associated with each sphere and the need to sequentially irradiate the spheres, requiring long exposure periods. Provided a well-established response matrix and adequate irradiation conditions, the most delicate part of neutron spectrometry is the unfolding process. The derivation of the spectral information is not simple because the unknown is not given directly as a result of the measurements. The drawbacks associated with traditional unfolding procedures have motivated the need for complementary approaches. Novel methods based on Artificial Intelligence, mainly Artificial Neural Networks, have been widely investigated. In this work, a neutron spectrum unfolding code based on neural network technology is presented. The code, called the Neutron Spectrometry and Dosimetry with Artificial Neural networks (NSDann) unfolding code, was designed with a graphical interface. The core of the code is an embedded neural network architecture previously optimized using the robust design of artificial neural networks methodology. The code is easy to use, friendly and intuitive to the user. It was designed for a Bonner Sphere System based on a 6LiI(Eu) neutron detector and a response matrix expressed in 60 energy bins taken from an International Atomic Energy Agency compilation. As input for unfolding the neutron spectrum, only the seven count rates measured with the seven Bonner spheres are required; simultaneously, the code calculates 15 dosimetric quantities as well as the total flux for radiation protection purposes.
The code generates a full report with all of the unfolding information in HTML format. The NSDann unfolding code is freely available upon request to the authors.
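The mapping the abstract describes (seven count rates in, a 60-bin spectrum plus 15 dosimetric quantities out) can be sketched as a small feed-forward network. The weights below are random placeholders; a real code such as NSDann would load its previously trained, optimized weights instead.

```python
import numpy as np

rng = np.random.default_rng(0)

def unfold(count_rates, w1, b1, w2, b2):
    """Toy feed-forward pass: 7 inputs -> hidden layer -> 75 outputs
    (60 spectrum bins followed by 15 dosimetric quantities).
    Weights here are untrained placeholders, for shape illustration only."""
    h = np.tanh(count_rates @ w1 + b1)   # hidden layer activations
    out = h @ w2 + b2                    # linear output layer
    return out[:60], out[60:]

w1, b1 = rng.normal(size=(7, 20)), np.zeros(20)   # hidden size 20 is arbitrary
w2, b2 = rng.normal(size=(20, 75)), np.zeros(75)
spectrum, quantities = unfold(rng.random(7), w1, b1, w2, b2)
print(len(spectrum), len(quantities))
```

The appeal of the approach is exactly this shape: once trained, unfolding reduces to a few matrix products, with no iterative regularised inversion at measurement time.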

  17. Monte Carlo-Based Dosimetry of Beta-Emitters for Intravascular Brachytherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choi, C.K.

    2002-06-25

Monte Carlo simulations for radiation dosimetry and the experimental verifications of the simulations have been developed for the treatment geometry of intravascular brachytherapy, a form of radionuclide therapy for occluded coronary disease (restenosis). The Monte Carlo code MCNP4C has been used to calculate the radiation dose from the encapsulated array of β-emitting seeds (Sr/Y source train). Solid water phantoms have been fabricated to measure the dose on radiochromic films that were exposed to the beta source train for both linear and curved coronary vessel geometries. While the dose difference for the 5-degree curved vessel at the prescription point of f+2.0 mm is within the 10% guideline set by the AAPM, the difference increased dramatically to 16.85% for the 10-degree case, which requires additional adjustment for acceptable dosimetry planning. The experimental dose measurements agree well with the simulation results.

  18. MONTE CARLO STUDY OF THE CARDIAC ABSORBED DOSE DURING X-RAY EXAMINATION OF AN ADULT PATIENT.

    PubMed

    Kadri, O; Manai, K; Alfuraih, A

    2016-12-01

The computational voxel phantom 'High-Definition Reference Korean-Man (HDRK-Man)' was implemented into the Monte Carlo transport toolkit Geant4. The voxel model, adjusted to the Reference Korean Man, is 171 cm in height and 68 kg in weight and composed of ∼30 million voxels whose size is 1.981 × 1.981 × 2.0854 mm³. The Geant4 code is then utilised to compute the dose conversion coefficients (DCCs), expressed in absorbed dose per air kerma free in air, for >30 tissues and organs, including almost all organs required in the new recommendation of ICRP 103, due to a broad parallel beam of monoenergetic photons impinging in the antero-posterior direction with energy ranging from 10 to 150 keV. The computed DCCs of different organs are found to be in good agreement with data published using other simulation codes. Also, the influence of patient size on DCC values was investigated for a representative body size of the adult Korean patient population. The study was performed using five different sizes covering the range of 0.8-1.2 magnification of the original HDRK-Man. It focussed on the computation of DCCs for the human heart. Moreover, the provided DCCs were used to present an analytical parameterisation for the calculation of the cardiac absorbed dose for any arbitrary X-ray spectrum and for those patient sizes. Thus, the present work can be considered as an enhancement of the continuous studies performed by medical physicists as part of quality control tests and radiation protection dosimetry. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
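One plausible reading of the spectrum step above is a fold of the monoenergetic DCC(E) table with per-energy air-kerma weights for the X-ray spectrum of interest. The sketch below assumes that interpretation; both the DCC table and the spectrum weights are invented numbers, not the HDRK-Man results.

```python
import numpy as np

# Hypothetical monoenergetic DCC table: absorbed dose per unit air kerma
# (Gy/Gy) on a coarse keV grid. Values are illustrative only.
energies = np.array([30.0, 50.0, 80.0, 100.0, 150.0])
dcc = np.array([0.02, 0.15, 0.55, 0.80, 1.10])

def spectrum_weighted_dcc(spec_e, spec_kerma_weight):
    """Mean DCC for a spectrum, given per-energy air-kerma weights.

    Linearly interpolates the monoenergetic table at the spectrum
    energies and forms the normalised weighted sum.
    """
    w = np.asarray(spec_kerma_weight, float)
    w = w / w.sum()                                   # normalise the weights
    return float(np.sum(np.interp(spec_e, energies, dcc) * w))

# toy spectrum with air kerma concentrated around 40-60 keV
print(round(spectrum_weighted_dcc([40.0, 50.0, 60.0], [0.2, 0.5, 0.3]), 4))
```

Multiplying the resulting mean DCC by the measured free-in-air kerma would then give the organ dose estimate for that spectrum and patient size.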

  19. Current status of kilovoltage (kV) radiotherapy in the UK: installed equipment, clinical workload, physics quality control and radiation dosimetry.

    PubMed

    Palmer, Antony L; Pearson, Michael; Whittard, Paul; McHugh, Katie E; Eaton, David J

    2016-12-01

To assess the status and practice of kilovoltage (kV) radiotherapy in the UK. 96% of the radiotherapy centres in the UK responded to a comprehensive survey. An analysis of the installed equipment base, patient numbers, clinical treatment sites, quality control (QC) testing and radiation dosimetry processes was undertaken. 73% of UK centres have at least one kV treatment unit, with 58 units installed across the UK. Although 35% of units are over 10 years old, 39% of units have been installed in the last 5 years. Approximately 6000 patients are treated with kV units in the UK each year, the most common site (44%) being basal cell carcinoma. A benchmark of QC practice in the UK is presented, against which individual centres can compare their procedures, frequency of testing and acceptable tolerance values. We propose the use of internal "notification" and "suspension" levels for analysis. All surveyed centres were using recommended Codes of Practice for kV dosimetry in the UK; approximately the same number were using in-air and in-water methodologies for medium energy, with two-thirds of all centres citing "clinical relevance" as the reason for their choice of code. 64% of centres had hosted an external dosimetry audit within the last 3 years, with only one centre never having been independently audited. The majority of centres use locally measured applicator factors and published backscatter factors for treatments. Monitor unit calculations are performed using software in only 36% of centres. A comprehensive review of current kV practice in the UK is presented. Advances in knowledge: Data and discussion on contemporary kV radiotherapy in the UK, with a particular focus on physics aspects.
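The monitor unit calculation mentioned above (done by hand in most centres, by software in 36%) is, in its simplest form, the prescribed dose divided by the product of reference output, applicator factor and backscatter factor. The factor values below are illustrative assumptions, not taken from any code of practice.

```python
def kv_monitor_units(prescribed_dose_gy, output_gy_per_mu,
                     applicator_factor, backscatter_factor):
    """Toy kV monitor-unit (timer) calculation.

    MU = D / (reference output x applicator factor x backscatter factor).
    All numeric inputs in the example call are hypothetical.
    """
    return prescribed_dose_gy / (output_gy_per_mu
                                 * applicator_factor
                                 * backscatter_factor)

# e.g. 4 Gy per fraction with a 0.01 Gy/MU reference output
mu = kv_monitor_units(4.0, 0.01, 0.95, 1.10)
print(round(mu, 1))
```

The survey's distinction between locally measured applicator factors and published backscatter factors maps directly onto the two correction terms in the denominator.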

  1. Sci-Thur PM: YIS - 07: Monte Carlo simulations to obtain several parameters required for electron beam dosimetry.

    PubMed

    Muir, B; Rogers, D; McEwen, M

    2012-07-01

When current dosimetry protocols were written, electron beam data were limited and had uncertainties that were unacceptable for reference dosimetry. Protocols for high-energy reference dosimetry are currently being updated, leading to considerable interest in accurate electron beam data. To this end, Monte Carlo simulations using the EGSnrc user-code egs_chamber are performed to extract relevant data for reference beam dosimetry. Calculations of the absorbed dose to water and the absorbed dose to the gas in realistic ion chamber models are performed as a function of depth in water for cobalt-60 and high-energy electron beams between 4 and 22 MeV. These calculations are used to extract several of the parameters required for electron beam dosimetry: the beam quality specifier R50, the beam quality conversion factors kQ and kR50, the electron quality conversion factor k'R50, the photon-electron conversion factor kecal, and the ion chamber perturbation factors PQ. The method used has the advantage that many important parameters can be extracted as a function of depth instead of being determined only at the reference depth, as has typically been done. Results obtained here are in good agreement with measured and other calculated results. The photon-electron conversion factors obtained for a Farmer-type NE2571 and the plane-parallel PTW Roos, IBA NACP-02 and Exradin A11 chambers are 0.903, 0.896, 0.894 and 0.906, respectively. These typically differ by less than 0.7% from the contentious TG-51 values but have much smaller systematic uncertainties. These results are valuable for reference dosimetry of high-energy electron beams. © 2012 American Association of Physicists in Medicine.
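Beam quality conversion factors are formed from exactly the paired quantities the abstract describes: kQ is the ratio of (dose to water / dose to chamber gas) in beam quality Q to the same ratio in cobalt-60. The arithmetic is sketched below with invented dose values; real values would come from egs_chamber-style simulations.

```python
def f_factor(dose_water, dose_gas):
    """f_Q = D_water / D_chamber-gas at a given beam quality."""
    return dose_water / dose_gas

def k_q(dw_q, dgas_q, dw_co, dgas_co):
    """kQ = f_Q / f_Co60, built from the two simulated dose pairs.

    All four inputs in the example call are made-up numbers that only
    demonstrate the ratio structure, not results from the abstract.
    """
    return f_factor(dw_q, dgas_q) / f_factor(dw_co, dgas_co)

kq = k_q(dw_q=1.000, dgas_q=0.910, dw_co=1.000, dgas_co=0.820)
print(round(kq, 3))
```

Because both dose pairs come from the same simulated geometry, many systematic components cancel in the ratio, which is why the abstract can claim much smaller systematic uncertainties than the tabulated protocol values.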

  2. SU-F-T-562: Validation of EPID-Based Dosimetry for FSRS Commissioning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Song, Y; Saleh, Z; Obcemea, C

Purpose: The prevailing approach to frameless SRS (fSRS) small field dosimetry is Gafchromic film. Though providing continuous information, its intrinsic uncertainties in fabrication, response, scanning, and calibration often make film dosimetry subject to different interpretations. In this study, we explored the feasibility of using EPID portal dosimetry as a viable alternative to film for small field dosimetry. Methods: Plans prescribing a dose of 21 Gy were created on a flat solid water phantom with Eclipse V11 and iPlan for small static square fields (1.0 to 3.0 cm). In addition, two clinical test plans were computed by employing iPlan on a CIRS Kesler head phantom for target dimensions of 1.2 cm and 2.0 cm. Corresponding portal dosimetry plans were computed using the Eclipse TPS and delivered on a Varian TrueBeam machine. EBT-XD film dosimetry was performed as a reference. The isocenter doses were measured using EPID, OSLD, stereotactic diode, and CC01 ion chamber. Results: EPID doses at the center of the square field were higher than the Eclipse TPS predicted portal doses, with the mean difference being 2.42±0.65%. Doses measured by EBT-XD film, OSLD, stereotactic diode, and CC01 ion chamber revealed smaller differences (except OSLDs), with mean differences being 0.36±3.11%, 4.12±4.13%, 1.7±2.76%, 1.45±2.37% for Eclipse and −1.36±0.85%, 2.38±4.2%, −0.03±0.50%, −0.27±0.78% for iPlan. The profiles measured by EPID and EBT-XD film resembled the TPS (Eclipse and iPlan) predicted ones within 3.0%. For the two clinical test plans, the EPID mean doses at the center of the field were 2.66±0.68% and 2.33±0.32% higher than the TPS predicted doses. Conclusion: We found that results obtained with EPID portal dosimetry were slightly higher (∼2%) than those obtained with EBT-XD film, diode, and CC01 ion chamber, with the exception of OSLDs, but well within the IROC tolerance (5.0%). Therefore, EPID has the potential to become a viable real-time alternative to film dosimetry.

  3. Organ dose calculations by Monte Carlo modeling of the updated VCH adult male phantom against idealized external proton exposure

    NASA Astrophysics Data System (ADS)

    Zhang, Guozhi; Liu, Qian; Zeng, Shaoqun; Luo, Qingming

    2008-07-01

The voxel-based visible Chinese human (VCH) adult male phantom has offered a high-quality test bed for realistic Monte Carlo modeling in radiological dosimetry simulations. The phantom has been updated in a recent effort by adding newly segmented organs, revising walled and smaller structures and recalibrating skeletal marrow distributions. The organ absorbed dose against external proton exposure was calculated at a voxel resolution of 2 × 2 × 2 mm³ using the MCNPX code for incident energies from 20 MeV to 10 GeV and for six idealized irradiation geometries: anterior-posterior (AP), posterior-anterior (PA), left-lateral (LLAT), right-lateral (RLAT), rotational (ROT) and isotropic (ISO). The effective dose on the VCH phantom was derived in compliance with the evaluation scheme for the reference male proposed in the 2007 recommendations of the International Commission on Radiological Protection (ICRP). Algorithm transitions from the revised radiation and tissue weighting factors account for approximately 90% and 10% of the effective dose discrepancies in proton dosimetry, respectively. Results are tabulated in terms of fluence-to-dose conversion coefficients for practical use and are compared with data from other models available in the literature. Anatomical variations between computational phantoms lead to dose discrepancies ranging from a negligible level to 100% or more at proton energies below 200 MeV, corresponding to the spatial geometric locations of individual organs within the body. Doses show better agreement at higher energies, with deviations mostly within 20%, for which organ volume and mass differences are primarily responsible. The impact of body size on dose distributions was assessed by dosimetry of a scaled-up VCH phantom that was resized in accordance with the height and total mass of the ICRP reference man. The organ dose decreases with the directionally uniform enlargement of voxels.
Potential pathways to improve the VCH phantom have also been briefly addressed. This work pertains to VCH-based systematic multi-particle dose investigations and will contribute to comparative dosimetry studies of ICRP standardized voxel phantoms in the near future.
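The ICRP 103 evaluation scheme referred to above combines per-tissue equivalent doses H_T with tissue weighting factors: E = Σ_T w_T H_T. The weighting factors below are the ICRP 103 values; the organ doses in the example are made-up numbers for illustration only.

```python
# ICRP Publication 103 tissue weighting factors (they sum to 1.0).
W_T = {"red bone marrow": 0.12, "colon": 0.12, "lung": 0.12,
       "stomach": 0.12, "breast": 0.12, "remainder": 0.12,
       "gonads": 0.08, "bladder": 0.04, "oesophagus": 0.04,
       "liver": 0.04, "thyroid": 0.04, "bone surface": 0.01,
       "brain": 0.01, "salivary glands": 0.01, "skin": 0.01}

def effective_dose(h_t):
    """E [Sv] from per-tissue equivalent doses h_t [Sv].

    Tissues absent from h_t are taken as zero dose, which understates E
    in a real assessment where every listed tissue receives some dose.
    """
    return sum(W_T[t] * h for t, h in h_t.items())

print(round(effective_dose({"lung": 1.0, "liver": 0.5, "skin": 0.2}), 3))
```

The ~90% discrepancy attributed to "algorithm transitions" in the abstract comes precisely from swapping the ICRP 60 radiation and tissue weights for these ICRP 103 values while the underlying absorbed doses stay fixed.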

  4. SU-E-T-399: Evaluation of Selection Criteria for Computational Human Phantoms for Use in Out-Of-Field Organ Dosimetry for Radiotherapy Patients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pelletier, C; Jung, J; Lee, C

    2015-06-15

Purpose: To quantify the dosimetric uncertainty due to organ position errors when using height and weight as phantom selection criteria in the UF/NCI Hybrid Phantom Library for the purpose of out-of-field organ dose reconstruction. Methods: Four diagnostic patient CT images were used to create 7-field IMRT plans. For each patient, dose to the liver, right lung, and left lung were calculated using the XVMC Monte Carlo code. These doses were taken to be the ground truth. For each patient, the phantom with the most closely matching height and weight was selected from the body-size-dependent phantom library. The patient plans were then transferred to the computational phantoms and organ doses were recalculated. Each plan was also run on 4 additional phantoms with reference heights and/or weights. Maximum and mean doses for the three organs were computed, and the DVHs were extracted and compared. One-sample t-tests were performed to compare the accuracy of the height and weight matched phantoms against the additional phantoms with regard to both maximum and mean dose. Results: For one of the patients, the height and weight matched phantom yielded the most accurate results across all three organs for both maximum and mean doses. For two additional patients, the matched phantom yielded the best match for one organ only. In 13 of the 24 cases, the matched phantom yielded better results than the average of the other four phantoms, though the results were only statistically significant at the 0.05 level for three cases. Conclusion: Using height and weight matched phantoms does yield better results for out-of-field dosimetry than using average phantoms. Height and weight appear to be moderately good selection criteria, though these criteria failed to yield any better results for one patient.
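The matching step the abstract evaluates can be sketched as a nearest-neighbour lookup in (height, weight) space. The tiny library below is hypothetical, as is the normalisation choice of 10 cm / 10 kg per unit distance; it only illustrates the selection logic, not the UF/NCI library itself.

```python
def select_phantom(height_cm, weight_kg, library):
    """Return the name of the library phantom nearest in (height, weight).

    Distances are normalised by illustrative scales (10 cm, 10 kg) so
    neither dimension dominates; a real library might weight differently.
    """
    def dist(p):
        return (((height_cm - p["h"]) / 10) ** 2
                + ((weight_kg - p["w"]) / 10) ** 2)
    return min(library, key=dist)["name"]

# hypothetical three-phantom library
library = [{"name": "adult-160-60", "h": 160, "w": 60},
           {"name": "adult-170-70", "h": 170, "w": 70},
           {"name": "adult-180-90", "h": 180, "w": 90}]
print(select_phantom(172, 74, library))
```

The study's finding that matched phantoms beat reference-sized ones in only 13 of 24 cases is a reminder that closeness in this two-dimensional space does not guarantee closeness in internal organ position.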

  5. Using NJOY to Create MCNP ACE Files and Visualize Nuclear Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kahler, Albert Comstock

    We provide lecture materials that describe the input requirements to create various MCNP ACE files (Fast, Thermal, Dosimetry, Photo-nuclear and Photo-atomic) with the NJOY Nuclear Data Processing code system. Input instructions to visualize nuclear data with NJOY are also provided.

  6. A resource facility for kinetic analysis: modeling using the SAAM computer programs.

    PubMed

    Foster, D M; Boston, R C; Jacquez, J A; Zech, L

    1989-01-01

    Kinetic analysis and integrated system modeling have contributed significantly to understanding the physiology and pathophysiology of metabolic systems in humans and animals. Many experimental biologists are aware of the usefulness of these techniques and recognize that kinetic modeling requires special expertise. The Resource Facility for Kinetic Analysis (RFKA) provides this expertise through: (1) development and application of modeling technology for biomedical problems, and (2) development of computer-based kinetic modeling methodologies concentrating on the computer program Simulation, Analysis, and Modeling (SAAM) and its conversational version, CONversational SAAM (CONSAM). The RFKA offers consultation to the biomedical community in the use of modeling to analyze kinetic data and trains individuals in using this technology for biomedical research. Early versions of SAAM were widely applied in solving dosimetry problems; many users, however, are not familiar with recent improvements to the software. The purpose of this paper is to acquaint biomedical researchers in the dosimetry field with RFKA, which, together with the joint National Cancer Institute-National Heart, Lung and Blood Institute project, is overseeing SAAM development and applications. In addition, RFKA provides many service activities to the SAAM user community that are relevant to solving dosimetry problems.
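The compartmental kinetics that SAAM-style tools fit can be illustrated at their simplest by a one-compartment model with first-order loss, q(t) = q0·e^(−kt), whose time integral per unit q0 (the residence time relevant to dosimetry) is 1/k. The rate constant and initial amount below are arbitrary illustrative values, not outputs of SAAM.

```python
import math

def amount(t_h, q0=1.0, k_per_h=0.1):
    """Amount remaining in a one-compartment model at time t (hours),
    assuming first-order elimination with rate constant k_per_h."""
    return q0 * math.exp(-k_per_h * t_h)

def residence_time(k_per_h=0.1):
    """Integral of q(t)/q0 over all time; equals 1/k for first-order loss."""
    return 1.0 / k_per_h

print(round(amount(10.0), 4), residence_time())
```

In internal dosimetry, that residence time multiplied by an S value per decay is what turns fitted kinetics into an absorbed dose estimate, which is why kinetic modeling tools like SAAM were so widely applied to dosimetry problems.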

  7. Radiological assessment. A textbook on environmental dose analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Till, J.E.; Meyer, H.R.

    1983-09-01

Radiological assessment is the quantitative process of estimating the consequences to humans resulting from the release of radionuclides to the biosphere. It is a multidisciplinary subject requiring the expertise of a number of individuals in order to predict source terms, describe environmental transport, calculate internal and external dose, and extrapolate dose to health effects. Until now, no comprehensive book has been available describing, at a uniform and comprehensive level, the techniques and models used in radiological assessment. Radiological Assessment is based on material presented at the 1980 Health Physics Society Summer School held in Seattle, Washington. The material has been expanded and edited to make it comprehensive in scope and useful as a text. Topics covered include (1) source terms for nuclear facilities and medical and industrial sites; (2) transport of radionuclides in the atmosphere; (3) transport of radionuclides in surface waters; (4) transport of radionuclides in groundwater; (5) terrestrial and aquatic food chain pathways; (6) reference man: a system for internal dose calculations; (7) internal dosimetry; (8) external dosimetry; (9) models for special-case radionuclides; (10) calculation of health effects in irradiated populations; (11) evaluation of uncertainties in environmental radiological assessment models; (12) regulatory standards for environmental releases of radionuclides; (13) development of computer codes for radiological assessment; and (14) assessment of accidental releases of radionuclides.

  8. TLD efficiency calculations for heavy ions: an analytical approach

    DOE PAGES

    Boscolo, Daria; Scifoni, Emanuele; Carlino, Antonio; ...

    2015-12-18

The use of thermoluminescent dosimeters (TLDs) in heavy charged particle dosimetry is limited by their non-linear dose response curve and by their response dependence on the radiation quality. Thus, in order to use TLDs with particle beams, a model that can reproduce the behavior of these detectors under different conditions is needed. Here a new, simple and completely analytical algorithm for the calculation of the relative TL efficiency depending on the ion charge Z and energy E is presented. In addition, the detector response is evaluated starting from the single-ion case, where the computed effectiveness values have been compared with experimental data as well as with predictions from a different method. The main advantage of this approach is that, being fully analytical, it is computationally fast and can be efficiently integrated into treatment planning verification tools. The calculated efficiency values have then been implemented in the treatment planning code TRiP98, and dose calculations on a macroscopic target irradiated with an extended carbon ion field have been performed and verified against experimental data.

  9. Calibration of modified Liulin detector for cosmic radiation measurements on-board aircraft.

    PubMed

    Kyselová, D; Ambrožová, I; Krist, P; Kubančák, J; Uchihori, Y; Kitamura, H; Ploc, O

    2015-06-01

The annual effective doses of aircrew members often exceed the limit of 1 mSv for the public due to the increased level of cosmic radiation at flight altitudes, and it is thus recommended to monitor them. Aircrew dosimetry is usually performed using special computer programs mostly based on results of Monte Carlo simulations. Currently, detectors are used mostly for validation of these computer codes, verification of effective dose calculations and for research purposes. One such detector is the active silicon semiconductor deposited-energy spectrometer Liulin. The output quantities of measurement with the Liulin detector are the absorbed dose in silicon D and the ambient dose equivalent H*(10); to determine them, two calibrations are necessary. The purpose of this work was to develop a calibration methodology that can be used to convert the signal from the detector to D independently of the calibration performed at the Heavy Ion Medical Accelerator facility in Chiba, Japan.
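The basic conversion a Liulin-type spectrometer performs is the summed deposited energy divided by the sensitive mass of the silicon diode: D [Gy] = Σ_i N_i·E_i [J] / m [kg]. The channel energies, counts and detector mass below are invented for illustration, not Liulin calibration data.

```python
MEV_TO_J = 1.602176634e-13  # exact CODATA conversion, J per MeV

def dose_si_gy(channel_energy_mev, counts, detector_mass_kg):
    """Absorbed dose in silicon from a deposited-energy spectrum.

    Sums counts x channel energy over all channels, converts MeV to J,
    and divides by the sensitive mass. Inputs in the example are toy values.
    """
    e_dep_j = sum(n * e for n, e in zip(counts, channel_energy_mev)) * MEV_TO_J
    return e_dep_j / detector_mass_kg

# toy three-channel deposited-energy spectrum, 0.5 g silicon diode
d = dose_si_gy([0.1, 0.5, 1.0], [1000, 200, 50], 5e-4)
print(f"{d:.3e} Gy")
```

Converting D in silicon to the ambient dose equivalent H*(10) is the second, field-dependent calibration the abstract refers to, and cannot be reduced to a fixed constant like the sum above.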

  10. Development of the voxel computational phantoms of pediatric patients and their application to organ dose assessment

    NASA Astrophysics Data System (ADS)

    Lee, Choonik

A series of realistic voxel computational phantoms of pediatric patients were developed and then used for radiation risk assessment for various exposure scenarios. High-resolution computed tomographic images of live patients were utilized for the development of the five voxel phantoms of pediatric patients: 9-month male, 4-year female, 8-year female, 11-year male, and 14-year male. The phantoms were first developed as head and torso phantoms and then extended into whole body phantoms by utilizing computed tomographic images of a healthy adult volunteer. The whole body phantom series was modified to have the same anthropometry as the most recent reference data reported by the International Commission on Radiological Protection. The phantoms, named the University of Florida series B, are the first complete set of pediatric voxel phantoms having reference organ masses and total heights. As part of the dosimetry study, an investigation of skeletal tissue dosimetry methods was performed for better understanding of the radiation dose to the active bone marrow and bone endosteum. All of the currently available methodologies were inter-compared and benchmarked with the paired-image radiation transport model. The dosimetric characteristics of the phantoms were investigated using Monte Carlo simulations of broad parallel external photon beams in anterior-posterior, posterior-anterior, left lateral, right lateral, rotational, and isotropic geometries. Organ dose conversion coefficients were calculated for an extensive range of photon energies and compared with those of the conventional stylized pediatric phantoms of Oak Ridge National Laboratory. Multi-slice helical computed tomography exams were simulated using a Monte Carlo simulation code for various exam protocols: head, chest, abdomen, pelvis, and chest-abdomen-pelvis studies. The results provide realistic estimates of the effective doses for frequently used protocols in pediatric radiology.
The results were very crucial in understanding the radiation risks of the patients undergoing computed tomography. Finally, nuclear medicine simulations were performed by calculating specific absorbed fractions for multiple target-source organ pairs via Monte Carlo simulations. Specific absorbed fractions were calculated for both photon and electron so that they can be used to calculated radionuclide S-values. All of the results were tabulated for future uses and example dose assessment was performed for selected nuclides administered in nuclear medicine.
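
    The last step described above, combining tabulated specific absorbed fractions (SAFs) with a nuclide's emission spectrum to obtain radionuclide S values, can be sketched as below. This is a minimal illustration of the MIRD-style calculation; the function and all numeric inputs are assumptions for demonstration, not values from the study.

```python
MEV_TO_J = 1.602176634e-13  # joules per MeV

def s_value(emissions):
    """S value in Gy per decay for one target<-source organ pair.

    emissions: iterable of (energy_MeV, yield_per_decay, saf_per_kg),
    where saf_per_kg is the specific absorbed fraction for this organ
    pair at that emission energy, in 1/kg.
    """
    return sum(E * y * phi for E, y, phi in emissions) * MEV_TO_J

# Hypothetical single-photon emitter: 140 keV photon, 89% yield,
# SAF of 0.12 kg^-1 for the chosen target<-source pair.
print(s_value([(0.140, 0.89, 0.12)]))
```

    Multiplying such an S value by the cumulated activity in the source organ gives the absorbed dose contribution to the target organ.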

  11. Analysis of regional radiotherapy dosimetry audit data and recommendations for future audits

    PubMed Central

    Palmer, A; Mzenda, B; Kearton, J; Wills, R

    2011-01-01

    Objectives Regional interdepartmental dosimetry audits within the UK provide basic assurances of the dosimetric accuracy of radiotherapy treatments. Methods This work reviews several years of audit results from the South East Central audit group, including megavoltage (MV) and kilovoltage (kV) photons, electrons and iodine-125 seeds. Results Apart from some minor systematic errors that were resolved, the results of all audits have been within protocol tolerances, confirming the long-term stability and agreement of basic radiation dosimetric parameters between centres in the audit region. There is some evidence of improvement in radiation dosimetry with the adoption of newer codes of practice. Conclusion The value of current audit methods and the limitations of peer-to-peer auditing are discussed, particularly the influence of the audit schedule on the results obtained, where no “gold standard” exists. Recommendations are made for future audits, including an essential requirement to maintain the monitoring of fundamental dosimetry, such as MV photon and electron output; audits must also be developed to include new treatment technologies such as image-guided radiotherapy and to address the most common sources of error in radiotherapy. PMID:21159805

  12. Data Packages for the Hanford Immobilized Low Activity Tank Waste Performance Assessment 2001 Version [SEC 1 THRU 5]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MANN, F.M.

    Data packages supporting the 2001 Immobilized Low-Activity Waste Performance Assessment. Geology, hydrology, geochemistry, facility, waste form, and dosimetry data based on recent investigations are provided. Verification and benchmarking packages for selected software codes are also provided.

  13. Use of the FLUKA Monte Carlo code for 3D patient-specific dosimetry on PET-CT and SPECT-CT images*

    PubMed Central

    Botta, F; Mairani, A; Hobbs, R F; Vergara Gil, A; Pacilio, M; Parodi, K; Cremonesi, M; Coca Pérez, M A; Di Dia, A; Ferrari, M; Guerriero, F; Battistoni, G; Pedroli, G; Paganelli, G; Torres Aroche, L A; Sgouros, G

    2014-01-01

    Patient-specific absorbed dose calculation for nuclear medicine therapy is a topic of increasing interest. 3D dosimetry at the voxel level is one of the major improvements for the development of more accurate calculation techniques, as compared to the standard dosimetry at the organ level. This study aims to use the FLUKA Monte Carlo code to perform patient-specific 3D dosimetry through direct Monte Carlo simulation on PET-CT and SPECT-CT images. To this aim, dedicated routines were developed in the FLUKA environment. Two sets of simulations were performed on model and phantom images. Firstly, the correct handling of PET and SPECT images was tested under the assumption of homogeneous water medium by comparing FLUKA results with those obtained with the voxel kernel convolution method and with other Monte Carlo-based tools developed for the same purpose (the EGS-based 3D-RD software and the MCNP5-based MCID). Afterwards, the correct integration of the PET/SPECT and CT information was tested, performing direct simulations on PET/CT images for both homogeneous (water) and non-homogeneous (water with air, lung and bone inserts) phantoms. Comparison was performed with the other Monte Carlo tools performing direct simulation as well. The absorbed dose maps were compared at the voxel level. In the case of homogeneous water, by simulating 10^8 primary particles a 2% average difference with respect to the kernel convolution method was achieved; this difference was lower than the statistical uncertainty affecting the FLUKA results. The agreement with the other tools was within 3-4%, partially ascribable to the differences among the simulation algorithms. Including the CT-based density map, the average difference was always within 4% irrespective of the medium (water, air, bone), except for a maximum 6% value when comparing FLUKA and 3D-RD in air.
The results confirmed that the routines were properly developed, opening the way for the use of FLUKA for patient-specific, image-based dosimetry in nuclear medicine. PMID:24200697
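
    The voxel-level comparison of absorbed dose maps described above can be sketched as a mean absolute relative difference over voxels above a dose floor. The helper function and the tiny flattened maps below are hypothetical, chosen only to illustrate the metric.

```python
def mean_abs_rel_diff(dose_a, dose_b, floor=0.0):
    """Mean absolute relative difference (%) between two voxel dose maps,
    given as flattened 1D sequences. Voxels whose reference dose is at or
    below `floor` are skipped, to avoid dividing by near-zero doses in
    voxels outside the phantom."""
    pairs = [(a, b) for a, b in zip(dose_a, dose_b) if b > floor]
    return 100.0 * sum(abs(a - b) / b for a, b in pairs) / len(pairs)

fluka_like  = [1.02, 0.98, 0.50, 0.0]  # hypothetical dose map A (Gy)
kernel_like = [1.00, 1.00, 0.51, 0.0]  # hypothetical reference map B (Gy)
print(mean_abs_rel_diff(fluka_like, kernel_like, floor=1e-6))
```

    In a real comparison the observed difference would also be weighed against the per-voxel statistical uncertainty of the Monte Carlo result, as the authors do.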

  14. Comparison of Flattening Filter (FF) and Flattening-Filter-Free (FFF) 6 MV photon beam characteristics for small field dosimetry using EGSnrc Monte Carlo code

    NASA Astrophysics Data System (ADS)

    Sangeetha, S.; Sureka, C. S.

    2017-06-01

    The present study compares the characteristics of Varian Clinac 600 C/D flattened and unflattened 6 MV photon beams for small field dosimetry using EGSnrc Monte Carlo simulation, since small field dosimetry is considered among the most crucial and demanding tasks in radiation dosimetry. The 6 MV photon beam of a Varian Clinac 600 C/D medical linear accelerator operating in Flattening Filter (FF) and Flattening-Filter-Free (FFF) modes was modelled using the EGSnrc Monte Carlo user codes (BEAMnrc and DOSXYZnrc), with source parameters optimized by an educated trial-and-error method. The calculated beam characteristics include percentage depth dose, lateral beam profile, dose rate delivery, photon energy spectra, photon beam uniformity, out-of-field dose, surface dose, penumbral dose and output factor for small fields (0.5×0.5 cm2 to 4×4 cm2), compared with magna-field sizes (5×5 cm2 to 40×40 cm2) at various depths. The optimized beam energy and full-width-half-maximum value for both small field and magna-field dosimetry were found to be 5.7 MeV and 0.13 cm for both FF and FFF beams. The depth of dose maximum for small field sizes deviates minimally for both FF and FFF beams, as for magna-fields. At depths greater than dmax, FFF beams show a steeper dose fall-off in the exponential region than FF beams, and this deviation increases with field size. The shape of the lateral beam profile remains similar for FF and FFF beams at field sizes smaller than 4×4 cm2, whereas it differs in the case of magna-fields. Dose rate delivery for FFF beams shows a roughly two-fold increase for both small field and magna-field sizes. The surface doses of FFF beams were higher than those of FF beams for small field sizes, but lower for magna-fields. The amount of out-of-field dose reduction increases with field size. It is also observed that the photon energy spectrum increases with field size for the FFF beam mode. Finally, the output factors for FFF beams were considerably lower than for FF beams at small field sizes, whereas they were higher for magna-field sizes. From this study, it is concluded that FFF beams show minimal deviations in the treatment field region relative to the normal tissue region for small field dosimetry compared with FF beams. The most prominent observation is that the shape of the beam profile remains similar for FF and FFF beams at small field sizes, which supports more accurate treatment planning for IMRT (Intensity-Modulated Radiation Therapy), IGAT (Image-Guided Adaptive Radiation Therapy), SBRT (Stereotactic Body Radiation Therapy), SRS (Stereotactic Radio Surgery), and Tomotherapy techniques, where a homogeneous (flattened) dose profile is not necessary. On the whole, the determination of the dosimetric beam characteristics of the Varian linac machine using Monte Carlo simulation provides accurate dose calculations to serve as clinical golden beam data.
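
    Two of the profile metrics discussed above, the 80%-20% penumbra width and the field flatness, can be computed from a sampled lateral profile as sketched below. The sample positions and doses are hypothetical, and the definitions are simplified (left edge only, flatness over a pre-selected central region).

```python
def edge_crossing(positions, norm_dose, level):
    """Position where the rising (left) profile edge crosses `level`,
    found by linear interpolation between adjacent samples."""
    for i in range(len(norm_dose) - 1):
        if norm_dose[i] < level <= norm_dose[i + 1]:
            frac = (level - norm_dose[i]) / (norm_dose[i + 1] - norm_dose[i])
            return positions[i] + frac * (positions[i + 1] - positions[i])
    raise ValueError("level not crossed on rising edge")

def penumbra(positions, dose):
    """80%-20% penumbra width (same units as positions)."""
    dmax = max(dose)
    norm = [d / dmax for d in dose]
    return edge_crossing(positions, norm, 0.8) - edge_crossing(positions, norm, 0.2)

def flatness(central_doses):
    """Flatness (%) = 100*(Dmax-Dmin)/(Dmax+Dmin) over the central region."""
    return 100.0 * (max(central_doses) - min(central_doses)) / (max(central_doses) + min(central_doses))

x = [-3, -2, -1, 0, 1, 2, 3]                 # off-axis position, cm (hypothetical)
d = [0.0, 0.5, 1.0, 1.0, 1.0, 0.5, 0.0]      # relative dose (hypothetical)
print(penumbra(x, d))
print(flatness(d[2:5]))
```

    For the trapezoidal test profile the penumbra evaluates to 1.2 cm and the flatness of the perfectly flat central region to 0%.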

  15. Depth dose distribution study within a phantom torso after irradiation with a simulated Solar Particle Event at NSRL

    NASA Astrophysics Data System (ADS)

    Berger, Thomas; Matthiä, Daniel; Koerner, Christine; George, Kerry; Rhone, Jordan; Cucinotta, Francis A.; Reitz, Guenther

    Adequate knowledge of the radiation environment and the doses incurred during a space mission is essential for estimating an astronaut's health risk. The space radiation environment is complex and variable, and exposures inside the spacecraft and the astronaut's body are compounded by the interactions of the primary particles with the atoms of the structural materials and with the body itself. Astronauts' radiation exposures are measured by means of personal dosimetry, but there remains substantial uncertainty associated with the computational extrapolation of skin dose to organ dose, which can lead to over- or under-estimation of the health risk. Comparisons of models to data have shown that the astronaut's effective dose (E) can be predicted to within about ±10%. In the research experiment "Depth dose distribution study within a phantom torso" at the NASA Space Radiation Laboratory (NSRL) at Brookhaven National Laboratory (BNL), USA, the large 1972 SPE spectrum was simulated using seven different proton energies from 50 up to 450 MeV. A phantom torso constructed of natural bones and realistic distributions of human tissue equivalent materials, comparable to the torso of the MATROSHKA phantom currently on the ISS, was equipped with a comprehensive set of thermoluminescence detectors and human cells. The detectors are applied to assess the depth dose distribution, and radiation transport codes (e.g. GEANT4) are used to assess the radiation field and its interactions with the phantom torso. Lymphocyte cells are strategically embedded at selected locations at the skin and internal organs and are processed after irradiation to assess the effects of shielding on the yield of chromosome damage. The first focus of the presented experiment is to correlate biological results with physical dosimetry measurements in the phantom torso. Furthermore, the passive dosimetry results obtained with anthropomorphic phantoms provide reliable data for benchmarking computational radiation transport models in a radiation field of interest. The presentation will give first results of the physical dose distribution, the comparison with GEANT4 computer simulations based on a voxel model of the phantom, and a comparison with the data from the chromosome aberration study. The help and support of Adam Russek and Michael Sivertz of the NASA Space Radiation Laboratory (NSRL), Brookhaven, USA during the setup and the irradiation of the phantom are highly appreciated. The voxel model describing the human phantom used for the GEANT4 simulations was kindly provided by Monika Puchalska (CHALMERS, Gothenburg, Sweden).

  16. Internal photon and electron dosimetry of the newborn patient—a hybrid computational phantom study

    NASA Astrophysics Data System (ADS)

    Wayson, Michael; Lee, Choonsik; Sgouros, George; Treves, S. Ted; Frey, Eric; Bolch, Wesley E.

    2012-03-01

    Estimates of radiation absorbed dose to organs of the nuclear medicine patient are a requirement for administered activity optimization and for stochastic risk assessment. Pediatric patients, and in particular the newborn child, represent that portion of the patient population where such optimization studies are most crucial owing to the enhanced tissue radiosensitivities and longer life expectancies of this patient subpopulation. In cases where whole-body CT imaging is not available, phantom-based calculations of radionuclide S values (absorbed dose to a target tissue per nuclear transformation in a source tissue) are required for dose and risk evaluation. In this study, a comprehensive model of electron and photon dosimetry of the reference newborn child is presented based on a high-resolution hybrid-voxel phantom from the University of Florida (UF) patient model series. Values of photon specific absorbed fraction (SAF) were assembled for both the reference male and female newborn using the radiation transport code MCNPX v2.6. Values of electron SAF were assembled in a unique and time-efficient manner whereby the collisional and radiative components of organ dose, for both self- and cross-dose terms, were computed separately. Doses to the newborn skeletal tissues were assessed via fluence-to-dose response functions reported for the first time in this study. Values of photon and electron SAFs were used to assemble a complete set of S values for some 16 radionuclides commonly associated with molecular imaging of the newborn. These values were then compared to those available in the OLINDA/EXM software. S value ratios for organ self-dose ranged from 0.46 to 1.42, while similar ratios for organ cross-dose varied from a low of 0.04 to a high of 3.49. These large discrepancies are due in large part to the simplistic organ modeling in the stylized newborn model used in the OLINDA/EXM software. A comprehensive model of internal dosimetry is presented in this study for the newborn nuclear medicine patient based upon the UF hybrid computational phantom. Photon dose response functions, photon and electron SAFs, and tables of radionuclide S values for the newborn child (both male and female) are given in a series of four electronic annexes available at stacks.iop.org/pmb/57/1433/mmedia. These values can be applied to optimization studies of image quality and stochastic risk for this most vulnerable class of pediatric patients.

  17. FLUKA simulation studies on in-phantom dosimetric parameters of a LINAC-based BNCT

    NASA Astrophysics Data System (ADS)

    Ghal-Eh, N.; Goudarzi, H.; Rahmani, F.

    2017-12-01

    The Monte Carlo simulation code FLUKA, version 2011.2c.5, has been used to estimate the in-phantom dosimetric parameters for use in BNCT studies. The in-phantom parameters of a typical Snyder head phantom, which are necessary information prior to any clinical treatment, have been calculated with both the FLUKA and MCNPX codes, with promising agreement between the two. The results confirm that FLUKA can be regarded as a good alternative to MCNPX in BNCT dosimetry simulations.

  18. Estimation of electromagnetic dosimetric values from non-ionizing radiofrequency fields in an indoor commercial airplane environment.

    PubMed

    Aguirre, Erik; Arpón, Javier; Azpilicueta, Leire; López, Peio; de Miguel, Silvia; Ramos, Victoria; Falcone, Francisco

    2014-12-01

    In this article, the impact of the topology and morphology of a complex indoor environment, such as a commercial aircraft, on the estimation of dosimetric assessment is presented. By means of an in-house developed deterministic 3D ray-launching code, the electric field amplitude as a function of position is estimated for the complete volume of a commercial passenger airplane. Estimation of electromagnetic field exposure in this environment is challenging due to the complexity and size of the scenario, as well as the large metallic content, which gives rise to strong multipath components. By performing the calculation with a deterministic technique, the complete scenario can be considered with an optimized balance between accuracy and computational cost. The proposed method can aid in the assessment of electromagnetic dosimetry for the future deployment of embarked wireless systems in commercial aircraft.
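
    Once the field amplitude at a point is known, a standard dosimetric bookkeeping step is to convert it to a plane-wave equivalent power density and compare it against a field reference level. The sketch below assumes an RMS field value and a 61 V/m reference level (a common general-public level for microwave frequencies); both numbers are illustrative assumptions, not values from the article.

```python
Z0 = 376.73  # impedance of free space, ohms

def power_density(e_rms_vpm):
    """Plane-wave equivalent power density (W/m^2) from an RMS E-field (V/m)."""
    return e_rms_vpm ** 2 / Z0

def exposure_quotient(e_rms_vpm, reference_level_vpm):
    """Fraction of the squared field reference level used (dimensionless);
    values below 1 indicate exposure under the assumed limit."""
    return (e_rms_vpm / reference_level_vpm) ** 2

# Hypothetical cabin field of 2 V/m against an assumed 61 V/m reference level
print(power_density(2.0))
print(exposure_quotient(2.0, 61.0))
```

    In multi-source scenarios the exposure quotients of the individual emitters are summed before comparing against 1.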

  19. Dosimetry quality audit of high energy photon beams in greek radiotherapy centers.

    PubMed

    Hourdakis, Constantine J; Boziari, A

    2008-04-01

    Dosimetry quality audits and intercomparisons in radiotherapy centers are a useful tool for enhancing confidence in accurate therapy and for exploring and resolving discrepancies in dose delivery. This is the first national comprehensive study carried out in Greece. From 2002 to 2006 the Greek Atomic Energy Commission performed a dosimetry quality audit of high energy external photon beams in all (23) Greek radiotherapy centers, where 31 linacs and 13 Co-60 teletherapy units were assessed in terms of their mechanical performance characteristics and relative and absolute dosimetry. The quality audit in dosimetry of external photon beams took place by means of on-site visits, where certain parameters of the photon beams were measured, calculated and assessed according to a specific protocol and the IAEA TRS-398 dosimetry code of practice. In each radiotherapy unit (linac or Co-60), certain functional parameters were measured and the results were compared to tolerance values and limits. Doses in water under reference and non-reference conditions were measured and compared to the stated values. Also, the treatment planning systems (TPS) were evaluated with respect to irradiation time calculations. The results of the mechanical tests, dosimetry measurements and TPS evaluation are presented in this work and discussed in detail. This study showed that Co-60 units had worse mechanical performance characteristics than linacs: 28% of all irradiation units (23% of linacs and 42% of Co-60 units) exceeded the acceptance limit in at least one mechanical parameter. Dosimetric accuracy was also much worse in Co-60 units than in linacs. 61% of the Co-60 units exhibited deviations outside ±3% and 31% outside ±5%; the corresponding percentages for the linacs were 24% and 7%, respectively. The results were grouped for each hospital, and the sources of errors (functional and human) have been investigated and discussed in detail.
    This quality audit proved to be a useful tool for the improvement of quality in radiotherapy. It succeeded in disseminating the IAEA TRS-398 protocol to nearly all radiotherapy centers, achieving homogenization and consistency of dosimetry within the country. It also detected discrepancies in dosimetry and provided guidance and recommendations to eliminate sources of errors. Finally, it proved that quality assurance programs, periodic quality control tests, maintenance and service play an important role in achieving accuracy and safe operation in radiotherapy.
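
    The audit statistics quoted above (fraction of units with dose deviations outside ±3% and ±5%) reduce to a simple tally over the measured-versus-stated deviations. The helper below and the deviation list are hypothetical, shown only to make the bookkeeping concrete.

```python
def pct_outside(deviations_pct, tolerance_pct):
    """Percentage of units whose dose deviation exceeds +/- tolerance_pct."""
    n_out = sum(1 for d in deviations_pct if abs(d) > tolerance_pct)
    return 100.0 * n_out / len(deviations_pct)

# Hypothetical measured-vs-stated dose deviations (%) for a set of Co-60 units
co60 = [-6.1, -4.2, 3.5, -3.8, 1.0, -5.5, 2.9, -4.9, 0.5, -3.4, 2.0, -1.1, -3.2]
print(pct_outside(co60, 3.0))  # fraction outside the +/-3% action level
print(pct_outside(co60, 5.0))  # fraction outside the +/-5% action level
```

    Reporting both levels, as the audit does, separates units needing investigation from those needing immediate corrective action.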

  20. Multiscale integral analysis of a HT leakage in a fusion nuclear power plant

    NASA Astrophysics Data System (ADS)

    Velarde, M.; Fradera, J.; Perlado, J. M.; Zamora, I.; Martínez-Saban, E.; Colomer, C.; Briani, P.

    2016-05-01

    This work presents an example of the application of an integral methodology based on a multiscale analysis that covers the whole tritium cycle within a nuclear fusion power plant, from the micro scale, analyzing key components from which tritium leaks through permeation, to the macro scale, considering its atmospheric transport. A leakage from the nuclear power plant (NPP) primary side to the secondary side of a heat exchanger (HEX) is considered for the present example. Both primary and secondary loop coolants are assumed to be He. The leak is located inside the HEX, releasing tritium in elemental tritium (HT) form to the secondary loop, where it permeates through the piping structural material to the exterior. The Heating, Ventilation and Air Conditioning (HVAC) system removes the leaked tritium towards the NPP exhaust. The HEX is modelled with system codes coupled to Computational Fluid Dynamics (CFD) simulations to account for tritium dispersion inside the NPP buildings and in the site environment. Finally, tritium dispersion is calculated with an atmospheric transport code and a dosimetry analysis is carried out. Results show how the implemented methodology is capable of assessing the impact of tritium from the microscale to the atmospheric scale, including the dosimetric aspect.
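
    The atmospheric transport step mentioned above is often approximated, at its simplest, by a Gaussian plume model. The sketch below is a generic ground-reflecting Gaussian plume, not the code used in the paper, and all release parameters are hypothetical.

```python
import math

def plume_concentration(Q, u, y, z, H, sigma_y, sigma_z):
    """Ground-reflecting Gaussian plume concentration (e.g. Bq/m^3) at
    crosswind offset y (m) and height z (m), for release rate Q (Bq/s),
    wind speed u (m/s), effective release height H (m), and dispersion
    parameters sigma_y, sigma_z (m) evaluated at the downwind distance."""
    lateral = math.exp(-y * y / (2.0 * sigma_y ** 2))
    vertical = (math.exp(-((z - H) ** 2) / (2.0 * sigma_z ** 2))
                + math.exp(-((z + H) ** 2) / (2.0 * sigma_z ** 2)))  # ground reflection term
    return Q / (2.0 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Hypothetical HT release: 1e9 Bq/s, 3 m/s wind, 30 m effective stack height
print(plume_concentration(1e9, 3.0, y=0.0, z=1.5, H=30.0, sigma_y=80.0, sigma_z=40.0))
```

    The resulting air concentration at a receptor, multiplied by a breathing rate and an HT dose coefficient, gives the inhalation dose term of the dosimetry analysis.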

  1. Dosimetric evaluation of nanotargeted (188)Re-liposome with the MIRDOSE3 and OLINDA/EXM programs.

    PubMed

    Chang, Chih-Hsien; Chang, Ya-Jen; Lee, Te-Wei; Ting, Gann; Chang, Kwo-Ping

    2012-06-01

    The OLINDA/EXM computer code was created as a replacement for the widely used MIRDOSE3 code for radiation dosimetry in nuclear medicine. A dosimetric analysis with these codes was performed to evaluate nanoliposomes as carriers of radionuclides ((188)Re-liposomes) in colon carcinoma-bearing mice. Pharmacokinetic data for (188)Re-N,N-bis(2-mercaptoethyl)-N',N'-diethylethylenediamine ((188)Re-BMEDA) and (188)Re-liposome were obtained for estimation of absorbed doses in normal organs. Radiation dose estimates for normal tissues were calculated using the MIRDOSE3 and OLINDA/EXM programs for a colon carcinoma solid tumor mouse model. Mean absorbed doses derived from (188)Re-BMEDA and (188)Re-liposome in normal tissues were generally similar as calculated by the MIRDOSE3 and OLINDA/EXM programs. One notable exception was red marrow, for which MIRDOSE3 yielded higher absorbed doses than OLINDA/EXM (1.53- and 1.60-fold for (188)Re-BMEDA and (188)Re-liposome, respectively). MIRDOSE3 and OLINDA/EXM otherwise give very similar residence times and organ doses. Bone marrow doses were estimated by designating cortical bone, rather than bone marrow, as the source organ; with this designation, the bone marrow doses calculated by MIRDOSE3 are higher than those by OLINDA/EXM. If the bone marrow itself is designated as the source organ, the doses estimated by the two programs are very similar.
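
    The residence times mentioned above come from integrating the organ time-activity curve. For a biexponential fit the integral is analytic, as sketched below; the kinetic parameters and administered activity are hypothetical, not data from the study.

```python
def cumulated_activity_biexp(A1, lam1, A2, lam2):
    """Analytic 0-to-infinity integral of A(t) = A1*exp(-lam1*t) + A2*exp(-lam2*t).
    With A terms in Bq and lambdas in 1/h, the result is in Bq*h."""
    return A1 / lam1 + A2 / lam2

def residence_time(cumulated, administered):
    """Residence time (h) = cumulated activity / administered activity."""
    return cumulated / administered

# Hypothetical organ kinetics for an administered activity of 37 MBq
A_tilde = cumulated_activity_biexp(5e6, 0.5, 1e6, 0.04)  # Bq*h
print(residence_time(A_tilde, 37e6))                     # hours
```

    Multiplying the cumulated activity by the appropriate S value (as MIRDOSE3/OLINDA do internally) then yields the organ absorbed dose.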

  2. MO-B-BRB-04: 3D Dosimetry in End-To-End Dosimetry QA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ibbott, G.

    Full three-dimensional (3D) dosimetry using volumetric chemical dosimeters probed by 3D imaging systems has long been a promising technique for the radiation therapy clinic, since it provides a unique methodology for dose measurements in the volume irradiated using complex conformal delivery techniques such as IMRT and VMAT. To date, true 3D dosimetry is still not widely practiced in the community; it has been confined to centres of specialized expertise, especially for quality assurance or commissioning roles where other dosimetry techniques are difficult to implement. The potential for improved clinical applicability has been advanced considerably in the last decade by the development of improved 3D dosimeters (e.g., radiochromic plastics, radiochromic gel dosimeters and normoxic polymer gel systems) and by improved readout protocols using optical computed tomography or magnetic resonance imaging. In this session, established users of some current 3D chemical dosimeters will briefly review the current status of 3D dosimetry, describe several dosimeters and their appropriate imaging for dose readout, present workflow procedures required for good dosimetry, and analyze some limitations for applications in select settings. We will review the application of 3D dosimetry to various clinical situations describing how 3D approaches can complement other dose delivery validation approaches already available in the clinic. The applications presented will be selected to inform attendees of the unique features provided by full 3D techniques. Learning Objectives: L. John Schreiner: Background and Motivation Understand recent developments enabling clinically practical 3D dosimetry, Appreciate 3D dosimetry workflow and dosimetry procedures, and Observe select examples from the clinic.
Sofie Ceberg: Application to dynamic radiotherapy Observe full dosimetry under dynamic radiotherapy during respiratory motion, and Understand how the measurement of high resolution dose data in an irradiated volume can help understand interplay effects during TomoTherapy or VMAT. Titania Juang: Special techniques in the clinic and research Understand the potential for 3D dosimetry in validating dose accumulation in deformable systems, and Observe the benefits of high resolution measurements for precision therapy in SRS and in MicroSBRT for small animal irradiators. Geoffrey S. Ibbott: 3D Dosimetry in end-to-end dosimetry QA Understand the potential for 3D dosimetry for end-to-end radiation therapy process validation in the in-house and external credentialing setting. Canadian Institutes of Health Research; L. Schreiner, Modus QA, London, ON, Canada; T. Juang, NIH R01CA100835.

  3. Computational model of gamma irradiation room at ININ

    NASA Astrophysics Data System (ADS)

    Rodríguez-Romo, Suemi; Patlan-Cardoso, Fernando; Ibáñez-Orozco, Oscar; Vergara Martínez, Francisco Javier

    2018-03-01

    In this paper, we present a model of the gamma irradiation room at the National Institute of Nuclear Research (ININ is its acronym in Spanish) in Mexico to improve the use of physics in dosimetry for human protection. We deal with air-filled ionization chambers and in-house scientific computing, framed in both the GEANT4 scheme and our analytical approach, to characterize the irradiation room. This room is the only secondary dosimetry facility in Mexico. Our aim is to optimize its experimental designs, facilities, and industrial applications of physical radiation. The computational results provided by our model are supported by all the known experimental data regarding the performance of the ININ gamma irradiation room and allow us to predict the values of the main variables related to this fully enclosed space within an acceptable margin of error.

  4. Patient-specific dosimetry based on quantitative SPECT imaging and 3D-DFT convolution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Akabani, G.; Hawkins, W.G.; Eckblade, M.B.

    1999-01-01

    The objective of this study was to validate the use of a 3D discrete Fourier transform (3D-DFT) convolution method to carry out the dosimetry of I-131 for soft tissues in radioimmunotherapy procedures. To validate this convolution method, mathematical and physical phantoms were used as a basis of comparison with Monte Carlo transport (MCT) calculations, which were carried out using the EGS4 system code. The mathematical phantom consisted of a sphere containing uniform and nonuniform activity distributions. The physical phantom consisted of a cylinder containing uniform and nonuniform activity distributions. Quantitative SPECT reconstruction was carried out using the Circular Harmonic Transform (CHT) algorithm.
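
    The convolution idea behind the 3D-DFT method is that the voxel dose map equals the cumulated-activity map convolved with a voxel dose kernel. A direct sparse superposition is sketched below for clarity; production codes evaluate the same sum with a 3D FFT for speed, and the kernel values here are hypothetical.

```python
def convolve_sparse(activity, kernel):
    """Direct superposition form of the dose-kernel convolution.
    activity: {(i,j,k): cumulated decays}; kernel: {(di,dj,dk): Gy/decay}.
    Returns {(i,j,k): absorbed dose in Gy}."""
    dose = {}
    for (i, j, k), a in activity.items():
        for (di, dj, dk), s in kernel.items():
            key = (i + di, j + dj, k + dk)
            dose[key] = dose.get(key, 0.0) + a * s
    return dose

# Hypothetical kernel: central voxel plus two nearest neighbours along x
kernel = {(0, 0, 0): 1e-11, (1, 0, 0): 2e-12, (-1, 0, 0): 2e-12}
activity = {(5, 5, 5): 1e9}  # all decays concentrated in one voxel
print(convolve_sparse(activity, kernel))
```

    With all activity in one voxel, the dose map is just the kernel scaled by the number of decays, which is a convenient sanity check for an FFT implementation as well.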

  5. The Bebig Valencia-type skin applicators: Dosimetric study and implementation of a dosimetric hybrid technique.

    PubMed

    Anagnostopoulos, Georgios; Andrássy, Michael; Baltas, Dimos

    To determine the relative dose rate distribution in water for the Bebig 20 mm and 30 mm skin applicators and report results in a form suitable for potential clinical use. Results for both skin applicators are also provided in the form of a hybrid Task Group 43 (TG-43) dosimetry technique. Furthermore, the radiation leakage around both skin applicators from the radiation protection point of view and the impact of the geometrical source position uncertainties are studied and reported. Monte Carlo simulations were performed using the MCNP 6.1 general purpose code, which was benchmarked against published dosimetry data for the Bebig Ir2.A85-2 high-dose-rate iridium-192 source, as well as the dosimetry data for the two Elekta skin applicators. Both Bebig skin applicators were modeled, and the dose rate distributions in a water phantom were calculated. The dosimetric quantities derived according to a hybrid TG-43 dosimetry technique are provided with their corresponding uncertainty values. The air kerma rate in air was simulated in the vicinity of each skin applicator to assess the radiation leakage. Results from the Monte Carlo simulations of both skin applicators are presented in the form of figures and relative dose rate tables, and additionally with the aid of the quantities defined in the hybrid TG-43 dosimetry technique and their corresponding uncertainty values. Their output factors, flatness, and penumbra values were found comparable to the Elekta skin applicators. The radiation shielding was evaluated to be adequate. The effect of potential uncertainties in source positioning on dosimetry should be investigated as part of applicator commissioning. Copyright © 2017 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.
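
    The hybrid technique referenced above builds on the TG-43 dose-rate formalism, in which the dose rate factorizes into an air-kerma strength, a dose-rate constant, a geometry function ratio, a radial dose function, and an anisotropy function. The sketch below uses the simpler point-source 1/r^2 geometry function, and every parameter value is illustrative, not Bebig applicator data.

```python
def geometry_point(r_cm):
    """Point-source geometry function, 1/r^2."""
    return 1.0 / (r_cm * r_cm)

def tg43_dose_rate(sk, Lambda, r_cm, g_r, F, r0_cm=1.0):
    """TG-43-style dose rate (cGy/h) at radius r_cm:
    sk      air-kerma strength (U),
    Lambda  dose-rate constant (cGy/(h*U)),
    g_r     radial dose function at r_cm,
    F       anisotropy function value,
    normalized at the reference point r0 = 1 cm."""
    return sk * Lambda * (geometry_point(r_cm) / geometry_point(r0_cm)) * g_r * F

print(tg43_dose_rate(sk=10.0, Lambda=1.1, r_cm=2.0, g_r=0.95, F=0.98))
```

    In the full line-source formalism the 1/r^2 factor is replaced by the line-source geometry function, and g and F are interpolated from the consensus (or, here, hybrid) data tables.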

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thrall, Brian D.; Minard, Kevin R.; Teeguarden, Justin G.

    A Cooperative Research and Development Agreement (CRADA) was sponsored by Battelle Memorial Institute (Battelle, Columbus), to initiate a collaborative research program across multiple Department of Energy (DOE) National Laboratories aimed at developing a suite of new capabilities for predictive toxicology. Predicting the potential toxicity of emerging classes of engineered nanomaterials was chosen as one of two focusing problems for this program. PNNL’s focus toward this broader goal was to refine and apply experimental and computational tools needed to provide quantitative understanding of nanoparticle dosimetry for in vitro cell culture systems, which is necessary for comparative risk estimates for different nanomaterials or biological systems. Research conducted using lung epithelial and macrophage cell models successfully adapted magnetic particle detection and fluorescent microscopy technologies to quantify uptake of various forms of engineered nanoparticles, and provided experimental constraints and test datasets for benchmark comparison against results obtained using an in vitro computational dosimetry model, termed the ISSD model. The experimental and computational approaches developed were used to demonstrate how cell dosimetry is applied to aid in interpretation of genomic studies of nanoparticle-mediated biological responses in model cell culture systems. The combined experimental and theoretical approach provides a highly quantitative framework for evaluating relationships between biocompatibility of nanoparticles and their physical form in a controlled manner.

  7. Pediatric personalized CT-dosimetry Monte Carlo simulations, using computational phantoms

    NASA Astrophysics Data System (ADS)

    Papadimitroulas, P.; Kagadis, G. C.; Ploussi, A.; Kordolaimi, S.; Papamichail, D.; Karavasilis, E.; Syrgiamiotis, V.; Loudos, G.

    2015-09-01

    For the last 40 years, Monte Carlo (MC) simulations have served as a “gold standard” tool for a wide range of applications in the field of medical physics, and they tend to be essential in daily clinical practice. Regarding diagnostic imaging applications, such as computed tomography (CT), the assessment of deposited energy is of high interest, so as to better analyze the risks and the benefits of the procedure. In recent years, a major effort has been directed towards personalized dosimetry, especially in pediatric applications. In the present study, the GATE toolkit was used and computational pediatric phantoms were modeled for the assessment of CT examination dosimetry. The pediatric models used come from the XCAT and IT'IS series. The X-ray spectrum of a BrightSpeed CT scanner was simulated and validated with experimental data. Specifically, a DCT-10 ionization chamber was irradiated twice at 120 kVp with 100 mAs and 200 mAs, for 1 s in 1 central axial slice (thickness = 10 mm). The absorbed dose was measured in air, resulting in differences lower than 4% between the experimental and simulated data. The simulations were acquired using ~10^10 primaries in order to achieve low statistical uncertainties. Dose maps were also saved for quantification of the absorbed dose in several critical organs of children during CT acquisition.

  8. [Determination of absorbed dose to water for high energy photon and electron beams--comparison of different dosimetry protocols].

    PubMed

    Zakaria, Golam Abu; Schütte, Wilhelm

    2003-01-01

    The determination of absorbed dose to water for high-energy photon and electron beams is performed in Germany according to the dosimetry protocol DIN 6800-2 (1997). At an international level, the main protocols used are the AAPM dosimetry protocol TG-51 (1999) and the IAEA Code of Practice TRS-398 (2000). The present paper systematically compares these three dosimetry protocols, and identifies similarities and differences. The investigations were performed using 4 and 10 MV photon beams, as well as 6, 8, 9, 10, 12 and 14 MeV electron beams. Two cylindrical and two plane-parallel chambers were used for measurements. In general, the discrepancies among the three protocols were within 1.0% for photon beams and 1.6% for electron beams. Comparative measurements in the context of measurement technical control (MTK) with TLD showed a deviation of less than 1.3% between the measurements obtained according to protocols DIN 6800-2 and MTK (exceptions: 4 MV photons with 2.9% and 6 MeV electrons with 2.4%). While only cylindrical chambers were used for photon beams, measurements of electron beams were performed using both cylindrical and plane-parallel chambers (the latter used after a cross-calibration to a cylindrical chamber, as required by the respective dosimetry protocols). Notably, contrary to the recommendations of the corresponding protocols, we found that cylindrical chambers can also be used for energies from 6 to 10 MeV.
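All three protocols compared above share the same basic formalism: absorbed dose to water is the corrected ion-chamber reading times a calibration coefficient and a beam-quality correction factor. A minimal sketch of that shared structure (function names and the example numbers are hypothetical; in practice k_Q and the other correction factors come from the protocol's tables and measurements):

```python
def k_tp(temp_c, pressure_kpa, ref_temp_c=20.0, ref_pressure_kpa=101.325):
    """Air-density correction for an open (vented) ionization chamber."""
    return ((273.2 + temp_c) / (273.2 + ref_temp_c)) * (ref_pressure_kpa / pressure_kpa)

def dose_to_water(m_raw_nc, n_dw_gy_per_nc, k_q, temp_c, pressure_kpa,
                  k_pol=1.0, k_ion=1.0):
    """TRS-398-style formalism: D_w = M * N_D,w * k_Q,
    where M is the chamber reading corrected for influence quantities."""
    m_corr = m_raw_nc * k_tp(temp_c, pressure_kpa) * k_pol * k_ion
    return m_corr * n_dw_gy_per_nc * k_q

# Illustrative numbers only: 10 nC reading, N_D,w = 0.054 Gy/nC, k_Q = 0.99,
# at reference temperature and pressure.
d_w = dose_to_water(10.0, 0.054, 0.99, 20.0, 101.325)
```

The protocols differ mainly in how k_Q (or its TG-51 equivalent) and the chamber correction factors are tabulated, not in this overall structure.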

  9. Comparison of Three Methods of Calculation, Experimental and Monte Carlo Simulation in Investigation of Organ Doses (Thyroid, Sternum, Cervical Vertebra) in Radioiodine Therapy

    PubMed Central

    Shahbazi-Gahrouei, Daryoush; Ayat, Saba

    2012-01-01

    Radioiodine therapy is an effective method for treating thyroid carcinoma, but it has some effects on normal tissues, hence dosimetry of vital organs is important to weigh the risks and benefits of this method. The aim of this study is to measure the absorbed doses of important organs by Monte Carlo N-Particle (MCNP) simulation and to compare the results of different methods of dosimetry by performing a paired t-test. To calculate the absorbed dose of the thyroid, sternum, and cervical vertebra using the MCNP code, the *F8 tally was used. The organs were simulated using a neck phantom and the Medical Internal Radiation Dose (MIRD) method. Finally, the results of MCNP, MIRD, and thermoluminescent dosimeter (TLD) measurements were compared using SPSS software. The absorbed dose obtained by Monte Carlo simulations for 100, 150, and 175 mCi of administered 131I was found to be 388.0, 427.9, and 444.8 cGy for the thyroid, 208.7, 230.1, and 239.3 cGy for the sternum, and 272.1, 299.9, and 312.1 cGy for the cervical vertebra. The paired t-tests yielded values of 0.24 for TLD dosimetry versus MIRD calculation, 0.80 for MCNP simulation versus MIRD, and 0.19 for TLD versus MCNP. The results showed no significant differences among the three methods: Monte Carlo simulation, MIRD calculation, and direct experimental dosimetry using TLD. PMID:23717806
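The MIRD calculation referenced above follows the standard schema: absorbed dose to a target equals the time-integrated (cumulated) activity in the source region times the S-value for that source-target pair. A minimal illustration assuming mono-exponential clearance (function names and numbers are hypothetical, not this study's data):

```python
import math

def cumulated_activity_bq_s(a0_bq, t_eff_hours):
    """Time-integrated activity for mono-exponential clearance:
    A~ = A0 * T_eff / ln 2, with T_eff the effective half-life."""
    return a0_bq * (t_eff_hours * 3600.0) / math.log(2)

def mird_dose_gy(a0_bq, t_eff_hours, s_value_gy_per_bq_s):
    """MIRD schema: D(target) = A~(source) * S(target <- source)."""
    return cumulated_activity_bq_s(a0_bq, t_eff_hours) * s_value_gy_per_bq_s
```

In practice the S-value is what the MCNP *F8 tally (energy deposition per decay) is used to estimate for the simulated phantom geometry.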

  10. Cellular dosimetry calculations for Strontium-90 using Monte Carlo code PENELOPE.

    PubMed

    Hocine, Nora; Farlay, Delphine; Boivin, Georges; Franck, Didier; Agarande, Michelle

    2014-11-01

    To improve risk assessments associated with chronic exposure to Strontium-90 (Sr-90), for both the environment and human health, it is necessary to know the energy distribution in specific cells or tissue. Monte Carlo (MC) simulation codes are extremely useful tools for calculating deposited energy. The present work focused on the validation of the MC code PENetration and Energy LOss of Positrons and Electrons (PENELOPE) and the assessment of the dose distribution to bone marrow cells from a point Sr-90 source located within the cortical bone. S-values (absorbed dose per unit cumulated activity) were calculated by Monte Carlo simulation using PENELOPE and Monte Carlo N-Particle eXtended (MCNPX). The cytoplasm, nucleus, cell surface, mouse femur bone and Sr-90 radiation source were simulated. Cells are assumed to be spherical, with the radii of the cell and cell nucleus ranging from 2 to 10 μm. The Sr-90 source is assumed to be uniformly distributed in the cell nucleus, cytoplasm and cell surface. The S-values calculated with PENELOPE agreed very well with the MCNPX results and the Medical Internal Radiation Dose (MIRD) values, with relative deviations of less than 4.5%. The dose distribution to mouse bone marrow cells showed that the cells located near the cortical part received the maximum dose. The MC code PENELOPE may prove useful for cellular dosimetry involving radiation transport through materials other than water, or for complex distributions of radionuclides and geometries.

  11. MO-B-BRB-00: Three Dimensional Dosimetry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    Full three-dimensional (3D) dosimetry using volumetric chemical dosimeters probed by 3D imaging systems has long been a promising technique for the radiation therapy clinic, since it provides a unique methodology for dose measurements in the volume irradiated using complex conformal delivery techniques such as IMRT and VMAT. To date true 3D dosimetry is still not widely practiced in the community; it has been confined to centres of specialized expertise especially for quality assurance or commissioning roles where other dosimetry techniques are difficult to implement. The potential for improved clinical applicability has been advanced considerably in the last decade by the development of improved 3D dosimeters (e.g., radiochromic plastics, radiochromic gel dosimeters and normoxic polymer gel systems) and by improved readout protocols using optical computed tomography or magnetic resonance imaging. In this session, established users of some current 3D chemical dosimeters will briefly review the current status of 3D dosimetry, describe several dosimeters and their appropriate imaging for dose readout, present workflow procedures required for good dosimetry, and analyze some limitations for applications in select settings. We will review the application of 3D dosimetry to various clinical situations describing how 3D approaches can complement other dose delivery validation approaches already available in the clinic. The applications presented will be selected to inform attendees of the unique features provided by full 3D techniques. Learning Objectives: L. John Schreiner: Background and Motivation Understand recent developments enabling clinically practical 3D dosimetry, Appreciate 3D dosimetry workflow and dosimetry procedures, and Observe select examples from the clinic. 
Sofie Ceberg: Application to dynamic radiotherapy Observe full dosimetry under dynamic radiotherapy during respiratory motion, and Understand how the measurement of high resolution dose data in an irradiated volume can help understand interplay effects during TomoTherapy or VMAT. Titania Juang: Special techniques in the clinic and research Understand the potential for 3D dosimetry in validating dose accumulation in deformable systems, and Observe the benefits of high resolution measurements for precision therapy in SRS and in MicroSBRT for small animal irradiators Geoffrey S. Ibbott: 3D Dosimetry in end-to-end dosimetry QA Understand the potential for 3D dosimetry for end-to-end radiation therapy process validation in the in-house and external credentialing setting. Canadian Institutes of Health Research; L. Schreiner, Modus QA, London, ON, Canada; T. Juang, NIH R01CA100835.

  12. MO-B-BRB-03: 3D Dosimetry in the Clinic: Validating Special Techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Juang, T.


  13. MO-B-BRB-01: 3D Dosimetry in the Clinic: Background and Motivation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schreiner, L.


  14. MO-B-BRB-02: 3D Dosimetry in the Clinic: IMRT Technique Validation in Sweden

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ceberg, S.


  15. ESTIMATION OF INTERNAL EXPOSURE TO URANIUM WITH UNCERTAINTY FROM URINALYSIS DATA USING THE InDEP COMPUTER CODE

    PubMed Central

    Anderson, Jeri L.; Apostoaei, A. Iulian; Thomas, Brian A.

    2015-01-01

    The National Institute for Occupational Safety and Health (NIOSH) is currently studying mortality in a cohort of 6409 workers at a former uranium processing facility. As part of this study, over 220 000 urine samples were used to reconstruct organ doses due to internal exposure to uranium. Most of the available computational programs designed for analysis of bioassay data handle a single case at a time, and thus require a significant outlay of time and resources for the exposure assessment of a large cohort. NIOSH is currently supporting the development of a computer program, InDEP (Internal Dose Evaluation Program), to facilitate internal radiation exposure assessment as part of epidemiological studies of both uranium- and plutonium-exposed cohorts. A novel feature of InDEP is its batch processing capability, which allows for the evaluation of multiple study subjects simultaneously. InDEP analyses bioassay data and derives intakes and organ doses with uncertainty estimates using least-squares regression techniques or using Bayes’ theorem as applied to internal dosimetry (the Bayesian method). This paper describes the application of the current version of InDEP to formulate assumptions about the characteristics of exposure at the study facility that were used in a detailed retrospective intake and organ dose assessment of the cohort. PMID:22683620
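The least-squares regression that InDEP applies to bioassay data can be illustrated in its simplest form: each urinalysis result m_i is modelled as the intake I times the predicted excretion per unit intake p(t_i) from a biokinetic model, and I is chosen to minimize the squared residuals. A sketch under that simplifying assumption (hypothetical helper, not InDEP's actual implementation, which also weights measurements and propagates uncertainty):

```python
def fit_intake(measurements, predicted_per_unit_intake):
    """Unweighted least-squares intake estimate from bioassay data.
    Minimizing sum_i (m_i - I * p_i)^2 over I gives
    I = sum(m_i * p_i) / sum(p_i^2)."""
    num = sum(m * p for m, p in zip(measurements, predicted_per_unit_intake))
    den = sum(p * p for p in predicted_per_unit_intake)
    return num / den
```

Batch processing then amounts to running this fit (or its Bayesian counterpart) over every subject's measurement series in turn.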

  16. Benchmark of PENELOPE code for low-energy photon transport: dose comparisons with MCNP4 and EGS4.

    PubMed

    Ye, Sung-Joon; Brezovich, Ivan A; Pareek, Prem; Naqvi, Shahid A

    2004-02-07

    The expanding clinical use of low-energy photon emitting 125I and 103Pd seeds in recent years has led to renewed interest in their dosimetric properties. Numerous papers have pointed out that higher accuracy could be obtained in Monte Carlo simulations by utilizing newer libraries for the low-energy photon cross-sections, such as XCOM and EPDL97. The recently developed PENELOPE 2001 Monte Carlo code is user friendly and incorporates photon cross-section data from the EPDL97. The code has been verified for clinical dosimetry of high-energy electron and photon beams, but has not yet been tested at low energies. In the present work, we have benchmarked the PENELOPE code for 10-150 keV photons. We computed radial dose distributions from 0 to 10 cm in water at photon energies of 10-150 keV using both PENELOPE and MCNP4C with either the DLC-146 or DLC-200 cross-section library, assuming a point source located at the centre of a cylinder 30 cm in diameter and 20 cm in length. Throughout the energy range of simulated photons (except for 10 keV), PENELOPE agreed within statistical uncertainties (at worst +/- 5%) with MCNP/DLC-146 in the entire region of 1-10 cm, and with published EGS4 data up to 5 cm. The dose at 1 cm (or dose rate constant) of PENELOPE agreed with MCNP/DLC-146 and EGS4 data within approximately +/- 2% in the range of 20-150 keV, while MCNP/DLC-200 produced values up to 9% lower in the range of 20-100 keV than PENELOPE or the other codes. However, the differences among the four datasets became negligible above 100 keV.
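Benchmark comparisons like the one above ultimately reduce to point-by-point relative deviations between radial dose profiles produced by the different codes. A trivial sketch (hypothetical helper, assuming both profiles are sampled at the same radii):

```python
def max_relative_deviation(ref, test):
    """Maximum point-by-point relative deviation (%) between two radial
    dose profiles sampled at the same radii; ref is the reference code."""
    return max(abs(t - r) / r * 100.0 for r, t in zip(ref, test))
```

A statistically honest comparison would also fold in the Monte Carlo uncertainty of each point, as the paper does when quoting "within statistical uncertainties (at worst +/- 5%)".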

  17. Advanced dosimetry systems for the space transport and space station

    NASA Technical Reports Server (NTRS)

    Wailly, L. F.; Schneider, M. F.; Clark, B. C.

    1972-01-01

    Advanced dosimetry system concepts are described that will provide automated and instantaneous measurement of dose and particle spectra. Systems are proposed for measuring dose rate from cosmic radiation background to greater than 3600 rads/hr. Charged particle spectrometers, both internal and external to the spacecraft, are described for determining mixed field energy spectra and particle fluxes for both real time onboard and ground-based computer evaluation of the radiation hazard. Automated passive dosimetry systems consisting of thermoluminescent dosimeters and activation techniques are proposed for recording the dose levels for twelve or more crew members. This system will allow automatic onboard readout and data storage of the accumulated dose and can be transmitted to ground after readout or data records recovered with each crew rotation.

  18. LWR pressure vessel surveillance dosimetry improvement program: LWR power reactor surveillance physics-dosimetry data base compendium

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McElroy, W.N.

    1985-08-01

    This NRC physics-dosimetry compendium is a collation of information and data developed from available research and commercial light water reactor vessel surveillance program (RVSP) documents and related surveillance capsule reports. The data represent the results of the HEDL least-squares FERRET-SAND II Code re-evaluation of exposure units and values for 47 PWR and BWR surveillance capsules for W, B&W, CE, and GE power plants. Using a consistent set of auxiliary data and dosimetry-adjusted reactor physics results, the revised fluence values for E > 1 MeV averaged 25% higher than the originally reported values. The range of fluence values (new/old) was from a low of 0.80 to a high of 2.38. These HEDL-derived FERRET-SAND II exposure parameter values are being used for NRC-supported HEDL and other PWR and BWR trend curve data development and testing studies. These studies are providing results to support Revision 2 of Regulatory Guide 1.99. As stated by Randall (Ra84), the Guide is being updated to reflect recent studies of the physical basis for neutron radiation damage and efforts to correlate damage to chemical composition and fluence.

  19. SU-E-T-212: Comparison of TG-43 Dosimetric Parameters of Low and High Energy Brachytherapy Sources Obtained by MCNP Code Versions of 4C, X and 5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zehtabian, M; Zaker, N; Sina, S

    2015-06-15

    Purpose: Different versions of the MCNP code are widely used for dosimetry purposes. The purpose of this study is to compare different versions of the MCNP codes in the dosimetric evaluation of different brachytherapy sources. Methods: The TG-43 parameters such as the dose rate constant, radial dose function, and anisotropy function of different brachytherapy sources, i.e. Pd-103, I-125, Ir-192, and Cs-137, were calculated in a water phantom. The results obtained by three versions of the Monte Carlo code (MCNP4C, MCNPX, MCNP5) were compared for low and high energy brachytherapy sources. Then the cross section library of the MCNP4C code was changed to ENDF/B-VI release 8, which is used in the MCNP5 and MCNPX codes. Finally, the TG-43 parameters obtained using the MCNP4C-revised code were compared with the other codes. Results: The results of these investigations indicate that for high energy sources, the differences in TG-43 parameters between the codes are less than 1% for Ir-192 and less than 0.5% for Cs-137. However, for low energy sources like I-125 and Pd-103, large discrepancies are observed in the g(r) values obtained by MCNP4C and the two other codes. The differences between g(r) values calculated using MCNP4C and MCNP5 at the distance of 6 cm were found to be about 17% and 28% for I-125 and Pd-103 respectively. The results obtained with MCNP4C-revised and MCNPX were similar. However, the maximum difference between the results obtained with the MCNP5 and MCNP4C-revised codes was 2% at 6 cm. Conclusion: The results indicate that using the MCNP4C code for dosimetry of low energy brachytherapy sources can cause large errors in the results. Therefore it is recommended not to use this code for low energy sources, unless its cross section library is changed. Since the results obtained with MCNP4C-revised and MCNPX were similar, it is concluded that the difference between MCNP4C and MCNPX is their cross section libraries.
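The TG-43 parameters compared above combine, in the 1D (point-source) formalism, as Ddot(r) = S_K * Lambda * [G(r)/G(r0)] * g(r) * phi_an(r). A minimal sketch with the point-source geometry function G(r) = 1/r^2 (function name and arguments are illustrative; in practice g(r) and the anisotropy factor come from tabulated Monte Carlo or measured fits):

```python
def tg43_dose_rate_1d(s_k, dose_rate_constant, r_cm, g_r, phi_an=1.0, r0_cm=1.0):
    """TG-43 1D point-source formalism:
    Ddot(r) = S_K * Lambda * [G(r)/G(r0)] * g(r) * phi_an(r),
    with geometry function G(r) = 1/r^2 and reference distance r0 = 1 cm."""
    geom_ratio = (r0_cm / r_cm) ** 2
    return s_k * dose_rate_constant * geom_ratio * g_r * phi_an
```

Because g(r) multiplies the dose rate directly, the 17-28% discrepancies in g(r) at 6 cm reported above translate into errors of the same magnitude in the computed dose at that distance.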

  20. Benefits of online in vivo dosimetry for single-fraction total body irradiation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eaton, David J., E-mail: davideaton@nhs.net; Warry, Alison J.; Trimble, Rachel E.

    Use of a patient test dose before single-fraction total body irradiation (TBI) allows review of in vivo dosimetry and modification of the main treatment setup. However, use of computed tomography (CT) planning and online in vivo dosimetry may reduce the need for this additional step. Patients were treated using a supine CT-planned extended source-to-surface distance (SSD) technique with lead compensators and bolus. In vivo dosimetry was performed using thermoluminescent dosimeters (TLDs) and diodes at 10 representative anatomical locations, for both a 0.1-Gy test dose and the treatment dose. In total, 28 patients were treated between April 2007 and July 2013, with changes made in 10 cases (36%) following test dose results. Overall, 98.1% of measured in vivo treatment doses were within 10% of the prescribed dose, compared with 97.0% of test dose readings. Changes made following the test dose could have been applied during the single-fraction treatment itself, assuming that the dose was delivered in subportions and online in vivo dosimetry was available for all clinically important anatomical sites. This alleviates the need for a test dose, saving considerable time and resources.
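The pass statistic quoted above (the fraction of readings within 10% of the prescribed dose) is a simple tolerance check; a sketch of such a check (hypothetical helper, not the authors' software):

```python
def fraction_within_tolerance(measured_gy, prescribed_gy, tol_pct=10.0):
    """Fraction of in vivo dosimeter readings within +/- tol_pct
    of the prescribed dose."""
    ok = sum(1 for m in measured_gy
             if abs(m - prescribed_gy) / prescribed_gy * 100.0 <= tol_pct)
    return ok / len(measured_gy)
```

Run online (i.e. between subportions of the fraction), a check like this is what allows setup changes to be made during the treatment itself rather than after a separate test dose.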

  1. The polyGeVero® software for fast and easy computation of 3D radiotherapy dosimetry data

    NASA Astrophysics Data System (ADS)

    Kozicki, Marek; Maras, Piotr

    2015-01-01

    The polyGeVero® software package was developed for fast computation of 3D dosimetry data, such as polymer gel dosimetry data. It comprises four workspaces designed for: i) calculating calibrations, ii) storing calibrations in a database, iii) calculating 3D dose distribution cubes, and iv) comparing two datasets, e.g. one measured with a 3D dosimeter against one calculated with a treatment planning system. For these calculations the software provides a number of tools, such as a brachytherapy isotopes database, brachytherapy dose-versus-distance calculation based on the line approximation approach, automatic spatial alignment of two 3D dose cubes for comparison purposes, 3D gamma index, 3D gamma angle, 3D dose difference, Pearson's coefficient, histogram calculations, isodose superimposition for two datasets, and profile calculations in any desired direction. This communication briefly presents the main functions of the software and reports on the speed of calculations performed by polyGeVero®.
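The 3D gamma index listed among the tools above combines a dose-difference criterion with a distance-to-agreement criterion: for each reference point, the minimum of the combined metric over all evaluated points is taken. A brute-force sketch for small grids on a common lattice (illustrative only, not polyGeVero®'s implementation; global normalization to the reference maximum is assumed):

```python
import numpy as np

def gamma_index(ref, ev, spacing_mm, dose_tol_pct=3.0, dist_tol_mm=3.0):
    """Brute-force global gamma index for two dose grids on the same lattice:
    gamma(p) = min over q of sqrt((dist(p,q)/DTA)^2 + ((D_ev(q)-D_ref(p))/dD)^2).
    A point passes when gamma <= 1."""
    dd = dose_tol_pct / 100.0 * ref.max()          # global dose tolerance
    coords = np.argwhere(np.ones_like(ref, dtype=bool)) * spacing_mm
    ref_flat, ev_flat = ref.ravel(), ev.ravel()
    gammas = np.empty(ref_flat.size)
    for i in range(ref_flat.size):
        dist2 = ((coords - coords[i]) ** 2).sum(axis=1) / dist_tol_mm**2
        dose2 = (ev_flat - ref_flat[i]) ** 2 / dd**2
        gammas[i] = np.sqrt((dist2 + dose2).min())
    return gammas.reshape(ref.shape)
```

Production implementations avoid the O(N^2) search with spatial pruning and sub-voxel interpolation, which is where the reported computation speed matters for full 3D cubes.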

  2. COMPUTATIONAL TOXICOLOGY: AN IN SILICO DOSIMETRY MODEL FOR THE ASSESSMENT OF AIR POLLUTANTS

    EPA Science Inventory

    To accurately assess the threat to human health presented by airborne contaminants, it is necessary to know the deposition patterns of particulate matter (PM) within the respiratory system. To provide a foundation for computational toxicology, we have developed an in silico model...

  3. Assessment of PCXMC for patients with different body size in chest and abdominal x ray examinations: a Monte Carlo simulation study.

    PubMed

    Borrego, David; Lowe, Erin M; Kitahara, Cari M; Lee, Choonsik

    2018-03-21

    A PC Program for x ray Monte Carlo (PCXMC) has been used to calculate organ doses in patient dosimetry and for the exposure assessment in epidemiological studies of radiogenic health-related risks. This study compared the dosimetry from using the built-in stylized phantoms in the PCXMC to that of a newer hybrid phantom library with improved anatomical realism. We simulated chest and abdominal x ray projections for 146 unique body size computational phantoms, 77 males and 69 females, with different combinations of height (125-180 cm) and weight (20-140 kg) using the built-in stylized phantoms in the PCXMC version 2.0.1.4 and the hybrid phantom library using the Monte Carlo N-particle eXtended transport code 2.7 (MCNPX). Unfortunately, it was not possible to incorporate the hybrid phantom library into the PCXMC. We compared 14 organ doses, including dose to the active bone marrow, to evaluate differences between the built-in stylized phantoms in the PCXMC and the hybrid phantoms (Cristy and Eckerman 1987 Technical Report ORNL/TM-8381/V1, Oak Ridge National Laboratory, Eckerman and Ryman 1993 Technical Report 12 Oak Ridge, TN, Geyer et al 2014 Phys. Med. Biol. 59 5225-42). On average, organ doses calculated using the built-in stylized phantoms in the PCXMC were greater when compared to the hybrid phantoms. This is most prominent in AP abdominal exams by an average factor of 2.4-, 2.8-, and 2.8-fold for the 10-year-old, 15-year-old, and adult phantoms, respectively. For chest exams, organ doses are greater by an average factor of 1.1-, 1.4-, and 1.2-fold for the 10-year-old, 15-year-old, and adult phantoms, respectively. PCXMC, due to its ease of use, is often selected to support dosimetry in epidemiological studies; however, it uses simplified models of the human anatomy that fail to account for variations in body morphometry for increasing weight. 
For epidemiological studies that use PCXMC dosimetry, associations between radiation-related disease risks and organ doses may be underestimated, and to a greater degree in pediatric, especially obese pediatric, compared to adult patients.

  4. Assessment of PCXMC for patients with different body size in chest and abdominal x ray examinations: a Monte Carlo simulation study

    NASA Astrophysics Data System (ADS)

    Borrego, David; Lowe, Erin M.; Kitahara, Cari M.; Lee, Choonsik

    2018-03-01

    A PC Program for x ray Monte Carlo (PCXMC) has been used to calculate organ doses in patient dosimetry and for the exposure assessment in epidemiological studies of radiogenic health-related risks. This study compared the dosimetry from using the built-in stylized phantoms in the PCXMC to that of a newer hybrid phantom library with improved anatomical realism. We simulated chest and abdominal x ray projections for 146 unique body size computational phantoms, 77 males and 69 females, with different combinations of height (125–180 cm) and weight (20–140 kg) using the built-in stylized phantoms in the PCXMC version 2.0.1.4 and the hybrid phantom library using the Monte Carlo N-particle eXtended transport code 2.7 (MCNPX). Unfortunately, it was not possible to incorporate the hybrid phantom library into the PCXMC. We compared 14 organ doses, including dose to the active bone marrow, to evaluate differences between the built-in stylized phantoms in the PCXMC and the hybrid phantoms (Cristy and Eckerman 1987 Technical Report ORNL/TM-8381/V1, Oak Ridge National Laboratory, Eckerman and Ryman 1993 Technical Report 12 Oak Ridge, TN, Geyer et al 2014 Phys. Med. Biol. 59 5225–42). On average, organ doses calculated using the built-in stylized phantoms in the PCXMC were greater when compared to the hybrid phantoms. This is most prominent in AP abdominal exams by an average factor of 2.4-, 2.8-, and 2.8-fold for the 10-year-old, 15-year-old, and adult phantoms, respectively. For chest exams, organ doses are greater by an average factor of 1.1-, 1.4-, and 1.2-fold for the 10-year-old, 15-year-old, and adult phantoms, respectively. PCXMC, due to its ease of use, is often selected to support dosimetry in epidemiological studies; however, it uses simplified models of the human anatomy that fail to account for variations in body morphometry for increasing weight. 
For epidemiological studies that use PCXMC dosimetry, associations between radiation-related disease risks and organ doses may be underestimated, and to a greater degree in pediatric, especially obese pediatric, compared to adult patients.

  5. Monte Carlo simulation of portal dosimetry on a rectilinear voxel geometry: a variable gantry angle solution.

    PubMed

    Chin, P W; Spezi, E; Lewis, D G

    2003-08-21

A software solution has been developed to carry out Monte Carlo simulations of portal dosimetry with the BEAMnrc/DOSXYZnrc codes at oblique gantry angles. The solution is based on an integrated phantom, whereby the effect of incident beam obliquity is included using geometric transformations. The geometric transformations are accurate to within +/- 1 mm and +/- 1 degree of exact values calculated using trigonometry. An application in portal image prediction for an inhomogeneous phantom demonstrated good agreement with measured data, with a root-mean-square difference under 2% within the field. We thus achieved a dose model framework capable of handling arbitrary gantry angles, voxel-by-voxel phantom description and realistic particle transport throughout the geometry.
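The integrated-phantom approach effectively rotates the beam's coordinates into the phantom frame instead of rotating the rectilinear voxel grid itself. A minimal sketch of such a geometric transformation is given below; the axis convention (rotation about the y-axis through the isocenter, as for a gantry rotation) and the function name are illustrative assumptions, not taken from the paper.

```python
import math

def rotate_about_isocenter(point, gantry_deg, isocenter=(0.0, 0.0, 0.0)):
    """Rotate an (x, y, z) point about the isocenter's y-axis by the
    gantry angle, mapping beam-space coordinates into the phantom frame.
    The axis convention is an assumption for illustration."""
    t = math.radians(gantry_deg)
    x, y, z = (p - c for p, c in zip(point, isocenter))
    xr = x * math.cos(t) + z * math.sin(t)
    zr = -x * math.sin(t) + z * math.cos(t)
    return (xr + isocenter[0], y + isocenter[1], zr + isocenter[2])
```

Applying the rotation to the source position and beam direction leaves the voxel grid untouched, which is the point of the integrated-phantom solution.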

  6. ISDD: A computational model of particle sedimentation, diffusion and target cell dosimetry for in vitro toxicity studies

    PubMed Central

    2010-01-01

    Background The difficulty of directly measuring cellular dose is a significant obstacle to application of target tissue dosimetry for nanoparticle and microparticle toxicity assessment, particularly for in vitro systems. As a consequence, the target tissue paradigm for dosimetry and hazard assessment of nanoparticles has largely been ignored in favor of using metrics of exposure (e.g. μg particle/mL culture medium, particle surface area/mL, particle number/mL). We have developed a computational model of solution particokinetics (sedimentation, diffusion) and dosimetry for non-interacting spherical particles and their agglomerates in monolayer cell culture systems. Particle transport to cells is calculated by simultaneous solution of Stokes Law (sedimentation) and the Stokes-Einstein equation (diffusion). Results The In vitro Sedimentation, Diffusion and Dosimetry model (ISDD) was tested against measured transport rates or cellular doses for multiple sizes of polystyrene spheres (20-1100 nm), 35 nm amorphous silica, and large agglomerates of 30 nm iron oxide particles. Overall, without adjusting any parameters, model predicted cellular doses were in close agreement with the experimental data, differing from as little as 5% to as much as three-fold, but in most cases approximately two-fold, within the limits of the accuracy of the measurement systems. Applying the model, we generalize the effects of particle size, particle density, agglomeration state and agglomerate characteristics on target cell dosimetry in vitro. Conclusions Our results confirm our hypothesis that for liquid-based in vitro systems, the dose-rates and target cell doses for all particles are not equal; they can vary significantly, in direct contrast to the assumption of dose-equivalency implicit in the use of mass-based media concentrations as metrics of exposure for dose-response assessment. 
The difference between equivalent nominal media concentration exposures on a μg/mL basis and target cell doses on a particle surface area or number basis can be as high as three to six orders of magnitude. As a consequence, in vitro hazard assessments utilizing mass-based exposure metrics carry inherently high errors where particle number or surface area doses to target cells are believed to drive response. The gold standard for particle dosimetry in in vitro nanotoxicology studies should be direct experimental measurement of the cellular content of the studied particle. However, where such measurements are impractical or unfeasible, and until such measurements become common, particle dosimetry models such as ISDD provide a valuable, immediately useful alternative, and eventually an adjunct, to such measurements. PMID:21118529
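ISDD's transport terms rest on the two classical results named in the abstract: Stokes' law for gravitational settling and the Stokes-Einstein relation for Brownian diffusion. A hedged sketch of both in SI units (the culture-medium property values in the test are plausible 37 °C magnitudes chosen for illustration, not parameters from the ISDD paper):

```python
import math

KB = 1.380649e-23  # Boltzmann constant, J/K

def stokes_settling_velocity(radius_m, rho_particle, rho_medium,
                             viscosity_pa_s, g=9.81):
    """Terminal settling velocity (m/s) of a sphere from Stokes' law:
    v = 2 r^2 (rho_p - rho_m) g / (9 mu)."""
    return 2.0 * radius_m ** 2 * (rho_particle - rho_medium) * g / (9.0 * viscosity_pa_s)

def stokes_einstein_diffusivity(radius_m, temp_k, viscosity_pa_s):
    """Diffusion coefficient (m^2/s) from the Stokes-Einstein equation:
    D = kB T / (6 pi mu r)."""
    return KB * temp_k / (6.0 * math.pi * viscosity_pa_s * radius_m)
```

For example, a 1100 nm polystyrene sphere settles far faster than a 20 nm particle of the same material, while the small particle diffuses far more rapidly; this is why the dose rate at the cell monolayer depends so strongly on particle size and density.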

  7. MCMEG: Simulations of both PDD and TPR for 6 MV LINAC photon beam using different MC codes

    NASA Astrophysics Data System (ADS)

    Fonseca, T. C. F.; Mendes, B. M.; Lacerda, M. A. S.; Silva, L. A. C.; Paixão, L.; Bastos, F. M.; Ramirez, J. V.; Junior, J. P. R.

    2017-11-01

The Monte Carlo Modelling Expert Group (MCMEG) is an expert network specializing in Monte Carlo radiation transport modelling and simulation applied to radiation protection and dosimetry research. For its first inter-comparison task, the group launched an exercise to model and simulate a 6 MV LINAC photon beam using the Monte Carlo codes available within the members' laboratories and to validate the simulated results against experimental measurements carried out at the National Cancer Institute (INCA) in Rio de Janeiro, Brazil. The experimental measurements were performed using an ionization chamber with calibration traceable to a Secondary Standard Dosimetry Laboratory (SSDL). The detector was immersed in a water phantom at different depths and irradiated with a radiation field size of 10×10 cm2. This exposure setup was used to determine the dosimetric parameters Percentage Depth Dose (PDD) and Tissue Phantom Ratio (TPR). The validation process compares the MC-calculated results to the experimentally measured PDD20,10 and TPR20,10. Simulations reproducing the experimental TPR20,10 quality index provided a satisfactory description of both the PDD curve and the transverse profiles at the two measured depths. This paper reports in detail the modelling process using the MCNPX, MCNP6, EGSnrc and PENELOPE Monte Carlo codes, the source and tally descriptions, the validation processes and the results.
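The beam quality index TPR20,10 used in this validation can be estimated from percentage-depth-dose measurements; the empirical conversion sketched below is the one quoted in IAEA TRS-398, and the numerical values in the usage check are typical 6 MV magnitudes chosen for illustration.

```python
def tpr20_10_from_pdd(pdd20, pdd10):
    """Estimate the beam quality index TPR20,10 from percentage depth
    doses at 20 cm and 10 cm depth (100 cm SSD, 10x10 cm^2 field),
    via the empirical relation quoted in IAEA TRS-398:
        TPR20,10 = 1.2661 * PDD20,10 - 0.0595
    where PDD20,10 is the ratio of the two depth doses."""
    pdd20_10 = pdd20 / pdd10
    return 1.2661 * pdd20_10 - 0.0595
```

A typical 6 MV beam with PDDs of roughly 38.5% at 20 cm and 67% at 10 cm yields a TPR20,10 near 0.67, in the expected range for that nominal energy.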

  8. Comparison of IPSM 1990 photon dosimetry code of practice with IAEA TRS‐398 and AAPM TG‐51.

    PubMed Central

    Henríquez, Francisco Cutanda

    2009-01-01

Several codes of practice for photon dosimetry are currently used around the world, supported by different organizations. A comparison of IPSM 1990 with both IAEA TRS‐398 and AAPM TG‐51 has been performed. All three protocols are based on the calibration of ionization chambers in terms of standards of absorbed dose to water, as is the case with other modern codes of practice. The comparison was carried out for photon beams of nominal energies 4 MV, 6 MV, 8 MV, 10 MV and 18 MV. An NE 2571 graphite ionization chamber was used in this study, cross‐calibrated against an NE 2611A Secondary Standard calibrated at the National Physical Laboratory (NPL). Absolute dose in reference conditions was obtained using each of the three protocols, including beam quality indices, beam quality conversion factors (both theoretical and NPL experimental ones), correction factors for influence quantities and absolute dose measurements. Each protocol's recommendations were strictly followed. Uncertainties were obtained according to the ISO Guide to the Expression of Uncertainty in Measurement. Absorbed doses obtained according to all three protocols agree within experimental uncertainty. The largest difference between absolute dose results for two protocols is obtained for the highest energy: 0.7% between IPSM 1990 and IAEA TRS‐398 using theoretical beam quality conversion factors. PACS number: 87.55.tm

  9. Assessment of the feasibility of using transrectal ultrasound for postimplant dosimetry in low-dose-rate prostate brachytherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davies, Rhian Siân, E-mail: rhian.s.davies@wales.nhs.uk; Perrett, Teresa; Powell, Jane

A study was performed to establish whether transrectal ultrasound (TRUS)-based postimplant dosimetry (PID) is both practically feasible and comparable to computed tomography (CT)-based PID, as recommended in current published guidelines. In total, 22 patients treated consecutively at a single cancer center with low-dose-rate (LDR) brachytherapy for early-stage prostate cancer had a transrectal ultrasound performed immediately after implant (d0-TRUS) and a computed tomography scan 30 days after implant (d30-CT). Postimplant dosimetry planning was performed on both image sets and the results were compared. The interobserver reproducibility of the TRUS postimplant dosimetry planning technique was also assessed. There was no significant difference in mean prostate D{sub 90} (136.5 Gy and 144.4 Gy, p = 0.2197), V{sub 100} (86.4% and 89.1%, p = 0.1480) and V{sub 150} (52.0% and 47.8%, p = 0.1657) for d30-CT and d0-TRUS, respectively. Rectal doses were significantly higher for d0-TRUS than d30-CT. Urethral doses were available with d0-TRUS only. We have shown that d0-TRUS PID is a useful tool for assessing the quality of an implant after low-dose-rate prostate brachytherapy and is comparable to d30-CT PID. There are clear advantages to its use in terms of resource and time efficiency, both for the clinical team and the patient.

  10. Two-dimensional dosimetry of radiotherapeutical proton beams using thermoluminescence foils.

    PubMed

    Czopyk, L; Klosowski, M; Olko, P; Swakon, J; Waligorski, M P R; Kajdrowicz, T; Cuttone, G; Cirrone, G A P; Di Rosa, F

    2007-01-01

In modern radiation therapy, such as intensity-modulated radiation therapy or proton therapy, the target volume can be covered with improved dose conformity and surrounding tissue spared, provided suitable measurement techniques are available. Novel thermoluminescence dosimetry (TLD) foils, developed from a hot-pressed mixture of LiF:Mg,Cu,P (MCP TL) powder and ethylene-tetrafluoroethylene (ETFE) copolymer, have been applied for 2-D dosimetry of radiotherapeutic proton beams at INFN Catania and IFJ Krakow. A TLD reader with a 70 mm heating plate and a CCD camera was used to read the 2-D emission pattern of the irradiated foils. The absorbed dose profiles were evaluated taking into account TLD-specific correction factors such as dose and energy response. The TLD foils were used to measure dose distributions within an eye phantom, and the results were compared with predictions obtained from the MCNPX code and the Eclipse Ocular Proton Planning (Varian Medical Systems) clinical radiotherapy planning system. We demonstrate the possibility of measuring 2-D dose distributions with a point resolution of about 0.5 x 0.5 mm2.

  11. Depth Dose Distribution Study within a Phantom Torso after Irradiation with a Simulated Solar Particle Event at NSRL

    NASA Technical Reports Server (NTRS)

    Berger, Thomas; Matthiae, Daniel; Koerner, Christine; George, Kerry; Rhone, Jordan; Cucinotta, Francis; Reitz, Guenther

    2010-01-01

Adequate knowledge of the radiation environment and the doses incurred during a space mission is essential for estimating an astronaut's health risk. The space radiation environment is complex and variable, and exposures inside the spacecraft and the astronaut's body are compounded by the interactions of the primary particles with the atoms of the structural materials and with the body itself. Astronauts' radiation exposures are measured by means of personal dosimetry, but there remains substantial uncertainty associated with the computational extrapolation of skin dose to organ dose, which can lead to over- or underestimation of the health risk. Comparisons of models to data have shown that an astronaut's effective dose (E) can be predicted to within about ±10% using space radiation transport models for galactic cosmic rays (GCR) and trapped radiation behind shielding. However, for solar particle events (SPEs) with steep energy spectra, and for extra-vehicular activities on the surface of the moon where only tissue shielding is present, transport models differ substantially in their assumptions and hence in the projected organ doses. Experimental verification of SPE-induced organ doses may therefore be crucial for the design of lunar missions. In the research experiment "Depth dose distribution study within a phantom torso" at the NASA Space Radiation Laboratory (NSRL) at BNL, Brookhaven, USA, the large 1972 SPE spectrum was simulated using seven different proton energies from 50 up to 450 MeV. A phantom torso constructed of natural bones and realistic distributions of human-tissue-equivalent materials, comparable to the torso of the MATROSHKA phantom currently on the ISS, was equipped with a comprehensive set of thermoluminescence detectors and human cells. The detectors are applied to assess the depth dose distribution, and radiation transport codes (e.g. GEANT4) are used to assess the radiation field and its interactions with the phantom torso. Lymphocyte cells are strategically embedded at selected locations at the skin and internal organs and are processed after irradiation to assess the effects of shielding on the yield of chromosome damage. The initial focus of the present experiment is to correlate biological results with physical dosimetry measurements in the phantom torso. Furthermore, the results of passive dosimetry within anthropomorphic phantoms represent the best tool for generating reliable data to benchmark computational radiation transport models in a radiation field of interest. The presentation will give first results of the physical dose distribution, a comparison with GEANT4 computer simulations based on a voxel model of the phantom, and a comparison with the data from the chromosome aberration study.

  12. Comparison of codes assessing galactic cosmic radiation exposure of aircraft crew.

    PubMed

    Bottollier-Depois, J F; Beck, P; Bennett, B; Bennett, L; Bütikofer, R; Clairand, I; Desorgher, L; Dyer, C; Felsberger, E; Flückiger, E; Hands, A; Kindl, P; Latocha, M; Lewis, B; Leuthold, G; Maczka, T; Mares, V; McCall, M J; O'Brien, K; Rollet, S; Rühm, W; Wissmann, F

    2009-10-01

The assessment of exposure to cosmic radiation onboard aircraft is one of the preoccupations of bodies responsible for radiation protection. The cosmic particle flux is significantly higher onboard aircraft than at ground level, and its intensity depends on the solar activity. The dose is usually estimated using codes validated against experimental data. In this paper, a comparison of various codes, some of them used routinely, is presented for assessing the dose received by aircraft crew due to galactic cosmic radiation. Results are provided for periods close to solar maximum and minimum and for selected flights covering major commercial routes in the world. The overall agreement between the codes, particularly those routinely used for aircraft crew dosimetry, was better than +/-20 % from the median in all but two cases. The agreement within the codes is considered to be fully satisfactory for radiation protection purposes.
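The quoted agreement figure is a percent deviation of each code's route-dose estimate from the median over all codes; a minimal sketch of that statistic (the dose values in the test are illustrative, not data from the paper):

```python
from statistics import median

def deviation_from_median_percent(values):
    """Percent deviation of each code's route-dose estimate from the
    median of all estimates, the statistic used to express inter-code
    agreement for aircraft crew dosimetry."""
    m = median(values)
    return [100.0 * (v - m) / m for v in values]
```

Using the median rather than the mean keeps the reference value robust to a single outlying code.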

  13. Biological dose estimation for charged-particle therapy using an improved PHITS code coupled with a microdosimetric kinetic model.

    PubMed

    Sato, Tatsuhiko; Kase, Yuki; Watanabe, Ritsuko; Niita, Koji; Sihver, Lembit

    2009-01-01

Microdosimetric quantities such as the lineal energy, y, are better indices than LET for expressing the RBE of HZE particles. However, the use of microdosimetric quantities in computational dosimetry is severely limited by the difficulty of calculating their probability densities in macroscopic matter. We therefore improved the particle transport simulation code PHITS, providing it with the capability of estimating microdosimetric probability densities in a macroscopic framework by incorporating a mathematical function that can instantaneously calculate the probability densities around the trajectory of HZE particles with a precision equivalent to that of a microscopic track-structure simulation. A new method for estimating the biological dose, the product of physical dose and RBE, from charged-particle therapy was established using the improved PHITS coupled with a microdosimetric kinetic model. The accuracy of the biological dose estimated by this method was tested by comparing the calculated physical doses and RBE values with the corresponding data measured in a slab phantom irradiated with several kinds of HZE particles. The simulation technique established in this study will help to optimize the treatment planning of charged-particle therapy, thereby maximizing the therapeutic effect on tumors while minimizing unintended harmful effects on surrounding normal tissues.

  14. Reducing the number of CTs performed to monitor personalized dosimetry during peptide receptor radionuclide therapy (PRRT).

    PubMed

    Chicheportiche, Alexandre; Artoul, Faozi; Schwartz, Arnon; Grozinsky-Glasberg, Simona; Meirovitz, Amichay; Gross, David J; Godefroy, Jeremy

    2018-06-19

Peptide receptor radionuclide therapy (PRRT) with [177Lu]-DOTA-TATE is an effective treatment of neuroendocrine tumors (NETs). After each cycle of treatment, patient dosimetry evaluates the radiation dose to the risk organs, the kidneys and bone marrow, the most radiosensitive tissues. Absorbed doses are calculated from the radioactivity in the blood and from single photon emission computed tomography (SPECT) images corrected by computed tomography (CT) acquired after each course of treatment. The aim of this work is to assess whether the dosimetry along all treatment cycles can be calculated using a single CT. We hypothesize that the absorbed doses to the risk organs calculated with a single CT will be accurate enough to correctly manage the patients, i.e., to decide whether or not to continue PRRT. Twenty-four patients diagnosed with metastatic NETs undergoing PRRT with [177Lu]-DOTA-TATE were retrospectively included in this study. We compared radiation doses to the kidneys and bone marrow using two protocols. In the "classical" one, dosimetry is calculated based on a SPECT and a CT after each treatment cycle. In the new protocol, dosimetry is calculated based on a SPECT study after each cycle but with the first acquired CT for all cycles. The decision whether or not to stop PRRT because of an unsafe absorbed dose to the risk organs would have been the same had the classical or the new protocol been used. The agreement between the cumulative doses to the kidneys and bone marrow obtained from the two protocols was excellent, with Pearson's correlation coefficients r = 0.95 and r = 0.99 (P < 0.0001) and mean relative differences of 5.30 ± 6.20% and 0.48 ± 4.88%, respectively. Dosimetry calculations for a given patient can thus be done using a single CT registered to serial SPECTs. This new protocol reduces the need for a hybrid camera in the follow-up of patients receiving [177Lu]-DOTA-TATE.
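The two agreement statistics reported above are standard; a minimal sketch of both (function names are ours, and the values in the test are illustrative, not patient data):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between paired dose estimates
    from the two dosimetry protocols."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def mean_relative_difference_percent(ref, new):
    """Mean relative difference (%) of the single-CT doses with respect
    to the per-cycle-CT reference doses."""
    diffs = [100.0 * (b - a) / a for a, b in zip(ref, new)]
    return sum(diffs) / len(diffs)
```

A high r alone does not rule out a systematic offset, which is why the mean relative difference is reported alongside it.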

  15. Comparison of Analysis Results Between 2D/1D Synthesis and RAPTOR-M3G in the Korea Standard Nuclear Plant (KSNP)

    NASA Astrophysics Data System (ADS)

    Joung Lim, Mi; Maeng, Young Jae; Fero, Arnold H.; Anderson, Stanwood L.

    2016-02-01

The 2D/1D synthesis methodology has been used to calculate the fast neutron (E > 1.0 MeV) exposure to the beltline region of the reactor pressure vessel. This method uses the DORT 3.1 discrete ordinates code and the BUGLE-96 cross-section library based on ENDF/B-VI. RAPTOR-M3G (RApid Parallel Transport Of Radiation-Multiple 3D Geometries), which performs full 3D calculations, was developed and is based on domain decomposition algorithms, where the spatial and angular domains are allocated and processed on multi-processor computer architecture. As compared to traditional single-processor applications, this approach reduces the computational load as well as the memory requirement per processor. Both methods are applied to surveillance test results for the Korea Standard Nuclear Plant (KSNP)-OPR (Optimized Power Reactor) 1000 MW. The objective of this paper is to compare the results of the KSNP surveillance program between 2D/1D synthesis and RAPTOR-M3G. Each operating KSNP has a reactor vessel surveillance program consisting of six surveillance capsules located between the core and the reactor vessel in the downcomer region near the reactor vessel wall. In addition to the In-Vessel surveillance program, an Ex-Vessel Neutron Dosimetry (EVND) program has been implemented. In order to estimate surveillance test results, cycle-specific forward transport calculations were performed by 2D/1D synthesis and by RAPTOR-M3G. The ratio between measured and calculated (M/C) reaction rates will be discussed. The current plan is to install an EVND system in all Korean PWRs, including the new reactor type, APR (Advanced Power Reactor) 1400 MW. This work will play an important role in establishing a KSNP-specific database of surveillance test results and will employ RAPTOR-M3G for dosimetry at the surveillance locations as well as at other positions in the KSNP reactor vessel.

  16. The Mayak Worker Dosimetry System (MWDS-2013): Implementation of the Dose Calculations.

    PubMed

Zhdanov, A; Vostrotin, V; Efimov, A; Birchall, A; Puncher, M

    2016-07-15

The calculation of internal doses for the Mayak Worker Dosimetry System (MWDS-2013) involved extensive computational resources due to the complexity and sheer number of calculations required. The required output consisted of a set of 1000 hyper-realizations: each hyper-realization consists of a set (1 for each worker) of probability distributions of organ doses. This report describes the hardware components and computational approaches required to make the calculation tractable. Together with the software, this system is referred to here as the 'PANDORA system'. It is based on a commercial SQL server database in a series of six work stations. A complete run of the entire Mayak worker cohort entailed a huge amount of calculations in PANDORA and due to the relatively slow speed of writing the data into the SQL server, each run took about 47 days. Quality control was monitored by comparing doses calculated in PANDORA with those in a specially modified version of the commercial software 'IMBA Professional Plus'. Suggestions are also made for increasing calculation and storage efficiency for future dosimetry calculations using PANDORA.

  17. Index extraction for electromagnetic field evaluation of high power wireless charging system.

    PubMed

    Park, SangWook

    2017-01-01

This paper presents precise dosimetry for a highly resonant wireless power transfer (HR-WPT) system using an anatomically realistic human voxel model. The dosimetry for the HR-WPT system, designed to operate at 13.56 MHz (one of the ISM frequency bands), is conducted for various distances between the human model and the system, and for both aligned and misaligned transmitting and receiving circuits. The specific absorption rates in the human body are computed by a two-step approach: in the first step, the field generated by the HR-WPT system is calculated, and in the second step the specific absorption rates are computed with the scattered-field finite-difference time-domain method, treating the fields obtained in the first step as the incident fields. Safety compliance for non-uniform field exposure from the HR-WPT system is discussed with respect to the international safety guidelines. Furthermore, the coupling factor concept is employed to relax the maximum allowable transmitting power; coupling factors derived from the dosimetry results are presented. In this calculation, the restriction on the external magnetic field from the HR-WPT system can be relaxed by approximately a factor of four using the coupling factor in the worst exposure scenario.

  18. Uncertainty propagation for SPECT/CT-based renal dosimetry in 177Lu peptide receptor radionuclide therapy

    NASA Astrophysics Data System (ADS)

    Gustafsson, Johan; Brolin, Gustav; Cox, Maurice; Ljungberg, Michael; Johansson, Lena; Sjögreen Gleisner, Katarina

    2015-11-01

A computer model of a patient-specific clinical 177Lu-DOTATATE therapy dosimetry system is constructed and used for investigating the variability of renal absorbed dose and biologically effective dose (BED) estimates. As patient models, three anthropomorphic computer phantoms coupled to a pharmacokinetic model of 177Lu-DOTATATE are used. Aspects included in the dosimetry-process model are the gamma-camera calibration via measurement of the system sensitivity, selection of imaging time points, generation of mass-density maps from CT, SPECT imaging, volume-of-interest delineation, calculation of the absorbed-dose rate via a combination of local energy deposition for electrons and Monte Carlo simulations of photons, and curve fitting and integration to absorbed dose and BED. By introducing variability in these steps, the combined uncertainty in the output quantity is determined. The importance of different sources of uncertainty is assessed by observing the decrease in standard deviation when a particular source is removed. The obtained absorbed dose and BED standard deviations are approximately 6%, and slightly higher if the root-mean-square error is considered. The most important sources of variability are the compensation for partial-volume effects via a recovery coefficient and the gamma-camera calibration via the system sensitivity.
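In such dosimetry chains the fitted time-dose-rate curve is commonly a biexponential, whose integral to infinity has a closed form. A minimal sketch (the symbol names are ours, and the biexponential form is the common choice for serial SPECT time points, not a claim about this paper's exact implementation):

```python
def absorbed_dose_from_biexponential(a1, lam1, a2, lam2):
    """Absorbed dose (Gy) as the analytic integral from 0 to infinity
    of a biexponential dose-rate curve
        Ddot(t) = a1*exp(-lam1*t) + a2*exp(-lam2*t),
    with amplitudes a_i in Gy/h and rate constants lam_i in 1/h:
        D = a1/lam1 + a2/lam2."""
    return a1 / lam1 + a2 / lam2
```

Because each term integrates to a_i/lam_i, propagating fit-parameter uncertainties through this expression is straightforward, one reason biexponential fits are popular for time-dose-rate curves.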

  19. The calibration of a Scanditronix-Wellhöfer thimble chamber for photon dosimetry using the IAEA TRS 277 code of practice.

    PubMed

    Fourie, O L

    2004-03-01

This note investigates the calibration of a Scanditronix-Wellhöfer type FC65-G ionisation chamber to be used in clinical photon dosimetry. The current Adaptation by the Australasian College of Physical Scientists and Engineers in Medicine (ACPSEM) of the IAEA TRS 277 dosimetry protocol makes no provision for this type of chamber. The absorbed-dose-to-air calibration coefficient ND was therefore calculated from the air kerma calibration coefficient NK using the formalism of the IAEA TRS 277 protocol, and it is shown that the value of the correction factor product km katt for the FC65-G chamber is identical to that of the NE 2571 chamber. ND was also determined experimentally from a cross-calibration against an NE 2571 chamber, and good correspondence was found between the calculated and measured values. To establish to what extent the ACPSEM Adaptation can be used for the FC65-G chamber, values for the ratio of stopping powers in water and air (Sw,air)Q and the perturbation correction factor pQ were calculated using the TRS 277 protocol. These results show that over the range of beam qualities TPR20,10 = 0.59 to TPR20,10 = 0.78 the Adaptation can be used for the FC65-G chamber.
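The TRS-277 conversion referred to above has the form N_D = N_K (1 - g) km katt, where g is the bremsstrahlung fraction in air for the 60Co calibration beam. A hedged numerical sketch (the km·katt magnitude in the test is a typical Farmer-chamber-scale value chosen for illustration, not the paper's measured value):

```python
def nd_from_nk(nk, km_katt, g=0.003):
    """Absorbed-dose-to-air calibration coefficient N_D from the air
    kerma calibration coefficient N_K, following the IAEA TRS-277
    formalism:
        N_D = N_K * (1 - g) * km * katt
    g (~0.003 for 60Co) is the bremsstrahlung fraction in air; the
    combined factor km*katt is chamber dependent."""
    return nk * (1.0 - g) * km_katt
```

The note's key observation, that km·katt for the FC65-G equals that of the NE 2571, means the same conversion applies to both chambers.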

  20. The World as Viewed by and with Unpaired Electrons

    PubMed Central

    Eaton, Sandra S.; Eaton, Gareth R.

    2012-01-01

    Recent advances in electron paramagnetic resonance (EPR) include capabilities for applications to areas as diverse as archeology, beer shelf life, biological structure, dosimetry, in vivo imaging, molecular magnets, and quantum computing. Enabling technologies include multifrequency continuous wave, pulsed, and rapid scan EPR. Interpretation is enhanced by increasingly powerful computational models. PMID:22975244

  1. Ionization chamber dosimetry of small photon fields: a Monte Carlo study on stopping-power ratios for radiosurgery and IMRT beams.

    PubMed

    Sánchez-Doblado, F; Andreo, P; Capote, R; Leal, A; Perucha, M; Arráns, R; Núñez, L; Mainegra, E; Lagares, J I; Carrasco, E

    2003-07-21

Absolute dosimetry with ionization chambers of the narrow photon fields used in stereotactic techniques and IMRT beamlets is constrained by the lack of electron equilibrium in the radiation field. It is questionable whether stopping-power ratios in dosimetry protocols, obtained for broad photon beams and quasi-electron-equilibrium conditions, can be used in the dosimetry of narrow fields while keeping the uncertainty at the same level as for the broad beams used in accelerator calibrations. Monte Carlo simulations have been performed for two 6 MV clinical accelerators (Elekta SL-18 and Siemens Mevatron Primus), equipped with radiosurgery applicators and MLC. Narrow circular and Z-shaped on-axis and off-axis fields, as well as broad IMRT-configured beams, have been simulated together with reference 10 x 10 cm2 beams. Phase-space data have been used to generate 3D dose distributions which have been compared satisfactorily with experimental profiles (ion chamber, diodes and film). Photon and electron spectra at various depths in water have been calculated, followed by Spencer-Attix (delta = 10 keV) stopping-power ratio calculations which have been compared to those used in the IAEA TRS-398 code of practice. For water/air and PMMA/air stopping-power ratios, agreement within 0.1% has been obtained for the 10 x 10 cm2 fields. For radiosurgery applicators and narrow MLC beams, the calculated s(w,air) values agree with the reference within +/-0.3%, well within the estimated standard uncertainty of the reference stopping-power ratios (0.5%). Ionization chamber dosimetry of narrow beams at the photon quality used in this work (6 MV) can therefore be based on the stopping-power ratio data in dosimetry protocols. For a modulated 6 MV broad beam used in clinical IMRT, s(w,air) agrees within 0.1% with the value for 10 x 10 cm2, confirming that at low energies IMRT absolute dosimetry can also be based on data for open reference fields.
At higher energies (24 MV) the difference in s(w,air) was up to 1.1%, indicating that the use of protocol data for narrow beams in such cases is less accurate than at low energies, and detailed calculations of the dosimetry parameters involved should be performed if similar accuracy to that of 6 MV is sought.

  2. Edema and Seed Displacements Affect Intraoperative Permanent Prostate Brachytherapy Dosimetry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Westendorp, Hendrik, E-mail: r.westendorp@radiotherapiegroep.nl; Nuver, Tonnis T.; Department of Radiation Oncology, Radiotherapiegroep Behandellocatie Deventer, Deventer

    Purpose: We sought to identify the intraoperative displacement patterns of seeds and to evaluate the correlation of intraoperative dosimetry with day 30 for permanent prostate brachytherapy. Methods and Materials: We analyzed the data from 699 patients. Intraoperative dosimetry was acquired using transrectal ultrasonography (TRUS) and C-arm cone beam computed tomography (CBCT). Intraoperative dosimetry (minimal dose to 40%-95% of the volume [D{sub 40}-D{sub 95}]) was compared with the day 30 dosimetry for both modalities. An additional edema-compensating comparison was performed for D{sub 90}. Stranded seeds were linked between TRUS and CBCT using an automatic and fast linking procedure. Displacement patterns weremore » analyzed for each seed implantation location. Results: On average, an intraoperative (TRUS to CBCT) D{sub 90} decline of 10.6% ± 7.4% was observed. Intraoperative CBCT D{sub 90} showed a greater correlation (R{sup 2} = 0.33) with respect to Day 30 than did TRUS (R{sup 2} = 0.17). Compensating for edema, the correlation increased to 0.41 for CBCT and 0.38 for TRUS. The mean absolute intraoperative seed displacement was 3.9 ± 2.0 mm. The largest seed displacements were observed near the rectal wall. The central and posterior seeds showed less caudal displacement than lateral and anterior seeds. Seeds that were implanted closer to the base showed more divergence than seeds close to the apex. Conclusions: Intraoperative CBCT D{sub 90} showed a greater correlation with the day 30 dosimetry than intraoperative TRUS. Edema seemed to cause most of the systematic difference between the intraoperative and day 30 dosimetry. Seeds near the rectal wall showed the most displacement, comparing TRUS and CBCT, probably because of TRUS probe–induced prostate deformation.« less

  3. Computed Tomography

    NASA Astrophysics Data System (ADS)

    Castellano, Isabel; Geleijns, Jacob

    After its clinical introduction in 1973, computed tomography developed from an x-ray modality for axial imaging in neuroradiology into a versatile three-dimensional imaging modality for a wide range of applications in, for example, oncology, vascular radiology, cardiology, traumatology and even interventional radiology. Computed tomography is applied for diagnosis, follow-up studies and screening of healthy subpopulations with specific risk factors. This chapter provides a general introduction to computed tomography, covering a short history of the modality, technology, image quality, dosimetry, room shielding, quality control and quality criteria.

  4. Determination of absorbed dose to water for high-energy photon and electron beams-comparison of the standards DIN 6800-2 (1997), IAEA TRS 398 (2000) and DIN 6800-2 (2006)

    PubMed Central

    Zakaria, Golam Abu; Schuette, Wilhelm

    2007-01-01

    For the determination of the absorbed dose to water for high-energy photon and electron beams, the IAEA code of practice TRS-398 (2000) is applied internationally. In Germany, the German dosimetry protocol DIN 6800-2 (1997) is used. Recently, the DIN standard was revised and published as Draft National Standard DIN 6800-2 (2006), which has largely adopted the methodology and dosimetric data of the code of practice. This paper compares these three dosimetry protocols systematically and identifies similarities as well as differences. The investigation was done with 6 and 18 MV photon beams as well as 5 to 21 MeV electron beams. While only cylindrical chambers were used for photon beams, measurements of electron beams were performed using cylindrical as well as plane-parallel chambers. The discrepancies in the determination of absorbed dose to water between the three protocols were 0.4% for photon beams and 1.5% for electron beams. Comparative measurements showed a deviation of less than 0.5% between our measurements following protocol DIN 6800-2 (2006) and a TLD intercomparison procedure in an external audit. PMID:21217912
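All three protocols rest on the same ionometric formalism, D{sub w,Q} = M{sub Q} · N{sub D,w,Q0} · k{sub Q,Q0}; a minimal sketch with illustrative numbers (not values from the paper):

```python
def absorbed_dose_to_water(m_q, n_dw, k_q):
    """TRS-398-style formalism: D_w,Q = M_Q * N_D,w,Q0 * k_Q,Q0.

    m_q  : fully corrected chamber reading (C), after P_TP, P_pol, P_s, P_elec
    n_dw : absorbed-dose-to-water calibration coefficient (Gy/C) at quality Q0
    k_q  : beam-quality correction factor (dimensionless)
    """
    return m_q * n_dw * k_q

# Illustrative example (invented numbers): a Farmer-type chamber reading
# of 20 nC, N_D,w = 5.4e7 Gy/C, and k_Q = 0.991 for a 6 MV beam.
dose = absorbed_dose_to_water(20e-9, 5.4e7, 0.991)   # Gy
```

The protocols differ mainly in the data (stopping-power ratios, perturbation factors) entering k{sub Q,Q0}, which is what produces the 0.4%/1.5% discrepancies reported above.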

  5. Determination of absorbed dose to water for high-energy photon and electron beams-comparison of the standards DIN 6800-2 (1997), IAEA TRS 398 (2000) and DIN 6800-2 (2006).

    PubMed

    Zakaria, Golam Abu; Schuette, Wilhelm

    2007-01-01

    For the determination of the absorbed dose to water for high-energy photon and electron beams, the IAEA code of practice TRS-398 (2000) is applied internationally. In Germany, the German dosimetry protocol DIN 6800-2 (1997) is used. Recently, the DIN standard was revised and published as Draft National Standard DIN 6800-2 (2006), which has largely adopted the methodology and dosimetric data of the code of practice. This paper compares these three dosimetry protocols systematically and identifies similarities as well as differences. The investigation was done with 6 and 18 MV photon beams as well as 5 to 21 MeV electron beams. While only cylindrical chambers were used for photon beams, measurements of electron beams were performed using cylindrical as well as plane-parallel chambers. The discrepancies in the determination of absorbed dose to water between the three protocols were 0.4% for photon beams and 1.5% for electron beams. Comparative measurements showed a deviation of less than 0.5% between our measurements following protocol DIN 6800-2 (2006) and a TLD intercomparison procedure in an external audit.

  6. Ex-vessel neutron dosimetry analysis for westinghouse 4-loop XL pressurized water reactor plant using the RadTrack{sup TM} Code System with the 3D parallel discrete ordinates code RAPTOR-M3G

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, J.; Alpan, F. A.; Fischer, G.A.

    2011-07-01

    Traditional two-dimensional (2D)/one-dimensional (1D) synthesis methodology has been widely used to calculate the fast neutron (>1.0 MeV) fluence exposure of the reactor pressure vessel in the belt-line region. However, this methodology cannot be expected to provide accurate fast neutron fluence calculations at elevations far above or below the active core region. A three-dimensional (3D) parallel discrete ordinates calculation for ex-vessel neutron dosimetry on a Westinghouse 4-Loop XL Pressurized Water Reactor has been performed. It shows good agreement between calculated and measured results. Furthermore, the results show very different fast neutron flux values at some of the former plate locations and at elevations above and below the active core than those calculated by a 2D/1D synthesis method. This indicates that for certain irregular reactor internal structures, where the fast neutron flux has a very strong local effect, a 3D transport method is required to calculate accurate fast neutron exposure. (authors)
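The 2D/1D synthesis approximation that the 3D calculation is compared against is conventionally written as φ(r, θ, z) ≈ φ(r, θ) · φ(r, z) / φ(r). A minimal sketch on a toy separable flux, where the approximation happens to be exact (all numbers invented):

```python
import numpy as np

def synthesize_flux(phi_rt, phi_rz, phi_r):
    """Classic 2D/1D flux synthesis:
        phi(r, theta, z) ~= phi(r, theta) * phi(r, z) / phi(r)

    phi_rt : (nr, nt) array from the (r, theta) calculation
    phi_rz : (nr, nz) array from the (r, z) calculation
    phi_r  : (nr,)    array from the 1D radial calculation
    """
    return phi_rt[:, :, None] * phi_rz[:, None, :] / phi_r[:, None, None]

# Toy separable flux: phi = r * f(theta) * g(z), so synthesis is exact.
r = np.array([1.0, 2.0])
f_t = np.array([1.0, 0.5, 0.25])
f_z = np.array([2.0, 1.0])
phi_r = r
phi_rt = r[:, None] * f_t[None, :]
phi_rz = r[:, None] * f_z[None, :]
phi = synthesize_flux(phi_rt, phi_rz, phi_r)
```

The abstract's point is precisely that near irregular internals the true flux is not separable in this way, so the synthesized product misses strong local 3D effects.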

  7. Sci-Sat AM: Radiation Dosimetry and Practical Therapy Solutions - 03: Energy dependence of a clinical probe-format calorimeter and its pertinence to absolute photon and electron beam dosimetry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Renaud, James; Seuntjens, Jan; Sarfehnia, Arman

    Purpose: To evaluate the intrinsic and absorbed-dose energy dependence of a small-scale graphite calorimeter probe (GPC) developed for use as a routine clinical dosimeter. The influence of charge deposition on the response of the GPC was also assessed by performing absolute dosimetry in clinical linac-based electron beams. Methods: Intrinsic energy dependence was determined by performing constant-temperature calorimetry dose measurements in a water-equivalent solid phantom, under otherwise reference conditions, in five high-energy photon (63.5 < %dd(10){sub X} < 76.3), and five electron (2.3 cm < R{sub 50} < 8.3 cm) beams. Reference dosimetry was performed for all beams in question using an Exradin A19 ion chamber with a calibration traceable to national standards. The absorbed-dose component of the overall energy dependence was calculated using the EGSnrc egs-chamber user code. Results: A total of 72 measurements were performed with the GPC, resulting in a standard error on the mean absorbed dose of better than 0.3 % for all ten beams. For both the photon and electron beams, no statistically-significant energy dependence was observed experimentally. Peak-to-peak variations in the relative response of the GPC across all beam qualities of a given radiation type were on the order of 1 %. No effects, either transient or permanent, were attributable to the charge deposited by the electron beams. Conclusions: The GPC’s apparent energy-independence, combined with its well-established linearity and dose rate independence, makes it a potentially useful dosimetry system capable of measuring photon and electron doses in absolute terms at the clinical level.
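The calorimetric principle behind the GPC can be sketched from first principles: absorbed dose equals the specific heat capacity times the radiation-induced temperature rise, corrected for any heat defect. The numbers below are illustrative, not the paper's:

```python
def calorimetric_dose(c_p, delta_t, heat_defect=0.0):
    """Absorbed dose from a measured temperature rise:
        D = c_p * dT / (1 - h)

    c_p         : specific heat capacity (J kg^-1 K^-1)
    delta_t     : radiation-induced temperature rise (K)
    heat_defect : fraction of energy not appearing as heat
                  (taken as ~0 for graphite)
    """
    return c_p * delta_t / (1.0 - heat_defect)

# 1 Gy in graphite (c_p ~ 710 J/kg/K, a typical handbook value) raises
# the temperature by only ~1.4 mK, which is why probe-format calorimeters
# need millikelvin-level thermometry.
dose = calorimetric_dose(710.0, 1.41e-3)   # Gy
```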

  8. The visible signal responsible for proton therapy dosimetry using bare optical fibers is not Čerenkov radiation.

    PubMed

    Darafsheh, Arash; Taleei, Reza; Kassaee, Alireza; Finlay, Jarod C

    2016-11-01

    Proton beam dosimetry using bare plastic optical fibers has emerged as a simple approach to proton beam dosimetry. The source of the signal in this method has been attributed to Čerenkov radiation. The aim of this work was a phenomenological study of the nature of the visible light responsible for the signal in bare fiber optic dosimetry of proton therapy beams. Plastic fiber optic probes embedded in solid water phantoms were irradiated with proton beams of energies 100, 180, and 225 MeV produced by a proton therapy cyclotron. Luminescence spectroscopy was performed by a CCD-coupled spectrometer. The spectra were acquired at various depths in phantom to measure the percentage depth dose (PDD) for each beam energy. For comparison, the PDD curves were acquired using a standard multilayer ion chamber device. In order to further analyze the contribution of the Čerenkov radiation in the spectra, Monte Carlo simulation was performed using fluka Monte Carlo code to stochastically simulate radiation transport, ionizing radiation dose deposition, and optical emission of Čerenkov radiation. The measured depth doses using the bare fiber are in agreement with measurements performed by the multilayer ion chamber device, indicating the feasibility of using bare fiber probes for proton beam dosimetry. The spectroscopic study of proton-irradiated fibers showed a continuous spectrum with a shape different from that of Čerenkov radiation. The Monte Carlo simulations confirmed that the amount of the generated Čerenkov light does not follow the radiation absorbed dose in a medium. The source of the optical signal responsible for the proton dose measurement using bare optical fibers is not Čerenkov radiation. It is fluorescence of the plastic material of the fiber.
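The conclusion is consistent with a simple threshold argument: Čerenkov emission requires β > 1/n, and for protons in a typical PMMA fiber core (n ≈ 1.49, an assumed representative value) the kinetic-energy threshold lies far above the 100, 180, and 225 MeV beams studied, whereas secondary electrons cross it at a fraction of an MeV:

```python
import math

def cherenkov_threshold_mev(rest_mass_mev, n):
    """Kinetic-energy threshold for Cherenkov emission in a medium of
    refractive index n: the particle must satisfy beta > 1/n."""
    beta = 1.0 / n
    gamma = 1.0 / math.sqrt(1.0 - beta * beta)
    return rest_mass_mev * (gamma - 1.0)

M_PROTON = 938.272    # MeV/c^2
M_ELECTRON = 0.511    # MeV/c^2

# PMMA fiber, n ~ 1.49 (assumed):
p_thresh = cherenkov_threshold_mev(M_PROTON, 1.49)    # hundreds of MeV
e_thresh = cherenkov_threshold_mev(M_ELECTRON, 1.49)  # fraction of an MeV
```

Primary protons in these beams therefore cannot radiate Čerenkov light directly, supporting the attribution of the signal to fluorescence of the fiber plastic.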

  9. Individualized adjustments to reference phantom internal organ dosimetry-scaling factors given knowledge of patient internal anatomy.

    PubMed

    Wayson, Michael B; Bolch, Wesley E

    2018-04-13

    Various computational tools are currently available that facilitate patient organ dosimetry in diagnostic nuclear medicine, yet they are typically restricted to reporting organ doses to ICRP-defined reference phantoms. The present study, while remaining computational phantom based, provides straightforward tools to adjust reference phantom organ dose for both internal photon and electron sources. A wide variety of monoenergetic specific absorbed fractions were computed using radiation transport simulations for tissue spheres of varying size and separation distance. Scaling methods were then constructed for both photon and electron self-dose and cross-dose, with data validation provided from patient-specific voxel phantom simulations, as well as via comparison to the scaling methodology given in MIRD Pamphlet No. 11. Photon and electron self-dose was found to be dependent on both radiation energy and sphere size. Photon cross-dose was found to be mostly independent of sphere size. Electron cross-dose was found to be dependent on sphere size when the spheres were in close proximity, owing to differences in electron range. The validation studies showed that this dataset was more effective than the MIRD 11 method at predicting patient-specific photon doses at both high and low energies, but gave similar results at photon energies between 100 keV and 1 MeV. The MIRD 11 method for electron self-dose scaling was accurate for lower energies but began to break down at higher energies. The photon cross-dose scaling methodology developed in this study showed gains in accuracy of up to 9% for actual patient studies, and the electron cross-dose scaling methodology showed gains in accuracy up to 9% as well when only the bremsstrahlung component of the cross-dose was scaled. These dose scaling methods are readily available for incorporation into internal dosimetry software for diagnostic phantom-based organ dosimetry.
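The MIRD Pamphlet No. 11 scaling that the new dataset is benchmarked against can be sketched as follows: photon self-dose S-values scale approximately with mass to the −2/3 power, while electron (non-penetrating) self-dose scales as 1/mass. The reference S-value and organ masses below are invented for illustration:

```python
def scale_self_dose_photon(s_ref, m_ref, m_patient):
    """MIRD Pamphlet No. 11-style photon self-dose scaling:
    S scales approximately as mass^(-2/3), since the specific absorbed
    fraction goes roughly as mass^(1/3) / mass."""
    return s_ref * (m_ref / m_patient) ** (2.0 / 3.0)

def scale_self_dose_electron(s_ref, m_ref, m_patient):
    """Electron self-dose scaling: S ~ 1/mass, i.e. all emitted energy
    is assumed to be absorbed locally in the source organ."""
    return s_ref * (m_ref / m_patient)

# A patient liver 40% heavier than the reference phantom's (masses in g,
# S in arbitrary units; all numbers hypothetical):
s_photon = scale_self_dose_photon(1.0, 1800.0, 2520.0)
s_electron = scale_self_dose_electron(1.0, 1800.0, 2520.0)
```

The study's contribution is replacing these one-size-fits-all exponents with energy- and size-dependent scaling derived from the sphere simulations.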

  10. WE-AB-BRB-12: Nanoscintillator Fiber-Optic Detector System for Microbeam Radiation Therapy Dosimetry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rivera, J; Dooley, J; Chang, S

    2015-06-15

    Purpose: Microbeam Radiation Therapy (MRT) is an experimental radiation therapy that has demonstrated a higher therapeutic ratio than conventional radiation therapy in animal studies. There are several roadblocks to translating this promising treatment technology to clinical application, one of which is the lack of a real-time, high-resolution dosimeter. Current clinical radiation detectors have poor spatial resolution and, as such, are unsuitable for measuring microbeams with submillimeter-scale widths. Although GafChromic film has high spatial resolution, it lacks the real-time dosimetry capability necessary for MRT preclinical research and potential clinical use. In this work we have demonstrated the feasibility of using a nanoscintillator fiber-optic detector (nanoFOD) system for real-time MRT dosimetry. Methods: A microplanar beam array is generated using an x-ray research irradiator and a custom-made, microbeam-forming collimator. The newest-generation nanoFOD has an effective size of 70 µm in the measurement direction and was calibrated against a kV ion chamber (RadCal Accu-Pro) in open-field geometry. We have written a computer script that performs automatic data collection with immediate background subtraction. A computer-controlled detector positioning stage is used to precisely measure the microbeam peak dose and beam profile by translating the stage during data collection. We tested the new-generation nanoFOD system, with increased active scintillation volume, against the previous-generation system. Both raw and processed data are time-stamped and recorded to enable future post-processing. Results: The real-time microbeam dosimetry system worked as expected. The new-generation dosimeter has approximately double the active volume of the previous generation, resulting in an over 900% increase in signal. The active volume of the dosimeter still provided a spatial resolution that meets the Nyquist criterion for our microbeam widths. Conclusion: We have demonstrated that real-time dosimetry of MRT microbeams is feasible using a nanoscintillator fiber-optic detector with an integrated positioning system.
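The Nyquist argument for the detector's active size can be made explicit: to resolve a feature, the sampling element should be no larger than half the feature width. The beam width below is illustrative, since the abstract does not state the microbeam dimensions:

```python
def satisfies_nyquist(detector_size_um, beam_width_um):
    """Spatial Nyquist check: the detector aperture (sampling element)
    should be no larger than half the width of the narrowest feature
    it must resolve."""
    return detector_size_um <= beam_width_um / 2.0

# 70 um effective detector size vs. a hypothetical 300 um wide microbeam:
ok = satisfies_nyquist(70.0, 300.0)
```

By this criterion the 70 µm nanoFOD resolves submillimeter microbeams down to roughly 140 µm width.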

  11. Computational modeling of nanoscale and microscale particle deposition, retention and dosimetry in the mouse respiratory tract.

    PubMed

    Asgharian, B; Price, O T; Oldham, M; Chen, Lung-Chi; Saunders, E L; Gordon, T; Mikheev, V B; Minard, K R; Teeguarden, J G

    2014-12-01

    Comparing effects of inhaled particles across rodent test systems and between rodent test systems and humans is a key obstacle to the interpretation of common toxicological test systems for human risk assessment. These comparisons, correlation with effects and prediction of effects, are best conducted using measures of tissue dose in the respiratory tract. Differences in lung geometry, physiology and the characteristics of ventilation can give rise to differences in the regional deposition of particles in the lung in these species. Differences in regional lung tissue doses cannot currently be measured experimentally. Regional lung tissue dosimetry can, however, be predicted using models developed for rats, monkeys, and humans. A computational model of particle respiratory tract deposition and clearance was developed for BALB/c and B6C3F1 mice, creating a cross-species suite of available models for particle dosimetry in the lung. Airflow and particle transport equations were solved throughout the respiratory tract of these mouse strains to obtain the temporal and spatial concentration of inhaled particles, from which deposition fractions were determined. Particle inhalability (inhalable fraction, IF) and upper respiratory tract (URT) deposition were directly related to particle diffusive and inertial properties. Measurements of the retained mass at several post-exposure times following exposure to iron oxide nanoparticles, micro- and nanoscale C60 fullerene, and nanoscale silver particles were used to calibrate and verify model predictions of total lung dose. Interstrain (mice) and interspecies (mouse, rat and human) differences in particle inhalability, fractional deposition and tissue dosimetry are described for ultrafine, fine and coarse particles.

  12. Design and Construction of an Optical Computed Tomography Scanner for Polymer Gel Dosimetry Application

    PubMed Central

    Zakariaee, Seyed Salman; Mesbahi, Asghar; Keshtkar, Ahmad; Azimirad, Vahid

    2014-01-01

    Polymer gel dosimeters are the only accurate three-dimensional (3D) dosimeters that can measure the absorbed dose distribution in a full 3D setting. Gel dosimetry using optical computed tomography (OCT) has been promoted by several research groups. In the current study, we designed and constructed a prototype OCT system for gel dosimetry. First, the electrical system for optical scanning of the gel container using a Helium-Neon laser and a photocell was designed and constructed. Then, the mechanical part for the two rotational and translational motions was designed and step motors were assembled to it. The data coming from the photocell were grabbed by the home-built interface and sent to a personal computer. Data processing was carried out using MATLAB software. To calibrate the system and tune its functionality, different objects were designed and scanned. Furthermore, the spatial and contrast resolution of the system were determined. The system was able to scan gel dosimeter containers with a diameter of up to 11 cm inside the water phantom. The standard deviation of the pixels within the water flask image was taken as the criterion for image uniformity. The uniformity of the system was about ±0.05%. The spatial resolution of the system was approximately 1 mm and the contrast resolution was about 0.2%. Our primary results showed that this system is able to obtain two-dimensional, cross-sectional images from polymer gel samples. PMID:24761377
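Two of the quantities reported here follow from standard relations: the optical attenuation coefficient recovered via the Beer-Lambert law, and a uniformity figure taken as the relative standard deviation of pixel values. A minimal sketch with invented numbers:

```python
import numpy as np

def attenuation_from_transmission(i0, i, path_cm):
    """Beer-Lambert law: I = I0 * exp(-mu * L)  =>  mu = ln(I0/I) / L."""
    return np.log(i0 / i) / path_cm

def uniformity_percent(image):
    """Relative standard deviation of pixel values (in %), as used here
    for the water-flask uniformity figure."""
    image = np.asarray(image, float)
    return 100.0 * image.std() / image.mean()

# Illustrative: laser intensity halved over a 10 cm path through the gel.
mu = attenuation_from_transmission(1.0, 0.5, 10.0)   # cm^-1
# Tiny toy "image" with ~0.1% pixel-to-pixel variation.
u = uniformity_percent([0.999, 1.0, 1.001, 1.0])
```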

  13. A boundary-representation method for designing whole-body radiation dosimetry models: pregnant females at the ends of three gestational periods—RPI-P3, -P6 and -P9

    NASA Astrophysics Data System (ADS)

    Xu, X. George; Taranenko, Valery; Zhang, Juying; Shi, Chengyu

    2007-12-01

    Fetuses are extremely radiosensitive and the protection of pregnant females against ionizing radiation is of particular interest in many health and medical physics applications. Existing models of pregnant females relied on simplified anatomical shapes or partial-body images of low resolutions. This paper reviews two general types of solid geometry modeling: constructive solid geometry (CSG) and boundary representation (BREP). It presents in detail a project to adopt the BREP modeling approach to systematically design whole-body radiation dosimetry models: a pregnant female and her fetus at the ends of three gestational periods of 3, 6 and 9 months. Based on previously published CT images of a 7-month pregnant female, the VIP-Man model and mesh organ models, this new set of pregnant female models was constructed using 3D surface modeling technologies instead of voxels. The organ masses were adjusted to agree with the reference data provided by the International Commission on Radiological Protection (ICRP) and previously published papers within 0.5%. The models were then voxelized for the purpose of performing dose calculations in identically implemented EGS4 and MCNPX Monte Carlo codes. The agreements of the fetal doses obtained from these two codes for this set of models were found to be within 2% for the majority of the external photon irradiation geometries of AP, PA, LAT, ROT and ISO at various energies. It is concluded that the so-called RPI-P3, RPI-P6 and RPI-P9 models have been reliably defined for Monte Carlo calculations. The paper also discusses the needs for future research and the possibility for the BREP method to become a major tool in the anatomical modeling for radiation dosimetry.

  14. Development of computational small animal models and their applications in preclinical imaging and therapy research.

    PubMed

    Xie, Tianwu; Zaidi, Habib

    2016-01-01

    The development of multimodality preclinical imaging techniques and the rapid growth of realistic computer simulation tools have promoted the construction and application of computational laboratory animal models in preclinical research. Since the early 1990s, over 120 realistic computational animal models have been reported in the literature and used as surrogates to characterize the anatomy of actual animals for the simulation of preclinical studies involving the use of bioluminescence tomography, fluorescence molecular tomography, positron emission tomography, single-photon emission computed tomography, microcomputed tomography, magnetic resonance imaging, and optical imaging. Other applications include electromagnetic field simulation, ionizing and nonionizing radiation dosimetry, and the development and evaluation of new methodologies for multimodality image coregistration, segmentation, and reconstruction of small animal images. This paper provides a comprehensive review of the history and fundamental technologies used for the development of computational small animal models with a particular focus on their application in preclinical imaging as well as nonionizing and ionizing radiation dosimetry calculations. An overview is given of the overall process involved in the design of these models, including the fundamental elements used for the construction of different types of computational models, the identification of original anatomical data, the simulation tools used for solving various computational problems, and the applications of computational animal models in preclinical research. The authors also analyze the characteristics of categories of computational models (stylized, voxel-based, and boundary representation) and discuss the technical challenges faced at the present time as well as research needs in the future.

  15. The world as viewed by and with unpaired electrons.

    PubMed

    Eaton, Sandra S; Eaton, Gareth R

    2012-10-01

    Recent advances in electron paramagnetic resonance (EPR) include capabilities for applications to areas as diverse as archeology, beer shelf life, biological structure, dosimetry, in vivo imaging, molecular magnets, and quantum computing. Enabling technologies include multifrequency continuous wave, pulsed, and rapid scan EPR. Interpretation is enhanced by increasingly powerful computational models. Copyright © 2012 Elsevier Inc. All rights reserved.

  16. Whole-body voxel-based personalized dosimetry: Multiple voxel S-value approach for heterogeneous media with non-uniform activity distributions.

    PubMed

    Lee, Min Sun; Kim, Joong Hyun; Paeng, Jin Chul; Kang, Keon Wook; Jeong, Jae Min; Lee, Dong Soo; Lee, Jae Sung

    2017-12-14

    Personalized dosimetry with high accuracy is becoming more important because of the growing interest in personalized medicine and targeted radionuclide therapy. Voxel-based dosimetry using dose point kernel or voxel S-value (VSV) convolution is available. However, these approaches do not consider medium heterogeneity. Here, we propose a new method for whole-body voxel-based personalized dosimetry for heterogeneous media with non-uniform activity distributions, which is referred to as the multiple VSV approach. Methods: Multiple (N) VSVs for media of different densities, covering the whole-body density range, were used instead of a single VSV for water. The VSVs were pre-calculated using GATE Monte Carlo simulation; these were convolved with the time-integrated activity to generate density-specific dose maps. Computed tomography-based segmentation was conducted to generate binary maps for each density region. The final dose map was acquired by the summation of the N segmented density-specific dose maps. We tested several sets of VSVs with different densities: N = 1 (single water VSV), 4, 6, 8, 10, and 20. To validate the proposed method, phantom and patient studies were conducted and compared with direct Monte Carlo, which was considered the ground truth. Finally, patient dosimetry (10 subjects) was conducted using the multiple VSV approach and compared with the single VSV and organ-based dosimetry approaches. Errors at the voxel- and organ-levels were reported for eight organs. Results: In the phantom and patient studies, the multiple VSV approach showed significant improvements regarding voxel-level errors, especially for the lung and bone regions. As N increased, voxel-level errors decreased, although some overestimations were observed at lung boundaries. In the case of multiple VSVs (N = 8), we achieved voxel-level errors of 2.06%.
In the dosimetry study, our proposed method showed much improved results compared to the single VSV and organ-based dosimetry. Errors at the organ-level were -6.71%, 2.17%, and 227.46% for the single VSV, multiple VSV, and organ-based dosimetry, respectively. Conclusion: The multiple VSV approach for heterogeneous media with non-uniform activity distributions offers fast personalized dosimetry at whole-body level, yielding results comparable to those of the direct Monte Carlo approach. Copyright © 2017 by the Society of Nuclear Medicine and Molecular Imaging, Inc.
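The multiple-VSV idea can be sketched in 2D with hypothetical kernels (real VSVs are 3D and Monte Carlo-derived): convolve the time-integrated activity with each density-specific kernel, then keep each result only inside its own density segment and sum:

```python
import numpy as np

def convolve_same(activity, kernel):
    """Naive 'same'-size convolution of a 2D activity map with a small
    symmetric kernel (for illustration only)."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(activity, ((ph, ph), (pw, pw)))
    out = np.zeros_like(activity, dtype=float)
    for dy in range(kh):
        for dx in range(kw):
            out += kernel[dy, dx] * padded[dy:dy + activity.shape[0],
                                           dx:dx + activity.shape[1]]
    return out

def multiple_vsv_dose(activity, density, vsv_by_density):
    """Multiple-VSV dosimetry sketch: convolve the activity with each
    density-specific VSV kernel, then keep each result only inside its
    own density segment (CT-based binary mask) and sum."""
    dose = np.zeros_like(activity, dtype=float)
    for (lo, hi), kernel in vsv_by_density.items():
        mask = (density >= lo) & (density < hi)
        dose[mask] = convolve_same(activity, kernel)[mask]
    return dose

# Toy example: soft tissue vs. lung, with hypothetical 3x3 kernels.
act = np.zeros((5, 5))
act[2, 2] = 1.0
rho = np.full((5, 5), 1.0)
rho[:, :2] = 0.3                                  # "lung" on the left
k_soft = np.array([[0, .1, 0], [.1, .6, .1], [0, .1, 0]])
k_lung = k_soft * 2.0                             # longer range, toy values
dose = multiple_vsv_dose(act, rho, {(0.0, 0.6): k_lung, (0.6, 2.0): k_soft})
```

The density bins and kernel values here are illustrative; in the paper the N kernels are pre-computed with GATE and the masks come from CT segmentation.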

  17. Modeling the impact of prostate edema on LDR brachytherapy: a Monte Carlo dosimetry study based on a 3D biphasic finite element biomechanical model

    NASA Astrophysics Data System (ADS)

    Mountris, K. A.; Bert, J.; Noailly, J.; Rodriguez Aguilera, A.; Valeri, A.; Pradier, O.; Schick, U.; Promayon, E.; Gonzalez Ballester, M. A.; Troccaz, J.; Visvikis, D.

    2017-03-01

    Prostate volume changes due to edema occurrence during transperineal permanent brachytherapy should be taken into consideration to ensure optimal dose delivery. Available edema models, based on prostate volume observations, face several limitations. Therefore, patient-specific models need to be developed to accurately account for the impact of edema. In this study we present a biomechanical model developed to reproduce edema resolution patterns documented in the literature. Using the biphasic mixture theory and finite element analysis, the proposed model takes into consideration the mechanical properties of the pubic area tissues in the evolution of prostate edema. The model’s computed deformations are incorporated in a Monte Carlo simulation to investigate their effect on post-operative dosimetry. The comparison of Day1 and Day30 dosimetry results demonstrates the capability of the proposed model for patient-specific dosimetry improvements, considering the edema dynamics. The proposed model shows excellent ability to reproduce previously described edema resolution patterns and was validated based on previous findings. According to our results, for a prostate volume increase of 10-20% the Day30 urethra D10 dose metric is higher by 4.2%-10.5% compared to the Day1 value. The introduction of the edema dynamics into Day30 dosimetry reveals a significant global dose overestimation in the conventional static Day30 dosimetry. In conclusion, the proposed edema biomechanical model can improve the treatment planning of transperineal permanent brachytherapy by accounting for post-implant dose alterations during the planning procedure.

  18. Modeling the impact of prostate edema on LDR brachytherapy: a Monte Carlo dosimetry study based on a 3D biphasic finite element biomechanical model.

    PubMed

    Mountris, K A; Bert, J; Noailly, J; Aguilera, A Rodriguez; Valeri, A; Pradier, O; Schick, U; Promayon, E; Ballester, M A Gonzalez; Troccaz, J; Visvikis, D

    2017-03-21

    Prostate volume changes due to edema occurrence during transperineal permanent brachytherapy should be taken into consideration to ensure optimal dose delivery. Available edema models, based on prostate volume observations, face several limitations. Therefore, patient-specific models need to be developed to accurately account for the impact of edema. In this study we present a biomechanical model developed to reproduce edema resolution patterns documented in the literature. Using the biphasic mixture theory and finite element analysis, the proposed model takes into consideration the mechanical properties of the pubic area tissues in the evolution of prostate edema. The model's computed deformations are incorporated in a Monte Carlo simulation to investigate their effect on post-operative dosimetry. The comparison of Day1 and Day30 dosimetry results demonstrates the capability of the proposed model for patient-specific dosimetry improvements, considering the edema dynamics. The proposed model shows excellent ability to reproduce previously described edema resolution patterns and was validated based on previous findings. According to our results, for a prostate volume increase of 10-20% the Day30 urethra D10 dose metric is higher by 4.2%-10.5% compared to the Day1 value. The introduction of the edema dynamics into Day30 dosimetry reveals a significant global dose overestimation in the conventional static Day30 dosimetry. In conclusion, the proposed edema biomechanical model can improve the treatment planning of transperineal permanent brachytherapy by accounting for post-implant dose alterations during the planning procedure.

  19. SU-F-T-367: Using PRIMO, a PENELOPE-Based Software, to Improve the Small Field Dosimetry of Linear Accelerators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benmakhlouf, H; Andreo, P; Brualla, L

    2016-06-15

    Purpose: To calculate output correction factors for Varian Clinac 2100iX beams for seven small field detectors and use the values to determine the small field output factors for the linacs at Karolinska University Hospital. Methods: Phase space files (psf) for square fields between 0.25cm and 10cm were calculated using the PENELOPE-based PRIMO software. The linac MC model was tuned by comparing PRIMO-estimated and experimentally determined depth doses and lateral dose profiles for 40cmx40cm fields. The calculated psf were used as radiation sources to calculate the correction factors of IBA and PTW detectors with the code penEasy/PENELOPE. Results: The optimal tuning parameters of the MC linac model in PRIMO were 5.4 MeV incident electron energy and zero energy spread, focal spot size and beam divergence. Correction factors obtained for the liquid ion chamber (PTW-T31018) are within 1% down to 0.5 cm fields. For unshielded diodes (IBA-EFD, IBA-SFD, PTW-T60017 and PTW-T60018) the corrections are up to 2% at intermediate fields (>1cm side), decreasing to −11% for fields smaller than 1cm. The shielded diode (IBA-PFD and PTW-T60016) corrections vary with field size from 0 to −4%. Volume averaging effects are found for most detectors in the presence of 0.25cm fields. Conclusion: Good agreement was found between correction factors based on PRIMO-generated psf and those from other publications. The calculated factors will be implemented in output factor measurements (using several detectors) in the clinic. PRIMO is a user-friendly general code capable of generating small field psf and can be used without having to code one's own linac geometry. It can therefore be used to improve clinical dosimetry, especially in the commissioning of linear accelerators. Important dosimetry data, such as dose profiles and output factors, can be determined more accurately for a specific machine, geometry and setup by using PRIMO together with an MC model of the detector used.
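The role of the calculated correction factors follows the standard small-field formalism (as in IAEA TRS-483): the field output factor is the ratio of detector readings multiplied by an output correction factor that compensates for the detector's small-field response. The numbers below are illustrative, not the paper's:

```python
def small_field_output_factor(m_clin, m_ref, k_output_corr):
    """Small-field output factor in the TRS-483-style formalism:
        Omega = (M_fclin / M_fref) * k(fclin, fref)

    m_clin        : detector reading in the small clinical field
    m_ref         : detector reading in the reference field
    k_output_corr : output correction factor for this detector and field
    """
    return (m_clin / m_ref) * k_output_corr

# Illustrative example (invented values): an unshielded diode that
# over-responds by ~5% in a 0.5 cm field has a correction factor of
# ~0.95, which is applied to the raw reading ratio.
of = small_field_output_factor(0.70, 1.00, 0.95)
```

A correction of −11%, as found here for unshielded diodes in sub-1 cm fields, would change the reported output factor well beyond clinical tolerance if neglected.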

  20. Thermoluminescence dosimetry and its applications in medicine--Part 2: History and applications.

    PubMed

    Kron, T

    1995-03-01

    Thermoluminescence dosimetry (TLD) has been available for dosimetry of ionising radiation for nearly 100 years. The variety of materials and their different physical forms allow the determination of different radiation qualities over a wide range of absorbed dose. This makes TL dosimeters useful in radiation protection where dose levels of microGy are monitored as well as in radiotherapy where doses up to several Gray are to be measured. The major advantages of TL detectors are their small physical size and that no cables or auxiliary equipment is required during the dose assessment. Therefore TLD is a good method for point dose measurements in phantoms as well as for in vivo dosimetry on patients during radiotherapy treatment. As an integrative dosimetric technique, it can be applied to personal dosimetry and it lends itself to the determination of dose distributions due to multiple or moving radiation sources (e.g. conformal and dynamic radiotherapy, computed tomography). In addition, TL dosimeters are easy to transport, and they can be mailed. This makes them well suited for intercomparison of doses delivered in different institutions. The present article aims at describing the various applications TLD has found in medicine by taking into consideration the physics and practice of TLD measurements which have been discussed in the first part of this review (Australas. Phys. Eng. Sci. Med. 17: 175-199, 1994).

  1. Three-dimensional illumination procedure for photodynamic therapy of dermatology

    NASA Astrophysics Data System (ADS)

    Hu, Xiao-ming; Zhang, Feng-juan; Dong, Fei; Zhou, Ya

    2014-09-01

    Light dose is an important parameter that affects the efficacy of photodynamic therapy (PDT). However, the irregular morphologies of lesions complicate lesion segmentation and the adjustment of light irradiance. This study therefore developed an illumination demonstration system comprising a camera, a digital projector, and a computing unit to solve these problems. A three-dimensional model of a lesion was reconstructed using the developed system. Hierarchical segmentation was achieved with the superpixel algorithm. The expected light dose on the targeted lesion was achieved with the proposed illumination procedure. Accurate control and optimization of light delivery can improve the efficacy of PDT.
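    The basic relation such a projector-based system must control, light dose as the product of irradiance and exposure time, can be sketched as follows; the function and numbers are illustrative, not from the paper.

```python
def exposure_time_s(target_dose_j_cm2, irradiance_mw_cm2):
    """Time needed for a surface element to reach the target light dose.

    Light dose (J/cm^2) = irradiance (W/cm^2) * time (s). A projector can
    equalize dose over an irregular 3D surface either by modulating the
    irradiance per region or the effective exposure time. Illustrative
    only; real PDT planning must also account for tissue optics.
    """
    return target_dose_j_cm2 / (irradiance_mw_cm2 * 1e-3)

# A region receiving 50 mW/cm^2 needs 2000 s to reach 100 J/cm^2.
t = exposure_time_s(100.0, 50.0)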

  2. Fan-beam scanning laser optical computed tomography for large volume dosimetry

    NASA Astrophysics Data System (ADS)

    Dekker, K. H.; Battista, J. J.; Jordan, K. J.

    2017-05-01

    A prototype scanning-laser fan beam optical CT scanner is reported which is capable of high resolution, large volume dosimetry with reasonable scan time. An acylindrical, asymmetric aquarium design is presented which serves to 1) generate parallel-beam scan geometry, 2) focus light towards a small acceptance angle detector, and 3) avoid interference fringe-related artifacts. Preliminary experiments with uniform solution phantoms (11 and 15 cm diameter) and finger phantoms (13.5 mm diameter FEP tubing) demonstrate that the design allows accurate optical CT imaging, with optical CT measurements agreeing within 3% of independent Beer-Lambert law calculations.
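    The independent Beer-Lambert check mentioned above can be sketched in a few lines; the attenuation coefficient and 11 cm path length below are illustrative values, not the phantom data.

```python
import math

def beer_lambert_transmission(mu_cm, path_cm, i0=1.0):
    """Transmitted intensity through a uniform absorber: I = I0 * exp(-mu * L)."""
    return i0 * math.exp(-mu_cm * path_cm)

def attenuation_from_projection(i0, i):
    """Recover the line integral mu * L from a single projection measurement."""
    return math.log(i0 / i)

# Round trip: a projection through 11 cm of a mu = 0.1 cm^-1 solution.
i = beer_lambert_transmission(0.1, 11.0)
mu_L = attenuation_from_projection(1.0, i)   # recovers 1.1
```

    Optical CT reconstruction inverts many such line integrals; comparing a reconstructed uniform phantom against this direct calculation is the 3% agreement test quoted in the abstract.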

  3. Calculated effects of backscattering on skin dosimetry for nuclear fuel fragments.

    PubMed

    Aydarous, A Sh

    2008-01-01

    The size of hot particles contained in nuclear fallout ranges from 10 nm to 20 microm for worldwide weapons fallout. Hot particles from nuclear power reactors can be significantly bigger (100 microm to several millimetres). Electron backscattering from such particles is a prominent secondary effect in beta dosimetry for radiological protection purposes, such as skin dosimetry. In this study, the effect of electron backscattering due to hot particle contamination on skin dose is investigated. The parameters studied include detector area, source radius, source energy, scattering material and source density. The Monte Carlo N-Particle code (MCNP4C) was used to calculate the depth dose distribution for 10 different beta sources and various materials. The backscattering dose factors (BSDF) were then calculated. A significant dependence of the BSDF magnitude upon detector area, source radius and scatterers is shown. It is clearly shown that the BSDF increases with increasing detector area. For high-Z scatterers, the BSDF can reach as high as 40% and 100% for sources with radii of 0.1 and 0.0001 cm, respectively. The variation of BSDF with source radius, source energy and source density is discussed.
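    A minimal sketch of how a backscattering dose factor could be computed from paired Monte Carlo runs, assuming BSDF is defined as the fractional dose increase due to the scatterer (an assumption; the paper may normalize differently):

```python
def backscatter_dose_factor(dose_with_scatterer, dose_without):
    """Backscattering dose factor (BSDF) in percent.

    Assumed definition: BSDF = 100 * (D_scatter / D_free - 1), i.e. the
    fractional dose increase contributed by electrons backscattered from
    the substrate, from two otherwise identical transport calculations.
    """
    return 100.0 * (dose_with_scatterer / dose_without - 1.0)

# Hypothetical pair of MC results: a high-Z substrate raising the skin
# dose from 1.0 to 1.4 (arbitrary units) gives a BSDF of 40%.
bsdf = backscatter_dose_factor(1.4, 1.0)
```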

  4. Solar particle events observed at Mars: dosimetry measurements and model calculations

    NASA Astrophysics Data System (ADS)

    Cleghorn, T.; Saganti, P.; Zeitlin, C.; Cucinotta, F.

    The first solar particle events from a Martian orbit were observed with MARIE (Martian Radiation Environment Experiment) on the 2001 Mars Odyssey spacecraft, which is currently in orbit and collecting mapping data of the red planet. These solar particle events, observed at Mars during March and April 2002, are correlated with the GOES-8 and ACE satellite data from the same time period at Earth orbits. Dosimetry measurements for the Mars orbit cover the period of March 13th through April 29th. Particle count rate and the corresponding dose rate enhancements were observed on March 16th through 20th and on April 22nd, corresponding to solar particle events that were observed at Earth orbit on March 16th through 21st and beginning on April 21st, respectively. Model calculations with the HZETRN (High Z = atomic number and high Energy Transport) code estimated the background GCR (Galactic Cosmic Ray) dose rates. The dose rates observed by the MARIE instrument are within 10% of the model calculations. Dosimetry measurements and model calculations will be presented.

  5. Index extraction for electromagnetic field evaluation of high power wireless charging system

    PubMed Central

    2017-01-01

    This paper presents precise dosimetry for a highly resonant wireless power transfer (HR-WPT) system using an anatomically realistic human voxel model. The dosimetry for the HR-WPT system, designed to operate at 13.56 MHz (one of the ISM frequency bands), is conducted for various distances between the human model and the system, and for conditions of alignment and misalignment between the transmitting and receiving circuits. The specific absorption rates in the human body are computed by a two-step approach; in the first step, the field generated by the HR-WPT system is calculated, and in the second step the specific absorption rates are computed with the scattered-field finite-difference time-domain method, regarding the fields obtained in the first step as the incident fields. Safety compliance for non-uniform field exposure from the HR-WPT system is discussed with reference to the international safety guidelines. Furthermore, the coupling factor concept is employed to relax the maximum allowable transmitting power. Coupling factors derived from the dosimetry results are presented. In this calculation, the limit on the external magnetic field from the HR-WPT system can be relaxed by approximately four times using the coupling factor in the worst exposure scenario. PMID:28708840

  6. A broad-group cross-section library based on ENDF/B-VII.0 for fast neutron dosimetry Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alpan, F.A.

    2011-07-01

    A new ENDF/B-VII.0-based coupled 44-neutron, 20-gamma-ray-group cross-section library was developed to investigate the latest evaluated nuclear data file (ENDF), in comparison to ENDF/B-VI.3 used in BUGLE-96, as well as to generate an objective-specific library. The objectives selected for this work consisted of dosimetry calculations for in-vessel and ex-vessel reactor locations, iron atom displacement calculations for reactor internals and the pressure vessel, and the {sup 58}Ni(n,{gamma}) calculation that is important for gas generation in the baffle plate. The new library was generated based on the contribution and point-wise cross-section-driven (CPXSD) methodology and was applied to one of the most widely used benchmarks, the Oak Ridge National Laboratory Pool Critical Assembly benchmark problem. In addition to the new library, BUGLE-96 and a newly generated ENDF/B-VII.0-based coupled 47-neutron, 20-gamma-ray-group cross-section library were used with both SNLRML and IRDF dosimetry cross sections to compute reaction rates. All reaction rates computed by the multigroup libraries are within ±20% of measurement data and meet the U.S. Nuclear Regulatory Commission acceptance criterion for reactor vessel neutron exposure evaluations specified in Regulatory Guide 1.190. (authors)
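    Once a multigroup library and a group flux are in hand, a dosimetry reaction rate is a simple folding of the two; a minimal sketch (the group values are placeholders, not library data):

```python
def reaction_rate(group_flux, group_xs):
    """Multigroup reaction rate: R = sum_g phi_g * sigma_g.

    group_flux: scalar flux per energy group (n / cm^2 / s)
    group_xs:   dosimetry cross section per group (cm^2)
    The group structures of the flux and the library must match, which is
    exactly what a broad-group library such as the one described fixes.
    """
    if len(group_flux) != len(group_xs):
        raise ValueError("flux and cross-section group structures must match")
    return sum(f * s for f, s in zip(group_flux, group_xs))

# Placeholder three-group example.
rate = reaction_rate([1.0, 2.0, 3.0], [0.5, 0.5, 0.5])
```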

  7. NSDann2BS, a neutron spectrum unfolding code based on neural networks technology and two bonner spheres

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ortiz-Rodriguez, J. M.; Reyes Alfaro, A.; Reyes Haro, A.

    In this work a neutron spectrum unfolding code based on artificial intelligence technology is presented. The code, called ''Neutron Spectrometry and Dosimetry with Artificial Neural Networks and two Bonner spheres'' (NSDann2BS), was designed in a graphical user interface under the LabVIEW programming environment. The main features of this code are the use of an embedded artificial neural network architecture optimized with the ''Robust design of artificial neural networks methodology'' and the use of two Bonner spheres as the only piece of information. To build the code presented here, once the net topology was optimized and properly trained, the knowledge stored at the synaptic weights was extracted, and using a graphical framework built on the LabVIEW programming environment, the NSDann2BS code was designed. This code is friendly, intuitive and easy to use for the end user. The code is freely available upon request to the authors. To demonstrate the use of the neural net embedded in the NSDann2BS code, the count rates of {sup 252}Cf, {sup 241}AmBe and {sup 239}PuBe neutron sources measured with a Bonner sphere system were unfolded.

  8. NSDann2BS, a neutron spectrum unfolding code based on neural networks technology and two bonner spheres

    NASA Astrophysics Data System (ADS)

    Ortiz-Rodríguez, J. M.; Reyes Alfaro, A.; Reyes Haro, A.; Solís Sánches, L. O.; Miranda, R. Castañeda; Cervantes Viramontes, J. M.; Vega-Carrillo, H. R.

    2013-07-01

    In this work a neutron spectrum unfolding code based on artificial intelligence technology is presented. The code, called "Neutron Spectrometry and Dosimetry with Artificial Neural Networks and two Bonner spheres" (NSDann2BS), was designed in a graphical user interface under the LabVIEW programming environment. The main features of this code are the use of an embedded artificial neural network architecture optimized with the "Robust design of artificial neural networks methodology" and the use of two Bonner spheres as the only piece of information. To build the code presented here, once the net topology was optimized and properly trained, the knowledge stored at the synaptic weights was extracted, and using a graphical framework built on the LabVIEW programming environment, the NSDann2BS code was designed. This code is friendly, intuitive and easy to use for the end user. The code is freely available upon request to the authors. To demonstrate the use of the neural net embedded in the NSDann2BS code, the count rates of 252Cf, 241AmBe and 239PuBe neutron sources measured with a Bonner sphere system were unfolded.
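    The central idea, a trained neural network mapping two Bonner-sphere count rates to an n-group neutron spectrum, can be sketched as a small forward pass. The topology, weights and 31-group structure below are placeholders, since the abstract gives neither; the real code embeds trained weights in LabVIEW.

```python
import numpy as np

def mlp_unfold(count_rates, w1, b1, w2, b2):
    """Forward pass of a small MLP mapping two Bonner-sphere count rates
    to an n-group neutron spectrum (a sketch of the idea behind
    NSDann2BS, not its actual architecture or weights)."""
    h = np.tanh(w1 @ count_rates + b1)            # hidden layer
    spectrum = w2 @ h + b2                        # linear output layer
    return np.clip(spectrum, 0.0, None)           # a spectrum is non-negative

rng = np.random.default_rng(0)
w1, b1 = rng.normal(size=(8, 2)), np.zeros(8)     # placeholder weights
w2, b2 = rng.normal(size=(31, 8)), np.zeros(31)   # e.g. 31 energy groups
phi = mlp_unfold(np.array([120.0, 85.0]), w1, b1, w2, b2)
```

    With trained weights, the same two-sphere reading would suffice to reproduce the full spectrum, which is what makes the two-sphere approach attractive for routine measurements.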

  9. Rational evaluation of the therapeutic effect and dosimetry of auger electrons for radionuclide therapy in a cell culture model.

    PubMed

    Shinohara, Ayaka; Hanaoka, Hirofumi; Sakashita, Tetsuya; Sato, Tatsuhiko; Yamaguchi, Aiko; Ishioka, Noriko S; Tsushima, Yoshito

    2018-02-01

    Radionuclide therapy with low-energy auger electron emitters may provide high antitumor efficacy while keeping the toxicity to normal organs low. Here we evaluated the usefulness of an auger electron emitter and compared it with that of a beta emitter for tumor treatment in in vitro models and conducted a dosimetry simulation using radioiodine-labeled metaiodobenzylguanidine (MIBG) as a model compound. We evaluated the cellular uptake of 125I-MIBG and the therapeutic effects of 125I- and 131I-MIBG in 2D and 3D PC-12 cell culture models. We used a Monte Carlo simulation code (PHITS) to calculate the absorbed radiation dose of 125I or 131I in computer simulation models for 2D and 3D cell cultures. In the dosimetry calculation for the 3D model, several distribution patterns of radionuclide were applied. A higher cumulative dose was observed in the 3D model due to the prolonged retention of MIBG compared to the 2D model. However, 125I-MIBG showed a greater therapeutic effect in the 2D model compared to the 3D model (respective EC50 values in the 2D and 3D models: 86.9 and 303.9 MBq/cell), whereas 131I-MIBG showed the opposite result (respective EC50 values in the 2D and 3D models: 49.4 and 30.2 MBq/cell). The therapeutic effect of 125I-MIBG was lower than that of 131I-MIBG in both models, but the radionuclide-derived difference was smaller in the 2D model. The dosimetry simulation with PHITS revealed the influence of the radiation quality, the crossfire effect, radionuclide distribution, and tumor shape on the absorbed dose. Application of the heterogeneous distribution series dramatically changed the radiation dose distribution of 125I-MIBG, and mitigated the difference between the estimated and measured therapeutic effects of 125I-MIBG.
The therapeutic effect of 125I-MIBG was comparable to that of 131I-MIBG in the 2D model, but the efficacy was inferior to that of 131I-MIBG in the 3D model, since the crossfire effect is negligible and the homogeneous distribution of radionuclides was insufficient. Thus, auger electrons would be suitable for treating small-sized tumors. The design of radiopharmaceuticals with auger electron emitters requires particularly careful consideration of achieving a homogeneous distribution of the compound in the tumor.

  10. Experimental check of bremsstrahlung dosimetry predictions for 0.75 MeV electrons

    NASA Astrophysics Data System (ADS)

    Sanford, T. W. L.; Halbleib, J. A.; Beezhold, W.

    Bremsstrahlung dose in CaF2 TLDs from the radiation produced by 0.75 MeV electrons incident on Ta/C targets is measured and compared with that calculated via the CYLTRAN Monte Carlo code. The comparison was made to validate the code, which is used to predict and analyze radiation environments of flash X-ray simulators measured by TLDs. Over a wide range of Ta target thicknesses and radiation angles the code is found to agree with the measurements to within their 5% uncertainty. For Ta thicknesses near those that optimize the radiation output, however, the code overestimates the radiation dose at small angles. The maximum overprediction is about 14 ± 5%. The general agreement, nonetheless, gives confidence in using the code at this energy and in the TLD calibration procedure. For the bulk of the measurements, a standard TLD employing a 2.2 mm thick Al equilibrator was used. In this paper we also show that this thickness can significantly attenuate the free-field dose and introduce significant photon buildup in the equilibrator.

  11. Fast protocol for radiochromic film dosimetry using a cloud computing web application.

    PubMed

    Calvo-Ortega, Juan-Francisco; Pozo, Miquel; Moragues, Sandra; Casals, Joan

    2017-07-01

    To investigate the feasibility of a fast protocol for radiochromic film dosimetry to verify intensity-modulated radiotherapy (IMRT) plans. EBT3 film dosimetry was conducted in this study using the triple-channel method implemented in the cloud computing application (Radiochromic.com). We describe a fast protocol for radiochromic film dosimetry that yields measurement results within 1 h. Ten IMRT plans were delivered to evaluate the feasibility of the fast protocol. The dose distribution of the verification film was derived at 15, 30 and 45 min using the fast protocol and also at 24 h after completing the irradiation. The four dose maps obtained per plan were compared with the one calculated by the treatment planning system using the global and local gamma index (5%/3 mm). Gamma passing rates obtained at 15, 30 and 45 min post-exposure were compared with those obtained after 24 h. Only small differences with respect to the 24 h protocol were found in the gamma passing rates obtained for films digitized at 15 min (global: 99.6% ± 0.9% vs. 99.7% ± 0.5%; local: 96.3% ± 3.4% vs. 96.3% ± 3.8%), at 30 min (global: 99.5% ± 0.9% vs. 99.7% ± 0.5%; local: 96.5% ± 3.2% vs. 96.3% ± 3.8%) and at 45 min (global: 99.2% ± 1.5% vs. 99.7% ± 0.5%; local: 96.1% ± 3.8% vs. 96.3% ± 3.8%). The fast protocol provides dosimetric results within 1 h when IMRT plans are verified, with results similar to those reported by the standard 24 h protocol. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
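    The global gamma comparison used above can be sketched for a 1D profile using Low et al.'s formulation; the profiles below are synthetic, and a production implementation would interpolate the evaluated distribution more finely and work in 2D.

```python
import numpy as np

def gamma_index_1d(x, d_ref, d_eval, dose_crit=0.05, dist_crit_mm=3.0):
    """Simplified 1D global gamma index.

    dose_crit is a fraction of the reference maximum (0.05 = 5%) and
    dist_crit_mm the distance-to-agreement criterion. Returns the gamma
    value at each reference point; a point passes when gamma <= 1.
    """
    dd = dose_crit * d_ref.max()                  # global dose normalization
    gammas = []
    for xr, dr in zip(x, d_ref):
        g_candidates = np.sqrt(((x - xr) / dist_crit_mm) ** 2
                               + ((d_eval - dr) / dd) ** 2)
        gammas.append(g_candidates.min())
    return np.array(gammas)

x = np.linspace(0.0, 30.0, 61)                    # positions in mm
ref = np.exp(-((x - 15.0) / 8.0) ** 2)            # synthetic reference profile
ev = 1.02 * ref                                   # evaluated profile, +2% dose
passing = float((gamma_index_1d(x, ref, ev) <= 1.0).mean() * 100)
```

    A uniform 2% dose offset stays well inside the 5%/3 mm criteria, so every point passes; the passing rates quoted in the abstract are the 2D analogue of this percentage.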

  12. Development of computational small animal models and their applications in preclinical imaging and therapy research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, Tianwu; Zaidi, Habib, E-mail: habib.zaidi@hcuge.ch; Geneva Neuroscience Center, Geneva University, Geneva CH-1205

    The development of multimodality preclinical imaging techniques and the rapid growth of realistic computer simulation tools have promoted the construction and application of computational laboratory animal models in preclinical research. Since the early 1990s, over 120 realistic computational animal models have been reported in the literature and used as surrogates to characterize the anatomy of actual animals for the simulation of preclinical studies involving the use of bioluminescence tomography, fluorescence molecular tomography, positron emission tomography, single-photon emission computed tomography, microcomputed tomography, magnetic resonance imaging, and optical imaging. Other applications include electromagnetic field simulation, ionizing and nonionizing radiation dosimetry, and the development and evaluation of new methodologies for multimodality image coregistration, segmentation, and reconstruction of small animal images. This paper provides a comprehensive review of the history and fundamental technologies used for the development of computational small animal models, with a particular focus on their application in preclinical imaging as well as nonionizing and ionizing radiation dosimetry calculations. An overview is given of the overall process involved in the design of these models, including the fundamental elements used for the construction of different types of computational models, the identification of original anatomical data, the simulation tools used for solving various computational problems, and the applications of computational animal models in preclinical research. The authors also analyze the characteristics of the categories of computational models (stylized, voxel-based, and boundary representation) and discuss the technical challenges faced at the present time as well as research needs in the future.

  13. Improving the accuracy of ionization chamber dosimetry in small megavoltage x-ray fields

    NASA Astrophysics Data System (ADS)

    McNiven, Andrea L.

    The dosimetry of small x-ray fields is difficult, but important, in many radiation therapy delivery methods. The accuracy of ion chambers for small field applications, however, is limited due to the relatively large size of the chamber with respect to the field size, leading to partial volume effects, lateral electronic disequilibrium and calibration difficulties. The goal of this dissertation was to investigate the use of ionization chambers for dosimetry in small megavoltage photon beams, with the aim of improving clinical dose measurements in stereotactic radiotherapy and helical tomotherapy. A new method for the direct determination of the sensitive volume of small-volume ion chambers using micro computed tomography (μCT) was investigated using four nominally identical small-volume (0.56 cm³) cylindrical ion chambers. Agreement between their measured relative volume and ionization measurements (within 2%) demonstrated the feasibility of volume determination through μCT. Cavity-gas calibration coefficients were also determined, demonstrating the promise of accurate ion chamber calibration based partially on μCT. The accuracy of relative dose factor measurements in 6 MV stereotactic x-ray fields (5 to 40 mm diameter) was investigated using a set of prototype plane-parallel ionization chambers (diameters of 2, 4, 10 and 20 mm). Chamber- and field-size-specific correction factors (CSFQ), which account for perturbation of the secondary electron fluence, were calculated using Monte Carlo simulation methods (BEAM/EGSnrc simulations). These correction factors (e.g. CSFQ = 1.76 for the 2 mm chamber in a 5 mm field) allow for accurate relative dose factor (RDF) measurement when applied to ionization readings under conditions of electronic disequilibrium. With respect to the dosimetry of helical tomotherapy, a novel application of the ion chambers was developed to characterize the fan beam size and effective dose rate.
Characterization was based on an adaptation of the computed tomography dose index (CTDI), a concept normally used in diagnostic radiology. This involved experimental determination of the fan beam thickness using the ion chambers to acquire fan beam profiles, followed by extrapolation to a 'zero-size' detector. In conclusion, improvements have been made in the accuracy of small field dosimetry measurements in stereotactic radiotherapy and helical tomotherapy. This was accomplished through the introduction of an original technique involving micro-CT imaging for sensitive volume determination and, potentially, ion chamber calibration coefficients, the use of appropriate Monte Carlo derived correction factors for RDF measurement, and the exploitation of the partial volume effect for helical tomotherapy fan beam dosimetry. With improved dosimetry for a wide range of challenging small x-ray field situations, it is expected that patient radiation safety will be maintained, and that clinical trials will adopt calibration protocols specialized for modern radiotherapy with small fields or beamlets. Keywords: radiation therapy, ionization chambers, small field dosimetry, stereotactic radiotherapy, helical tomotherapy, micro-CT.
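    The extrapolation to a 'zero-size' detector can be sketched as a straight-line fit of measured beam width versus detector size; the linear model and the numbers are assumptions for illustration, not the dissertation's data.

```python
import numpy as np

def extrapolate_to_zero_detector(detector_sizes_mm, measured_fwhm_mm):
    """Extrapolate measured fan-beam widths to a 'zero-size' detector.

    Fits measured FWHM versus detector size with a straight line and
    returns the intercept, removing the detector's partial-volume
    broadening to first order (the actual fitting model used in the
    dissertation is not specified in the abstract).
    """
    slope, intercept = np.polyfit(detector_sizes_mm, measured_fwhm_mm, 1)
    return intercept

sizes = np.array([2.0, 4.0, 10.0, 20.0])           # chamber diameters, mm
fwhm = 25.0 + 0.3 * sizes                          # synthetic measurements
beam_fwhm = extrapolate_to_zero_detector(sizes, fwhm)
```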

  14. The EDIT-COMGEOM Code

    DTIC Science & Technology

    1975-09-01

    This report assumes familiarity with the GIFT and MAGIC computer codes. The EDIT-COMGEOM code is a FORTRAN computer code that converts the target description data used in the MAGIC computer code to the target description data that can be used in the GIFT computer code.

  15. Patient-specific CT dosimetry calculation: a feasibility study.

    PubMed

    Fearon, Thomas; Xie, Huchen; Cheng, Jason Y; Ning, Holly; Zhuge, Ying; Miller, Robert W

    2011-11-15

    Current estimation of radiation dose from computed tomography (CT) scans on patients has relied on the measurement of the Computed Tomography Dose Index (CTDI) in standard cylindrical phantoms, and on calculations based on mathematical representations of "standard man". Radiation dose to both adult and pediatric patients from a CT scan has been a concern, as noted in recent reports. The purpose of this study was to investigate the feasibility of adapting a radiation treatment planning system (RTPS) to provide patient-specific CT dosimetry. A radiation treatment planning system was modified to calculate patient-specific CT dose distributions, which can be represented by dose at specific points within an organ of interest, as well as organ dose-volumes (after image segmentation), for a GE Light Speed Ultra Plus CT scanner. The RTPS calculation engine is based on a semi-empirical, measured correction-based algorithm, which has been well established in the radiotherapy community. Digital representations of the physical phantoms (virtual phantoms) were acquired with the GE CT scanner in axial mode. Thermoluminescent dosimeter (TLD) measurements in pediatric anthropomorphic phantoms were utilized to validate the dose at specific points within organs of interest relative to RTPS calculations and Monte Carlo simulations of the same virtual phantoms (digital representations). Congruence of the calculated and measured point doses for the same physical anthropomorphic phantom geometry was used to verify the feasibility of the method. The RTPS algorithm can be extended to calculate the organ dose by calculating a dose distribution point-by-point for a designated volume. The Electron Gamma Shower (EGSnrc) codes for radiation transport calculations developed by the National Research Council of Canada (NRCC) were utilized to perform the Monte Carlo (MC) simulation. In general, the RTPS and MC dose calculations are within 10% of the TLD measurements for the infant and child chest scans.
With respect to the dose comparisons for the head, the RTPS dose calculations are slightly higher (10%-20%) than the TLD measurements, while the MC results were within 10% of the TLD measurements. The advantage of the algebraic dose calculation engine of the RTPS is a substantially reduced computation time (minutes vs. days) relative to Monte Carlo calculations, as well as the provision of patient-specific dose estimation. It also provides the basis for more elaborate reporting of dosimetric results, such as patient-specific organ dose-volumes after image segmentation.

  16. The radiation dosimetry of intrathecally administered radionuclides

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stabin, M.G.; Evans, J.F.

    The radiation dose to the spine, spinal cord, marrow, and other organs of the body from intrathecal administration of several radiopharmaceuticals was studied. Anatomic models were developed for the spine, spinal cerebrospinal fluid (CSF), spinal cord, spinal skeleton, cranial skeleton, and cranial CSF. A kinetic model for the transport of CSF was used to determine residence times in the CSF; material leaving the CSF was thereafter assumed to enter the bloodstream and follow the kinetics of the radiopharmaceutical as if intravenously administered. The radiation transport codes MCNP and ALGAMP were used to model the electron and photon transport and energy deposition. The dosimetry of Tc-99m DTPA and HSA, In-111 DTPA, I-131 HSA, and Yb-169 DTPA was studied. Radiation dose profiles for the spinal cord and marrow in the spine were developed and average doses to all other organs were estimated, including dose distributions within the bone and marrow.
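    Downstream of the residence times produced by such a kinetic model, a MIRD-style organ dose reduces to a product of administered activity, residence time and an S value; a minimal sketch with placeholder numbers (not values from the paper):

```python
def mird_absorbed_dose(administered_mbq, residence_time_h, s_value_mgy_per_mbq_h):
    """MIRD-style absorbed dose estimate: D = A0 * tau * S.

    A0 is the administered activity, tau the residence time in the source
    region (e.g. from a CSF transport model), and S the dose per unit
    cumulated activity for a given source-target pair. All numbers used
    here are placeholders for illustration.
    """
    return administered_mbq * residence_time_h * s_value_mgy_per_mbq_h

# e.g. 40 MBq administered, 1.5 h residence time, S = 0.02 mGy/(MBq*h)
dose_mgy = mird_absorbed_dose(40.0, 1.5, 0.02)
```

    Monte Carlo transport (MCNP/ALGAMP in this work) is what supplies the S values; the kinetic model supplies the residence times.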

  17. The UF family of hybrid phantoms of the developing human fetus for computational radiation dosimetry

    NASA Astrophysics Data System (ADS)

    Maynard, Matthew R.; Geyer, John W.; Aris, John P.; Shifrin, Roger Y.; Bolch, Wesley

    2011-08-01

    Historically, the development of computational phantoms for radiation dosimetry has primarily been directed at capturing and representing adult and pediatric anatomy, with less emphasis devoted to models of the human fetus. As concern grows over possible radiation-induced cancers from medical and non-medical exposures of the pregnant female, the need to better quantify fetal radiation doses, particularly at the organ-level, also increases. Studies such as the European Union's SOLO (Epidemiological Studies of Exposed Southern Urals Populations) hope to improve our understanding of cancer risks following chronic in utero radiation exposure. For projects such as SOLO, currently available fetal anatomic models do not provide sufficient anatomical detail for organ-level dose assessment. To address this need, two fetal hybrid computational phantoms were constructed using high-quality magnetic resonance imaging and computed tomography image sets obtained for two well-preserved fetal specimens aged 11.5 and 21 weeks post-conception. Individual soft tissue organs, bone sites and outer body contours were segmented from these images using 3D-DOCTOR™ and then imported to the 3D modeling software package Rhinoceros™ for further modeling and conversion of soft tissue organs, certain bone sites and outer body contours to deformable non-uniform rational B-spline surfaces. The two specimen-specific phantoms, along with a modified version of the 38 week UF hybrid newborn phantom, comprised a set of base phantoms from which a series of hybrid computational phantoms was derived for fetal ages 8, 10, 15, 20, 25, 30, 35 and 38 weeks post-conception. 
The methodology used to construct the series of phantoms accounted for the following age-dependent parameters: (1) variations in skeletal size and proportion, (2) bone-dependent variations in relative levels of bone growth, (3) variations in individual organ masses and total fetal masses and (4) statistical percentile variations in skeletal size, individual organ masses and total fetal masses. The resulting series of fetal hybrid computational phantoms is applicable to organ-level and bone-level internal and external radiation dosimetry for human fetuses of various ages and weight percentiles.

  18. Partition Model-Based 99mTc-MAA SPECT/CT Predictive Dosimetry Compared with 90Y TOF PET/CT Posttreatment Dosimetry in Radioembolization of Hepatocellular Carcinoma: A Quantitative Agreement Comparison.

    PubMed

    Gnesin, Silvano; Canetti, Laurent; Adib, Salim; Cherbuin, Nicolas; Silva Monteiro, Marina; Bize, Pierre; Denys, Alban; Prior, John O; Baechler, Sebastien; Boubaker, Ariane

    2016-11-01

    90Y-microsphere selective internal radiation therapy (SIRT) is a valuable treatment in unresectable hepatocellular carcinoma (HCC). Partition-model predictive dosimetry relies on differential tumor-to-nontumor perfusion evaluated on pretreatment 99mTc-macroaggregated albumin (MAA) SPECT/CT. The aim of this study was to evaluate agreement between the predictive dosimetry of 99mTc-MAA SPECT/CT and posttreatment dosimetry based on 90Y time-of-flight (TOF) PET/CT. We compared the 99mTc-MAA SPECT/CT results for 27 treatment sessions (25 HCC patients, 41 tumors) with 90Y SIRT (7 glass spheres, 20 resin spheres) and the posttreatment 90Y TOF PET/CT results. Three-dimensional voxelized dose maps were computed from the 99mTc-MAA SPECT/CT and 90Y TOF PET/CT data. Mean absorbed dose ([Formula: see text]) was evaluated to compute the predicted-to-actual dose ratio ([Formula: see text]) in tumor volumes (TVs) and nontumor volumes (NTVs) for glass and resin spheres. The Lin concordance ([Formula: see text]) was used to measure accuracy ([Formula: see text]) and precision (ρ). Administered activity ranged from 0.8 to 1.9 GBq for glass spheres and from 0.6 to 3.4 GBq for resin spheres, and the respective TVs ranged from 2 to 125 mL and from 6 to 1,828 mL. The mean dose [Formula: see text] was 240 Gy for glass and 122 Gy for resin in TVs and 72 Gy for glass and 47 Gy for resin in NTVs. [Formula: see text] was 1.46 ± 0.58 (0.65-2.53) for glass and 1.16 ± 0.41 (0.54-2.54) for resin, and the respective values for [Formula: see text] were 0.88 ± 0.15 (0.56-1.00) and 0.86 ± 0.2 (0.58-1.35). DR variability was substantially lower in NTVs than in TVs. The Lin concordance between [Formula: see text] and [Formula: see text] (resin) was significantly better for tumors larger than 150 mL than for tumors 150 mL or smaller ([Formula: see text] = 0.93 and [Formula: see text] = 0.95 vs. [Formula: see text] = 0.57 and [Formula: see text] = 0.93; P < 0.05).
In 90Y radioembolization of HCC, predictive dosimetry based on 99mTc-MAA SPECT/CT provided good estimates of absorbed doses calculated from posttreatment 90Y TOF PET/CT for tumor and nontumor tissues. The low variability of [Formula: see text] demonstrates that pretreatment dosimetry is particularly suitable for minimizing radiation-induced hepatotoxicity. © 2016 by the Society of Nuclear Medicine and Molecular Imaging, Inc.
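    Lin's concordance coefficient, used above to separate accuracy from precision, can be computed directly from paired dose estimates; a minimal sketch:

```python
import numpy as np

def lin_ccc(x, y):
    """Lin's concordance correlation coefficient rho_c = Ca * rho.

    Decomposes agreement between paired measurements into precision
    (the Pearson correlation rho) and accuracy (the bias-correction
    factor Ca), which is how predicted and actual absorbed doses are
    compared in the abstract.
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    mx, my = x.mean(), y.mean()
    sx2, sy2 = x.var(), y.var()                  # population variances
    sxy = ((x - mx) * (y - my)).mean()
    rho = sxy / np.sqrt(sx2 * sy2)               # precision
    ca = 2 * np.sqrt(sx2 * sy2) / (sx2 + sy2 + (mx - my) ** 2)  # accuracy
    return ca * rho, rho, ca
```

    Perfect agreement gives rho_c = 1; a constant bias between predicted and measured doses leaves rho at 1 but pulls Ca, and hence rho_c, below 1.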

  19. Skeletal dosimetry: A hyperboloid representation of the bone-marrow interface to reduce voxel effects in three-dimensional images of trabecular bone

    NASA Astrophysics Data System (ADS)

    Rajon, Didier Alain

    Radiation damage to the hematopoietic bone marrow is clearly defined as the limiting factor to the development of internal emitter therapies. Current dosimetry models rely on chord-length distributions measured through the complex microstructure of the trabecular bone regions of the skeleton in which most of the active marrow is located. Recently, Nuclear Magnetic Resonance (NMR) has been used to obtain high-resolution three-dimensional (3D) images of small trabecular bone samples. These images have been coupled with computer programs to estimate dosimetric parameters such as chord-length distributions, and energy depositions by monoenergetic electrons. This new technique is based on the assumption that each voxel of the image is assigned either to bone tissue or to marrow tissue after application of a threshold value. Previous studies showed that this assumption had important consequences on the outcome of the computer calculations. Both the chord-length distribution measurements and the energy deposition calculations are subject to voxel effects that are responsible for large discrepancies when applied to mathematical models of trabecular bone. The work presented in this dissertation proposes first a quantitative study of the voxel effects. Consensus is that the voxelized representation of surfaces should not be used as direct input to dosimetry computer programs. Instead we need a new technique to transform the interfaces into smooth surfaces. The Marching Cube (MC) algorithm was used and adapted to do this transformation. The initial image was used to generate a continuous gray-level field throughout the image. The interface between bone and marrow was then simulated by the iso-gray-level surface that corresponds to a predetermined threshold value. Calculations were then performed using this new representation. Excellent results were obtained for both the chord-length distribution and the energy deposition measurements. 
Voxel effects were reduced to an acceptable level and the discrepancies found when using the voxelized representation of the interface were reduced to a few percent. We conclude that this new model should be used every time one performs dosimetry estimates using NMR images of trabecular bone samples.
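    The core of the smooth-surface approach described above is replacing the binary bone/marrow voxel assignment with an iso-gray-level surface: along each voxel edge whose endpoint gray levels straddle the threshold, the crossing point is located by interpolation. A minimal sketch of that edge-interpolation step (illustrative Python, not the dissertation's code; simple linear interpolation is assumed here rather than the hyperboloid variant):

```python
def edge_crossing(p0, p1, g0, g1, threshold):
    """Locate the iso-surface crossing on one voxel edge by linear
    interpolation of the gray-level field (the basic step of Marching
    Cubes). p0, p1 are the edge endpoints (x, y, z tuples); g0, g1 are
    their gray-level values; threshold is the bone/marrow iso-level."""
    if (g0 < threshold) == (g1 < threshold):
        return None  # no crossing: both endpoints lie on the same side
    t = (threshold - g0) / (g1 - g0)  # fractional position along the edge
    return tuple(a + t * (b - a) for a, b in zip(p0, p1))

# One voxel edge from (0,0,0) to (1,0,0) with gray levels 10 and 30;
# a threshold of 15 crosses a quarter of the way along the edge.
print(edge_crossing((0, 0, 0), (1, 0, 0), 10.0, 30.0, 15.0))  # -> (0.25, 0.0, 0.0)
```

    Repeating this for all twelve edges of a voxel and triangulating the crossings yields the smooth interface used in place of the stair-stepped voxel surface.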

  20. The effect of tandem-ovoid titanium applicator on points A, B, bladder, and rectum doses in gynecological brachytherapy using 192Ir.

    PubMed

    Sadeghi, Mohammad Hosein; Sina, Sedigheh; Mehdizadeh, Amir; Faghihi, Reza; Moharramzadeh, Vahed; Meigooni, Ali Soleimani

    2018-02-01

    The dosimetry procedure by simple superposition accounts only for the self-shielding of the source and does not take into account the attenuation of photons by the applicators. The purpose of this investigation is to estimate the effects of the tandem and ovoid applicator on the dose distribution inside the phantom using MCNP5 Monte Carlo simulations. In this study, the superposition method is used for obtaining the dose distribution in the phantom without the applicator for a typical gynecological brachytherapy (superposition-1). Then, the sources are simulated inside the tandem and ovoid applicator to identify the effect of applicator attenuation (superposition-2), and the doses at points A, B, bladder, and rectum were compared with the results of superposition. The exact dwell positions and times of the source, and the positions of the dosimetry points, were determined from the images and treatment data of an adult female patient from a cancer center. The MCNP5 Monte Carlo (MC) code was used for simulation of the phantoms, applicators, and sources. The results of this study showed no significant differences between the superposition method and the MC simulations for the different dosimetry points. The difference at all important dosimetry points was found to be less than 5%. According to the results, applicator attenuation has no significant effect on the calculated point doses; the superposition method, which adds the dose of each source obtained by the MC simulation, can estimate the doses to points A, B, bladder, and rectum with good accuracy.
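    The simple superposition method described above amounts to summing independently computed single-source doses, weighted by dwell times. A minimal sketch (illustrative Python; the dose-rate values stand in for per-dwell-position MCNP5 tallies and are not from the paper):

```python
def superposition_dose(dose_rates, dwell_times):
    """Total dose at a point by simple superposition: the sum over dwell
    positions of (single-source dose rate at the point) x (dwell time).
    dose_rates: Gy/s at the point of interest for each dwell position
    (hypothetical numbers standing in for MC tallies);
    dwell_times: seconds spent at each dwell position."""
    if len(dose_rates) != len(dwell_times):
        raise ValueError("one dwell time per dwell position")
    return sum(r * t for r, t in zip(dose_rates, dwell_times))

# Three dwell positions contributing to point A (illustrative numbers):
rates = [2.0e-4, 1.5e-4, 0.8e-4]   # Gy/s, one MC run per dwell position
times = [12.0, 10.0, 8.0]          # s
print(f"Dose at point A: {superposition_dose(rates, times):.4f} Gy")
```

    Superposition-2 differs only in that each per-source dose rate is tallied with the applicator present, so the attenuation is baked into the inputs rather than the summation.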

  1. Real-time computed tomography dosimetry during ultrasound-guided brachytherapy for prostate cancer.

    PubMed

    Kaplan, Irving D; Meskell, Paul; Oldenburg, Nicklas E; Saltzman, Brian; Kearney, Gary P; Holupka, Edward J

    2006-01-01

    Ultrasound-guided implantation of permanent radioactive seeds is a treatment option for localized prostate cancer. Several techniques have been described for the optimal placement of the seeds in the prostate during this procedure. Postimplantation dosimetric calculations are performed after the implant. Areas of underdosing can only be corrected with either an external beam boost or by performing a second implant. We demonstrate the feasibility of performing computed tomography (CT)-based postplanning during the ultrasound-guided implant and subsequently correcting for underdosed areas. Ultrasound-guided brachytherapy is performed on a modified CT table with general anesthesia. The postplanning CT scan is performed after the implant, while the patient is still under anesthesia. Additional seeds are implanted into "cold spots," and the resultant dosimetry confirmed with CT. Intraoperative postplanning was successfully performed. Dose-volume histograms demonstrated adequate dose coverage during the initial implant, but on detailed analysis, for some patients, areas of underdosing were observed either at the apex or the peripheral zone. Additional seeds were implanted to bring these areas to prescription dose. Intraoperative postplanning is feasible during ultrasound-guided brachytherapy for prostate cancer. Although the postimplant dose-volume histograms for all patients, before the implantation of additional seeds, were adequate according to the American Brachytherapy Society criteria, specific critical areas can be underdosed. Additional seeds can then be implanted to optimize the dosimetry and reduce the risk of underdosing areas of cancer.

  2. In vitro dosimetry of agglomerates

    NASA Astrophysics Data System (ADS)

    Hirsch, V.; Kinnear, C.; Rodriguez-Lorenzo, L.; Monnier, C. A.; Rothen-Rutishauser, B.; Balog, S.; Petri-Fink, A.

    2014-06-01

    Agglomeration of nanoparticles in biological fluids is a pervasive phenomenon that leads to difficulty in the interpretation of results from in vitro exposure, primarily due to differing particokinetics of agglomerates to nanoparticles. Therefore, well-defined small agglomerates were designed that possessed different particokinetic profiles, and their cellular uptake was compared to a computational model of dosimetry. The approach used here paves the way for a better understanding of the impact of agglomeration on the nanoparticle-cell interaction. Electronic supplementary information (ESI) available: ITC data for tiopronin/Au-NP interactions, agglomeration kinetics at different pHs for tiopronin-coated Au-NPs, UV-Vis spectra in water, PBS and DMEM and temporal correlation functions for single Au-NPs and corresponding agglomerates, calculation of diffusion and sedimentation parameters, modelling of relative cell uptake based on the ISDD model and cytotoxicity of single Au-NPs and their agglomerates, and synthesis and cell uptake of large spherical Au-NPs. See DOI: 10.1039/c4nr00460d
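    In vitro dosimetry models of the ISDD type mentioned in the ESI estimate delivered dose from two competing transport processes, diffusion and sedimentation, both of which follow from simple closed forms. A sketch of those two inputs (illustrative Python; the viscosity, temperature and effective-density values are assumptions for illustration, not the paper's data):

```python
import math

def diffusion_coefficient(d_h, T=310.0, eta=7.4e-4):
    """Stokes-Einstein diffusion coefficient (m^2/s) for a particle of
    hydrodynamic diameter d_h (m) in a medium of viscosity eta (Pa.s)
    at temperature T (K). The default T/eta are rough cell-culture-
    medium assumptions."""
    k_B = 1.380649e-23  # Boltzmann constant, J/K
    return k_B * T / (3.0 * math.pi * eta * d_h)

def sedimentation_velocity(d_h, rho_p, rho_m=1000.0, eta=7.4e-4):
    """Stokes settling velocity (m/s) for a particle or agglomerate of
    effective diameter d_h (m) and effective density rho_p (kg/m^3) in
    a medium of density rho_m; g is standard gravity."""
    g = 9.81
    return g * (rho_p - rho_m) * d_h**2 / (18.0 * eta)

# A 100 nm agglomerate with an assumed effective density of 4000 kg/m^3:
print(diffusion_coefficient(100e-9))         # ~6.1e-12 m^2/s
print(sedimentation_velocity(100e-9, 4000))  # ~2.2e-8 m/s, about 2 mm/day
```

    The d_h**2 dependence of settling against the 1/d_h dependence of diffusion is exactly why agglomerates and single particles deliver different doses to adherent cells.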

  3. Optical dosimetry probes to validate Monte Carlo and empirical-method-based NIR dose planning in the brain.

    PubMed

    Verleker, Akshay Prabhu; Shaffer, Michael; Fang, Qianqian; Choi, Mi-Ran; Clare, Susan; Stantz, Keith M

    2016-12-01

    Three-dimensional photon dosimetry in tissue is critical in designing optical therapeutic protocols to trigger light-activated drug release. The objective of this study is to investigate the feasibility of a Monte Carlo-based optical therapy planning software by developing dosimetry tools to characterize and cross-validate the local photon fluence in brain tissue, as part of a long-term strategy to quantify the effects of photoactivated drug release in brain tumors. An existing GPU-based 3D Monte Carlo (MC) code was modified to simulate near-infrared photon transport with differing laser beam profiles within phantoms of skull bone (B), white matter (WM), and gray matter (GM). A novel titanium-based optical dosimetry probe with isotropic acceptance was used to validate the local photon fluence, and an empirical model of photon transport was developed to significantly decrease execution time for clinical application. Differences between the MC and the dosimetry probe measurements averaged 11.27%, 13.25%, and 11.81% along the illumination beam axis, and 9.4%, 12.06%, and 8.91% perpendicular to the beam axis, for the WM, GM, and B phantoms, respectively. For a heterogeneous head phantom, the measured errors were 17.71% and 18.04% along and perpendicular to the beam axis. The empirical algorithm was validated by probe measurements and matched the MC results (R² = 0.99), with average errors of 10.1%, 45.2%, and 22.1% relative to the probe measurements, and 22.6%, 35.8%, and 21.9% relative to the MC, for the WM, GM, and B phantoms, respectively. The simulation time for the empirical model was 6 s, versus 8 h for the GPU-based Monte Carlo, for a head phantom simulation. These tools provide the capability to develop and optimize treatment plans for optimal release of pharmaceuticals in the treatment of cancer. Future work will test and validate these novel delivery and release mechanisms in vivo.

  4. Air density correction in ionization dosimetry.

    PubMed

    Christ, G; Dohm, O S; Schüle, E; Gaupp, S; Martin, M

    2004-05-21

    Air density must be taken into account when ionization dosimetry is performed with unsealed ionization chambers. The German dosimetry protocol DIN 6800-2 states an air density correction factor for which the current barometric pressure and temperature and their reference values must be known. It also states that differences between air density and the attendant reference value, as well as changes in ionization chamber sensitivity, can be determined using a radioactive check source. Both methods have advantages and drawbacks, which the paper discusses in detail. Barometric pressure at a given height above sea level can be determined using a suitable barometer, or from data downloaded from airport or weather service internet sites. The main focus of the paper is to show how barometric data from measurement or from the internet are correctly processed. The paper therefore also provides all the requisite equations and terminological explanations. Computed and measured barometric pressure readings are compared, and long-term experience with air density correction factors obtained using both methods is described.
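    The air density correction described above scales the chamber reading by the ratio of reference to actual air density, computed from barometric pressure and temperature. A minimal sketch (illustrative Python; the reference conditions shown are the common 1013.25 hPa / 20 °C convention and stand in for the protocol's actual reference values):

```python
def air_density_correction(p_hpa, t_celsius, p0_hpa=1013.25, t0_celsius=20.0):
    """Air density correction factor for an unsealed ionization chamber:
    the chamber reading is multiplied by
        k_rho = (p0 / p) * (T / T0),
    with temperatures in kelvin. p_hpa/t_celsius are the conditions at
    measurement time; the default reference conditions are a common
    convention and may differ from a given protocol's values."""
    T = t_celsius + 273.15
    T0 = t0_celsius + 273.15
    return (p0_hpa / p_hpa) * (T / T0)

# Measurement at 980 hPa and 23 degC:
print(round(air_density_correction(980.0, 23.0), 4))
print(air_density_correction(1013.25, 20.0))  # 1.0 at reference conditions
```

    Lower pressure and higher temperature both mean less air in the chamber volume, hence a correction factor above unity, as in the example.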

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leichner, P.K.

    This report summarizes research in beta-particle dosimetry, quantitative single-photon emission computed tomography (SPECT), the clinical implementation of these two areas of research in radioimmunotherapy (RIT), and postgraduate training provided since the inception of this grant on July 15, 1989. To improve beta-particle dosimetry, a point source function was developed that is valid for a wide range of beta emitters. Analytical solutions for beta-particle dose rates within and outside slabs of finite thickness were validated in experimental tumors and are now being used in clinical RIT. Quantitative SPECT based on the circular harmonic transform (CHT) algorithm was validated in phantom, experimental, and clinical studies. This has led to improved macrodosimetry in clinical RIT. In dosimetry at the multi-cellular level, studies were made of the HepG2 human hepatoblastoma grown subcutaneously in nude mice. Histologic sections and autoradiographs were prepared to quantitate activity distributions of radiolabeled antibodies. Absorbed-dose calculations are being carried out for {sup 131}I and {sup 90}Y beta particles for these antibody distributions.
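    The slab solutions mentioned above superpose a beta point kernel over a uniformly active slab of finite thickness. A much-simplified sketch of the idea (illustrative Python; a crude exponential planar kernel is assumed in place of the report's point source function, and the absorption coefficient is illustrative):

```python
import math

def slab_beta_dose(z, a, nu):
    """Relative dose rate at depth z (cm) on the axis of a uniformly
    active slab occupying [-a, a] (cm), assuming a simplified planar
    kernel exp(-nu*|z - s|) for a source plane at depth s. This is a
    crude stand-in for a beta point kernel; nu is an effective
    absorption coefficient (cm^-1). The slab dose is the closed-form
    superposition integral of the planar kernel over s in [-a, a]."""
    if abs(z) <= a:  # point inside the slab
        return (2.0 - math.exp(-nu * (a - abs(z)))
                    - math.exp(-nu * (a + abs(z)))) / nu
    d = abs(z) - a   # point outside: distance to the near face
    return (math.exp(-nu * d) - math.exp(-nu * (abs(z) + a))) / nu

# Illustrative nu = 10 cm^-1 and a 2 mm half-thickness slab:
center = slab_beta_dose(0.0, 0.1, 10.0)
surface = slab_beta_dose(0.1, 0.1, 10.0)
print(round(surface / center, 3))  # dose at the slab surface vs. its center
```

    The drop from center to surface, and the exponential tail outside the slab, are the qualitative features such analytical slab solutions capture.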

  6. Radiofrequency Radiation Dosimetry Handbook. 4th Edition

    DTIC Science & Technology

    1986-10-01

    … reasonable. Such an equivalence was demonstrated by Nielsen and Nielsen (1965) when they measured identical thermoregulatory responses to exercise … Circulatory and sweating responses during exercise and heat stress, pp. 251-276. In E. R. Adair (ed.), Microwaves and Thermoregulation. ISBN 0-12-044020-2.

  7. Toward a New Evaluation of Neutron Standards

    DOE PAGES

    Carlson, Allan D.; Pronyaev, Vladimir G.; Capote, Roberto; ...

    2016-02-03

    Measurements related to neutron cross section standards and certain prompt neutron fission spectra are being evaluated. In addition to the standard cross sections, investigations of reference data that are not as well known as the standards are being considered. We discuss procedures and codes for performing this work. A number of libraries will use the results of this standards evaluation for new versions of their libraries. Most of these data have applications in neutron dosimetry.

  8. Validating Fricke dosimetry for the measurement of absorbed dose to water for HDR 192Ir brachytherapy: a comparison between primary standards of the LCR, Brazil, and the NRC, Canada.

    PubMed

    Salata, Camila; David, Mariano Gazineu; de Almeida, Carlos Eduardo; El Gamal, Islam; Cojocaru, Claudiu; Mainegra-Hing, Ernesto; McEwen, Malcom

    2018-04-05

    Two Fricke-based absorbed dose to water standards for HDR Ir-192 dosimetry, developed independently by the LCR in Brazil and the NRC in Canada, have been compared. The agreement in the determination of the dose rate from an HDR Ir-192 source at 1 cm in a water phantom was found to be within the k = 1 combined measurement uncertainties of the two standards: D_NRC/D_LCR = 1.011, standard uncertainty = 2.2%. The dose-based standards also agreed within the uncertainties with the manufacturer's stated dose rate value, which is traceable to a national standard of air kerma. A number of possible influence quantities were investigated, including the specific method for producing the ferrous-sulphate Fricke solution, the geometry of the holder, and the Monte Carlo code used to determine correction factors. The comparison highlighted the lack of data on the determination of G(Fe3+) in this energy range and the possibilities for further development of the holders used to contain the Fricke solution. The comparison also confirmed the suitability of Fricke dosimetry for Ir-192 primary standard dose rate determinations at therapy dose levels.
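    Fricke dosimetry as used by both standards converts a radiation-induced change in Fe3+ absorbance into absorbed dose through the radiation chemical yield G(Fe3+). A minimal sketch of that conversion (illustrative Python; the default extinction coefficient, density and G-value are textbook-style numbers for illustration, not the values realized by either primary standard):

```python
def fricke_dose(delta_A, epsilon=2187.0, length=1.0, rho=1.024, G=1.61e-6):
    """Absorbed dose (Gy) from a Fricke dosimeter reading:
        D = delta_A / (epsilon * length * rho * G)
    delta_A: radiation-induced change in absorbance at 304 nm
    epsilon: molar linear absorption coefficient of Fe3+, L/(mol cm)
    length:  spectrophotometer path length, cm
    rho:     density of the Fricke solution, kg/L
    G:       radiation chemical yield G(Fe3+), mol/J
    The defaults (25 degC epsilon, Co-60-like G-value) are rough
    literature-style numbers; a primary standard determines G and its
    energy dependence, which the comparison above flags as a data gap
    for the Ir-192 energy range."""
    return delta_A / (epsilon * length * rho * G)

print(f"{fricke_dose(0.036):.2f} Gy")  # an absorbance change of 0.036
```

    Dimensionally, delta_A divided by (L mol^-1 cm^-1)(cm)(kg/L)(mol/J) yields J/kg, i.e. gray.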

  9. Validating Fricke dosimetry for the measurement of absorbed dose to water for HDR 192Ir brachytherapy: a comparison between primary standards of the LCR, Brazil, and the NRC, Canada

    NASA Astrophysics Data System (ADS)

    Salata, Camila; Gazineu David, Mariano; de Almeida, Carlos Eduardo; El Gamal, Islam; Cojocaru, Claudiu; Mainegra-Hing, Ernesto; McEwen, Malcom

    2018-04-01

    Two Fricke-based absorbed dose to water standards for HDR Ir-192 dosimetry, developed independently by the LCR in Brazil and the NRC in Canada, have been compared. The agreement in the determination of the dose rate from an HDR Ir-192 source at 1 cm in a water phantom was found to be within the k = 1 combined measurement uncertainties of the two standards: D_NRC/D_LCR = 1.011, standard uncertainty = 2.2%. The dose-based standards also agreed within the uncertainties with the manufacturer’s stated dose rate value, which is traceable to a national standard of air kerma. A number of possible influence quantities were investigated, including the specific method for producing the ferrous-sulphate Fricke solution, the geometry of the holder, and the Monte Carlo code used to determine correction factors. The comparison highlighted the lack of data on the determination of G(Fe3+) in this energy range and the possibilities for further development of the holders used to contain the Fricke solution. The comparison also confirmed the suitability of Fricke dosimetry for Ir-192 primary standard dose rate determinations at therapy dose levels.

  10. Deriving an explicit hepatic clearance equation accounting for plasma protein binding and hepatocellular uptake.

    PubMed

    Yoon, Miyoung; Clewell, Harvey J; Andersen, Melvin E

    2013-02-01

    High throughput in vitro biochemical and cell-based assays have the promise to provide more mechanism-based assessments of the adverse effects of large numbers of chemicals. One of the most challenging hurdles for interpreting in vitro toxicity findings is the need for reverse dosimetry tools that estimate the exposures that will give concentrations in vivo similar to the active concentrations in vitro. Recent experience using IVIVE approaches to estimate in vivo pharmacokinetics (Wetmore et al., 2012) identified the need to develop a hepatic clearance equation that explicitly accounted for a broader set of protein binding and membrane transport processes and did not depend on a well-mixed description of the liver compartment. Here we derive an explicit steady-state hepatic clearance equation that includes these factors. In addition to the derivation, we provide simple computer code to calculate steady-state extraction for any combination of blood flow, membrane transport processes and plasma protein-chemical binding rates. This expanded equation provides a tool to estimate hepatic clearance for a more diverse array of compounds. Copyright © 2012 Elsevier Ltd. All rights reserved.
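    For orientation, the familiar well-stirred model that the paper's explicit equation generalizes (by adding plasma protein binding and hepatocellular uptake terms) can be written in a few lines. A sketch of that baseline only (illustrative Python; this is not the extended equation derived in the paper):

```python
def hepatic_clearance_well_stirred(Q, fu, CLint):
    """Classic well-stirred hepatic clearance, the baseline that the
    paper's explicit steady-state equation extends with membrane
    transport processes:
        CL_h = Q * fu * CLint / (Q + fu * CLint)
    Q:     hepatic blood flow (L/h)
    fu:    unbound fraction of the chemical in blood
    CLint: intrinsic metabolic clearance of unbound chemical (L/h)"""
    return Q * fu * CLint / (Q + fu * CLint)

def extraction_ratio(Q, fu, CLint):
    """Steady-state hepatic extraction ratio E = CL_h / Q."""
    return hepatic_clearance_well_stirred(Q, fu, CLint) / Q

# Metabolism-limited vs flow-limited behaviour (illustrative numbers):
print(extraction_ratio(90.0, 0.1, 10.0))    # low E: CL_h ~ fu * CLint
print(extraction_ratio(90.0, 1.0, 5000.0))  # E -> 1: CL_h ~ Q
```

    The two limits shown in the usage are the behaviour any valid extension must reproduce; the extended equation changes what happens in between, where binding and uptake kinetics matter.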

  11. Computational lymphatic node models in pediatric and adult hybrid phantoms for radiation dosimetry

    NASA Astrophysics Data System (ADS)

    Lee, Choonsik; Lamart, Stephanie; Moroz, Brian E.

    2013-03-01

    We developed models of lymphatic nodes for six pediatric and two adult hybrid computational phantoms to calculate lymphatic node dose estimates for external and internal radiation exposures. We derived the number of lymphatic nodes from the recommendations in International Commission on Radiological Protection (ICRP) Publications 23 and 89 at 16 cluster locations for the lymphatic nodes: extrathoracic, cervical, thoracic (upper and lower), breast (left and right), mesentery (left and right), axillary (left and right), cubital (left and right), inguinal (left and right) and popliteal (left and right), for different ages (newborn, 1-, 5-, 10-, 15-year-old and adult). We modeled each lymphatic node within the voxel format of the hybrid phantoms by assuming that all nodes have an identical size, derived from published data, except at narrow cluster sites. The lymph nodes were generated by the following algorithm: (1) selection of the lymph node site among the 16 cluster sites; (2) random sampling of the location of the lymph node within a spherical space centered at the chosen cluster site; (3) creation of the sphere or ovoid of tissue representing the node based on lymphatic node characteristics defined in ICRP Publications 23 and 89. We created lymph nodes until the pre-defined number of lymphatic nodes at the selected cluster site was reached. This algorithm was applied to the pediatric (newborn, 1-, 5-, 10-, and 15-year-old male) and adult male and female ICRP-compliant hybrid phantoms after voxelization. To assess the performance of our models for internal dosimetry, we calculated dose conversion coefficients, called S values, for selected organs and tissues with iodine-131 distributed in six lymphatic node cluster sites using MCNPX 2.6, a well-validated Monte Carlo radiation transport code. 
    Our analysis of the calculations indicates that the S values were significantly affected by the location of the lymph node clusters and that the values increased for the smaller phantoms due to their shorter inter-organ distances compared to the larger phantoms. By testing the sensitivity of the S values to random sampling and voxel resolution, we confirmed that the lymph node model is reasonably stable and consistent across different random samplings and voxel resolutions.
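    Steps (1)-(3) of the node-generation algorithm can be sketched directly. The following is an illustrative Python rendering (rejection sampling inside the cluster sphere plus a non-overlap check; the organ/body-contour exclusion performed in the voxel phantoms is omitted, and all dimensions are hypothetical):

```python
import math
import random

def place_lymph_nodes(cluster_center, cluster_radius, node_radius, n_nodes, seed=0):
    """Sketch of the node-generation loop described above: repeatedly
    sample a candidate node centre uniformly inside a sphere around the
    cluster site, keep it if it does not overlap any previously
    accepted node, and stop once the ICRP-derived node count for the
    site is reached. Lengths are in arbitrary (e.g. cm) units."""
    rng = random.Random(seed)  # seeded for reproducible sampling
    cx, cy, cz = cluster_center
    nodes = []
    while len(nodes) < n_nodes:
        # uniform point in the unit ball via rejection sampling
        x, y, z = (rng.uniform(-1, 1) for _ in range(3))
        if x * x + y * y + z * z > 1.0:
            continue
        p = (cx + x * cluster_radius,
             cy + y * cluster_radius,
             cz + z * cluster_radius)
        # accept only if the new node does not overlap an existing one
        if all(math.dist(p, q) >= 2 * node_radius for q in nodes):
            nodes.append(p)
    return nodes

nodes = place_lymph_nodes((0.0, 0.0, 0.0), 2.0, 0.25, 10)
print(len(nodes))  # 10 non-overlapping node centres inside the cluster sphere
```

    In the paper's implementation each accepted centre is then voxelized as a sphere or ovoid of node tissue; the stability under different random samplings reported above corresponds to varying the seed here.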

  12. Agent-Based Computational Modeling to Examine How Individual Cell Morphology Affects Dosimetry

    EPA Science Inventory

    Cell-based models utilizing high-content screening (HCS) data have applications for predictive toxicology. Evaluating concentration-dependent effects on cell fate and state response is a fundamental utilization of HCS data.Although HCS assays may capture quantitative readouts at ...

  13. A hyperboloid representation of the bone-marrow interface within 3D NMR images of trabecular bone: applications to skeletal dosimetry

    NASA Astrophysics Data System (ADS)

    Rajon, D. A.; Shah, A. P.; Watchman, C. J.; Brindle, J. M.; Bolch, W. E.

    2003-06-01

    Recent advances in physical models of skeletal dosimetry utilize high-resolution NMR microscopy images of trabecular bone. These images are coupled to radiation transport codes to assess energy deposition within active bone marrow irradiated by bone- or marrow-incorporated radionuclides. Recent studies have demonstrated that the rectangular shape of image voxels is responsible for cross-region (bone-to-marrow) absorbed fraction errors of up to 50% for very low-energy electrons (<50 keV). In this study, a new hyperboloid adaptation of the marching cube (MC) image-visualization algorithm is implemented within 3D digital images of trabecular bone to better define the bone-marrow interface, and thus reduce voxel effects in the assessment of cross-region absorbed fractions. To test the method, a mathematical sample of trabecular bone was constructed, composed of a random distribution of spherical marrow cavities, and subsequently coupled to the EGSnrc radiation code to generate reference values for the energy deposition in marrow or bone. Next, digital images of the bone model were constructed over a range of simulated image resolutions, and coupled to EGSnrc using the hyperboloid MC (HMC) algorithm. For the radionuclides 33P, 117mSn, 131I and 153Sm, values of S(marrow←bone) estimated using voxel models of trabecular bone were shown to have relative errors of 10%, 9%, <1% and <1% at a voxel size of 150 µm. At a voxel size of 60 µm, these errors were 6%, 5%, <1% and <1%, respectively. When the HMC model was applied during particle transport, the relative errors on S(marrow←bone) for these same radionuclides were reduced to 7%, 6%, <1% and <1% at a voxel size of 150 µm, and to 2%, 2%, <1% and <1% at a voxel size of 60 µm. The technique was also applied to a real NMR image of human trabecular bone with a similar demonstration of reductions in dosimetry errors.

  14. Results from a Prototype Proton-CT Head Scanner

    NASA Astrophysics Data System (ADS)

    Johnson, R. P.; Bashkirov, V. A.; Coutrakon, G.; Giacometti, V.; Karbasi, P.; Karonis, N. T.; Ordoñez, C. E.; Pankuch, M.; Sadrozinski, H. F.-W.; Schubert, K. E.; Schulte, R. W.

    We are exploring low-dose proton radiography and computed tomography (pCT) as techniques to improve the accuracy of proton treatment planning and to provide artifact-free images for verification and adaptive therapy at the time of treatment. Here we report on comprehensive beam test results with our prototype pCT head scanner. The detector system and data acquisition attain a sustained rate of more than a million protons individually measured per second, allowing a full CT scan to be completed in six minutes or less of beam time. In order to assess the performance of the scanner for proton radiography as well as computed tomography, we have performed numerous scans of phantoms at the Northwestern Medicine Chicago Proton Center including a custom phantom designed to assess the spatial resolution, a phantom to assess the measurement of relative stopping power, and a dosimetry phantom. Some images, performance, and dosimetry results from those phantom scans are presented together with a description of the instrument, the data acquisition system, and the calibration methods.

  15. Monte Carlo simulations in Nuclear Medicine

    NASA Astrophysics Data System (ADS)

    Loudos, George K.

    2007-11-01

    Molecular imaging technologies provide unique abilities to localise signs of disease before symptoms appear, assist in drug testing, optimize and personalize therapy, and assess the efficacy of treatment regimes for different types of cancer. Monte Carlo simulation packages are used as an important tool for the optimal design of detector systems. In addition, they have demonstrated potential to improve image quality and acquisition protocols. Many general-purpose codes (MCNP, Geant4, etc.) and dedicated codes (SimSET, etc.) have been developed, aiming to provide accurate and fast results. Special emphasis will be given to the GATE toolkit. The GATE code, currently under development by the OpenGATE collaboration, is the most accurate and promising code for performing realistic simulations. The purpose of this article is to introduce the non-expert reader to the current status of MC simulations in nuclear medicine, briefly provide examples of currently simulated systems, and present future challenges, including the simulation of clinical studies and dosimetry applications.

  16. The EGS4 Code System: Solution of Gamma-ray and Electron Transport Problems

    DOE R&D Accomplishments Database

    Nelson, W. R.; Namito, Yoshihito

    1990-03-01

    In this paper we present an overview of the EGS4 Code System -- a general purpose package for the Monte Carlo simulation of the transport of electrons and photons. During the last 10-15 years EGS has been widely used to design accelerators and detectors for high-energy physics. More recently the code has been found to be of tremendous use in medical radiation physics and dosimetry. The problem-solving capabilities of EGS4 will be demonstrated by means of a variety of practical examples. To facilitate this review, we will take advantage of a new add-on package, called SHOWGRAF, to display particle trajectories in complicated geometries. These are shown as 2-D laser pictures in the written paper and as photographic slides of a 3-D high-resolution color monitor during the oral presentation. 11 refs., 15 figs.

  17. Combined experimental and Monte Carlo verification of brachytherapy plans for vaginal applicators

    NASA Astrophysics Data System (ADS)

    Sloboda, Ron S.; Wang, Ruqing

    1998-12-01

    Dose rates in a phantom around a shielded and an unshielded vaginal applicator containing Selectron low-dose-rate sources were determined by experiment and Monte Carlo simulation. Measurements were performed with thermoluminescent dosimeters in a white polystyrene phantom using an experimental protocol geared for precision. Calculations for the same set-up were done using a version of the EGS4 Monte Carlo code system modified for brachytherapy applications into which a new combinatorial geometry package developed by Bielajew was recently incorporated. Measured dose rates agree with Monte Carlo estimates to within 5% (1 SD) for the unshielded applicator, while highlighting some experimental uncertainties for the shielded applicator. Monte Carlo calculations were also done to determine a value for the effective transmission of the shield required for clinical treatment planning, and to estimate the dose rate in water at points in axial and sagittal planes transecting the shielded applicator. Comparison with dose rates generated by the planning system indicates that agreement is better than 5% (1 SD) at most positions. The precision thermoluminescent dosimetry protocol and modified Monte Carlo code are effective complementary tools for brachytherapy applicator dosimetry.

  18. Air kerma calibration factors and chamber correction values for PTW soft x-ray, NACP and Roos ionization chambers at very low x-ray energies.

    PubMed

    Ipe, N E; Rosser, K E; Moretti, C J; Manning, J W; Palmer, M J

    2001-08-01

    This paper evaluates the characteristics of ionization chambers for the measurement of absorbed dose to water using very low-energy x-rays. The values of the chamber correction factor, k(ch), used in the IPEMB 1996 code of practice for the UK secondary standard ionization chambers (PTW type M23342 and PTW type M23344), the Roos (PTW type 34001) and NACP electron chambers are derived. The responses in air of the small and large soft x-ray chambers (PTW type M23342 and PTW type M23344) and the NACP and Roos electron ionization chambers were compared. Besides the soft x-ray chambers, the NACP and Roos chambers can be used for very low-energy x-ray dosimetry provided that they are used in the restricted energy range for which their response does not change by more than 5%. The chamber correction factor was found by comparing the absorbed dose to water determined using the dosimetry protocol recommended for low-energy x-rays with that for very low-energy x-rays. The overlap energy range was extended using data from Grosswendt and Knight. Chamber correction factors given in this paper are chamber dependent, varying from 1.037 to 1.066 for a PTW type M23344 chamber, which is very different from a value of unity given in the IPEMB code. However, the values of k(ch) determined in this paper agree with those given in the DIN standard within experimental uncertainty. The authors recommend that the very low-energy section of the IPEMB code is amended to include the most up-to-date values of k(ch).

  19. Terahertz Radiation: A Non-contact Tool for the Selective Stimulation of Biological Responses in Human Cells

    DTIC Science & Technology

    2014-01-01

    … computational and empirical dosimetric tools [31]. For the computational dosimetry, we employed finite-difference time-domain (FDTD) modeling techniques to … temperature-time data collected for a well exposed to THz radiation using finite-difference time-domain (FDTD) modeling techniques and thermocouples … Alteration in the expression of such genes underscores the signif…

  20. The effect of tandem-ovoid titanium applicator on points A, B, bladder, and rectum doses in gynecological brachytherapy using 192Ir

    PubMed Central

    Sadeghi, Mohammad Hosein; Mehdizadeh, Amir; Faghihi, Reza; Moharramzadeh, Vahed; Meigooni, Ali Soleimani

    2018-01-01

    Purpose The dosimetry procedure by simple superposition accounts only for the self-shielding of the source and does not take into account the attenuation of photons by the applicators. The purpose of this investigation is to estimate the effects of the tandem and ovoid applicator on the dose distribution inside the phantom using MCNP5 Monte Carlo simulations. Material and methods In this study, the superposition method is used for obtaining the dose distribution in the phantom without the applicator for a typical gynecological brachytherapy (superposition-1). Then, the sources are simulated inside the tandem and ovoid applicator to identify the effect of applicator attenuation (superposition-2), and the doses at points A, B, bladder, and rectum were compared with the results of superposition. The exact dwell positions and times of the source, and the positions of the dosimetry points, were determined from the images and treatment data of an adult female patient from a cancer center. The MCNP5 Monte Carlo (MC) code was used for simulation of the phantoms, applicators, and sources. Results The results of this study showed no significant differences between the superposition method and the MC simulations for the different dosimetry points. The difference at all important dosimetry points was found to be less than 5%. Conclusions According to the results, applicator attenuation has no significant effect on the calculated point doses; the superposition method, which adds the dose of each source obtained by the MC simulation, can estimate the doses to points A, B, bladder, and rectum with good accuracy. PMID:29619061

  1. Comparison of parameters affecting GNP-loaded choroidal melanoma dosimetry; Monte Carlo study

    NASA Astrophysics Data System (ADS)

    Sharabiani, Marjan; Asadi, Somayeh; Barghi, Amir Rahnamai; Vaezzadeh, Mehdi

    2018-04-01

    The current study reports the results of tumor dosimetry in the presence of gold nanoparticles (GNPs) of different sizes and concentrations. Due to the limited number of studies on the brachytherapy of choroidal melanoma in combination with GNPs, this study was performed to determine the optimum size and concentration of GNPs that contribute the highest dose deposition in the tumor region, using two phantom test cases, namely a water phantom and a full Monte Carlo model of the human eye. Both the water and human eye phantoms were simulated with the MCNP5 code. Tumor dosimetry was performed for a typical point photon source with an energy of 0.38 MeV as a high-energy source and a 103Pd brachytherapy source with an average energy of 0.021 MeV as a low-energy source, in the water phantom and the eye phantom, respectively. Such dosimetry was done for different sizes and concentrations of GNPs. For all of the diameters, an increase in the concentration of GNPs resulted in an increase in the dose deposited in the region of interest. At a given concentration, GNPs with larger diameters contributed more dose to the tumor region, an effect that was more pronounced in the eye phantom. A diameter of 100 nm was found to be the optimum size for achieving the highest energy deposition within the target. This work investigated the optimum parameters affecting macroscopic dose enhancement in GNP-aided brachytherapy of choroidal melanoma. The current work also has implications for using low-energy photon sources in the presence of GNPs to acquire the highest dose enhancement. This study was conducted with four different sizes and concentrations of GNPs. Considering the sensitivity of human eye tissue, a comprehensive study over a wide range of sizes and concentrations is required in order to report precise optimum parameters affecting radiosensitivity.
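    A zeroth-order way to see why gold loading enhances dose at low photon energies is to weight mass energy-absorption coefficients by mass fraction. A sketch of that estimate (illustrative Python; the ~20 keV coefficients are rough values for illustration only, and the full MC transport performed in the study supersedes this approximation):

```python
def macroscopic_def(w_au, mu_en_au=69.0, mu_en_w=0.55):
    """First-order macroscopic dose-enhancement factor for a water-like
    tumour loaded with a gold mass fraction w_au, assuming the photon
    fluence is unperturbed and dose scales with the mixture's mass
    energy-absorption coefficient:
        DEF = [w_au*(mu_en/rho)_Au + (1-w_au)*(mu_en/rho)_w] / (mu_en/rho)_w
    The default coefficients (cm^2/g) are rough ~20 keV values used
    purely for illustration."""
    return (w_au * mu_en_au + (1.0 - w_au) * mu_en_w) / mu_en_w

# 7 mg Au per g of tumour (w_au = 0.007):
print(round(macroscopic_def(0.007), 2))
```

    Because the Au/water coefficient ratio collapses at megavoltage energies, the same mass fraction gives almost no enhancement for high-energy sources, which is the study's rationale for pairing GNPs with low-energy photon sources such as 103Pd.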

  2. A new cubic phantom for PET/CT dosimetry: Experimental and Monte Carlo characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Belinato, Walmir; Silva, Rogerio M.V.; Souza, Divanizia N.

In recent years, positron emission tomography (PET) associated with multidetector computed tomography (MDCT) has become a widely disseminated diagnostic technique for evaluating various malignant tumors and other diseases. However, during PET/CT examinations, the doses of ionizing radiation received by the internal organs of patients may be substantial. To study the doses involved in PET/CT procedures, a new cubic phantom of overlapping acrylic plates was developed and characterized. This phantom has a deposit for the placement of the fluorine-18 fluoro-2-deoxy-D-glucose ({sup 18}F-FDG) solution. There are also small holes near the faces for the insertion of optically stimulated luminescence dosimeters (OSLDs). The holes for OSLDs are positioned at different distances from the {sup 18}F-FDG deposit. The experimental results were obtained on two PET/CT devices operating with different parameters. Differences in the absorbed doses were observed in the OSLD measurements due to the non-orthogonal positioning of the detectors inside the phantom. The phantom was also evaluated using Monte Carlo simulations with the MCNPX code. The phantom and the geometrical characteristics of the equipment were carefully modeled in MCNPX, in order to develop a new methodology for comparing experimental and simulated results, as well as to allow the characterization of PET/CT equipment in Monte Carlo simulations. All results showed good agreement, indicating that this new phantom is suitable for such experiments.

  3. Implementing Shared Memory Parallelism in MCBEND

    NASA Astrophysics Data System (ADS)

    Bird, Adam; Long, David; Dobson, Geoff

    2017-09-01

MCBEND is a general purpose radiation transport Monte Carlo code from AMEC Foster Wheeler's ANSWERS® Software Service. MCBEND is well established in the UK shielding community for radiation shielding and dosimetry assessments. The existing MCBEND parallel capability effectively involves running the same calculation on many processors. This works very well except when the memory requirements of a model restrict the number of instances of a calculation that will fit on a machine. To utilise parallel hardware more effectively, OpenMP has been used to implement shared memory parallelism in MCBEND. This paper describes the reasoning behind the choice of OpenMP, notes some of the challenges of multi-threading an established code such as MCBEND and assesses the performance of the parallel method implemented in MCBEND.
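
    The memory problem the abstract describes can be sketched in Python's multiprocessing as a stand-in for OpenMP: all workers read one shared model array instead of each holding a private copy, and merge worker-local tallies under a lock. The "transport" loop is a toy placeholder, not MCBEND's algorithm.

```python
# Sketch of shared-memory parallel Monte Carlo tallying: one shared model,
# per-worker private tallies, a single synchronized merge per worker.
import multiprocessing as mp
import random

def worker(model, tally, n_histories, seed):
    rng = random.Random(seed)
    local = [0.0] * len(tally)             # private tally, merged once at end
    for _ in range(n_histories):
        cell = rng.randrange(len(model))   # toy "transport": pick a cell
        local[cell] += model[cell]         # score using shared read-only data
    with tally.get_lock():                 # one synchronized merge per worker
        for i, v in enumerate(local):
            tally[i] += v

if __name__ == "__main__":
    model = mp.Array("d", [1.0, 2.0, 3.0])  # shared between all workers
    tally = mp.Array("d", 3)
    procs = [mp.Process(target=worker, args=(model, tally, 1000, s))
             for s in range(4)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    print(list(tally))
```

    Accumulating into a private tally and merging once, rather than locking on every score, is the same trade-off an OpenMP reduction makes.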

  4. Introduction of a deformable x-ray CT polymer gel dosimetry system

    NASA Astrophysics Data System (ADS)

    Maynard, E.; Heath, E.; Hilts, M.; Jirasek, A.

    2018-04-01

    This study introduces the first 3D deformable dosimetry system based on x-ray computed tomography (CT) polymer gel dosimetry and establishes the setup reproducibility, deformation characteristics and dose response of the system. A N-isopropylacrylamide (NIPAM)-based gel formulation optimized for x-ray CT gel dosimetry was used, with a latex balloon serving as the deformable container and low-density polyethylene and polyvinyl alcohol providing additional oxygen barrier. Deformable gels were irradiated with a 6 MV calibration pattern to determine dosimetric response and a dosimetrically uniform plan to determine the spatial uniformity of the response. Wax beads were added to each gel as fiducial markers to track the deformation and setup of the gel dosimeters. From positions of the beads on CT images the setup reproducibility and the limits and reproducibility of gel deformation were determined. Comparison of gel measurements with Monte Carlo dose calculations found excellent dosimetric accuracy, comparable to that of an established non-deformable dosimetry system, with a mean dose discrepancy of 1.5% in the low-dose gradient region and a gamma pass rate of 97.9% using a 3%/3 mm criterion. The deformable dosimeter also showed good overall spatial dose uniformity throughout the dosimeter with some discrepancies within 20 mm of the edge of the container. Tracking of the beads within the dosimeter found that sub-millimetre setup accuracy is achievable with this system. The dosimeter was able to deform and relax when externally compressed by up to 30 mm without sustaining any permanent damage. Internal deformations in 3D produced average marker movements of up to 12 mm along the direction of compression. These deformations were also shown to be reproducible over 100 consecutive deformations. 
This work has established several important characteristics of a new deformable dosimetry system which shows promise for future clinical applications, including the validation of deformable dose accumulation algorithms.
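
    The 3%/3 mm gamma criterion quoted above can be sketched in one dimension as follows. This is a toy global-normalization version on a common grid; clinical implementations interpolate between points and work in 3D.

```python
# Minimal 1-D global gamma-index sketch for a dose tolerance and
# distance-to-agreement criterion (default 3%/3 mm).
import math

def gamma_pass_rate(ref, meas, spacing_mm, dose_tol=0.03, dist_mm=3.0):
    d_max = max(ref)                       # global normalization dose
    passed = 0
    for i, d_ref in enumerate(ref):
        best = math.inf
        for j, d_meas in enumerate(meas):
            dd = (d_meas - d_ref) / (dose_tol * d_max)   # dose-difference term
            dr = (j - i) * spacing_mm / dist_mm          # distance term
            best = min(best, math.hypot(dd, dr))
        if best <= 1.0:                    # gamma <= 1 counts as a pass
            passed += 1
    return passed / len(ref)

profile = [0.1, 0.5, 1.0, 0.5, 0.1]
print(gamma_pass_rate(profile, profile, spacing_mm=1.0))  # identical -> 1.0
```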

  5. Computational dosimetry for grounded and ungrounded human models due to contact current

    NASA Astrophysics Data System (ADS)

    Chan, Kwok Hung; Hattori, Junya; Laakso, Ilkka; Hirata, Akimasa; Taki, Masao

    2013-08-01

    This study presents the computational dosimetry of contact currents for grounded and ungrounded human models. The uncertainty of the quasi-static (QS) approximation of the in situ electric field induced in a grounded/ungrounded human body due to the contact current is first estimated. Different scenarios of cylindrical and anatomical human body models are considered, and the results are compared with the full-wave analysis. In the QS analysis, the induced field in the grounded cylindrical model is calculated by the QS finite-difference time-domain (QS-FDTD) method, and compared with the analytical solution. Because no analytical solution is available for the grounded/ungrounded anatomical human body model, the results of the QS-FDTD method are then compared with those of the conventional FDTD method. The upper frequency limit for the QS approximation in the contact current dosimetry is found to be 3 MHz, with a relative local error of less than 10%. The error increases above this frequency, which can be attributed to the neglect of the displacement current. The QS or conventional FDTD method is used for the dosimetry of induced electric field and/or specific absorption rate (SAR) for a contact current injected into the index finger of a human body model in the frequency range from 10 Hz to 100 MHz. The in situ electric fields or SAR are compared with the basic restrictions in the international guidelines/standards. The maximum electric field or the 99th percentile value of the electric fields appear not only in the fat and muscle tissues of the finger, but also around the wrist, forearm, and the upper arm. Some discrepancies are observed between the basic restrictions for the electric field and SAR and the reference levels for the contact current, especially in the extremities. These discrepancies are shown by an equation that relates the current density, tissue conductivity, and induced electric field in the finger with a cross-sectional area of 1 cm2.
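
    The closing relation the authors mention ties the injected current, tissue conductivity, and induced field together as J = I/A and E = J/sigma. A minimal sketch with illustrative values (the 0.5 mA current and 0.2 S/m conductivity are assumptions, not values from the paper):

```python
# Induced electric field from a contact current crossing a tissue
# cross-section: J = I/A (A/m^2), E = J/sigma (V/m).
def induced_e_field(current_a, area_m2, sigma_s_per_m):
    current_density = current_a / area_m2       # J (A/m^2)
    return current_density / sigma_s_per_m      # E (V/m)

# 0.5 mA through a 1 cm^2 finger cross-section, sigma ~ 0.2 S/m (assumed)
print(round(induced_e_field(0.5e-3, 1e-4, 0.2), 3))  # -> 25.0 (V/m)
```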

  6. Boundary Electron and Beta Dosimetry-Quantification of the Effects of Dissimilar Media on Absorbed Dose

    NASA Astrophysics Data System (ADS)

    Nunes, Josane C.

    1991-02-01

    This work quantifies the changes effected in electron absorbed dose to a soft-tissue equivalent medium when part of this medium is replaced by a material that is not soft -tissue equivalent. That is, heterogeneous dosimetry is addressed. Radionuclides which emit beta particles are the electron sources of primary interest. They are used in brachytherapy and in nuclear medicine: for example, beta -ray applicators made with strontium-90 are employed in certain ophthalmic treatments and iodine-131 is used to test thyroid function. More recent medical procedures under development and which involve beta radionuclides include radioimmunotherapy and radiation synovectomy; the first is a cancer modality and the second deals with the treatment of rheumatoid arthritis. In addition, the possibility of skin surface contamination exists whenever there is handling of radioactive material. Determination of absorbed doses in the examples of the preceding paragraph requires considering boundaries of interfaces. Whilst the Monte Carlo method can be applied to boundary calculations, for routine work such as in clinical situations, or in other circumstances where doses need to be determined quickly, analytical dosimetry would be invaluable. Unfortunately, few analytical methods for boundary beta dosimetry exist. Furthermore, the accuracy of results from both Monte Carlo and analytical methods has to be assessed. Although restricted to one radionuclide, phosphorus -32, the experimental data obtained in this work serve several purposes, one of which is to provide standards against which calculated results can be tested. The experimental data also contribute to the relatively sparse set of published boundary dosimetry data. At the same time, they may be useful in developing analytical boundary dosimetry methodology. The first application of the experimental data is demonstrated. 
Results from two Monte Carlo codes and two analytical methods, which were developed elsewhere, are compared with the experimental data. Monte Carlo results compare satisfactorily with experimental results for the boundaries considered. The agreement with experimental results for air interfaces is of particular interest because of discrepancies reported previously by another investigator who used data obtained from a different experimental technique. Results from one of the analytical methods differ significantly from the experimental data obtained here. The second analytical method provided data which approximate experimental results to within 30%. This is encouraging, but it remains to be determined whether this method performs equally well for other source energies.

  7. Verification of eye lens dose in IMRT by MOSFET measurement.

    PubMed

    Wang, Xuetao; Li, Guangjun; Zhao, Jianling; Song, Ying; Xiao, Jianghong; Bai, Sen

    2018-04-17

The eye lens is recognized as one of the most radiosensitive structures in the human body. The widespread use of intensity-modulated radiotherapy (IMRT) complicates dose verification and necessitates high standards of dose computation. The purpose of this work was to assess the accuracy of the computed dose to the eye lens through measurements using a metal-oxide-semiconductor field-effect transistor (MOSFET) dosimetry system. Sixteen clinical IMRT plans of head and neck patients were copied to an anthropomorphic head phantom. Measurements were performed using the MOSFET dosimetry system based on the head phantom. Two MOSFET detectors were embedded in the eyes of the head phantom as the left and right lenses, covered by approximately 5-mm-thick paraffin wax. The measurement results were compared with the calculated values with a dose grid size of 1 mm. Sixteen IMRT plans were delivered, and 32 measured lens doses were obtained for analysis. The MOSFET dosimetry system can be used to verify the lens dose, and our measurements showed that the treatment planning system used in our clinic can provide adequate dose assessment in eye lenses. The average discrepancy between measurement and calculation was 6.7 ± 3.4%, and the largest discrepancy was 14.3%, which met the acceptability criterion set by the American Association of Physicists in Medicine Task Group 53 for external beam calculation for multileaf collimator-shaped fields in buildup regions. Copyright © 2018 American Association of Medical Dosimetrists. Published by Elsevier Inc. All rights reserved.

  8. An image-based skeletal dosimetry model for the ICRP reference adult male—internal electron sources

    NASA Astrophysics Data System (ADS)

    Hough, Matthew; Johnson, Perry; Rajon, Didier; Jokisch, Derek; Lee, Choonsik; Bolch, Wesley

    2011-04-01

    In this study, a comprehensive electron dosimetry model of the adult male skeletal tissues is presented. The model is constructed using the University of Florida adult male hybrid phantom of Lee et al (2010 Phys. Med. Biol. 55 339-63) and the EGSnrc-based Paired Image Radiation Transport code of Shah et al (2005 J. Nucl. Med. 46 344-53). Target tissues include the active bone marrow, associated with radiogenic leukemia, and total shallow marrow, associated with radiogenic bone cancer. Monoenergetic electron emissions are considered over the energy range 1 keV to 10 MeV for the following sources: bone marrow (active and inactive), trabecular bone (surfaces and volumes), and cortical bone (surfaces and volumes). Specific absorbed fractions are computed according to the MIRD schema, and are given as skeletal-averaged values in the paper with site-specific values reported in both tabular and graphical format in an electronic annex available from http://stacks.iop.org/0031-9155/56/2309/mmedia. The distribution of cortical bone and spongiosa at the macroscopic dimensions of the phantom, as well as the distribution of trabecular bone and marrow tissues at the microscopic dimensions of the phantom, is imposed through detailed analyses of whole-body ex vivo CT images (1 mm resolution) and spongiosa-specific ex vivo microCT images (30 µm resolution), respectively, taken from a 40 year male cadaver. The method utilized in this work includes: (1) explicit accounting for changes in marrow self-dose with variations in marrow cellularity, (2) explicit accounting for electron escape from spongiosa, (3) explicit consideration of spongiosa cross-fire from cortical bone, and (4) explicit consideration of the ICRP's change in the surrogate tissue region defining the location of the osteoprogenitor cells (from a 10 µm endosteal layer covering the trabecular and cortical surfaces to a 50 µm shallow marrow layer covering trabecular and medullary cavity surfaces). 
Skeletal-averaged values of absorbed fraction in the present model are noted to be very compatible with those weighted by the skeletal tissue distributions found in the ICRP Publication 110 adult male and female voxel phantoms, but are in many cases incompatible with values used in current and widely implemented internal dosimetry software.
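
    The specific absorbed fractions (SAFs) tabulated by this model feed directly into the MIRD schema the abstract names: the dose to a target is a sum over source regions of time-integrated activity times mean energy per decay times SAF. A sketch of that bookkeeping with hypothetical numbers (not values from the paper):

```python
# MIRD-schema dose sketch: D(target) = sum over sources of
# A_tilde (Bq*s) * Delta (J per decay) * SAF (1/kg), giving Gy.
def mird_dose(contributions):
    """contributions: list of (a_tilde_bq_s, delta_j, saf_per_kg)."""
    return sum(a * d * s for a, d, s in contributions)

# Hypothetical terms for an active-marrow target:
dose_gy = mird_dose([
    (1.0e9, 3.0e-14, 0.5),   # marrow self-dose term
    (4.0e8, 3.0e-14, 0.05),  # trabecular-bone cross-fire term
])
print(dose_gy)
```

    Changes in marrow cellularity or in the shallow-marrow target definition enter this sum only through the SAF values, which is why retabulating them changes internal dosimetry software results.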

  9. Results of comparative RBMK neutron computation using VNIIEF codes (cell computation, 3D statics, 3D kinetics). Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grebennikov, A.N.; Zhitnik, A.K.; Zvenigorodskaya, O.A.

    1995-12-31

In conformity with the protocol of the Workshop under Contract "Assessment of RBMK reactor safety using modern Western codes", VNIIEF performed a series of neutronics computations to compare Western and VNIIEF codes and to assess whether VNIIEF codes are suitable for RBMK-type reactor safety assessment computations. The work was carried out in close collaboration with M.I. Rozhdestvensky and L.M. Podlazov, NIKIET employees. The effort involved: (1) cell computations with the WIMS and EKRAN codes (an improved modification of the LOMA code) and the S-90 code (VNIIEF Monte Carlo), covering cell, polycell and burnup computations; (2) 3D computation of static states with the KORAT-3D and NEU codes and comparison with results of computation with the NESTLE code (USA), performed in the geometry and with the neutron constants provided by the American party; (3) 3D computation of neutron kinetics with the KORAT-3D and NEU codes. The kinetics computations were performed in two formulations, both developed in collaboration with NIKIET. The first problem agrees as closely as possible with one of the NESTLE problems and simulates gas bubble travel through a core. The second problem is a model of the RBMK as a whole with simulation of control and protection system (CPS) rod movement in the core.

  10. MODULAR APPLICATION OF COMPUTATIONAL MODELS OF INHALED REACTIVE GAS DOSIMETRY FOR RISK ASSESSMENT OF RESPIRATORY TRACT TOXICITY: CHLORINE

    EPA Science Inventory

    Inhaled reactive gases typically cause respiratory tract toxicity with a prominent proximal to distal lesion pattern. This pattern is largely driven by airflow and interspecies differences between rodents and humans result from factors such as airway architecture, ventilation ra...

  11. Updates on EPA’s High-Throughput Exposure Forecast (ExpoCast) Research Project (CPCP)

    EPA Science Inventory

    Recent research advances by the ORD ExpoCast project (CSS Rapid Exposure and Dosimetry) are presented to the computational toxicology community in the context of prioritizing chemicals on a risk-basis using joint ExpoCast and ToxCast predictions. Recent publications by Wambaugh e...

  12. MR and CT image fusion for postimplant analysis in permanent prostate seed implants.

    PubMed

    Polo, Alfredo; Cattani, Federica; Vavassori, Andrea; Origgi, Daniela; Villa, Gaetano; Marsiglia, Hugo; Bellomi, Massimo; Tosi, Giampiero; De Cobelli, Ottavio; Orecchia, Roberto

    2004-12-01

To compare the outcome of two different image-based postimplant dosimetry methods in permanent seed implantation. Between October 1999 and October 2002, 150 patients with low-risk prostate carcinoma were treated with (125)I and (103)Pd in our institution. A CT-MRI image fusion protocol was used in 21 consecutive patients treated with exclusive brachytherapy. The accuracy and reproducibility of the method were assessed, and then the CT-based dosimetry was compared with the CT-MRI-based dosimetry using the dose-volume histogram (DVH)-related parameters recommended by the American Brachytherapy Society and the American Association of Physicists in Medicine. Our method for CT-MRI image fusion was accurate and reproducible (median shift <1 mm). Differences in prostate volume were found, depending on the image modality used. Quality assurance DVH-related parameters strongly depended on the image modality (CT vs. CT-MRI): V(100) = 82% vs. 88%, p < 0.05; D(90) = 96% vs. 115%, p < 0.05. These results depend on the institutional implant technique and reflect the importance of lowering inter- and intraobserver discrepancies when outlining the prostate and organs at risk for postimplant dosimetry. CT-MRI fused images allow accurate determination of prostate size, significantly improving the dosimetric evaluation based on DVH analysis. This provides a consistent method to judge the quality of a prostate seed implant.
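
    The V(100) and D(90) figures compared above are read off the cumulative DVH. A minimal per-voxel sketch, assuming equal-volume voxels and illustrative doses:

```python
# DVH quality metrics on a flat list of per-voxel doses (equal voxel volumes).
def v100(doses, prescription):
    """Percent of the volume receiving at least the prescription dose."""
    return 100.0 * sum(d >= prescription for d in doses) / len(doses)

def d90(doses):
    """Minimum dose covering the hottest 90% of the volume."""
    s = sorted(doses, reverse=True)
    return s[int(0.9 * len(s)) - 1]

doses = [80, 95, 100, 110, 120, 130, 140, 150, 160, 170]  # Gy, illustrative
print(v100(doses, 100), d90(doses))  # -> 80.0 95
```

    A larger contoured prostate (as on CT vs. CT-MRI) pulls low-dose voxels into the list, which is exactly how the modality choice moves both metrics.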

  13. Development and application of the GIM code for the Cyber 203 computer

    NASA Technical Reports Server (NTRS)

    Stainaker, J. F.; Robinson, M. A.; Rawlinson, E. G.; Anderson, P. G.; Mayne, A. W.; Spradley, L. W.

    1982-01-01

    The GIM computer code for fluid dynamics research was developed. Enhancement of the computer code, implicit algorithm development, turbulence model implementation, chemistry model development, interactive input module coding and wing/body flowfield computation are described. The GIM quasi-parabolic code development was completed, and the code used to compute a number of example cases. Turbulence models, algebraic and differential equations, were added to the basic viscous code. An equilibrium reacting chemistry model and implicit finite difference scheme were also added. Development was completed on the interactive module for generating the input data for GIM. Solutions for inviscid hypersonic flow over a wing/body configuration are also presented.

  14. Comparison between the TRS-398 code of practice and the TG-51 dosimetry protocol for flattening filter free beams

    NASA Astrophysics Data System (ADS)

    Lye, J. E.; Butler, D. J.; Oliver, C. P.; Alves, A.; Lehmann, J.; Gibbons, F. P.; Williams, I. M.

    2016-07-01

Dosimetry protocols for external beam radiotherapy currently in use, such as the IAEA TRS-398 and AAPM TG-51, were written for conventional linear accelerators. In these accelerators, a flattening filter is used to produce a beam which is uniform at water depths where the ionization chamber is used to measure the absorbed dose. Recently, clinical linacs have been implemented without the flattening filter, and published theoretical analysis suggested that with these beams a dosimetric error of order 0.6% could be expected for IAEA TRS-398, because the TPR20,10 beam quality index does not accurately predict the stopping power ratio (water to air) for the softer flattening-filter-free (FFF) beam spectra. We measured doses on eleven FFF linacs at 6 MV and 10 MV using both dosimetry protocols and found average differences of 0.2% or less. The expected shift due to stopping powers was not observed. We present Monte Carlo kQ calculations which show a much smaller difference between FFF and flattened beams than originally predicted. These results are explained by the inclusion of the added backscatter plates and build-up filters used in modern clinical FFF linacs, compared to a Monte Carlo model of an FFF linac in which the flattening filter is removed and no additional build-up or backscatter plate is added.
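
    Both protocols reduce to the same working equation, D_w = M · N_D,w · kQ: the corrected chamber reading times a calibration coefficient times a beam-quality conversion factor. The protocol-to-protocol difference reported above therefore enters entirely through kQ. A sketch with illustrative chamber numbers (not measured values):

```python
# Reference dosimetry working equation common to TRS-398 and TG-51:
# dose to water = corrected reading (C) * calibration coefficient (Gy/C)
#                 * beam-quality factor kQ (dimensionless).
def dose_to_water(m_reading_c, n_dw_gy_per_c, k_q):
    return m_reading_c * n_dw_gy_per_c * k_q

# Same reading and coefficient, two hypothetical kQ assignments:
d_tg51 = dose_to_water(20.0e-9, 5.0e7, 0.992)
d_trs398 = dose_to_water(20.0e-9, 5.0e7, 0.990)
print(round(100 * (d_tg51 - d_trs398) / d_trs398, 2))  # -> 0.2 (percent)
```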

  15. Three-dimensional radiation dosimetry based on optically-stimulated luminescence

    NASA Astrophysics Data System (ADS)

    Sadel, M.; Høye, E. M.; Skyt, P. S.; Muren, L. P.; Petersen, J. B. B.; Balling, P.

    2017-05-01

A new approach to three-dimensional (3D) dosimetry based on optically-stimulated luminescence (OSL) is presented. By embedding OSL-active particles into a transparent silicone matrix (PDMS), the well-established dosimetric properties of an OSL material are exploited in a 3D-OSL dosimeter. By investigating prototype dosimeters in standard cuvettes in combination with small test samples for OSL readers, it is shown that sufficient transparency of the 3D-OSL material can be combined with an OSL response giving an estimated >10 000 detected photons per second per 1 mm3 voxel of the dosimeter at a dose of 1 Gy. The dose distribution in the 3D-OSL dosimeters can be read out optically without the need for subsequent reconstruction by computational inversion algorithms. The dosimeters carry the advantages known from personal-dosimetry use of OSL: the dose distribution following irradiation can be stored with minimal fading for extended periods of time, and the dosimeters are reusable as they can be reset, e.g. by an intense (bleaching) light field.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parra, Pamela Ochoa, E-mail: lapochoap@unal.edu.co; Veloza, Stella

The radiotracer {sup 68}Ga-labelled Glu-urea-Lys(Ahx)-HBED-CC ([68Ga]Ga-PSMA-HBED-CC) is a novel radiopharmaceutical for the detection of prostate cancer lesions by positron emission tomography (PET) imaging. Setting up a cost-effective manual synthesis of this radiotracer and making its clinical translation in Colombia will require two important elements: the evaluation of the procedure to yield a consistent product, meeting standards of radiochemical purity and low toxicity, and then the evaluation of the radiation dosimetry. In this paper a protocol to extrapolate the biokinetic model obtained in normal mice to humans by using the computer software for internal dose assessment OLINDA/EXM® is presented as an accurate and standardized method for the calculation of radiation dosimetry estimates.

  17. Dosimetry and prescription in liver radioembolization with 90Y microspheres: 3D calculation of tumor-to-liver ratio from global 99mTc-MAA SPECT information

    NASA Astrophysics Data System (ADS)

    Mañeru, Fernando; Abós, Dolores; Bragado, Laura; Fuentemilla, Naiara; Caudepón, Fernando; Pellejero, Santiago; Miquelez, Santiago; Rubio, Anastasio; Goñi, Elena; Hernández-Vitoria, Araceli

    2017-12-01

    Dosimetry in liver radioembolization with 90Y microspheres is a fundamental tool, both for the optimization of each treatment and for improving knowledge of the treatment effects in the tissues. Different options are available for estimating the administered activity and the tumor/organ dose, among them the so-called partition method. The key factor in the partition method is the tumor/normal tissue activity uptake ratio (T/N), which is obtained by a single-photon emission computed tomography (SPECT) scan during a pre-treatment simulation. The less clear the distinction between healthy and tumor parenchyma within the liver, the more difficult it becomes to estimate the T/N ratio; therefore the use of the method is limited. This study presents a methodology to calculate the T/N ratio using global information from the SPECT. The T/N ratio is estimated by establishing uptake thresholds consistent with previously performed volumetry. This dose calculation method was validated against 3D voxel dosimetry, and was also compared with the standard partition method based on freehand regions of interest (ROI) outlining on SPECT slices. Both comparisons were done on a sample of 20 actual cases of hepatocellular carcinoma treated with resin microspheres. The proposed method and the voxel dosimetry method yield similar results, while the ROI-based method tends to over-estimate the dose to normal tissues. In addition, the variability associated with the ROI-based method is more extreme than the other methods. The proposed method is simpler than either the ROI or voxel dosimetry approaches and avoids the subjectivity associated with the manual selection of regions.
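
    One way to read the threshold idea described above is that SPECT voxels are ranked by counts and the hottest voxels, up to the tumor volume known from volumetry, are labeled tumor. A toy sketch of a T/N uptake ratio computed that way (data, voxel size, and the ranking shortcut are all illustrative assumptions, not the paper's exact procedure):

```python
# Toy threshold-based tumor-to-normal (T/N) uptake ratio: rank voxels by
# counts, take the hottest n_tumor voxels (matched to the known tumor
# volume) as tumor, the rest as normal parenchyma.
def tn_ratio(voxel_counts, voxel_vol_ml, tumor_vol_ml):
    s = sorted(voxel_counts, reverse=True)
    n_tumor = max(1, round(tumor_vol_ml / voxel_vol_ml))  # voxels in tumor
    tumor, normal = s[:n_tumor], s[n_tumor:]
    if not normal:                       # degenerate case: all tumor
        return 1.0
    # mean uptake per voxel in tumor vs. normal tissue
    return (sum(tumor) / len(tumor)) / (sum(normal) / len(normal))

counts = [100, 90, 80, 10, 8, 6, 5, 4, 3, 2]   # hypothetical SPECT counts
print(round(tn_ratio(counts, voxel_vol_ml=1.0, tumor_vol_ml=3.0), 2))  # -> 16.58
```

    Because the threshold is set by the volumetry rather than by hand-drawn ROIs, the subjectivity the authors criticize in the ROI-based method is removed.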

  18. Shared Dosimetry Error in Epidemiological Dose-Response Analyses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stram, Daniel O.; Preston, Dale L.; Sokolnikov, Mikhail

    2015-03-23

Radiation dose reconstruction systems for large-scale epidemiological studies are sophisticated both in providing estimates of dose and in representing dosimetry uncertainty. For example, a computer program was used by the Hanford Thyroid Disease Study to provide 100 realizations of possible dose to study participants. The variation in realizations reflected the range of possible dose for each cohort member consistent with the data on dose determinants in the cohort. Another example is the Mayak Worker Dosimetry System 2013, which estimates both external and internal exposures and provides multiple realizations of "possible" dose history to workers given dose determinants. This paper takes up the problem of dealing with complex dosimetry systems that provide multiple realizations of dose in an epidemiologic analysis. In this paper we derive expected scores and the information matrix for a model used widely in radiation epidemiology, namely the linear excess relative risk (ERR) model that allows for a linear dose response (risk in relation to radiation) and distinguishes between modifiers of background rates and of the excess risk due to exposure. We show that treating the mean dose for each individual (calculated by averaging over the realizations) as if it were true dose (ignoring both shared and unshared dosimetry errors) gives asymptotically unbiased estimates (i.e. the score has expectation zero) and valid tests of the null hypothesis that the ERR slope β is zero. Although the score is unbiased, the information matrix (and hence the standard errors of the estimate of β) is biased for β≠0 when errors in dose estimates are ignored, and we show how to adjust the information matrix to remove this bias, using the multiple realizations of dose. Use of these methods for several studies, including the Mayak Worker Cohort and the U.S. Atomic Veterans Study, is discussed.

  19. Patient‐specific CT dosimetry calculation: a feasibility study

    PubMed Central

    Xie, Huchen; Cheng, Jason Y.; Ning, Holly; Zhuge, Ying; Miller, Robert W.

    2011-01-01

    Current estimation of radiation dose from computed tomography (CT) scans on patients has relied on the measurement of Computed Tomography Dose Index (CTDI) in standard cylindrical phantoms, and calculations based on mathematical representations of “standard man”. Radiation dose to both adult and pediatric patients from a CT scan has been a concern, as noted in recent reports. The purpose of this study was to investigate the feasibility of adapting a radiation treatment planning system (RTPS) to provide patient‐specific CT dosimetry. A radiation treatment planning system was modified to calculate patient‐specific CT dose distributions, which can be represented by dose at specific points within an organ of interest, as well as organ dose‐volumes (after image segmentation) for a GE Light Speed Ultra Plus CT scanner. The RTPS calculation algorithm is based on a semi‐empirical, measured correction‐based algorithm, which has been well established in the radiotherapy community. Digital representations of the physical phantoms (virtual phantom) were acquired with the GE CT scanner in axial mode. Thermoluminescent dosimeter (TLDs) measurements in pediatric anthropomorphic phantoms were utilized to validate the dose at specific points within organs of interest relative to RTPS calculations and Monte Carlo simulations of the same virtual phantoms (digital representation). Congruence of the calculated and measured point doses for the same physical anthropomorphic phantom geometry was used to verify the feasibility of the method. The RTPS algorithm can be extended to calculate the organ dose by calculating a dose distribution point‐by‐point for a designated volume. Electron Gamma Shower (EGSnrc) codes for radiation transport calculations developed by National Research Council of Canada (NRCC) were utilized to perform the Monte Carlo (MC) simulation. In general, the RTPS and MC dose calculations are within 10% of the TLD measurements for the infant and child chest scans. 
With respect to the dose comparisons for the head, the RTPS dose calculations are slightly higher (10%–20%) than the TLD measurements, while the MC results were within 10% of the TLD measurements. The advantage of the algebraic dose calculation engine of the RTPS is a substantially reduced computation time (minutes vs. days) relative to Monte Carlo calculations, as well as providing patient‐specific dose estimation. It also provides the basis for a more elaborate reporting of dosimetric results, such as patient specific organ dose volumes after image segmentation. PACS numbers: 87.55.D‐, 87.57.Q‐, 87.53.Bn, 87.55.K‐ PMID:22089016

  20. Dosimetry of 64Cu-DOTA-AE105, a PET tracer for uPAR imaging.

    PubMed

    Persson, Morten; El Ali, Henrik H; Binderup, Tina; Pfeifer, Andreas; Madsen, Jacob; Rasmussen, Palle; Kjaer, Andreas

    2014-03-01

(64)Cu-DOTA-AE105 is a novel positron emission tomography (PET) tracer specific to the human urokinase-type plasminogen activator receptor (uPAR). In preparation for using this tracer in humans, as a promising new method to distinguish between indolent and aggressive cancers, we performed PET studies in mice to evaluate the in vivo biodistribution and estimate the human dosimetry of (64)Cu-DOTA-AE105. Five mice received an iv tail injection of (64)Cu-DOTA-AE105 and were PET/CT scanned 1, 4.5 and 22 h post injection. Volumes of interest (VOIs) were manually drawn on the following organs: heart, lung, liver, kidney, spleen, intestine, muscle, bone and bladder. The activity concentrations in these organs [%ID/g] were used for the dosimetry calculation. The %ID/g of each organ at 1, 4.5 and 22 h was scaled to human values based on the difference between organ and body weights. The scaled values were then exported to the OLINDA software for computation of the human absorbed doses. The residence times as well as the effective dose equivalents for males and females could be obtained for each organ. To validate this approach of projecting human dosimetry from mouse data, five mice received an iv tail injection of another (64)Cu-DOTA peptide-based tracer, (64)Cu-DOTA-TATE, and underwent the same procedure as just described. The human dosimetry estimates were then compared with the observed human dosimetry estimates recently found in a first-in-man study using (64)Cu-DOTA-TATE. Human estimates for (64)Cu-DOTA-AE105 revealed the heart wall to receive the highest dose (0.0918 mSv/MBq) followed by the liver (0.0815 mSv/MBq). All other organs/tissues were estimated to receive doses in the range of 0.02-0.04 mSv/MBq. The mean effective whole-body dose of (64)Cu-DOTA-AE105 was estimated to be 0.0317 mSv/MBq. Relatively good correlation between predicted and observed human dosimetry estimates for (64)Cu-DOTA-TATE was found.
Importantly, the effective whole-body dose was predicted with very high precision (predicted value: 0.0252 mSv/MBq, observed value: 0.0315 mSv/MBq), thus validating our approach for human dosimetry estimation. Favorable dosimetry estimates together with previously reported uPAR PET data fully support human testing of (64)Cu-DOTA-AE105. Copyright © 2014 Elsevier Inc. All rights reserved.
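    The organ-mass scaling step described above (mouse %ID/g to an estimated human %ID per organ) can be sketched as follows. The formula is one commonly used relative-organ-mass scaling approach, and every numerical value in the example is a hypothetical placeholder, not data from the study.

```python
# Hedged sketch of interspecies activity scaling (illustrative only):
#   %ID/organ(human) = %ID/g(mouse) * mouse body mass
#                      * human organ mass / human body mass
# All masses must use consistent units (grams here).

def scale_pid_to_human(pid_per_g_mouse, mouse_body_g, human_organ_g, human_body_g):
    """Scale a measured mouse %ID/g to an estimated human %ID per organ."""
    return pid_per_g_mouse * mouse_body_g * human_organ_g / human_body_g

# Hypothetical example: 5 %ID/g measured in a 25 g mouse, scaled to a
# 1800 g liver in a 73000 g adult human.
pid_human_liver = scale_pid_to_human(5.0, 25.0, 1800.0, 73000.0)
```

    Values scaled this way per organ and time point are what would then feed a residence-time fit and an absorbed-dose engine such as OLINDA.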

  1. Reconstruction of organ dose for external radiotherapy patients in retrospective epidemiologic studies

    NASA Astrophysics Data System (ADS)

    Lee, Choonik; Jung, Jae Won; Pelletier, Christopher; Pyakuryal, Anil; Lamart, Stephanie; Kim, Jong Oh; Lee, Choonsik

    2015-03-01

    Organ dose estimation for retrospective epidemiological studies of late effects in radiotherapy patients involves two challenges: radiological images representing patient anatomy are usually unavailable for patient cohorts treated years ago, and efficient dose reconstruction methods for large-scale patient cohorts are not well established. In the current study, we developed methods to reconstruct organ doses for radiotherapy patients by using a series of computational human phantoms coupled with a commercial treatment planning system (TPS) and a radiotherapy-dedicated Monte Carlo transport code, and performed illustrative dose calculations. First, we developed methods to convert the anatomy and organ contours of the pediatric and adult hybrid computational phantom series to Digital Imaging and Communications in Medicine (DICOM)-image and DICOM-structure files, respectively. The resulting DICOM files were imported into a commercial TPS for simulating radiotherapy and calculating doses to in-field organs. The conversion process was validated by comparing electron densities relative to water and organ volumes between the hybrid phantoms and the DICOM files imported into the TPS, which agreed within 0.1% and 2%, respectively. Second, we developed a procedure to transfer DICOM-RT files generated from the TPS directly to a Monte Carlo transport code, x-ray Voxel Monte Carlo (XVMC), for more accurate dose calculations. Third, to illustrate the performance of the established methods, we simulated a whole-brain treatment for the 10-year-old male phantom and a prostate treatment for the adult male phantom. Radiation doses to selected organs were calculated using the TPS and XVMC and compared to each other. Organ average doses from the two methods matched within 7%, whereas maximum and minimum point doses differed by up to 45%. 
The dosimetry methods and procedures established in this study will be useful for the reconstruction of organ dose to support retrospective epidemiological studies of late effects in radiotherapy patients.

  2. A New Dual-purpose Quality Control Dosimetry Protocol for Diagnostic Reference-level Determination in Computed Tomography.

    PubMed

    Sohrabi, Mehdi; Parsi, Masoumeh; Sina, Sedigheh

    2018-05-17

    A diagnostic reference level is an advisory dose level set by a regulatory authority in a country as an efficient criterion for protection of patients from unwanted medical exposure. In computed tomography, the direct dose measurement and data collection methods are commonly applied for determination of diagnostic reference levels. Recently, a new quality-control-based dose survey method was proposed by the authors to simplify the diagnostic reference-level determination using a retrospective quality control database usually available at a regulatory authority in a country. In line with such a development, a prospective dual-purpose quality control dosimetry protocol is proposed for determination of diagnostic reference levels in a country, which can be simply applied by quality control service providers. This new proposed method was applied to five computed tomography scanners in Shiraz, Iran, and diagnostic reference levels for head, abdomen/pelvis, sinus, chest, and lumbar spine examinations were determined. The results were compared to those obtained by the data collection and quality-control-based dose survey methods, carried out in parallel in this study, and were found to agree well within approximately 6%. This is highly acceptable for quality-control-based methods according to International Atomic Energy Agency tolerance levels (±20%).

  3. Implementation of a 3D mixing layer code on parallel computers

    NASA Technical Reports Server (NTRS)

    Roe, K.; Thakur, R.; Dang, T.; Bogucz, E.

    1995-01-01

    This paper summarizes our progress and experience in developing a computational fluid dynamics code on parallel computers to simulate three-dimensional spatially developing mixing layers. In this initial study, the three-dimensional time-dependent Euler equations are solved using a finite-volume explicit time-marching algorithm. The code was first programmed in Fortran 77 for sequential computers and then converted for use on parallel computers using the conventional message-passing technique, although we have not yet been able to compile the code with the present version of HPF compilers.
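    The explicit finite-volume time marching with message passing described above can be illustrated with a minimal serial stand-in: a 1-D linear advection equation with a first-order upwind scheme takes the place of the 3-D Euler solver, and a manual ghost-cell ("halo") exchange between two sub-domains mimics what the message-passing version would do between ranks. Everything below is an illustrative sketch, not the paper's code.

```python
import math

# Periodic 1-D domain split into two "ranks" (left, right).
nx, c = 64, 1.0
dx = 1.0 / nx
dt = 0.5 * dx          # CFL number 0.5: stable for explicit upwind
u = [math.sin(2 * math.pi * i / nx) for i in range(nx)]
left, right = u[: nx // 2], u[nx // 2:]
total0 = sum(u)        # conserved quantity for a conservative scheme

def upwind_step(block, ghost_left):
    # First-order upwind update for c > 0: u_i -= c*dt/dx * (u_i - u_{i-1});
    # the ghost cell supplies the neighbour's boundary value.
    prev = [ghost_left] + block[:-1]
    return [b - c * dt / dx * (b - p) for b, p in zip(block, prev)]

for _ in range(32):
    # "Halo exchange": each block receives its upwind neighbour's last cell
    # (periodic coupling stands in for the MPI-style send/receive).
    gl_left, gl_right = right[-1], left[-1]
    left, right = upwind_step(left, gl_left), upwind_step(right, gl_right)

u_final = left + right
```

    The conservative flux form means the total over both sub-domains is preserved to round-off, which is a convenient correctness check when converting a sequential solver to a decomposed one.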

  4. a Dosimetry Assessment for the Core Restraint of AN Advanced Gas Cooled Reactor

    NASA Astrophysics Data System (ADS)

    Thornton, D. A.; Allen, D. A.; Tyrrell, R. J.; Meese, T. C.; Huggon, A. P.; Whiley, G. S.; Mossop, J. R.

    2009-08-01

    This paper describes calculations of neutron damage rates within the core restraint structures of Advanced Gas Cooled Reactors (AGRs). Using advanced features of the Monte Carlo radiation transport code MCBEND, and neutron source data from core follow calculations performed with the reactor physics code PANTHER, a detailed model of the reactor cores of two of British Energy's AGR power plants has been developed for this purpose. Because there are no relevant neutron fluence measurements directly supporting this assessment, results of benchmark comparisons and successful validation of MCBEND for Magnox reactors have been used to estimate systematic and random uncertainties on the predictions. In particular, it has been necessary to address the known under-prediction of lower energy fast neutron responses associated with the penetration of large thicknesses of graphite.

  5. Measurements and simulations of the radiation exposure to aircraft crew workplaces due to cosmic radiation in the atmosphere.

    PubMed

    Beck, P; Latocha, M; Dorman, L; Pelliccioni, M; Rollet, S

    2007-01-01

    As required by the European Directive 96/29/Euratom, radiation exposure due to natural ionizing radiation has to be taken into account at workplaces if the effective dose could exceed 1 mSv per year. One example of workers concerned by this directive is aircraft crew, owing to cosmic radiation exposure in the atmosphere. Extensive measurement campaigns on board aircraft have been carried out to assess the ambient dose equivalent. A consortium of European dosimetry institutes within EURADOS WG5 summarized experimental data and results of calculations, together with detailed descriptions of the methods for measurements and calculations. The radiation protection quantity of interest is the effective dose, E (ISO). The comparison of measured and calculated results is done in terms of the operational quantity ambient dose equivalent, H*(10). This paper gives an overview of the EURADOS Aircraft Crew In-Flight Database and presents a new empirical model with fitting functions for these data. Furthermore, it describes numerical simulations performed with the Monte Carlo code FLUKA-2005 using an updated version of the cosmic radiation primary spectra. The ratio between ambient dose equivalent and effective dose at commercial flight altitudes, calculated with FLUKA-2005, is discussed. Finally, it presents the aviation dosimetry model AVIDOS, based on FLUKA-2005 simulations, for routine dose assessment. The code has been developed by Austrian Research Centers (ARC) for public use (http://avidos.healthphysics.at).

  6. Tumor and red bone marrow dosimetry: comparison of methods for prospective treatment planning in pretargeted radioimmunotherapy.

    PubMed

    Woliner-van der Weg, Wietske; Schoffelen, Rafke; Hobbs, Robert F; Gotthardt, Martin; Goldenberg, David M; Sharkey, Robert M; Slump, Cornelis H; van der Graaf, Winette Ta; Oyen, Wim Jg; Boerman, Otto C; Sgouros, George; Visser, Eric P

    2015-12-01

    Red bone marrow (RBM) toxicity is dose-limiting in (pretargeted) radioimmunotherapy (RIT). Previous blood-based and two-dimensional (2D) image-based methods have failed to show a clear dose-response relationship. We developed a three-dimensional (3D) image-based RBM dosimetry approach using the Monte Carlo-based 3D radiobiological dosimetry (3D-RD) software and determined its additional value for predicting RBM toxicity. RBM doses were calculated for 13 colorectal cancer patients after pretargeted RIT with the two-step administration of an anti-CEA × anti-HSG bispecific monoclonal antibody and a (177)Lu-labeled di-HSG-peptide. 3D-RD RBM dosimetry was based on the lumbar vertebrae, delineated on single photon emission computed tomography (SPECT) scans acquired immediately and at 3, 24, and 72 h after (177)Lu administration. RBM doses were correlated to hematologic effects, according to NCI-CTC v3, and compared with conventional 2D cranium-based and blood-based dosimetry results. Tumor doses were calculated with 3D-RD, which had not been possible with 2D dosimetry. Tumor-to-RBM dose ratios were calculated and compared for (177)Lu-based pretargeted RIT and simulated pretargeted RIT with (90)Y. 3D-RD RBM doses of all seven patients who developed thrombocytopenia were higher (range 0.43 to 0.97 Gy) than those of the six patients without thrombocytopenia (range 0.12 to 0.39 Gy), except in one patient (0.47 Gy) without thrombocytopenia but with grade 2 leucopenia. Blood- and 2D image-based RBM doses for patients with grade 1 to 2 thrombocytopenia were in the same range as in patients without thrombocytopenia (0.14 to 0.29 and 0.11 to 0.26 Gy, respectively). Blood-based RBM doses for the two grade 3 to 4 patients were higher (0.66 and 0.51 Gy, respectively) than the others, and the cranium-based dose of only the grade 4 patient was higher (0.34 Gy). Tumor-to-RBM dose ratios would increase by 25% on average when treating with (90)Y instead of (177)Lu. 
3D dosimetry identifies patients at risk of developing any grade of RBM toxicity more accurately than blood- or 2D image-based methods. It has the added value of enabling the calculation of tumor-to-RBM dose ratios.

  7. 40 CFR 194.23 - Models and computer codes.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 26 2013-07-01 2013-07-01 false Models and computer codes. 194.23... General Requirements § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e., computer...

  8. 40 CFR 194.23 - Models and computer codes.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 26 2012-07-01 2011-07-01 true Models and computer codes. 194.23... General Requirements § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e., computer...

  9. 40 CFR 194.23 - Models and computer codes.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 25 2014-07-01 2014-07-01 false Models and computer codes. 194.23... General Requirements § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e., computer...

  10. 40 CFR 194.23 - Models and computer codes.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 24 2010-07-01 2010-07-01 false Models and computer codes. 194.23... General Requirements § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e., computer...

  11. 40 CFR 194.23 - Models and computer codes.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 25 2011-07-01 2011-07-01 false Models and computer codes. 194.23... General Requirements § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e., computer...

  12. A simplified model of the source channel of the Leksell GammaKnife tested with PENELOPE.

    PubMed

    Al-Dweri, Feras M O; Lallena, Antonio M; Vilches, Manuel

    2004-06-21

    Monte Carlo simulations using the code PENELOPE have been performed to test a simplified model of the source channel geometry of the Leksell Gamma Knife. The characteristics of the radiation passing through the treatment helmets are analysed in detail. We have found that only primary particles emitted from the source with polar angles smaller than 3 degrees with respect to the beam axis are relevant for the dosimetry of the Gamma Knife. The photon trajectories reaching the output helmet collimators at (x, y, z = 236 mm) show strong correlations between ρ = (x² + y²)^(1/2) and their polar angle θ, on one side, and between tan⁻¹(y/x) and their azimuthal angle φ, on the other. This enables us to propose a simplified model which treats the full source channel as a mathematical collimator. This simplified model produces doses in good agreement with those found for the full geometry: in the region of maximal dose, the relative differences between the two calculations are within 3% for the 18 and 14 mm helmets and within 10% for the 8 and 4 mm ones. Moreover, the simplified model reduces the computational time by more than a factor of 15.
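    The geometric quantities behind that correlation analysis are straightforward to compute. The sketch below, with made-up trajectory coordinates, shows the ρ, θ and φ definitions and a 3° source-angle cut reflecting the paper's finding; the function names and the cutoff filter are illustrative assumptions, not the authors' code.

```python
import math

def beam_coordinates(x, y, dir_x, dir_y, dir_z):
    """Radial distance and angles for a photon crossing a plane
    transverse to the beam (z) axis; direction cosines are unit-norm."""
    rho = math.hypot(x, y)                  # rho = (x^2 + y^2)^(1/2)
    theta = math.degrees(math.acos(dir_z))  # polar angle w.r.t. beam axis
    phi = math.degrees(math.atan2(y, x))    # tan^-1(y/x), quadrant-aware
    return rho, theta, phi

def relevant_for_dosimetry(dir_z, cutoff_deg=3.0):
    # Hypothetical filter: keep only primaries emitted within ~3 degrees
    # of the beam axis, per the abstract's observation.
    return math.degrees(math.acos(dir_z)) <= cutoff_deg

# Made-up trajectory: 3 mm / 4 mm off-axis, travelling along +z.
rho, theta, phi = beam_coordinates(3.0, 4.0, 0.0, 0.0, 1.0)
```

    Using `atan2` rather than a bare `tan⁻¹(y/x)` keeps the azimuth correct in all four quadrants, which matters when binning trajectories by φ.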

  13. Conception and realization of a parallel-plate free-air ionization chamber for the absolute dosimetry of an ultrasoft X-ray beam

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Groetz, J.-E., E-mail: jegroetz@univ-fcomte.fr; Mavon, C.; Fromm, M.

    2014-08-15

    We report the design of a millimeter-sized parallel-plate free-air ionization chamber (IC) aimed at determining the absolute air kerma rate of an ultra-soft X-ray beam (E = 1.5 keV). The size of the IC was determined so that the measurement volume satisfies the condition of charged-particle equilibrium. The correction factors necessary to properly measure the absolute kerma using the IC have been established. Particular attention was given to the determination of the effective mean energy for the 1.5 keV photons using the PENELOPE code. Other correction factors were determined by means of computer simulation (COMSOL™ and FLUKA). Measurements of air kerma rates under specific operating parameters of the lab-bench X-ray source have been performed at various distances from that source and compared to Monte Carlo calculations. We show that the developed ionization chamber makes it possible to determine accurate photon fluence rates in routine work and will constitute substantial time-savings for future radiobiological experiments based on the use of ultra-soft X-rays.

  14. Neutron dose estimation in a zero power nuclear reactor

    NASA Astrophysics Data System (ADS)

    Triviño, S.; Vedelago, J.; Cantargi, F.; Keil, W.; Figueroa, R.; Mattea, F.; Chautemps, A.; Santibañez, M.; Valente, M.

    2016-10-01

    This work presents the characterization and the contribution of the neutron and gamma components to the absorbed dose in a zero power nuclear reactor. A dosimetric method based on Fricke gel was implemented to evaluate the separation between dose components in the mixed field. The validation of the proposed method was performed by means of direct measurements of neutron flux at different positions using Au and Mg-Ni activation foils. Monte Carlo simulations were also performed, using the MCNP code with a dedicated subroutine to incorporate the exact complete geometry of the nuclear reactor facility. Once the nuclear fuel elements were defined, the simulations computed the different contributions to the absorbed dose at specific positions inside the core. Thermal/epithermal contributions to the absorbed dose were assessed by means of Fricke gel dosimetry using different isotopic compositions aimed at modifying the sensitivity of the dosimeter to specific dose components. Clear distinctions between gamma and neutron capture dose were obtained. Both Monte Carlo simulations and experimental results provided reliable estimations of the neutron flux rate as well as the dose rate during reactor operation. Simulations and experimental results are in good agreement at every position measured and simulated in the core.

  15. Individualized adjustments to reference phantom internal organ dosimetry—scaling factors given knowledge of patient internal anatomy

    NASA Astrophysics Data System (ADS)

    Wayson, Michael B.; Bolch, Wesley E.

    2018-04-01

    Various computational tools are currently available that facilitate patient organ dosimetry in diagnostic nuclear medicine, yet they are typically restricted to reporting organ doses to ICRP-defined reference phantoms. The present study, while remaining computational-phantom based, provides straightforward tools to adjust reference phantom organ doses for both internal photon and electron sources. A wide variety of monoenergetic specific absorbed fractions were computed using radiation transport simulations for tissue spheres of varying size and separation distance. Scaling methods were then constructed for both photon and electron self-dose and cross-dose, with validation provided by patient-specific voxel phantom simulations, as well as by comparison to the scaling methodology given in MIRD Pamphlet No. 11. Photon and electron self-dose was found to be dependent on both radiation energy and sphere size. Photon cross-dose was found to be mostly independent of sphere size. Electron cross-dose was found to be dependent on sphere size when the spheres were in close proximity, owing to differences in electron range. The validation studies showed that this dataset was more effective than the MIRD 11 method at predicting patient-specific photon doses at both high and low energies, but gave similar results at photon energies between 100 keV and 1 MeV. The MIRD 11 method for electron self-dose scaling was accurate at lower energies but began to break down at higher energies. The photon cross-dose scaling methodology developed in this study showed gains in accuracy of up to 9% for actual patient studies, and the electron cross-dose scaling methodology likewise showed gains in accuracy of up to 9% when only the bremsstrahlung component of the cross-dose was scaled. These dose scaling methods are readily available for incorporation into internal dosimetry software for diagnostic phantom-based organ dosimetry.

  16. Applying an analytical method to study neutron behavior for dosimetry

    NASA Astrophysics Data System (ADS)

    Shirazi, S. A. Mousavi

    2016-12-01

    In this investigation, a new dosimetry process is studied by applying an analytical method, here applied to human liver tissue. Liver tissue comprises compounds including water and glycogen. In this study, the organic compound materials of the liver are decomposed into their constituent elements based upon the mass percentage and density of each element. The absorbed doses are computed by the analytical method for all constituent elements of liver tissue. The analytical method is formulated using mathematical equations based on neutron behavior and neutron collision rules. The results show that the absorbed doses converge for neutron energies below 15 MeV. This method can be applied to study the interaction of neutrons in other tissues and to estimate the absorbed dose for a wide range of neutron energies.
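    The elemental decomposition step described above can be sketched numerically: the tissue dose is assembled as a mass-fraction-weighted sum of per-element contributions, D = Φ · Σᵢ wᵢ kᵢ, where kᵢ is a fluence-to-kerma coefficient for element i. The element list and every coefficient below are invented placeholders for illustration, not the paper's data.

```python
# Hedged sketch of a mass-fraction-weighted dose sum (placeholder values).

def absorbed_dose(fluence, mass_fractions, kerma_coeffs):
    """Absorbed dose (Gy) for a fluence (cm^-2) and per-element
    fluence-to-kerma coefficients (Gy cm^2), weighted by mass fraction."""
    assert abs(sum(mass_fractions.values()) - 1.0) < 1e-6  # fractions sum to 1
    return fluence * sum(w * kerma_coeffs[el] for el, w in mass_fractions.items())

# Hypothetical soft-tissue-like composition and coefficients:
w = {"H": 0.10, "C": 0.15, "N": 0.03, "O": 0.72}
k = {"H": 5.0e-10, "C": 2.0e-11, "N": 2.1e-11, "O": 2.2e-11}
dose = absorbed_dose(1.0e9, w, k)   # Gy, for a 1e9 cm^-2 fluence
```

    The same weighted-sum structure carries over to any other tissue once its elemental mass fractions and energy-dependent coefficients are tabulated.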

  17. A modern Monte Carlo investigation of the TG-43 dosimetry parameters for an {sup 125}I seed already having AAPM consensus data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aryal, Prakash; Molloy, Janelle A.; Rivard, Mark J., E-mail: mark.j.rivard@gmail.com

    2014-02-15

    Purpose: To investigate potential causes for differences in TG-43 brachytherapy dosimetry parameters in the existing literature for the model IAI-125A {sup 125}I seed and to propose new standard dosimetry parameters. Methods: The MCNP5 code was used for Monte Carlo (MC) simulations. The sensitivity of dose distributions, and subsequently TG-43 dosimetry parameters, was explored to reproduce the historical methods upon which American Association of Physicists in Medicine (AAPM) consensus data are based. Twelve simulation conditions varying {sup 125}I coating thickness, coating mass density, photon interaction cross-section library, and photon emission spectrum were examined. Results: Varying {sup 125}I coating thickness, coating mass density, photon cross-section library, and photon emission spectrum for the model IAI-125A seed changed the dose-rate constant by up to 0.9%, about 1%, about 3%, and 3%, respectively, in comparison to the proposed standard value of 0.922 cGy h{sup −1} U{sup −1}. The dose-rate constant values by Solberg et al. [“Dosimetric parameters of three new solid core {sup 125}I brachytherapy sources,” J. Appl. Clin. Med. Phys. 3, 119–134 (2002)], Meigooni et al. [“Experimental and theoretical determination of dosimetric characteristics of IsoAid ADVANTAGE™ {sup 125}I brachytherapy source,” Med. Phys. 29, 2152–2158 (2002)], and Taylor and Rogers [“An EGSnrc Monte Carlo-calculated database of TG-43 parameters,” Med. Phys. 35, 4228–4241 (2008)] for the model IAI-125A seed and Kennedy et al. [“Experimental and Monte Carlo determination of the TG-43 dosimetric parameters for the model 9011 THINSeed™ brachytherapy source,” Med. Phys. 37, 1681–1688 (2010)] for the model 6711 seed were +4.3% (0.962 cGy h{sup −1} U{sup −1}), +6.2% (0.98 cGy h{sup −1} U{sup −1}), +0.3% (0.925 cGy h{sup −1} U{sup −1}), and −0.2% (0.921 cGy h{sup −1} U{sup −1}), respectively, in comparison to the proposed standard value. 
Differences in the radial dose functions between the current study and both Solberg et al. and Meigooni et al. were <10% for r ≤ 5 cm, and increased for r > 5 cm with a maximum difference of 29% at r = 9 cm. In comparison to Taylor and Rogers, these differences were lower (maximum of 2% at r = 9 cm). For the similarly designed model 6711 {sup 125}I seed, differences did not exceed 0.5% for 0.5 ≤ r ≤ 10 cm. Radial dose function values varied by 1% as coating thickness and coating density were changed. Varying the cross-section library and source spectrum altered the radial dose function by 25% and 12%, respectively, but these differences occurred at r = 10 cm, where the dose rates were very low. The 2D anisotropy function results were most similar to those of Solberg et al. and most different from those of Meigooni et al. The observed order of simulation condition variables from most to least important for influencing the 2D anisotropy function was spectrum, coating thickness, coating density, and cross-section library. Conclusions: Several MC radiation transport codes are available for calculation of the TG-43 dosimetry parameters for brachytherapy seeds. The physics models in these codes and their related cross-section libraries have been updated and improved since publication of the 2007 AAPM TG-43U1S1 report. Results using modern data indicated statistically significant differences in these dosimetry parameters in comparison to data recommended in the TG-43U1S1 report. Therefore, professional societies such as the AAPM should consider reevaluating the consensus data for this and other seeds and establishing a process of regular evaluations in which consensus data are based upon methods that remain state-of-the-art.
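    For context, the TG-43 quantities compared above (dose-rate constant Λ, radial dose function g(r), anisotropy function) combine into a dose rate via the standard 1-D formalism, Ḋ(r) = S_K · Λ · [G(r)/G(r₀)] · g(r) · φ_an(r). The sketch below uses the point-source geometry function G(r) = 1/r² with r₀ = 1 cm and Λ = 0.922 cGy h⁻¹ U⁻¹ from the abstract; the g(r) and φ_an(r) tables are invented placeholders, not the published seed data.

```python
# Hedged sketch of the TG-43 1-D dose-rate formalism (point-source
# approximation). Table values below are invented for illustration.

LAMBDA = 0.922   # cGy h^-1 U^-1, proposed standard value from the abstract
R0 = 1.0         # cm, TG-43 reference radius

def geometry(r):
    return 1.0 / r ** 2   # point-source geometry function G(r)

def interp(table, r):
    # Linear interpolation in a sparse radial lookup table.
    rs = sorted(table)
    for a, b in zip(rs, rs[1:]):
        if a <= r <= b:
            t = (r - a) / (b - a)
            return (1 - t) * table[a] + t * table[b]
    raise ValueError("r outside table range")

g_r = {0.5: 1.04, 1.0: 1.00, 2.0: 0.84, 5.0: 0.45}      # invented g(r)
phi_an = {0.5: 0.95, 1.0: 0.94, 2.0: 0.95, 5.0: 0.96}   # invented phi_an(r)

def dose_rate(s_k, r):
    """Dose rate (cGy/h) at radius r (cm) for air-kerma strength s_k (U)."""
    return s_k * LAMBDA * geometry(r) / geometry(R0) * interp(g_r, r) * interp(phi_an, r)
```

    Percent differences between codes, as reported above, then translate directly into the same percent differences in Ḋ(r) at the radii where the g(r) tables disagree.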

  18. Computer Description of Black Hawk Helicopter

    DTIC Science & Technology

    1979-06-01

    Keywords: Combinatorial Geometry Models; Black Hawk Helicopter; GIFT Computer Code; Geometric Description of Targets. The description was made using the technique of combinatorial geometry (COM-GEOM) and will be used as input to the GIFT computer code, which generates … The data used by the COVART computer code was generated by the Geometric Information for Targets (GIFT) computer code. This report documents …

  19. Low-energy electron dose-point kernel simulations using new physics models implemented in Geant4-DNA

    NASA Astrophysics Data System (ADS)

    Bordes, Julien; Incerti, Sébastien; Lampe, Nathanael; Bardiès, Manuel; Bordage, Marie-Claude

    2017-05-01

    When low-energy electrons, such as Auger electrons, interact with liquid water, they induce highly localized ionizing energy depositions over ranges comparable to cell diameters. Monte Carlo track structure (MCTS) codes are suitable tools for performing dosimetry at this level. One of the main MCTS codes, Geant4-DNA, is equipped with only two sets of cross section models for low-energy electron interactions in liquid water ("option 2" and its improved version, "option 4"). To provide Geant4-DNA users with new alternative physics models, a set of cross sections extracted from the CPA100 MCTS code has been added to Geant4-DNA. This new version is hereafter referred to as "Geant4-DNA-CPA100". In this study, "Geant4-DNA-CPA100" was used to calculate low-energy electron dose-point kernels (DPKs) between 1 keV and 200 keV. Such kernels represent the radial energy deposited by an isotropic point source, a parameter that is useful for dosimetry calculations in nuclear medicine. In order to assess the influence of different physics models on DPK calculations, DPKs were calculated using the existing Geant4-DNA models ("option 2" and "option 4"), the newly integrated CPA100 models, and the PENELOPE Monte Carlo code used in step-by-step mode for monoenergetic electrons. Additionally, a comparison was performed of two sets of DPKs simulated with "Geant4-DNA-CPA100": the first set using Geant4's default settings, and the second using CPA100's original default settings. A maximum difference of 9.4% was found between the Geant4-DNA-CPA100 and PENELOPE DPKs. Between the two existing Geant4-DNA models, slight differences between 1 keV and 10 keV were observed. It was highlighted that the DPKs simulated with the two existing Geant4-DNA models were always broader than those generated with "Geant4-DNA-CPA100". 
The discrepancies observed between the DPKs generated using Geant4-DNA's existing models and "Geant4-DNA-CPA100" were caused solely by their different cross sections. The different scoring and interpolation methods used in CPA100 and Geant4 to calculate DPKs showed differences close to 3.0% near the source.
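    The scoring step for a DPK can be sketched independently of the transport physics: energy deposits around an isotropic point source are binned into concentric spherical shells, and each shell's share of the total deposited energy is reported. The toy "transport" below merely samples deposition radii from an arbitrary exponential distribution; it is an illustrative stand-in, not a track-structure simulation.

```python
import random

# Hedged sketch of dose-point-kernel (DPK) scoring into spherical shells.

def score_dpk(deposits, r_max, n_bins):
    """deposits: iterable of (radius_cm, energy_eV) pairs.
    Returns each shell's fraction of the total deposited energy."""
    bins = [0.0] * n_bins
    total = 0.0
    for r, e in deposits:
        total += e
        if r < r_max:
            bins[int(r / r_max * n_bins)] += e
    return [b / total for b in bins]

# Toy "transport": 10000 equal deposits at radii drawn from an
# exponential distribution (arbitrary, purely for illustration).
random.seed(1)
deposits = [(random.expovariate(50.0), 10.0) for _ in range(10000)]
dpk = score_dpk(deposits, r_max=0.2, n_bins=20)
```

    Real DPKs are usually rescaled to a dimensionless radius (e.g. r/r90) before comparing codes, which is exactly where the scoring and interpolation choices mentioned above introduce their ~3% near-source differences.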

  20. SU-F-I-13: Correction Factor Computations for the NIST Ritz Free Air Chamber for Medium-Energy X Rays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bergstrom, P

    Purpose: The National Institute of Standards and Technology (NIST) uses three free-air chambers to establish primary standards for radiation dosimetry at x-ray energies. For medium-energy x rays, the Ritz free-air chamber is the main measurement device. In order to convert the charge or current collected by the chamber to the radiation quantities air kerma or air kerma rate, a number of correction factors specific to the chamber must be applied. Methods: We used the Monte Carlo codes EGSnrc and PENELOPE. Results: Among these correction factors are the diaphragm correction (which accounts for interactions of photons from the x-ray source in the beam-defining diaphragm of the chamber), the scatter correction (which accounts for the effects of photons scattered out of the primary beam), the electron-loss correction (which accounts for electrons that only partially expend their energy in the collection region), the fluorescence correction (which accounts for ionization due to reabsorption of fluorescence photons), and the bremsstrahlung correction (which accounts for the reabsorption of bremsstrahlung photons). We have computed monoenergetic corrections for the NIST Ritz chamber for the 1 cm, 3 cm and 7 cm collection plates. Conclusion: We find good agreement with others' results for the 7 cm plate. The data used to obtain these correction factors will be used to establish air kerma and its uncertainty in the standard NIST x-ray beams.
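    The charge-to-air-kerma conversion these corrections feed into follows the usual free-air-chamber relation, K = (Q/m_air) · (W/e)/(1−g) · Πᵢkᵢ, where W/e ≈ 33.97 J/C is the mean energy expended per unit charge in dry air, g is the radiative-loss fraction, and kᵢ are the chamber-specific corrections named above. The sketch below applies this relation; all correction-factor and measurement values are invented placeholders.

```python
# Hedged sketch of the free-air-chamber charge-to-air-kerma conversion.
# Correction values below are invented placeholders, not NIST data.

W_OVER_E = 33.97   # J/C, mean energy per unit charge for dry air

def air_kerma(charge_c, air_mass_kg, g_rad=0.0, corrections=()):
    """Air kerma (Gy) from collected charge (C) and the air mass (kg)
    of the collection volume, with multiplicative corrections k_i."""
    k = charge_c / air_mass_kg * W_OVER_E / (1.0 - g_rad)
    for c in corrections:
        k *= c
    return k

# Hypothetical measurement: 2.0e-9 C collected in 5.0e-6 kg of air, with
# example corrections (diaphragm, scatter, electron loss, fluorescence).
K = air_kerma(2.0e-9, 5.0e-6, g_rad=0.0,
              corrections=(0.9993, 1.0021, 1.0005, 0.9998))
```

    Because the corrections enter multiplicatively, their relative standard uncertainties combine in quadrature in the air-kerma uncertainty budget.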

  1. SU-F-J-218: Predicting Radiation-Induced Xerostomia by Dosimetrically Accounting for Daily Setup Uncertainty During Head and Neck IMRT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, S; Quon, H; McNutt, T

    2016-06-15

    Purpose: To determine whether accumulated parotid dosimetry, using planning-CT-to-daily-CBCT deformation and dose re-calculation, can predict radiation-induced xerostomia. Methods: To track and dosimetrically account for the effects of anatomical changes on the parotid glands, we propagated physicians' contours from the planning CT to daily CBCT using a deformable registration with iterative CBCT intensity correction. A surface mesh for each OAR was created, with the deformation applied to the mesh to obtain the deformed parotid volumes. Daily dose was computed on the deformed CT and accumulated to the last fraction. For both the accumulated and the planned parotid dosimetry, we tested the power of different dosimetric parameters, including D90, D50, D10, mean, standard deviation, and min/max dose to the combined parotids, as well as patient age, to predict severe xerostomia (NCI-CTCAE grade ≥2 at 6-month follow-up). We also tested the dosimetry of parotid sub-volumes. Three classification algorithms, random tree, support vector machine, and logistic regression, were tested to predict severe xerostomia using a leave-one-out validation approach. Results: We tested our prediction model on 35 HN IMRT cases. Parameters from the accumulated dosimetry model demonstrated 89% accuracy for predicting severe xerostomia. Compared to the planning dosimetry, the accumulated dose consistently demonstrated higher prediction power with all three classification algorithms, including 11%, 5% and 30% higher accuracy, sensitivity and specificity, respectively. Geometric division of the combined parotid glands into superior-inferior regions demonstrated ∼5% higher accuracy than the whole volume. The most influential ranked features included age, the mean accumulated dose to the submandibular glands, and the accumulated D90 of the superior parotid glands. 
Conclusion: We demonstrated that accumulated parotid dosimetry using CT-CBCT registration and dose re-calculation more accurately predicts severe xerostomia, and that the superior portion of the parotid glands may be particularly important in this prediction. This work was supported in part by NIH/NCI under grant R42CA137886 and in part by Toshiba big data research project funds.
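    The leave-one-out validation scheme described above can be sketched with a deliberately simple stand-in classifier: a mean-dose threshold fitted on the training folds (the study itself used random tree, SVM and logistic regression). The (dose, outcome) pairs below are synthetic, not patient data.

```python
# Hedged sketch of leave-one-out cross-validation for a toxicity
# classifier. Data and the threshold "model" are illustrative only.

data = [  # (mean parotid dose in Gy, severe xerostomia observed) -- synthetic
    (18.0, False), (21.0, False), (24.0, False), (26.0, True),
    (29.0, True), (31.0, True), (23.0, False), (33.0, True),
]

def fit_threshold(train):
    # Midpoint between the class means: a minimal trainable classifier.
    pos = [d for d, y in train if y]
    neg = [d for d, y in train if not y]
    return (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2.0

def leave_one_out_accuracy(data):
    correct = 0
    for i, (dose, label) in enumerate(data):
        train = data[:i] + data[i + 1:]     # hold out exactly one case
        thr = fit_threshold(train)
        correct += (dose >= thr) == label   # predict toxic above threshold
    return correct / len(data)

acc = leave_one_out_accuracy(data)
```

    With only 35 cases available, leave-one-out makes maximal use of the data while keeping the evaluated case out of every training fold, which is presumably why the study chose it.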

  2. An image-based skeletal dosimetry model for the ICRP reference adult female—internal electron sources

    NASA Astrophysics Data System (ADS)

    O'Reilly, Shannon E.; DeWeese, Lindsay S.; Maynard, Matthew R.; Rajon, Didier A.; Wayson, Michael B.; Marshall, Emily L.; Bolch, Wesley E.

    2016-12-01

    An image-based skeletal dosimetry model for internal electron sources was created for the ICRP-defined reference adult female. Many previous skeletal dosimetry models, which are still employed in commonly used internal dosimetry software, do not properly account for electron escape from trabecular spongiosa, electron cross-fire from cortical bone, and the impact of marrow cellularity on active marrow self-irradiation. Furthermore, these existing models do not employ the current ICRP definition of a 50 µm bone endosteum (or shallow marrow). Each of these limitations was addressed in the present study. Electron transport was completed to determine specific absorbed fractions to both the active and shallow marrow of the skeletal regions of the University of Florida reference adult female. The skeletal macrostructure and microstructure were modeled separately. The bone macrostructure was based on the whole-body hybrid computational phantom of the UF series of reference models, while the bone microstructure was derived from microCT images of skeletal region samples taken from a 45-year-old female cadaver. The active and shallow marrow are typically adopted as surrogate tissue regions for the hematopoietic stem cells and osteoprogenitor cells, respectively. Source tissues included active marrow, inactive marrow, trabecular bone volume, trabecular bone surfaces, cortical bone volume, and cortical bone surfaces. Marrow cellularity was varied from 10 to 100 percent for active marrow self-irradiation. All other sources were run at the ICRP Publication 70 cellularity defined for each bone site. A total of 33 discrete electron energies, ranging from 1 keV to 10 MeV, were either simulated or analytically modeled. The method of combining skeletal macrostructure and microstructure absorbed fractions assessed using MCNPX electron transport was found to yield results similar to those determined with the PIRT model applied to the UF adult male skeletal dosimetry model. 
Calculated skeletal averaged absorbed fractions for each source-target combination were found to follow similar trends of more recent dosimetry models (image-based models) but did not follow results from skeletal models based upon assumptions of an infinite expanse of trabecular spongiosa.

  3. Safety and biodistribution of 111In-amatuximab in patients with mesothelin expressing cancers using Single Photon Emission Computed Tomography-Computed Tomography (SPECT-CT) imaging

    PubMed Central

    Adler, Stephen; Mena, Esther; Kurdziel, Karen; Maltzman, Julia; Wallin, Bruce; Hoffman, Kimberly; Pastan, Ira; Paik, Chang Hum; Choyke, Peter; Hassan, Raffit

    2015-01-01

    Amatuximab is a chimeric high-affinity monoclonal IgG1/k antibody targeting mesothelin that is being developed for treatment of mesothelin-expressing cancers. Considering the ongoing clinical development of amatuximab in these cancers, our objective was to characterize the biodistribution and dosimetry of 111Indium (111In) radiolabelled amatuximab in mesothelin-expressing cancers. Between October 2011 and February 2013, six patients including four with malignant mesothelioma and two with pancreatic adenocarcinoma underwent Single Photon Emission Computed Tomography-Computed Tomography (SPECT/CT) imaging following administration of 111In amatuximab. SPECT/CT images were obtained at 2–4 hours, 24–48 hours and 96–168 hours after radiotracer injection. In all patients, tumor-to-background ratios (TBR) consistently met or exceeded an uptake of 1.2 (range 1.2–62.0), which is considered the minimum TBR that can be visualized. TBRs were higher in tumors of patients with mesothelioma than pancreatic adenocarcinoma. 111In-amatuximab uptake was noted in both primary tumors and metastatic sites. The radiotracer dose was generally well-tolerated and demonstrated physiologic uptake in the heart, liver, kidneys and spleen. This is the first study to show tumor localization of an anti-mesothelin antibody in humans. Our results show that 111In-amatuximab was well tolerated with a favorable dosimetry profile. It localizes to mesothelin-expressing cancers with a higher uptake in mesothelioma than pancreatic cancer. PMID:25756664
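    The tumor-to-background ratio (TBR) used above is a simple ratio of mean region-of-interest uptakes. A minimal sketch follows; the ROI values are invented for illustration, while the 1.2 visibility threshold comes from the abstract.

```python
# Illustrative sketch of the TBR figure of merit; all ROI values are made up.
def tumor_to_background_ratio(tumor_mean_counts, background_mean_counts):
    """Return the TBR for one SPECT region of interest."""
    return tumor_mean_counts / background_mean_counts

VISIBILITY_THRESHOLD = 1.2  # minimum TBR considered visualizable (per the abstract)

lesions = {"mesothelioma_site": (6.2, 0.5), "pancreatic_site": (1.8, 1.1)}
for name, (tumor, background) in lesions.items():
    tbr = tumor_to_background_ratio(tumor, background)
    print(f"{name}: TBR = {tbr:.1f}, visualizable = {tbr >= VISIBILITY_THRESHOLD}")
```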

  4. User manual for semi-circular compact range reflector code: Version 2

    NASA Technical Reports Server (NTRS)

    Gupta, Inder J.; Burnside, Walter D.

    1987-01-01

    A computer code has been developed at the Ohio State University ElectroScience Laboratory to analyze a semi-circular paraboloidal reflector with or without a rolled edge at the top and a skirt at the bottom. The code can be used to compute the total near field of the reflector or its individual components at a given distance from the center of the paraboloid. The code computes the fields along a radial, horizontal, vertical or axial cut at that distance. Thus, it is very effective in computing the size of the sweet spot for a semi-circular compact range reflector. This report describes the operation of the code. Various input and output statements are explained. Some results obtained using the computer code are presented to illustrate the code's capability and to serve as sample input/output sets.

  5. Single ionization and capture cross sections from biological molecules by bare projectile impact*

    NASA Astrophysics Data System (ADS)

    Quinto, Michele A.; Monti, Juan M.; Montenegro, Pablo D.; Fojón, Omar A.; Champion, Christophe; Rivarola, Roberto D.

    2017-02-01

    We report calculations of single differential and total cross sections for single ionization and single electron capture from biological targets, namely water vapor and DNA nucleobase molecules, by bare projectile impact: H+, He2+, and C6+. They are performed within the Continuum Distorted Wave - Eikonal Initial State approximation and compared to several existing experimental data sets. This study is oriented toward obtaining a reliable set of theoretical data to be used as input in a Monte Carlo code intended for micro- and nanodosimetry.

  6. Nuclear medicine in clinical urology and nephrology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tauxe, W.N.; Dubousky, E.V.

    This book presents explanations of current procedures involving the kidney, with information on the performance of each test, its rationale, and interpretation. The information covers all currently used radiopharmaceuticals, radiation dosimetry, instrumentation, test protocols, and mathematical principles of pathophysiology as they relate to nuclear medicine studies. Information is provided on which radiopharmaceutical, instrument, or computer application to use, and when.

  7. Hanford meteorological station computer codes: Volume 9, The quality assurance computer codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burk, K.W.; Andrews, G.L.

    1989-02-01

    The Hanford Meteorological Station (HMS) was established in 1944 on the Hanford Site to collect and archive meteorological data and provide weather forecasts and related services for the Hanford Site. The station is located approximately 1/2 mile east of the 200 West Area and is operated by PNL for the US Department of Energy. Meteorological data are collected from various sensors and equipment located on and off the Hanford Site. These data are stored in data bases on the Digital Equipment Corporation (DEC) VAX 11/750 at the HMS (hereafter referred to as the HMS computer). Files from those data bases are routinely transferred to the Emergency Management System (EMS) computer at the Unified Dose Assessment Center (UDAC). To ensure the quality and integrity of the HMS data, a set of Quality Assurance (QA) computer codes has been written. The codes will be routinely used by the HMS system manager or the data base custodian. The QA codes provide detailed output files that will be used in correcting erroneous data. The following sections in this volume describe the implementation and operation of the QA computer codes. The appendices contain detailed descriptions, flow charts, and source code listings of each computer code. 2 refs.

  8. A parameterization method and application in breast tomosynthesis dosimetry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Xinhua; Zhang, Da; Liu, Bob

    2013-09-15

    Purpose: To present a parameterization method based on singular value decomposition (SVD), and to provide analytical parameterization of the mean glandular dose (MGD) conversion factors from eight references for evaluating breast tomosynthesis dose in the Mammography Quality Standards Act (MQSA) protocol and in the UK, European, and IAEA dosimetry protocols. Methods: MGD conversion factors are usually listed in lookup tables for factors such as beam quality, breast thickness, breast glandularity, and projection angle. The authors analyzed multiple sets of MGD conversion factors from the Hologic Selenia Dimensions quality control manual and seven previous papers. Each data set was parameterized using a one- to three-dimensional polynomial function of 2–16 terms. Variable substitution was used to improve accuracy. A least-squares fit was conducted using the SVD. Results: The differences between the originally tabulated MGD conversion factors and the results computed using the parameterization algorithms were (a) 0.08%–0.18% on average and 1.31% maximum for the Selenia Dimensions quality control manual, (b) 0.09%–0.66% on average and 2.97% maximum for the published data by Dance et al. [Phys. Med. Biol. 35, 1211–1219 (1990); ibid. 45, 3225–3240 (2000); ibid. 54, 4361–4372 (2009); ibid. 56, 453–471 (2011)], (c) 0.74%–0.99% on average and 3.94% maximum for the published data by Sechopoulos et al. [Med. Phys. 34, 221–232 (2007); J. Appl. Clin. Med. Phys. 9, 161–171 (2008)], and (d) 0.66%–1.33% on average and 2.72% maximum for the published data by Feng and Sechopoulos [Radiology 263, 35–42 (2012)], excluding one sample in (d) that does not follow the trends in the published data table. Conclusions: A flexible parameterization method is presented in this paper, and was applied to breast tomosynthesis dosimetry.
The resultant data offer easy and accurate computations of MGD conversion factors for evaluating mean glandular breast dose in the MQSA protocol and in the UK, European, and IAEA dosimetry protocols. Microsoft Excel™ spreadsheets are provided for the convenience of readers.
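    The fitting procedure described above can be sketched as an SVD-based least-squares polynomial fit. The tabulated values below are invented stand-ins; only the method mirrors the abstract.

```python
import numpy as np

# Illustrative sketch: fit tabulated conversion factors with a low-order
# polynomial via an SVD-based least-squares solve. Data values are made up.
thickness = np.array([2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])        # breast thickness (cm)
factor    = np.array([0.40, 0.34, 0.29, 0.26, 0.23, 0.21, 0.19])  # MGD conversion factor

# Design matrix for a cubic polynomial in thickness. A variable substitution
# (e.g. fitting in 1/t), as the paper suggests, could further improve accuracy.
A = np.vander(thickness, N=4, increasing=True)

# np.linalg.lstsq solves min ||A c - y||_2 using the SVD internally.
coeffs, *_ = np.linalg.lstsq(A, factor, rcond=None)

fitted = A @ coeffs
max_err_pct = 100.0 * np.max(np.abs(fitted - factor) / factor)
print(f"max fit error: {max_err_pct:.2f}%")
```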

  9. Protocols for the dosimetry of high-energy photon and electron beams: a comparison of the IAEA TRS-398 and previous international Codes of Practice

    NASA Astrophysics Data System (ADS)

    Andreo, Pedro; Saiful Huq, M.; Westermark, Mathias; Song, Haijun; Tilikidis, Aris; DeWerd, Larry; Shortt, Ken

    2002-09-01

    A new international Code of Practice for radiotherapy dosimetry co-sponsored by several international organizations has been published by the IAEA, TRS-398. It is based on standards of absorbed dose to water, whereas previous protocols (TRS-381 and TRS-277) were based on air kerma standards. To estimate the changes in beam calibration caused by the introduction of TRS-398, a detailed experimental comparison of the dose determination in reference conditions in high-energy photon and electron beams has been made using the different IAEA protocols. A summary of the formulation and reference conditions in the various Codes of Practice, as well as of their basic data, is presented first. Accurate measurements have been made in 25 photon and electron beams from 10 clinical accelerators using 12 different cylindrical and plane-parallel chambers, and dose ratios under different conditions of TRS-398 to the other protocols determined. A strict step-by-step checklist was followed by the two participating clinical institutions to ascertain that the resulting calculations agreed within tenths of a per cent. The maximum differences found between TRS-398 and the previous Codes of Practice TRS-277 (2nd edn) and TRS-381 are of the order of 1.5-2.0%. TRS-398 yields absorbed doses larger than the previous protocols, around 1.0% for photons (TRS-277) and for electrons (TRS-381 and TRS-277) when plane-parallel chambers are cross-calibrated. For the Markus chamber, results show a very large variation, although a fortuitous cancellation of the old stopping powers with the ND,w/NK ratios makes the overall discrepancy between TRS-398 and TRS-277 in this case smaller than for well-guarded plane-parallel chambers. Chambers of the Roos-type with a 60Co ND,w calibration yield the maximum discrepancy in absorbed dose, which varies between 1.0% and 1.5% for TRS-381 and between 1.5% and 2.0% for TRS-277. 
Photon beam calibrations using TPR20,10 either directly measured or calculated from percentage depth-dose data at SSD = 100 cm were found to be indistinguishable. Considering that approximately 0.8% of the differences between TRS-398 and the NK-based protocols are caused by the change to the new type of standards, the remaining difference in absolute dose is due either to a close similarity in basic data or to a fortuitous cancellation of the discrepancies in data and type of chamber calibration. It is emphasized that the NK-ND,air and ND,w formalisms have very similar uncertainty when the same criteria are used for both procedures. Arguments are provided in support of the recommendation for a change in reference dosimetry based on standards of absorbed dose to water.
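    For the TPR20,10 derivation mentioned above, TRS-398 provides an empirical relation from percentage depth-dose data at SSD = 100 cm (TPR20,10 ≈ 1.2661 · PDD20,10 − 0.0595). A sketch follows; the numeric PDD values are illustrative, not from the paper.

```python
# Sketch of the TRS-398 empirical relation for the photon beam quality index.
# PDD20,10 is the ratio of percentage depth doses at 20 cm and 10 cm depth
# (SSD = 100 cm, 10 x 10 cm field). Example PDD values are invented.
def tpr2010_from_pdd(pdd20, pdd10):
    """Approximate TPR20,10 from percent depth doses at 20 cm and 10 cm."""
    pdd_ratio = pdd20 / pdd10
    return 1.2661 * pdd_ratio - 0.0595

# Example: a 6 MV-like beam with PDD(20) = 38.7% and PDD(10) = 66.7%
print(f"TPR20,10 = {tpr2010_from_pdd(38.7, 66.7):.3f}")
```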

  10. User's manual for semi-circular compact range reflector code

    NASA Technical Reports Server (NTRS)

    Gupta, Inder J.; Burnside, Walter D.

    1986-01-01

    A computer code was developed to analyze a semi-circular paraboloidal reflector antenna with a rolled edge at the top and a skirt at the bottom. The code can be used to compute the total near field of the antenna or its individual components at a given distance from the center of the paraboloid. Thus, it is very effective in computing the size of the sweet spot for RCS or antenna measurement. The operation of the code is described. Various input and output statements are explained. Some results obtained using the computer code are presented to illustrate the code's capability and to serve as sample input/output sets.

  11. Highly fault-tolerant parallel computation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spielman, D.A.

    We re-introduce the coded model of fault-tolerant computation in which the input and output of a computational device are treated as words in an error-correcting code. A computational device correctly computes a function in the coded model if its input and output, once decoded, are a valid input and output of the function. In the coded model, it is reasonable to hope to simulate all computational devices by devices whose size is greater by a constant factor but which are exponentially reliable even if each of their components can fail with some constant probability. We consider fine-grained parallel computations in which each processor has a constant probability of producing the wrong output at each time step. We show that any parallel computation that runs for time t on w processors can be performed reliably on a faulty machine in the coded model using w log{sup O(1)} w processors and time t log{sup O(1)} w. The failure probability of the computation will be at most t · exp(-w{sup 1/4}). The codes used to communicate with our fault-tolerant machines are generalized Reed-Solomon codes and can thus be encoded and decoded in O(n log{sup O(1)} n) sequential time and are independent of the machine they are used to communicate with. We also show how coded computation can be used to self-correct many linear functions in parallel with arbitrarily small overhead.
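    The Reed-Solomon idea underlying the coded model can be sketched in a few lines: treat the k message symbols as polynomial coefficients and the codeword as that polynomial's values at n points, so any k intact values recover the message. The sketch below handles only error-free decoding via Lagrange interpolation; a real fault-tolerant scheme would add error correction, and all specifics (field size, evaluation points) are illustrative assumptions.

```python
# Minimal Reed-Solomon sketch over the prime field GF(929); all values invented.
P = 929  # prime modulus

def poly_eval(coeffs, x):
    """Horner evaluation of a polynomial (lowest-degree coefficient first), mod P."""
    result = 0
    for c in reversed(coeffs):
        result = (result * x + c) % P
    return result

def poly_mul_linear(b, xj):
    """Multiply polynomial b (lowest-first coefficients) by (x - xj), mod P."""
    out = [0] * (len(b) + 1)
    for t, c in enumerate(b):
        out[t] = (out[t] - xj * c) % P
        out[t + 1] = (out[t + 1] + c) % P
    return out

def rs_encode(message, n):
    """Encode k message symbols as the polynomial's values at x = 1..n (n >= k)."""
    return [(x, poly_eval(message, x)) for x in range(1, n + 1)]

def rs_decode(points, k):
    """Recover k coefficients from any k error-free (x, y) pairs (Lagrange interpolation)."""
    pts = points[:k]
    coeffs = [0] * k
    for i, (xi, yi) in enumerate(pts):
        numer, denom = [1], 1
        for j, (xj, _) in enumerate(pts):
            if i != j:
                numer = poly_mul_linear(numer, xj)
                denom = (denom * (xi - xj)) % P
        scale = (yi * pow(denom, P - 2, P)) % P  # Fermat inverse of denom
        for t in range(k):
            coeffs[t] = (coeffs[t] + scale * numer[t]) % P
    return coeffs

message = [3, 1, 4]               # k = 3 symbols
codeword = rs_encode(message, 7)  # n = 7 evaluations tolerate symbol loss
assert rs_decode(codeword[2:5], 3) == message  # any 3 intact points suffice
```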

  12. An emulator for minimizing computer resources for finite element analysis

    NASA Technical Reports Server (NTRS)

    Melosh, R.; Utku, S.; Islam, M.; Salama, M.

    1984-01-01

    A computer code, SCOPE, has been developed for predicting the computer resources required for a given analysis code, computer hardware, and structural problem. The cost of running the code is a small fraction (about 3 percent) of the cost of performing the actual analysis. However, its accuracy in predicting the CPU and I/O resources depends intrinsically on the accuracy of calibration data that must be developed once for the computer hardware and the finite element analysis code of interest. Testing of the SCOPE code on the AMDAHL 470 V/8 computer and the ELAS finite element analysis program indicated small I/O errors (3.2 percent), larger CPU errors (17.8 percent), and negligible total errors (1.5 percent).

  13. A generalized one-dimensional computer code for turbomachinery cooling passage flow calculations

    NASA Technical Reports Server (NTRS)

    Kumar, Ganesh N.; Roelke, Richard J.; Meitner, Peter L.

    1989-01-01

    A generalized one-dimensional computer code for analyzing the flow and heat transfer in turbomachinery cooling passages was developed. This code is capable of handling rotating cooling passages with turbulators, 180 degree turns, pin fins, finned passages, by-pass flows, tip cap impingement flows, and flow branching. The code is an extension of a one-dimensional code developed by P. Meitner. In the subject code, correlations for both heat transfer coefficient and pressure loss computations were developed to model each of the above mentioned types of coolant passages. The code has the capability of independently computing the friction factor and heat transfer coefficient on each side of a rectangular passage. Either the mass flow at the inlet to the channel or the exit plane pressure can be specified. For a specified inlet total temperature, inlet total pressure, and exit static pressure, the code computes the flow rates through the main branch and the subbranches, and the flow through the tip cap for impingement cooling, in addition to computing the coolant pressure, temperature, and heat transfer coefficient distribution in each coolant flow branch. Predictions from the subject code for both nonrotating and rotating passages agree well with experimental data. The code was used to analyze the cooling passage of a research cooled radial rotor.

  14. Monte Carlo Investigation on the Effect of Heterogeneities on Strut Adjusted Volume Implant (SAVI) Dosimetry

    NASA Astrophysics Data System (ADS)

    Koontz, Craig

    Breast cancer is the most prevalent cancer for women, with more than 225,000 new cases diagnosed in the United States in 2012 (ACS, 2012). With this high prevalence comes an increased emphasis on researching new techniques to treat this disease. Accelerated partial breast irradiation (APBI) has been used as an alternative to whole breast irradiation (WBI) in order to treat occult disease after lumpectomy. Similar recurrence rates have been found using APBI after lumpectomy as with mastectomy alone, but with the added benefit of improved cosmetic and psychological results. Intracavitary brachytherapy devices have been used to deliver the APBI prescription. However, inability to produce asymmetric dose distributions in order to avoid overdosing skin and chest wall has been an issue with these devices. Multi-lumen devices were introduced to overcome this problem. Of these, the Strut-Adjusted Volume Implant (SAVI) has demonstrated the greatest ability to produce an asymmetric dose distribution, which would have greater ability to avoid skin and chest wall dose, and thus allow more women to receive this type of treatment. However, SAVI treatments come with inherent heterogeneities, including variable backscatter due to the proximity to the tissue-air and tissue-lung interfaces and variable contents within the cavity created by the SAVI. The dose calculation protocol based on TG-43 does not account for heterogeneities and thus will not produce accurate dosimetry; however, Acuros, a model-based dose calculation algorithm manufactured by Varian Medical Systems, claims to accurately account for heterogeneities. Monte Carlo simulation can calculate the dosimetry with high accuracy. In this thesis, a model of the SAVI will be created for Monte Carlo simulation, specifically using the MCNP code, in order to explore the effects of heterogeneities on the dose distribution. These data will be compared to TG-43 and Acuros calculated dosimetry to explore their accuracy.

  15. SU‐C‐105‐05: Reference Dosimetry of High‐Energy Electron Beams with a Farmer‐Type Ionization Chamber

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Muir, B; Rogers, D

    2013-06-15

    Purpose: To investigate gradient effects and provide Monte Carlo calculated beam quality conversion factors to characterize the Farmer-type NE2571 ion chamber for high-energy reference dosimetry of clinical electron beams. Methods: The EGSnrc code system is used to calculate the absorbed dose to water and to the gas in a fully modeled NE2571 chamber as a function of depth in a water phantom. Electron beams incident on the surface of the phantom are modeled using realistic BEAMnrc accelerator simulations and electron beam spectra. Beam quality conversion factors are determined using calculated doses to water and to air in the chamber in high-energy electron beams and in a cobalt-60 reference field. Calculated water-to-air stopping power ratios are employed for investigation of the overall ion chamber perturbation factor. Results: An upstream shift of 0.3–0.4 times the chamber radius, r_cav, both minimizes the variation of the overall ion chamber perturbation factor with depth and reduces the difference between the beam quality specifier (R50) calculated using ion chamber simulations and that obtained with simulations of dose-to-water in the phantom. Beam quality conversion factors are obtained at the reference depth and gradient effects are optimized using a shift of 0.2 r_cav. The photon-electron conversion factor, k_ecal, amounts to 0.906 when gradient effects are minimized using the shift established here and 0.903 if no shift of the data is used. Systematic uncertainties in beam quality conversion factors are investigated and amount to between 0.4 and 1.1% depending on the assumptions used. Conclusion: The calculations obtained in this work characterize the use of an NE2571 ion chamber for reference dosimetry of high-energy electron beams. These results will be useful as the AAPM continues to review its reference dosimetry protocols.
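    In outline, a beam quality conversion factor of the kind computed above is a ratio of dose ratios between the user beam and the cobalt-60 reference field. A hedged sketch, with invented dose values that are not the paper's results:

```python
# Sketch of forming a beam quality conversion factor from Monte Carlo doses:
# k_Q = (D_water / D_gas)_Q / (D_water / D_gas)_Co60. All numbers are invented.
def beam_quality_conversion(dw_q, dgas_q, dw_co, dgas_co):
    """k_Q from dose-to-water and dose-to-chamber-gas in beam Q and in cobalt-60."""
    return (dw_q / dgas_q) / (dw_co / dgas_co)

k_example = beam_quality_conversion(dw_q=1.000, dgas_q=0.897,
                                    dw_co=1.000, dgas_co=0.812)
print(f"example conversion factor = {k_example:.3f}")
```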

  16. SU-D-213-06: Dosimetry of Modulated Electron Radiation Therapy Using Fricke Gel Dosimeter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gawad, M Abdel; Elgohary, M; Hassaan, M

    Purpose: Modulated electron radiation therapy (MERT) has been proposed as an effective modality for treatment of superficial targets. MERT utilizes multiple beams of different energies which are intensity modulated to deliver an optimized dose distribution. Energy-independent dosimeters are thus needed for quantitative evaluations of MERT dose distributions and measurements of absolute doses delivered to patients. In the current work we therefore study the feasibility of Fricke gel dosimeters in MERT dosimetry. Methods: Batches of radiation-sensitive Fricke gel were fabricated and poured into polymethyl methacrylate cuvettes. The samples were irradiated in a solid water phantom, and a thick layer of bolus was used as buildup. A spectrophotometer system was used to measure the color changes (the absorbance) before and after irradiation, from which the net absorbance was calculated. We constructed calibration curves relating the measured absorbance to absorbed dose for all available electron energies. Dosimetric measurements were performed for mixed electron beam delivery, and also for segmented field delivery with the dosimeter placed at the junction of two adjacent electron beams of different energies. Dose measured by gel dosimetry was compared to that calculated by our treatment planning system. We also initiated a Monte Carlo study to evaluate the water equivalence of our dosimeters. The MCBEAM and MCSIM codes were used for treatment head simulation and phantom dose calculation. PDDs and profiles were calculated for electron beams incident on a phantom designed with a 1 cm slab of Fricke gel. Results: The calibration curves showed no observed energy dependence for all studied electron beam energies. Good agreement was obtained between the calculated dose and that obtained by gel dosimetry. Monte Carlo results illustrated the tissue equivalency of our gel dosimeters.
Conclusion: Fricke gel dosimeters represent a good option for dosimetric quality assurance prior to MERT application.

  17. Pediatric dosimetry for intrapleural lung injections of 32P chromic phosphate

    NASA Astrophysics Data System (ADS)

    Konijnenberg, Mark W.; Olch, Arthur

    2010-10-01

    Intracavitary injections of 32P chromic phosphate are used in the therapy of pleuropulmonary blastoma and pulmonary sarcomas in children. The lung dose, however, has never been calculated despite the potential risk of lung toxicity from treatment. In this work the dosimetry has been calculated in target tissue and lung for pediatric phantoms. Pleural cavities were modeled in the Monte Carlo code MCNP within the pediatric MIRD phantoms. Both the depth-dose curves in the pleural lining and into the lung as well as 3D dose distributions were calculated for either homogeneous or inhomogeneous 32P activity distributions. Dose-volume histograms for the lung tissue and isodose graphs were generated. The results for the 2D depth-dose curve to the pleural lining and tumor around the pleural cavity correspond well with the point kernel model-based recommendations. With a 2 mm thick pleural lining, one-third of the lung parenchyma volume gets a dose more than 30 Gy (V30) for 340 MBq 32P in a 10 year old. This is close to lung tolerance. Younger children will receive a larger dose to the lung when the lung density remains equal to the adult value; the V30 relative lung volume for a 5 year old is 35% at an activity of 256 MBq and for a 1 year old 165 MBq yields a V30 of 43%. At higher densities of the lung tissue V30 stays below 32%. All activities yield a therapeutic dose of at least 225 Gy in the pleural lining. With a more normal pleural lining thickness (0.5 mm instead of 2 mm) the injected activities will have to be reduced by a factor 5 to obtain tolerable lung doses in pediatric patients. Previous dosimetry recommendations for the adult apply well down to lung surface areas of 400 cm2. Monte Carlo dosimetry quantitates the three-dimensional dose distribution, providing a better insight into the maximum tolerable activity for this therapy.
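    The V30 figure quoted above is a simple readout from a cumulative dose-volume histogram: the fraction of lung voxels receiving at least 30 Gy. A sketch with invented voxel doses, assuming equal voxel volumes:

```python
# Illustrative DVH metric; the voxel dose list is made up for demonstration.
def v_dose(voxel_doses_gy, threshold_gy=30.0):
    """Fraction of equal-volume voxels receiving at least threshold_gy."""
    n_hot = sum(1 for d in voxel_doses_gy if d >= threshold_gy)
    return n_hot / len(voxel_doses_gy)

lung = [5.0, 12.0, 31.0, 45.0, 28.0, 60.0, 2.0, 33.0, 18.0, 9.0]
print(f"V30 = {100 * v_dose(lung):.0f}% of lung volume")
```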

  18. Computer Model Of Fragmentation Of Atomic Nuclei

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Townsend, Lawrence W.; Tripathi, Ram K.; Norbury, John W.; Khan, Ferdous; Badavi, Francis F.

    1995-01-01

    High Charge and Energy Semiempirical Nuclear Fragmentation Model (HZEFRG1) computer program developed to be computationally efficient, user-friendly, physics-based program for generating data bases on fragmentation of atomic nuclei. Data bases generated used in calculations pertaining to such radiation-transport applications as shielding against radiation in outer space, radiation dosimetry in outer space, cancer therapy in laboratories with beams of heavy ions, and simulation studies for designing detectors for experiments in nuclear physics. Provides cross sections for production of individual elements and isotopes in breakups of high-energy heavy ions by combined nuclear and Coulomb fields of interacting nuclei. Written in ANSI FORTRAN 77.

  19. Efficient Proximity Computation Techniques Using ZIP Code Data for Smart Cities †

    PubMed Central

    Murdani, Muhammad Harist; Hong, Bonghee

    2018-01-01

    In this paper, we are interested in computing ZIP code proximity from two perspectives, proximity between two ZIP codes (Ad-Hoc) and neighborhood proximity (Top-K). Such a computation can be used for ZIP code-based target marketing as one of the smart city applications. A naïve approach to this computation is the usage of the distance between ZIP codes. We redefine a distance metric combining the centroid distance with the intersecting road network between ZIP codes by using a weighted sum method. Furthermore, we prove that the results of our combined approach conform to the characteristics of distance measurement. We have proposed a general and heuristic approach for computing Ad-Hoc proximity, while for computing Top-K proximity, we have proposed a general approach only. Our experimental results indicate that our approaches are verifiable and effective in reducing the execution time and search space. PMID:29587366
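    The weighted-sum metric described above can be sketched as follows. The normalization scheme, weights, and input values are assumptions for illustration, not the paper's exact definitions.

```python
import math

# Hypothetical sketch: proximity between two ZIP codes as a weighted sum of a
# normalized centroid distance and a road-network connectivity term.
def centroid_distance(a, b):
    """Euclidean distance between two ZIP-code centroids (projected coordinates)."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def combined_distance(a, b, shared_roads, max_dist, max_roads,
                      w_dist=0.6, w_road=0.4):
    """Weighted sum: smaller means closer; more intersecting roads reduce it."""
    norm_dist = centroid_distance(a, b) / max_dist
    norm_road = 1.0 - (shared_roads / max_roads)  # many shared roads -> small term
    return w_dist * norm_dist + w_road * norm_road

d = combined_distance((0.0, 0.0), (3.0, 4.0),
                      shared_roads=2, max_dist=10.0, max_roads=4)
print(f"combined distance = {d:.2f}")
```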

  20. Efficient Proximity Computation Techniques Using ZIP Code Data for Smart Cities †.

    PubMed

    Murdani, Muhammad Harist; Kwon, Joonho; Choi, Yoon-Ho; Hong, Bonghee

    2018-03-24

    In this paper, we are interested in computing ZIP code proximity from two perspectives, proximity between two ZIP codes (Ad-Hoc) and neighborhood proximity (Top-K). Such a computation can be used for ZIP code-based target marketing as one of the smart city applications. A naïve approach to this computation is the usage of the distance between ZIP codes. We redefine a distance metric combining the centroid distance with the intersecting road network between ZIP codes by using a weighted sum method. Furthermore, we prove that the results of our combined approach conform to the characteristics of distance measurement. We have proposed a general and heuristic approach for computing Ad-Hoc proximity, while for computing Top-K proximity, we have proposed a general approach only. Our experimental results indicate that our approaches are verifiable and effective in reducing the execution time and search space.

  1. Evaluation of the UF/NCI hybrid computational phantoms for use in organ dosimetry of pediatric patients undergoing fluoroscopically guided cardiac procedures

    NASA Astrophysics Data System (ADS)

    Marshall, Emily L.; Borrego, David; Tran, Trung; Fudge, James C.; Bolch, Wesley E.

    2018-03-01

    Epidemiologic data demonstrate that pediatric patients face a higher relative risk of radiation-induced cancers than their adult counterparts at equivalent exposures. Infants and children with congenital heart defects are a critical patient population exposed to ionizing radiation during life-saving procedures. These patients will likely undergo numerous procedures throughout their lifespan, each time increasing their cumulative radiation absorbed dose. As continued improvements in the long-term prognosis of congenital heart defect patients are achieved, a better understanding of organ radiation dose following treatment becomes increasingly vital. Dosimetry of these patients can be accomplished using Monte Carlo radiation transport simulations, coupled with modern anatomical patient models. The aim of this study was to evaluate the performance of the University of Florida/National Cancer Institute (UF/NCI) pediatric hybrid computational phantom library for organ dose assessment of patients that have undergone fluoroscopically guided cardiac catheterizations. In this study, two types of simulations were modeled. A dose assessment was performed on 29 patient-specific voxel phantoms (taken as representing the patient’s true anatomy), height/weight-matched hybrid library phantoms, and age-matched reference phantoms. Two exposure studies were conducted for each phantom type. First, a parametric study was constructed by the attending pediatric interventional cardiologist at the University of Florida to model the range of parameters seen clinically. Second, four clinical cardiac procedures were simulated based upon internal logfiles captured by a Toshiba Infinix-i Cardiac Bi-Plane fluoroscopic unit.
Performance of the phantom library was quantified by computing both the percent difference in individual organ doses and the organ dose root mean square (RMS) values for overall phantom assessment between the matched phantoms (UF/NCI library or reference) and the patient-specific phantoms. The UF/NCI hybrid phantoms performed with percent differences between 15% and 30% for the parametric set of irradiation events. Among internal logfile-reconstructed procedures, the UF/NCI hybrid phantoms performed with RMS organ dose values between 7% and 29%. Percent improvement in organ dosimetry via the use of hybrid library phantoms over the reference phantoms ranged from 6.6% to 93%. The use of a hybrid phantom library, Monte Carlo radiation transport methods, and clinical information on irradiation events provides a means for tracking organ dose in these radiosensitive patients undergoing fluoroscopically guided cardiac procedures. This work was supported by the Advanced Laboratory for Radiation Dosimetry Studies, University of Florida, the American Association of University Women, and National Cancer Institute Grant 1F31 CA159464.
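    The two evaluation metrics described above, per-organ percent difference and an RMS summary across organs, can be sketched in a few lines. All dose values below are invented for illustration.

```python
import math

# Illustrative sketch of the phantom-comparison metrics; doses are made up.
def percent_diff(matched, truth):
    """Signed percent difference of a matched-phantom organ dose vs. the
    patient-specific (taken-as-truth) organ dose."""
    return 100.0 * (matched - truth) / truth

def rms_percent_diff(matched_doses, truth_doses):
    """Root mean square of per-organ percent differences (overall score)."""
    diffs = [percent_diff(m, t) for m, t in zip(matched_doses, truth_doses)]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

truth   = {"lung": 2.10, "heart": 3.40, "liver": 1.25}  # mGy, patient-specific
matched = {"lung": 2.31, "heart": 3.10, "liver": 1.30}  # mGy, library phantom
organs = list(truth)
rms = rms_percent_diff([matched[o] for o in organs], [truth[o] for o in organs])
print(f"RMS organ-dose difference: {rms:.1f}%")
```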

  2. An electron-beam dose deposition experiment: TIGER 1-D simulation code versus thermoluminescent dosimetry

    NASA Astrophysics Data System (ADS)

    Murrill, Steven R.; Tipton, Charles W.; Self, Charles T.

    1991-03-01

    The dose absorbed in an integrated circuit (IC) die exposed to a pulse of low-energy electrons is a strong function of both electron energy and surrounding packaging materials. This report describes an experiment designed to measure how well the Integrated TIGER Series one-dimensional (1-D) electron transport simulation program predicts dose correction factors for a state-of-the-art IC package and package/printed circuit board (PCB) combination. These derived factors are compared with data obtained experimentally using thermoluminescent dosimeters (TLDs) and the FX-45 flash x-ray machine operated in electron-beam (e-beam) mode. The results of this experiment show that the TIGER 1-D simulation code can be used to accurately predict FX-45 e-beam dose deposition correction factors for reasonably complex IC packaging configurations.

  3. Monte Carlo study of a 60Co calibration field of the Dosimetry Laboratory Seibersdorf.

    PubMed

    Hranitzky, C; Stadtmann, H

    2007-01-01

The gamma radiation fields of the reference irradiation facility of the Dosimetry Laboratory Seibersdorf with collimated beam geometry are used for calibrating radiation protection dosemeters. A close-to-reality simulation model of the facility, including the complex geometry of a 60Co source, was set up using the Monte Carlo code MCNP. The goal of this study is to characterise the radionuclide gamma calibration field and the resulting air-kerma distributions inside the measurement hall, which is 20 m in total length. For the whole range of source-detector distances (SDD) along the central beam axis, simulated and measured relative air-kerma values agree within ±0.6%. Influences on the accuracy of the simulation results are investigated, including source mass density effects and detector volume dependencies. A constant scatter contribution of approximately 1% from the lead ring collimator and an increasing scatter contribution from the concrete floor at distances above 7 m are identified, resulting in a total air-kerma scatter contribution below 5%, in accordance with the ISO 4037-1 recommendations.
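As a rough illustration (not the MCNP model itself), the air-kerma behaviour described above can be decomposed into an inverse-square primary term plus small scatter terms. The scatter coefficients below are assumptions chosen only to mirror the ~1% collimator contribution and a floor contribution that grows beyond 7 m.

```python
# Minimal sketch of air kerma along the beam axis: inverse-square primary plus
# assumed scatter terms. Coefficients are illustrative, not fitted values.
def air_kerma(sdd_m: float, k1: float = 1.0) -> float:
    """Total air kerma (arbitrary units) at source-detector distance sdd_m (m).

    k1 is the primary air kerma at 1 m. Collimator scatter is modelled as a
    constant ~1% of primary; floor scatter is assumed to grow linearly with
    distance beyond 7 m.
    """
    primary = k1 / sdd_m ** 2
    collimator_scatter = 0.01 * primary
    floor_scatter = 0.01 * primary * max(0.0, sdd_m - 7.0)
    return primary + collimator_scatter + floor_scatter

# Scatter fraction relative to primary at a few distances along the axis:
for d in (1.0, 5.0, 9.0):
    primary = 1.0 / d ** 2
    print(d, (air_kerma(d) - primary) / primary)
```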

  4. Volume accumulator design analysis computer codes

    NASA Technical Reports Server (NTRS)

    Whitaker, W. D.; Shimazaki, T. T.

    1973-01-01

The computer codes, VANEP and VANES, were written and used to aid in the design and performance calculation of the volume accumulator units (VAUs) for the 5-kWe reactor thermoelectric system. VANEP computes the VAU design which meets the primary coolant loop VAU volume and pressure performance requirements. VANES computes the performance of the VAU design, determined from the VANEP code, at the conditions of the secondary coolant loop. The codes can also compute the performance characteristics of the VAUs under conditions of possible modes of failure which still permit continued system operation.

  5. "Hour of Code": Can It Change Students' Attitudes toward Programming?

    ERIC Educational Resources Information Center

    Du, Jie; Wimmer, Hayden; Rada, Roy

    2016-01-01

    The Hour of Code is a one-hour introduction to computer science organized by Code.org, a non-profit dedicated to expanding participation in computer science. This study investigated the impact of the Hour of Code on students' attitudes towards computer programming and their knowledge of programming. A sample of undergraduate students from two…

  6. Talking about Code: Integrating Pedagogical Code Reviews into Early Computing Courses

    ERIC Educational Resources Information Center

    Hundhausen, Christopher D.; Agrawal, Anukrati; Agarwal, Pawan

    2013-01-01

Given the increasing importance of soft skills in the computing profession, there is good reason to provide students with more opportunities to learn and practice those skills in undergraduate computing courses. Toward that end, we have developed an active learning approach for computing education called the "Pedagogical Code Review"…

  7. Fast GPU-based Monte Carlo simulations for LDR prostate brachytherapy.

    PubMed

    Bonenfant, Éric; Magnoux, Vincent; Hissoiny, Sami; Ozell, Benoît; Beaulieu, Luc; Després, Philippe

    2015-07-07

The aim of this study was to evaluate the potential of bGPUMCD, a Monte Carlo algorithm executed on Graphics Processing Units (GPUs), for fast dose calculations in permanent prostate implant dosimetry. It also aimed to validate a low dose rate brachytherapy source in terms of TG-43 metrics and to use this source to compute dose distributions for permanent prostate implants in very short times. The physics of bGPUMCD was reviewed and extended to include Rayleigh scattering and fluorescence from photoelectric interactions for all materials involved. The radial and anisotropy functions were obtained for the Nucletron SelectSeed in TG-43 conditions. These functions were compared to those found in the MD Anderson Imaging and Radiation Oncology Core brachytherapy source registry, which are considered the TG-43 reference values. After appropriate calibration of the source, permanent prostate implant dose distributions were calculated for four patients and compared to an already validated Geant4 algorithm. The radial function calculated from bGPUMCD showed excellent agreement (differences within 1.3%) with TG-43 accepted values. The anisotropy functions at r = 1 cm and r = 4 cm were within 2% of TG-43 values for angles over 17.5°. For permanent prostate implants, Monte Carlo-based dose distributions with a statistical uncertainty of 1% or less for the target volume were obtained in 30 s or less for 1 × 1 × 1 mm3 calculation grids. Dosimetric indices were very similar (within 2.7%) to those obtained with a validated, independent Monte Carlo code (Geant4) performing the calculations for the same cases in a much longer time (tens of minutes to more than an hour). bGPUMCD is a promising code that makes it possible to envision the use of Monte Carlo techniques in a clinical environment, with sub-minute execution times on a standard workstation. Future work will explore the use of this code with an inverse planning method to provide a complete Monte Carlo-based planning solution.
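For context, the TG-43 1D (point-source) formalism against which such codes are benchmarked can be sketched as below. The radial dose function and anisotropy factor here are hypothetical smooth stand-ins, not SelectSeed consensus data.

```python
# Sketch of the TG-43 1D (point-source) dose-rate formalism:
#   D(r) = Sk * Lambda * G(r)/G(r0) * g(r) * phi_an(r)
# The g(r) and phi_an(r) fits below are hypothetical illustrations.
R0 = 1.0  # TG-43 reference distance (cm)

def geometry(r: float) -> float:
    """Point-source geometry function, 1/r^2."""
    return 1.0 / r ** 2

def dose_rate(r_cm: float, sk: float, Lambda: float, g, phi_an) -> float:
    """Dose rate at radius r_cm: Sk * Lambda * G(r)/G(r0) * g(r) * phi_an(r)."""
    return sk * Lambda * geometry(r_cm) / geometry(R0) * g(r_cm) * phi_an(r_cm)

g = lambda r: max(0.0, 1.0 - 0.05 * (r - R0))  # hypothetical radial dose function
phi = lambda r: 0.95                           # hypothetical anisotropy factor
print(dose_rate(2.0, sk=1.0, Lambda=0.965, g=g, phi_an=phi))
```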

  8. Fast GPU-based Monte Carlo simulations for LDR prostate brachytherapy

    NASA Astrophysics Data System (ADS)

    Bonenfant, Éric; Magnoux, Vincent; Hissoiny, Sami; Ozell, Benoît; Beaulieu, Luc; Després, Philippe

    2015-07-01

The aim of this study was to evaluate the potential of bGPUMCD, a Monte Carlo algorithm executed on Graphics Processing Units (GPUs), for fast dose calculations in permanent prostate implant dosimetry. It also aimed to validate a low dose rate brachytherapy source in terms of TG-43 metrics and to use this source to compute dose distributions for permanent prostate implants in very short times. The physics of bGPUMCD was reviewed and extended to include Rayleigh scattering and fluorescence from photoelectric interactions for all materials involved. The radial and anisotropy functions were obtained for the Nucletron SelectSeed in TG-43 conditions. These functions were compared to those found in the MD Anderson Imaging and Radiation Oncology Core brachytherapy source registry, which are considered the TG-43 reference values. After appropriate calibration of the source, permanent prostate implant dose distributions were calculated for four patients and compared to an already validated Geant4 algorithm. The radial function calculated from bGPUMCD showed excellent agreement (differences within 1.3%) with TG-43 accepted values. The anisotropy functions at r = 1 cm and r = 4 cm were within 2% of TG-43 values for angles over 17.5°. For permanent prostate implants, Monte Carlo-based dose distributions with a statistical uncertainty of 1% or less for the target volume were obtained in 30 s or less for 1 × 1 × 1 mm3 calculation grids. Dosimetric indices were very similar (within 2.7%) to those obtained with a validated, independent Monte Carlo code (Geant4) performing the calculations for the same cases in a much longer time (tens of minutes to more than an hour). bGPUMCD is a promising code that makes it possible to envision the use of Monte Carlo techniques in a clinical environment, with sub-minute execution times on a standard workstation. Future work will explore the use of this code with an inverse planning method to provide a complete Monte Carlo-based planning solution.

  9. Guidelines for developing vectorizable computer programs

    NASA Technical Reports Server (NTRS)

    Miner, E. W.

    1982-01-01

    Some fundamental principles for developing computer programs which are compatible with array-oriented computers are presented. The emphasis is on basic techniques for structuring computer codes which are applicable in FORTRAN and do not require a special programming language or exact a significant penalty on a scalar computer. Researchers who are using numerical techniques to solve problems in engineering can apply these basic principles and thus develop transportable computer programs (in FORTRAN) which contain much vectorizable code. The vector architecture of the ASC is discussed so that the requirements of array processing can be better appreciated. The "vectorization" of a finite-difference viscous shock-layer code is used as an example to illustrate the benefits and some of the difficulties involved. Increases in computing speed with vectorization are illustrated with results from the viscous shock-layer code and from a finite-element shock tube code. The applicability of these principles was substantiated through running programs on other computers with array-associated computing characteristics, such as the Hewlett-Packard (H-P) 1000-F.

  10. Dosimetry for Small Fields in Stereotactic Radiosurgery Using Gafchromic MD-V2-55 Film, TLD-100 and Alanine Dosimeters

    PubMed Central

    Massillon-JL, Guerda; Cueva-Prócel, Diego; Díaz-Aguirre, Porfirio; Rodríguez-Ponce, Miguel; Herrera-Martínez, Flor

    2013-01-01

This work investigated the suitability of passive dosimeters for reference dosimetry in small fields with acceptable accuracy. Absorbed dose to water rate was determined in nine small radiation fields with diameters between 4 and 35 mm in a Leksell Gamma Knife (LGK) and a modified linear accelerator (linac) for stereotactic radiosurgery treatments. Measurements were made using Gafchromic film (MD-V2-55), alanine and thermoluminescent (TLD-100) dosimeters and compared with conventional dosimetry systems. Detectors were calibrated in terms of absorbed dose to water in 60Co gamma-ray and 6 MV x-ray reference (10×10 cm2) fields using an ionization chamber calibrated at a standards laboratory. Absorbed dose to water rate computed with MD-V2-55 was higher than that obtained with the other dosimeters, possibly due to a smaller volume averaging effect. The ratio between the dose rates determined with each dosimeter and those obtained with the film was evaluated for both treatment modalities. For the LGK, the ratio decreased as the dosimeter size increased and remained constant for collimator diameters larger than 8 mm. The same behaviour was observed for the linac, where the ratio increased with field size, independent of the dosimeter used. These behaviours could be explained as a volume averaging effect due to dose gradients and lack of electronic equilibrium. Evaluation of the output factors for the LGK collimators indicated that, even when agreement was observed between Monte Carlo simulation and measurements with different dosimeters, this does not guarantee that the absorbed dose to water rate in the field was properly known; thus, reference dosimetry warrants careful investigation. These results indicated that the alanine dosimeter provides a high degree of accuracy but cannot be used in fields smaller than 20 mm in diameter. Gafchromic film can be considered a suitable methodology for reference dosimetry. TLD dosimeters are not appropriate in fields smaller than 10 mm in diameter. PMID:23671677

  11. Shared dosimetry error in epidemiological dose-response analyses

    DOE PAGES

    Stram, Daniel O.; Preston, Dale L.; Sokolnikov, Mikhail; ...

    2015-03-23

Radiation dose reconstruction systems for large-scale epidemiological studies are sophisticated both in providing estimates of dose and in representing dosimetry uncertainty. For example, a computer program was used by the Hanford Thyroid Disease Study to provide 100 realizations of possible dose to study participants. The variation in realizations reflected the range of possible dose for each cohort member consistent with the data on dose determinants in the cohort. Another example is the Mayak Worker Dosimetry System 2013, which estimates both external and internal exposures and provides multiple realizations of "possible" dose history to workers given dose determinants. This paper takes up the problem of dealing with complex dosimetry systems that provide multiple realizations of dose in an epidemiologic analysis. In this paper we derive expected scores and the information matrix for a model used widely in radiation epidemiology, namely the linear excess relative risk (ERR) model, which allows for a linear dose response (risk in relation to radiation) and distinguishes between modifiers of background rates and of the excess risk due to exposure. We show that treating the mean dose for each individual (calculated by averaging over the realizations) as if it were the true dose (ignoring both shared and unshared dosimetry errors) gives asymptotically unbiased estimates (i.e. the score has expectation zero) and valid tests of the null hypothesis that the ERR slope β is zero. Although the score is unbiased, the information matrix (and hence the standard errors of the estimate of β) is biased for β≠0 when errors in dose estimates are ignored, and we show how to adjust the information matrix to remove this bias, using the multiple realizations of dose. The use of these methods in the context of several studies, including the Mayak Worker Cohort and the U.S. Atomic Veterans Study, is discussed.
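The mean-dose approach described above can be sketched in a few lines: average the dosimetry system's realizations for each individual, then evaluate the linear ERR model at that mean. This is a minimal illustration with hypothetical numbers; it omits the information-matrix adjustment for shared errors that the paper derives.

```python
# Sketch of the linear ERR model evaluated at the mean over multiple dose
# realizations. Doses and the slope beta below are hypothetical.
def mean_dose(realizations: list) -> float:
    """Average over a dosimetry system's realizations for one individual."""
    return sum(realizations) / len(realizations)

def relative_risk(beta: float, dose: float) -> float:
    """Linear excess relative risk model: RR = 1 + beta * dose."""
    return 1.0 + beta * dose

person_realizations = [0.8, 1.2, 1.0]  # Gy, hypothetical realizations
d = mean_dose(person_realizations)
print(relative_risk(0.5, d))  # → 1.5
```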

  12. Characterization of the nanoDot OSLD dosimeter in CT.

    PubMed

    Scarboro, Sarah B; Cody, Dianna; Alvarez, Paola; Followill, David; Court, Laurence; Stingo, Francesco C; Zhang, Di; McNitt-Gray, Michael; Kry, Stephen F

    2015-04-01

    The extensive use of computed tomography (CT) in diagnostic procedures is accompanied by a growing need for more accurate and patient-specific dosimetry techniques. Optically stimulated luminescent dosimeters (OSLDs) offer a potential solution for patient-specific CT point-based surface dosimetry by measuring air kerma. The purpose of this work was to characterize the OSLD nanoDot for CT dosimetry, quantifying necessary correction factors, and evaluating the uncertainty of these factors. A characterization of the Landauer OSL nanoDot (Landauer, Inc., Greenwood, IL) was conducted using both measurements and theoretical approaches in a CT environment. The effects of signal depletion, signal fading, dose linearity, and angular dependence were characterized through direct measurement for CT energies (80-140 kV) and delivered doses ranging from ∼5 to >1000 mGy. Energy dependence as a function of scan parameters was evaluated using two independent approaches: direct measurement and a theoretical approach based on Burlin cavity theory and Monte Carlo simulated spectra. This beam-quality dependence was evaluated for a range of CT scanning parameters. Correction factors for the dosimeter response in terms of signal fading, dose linearity, and angular dependence were found to be small for most measurement conditions (<3%). The relative uncertainty was determined for each factor and reported at the two-sigma level. Differences in irradiation geometry (rotational versus static) resulted in a difference in dosimeter signal of 3% on average. Beam quality varied with scan parameters and necessitated the largest correction factor, ranging from 0.80 to 1.15 relative to a calibration performed in air using a 120 kV beam. Good agreement was found between the theoretical and measurement approaches. 
Correction factors for the measurement of air kerma were generally small for CT dosimetry, although angular effects, and particularly effects due to changes in beam quality, could be more substantial. In particular, it would likely be necessary to account for variations in CT scan parameters and measurement location when performing CT dosimetry using OSLD.
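The way such correction factors combine into a corrected air-kerma result can be sketched as below. The factor names and values are illustrative assumptions, not the paper's measured factors; only the quoted 0.80-1.15 range for the beam-quality correction comes from the text above.

```python
# Sketch of applying multiplicative OSLD correction factors to a raw reading.
# Factor values are illustrative placeholders.
def corrected_air_kerma(raw_reading: float, cal_factor: float,
                        k_fade: float = 1.0, k_lin: float = 1.0,
                        k_angle: float = 1.0, k_q: float = 1.0) -> float:
    """Air kerma (mGy) = raw signal * calibration factor * corrections.

    k_q is the beam-quality correction (0.80-1.15 in the study, relative to a
    120 kV in-air calibration); the other factors were generally small (<3%).
    """
    return raw_reading * cal_factor * k_fade * k_lin * k_angle * k_q

print(corrected_air_kerma(1000.0, cal_factor=0.01, k_q=1.05))
```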

  13. Effect of contrast media on megavoltage photon beam dosimetry.

    PubMed

    Rankine, Ashley W; Lanzon, Peter J; Spry, Nigel A

    2008-01-01

The purpose of this study was to quantify changes in photon beam dosimetry caused by using contrast media during computed tomography (CT) simulation and determine if the resulting changes are clinically significant. The effect of contrast on dosimetry was first examined for a single 6-MV photon beam incident on a plane phantom with a structure of varying electron densities (ρe) and thickness. Patient studies were then undertaken in which CT data sets were collected with and without contrast for 6 typical patients. Three patients received IV contrast (Optiray-240) only and 3 received IV plus oral (Gastrograffin) contrast. Each patient was planned using conformal multifield techniques in accordance with the department standards. Two methods were used to compare the effect of contrast on dosimetry for each patient. The phantom analysis showed that the change in dose at the isocenter for a single 10 × 10 cm2 6-MV photon beam traversing 10 cm of a contrast-enhanced structure with ρe 1.22 was 7.0% (1.22 was the highest average ρe observed in the patient data). As a result of using contrast, increases in ρe were observed in structures for the 6 patients studied. Consequently, when using contrast-enhanced CT data for multifield planning, increases in dose at the isocenter and in critical structures were observed up to 2.1% and 2.5%, respectively. Planning on contrast-enhanced CT images may result in an increase in dose of up to 2.1% at the isocenter, which would generally be regarded as clinically insignificant. If, however, a critical organ is in close proximity to the planning target volume (PTV) and is planned to receive its maximum allowable dose, planning on contrast-enhanced CT images may result in that organ receiving dose beyond the recommended tolerance. In these instances, pre-contrast CT data should be used for dosimetry.

  14. WE-E-BRE-01: An Image-Based Skeletal Dosimetry Model for the ICRP Reference Adult Female - Internal Electron Sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Reilly, S; Maynard, M; Marshall, E

Purpose: Limitations seen in previous skeletal dosimetry models, which are still employed in commonly used software today, include the lack of consideration of electron escape and cross-fire from cortical bone, the modeling of infinite spongiosa, the disregard of the effect of varying cellularity on active marrow self-irradiation, and the lack of use of the more recent ICRP definition of a 50 micron surrogate tissue region for the osteoprogenitor cells - shallow marrow. These limitations were addressed in the present dosimetry model. Methods: Electron transport was completed to determine specific absorbed fractions to active marrow and shallow marrow of the skeletal regions of the adult female. The bone macrostructure was obtained from the whole-body hybrid computational phantom of the UF series of reference phantoms, while the bone microstructure was derived from microCT images of skeletal region samples taken from a 45-year-old female cadaver. The target tissue regions were active marrow and shallow marrow. The source tissues were active marrow, inactive marrow, trabecular bone volume, trabecular bone surfaces, cortical bone volume and cortical bone surfaces. The marrow cellularity was varied from 10 to 100 percent for active marrow self-irradiation. A total of 33 discrete electron energies, ranging from 1 keV to 10 MeV, were either simulated or modeled analytically. Results: The method of combining macro- and microstructure absorbed fractions calculated using MCNPX electron transport was found to yield results similar to those determined with the PIRT model for the UF adult male in the Hough et al. study. Conclusion: The calculated skeletal averaged absorbed fractions for each source-target combination were found to follow similar trends to more recent dosimetry models (image-based models) and did not follow current models used in nuclear medicine dosimetry at high energies (due to those models' use of an infinite expanse of trabecular spongiosa).
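A skeletal-averaged absorbed fraction of the kind reported above can be sketched as a mass-weighted mean over skeletal sites. The site names, masses, and per-site absorbed fractions below are hypothetical placeholders, not the study's values.

```python
# Sketch of a mass-weighted skeletal average of per-site absorbed fractions.
# Site data are hypothetical.
def skeletal_average(af_by_site: dict, mass_by_site: dict) -> float:
    """Mass-weighted mean absorbed fraction over skeletal sites."""
    total_mass = sum(mass_by_site.values())
    return sum(af_by_site[s] * mass_by_site[s] for s in af_by_site) / total_mass

af = {"spine": 0.40, "pelvis": 0.30}      # absorbed fractions, hypothetical
mass = {"spine": 300.0, "pelvis": 200.0}  # target tissue mass in g, hypothetical
print(skeletal_average(af, mass))
```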

  15. The Helicopter Antenna Radiation Prediction Code (HARP)

    NASA Technical Reports Server (NTRS)

    Klevenow, F. T.; Lynch, B. G.; Newman, E. H.; Rojas, R. G.; Scheick, J. T.; Shamansky, H. T.; Sze, K. Y.

    1990-01-01

The first nine months' effort in the development of a user-oriented computer code, referred to as the HARP code, for analyzing the radiation from helicopter antennas is described. The HARP code uses modern computer graphics to aid in the description and display of the helicopter geometry. At low frequencies the helicopter is modeled by polygonal plates, and the method of moments is used to compute the desired patterns. At high frequencies the helicopter is modeled by a composite ellipsoid and flat plates, and computations are made using the geometrical theory of diffraction. The HARP code will provide a user-friendly interface, employing modern computer graphics, to aid the user to describe the helicopter geometry, select the method of computation, construct the desired high or low frequency model, and display the results.

  16. How gamma radiation processing systems are benefiting from the latest advances in information technology

    NASA Astrophysics Data System (ADS)

    Gibson, Wayne H.; Levesque, Daniel

    2000-03-01

This paper discusses how gamma irradiation plants are putting the latest advances in computer and information technology to use for better process control, cost savings, and strategic advantages. Some irradiator operations are gaining significant benefits by integrating computer technology and robotics with real-time information processing, multi-user databases, and communication networks. The paper reports on several irradiation facilities that are making good use of client/server LANs, user-friendly graphics interfaces, supervisory control and data acquisition (SCADA) systems, distributed I/O with real-time sensor devices, trending analysis, real-time product tracking, dynamic product scheduling, and automated dosimetry reading. These plants are lowering costs through fast and reliable reconciliation of dosimetry data, easier validation to GMP requirements, optimized production flow, and faster release of sterilized products to market. There is a trend in the manufacturing sector towards total automation using "predictive process control". Real-time verification of process parameters "on-the-run" allows control parameters to be adjusted appropriately, before the process strays out of limits. When this technology is applied to the gamma radiation process, control will be based on monitoring key parameters such as time and on making adjustments during the process to optimize quality and throughput. Dosimetry results will be used as a quality control measurement rather than as a final monitor for the release of the product. Results are correlated with the irradiation process data to quickly and confidently reconcile variations. Ultimately, a parametric process control system utilizing responsive control, feedback and verification will not only increase productivity and process efficiency, but can also result in operating within tighter dose control set points.

  17. The FLUKA code for space applications: recent developments

    NASA Technical Reports Server (NTRS)

    Andersen, V.; Ballarini, F.; Battistoni, G.; Campanella, M.; Carboni, M.; Cerutti, F.; Empl, A.; Fasso, A.; Ferrari, A.; Gadioli, E.; hide

    2004-01-01

The FLUKA Monte Carlo transport code is widely used for fundamental research, radioprotection and dosimetry, hybrid nuclear energy systems and cosmic ray calculations. The validity of its physical models has been benchmarked against a variety of experimental data over a wide range of energies, ranging from accelerator data to cosmic ray showers in the Earth's atmosphere. The code is presently undergoing several developments in order to better fit the needs of space applications. The generation of particle spectra according to up-to-date cosmic ray data as well as the effect of the solar and geomagnetic modulation have been implemented and already successfully applied to a variety of problems. The implementation of suitable models for heavy ion nuclear interactions has reached an operational stage. At medium/high energy FLUKA is using the DPMJET model. The major task of incorporating heavy ion interactions from a few GeV/n down to the threshold for inelastic collisions is also progressing, and promising results have been obtained using a modified version of the RQMD-2.4 code. This interim solution is now fully operational, while waiting for the development of new models based on the FLUKA hadron-nucleus interaction code, a newly developed QMD code, and the implementation of the Boltzmann master equation theory for low energy ion interactions. © 2004 COSPAR. Published by Elsevier Ltd. All rights reserved.

  18. Enhanced fault-tolerant quantum computing in d-level systems.

    PubMed

    Campbell, Earl T

    2014-12-05

    Error-correcting codes protect quantum information and form the basis of fault-tolerant quantum computing. Leading proposals for fault-tolerant quantum computation require codes with an exceedingly rare property, a transversal non-Clifford gate. Codes with the desired property are presented for d-level qudit systems with prime d. The codes use n=d-1 qudits and can detect up to ∼d/3 errors. We quantify the performance of these codes for one approach to quantum computation known as magic-state distillation. Unlike prior work, we find performance is always enhanced by increasing d.

  19. Convergence acceleration of the Proteus computer code with multigrid methods

    NASA Technical Reports Server (NTRS)

    Demuren, A. O.; Ibraheem, S. O.

    1992-01-01

    Presented here is the first part of a study to implement convergence acceleration techniques based on the multigrid concept in the Proteus computer code. A review is given of previous studies on the implementation of multigrid methods in computer codes for compressible flow analysis. Also presented is a detailed stability analysis of upwind and central-difference based numerical schemes for solving the Euler and Navier-Stokes equations. Results are given of a convergence study of the Proteus code on computational grids of different sizes. The results presented here form the foundation for the implementation of multigrid methods in the Proteus code.

  20. Implementation of radiation shielding calculation methods. Volume 1: Synopsis of methods and summary of results

    NASA Technical Reports Server (NTRS)

    Capo, M. A.; Disney, R. K.

    1971-01-01

The work performed in the following areas is summarized: (1) A realistic nuclear-propelled vehicle was analyzed using the Marshall Space Flight Center computer code package. This code package includes one- and two-dimensional discrete ordinates transport, point kernel, and single-scatter techniques, as well as cross section preparation and data processing codes. (2) Techniques were developed to improve the automated data transfer in the coupled computation method of the computer code package and to improve the utilization of this code package on the Univac-1108 computer system. (3) The MSFC master data libraries were updated.

  1. Nonuniform code concatenation for universal fault-tolerant quantum computing

    NASA Astrophysics Data System (ADS)

    Nikahd, Eesa; Sedighi, Mehdi; Saheb Zamani, Morteza

    2017-09-01

    Using transversal gates is a straightforward and efficient technique for fault-tolerant quantum computing. Since transversal gates alone cannot be computationally universal, they must be combined with other approaches such as magic state distillation, code switching, or code concatenation to achieve universality. In this paper we propose an alternative approach for universal fault-tolerant quantum computing, mainly based on the code concatenation approach proposed in [T. Jochym-O'Connor and R. Laflamme, Phys. Rev. Lett. 112, 010505 (2014), 10.1103/PhysRevLett.112.010505], but in a nonuniform fashion. The proposed approach is described based on nonuniform concatenation of the 7-qubit Steane code with the 15-qubit Reed-Muller code, as well as the 5-qubit code with the 15-qubit Reed-Muller code, which lead to two 49-qubit and 47-qubit codes, respectively. These codes can correct any arbitrary single physical error with the ability to perform a universal set of fault-tolerant gates, without using magic state distillation.

  2. BUGJEFF311.BOLIB (JEFF-3.1.1) and BUGENDF70.BOLIB (ENDF/B-VII.0) - Generation Methodology and Preliminary Testing of two ENEA-Bologna Group Cross Section Libraries for LWR Shielding and Pressure Vessel Dosimetry

    NASA Astrophysics Data System (ADS)

    Pescarini, Massimo; Sinitsa, Valentin; Orsi, Roberto; Frisoni, Manuela

    2016-02-01

Two broad-group coupled neutron/photon working cross section libraries in FIDO-ANISN format, dedicated to LWR shielding and pressure vessel dosimetry applications, were generated following the methodology recommended by the US ANSI/ANS-6.1.2-1999 (R2009) standard. These libraries, named BUGJEFF311.BOLIB and BUGENDF70.BOLIB, are respectively based on JEFF-3.1.1 and ENDF/B-VII.0 nuclear data and adopt the same broad-group energy structure (47 n + 20 γ) as the similar ORNL BUGLE-96 library. They were respectively obtained from the ENEA-Bologna VITJEFF311.BOLIB and VITENDF70.BOLIB libraries in AMPX format for nuclear fission applications through problem-dependent cross section collapsing with the ENEA-Bologna 2007 revision of the ORNL SCAMPI nuclear data processing system. Both of the latter libraries are based on the Bondarenko self-shielding factor method and have the same AMPX format and fine-group energy structure (199 n + 42 γ) as the similar ORNL VITAMIN-B6 library, from which BUGLE-96 was obtained at ORNL. A synthesis of a preliminary validation of the cited BUGLE-type libraries, performed through 3D fixed source transport calculations with the ORNL TORT-3.2 SN code, is included. The calculations were dedicated to the PCA-Replica 12/13 and VENUS-3 engineering neutron shielding benchmark experiments, specifically conceived to test the accuracy of nuclear data and transport codes in LWR shielding and radiation damage analyses.

  3. Phase 1 Evaluation of [(64)Cu]DOTA-Patritumab to Assess Dosimetry, Apparent Receptor Occupancy, and Safety in Subjects with Advanced Solid Tumors.

    PubMed

    Lockhart, A Craig; Liu, Yongjian; Dehdashti, Farrokh; Laforest, Richard; Picus, Joel; Frye, Jennifer; Trull, Lauren; Belanger, Stefanie; Desai, Madhuri; Mahmood, Syed; Mendell, Jeanne; Welch, Michael J; Siegel, Barry A

    2016-06-01

The purpose of this study was to evaluate the safety, dosimetry, and apparent receptor occupancy (RO) of [(64)Cu]DOTA-patritumab, a radiolabeled monoclonal antibody directed against HER3/ERBB3, in subjects with advanced solid tumors. Dosimetry subjects (n = 5) received [(64)Cu]DOTA-patritumab and underwent positron emission tomography (PET)/X-ray computed tomography (CT) at 3, 24, and 48 h. Evaluable RO subjects (n = 3 out of 6) received [(64)Cu]DOTA-patritumab at day 1 and day 8 (after 9.0 mg/kg patritumab), followed by PET/CT at 24 h post-injection. Endpoints included safety, tumor uptake, and efficacy. The tumor SUVmax (± SD) was 5.6 ± 4.5, 3.3 ± 1.7, and 3.0 ± 1.1 at 3, 24, and 48 h in dosimetry subjects. The effective dose and critical organ dose (liver) averaged 0.044 ± 0.008 mSv/MBq and 0.46 ± 0.086 mGy/MBq, respectively. In RO subjects, the tumor-to-blood ratio decreased from 1.00 ± 0.32 at baseline to 0.57 ± 0.17 after stable patritumab, corresponding to an RO of 42.1 ± 3. [(64)Cu]DOTA-patritumab was safe. These limited results suggest that this PET-based method can be used to determine tumor-apparent RO.

  4. Self‐expanding stent effects on radiation dosimetry in esophageal cancer

    PubMed Central

    Francis, Samual R.; Wang, Brian; Williams, Greg V.; Cox, Kristen; Adler, Douglas G.; Shrieve, Dennis C.; Salter, Bill J.

    2013-01-01

    It is the purpose of this study to evaluate how self‐expanding stents (SESs) affect esophageal cancer radiation planning target volumes (PTVs) and dose delivered to surrounding organs at risk (OARs). Ten patients were evaluated, for whom a SES was placed before radiation. A computed tomography (CT) scan obtained before stent placement was fused to the post‐stent CT simulation scan. Three methods were used to represent pre‐stent PTVs: 1) image fusion (IF), 2) volume approximation (VA), and 3) diameter approximation (DA). PTVs and OARs were contoured per RTOG 1010 protocol using Eclipse Treatment Planning software. Post‐stent dosimetry for each patient was compared to approximated pre‐stent dosimetry. For each of the three pre‐stent approximations (IF, VA, and DA), the mean lung and liver doses and the estimated percentages of lung volumes receiving 5 Gy, 10 Gy, 20 Gy, and 30 Gy, and heart volumes receiving 40 Gy were significantly lower (p‐values <0.02) than those estimated in the post‐stent treatment plans. The lung V5, lung V10, and heart V40 constraints were achieved more often using our pre‐stent approximations. Esophageal SES placement increases the dose delivered to the lungs, heart, and liver. This may have clinical importance, especially when the dose‐volume constraints are near the recommended thresholds, as was the case for lung V5, lung V10, and heart V40. While stents have established benefits for treating patients with significant dysphagia, physicians considering stent placement and radiation therapy must realize the effects stents can have on the dosimetry. PACS number: 87.55.dk PMID:23835387

  5. Fixed, object-specific intensity compensation for cone beam optical CT radiation dosimetry

    NASA Astrophysics Data System (ADS)

    Dekker, Kurtis H.; Hazarika, Rubin; Silveira, Matheus A.; Jordan, Kevin J.

    2018-03-01

    Optical cone beam computed tomography (CT) scanning of radiochromic gel dosimeters, using a CCD camera and a low stray light convergent source, provides fast, truly 3D radiation dosimetry with high accuracy. However, a key limiting factor in radiochromic gel dosimetry at large (⩾10 cm diameter) volumes is the initial attenuation of the dosimeters. It is not unusual to observe a 5–10×  difference in signal intensity through the dosimeter center versus through the surrounding medium in pre-irradiation images. Thus, all dosimetric information in a typical experiment is measured within the lower 10%–20% of the camera sensor’s range, and re-use of gels is often not possible due to a lack of transmission. To counteract this, in this note we describe a simple method to create source compensators by printing on transparent films. This technique, which is easily implemented and inexpensive, is an optical analogue to the bowtie filter in x-ray CT. We present transmission images and solution phantom reconstructions to demonstrate that (1) placing compensators beyond the focal zone of the imaging lens prevents high spatial frequency features of the printed films from generating reconstruction artifacts, and (2) object-specific compensation considerably reduces the range of intensities measured in projection images. This will improve the measurable dose range in optical CT dosimetry, and will enable imaging of larger gel volumes (∼15 cm diameter). Additionally, it should enable re-use of dosimeters by printing a new compensator for a second experiment.
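The compensator idea described above can be sketched numerically. This is an illustrative toy, not the authors' code: a printed-film optical density pattern is derived from a pre-irradiation projection so that bright rays through the surrounding medium are attenuated down toward the dimmest ray through the dosimeter center, flattening the projection. The `headroom` parameter and the 5x toy intensity profile are assumptions for illustration.

```python
import numpy as np

def compensator_density(projection, headroom=1.1):
    """Per-pixel optical density to print on transparency film.

    projection : array of measured ray intensities (pre-irradiation scan).
    headroom   : keep transmitted intensity slightly above the dimmest ray.
    """
    target = projection.min() * headroom           # flatten toward dimmest ray
    # OD = log10(I_in / I_out); rays already at or below target get OD = 0
    od = np.log10(np.clip(projection / target, 1.0, None))
    return od

def apply_compensator(projection, od):
    """Intensity after the ray passes through the printed compensator."""
    return projection * 10.0 ** (-od)

# Toy projection: 5x brighter through the medium than through the gel center
proj = np.array([[5.0, 5.0, 1.0, 5.0, 5.0]])
od = compensator_density(proj)
flat = apply_compensator(proj, od)
print(flat.max() / flat.min())   # dynamic range shrinks from 5x to ~1.1x
```

With the compensated range this small, the full projection fits in the upper part of the camera sensor's range instead of the lower 10%-20%.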

  6. Self-expanding stent effects on radiation dosimetry in esophageal cancer.

    PubMed

    Francis, Samual R; Anker, Christopher J; Wang, Brian; Williams, Greg V; Cox, Kristen; Adler, Douglas G; Shrieve, Dennis C; Salter, Bill J

    2013-07-08

    It is the purpose of this study to evaluate how self-expanding stents (SESs) affect esophageal cancer radiation planning target volumes (PTVs) and dose delivered to surrounding organs at risk (OARs). Ten patients were evaluated, for whom a SES was placed before radiation. A computed tomography (CT) scan obtained before stent placement was fused to the post-stent CT simulation scan. Three methods were used to represent pre-stent PTVs: 1) image fusion (IF), 2) volume approximation (VA), and 3) diameter approximation (DA). PTVs and OARs were contoured per RTOG 1010 protocol using Eclipse Treatment Planning software. Post-stent dosimetry for each patient was compared to approximated pre-stent dosimetry. For each of the three pre-stent approximations (IF, VA, and DA), the mean lung and liver doses and the estimated percentages of lung volumes receiving 5 Gy, 10 Gy, 20 Gy, and 30 Gy, and heart volumes receiving 40 Gy were significantly lower (p-values < 0.02) than those estimated in the post-stent treatment plans. The lung V5, lung V10, and heart V40 constraints were achieved more often using our pre-stent approximations. Esophageal SES placement increases the dose delivered to the lungs, heart, and liver. This may have clinical importance, especially when the dose-volume constraints are near the recommended thresholds, as was the case for lung V5, lung V10, and heart V40. While stents have established benefits for treating patients with significant dysphagia, physicians considering stent placement and radiation therapy must realize the effects stents can have on the dosimetry.

  7. Green's function methods in heavy ion shielding

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Costen, Robert C.; Shinn, Judy L.; Badavi, Francis F.

    1993-01-01

    An analytic solution to the heavy ion transport in terms of Green's function is used to generate a highly efficient computer code for space applications. The efficiency of the computer code is accomplished by a nonperturbative technique extending Green's function over the solution domain. The computer code can also be applied to accelerator boundary conditions to allow code validation in laboratory experiments.

  8. Analytical modeling of operating characteristics of premixing-prevaporizing fuel-air mixing passages. Volume 2: User's manual

    NASA Technical Reports Server (NTRS)

    Anderson, O. L.; Chiappetta, L. M.; Edwards, D. E.; Mcvey, J. B.

    1982-01-01

    A user's manual describing the operation of three computer codes (ADD code, PTRAK code, and VAPDIF code) is presented. The general features of the computer codes, the input/output formats, run streams, and sample input cases are described.

  9. Agent-Based Computational Modeling of Cell Culture: Understanding Dosimetry In Vitro as Part of In Vitro to In Vivo Extrapolation

    EPA Science Inventory

    Quantitative characterization of cellular dose in vitro is needed for alignment of doses in vitro and in vivo. We used the agent-based software, CompuCell3D (CC3D), to provide a stochastic description of cell growth in culture. The model was configured so that isolated cells assu...

  10. Dosimetric characterization of the M−15 high‐dose‐rate Iridium−192 brachytherapy source using the AAPM and ESTRO formalism

    PubMed Central

    Thanh, Minh‐Tri Ho; Munro, John J.

    2015-01-01

    The Source Production & Equipment Co. (SPEC) model M−15 is a new Iridium−192 brachytherapy source model intended for use as a temporary high‐dose‐rate (HDR) brachytherapy source for the Nucletron microSelectron Classic afterloading system. The purpose of this study is to characterize this HDR source for clinical application by obtaining a complete set of Monte Carlo calculated dosimetric parameters for the M‐15, as recommended by the AAPM and ESTRO for isotopes with average energies greater than 50 keV. This was accomplished by using the MCNP6 Monte Carlo code to simulate the source dosimetry at various points within a pseudoinfinite water phantom. These dosimetric values were then converted into the AAPM and ESTRO dosimetry parameters, and the respective statistical uncertainty in each parameter was also calculated and presented. The M−15 source was modeled in an MCNP6 Monte Carlo environment using the physical source specifications provided by the manufacturer. Iridium−192 photons were generated uniformly inside the iridium core of the model M−15, with photon and secondary electron transport replicated using photoatomic cross‐sectional tables supplied with MCNP6. Simulations were performed for both water and air/vacuum computer models, with a total of 4×10⁹ source photon histories for each simulation and the in‐air photon spectrum filtered to remove low‐energy photons below the cutoff δ = 10 keV. Dosimetric data, including D(r,θ), g_L(r), F(r,θ), Φ_an(r), and φ̄_an, and their statistical uncertainties were calculated from the output of an MCNP model consisting of an M−15 source placed at the center of a spherical water phantom of 100 cm diameter. The air kerma strength in free space, S_K, and the dose rate constant, Λ, were also computed from an MCNP model in which the M−15 Iridium−192 source was centered at the origin of an evacuated phantom and a critical volume containing air at STP was placed 100 cm from the source center. 
The reference dose rate, D˙(r0,θ0) ≡ D˙(1 cm, π/2), is found to be 4.038±0.064 cGy mCi−1 h−1. The air kerma strength, S_K, is reported to be 3.632±0.086 cGy cm2 mCi−1 h−1, and the dose rate constant, Λ, is calculated to be 1.112±0.029 cGy h−1 U−1. The normalized dose rate, radial dose function, and anisotropy function, with their uncertainties, were computed and are presented in both tabular and graphical format in the report. A dosimetric study was performed of the new M−15 Iridium−192 HDR brachytherapy source using the MCNP6 radiation transport code. Dosimetric parameters, including the dose‐rate constant, radial dose function, and anisotropy function, were calculated in accordance with the updated AAPM and ESTRO recommendations for brachytherapy sources of average energy greater than 50 keV. These data therefore may be applied toward the development of a treatment planning program and for clinical use of the source. PACS numbers: 87.56.bg, 87.53.Jw PMID:26103489
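The tabulated quantities above combine through the standard AAPM TG-43 dose-rate equation. The sketch below is a hedged illustration using the point-source geometry function and placeholder parameter values; only the dose-rate constant Λ = 1.112 cGy h⁻¹ U⁻¹ is taken from the abstract, for context. The actual study uses the full line-source geometry function and tabulated g_L(r) and F(r,θ) values.

```python
def geometry_point(r):
    """Point-source geometry function G_P(r) = 1/r^2 (cm^-2)."""
    return 1.0 / r ** 2

def dose_rate(Sk, Lam, r, g_L, F, r0=1.0):
    """TG-43 dose rate: D(r,theta) = Sk * Lambda * [G(r)/G(r0)] * g(r) * F(r,theta).

    Sk  : air-kerma strength (U = cGy cm^2 h^-1)
    Lam : dose-rate constant (cGy h^-1 U^-1)
    g_L : radial dose function value at r (dimensionless, g(r0) = 1)
    F   : anisotropy function value (dimensionless, F(r, pi/2) = 1)
    """
    return Sk * Lam * (geometry_point(r) / geometry_point(r0)) * g_L * F

# At the reference point (r0 = 1 cm, theta = pi/2), where g = F = 1, the
# equation collapses to Sk * Lambda:
print(dose_rate(Sk=1.0, Lam=1.112, r=1.0, g_L=1.0, F=1.0))  # 1.112 cGy/h per U
```

Away from the reference point, the 1/r² geometry factor dominates the fall-off, with g_L(r) and F(r,θ) supplying the attenuation/scatter and anisotropy corrections.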

  11. A practical three-dimensional dosimetry system for radiation therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo Pengyi; Adamovics, John; Oldham, Mark

    2006-10-15

    There is a pressing need for a practical three-dimensional (3D) dosimetry system, convenient for clinical use, and with the accuracy and resolution to enable comprehensive verification of the complex dose distributions typical of modern radiation therapy. Here we introduce a dosimetry system that can achieve this challenge, consisting of a radiochromic dosimeter (PRESAGE™) and a commercial optical computed tomography (CT) scanning system (OCTOPUS™). PRESAGE™ is a transparent material with compelling properties for dosimetry, including insensitivity of the dose response to atmospheric exposure, a solid texture negating the need for an external container (reducing edge effects), and amenability to accurate optical CT scanning due to radiochromic optical contrast as opposed to light-scattering contrast. An evaluation of the performance and viability of the PRESAGE™/OCTOPUS combination for routine clinical 3D dosimetry is presented. The performance of the two components (scanner and dosimeter) was investigated separately prior to the full system test. The optical CT scanner has a spatial resolution of ≤1 mm, geometric accuracy within 1 mm, and high reconstruction linearity (with an R² value of 0.9979 and a standard error of estimation of ~1%) relative to independent measurement. The overall performance of the PRESAGE™/OCTOPUS system was evaluated with respect to a simple known 3D dose distribution, by comparison with GAFCHROMIC® EBT film and the calculated dose from a commissioned planning system. The 'measured' dose distribution in a cylindrical PRESAGE™ dosimeter (16 cm diameter and 11 cm height) was determined by optical CT, using a filtered backprojection reconstruction algorithm. 
A three-way gamma map comparison (4% dose difference and 4 mm distance to agreement) between the PRESAGE™, EBT, and calculated dose distributions showed full agreement in the measurable region of the PRESAGE™ dosimeter (~90% of the radius). The EBT and PRESAGE™ distributions agreed more closely with each other than with the calculated plan, consistent with penumbral blurring in the planning data, which was acquired with an ion chamber. In summary, our results support the conclusion that the PRESAGE™/optical-CT combination represents a significant step forward in 3D dosimetry, and provides a robust, clinically effective and viable high-resolution relative 3D dosimetry system for radiation therapy.
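The gamma-map comparison used above can be illustrated with a minimal brute-force implementation. This is not the authors' code: it evaluates the generalized gamma index on a toy 1D profile with the study's 4%/4 mm criteria; the Gaussian profiles and grid spacing are assumptions, and 3D gamma maps apply the same formula over a 3D search neighbourhood.

```python
import numpy as np

def gamma_1d(ref, ref_x, eval_dose, eval_x, dd=0.04, dta=4.0):
    """Return the gamma value at each reference point.

    dd  : dose-difference criterion as a fraction of the reference maximum
    dta : distance-to-agreement criterion (same units as x, here mm)
    """
    d_norm = dd * ref.max()
    gam = np.empty_like(ref, dtype=float)
    for i, (d_r, x_r) in enumerate(zip(ref, ref_x)):
        # Minimize the combined dose-difference / distance metric over all
        # evaluated points (brute force; real codes restrict the search radius)
        candidates = np.sqrt(((eval_dose - d_r) / d_norm) ** 2
                             + ((eval_x - x_r) / dta) ** 2)
        gam[i] = candidates.min()
    return gam   # gamma <= 1 means the point passes

x = np.linspace(0, 100, 201)             # positions in mm, 0.5 mm grid
ref = np.exp(-((x - 50) / 20) ** 2)      # toy Gaussian reference profile
shifted = np.exp(-((x - 52) / 20) ** 2)  # 2 mm shift, within the 4 mm DTA
g = gamma_1d(ref, x, shifted, x)
print((g <= 1.0).mean())                 # pass rate: the 2 mm shift passes
```

Because the 2 mm spatial shift is half the DTA criterion, every point finds an agreeing dose within the search, so the whole profile passes.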

  12. Performance evaluation of an improved optical computed tomography polymer gel dosimeter system for 3D dose verification of static and dynamic phantom deliveries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lopatiuk-Tirpak, O.; Langen, K. M.; Meeks, S. L.

    2008-09-15

    The performance of a next-generation optical computed tomography scanner (OCTOPUS-5X) is characterized in the context of three-dimensional gel dosimetry. Large-volume (2.2 L), muscle-equivalent, radiation-sensitive polymer gel dosimeters (BANG-3) were used. Improvements in scanner design leading to shorter acquisition times are discussed. The spatial resolution, detectable absorbance range, and reproducibility are assessed. An efficient method for calibrating gel dosimeters using the depth-dose relationship is applied, with photon- and electron-based deliveries yielding equivalent results. A procedure involving a preirradiation scan was used to reduce the edge artifacts in reconstructed images, thereby increasing the useful cross-sectional area of the dosimeter by nearly a factor of 2. Dose distributions derived from optical density measurements using the calibration coefficient show good agreement with the treatment planning system simulations and radiographic film measurements. The feasibility of use for motion (four-dimensional) dosimetry is demonstrated on an example comparing dose distributions from static and dynamic delivery of a single-field photon plan. The capability to visualize three-dimensional dose distributions is also illustrated.
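The depth-dose calibration method mentioned above can be sketched as a simple regression. All numbers below (the percentage-depth-dose values and the 0.05 OD/Gy response) are invented for illustration; the idea is only that the optical density change along the beam axis is regressed against the known dose at each depth, yielding a single calibration coefficient without a separate set of calibration vials.

```python
import numpy as np

depth = np.array([2.0, 5.0, 10.0, 15.0, 20.0])    # cm along the beam axis
pdd = np.array([100.0, 91.0, 72.0, 57.0, 45.0])   # % of Dmax (illustrative)
dmax_dose = 2.0                                   # Gy delivered at Dmax
dose = dmax_dose * pdd / 100.0                    # known dose at each depth (Gy)

# Simulated optical density readings: assumed linear response, 0.05 OD per Gy
delta_od = 0.05 * dose

# Linear fit OD = k * D + b; slope k is the calibration coefficient (OD/Gy)
k, b = np.polyfit(dose, delta_od, 1)
print(round(k, 4))   # recovers the assumed 0.05 OD/Gy
```

In practice the measured delta_od values carry noise and the intercept b absorbs any background optical density, which is why a fit over many depths is preferable to a single-point calibration.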

  13. Evaluation of six TPS algorithms in computing entrance and exit doses.

    PubMed

    Tan, Yun I; Metwaly, Mohamed; Glegg, Martin; Baggarley, Shaun; Elliott, Alex

    2014-05-08

    Entrance and exit doses are commonly measured in in vivo dosimetry for comparison with expected values, usually generated by the treatment planning system (TPS), to verify accuracy of treatment delivery. This report aims to evaluate the accuracy of six TPS algorithms in computing entrance and exit doses for a 6 MV beam. The algorithms tested were: pencil beam convolution (Eclipse PBC), analytical anisotropic algorithm (Eclipse AAA), AcurosXB (Eclipse AXB), FFT convolution (XiO Convolution), multigrid superposition (XiO Superposition), and Monte Carlo photon (Monaco MC). Measurements with ionization chamber (IC) and diode detector in water phantoms were used as a reference. Comparisons were done in terms of central axis point dose, 1D relative profiles, and 2D absolute gamma analysis. Entrance doses computed by all TPS algorithms agreed to within 2% of the measured values. Exit doses computed by XiO Convolution, XiO Superposition, Eclipse AXB, and Monaco MC agreed with the IC measured doses to within 2%-3%. Meanwhile, Eclipse PBC and Eclipse AAA computed exit doses were higher than the IC measured doses by up to 5.3% and 4.8%, respectively. Both algorithms assume that full backscatter exists even at the exit level, leading to an overestimation of exit doses. Despite good agreements at the central axis for Eclipse AXB and Monaco MC, 1D relative comparisons showed profiles mismatched at depths beyond 11.5 cm. Overall, the 2D absolute gamma (3%/3 mm) pass rates were better for Monaco MC, while Eclipse AXB failed mostly at the outer 20% of the field area. The findings of this study serve as a useful baseline for the implementation of entrance and exit in vivo dosimetry in clinical departments utilizing any of these six common TPS algorithms for reference comparison.

  14. Automated apparatus and method of generating native code for a stitching machine

    NASA Technical Reports Server (NTRS)

    Miller, Jeffrey L. (Inventor)

    2000-01-01

    A computer system automatically generates CNC code for a stitching machine. The computer determines the locations of a present stitching point and a next stitching point. If a constraint is not found between the present stitching point and the next stitching point, the computer generates code for making a stitch at the next stitching point. If a constraint is found, the computer generates code for changing a condition (e.g., direction) of the stitching machine's stitching head.
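The generation logic described in the patent abstract might be sketched as follows. The `G01`/`M50` mnemonics and the constraint representation are hypothetical, invented for illustration; the patent does not specify the CNC dialect.

```python
def generate_stitch_code(points, constraints):
    """Emit CNC-style commands for a stitch path.

    points      : list of (x, y) stitching points, in order
    constraints : set of (i, j) index pairs marking segments that are blocked,
                  requiring a head-condition change before stitching
    """
    code = []
    for i in range(len(points) - 1):
        x, y = points[i + 1]
        if (i, i + 1) in constraints:
            # Constraint found between present and next point: change the
            # stitching head's condition (e.g. direction) first
            code.append("M50 ; reorient stitching head")
        code.append(f"G01 X{x:.1f} Y{y:.1f} ; stitch")
    return code

# A constraint lies between the second and third points, so a head change
# is emitted before the final stitch:
prog = generate_stitch_code([(0, 0), (10, 0), (10, 5)], {(1, 2)})
print("\n".join(prog))
```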

  15. Computer codes developed and under development at Lewis

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1992-01-01

    The objective of this summary is to provide a brief description of: (1) codes developed or under development at LeRC; and (2) the development status of IPACS with some typical early results. The computer codes that have been developed and/or are under development at LeRC are listed in the accompanying charts. This list includes: (1) the code acronym; (2) select physics descriptors; (3) current enhancements; and (4) present (9/91) code status with respect to its availability and documentation. The computer codes list is grouped by related functions such as: (1) composite mechanics; (2) composite structures; (3) integrated and 3-D analysis; (4) structural tailoring; and (5) probabilistic structural analysis. These codes provide a broad computational simulation infrastructure (technology base-readiness) for assessing the structural integrity/durability/reliability of propulsion systems. These codes serve two other very important functions: they provide an effective means of technology transfer; and they constitute a depository of corporate memory.

  16. Laboratory Services Guide

    DTIC Science & Technology

    1994-10-01

    dosimetry services using thermoluminescent dosimeters (TLDs) to meet 10 CFR 19, 20, 30-36, 40 and 70; to provide dosimetry service for environmental... USAF Personnel Dosimetry Branch. Once it is determined that area or external dosimetry is necessary, request the number of TLDs required by FAX or letter... dosimetry. Request TLDs 2-4 weeks in advance and always designate a control badge. The Radiation Dosimetry Branch thanks you in advance for doing everything

  17. Reactivity effects in VVER-1000 of the third unit of the kalinin nuclear power plant at physical start-up. Computations in ShIPR intellectual code system with library of two-group cross sections generated by UNK code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zizin, M. N.; Zimin, V. G.; Zizina, S. N., E-mail: zizin@adis.vver.kiae.ru

    2010-12-15

    The ShIPR intellectual code system for mathematical simulation of nuclear reactors includes a set of computing modules implementing the preparation of macro cross sections on the basis of the two-group library of neutron-physics cross sections obtained for the SKETCH-N nodal code. This library is created by using the UNK code for 3D diffusion computation of first VVER-1000 fuel loadings. Computation of neutron fields in the ShIPR system is performed using the DP3 code in the two-group diffusion approximation in 3D triangular geometry. The efficiency of all groups of control rods for the first fuel loading of the third unit of the Kalinin Nuclear Power Plant is computed. The temperature, barometric, and density effects of reactivity as well as the reactivity coefficient due to the concentration of boric acid in the reactor were computed additionally. Results of computations are compared with the experiment.

  18. Reactivity effects in VVER-1000 of the third unit of the kalinin nuclear power plant at physical start-up. Computations in ShIPR intellectual code system with library of two-group cross sections generated by UNK code

    NASA Astrophysics Data System (ADS)

    Zizin, M. N.; Zimin, V. G.; Zizina, S. N.; Kryakvin, L. V.; Pitilimov, V. A.; Tereshonok, V. A.

    2010-12-01

    The ShIPR intellectual code system for mathematical simulation of nuclear reactors includes a set of computing modules implementing the preparation of macro cross sections on the basis of the two-group library of neutron-physics cross sections obtained for the SKETCH-N nodal code. This library is created by using the UNK code for 3D diffusion computation of first VVER-1000 fuel loadings. Computation of neutron fields in the ShIPR system is performed using the DP3 code in the two-group diffusion approximation in 3D triangular geometry. The efficiency of all groups of control rods for the first fuel loading of the third unit of the Kalinin Nuclear Power Plant is computed. The temperature, barometric, and density effects of reactivity as well as the reactivity coefficient due to the concentration of boric acid in the reactor were computed additionally. Results of computations are compared with the experiment.

  19. SU-E-T-29: A Web Application for GPU-Based Monte Carlo IMRT/VMAT QA with Delivered Dose Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Folkerts, M; University of California, San Diego, La Jolla, CA; Graves, Y

    Purpose: To enable an existing web application for GPU-based Monte Carlo (MC) 3D dosimetry quality assurance (QA) to compute “delivered dose” from linac logfile data. Methods: We added significant features to an IMRT/VMAT QA web application which is based on existing technologies (HTML5, Python, and Django). This tool interfaces with Python, C-code libraries, and command line-based GPU applications to perform MC-based IMRT/VMAT QA. The web app automates many complicated aspects of interfacing clinical DICOM and logfile data with cutting-edge GPU software to run a MC dose calculation. The resultant web app is powerful, easy to use, and is able to re-compute both plan dose (from DICOM data) and delivered dose (from logfile data). Both dynalog and trajectorylog file formats are supported. Users upload zipped DICOM RP, CT, and RD data and set the expected statistical uncertainty for the MC dose calculation. A 3D gamma index map, 3D dose distribution, gamma histogram, dosimetric statistics, and DVH curves are displayed to the user. Additionally, the user may upload the delivery logfile data from the linac to compute a 'delivered dose' calculation and corresponding gamma tests. A comprehensive PDF QA report summarizing the results can also be downloaded. Results: We successfully improved a web app for a GPU-based QA tool that consists of logfile parsing, fluence map generation, CT image processing, GPU-based MC dose calculation, gamma index calculation, and DVH calculation. The result is an IMRT and VMAT QA tool that conducts an independent dose calculation for a given treatment plan and delivery log file. The system takes both DICOM data and logfile data to compute plan dose and delivered dose, respectively. Conclusion: We successfully improved a GPU-based MC QA tool to allow for logfile dose calculation. The high efficiency and accessibility will greatly facilitate IMRT and VMAT QA.

  20. Users manual and modeling improvements for axial turbine design and performance computer code TD2-2

    NASA Technical Reports Server (NTRS)

    Glassman, Arthur J.

    1992-01-01

    Computer code TD2 computes design point velocity diagrams and performance for multistage, multishaft, cooled or uncooled, axial flow turbines. This streamline analysis code was recently modified to upgrade modeling related to turbine cooling and to the internal loss correlation. These modifications are presented in this report along with descriptions of the code's expanded input and output. This report serves as the users manual for the upgraded code, which is named TD2-2.

  1. An Object-Oriented Approach to Writing Computational Electromagnetics Codes

    NASA Technical Reports Server (NTRS)

    Zimmerman, Martin; Mallasch, Paul G.

    1996-01-01

    Presently, most computer software development in the Computational Electromagnetics (CEM) community employs the structured programming paradigm, particularly using the Fortran language. Other segments of the software community began switching to an Object-Oriented Programming (OOP) paradigm in recent years to help ease design and development of highly complex codes. This paper examines design of a time-domain numerical analysis CEM code using the OOP paradigm, comparing OOP code and structured programming code in terms of software maintenance, portability, flexibility, and speed.

  2. Evaluation and characterization of fetal exposures to low frequency magnetic fields generated by laptop computers.

    PubMed

    Zoppetti, Nicola; Andreuccetti, Daniele; Bellieni, Carlo; Bogi, Andrea; Pinto, Iole

    2011-12-01

    Portable - or "laptop" - computers (LCs) are widely and increasingly used all over the world. Since LCs are often used in tight contact with the body even by pregnant women, fetal exposures to low frequency magnetic fields generated by these units can occur. LC emissions are usually characterized by complex waveforms and are often generated by the main AC power supply (when connected) and by the display power supply sub-system. In the present study, low frequency magnetic field emissions were measured for a set of five models of portable computers. For each of them, the magnetic flux density was characterized in terms not just of field amplitude, but also of the so called "weighted peak" (WP) index, introduced in the 2003 ICNIRP Statement on complex waveforms and confirmed in the 2010 ICNIRP Guidelines for low frequency fields. For the model of LC presenting the higher emission, a deeper analysis was also carried out, using numerical dosimetry techniques to calculate internal quantities (current density and in-situ electric field) with reference to a digital body model of a pregnant woman. Since internal quantities have complex waveforms too, the concept of WP index was extended to them, considering the ICNIRP basic restrictions defined in the 1998 Guidelines for the current density and in the 2010 Guidelines for the in-situ electric field. Induced quantities and WP indexes were computed using an appropriate original formulation of the well known Scalar Potential Finite Difference (SPFD) numerical method for electromagnetic dosimetry in quasi-static conditions. Copyright © 2011 Elsevier Ltd. All rights reserved.
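The weighted-peak evaluation can be sketched in a simplified frequency-domain form. Note this toy version uses a flat, purely real weighting against an assumed constant 100 μT limit, and omits the frequency-dependent limits and filter phase that the ICNIRP 2010 Guidelines actually specify; it only illustrates the weight-each-spectral-component-then-take-the-peak structure of the index.

```python
import numpy as np

def weighted_peak(signal, fs, limit):
    """Simplified WP index: weight each spectral component of B(t) by the
    reciprocal of the exposure limit at its frequency, reconstruct the
    weighted waveform, and take its peak.

    signal : sampled magnetic flux density B(t), in tesla
    fs     : sampling rate in Hz
    limit  : callable giving the exposure limit (tesla) at frequencies f
    """
    spec = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    w = np.zeros_like(freqs)
    w[1:] = 1.0 / limit(freqs[1:])          # weight = 1 / limit(f); drop DC
    weighted = np.fft.irfft(spec * w, n=signal.size)
    return np.abs(weighted).max()           # WP <= 1 means compliant

fs = 10000.0
t = np.arange(0, 0.1, 1.0 / fs)
b = 50e-6 * np.sin(2 * np.pi * 50 * t)      # 50 uT sine at 50 Hz
wp = weighted_peak(b, fs, limit=lambda f: 100e-6 * np.ones_like(f))
print(round(wp, 3))   # 0.5: a 50 uT sine against an (assumed) 100 uT limit
```

For a single sinusoid this reduces to amplitude divided by limit; the value of the WP formulation is that it remains meaningful for the complex, multi-harmonic waveforms the study measured.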

  3. Computer Description of the Field Artillery Ammunition Supply Vehicle

    DTIC Science & Technology

    1983-04-01

    Combinatorial Geometry (COM-GEOM), GIFT Computer Code, Computer Target Description... input to the GIFT computer code to generate target vulnerability data... Combinatorial Geometry (COM-GEOM) description. The "Geometric Information for Targets" (GIFT) computer code accepts the COM-GEOM description and

  4. 48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... or will be developed exclusively with Government funds; (ii) Studies, analyses, test data, or similar...

  5. 48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... or will be developed exclusively with Government funds; (ii) Studies, analyses, test data, or similar...

  6. 48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... or will be developed exclusively with Government funds; (ii) Studies, analyses, test data, or similar...

  7. 48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... developed exclusively with Government funds; (ii) Studies, analyses, test data, or similar data produced for...

  8. Antenna pattern study, task 2

    NASA Technical Reports Server (NTRS)

    Harper, Warren

    1989-01-01

    Two electromagnetic scattering codes, NEC-BSC and ESP3, were delivered and installed on a NASA VAX computer for use by Marshall Space Flight Center antenna design personnel. The existing codes and certain supplementary software were updated; the codes were installed on a computer to be delivered to the customer, to provide capability for graphic display of the data computed by the codes and to assist the customer in the solution of specific problems that demonstrate the use of the codes. With the exception of one code revision, all of these tasks were performed.

  9. High-performing simulations of the space radiation environment for the International Space Station and Apollo Missions

    NASA Astrophysics Data System (ADS)

    Lund, Matthew Lawrence

    The space radiation environment is a significant challenge to future manned and unmanned space travel. Future missions will rely increasingly on accurate simulations of radiation transport in space and through spacecraft to predict astronaut dose and energy deposition within spacecraft electronics. The International Space Station provides long-term measurements of the radiation environment in Low Earth Orbit (LEO); however, only the Apollo missions provided dosimetry data beyond LEO. Thus, dosimetry analysis for deep space missions is poorly supported by currently available data, and there is a need to develop dosimetry-predicting models for extended deep space missions. GEANT4, a Monte Carlo toolkit written in C++, provides a powerful framework for simulating radiation transport in arbitrary media, including spacecraft. The newest version of GEANT4 supports multithreading and MPI, resulting in faster distributed processing of simulations in high-performance computing clusters. This thesis introduces a new application based on GEANT4 that greatly reduces computational time, using the Kingspeak and Ember computational clusters at the Center for High Performance Computing (CHPC) to simulate radiation transport through full spacecraft geometry, reducing simulation time to hours instead of weeks without post-simulation processing. Additionally, this thesis introduces a new set of detectors besides the historically used International Commission on Radiation Units (ICRU) spheres for calculating dose distribution, including a thermoluminescent detector (TLD), a tissue-equivalent proportional counter (TEPC), and a human phantom, combined with a series of new primitive scorers in GEANT4 to calculate dose equivalent based on International Commission on Radiological Protection (ICRP) standards. 
The models developed in this thesis predict dose deposition in the International Space Station and during the Apollo missions, showing good agreement with experimental measurements. According to these models, the greatest contributor to radiation dose for the Apollo missions was Galactic Cosmic Rays, owing to the short time spent within the radiation belts. The Apollo 14 dose measurements were an order of magnitude higher than those of the other Apollo missions. The GEANT4 model of the Apollo Command Module shows consistent doses from Galactic Cosmic Rays and the radiation belts for all missions, with a small variation in dose distribution across the capsule. The model also predicts well the dose deposition and equivalent dose values in various human organs for the International Space Station and the Apollo Command Module.

  10. Physical models, cross sections, and numerical approximations used in MCNP and GEANT4 Monte Carlo codes for photon and electron absorbed fraction calculation.

    PubMed

    Yoriyaz, Hélio; Moralles, Maurício; Siqueira, Paulo de Tarso Dalledone; Guimarães, Carla da Costa; Cintra, Felipe Belonsi; dos Santos, Adimir

    2009-11-01

Radiopharmaceutical applications in nuclear medicine require a detailed dosimetric estimate of the radiation energy delivered to human tissues. Over the past years, several publications have addressed the problem of internal dose estimation in volumes of several sizes, considering photon and electron sources. Most of them used Monte Carlo radiation transport codes. Despite the widespread use of these codes, owing to the variety of resources and capabilities they offer for dose calculations, several aspects such as physical models, cross sections, and numerical approximations used in the simulations remain objects of study. Accurate dose estimation depends on the correct selection of a set of simulation options that must be carefully chosen. This article presents an analysis of several simulation options provided by two of the most widely used codes: MCNP and GEANT4. For this purpose, comparisons of absorbed fraction estimates obtained with different physical models, cross sections, and numerical approximations are presented for spheres of several sizes composed of five different biological tissues. Considerable discrepancies have been found in some cases, not only between the different codes but also between different cross sections and algorithms within the same code. Maximum differences found between the two codes are 5.0% and 10% for photons and electrons, respectively. Even for problems as simple as spheres with uniform radiation sources, the set of parameters chosen in any Monte Carlo code significantly affects the final results of a simulation, demonstrating the importance of choosing simulation parameters correctly.
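The absorbed-fraction quantity compared in the study above can be illustrated with a toy Monte Carlo calculation. The sketch below is an illustrative simplification (not the MCNP or GEANT4 physics): photons start at the centre of a sphere and are assumed to deposit all their energy locally at the first interaction site.

```python
import math
import random

def absorbed_fraction_sphere(mu, radius_cm, n_photons=200_000, seed=1):
    """Toy Monte Carlo estimate of the photon absorbed fraction in a
    sphere with the source at its centre. Assumes each photon deposits
    all of its energy at its first interaction site (a crude
    local-deposition approximation, far simpler than real MCNP/GEANT4
    transport). mu is the total linear attenuation coefficient (1/cm)."""
    rng = random.Random(seed)
    absorbed = 0
    for _ in range(n_photons):
        # Sample the free path from the exponential attenuation law;
        # 1 - random() lies in (0, 1], avoiding log(0).
        s = -math.log(1.0 - rng.random()) / mu
        if s < radius_cm:
            absorbed += 1
    return absorbed / n_photons
```

Under this local-deposition assumption the analytic answer is 1 - exp(-mu·R), so the Monte Carlo estimate for mu = 0.2 cm⁻¹ and R = 5 cm should fall near 0.632; the discrepancies discussed in the article arise precisely where such simplifications are replaced by different physical models and cross-section sets.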

  11. SU-E-T-493: Accelerated Monte Carlo Methods for Photon Dosimetry Using a Dual-GPU System and CUDA.

    PubMed

    Liu, T; Ding, A; Xu, X

    2012-06-01

To develop a Graphics Processing Unit (GPU) based Monte Carlo (MC) code that accelerates dose calculations on a dual-GPU system. We simulated a clinical case of prostate cancer treatment. A voxelized abdomen phantom derived from 120 CT slices, containing 218×126×60 voxels, was used, and a GE LightSpeed 16-MDCT scanner was modeled. A CPU version of the MC code was first developed in C++ and tested on an Intel Xeon X5660 2.8 GHz CPU; it was then ported to the GPU using CUDA C 4.1 and run on a dual Tesla M2090 GPU system. The code featured automatic assignment of simulation tasks to multiple GPUs, as well as accurate calculation of energy- and material-dependent cross sections. Double-precision floating-point format was used for accuracy. Doses to the rectum, prostate, bladder, and femoral heads were calculated. When running on a single GPU, the MC GPU code was found to be 19 times faster than the CPU code and 42 times faster than MCNPX. These speedup factors were doubled on the dual-GPU system. The dose results were benchmarked against MCNPX, and a maximum difference of 1% was observed when the relative error was kept below 0.1%. A GPU-based MC code was developed for dose calculations using detailed patient and CT scanner models. Both efficiency and accuracy were achieved in this code, and its scalability was confirmed on the dual-GPU system. © 2012 American Association of Physicists in Medicine.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Danielle; Siegbahn, Albert; Fallone, Gin

Purpose: The BioMedical Imaging and Therapy (BMIT) beamlines at the Canadian Light Source offer the opportunity to investigate novel imaging and therapy applications of synchrotron radiation. A necessary component in advancing this research, and in progressing toward clinical applications, is the availability of accurate dosimetry that is traceable to a standards institution. However, dosimetry in this setting is challenging: these beams are typically small, non-uniform, and highly intense. This work describes air kerma rate measurements on a BMIT beamline using a free-air ionization chamber (FAC). Methods: The measurements were taken at the 05B1-1 beamline (∼8-100 keV) for several beam qualities with mean energies between 20.0 and 84.0 keV. The Victoreen Model 480 cylindrical FAC, with a specially fabricated 0.52 mm diameter aperture, was used to measure air kerma rates. The required correction factors were determined using a variety of methods: tabulated data, measurements, theoretical calculations, and Monte Carlo simulations (EGSnrc user code egs-fac). Results: The measured air kerma rates ranged from 0.270 Gy/min (±13.6%) to 312 Gy/min (±2.7%). At lower energies (low filtration), the most significant correction factors were those for ion recombination and for x-ray attenuation. Conclusions: These measurements marked the first absolute dosimetry performed at the BMIT beamlines. The experimental and Monte Carlo methods developed will allow air kerma rates to be measured under other experimental conditions, provide a benchmark against which other dosimeters can be compared, and provide a reference for imaging and therapy research programs on this beamline.
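The free-air chamber formalism behind such measurements multiplies the ionization signal by a chain of correction factors. A minimal sketch follows; the correction names and values are hypothetical, not those of the Victoreen Model 480, and the radiative-loss term (1 - g) is ignored.

```python
def air_kerma_rate(current_pA, air_mass_kg, corrections):
    """Free-air chamber air kerma rate in Gy/s, a minimal sketch:
        Kdot = (I / m_air) * (W/e) * product(k_i)
    where I is the ionization current, m_air the mass of air in the
    measuring volume, W/e = 33.97 J/C the mean energy expended per unit
    charge in dry air, and k_i the correction factors (e.g. ion
    recombination, attenuation). The radiative-loss term (1 - g),
    negligible at these photon energies, is omitted."""
    W_OVER_E = 33.97  # J/C, dry air
    current_A = current_pA * 1e-12
    k_product = 1.0
    for k in corrections.values():
        k_product *= k
    return (current_A / air_mass_kg) * W_OVER_E * k_product
```

For example, `air_kerma_rate(100.0, 1e-5, {"recombination": 1.01, "attenuation": 1.02})` returns the corrected kerma rate in Gy/s for a 100 pA signal.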

  13. Dose Enhancement near Metal Interfaces in Synthetic Diamond Based X-ray Dosimeters

    NASA Astrophysics Data System (ADS)

    Alamoudi, Dalal

Diamond is an attractive material for medical dosimetry due to its radiation hardness, fast response, chemical resilience, small sensitive volume, high spatial resolution, near-tissue equivalence, and energy and dose-rate independence. These properties make diamond a promising material for medical dosimetry compared with other semiconductor detector materials, and for wider radiation detection applications. This study focuses on one important factor in radiation detector design: the influence of dose enhancement at metallic interfaces on the photocurrent performance of synthetic diamond radiation dosimeters with carbon-based electrodes, as a function of bias voltage. Monte Carlo (MC) simulations with the BEAMnrc code were carried out to calculate the dose enhancement factor (DEF), which was compared against the equivalent photocurrent ratio from experimental investigation. The MC simulations show that the sensitive region of the absorbed dose distribution extends only a few micrometers from the interface. Experimentally, two single-crystal (SC) samples and one polycrystalline (PC) sample with carbon-based electrodes were used. The samples were each mounted inside a tissue-equivalent encapsulation designed to minimize fluence perturbations. Copper, gold, and lead were investigated experimentally as generators of photoelectrons using 50 kVp and 100 kVp X-rays relevant to medical dosimetry. The results show enhancement of the detectors' photocurrent when different metals are placed against the diamond detector. The variation in the photocurrent ratio measurements depends on the type of diamond sample, its electrode fabrication, and the applied bias voltage, indicating that the dose enhancement at the diamond-metal interface modifies the electronic performance of the detector.

  14. DRDC Ottawa Participation in the SILENE Accident Dosimetry Intercomparison Exercise. June 10-21, 2002

    DTIC Science & Technology

    2002-11-01

of CaF2:Mn and Al2O3 TLDs for gamma-ray dosimetry). In addition, DRDC Ottawa has recently substantially expanded its efforts in radiation dosimetry ... use of any real-time electronic dosimeter. Foils have long been proposed and used for criticality dosimetry (as well as for general monitoring of ... ray Dosimetry. DRDC Ottawa offers a number (over five) of thermoluminescence dosimetry (TLD) systems. The choice of any particular TLD depends

  15. Hybrid computational phantoms representing the reference adult male and adult female: construction and applications for retrospective dosimetry.

    PubMed

    Hurtado, Jorge L; Lee, Choonsik; Lodwick, Daniel; Goede, Timothy; Williams, Jonathan L; Bolch, Wesley E

    2012-03-01

Currently, two classes of computational phantoms have been developed for dosimetry calculation: (1) stylized (or mathematical) and (2) voxel (or tomographic) phantoms, describing human anatomy through mathematical surface equations and 3D voxel matrices, respectively. Mathematical surface equations in stylized phantoms are flexible, but the resulting anatomy is not very realistic. Voxel phantoms display far better anatomical realism, but they are limited in their ability to alter organ shape, position, and depth, as well as body posture. A new class of computational phantoms, called hybrid phantoms, takes advantage of the best features of stylized and voxel phantoms: flexibility and anatomical realism, respectively. In the current study, hybrid computational phantoms representing the adult male and female reference anatomy and anthropometry are presented. These phantoms serve as the starting framework for creating patient- or worker-sculpted whole-body phantoms for retrospective dose reconstruction. Contours of major organs and tissues were converted or segmented from computed tomography images of a 36-y-old Korean volunteer and a 25-y-old U.S. female patient, respectively, with supplemental high-resolution CT images of the cranium. Polygon mesh models for the major organs and tissues were reconstructed and imported into Rhinoceros™ for non-uniform rational B-spline (NURBS) surface modeling. The resulting NURBS/polygon mesh models representing body contour and internal anatomy were matched to anthropometric data and reference organ mass data provided by the Centers for Disease Control and Prevention and the International Commission on Radiological Protection, respectively. Finally, two hybrid adult male and female phantoms were completed, in which a total of eight anthropometric data categories were matched to standard values within 4% and organ volumes were matched to ICRP data within 1%, with the exception of total skin. 
The hybrid phantoms were voxelized from the NURBS phantoms at resolutions of 0.158 × 0.158 × 0.158 cm and 0.126 × 0.126 × 0.126 cm for the male and female, respectively. To highlight the flexibility of the hybrid phantoms, graphical displays are given of (1) underweight and overweight adult male phantoms, (2) a sitting position for the adult female phantom, and (3) extraction and higher-resolution voxelization of the small intestine for localized dosimetry of mucosal and stem cell layers. These phantoms are used to model radioactively contaminated individuals and to then assess time-dependent detector count rate thresholds corresponding to 50, 250, and 500 mSv effective dose, as might be needed during in-field radiological triage by first responders or first receivers.

  16. 48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... funds; (ii) Studies, analyses, test data, or similar data produced for this contract, when the study...

  17. Parallel Computation of the Jacobian Matrix for Nonlinear Equation Solvers Using MATLAB

    NASA Technical Reports Server (NTRS)

    Rose, Geoffrey K.; Nguyen, Duc T.; Newman, Brett A.

    2017-01-01

Demonstrating speedup for parallel code on a multicore shared-memory PC can be challenging in MATLAB due to underlying parallel operations that are often opaque to the user. This can limit the potential for improvement of serial code, even for so-called embarrassingly parallel applications. One such application is the computation of the Jacobian matrix inherent to most nonlinear equation solvers. Computation of this matrix represents the primary bottleneck in nonlinear solver speed, to the point that commercial finite element (FE) and multi-body-dynamics (MBD) codes attempt to minimize such computations. A timing study using MATLAB's Parallel Computing Toolbox was performed for numerical computation of the Jacobian. Several approaches to implementing parallel code were investigated, but only the single program multiple data (spmd) method using composite objects produced positive results. Parallel code speedup is demonstrated, but the goal of linear speedup through the addition of processors was not achieved, due to PC architecture.
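The embarrassingly parallel structure of the numerical Jacobian comes from the column independence of finite differences. A serial Python sketch (the paper's implementation is in MATLAB) makes this explicit:

```python
import numpy as np

def numerical_jacobian(f, x, h=1e-6):
    """Forward-difference Jacobian J[i, j] = df_i/dx_j. Each column j
    needs only its own perturbed evaluation f(x + h*e_j), so the columns
    are independent and can be farmed out to parallel workers (spmd in
    MATLAB, multiprocessing in Python) with no communication until the
    matrix is assembled. Shown serially here for clarity."""
    x = np.asarray(x, dtype=float)
    f0 = np.asarray(f(x), dtype=float)
    J = np.empty((f0.size, x.size))
    for j in range(x.size):
        xp = x.copy()
        xp[j] += h              # perturb one coordinate at a time
        J[:, j] = (np.asarray(f(xp), dtype=float) - f0) / h
    return J
```

Because each loop iteration touches only its own perturbed copy of `x`, distributing the loop body across workers requires no shared state, which is exactly why the spmd approach with composite objects mapped well onto this problem.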

  18. User Instructions for the Systems Assessment Capability, Rev. 1, Computer Codes Volume 3: Utility Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eslinger, Paul W.; Aaberg, Rosanne L.; Lopresti, Charles A.

    2004-09-14

    This document contains detailed user instructions for a suite of utility codes developed for Rev. 1 of the Systems Assessment Capability. The suite of computer codes for Rev. 1 of Systems Assessment Capability performs many functions.

  19. Calculated and measured brachytherapy dosimetry parameters in water for the Xoft Axxent X-Ray Source: an electronic brachytherapy source.

    PubMed

    Rivard, Mark J; Davis, Stephen D; DeWerd, Larry A; Rusch, Thomas W; Axelrod, Steve

    2006-11-01

A new x-ray source, the model S700 Axxent X-Ray Source (Source), has been developed by Xoft Inc. for electronic brachytherapy. Unlike brachytherapy sources containing radionuclides, this Source may be turned on and off at will and may be operated at variable currents and voltages to change the dose rate and penetration properties. The in-water dosimetry parameters for this electronic brachytherapy source have been determined from measurements and calculations at the 40, 45, and 50 kV settings. Monte Carlo simulations of radiation transport utilized the MCNP5 code and the EPDL97-based mcplib04 cross-section library. Inter-tube consistency was assessed for 20 different Sources, measured with a PTW 34013 ionization chamber. As the Source is intended to be used for a maximum of ten treatment fractions, tube stability was also assessed. Photon spectra were measured using a high-purity germanium (HPGe) detector and calculated using MCNP. Parameters used in the two-dimensional (2D) brachytherapy dosimetry formalism were determined. While the Source was characterized as a point source due to the small anode size (< 1 mm), use of the one-dimensional (1D) brachytherapy dosimetry formalism is not recommended due to polar anisotropy; consequently, 1D brachytherapy dosimetry parameters were not sought. Calculated point-source radial dose function values, g_P(5), were 0.20, 0.24, and 0.29 for the 40, 45, and 50 kV voltage settings, respectively.
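The point-source form of the TG-43-style brachytherapy dose-rate formalism underlying these parameters can be sketched as follows. The g_P(5) value in the example is taken from the abstract (45 kV setting); S_K and Λ are left symbolic, and the anisotropy factor is deliberately omitted, since the abstract notes that polar anisotropy makes the 1D formalism unreliable for this source.

```python
def point_source_dose_rate(s_k, dose_rate_const, r_cm, g_p, r0_cm=1.0):
    """Point-source form of the TG-43 dose-rate equation, a sketch:
        Ddot(r) = S_K * Lambda * (r0 / r)**2 * g_P(r)
    with S_K the air-kerma strength, Lambda the dose-rate constant, and
    g_P the point-source radial dose function (g_P(r0) = 1 by
    definition). Purely illustrative: the anisotropy function needed
    for this strongly anisotropic source is not included."""
    return s_k * dose_rate_const * (r0_cm / r_cm) ** 2 * g_p(r_cm)
```

With S_K = Λ = 1 and the abstract's g_P(5) = 0.24, the relative dose rate at 5 cm is 0.24/25 of the reference-point value, showing how steeply a ~40 kV x-ray spectrum falls off in water compared with higher-energy radionuclide sources.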

  20. Feasibility of CBCT dosimetry for IMRT using a normoxic polymethacrylic-acid gel dosimeter

    NASA Astrophysics Data System (ADS)

    Bong, Ji Hye; Kwon, Soo-Il; Kim, Kum Bae; Kim, Mi Suk; Jung, Hai Jo; Ji, Young Hoon; Ko, In Ok; Park, Ji Ae; Kim, Kyeong Min

    2013-09-01

The purpose of this study is to evaluate the feasibility of cone-beam computed tomography (CBCT) for gel dosimetry. The absorbed dose was analyzed by irradiating several tumor shapes with calculated doses using intensity-modulated radiation therapy (IMRT) and acquiring images of the irradiated shapes with CBCT, in order to verify the possibility of reading the dose in a polymer gel dosimeter from the CBCT image. The results were compared with those obtained using magnetic resonance imaging (MRI) and CT. The linear correlation coefficients at doses below 10 Gy for the polymer gel dosimeter were 0.967, 0.933, and 0.985 for MRI, CT, and CBCT, respectively. The dose profile was symmetric about the vertical axis for a circular shape, and the uniformity was 2.50% for MRI and 8.73% for both CT and CBCT. In addition, the gradient in the MR image of the gel dosimeter irradiated in an H shape was 109.88, while the gradients for CT and CBCT were 71.95 and 14.62, respectively. On the basis of image quality, the present study showed that CBCT dosimetry for IMRT could be performed, with restrictions, using a normoxic polymethacrylic-acid gel dosimeter.
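The linear correlation coefficients quoted above are Pearson coefficients between delivered dose and imaged gel response; a minimal sketch of the statistic:

```python
import math

def pearson_r(doses, readouts):
    """Pearson linear correlation coefficient between delivered dose and
    dosimeter readout, the statistic behind the 0.967/0.933/0.985 values
    quoted for MRI, CT, and CBCT in the abstract."""
    n = len(doses)
    mx = sum(doses) / n
    my = sum(readouts) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(doses, readouts))
    sxx = sum((x - mx) ** 2 for x in doses)
    syy = sum((y - my) ** 2 for y in readouts)
    return sxy / math.sqrt(sxx * syy)
```

A coefficient near 1 indicates that the imaging signal tracks dose linearly over the calibrated range, which is what qualifies a modality for reading out the gel.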

  1. Analysis of localised dose distribution in human body by Monte Carlo code system for photon irradiation.

    PubMed

    Ohnishi, S; Odano, N; Nariyama, N; Saito, K

    2004-01-01

In conventional personal dosimetry, whole-body irradiation is assumed. However, partial-body irradiation is increasingly common, and protection quantities behave differently under such irradiation conditions. A code system has been developed, and effective dose and organ absorbed doses have been calculated for a horizontal narrow photon beam irradiating from various directions at three representative body sections, 40, 50, and 60 cm from the top of the head. This work covers 24 beam directions at 15° intervals from 0° to 345°, three energies (45 keV, 90 keV, and 1.25 MeV), and three beam diameters (1, 2, and 4 cm). The results show that a beam incident from diagonally in front, or from certain other specific directions, produces the peak dose in the case of partial irradiation.

  2. Generation of 238U Covariance Matrices by Using the Integral Data Assimilation Technique of the CONRAD Code

    NASA Astrophysics Data System (ADS)

    Privas, E.; Archier, P.; Bernard, D.; De Saint Jean, C.; Destouche, C.; Leconte, P.; Noguère, G.; Peneliau, Y.; Capote, R.

    2016-02-01

A new IAEA Coordinated Research Project (CRP) aims to test, validate, and improve the IRDF library. Among the isotopes of interest, the modeling of the 238U capture and fission cross sections represents a challenging task. A new description of 238U neutron-induced reactions in the fast energy range is in progress within the framework of an IAEA evaluation consortium. The Nuclear Data group of Cadarache participates in this effort using 238U spectral index measurements and Post-Irradiation Experiments (PIE) carried out in the fast reactors MASURCA (CEA Cadarache) and PHENIX (CEA Marcoule). Such a collection of experimental results provides reliable integral information on the (n,γ) and (n,f) cross sections. This paper presents the Integral Data Assimilation (IDA) technique of the CONRAD code, used to propagate the uncertainties of the integral data onto the 238U cross sections of interest for dosimetry applications.
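A generalized-least-squares assimilation step of the kind underlying integral data assimilation can be sketched as follows. This is a schematic illustration of the standard GLS update, not the CONRAD implementation; all variable names are ours.

```python
import numpy as np

def gls_update(sigma, m_prior, s, v, e, c):
    """One generalized-least-squares data-assimilation step (schematic).
    sigma: prior parameters (e.g. multigroup cross sections);
    m_prior: prior parameter covariance;
    s: sensitivities of the calculated integral values c to sigma;
    v: experimental covariance of the integral measurements e.
    Returns the updated parameters and their reduced covariance:
        gain = M S^T (S M S^T + V)^-1
        sigma' = sigma + gain (E - C),   M' = M - gain S M."""
    gain = m_prior @ s.T @ np.linalg.inv(s @ m_prior @ s.T + v)
    sigma_post = sigma + gain @ (e - c)
    m_post = m_prior - gain @ s @ m_prior
    return sigma_post, m_post
```

The update pulls the calculated integral values toward the measurements in proportion to the prior uncertainty, and the posterior covariance `m_post` is always smaller than the prior, which is how integral experiments shrink cross-section uncertainties.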

  3. Clinical implementation of total skin electron irradiation treatment with a 6 MeV electron beam in high-dose total skin electron mode

    NASA Astrophysics Data System (ADS)

    Lucero, J. F.; Rojas, J. I.

    2016-07-01

Total skin electron irradiation (TSEI) is a special treatment technique offered by modern radiation oncology facilities for the treatment of mycosis fungoides, a rare skin disease that is a type of cutaneous T-cell lymphoma [1]. During treatment, the patient's entire skin is irradiated with a uniform dose. The aim of this work is to present the implementation of total skin electron irradiation treatment using the IAEA TRS-398 code of practice for absolute dosimetry, taking advantage of radiochromic films.

  4. Simulation of MeV electron energy deposition in CdS quantum dots absorbed in silicate glass for radiation dosimetry

    NASA Astrophysics Data System (ADS)

    Baharin, R.; Hobson, P. R.; Smith, D. R.

    2010-09-01

We are currently developing 2D dosimeters with optical readout based on CdS or CdS/CdSe core-shell quantum dots, using commercially available materials. In order to understand the limitations on the measurement of a 2D radiation profile, the 3D deposited-energy profile of MeV-energy electrons in CdS quantum-dot-doped silica glass has been studied by Monte Carlo simulation using the CASINO and PENELOPE codes. Profiles for plain silica glass and CdS quantum-dot-doped silica glass were then compared.

  5. Clinical implementation of total skin electron irradiation treatment with a 6 MeV electron beam in high-dose total skin electron mode

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lucero, J. F., E-mail: fernando.lucero@hoperadiotherapy.com.gt; Hope International, Guatemala; Rojas, J. I., E-mail: isaac.rojas@siglo21.cr

Total skin electron irradiation (TSEI) is a special treatment technique offered by modern radiation oncology facilities for the treatment of mycosis fungoides, a rare skin disease that is a type of cutaneous T-cell lymphoma [1]. During treatment, the patient's entire skin is irradiated with a uniform dose. The aim of this work is to present the implementation of total skin electron irradiation treatment using the IAEA TRS-398 code of practice for absolute dosimetry, taking advantage of radiochromic films.

  6. Biodistribution and Radiation Dosimetry for the Novel SV2A Radiotracer [(18)F]UCB-H: First-in-Human Study.

    PubMed

    Bretin, F; Bahri, M A; Bernard, C; Warnock, G; Aerts, J; Mestdagh, N; Buchanan, T; Otoul, C; Koestler, F; Mievis, F; Giacomelli, F; Degueldre, C; Hustinx, R; Luxen, A; Seret, A; Plenevaux, A; Salmon, E

    2015-08-01

[(18)F]UCB-H is a novel radiotracer with a high affinity for synaptic vesicle glycoprotein 2A (SV2A), a protein expressed in synaptic vesicles. SV2A is the binding site of levetiracetam, a "first-in-class" antiepileptic drug with a distinct but still poorly understood mechanism of action. The objective of this study was to determine the biodistribution and radiation dosimetry of [(18)F]UCB-H in a human clinical trial and to establish injection limits according to biomedical research guidelines. Additionally, the clinical radiation dosimetry results were compared with estimates from previously published preclinical data. Dynamic whole-body positron emission tomography/X-ray computed tomography (PET/CT) imaging was performed over approximately 110 min on five healthy male volunteers after injection of 144.5 ± 7.1 MBq (range, 139.1-156.5 MBq) of [(18)F]UCB-H. Major organs were delineated on CT images, and time-activity curves were obtained from co-registered dynamic PET emission scans. The bladder could only be delineated on PET images. Time-integrated activity coefficients were calculated as the area under the curve using trapezoidal numerical integration. Urinary excretion based on PET activities, including voiding, was also simulated using the dynamic bladder module of OLINDA/EXM. The radiation dosimetry was calculated using OLINDA/EXM. The effective dose to the OLINDA/EXM 70-kg standard male was 1.54 × 10(-2) ± 6.84 × 10(-4) millisieverts (mSv)/MBq, with the urinary bladder wall, gallbladder wall, and liver receiving the highest absorbed doses. The brain, the tracer's main organ of interest, received an absorbed dose of 1.89 × 10(-2) ± 2.32 × 10(-3) mGy/MBq. This first human dosimetry study of [(18)F]UCB-H indicated that the tracer imposes a radiation burden similar to that of widely used clinical tracers. Single injections of at most 672 MBq under US practice and 649 MBq under European practice keep radiation exposure below the recommended limits. 
Recently published preclinical dosimetry data extrapolated from mice provided a satisfactory prediction of total-body and effective dose, but showed significant differences in organ absorbed doses compared with the human data.
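The time-integrated activity coefficients described above come from trapezoidal integration of the time-activity curves. The sketch below mirrors that integration and adds a physical-decay tail after the last sample; the tail treatment is our assumption for illustration, not a detail stated in the abstract.

```python
import math

def time_integrated_activity_coeff(times_h, activities_MBq, injected_MBq,
                                   half_life_h=1.83):
    """Time-integrated activity coefficient (in hours): trapezoidal
    integration of the measured time-activity curve, plus an analytic
    tail assuming pure physical decay of F-18 (T1/2 ~ 1.83 h) after the
    last time point. The trapezoidal part mirrors the integration used
    in the study; the decay-tail treatment is an assumption here."""
    area = 0.0
    for i in range(1, len(times_h)):
        area += 0.5 * (activities_MBq[i] + activities_MBq[i - 1]) * (
            times_h[i] - times_h[i - 1])
    lam = math.log(2.0) / half_life_h
    area += activities_MBq[-1] / lam   # integral of A_last * exp(-lam*t)
    return area / injected_MBq
```

The resulting per-organ coefficients (in hours) are what a program such as OLINDA/EXM multiplies by S-values to obtain organ absorbed doses per unit injected activity.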

  7. Establishing a standard calibration methodology for MOSFET detectors in computed tomography dosimetry.

    PubMed

    Brady, S L; Kaufman, R A

    2012-06-01

The use of metal-oxide-semiconductor field-effect transistor (MOSFET) detectors for patient dosimetry has increased by ~25% since 2005. Despite this increase, no standard calibration methodology has been identified, nor has the calibration uncertainty been quantified, for MOSFET dosimetry in CT. This work compares three MOSFET calibration methodologies proposed in the literature and additionally investigates the optimal time for signal equilibration and the exposure levels for maximum calibration precision. The calibration methodologies tested were (1) free in-air (FIA) with a radiographic x-ray tube, (2) FIA with a stationary CT x-ray tube, and (3) within a scatter phantom with a rotational CT x-ray tube. Each calibration was performed at absorbed dose levels of 10, 23, and 35 mGy. Equilibration times of 0 min and 5 min before or after signal readout were investigated. Calibration precision was measured to be better than 5%-7%, 3%-5%, and 2%-4% for the 10, 23, and 35 mGy dose levels, respectively, independent of calibration methodology. No correlation was demonstrated between precision and signal equilibration time when allowing 5 min before or after signal readout. Differences in average calibration coefficients were demonstrated between the FIA-with-CT methodology (26.7 ± 1.1 mV cGy(-1)) and both the CT scatter phantom (29.2 ± 1.0 mV cGy(-1)) and FIA-with-x-ray (29.9 ± 1.1 mV cGy(-1)) methodologies. A decrease in MOSFET sensitivity was seen at an average change in readout voltage of ~3000 mV. The best measured calibration precision was obtained by exposing the MOSFET detectors to 23 mGy. No signal equilibration time is necessary to improve calibration precision. A significant difference in calibration outcome was demonstrated for FIA with CT compared with the other two methodologies. 
If the FIA-with-CT calibration methodology is used to create calibration coefficients for eventual phantom dosimetry, a measurement error of ~12% will be reflected in the dosimetry results. The calibration process must emulate the eventual CT dosimetry process by matching or excluding scatter when calibrating the MOSFETs. Finally, the authors recommend that the MOSFETs be recalibrated approximately every 2500-3000 mV. © 2012 American Association of Physicists in Medicine.
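The calibration coefficients above are simple ratios of threshold-voltage shift to delivered dose; a minimal sketch follows, using the abstract's numbers only as illustrative inputs.

```python
def calibration_coefficient(delta_mV, dose_cGy):
    """MOSFET calibration coefficient (mV per cGy): the threshold-voltage
    shift divided by the known delivered dose."""
    return delta_mV / dose_cGy

def dose_from_shift(delta_mV, coeff_mV_per_cGy):
    """Convert a measured voltage shift back to dose (cGy). The study's
    point is that the coefficient must come from a geometry matching the
    eventual measurement: applying an FIA-with-CT coefficient to
    scatter-phantom measurements produced the ~12% error noted above."""
    return delta_mV / coeff_mV_per_cGy
```

For example, a 67.16 mV shift after a 2.3 cGy (23 mGy) exposure yields the scatter-phantom coefficient of 29.2 mV/cGy quoted in the abstract.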

  8. TU-F-201-00: Radiochromic Film Dosimetry Update

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

Since the introduction of radiochromic film (RCF) for radiation dosimetry, the scope of RCF dosimetry has expanded steadily to include many medical applications, such as radiation therapy and diagnostic radiology. The AAPM Task Group (TG) 55 published a report with recommendations for RCF dosimetry in 1998. As the technology is advancing rapidly, and its routine clinical use is expanding, TG 235 has been formed to provide an update to TG-55 on radiochromic film dosimetry. RCF dosimetry applications in clinical radiotherapy have become even more widespread, expanding from primarily brachytherapy and radiosurgery applications toward (but not limited to) external beam therapy (photons, electrons, and protons), including quality assurance for IMRT, VMAT, Tomotherapy, SRS/SRT, and SBRT. In addition, RCF applications now extend to measurements of radiation dose in particle beams and in patients undergoing medical exams, especially fluoroscopically guided interventional procedures and CT. The densitometers/scanners used for RCF dosimetry have also evolved from the He-Ne laser scanner to CCD-based scanners, including roller-based scanners, light-box-based digital cameras, and flatbed color scanners. More recently, multichannel RCF dosimetry introduced a new paradigm for external beam dose QA, owing to its high accuracy and efficiency. This course covers in detail the recent advancements in RCF dosimetry. Learning Objectives: (1) introduce the paradigm shift to multichannel film dosimetry; (2) outline the procedures for achieving accurate dosimetry with an RCF dosimetry system; (3) provide comprehensive guidelines on RCF dosimetry for various clinical applications. One of the speakers has a research agreement with Ashland Inc., the manufacturer of Gafchromic film.

  9. TU-F-201-01: General Aspects of Radiochromic Film Dosimetry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Niroomand-Rad, A.

Since the introduction of radiochromic film (RCF) for radiation dosimetry, the scope of RCF dosimetry has expanded steadily to include many medical applications, such as radiation therapy and diagnostic radiology. The AAPM Task Group (TG) 55 published a report with recommendations for RCF dosimetry in 1998. As the technology is advancing rapidly, and its routine clinical use is expanding, TG 235 has been formed to provide an update to TG-55 on radiochromic film dosimetry. RCF dosimetry applications in clinical radiotherapy have become even more widespread, expanding from primarily brachytherapy and radiosurgery applications toward (but not limited to) external beam therapy (photons, electrons, and protons), including quality assurance for IMRT, VMAT, Tomotherapy, SRS/SRT, and SBRT. In addition, RCF applications now extend to measurements of radiation dose in particle beams and in patients undergoing medical exams, especially fluoroscopically guided interventional procedures and CT. The densitometers/scanners used for RCF dosimetry have also evolved from the He-Ne laser scanner to CCD-based scanners, including roller-based scanners, light-box-based digital cameras, and flatbed color scanners. More recently, multichannel RCF dosimetry introduced a new paradigm for external beam dose QA, owing to its high accuracy and efficiency. This course covers in detail the recent advancements in RCF dosimetry. Learning Objectives: (1) introduce the paradigm shift to multichannel film dosimetry; (2) outline the procedures for achieving accurate dosimetry with an RCF dosimetry system; (3) provide comprehensive guidelines on RCF dosimetry for various clinical applications. One of the speakers has a research agreement with Ashland Inc., the manufacturer of Gafchromic film.

  10. Thermoluminescent dosimetry in veterinary diagnostic radiology.

    PubMed

    Hernández-Ruiz, L; Jimenez-Flores, Y; Rivera-Montalvo, T; Arias-Cisneros, L; Méndez-Aguilar, R E; Uribe-Izquierdo, P

    2012-12-01

This paper presents the results of environmental and personnel dosimetry carried out in the radiology area of a veterinary hospital. Dosimetry was performed using thermoluminescent (TL) materials. The environmental dosimetry results show that the areas closest to the X-ray equipment are safe. Personnel dosimetry shows that the daily workday doses of some individuals are near the limit established by the ICRP. The TL radiation measurements suggest that TLDs are good candidates as dosimeters for radiation dosimetry in veterinary radiology. Copyright © 2012 Elsevier Ltd. All rights reserved.

  11. Dosimetry in MARS spectral CT: TOPAS Monte Carlo simulations and ion chamber measurements.

    PubMed

    Lu, Gray; Marsh, Steven; Damet, Jerome; Carbonez, Pierre; Laban, John; Bateman, Christopher; Butler, Anthony; Butler, Phil

    2017-06-01

Spectral computed tomography (CT) is an emerging imaging modality that shows great promise in revealing unique diagnostic information. Because this imaging modality is based on X-ray CT, it is of utmost importance to study the radiation dose aspects of its use. This study reports on the implementation and evaluation of a Monte Carlo simulation tool, using TOPAS, for estimating dose in a pre-clinical spectral CT scanner known as the MARS scanner. Simulated estimates were compared with measurements from an ionization chamber. For a typical MARS scan of a 30 mm diameter cylindrical phantom, TOPAS estimated a CT dose index (CTDI) of 29.7 mGy; the ion-chamber-measured CTDI agreed with the TOPAS estimate to within 3%. Although further development is required, our investigation of TOPAS for estimating MARS scan dosimetry has shown its potential for further study of spectral scanning protocols and of dose to scanned objects.
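CTDI values of this kind are typically combined from centre and peripheral pencil-chamber readings via the standard IEC weighting; a minimal sketch (this weighting is standard CT dosimetry practice, not a detail given in the abstract):

```python
def ctdi_w(ctdi_center_mGy, ctdi_periphery_mGy):
    """Weighted CT dose index from 100 mm pencil-chamber readings in a
    cylindrical phantom:
        CTDI_w = (1/3) * CTDI_center + (2/3) * CTDI_periphery
    combining one central and the averaged peripheral measurement
    positions, per standard CT dosimetry practice."""
    return ctdi_center_mGy / 3.0 + 2.0 * ctdi_periphery_mGy / 3.0
```

For a small-bore pre-clinical phantom the centre and peripheral readings are close, so CTDI_w lands near either reading; for example, centre 24 mGy and periphery 33 mGy combine to a CTDI_w of 30 mGy.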

  12. Simulation of the neutron flux in the irradiation facility at RA-3 reactor.

    PubMed

    Bortolussi, S; Pinto, J M; Thorp, S I; Farias, R O; Soto, M S; Sztejnberg, M; Pozzi, E C C; Gonzalez, S J; Gadan, M A; Bellino, A N; Quintana, J; Altieri, S; Miller, M

    2011-12-01

    A facility for the irradiation of a section of patients' explanted liver and lung was constructed at the RA-3 reactor, Comisión Nacional de Energía Atómica, Argentina. The facility, located in the thermal column, is characterized by the possibility of inserting and extracting samples without the need to shut down the reactor. In order to achieve the best levels of safety and efficacy of the treatment, it is necessary to perform accurate dosimetry. The possibility of simulating neutron flux and absorbed dose in the explanted organs, together with the experimental dosimetry, allows setting more precise and effective treatment plans. To this end, a computational model of the entire reactor was set up, and the simulations were validated against experimental measurements performed in the facility. Copyright © 2011 Elsevier Ltd. All rights reserved.

  13. The Role of Dosimetry in High-Quality EMI Risk Assessment

    DTIC Science & Technology

    2006-09-14

    wireless communication usage and exposure to different parts of the body (especially for children and foetuses), including multiple exposure from ... Calculation of induced electric fields in pregnant women and in the foetus is urgently needed. Very little computation has been carried out on ... advanced models of the pregnant human and the foetus with appropriate anatomical modelling. It is important to assess possible enhanced induction of ...

  14. Development of a model and computer code to describe solar grade silicon production processes

    NASA Technical Reports Server (NTRS)

    Gould, R. K.; Srivastava, R.

    1979-01-01

    Two computer codes were developed for describing flow reactors in which high purity, solar grade silicon is produced via reduction of gaseous silicon halides. The first is the CHEMPART code, an axisymmetric, marching code which treats two phase flows with models describing detailed gas-phase chemical kinetics, particle formation, and particle growth. It can be used to describe flow reactors in which reactants mix, react, and form a particulate phase. Detailed radial gas-phase composition, temperature, velocity, and particle size distribution profiles are computed. Also, deposition of heat, momentum, and mass (either particulate or vapor) on reactor walls is described. The second code is a modified version of the GENMIX boundary layer code which is used to compute rates of heat, momentum, and mass transfer to the reactor walls. This code lacks the detailed chemical kinetics and particle handling features of the CHEMPART code but has the virtue of running much more rapidly than CHEMPART, while treating the phenomena occurring in the boundary layer in more detail.

  15. Matching extended-SSD electron beams to multileaf collimated photon beams in the treatment of head and neck cancer.

    PubMed

    Steel, Jared; Stewart, Allan; Satory, Philip

    2009-09-01

    Matching the penumbra of a 6 MeV electron beam to the penumbra of a 6 MV photon beam is a dose optimization challenge, especially when the electron beam is applied from an extended source-to-surface distance (SSD), as in the case of some head and neck treatments. Traditionally, low-melting-point alloy blocks have been used to define the photon beam shielding over the spinal cord region. However, these are inherently time consuming to construct and employ in the clinical situation. Multileaf collimators (MLCs) provide a fast and reproducible shielding option but generate geometrically nonconformal approximations to the desired beam edge definition. The effects of substituting MLC shielding for Cerrobend in the context of beam matching with extended-SSD electron beams are the subject of this investigation. Relative dose beam data from a Varian EX 2100 linear accelerator were acquired in a water tank under the 6 MeV electron beam at both standard and extended SSDs and under the 6 MV photon beam defined by Cerrobend and a number of MLC stepping regimes. The effect of increasing the electron beam SSD on the beam penumbra was assessed. MLC stepping was also assessed in terms of the effects on both the mean photon beam penumbra and the intraleaf dose-profile nonuniformity relative to the MLC midleaf. Computational techniques were used to combine the beam data so as to simulate composite relative dosimetry in the water tank, allowing fine control of beam abutment gap variation. Idealized volumetric dosimetry was generated based on the percentage depth-dose data for the beam modes and the abutment geometries involved. Comparison was made between each composite dosimetry dataset and the relevant ideal dosimetry dataset by way of subtraction. Weighted dose-difference volume histograms (DDVHs) were produced, and these were, in turn, summed to provide an overall dosimetry score for each abutment and shielding type/angle combination. Increasing the electron beam SSD increased the penumbra width (defined as the lateral distance between the 80% and 20% isodose contours) by 8-10 mm at depths of 10-20 mm. Mean photon beam penumbra width increased with increased MLC stepping, and the mean MLC penumbra was approximately 1.5 times greater than that across the corresponding Cerrobend shielding. Intraleaf dose discrepancy in the direction orthogonal to the beam edge also increased with MLC stepping. The weighted DDVH comparison techniques allowed the composite dosimetry resulting from the interplay of the abovementioned variables to be ranked. The MLC dosimetry ranked as good as or better than that resulting from beam matching with Cerrobend for all except large field overlaps (-2.5 mm gap). The results for the linear-weighted DDVH comparison suggest that optimal MLC abutment dosimetry results from an optical surface gap of around 1 ± 0.5 mm. Furthermore, this appears reasonably lenient to abutment gap variation, such as that arising from uncertainty in beam markup or other setup errors.
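
    The weighted-DDVH scoring used above to rank abutment configurations can be illustrated with a toy version: subtract the ideal dose grid from the composite one, histogram the absolute differences, weight larger discrepancies more heavily, and sum into a single scalar score. The bin edges, weights, and dose profiles below are invented, not the paper's values:

```python
# Toy dose-difference volume histogram (DDVH) score: lower = closer to ideal.

def ddvh_score(composite, ideal, bin_edges, weights):
    """Weighted fraction of voxels per dose-difference bin, summed."""
    diffs = [abs(c - i) for c, i in zip(composite, ideal)]
    counts = [0] * (len(bin_edges) - 1)
    for d in diffs:
        for b in range(len(counts)):
            if bin_edges[b] <= d < bin_edges[b + 1]:
                counts[b] += 1
                break
    n = len(diffs)
    return sum(w * c / n for w, c in zip(weights, counts))

# Hypothetical 1-D abutment dose profiles (% of prescription).
ideal = [100.0] * 10
good  = [100.0, 101.0, 99.5, 100.0, 100.2, 99.8, 100.0, 100.5, 100.0, 100.0]
bad   = [100.0, 110.0, 90.0, 100.0, 108.0, 95.0, 100.0, 112.0, 100.0, 100.0]
edges   = [0.0, 1.0, 3.0, 5.0, 100.0]   # % dose-difference bins
weights = [0.0, 1.0, 3.0, 10.0]         # linear-ish penalty per bin

score_good = ddvh_score(good, ideal, edges, weights)
score_bad  = ddvh_score(bad, ideal, edges, weights)
```

    Ranking candidate gap/shielding combinations then reduces to sorting their scores, which is what makes the DDVH summary convenient for comparing many abutment geometries at once.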

  16. Comparison of two computer codes for crack growth analysis: NASCRAC Versus NASA/FLAGRO

    NASA Technical Reports Server (NTRS)

    Stallworth, R.; Meyers, C. A.; Stinson, H. C.

    1989-01-01

    Results are presented from a comparison study of two computer codes for crack growth analysis: NASCRAC and NASA/FLAGRO. The two codes gave compatibly conservative results when the part-through-crack analysis solutions were compared with experimental test data. Results showed good correlation between the codes for the through-crack-at-a-lug solution, for which NASA/FLAGRO gave the more conservative results.

  17. Computational Predictions of the Performance of Wright 'Bent End' Propellers

    NASA Technical Reports Server (NTRS)

    Wang, Xiang-Yu; Ash, Robert L.; Bobbitt, Percy J.; Prior, Edwin (Technical Monitor)

    2002-01-01

    Computational analyses of two 1911 Wright brothers 'Bent End' wooden propeller reproductions have been performed and compared with experimental test results from the Langley Full Scale Wind Tunnel. The purpose of the analysis was to check the consistency of the experimental results and to validate the reliability of the tests. This report is one part of a project on the performance of the Wright 'Bent End' propellers, intended to document the Wright brothers' pioneering propeller design contributions. Two computer codes were used in the computational predictions. The FLO-MG Navier-Stokes code is a CFD (Computational Fluid Dynamics) code based on the Navier-Stokes equations. It is mainly used to compute the lift coefficient and the drag coefficient at specified angles of attack at different radii. Those calculated data are the intermediate results of the computation and a part of the necessary input for the Propeller Design Analysis Code (based on the Adkins and Liebeck method), which is a propeller design code used to compute the propeller thrust coefficient, the propeller power coefficient, and the propeller propulsive efficiency.

  18. Monte Carlo calculated doses to treatment volumes and organs at risk for permanent implant lung brachytherapy

    NASA Astrophysics Data System (ADS)

    Sutherland, J. G. H.; Furutani, K. M.; Thomson, R. M.

    2013-10-01

    Iodine-125 (125I) and caesium-131 (131Cs) brachytherapy have been used with sublobar resection to treat stage I non-small cell lung cancer, and other radionuclides, 169Yb and 103Pd, are being considered for these treatments. This work investigates the dosimetry of permanent implant lung brachytherapy for a range of source energies and various implant sites in the lung. Doses are calculated by Monte Carlo simulation in a patient CT-derived computational phantom using the EGSnrc user-code BrachyDose. Calculations are performed for 103Pd, 125I, and 131Cs seeds and for 50 and 100 keV point sources at 17 implant positions. Doses to treatment volumes, ipsilateral lung, aorta, and heart are determined and compared to those determined using the TG-43 approach. Considerable variation with source energy and differences between model-based and TG-43 doses are found for both treatment volumes and organs. Doses to the heart and aorta generally increase with increasing source energy. TG-43 underestimates the dose to the heart and aorta for all implants except those nearest to these organs, where the dose is overestimated. Results suggest that model-based dose calculations are crucial for selecting prescription doses, comparing clinical endpoints, and studying radiobiological effects for permanent implant lung brachytherapy.
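
    The TG-43 reference doses in the comparison above come from the 1-D point-source formalism, in which the dose rate at radius r is the product of air-kerma strength, dose-rate constant, an inverse-square geometry factor, a radial dose function g(r), and an anisotropy factor. A sketch follows; the g(r) table and all numeric inputs are hypothetical (real seeds use consensus datasets):

```python
# Sketch of the 1-D TG-43 formalism:
#   D(r) = S_K * Lambda * (r0 / r)**2 * g(r) * phi_an,  with r0 = 1 cm.
# All table values and inputs below are invented for illustration.

def interp(table, r):
    """Piecewise-linear lookup in a dict of {radius_cm: value}."""
    pts = sorted(table.items())
    if r <= pts[0][0]:
        return pts[0][1]
    if r >= pts[-1][0]:
        return pts[-1][1]
    for (r1, v1), (r2, v2) in zip(pts, pts[1:]):
        if r1 <= r <= r2:
            return v1 + (v2 - v1) * (r - r1) / (r2 - r1)

G_TABLE = {0.5: 1.05, 1.0: 1.0, 2.0: 0.85, 5.0: 0.45}   # hypothetical g(r)

def tg43_dose_rate(r_cm, air_kerma_strength, dose_rate_constant,
                   anisotropy=0.95):
    """1-D TG-43 dose rate at radius r_cm (point-source approximation)."""
    geometry = (1.0 / r_cm) ** 2          # inverse square relative to r0 = 1 cm
    return (air_kerma_strength * dose_rate_constant * geometry
            * interp(G_TABLE, r_cm) * anisotropy)

# At r0 = 1 cm: geometry = 1 and g = 1, so the dose rate is S_K * Lambda * phi_an.
d1 = tg43_dose_rate(1.0, air_kerma_strength=0.5, dose_rate_constant=0.965)
d2 = tg43_dose_rate(2.0, air_kerma_strength=0.5, dose_rate_constant=0.965)
```

    The formalism assumes an unbounded water medium; the paper's point is precisely that this assumption breaks down in lung, where tissue heterogeneity makes model-based (Monte Carlo) calculation necessary.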

  19. Comparison of U.S. Environmental Protection Agency’s CAP88 PC versions 3.0 and 4.0

    DOE PAGES

    Jannik, Tim; Farfan, Eduardo B.; Dixon, Ken; ...

    2015-08-01

    The Savannah River National Laboratory (SRNL), with the assistance of Georgia Regents University, completed a comparison of the U.S. Environmental Protection Agency's (EPA) environmental dosimetry code CAP88 PC V3.0 with the recently developed V4.0. CAP88 is a set of computer programs and databases used for estimation of dose and risk from radionuclide emissions to air. At the U.S. Department of Energy's Savannah River Site, CAP88 is used by SRNL for determining compliance with EPA's National Emission Standards for Hazardous Air Pollutants (40 CFR 61, Subpart H) regulations. Using standardized input parameters, individual runs were conducted for each radionuclide within its corresponding database. Some radioactive decay constants, human usage parameters, and dose coefficients changed between the two versions, directly causing a proportional change in the total effective dose; a summary for several key radionuclides (137Cs, 3H, 129I, 239Pu, and 90Sr) is provided. In general, the total effective doses will decrease for alpha/beta emitters because of reduced inhalation and ingestion rates in V4.0. However, for gamma emitters, such as 60Co and 137Cs, the total effective doses will increase because of changes EPA made in the external ground shine calculations.

  20. Experimental characterization of the AFIT neutron facility. Master's thesis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lessard, O.J.

    1993-09-01

    AFIT's Neutron Facility was characterized for room-return neutrons using a (252)Cf source and a Bonner sphere spectrometer with three experimental models: the shadow shield; the Eisenhauer, Schwartz, and Johnson (ESJ); and the polynomial models. The free-field fluences at one meter from the ESJ and polynomial models were compared to the equivalent value from the accepted experimental shadow shield model to determine the suitability of the models in the AFIT facility. The polynomial model behaved erratically, as expected, while the ESJ model agreed to within 4.8% of the shadow shield model results for the four Bonner sphere calibrations. The ratio of total fluence to free-field fluence at one meter for the ESJ model was then compared to the equivalent ratio obtained by a Monte Carlo Neutron-Photon transport code (MCNP), an accepted computational model. The ESJ model agreed to within 6.2% of the MCNP results. AFIT's fluence ratios were compared to equivalent ratios reported by three other neutron facilities, which verified that AFIT's results fit previously published trends based on room volumes. The ESJ model appeared adequate for health physics applications and was chosen for calibration of the AFIT facility. Neutron Detector, Bonner Sphere, Neutron Dosimetry, Room Characterization.
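
    The shadow-shield bookkeeping underlying the fluence ratio above can be sketched numerically: the open-field reading at distance d mixes a free-field 1/d² component with an approximately distance-independent room-return component, and blocking the direct path with a shadow shield leaves only the room return, so free field = total - shielded. All numbers below are invented, not AFIT data:

```python
# Toy shadow-shield arithmetic for room-return characterization.
import math

def free_field_fluence(emission_rate, d_m):
    """Unscattered fluence [n/cm^2/s] at d_m metres from an isotropic point source."""
    return emission_rate / (4.0 * math.pi * (100.0 * d_m) ** 2)

Q = 2.0e7            # hypothetical 252Cf neutron emission rate [n/s]
ROOM_RETURN = 25.0   # hypothetical, roughly flat scattered component [n/cm^2/s]

total_at_1m = free_field_fluence(Q, 1.0) + ROOM_RETURN   # open-field reading
shielded_at_1m = ROOM_RETURN           # shadow shield blocks the direct beam
free_field = total_at_1m - shielded_at_1m
fluence_ratio = total_at_1m / free_field   # the total/free-field ratio at 1 m
```

    The thesis compared exactly this kind of total-to-free-field ratio against the MCNP-computed equivalent; larger rooms give ratios closer to 1 because the room-return term shrinks relative to the direct component.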

  1. A simplified model of the source channel of the Leksell GammaKnife® tested with PENELOPE

    NASA Astrophysics Data System (ADS)

    Al-Dweri, Feras M. O.; Lallena, Antonio M.; Vilches, Manuel

    2004-06-01

    Monte Carlo simulations using the code PENELOPE have been performed to test a simplified model of the source channel geometry of the Leksell GammaKnife®. The characteristics of the radiation passing through the treatment helmets are analysed in detail. We have found that only primary particles emitted from the source with polar angles smaller than 3° with respect to the beam axis are relevant for the dosimetry of the Gamma Knife. The photon trajectories reaching the output helmet collimators at (x, y, z = 236 mm) show strong correlations between ρ = (x² + y²)^1/2 and their polar angle θ, on one side, and between tan⁻¹(y/x) and their azimuthal angle φ, on the other. This enables us to propose a simplified model which treats the full source channel as a mathematical collimator. This simplified model produces doses in good agreement with those found for the full geometry. In the region of maximal dose, the relative differences between both calculations are within 3%, for the 18 and 14 mm helmets, and 10%, for the 8 and 4 mm ones. Besides, the simplified model permits a strong reduction (larger than a factor of 15) in the computational time.
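
    A back-of-envelope for the "only polar angles below 3° matter" finding: for an isotropic point source, the fraction of photons emitted into a cone of half-angle 3° is (1 - cos 3°)/2, under 0.1%, which is why sampling that cone directly (the "mathematical collimator" idea) yields such a large speed-up. A sketch of both the fraction and the cone sampling, as an illustration rather than the paper's actual implementation:

```python
# Solid-angle fraction of a 3-degree cone, and direct sampling within it.
import math
import random

def cone_fraction(half_angle_deg):
    """Fraction of an isotropic source's emissions inside the cone."""
    return (1.0 - math.cos(math.radians(half_angle_deg))) / 2.0

def sample_direction_in_cone(half_angle_deg, rng):
    """Uniform (in solid angle) direction within the cone about +z."""
    cos_min = math.cos(math.radians(half_angle_deg))
    cos_t = cos_min + (1.0 - cos_min) * rng.random()   # uniform in cos(theta)
    sin_t = math.sqrt(1.0 - cos_t * cos_t)
    phi = 2.0 * math.pi * rng.random()
    return (sin_t * math.cos(phi), sin_t * math.sin(phi), cos_t)

frac = cone_fraction(3.0)          # ~7e-4: most isotropic histories are wasted
rng = random.Random(1)
dirs = [sample_direction_in_cone(3.0, rng) for _ in range(1000)]
max_theta = max(math.degrees(math.acos(d[2])) for d in dirs)
```

    Sampling only the relevant cone is a standard variance-reduction move; the paper's reported speed-up (more than a factor of 15) also includes replacing the full collimator transport with the mathematical collimator.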

  2. Proceduracy: Computer Code Writing in the Continuum of Literacy

    ERIC Educational Resources Information Center

    Vee, Annette

    2010-01-01

    This dissertation looks at computer programming through the lens of literacy studies, building from the concept of code as a written text with expressive and rhetorical power. I focus on the intersecting technological and social factors of computer code writing as a literacy--a practice I call "proceduracy". Like literacy, proceduracy is a human…

  3. Computer Code Aids Design Of Wings

    NASA Technical Reports Server (NTRS)

    Carlson, Harry W.; Darden, Christine M.

    1993-01-01

    AERO2S computer code developed to aid design engineers in selection and evaluation of aerodynamically efficient wing/canard and wing/horizontal-tail configurations that includes simple hinged-flap systems. Code rapidly estimates longitudinal aerodynamic characteristics of conceptual airplane lifting-surface arrangements. Developed in FORTRAN V on CDC 6000 computer system, and ported to MS-DOS environment.

  4. Cloud Computing for Complex Performance Codes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Appel, Gordon John; Hadgu, Teklu; Klein, Brandon Thorin

    This report describes the use of cloud computing services for running complex public domain performance assessment problems. The work consisted of two phases: Phase 1 demonstrated that complex codes, on several differently configured servers, could run and compute trivial small-scale problems in a commercial cloud infrastructure. Phase 2 focused on proving that non-trivial large-scale problems could be computed in the commercial cloud environment. The cloud computing effort was successfully applied using codes of interest to the geohydrology and nuclear waste disposal modeling community.

  5. APC: A New Code for Atmospheric Polarization Computations

    NASA Technical Reports Server (NTRS)

    Korkin, Sergey V.; Lyapustin, Alexei I.; Rozanov, Vladimir V.

    2014-01-01

    A new polarized radiative transfer code Atmospheric Polarization Computations (APC) is described. The code is based on separation of the diffuse light field into anisotropic and smooth (regular) parts. The anisotropic part is computed analytically. The smooth regular part is computed numerically using the discrete ordinates method. Vertical stratification of the atmosphere, common types of bidirectional surface reflection and scattering by spherical particles or spheroids are included. A particular consideration is given to computation of the bidirectional polarization distribution function (BPDF) of the waved ocean surface.

  6. A dual two dimensional electronic portal imaging device transit dosimetry model based on an empirical quadratic formalism

    PubMed Central

    Metwaly, M; Glegg, M; Baggarley, S P; Elliott, A

    2015-01-01

    Objective: This study describes a two dimensional electronic portal imaging device (EPID) transit dosimetry model that can predict either: (1) in-phantom exit dose, or (2) EPID transit dose, for treatment verification. Methods: The model was based on a quadratic equation that relates the reduction in intensity to the equivalent path length (EPL) of the attenuator. In this study, two sets of quadratic equation coefficients were derived from calibration dose planes measured with EPID and ionization chamber in water under reference conditions. With two sets of coefficients, EPL can be calculated from either EPID or treatment planning system (TPS) dose planes. Consequently, either the in-phantom exit dose or the EPID transit dose can be predicted from the EPL. The model was tested with two open, five wedge and seven sliding window prostate and head and neck intensity-modulated radiation therapy (IMRT) fields on phantoms. Results were analysed using absolute gamma analysis (3%/3 mm). Results: The open fields gamma pass rates were >96.8% for all comparisons. For wedge and IMRT fields, comparisons between predicted and TPS-computed in-phantom exit dose resulted in mean gamma pass rate of 97.4% (range, 92.3–100%). As for the comparisons between predicted and measured EPID transit dose, the mean gamma pass rate was 97.5% (range, 92.6–100%). Conclusion: An EPID transit dosimetry model that can predict in-phantom exit dose and EPID transit dose was described and proven to be valid. Advances in knowledge: The described model is practical, generic and flexible to encourage widespread implementation of EPID dosimetry for the improvement of patients' safety in radiotherapy. PMID:25969867
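
    The empirical quadratic formalism described above maps a measured intensity reduction to an equivalent path length (EPL), so predicting transit dose reduces to evaluating or inverting a quadratic. A minimal sketch, with invented coefficients (in practice the two coefficient sets are fitted to calibration dose planes, as the abstract describes): assume the log intensity reduction is ln(I0/I) = a·EPL² + b·EPL, so EPL is the positive root of the quadratic.

```python
# Toy quadratic EPL formalism: forward model and its inversion.
import math

A, B = -0.0002, 0.06   # hypothetical calibration coefficients (per cm, per cm^2)

def reduction(epl_cm):
    """Forward model: log intensity reduction for a water-equivalent path [cm]."""
    return A * epl_cm ** 2 + B * epl_cm

def epl_from_reduction(r):
    """Invert the quadratic A*t^2 + B*t - r = 0 for the physical root."""
    disc = B * B + 4.0 * A * r
    return (-B + math.sqrt(disc)) / (2.0 * A)

# Round-trip check at 20 cm of water-equivalent attenuator.
r20 = reduction(20.0)
epl = epl_from_reduction(r20)
```

    With one coefficient set fitted against EPID dose planes and another against TPS dose planes, the same EPL can be converted in either direction, which is what lets the model predict both in-phantom exit dose and EPID transit dose.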

  7. EFFECTIVE DOSE IN TWO DIFFERENT DENTAL CBCT SYSTEMS: NEWTOM VGi AND PLANMECA 3D MID.

    PubMed

    Ghaedizirgar, Mohammad; Faghihi, Reza; Paydar, Reza; Sina, Sedigheh

    2017-11-01

    Cone beam computed tomography (CBCT) is a CT technique that uses a conical, diverging X-ray beam, in which a large area of a two-dimensional detector is irradiated in each rotation. Different investigations have been performed on dosimetry of dental CBCT. As there is no dedicated protocol for dental CBCT, CT scan protocols are used for dosimetry. The purpose of this study is the measurement of dose to head and neck organs in two CBCT systems, i.e. Planmeca 3D Mid (PM) and NewTom VGi (NT), using thermoluminescent dosimetry and a Rando phantom. The thermoluminescent dosimeter (TLD-100) chips were placed at the positions of different organs of the head and neck, two chips at each position, and the dose values were measured for several different field sizes, i.e. 8 × 8, 12 × 8 and 15 × 15 cm2 for the NewTom, and 10 × 10 and 20 × 17 cm2 for the Planmeca system. According to the results, the average effective dose in PM is much higher than in the NT system for the same field size, because of the greater mAs values. For routine imaging protocols used with NT, the effective dose values are 70, 73 and 121 µSv for the 8 × 8, 12 × 8 and 15 × 15 cm2 field sizes, respectively. In PM, the effective dose for the 10 × 10 and 20 × 17 cm2 field sizes is 259 and 341 µSv, respectively. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
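
    Effective doses like those quoted above combine TLD-measured organ equivalent doses with ICRP tissue weighting factors: E = Σ_T w_T · H_T. A sketch follows; the organ doses are invented, while the weights shown are the ICRP Publication 103 values for those tissues (only a head-and-neck-relevant subset, so they deliberately do not sum to 1):

```python
# Effective dose as a weighted sum of organ equivalent doses (ICRP 103 subset).

W_T = {                          # ICRP 103 tissue weighting factors (subset)
    "red_bone_marrow": 0.12,
    "thyroid": 0.04,
    "oesophagus": 0.04,
    "bone_surface": 0.01,
    "brain": 0.01,
    "salivary_glands": 0.01,
    "skin": 0.01,
}

def effective_dose(organ_doses_uSv):
    """E = sum of w_T * H_T; organs without a listed weight are ignored here."""
    return sum(W_T[organ] * h for organ, h in organ_doses_uSv.items()
               if organ in W_T)

# Hypothetical dental-CBCT organ equivalent doses in microsievert.
doses = {"thyroid": 400.0, "salivary_glands": 2000.0, "brain": 600.0,
         "red_bone_marrow": 80.0, "skin": 300.0}
E = effective_dose(doses)
```

    A full effective-dose calculation also needs the remainder tissues and fractional organ irradiation corrections, which is why published dental-CBCT effective doses depend on the exact protocol used.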

  8. MO-D-BRD-01: Memorial to Bengt Bjarngard - Memorial Lecture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Das, I

    We lost a legendary medical physicist, Dr. Bengt Erik Bjarngard, to angiosarcoma, an aggressive type of cancer. He devoted his life to providing improved methods of radiation treatment for this devastating disease over the last 36 years. Bengt was born in the rural village of Bjarnum in southern Sweden, located near forests and known for its furniture making. He migrated to the USA at the age of 35 and was recruited by Dr. Samuel Hellman to lead a group of physicists that became the “mecca of medical physics” known as the Joint Center for Radiation Therapy (JCRT) at Harvard Medical School in Boston. Bengt mentored some of the best physicists in the country, and many of our modern treatments go back to the early days of research at the JCRT. These accomplishments, dating from 1969-1989, include: dose optimization using computer control; soft wedges; stereotactic radiosurgery (SRS); total-body irradiation (TBI); CT planning; and radiation dosimetry. Bengt worked at Brown University in Rhode Island and at the University of Pennsylvania in Philadelphia, where he provided major contributions in radiation dosimetry, specifically with the head scatter model. He advocated for superior calculation algorithms through the Helax treatment planning system, which was on par with most commercial systems. Bengt served as AAPM president in 1979 and was a recipient of the Coolidge Award in 1998. He had a lifelong love of nature, retiring in 2000 from the University of Pennsylvania to take care of his 200 acres of homestead forest in Maine. His legacy continues through his contributions to radiation dosimetry. This session, on small field dosimetry, is a small tribute to his memory. Further details can be found in his obituary in Med. Phys. 41(4), 040801, 2014.

  9. Hypercube matrix computation task

    NASA Technical Reports Server (NTRS)

    Calalo, Ruel H.; Imbriale, William A.; Jacobi, Nathan; Liewer, Paulett C.; Lockhart, Thomas G.; Lyzenga, Gregory A.; Lyons, James R.; Manshadi, Farzin; Patterson, Jean E.

    1988-01-01

    A major objective of the Hypercube Matrix Computation effort at the Jet Propulsion Laboratory (JPL) is to investigate the applicability of a parallel computing architecture to the solution of large-scale electromagnetic scattering problems. Three scattering analysis codes are being implemented and assessed on a JPL/California Institute of Technology (Caltech) Mark 3 Hypercube. The codes, which utilize different underlying algorithms, give a means of evaluating the general applicability of this parallel architecture. The three analysis codes being implemented are a frequency domain method of moments code, a time domain finite difference code, and a frequency domain finite elements code. These analysis capabilities are being integrated into an electromagnetics interactive analysis workstation which can serve as a design tool for the construction of antennas and other radiating or scattering structures. The first two years of work on the Hypercube Matrix Computation effort is summarized. It includes both new developments and results as well as work previously reported in the Hypercube Matrix Computation Task: Final Report for 1986 to 1987 (JPL Publication 87-18).

  10. Magnetic resonance imaging and computational fluid dynamics (CFD) simulations of rabbit nasal airflows for the development of hybrid CFD/PBPK models.

    PubMed

    Corley, R A; Minard, K R; Kabilan, S; Einstein, D R; Kuprat, A P; Harkema, J R; Kimbell, J S; Gargas, M L; Kinzell, John H

    2009-05-01

    The percentages of total airflows over the nasal respiratory and olfactory epithelium of female rabbits were calculated from computational fluid dynamics (CFD) simulations of steady-state inhalation. These airflow calculations, along with nasal airway geometry determinations, are critical parameters for hybrid CFD/physiologically based pharmacokinetic models that describe the nasal dosimetry of water-soluble or reactive gases and vapors in rabbits. CFD simulations were based upon three-dimensional computational meshes derived from magnetic resonance images of three adult female New Zealand White (NZW) rabbits. In the anterior portion of the nose, the maxillary turbinates of rabbits are considerably more complex than comparable regions in rats, mice, monkeys, or humans. This leads to a greater surface area to volume ratio in this region and thus the potential for increased extraction of water soluble or reactive gases and vapors in the anterior portion of the nose compared to many other species. Although there was considerable interanimal variability in the fine structures of the nasal turbinates and airflows in the anterior portions of the nose, there was remarkable consistency between rabbits in the percentage of total inspired airflows that reached the ethmoid turbinate region (approximately 50%) that is presumably lined with olfactory epithelium. These latter results (airflows reaching the ethmoid turbinate region) were higher than previous published estimates for the male F344 rat (19%) and human (7%). These differences in regional airflows can have significant implications in interspecies extrapolations of nasal dosimetry.

  11. EDITORIAL: Special section: Selected papers from the Third European Workshop on Monte Carlo Treatment Planning (MCTP2012) Special section: Selected papers from the Third European Workshop on Monte Carlo Treatment Planning (MCTP2012)

    NASA Astrophysics Data System (ADS)

    Spezi, Emiliano; Leal, Antonio

    2013-04-01

    The Third European Workshop on Monte Carlo Treatment Planning (MCTP2012) was held from 15-18 May, 2012 in Seville, Spain. The event was organized by the Universidad de Sevilla with the support of the European Workgroup on Monte Carlo Treatment Planning (EWG-MCTP). MCTP2012 followed two successful meetings, one held in Ghent (Belgium) in 2006 (Reynaert 2007) and one in Cardiff (UK) in 2009 (Spezi 2010). The recurrence of these workshops, together with successful events held in parallel by McGill University in Montreal (Seuntjens et al 2012), shows consolidated interest from the scientific community in Monte Carlo (MC) treatment planning. The workshop was attended by a total of 90 participants, mainly coming from a medical physics background. A total of 48 oral presentations and 15 posters were delivered in specific scientific sessions including dosimetry, code development, imaging, modelling of photon and electron radiation transport, external beam radiation therapy, nuclear medicine, brachytherapy and hadrontherapy. A copy of the programme is available on the workshop's website (www.mctp2012.com). In this special section of Physics in Medicine and Biology we report six papers that were selected following the journal's rigorous peer review procedure. These papers provide a good cross section of the areas of application of MC in treatment planning that were discussed at MCTP2012. Czarnecki and Zink (2013) and Wagner et al (2013) present the results of their work in small field dosimetry. Czarnecki and Zink (2013) studied field-size- and detector-dependent correction factors for diodes and ion chambers within a clinical 6 MV photon beam generated by a Siemens linear accelerator. Their modelling work, based on the BEAMnrc/EGSnrc codes and experimental measurements, revealed that unshielded diodes were the best choice for small field dosimetry because of their independence from the electron beam spot size and their correction factors close to unity.
Wagner et al (2013) investigated the recombination effect in liquid ionization chambers for stereotactic radiotherapy, a field of increasing importance in external beam radiotherapy. They modelled both the radiation source (CyberKnife unit) and the detector with the BEAMnrc/EGSnrc codes and quantified the dependence of the response of this type of detector on factors such as the volume effect and the electrode. They also recommended that these dependences be accounted for in measurements involving small fields. In the field of external beam radiotherapy, Chakarova et al (2013) showed how total body irradiation (TBI) could be improved by simulating patient treatments with MC. In particular, BEAMnrc/EGSnrc-based simulations highlighted the importance of optimizing individual compensators for TBI treatments. In the same area of application, Mairani et al (2013) reported on a new tool for treatment planning in proton therapy based on the FLUKA MC code. The software, used to model both the proton therapy beam and the patient anatomy, supports single-field and multiple-field optimization and can be used to optimize physical and relative biological effectiveness (RBE)-weighted dose distributions, using both constant and variable RBE models. In the field of nuclear medicine, Marcatili et al (2013) presented RAYDOSE, a Geant4-based code specifically developed for applications in molecular radiotherapy (MRT). RAYDOSE has been designed to work in MRT trials using sequential positron emission tomography (PET) or single-photon emission tomography (SPECT) imaging to model patient-specific time-dependent metabolic uptake and to calculate the total 3D dose distribution. The code was validated through experimental measurements in homogeneous and heterogeneous phantoms. Finally, in the field of code development, Miras et al (2013) reported on CloudMC, a Windows Azure-based application for the parallelization of MC calculations in a dynamic cluster environment.
Although the performance of CloudMC has been tested with the PENELOPE MC code, the authors report that the software has been designed to be independent of the type of MC code, provided that the simulation meets a number of operational criteria. We wish to thank Elekta/CMS Inc., the University of Seville, the Junta of Andalusia and the European Regional Development Fund for their financial support. We would also like to acknowledge the members of EWG-MCTP for their help in peer-reviewing all the abstracts, and all the invited speakers who kindly agreed to deliver keynote presentations in their areas of expertise. A final word of thanks to our colleagues who worked on the reviewing process of the papers selected for this special section and to the IOP Publishing staff who made it possible. MCTP2012 was accredited by the European Federation of Organisations for Medical Physics as a CPD event for medical physicists. Emiliano Spezi and Antonio Leal Guest Editors References Chakarova R, Müntzing K, Krantz M, Hedin E and Hertzman S 2013 Monte Carlo optimization of total body irradiation in a phantom and patient geometry Phys. Med. Biol. 58 2461-69 Czarnecki D and Zink K 2013 Monte Carlo calculated correction factors for diodes and ion chambers in small photon fields Phys. Med. Biol. 58 2431-44 Mairani A, Böhlen T T, Schiavi A, Tessonnier T, Molinelli S, Brons S, Battistoni G, Parodi K and Patera V 2013 A Monte Carlo-based treatment planning tool for proton therapy Phys. Med. Biol. 58 2471-90 Marcatili S, Pettinato C, Daniels S, Lewis G, Edwards P, Fanti S and Spezi E 2013 Development and validation of RAYDOSE: a Geant4 based application for molecular radiotherapy Phys. Med. Biol. 58 2491-508 Miras H, Jiménez R, Miras C and Gomà C 2013 CloudMC: A cloud computing application for Monte Carlo simulation Phys. Med. Biol. 58 N125-33 Reynaert N 2007 First European Workshop on Monte Carlo Treatment Planning J. Phys.: Conf. Ser. 
74 011001 Seuntjens J, Beaulieu L, El Naqa I and Després P 2012 Special section: Selected papers from the Fourth International Workshop on Recent Advances in Monte Carlo Techniques for Radiation Therapy Phys. Med. Biol. 57 (11) E01 Spezi E 2010 Special section: Selected papers from the Second European Workshop on Monte Carlo Treatment Planning (MCTP2009) Phys. Med. Biol. 55 (16) E01 Wagner A, Crop F, Lacornerie T, Vandevelde F and Reynaert N 2013 Use of a liquid ionization chamber for stereotactic radiotherapy dosimetry Phys. Med. Biol. 58 2445-59

  12. Calculation of Water Drop Trajectories to and About Arbitrary Three-Dimensional Bodies in Potential Airflow

    NASA Technical Reports Server (NTRS)

    Norment, H. G.

    1980-01-01

    Calculations can be performed for any atmospheric conditions and for all water drop sizes, from the smallest cloud droplet to large raindrops. Any subsonic, external, non-lifting flow can be accommodated; flow into, but not through, inlets also can be simulated. Experimental water drop drag relations are used in the water drop equations of motion and effects of gravity settling are included. Seven codes are described: (1) a code used to debug and plot body surface description data; (2) a code that processes the body surface data to yield the potential flow field; (3) a code that computes flow velocities at arrays of points in space; (4) a code that computes water drop trajectories from an array of points in space; (5) a code that computes water drop trajectories and fluxes to arbitrary target points; (6) a code that computes water drop trajectories tangent to the body; and (7) a code that produces stereo pair plots which include both the body and trajectories. Code descriptions include operating instructions, card inputs and printouts for example problems, and listing of the FORTRAN codes. Accuracy of the calculations is discussed, and trajectory calculation results are compared with prior calculations and with experimental data.
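    The equations of motion described above (drop inertia, drag toward the local air velocity, and gravity settling) can be illustrated with a toy time-stepping loop. This is a minimal sketch assuming a simplified linear drag law with a hypothetical response time `tau`; the report's codes use experimental water drop drag relations and a computed potential-flow field instead.

```python
def step_droplet(pos, vel, air_vel, tau, g=9.81, dt=1e-3):
    """Advance a water drop one Euler step under a linear drag law.

    tau is a hypothetical droplet response time; the report's codes use
    experimental drag relations rather than this simplified linear form.
    """
    ax = (air_vel[0] - vel[0]) / tau       # drag toward the air velocity
    ay = (air_vel[1] - vel[1]) / tau - g   # drag plus gravity settling
    vel = (vel[0] + ax * dt, vel[1] + ay * dt)
    pos = (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt)
    return pos, vel

# Release a drop at rest into a uniform 10 m/s horizontal airflow.
pos, vel = (0.0, 0.0), (0.0, 0.0)
for _ in range(5000):                      # 5 s of simulated flight
    pos, vel = step_droplet(pos, vel, (10.0, 0.0), tau=0.5)
```

In a real icing calculation, `air_vel` would be evaluated at the drop position from the potential-flow field produced by the body-surface codes listed above.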

  13. Utilizing GPUs to Accelerate Turbomachinery CFD Codes

    NASA Technical Reports Server (NTRS)

    MacCalla, Weylin; Kulkarni, Sameer

    2016-01-01

    GPU computing has established itself as a way to accelerate parallel codes in the high performance computing world. This work focuses on speeding up APNASA, a legacy CFD code used at NASA Glenn Research Center, while also drawing conclusions about the nature of GPU computing and the requirements to make GPGPU worthwhile on legacy codes. Rewriting and restructuring of the source code was avoided to limit the introduction of new bugs. The code was profiled and investigated for parallelization potential, then OpenACC directives were used to indicate parallel parts of the code. The use of OpenACC directives was not able to reduce the runtime of APNASA on either the NVIDIA Tesla discrete graphics card, or the AMD accelerated processing unit. Additionally, it was found that in order to justify the use of GPGPU, the amount of parallel work being done within a kernel would have to greatly exceed the work being done by any one portion of the APNASA code. It was determined that in order for an application like APNASA to be accelerated on the GPU, it should not be modular in nature, and the parallel portions of the code must contain a large portion of the code's computation time.

  14. PASCO: Structural panel analysis and sizing code: Users manual - Revised

    NASA Technical Reports Server (NTRS)

    Anderson, M. S.; Stroud, W. J.; Durling, B. J.; Hennessy, K. W.

    1981-01-01

    A computer code denoted PASCO is described for analyzing and sizing uniaxially stiffened composite panels. Buckling and vibration analyses are carried out with a linked plate analysis computer code denoted VIPASA, which is included in PASCO. Sizing is based on nonlinear mathematical programming techniques and employs a computer code denoted CONMIN, also included in PASCO. Design requirements considered are initial buckling, material strength, stiffness and vibration frequency. A user's manual for PASCO is presented.

  15. Computation of Reacting Flows in Combustion Processes

    NASA Technical Reports Server (NTRS)

    Keith, Theo G., Jr.; Chen, Kuo-Huey

    1997-01-01

    The main objective of this research was to develop an efficient three-dimensional computer code for chemically reacting flows. The resulting code, ALLSPD-3D, calculates three-dimensional, chemically reacting flows with sprays. The ALLSPD code employs a coupled, strongly implicit solution procedure for turbulent spray combustion flows. A stochastic droplet model and an efficient method for treatment of the spray source terms in the gas-phase equations are used to calculate the evaporating liquid sprays. The chemistry treatment in the code is general enough that an arbitrary number of reactions and species can be defined by the user. Also, it is written in generalized curvilinear coordinates with both multi-block and flexible internal blockage capabilities to handle complex geometries. In addition, for general industrial combustion applications, the code provides both dilution and transpiration cooling capabilities. The ALLSPD algorithm, which employs preconditioning and eigenvalue rescaling techniques, is capable of providing efficient solutions for flows with a wide range of Mach numbers. Although written for three-dimensional flows in general, the code can be used for two-dimensional and axisymmetric flow computations as well. The code is written in such a way that it can be run on various computer platforms (supercomputers, workstations and parallel processors), and the GUI (Graphical User Interface) provides a user-friendly tool for setting up and running the code.

  16. Radiation protection and dosimetry issues in the medical applications of ionizing radiation

    NASA Astrophysics Data System (ADS)

    Vaz, Pedro

    2014-11-01

    The technological advances that occurred during the last few decades paved the way to the dissemination of CT-based procedures in radiology, to an increasing number of procedures in interventional radiology and cardiology, as well as to new techniques and hybrid modalities in nuclear medicine and in radiotherapy. These technological advances entail the exposure of patients and medical staff to unprecedentedly high dose values, which are a cause for concern due to the potential detrimental effects of ionizing radiation on human health. As a consequence, new issues and challenges in radiological protection and dosimetry in the medical applications of ionizing radiation have emerged. The scientific knowledge of the radiosensitivity of individuals as a function of age, gender and other factors has also contributed to raising the awareness of scientists, medical staff, regulators, decision makers and other stakeholders (including the patients and the public) to the need to correctly and accurately assess the radiation-induced long-term health effects after medical exposure. Pediatric exposures and their late effects became a cause of great concern. The scientific communities of experts involved in the study of the biological effects of ionizing radiation have made a strong case for the need to undertake low-dose radiation research, and the International System of Radiological Protection is being challenged to address and incorporate issues such as individual sensitivities, the shape of the dose-response relationship, and tissue sensitivity for cancer and non-cancer effects. Some of the answers to the radiation protection and dosimetry issues and challenges in the medical applications of ionizing radiation lie in computational studies using Monte Carlo or hybrid methods to model and simulate particle transport in the organs and tissues of the human body. 
The development of sophisticated Monte Carlo computer programs and voxel phantoms paves the way to an accurate dosimetric assessment of the medical applications of ionizing radiation. In this paper, the aforementioned topics will be reviewed. The current status and the future trends in the implementation of the justification and optimization principles, pillars of the International System of Radiological Protection, in the medical applications of ionizing radiation will be discussed. Prospective views will be provided on the future of the system of radiological protection and on dosimetry issues in the medical applications of ionizing radiation.

  17. NASA Rotor 37 CFD Code Validation: Glenn-HT Code

    NASA Technical Reports Server (NTRS)

    Ameri, Ali A.

    2010-01-01

    In order to advance the goals of NASA aeronautics programs, it is necessary to continuously evaluate and improve the computational tools used for research and design at NASA. One such code is the Glenn-HT code which is used at NASA Glenn Research Center (GRC) for turbomachinery computations. Although the code has been thoroughly validated for turbine heat transfer computations, it has not been utilized for compressors. In this work, Glenn-HT was used to compute the flow in a transonic compressor and comparisons were made to experimental data. The results presented here are in good agreement with this data. Most of the measures of performance are well within the measurement uncertainties and the exit profiles of interest agree with the experimental measurements.

  18. Final report for the Tera Computer TTI CRADA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davidson, G.S.; Pavlakos, C.; Silva, C.

    1997-01-01

    Tera Computer and Sandia National Laboratories have completed a CRADA, which examined the Tera Multi-Threaded Architecture (MTA) for use with large codes of importance to industry and DOE. The MTA is an innovative architecture that uses parallelism to mask latency between memories and processors. The physical implementation is a parallel computer with high cross-section bandwidth and GaAs processors designed by Tera, which support many small computation threads and fast, lightweight context switches between them. When any thread blocks while waiting for memory accesses to complete, another thread immediately begins execution so that high CPU utilization is maintained. The Tera MTA parallel computer has a single, global address space, which is appealing when porting existing applications to a parallel computer. This ease of porting is further enabled by compiler technology that helps break computations into parallel threads. DOE and Sandia National Laboratories were interested in working with Tera to further develop this computing concept. While Tera Computer would continue the hardware development and compiler research, Sandia National Laboratories would work with Tera to ensure that their compilers worked well with important Sandia codes, most particularly CTH, a shock physics code used for weapon safety computations. In addition to that important code, Sandia National Laboratories would complete research on a robotic path planning code, SANDROS, which is important in manufacturing applications, and would evaluate the MTA performance on this code. Finally, Sandia would work directly with Tera to develop 3D visualization codes, which would be appropriate for use with the MTA. Each of these tasks has been completed to the extent possible, given that Tera has just completed the MTA hardware. All of the CRADA work had to be done on simulators.

  19. Operations analysis (study 2.1). Program listing for the LOVES computer code

    NASA Technical Reports Server (NTRS)

    Wray, S. T., Jr.

    1974-01-01

    A listing of the LOVES computer program is presented. The program is coded partially in SIMSCRIPT and FORTRAN. This version of LOVES is compatible with both the CDC 7600 and the UNIVAC 1108 computers. The code has been compiled, loaded, and executed successfully on the EXEC 8 system for the UNIVAC 1108.

  20. Analysis of the Length of Braille Texts in English Braille American Edition, the Nemeth Code, and Computer Braille Code versus the Unified English Braille Code

    ERIC Educational Resources Information Center

    Knowlton, Marie; Wetzel, Robin

    2006-01-01

    This study compared the length of text in English Braille American Edition, the Nemeth code, and the computer braille code with the Unified English Braille Code (UEBC)--also known as Unified English Braille (UEB). The findings indicate that differences in the length of text are dependent on the type of material that is transcribed and the grade…

  1. A MATLAB based 3D modeling and inversion code for MT data

    NASA Astrophysics Data System (ADS)

    Singh, Arun; Dehiya, Rahul; Gupta, Pravin K.; Israil, M.

    2017-07-01

    The development of a MATLAB-based computer code, AP3DMT, for modeling and inversion of 3D magnetotelluric (MT) data is presented. The code comprises two independent components: a grid generator code and a modeling/inversion code. The grid generator code performs model discretization and acts as an interface by generating various I/O files. The inversion code performs the core computations in modular form - forward modeling, data functionals, sensitivity computations and regularization. These modules can be readily extended to other similar inverse problems, like Controlled-Source EM (CSEM). The modular structure of the code provides a framework useful for implementation of new applications and inversion algorithms. The use of MATLAB and its libraries makes it more compact and user friendly. The code has been validated on several published models. To demonstrate its versatility and capabilities, the results of inversion for two complex models are presented.
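    Inversion modules of the kind listed in the abstract (forward modeling, sensitivities, regularization) typically iterate a damped least-squares model update. The sketch below shows that generic step on a toy linear problem; it illustrates the regularized update only and is not AP3DMT's actual algorithm, and all names are hypothetical.

```python
import numpy as np

def regularized_update(model, J, residual, lam=0.1):
    """One damped (Tikhonov-regularized) least-squares update:
    m <- m + (J^T J + lam*I)^-1 J^T r.  Generic illustration only;
    AP3DMT's actual inversion scheme may differ in detail."""
    n = J.shape[1]
    lhs = J.T @ J + lam * np.eye(n)
    return model + np.linalg.solve(lhs, J.T @ residual)

# Toy linear "forward model": recover m_true from exact data d = J m.
rng = np.random.default_rng(0)
J = rng.normal(size=(20, 3))      # stands in for the sensitivity matrix
m_true = np.array([1.0, -2.0, 0.5])
d = J @ m_true
m = np.zeros(3)
for _ in range(50):               # iterate forward model + update
    m = regularized_update(m, J, d - J @ m)
```

For a nonlinear problem such as MT, `J` and the residual would be recomputed by the forward-modeling and sensitivity modules at each iteration.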

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chiu-Tsao, S.

    Since the introduction of radiochromic films (RCF) for radiation dosimetry, the scope of RCF dosimetry has expanded steadily to include many medical applications, such as radiation therapy and diagnostic radiology. The AAPM Task Group (TG) 55 published a report on the recommendations for RCF dosimetry in 1998. As the technology is advancing rapidly, and its routine clinical use is expanding, TG 235 has been formed to provide an update to TG-55 on radiochromic film dosimetry. RCF dosimetry applications in clinical radiotherapy have become even more widespread, expanding from primarily brachytherapy and radiosurgery applications and gravitating towards (but not limited to) external beam therapy (photon, electron and proton), such as quality assurance for IMRT, VMAT, Tomotherapy, SRS/SRT, and SBRT. In addition, RCF applications now extend to measurements of radiation dose in particle beams and in patients undergoing medical exams, especially fluoroscopically guided interventional procedures and CT. The densitometers/scanners used for RCF dosimetry have also evolved from the He-Ne laser scanner to CCD-based scanners, including the roller-based scanner, the light-box-based digital camera, and the flatbed color scanner. More recently, multichannel RCF dosimetry introduced a new paradigm for external beam dose QA owing to its high accuracy and efficiency. This course covers in detail the recent advancements in RCF dosimetry. Learning Objectives: (1) introduce the paradigm shift to multichannel film dosimetry; (2) outline the procedures to achieve accurate dosimetry with an RCF dosimetry system; (3) provide comprehensive guidelines on RCF dosimetry for various clinical applications. One of the speakers has a research agreement with Ashland Inc., the manufacturer of Gafchromic film.

  3. Neutron spectrometry and dosimetry in 100 and 300 MeV quasi-mono-energetic neutron field at RCNP, Osaka University, Japan

    NASA Astrophysics Data System (ADS)

    Mares, Vladimir; Trinkl, Sebastian; Iwamoto, Yosuke; Masuda, Akihiko; Matsumoto, Tetsuro; Hagiwara, Masayuki; Satoh, Daiki; Yashima, Hiroshi; Shima, Tatsushi; Nakamura, Takashi

    2017-09-01

    This paper describes the results of neutron spectrometry and dosimetry measurements using an extended-range Bonner Sphere Spectrometer (ERBSS) with a 3He proportional counter, performed in quasi-mono-energetic neutron fields at the ring cyclotron facility of the Research Center for Nuclear Physics (RCNP), Osaka University, Japan. Using 100 MeV and 296 MeV proton beams, neutron fields with nominal peak energies of 96 MeV and 293 MeV were generated via 7Li(p,n)7Be reactions. Neutrons produced at 0° and 25° emission angles were extracted into the 100 m long time-of-flight (TOF) tunnel, and the energy spectra were measured at a distance of 35 m from the target. To deduce the corresponding neutron spectra from thermal energies up to the nominal maximum energy, the ERBSS data were unfolded using the MSANDB unfolding code. At high energies, the neutron spectra were also measured by means of the TOF method using NE213 organic liquid scintillators. The results are discussed in terms of ambient dose equivalent, H*(10), and compared with the readings of other instruments operated during the experiment.
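    Unfolding recovers a spectrum Φ from the sphere counts N ≈ RΦ, where R is the detector response matrix. The sketch below applies a generic multiplicative iterative scheme to an invented 3-sphere, 3-bin problem; the MSANDB code used in the paper implements its own, far more sophisticated algorithm, and the response values here are placeholders.

```python
import numpy as np

def unfold(R, counts, iters=1000):
    """Multiplicative iterative unfolding of counts ~= R @ phi.

    Generic illustration only (not the MSANDB algorithm).
    R[i, j] >= 0 is the response of sphere i to energy bin j."""
    phi = np.ones(R.shape[1])            # flat starting spectrum
    for _ in range(iters):
        pred = R @ phi                   # folded (predicted) counts
        correction = (R.T @ (counts / pred)) / R.sum(axis=0)
        phi *= correction                # pull predictions toward counts
    return phi

# Invented 3-sphere, 3-bin response matrix and a known test spectrum.
R = np.array([[0.9, 0.3, 0.05],
              [0.4, 0.8, 0.30],
              [0.1, 0.4, 0.90]])
phi_true = np.array([2.0, 1.0, 0.5])
counts = R @ phi_true                    # noise-free synthetic counts
phi_est = unfold(R, counts)
```

The multiplicative form keeps the unfolded spectrum non-negative at every iteration, which is essential for a physical flux.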

  4. Applications of automatic differentiation in computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.; Carle, A.; Bischof, C.; Haigler, Kara J.; Newman, Perry A.

    1994-01-01

    Automatic differentiation (AD) is a powerful computational method for computing exact sensitivity derivatives (SD) from existing computer programs for use in multidisciplinary design optimization (MDO) or sensitivity analysis. A pre-compiler AD tool for FORTRAN programs called ADIFOR has been developed. The ADIFOR tool has been easily and quickly applied by NASA Langley researchers to assess the feasibility and computational impact of AD in MDO with several different FORTRAN programs. These include a state-of-the-art three-dimensional multigrid Navier-Stokes flow solver for wings or aircraft configurations in transonic turbulent flow. With ADIFOR the user specifies sets of independent and dependent variables within an existing computer code. ADIFOR then traces the dependency path throughout the code, applies the chain rule to formulate derivative expressions, and generates new code to compute the required SD matrix. The resulting codes have been verified to compute exact non-geometric and geometric SD for a variety of cases, in less time than is required to compute the SD matrix using centered divided differences.
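    The chain-rule propagation that ADIFOR automates for FORTRAN source can be illustrated in miniature with dual numbers, which carry a value and its exact derivative through every arithmetic operation. This is a conceptual sketch of forward-mode AD, not ADIFOR's source-to-source transformation.

```python
import math

class Dual:
    """Forward-mode AD value: carries f and df/dx together so that each
    arithmetic operation applies the chain rule exactly."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def _lift(self, o):
        return o if isinstance(o, Dual) else Dual(float(o))
    def __add__(self, o):
        o = self._lift(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__
    def __mul__(self, o):
        o = self._lift(o)
        return Dual(self.val * o.val,
                    self.der * o.val + self.val * o.der)  # product rule
    __rmul__ = __mul__

def sin(x):
    """Chain rule for sin: d(sin u) = cos(u) * du."""
    return Dual(math.sin(x.val), math.cos(x.val) * x.der)

# Differentiate f(x) = x*sin(x) + 3x at x = 2; seed d/dx of x as 1.
x = Dual(2.0, 1.0)
y = x * sin(x) + 3 * x
```

Unlike centered divided differences, the derivative obtained this way is exact to machine precision, which is the advantage the abstract reports for ADIFOR-generated code.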

  5. Fundamentals, current state of the development of, and prospects for further improvement of the new-generation thermal-hydraulic computational HYDRA-IBRAE/LM code for simulation of fast reactor systems

    NASA Astrophysics Data System (ADS)

    Alipchenkov, V. M.; Anfimov, A. M.; Afremov, D. A.; Gorbunov, V. S.; Zeigarnik, Yu. A.; Kudryavtsev, A. V.; Osipov, S. L.; Mosunova, N. A.; Strizhov, V. F.; Usov, E. V.

    2016-02-01

    The conceptual fundamentals of the development of the new-generation system thermal-hydraulic computational HYDRA-IBRAE/LM code are presented. The code is intended to simulate the thermal-hydraulic processes that take place in the loops and the heat-exchange equipment of liquid-metal-cooled fast reactor systems under normal operation and anticipated operational occurrences and during accidents. The paper provides a brief overview of Russian and foreign system thermal-hydraulic codes for modeling liquid-metal coolants and gives grounds for the necessity of developing a new-generation HYDRA-IBRAE/LM code. Considering the specific engineering features of the nuclear power plants (NPPs) equipped with the BN-1200 and the BREST-OD-300 reactors, the processes and the phenomena are singled out that require a detailed analysis and development of models to be correctly described by the system thermal-hydraulic code in question. Information on the functionality of the computational code is provided, viz., the thermal-hydraulic two-phase model, the properties of the sodium and the lead coolants, the closing equations for simulation of the heat-mass exchange processes, the models describing the processes that take place during a steam-generator tube rupture, etc. The article gives a brief overview of the usability of the computational code, including a description of the support documentation and the supply package, as well as the possibilities of taking advantage of modern computer technologies, such as parallel computations. The paper shows the current state of verification and validation of the computational code; it also presents information on the principles of constructing and populating the verification matrices for the BREST-OD-300 and the BN-1200 reactor systems. The prospects are outlined for further development of the HYDRA-IBRAE/LM code, introduction of new models into it, and enhancement of its usability. It is shown that the program of development and practical application of the code will make it possible in the near future to analyze the safety of prospective NPP designs at a qualitatively higher level.

  6. Performance assessment of KORAT-3D on the ANL IBM-SP computer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alexeyev, A.V.; Zvenigorodskaya, O.A.; Shagaliev, R.M.

    1999-09-01

    The TENAR code is currently being developed at the Russian Federal Nuclear Center (VNIIEF) as a coupled dynamics code for the simulation of transients in VVER and RBMK systems and other nuclear systems. The neutronic module in this code system is KORAT-3D. This module is also one of the most computationally intensive components of the code system. A parallel version of KORAT-3D has been implemented to achieve the goal of obtaining transient solutions in reasonable computational time, particularly for RBMK calculations that involve the application of >100,000 nodes. An evaluation of the KORAT-3D code performance was recently undertaken on the Argonne National Laboratory (ANL) IBM ScalablePower (SP) parallel computer located in the Mathematics and Computer Science Division of ANL. At the time of the study, the ANL IBM-SP computer had 80 processors. This study was conducted under the auspices of a technical staff exchange program sponsored by the International Nuclear Safety Center (INSC).

  7. Profugus

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, Thomas; Hamilton, Steven; Slattery, Stuart

    Profugus is an open-source mini-application (mini-app) for radiation transport and reactor applications. It contains the fundamental computational kernels used in the Exnihilo code suite from Oak Ridge National Laboratory. However, Exnihilo is a production code with a substantial user base, and it is export controlled. This makes collaboration with computer scientists and computer engineers difficult. Profugus is designed to bridge that gap. By encapsulating the core numerical algorithms in an abbreviated, open-source code base, computer scientists can analyze the algorithms and easily make code-architectural changes to test performance without compromising the production-code values of Exnihilo. Profugus is not meant to be production software with respect to problem analysis. The computational kernels in Profugus are designed to analyze performance, not correctness. Nonetheless, users of Profugus can set up and run problems with enough real-world features to be useful as a proof of concept for actual production work.

  8. Diagnostic x-ray dosimetry using Monte Carlo simulation.

    PubMed

    Ioppolo, J L; Price, R I; Tuchyna, T; Buckley, C E

    2002-05-21

    An Electron Gamma Shower version 4 (EGS4) based user code was developed to simulate the absorbed dose in humans during routine diagnostic radiological procedures. Measurements of absorbed dose using thermoluminescent dosimeters (TLDs) were compared directly with EGS4 simulations of absorbed dose in homogeneous, heterogeneous and anthropomorphic phantoms. Realistic voxel-based models characterizing the geometry of the phantoms were used as input to the EGS4 code. The voxel geometry of the anthropomorphic Rando phantom was derived from a CT scan of Rando. The 100 kVp diagnostic-energy x-ray spectra of the apparatus used to irradiate the phantoms were measured and provided as input to the EGS4 code. The TLDs were placed at evenly spaced points symmetrically about the central beam axis, which was perpendicular to the cathode-anode x-ray axis, at a number of depths. The TLD measurements in the homogeneous and heterogeneous phantoms were on average within 7% of the values calculated by EGS4. Estimates of effective dose with errors less than 10% required fewer photon histories (1 × 10^7) than the calculation of dose profiles (1 × 10^9). The EGS4 code was able to predict dose satisfactorily and thereby provide an instrument for reducing the patient and staff effective dose imparted during radiological investigations.
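    The abstract's observation that integral dose estimates need far fewer histories than dose profiles reflects a generic Monte Carlo property: the relative error of a tallied mean shrinks as 1/√N. A toy illustration follows, assuming a made-up attenuation coefficient and an absorb-at-first-interaction simplification that real EGS4 transport does not make.

```python
import math, random

def absorbed_fraction(n_photons, mu=0.17, thickness=10.0, seed=1):
    """Toy Monte Carlo: photons enter a slab; each travels an
    exponentially distributed free path (attenuation coefficient mu in
    1/cm) and is scored as absorbed if it interacts inside the slab.
    Real EGS4 transport also models scattering, secondaries, etc.; mu
    here is a round illustrative value, not a measured coefficient."""
    rng = random.Random(seed)
    absorbed = 0
    for _ in range(n_photons):
        path = -math.log(1.0 - rng.random()) / mu   # sampled free path
        if path < thickness:
            absorbed += 1
    return absorbed / n_photons

# The analytic answer is 1 - exp(-mu*t); the MC error shrinks as 1/sqrt(N).
est = absorbed_fraction(100_000)
exact = 1.0 - math.exp(-0.17 * 10.0)
```

A single integral tally like this converges quickly, whereas a spatial profile splits the same histories over many bins, which is why profiles in the abstract needed about two orders of magnitude more histories.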

  9. Diagnostic x-ray dosimetry using Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Ioppolo, J. L.; Price, R. I.; Tuchyna, T.; Buckley, C. E.

    2002-05-01

    An Electron Gamma Shower version 4 (EGS4) based user code was developed to simulate the absorbed dose in humans during routine diagnostic radiological procedures. Measurements of absorbed dose using thermoluminescent dosimeters (TLDs) were compared directly with EGS4 simulations of absorbed dose in homogeneous, heterogeneous and anthropomorphic phantoms. Realistic voxel-based models characterizing the geometry of the phantoms were used as input to the EGS4 code. The voxel geometry of the anthropomorphic Rando phantom was derived from a CT scan of Rando. The 100 kVp diagnostic-energy x-ray spectra of the apparatus used to irradiate the phantoms were measured and provided as input to the EGS4 code. The TLDs were placed at evenly spaced points symmetrically about the central beam axis, which was perpendicular to the cathode-anode x-ray axis, at a number of depths. The TLD measurements in the homogeneous and heterogeneous phantoms were on average within 7% of the values calculated by EGS4. Estimates of effective dose with errors less than 10% required fewer photon histories (1 × 10^7) than the calculation of dose profiles (1 × 10^9). The EGS4 code was able to predict dose satisfactorily and thereby provide an instrument for reducing the patient and staff effective dose imparted during radiological investigations.

  10. Fast H.264/AVC FRExt intra coding using belief propagation.

    PubMed

    Milani, Simone

    2011-01-01

    In the H.264/AVC FRExt coder, the coding performance of Intra coding significantly surpasses that of previous still-image coding standards, like JPEG2000, thanks to a massive use of spatial prediction. Unfortunately, the adoption of an extensive set of predictors induces a significant increase in the computational complexity required by the rate-distortion optimization routine. The paper presents a complexity reduction strategy that aims at reducing the computational load of Intra coding with a small loss in compression performance. The proposed algorithm relies on selecting a reduced set of prediction modes according to their probabilities, which are estimated by adopting a belief-propagation procedure. Experimental results show that the proposed method permits saving up to 60% of the coding time required by an exhaustive rate-distortion optimization method with a negligible loss in performance. Moreover, it permits accurate control of the computational complexity, unlike other methods where the computational complexity depends upon the coded sequence.
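    The mode-selection idea can be sketched independently of the coder: rank the candidate Intra prediction modes by estimated probability and pass only the smallest subset covering most of the probability mass to the expensive rate-distortion search. The probabilities below are invented placeholders; in the paper they come from the belief-propagation estimate.

```python
def prune_modes(mode_probs, coverage=0.9):
    """Keep the most probable Intra prediction modes until their
    cumulative probability reaches `coverage`; only the kept modes
    enter the costly rate-distortion search."""
    ranked = sorted(mode_probs.items(), key=lambda kv: kv[1], reverse=True)
    kept, total = [], 0.0
    for mode, p in ranked:
        kept.append(mode)
        total += p
        if total >= coverage:
            break
    return kept

# Nine 4x4 Intra modes with invented, skewed probability estimates.
probs = {"V": 0.30, "H": 0.25, "DC": 0.20, "DDL": 0.08, "DDR": 0.06,
         "VR": 0.04, "HD": 0.03, "VL": 0.02, "HU": 0.02}
subset = prune_modes(probs, coverage=0.9)   # 6 of the 9 modes survive
```

Raising or lowering `coverage` trades compression performance against runtime, which is how such a scheme gives direct control over computational complexity.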

  11. 2,445 Hours of Code: What I Learned from Facilitating Hour of Code Events in High School Libraries

    ERIC Educational Resources Information Center

    Colby, Jennifer

    2015-01-01

    This article describes a school librarian's experience with initiating an Hour of Code event for her school's student body. Hadi Partovi of Code.org conceived the Hour of Code "to get ten million students to try one hour of computer science" (Partovi, 2013a), which is implemented during Computer Science Education Week with a goal of…

  12. Numerical algorithm comparison for the accurate and efficient computation of high-incidence vortical flow

    NASA Technical Reports Server (NTRS)

    Chaderjian, Neal M.

    1991-01-01

    Computations from two Navier-Stokes codes, NSS and F3D, are presented for a tangent-ogive-cylinder body at high angle of attack. Features of this steady flow include a pair of primary vortices on the leeward side of the body as well as secondary vortices. The topological and physical plausibility of this vortical structure is discussed. The accuracy of these codes is assessed by comparison of the numerical solutions with experimental data. The effects of turbulence model, numerical dissipation, and grid refinement are presented. The overall efficiency of these codes is also assessed by examining their convergence rates, computational time per time step, and maximum allowable time step for time-accurate computations. Overall, the numerical results from both codes compared equally well with experimental data; however, the NSS code was found to be significantly more efficient than the F3D code.

  13. User's Manual for FEMOM3DR. Version 1.0

    NASA Technical Reports Server (NTRS)

    Reddy, C. J.

    1998-01-01

    FEMOM3DR is a computer code written in FORTRAN 77 to compute the radiation characteristics of antennas on a 3D body using a combined Finite Element Method (FEM)/Method of Moments (MoM) technique. The code is written to handle different feeding structures such as coaxial line, rectangular waveguide, and circular waveguide. This code uses tetrahedral elements with vector edge basis functions for the FEM and triangular elements with roof-top basis functions for the MoM. By virtue of the FEM, this code can handle arbitrarily shaped three-dimensional bodies with inhomogeneous lossy materials, and owing to the MoM the computational domain can be terminated in any arbitrary shape. The User's Manual is written to acquaint the user with the operation of the code. The user is assumed to be familiar with the FORTRAN 77 language and the operating environment of the computers on which the code is intended to run.

  14. Selection of a computer code for Hanford low-level waste engineered-system performance assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McGrail, B.P.; Mahoney, L.A.

    Planned performance assessments for the proposed disposal of low-level waste (LLW) glass produced from remediation of wastes stored in underground tanks at Hanford, Washington will require calculations of radionuclide release rates from the subsurface disposal facility. These calculations will be done with the aid of computer codes. Currently available computer codes were ranked in terms of the feature sets implemented in the code that match a set of physical, chemical, numerical, and functional capabilities needed to assess release rates from the engineered system. The needed capabilities were identified from an analysis of the important physical and chemical processes expected to affect LLW glass corrosion and the mobility of radionuclides. The highest ranked computer code was found to be the ARES-CT code, developed at PNL for the US Department of Energy for the evaluation of land disposal sites.

  15. User's manual for a material transport code on the Octopus Computer Network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Naymik, T.G.; Mendez, G.D.

    1978-09-15

    A code to simulate material transport through porous media was developed at Oak Ridge National Laboratory. This code has been modified and adapted for use at Lawrence Livermore Laboratory. This manual, in conjunction with report ORNL-4928, explains the input, output, and execution of the code on the Octopus Computer Network.

  16. A gradient of radioactive contamination in Dolon village near the SNTS and comparison of computed dose values with instrumental estimates for the 29 August, 1949 nuclear test.

    PubMed

    Stepanenko, Valeriy F; Hoshi, Masaharu; Dubasov, Yuriy V; Sakaguchi, Aya; Yamamoto, Masayoshi; Orlov, Mark Y; Bailiff, Ian K; Ivannikov, Alexander I; Skvortsov, Valeriy G; Iaskova, Elena K; Kryukova, Irina G; Zhumadilov, Kassym S; Endo, Satoru; Tanaka, Kenichi; Apsalikov, Kazbek N; Gusev, Boris I

    2006-02-01

    Spatial distributions of soil contamination by 137Cs (89 sampling points) and 239+240Pu (76 points) near and within Dolon village were analyzed. A pronounced exponential decrease in contamination was found across Dolon village: the half-reduction distance is about 0.87-1.25 km (in a northwest-southeast direction from the supposed centerline of the radioactive trace). This finding is in agreement with the available exposure rate measurements near Dolon (September 1949 archive data): on the basis of a few measurements, the pattern of the trace was estimated to comprise a narrow 2-km corridor of maximum exposure rate. To compare computed external doses in air with local dose estimates by retrospective luminescence dosimetry (RLD), the gradient of radioactive soil contamination within the village was accounted for. The computed dose associated with the central axis of the trace was found to be equal to 2260 mGy (calculations based on archive exposure rate data). Local doses near the RLD sampling points (southeast of the village) were calculated to be in the range 466-780 mGy (averaged value: 645+/-70 mGy), which is comparable with the RLD data (averaged value 460+/-92 mGy, with range 380-618 mGy). A comparison of the computed mean dose in the settlement with dose estimates by ESR tooth enamel dosimetry makes it possible to estimate the "upper level" of the "shielding and behavior" dose reduction factor for inhabitants of Dolon village, which was found to be 0.28+/-0.068.
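
The two dose-reduction quantities in this abstract are simple ratios. The sketch below (Python, with illustrative numbers rather than the study's data) shows how a half-reduction distance follows from an exponential contamination fit and how the "shielding and behavior" factor is formed.

```python
import math

def half_distance(decay_const_per_km):
    """Distance over which an exponentially decreasing contamination
    level falls by half: d_half = ln(2) / lambda."""
    return math.log(2) / decay_const_per_km

# A decay constant of ~0.693/km corresponds to a ~1 km half-reduction
# distance, in the middle of the 0.87-1.25 km range reported above.
d_half = half_distance(0.693)

def shielding_factor(inhabitant_dose_mgy, free_in_air_dose_mgy):
    """'Shielding and behavior' factor: ratio of the dose received by
    inhabitants (e.g. from ESR tooth enamel dosimetry) to the computed
    free-in-air dose."""
    return inhabitant_dose_mgy / free_in_air_dose_mgy
```

With the abstract's central-axis dose of 2260 mGy, a factor of 0.28 corresponds to an inhabitant dose of about 633 mGy.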

  17. Performance analysis of three dimensional integral equation computations on a massively parallel computer. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Logan, Terry G.

    1994-01-01

    The purpose of this study is to investigate the performance of integral equation computations using a numerical source field-panel method in a massively parallel processing (MPP) environment. A comparative study of the computational performance of the MPP CM-5 computer and a conventional Cray Y-MP supercomputer for a three-dimensional flow problem is made. A serial FORTRAN code is converted into a parallel CM Fortran code. Performance results are obtained on the CM-5 with 32, 64, and 128 nodes, along with those on the Cray Y-MP with a single processor. The comparison indicates that the parallel CM Fortran code approaches or exceeds the performance of the equivalent serial FORTRAN code for some cases.

  18. Pediatric Phantom Dosimetry of Kodak 9000 Cone-beam Computed Tomography.

    PubMed

    Yepes, Juan F; Booe, Megan R; Sanders, Brian J; Jones, James E; Ehrlich, Ygal; Ludlow, John B; Johnson, Brandon

    2017-05-15

    The purpose of the study was to evaluate the radiation dose of the Kodak 9000 cone-beam computed tomography (CBCT) device for different anatomical areas using a pediatric phantom. Absorbed doses resulting from three-by-five-cm maxillary and mandibular CBCT volumes of an anthropomorphic 10-year-old child phantom were acquired using optically stimulated luminescence dosimetry. Equivalent doses were calculated for radiosensitive tissues in the head and neck area, and effective doses for the maxillary and mandibular examinations were calculated following the 2007 recommendations of the International Commission on Radiological Protection (ICRP). Of the mandibular scans, the salivary glands had the highest equivalent dose (1,598 microsieverts [μSv]), followed by the oral mucosa (1,263 μSv), the extrathoracic airway (pharynx, larynx, and trachea; 859 μSv), and the thyroid gland (578 μSv). For the maxilla, the salivary glands had the highest equivalent dose (1,847 μSv), followed closely by the oral mucosa (1,673 μSv), then the extrathoracic airway (pharynx, larynx, and trachea; 1,011 μSv) and the lens of the eye (202 μSv). Compared to previous research on the Kodak 9000 completed with an adult phantom, a child receives one to three times more radiation for mandibular scans and two to 10 times more radiation for maxillary scans.
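
The ICRP 103 effective dose behind figures like these is a tissue-weighted sum, E = Σ_T w_T H_T. A minimal sketch, assuming the ICRP 103 weights (salivary glands 0.01, thyroid 0.04, remainder tissues such as the oral mucosa and extrathoracic region sharing 0.12 across 13 tissues) and ignoring all tissues outside the scan volume:

```python
def effective_dose_usv(equivalent_doses_usv, tissue_weights):
    """E = sum over tissues of w_T * H_T (ICRP 103).  Tissues not
    listed are treated here as receiving zero dose, which makes this
    a partial effective dose, not the paper's full calculation."""
    return sum(tissue_weights[t] * h for t, h in equivalent_doses_usv.items())

# Mandibular-scan equivalent doses (uSv) quoted in the abstract.
mandible = {"salivary_glands": 1598.0, "oral_mucosa": 1263.0,
            "extrathoracic": 859.0, "thyroid": 578.0}

# ICRP 103 weights: salivary glands 0.01, thyroid 0.04; oral mucosa and
# the extrathoracic region are remainder tissues (0.12 shared by 13).
w = {"salivary_glands": 0.01, "oral_mucosa": 0.12 / 13,
     "extrathoracic": 0.12 / 13, "thyroid": 0.04}

e_mandible = effective_dose_usv(mandible, w)  # partial effective dose, uSv
```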

  19. Computer Description of the M561 Utility Truck

    DTIC Science & Technology

    1984-10-01

    This report documents the combinatorial geometry (Com-Geom) computer description of the M561 utility truck. The Com-Geom description is used as input to the Geometric Information for Targets (GIFT) computer code to generate target vulnerability data, which in turn supports sustainability predictions for Army Spare Components Requirements for Combat (SPARC).

  20. TEMPEST: A three-dimensional time-dependent computer program for hydrothermal analysis: Volume 2, Assessment and verification results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eyler, L L; Trent, D S; Budden, M J

    During the course of the TEMPEST computer code development a concurrent effort was conducted to assess the code's performance and the validity of computed results. The results of this work are presented in this document. The principal objective of this effort was to assure the code's computational correctness for a wide range of hydrothermal phenomena typical of fast breeder reactor application. 47 refs., 94 figs., 6 tabs.

  1. CALCULATION OF GAMMA SPECTRA IN A PLASTIC SCINTILLATOR FOR ENERGY CALIBRATION AND DOSE COMPUTATION.

    PubMed

    Kim, Chankyu; Yoo, Hyunjun; Kim, Yewon; Moon, Myungkook; Kim, Jong Yul; Kang, Dong Uk; Lee, Daehee; Kim, Myung Soo; Cho, Minsik; Lee, Eunjoong; Cho, Gyuseong

    2016-09-01

    Plastic scintillation detectors have practical advantages in the field of dosimetry. Energy calibration of measured gamma spectra is important for dose computation, but it is not straightforward in plastic scintillators because of their distinctive response characteristics and finite energy resolution. In this study, the gamma spectra in a polystyrene scintillator were calculated for energy calibration and dose computation. Based on the relationship between the energy resolution and the estimated energy-broadening effect in the calculated spectra, the gamma spectra were calculated simply, without many iterations. The calculated spectra were in agreement with calculations by an existing method and with measurements. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
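
The energy-broadening step described above amounts to convolving an ideal spectrum with an energy-dependent Gaussian. A minimal numpy sketch, assuming a sqrt(E) resolution scaling (a common scintillator parametrization, not the paper's fitted model):

```python
import numpy as np

def broaden(energies_kev, counts, fwhm_at_662_kev=60.0):
    """Smear an ideal spectrum with an energy-dependent Gaussian.
    The FWHM is assumed to scale as sqrt(E), anchored at 662 keV."""
    out = np.zeros_like(counts, dtype=float)
    step = energies_kev[1] - energies_kev[0]
    for e0, n in zip(energies_kev, counts):
        if n == 0:
            continue
        fwhm = fwhm_at_662_kev * np.sqrt(e0 / 662.0)
        sigma = fwhm / 2.355  # convert FWHM to standard deviation
        out += n * step * np.exp(-0.5 * ((energies_kev - e0) / sigma) ** 2) \
               / (sigma * np.sqrt(2.0 * np.pi))
    return out

# Broaden a single 662 keV photopeak of 1000 counts into a measured-like peak.
energies = np.arange(1000.0)   # 1 keV bins
ideal = np.zeros(1000)
ideal[662] = 1000.0
measured = broaden(energies, ideal)
```

The broadened peak stays centered at 662 keV and preserves the total counts, which is the property that lets calibration proceed by matching broadened calculated spectra to measurement.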

  2. TH-CD-BRA-07: MRI-Linac Dosimetry: Parameters That Change in a Magnetic Field

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O’Brien, D. J.; Sawakuchi, G. O.

    Purpose: In MRI-linac integrated systems, the presence of the magnetic field (B-field) has a large impact on the dose distribution and the dose response of detectors; yet established protocols and previous experience may lead to assumptions about the commissioning process that are no longer valid. This study quantifies parameters that change when performing dosimetry with an MRI-linac, including beam quality specifiers and the effective point of measurement (EPOM) of ionization chambers. Methods: We used the Geant4 Monte Carlo code with physics parameters that pass the Fano cavity test to within 0.1% for the simulated conditions with and without a 1.5 T B-field. A point source model with the energy distribution of an MRI-linac beam was used with and without the B-field to calculate the beam quality specifiers %dd(10)× and TPR{sup 20}{sub 10}, the variation of chamber response with orientation, and the effect of the B-field on the EPOM of ionization chambers, determined by comparing depth-dose curves calculated in water to those generated by a model of the PTW 30013 Farmer chamber. Results: The %dd(10)× changes by over 2% in the presence of the B-field, while the TPR{sup 20}{sub 10} is unaffected. Ionization chamber dose response is known to depend on the orientation with respect to the B-field, but the two alternative perpendicular orientations (anti-parallel to each other) also differ in dose response by over 1%. The B-field shifts the EPOM downstream (closer to the chamber center), and it is also shifted laterally by 0.27 times the chamber's cavity radius. Conclusion: The EPOM is affected by the B-field and even shifts laterally. The relationship between %dd(10)× and the Spencer-Attix stopping powers is also changed. Care must be taken when using chambers perpendicular to the field, as the dose response changes depending on which perpendicular orientation is used. All of these effects must be considered when performing dosimetry in B-fields and should be accounted for in future dosimetry protocols. This project was partially funded by Elekta Ltd.
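
Both beam quality specifiers discussed here are simple ratios of depth-dose measurements. A minimal sketch of their definitions (the dose values below are hypothetical, roughly typical of a 6 MV-class beam):

```python
def tpr_20_10(dose_20cm, dose_10cm):
    """TPR20,10: ratio of doses at 20 cm and 10 cm depth in water at a
    constant source-detector distance (IAEA TRS-398 beam quality index)."""
    return dose_20cm / dose_10cm

def pdd_10(dose_10cm, dose_max):
    """%dd(10): depth dose at 10 cm as a percentage of the dose maximum
    (AAPM TG-51 uses the photon-only component, %dd(10)x)."""
    return 100.0 * dose_10cm / dose_max

# Hypothetical depth-dose values, for illustration only.
quality_tpr = tpr_20_10(44.0, 66.0)
quality_pdd = pdd_10(66.0, 100.0)
```

The abstract's finding, that a 1.5 T B-field perturbs %dd(10)× but not TPR{sup 20}{sub 10}, follows from the fact that the two ratios sample the depth-dose curve differently, so they respond differently to B-field-induced changes in its shape.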

  3. 10 CFR 835.1304 - Nuclear accident dosimetry.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 4 2013-01-01 2013-01-01 false Nuclear accident dosimetry. 835.1304 Section 835.1304... Nuclear accident dosimetry. (a) Installations possessing sufficient quantities of fissile material to... nuclear accident is possible, shall provide nuclear accident dosimetry for those individuals. (b) Nuclear...

  4. 10 CFR 835.1304 - Nuclear accident dosimetry.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 4 2014-01-01 2014-01-01 false Nuclear accident dosimetry. 835.1304 Section 835.1304... Nuclear accident dosimetry. (a) Installations possessing sufficient quantities of fissile material to... nuclear accident is possible, shall provide nuclear accident dosimetry for those individuals. (b) Nuclear...

  5. 10 CFR 835.1304 - Nuclear accident dosimetry.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 4 2011-01-01 2011-01-01 false Nuclear accident dosimetry. 835.1304 Section 835.1304... Nuclear accident dosimetry. (a) Installations possessing sufficient quantities of fissile material to... nuclear accident is possible, shall provide nuclear accident dosimetry for those individuals. (b) Nuclear...

  6. 10 CFR 835.1304 - Nuclear accident dosimetry.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 4 2012-01-01 2012-01-01 false Nuclear accident dosimetry. 835.1304 Section 835.1304... Nuclear accident dosimetry. (a) Installations possessing sufficient quantities of fissile material to... nuclear accident is possible, shall provide nuclear accident dosimetry for those individuals. (b) Nuclear...

  7. 10 CFR 835.1304 - Nuclear accident dosimetry.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Nuclear accident dosimetry. 835.1304 Section 835.1304... Nuclear accident dosimetry. (a) Installations possessing sufficient quantities of fissile material to... nuclear accident is possible, shall provide nuclear accident dosimetry for those individuals. (b) Nuclear...

  8. Characterization of the nanoDot OSLD dosimeter in CT

    PubMed Central

    Scarboro, Sarah B.; Cody, Dianna; Alvarez, Paola; Followill, David; Court, Laurence; Stingo, Francesco C.; Zhang, Di; Kry, Stephen F.

    2015-01-01

    Purpose: The extensive use of computed tomography (CT) in diagnostic procedures is accompanied by a growing need for more accurate and patient-specific dosimetry techniques. Optically stimulated luminescent dosimeters (OSLDs) offer a potential solution for patient-specific CT point-based surface dosimetry by measuring air kerma. The purpose of this work was to characterize the OSLD nanoDot for CT dosimetry, quantifying necessary correction factors, and evaluating the uncertainty of these factors. Methods: A characterization of the Landauer OSL nanoDot (Landauer, Inc., Greenwood, IL) was conducted using both measurements and theoretical approaches in a CT environment. The effects of signal depletion, signal fading, dose linearity, and angular dependence were characterized through direct measurement for CT energies (80–140 kV) and delivered doses ranging from ∼5 to >1000 mGy. Energy dependence as a function of scan parameters was evaluated using two independent approaches: direct measurement and a theoretical approach based on Burlin cavity theory and Monte Carlo simulated spectra. This beam-quality dependence was evaluated for a range of CT scanning parameters. Results: Correction factors for the dosimeter response in terms of signal fading, dose linearity, and angular dependence were found to be small for most measurement conditions (<3%). The relative uncertainty was determined for each factor and reported at the two-sigma level. Differences in irradiation geometry (rotational versus static) resulted in a difference in dosimeter signal of 3% on average. Beam quality varied with scan parameters and necessitated the largest correction factor, ranging from 0.80 to 1.15 relative to a calibration performed in air using a 120 kV beam. Good agreement was found between the theoretical and measurement approaches. 
Conclusions: Correction factors for the measurement of air kerma were generally small for CT dosimetry, although angular effects, and particularly effects due to changes in beam quality, could be more substantial. In particular, it would likely be necessary to account for variations in CT scan parameters and measurement location when performing CT dosimetry using OSLD. PMID:25832070
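
OSLD readings such as these are converted to air kerma by a chain of multiplicative corrections. A minimal sketch; the factor names and magnitudes below are hypothetical placeholders, while the actual factors (beam quality, fading, linearity, angle) are the ones characterized in the paper.

```python
def corrected_air_kerma(raw_reading, calib_coeff_mgy_per_count, factors):
    """K = M * N * prod(k_i): raw OSLD reading times a calibration
    coefficient times the product of correction factors."""
    kerma = raw_reading * calib_coeff_mgy_per_count
    for k in factors.values():
        kerma *= k
    return kerma

# Hypothetical reading, calibration coefficient, and correction factors.
kerma_mgy = corrected_air_kerma(
    1.0e5, 1.0e-4,
    {"k_beam_quality": 0.95,   # the largest correction per the abstract
     "k_fading": 1.01,
     "k_linearity": 1.00,
     "k_angle": 1.02})
```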

  9. Characterization of the nanoDot OSLD dosimeter in CT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scarboro, Sarah B.; Graduate School of Biomedical Sciences, The University of Texas Health Science Center Houston, Houston, Texas 77030; The Methodist Hospital, Houston, Texas 77030

    Purpose: The extensive use of computed tomography (CT) in diagnostic procedures is accompanied by a growing need for more accurate and patient-specific dosimetry techniques. Optically stimulated luminescent dosimeters (OSLDs) offer a potential solution for patient-specific CT point-based surface dosimetry by measuring air kerma. The purpose of this work was to characterize the OSLD nanoDot for CT dosimetry, quantifying necessary correction factors, and evaluating the uncertainty of these factors. Methods: A characterization of the Landauer OSL nanoDot (Landauer, Inc., Greenwood, IL) was conducted using both measurements and theoretical approaches in a CT environment. The effects of signal depletion, signal fading, dose linearity, and angular dependence were characterized through direct measurement for CT energies (80–140 kV) and delivered doses ranging from ∼5 to >1000 mGy. Energy dependence as a function of scan parameters was evaluated using two independent approaches: direct measurement and a theoretical approach based on Burlin cavity theory and Monte Carlo simulated spectra. This beam-quality dependence was evaluated for a range of CT scanning parameters. Results: Correction factors for the dosimeter response in terms of signal fading, dose linearity, and angular dependence were found to be small for most measurement conditions (<3%). The relative uncertainty was determined for each factor and reported at the two-sigma level. Differences in irradiation geometry (rotational versus static) resulted in a difference in dosimeter signal of 3% on average. Beam quality varied with scan parameters and necessitated the largest correction factor, ranging from 0.80 to 1.15 relative to a calibration performed in air using a 120 kV beam. Good agreement was found between the theoretical and measurement approaches.
    Conclusions: Correction factors for the measurement of air kerma were generally small for CT dosimetry, although angular effects, and particularly effects due to changes in beam quality, could be more substantial. In particular, it would likely be necessary to account for variations in CT scan parameters and measurement location when performing CT dosimetry using OSLD.

  10. SU-C-201-06: Utility of Quantitative 3D SPECT/CT Imaging in Patient Specific Internal Dosimetry of 153-Samarium with GATE Monte Carlo Package

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fallahpoor, M; Abbasi, M; Sen, A

    Purpose: Patient-specific 3-dimensional (3D) internal dosimetry in targeted radionuclide therapy is essential for efficient treatment. Two major steps to achieve reliable results are: 1) generating quantitative 3D images of the radionuclide distribution and attenuation coefficients and 2) using a reliable method for dose calculation based on the activity and attenuation maps. In this research, internal dosimetry for 153-Samarium (153-Sm) was performed with SPECT/CT images coupled to the GATE Monte Carlo package. Methods: A 50-year-old woman with bone metastases from breast cancer was prescribed 153-Sm treatment (gamma: 103 keV; beta: 0.81 MeV). A SPECT/CT scan was performed with a Siemens Symbia T scanner. SPECT and CT images were registered using the default registration software. SPECT quantification was achieved by compensating for all image-degrading factors, including body attenuation, Compton scattering, and the collimator-detector response (CDR). The triple energy window method was used to estimate and eliminate scattered photons. Iterative ordered-subsets expectation maximization (OSEM) with correction for attenuation and distance-dependent CDR was used for image reconstruction. Bilinear energy mapping was used to convert Hounsfield units in the CT image to an attenuation map. Organ borders were defined by itk-SNAP toolkit segmentation on the CT image. GATE was then used for internal dose calculation. The specific absorbed fractions (SAFs) and S-values were reported following the MIRD schema. Results: The results showed that the largest SAFs and S-values are in osseous organs, as expected. The S-value for lung is the highest after spine, which can be important in 153-Sm therapy. Conclusion: We presented the utility of SPECT/CT images and Monte Carlo for patient-specific dosimetry as a reliable and accurate method. It has several advantages over template-based methods or simplified dose estimation methods. With the advent of high-speed computers, Monte Carlo can be used for treatment planning on a day-to-day basis.
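
Dose calculation in the MIRD schema reduces to summing time-integrated activities weighted by S-values. A minimal sketch with hypothetical organ names and S-values (in the study, GATE computes the S-values themselves by Monte Carlo):

```python
def mird_dose_mgy(time_integrated_activity_mbq_s, s_values_mgy_per_mbq_s):
    """MIRD schema: D(target) = sum over source regions of
    A-tilde(source) * S(target <- source)."""
    return sum(time_integrated_activity_mbq_s[src] * s
               for src, s in s_values_mgy_per_mbq_s.items())

# Hypothetical inputs: time-integrated activity per source organ (MBq*s)
# and S-values to a lung target (mGy per MBq*s).
a_tilde = {"spine": 2.0e6, "liver": 5.0e5}
s_to_lung = {"spine": 1.0e-7, "liver": 2.0e-8}
lung_dose = mird_dose_mgy(a_tilde, s_to_lung)
```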

  11. FERRET-SAND II physics-dosimetry analysis for N Reactor Pressure Tubes 2954, 3053 and 1165 using a WIMS calculated input spectrum

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McElroy, W.N.; Kellogg, L.S.; Matsumoto, W.Y.

    1988-05-01

    This report is in response to a request from Westinghouse Hanford Company (WHC) that the PNL National Dosimetry Center (NDC) perform physics-dosimetry analyses (E > MeV) for N Reactor Pressure Tubes 2954 and 3053. As a result of these analyses, and recommendations for additional studies, two physics-dosimetry re-evaluations for Pressure Tube 1165 were also accomplished. The primary objective of Pacific Northwest Laboratory's (PNL) National Dosimetry Center (NDC) physics-dosimetry work for N Reactor was to provide FERRET-SAND II physics-dosimetry results to assist in the assessment of neutron radiation-induced changes in the physical and mechanical properties of N Reactor pressure tubes. 15 refs., 6 figs., 5 tabs.

  12. The influence of commenting validity, placement, and style on perceptions of computer code trustworthiness: A heuristic-systematic processing approach.

    PubMed

    Alarcon, Gene M; Gamble, Rose F; Ryan, Tyler J; Walter, Charles; Jessup, Sarah A; Wood, David W; Capiola, August

    2018-07-01

    Computer programs are a ubiquitous part of modern society, yet little is known about the psychological processes that underlie reviewing code. We applied the heuristic-systematic model (HSM) to investigate the influence of computer code comments on perceptions of code trustworthiness. The study explored the influence of validity, placement, and style of comments in code on trustworthiness perceptions and time spent on code. Results indicated valid comments led to higher trust assessments and more time spent on the code. Properly placed comments led to lower trust assessments and had a marginal effect on time spent on code; however, the effect was no longer significant after controlling for effects of the source code. Low style comments led to marginally higher trustworthiness assessments, but high style comments led to longer time spent on the code. Several interactions were also found. Our findings suggest the relationship between code comments and perceptions of code trustworthiness is not as straightforward as previously thought. Additionally, the current paper extends the HSM to the programming literature. Copyright © 2018 Elsevier Ltd. All rights reserved.

  13. Adiabatic topological quantum computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cesare, Chris; Landahl, Andrew J.; Bacon, Dave

    Topological quantum computing promises error-resistant quantum computation without active error correction. However, there is a worry that during the process of executing quantum gates by braiding anyons around each other, extra anyonic excitations will be created that will disorder the encoded quantum information. Here, we explore this question in detail by studying adiabatic code deformations on Hamiltonians based on topological codes, notably Kitaev’s surface codes and the more recently discovered color codes. We develop protocols that enable universal quantum computing by adiabatic evolution in a way that keeps the energy gap of the system constant with respect to the computation size and introduces only simple local Hamiltonian interactions. This allows one to perform holonomic quantum computing with these topological quantum computing systems. The tools we develop allow one to go beyond numerical simulations and understand these processes analytically.

  14. Adiabatic topological quantum computing

    DOE PAGES

    Cesare, Chris; Landahl, Andrew J.; Bacon, Dave; ...

    2015-07-31

    Topological quantum computing promises error-resistant quantum computation without active error correction. However, there is a worry that during the process of executing quantum gates by braiding anyons around each other, extra anyonic excitations will be created that will disorder the encoded quantum information. Here, we explore this question in detail by studying adiabatic code deformations on Hamiltonians based on topological codes, notably Kitaev’s surface codes and the more recently discovered color codes. We develop protocols that enable universal quantum computing by adiabatic evolution in a way that keeps the energy gap of the system constant with respect to the computation size and introduces only simple local Hamiltonian interactions. This allows one to perform holonomic quantum computing with these topological quantum computing systems. The tools we develop allow one to go beyond numerical simulations and understand these processes analytically.

  15. Fast Computation of the Two-Point Correlation Function in the Age of Big Data

    NASA Astrophysics Data System (ADS)

    Pellegrino, Andrew; Timlin, John

    2018-01-01

    We present a new code which quickly computes the two-point correlation function for large sets of astronomical data. This code combines the ease of use of Python with the speed of parallel shared libraries written in C. We include the capability to compute auto- and cross-correlation statistics, and allow the user to calculate the three-dimensional and angular correlation functions. Additionally, the code automatically divides user-provided sky masks into contiguous subsamples of similar size, using the HEALPix pixelization scheme, for the purpose of resampling. Errors are computed using jackknife and bootstrap resampling in a way that adds negligible extra runtime, even with many subsamples. We demonstrate speed comparable to that of other clustering codes, and verify the code's accuracy against known analytic results.
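
For reference, the two-point correlation function is typically estimated from data (D) and random (R) catalogs with the Landy-Szalay estimator, ξ = (DD − 2DR + RR)/RR, using normalized pair counts. A brute-force Python sketch (the code described above uses parallel C libraries; this O(N²) version is purely illustrative):

```python
import itertools
import math

def pair_counts(points, r_min, r_max, others=None):
    """Count pairs with separation in [r_min, r_max).  With `others`
    given, count cross pairs; otherwise count each unordered auto
    pair once."""
    pairs = (itertools.product(points, others) if others is not None
             else itertools.combinations(points, 2))
    return sum(1 for (x1, y1), (x2, y2) in pairs
               if r_min <= math.hypot(x1 - x2, y1 - y2) < r_max)

def landy_szalay(dd, dr, rr, n_data, n_rand):
    """xi = (DD~ - 2*DR~ + RR~) / RR~, where ~ denotes pair counts
    normalized by the number of distinct pairs."""
    ddn = dd / (n_data * (n_data - 1) / 2.0)
    drn = dr / (n_data * n_rand)
    rrn = rr / (n_rand * (n_rand - 1) / 2.0)
    return (ddn - 2.0 * drn + rrn) / rrn
```

When the normalized DD, DR, and RR counts are all equal, the estimator returns zero, i.e. an unclustered field.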

  16. Optimization of the Temporal Pattern of Applied Radiation Dose: Implication for the Treatment of Prostate Cancer

    DTIC Science & Technology

    2009-03-01

    Task objectives included: characterization of dosimetry in an IMRT radiobiological experiment phantom using TLDs and film (7-10 mos.); measurement of dosimetry with TLDs and film (8-10 mos.); and analysis of the measured dosimetry compared to the dosimetry predicted by the treatment planning system. Dosimetry in the phantom was assessed with film, and monitor units were calculated accordingly to deliver the desired dose.

  17. Characterising an aluminium oxide dosimetry system.

    PubMed

    Conheady, Clement F; Gagliardi, Frank M; Ackerly, Trevor

    2015-09-01

    In vivo dosimetry is recommended as a defence-in-depth strategy in radiotherapy treatments and is currently employed by clinics around the world. The characteristics of a new optically stimulated luminescence dosimetry system were investigated for the purpose of replacing an aging thermoluminescence dosimetry system for in vivo dosimetry. The stability of the system was not sufficient to satisfy commissioning requirements and therefore it has not been released into clinical service at this time.

  18. Application of a color scanner for 60Co high dose rate brachytherapy dosimetry with EBT radiochromic film

    PubMed Central

    Ghorbani, Mahdi; Toossi, Mohammad Taghi Bahreyni; Mowlavi, Ali Asghar; Roodi, Shahram Bayani; Meigooni, Ali Soleimani

    2012-01-01

    Background. The aim of this study is to evaluate the performance of a color scanner as a radiochromic film reader in two-dimensional dosimetry around a high dose rate brachytherapy source. Materials and methods. A Microtek ScanMaker 1000XL film scanner was utilized for the measurement of the dose distribution around a high dose rate GZP6 60Co brachytherapy source with GafChromic® EBT radiochromic films. In these investigations, the combined non-uniformity of the film and scanner response, as well as the films' sensitivity to the scanner's light source, was evaluated using multiple samples of film prior to the source dosimetry. The results of these measurements were compared with Monte Carlo data simulated using the MCNPX code. In addition, isodose curves acquired by radiochromic films and Monte Carlo simulation were compared with those provided by the GZP6 treatment planning system. Results. Scanning of uniformly irradiated film samples demonstrated approximately 2.85% and 4.97% non-uniformity of the response in the longitudinal and transverse directions of the film, respectively. Our findings also indicated that the film response is not affected by exposure to the scanner's light source, even in multiple scans of the same film. The results of the radiochromic film measurements are in good agreement with the Monte Carlo calculations (4%) and the corresponding dose values provided by the GZP6 treatment planning system (5%). Conclusions. The results of these investigations indicate that the Microtek ScanMaker 1000XL color scanner in conjunction with GafChromic EBT film is a reliable system for dosimetric evaluation of a high dose rate brachytherapy source. PMID:23411947
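
Radiochromic film readings from a scanner are usually reduced to dose via the net optical density of the exposed film. A minimal sketch; the calibration coefficients here are placeholders, not values measured in the study:

```python
import math

def net_optical_density(pv_exposed, pv_unexposed):
    """netOD = log10(I0 / I): transmission of the exposed film relative
    to an unexposed reference, from scanner pixel values."""
    return math.log10(pv_unexposed / pv_exposed)

def dose_gy(net_od, a=10.0, b=35.0, n=2.5):
    """Illustrative calibration D = a*netOD + b*netOD**n; the
    coefficients a, b, n are hypothetical, not the study's fit."""
    return a * net_od + b * net_od ** n
```

A polynomial or power-law fit of this form is a common choice for EBT film; the scanner non-uniformity quoted in the abstract would enter as a position-dependent correction to the pixel values before this conversion.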

  19. Development of a transmission alpha particle dosimetry technique using A549 cells and a Ra-223 source for targeted alpha therapy.

    PubMed

    Al Darwish, R; Staudacher, A H; Li, Y; Brown, M P; Bezak, E

    2016-11-01

    In targeted radionuclide therapy, regional tumors are targeted with radionuclides delivering therapeutic radiation doses. Targeted alpha therapy (TAT) is of particular interest due to its ability to deliver alpha particles of high linear energy transfer within the confines of the tumor. However, there is a lack of data related to alpha particle distribution in TAT. These data are required to more accurately estimate the absorbed dose on a cellular level. As a result, there is a need for a dosimeter that can estimate, or better yet determine, the absorbed dose deposited by alpha particles in cells. In this study, as an initial step, the authors present a transmission dosimetry design for alpha particles using A549 lung carcinoma cells, an external alpha-particle-emitting source (radium-223; Ra-223), and a Timepix pixelated semiconductor detector. The dose delivered to the A549 lung carcinoma cell line from a Ra-223 source, considered to be an attractive radionuclide for alpha therapy, was investigated in the current work. A549 cells were either unirradiated (control) or irradiated for 1/2, 1, 2, or 3 h with alpha particles emitted from a Ra-223 source positioned below a monolayer of A549 cells. The Timepix detector was used to determine the number of transmitted alpha particles passing through the A549 cells, and DNA double strand breaks (DSBs) in the form of γ-H2AX foci were examined by fluorescence microscopy. The number of transmitted alpha particles was correlated with the observed DNA DSBs, and the delivered radiation dose was estimated. Additionally, the dose deposited was calculated using the Monte Carlo code SRIM. Approximately 20% of alpha particles were transmitted and detected by Timepix. The frequency and number of γ-H2AX foci increased significantly following alpha particle irradiation as compared to unirradiated controls.
    The equivalent dose delivered to A549 cells was estimated to be approximately 0.66, 1.32, 2.53, and 3.96 Gy after 1/2, 1, 2, and 3 h of irradiation, respectively, assuming a relative biological effectiveness for alpha particles of 5.5. The study confirmed that the Timepix detector can be used for transmission alpha particle dosimetry. If cross-calibrated against biological dosimetry, this method will give a good indication of the biological effects of alpha particles without the need for repeated biological dosimetry, which is costly, time consuming, and not readily available.
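
The dose arithmetic in this abstract is straightforward: absorbed dose is deposited energy per unit mass, and the equivalent dose weights it by the RBE of 5.5 used in the study. A sketch with illustrative particle counts, energy, and mass (not the study's measured values):

```python
MEV_TO_J = 1.602e-13  # joules per MeV

def absorbed_dose_gy(n_particles, mean_energy_mev, mass_kg):
    """D = N * E / m: total deposited energy per unit mass (Gy = J/kg)."""
    return n_particles * mean_energy_mev * MEV_TO_J / mass_kg

def equivalent_dose_gy(absorbed_gy, rbe=5.5):
    """Scale the absorbed dose by the relative biological effectiveness;
    5.5 is the alpha-particle RBE assumed in the study."""
    return absorbed_gy * rbe

# Illustrative: one million ~6 MeV alphas stopping in 1 mg of tissue.
d = absorbed_dose_gy(1.0e6, 6.0, 1.0e-6)
h = equivalent_dose_gy(d)
```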

  20. Design of convolutional tornado code

    NASA Astrophysics Data System (ADS)

    Zhou, Hui; Yang, Yao; Gao, Hongmin; Tan, Lu

    2017-09-01

    As a linear block code, the traditional tornado (tTN) code is inefficient in a burst-erasure environment, and its multi-level structure may lead to high encoding/decoding complexity. This paper presents a convolutional tornado (cTN) code which improves the burst-erasure protection capability by applying the convolution property to the tTN code, and reduces computational complexity by abrogating the multi-level structure. The simulation results show that the cTN code provides better packet-loss protection with lower computational complexity than the tTN code.

  1. Three-dimensional turbopump flowfield analysis

    NASA Technical Reports Server (NTRS)

    Sharma, O. P.; Belford, K. A.; Ni, R. H.

    1992-01-01

    A program was conducted to develop a flow prediction method applicable to rocket turbopumps. The complex nature of the flowfield in turbopumps is described, and example flowfields are discussed to illustrate that physics-based models and analytical calculation procedures based on computational fluid dynamics (CFD) are needed to develop reliable design procedures for turbopumps. A CFD code developed at NASA ARC was used as the base code. The turbulence model and boundary conditions in the base code were modified, respectively, to: (1) compute transitional flows and account for extra rates of strain, e.g., rotation; and (2) compute surface heat transfer coefficients and allow computation through multistage turbomachines. Benchmark-quality data from two- and three-dimensional cascades were used to verify the code. The predictive capabilities of the present CFD code were demonstrated by computing the flow through a radial impeller and a multistage axial-flow turbine. Results of the program indicate that the present code operated in a two-dimensional mode is a cost-effective alternative to full three-dimensional calculations, and that it permits realistic predictions of unsteady loadings and losses for multistage machines.

  2. High altitude chemically reacting gas particle mixtures. Volume 3: Computer code user's and applications manual. [rocket nozzle and orbital plume flow fields

    NASA Technical Reports Server (NTRS)

    Smith, S. D.

    1984-01-01

    A user's manual for the RAMP2 computer code is provided. The RAMP2 code can be used to model the dominant phenomena which affect the prediction of liquid and solid rocket nozzle and orbital plume flow fields. The general structure and operation of RAMP2 are discussed. A user input/output guide for the modified TRAN72 computer code and the RAMP2F code is given. The application and use of the BLIMPJ module are considered. Sample problems involving the space shuttle main engine and motor are included.

  3. Development of numerical methods for overset grids with applications for the integrated Space Shuttle vehicle

    NASA Technical Reports Server (NTRS)

    Chan, William M.

    1995-01-01

    Algorithms and computer code developments were performed for the overset grid approach to solving computational fluid dynamics problems. The techniques developed are applicable to compressible Navier-Stokes flow for any general complex configurations. The computer codes developed were tested on different complex configurations with the Space Shuttle launch vehicle configuration as the primary test bed. General, efficient and user-friendly codes were produced for grid generation, flow solution and force and moment computation.

  4. Research in Computational Aeroscience Applications Implemented on Advanced Parallel Computing Systems

    NASA Technical Reports Server (NTRS)

    Wigton, Larry

    1996-01-01

    Improvements to the numerical linear algebra routines used in new Navier-Stokes codes, specifically Tim Barth's unstructured grid code, with spin-offs to TRANAIR, are reported. A fast distance-calculation routine for Navier-Stokes codes using the new one-equation turbulence models was written. The primary focus of this work was improving matrix-iterative methods. New algorithms have been developed which exploit the full potential of classical Cray-class computers as well as distributed-memory parallel computers.

  5. ISSYS: An integrated synergistic Synthesis System

    NASA Technical Reports Server (NTRS)

    Dovi, A. R.

    1980-01-01

    Integrated Synergistic Synthesis System (ISSYS), an integrated system of computer codes in which the sequence of program execution and data flow is controlled by the user, is discussed. The commands available to exert such control, the ISSYS major function and rules, and the computer codes currently available in the system are described. Computational sequences frequently used in the aircraft structural analysis and synthesis are defined. External computer codes utilized by the ISSYS system are documented. A bibliography on the programs is included.

  6. User's manual for a two-dimensional, ground-water flow code on the Octopus computer network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Naymik, T.G.

    1978-08-30

    A ground-water hydrology computer code, programmed by R.L. Taylor (in Proc. American Society of Civil Engineers, Journal of Hydraulics Division, 93(HY2), pp. 25-33 (1967)), has been adapted to the Octopus computer system at Lawrence Livermore Laboratory. Using an example problem, this manual details the input, output, and execution options of the code.

  7. Interactive Synthesis of Code Level Security Rules

    DTIC Science & Technology

    2017-04-01

    Interactive Synthesis of Code-Level Security Rules. A thesis presented by Leo St. Amour to the Department of Computer Science in partial fulfillment of the requirements for the degree of Master of Science in Computer Science, Northeastern University, Boston, Massachusetts, April 2017.

  8. Agricultural Spraying

    NASA Technical Reports Server (NTRS)

    1986-01-01

    AGDISP, a computer code written for Langley by Continuum Dynamics, Inc., aids crop-dusting airplanes in targeting pesticides. The code is commercially available and can be run on a personal computer by an inexperienced operator. Called SWA+H, it is used by the Forest Service, the FAA, and DuPont, among others. DuPont uses the code to "test" equipment on the computer, using a laser system to measure particle characteristics of various spray compounds.

  9. The measurement of radiation dose profiles for electron-beam computed tomography using film dosimetry.

    PubMed

    Zink, F E; McCollough, C H

    1994-08-01

    The unique geometry of electron-beam CT (EBCT) scanners produces radiation dose profiles with widths which can be considerably different from the corresponding nominal scan width. Additionally, EBCT scanners produce both complex (multiple-slice) and narrow (3 mm) radiation profiles. This work describes the measurement of the axial dose distribution from EBCT within a scattering phantom using film dosimetry methods, which offer increased convenience and spatial resolution compared to thermoluminescent dosimetry (TLD) techniques. Therapy localization film was cut into 8 x 220 mm strips and placed within specially constructed light-tight holders for placement within the cavities of a CT Dose Index (CTDI) phantom. The film was calibrated using a conventional overhead x-ray tube with spectral characteristics matched to the EBCT scanner (130 kVp, 10 mm Al HVL). The films were digitized at five samples per mm and calibrated dose profiles plotted as a function of z-axis position. Errors due to angle-of-incidence and beam hardening were estimated to be less than 5% and 10%, respectively. The integral exposure under film dose profiles agreed with ion-chamber measurements to within 15%. Exposures measured along the radiation profile differed from TLD measurements by an average of 5%. The film technique provided acceptable accuracy and convenience in comparison to conventional TLD methods, and allowed high spatial-resolution measurement of EBCT radiation dose profiles.
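
The film-to-dose step described above amounts to inverting a measured calibration curve. A hedged sketch, with invented calibration points (net optical density vs. dose; real curves come from the matched-spectrum exposures described in the abstract):

```python
# Hedged sketch: converting digitized film readings to dose via a measured
# calibration curve, as in film dosimetry of CT dose profiles.
# The calibration points below are invented for illustration.
import bisect

cal_od   = [0.10, 0.35, 0.60, 0.85, 1.10]  # net optical density (example values)
cal_dose = [0.0, 5.0, 10.0, 15.0, 20.0]    # dose in mGy (example values)

def od_to_dose(od):
    """Piecewise-linear interpolation of the calibration curve."""
    i = bisect.bisect_left(cal_od, od)
    i = min(max(i, 1), len(cal_od) - 1)        # clamp to a valid segment
    x0, x1 = cal_od[i - 1], cal_od[i]
    y0, y1 = cal_dose[i - 1], cal_dose[i]
    return y0 + (y1 - y0) * (od - x0) / (x1 - x0)

# A few digitized samples along the z-axis, mapped to dose
profile = [od_to_dose(od) for od in (0.2, 0.6, 0.9)]
```

Each digitized sample along the film strip is pushed through the same curve, producing the z-axis dose profile.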

  10. Organ-specific SPECT activity calibration using 3D printed phantoms for molecular radiotherapy dosimetry.

    PubMed

    Robinson, Andrew P; Tipping, Jill; Cullen, David M; Hamilton, David; Brown, Richard; Flynn, Alex; Oldfield, Christopher; Page, Emma; Price, Emlyn; Smith, Andrew; Snee, Richard

    2016-12-01

    Patient-specific absorbed dose calculations for molecular radiotherapy require accurate activity quantification. This is commonly derived from Single-Photon Emission Computed Tomography (SPECT) imaging using a calibration factor relating detected counts to known activity in a phantom insert. A series of phantom inserts, based on the mathematical models underlying many clinical dosimetry calculations, have been produced using 3D printing techniques. SPECT/CT data for the phantom inserts has been used to calculate new organ-specific calibration factors for (99m)Tc and (177)Lu. The measured calibration factors are compared to predicted values from calculations using a Gaussian kernel. Measured SPECT calibration factors for 3D printed organs display a clear dependence on organ shape for (99m)Tc and (177)Lu. The observed variation in calibration factor is reproduced using Gaussian kernel-based calculation over two orders of magnitude change in insert volume for (99m)Tc and (177)Lu. These new organ-specific calibration factors show a 24%, 11% and 8% reduction in absorbed dose for the liver, spleen and kidneys, respectively. Non-spherical calibration factors from 3D printed phantom inserts can significantly improve the accuracy of whole organ activity quantification for molecular radiotherapy, providing a crucial step towards individualised activity quantification and patient-specific dosimetry. 3D printed inserts are found to provide a cost effective and efficient way for clinical centres to access more realistic phantom data.
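
The counts-to-activity conversion behind such calibration factors reduces to simple arithmetic. All numbers below are invented for illustration; real factors are organ-, collimator- and radionuclide-specific, which is exactly the dependence the study quantifies:

```python
# Sketch of SPECT activity quantification via a phantom calibration factor.
# Numbers are invented for illustration only.
phantom_counts = 5.0e6        # counts in VOI over the phantom insert
phantom_activity_mbq = 100.0  # known activity in the insert (MBq)
scan_time_s = 600.0

# Calibration factor: detected count rate per unit activity
cf = phantom_counts / (phantom_activity_mbq * scan_time_s)  # counts / (MBq * s)

# Applying the factor to a patient VOI acquired with the same protocol
patient_counts = 3.2e6
patient_activity = patient_counts / (cf * scan_time_s)  # MBq
```

The study's point is that `cf` measured in a sphere mis-estimates activity in a liver- or kidney-shaped volume, hence the 3D printed organ-shaped inserts.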

  11. The adaption and use of research codes for performance assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liebetrau, A.M.

    1987-05-01

    Models of real-world phenomena are developed for many reasons. The models are usually, if not always, implemented in the form of a computer code. The characteristics of a code are determined largely by its intended use. Realizations or implementations of detailed mathematical models of complex physical and/or chemical processes are often referred to as research or scientific (RS) codes. Research codes typically require large amounts of computing time. One example of an RS code is a finite-element code for solving complex systems of differential equations that describe mass transfer through some geologic medium. Considerable computing time is required because computations are done at many points in time and/or space. Codes used to evaluate the overall performance of real-world physical systems are called performance assessment (PA) codes. Performance assessment codes are used to conduct simulated experiments involving systems that cannot be directly observed. Thus, PA codes usually involve repeated simulations of system performance in situations that preclude the use of conventional experimental and statistical methods. 3 figs.

  12. Topological color codes on Union Jack lattices: a stable implementation of the whole Clifford group

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katzgraber, Helmut G.; Theoretische Physik, ETH Zurich, CH-8093 Zurich; Bombin, H.

    We study the error threshold of topological color codes on Union Jack lattices that allow for the full implementation of the whole Clifford group of quantum gates. After mapping the error-correction process onto a statistical mechanical random three-body Ising model on a Union Jack lattice, we compute its phase diagram in the temperature-disorder plane using Monte Carlo simulations. Surprisingly, topological color codes on Union Jack lattices have a similar error stability to color codes on triangular lattices, as well as to the Kitaev toric code. The enhanced computational capabilities of the topological color codes on Union Jack lattices with respect to triangular lattices and the toric code combined with the inherent robustness of this implementation show good prospects for future stable quantum computer implementations.
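
The Monte Carlo machinery behind such phase-diagram studies is the Metropolis algorithm. As a rough illustration only, here is a standard two-body square-lattice Ising sketch; the paper's actual model is a *random three-body* Ising model on a Union Jack lattice, which this does not reproduce:

```python
# Illustrative Metropolis Monte Carlo for a small 2D Ising model -- the
# generic machinery behind phase-diagram studies like the one above.
# (The study's model is a random three-body Ising model on a Union Jack
# lattice; this two-body square-lattice version is only a sketch.)
import math
import random

L, T = 8, 2.0
random.seed(1)
spins = [[random.choice((-1, 1)) for _ in range(L)] for _ in range(L)]

def local_field(i, j):
    """Sum of the four nearest-neighbour spins (periodic boundaries)."""
    return (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
            + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])

for _ in range(20000):
    i, j = random.randrange(L), random.randrange(L)
    dE = 2 * spins[i][j] * local_field(i, j)       # energy cost of a flip
    if dE <= 0 or random.random() < math.exp(-dE / T):
        spins[i][j] *= -1                          # Metropolis accept

m = abs(sum(sum(row) for row in spins)) / L**2     # magnetization per spin
```

Sweeping `T` (and, in the disordered model, the coupling-sign probability) and measuring order parameters like `m` is how the temperature-disorder phase diagram is mapped out.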

  13. Macroscopic to Microscopic Scales of Particulate Dosimetry: From Source to Fate in the Body

    EPA Science Inventory

    Additional perspective with regards to particle dosimetry is achieved by exploring dosimetry across a range of scales from macroscopic to microscopic in scope. Typically, one thinks of dosimetry as what happens when a particle is inhaled, where it is deposited, and how it is clea...

  14. Accurate Modeling of Ionospheric Electromagnetic Fields Generated by a Low-Altitude VLF Transmitter

    DTIC Science & Technology

    2007-08-31

    [Figure-list snippet] latitude) for 3 different grid spacings. 8. Low-altitude fields produced by a 10-kHz source computed using the FD and TD codes. The agreement is excellent, validating the new FD code. 9. High-altitude fields produced by a 10-kHz source computed using the FD and TD codes. The agreement is again excellent. 10. Low-altitude fields produced by a 20-kHz source computed using the FD and TD codes. 11. High-altitude fields produced

  15. SU-F-T-144: Analytical Closed Form Approximation for Carbon Ion Bragg Curves in Water

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tuomanen, S; Moskvin, V; Farr, J

    2016-06-15

    Purpose: Semi-empirical modeling is a powerful computational method in radiation dosimetry. A set of approximations exist for proton ion depth dose distribution (DDD) in water. However, the modeling is more complicated for carbon ions due to fragmentation. This study addresses this by providing and evaluating a new methodology for DDD modeling of carbon ions in water. Methods: The FLUKA Monte Carlo (MC) general-purpose transport code was used for simulation of carbon DDDs for energies of 100–400 MeV in water as reference data for model benchmarking. Using Thomas Bortfeld's closed form equation approximating proton Bragg curves as a basis, we derived the critical constants for a beam of carbon ions by applying the radiation transport models of Lee et al. and Geiger to our simulated carbon curves. We hypothesized that including a new exponential (κ) residual distance parameter in Bortfeld's fluence reduction relation would improve DDD modeling for carbon ions. We are introducing an additional term to be added to Bortfeld's equation to describe the fragmentation tail. This term accounts for the pre-peak dose from nuclear fragments (NF). In the post-peak region, the NF transport will be treated as new beams utilizing the Glauber model for interaction cross sections and the Abrasion-Ablation fragmentation model. Results: The carbon beam specific constants in the developed model were determined to be: p=1.75, β=0.008 cm-1, γ=0.6, α=0.0007 cm MeV, σmono=0.08, and the new exponential parameter κ=0.55. This produced a close match for the plateau part of the curve (max deviation 6.37%). Conclusion: The derived semi-empirical model provides an accurate approximation of the MC simulated clinical carbon DDDs. This is the first direct semi-empirical simulation for the dosimetry of therapeutic carbon ions. The accurate modeling of the NF tail in the carbon DDD will provide key insight into distal edge dose deposition formation.
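
As a rough illustration, the straggling-free Bortfeld-style closed form can be sketched with the carbon constants quoted above (p=1.75, α=0.0007, β=0.008, γ=0.6). The paper's full model additionally includes the Gaussian peak smearing (σmono), the κ residual-distance term and the nuclear-fragment tail, all of which are omitted here:

```python
# Hedged sketch of a Bortfeld-style closed-form depth-dose curve, using the
# carbon-beam constants reported in the abstract. Straggling-free proton-style
# form only; the paper's full model adds sigma-smearing, the kappa term and a
# nuclear-fragment tail, which are NOT reproduced here.
p, alpha, beta, gamma = 1.75, 0.0007, 0.008, 0.6

def bragg_dose(z_cm, energy_mev_u):
    """Relative dose at depth z for a beam of the given energy (illustrative)."""
    r0 = alpha * energy_mev_u ** p            # range-energy relation R0 = alpha*E^p
    if z_cm >= r0:
        return 0.0                            # no straggling: dose ends at R0
    u = r0 - z_cm                             # residual range
    return (u ** (1.0 / p - 1.0) + (beta + gamma * beta * p) * u ** (1.0 / p)) \
           / (p * alpha ** (1.0 / p) * (1.0 + beta * r0))

# Shallow plateau rising toward the Bragg peak near R0 ~ 14 cm for 290 MeV/u
depths = [bragg_dose(z, 290.0) for z in (0.0, 5.0, 10.0, 14.0)]
```

The curve is flat-ish in the plateau and diverges toward the peak at `z -> R0`; the σ-smearing in the full model is what turns that divergence into the finite measured peak.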

  16. Multidisciplinary High-Fidelity Analysis and Optimization of Aerospace Vehicles. Part 1; Formulation

    NASA Technical Reports Server (NTRS)

    Walsh, J. L.; Townsend, J. C.; Salas, A. O.; Samareh, J. A.; Mukhopadhyay, V.; Barthelemy, J.-F.

    2000-01-01

    An objective of the High Performance Computing and Communication Program at the NASA Langley Research Center is to demonstrate multidisciplinary shape and sizing optimization of a complete aerospace vehicle configuration by using high-fidelity, finite element structural analysis and computational fluid dynamics aerodynamic analysis in a distributed, heterogeneous computing environment that includes high performance parallel computing. A software system has been designed and implemented to integrate a set of existing discipline analysis codes, some of them computationally intensive, into a distributed computational environment for the design of a highspeed civil transport configuration. The paper describes the engineering aspects of formulating the optimization by integrating these analysis codes and associated interface codes into the system. The discipline codes are integrated by using the Java programming language and a Common Object Request Broker Architecture (CORBA) compliant software product. A companion paper presents currently available results.

  17. Quantification of differences in the effective atomic numbers of healthy and cancerous tissues: A discussion in the context of diagnostics and dosimetry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, M. L.; Physical Sciences, Peter MacCallum Cancer Centre, East Melbourne 3001

    Purpose: There is a range of genetic and nongenetic factors influencing the elemental composition of different human tissues. The elemental composition of cancerous tissue frequently differs from healthy tissue of the same organ, particularly in high-Z trace element concentrations. For this reason, one could suggest that this may be exploited in diagnostics and perhaps even influence dosimetry. Methods: In this work, for the first time, effective atomic numbers are computed for common cancerous and healthy tissues using a robust, energy-dependent approach between 10 keV and 100 MeV. These are then quantitatively compared within the context of diagnostics and dosimetry. Results: Differences between effective atomic numbers of healthy and diseased tissues are found to be typically less than 10%. Fibrotic tissues and calcifications of the breast exhibit substantial (tens to hundreds of percent) differences to healthy tissue. Expectedly, differences are most pronounced in the photoelectric regime and consequently most relevant for kV imaging/therapy and radionuclides with prominent low-energy peaks. Cancerous tissue of the testes and stomach has lower effective atomic numbers than the corresponding healthy tissue, while diseased tissues of the other organ sites typically have higher values. Conclusions: As dose calculation approaches improve in accuracy, there may be an argument for the explicit inclusion of pathologies. This is more the case for breast, penile, prostate, nasopharyngeal, and stomach cancer, less so for testicular and kidney cancer. The calculated data suggest dual-energy computed tomography could potentially improve lesion identification in the aforementioned organs (with the exception of testicular cancer), with most import in breast imaging. Ultimately, however, the differences are very small. It is likely that the assumption of a generic 'tissue ramp' in planning will be sufficient for the foreseeable future, and that the Z differences do not notably aid lesion detection beyond that already facilitated by differences in mass density.
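
For orientation, a much simpler power-law (Mayneord-type) effective atomic number, Z_eff = (Σ w_i Z_i^m)^(1/m) with m ≈ 2.94, can be sketched as below. This is only a stand-in for the robust energy-dependent method the study actually uses, and the soft-tissue-like composition is illustrative, not data from the paper:

```python
# Illustrative Mayneord power-law effective atomic number -- a simpler
# stand-in for the energy-dependent Z_eff method used in the study.
# Composition is a rough soft-tissue-like example, not data from the paper.
m = 2.94
# (element, Z, fractional electron contribution) -- illustrative values
composition = [("H", 1, 0.10), ("C", 6, 0.25), ("N", 7, 0.03), ("O", 8, 0.62)]

z_eff = sum(w * z ** m for _, z, w in composition) ** (1.0 / m)
```

Because of the large exponent, small admixtures of higher-Z trace elements shift `z_eff` disproportionately, which is why photoelectric-regime imaging is the most sensitive to the tissue differences discussed above.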

  18. SU-F-T-434: Development of a Fan-Beam Optical Scanner Using CMOS Array for Small Field Dosimetry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brost, E; Warmington, L; Watanabe, Y

    Purpose: To design and construct a second-generation optical computed tomography (OCT) system using a fan beam with a CMOS array detector for 3D dosimetry with polymer gel and radiochromic solid dosimeters. The system was specifically designed for small field dosimetry. Methods: The optical scanner used a fan-beam laser, produced from a collimated red laser beam (λ=620 nm) with a 15-degree laser-line generating lens. The fan beam was sent through an index-matching bath which holds the sample stage and a sample. The emerging laser light was detected with a 2.54 cm-long CMOS array detector (512 elements). The sample stage rotated through the full 360 degrees of projection angles at 0.9-degree increments. Each projection was normalized to the unirradiated sample at the same projection angle to correct for imperfections in the dosimeter. A larger sample could be scanned by using a motorized mirror and linearly translating the CMOS detector. The height of the sample stage was varied for full 3D scanning. The image acquisition and motor motion were controlled by a computer. The 3D image reconstruction was accomplished by a fan-beam reconstruction algorithm. All the software was developed in-house with MATLAB. Results: The scanner was used on both PRESAGE and PAGAT gel dosimeters. Irreconcilable refraction errors were seen with PAGAT because the fan-beam laser line refracted away from the detector when the field was highly varying in 3D. With PRESAGE, this type of error was not seen. Conclusion: We could acquire tomographic images of dose distributions with the new OCT system using both polymer gel and radiochromic solid dosimeters. Preliminary results showed that the system was better suited for radiochromic solid dosimeters, since these exhibited minimal refraction and scattering errors. We are currently working on improving the image quality through thorough characterization of the OCT system.
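
The per-angle normalization step described above can be sketched as dividing each irradiated-sample projection by its matching unirradiated reference and taking the log, which yields the attenuation line integrals fed to the fan-beam reconstruction. The detector counts below are invented:

```python
# Sketch of projection normalization in optical-CT: each post-irradiation
# projection is referenced to the pre-irradiation projection at the same
# angle; the log ratio gives one sinogram row for reconstruction.
# Detector counts are invented for illustration.
import math

pre  = [1000.0, 980.0, 1005.0, 990.0]   # detector counts, unirradiated scan
post = [800.0, 600.0, 900.0, 990.0]     # same projection angle, after irradiation

sinogram_row = [math.log(a / b) for a, b in zip(pre, post)]
```

Referencing each angle to its own pre-scan is what cancels dosimeter imperfections (wall marks, base absorbance) that are common to both scans.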

  19. Analysis of airborne antenna systems using geometrical theory of diffraction and moment method computer codes

    NASA Technical Reports Server (NTRS)

    Hartenstein, Richard G., Jr.

    1985-01-01

    Computer codes have been developed to analyze antennas on aircraft and in the presence of scatterers. The purpose of this study is to use these codes to develop accurate computer models of various aircraft and antenna systems. The antenna systems analyzed are a P-3B L-Band antenna, an A-7E UHF relay pod antenna, and a traffic advisory antenna system installed on a Bell Long Ranger helicopter. Computer results are compared to measured ones with good agreement. These codes can be used in the design stage of an antenna system to determine the optimum antenna location and save valuable time and costly flight hours.

  20. THC-MP: High performance numerical simulation of reactive transport and multiphase flow in porous media

    NASA Astrophysics Data System (ADS)

    Wei, Xiaohui; Li, Weishan; Tian, Hailong; Li, Hongliang; Xu, Haixiao; Xu, Tianfu

    2015-07-01

    The numerical simulation of multiphase flow and reactive transport in porous media for complex subsurface problems is a computationally intensive application. To meet increasing computational requirements, this paper presents a parallel computing method and architecture. Derived from TOUGHREACT, a well-established code for simulating subsurface multiphase flow and reactive transport problems, we developed the high-performance code THC-MP for massively parallel computers, which greatly extends the computational capability of the original code. The domain decomposition method was applied to the coupled numerical computing procedure in THC-MP. We designed the distributed data structure and implemented the data initialization and exchange between the computing nodes and the core solving module using a hybrid parallel iterative and direct solver. Numerical accuracy of THC-MP was verified on a CO2 injection-induced reactive transport problem by comparing the results obtained from parallel computing with those from sequential computing (the original code). Execution efficiency and code scalability were examined through field-scale carbon sequestration applications on a multicore cluster. The results demonstrate the enhanced performance of THC-MP on parallel computing facilities.
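
The domain-decomposition idea can be illustrated with a toy 1D Jacobi sweep in which each subdomain reads one layer of "ghost" cells from its neighbours. Here the exchange is simulated serially (a real code such as THC-MP would exchange ghost layers between nodes, e.g. via MPI); the decomposed update reproduces the global sweep exactly:

```python
# Toy 1D domain decomposition with ghost cells: the grid is split into
# subdomains, each updates its interior, and boundary values of neighbours
# act as ghost cells. The exchange is simulated serially here; a parallel
# code would communicate the ghost layers between nodes each step.
n, nsub = 12, 3
u = [0.0] * n
u[0], u[-1] = 1.0, 1.0                       # fixed boundary values

def jacobi_step(u):
    """Global Jacobi update of all interior cells."""
    new = u[:]
    for i in range(1, n - 1):
        new[i] = 0.5 * (u[i - 1] + u[i + 1])
    return new

def decomposed_step(u):
    """Same update done subdomain-by-subdomain with ghost-cell reads."""
    size = n // nsub
    new = u[:]
    for s in range(nsub):
        lo, hi = s * size, (s + 1) * size
        for i in range(max(lo, 1), min(hi, n - 1)):
            new[i] = 0.5 * (u[i - 1] + u[i + 1])  # u[lo-1], u[hi] act as ghosts
    return new

for _ in range(50):
    assert decomposed_step(u) == jacobi_step(u)   # identical to the global sweep
    u = jacobi_step(u)
```

Because every subdomain reads only the *old* iterate, the decomposed sweep is bitwise identical to the global one, which is the property that makes the parallel result verifiable against the sequential code, as the paper does.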

  1. Code of Ethical Conduct for Computer-Using Educators: An ICCE Policy Statement.

    ERIC Educational Resources Information Center

    Computing Teacher, 1987

    1987-01-01

    Prepared by the International Council for Computers in Education's Ethics and Equity Committee, this code of ethics for educators using computers covers nine main areas: curriculum issues, issues relating to computer access, privacy/confidentiality issues, teacher-related issues, student issues, the community, school organizational issues,…

  2. Fast, high-resolution 3D dosimetry utilizing a novel optical-CT scanner incorporating tertiary telecentric collimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sakhalkar, H. S.; Oldham, M.

    2008-01-15

    This study introduces a charge coupled device (CCD) area detector based optical-computed tomography (optical-CT) scanner for comprehensive verification of radiation dose distributions recorded in nonscattering radiochromic dosimeters. Defining characteristics include: (i) a very fast scanning time of ~5 min to acquire a complete three-dimensional (3D) dataset, (ii) improved image formation through the use of custom telecentric optics, which ensures accurate projection images and minimizes artifacts from scattered and stray-light sources, and (iii) high resolution (potentially 50 µm) isotropic 3D dose readout. The performance of the CCD scanner for 3D dose readout was evaluated by comparison with independent 3D readout from the single laser beam OCTOPUS-scanner for the same PRESAGE dosimeters. The OCTOPUS scanner was considered the 'gold standard' technique in light of prior studies demonstrating its accuracy. Additional comparisons were made against calculated dose distributions from the ECLIPSE treatment-planning system. Dose readout for the following treatments was investigated: (i) a single rectangular beam irradiation to investigate small field and very steep dose gradient dosimetry away from edge effects, (ii) a 2-field open beam parallel-opposed irradiation to investigate dosimetry along steep dose gradients, and (iii) a 7-field intensity modulated radiation therapy (IMRT) irradiation to investigate dosimetry for complex treatment delivery involving modulation of fluence and for dosimetry along moderate dose gradients. Dose profiles, dose-difference plots, and gamma maps were employed to evaluate quantitative estimates of agreement between independently measured and calculated dose distributions.
    Results indicated that dose readout from the CCD scanner was in agreement with independent gold-standard readout from the OCTOPUS-scanner as well as the calculated ECLIPSE dose distribution for all treatments, except in regions within a few millimeters of the edge of the dosimeter, where edge artifact is predominant. Agreement of line profiles was observed, even along steep dose gradients. Dose difference plots indicated that the CCD scanner dose readout differed from the OCTOPUS-scanner readout and ECLIPSE calculations by ~10% along steep dose gradients and by ~5% along moderate dose gradients. Gamma maps (3% dose-difference and 3 mm distance-to-agreement acceptance criteria) revealed agreement, except for regions within 5 mm of the edge of the dosimeter where the edge artifact occurs. In summary, the data demonstrate feasibility of using the fast, high-resolution CCD scanner for comprehensive 3D dosimetry in all applications, except where dose readout is required close to the edges of the dosimeter. Further work is ongoing to reduce this artifact.
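
The gamma-map criterion quoted above (3% dose difference, 3 mm distance-to-agreement) can be sketched in 1D as follows; real evaluations run the same test on 2D/3D dose grids, and the profiles below are invented:

```python
# Minimal 1D gamma-index sketch (3% dose-difference / 3 mm DTA), the
# acceptance test behind the gamma maps above. A point passes if
# gamma <= 1. Profiles below are invented; real tests use 2D/3D grids.
import math

def gamma_1d(ref, evaluated, dx_mm, dose_tol=0.03, dta_mm=3.0):
    """Per-point gamma for two equally spaced 1D dose profiles."""
    dmax = max(ref)                                 # global normalization
    out = []
    for i, dr in enumerate(ref):
        best = float("inf")
        for j, de in enumerate(evaluated):
            dd = (de - dr) / (dose_tol * dmax)      # dose-difference term
            dist = (j - i) * dx_mm / dta_mm         # distance-to-agreement term
            best = min(best, math.hypot(dd, dist))
        out.append(best)
    return out

ref  = [0.0, 0.2, 0.5, 1.0, 0.5, 0.2, 0.0]
meas = [0.0, 0.2, 0.52, 0.99, 0.5, 0.21, 0.0]
g = gamma_1d(ref, meas, dx_mm=1.0)
pass_rate = sum(v <= 1.0 for v in g) / len(g)
```

The minimization over neighbouring points is what lets steep-gradient regions pass on spatial proximity even when the point-wise dose difference exceeds 3%.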

  3. Embedding Secure Coding Instruction into the IDE: Complementing Early and Intermediate CS Courses with ESIDE

    ERIC Educational Resources Information Center

    Whitney, Michael; Lipford, Heather Richter; Chu, Bill; Thomas, Tyler

    2018-01-01

    Many of the software security vulnerabilities that people face today can be remediated through secure coding practices. A critical step toward the practice of secure coding is ensuring that our computing students are educated on these practices. We argue that secure coding education needs to be included across a computing curriculum. We are…

  4. Evaluation of six TPS algorithms in computing entrance and exit doses

    PubMed Central

    Metwaly, Mohamed; Glegg, Martin; Baggarley, Shaun P.; Elliott, Alex

    2014-01-01

    Entrance and exit doses are commonly measured in in vivo dosimetry for comparison with expected values, usually generated by the treatment planning system (TPS), to verify accuracy of treatment delivery. This report aims to evaluate the accuracy of six TPS algorithms in computing entrance and exit doses for a 6 MV beam. The algorithms tested were: pencil beam convolution (Eclipse PBC), analytical anisotropic algorithm (Eclipse AAA), AcurosXB (Eclipse AXB), FFT convolution (XiO Convolution), multigrid superposition (XiO Superposition), and Monte Carlo photon (Monaco MC). Measurements with ionization chamber (IC) and diode detector in water phantoms were used as a reference. Comparisons were done in terms of central axis point dose, 1D relative profiles, and 2D absolute gamma analysis. Entrance doses computed by all TPS algorithms agreed to within 2% of the measured values. Exit doses computed by XiO Convolution, XiO Superposition, Eclipse AXB, and Monaco MC agreed with the IC measured doses to within 2%‐3%. Meanwhile, Eclipse PBC and Eclipse AAA computed exit doses were higher than the IC measured doses by up to 5.3% and 4.8%, respectively. Both algorithms assume that full backscatter exists even at the exit level, leading to an overestimation of exit doses. Despite good agreements at the central axis for Eclipse AXB and Monaco MC, 1D relative comparisons showed profiles mismatched at depths beyond 11.5 cm. Overall, the 2D absolute gamma (3%/3 mm) pass rates were better for Monaco MC, while Eclipse AXB failed mostly at the outer 20% of the field area. The findings of this study serve as a useful baseline for the implementation of entrance and exit in vivo dosimetry in clinical departments utilizing any of these six common TPS algorithms for reference comparison. PACS numbers: 87.55.‐x, 87.55.D‐, 87.55.N‐, 87.53.Bn PMID:24892349

  5. Comparison of methods for individualized astronaut organ dosimetry: Morphometry-based phantom library versus body contour autoscaling of a reference phantom

    NASA Astrophysics Data System (ADS)

    Sands, Michelle M.; Borrego, David; Maynard, Matthew R.; Bahadori, Amir A.; Bolch, Wesley E.

    2017-11-01

    One of the hazards faced by space crew members in low-Earth orbit or in deep space is exposure to ionizing radiation. It has been shown previously that while differences in organ-specific and whole-body risk estimates due to body size variations are small for highly-penetrating galactic cosmic rays, large differences in these quantities can result from exposure to shorter-range trapped proton or solar particle event radiations. For this reason, it is desirable to use morphometrically accurate computational phantoms representing each astronaut for a risk analysis, especially in the case of a solar particle event. An algorithm was developed to automatically sculpt and scale the UF adult male and adult female hybrid reference phantom to the individual outer body contour of a given astronaut. This process begins with the creation of a laser-measured polygon mesh model of the astronaut's body contour. Using the auto-scaling program and selecting several anatomical landmarks, the UF adult male or female phantom is adjusted to match the laser-measured outer body contour of the astronaut. A dosimetry comparison study was conducted to compare the organ dose accuracy of both the autoscaled phantom and that based upon a height-weight matched phantom from the UF/NCI Computational Phantom Library. Monte Carlo methods were used to simulate the environment of the August 1972 and February 1956 solar particle events. Using a series of individual-specific voxel phantoms as a local benchmark standard, autoscaled phantom organ dose estimates were shown to provide a 1% and 10% improvement in organ dose accuracy for a population of females and males, respectively, as compared to organ doses derived from height-weight matched phantoms from the UF/NCI Computational Phantom Library. 
In addition, this slight improvement in organ dose accuracy from the autoscaled phantoms is accompanied by reduced computer storage requirements and a more rapid method for individualized phantom generation when compared to the UF/NCI Computational Phantom Library.
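
The outer-contour scaling step can be caricatured as anisotropic scaling of a point set to a target bounding box. The real method sculpts a polygon-mesh reference phantom to laser-measured landmarks, so this is only a toy illustration of the idea:

```python
# Toy sketch of contour auto-scaling: anisotropically scale a reference
# point cloud so its bounding box matches a target's. The actual method
# deforms a polygon-mesh phantom to laser-measured body landmarks; this
# only illustrates the scaling step, with invented coordinates.
ref_pts    = [(0.0, 0.0), (2.0, 0.0), (2.0, 4.0), (0.0, 4.0)]  # reference contour
target_box = (3.0, 5.0)                                        # target width, height

w = max(x for x, _ in ref_pts) - min(x for x, _ in ref_pts)
h = max(y for _, y in ref_pts) - min(y for _, y in ref_pts)
sx, sy = target_box[0] / w, target_box[1] / h                  # per-axis factors

scaled = [(x * sx, y * sy) for x, y in ref_pts]
```

Per-axis factors derived from a few landmarks are cheap to compute and store, which is consistent with the reduced storage and generation time the study reports relative to a phantom library.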

  6. The Latin American Biological Dosimetry Network (LBDNet).

    PubMed

    García, O; Di Giorgio, M; Radl, A; Taja, M R; Sapienza, C E; Deminge, M M; Fernández Rearte, J; Stuck Oliveira, M; Valdivia, P; Lamadrid, A I; González, J E; Romero, I; Mandina, T; Guerrero-Carbajal, C; ArceoMaldonado, C; Cortina Ramírez, G E; Espinoza, M; Martínez-López, W; Di Tomasso, M

    2016-09-01

    Biological dosimetry is a necessary support for national radiation protection programmes and emergency response schemes. The Latin American Biological Dosimetry Network (LBDNet) was formally founded in 2007 to provide early biological dosimetry assistance in case of radiation emergencies in the Latin American region. Presented here are the main topics considered in the foundational document of the network, which comprise: mission, partners, concept of operation (including the mechanism for requesting biological dosimetry assistance in the region), and the network capabilities. The process for network activation and the role of the coordinating laboratory during biological dosimetry emergency response is also presented. This information is preceded by historical remarks on biological dosimetry cooperation in Latin America. A summary of the main experimental and practical results already obtained by the LBDNet is also included.

  7. Reference dosimeter system of the IAEA

    NASA Astrophysics Data System (ADS)

    Mehta, Kishor; Girzikowsky, Reinhard

    1995-09-01

    Quality assurance programmes must be in operation at radiation processing facilities to satisfy national and international standards. Since dosimetry has a vital function in these QA programmes, it is imperative that the dosimetry systems in use at these facilities are well calibrated, with traceability to a Primary Standard Dosimetry Laboratory. As a service to its Member States, the International Atomic Energy Agency operates the International Dose Assurance Service (IDAS) to assist in this process. The transfer standard dosimetry system used for this service is based on ESR spectrometry. The paper describes the activities undertaken at the IAEA Dosimetry Laboratory to establish the QA programme for its reference dosimetry system. There are four key elements of such a programme: a quality assurance manual; calibration that is traceable to a Primary Standard Dosimetry Laboratory; a clear and detailed statement of uncertainty in the dose measurement; and periodic quality audits.

  8. Computer Simulation of Breast Cancer Screening

    DTIC Science & Technology

    2001-07-01


  9. FY 1999 Laboratory Directed Research and Development annual report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    PJ Hughes

    2000-06-13

    A short synopsis of each project is given covering the following main areas of research and development: Atmospheric sciences; Biotechnology; Chemical and instrumentation analysis; Computer and information science; Design and manufacture engineering; Ecological science; Electronics and sensors; Experimental technology; Health protection and dosimetry; Hydrologic and geologic science; Marine sciences; Materials science; Nuclear science and engineering; Process science and engineering; Sociotechnical systems analysis; Statistics and applied mathematics; and Thermal and energy systems.

  10. Calculation of water drop trajectories to and about arbitrary three-dimensional lifting and nonlifting bodies in potential airflow

    NASA Technical Reports Server (NTRS)

    Norment, H. G.

    1985-01-01

    Subsonic, external flow about nonlifting bodies, lifting bodies or combinations of lifting and nonlifting bodies is calculated by a modified version of the Hess lifting code. Trajectory calculations can be performed for any atmospheric conditions and for all water drop sizes, from the smallest cloud droplet to large raindrops. Experimental water drop drag relations are used in the water drop equations of motion and effects of gravity settling are included. Inlet flow can be accommodated, and high Mach number compressibility effects are corrected for approximately. Seven codes are described: (1) a code used to debug and plot body surface description data; (2) a code that processes the body surface data to yield the potential flow field; (3) a code that computes flow velocities at arrays of points in space; (4) a code that computes water drop trajectories from an array of points in space; (5) a code that computes water drop trajectories and fluxes to arbitrary target points; (6) a code that computes water drop trajectories tangent to the body; and (7) a code that produces stereo pair plots which include both the body and trajectories. Accuracy of the calculations is discussed, and trajectory calculation results are compared with prior calculations and with experimental data.
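The core of the trajectory codes, integrating the water drop equation of motion under an empirical drag law with gravity settling, can be sketched as follows. The Schiller-Naumann sphere drag correlation and the explicit Euler step are stand-ins chosen for illustration; the report uses its own experimental water drop drag relations and numerical scheme, and `airflow` here stands for the potential-flow velocity field computed by the flow codes.

```python
import numpy as np

RHO_AIR = 1.225       # air density, kg/m^3
RHO_WATER = 1000.0    # water density, kg/m^3
MU_AIR = 1.81e-5      # dynamic viscosity of air, Pa*s
G = np.array([0.0, 0.0, -9.81])  # gravitational acceleration, m/s^2

def drag_coefficient(re):
    """Schiller-Naumann sphere drag law (an illustrative stand-in for
    the experimental water drop drag relations used in the report)."""
    re = max(re, 1e-9)
    return 24.0 / re * (1.0 + 0.15 * re**0.687)

def euler_step(pos, vel, airflow, d, dt):
    """Advance a drop of diameter d [m] one explicit Euler step through
    the local air velocity field airflow(pos)."""
    u_rel = airflow(pos) - vel            # air velocity relative to the drop
    speed = np.linalg.norm(u_rel)
    re = RHO_AIR * speed * d / MU_AIR     # drop Reynolds number
    mass = RHO_WATER * np.pi * d**3 / 6.0
    area = np.pi * d**2 / 4.0
    f_drag = 0.5 * RHO_AIR * drag_coefficient(re) * area * speed * u_rel
    acc = f_drag / mass + G               # drag plus gravity settling
    return pos + dt * vel, vel + dt * acc
```

Released from rest in still air, a 1 mm drop integrated this way settles toward a terminal velocity of roughly 4 m/s, the same order as measured raindrop fall speeds.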

  11. Debugging Techniques Used by Experienced Programmers to Debug Their Own Code.

    DTIC Science & Technology

    1990-09-01


  12. Topical Review: Polymer gel dosimetry

    PubMed Central

    Baldock, C; De Deene, Y; Doran, S; Ibbott, G; Jirasek, A; Lepage, M; McAuley, K B; Oldham, M; Schreiner, L J

    2010-01-01

    Polymer gel dosimeters are fabricated from radiation-sensitive chemicals which, upon irradiation, polymerize as a function of the absorbed radiation dose. These gel dosimeters, with the capacity to uniquely record the radiation dose distribution in three dimensions (3D), have specific advantages when compared to one-dimensional dosimeters, such as ion chambers, and two-dimensional dosimeters, such as film. These advantages are particularly significant in dosimetry situations where steep dose gradients exist, such as in intensity-modulated radiation therapy (IMRT) and stereotactic radiosurgery. Polymer gel dosimeters also have specific advantages for brachytherapy dosimetry. Potential dosimetry applications include those for low-energy x-rays, high-linear energy transfer (LET) and proton therapy, radionuclide dosimetry, and boron neutron capture therapy dosimetry. These 3D dosimeters are radiologically soft-tissue equivalent, with properties that may be modified depending on the application. The 3D radiation dose distribution in polymer gel dosimeters may be imaged using magnetic resonance imaging (MRI), optical computerized tomography (optical-CT), x-ray CT or ultrasound. The fundamental science underpinning polymer gel dosimetry is reviewed along with the various evaluation techniques. Clinical dosimetry applications of polymer gel dosimetry are also presented. PMID:20150687

  13. A COTS-Based Replacement Strategy for Aging Avionics Computers

    DTIC Science & Technology

    2001-12-01


  14. PARAVT: Parallel Voronoi tessellation code

    NASA Astrophysics Data System (ADS)

    González, R. E.

    2016-10-01

    In this study, we present a new open-source code for massively parallel computation of Voronoi tessellations (VT hereafter) in large data sets. The code is aimed at astrophysical applications, where VT densities and neighbor lists are widely used. Several serial Voronoi tessellation codes exist; however, no open-source parallel implementation has been available to handle the large numbers of particles/galaxies in current N-body simulations and sky surveys. Parallelization is implemented with MPI, and the VT is computed using the Qhull library. The domain decomposition accounts for consistent boundary computation between tasks and includes periodic conditions. In addition, the code computes the neighbor list, Voronoi density, Voronoi cell volume, and density gradient for each particle, as well as densities on a regular grid. The code implementation and user guide are publicly available at https://github.com/regonzar/paravt.
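The per-particle quantity PARAVT reports (the Voronoi cell volume, whose inverse is the Voronoi density) can be illustrated with a brute-force serial sketch that assigns each cell of a regular grid to its nearest particle. This is only an approximation for illustration; PARAVT computes the exact tessellation with the Qhull library and distributes the work over MPI tasks with boundary and periodic handling.

```python
import numpy as np

def voronoi_volumes_on_grid(points, box=1.0, ngrid=32):
    """Approximate per-particle Voronoi cell volumes inside [0, box)^3:
    each cell of an ngrid^3 regular grid is assigned to its nearest
    particle and the owned cell volumes are summed. The inverse of the
    returned volume is the (approximate) Voronoi density."""
    ax = (np.arange(ngrid) + 0.5) * box / ngrid          # grid cell centres
    gx, gy, gz = np.meshgrid(ax, ax, ax, indexing="ij")
    grid = np.stack([gx, gy, gz], axis=-1).reshape(-1, 3)
    # Nearest particle for every grid cell (O(ngrid^3 * N), sketch only).
    d2 = ((grid[:, None, :] - points[None, :, :]) ** 2).sum(axis=-1)
    owner = d2.argmin(axis=1)
    cell_vol = (box / ngrid) ** 3
    return np.bincount(owner, minlength=len(points)) * cell_vol
```

For two particles mirrored about the mid-plane of the unit box, each recovers half the box volume, and the volumes always sum to the full domain volume.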

  15. The STAGS computer code

    NASA Technical Reports Server (NTRS)

    Almroth, B. O.; Brogan, F. A.

    1978-01-01

    Basic information about the computer code STAGS (Structural Analysis of General Shells) is presented to describe to potential users the scope of the code and the solution procedures that are incorporated. Primarily, STAGS is intended for analysis of shell structures, although it has been extended to more complex shell configurations through the inclusion of springs and beam elements. The formulation is based on a variational approach in combination with local two dimensional power series representations of the displacement components. The computer code includes options for analysis of linear or nonlinear static stress, stability, vibrations, and transient response. Material as well as geometric nonlinearities are included. A few examples of applications of the code are presented for further illustration of its scope.

  16. Holonomic surface codes for fault-tolerant quantum computation

    NASA Astrophysics Data System (ADS)

    Zhang, Jiang; Devitt, Simon J.; You, J. Q.; Nori, Franco

    2018-02-01

    Surface codes can protect quantum information stored in qubits from local errors as long as the per-operation error rate is below a certain threshold. Here we propose holonomic surface codes by harnessing the quantum holonomy of the system. In our scheme, the holonomic gates are built via auxiliary qubits rather than the auxiliary levels in multilevel systems used in conventional holonomic quantum computation. The key advantage of our approach is that the auxiliary qubits are in their ground state before and after each gate operation, so they are not involved in the operation cycles of surface codes. This provides an advantageous way to implement surface codes for fault-tolerant quantum computation.

  17. Comparison of two- and three-dimensional flow computations with laser anemometer measurements in a transonic compressor rotor

    NASA Technical Reports Server (NTRS)

    Chima, R. V.; Strazisar, A. J.

    1982-01-01

    Two- and three-dimensional inviscid solutions for the flow in a transonic axial compressor rotor at design speed are compared with probe and laser anemometer measurements at near-stall and maximum-flow operating points. Experimental details of the laser anemometer system and computational details of the two-dimensional axisymmetric code and three-dimensional Euler code are described. Comparisons are made between relative Mach number and flow angle contours, shock location, and shock strength. A procedure for using an efficient axisymmetric code to generate downstream pressure input for computationally expensive Euler codes is discussed. A film supplement shows the calculations of the two operating points with the time-marching Euler code.

  18. Development of MCNPX-ESUT computer code for simulation of neutron/gamma pulse height distribution

    NASA Astrophysics Data System (ADS)

    Abolfazl Hosseini, Seyed; Vosoughi, Naser; Zangian, Mehdi

    2015-05-01

    In this paper, the development of the MCNPX-ESUT (MCNPX-Energy Engineering of Sharif University of Technology) computer code for simulation of neutron/gamma pulse height distributions is reported. Since liquid organic scintillators like NE-213 are well suited to and routinely used for spectrometry in mixed neutron/gamma fields, this type of detector was selected for simulation in the present study. The proposed simulation algorithm includes four main steps. The first step is the modeling of neutron/gamma transport and interactions with the materials in the environment and the detector volume. In the second step, the number of scintillation photons produced by charged particles such as electrons, alphas, protons and carbon nuclei in the scintillator material is calculated. In the third step, the transport of scintillation photons in the scintillator and light guide is simulated. Finally, the experimental resolution is applied in the last step of the simulation. Unlike similar computer codes such as SCINFUL, NRESP7 and PHRESP, the developed code is applicable to both neutron and gamma sources; hence, discrimination of neutrons and gammas in mixed fields may be performed using the MCNPX-ESUT computer code. The main feature of the MCNPX-ESUT computer code is that the neutron/gamma pulse height simulation may be performed without any post-processing. In the present study, the pulse height distributions due to a monoenergetic neutron/gamma source in an NE-213 detector are simulated using the MCNPX-ESUT computer code. The simulated neutron pulse height distributions are validated through comparison with experimental data (Gohil et al., Nuclear Instruments and Methods in Physics Research Section A: Accelerators, Spectrometers, Detectors and Associated Equipment, 664 (2012) 304-309) and with results obtained from similar computer codes such as SCINFUL, NRESP7 and Geant4. The simulated gamma pulse height distribution for a 137Cs source is also compared with experimental data.
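The last step of the simulation chain, folding the detector resolution into the ideal scintillation light output, can be sketched as follows. The functional form dL/L = sqrt(a² + b²/L + c²/L²) is a common resolution parameterization for organic scintillators; the a, b, c values below are placeholders for illustration, not the fitted NE-213 constants used by MCNPX-ESUT.

```python
import numpy as np

def apply_resolution(light, a=0.1, b=0.1, c=0.01, rng=None):
    """Smear ideal light outputs (arbitrary units) with a Gaussian whose
    FWHM follows dL/L = sqrt(a^2 + b^2/L + c^2/L^2), a common resolution
    parameterization for organic scintillators. a, b, c are placeholder
    values, not fitted detector constants."""
    if rng is None:
        rng = np.random.default_rng(0)
    light = np.asarray(light, dtype=float)
    fwhm = light * np.sqrt(a**2 + b**2 / light + c**2 / light**2)
    return rng.normal(light, fwhm / 2.355)  # sigma = FWHM / 2.355

def pulse_height_spectrum(smeared, bins=64, lo=0.0, hi=5.0):
    """Bin smeared light outputs into a pulse-height distribution."""
    return np.histogram(smeared, bins=bins, range=(lo, hi))
```

Applied to a monoenergetic line, this turns the ideal delta-function response into the Gaussian-broadened peak seen in a measured pulse-height spectrum.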

  19. SU-F-J-100: Standardized Biodistribution Template for Nuclear Medicine Dosimetry Collection and Reporting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kesner, A; Poli, G; Beykan, S

    Purpose: As the field of Nuclear Medicine moves forward with efforts to integrate radiation dosimetry into clinical practice, we can identify the challenge posed by the lack of standardized dose calculation methods and protocols. All personalized internal dosimetry is derived by projecting biodistribution measurements into dosimetry calculations. In an effort to standardize the organization of data and its reporting, we have developed, as a sequel to the EANM recommendation of “Good Dosimetry Reporting”, a freely available biodistribution template, which can be used to create a common point of reference for dosimetry data. It can be disseminated, interpreted, and used for method development widely across the field. Methods: A generalized biodistribution template was built in a comma-delimited format (.csv) to be completed by users performing biodistribution measurements. The template is available for free download. The download site includes instructions and other usage details on the template. Results: This is a new resource developed for the community. It is our hope that users will consider integrating it into their dosimetry operations. Having biodistribution data available and easily accessible for all patients processed is a strategy for organizing large amounts of information. It may enable users to create their own databases that can be analyzed for multiple aspects of dosimetry operations. Furthermore, it enables population data to easily be reprocessed using different dosimetry methodologies. With respect to dosimetry-related research and publications, the biodistribution template can be included as supplementary material, and will allow others in the community to better compare calculations and results achieved. Conclusion: As dosimetry in nuclear medicine becomes more routinely applied in clinical applications, we, as a field, need to develop the infrastructure for handling large amounts of data.
Our organ-level biodistribution template can be used as a standard format for data collection and organization, as well as for dosimetry research and software development.
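A biodistribution record in a comma-delimited layout might look like the sketch below. The column names and the sample row are purely illustrative assumptions; the actual downloadable template defines its own fields and layout.

```python
import csv
import io

# Illustrative field names only -- not the authors' actual template.
FIELDS = ["patient_id", "radiopharmaceutical", "administered_activity_MBq",
          "organ", "time_post_injection_h", "fraction_of_injected_activity"]

rows = [
    {"patient_id": "P001", "radiopharmaceutical": "177Lu-octreotate",
     "administered_activity_MBq": 7400, "organ": "kidney_left",
     "time_post_injection_h": 24.0, "fraction_of_injected_activity": 0.017},
]

# Write one header line plus one row per time point / organ measurement.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=FIELDS)
writer.writeheader()
writer.writerows(rows)
template_csv = buf.getvalue()
print(template_csv)
```

Organizing every measurement as one flat row per organ and time point is what makes population data easy to reprocess with different dosimetry methodologies later.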

  20. Evaluation and implementation of triple‐channel radiochromic film dosimetry in brachytherapy

    PubMed Central

    Bradley, David; Nisbet, Andrew

    2014-01-01

    The measurement of dose distributions in clinical brachytherapy, for the purpose of quality control, commissioning or dosimetric audit, is challenging and requires development. Radiochromic film dosimetry with a commercial flatbed scanner may be suitable, but careful methodologies are required to control various sources of uncertainty. Triple‐channel dosimetry has recently been utilized in external beam radiotherapy to improve the accuracy of film dosimetry, but its use in brachytherapy, with characteristic high maximum doses, steep dose gradients, and small scales, has been less well researched. We investigate the use of advanced film dosimetry techniques for brachytherapy dosimetry, evaluating uncertainties and assessing the mitigation afforded by triple‐channel dosimetry. We present results on postirradiation film darkening, lateral scanner effect, film surface perturbation, film active layer thickness, film curling, and examples of the measurement of clinical brachytherapy dose distributions. The lateral scanner effect in brachytherapy film dosimetry can be very significant: up to a 23% dose increase at 14 Gy at ±9 cm lateral from the scanner axis for simple single‐channel dosimetry. Triple‐channel dosimetry mitigates the effect, but even so the usable width of a typical scanner is limited to less than 8 cm at high dose levels if dose uncertainty is to be kept within 1%. Triple‐channel dosimetry separates dose and dose‐independent signal components, and effectively removes disturbances caused by film thickness variation and surface perturbations in the examples considered in this work. The use of reference dose films scanned simultaneously with brachytherapy test films is recommended to account for scanner variations from calibration conditions. Postirradiation darkening, which continues logarithmically with time, must be taken into account between the reference and test films.
Finally, films must be flat when scanned to avoid Callier‐like effects and to provide reliable dosimetric results. We have demonstrated that radiochromic film dosimetry with GAFCHROMIC EBT3 film and a commercial flatbed scanner is a viable method for brachytherapy dose distribution measurement, and that uncertainties may be reduced with triple‐channel dosimetry and specific film scanning and evaluation methodologies. PACS numbers: 87.55.Qr, 87.56.bg, 87.55.km PMID:25207417
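The core idea of triple-channel dosimetry, that dose-independent disturbances such as thickness variation enter all three color channels as a common factor which can be solved for alongside the dose, can be sketched as below. The single-hit calibration curves and their constants are hypothetical, not EBT3 fit data, and the brute-force grid search stands in for the published optimization methods.

```python
import numpy as np

# Hypothetical per-channel calibration constants (illustration only):
# net optical density as a saturating function of dose for R, G, B.
CAL = {"R": (0.55, 6.0), "G": (0.40, 9.0), "B": (0.25, 20.0)}

def net_od(dose, channel):
    """Hypothetical calibration curve: netOD = c * D / (D + h)."""
    c, h = CAL[channel]
    return c * dose / (dose + h)

def triple_channel_dose(measured, doses=np.linspace(0.01, 30, 3000)):
    """Find the dose D and common dose-independent disturbance delta such
    that measured_k ~= net_od(D, k) * (1 + delta) holds in all three
    channels at once. Brute-force search over a dose grid, with the
    best-fit delta at each trial dose obtained by least squares."""
    m = np.array([measured[k] for k in "RGB"])
    best_dose, best_delta, best_resid = None, None, np.inf
    for d in doses:
        model = np.array([net_od(d, k) for k in "RGB"])
        delta = np.dot(m, model) / np.dot(model, model) - 1.0
        resid = np.sum((m - model * (1 + delta)) ** 2)
        if resid < best_resid:
            best_dose, best_delta, best_resid = d, delta, resid
    return best_dose, best_delta
```

Because the disturbance is shared by the channels while the dose response differs between them, the two unknowns are separable, which is what lets the method remove thickness and surface artifacts.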

  1. EAC: A program for the error analysis of STAGS results for plates

    NASA Technical Reports Server (NTRS)

    Sistla, Rajaram; Thurston, Gaylen A.; Bains, Nancy Jane C.

    1989-01-01

    A computer code is now available for estimating the error in results from the STAGS finite element code for a shell unit consisting of a rectangular orthotropic plate. This memorandum contains basic information about the computer code EAC (Error Analysis and Correction) and describes the connection between the input data for the STAGS shell units and the input data necessary to run the error analysis code. The STAGS code returns a set of nodal displacements and a discrete set of stress resultants; the EAC code returns a continuous solution for displacements and stress resultants. The continuous solution is defined by a set of generalized coordinates computed in EAC. The theory and the assumptions that determine the continuous solution are also outlined in this memorandum. An example of application of the code is presented and instructions on its usage on the Cyber and the VAX machines have been provided.

  2. CFD Modeling of Free-Piston Stirling Engines

    NASA Technical Reports Server (NTRS)

    Ibrahim, Mounir B.; Zhang, Zhi-Guo; Tew, Roy C., Jr.; Gedeon, David; Simon, Terrence W.

    2001-01-01

    NASA Glenn Research Center (GRC) is funding Cleveland State University (CSU) to develop a reliable Computational Fluid Dynamics (CFD) code that can predict engine performance with the goal of significant improvements in accuracy when compared to one-dimensional (1-D) design code predictions. The funding also includes conducting code validation experiments at both the University of Minnesota (UMN) and CSU. In this paper a brief description of the work-in-progress is provided in the two areas (CFD and Experiments). Also, previous test results are compared with computational data obtained using (1) a 2-D CFD code obtained from Dr. Georg Scheuerer and further developed at CSU and (2) a multidimensional commercial code CFD-ACE+. The test data and computational results are for (1) a gas spring and (2) a single piston/cylinder with attached annular heat exchanger. The comparisons among the codes are discussed. The paper also discusses plans for conducting code validation experiments at CSU and UMN.

  3. On the error statistics of Viterbi decoding and the performance of concatenated codes

    NASA Technical Reports Server (NTRS)

    Miller, R. L.; Deutsch, L. J.; Butman, S. A.

    1981-01-01

    Computer simulation results are presented on the performance of convolutional codes of constraint lengths 7 and 10 concatenated with the (255, 223) Reed-Solomon code (a proposed NASA standard). These results indicate that as much as 0.8 dB can be gained by concatenating this Reed-Solomon code with a (10, 1/3) convolutional code, instead of the (7, 1/2) code currently used by the DSN. A mathematical model of Viterbi decoder burst-error statistics is developed and is validated through additional computer simulations.
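The arithmetic behind the concatenated scheme is simple to check: an (n, k) Reed-Solomon code over 8-bit symbols corrects up to t = (n − k)/2 symbol errors per codeword, and a Viterbi decoder burst error of a given bit length corrupts a bounded number of those symbols. The sketch below illustrates both counts; interleaving between the inner and outer codes is not modeled.

```python
def rs_correctable_symbols(n=255, k=223):
    """Symbol-error correction capacity t of an (n, k) Reed-Solomon code."""
    return (n - k) // 2

def symbols_hit_by_burst(start_bit, burst_len_bits, bits_per_symbol=8):
    """Worst-case number of 8-bit RS symbols touched by a contiguous
    Viterbi burst error, counting partial symbols at both ends."""
    first = start_bit // bits_per_symbol
    last = (start_bit + burst_len_bits - 1) // bits_per_symbol
    return last - first + 1
```

The (255, 223) code thus corrects up to 16 symbol errors per codeword, so Viterbi error bursts only defeat the outer code when the bursts falling in one codeword together touch more than 16 symbols.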

  4. New double-byte error-correcting codes for memory systems

    NASA Technical Reports Server (NTRS)

    Feng, Gui-Liang; Wu, Xinen; Rao, T. R. N.

    1996-01-01

    Error-correcting or error-detecting codes have been used in the computer industry to increase reliability, reduce service costs, and maintain data integrity. The single-byte error-correcting and double-byte error-detecting (SbEC-DbED) codes have been successfully used in computer memory subsystems. There are many methods to construct double-byte error-correcting (DBEC) codes. In the present paper we construct a class of double-byte error-correcting codes, which are more efficient than those known to be optimum, and a decoding procedure for our codes is also considered.

  5. A quantitative three-dimensional dose attenuation analysis around Fletcher-Suit-Delclos due to stainless steel tube for high-dose-rate brachytherapy by Monte Carlo calculations.

    PubMed

    Parsai, E Ishmael; Zhang, Zhengdong; Feldmeier, John J

    2009-01-01

    The commercially available brachytherapy treatment-planning systems of today usually neglect the attenuation effect of the stainless steel (SS) tube when a Fletcher-Suit-Delclos (FSD) applicator is used in the treatment of cervical and endometrial cancers. This could lead to potential inaccuracies in computed dwell times and dose distributions. A more accurate analysis quantifying the level of attenuation for a high-dose-rate (HDR) iridium-192 ((192)Ir) source is presented through Monte Carlo simulation verified by measurement. In this investigation, the general Monte Carlo N-Particle (MCNP) transport code was used to construct a typical FSD geometry and to compare the doses delivered to point A of the Manchester system with and without the SS tubing. A quantitative assessment of inaccuracies in delivered versus computed dose is presented. In addition, the investigation was expanded to examine the attenuation-corrected radial and anisotropy dose functions in a form parallel to the updated AAPM Task Group No. 43 Report (AAPM TG-43) formalism. This delineates quantitatively the inaccuracies in dose distributions in three-dimensional space. The changes in dose deposition and distribution caused by the increased attenuation in the presence of SS were quantified using MCNP Monte Carlo simulations with coupled photon/electron transport. The source geometry was that of the VariSource wire model VS2000, and the FSD was that of the Varian medical system. In this model, the bending angles of the tandem and colpostats are 15 degrees and 120 degrees, respectively. We assigned 10 dwell positions to the tandem and 4 dwell positions to each of the right and left colpostats (ovoids) to represent a typical treatment case. The dose delivered to point A was determined according to the Manchester dosimetry system. Based on our computations, the reduction of dose to point A was shown to be at least 3%, so the effect of SS FSD systems on patient dose is of concern.
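A back-of-the-envelope narrow-beam estimate shows why the SS wall matters at the few-percent level. The mass attenuation coefficient (roughly 0.1 cm²/g for iron near the ~0.38 MeV mean (192)Ir photon energy), the steel density, and the wall thickness used below are rough textbook-order values assumed for illustration, not the MCNP cross sections or the actual FSD dimensions, and scatter buildup is ignored.

```python
import math

def ss_transmission(wall_cm, mu_over_rho=0.1, rho=7.9):
    """Narrow-beam photon transmission exp(-mu * t) through a stainless
    steel wall of thickness wall_cm. mu_over_rho [cm^2/g] and rho [g/cm^3]
    are rough illustrative values; scatter buildup is neglected."""
    mu = mu_over_rho * rho        # linear attenuation coefficient, 1/cm
    return math.exp(-mu * wall_cm)
```

For an assumed 0.5 mm wall this gives a transmission of about 0.96, i.e. an attenuation of a few percent, the same order as the at-least-3% reduction in dose to point A found by the full Monte Carlo calculation.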

  6. An exponential growth of computational phantom research in radiation protection, imaging, and radiotherapy: A review of the fifty-year history

    PubMed Central

    Xu, X. George

    2014-01-01

    Radiation dose calculation using models of the human anatomy has been a subject of great interest to radiation protection, medical imaging, and radiotherapy. However, early pioneers of this field did not foresee the exponential growth of research activity as observed today. This review article walks the reader through the history of the research and development in this field of study, which started some 50 years ago. This review identifies a clear progression of computational phantom complexity, which can be denoted by three distinct generations. The first generation of stylized phantoms, a grouping of fewer than a dozen models, was initially developed in the 1960s at Oak Ridge National Laboratory to calculate internal doses from nuclear medicine procedures. Despite their anatomical simplicity, these computational phantoms were the best tools available at the time for internal/external dosimetry, image evaluation, and treatment dose evaluations. A second generation of a large number of voxelized phantoms arose rapidly in the late 1980s as a result of the increased availability of tomographic medical imaging and computers. Surprisingly, the last decade saw the emergence of the third generation of phantoms, which are based on advanced geometries called boundary representation (BREP) in the form of Non-Uniform Rational B-Splines (NURBS) or polygonal meshes. This new class of phantoms now consists of over 287 models, including those used for non-ionizing radiation applications. This review article aims to provide the reader with a general understanding of how the field of computational phantoms came about and the technical challenges it faced at different times. This goal is achieved by defining basic geometry modeling techniques and by analyzing selected phantoms in terms of geometrical features and dosimetric problems to be solved.
The rich historical information is summarized in four tables that are aided by highlights in the text on how some of the most well-known phantoms were developed and used in practice. Some of the information covered in this review has not been previously reported, for example, the CAM and CAF phantoms developed in the 1970s for space radiation applications. The author also clarifies confusion about “population-average” prospective dosimetry needed for radiological protection under the current ICRP radiation protection system and “individualized” retrospective dosimetry often performed for medical physics studies. To illustrate the impact of computational phantoms, a section of this article is devoted to examples from the author’s own research group. Finally, the author explains an unexpected finding during the course of preparing for this article that the phantoms from the past 50 years followed a pattern of exponential growth. The review ends with a brief discussion of future research needs (A supplementary file “3DPhantoms.pdf” to Figure 15 is available for download that will allow a reader to interactively visualize the phantoms in 3D). PMID:25144730

  7. An exponential growth of computational phantom research in radiation protection, imaging, and radiotherapy: a review of the fifty-year history.

    PubMed

    Xu, X George

    2014-09-21

    Radiation dose calculation using models of the human anatomy has been a subject of great interest to radiation protection, medical imaging, and radiotherapy. However, early pioneers of this field did not foresee the exponential growth of research activity as observed today. This review article walks the reader through the history of the research and development in this field of study, which started some 50 years ago. This review identifies a clear progression of computational phantom complexity, which can be denoted by three distinct generations. The first generation of stylized phantoms, a grouping of fewer than a dozen models, was initially developed in the 1960s at Oak Ridge National Laboratory to calculate internal doses from nuclear medicine procedures. Despite their anatomical simplicity, these computational phantoms were the best tools available at the time for internal/external dosimetry, image evaluation, and treatment dose evaluations. A second generation of a large number of voxelized phantoms arose rapidly in the late 1980s as a result of the increased availability of tomographic medical imaging and computers. Surprisingly, the last decade saw the emergence of the third generation of phantoms, which are based on advanced geometries called boundary representation (BREP) in the form of Non-Uniform Rational B-Splines (NURBS) or polygonal meshes. This new class of phantoms now consists of over 287 models, including those used for non-ionizing radiation applications. This review article aims to provide the reader with a general understanding of how the field of computational phantoms came about and the technical challenges it faced at different times. This goal is achieved by defining basic geometry modeling techniques and by analyzing selected phantoms in terms of geometrical features and dosimetric problems to be solved.
The rich historical information is summarized in four tables that are aided by highlights in the text on how some of the most well-known phantoms were developed and used in practice. Some of the information covered in this review has not been previously reported, for example, the CAM and CAF phantoms developed in the 1970s for space radiation applications. The author also clarifies confusion about 'population-average' prospective dosimetry needed for radiological protection under the current ICRP radiation protection system and 'individualized' retrospective dosimetry often performed for medical physics studies. To illustrate the impact of computational phantoms, a section of this article is devoted to examples from the author's own research group. Finally, the author explains an unexpected finding during the course of preparing for this article that the phantoms from the past 50 years followed a pattern of exponential growth. The review ends with a brief discussion of future research needs (a supplementary file '3DPhantoms.pdf' to figure 15 is available for download that will allow a reader to interactively visualize the phantoms in 3D).

  8. Sourceless startup. A machine code for computing low-source reactor startups

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MacMillan, D.B.

    1960-06-01

    A revision to the sourceless start-up code is presented. The code solves a system of differential equations encountered in computing the probability distribution of activity at an observed power level during reactor start-up from a very low source level. (J.R.D.)

  9. Computer-assisted coding and clinical documentation: first things first.

    PubMed

    Tully, Melinda; Carmichael, Angela

    2012-10-01

    Computer-assisted coding tools have the potential to drive improvements in seven areas: transparency of coding; productivity (generally by 20 to 25 percent for inpatient claims); accuracy (by improving specificity of documentation); cost containment (by reducing overtime expenses, audit fees, and denials); compliance; efficiency; and consistency.

  10. Quantitative imaging for clinical dosimetry

    NASA Astrophysics Data System (ADS)

    Bardiès, Manuel; Flux, Glenn; Lassmann, Michael; Monsieurs, Myriam; Savolainen, Sauli; Strand, Sven-Erik

    2006-12-01

    Patient-specific dosimetry in nuclear medicine is now a legal requirement in many countries throughout the EU for targeted radionuclide therapy (TRT) applications. In order to achieve that goal, an increased level of accuracy in dosimetry procedures is needed. Current research in nuclear medicine dosimetry should not only aim at developing new methods to assess the delivered radiation absorbed dose at the patient level, but also to ensure that the proposed methods can be put into practice in a sufficient number of institutions. A unified dosimetry methodology is required for making clinical outcome comparisons possible.

  11. Hypercube matrix computation task

    NASA Technical Reports Server (NTRS)

    Calalo, R.; Imbriale, W.; Liewer, P.; Lyons, J.; Manshadi, F.; Patterson, J.

    1987-01-01

    The Hypercube Matrix Computation (1986-1987) task investigated the applicability of a parallel computing architecture to the solution of large-scale electromagnetic scattering problems. Two existing electromagnetic scattering codes were selected for conversion to the Mark III Hypercube concurrent computing environment. They were selected so that the underlying numerical algorithms would differ, thereby providing a more thorough evaluation of the appropriateness of the parallel environment for these types of problems. The first code was a frequency-domain method-of-moments solution, NEC-2, developed at Lawrence Livermore National Laboratory. The second was a time-domain finite-difference solution of Maxwell's equations for the scattered fields. Once the codes were implemented on the hypercube and verified to obtain correct solutions by comparing the results with those from sequential runs, several measures were used to evaluate their performance. First, the problem size possible on a 32-node hypercube configuration with 128 megabytes of memory was compared with that available in a typical sequential user environment of 4 to 8 megabytes. Then, the performance of the codes was analyzed for the computational speedup attained by the parallel architecture.
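
    The performance measures mentioned in this record reduce to simple ratios. As a hedged illustration (the timings below are invented placeholders, not the study's measurements), speedup and parallel efficiency can be computed as:

    ```python
    # Illustrative only: speedup S = T_seq / T_par and parallel efficiency
    # E = S / p for p processing nodes. The timings are made-up numbers.

    def speedup(t_seq: float, t_par: float) -> float:
        """Ratio of sequential to parallel wall-clock time."""
        return t_seq / t_par

    def efficiency(t_seq: float, t_par: float, nodes: int) -> float:
        """Speedup normalized by the number of processing nodes."""
        return speedup(t_seq, t_par) / nodes

    t_seq, t_par, nodes = 640.0, 25.0, 32   # invented times in seconds
    print(f"speedup = {speedup(t_seq, t_par):.1f}")               # 25.6
    print(f"efficiency = {efficiency(t_seq, t_par, nodes):.2f}")  # 0.80
    ```

    An efficiency near 1.0 would indicate that the hypercube nodes are kept almost fully busy; communication overhead typically pulls it below that.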

  12. Bistatic radar cross section of a perfectly conducting rhombus-shaped flat plate

    NASA Astrophysics Data System (ADS)

    Fenn, Alan J.

    1990-05-01

    The bistatic radar cross section of a perfectly conducting flat plate that has a rhombus shape (equilateral parallelogram) is investigated. The Ohio State University electromagnetic surface patch code (ESP version 4) is used to compute the theoretical bistatic radar cross section of a 35- x 27-in rhombus plate at 1.3 GHz over the bistatic angles 15 deg to 142 deg. The ESP-4 computer code is a method of moments FORTRAN-77 program which can analyze general configurations of plates and wires. This code has been installed and modified at Lincoln Laboratory on a SUN 3 computer network. Details of the code modifications are described. Comparisons of the method of moments simulations and measurements of the rhombus plate are made. It is shown that the ESP-4 computer code provides a high degree of accuracy in the calculation of copolarized and cross-polarized bistatic radar cross section patterns.

  13. ASR4: A computer code for fitting and processing 4-gage anelastic strain recovery data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Warpinski, N.R.

    A computer code for analyzing four-gage Anelastic Strain Recovery (ASR) data has been modified for use on a personal computer. This code fits the viscoelastic model of Warpinski and Teufel to measured ASR data, calculates the stress orientation directly, and computes stress magnitudes if sufficient input data are available. The code also calculates the stress orientation using strain-rosette equations, and it calculates stress magnitudes using Blanton's approach, assuming sufficient input data are available. The program is written in FORTRAN, compiled with Ryan-McFarland Version 2.4. Graphics use PLOT88 software by Plotworks, Inc., but the graphics software must be obtained by the user because of licensing restrictions. A version without graphics can also be run. This code is available through the National Energy Software Center (NESC), operated by Argonne National Laboratory. 5 refs., 3 figs.
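
    As a rough illustration of the kind of strain-rosette reduction such a code performs, the standard equations for a rectangular (0/45/90 degree) three-gage rosette are sketched below. This is only a generic textbook reduction: the actual four-gage ASR configuration and the Warpinski-Teufel viscoelastic fit are not reproduced, and the strain values are invented.

    ```python
    import math

    # Standard rectangular-rosette reduction (gages at 0, 45 and 90 degrees).
    # NOT the ASR4 four-gage scheme; shown only to illustrate how principal
    # strains and their orientation fall out of rosette readings.

    def rosette_principal(ea: float, eb: float, ec: float):
        """Principal strains and orientation from a 0/45/90 rosette."""
        mean = (ea + ec) / 2.0
        radius = math.sqrt((ea - eb) ** 2 + (eb - ec) ** 2) / math.sqrt(2.0)
        theta = 0.5 * math.atan2(2.0 * eb - ea - ec, ea - ec)  # radians
        return mean + radius, mean - radius, theta

    # Invented microstrain readings for the three gages:
    e1, e2, theta = rosette_principal(400e-6, 300e-6, 100e-6)
    ```

    The orientation `theta` is what a stress-orientation analysis ultimately reports, since the principal strain axes coincide with the principal stress axes for an isotropic rock.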

  14. The ENEA criticality accident dosimetry system: a contribution to the 2002 international intercomparison at the SILENE reactor.

    PubMed

    Gualdrini, G; Bedogni, R; Fantuzzi, E; Mariotti, F

    2004-01-01

    The present paper summarises the activity carried out at the ENEA Radiation Protection Institute to update the methodologies employed for evaluating the neutron and photon dose to exposed workers in case of a criticality accident, in the framework of the 'International Intercomparison of Criticality Accident Dosimetry Systems' (Silène reactor, IRSN-CEA-Valduc, June 2002). The evaluation of the neutron spectra and the neutron dosimetric quantities relies on activation detectors and on unfolding algorithms. Thermoluminescent detectors are employed for the gamma dose measurement. The work aimed both at accurately characterising the measurement system and at testing the algorithms. Useful spectral information, based on Monte Carlo simulations, was included to take into account the potential accident scenarios of practical interest. Throughout the intercomparison exercise, particular attention was devoted to the 'traceability' of all the experimental and computational parameters, so that they could be easily handled by the user.

  15. Navier-Stokes Simulation of Homogeneous Turbulence on the CYBER 205

    NASA Technical Reports Server (NTRS)

    Wu, C. T.; Ferziger, J. H.; Chapman, D. R.; Rogallo, R. S.

    1984-01-01

    A computer code which solves the Navier-Stokes equations for three-dimensional, time-dependent, homogeneous turbulence has been written for the CYBER 205. The code has options for both 64-bit and 32-bit arithmetic. With 32-bit computation, mesh sizes up to 64^3 fit within the core of a 2-million-word (64-bit) memory. Computer speed timing runs were made for various vector lengths up to 6144. With this code, speeds a little over 100 Mflops have been achieved on a 2-pipe CYBER 205. Several problems encountered in the coding are discussed.
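
    The floating-point rate quoted above is simply operations completed per second. A minimal sketch of such a timing run (in Python rather than the original CYBER 205 vector code, with an invented scalar workload standing in for the Navier-Stokes kernels) is:

    ```python
    import time

    # Minimal Mflop/s timing sketch: count the floating-point operations
    # performed, time them, and divide. The workload is invented and scalar,
    # not the vectorized spectral kernels of the original code.
    n = 200_000
    data = [1.0] * n

    t0 = time.perf_counter()
    total = 0.0
    for x in data:
        total += x * 2.0          # 2 floating-point ops per element
    elapsed = time.perf_counter() - t0

    mflops = 2 * n / elapsed / 1e6
    print(f"{mflops:.1f} Mflop/s")
    ```

    On a vector machine the same measurement was repeated at many vector lengths, since pipeline startup costs make short vectors much slower than long ones.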

  16. The investigation of tethered satellite system dynamics

    NASA Technical Reports Server (NTRS)

    Lorenzini, E.

    1985-01-01

    The tether control law to retrieve the satellite was modified in order to have a smooth retrieval trajectory of the satellite that minimizes the thruster activation. The satellite thrusters were added to the rotational dynamics computer code and a preliminary control logic was implemented to simulate them during the retrieval maneuver. The high resolution computer code for modelling the three dimensional dynamics of untensioned tether, SLACK3, was made fully operative and a set of computer simulations of possible tether breakages was run. The distribution of the electric field around an electrodynamic tether in vacuo severed at some length from the shuttle was computed with a three dimensional electrodynamic computer code.

  17. Experimental and computational surface and flow-field results for an all-body hypersonic aircraft

    NASA Technical Reports Server (NTRS)

    Lockman, William K.; Lawrence, Scott L.; Cleary, Joseph W.

    1990-01-01

    The objective of the present investigation is to establish a benchmark experimental data base for a generic hypersonic vehicle shape for validation and/or calibration of advanced computational fluid dynamics computer codes. This paper includes results from the comprehensive test program conducted in the NASA/Ames 3.5-foot Hypersonic Wind Tunnel for a generic all-body hypersonic aircraft model. Experimental and computational results on flow visualization, surface pressures, surface convective heat transfer, and pitot-pressure flow-field surveys are presented. Comparisons of the experimental results with computational results from an upwind parabolized Navier-Stokes code developed at Ames demonstrate the capabilities of this code.

  18. WAZA-ARI: computational dosimetry system for X-ray CT examinations II: development of web-based system.

    PubMed

    Ban, Nobuhiko; Takahashi, Fumiaki; Ono, Koji; Hasegawa, Takayuki; Yoshitake, Takayasu; Katsunuma, Yasushi; Sato, Kaoru; Endo, Akira; Kai, Michiaki

    2011-07-01

    A web-based dose computation system, WAZA-ARI, is being developed for patients undergoing X-ray CT examinations. The system is implemented in Java on a Linux server running Apache Tomcat. Users choose scanning options and input parameters via a web browser over the Internet. Dose coefficients, which were calculated in a Japanese adult male phantom (JM phantom) are called upon user request and are summed over the scan range specified by the user to estimate a normalised dose. Tissue doses are finally computed based on the radiographic exposure (mA s) and the pitch factor. While dose coefficients are currently available only for limited CT scanner models, the system has achieved a high degree of flexibility and scalability without the use of commercial software.
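
    The dose arithmetic described in this record can be sketched as follows; the per-slice coefficients, organ, and scan range here are invented placeholders, not WAZA-ARI's actual data or interface:

    ```python
    # Hypothetical sketch of the summation described above: per-slice organ
    # dose coefficients are summed over the chosen scan range to form a
    # normalised dose, then scaled by tube current-time product and pitch.
    # The coefficient values below are invented for illustration.

    def tissue_dose(coeffs, start, stop, mAs, pitch):
        """Sum per-slice dose coefficients over [start, stop) and scale."""
        normalised = sum(coeffs[start:stop])   # dose per unit exposure
        return normalised * mAs / pitch        # e.g. mGy

    # Invented coefficients for 10 axial positions (mGy per mAs):
    liver_coeffs = [0.0, 0.0, 0.01, 0.04, 0.05, 0.05, 0.03, 0.01, 0.0, 0.0]
    dose = tissue_dose(liver_coeffs, 2, 8, mAs=200, pitch=1.0)
    ```

    Precomputing the coefficients in the phantom and leaving only this summation to the web tier is what keeps such a server responsive without commercial Monte Carlo software.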

  19. Evaluation of the water-equivalence of plastic materials in low- and high-energy clinical proton beams

    NASA Astrophysics Data System (ADS)

    Lourenço, A.; Shipley, D.; Wellock, N.; Thomas, R.; Bouchard, H.; Kacperek, A.; Fracchiolla, F.; Lorentini, S.; Schwarz, M.; MacDougall, N.; Royle, G.; Palmans, H.

    2017-05-01

    The aim of this work was to evaluate the water-equivalence of new trial plastics designed specifically for light-ion beam dosimetry, as well as of commercially available plastics, in clinical proton beams. The water-equivalence of materials was tested by computing a plastic-to-water conversion factor, H_pl,w. Trial materials were characterized experimentally in 60 MeV and 226 MeV un-modulated proton beams and the results were compared with Monte Carlo simulations using the FLUKA code. For the high-energy beam, a comparison between the trial plastics and various commercial plastics was also performed using the FLUKA and Geant4 Monte Carlo codes. Experimental information was obtained from laterally integrated depth-dose ionization chamber measurements in water, with and without plastic slabs of variable thickness in front of the water phantom. Fluence correction factors, k_fl, between water and various materials were also derived using the Monte Carlo method. For the 60 MeV proton beam, H_pl,w and k_fl factors were within 1% of unity for all trial plastics. For the 226 MeV proton beam, experimental H_pl,w values deviated from unity by a maximum of about 1% for the three trial plastics, and the experimental results showed no advantage regarding which of the plastics was the most water-equivalent. Different magnitudes of correction were found between Geant4 and FLUKA for the various materials, due mainly to the use of different nonelastic nuclear data. Nevertheless, for the 226 MeV proton beam, H_pl,w correction factors were within 2% of unity for all the materials. Considering the results from the two Monte Carlo codes, PMMA and trial plastic #3 had the smallest H_pl,w values, with maximum deviations from unity of 1%; however, the PMMA range differed by 16% from that of water. Overall, k_fl factors deviated more from unity than H_pl,w factors and could amount to a few percent for some materials.
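
    One common way to quantify the effect of a plastic slab on a measured depth-dose curve is the shift of the distal-80% range, R80, which gives the slab's water-equivalent thickness. The sketch below is hedged: the curves are synthetic Gaussian stand-ins, not the study's measurements, and R80 is only one of several possible range metrics.

    ```python
    import numpy as np

    # Hedged sketch: locate the depth at which dose falls to 80% of its
    # maximum on the distal (falling) edge, by linear interpolation, and
    # take the shift between the no-slab and with-slab curves as the
    # slab's water-equivalent thickness. Curves below are synthetic.

    def r80(depth, dose):
        """Depth of the distal 80%-of-maximum dose point."""
        dose = np.asarray(dose, float)
        peak = int(np.argmax(dose))
        target = 0.8 * dose[peak]
        distal_depth = depth[peak:]
        distal_dose = dose[peak:]
        # distal dose decreases monotonically -> interpolate on reversed arrays
        return float(np.interp(target, distal_dose[::-1], distal_depth[::-1]))

    depth = np.linspace(0.0, 40.0, 401)                 # mm, invented grid
    no_slab = np.exp(-((depth - 30.0) / 3.0) ** 2)      # synthetic peak
    with_slab = np.exp(-((depth - 25.0) / 3.0) ** 2)    # peak pulled back 5 mm
    wet = r80(depth, no_slab) - r80(depth, with_slab)   # ~5 mm shift
    ```

    Comparing such measured range shifts against Monte Carlo predictions is one check on how water-equivalent a candidate plastic really is.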
