Science.gov

Sample records for geant4 physics models

  1. Progress in Geant4 Electromagnetic Physics Modelling and Validation

    NASA Astrophysics Data System (ADS)

    Apostolakis, J.; Asai, M.; Bagulya, A.; Brown, J. M. C.; Burkhardt, H.; Chikuma, N.; Cortes-Giraldo, M. A.; Elles, S.; Grichine, V.; Guatelli, S.; Incerti, S.; Ivanchenko, V. N.; Jacquemier, J.; Kadri, O.; Maire, M.; Pandola, L.; Sawkey, D.; Toshito, T.; Urban, L.; Yamashita, T.

    2015-12-01

    In this work we report on recent improvements in the electromagnetic (EM) physics models of Geant4 and new validations of EM physics. Improvements have been made in models of the photoelectric effect, Compton scattering, gamma conversion to electron and muon pairs, fluctuations of energy loss, multiple scattering, synchrotron radiation, and high energy positron annihilation. The results of these developments are included in the new Geant4 version 10.1 and in patches to previous versions 9.6 and 10.0 that are planned to be used for production in Run 2 at the LHC. The Geant4 validation suite for EM physics has been extended and new validation results are shown in this work. In particular, the effect of gamma-nuclear interactions on EM shower shape at LHC energies is discussed.

  2. Preliminary Investigation of Microdosimetric Track Structure Physics Models in Geant4-DNA and RITRACKS

    PubMed Central

    Bezak, Eva

    2015-01-01

    The major differences between the physics models in Geant4-DNA and RITRACKS Monte Carlo packages are investigated. Proton and electron ionisation interactions and electron excitation interactions in water are investigated in the current work. While these packages use similar semiempirical physics models for inelastic cross-sections, the implementation of these models is demonstrated to be significantly different. This is demonstrated in a simple Monte Carlo simulation designed to identify differences in interaction cross-sections. PMID:26124856

  3. The Geant4 Physics Validation Repository

    SciTech Connect

    Wenzel, H.; Yarba, J.; Dotti, A.

    2015-12-23

    The Geant4 collaboration regularly performs validation and regression tests. The results are stored in a central repository and can be easily accessed via a web application. In this article we describe the Geant4 physics validation repository, which consists of a relational database storing experimental data and Geant4 test results, a Java API and a web application. The functionality of these components and the technology choices we made are also described.

  4. The Geant4 physics validation repository

    DOE PAGES Beta

    Wenzel, H.; Yarba, J.; Dotti, A.

    2015-01-01

    The Geant4 collaboration regularly performs validation and regression tests. The results are stored in a central repository and can be easily accessed via a web application. In this article we describe the Geant4 physics validation repository, which consists of a relational database storing experimental data and Geant4 test results, a Java API and a web application. Lastly, the functionality of these components and the technology choices we made are described.

  5. The Geant4 physics validation repository

    NASA Astrophysics Data System (ADS)

    Wenzel, H.; Yarba, J.; Dotti, A.

    2015-12-01

    The Geant4 collaboration regularly performs validation and regression tests. The results are stored in a central repository and can be easily accessed via a web application. In this article we describe the Geant4 physics validation repository, which consists of a relational database storing experimental data and Geant4 test results, a Java API and a web application. The functionality of these components and the technology choices we made are also described.

  6. Implementing NRF Physics in Geant4

    SciTech Connect

    Jordan, David V.; Warren, Glen A.

    2006-07-01

    The Geant4 radiation transport Monte Carlo code toolkit currently does not support nuclear resonance fluorescence (NRF). After a brief review of NRF physics, plans for implementing this physics process in Geant4, and validating the output of the code, are described. The plans will be executed as Task 3 of project 50799, "Nuclear Resonance Fluorescence Signatures (NuRFS)".

  7. Physical models implemented in the GEANT4-DNA extension of the GEANT4 toolkit for calculating initial radiation damage at the molecular level.

    PubMed

    Villagrasa, C; Francis, Z; Incerti, S

    2011-02-01

    The ROSIRIS project aims to study the radiobiology of integrated systems for medical treatment optimisation using ionising radiations and evaluate the associated risk. In the framework of this project, one research focus is the interpretation of the initial radio-induced damage in DNA created by ionising radiation (and detected by γ-H2AX foci analysis) from the track structure of the incident particles. In order to calculate the track structure of ionising particles at a nanometric level, the Geant4 Monte Carlo toolkit was used. Geant4 (Object Oriented Programming Architecture in C++) offers a common platform, available free to all users and relatively easy to use. Nevertheless, the current low-energy threshold for electromagnetic processes in GEANT4 is set to 1 keV (250 eV using the Livermore processes), which is an unsuitable value for nanometric applications. To lower this energy threshold, the necessary interaction processes and models were identified, and the corresponding available cross sections collected from the literature. They are mostly based on the plane-wave Born approximation (first Born approximation, or FBA) for inelastic interactions and on semi-empirical models for energies where the FBA fails (at low energies). In this paper, the extensions that have been introduced into the 9.3 release of the Geant4 toolkit are described, the so-called Geant4-DNA extension, including a set of processes and models adapted in this study and permitting the simulation of electron (8 eV-1 MeV), proton (100 eV-100 MeV) and alpha particle (1 keV-10 MeV) interactions in liquid water. PMID:21186212
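
    The processes described above are packaged as a Geant4 physics constructor, so a user application activates them through its physics list. The following is a minimal sketch under the assumption of a Geant4 release that ships G4EmDNAPhysics (9.4 or later; the exact model content depends on the version); DNAPhysicsList is a hypothetical class name chosen here for illustration.

      // Hypothetical physics list registering the Geant4-DNA constructor.
      #include "G4VModularPhysicsList.hh"
      #include "G4EmDNAPhysics.hh"
      #include "G4DecayPhysics.hh"
      #include "G4SystemOfUnits.hh"

      class DNAPhysicsList : public G4VModularPhysicsList
      {
       public:
        DNAPhysicsList()
        {
          RegisterPhysics(new G4EmDNAPhysics());  // e-, protons, H, alphas, ... in liquid water
          RegisterPhysics(new G4DecayPhysics());  // particle decay
          SetDefaultCutValue(1.0 * nanometer);    // nanometric production threshold
        }
      };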

  8. Validation of recent Geant4 physics models for application in carbon ion therapy

    NASA Astrophysics Data System (ADS)

    Lechner, A.; Ivanchenko, V. N.; Knobloch, J.

    2010-07-01

    Cancer treatment with energetic carbon ions has distinct advantages over proton or photon irradiation. In this paper we present a simulation model integrated into the Geant4 Monte Carlo toolkit (version 9.3) which enables the use of ICRU 73 stopping powers for ion transport calculations. For a few materials, revised ICRU 73 stopping power tables recently published by ICRU (P. Sigmund, A. Schinner, H. Paul, Errata and Addenda: ICRU Report 73 (Stopping of Ions Heavier than Helium), International Commission on Radiation Units and Measurements, 2009) were incorporated into Geant4, also covering media like water which are of importance in radiotherapeutical applications. We examine, with particular attention paid to the recent developments, the accuracy of current Geant4 models for simulating Bragg peak profiles of 12C ions incident on water and polyethylene targets. Simulated dose distributions are validated against experimental data available in the literature, where the focus is on beam energies relevant to ion therapy applications (90-400 MeV/u). A quantitative analysis is performed which addresses the precision of the Bragg peak position and proportional features of the dose distribution. It is shown that experimental peak positions can be reproduced within 0.2% of the particle range in the case of water, and within 0.9% in the case of polyethylene. The comparisons also demonstrate that the simulations accurately render the full width at half maximum (FWHM) of the measured Bragg peaks in water. For polyethylene slight deviations from experimental peak widths are partly attributed to systematic effects due to a simplified geometry model adopted in the simulation setup.
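
    As a reminder of why the stopping-power tables matter for the Bragg-peak position discussed above: in the continuous-slowing-down approximation the range, and hence the peak depth, follows directly from the stopping power. This is the standard textbook relation, not a formula taken from the paper.

      % CSDA range of an ion with initial kinetic energy E_0, with S(E) = -dE/dx the
      % (linear) stopping power of the medium; the Bragg-peak depth closely tracks R.
      R_{\mathrm{CSDA}}(E_0) \;=\; \int_{0}^{E_0} \frac{dE}{S(E)}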

  9. GEANT4: Applications in High Energy Physics

    SciTech Connect

    Mahmood, Tariq; Zafar, Abrar Ahmed; Hussain, Talib; Rashid, Haris

    2007-02-14

    GEANT4 is a detector simulation toolkit aimed mainly at studying experimental high energy physics. In this paper we will give an overview of this software with special reference to its applications in high energy physics experiments. A brief description of the process methods is given. The object-oriented nature of the simulation toolkit is highlighted.

  10. Geant4 electromagnetic physics updates for space radiation effects simulation

    NASA Astrophysics Data System (ADS)

    Ivantchenko, Anton; Nieminen, Petteri; Incerti, Sebastien; Santin, Giovanni; Ivantchenko, Vladimir; Grichine, Vladimir; Allison, John; Karamitros, Mathieu

    The Geant4 toolkit is used in many applications including space science studies. The new Geant4 version 10.0 released in December 2013 includes a major revision of the toolkit and offers multi-threaded mode for event level parallelism. At the same time, Geant4 electromagnetic and hadronic physics sub-libraries have been significantly updated. In order to validate the new and updated models Geant4 verification tests and benchmarks were extended. Part of these developments was sponsored by the European Space Agency in the context of research aimed at modelling radiation biological end effects. In this work, we present an overview of results of several benchmarks for electromagnetic physics models relevant to space science. For electromagnetic physics, recently Compton scattering, photoelectric effect, and Rayleigh scattering models have been improved and extended down to lower energies. Models of ionization and fluctuations have also been improved; special micro-dosimetry models for Silicon and liquid water were introduced; the main multiple scattering model was consolidated; and the atomic de-excitation module has been made available to all models. As a result, Geant4 predictions for space radiation effects obtained with different Physics Lists are in better agreement with the benchmark data than previous Geant4 versions. Here we present results of electromagnetic tests and models comparison in the energy interval 10 eV - 10 MeV.
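
    The atomic de-excitation module mentioned above is switched on from the user side through the EM configuration interface of recent Geant4 releases. The sketch below assumes Geant4 10.x, where G4EmParameters provides these setters (equivalent /process/em/ UI commands exist); it illustrates the mechanism and is not code from the paper.

      // Minimal sketch: enabling atomic de-excitation products for the EM models
      // (assumes a Geant4 10.x installation).
      #include "G4EmParameters.hh"

      void EnableAtomicDeexcitation()
      {
        G4EmParameters* em = G4EmParameters::Instance();
        em->SetFluo(true);   // fluorescence X-rays after ionisation / photoelectric effect
        em->SetAuger(true);  // Auger electron emission
        em->SetPixe(true);   // particle-induced X-ray emission
      }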

  11. Electro and gamma nuclear physics in Geant4

    SciTech Connect

    J.P. Wellisch; M. Kossov; P. Degtyarenko

    2003-03-01

    Adequate description of electro and gamma nuclear physics is of utmost importance in studies of electron beam-dumps and intense electron beam accelerators. It is also mandatory to describe neutron backgrounds and activation in linear colliders. This physics was elaborated in Geant4 over the last year, and has now entered the stage of practical application. In the Geant4 photo-nuclear database there are at present about 50 nuclei for which the photo-nuclear absorption cross sections have been measured. Of these, data on 14 nuclei are used to parametrize the gamma-nuclear reaction cross section. The resulting cross section is a complex, factorized function of A and e = log(Eγ), where Eγ is the energy of the incident photon. Electro-nuclear reactions are so closely connected with photo-nuclear reactions that they are sometimes also called "photo-nuclear". The one-photon exchange mechanism dominates in electro-nuclear reactions, and the electron can be substituted by a flux of photons. Folding this flux with the gamma-nuclear cross section, we arrive at an acceptable description of electro-nuclear physics. Final states in gamma and electro nuclear physics are described using chiral invariant phase-space decay at low gamma or equivalent photon energies, and the quark gluon string model at high energies. We present the modeling of this physics in Geant4 and show results from practical applications.

  12. Experimental quantification of Geant4 PhysicsList recommendations: methods and results

    NASA Astrophysics Data System (ADS)

    Basaglia, Tullio; Han, Min Cheol; Hoff, Gabriela; Kim, Chan Hyeong; Kim, Sung Hun; Grazia Pia, Maria; Saracco, Paolo

    2015-12-01

    The Geant4 physics_lists package encompasses predefined selections of physics processes and models to be used in simulation applications. Limited documentation is available in the literature about Geant4 pre-packaged PhysicsLists and their validation. The reports in the literature mainly concern specific use cases. This paper documents the epistemological grounds for the validation of Geant4 pre-packaged PhysicsLists (and their accessory classes, Builders and PhysicsConstructors) and some examples of the authors' scientific activity on this subject.
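
    For readers unfamiliar with what a pre-packaged PhysicsList is in practice: it is selected by name and handed to the run manager. A minimal sketch, assuming a standard Geant4 10.x installation ("FTFP_BERT" is one of the reference lists shipped with the toolkit); it illustrates the mechanism being validated and is not code from the paper.

      // Selecting a pre-packaged PhysicsList by name via the factory.
      #include "G4RunManager.hh"
      #include "G4PhysListFactory.hh"
      #include "G4VModularPhysicsList.hh"

      int main()
      {
        G4RunManager* runManager = new G4RunManager;

        G4PhysListFactory factory;
        G4VModularPhysicsList* physicsList = factory.GetReferencePhysList("FTFP_BERT");
        runManager->SetUserInitialization(physicsList);

        // ... geometry and user actions would be registered here before Initialize() ...
        delete runManager;
        return 0;
      }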

  13. Geant4 electromagnetic physics for the LHC and other HEP applications

    NASA Astrophysics Data System (ADS)

    Schälicke, Andreas; Bagulya, Alexander; Dale, Ørjan; Dupertuis, Frederic; Ivanchenko, Vladimir; Kadri, Omrane; Lechner, Anton; Maire, Michel; Tsagri, Mary; Urban, Laszlo

    2011-12-01

    An overview of the electromagnetic (EM) physics models available in the Geant4 toolkit is presented. Recent improvements are focused on the performance of detector simulation results from large MC production exercises at the LHC. Significant effort was spent on high-statistics validation of EM physics. The consolidation of Geant4 EM physics was achieved by providing common interfaces for EM standard (HEP oriented) and EM low-energy models (other application domains). This allows the combination of ultra-relativistic, relativistic and low-energy models for any Geant4 EM process. With such a combination both precision and CPU performance are achieved for the simulation of EM interactions in a wide energy range. Owing to this migration of EM low-energy models to the common interface, additional capabilities have become available. Selected validation results are presented in this contribution.
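
    The common EM interface described above is what lets a user swap the EM constructor of a reference physics list in a single call. A minimal sketch assuming a recent Geant4 release (the constructor and list names are those distributed with the toolkit; the specific choice of FTFP_BERT and option4 is illustrative).

      // Replacing the default EM constructor of a reference list with the
      // most precise "option4" set, which mixes standard and low-energy models.
      #include "G4PhysListFactory.hh"
      #include "G4VModularPhysicsList.hh"
      #include "G4EmStandardPhysics_option4.hh"

      G4VModularPhysicsList* BuildPhysicsList()
      {
        G4PhysListFactory factory;
        G4VModularPhysicsList* list = factory.GetReferencePhysList("FTFP_BERT");
        list->ReplacePhysics(new G4EmStandardPhysics_option4());
        return list;
      }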

  14. Comparison of GEANT4 very low energy cross section models with experimental data in water

    SciTech Connect

    Incerti, S.; Ivanchenko, A.; Karamitros, M.; Mantero, A.; Moretto, P.; Tran, H. N.; Mascialino, B.; Champion, C.; Ivanchenko, V. N.; Bernal, M. A.; Francis, Z.; Villagrasa, C.; Baldacchino, G.; Gueye, P.; Capra, R.; Nieminen, P.; Zacharatou, C.

    2010-09-15

    Purpose: The GEANT4 general-purpose Monte Carlo simulation toolkit is able to simulate physical interaction processes of electrons, hydrogen and helium atoms with charge states (H0, H+) and (He0, He+, He2+), respectively, in liquid water, the main component of biological systems, down to the electron volt regime and the submicrometer scale, providing GEANT4 users with the so-called "GEANT4-DNA" physics models suitable for microdosimetry simulation applications. The corresponding software has been recently re-engineered in order to provide GEANT4 users with a coherent and unique approach to the simulation of electromagnetic interactions within the GEANT4 toolkit framework (since GEANT4 version 9.3 beta). This work presents a quantitative comparison of these physics models with a collection of experimental data in water collected from the literature. Methods: An evaluation of the closeness between the total and differential cross section models available in the GEANT4 toolkit for microdosimetry and experimental reference data is performed using a dedicated statistical toolkit that includes the Kolmogorov-Smirnov statistical test. The authors used experimental data acquired in water vapor as direct measurements in the liquid phase are not yet available in the literature. Comparisons with several recommendations are also presented. Results: The authors have assessed the compatibility of experimental data with GEANT4 microdosimetry models by means of quantitative methods. The results show that microdosimetric measurements in liquid water are necessary to assess quantitatively the validity of the software implementation for the liquid water phase. Nevertheless, a comparison with existing experimental data in water vapor provides a qualitative appreciation of the plausibility of the simulation models. The existing reference data themselves should undergo a critical interpretation and selection, as some of the series exhibit significant

  15. Track structure modeling in liquid water: A review of the Geant4-DNA very low energy extension of the Geant4 Monte Carlo simulation toolkit.

    PubMed

    Bernal, M A; Bordage, M C; Brown, J M C; Davídková, M; Delage, E; El Bitar, Z; Enger, S A; Francis, Z; Guatelli, S; Ivanchenko, V N; Karamitros, M; Kyriakou, I; Maigne, L; Meylan, S; Murakami, K; Okada, S; Payno, H; Perrot, Y; Petrovic, I; Pham, Q T; Ristic-Fira, A; Sasaki, T; Štěpán, V; Tran, H N; Villagrasa, C; Incerti, S

    2015-12-01

    Understanding the fundamental mechanisms involved in the induction of biological damage by ionizing radiation remains a major challenge of today's radiobiology research. The Monte Carlo simulation of physical, physicochemical and chemical processes involved may provide a powerful tool for the simulation of early damage induction. The Geant4-DNA extension of the general purpose Monte Carlo Geant4 simulation toolkit aims to provide the scientific community with an open source access platform for the mechanistic simulation of such early damage. This paper presents the most recent review of the Geant4-DNA extension, as available to Geant4 users since June 2015 (release 10.2 Beta). In particular, the review includes the description of new physical models for the description of electron elastic and inelastic interactions in liquid water, as well as new examples dedicated to the simulation of physicochemical and chemical stages of water radiolysis. Several implementations of geometrical models of biological targets are presented as well, and the list of Geant4-DNA examples is described. PMID:26653251

  16. Physical models, cross sections, and numerical approximations used in MCNP and GEANT4 Monte Carlo codes for photon and electron absorbed fraction calculation

    SciTech Connect

    Yoriyaz, Helio; Moralles, Mauricio; Tarso Dalledone Siqueira, Paulo de; Costa Guimaraes, Carla da; Belonsi Cintra, Felipe; Santos, Adimir dos

    2009-11-15

    Purpose: Radiopharmaceutical applications in nuclear medicine require a detailed dosimetry estimate of the radiation energy delivered to the human tissues. Over the past years, several publications addressed the problem of internal dose estimates in volumes of several sizes considering photon and electron sources. Most of them used Monte Carlo radiation transport codes. Despite the widespread use of these codes due to the variety of resources and potentials they offer to carry out dose calculations, several aspects like physical models, cross sections, and numerical approximations used in the simulations still remain an object of study. Accurate dose estimates depend on the correct selection of a set of simulation options that should be carefully chosen. This article presents an analysis of several simulation options provided by two of the most used codes worldwide: MCNP and GEANT4. Methods: For this purpose, comparisons of absorbed fraction estimates obtained with different physical models, cross sections, and numerical approximations are presented for spheres of several sizes composed of five different biological tissues. Results: Considerable discrepancies have been found in some cases, not only between the different codes but also between different cross sections and algorithms in the same code. Maximum differences found between the two codes are 5.0% and 10%, respectively, for photons and electrons. Conclusion: Even for problems as simple as spheres and uniform radiation sources, the set of parameters chosen by any Monte Carlo code significantly affects the final results of a simulation, demonstrating the importance of the correct choice of parameters in the simulation.
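
    For reference, the quantity being compared between the two codes is the standard MIRD absorbed fraction; the definitions below are the textbook ones (r_S and r_T denote source and target regions, m the target mass), given only to make the comparison concrete.

      % Absorbed fraction and specific absorbed fraction (standard MIRD definitions):
      \phi(r_T \leftarrow r_S) \;=\; \frac{E_{\text{absorbed in } r_T}}{E_{\text{emitted in } r_S}},
      \qquad
      \Phi(r_T \leftarrow r_S) \;=\; \frac{\phi(r_T \leftarrow r_S)}{m_{r_T}}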

  17. Diffusion-controlled reactions modeling in Geant4-DNA

    NASA Astrophysics Data System (ADS)

    Karamitros, M.; Luan, S.; Bernal, M. A.; Allison, J.; Baldacchino, G.; Davidkova, M.; Francis, Z.; Friedland, W.; Ivantchenko, V.; Ivantchenko, A.; Mantero, A.; Nieminem, P.; Santin, G.; Tran, H. N.; Stepan, V.; Incerti, S.

    2014-10-01

    Context: Under irradiation, a biological system undergoes a cascade of chemical reactions that can lead to an alteration of its normal operation. There are different types of radiation and many competing reactions. As a result the kinetics of chemical species is extremely complex. Simulation then becomes a powerful tool which, by describing the basic principles of chemical reactions, can reveal the dynamics of the macroscopic system. To understand the dynamics of biological systems under radiation, since the 80s there have been on-going efforts carried out by several research groups to establish a mechanistic model that consists of describing all the physical, chemical and biological phenomena following the irradiation of single cells. This approach is generally divided into a succession of stages that follow each other in time: (1) the physical stage, where the ionizing particles interact directly with the biological material; (2) the physico-chemical stage, where the targeted molecules release their energy by dissociating, creating new chemical species; (3) the chemical stage, where the new chemical species interact with each other or with the biomolecules; (4) the biological stage, where the repairing mechanisms of the cell come into play. This article focuses on the modeling of the chemical stage. Method: This article presents a general method of speeding-up chemical reaction simulations in fluids based on the Smoluchowski equation and Monte-Carlo methods, where all molecules are explicitly simulated and the solvent is treated as a continuum. The model describes diffusion-controlled reactions. This method has been implemented in Geant4-DNA. The keys to the new algorithm include: (1) the combination of a method to compute time steps dynamically with a Brownian bridge process to account for chemical reactions, which avoids costly fixed time step simulations; (2) a k-d tree data structure for quickly locating, for a given molecule, its closest reactants. The
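
    The diffusion-controlled picture and the Brownian-bridge step test referred to above are conventionally summarized by the relations below (D is the relative diffusion coefficient of the reacting pair, R the reaction radius, d_1 and d_2 the pair separations at the beginning and end of a time step Δt). These are the standard formulas for this class of algorithms, sketched here for orientation rather than quoted from the paper.

      % Smoluchowski rate constant for a diffusion-controlled reaction:
      k_{\mathrm{obs}} \;=\; 4 \pi D R
      % Brownian-bridge test: probability that a pair which did not overlap at the
      % step endpoints (d_1, d_2 > R) nevertheless reacted during the step \Delta t:
      P_{\mathrm{react}} \;=\; \exp\!\left[ - \frac{(d_1 - R)(d_2 - R)}{D\,\Delta t} \right]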

  18. Diffusion-controlled reactions modeling in Geant4-DNA

    SciTech Connect

    Karamitros, M.; Luan, S.; Bernal, M.A.; Allison, J.; Baldacchino, G.; Davidkova, M.; Francis, Z.; Friedland, W.; Ivantchenko, V.; Ivantchenko, A.; Mantero, A.; Nieminem, P.; Santin, G.; Tran, H.N.; Stepan, V.; Incerti, S.

    2014-10-01

    Context: Under irradiation, a biological system undergoes a cascade of chemical reactions that can lead to an alteration of its normal operation. There are different types of radiation and many competing reactions. As a result the kinetics of chemical species is extremely complex. Simulation then becomes a powerful tool which, by describing the basic principles of chemical reactions, can reveal the dynamics of the macroscopic system. To understand the dynamics of biological systems under radiation, since the 80s there have been on-going efforts carried out by several research groups to establish a mechanistic model that consists of describing all the physical, chemical and biological phenomena following the irradiation of single cells. This approach is generally divided into a succession of stages that follow each other in time: (1) the physical stage, where the ionizing particles interact directly with the biological material; (2) the physico-chemical stage, where the targeted molecules release their energy by dissociating, creating new chemical species; (3) the chemical stage, where the new chemical species interact with each other or with the biomolecules; (4) the biological stage, where the repairing mechanisms of the cell come into play. This article focuses on the modeling of the chemical stage. Method: This article presents a general method of speeding-up chemical reaction simulations in fluids based on the Smoluchowski equation and Monte-Carlo methods, where all molecules are explicitly simulated and the solvent is treated as a continuum. The model describes diffusion-controlled reactions. This method has been implemented in Geant4-DNA. The keys to the new algorithm include: (1) the combination of a method to compute time steps dynamically with a Brownian bridge process to account for chemical reactions, which avoids costly fixed time step simulations; (2) a k-d tree data structure for quickly locating, for a given molecule, its closest reactants. The

  19. Modeling the relativistic runaway electron avalanche and the feedback mechanism with GEANT4

    PubMed Central

    Skeltved, Alexander Broberg; Østgaard, Nikolai; Carlson, Brant; Gjesteland, Thomas; Celestin, Sebastien

    2014-01-01

    This paper presents the first study that uses the GEometry ANd Tracking 4 (GEANT4) toolkit to do quantitative comparisons with other modeling results related to the production of terrestrial gamma ray flashes and high-energy particle emission from thunderstorms. We will study the relativistic runaway electron avalanche (RREA) and the relativistic feedback process, as well as the production of bremsstrahlung photons from runaway electrons. The Monte Carlo simulations take into account the effects of electron ionization, electron by electron (Møller), and electron by positron (Bhabha) scattering as well as the bremsstrahlung process and pair production, in the 250 eV to 100 GeV energy range. Our results indicate that the multiplication of electrons during the development of RREAs and under the influence of feedback are consistent with previous estimates. This is important to validate GEANT4 as a tool to model RREAs and feedback in homogeneous electric fields. We also determine the ratio of bremsstrahlung photons to energetic electrons Nγ/Ne. We then show that the ratio has a dependence on the electric field, which can be expressed by the avalanche time τ(E) and the bremsstrahlung coefficient α(ε). In addition, we present comparisons of GEANT4 simulations performed with a “standard” and a “low-energy” physics list both validated in the 1 keV to 100 GeV energy range. This comparison shows that the choice of physics list used in GEANT4 simulations has a significant effect on the results. Key Points: Testing the feedback mechanism with GEANT4; Validating the GEANT4 programming toolkit; Studying the ratio of bremsstrahlung photons to electrons at TGF source altitude. PMID:26167437
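
    As background for the multiplication statements above: RREA growth is conventionally parameterized as an exponential in time or distance, with a field-dependent avalanche time τ(E) or length λ(E). The generic form is given below; the paper determines such field-dependent quantities, and the expression here is the standard parameterization rather than the authors' fitted result.

      % Exponential parameterization of relativistic runaway electron avalanche growth:
      N_e(t) \;=\; N_0\, e^{\,t/\tau(E)}, \qquad N_e(z) \;=\; N_0\, e^{\,z/\lambda(E)}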

  20. Modeling the relativistic runaway electron avalanche and the feedback mechanism with GEANT4

    NASA Astrophysics Data System (ADS)

    Skeltved, Alexander Broberg; Østgaard, Nikolai; Carlson, Brant; Gjesteland, Thomas; Celestin, Sebastien

    2014-11-01

    This paper presents the first study that uses the GEometry ANd Tracking 4 (GEANT4) toolkit to do quantitative comparisons with other modeling results related to the production of terrestrial gamma ray flashes and high-energy particle emission from thunderstorms. We will study the relativistic runaway electron avalanche (RREA) and the relativistic feedback process, as well as the production of bremsstrahlung photons from runaway electrons. The Monte Carlo simulations take into account the effects of electron ionization, electron by electron (Møller), and electron by positron (Bhabha) scattering as well as the bremsstrahlung process and pair production, in the 250 eV to 100 GeV energy range. Our results indicate that the multiplication of electrons during the development of RREAs and under the influence of feedback are consistent with previous estimates. This is important to validate GEANT4 as a tool to model RREAs and feedback in homogeneous electric fields. We also determine the ratio of bremsstrahlung photons to energetic electrons Nγ/Ne. We then show that the ratio has a dependence on the electric field, which can be expressed by the avalanche time τ(E) and the bremsstrahlung coefficient α(ɛ). In addition, we present comparisons of GEANT4 simulations performed with a "standard" and a "low-energy" physics list both validated in the 1 keV to 100 GeV energy range. This comparison shows that the choice of physics list used in GEANT4 simulations has a significant effect on the results.

  1. A modular Geant4 model of Leksell Gamma Knife Perfexion™

    NASA Astrophysics Data System (ADS)

    Pipek, J.; Novotný, J.; Novotný, J., Jr.; Kozubíková, P.

    2014-12-01

    This work presents a Monte Carlo model of Leksell Gamma Knife Perfexion as well as the main parameters of the dose distribution in the standard phantom obtained using this model. The model is developed in the Geant4 simulation toolkit in a modular way which enables its reuse in other Perfexion studies. Large phase space files were created, containing particles that are entering the inner machine cavity after being transported through the collimation system. All 14 output factors of the machine and effective output factors for both the 4 mm (0.830 ± 0.009) and 8 mm (0.921 ± 0.004) collimators were calculated. Dose profiles along the main axes are also included for each collimator size. All results are compared to the values obtained from the treatment planning system, from experiments, and from other Monte Carlo models.

  2. A modular Geant4 model of Leksell Gamma Knife Perfexion™.

    PubMed

    Pipek, J; Novotný, J; Novotný, J; Kozubíková, P

    2014-12-21

    This work presents a Monte Carlo model of Leksell Gamma Knife Perfexion as well as the main parameters of the dose distribution in the standard phantom obtained using this model. The model is developed in the Geant4 simulation toolkit in a modular way which enables its reuse in other Perfexion studies. Large phase space files were created, containing particles that are entering the inner machine cavity after being transported through the collimation system. All 14 output factors of the machine and effective output factors for both the 4 mm (0.830 ± 0.009) and 8 mm (0.921 ± 0.004) collimators were calculated. Dose profiles along the main axes are also included for each collimator size. All results are compared to the values obtained from the treatment planning system, from experiments, and from other Monte Carlo models. PMID:25415510

  3. Introduction to the Geant4 Simulation toolkit

    SciTech Connect

    Guatelli, S.; Cutajar, D.; Rosenfeld, A. B.; Oborn, B.

    2011-05-05

    Geant4 is a Monte Carlo simulation toolkit describing the interactions of particles with matter. Geant4 is widely used in radiation physics research, from high energy physics to medical physics and space science, thanks to its sophisticated physics component, coupled with advanced functionality in geometry description. Geant4 is widely used at the Centre for Medical Radiation Physics (CMRP), at the University of Wollongong, to characterise and optimise novel detector concepts, radiotherapy treatments, and imaging solutions. This lecture consists of an introduction to the Monte Carlo method and to Geant4. Particular attention is devoted to the Geant4 physics component and to the physics models describing electromagnetic and hadronic physics interactions. The second part of the lecture focuses on the methodology to adopt when developing a Geant4 simulation application.
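
    The development methodology mentioned at the end of the abstract boils down to registering a small set of mandatory user classes with the run manager. A minimal sketch assuming Geant4 10.x; MyDetectorConstruction and MyActionInitialization are hypothetical user-class names standing in for the geometry and primary-generator code every application must supply.

      // Skeleton of a Geant4 application: geometry, physics list, user actions.
      #include "G4RunManager.hh"
      #include "FTFP_BERT.hh"
      #include "MyDetectorConstruction.hh"    // hypothetical user geometry class
      #include "MyActionInitialization.hh"    // hypothetical user actions (primary generator, scoring)

      int main()
      {
        G4RunManager* runManager = new G4RunManager;

        runManager->SetUserInitialization(new MyDetectorConstruction); // materials and volumes
        runManager->SetUserInitialization(new FTFP_BERT);              // reference physics list
        runManager->SetUserInitialization(new MyActionInitialization); // primary particles, user actions

        runManager->Initialize();
        runManager->BeamOn(1000);  // simulate 1000 events

        delete runManager;
        return 0;
      }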

  4. Introduction to the Geant4 Simulation toolkit

    NASA Astrophysics Data System (ADS)

    Guatelli, S.; Cutajar, D.; Oborn, B.; Rosenfeld, A. B.

    2011-05-01

    Geant4 is a Monte Carlo simulation toolkit describing the interactions of particles with matter. Geant4 is widely used in radiation physics research, from high energy physics to medical physics and space science, thanks to its sophisticated physics component, coupled with advanced functionality in geometry description. Geant4 is widely used at the Centre for Medical Radiation Physics (CMRP), at the University of Wollongong, to characterise and optimise novel detector concepts, radiotherapy treatments, and imaging solutions. This lecture consists of an introduction to the Monte Carlo method and to Geant4. Particular attention is devoted to the Geant4 physics component and to the physics models describing electromagnetic and hadronic physics interactions. The second part of the lecture focuses on the methodology to adopt when developing a Geant4 simulation application.

  5. The impact of new Geant4-DNA cross section models on electron track structure simulations in liquid water

    NASA Astrophysics Data System (ADS)

    Kyriakou, I.; Šefl, M.; Nourry, V.; Incerti, S.

    2016-05-01

    The most recent release of the open source and general purpose Geant4 Monte Carlo simulation toolkit (Geant4 10.2 release) contains a new set of physics models in the Geant4-DNA extension for improving the modelling of low-energy electron transport in liquid water (<10 keV). This includes updated electron cross sections for excitation, ionization, and elastic scattering. In the present work, the impact of these developments to track-structure calculations is examined for providing the first comprehensive comparison against the default physics models of Geant4-DNA. Significant differences with the default models are found for the average path length and penetration distance, as well as for dose-point-kernels for electron energies below a few hundred eV. On the other hand, self-irradiation absorbed fractions for tissue-like volumes and low-energy electron sources (including some Auger emitters) reveal rather small differences (up to 15%) between these new and default Geant4-DNA models. The above findings indicate that the impact of the new developments will mainly affect those applications where the spatial pattern of interactions and energy deposition of very-low energy electrons play an important role such as, for example, the modelling of the chemical and biophysical stage of radiation damage to cells.

  6. Coupling of Geant4-DNA physics models into the GATE Monte Carlo platform: Evaluation of radiation-induced damage for clinical and preclinical radiation therapy beams

    NASA Astrophysics Data System (ADS)

    Pham, Q. T.; Anne, A.; Bony, M.; Delage, E.; Donnarieix, D.; Dufaure, A.; Gautier, M.; Lee, S. B.; Micheau, P.; Montarou, G.; Perrot, Y.; Shin, J. I.; Incerti, S.; Maigne, L.

    2015-06-01

    The GATE Monte Carlo simulation platform based on the Geant4 toolkit is in constant improvement for dosimetric calculations. In this paper, we present the integration of Geant4-DNA processes into the GATE 7.0 platform with the objective of performing multi-scale simulations (from the macroscopic to the nanometer scale). We simulated three types of clinical and preclinical beams: a 6 MeV clinical electron beam, an X-ray irradiator beam and a clinical proton beam, for which we validated depth dose distributions against measurements in water. Frequencies of energy depositions and DNA damage were evaluated using a specific algorithm in charge of allocating energy depositions to the atoms constituting DNA molecules represented by their PDB (Protein Data Bank) description.

  7. Modeling proton and alpha elastic scattering in liquid water in Geant4-DNA

    NASA Astrophysics Data System (ADS)

    Tran, H. N.; El Bitar, Z.; Champion, C.; Karamitros, M.; Bernal, M. A.; Francis, Z.; Ivantchenko, V.; Lee, S. B.; Shin, J. I.; Incerti, S.

    2015-01-01

    Elastic scattering of protons and alpha (α) particles by water molecules cannot be neglected at low incident energies. However, this physical process is currently not available in the "Geant4-DNA" extension of the Geant4 Monte Carlo simulation toolkit. In this work, we report on theoretical differential and integral cross sections of the elastic scattering process for 100 eV-1 MeV incident protons and for 100 eV-10 MeV incident α particles in liquid water. The calculations are performed within the classical framework described by Everhart et al., Ziegler et al. and by the ICRU 49 Report. We then propose an implementation of the corresponding classes into the Geant4-DNA toolkit for modeling the elastic scattering of protons and α particles. Stopping powers as well as ranges are also reported. It clearly appears that accounting for the elastic scattering process in the slowing-down of the charged particles improves the agreement with existing data, in particular with the ICRU recommendations.
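
    For orientation, the classical treatments cited above reduce in the unscreened limit to the Rutherford differential cross section (projectile charge z, kinetic energy E, target nuclear charge Z, Gaussian units); screening corrections, which are the substance of the Everhart and ICRU 49 prescriptions, modify the small-angle behaviour. The formula is quoted here only to indicate the quantity being tabulated, not as part of the paper's results.

      % Unscreened Rutherford differential cross section for elastic Coulomb scattering:
      \frac{d\sigma}{d\Omega} \;=\; \left( \frac{z Z e^{2}}{4 E} \right)^{2} \frac{1}{\sin^{4}(\theta/2)}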

  8. Comparison of electromagnetic and hadronic models generated using Geant 4 with antiproton dose measured in CERN

    PubMed Central

    Tavakoli, Mohammad Bagher; Reiazi, Reza; Mohammadi, Mohammad Mehdi; Jabbari, Keyvan

    2015-01-01

    After the idea of antiproton cancer treatment was proposed in 1984, many experiments were launched to investigate different aspects of the physical and radiobiological properties of antiprotons, which arise from their annihilation reactions. One of these experiments has been carried out at the European Organization for Nuclear Research (CERN) using the Antiproton Decelerator. The ultimate goal of this experiment was to assess the dosimetric and radiobiological properties of antiproton beams in order to estimate the suitability of antiprotons for radiotherapy. One difficulty along the way was the unavailability of the antiproton beam at CERN for a long time, so the verification of Monte Carlo codes able to simulate the antiproton depth dose could be useful. Among the available simulation codes, Geant4 provides acceptable flexibility and extensibility, which have progressively led to the development of novel Geant4 applications in research domains, especially modeling the biological effects of ionizing radiation at the sub-cellular scale. In this study, the depth dose corresponding to the CERN antiproton beam energy was calculated with Geant4 using all the standard physics lists currently available and benchmarked for other use cases. Overall, none of the standard physics lists was able to reproduce the antiproton percentage depth dose. Although with some models our results were promising, the Bragg peak level remained the point of concern for our study. It is concluded that the Bertini model with high precision neutron tracking (QGSP_BERT_HP) best matches the experimental data, though it is also the slowest model for simulating events among the physics lists. PMID:26170558

  9. Comparison of electromagnetic and hadronic models generated using Geant 4 with antiproton dose measured in CERN.

    PubMed

    Tavakoli, Mohammad Bagher; Reiazi, Reza; Mohammadi, Mohammad Mehdi; Jabbari, Keyvan

    2015-01-01

    After the idea of antiproton cancer treatment was proposed in 1984, many experiments were launched to investigate different aspects of the physical and radiobiological properties of antiprotons, which arise from their annihilation reactions. One of these experiments has been carried out at the European Organization for Nuclear Research (CERN) using the Antiproton Decelerator. The ultimate goal of this experiment was to assess the dosimetric and radiobiological properties of antiproton beams in order to estimate the suitability of antiprotons for radiotherapy. One difficulty along the way was the unavailability of the antiproton beam at CERN for a long time, so the verification of Monte Carlo codes able to simulate the antiproton depth dose could be useful. Among the available simulation codes, Geant4 provides acceptable flexibility and extensibility, which have progressively led to the development of novel Geant4 applications in research domains, especially modeling the biological effects of ionizing radiation at the sub-cellular scale. In this study, the depth dose corresponding to the CERN antiproton beam energy was calculated with Geant4 using all the standard physics lists currently available and benchmarked for other use cases. Overall, none of the standard physics lists was able to reproduce the antiproton percentage depth dose. Although with some models our results were promising, the Bragg peak level remained the point of concern for our study. It is concluded that the Bertini model with high precision neutron tracking (QGSP_BERT_HP) best matches the experimental data, though it is also the slowest model for simulating events among the physics lists. PMID:26170558

  10. Applications of the Monte Carlo method in nuclear physics using the GEANT4 toolkit

    SciTech Connect

    Moralles, Mauricio; Guimaraes, Carla C.; Menezes, Mario O.; Bonifacio, Daniel A. B.; Okuno, Emico; Guimaraes, Valdir; Murata, Helio M.; Bottaro, Marcio

    2009-06-03

    The capabilities of personal computers allow the application of Monte Carlo methods to simulate very complex problems that involve the transport of particles through matter. Among the several codes commonly employed in nuclear physics problems, GEANT4 has received great attention in recent years, mainly due to its flexibility and the possibility of being improved by its users. Differently from other Monte Carlo codes, GEANT4 is a toolkit written in an object-oriented language (C++) that includes the mathematical engine of several physical processes, which are suitable for the transport of practically all types of particles and heavy ions. GEANT4 also has several tools to define materials, geometry, sources of radiation, beams of particles, electromagnetic fields, and graphical visualization of the experimental setup. After a brief description of the GEANT4 toolkit, this presentation reports investigations carried out by our group that involve simulations in the areas of dosimetry, nuclear instrumentation and medical physics. The physical processes available for photons, electrons, positrons and heavy ions were used in these simulations.

  11. Applications of the Monte Carlo method in nuclear physics using the GEANT4 toolkit

    NASA Astrophysics Data System (ADS)

    Moralles, Maurício; Guimarães, Carla C.; Bonifácio, Daniel A. B.; Okuno, Emico; Murata, Hélio M.; Bottaro, Márcio; Menezes, Mário O.; Guimarães, Valdir

    2009-06-01

    The capabilities of personal computers allow the application of Monte Carlo methods to simulate very complex problems that involve the transport of particles through matter. Among the several codes commonly employed in nuclear physics problems, GEANT4 has received great attention in recent years, mainly due to its flexibility and the possibility of being improved by its users. Differently from other Monte Carlo codes, GEANT4 is a toolkit written in an object-oriented language (C++) that includes the mathematical engine of several physical processes, which are suitable for the transport of practically all types of particles and heavy ions. GEANT4 also has several tools to define materials, geometry, sources of radiation, beams of particles, electromagnetic fields, and graphical visualization of the experimental setup. After a brief description of the GEANT4 toolkit, this presentation reports investigations carried out by our group that involve simulations in the areas of dosimetry, nuclear instrumentation and medical physics. The physical processes available for photons, electrons, positrons and heavy ions were used in these simulations.

  12. Evaluation of proton inelastic reaction models in Geant4 for prompt gamma production during proton radiotherapy

    NASA Astrophysics Data System (ADS)

    Jeyasugiththan, Jeyasingam; Peterson, Stephen W.

    2015-10-01

    During proton beam radiotherapy, discrete secondary prompt gamma rays are induced by inelastic nuclear reactions between protons and nuclei in the human body. In recent years, the Geant4 Monte Carlo toolkit has played an important role in the development of a device for real time dose range verification purposes using prompt gamma radiation. Unfortunately the default physics models in Geant4 do not reliably replicate the measured prompt gamma emission. Determining a suitable physics model for low energy proton inelastic interactions will boost the accuracy of prompt gamma simulations. Among the built-in physics models, we found that the precompound model with a modified initial exciton state of 2 (1 particle, 1 hole) produced more accurate discrete gamma lines from the most important elements found within the body such as 16O, 12C and 14N when comparing them with the available gamma production cross section data. Using the modified physics model, we investigated the prompt gamma spectra produced in a water phantom by a 200 MeV pencil beam of protons. The spectra were attained using a LaBr3 detector with a time-of-flight (TOF) window and BGO active shield to reduce the secondary neutron and gamma background. The simulations show that a 2 ns TOF window could reduce 99% of the secondary neutron flux hitting the detector. The results show that using both timing and active shielding can remove up to 85% of the background radiation which includes a 33% reduction by BGO subtraction.

  13. Evaluation of proton inelastic reaction models in Geant4 for prompt gamma production during proton radiotherapy.

    PubMed

    Jeyasugiththan, Jeyasingam; Peterson, Stephen W

    2015-10-01

    During proton beam radiotherapy, discrete secondary prompt gamma rays are induced by inelastic nuclear reactions between protons and nuclei in the human body. In recent years, the Geant4 Monte Carlo toolkit has played an important role in the development of a device for real time dose range verification purposes using prompt gamma radiation. Unfortunately the default physics models in Geant4 do not reliably replicate the measured prompt gamma emission. Determining a suitable physics model for low energy proton inelastic interactions will boost the accuracy of prompt gamma simulations. Among the built-in physics models, we found that the precompound model with a modified initial exciton state of 2 (1 particle, 1 hole) produced more accurate discrete gamma lines from the most important elements found within the body such as 16O, 12C and 14N when comparing them with the available gamma production cross section data. Using the modified physics model, we investigated the prompt gamma spectra produced in a water phantom by a 200 MeV pencil beam of protons. The spectra were attained using a LaBr3 detector with a time-of-flight (TOF) window and BGO active shield to reduce the secondary neutron and gamma background. The simulations show that a 2 ns TOF window could reduce 99% of the secondary neutron flux hitting the detector. The results show that using both timing and active shielding can remove up to 85% of the background radiation which includes a 33% reduction by BGO subtraction. PMID:26389549

  14. Evaluation on Geant4 Hadronic Models for Pion Minus, Pion Plus and Neutron Particles as Major Antiproton Annihilation Products

    PubMed Central

    Tavakoli, Mohammad Bagher; Mohammadi, Mohammad Mehdi; Reiazi, Reza; Jabbari, Keyvan

    2015-01-01

    Geant4 is an open source simulation toolkit based on C++, whose advantages have progressively led to applications in research domains, especially modeling the biological effects of ionizing radiation at the sub-cellular scale. However, it has been shown that Geant4 does not give reasonable results in the prediction of antiproton dose, especially in the Bragg peak. One of the reasons could be the lack of a reliable physics model to predict the final states of annihilation products such as pions. Considering the fact that most of the antiproton deposited dose results from high-LET nuclear fragments following pion interactions with surrounding nucleons, we reproduced depth dose curves for the most probable energy range of pions and for neutrons using Geant4. We consider this work one of the steps towards understanding the origin of the error and finally verifying Geant4 for antiproton tracking. Geant4 toolkit version 9.4.6.p01 and Fluka version 2006.3 were used to reproduce the depth dose curves of 220 MeV pions (both negative and positive) and 70 MeV neutrons. The geometry applied in the simulations consists of a 20 × 20 × 20 cm3 water tank, similar to that used at CERN for antiproton relative dose measurements. Different physics lists, including Quark-Gluon String Precompound (QGSP)_Binary Cascade (BIC)_HP, the recommended setting for hadron therapy, were used. In the case of pions, Geant4 resulted in at least a 5% dose discrepancy between different physics lists at depths close to the entrance point. Discrepancies of up to 15% were found in some cases, such as QBBC compared to QGSP_BIC_HP. A significant difference was observed in the dose profiles of different Geant4 physics lists at small depths for a beam of pions. In the case of neutrons, large dose discrepancies were observed when the LHEP or LHEP_EMV lists were applied. The magnitude of this dose discrepancy could be even 50% greater than the dose calculated by LHEP (or LHEP_EMV) at larger depths. We found that the effect of different Geant4 physics lists in

  15. Evaluation on Geant4 Hadronic Models for Pion Minus, Pion Plus and Neutron Particles as Major Antiproton Annihilation Products.

    PubMed

    Tavakoli, Mohammad Bagher; Mohammadi, Mohammad Mehdi; Reiazi, Reza; Jabbari, Keyvan

    2015-01-01

    Geant4 is an open source simulation toolkit based on C++, whose advantages have progressively led to applications in research domains, especially modeling the biological effects of ionizing radiation at the sub-cellular scale. However, it has been shown that Geant4 does not give reasonable results in the prediction of antiproton dose, especially in the Bragg peak. One of the reasons could be the lack of a reliable physics model to predict the final states of annihilation products such as pions. Considering the fact that most of the antiproton deposited dose results from high-LET nuclear fragments following pion interactions with surrounding nucleons, we reproduced depth dose curves for the most probable energy range of pions and for neutrons using Geant4. We consider this work one of the steps towards understanding the origin of the error and finally verifying Geant4 for antiproton tracking. Geant4 toolkit version 9.4.6.p01 and Fluka version 2006.3 were used to reproduce the depth dose curves of 220 MeV pions (both negative and positive) and 70 MeV neutrons. The geometry applied in the simulations consists of a 20 × 20 × 20 cm3 water tank, similar to that used at CERN for antiproton relative dose measurements. Different physics lists, including Quark-Gluon String Precompound (QGSP)_Binary Cascade (BIC)_HP, the recommended setting for hadron therapy, were used. In the case of pions, Geant4 resulted in at least a 5% dose discrepancy between different physics lists at depths close to the entrance point. Discrepancies of up to 15% were found in some cases, such as QBBC compared to QGSP_BIC_HP. A significant difference was observed in the dose profiles of different Geant4 physics lists at small depths for a beam of pions. In the case of neutrons, large dose discrepancies were observed when the LHEP or LHEP_EMV lists were applied. The magnitude of this dose discrepancy could be even 50% greater than the dose calculated by LHEP (or LHEP_EMV) at larger depths. We found that the effect of different Geant4 physics lists in

  16. Accuracy of the photon and electron physics in GEANT4 for radiotherapy applications

    SciTech Connect

    Poon, Emily; Verhaegen, Frank

    2005-06-15

    This work involves a validation of the photon and electron transport of the GEANT4 particle simulation toolkit for radiotherapy physics applications. We examine the cross sections and sampling algorithms of the three electromagnetic physics models in version 4.6.1 of the toolkit: Standard, Low-energy, and Penelope. The depth dose distributions in water for incident monoenergetic and clinical beams are compared to the EGSNRC results. In photon beam simulations, all three models agree with EGSNRC to within 2%, except for the buildup region. Larger deviations are found for incident electron beams, and the differences are affected by user-imposed electron step limitations. Particle distributions through thin layers of clinical target materials, and perturbation effects near high-Z and low-Z interfaces are also investigated. The electron step size artifacts observed in our studies indicate potential problems with the condensed history algorithm. A careful selection of physics processes and transport parameters is needed for optimum efficiency and accuracy.
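
    In current Geant4 releases, the three alternatives examined in this paper correspond to separate EM physics constructors, and user step limitation is added as its own constructor. The sketch below uses the modern class names as an assumption; in version 4.6.1, which the paper studied, the low-energy and Penelope models were configured through a different class layout.

      // Choosing among Standard, Livermore (low-energy) and Penelope EM constructors,
      // plus an optional user step limiter (modern Geant4 class names).
      #include "G4VModularPhysicsList.hh"
      #include "G4EmStandardPhysics.hh"
      #include "G4EmLivermorePhysics.hh"
      #include "G4EmPenelopePhysics.hh"
      #include "G4StepLimiterPhysics.hh"

      void AddEmOption(G4VModularPhysicsList* list, int option)
      {
        if      (option == 0) list->RegisterPhysics(new G4EmStandardPhysics());
        else if (option == 1) list->RegisterPhysics(new G4EmLivermorePhysics());
        else                  list->RegisterPhysics(new G4EmPenelopePhysics());
        list->RegisterPhysics(new G4StepLimiterPhysics()); // honour user-defined max step lengths
      }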

  17. The GEANT4 Visualisation System

    SciTech Connect

    Allison, J.; Asai, M.; Barrand, G.; Donszelmann, M.; Minamimoto, K.; Tanaka, S.; Tcherniaev, E.; Tinslay, J.

    2007-11-02

    The Geant4 Visualization System is a multi-driver graphics system designed to serve the Geant4 Simulation Toolkit. It is aimed at the visualization of Geant4 data, primarily detector descriptions and simulated particle trajectories and hits. It can handle a variety of graphical technologies simultaneously and interchangeably, allowing the user to choose the visual representation most appropriate to requirements. It conforms to the low-level Geant4 abstract graphical user interfaces and introduces new abstract classes from which the various drivers are derived and that can be straightforwardly extended, for example, by the addition of a new driver. It makes use of an extendable class library of models and filters for data representation and selection. The Geant4 Visualization System supports a rich set of interactive commands based on the Geant4 command system. It is included in the Geant4 code distribution and maintained and documented like other components of Geant4.

  18. The Geant4 Bertini Cascade

    NASA Astrophysics Data System (ADS)

    Wright, D. H.; Kelsey, M. H.

    2015-12-01

    One of the medium energy hadron-nucleus interaction models in the GEANT4 simulation toolkit is based partly on the Bertini intranuclear cascade model. Since its initial appearance in the toolkit, this model has been largely re-written in order to extend its physics capabilities and to reduce its memory footprint. Physics improvements include extensions in applicable energy range and incident particle types, and improved hadron-nucleon cross-sections and angular distributions. Interfaces have also been developed which allow the model to be coupled with other GEANT4 models at lower and higher energies. The inevitable speed reductions due to enhanced physics have been mitigated by memory and CPU efficiency improvements. Details of these improvements, along with selected comparisons of the model to data, are discussed.

  19. The Geant4 Bertini Cascade

    SciTech Connect

    Wright, D. H.; Kelsey, M. H.

    2015-12-01

    One of the medium energy hadron–nucleus interaction models in the Geant4 simulation toolkit is based partly on the Bertini intranuclear cascade model. Since its initial appearance in the toolkit, this model has been largely re-written in order to extend its physics capabilities and to reduce its memory footprint. Physics improvements include extensions in applicable energy range and incident particle types, and improved hadron–nucleon cross-sections and angular distributions. Interfaces have also been developed which allow the model to be coupled with other Geant4 models at lower and higher energies. The inevitable speed reductions due to enhanced physics have been mitigated by memory and CPU efficiency improvements. Details of these improvements, along with selected comparisons of the model to data, are discussed.

  20. ROSI and GEANT4 - A comparison in the context of high energy X-ray physics

    NASA Astrophysics Data System (ADS)

    Kiunke, Markus; Stritt, Carina; Schielein, Richard; Sukowski, Frank; Hölzing, Astrid; Zabler, Simon; Hofmann, Jürgen; Flisch, Alexander; Kasperl, Stefan; Sennhauser, Urs; Hanke, Randolf

    2016-06-01

    This work compares two popular MC simulation frameworks, ROSI (Roentgen Simulation) and GEANT4 (Geometry and Tracking in its fourth version), in the context of X-ray physics. The comparison is performed with the help of a parameter study considering energy, material and length variations. While the total deposited energy as well as the contribution of Compton scattering show good agreement between all simulated configurations, all other physical effects exhibit large deviations in a comparison of the data-sets. These discrepancies between simulations are shown to originate from the different cross sectional databases used in the frameworks, whereas the overall simulation mechanics seem not to have an influence on the agreement of the simulations. A scan over energy, length and material shows that the two parameters energy and material have a significant influence on the agreement of the simulation results, while the length parameter shows no noticeable influence on the deviations between the data-sets.

  1. Application of TDCR-Geant4 modeling to standardization of 63Ni.

    PubMed

    Thiam, C; Bobin, C; Chauvenet, B; Bouchard, J

    2012-09-01

    As an alternative to the classical TDCR model applied to liquid scintillation (LS) counting, a stochastic approach based on the Geant4 toolkit is presented for the simulation of light emission inside the dedicated three-photomultiplier detection system. To this end, the Geant4 modeling includes a comprehensive description of optical properties associated with each material constituting the optical chamber. The objective is to simulate the propagation of optical photons from their creation in the LS cocktail to the production of photoelectrons in the photomultipliers. First validated for the case of radionuclide standardization based on Cerenkov emission, the scintillation process has been added to a TDCR-Geant4 modeling using the Birks expression in order to account for the light-emission nonlinearity owing to ionization quenching. The scintillation yield of the commercial Ultima Gold LS cocktail has been determined from double-coincidence detection efficiencies obtained for 60Co and 54Mn with the 4π(LS)β-γ coincidence method. In this paper, the stochastic TDCR modeling is applied for the case of the standardization of 63Ni (pure β−-emitter; Emax = 66.98 keV) and the activity concentration is compared with the result given by the classical model. PMID:22436447
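
    As an illustration of how ionization quenching via the Birks expression and a scintillation yield are typically attached to a material in a Geant4 application, a minimal sketch follows; the numerical values are placeholders, not those determined in this work, and the optical property names follow the Geant4 conventions, which may differ slightly between versions.

      // Sketch: attach a Birks constant (ionization quenching) and a
      // scintillation yield to a liquid-scintillator material.
      // Values are illustrative placeholders only.
      #include "G4Material.hh"
      #include "G4MaterialPropertiesTable.hh"
      #include "G4SystemOfUnits.hh"

      void ConfigureScintillator(G4Material* lsCocktail)
      {
        // Birks constant used to model light-emission non-linearity.
        lsCocktail->GetIonisation()->SetBirksConstant(0.1 * mm / MeV);

        // Scintillation yield and resolution scale as constant properties.
        auto* mpt = new G4MaterialPropertiesTable();
        mpt->AddConstProperty("SCINTILLATIONYIELD", 10000. / MeV);
        mpt->AddConstProperty("RESOLUTIONSCALE", 1.0);
        lsCocktail->SetMaterialPropertiesTable(mpt);
      }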

  2. Monte Carlo modeling and validation of a proton treatment nozzle by using the Geant4 toolkit

    NASA Astrophysics Data System (ADS)

    Kim, Dae-Hyun; Kang, Young Nam; Suh, Tae-Suk; Shin, Jungwook; Kim, Jong Won; Yoo, Seung Hoon; Park, Seyjoon; Lee, Sang Hoon; Cho, Sungkoo; Shin, Dongho; Kim, Dae Yong; Lee, Se Byeong

    2012-10-01

    Modern commercial treatment planning systems for proton therapy use the pencil beam algorithm for calculating the absorbed dose. Although it is acceptable for clinical radiation treatment, the accuracy of this method is limited. Alternatively, the Monte Carlo method, which is relatively accurate in dose calculations, has been applied recently to proton therapy. To reduce the remaining uncertainty in proton therapy dose calculations, in the present study, we employed the Geant4 simulation toolkit to develop a Monte Carlo model of a proton treatment nozzle. The results from a Geant4-based medical application of the proton treatment nozzle were compared to the measured data. Simulations of the percentage depth dose profiles showed very good agreement within 1 mm in distal range and 3 mm in modulated width. Moreover, the lateral dose profiles showed good agreement within 3% in the central region of the field and within 10% in the penumbra regions. In this work, we proved that the Geant4 Monte Carlo model of a proton treatment nozzle could be used to calculate proton dose distributions accurately.

  3. Technical Note: Improvements in GEANT4 energy-loss model and the effect on low-energy electron transport in liquid water

    SciTech Connect

    Kyriakou, I.; Incerti, S.

    2015-07-15

    Purpose: The GEANT4-DNA physics models are upgraded by a more accurate set of electron cross sections for ionization and excitation in liquid water. The impact of the new developments on low-energy electron transport simulations by the GEANT4 Monte Carlo toolkit is examined for improving its performance in dosimetry applications at the subcellular and nanometer level. Methods: The authors provide an algorithm for an improved implementation of the Emfietzoglou model dielectric response function of liquid water used in the existing GEANT4-DNA model. The algorithm redistributes the imaginary part of the dielectric function to ensure a physically motivated behavior at the binding energies, while retaining all the advantages of the original formulation, e.g., the analytic properties and the fulfillment of the f-sum rule. In addition, refinements in the exchange and perturbation corrections to the Born approximation used in the existing GEANT4-DNA model are also made. Results: The new ionization and excitation cross sections are significantly different from those of the existing GEANT4-DNA model. In particular, excitations are strongly enhanced relative to ionizations, resulting in higher W-values and less diffusive dose-point-kernels at sub-keV electron energies. Conclusions: An improved energy-loss model for the excitation and ionization of liquid water by low-energy electrons has been implemented in GEANT4-DNA. The suspiciously low W-values and the unphysically long tail in the dose-point-kernel have been corrected owing to a different partitioning of the dielectric function.
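
    For context, the GEANT4-DNA models discussed here are activated through a dedicated electromagnetic physics constructor registered in a modular physics list. The sketch below assumes a recent Geant4 release (where SetCuts() has a default implementation) and uses G4EmDNAPhysics as the baseline constructor; the available option sets vary between releases.

      // Sketch: a physics list that activates the Geant4-DNA track-structure
      // electromagnetic models by registering the dedicated constructor.
      #include "G4VModularPhysicsList.hh"
      #include "G4EmDNAPhysics.hh"
      #include "G4DecayPhysics.hh"
      #include "G4SystemOfUnits.hh"

      class DNAPhysicsList : public G4VModularPhysicsList
      {
       public:
        DNAPhysicsList()
        {
          RegisterPhysics(new G4EmDNAPhysics());  // discrete track-structure EM models
          RegisterPhysics(new G4DecayPhysics());  // particle decays
          SetDefaultCutValue(1. * nanometer);     // nanometre-scale production cuts
        }
      };

    The list would then be handed to the run manager with SetUserInitialization, like any other physics list.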

  4. Simulation and modeling for the stand-off radiation detection system (SORDS) using GEANT4

    SciTech Connect

    Hoover, Andrew S; Wallace, Mark; Galassi, Mark; Mocko, Michal; Palmer, David; Schultz, Larry; Tornga, Shawn

    2009-01-01

    A Stand-Off Radiation Detection System (SORDS) is being developed through a joint effort by Raytheon, Los Alamos National Laboratory, Bubble Technology Industries, Radiation Monitoring Devices, and the Massachusetts Institute of Technology, for the Domestic Nuclear Detection Office (DNDO). The system is a mobile truck-based platform performing detection, imaging, and spectroscopic identification of gamma-ray sources. A Tri-Modal Imaging (TMI) approach combines active-mask coded aperture imaging, Compton imaging, and shadow imaging techniques. Monte Carlo simulation and modeling using the GEANT4 toolkit was used to generate realistic data for the development of imaging algorithms and associated software code.

  5. In-beam quality assurance using induced β+ activity in hadrontherapy: a preliminary physical requirements study using Geant4

    NASA Astrophysics Data System (ADS)

    Lestand, L.; Montarou, G.; Force, P.; Pauna, N.

    2012-10-01

    Light- and heavy-ion particle therapy, mainly by means of protons and carbon ions, represents an advantageous treatment modality for deep-seated and/or radioresistant tumours. An in-beam quality assurance principle is based on the detection of secondary particles induced by nuclear fragmentation between projectile and target nuclei. Three different strategies are currently under investigation: prompt γ-ray imaging, proton interaction vertex imaging and in-beam positron emission tomography. Geant4 simulations have been performed first in order to assess the accuracy of some hadronic models in reproducing experimental data. Two different kinds of data have been considered: β+-emitting isotopes and prompt γ-ray production rates. On the one hand, simulations reproduce the experimental β+-emitting isotope production rates to an accuracy of 24%. Moreover, the simulated β+-emitting nuclei production rate as a function of depth reproduces well the peak-to-plateau ratio of the experimental data. On the other hand, by tuning the tolerance factor of the photon evaporation model available in Geant4, we reduce significantly the prompt γ-ray production rates until very good agreement is reached with experimental data. We then estimated the total amount of induced annihilation photons and prompt γ rays for a simple treatment plan of ∼1 physical Gy in a homogeneous soft-tissue-equivalent tumour (6 cm depth, 4 cm radius and 2 cm wide). The average yield emitted during a 45 s irradiation into a 4π solid angle is ∼2 × 10⁶ annihilation photon pairs and 10⁸ single prompt γ rays whose energy ranges from a few keV to 10 MeV.

  6. Multi-scale hybrid models for radiopharmaceutical dosimetry with Geant4.

    PubMed

    Marcatili, S; Villoing, D; Garcia, M P; Bardiès, M

    2014-12-21

    The accuracy of radiopharmaceutical absorbed dose distributions computed through Monte Carlo (MC) simulations is mostly limited by the low spatial resolution of the 3D imaging techniques used to define the simulation geometry. This issue also persists with the implementation of realistic hybrid models built using polygonal meshes and/or NURBS, as they need to be simulated in their voxel form in order to reduce computation times. The existing trade-off between voxel size and simulation speed leads, on the one hand, to an overestimation of the size of small radiosensitive structures such as the skin or hollow organ walls and, on the other, to an unnecessarily detailed voxelization of large, homogeneous structures. We developed a set of computational tools based on VTK and Geant4 in order to build multi-resolution organ models. Our aim is to use different voxel sizes to represent anatomical regions of different clinical relevance: the MC implementation of these models is expected to improve spatial resolution in specific anatomical structures without significantly affecting simulation speed. Here we present the tools developed through a proof-of-principle example. Our approach is validated against the standard Geant4 technique for the simulation of voxel geometries. PMID:25415621

  7. Benchmarking nuclear models of FLUKA and GEANT4 for carbon ion therapy.

    PubMed

    Böhlen, T T; Cerutti, F; Dosanjh, M; Ferrari, A; Gudowska, I; Mairani, A; Quesada, J M

    2010-10-01

    As carbon ions, at therapeutic energies, penetrate tissue, they undergo inelastic nuclear reactions and give rise to significant yields of secondary fragment fluences. An accurate prediction of these fluences resulting from the primary carbon interactions in the patient's body is therefore necessary in order to precisely simulate the spatial dose distribution and the resulting biological effect. In this paper, the performance of the nuclear fragmentation models of the Monte Carlo transport codes FLUKA and GEANT4 is investigated in tissue-like media and for an energy regime relevant for therapeutic carbon ions. The ability of these Monte Carlo codes to reproduce experimental data of charge-changing cross sections and integral and differential yields of secondary charged fragments is evaluated. For the fragment yields, the main focus is on the consideration of experimental approximations and uncertainties such as the energy measurement by time-of-flight. For GEANT4, the hadronic models G4BinaryLightIonReaction and G4QMD are benchmarked together with some recently enhanced de-excitation models. For non-differential quantities, discrepancies of some tens of percent are found for both codes. For differential quantities, even larger deviations are found. Implications of these findings for the therapeutic use of carbon ions are discussed. PMID:20844337
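
    As a point of reference for how such model choices are exposed to the user, G4BinaryLightIonReaction and G4QMD are wrapped in physics constructors that can be swapped inside a reference physics list. The sketch below uses the constructor and list names of current public Geant4 releases; whether ReplacePhysics substitutes or simply appends depends on the builder type declared by the constructors.

      // Sketch: switch the ion-ion fragmentation model of a reference physics
      // list from the default binary light-ion cascade to the QMD-based one.
      #include "G4RunManager.hh"
      #include "G4VModularPhysicsList.hh"
      #include "QGSP_BIC.hh"                // default ion physics: binary cascade
      #include "G4IonQMDPhysics.hh"         // wraps the G4QMD model for ions

      void ConfigureIonPhysics(G4RunManager* runManager, G4bool useQMD)
      {
        G4VModularPhysicsList* physicsList = new QGSP_BIC;
        if (useQMD) {
          physicsList->ReplacePhysics(new G4IonQMDPhysics());
        }
        runManager->SetUserInitialization(physicsList);
      }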

  8. Multi-scale hybrid models for radiopharmaceutical dosimetry with Geant4

    NASA Astrophysics Data System (ADS)

    Marcatili, S.; Villoing, D.; Garcia, M. P.; Bardiès, M.

    2014-12-01

    The accuracy of radiopharmaceutical absorbed dose distributions computed through Monte Carlo (MC) simulations is mostly limited by the low spatial resolution of the 3D imaging techniques used to define the simulation geometry. This issue also persists with the implementation of realistic hybrid models built using polygonal meshes and/or NURBS, as they need to be simulated in their voxel form in order to reduce computation times. The existing trade-off between voxel size and simulation speed leads, on the one hand, to an overestimation of the size of small radiosensitive structures such as the skin or hollow organ walls and, on the other, to an unnecessarily detailed voxelization of large, homogeneous structures. We developed a set of computational tools based on VTK and Geant4 in order to build multi-resolution organ models. Our aim is to use different voxel sizes to represent anatomical regions of different clinical relevance: the MC implementation of these models is expected to improve spatial resolution in specific anatomical structures without significantly affecting simulation speed. Here we present the tools developed through a proof-of-principle example. Our approach is validated against the standard Geant4 technique for the simulation of voxel geometries.

  9. Beam simulation tools for GEANT4 (and neutrino source applications)

    SciTech Connect

    V.Daniel Elvira, Paul Lebrun and Panagiotis Spentzouris

    2002-12-03

    Geant4 is a toolkit developed by a collaboration of physicists and computer professionals in the High Energy Physics field for the simulation of the passage of particles through matter. The motivation for the development of the Beam Tools is to extend Geant4 applications to accelerator physics. Although there are many computer programs for beam physics simulations, Geant4 is ideal for modelling a beam going through material or a system with a beam line integrated with a complex detector. There are many examples in current international High Energy Physics programs, such as studies related to a future Neutrino Factory, a Linear Collider, and a Very Large Hadron Collider.

  10. Geant4.10 simulation of geometric model for metaphase chromosome

    NASA Astrophysics Data System (ADS)

    Rafat-Motavalli, L.; Miri-Hakimabad, H.; Bakhtiyari, E.

    2016-04-01

    In this paper, a geometric model of the metaphase chromosome is presented. The model is constructed according to the packing ratio and the dimensions of the structure from the nucleosome up to the chromosome. A B-DNA base pair is used to construct nucleosomes of 200 base pairs. Each chromatin fiber loop, which is the unit of repeat, has 49,200 bp. This geometry is entered into the Geant4.10 Monte Carlo simulation toolkit and can be extended to the whole set of metaphase chromosomes and to any application in which a DNA geometrical model is needed. The chromosome base pairs, chromosome lengths, and relative lengths of the chromosomes are calculated. The calculated relative lengths are compared to the relative lengths of human chromosomes.

  11. Modeling spallation reactions in tungsten and uranium targets with the Geant4 toolkit

    NASA Astrophysics Data System (ADS)

    Malyshkin, Yury; Pshenichnov, Igor; Mishustin, Igor; Greiner, Walter

    2012-02-01

    We study primary and secondary reactions induced by 600 MeV proton beams in monolithic cylindrical targets made of natural tungsten and uranium by using Monte Carlo simulations with the Geant4 toolkit [1-3]. The Bertini intranuclear cascade model, the Binary cascade model and the IntraNuclear Cascade Liège (INCL) with the ABLA model [4] were used as calculational options to describe nuclear reactions. Fission cross sections, neutron multiplicities and mass distributions of fragments for 238U fission induced by 25.6 and 62.9 MeV protons are calculated and compared to recent experimental data [5]. Time distributions of neutron leakage from the targets and heat depositions are calculated. This project is supported by Siemens Corporate Technology.
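
    A note on how these calculational options are usually selected in practice: the Bertini, binary-cascade and INCL/ABLA configurations are distributed as named reference physics lists that can be instantiated by name, as in the sketch below; the list names are those shipped with recent Geant4 versions, and availability varies by release.

      // Sketch: choose between intranuclear-cascade options by instantiating
      // reference physics lists by name via G4PhysListFactory.
      #include "G4RunManager.hh"
      #include "G4PhysListFactory.hh"
      #include "G4VModularPhysicsList.hh"

      void SelectCascadeModel(G4RunManager* runManager, const G4String& listName)
      {
        // Typical choices: "QGSP_BERT" (Bertini cascade), "QGSP_BIC" (binary
        // cascade), "QGSP_INCLXX" (INCL cascade with ABLA de-excitation).
        G4PhysListFactory factory;
        G4VModularPhysicsList* physicsList = factory.GetReferencePhysList(listName);
        runManager->SetUserInitialization(physicsList);
      }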

  12. Comparing Geant4 hadronic models for the WENDI-II rem meter response function.

    PubMed

    Vanaudenhove, T; Dubus, A; Pauly, N

    2013-01-01

    The WENDI-II rem meter is one of the most popular neutron dosemeters used to assess a useful quantity in radiation protection, namely the ambient dose equivalent. This is due to its high sensitivity and its energy response, which approximately follows the conversion function between neutron fluence and ambient dose equivalent in the range from thermal energies to 5 GeV. The simulation of the WENDI-II response function with the Geant4 toolkit is therefore perfectly suited to comparing the low- and high-energy hadronic models provided by this Monte Carlo code. The results showed that the thermal treatment of hydrogen in polyethylene for neutrons below 4 eV has a great influence over the whole detector range. Above 19 MeV, both the Bertini Cascade and Binary Cascade models show a good correlation with the results found in the literature, while low-energy parameterised models are not suitable for this application. PMID:22972796

  13. Hadronic models validation in GEANT4 with CALICE highly granular calorimeters

    NASA Astrophysics Data System (ADS)

    Ramilli, Marco; CALICE Collaboration

    2012-12-01

    The CALICE collaboration has constructed highly granular hadronic and electromagnetic calorimeter prototypes to evaluate technologies for use in detector systems at a future Linear Collider, and to validate hadronic shower models with unprecedented spatial segmentation. The electromagnetic calorimeter is a sampling structure of tungsten and silicon with 9720 readout channels. The hadron calorimeter uses 7608 small plastic scintillator cells individually read out with silicon photomultipliers. This high granularity opens up the possibility of precise three-dimensional shower reconstruction and of software compensation techniques to improve the energy resolution of the detector. We discuss the latest results of the studies of shower shapes and shower properties and their comparison to the most recently developed GEANT4 models for hadronic showers. A satisfactory agreement, at the level of 5% or better, is found between data and simulations for most of the investigated variables. We show that, by applying software compensation methods based on reconstructed clusters, the energy resolution for hadrons improves by 15%. The next challenge for the CALICE calorimeters will be to validate the fourth dimension of hadronic showers, namely their time evolution.

  14. Modeling of x-ray fluorescence using MCNPX and Geant4

    SciTech Connect

    Rajasingam, Akshayan; Hoover, Andrew S; Fensin, Michael L; Tobin, Stephen J

    2009-01-01

    X-Ray Fluorescence (XRF) is one of thirteen non-destructive assay techniques being researched for the purpose of quantifying the Pu mass in used fuel assemblies. The modeling portion of this research will be conducted with the MCNPX transport code. The research presented here was undertaken to test the capability of MCNPX so that it can be used to benchmark measurements made at ORNL and to give confidence in the application of MCNPX as a predictive tool for the expected capability of XRF in the context of used fuel assemblies. The main focus of this paper is a code-to-code comparison between the MCNPX and Geant4 codes. Since XRF in used fuel is driven by photon emission and beta decay of fission fragments, both source terms were independently researched. Simple cases and used-fuel cases were modeled for both source terms. In order to prepare for benchmarking against experiments, it was necessary to determine the relative significance of the various fission fragments for producing X-rays.

  15. Geant4-DNA: overview and recent developments

    NASA Astrophysics Data System (ADS)

    Štěpán, Václav

    Space travel and high-altitude flights are inherently associated with prolonged exposure to cosmic and solar radiation. Understanding and simulating radiation action at the cellular and subcellular level contributes to a precise assessment of the associated health risks and remains a challenge of today’s radiobiology research. The Geant4-DNA project (http://geant4-dna.org) aims at developing an experimentally validated simulation platform for modelling the damage induced by ionizing radiation at the DNA level. The platform is based on the Geant4 Monte Carlo simulation toolkit. The project extends specific functionalities of Geant4 in the following areas: the step-by-step, single-scattering modelling of elementary physical interactions of electrons, protons, alpha particles and light ions with liquid water and DNA bases, for the so-called “physical” stage; and the modelling of the “physico-chemical and chemical” stages, corresponding to the production and diffusion of, and the chemical reactions occurring between, the chemical species produced by water radiolysis, and to the radical attack on the biological targets. Physical- and chemical-stage simulations are combined with biological target models on several scales, from the DNA double helix, through the nucleosome, to chromatin segments and cell geometries. In addition, data-mining clustering algorithms have been developed and optimised for the purpose of DNA damage scoring in simulated tracks. Experimental measurements on pBR322 plasmid DNA are being carried out in order to validate the Geant4-DNA models. The plasmid DNA has been irradiated in dry conditions by protons with energies from 100 keV to 30 MeV and in aqueous conditions, with and without scavengers, by 30 MeV protons, 290 MeV/u carbon and 500 MeV/u iron ions. Agarose gel electrophoresis combined with enzymatic treatment has been used to measure the resulting DNA damage. An overview of the developments undertaken by the Geant4-DNA collaboration including a description of

  16. Recent Developments in the Geant4 Hadronic Framework

    NASA Astrophysics Data System (ADS)

    Pokorski, Witold; Ribon, Alberto

    2014-06-01

    In this paper we present the recent developments in the Geant4 hadronic framework. Geant4 is the main simulation toolkit used by the LHC experiments and therefore a lot of effort is put into improving the physics models in order for them to have more predictive power. As a consequence, the code complexity increases, which requires constant improvement and optimization on the programming side. At the same time, we would like to review and eventually reduce the complexity of the hadronic software framework. As an example, a factory design pattern has been applied in Geant4 to avoid duplication of objects, such as cross sections, which can be used by several processes or physics models. This approach has also been applied to physics lists, to provide a flexible configuration mechanism at run-time, based on macro files. Moreover, these developments open up the future possibility of building Geant4 with only a specified subset of physics models. Another technical development focused on the reproducibility of the simulation, i.e. the possibility of repeating an event once the random-generator status at the beginning of the event is known. This is crucial for debugging rare situations that may occur after long simulations. Moreover, reproducibility in normal, sequential Geant4 simulation is an important prerequisite for verifying the equivalence with multithreaded Geant4 simulations.
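
    As an illustration of the event-level reproducibility mentioned above, the random-engine status can be saved before an event of interest and restored later to replay it; the file name below is an arbitrary example.

      // Sketch: save and restore the random-number engine status so that a
      // single event can be replayed in isolation for debugging.
      #include "Randomize.hh"     // provides the G4Random alias to the CLHEP engine

      void SaveStateBeforeEvent()
      {
        // Record the engine status at the start of an event of interest.
        G4Random::saveEngineStatus("event_of_interest.rndm");
      }

      void ReplayEvent()
      {
        // Later (e.g. in a separate debugging job) restore the exact status and
        // re-simulate: the event should reproduce identically in sequential mode.
        G4Random::restoreEngineStatus("event_of_interest.rndm");
      }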

  17. GEANT4 and Secondary Particle Production

    NASA Technical Reports Server (NTRS)

    Patterson, Jeff

    2004-01-01

    GEANT4 is a Monte Carlo toolkit developed by the High Energy Physics community (CERN, SLAC, etc.) to perform simulations of complex particle detectors. GEANT4 is the ideal tool to study radiation transport and should be applied to space environments and the complex geometries of modern-day spacecraft.

  18. GAMOS: A framework to do GEANT4 simulations in different physics fields with an user-friendly interface

    NASA Astrophysics Data System (ADS)

    Arce, Pedro; Ignacio Lagares, Juan; Harkness, Laura; Pérez-Astudillo, Daniel; Cañadas, Mario; Rato, Pedro; de Prado, María; Abreu, Yamiel; de Lorenzo, Gianluca; Kolstein, Machiel; Díaz, Angelina

    2014-01-01

    GAMOS is a software system for GEANT4-based simulation. It comprises a framework, a set of components providing functionality to simulation applications on top of the GEANT4 toolkit, and a collection of ready-made applications. It allows the user to perform GEANT4-based simulations using a scripting language, without requiring any C++ code to be written. Moreover, the GAMOS design allows the existing functionality to be extended through user-supplied C++ classes. The main characteristics of GAMOS and its embedded functionality are described.

  19. A Virtual Geant4 Environment

    NASA Astrophysics Data System (ADS)

    Iwai, Go

    2015-12-01

    We describe the development of an environment for Geant4 consisting of an application and data that provide users with a more efficient way to access Geant4 applications without having to download and build the software locally. The environment is platform-neutral and offers users near-real-time performance. In addition, the environment consists of data and Geant4 libraries built using low-level virtual machine (LLVM) tools, which can produce bitcode that can be embedded in HTML and accessed via a browser. The bitcode is downloaded to the local machine via the browser and can then be configured by the user. This approach provides a way of minimising the risk of leaking potentially sensitive data used to construct the Geant4 model and application in the medical domain for treatment planning. We describe several applications that have used this approach and compare their performance with that of native applications. We also describe potential user communities that could benefit from this approach.

  20. Validation of nuclear models in Geant4 using the dose distribution of a 177 MeV proton pencil beam

    NASA Astrophysics Data System (ADS)

    Hall, David C.; Makarova, Anastasia; Paganetti, Harald; Gottschalk, Bernard

    2016-01-01

    A proton pencil beam is associated with a surrounding low-dose envelope, originating from nuclear interactions. It is important for treatment planning systems to accurately model this envelope when performing dose calculations for pencil beam scanning treatments, and Monte Carlo (MC) codes are commonly used for this purpose. This work aims to validate the nuclear models employed by the Geant4 MC code, by comparing the simulated absolute dose distribution to a recent experiment of a 177 MeV proton pencil beam stopping in water. Striking agreement is observed over five orders of magnitude, with both the shape and normalisation well modelled. The normalisations of two depth dose curves are lower than experiment, though this could be explained by an experimental positioning error. The Geant4 neutron production model is also verified in the distal region. The entrance dose is poorly modelled, suggesting an unaccounted upstream source of low-energy protons. Recommendations are given for a follow-up experiment which could resolve these issues.

  1. Geant4 validation with CMS calorimeters test-beam data

    SciTech Connect

    Piperov, Stefan; /Sofiya, Inst. Nucl. Res. /Fermilab

    2008-08-01

    The CMS experiment is using Geant4 for the Monte Carlo simulation of the detector setup. Validation of the physics processes describing hadronic showers is a major concern in view of obtaining a proper description of jets and missing energy for signal and background events. This is done by carrying out extensive studies with test beams using prototypes or real detector modules of the CMS calorimeter. These data are matched with Geant4 predictions. Tuning of the Geant4 models is carried out, and the steps to be used in reproducing detector signals are defined in view of measurements of energy response, energy resolution, and transverse and longitudinal shower profiles for a variety of hadron beams over a broad energy spectrum between 2 and 300 GeV/c.

  2. A CAD interface for GEANT4.

    PubMed

    Poole, C M; Cornelius, I; Trapp, J V; Langton, C M

    2012-09-01

    Often CAD models already exist for parts of a geometry being simulated using GEANT4. Direct import of these CAD models into GEANT4, however, may not be possible, and complex components may be difficult to define via other means. Solutions that allow users to work around the limited support in the GEANT4 toolkit for loading predefined CAD geometries have been presented by others; however, these solutions require intermediate file-format conversion using commercial software. Herein we describe a technique that allows CAD models to be loaded directly as geometry without the need for commercial software and intermediate file-format conversion. Robustness of the interface was tested using a set of CAD models of various complexity; for the models used in testing, no import errors were reported and all geometry was found to be navigable by GEANT4. PMID:22956356
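
    For background, CAD geometry imported in this way ultimately ends up as a tessellated surface; the underlying Geant4 facility is G4TessellatedSolid, which an importer populates with one facet per triangle read from the file. A minimal hand-built sketch with a single illustrative facet follows; a real mesh would of course add every triangle before closing the solid.

      // Sketch: the Geant4 facility onto which CAD meshes are usually mapped.
      // A CAD interface reads triangles from the file and adds them as facets;
      // here a single triangle is added by hand for illustration.
      #include "G4TessellatedSolid.hh"
      #include "G4TriangularFacet.hh"
      #include "G4ThreeVector.hh"
      #include "G4SystemOfUnits.hh"

      G4TessellatedSolid* BuildMeshSolid()
      {
        auto* solid = new G4TessellatedSolid("cad_part");

        // One triangular facet defined by three vertices in absolute coordinates.
        auto* facet = new G4TriangularFacet(G4ThreeVector(0., 0., 0.),
                                            G4ThreeVector(10. * mm, 0., 0.),
                                            G4ThreeVector(0., 10. * mm, 0.),
                                            ABSOLUTE);
        solid->AddFacet(facet);

        // In a real import, all facets from the CAD file are added before closing.
        solid->SetSolidClosed(true);
        return solid;
      }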

  3. GEANT4-MT: bringing multi-threading into GEANT4 production

    NASA Astrophysics Data System (ADS)

    Ahn, Sunil; Apostolakis, John; Asai, Makoto; Brandt, Daniel; Cooperman, Gene; Cosmo, Gabriele; Dotti, Andrea; Dong, Xin; Jun, Soon Yung; Nowak, Andrzej

    2014-06-01

    GEANT4-MT is the multi-threaded version of the GEANT4 particle transport code (1, 2). The key goals for the design of GEANT4-MT have been a) the need to reduce the memory footprint of the multi-threaded application compared to the use of separate jobs and processes; b) to allow an easy migration of existing applications; and c) to use many threads or cores efficiently, by scaling up to tens and potentially hundreds of workers. The first public release of a GEANT4-MT prototype was made in 2011. We report on the revision of GEANT4-MT for inclusion in the production-level release scheduled for the end of 2013. This has involved significant re-engineering of the prototype in order to incorporate it into the main GEANT4 development line, and the porting of the GEANT4-MT threading code to additional platforms. In order to make the porting of applications as simple as possible, refinements addressed the needs of standalone applications. Further adaptations were created to improve the fit with the frameworks of High Energy Physics (HEP) experiments. We report on performance measurements on Intel Xeon™ and AMD Opteron™ processors, and on the first trials of GEANT4-MT on the Intel Many Integrated Cores (MIC) architecture, in the form of the Xeon Phi™ co-processor (3). These indicate near-linear scaling through about 200 threads on 60 cores, when holding fixed the number of events per thread.
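
    For orientation, the user-visible change when migrating an application to GEANT4-MT is essentially the choice of run manager and the thread count, as in the sketch below; the user initialization objects are assumed to be supplied by the existing application.

      // Sketch: drive an existing application with the multi-threaded run
      // manager. The user initialization objects come from the application.
      #include "G4MTRunManager.hh"
      #include "G4VUserDetectorConstruction.hh"
      #include "G4VUserPhysicsList.hh"
      #include "G4VUserActionInitialization.hh"

      void RunMultiThreaded(G4VUserDetectorConstruction* detector,
                            G4VUserPhysicsList* physics,
                            G4VUserActionInitialization* actions,
                            G4int nThreads, G4int nEvents)
      {
        auto* runManager = new G4MTRunManager;
        runManager->SetNumberOfThreads(nThreads);     // number of worker threads
        runManager->SetUserInitialization(detector);
        runManager->SetUserInitialization(physics);
        runManager->SetUserInitialization(actions);
        runManager->Initialize();
        runManager->BeamOn(nEvents);                  // events distributed across workers
        delete runManager;
      }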

  4. Simulations of nuclear resonance fluorescence in GEANT4

    NASA Astrophysics Data System (ADS)

    Lakshmanan, Manu N.; Harrawood, Brian P.; Rusev, Gencho; Agasthya, Greeshma A.; Kapadia, Anuj J.

    2014-11-01

    The nuclear resonance fluorescence (NRF) technique has been used effectively to identify isotopes based on their nuclear energy levels. Specific examples of its modern-day applications include detecting spent nuclear waste and cargo scanning for homeland security. The experimental designs for these NRF applications can be more efficiently optimized using Monte Carlo simulations before the experiment is implemented. One of the most widely used Monte Carlo physics simulations is the open-source toolkit GEANT4. However, NRF physics has not been incorporated into the GEANT4 simulation toolkit in publicly available software. Here we describe the development and testing of an NRF simulation in GEANT4. We describe in depth the development and architecture of this software for the simulation of NRF in any isotope in GEANT4, as well as verification and validation testing of the simulation for NRF in boron. In the verification testing, the simulation agreed with the analytical model to within 0.6% for boron and iron. In the validation testing, the simulation agreed with the experimental measurements for boron to within 20.5%, with the difference likely due to small uncertainties in beam polarization, energy distribution, and detector composition.

  5. Galactic Cosmic Rays and Lunar Secondary Particles from Solar Minimum to Maximum: CRaTER Observations and Geant4 Modeling

    NASA Astrophysics Data System (ADS)

    Looper, M. D.; Mazur, J. E.; Blake, J. B.; Spence, H. E.; Schwadron, N.; Golightly, M. J.; Case, A. W.; Kasper, J. C.; Townsend, L. W.; Wilson, J. K.

    2014-12-01

    The Lunar Reconnaissance Orbiter mission was launched in 2009 during the recent deep and extended solar minimum, with the highest galactic cosmic ray (GCR) fluxes observed since the beginning of the space era. Its Cosmic Ray Telescope for the Effects of Radiation (CRaTER) instrument was designed to measure the spectra of energy deposits in silicon detectors shielded behind pieces of tissue equivalent plastic, simulating the self-shielding provided by an astronaut's body around radiation-sensitive organs. The CRaTER data set now covers the evolution of the GCR environment near the moon during the first five years of development of the present solar cycle. We will present these observations, along with Geant4 modeling to illustrate the varying particle contributions to the energy-deposit spectra. CRaTER has also measured protons traveling up from the lunar surface after their creation during GCR interactions with surface material, and we will report observations and modeling of the energy and angular distributions of these "albedo" protons.

  6. Geant4 Applications in Space

    SciTech Connect

    Asai, M.; /SLAC

    2007-11-07

    Use of Geant4 is rapidly expanding in the space application domain. I try to give an overview of three major application areas of Geant4 in space: apparatus simulation for pre-launch design and post-launch analysis; planetary-scale simulation for radiation spectra and surface and sub-surface explorations; and micro-dosimetry simulation for single-event studies and radiation-hardening of semiconductor devices. Recently, not only mission-dependent applications but also various multi-purpose or common tools built on top of Geant4 have become widely available. I overview some of such tools as well. The Geant4 Collaboration identifies space applications as now being one of the major driving forces for further developments and refinements of the Geant4 toolkit. Highlights of such developments are introduced.

  7. GEANT4 Applications in Space

    NASA Astrophysics Data System (ADS)

    Asai, Makoto

    2008-06-01

    The use of Geant4 is rapidly expanding in the domain of space applications. I try to give an overview of three major application areas of Geant4 in space: apparatus simulation for pre-launch design and post-launch analysis; planetary-scale simulation for radiation spectra and surface and sub-surface explorations; and micro-dosimetry simulation for single-event studies and radiation-hardening of semiconductor devices. Recently, not only mission-dependent applications but also various multi-purpose or common tools built on top of Geant4 have become widely available. I overview some of these tools as well. The Geant4 Collaboration identifies space applications as now being one of the major driving forces for further developments and refinements of the Geant4 toolkit. Highlights of such developments are given.

  8. An Overview of the Geant4 Toolkit

    SciTech Connect

    Apostolakis, John; Wright, Dennis H.

    2007-03-19

    Geant4 is a toolkit for the simulation of the transport of radiation through matter. With a flexible kernel and a choice between different physics modeling options, it has been tailored to the requirements of a wide range of applications. With the toolkit a user can describe a setup's or detector's geometry and materials, navigate inside it, simulate the physical interactions using a choice of physics engines, underlying physics cross-sections and models, and visualise and store results. Physics models describing electromagnetic and hadronic interactions are provided, as are decays and processes for optical photons. Several models, with different precision and performance, are available for many processes. The toolkit includes coherent physics model configurations, which are called physics lists. Users can choose an existing physics list or create their own, depending on their requirements and the application area. A clear structure and readable code enable the user to investigate the origin of physics results. Application areas include detector simulation and background simulation in High Energy Physics experiments, simulation of accelerator setups, studies in medical imaging and treatment, and the study of the effects of solar radiation on spacecraft instruments.
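
    To make the workflow sketched in this abstract concrete, a minimal but complete application might look as follows; the geometry, particle type and event count are arbitrary illustrative choices.

      // Minimal sketch of a complete Geant4 application: a water-filled world,
      // a reference physics list and a particle gun firing protons.
      #include "G4RunManager.hh"
      #include "G4VUserDetectorConstruction.hh"
      #include "G4VUserActionInitialization.hh"
      #include "G4VUserPrimaryGeneratorAction.hh"
      #include "G4NistManager.hh"
      #include "G4Box.hh"
      #include "G4LogicalVolume.hh"
      #include "G4PVPlacement.hh"
      #include "G4ParticleGun.hh"
      #include "G4ParticleTable.hh"
      #include "G4Event.hh"
      #include "G4ThreeVector.hh"
      #include "G4SystemOfUnits.hh"
      #include "FTFP_BERT.hh"

      // Geometry: a 1 m cube of water.
      class DetectorConstruction : public G4VUserDetectorConstruction {
       public:
        G4VPhysicalVolume* Construct() override {
          auto* water = G4NistManager::Instance()->FindOrBuildMaterial("G4_WATER");
          auto* worldBox = new G4Box("World", 0.5 * m, 0.5 * m, 0.5 * m);
          auto* worldLog = new G4LogicalVolume(worldBox, water, "World");
          return new G4PVPlacement(nullptr, G4ThreeVector(), worldLog, "World",
                                   nullptr, false, 0);
        }
      };

      // Primary particles: 150 MeV protons along +z.
      class PrimaryGenerator : public G4VUserPrimaryGeneratorAction {
       public:
        PrimaryGenerator() : fGun(new G4ParticleGun(1)) {
          fGun->SetParticleDefinition(
              G4ParticleTable::GetParticleTable()->FindParticle("proton"));
          fGun->SetParticleEnergy(150. * MeV);
          fGun->SetParticleMomentumDirection(G4ThreeVector(0., 0., 1.));
        }
        void GeneratePrimaries(G4Event* event) override {
          fGun->GeneratePrimaryVertex(event);
        }
       private:
        G4ParticleGun* fGun;
      };

      class ActionInitialization : public G4VUserActionInitialization {
       public:
        void Build() const override { SetUserAction(new PrimaryGenerator); }
      };

      int main() {
        auto* runManager = new G4RunManager;
        runManager->SetUserInitialization(new DetectorConstruction);
        runManager->SetUserInitialization(new FTFP_BERT);            // physics list
        runManager->SetUserInitialization(new ActionInitialization); // user actions
        runManager->Initialize();
        runManager->BeamOn(100);                                     // simulate 100 events
        delete runManager;
        return 0;
      }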

  9. An Overview of the GEANT4 Toolkit

    SciTech Connect

    Apostolakis, John; Wright, Dennis H.; /SLAC

    2007-10-05

    Geant4 is a toolkit for the simulation of the transport of radiation through matter. With a flexible kernel and a choice between different physics modeling options, it has been tailored to the requirements of a wide range of applications. With the toolkit a user can describe a setup's or detector's geometry and materials, navigate inside it, simulate the physical interactions using a choice of physics engines, underlying physics cross-sections and models, and visualize and store results. Physics models describing electromagnetic and hadronic interactions are provided, as are decays and processes for optical photons. Several models, with different precision and performance, are available for many processes. The toolkit includes coherent physics model configurations, which are called physics lists. Users can choose an existing physics list or create their own, depending on their requirements and the application area. A clear structure and readable code enable the user to investigate the origin of physics results. Application areas include detector simulation and background simulation in High Energy Physics experiments, simulation of accelerator setups, studies in medical imaging and treatment, and the study of the effects of solar radiation on spacecraft instruments.

  10. First statistical analysis of Geant4 quality software metrics

    NASA Astrophysics Data System (ADS)

    Ronchieri, Elisabetta; Grazia Pia, Maria; Giacomini, Francesco

    2015-12-01

    Geant4 is a simulation system of particle transport through matter, widely used in several experimental areas from high energy physics and nuclear experiments to medical studies. Some of its applications may involve critical use cases; therefore they would benefit from an objective assessment of the software quality of Geant4. In this paper, we provide a first statistical evaluation of software metrics data related to a set of Geant4 physics packages. The analysis aims at identifying risks for Geant4 maintainability, which would benefit from being addressed at an early stage. The findings of this pilot study set the grounds for further extensions of the analysis to the whole of Geant4 and to other high energy physics software systems.

  11. Medical Applications of the Geant4 Simulation Toolkit

    NASA Astrophysics Data System (ADS)

    Perl, Joseph

    2008-03-01

    Geant4 is a toolkit for the simulation of the passage of particles through matter. While Geant4 was originally developed for High Energy Physics (HEP), applications now include Nuclear, Space and Medical Physics. Medical applications of Geant4 in North America and throughout the world have been increasing rapidly due to the overall growth of Monte Carlo use in Medical Physics and the unique qualities of Geant4 as an all-particle code able to handle complex geometry, motion and fields with the flexibility of modern programming and an open and free source code. Work has included characterizing beams and brachytherapy sources, treatment planning, retrospective studies, imaging and validation. This talk will provide an overview of these applications, with a focus on therapy, and will discuss how Geant4 has responded to the specific challenges of moving from HEP to Medical applications.

  12. Geant4 Computing Performance Benchmarking and Monitoring

    NASA Astrophysics Data System (ADS)

    Dotti, Andrea; Elvira, V. Daniel; Folger, Gunter; Genser, Krzysztof; Jun, Soon Yung; Kowalkowski, James B.; Paterno, Marc

    2015-12-01

    Performance evaluation and analysis of large scale computing applications is essential for optimal use of resources. As detector simulation is one of the most compute intensive tasks and Geant4 is the simulation toolkit most widely used in contemporary high energy physics (HEP) experiments, it is important to monitor Geant4 through its development cycle for changes in computing performance and to identify problems and opportunities for code improvements. All Geant4 development and public releases are being profiled with a set of applications that utilize different input event samples, physics parameters, and detector configurations. Results from multiple benchmarking runs are compared to previous public and development reference releases to monitor CPU and memory usage. Observed changes are evaluated and correlated with code modifications. Besides the full summary of call stack and memory footprint, a detailed call graph analysis is available to Geant4 developers for further analysis. The set of software tools used in the performance evaluation procedure, both in sequential and multi-threaded modes, include FAST, IgProf and Open|Speedshop. The scalability of the CPU time and memory performance in multi-threaded application is evaluated by measuring event throughput and memory gain as a function of the number of threads for selected event samples.

  13. Geant4 Computing Performance Benchmarking and Monitoring

    SciTech Connect

    Dotti, Andrea; Elvira, V. Daniel; Folger, Gunter; Genser, Krzysztof; Jun, Soon Yung; Kowalkowski, James B.; Paterno, Marc

    2015-12-23

    Performance evaluation and analysis of large scale computing applications is essential for optimal use of resources. As detector simulation is one of the most compute intensive tasks and Geant4 is the simulation toolkit most widely used in contemporary high energy physics (HEP) experiments, it is important to monitor Geant4 through its development cycle for changes in computing performance and to identify problems and opportunities for code improvements. All Geant4 development and public releases are being profiled with a set of applications that utilize different input event samples, physics parameters, and detector configurations. Results from multiple benchmarking runs are compared to previous public and development reference releases to monitor CPU and memory usage. Observed changes are evaluated and correlated with code modifications. Besides the full summary of call stack and memory footprint, a detailed call graph analysis is available to Geant4 developers for further analysis. The set of software tools used in the performance evaluation procedure, both in sequential and multi-threaded modes, include FAST, IgProf and Open|Speedshop. In conclusion, the scalability of the CPU time and memory performance in multi-threaded application is evaluated by measuring event throughput and memory gain as a function of the number of threads for selected event samples.

  14. Geant4 Computing Performance Benchmarking and Monitoring

    DOE PAGESBeta

    Dotti, Andrea; Elvira, V. Daniel; Folger, Gunter; Genser, Krzysztof; Jun, Soon Yung; Kowalkowski, James B.; Paterno, Marc

    2015-12-23

    Performance evaluation and analysis of large scale computing applications is essential for optimal use of resources. As detector simulation is one of the most compute intensive tasks and Geant4 is the simulation toolkit most widely used in contemporary high energy physics (HEP) experiments, it is important to monitor Geant4 through its development cycle for changes in computing performance and to identify problems and opportunities for code improvements. All Geant4 development and public releases are being profiled with a set of applications that utilize different input event samples, physics parameters, and detector configurations. Results from multiple benchmarking runs are compared to previous public and development reference releases to monitor CPU and memory usage. Observed changes are evaluated and correlated with code modifications. Besides the full summary of call stack and memory footprint, a detailed call graph analysis is available to Geant4 developers for further analysis. The set of software tools used in the performance evaluation procedure, both in sequential and multi-threaded modes, include FAST, IgProf and Open|Speedshop. In conclusion, the scalability of the CPU time and memory performance in multi-threaded application is evaluated by measuring event throughput and memory gain as a function of the number of threads for selected event samples.

  15. Calculation of electron Dose Point Kernel in water with GEANT4 for medical application

    NASA Astrophysics Data System (ADS)

    Guimarães, C. C.; Moralles, M.; Sene, F. F.; Martinelli, J. R.; Okuno, E.

    2009-06-01

    The rapid insertion of new technologies into medical physics in recent years, especially in nuclear medicine, has been followed by a great development of faster Monte Carlo algorithms. GEANT4 is a Monte Carlo toolkit that contains the tools to simulate problems of particle transport through matter. In this work, GEANT4 was used to calculate the dose-point-kernel (DPK) for monoenergetic electrons in water, which is an important reference medium for nuclear medicine. The three different physical models of electromagnetic interactions provided by GEANT4—Low Energy, Penelope and Standard—were employed. To verify the adequacy of these models, the results were compared with references from the literature. For all energies and physical models, the agreement between the calculated DPKs and the reported values is satisfactory.
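
    By way of illustration, the three electromagnetic configurations compared here correspond to separate physics constructors that can be substituted into a physics list; the sketch below uses the constructor names of current Geant4 releases, in which the former "Low Energy" models are carried by the Livermore constructor.

      // Sketch: select one of the alternative electromagnetic configurations
      // (Standard, Livermore/low-energy, Penelope) by replacing the EM
      // constructor of a reference physics list.
      #include "G4RunManager.hh"
      #include "G4VModularPhysicsList.hh"
      #include "FTFP_BERT.hh"
      #include "G4EmStandardPhysics.hh"
      #include "G4EmLivermorePhysics.hh"
      #include "G4EmPenelopePhysics.hh"

      void ConfigureEmPhysics(G4RunManager* runManager, const G4String& option)
      {
        G4VModularPhysicsList* physicsList = new FTFP_BERT;  // ships with standard EM

        if (option == "livermore")
          physicsList->ReplacePhysics(new G4EmLivermorePhysics());
        else if (option == "penelope")
          physicsList->ReplacePhysics(new G4EmPenelopePhysics());
        else
          physicsList->ReplacePhysics(new G4EmStandardPhysics());

        runManager->SetUserInitialization(physicsList);
      }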

  16. MCNP5 and GEANT4 comparisons for preliminary Fast Neutron Pencil Beam design at the University of Utah TRIGA system

    NASA Astrophysics Data System (ADS)

    Adjei, Christian Amevi

    The main objective of this thesis is twofold. The first objective was to develop a model for meaningful benchmarking of different versions of GEANT4 against an experimental set-up and MCNP5 pertaining to photon transport and interactions. The second objective was to develop a preliminary design of a Fast Neutron Pencil Beam (FNPB) facility applicable to the University of Utah research reactor (UUTR) using MCNP5 and GEANT4. The three GEANT4 code versions, GEANT4.9.4, GEANT4.9.3, and GEANT4.9.2, were compared to MCNP5 and to experimental measurements of gamma attenuation in air. The average gamma dose rate was measured in the laboratory experiment at various distances from a shielded cesium source using a Ludlum model 19 portable NaI detector. As expected, the gamma dose rate decreased with distance. All three GEANT4 code versions agreed well with both the experimental data and the MCNP5 simulation. Additionally, a simple GEANT4 and MCNP5 model was developed to compare the code agreement for neutron interactions in various materials. The preliminary FNPB design was developed using MCNP5; a semi-accurate model was developed using GEANT4 (because GEANT4 does not support reactor physics modeling, the reactor was represented as a surface neutron source, hence a semi-accurate model). Based on the MCNP5 model, the fast neutron flux in a sample holder of the FNPB is obtained to be 6.52×10⁷ n/cm²s, which is one order of magnitude lower than that of the gigantic fast neutron pencil beam facilities existing elsewhere. The MCNP5 model-based neutron spectrum indicates that the maximum expected fast neutron flux is at a neutron energy of ~1 MeV. In addition, the MCNP5 model provided information on the gamma flux to be expected in this preliminary FNPB design; specifically, in the sample holder the gamma flux is expected to be around 10⁸ γ/cm²s, delivering a gamma dose of 4.54×10³ rem/hr. This value is one to two orders of magnitude below the gamma

  17. GEANT4 Tuning For pCT Development

    NASA Astrophysics Data System (ADS)

    Yevseyeva, Olga; de Assis, Joaquim T.; Evseev, Ivan; Schelin, Hugo R.; Paschuk, Sergei A.; Milhoretto, Edney; Setti, João A. P.; Díaz, Katherin S.; Hormaza, Joel M.; Lopes, Ricardo T.

    2011-08-01

    Proton beams in medical applications deal with relatively thick targets such as the human head or trunk. Thus, the fidelity of proton computed tomography (pCT) simulations as a tool for proton therapy planning depends, in the general case, on the accuracy of the results obtained for proton interactions with thick absorbers. GEANT4 simulations of proton energy spectra after passing through thick absorbers do not agree well with existing experimental data, as shown previously. Moreover, the spectra simulated for the Bethe-Bloch domain showed an unexpected sensitivity to the choice of low-energy electromagnetic models during the code execution. These observations were made with GEANT4 version 8.2 during our simulations for pCT. This work describes in more detail the simulations of proton passage through aluminum absorbers of varied thickness. The simulations were done by modifying only the geometry in the Hadrontherapy example, and for all available choices of the electromagnetic physics models. As the most probable reason for these effects is some specific feature in the code, or some specific implicit parameters in the GEANT4 manual, we continued our study with version 9.2 of the code. Some improvements in comparison with our previous results were obtained. The simulations were performed considering further applications for pCT development.

  18. Simulation of Cold Neutron Experiments using GEANT4

    NASA Astrophysics Data System (ADS)

    Frlez, Emil; Hall, Joshua; Root, Melinda; Baessler, Stefan; Pocanic, Dinko

    2013-10-01

    We review the available GEANT4 physics processes for cold neutrons in the energy range 1-100 meV. We consider the cases of a neutron beam interacting with (i) para- and ortho-polarized liquid hydrogen, (ii) aluminum, and (iii) carbon tetrachloride (CCl4) targets. The scattering, thermal and absorption cross sections used by the GEANT4 and MCNP6 libraries are compared with the National Nuclear Data Center (NNDC) compilation. The NPDGamma detector simulation is presented as an example of the implementation of the resulting GEANT4 code. This work is supported by NSF grant PHY-0970013.

  19. SU-E-T-565: RAdiation Resistance of Cancer CElls Using GEANT4 DNA: RACE

    SciTech Connect

    Perrot, Y; Payno, H; Delage, E; Maigne, L

    2014-06-01

    Purpose: The objective of the RACE project is to develop a comparison between Monte Carlo simulation using the Geant4-DNA toolkit and measurements of radiation damage on 3D melanoma and chondrosarcoma culture cells coupled with gadolinium nanoparticles. We present here the status of the developments regarding the simulations. Methods: Monte Carlo studies are driven using the Geant4 toolkit and the Geant4-DNA extension. In order to model the geometry of a cell population, the open-source CPOP++ program is being developed for the geometrical representation of 3D cell populations, including a specific cell mesh coupled with a multi-agent system. Each cell includes a cytoplasm and a nucleus. The correct modeling of the cell population has been validated with confocal microscopy images of spheroids. The Geant4 Livermore physics models are used to simulate the interactions of a 250 keV X-ray beam and the production of secondaries from gadolinium nanoparticles assumed to be fixed on the cell membranes. Geant4-DNA processes are used to simulate the interactions of charged particles with the cells. An atomistic description of the DNA molecule, from PDB (Protein Data Bank) files, is provided by the so-called PDB4DNA Geant4 user application we developed to score energy depositions in DNA base pairs and sugar-phosphate groups. Results: At the microscopic level, our simulations enable assessing the microscopic energy distribution in each cell compartment of a realistic 3D cell population. Dose enhancement factors due to the presence of gadolinium nanoparticles can be estimated. At the nanometer scale, direct damage to nuclear DNA is also estimated. Conclusion: We successfully simulated the impact of direct radiation on a realistic 3D cell population model compatible with microdosimetry calculations using the Geant4-DNA toolkit. Upcoming validation and the future integration of the radiochemistry module of Geant4-DNA will propose to correlate clusters of ionizations with in vitro

  20. Radiation Effects Investigations Based on Atmospheric Radiation Model (ATMORAD) Considering GEANT4 Simulations of Extensive Air Showers and Solar Modulation Potential.

    PubMed

    Hubert, Guillaume; Cheminet, Adrien

    2015-07-01

    The natural radiative atmospheric environment is composed of secondary cosmic rays produced when primary cosmic rays hit the atmosphere. Understanding atmospheric radiation and its dynamics is essential for evaluating single event effects, so that radiation risks in aviation and the space environment (space weather) can be assessed. In this article, we present an atmospheric radiation model, named ATMORAD (Atmospheric Radiation), which is based on GEANT4 simulations of extensive air showers according to primary spectra that depend only on the solar modulation potential (force-field approximation). Based on neutron spectrometry, the solar modulation potential can be deduced using neutron spectrometer measurements and ATMORAD. Some comparisons between our methodology and standard approaches or measurements are also discussed. This work demonstrates the potential for using simulations of extensive air showers and neutron spectroscopy to monitor solar activity. PMID:26151172

  1. Identifying key surface parameters for optical photon transport in GEANT4/GATE simulations.

    PubMed

    Nilsson, Jenny; Cuplov, Vesna; Isaksson, Mats

    2015-09-01

    For a scintillator used for spectrometry, the generation, transport and detection of optical photons have a great impact on the energy-spectrum resolution. A complete Monte Carlo model of a scintillator includes coupled ionizing-particle and optical-photon transport, which can be simulated with the GEANT4 code. The GEANT4 surface parameters control the physics processes an optical photon undergoes when reaching the surface of a volume. In this work the impact of each surface parameter on the optical transport was studied by looking at the optical spectrum: the number of detected optical photons per ionizing source particle from a large plastic scintillator, i.e. the output signal. All simulations were performed using GATE v6.2 (GEANT4 Application for Tomographic Emission). The surface parameter finish (polished, ground, front-painted or back-painted) showed the greatest impact on the optical spectrum, whereas the surface parameter σα, which controls the surface roughness, had a relatively small impact. It was also shown how the surface parameter reflectivity and the reflectivity types (specular spike, specular lobe, Lambertian and backscatter) changed the optical spectrum depending on the probability of reflection and the combination of reflectivity types. A change in the optical spectrum will ultimately have an impact on a simulated energy spectrum. By studying the optical spectra presented in this work, a GEANT4 user can predict the shift in an optical spectrum caused by the alteration of a specific surface parameter. PMID:26046519
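
    As a sketch of where these surface parameters enter at the C++ level underneath a GATE simulation, a Geant4 optical surface and its properties might be defined as follows; the property names follow the UNIFIED-model conventions, the vector-based AddProperty overload assumes a recent Geant4 release, and the numerical values are arbitrary examples.

      // Sketch: define an optical boundary surface whose finish, roughness
      // (sigma_alpha) and reflectivity-type mix control optical-photon transport.
      #include <vector>
      #include "G4OpticalSurface.hh"
      #include "G4LogicalBorderSurface.hh"
      #include "G4MaterialPropertiesTable.hh"
      #include "G4VPhysicalVolume.hh"
      #include "G4SystemOfUnits.hh"

      void DefineScintillatorSurface(G4VPhysicalVolume* scintillator,
                                     G4VPhysicalVolume* wrapping)
      {
        auto* surface = new G4OpticalSurface("ScintWrapSurface");
        surface->SetType(dielectric_dielectric);
        surface->SetModel(unified);      // UNIFIED optical model
        surface->SetFinish(ground);      // finish: polished, ground, *-painted, ...
        surface->SetSigmaAlpha(0.1);     // micro-facet roughness (radians)

        // Reflectivity and the mix of reflection types; the remainder of the
        // specular-lobe/spike/backscatter fractions is treated as Lambertian.
        auto* mpt = new G4MaterialPropertiesTable();
        std::vector<G4double> energies = {2.0 * eV, 3.5 * eV};
        mpt->AddProperty("REFLECTIVITY",          energies, {0.95, 0.95});
        mpt->AddProperty("SPECULARLOBECONSTANT",  energies, {0.3, 0.3});
        mpt->AddProperty("SPECULARSPIKECONSTANT", energies, {0.2, 0.2});
        mpt->AddProperty("BACKSCATTERCONSTANT",   energies, {0.1, 0.1});
        surface->SetMaterialPropertiesTable(mpt);

        // Attach the surface to the boundary between the two placed volumes.
        new G4LogicalBorderSurface("ScintWrapBorder", scintillator, wrapping, surface);
      }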

  2. GEANT4 Simulation of the NPDGamma Experiment

    NASA Astrophysics Data System (ADS)

    Frlez, Emil

    2014-03-01

    The polarized-neutron capture experiment n + p → d + γ (NPDGamma), currently taking data at the Oak Ridge SNS facility, is a high-precision measurement of weak nuclear forces at low energies. By detecting the correlation between the cold-neutron spin and the photon direction in the capture of neutrons on a liquid hydrogen (LH) target, the experiment is sensitive to the properties of the neutral weak current. We have written a GEANT4 Monte Carlo simulation of the NPDGamma detector that, in addition to the active CsI detectors, also includes the different targets and passive materials as well as the beam line elements. The neutron beam energy spectrum, its profiles, divergences, and time-of-flight are simulated in detail. We have used the code to cross-calibrate the positions of (i) the polarized LH target, (ii) an aluminum target, and (iii) a CCl4 target. The responses of the 48 CsI detectors in the simulation were fixed using data taken on the LH target. Both neutron absorption and scattering and thermal processes were turned on in the GEANT4 physics lists. We use the results to simulate in detail the data obtained with the different targets used in the experiment, within a comprehensive analysis. This work is supported by NSF grant PHY-1307328.

  3. Visualization drivers for Geant4

    SciTech Connect

    Beretvas, Andy; /Fermilab

    2005-10-01

    This document is on Geant4 visualization tools (drivers), evaluating the pros and cons of each option and including recommendations on which tools to support at Fermilab for different applications. Four visualization drivers are evaluated: OpenGL, HepRep, DAWN and VRML. They all have good features: OpenGL provides graphic output without an intermediate file; HepRep provides menus to assist the user; DAWN provides high-quality plots and produces output quickly even for large files; VRML uses the smallest disk space for intermediate files. Large experiments at Fermilab will want to write their own display, and they should proceed to make this display graphics-independent. Medium-sized experiments will probably want to use HepRep because of its menu support. Smaller-scale experiments will want to use OpenGL in the spirit of having immediate response, good-quality output and keeping things simple.

  4. Geant4 - Towards major release 10

    NASA Astrophysics Data System (ADS)

    Cosmo, G.; Geant4 Collaboration

    2014-06-01

    The Geant4 simulation toolkit reached maturity in the middle of the previous decade, providing a wide variety of established features coherently aggregated in a software product which has become the standard for detector simulation in HEP and is used in a variety of other application domains. We review the most recent capabilities introduced in the kernel, highlighting those being prepared for the next major release (version 10.0), scheduled for the end of 2013. A significant new feature of this release will be the integration of multi-threaded processing, aiming at efficient use of modern many-core system architectures and at minimizing the memory footprint by exploiting event-level parallelism. We discuss its design features and its impact on the existing API and user interface of Geant4. Revisions are made to balance the need to preserve backwards compatibility against the consolidation and improvement of the interfaces, taking into account requirements from the multithreading extensions and from the evolution of the data processing models of the LHC experiments.
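
    As a minimal sketch of the event-level parallelism described above, the fragment below shows how a Geant4 10.0 application switches from the sequential run manager to the multi-threaded one. MyDetectorConstruction and MyActionInitialization are hypothetical user classes and QGSP_BERT is just one possible reference physics list; the run-manager choice and the thread count are the point here.

        // Sketch of a multi-threaded Geant4 10.0 main(); user classes are hypothetical.
        #include "G4MTRunManager.hh"
        #include "QGSP_BERT.hh"
        #include "MyDetectorConstruction.hh"    // hypothetical user geometry
        #include "MyActionInitialization.hh"    // hypothetical user actions

        int main()
        {
          G4MTRunManager* runManager = new G4MTRunManager;
          runManager->SetNumberOfThreads(4);    // worker threads process events in parallel

          runManager->SetUserInitialization(new MyDetectorConstruction);
          runManager->SetUserInitialization(new QGSP_BERT);
          runManager->SetUserInitialization(new MyActionInitialization);

          runManager->Initialize();
          runManager->BeamOn(100000);

          delete runManager;
          return 0;
        }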

  5. Probing Planetary Bodies for Subsurface Volatiles: GEANT4 Models of Gamma Ray, Fast, Epithermal, and Thermal Neutron Response to Active Neutron Illumination

    NASA Astrophysics Data System (ADS)

    Chin, G.; Sagdeev, R.; Su, J. J.; Murray, J.

    2014-12-01

    Using an active source of neutrons as an in situ probe of a planetary body has proven to be a powerful tool to extract information about the presence, abundance, and location of subsurface volatiles without the need for drilling. The Dynamic Albedo of Neutrons (DAN) instrument on Curiosity is an example of such an instrument and is designed to detect the location and abundance of hydrogen within the top 50 cm of the Martian surface. DAN works by sending a pulse of neutrons towards the ground beneath the rover and detecting the reflected neutrons. The intensity and time of arrival of the reflection depend on the proportion of water, while the time the pulse takes to reach the detector is a function of the depth at which the water is located. Similar instruments can also be effective probes at the polar regions of the Moon or on asteroids as a way of detecting sequestered volatiles. We present the results of GEANT4 particle simulation models of gamma ray, fast, epithermal, and thermal neutron responses to active neutron illumination. The results are parameterized by hydrogen abundance and by the stratification and depth of volatile layers, versus the distribution of neutron and gamma ray energy reflections. Models will be presented to approximate Martian, lunar, and asteroid environments and would be useful tools to assess utility for future NASA exploration missions to these types of planetary bodies.

  6. 6-MV photon beam modeling for the Varian Clinac iX by using the Geant4 virtual jaw

    NASA Astrophysics Data System (ADS)

    Kim, Byung Yong; Kim, Hyung Dong; Kim, Dong Ho; Baek, Jong Geun; Moon, Su Ho; Rho, Gwang Won; Kang, Jeong Ku; Kim, Sung Kyu

    2015-07-01

    Most virtual source models (VSMs) exclude the patient-dependent secondary collimator (jaw) from beam modeling. Unlike other components of the treatment head, the jaw absorbs many photons generated by bremsstrahlung, which decreases the efficiency of the simulation. In the present study, a new method of beam modeling using a virtual jaw was applied to improve the calculation efficiency of the VSM. This new method of beam modeling was designed so that no interactions are generated in the jaw. The results for the percentage depth dose and the profile of the virtual-jaw VSM calculated in a homogeneous water phantom agreed with measurements made with a CC13 cylinder-type ion chamber to within 2%, and the 80-20% penumbra width agreed with the measurements to within 0.6 mm. Compared with the existing VSM, in which a great number of photons are absorbed, the calculation efficiency of the VSM using the virtual jaw is expected to be increased by approximately 67%.

  7. Dosimetric evaluation of nuclear interaction models in the Geant4 Monte Carlo simulation toolkit for carbon-ion radiotherapy.

    PubMed

    Kameoka, S; Amako, K; Iwai, G; Murakami, K; Sasaki, T; Toshito, T; Yamashita, T; Aso, T; Kimura, A; Kanai, T; Kanematsu, N; Komori, M; Takei, Y; Yonai, S; Tashiro, M; Koikegami, H; Tomita, H; Koi, T

    2008-07-01

    We tested the ability of two separate nuclear reaction models, the binary cascade and JQMD (the JAERI version of Quantum Molecular Dynamics), to predict the dose distribution in carbon-ion radiotherapy. This was done by using a realistic simulation of the experimental irradiation of a water target. Comparison with measurement shows that the binary cascade model reproduces well the spread-out Bragg peak in depth-dose distributions in water irradiated with a 290 MeV/u (per nucleon) beam. However, it significantly overestimates the peak dose for a 400 MeV/u beam. JQMD underestimates the overall dose because of a tendency to break the nucleus into lower-Z fragments than the binary cascade model does. As far as the shape of the dose distribution is concerned, JQMD shows fairly good agreement with measurement for both beam energies of 290 and 400 MeV/u, which favors JQMD over the binary cascade model for the calculation of the relative dose distribution in treatment planning. PMID:20821145

  8. A Monte Carlo pencil beam scanning model for proton treatment plan simulation using GATE/GEANT4

    NASA Astrophysics Data System (ADS)

    Grevillot, L.; Bertrand, D.; Dessy, F.; Freud, N.; Sarrut, D.

    2011-08-01

    This work proposes a generic method for modeling scanned ion beam delivery systems, without simulation of the treatment nozzle and based exclusively on beam data library (BDL) measurements required for treatment planning systems (TPS). To this aim, new tools dedicated to treatment plan simulation were implemented in the Gate Monte Carlo platform. The method was applied to a dedicated nozzle from IBA for proton pencil beam scanning delivery. Optical and energy parameters of the system were modeled using a set of proton depth-dose profiles and spot sizes measured at 27 therapeutic energies. For further validation of the beam model, specific 2D and 3D plans were produced and then measured with appropriate dosimetric tools. Dose contributions from secondary particles produced by nuclear interactions were also investigated using field size factor experiments. Pristine Bragg peaks were reproduced with 0.7 mm range and 0.2 mm spot size accuracy. A 32 cm range spread-out Bragg peak with 10 cm modulation was reproduced with 0.8 mm range accuracy and a maximum point-to-point dose difference of less than 2%. A 2D test pattern consisting of a combination of homogeneous and high-gradient dose regions passed a 2%/2 mm gamma index comparison for 97% of the points. In conclusion, the generic modeling method proposed for scanned ion beam delivery systems was applicable to an IBA proton therapy system. The key advantage of the method is that it only requires BDL measurements of the system. The validation tests performed so far demonstrated that the beam model achieves clinical performance, paving the way for further studies toward TPS benchmarking. The method involves new sources that are available in the new Gate release V6.1 and could be further applied to other particle therapy systems delivering protons or other types of ions like carbon.

  9. Beam Simulation Tools for GEANT4 (BT-V1.0). User's Guide

    SciTech Connect

    Elvira, V. Daniel; Lebrum, P.; Spentzouris, P.

    2002-12-02

    Geant4 is a tool kit developed by a collaboration of physicists and computer professionals in the high energy physics field for simulation of the passage of particles through matter. The motivation for the development of the Beam Tools is to extend the Geant4 applications to accelerator physics. The Beam Tools are a set of C++ classes designed to facilitate the simulation of accelerator elements: r.f. cavities, magnets, absorbers, etc. These elements are constructed from Geant4 solid volumes like boxes, tubes, trapezoids, or spheres. There are many computer programs for beam physics simulations, but Geant4 is ideal to model a beam through a material or to integrate a beam line with a complex detector. There are many such examples in the current international High Energy Physics programs. For instance, an essential part of the R&D associated with the Neutrino Source/Muon Collider accelerator is the ionization cooling channel, which is a section of the system aimed at reducing the size of the muon beam in phase space. The ionization cooling technique uses a combination of linacs and light absorbers to reduce the transverse momentum and size of the beam, while keeping the longitudinal momentum constant. The MuCool/MICE (muon cooling) experiments need accurate simulations of the beam transport through the cooling channel in addition to a detailed simulation of the detectors designed to measure the size of the beam. The accuracy of the models for physics processes associated with muon ionization and multiple scattering is critical in this type of application. Another example is the simulation of the interaction region in future accelerators. The high luminosity and background environments expected in the Next Linear Collider (NLC) and the Very Large Hadron Collider (VLHC) place great demands on the detectors, which may be optimized by means of a simulation of the detector-accelerator interface.

  10. Comparative study of dose distributions and cell survival fractions for 1H, 4He, 12C and 16O beams using Geant4 and Microdosimetric Kinetic model

    NASA Astrophysics Data System (ADS)

    Burigo, Lucas; Pshenichnov, Igor; Mishustin, Igor; Bleicher, Marcus

    2015-04-01

    Depth and radial dose profiles for therapeutic 1H, 4He, 12C and 16O beams are calculated using the Geant4-based Monte Carlo model for Heavy-Ion Therapy (MCHIT). 4He and 16O ions are presented as alternative options to 1H and 12C broadly used for ion-beam cancer therapy. Biological dose profiles and survival fractions of cells are estimated using the modified Microdosimetric Kinetic model. Depth distributions of cell survival of healthy tissues, assuming 10% and 50% survival of tumor cells, are calculated for 6 cm SOBPs at two tumor depths and for different tissue radiosensitivities. It is found that the optimal ion choice depends on (i) depth of the tumor, (ii) dose levels and (iii) the contrast of radiosensitivities of tumor and surrounding healthy tissues. Our results indicate that 12C and 16O ions are more appropriate to spare healthy tissues in the case of a more radioresistant tumor at moderate depths. On the other hand, a sensitive tumor surrounded by more resistant tissues can be better treated with 1H and 4He ions. In general, the 4He beam is found to be a good candidate for therapy. It better spares healthy tissues in all considered cases compared to 1H. Besides, the dose conformation is improved for deep-seated tumors compared to 1H, and the damage to surrounding healthy tissues is reduced compared to heavier ions due to the lower impact of nuclear fragmentation. No definite advantages of 16O with respect to 12C ions are found in this study.

  11. Comparative study of dose distributions and cell survival fractions for 1H, 4He, 12C and 16O beams using Geant4 and Microdosimetric Kinetic model.

    PubMed

    Burigo, Lucas; Pshenichnov, Igor; Mishustin, Igor; Bleicher, Marcus

    2015-04-21

    Depth and radial dose profiles for therapeutic (1)H, (4)He, (12)C and (16)O beams are calculated using the Geant4-based Monte Carlo model for Heavy-Ion Therapy (MCHIT). (4)He and (16)O ions are presented as alternative options to (1)H and (12)C broadly used for ion-beam cancer therapy. Biological dose profiles and survival fractions of cells are estimated using the modified Microdosimetric Kinetic model. Depth distributions of cell survival of healthy tissues, assuming 10% and 50% survival of tumor cells, are calculated for 6 cm SOBPs at two tumor depths and for different tissue radiosensitivities. It is found that the optimal ion choice depends on (i) depth of the tumor, (ii) dose levels and (iii) the contrast of radiosensitivities of tumor and surrounding healthy tissues. Our results indicate that (12)C and (16)O ions are more appropriate to spare healthy tissues in the case of a more radioresistant tumor at moderate depths. On the other hand, a sensitive tumor surrounded by more resistant tissues can be better treated with (1)H and (4)He ions. In general, the (4)He beam is found to be a good candidate for therapy. It better spares healthy tissues in all considered cases compared to (1)H. Besides, the dose conformation is improved for deep-seated tumors compared to (1)H, and the damage to surrounding healthy tissues is reduced compared to heavier ions due to the lower impact of nuclear fragmentation. No definite advantages of (16)O with respect to (12)C ions are found in this study. PMID:25825827

  12. Assessment of Geant4 Prompt-Gamma Emission Yields in the Context of Proton Therapy Monitoring.

    PubMed

    Pinto, Marco; Dauvergne, Denis; Freud, Nicolas; Krimmer, Jochen; Létang, Jean M; Testa, Etienne

    2016-01-01

    Monte Carlo tools have been long used to assist the research and development of solutions for proton therapy monitoring. The present work focuses on the prompt-gamma emission yields by comparing experimental data with the outcomes of the current version of Geant4 using all applicable proton inelastic models. For the case under study and using the binary cascade model, it was found that Geant4 overestimates the prompt-gamma emission yields by 40.2 ± 0.3%, even though it predicts the prompt-gamma profile length of the experimental profile accurately. In addition, the default implementations of all proton inelastic models show an overestimation in the number of prompt gammas emitted. Finally, a set of built-in options and physically sound Geant4 source code changes have been tested in order to try to improve the discrepancy observed. A satisfactory agreement was found when using the QMD model with a wave packet width equal to 1.3 fm(2). PMID:26858937

  13. Assessment of Geant4 Prompt-Gamma Emission Yields in the Context of Proton Therapy Monitoring

    PubMed Central

    Pinto, Marco; Dauvergne, Denis; Freud, Nicolas; Krimmer, Jochen; Létang, Jean M.; Testa, Etienne

    2016-01-01

    Monte Carlo tools have been long used to assist the research and development of solutions for proton therapy monitoring. The present work focuses on the prompt-gamma emission yields by comparing experimental data with the outcomes of the current version of Geant4 using all applicable proton inelastic models. For the case under study and using the binary cascade model, it was found that Geant4 overestimates the prompt-gamma emission yields by 40.2 ± 0.3%, even though it predicts the prompt-gamma profile length of the experimental profile accurately. In addition, the default implementations of all proton inelastic models show an overestimation in the number of prompt gammas emitted. Finally, a set of built-in options and physically sound Geant4 source code changes have been tested in order to try to improve the discrepancy observed. A satisfactory agreement was found when using the QMD model with a wave packet width equal to 1.3 fm². PMID:26858937

  14. Optical simulation of monolithic scintillator detectors using GATE/GEANT4.

    PubMed

    van der Laan, D J Jan; Schaart, Dennis R; Maas, Marnix C; Beekman, Freek J; Bruyndonckx, Peter; van Eijk, Carel W E

    2010-03-21

    Much research is being conducted on position-sensitive scintillation detectors for medical imaging, particularly for emission tomography. Monte Carlo simulations play an essential role in many of these research activities. Since the scintillation process, the transport of scintillation photons through the crystal(s), and the conversion of these photons into electronic signals each have a major influence on the detector performance, all of these processes may need to be incorporated in the model to obtain accurate results. In this work the optical and scintillation models of the GEANT4 simulation toolkit are validated by comparing simulations and measurements on monolithic scintillator detectors for high-resolution positron emission tomography (PET). We have furthermore made the GEANT4 optical models available within the user-friendly GATE simulation platform (as of version 3.0). It is shown how the necessary optical input parameters can be determined with sufficient accuracy. The results show that the optical physics models of GATE/GEANT4 enable accurate prediction of the spatial and energy resolution of monolithic scintillator PET detectors. PMID:20182005

  15. A macroscopic and microscopic study of radon exposure using Geant4 and MCNPX to estimate dose rates and DNA damage

    NASA Astrophysics Data System (ADS)

    van den Akker, Mary Evelyn

    Radon is considered the second-leading cause of lung cancer after smoking. Epidemiological studies have been conducted in miner cohorts as well as general populations to estimate the risks associated with high and low dose exposures. There are problems with extrapolating risk estimates to low dose exposures, mainly that the dose-response curve at low doses is not well understood. Calculated dosimetric quantities give average energy depositions in an organ or a whole body, but morphological features of an individual can affect these values. As opposed to human phantom models, Computed Tomography (CT) scans provide unique, patient-specific geometries that are valuable in modeling the radiological effects of the short-lived radon progeny sources. The Monte Carlo particle transport code Geant4 was used with the CT scan data to model radon inhalation in the main bronchial bifurcation. The equivalent dose rates are near the lower bounds of estimates found in the literature, depending on source volume. To complement the macroscopic study, simulations were run in a small tissue volume with the Geant4-DNA toolkit. As an expansion of Geant4 meant to simulate direct physical interactions at the cellular level, Geant4-DNA allows the particle track structure of the radon progeny alphas to be analyzed to estimate the damage that can occur in sensitive cellular structures like the DNA molecule. These estimates of DNA double strand breaks are lower than those found in Geant4-DNA studies. Further refinements of the microscopic model are at the cutting edge of nanodosimetry research.

  16. Alpha Coincidence Spectroscopy studied with GEANT4

    SciTech Connect

    Dion, Michael P.; Miller, Brian W.; Tatishvili, Gocha; Warren, Glen A.

    2013-11-02

    The high-energy side of peaks in alpha spectra, e.g. 241Am, as measured with a silicon detector has structure caused mainly by alpha-conversion electron and, to some extent, alpha-gamma coincidences. We compare GEANT4 simulation results to 241Am alpha spectroscopy measurements with a passivated implanted planar silicon detector. A large discrepancy between the measurements and simulations suggests that the GEANT4 photon evaporation database for 237Np (the daughter of 241Am decay) does not accurately describe the conversion electron spectrum. We describe how to improve the agreement between GEANT4 and alpha spectroscopy for actinides of interest by including experimental measurements of conversion electron spectroscopy in the photon evaporation database.

  17. Geant4 VMC 3.0

    NASA Astrophysics Data System (ADS)

    Hřivnáčová, I.; Gheata, A.

    2015-12-01

    Virtual Monte Carlo (VMC) [1] provides an abstract interface into Monte Carlo transport codes. A VMC-based user application, independent of the specific Monte Carlo code, can then be run with any of the supported simulation programs. Developed by the ALICE Offline Project and further included in ROOT [2], the interface and implementations have reached stability during the last decade and have become a foundation for other detector simulation frameworks, the FAIR facility experiments framework being among the first and largest. Geant4 VMC [3], which provides the implementation of the VMC interface for Geant4 [4], is in continuous maintenance and development, driven by the evolution of Geant4 on one side and requirements from users on the other side. Besides the implementation of the VMC interface, Geant4 VMC also provides a set of examples that demonstrate the use of VMC to new users and also serve for testing purposes. Since major release 2.0, it includes the G4Root navigator package, which implements an interface that allows one to run a Geant4 simulation using a ROOT geometry. The release of Geant4 version 10.00 with the integration of multithreading processing has triggered the development of the next major version of Geant4 VMC (version 3.0), which was released in November 2014. A beta version, available for user testing since March, has helped its consolidation and improvement. We will review the new capabilities introduced in this major version, in particular the integration of multithreading into the VMC design, its impact on the Geant4 VMC and G4Root packages, and the introduction of a new package, MTRoot, providing utility functions for ROOT parallel output in independent files with necessary additions for thread-safety. Migration of user applications to multithreading that preserves the ease of use of VMC will also be discussed. We will also report on the introduction of a new CMake [5] based build system, the migration to ROOT major release 6 and the

  18. Medical Applications of the Geant4 Toolkit

    NASA Astrophysics Data System (ADS)

    Agostinelli, S.; Chauvie, S.; Foppiano, F.; Garelli, S.; Marchetto, F.; Pia, M. G.; Nieminen, P.; Rolando, V.; Solano, A.

    A powerful and suitable tool for attacking the problem of the production and transport of different beams in biological matter is offered by the Geant4 Simulation Toolkit. Various activities in progress in the domain of medical applications are presented: studies on the calibration of brachytherapy sources and thermoluminescent dosimeters, studies of a complete 3-D inline dosimeter, development of general tools for a CT interface for treatment planning, studies involving neutron transport, etc. A novel approach, based on the Geant4 Toolkit, for the study of radiation damage at the cellular and DNA level, is also presented.

  19. Space Earthquake Perturbation Simulation (SEPS) an application based on Geant4 tools to model and simulate the interaction between the Earthquake and the particle trapped on the Van Allen belt

    NASA Astrophysics Data System (ADS)

    Ambroglini, Filippo; Jerome Burger, William; Battiston, Roberto; Vitale, Vincenzo; Zhang, Yu

    2014-05-01

    During the last decades, a few space experiments have revealed anomalous bursts of charged particles, mainly electrons with energies larger than a few MeV. A possible source of these bursts is the low-frequency seismo-electromagnetic emissions, which can cause the precipitation of electrons from the lower boundary of their inner belt. Studies of these bursts also reported a short-term pre-seismic excess. Starting from simulation tools traditionally used in high energy physics, we developed a dedicated application, SEPS (Space Earthquake Perturbation Simulation), based on the Geant4 toolkit and the PLANETOCOSMICS program, able to model and simulate the electromagnetic interaction between an earthquake and the particles trapped in the inner Van Allen belt. With SEPS one can study the transport of particles trapped in the Van Allen belts through the Earth's magnetic field, also taking into account possible interactions with the Earth's atmosphere. SEPS provides the possibility of testing different models of interaction between electromagnetic waves and trapped particles, defining the mechanism of interaction as well as shaping the area in which it takes place, assessing the effects of perturbations in the magnetic field on the particle paths, performing back-tracking analysis, and modelling the interaction with electric fields. SEPS is at an advanced development stage, so it can already be exploited to test in detail the results of correlation analyses between particle bursts and earthquakes based on NOAA and SAMPEX data. The test was performed both with a full simulation analysis (tracing from the position of the earthquake to check whether there were paths compatible with the detected burst) and with a back-tracking analysis (tracing from the burst detection point and checking the compatibility with the position of the associated earthquake).

  20. Accurate simulations of TEPC neutron spectra using Geant4

    NASA Astrophysics Data System (ADS)

    Taylor, G. C.; Hawkes, N. P.; Shippen, A.

    2015-11-01

    A Geant4 model of a tissue-equivalent proportional counter (TEPC) has been developed in which the calculated output spectrum exhibits unparalleled agreement with experiment for monoenergetic neutron fields at several energies below 20 MeV. The model uses the standard release of the Geant4 9.6 p2 code, but with a non-standard neutron cross section file as provided by Mendoza et al., and with the environment variable options recommended by the same authors. This configuration was found to produce significant improvements in the alpha-dominated region of the calculated response. In this paper, these improvements are presented, and the post-processing required to convert deposited energy into the number of ion pairs (which is the quantity actually measured experimentally) is discussed.

  1. GEANT4 Simulation of Hadronic Interactions at 8-GeV/C to 10-GeV/C: Response to the HARP-CDP Group

    SciTech Connect

    Uzhinsky, V.; Apostolakis, J.; Folger, G.; Ivanchenko, V.N.; Kossov, M.V.; Wright, D.H.; /SLAC

    2011-11-21

    The results of the HARP-CDP group on the comparison of GEANT4 Monte Carlo predictions versus experimental data are discussed. It is shown that the problems observed by the group are caused by an incorrect implementation of old features at the programming level, and by a lack of the nucleon Fermi motion in the simulation of quasielastic scattering. These drawbacks are not due to the physical models used. They do not manifest themselves in the most important applications of the GEANT4 toolkit.

  2. Response of a proportional counter to 37Ar and 71Ge: Measured spectra versus Geant4 simulation

    NASA Astrophysics Data System (ADS)

    Abdurashitov, D. N.; Malyshkin, Yu. M.; Matushko, V. L.; Suerfu, B.

    2016-04-01

    The energy deposition spectra of 37Ar and 71Ge in a miniature proportional counter are measured and compared in detail to the model response simulated with Geant4. A certain modification of the Geant4 code, making it possible to trace the deexcitation of atomic shells properly, is suggested. Modified Geant4 is able to reproduce a response of particle detectors in detail in the keV energy range. This feature is very important for the laboratory experiments that search for massive sterile neutrinos as well as for dark matter searches that employ direct detection of recoil nuclei. This work demonstrates the reliability of Geant4 simulation at low energies.

  3. Modeling the TrueBeam linac using a CAD to Geant4 geometry implementation: Dose and IAEA-compliant phase space calculations

    SciTech Connect

    Constantin, Magdalena; Perl, Joseph; LoSasso, Tom; Salop, Arthur; Whittum, David; Narula, Anisha; Svatos, Michelle; Keall, Paul J.

    2011-07-15

    Purpose: To create an accurate 6 MV Monte Carlo simulation phase space for the Varian TrueBeam treatment head geometry imported from CAD (computer aided design) without adjusting the input electron phase space parameters. Methods: Geant4 v4.9.2.p01 was employed to simulate the 6 MV beam treatment head geometry of the Varian TrueBeam linac. The electron tracks in the linear accelerator were simulated with Parmela, and the obtained electron phase space was used as an input to the Monte Carlo beam transport and dose calculations. The geometry components are tessellated solids included in Geant4 as GDML (Geometry Description Markup Language) files obtained via STEP (standard for the exchange of product) export from Pro/Engineering, followed by STEP import in Fastrad, a STEP-GDML converter. The linac has a compact treatment head and the small space between the shielding collimator and the divergent arc of the upper jaws forbids the implementation of a plane for storing the phase space. Instead, an IAEA (International Atomic Energy Agency) compliant phase space writer was implemented on a cylindrical surface. The simulation was run in parallel on a 1200 node Linux cluster. The 6 MV dose calculations were performed for field sizes varying from 4 × 4 to 40 × 40 cm². The voxel size for the 60 × 60 × 40 cm³ water phantom was 4 × 4 × 4 mm³. For the 10 × 10 cm² field, surface buildup calculations were performed using 4 × 4 × 2 mm³ voxels within 20 mm of the surface. Results: For the depth dose curves, 98% of the calculated data points agree within 2% with the experimental measurements for depths between 2 and 40 cm. For depths between 5 and 30 cm, agreement within 1% is obtained for 99% (4 × 4), 95% (10 × 10), 94% (20 × 20 and 30 × 30), and 89% (40 × 40) of the data points, respectively. In the buildup region, the agreement is within 2%, except at 1 mm depth where the deviation is 5% for the 10 × 10 cm² open field. For the lateral dose profiles, within the field size

  4. Validation of the GEANT4 simulation of bremsstrahlung from thick targets below 3 MeV

    NASA Astrophysics Data System (ADS)

    Pandola, L.; Andenna, C.; Caccia, B.

    2015-05-01

    The bremsstrahlung spectra produced by electrons impinging on thick targets are simulated using the GEANT4 Monte Carlo toolkit. Simulations are validated against experimental data available in the literature for energies between 0.5 and 2.8 MeV for Al and Fe targets and at 70 keV for Al, Ag, W and Pb targets. The energy spectra for the different configurations of emission angles, energies and targets are considered. Simulations are performed by using the three alternative sets of electromagnetic models that are available in GEANT4 to describe bremsstrahlung. At the higher energies (0.5-2.8 MeV) of the electrons impinging on Al and Fe targets, GEANT4 is able to reproduce the spectral shapes and the integral photon emission in the forward direction. The agreement is within 10-30%, depending on energy, emission angle and target material. The physics model based on the Penelope Monte Carlo code is in slightly better agreement with the measured data than the other two. However, all models over-estimate the photon emission in the backward hemisphere. For the lower energy study (70 keV), which includes higher-Z targets, all models systematically under-estimate the total photon yield, providing agreement between 10% and 50%. The results of this work are of potential interest for medical physics applications, where knowledge of the energy spectra and angular distributions of photons is needed for accurate dose calculations with Monte Carlo and other fluence-based methods.

  5. GEANT4 simulation of the effects of Doppler energy broadening in Compton imaging.

    PubMed

    Uche, C Z; Cree, M J; Round, W H

    2011-09-01

    A Monte Carlo approach was used to study the effects of Doppler energy broadening on Compton camera performance. The GEANT4 simulation toolkit was used to model the radiation transport and interactions with matter in a simulated Compton camera. The low energy electromagnetic physics model of GEANT4 incorporating Doppler broadening developed by Longo et al. was used in the simulations. The camera had a 9 × 9 cm scatterer and a 10 × 10 cm absorber with a scatterer-to-absorber separation of 5 cm. Modelling was done such that only the effects of Doppler broadening were taken into consideration; the effects of scatterer and absorber thickness and pixelation were not taken into account, thus a 'perfect' Compton camera was assumed. Scatterer materials were either silicon or germanium and the absorber material was cadmium zinc telluride. Simulations were done for point sources 10 cm in front of the scatterer. The results of the simulations validated the use of the low energy model of GEANT4. As expected, Doppler broadening was found to degrade the Compton camera imaging resolution. For a 140.5 keV source the resulting full-width-at-half-maximum (FWHM) of the point source image without accounting for Doppler broadening and using a silicon scatterer was 0.58 mm. This degraded to 7.1 mm when Doppler broadening was introduced and degraded further to 12.3 mm when a germanium scatterer was used instead of silicon. For a 511 keV source, however, the FWHM was better than for the 140.5 keV source: it improved to 2.4 mm for a silicon scatterer and 4.6 mm for a germanium scatterer. Our result for silicon at 140.5 keV is in very good agreement with that published by An et al. PMID:21556971

  6. GEANT4 for breast dosimetry: parameters optimization study

    NASA Astrophysics Data System (ADS)

    Fedon, C.; Longo, F.; Mettivier, G.; Longo, R.

    2015-08-01

    Mean glandular dose (MGD) is the main dosimetric quantity in mammography. MGD evaluation is obtained by multiplying the entrance skin air kerma (ESAK) by normalized glandular dose (DgN) coefficients. While ESAK is an empirical quantity, DgN coefficients can only be estimated with Monte Carlo (MC) methods. Thus, a benchmark of MC parameters is needed for effectively evaluating DgN coefficients. GEANT4 is a MC toolkit suitable for medical purposes that offers its users several computational choices. In this work we investigate the GEANT4 performance by testing the main PhysicsLists for medical applications. Four electromagnetic PhysicsLists were implemented: the linear attenuation coefficients were calculated for breast glandularity 0%, 50%, 100% in the energy range 8-50 keV and DgN coefficients were evaluated. The results were compared with published data. Fit equations for the estimation of the G-factor parameter, introduced by the literature for converting the dose delivered in the heterogeneous medium to that in the glandular tissue, are proposed, and the application of this parameter interaction-by-interaction or retrospectively is discussed. G4EmLivermorePhysicsList shows the best agreement for the linear attenuation coefficients both with theoretical values and published data. Moreover, an excellent correlation factor (r² > 0.99) is found for the DgN coefficients with the literature. The final goal of this study is to identify, for the first time, a benchmark of parameters that could be useful for future breast dosimetry studies with GEANT4.
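
    As a minimal sketch of how such a comparison of electromagnetic PhysicsLists is usually set up in GEANT4, the fragment below registers one EM constructor in a modular physics list; swapping the registered constructor (e.g. G4EmLivermorePhysics, G4EmPenelopePhysics, G4EmStandardPhysics_option4) selects the alternative models. The class name and the particular constructors shown are illustrative and not necessarily the four lists evaluated in the study.

        // Sketch: selecting an electromagnetic constructor in a modular physics list.
        #include "G4VModularPhysicsList.hh"
        #include "G4EmLivermorePhysics.hh"
        // Alternatives: "G4EmPenelopePhysics.hh", "G4EmStandardPhysics_option4.hh", ...

        class BreastDosimetryPhysicsList : public G4VModularPhysicsList
        {
         public:
          BreastDosimetryPhysicsList()
          {
            // Only the EM part relevant to the comparison is shown here.
            RegisterPhysics(new G4EmLivermorePhysics());
            // RegisterPhysics(new G4EmPenelopePhysics());
            // RegisterPhysics(new G4EmStandardPhysics_option4());
          }
        };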

  7. GEANT4 for breast dosimetry: parameters optimization study.

    PubMed

    Fedon, C; Longo, F; Mettivier, G; Longo, R

    2015-08-21

    Mean glandular dose (MGD) is the main dosimetric quantity in mammography. MGD evaluation is obtained by multiplying the entrance skin air kerma (ESAK) by normalized glandular dose (DgN) coefficients. While ESAK is an empirical quantity, DgN coefficients can only be estimated with Monte Carlo (MC) methods. Thus, a benchmark of MC parameters is needed for effectively evaluating DgN coefficients. GEANT4 is a MC toolkit suitable for medical purposes that offers its users several computational choices. In this work we investigate the GEANT4 performance by testing the main PhysicsLists for medical applications. Four electromagnetic PhysicsLists were implemented: the linear attenuation coefficients were calculated for breast glandularity 0%, 50%, 100% in the energy range 8-50 keV and DgN coefficients were evaluated. The results were compared with published data. Fit equations for the estimation of the G-factor parameter, introduced by the literature for converting the dose delivered in the heterogeneous medium to that in the glandular tissue, are proposed, and the application of this parameter interaction-by-interaction or retrospectively is discussed. G4EmLivermorePhysicsList shows the best agreement for the linear attenuation coefficients both with theoretical values and published data. Moreover, an excellent correlation factor (r² > 0.99) is found for the DgN coefficients with the literature. The final goal of this study is to identify, for the first time, a benchmark of parameters that could be useful for future breast dosimetry studies with GEANT4. PMID:26267405

  8. GEANT4 simulation of cyclotron radioisotope production in a solid target.

    PubMed

    Poignant, F; Penfold, S; Asp, J; Takhar, P; Jackson, P

    2016-05-01

    The use of radioisotopes in nuclear medicine is essential for diagnosing and treating cancer. The optimization of their production is a key factor in maximizing the production yield and minimizing the associated costs. An efficient approach to this problem is the use of Monte Carlo simulations prior to experimentation. By predicting isotope yields, one can study the expected activity of the isotope of interest for different energy ranges. One can also study the contamination of the target with other radioisotopes, especially undesired radioisotopes of the desired chemical element, which are difficult to separate from the irradiated target and might increase the dose when delivering the radiopharmaceutical product to the patient. The aim of this work is to build and validate a Monte Carlo simulation platform using the GEANT4 toolkit to model the solid target system of the South Australian Health and Medical Research Institute (SAHMRI) GE Healthcare PETtrace cyclotron. It includes a GEANT4 Graphical User Interface (GUI) where the user can modify simulation parameters such as the energy, shape and current of the proton beam, the target geometry and material, the foil geometry and material and the time of irradiation. The paper describes the simulation and presents a comparison of simulated and experimental/theoretical yields for various nuclear reactions on an enriched nickel-64 target using the GEANT4 physics model QGSP_BIC_AllHP, a model recently developed to evaluate with high precision the interaction of protons with energies below 200 MeV, available in Geant4 version 10.1. The simulation yield of the (64)Ni(p,n)(64)Cu reaction was found to be 7.67 ± 0.074 mCi·μA(-1) for a target energy range of 9-12 MeV. Szelecsenyi et al. (1993) gives a theoretical yield of 6.71 mCi·μA(-1) and an experimental yield of 6.38 mCi·μA(-1). The (64)Ni(p,n)(64)Cu cross section obtained with the simulation was also verified against the yield predicted from the nuclear database TENDL and

  9. Application of dynamic Monte Carlo technique in proton beam radiotherapy using Geant4 simulation toolkit

    NASA Astrophysics Data System (ADS)

    Guan, Fada

    The Monte Carlo method has been successfully applied to simulating particle transport problems. Most Monte Carlo simulation tools are static, and they can only be used to perform static simulations for problems with fixed physics and geometry settings. Proton therapy is a dynamic treatment technique in clinical application. In this research, we developed a method to perform dynamic Monte Carlo simulation of proton therapy using the Geant4 simulation toolkit. A passive-scattering treatment nozzle equipped with a rotating range modulation wheel was modeled in this research. One important application of the Monte Carlo simulation is to predict the spatial dose distribution in the target geometry. For simplification, a mathematical model of a human body is usually used as the target, but only the average dose over the whole organ or tissue can be obtained rather than the accurate spatial dose distribution. In this research, we developed a method using MATLAB to convert the medical images of a patient from CT scanning into the patient voxel geometry. Hence, if the patient voxel geometry is used as the target in the Monte Carlo simulation, the accurate spatial dose distribution in the target can be obtained. The data analysis tool ROOT was used to score the simulation results during a Geant4 simulation and to analyze the data and plot results after the simulation. Finally, we successfully obtained the accurate spatial dose distribution in part of a human body for a prostate cancer patient treated with proton therapy.

  10. Integration of g4tools in Geant4

    NASA Astrophysics Data System (ADS)

    Hřivnáčová, Ivana

    2014-06-01

    g4tools, originally part of the inlib and exlib packages, provides a very light and easy-to-install set of C++ classes that can be used to perform analysis in a Geant4 batch program. It allows the user to create and manipulate histograms and ntuples, and to write them in the supported file formats (ROOT, AIDA XML, CSV and HBOOK). It is integrated in Geant4 through analysis manager classes, thus providing a uniform interface to the g4tools objects and also hiding the differences between the classes for the different supported output formats. Moreover, additional features, such as histogram activation or support for Geant4 units, are implemented in the analysis classes following user requests. A set of Geant4 user interface commands allows the user to create histograms and set their properties interactively or in Geant4 macros. g4tools was first introduced in the Geant4 9.5 release, where its use was demonstrated in one basic example, and it is already used in a majority of the Geant4 examples within the Geant4 9.6 release. In this paper, we will give an overview and the present status of the integration of g4tools in Geant4 and report on upcoming new features.
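
    The fragment below is a minimal sketch of the analysis manager interface described above, using the ROOT output format; the file name, histogram definition and the fact that everything is done in a single function (rather than split between run and event actions, as is usual) are purely illustrative.

        // Sketch of the g4tools-based analysis interface (ROOT output flavour).
        #include "g4root.hh"              // selects the ROOT flavour of G4AnalysisManager
        #include "G4SystemOfUnits.hh"
        #include "globals.hh"

        void BookFillAndWrite(G4double edep)
        {
          G4AnalysisManager* analysisManager = G4AnalysisManager::Instance();
          analysisManager->OpenFile("output");   // written as output.root
          G4int id = analysisManager->CreateH1("Edep", "Energy deposit", 100, 0., 10.*MeV);
          analysisManager->FillH1(id, edep);     // normally called once per event
          analysisManager->Write();
          analysisManager->CloseFile();
        }

    The same histogram could equally be booked interactively with the /analysis/h1/create family of UI commands mentioned above.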

  11. Particles Production in Extensive Air Showers: GEANT4 vs CORSIKA

    NASA Astrophysics Data System (ADS)

    Sabra, M. S.; Watts, J. W.; Christl, M. J.

    2014-09-01

    Air shower simulations are essential tools for the interpretation of Extensive Air Shower (EAS) measurements. The reliability of these codes is evaluated by comparisons with equivalent simulation calculations, and with experimental data (when available). In this work, we present GEANT4 calculations of particle production in EAS induced by primary protons and iron in the PeV (10¹⁵ eV) energy range. The calculations, using different hadronic models, are compared with the results from the well-known air shower simulation code CORSIKA, and the results of this comparison will be discussed. This work is supported by the NASA Postdoctoral Program administered by Oak Ridge Associated Universities.

  12. geant4 hadronic cascade models analysis of proton and charged pion transverse momentum spectra from p + Cu and Pb collisions at 3, 8, and 15 GeV/c

    SciTech Connect

    Abdel-Waged, Khaled; Felemban, Nuha; Uzhinskii, V. V.

    2011-07-15

    We describe how various hadronic cascade models, which are implemented in the geant4 toolkit, describe proton and charged pion transverse momentum spectra from p + Cu and Pb collisions at 3, 8, and 15 GeV/c, recently measured in the hadron production (HARP) experiment at CERN. The Binary, ultrarelativistic quantum molecular dynamics (UrQMD) and modified FRITIOF (FTF) hadronic cascade models are chosen for investigation. The first two models are based on limited (Binary) and branched (UrQMD) binary scattering between cascade particles which can be either a baryon or meson, in the three-dimensional space of the nucleus, while the latter (FTF) considers collective interactions between nucleons only, on the plane of impact parameter. It is found that the slow (pT ≤ 0.3 GeV/c) proton spectra are quite sensitive to the different treatments of cascade pictures, while the fast (pT > 0.3 GeV/c) proton spectra are not strongly affected by the differences between the FTF and UrQMD models. It is also shown that the UrQMD and FTF combined with Binary (FTFB) models could reproduce both proton and charged pion spectra from p + Cu and Pb collisions at 3, 8, and 15 GeV/c with the same accuracy.

  13. Behaviors of the percentage depth dose curves along the beam axis of a phantom filled with different clinical PTO objects, a Monte Carlo Geant4 study

    NASA Astrophysics Data System (ADS)

    EL Bakkali, Jaafar; EL Bardouni, Tarek; Safavi, Seyedmostafa; Mohammed, Maged; Saeed, Mroan

    2016-08-01

    The aim of this work is to assess the capability of the Monte Carlo Geant4 code to reproduce the real percentage depth dose (PDD) curves generated in phantoms which mimic three important clinical treatment situations: lung slab, bone slab, and bone-lung slab geometries. It is hoped that this work will lead us to a better understanding of dose distributions in an inhomogeneous medium, and help to identify any limitations of the dose calculation algorithm implemented in the Geant4 code. For this purpose, the PDD dosimetric functions associated with the three clinical situations described above were compared to the one produced in a homogeneous water phantom. Our results show, firstly, that the Geant4 simulation shows potential mistakes in the shape of the calculated PDD curve of the first physical test object (PTO), and that it is not able to successfully predict dose values in regions near the boundaries between two different materials. This is surely due to the electron transport algorithm, and it is well known as the interface artifact phenomenon. To deal with this issue, we added and optimized the StepMax parameter in the dose calculation program; consequently, the artifacts due to electron transport almost disappeared. However, the Geant4 simulation becomes painfully slow when we attempt to completely resolve the electron artifact problems by using a smaller value of the electron StepMax parameter. After electron transport optimization, our results demonstrate the medium-level capability of the Geant4 code to model dose distributions in clinical PTO objects.
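
    The StepMax optimization mentioned above is usually realized in Geant4 through a user step limit attached to the volume of interest together with the step-limiter process; the sketch below shows one common way to do this. The function name, the volume pointer and the 0.1 mm value are illustrative assumptions, not the settings used by the authors.

        // Sketch: limiting the (electron) step length inside a given volume.
        #include "G4LogicalVolume.hh"
        #include "G4UserLimits.hh"
        #include "G4StepLimiter.hh"
        #include "G4ProcessManager.hh"
        #include "G4Electron.hh"
        #include "G4SystemOfUnits.hh"

        void LimitStepInPhantom(G4LogicalVolume* phantomLogical)
        {
          // 1. Attach a maximum allowed step to the phantom volume.
          phantomLogical->SetUserLimits(new G4UserLimits(0.1*mm));

          // 2. Make sure the step-limiter process is attached to electrons
          //    (normally done once, in the physics list's ConstructProcess()).
          G4ProcessManager* pm = G4Electron::Electron()->GetProcessManager();
          pm->AddDiscreteProcess(new G4StepLimiter());
        }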

  14. SU-E-T-347: Validation of the Condensed History Algorithm of Geant4 Using the Fano Test

    SciTech Connect

    Lee, H; Mathis, M; Sawakuchi, G

    2014-06-01

    Purpose: To validate the condensed history algorithm and physics of the Geant4 Monte Carlo toolkit for simulations of ionization chambers (ICs). This study is the first step to validate Geant4 for calculations of photon beam quality correction factors under the presence of a strong magnetic field for magnetic resonance guided linac system applications. Methods: The electron transport and boundary crossing algorithms of Geant4 version 9.6.p02 were tested under Fano conditions using the Geant4 example/application FanoCavity. User-defined parameters of the condensed history and multiple scattering algorithms were investigated under Fano test conditions for three scattering models (physics lists): G4UrbanMscModel95 (PhysListEmStandard-option3), G4GoudsmitSaundersonMsc (PhysListEmStandard-GS), and G4WentzelVIModel/G4CoulombScattering (PhysListEmStandard-WVI). Simulations were conducted using monoenergetic photon beams, ranging from 0.5 to 7 MeV and emphasizing energies from 0.8 to 3 MeV. Results: The GS and WVI physics lists provided consistent Fano test results (within ±0.5%) for maximum step sizes under 0.01 mm at 1.25 MeV, with improved performance at 3 MeV (within ±0.25%). The option3 physics list provided consistent Fano test results (within ±0.5%) for maximum step sizes above 1 mm. Optimal parameters for the option3 physics list were 10 km maximum step size with default values for other user-defined parameters: 0.2 dRoverRange, 0.01 mm final range, 0.04 range factor, 2.5 geometrical factor, and 1 skin. Simulations using the option3 physics list were ∼70 – 100 times faster compared to GS and WVI under optimal parameters. Conclusion: This work indicated that the option3 physics list passes the Fano test within ±0.5% when using a maximum step size of 10 km for energies suitable for IC calculations in a 6 MV spectrum without extensive computational times. Optimal user-defined parameters using the option3 physics list will be used in future IC simulations to
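
    For reference, the user-defined parameters listed above (dRoverRange, final range, range factor, geometrical factor, skin) map onto standard electromagnetic UI commands; a sketch using ApplyCommand is shown below with the "optimal" option3 values quoted in the abstract. The command paths follow the EM messengers of Geant4 versions of that era, so their exact availability should be checked against the version in use.

        // Sketch: setting the condensed-history parameters quoted above.
        #include "G4UImanager.hh"

        void ApplyOption3Settings()
        {
          G4UImanager* ui = G4UImanager::GetUIpointer();
          ui->ApplyCommand("/process/eLoss/StepFunction 0.2 0.01 mm");  // dRoverRange, final range
          ui->ApplyCommand("/process/msc/RangeFactor 0.04");
          ui->ApplyCommand("/process/msc/GeomFactor 2.5");
          ui->ApplyCommand("/process/msc/Skin 1");
        }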

  15. CMS validation experience: Test-beam 2004 data vs GEANT4

    SciTech Connect

    Piperov, Stefan; /Fermilab /Sofiya, Inst. Nucl. Res.

    2007-01-01

    A comparison between the Geant4 Monte-Carlo simulation of CMS Detector's Calorimetric System and data from the 2004 Test-Beam at CERN's SPS H2 beam-line is presented. The overall simulated response agrees quite well with the measured response. Slight differences in the longitudinal shower profiles between the MC predictions made with different Physics Lists are observed.

  16. GEANT4 calculations of neutron dose in radiation protection using a homogeneous phantom and a Chinese hybrid male phantom.

    PubMed

    Geng, Changran; Tang, Xiaobin; Guan, Fada; Johns, Jesse; Vasudevan, Latha; Gong, Chunhui; Shu, Diyun; Chen, Da

    2016-03-01

    The purpose of this study is to verify the feasibility of applying GEANT4 (version 10.01) to neutron dose calculations in radiation protection by comparing the calculation results with MCNP5. The depth dose distributions are investigated in a homogeneous phantom, and the fluence-to-dose conversion coefficients are calculated for different organs in the Chinese hybrid male phantom for neutrons with energy ranging from 1 × 10(-9) to 10 MeV. By comparing the simulation results between GEANT4 and MCNP5, it is shown that, using the high-precision (HP) neutron physics list, GEANT4 produces the closest simulation results to MCNP5. However, differences could be observed when the neutron energy is lower than 1 × 10(-6) MeV. Activating the thermal scattering with an S matrix correction in GEANT4 with HP and in MCNP5 in the thermal energy range can reduce the difference between these two codes. PMID:26156875
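
    As an illustration of the thermal-scattering option referred to above, the sketch below shows how the HP elastic model is typically combined with the S(α,β) thermal scattering classes on the neutron elastic process (class names as in Geant4 10.x, later renamed to G4ParticleHP*); the 4 eV boundary is the conventional switching energy. This is a schematic fragment intended for a physics list's ConstructProcess() method, and the S(α,β) data only apply to materials built with the dedicated thermal elements (e.g. TS_H_of_Water).

        // Sketch: neutron elastic scattering with HP data and thermal S(alpha,beta) below 4 eV.
        #include "G4HadronElasticProcess.hh"
        #include "G4NeutronHPElastic.hh"
        #include "G4NeutronHPElasticData.hh"
        #include "G4NeutronHPThermalScattering.hh"
        #include "G4NeutronHPThermalScatteringData.hh"
        #include "G4Neutron.hh"
        #include "G4ProcessManager.hh"
        #include "G4SystemOfUnits.hh"

        void AddHPNeutronElastic()
        {
          G4HadronElasticProcess* elastic = new G4HadronElasticProcess();

          G4NeutronHPElastic* hpElastic = new G4NeutronHPElastic();
          hpElastic->SetMinEnergy(4.0*eV);                   // free-gas HP model above 4 eV
          elastic->RegisterMe(hpElastic);
          elastic->AddDataSet(new G4NeutronHPElasticData());

          G4NeutronHPThermalScattering* thermal = new G4NeutronHPThermalScattering();
          thermal->SetMaxEnergy(4.0*eV);                     // S(alpha,beta) treatment below 4 eV
          elastic->RegisterMe(thermal);
          elastic->AddDataSet(new G4NeutronHPThermalScatteringData());

          G4Neutron::Neutron()->GetProcessManager()->AddDiscreteProcess(elastic);
        }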

  17. A Student Project to use Geant4 Simulations for a TMS-PET combination

    SciTech Connect

    Altamirano, A.; Chamorro, A.; Hurtado, K.; Romero, C.; Wahl, D.; Zamudio, A.; Rueda, A.; Solano Salinas, C. J.

    2007-10-26

    Geant4 is one of the most powerful tools for MC simulation of detectors and their applications. We present a student project to simulate a combined Transcranial Magnetic Stimulation-Positron Emission Tomography (TMS-PET) system using Geant4. This project aims to study PET-TMS systems by implementing a model for the brain response to the TMS pulse and studying the simulated PET response. In order to increase the speed of the simulations we parallelise our programs and investigate the possibility of using GRID computing.

  18. Monte Carlo calculations of thermal neutron capture in gadolinium: a comparison of GEANT4 and MCNP with measurements.

    PubMed

    Enger, Shirin A; Munck af Rosenschöld, Per; Rezaei, Arash; Lundqvist, Hans

    2006-02-01

    GEANT4 is a Monte Carlo code originally implemented for high-energy physics applications and is well known for particle transport at high energies. The capacity of GEANT4 to simulate neutron transport in the thermal energy region is not equally well known. The aim of this article is to compare MCNP, a code commonly used in low energy neutron transport calculations, and GEANT4 with experimental results and to select the suitable code for gadolinium neutron capture applications. To account for the thermal neutron scattering from chemically bound atoms [S(alpha,beta)] in biological materials, a comparison of thermal neutron fluence in a tissue-like poly(methylmethacrylate) phantom is made with MCNP4B, GEANT4 6.0 patch1, and measurements from the neutron capture therapy (NCT) facility at Studsvik, Sweden. The fluence measurements agreed with the MCNP calculated results considering S(alpha,beta). The location of the thermal neutron peak calculated with MCNP without S(alpha,beta) and with GEANT4 is shifted by about 0.5 cm towards a shallower depth and is 25%-30% lower in amplitude. The dose distribution from the gadolinium neutron capture reaction is then simulated by MCNP and compared with measured data. The simulations made by MCNP agree well with experimental results. As long as thermal neutron scattering from chemically bound atoms is not included in GEANT4, it is not suitable for NCT applications. PMID:16532938

  19. Monte Carlo calculations of thermal neutron capture in gadolinium: A comparison of GEANT4 and MCNP with measurements

    SciTech Connect

    Enger, Shirin A.; Munck af Rosenschoeld, Per; Rezaei, Arash; Lundqvist, Hans

    2006-02-15

    GEANT4 is a Monte Carlo code originally implemented for high-energy physics applications and is well known for particle transport at high energies. The capacity of GEANT4 to simulate neutron transport in the thermal energy region is not equally well known. The aim of this article is to compare MCNP, a code commonly used in low energy neutron transport calculations, and GEANT4 with experimental results and to select the suitable code for gadolinium neutron capture applications. To account for the thermal neutron scattering from chemically bound atoms [S(α,β)] in biological materials, a comparison of thermal neutron fluence in a tissue-like poly(methylmethacrylate) phantom is made with MCNP4B, GEANT4 6.0 patch1, and measurements from the neutron capture therapy (NCT) facility at Studsvik, Sweden. The fluence measurements agreed with the MCNP calculated results considering S(α,β). The location of the thermal neutron peak calculated with MCNP without S(α,β) and with GEANT4 is shifted by about 0.5 cm towards a shallower depth and is 25%-30% lower in amplitude. The dose distribution from the gadolinium neutron capture reaction is then simulated by MCNP and compared with measured data. The simulations made by MCNP agree well with experimental results. As long as thermal neutron scattering from chemically bound atoms is not included in GEANT4, it is not suitable for NCT applications.

  20. GGEMS-Brachy: GPU GEant4-based Monte Carlo simulation for brachytherapy applications

    NASA Astrophysics Data System (ADS)

    Lemaréchal, Yannick; Bert, Julien; Falconnet, Claire; Després, Philippe; Valeri, Antoine; Schick, Ulrike; Pradier, Olivier; Garcia, Marie-Paule; Boussion, Nicolas; Visvikis, Dimitris

    2015-07-01

    In brachytherapy, plans are routinely calculated using the AAPM TG43 formalism which considers the patient as a simple water object. An accurate modeling of the physical processes considering patient heterogeneity using Monte Carlo simulation (MCS) methods is currently too time-consuming and computationally demanding to be routinely used. In this work we implemented and evaluated an accurate and fast MCS on Graphics Processing Units (GPU) for brachytherapy low dose rate (LDR) applications. A previously proposed Geant4 based MCS framework implemented on GPU (GGEMS) was extended to include a hybrid GPU navigator, allowing navigation within voxelized patient specific images and analytically modeled 125I seeds used in LDR brachytherapy. In addition, dose scoring based on track length estimator including uncertainty calculations was incorporated. The implemented GGEMS-brachy platform was validated using a comparison with Geant4 simulations and reference datasets. Finally, a comparative dosimetry study based on the current clinical standard (TG43) and the proposed platform was performed on twelve prostate cancer patients undergoing LDR brachytherapy. Considering patient 3D CT volumes of 400  × 250  × 65 voxels and an average of 58 implanted seeds, the mean patient dosimetry study run time for a 2% dose uncertainty was 9.35 s (≈500 ms 10-6 simulated particles) and 2.5 s when using one and four GPUs, respectively. The performance of the proposed GGEMS-brachy platform allows envisaging the use of Monte Carlo simulation based dosimetry studies in brachytherapy compatible with clinical practice. Although the proposed platform was evaluated for prostate cancer, it is equally applicable to other LDR brachytherapy clinical applications. Future extensions will allow its application in high dose rate brachytherapy applications.

  1. GGEMS-Brachy: GPU GEant4-based Monte Carlo simulation for brachytherapy applications.

    PubMed

    Lemaréchal, Yannick; Bert, Julien; Falconnet, Claire; Després, Philippe; Valeri, Antoine; Schick, Ulrike; Pradier, Olivier; Garcia, Marie-Paule; Boussion, Nicolas; Visvikis, Dimitris

    2015-07-01

    In brachytherapy, plans are routinely calculated using the AAPM TG43 formalism which considers the patient as a simple water object. An accurate modeling of the physical processes considering patient heterogeneity using Monte Carlo simulation (MCS) methods is currently too time-consuming and computationally demanding to be routinely used. In this work we implemented and evaluated an accurate and fast MCS on Graphics Processing Units (GPU) for brachytherapy low dose rate (LDR) applications. A previously proposed Geant4 based MCS framework implemented on GPU (GGEMS) was extended to include a hybrid GPU navigator, allowing navigation within voxelized patient specific images and analytically modeled ¹²⁵I seeds used in LDR brachytherapy. In addition, dose scoring based on a track length estimator, including uncertainty calculations, was incorporated. The implemented GGEMS-brachy platform was validated using a comparison with Geant4 simulations and reference datasets. Finally, a comparative dosimetry study based on the current clinical standard (TG43) and the proposed platform was performed on twelve prostate cancer patients undergoing LDR brachytherapy. Considering patient 3D CT volumes of 400 × 250 × 65 voxels and an average of 58 implanted seeds, the mean patient dosimetry study run time for a 2% dose uncertainty was 9.35 s (≈500 ms 10⁻⁶ per simulated particle) and 2.5 s when using one and four GPUs, respectively. The performance of the proposed GGEMS-brachy platform allows envisaging Monte Carlo simulation based dosimetry studies in brachytherapy that are compatible with clinical practice. Although the proposed platform was evaluated for prostate cancer, it is equally applicable to other LDR brachytherapy clinical applications. Future extensions will allow its application in high dose rate brachytherapy applications. PMID:26061230

  2. Polycrystalline neutron scattering for Geant4: NXSG4

    NASA Astrophysics Data System (ADS)

    Kittelmann, T.; Boin, M.

    2015-04-01

    An extension to Geant4 based on the nxs library is presented. It has been implemented in order to include effects of low-energy neutron scattering in polycrystalline materials, and is made available to the scientific community.

  3. Simulating response functions and pulse shape discrimination for organic scintillation detectors with Geant4

    NASA Astrophysics Data System (ADS)

    Hartwig, Zachary S.; Gumplinger, Peter

    2014-02-01

    We present new capabilities of the Geant4 toolkit that enable the precision simulation of organic scintillation detectors within a comprehensive Monte Carlo code for the first time. As of version 10.0-beta, the Geant4 toolkit models the data-driven photon production from any user-defined scintillator, photon transportation through arbitrarily complex detector geometries, and time-resolved photon detection at the light readout device. By fully specifying the optical properties and geometrical configuration of the detector, the user can simulate response functions, photon transit times, and pulse shape discrimination. These capabilities enable detector simulation within a larger experimental environment as well as computationally evaluating novel scintillators, detector geometry, and light readout configurations. We demonstrate agreement of Geant4 with the NRESP7 code and with experiments for the spectroscopy of neutrons and gammas in the ranges 0-20 MeV and 0.511-1.274 MeV, respectively, using EJ301-based organic scintillation detectors. We also show agreement between Geant4 and experimental modeling of the particle-dependent detector pulses that enable simulated pulse shape discrimination.
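
    As an illustration of the user-supplied inputs this kind of simulation relies on, the sketch below (assumed, not taken from the paper; all numbers are placeholders) shows how a scintillator's optical response is declared through a G4MaterialPropertiesTable. Exact property keys and AddProperty signatures vary with the Geant4 version: releases contemporary with the paper used FASTCOMPONENT-style keys, current releases use SCINTILLATIONCOMPONENT1, and particle-dependent yield keys (e.g. PROTONSCINTILLATIONYIELD) are what enable the particle-dependent pulses used for pulse shape discrimination.

        // Illustrative only: placeholder optical properties for an EJ301-like scintillator.
        #include "G4Material.hh"
        #include "G4MaterialPropertiesTable.hh"
        #include "G4SystemOfUnits.hh"
        #include <vector>

        void AttachScintillationProperties(G4Material* ej301)
        {
          std::vector<G4double> photonEnergy = {2.5 * eV, 3.0 * eV, 3.5 * eV};
          std::vector<G4double> rIndex       = {1.505, 1.505, 1.505};
          std::vector<G4double> emission     = {0.2, 1.0, 0.3};   // relative emission spectrum

          auto* mpt = new G4MaterialPropertiesTable();
          mpt->AddProperty("RINDEX", photonEnergy, rIndex);
          mpt->AddProperty("SCINTILLATIONCOMPONENT1", photonEnergy, emission);
          mpt->AddConstProperty("SCINTILLATIONYIELD", 12000. / MeV);      // placeholder light yield
          mpt->AddConstProperty("RESOLUTIONSCALE", 1.0);
          mpt->AddConstProperty("SCINTILLATIONTIMECONSTANT1", 3.2 * ns);  // placeholder decay time
          ej301->SetMaterialPropertiesTable(mpt);
        }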

  4. artG4: A Generic Framework for Geant4 Simulations

    SciTech Connect

    Arvanitis, Tasha; Lyon, Adam

    2014-01-01

    A small experiment must devote its limited computing expertise to writing physics code directly applicable to the experiment. A software 'framework' is essential for providing an infrastructure that makes writing the physics-relevant code easy. In this paper, we describe a highly modular and easy to use framework for writing Geant4 based simulations called 'artg4'. This framework is a layer on top of the art framework.

  5. Validation of a small-animal PET simulation using GAMOS: a GEANT4-based framework

    NASA Astrophysics Data System (ADS)

    Cañadas, M.; Arce, P.; Rato Mendes, P.

    2011-01-01

    Monte Carlo-based modelling is a powerful tool to help in the design and optimization of positron emission tomography (PET) systems. The performance of these systems depends on several parameters, such as detector physical characteristics, shielding or electronics, whose effects can be studied on the basis of realistic simulated data. The aim of this paper is to validate a comprehensive study of the Raytest ClearPET small-animal PET scanner using a new Monte Carlo simulation platform which has been developed at CIEMAT (Madrid, Spain), called GAMOS (GEANT4-based Architecture for Medicine-Oriented Simulations). This toolkit, based on the GEANT4 code, was originally designed to cover multiple applications in the field of medical physics from radiotherapy to nuclear medicine, but has since been applied by some of its users in other fields of physics, such as neutron shielding, space physics, high energy physics, etc. Our simulation model includes the relevant characteristics of the ClearPET system, namely, the double layer of scintillator crystals in phoswich configuration, the rotating gantry, the presence of intrinsic radioactivity in the crystals or the storage of single events for an off-line coincidence sorting. Simulated results are contrasted with experimental acquisitions including studies of spatial resolution, sensitivity, scatter fraction and count rates in accordance with the National Electrical Manufacturers Association (NEMA) NU 4-2008 protocol. Spatial resolution results showed a discrepancy between simulated and measured values equal to 8.4% (with a maximum FWHM difference over all measurement directions of 0.5 mm). Sensitivity results differ less than 1% for a 250-750 keV energy window. Simulated and measured count rates agree well within a wide range of activities, including under electronic saturation of the system (the measured peak of total coincidences, for the mouse-sized phantom, was 250.8 kcps reached at 0.95 MBq mL-1 and the simulated peak was

  6. R&D on co-working transport schemes in Geant4

    NASA Astrophysics Data System (ADS)

    Pia, M. G.; Saracco, P.; Sudhakar, M.; Zoglauer, A.; Augelli, M.; Gargioni, E.; Kim, C. H.; Quintieri, L.; de Queiroz Filho, P. P.; de Souza Santos, D.; Weidenspointner, G.; Begalli, M.

    2010-04-01

    A research and development (R&D) project related to the extension of the Geant4 toolkit has been recently launched to address fundamental methods in radiation transport simulation. The project focuses on simulation at different scales in the same experimental environment; this problem requires new methods across the current boundaries of condensed-random-walk and discrete transport schemes. The new developments have been motivated by experimental requirements in various domains, including nanodosimetry, astronomy and detector developments for high energy physics applications.

  7. Simulation of Auger electron emission from nanometer-size gold targets using the Geant4 Monte Carlo simulation toolkit

    NASA Astrophysics Data System (ADS)

    Incerti, S.; Suerfu, B.; Xu, J.; Ivantchenko, V.; Mantero, A.; Brown, J. M. C.; Bernal, M. A.; Francis, Z.; Karamitros, M.; Tran, H. N.

    2016-04-01

    A revised atomic deexcitation framework for the Geant4 general purpose Monte Carlo toolkit, capable of simulating full Auger deexcitation cascades, was implemented in the June 2015 release (version 10.2 Beta). An overview of this refined framework and testing of its capabilities is presented for the irradiation of gold nanoparticles (NP) with keV photon and MeV proton beams. The resultant energy spectra of secondary particles created within, and escaping from, the NP are analyzed and discussed. It is anticipated that this new functionality will improve and increase the use of Geant4 in medical physics, radiobiology, nanomedicine and other low-energy physics research fields.
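
    For orientation, a full de-excitation cascade of this kind is typically switched on through a handful of electromagnetic-physics options; a minimal sketch (an assumed configuration, not code from the paper) using the G4EmParameters interface is shown below, and equivalent /process/em/... UI macro commands exist.

        // Sketch: enable fluorescence, Auger emission, full cascades and PIXE.
        #include "G4EmParameters.hh"

        void EnableAtomicDeexcitation()
        {
          auto* em = G4EmParameters::Instance();
          em->SetFluo(true);                  // X-ray fluorescence
          em->SetAuger(true);                 // Auger electron emission
          em->SetAugerCascade(true);          // follow the full vacancy cascade
          em->SetPixe(true);                  // particle-induced X-ray emission
          em->SetDeexcitationIgnoreCut(true); // produce de-excitation products below production cuts
        }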

  8. Development of a Geant4 based Monte Carlo Algorithm to evaluate the MONACO VMAT treatment accuracy.

    PubMed

    Fleckenstein, Jens; Jahnke, Lennart; Lohr, Frank; Wenz, Frederik; Hesser, Jürgen

    2013-02-01

    A method to evaluate the dosimetric accuracy of volumetric modulated arc therapy (VMAT) treatment plans, generated with the MONACO™ (version 3.0) treatment planning system in realistic CT data, with an independent Geant4 based dose calculation algorithm is presented. For this purpose, a model of an Elekta Synergy linear accelerator treatment head with an MLCi2 multileaf collimator was implemented in Geant4. The time dependent linear accelerator components were modeled by importing either logfiles of an actual plan delivery or a DICOM-RT plan sequence. Absolute dose calibration, depending on a reference measurement, was applied. The MONACO as well as the Geant4 treatment head model was commissioned with lateral profiles and depth dose curves of square fields in water and with film measurements in inhomogeneous phantoms. A VMAT treatment plan for a patient with a thoracic tumor and a VMAT treatment plan of a patient, who received treatment in the thoracic spine region including metallic implants, were used for evaluation. Depth dose curves and lateral profiles of square fields from MONACO, as well as from Geant4, had a mean local gamma (2%, 2 mm) agreement of more than 95% for all fields. Film measurements in inhomogeneous phantoms with a global gamma of (3%, 3 mm) showed a pass rate above 95% in all voxels receiving more than 25% of the maximum dose. A dose-volume-histogram comparison of the VMAT patient treatment plans showed mean deviations between Geant4 and MONACO of -0.2% (first patient) and 2.0% (second patient) for the PTVs and (0.5±1.0)% and (1.4±1.1)% for the organs at risk in relation to the prescription dose. The presented method can be used to validate VMAT dose distributions generated by a large number of small segments in regions with high electron density gradients. The MONACO dose distributions showed good agreement with Geant4 and film measurements within the simulation and measurement errors. PMID:22921843

  9. Mass attenuation coefficients of composite materials by Geant4, XCOM and experimental data: comparative study

    NASA Astrophysics Data System (ADS)

    Medhat, M. E.; Singh, V. P.

    2014-09-01

    The main goal of the present study is to test the applicability of Geant4 electromagnetic models to the calculation of mass attenuation coefficients for different types of composite materials at 59.5, 80, 356, 661.6, 1173.2 and 1332.5 keV photon energies. The simulated mass attenuation coefficients were compared with experimental data and theoretical XCOM values for the same samples, and good agreement was observed. The results indicate that this procedure can be followed to determine gamma-ray attenuation data at several energies in different materials. The modeling of photon interaction parameters applies to any type of composite sample. The Geant4 code can thus be used to obtain gamma-ray attenuation coefficients of a sample at energies for which experimental investigation may be impractical.
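
    For reference, the compared quantity is typically extracted from a simulated or measured narrow-beam transmission experiment via the attenuation law (a standard relation, not spelled out in the abstract):

        \[
          I = I_0\, e^{-(\mu/\rho)\,\rho\, t}
          \quad\Longrightarrow\quad
          \frac{\mu}{\rho} = \frac{1}{\rho\, t}\,\ln\frac{I_0}{I},
        \]

    where I_0 and I are the photon counts without and with the sample, t is the sample thickness and ρ its density; in a Geant4 setup, I_0 and I are simply the counts of unscattered photons reaching a detector scorer with the sample removed and in place.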

  10. Geant4 simulation of the response of phosphor screens for X-ray imaging

    NASA Astrophysics Data System (ADS)

    Pistrui-Maximean, S. A.; Freud, N.; Létang, J. M.; Koch, A.; Munier, B.; Walenta, A. H.; Montarou, G.; Babot, D.

    2006-07-01

    In order to predict and optimize the response of phosphor screens, it is important to understand the role played by the different physical processes inside the scintillator layer. A simulation model based on the Monte Carlo code Geant4 was developed to determine the Modulation Transfer Function (MTF) of phosphor screens for energies used in X-ray medical imaging and nondestructive testing applications. The visualization of the dose distribution inside the phosphor layer gives an insight into how the MTF is progressively degraded by X-ray and electron transport. The simulation model makes it possible to study the influence of physical and technological parameters on the detector performance, as well as to design and optimize new detector configurations. Preliminary MTF measurements have been carried out and agreement with experimental data has been found in the case of a commercial screen (Kodak Lanex Fine) at an X-ray tube potential of 100 kV. Further validation with other screens (transparent or granular) at different energies is under way.

  11. GEANT4 simulations of the n_TOF spallation source and their benchmarking

    NASA Astrophysics Data System (ADS)

    Lo Meo, S.; Cortés-Giraldo, M. A.; Massimi, C.; Lerendegui-Marco, J.; Barbagallo, M.; Colonna, N.; Guerrero, C.; Mancusi, D.; Mingrone, F.; Quesada, J. M.; Sabate-Gilarte, M.; Vannini, G.; Vlachoudis, V.

    2015-12-01

    Neutron production and transport in the spallation target of the n_TOF facility at CERN has been simulated with GEANT4. The results obtained with different models of high-energy nucleon-nucleus interaction have been compared with the measured characteristics of the neutron beam, in particular the flux and its dependence on neutron energy, measured in the first experimental area. The best agreement at present, within 20% for the absolute value of the flux and within a few percent for the energy dependence in the whole energy range from thermal to 1 GeV, is obtained with the INCL++ model coupled with the GEANT4 native de-excitation model. All other available models overestimate the n_TOF neutron flux by a larger factor, of up to 70%. The simulations are also able to accurately reproduce the neutron beam energy resolution function, which is essentially determined by the moderation time inside the target/moderator assembly. The results reported here provide confidence in the use of GEANT4 for simulations of spallation neutron sources.

  12. Monte Carlo simulation of a photodisintegration of ³H experiment in Geant4

    NASA Astrophysics Data System (ADS)

    Gray, Isaiah

    2013-10-01

    An upcoming experiment involving photodisintegration of ³H at the High Intensity Gamma-Ray Source facility at Duke University has been simulated in the software package Geant4. CAD models of silicon detectors and wire chambers were imported from Autodesk Inventor using the program FastRad and the Geant4 GDML importer. Sensitive detectors were associated with the appropriate logical volumes in the exported GDML file so that changes in detector geometry will be easily manifested in the simulation. Probability distribution functions for the energy and direction of outgoing protons were generated using numerical tables from previous theory, and energies and directions were sampled from these distributions using a rejection sampling algorithm. The simulation will be a useful tool to optimize detector geometry, estimate background rates, and test data analysis algorithms. This work was supported by the Triangle Universities Nuclear Laboratory REU program at Duke University.
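
    For readers unfamiliar with the GDML route described above, importing such a CAD-exported geometry and picking out a volume for sensitive-detector assignment reduces to a few calls; the sketch below is illustrative only, with hypothetical file and volume names:

        // Sketch: load a GDML file and look up an imported logical volume by name.
        #include "G4GDMLParser.hh"
        #include "G4LogicalVolume.hh"
        #include "G4VPhysicalVolume.hh"

        G4VPhysicalVolume* ConstructFromGDML()
        {
          G4GDMLParser parser;
          parser.Read("detector_geometry.gdml");   // hypothetical file exported from CAD via FastRad

          // Retrieve a logical volume defined in the GDML file (name is hypothetical)
          G4LogicalVolume* siliconLV = parser.GetVolume("SiliconDetectorLV");
          // siliconLV->SetSensitiveDetector(mySD);  // attach a user-defined G4VSensitiveDetector

          return parser.GetWorldVolume();
        }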

  13. Calculation of Coincidence Summing Correction Factors for an HPGe detector using GEANT4.

    PubMed

    Giubrone, G; Ortiz, J; Gallardo, S; Martorell, S; Bas, M C

    2016-07-01

    The aim of this paper was to calculate the True Coincidence Summing Correction Factors (TSCFs) for an HPGe coaxial detector in order to correct the summing effect caused by the presence of ⁸⁸Y and ⁶⁰Co in a multigamma source used to obtain a calibration efficiency curve. Results were obtained for three volumetric sources using the Monte Carlo toolkit, GEANT4. The first part of this paper deals with modeling the detector in order to obtain a simulated full energy peak efficiency curve. A quantitative comparison between the measured and simulated values was made across the entire energy range under study. The TSCFs were calculated for ⁸⁸Y and ⁶⁰Co using the full energy peak efficiencies obtained with GEANT4. This methodology was subsequently applied to ¹³⁴Cs, which presents a complex decay scheme. PMID:27085040

  14. The GEANT4 toolkit for microdosimetry calculations: application to microbeam radiation therapy (MRT).

    PubMed

    Spiga, J; Siegbahn, E A; Bräuer-Krisch, E; Randaccio, P; Bravin, A

    2007-11-01

    Theoretical dose distributions for microbeam radiation therapy (MRT) are computed in this paper using the GEANT4 Monte Carlo (MC) simulation toolkit. MRT is an innovative experimental radiotherapy technique carried out using an array of parallel microbeams of synchrotron-wiggler-generated x rays. Although the biological mechanisms underlying the effects of microbeams are still largely unknown, the effectiveness of MRT can be traced back to the natural ability of normal tissues to rapidly repair small damages to the vasculature, and on the lack of a similar healing process in tumoral tissues. Contrary to conventional therapy, in which each beam is at least several millimeters wide, the narrowness of the microbeams allows a rapid regeneration of the blood vessels along the beams' trajectories. For this reason the calculation of the "valley" dose is of crucial importance and the correct use of MC codes for such purposes must be understood. GEANT4 offers, in addition to the standard libraries, a specialized package specifically designed to deal with electromagnetic interactions of particles with matter for energies down to 250 eV. This package implements two different approaches for electron and photon transport, one based on evaluated data libraries, the other adopting analytical models. These features are exploited to cross-check theoretical computations for MRT. The lateral and depth dose profiles are studied for the irradiation of a 20 cm diameter, 20 cm long cylindrical phantom, with cylindrical sources of different size and energy. Microbeam arrays are simulated with the aid of superposition algorithms, and the ratios of peak-to-valley doses are computed for typical cases used in preclinical assays. Dose profiles obtained using the GEANT4 evaluated data libraries and analytical models are compared with simulation results previously obtained using the PENELOPE code. The results show that dose profiles computed with GEANT4's analytical model are almost

  15. The GEANT4 toolkit for microdosimetry calculations: Application to microbeam radiation therapy (MRT)

    SciTech Connect

    Spiga, J.; Siegbahn, E. A.; Braeuer-Krisch, E.; Randaccio, P.; Bravin, A.

    2007-11-15

    Theoretical dose distributions for microbeam radiation therapy (MRT) are computed in this paper using the GEANT4 Monte Carlo (MC) simulation toolkit. MRT is an innovative experimental radiotherapy technique carried out using an array of parallel microbeams of synchrotron-wiggler-generated x rays. Although the biological mechanisms underlying the effects of microbeams are still largely unknown, the effectiveness of MRT can be traced back to the natural ability of normal tissues to rapidly repair small damages to the vasculature, and on the lack of a similar healing process in tumoral tissues. Contrary to conventional therapy, in which each beam is at least several millimeters wide, the narrowness of the microbeams allows a rapid regeneration of the blood vessels along the beams' trajectories. For this reason the calculation of the 'valley' dose is of crucial importance and the correct use of MC codes for such purposes must be understood. GEANT4 offers, in addition to the standard libraries, a specialized package specifically designed to deal with electromagnetic interactions of particles with matter for energies down to 250 eV. This package implements two different approaches for electron and photon transport, one based on evaluated data libraries, the other adopting analytical models. These features are exploited to cross-check theoretical computations for MRT. The lateral and depth dose profiles are studied for the irradiation of a 20 cm diameter, 20 cm long cylindrical phantom, with cylindrical sources of different size and energy. Microbeam arrays are simulated with the aid of superposition algorithms, and the ratios of peak-to-valley doses are computed for typical cases used in preclinical assays. Dose profiles obtained using the GEANT4 evaluated data libraries and analytical models are compared with simulation results previously obtained using the PENELOPE code. The results show that dose profiles computed with GEANT4's analytical model are almost

  16. Comparison of GEANT4 Simulations with Experimental Data for Thick Al Absorbers

    NASA Astrophysics Data System (ADS)

    Yevseyeva, Olga; de Assis, Joaquim; Evseev, Ivan; Schelin, Hugo; Paschuk, Sergei; Milhoretto, Edney; Setti, João; Díaz, Katherin; Hormaza, Joel; Lopes, Ricardo

    2009-06-01

    Proton beams in medical applications deal with relatively thick targets like the human head or trunk. Therefore, relatively small differences in the total proton stopping power given, for example, by the different models provided by GEANT4 can lead to significant disagreements in the final proton energy spectra when integrated along lengthy proton trajectories. This work presents proton energy spectra obtained by GEANT4.8.2 simulations using the ICRU49, Ziegler1985 and Ziegler2000 models for 19.68 MeV protons passing through a number of Al absorbers of various thicknesses. The spectra were compared with the experimental data, with TRIM/SRIM2008 and MCNPX2.4.0 simulations, and with the Payne analytical solution for the transport equation in the Fokker-Planck approximation. It is shown that the MCNPX simulations reproduce all experimental spectra reasonably well. For the relatively thin targets all the methods give practically identical results, but this is not the case for the thick absorbers. It should be noted that all the spectra were measured at proton energies significantly above 2 MeV, i.e., in the so-called "Bethe-Bloch region". Therefore the observed disagreements in GEANT4 results, simulated with different models, are somewhat unexpected. Further studies are necessary for better understanding and definitive conclusions.

  17. Introducing Third-Year Undergraduates to GEANT4 Simulations of Light Transport and Collection in Scintillation Materials

    ERIC Educational Resources Information Center

    Riggi, Simone; La Rocca, Paola; Riggi, Francesco

    2011-01-01

    GEANT4 simulations of the processes affecting the transport and collection of optical photons generated inside a scintillation detector were carried out, with the aim to complement the educational material offered by textbooks to third-year physics undergraduates. Two typical situations were considered: a long scintillator strip with and without a…

  18. Adaptation of GEANT4 to Monte Carlo dose calculations based on CT data.

    PubMed

    Jiang, H; Paganetti, H

    2004-10-01

    The GEANT4 Monte Carlo code provides many powerful functions for conducting particle transport simulations with great reliability and flexibility. However, as a general purpose Monte Carlo code, not all the functions were specifically designed and fully optimized for applications in radiation therapy. One of the primary issues is the computational efficiency, which is especially critical when patient CT data have to be imported into the simulation model. In this paper we summarize the relevant aspects of the GEANT4 tracking and geometry algorithms and introduce our work on using the code to conduct dose calculations based on CT data. The emphasis is focused on modifications of the GEANT4 source code to meet the requirements for fast dose calculations. The major features include a quick voxel search algorithm, fast volume optimization, and the dynamic assignment of material density. These features are ready to be used for tracking the primary types of particles employed in radiation therapy such as photons, electrons, and heavy charged particles. Recalculation of a proton therapy treatment plan generated by a commercial treatment planning program for a paranasal sinus case is presented as an example. PMID:15543788

  19. Consistency evaluation between EGSnrc and Geant4 charged particle transport in an equilibrium magnetic field

    NASA Astrophysics Data System (ADS)

    Yang, Y. M.; Bednarz, B.

    2013-02-01

    Following the proposal by several groups to integrate magnetic resonance imaging (MRI) with radiation therapy, much attention has been afforded to examining the impact of strong (on the order of a Tesla) transverse magnetic fields on photon dose distributions. The effect of the magnetic field on dose distributions must be considered in order to take full advantage of the benefits of real-time intra-fraction imaging. In this investigation, we compared the handling of particle transport in magnetic fields between two Monte Carlo codes, EGSnrc and Geant4, to analyze various aspects of their electromagnetic transport algorithms; both codes are well-benchmarked for medical physics applications in the absence of magnetic fields. A water-air-water slab phantom and a water-lung-water slab phantom were used to highlight dose perturbations near high- and low-density interfaces. We have implemented a method of calculating the Lorentz force in EGSnrc based on theoretical models in literature, and show very good consistency between the two Monte Carlo codes. This investigation further demonstrates the importance of accurate dosimetry for MRI-guided radiation therapy (MRIgRT), and facilitates the integration of a ViewRay MRIgRT system in the University of Wisconsin-Madison's Radiation Oncology Department.
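
    As context for the comparison above, charged-particle transport in a magnetic field is built into Geant4: a field object is attached to the geometry and the transportation machinery integrates the Lorentz force along each step, whereas in EGSnrc the authors had to add the deflection themselves on top of the standard stepping. A minimal Geant4 sketch (the uniform 1.5 T transverse field is assumed here purely for illustration):

        // Sketch: attach a uniform magnetic field to the whole geometry.
        #include "G4FieldManager.hh"
        #include "G4SystemOfUnits.hh"
        #include "G4ThreeVector.hh"
        #include "G4TransportationManager.hh"
        #include "G4UniformMagField.hh"

        void SetupTransverseField()
        {
          auto* field = new G4UniformMagField(G4ThreeVector(0., 1.5 * tesla, 0.));
          G4FieldManager* fieldManager =
              G4TransportationManager::GetTransportationManager()->GetFieldManager();
          fieldManager->SetDetectorField(field);
          fieldManager->CreateChordFinder(field);  // default stepper and accuracy parameters
        }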

  20. GEANT4 simulations of Cherenkov reaction history diagnostics.

    PubMed

    Rubery, M S; Horsfield, C J; Herrmann, H W; Kim, Y; Mack, J M; Young, C S; Caldwell, S E; Evans, S C; Sedilleo, T J; McEvoy, A; Miller, E K; Stoeffl, W; Ali, Z; Toebbe, J

    2010-10-01

    This paper compares the results from a GEANT4 simulation of the gas Cherenkov detector 1 (GCD1) with previous simulations and experimental data from the Omega laser facility. The GCD1 collects gammas emitted during a deuterium-tritium capsule implosion and converts them, through several processes, to Cherenkov light. Photon signals are recorded using subnanosecond photomultiplier tubes, producing burn reaction histories. The GEANT4 GCD1 simulation is first benchmarked against ACCEPT, an integrated tiger series code, with good agreement. The simulation is subsequently compared with data from the Omega laser facility, where experiments have been performed to measure the effects of Hohlraum materials on reaction history signals, in preparation for experiments at the National Ignition Facility. PMID:21033850

  1. GEANT4 simulations of Cherenkov reaction history diagnostics

    SciTech Connect

    Rubery, M. S.; Horsfield, C. J.; Herrmann, H. W.; Kim, Y.; Mack, J. M.; Young, C. S.; Caldwell, S. E.; Evans, S. C.; Sedilleo, T. J.; McEvoy, A.; Miller, E. K.; Stoeffl, W.; Ali, Z.

    2010-10-15

    This paper compares the results from a GEANT4 simulation of the gas Cherenkov detector 1 (GCD1) with previous simulations and experimental data from the Omega laser facility. The GCD1 collects gammas emitted during a deuterium-tritium capsule implosion and converts them, through several processes, to Cherenkov light. Photon signals are recorded using subnanosecond photomultiplier tubes, producing burn reaction histories. The GEANT4 GCD1 simulation is first benchmarked against ACCEPT, an integrated tiger series code, with good agreement. The simulation is subsequently compared with data from the Omega laser facility, where experiments have been performed to measure the effects of Hohlraum materials on reaction history signals, in preparation for experiments at the National Ignition Facility.

  2. Transmission Efficiency of the Sage Spectrometer Using GEANT4

    NASA Astrophysics Data System (ADS)

    Cox, D. M.; Herzberg, R.-D.; Papadakis, P.; Ali, F.; Butler, P. A.; Cresswell, J. R.; Mistry, A.; Sampson, J.; Seddon, D. A.; Thornhill, J.; Wells, D.; Konki, J.; Greenlees, P. T.; Rahkila, P.; Pakarinen, J.; Sandzelius, M.; Sorri, J.; Julin, R.; Coleman-Smith, P. J.; Lazarus, I. H.; Letts, S. C.; Simpson, J.; Pucknell, V. F. E.

    2014-09-01

    The new SAGE spectrometer allows simultaneous electron and γ-ray in-beam studies of heavy nuclei. A comprehensive GEANT4 simulation suite has been created for the SAGE spectrometer. This includes both the silicon detectors for electron detection and the germanium detectors for γ-ray detection. The simulation can be used for a wide variety of tests with the aim of better understanding the behaviour of SAGE. A number of aspects of electron transmission are presented here.

  3. Geant4 simulations on Compton scattering of laser photons on relativistic electrons

    SciTech Connect

    Filipescu, D.; Utsunomiya, H.; Gheorghe, I.; Glodariu, T.; Tesileanu, O.; Shima, T.; Takahisa, K.; Miyamoto, S.

    2015-02-24

    Using Geant4, a complex simulation code of the interaction between laser photons and relativistic electrons was developed. We implemented physically constrained electron beam emittance and spatial distribution parameters, and we also considered a Gaussian laser beam. The code was tested against experimental data produced at the γ-ray beam line GACKO (Gamma Collaboration Hutch of Konan University) of the synchrotron radiation facility NewSUBARU. Here we will discuss the implications of transverse misalignments of the collimation system relative to the electron beam axis.

  4. Comparison of MCNPX and GEANT4 to Predict the Contribution of Non-elastic Nuclear Interactions to Absorbed Dose in Water, PMMA and A150

    NASA Astrophysics Data System (ADS)

    Shtejer, K.; Arruda-Neto, J. D. T.; Schulte, R.; Wroe, A.; Rodrigues, T. E.; de Menezes, M. O.; Moralles, M.; Guzmán, F.; Manso, M. V.

    2008-08-01

    Proton induced non-elastic nuclear reactions play an important role in the dose distribution of clinically used proton beams as they deposit dose of high biological effectiveness both within the primary beam path as well as outside the beam to untargeted tissues. Non-elastic nuclear reactions can be evaluated using transport codes based on the Monte Carlo method. In this work, we have utilized the Los Alamos code MCNPX and the CERN GEANT4 toolkit, which are currently the most widely used Monte Carlo programs for proton radiation transport simulations in medical physics, to study the contribution of non-elastic nuclear interactions to the absorbed dose of proton beams in the therapeutic energy range. The impact of different available theoretical models to address the nuclear reaction process was investigated. The contribution of secondary particles from non-elastic nuclear reactions was calculated in three materials relevant in radiotherapy applications: water, PMMA and A150. The results show that there are differences in the calculated contribution of the secondary particles heavier than protons to the absorbed dose, depending on the approach used to model the nuclear reactions. The MCNPX calculations give rise to a larger contribution of d, t, α and ³He to the total dose compared to the GEANT4 physical models chosen in this work.

  5. Comparison of MCNPX and GEANT4 to Predict the Contribution of Non-elastic Nuclear Interactions to Absorbed Dose in Water, PMMA and A150

    SciTech Connect

    Shtejer, K.; Arruda-Neto, J. D. T.; Rodrigues, T. E.; Schulte, R.; Wroe, A.; Menezes, M. O. de; Moralles, M.

    2008-08-11

    Proton induced non-elastic nuclear reactions play an important role in the dose distribution of clinically used proton beams as they deposit dose of high biological effectiveness both within the primary beam path as well as outside the beam to untargeted tissues. Non-elastic nuclear reactions can be evaluated using transport codes based on the Monte Carlo method. In this work, we have utilized the Los Alamos code MCNPX and the CERN GEANT4 toolkit, which are currently the most widely used Monte Carlo programs for proton radiation transport simulations in medical physics, to study the contribution of non-elastic nuclear interactions to the absorbed dose of proton beams in the therapeutic energy range. The impact of different available theoretical models to address the nuclear reaction process was investigated. The contribution of secondary particles from non-elastic nuclear reactions was calculated in three materials relevant in radiotherapy applications: water, PMMA and A150. The results show that there are differences in the calculated contribution of the secondary particles heavier than protons to the absorbed dose, depending on the approach used to model the nuclear reactions. The MCNPX calculations give rise to a larger contribution of d, t, α and ³He to the total dose compared to the GEANT4 physical models chosen in this work.

  6. Influence of thyroid volume reduction on absorbed dose in ¹³¹I therapy studied by using Geant4 Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Ziaur, Rahman; Sikander, M. Mirza; Waheed, Arshed; Nasir, M. Mirza; Waheed, Ahmed

    2014-05-01

    A simulation study has been performed to quantify the effect of volume reduction on the thyroid absorbed dose per decay and to investigate the variation of energy deposition per decay due to the β⁻ and γ activity of ¹³¹I with the volume/mass of the thyroid, for water, ICRP and ICRU soft tissue taken as thyroid material. A Monte Carlo model of the thyroid was constructed in the Geant4 radiation transport simulation toolkit to compute the β and γ absorbed dose in the simulated thyroid phantom for various values of its volume. The effect of the size and shape of the thyroid on energy deposition per decay has also been studied by using spherical, ellipsoidal and cylindrical models for the thyroid and varying its volume in the 1-25 cm³ range. The relative differences of the Geant4 results for the different models with each other and with the MCNP results lie well below 1.870%. The maximum relative difference between the Geant4 results for water and those for ICRP and ICRU soft tissues is not more than 0.225%. S values for the ellipsoidal, spherical and cylindrical thyroid models were estimated, and the relative difference with published results lies within 3.095%. The absorbed fraction values for beta particles show good agreement with published values, within a 2.105% deviation. The Geant4 based simulation results of absorbed fractions for gammas again show good agreement with the corresponding MCNP and EGS4 results (±6.667%) but are 29.032% higher than the MIRD calculated values. Consistent with previous studies, the reduction of the thyroid volume is found to have a substantial effect on the absorbed dose. Geant4 simulations confirm the dose dependence on the volume/mass of the thyroid in agreement with MCNP and EGS4 computed values, but differ substantially from the MIRD8 data. Therefore, inclusion of size/mass dependence is indicated for ¹³¹I radiotherapy of the thyroid.
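
    For reference, the S values quoted above follow the standard MIRD formalism (not restated in the abstract): the mean absorbed dose to a target region is

        \[
          D = \tilde{A}\, S,
          \qquad
          S = \frac{1}{m_\mathrm{target}} \sum_i \Delta_i\, \phi_i ,
        \]

    where \tilde{A} is the cumulated activity in the source region, Δ_i the mean energy emitted per decay in emission channel i, φ_i the corresponding absorbed fraction in the target and m_target the target mass (here the thyroid, with source and target coinciding). The explicit 1/m factor is why the volume/mass reduction studied above changes the absorbed dose per decay so strongly.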

  7. Nuclear fragmentation reactions in extended media studied with Geant4 toolkit

    NASA Astrophysics Data System (ADS)

    Pshenichnov, Igor; Botvina, Alexander; Mishustin, Igor; Greiner, Walter

    2010-03-01

    It is well-known from numerous experiments that nuclear multifragmentation is a dominating mechanism for production of intermediate mass fragments in nucleus-nucleus collisions at energies above 100A MeV. In this paper we investigate the validity and performance of the Fermi break-up model and the statistical multifragmentation model implemented as parts of the Geant4 toolkit. We study the impact of violent nuclear disintegration reactions on the depth-dose profiles and yields of secondary fragments for beams of light and medium-weight nuclei propagating in extended media. Implications for ion-beam cancer therapy and shielding from cosmic radiation are discussed.

  8. BC404 scintillators as gamma locators studied via Geant4 simulations

    NASA Astrophysics Data System (ADS)

    Cortés, M. L.; Hoischen, R.; Eisenhauer, K.; Gerl, J.; Pietralla, N.

    2014-05-01

    In many applications in industry and academia, an accurate determination of the direction from which gamma rays are emitted is either needed or desirable. Ion-beam therapy treatments, the search for orphan sources, and homeland security applications are examples of fields that can benefit from directional sensitivity to gamma radiation. Scintillation detectors are a good option for these types of applications as they have relatively low cost, are easy to handle and can be produced in a large range of different sizes. In this work a Geant4 simulation was developed to study the directional sensitivity of different BC404 scintillator geometries and arrangements. The simulation includes all the physical processes relevant for gamma detection in a scintillator. In particular, the creation and propagation of optical photons inside the scintillator was included. A simplified photomultiplier tube model was also simulated. The physical principle exploited is the angular dependence of the shape of the energy spectrum obtained from thin scintillator layers when irradiated from different angles. After an experimental confirmation of the working principle of the device and a check of the simulation, the possibilities and limitations of directional sensitivity to gamma radiation using scintillator layers were tested. For this purpose, point-like sources of typical energies expected in ion-beam therapy were used. Optimal scintillator thicknesses for different energies were determined and the setup efficiencies calculated. The use of arrays of scintillators to reconstruct the direction of incoming gamma rays was also studied. For this case, a spherical source emitting Bremsstrahlung radiation was used together with a setup consisting of scintillator layers. The capability of this setup to identify the center of the extended source was studied together with its angular resolution.

  9. Geant4-Simulations for cellular dosimetry in nuclear medicine.

    PubMed

    Freudenberg, Robert; Wendisch, Maria; Kotzerke, Jörg

    2011-12-01

    The application of unsealed radionuclides in radiobiological experiments can lead to intracellular radionuclide uptake and an increased absorbed dose. Accurate dose quantification is essential to assess observed radiobiological effects. Owing to the small cellular dimensions, direct dose measurement is impossible. We demonstrate the application of Monte Carlo simulations for dose calculation. Dose calculations were performed using the Geant4 Monte Carlo toolkit, for which typical experimental situations were modelled. Dose distributions inside wells were simulated for different radionuclides. S values were simulated for spherical cells and cell monolayers of different diameter. Concomitantly, experiments were performed using the PC Cl3 cell line with mediated radionuclide uptake. For various activity distributions cellular survival was measured. We obtained S values for the dose distribution inside the wells. Calculated S values for a single cell are in good agreement with S values provided in the literature (ratio 0.87 to 1.07). The cross-dose is up to ten times higher for Y-90. The concomitantly performed cellular experiments confirm the dose calculation. Furthermore, the necessity of correct dose calculation for the assessment of radiobiological effects after the application of unsealed radionuclides was shown, and thereby the feasibility of using Geant4 was demonstrated. PMID:21983023

  10. Thermal neutron response of a boron-coated GEM detector via GEANT4 Monte Carlo code.

    PubMed

    Jamil, M; Rhee, J T; Kim, H G; Ahmad, Farzana; Jeon, Y J

    2014-10-22

    In this work, we report the design configuration and the performance of the hybrid Gas Electron Multiplier (GEM) detector. In order to make the detector sensitive to thermal neutrons, the forward electrode of the GEM has been coated with enriched boron-10 material, which works as a neutron converter. A 5 × 5 cm² GEM configuration has been used for the thermal neutron studies. The response of the detector has been estimated using the GEANT4 MC code with two different physics lists. Using the QGSP_BIC_HP physics list, the neutron detection efficiency was determined to be about 3%, while with the QGSP_BERT_HP physics list the efficiency was around 2.5%, at an incident thermal neutron energy of 25 meV. The response of the detector shows that coating the GEM with a boron converter improves the efficiency of thermal neutron detection. PMID:25464183
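
    The two physics lists compared above are Geant4 reference lists that differ in the intra-nuclear cascade model (binary cascade versus Bertini) while sharing the high-precision (_HP) neutron data needed below 20 MeV. They can be selected by name; the sketch below is illustrative only and omits geometry and scoring:

        // Sketch: build a reference physics list by name and hand it to the run manager.
        #include "G4PhysListFactory.hh"
        #include "G4RunManager.hh"
        #include "G4VModularPhysicsList.hh"

        void ConfigurePhysics(G4RunManager* runManager, const G4String& listName)
        {
          // listName would be "QGSP_BIC_HP" or "QGSP_BERT_HP" for the comparison above
          G4PhysListFactory factory;
          G4VModularPhysicsList* physicsList = factory.GetReferencePhysList(listName);
          runManager->SetUserInitialization(physicsList);
        }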

  11. Simulation study of Fast Neutron Radiography using GEANT4

    NASA Astrophysics Data System (ADS)

    Bishnoi, S.; Thomas, R. G.; Sarkar, P. S.; Datar, V. M.; Sinha, A.

    2015-02-01

    Fast neutron radiography (FNR) is an important non-destructive technique for the imaging of thick bulk material. We are designing an FNR system using a laboratory based 14 MeV D-T neutron generator [1]. Simulation studies have been carried out using the Monte Carlo based GEANT4 code to understand the response of the FNR system for various objects. Different samples, ranging from low Z to metallic and high Z materials, were simulated for their radiographic images. The quality of the constructed neutron radiography images, in terms of the relative contrast ratio and the contrast-to-noise ratio, was investigated for its dependence on various parameters such as thickness, voids inside high/low Z material, and low Z material hidden behind high Z material. We report here the potential and limitations of FNR for imaging different materials and configurations, as well as possible areas where FNR can be implemented.

  12. Positron Production at JLab Simulated Using Geant4

    SciTech Connect

    Kossler, W. J.; Long, S. S.

    2009-09-02

    The results of a Geant4 Monte-Carlo study of the production of slow positrons using a 140 MeV electron beam which might be available at Jefferson Lab are presented. Positrons are produced by pair production from the gamma rays generated by bremsstrahlung in the target, which is also the stopping medium for the positrons. Positrons which diffuse to the surface of the stopping medium are assumed to be ejected due to a negative work function. Here the target and moderator are combined into one piece. For an osmium target/moderator 3 cm long with transverse dimensions of 1 cm by 1 mm, we obtain a slow positron yield of about 8.5·10¹⁰/(s·mA). If these positrons were remoderated and re-emitted with a 23% probability, we would obtain 2·10¹⁰/(s·mA) in a micro-beam.

  13. Nuclear spectroscopy with Geant4: Proton and neutron emission & radioactivity

    NASA Astrophysics Data System (ADS)

    Sarmiento, L. G.; Rudolph, D.

    2016-07-01

    With the aid of a novel combination of existing equipment - JYFLTRAP and the TASISpec decay station - it is possible to perform very clean quantum-state selective, high-resolution particle-γ decay spectroscopy. We intend to determine the branching ratio of the ℓ = 9 proton emission from the Iπ = 19/2⁻, 3174-keV isomer in the N = Z - 1 nucleus ⁵³Co. The study aims to initiate a series of similar experiments along the proton dripline, thereby providing unique insights into "open quantum systems". The technique has been pioneered in case studies using SHIPTRAP and TASISpec at GSI. Newly available radioactive decay modes in Geant4 simulations will be used to corroborate the anticipated experimental results.

  14. GEANT 4 simulation of ⁹⁹Mo photonuclear production in nanoparticles.

    PubMed

    Dikiy, N P; Dovbnya, A N; Fedorchenko, D V; Khazhmuradov, M A

    2016-08-01

    The GEANT 4 Monte-Carlo simulation toolkit is used to study the kinematic recoil method of ⁹⁹Mo photonuclear production. Simulation for a bremsstrahlung photon spectrum with a maximum photon energy of 30 MeV showed that for a MoO₃ nanoparticle the escape fraction decreases from 0.24 to 0.08 when the nanoparticle size increases from 20 nm to 80 nm. For natural molybdenum and pure ¹⁰⁰Mo we obtained lower values: from 0.17 to 0.05. The generation of accompanying molybdenum nuclei is significantly lower for pure ¹⁰⁰Mo, about 3.6 nuclei per single ⁹⁹Mo nucleus, while a natural molybdenum nanoparticle produces about 48 accompanying nuclei. We have also shown that for high-energy photons the escape fraction of ⁹⁹Mo decreases, while the production of unwanted molybdenum isotopes is significantly higher. PMID:27156050

  15. Simulation of a Helical Channel using GEANT4

    SciTech Connect

    Elvira, V. D.; Lebrun, P.; Spentzouris, P.

    2001-02-01

    We present a simulation of a 72 m long cooling channel proposed by V. Balbekov based on the helical cooling concept developed by Ya. Derbenev. LiH wedge absorbers provide the energy loss mechanism and 201 MHz cavities are used for re-acceleration. They are placed inside a main solenoidal field to focus the beam. A helical field with an amplitude of 0.3 T and a period of 1.8 m provides momentum dispersion for emittance exchange. The simulation is performed using GEANT4. The total fractional transmission is 0.85, and the transverse, longitudinal, and 3-D cooling factors are 3.75, 2.27, and 14.61, respectively. Some version of this helical channel could eventually be used to replace the first section of the double flip channel to keep the longitudinal emittance under control and increase transmission. Although this is an interesting option, the technical challenges are still significant.

  16. Geant4 Simulation of Air Showers using Thinning Method

    NASA Astrophysics Data System (ADS)

    Sabra, Mohammad S.; Watts, John W.; Christl, Mark J.

    2015-04-01

    Simulation of complete air showers induced by cosmic ray particles becomes prohibitive at extreme energies due to the large number of secondary particles. The computing time of such simulations roughly scales with the energy of the primary cosmic ray particle and becomes excessively large. To mitigate the problem, only a small fraction of the particles is tracked, and the whole shower is then reconstructed from this sample. This method is called thinning. Using this method in Geant4, we have simulated proton and iron air showers at extreme energies (E > 10¹⁶ eV). Secondary particle densities are calculated and compared with the standard simulation program in this field, CORSIKA. This work is supported by the NASA Postdoctoral Program administered by Oak Ridge Associated Universities.

  17. Monte Carlo application based on GEANT4 toolkit to simulate a laser-plasma electron beam line for radiobiological studies

    NASA Astrophysics Data System (ADS)

    Lamia, D.; Russo, G.; Casarino, C.; Gagliano, L.; Candiano, G. C.; Labate, L.; Baffigi, F.; Fulgentini, L.; Giulietti, A.; Koester, P.; Palla, D.; Gizzi, L. A.; Gilardi, M. C.

    2015-06-01

    We report on the development of a Monte Carlo application, based on the GEANT4 toolkit, for the characterization and optimization of electron beams for clinical applications produced by a laser-driven plasma source. The GEANT4 application is conceived so as to represent in the most general way the physical and geometrical features of a typical laser-driven accelerator. It is designed to provide standard dosimetric figures such as percentage dose depth curves, two-dimensional dose distributions and 3D dose profiles at different positions both inside and outside the interaction chamber. The application was validated by comparing its predictions to experimental measurements carried out on a real laser-driven accelerator. The work is aimed at optimizing the source, by using this novel application, for radiobiological studies and, in perspective, for medical applications.

  18. Enhancement and validation of Geant4 Brachytherapy application on clinical HDR ¹⁹²Ir source

    NASA Astrophysics Data System (ADS)

    Ababneh, Eshraq; Dababneh, Saed; Qatarneh, Sharif; Wadi-Ramahi, Shada

    2014-10-01

    The Brachytherapy example distributed with the Geant4 Monte Carlo (MC) toolkit was adapted and enhanced, and several analysis techniques have been developed. The simulation studies the isodose distribution of the total, primary and scattered doses around a Nucletron microSelectron ¹⁹²Ir source. Different phantom materials were used (water, tissue and bone) and the calculation was conducted at various depths and planes. The work provides an early estimate of the number of primary events required to ultimately achieve a given uncertainty at a given distance in the otherwise CPU- and time-consuming clinical MC calculation. The adaptation of the Geant4 toolkit and the enhancements introduced to the code are all validated, including the comprehensive decay of the ¹⁹²Ir source, the materials used to build the geometry, the geometry itself and the calculated scatter-to-primary dose ratio. The simulation quantitatively illustrates that the scattered dose in the bone medium is larger than its value in water and tissue. As the distance away from the source increases, the scatter contribution to the dose becomes more significant as the primary dose decreases. The developed code could be viewed as a platform that contains a detailed dose calculation model for the clinical application of HDR ¹⁹²Ir in brachytherapy.

  19. Development and validation of a GEANT4 radiation transport code for CT dosimetry

    PubMed Central

    Carver, DE; Kost, SD; Fernald, MJ; Lewis, KG; Fraser, ND; Pickens, DR; Price, RR; Stabin, MG

    2014-01-01

    We have created a radiation transport code using the GEANT4 Monte Carlo toolkit to simulate pediatric patients undergoing CT examinations. The focus of this paper is to validate our simulation with real-world physical dosimetry measurements using two independent techniques. Exposure measurements were made with a standard 100-mm CT pencil ionization chamber, and absorbed doses were also measured using optically stimulated luminescent (OSL) dosimeters. Measurements were made in air, a standard 16-cm acrylic head phantom, and a standard 32-cm acrylic body phantom. Physical dose measurements determined from the ionization chamber in air for 100 and 120 kVp beam energies were used to derive photon-fluence calibration factors. Both ion chamber and OSL measurement results provide useful comparisons in the validation of our Monte Carlo simulations. We found that simulated and measured CTDI values were within an overall average of 6% of each other. PMID:25706135
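
    For context, the CTDI values compared above are obtained from the 100-mm pencil-chamber reading using the usual definitions (standard convention, not restated in the abstract):

        \[
          \mathrm{CTDI}_{100} = \frac{1}{N\,T}\int_{-50\,\mathrm{mm}}^{+50\,\mathrm{mm}} D(z)\,\mathrm{d}z,
          \qquad
          \mathrm{CTDI}_{w} = \tfrac{1}{3}\,\mathrm{CTDI}_{100}^{\mathrm{centre}}
                            + \tfrac{2}{3}\,\mathrm{CTDI}_{100}^{\mathrm{periphery}},
        \]

    where D(z) is the dose profile along the scanner axis, N is the number of tomographic sections per rotation and T the nominal section thickness; scoring the simulated dose over the same 100-mm length allows the like-for-like comparison that yields the 6% agreement quoted above.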

  20. Development and validation of a GEANT4 radiation transport code for CT dosimetry.

    PubMed

    Carver, D E; Kost, S D; Fernald, M J; Lewis, K G; Fraser, N D; Pickens, D R; Price, R R; Stabin, M G

    2015-04-01

    The authors have created a radiation transport code using the GEANT4 Monte Carlo toolkit to simulate pediatric patients undergoing CT examinations. The focus of this paper is to validate their simulation with real-world physical dosimetry measurements using two independent techniques. Exposure measurements were made with a standard 100-mm CT pencil ionization chamber, and absorbed doses were also measured using optically stimulated luminescent (OSL) dosimeters. Measurements were made in air with a standard 16-cm acrylic head phantom and with a standard 32-cm acrylic body phantom. Physical dose measurements determined from the ionization chamber in air for 100 and 120 kVp beam energies were used to derive photon-fluence calibration factors. Both ion chamber and OSL measurement results provide useful comparisons in the validation of the Monte Carlo simulations. It was found that simulated and measured CTDI values were within an overall average of 6% of each other. PMID:25706135

  1. Geant4 simulations of Solar-Orbiter STIX Caliste detectors’ response to solar X-ray radiation

    NASA Astrophysics Data System (ADS)

    Barylak, Jaromir; Barylak, Aleksandra; Mrozek, Tomasz; Steslicki, Marek; Podgorski, Piotr; Netzel, Henryka

    2015-08-01

    Spectrometer/Telescope for Imaging X-rays (STIX) is part of the Solar Orbiter (SO) science payload. SO will be launched in October 2018 into a final orbit approaching the Sun to within 0.3 a.u. STIX is a Fourier imager equipped with pairs of grids that comprise the flare hard X-ray tomograph. Similar imager types were already used in the past (e.g. RHESSI, Yohkoh/HXT), but STIX will incorporate Moiré modulation and a new type of pixelated detector. We developed a method of modeling these detectors’ response matrix (DRM) using Geant4 simulations of X-ray photon interactions with CdTe crystals. Taking into account known detector effects (Fano noise, hole tailing etc.) we modeled the resulting spectra with high accuracy. Comparison of Caliste-SO laboratory measurements of the ²⁴¹Am decay spectrum with our results shows very good agreement (within 1-2%). By using the Geant4 tool we proceed to model the ageing response of the detectors (several years in interplanetary space). The modeling based on the Geant4 simulations significantly improves our understanding of the detector response to X-ray photons and to secondary background emission due to particles. As an example we present predicted X-ray spectra of solar flares obtained for several levels of detector degradation and for various distances of SO from the Sun.

  2. Nuclear reaction measurements on tissue-equivalent materials and GEANT4 Monte Carlo simulations for hadrontherapy

    NASA Astrophysics Data System (ADS)

    De Napoli, M.; Romano, F.; D'Urso, D.; Licciardello, T.; Agodi, C.; Candiano, G.; Cappuzzello, F.; Cirrone, G. A. P.; Cuttone, G.; Musumarra, A.; Pandola, L.; Scuderi, V.

    2014-12-01

    When a carbon beam interacts with human tissues, many secondary fragments are produced into the tumor region and the surrounding healthy tissues. Therefore, in hadrontherapy precise dose calculations require Monte Carlo tools equipped with complex nuclear reaction models. To get realistic predictions, however, simulation codes must be validated against experimental results; the wider the dataset is, the more the models are finely tuned. Since no fragmentation data for tissue-equivalent materials at Fermi energies are available in literature, we measured secondary fragments produced by the interaction of a 55.6 MeV u⁻¹ ¹²C beam with thick muscle and cortical bone targets. Three reaction models used by the Geant4 Monte Carlo code, the Binary Light Ions Cascade, the Quantum Molecular Dynamic and the Liège Intranuclear Cascade, have been benchmarked against the collected data. In this work we present the experimental results and we discuss the predictive power of the above mentioned models.

  3. G4SiPM: A novel silicon photomultiplier simulation package for Geant4

    NASA Astrophysics Data System (ADS)

    Niggemann, Tim; Dietz-Laursonn, Erik; Hebbeker, Thomas; Künsken, Andreas; Lauscher, Markus; Merschmeyer, Markus

    2015-07-01

    The signal of silicon photomultipliers (SiPMs) depends not only on the number of incoming photons but also on thermal and correlated noise of which the latter is difficult to handle. Additionally, the properties of SiPMs vary with the supplied bias voltage and the ambient temperature. The purpose of the G4SiPM simulation package is the integration of a detailed SiPM simulation into Geant4 which is widely used in particle physics. The prediction of the G4SiPM simulation code is validated with a laboratory measurement of the dynamic range of a 3 × 3 mm² SiPM with 3600 cells manufactured by Hamamatsu.

  4. Validating Geant4 Versions 7.1 and 8.3 Against 6.1 for BaBar

    SciTech Connect

    Banerjee, Swagato; Brown, David N.; Chen, Chunhui; Cote, David; Dubois-Felsmann, Gregory P.; Gaponenko, Igor; Kim, Peter C.; Lockman, William S.; Neal, Homer A.; Simi, Gabriele; Telnov, Alexandre V.; Wright, Dennis H.; /SLAC

    2011-11-08

    Since 2005 and 2006, respectively, Geant4 versions 7.1 and 8.3 have been available, providing: improvements in modeling of multiple scattering; corrections to muon ionization and improved MIP signature; widening of the core of electromagnetic shower shape profiles; newer implementation of elastic scattering for hadronic processes; detailed implementation of Bertini cascade model for kaons and lambdas, and updated hadronic cross-sections from calorimeter beam tests. The effects of these changes in simulation are studied in terms of closer agreement of simulation using Geant4 versions 7.1 and 8.3 as compared to Geant4 version 6.1 with respect to data distributions of: the hit residuals of tracks in BABAR silicon vertex tracker; the photon and K_L^0 shower shapes in the electromagnetic calorimeter; the ratio of energy deposited in the electromagnetic calorimeter and the flux return of the magnet instrumented with a muon detection system composed of resistive plate chambers and limited-streamer tubes; and the muon identification efficiency in the muon detector system of the BABAR detector.

  5. Geant4-DNA simulations using complex DNA geometries generated by the DnaFabric tool

    NASA Astrophysics Data System (ADS)

    Meylan, S.; Vimont, U.; Incerti, S.; Clairand, I.; Villagrasa, C.

    2016-07-01

    Several DNA representations are used to study radio-induced complex DNA damage, depending on the approach and the required level of granularity. Among all approaches, the mechanistic one requires the most resolved DNA models, which can go down to atomistic DNA descriptions. The complexity of such DNA models makes them hard to modify and adapt in order to take into account different biological conditions. The DnaFabric project was started to provide a tool to generate, visualise and modify such complex DNA models. In the current version of DnaFabric, the models can be exported to the Geant4 code to be used as targets in the Monte Carlo simulation. In this work, the project was used to generate two DNA fibre models corresponding to two DNA compaction levels representing heterochromatin and euchromatin. The fibres were imported into a Geant4 application where computations were performed to estimate the influence of the DNA compaction on the amount of calculated DNA damage. The relative difference of the DNA damage computed in the two fibres for the same number of projectiles was found to be constant and equal to 1.3 for the considered primary particles (protons from 300 keV to 50 MeV). However, if only the tracks hitting the DNA target are taken into account, the relative difference is larger at low energies and decreases to reach zero around 10 MeV. The computations were performed with models that contain up to 18,000 DNA nucleotide pairs. Nevertheless, DnaFabric will be extended to manipulate multi-scale models that go from the molecular to the cellular levels.

  6. GEANT4 Simulation of Neutron Detector for DAMPE

    NASA Astrophysics Data System (ADS)

    He, M.; Ma, T.; Chang, J.; Zhang, Y.; Huang, Y. Y.; Zang, J. J.; Wu, J.; Dong, T. K.

    2016-01-01

    Over recent decades, dark matter has gradually become a hot topic in astronomical research, and the related theoretical and experimental projects evolve rapidly. China's Dark Matter Particle Explorer (DAMPE) was proposed in this context. Since the probe targets high-energy electrons, appropriate methods must be adopted to distinguish them from protons, in order to reduce the probability of other charged particles (e.g. protons) being mistaken for electrons. Experiments show that the hadronic shower of a high-energy proton in the BGO electromagnetic calorimeter, which is usually accompanied by the emission of a large number of secondary neutrons, differs significantly from the electromagnetic shower of a high-energy electron. By detecting the secondary neutron signal emerging from the bottom of the BGO electromagnetic calorimeter, together with the shower shape of the incident particles in the calorimeter, we can effectively determine whether the incident particles are high-energy protons or electrons. This paper introduces the structure and detection principle of the DAMPE neutron detector. We use the Monte Carlo method with the GEANT4 software to simulate the signals produced by protons and electrons at characteristic energies in the neutron detector, and finally summarize the neutron detector's ability to distinguish protons from electrons under different electron acceptance efficiencies.

  7. GEANT4 simulation of APEX background radiation and shielding

    NASA Astrophysics Data System (ADS)

    Kaluarachchi, Maduka M.; Cates, Gordon D.; Wojtsekhowski, B.

    2015-04-01

    The A' Experiment (APEX), which is approved to run at the Thomas Jefferson National Accelerator Facility (JLab) Hall A, will search for a new vector boson that is hypothesized to be a possible force carrier that couples to dark matter. APEX results should be sensitive to the mass range of 65 MeV to 550 MeV, and high sensitivity will be achieved by means of a high-intensity 100 μA beam on a 0.5 g/cm2 tungsten target, resulting in very high luminosity. The experiment should be able to observe the A' with a coupling constant α' ~ 1 × 10⁷ times smaller than the electromagnetic coupling constant α. To deal safely with such enormous intensity and luminosity, a full radiation analysis must be used to help with the design of proper radiation shielding. The purpose of this talk is to present preliminary results obtained by simulating the radiation background from the APEX experiment using the 3D Monte Carlo transport code Geant4. Included in the simulation is a detailed Hall A setup: the hall, spectrometers and shield house, beam dump, beam line, septum magnet with its field, as well as the production target. The results were compared to the APEX test run data and used in the development of the radiation shielding for sensitive electronics.

  8. Optimization of a photoneutron source based on 10 MeV electron beam using Geant4 Monte Carlo code

    NASA Astrophysics Data System (ADS)

    Askri, Boubaker

    2015-10-01

    The Geant4 Monte Carlo code has been used to conceive and optimize a simple and compact neutron source based on a 10 MeV electron beam impinging on a tungsten target adjoined to a beryllium target. For this purpose, a precise photonuclear reaction cross-section model taken from the International Atomic Energy Agency (IAEA) database was linked to Geant4 to accurately simulate the interaction of low-energy bremsstrahlung photons with the beryllium material. A benchmark test showed good agreement between the emitted neutron flux spectra predicted by the Geant4 and Fluka codes for a beryllium cylinder bombarded with a 5 MeV photon beam. The source optimization was achieved through a two-stage Monte Carlo simulation. In the first stage, the distributions of the seven phase-space coordinates of the bremsstrahlung photons at the boundaries of the tungsten target were determined. In the second stage, events corresponding to photons emitted according to these distributions were tracked. A neutron yield of 4.8 × 10¹⁰ neutrons/mA/s was obtained at 20 cm from the beryllium target. A thermal neutron yield of 1.5 × 10⁹ neutrons/mA/s was obtained after introducing a spherical shell of polyethylene as a neutron moderator.
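
    As an illustration of the two-stage approach described above, the sketch below shows how a second-stage Geant4 primary generator might fire photons whose energy, position and direction are drawn from the stage-one distributions; the class name and the SamplePhaseSpace() stand-in are hypothetical and not taken from the paper.

      #include "G4VUserPrimaryGeneratorAction.hh"
      #include "G4ParticleGun.hh"
      #include "G4ParticleTable.hh"
      #include "G4ThreeVector.hh"
      #include "G4SystemOfUnits.hh"
      #include "G4Event.hh"
      #include "Randomize.hh"

      // Hypothetical stage-two generator: photons start at the tungsten exit
      // plane with kinematics sampled from the recorded phase-space histograms.
      class PhotonFromPhaseSpace : public G4VUserPrimaryGeneratorAction {
       public:
        PhotonFromPhaseSpace() : fGun(new G4ParticleGun(1)) {
          fGun->SetParticleDefinition(
              G4ParticleTable::GetParticleTable()->FindParticle("gamma"));
        }
        ~PhotonFromPhaseSpace() override { delete fGun; }

        void GeneratePrimaries(G4Event* event) override {
          G4double e, x, y, ux, uy, uz;
          SamplePhaseSpace(e, x, y, ux, uy, uz);
          fGun->SetParticleEnergy(e);
          fGun->SetParticlePosition(G4ThreeVector(x, y, 0.));
          fGun->SetParticleMomentumDirection(G4ThreeVector(ux, uy, uz));
          fGun->GeneratePrimaryVertex(event);
        }

       private:
        // Stand-in sampler; a real implementation would draw from the seven
        // stage-one phase-space distributions instead of flat placeholders.
        void SamplePhaseSpace(G4double& e, G4double& x, G4double& y,
                              G4double& ux, G4double& uy, G4double& uz) {
          e  = 10.*MeV*G4UniformRand();
          x  = 10.*mm*(G4UniformRand() - 0.5);
          y  = 10.*mm*(G4UniformRand() - 0.5);
          ux = 0.; uy = 0.; uz = 1.;
        }
        G4ParticleGun* fGun;
      };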

  9. Radiation quality of cosmic ray nuclei studied with Geant4-based simulations

    NASA Astrophysics Data System (ADS)

    Burigo, Lucas N.; Pshenichnov, Igor A.; Mishustin, Igor N.; Bleicher, Marcus

    2014-04-01

    In future deep-space missions, a spacecraft will be exposed to a non-negligible flux of high charge and energy (HZE) particles present in the galactic cosmic rays (GCR). One of the major concerns of manned missions is the impact on humans of the complex radiation fields which result from the interactions of HZE particles with the spacecraft materials. The radiation quality of several ions representing GCR is investigated by calculating microdosimetry spectra. A Geant4-based Monte Carlo model for Heavy Ion Therapy (MCHIT) is used to simulate microdosimetry data for HZE particles in extended media, where fragmentation reactions play a role. Our model is able to reproduce measured microdosimetry spectra for H, He, Li, C and Si in the energy range of 150-490 MeV/u. The effect of nuclear fragmentation on the relative biological effectiveness (RBE) of He, Li and C is estimated and found to be below 10%.

  10. ALGEBRA: ALgorithm for the heterogeneous dosimetry based on GEANT4 for BRAchytherapy.

    PubMed

    Afsharpour, H; Landry, G; D'Amours, M; Enger, S; Reniers, B; Poon, E; Carrier, J-F; Verhaegen, F; Beaulieu, L

    2012-06-01

    Task group 43 (TG43)-based dosimetry algorithms are efficient for brachytherapy dose calculation in water. However, human tissues have chemical compositions and densities different than water. Moreover, the mutual shielding effect of seeds on each other (interseed attenuation) is neglected in the TG43-based dosimetry platforms. The scientific community has expressed the need for an accurate dosimetry platform in brachytherapy. The purpose of this paper is to present ALGEBRA, a Monte Carlo platform for dosimetry in brachytherapy which is sufficiently fast and accurate for clinical and research purposes. ALGEBRA is based on the GEANT4 Monte Carlo code and is capable of handling the DICOM RT standard to recreate a virtual model of the treated site. Here, the performance of ALGEBRA is presented for the special case of LDR brachytherapy in permanent prostate and breast seed implants. However, the algorithm is also capable of handling other treatments such as HDR brachytherapy. PMID:22572100

  11. ALGEBRA: ALgorithm for the heterogeneous dosimetry based on GEANT4 for BRAchytherapy

    NASA Astrophysics Data System (ADS)

    Afsharpour, H.; Landry, G.; D'Amours, M.; Enger, S.; Reniers, B.; Poon, E.; Carrier, J.-F.; Verhaegen, F.; Beaulieu, L.

    2012-06-01

    Task group 43 (TG43)-based dosimetry algorithms are efficient for brachytherapy dose calculation in water. However, human tissues have chemical compositions and densities different than water. Moreover, the mutual shielding effect of seeds on each other (interseed attenuation) is neglected in the TG43-based dosimetry platforms. The scientific community has expressed the need for an accurate dosimetry platform in brachytherapy. The purpose of this paper is to present ALGEBRA, a Monte Carlo platform for dosimetry in brachytherapy which is sufficiently fast and accurate for clinical and research purposes. ALGEBRA is based on the GEANT4 Monte Carlo code and is capable of handling the DICOM RT standard to recreate a virtual model of the treated site. Here, the performance of ALGEBRA is presented for the special case of LDR brachytherapy in permanent prostate and breast seed implants. However, the algorithm is also capable of handling other treatments such as HDR brachytherapy.

  12. GATE - Geant4 Application for Tomographic Emission: a simulation toolkit for PET and SPECT

    PubMed Central

    Jan, S.; Santin, G.; Strul, D.; Staelens, S.; Assié, K.; Autret, D.; Avner, S.; Barbier, R.; Bardiès, M.; Bloomfield, P. M.; Brasse, D.; Breton, V.; Bruyndonckx, P.; Buvat, I.; Chatziioannou, A. F.; Choi, Y.; Chung, Y. H.; Comtat, C.; Donnarieix, D.; Ferrer, L.; Glick, S. J.; Groiselle, C. J.; Guez, D.; Honore, P.-F.; Kerhoas-Cavata, S.; Kirov, A. S.; Kohli, V.; Koole, M.; Krieguer, M.; van der Laan, D. J.; Lamare, F.; Largeron, G.; Lartizien, C.; Lazaro, D.; Maas, M. C.; Maigne, L.; Mayet, F.; Melot, F.; Merheb, C.; Pennacchio, E.; Perez, J.; Pietrzyk, U.; Rannou, F. R.; Rey, M.; Schaart, D. R.; Schmidtlein, C. R.; Simon, L.; Song, T. Y.; Vieira, J.-M.; Visvikis, D.; Van de Walle, R.; Wieërs, E.; Morel, C.

    2012-01-01

    Monte Carlo simulation is an essential tool in emission tomography that can assist in the design of new medical imaging devices, the optimization of acquisition protocols, and the development or assessment of image reconstruction algorithms and correction techniques. GATE, the Geant4 Application for Tomographic Emission, encapsulates the Geant4 libraries to achieve a modular, versatile, scripted simulation toolkit adapted to the field of nuclear medicine. In particular, GATE allows the description of time-dependent phenomena such as source or detector movement, and source decay kinetics. This feature makes it possible to simulate time curves under realistic acquisition conditions and to test dynamic reconstruction algorithms. This paper gives a detailed description of the design and development of GATE by the OpenGATE collaboration, whose continuing objective is to improve, document, and validate GATE by simulating commercially available imaging systems for PET and SPECT. Large effort is also invested in the ability and the flexibility to model novel detection systems or systems still under design. A public release of GATE licensed under the GNU Lesser General Public License can be downloaded at the address http://www-lphe.ep.ch/GATE/. Two benchmarks developed for PET and SPECT to test the installation of GATE and to serve as a tutorial for the users are presented. Extensive validation of the GATE simulation platform has been started, comparing simulations and measurements on commercially available acquisition systems. References to those results are listed. The future prospects toward the gridification of GATE and its extension to other domains such as dosimetry are also discussed. PMID:15552416

  13. Comparison of GATE/GEANT4 with EGSnrc and MCNP for electron dose calculations at energies between 15 keV and 20 MeV

    NASA Astrophysics Data System (ADS)

    Maigne, L.; Perrot, Y.; Schaart, D. R.; Donnarieix, D.; Breton, V.

    2011-02-01

    The GATE Monte Carlo simulation platform based on the GEANT4 toolkit has come into widespread use for simulating positron emission tomography (PET) and single photon emission computed tomography (SPECT) imaging devices. Here, we explore its use for calculating electron dose distributions in water. Mono-energetic electron dose point kernels and pencil beam kernels in water are calculated for different energies between 15 keV and 20 MeV by means of GATE 6.0, which makes use of the GEANT4 version 9.2 Standard Electromagnetic Physics Package. The results are compared to the well-validated codes EGSnrc and MCNP4C. It is shown that recent improvements made to the GEANT4/GATE software result in significantly better agreement with the other codes. We furthermore illustrate several issues of general interest to GATE and GEANT4 users who wish to perform accurate simulations involving electrons. Provided that the electron step size is sufficiently restricted, GATE 6.0 and EGSnrc dose point kernels are shown to agree to within less than 3% of the maximum dose between 50 keV and 4 MeV, while pencil beam kernels are found to agree to within less than 4% of the maximum dose between 15 keV and 20 MeV.

  14. Comparison of GATE/GEANT4 with EGSnrc and MCNP for electron dose calculations at energies between 15 keV and 20 MeV.

    PubMed

    Maigne, L; Perrot, Y; Schaart, D R; Donnarieix, D; Breton, V

    2011-02-01

    The GATE Monte Carlo simulation platform based on the GEANT4 toolkit has come into widespread use for simulating positron emission tomography (PET) and single photon emission computed tomography (SPECT) imaging devices. Here, we explore its use for calculating electron dose distributions in water. Mono-energetic electron dose point kernels and pencil beam kernels in water are calculated for different energies between 15 keV and 20 MeV by means of GATE 6.0, which makes use of the GEANT4 version 9.2 Standard Electromagnetic Physics Package. The results are compared to the well-validated codes EGSnrc and MCNP4C. It is shown that recent improvements made to the GEANT4/GATE software result in significantly better agreement with the other codes. We furthermore illustrate several issues of general interest to GATE and GEANT4 users who wish to perform accurate simulations involving electrons. Provided that the electron step size is sufficiently restricted, GATE 6.0 and EGSnrc dose point kernels are shown to agree to within less than 3% of the maximum dose between 50 keV and 4 MeV, while pencil beam kernels are found to agree to within less than 4% of the maximum dose between 15 keV and 20 MeV. PMID:21239846

  15. Study on GEANT4 code applications to dose calculation using imaging data

    NASA Astrophysics Data System (ADS)

    Lee, Jeong Ok; Kang, Jeong Ku; Kim, Jhin Kee; Kwon, Hyeong Cheol; Kim, Jung Soo; Kim, Bu Gil; Jeong, Dong Hyeok

    2015-07-01

    The use of the GEANT4 code has increased in the medical field. Various studies have calculated patient dose distributions by using the GEANT4 code with imaging data. In the present study, Monte Carlo simulations based on DICOM data were performed to calculate the dose absorbed in the patient's body. Various visualization tools are included in the GEANT4 code to display the detector construction; however, the display of DICOM images is limited. In addition, displaying the dose distributions on the imaging data of the patient is difficult. Recently, the gMocren code, a volume visualization tool for GEANT4 simulation, was developed and has been used for the volume visualization of image files. In this study, imaging of the dose distributions absorbed in the patients was performed by using the gMocren code. Dosimetric evaluations were carried out using thermoluminescent dosimeters and film dosimetry to verify the calculated results.

  16. Monte Carlo simulation and scatter correction of the GE Advance PET scanner with SimSET and Geant4

    NASA Astrophysics Data System (ADS)

    Barret, Olivier; Carpenter, T. Adrian; Clark, John C.; Ansorge, Richard E.; Fryer, Tim D.

    2005-10-01

    For Monte Carlo simulations to be used as an alternative solution to perform scatter correction, accurate modelling of the scanner as well as speed is paramount. General-purpose Monte Carlo packages (Geant4, EGS, MCNP) allow a detailed description of the scanner but are not efficient at simulating voxel-based geometries (patient images). On the other hand, dedicated codes (SimSET, PETSIM) will perform well for voxel-based objects but will be poor in their capacity of simulating complex geometries such as a PET scanner. The approach adopted in this work was to couple a dedicated code (SimSET) with a general-purpose package (Geant4) to have the efficiency of the former and the capabilities of the latter. The combined SimSET+Geant4 code (SimG4) was assessed on the GE Advance PET scanner and compared to the use of SimSET only. A better description of the resolution and sensitivity of the scanner and of the scatter fraction was obtained with SimG4. The accuracy of scatter correction performed with SimG4 and SimSET was also assessed from data acquired with the 20 cm NEMA phantom. SimG4 was found to outperform SimSET and to give slightly better results than the GE scatter correction methods installed on the Advance scanner (curve fitting and scatter modelling for the 300-650 keV and 375-650 keV energy windows, respectively). In the presence of a hot source close to the edge of the field of view (as found in oxygen scans), the GE curve-fitting method was found to fail whereas SimG4 maintained its performance.

  17. Monte Carlo simulation and scatter correction of the GE advance PET scanner with SimSET and Geant4.

    PubMed

    Barret, Olivier; Carpenter, T Adrian; Clark, John C; Ansorge, Richard E; Fryer, Tim D

    2005-10-21

    For Monte Carlo simulations to be used as an alternative solution to perform scatter correction, accurate modelling of the scanner as well as speed is paramount. General-purpose Monte Carlo packages (Geant4, EGS, MCNP) allow a detailed description of the scanner but are not efficient at simulating voxel-based geometries (patient images). On the other hand, dedicated codes (SimSET, PETSIM) will perform well for voxel-based objects but will be poor in their capacity of simulating complex geometries such as a PET scanner. The approach adopted in this work was to couple a dedicated code (SimSET) with a general-purpose package (Geant4) to have the efficiency of the former and the capabilities of the latter. The combined SimSET+Geant4 code (SimG4) was assessed on the GE Advance PET scanner and compared to the use of SimSET only. A better description of the resolution and sensitivity of the scanner and of the scatter fraction was obtained with SimG4. The accuracy of scatter correction performed with SimG4 and SimSET was also assessed from data acquired with the 20 cm NEMA phantom. SimG4 was found to outperform SimSET and to give slightly better results than the GE scatter correction methods installed on the Advance scanner (curve fitting and scatter modelling for the 300-650 keV and 375-650 keV energy windows, respectively). In the presence of a hot source close to the edge of the field of view (as found in oxygen scans), the GE curve-fitting method was found to fail whereas SimG4 maintained its performance. PMID:16204875

  18. Interaction of Fast Nucleons with Actinide Nuclei Studied with GEANT4

    NASA Astrophysics Data System (ADS)

    Malyshkin, Yu.; Pshenichnov, I.; Mishustin, I.; Greiner, W.

    2014-04-01

    We model interactions of protons and neutrons with energies from 1 to 1000 MeV with 241Am and 243Am nuclei. The calculations are performed with the Monte Carlo model for Accelerator Driven Systems (MCADS), which we developed on the basis of version 9.4 of the GEANT4 toolkit. This toolkit is widely used to simulate the propagation of particles in various materials containing nuclei up to uranium. After several extensions, we apply it also to proton- and neutron-induced reactions on Am. The fission and radiative neutron capture cross sections, neutron multiplicities and distributions of fission fragments were calculated for 241Am and 243Am and compared with experimental data. As demonstrated, the fission of americium by energetic protons with energies above 20 MeV can be well described by the Intra-Nuclear Cascade Liège (INCL) model combined with the fission-evaporation model ABLA. The calculated average numbers of fission neutrons and mass distributions of fission products agree well with the corresponding data. However, proton-induced fission below 20 MeV is described less accurately. This is attributed to the limitations of the Intra-Nuclear Cascade model at low projectile energies.

  19. Use of GEANT4 vs. MCNPX for the characterization of a boron-lined neutron detector

    NASA Astrophysics Data System (ADS)

    van der Ende, B. M.; Atanackovic, J.; Erlandson, A.; Bentoumi, G.

    2016-06-01

    This work compares GEANT4 with MCNPX in the characterization of a boron-lined neutron detector. The neutron energy range simulated in this work (0.025 eV to 20 MeV) is the traditional domain of MCNP simulations. This paper addresses the question of how well GEANT4 and MCNPX can be employed for detailed thermal neutron detector characterization. To answer this, GEANT4 and MCNPX have been employed to simulate the detector response to a 252Cf energy spectrum point source, as well as to simulate mono-energetic parallel beam source geometries. The 252Cf energy spectrum simulation results demonstrate agreement in detector count rate within 3% between the two packages, with the MCNPX results being generally closer to experiment than those from GEANT4. The mono-energetic source simulations demonstrate agreement in detector response within 5% between the two packages for all neutron energies, and within 1% for neutron energies between 100 eV and 5 MeV. Cross-checks between the two types of simulations using ISO-8529 252Cf energy bins demonstrate that MCNPX results are more self-consistent than GEANT4 results, by 3-4%.

  20. The Simulation of AN Imaging Gamma-Ray Compton Backscattering Device Using GEANT4

    NASA Astrophysics Data System (ADS)

    Flechas, D.; Sarmiento, L. G.; Cristancho, F.; Fajardo, E.

    2014-02-01

    A gamma-backscattering imaging device dubbed the Compton Camera, developed at GSI (Darmstadt, Germany) and modified and studied at the Nuclear Physics Group of the National University of Colombia in Bogotá, uses the back-to-back emission of two gamma rays in positron annihilation to construct a two-dimensional image that represents the distribution of matter in the field of view of the camera. This imaging capability can be used in a host of different situations, for example, to identify and study deposition and structural defects, or to help locate concealed objects, to name just two cases. In order to increase the understanding of the response of the Compton Camera, in particular of its image formation process, and to assist in the data analysis, a simulation of the camera was developed using the GEANT4 simulation toolkit. In this work, the images resulting from different experimental conditions are shown. The simulated images and their comparison with the experimental ones already suggest methods to improve the present experimental device.

  1. The radiation environment near the lunar surface: CRaTER observations and Geant4 simulations

    NASA Astrophysics Data System (ADS)

    Looper, M. D.; Mazur, J. E.; Blake, J. B.; Spence, H. E.; Schwadron, N. A.; Golightly, M. J.; Case, A. W.; Kasper, J. C.; Townsend, L. W.

    2013-04-01

    At the start of the Lunar Reconnaissance Orbiter mission in 2009, its Cosmic Ray Telescope for the Effects of Radiation instrument measured the radiation environment near the Moon during the recent deep solar minimum, when galactic cosmic rays (GCRs) were at the highest level observed during the space age. We present observations that show the combined effects of GCR primaries, secondary particles ("albedo") created by the interaction of GCRs with the lunar surface, and the interactions of these particles in the shielding material overlying the silicon solid-state detectors of the Cosmic Ray Telescope for the Effects of Radiation. We use Geant4 to model the energy and angular distribution of the albedo particles, and to model the response of the sensor to the various particle species reaching the 50 kilometer altitude of the Lunar Reconnaissance Orbiter. Using simulations to gain insight into the observations, we are able to present preliminary energy-deposit spectra for evaluation of the radiation environment's effects on other sensitive materials, whether biological or electronic, that would be exposed to a similar near-lunar environment.

  2. Distributions of deposited energy and ionization clusters around ion tracks studied with Geant4 toolkit

    NASA Astrophysics Data System (ADS)

    Burigo, Lucas; Pshenichnov, Igor; Mishustin, Igor; Hilgers, Gerhard; Bleicher, Marcus

    2016-05-01

    The Geant4-based Monte Carlo model for Heavy-Ion Therapy (MCHIT) was extended to study the patterns of energy deposition at sub-micrometer distance from individual ion tracks. Dose distributions for low-energy 1H, 4He, 12C and 16O ions measured in several experiments are well described by the model in a broad range of radial distances, from 0.5 to 3000 nm. Despite the fact that such distributions are characterized by long tails, a dominant fraction of deposited energy (∼80%) is confined within a radius of about 10 nm. The probability distributions of clustered ionization events in nanoscale volumes of water traversed by 1H, 2H, 4He, 6Li, 7Li, and 12C ions are also calculated. A good agreement of calculated ionization cluster-size distributions with the corresponding experimental data suggests that the extended MCHIT can be used to characterize stochastic processes of energy deposition to sensitive cellular structures.

  3. Application of GEANT4 in the Development of New Radiation Therapy Treatment Methods

    NASA Astrophysics Data System (ADS)

    Brahme, Anders; Gudowska, Irena; Larsson, Susanne; Andreassen, Björn; Holmberg, Rickard; Svensson, Roger; Ivanchenko, Vladimir; Bagulya, Alexander; Grichine, Vladimir; Starkov, Nikolay

    2006-04-01

    New radiation treatment methods are being developed very rapidly today, from the advanced use of intensity-modulated photon and electron beams to light-ion therapy with treatment units based on narrow scanned beams. Accurate radiation transport calculations are a key requisite for these developments, and Geant4 is a very useful Monte Carlo code for the accurate design of new treatment units. Today we can not only image the tumor by PET-CT before the treatment but also determine the tumor sensitivity to radiation and even measure in vivo the delivered absorbed dose in three dimensions in the patient. With such methods, accurate Monte Carlo calculations will make radiation therapy an almost exact science, where curative doses can be calculated based on patient-individual response data. In the present study, results from the application of Geant4 are discussed and comparisons between Geant4 and experimental and other Monte Carlo data are presented.

  4. Application of GEANT4 radiation transport toolkit to dose calculations in anthropomorphic phantoms.

    PubMed

    Rodrigues, P; Trindade, A; Peralta, L; Alves, C; Chaves, A; Lopes, M C

    2004-12-01

    In this paper, we present a novel implementation of a dose calculation application, based on the GEANT4 Monte Carlo toolkit. Validation studies were performed with a homogeneous water phantom and an Alderson-Rando anthropomorphic phantom, both irradiated with high-energy photon beams produced by a clinical linear accelerator. As input, this tool requires computed tomography images for the automatic codification of voxel-based geometries and phase-space distributions to characterize the incident radiation field. Simulation results were compared with ionization chamber and thermoluminescent dosimetry data and with commercial treatment planning system calculations. In the homogeneous water phantom, overall agreement with measurements was within 1-2%. For the simulated anthropomorphic setups (thorax and head irradiation), mean differences between GEANT4 and TLD measurements were less than 2%. Significant differences between GEANT4 and a semi-analytical algorithm implemented in the treatment planning system were found in low-density regions, such as air cavities with strong electronic disequilibrium. PMID:15388147

  5. Dose conversion coefficients for ICRP110 voxel phantom in the Geant4 Monte Carlo code

    NASA Astrophysics Data System (ADS)

    Martins, M. C.; Cordeiro, T. P. V.; Silva, A. X.; Souza-Santos, D.; Queiroz-Filho, P. P.; Hunt, J. G.

    2014-02-01

    The reference adult male voxel phantom recommended by International Commission on Radiological Protection (ICRP) Publication 110 was implemented in the Geant4 Monte Carlo code. Geant4 was used to calculate Dose Conversion Coefficients (DCCs), expressed as dose deposited in organs per unit air kerma, for photons, electrons and neutrons, as tabulated in the Annals of the ICRP. In this work, the AP and PA irradiation geometries of the ICRP male phantom were simulated for the purpose of benchmarking the Geant4 code. Monoenergetic photons were simulated between 15 keV and 10 MeV, and the results were compared with ICRP 110, the VMC Monte Carlo code and the available literature data, showing good agreement.
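
    For context, a dose conversion coefficient of the kind benchmarked here is conventionally defined as the mean absorbed dose in organ T per unit air kerma of the incident monoenergetic field (a standard definition, not a quantity re-derived in the abstract):

      $$ \mathrm{DCC}_T(E) \;=\; \frac{\bar{D}_T(E)}{K_a(E)} \qquad [\mathrm{Gy\,Gy^{-1}}] $$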

  6. Microdosimetry of the Auger electron emitting 123I radionuclide using Geant4-DNA simulations

    NASA Astrophysics Data System (ADS)

    Fourie, H.; Newman, R. T.; Slabbert, J. P.

    2015-04-01

    Microdosimetric calculations of the Auger electron emitter 123I were done in liquid water spheres using the Geant4 toolkit. The electron emission spectrum of 123I produced by Geant4 is presented. Energy deposition and the corresponding S-values were calculated to investigate the influence of the sub-cellular localization of the Auger emitter. It was found that S-values calculated by the Geant4 toolkit are generally lower than the values calculated by other Monte Carlo codes for the 123I radionuclide. The differences in the compared S-values are mainly due to the different particle emission spectra employed by the respective computational codes and emphasize the influence of the spectra on dosimetry calculations.
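
    For reference, the S-value compared between codes here follows the standard MIRD definition, i.e. the absorbed dose in a target region per nuclear decay in a source region (general formalism, not specific numbers from this paper):

      $$ S(r_T \leftarrow r_S) \;=\; \frac{1}{m_T} \sum_i E_i\, Y_i\, \phi_i(r_T \leftarrow r_S) $$

    where $E_i$ and $Y_i$ are the energy and yield of emission $i$, and $\phi_i$ is the absorbed fraction in the target region of mass $m_T$.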

  7. Microdosimetry of the Auger electron emitting 123I radionuclide using Geant4-DNA simulations.

    PubMed

    Fourie, H; Newman, R T; Slabbert, J P

    2015-04-21

    Microdosimetric calculations of the Auger electron emitter (123)I were done in liquid water spheres using the Geant4 toolkit. The electron emission spectrum of (123)I produced by Geant4 is presented. Energy deposition and the corresponding S-values were calculated to investigate the influence of the sub-cellular localization of the Auger emitter. It was found that S-values calculated by the Geant4 toolkit are generally lower than the values calculated by other Monte Carlo codes for the (123)I radionuclide. The differences in the compared S-values are mainly due to the different particle emission spectra employed by the respective computational codes and emphasize the influence of the spectra on dosimetry calculations. PMID:25825914

  8. Geant4 simulations of proton beam transport through a carbon or beryllium degrader and following a beam line.

    PubMed

    van Goethem, M J; van der Meer, R; Reist, H W; Schippers, J M

    2009-10-01

    Monte Carlo simulations based on the Geant4 simulation toolkit were performed for the carbon wedge degrader used in the beam line at the Center of Proton Therapy of the Paul Scherrer Institute (PSI). The simulations are part of the beam line studies for the development and understanding of the GANTRY2 and OPTIS2 treatment facilities at PSI, but can also be applied to other beam lines. The simulated stopping power, momentum distributions at the degrader exit and beam line transmission have been compared to accurate benchmark measurements. Because the beam transport through magnetic elements is not easily modeled using Geant4, a connection to the TURTLE beam line simulation program was made. After adjusting the mean ionization potential of the carbon degrader material from 78 eV to 95 eV, we found an accurate match between simulations and benchmark measurements, so that the simulation model could be validated. We found that the degrader does not completely erase the initial beam phase space even at low degraded beam energies. Using the validation results, we present a study of the usability of beryllium as a degrader material (mean ionization potential 63.7 eV). We found an improvement in the transmission of 30-45%, depending on the degraded beam energy, with the higher value corresponding to the lower energies. PMID:19741273
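
    A minimal sketch of the adjustment described above, assuming the degrader material is taken from the Geant4 NIST database (the authors' actual material and geometry definitions are not shown in the abstract; the helper name is illustrative):

      #include "G4NistManager.hh"
      #include "G4Material.hh"
      #include "G4IonisParamMat.hh"
      #include "G4SystemOfUnits.hh"

      // Build a degrader material and override its mean excitation (ionisation)
      // potential: 95 eV reproduced the carbon benchmark data in the paper;
      // 63.7 eV is the value quoted for beryllium.
      G4Material* BuildDegraderMaterial(G4bool useBeryllium)
      {
        G4NistManager* nist = G4NistManager::Instance();
        G4Material* mat =
            nist->FindOrBuildMaterial(useBeryllium ? "G4_Be" : "G4_C");
        mat->GetIonisation()->SetMeanExcitationEnergy(
            useBeryllium ? 63.7*eV : 95.0*eV);
        return mat;
      }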

  9. Experimental spectra analysis in THM with the help of simulation based on the Geant4 framework

    NASA Astrophysics Data System (ADS)

    Li, Cheng-Bo; Wen, Qun-Gang; Zhou, Shu-Hua; Jiang, Zong-Jun; Fu, Yuan-Yong; Zhou, Jing; Meng, Qiu-Ying; Wang, Xiao-Lian

    2015-05-01

    The Coulomb barrier and electron screening cause difficulties in directly measuring nuclear reaction cross sections of charged particles at astrophysical energies. The Trojan-horse method (THM) has been introduced as a powerful indirect tool to overcome these difficulties. In order to better understand the experimental spectra, Geant4 is employed to simulate the method. The validity and reliability of the simulation are examined by comparing the experimental data with the simulated results. The Geant4 simulation of the THM improves the data analysis and is beneficial to the design of future related experiments. Supported by the National Natural Science Foundation of China (11075218, 10575132) and the Beijing Natural Science Foundation (1122017)

  10. Validation of a dose deposited by low-energy photons using GATE/GEANT4.

    PubMed

    Thiam, C O; Breton, V; Donnarieix, D; Habib, B; Maigne, L

    2008-06-01

    The GATE Monte Carlo simulation platform based on the Geant4 toolkit has now become a widely used tool for simulating PET and SPECT imaging devices. In this paper, we explore its relevance for the dosimetry of low-energy 125I photon brachytherapy sources used to treat prostate cancers. To that end, three 125I sources widely used in prostate cancer brachytherapy treatment have been modelled. GATE simulations reproducing dosimetric reference observables such as the radial dose function g(r), the anisotropy function F(r, theta) and the dose-rate constant (Lambda) were performed in liquid water. The calculations were split over the EGEE grid infrastructure to reduce the computing time of the simulations. The results were compared to other relevant Monte Carlo results and to measurements published and fixed as recommended values by the AAPM Task Group 43. GATE results agree with the consensus values published by AAPM Task Group 43 with an accuracy better than 2%, demonstrating that GATE is a relevant tool for the study of the dose induced by low-energy photons. PMID:18490808

  11. Determination of age specific ¹³¹I S-factor values for thyroid using anthropomorphic phantom in Geant4 simulations.

    PubMed

    Rahman, Ziaur; Ahmad, Syed Bilal; Mirza, Sikander M; Arshed, Waheed; Mirza, Nasir M; Ahmed, Waheed

    2014-08-01

    Using an anthropomorphic phantom in Geant4, the determination of β- and γ-absorbed fractions and of the energy absorbed per event due to (131)I activity in the thyroid of individuals of various age groups and geometrical models has been carried out. In the case of (131)I β-particles, the values of the absorbed fraction increased from 0.88 to 0.97 with fetus age. The maximum difference in absorbed energy per decay between soft tissue and water is 7.2% for γ-rays and 0.4% for β-particles. The new mathematical MIRD-embedded-in-Geant4 (MEG) and two-lobe ellipsoidal models developed in this work have 4.3% and 2.9% lower S-factor values compared with the ORNL data. PMID:24681428

  12. SU-E-J-72: Geant4 Simulations of Spot-Scanned Proton Beam Treatment Plans

    SciTech Connect

    Kanehira, T; Sutherland, K; Matsuura, T; Umegaki, K; Shirato, H

    2014-06-01

    Purpose: To evaluate density inhomogeneities which can affect dose distributions for real-time image gated spot-scanning proton therapy (RGPT), a dose calculation system, using spot position data from the treatment planning system VQA (Hitachi Ltd., Tokyo), was developed based on Geant4. Methods: A Geant4 application was developed to simulate spot-scanned proton beams at Hokkaido University Hospital. A CT scan (0.98 × 0.98 × 1.25 mm) was performed for prostate cancer treatment with three or four inserted gold markers (diameter 1.5 mm, volume 1.77 mm3) in or near the target tumor. The CT data was read into VQA. A spot scanning plan was generated and exported to text files, specifying the beam energy and position of each spot. The text files were converted and read into our Geant4-based software. The spot position was converted into steering magnet field strength (in tesla) for our beam nozzle. Individual protons were tracked from the vacuum chamber, through the helium chamber, steering magnets, dose monitors, etc., in a straight, horizontal line. The patient CT data was converted into materials with variable density and placed in a parametrized volume at the isocenter. Gold fiducial markers were represented in the CT data by two adjacent voxels (volume 2.38 mm3). 600,000 proton histories were tracked for each target spot. As one beam contained about 1,000 spots, approximately 600 million histories were recorded for each beam on a blade server. Two plans were considered: two-beam horizontal opposed (90 and 270 degrees) and three-beam (0, 90 and 270 degrees). Results: We are able to convert spot scanning plans from VQA and simulate them with our Geant4-based code. Our system can be used to evaluate the effect of dose reduction caused by gold markers used for RGPT. Conclusion: Our Geant4 application is able to calculate dose distributions for spot-scanned proton therapy.

  13. Simulation response of B4C-coated PPAC for thermal neutrons using GEANT4 Monte Carlo approach

    NASA Astrophysics Data System (ADS)

    Jamil, M.; Rhee, J. T.; Jeon, Y. J.

    2013-08-01

    In this work we report a technique employed for the detection of thermal neutrons using a parallel plate avalanche counter (PPAC). In order to make the detector sensitive to thermal neutrons, a thin layer of B4C has been coated on the forward electrode of the PPAC configuration. When neutrons impinge on the converter coating, charged particles are generated via the 10B(n,α)7Li reaction. In this simulation study, thermal neutrons have been simulated using the GEANT4 MC code, and the response of the detector has been evaluated as a function of neutron energy. For a better understanding of the simulated response, the performance of the detector has been evaluated using two different physics lists, i.e., QGSP_BIC_HP and QGSP_BERT_HP. The obtained results predict that such a boron-carbide-based PPAC can be potentially utilized for thermal neutron detection. A complete description of the detector configuration and the simulation results is also presented.
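
    A minimal sketch of how such reference physics lists are typically selected in a Geant4 application via the physics-list factory (illustrative only; this is not the authors' code):

      #include "G4RunManager.hh"
      #include "G4PhysListFactory.hh"
      #include "G4VModularPhysicsList.hh"

      int main()
      {
        G4RunManager* runManager = new G4RunManager();

        // Both reference lists ship with Geant4; the _HP variants add
        // high-precision neutron models below 20 MeV, relevant for the
        // 10B(n,alpha)7Li converter reaction.
        G4PhysListFactory factory;
        G4VModularPhysicsList* physicsList =
            factory.GetReferencePhysList("QGSP_BIC_HP");  // or "QGSP_BERT_HP"
        runManager->SetUserInitialization(physicsList);

        // ... detector construction, primary generator and run control ...
        delete runManager;
        return 0;
      }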

  14. Estimation of photoneutron yield in linear accelerator with different collimation systems by Geant4 and MCNPX simulation codes

    NASA Astrophysics Data System (ADS)

    Kim, Yoon Sang; Khazaei, Zeinab; Ko, Junho; Afarideh, Hossein; Ghergherehchi, Mitra

    2016-04-01

    At present, the bremsstrahlung photon beams produced by linear accelerators are the most commonly employed method of radiotherapy for tumor treatments. A photoneutron source based on three different energies (6, 10 and 15 MeV) of a linac electron beam was designed by means of Geant4 and Monte Carlo N-Particle eXtended (MCNPX) simulation codes. To obtain maximum neutron yield, two arrangements for the photo neutron convertor were studied: (a) without a collimator, and (b) placement of the convertor after the collimator. The maximum photon intensities in tungsten were 0.73, 1.24 and 2.07 photon/e at 6, 10 and 15 MeV, respectively. There was no considerable increase in the photon fluence spectra from 6 to 15 MeV at the optimum thickness between 0.8 mm and 2 mm of tungsten. The optimum dimensions of the collimator were determined to be a length of 140 mm with an aperture of 5 mm  ×  70 mm for iron in a slit shape. According to the neutron yield, the best thickness obtained for the studied materials was 30 mm. The number of neutrons generated in BeO achieved the maximum value at 6 MeV, unlike that in Be, where the highest number of neutrons was observed at 15 MeV. Statistical uncertainty in all simulations was less than 0.3% and 0.05% for MCNPX and the standard electromagnetic (EM) physics packages of Geant4, respectively. Differences among spectra in various regions are due to various cross-section and stopping power data and different simulations of the physics processes.

  15. Estimation of photoneutron yield in linear accelerator with different collimation systems by Geant4 and MCNPX simulation codes.

    PubMed

    Kim, Yoon Sang; Khazaei, Zeinab; Ko, Junho; Afarideh, Hossein; Ghergherehchi, Mitra

    2016-04-01

    At present, the bremsstrahlung photon beams produced by linear accelerators are the most commonly employed method of radiotherapy for tumor treatments. A photoneutron source based on three different energies (6, 10 and 15 MeV) of a linac electron beam was designed by means of Geant4 and Monte Carlo N-Particle eXtended (MCNPX) simulation codes. To obtain maximum neutron yield, two arrangements for the photo neutron convertor were studied: (a) without a collimator, and (b) placement of the convertor after the collimator. The maximum photon intensities in tungsten were 0.73, 1.24 and 2.07 photon/e at 6, 10 and 15 MeV, respectively. There was no considerable increase in the photon fluence spectra from 6 to 15 MeV at the optimum thickness between 0.8 mm and 2 mm of tungsten. The optimum dimensions of the collimator were determined to be a length of 140 mm with an aperture of 5 mm  ×  70 mm for iron in a slit shape. According to the neutron yield, the best thickness obtained for the studied materials was 30 mm. The number of neutrons generated in BeO achieved the maximum value at 6 MeV, unlike that in Be, where the highest number of neutrons was observed at 15 MeV. Statistical uncertainty in all simulations was less than 0.3% and 0.05% for MCNPX and the standard electromagnetic (EM) physics packages of Geant4, respectively. Differences among spectra in various regions are due to various cross-section and stopping power data and different simulations of the physics processes. PMID:26975304

  16. The GEANT4 toolkit capability in the hadron therapy field: simulation of a transport beam line

    NASA Astrophysics Data System (ADS)

    Cirrone, G. A. P.; Cuttone, G.; Di Rosa, F.; Raffaele, L.; Russo, G.; Guatelli, S.; Pia, M. G.

    2006-01-01

    At the Laboratori Nazionali del Sud of the Istituto Nazionale di Fisica Nucleare in Catania (Sicily, Italy), the first Italian hadron therapy facility, named CATANA (Centro di AdroTerapia ed Applicazioni Nucleari Avanzate), has been realized. Inside CATANA, 62 MeV proton beams, accelerated by a superconducting cyclotron, are used for the radiotherapeutic treatment of some types of ocular tumours. Therapy with hadron beams still represents a pioneering technique, and only a few centers worldwide can provide this advanced specialized cancer treatment. On the basis of the experience gained so far, and considering the future hadron-therapy facilities to be developed (Rinecker, Munich, Germany; Heidelberg/GSI, Darmstadt, Germany; PSI, Villigen, Switzerland; CNAO, Pavia, Italy; Centro di Adroterapia, Catania, Italy), we decided to develop a Monte Carlo application based on the GEANT4 toolkit for the design, realization and optimization of a proton-therapy beam line. Another feature of our project is to provide a general tool able to study the interactions of hadrons with human tissue and to test the analytical treatment planning systems currently used in routine practice. All the typical elements of a hadron-therapy line, such as diffusers, range shifters, collimators and detectors, were modelled. In particular, we simulated a Markus-type ionization chamber and a GafChromic film as dosimeters to reconstruct the depth dose (Bragg peak and spread-out Bragg peak) and lateral dose distributions, respectively. We validated our simulated detectors by comparing the results with the experimental data available at our facility.

  17. First Results of Saturation Curve Measurements of Heat-Resistant Steel Using GEANT4 and MCNP5 Codes

    NASA Astrophysics Data System (ADS)

    Hoang, Duc-Tam; Tran, Thien-Thanh; Le, Bao-Tran; Tran, Kim-Tuyet; Huynh, Dinh-Chuong; Vo, Hoang-Nguyen; Chau, Van-Tao

    A gamma backscattering technique is applied to calculate the saturation curve and the effective mass attenuation coefficient of a material. A NaI(Tl) detector collimated by a large-diameter collimator is modeled with the Monte Carlo technique using both the MCNP5 and GEANT4 codes. The results show good agreement between the response functions of the scattering spectra for the two codes. Based on such spectra, the saturation curve of heat-resistant steel is determined. The results strongly confirm that it is appropriate to use a large-diameter detector collimator to obtain the scattering spectra, and this work also forms the basis of an experimental set-up for determining the thickness of a material.

  18. Therapeutic dose simulation of a 6 MV Varian Linac photon beam using GEANT4

    NASA Astrophysics Data System (ADS)

    Salama, E.; Ali, A. S.; Khaled, N. E.; Radi, A.

    2015-10-01

    A program developed in C++ using GEANT4 libraries was used to simulate the gantry of a 6 MV high-energy photon linear accelerator (linac). The head of a clinical linear accelerator was simulated based on the manufacturer's detailed information. More than 2 × 10⁹ primary electrons were used to create the phase-space file. Evaluation of the percentage depth dose (PDD) and flatness/symmetry (lateral dose profiles) in a water phantom was performed. Comparisons between experimental and simulated data were carried out for three field sizes: 5 × 5, 10 × 10 and 15 × 15 cm2. Relatively good agreement was found between computed and measured PDD. Electron contamination and the spatial distributions of both photons and electrons in the simulated beam were evaluated. Moreover, the lateral dose profiles obtained at 15, 50, and 100 mm depth are compatible with the measured values. The results show that the GEANT4 code is a promising Monte Carlo program for radiotherapy applications.
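
    For clarity, the percentage depth dose evaluated here is the standard quantity

      $$ \mathrm{PDD}(d) \;=\; 100 \times \frac{D(d)}{D(d_{\max})} $$

    i.e. the dose on the beam axis at depth $d$ normalised to the dose at the depth of maximum dose.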

  19. Simulation of neutron production in heavy metal targets using Geant4 software

    NASA Astrophysics Data System (ADS)

    Baldin, A. A.; Berlev, A. I.; Kudashkin, I. V.; Mogildea, G.; Mogildea, M.; Paraipan, M.; Tyutyunnikov, S. I.

    2016-03-01

    Inelastic hadronic interactions in heavy targets have been simulated using Geant4 and compared with experimental data for thin and thick lead and uranium targets. Special attention is paid to neutron and fission fragment production. Good agreement in the description of proton-beam interaction with thick targets is demonstrated, which is important for the simulation of experiments aimed at the development of subcritical reactors.

  20. Calculation of self-shielding factor for neutron activation experiments using GEANT4 and MCNP

    NASA Astrophysics Data System (ADS)

    Romero-Barrientos, Jaime; Molina, F.; Aguilera, Pablo; Arellano, H. F.

    2016-07-01

    The neutron self-shielding factor G as a function of the neutron energy was obtained for 14 pure metallic samples in 1000 isolethargic energy bins from 1 × 10⁻⁵ eV to 2 × 10⁷ eV using Monte Carlo simulations in GEANT4 and MCNP6. The comparison of these two Monte Carlo codes shows small differences in the final self-shielding factor, mostly due to the different cross section databases that each program uses.
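
    A small, self-contained sketch of how such an isolethargic (equal steps in ln E) energy grid can be generated, matching the 1000 bins between 1 × 10⁻⁵ eV and 2 × 10⁷ eV quoted above (the helper name is illustrative):

      #include <cmath>
      #include <cstdio>
      #include <vector>

      // N isolethargic bin edges between eMin and eMax (constant lethargy width).
      std::vector<double> IsolethargicEdges(double eMin, double eMax, int nBins)
      {
        std::vector<double> edges(nBins + 1);
        const double du = std::log(eMax / eMin) / nBins;
        for (int i = 0; i <= nBins; ++i) edges[i] = eMin * std::exp(i * du);
        return edges;
      }

      int main()
      {
        const std::vector<double> e = IsolethargicEdges(1.0e-5, 2.0e7, 1000); // eV
        std::printf("first bin [%g, %g] eV, last bin [%g, %g] eV\n",
                    e[0], e[1], e[999], e[1000]);
        return 0;
      }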

  1. New Geant4 based simulation tools for space radiation shielding and effects analysis

    NASA Astrophysics Data System (ADS)

    Santin, G.; Nieminen, P.; Evans, H.; Daly, E.; Lei, F.; Truscott, P. R.; Dyer, C. S.; Quaghebeur, B.; Heynderickx, D.

    2003-09-01

    We present here a set of tools for space applications based on the Geant4 simulation toolkit, developed for radiation shielding analysis as part of the European Space Agency (ESA) activities in the Geant4 collaboration. The Sector Shielding Analysis Tool (SSAT) and the Materials and Geometry Association (MGA) utility will first be described. An overview of the main features of the MUlti-LAyered Shielding SImulation Software tool (MULASSIS) will follow. The tool is specifically aimed at shielding optimization and effects analysis. A Java interface allows the use of MULASSIS by the space community over the World Wide Web, integrated into the widely used SPENVIS package. The analysis of the particle transport output automatically provides radiation fluence, ionising and NIEL dose, and effects analysis. ESA is currently funding the porting of these tools to a low-cost parallel processor facility using GRID technology under the ESA SpaceGRID initiative. Other present and future Geant4 projects related to the study of space environment effects on spacecraft will be presented.
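
    For context, the NIEL dose reported by such fluence-analysis modules is conventionally the particle fluence spectrum folded with a mass NIEL coefficient (a generic definition, not MULASSIS-specific documentation):

      $$ D_{\mathrm{NIEL}} \;=\; \int \frac{d\Phi}{dE}(E)\; S_{\mathrm{NIEL}}(E)\; dE $$

    with $d\Phi/dE$ the differential fluence and $S_{\mathrm{NIEL}}$ the non-ionising energy-loss coefficient (e.g. in MeV cm² g⁻¹).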

  2. Comparison of dose distributions for Hounsfield number conversion methods in GEANT4

    NASA Astrophysics Data System (ADS)

    Kim, Hyung Dong; Kim, Byung Yong; Kim, Eng Chan; Yun, Sang Mo; Kang, Jeong Ku; Kim, Sung Kyu

    2014-06-01

    The conversion of patient computed tomography (CT) data to voxel phantoms is essential for CT-based Monte Carlo (MC) dose calculations, and incorrect assignments of materials and mass densities can lead to large errors in dose distributions. We investigated the effects of mass density and material assignments on GEANT4-based photon dose calculations. Three material conversion methods and four density conversion methods were compared for a lung tumor case. The dose calculations for 6-MV photon beams with a field size of 10 × 10 cm2 were performed using a 0.5 × 0.5 × 0.5 cm3 voxel with 1.2 × 10⁹ histories. The material conversion methods led to different material assignment percentages in converted voxel regions. The GEANT4 example and the modified Schneider material conversion methods showed large local dose differences relative to the BEAMnrc default method for lung and other tissues. For the mass density conversion methods, when only water was used as the material, our results showed only slight dose differences. Gaussian-like distributions, with mean values close to zero, were obtained when the reference method was compared with the other methods. The maximum dose difference of ~2% indicated that the dose distributions agreed relatively well. Material assignment methods probably have more significant impacts on dose distributions than mass density assignment methods. The study confirms that material assignment methods cause significant dose differences in GEANT4-based photon dose calculations.
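
    The sketch below illustrates, with hypothetical ramp values, the kind of Hounsfield-unit-to-material-and-density conversion being compared; the actual calibration curves of the GEANT4 example, the modified Schneider method and the BEAMnrc default differ in their bin boundaries and material sets:

      #include <string>
      #include <utility>

      // Hypothetical piecewise-linear HU-to-density ramp with a coarse
      // four-material segmentation (illustrative numbers only).
      std::pair<std::string, double> HUToMaterial(int hu)
      {
        std::string material;
        double density;                                   // g/cm3
        if (hu < -950)      { material = "Air";        density = 0.001; }
        else if (hu < -120) { material = "Lung";       density = 1.0 + hu * 1.0e-3; }
        else if (hu < 100)  { material = "SoftTissue"; density = 1.0 + hu * 1.0e-3; }
        else                { material = "Bone";       density = 1.0 + hu * 0.6e-3; }
        return {material, density};
      }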

  3. MaGe - a GEANT4-based Monte Carlo Application Framework for Low-background Germanium Experiments

    SciTech Connect

    Boswell, M.; Chan, Yuen-Dat; Detwiler, Jason A.; Finnerty, P.; Henning, R.; Gehman, Victor; Johnson, Robert A.; Jordan, David V.; Kazkaz, Kareem; Knapp, Markus; Kroninger, Kevin; Lenz, Daniel; Leviner, L.; Liu, Jing; Liu, Xiang; MacMullin, S.; Marino, Michael G.; Mokhtarani, A.; Pandola, Luciano; Schubert, Alexis G.; Schubert, J.; Tomei, Claudia; Volynets, Oleksandr

    2011-06-13

    We describe a physics simulation software framework, MaGe, that is based on the GEANT4 simulation toolkit. MaGe is used to simulate the response of ultra-low radioactive background radiation detectors to ionizing radiation, specifically the MAJORANA and GERDA neutrinoless double-beta decay experiments. MAJORANA and GERDA use high-purity germanium technology to search for the neutrinoless double-beta decay of the 76Ge isotope, and MaGe is jointly developed between these two collaborations. The MaGe framework contains simulated geometries of common objects, prototypes, test stands, and the actual experiments. It also implements customized event generators, GEANT4 physics lists, and output formats. All of these features are available as class libraries that are typically compiled into a single executable. The user selects the particular experimental setup implementation at run-time via macros. The combination of all these common classes into one framework reduces duplication of efforts, eases comparison between simulated data and experiment, and simplifies the addition of new detectors to be simulated. This paper focuses on the software framework, custom event generators, and physics lists.

  4. Integration of the low-energy particle track simulation code in Geant4

    NASA Astrophysics Data System (ADS)

    Arce, Pedro; Muñoz, Antonio; Moraleda, Montserrat; Gomez Ros, José María; Blanco, Fernando; Perez, José Manuel; García, Gustavo

    2015-08-01

    The Low-Energy Particle Track Simulation code (LEPTS) is a Monte Carlo code developed to simulate the damage caused by radiation at molecular level. The code is based on experimental data of scattering cross sections, both differential and integral, and energy loss data, complemented with theoretical calculations. It covers the interactions of electrons and positrons from energies of 10 keV down to 0.1 eV in different biologically relevant materials. In this article we briefly mention the main characteristics of this code and we present its integration within the Geant4 Monte Carlo toolkit.

  5. Application of Geant4 simulation for analysis of soil carbon inelastic neutron scattering measurements.

    PubMed

    Yakubova, Galina; Kavetskiy, Aleksandr; Prior, Stephen A; Torbert, H Allen

    2016-07-01

    Inelastic neutron scattering (INS) was applied to determine soil carbon content. Due to the non-uniform soil carbon depth distribution, the correlation of INS signals with a soil carbon content parameter is not obvious; however, a proportionality between INS signals and the average carbon weight percent in a ~10 cm layer, for any carbon depth profile, is demonstrated using Monte Carlo simulation (Geant4). Comparison of INS and dry combustion measurements confirms this conclusion. Thus, INS measurements give the value of this soil carbon parameter. PMID:27124122

  6. Geant4 studies of the CNAO facility system for hadrontherapy treatment of uveal melanomas

    NASA Astrophysics Data System (ADS)

    Rimoldi, A.; Piersimoni, P.; Pirola, M.; Riccardi, C.

    2014-06-01

    The Italian National Centre of Hadrontherapy for Cancer Treatment (CNAO - Centro Nazionale di Adroterapia Oncologica) in Pavia, Italy, started the treatment of selected cancers with the first patients in late 2011. In the coming months, CNAO plans to activate a new dedicated treatment line for the irradiation of uveal melanomas using the available active beam scanning. The beam characteristics and the experimental setup must be tuned in order to reach the precision required for such treatments. A collaboration between the CNAO foundation, the University of Pavia and INFN started in 2011 to study the feasibility of these specialised treatments by implementing a MC simulation of the transport beam line and comparing the obtained simulation results with measurements at CNAO. The goal is to optimise an eye-dedicated transport beam line and to find the best conditions for ocular melanoma irradiations. This paper describes the Geant4 toolkit simulation of the CNAO setup as well as a modelled human eye with a tumour inside. The Geant4 application could also be used to test possible treatment planning systems. Simulation results illustrate the possibility of adapting the CNAO standard transport beam line by optimising the position of the isocentre and adding some passive elements to better shape the beam for this dedicated study.

  7. A Compton camera application for the GAMOS GEANT4-based framework

    NASA Astrophysics Data System (ADS)

    Harkness, L. J.; Arce, P.; Judson, D. S.; Boston, A. J.; Boston, H. C.; Cresswell, J. R.; Dormand, J.; Jones, M.; Nolan, P. J.; Sampson, J. A.; Scraggs, D. P.; Sweeney, A.; Lazarus, I.; Simpson, J.

    2012-04-01

    Compton camera systems can be used to image sources of gamma radiation in a variety of applications such as nuclear medicine, homeland security and nuclear decommissioning. To locate gamma-ray sources, a Compton camera employs electronic collimation, utilising Compton kinematics to reconstruct the paths of gamma rays which interact within the detectors. The main benefit of this technique is the ability to accurately identify and locate sources of gamma radiation within a wide field of view, vastly improving the efficiency and specificity over existing devices. The potential advantages of this imaging technique, along with advances in detector technology, have brought about a rapidly expanding area of research into the optimisation of Compton camera systems, which relies on significant input from Monte-Carlo simulations. In this paper, the functionality of a Compton camera application that has been integrated into GAMOS, the GEANT4-based Architecture for Medicine-Oriented Simulations, is described. The application simplifies the use of GEANT4 for Monte-Carlo investigations by employing a script-based language and plug-in technology. To demonstrate the use of the Compton camera application, simulated data have been generated using the GAMOS application and compared with data acquired through experiment, as a preliminary validation, using a Compton camera configured with double-sided high-purity germanium strip detectors. Energy spectra and reconstructed images for the data sets are presented.
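
    For reference, the electronic collimation mentioned above rests on the Compton scattering relation cos(theta) = 1 - m_e*c^2*(1/E2 - 1/(E1 + E2)), where E1 is the energy deposited in the scattering interaction and E2 is the energy of the scattered photon absorbed in the second interaction. The following is a minimal stand-alone sketch of that kinematic step, not code from the GAMOS application itself.

      #include <cmath>
      #include <iostream>
      #include <optional>

      // Compton cone opening angle (radians) for a two-interaction event.
      // e1: energy deposited in the first (scattering) interaction [keV]
      // e2: energy of the scattered photon, fully absorbed in the second interaction [keV]
      std::optional<double> comptonConeAngle(double e1, double e2) {
          constexpr double mec2 = 511.0;  // electron rest energy [keV]
          const double cosTheta = 1.0 - mec2 * (1.0 / e2 - 1.0 / (e1 + e2));
          if (cosTheta < -1.0 || cosTheta > 1.0) return std::nullopt;  // kinematically forbidden
          return std::acos(cosTheta);
      }

      int main() {
          // Example: a 662 keV photon depositing 200 keV in the scatterer.
          if (const auto theta = comptonConeAngle(200.0, 462.0))
              std::cout << "cone angle = " << *theta * 180.0 / M_PI << " deg\n";
          return 0;
      }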

  8. Simulation of positron backscattering and implantation profiles using Geant4 code

    NASA Astrophysics Data System (ADS)

    Huang, Shi-Juan; Pan, Zi-Wen; Liu, Jian-Dang; Han, Rong-Dian; Ye, Bang-Jiao

    2015-10-01

    For the proper interpretation of the experimental data produced by the slow positron beam technique, the positron implantation properties are studied carefully using the latest Geant4 code. The simulated backscattering coefficients, implantation profiles, and median implantation depths for mono-energetic positrons with energies ranging from 1 keV to 50 keV normally incident on different crystals are reported. Compared with previous experimental results, our simulated backscattering coefficients are in reasonable agreement, and we think that the accuracy may be related to the structures of the host materials in the Geant4 code. Based on the reasonable simulated backscattering coefficients, the adjustable parameters of the implantation profiles, which depend on the materials and implantation energies, are obtained. Most importantly, we calculate the positron backscattering coefficients and median implantation depths in amorphous polymers for the first time, and our simulations are in fairly good agreement with previous experimental results. Project supported by the National Natural Science Foundation of China (Grant Nos. 11175171 and 11105139).

  9. Calculation of HPGe efficiency for environmental samples: comparison of EFFTRAN and GEANT4

    NASA Astrophysics Data System (ADS)

    Nikolic, Jelena; Vidmar, Tim; Jokovic, Dejan; Rajacic, Milica; Todorovic, Dragana

    2014-11-01

    Determination of the full energy peak efficiency is one of the most important tasks that has to be performed before gamma spectrometry of environmental samples. Many methods, including the measurement of specific reference materials, Monte Carlo simulations, efficiency transfer and semi-empirical calculations, have been developed to complete this task. Monte Carlo simulation based on the GEANT4 simulation package and the EFFTRAN efficiency transfer software were applied to the efficiency calibration of three detectors routinely used in the Environment and Radiation Protection Laboratory of the Institute for Nuclear Sciences Vinca for the measurement of environmental samples. Efficiencies were calculated for water, soil and aerosol samples. The aim of this paper is to perform efficiency calculations for HPGe detectors using both the GEANT4 simulation and the EFFTRAN efficiency transfer software and to compare the obtained results with experimental results. This comparison should show how well the two methods agree with the experimentally obtained efficiencies of our measurement system and in which part of the spectrum discrepancies appear. Detailed knowledge of the accuracy and precision of both methods should enable us to choose an appropriate method for each situation that arises in our and other laboratories on a daily basis.
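
    For context, the experimental full-energy peak efficiency being compared here is epsilon = N_net / (A * t_live * p_gamma), i.e. the net peak area divided by the number of photons emitted during the live time. The sketch below, with invented numbers, computes this quantity and the relative deviation from a simulated value; it is illustrative only and not part of either code.

      #include <iostream>

      // Experimental full-energy peak efficiency from a measured spectrum.
      // netCounts : background-subtracted counts in the full-energy peak
      // activityBq: source (or sample) activity at measurement time [Bq]
      // liveTimeS : spectrum live time [s]
      // pGamma    : emission probability of the gamma line
      double peakEfficiency(double netCounts, double activityBq, double liveTimeS, double pGamma) {
          return netCounts / (activityBq * liveTimeS * pGamma);
      }

      int main() {
          // Illustrative numbers only (hypothetical 661.7 keV line measurement).
          const double expEff = peakEfficiency(1.25e5, 350.0, 60000.0, 0.851);
          const double mcEff  = 7.1e-3;  // e.g. a GEANT4 or EFFTRAN value for the same geometry
          std::cout << "experimental efficiency: " << expEff << '\n'
                    << "relative deviation     : " << 100.0 * (mcEff - expEff) / expEff << " %\n";
          return 0;
      }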

  10. Calculation of extrapolation curves in the 4π(LS)β-γ coincidence technique with the Monte Carlo code Geant4.

    PubMed

    Bobin, C; Thiam, C; Bouchard, J

    2016-03-01

    At LNE-LNHB, a liquid scintillation (LS) detection setup designed for Triple to Double Coincidence Ratio (TDCR) measurements is also used in the β-channel of a 4π(LS)β-γ coincidence system. This LS counter based on 3 photomultipliers was first modeled using the Monte Carlo code Geant4 to enable the simulation of optical photons produced by scintillation and Cerenkov effects. This stochastic modeling was especially designed for the calculation of double and triple coincidences between photomultipliers in TDCR measurements. In the present paper, this TDCR-Geant4 model is extended to 4π(LS)β-γ coincidence counting to enable the simulation of the efficiency-extrapolation technique by the addition of a γ-channel. This simulation tool aims at the prediction of systematic biases in activity determination due to possible non-linearity of efficiency-extrapolation curves. First results are described in the case of the standardization of (59)Fe. The variation of the γ-efficiency in the β-channel due to the Cerenkov emission is investigated in the case of the activity measurements of (54)Mn. The problem of the non-linearity between β-efficiencies is featured in the case of the efficiency tracing technique for the activity measurements of (14)C using (60)Co as a tracer. PMID:26699674
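
    As background, 4π(LS)β-γ coincidence counting uses the ideal relations N_beta ≈ A*eps_beta, N_gamma ≈ A*eps_gamma and N_c ≈ A*eps_beta*eps_gamma, so N_beta*N_gamma/N_c tends to the activity A as eps_beta approaches 1; in the efficiency-extrapolation technique one plots N_beta*N_gamma/N_c against (1 - eps_beta)/eps_beta, with eps_beta estimated as N_c/N_gamma, and extrapolates to zero. The following is a simplified linear-extrapolation sketch with invented count rates (real curves may be non-linear, which is exactly the bias the simulation tool above is meant to predict).

      #include <cmath>
      #include <iostream>
      #include <vector>

      struct CoincidenceRates { double nBeta, nGamma, nCoinc; };  // count rates [1/s]

      // Linear efficiency extrapolation: fit y = a + b*x with
      //   y = nBeta*nGamma/nCoinc  and  x = (1 - epsBeta)/epsBeta,  epsBeta = nCoinc/nGamma,
      // and return the intercept a, the activity estimate in the linear approximation.
      double extrapolatedActivity(const std::vector<CoincidenceRates>& pts) {
          double sx = 0, sy = 0, sxx = 0, sxy = 0;
          for (const auto& p : pts) {
              const double epsBeta = p.nCoinc / p.nGamma;
              const double x = (1.0 - epsBeta) / epsBeta;
              const double y = p.nBeta * p.nGamma / p.nCoinc;
              sx += x; sy += y; sxx += x * x; sxy += x * y;
          }
          const double n = static_cast<double>(pts.size());
          const double slope = (n * sxy - sx * sy) / (n * sxx - sx * sx);
          return (sy - slope * sx) / n;  // intercept at x = 0
      }

      int main() {
          // Invented data points at different beta efficiencies (e.g. different discrimination levels).
          const std::vector<CoincidenceRates> pts = {
              {950.0, 420.0, 400.0}, {930.0, 420.0, 390.0}, {900.0, 420.0, 375.0}};
          std::cout << "extrapolated activity ~ " << extrapolatedActivity(pts) << " Bq\n";
          return 0;
      }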

  11. Comparative studies on shielding properties of some steel alloys using Geant4, MCNP, WinXCOM and experimental results

    NASA Astrophysics Data System (ADS)

    Singh, Vishwanath P.; Medhat, M. E.; Shirmardi, S. P.

    2015-01-01

    The mass attenuation coefficients, μ/ρ, and effective atomic numbers, Zeff, of some carbon steel and stainless steel alloys have been calculated using the Geant4 and MCNP simulation codes for different gamma-ray energies: 279.1 keV, 661.6 keV, 662 keV, 1115.5 keV, 1173 keV and 1332 keV. The simulated Zeff values from the Geant4 and MCNP codes have been compared with available experimental results and with theoretical WinXCOM values, and good agreement has been observed. The simulated μ/ρ and Zeff values indicate that either simulation code can be used to determine the gamma-ray interaction properties of the alloys at energies for which analogous experimental results are not available. Such studies are useful for various applications, including radiation dosimetry, medical applications and radiation shielding.
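
    The quantity tabulated above follows from the narrow-beam attenuation law I = I0*exp[-(μ/ρ)*ρ*t], so that μ/ρ = ln(I0/I)/(ρ*t). Below is a small stand-alone sketch with illustrative, non-experimental numbers.

      #include <cmath>
      #include <iostream>

      // Mass attenuation coefficient [cm^2/g] from a narrow-beam transmission measurement.
      // i0, i      : incident and transmitted intensities (e.g. full-energy peak counts)
      // densityGcc : material density [g/cm^3]
      // thicknessCm: sample thickness [cm]
      double massAttenuation(double i0, double i, double densityGcc, double thicknessCm) {
          return std::log(i0 / i) / (densityGcc * thicknessCm);
      }

      int main() {
          // Illustrative values for a steel-like sample at 662 keV (not experimental data).
          const double muRho = massAttenuation(1.0e6, 5.6e5, 7.85, 1.0);
          std::cout << "mu/rho = " << muRho << " cm^2/g\n";
          return 0;
      }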

  12. Efficient voxel navigation for proton therapy dose calculation in TOPAS and Geant4

    NASA Astrophysics Data System (ADS)

    Schümann, J.; Paganetti, H.; Shin, J.; Faddegon, B.; Perl, J.

    2012-06-01

    A key task within all Monte Carlo particle transport codes is ‘navigation’, the calculation to determine at each particle step what volume the particle may be leaving and what volume the particle may be entering. Navigation should be optimized to the specific geometry at hand. For patient dose calculation, this geometry generally involves voxelized computed tomography (CT) data. We investigated the efficiency of navigation algorithms on currently available voxel geometry parameterizations in the Monte Carlo simulation package Geant4: G4VPVParameterisation, G4VNestedParameterisation and G4PhantomParameterisation, the last with and without boundary skipping, a method where neighboring voxels with the same Hounsfield unit are combined into one larger voxel. A fourth parameterization approach (MGHParameterization), developed in-house before the latter two parameterizations became available in Geant4, was also included in this study. All simulations were performed using TOPAS, a tool for particle simulations layered on top of Geant4. Runtime comparisons were made on three distinct patient CT data sets: a head and neck, a liver and a prostate patient. We included an additional version of these three patients where all voxels, including the air voxels outside of the patient, were uniformly set to water in the runtime study. The G4VPVParameterisation offers two optimization options. One option has a 60-150 times slower simulation speed. The other is comparable in speed but requires 15-19 times more memory compared to the other parameterizations. We found the average CPU time used for the simulation relative to G4VNestedParameterisation to be 1.014 for G4PhantomParameterisation without boundary skipping and 1.015 for MGHParameterization. The average runtime ratio for G4PhantomParameterisation with and without boundary skipping for our heterogeneous data was equal to 0.97:1. The calculated dose distributions agreed with the reference distribution for all but the G4

  13. Efficient voxel navigation for proton therapy dose calculation in TOPAS and Geant4.

    PubMed

    Schümann, J; Paganetti, H; Shin, J; Faddegon, B; Perl, J

    2012-06-01

    A key task within all Monte Carlo particle transport codes is 'navigation', the calculation to determine at each particle step what volume the particle may be leaving and what volume the particle may be entering. Navigation should be optimized to the specific geometry at hand. For patient dose calculation, this geometry generally involves voxelized computed tomography (CT) data. We investigated the efficiency of navigation algorithms on currently available voxel geometry parameterizations in the Monte Carlo simulation package Geant4: G4VPVParameterisation, G4VNestedParameterisation and G4PhantomParameterisation, the last with and without boundary skipping, a method where neighboring voxels with the same Hounsfield unit are combined into one larger voxel. A fourth parameterization approach (MGHParameterization), developed in-house before the latter two parameterizations became available in Geant4, was also included in this study. All simulations were performed using TOPAS, a tool for particle simulations layered on top of Geant4. Runtime comparisons were made on three distinct patient CT data sets: a head and neck, a liver and a prostate patient. We included an additional version of these three patients where all voxels, including the air voxels outside of the patient, were uniformly set to water in the runtime study. The G4VPVParameterisation offers two optimization options. One option has a 60-150 times slower simulation speed. The other is comparable in speed but requires 15-19 times more memory compared to the other parameterizations. We found the average CPU time used for the simulation relative to G4VNestedParameterisation to be 1.014 for G4PhantomParameterisation without boundary skipping and 1.015 for MGHParameterization. The average runtime ratio for G4PhantomParameterisation with and without boundary skipping for our heterogeneous data was equal to 0.97:1. The calculated dose distributions agreed with the reference distribution for all but the G4Phantom
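
    The boundary-skipping idea referred to above can be illustrated independently of the Geant4 parameterisation classes: consecutive voxels along a row that share the same material (Hounsfield unit) are merged into one longer slab, so the navigator crosses fewer boundaries. The following schematic run-length merge is not the actual G4PhantomParameterisation code.

      #include <iostream>
      #include <vector>

      struct Slab { int material; int nVoxels; };  // merged run of identical voxels along one row

      // Merge consecutive voxels with the same material index into slabs (run-length encoding).
      std::vector<Slab> mergeRow(const std::vector<int>& materialIndices) {
          std::vector<Slab> slabs;
          for (int m : materialIndices) {
              if (!slabs.empty() && slabs.back().material == m)
                  ++slabs.back().nVoxels;        // extend the current slab: no new boundary
              else
                  slabs.push_back({m, 1});       // material changes: start a new slab
          }
          return slabs;
      }

      int main() {
          // One row of a CT-derived material map: long runs of air (0) and water-like tissue (1).
          const std::vector<int> row = {0, 0, 0, 1, 1, 1, 1, 2, 1, 1, 0, 0};
          for (const auto& s : mergeRow(row))
              std::cout << "material " << s.material << " x " << s.nVoxels << " voxels\n";
          return 0;
      }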

  14. Molecular scale track structure simulations in liquid water using the Geant4-DNA Monte-Carlo processes.

    PubMed

    Francis, Z; Incerti, S; Capra, R; Mascialino, B; Montarou, G; Stepan, V; Villagrasa, C

    2011-01-01

    This paper presents a study of energy deposits induced by ionising particles in liquid water at the molecular scale. Particle track structures were generated using the Geant4-DNA processes of the Geant4 Monte-Carlo toolkit. These processes cover electrons (0.025 eV-1 MeV), protons (1 keV-100 MeV), hydrogen atoms (1 keV-100 MeV) and alpha particles (10 keV-40 MeV) including their different charge states. Electron ranges and lineal energies for protons were calculated in nanometric and micrometric volumes. PMID:20810287
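
    For orientation, the lineal energy quoted above is y = eps / l_bar, the energy imparted by a single event divided by the mean chord length of the site; for a convex site the Cauchy relation gives l_bar = 4V/S, i.e. l_bar = 2d/3 for a sphere of diameter d. The following is a minimal sketch for spherical sites with illustrative energy deposits, not Geant4-DNA output.

      #include <iostream>
      #include <vector>

      // Lineal energy y = eps / lbar for a spherical site of diameter d,
      // using the Cauchy mean chord length lbar = 2d/3 for a convex body.
      double linealEnergy(double energyImparted_keV, double diameter_um) {
          const double meanChord_um = 2.0 * diameter_um / 3.0;
          return energyImparted_keV / meanChord_um;  // keV/um
      }

      int main() {
          // Energy imparted (keV) by single events in a 1 um diameter water sphere (illustrative values).
          const std::vector<double> eventEnergies = {0.3, 1.2, 5.0};
          for (double eps : eventEnergies)
              std::cout << "eps = " << eps << " keV  ->  y = "
                        << linealEnergy(eps, 1.0) << " keV/um\n";
          return 0;
      }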

  15. Distributions of positron-emitting nuclei in proton and carbon-ion therapy studied with GEANT4.

    PubMed

    Pshenichnov, Igor; Mishustin, Igor; Greiner, Walter

    2006-12-01

    Depth distributions of positron-emitting nuclei in PMMA phantoms are calculated within a Monte Carlo model for heavy-ion therapy (MCHIT) based on the GEANT4 toolkit (version 8.0). The calculated total production rates of (11)C, (10)C and (15)O nuclei are compared with experimental data and with corresponding results of the FLUKA and POSGEN codes. The distributions of e(+) annihilation points are obtained by simulating radioactive decay of unstable nuclei and transporting positrons in the surrounding medium. A finite spatial resolution of the positron emission tomography (PET) is taken into account in a simplified way. Depth distributions of beta(+)-activity as seen by a PET scanner are calculated and compared to available data for PMMA phantoms. The obtained beta(+)-activity profiles are in good agreement with PET data for proton and (12)C beams at energies suitable for particle therapy. The MCHIT capability to predict the beta(+)-activity and dose distributions in tissue-like materials of different chemical composition is demonstrated. PMID:17110773

  16. Geant4 software application for the simulation of cosmic ray showers in the Earth’s atmosphere

    NASA Astrophysics Data System (ADS)

    Paschalis, P.; Mavromichalaki, H.; Dorman, L. I.; Plainaki, C.; Tsirigkas, D.

    2014-11-01

    Galactic cosmic rays and solar energetic particles with sufficient rigidity to penetrate the geomagnetic field, enter the Earth’s atmosphere and interact with the electrons and the nuclei of its atoms and molecules. From the interactions with the nuclei, cascades of secondary particles are produced that can be detected by ground-based detectors such as neutron monitors and muon counters. The theoretical study of the details of the atmospheric showers is of great importance, since many applications, such as the dosimetry for the aviation crews, are based on it. In this work, a new application which can be used in order to study the showers of the secondary particles in the atmosphere is presented. This application is based on the Monte Carlo simulation techniques, performed by using the well-known Geant4 toolkit. We present a thorough analysis of the simulation’s critical points, including a description of the procedure applied in order to model the atmosphere and the geomagnetic field. Representative results obtained by the application are presented and future plans for the project are discussed.

  17. Geant4 calculations for space radiation shielding material Al2O3

    NASA Astrophysics Data System (ADS)

    Capali, Veli; Acar Yesil, Tolga; Kaya, Gokhan; Kaplan, Abdullah; Yavuz, Mustafa; Tilki, Tahir

    2015-07-01

    Aluminium oxide, Al2O3, is the most widely used material in engineering applications. It is a significant aluminium compound because of its hardness, and it serves as a refractory material owing to its high melting point. This material has several engineering applications in diverse fields such as ballistic armour systems, wear components, electrical and electronic substrates, automotive parts, components for the electrical industry and aero-engines. It is also used as a dosimeter for radiation protection and therapy applications owing to its optically stimulated luminescence properties. In this study, stopping powers and penetration distances have been calculated for alpha, proton, electron and gamma particles in the space radiation shielding material Al2O3 for incident energies of 1 keV - 1 GeV using the GEANT4 calculation code.

  18. Comparison of MCNPX and Geant4 proton energy deposition predictions for clinical use

    PubMed Central

    Titt, U.; Bednarz, B.; Paganetti, H.

    2012-01-01

    Several different Monte Carlo codes are currently being used at proton therapy centers to improve upon dose predictions over standard methods using analytical or semi-empirical dose algorithms. There is a need to better ascertain the differences between proton dose predictions from different available Monte Carlo codes. In this investigation Geant4 and MCNPX, the two most-utilized Monte Carlo codes for proton therapy applications, were used to predict energy deposition distributions in a variety of geometries, comprising simple water phantoms, water phantoms with complex inserts and a voxelized geometry based on clinical CT data. A gamma analysis was used to evaluate the differences between the predictions of the codes. The results show that in all cases the agreement was better than clinical acceptance criteria. PMID:22996039
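
    The gamma analysis mentioned above combines a dose-difference and a distance-to-agreement (DTA) criterion: each evaluated point is assigned the minimum over reference points of sqrt[(Δr/DTA)^2 + (ΔD/ΔD_tol)^2], and passes if that value does not exceed 1. Below is a simplified one-dimensional sketch with invented dose samples, not the analysis code used in the study.

      #include <algorithm>
      #include <cmath>
      #include <iostream>
      #include <vector>

      struct DosePoint { double x_mm; double dose; };

      // 1-D gamma index of one evaluated point against a reference profile
      // (Low et al. style criterion, e.g. 3%/3 mm expressed in absolute dose units).
      double gammaIndex(const DosePoint& eval, const std::vector<DosePoint>& reference,
                        double dta_mm, double doseTol) {
          double best = 1e30;
          for (const auto& ref : reference) {
              const double dr = (eval.x_mm - ref.x_mm) / dta_mm;
              const double dd = (eval.dose - ref.dose) / doseTol;
              best = std::min(best, std::sqrt(dr * dr + dd * dd));
          }
          return best;  // <= 1 means the point passes
      }

      int main() {
          // Illustrative depth-dose samples (dose in Gy): reference vs. evaluated calculation.
          const std::vector<DosePoint> ref  = {{0, 1.00}, {5, 1.02}, {10, 1.05}, {15, 0.40}};
          const std::vector<DosePoint> eval = {{0, 1.01}, {5, 1.05}, {10, 1.02}, {15, 0.43}};
          for (const auto& p : eval)
              std::cout << "x = " << p.x_mm << " mm, gamma = "
                        << gammaIndex(p, ref, 3.0, 0.03) << '\n';
          return 0;
      }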

  19. Signal pulse emulation for scintillation detectors using Geant4 Monte Carlo with light tracking simulation

    NASA Astrophysics Data System (ADS)

    Ogawara, R.; Ishikawa, M.

    2016-07-01

    The anode pulse of a photomultiplier tube (PMT) coupled to a scintillator is used for pulse shape discrimination (PSD) analysis. We have developed a novel emulation technique for the PMT anode pulse based on optical photon transport and a PMT response function. The photon transport was calculated using the Geant4 Monte Carlo code, and the response function was obtained with a BC408 organic scintillator. The percentage RMS differences between measured and simulated pulses, obtained with suitable scintillation properties for GSO:Ce (0.4, 1.0 and 1.5 mol%), LaBr3:Ce and BGO scintillators, were 2.41%, 2.58%, 2.16%, 2.01% and 3.32%, respectively. The proposed technique demonstrates high reproducibility of the measured pulse and can be applied to simulation studies of various radiation measurements.

  20. GEANT4 Application for the Simulation of the Head of a Siemens Primus Linac

    SciTech Connect

    Cortes-Giraldo, M. A.; Quesada, J. M.; Gallardo, M. I.

    2010-04-26

    The Monte Carlo simulation of the head of a Siemens Primus Linac used at Virgen Macarena Hospital (Sevilla, Spain) has been performed using the code GEANT4, version 9.2. In this work, the main features of the application built by our group are presented. They are mainly focused on optimizing the performance of the simulation. The geometry, including the water phantom, has been entirely wrapped by a shielding volume which discards all particles escaping far away through its walls. With this, a factor of four in the time spent by the simulation can be saved. An interface to read and write phase-space files in IAEA format has also been developed to save CPU time in our simulations. Finally, some calculations of the dose absorbed in the water phantom have been performed and compared with the results given by EGSnrc and with experimental data obtained for the calibration of the machine.

  1. Geant4 simulation of the n_TOF-EAR2 neutron beam: Characteristics and prospects

    NASA Astrophysics Data System (ADS)

    Lerendegui-Marco, J.; Lo Meo, S.; Guerrero, C.; Cortés-Giraldo, M. A.; Massimi, C.; Quesada, J. M.; Barbagallo, M.; Colonna, N.; Mancusi, D.; Mingrone, F.; Sabaté-Gilarte, M.; Vannini, G.; Vlachoudis, V.

    2016-04-01

    The characteristics of the neutron beam at the new n_TOF-EAR2 facility have been simulated with the Geant4 code with the aim of providing useful data for both the analysis and planning of the upcoming measurements. The spatial and energy distributions of the neutrons, the resolution function and the in-beam γ-ray background have been studied in detail and their implications in the forthcoming experiments have been discussed. The results confirm that, with this new short (18.5 m flight path) beam line, reaching an instantaneous neutron flux beyond 10^5 n/μs/pulse in the keV region, n_TOF is one of the few facilities where challenging measurements can be performed, involving in particular short-lived radioisotopes.
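
    For context, in a time-of-flight facility the neutron kinetic energy follows from the flight path L and the arrival time t through E = m_n*c^2*(γ - 1) with γ = 1/sqrt(1 - (L/(c*t))^2). Below is a small sketch using the 18.5 m flight path and illustrative times, not n_TOF calibration data.

      #include <cmath>
      #include <iostream>

      // Relativistic neutron kinetic energy [eV] from flight path [m] and time of flight [s].
      double neutronEnergyEV(double flightPath_m, double tof_s) {
          constexpr double c = 299792458.0;        // speed of light [m/s]
          constexpr double mnc2_eV = 939.56542e6;  // neutron rest energy [eV]
          const double beta = flightPath_m / (c * tof_s);
          const double gamma = 1.0 / std::sqrt(1.0 - beta * beta);
          return mnc2_eV * (gamma - 1.0);
      }

      int main() {
          // Illustrative arrival times for an 18.5 m flight path.
          for (double tof_us : {2.0, 20.0, 2000.0}) {
              const double e = neutronEnergyEV(18.5, tof_us * 1e-6);
              std::cout << "t = " << tof_us << " us  ->  E = " << e << " eV\n";
          }
          return 0;
      }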

  2. Geant4 Predictions of Energy Spectra in Typical Space Radiation Environment

    NASA Technical Reports Server (NTRS)

    Sabra, M. S.; Barghouty, A. F.

    2014-01-01

    Accurate knowledge of energy spectra inside spacecraft is important for protecting astronauts as well as sensitive electronics from the harmful effects of space radiation. Such knowledge allows one to confidently map the radiation environment inside the vehicle. The purpose of this talk is to present preliminary calculations for energy spectra inside a spherical shell shielding and behind a slab in typical space radiation environment using the 3D Monte-Carlo transport code Geant4. We have simulated proton and iron isotropic sources and beams impinging on Aluminum and Gallium arsenide (GaAs) targets at energies of 0.2, 0.6, 1, and 10 GeV/u. If time permits, other radiation sources and beams (α, C, O) and targets (C, Si, Ge, water) will be presented. The results are compared to ground-based measurements where available.

  3. Geant4 predictions of energy spectra in typical space radiation environment

    NASA Astrophysics Data System (ADS)

    Sabra, M. S.; Barghouty, A. F.

    2014-03-01

    Accurate knowledge of energy spectra inside spacecraft is important for protecting astronauts as well as sensitive electronics from the harmful effects of space radiation. Such knowledge allows one to confidently map the radiation environment inside the vehicle. The purpose of this talk is to present preliminary calculations for energy spectra inside a spherical shell shielding and behind a slab in typical space radiation environment using the 3D Monte-Carlo transport code Geant4. We have simulated proton and iron isotropic sources and beams impinging on Aluminum and Gallium arsenide (GaAs) targets at energies of 0.2, 0.6, 1, and 10 GeV/u. If time permits, other radiation sources and beams (α, C, O) and targets (C, Si, Ge, water) will be presented. The results are compared to ground-based measurements where available.

  4. Performance of the Nab segmented silicon detectors: GEANT4 and data

    NASA Astrophysics Data System (ADS)

    Frlez, Emil; Nab Collaboration

    2015-10-01

    The Nab Collaboration has proposed to measure neutron β-decay correlation parameters a and b at the Oak Ridge National Laboratory using a custom superconducting spectrometer and novel Si detectors. Two large area 2-mm thick silicon detectors, each segmented into 127 hexagonal pixels, will be used to detect the proton and electron from cold neutron decay. We present GEANT4 Monte Carlo simulations of the Si detector energy and timing responses to electrons below 1 MeV and to 30 keV protons with realistic simulated amplified anode waveforms. Both the data acquired with a prototype detector at Los Alamos National Laboratory with radioactive sources and the synthetic waveforms are analyzed by the same code. Energy and timing responses of the Si detectors are discussed, with the MC waveforms calibrated to the decay constants, baselines, noise, gains, and timing offsets extracted from measured data, pixel by pixel. Work supported by NSF Grants PHY-1126683, 1205833, 1307328, 1506320, and others.

  5. Geant4 Simulations of SuperCDMS iZip Detector Charge Carrier Propagation

    NASA Astrophysics Data System (ADS)

    Agnese, Robert; Brandt, Daniel; Redl, Peter; Asai, Makoto; Faiez, Dana; Kelsey, Mike; Bagli, Enrico; Anderson, Adam; Schlupf, Chandler

    2014-03-01

    The SuperCDMS experiment uses germanium crystal detectors instrumented with ionization and phonon readout circuits to search for dark matter. In order to simulate the response of the detectors to particle interactions the SuperCDMS Detector Monte Carlo (DMC) group has been implementing the processes governing electrons and phonons at low temperatures in Geant4. The charge portion of the DMC simulates oblique propagation of the electrons through the L-valleys, propagation of holes through the Γ-valleys, inter-valley scattering, and emission of Neganov-Luke phonons in a complex applied electric field. The field is calculated by applying a directed walk search on a tetrahedral mesh of known potentials and then interpolating the value. This talk will present an overview of the DMC status and a comparison of the charge portion of the DMC to experimental data of electron-hole pair propagation in germanium.
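
    The interpolation step mentioned above can be illustrated with standard barycentric coordinates: inside a tetrahedron with vertex potentials φ_i, the potential at a point is Σ λ_i φ_i, where each λ_i is the ratio of the volume of the tetrahedron formed by replacing vertex i with the point to the total volume. The following generic sketch is not the SuperCDMS DMC implementation.

      #include <array>
      #include <iostream>

      using Vec3 = std::array<double, 3>;

      // Signed volume of the tetrahedron (a, b, c, d) via the scalar triple product / 6.
      double signedVolume(const Vec3& a, const Vec3& b, const Vec3& c, const Vec3& d) {
          const Vec3 u{b[0]-a[0], b[1]-a[1], b[2]-a[2]};
          const Vec3 v{c[0]-a[0], c[1]-a[1], c[2]-a[2]};
          const Vec3 w{d[0]-a[0], d[1]-a[1], d[2]-a[2]};
          return (u[0]*(v[1]*w[2]-v[2]*w[1]) - u[1]*(v[0]*w[2]-v[2]*w[0]) + u[2]*(v[0]*w[1]-v[1]*w[0])) / 6.0;
      }

      // Barycentric interpolation of a potential known at the four vertices of a tetrahedron.
      double interpolatePotential(const std::array<Vec3, 4>& vtx, const std::array<double, 4>& phi, const Vec3& p) {
          const double vTot = signedVolume(vtx[0], vtx[1], vtx[2], vtx[3]);
          // lambda_i: volume of the tetrahedron with vertex i replaced by p, divided by the total volume.
          const double l0 = signedVolume(p, vtx[1], vtx[2], vtx[3]) / vTot;
          const double l1 = signedVolume(vtx[0], p, vtx[2], vtx[3]) / vTot;
          const double l2 = signedVolume(vtx[0], vtx[1], p, vtx[3]) / vTot;
          const double l3 = signedVolume(vtx[0], vtx[1], vtx[2], p) / vTot;
          return l0*phi[0] + l1*phi[1] + l2*phi[2] + l3*phi[3];
      }

      int main() {
          // Unit tetrahedron with illustrative vertex potentials [V].
          const std::array<Vec3, 4> vtx{{{0,0,0}, {1,0,0}, {0,1,0}, {0,0,1}}};
          const std::array<double, 4> phi{0.0, 10.0, 20.0, 30.0};
          const Vec3 p{0.25, 0.25, 0.25};
          std::cout << "potential at p: " << interpolatePotential(vtx, phi, p) << " V\n";
          return 0;
      }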

  6. Evaluation using GEANT4 of the transit dose in the Tunisian gamma irradiator for insect sterilization.

    PubMed

    Mannai, K; Askri, B; Loussaief, A; Trabelsi, A

    2007-06-01

    A simulation study of the Tunisian Gamma Irradiation Facility for sterile insect release programs has been realized using the GEANT4 Monte Carlo code of CERN. The dose was calculated and measured for high and low dose values inside the irradiation cell. The calculated high dose was in good agreement with measurements. However, a discrepancy between calculated and measured values occurs at dose levels commonly used for sterilization of insects. We argue that this discrepancy is due to the transit dose absorbed during the displacement of targets from their initial position towards their irradiation position and the displacement of radiation source pencils from storage towards their irradiation position. The discrepancy is corrected by taking into account the transit dose. PMID:17395474

  7. GEANT4 calibration of gamma spectrometry efficiency for measurements of airborne radioactivity on filter paper.

    PubMed

    Alrefae, Tareq

    2014-11-01

    A simple method of efficiency calibration for gamma spectrometry was performed. This method, which focused on measuring airborne radioactivity collected on filter paper, was based on Monte Carlo simulations using the toolkit GEANT4. Experimentally, the efficiency values of an HPGe detector were calculated for a multi-gamma disk source. These efficiency values were compared to their counterparts produced by a computer code that simulated experimental conditions. Such comparison revealed biases of 24, 10, 1, 3, 7, and 3% for the radionuclides (photon energies in keV) of Ce (166), Sn (392), Cs (662), Co (1,173), Co (1,333), and Y (1,836), respectively. The output of the simulation code was in acceptable agreement with the experimental findings, thus validating the proposed method. PMID:25271933

  8. Efficiency transfer using the GEANT4 code of CERN for HPGe gamma spectrometry.

    PubMed

    Chagren, S; Ben Tekaya, M; Reguigui, N; Gharbi, F

    2016-01-01

    In this work we apply the GEANT4 code of CERN to calculate the peak efficiency in High Purity Germanium (HPGe) gamma spectrometry using three different procedures. The first is a direct calculation. The second corresponds to the usual case of efficiency transfer between two different configurations at constant emission energy assuming a reference point detection configuration, and the third, a new procedure, consists of transferring the peak efficiency between two detection configurations emitting the gamma ray at different energies assuming a "virtual" reference point detection configuration. No pre-optimization of the detector geometrical characteristics was performed before the transfer, in order to test the ability of the efficiency transfer to reduce the effect of imperfect knowledge of their real values on the quality of the transferred efficiency. The calculated and measured efficiencies were found to be in good agreement for the two investigated methods of efficiency transfer. The obtained agreement proves that the Monte Carlo method, and especially the GEANT4 code, constitutes an efficient tool for obtaining accurate detection efficiency values. The second investigated efficiency transfer procedure is useful for calibrating the HPGe gamma detector at any emission energy for a voluminous source, using a point source detection efficiency at a different energy as the reference efficiency. The calculations performed in this work were applied to the measurement exercise of the EUROMET428 project, in which the full energy peak efficiencies in the energy range 60-2000 keV were evaluated for a typical coaxial p-type HPGe detector and several types of source configuration: point sources located at various distances from the detector and a cylindrical box containing three matrices. PMID:26623928

  9. The effects of mapping CT images to Monte Carlo materials on GEANT4 proton simulation accuracy

    SciTech Connect

    Barnes, Samuel; McAuley, Grant; Slater, James; Wroe, Andrew

    2013-04-15

    Purpose: Monte Carlo simulations of radiation therapy require conversion from Hounsfield units (HU) in CT images to an exact tissue composition and density. The number of discrete densities (or density bins) used in this mapping affects the simulation accuracy, execution time, and memory usage in GEANT4 and other Monte Carlo code. The relationship between the number of density bins and CT noise was examined in general for all simulations that use HU conversion to density. Additionally, the effect of this on simulation accuracy was examined for proton radiation. Methods: Relative uncertainty from CT noise was compared with uncertainty from density binning to determine an upper limit on the number of density bins required in the presence of CT noise. Error propagation analysis was also performed on continuously slowing down approximation range calculations to determine the proton range uncertainty caused by density binning. These results were verified with Monte Carlo simulations. Results: In the presence of even modest CT noise (5 HU or 0.5%) 450 density bins were found to only cause a 5% increase in the density uncertainty (i.e., 95% of density uncertainty from CT noise, 5% from binning). Larger numbers of density bins are not required as CT noise will prevent increased density accuracy; this applies across all types of Monte Carlo simulations. Examining uncertainty in proton range, only 127 density bins are required for a proton range error of <0.1 mm in most tissue and <0.5 mm in low density tissue (e.g., lung). Conclusions: By considering CT noise and actual range uncertainty, the number of required density bins can be restricted to a very modest 127 depending on the application. Reducing the number of density bins provides large memory and execution time savings in GEANT4 and other Monte Carlo packages.
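
    The conversion discussed above is typically a piecewise-linear map from HU to mass density that is then quantised into a fixed number of density bins, each bin becoming one simulation material. The sketch below uses a hypothetical two-segment calibration (real calibration curves are scanner-specific) and a 127-bin quantisation.

      #include <algorithm>
      #include <iostream>

      // Hypothetical piecewise-linear HU -> density calibration (g/cm^3); real curves are scanner-specific.
      double huToDensity(double hu) {
          if (hu <= 0.0) return 1.0 + hu * (1.0 - 0.001) / 1000.0;  // air (-1000 HU) to water (0 HU)
          return 1.0 + hu * 0.0006;                                 // soft tissue towards bone (rough slope)
      }

      // Quantise a density into one of nBins equally spaced bins over [dMin, dMax].
      int densityBin(double density, double dMin, double dMax, int nBins) {
          const double frac = (density - dMin) / (dMax - dMin);
          return std::clamp(static_cast<int>(frac * nBins), 0, nBins - 1);
      }

      int main() {
          const int nBins = 127;  // the modest bin count suggested by the study above
          const double dMin = 0.001, dMax = 2.0;
          for (double hu : {-1000.0, -700.0, 0.0, 60.0, 1200.0}) {
              const double rho = huToDensity(hu);
              std::cout << "HU " << hu << " -> rho " << rho << " g/cm^3, bin "
                        << densityBin(rho, dMin, dMax, nBins) << '\n';
          }
          return 0;
      }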

  10. A Geant4 simulation of the depth dose percentage in brain tumors treatments using protons and carbon ions

    NASA Astrophysics Data System (ADS)

    José A. Diaz, M.; Torres, D. A.

    2016-07-01

    The deposited energy and dose distribution of proton and carbon beams in a human head are simulated using the free tool package Geant4 and the data analysis package ROOT-C++. The present work shows a methodology for understanding the microscopic processes occurring in a hadron-therapy session using advanced simulation tools.

  11. Dosimetry characterization of 32P intravascular brachytherapy source wires using Monte Carlo codes PENELOPE and GEANT4.

    PubMed

    Torres, Javier; Buades, Manuel J; Almansa, Julio F; Guerrero, Rafael; Lallena, Antonio M

    2004-02-01

    Monte Carlo calculations using the codes PENELOPE and GEANT4 have been performed to characterize the dosimetric parameters of the new 20 mm long catheter-based 32P beta source manufactured by the Guidant Corporation. The dose distribution along the transverse axis and the two-dimensional dose rate table have been calculated. Also, the dose rate at the reference point, the radial dose function, and the anisotropy function were evaluated according to the adapted TG-60 formalism for cylindrical sources. PENELOPE and GEANT4 codes were first verified against previous results corresponding to the old 27 mm Guidant 32P beta source. The dose rate at the reference point for the unsheathed 27 mm source in water was calculated to be 0.215 +/- 0.001 cGy s(-1) mCi(-1), for PENELOPE, and 0.2312 +/- 0.0008 cGy s(-1) mCi(-1), for GEANT4. For the unsheathed 20 mm source, these values were 0.2908 +/- 0.0009 cGy s(-1) mCi(-1) and 0.311 +/- 0.001 cGy s(-1) mCi(-1), respectively. Also, a comparison with the limited data available on this new source is shown. We found non-negligible differences between the results obtained with PENELOPE and GEANT4. PMID:15000615

  12. Feasibility of using Geant4 Monte Carlo simulation for IMRT dose calculations for the Novalis Tx with a HD-120 multi-leaf collimator

    NASA Astrophysics Data System (ADS)

    Jung, Hyunuk; Shin, Jungsuk; Chung, Kwangzoo; Han, Youngyih; Kim, Jinsung; Choi, Doo Ho

    2015-05-01

    The aim of this study was to develop an independent dose verification system by using a Monte Carlo (MC) calculation method for intensity modulated radiation therapy (IMRT) conducted by using a Varian Novalis Tx (Varian Medical Systems, Palo Alto, CA, USA) equipped with a high-definition multi-leaf collimator (HD-120 MLC). The Geant4 framework was used to implement a dose calculation system that accurately predicted the delivered dose. For this purpose, the Novalis Tx Linac head was modeled according to the specifications acquired from the manufacturer. Subsequently, MC simulations were performed by varying the mean energy, energy spread, and electron spot radius to determine optimum values for irradiation with 6-MV X-ray beams by using the Novalis Tx system. Computed percentage depth dose curves (PDDs) and lateral profiles were compared to the measurements obtained by using an ionization chamber (CC13). To validate the IMRT simulation by using the MC model we developed, we calculated a simple IMRT field and compared the result with the EBT3 film measurements in a water-equivalent solid phantom. Clinical cases, such as prostate cancer treatment plans, were then selected, and MC simulations were performed. The accuracy of the simulation was assessed against the EBT3 film measurements by using a gamma-index criterion. The optimal MC model parameters to specify the beam characteristics were a 6.8-MeV mean energy, a 0.5-MeV energy spread, and a 3-mm electron radius. The accuracy of these parameters was determined by comparison of MC simulations with measurements. The PDDs and the lateral profiles of the MC simulation deviated from the measurements by 1% and 2%, respectively, on average. The computed simple MLC fields agreed with the EBT3 measurements with a 95% passing rate with the 3%/3-mm gamma-index criterion. Additionally, in applying our model to clinical IMRT plans, we found that the MC calculations and the EBT3 measurements agreed well with a passing rate of greater

  13. Sensitivity analysis for liver iron measurement through neutron stimulated emission computed tomography: a Monte Carlo study in GEANT4

    NASA Astrophysics Data System (ADS)

    Agasthya, G. A.; Harrawood, B. C.; Shah, J. P.; Kapadia, A. J.

    2012-01-01

    Neutron stimulated emission computed tomography (NSECT) is being developed as a non-invasive imaging modality to detect and quantify iron overload in the human liver. NSECT uses gamma photons emitted by the inelastic interaction between monochromatic fast neutrons and iron nuclei in the body to detect and quantify the disease. Previous simulated and physical experiments with phantoms have shown that NSECT has the potential to accurately diagnose iron overload with reasonable levels of radiation dose. In this work, we describe the results of a simulation study conducted to determine the sensitivity of the NSECT system for hepatic iron quantification in patients of different sizes. A GEANT4 simulation of the NSECT system was developed with a human liver and two torso sizes corresponding to small and large patients. The iron concentration in the liver ranged between 0.5 and 20 mg g-1 (all iron concentrations with units mg g-1 refer to wet weight concentrations), corresponding to clinically reported iron levels in iron-overloaded patients. High-purity germanium gamma detectors were simulated to detect the emitted gamma spectra, which were background corrected using suitable water phantoms and analyzed to determine the minimum detectable level (MDL) of iron and the sensitivity of the NSECT system. These analyses indicate that for a small patient (torso major axis = 30 cm) the MDL is 0.5 mg g-1 and the sensitivity is ~13 ± 2 Fe counts/mg/mSv, and for a large patient (torso major axis = 40 cm) the values are 1 mg g-1 and ~5 ± 1 Fe counts/mg/mSv, respectively. The results demonstrate that the MDL for both patient sizes lies within the clinically significant range for human iron overload.

  14. Benchmarking and validation of a Geant4-SHADOW Monte Carlo simulation for dose calculations in microbeam radiation therapy.

    PubMed

    Cornelius, Iwan; Guatelli, Susanna; Fournier, Pauline; Crosbie, Jeffrey C; Sanchez Del Rio, Manuel; Bräuer-Krisch, Elke; Rosenfeld, Anatoly; Lerch, Michael

    2014-05-01

    Microbeam radiation therapy (MRT) is a synchrotron-based radiotherapy modality that uses high-intensity beams of spatially fractionated radiation to treat tumours. The rapid evolution of MRT towards clinical trials demands accurate treatment planning systems (TPS), as well as independent tools for the verification of TPS calculated dose distributions in order to ensure patient safety and treatment efficacy. Monte Carlo computer simulation represents the most accurate method of dose calculation in patient geometries and is best suited for the purpose of TPS verification. A Monte Carlo model of the ID17 biomedical beamline at the European Synchrotron Radiation Facility has been developed, including recent modifications, using the Geant4 Monte Carlo toolkit interfaced with the SHADOW X-ray optics and ray-tracing libraries. The code was benchmarked by simulating dose profiles in water-equivalent phantoms subject to irradiation by broad-beam (without spatial fractionation) and microbeam (with spatial fractionation) fields, and comparing against those calculated with a previous model of the beamline developed using the PENELOPE code. Validation against additional experimental dose profiles in water-equivalent phantoms subject to broad-beam irradiation was also performed. Good agreement between codes was observed, with the exception of out-of-field doses and toward the field edge for larger field sizes. Microbeam results showed good agreement between both codes and experimental results within uncertainties. Results of the experimental validation showed agreement for different beamline configurations. The asymmetry in the out-of-field dose profiles due to polarization effects was also investigated, yielding important information for the treatment planning process in MRT. This work represents an important step in the development of a Monte Carlo-based independent verification tool for treatment planning in MRT. PMID:24763641

  15. Refined lateral energy correction functions for the KASCADE-Grande experiment based on Geant4 simulations

    NASA Astrophysics Data System (ADS)

    Gherghel-Lascu, A.; Apel, W. D.; Arteaga-Velázquez, J. C.; Bekk, K.; Bertaina, M.; Blümer, J.; Bozdog, H.; Brancus, I. M.; Cantoni, E.; Chiavassa, A.; Cossavella, F.; Daumiller, K.; de Souza, V.; Di Pierro, F.; Doll, P.; Engel, R.; Engler, J.; Fuchs, B.; Fuhrmann, D.; Gils, H. J.; Glasstetter, R.; Grupen, C.; Haungs, A.; Heck, D.; Hörandel, J. R.; Huber, D.; Huege, T.; Kampert, K.-H.; Kang, D.; Klages, H. O.; Link, K.; Łuczak, P.; Mathes, H. J.; Mayer, H. J.; Milke, J.; Mitrica, B.; Morello, C.; Oehlschläger, J.; Ostapchenko, S.; Palmieri, N.; Petcu, M.; Pierog, T.; Rebel, H.; Roth, M.; Schieler, H.; Schoo, S.; Schröder, F. G.; Sima, O.; Toma, G.; Trinchero, G. C.; Ulrich, H.; Weindl, A.; Wochele, J.; Zabierowski, J.

    2015-02-01

    In previous studies of KASCADE-Grande data, a Monte Carlo simulation code based on the GEANT3 program has been developed to describe the energy deposited by EAS particles in the detector stations. In an attempt to decrease the simulation time and ensure compatibility with the geometry description in standard KASCADE-Grande analysis software, several structural elements have been neglected in the implementation of the Grande station geometry. To improve the agreement between experimental and simulated data, a more accurate simulation of the response of the KASCADE-Grande detector is necessary. A new simulation code has been developed based on the GEANT4 program, including a realistic geometry of the detector station with structural elements that have not been considered in previous studies. The new code is used to study the influence of a realistic detector geometry on the energy deposited in the Grande detector stations by particles from EAS events simulated by CORSIKA. Lateral Energy Correction Functions are determined and compared with previous results based on GEANT3.

  16. GEANT4 Application for the Simulation of the Head of a Siemens Primus Linac

    NASA Astrophysics Data System (ADS)

    Cortés-Giraldo, M. A.; Quesada, J. M.; Gallardo, M. I.

    2010-04-01

    The Monte Carlo simulation of the head of a Siemens Primus Linac used at Virgen Macarena Hospital (Sevilla, Spain) has been performed using the code GEANT4 [1-2], version 9.2. In this work, the main features of the application built by our group are presented. They are mainly focused on optimizing the performance of the simulation. The geometry, including the water phantom, has been entirely wrapped by a shielding volume which discards all particles escaping far away through its walls. With this, a factor of four in the time spent by the simulation can be saved. An interface to read and write phase-space files in IAEA format has also been developed to save CPU time in our simulations [3-4]. Finally, some calculations of the dose absorbed in the water phantom have been performed and compared with the results given by EGSnrc [5] and with experimental data obtained for the calibration of the machine.

  17. Ion therapy for uveal melanoma in new human eye phantom based on GEANT4 toolkit.

    PubMed

    Mahdipour, Seyed Ali; Mowlavi, Ali Asghar

    2016-01-01

    Radiotherapy with ion beams such as protons and carbon ions has been used for the treatment of eye uveal melanoma for many years. In this research, we have developed a new phantom of the human eye for Monte Carlo simulation of tumor treatment with the GEANT4 toolkit. Total depth-dose profiles for proton, alpha, and carbon incident beams with the same range have been calculated in the phantom. Moreover, the deposited energy of the secondary particles for each of the primary beams is calculated. The dose curves are compared for 47.8 MeV proton, 190.1 MeV alpha, and 1060 MeV carbon ions, which have the same range in the target region, reaching the center of the tumor. The passively scattered spread-out Bragg peak (SOBP) for each incident beam as well as the flux curves of the secondary particles, including neutrons, gammas, and positrons, have been calculated and compared for the primary beams. The high sharpness of the carbon beam's Bragg peak and its low lateral broadening are the benefits of this beam in hadrontherapy, but it has the disadvantages of dose leakage in the tail beyond the Bragg peak and a high intensity of neutron production. However, the proton beam, which conforms well to the tumor shape owing to the beam broadening caused by scattering, can be a good choice for large tumors. PMID:26831752

  18. Characterisation of a SAGe well detector using GEANT4 and LabSOCS

    NASA Astrophysics Data System (ADS)

    Britton, R.; Davies, A. V.

    2015-06-01

    This paper reports on the performance of a recently developed Small Anode Germanium (SAGe) well detector from Canberra Industries. This has been specifically designed to improve the energy resolution of the detector, such that it is comparable to the performance of broad-energy designs while achieving far higher efficiencies. Accurate efficiency characterisations and cascade summing correction factors are crucial for quantifying the radionuclides present in environmental samples, and these were calculated for the complex geometry posed by the well detector using two different methodologies. The first relied on Monte-Carlo simulations based upon the GEANT4 toolkit, and the second utilised Canberra Industries GENIE™ 2000 Gamma Analysis software in conjunction with a LabSOCS™ characterisation. Both were found to be in excellent agreement for all nuclides except for 152Eu, which presents a known issue in the Canberra software (all nuclides affected by this issue were well documented, and fixes are being developed). The correction factors were used to analyse two fully characterised reference samples, yielding results in good agreement with the accepted activity concentrations. Given the sensitivity of well type geometries to cascade summing, this represents a considerable achievement, and paves the way for the use of the SAGe well detector in analysis of 'real-world' environmental samples. With the efficiency increase when using the SAGe well in place of a BEGe, substantial reductions in the Minimum Detectable Activity (MDA) should be achievable for a range of nuclides.

  19. VIDA: A Voxel-Based Dosimetry Method for Targeted Radionuclide Therapy Using Geant4

    PubMed Central

    Dewaraja, Yuni K.; Abramson, Richard G.; Stabin, Michael G.

    2015-01-01

    We have developed the Voxel-Based Internal Dosimetry Application (VIDA) to provide patient-specific dosimetry in targeted radionuclide therapy performing Monte Carlo simulations of radiation transport with the Geant4 toolkit. The code generates voxel-level dose rate maps using anatomical and physiological data taken from individual patients. Voxel level dose rate curves are then fit and integrated to yield a spatial map of radiation absorbed dose. In this article, we present validation studies using established dosimetry results, including self-dose factors (DFs) from the OLINDA/EXM program for uniform activity in unit density spheres and organ self- and cross-organ DFs in the Radiation Dose Assessment Resource (RADAR) reference adult phantom. The comparison with reference data demonstrated agreement within 5% for self-DFs to spheres and reference phantom source organs for four common radionuclides used in targeted therapy (131I, 90Y, 111In, 177Lu). Agreement within 9% was achieved for cross-organ DFs. We also present dose estimates to normal tissues and tumors from studies of two non-Hodgkin Lymphoma patients treated by 131I radioimmunotherapy, with comparison to results generated independently with another dosimetry code. A relative difference of 12% or less was found between methods for mean absorbed tumor doses accounting for tumor regression. PMID:25594357
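
    As a schematic of the fit-and-integrate step described above, assuming a single-exponential dose-rate curve per voxel (the functional forms actually used in VIDA may differ), the absorbed dose is D = D0_dot/λ once D0_dot and λ are estimated from the imaging time points.

      #include <cmath>
      #include <iostream>
      #include <vector>

      struct Sample { double t_h; double doseRate_mGy_per_h; };  // one imaging time point for one voxel

      // Fit dRate(t) = d0 * exp(-lambda * t) by linear least squares on log(dRate),
      // then integrate analytically from t = 0 to infinity: D = d0 / lambda.
      double absorbedDose_mGy(const std::vector<Sample>& pts) {
          double sx = 0, sy = 0, sxx = 0, sxy = 0;
          for (const auto& p : pts) {
              const double x = p.t_h, y = std::log(p.doseRate_mGy_per_h);
              sx += x; sy += y; sxx += x * x; sxy += x * y;
          }
          const double n = static_cast<double>(pts.size());
          const double slope = (n * sxy - sx * sy) / (n * sxx - sx * sx);  // = -lambda [1/h]
          const double d0 = std::exp((sy - slope * sx) / n);               // dose rate at t = 0 [mGy/h]
          return d0 / (-slope);                                            // absorbed dose [mGy]
      }

      int main() {
          // Illustrative dose-rate samples for one voxel at three imaging time points.
          const std::vector<Sample> voxel = {{24.0, 2.0}, {72.0, 1.0}, {144.0, 0.35}};
          std::cout << "absorbed dose ~ " << absorbedDose_mGy(voxel) << " mGy\n";
          return 0;
      }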

  20. Geant4 simulation study of Indian National Gamma Array at TIFR

    NASA Astrophysics Data System (ADS)

    Saha, S.; Palit, R.; Sethi, J.; Biswas, S.; Singh, P.

    2016-03-01

    A Geant4 simulation code for the Indian National Gamma Array (INGA), consisting of 24 Compton-suppressed clover high-purity germanium (HPGe) detectors, has been developed. The calculated properties in the energy range of interest for nuclear γ-ray spectroscopy are the spectral distributions for various standard radioactive sources, the intrinsic peak efficiencies and the peak-to-total (P/T) ratios in various configurations such as singles, add-back and Compton-suppressed mode. The principle of operation of the detectors in add-back and Compton suppression mode has been reproduced in the simulation. The reliability of the calculation is checked by comparison with experimental data for various γ-ray energies up to 5 MeV. The comparison between simulation results and experimental data demonstrates the need to incorporate the exact geometry of the clover detectors, the anti-Compton shield and other surrounding materials in the array to explain the detector response to γ-rays. Several experimental effects are also investigated. These include the geometrical correction to the angular distribution, the crosstalk probability and the impact of heavy metal collimators between the target and the array on the P/T ratio.

  1. VIDA: a voxel-based dosimetry method for targeted radionuclide therapy using Geant4.

    PubMed

    Kost, Susan D; Dewaraja, Yuni K; Abramson, Richard G; Stabin, Michael G

    2015-02-01

    We have developed the Voxel-Based Internal Dosimetry Application (VIDA) to provide patient-specific dosimetry in targeted radionuclide therapy performing Monte Carlo simulations of radiation transport with the Geant4 toolkit. The code generates voxel-level dose rate maps using anatomical and physiological data taken from individual patients. Voxel level dose rate curves are then fit and integrated to yield a spatial map of radiation absorbed dose. In this article, we present validation studies using established dosimetry results, including self-dose factors (DFs) from the OLINDA/EXM program for uniform activity in unit density spheres and organ self- and cross-organ DFs in the Radiation Dose Assessment Resource (RADAR) reference adult phantom. The comparison with reference data demonstrated agreement within 5% for self-DFs to spheres and reference phantom source organs for four common radionuclides used in targeted therapy ((131)I, (90)Y, (111)In, (177)Lu). Agreement within 9% was achieved for cross-organ DFs. We also present dose estimates to normal tissues and tumors from studies of two non-Hodgkin Lymphoma patients treated by (131)I radioimmunotherapy, with comparison to results generated independently with another dosimetry code. A relative difference of 12% or less was found between methods for mean absorbed tumor doses accounting for tumor regression. PMID:25594357

  2. Geant4 simulation of the CERN-EU high-energy reference field (CERF) facility.

    PubMed

    Prokopovich, D A; Reinhard, M I; Cornelius, I M; Rosenfeld, A B

    2010-09-01

    The CERN-EU high-energy reference field facility is used for testing and calibrating both active and passive radiation dosemeters for radiation protection applications in space and aviation. Through a combination of a primary particle beam, target and a suitably designed shielding configuration, the facility is able to reproduce the neutron component of the high-altitude radiation field relevant to the jet aviation industry. Simulations of the facility using the GEANT4 (GEometry ANd Tracking) toolkit provide an improved understanding of the neutron particle fluence as well as the particle fluence of the other radiation components present. The secondary particle fluence as a function of the primary particle fluence incident on the target and the associated dose equivalent rates were determined at the 20 designated irradiation positions available at the facility. Comparisons of the simulated results with previously published simulations obtained using the FLUKA Monte Carlo code, as well as with experimental results of the neutron fluence obtained with a Bonner sphere spectrometer, are made. PMID:20511404

  3. GEANT4 simulation of the angular dependence of TLD-based monitor response

    NASA Astrophysics Data System (ADS)

    Guimarães, C. C.; Moralles, M.; Okuno, E.

    2007-09-01

    In this work, the response of thermoluminescent (TL) monitors to X-ray beams impinging on them at different angles was investigated and compared with the results of simulations performed with the GEANT4 radiation transport toolkit. Each monitor contains four TL detectors (TLDs): two CaF2 pellets and two TLD-100 (one of each type within a lead filter and the other without a filter). Monitors were irradiated free-in-air with X-ray beams of narrow and wide spectra with effective energies of 61 and 130 keV and angles of incidence of 0°, 30°, 45°, and 60°. Curves of the TL response relative to air kerma as a function of photon effective energy for each detector, with and without filter, are used to correct the energy dependence of the TL response. Such curves were also obtained from the radiation energy stored in the TLDs as provided by the simulations. The attenuation increases with the angle of incidence, since the thickness of lead filter traversed by the beam also increases. As the monitor calibration is usually performed with beams impinging on the monitor at 0°, changes in the attenuation become a source of error in the energy determination and consequently in the value of dose equivalent obtained with this monitor. The changes in attenuation observed in the experiments were corroborated by the Monte Carlo simulations.

  4. GATE as a GEANT4-based Monte Carlo platform for the evaluation of proton pencil beam scanning treatment plans

    NASA Astrophysics Data System (ADS)

    Grevillot, L.; Bertrand, D.; Dessy, F.; Freud, N.; Sarrut, D.

    2012-07-01

    Active scanning delivery systems take full advantage of ion beams to best conform to the tumor and to spare surrounding healthy tissues; however, it is also a challenging technique for quality assurance. In this perspective, we upgraded the GATE/GEANT4 Monte Carlo platform in order to recalculate the treatment planning system (TPS) dose distributions for active scanning systems. A method that allows evaluating the TPS dose distributions with the GATE Monte Carlo platform has been developed and applied to the XiO TPS (Elekta), for the IBA proton pencil beam scanning (PBS) system. First, we evaluated the specificities of each dose engine. A dose-conversion scheme that allows one to convert dose to medium into dose to water was implemented within GATE. Specific test cases in homogeneous and heterogeneous configurations allowed for the estimation of the differences between the beam models implemented in XiO and GATE. Finally, dose distributions of a prostate treatment plan were compared. In homogeneous media, a satisfactory agreement was generally obtained between XiO and GATE. The maximum stopping power difference of 3% occurred in a human tissue of 0.9 g cm-3 density and led to a significant range shift. Comparisons in heterogeneous configurations pointed out the limits of the TPS dose calculation accuracy and the superiority of Monte Carlo simulations. The necessity of computing dose to water in our Monte Carlo code for comparisons with TPSs is also presented. Finally, the new capabilities of the platform are applied to a prostate treatment plan and dose differences between both dose engines are analyzed in detail. This work presents a generic method to compare TPS dose distributions with the GATE Monte Carlo platform. It is noteworthy that GATE is also a convenient tool for imaging applications, therefore opening new research possibilities for the PBS modality.
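
    The dose-conversion scheme referred to above follows the usual relation D_water = D_medium × (S/ρ)_water/medium, i.e. the dose scored in the local medium is scaled by the water-to-medium mass stopping power ratio for the local proton spectrum. The sketch below uses placeholder ratios, not the values implemented in GATE.

      #include <iostream>
      #include <map>
      #include <string>

      // Convert dose-to-medium into dose-to-water by multiplying with the water-to-medium
      // mass stopping power ratio. The ratios below are illustrative placeholders only;
      // in practice they depend on the local proton energy spectrum.
      double doseToWater(double doseToMedium, const std::string& medium,
                         const std::map<std::string, double>& sprWaterOverMedium) {
          return doseToMedium * sprWaterOverMedium.at(medium);
      }

      int main() {
          const std::map<std::string, double> spr = {
              {"water", 1.00}, {"softTissue", 1.01}, {"corticalBone", 1.10}};  // illustrative only
          std::cout << "1.00 Gy in bone   -> " << doseToWater(1.00, "corticalBone", spr) << " Gy-to-water\n";
          std::cout << "1.00 Gy in tissue -> " << doseToWater(1.00, "softTissue", spr)   << " Gy-to-water\n";
          return 0;
      }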

  5. Geant4 simulation for a study of a possible use of carbon ion pencil beams for the treatment of ocular melanomas with the active scanning system at CNAO

    NASA Astrophysics Data System (ADS)

    Farina, E.; Piersimoni, P.; Riccardi, C.; Rimoldi, A.; Tamborini, A.; Ciocca, M.

    2015-12-01

    The aim of this work was to study a possible use of carbon ion pencil beams (delivered with active scanning modality) for the treatment of ocular melanomas at the Centro Nazionale di Adroterapia Oncologica (CNAO). The promising aspect of carbon ions radiotherapy for the treatment of this disease lies in its superior relative radio-biological effectiveness (RBE). The Monte Carlo (MC) Geant4 10.00 toolkit was used to simulate the complete CNAO extraction beamline, with the active and passive components along it. A human eye modeled detector, including a realistic target tumor volume, was used as target. Cross check with previous studies at CNAO using protons allowed comparisons on possible benefits on using such a technique with respect to proton beams. Experimental data on proton and carbon ion beams transverse distributions were used to validate the simulation.

  6. Assessment of patient dose reduction by bismuth shielding in CT using measurements, GEANT4 and MCNPX simulations.

    PubMed

    Mendes, M; Costa, F; Figueira, C; Madeira, P; Teles, P; Vaz, P

    2015-07-01

    This work reports on the use of two different Monte Carlo codes (GEANT4 and MCNPX) for assessing the dose reduction achieved with bismuth shields in computed tomography (CT) procedures in order to protect radiosensitive organs such as the eye lens, thyroid and breast. Measurements were performed using head and body PMMA phantoms and an ionisation chamber placed in five different positions of the phantom. Simulations were performed to estimate Computed Tomography Dose Index values using GEANT4 and MCNPX. The relative differences between measurements and simulations were <10 %. The dose reduction arising from the use of bismuth shielding ranges from 2 to 45 %, depending on the position of the bismuth shield. The percentage of dose reduction was more significant for the area covered by the bismuth shielding (36 % for eye lens, 39 % for thyroid and 45 % for breast shields). PMID:25813483
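
    For context, the central and peripheral phantom measurements enter the weighted CT dose index as CTDI_w = (1/3)*CTDI_center + (2/3)*CTDI_periphery, and the effect of a shield is usually quoted as the percentage reduction of this value. Below is a small sketch with invented numbers.

      #include <iostream>
      #include <numeric>
      #include <vector>

      // Weighted CTDI from one central and several peripheral measurements (all in mGy).
      double ctdiW(double center, const std::vector<double>& periphery) {
          const double periMean =
              std::accumulate(periphery.begin(), periphery.end(), 0.0) / periphery.size();
          return center / 3.0 + 2.0 * periMean / 3.0;
      }

      int main() {
          // Invented head-phantom values without and with a bismuth shield (mGy).
          const double noShield   = ctdiW(40.0, {44.0, 43.0, 45.0, 44.0});
          const double withShield = ctdiW(38.0, {30.0, 42.0, 44.0, 43.0});  // shield over one position
          std::cout << "CTDIw without shield: " << noShield   << " mGy\n"
                    << "CTDIw with shield   : " << withShield << " mGy\n"
                    << "dose reduction      : " << 100.0 * (noShield - withShield) / noShield << " %\n";
          return 0;
      }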

  7. Evaluation of open MPI and MPICH2 performances for the computation time in proton therapy dose calculations with Geant4

    NASA Astrophysics Data System (ADS)

    Kazemi, M.; Afarideh, H.; Riazi, Z.

    2015-11-01

    The aim of this research work is to use a better parallel software structure to improve the performance of the Geant4 Monte Carlo code in proton treatment planning. The hadron therapy simulation was rewritten to run in parallel on shared-memory multiprocessor systems by using the Message-Passing Interface (MPI). The speedup performance of the code has been studied using two MPI-compliant libraries, Open MPI and MPICH2, separately. The speedup results are almost linear for both Open MPI and MPICH2; the latter was chosen because of its better characteristics and lower computation time. The Geant4 parameters, including the step limiter and the production cut ("set cut"), have been analyzed to minimize the simulation time as much as possible. For a reasonable compromise between the spatial dose distribution and the calculation time, the time reduction coefficient reaches about 157.
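
    For readers unfamiliar with the two Geant4 parameters named above, the sketch below shows how a maximum step length and a production cut are typically set in a user application; the volume, function name and numerical values are illustrative assumptions, not those used in the study:

```cpp
// Hedged sketch of the "step limiter" and "set cut" handles referred to above
// (illustrative values; not the authors' configuration).
#include "G4LogicalVolume.hh"
#include "G4UserLimits.hh"
#include "G4StepLimiterPhysics.hh"
#include "G4VModularPhysicsList.hh"
#include "G4SystemOfUnits.hh"

void ConfigureTracking(G4LogicalVolume* phantomLogical,
                       G4VModularPhysicsList* physicsList)
{
  // Register the step-limiter process so that G4UserLimits is honoured.
  physicsList->RegisterPhysics(new G4StepLimiterPhysics());

  // Cap the tracking step length inside the phantom volume.
  phantomLogical->SetUserLimits(new G4UserLimits(0.5*mm));

  // Production threshold ("set cut") for secondary particles.
  physicsList->SetDefaultCutValue(0.1*mm);
}
```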

  8. Geant4 simulations on medical Linac operation at 18 MV: Experimental validation based on activation foils

    NASA Astrophysics Data System (ADS)

    Vagena, E.; Stoulos, S.; Manolopoulou, M.

    2016-03-01

    The operation of a medical linear accelerator was simulated using the Geant4 code in order to study the characteristics of an 18 MV photon beam. Simulations showed that (a) the photon spectrum at the isocenter is not influenced by changes of the primary electron beam's energy distribution and spatial spread; (b) 98% of the photon energy fluence scored at the isocenter comes from primary photons that have only interacted with the target; (c) the number of contaminant electrons is not negligible, since it fluctuated around 5×10⁻⁵ per primary electron or 2.40×10⁻³ per photon at the isocenter; (d) the number of neutrons created by (γ, n) reactions is 3.13×10⁻⁶ per primary electron or 1.50×10⁻³ per photon at the isocenter; (e) a flattening-filter-free beam needs fewer primary electrons to deliver the same photon fluence at the isocenter than normal operation with the flattening filter; (f) there is no significant increase of the surface dose due to contaminant electrons when the flattening filter is removed; (g) comparing the neutron fluences per incident electron for the flattened and unflattened beams, the neutron fluence is 7% higher for the unflattened beam. To validate the simulation results, the total neutron and photon fluences at the isocenter were measured using nickel, indium, and natural uranium activation foils. For the photon fluence, the percentage difference between simulations and measurements was 1.26% for the uranium foil and 2.45% for the indium foil, while for neutrons the discrepancy was higher, up to 8.0%. The photon and neutron fluences of the simulated experiments fall within ±1 and ±2 sigma, respectively, of the experimentally obtained values.

  9. The simulation of the LANFOS-H food radiation contamination detector using Geant4 package

    NASA Astrophysics Data System (ADS)

    Piotrowski, Lech Wiktor; Casolino, Marco; Ebisuzaki, Toshikazu; Higashide, Kazuhiro

    2015-02-01

    The recent incident at the Fukushima power plant caused growing concern about radiation contamination and resulted in lowering the Japanese limits for the permitted amount of 137Cs in food to 100 Bq/kg. To increase safety and ease this concern we are developing LANFOS (Large Food Non-destructive Area Sampler), a compact, easy-to-use detector for the assessment of radiation in food. The LANFOS-H detector described in this paper has 4π coverage to assess the amount of 137Cs present, separating it from possible 40K contamination of the food. Therefore, food samples do not have to be pre-processed prior to a test and can be consumed after measurement. It is designed for use by non-professionals in homes and small institutions such as schools, indicating the safety of the samples, but it can also be utilized by specialists, providing a radiation spectrum. Proper assessment of radiation in food with the apparatus requires estimation of the γ conversion factor of the detectors: how many γ photons will produce a signal. In this paper we show results of the Monte Carlo estimation of this factor for various approximated shapes of fish, vegetables and amounts of rice, performed with the Geant4 package. We find that the conversion factor combined from all the detectors is similar for all food types and is around 37%, varying by at most 5% with sample length, much less than for the individual detectors. Different inclinations and positions of samples in the detector introduce an uncertainty of 1.4%. This small uncertainty validates the concept of a 4π non-destructive apparatus.

  10. PDB4DNA: Implementation of DNA geometry from the Protein Data Bank (PDB) description for Geant4-DNA Monte-Carlo simulations

    NASA Astrophysics Data System (ADS)

    Delage, E.; Pham, Q. T.; Karamitros, M.; Payno, H.; Stepan, V.; Incerti, S.; Maigne, L.; Perrot, Y.

    2015-07-01

    This paper describes PDB4DNA, a new Geant4 user application based on an independent, cross-platform, free and open-source C++ library called PDBlib, which enables the use of an atomic-level description of the DNA molecule in Geant4 Monte Carlo particle transport simulations. For the evaluation of direct damage induced on the DNA molecule by ionizing particles, the application makes use of an algorithm able to determine the atom in the DNA molecule closest to each energy deposition. Both the PDB4DNA application and the PDBlib library are available as free and open-source software under the Geant4 license.
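
    The closest-atom association described above can be pictured with the following minimal sketch; it is a naive linear search written for illustration only (the Atom record and function name are hypothetical and are not PDBlib's actual interface):

```cpp
// Naive nearest-atom search: maps an energy deposition at (hx, hy, hz) to the
// index of the closest atom of a DNA model loaded from a PDB file.
// Purely illustrative; PDBlib's real data structures and algorithm may differ.
#include <cstddef>
#include <limits>
#include <vector>

struct Atom { double x, y, z; };   // hypothetical minimal atom record

std::size_t ClosestAtom(const std::vector<Atom>& atoms,
                        double hx, double hy, double hz)
{
  std::size_t best = 0;
  double bestD2 = std::numeric_limits<double>::max();
  for (std::size_t i = 0; i < atoms.size(); ++i) {
    const double dx = atoms[i].x - hx;
    const double dy = atoms[i].y - hy;
    const double dz = atoms[i].z - hz;
    const double d2 = dx*dx + dy*dy + dz*dz;
    if (d2 < bestD2) { bestD2 = d2; best = i; }
  }
  return best;   // index of the atom nearest to the energy deposition
}
```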

  11. GEANT4 used for neutron beam design of a neutron imaging facility at TRIGA reactor in Morocco

    NASA Astrophysics Data System (ADS)

    Ouardi, A.; Machmach, A.; Alami, R.; Bensitel, A.; Hommada, A.

    2011-09-01

    Neutron imaging has a broad scope of applications and has played a pivotal role in visualizing and quantifying hydrogenous masses in metallic matrices. The field continues to expand into new applications with the installation of new neutron imaging facilities. In this scope, a neutron imaging facility for computed tomography and real-time neutron radiography is currently being developed around the 2.0 MW TRIGA MARK-II reactor at the Maamora Nuclear Research Center in Morocco (Reuscher et al., 1990 [1]; de Menezes et al., 2003 [2]; Deinert et al., 2005 [3]). The neutron imaging facility consists of a neutron collimator, a real-time neutron imaging system and image processing systems. In order to reduce the gamma-ray content in the neutron beam, the tangential channel was selected. For a power of 250 kW, the corresponding thermal neutron flux measured at the inlet of the tangential channel is around 3×10¹¹ n cm⁻² s⁻¹. This facility will be based on a conical neutron collimator with two circular diaphragms with diameters of 4 and 2 cm, corresponding to L/D ratios of 165 and 325, respectively. These diaphragm sizes allow a compromise to be reached between good flux and an efficient L/D ratio. A convergent-divergent collimator geometry has been adopted. The beam line consists of a gamma filter, a fast-neutron filter, a neutron moderator, neutron and gamma shutters, biological shielding around the collimator and several stages of neutron collimation. Monte Carlo calculations with the fully 3D code GEANT4 were used to design the neutron beam line (http://www.info.cern.ch/asd/geant4/geant4.html [4]). To enhance the quality of the thermal neutron beam, several materials, mainly bismuth (Bi) and sapphire (Al2O3), were examined as gamma and neutron filters, respectively. The GEANT4 simulations showed that the gamma rays and the epithermal and fast neutrons could be filtered using the bismuth (Bi) and sapphire (Al2O3) filters, respectively. To get a good cadmium ratio, GEANT4 simulations were used to

  12. Compton polarimeter as a focal plane detector for hard X-ray telescope: sensitivity estimation with Geant4 simulations

    NASA Astrophysics Data System (ADS)

    Chattopadhyay, T.; Vadawale, S. V.; Pendharkar, J.

    2013-04-01

    X-ray polarimetry can be an important tool for investigating various physical processes, as well as their geometries, at celestial X-ray sources. However, X-ray polarimetry has not progressed much compared to spectroscopy, timing and imaging, mainly due to the extremely photon-hungry nature of X-ray polarimetry, leading to the severely limited sensitivity of X-ray polarimeters. The great improvement in sensitivity in spectroscopy and imaging was possible due to focusing X-ray optics, which are effective only in the soft X-ray energy range. A similar improvement in the sensitivity of polarisation measurement in the soft X-ray range is expected in the near future with the advent of GEM-based photoelectric polarimeters. However, at energies >10 keV, even the spectroscopic and imaging sensitivities of X-ray detectors are limited due to the lack of focusing optics. Thus hard X-ray polarimetry has so far been a largely unexplored area. On the other hand, the polarisation degree is typically expected to increase at higher energies, as radiation from non-thermal processes constitutes the dominant fraction. So polarisation measurements in hard X-rays can yield significant insights into such processes. With the recent availability of hard X-ray optics (e.g. with the upcoming NuSTAR and Astro-H missions), which can focus X-rays from 5 keV to 80 keV, the sensitivity of X-ray detectors in the hard X-ray range is expected to improve significantly. In this context we explore the feasibility of a focal plane hard X-ray polarimeter based on Compton scattering, having a thin plastic scatterer surrounded by a cylindrical array of scintillator detectors. We have carried out detailed Geant4 simulations to estimate the modulation factor for a 100% polarised beam as well as the polarimetric efficiency of this configuration. We have also validated these results with a semi-analytical approach. Here we present the initial results on the polarisation sensitivity of such a focal plane Compton polarimeter coupled with the reflection efficiency of present-era hard X
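
    For context, the modulation factor quoted above is conventionally extracted from the azimuthal distribution of Compton-scattered events, and the corresponding sensitivity is usually expressed as a minimum detectable polarisation (MDP). The standard definitions are reproduced below as a hedged reference, not as formulas taken from this paper:

```latex
% Standard Compton-polarimetry definitions (generic, not specific to this work):
% N_max, N_min are the extrema of the azimuthal modulation curve for a 100%
% polarised beam; R_src and R_bkg are source and background count rates; T is
% the exposure time.
\mu_{100} \;=\; \frac{N_{\max}-N_{\min}}{N_{\max}+N_{\min}},
\qquad
\mathrm{MDP}_{99\%} \;\approx\; \frac{4.29}{\mu_{100}\,R_{\mathrm{src}}}
  \sqrt{\frac{R_{\mathrm{src}}+R_{\mathrm{bkg}}}{T}}
```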

  13. Simulation of the radiation exposure in space during a large solar energetic particle event with GEANT4

    NASA Astrophysics Data System (ADS)

    Matthiä, Daniel; Berger, Thomas; Puchalska, Monika; Reitz, Guenther

    in August 1972 in the energy range from 45 MeV to 1 GeV. The transport calculations of the energetic particles through the shielding and the phantom model were performed using the Monte-Carlo code GEANT4.

  14. Monte Carlo simulation of MOSFET dosimeter for electron backscatter using the GEANT4 code.

    PubMed

    Chow, James C L; Leung, Michael K K

    2008-06-01

    The aim of this study is to investigate the influence of the body of the metal-oxide-semiconductor field effect transistor (MOSFET) dosimeter in measuring the electron backscatter from lead. The electron backscatter factor (EBF), which is defined as the ratio of the dose at the tissue-lead interface to the dose at the same point without the presence of backscatter, was calculated by Monte Carlo simulation using the GEANT4 code. Electron beams with energies of 4, 6, 9, and 12 MeV were used in the simulation. It was found that in the presence of the MOSFET body, the EBFs were underestimated by about 2%-0.9% for electron beam energies of 4-12 MeV, respectively. The trend of decreasing EBF with increasing electron energy can be explained by the small MOSFET dosimeter, mainly made of epoxy and silicon, attenuating not only the electron fluence of the beam from upstream, but also the electron backscatter generated by the lead underneath the dosimeter. However, this variation of the EBF underestimation is of the same order as the statistical uncertainties of the Monte Carlo simulations, which ranged from 1.3% to 0.8% for electron energies of 4-12 MeV, due to the small dosimetric volume. Such a small EBF deviation is therefore insignificant when the uncertainty of the Monte Carlo simulation is taken into account. Corresponding measurements were carried out, and agreement with the Monte Carlo results was within +/- 2%. Spectra of the energy deposited by the backscattered electrons in dosimetric volumes with and without the lead and MOSFET were determined by Monte Carlo simulations. It was found that in both cases, when the MOSFET body is either present or absent in the simulation, deviations of the electron energy spectra with and without the lead decrease with an increase of the electron beam energy. Moreover, the softer spectrum of the backscattered electrons when lead is present can result in a reduction of the MOSFET response due to stronger
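
    The verbal definition of the electron backscatter factor used above corresponds to the simple ratio below (a restatement of the abstract's definition, with D denoting the dose at the scoring point):

```latex
% Electron backscatter factor as defined in the abstract:
\mathrm{EBF} \;=\;
  \frac{D_{\text{tissue--lead interface}}}
       {D_{\text{same point, no lead (no backscatter)}}}
```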

  15. PET monitoring of cancer therapy with 3He and 12C beams: a study with the GEANT4 toolkit.

    PubMed

    Pshenichnov, Igor; Larionov, Alexei; Mishustin, Igor; Greiner, Walter

    2007-12-21

    We study the spatial distributions of β+ activity produced by therapeutic beams of 3He and 12C ions in various tissue-like materials. The calculations were performed within a Monte Carlo model for heavy-ion therapy (MCHIT) based on the GEANT4 toolkit. The contributions from positron-emitting nuclei with T1/2 > 10 s, namely 10,11C, 13N, 14,15O, 17,18F and 30P, were calculated and compared with experimental data obtained during and after irradiation, where available. Positron-emitting nuclei are created by a 12C beam in fragmentation reactions of projectile and target nuclei. This leads to a β+ activity profile characterized by a noticeable peak located close to the Bragg peak in the corresponding depth-dose distribution. This can be used for dose monitoring in carbon-ion therapy of cancer. In contrast, as most of the positron-emitting nuclei are produced by a 3He beam in target fragmentation reactions, the calculated total β+ activity during or soon after the irradiation period is evenly distributed within the projectile range. However, we also predict the presence of 13N, 14O and 17,18F created in charge-transfer reactions by low-energy 3He ions close to the end of their range in several tissue-like media. The time evolution of β+ activity profiles was investigated for both kinds of beams. We found that due to the production of 18F nuclides the β+ activity profile measured 2 or 3 h after irradiation with 3He ions will have a distinct peak correlated with the maximum of the depth-dose distribution. We also found certain advantages of low-energy 3He beams over low-energy proton beams for reliable PET monitoring during particle therapy of shallow-located tumours. In this case the distal edge of the β+ activity distribution from 17F nuclei clearly marks the range of 3He in tissues. PMID:18065840

  16. TH-E-BRE-01: A 3D Solver of Linear Boltzmann Transport Equation Based On a New Angular Discretization Method with Positivity for Photon Dose Calculation Benchmarked with Geant4

    SciTech Connect

    Hong, X; Gao, H

    2014-06-15

    Purpose: The Linear Boltzmann Transport Equation (LBTE), solved through the statistical Monte Carlo (MC) method, provides accurate dose calculation in radiotherapy. This work investigates an alternative way of accurately solving the LBTE using a deterministic numerical method, because of its possible advantage in computational speed over MC. Methods: Instead of using traditional spherical harmonics to approximate the angular scattering kernel, our deterministic numerical method directly computes angular scattering weights, based on a new angular discretization method that utilizes a linear finite element method on a local triangulation of the unit angular sphere. As a result, our angular discretization method has the unique advantage of positivity, i.e., it maintains all scattering weights nonnegative at all times, which is physically correct. Moreover, our method is local in angular space and therefore handles anisotropic scattering well, such as forward-peaked scattering. To be compatible with image-guided radiotherapy, the spatial variables are discretized on a structured grid with the standard diamond scheme. After discretization, an improved source-iteration method is utilized for solving the linear system without saving the linear system to memory. The accuracy of our 3D solver is validated using analytic solutions and benchmarked against Geant4, a popular MC solver. Results: The differences between Geant4 solutions and our solutions were less than 1.5% for various testing cases that mimic practical cases. More details are available in the supporting document. Conclusion: We have developed a 3D LBTE solver based on a new angular discretization method that guarantees the positivity of scattering weights for physical correctness, and it has been benchmarked against Geant4 for photon dose calculation.
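
    For reference, the steady-state transport equation that the deterministic solver discretises has the generic form below (standard textbook notation, given here only as background; the paper's own discretised operators are not reproduced):

```latex
% Generic steady-state linear Boltzmann transport equation for the angular
% fluence psi(r, E, Omega), with total cross section sigma_t, differential
% scattering cross section sigma_s and source term q:
\boldsymbol{\Omega}\cdot\nabla\psi(\mathbf{r},E,\boldsymbol{\Omega})
 + \sigma_t(\mathbf{r},E)\,\psi(\mathbf{r},E,\boldsymbol{\Omega})
 = \int_0^{\infty}\!\!\int_{4\pi}
   \sigma_s(\mathbf{r},E'\!\to\!E,\boldsymbol{\Omega}'\!\cdot\!\boldsymbol{\Omega})\,
   \psi(\mathbf{r},E',\boldsymbol{\Omega}')\,\mathrm{d}\Omega'\,\mathrm{d}E'
 + q(\mathbf{r},E,\boldsymbol{\Omega})
```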

  17. A comparison of the measured responses of a tissue-equivalent proportional counter to high energy heavy (HZE) particles and those simulated using the Geant4 Monte Carlo code

    PubMed Central

    Taddei, Phillip J.; Zhao, Zhongxiang; Borak, Thomas B.

    2010-01-01

    Monte Carlo simulations of heavy ion interactions using the Geant4 toolkit were compared with measurements of energy deposition in a spherical tissue-equivalent proportional counter (TEPC). A spherical cavity with a physical diameter of 12.7 mm was filled with propane-based tissue-equivalent gas surrounded by a wall of A-150 tissue-equivalent plastic that was 2.54 mm thick. Measurements and Monte Carlo simulations were used to record the energy deposition and the trajectory of the incident particle on an event-by-event basis for ions ranging in atomic number from 2 (4He) to 26 (56Fe) and in energy from 200 MeV/nucleon to 1000 MeV/nucleon. In the simulations, tracking of secondary electrons was terminated when the range of an electron was below a specified threshold. The effects of range cuts for electrons at 0.5 μm, 1 μm, 10 μm, and 100 μm were evaluated. To simulate an energy deposition influenced by large numbers of low energy electrons with large transverse momentum, it was necessary to track electrons down to range cuts of 10 μm or less. The Geant4 simulated data closely matched the measured data acquired using a TEPC for incident particles traversing the center of the detector as well as near the gas-wall interface. Values of frequency mean lineal energy and dose mean lineal energy were within 8% of the measured data. The production of secondary particles in the aluminum vacuum chamber had no effect on the response of the TEPC for 56Fe at 1000 MeV/nucleon. The results of this study confirm that Geant4 can simulate patterns of energy deposition for existing microdosimeters and is valuable for improving the design of a new generation of detectors used for space dosimetry and for characterizing particle beams used in hadron radiotherapy. PMID:20862212
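
    The electron range cuts referred to above are set in Geant4 through the physics list production thresholds; the sketch below shows the usual mechanism with one of the thresholds quoted in the study (the class name is hypothetical and the electromagnetic processes themselves are omitted):

```cpp
// Hedged sketch: applying a 10 micrometre production (range) cut for secondary
// electrons in a Geant4 physics list. Illustrative only; not the authors' code.
#include "G4VUserPhysicsList.hh"
#include "G4SystemOfUnits.hh"

class TEPCPhysicsList : public G4VUserPhysicsList   // hypothetical class name
{
 public:
  void ConstructParticle() override { /* particle definitions go here */ }
  void ConstructProcess() override  { AddTransportation(); /* + EM physics */ }
  void SetCuts() override
  {
    SetCutValue(10*um, "e-");     // range cut tested in the study (also 0.5, 1, 100 um)
    SetCutValue(10*um, "gamma");  // photons kept at the same threshold in this sketch
  }
};
```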

  18. A comparison of the measured responses of a tissue-equivalent proportional counter to high energy heavy (HZE) particles and those simulated using the Geant4 Monte Carlo code.

    PubMed

    Taddei, Phillip J; Zhao, Zhongxiang; Borak, Thomas B

    2008-10-01

    Monte Carlo simulations of heavy ion interactions using the Geant4 toolkit were compared with measurements of energy deposition in a spherical tissue-equivalent proportional counter (TEPC). A spherical cavity with a physical diameter of 12.7 mm was filled with propane-based tissue-equivalent gas surrounded by a wall of A-150 tissue-equivalent plastic that was 2.54 mm thick. Measurements and Monte Carlo simulations were used to record the energy deposition and the trajectory of the incident particle on an event-by-event basis for ions ranging in atomic number from 2 (4He) to 26 (56Fe) and in energy from 200 MeV/nucleon to 1000 MeV/nucleon. In the simulations, tracking of secondary electrons was terminated when the range of an electron was below a specified threshold. The effects of range cuts for electrons at 0.5 μm, 1 μm, 10 μm, and 100 μm were evaluated. To simulate an energy deposition influenced by large numbers of low energy electrons with large transverse momentum, it was necessary to track electrons down to range cuts of 10 μm or less. The Geant4 simulated data closely matched the measured data acquired using a TEPC for incident particles traversing the center of the detector as well as near the gas-wall interface. Values of frequency mean lineal energy and dose mean lineal energy were within 8% of the measured data. The production of secondary particles in the aluminum vacuum chamber had no effect on the response of the TEPC for 56Fe at 1000 MeV/nucleon. The results of this study confirm that Geant4 can simulate patterns of energy deposition for existing microdosimeters and is valuable for improving the design of a new generation of detectors used for space dosimetry and for characterizing particle beams used in hadron radiotherapy. PMID:20862212

  19. Computed Pion Yields from a Tantalum Rod Target: Comparing MARS15 and GEANT4 Across Proton Energies

    NASA Astrophysics Data System (ADS)

    Brooks, S. J.; Walaron, K. A.

    2006-05-01

    The choice of proton driver energy is an important variable in maximising the pion flux available in later stages of the neutrino factory. Simulations of pion production using a range of energies are presented and cross-checked for reliability between the codes MARS15 and GEANT4. The distributions are combined with postulated apertures for the pion decay channel and muon front-end to estimate the usable muon flux after capture losses. Resolution of discrepancies between the codes awaits experimental data in the required energy range.

  20. Studying the response of a plastic scintillator to gamma rays using the Geant4 Monte Carlo code.

    PubMed

    Ghadiri, Rasoul; Khorsandi, Jamshid

    2015-05-01

    To determine the gamma-ray response function of an NE-102 scintillator and to investigate the gamma spectra due to the transport of optical photons, we simulated an NE-102 scintillator using the Geant4 code. The results of the simulation were compared with experimental data. Good consistency between the simulation and the data was observed. In addition, the time and spatial distributions, along with the energy distribution and surface treatments of the scintillation detectors, were calculated. This simulation enables optimization of the photomultiplier tube (or photodiode) position to yield the best coupling to the detector. PMID:25725326

  1. First GEANT4-based simulation investigation of a Li-coated resistive plate chamber for low-energy neutrons

    NASA Astrophysics Data System (ADS)

    Rhee, J. T.; Jamil, M.; Jeon, Y. J.

    2013-08-01

    A simulation study of the performance of a single-gap resistive plate chamber (RPC) coated with a Li layer for the detection of low-energy neutrons was performed by means of the GEANT4 Monte Carlo code. Low-energy neutrons were detected via the 7Li(n, α)3He nuclear reaction. To make the detector sensitive to low-energy neutrons, Li-coating was employed on both the forward and backward electrodes of the converter. Low-energy neutrons were transported onto the Li-coated RPC by the GEANT4 MC code. A detector with a converter area of 5×5 cm2 was utilized for this work. The detection response was evaluated as a function of incident neutron energy in the range of 25 MeV-100 MeV. The evaluated results predicted a higher detection response for the backward-coated converter detector than for the forward-coated converter RPC setup. This type of detector can be useful for the detection of low-energy neutrons.

  2. Layered mass geometry: a novel technique to overlay seeds and applicators onto patient geometry in Geant4 brachytherapy simulations.

    PubMed

    Enger, Shirin A; Landry, Guillaume; D'Amours, Michel; Verhaegen, Frank; Beaulieu, Luc; Asai, Makoto; Perl, Joseph

    2012-10-01

    A problem faced by all Monte Carlo (MC) particle transport codes is how to handle overlapping geometries. The Geant4 MC toolkit allows the user to create parallel geometries within a single application. In Geant4 the standard mass-containing geometry is defined in a simulation volume called the World Volume. Separate parallel geometries can be defined in parallel worlds, that is, alternate three-dimensional simulation volumes that share the same coordinate system with the World Volume for geometrical event biasing, scoring of radiation interactions, and/or the creation of hits in detailed readout structures. Until recently, only one of those worlds could contain mass, so these parallel worlds provided no solution to simplify a complex geometric overlay issue in brachytherapy, namely the overlap of radiation sources and applicators with a CT-based patient geometry. The standard method to handle seed and applicator overlay in MC requires removing CT voxels whose boundaries would intersect sources, placing the sources into the resulting void and then backfilling the remaining space of the void with a relevant material. The backfilling process may degrade the accuracy of patient representation, and the geometrical complexity of the technique precludes using fast and memory-efficient coding techniques that have been developed for regular voxel geometries. The patient must be represented by the less memory- and CPU-efficient Geant4 voxel placement technique, G4PVPlacement, rather than the more efficient G4NestedParameterization (G4NestedParam). We introduce for the first time a Geant4 feature developed to solve this issue: Layered Mass Geometry (LMG) whereby both the standard (CT-based patient geometry) and the parallel world (seeds and applicators) may now have mass. For any area where mass is present in the parallel world, the parallel mass is used. Elsewhere, the mass of the standard world is used. With LMG the user no longer needs to remove patient CT voxels that would
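
    A minimal sketch of the machinery described above is given below, using the standard Geant4 parallel-world classes; the class and world names are placeholders, and the layered-mass flag of G4ParallelWorldPhysics is shown as an assumption about how such a setup is typically wired together rather than as the authors' actual code:

```cpp
// Hedged sketch: a mass-carrying parallel world for seeds/applicators overlaid
// on a CT-based patient world (layered mass geometry). Names are placeholders.
#include "globals.hh"
#include "G4VUserParallelWorld.hh"
#include "G4VPhysicalVolume.hh"
#include "G4ParallelWorldPhysics.hh"

class SeedParallelWorld : public G4VUserParallelWorld
{
 public:
  explicit SeedParallelWorld(const G4String& name)
    : G4VUserParallelWorld(name) {}

  void Construct() override
  {
    // The ghost world shares the coordinate system of the mass (patient) world.
    G4VPhysicalVolume* ghostWorld = GetWorld();
    // ... place seed and applicator volumes, with their materials, here ...
    (void)ghostWorld;   // suppress unused-variable warning in this stub
  }
};

// In the main program (sketch):
//   detector->RegisterParallelWorld(new SeedParallelWorld("BrachyWorld"));
//   physicsList->RegisterPhysics(
//       new G4ParallelWorldPhysics("BrachyWorld", /*layeredMass=*/true));
```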

  3. Layered mass geometry: a novel technique to overlay seeds and applicators onto patient geometry in Geant4 brachytherapy simulations

    NASA Astrophysics Data System (ADS)

    Enger, Shirin A.; Landry, Guillaume; D'Amours, Michel; Verhaegen, Frank; Beaulieu, Luc; Asai, Makoto; Perl, Joseph

    2012-10-01

    A problem faced by all Monte Carlo (MC) particle transport codes is how to handle overlapping geometries. The Geant4 MC toolkit allows the user to create parallel geometries within a single application. In Geant4 the standard mass-containing geometry is defined in a simulation volume called the World Volume. Separate parallel geometries can be defined in parallel worlds, that is, alternate three-dimensional simulation volumes that share the same coordinate system with the World Volume for geometrical event biasing, scoring of radiation interactions, and/or the creation of hits in detailed readout structures. Until recently, only one of those worlds could contain mass, so these parallel worlds provided no solution to simplify a complex geometric overlay issue in brachytherapy, namely the overlap of radiation sources and applicators with a CT-based patient geometry. The standard method to handle seed and applicator overlay in MC requires removing CT voxels whose boundaries would intersect sources, placing the sources into the resulting void and then backfilling the remaining space of the void with a relevant material. The backfilling process may degrade the accuracy of patient representation, and the geometrical complexity of the technique precludes using fast and memory-efficient coding techniques that have been developed for regular voxel geometries. The patient must be represented by the less memory- and CPU-efficient Geant4 voxel placement technique, G4PVPlacement, rather than the more efficient G4NestedParameterization (G4NestedParam). We introduce for the first time a Geant4 feature developed to solve this issue: Layered Mass Geometry (LMG) whereby both the standard (CT-based patient geometry) and the parallel world (seeds and applicators) may now have mass. For any area where mass is present in the parallel world, the parallel mass is used. Elsewhere, the mass of the standard world is used. With LMG the user no longer needs to remove patient CT voxels that would

  4. Ray tracing simulations for the wide-field x-ray telescope of the Einstein Probe mission based on Geant4 and XRTG4

    NASA Astrophysics Data System (ADS)

    Zhao, Donghua; Zhang, Chen; Yuan, Weimin; Willingale, Richard; Ling, Zhixing; Feng, Hua; Li, Hong; Ji, Jianfeng; Wang, Wenxin; Zhang, Shuangnan

    2014-07-01

    Einstein Probe (EP) is a proposed small scientific satellite dedicated to time-domain astrophysics working in the soft X-ray band. It will discover transients and monitor variable objects in 0.5-4 keV, for which it will employ a very large instantaneous field-of-view (60° × 60°), along with moderate spatial resolution (FWHM ~ 5 arcmin). Its wide-field imaging capability will be achieved by using established technology in novel lobster-eye optics. In this paper, we present Monte-Carlo simulations for the focusing capabilities of EP's Wide-field X-ray Telescope (WXT). The simulations are performed using Geant4 with an X-ray tracer which was developed by cosine (http://cosine.nl/) to trace X-rays. Our work is the first step toward building a comprehensive model with which the design of the X-ray optics and the ultimate sensitivity of the instrument can be optimized by simulating the X-ray tracing and radiation environment of the system, including the focal plane detector and the shielding at the same time.

  5. Organ doses from hepatic radioembolization with 90Y, 153Sm, 166Ho and 177Lu: A Monte Carlo simulation study using Geant4

    NASA Astrophysics Data System (ADS)

    Hashikin, N. A. A.; Yeong, C. H.; Guatelli, S.; Abdullah, B. J. J.; Ng, K. H.; Malaroda, A.; Rosenfeld, A. B.; Perkins, A. C.

    2016-03-01

    90Y radioembolization is a palliative treatment for liver cancer. 90Y decays via beta emission, making imaging difficult due to the absence of gamma radiation. Since post-procedure imaging is crucial, several theranostic radionuclides have been explored as alternatives. However, exposure to gamma radiation throughout the treatment raises concern for the organs near the liver. A Geant4 Monte Carlo simulation using the MIRD Pamphlet 5 reference phantom was carried out. A spherical tumour with a 4.3 cm radius was modelled within the liver. 1.82 GBq of 90Y sources were isotropically distributed within the tumour, with no extrahepatic shunting. The simulation was repeated with 153Sm, 166Ho and 177Lu. The estimated tumour dose for all radionuclides was 262.9 Gy. A tumour dose equivalent to that from 1.82 GBq of 90Y can be achieved with 8.32, 5.83, and 4.44 GBq of 153Sm, 166Ho and 177Lu, respectively. Normal liver doses from the other radionuclides were lower than from 90Y, hence beneficial for normal tissue sparing. The organ doses from 153Sm and 177Lu were relatively higher due to their higher gamma energies, but were still well below 1 Gy. 166Ho, 177Lu and 153Sm offer useful gamma emissions for post-procedure imaging. They show potential as 90Y substitutes, delivering comparable tumour doses, lower normal liver doses and doses to other organs far below the tolerance limit.
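
    As a quick consistency check of the activities quoted above (an illustrative back-calculation, not taken from the paper), the activity of an alternative radionuclide needed to match the 90Y tumour dose scales inversely with its tumour dose per unit activity:

```latex
% A_x: required activity of radionuclide x; (D_T/A): tumour dose per unit activity.
A_x \;=\; A_{^{90}\mathrm{Y}}\;
          \frac{(D_T/A)_{^{90}\mathrm{Y}}}{(D_T/A)_x},
\qquad
(D_T/A)_{^{90}\mathrm{Y}} = \frac{262.9\ \mathrm{Gy}}{1.82\ \mathrm{GBq}}
                          \approx 144\ \mathrm{Gy\,GBq^{-1}},
\quad
(D_T/A)_{^{153}\mathrm{Sm}} \approx \frac{262.9\ \mathrm{Gy}}{8.32\ \mathrm{GBq}}
                            \approx 31.6\ \mathrm{Gy\,GBq^{-1}}
```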

  6. Measurement of depth-dose of linear accelerator and simulation by use of Geant4 computer code

    PubMed Central

    Sardari, D.; Maleki, R.; Samavat, H.; Esmaeeli, A.

    2010-01-01

    Radiation therapy is an established method of cancer treatment. New technologies in cancer radiotherapy need a more accurate computation of the dose delivered in the radiotherapy treatment plan. This study presents some results of a Geant4-based application for simulation of the absorbed dose distribution given by a medical linear accelerator (LINAC). The LINAC geometry is accurately described in the Monte Carlo code with use of the accelerator manufacturer's specifications. The capability of the software for evaluating the dose distribution has been verified by comparisons with measurements in a water phantom; the comparisons were performed for percentage depth dose (PDD) and profiles for various field sizes and depths, for a 6-MV electron beam. Experimental and calculated dose values were in good agreement both in PDD and in transverse sections of the water phantom. PMID:24376926

  7. Design of Cherenkov bars for the optical part of the time-of-flight detector in Geant4.

    PubMed

    Nozka, L; Brandt, A; Rijssenbeek, M; Sykora, T; Hoffman, T; Griffiths, J; Steffens, J; Hamal, P; Chytka, L; Hrabovsky, M

    2014-11-17

    We present the results of studies devoted to the development and optimization of the optical part of a high precision time-of-flight (TOF) detector for the Large Hadron Collider (LHC). This work was motivated by a proposal to use such a detector in conjunction with a silicon detector to tag and measure protons from interactions of the type p + p → p + X + p, where the two outgoing protons are scattered in the very forward directions. The fast timing detector uses fused silica (quartz) bars that emit Cherenkov radiation as a relativistic particle passes through and the emitted Cherenkov photons are detected by, for instance, a micro-channel plate multi-anode Photomultiplier Tube (MCP-PMT). Several possible designs are implemented in Geant4 and studied for timing optimization as a function of the arrival time, and the number of Cherenkov photons reaching the photo-sensor. PMID:25402137

  8. SU-E-T-531: Performance Evaluation of Multithreaded Geant4 for Proton Therapy Dose Calculations in a High Performance Computing Facility

    SciTech Connect

    Shin, J; Coss, D; McMurry, J; Farr, J; Faddegon, B

    2014-06-01

    Purpose: To evaluate the efficiency of multithreaded Geant4 (Geant4-MT, version 10.0) for proton Monte Carlo dose calculations using a high performance computing facility. Methods: Geant4-MT was used to calculate 3D dose distributions in 1×1×1 mm3 voxels in a water phantom and in a patient's head with a 150 MeV proton beam covering approximately 5×5 cm2 in the water phantom. Three timestamps were measured on the fly to separately analyze the required time for initialization (which cannot be parallelized), the processing time of individual threads, and the completion time. The scalability of the averaged processing time per thread was calculated as a function of thread number (1, 100, 150, and 200) for both 1 M and 50 M histories. The total memory usage was recorded. Results: Simulations with 50 M histories were fastest with 100 threads, taking approximately 1.3 hours and 6 hours for the water phantom and the CT data, respectively, with better than 1.0% statistical uncertainty. The calculations show 1/N scalability in the event loops for both cases. The gains from parallel calculation started to decrease with 150 threads. The memory usage increases linearly with the number of threads. No critical failures were observed during the simulations. Conclusion: Multithreading in Geant4-MT decreased simulation time in proton dose distribution calculations by a factor of 64 and 54 at a near-optimal 100 threads for the water phantom and the patient data, respectively. Further simulations will be done to determine the efficiency at the optimal thread number. Considering the trend of computer architecture development, utilizing Geant4-MT for radiotherapy simulations is an excellent cost-effective alternative to a distributed batch queuing system. However, because the scalability depends highly on simulation details, i.e., the ratio of the processing time of one event to the waiting time to access the shared event queue, a performance evaluation as described here is recommended.
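
    The essential Geant4-MT calls behind the thread-count scan above are shown in the sketch below; the user initialization classes are only indicated, and the thread count simply reflects the near-optimal value reported in the abstract:

```cpp
// Hedged sketch of a Geant4-MT (10.0) main program skeleton; illustrative only.
#include "G4MTRunManager.hh"

int main()
{
  auto* runManager = new G4MTRunManager;
  runManager->SetNumberOfThreads(100);   // near-optimal thread count reported above

  // The mandatory user classes would be registered here, e.g.:
  //   runManager->SetUserInitialization(new DetectorConstruction);
  //   runManager->SetUserInitialization(new PhysicsList);
  //   runManager->SetUserInitialization(new ActionInitialization);
  //   runManager->Initialize();
  //   runManager->BeamOn(50000000);     // 50 M histories, as in the study

  delete runManager;
  return 0;
}
```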

  9. Technical Note: Implementation of biological washout processes within GATE/GEANT4—A Monte Carlo study in the case of carbon therapy treatments

    SciTech Connect

    Martínez-Rovira, I. Jouvie, C.; Jan, S.

    2015-04-15

    Purpose: The imaging of positron-emitting isotopes produced during patient irradiation is the only in vivo method used for hadrontherapy dose monitoring in clinics nowadays. However, the accuracy of this method is limited by the loss of signal due to the metabolic decay processes (biological washout). In this work, a generic modeling of washout was incorporated into the GATE simulation platform. Additionally, the influence of the washout on the β+ activity distributions in terms of absolute quantification and spatial distribution was studied. Methods: First, the irradiation of a human head phantom with a 12C beam, so that a homogeneous dose distribution was achieved in the tumor, was simulated. The generated 11C and 15O distribution maps were used as β+ sources in a second simulation, where the PET scanner was modeled following a detailed Monte Carlo approach. The activity distributions obtained in the presence and absence of washout processes for several clinical situations were compared. Results: Results show that activity values are highly reduced (by a factor of 2) in the presence of washout. These processes have a significant influence on the shape of the PET distributions. Differences in the distal activity falloff position of 4 mm are observed for a tumor dose deposition of 1 Gy (T_ini = 0 min). However, in the case of high doses (3 Gy), the washout processes do not have a large effect on the position of the distal activity falloff (differences lower than 1 mm). The important role of the tumor washout parameters on the activity quantification was also evaluated. Conclusions: With this implementation, GATE/GEANT4 is the only open-source code able to simulate the full chain from the hadrontherapy irradiation to the PET dose monitoring including biological effects. Results show the strong impact of the washout processes, indicating that the development of better models and measurement of biological washout data are
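
    A widely used parameterisation of biological washout, given here as a hedged illustration of what a "generic modeling of washout" can look like (the exact functional form adopted in the GATE implementation is not spelled out in the abstract), multiplies the physical decay by fast, medium and slow biological clearance components:

```latex
% Generic multi-component washout model (illustrative; the clearance constants
% lambda_f, lambda_m, lambda_s and fractions M_f, M_m, M_s are tissue dependent,
% with M_f + M_m + M_s = 1):
A(t) \;=\; A_0\, e^{-\lambda_{\mathrm{phys}} t}
  \left( M_f\, e^{-\lambda_f t} + M_m\, e^{-\lambda_m t} + M_s\, e^{-\lambda_s t} \right)
```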

  10. Triple GEM detector sensitivity simulations with Geant4 for the CMS Forward Muon Upgrade at CERN LHC

    NASA Astrophysics Data System (ADS)

    Zenoni, Florian; CMS GEM Collaboration

    2015-04-01

    Triple Gas Electron Multiplier (GEM) detectors are being developed for the forward muon upgrade of the CMS experiment in Phase 2 of the CERN LHC. After the second long LHC shutdown, their implementation will take place for the GE1/1 system in the 1.5 < |η| < 2.2 region of the muon endcap. This upgrade aims at controlling muon level-1 trigger rates, thanks to their high performance in extreme particle rates (~MHz/cm2). Moreover, the GEM technology can improve the muon track reconstruction and identification capabilities of the forward detector. The Triple GEMs will work in a hostile radiation background (several hundreds of Hz/cm2) mostly made of photons, neutrons, electrons and positrons. To understand how this background could affect the detectors' functionality it is important to know the sensitivity to these kinds of radiation. The goal of this work is to estimate the sensitivity of Triple GEMs to background particles in the CMS cavern environment, thanks to the latest updates of GEANT4, a toolkit for the simulation of the passage of particles through matter.

  11. Simulation of the 6 MV Elekta Synergy Platform linac photon beam using Geant4 Application for Tomographic Emission.

    PubMed

    Didi, Samir; Moussa, Abdelilah; Yahya, Tayalati; Mustafa, Zerfaoui

    2015-01-01

    The present work validates the Geant4 Application for Tomographic Emission Monte Carlo software for the simulation of a 6 MV photon beam given by Elekta Synergy Platform medical linear accelerator treatment head. The simulation includes the major components of the linear accelerator (LINAC) with multi-leaf collimator and a homogeneous water phantom. Calculations were performed for the photon beam with several treatment field sizes ranging from 5 cm × 5 cm to 30 cm × 30 cm at 100 cm distance from the source. The simulation was successfully validated by comparison with experimental distributions. Good agreement between simulations and measurements was observed, with dose differences of about 0.02% and 2.5% for depth doses and lateral dose profiles, respectively. This agreement was also emphasized by the Kolmogorov-Smirnov goodness-of-fit test and by the gamma-index comparisons where more than 99% of the points for all simulations fulfill the quality assurance criteria of 2 mm/2%. PMID:26500399
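
    The 2 mm/2% pass rate quoted above refers to the standard gamma-index test; its generic form is recalled below for reference (this is the usual Low et al. definition, not a formula reproduced from the paper):

```latex
% A calculated point r_c passes if the minimum of Gamma over measured points
% r_m is <= 1, with distance and dose tolerances Delta d_M = 2 mm, Delta D_M = 2%:
\Gamma(\mathbf{r}_m,\mathbf{r}_c) \;=\;
  \sqrt{\frac{\lvert\mathbf{r}_m-\mathbf{r}_c\rvert^{2}}{\Delta d_M^{2}}
      + \frac{\bigl(D_c(\mathbf{r}_c)-D_m(\mathbf{r}_m)\bigr)^{2}}{\Delta D_M^{2}}}
```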

  12. Calibration and GEANT4 Simulations of the Phase II Proton Compute Tomography (pCT) Range Stack Detector

    SciTech Connect

    Uzunyan, S. A.; Blazey, G.; Boi, S.; Coutrakon, G.; Dyshkant, A.; Francis, K.; Hedin, D.; Johnson, E.; Kalnins, J.; Zutshi, V.; Ford, R.; Rauch, J. E.; Rubinov, P.; Sellberg, G.; Wilson, P.; Naimuddin, M.

    2015-12-29

    Northern Illinois University in collaboration with Fermi National Accelerator Laboratory (FNAL) and Delhi University has been designing and building a proton CT scanner for applications in proton treatment planning. The Phase II proton CT scanner consists of eight planes of tracking detectors with two X and two Y coordinate measurements both before and after the patient. In addition, a range stack detector consisting of a stack of thin scintillator tiles, arranged in twelve eight-tile frames, is used to determine the water equivalent path length (WEPL) of each track through the patient. The X-Y coordinates and WEPL are required input for image reconstruction software to find the relative (proton) stopping powers (RSP) value of each voxel in the patient and generate a corresponding 3D image. In this Note we describe tests conducted in 2015 at the proton beam at the Central DuPage Hospital in Warrenville, IL, focusing on the range stack calibration procedure and comparisons with the GEANT4 range stack simulation.

  13. GEANT4 simulations for in trap decay spectroscopy for electron capture branching ratio measurements using the TITAN facility

    NASA Astrophysics Data System (ADS)

    Seeraji, Shakil; Andreoiu, C.; Jang, F.; Ma, T.; Chaudhuri, A.; Grossheim, A.; Kwiatkowski, A. A.; Schultz, B. E.; Mane, E.; Gwinner, G.; Dilling, J.; Lennarz, A.; Frekers, D.; Chowdhury, U.; Simon, V. V.; Brunner, T.; Delheij, P.; Simon, M. C.

    2012-10-01

    The TITAN-EC project has developed a unique technique to measure the electron capture branching ratios (ECBRs) of short-lived intermediate nuclides involved in double beta decay. The ECBR information is important for the determination of the nuclear matrix elements of both two-neutrino double beta decay (2νββ) and neutrinoless double beta decay (0νββ) processes. An important feature of this technique is the use of an open-access Penning trap. Radioactive ions are stored in the trap and their decays are observed. Electrons produced from β decay are guided out of the trap by the Penning trap's strong magnetic field, and the X-rays from EC are detected by seven Si(Li) detectors placed radially around the trap through thin Be windows. This set-up provides a lower background for the X-ray detection compared to earlier ECBR measurements in which the beam was implanted in mylar tape. Detailed GEANT4 simulations have been performed to characterize the efficiency of the detectors and understand their response. In addition, the impact of different sizes and shapes of the ion cloud inside the trap has been investigated to optimize the experimental set-up.

  14. 3D polymer gel dosimetry and Geant4 Monte Carlo characterization of novel needle based X-ray source

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Sozontov, E.; Safronov, V.; Gutman, G.; Strumban, E.; Jiang, Q.; Li, S.

    2010-11-01

    In recent years, there have been a few attempts to develop low-energy x-ray radiation sources as alternatives to the conventional radioisotopes used in brachytherapy. So far, all efforts have been centered around the intent to design an interstitial miniaturized x-ray tube. Though direct irradiation of tumors looks very promising, the known insertable miniature x-ray tubes have many limitations: (a) difficulties with focusing and steering the electron beam to the target; (b) the necessity to cool the target to increase x-ray production efficiency; (c) the impracticability of reducing the diameter of the miniaturized x-ray tube below 4 mm (the requirement to decrease the diameter of the x-ray tube and the need to have a cooling system for the target are mutually exclusive); and (d) significant limitations in changing the shape and energy of the emitted radiation. The specific aim of this study is to demonstrate the feasibility of a new concept for an insertable low-energy needle x-ray device, based on simulation with the Geant4 Monte Carlo code, and to measure the dose rate distribution for low-energy (17.5 keV) x-ray radiation with 3D polymer gel dosimetry.

  15. Space radiation analysis: Radiation effects and particle interaction outside the Earth's magnetosphere using GRAS and GEANT4

    NASA Astrophysics Data System (ADS)

    Martinez, Lisandro M.; Kingston, Jennifer

    2012-03-01

    In order to explore the Moon and Mars it is necessary to investigate the hazards due to the space environment, especially ionizing radiation. Much information has been presented in previous papers on radiation analysis inside the Earth's magnetosphere, but much of this work is not directly relevant to the interplanetary medium. This work explores the effect of radiation on humans inside structures such as the ISS and provides a detailed analysis of galactic cosmic rays (GCRs) and solar proton events (SPEs) using SPENVIS (Space Environment Effects and Information System) and CREME96 data files for particle flux outside the Earth's magnetosphere. The simulation was conducted using GRAS, a European Space Agency (ESA) software package based on GEANT4. Dose and equivalent dose have been calculated, as well as secondary particle effects and the GCR energy spectrum. The calculated total dose and equivalent dose indicate the risk and effects that space radiation could have on the crew; these values were calculated for two different types of structure, the ISS and TransHab modules. The final results indicate the amounts of radiation expected to be absorbed by the astronauts during long-duration interplanetary flights; this underlines the importance of radiation shielding and the use of proper materials to reduce its effects.

  16. Simulation of Cherenkov photons emitted in photomultiplier windows induced by Compton diffusion using the Monte Carlo code GEANT4.

    PubMed

    Thiam, C; Bobin, C; Bouchard, J

    2010-01-01

    The implementation of the TDCR method (Triple to Double Coincidence Ratio) is based on a liquid scintillation system which comprises three photomultipliers; at LNHB, this counter can also be used in the beta-channel of a 4pi(LS)beta-gamma coincidence counting equipment. It is generally considered that the gamma-sensitivity of the liquid scintillation detector comes from the interaction of the gamma-photons in the scintillation cocktail but when introducing solid gamma-ray emitting sources instead of the scintillation vial, light emitted by the surrounding of the counter is observed. The explanation proposed in this article is that this effect comes from the emission of Cherenkov photons induced by Compton diffusion in the photomultiplier windows. In order to support this assertion, the creation and the propagation of Cherenkov photons inside the TDCR counter is simulated using the Monte Carlo code GEANT4. Stochastic calculations of double coincidences confirm the hypothesis of Cherenkov light produced in the photomultiplier windows. PMID:20031429

  17. Designing a new type of neutron detector for neutron and gamma-ray discrimination via GEANT4.

    PubMed

    Shan, Qing; Chu, Shengnan; Ling, Yongsheng; Cai, Pingkun; Jia, Wenbao

    2016-04-01

    Design of a new type of neutron detector, consisting of a fast neutron converter, plastic scintillator, and Cherenkov detector, to discriminate 14-MeV fast neutrons and gamma rays in a pulsed n-γ mixed field and monitor their neutron fluxes is reported in this study. Both neutrons and gamma rays can produce fluorescence in the scintillator when they are incident on the detector. However, only the secondary charged particles of the gamma rays can produce Cherenkov light in the Cherenkov detector. The neutron and gamma-ray fluxes can be calculated by measuring the fluorescence and Cherenkov light. The GEANT4 Monte Carlo simulation toolkit is used to simulate the whole process occurring in the detector, whose optimum parameters are known. Analysis of the simulation results leads to a calculation method of neutron flux. This method is verified by calculating the neutron fluxes using pulsed n-γ mixed fields with different n/γ ratios, and the results show that the relative errors of all calculations are <5%. PMID:26844541

  18. Simulation of the 6 MV Elekta Synergy Platform linac photon beam using Geant4 Application for Tomographic Emission

    PubMed Central

    Didi, Samir; Moussa, Abdelilah; Yahya, Tayalati; Mustafa, Zerfaoui

    2015-01-01

    The present work validates the Geant4 Application for Tomographic Emission Monte Carlo software for the simulation of a 6 MV photon beam given by Elekta Synergy Platform medical linear accelerator treatment head. The simulation includes the major components of the linear accelerator (LINAC) with multi-leaf collimator and a homogeneous water phantom. Calculations were performed for the photon beam with several treatment field sizes ranging from 5 cm × 5 cm to 30 cm × 30 cm at 100 cm distance from the source. The simulation was successfully validated by comparison with experimental distributions. Good agreement between simulations and measurements was observed, with dose differences of about 0.02% and 2.5% for depth doses and lateral dose profiles, respectively. This agreement was also emphasized by the Kolmogorov–Smirnov goodness-of-fit test and by the gamma-index comparisons where more than 99% of the points for all simulations fulfill the quality assurance criteria of 2 mm/2%. PMID:26500399

  19. Benchmarking the Geant4 full system simulation of an associated alpha-particle detector for use in a D-T neutron generator.

    PubMed

    Zhang, Xiaodong; Hayward, Jason P; Cates, Joshua W; Hausladen, Paul A; Laubach, Mitchell A; Sparger, Johnathan E; Donnald, Samuel B

    2012-08-01

    The position-sensitive alpha-particle detector used to provide the starting time and initial direction of D-T neutrons in a fast-neutron imaging system was simulated with a Geant4-based Monte Carlo program. The whole detector system, which consists of a YAP:Ce scintillator, a fiber-optic faceplate, a light guide, and a position-sensitive photo-multiplier tube (PSPMT), was modeled, starting with incident D-T alphas. The scintillation photons, whose starting time follows the distribution of a scintillation decay curve, were produced and emitted uniformly into a solid angle of 4π along the track segments of the alpha and its secondaries. Through tracking all photons and taking into account the quantum efficiency of the photocathode, the number of photoelectrons and their time and position distributions were obtained. Using a four-corner data reconstruction formula, the flood images of the alpha detector with and without optical grease between the YAP scintillator and the fiber-optic faceplate were obtained, which show agreement with the experimental results. The reconstructed position uncertainties of incident alpha particles for the two cases are 1.198 mm and 0.998 mm, respectively, across the sensitive area of the detector. Simulation results also show that, compared with other faceplates composed of 500 μm, 300 μm, and 100 μm fibers, the 10-μm-fiber faceplate is the best choice for building the detector with better position performance. In addition, the study of the background originating inside the D-T generator suggests that for 500-μm-thick YAP:Ce coated with 1-μm-thick aluminum, a very good signal-to-noise ratio can be expected through application of a simple threshold. PMID:22728838

  20. SU-E-T-519: Emission of Secondary Particles From a PMMA Phantom During Proton Irradiation: A Simulation Study with the Geant4 Monte Carlo Toolkit

    SciTech Connect

    Lau, A; Chen, Y; Ahmad, S

    2014-06-01

    Purpose: Proton therapy exhibits several advantages over photon therapy due to the depth-dose distributions from proton interactions within the target material. However, uncertainties associated with the proton beam range in the patient limit the advantage of proton therapy applications. To quantify beam range, positron-emitting nuclei (PEN) and prompt gamma (PG) techniques have been developed. These techniques use de-excitation photons to describe the location of the beam in the patient. To develop a detector system for implementing the PG technique for range verification applications in proton therapy, we studied the yields, energy and angular distributions of the secondary particles emitted from a PMMA phantom. Methods: Proton pencil beams of various energies incident onto a PMMA phantom with dimensions of 5 × 5 × 50 cm3 were used for simulation with the Geant4 toolkit, using the standard electromagnetic packages as well as the packages based on the binary-cascade nuclear model. The emitted secondary particles were analyzed. Results: For 160 MeV incident protons, the yields of secondary neutrons and photons per 100 incident protons were ~6 and ~15, respectively. The secondary photon energy spectrum showed several energy peaks in the range between 0 and 10 MeV. The energy peaks located between 4 and 6 MeV were attributed to direct proton interactions with 12C (~4.4 MeV) and 16O (~6 MeV), respectively. Most of the escaping secondary neutrons were found to have energies between 10 and 100 MeV. Isotropic emission was found for lower energy neutrons (<10 MeV) and for photons of all energies, while higher energy neutrons were emitted predominantly in the forward direction. The yields of emitted photons and neutrons increased with increasing incident proton energy. Conclusions: A detector system is currently being developed incorporating the yields, energy and angular distributions of secondary particles from proton interactions obtained from this study.

  1. Analysis of the track- and dose-averaged LET and LET spectra in proton therapy using the geant4 Monte Carlo code

    PubMed Central

    Guan, Fada; Peeler, Christopher; Bronk, Lawrence; Geng, Changran; Taleei, Reza; Randeniya, Sharmalee; Ge, Shuaiping; Mirkovic, Dragan; Grosshans, David; Mohan, Radhe; Titt, Uwe

    2015-01-01

    Purpose: The motivation of this study was to find and eliminate the cause of errors in dose-averaged linear energy transfer (LET) calculations from therapeutic protons in small targets, such as biological cell layers, calculated using the Geant4 Monte Carlo code. Furthermore, the purpose was also to provide a recommendation to select an appropriate LET quantity from Geant4 simulations to correlate with biological effectiveness of therapeutic protons. Methods: The authors developed a particle tracking step based strategy to calculate the average LET quantities (track-averaged LET, LETt and dose-averaged LET, LETd) using Geant4 for different tracking step size limits. A step size limit refers to the maximally allowable tracking step length. The authors investigated how the tracking step size limit influenced the calculated LETt and LETd of protons with six different step limits ranging from 1 to 500 μm in a water phantom irradiated by a 79.7-MeV clinical proton beam. In addition, the authors analyzed the detailed stochastic energy deposition information including fluence spectra and dose spectra of the energy-deposition-per-step of protons. As a reference, the authors also calculated the averaged LET and analyzed the LET spectra combining the Monte Carlo method and the deterministic method. Relative biological effectiveness (RBE) calculations were performed to illustrate the impact of different LET calculation methods on the RBE-weighted dose. Results: Simulation results showed that the step limit effect was small for LETt but significant for LETd. This resulted from differences in the energy-deposition-per-step between the fluence spectra and dose spectra at different depths in the phantom. Using the Monte Carlo particle tracking method in Geant4 can result in incorrect LETd calculation results in the dose plateau region for small step limits. The erroneous LETd results can be attributed to the algorithm to determine fluctuations in energy deposition along the
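
    The two step-based estimators discussed above are conventionally accumulated over tracking steps with energy deposition ε_i and step length l_i inside a scoring voxel; the standard definitions are recalled below (generic notation, not copied from the paper):

```latex
% Track-averaged and dose-averaged LET from per-step quantities:
\mathrm{LET}_t \;=\; \frac{\sum_i \varepsilon_i}{\sum_i l_i},
\qquad
\mathrm{LET}_d \;=\; \frac{\sum_i \varepsilon_i\,(\varepsilon_i/l_i)}{\sum_i \varepsilon_i}
```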

  2. Development and validation of RAYDOSE: a Geant4-based application for molecular radiotherapy

    NASA Astrophysics Data System (ADS)

    Marcatili, S.; Pettinato, C.; Daniels, S.; Lewis, G.; Edwards, P.; Fanti, S.; Spezi, E.

    2013-04-01

    We developed and validated a Monte-Carlo-based application (RAYDOSE) to generate patient-specific 3D dose maps on the basis of pre-treatment imaging studies. A CT DICOM image is used to model patient geometry, while repeated PET scans are employed to assess radionuclide kinetics and distribution at the voxel level. In this work, we describe the structure of this application and present the tests performed to validate it against reference data and experiments. We used the spheres of a NEMA phantom to calculate S values and total doses. The comparison with reference data from OLINDA/EXM showed an agreement within 2% for a sphere size above 2.8 cm diameter. A custom heterogeneous phantom composed of several layers of Perspex and lung equivalent material was used to compare TLD measurements of gamma radiation from 131I to Monte Carlo simulations. An agreement within 5% was found. RAYDOSE has been validated against reference data and experimental measurements and can be a useful multi-modality platform for treatment planning and research in MRT.
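
    The voxel-level MIRD scheme that underlies applications of this kind can be sketched as a convolution of the time-integrated (cumulated) activity map with a voxel S-value kernel. The kernel values, grid size and activity map below are placeholders, not RAYDOSE data.

    ```python
    # Sketch of voxel-level dosimetry: dose = cumulated activity (x) S-value kernel.
    import numpy as np
    from scipy.ndimage import convolve

    cumulated_activity = np.zeros((32, 32, 32))      # MBq*s per voxel, illustrative
    cumulated_activity[16, 16, 16] = 1.0e3

    s_kernel = np.zeros((5, 5, 5))                   # mGy/(MBq*s), illustrative S values
    s_kernel[2, 2, 2] = 3.0e-3                       # self-dose voxel
    s_kernel[2, 2, 1] = s_kernel[2, 2, 3] = 4.0e-4   # first neighbours along one axis

    dose_map = convolve(cumulated_activity, s_kernel, mode="constant")
    print(f"maximum voxel dose = {dose_map.max():.2f} mGy")
    ```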

  3. Dose distribution in water for monoenergetic photon point sources in the energy range of interest in brachytherapy: Monte Carlo simulations with PENELOPE and GEANT4

    NASA Astrophysics Data System (ADS)

    Almansa, Julio F.; Guerrero, Rafael; Al-Dweri, Feras M. O.; Anguiano, Marta; Lallena, Antonio M.

    2007-05-01

    Monte Carlo calculations using the codes PENELOPE and GEANT4 have been performed to characterize the dosimetric properties of monoenergetic photon point sources in water. The dose rate in water has been calculated for energies of interest in brachytherapy, ranging between 10 keV and 2 MeV. A comparison of the results obtained using the two codes with the available data calculated with other Monte Carlo codes is carried out. A χ2-like statistical test is proposed for these comparisons. PENELOPE and GEANT4 show a reasonable agreement for all energies analyzed and distances to the source larger than 1 cm. Significant differences are found at distances from the source up to 1 cm. A similar situation occurs between PENELOPE and EGS4.
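
    A generic chi-square-like comparison of two dose-rate curves with statistical uncertainties, in the spirit of the test proposed here, might look as follows; the dose values, uncertainties and the exact form of the authors' statistic are assumptions for illustration.

    ```python
    # Chi-square-like agreement test between two Monte Carlo dose-rate curves.
    import numpy as np

    d1 = np.array([10.0, 5.2, 2.6, 1.3])      # dose rate from code A, illustrative
    d2 = np.array([10.1, 5.1, 2.7, 1.2])      # dose rate from code B, illustrative
    s1 = 0.02 * d1                            # assumed 2% statistical uncertainties
    s2 = 0.02 * d2

    chi2_per_point = np.mean((d1 - d2) ** 2 / (s1 ** 2 + s2 ** 2))
    print(f"chi2 per point = {chi2_per_point:.2f}")   # values near 1 indicate agreement
    ```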

  4. Comparison of nanodosimetric parameters of track structure calculated by the Monte Carlo codes Geant4-DNA and PTra

    NASA Astrophysics Data System (ADS)

    Lazarakis, P.; Bug, M. U.; Gargioni, E.; Guatelli, S.; Rabus, H.; Rosenfeld, A. B.

    2012-03-01

    The concept of nanodosimetry is based on the assumption that initial damage to cells is related to the number of ionizations (the ionization cluster size) directly produced by single particles within, or in the close vicinity of, short segments of DNA. The ionization cluster-size distribution and other nanodosimetric quantities, however, are not directly measurable in biological targets and our current knowledge is mostly based on numerical simulations of particle tracks in water, calculating track structure parameters for nanometric target volumes. The assessment of nanodosimetric quantities derived from particle-track calculations using different Monte Carlo codes plays, therefore, an important role for a more accurate evaluation of the initial damage to cells and, as a consequence, of the biological effectiveness of ionizing radiation. The aim of this work is to assess the differences in the calculated nanodosimetric quantities obtained with Geant4-DNA as compared to those of the ad hoc particle-track Monte Carlo code ‘PTra’ developed at Physikalisch-Technische Bundesanstalt (PTB), Germany. The comparison of the two codes was made for incident electrons of energy in the range between 50 eV and 10 keV, for protons of energy between 300 keV and 10 MeV, and for alpha particles of energy between 1 and 10 MeV as these were the energy ranges available in both codes at the time this investigation was carried out. Good agreement was found for nanodosimetric characteristics of track structure calculated in the high-energy range of each particle type. For lower energies, significant differences were observed, most notably in the estimates of the biological effectiveness. The largest relative differences obtained were over 50%; however, generally the order of magnitude was between 10% and 20%.
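
    The nanodosimetric summary quantities referred to above (the ionization cluster-size distribution, its mean, and cumulative probabilities such as F2) can be computed from per-target ionization counts as in the sketch below; the cluster sizes are invented for illustration.

    ```python
    # Sketch of ionization cluster-size statistics from per-target counts.
    import numpy as np

    cluster_sizes = np.array([0, 1, 2, 0, 3, 1, 0, 2, 5, 1])   # ionizations per target volume

    values, counts = np.unique(cluster_sizes, return_counts=True)
    p = counts / counts.sum()                                   # P(nu = k)

    m1 = np.sum(values * p)                                     # mean cluster size
    f2 = p[values >= 2].sum()                                   # P(nu >= 2), often linked to complex damage

    print(f"M1 = {m1:.2f}, F2 = {f2:.2f}")
    ```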

  5. Comparison of nanodosimetric parameters of track structure calculated by the Monte Carlo codes Geant4-DNA and PTra.

    PubMed

    Lazarakis, P; Bug, M U; Gargioni, E; Guatelli, S; Rabus, H; Rosenfeld, A B

    2012-03-01

    The concept of nanodosimetry is based on the assumption that initial damage to cells is related to the number of ionizations (the ionization cluster size) directly produced by single particles within, or in the close vicinity of, short segments of DNA. The ionization cluster-size distribution and other nanodosimetric quantities, however, are not directly measurable in biological targets and our current knowledge is mostly based on numerical simulations of particle tracks in water, calculating track structure parameters for nanometric target volumes. The assessment of nanodosimetric quantities derived from particle-track calculations using different Monte Carlo codes plays, therefore, an important role for a more accurate evaluation of the initial damage to cells and, as a consequence, of the biological effectiveness of ionizing radiation. The aim of this work is to assess the differences in the calculated nanodosimetric quantities obtained with Geant4-DNA as compared to those of the ad hoc particle-track Monte Carlo code 'PTra' developed at Physikalisch-Technische Bundesanstalt (PTB), Germany. The comparison of the two codes was made for incident electrons of energy in the range between 50 eV and 10 keV, for protons of energy between 300 keV and 10 MeV, and for alpha particles of energy between 1 and 10 MeV as these were the energy ranges available in both codes at the time this investigation was carried out. Good agreement was found for nanodosimetric characteristics of track structure calculated in the high-energy range of each particle type. For lower energies, significant differences were observed, most notably in the estimates of the biological effectiveness. The largest relative differences obtained were over 50%; however, generally the order of magnitude was between 10% and 20%. PMID:22330641

  6. Use of the GEANT4 Monte Carlo to determine three-dimensional dose factors for radionuclide dosimetry

    NASA Astrophysics Data System (ADS)

    Amato, Ernesto; Italiano, Antonio; Minutoli, Fabio; Baldari, Sergio

    2013-04-01

    Voxel-level dosimetry is the simplest and most common approach to the internal dosimetry of nonuniform distributions of activity within the human body. The aim of this work was to obtain the dose "S" factors (mGy/(MBq s)) at the voxel level for eight beta and beta-gamma emitting radionuclides commonly used in nuclear medicine diagnostic and therapeutic procedures. We developed a Monte Carlo simulation in GEANT4 of a region of soft tissue, as defined by the ICRP, divided into 11×11×11 cubic voxels, 3 mm on a side. The simulation used the parameterizations of the electromagnetic interaction optimized for low energy (EEDL, EPDL). The decay of each radionuclide (32P, 90Y, 99mTc, 177Lu, 131I, 153Sm, 186Re, 188Re) was simulated as homogeneously distributed within the central voxel (0,0,0), and the energy deposited in the surrounding voxels was averaged over the 8 octants of three-dimensional space, for reasons of symmetry. The results obtained were compared with those available in the literature. While the iodine deviations remain within 16%, for phosphorus, a pure beta emitter, the agreement is very good for the self-dose voxel (0,0,0) and good for the dose to first neighbors, whereas differences ranging from -60% to +100% are observed for voxels far from the source. The existence of significant percentage differences in the calculated voxel S factors, especially for pure beta emitters such as 32P or 90Y, has already been highlighted by other authors. These data can usefully extend the voxel-based dosimetric approach to other radionuclides not covered in the available literature.
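
    The octant averaging and the conversion from a tallied energy grid to a voxel S value described above can be sketched as follows; the energy grid, number of decays and voxel mass are assumed placeholder values, not the published results.

    ```python
    # Sketch of octant symmetry averaging and S-value conversion on an 11x11x11 grid.
    import numpy as np

    rng = np.random.default_rng(0)
    edep = rng.random((11, 11, 11))           # MeV tallied per voxel, illustrative
    n_decays = 1.0e7                          # assumed number of simulated decays

    # fold the tally onto its 8 symmetric octants about the central voxel (index 5)
    sym = np.zeros_like(edep)
    for fx in (1, -1):
        for fy in (1, -1):
            for fz in (1, -1):
                sym += edep[::fx, ::fy, ::fz]
    sym /= 8.0

    # convert the self-dose voxel to an S value in mGy/(MBq s), assuming a 3 mm
    # soft-tissue voxel of ~1 g/cm^3 (mass 2.7e-5 kg); 1 MBq s = 1e6 decays
    MeV_to_J = 1.602e-13
    voxel_mass_kg = 2.7e-5
    s_self = (sym[5, 5, 5] / n_decays) * MeV_to_J / voxel_mass_kg * 1.0e6 * 1.0e3
    print(f"S(0,0,0) = {s_self:.3e} mGy/(MBq s)")
    ```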

  7. Scatter in an uncollimated x-ray CT machine based on a Geant4 Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Wadeson, Nicola; Morton, Edward; Lionheart, William

    2010-04-01

    A high-speed motionless-gantry x-ray CT machine has been designed to allow 3D images to be collected in real time. By using multiple, switched x-ray sources and fixed detector rings, the time-consuming mechanical rotation of conventional CT machines can be removed. However, the nature of this design limits the possibility of detector collimation, since each detector must now be able to record the energy of x-ray beams from a number of different directions. The lack of collimation has implications for the reconstructed image due to an increase in the number of scattered photons recorded. A Monte Carlo computer simulation of the x-ray machine has been developed, using the Geant4 software toolkit, to analyse the behaviour of both Rayleigh and Compton scattered photons when considering airport baggage and medical applications. Four different scattering objects were analysed based on 50 kVp, 100 kVp and 150 kVp spectra for a tungsten target. Two suitcase objects, a body phantom and a brain phantom were chosen as objects typical of airport baggage and medical CT. The results indicate that the level of scatter is negligible for a typical airport baggage application, since the majority of space in a suitcase consists of clothing, which has a low density. Scatter contributes less than 1% of the image in all instances. However, due to the large amount of water found in the human body, the level of scatter in the medical cases is significantly higher, reaching 37% when the body phantom is analysed at 50 kVp.

  8. Comparison of PHITS, GEANT4, and HIBRAC simulations of depth-dependent yields of β+-emitting nuclei during therapeutic particle irradiation to measured data

    NASA Astrophysics Data System (ADS)

    Rohling, Heide; Sihver, Lembit; Priegnitz, Marlen; Enghardt, Wolfgang; Fiedler, Fine

    2013-09-01

    For quality assurance in particle therapy, a non-invasive, in vivo range verification is highly desired. Particle therapy positron-emission-tomography (PT-PET) is, up to now, the only clinically proven method for this purpose. It makes use of the β+-activity produced during the irradiation by the nuclear fragmentation processes between the therapeutic beam and the irradiated tissue. Since a direct comparison of β+-activity and dose is not feasible, a simulation of the expected β+-activity distribution is required. For this reason it is essential to have a quantitatively reliable code for the simulation of the yields of the β+-emitting nuclei at every position of the beam path. In this paper, results of the three-dimensional Monte-Carlo simulation codes PHITS and GEANT4 and of the one-dimensional deterministic simulation code HIBRAC are compared to measurements of the yields of the most abundant β+-emitting nuclei for carbon, lithium, helium, and proton beams. In general, PHITS underestimates the yields of positron-emitters. With GEANT4 the overall most accurate results are obtained. HIBRAC and GEANT4 provide comparable results for carbon and proton beams. HIBRAC is considered a good candidate for implementation into routine clinical PT-PET.

  9. Development of a randomized 3D cell model for Monte Carlo microdosimetry simulations

    SciTech Connect

    Douglass, Michael; Bezak, Eva; Penfold, Scott

    2012-06-15

    Purpose: The objective of the current work was to develop an algorithm for growing a macroscopic tumor volume from individual randomized quasi-realistic cells. The major physical and chemical components of the cell need to be modeled. It is intended to import the tumor volume into GEANT4 (and potentially other Monte Carlo packages) to simulate ionization events within the cell regions. Methods: A MATLAB© code was developed to produce a tumor coordinate system consisting of individual ellipsoidal cells randomized in their spatial coordinates, sizes, and rotations. An eigenvalue method using a mathematical equation to represent individual cells was used to detect overlapping cells. GEANT4 code was then developed to import the coordinate system into GEANT4 and populate it with individual cells of varying sizes and composed of the membrane, cytoplasm, reticulum, nucleus, and nucleolus. Each region is composed of chemically realistic materials. Results: The in-house developed MATLAB© code was able to grow semi-realistic cell distributions (~2 × 10^8 cells in 1 cm^3) in under 36 h. The cell distribution can be used in any number of Monte Carlo particle tracking toolkits including GEANT4, which has been demonstrated in this work. Conclusions: Using the cell distribution and GEANT4, the authors were able to simulate ionization events in the individual cell components resulting from 80 keV gamma radiation (the code is applicable to other particles and a wide range of energies). This virtual microdosimetry tool will allow for a more complete picture of cell damage to be developed.
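
    A deliberately simplified stand-in for the cell-placement step is sketched below: spheres with random radii are placed at random positions and rejected if they overlap an existing cell. The authors use ellipsoids with an eigenvalue-based overlap test; the sphere check and all numbers here are illustrative simplifications.

    ```python
    # Simplified random sequential placement of non-overlapping spherical "cells".
    import random

    random.seed(1)
    cells = []                                   # list of (x, y, z, r) in micrometres
    box, n_target = 200.0, 300                   # illustrative box size and cell count

    while len(cells) < n_target:
        x, y, z = (random.uniform(0.0, box) for _ in range(3))
        r = random.uniform(5.0, 10.0)
        # keep the candidate only if it does not overlap any existing sphere
        if all((x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2 >= (r + cr) ** 2
               for cx, cy, cz, cr in cells):
            cells.append((x, y, z, r))

    print(len(cells), "non-overlapping cells placed")
    ```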

  10. The investigation of prostatic calcifications using μ-PIXE analysis and their dosimetric effect in low dose rate brachytherapy treatments using Geant4

    NASA Astrophysics Data System (ADS)

    Pope, D. J.; Cutajar, D. L.; George, S. P.; Guatelli, S.; Bucci, J. A.; Enari, K. E.; Miller, S.; Siegele, R.; Rosenfeld, A. B.

    2015-06-01

    Low dose rate brachytherapy is a widely used modality for the treatment of prostate cancer. Most clinical treatment planning systems currently in use approximate all tissue to water, neglecting the existence of inhomogeneities, such as calcifications. The presence of prostatic calcifications may perturb the dose due to the higher photoelectric effect cross section in comparison to water. This study quantitatively evaluates the effect of prostatic calcifications on the dosimetric outcome of brachytherapy treatments, and its potential clinical consequences, by means of Monte Carlo simulations. Four pathological calcification samples were characterised with micro-particle induced x-ray emission (μ-PIXE) to determine their heavy elemental composition. Calcium, phosphorus and zinc were found to be the predominant heavy elements in the calcification composition. Four clinical patient brachytherapy treatments were modelled using Geant4 based Monte Carlo simulations, in terms of the distribution of brachytherapy seeds and calcifications in the prostate. Dose reductions of up to 30% were observed locally at the calcification boundary, depending on calcification size. Single large calcifications and closely placed calculi caused local dose reductions of between 30-60%. Individual calculi smaller than 0.5 mm in diameter showed minimal dosimetric impact; however, the effects of small or diffuse calcifications within the prostatic tissue could not be determined using the methods employed in the study. The simulation study showed a varying reduction in common dosimetric parameters. D90 showed a reduction of 2-5%, regardless of calcification surface area and volume. The parameters V100, V150 and V200 were also reduced by as much as 3% and on average by 1%. These reductions were also found to relate to the surface area and volume of calcifications, which may have a significant dosimetric impact on brachytherapy treatment; however, such impacts depend strongly on specific factors
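
    The dose-volume metrics quoted above (D90 and V100/V150/V200) can be computed from a per-voxel dose array as in the sketch below; the dose distribution and prescription value are invented, not the study's data.

    ```python
    # Sketch of D90 and Vx metrics from a flat per-voxel dose array.
    import numpy as np

    doses = np.random.default_rng(2).normal(160.0, 25.0, 10_000)   # Gy, illustrative
    prescription = 145.0                                            # assumed prescribed dose (Gy)

    d90 = np.percentile(doses, 10.0)            # minimum dose to the hottest 90% of the volume
    v = {f"V{p}": 100.0 * np.mean(doses >= p / 100.0 * prescription) for p in (100, 150, 200)}

    print(f"D90 = {d90:.1f} Gy, " + ", ".join(f"{k} = {val:.1f}%" for k, val in v.items()))
    ```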

  11. The investigation of prostatic calcifications using μ-PIXE analysis and their dosimetric effect in low dose rate brachytherapy treatments using Geant4.

    PubMed

    Pope, D J; Cutajar, D L; George, S P; Guatelli, S; Bucci, J A; Enari, K E; Miller, S; Siegele, R; Rosenfeld, A B

    2015-06-01

    Low dose rate brachytherapy is a widely used modality for the treatment of prostate cancer. Most clinical treatment planning systems currently in use approximate all tissue to water, neglecting the existence of inhomogeneities, such as calcifications. The presence of prostatic calcifications may perturb the dose due to the higher photoelectric effect cross section in comparison to water. This study quantitatively evaluates the effect of prostatic calcifications on the dosimetric outcome of brachytherapy treatments, and its potential clinical consequences, by means of Monte Carlo simulations. Four pathological calcification samples were characterised with micro-particle induced x-ray emission (μ-PIXE) to determine their heavy elemental composition. Calcium, phosphorus and zinc were found to be the predominant heavy elements in the calcification composition. Four clinical patient brachytherapy treatments were modelled using Geant4 based Monte Carlo simulations, in terms of the distribution of brachytherapy seeds and calcifications in the prostate. Dose reductions of up to 30% were observed locally at the calcification boundary, depending on calcification size. Single large calcifications and closely placed calculi caused local dose reductions of between 30-60%. Individual calculi smaller than 0.5 mm in diameter showed minimal dosimetric impact; however, the effects of small or diffuse calcifications within the prostatic tissue could not be determined using the methods employed in the study. The simulation study showed a varying reduction in common dosimetric parameters. D90 showed a reduction of 2-5%, regardless of calcification surface area and volume. The parameters V100, V150 and V200 were also reduced by as much as 3% and on average by 1%. These reductions were also found to relate to the surface area and volume of calcifications, which may have a significant dosimetric impact on brachytherapy treatment; however, such impacts depend strongly on specific factors

  12. Analysis of the track- and dose-averaged LET and LET spectra in proton therapy using the GEANT4 Monte Carlo code

    SciTech Connect

    Guan, Fada; Peeler, Christopher; Taleei, Reza; Randeniya, Sharmalee; Ge, Shuaiping; Mirkovic, Dragan; Mohan, Radhe; Titt, Uwe; Bronk, Lawrence; Geng, Changran; Grosshans, David

    2015-11-15

    Purpose: The motivation of this study was to find and eliminate the cause of errors in dose-averaged linear energy transfer (LET) calculations from therapeutic protons in small targets, such as biological cell layers, calculated using the GEANT4 Monte Carlo code. Furthermore, the purpose was also to provide a recommendation to select an appropriate LET quantity from GEANT4 simulations to correlate with biological effectiveness of therapeutic protons. Methods: The authors developed a particle tracking step based strategy to calculate the average LET quantities (track-averaged LET, LETt, and dose-averaged LET, LETd) using GEANT4 for different tracking step size limits. A step size limit refers to the maximally allowable tracking step length. The authors investigated how the tracking step size limit influenced the calculated LETt and LETd of protons with six different step limits ranging from 1 to 500 μm in a water phantom irradiated by a 79.7-MeV clinical proton beam. In addition, the authors analyzed the detailed stochastic energy deposition information including fluence spectra and dose spectra of the energy-deposition-per-step of protons. As a reference, the authors also calculated the averaged LET and analyzed the LET spectra combining the Monte Carlo method and the deterministic method. Relative biological effectiveness (RBE) calculations were performed to illustrate the impact of different LET calculation methods on the RBE-weighted dose. Results: Simulation results showed that the step limit effect was small for LETt but significant for LETd. This resulted from differences in the energy-deposition-per-step between the fluence spectra and dose spectra at different depths in the phantom. Using the Monte Carlo particle tracking method in GEANT4 can result in incorrect LETd calculation results in the dose plateau region for small step limits. The erroneous LETd results can be attributed to the algorithm to

  13. A GEANT4 Monte Carlo simulation to describe the time response of a coupled SiPM and LYSO detection system

    NASA Astrophysics Data System (ADS)

    Leming, E.; De Santo, A.; Salvatore, F.; Camanzi, B.; Lohstroh, A.

    2014-06-01

    In recent years the silicon photomultiplier has been investigated as an alternative to the traditional photomultiplier tube in a range of applications, including Time-of-Flight Positron Emission Tomography (TOF-PET). In this paper we discuss a GEANT4 simulation framework, which has been developed to drive the design of a scalable TOF-PET apparatus to be built at the Rutherford Appleton Laboratory, UK. First results presented in this paper simulate the response of a Hamamatsu Multi-Pixel Photon Counter (S10362-33-050c) coupled to LYSO scintillating crystals, with a focus on the timing response of coincidence signals.

  14. The Cryogenic AntiCoincidence Detector for the ATHENA X-IFU: Design Aspects by Geant4 Simulation and Preliminary Characterization of the New Single Pixel

    NASA Astrophysics Data System (ADS)

    Macculi, C.; Argan, A.; D'Andrea, M.; Lotti, S.; Piro, L.; Biasotti, M.; Corsini, D.; Gatti, F.; Orlando, A.; Torrioli, G.

    2016-01-01

    The ATHENA observatory is the second large-class ESA mission, in the context of the Cosmic Vision 2015-2025, scheduled to be launched in 2028 into an L2 orbit. One of the two planned focal plane instruments is the X-ray Integral Field Unit (X-IFU), which will be able to perform simultaneous high-grade energy spectroscopy and imaging over the 5 arcmin FoV by means of a kilo-pixel array of transition-edge sensor (TES) microcalorimeters, coupled to high-quality X-ray optics. The X-IFU sensitivity is degraded by the particle background, induced by primary protons of both solar and cosmic-ray origin and by secondary electrons. A Cryogenic AntiCoincidence (CryoAC) TES-based detector, located < 1 mm below the TES array, will allow the mission to reach the background level that enables its scientific goals. The CryoAC is a 4-pixel detector made of silicon absorbers sensed by iridium TESs. We currently achieve a TRL = 3-4 at the single-pixel level. We have designed and developed two further prototypes in order to reach TRL = 4. The design of the CryoAC has also been optimized using the Geant4 simulation tool. Here we describe some results from the Geant4 simulations performed to optimize the design, and preliminary test results from the first of the two detectors, of 1 cm2 area, made of 65 Ir TESs.

  15. The Cryogenic AntiCoincidence Detector for the ATHENA X-IFU: Design Aspects by Geant4 Simulation and Preliminary Characterization of the New Single Pixel

    NASA Astrophysics Data System (ADS)

    Macculi, C.; Argan, A.; D'Andrea, M.; Lotti, S.; Piro, L.; Biasotti, M.; Corsini, D.; Gatti, F.; Orlando, A.; Torrioli, G.

    2016-08-01

    The ATHENA observatory is the second large-class ESA mission, in the context of the Cosmic Vision 2015-2025, scheduled to be launched in 2028 into an L2 orbit. One of the two planned focal plane instruments is the X-ray Integral Field Unit (X-IFU), which will be able to perform simultaneous high-grade energy spectroscopy and imaging over the 5 arcmin FoV by means of a kilo-pixel array of transition-edge sensor (TES) microcalorimeters, coupled to high-quality X-ray optics. The X-IFU sensitivity is degraded by the particle background, induced by primary protons of both solar and cosmic-ray origin and by secondary electrons. A Cryogenic AntiCoincidence (CryoAC) TES-based detector, located <1 mm below the TES array, will allow the mission to reach the background level that enables its scientific goals. The CryoAC is a 4-pixel detector made of silicon absorbers sensed by iridium TESs. We currently achieve a TRL = 3-4 at the single-pixel level. We have designed and developed two further prototypes in order to reach TRL = 4. The design of the CryoAC has also been optimized using the Geant4 simulation tool. Here we describe some results from the Geant4 simulations performed to optimize the design, and preliminary test results from the first of the two detectors, of 1 cm2 area, made of 65 Ir TESs.

  16. Influence of the geometrical detail in the description of DNA and the scoring method of ionization clustering on nanodosimetric parameters of track structure: a Monte Carlo study using Geant4-DNA

    NASA Astrophysics Data System (ADS)

    Bueno, M.; Schulte, R.; Meylan, S.; Villagrasa, C.

    2015-11-01

    The aim of this study was to evaluate the influence of the geometrical detail of the DNA on nanodosimetric parameters of track structure induced by protons and alpha particles of different energies (LET values ranging from 1 to 162.5 keV μm-1) as calculated by Geant4-DNA Monte Carlo simulations. The first geometry considered consisted of a well-structured placement of a realistic description of the DNA double helix wrapped around cylindrical histones (GeomHist), forming an 18 kbp-long chromatin fiber. In the second geometry considered, the DNA was modeled as a total of 1800 ten bp-long homogeneous cylinders (2.3 nm diameter and 3.4 nm height) placed in random positions and orientations (GeomCyl). As for GeomHist, GeomCyl contained a DNA material equivalent to 18 kbp. Geant4-DNA track structure simulations were performed and ionizations were counted in the scoring volumes. For GeomCyl, clusters were defined as the number of ionizations (ν) scored in each 10 bp-long cylinder. For GeomHist, clusters of ionizations scored in the sugar-phosphate groups of the double helix were revealed by the DBSCAN clustering algorithm according to a proximity criterion among ionizations separated by less than 10 bp. The topology of the ionization clusters formed using the GeomHist and GeomCyl geometries was compared in terms of biologically relevant nanodosimetric quantities. The discontinuous modeling of the DNA for GeomCyl led to smaller cluster sizes than for GeomHist. The continuous modeling of the DNA molecule for GeomHist allowed the merging of ionization points by the DBSCAN algorithm, giving rise to larger clusters which were not detectable within the GeomCyl geometry. The mean cluster size (m1) was found to be of the order of 10% higher for GeomHist compared to GeomCyl for LET < 15 keV μm-1. For higher LETs, the difference increased with LET similarly for protons and alpha particles. Both geometries showed the same relationship
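
    The proximity-based clustering step can be illustrated with a simple one-dimensional single-linkage pass, used here as a stand-in for the DBSCAN algorithm cited in the paper; the ionization positions (in base pairs along the DNA) are invented.

    ```python
    # Group backbone ionizations separated by fewer than 10 bp into clusters.
    positions_bp = sorted([12, 14, 19, 250, 256, 258, 1200])   # ionization positions (bp), illustrative

    clusters, current = [], [positions_bp[0]]
    for pos in positions_bp[1:]:
        if pos - current[-1] < 10:          # proximity criterion: closer than 10 bp
            current.append(pos)
        else:
            clusters.append(current)
            current = [pos]
    clusters.append(current)

    sizes = [len(c) for c in clusters]
    print("cluster sizes:", sizes, "mean cluster size:", sum(sizes) / len(sizes))
    ```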

  17. Influence of the geometrical detail in the description of DNA and the scoring method of ionization clustering on nanodosimetric parameters of track structure: a Monte Carlo study using Geant4-DNA.

    PubMed

    Bueno, M; Schulte, R; Meylan, S; Villagrasa, C

    2015-11-01

    The aim of this study was to evaluate the influence of the geometrical detail of the DNA on nanodosimetric parameters of track structure induced by protons and alpha particles of different energies (LET values ranging from 1 to 162.5 keV µm-1) as calculated by Geant4-DNA Monte Carlo simulations. The first geometry considered consisted of a well-structured placement of a realistic description of the DNA double helix wrapped around cylindrical histones (GeomHist), forming an 18 kbp-long chromatin fiber. In the second geometry considered, the DNA was modeled as a total of 1800 ten bp-long homogeneous cylinders (2.3 nm diameter and 3.4 nm height) placed in random positions and orientations (GeomCyl). As for GeomHist, GeomCyl contained a DNA material equivalent to 18 kbp. Geant4-DNA track structure simulations were performed and ionizations were counted in the scoring volumes. For GeomCyl, clusters were defined as the number of ionizations (ν) scored in each 10 bp-long cylinder. For GeomHist, clusters of ionizations scored in the sugar-phosphate groups of the double helix were revealed by the DBSCAN clustering algorithm according to a proximity criterion among ionizations separated by less than 10 bp. The topology of the ionization clusters formed using the GeomHist and GeomCyl geometries was compared in terms of biologically relevant nanodosimetric quantities. The discontinuous modeling of the DNA for GeomCyl led to smaller cluster sizes than for GeomHist. The continuous modeling of the DNA molecule for GeomHist allowed the merging of ionization points by the DBSCAN algorithm, giving rise to larger clusters which were not detectable within the GeomCyl geometry. The mean cluster size (m1) was found to be of the order of 10% higher for GeomHist compared to GeomCyl for LET < 15 keV µm-1. For higher LETs, the difference increased with LET similarly for protons and alpha particles. Both geometries showed the same relationship between m1 and the cumulative relative frequency of

  18. Beyond Standard Model Physics

    SciTech Connect

    Bellantoni, L.

    2009-11-01

    There are many recent results from searches for fundamental new physics using the TeVatron, the SLAC b-factory and HERA. This talk quickly reviewed searches for pair-produced stop, for gauge-mediated SUSY breaking, for Higgs bosons in the MSSM and NMSSM models, for leptoquarks, and v-hadrons. There is a SUSY model which accommodates the recent astrophysical experimental results that suggest that dark matter annihilation is occurring in the center of our galaxy, and a relevant experimental result. Finally, model-independent searches at D0, CDF, and H1 are discussed.

  19. Geant4 Monte Carlo simulation of absorbed dose and radiolysis yields enhancement from a gold nanoparticle under MeV proton irradiation

    NASA Astrophysics Data System (ADS)

    Tran, H. N.; Karamitros, M.; Ivanchenko, V. N.; Guatelli, S.; McKinnon, S.; Murakami, K.; Sasaki, T.; Okada, S.; Bordage, M. C.; Francis, Z.; El Bitar, Z.; Bernal, M. A.; Shin, J. I.; Lee, S. B.; Barberet, Ph.; Tran, T. T.; Brown, J. M. C.; Nhan Hao, T. V.; Incerti, S.

    2016-04-01

    Gold nanoparticles have been reported as a possible radio-sensitizer agent in radiation therapy due to their ability to increase energy deposition and subsequent direct damage to cells and DNA within their local vicinity. Moreover, this increase in energy deposition also results in an increase of the radiochemical yields. In this work we present, for the first time, an in silico investigation, based on the general purpose Monte Carlo simulation toolkit Geant4, into energy deposition and radical species production around a spherical gold nanoparticle 50 nm in diameter via proton irradiation. Simulations were performed for incident proton energies ranging from 2 to 170 MeV, which are of interest for clinical proton therapy.

  20. Simulation, optimization and testing of a novel high spatial resolution X-ray imager based on Zinc Oxide nanowires in Anodic Aluminium Oxide membrane using Geant4

    NASA Astrophysics Data System (ADS)

    Esfandi, F.; Saramad, S.

    2015-07-01

    In this work, a new generation of scintillator-based X-ray imagers, based on ZnO nanowires in an Anodized Aluminum Oxide (AAO) nanoporous template, is characterized. The optical response of ordered ZnO nanowire arrays in a porous AAO template under low-energy X-ray illumination is simulated with the Geant4 Monte Carlo code and compared with experimental results. The results show that for 10 keV X-ray photons, by considering the light-guiding properties of zinc oxide inside the AAO template and a suitable selection of detector thickness and pore diameter, a spatial resolution of less than one micrometer and a detection efficiency of 66% are achievable. This novel nano-scintillator detector can have many advantages for medical applications in the future.

  1. A polygon-surface reference Korean male phantom (PSRK-Man) and its direct implementation in Geant4 Monte Carlo simulation.

    PubMed

    Kim, Chan Hyeong; Jeong, Jong Hwi; Bolch, Wesley E; Cho, Kun-Woo; Hwang, Sung Bae

    2011-05-21

    Even though the hybrid phantom embodies both the anatomic reality of voxel phantoms and the deformability of stylized phantoms, it must be voxelized to be used in a Monte Carlo code for dose calculation or imaging simulation, which incurs the inherent limitations of voxel phantoms. In the present study, a voxel phantom named VKH-Man (Visible Korean Human-Man) was converted to a polygon-surface phantom (PSRK-Man, Polygon-Surface Reference Korean-Man), which was then adjusted to the Reference Korean data. Subsequently, the PSRK-Man polygon phantom was directly implemented, without any voxelization process, in the Geant4 Monte Carlo code for dose calculations. The calculated dose values and computation time were then compared with those of HDRK-Man (High Definition Reference Korean-Man), a corresponding voxel phantom adjusted to the same Reference Korean data from the same VKH-Man voxel phantom. Our results showed that the calculated dose values of the PSRK-Man surface phantom agreed well with those of the HDRK-Man voxel phantom. The calculation speed for the PSRK-Man polygon phantom, though, was 70-150 times slower than that of the HDRK-Man voxel phantom; that speed, however, could be acceptable in some applications, in that direct use of the surface phantom PSRK-Man in Geant4 does not require a separate voxelization process. Computing speed can be enhanced, in the future, either by optimizing the Monte Carlo transport kernel for polygon surfaces or by using modern computing technologies such as grid computing and general-purpose computing on graphics processing units. PMID:21521906

  2. Ionospheric irregularity physics modelling

    SciTech Connect

    Ossakow, S.L.; Keskinen, M.J.; Zalesak, S.T.

    1982-01-01

    Theoretical and numerical simulation techniques have been employed to study ionospheric F region plasma cloud striation phenomena, equatorial spread F phenomena, and high latitude diffuse auroral F region irregularity phenomena. Each of these phenomena can cause scintillation effects. The results and ideas from these studies are state-of-the-art, agree well with experimental observations, and have induced experimentalists to look for theoretically predicted results. One conclusion that can be drawn from these studies is that ionospheric irregularity phenomena can be modelled from a first principles physics point of view. Theoretical and numerical simulation results from the aforementioned ionospheric irregularity areas will be presented.

  3. Physical Models of Cognition

    NASA Technical Reports Server (NTRS)

    Zak, Michail

    1994-01-01

    This paper presents and discusses physical models for simulating some aspects of neural intelligence, and, in particular, the process of cognition. The main departure from the classical approach here is in utilization of a terminal version of classical dynamics introduced by the author earlier. Based upon violations of the Lipschitz condition at equilibrium points, terminal dynamics attains two new fundamental properties: it is spontaneous and nondeterministic. Special attention is focused on terminal neurodynamics as a particular architecture of terminal dynamics which is suitable for modeling of information flows. Terminal neurodynamics possesses a well-organized probabilistic structure which can be analytically predicted, prescribed, and controlled, and therefore which presents a powerful tool for modeling real-life uncertainties. Two basic phenomena associated with random behavior of neurodynamic solutions are exploited. The first one is a stochastic attractor: a stable stationary stochastic process to which random solutions of a closed system converge. As a model of the cognition process, a stochastic attractor can be viewed as a universal tool for generalization and formation of classes of patterns. The concept of stochastic attractor is applied to model a collective brain paradigm explaining coordination between simple units of intelligence which perform a collective task without direct exchange of information. The second fundamental phenomenon discussed is terminal chaos which occurs in open systems. Applications of terminal chaos to information fusion as well as to explanation and modeling of coordination among neurons in biological systems are discussed. It should be emphasized that all the models of terminal neurodynamics are implementable in analog devices, which means that all the cognition processes discussed in the paper are reducible to the laws of Newtonian mechanics.

  4. Physical models of cognition

    NASA Astrophysics Data System (ADS)

    Zak, Michail

    1994-05-01

    This paper presents and discusses physical models for simulating some aspects of neural intelligence, and, in particular, the process of cognition. The main departure from the classical approach here is in utilization of a terminal version of classical dynamics introduced by the author earlier. Based upon violations of the Lipschitz condition at equilibrium points, terminal dynamics attains two new fundamental properties: it is spontaneous and nondeterministic. Special attention is focused on terminal neurodynamics as a particular architecture of terminal dynamics which is suitable for modeling of information flows. Terminal neurodynamics possesses a well-organized probabilistic structure which can be analytically predicted, prescribed, and controlled, and therefore which presents a powerful tool for modeling real-life uncertainties. Two basic phenomena associated with random behavior of neurodynamic solutions are exploited. The first one is a stochastic attractor—a stable stationary stochastic process to which random solutions of a closed system converge. As a model of the cognition process, a stochastic attractor can be viewed as a universal tool for generalization and formation of classes of patterns. The concept of stochastic attractor is applied to model a collective brain paradigm explaining coordination between simple units of intelligence which perform a collective task without direct exchange of information. The second fundamental phenomenon discussed is terminal chaos which occurs in open systems. Applications of terminal chaos to information fusion as well as to explanation and modeling of coordination among neurons in biological systems are discussed. It should be emphasized that all the models of terminal neurodynamics are implementable in analog devices, which means that all the cognition processes discussed in the paper are reducible to the laws of Newtonian mechanics.

  5. MODELING PHYSICAL HABITAT PARAMETERS

    EPA Science Inventory

    Salmonid populations can be affected by alterations in stream physical habitat. Fish productivity is determined by the stream's physical habitat structure (channel form, substrate distribution, riparian vegetation), water quality, flow regime and inputs from the watershed (sedim...

  6. Simulating cosmic radiation absorption and secondary particle production of solar panel layers of Low Earth Orbit (LEO) satellite with GEANT4

    NASA Astrophysics Data System (ADS)

    Yiǧitoǧlu, Merve; Veske, Doǧa; Nilüfer Öztürk, Zeynep; Bilge Demirköz, Melahat

    2016-07-01

    All devices which operate in space are exposed to cosmic rays during their operation. The resulting radiation may cause fatal damage in the solid structure of devices, and the absorbed radiation dose and secondary particle production for each component should be calculated carefully before production. Solar panels are semiconductor solid-state devices and are very sensitive to radiation. Even a short-term power cut-off may lead to a total failure of the satellite, and even small doses of radiation can change the characteristics of solar cells. This degradation can be caused by rarer, highly energetic particles as well as by the total ionizing dose from the abundant low-energy particles. In this study, the solar panels planned for a specific LEO satellite, IMECE, are analyzed layer by layer. The Space Environment Information System (SPENVIS) database and the GEANT4 simulation software are used to simulate the layers of the panels. The results obtained from the simulation will be taken into account to determine the amount of radiation protection and resistance needed for the panels or to revise the design of the panels.

  7. Monte Carlo simulation of a PhosWatch detector using Geant4 for xenon isotope beta-gamma coincidence spectrum profile and detection efficiency calculations.

    PubMed

    Mekarski, P; Zhang, W; Ungar, K; Bean, M; Korpach, E

    2009-10-01

    A simulation tool has been developed using the Geant4 toolkit to simulate a PhosWatch single-channel beta-gamma coincidence detection system, consisting of a CsI(Tl)/BC404 phoswich well detector and pulse shape analysis algorithms implemented in a digital signal processor. The tool can be used to simulate the detector's response for all the gamma rays and beta particles emitted from (135)Xe, (133m)Xe, (133)Xe, (131m)Xe and (214)Pb. Two- and three-dimensional beta-gamma coincidence spectra from the PhosWatch detector can be produced using the simulation tool. The accurately simulated spectra can be used to calculate the system coincidence detection efficiency for each xenon isotope, the corrections for interference from the various spectral components from radon and xenon isotopes, and the system gain calibration. The tool can also generate two- and three-dimensional xenon reference spectra to test beta-gamma coincidence spectral deconvolution analysis software. PMID:19647444
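
    Accumulating a two-dimensional beta-gamma coincidence spectrum and a region-of-interest efficiency from per-event energy pairs, as such a tool would do, can be sketched as follows; the event list, binning, ROI and number of simulated decays are illustrative assumptions, not PhosWatch output.

    ```python
    # Sketch of a 2D beta-gamma coincidence spectrum and an ROI efficiency.
    import numpy as np

    events = np.array([[105.0, 81.0], [320.0, 81.0], [140.0, 250.0]])   # (beta keV, gamma keV), illustrative

    beta_edges = np.arange(0.0, 1000.0 + 10.0, 10.0)
    gamma_edges = np.arange(0.0, 700.0 + 10.0, 10.0)
    spectrum_2d, _, _ = np.histogram2d(events[:, 0], events[:, 1],
                                       bins=[beta_edges, gamma_edges])

    # coincidence detection efficiency for a hypothetical gamma ROI around 81 keV,
    # assuming a known number of simulated decays
    n_decays = 1.0e6
    roi_mask = (gamma_edges[:-1] >= 70.0) & (gamma_edges[:-1] < 90.0)
    print(f"ROI efficiency = {spectrum_2d[:, roi_mask].sum() / n_decays:.2e}")
    ```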

  8. Building Mental Models by Dissecting Physical Models

    ERIC Educational Resources Information Center

    Srivastava, Anveshna

    2016-01-01

    When students build physical models from prefabricated components to learn about model systems, there is an implicit trade-off between the physical degrees of freedom in building the model and the intensity of instructor supervision needed. Models that are too flexible, permitting multiple possible constructions require greater supervision to…

  9. Physical Modeling of the Piano

    NASA Astrophysics Data System (ADS)

    Giordano, N.; Jiang, M.

    2004-12-01

    A project aimed at constructing a physical model of the piano is described. Our goal is to calculate the sound produced by the instrument entirely from Newton's laws. The structure of the model is described along with experiments that augment and test the model calculations. The state of the model and what can be learned from it are discussed.

  10. Validation of GEANT4 simulations for percentage depth dose calculations in heterogeneous media by using small photon beams from the 6-MV Cyberknife: Comparison with photon beam dosimetry with EBT2 film

    NASA Astrophysics Data System (ADS)

    Lee, Chung Il; Yoon, Sei-Chul; Shin, Jae Won; Hong, Seung-Woo; Suh, Tae Suk; Min, Kyung Joo; Lee, Sang Deok; Chung, Su Mi; Jung, Jae-Yong

    2015-04-01

    Percentage depth dose (PDD) distributions in heterogeneous phantoms with lung and soft bone equivalent media are studied by using the GEANT4 Monte Carlo code. For lung equivalent media, Balsa wood is used, and for soft bone equivalent media, a compound material of epoxy resin, hardener and calcium carbonate is used. Polystyrene slabs put together with these materials are used as a heterogeneous phantom. Dose measurements are performed with Gafchromic EBT2 film by using photon beams from the 6-MV CyberKnife at the Seoul Uridul Hospital. The cone sizes of the photon beams are varied from 5 to 10 to 30 mm. When the Balsa wood is inserted in the phantom, the dose measured with EBT2 film is found to be significantly different from the dose without EBT2 film both in and beyond the Balsa wood region, particularly for small field sizes. On the other hand, when the soft bone equivalent material is inserted in the phantom, the discrepancy between the dose measured with EBT2 film and the dose without EBT2 film can be seen only in the region of the soft bone equivalent material. GEANT4 simulations are done with and without the EBT2 film to compare the simulation results with measurements. The GEANT4 simulations including EBT2 film are found to agree well with the measurements for all cases within an error of 2.2%. The results of the present study show that GEANT4 gives reasonable results for the PDD calculations in heterogeneous media when using photon beams produced by the 6-MV CyberKnife.

  11. Efficiency corrections in determining the (137)Cs inventory of environmental soil samples by using relative measurement method and GEANT4 simulations.

    PubMed

    Li, Gang; Liang, Yongfei; Xu, Jiayun; Bai, Lixin

    2015-08-01

    The determination of the (137)Cs inventory is widely used to estimate the soil erosion or deposition rate. The generally used method to determine the activity of volumetric samples is the relative measurement method, which employs a calibration standard sample with accurately known activity. This method has great advantages in accuracy and operation only when there is a small difference in elemental composition, sample density and geometry between the measured samples and the calibration standard; otherwise it needs additional efficiency corrections in the calculation process. Monte Carlo simulations can handle these correction problems easily, with lower financial cost and higher accuracy. This work presents a detailed description of the simulation and calibration procedure for a conventionally used commercial P-type coaxial HPGe detector with cylindrical sample geometry. The effects of sample elemental composition, density and geometry were discussed in detail and calculated in terms of efficiency correction factors. The effect of sample placement was also analyzed; the results indicate that the radioactive nuclides and the sample density are not absolutely uniformly distributed along the axial direction. Finally, a unified binary quadratic functional relationship for the efficiency correction factors as a function of sample density and height was obtained by the least-squares fitting method. This function covers the sample density and height ranges of 0.8-1.8 g/cm(3) and 3.0-7.25 cm, respectively. The efficiency correction factors calculated by the fitted function are in good agreement with those obtained by the GEANT4 simulations, with a coefficient of determination greater than 0.9999. The results obtained in this paper make the above-mentioned relative measurements more accurate and efficient in the routine radioactive analysis of environmental cylindrical soil samples. PMID:25973538
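
    The final fitting step described above, a binary (bivariate) quadratic function of sample density and height fitted to efficiency correction factors by least squares, can be sketched as follows; the data points are invented placeholders, not the published calibration data.

    ```python
    # Least-squares fit of f(rho, h) = a0 + a1*rho + a2*h + a3*rho^2 + a4*h^2 + a5*rho*h
    import numpy as np

    rho = np.array([0.8, 1.0, 1.2, 1.4, 1.6, 1.8, 1.0, 1.4])              # g/cm^3, illustrative
    h = np.array([3.0, 4.0, 5.0, 6.0, 7.25, 3.0, 7.25, 5.0])              # cm, illustrative
    f = np.array([0.95, 0.97, 1.00, 1.03, 1.08, 0.99, 1.05, 1.02])        # correction factors, illustrative

    A = np.column_stack([np.ones_like(rho), rho, h, rho**2, h**2, rho * h])
    coeffs, *_ = np.linalg.lstsq(A, f, rcond=None)

    print("fitted coefficients:", np.round(coeffs, 4))
    ```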

  12. SU-E-T-290: Secondary Dose Monitoring Using Scintillating Fibers in Proton Therapy of Prostate Cancer: A Geant4 Monte Carlo Simulation

    SciTech Connect

    Tesfamicael, B; Gueye, P; Lyons, D; Avery, S; Mahesh, M

    2014-06-01

    Purpose: To monitor the secondary dose distribution originating from a water phantom during proton therapy of prostate cancer using scintillating fibers. Methods: The Geant4 Monte Carlo toolkit version 9.6.p02 was used to simulate prostate cancer proton therapy based treatments. Two cases were studied. In the first case, 8 × 8 = 64 equally spaced fibers inside three 4 × 4 × 2.54 cm^3 DuPont™ Delrin blocks were used to monitor the emission of secondary particles in the transverse (left and right) and distal regions relative to the beam direction. In the second case, a scintillating block with a thickness of 2.54 cm and equal vertical and longitudinal dimensions to the water phantom was used. Geometrical cuts were used to extract the energy deposited in each fiber and in the scintillating block. Results: The transverse dose distributions from secondary particles in both cases agree within <5% and with very good symmetry. The energy deposited not only gradually increases as one moves from the peripheral row fibers towards the center of the block (aligned with the center of the prostate) but also decreases as one goes from the frontal to the distal region of the block. The ratio of the doses from the prostate to those in the middle two rows of fibers showed a linear relationship with a slope of (−3.55 ± 2.26) × 10^−5 MeV per treatment Gy. The distal detectors recorded a very small deposited energy due to water attenuation. Conclusion: With a good calibration and the ability to define a good correlation between the dose to the external fibers and the prostate, such fibers can be used for real-time dose verification to the target.

  13. Dose distribution changes with shielding disc misalignments and wrong orientations in breast IOERT: a Monte Carlo - GEANT4 and experimental study.

    PubMed

    Russo, Giorgio; Casarino, Carlo; Arnetta, Gaetano; Candiano, Giuliana; Stefano, Alessandro; Alongi, Filippo; Borasi, Giovanni; Messa, Cristina; Gilardi, Maria C

    2012-01-01

    One of the most relevant risks in breast intraoperative electron radiotherapy (IOERT) is the incorrect positioning of the shielding disc. If such a setup error occurs, the treatment zone could receive a nonuniform dose delivery, and a considerable part of the electron beam could hit - and irradiate - the patient's healthy tissue. Although the misalignment and tilt angle of the shielding disc can be evaluated, it is not possible to measure the corresponding in vivo dose distribution. This led us to develop a simulation using the Geant4 Monte Carlo toolkit to study the effects of disc configuration on dose distribution. Several parameters were investigated: the shielding factor (SF), the radiation back-scattering factor (BSF), the volume-dose histogram in the treatment zone, and the maximum leakage dose (MLD) in normal tissue. A lateral shift of the disc (in the plane perpendicular to the beam axis) causes a decrease in SF (from 4% for a misalignment of 5 mm to 40% for a misalignment of 40 mm), but no relevant dose variations were found for tilt angles up to 10°. In the same uncorrected disc positions, the BSF shows no significant change. The MLD rises to 3.45 Gy for a 14 mm misalignment and to 4.60 Gy for a 30° tilt angle when the prescribed dose is 21 Gy. The simulation results are compared with the experimental ones, and allow an a posteriori estimation of the dose distribution in the breast target and underlying healthy tissue. This information could help the surgical team choose a more correct clinical setup, and assist in quantifying the degree of success or failure of an IOERT breast treatment. PMID:22955646

  14. GEANT4 simulation of a scintillating-fibre tracker for the cosmic-ray muon tomography of legacy nuclear waste containers

    NASA Astrophysics Data System (ADS)

    Clarkson, A.; Hamilton, D. J.; Hoek, M.; Ireland, D. G.; Johnstone, J. R.; Kaiser, R.; Keri, T.; Lumsden, S.; Mahon, D. F.; McKinnon, B.; Murray, M.; Nutbeam-Tuffs, S.; Shearer, C.; Staines, C.; Yang, G.; Zimmerman, C.

    2014-05-01

    Cosmic-ray muons are highly penetrative charged particles that are observed at the sea level with a flux of approximately one per square centimetre per minute. They interact with matter primarily through Coulomb scattering, which is exploited in the field of muon tomography to image shielded objects in a wide range of applications. In this paper, simulation studies are presented that assess the feasibility of a scintillating-fibre tracker system for use in the identification and characterisation of nuclear materials stored within industrial legacy waste containers. A system consisting of a pair of tracking modules above and a pair below the volume to be assayed is simulated within the GEANT4 framework using a range of potential fibre pitches and module separations. Each module comprises two orthogonal planes of fibres that allow the reconstruction of the initial and Coulomb-scattered muon trajectories. A likelihood-based image reconstruction algorithm has been developed that allows the container content to be determined with respect to the scattering density λ, a parameter which is related to the atomic number Z of the scattering material. Images reconstructed from this simulation are presented for a range of anticipated scenarios that highlight the expected image resolution and the potential of this system for the identification of high-Z materials within a shielded, concrete-filled container. First results from a constructed prototype system are presented in comparison with those from a detailed simulation. Excellent agreement between experimental data and simulation is observed showing clear discrimination between the different materials assayed throughout.
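
    The basic per-muon quantity behind such images, the Coulomb scattering angle between the incoming and outgoing track directions, can be computed as in the sketch below. The likelihood-based reconstruction used in the paper is considerably more elaborate; the direction vectors here are illustrative.

    ```python
    # Scattering angle between the incoming and outgoing muon track directions.
    import numpy as np

    d_in = np.array([0.02, -0.01, -1.0])       # incoming direction (upper tracker), illustrative
    d_out = np.array([0.06, 0.03, -1.0])       # outgoing direction (lower tracker), illustrative

    d_in /= np.linalg.norm(d_in)
    d_out /= np.linalg.norm(d_out)

    theta = np.arccos(np.clip(np.dot(d_in, d_out), -1.0, 1.0))   # scattering angle (rad)

    # for many muons, the scattering density lambda of a voxel scales with the
    # variance of theta (normalised by muon momentum and path length)
    print(f"scattering angle = {np.degrees(theta):.2f} deg")
    ```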

  15. Physical model of kitesurfing

    NASA Astrophysics Data System (ADS)

    Zimoch, Pawel; Paxson, Adam; Obropta, Edward; Peleg, Tom; Parker, Sam; Hosoi, A. E.

    2013-11-01

    Kitesurfing is a popular water sport, similar to windsurfing, utilizing a surfboard-like platform pulled by a large kite operated by the surfer. While the kite generates thrust that propels the surfer across the water, much like a traditional sail, it is also capable of generating vertical forces on the surfer, reducing the hydrodynamic lift generated by the surfboard required to support the surfer's weight. This in turn reduces drag acting on the surfboard, making sailing possible in winds lower than required by other sailing sports. We describe aerodynamic and hydrodynamic models for the forces acting on the kite and the surfboard, and couple them while considering the kite's position in space and the requirement for the kite to support its own weight. We then use these models to quantitatively characterize the significance of the vertical force component generated by the kite on sailing performance (the magnitude of achievable steady-state velocities and the range of headings, relative to the true wind direction, in which sailing is possible), and the degradation in sailing performance with decreasing wind speeds. Finally, we identify the areas of kite and surfboard design whose development could have the greatest impact on improving sailing performance in low wind conditions.

  16. TH-A-19A-05: Modeling Physics Properties and Biologic Effects Induced by Proton and Helium Ions

    SciTech Connect

    Taleei, R; Titt, U; Peeler, C; Guan, F; Mirkovic, D; Grosshans, D; Mohan, R

    2014-06-15

    Purpose: Currently, protons and carbon ions are used for cancer treatment. More recently, other light ions, including helium ions, have shown interesting physical and biological properties. The purpose of this work is to study the biological and physical properties of helium ions (He-3) in comparison to protons. Methods: Monte Carlo simulations with FLUKA, GEANT4 and MCNPX were used to calculate proton and He-3 dose distributions in water phantoms. The energy spectra of proton and He-3 beams were calculated with high resolution for use in biological models. The repair-misrepair-fixation (RMF) model was subsequently used to calculate the RBE. Results: The proton Bragg curve calculations show good agreement between the three general purpose Monte Carlo codes. In contrast, the He-3 Bragg curve calculations show disagreement (in the magnitude of the Bragg peak) between FLUKA and the other two Monte Carlo codes. The differences in the magnitude of the Bragg peak are mainly due to the discrepancy in the secondary fragmentation cross sections used by the codes. The RBE for V79 cell lines is about 0.96 and 0.98 at the entrance of the proton and He-3 depth-dose curves, respectively. The RBE increases to 1.06 and 1.59 at the Bragg peak for protons and He-3 ions, respectively. The results demonstrated that LET, microdosimetric parameters (such as dose-mean lineal energy) and RBE are nearly constant along the plateau region of the Bragg curve, while all parameters increase within the Bragg peak and at the distal edge for both protons and He-3 ions. Conclusion: The Monte Carlo codes should revise the fragmentation cross sections to more accurately simulate the physical properties of He-3 ions. The increase in RBE for He-3 ions is higher than for proton beams at the Bragg peak.

  17. Physical Modeling of Microtubules Network

    NASA Astrophysics Data System (ADS)

    Allain, Pierre; Kervrann, Charles

    2014-10-01

    Microtubules (MT) are highly dynamic tubulin polymers that are involved in many cellular processes such as mitosis, intracellular cell organization and vesicular transport. Nevertheless, the modeling of cytoskeleton and MT dynamics based on physical properties is difficult to achieve. Using the Euler-Bernoulli beam theory, we propose to model the rigidity of microtubules on a physical basis using forces, mass and acceleration. In addition, we link microtubules growth and shrinkage to the presence of molecules (e.g. GTP-tubulin) in the cytosol. The overall model enables linking cytosol to microtubules dynamics in a constant state space thus allowing usage of data assimilation techniques.

  18. Physical Modeling of Aqueous Solvation

    PubMed Central

    Fennell, Christopher J.

    2014-01-01

    We consider the free energies of solvating molecules in water. Computational modeling usually involves either detailed explicit-solvent simulations, or faster computations, which are based on implicit continuum approximations or additivity assumptions. These simpler approaches often miss microscopic physical details and non-additivities present in experimental data. We review explicit-solvent modeling that identifies the physical bases for the errors in the simpler approaches. One problem is that water molecules that are shared between two substituent groups often behave differently than waters around each substituent individually. One manifestation of non-additivities is that solvation free energies in water can depend not only on surface area or volume, but on other properties, such as the surface curvature. We also describe a new computational approach, called Semi-Explicit Assembly, that aims to repair these flaws and capture more of the physics of explicit water models, but with computational efficiencies approaching those of implicit-solvent models. PMID:25143658

  19. Physical Models In GPSOMC Software

    NASA Technical Reports Server (NTRS)

    Sovers, Ojars J.; Border, James S.

    1992-01-01

    Report describes physical models incorporated into GPSOMC (modeling module of GIPSY software), which processes geodetic measurements in Global Positioning Satellite (GPS) system. Models describe spacecraft orbits and motions of receivers fixed to Earth. Supplies a priori values of computed observables and partial derivatives of computed observables with respect to parameters of models. Describes portion of software modeling locations of receivers and motions of whole Earth and computes observables and partial derivatives. Corrected, expanded, and updated version of JPL Publication 87-21, September 15, 1987.

  20. Building mental models by dissecting physical models.

    PubMed

    Srivastava, Anveshna

    2016-01-01

    When students build physical models from prefabricated components to learn about model systems, there is an implicit trade-off between the physical degrees of freedom in building the model and the intensity of instructor supervision needed. Models that are too flexible, permitting multiple possible constructions, require greater supervision to ensure focused learning; models that are too constrained require less supervision, but can be constructed mechanically, with little to no conceptual engagement. We propose "model-dissection" as an alternative to "model-building," whereby instructors could make efficient use of supervisory resources, while simultaneously promoting focused learning. We report empirical results from a study conducted with biology undergraduate students, where we demonstrate that asking them to "dissect" out specific conceptual structures from an already built 3D physical model leads to a greater improvement in performance than asking them to build the 3D model from simpler components. Using questionnaires to measure understanding both before and after model-based interventions for two cohorts of students, we find that both the "builders" and the "dissectors" improve in the post-test, but it is the latter group who show statistically significant improvement. These results, in addition to the intrinsic time-efficiency of "model dissection," suggest that it could be a valuable pedagogical tool. PMID:26712513

  1. Accelerator physics and modeling: Proceedings

    SciTech Connect

    Parsa, Z.

    1991-12-31

    This report contains papers on the following topics: Physics of high brightness beams; radio frequency beam conditioner for fast-wave free-electron generators of coherent radiation; wake-field and space-charge effects on high brightness beams. Calculations and measured results for BNL-ATF; non-linear orbit theory and accelerator design; general problems of modeling for accelerators; development and application of dispersive soft ferrite models for time-domain simulation; and bunch lengthening in the SLC damping rings.

  2. Accelerator physics and modeling: Proceedings

    SciTech Connect

    Parsa, Z.

    1991-01-01

    This report contains papers on the following topics: Physics of high brightness beams; radio frequency beam conditioner for fast-wave free-electron generators of coherent radiation; wake-field and space-charge effects on high brightness beams. Calculations and measured results for BNL-ATF; non-linear orbit theory and accelerator design; general problems of modeling for accelerators; development and application of dispersive soft ferrite models for time-domain simulation; and bunch lengthening in the SLC damping rings.

  3. Physical and mathematical cochlear models

    NASA Astrophysics Data System (ADS)

    Lim, Kian-Meng

    2000-10-01

    The cochlea is an intricate organ in the inner ear responsible for our hearing. Besides acting as a transducer to convert mechanical sound vibrations to electrical neural signals, the cochlea also amplifies and separates the sound signal into its spectral components for further processing in the brain. It operates over a broad band of frequencies and a huge dynamic range of input while maintaining a low power consumption. The present research takes the approach of building cochlear models to study and understand the underlying mechanics involved in the functioning of the cochlea. Both physical and mathematical models of the cochlea are constructed. The physical model is a first attempt to build a life-sized replica of the human cochlea using advanced micro-machining techniques. The model takes a modular design, with a removable silicon-wafer based partition membrane encapsulated in a plastic fluid chamber. Preliminary measurements in the model are obtained and they compare roughly with simulation results. Parametric studies on the design parameters of the model lead to an improved design of the model. The studies also revealed that the width and orthotropy of the basilar membrane in the cochlea have significant effects on the sharply tuned responses observed in the biological cochlea. The mathematical model is a physiologically based model that includes three-dimensional viscous fluid flow and a tapered partition with variable properties along its length. A hybrid asymptotic and numerical method provides a uniformly valid and efficient solution to the short and long wave regions in the model. Both linear and non-linear activity are included in the model to simulate the active cochlea. The mathematical model has successfully reproduced many features of the response in the biological cochlea, as observed in experimental measurements performed on animals. These features include sharply tuned frequency responses, significant amplification with inclusion of activity

  4. Cabin Environment Physics Risk Model

    NASA Technical Reports Server (NTRS)

    Mattenberger, Christopher J.; Mathias, Donovan Leigh

    2014-01-01

    This paper presents a Cabin Environment Physics Risk (CEPR) model that predicts the time for an initial failure of Environmental Control and Life Support System (ECLSS) functionality to propagate into a hazardous environment and trigger a loss-of-crew (LOC) event. This physics-of-failure model allows a probabilistic risk assessment of a crewed spacecraft to account for the cabin environment, which can serve as a buffer to protect the crew during an abort from orbit and ultimately enable a safe return. The results of the CEPR model replace the assumption that failure of crew-critical ECLSS functionality causes LOC instantly, and provide a more accurate representation of the spacecraft's risk posture. The instant-LOC assumption is shown to be excessively conservative and, moreover, can impact the relative risk drivers identified for the spacecraft. This, in turn, could lead the design team to allocate mass for equipment to reduce overly conservative risk estimates in a suboptimal configuration, which inherently increases the overall risk to the crew. For example, available mass could be poorly used to add redundant ECLSS components that have a negligible benefit but appear to make the vehicle safer due to poor assumptions about the propagation time of ECLSS failures.

  5. Excellence in Physics Education Award: Modeling Theory for Physics Instruction

    NASA Astrophysics Data System (ADS)

    Hestenes, David

    2014-03-01

    All humans create mental models to plan and guide their interactions with the physical world. Science has greatly refined and extended this ability by creating and validating formal scientific models of physical things and processes. Research in physics education has found that mental models created from everyday experience are largely incompatible with scientific models. This suggests that the fundamental problem in learning and understanding science is coordinating mental models with scientific models. Modeling Theory has drawn on resources of cognitive science to work out extensive implications of this suggestion and guide development of an approach to science pedagogy and curriculum design called Modeling Instruction. Modeling Instruction has been widely applied to high school physics and, more recently, to chemistry and biology, with noteworthy results.

  6. A Multivariate Model of Physics Problem Solving

    ERIC Educational Resources Information Center

    Taasoobshirazi, Gita; Farley, John

    2013-01-01

    A model of expertise in physics problem solving was tested on undergraduate science, physics, and engineering majors enrolled in an introductory-level physics course. Structural equation modeling was used to test hypothesized relationships among variables linked to expertise in physics problem solving including motivation, metacognitive planning,…

  7. Integrated modeling, data transfers, and physical models

    NASA Astrophysics Data System (ADS)

    Brookshire, D. S.; Chermak, J. M.

    2003-04-01

    Difficulties in developing precise economic policy models for water reallocation and re-regulation in various regional and transboundary settings have been exacerbated not only by climate issues but also by institutional changes reflected in the promulgation of environmental laws, changing regional populations, and an increased focus on water quality standards. As the complexity of water issues has increased, model development at a micro-policy level is necessary to capture difficult institutional nuances and represent the differing national, regional and stakeholders' viewpoints. More often than not, adequate "local" or specific micro-data are not available in all settings for modeling and policy decisions. Economic policy analysis increasingly deals with this problem through data transfers (transferring results from one study area to another) and significant progress has been made in understanding the issue of the dimensionality of data transfers. This paper explores the conceptual and empirical dimensions of data transfers in the context of integrated modeling when the transfers are not only from the behavioral, but also from the hard sciences. We begin by exploring the domain of transfer issues associated with policy analyses that directly consider uncertainty in both the behavioral and physical science settings. We then, through a stylized, hybrid, economic-engineering model of water supply and demand in the Middle Rio Grande Valley of New Mexico (USA), analyze the impacts of: (1) the relative uncertainty of data transfer methods, (2) the uncertainty of climate data and, (3) the uncertainty of population growth. These efforts are motivated by the need to address the relative importance of more accurate data both from the physical sciences as well as from demography and economics for policy analyses. We evaluate the impacts by empirically addressing (within the Middle Rio Grande model): (1) How much does the surrounding uncertainty of the benefit transfer

  8. Modeling QCD for Hadron Physics

    NASA Astrophysics Data System (ADS)

    Tandy, P. C.

    2011-10-01

    We review the approach to modeling soft hadron physics observables based on the Dyson-Schwinger equations of QCD. The focus is on light quark mesons and in particular the pseudoscalar and vector ground states, their decays and electromagnetic couplings. We detail the wide variety of observables that can be correlated by a ladder-rainbow kernel with one infrared parameter fixed to the chiral quark condensate. A recently proposed novel perspective in which the quark condensate is contained within hadrons and not the vacuum is mentioned. The valence quark parton distributions, in the pion and kaon, as measured in the Drell Yan process, are investigated with the same ladder-rainbow truncation of the Dyson-Schwinger and Bethe-Salpeter equations.

  9. Modeling QCD for Hadron Physics

    SciTech Connect

    Tandy, P. C.

    2011-10-24

    We review the approach to modeling soft hadron physics observables based on the Dyson-Schwinger equations of QCD. The focus is on light quark mesons and in particular the pseudoscalar and vector ground states, their decays and electromagnetic couplings. We detail the wide variety of observables that can be correlated by a ladder-rainbow kernel with one infrared parameter fixed to the chiral quark condensate. A recently proposed novel perspective in which the quark condensate is contained within hadrons and not the vacuum is mentioned. The valence quark parton distributions, in the pion and kaon, as measured in the Drell Yan process, are investigated with the same ladder-rainbow truncation of the Dyson-Schwinger and Bethe-Salpeter equations.

  10. Modelization For Electromagnetic Electron Scattering at Low Energies for Radiotherapy applications.

    NASA Astrophysics Data System (ADS)

    Nazaryan, Vahagn; Gueye, Paul

    2006-03-01

    Since the release of the GEANT4 particle simulation toolkit in 2003, there has been a growing interest in its applications to medical physics. The applicability of GEANT4 to radiotherapy has been the subject of several investigations in recent years, and it was found to be of great use. Its low-energy model allows for electromagnetic interaction simulations down to 250 eV. The electron physics data are obtained from the Lawrence Livermore National Laboratory's Evaluated Electron Data Library (EEDL). At very low energies (below 10 MeV), some of the tabulated data in EEDL have large uncertainties (more than 50%) and rely on various extrapolations to energy regions where there are no experimental data. We have investigated the impact of variations in these cross-section data on radiotherapy applications. Our study suggests a strong need for better theoretical models of electron interactions with matter at these energies, and the necessity of new and more reliable experimental data. The progress towards such a theoretical model will be presented.

  11. Physics modeling support contract: Final report

    SciTech Connect

    Not Available

    1987-09-30

    This document is the final report for the Physics Modeling Support contract between TRW, Inc. and the Lawrence Livermore National Laboratory for fiscal year 1987. It consists of the following projects: TIBER physics modeling and systems code development; advanced blanket modeling task; time dependent modeling; and free electron maser for TIBER II.

  12. Model Formulation for Physics Problem Solving. Draft.

    ERIC Educational Resources Information Center

    Novak, Gordon S., Jr.

    The major task in solving a physics problem is to construct an appropriate model of the problem in terms of physical principles. The functions performed by such a model, the information which needs to be represented, and the knowledge used in selecting and instantiating an appropriate model are discussed. An example of a model for a mechanics…

  13. Software and Physics Simulation at Belle II

    NASA Astrophysics Data System (ADS)

    Fulsom, Bryan; Belle II Collaboration

    2016-03-01

    The Belle II experiment at the SuperKEKB collider in Tsukuba, Japan, will start taking physics data in 2018 and will accumulate 50 ab-1 of e+e- collision data, about 50 times larger than the data set of the earlier Belle experiment. The new detector will use GEANT4 for Monte Carlo simulation and an entirely new software and reconstruction system based on modern computing tools. Examples of physics simulation including beam background overlays will be described.

  14. Physical modeling of Tibetan bowls

    NASA Astrophysics Data System (ADS)

    Antunes, Jose; Inacio, Octavio

    2001-05-01

    Tibetan bowls produce rich penetrating sounds, used in musical contexts and to induce a state of relaxation for meditation or therapy purposes. To understand the dynamics of these instruments under impact and rubbing excitation, we developed a simulation method based on the modal approach, following our previous papers on physical modeling of plucked/bowed strings and impacted/bowed bars. This technique is based on a compact representation of the system dynamics, in terms of the unconstrained bowl modes. Nonlinear contact/friction interaction forces, between the exciter (puja) and the bowl, are computed at each time step and projected on the bowl modal basis, followed by step integration of the modal equations. We explore the behavior of two different-sized bowls, for extensive ranges of excitation conditions (contact/friction parameters, normal force, and tangential puja velocity). Numerical results and experiments show that various self-excited motions may arise depending on the playing conditions and, mainly, on the contact/friction interaction parameters. Indeed, triggering of a given bowl modal frequency mainly depends on the puja material. Computed animations and experiments demonstrate that self-excited modes spin, following the puja motion. Accordingly, the sensed pressure field pulsates, with frequency controlled by the puja spinning velocity and the spatial pattern of the singing mode.
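
    The modal time-stepping scheme described above can be sketched with a few modal oscillators driven by a friction force at a single contact point, as below. The modal data, friction law and puja parameters are all invented for illustration; the authors' simulations use measured bowl modes and a more detailed contact/friction model.

```python
import math

# Bank of modal oscillators excited by a smoothed Coulomb friction force at one
# contact point; a crude stand-in for the modal synthesis described above.
freqs_hz = [440.0, 1020.0, 1800.0]   # modal frequencies (assumed)
phi = [1.0, 0.8, 0.5]                # mode-shape values at the contact point (assumed)
zeta = 1e-4                          # modal damping ratio (assumed)
mu, normal_force = 0.4, 2.0          # friction coefficient and puja pressure (assumed)
v_puja, eps = 0.3, 1e-3              # puja tangential speed, sign() smoothing

dt, steps = 5e-6, 200_000            # 1 s of simulated time
q = [0.0, 0.0, 0.0]                  # modal displacements
qd = [0.0, 0.0, 0.0]                 # modal velocities

for _ in range(steps):
    v_contact = sum(p * v for p, v in zip(phi, qd))
    v_rel = v_puja - v_contact
    f_friction = mu * normal_force * v_rel / (abs(v_rel) + eps)  # smoothed Coulomb
    for i, f in enumerate(freqs_hz):
        w = 2.0 * math.pi * f
        acc = phi[i] * f_friction - 2.0 * zeta * w * qd[i] - w * w * q[i]
        qd[i] += dt * acc            # semi-implicit Euler step
        q[i] += dt * qd[i]

print("contact-point velocity after 1 s:",
      round(sum(p * v for p, v in zip(phi, qd)), 3), "m/s")
```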

  15. A qualitative model of physical fields

    SciTech Connect

    Lundell, M.

    1996-12-31

    A qualitative model of the spatio-temporal behaviour of distributed parameter systems based on physical fields is presented. Field-based models differ from the object-based models normally used in qualitative physics by treating parameters as continuous entities instead of as attributes of discrete objects. This is especially suitable for natural physical systems, e.g. in ecology. The model is divided into a static and a dynamic part. The static model describes the distribution of each parameter as a qualitative physical field. Composite fields are constructed from intersection models of pairs of fields. The dynamic model describes processes acting on the fields, and qualitative relationships between parameters. Spatio-temporal behaviour is modelled by interacting temporal processes, influencing single points in space, and spatial processes that gradually spread temporal processes over space. We give an example of a qualitative model of a natural physical system and discuss the ambiguities that arise during simulation.

  16. Sensitivity study of proton radiography and comparison with kV and MV x-ray imaging using GEANT4 Monte Carlo simulations

    NASA Astrophysics Data System (ADS)

    Depauw, Nicolas; Seco, Joao

    2011-04-01

    The imaging sensitivity of proton radiography has been studied and compared with kV and MV x-ray imaging using Monte Carlo simulations. A phantom was specifically modeled using 21 different material inserts with densities ranging from 0.001 to 1.92 g cm-3. These simulations were run using the MGH double scattered proton beam, scanned pencil proton beams from 200 to 490 MeV, as well as pure 50 keV, 100 keV, 1 MeV and 2 MeV gamma x-ray beams. In order to compare the physics implied in both proton and photon radiography without being biased by the current state of the art in detector technology, the detectors were considered perfect. Along with spatial resolution, the contrast-to-noise ratio was evaluated and compared for each material. These analyses were performed using radiographic images that took into account the following: only primary protons, both primary and secondary protons, and both contributions while performing angular and energetic cuts. Additionally, tissue-to-tissue contrasts in an actual lung cancer patient case were studied for simulated proton radiographs and compared against the original kV x-ray image which corresponds to the current patient set-up image in the proton clinic. This study highlights the poorer spatial resolution of protons versus x-rays for radiographic imaging purposes, and the excellent density resolution of proton radiography. Contrasts around the tumor are higher using protons in a lung cancer patient case. The high-density resolution of proton radiography is of great importance for specific tumor diagnostics, such as in lung cancer, where x-ray radiography operates poorly. Furthermore, the use of daily proton radiography prior to proton therapy would ameliorate patient set-up while reducing the absorbed dose delivered through imaging.
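
    A contrast-to-noise ratio of the kind evaluated above can be computed from two regions of interest in a radiograph, as in the sketch below; the pixel values are synthetic and the exact CNR definition used in the study may differ.

```python
import numpy as np

# CNR between an insert region and the surrounding background in a radiograph.
rng = np.random.default_rng(0)
background = rng.normal(loc=1.00, scale=0.02, size=10_000)  # water-equivalent ROI
insert = rng.normal(loc=0.93, scale=0.02, size=10_000)      # denser insert ROI

cnr = abs(insert.mean() - background.mean()) / background.std()
print(f"CNR = {cnr:.1f}")
```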

  17. NUMERICAL MODELING OF FINE SEDIMENT PHYSICAL PROCESSES.

    USGS Publications Warehouse

    Schoellhamer, David H.

    1985-01-01

    Fine sediment in channels, rivers, estuaries, and coastal waters undergo several physical processes including flocculation, floc disruption, deposition, bed consolidation, and resuspension. This paper presents a conceptual model and reviews mathematical models of these physical processes. Several general fine sediment models that simulate some of these processes are reviewed. These general models do not directly simulate flocculation and floc disruption, but the conceptual model and existing functions are shown to adequately model these two processes for one set of laboratory data.

  18. Evaluating a Model of Youth Physical Activity

    ERIC Educational Resources Information Center

    Heitzler, Carrie D.; Lytle, Leslie A.; Erickson, Darin J.; Barr-Anderson, Daheia; Sirard, John R.; Story, Mary

    2010-01-01

    Objective: To explore the relationship between social influences, self-efficacy, enjoyment, and barriers and physical activity. Methods: Structural equation modeling examined relationships between parent and peer support, parent physical activity, individual perceptions, and objectively measured physical activity using accelerometers among a…

  19. Comprehensive Physical Education Program Model

    ERIC Educational Resources Information Center

    Kamiya, Artie

    2005-01-01

    In 2004, the Wake County Public School System (North Carolina) received $1.3 million as one of 237 national winners of the $70 million federal Carol M. White Physical Education Program (PEP) Grant competition. The PEP Grant program is funded by the U.S. Department of Education and provides monies to school districts able to demonstrate the…

  20. CHEMICAL AND PHYSICAL PROCESS AND MECHANISM MODELING

    EPA Science Inventory

    The goal of this task is to develop and test chemical and physical mechanisms for use in the chemical transport models of EPA's Models-3. The target model for this research is the Community Multiscale Air Quality (CMAQ) model. These mechanisms include gas and aqueous phase ph...

  1. Nuclear Physics and the New Standard Model

    SciTech Connect

    Ramsey-Musolf, Michael J.

    2010-08-04

    Nuclear physics studies of fundamental symmetries and neutrino properties have played a vital role in the development and confirmation of the Standard Model of fundamental interactions. With the advent of the CERN Large Hadron Collider, experiments at the high energy frontier promise exciting discoveries about the larger framework in which the Standard Model lies. In this talk, I discuss the complementary opportunities for probing the 'new Standard Model' with nuclear physics experiments at the low-energy high precision frontier.

  2. Determination and Fabrication of New Shield Super Alloys Materials for Nuclear Reactor Safety by Experiments and Cern-Fluka Monte Carlo Simulation Code, Geant4 and WinXCom

    NASA Astrophysics Data System (ADS)

    Aygun, Bünyamin; Korkut, Turgay; Karabulut, Abdulhalik

    2016-05-01

    With the possibility of fossil-fuel depletion and increasing energy needs, the use of radiation tends to increase, and the security-focused debate about planned nuclear power plants still continues. The objective of this thesis is to prevent radiation from nuclear reactors spreading into the environment. To do this, we produced new, higher-performance shielding materials that strongly attenuate radiation during reactor operation. Additives used in the new shielding materials include iron (Fe), rhenium (Re), nickel (Ni), chromium (Cr), boron (B), copper (Cu), tungsten (W), tantalum (Ta) and boron carbide (B4C). The results of these experiments indicated that the materials are good shields against gamma rays and neutrons. The powder metallurgy technique was used to produce the new shielding materials. The CERN FLUKA and Geant4 Monte Carlo simulation codes and WinXCom were used to determine the percentages of the components in these high-temperature-resistant materials for shielding high-level fast neutrons and gamma rays. Super alloys were produced, and then experimental fast-neutron dose-equivalent measurements and gamma-radiation absorption measurements of the new shielding materials were carried out. The products can be used safely not only in reactors but also in nuclear medicine treatment rooms, for the storage of nuclear waste, in nuclear research laboratories, and against cosmic radiation in space vehicles.

  3. Geant4 simulation of the PSI LEM beam line: energy loss and muonium formation in thin foils and the impact of unmoderated muons on the μSR spectrometer

    NASA Astrophysics Data System (ADS)

    Khaw, K. S.; Antognini, A.; Crivelli, P.; Kirch, K.; Morenzoni, E.; Salman, Z.; Suter, A.; Prokscha, T.

    2015-10-01

    The PSI low-energy μSR spectrometer is an instrument dedicated to muon spin rotation and relaxation measurements. Knowledge of the muon beam parameters, such as the spatial, kinetic energy and arrival-time distributions at the sample position, is an important ingredient for analyzing the μSR spectra. We present here the measured energy losses in the thin carbon foil of the muon start detector, deduced from time-of-flight measurements. Muonium formation in the thin carbon foil (10 nm thickness) of the muon start detector also affects the measurable decay asymmetry and therefore needs to be accounted for. Muonium formation and energy losses in the start detector, whose relevance increases with decreasing muon implantation energy (<10 keV), have been implemented in a Geant4 Monte Carlo simulation to reproduce the measured time-of-flight spectra. Simulated and measured time-of-flight spectra and beam spots agree only if a small fraction of so-called "unmoderated" muons, which contaminate the mono-energetic muon beam of the μSR spectrometer, is introduced. Moreover, the sensitivity of the beam size and the related upstream-downstream asymmetry for a specially shaped "nose" sample plate has been studied for various beam line settings, which is of relevance for the study of thermal muonium emission into vacuum from mesoporous silica at cryogenic temperatures.
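
    The deduction of foil energy loss from time of flight rests on simple relativistic kinematics: a measured TOF over a known path gives beta, hence the kinetic energy, and the loss is the difference of the energies before and after the foil. The sketch below illustrates this with an assumed drift length and made-up TOF values; it does not reproduce the instrument's geometry.

```python
import math

M_MU_EV = 105.6583745e6     # muon rest mass, eV/c^2
C = 299_792_458.0           # speed of light, m/s

def kinetic_energy_ev(path_m, tof_s):
    # beta from time of flight, then relativistic kinetic energy (gamma - 1) m c^2.
    beta = path_m / (C * tof_s)
    gamma = 1.0 / math.sqrt(1.0 - beta * beta)
    return (gamma - 1.0) * M_MU_EV

e_in = kinetic_energy_ev(path_m=0.60, tof_s=118.8e-9)    # upstream of the foil (made-up)
e_out = kinetic_energy_ev(path_m=0.60, tof_s=127.6e-9)   # downstream of the foil (made-up)
print(f"deduced energy loss in the foil ~ {(e_in - e_out) / 1e3:.1f} keV")
```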

  4. Models of Strategy for Solving Physics Problems.

    ERIC Educational Resources Information Center

    Larkin, Jill H.

    A set of computer implemented models are presented which can assist in developing problem solving strategies. The three levels of expertise which are covered are beginners (those who have completed at least one university physics course), intermediates (university level physics majors in their third year of study), and professionals (university…

  5. Are Physical Education Majors Models for Fitness?

    ERIC Educational Resources Information Center

    Kamla, James; Snyder, Ben; Tanner, Lori; Wash, Pamela

    2012-01-01

    The National Association of Sport and Physical Education (NASPE) (2002) has taken a firm stance on the importance of adequate fitness levels of physical education teachers stating that they have the responsibility to model an active lifestyle and to promote fitness behaviors. Since the NASPE declaration, national initiatives like Let's Move…

  6. The trinucleons: Physical observables and model properties

    SciTech Connect

    Gibson, B.F.

    1992-05-01

    Our progress in understanding the properties of ³H and ³He in terms of a nonrelativistic Hamiltonian picture employing realistic nuclear forces is reviewed. Trinucleon model properties are summarized for a number of contemporary force models, and predictions for physical observables are presented. Disagreements between theoretical model results and experimental results are highlighted.

  7. The trinucleons: Physical observables and model properties

    SciTech Connect

    Gibson, B.F.

    1992-01-01

    Our progress in understanding the properties of ³H and ³He in terms of a nonrelativistic Hamiltonian picture employing realistic nuclear forces is reviewed. Trinucleon model properties are summarized for a number of contemporary force models, and predictions for physical observables are presented. Disagreements between theoretical model results and experimental results are highlighted.

  8. Modeling Physics with Easy Java Simulations

    ERIC Educational Resources Information Center

    Christian, Wolfgang; Esquembre, Francisco

    2007-01-01

    Modeling has been shown to correct weaknesses of traditional instruction by engaging students in the design of physical models to describe, explain, and predict phenomena. Although the modeling method can be used without computers, the use of computers allows students to study problems that are difficult and time consuming, to visualize their…

  9. Bridging physics and biology teaching through modeling

    NASA Astrophysics Data System (ADS)

    Hoskinson, Anne-Marie; Couch, Brian A.; Zwickl, Benjamin M.; Hinko, Kathleen A.; Caballero, Marcos D.

    2014-05-01

    As the frontiers of biology become increasingly interdisciplinary, the physics education community has engaged in ongoing efforts to make physics classes more relevant to life science majors. These efforts are complicated by the many apparent differences between these fields, including the types of systems that each studies, the behavior of those systems, the kinds of measurements that each makes, and the role of mathematics in each field. Nonetheless, physics and biology are both sciences that rely on observations and measurements to construct models of the natural world. In this article, we propose that efforts to bridge the teaching of these two disciplines must emphasize shared scientific practices, particularly scientific modeling. We define modeling using language common to both disciplines and highlight how an understanding of the modeling process can help reconcile apparent differences between the teaching of physics and biology. We elaborate on how models can be used for explanatory, predictive, and functional purposes and present common models from each discipline demonstrating key modeling principles. By framing interdisciplinary teaching in the context of modeling, we aim to bridge physics and biology teaching and to equip students with modeling competencies applicable in any scientific discipline.

  10. Developing + Using Models in Physics

    ERIC Educational Resources Information Center

    Campbell, Todd; Neilson, Drew; Oh, Phil Seok

    2013-01-01

    Of the eight practices of science identified in "A Framework for K-12 Science Education" (NRC 2012), helping students develop and use models has been identified by many as an anchor (Schwarz and Passmore 2012; Windschitl 2012). In instruction, disciplinary core ideas, crosscutting concepts, and scientific practices can be meaningfully…

  11. Physics of the Quark Model

    ERIC Educational Resources Information Center

    Young, Robert D.

    1973-01-01

    Discusses the charge independence, wavefunctions, magnetic moments, and high-energy scattering of hadrons on the basis of group theory and nonrelativistic quark model with mass spectrum calculated by first-order perturbation theory. The presentation is explainable to advanced undergraduate students. (CC)

  12. PHYSICAL MODEL FOR RECOGNITION TUNNELING

    PubMed Central

    Krstić, Predrag; Ashcroft, Brian; Lindsay, Stuart

    2015-01-01

    Recognition tunneling (RT) identifies target molecules trapped between tunneling electrodes functionalized with recognition molecules that serve as specific chemical linkages between the metal electrodes and the trapped target molecule. Possible applications include single molecule DNA and protein sequencing. This paper addresses several fundamental aspects of RT by multiscale theory, applying both all-atom and coarse-grained DNA models: (1) We show that the magnitude of the observed currents are consistent with the results of non-equilibrium Green's function calculations carried out on a solvated all-atom model. (2) Brownian fluctuations in hydrogen bond-lengths lead to current spikes that are similar to what is observed experimentally. (3) The frequency characteristics of these fluctuations can be used to identify the trapped molecules with a machine-learning algorithm, giving a theoretical underpinning to this new method of identifying single molecule signals. PMID:25650375

  13. Higgs Physics in Supersymmetric Models

    NASA Astrophysics Data System (ADS)

    Jaiswal, Prerit

    The Standard Model (SM) successfully describes the particle spectrum in nature and the interactions between these particles using gauge symmetries. However, in order to give masses to these particles, the electroweak gauge symmetry must be broken. In the SM, this is achieved through the Higgs mechanism, where a scalar Higgs field acquires a vacuum expectation value. It is well known that the presence of a scalar field in the SM leads to a hierarchy problem, and therefore the SM by itself cannot be the fundamental theory of nature. A well-motivated extension of the SM which addresses this problem is the Minimal Supersymmetric Standard Model (MSSM). The Higgs sector in the MSSM has a rich phenomenology and its predictions can be tested at colliders. In this thesis, I will describe three examples in supersymmetric models where the Higgs phenomenology is significantly different from that in the SM. The first example is the MSSM with large tan β, where the Higgs coupling to bottom quarks receives large radiative supersymmetric QCD corrections. As a consequence, bg → bh can be a dominant Higgs production mode in certain regions of the MSSM parameter space. A second example is an extension of the MSSM wherein a fourth generation of chiral fermions and their super-partners are added. I will show that the Higgs boson in such models can be as heavy as ~500 GeV. Finally, as a third example, the MSSM with one of the stops lighter than the top quark is considered. Such a scenario is required to generate sufficient baryon asymmetry in the universe through the process of electroweak baryogenesis. By using the correlations between the Higgs production and decay rates, it will be shown that electroweak baryogenesis in the MSSM is highly constrained.

  14. The Standard Model of Nuclear Physics

    NASA Astrophysics Data System (ADS)

    Detmold, William

    2015-04-01

    At its core, nuclear physics, which describes the properties and interactions of hadrons, such as protons and neutrons, and atomic nuclei, arises from the Standard Model of particle physics. However, the complexities of nuclei result in severe computational difficulties that have historically prevented the calculation of central quantities in nuclear physics directly from this underlying theory. The availability of petascale (and prospect of exascale) high performance computing is changing this situation by enabling us to extend the numerical techniques of lattice Quantum Chromodynamics (LQCD), applied successfully in particle physics, to the more intricate dynamics of nuclear physics. In this talk, I will discuss this revolution and the emerging understanding of hadrons and nuclei within the Standard Model.

  15. PHYSICAL MODELING OF CONTRACTED FLOW.

    USGS Publications Warehouse

    Lee, Jonathan K.

    1987-01-01

    Experiments on steady flow over uniform grass roughness through centered single-opening contractions were conducted in the Flood Plain Simulation Facility at the U. S. Geological Survey's Gulf Coast Hydroscience Center near Bay St. Louis, Miss. The experimental series was designed to provide data for calibrating and verifying two-dimensional, vertically averaged surface-water flow models used to simulate flow through openings in highway embankments across inundated flood plains. Water-surface elevations, point velocities, and vertical velocity profiles were obtained at selected locations for design discharges ranging from 50 to 210 cfs. Examples of observed water-surface elevations and velocity magnitudes at basin cross-sections are presented.

  16. Beta-gamma coincidence counting efficiency and energy resolution optimization by Geant4 Monte Carlo simulations for a phoswich well detector.

    PubMed

    Zhang, Weihua; Mekarski, Pawel; Ungar, Kurt

    2010-12-01

    A single-channel phoswich well detector has been assessed and analysed in order to improve the beta-gamma coincidence measurement sensitivity for (131m)Xe and (133m)Xe. This newly designed phoswich well detector consists of a plastic cell (BC-404) embedded in a CsI(Tl) crystal coupled to a photomultiplier tube (PMT). It can be used to distinguish the 30.0-keV X-ray signals of (131m)Xe and (133m)Xe using their unique coincidence signatures between the conversion electrons (CEs) and the 30.0-keV X-rays. The optimum coincidence efficiency signal depends on the energy resolutions of the two CE peaks, which could be affected by the position of the plastic cell relative to the CsI(Tl), because the embedded plastic cell interrupts the scintillation light path from the CsI(Tl) crystal to the PMT. In this study, several relative positions between the embedded plastic cell and the CsI(Tl) crystal have been evaluated using Monte Carlo modeling for their effects on coincidence detection efficiency and on X-ray and CE energy resolutions. The results indicate that the energy resolution and beta-gamma coincidence counting efficiency of the X-rays and CEs depend significantly on the plastic cell location inside the CsI(Tl). The degradation of the X-ray and CE peak energy resolutions, caused by the deterioration in light collection efficiency due to the embedded cell, can be minimised. Optimal CE and X-ray energy resolution and beta-gamma coincidence efficiency, as well as ease of manufacturing, could be achieved by varying the embedded plastic cell position inside the CsI(Tl) and consequently selecting the most efficient geometry. PMID:20598559

  17. Physical Modelling of Sedimentary Basin

    SciTech Connect

    Yuen, David A.

    2003-04-24

    The main goals of the first three years have been achieved, i.e., the development of particle-based and continuum-based algorithms for cross-scale/up-scale analysis of complex fluid flows. The U. Minnesota team has focused on particle-based methods, wavelets (Rustad et al., 2001) and visualization and has had great success with the dissipative and fluid particle dynamics algorithms, as applied to colloidal, polymeric and biological systems, wavelet filtering and visualization endeavors. We have organized two sessions in nonlinear geophysics at the A.G.U. Fall Meeting (2000, 2002), which have indeed synergetically stimulated the community and promoted cross-disciplinary efforts in the geosciences. The LANL team has succeeded with continuum-based algorithms, in particular, fractal interpolating functions (fif). These have been applied to 1-D flow and transport equations (Travis, 2000; 2002) as a proof of principle, providing solutions that capture dynamics at all scales. In addition, the fif representations can be integrated to provide sub-grid-scale homogenization, which can be used in more traditional finite difference or finite element solutions of porous flow and transport. Another useful tool for fluid flow problems is the ability to solve inverse problems, that is, given present-time observations of a fluid flow, what was the initial state of that fluid system? We have demonstrated this capability for a large-scale problem of 3-D flow in the Earth's crust (Bunge, Hagelberg & Travis, 2002). Use of the adjoint method for sensitivity analysis (Marchuk, 1995) to compute derivatives of models makes the large-scale inversion feasible in 4-D, i.e., space and time. Further, a framework for simulating complex fluid flow in the Earth's crust has been implemented (Dutrow et al, 2001). The remaining task of the first three-year campaign is to extend the implementation of the fif formalism to our 2-D and 3-D computer codes, which is straightforward, but involved.

  18. Waste Feed Evaporation Physical Properties Modeling

    SciTech Connect

    Daniel, W.E.

    2003-08-25

    This document describes the waste feed evaporator modeling work done in the Waste Feed Evaporation and Physical Properties Modeling test specification and in support of the Hanford River Protection Project (RPP) Waste Treatment Plant (WTP) project. A private database (ZEOLITE) was developed and used in this work in order to include the behavior of aluminosilicates such as NAS-gel in the OLI/ESP simulations, in addition to the development of the mathematical models. Mathematical models were developed that describe certain physical properties in the Hanford RPP-WTP waste feed evaporator process (FEP). In particular, models were developed for the feed stream to the first ultra-filtration step characterizing its heat capacity, thermal conductivity, and viscosity, as well as the density of the evaporator contents. The scope of the task was expanded to include the volume reduction factor across the waste feed evaporator (total evaporator feed volume/evaporator bottoms volume). All the physical properties were modeled as functions of the waste feed composition, temperature, and the high level waste recycle volumetric flow rate relative to that of the waste feed. The goal for the mathematical models was to reproduce the physical property values predicted by the simulation. The simulation model approximating the FEP process used to develop the correlations was relatively complex, and not possible to duplicate within the scope of the bench scale evaporation experiments. Therefore, simulants were made of 13 design points (a subset of the points used in the model fits) using the compositions of the ultra-filtration feed streams as predicted by the simulation model. The chemistry and physical properties of the supernate (the modeled stream) as predicted by the simulation were compared with the analytical results of experimental simulant work as a method of validating the simulation software.

  19. SU-E-T-289: Scintillating Fiber Based In-Vivo Dose Monitoring System to the Rectum in Proton Therapy of Prostate Cancer: A Geant4 Monte Carlo Simulation

    SciTech Connect

    Tesfamicael, B; Gueye, P; Lyons, D; Mahesh, M; Avery, S

    2014-06-01

    Purpose: To construct a dose monitoring system based on an endorectal balloon coupled to thin scintillating fibers to study the dose delivered to the rectum during prostate cancer proton therapy. Methods: The Geant4 Monte Carlo toolkit version 9.6p02 was used to simulate prostate cancer proton therapy treatments of an endorectal balloon (for immobilization of a 2.9 cm diameter prostate gland) and a set of 34 scintillating fibers symmetrically placed around the balloon and perpendicular to the proton beam direction (for dosimetry measurements). Results: A linear response of the fibers to the dose delivered was observed within <2%, a property that makes them good candidates for real time dosimetry. Results obtained show that the closest fiber recorded about 1/3 of the dose to the target, with a 1/r² decrease in the dose distribution as one goes toward the frontal and distal top fibers. Very low dose was recorded by the bottom fibers (about 45 times lower), which is a clear indication that the overall volume of the rectal wall that is exposed to a higher dose is relatively minimized. Further analysis indicated a simple scaling relationship between the dose to the prostate and the dose to the top fibers (a linear fit gave a slope of −0.07±0.07 MeV per treatment Gy). Conclusion: Thin (1 mm × 1 mm × 100 cm) long scintillating fibers were found to be ideal for real time in-vivo dose measurement to the rectum for prostate cancer proton therapy. The linear response of the fibers to the dose delivered makes them good candidates as dosimeters. With thorough calibration and the ability to define a good correlation between the dose to the target and the dose to the fibers, such dosimeters can be used for real time dose verification to the target.
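
    The two behaviours reported above, a roughly 1/r² falloff of fibre signal with distance from the target and a linear fibre response that can be calibrated against the target dose, can be illustrated with the short sketch below. The radii, readings and fitted coefficients are invented, not the simulated values.

```python
import numpy as np

# 1/r^2 falloff of fibre signal with distance, plus a linear calibration fit of
# fibre reading against target dose. All numbers are illustrative.
radii_cm = np.array([1.5, 2.0, 3.0, 4.0])
relative_signal = (radii_cm[0] / radii_cm) ** 2        # expected 1/r^2 scaling

target_dose_gy = np.array([1.0, 2.0, 3.0, 4.0])
fibre_reading = 0.33 * target_dose_gy + 0.01           # assumed linear response

slope, intercept = np.polyfit(target_dose_gy, fibre_reading, 1)
print("relative signal vs radius:", np.round(relative_signal, 2))
print(f"calibration fit: reading = {slope:.2f} * dose + {intercept:.2f}")
```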

  20. Dosimetry for electron Intra-Operative RadioTherapy: Comparison of output factors obtained through alanine/EPR pellets, ionization chamber and Monte Carlo-GEANT4 simulations for IORT mobile dedicate accelerator

    NASA Astrophysics Data System (ADS)

    Marrale, Maurizio; Longo, Anna; Russo, Giorgio; Casarino, Carlo; Candiano, Giuliana; Gallo, Salvatore; Carlino, Antonio; Brai, Maria

    2015-09-01

    In this work a comparison between the responses of alanine dosimeters and a Markus ionization chamber was carried out for measurements of the output factors (OF) of electron beams produced by a linear accelerator used for Intra-Operative Radiation Therapy (IORT). Output factors for conventional high-energy electron beams are normally measured using an ionization chamber according to international dosimetry protocols. However, the electron beams used in IORT have characteristics of dose per pulse, energy spectrum and angular distribution quite different from the beams usually used in external radiotherapy, so the direct application of international dosimetry protocols may introduce additional uncertainties in dosimetric determinations. The high dose per pulse could lead to inaccuracy in dose measurements with an ionization chamber, due to overestimation of the ks recombination factor. Furthermore, the electron fields obtained with IORT-dedicated applicators have a wider energy spectrum and a wider angular distribution than conventional fields, due to the presence of electrons scattered by the applicator's wall. For this reason, a dosimetry system should be characterized by a minimal dependence on the beam energy and on the angle of incidence of the electrons. This becomes particularly critical for small and bevelled applicators. All of these reasons motivate investigating detectors other than the ionization chamber for measuring the OFs. Furthermore, a complete characterization of the radiation field can also be accomplished by the use of Monte Carlo simulations, which allow detailed information on dose distributions to be obtained. In this work we compare the output factors obtained by means of alanine dosimeters and a Markus ionization chamber. The comparison is completed by Monte Carlo calculations of the OFs determined through the use of the Geant4 application "iort_therapy". The results are characterized by a good agreement between the responses of the alanine pellets and the Markus ionization chamber.

  1. PIXE simulation: Models, methods and technologies

    SciTech Connect

    Batic, M.; Pia, M. G.; Saracco, P.; Weidenspointner, G.

    2013-04-19

    The simulation of PIXE (Particle Induced X-ray Emission) is discussed in the context of general-purpose Monte Carlo systems for particle transport. Dedicated PIXE codes are mainly concerned with the application of the technique to elemental analysis, but they lack the capability of dealing with complex experimental configurations. General-purpose Monte Carlo codes provide powerful tools to model the experimental environment in great detail, but so far they have provided limited functionality for PIXE simulation. This paper reviews recent developments that have endowed the Geant4 simulation toolkit with advanced capabilities for PIXE simulation, and related efforts for quantitative validation of cross sections and other physical parameters relevant to PIXE simulation.

  2. Simplified Models for LHC New Physics Searches

    SciTech Connect

    Alves, Daniele; Arkani-Hamed, Nima; Arora, Sanjay; Bai, Yang; Baumgart, Matthew; Berger, Joshua; Buckley, Matthew; Butler, Bart; Chang, Spencer; Cheng, Hsin-Chia; Cheung, Clifford; Chivukula, R. Sekhar; Cho, Won Sang; Cotta, Randy; D'Alfonso, Mariarosaria; El Hedri, Sonia; Essig, Rouven; Evans, Jared A.; Fitzpatrick, Liam; Fox, Patrick; Franceschini, Roberto; /more authors..

    2012-06-01

    This document proposes a collection of simplified models relevant to the design of new-physics searches at the LHC and the characterization of their results. Both ATLAS and CMS have already presented some results in terms of simplified models, and we encourage them to continue and expand this effort, which supplements both signature-based results and benchmark model interpretations. A simplified model is defined by an effective Lagrangian describing the interactions of a small number of new particles. Simplified models can equally well be described by a small number of masses and cross-sections. These parameters are directly related to collider physics observables, making simplified models a particularly effective framework for evaluating searches and a useful starting point for characterizing positive signals of new physics. This document serves as an official summary of the results from the 'Topologies for Early LHC Searches' workshop, held at SLAC in September of 2010, the purpose of which was to develop a set of representative models that can be used to cover all relevant phase space in experimental searches. Particular emphasis is placed on searches relevant for the first ~50-500 pb-1 of data and those motivated by supersymmetric models. This note largely summarizes material posted at http://lhcnewphysics.org/, which includes simplified model definitions, Monte Carlo material, and supporting contacts within the theory community. We also comment on future developments that may be useful as more data is gathered and analyzed by the experiments.

  3. Simplified models for LHC new physics searches

    NASA Astrophysics Data System (ADS)

    Alves, Daniele; Arkani-Hamed, Nima; Arora, Sanjay; Bai, Yang; Baumgart, Matthew; Berger, Joshua; Buckley, Matthew; Butler, Bart; Chang, Spencer; Cheng, Hsin-Chia; Cheung, Clifford; Sekhar Chivukula, R.; Cho, Won Sang; Cotta, Randy; D'Alfonso, Mariarosaria; El Hedri, Sonia; Essig, Rouven (Editor); Evans, Jared A.; Fitzpatrick, Liam; Fox, Patrick; Franceschini, Roberto; Freitas, Ayres; Gainer, James S.; Gershtein, Yuri; Gray, Richard; Gregoire, Thomas; Gripaios, Ben; Gunion, Jack; Han, Tao; Haas, Andy; Hansson, Per; Hewett, JoAnne; Hits, Dmitry; Hubisz, Jay; Izaguirre, Eder; Kaplan, Jared; Katz, Emanuel; Kilic, Can; Kim, Hyung-Do; Kitano, Ryuichiro; Koay, Sue Ann; Ko, Pyungwon; Krohn, David; Kuflik, Eric; Lewis, Ian; Lisanti, Mariangela (Editor); Liu, Tao; Liu, Zhen; Lu, Ran; Luty, Markus; Meade, Patrick; Morrissey, David; Mrenna, Stephen; Nojiri, Mihoko; Okui, Takemichi; Padhi, Sanjay; Papucci, Michele; Park, Michael; Park, Myeonghun; Perelstein, Maxim; Peskin, Michael; Phalen, Daniel; Rehermann, Keith; Rentala, Vikram; Roy, Tuhin; Ruderman, Joshua T.; Sanz, Veronica; Schmaltz, Martin; Schnetzer, Stephen; Schuster, Philip (Editor); Schwaller, Pedro; Schwartz, Matthew D.; Schwartzman, Ariel; Shao, Jing; Shelton, Jessie; Shih, David; Shu, Jing; Silverstein, Daniel; Simmons, Elizabeth; Somalwar, Sunil; Spannowsky, Michael; Spethmann, Christian; Strassler, Matthew; Su, Shufang; Tait, Tim (Editor); Thomas, Brooks; Thomas, Scott; Toro, Natalia (Editor); Volansky, Tomer; Wacker, Jay (Editor); Waltenberger, Wolfgang; Yavin, Itay; Yu, Felix; Zhao, Yue; Zurek, Kathryn; LHC New Physics Working Group

    2012-10-01

    This document proposes a collection of simplified models relevant to the design of new-physics searches at the Large Hadron Collider (LHC) and the characterization of their results. Both ATLAS and CMS have already presented some results in terms of simplified models, and we encourage them to continue and expand this effort, which supplements both signature-based results and benchmark model interpretations. A simplified model is defined by an effective Lagrangian describing the interactions of a small number of new particles. Simplified models can equally well be described by a small number of masses and cross-sections. These parameters are directly related to collider physics observables, making simplified models a particularly effective framework for evaluating searches and a useful starting point for characterizing positive signals of new physics. This document serves as an official summary of the results from the ‘Topologies for Early LHC Searches’ workshop, held at SLAC in September of 2010, the purpose of which was to develop a set of representative models that can be used to cover all relevant phase space in experimental searches. Particular emphasis is placed on searches relevant for the first ˜50-500 pb-1 of data and those motivated by supersymmetric models. This note largely summarizes material posted at http://lhcnewphysics.org/, which includes simplified model definitions, Monte Carlo material, and supporting contacts within the theory community. We also comment on future developments that may be useful as more data is gathered and analyzed by the experiments.

  4. Model reduction in the physical coordinate system

    NASA Technical Reports Server (NTRS)

    Yae, K. Harold; Joeng, K. Y.

    1989-01-01

    In the dynamics modeling of a flexible structure, finite element analysis employs reduction techniques, such as Guyan's reduction, to remove some of the insignificant physical coordinates, thus producing a dynamics model that has smaller mass and stiffness matrices. But this reduction is limited in the sense that it removes certain degrees of freedom at the node points themselves in the model. From the standpoint of linear control design, the resultant model is still too large despite the reduction. Thus, some form of model reduction is frequently used in control design, approximating a large dynamical system with a smaller number of state variables. However, a problem arises from the placement of sensors and actuators in the reduced model, because a model usually undergoes, before being reduced, some form of coordinate transformation that does not preserve the physical meanings of the states. To correct such a problem, a method is developed that expresses a reduced model in terms of a subset of the original states. The proposed method starts with a dynamic model that is originated and reduced in finite element analysis. Then the model is converted to state space form, and reduced again by the internal balancing method. At this point, being in the balanced coordinate system, the states in the reduced model have no apparent resemblance to those of the original model. Through another coordinate transformation that is developed, however, this reduced model is expressed by a subset of the original states.
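
    For reference, the internal-balancing step mentioned above can be sketched for a small stable state-space model using Gramian-based balancing, as below. This is the standard square-root algorithm, not the authors' procedure for re-expressing the reduced model in the original physical states; the example system is arbitrary.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, cholesky, svd

# Minimal balanced-realization sketch for a stable LTI system (A, B, C).
A = np.array([[-0.5, 1.0], [-1.0, -0.5]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])

Wc = solve_continuous_lyapunov(A, -B @ B.T)        # controllability Gramian
Wo = solve_continuous_lyapunov(A.T, -C.T @ C)      # observability Gramian

Lc = cholesky(Wc, lower=True)                      # Wc = Lc Lc^T
U, s, _ = svd(Lc.T @ Wo @ Lc)                      # s holds squared Hankel SVs
print("Hankel singular values:", np.round(np.sqrt(s), 3))

# Balancing transformation T: x = T x_balanced; both Gramians become diag(sqrt(s)).
T = Lc @ U @ np.diag(s ** -0.25)
A_bal = np.linalg.solve(T, A @ T)
print("balanced A:\n", np.round(A_bal, 3))
```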

  5. A physical analogue of the Schelling model

    NASA Astrophysics Data System (ADS)

    Vinković, Dejan; Kirman, Alan

    2006-12-01

    We present a mathematical link between Schelling's socio-economic model of segregation and the physics of clustering. We replace the economic concept of "utility" by the physics concept of a particle's internal energy. As a result cluster dynamics is driven by the "surface tension" force. The resultant segregated areas can be very large and can behave like spherical "liquid" droplets or as a collection of static clusters in "frozen" form. This model will hopefully provide a useful framework for studying many spatial economic phenomena that involve individuals making location choices as a function of the characteristics and choices of their neighbors.
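
    A bare-bones Schelling-type simulation, before the energy/surface-tension reinterpretation proposed in the paper, is sketched below: two agent types on a periodic grid, with an agent relocating to a random empty cell when the fraction of like neighbours drops below a tolerance. Grid size, vacancy fraction and tolerance are arbitrary choices.

```python
import random

# Two agent types on a periodic grid; an unhappy agent (too few like neighbours)
# moves to a randomly chosen empty cell. Parameters are arbitrary choices.
random.seed(1)
N, vacancy, tolerance = 30, 0.1, 0.5
cells = [(i, j) for i in range(N) for j in range(N)]
grid = {}
for c in cells:
    r = random.random()
    grid[c] = None if r < vacancy else (0 if r < (1 + vacancy) / 2 else 1)

def like_fraction(pos):
    i, j = pos
    neigh = [grid[((i + di) % N, (j + dj) % N)]
             for di in (-1, 0, 1) for dj in (-1, 0, 1) if (di, dj) != (0, 0)]
    occupied = [n for n in neigh if n is not None]
    return 1.0 if not occupied else occupied.count(grid[pos]) / len(occupied)

for _ in range(50):                       # relocation sweeps
    movers = [c for c in cells if grid[c] is not None and like_fraction(c) < tolerance]
    empties = [c for c in cells if grid[c] is None]
    random.shuffle(movers)
    for agent in movers:
        if not empties:
            break
        dest = empties.pop(random.randrange(len(empties)))
        grid[dest], grid[agent] = grid[agent], None
        empties.append(agent)

agents = [c for c in cells if grid[c] is not None]
happy = sum(like_fraction(c) >= tolerance for c in agents)
print(f"satisfied agents after 50 sweeps: {happy}/{len(agents)}")
```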

  6. Waste glass melter numerical and physical modeling

    SciTech Connect

    Eyler, L.L.; Peters, R.D.; Lessor, D.L.; Lowery, P.S.; Elliott, M.L.

    1991-10-01

    Results of physical and numerical simulation modeling of high-level liquid waste vitrification melters are presented. Physical modeling uses simulant fluids in laboratory testing. Visualization results provide insight into convective melt flow patterns from which information is derived to support performance estimation of operating melters and data to support numerical simulation. Numerical simulation results of several melter configurations are presented. These are in support of programs to evaluate melter operation characteristics and performance. Included are investigations into power skewing and alternating current electric field phase angle in a dual electrode pair reference design and bi-modal convective stability in an advanced design. 9 refs., 9 figs., 1 tab.

  7. Transmission calculation by empirical numerical model and Monte Carlo simulation in high energy proton radiography of thick objects

    NASA Astrophysics Data System (ADS)

    Zheng, Na; Xu, Hai-Bo

    2015-10-01

    An empirical numerical model that includes nuclear absorption, multiple Coulomb scattering and energy loss is presented for the calculation of transmission through thick objects in high energy proton radiography. In this numerical model the angular distributions are treated as Gaussians in the laboratory frame. A Monte Carlo program based on the Geant4 toolkit was developed and used for high energy proton radiography experiment simulations and verification of the empirical numerical model. The two models are used to calculate the transmission fraction of carbon and lead step-wedges in proton radiography at 24 GeV/c, and to calculate radial transmission of the French Test Object in proton radiography at 24 GeV/c with different angular cuts. It is shown that the results of the two models agree with each other, and an analysis of the slight differences is given. Supported by NSAF (11176001) and Science and Technology Developing Foundation of China Academy of Engineering Physics (2012A0202006)
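
    A minimal sketch of the kind of estimate such an empirical transmission model makes is shown below: nuclear attenuation multiplied by the fraction of a Gaussian multiple-Coulomb-scattering cone accepted by an angular cut, with the scattering width taken from the standard Highland parameterisation. The material constants and the 5 mrad cut are illustrative placeholder values, not the paper's fitted parameters, and the energy-loss term is omitted for brevity.

```python
import numpy as np

def transmission(thickness_cm, X0_cm, lambda_nuc_cm, p_MeV, beta, theta_cut_mrad):
    """Proton transmission through a slab: nuclear absorption times the fraction of
    the (assumed Gaussian) multiple-Coulomb-scattering cone inside an angular cut."""
    t = thickness_cm / X0_cm
    # Highland formula for the projected scattering-angle width (radians)
    theta0 = (13.6 / (beta * p_MeV)) * np.sqrt(t) * (1.0 + 0.038 * np.log(t))
    # For a 2-D Gaussian in (theta_x, theta_y), the fraction inside a polar-angle cut:
    accepted = 1.0 - np.exp(-(theta_cut_mrad * 1e-3) ** 2 / (2.0 * theta0 ** 2))
    absorbed = np.exp(-thickness_cm / lambda_nuc_cm)
    return absorbed * accepted

# Illustrative numbers for 24 GeV/c protons on a lead step (placeholder constants)
print(transmission(thickness_cm=5.0, X0_cm=0.5612, lambda_nuc_cm=17.59,
                   p_MeV=24_000.0, beta=1.0, theta_cut_mrad=5.0))
```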

  8. Topos models for physics and topos theory

    SciTech Connect

    Wolters, Sander

    2014-08-15

    What is the role of topos theory in the topos models for quantum theory as used by Isham, Butterfield, Döring, Heunen, Landsman, Spitters, and others? In other words, what is the interplay between physical motivation for the models and the mathematical framework used in these models? Concretely, we show that the presheaf topos model of Butterfield, Isham, and Döring resembles classical physics when viewed from the internal language of the presheaf topos, similar to the copresheaf topos model of Heunen, Landsman, and Spitters. Both the presheaf and copresheaf models provide a “quantum logic” in the form of a complete Heyting algebra. Although these algebras are natural from a topos theoretic stance, we seek a physical interpretation for the logical operations. Finally, we investigate dynamics. In particular, we describe how an automorphism on the operator algebra induces a homeomorphism (or isomorphism of locales) on the associated state spaces of the topos models, and how elementary propositions and truth values transform under the action of this homeomorphism. Also with dynamics the focus is on the internal perspective of the topos.

  9. Mental Models in Expert Physics Reasoning.

    ERIC Educational Resources Information Center

    Roschelle, Jeremy; Greeno, James G.

    Proposed is a relational framework for characterizing experienced physicists' representations of physics problem situations and the process of constructing these representations. A representation includes a coherent set of relations among: (1) a mental model of the objects in the situation, along with their relevant properties and relations; (2) a…

  10. Mathematical and physical modelling of materials processing

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Mathematical and physical modeling of turbulence phenomena in metals processing, electromagnetically driven flows in materials processing, gas-solid reactions, rapid solidification processes, the electroslag casting process, the role of cathodic depolarizers in the corrosion of aluminum in sea water, and predicting viscoelastic flows are described.

  11. Dilution physics modeling: Dissolution/precipitation chemistry

    SciTech Connect

    Onishi, Y.; Reid, H.C.; Trent, D.S.

    1995-09-01

    This report documents progress made to date on integrating dilution/precipitation chemistry and new physical models into the TEMPEST thermal-hydraulics computer code. Implementation of dissolution/precipitation chemistry models is necessary for predicting nonhomogeneous, time-dependent, physical/chemical behavior of tank wastes with and without a variety of possible engineered remediation and mitigation activities. Such behavior includes chemical reactions, gas retention, solids resuspension, solids dissolution and generation, solids settling/rising, and convective motion of physical and chemical species. Thus this model development is important from the standpoint of predicting the consequences of various engineered activities, such as mitigation by dilution, retrieval, or pretreatment, that can affect safe operations. The integration of a dissolution/precipitation chemistry module allows the various phase species concentrations to enter into the physical calculations that affect the TEMPEST hydrodynamic flow calculations. The yield strength model of non-Newtonian sludge correlates yield to a power function of solids concentration. Likewise, shear stress is concentration-dependent, and the dissolution/precipitation chemistry calculations develop the species concentration evolution that produces fluid flow resistance changes. Dilution of waste with pure water, molar concentrations of sodium hydroxide, and other chemical streams can be analyzed for the reactive species changes and hydrodynamic flow characteristics.
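
    As a small illustration of the concentration-dependent rheology described above, the sketch below assumes a power-law yield stress and a Bingham-type total shear stress; the coefficients k, n, and mu_p are hypothetical, not values used in TEMPEST.

```python
def yield_stress(solids_frac, k=1.5, n=3.0):
    """Power-law yield stress (Pa) vs. solids volume fraction; k and n are hypothetical."""
    return k * solids_frac ** n

def shear_stress(solids_frac, shear_rate, mu_p=0.05):
    """Bingham-type stress: concentration-dependent yield plus a plastic-viscosity term."""
    return yield_stress(solids_frac) + mu_p * shear_rate

# As dissolution lowers the solids fraction, the flow resistance drops accordingly:
for c in (0.30, 0.20, 0.10):
    print(c, shear_stress(c, shear_rate=10.0))
```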

  12. Physical models for classroom teaching in hydrology

    NASA Astrophysics Data System (ADS)

    Rodhe, A.

    2012-09-01

    Hydrology teaching benefits from the fact that many important processes can be illustrated and explained with simple physical models. A set of mobile physical models has been developed and used during many years of lecturing at basic university level teaching in hydrology. One model, with which many phenomena can be demonstrated, consists of a 1.0-m-long plexiglass container containing an about 0.25-m-deep open sand aquifer through which water is circulated. The model can be used for showing the groundwater table and its influence on the water content in the unsaturated zone and for quantitative determination of hydraulic properties such as the storage coefficient and the saturated hydraulic conductivity. It is also well suited for discussions on the runoff process and the significance of recharge and discharge areas for groundwater. The flow paths of water and contaminant dispersion can be illustrated in tracer experiments using fluorescent or colour dye. This and a few other physical models, with suggested demonstrations and experiments, are described in this article. The finding from using models in classroom teaching is that it creates curiosity among the students, promotes discussions and most likely deepens the understanding of the basic processes.
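
    As an example of the quantitative use mentioned above, the saturated hydraulic conductivity can be estimated from a constant-head Darcy experiment on the sand tank. The sketch below assumes Darcy's law with hypothetical classroom readings; the flow rate, head drop, and tank cross-section are illustrative numbers, not measurements from the model described.

```python
def hydraulic_conductivity(Q_m3_per_s, area_m2, head_drop_m, length_m):
    """Darcy's law for a constant-head experiment: Q = K * A * (dh / L)."""
    return Q_m3_per_s * length_m / (area_m2 * head_drop_m)

# Hypothetical readings from a 1.0 m long, 0.25 m deep, 0.1 m wide sand tank
Q = 2.0e-6          # m^3/s collected at the outflow
A = 0.25 * 0.1      # m^2 cross-section perpendicular to flow
K = hydraulic_conductivity(Q, A, head_drop_m=0.05, length_m=1.0)
print(f"K = {K:.2e} m/s")   # ~1.6e-3 m/s, a plausible value for coarse sand
```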

  13. Transforming teacher knowledge: Modeling instruction in physics

    NASA Astrophysics Data System (ADS)

    Cabot, Lloyd H.

    I show that the Modeling physics curriculum is readily accommodated by most teachers in favor of traditional didactic pedagogies. This is so, at least in part, because Modeling focuses on a small set of connected models embedded in a self-consistent theoretical framework and thus is closely congruent with human cognition in this context, which is to generate mental models of physical phenomena as both predictive and explanatory devices. Whether a teacher fully implements the Modeling pedagogy depends on the depth of the teacher's commitment to inquiry-based instruction, specifically Modeling instruction, as a means of promoting student understanding of Newtonian mechanics. Moreover, this commitment trumps all other characteristics: teacher educational background, content coverage issues, student achievement data, district or state learning standards, and district or state student assessments. Indeed, distinctive differences exist in how Modeling teachers deliver their curricula and some teachers are measurably more effective than others in their delivery, but they all share an unshakable belief in the efficacy of inquiry-based, constructivist-oriented instruction. The Modeling Workshops' pedagogy, duration, and social interactions impact teachers' self-identification as members of a professional community. Finally, I discuss the consequences my research may have for the Modeling Instruction program designers and for designers of professional development programs generally.

  14. Investigations of physical model of biological tissue

    NASA Astrophysics Data System (ADS)

    Linkov, Kirill G.; Kisselev, Gennady L.; Loschenov, Victor B.

    1996-12-01

    A physical model of biological tissue was created and investigated for comparison with a previously developed mathematical model of biological tissue and for studies of photosensitizer distribution with depth. The mathematical model is based on a granulated representation of the optical medium. The physical tissue model was built from sufficiently thin layers of a special material. Laser sources of various wavelengths were used for fluorescence excitation, and the laser-fiber spectrum analyzer LESA-5 was used to record the scattering and fluorescence signals. An aqueous solution of aluminum phthalocyanine and an oil solution of zinc phthalocyanine were used to obtain the fluorescence signal. The fabricated samples have well-defined absorbing and fluorescent properties, and their scattering properties are close to those of real human skin. Owing to its layered structure, the model can simulate both tissue without photosensitizer accumulation and tissue with a prescribed photosensitizer depth distribution. The dependence of the surface field distribution on the model parameters was investigated, and substantial changes in the surface distribution with the model characteristics were revealed. The spatial and angular characteristics were also investigated. The results obtained with the physical model agree with the predictions of the theoretical model.

  15. A physical interpretation of hydrologic model complexity

    NASA Astrophysics Data System (ADS)

    Moayeri, MohamadMehdi; Pande, Saket

    2015-04-01

    It is intuitive that instability of hydrological system representation, in the sense of how perturbations in input forcings translate into perturbation in a hydrologic response, may depend on its hydrological characteristics. Responses of unstable systems are thus complex to model. We interpret complexity in this context and define complexity as a measure of instability in hydrological system representation. We provide algorithms to quantify model complexity in this context. We use Sacramento soil moisture accounting model (SAC-SMA) parameterized for MOPEX basins and quantify complexities of corresponding models. Relationships between hydrologic characteristics of MOPEX basins such as location, precipitation seasonality index, slope, hydrologic ratios, saturated hydraulic conductivity and NDVI and respective model complexities are then investigated. We hypothesize that complexities of basin specific SAC-SMA models correspond to aforementioned hydrologic characteristics, thereby suggesting that model complexity, in the context presented here, may have a physical interpretation.
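
    One plausible reading of such an instability measure, sketched below, is the average ratio of the perturbation in simulated discharge to the perturbation applied to the precipitation forcing; a toy linear-reservoir model stands in for SAC-SMA, and the forcing, perturbation size, and recession constant are all hypothetical.

```python
import numpy as np

def linear_reservoir(precip, k=0.2):
    """Toy rainfall-runoff model: storage S, discharge q = k * S at each step."""
    s, q = 0.0, []
    for p in precip:
        s += p
        out = k * s
        s -= out
        q.append(out)
    return np.array(q)

def complexity(model, forcing, eps=0.01, n_trials=50, seed=0):
    """Instability proxy: mean ratio of output perturbation to input perturbation."""
    rng = np.random.default_rng(seed)
    base = model(forcing)
    ratios = []
    for _ in range(n_trials):
        delta = eps * rng.standard_normal(len(forcing))
        pert = model(np.clip(forcing + delta, 0.0, None))
        ratios.append(np.linalg.norm(pert - base) / np.linalg.norm(delta))
    return float(np.mean(ratios))

precip = np.abs(np.random.default_rng(1).gamma(2.0, 2.0, size=365))  # synthetic forcing
print(complexity(linear_reservoir, precip))
```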

  16. Service Learning In Physics: The Consultant Model

    NASA Astrophysics Data System (ADS)

    Guerra, David

    2005-04-01

    Each year thousands of students across the country and across the academic disciplines participate in service learning. Unfortunately, with no clear model for integrating community service into the physics curriculum, there are very few physics students engaged in service learning. To overcome this shortfall, a consultant-based service-learning program has been developed and successfully implemented at Saint Anselm College (SAC). As consultants, students in upper-level physics courses apply their problem-solving skills in the service of others. Most recently, SAC students provided technical and managerial support to a group from Girls Inc., a national empowerment program for girls in high-risk, underserved areas, who were participating in the national FIRST Lego League Robotics competition. In their role as consultants, the SAC students provided technical information through brainstorming sessions and helped the girls stay on task with project management techniques, like milestone charting. This consultant model of service-learning provides technical support to groups that may not have a great deal of resources and gives physics students a way to improve their interpersonal skills, test their technical expertise, and better define the marketable skill set they are developing through the physics curriculum.

  17. Particle radiation transport and effects models from research to space weather operations

    NASA Astrophysics Data System (ADS)

    Santin, Giovanni; Nieminen, Petteri; Rivera, Angela; Ibarmia, Sergio; Truscott, Pete; Lei, Fan; Desorgher, Laurent; Ivanchenko, Vladimir; Kruglanski, Michel; Messios, Neophytos

    Assessment of risk from potential radiation-induced effects to space systems requires knowledge of both the conditions of the radiation environment and of the impact of radiation on sensitive spacecraft elements. During sensitivity analyses, test data are complemented by models to predict how external radiation fields are transported and modified in spacecraft materials. Radiation transport is still itself a subject of research and models are continuously improved to describe the physical interactions that take place when particles pass through shielding materials or hit electronic systems or astronauts, sometimes down to nanometre-scale interactions of single particles with deep sub-micron technologies or DNA structures. In recent years, though, such radiation transport models are transitioning from being a research subject by itself, to being widely used in the space engineering domain and finally being directly applied in the context of operation of space weather services. A significant "research to operations" (R2O) case is offered by Geant4, an open source toolkit initially developed and used in the context of fundamental research in high energy physics. Geant4 is also being used in the space domain, e.g. for modelling detector responses in science payloads, but also for studying the radiation environment itself, with subjects ranging from cosmic rays, to solar energetic particles in the heliosphere, to geomagnetic shielding. Geant4-based tools are now becoming more and more integrated in spacecraft design procedures, also through user friendly interfaces such as SPENVIS. Some examples are given by MULASSIS, offering multi-layered shielding analysis capabilities in realistic spacecraft materials, or GEMAT, focused on micro-dosimetry in electronics, or PLANETOCOSMICS, describing the interaction of the space environment with planetary magneto- and atmospheres, or GRAS, providing a modular and easy to use interface to various analysis types in simple or

  18. Full-waveform modeling and inversion of physical model data

    NASA Astrophysics Data System (ADS)

    Cai, Jian; Zhang, Jie

    2016-08-01

    Because full elastic waveform inversion requires considerable computation time for forward modeling and inversion, acoustic waveform inversion is often applied to marine data for reducing the computational time. To understand the validity of the acoustic approximation, we study data collected from an ultrasonic laboratory with a known physical model by applying elastic and acoustic waveform modeling and acoustic waveform inversion. This study enables us to evaluate waveform differences quantitatively between synthetics and real data from the same physical model and to understand the effects of different objective functions in addressing the waveform differences for full-waveform inversion. Because the materials used in the physical experiment are viscoelastic, we find that both elastic and acoustic synthetics differ substantially from the physical data over offset in true amplitude. If attenuation is taken into consideration, the amplitude versus offset (AVO) of viscoelastic synthetics more closely approximates the physical data. To mitigate the effect of amplitude differences, we apply trace normalization to both synthetics and physical data in acoustic full-waveform inversion. The objective function is equivalent to minimizing the phase differences with indirect contributions from the amplitudes. We observe that trace normalization helps to stabilize the inversion and obtain more accurate model solutions for both synthetics and physical data.
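
    A minimal sketch of the trace-normalization idea is shown below, assuming each trace is scaled to unit L2 norm before forming a least-squares misfit, so the objective emphasises phase rather than absolute amplitude; the synthetic gathers are random placeholders, not the ultrasonic laboratory data.

```python
import numpy as np

def normalize_traces(d):
    """Scale each trace (row) to unit L2 norm so the misfit emphasises phase, not amplitude."""
    norms = np.linalg.norm(d, axis=1, keepdims=True)
    return d / np.where(norms > 0.0, norms, 1.0)

def misfit(synthetic, observed):
    """Least-squares objective on trace-normalized data."""
    r = normalize_traces(synthetic) - normalize_traces(observed)
    return 0.5 * np.sum(r ** 2)

# Placeholder shot gathers of shape (n_traces, n_samples)
rng = np.random.default_rng(0)
obs = rng.standard_normal((48, 1000))
syn = 3.7 * obs + 0.1 * rng.standard_normal((48, 1000))  # amplitude mismatch + noise
print(misfit(syn, obs))  # small, because the overall amplitude factor is divided out
```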

  19. Modelling Students' Construction of Energy Models in Physics.

    ERIC Educational Resources Information Center

    Devi, Roshni; And Others

    1996-01-01

    Examines students' construction of experimentation models for physics theories in energy storage, transformation, and transfers involving electricity and mechanics. Student problem solving dialogs and artificial intelligence modeling of these processes is analyzed. Construction of models established relations between elements with linear causal…

  20. Physics Beyond the Standard Model: Supersymmetry

    SciTech Connect

    Nojiri, M.M.; Plehn, T.; Polesello, G.; Alexander, John M.; Allanach, B.C.; Barr, Alan J.; Benakli, K.; Boudjema, F.; Freitas, A.; Gwenlan, C.; Jager, S.; /CERN /LPSC, Grenoble

    2008-02-01

    This collection of studies on new physics at the LHC constitutes the report of the supersymmetry working group at the Workshop 'Physics at TeV Colliders', Les Houches, France, 2007. They cover the wide spectrum of phenomenology in the LHC era, from alternative models and signatures to the extraction of relevant observables, the study of the MSSM parameter space and finally to the interplay of LHC observations with additional data expected on a similar time scale. The special feature of this collection is that while not each of the studies is explicitly performed together by theoretical and experimental LHC physicists, all of them were inspired by and discussed in this particular environment.

  1. Modeling quantum physics with machine learning

    NASA Astrophysics Data System (ADS)

    Lopez-Bezanilla, Alejandro; Arsenault, Louis-Francois; Millis, Andrew; Littlewood, Peter; von Lilienfeld, Anatole

    2014-03-01

    Machine Learning (ML) is a systematic way of inferring new results from sparse information. It directly allows for the resolution of computationally expensive sets of equations by making sense of accumulated knowledge, and it is therefore an attractive method for providing computationally inexpensive 'solvers' for some of the important systems of condensed matter physics. In this talk, a non-linear regression statistical model is introduced to demonstrate the utility of ML methods in solving quantum-physics-related problems, and is applied to the calculation of electronic transport in 1D channels. DOE contract number DE-AC02-06CH11357.

  2. Physics Beyond the Standard Model at Colliders

    NASA Astrophysics Data System (ADS)

    Matchev, Konstantin

    These lectures introduce the modern machinery used in searches and studies of new physics Beyond the Standard Model (BSM) at colliders. The first lecture provides an overview of the main simulation tools used in high energy physics, including automated parton-level calculators, general purpose event generators, detector simulators, etc. The second lecture is a brief introduction to low energy supersymmetry (SUSY) as a representative BSM paradigm. The third lecture discusses the main collider signatures of SUSY and methods for measuring the masses of new particles in events with missing energy.

  3. Monte Carlo investigation of the increased radiation deposition due to gold nanoparticles using kilovoltage and megavoltage photons in a 3D randomized cell model

    SciTech Connect

    Douglass, Michael; Bezak, Eva; Penfold, Scott

    2013-07-15

    Purpose: Investigation of increased radiation dose deposition due to gold nanoparticles (GNPs) using a 3D computational cell model during x-ray radiotherapy. Methods: Two GNP simulation scenarios were set up in Geant4: a single 400 nm diameter gold cluster randomly positioned in the cytoplasm and a 300 nm gold layer around the nucleus of the cell. Using an 80 kVp photon beam, the effect of GNP on the dose deposition in five modeled regions of the cell including cytoplasm, membrane, and nucleus was simulated. Two Geant4 physics lists were tested: the default Livermore and a custom-built Livermore/DNA hybrid physics list. 10⁶ particles were simulated, with 840 cells in the simulation. Each cell was randomly placed with random orientation and a diameter varying between 9 and 13 μm. A mathematical algorithm was used to ensure that none of the 840 cells overlapped. The energy dependence of the GNP physical dose enhancement effect was calculated by simulating the dose deposition in the cells with two energy spectra of 80 kVp and 6 MV. The contribution from Auger electrons was investigated by comparing the two GNP simulation scenarios while activating and deactivating atomic de-excitation processes in Geant4. Results: The physical dose enhancement ratio (DER) of GNP was calculated using the Monte Carlo model. The model has demonstrated that the DER depends on the amount of gold and the position of the gold cluster within the cell. Individual cell regions experienced statistically significant (p < 0.05) change in absorbed dose (DER between 1 and 10) depending on the type of gold geometry used. The DER resulting from gold clusters attached to the cell nucleus had the more significant effect of the two cases (DER ≈ 55). The DER value calculated at 6 MV was shown to be at least an order of magnitude smaller than the DER values calculated for the 80 kVp spectrum. Based on simulations, when 80 kVp photons are used, Auger electrons have a statistically insignificant (p
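
    One ingredient of the set-up described above is the placement of 840 non-overlapping cells with diameters between 9 and 13 μm. The sketch below shows a simple rejection-sampling version of that step plus the dose-enhancement-ratio bookkeeping; the scoring-volume size, the specific sampling scheme, and the dose arrays are placeholders, not the published Geant4 configuration.

```python
import numpy as np

rng = np.random.default_rng(0)
BOX = 200.0   # placeholder side length of the cubic scoring volume, in micrometres

def place_cells(n_cells=840, d_min=9.0, d_max=13.0, max_tries=500_000):
    """Rejection-sample non-overlapping spheres with diameters drawn in [d_min, d_max]."""
    centres, radii = [], []
    tries = 0
    while len(centres) < n_cells and tries < max_tries:
        tries += 1
        r = 0.5 * rng.uniform(d_min, d_max)
        c = rng.uniform(r, BOX - r, size=3)   # keep the whole cell inside the box
        if (not centres or
                np.all(np.linalg.norm(np.asarray(centres) - c, axis=1)
                       >= np.asarray(radii) + r)):
            centres.append(c)
            radii.append(r)
    return np.asarray(centres), np.asarray(radii)

def dose_enhancement_ratio(dose_with_gnp, dose_without_gnp):
    """DER per cell region: absorbed dose with gold present over dose without."""
    return np.asarray(dose_with_gnp) / np.asarray(dose_without_gnp)

centres, radii = place_cells()
print(len(centres), "non-overlapping cells placed")
```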

  4. Monte Carlo modeling provides accurate calibration factors for radionuclide activity meters.

    PubMed

    Zagni, F; Cicoria, G; Lucconi, G; Infantino, A; Lodi, F; Marengo, M

    2014-12-01

    Accurate determination of calibration factors for radionuclide activity meters is crucial for quantitative studies and in the optimization step of radiation protection, as these detectors are widespread in radiopharmacy and nuclear medicine facilities. In this work we developed the Monte Carlo model of a widely used activity meter, using the Geant4 simulation toolkit. More precisely the "PENELOPE" EM physics models were employed. The model was validated by means of several certified sources, traceable to primary activity standards, and other sources locally standardized with spectrometry measurements, plus other experimental tests. Great care was taken in order to accurately reproduce the geometrical details of the gas chamber and the activity sources, each of which is different in shape and enclosed in a unique container. Both relative calibration factors and ionization current obtained with simulations were compared against experimental measurements; further tests were carried out, such as the comparison of the relative response of the chamber for a source placed at different positions. The results showed a satisfactory level of accuracy in the energy range of interest, with the discrepancies lower than 4% for all the tested parameters. This shows that an accurate Monte Carlo modeling of this type of detector is feasible using the low-energy physics models embedded in Geant4. The obtained Monte Carlo model establishes a powerful tool for first instance determination of new calibration factors for non-standard radionuclides, for custom containers, when a reference source is not available. Moreover, the model provides an experimental setup for further research and optimization with regards to materials and geometrical details of the measuring setup, such as the ionization chamber itself or the containers configuration. PMID:25195174

  5. Testing Physical Models of Passive Membrane Permeation

    PubMed Central

    Leung, Siegfried S. F.; Mijalkovic, Jona; Borrelli, Kenneth; Jacobson, Matthew P.

    2012-01-01

    The biophysical basis of passive membrane permeability is well understood, but most methods for predicting membrane permeability in the context of drug design are based on statistical relationships that indirectly capture the key physical aspects. Here, we investigate molecular mechanics-based models of passive membrane permeability and evaluate their performance against different types of experimental data, including parallel artificial membrane permeability assays (PAMPA), cell-based assays, in vivo measurements, and other in silico predictions. The experimental data sets we use in these tests are diverse, including peptidomimetics, congeneric series, and diverse FDA approved drugs. The physical models are not specifically trained for any of these data sets; rather, input parameters are based on standard molecular mechanics force fields, such as partial charges, and an implicit solvent model. A systematic approach is taken to analyze the contribution from each component in the physics-based permeability model. A primary factor in determining rates of passive membrane permeation is the conformation-dependent free energy of desolvating the molecule, and this measure alone provides good agreement with experimental permeability measurements in many cases. Other factors that improve agreement with experimental data include deionization and estimates of entropy losses of the ligand and the membrane, which lead to size-dependence of the permeation rate. PMID:22621168

  6. Physical Modeling of the Composting Ecosystem †

    PubMed Central

    Hogan, J. A.; Miller, F. C.; Finstein, M. S.

    1989-01-01

    A composting physical model with an experimental chamber with a working volume of 14 × 10³ cm³ (0.5 ft³) was designed to avoid exaggerated conductive heat loss resulting from, relative to field-scale piles, a disproportionately large outer surface-area-to-volume ratio. In the physical model, conductive flux (rate of heat flow through chamber surfaces) was made constant and slight through a combination of insulation and temperature control of the surrounding air. This control was based on the instantaneous conductive flux, as calculated from temperature differentials via a conductive heat flow model. An experiment was performed over a 10-day period in which control of the composting process was based on ventilative heat removal in reference to a microbially favorable temperature ceiling (temperature feedback). By using the conduction control system (surrounding air temperature controlled), 2.4% of the total heat evolved from the chamber was through conduction, whereas the remainder was through the ventilative mechanisms of the latent heat of vaporization and the sensible temperature increase of air. By comparison, with insulation alone (the conduction control system was not used) conduction accounted for 33.5% of the total heat evolved. This difference in conduction resulted in substantial behavioral differences with respect to the temperature of the composting matrix and the amount of water removed. By emphasizing the slight conduction system (2.4% of total heat flow) as being a better representative of field conditions, a comparison was made between composting system behavior in the laboratory physical model and field-scale piles described in earlier reports. Numerous behavioral patterns were qualitatively similar in the laboratory and field (e.g., temperature gradient, O₂ content, and water removal). It was concluded that field-scale composting system behavior can be simulated reasonably faithfully in the physical model. PMID:16347903
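
    A minimal sketch of the conduction bookkeeping described above is given below, assuming steady one-dimensional conduction through the chamber wall; the wall area, insulation conductivity, thickness, and temperatures are hypothetical, not the published apparatus values.

```python
def conductive_flux_W(t_inside_C, t_outside_C, k_W_per_mK, area_m2, thickness_m):
    """Steady 1-D conduction through the chamber wall: Q = k * A * dT / d."""
    return k_W_per_mK * area_m2 * (t_inside_C - t_outside_C) / thickness_m

# Hypothetical 14 L chamber wall: 0.35 m^2 of 50 mm insulation (k ~ 0.04 W/m/K)
q_cond = conductive_flux_W(t_inside_C=60.0, t_outside_C=58.0,
                           k_W_per_mK=0.04, area_m2=0.35, thickness_m=0.05)
print(f"conductive loss ~ {q_cond:.2f} W")
# Controlling the surrounding-air temperature keeps dT (and hence this term) small,
# so nearly all heat leaves by ventilation, as in the experiment described above.
```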

  7. Physical modelling of failure in composites.

    PubMed

    Talreja, Ramesh

    2016-07-13

    Structural integrity of composite materials is governed by failure mechanisms that initiate at the scale of the microstructure. The local stress fields evolve with the progression of the failure mechanisms. Within the full span from initiation to criticality of the failure mechanisms, the governing length scales in a fibre-reinforced composite change from the fibre size to the characteristic fibre-architecture sizes, and eventually to a structural size, depending on the composite configuration and structural geometry as well as the imposed loading environment. Thus, a physical modelling of failure in composites must necessarily be of multi-scale nature, although not always with the same hierarchy for each failure mode. With this background, the paper examines the currently available main composite failure theories to assess their ability to capture the essential features of failure. A case is made for an alternative in the form of physical modelling and its skeleton is constructed based on physical observations and systematic analysis of the basic failure modes and associated stress fields and energy balances. This article is part of the themed issue 'Multiscale modelling of the structural integrity of composite materials'. PMID:27242307

  8. Physical models of polarization mode dispersion

    SciTech Connect

    Menyuk, C.R.; Wai, P.K.A.

    1995-12-31

    The effect of randomly varying birefringence on light propagation in optical fibers is studied theoretically in the parameter regime that will be used for long-distance communications. In this regime, the birefringence is large and varies very rapidly in comparison to the nonlinear and dispersive scale lengths. We determine the polarization mode dispersion, and we show that physically realistic models yield the same result for polarization mode dispersion as earlier heuristic models that were introduced by Poole. We also prove an ergodic theorem.

  9. Statistical physical models of cellular motility

    NASA Astrophysics Data System (ADS)

    Banigan, Edward J.

    Cellular motility is required for a wide range of biological behaviors and functions, and the topic poses a number of interesting physical questions. In this work, we construct and analyze models of various aspects of cellular motility using tools and ideas from statistical physics. We begin with a Brownian dynamics model for actin-polymerization-driven motility, which is responsible for cell crawling and "rocketing" motility of pathogens. Within this model, we explore the robustness of self-diffusiophoresis, which is a general mechanism of motility. Using this mechanism, an object such as a cell catalyzes a reaction that generates a steady-state concentration gradient that propels the object in a particular direction. We then apply these ideas to a model for depolymerization-driven motility during bacterial chromosome segregation. We find that depolymerization and protein-protein binding interactions alone are sufficient to robustly pull a chromosome, even against large loads. Next, we investigate how forces and kinetics interact during eukaryotic mitosis with a many-microtubule model. Microtubules exert forces on chromosomes, but since individual microtubules grow and shrink in a force-dependent way, these forces lead to bistable collective microtubule dynamics, which provides a mechanism for chromosome oscillations and microtubule-based tension sensing. Finally, we explore kinematic aspects of cell motility in the context of the immune system. We develop quantitative methods for analyzing cell migration statistics collected during imaging experiments. We find that during chronic infection in the brain, T cells run and pause stochastically, following the statistics of a generalized Levy walk. These statistics may contribute to immune function by mimicking an evolutionarily conserved efficient search strategy. Additionally, we find that naive T cells migrating in lymph nodes also obey non-Gaussian statistics. Altogether, our work demonstrates how physical

  10. Physical vs. Mathematical Models in Rock Mechanics

    NASA Astrophysics Data System (ADS)

    Morozov, I. B.; Deng, W.

    2013-12-01

    One of the less noted challenges in understanding the mechanical behavior of rocks at both in situ and lab conditions is the character of theoretical approaches being used. Currently, the emphasis is made on spatial averaging theories (homogenization and numerical models of microstructure), empirical models for temporal behavior (material memory, compliance functions and complex moduli), and mathematical transforms (Laplace and Fourier) used to infer the Q-factors and 'relaxation mechanisms'. In geophysical applications, we have to rely on such approaches for very broad spatial and temporal scales which are not available in experiments. However, the above models often make insufficient use of physics and utilize, for example, the simplified 'correspondence principle' instead of the laws of viscosity and friction. As a result, the commonly-used time- and frequency dependent (visco)elastic moduli represent apparent properties related to the measurement procedures and not necessarily to material properties. Predictions made from such models may therefore be inaccurate or incorrect when extrapolated beyond the lab scales. To overcome the above challenge, we need to utilize the methods of micro- and macroscopic mechanics and thermodynamics known in theoretical physics. This description is rigorous and accurate, uses only partial differential equations, and allows straightforward numerical implementations. One important observation from the physical approach is that the analysis should always be done for the specific geometry and parameters of the experiment. Here, we illustrate these methods on axial deformations of a cylindrical rock sample in the lab. A uniform, isotropic elastic rock with a thermoelastic effect is considered in four types of experiments: 1) axial extension with free transverse boundary, 2) pure axial extension with constrained transverse boundary, 3) pure bulk expansion, and 4) axial loading harmonically varying with time. In each of these cases, an

  11. A physical model of Titan's clouds

    NASA Technical Reports Server (NTRS)

    Toon, O. B.; Pollack, J. B.; Turco, R. P.

    1980-01-01

    A physical model of the formation and growth of aerosols in the atmosphere of Titan has been constructed in light of the observed correlation between variations in Titan's albedo and the sunspot cycle. The model was developed to fit spectral observations of deep methane bands, pressures, temperature distributions, and cloud structure, and is based on a one-dimensional physical-chemical model developed to simulate the earth's stratospheric aerosol layer. Sensitivity tests reveal the model parameters to be relatively insensitive to particle shape but sensitive to particle density, with high particle densities requiring larger aerosol mass production rates to produce compatible clouds. Solution of the aerosol continuity equations for particles of sizes 13 A to about 3 microns indicates the importance of a warm upper atmosphere and a high-altitude mass injection layer, and the production of aerosols at very low aerosol optical depths. Limits are obtained for the chemical production of aerosol mass and the eddy diffusion coefficient, and it is found that an increase in mass input causes a decrease in mean particle size.

  12. Material model for physically based rendering

    NASA Astrophysics Data System (ADS)

    Robart, Mathieu; Paulin, Mathias; Caubet, Rene

    1999-09-01

    In computer graphics, a complete knowledge of the interactions between light and a material is essential to obtain photorealistic pictures. Physical measurements allow us to obtain data on the material response, but are limited to industrial surfaces and depend on measurement conditions. Analytic models do exist, but they are often inadequate for common use: the empirical ones are too simple to be realistic, and the physically-based ones are often too complex or too specialized to be generally useful. Therefore, we have developed a multiresolution virtual material model that describes not only the surface of a material but also its internal structure, thanks to distribution functions of microelements arranged in layers. Each microelement possesses its own response to incident light, from an elementary reflection to a complex response provided by its inner structure, taking into account the geometry, energy, polarization, etc., of each light ray. This model is virtually illuminated in order to compute its response to an incident radiance. This directional response is stored in a compressed data structure using spherical wavelets, and is intended to be used in a rendering model such as directional radiosity.

  13. Improving the physics models in the Space Weather Modeling Framework

    NASA Astrophysics Data System (ADS)

    Toth, G.; Fang, F.; Frazin, R. A.; Gombosi, T. I.; Ilie, R.; Liemohn, M. W.; Manchester, W. B.; Meng, X.; Pawlowski, D. J.; Ridley, A. J.; Sokolov, I.; van der Holst, B.; Vichare, G.; Yigit, E.; Yu, Y.; Buzulukova, N.; Fok, M. H.; Glocer, A.; Jordanova, V. K.; Welling, D. T.; Zaharia, S. G.

    2010-12-01

    The success of physics based space weather forecasting depends on several factors: we need sufficient amount and quality of timely observational data, we have to understand the physics of the Sun-Earth system well enough, we need sophisticated computational models, and the models have to run faster than real time on the available computational resources. This presentation will focus on a single ingredient, the recent improvements of the mathematical and numerical models in the Space Weather Modeling Framework. We have developed a new physics based CME initiation code using flux emergence from the convection zone solving the equations of radiative magnetohydrodynamics (MHD). Our new lower corona and solar corona models use electron heat conduction, Alfven wave heating, and boundary conditions based on solar tomography. We can obtain a physically consistent solar wind model from the surface of the Sun all the way to the L1 point without artificially changing the polytropic index. The global magnetosphere model can now solve the multi-ion MHD equations and take into account the oxygen outflow from the polar wind model. We have also added the options of solving for Hall MHD and anisotropic pressure. Several new inner magnetosphere models have been added to the framework: CRCM, HEIDI and RAM-SCB. These new models resolve the pitch angle distribution of the trapped particles. The upper atmosphere model GITM has been improved by including a self-consistent equatorial electrodynamics and the effects of solar flares. This presentation will very briefly describe the developments and highlight some results obtained with the improved and new models.

  14. Beyond the standard model of particle physics.

    PubMed

    Virdee, T S

    2016-08-28

    The Large Hadron Collider (LHC) at CERN and its experiments were conceived to tackle open questions in particle physics. The mechanism of the generation of mass of fundamental particles has been elucidated with the discovery of the Higgs boson. It is clear that the standard model is not the final theory. The open questions still awaiting clues or answers, from the LHC and other experiments, include: What is the composition of dark matter and of dark energy? Why is there more matter than anti-matter? Are there more space dimensions than the familiar three? What is the path to the unification of all the fundamental forces? This talk will discuss the status of, and prospects for, the search for new particles, symmetries and forces in order to address the open questions. This article is part of the themed issue 'Unifying physics and technology in light of Maxwell's equations'. PMID:27458261

  15. Models in Physics, Models for Physics Learning, and Why the Distinction May Matter in the Case of Electric Circuits

    ERIC Educational Resources Information Center

    Hart, Christina

    2008-01-01

    Models are important both in the development of physics itself and in teaching physics. Historically, the consensus models of physics have come to embody particular ontological assumptions and epistemological commitments. Educators have generally assumed that the consensus models of physics, which have stood the test of time, will also work well…

  16. Physical modelling of the nuclear pore complex

    PubMed Central

    Fassati, Ariberto; Ford, Ian J.; Hoogenboom, Bart W.

    2013-01-01

    Physically interesting behaviour can arise when soft matter is confined to nanoscale dimensions. A highly relevant biological example of such a phenomenon is the Nuclear Pore Complex (NPC) found perforating the nuclear envelope of eukaryotic cells. In the central conduit of the NPC, of ∼30–60 nm diameter, a disordered network of proteins regulates all macromolecular transport between the nucleus and the cytoplasm. In spite of a wealth of experimental data, the selectivity barrier of the NPC has yet to be explained fully. Experimental and theoretical approaches are complicated by the disordered and heterogeneous nature of the NPC conduit. Modelling approaches have focused on the behaviour of the partially unfolded protein domains in the confined geometry of the NPC conduit, and have demonstrated that within the range of parameters thought relevant for the NPC, widely varying behaviour can be observed. In this review, we summarise recent efforts to physically model the NPC barrier and function. We illustrate how attempts to understand NPC barrier function have employed many different modelling techniques, each of which have contributed to our understanding of the NPC.

  17. Physical model for membrane protrusions during spreading.

    PubMed

    Chamaraux, F; Ali, O; Keller, S; Bruckert, F; Fourcade, B

    2008-01-01

    During cell spreading onto a substrate, the kinetics of the contact area is an observable quantity. This paper is concerned with a physical approach to modeling this process in the case of ameboid motility where the membrane detaches itself from the underlying cytoskeleton at the leading edge. The physical model we propose is based on previous reports which highlight that membrane tension regulates cell spreading. Using a phenomenological feedback loop to mimic stress-dependent biochemistry, we show that the actin polymerization rate can be coupled to the stress which builds up at the margin of the contact area between the cell and the substrate. In the limit of small variation of membrane tension, we show that the actin polymerization rate can be written in a closed form. Our analysis defines characteristic lengths which depend on elastic properties of the membrane-cytoskeleton complex, such as the membrane-cytoskeleton interaction, and on molecular parameters, the rate of actin polymerization. We discuss our model in the case of axi-symmetric and non-axi-symmetric spreading and we compute the characteristic time scales as a function of fundamental elastic constants such as the strength of membrane-cytoskeleton adherence. PMID:18824791

  18. Ionospheric irregularity physics modelling. Memorandum report

    SciTech Connect

    Ossakow, S.L.; Keskinen, M.J.; Zalesak, S.T.

    1982-02-09

    Theoretical and numerical simulation techniques have been employed to study ionospheric F region plasma cloud striation phenomena, equatorial spread F phenomena, and high latitude diffuse auroral F region irregularity phenomena. Each of these phenomena can cause scintillation effects. The results and ideas from these studies are state-of-the-art, agree well with experimental observations, and have induced experimentalists to look for theoretically predicted results. One conclusion that can be drawn from these studies is that ionospheric irregularity phenomena can be modelled from a first principles physics point of view. Theoretical and numerical simulation results from the aforementioned ionospheric irregularity areas will be presented.

  19. Modelling biological complexity: a physical scientist's perspective

    PubMed Central

    Coveney, Peter V; Fowler, Philip W

    2005-01-01

    We discuss the modern approaches of complexity and self-organization to understanding dynamical systems and how these concepts can inform current interest in systems biology. From the perspective of a physical scientist, it is especially interesting to examine how the differing weights given to philosophies of science in the physical and biological sciences impact the application of the study of complexity. We briefly describe how the dynamics of the heart and circadian rhythms, canonical examples of systems biology, are modelled by sets of nonlinear coupled differential equations, which have to be solved numerically. A major difficulty with this approach is that all the parameters within these equations are not usually known. Coupled models that include biomolecular detail could help solve this problem. Coupling models across large ranges of length- and time-scales is central to describing complex systems and therefore to biology. Such coupling may be performed in at least two different ways, which we refer to as hierarchical and hybrid multiscale modelling. While limited progress has been made in the former case, the latter is only beginning to be addressed systematically. These modelling methods are expected to bring numerous benefits to biology, for example, the properties of a system could be studied over a wider range of length- and time-scales, a key aim of systems biology. Multiscale models couple behaviour at the molecular biological level to that at the cellular level, thereby providing a route for calculating many unknown parameters as well as investigating the effects at, for example, the cellular level, of small changes at the biomolecular level, such as a genetic mutation or the presence of a drug. The modelling and simulation of biomolecular systems is itself very computationally intensive; we describe a recently developed hybrid continuum-molecular model, HybridMD, and its associated molecular insertion algorithm, which point the way towards the

  20. Modelling biological complexity: a physical scientist's perspective.

    PubMed

    Coveney, Peter V; Fowler, Philip W

    2005-09-22

    We discuss the modern approaches of complexity and self-organization to understanding dynamical systems and how these concepts can inform current interest in systems biology. From the perspective of a physical scientist, it is especially interesting to examine how the differing weights given to philosophies of science in the physical and biological sciences impact the application of the study of complexity. We briefly describe how the dynamics of the heart and circadian rhythms, canonical examples of systems biology, are modelled by sets of nonlinear coupled differential equations, which have to be solved numerically. A major difficulty with this approach is that all the parameters within these equations are not usually known. Coupled models that include biomolecular detail could help solve this problem. Coupling models across large ranges of length- and time-scales is central to describing complex systems and therefore to biology. Such coupling may be performed in at least two different ways, which we refer to as hierarchical and hybrid multiscale modelling. While limited progress has been made in the former case, the latter is only beginning to be addressed systematically. These modelling methods are expected to bring numerous benefits to biology, for example, the properties of a system could be studied over a wider range of length- and time-scales, a key aim of systems biology. Multiscale models couple behaviour at the molecular biological level to that at the cellular level, thereby providing a route for calculating many unknown parameters as well as investigating the effects at, for example, the cellular level, of small changes at the biomolecular level, such as a genetic mutation or the presence of a drug. The modelling and simulation of biomolecular systems is itself very computationally intensive; we describe a recently developed hybrid continuum-molecular model, HybridMD, and its associated molecular insertion algorithm, which point the way towards the

  1. Detailed Physical Trough Model for NREL's Solar Advisor Model: Preprint

    SciTech Connect

    Wagner, M. J.; Blair, N.; Dobos, A.

    2010-10-01

    Solar Advisor Model (SAM) is a free software package made available by the National Renewable Energy Laboratory (NREL), Sandia National Laboratory, and the US Department of Energy. SAM contains hourly system performance and economic models for concentrating solar power (CSP) systems, photovoltaic, solar hot-water, and generic fuel-use technologies. Versions of SAM prior to 2010 included only the parabolic trough model based on Excelergy. This model uses top-level empirical performance curves to characterize plant behavior, and thus is limited in predictive capability for new technologies or component configurations. To address this and other functionality challenges, a new trough model, derived from physical first principles, was commissioned to supplement the Excelergy-based empirical model. This new 'physical model' approaches the task of characterizing the performance of the whole parabolic trough plant by replacing empirical curve-fit relationships with more detailed calculations where practical. The resulting model matches the annual performance of the SAM empirical model (which has been previously verified with plant data) while maintaining run-times compatible with parametric analysis, adding flexibility in modeled system configurations, and providing more detailed performance calculations in the solar field, power block, piping, and storage subsystems.

  2. Semi-Empirical Modeling of SLD Physics

    NASA Technical Reports Server (NTRS)

    Wright, William B.; Potapczuk, Mark G.

    2004-01-01

    The effects of supercooled large droplets (SLD) in icing have been an area of much interest in recent years. As part of this effort, the assumptions used for ice accretion software have been reviewed. A literature search was performed to determine advances from other areas of research that could be readily incorporated. Experimental data in the SLD regime was also analyzed. A semi-empirical computational model is presented which incorporates first order physical effects of large droplet phenomena into icing software. This model has been added to the LEWICE software. Comparisons are then made to SLD experimental data that has been collected to date. Results will be presented for the comparison of water collection efficiency, ice shape and ice mass.

  3. Physics-based models of the plasmasphere

    SciTech Connect

    Jordanova, Vania K; Pierrard, Vivane; Goldstein, Jerry; Andr'e, Nicolas; Lemaire, Joseph F; Liemohn, Mike W; Matsui, H

    2008-01-01

    We describe recent progress in physics-based models of the plasmasphere using the fluid and the kinetic approaches. Global modeling of the dynamics and influence of the plasmasphere is presented. Results from global plasmasphere simulations are used to understand and quantify (i) the electric potential pattern and evolution during geomagnetic storms, and (ii) the influence of the plasmasphere on the excitation of electromagnetic ion cyclotron (EMIC) waves and precipitation of energetic ions in the inner magnetosphere. The interactions of the plasmasphere with the ionosphere and the other regions of the magnetosphere are pointed out. We show the results of simulations for the formation of the plasmapause and discuss the influence of plasmaspheric wind and of ultra low frequency (ULF) waves for transport of plasmaspheric material. Theoretical formulations used to model the electric field and plasma distribution in the plasmasphere are given. Model predictions are compared to recent CLUSTER and IMAGE observations, but also to results of earlier models and satellite observations.

  4. New Physics Beyond the Standard Model

    NASA Astrophysics Data System (ADS)

    Cai, Haiying

    In this thesis we discuss several extensions of the standard model, with an emphasis on the hierarchy problem. The hierarchy problem related to the Higgs boson mass is a strong indication of new physics beyond the Standard Model. In the literature, several mechanisms, e.g., supersymmetry (SUSY), the little Higgs, and extra dimensions, have been proposed to explain why the Higgs mass can be stabilized to the electroweak scale. In the Standard Model, the largest quadratically divergent contribution to the Higgs mass-squared comes from the top quark loop. We consider a few novel possibilities for how this contribution is cancelled. In the standard SUSY scenario, the quadratic divergence from the fermion loops is cancelled by the scalar superpartners, and the SUSY breaking scale determines the masses of the scalars. We propose a new SUSY model, where the superpartner of the top quark is spin-1 rather than spin-0. In little Higgs theories, the Higgs field is realized as a pseudo-Goldstone boson in a nonlinear sigma model. The smallness of its mass is protected by the global symmetry. As a variation, we put the little Higgs into an extra dimensional model where the quadratically divergent top loop contribution to the Higgs mass is cancelled by an uncolored heavy "top quirk" charged under a different SU(3) gauge group. Finally, we consider a supersymmetric warped extra dimensional model where the superpartners have continuum mass spectra. We use the holographic boundary action to study how a mass gap can arise to separate the zero modes from continuum modes. Such extensions of the Standard Model have novel signatures at the Large Hadron Collider.
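
    For orientation, the quadratically divergent top-loop contribution referred to above is commonly quoted, in a cutoff regularisation with cutoff Λ and top Yukawa coupling y_t, as roughly

```latex
\delta m_H^2 \;\simeq\; -\,\frac{3\, y_t^2}{8\pi^2}\,\Lambda^2 \;+\; \dots
```

    so that for Λ far above the TeV scale this term must be cancelled, for example by the partner states (scalar, spin-1, quirk-like, or continuum) considered in the thesis.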

  5. Propulsion Physics Using the Chameleon Density Model

    NASA Technical Reports Server (NTRS)

    Robertson, Glen A.

    2011-01-01

    To grow as a spacefaring race, future spaceflight systems will require a new theory of propulsion, specifically one that does not require mass ejection yet does not limit the high thrust necessary to accelerate within or beyond our solar system and return within a normal work period or lifetime. The Chameleon Density Model (CDM) is one such model that could provide new paths in propulsion toward this end. The CDM is based on Chameleon Cosmology, a dark matter theory introduced by Khoury and Weltman in 2004 and named 'chameleon' because it is hidden within known physics: the Chameleon field represents a scalar field within and about an object, even in the vacuum. The CDM relates density changes in the Chameleon field to matter accelerations within and about an object. These density changes in turn change how an object couples to its environment, so that thrust is achieved by causing a differential in the environmental coupling about an object. As a demonstration that the CDM fits within known propulsion physics, this paper uses the model to estimate the thrust from a solid rocket motor. Under the CDM, a solid rocket constitutes a two-body system, i.e., the changing density of the rocket and the changing density in the nozzle arising from the accelerated mass. The interactions between these two systems cause a differential coupling to the local gravity environment of the earth. It is shown that the resulting differential in coupling produces a calculated value for the thrust nearly equivalent to the conventional thrust model used in Sutton and Ross, Rocket Propulsion Elements, even though embedded in the equations are the Universe energy scale factor, the reduced Planck mass, and the Planck length, which relate the large Universe scale to the subatomic scale.

  6. 3-D physical models of amitosis (cytokinesis).

    PubMed

    Cheng, Kang; Zou, Changhua

    2005-01-01

    Based on Newton's laws, an extended Coulomb's law, and published biological data, we develop 3-D physical models of natural and normal amitosis (cytokinesis) for prokaryotes (bacterial cells) in M phase. We propose the following hypotheses. Chromosome ring exclusion: no normally and naturally replicated chromosome rings (RCR) can occupy the same prokaryote (bacterial cell). The RCR produce spontaneous, strong electromagnetic fields (EMF), which can be altered by the environment, in the protoplasm and cortex. The EMF is approximately a repulsive quasi-static (slowly varying and mostly electric) electric field (EF). The EF forces between the RCR are strong enough to accumulate, in an orderly way, the contractile proteins that divide the prokaryote at the cell cortex of the division plane, or to split the cell compartment envelope directly and longitudinally. The radial component of the EF forces could also produce the furrows or cleavages of prokaryotes. The EF distribution controls the partition of the protoplasm and completes the amitosis (cytokinesis). After cytokinesis, the spontaneous, strong EF disappears because the net charge accumulation in the protoplasm becomes weak. The exclusion arises because the two sets of informative objects (RCR) carry identical DNA code information and are electromagnetically identical, and therefore they repel each other. We also compare divisions among eukaryotes, prokaryotes, mitochondria, and chloroplasts and propose the hypothesis that the principles of our models apply to the division of mitochondria and chloroplasts of eukaryotes as well, because, from a physics point of view, these division mechanisms are closer to one another than to others. Although we develop the model using one division plane (i.e., one cell divided into two cells) as an example, its principle applies to cases with multiple division planes (i.e., one cell divided into multiple cells) as well. PMID:15533619

  7. Fuzzy modelling of Atlantic salmon physical habitat

    NASA Astrophysics Data System (ADS)

    St-Hilaire, André; Mocq, Julien; Cunjak, Richard

    2015-04-01

    Fish habitat models typically attempt to quantify the amount of available river habitat for a given fish species under various flow and hydraulic conditions. To achieve this, information on the preferred range of values of key physical habitat variables (e.g. water level, velocity, substrate diameter) for the targeted fish species needs to be modelled. In this context, we developed several sets of habitat suitability indices for three Atlantic salmon life stages (young-of-the-year (YOY), parr, spawning adults) with the help of fuzzy logic modelling. Using the knowledge of twenty-seven experts from both sides of the Atlantic Ocean, we defined fuzzy sets of four variables (depth, substrate size, velocity and Habitat Suitability Index, or HSI) and associated fuzzy rules. When applied to the Romaine River (Canada), median curves of standardized Weighted Usable Area (WUA) were calculated and a confidence interval was obtained by bootstrap resampling. Despite the large range of WUA covered by the expert WUA curves, confidence intervals were relatively narrow: an average width of 0.095 (on a scale of 0 to 1) for spawning habitat, 0.155 for parr rearing habitat and 0.160 for YOY rearing habitat. When considering an environmental flow value corresponding to 90% of the maximum reached by the WUA curve, results seem acceptable for the Romaine River. Overall, the proposed fuzzy logic method seems suitable for modelling habitat availability for the three life stages, while also providing an estimate of uncertainty in salmon preferences.
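
    A minimal Python sketch of how such a fuzzy habitat evaluation can be wired together is given below. The membership-function shapes, breakpoints, and the simple min() rule are invented for illustration only; they are not the experts' fuzzy sets or rules from the study, which use a full rule base per life stage.

      import numpy as np

      def tri(x, a, b, c):
          """Triangular fuzzy membership with support [a, c] and peak at b."""
          return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

      # One value per grid cell (all numbers invented for illustration).
      depth = np.array([0.2, 0.5, 0.9, 1.4])            # m
      velocity = np.array([0.1, 0.4, 0.7, 1.1])         # m/s
      substrate = np.array([20.0, 60.0, 120.0, 300.0])  # mm
      area = np.full(4, 25.0)                           # m^2 per cell

      # "Suitable" fuzzy sets for each variable (assumed, not the experts' sets),
      # combined with a simple min() rule to give a cell-by-cell HSI.
      hsi = np.minimum.reduce([
          tri(depth, 0.1, 0.6, 1.2),
          tri(velocity, 0.05, 0.5, 1.0),
          tri(substrate, 10.0, 100.0, 250.0),
      ])
      wua = float((hsi * area).sum())          # suitability-weighted usable area
      print("cell HSI:", np.round(hsi, 2), " WUA [m^2]:", round(wua, 1))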

  8. Computer Integrated Manufacturing: Physical Modelling Systems Design. A Personal View.

    ERIC Educational Resources Information Center

    Baker, Richard

    A computer-integrated manufacturing (CIM) Physical Modeling Systems Design project was undertaken in a time of rapid change in the industrial, business, technological, training, and educational areas in Australia. A specification of a manufacturing physical modeling system was drawn up. Physical modeling provides a flexibility and configurability…

  9. Tactile Teaching: Exploring Protein Structure/Function Using Physical Models

    ERIC Educational Resources Information Center

    Herman, Tim; Morris, Jennifer; Colton, Shannon; Batiza, Ann; Patrick, Michael; Franzen, Margaret; Goodsell, David S.

    2006-01-01

    The technology now exists to construct physical models of proteins based on atomic coordinates of solved structures. We review here our recent experiences in using physical models to teach concepts of protein structure and function at both the high school and the undergraduate levels. At the high school level, physical models are used in a…

  10. Compass models: Theory and physical motivations

    NASA Astrophysics Data System (ADS)

    Nussinov, Zohar; van den Brink, Jeroen

    2015-01-01

    Compass models are theories of matter in which the couplings between the internal spin (or other relevant field) components are inherently spatially (typically, direction) dependent. A simple illustrative example is furnished by the 90° compass model on a square lattice in which only couplings of the form τ_i^x τ_j^x (where {τ_i^a} denote Pauli operators at site i) are associated with nearest-neighbor sites i and j separated along the x axis of the lattice, while τ_i^y τ_j^y couplings appear for sites separated by a lattice constant along the y axis. Similar compass-type interactions can appear in diverse physical systems. For instance, compass models describe Mott insulators with orbital degrees of freedom where interactions sensitively depend on the spatial orientation of the orbitals involved as well as the low-energy effective theories of frustrated quantum magnets, and a host of other systems such as vacancy centers, and cold atomic gases. The fundamental interdependence between internal (spin, orbital, or other) and external (i.e., spatial) degrees of freedom which underlies compass models generally leads to very rich behaviors, including the frustration of (semi-)classical ordered states on nonfrustrated lattices, and to enhanced quantum effects, prompting, in certain cases, the appearance of zero-temperature quantum spin liquids. As a consequence of these frustrations, new types of symmetries and their associated degeneracies may appear. These intermediate symmetries lie midway between the extremes of global symmetries and local gauge symmetries and lead to effective dimensional reductions. In this article, compass models are reviewed in a unified manner, paying close attention to exact consequences of these symmetries and to thermal and quantum fluctuations that stabilize orders via order-out-of-disorder effects. This is complemented by a survey of numerical results. In addition to reviewing past works, a number of other models are introduced and new results
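
    To make the lattice structure of the 90° compass model concrete, the Python sketch below builds its Hamiltonian for a small cluster by explicit tensor products and prints the exact ground-state energy. Open boundaries, unit couplings, and the overall minus-sign convention are assumptions made for illustration only.

      import numpy as np
      from functools import reduce

      X = np.array([[0, 1], [1, 0]], dtype=complex)      # Pauli tau^x
      Y = np.array([[0, -1j], [1j, 0]], dtype=complex)   # Pauli tau^y
      I2 = np.eye(2, dtype=complex)

      def site_op(op, site, n_sites):
          """Embed a single-site operator at `site` in an n_sites tensor product."""
          ops = [I2] * n_sites
          ops[site] = op
          return reduce(np.kron, ops)

      def compass_hamiltonian(Lx, Ly, Jx=1.0, Jy=1.0):
          """90-degree compass model on an Lx x Ly cluster with open boundaries:
          tau^x tau^x couplings on x-bonds, tau^y tau^y couplings on y-bonds."""
          n = Lx * Ly
          idx = lambda ix, iy: ix + Lx * iy
          H = np.zeros((2**n, 2**n), dtype=complex)
          for iy in range(Ly):
              for ix in range(Lx):
                  if ix + 1 < Lx:   # bond along x
                      H -= Jx * site_op(X, idx(ix, iy), n) @ site_op(X, idx(ix + 1, iy), n)
                  if iy + 1 < Ly:   # bond along y
                      H -= Jy * site_op(Y, idx(ix, iy), n) @ site_op(Y, idx(ix, iy + 1), n)
          return H

      # Exact ground-state energy of a 2 x 2 compass cluster.
      H = compass_hamiltonian(2, 2)
      print("ground-state energy:", np.linalg.eigvalsh(H)[0])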

  11. A Holoinformational Model of the Physical Observer

    NASA Astrophysics Data System (ADS)

    Biase, Francisco Di

    2013-09-01

    The author proposes a holoinformational view of the observer based on the holonomic theory of brain/mind function and quantum brain dynamics developed by Karl Pribram, Sir John Eccles, R.L. Amoroso, Hameroff, Jibu and Yasue, and on the quantum-holographic and holomovement theory of David Bohm. This conceptual framework is integrated with the nonlocal information properties of the Quantum Field Theory of Umezawa, with the concepts of negentropy, order, and organization developed by Shannon, Wiener, Szilard and Brillouin, and with the theories of self-organization and complexity of Prigogine, Atlan, Jantsch and Kauffman. Wheeler's "it from bit" concept of a participatory universe, and the developments in the physics of information made by Zurek and others with the concepts of statistical entropy and algorithmic entropy, related to the number of bits being processed in the mind of the observer, are also considered. This new synthesis gives a self-organizing quantum nonlocal informational basis for a new model of awareness in a participatory universe. In this synthesis, awareness is conceived as meaningful quantum nonlocal information interconnecting the brain and the cosmos through a holoinformational unified field (integrating the nonlocal holistic (quantum) and local (Newtonian) domains). We propose that the cosmology of the physical observer is this unified nonlocal quantum-holographic cosmos manifesting itself through awareness, interconnecting in a participatory, holistic and indivisible way the human mind-brain with all levels of the self-organizing holographic anthropic multiverse.

  12. Statistical physics model of an evolving population

    NASA Astrophysics Data System (ADS)

    Sznajd-Weron, K.; Pȩkalski, A.

    1999-12-01

    There are many possible approaches a theoretical physicist can take to problems of biological evolution. Some focus on physically interesting features, like self-organized criticality (P. Bak, K. Sneppen, Phys. Rev. Lett. 71 (1993); N. Vadewalle, M. Ausloos, Physica D 90 (1996) 262). Others put more effort into taking into account factors considered by biologists to be important in determining one or another aspect of biological evolution (D. Derrida, P.G. Higgs, J. Phys. A 24 (1991) L985; I. Mróz, A. Pȩkalski, K. Sznajd-Weron, Phys. Rev. Lett. 76 (1996) 3025; A. Pȩkalski, Physica A 265 (1999) 255). The intrinsic complexity of the problem nevertheless enforces drastic simplifications. Certain consolation may come from the fact that the mathematical models used by biologists themselves are quite often even more “coarse grained”.

  13. Dynamical and Physical Models of Ecliptic Comets

    NASA Astrophysics Data System (ADS)

    Dones, L.; Boyce, D. C.; Levison, H. F.; Duncan, M. J.

    2005-08-01

    In most simulations of the dynamical evolution of the cometary reservoirs, a comet is removed from the computer only if it is thrown from the Solar System or strikes the Sun or a planet. However, ejection or collision is probably not the fate of most active comets. Some, like 3D/Biela, disintegrate for no apparent reason, and others, such as the Sun-grazers, 16P/Brooks 2, and D/1993 F2 Shoemaker-Levy 9, are pulled apart by the Sun or a planet. Still others, like 107P/Wilson Harrington and D/1819 W1 Blanpain, are lost and then rediscovered as asteroids. Historically, amateurs discovered most comets. However, robotic surveys now dominate the discovery of comets (http://www.comethunter.de/). These surveys include large numbers of comets observed in a standard way, so the process of discovery is amenable to modeling. Understanding the selection effects for discovery of comets is a key problem in constructing models of cometary origin. To address this issue, we are starting new orbital integrations that will provide the best model to date of the population of ecliptic comets as a function of location in the Solar System and the size of the cometary nucleus, which we expect will vary with location. The integrations include the gravitational effects of the terrestrial and giant planets and, in some cases, nongravitational jetting forces. We will incorporate simple parameterizations for mantling and mass loss based upon detailed physical models. This approach will enable us to estimate the fraction of comets in different states (active, extinct, dormant, or disintegrated) and to track how the cometary size distribution changes as a function of distance from the Sun. We will compare the results of these simulations with bias-corrected models of the orbital and absolute magnitude distributions of Jupiter-family comets and Centaurs.

  14. Physical modeling of transverse drainage mechanisms

    NASA Astrophysics Data System (ADS)

    Douglass, J. C.; Schmeeckle, M. W.

    2005-12-01

    Streams that incise across bedrock highlands such as anticlines, upwarps, cuestas, or horsts are termed transverse drainages. Their relevance today involves such diverse matters as highway and dam construction decisions, location of wildlife corridors, better-informed sediment budgets, and detailed studies into developmental histories of late Cenozoic landscapes. The transient conditions responsible for transverse drainage incision have been extensively studied on a case-by-case basis, and the dominant mechanisms proposed include antecedence, superimposition, overflow, and piracy. Modeling efforts have been limited to antecedence, and as such the specific erosional conditions required for transverse drainage incision, with respect to the individual mechanisms, remain poorly understood. In this study, fifteen experiments attempted to simulate the four mechanisms and were constructed on a 9.15 m long, 2.1 m wide, and 0.45 m deep stream table. Experiments lasted between 50 and 220 minutes. The stream table was filled with seven tons of sediment consisting of a silt and clay (30%) and a fine to coarse sand (70%) mixture. The physical models highlighted the importance of downstream aggradation with regard to antecedent incision versus possible defeat and diversion. The overflow experiments indicate that knickpoints retreating across a basin outlet produce a high probability of downstream flooding when associated with a deep lake. Misters used in a couple of experiments illustrate a potential complication with regard to piracy driven by headward erosion. Relatively level, asymmetrically sloped ridges allow the drainage divide across the ridge to retreat through headward erosion, but retreat is hindered when the ridge's apex undulates or when the ridge is symmetrically sloped. Although these physical models cannot strictly simulate natural transverse drainages, the observed processes, their development over time, and the resultant landforms roughly emulate their natural counterparts. Proposed originally from

  15. A Conceptual Model of Observed Physical Literacy

    ERIC Educational Resources Information Center

    Dudley, Dean A.

    2015-01-01

    Physical literacy is a concept that is gaining greater acceptance around the world with the United Nations Educational, Cultural, and Scientific Organization (2013) recognizing it as one of several central tenets in a quality physical education framework. However, previous attempts to understand progression in physical literacy learning have been…

  16. Models for Curriculum and Pedagogy in Elementary School Physical Education

    ERIC Educational Resources Information Center

    Kulinna, Pamela Hodges

    2008-01-01

    The purpose of this article is to review current models for curriculum and pedagogy used in elementary school physical education programs. Historically, physical educators have developed and used a multiactivity curriculum in order to educate students through physical movement. More recently, a variety of alternative curricular models have been…

  17. A Structural Equation Model of Expertise in College Physics

    ERIC Educational Resources Information Center

    Taasoobshirazi, Gita; Carr, Martha

    2009-01-01

    A model of expertise in physics was tested on a sample of 374 college students in 2 different level physics courses. Structural equation modeling was used to test hypothesized relationships among variables linked to expert performance in physics including strategy use, pictorial representation, categorization skills, and motivation, and these…

  18. A Structural Equation Model of Conceptual Change in Physics

    ERIC Educational Resources Information Center

    Taasoobshirazi, Gita; Sinatra, Gale M.

    2011-01-01

    A model of conceptual change in physics was tested on introductory-level, college physics students. Structural equation modeling was used to test hypothesized relationships among variables linked to conceptual change in physics including an approach goal orientation, need for cognition, motivation, and course grade. Conceptual change in physics…

  19. The Role of Various Curriculum Models on Physical Activity Levels

    ERIC Educational Resources Information Center

    Culpepper, Dean O.; Tarr, Susan J.; Killion, Lorraine E.

    2011-01-01

    Researchers have suggested that physical education curricula can be highly effective in increasing physical activity levels at school (Sallis & Owen, 1999). The purpose of this study was to investigate the impact of various curriculum models on physical activity. Total steps were measured on 1,111 subjects and three curriculum models were studied…

  20. Global scale, physical models of the F region ionosphere

    NASA Technical Reports Server (NTRS)

    Sojka, J. J.

    1989-01-01

    Consideration is given to the development and verification of global computer models of the F-region which simulate the interactions between physical processes in the ionosphere. The limitations of the physical models are discussed, focusing on the inputs to the ionospheric system such as magnetospheric electric field and auroral precipitation. The possibility of coupling ionospheric models with thermospheric and magnetospheric models is examined.

  1. A Physical Model of Electron Radiation Belts of Saturn

    NASA Astrophysics Data System (ADS)

    Lorenzato, L.; Sicard-Piet, A.; Bourdarie, S.

    2012-09-01

    Building on the Cassini era, a physical Salammbô model of the electron radiation belts of Saturn has been developed, including several physical processes governing the Kronian magnetosphere. Results have been compared with Cassini MIMI LEMMS data.

  2. Numerical strategy for model correction using physical constraints

    NASA Astrophysics Data System (ADS)

    He, Yanyan; Xiu, Dongbin

    2016-05-01

    In this paper we present a strategy for correcting model deficiency using observational data. We first present the model correction in a general form, involving both external correction and internal correction. The model correction problem is then parameterized and cast as an optimization problem, from which the parameters are determined. More importantly, we discuss the incorporation of physical constraints from the underlying physical problem. Several representative examples are presented, where the physical constraints take very different forms. Numerical tests demonstrate that physics-constrained model correction is an effective way to address model-form uncertainty.
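
    The Python sketch below illustrates one way such a physics-constrained correction can be posed as an optimization problem: an additive correction expanded in a small basis is fitted to synthetic observations subject to an equality constraint. The model, data, basis, and the particular conservation constraint are all invented for illustration and are not taken from the paper.

      import numpy as np
      from scipy.optimize import minimize

      # Synthetic setup: a deficient model output u_model is corrected by an
      # additive term expanded in a low-order polynomial basis; the correction
      # coefficients c are fitted to observations u_obs subject to a physical
      # constraint (here, an assumed conservation of the domain mean).
      t = np.linspace(0.0, 1.0, 20)
      u_obs = np.sin(np.pi * t)                  # "observations"
      u_model = 0.9 * np.sin(np.pi * t) + 0.05   # deficient model output
      basis = np.vstack([np.ones_like(t), t, t**2]).T

      def corrected(c):
          return u_model + basis @ c

      def misfit(c):
          return np.sum((corrected(c) - u_obs) ** 2)

      constraint = {"type": "eq",
                    "fun": lambda c: corrected(c).mean() - u_obs.mean()}

      result = minimize(misfit, x0=np.zeros(3), constraints=[constraint])
      print("correction coefficients:", np.round(result.x, 4))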

  3. Modelling Mathematical Reasoning in Physics Education

    ERIC Educational Resources Information Center

    Uhden, Olaf; Karam, Ricardo; Pietrocola, Mauricio; Pospiech, Gesche

    2012-01-01

    Many findings from research as well as reports from teachers describe students' problem solving strategies as manipulation of formulas by rote. The resulting dissatisfaction with quantitative physical textbook problems seems to influence the attitude towards the role of mathematics in physics education in general. Mathematics is often seen as a…

  4. Engaging Students In Modeling Instruction for Introductory Physics

    NASA Astrophysics Data System (ADS)

    Brewe, Eric

    2016-05-01

    Teaching introductory physics is arguably one of the most important things that a physics department does. It is the primary way that students from other science disciplines engage with physics and it is the introduction to physics for majors. Modeling instruction is an active learning strategy for introductory physics built on the premise that science proceeds through the iterative process of model construction, development, deployment, and revision. We describe the role that participating in authentic modeling has in learning and then explore how students engage in this process in the classroom. In this presentation, we provide a theoretical background on models and modeling and describe how these theoretical elements are enacted in the introductory university physics classroom. We provide both quantitative and video data to link the development of a conceptual model to the design of the learning environment and to student outcomes. This work is supported in part by DUE #1140706.

  5. Advances in turbulence physics and modeling by direct numerical simulations

    NASA Technical Reports Server (NTRS)

    Reynolds, W. C.

    1987-01-01

    The advent of direct numerical simulations of turbulence has opened avenues for research on turbulence physics and turbulence modeling. Direct numerical simulation provides values for anything that the scientist or modeler would like to know about the flow. An overview of some recent advances in the physical understanding of turbulence and in turbulence modeling obtained through such simulations is presented.

  6. A Path-Analysis Model of Secondary Physics Enrollments

    ERIC Educational Resources Information Center

    Bryant, Lee T.; Doran, Rodney L.

    1977-01-01

    Develops a path-analysis model of critical variables affecting student enrollment in secondary school physics. A test of the model utilizing state provided data of physics enrollment in New York State resulted in the rejection of the model; however, significant critical variable results were obtained. (SL)

  7. Teacher Fidelity to One Physical Education Curricular Model

    ERIC Educational Resources Information Center

    Kloeppel, Tiffany; Kulinna, Pamela Hodges; Stylianou, Michalis; van der Mars, Hans

    2013-01-01

    This study addressed teachers' fidelity to one Physical Education curricular model. The theoretical framework guiding this study included professional development and fidelity to curricular models. In this study, teachers' fidelity to the Dynamic Physical Education (DPE) curricular model was measured for high and nonsupport district groups.…

  8. Supervision Models with Respect to Physical Education Needs.

    ERIC Educational Resources Information Center

    Williams, Lisa G.

    This paper focuses on several models of supervision in public schools with respect to needs in physical education. A literature review examined the traditional, counseling-based, self-analysis, competency-based, and systematic supervision models. Findings include the use of each model and the failure of each in the physical education setting. One…

  9. Intentional Development: A Model to Guide Lifelong Physical Activity

    ERIC Educational Resources Information Center

    Cherubini, Jeffrey M.

    2009-01-01

    Framed in the context of researching influences on physical activity and actually working with individuals and groups seeking to initiate, increase or maintain physical activity, the purpose of this review is to present the model of Intentional Development as a multi-theoretical approach to guide research and applied work in physical activity.…

  10. TOWARD EFFICIENT RIPARIAN RESTORATION: INTEGRATING ECONOMIC, PHYSICAL, AND BIOLOGICAL MODELS

    EPA Science Inventory

    This paper integrates economic, biological, and physical models to determine the efficient combination and spatial allocation of conservation efforts for water quality protection and salmonid habitat enhancement in the Grande Ronde basin, Oregon. The integrated modeling system co...

  11. An Empirical-Mathematical Modelling Approach to Upper Secondary Physics

    ERIC Educational Resources Information Center

    Angell, Carl; Kind, Per Morten; Henriksen, Ellen K.; Guttersrud, Oystein

    2008-01-01

    In this paper we describe a teaching approach focusing on modelling in physics, emphasizing scientific reasoning based on empirical data and using the notion of multiple representations of physical phenomena as a framework. We describe modelling activities from a project (PHYS 21) and relate some experiences from implementation of the modelling…

  12. Modeling the Discrimination Power of Physics Items

    ERIC Educational Resources Information Center

    Mesic, Vanes

    2011-01-01

    For the purposes of tailoring physics instruction in accordance with the needs and abilities of the students it is useful to explore the knowledge structure of students of different ability levels. In order to precisely differentiate the successive, characteristic states of student achievement it is necessary to use test items that possess…

  13. Testing a Theoretical Model of Immigration Transition and Physical Activity.

    PubMed

    Chang, Sun Ju; Im, Eun-Ok

    2015-01-01

    The purposes of the study were to develop a theoretical model to explain the relationships between immigration transition and midlife women's physical activity and test the relationships among the major variables of the model. A theoretical model, which was developed based on transitions theory and the midlife women's attitudes toward physical activity theory, consists of 4 major variables, including length of stay in the United States, country of birth, level of acculturation, and midlife women's physical activity. To test the theoretical model, a secondary analysis with data from 127 Hispanic women and 123 non-Hispanic (NH) Asian women in a national Internet study was used. Among the major variables of the model, length of stay in the United States was negatively associated with physical activity in Hispanic women. Level of acculturation in NH Asian women was positively correlated with women's physical activity. Country of birth and level of acculturation were significant factors that influenced physical activity in both Hispanic and NH Asian women. The findings support the theoretical model that was developed to examine relationships between immigration transition and physical activity; it shows that immigration transition can play an essential role in influencing health behaviors of immigrant populations in the United States. The NH theoretical model can be widely used in nursing practice and research that focus on immigrant women and their health behaviors. Health care providers need to consider the influences of immigration transition to promote immigrant women's physical activity. PMID:26502554

  14. Simple universal models capture all classical spin physics.

    PubMed

    De las Cuevas, Gemma; Cubitt, Toby S

    2016-03-11

    Spin models are used in many studies of complex systems because they exhibit rich macroscopic behavior despite their microscopic simplicity. Here, we prove that all the physics of every classical spin model is reproduced in the low-energy sector of certain "universal models," with at most polynomial overhead. This holds for classical models with discrete or continuous degrees of freedom. We prove necessary and sufficient conditions for a spin model to be universal and show that one of the simplest and most widely studied spin models, the two-dimensional Ising model with fields, is universal. Our results may facilitate physical simulations of Hamiltonians with complex interactions. PMID:26965624
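
    For readers who want to see the universal model itself written down, the Python fragment below evaluates the energy of one spin configuration of the two-dimensional Ising model with local fields on a small periodic lattice; the coupling, fields, and configuration are random illustrative choices and play no role in the universality argument.

      import numpy as np

      # Energy of one configuration of the 2-D Ising model with local fields,
      #   E(s) = -J * sum_<ij> s_i s_j - sum_i h_i s_i,
      # on a small periodic lattice.
      rng = np.random.default_rng(0)
      L, J = 4, 1.0
      h = rng.normal(scale=0.5, size=(L, L))     # arbitrary local fields
      s = rng.choice([-1, 1], size=(L, L))       # one spin configuration

      def ising_energy(s, J, h):
          # each nearest-neighbour bond counted once via periodic rolls
          bonds = (s * np.roll(s, 1, axis=0)).sum() + (s * np.roll(s, 1, axis=1)).sum()
          return -J * bonds - (h * s).sum()

      print("configuration energy:", ising_energy(s, J, h))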

  15. Mental, physical, and mathematical models in the teaching and learning of physics

    NASA Astrophysics Data System (ADS)

    Greca, Ileana María; Moreira, Marco Antonio

    2002-01-01

    In this paper, we initially discuss the relationships among physical, mathematical, and mental models in the process of constructing and understanding physical theories. We adopt the assumption that comprehension in a particular field of physics is attained when it is possible to predict a physical phenomenon from its physical models without having to previously refer to the mathematical formalism. The physical models constitute the semantic structure of a physical theory and determine the way the classes of phenomena linked to them should be perceived. Within this framework, the first step in order to understand a phenomenon or a process in physics is to construct mental models that will allow the individual to understand the statements that compose the semantic structure of the theory, being necessary, at the same time, to modify the way of perceiving the phenomena by constructing mental models that will permit him to evaluate as true or false the descriptions the theory makes of them. When this double process is attained concerning a particular phenomenon, in such a way that the results of the constructed mental models (predictions and explanations) match those scientifically accepted, one can say that the individual has constructed an adequate mental model of the physical model of the theory. Then, in the light of this discussion, we attempt to interpret the research findings we have obtained so far with college students, regarding mental models and physics education under the framework of Johnson-Laird's mental model theory. The difficulties faced by the students to achieve the understanding of physical theories did not seem to be all of the same level: some are linked to the constraints imposed to the construction of mental models by students' previous knowledge and others, linked to the ways individuals perceive the world, seem to be much more problematic. We argue that teaching should focus on them, at least at introductory level, considering the explicit

  16. Relativistic models in nuclear and particle physics

    SciTech Connect

    Coester, F.

    1988-01-01

    A comparative overview is presented of different approaches to the construction of phenomenological dynamical models that respect basic principles of quantum theory and relativity. Wave functions defined as matrix elements of products of field operators, on the one hand, and wave functions defined as representatives of state vectors in model Hilbert spaces, on the other, are related differently to observables, and the dynamical models for these wave functions each have distinct advantages and disadvantages. 34 refs.

  17. Operational physical models of the ionosphere

    NASA Technical Reports Server (NTRS)

    Nisbet, J. S.

    1978-01-01

    Global models of the neutral constituents are considered relevant to ion density models and improved knowledge of the ion chemistry. Information provided on the pressure gradients that control the wind system and the electric field systems due to balloon, satellite, and incoherent scatter measurements is discussed along with the implication of these results to the development of global ionospheric models. The current state of knowledge of the factors controlling the large day to day variations in the ionosphere and possible approaches for operational models are reviewed.

  18. On physical aspects of the Jiles-Atherton hysteresis models

    NASA Astrophysics Data System (ADS)

    Zirka, Sergey E.; Moroz, Yuriy I.; Harrison, Robert G.; Chwastek, Krzysztof

    2012-08-01

    The physical assumptions underlying the static and dynamic Jiles-Atherton (JA) hysteresis models are critically analyzed. It is shown that the energy-balance method used in deriving these models is actually closer to a balance of coenergies, thereby depriving the resulting JA phenomenology of physical meaning. The non-physical basis of its dynamic extension is demonstrated by a sharp contrast between hysteresis loops predicted by the model and those measured for grain-oriented steel under conditions of controlled sinusoidal flux density at frequencies of 50, 100, and 200 Hz.
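
    For reference, the static Jiles-Atherton model discussed above can be written down compactly; the Python sketch below integrates one common formulation of it along a sinusoidal field sweep. The parameter values are typical illustrative numbers, not fitted to the grain-oriented steel of the paper, and the simple explicit integration omits the usual guard against non-physical negative susceptibility after field reversals.

      import numpy as np

      # One common formulation of the static Jiles-Atherton model (illustrative
      # parameter values):
      #   H_e       = H + alpha * M                      (effective field)
      #   M_an      = Ms * (coth(H_e / a) - a / H_e)     (anhysteretic, Langevin)
      #   dM_irr/dH = (M_an - M_irr) / (k * delta - alpha * (M_an - M_irr))
      #   M         = c * M_an + (1 - c) * M_irr,        delta = sign(dH)
      Ms, a, alpha, k, c = 1.6e6, 1100.0, 1.6e-3, 400.0, 0.2

      def langevin(x):
          return x / 3.0 if abs(x) < 1e-6 else 1.0 / np.tanh(x) - 1.0 / x

      def ja_loop(H):
          M = np.zeros_like(H)
          M_irr = 0.0
          for i in range(1, len(H)):
              dH = H[i] - H[i - 1]
              delta = 1.0 if dH >= 0.0 else -1.0
              He = H[i] + alpha * M[i - 1]
              M_an = Ms * langevin(He / a)
              M_irr += dH * (M_an - M_irr) / (k * delta - alpha * (M_an - M_irr))
              M[i] = c * M_an + (1.0 - c) * M_irr
          return M

      # Two cycles of a sinusoidal applied field trace out a major loop.
      H = 5000.0 * np.sin(np.linspace(0.0, 4.0 * np.pi, 20000))
      M = ja_loop(H)
      print("peak magnetization [A/m]:", round(M.max(), 1))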

  19. Engineered Barrier System: Physical and Chemical Environment Model

    SciTech Connect

    D. M. Jolley; R. Jarek; P. Mariner

    2004-02-09

    The conceptual and predictive models documented in this Engineered Barrier System: Physical and Chemical Environment Model report describe the evolution of the physical and chemical conditions within the waste emplacement drifts of the repository. The modeling approaches and model output data will be used in the total system performance assessment (TSPA-LA) to assess the performance of the engineered barrier system and the waste form. These models evaluate the range of potential water compositions within the emplacement drifts, resulting from the interaction of introduced materials and minerals in dust with water seeping into the drifts and with aqueous solutions forming by deliquescence of dust (as influenced by atmospheric conditions), and from thermal-hydrological-chemical (THC) processes in the drift. These models also consider the uncertainty and variability in water chemistry inside the drift and the compositions of introduced materials within the drift. This report develops and documents a set of process- and abstraction-level models that constitute the engineered barrier system: physical and chemical environment model. Where possible, these models use information directly from other process model reports as input, which promotes integration among process models used for total system performance assessment. Specific tasks and activities of modeling the physical and chemical environment are included in the technical work plan ''Technical Work Plan for: In-Drift Geochemistry Modeling'' (BSC 2004 [DIRS 166519]). As described in the technical work plan, the development of this report is coordinated with the development of other engineered barrier system analysis model reports.

  20. Hidden sector DM models and Higgs physics

    SciTech Connect

    Ko, P.

    2014-06-24

    We present an extension of the standard model to a dark sector with an unbroken local dark U(1)_X symmetry. Including various singlet portal interactions provided by the standard model Higgs, right-handed neutrinos and kinetic mixing, we show that the model can address most phenomenological issues (inflation, neutrino mass and mixing, baryon number asymmetry, dark matter, direct/indirect dark matter searches, some small-scale puzzles of standard collisionless cold dark matter, vacuum stability of the standard model Higgs potential, dark radiation) and can be regarded as an alternative to the standard model. The Higgs signal strength is equal to one as in the standard model for the unbroken U(1)_X case with a scalar dark matter, but it could be less than one, independent of decay channels, if the dark matter is a dark sector fermion or if U(1)_X is spontaneously broken, because of a mixing with a new neutral scalar boson in the models.

  1. Towards LHC physics with nonlocal Standard Model

    NASA Astrophysics Data System (ADS)

    Biswas, Tirthabir; Okada, Nobuchika

    2015-09-01

    We take a few steps towards constructing a string-inspired nonlocal extension of the Standard Model. We start by illustrating how quantum loop calculations can be performed in nonlocal scalar field theory. In particular, we show the potential to address the hierarchy problem in the nonlocal framework. Next, we construct a nonlocal abelian gauge model and derive modifications of the gauge interaction vertex and field propagators. We apply the modifications to a toy version of the nonlocal Standard Model and investigate collider phenomenology. We find the lower bound on the scale of nonlocality from the 8 TeV LHC data to be 2.5-3 TeV.

  2. Massive Stars: Input Physics and Stellar Models

    NASA Astrophysics Data System (ADS)

    El Eid, M. F.; The, L.-S.; Meyer, B. S.

    2009-10-01

    We present a general overview of the structure and evolution of massive stars of masses ≥ 12 M⊙ during their pre-supernova stages. We think it is worth reviewing this topic owing to the crucial role of massive stars in astrophysics, especially in the evolution of galaxies and the universe. We have performed several test computations with the aim to analyze and discuss many physical uncertainties still encountered in massive-star evolution. In particular, we explore the effects of mass loss, convection, rotation, the 12C(α, γ)16O reaction and initial metallicity. We also compare and analyze the similarities and differences among various works and ours. Finally, we present useful comments on the nucleosynthesis from massive stars concerning the s-process and the yields for 26Al and 60Fe.

  3. Physically-Derived Dynamical Cores in Atmospheric General Circulation Models

    NASA Technical Reports Server (NTRS)

    Rood, Richard B.; Lin, Shian-Jiann

    1999-01-01

    The algorithm chosen to represent the advection in atmospheric models is often used as the primary attribute to classify the model. Meteorological models are generally classified as spectral or grid point, with the term grid point implying discretization using finite differences. These traditional approaches have a number of shortcomings that render them non-physical. That is, they provide approximate solutions to the conservation equations that do not obey the fundamental laws of physics. The most commonly discussed shortcomings are overshoots and undershoots which manifest themselves most overtly in the constituent continuity equation. For this reason many climate models have special algorithms to model water vapor advection. This talk focuses on the development of an atmospheric general circulation model which uses a consistent physically-based advection algorithm in all aspects of the model formulation. The shallow-water model of Lin and Rood (QJRMS, 1997) is generalized to three dimensions and combined with the physics parameterizations of NCAR's Community Climate Model. The scientific motivation for the development is to increase the integrity of the underlying fluid dynamics so that the physics terms can be more effectively isolated, examined, and improved. The expected benefits of the new model are discussed and results from the initial integrations will be presented.

  4. Physically-Derived Dynamical Cores in Atmospheric General Circulation Models

    NASA Technical Reports Server (NTRS)

    Rood, Richard B.; Lin, Shian-Jiann

    1999-01-01

    The algorithm chosen to represent the advection in atmospheric models is often used as the primary attribute to classify the model. Meteorological models are generally classified as spectral or grid point, with the term grid point implying discretization using finite differences. These traditional approaches have a number of shortcomings that render them non-physical. That is, they provide approximate solutions to the conservation equations that do not obey the fundamental laws of physics. The most commonly discussed shortcomings are overshoots and undershoots which manifest themselves most overtly in the constituent continuity equation. For this reason many climate models have special algorithms to model water vapor advection. This talk focuses on the development of an atmospheric general circulation model which uses a consistent physically-based advection algorithm in all aspects of the model formulation. The shallow-water model is generalized to three dimensions and combined with the physics parameterizations of NCAR's Community Climate Model. The scientific motivation for the development is to increase the integrity of the underlying fluid dynamics so that the physics terms can be more effectively isolated, examined, and improved. The expected benefits of the new model are discussed and results from the initial integrations will be presented.

  5. Early Childhood Educators' Experience of an Alternative Physical Education Model

    ERIC Educational Resources Information Center

    Tsangaridou, Niki; Genethliou, Nicholas

    2016-01-01

    Alternative instructional and curricular models are regarded as more comprehensive and suitable approaches to providing quality physical education (Kulinna 2008; Lund and Tannehill 2010; McKenzie and Kahan 2008; Metzler 2011; Quay and Peters 2008). The purpose of this study was to describe the impact of the Early Steps Physical Education…

  6. A Model of Physical Performance for Occupational Tasks.

    ERIC Educational Resources Information Center

    Hogan, Joyce

    This report acknowledges the problems faced by industrial/organizational psychologists who must make personnel decisions involving physically demanding jobs. The scarcity of criterion-related validation studies and the difficulty of generalizing validity are considered, and a model of physical performance that builds on Fleishman's (1984)…

  7. Educational Value and Models-Based Practice in Physical Education

    ERIC Educational Resources Information Center

    Kirk, David

    2013-01-01

    A models-based approach has been advocated as a means of overcoming the serious limitations of the traditional approach to physical education. One of the difficulties with this approach is that physical educators have sought to use it to achieve diverse and sometimes competing educational benefits, and these wide-ranging aspirations are rarely if…

  8. A Physically Based Coupled Chemical and Physical Weathering Model for Simulating Soilscape Evolution

    NASA Astrophysics Data System (ADS)

    Willgoose, G. R.; Welivitiya, D.; Hancock, G. R.

    2015-12-01

    A critical missing link in existing landscape evolution models is a dynamic soil evolution model in which soils co-evolve with the landform. Work by the authors over the last decade has demonstrated a computationally manageable model for soil profile evolution (soilscape evolution) based on physical weathering. For chemical weathering it is clear that full geochemistry models such as CrunchFlow and PHREEQC are too computationally intensive to be coupled to existing soilscape and landscape evolution models. This paper presents a simplification of CrunchFlow chemistry and physics that makes the task feasible, and generalises it for hillslope geomorphology applications. Results from this simplified model will be compared with field data for soil pedogenesis. Other researchers have previously proposed a number of very simple weathering functions (e.g. exponential, humped, reverse exponential) as conceptual models of the in-profile weathering process. The paper will show that all of these functions are possible for specific combinations of in-soil environmental, geochemical and geologic conditions, and the presentation will outline the key variables controlling which of these conceptual models can be realistic representations of in-profile processes and under what conditions. The presentation will finish by discussing the coupling of this model with a physical weathering model, and will show sample results from our SSSPAM soilscape evolution model to illustrate the implications of including chemical weathering in the soilscape evolution model.
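
    To make the three conceptual depth-dependence shapes mentioned above concrete, the short Python fragment below writes them as simple weathering-rate profiles of depth; the functional forms and length scales are assumptions for illustration only and are not the calibrated SSSPAM functions.

      import numpy as np

      # The three conceptual depth-dependence shapes named above, written as
      # weathering-rate profiles w(z); forms and length scales are assumed.
      z = np.linspace(0.0, 2.0, 5)                     # depth below surface [m]

      exponential = np.exp(-z / 0.5)                   # fastest at the surface
      reverse_exp = 1.0 - np.exp(-z / 0.5)             # fastest at depth
      humped = (z / 0.3) * np.exp(-z / 0.3)            # peak just below the surface

      for name, w in [("exponential", exponential),
                      ("reverse exponential", reverse_exp),
                      ("humped", humped)]:
          print(f"{name:20s}", np.round(w, 3))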

  9. Flavour physics in the soft wall model

    NASA Astrophysics Data System (ADS)

    Archer, Paul R.; Huber, Stephan J.; Jäger, Sebastian

    2011-12-01

    We extend the description of flavour that exists in the Randall-Sundrum (RS) model to the soft wall (SW) model in which the IR brane is removed and the Higgs is free to propagate in the bulk. It is demonstrated that, like the RS model, one can generate the hierarchy of fermion masses by localising the fermions at different locations throughout the space. However, there are two significant differences. Firstly the possible fermion masses scale down, from the electroweak scale, less steeply than in the RS model and secondly there now exists a minimum fermion mass for fermions sitting towards the UV brane. With a quadratic Higgs VEV, this minimum mass is about fifteen orders of magnitude lower than the electroweak scale. We derive the gauge propagator and despite the KK masses scaling as m_n^2 ~ n, it is demonstrated that the coefficients of four fermion operators are not divergent at tree level. FCNCs amongst kaons and leptons are considered and compared to calculations in the RS model, with a brane localised Higgs and equivalent levels of tuning. It is found that since the gauge fermion couplings are slightly more universal and the SM fermions typically sit slightly further towards the UV brane, the contributions to observables such as ε_K and Δm_K, from the exchange of KK gauge fields, are significantly reduced.

  10. LCDD: A complete detector description package

    NASA Astrophysics Data System (ADS)

    Graf, Norman; McCormick, Jeremy

    2015-07-01

    LCDD has been developed to provide a complete detector description package for physics detector simulations using Geant4. All aspects of the experimental setup, such as the physical geometry, magnetic fields, and sensitive detector readouts, as well as control of the physics simulations, such as physics processes, interaction models and kinematic limits, are defined at runtime. Users are therefore able to concentrate on the design of the detector system without having to master the intricacies of C++ programming or being proficient in setting up their own Geant4 application. We describe both the XML-based file format and the processors which communicate this information to the underlying Geant4 simulation toolkit.

  11. Propulsion Physics Under the Changing Density Field Model

    NASA Technical Reports Server (NTRS)

    Robertson, Glen A.

    2011-01-01

    To grow as a spacefaring race, future spaceflight systems will require new propulsion physics; specifically, a propulsion physics model that does not require mass ejection yet still permits the high thrust necessary to accelerate within or beyond our solar system and return within a normal work period or lifetime. In 2004 Khoury and Weltman produced a density-dependent cosmology theory they called Chameleon Cosmology, as, by its nature, it is hidden within known physics. This theory represents a scalar field within and about an object, even in the vacuum. These scalar fields can be viewed as vacuum energy fields with definable densities that permeate all matter, having implications for dark matter/energy and universe acceleration properties, and implying a new force mechanism for propulsion physics. Using Chameleon Cosmology, the author has developed a new propulsion physics model, called the Changing Density Field (CDF) Model. This model relates to density changes in these density fields, where the field density changes are related to the acceleration of matter within an object. These density changes in turn change how an object couples to the surrounding density fields, so that thrust is achieved by causing a differential in the coupling to these density fields about an object. Since the model indicates that the density of the density field in an object can be changed by internal mass acceleration, even without exhausting mass, the CDF model implies a new propellant-less propulsion physics model.

  12. The proton and carbon therapy experience of the medical physics group at the Italian Southern Laboratories: Monte Carlo simulation and experiment

    NASA Astrophysics Data System (ADS)

    Cirrone, G. A. Pablo; Agodi, C.; Candiano, G.; Cuttone, G.; di Rosa, F.; Mongelli, E.; Lojacono, P.; Mazzaglia, S.; Russo, G.; Romano, F.; Valastro, L. M.; Lo Nigro, S.; Pittera, S.; Sabini, M. G.; Rafaele, L.; Salamone, V.; Morone, C.; Randazzo, N.; Sipala, V.; Bucciolini, M.; Bruzzi, M.; Menichelli, D.

    2008-03-01

    At the Italian Southern Laboratories (LNS) of the Italian National Institute for Nuclear Physics, the first, and currently the only, Italian proton therapy centre is installed and operating. Up to now, 140 patients have been treated. In this environment a large effort is devoted to Monte Carlo simulation, especially with the Geant4 toolkit. The authors of this work belong to the Geant4 Collaboration and use the toolkit in their research programs. They maintain a Monte Carlo application devoted to the complete simulation of a generic hadron-therapy beam line and take an active part in the study of fragmentation processes. Moreover, they are working on the development of a prototype proton computed tomography system. In this work we report our results in the field of proton and carbon therapy, on both the simulation and the experimental sides of our activity.

  13. A physical model of Titan's aerosols

    NASA Technical Reports Server (NTRS)

    Toon, O. B.; Mckay, C. P.; Griffith, C. A.; Turco, R. P.

    1992-01-01

    A modeling effort is presented for the nature of the stratospheric haze on Titan, under several simplifying assumptions; chief among these is that the aerosols in question are of a single composition, and involatile. It is further assumed that a one-dimensional model is capable of simulating the general characteristics of the aerosol. It is suggested in this light that the detached haze on Titan may be a manifestation of organized, Hadley-type motions above 300 km altitude, with vertical velocities of 1 cm/sec. The hemispherical asymmetry of the visible albedo may be due to organized vertical motions within the upper 150-200 km of the haze.

  14. Multivariate Regression Models for Estimating Journal Usefulness in Physics.

    ERIC Educational Resources Information Center

    Bennion, Bruce C.; Karschamroon, Sunee

    1984-01-01

    This study examines possibility of ranking journals in physics by means of bibliometric regression models that estimate usefulness as it is reported by 167 physicists in United States and Canada. Development of four models, patterns of deviation from models, and validity and application are discussed. Twenty-six references are cited. (EJS)

  15. Kinetic exchange models: From molecular physics to social science

    NASA Astrophysics Data System (ADS)

    Patriarca, Marco; Chakraborti, Anirban

    2013-08-01

    We discuss several multi-agent models that have their origin in the kinetic exchange theory of statistical mechanics and have been recently applied to a variety of problems in the social sciences. This class of models can be easily adapted for simulations in areas other than physics, such as the modeling of income and wealth distributions in economics and opinion dynamics in sociology.
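
    A minimal Python sketch of the kind of kinetic exchange simulation this class of models leads to is given below, using the familiar random-sharing rule with a fixed saving propensity lam. The agent count, step count, and lam are illustrative assumptions; lam = 0 reproduces the exponential, Gibbs-like wealth distribution.

      import numpy as np

      # Minimal kinetic wealth-exchange simulation: at each step two randomly
      # chosen agents pool (1 - lam) of their wealth and split it randomly,
      # keeping the fraction lam for themselves (saving-propensity rule).
      rng = np.random.default_rng(1)
      N, steps, lam = 1000, 200_000, 0.0
      w = np.ones(N)                    # everyone starts with unit wealth

      for _ in range(steps):
          i, j = rng.integers(0, N, size=2)
          if i == j:
              continue
          eps = rng.random()
          pooled = (1.0 - lam) * (w[i] + w[j])
          w[i] = lam * w[i] + eps * pooled
          w[j] = lam * w[j] + (1.0 - eps) * pooled

      print("mean wealth:", w.mean(), " coefficient of variation:", w.std() / w.mean())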

  16. Physical model to predict the ball-burnishing forces

    NASA Astrophysics Data System (ADS)

    González-Rojas, H. A.; Travieso-Rodríguez, J. A.

    2012-04-01

    In this paper, we develop a physical model to predict the forces in ball burnishing. The model is constructed on the basis of plasticity theory. During the model development we identified the dimensionless number B that characterizes the problem of plastic deformation in ball burnishing. Experiments performed on steel and aluminum allow us to validate the model and to confirm that it correctly predicts the observed behavior patterns.

  17. Statistical physics models for nacre fracture simulation

    NASA Astrophysics Data System (ADS)

    Nukala, Phani Kumar V. V.; Šimunović, Srđan

    2005-10-01

    Natural biological materials such as nacre (or mother-of-pearl), exhibit phenomenal fracture strength and toughness properties despite the brittle nature of their constituents. For example, nacre’s work of fracture is three orders of magnitude greater than that of a single crystal of its constituent mineral. This study investigates the fracture properties of nacre using a simple discrete lattice model based on continuous damage random thresholds fuse network. The discrete lattice topology of the proposed model is based on nacre’s unique brick and mortar microarchitecture, and the mechanical behavior of each of the bonds in the discrete lattice model is governed by the characteristic modular damage evolution of the organic matrix that includes the mineral bridges between the aragonite platelets. The analysis indicates that the excellent fracture properties of nacre are a result of their unique microarchitecture, repeated unfolding of protein molecules (modular damage evolution) in the organic polymer, and the presence of fiber bundle of mineral bridges between the aragonite platelets. The numerical results obtained using this simple discrete lattice model are in excellent agreement with the previously obtained experimental results, such as nacre’s stiffness, tensile strength, and work of fracture.
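
    The flavour of such a lattice fracture calculation can be conveyed with a much simpler relative of the model used in the study: an ordinary random-thresholds fuse network, without the continuous-damage law or the nacre brick-and-mortar topology, sketched below in Python. The lattice size, threshold distribution, and one-bond-at-a-time breaking protocol are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(2)
      L = 8                                   # L x L grid of nodes
      nodes = L * L
      idx = lambda x, y: x + L * y

      # Unit-conductance fuses on horizontal and vertical bonds, each with a
      # random breaking threshold; bottom row held at V = 0, top row at V = 1.
      bonds = []
      for y in range(L):
          for x in range(L):
              if x + 1 < L:
                  bonds.append((idx(x, y), idx(x + 1, y)))
              if y + 1 < L:
                  bonds.append((idx(x, y), idx(x, y + 1)))
      bonds = np.array(bonds)
      alive = np.ones(len(bonds), dtype=bool)
      thresholds = rng.uniform(0.5, 1.5, size=len(bonds))

      bottom = [idx(x, 0) for x in range(L)]
      top = [idx(x, L - 1) for x in range(L)]
      free = [n for n in range(nodes) if n not in bottom + top]

      def bond_currents():
          # Assemble the graph Laplacian of surviving bonds and solve the
          # Kirchhoff equations for the free (non-bus-bar) node voltages.
          K = np.zeros((nodes, nodes))
          for (i, j), a in zip(bonds, alive):
              if a:
                  K[i, i] += 1.0; K[j, j] += 1.0
                  K[i, j] -= 1.0; K[j, i] -= 1.0
          V = np.zeros(nodes)
          V[top] = 1.0
          rhs = -K[np.ix_(free, top)].sum(axis=1)      # boundary term (V = 1 on top)
          reg = 1e-12 * np.eye(len(free))              # keeps isolated clusters solvable
          V[free] = np.linalg.solve(K[np.ix_(free, free)] + reg, rhs)
          return np.where(alive, np.abs(V[bonds[:, 0]] - V[bonds[:, 1]]), 0.0)

      broken = 0
      while True:
          I = bond_currents()
          if I.max() < 1e-6:                           # no conducting path left: failure
              break
          alive[np.argmax(I / thresholds)] = False     # burn the most over-stressed fuse
          broken += 1
      print("fuses burned before macroscopic failure:", broken)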

  18. Statistical physics models for nacre fracture simulation.

    PubMed

    Nukala, Phani Kumar V V; Simunović, Srdan

    2005-10-01

    Natural biological materials such as nacre (or mother-of-pearl), exhibit phenomenal fracture strength and toughness properties despite the brittle nature of their constituents. For example, nacre's work of fracture is three orders of magnitude greater than that of a single crystal of its constituent mineral. This study investigates the fracture properties of nacre using a simple discrete lattice model based on continuous damage random thresholds fuse network. The discrete lattice topology of the proposed model is based on nacre's unique brick and mortar microarchitecture, and the mechanical behavior of each of the bonds in the discrete lattice model is governed by the characteristic modular damage evolution of the organic matrix that includes the mineral bridges between the aragonite platelets. The analysis indicates that the excellent fracture properties of nacre are a result of their unique microarchitecture, repeated unfolding of protein molecules (modular damage evolution) in the organic polymer, and the presence of fiber bundle of mineral bridges between the aragonite platelets. The numerical results obtained using this simple discrete lattice model are in excellent agreement with the previously obtained experimental results, such as nacre's stiffness, tensile strength, and work of fracture. PMID:16383432

  19. Tight Binding Models in Cold Atoms Physics

    NASA Astrophysics Data System (ADS)

    Zakrzewski, J.

    2007-05-01

    Cold atomic gases placed in optical lattice potentials offer a unique tool to study simple tight-binding models. Both the standard cases known from condensed matter theory and novel situations may be addressed. The cold-atom setting allows for precise control of the parameters of the systems discussed, stimulating new questions and problems. Attempts to treat disorder in a controlled fashion are addressed in detail.
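
    A minimal Python sketch of the simplest such tight-binding model, a single particle hopping on a 1-D lattice with tunable on-site disorder of the kind engineered in optical-lattice experiments, is given below; the lattice size, hopping J, and disorder width W are illustrative choices.

      import numpy as np

      # Single-particle 1-D tight-binding Hamiltonian with on-site disorder:
      #   H = -J sum_i (|i><i+1| + h.c.) + sum_i eps_i |i><i|,
      # with eps_i drawn uniformly from [-W/2, W/2].
      rng = np.random.default_rng(3)
      L, J, W = 200, 1.0, 2.0

      eps = rng.uniform(-W / 2.0, W / 2.0, size=L)
      H = np.diag(eps) - J * (np.diag(np.ones(L - 1), 1) + np.diag(np.ones(L - 1), -1))
      energies, states = np.linalg.eigh(H)

      # Inverse participation ratio: ~1/L for extended states, O(1) when localized.
      ipr = (np.abs(states) ** 4).sum(axis=0)
      print("band edges:", round(energies[0], 3), round(energies[-1], 3))
      print("median IPR:", round(float(np.median(ipr)), 4))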

  20. ITER physics-safety interface: models and assessments

    SciTech Connect

    Uckan, N.A.; Putvinski, S.; Wesley, J.; Bartels, H-W.; Honda, T.; Amano, T.; Boucher, D.; Fujisawa, N.; Post, D.; Rosenbluth, M.

    1996-10-01

    Plasma operation conditions and physics requirements to be used as a basis for safety analysis studies are developed, and physics results motivated by safety considerations are presented for the ITER design. Physics guidelines and specifications for enveloping plasma dynamic events for Category I (operational event), Category II (likely event), and Category III (unlikely event) are characterized. The safety-related physics areas considered are: (i) the effect of plasma on machine and safety (disruptions, runaway electrons, fast plasma shutdown) and (ii) the plasma response to an ex-vessel LOCA from the first wall, providing a potential passive plasma shutdown due to Be evaporation. The physics models and expressions developed are implemented in the safety analysis code SAFALY, which couples a 0-D dynamic plasma model to the thermal response of the in-vessel components. Results from SAFALY are presented.

  1. Diagnosing forecast model errors with a perturbed physics ensemble

    NASA Astrophysics Data System (ADS)

    Mulholland, David; Haines, Keith; Sparrow, Sarah

    2016-04-01

    Perturbed physics ensembles are routinely used to analyse long-timescale climate model behaviour, but have less often been used to study model processes on shorter timescales. We present a method for diagnosing the sources of error in an initialised forecast model by using information from an ensemble of members with known perturbations to model physical parameters. We combine a large perturbed physics ensemble with a set of initialised forecasts to deduce possible process errors present in the standard HadCM3 model, which cause the model to drift from the truth in the early stages of the forecast. It is shown that, even on the sub-seasonal timescale, forecast drifts can be linked to perturbations in individual physical parameters, and that the parameters which exert most influence on forecast drifts vary regionally. Equivalent parameter perturbations are recovered from the initialised forecasts, and used to suggest the physical processes that are most critical to controlling model drifts on a regional basis. It is suggested that this method could be used to improve forecast skill, by reducing model drift through regional tuning of parameter values and targeted parameterisation refinement.
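
    One hedged way to picture the link between known parameter perturbations and forecast drift is a simple linear sensitivity regression, sketched below in Python with entirely synthetic numbers; this illustrates the general idea only and is not the diagnostic actually used in the study.

      import numpy as np

      # Synthetic illustration: relate each ensemble member's early forecast
      # drift to its known parameter perturbations by least squares, then
      # invert for the "equivalent" perturbation explaining the control run's
      # drift. Member count, sensitivities, and drifts are all invented.
      rng = np.random.default_rng(4)
      members, n_params = 60, 3
      dp = rng.normal(size=(members, n_params))           # known perturbations
      true_sens = np.array([0.8, -0.3, 0.1])              # hidden sensitivities
      drift = dp @ true_sens + 0.05 * rng.normal(size=members)

      sens, *_ = np.linalg.lstsq(dp, drift, rcond=None)   # estimated sensitivities
      control_drift = 0.4                                  # drift of the unperturbed run
      equiv = control_drift * sens / (sens @ sens)         # minimum-norm equivalent perturbation
      print("estimated sensitivities:", np.round(sens, 2))
      print("equivalent parameter perturbation:", np.round(equiv, 2))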

  2. Validation and upgrading of physically based mathematical models

    NASA Technical Reports Server (NTRS)

    Duval, Ronald

    1992-01-01

    The validation of the results of physically-based mathematical models against experimental results was discussed. Systematic techniques are used for: (1) isolating subsets of the simulator mathematical model and comparing the response of each subset to its experimental response for the same input conditions; (2) evaluating the response error to determine whether it is the result of incorrect parameter values, incorrect structure of the model subset, or unmodeled external effects of cross coupling; and (3) modifying and upgrading the model and its parameter values to determine the most physically appropriate combination of changes.

  3. Catch bonds: physical models and biological functions.

    PubMed

    Zhu, Cheng; McEver, Rodger P

    2005-09-01

    Force can shorten the lifetimes of receptor-ligand bonds by accelerating their dissociation. Perhaps paradoxical at first glance, bond lifetimes can also be prolonged by force. This counterintuitive behavior was named catch bonds, which is in contrast to the ordinary slip bonds that describe the intuitive behavior of lifetimes being shortened by force. Fifteen years after their theoretical proposal, catch bonds have finally been observed. In this article we review recently published data that have demonstrated catch bonds in the selectin system and suggested catch bonds in other systems, the theoretical models for their explanations, and their function as a mechanism for flow-enhanced adhesion. PMID:16708472

  4. Scenarios of physics beyond the standard model

    NASA Astrophysics Data System (ADS)

    Fok, Ricky

    This dissertation discusses three topics on scenarios beyond the Standard Model. Topic one is the effects of a fourth generation of quarks and leptons on electroweak baryogenesis in the early universe. The Standard Model is incapable of electroweak baryogenesis due to an insufficiently strong electroweak phase transition (EWPT) as well as insufficient CP violation. We show that the presence of heavy fourth generation fermions solves the first problem but requires additional bosons to be included to stabilize the electroweak vacuum. Introducing supersymmetric partners of the heavy fermions, we find that the EWPT can be made strong enough and new sources of CP violation are present. Topic two relates to the lepton flavor problem in supersymmetry. In the Minimal Supersymmetric Standard Model (MSSM), the off-diagonal elements in the slepton mass matrix must be suppressed at the 10^-3 level to avoid experimental bounds from lepton flavor changing processes. This dissertation shows that an enlarged R-parity can alleviate the lepton flavor problem. An analysis of all sensitive parameters was performed in the mass range below 1 TeV, and we find that slepton maximal mixing is possible without violating bounds from the lepton flavor changing processes mu → e gamma, mu → e conversion, and mu → 3e. Topic three is the collider phenomenology of quirky dark matter. In this model, quirks are particles that are gauged under the electroweak group as well as a "dark" color SU(2) group. The hadronization scale of this color group is well below the quirk masses. As a result, the dark color strings never break. Quirk and anti-quirk pairs can be produced at the LHC. Once produced, they immediately form a bound state of high angular momentum. The quirk pair rapidly sheds angular momentum by emitting soft radiation before annihilating into observable signals. This dissertation presents the decay branching ratios of quirkonia where quirks obtain their masses through electroweak

  5. Characterizing, modeling, and addressing gender disparities in introductory college physics

    NASA Astrophysics Data System (ADS)

    Kost-Smith, Lauren Elizabeth

    2011-12-01

    The underrepresentation and underperformance of females in physics has been well documented and has long concerned policy-makers, educators, and the physics community. In this thesis, we focus on gender disparities in the first- and second-semester introductory, calculus-based physics courses at the University of Colorado. Success in these courses is critical for future study and careers in physics (and other sciences). Using data gathered from roughly 10,000 undergraduate students, we identify and model gender differences in the introductory physics courses in three areas: student performance, retention, and psychological factors. We observe gender differences on several measures in the introductory physics courses: females are less likely to take a high school physics course than males and have lower standardized mathematics test scores; males outscore females on both pre- and post-course conceptual physics surveys and in-class exams; and males have more expert-like attitudes and beliefs about physics than females. These background differences of males and females account for 60% to 70% of the gender gap that we observe on a post-course survey of conceptual physics understanding. In analyzing underlying psychological factors of learning, we find that female students report lower self-confidence related to succeeding in the introductory courses (self-efficacy) and are less likely to report seeing themselves as a "physics person". Students' self-efficacy beliefs are significant predictors of their performance, even when measures of physics and mathematics background are controlled, and account for an additional 10% of the gender gap. Informed by results from these studies, we implemented and tested a psychological, self-affirmation intervention aimed at enhancing female students' performance in Physics 1. Self-affirmation reduced the gender gap in performance on both in-class exams and the post-course conceptual physics survey. Further, the benefit of the self

  6. Beyond standard model physics at current and future colliders

    NASA Astrophysics Data System (ADS)

    Liu, Zhen

    The Large Hadron Collider (LHC), a multinational experiment which began running in 2009, is highly expected to discover new physics that will help us understand the nature of the universe and begin to find solutions to many of the unsolved puzzles of particle physics. For over 40 years the Standard Model has been the accepted theory of elementary particle physics, except for one unconfirmed component, the Higgs boson. The experiments at the LHC have recently discovered this Standard-Model-like Higgs boson. This discovery is one of the most exciting achievements in elementary particle physics. Yet, a profound question remains: Is this rather light, weakly-coupled boson nothing but a Standard Model Higgs or a first manifestation of a deeper theory? Also, the recent discovery of neutrino mass and mixing, experimental evidence for dark matter and dark energy, and the matter-antimatter asymmetry indicate that our understanding of fundamental physics is currently incomplete. For the next decade and more, the LHC and future colliders will be at the cutting-edge of particle physics discoveries and will shed light on many of these unanswered questions. There are many promising beyond-Standard-Model theories that may help solve the central puzzles of particle physics. To fill the gaps in our knowledge, we need to know how these theories will manifest themselves in controlled experiments, such as high energy colliders. In this work, I discuss how we can probe fundamental physics at current and future colliders, directly through searches for new phenomena such as resonances, rare Higgs decays, and exotic displaced signatures, and indirectly through precision Higgs measurements. I explore beyond standard model physics effects from different perspectives, including explicit models such as supersymmetry, generic models in terms of resonances, as well as effective field theory approach in terms of higher dimensional operators. This work provides a generic and broad overview of the physics

  7. Application of physical parameter identification to finite element models

    NASA Technical Reports Server (NTRS)

    Bronowicki, Allen J.; Lukich, Michael S.; Kuritz, Steven P.

    1986-01-01

    A time domain technique for matching response predictions of a structural dynamic model to test measurements is developed. Significance is attached to prior estimates of physical model parameters and to experimental data. The Bayesian estimation procedure allows confidence levels in predicted physical and modal parameters to be obtained. Structural optimization procedures are employed to minimize an error functional with physical model parameters describing the finite element model as design variables. The number of complete FEM analyses is reduced using approximation concepts, including the recently developed convoluted Taylor series approach. The error function is represented in closed form by converting free decay test data to a time series model using Prony's method. The technique is demonstrated on the simulated response of a simple truss structure.
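
    The closed-form error function mentioned above relies on Prony's method, which fits a free-decay record with a sum of damped exponentials. As a rough illustration only (not the authors' implementation), a minimal NumPy sketch of classical Prony fitting might look as follows; the function name prony_fit, the synthetic signal, and the model order are all hypothetical choices.

      import numpy as np

      def prony_fit(x, p):
          # Fit x[n] ~ sum_k A_k * z_k**n with p damped exponential modes (classical Prony).
          x = np.asarray(x, dtype=float)
          N = len(x)
          # 1) Linear-prediction coefficients from a least-squares fit.
          M = np.column_stack([x[p - i - 1:N - i - 1] for i in range(p)])
          a, *_ = np.linalg.lstsq(M, -x[p:N], rcond=None)
          # 2) Roots of the characteristic polynomial give the complex modes z_k.
          z = np.roots(np.concatenate(([1.0], a)))
          # 3) Amplitudes from a Vandermonde least-squares fit.
          V = np.vander(z, N, increasing=True).T
          A, *_ = np.linalg.lstsq(V, x.astype(complex), rcond=None)
          return A, z

      # Synthetic free-decay record: two damped modes sampled at dt = 0.01 s.
      dt, n = 0.01, np.arange(400)
      sig = (np.exp(-0.5 * n * dt) * np.cos(2 * np.pi * 5 * n * dt)
             + 0.4 * np.exp(-1.2 * n * dt) * np.cos(2 * np.pi * 12 * n * dt))
      A, z = prony_fit(sig, p=4)
      freqs = np.abs(np.angle(z)) / (2 * np.pi * dt)   # modal frequencies [Hz]
      damps = -np.log(np.abs(z)) / dt                  # decay rates [1/s]

    The recovered modes express the free-decay response in closed form, which is the role Prony's method plays in the error functional described in the abstract.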

  8. The limitations of mathematical modeling in high school physics education

    NASA Astrophysics Data System (ADS)

    Forjan, Matej

    The theme of the doctoral dissertation falls within the scope of the didactics of physics. A theoretical analysis is presented of the key constraints that arise when the mathematical modeling of dynamical systems is transferred into secondary school physics education. In an effort to explore the extent to which current physics education promotes an understanding of models and modeling, we analyze the curriculum and the three most commonly used textbooks for high school physics. We focus primarily on the representation of the various stages of modeling in the solved tasks in the textbooks and on the presentation of certain simplifications and idealizations that are frequently used in high school physics. We show that one of the textbooks presents the simplifications fairly and reasonably in most cases, while the other two leave half of the analyzed simplifications unexplained. It also turns out that the vast majority of solved tasks in all the textbooks do not explicitly state the model assumptions, from which we conclude that high school physics students do not sufficiently develop a sense for simplification and idealization, which is a key part of the conceptual phase of modeling. Students' prior knowledge is also important for introducing the modeling of dynamical systems, so we performed an empirical study of the extent to which high school students are able to understand the time evolution of some dynamical systems in physics. The results show that students have a very weak understanding of the dynamics of systems in which feedback is present, independent of their year of study or final grade in physics and mathematics. When modeling dynamical systems in high school physics we also encounter limitations that result from students' lack of mathematical knowledge, since they cannot solve the differential equations analytically. We show that when dealing with one-dimensional dynamical systems

  9. "Let's get physical": advantages of a physical model over 3D computer models and textbooks in learning imaging anatomy.

    PubMed

    Preece, Daniel; Williams, Sarah B; Lam, Richard; Weller, Renate

    2013-01-01

    Three-dimensional (3D) information plays an important part in medical and veterinary education. Appreciating complex 3D spatial relationships requires a strong foundational understanding of anatomy and mental 3D visualization skills. Novel learning resources have been introduced to anatomy training to achieve this. Objective evaluation of their comparative efficacies remains scarce in the literature. This study developed and evaluated the use of a physical model in demonstrating the complex spatial relationships of the equine foot. It was hypothesized that the newly developed physical model would be more effective for students to learn magnetic resonance imaging (MRI) anatomy of the foot than textbooks or computer-based 3D models. Third year veterinary medicine students were randomly assigned to one of three teaching aid groups (physical model; textbooks; 3D computer model). The comparative efficacies of the three teaching aids were assessed through students' abilities to identify anatomical structures on MR images. Overall mean MRI assessment scores were significantly higher in students utilizing the physical model (86.39%) compared with students using textbooks (62.61%) and the 3D computer model (63.68%) (P < 0.001), with no significant difference between the textbook and 3D computer model groups (P = 0.685). Student feedback was also more positive in the physical model group compared with both the textbook and 3D computer model groups. Our results suggest that physical models may hold a significant advantage over alternative learning resources in enhancing visuospatial and 3D understanding of complex anatomical architecture, and that 3D computer models have significant limitations with regards to 3D learning. PMID:23349117

  10. Physical microscopic model of proteins under force.

    PubMed

    Dokholyan, Nikolay V

    2012-06-14

    Nature has evolved proteins to counteract forces applied on living cells, and has designed proteins that can sense forces. One can appreciate Nature's ingenuity in evolving these proteins to be highly sensitive to force and to have a high dynamic force range at which they operate. To achieve this level of sensitivity, many of these proteins are composed of multiple domains and linking peptides connecting these domains, each of them having their own force response regimes. Here, using a simple model of a protein, we address the question of how each individual domain responds to force. We also ask how multidomain proteins respond to forces. We find that the end-to-end distance of individual domains under force scales linearly with force. In multidomain proteins, we find that the force response has a rich range: at low force, extension is predominantly governed by "weaker" linking peptides or domain intermediates, while at higher force, the extension is governed by unfolding of individual domains. Overall, the force extension curve comprises multiple sigmoidal transitions governed by unfolding of linking peptides and domains. Our study provides a basic framework for the understanding of protein response to force, and allows for the interpretation of experiments in which force is used to study the mechanical properties of multidomain proteins. PMID:22375559

  11. Applying Transtheoretical Model to Promote Physical Activities Among Women

    PubMed Central

    Pirzadeh, Asiyeh; Mostafavi, Firoozeh; Ghofranipour, Fazllolah; Feizi, Awat

    2015-01-01

    Background: Physical activity is one of the most important indicators of health in communities, but different studies conducted in the provinces of Iran have shown that inactivity is prevalent, especially among women. Objectives: Inadequate regular physical activity among women, the importance of education in promoting physical activity, and the lack of studies on women using the transtheoretical model persuaded us to conduct this study, with the aim of applying the transtheoretical model to promote physical activity among women in Isfahan. Materials and Methods: This research was a quasi-experimental study conducted on 141 women residing in Isfahan, Iran. They were randomly divided into case and control groups. In addition to demographic information, their physical activity and the constructs of the transtheoretical model (stages of change, processes of change, decisional balance, and self-efficacy) were measured at 3 time points: preintervention, and 3 and 6 months after the intervention. Finally, the obtained data were analyzed with t tests and repeated-measures ANOVA using SPSS version 16. Results: The results showed that education based on the transtheoretical model significantly increased physical activity in the case group over time, in two respects: intense physical activity and walking. Also, a high percentage of participants showed progress through the stages of change, as did the mean scores of the processes of change and of the decisional balance (pros and cons); on the whole, a significant difference was observed over time in the case group (P < 0.01). Conclusions: This study showed that interventions based on the transtheoretical model can promote physical activity behavior among women. PMID:26834796

  12. Spin-foam models and the physical scalar product

    SciTech Connect

    Alesci, Emanuele; Noui, Karim; Sardelli, Francesco

    2008-11-15

    This paper aims at clarifying the link between loop quantum gravity and spin-foam models in four dimensions. Starting from the canonical framework, we construct an operator P acting on the space of cylindrical functions Cyl(γ), where γ is the four-simplex graph, such that its matrix elements are, up to some normalization factors, the vertex amplitude of spin-foam models. The spin-foam models we are considering are the topological model, the Barrett-Crane model, and the Engle-Pereira-Rovelli model. If one of these spin-foam models provides a covariant quantization of gravity, then the associated operator P should be the so-called "projector" into physical states and its matrix elements should give the physical scalar product. We discuss the possibility to extend the action of P to any cylindrical functions on the space manifold.

  13. Technical Manual for the SAM Physical Trough Model

    SciTech Connect

    Wagner, M. J.; Gilman, P.

    2011-06-01

    NREL, in conjunction with Sandia National Lab and the U.S. Department of Energy, developed the System Advisor Model (SAM) analysis tool for renewable energy system performance and economic analysis. This paper documents the technical background and engineering formulation for one of the two parabolic trough system models in SAM. The Physical Trough model calculates performance relationships based on physical first principles where possible, allowing the modeler to predict electricity production for a wider range of component geometries than is possible in the Empirical Trough model. This document describes the major parabolic trough plant subsystems in detail, including the solar field, power block, thermal storage, piping, auxiliary heating, and control systems. This model makes use of both existing subsystem performance modeling approaches and new approaches developed specifically for SAM.

  14. Snyder-de Sitter model from two-time physics

    SciTech Connect

    Carrisi, M. C.; Mignemi, S.

    2010-11-15

    We show that the symplectic structure of the Snyder model on a de Sitter background can be derived from two-time physics in seven dimensions and propose a Hamiltonian for a free particle consistent with the symmetries of the model.

  15. Partial Possible Models: An Approach To Interpret Students' Physical Representation.

    ERIC Educational Resources Information Center

    Camacho, Fernando Flores; Cazares, Leticia Gallegos

    1998-01-01

    Illustrates the construction of conceptual models on pressure and flotation using high school students' previous ideas on these concepts. Identifies three models and uses them to analyze students' ideas about physical phenomena and to recognize the inferential structure they use. Contains 28 references. (DDR)

  16. Investigating Student Understanding of Quantum Physics: Spontaneous Models of Conductivity.

    ERIC Educational Resources Information Center

    Wittmann, Michael C.; Steinberg, Richard N.; Redish, Edward F.

    2002-01-01

    Investigates student reasoning about models of conduction. Reports that students often are unable to account for the existence of free electrons in a conductor and create models that lead to incorrect predictions and responses contradictory to expert descriptions of the physics involved. (Contains 36 references.) (Author/YDS)

  17. Rock.XML - Towards a library of rock physics models

    NASA Astrophysics Data System (ADS)

    Jensen, Erling Hugo; Hauge, Ragnar; Ulvmoen, Marit; Johansen, Tor Arne; Drottning, Åsmund

    2016-08-01

    Rock physics modelling provides tools for correlating physical properties of rocks and their constituents to the geophysical observations we measure on a larger scale. Many different theoretical and empirical models exist to cover the range of different types of rocks. However, upon reviewing these, we see that they are all built around a few main concepts. Based on this observation, we propose a format for digitally storing the specifications of rock physics models, which we have named Rock.XML. It contains not only data about the various constituents, but also the theories and how they are used to combine these building blocks into a representative model for a particular rock. The format is based on the Extensible Markup Language XML, making it flexible enough to handle complex models and scalable enough to be extended with new theories and models. This technology has great advantages for documenting and exchanging models unambiguously between people and between software. Rock.XML can become a platform for creating a library of rock physics models, making them more accessible to everyone.
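
    The abstract does not reproduce the Rock.XML schema itself, so the sketch below is only a hypothetical illustration, written with Python's standard library, of the kind of document the format describes: a set of constituents plus the theory used to combine them. All element and attribute names are invented for illustration.

      import xml.etree.ElementTree as ET

      # Hypothetical layout; the real Rock.XML schema is defined by Jensen et al. (2016).
      rock = ET.Element("rockPhysicsModel", name="dry_sandstone_example")

      constituents = ET.SubElement(rock, "constituents")
      ET.SubElement(constituents, "mineral", name="quartz",
                    bulkModulusGPa="36.6", shearModulusGPa="45.0", densityGcc="2.65")
      ET.SubElement(constituents, "fluid", name="brine",
                    bulkModulusGPa="2.5", densityGcc="1.05")

      # The theory element records how the building blocks are combined.
      theory = ET.SubElement(rock, "theory", name="contactModel")
      ET.SubElement(theory, "parameter", name="porosity", value="0.25")
      ET.SubElement(theory, "parameter", name="coordinationNumber", value="9")

      ET.ElementTree(rock).write("rock_example.xml", encoding="utf-8",
                                 xml_declaration=True)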

  18. A physical model of Titan's aerosols.

    PubMed

    Toon, O B; McKay, C P; Griffith, C A; Turco, R P

    1992-01-01

    Microphysical simulations of Titan's stratospheric haze show that aerosol microphysics is linked to organized dynamical processes. The detached haze layer may be a manifestation of 1 cm sec^-1 vertical velocities at altitudes above 300 km. The hemispherical asymmetry in the visible albedo may be caused by 0.05 cm sec^-1 vertical velocities at altitudes of 150 to 200 km; we predict contrast reversal beyond 0.6 micrometer. Tomasko and Smith's (1982, Icarus 51, 65-95) model, in which a layer of large particles above 220 km altitude is responsible for the high forward scattering observed by Rages and Pollack (1983, Icarus 55, 50-62), is a natural outcome of the detached haze layer being produced by rising motions if aerosol mass production occurs primarily below the detached haze layer. The aerosol's electrical charge is critical for the particle size and optical depth of the haze. The geometric albedo, particularly in the ultraviolet and near infrared, requires that the particle size be near 0.15 micrometer down to altitudes below 100 km, which is consistent with polarization observations (Tomasko and Smith 1982, West and Smith 1991, Icarus 90, 330-333). Above about 400 km and below about 150 km Yung et al.'s (1984, Astrophys. J. Suppl. Ser. 55, 465-506) diffusion coefficients are too small. Dynamical processes control the haze particles below about 150 km. The relatively large eddy diffusion coefficients in the lower stratosphere result in a vertically extensive region with nonuniform mixing ratios of condensable gases, so that most hydrocarbons may condense very near the tropopause rather than tens of kilometers above it. The optical depths of hydrocarbon clouds are probably less than one, requiring that abundant gases such as ethane condense on a subset of the haze particles to create relatively large, rapidly removed particles. The wavelength dependence of the optical radius is calculated for use in analyzing observations of the geometric albedo. The lower

  19. The Effects of a Model-Based Physics Curriculum Program with a Physics First Approach: A Causal-Comparative Study

    ERIC Educational Resources Information Center

    Liang, Ling L.; Fulmer, Gavin W.; Majerich, David M.; Clevenstine, Richard; Howanski, Raymond

    2012-01-01

    The purpose of this study is to examine the effects of a model-based introductory physics curriculum on conceptual learning in a Physics First (PF) Initiative. This is the first comparative study in physics education that applies the Rasch modeling approach to examine the effects of a model-based curriculum program combined with PF in the United…

  20. Search for Beyond the Standard Model Physics at D0

    SciTech Connect

    Kraus, James

    2011-08-01

    The standard model (SM) of particle physics has been remarkably successful at predicting the outcomes of particle physics experiments, but there are reasons to expect new physics at the electroweak scale. Over the last several years, there have been a number of searches for beyond the standard model (BSM) physics at D0. Here, we limit our focus to three: searches for diphoton events with large missing transverse energy (E_T), searches for leptonic jets and E_T, and searches for single vector-like quarks. We have discussed three recent searches at D0. There are many more, including limits on a heavy neutral gauge boson in the ee channel, a search for scalar top quarks, a search for quirks, and limits on a new resonance decaying to WW or WZ.

  1. A physical data model for fields and agents

    NASA Astrophysics Data System (ADS)

    de Jong, Kor; de Bakker, Merijn; Karssenberg, Derek

    2016-04-01

    Two approaches exist in simulation modeling: agent-based and field-based modeling. In agent-based (or individual-based) simulation modeling, the entities representing the system's state are represented by objects, which are bounded in space and time. Individual objects, like an animal, a house, or a more abstract entity like a country's economy, have properties representing their state. In an agent-based model this state is manipulated. In field-based modeling, the entities representing the system's state are represented by fields. Fields capture the state of a continuous property within a spatial extent, examples of which are elevation, atmospheric pressure, and water flow velocity. With respect to the technology used to create these models, the domains of agent-based and field-based modeling have often been separate worlds. In environmental modeling, widely used logical data models include feature data models for point, line and polygon objects, and the raster data model for fields. Simulation models are often either agent-based or field-based, even though the modeled system might contain both entities that are better represented by individuals and entities that are better represented by fields. We think that the reason for this dichotomy in kinds of models might be that the traditional object and field data models underlying those models are relatively low level. We have developed a higher level conceptual data model for representing both non-spatial and spatial objects, and spatial fields (De Bakker et al. 2016). Based on this conceptual data model we designed a logical and physical data model for representing many kinds of data, including the kinds used in earth system modeling (e.g. hydrological and ecological models). The goal of this work is to be able to create high level code and tools for the creation of models in which entities are representable by both objects and fields. Our conceptual data model is capable of representing the traditional feature data
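
    As a loose illustration of the idea of one data model serving both kinds of entities (this is not the authors' conceptual, logical, or physical design), a single container type can allow a property value to be either a per-object scalar or a gridded field; the class and attribute names below are invented.

      from dataclasses import dataclass, field
      import numpy as np

      @dataclass
      class Phenomenon:
          # Property values may be scalars (object/agent state) or 2-D arrays (fields).
          name: str
          properties: dict = field(default_factory=dict)

      herd = Phenomenon("cows", {"count": 42, "mean_weight_kg": 510.0})
      terrain = Phenomenon("elevation", {"dem_m": np.zeros((100, 100))})

      # Both entities are handled by the same container, so model code can treat
      # agents and fields uniformly and branch on the value type only when needed.
      for entity in (herd, terrain):
          for prop, value in entity.properties.items():
              kind = "field" if isinstance(value, np.ndarray) else "object property"
              print(entity.name, prop, kind)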

  2. Physical and numerical modeling of Joule-heated melters

    NASA Astrophysics Data System (ADS)

    Eyler, L. L.; Skarda, R. J.; Crowder, R. S., III; Trent, D. S.; Reid, C. R.; Lessor, D. L.

    1985-10-01

    The Joule-heated ceramic-lined melter is an integral part of the high level waste immobilization process under development by the US Department of Energy. Scaleup and design of this waste glass melting furnace requires an understanding of the relationships between melting cavity design parameters and the furnace performance characteristics such as mixing, heat transfer, and electrical requirements. Developing empirical models of these relationships through actual melter testing with numerous designs would be a very costly and time consuming task. Additionally, the Pacific Northwest Laboratory (PNL) has been developing numerical models that simulate a Joule-heated melter for analyzing melter performance. This report documents the method used and results of this modeling effort. Numerical modeling results are compared with the more conventional, physical modeling results to validate the approach. Also included are the results of numerically simulating an operating research melter at PNL. Physical Joule-heated melter modeling results used for qualifying the simulation capabilities of the melter code included: (1) a melter with a single pair of electrodes and (2) a melter with a dual pair (two pairs) of electrodes. The physical model of the melter having two electrode pairs utilized a configuration with primary and secondary electrodes. The principal melter parameters (the ratio of power applied to each electrode pair, modeling fluid depth, electrode spacing) were varied in nine tests of the physical model during FY85. Code predictions were made for five of these tests. Voltage drops, temperature field data, and electric field data varied in their agreement with the physical modeling results, but in general were judged acceptable.

  3. Physical and numerical modeling of Joule-heated melters

    SciTech Connect

    Eyler, L.L.; Skarda, R.J.; Crowder, R.S. III; Trent, D.S.; Reid, C.R.; Lessor, D.L.

    1985-10-01

    The Joule-heated ceramic-lined melter is an integral part of the high level waste immobilization process under development by the US Department of Energy. Scaleup and design of this waste glass melting furnace requires an understanding of the relationships between melting cavity design parameters and the furnace performance characteristics such as mixing, heat transfer, and electrical requirements. Developing empirical models of these relationships through actual melter testing with numerous designs would be a very costly and time consuming task. Additionally, the Pacific Northwest Laboratory (PNL) has been developing numerical models that simulate a Joule-heated melter for analyzing melter performance. This report documents the method used and results of this modeling effort. Numerical modeling results are compared with the more conventional, physical modeling results to validate the approach. Also included are the results of numerically simulating an operating research melter at PNL. Physical Joule-heated melter modeling results used for qualifying the simulation capabilities of the melter code included: (1) a melter with a single pair of electrodes and (2) a melter with a dual pair (two pairs) of electrodes. The physical model of the melter having two electrode pairs utilized a configuration with primary and secondary electrodes. The principal melter parameters (the ratio of power applied to each electrode pair, modeling fluid depth, electrode spacing) were varied in nine tests of the physical model during FY85. Code predictions were made for five of these tests. Voltage drops, temperature field data, and electric field data varied in their agreement with the physical modeling results, but in general were judged acceptable. 14 refs., 79 figs., 17 tabs.

  4. A Goddard Multi-Scale Modeling System with Unified Physics

    NASA Technical Reports Server (NTRS)

    Tao, W.K.; Anderson, D.; Atlas, R.; Chern, J.; Houser, P.; Hou, A.; Lang, S.; Lau, W.; Peters-Lidard, C.; Kakar, R.; Kumar, S.; Lapenta, W.; Li, X.; Matsui, T.; Rienecker, M.; Shen, B.W.; Shi, J.J.; Simpson, J.; Zeng, X.

    2008-01-01

    Numerical cloud resolving models (CRMs), which are based on the non-hydrostatic equations of motion, have been extensively applied to cloud-scale and mesoscale processes during the past four decades. Recent GEWEX Cloud System Study (GCSS) model comparison projects have indicated that CRMs agree with observations in simulating various types of clouds and cloud systems from different geographic locations. Cloud resolving models now provide statistical information useful for developing more realistic physically based parameterizations for climate models and numerical weather prediction models. It is also expected that numerical weather prediction (NWP) and regional-scale models can be run at grid sizes similar to those of cloud resolving models through nesting techniques. Current and future NASA satellite programs can provide cloud, precipitation, aerosol and other data at very fine spatial and temporal scales. It requires a coupled global circulation model (GCM) and cloud-scale model (termed a super-parameterization or multi-scale modeling framework, MMF) to use these satellite data to improve the understanding of the physical processes that are responsible for the variation in global and regional climate and hydrological systems. The use of a GCM will enable global coverage, and the use of a CRM will allow for better and more sophisticated physical parameterization. NASA satellite and field campaign data can provide initial conditions as well as validation through the use of Earth satellite simulators. At Goddard, we have developed a multi-scale modeling system with unified physics. The modeling system consists of a coupled GCM-CRM (or MMF), a state-of-the-art weather research and forecasting model (WRF), and a cloud-resolving model (the Goddard Cumulus Ensemble model). In these models, the same microphysical schemes (2ICE, several 3ICE), radiation (including explicitly calculated cloud optical properties), and surface models are applied. In addition, a comprehensive unified Earth Satellite

  5. Female role models in physics education in Ireland

    NASA Astrophysics Data System (ADS)

    Chormaic, Síle Nic; Fee, Sandra; Tobin, Laura; Hennessy, Tara

    2013-03-01

    In this paper we consider the statistics on undergraduate student representation in Irish universities and look at student numbers in secondary (high) schools in one region in Ireland. There seems to be no significant change in female participation in physics from 2002 to 2011. Additionally, we have studied the influence of an educator's gender on the prevalence of girls studying physics in secondary schools in Co. Louth, Ireland, and at the postgraduate level in Irish universities. It would appear that strong female role models have a positive influence and lead to an increase in girls' participation in physics.

  6. Rupture Directivity in a Foam Rubber Physical Model

    NASA Astrophysics Data System (ADS)

    Anooshehpoor, R.; Brune, J. N.

    2003-12-01

    Understanding earthquake rupture dynamics, especially forward rupture directivity (focusing of seismic energy in the direction of rupture propagation), is crucial in determining the seismic hazard for critical structures located near major active faults. We use foam rubber modeling experiments to provide constraints on parameters that control rupture dynamics, and consequently, forward directivity effects. Numerical models currently in use have too many unconstrained parameters to allow confidence in predictions, and may not even be realistic from a physical point of view. The foam rubber model allows us to develop a deep physical understanding of an actual physical model. This in turn will allow us to better specify which physical parameters used in numerical models are critical, and establish a realistic range for their values, and to better understand and qualify particular numerical models. Three-dimensional numerical simulations of earlier experiments with excellent results provided incentive for additional funding from PEER to increase the number of recording channels in the model from 32 to 76. In particular, we have increased the number of recording sites on the fault plane from 12 to 35 to provide a better picture of the slip distribution on the fault during rupture. At the time of meeting we will present waveforms for selected events.

  7. Source signature and acoustic field of seismic physical modeling

    NASA Astrophysics Data System (ADS)

    Lin, Q.; Jackson, C.; Tang, G.; Burbach, G.

    2004-12-01

    As an important tool of seismic research and exploration, seismic physical modeling simulates real-world data acquisition by scaling the model, the acquisition parameters, and some features of the source generated by a transducer. Unlike numerical simulation, where a point source is easily realized, the transducer in physical modeling cannot be made small enough to approximate a point source, and it therefore yields a different source signature than the sources used in field data acquisition. To better understand physical modeling data, characterizing the wave field generated by ultrasonic transducers is desirable and helpful. In this study, we explore several aspects of source characterization, including radiation pattern, directivity, sensitivity, and frequency response. We also investigate how to improve the quality of the acquired data, for example by minimizing ambient noise, using an encoded chirp to prevent ringing, applying deterministic deconvolution to enhance data resolution, and applying t-P filtering to remove linear events. We find that the transducers and their wave fields, the performance of the modeling system, and the material properties of the model and their coupling conditions all play a role in physical modeling data acquisition.

  8. A Physical Model of Electron Radiation Belts of Saturn

    NASA Astrophysics Data System (ADS)

    Lorenzato, L.; Sicard-Piet, A.; Bourdarie, S.

    2012-04-01

    Radiation belts cause irreversible damage to on-board instruments and materials. For two decades, ONERA has therefore been studying the radiation belts of magnetized planets. First, in the 1990s, a physical model named Salammbô was developed for the radiation belts of the Earth. Then, over several years, analysis of the magnetosphere of Jupiter and in-situ data (Pioneer, Voyager, Galileo) allowed a physical model of the radiation belts of Jupiter to be built. Building on the Cassini era and the information it has collected, this study adapts the Salammbô Jovian radiation belt model to the Saturn environment. Indeed, some physical processes present in the kronian magnetosphere are similar to those present in the magnetosphere of Jupiter (radial diffusion; interaction of energetic electrons with rings, moons, and the atmosphere; synchrotron emission). However, some physical processes have to be added to the kronian model (compared to the Jovian model) because of the particularities of the magnetosphere of Saturn: the interaction of energetic electrons with neutral particles from Enceladus, and wave-particle interaction. This last physical process has been studied in detail through the analysis of Cassini/RPWS (Radio and Plasma Wave Science) data. The major importance of wave-particle interaction is now well known for the radiation belts of the Earth, but it is important to investigate its role in the case of Saturn. The importance of each physical process has therefore been studied, and analysis of Cassini MIMI-LEMMS and CAPS data allows a model boundary condition (at L = 6) to be built. Finally, the results of this study lead to a kronian electron radiation belt model including radial diffusion, interactions of energetic electrons with rings, moons, and neutral particles, and wave-particle interaction (interactions of electrons with atmospheric particles and synchrotron emission are too weak to be taken into account in this model). Then, to

  9. Technique to model and design physical database systems

    SciTech Connect

    Wise, T.E.

    1983-12-01

    Database management systems (DBMSs) allow users to define and manipulate records at a logical level of abstraction. A logical record is not stored as users see it but is mapped into a collection of physical records. Physical records are stored in file structures managed by a DBMS. Likewise, DBMS commands which appear to be directed toward one or more logical records actually correspond to a series of operations on the file structures. The structures and operations of a DBMS (i.e., its physical architecture) are not visible to users at the logical level. Traditionally, logical records and DBMS commands are mapped to physical records and operations in one step. In this report, logical records are mapped to physical records in a series of steps over several levels of abstraction. Each level of abstraction is composed of one or more intermediate record types. A hierarchy of record types results that covers the gap between logical and physical records. The first step of our technique identifies the record types and levels of abstraction that describe a DBMS. The second step maps DBMS commands to physical operations in terms of these records and levels of abstraction. The third step encapsulates each record type and its operations into a programming construct called a module. The applications of our technique include modeling existing DBMSs and designing the physical architectures of new DBMSs. To illustrate one application, we describe in detail the architecture of the commercial DBMS INQUIRE.
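
    The layering described above, in which logical records are mapped to physical records through intermediate record types and each record type is encapsulated with its operations in a module, can be illustrated with a toy sketch. This is purely hypothetical and is not the INQUIRE architecture; all class names are invented.

      class PhysicalFile:
          # Lowest level: fixed-size pages (physical records) in a byte store.
          PAGE = 256

          def __init__(self):
              self.pages = []

          def write_page(self, data: bytes) -> int:
              self.pages.append(data.ljust(self.PAGE, b"\x00"))
              return len(self.pages) - 1            # page id

          def read_page(self, page_id: int) -> bytes:
              return self.pages[page_id]

      class IntermediateRecords:
          # Middle level of abstraction: variable-length records packed into pages.
          def __init__(self, store: PhysicalFile):
              self.store = store

          def put(self, payload: bytes) -> int:
              return self.store.write_page(payload)

          def get(self, rec_id: int) -> bytes:
              return self.store.read_page(rec_id).rstrip(b"\x00")

      class LogicalRecords:
          # Top level: the records users see; commands map to lower-level operations.
          def __init__(self, inner: IntermediateRecords):
              self.inner = inner

          def insert(self, fields: dict) -> int:
              row = ";".join(f"{k}={v}" for k, v in fields.items())
              return self.inner.put(row.encode())

          def select(self, rec_id: int) -> dict:
              pairs = self.inner.get(rec_id).decode().split(";")
              return dict(p.split("=", 1) for p in pairs)

      db = LogicalRecords(IntermediateRecords(PhysicalFile()))
      rid = db.insert({"name": "Wise", "year": "1983"})
      print(db.select(rid))                          # {'name': 'Wise', 'year': '1983'}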

  10. A trichrome beam model for biological dose calculation in scanned carbon-ion radiotherapy treatment planning

    NASA Astrophysics Data System (ADS)

    Inaniwa, T.; Kanematsu, N.

    2015-01-01

    In scanned carbon-ion (C-ion) radiotherapy, some primary C-ions undergo nuclear reactions before reaching the target and the resulting particles deliver doses to regions at a significant distance from the central axis of the beam. The effects of these particles on physical dose distribution are accounted for in treatment planning by representing the transverse profile of the scanned C-ion beam as the superposition of three Gaussian distributions. In the calculation of biological dose distribution, however, the radiation quality of the scanned C-ion beam has been assumed to be uniform over its cross-section, taking the average value over the plane at a given depth (monochrome model). Since these particles, which have relatively low radiation quality, spread widely compared to the primary C-ions, the radiation quality of the beam should vary with radial distance from the central beam axis. To represent its transverse distribution, we propose a trichrome beam model in which primary C-ions, heavy fragments with atomic number Z ≥ 3, and light fragments with Z ≤ 2 are assigned to the first, second, and third Gaussian components, respectively. Assuming a realistic beam-delivery system, we performed computer simulations using Geant4 Monte Carlo code for analytical beam modeling of the monochrome and trichrome models. The analytical beam models were integrated into a treatment planning system for scanned C-ion radiotherapy. A target volume of 20 × 20 × 40 mm^3 was defined within a water phantom. A uniform biological dose of 2.65 Gy (RBE) was planned for the target with the two beam models based on the microdosimetric kinetic model (MKM). The plans were recalculated with Geant4, and the recalculated biological dose distributions were compared with the planned distributions. The mean target dose of the recalculated distribution with the monochrome model was 2.72 Gy (RBE), while the dose with the trichrome model was 2.64 Gy (RBE). The monochrome model
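
    For illustration of the superposition idea only, the transverse profile can be written as three radially symmetric Gaussian components, one each for the primary ions, the Z ≥ 3 fragments, and the Z ≤ 2 fragments. The weights and widths below are invented, not the published beam-model parameters.

      import numpy as np

      def triple_gaussian(r_mm, weights, sigmas_mm):
          # Transverse fluence-like profile as a weighted sum of three 2-D Gaussians,
          # evaluated at radial distance r from the central beam axis.
          profile = np.zeros_like(r_mm, dtype=float)
          for w, s in zip(weights, sigmas_mm):
              profile += w * np.exp(-0.5 * (r_mm / s) ** 2) / (2.0 * np.pi * s ** 2)
          return profile

      r = np.linspace(0.0, 50.0, 501)      # distance from the beam axis [mm]
      weights = (0.90, 0.07, 0.03)         # primaries, Z >= 3 fragments, Z <= 2 fragments
      sigmas = (4.0, 9.0, 25.0)            # illustrative widths only [mm]
      profile = triple_gaussian(r, weights, sigmas)

    In the trichrome model each of the three components would additionally carry its own radiation quality for the MKM biological dose calculation, whereas the monochrome model uses a single value averaged over the beam cross-section.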

  11. Transport in Polymer-Electrolyte Membranes I. Physical Model

    SciTech Connect

    Weber, Adam Z.; Newman, John

    2003-06-02

    In this paper, a physical model is developed that is semiphenomenological and takes into account Schroeder's paradox. Using the wealth of knowledge contained in the literature regarding polymer-electrolyte membranes as a basis, a novel approach is taken in tying together all of the data into a single coherent theory. This approach involves describing the structural changes of the membrane due to water content, and casting this in terms of capillary phenomena. By treating the membrane in this fashion, Schroeder's paradox can be elucidated. Along with the structural changes, two different transport mechanisms are presented and discussed. These mechanisms, along with the membrane's structural changes, comprise the complete physical model of the membrane. The model is shown to agree qualitatively with different membranes and different membrane forms, and is applicable to modeling perfluorinated sulfonic acid and similar membranes. It is also the first physically based comprehensive model of transport in a membrane that includes a physical description of Schroeder's paradox, and it bridges the gap between the two types of macroscopic models currently in the literature.

  12. A mathematical look at a physical power prediction model

    SciTech Connect

    Landberg, L.

    1997-12-31

    This paper takes a mathematical look at a physical model used to predict the power produced from wind farms. The reason is to see whether simple mathematical expressions can replace the original equations, and to give guidelines as to where the simplifications can be made and where they can not. This paper shows that there is a linear dependence between the geostrophic wind and the wind at the surface, but also that great care must be taken in the selection of the models since physical dependencies play a very important role, e.g. through the dependence of the turning of the wind on the wind speed.

  13. Physics-based model for electro-chemical process

    SciTech Connect

    Zhang, Jinsuo

    2013-07-01

    Considering the kinetics of electrochemical reactions and mass transfer at the surface and near-surface of the electrode, a physics-based separation model for separating actinides from fission products in an electro-refiner is developed. The model, taking into account the physical, chemical and electrochemical processes at the electrode surface, can be applied to study electrorefining kinetics. One of the methods used for validation has been to apply the developed model to the computation of the cyclic voltammetry process of PuCl3 and UCl3 at a solid electrode in molten KCl-LiCl. The computed results appear to be similar to experimental measurements. The separation model can be applied to predict materials flows under normal and abnormal operation conditions. Parametric studies can be conducted based on the model to identify the most important factors that affect the electrorefining processes.

  14. Highly physical penumbra solar radiation pressure modeling with atmospheric effects

    NASA Astrophysics Data System (ADS)

    Robertson, Robert; Flury, Jakob; Bandikova, Tamara; Schilling, Manuel

    2015-10-01

    We present a new method for highly physical solar radiation pressure (SRP) modeling in Earth's penumbra. The fundamental geometry and approach mirror past work, where the solar radiation field is modeled using a number of light rays, rather than treating the Sun as a single point source. However, we aim to clarify this approach, simplify its implementation, and model previously overlooked factors. The complex geometries involved in modeling penumbra solar radiation fields are described in a more intuitive and complete way to simplify implementation. Atmospheric effects are tabulated to significantly reduce computational cost. We present new, more efficient and accurate approaches to modeling atmospheric effects which allow us to consider the high spatial and temporal variability in lower atmospheric conditions. Modeled penumbra SRP accelerations for the Gravity Recovery and Climate Experiment (GRACE) satellites are compared to the sub-nm/s^2 precision GRACE accelerometer data. Comparisons to accelerometer data and a traditional penumbra SRP model illustrate the improved accuracy which our methods provide. Sensitivity analyses illustrate the significance of various atmospheric parameters and modeled effects on penumbra SRP. While this model is more complex than a traditional penumbra SRP model, we demonstrate its utility and propose that a highly physical model which considers atmospheric effects should be the basis for any simplified approach to penumbra SRP modeling.

  15. Physically-based Models For Flood Frequency Analysis

    NASA Astrophysics Data System (ADS)

    Strupczewski, W. G.; Singh, V. P.; Weglarczyk, S.

    Flood frequency models can be broadly classified into: (1) empirical, (2) phenomenological, and (3) physically based. Despite their appeal, physically-based models have yet to become models of choice in hydrologic practice. Along the lines of physically based models, and recognizing that channels are the dominant conduits for the transmission of flood waters, it is plausible to develop a model that employs the physics of channel flow routing and in which no explicit consideration is given to the hydrologic processes occurring on the land areas of the watershed. It is well accepted that the complete linearized Saint Venant equation and its simplifications provide a reasonable representation of the physics of channel flow. It is then hypothesized that the impulse response function (IRF) of such models can be considered as a probability density function (PDF) for FFA. The impulse response of a linear convective-diffusion analogy (LD) model is proposed for perennial rivers and that of a linear kinematic diffusion (KD) model for ephemeral streams. Each of them has two parameters, which are derived using the method of moments (MOM) and the maximum likelihood method (MLM). Also derived are errors in quantiles for both methods. Both distributions show an equivalency of MOM and MLM with respect to the mean, an important property in the case of an unknown true distribution function. The LD model was tested using 39 series from Polish rivers, showing its superiority over the log-normal, the main competitor among the family of two-parameter PDFs for the analyzed data. In particular, the LD model represents FF-characteristics well when the LD is likely to be the best of all linear flood routing models. The KD distribution was tested on 44 annual peak flow data series containing zero values. A comparison of empirical and KD distributions shows that MOM better reproduces the upper tail of the distribution, while MLM is more robust for higher sample values and more conditioned on the
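
    The closed forms of the LD and KD densities are not given in the abstract, so the sketch below only illustrates what a method-of-moments fit of a two-parameter, positively skewed pdf looks like, using the inverse Gaussian distribution as a stand-in; the sample is synthetic.

      import numpy as np

      def mom_inverse_gaussian(q):
          # Method-of-moments estimates (mu, lam) for the inverse-Gaussian pdf
          # f(t) = sqrt(lam / (2*pi*t**3)) * exp(-lam*(t - mu)**2 / (2*mu**2*t)),
          # which has mean mu and variance mu**3 / lam.
          q = np.asarray(q, dtype=float)
          mu = q.mean()
          lam = mu ** 3 / q.var(ddof=1)
          return mu, lam

      # Synthetic sample of annual peak flows (NumPy's "wald" is the inverse Gaussian).
      rng = np.random.default_rng(1)
      sample = rng.wald(mean=350.0, scale=900.0, size=60)
      mu_hat, lam_hat = mom_inverse_gaussian(sample)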

  16. Fusion Education Physical Models for Students and Teachers

    NASA Astrophysics Data System (ADS)

    Nagy, A.; Lee, R. L.

    2002-11-01

    Interactive classroom visits by scientists and engineers in the "Scientist in the Classroom" program and educator workshops led by Fusion Education team members continue to be the catalyst in the development of low cost, age appropriate, understandable physical demonstration models for use in classroom and workshop environments. Physical models developed for these interactive settings are based on topics in plasma science and technology, vacuum, thermodynamics, light, and electricity and magnetism. The physical models are actual hands-on devices students use to observe specific phenomena. One example uses a piston, a sealed volume, and a vacuum chamber to illustrate the ideal gas law. Another example uses liquid nitrogen to explore how temperature affects changes in states of matter, and, as a third example, magnets are used on simple plasma devices to illustrate the effects a magnetic field has on moving, charged particles. The details of these models will be presented. Three very successful "build-it" days have been sponsored that enable teachers to build these physics models for use in their own classrooms.

  17. Evaluating performances of simplified physically based landslide susceptibility models.

    NASA Astrophysics Data System (ADS)

    Capparelli, Giovanna; Formetta, Giuseppe; Versace, Pasquale

    2015-04-01

    Rainfall-induced shallow landslides cause significant damage, involving loss of life and property. Predicting locations susceptible to shallow landslides is a complex task that involves many disciplines: hydrology, geotechnical science, geomorphology, and statistics. Usually two main approaches are used to accomplish this task: statistical or physically based models. This paper presents a package of GIS-based models for landslide susceptibility analysis. It was integrated in the NewAge-JGrass hydrological model using the Object Modeling System (OMS) modeling framework. The package includes three simplified physically based models for landslide susceptibility analysis (M1, M2, and M3) and a component for model verification. It computes eight goodness-of-fit (GOF) indices by comparing model results and measured data pixel by pixel. Moreover, the integration of the package in NewAge-JGrass allows the use of other components, such as geographic information system tools to manage input-output processes and automatic calibration algorithms to estimate model parameters. The system makes it possible to investigate and fairly compare the quality and robustness of models and model parameters, according to a procedure that includes: (i) model parameter estimation by optimizing each GOF index separately, (ii) model evaluation in the ROC plane using each optimal parameter set, and (iii) GOF robustness evaluation by assessing sensitivity to input parameter variation. This procedure was repeated for all three models. The system was applied to a case study in Calabria (Italy) along the Salerno-Reggio Calabria highway, between Cosenza and the Altilia municipality. The analysis showed that, among all the optimized indices and all three models, Average Index (AI) optimization coupled with model M3 is the best modeling solution for our test case. This research was funded by PON Project No. 01_01503 "Integrated Systems for Hydrogeological Risk
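
    The formula for the Average Index (AI) is not given in the abstract, so the sketch below only illustrates the general pixel-by-pixel comparison step, computing the standard ROC-plane coordinates (false and true positive rates) from binary susceptibility and inventory maps; the maps and function name are invented.

      import numpy as np

      def roc_point(model_unstable, observed_unstable):
          # Pixel-by-pixel comparison of a binary susceptibility map with a
          # landslide inventory map; returns the (FPR, TPR) point in the ROC plane.
          m = np.asarray(model_unstable, dtype=bool).ravel()
          o = np.asarray(observed_unstable, dtype=bool).ravel()
          tp = np.sum(m & o)
          fp = np.sum(m & ~o)
          fn = np.sum(~m & o)
          tn = np.sum(~m & ~o)
          return fp / (fp + tn), tp / (tp + fn)

      # Toy 4x4 maps: 1 marks an unstable pixel.
      model = np.array([[1, 1, 0, 0], [1, 0, 0, 0], [0, 0, 1, 1], [0, 0, 1, 0]])
      obs = np.array([[1, 0, 0, 0], [1, 0, 0, 0], [0, 0, 1, 1], [0, 0, 0, 0]])
      fpr, tpr = roc_point(model, obs)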

  18. Coarse-grained, foldable, physical model of the polypeptide chain

    PubMed Central

    Chakraborty, Promita; Zuckermann, Ronald N.

    2013-01-01

    Although nonflexible, scaled molecular models like Pauling–Corey’s and its descendants have made significant contributions in structural biology research and pedagogy, recent technical advances in 3D printing and electronics make it possible to go one step further in designing physical models of biomacromolecules: to make them conformationally dynamic. We report here the design, construction, and validation of a flexible, scaled, physical model of the polypeptide chain, which accurately reproduces the bond rotational degrees of freedom in the peptide backbone. The coarse-grained backbone model consists of repeating amide and α-carbon units, connected by mechanical bonds (corresponding to φ and ψ) that include realistic barriers to rotation that closely approximate those found at the molecular scale. Longer-range hydrogen-bonding interactions are also incorporated, allowing the chain to readily fold into stable secondary structures. The model is easily constructed with readily obtainable parts and promises to be a tremendous educational aid to the intuitive understanding of chain folding as the basis for macromolecular structure. Furthermore, this physical model can serve as the basis for linking tangible biomacromolecular models directly to the vast array of existing computational tools to provide an enhanced and interactive human–computer interface. PMID:23898168

  19. Beyond the Standard Model Physics with Lattice Simulations

    NASA Astrophysics Data System (ADS)

    Rinaldi, Enrico

    2016-03-01

    Lattice simulations of gauge theories are a powerful tool to investigate strongly interacting systems like Quantum ChromoDynamics (QCD). In recent years, the expertise gathered from lattice QCD studies has been used to explore new extensions of the Standard Model of particle physics that include strong dynamics. This change of gear in lattice field theories is related to the growing experimental search for new physics, from accelerator facilities like the Large Hadron Collider (LHC) to dark matter detectors like LUX or ADMX. In my presentation I will explore different plausible scenarios for physics beyond the standard model where strong dynamics play a dominant role and can be tackled by numerical lattice simulations. The importance of lattice field theories is highlighted in the context of dark matter searches and the search for new resonances at the LHC. I acknowledge the support of the DOE under Contract DE-AC52-07NA27344 (LLNL).

  20. Filamentous Phages As a Model System in Soft Matter Physics.

    PubMed

    Dogic, Zvonimir

    2016-01-01

    Filamentous phages have unique physical properties, such as uniform particle lengths, that are not found in other model systems of rod-like colloidal particles. Consequently, suspensions of such phages have provided powerful model systems that have advanced our understanding of soft matter physics in general and liquid crystals in particular. We describe some of these advances. In particular we briefly summarize how suspensions of filamentous phages have provided valuable insight into the field of colloidal liquid crystals. We also describe recent experiments on filamentous phages that have elucidated a robust pathway for assembly of 2D membrane-like materials. Finally, we outline unique structural properties of filamentous phages that have so far remained largely unexplored yet have the potential to further advance soft matter physics and material science. PMID:27446051