Science.gov

Sample records for geant4 physics models

  1. Progress in Geant4 Electromagnetic Physics Modelling and Validation

    NASA Astrophysics Data System (ADS)

    Apostolakis, J.; Asai, M.; Bagulya, A.; Brown, J. M. C.; Burkhardt, H.; Chikuma, N.; Cortes-Giraldo, M. A.; Elles, S.; Grichine, V.; Guatelli, S.; Incerti, S.; Ivanchenko, V. N.; Jacquemier, J.; Kadri, O.; Maire, M.; Pandola, L.; Sawkey, D.; Toshito, T.; Urban, L.; Yamashita, T.

    2015-12-01

    In this work we report on recent improvements in the electromagnetic (EM) physics models of Geant4 and new validations of EM physics. Improvements have been made in models of the photoelectric effect, Compton scattering, gamma conversion to electron and muon pairs, fluctuations of energy loss, multiple scattering, synchrotron radiation, and high energy positron annihilation. The results of these developments are included in the new Geant4 version 10.1 and in patches to previous versions 9.6 and 10.0, which are planned to be used for production in Run 2 at the LHC. The Geant4 validation suite for EM physics has been extended and new validation results are shown in this work. In particular, the effect of gamma-nuclear interactions on EM shower shape at LHC energies is discussed.

  2. Validation of Geant4 hadronic physics models at intermediate energies

    NASA Astrophysics Data System (ADS)

    Banerjee, Sunanda; Geant4 Hadronic Group

    2010-04-01

    GEANT4 provides a number of physics models at intermediate energies (corresponding to incident momenta in the range 1-20 GeV/c). Recently, these models have been validated against existing data from a number of experiments: (1) inclusive proton and neutron production with a variety of beams (π-, π+, p) at different energies between 1 and 9 GeV/c on a number of nuclear targets (from beryllium to uranium); (2) inclusive pion/kaon/proton production from 14.6 GeV/c proton beams on nuclear targets (from beryllium to gold); (3) inclusive pion production from pion beams between 3 and 13 GeV/c on a number of nuclear targets (from beryllium to lead). The results of simulation/data comparisons for the different GEANT4 models are discussed in the context of validating the models and determining their usage in physics lists for high energy applications. Because the number of available validations is growing, and because they must be repeated at regular intervals matching the GEANT4 release schedule, automated methods of validation are being developed.

  3. Progress in Hadronic Physics Modelling in Geant4

    SciTech Connect

    Apostolakis, John; Folger, Gunter; Grichine, Vladimir; Heikkinen, Aatos; Howard, Alexander; Ivanchenko, Vladimir; Kaitaniemi, Pekka; Koi, Tatsumi; Kosov, Mikhail; Quesada, Jose Manuel; Ribon, Alberto; Uzhinsky, Vladimir; Wright, Dennis

    2011-11-28

    Geant4 offers a set of models to simulate hadronic showers in calorimeters. Recent improvements to several models relevant to the modelling of hadronic showers are discussed. These include improved cross sections, a revision of the FTF model, the addition of quasi-elastic scattering to the QGS model, and enhancements in the nuclear precompound and de-excitation models. The validation of physics models against thin target experiments has been extended especially in the energy region 10 GeV and below. Examples of new validation results are shown.

  4. Validation of Geant4 physics models for 56Fe ion beam in various media

    NASA Astrophysics Data System (ADS)

    Jalota, Summit; Kumar, Ashavani

    2012-11-01

    The depth-dose distribution of a 56Fe ion beam has been studied in water, polyethylene, Nextel, Kevlar and aluminum media. The dose reduction versus areal depth is also calculated for 56Fe ions in carbon, polyethylene and aluminum using the Monte Carlo simulation toolkit Geant4. This study validates the physics models available in Geant4 by comparing the simulated results with experimental data available in the literature. Simulations are performed using the binary cascade (BIC), abrasion-ablation (AA) and quantum molecular dynamics (QMD) models integrated into Geant4. Deviations from the experimental results may be due to the choice of a simplified geometry. The paper also addresses the differences among the results of the various models.

  5. Preliminary Investigation of Microdosimetric Track Structure Physics Models in Geant4-DNA and RITRACKS

    PubMed Central

    Bezak, Eva

    2015-01-01

    The major differences between the physics models in Geant4-DNA and RITRACKS Monte Carlo packages are investigated. Proton and electron ionisation interactions and electron excitation interactions in water are investigated in the current work. While these packages use similar semiempirical physics models for inelastic cross-sections, the implementation of these models is demonstrated to be significantly different. This is demonstrated in a simple Monte Carlo simulation designed to identify differences in interaction cross-sections. PMID:26124856

  6. Validation of Hadronic Models in Geant4

    SciTech Connect

    Koi, Tatsumi; Wright, Dennis H.; Folger, Gunter; Ivantchenko, Vladimir; Kossov, Mikhail; Starkov, Nikolai; Heikkinen, Aatos; Truscott, Pete; Lei, Fan; Wellisch, Hans-Peter

    2007-03-19

    Geant4 is a software toolkit for the simulation of the passage of particles through matter. It offers an abundant set of hadronic models, from thermal neutron interactions to ultra-relativistic hadrons. An overview of validations of Geant4 hadronic physics is presented, based on thin-target measurements. In most cases good agreement is found between Monte Carlo predictions and experimental data; however, several problems have been detected which require some improvement in the models.

  7. Validation of Hadronic Models in GEANT4

    SciTech Connect

    Koi, Tatsumi; Wright, Dennis H.; Folger, Gunter; Ivanchenko, Vladimir; Kossov, Mikhail; Starkov, Nikolai; Heikkinen, Aatos; Truscott, Peter; Lei, Fan; Wellisch, Hans-Peter

    2007-09-26

    Geant4 is a software toolkit for the simulation of the passage of particles through matter. It offers an abundant set of hadronic models, from thermal neutron interactions to ultra-relativistic hadrons. An overview of validations of Geant4 hadronic physics is presented, based on thin-target measurements. In most cases good agreement is found between Monte Carlo predictions and experimental data; however, several problems have been detected which require some improvement in the models.

  8. The Geant4 physics validation repository

    SciTech Connect

    Wenzel, H.; Yarba, J.; Dotti, A.

    2015-01-01

    The Geant4 collaboration regularly performs validation and regression tests. The results are stored in a central repository and can be easily accessed via a web application. In this article we describe the Geant4 physics validation repository, which consists of a relational database storing experimental data and Geant4 test results, a Java API and a web application. The functionality of these components and the technology choices made are also described.

  9. The Geant4 physics validation repository

    NASA Astrophysics Data System (ADS)

    Wenzel, H.; Yarba, J.; Dotti, A.

    2015-12-01

    The Geant4 collaboration regularly performs validation and regression tests. The results are stored in a central repository and can be easily accessed via a web application. In this article we describe the Geant4 physics validation repository, which consists of a relational database storing experimental data and Geant4 test results, a Java API and a web application. The functionality of these components and the technology choices we made are also described.

  10. Implementing NRF Physics in Geant4

    SciTech Connect

    Jordan, David V.; Warren, Glen A.

    2006-07-01

    The Geant4 radiation transport Monte Carlo code toolkit currently does not support nuclear resonance fluorescence (NRF). After a brief review of NRF physics, plans for implementing this physics process in Geant4, and validating the output of the code, are described. The plans will be executed as Task 3 of project 50799, "Nuclear Resonance Fluorescence Signatures (NuRFS)".

  11. Physical models implemented in the GEANT4-DNA extension of the GEANT4 toolkit for calculating initial radiation damage at the molecular level.

    PubMed

    Villagrasa, C; Francis, Z; Incerti, S

    2011-02-01

    The ROSIRIS project aims to study the radiobiology of integrated systems for medical treatment optimisation using ionising radiations and evaluate the associated risk. In the framework of this project, one research focus is the interpretation of the initial radio-induced damage in DNA created by ionising radiation (and detected by γ-H2AX foci analysis) from the track structure of the incident particles. In order to calculate the track structure of ionising particles at a nanometric level, the Geant4 Monte Carlo toolkit was used. Geant4 (Object Oriented Programming Architecture in C++) offers a common platform, available free to all users and relatively easy to use. Nevertheless, the current low-energy threshold for electromagnetic processes in GEANT4 is set to 1 keV (250 eV using the Livermore processes), which is an unsuitable value for nanometric applications. To lower this energy threshold, the necessary interaction processes and models were identified, and the corresponding available cross sections collected from the literature. They are mostly based on the plane-wave Born approximation (first Born approximation, or FBA) for inelastic interactions and on semi-empirical models for energies where the FBA fails (at low energies). In this paper, the extensions that have been introduced into the 9.3 release of the Geant4 toolkit are described, the so-called Geant4-DNA extension, including a set of processes and models adapted in this study and permitting the simulation of electron (8 eV-1 MeV), proton (100 eV-100 MeV) and alpha particle (1 keV-10 MeV) interactions in liquid water. PMID:21186212
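
    The applicability ranges quoted above (electron: 8 eV-1 MeV, proton: 100 eV-100 MeV, alpha particle: 1 keV-10 MeV) can be captured in a small validity check. The sketch below is illustrative only; the table and function names are hypothetical and are not part of the Geant4-DNA API.

```python
# Hypothetical helper encoding the Geant4-DNA applicability ranges
# quoted in the abstract.  Energies are expressed in eV.

EV, KEV, MEV = 1.0, 1e3, 1e6

DNA_RANGES = {
    "e-":     (8 * EV,    1 * MEV),
    "proton": (100 * EV,  100 * MEV),
    "alpha":  (1 * KEV,   10 * MEV),
}

def dna_applicable(particle: str, energy_ev: float) -> bool:
    """True if the quoted Geant4-DNA models cover this particle/energy."""
    lo, hi = DNA_RANGES[particle]
    return lo <= energy_ev <= hi
```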

  12. Geant4 electromagnetic physics updates for space radiation effects simulation

    NASA Astrophysics Data System (ADS)

    Ivantchenko, Anton; Nieminen, Petteri; Incerti, Sebastien; Santin, Giovanni; Ivantchenko, Vladimir; Grichine, Vladimir; Allison, John; Karamitros, Mathieu

    The Geant4 toolkit is used in many applications including space science studies. The new Geant4 version 10.0 released in December 2013 includes a major revision of the toolkit and offers multi-threaded mode for event level parallelism. At the same time, Geant4 electromagnetic and hadronic physics sub-libraries have been significantly updated. In order to validate the new and updated models Geant4 verification tests and benchmarks were extended. Part of these developments was sponsored by the European Space Agency in the context of research aimed at modelling radiation biological end effects. In this work, we present an overview of results of several benchmarks for electromagnetic physics models relevant to space science. For electromagnetic physics, recently Compton scattering, photoelectric effect, and Rayleigh scattering models have been improved and extended down to lower energies. Models of ionization and fluctuations have also been improved; special micro-dosimetry models for Silicon and liquid water were introduced; the main multiple scattering model was consolidated; and the atomic de-excitation module has been made available to all models. As a result, Geant4 predictions for space radiation effects obtained with different Physics Lists are in better agreement with the benchmark data than previous Geant4 versions. Here we present results of electromagnetic tests and models comparison in the energy interval 10 eV - 10 MeV.

  13. Physical Modelling of Proton and Heavy Ion Radiation using Geant4

    NASA Astrophysics Data System (ADS)

    Douglass, M.; Bezak, E.

    2012-10-01

    Protons and heavy ions are considered ideal particles for use in external beam radiotherapy, owing to the superior dose distributions that result when these particles are incident externally and to their relative biological effectiveness. While significant research has been performed on the properties and physical dose characteristics of heavy ions, the nuclear reactions (direct and fragmentation) undergone by the He4, C12 and Ne20 nuclei used in radiotherapy in materials other than water are still largely unexplored. In the current project, input code was developed for the Monte Carlo toolkit Geant4 version 9.3 to simulate the transport of several mono-energetic heavy ions through water. The relative dose contributions from secondary particles and nuclear fragments originating from the primary particles were investigated for each ion in both water and dense bone (ICRU) media. The results indicated that the relative contribution of nuclear fragments to the total physical dose increased with both increasing particle mass and increasing medium density. For 150 MeV protons, secondary particles were shown to contribute less than 0.5% of the peak dose, rising to as much as 25% for 10570 MeV neon ions in bone. When bone was substituted for water, the contributions from fragments increased by more than 6% for C12 and Ne20.

  14. Electro and gamma nuclear physics in Geant4

    SciTech Connect

    J.P. Wellisch; M. Kossov; P. Degtyarenko

    2003-03-01

    An adequate description of electro- and gamma-nuclear physics is of utmost importance in studies of electron beam dumps and intense electron beam accelerators. It is also mandatory for describing neutron backgrounds and activation in linear colliders. This physics was elaborated in Geant4 over the last year and has now entered the stage of practical application. The Geant4 photo-nuclear database at present contains about 50 nuclei for which photo-nuclear absorption cross sections have been measured. Of these, data on 14 nuclei are used to parametrize the gamma-nuclear reaction cross section. The resulting cross section is a complex, factorized function of A and e = log(E{gamma}), where E{gamma} is the energy of the incident photon. Electro-nuclear reactions are so closely connected with photo-nuclear reactions that they are sometimes also called ''photo-nuclear''. The one-photon exchange mechanism dominates in electro-nuclear reactions, so the electron can be substituted by a flux of equivalent photons. Folding this flux with the gamma-nuclear cross section yields an acceptable description of electro-nuclear physics. Final states in gamma- and electro-nuclear physics are described using chiral invariant phase-space decay at low gamma or equivalent photon energies, and the quark gluon string model at high energies. We present the modeling of this physics in Geant4 and show results from practical applications.
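
    The equivalent-photon folding described above (replacing the electron by a flux of virtual photons and integrating that flux against the gamma-nuclear cross section) can be sketched numerically. Both the leading-log photon flux and the resonance-shaped cross section below are simplified stand-ins for illustration, not Geant4's parameterisations; units are arbitrary.

```python
import math

ALPHA = 1.0 / 137.036  # fine-structure constant

def photon_flux(omega, e_electron):
    """Toy leading-log virtual-photon spectrum dn/domega."""
    return (ALPHA / math.pi) * math.log(e_electron / omega) / omega

def sigma_gamma_nuclear(omega):
    """Placeholder photo-nuclear cross section: a broad bump near 20 MeV."""
    return 1.0 / (1.0 + ((omega - 20.0) / 5.0) ** 2)

def sigma_electro_nuclear(e_electron, omega_min=5.0, n_steps=2000):
    """Fold the photon flux with sigma_gamma on a log-omega grid up to ~E_e."""
    lo, hi = math.log(omega_min), math.log(0.99 * e_electron)
    h = (hi - lo) / n_steps
    total = 0.0
    for i in range(n_steps):
        w = math.exp(lo + (i + 0.5) * h)   # midpoint in log(omega)
        total += photon_flux(w, e_electron) * sigma_gamma_nuclear(w) * w * h
    return total
```

    As expected from the logarithmic growth of the photon flux, the folded cross section rises slowly with electron energy.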

  15. Experimental quantification of Geant4 PhysicsList recommendations: methods and results

    NASA Astrophysics Data System (ADS)

    Basaglia, Tullio; Han, Min Cheol; Hoff, Gabriela; Kim, Chan Hyeong; Kim, Sung Hun; Grazia Pia, Maria; Saracco, Paolo

    2015-12-01

    The Geant4 physics_lists package encompasses predefined selections of physics processes and models to be used in simulation applications. Limited documentation is available in the literature about Geant4 pre-packaged PhysicsLists and their validation; the reports in the literature mainly concern specific use cases. This paper documents the epistemological grounds for the validation of Geant4 pre-packaged PhysicsLists (and their accessory classes, Builders and PhysicsConstructors) and presents some examples of the authors' scientific activity on this subject.

  16. GEANT4 physics evaluation with testbeam data of the ATLAS hadronic end-cap calorimeter

    NASA Astrophysics Data System (ADS)

    Kiryunin, A. E.; Oberlack, H.; Salihagić, D.; Schacht, P.; Strizenec, P.

    2006-05-01

    We evaluate the validity of the GEANT4 electromagnetic and hadronic physics models by comparing experimental data from beam tests of modules of the ATLAS hadronic end-cap calorimeter with GEANT4-based simulations. Two physics lists (LHEP and QGSP) for the simulation of hadronic showers are evaluated. Calorimeter performance parameters like the energy resolution and shapes of showers are studied both for electrons and charged pions. Furthermore, throughout the paper we compare GEANT4 and the corresponding predictions of GEANT3 used with the G-CALOR code for hadronic shower development.

  17. GEANT4 physics evaluation with testbeam data of the ATLAS hadronic end-cap calorimeter

    NASA Astrophysics Data System (ADS)

    Kiryunin, A. E.; Oberlack, H.; Salihagić, D.; Schacht, P.; Strizenec, P.

    2009-04-01

    The validation of GEANT4 physics models is done by comparing experimental data from beam tests of modules of the ATLAS hadronic end-cap calorimeter with GEANT4-based simulations. Various physics lists for the simulation of hadronic showers are evaluated. We present results of studies of the calorimeter performance parameters (such as energy resolution and shower shapes), as well as investigations of the influence of Birks' law and of cuts on the development time of hadronic showers.

  18. Comparison of GEANT4 very low energy cross section models with experimental data in water

    SciTech Connect

    Incerti, S.; Ivanchenko, A.; Karamitros, M.; Mantero, A.; Moretto, P.; Tran, H. N.; Mascialino, B.; Champion, C.; Ivanchenko, V. N.; Bernal, M. A.; Francis, Z.; Villagrasa, C.; Baldacchino, G.; Gueye, P.; Capra, R.; Nieminen, P.; Zacharatou, C.

    2010-09-15

    Purpose: The GEANT4 general-purpose Monte Carlo simulation toolkit is able to simulate physical interaction processes of electrons, hydrogen and helium atoms with charge states (H{sup 0}, H{sup +}) and (He{sup 0}, He{sup +}, He{sup 2+}), respectively, in liquid water, the main component of biological systems, down to the electron volt regime and the submicrometer scale, providing GEANT4 users with the so-called ''GEANT4-DNA'' physics models suitable for microdosimetry simulation applications. The corresponding software has been recently re-engineered in order to provide GEANT4 users with a coherent and unique approach to the simulation of electromagnetic interactions within the GEANT4 toolkit framework (since GEANT4 version 9.3 beta). This work presents a quantitative comparison of these physics models with a collection of experimental data in water collected from the literature. Methods: An evaluation of the closeness between the total and differential cross section models available in the GEANT4 toolkit for microdosimetry and experimental reference data is performed using a dedicated statistical toolkit that includes the Kolmogorov-Smirnov statistical test. The authors used experimental data acquired in water vapor as direct measurements in the liquid phase are not yet available in the literature. Comparisons with several recommendations are also presented. Results: The authors have assessed the compatibility of experimental data with GEANT4 microdosimetry models by means of quantitative methods. The results show that microdosimetric measurements in liquid water are necessary to assess quantitatively the validity of the software implementation for the liquid water phase. Nevertheless, a comparison with existing experimental data in water vapor provides a qualitative appreciation of the plausibility of the simulation models. The existing reference data themselves should undergo a critical interpretation and selection, as some of the series exhibit significant
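
    A minimal version of the two-sample Kolmogorov-Smirnov comparison used above can be written in a few lines. The authors' dedicated statistical toolkit is far more complete, so this sketch is for orientation only.

```python
import bisect

def ks_statistic(sample_a, sample_b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum absolute
    difference between the two empirical distribution functions."""
    a, b = sorted(sample_a), sorted(sample_b)
    d = 0.0
    for x in a + b:  # the EDF difference can only peak at a data point
        cdf_a = bisect.bisect_right(a, x) / len(a)
        cdf_b = bisect.bisect_right(b, x) / len(b)
        d = max(d, abs(cdf_a - cdf_b))
    return d
```

    The statistic is 0 for identical samples and 1 for fully disjoint ones; comparing it to a critical value yields the compatibility test used in the paper.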

  19. Track structure modeling in liquid water: A review of the Geant4-DNA very low energy extension of the Geant4 Monte Carlo simulation toolkit.

    PubMed

    Bernal, M A; Bordage, M C; Brown, J M C; Davídková, M; Delage, E; El Bitar, Z; Enger, S A; Francis, Z; Guatelli, S; Ivanchenko, V N; Karamitros, M; Kyriakou, I; Maigne, L; Meylan, S; Murakami, K; Okada, S; Payno, H; Perrot, Y; Petrovic, I; Pham, Q T; Ristic-Fira, A; Sasaki, T; Štěpán, V; Tran, H N; Villagrasa, C; Incerti, S

    2015-12-01

    Understanding the fundamental mechanisms involved in the induction of biological damage by ionizing radiation remains a major challenge of today's radiobiology research. The Monte Carlo simulation of physical, physicochemical and chemical processes involved may provide a powerful tool for the simulation of early damage induction. The Geant4-DNA extension of the general purpose Monte Carlo Geant4 simulation toolkit aims to provide the scientific community with an open source access platform for the mechanistic simulation of such early damage. This paper presents the most recent review of the Geant4-DNA extension, as available to Geant4 users since June 2015 (release 10.2 Beta). In particular, the review includes the description of new physical models for the description of electron elastic and inelastic interactions in liquid water, as well as new examples dedicated to the simulation of physicochemical and chemical stages of water radiolysis. Several implementations of geometrical models of biological targets are presented as well, and the list of Geant4-DNA examples is described.

  20. Diffusion-controlled reactions modeling in Geant4-DNA

    SciTech Connect

    Karamitros, M.; Luan, S.; Bernal, M.A.; Allison, J.; Baldacchino, G.; Davidkova, M.; Francis, Z.; Friedland, W.; Ivantchenko, V.; Ivantchenko, A.; Mantero, A.; Nieminem, P.; Santin, G.; Tran, H.N.; Stepan, V.; Incerti, S.

    2014-10-01

    Context: Under irradiation, a biological system undergoes a cascade of chemical reactions that can lead to an alteration of its normal operation. There are different types of radiation and many competing reactions, and as a result the kinetics of the chemical species is extremely complex. Simulation then becomes a powerful tool which, by describing the basic principles of chemical reactions, can reveal the dynamics of the macroscopic system. To understand the dynamics of biological systems under radiation, several research groups have, since the 1980s, pursued a mechanistic model describing all the physical, chemical and biological phenomena following the irradiation of single cells. This approach is generally divided into a succession of stages that follow each other in time: (1) the physical stage, where the ionizing particles interact directly with the biological material; (2) the physico-chemical stage, where the targeted molecules release their energy by dissociating, creating new chemical species; (3) the chemical stage, where the new chemical species interact with each other or with the biomolecules; (4) the biological stage, where the repair mechanisms of the cell come into play. This article focuses on the modeling of the chemical stage. Method: This article presents a general method for speeding up chemical reaction simulations in fluids, based on the Smoluchowski equation and Monte Carlo methods, in which all molecules are simulated explicitly and the solvent is treated as a continuum. The model describes diffusion-controlled reactions. The method has been implemented in Geant4-DNA. The keys to the new algorithm include: (1) the combination of a method to compute time steps dynamically with a Brownian bridge process to account for chemical reactions, which avoids costly fixed-time-step simulations; (2) a k-d tree data structure for quickly locating, for a given molecule, its closest reactants.
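
    The dynamic time-step idea in point (1) can be illustrated with a toy rule: choose dt so that the RMS diffusion jump sqrt(6*D*dt) stays a safe fraction of the distance to the nearest potential reactant (found in production with the k-d tree of point (2); brute force here). All names and constants below are illustrative, not the Geant4-DNA implementation.

```python
import math

def nearest_distance(pos, others):
    """Brute-force nearest neighbour (a k-d tree replaces this at scale)."""
    return min(math.dist(pos, q) for q in others)

def dynamic_time_step(pos, others, diffusion_coeff, safety=0.1, dt_max=1.0):
    """Largest dt with sqrt(6*D*dt) <= safety * nearest-reactant distance."""
    r = nearest_distance(pos, others)
    dt = (safety * r) ** 2 / (6.0 * diffusion_coeff)
    return min(dt, dt_max)
```

    A molecule with a reactant nearby thus takes small, carefully resolved steps, while an isolated molecule takes large ones, which is what removes the cost of a fixed global time step.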

  21. Diffusion-controlled reactions modeling in Geant4-DNA

    NASA Astrophysics Data System (ADS)

    Karamitros, M.; Luan, S.; Bernal, M. A.; Allison, J.; Baldacchino, G.; Davidkova, M.; Francis, Z.; Friedland, W.; Ivantchenko, V.; Ivantchenko, A.; Mantero, A.; Nieminem, P.; Santin, G.; Tran, H. N.; Stepan, V.; Incerti, S.

    2014-10-01

    Context: Under irradiation, a biological system undergoes a cascade of chemical reactions that can lead to an alteration of its normal operation. There are different types of radiation and many competing reactions, and as a result the kinetics of the chemical species is extremely complex. Simulation then becomes a powerful tool which, by describing the basic principles of chemical reactions, can reveal the dynamics of the macroscopic system. To understand the dynamics of biological systems under radiation, several research groups have, since the 1980s, pursued a mechanistic model describing all the physical, chemical and biological phenomena following the irradiation of single cells. This approach is generally divided into a succession of stages that follow each other in time: (1) the physical stage, where the ionizing particles interact directly with the biological material; (2) the physico-chemical stage, where the targeted molecules release their energy by dissociating, creating new chemical species; (3) the chemical stage, where the new chemical species interact with each other or with the biomolecules; (4) the biological stage, where the repair mechanisms of the cell come into play. This article focuses on the modeling of the chemical stage. Method: This article presents a general method for speeding up chemical reaction simulations in fluids, based on the Smoluchowski equation and Monte Carlo methods, in which all molecules are simulated explicitly and the solvent is treated as a continuum. The model describes diffusion-controlled reactions. The method has been implemented in Geant4-DNA. The keys to the new algorithm include: (1) the combination of a method to compute time steps dynamically with a Brownian bridge process to account for chemical reactions, which avoids costly fixed-time-step simulations; (2) a k-d tree data structure for quickly locating, for a given molecule, its closest reactants.

  22. Modeling the relativistic runaway electron avalanche and the feedback mechanism with GEANT4

    PubMed Central

    Skeltved, Alexander Broberg; Østgaard, Nikolai; Carlson, Brant; Gjesteland, Thomas; Celestin, Sebastien

    2014-01-01

    This paper presents the first study that uses the GEometry ANd Tracking 4 (GEANT4) toolkit to make quantitative comparisons with other modeling results related to the production of terrestrial gamma ray flashes and high-energy particle emission from thunderstorms. We study the relativistic runaway electron avalanche (RREA) and the relativistic feedback process, as well as the production of bremsstrahlung photons from runaway electrons. The Monte Carlo simulations take into account the effects of electron ionization, electron-electron (Møller) and electron-positron (Bhabha) scattering, as well as the bremsstrahlung process and pair production, in the 250 eV to 100 GeV energy range. Our results indicate that the multiplication of electrons during the development of RREAs, and under the influence of feedback, is consistent with previous estimates. This is important for validating GEANT4 as a tool to model RREAs and feedback in homogeneous electric fields. We also determine the ratio of bremsstrahlung photons to energetic electrons Nγ/Ne, and show that this ratio depends on the electric field in a way that can be expressed through the avalanche time τ(E) and the bremsstrahlung coefficient α(ε). In addition, we present comparisons of GEANT4 simulations performed with a "standard" and a "low-energy" physics list, both validated in the 1 keV to 100 GeV energy range. This comparison shows that the choice of physics list used in GEANT4 simulations has a significant effect on the results. Key points: testing the feedback mechanism with GEANT4; validating the GEANT4 programming toolkit; studying the ratio of bremsstrahlung photons to electrons at TGF source altitude. PMID:26167437
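
    The scaling behind the Nγ/Ne ratio can be made concrete with a toy model: if electrons multiply as Ne(t) = N0 exp(t/τ) and photons are emitted at a rate α per electron, then Nγ/Ne tends to the product ατ. This is only a schematic reading of the abstract; in the paper, τ(E) and α(ε) are field- and energy-dependent quantities obtained from the simulations.

```python
import math

def avalanche_electrons(n0, t, tau):
    """RREA electron count: exponential multiplication with avalanche time tau."""
    return n0 * math.exp(t / tau)

def photon_electron_ratio(alpha, tau, t):
    """N_gamma/N_e for dN_gamma/dt = alpha * N_e with N_e = exp(t/tau).

    Integrating gives N_gamma(t) = alpha * tau * (N_e(t) - 1), so the
    ratio approaches alpha * tau once t >> tau.
    """
    n_e = math.exp(t / tau)
    n_gamma = alpha * tau * (n_e - 1.0)
    return n_gamma / n_e
```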

  23. Geant4 hadronic physics validation with ATLAS Tile Calorimeter test-beam data

    NASA Astrophysics Data System (ADS)

    Alexa, C.; Constantinescu, S.; Diţă, S.

    2006-10-01

    We present comparison studies between Geant4 shower packages and ATLAS Tile Calorimeter test-beam data collected at CERN in the H8 beam line at the SPS. Emphasis is put on hadronic physics lists and on data concerning differences between the Tilecal response to pions and protons of the same energy. The ratio between the pure hadronic fraction of the pion and that of the proton, Fhπ/Fhp, was estimated with Tilecal test-beam data and compared with Geant4 simulations.

  24. Geant4 models for simulation of hadron/ion nuclear interactions at moderate and low energies

    NASA Astrophysics Data System (ADS)

    Ivantchenko, Anton; Ivanchenko, Vladimir; Quesada, Jose-Manuel; Wright, Dennis

    The Geant4 toolkit is intended for Monte Carlo simulation of particle transport in media. It was initially designed for High Energy Physics purposes such as experiments at the Large Hadron Collider (LHC) at CERN. The toolkit offers a set of models allowing effective simulation of cosmic ray interactions with different materials. For moderate- and low-energy hadron/ion interactions with nuclei there are a number of competitive models: the Binary and Bertini intra-nuclear cascade models, the quantum molecular dynamics model (QMD), the INCL/ABLA cascade model, and the Chiral Invariant Phase Space decay model (CHIPS). We report the status of these models in the recent version of Geant4 (release 9.3, December 2009). The Bertini cascade internal cross sections were upgraded. The native Geant4 precompound and de-excitation models were used in the Binary cascade and QMD; they were significantly improved, including emission of light fragments, the Fermi break-up model, the General Evaporation Model (GEM), the multi-fragmentation model, and the fission model. Comparisons between model predictions and thin-target data for neutron, proton, light ion, and isotope production are presented and discussed. These validations focus on target materials important for space missions.

  25. Introduction to the Geant4 Simulation toolkit

    SciTech Connect

    Guatelli, S.; Cutajar, D.; Rosenfeld, A. B.; Oborn, B.

    2011-05-05

    Geant4 is a Monte Carlo simulation toolkit describing the interactions of particles with matter. Geant4 is widely used in radiation physics research, from High Energy Physics to medical physics and space science, thanks to its sophisticated physics component coupled with advanced functionality in geometry description. Geant4 is widely used at the Centre for Medical Radiation Physics (CMRP) at the University of Wollongong to characterise and optimise novel detector concepts, radiotherapy treatments, and imaging solutions. This lecture consists of an introduction to the Monte Carlo method and to Geant4. Particular attention is devoted to the Geant4 physics component and to the physics models describing electromagnetic and hadronic interactions. The second part of the lecture focuses on the methodology to adopt when developing a Geant4 simulation application.

  26. Introduction to the Geant4 Simulation toolkit

    NASA Astrophysics Data System (ADS)

    Guatelli, S.; Cutajar, D.; Oborn, B.; Rosenfeld, A. B.

    2011-05-01

    Geant4 is a Monte Carlo simulation toolkit describing the interactions of particles with matter. Geant4 is widely used in radiation physics research, from High Energy Physics to medical physics and space science, thanks to its sophisticated physics component coupled with advanced functionality in geometry description. Geant4 is widely used at the Centre for Medical Radiation Physics (CMRP) at the University of Wollongong to characterise and optimise novel detector concepts, radiotherapy treatments, and imaging solutions. This lecture consists of an introduction to the Monte Carlo method and to Geant4. Particular attention is devoted to the Geant4 physics component and to the physics models describing electromagnetic and hadronic interactions. The second part of the lecture focuses on the methodology to adopt when developing a Geant4 simulation application.

  7. Analysis of GEANT4 Physics List Properties in the 12 GeV MOLLER Simulation Framework

    NASA Astrophysics Data System (ADS)

    Haufe, Christopher; Moller Collaboration

    2013-10-01

    To determine the validity of new physics beyond the scope of the electroweak theory, nuclear physicists across the globe have been collaborating on future endeavors that will provide the precision needed to confirm these speculations. One of these is the MOLLER experiment, a low-energy particle experiment that will utilize the 12 GeV upgrade of Jefferson Lab's CEBAF accelerator. The motivation of this experiment is to measure the parity-violating asymmetry of polarized electrons scattered off unpolarized electrons in a liquid hydrogen target. This measurement would allow a more precise determination of the electron's weak charge and weak mixing angle. While still in its planning stages, the MOLLER experiment requires a detailed simulation framework in order to determine how the project should be run in the future. The simulation framework for MOLLER, called ``remoll'', is written with GEANT4. As a result, the simulation can utilize a number of GEANT4 physics lists, each constraining particle interactions according to a different particle physics model. By comparing these lists with one another using the data-analysis application ROOT, the optimal physics list for the MOLLER simulation can be determined and implemented. This material is based upon work supported by the National Science Foundation under Grant No. 714001.
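
    The comparison described above, running the same simulation under several physics lists and ranking them against reference data, can be sketched with a simple goodness-of-fit statistic. The binned values and list names below are illustrative placeholders, not actual remoll or ROOT output:

    ```python
    def chi_square(observed, expected):
        """Pearson chi-square between two binned distributions (expected bins > 0)."""
        return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

    # Hypothetical binned energy deposits (arbitrary units) produced under two
    # physics lists, compared against a reference distribution.
    reference = [120, 340, 560, 410, 180]
    list_a    = [118, 352, 548, 405, 190]
    list_b    = [140, 300, 610, 380, 150]

    scores = {"list_a": chi_square(list_a, reference),
              "list_b": chi_square(list_b, reference)}
    best = min(scores, key=scores.get)  # lower chi-square = closer to reference
    print(best, scores)
    ```

    In practice the comparison would run over full histograms (rates, energy deposits, radial profiles) rather than five toy bins, but the ranking logic is the same.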

  8. GEANT4 Hadronic Physics Validation with Lhc Test-Beam Data

    NASA Astrophysics Data System (ADS)

    Alexa, Călin

    2005-02-01

    In the framework of the LHC Computing Grid (LCG) Simulation Physics Validation Project, we present first conclusions about the validation of the Geant4 hadronic physics lists based on comparisons with test-beam data collected with three LHC calorimeters: the ATLAS Tilecal, the ATLAS HEC and the CMS HCAL.

  9. Comparison of electromagnetic and hadronic models generated using Geant 4 with antiproton dose measured in CERN.

    PubMed

    Tavakoli, Mohammad Bagher; Reiazi, Reza; Mohammadi, Mohammad Mehdi; Jabbari, Keyvan

    2015-01-01

    After the idea of antiproton cancer treatment was proposed in 1984, many experiments were launched to investigate the physical and radiobiological properties of antiprotons arising from their annihilation reactions. One of these experiments has been performed at the European Organization for Nuclear Research (CERN) using the Antiproton Decelerator. Its ultimate goal was to assess the dosimetric and radiobiological properties of antiproton beams in order to estimate their suitability for radiotherapy. One difficulty is that the antiproton beam at CERN was unavailable for a long time, so verification of Monte Carlo codes for simulating the antiproton depth dose could be useful. Among available simulation codes, Geant4 provides acceptable flexibility and extensibility, which has progressively led to the development of novel Geant4 applications in research domains, especially modeling the biological effects of ionizing radiation at the sub-cellular scale. In this study, the depth dose corresponding to the CERN antiproton beam energy was calculated with Geant4 using all the standard physics lists currently available and benchmarked for other use cases. Overall, none of the standard physics lists was able to reproduce the antiproton percentage depth dose. Although with some models our results were promising, the Bragg peak level remained the point of concern for our study. It is concluded that the Bertini model with high-precision neutron tracking (QGSP_BERT_HP) best matches the experimental data, though it is also the slowest model among the physics lists.
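
    A percentage depth dose comparison of the kind described above reduces, at its simplest, to normalizing each simulated curve to its maximum and locating the Bragg peak. A minimal sketch with toy numbers (not the CERN measurement or the actual Geant4 output):

    ```python
    def percentage_depth_dose(dose):
        """Normalize a raw depth-dose curve to its maximum (PDD, in percent)."""
        peak = max(dose)
        return [100.0 * d / peak for d in dose]

    def bragg_peak_depth(depths, dose):
        """Depth at which the deposited dose is maximal."""
        return max(zip(depths, dose), key=lambda p: p[1])[0]

    # Toy depth-dose values (arbitrary units) with the sharp peak near the end
    # of range that is characteristic of charged-hadron beams.
    depths = [0, 2, 4, 6, 8, 10]       # cm
    dose   = [30, 32, 35, 42, 90, 12]

    pdd = percentage_depth_dose(dose)
    print(bragg_peak_depth(depths, dose), pdd[0])
    ```

    Comparing physics lists then amounts to comparing the PDD values and peak positions produced by each list against the measured curve.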

  10. Comparison of electromagnetic and hadronic models generated using Geant 4 with antiproton dose measured in CERN

    PubMed Central

    Tavakoli, Mohammad Bagher; Reiazi, Reza; Mohammadi, Mohammad Mehdi; Jabbari, Keyvan

    2015-01-01

    After proposing the idea of antiproton cancer treatment in 1984 many experiments were launched to investigate different aspects of physical and radiobiological properties of antiproton, which came from its annihilation reactions. One of these experiments has been done at the European Organization for Nuclear Research known as CERN using the antiproton decelerator. The ultimate goal of this experiment was to assess the dosimetric and radiobiological properties of beams of antiprotons in order to estimate the suitability of antiprotons for radiotherapy. One difficulty on this way was the unavailability of antiproton beam in CERN for a long time, so the verification of Monte Carlo codes to simulate antiproton depth dose could be useful. Among available simulation codes, Geant4 provides acceptable flexibility and extensibility, which progressively lead to the development of novel Geant4 applications in research domains, especially modeling the biological effects of ionizing radiation at the sub-cellular scale. In this study, the depth dose corresponding to CERN antiproton beam energy by Geant4 recruiting all the standard physics lists currently available and benchmarked for other use cases were calculated. Overall, none of the standard physics lists was able to draw the antiproton percentage depth dose. Although, with some models our results were promising, the Bragg peak level remained as the point of concern for our study. It is concluded that the Bertini model with high precision neutron tracking (QGSP_BERT_HP) is the best to match the experimental data though it is also the slowest model to simulate events among the physics lists. PMID:26170558

  12. Evaluation of proton inelastic reaction models in Geant4 for prompt gamma production during proton radiotherapy

    NASA Astrophysics Data System (ADS)

    Jeyasugiththan, Jeyasingam; Peterson, Stephen W.

    2015-10-01

    During proton beam radiotherapy, discrete secondary prompt gamma rays are induced by inelastic nuclear reactions between protons and nuclei in the human body. In recent years, the Geant4 Monte Carlo toolkit has played an important role in the development of a device for real-time dose and range verification using prompt gamma radiation. Unfortunately, the default physics models in Geant4 do not reliably replicate the measured prompt gamma emission. Determining a suitable physics model for low-energy proton inelastic interactions will boost the accuracy of prompt gamma simulations. Among the built-in physics models, we found that the precompound model with a modified initial exciton state of 2 (1 particle, 1 hole) produced more accurate discrete gamma lines from the most important elements found within the body, such as 16O, 12C and 14N, when comparing them with the available gamma production cross section data. Using the modified physics model, we investigated the prompt gamma spectra produced in a water phantom by a 200 MeV pencil beam of protons. The spectra were obtained using a LaBr3 detector with a time-of-flight (TOF) window and a BGO active shield to reduce the secondary neutron and gamma background. The simulations show that a 2 ns TOF window could reduce the secondary neutron flux hitting the detector by 99%. The results show that using both timing and active shielding can remove up to 85% of the background radiation, which includes a 33% reduction by BGO subtraction.
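
    The TOF rejection described above exploits the fact that prompt gammas travel at c while MeV-scale secondary neutrons are far slower. A back-of-the-envelope sketch; the 30 cm flight path and the particle energies are illustrative assumptions, not the geometry of the study:

    ```python
    import math

    C = 299_792_458.0   # speed of light, m/s
    M_N = 939.565       # neutron rest mass, MeV

    def arrival_time_ns(distance_m, kinetic_energy_mev, rest_mass_mev):
        """Relativistic time of flight; rest_mass_mev = 0 gives a photon."""
        if rest_mass_mev == 0.0:
            beta = 1.0
        else:
            gamma = 1.0 + kinetic_energy_mev / rest_mass_mev
            beta = math.sqrt(1.0 - 1.0 / gamma**2)
        return distance_m / (beta * C) * 1e9

    d = 0.30  # assumed flight path, m
    t_gamma   = arrival_time_ns(d, 4.44, 0.0)   # prompt gamma (e.g. a 12C line)
    t_neutron = arrival_time_ns(d, 10.0, M_N)   # 10 MeV secondary neutron

    # A ~2 ns acceptance window opened at the photon arrival passes the prompt
    # gammas but rejects this neutron, which arrives several ns later.
    print(t_gamma, t_neutron)
    ```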

  13. Modeling the production and acceleration of runaway electrons in strong inhomogeneous electric fields with GEANT4

    NASA Astrophysics Data System (ADS)

    Broberg Skeltved, Alexander; Østgaard, Nikolai

    2015-04-01

    The mechanism responsible for the production of Terrestrial Gamma-ray Flashes (TGFs) is not yet fully understood. However, from satellite measurements we know that approximately 10^17 relativistic electrons must be produced at a source altitude of 15 km in order to explain the measured photon intensity. It is also well established that TGFs and lightning are interlinked. One suggested mechanism is the production and multiplication of runaway electrons in the streamer-leader electric fields. We report on a new study that uses the Geometry and Tracking (GEANT4) programming toolkit to model the acceleration and multiplication of electrons in strong inhomogeneous electric fields such as those occurring in lightning leaders. In this model we implement a physics list of cross-sections developed by the GEANT4 collaboration to model low-energy particle interactions, the Low Background Experiments (LBE) list. It has been shown that the choice of physics is crucial to obtain correct results. This physics list includes elastic scattering of electrons according to the Møller scattering method and bremsstrahlung according to the Seltzer-Berger method. In the model we simulate particle interactions explicitly for energies above 250 eV (10 eV for photons). Below 250 eV a continuous energy-loss function is used.

  14. Evaluation on Geant4 Hadronic Models for Pion Minus, Pion Plus and Neutron Particles as Major Antiproton Annihilation Products.

    PubMed

    Tavakoli, Mohammad Bagher; Mohammadi, Mohammad Mehdi; Reiazi, Reza; Jabbari, Keyvan

    2015-01-01

    Geant4 is an open source simulation toolkit based on C++ whose advantages have progressively led to applications in research domains, especially modeling the biological effects of ionizing radiation at the sub-cellular scale. However, it has been shown that Geant4 does not give reasonable results in the prediction of antiproton dose, especially at the Bragg peak. One reason could be the lack of a reliable physics model to predict the final states of annihilation products such as pions. Considering that most of the antiproton deposited dose results from high-LET nuclear fragments following pion interactions with surrounding nucleons, we reproduced depth dose curves for the most probable energy range of pions and for neutrons using Geant4. We consider this work one step toward understanding the origin of the error and, finally, verification of Geant4 for antiproton tracking. Geant4 toolkit version 9.4.6.p01 and Fluka version 2006.3 were used to reproduce the depth dose curves of 220 MeV pions (both negative and positive) and 70 MeV neutrons. The geometry applied in the simulations consists of a 20 × 20 × 20 cm³ water tank, similar to that used at CERN for antiproton relative dose measurements. Different physics lists, including Quark-Gluon String Precompound (QGSP)_Binary Cascade (BIC)_HP, the recommended setting for hadron therapy, were used. In the case of pions, Geant4 produced at least 5% dose discrepancy between different physics lists at depths close to the entrance point; up to 15% discrepancy was found in some cases, such as QBBC compared to QGSP_BIC_HP. A significant difference was observed in the dose profiles of different Geant4 physics lists at small depths for a beam of pions. In the case of neutrons, a large dose discrepancy was observed when the LHEP or LHEP_EMV lists were applied; its magnitude could be even 50% greater than the dose calculated by LHEP (or LHEP_EMV) at larger depths. We found that the effect of different Geant4 physics lists in

  15. Evaluation on Geant4 Hadronic Models for Pion Minus, Pion Plus and Neutron Particles as Major Antiproton Annihilation Products

    PubMed Central

    Tavakoli, Mohammad Bagher; Mohammadi, Mohammad Mehdi; Reiazi, Reza; Jabbari, Keyvan

    2015-01-01

    Geant4 is an open source simulation toolkit based on C++ whose advantages have progressively led to applications in research domains, especially modeling the biological effects of ionizing radiation at the sub-cellular scale. However, it has been shown that Geant4 does not give reasonable results in the prediction of antiproton dose, especially at the Bragg peak. One reason could be the lack of a reliable physics model to predict the final states of annihilation products such as pions. Considering that most of the antiproton deposited dose results from high-LET nuclear fragments following pion interactions with surrounding nucleons, we reproduced depth dose curves for the most probable energy range of pions and for neutrons using Geant4. We consider this work one step toward understanding the origin of the error and, finally, verification of Geant4 for antiproton tracking. Geant4 toolkit version 9.4.6.p01 and Fluka version 2006.3 were used to reproduce the depth dose curves of 220 MeV pions (both negative and positive) and 70 MeV neutrons. The geometry applied in the simulations consists of a 20 × 20 × 20 cm³ water tank, similar to that used at CERN for antiproton relative dose measurements. Different physics lists, including Quark-Gluon String Precompound (QGSP)_Binary Cascade (BIC)_HP, the recommended setting for hadron therapy, were used. In the case of pions, Geant4 produced at least 5% dose discrepancy between different physics lists at depths close to the entrance point; up to 15% discrepancy was found in some cases, such as QBBC compared to QGSP_BIC_HP. A significant difference was observed in the dose profiles of different Geant4 physics lists at small depths for a beam of pions. In the case of neutrons, a large dose discrepancy was observed when the LHEP or LHEP_EMV lists were applied; its magnitude could be even 50% greater than the dose calculated by LHEP (or LHEP_EMV) at larger depths. We found that the effect of different Geant4 physics lists in

  17. A Roadmap For Geant4

    NASA Astrophysics Data System (ADS)

    Asai, Makoto

    2012-12-01

    The Geant4 simulation toolkit is now in the 14th year of its production phase. Geant4 is the choice of most current and near future high energy physics experiments as their simulation engine, and it is also widely used in astrophysics, space engineering, medicine and industrial application domains. Geant4 is a “living” code under continuous development; improvement of physics quality and computational speed is still a priority for Geant4. It is evolving and being enriched with new functionalities. On the other hand, the simulation paradigm that prevailed during the foundation of Geant4 is now being rethought because of new technologies in both computer hardware and software. The Geant4 Collaboration has identified many options and possibilities. Geant4 has accommodated some of these by providing a multi-threading prototype based on event-level parallelism. In this article we discuss the past, present and future of the Geant4 toolkit.

  18. The GEANT4 Visualisation System

    SciTech Connect

    Allison, J.; Asai, M.; Barrand, G.; Donszelmann, M.; Minamimoto, K.; Tanaka, S.; Tcherniaev, E.; Tinslay, J.; /SLAC

    2007-11-02

    The Geant4 Visualization System is a multi-driver graphics system designed to serve the Geant4 Simulation Toolkit. It is aimed at the visualization of Geant4 data, primarily detector descriptions and simulated particle trajectories and hits. It can handle a variety of graphical technologies simultaneously and interchangeably, allowing the user to choose the visual representation most appropriate to requirements. It conforms to the low-level Geant4 abstract graphical user interfaces and introduces new abstract classes from which the various drivers are derived and that can be straightforwardly extended, for example, by the addition of a new driver. It makes use of an extendable class library of models and filters for data representation and selection. The Geant4 Visualization System supports a rich set of interactive commands based on the Geant4 command system. It is included in the Geant4 code distribution and maintained and documented like other components of Geant4.

  19. The Geant4 Bertini Cascade

    SciTech Connect

    Wright, D. H.; Kelsey, M. H.

    2015-12-01

    One of the medium energy hadron–nucleus interaction models in the Geant4 simulation toolkit is based partly on the Bertini intranuclear cascade model. Since its initial appearance in the toolkit, this model has been largely re-written in order to extend its physics capabilities and to reduce its memory footprint. Physics improvements include extensions in applicable energy range and incident particle types, and improved hadron–nucleon cross-sections and angular distributions. Interfaces have also been developed which allow the model to be coupled with other Geant4 models at lower and higher energies. The inevitable speed reductions due to enhanced physics have been mitigated by memory and CPU efficiency improvements. Details of these improvements, along with selected comparisons of the model to data, are discussed.

  20. The Geant4 Bertini Cascade

    NASA Astrophysics Data System (ADS)

    Wright, D. H.; Kelsey, M. H.

    2015-12-01

    One of the medium energy hadron-nucleus interaction models in the GEANT4 simulation toolkit is based partly on the Bertini intranuclear cascade model. Since its initial appearance in the toolkit, this model has been largely re-written in order to extend its physics capabilities and to reduce its memory footprint. Physics improvements include extensions in applicable energy range and incident particle types, and improved hadron-nucleon cross-sections and angular distributions. Interfaces have also been developed which allow the model to be coupled with other GEANT4 models at lower and higher energies. The inevitable speed reductions due to enhanced physics have been mitigated by memory and CPU efficiency improvements. Details of these improvements, along with selected comparisons of the model to data, are discussed.

  1. Application of TDCR-Geant4 modeling to standardization of 63Ni.

    PubMed

    Thiam, C; Bobin, C; Chauvenet, B; Bouchard, J

    2012-09-01

    As an alternative to the classical TDCR model applied to liquid scintillation (LS) counting, a stochastic approach based on the Geant4 toolkit is presented for the simulation of light emission inside the dedicated three-photomultiplier detection system. To this end, the Geant4 modeling includes a comprehensive description of the optical properties associated with each material constituting the optical chamber. The objective is to simulate the propagation of optical photons from their creation in the LS cocktail to the production of photoelectrons in the photomultipliers. First validated for the case of radionuclide standardization based on Cerenkov emission, the scintillation process has been added to the TDCR-Geant4 modeling using the Birks expression in order to account for the light-emission nonlinearity owing to ionization quenching. The scintillation yield of the commercial Ultima Gold LS cocktail has been determined from double-coincidence detection efficiencies obtained for 60Co and 54Mn with the 4π(LS)β-γ coincidence method. In this paper, the stochastic TDCR modeling is applied to the standardization of 63Ni (pure β- emitter; Emax = 66.98 keV) and the activity concentration is compared with the result given by the classical model.
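
    The Birks expression mentioned above models ionization quenching: light output per unit path saturates as the stopping power grows, which is what makes the light emission nonlinear. A minimal sketch; the kB value is an illustrative figure, not a measured Ultima Gold parameter:

    ```python
    def birks_light_yield(de_dx, scint_eff=1.0, kb=0.0126):
        """Birks' law: dL/dx = S * (dE/dx) / (1 + kB * dE/dx).
        de_dx in MeV/cm, kb in cm/MeV (illustrative value)."""
        return scint_eff * de_dx / (1.0 + kb * de_dx)

    # Light produced per unit deposited energy drops as stopping power rises:
    # this is the quenching nonlinearity the TDCR modeling must account for.
    low_let  = birks_light_yield(2.0)   / 2.0    # fast-electron-like dE/dx
    high_let = birks_light_yield(200.0) / 200.0  # slow-electron-like dE/dx
    print(low_let, high_let)
    ```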

  2. Monte Carlo modeling and validation of a proton treatment nozzle by using the Geant4 toolkit

    NASA Astrophysics Data System (ADS)

    Kim, Dae-Hyun; Kang, Young Nam; Suh, Tae-Suk; Shin, Jungwook; Kim, Jong Won; Yoo, Seung Hoon; Park, Seyjoon; Lee, Sang Hoon; Cho, Sungkoo; Shin, Dongho; Kim, Dae Yong; Lee, Se Byeong

    2012-10-01

    Modern commercial treatment planning systems for proton therapy use the pencil beam algorithm for calculating the absorbed dose. Although it is acceptable for clinical radiation treatment, the accuracy of this method is limited. Alternatively, the Monte Carlo method, which is relatively accurate in dose calculations, has been applied recently to proton therapy. To reduce the remaining uncertainty in proton therapy dose calculations, in the present study, we employed Monte Carlo simulations and the Geant4 simulation toolkit to develop a model for a of a proton treatment nozzle. The results from a Geant4-based medical application of the proton treatment nozzle were compared to the measured data. Simulations of the percentage depth dose profiles showed very good agreement within 1 mm in distal range and 3 mm in modulated width. Moreover, the lateral dose profiles showed good agreement within 3% in the central region of the field and within 10% in the penumbra regions. In this work, we proved that the Geant4 Monte Carlo model of a proton treatment nozzle could be used to the calculate proton dose distributions accurately.

  3. Geant4 Model Validation of Compton Suppressed System for Process monitoring of Spent Fuel

    SciTech Connect

    Bender, Sarah; Unlu, Kenan; Orton, Christopher R.; Schwantes, Jon M.

    2013-05-01

    Nuclear material accountancy is of continuous concern for the regulatory, safeguards, and verification communities. In particular, spent nuclear fuel reprocessing facilities pose one of the most difficult accountancy challenges: monitoring highly radioactive, fluid sample streams in near real-time. The Multi-Isotope Process Monitor will allow near-real-time indication of process alterations using passive gamma-ray detection coupled with multivariate analysis techniques, to guard against potential material diversion or to enhance domestic process monitoring. The Compton continuum from the dominant 661.7 keV 137Cs fission product peak obscures lower-energy lines which could be used for spectral and multivariate analysis. Compton suppression may be able to mitigate the challenges posed by the high continuum caused by scattering. A Monte Carlo simulation using the Geant4 toolkit is being developed to predict the expected suppressed spectrum from spent fuel samples and to estimate the reduction in the Compton continuum. Despite the lack of timing information between decay events in the particle management of Geant4, encouraging results were recorded utilizing only the information within individual decays, without accounting for accidental coincidences. The model has been validated with single and cascade decay emitters in two steps: as an unsuppressed system and with suppression activated. Results of the Geant4 model validation will be presented.
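
    Compton suppression as described above amounts to an anticoincidence veto: any event with a simultaneous hit in the guard shield is discarded, thinning the continuum while full-energy events survive. A toy offline version (a hypothetical event representation, not the actual Geant4 scoring code):

    ```python
    def suppress(events):
        """Keep only events with no coincident hit in the guard detector.
        Each event is (deposited_energy_keV, shield_hit: bool)."""
        return [e for e, vetoed in events if not vetoed]

    # Full-energy depositions leave nothing to escape (no shield hit), while
    # Compton-continuum events deposit partial energy and trigger the shield.
    events = [(661.7, False), (300.0, True), (661.7, False), (150.0, True)]
    spectrum = suppress(events)
    print(spectrum)
    ```

    The partial-energy (continuum) events are removed, which is exactly the reduction the Geant4 model is built to estimate.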

  4. Recent developments in GEANT4

    NASA Astrophysics Data System (ADS)

    Allison, J.; Amako, K.; Apostolakis, J.; Arce, P.; Asai, M.; Aso, T.; Bagli, E.; Bagulya, A.; Banerjee, S.; Barrand, G.; Beck, B. R.; Bogdanov, A. G.; Brandt, D.; Brown, J. M. C.; Burkhardt, H.; Canal, Ph.; Cano-Ott, D.; Chauvie, S.; Cho, K.; Cirrone, G. A. P.; Cooperman, G.; Cortés-Giraldo, M. A.; Cosmo, G.; Cuttone, G.; Depaola, G.; Desorgher, L.; Dong, X.; Dotti, A.; Elvira, V. D.; Folger, G.; Francis, Z.; Galoyan, A.; Garnier, L.; Gayer, M.; Genser, K. L.; Grichine, V. M.; Guatelli, S.; Guèye, P.; Gumplinger, P.; Howard, A. S.; Hřivnáčová, I.; Hwang, S.; Incerti, S.; Ivanchenko, A.; Ivanchenko, V. N.; Jones, F. W.; Jun, S. Y.; Kaitaniemi, P.; Karakatsanis, N.; Karamitrosi, M.; Kelsey, M.; Kimura, A.; Koi, T.; Kurashige, H.; Lechner, A.; Lee, S. B.; Longo, F.; Maire, M.; Mancusi, D.; Mantero, A.; Mendoza, E.; Morgan, B.; Murakami, K.; Nikitina, T.; Pandola, L.; Paprocki, P.; Perl, J.; Petrović, I.; Pia, M. G.; Pokorski, W.; Quesada, J. M.; Raine, M.; Reis, M. A.; Ribon, A.; Ristić Fira, A.; Romano, F.; Russo, G.; Santin, G.; Sasaki, T.; Sawkey, D.; Shin, J. I.; Strakovsky, I. I.; Taborda, A.; Tanaka, S.; Tomé, B.; Toshito, T.; Tran, H. N.; Truscott, P. R.; Urban, L.; Uzhinsky, V.; Verbeke, J. M.; Verderi, M.; Wendt, B. L.; Wenzel, H.; Wright, D. H.; Wright, D. M.; Yamashita, T.; Yarba, J.; Yoshida, H.

    2016-11-01

    GEANT4 is a software toolkit for the simulation of the passage of particles through matter. It is used by a large number of experiments and projects in a variety of application domains, including high energy physics, astrophysics and space science, medical physics and radiation protection. Over the past several years, major changes have been made to the toolkit in order to accommodate the needs of these user communities, and to efficiently exploit the growth of computing power made available by advances in technology. The adaptation of GEANT4 to multithreading, advances in physics, detector modeling and visualization, extensions to the toolkit, including biasing and reverse Monte Carlo, and tools for physics and release validation are discussed here.

  5. Recent developments in Geant4

    DOE PAGES

    Allison, J.; Amako, K.; Apostolakis, J.; Arce, P.; Asai, M.; Aso, T.; Bagli, E.; Bagulya, A.; Banerjee, S.; Barrand, G.; et al

    2016-07-01

    Geant4 is a software toolkit for the simulation of the passage of particles through matter. It is used by a large number of experiments and projects in a variety of application domains, including high energy physics, astrophysics and space science, medical physics and radiation protection. Over the past several years, major changes have been made to the toolkit in order to accommodate the needs of these user communities, and to efficiently exploit the growth of computing power made available by advances in technology. The adaptation of Geant4 to multithreading, advances in physics, detector modeling and visualization, extensions to the toolkit, including biasing and reverse Monte Carlo, and tools for physics and release validation are discussed here.

  6. Technical Note: Improvements in GEANT4 energy-loss model and the effect on low-energy electron transport in liquid water

    SciTech Connect

    Kyriakou, I.; Incerti, S.

    2015-07-15

    Purpose: The GEANT4-DNA physics models are upgraded by a more accurate set of electron cross sections for ionization and excitation in liquid water. The impact of the new developments on low-energy electron transport simulations by the GEANT4 Monte Carlo toolkit is examined for improving its performance in dosimetry applications at the subcellular and nanometer level. Methods: The authors provide an algorithm for an improved implementation of the Emfietzoglou model dielectric response function of liquid water used in the GEANT4-DNA existing model. The algorithm redistributes the imaginary part of the dielectric function to ensure a physically motivated behavior at the binding energies, while retaining all the advantages of the original formulation, e.g., the analytic properties and the fulfillment of the f-sum-rule. In addition, refinements in the exchange and perturbation corrections to the Born approximation used in the GEANT4-DNA existing model are also made. Results: The new ionization and excitation cross sections are significantly different from those of the GEANT4-DNA existing model. In particular, excitations are strongly enhanced relative to ionizations, resulting in higher W-values and less diffusive dose-point-kernels at sub-keV electron energies. Conclusions: An improved energy-loss model for the excitation and ionization of liquid water by low-energy electrons has been implemented in GEANT4-DNA. The suspiciously low W-values and the unphysical long tail in the dose-point-kernel have been corrected owing to a different partitioning of the dielectric function.
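
    For reference, the f-sum rule that the redistributed imaginary part of the dielectric function must continue to satisfy can be written in its standard form (Gaussian units; n is the electron density of the medium):

    ```latex
    \int_0^{\infty} \omega \, \operatorname{Im}\!\left[\frac{-1}{\varepsilon(\omega)}\right] \mathrm{d}\omega
      \;=\; \frac{\pi}{2}\,\omega_p^2 ,
    \qquad
    \omega_p^2 = \frac{4\pi n e^2}{m_e}
    ```

    Any reshaping of Im[-1/ε(ω)] near the binding energies must conserve this integral, which is what the algorithm's redistribution step enforces.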

  7. Simulation and modeling for the stand-off radiation detection system (SORDS) using GEANT4

    SciTech Connect

    Hoover, Andrew S; Wallace, Mark; Galassi, Mark; Mocko, Michal; Palmer, David; Schultz, Larry; Tornga, Shawn

    2009-01-01

    A Stand-Off Radiation Detection System (SORDS) is being developed through a joint effort by Raytheon, Los Alamos National Laboratory, Bubble Technology Industries, Radiation Monitoring Devices, and the Massachusetts Institute of Technology, for the Domestic Nuclear Detection Office (DNDO). The system is a mobile truck-based platform performing detection, imaging, and spectroscopic identification of gamma-ray sources. A Tri-Modal Imaging (TMI) approach combines active-mask coded aperture imaging, Compton imaging, and shadow imaging techniques. Monte Carlo simulation and modeling using the GEANT4 toolkit was used to generate realistic data for the development of imaging algorithms and associated software code.

  8. Validation of GEANT4 Monte Carlo models with a highly granular scintillator-steel hadron calorimeter

    NASA Astrophysics Data System (ADS)

    Adloff, C.; Blaha, J.; Blaising, J.-J.; Drancourt, C.; Espargilière, A.; Gaglione, R.; Geffroy, N.; Karyotakis, Y.; Prast, J.; Vouters, G.; Francis, K.; Repond, J.; Schlereth, J.; Smith, J.; Xia, L.; Baldolemar, E.; Li, J.; Park, S. T.; Sosebee, M.; White, A. P.; Yu, J.; Buanes, T.; Eigen, G.; Mikami, Y.; Watson, N. K.; Mavromanolakis, G.; Thomson, M. A.; Ward, D. R.; Yan, W.; Benchekroun, D.; Hoummada, A.; Khoulaki, Y.; Apostolakis, J.; Dotti, A.; Folger, G.; Ivantchenko, V.; Uzhinskiy, V.; Benyamna, M.; Cârloganu, C.; Fehr, F.; Gay, P.; Manen, S.; Royer, L.; Blazey, G. C.; Dyshkant, A.; Lima, J. G. R.; Zutshi, V.; Hostachy, J.-Y.; Morin, L.; Cornett, U.; David, D.; Falley, G.; Gadow, K.; Göttlicher, P.; Günter, C.; Hermberg, B.; Karstensen, S.; Krivan, F.; Lucaci-Timoce, A.-I.; Lu, S.; Lutz, B.; Morozov, S.; Morgunov, V.; Reinecke, M.; Sefkow, F.; Smirnov, P.; Terwort, M.; Vargas-Trevino, A.; Feege, N.; Garutti, E.; Marchesini, I.; Ramilli, M.; Eckert, P.; Harion, T.; Kaplan, A.; Schultz-Coulon, H.-Ch; Shen, W.; Stamen, R.; Bilki, B.; Norbeck, E.; Onel, Y.; Wilson, G. W.; Kawagoe, K.; Dauncey, P. D.; Magnan, A.-M.; Bartsch, V.; Wing, M.; Salvatore, F.; Calvo Alamillo, E.; Fouz, M.-C.; Puerta-Pelayo, J.; Bobchenko, B.; Chadeeva, M.; Danilov, M.; Epifantsev, A.; Markin, O.; Mizuk, R.; Novikov, E.; Popov, V.; Rusinov, V.; Tarkovsky, E.; Kirikova, N.; Kozlov, V.; Smirnov, P.; Soloviev, Y.; Buzhan, P.; Ilyin, A.; Kantserov, V.; Kaplin, V.; Karakash, A.; Popova, E.; Tikhomirov, V.; Kiesling, C.; Seidel, K.; Simon, F.; Soldner, C.; Szalay, M.; Tesar, M.; Weuste, L.; Amjad, M. 
S.; Bonis, J.; Callier, S.; Conforti di Lorenzo, S.; Cornebise, P.; Doublet, Ph; Dulucq, F.; Fleury, J.; Frisson, T.; van der Kolk, N.; Li, H.; Martin-Chassard, G.; Richard, F.; de la Taille, Ch; Pöschl, R.; Raux, L.; Rouëné, J.; Seguin-Moreau, N.; Anduze, M.; Boudry, V.; Brient, J.-C.; Jeans, D.; Mora de Freitas, P.; Musat, G.; Reinhard, M.; Ruan, M.; Videau, H.; Bulanek, B.; Zacek, J.; Cvach, J.; Gallus, P.; Havranek, M.; Janata, M.; Kvasnicka, J.; Lednicky, D.; Marcisovsky, M.; Polak, I.; Popule, J.; Tomasek, L.; Tomasek, M.; Ruzicka, P.; Sicho, P.; Smolik, J.; Vrba, V.; Zalesak, J.; Belhorma, B.; Ghazlane, H.; Takeshita, T.; Uozumi, S.; Götze, M.; Hartbrich, O.; Sauer, J.; Weber, S.; Zeitnitz, C.

    2013-07-01

    Calorimeters with a high granularity are a fundamental requirement of the Particle Flow paradigm. This paper focuses on the prototype of a hadron calorimeter with analog readout, consisting of thirty-eight scintillator layers alternating with steel absorber planes. The scintillator plates are finely segmented into tiles individually read out via Silicon Photomultipliers. The presented results are based on data collected with pion beams in the energy range from 8 GeV to 100 GeV. The fine segmentation of the sensitive layers and the high sampling frequency allow for an excellent reconstruction of the spatial development of hadronic showers. A comparison between data and Monte Carlo simulations is presented, concerning both the longitudinal and lateral development of hadronic showers and the global response of the calorimeter. The performance of several GEANT4 physics lists with respect to these observables is evaluated.

  9. Modeling spallation reactions in tungsten and uranium targets with the Geant4 toolkit

    NASA Astrophysics Data System (ADS)

    Malyshkin, Yury; Pshenichnov, Igor; Mishustin, Igor; Greiner, Walter

    2012-02-01

    We study primary and secondary reactions induced by 600 MeV proton beams in monolithic cylindrical targets made of natural tungsten and uranium by using Monte Carlo simulations with the Geant4 toolkit [1-3]. The Bertini intranuclear cascade model, the Binary cascade model and the IntraNuclear Cascade Liège (INCL) model with ABLA [4] were used as calculation options to describe nuclear reactions. Fission cross sections, neutron multiplicities and mass distributions of fragments for 238U fission induced by 25.6 and 62.9 MeV protons are calculated and compared to recent experimental data [5]. Time distributions of neutron leakage from the targets and heat depositions are calculated. This project is supported by Siemens Corporate Technology.

  10. Comparing Geant4 hadronic models for the WENDI-II rem meter response function.

    PubMed

    Vanaudenhove, T; Dubus, A; Pauly, N

    2013-01-01

    The WENDI-II rem meter is one of the most popular neutron dosemeters used to assess a useful quantity in radiation protection, namely the ambient dose equivalent. This is due to its high sensitivity and to an energy response that approximately follows the conversion function between neutron fluence and ambient dose equivalent in the range from thermal energies to 5 GeV. Simulation of the WENDI-II response function with the Geant4 toolkit is therefore well suited to comparing the low- and high-energy hadronic models provided by this Monte Carlo code. The results showed that the thermal treatment of hydrogen in polyethylene for neutrons below 4 eV has a great influence over the whole detector range. Above 19 MeV, both the Bertini Cascade and Binary Cascade models show good agreement with results found in the literature, while the low-energy parameterised models are not suitable for this application.
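    The design goal described above, a detector response that tracks the fluence-to-dose conversion function, can be sketched numerically: the ambient dose equivalent is the binned neutron fluence folded with energy-dependent conversion coefficients. The energy grid and coefficient values below are illustrative placeholders, not the tabulated ICRP data:

```python
import numpy as np

# Illustrative neutron energies (MeV) and fluence-to-H*(10) conversion
# coefficients (pSv cm^2); real coefficients are tabulated, e.g. in ICRP 74.
energies = np.array([1e-8, 1e-6, 1e-3, 1.0, 10.0, 100.0])
h_coeff  = np.array([10.6, 13.3, 10.4, 416.0, 520.0, 600.0])

def ambient_dose_equivalent(fluence_per_bin):
    # Fold a binned neutron fluence (cm^-2 per bin) with the conversion
    # coefficients: H*(10) = sum_i phi_i * h_i, in pSv.
    return float(np.dot(fluence_per_bin, h_coeff))

h = ambient_dose_equivalent(np.ones(6))  # 1 n/cm^2 in every bin
```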

  11. Geant4.10 simulation of geometric model for metaphase chromosome

    NASA Astrophysics Data System (ADS)

    Rafat-Motavalli, L.; Miri-Hakimabad, H.; Bakhtiyari, E.

    2016-04-01

    In this paper, a geometric model of the metaphase chromosome is explained. The model is constructed according to the packing ratio and the dimensions of the structure, from the nucleosome up to the chromosome. A B-DNA base pair is used to construct the 200 base pairs of a nucleosome. Each chromatin fiber loop, which is the unit of repeat, has 49,200 bp. This geometry is entered into the Geant4.10 Monte Carlo simulation toolkit and can be extended to the whole set of metaphase chromosomes and to any application in which a DNA geometrical model is needed. The chromosome base pairs, chromosome lengths, and relative lengths of chromosomes are calculated. The calculated relative lengths are compared to the relative lengths of human chromosomes.
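    The relative-length calculation mentioned above reduces to simple arithmetic: each chromosome's share of the total base pairs in the set. A sketch with hypothetical loop counts (only the 49,200 bp per chromatin-fiber loop figure comes from the abstract):

```python
def relative_lengths(bp_counts):
    # Relative length of each chromosome: its base-pair count divided
    # by the total base pairs of the set, expressed in percent.
    total = sum(bp_counts)
    return [100.0 * bp / total for bp in bp_counts]

# Toy model in the spirit of the paper: each chromatin-fiber loop
# (the unit of repeat) carries 49,200 bp, so a chromosome built from
# n loops has n * 49_200 bp. The loop counts here are hypothetical.
loops = [500, 300, 200]
bp = [n * 49_200 for n in loops]
print(relative_lengths(bp))  # → [50.0, 30.0, 20.0]
```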

  12. Simulation loop between CAD systems, GEANT4 and GeoModel: Implementation and results

    NASA Astrophysics Data System (ADS)

    Sharmazanashvili, A.; Tsutskiridze, Niko

    2016-09-01

    Comparative analysis of the simulated and as-built geometry descriptions of a detector is an important field of study for data versus Monte Carlo discrepancies. Consistency and level of detail of shapes are less important, while the adequacy of the volumes and weights of detector components is essential for tracking. There are two main reasons for faults in the geometry descriptions used in simulation: (1) differences between the simulated and as-built geometry descriptions; (2) internal inaccuracies of the geometry transformations added by the simulation software infrastructure itself. The Georgian engineering team developed a hub based on the CATIA platform, together with several tools enabling the different descriptions used by simulation packages to be read into CATIA: XML->CATIA, VP1->CATIA, GeoModel->CATIA, Geant4->CATIA. As a result, it becomes possible to compare the different descriptions with each other using the full power of CATIA and to investigate both classes of geometry-description faults. The paper presents the results of case studies of the ATLAS coils and end-cap toroid structures.

  13. Modeling the tagged-neutron UXO identification technique using the Geant4 toolkit

    SciTech Connect

    Zhou Y.; Mitra S.; Zhu X.; Wang Y.

    2011-10-16

    It is proposed to use 14 MeV neutrons, tagged by the associated particle neutron time-of-flight technique (APnTOF), to identify the fillers of unexploded ordnance (UXO) by characterizing their carbon, nitrogen and oxygen contents. To facilitate the design and construction of a prototype system, a preliminary simulation model was developed using the Geant4 toolkit. This work established the toolkit environment for (a) generating tagged neutrons, (b) their transport and interactions within a sample to induce emission and detection of characteristic gamma-rays, and (c) 2D- and 3D-image reconstruction of the interrogated object using the neutron and gamma-ray time-of-flight information. Using this modeling, the article demonstrates the strength of the tagged-neutron approach for extracting useful signals from an object of interest with high signal-to-background discrimination against its environment. Simulations indicated that a UXO filled with the explosive RDX, hexogen (C{sub 3}H{sub 6}O{sub 6}N{sub 6}), can be identified to a depth of 20 cm when buried in soil.
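    The time-of-flight localization underlying the technique can be sketched as follows, assuming a simplified collinear geometry in which the tagged 14 MeV neutron travels to depth d and the prompt gamma returns the same distance at the speed of light; the function names are illustrative:

```python
import math

C = 29.9792458    # speed of light, cm/ns
M_N = 939.56542   # neutron rest mass, MeV/c^2

def neutron_speed(kinetic_mev):
    # Relativistic speed (cm/ns) of a neutron with the given kinetic energy.
    gamma = 1.0 + kinetic_mev / M_N
    beta = math.sqrt(1.0 - 1.0 / gamma**2)
    return beta * C

def interaction_depth(t_ns, kinetic_mev=14.0):
    # Depth (cm) of the voxel probed at total time-of-flight t_ns, in a
    # simplified collinear geometry: t = d/v_n + d/c, so
    # d = t / (1/v_n + 1/c).
    v = neutron_speed(kinetic_mev)
    return t_ns / (1.0 / v + 1.0 / C)

v14 = neutron_speed(14.0)  # about 5.1 cm/ns for a 14 MeV neutron
```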

  14. Modeling of x-ray fluorescence using MCNPX and Geant4

    SciTech Connect

    Rajasingam, Akshayan; Hoover, Andrew S; Fensin, Michael L; Tobin, Stephen J

    2009-01-01

    X-Ray Fluorescence (XRF) is one of thirteen non-destructive assay techniques being researched for the purpose of quantifying the Pu mass in used fuel assemblies. The modeling portion of this research will be conducted with the MCNPX transport code. The research presented here was undertaken to test the capability of MCNPX so that it can be used to benchmark measurements made at ORNL and to give confidence in the application of MCNPX as a predictive tool for the expected capability of XRF in the context of used fuel assemblies. The main focus of this paper is a code-to-code comparison between the MCNPX and Geant4 codes. Since XRF in used fuel is driven by photon emission and beta decay of fission fragments, both source terms were independently researched. Simple cases and used fuel cases were modeled for both source terms. In order to prepare for benchmarking against experiments, it was necessary to determine the relative significance of the various fission fragments for producing X-rays.

  15. Geant4-DNA: overview and recent developments

    NASA Astrophysics Data System (ADS)

    Štěpán, Václav

    Space travel and high-altitude flights are inherently associated with prolonged exposure to cosmic and solar radiation. Understanding and simulation of radiation action at the cellular and subcellular level contribute to a precise assessment of the associated health risks and remain a challenge of today’s radiobiology research. The Geant4-DNA project (http://geant4-dna.org) aims at developing an experimentally validated simulation platform for modelling of the damage induced by ionizing radiation at the DNA level. The platform is based on the Geant4 Monte Carlo simulation toolkit. This project extends specific functionalities of Geant4 in the following areas: the step-by-step single-scattering modelling of elementary physical interactions of electrons, protons, alpha particles and light ions with liquid water and DNA bases, for the so-called “physical” stage; and the modelling of the “physico-chemical” and “chemical” stages, corresponding to the production, the diffusion and the chemical reactions occurring between chemical species produced by water radiolysis, and to the radical attack on the biological targets. Physical- and chemical-stage simulations are combined with biological target models on several scales, from the DNA double helix, through the nucleosome, to chromatin segments and cell geometries. In addition, data-mining clustering algorithms have been developed and optimised for the purpose of DNA damage scoring in simulated tracks. Experimental measurements on pBR322 plasmid DNA are being carried out in order to validate the Geant4-DNA models. The plasmid DNA has been irradiated in dry conditions by protons with energies from 100 keV to 30 MeV and in aqueous conditions, with and without scavengers, by 30 MeV protons, 290 MeV/u carbon and 500 MeV/u iron ions. Agarose gel electrophoresis combined with enzymatic treatment has been used to measure the resulting DNA damage. An overview of the developments undertaken by the Geant4-DNA collaboration including a description of

  16. Recent Developments in the Geant4 Hadronic Framework

    NASA Astrophysics Data System (ADS)

    Pokorski, Witold; Ribon, Alberto

    2014-06-01

    In this paper we present the recent developments in the Geant4 hadronic framework. Geant4 is the main simulation toolkit used by the LHC experiments and therefore a lot of effort is put into improving the physics models in order for them to have more predictive power. As a consequence, the code complexity increases, which requires constant improvement and optimization on the programming side. At the same time, we would like to review and eventually reduce the complexity of the hadronic software framework. As an example, a factory design pattern has been applied in Geant4 to avoid duplication of objects, like cross sections, which can be used by several processes or physics models. This approach has also been applied to physics lists, to provide a flexible configuration mechanism at run-time, based on macro files. Moreover, these developments open the future possibility of building Geant4 with only a specified subset of physics models. Another technical development focused on the reproducibility of the simulation, i.e. the possibility to repeat an event once the random generator status at the beginning of the event is known. This is crucial for debugging rare situations that may occur after long simulations. Moreover, reproducibility in normal, sequential Geant4 simulation is an important prerequisite for verifying the equivalence with multithreaded Geant4 simulations.
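    The factory pattern described above, letting several physics models share one cross-section object instead of duplicating it, can be sketched as a small registry. The class and key names are hypothetical, not the actual Geant4 API:

```python
class CrossSectionFactory:
    # Toy registry in the spirit of the Geant4 factory pattern: a
    # cross-section object is constructed once and then shared by every
    # process or model that asks for it by name.
    _registry = {}   # name -> constructor
    _instances = {}  # name -> shared instance

    @classmethod
    def register(cls, name, ctor):
        cls._registry[name] = ctor

    @classmethod
    def get(cls, name):
        if name not in cls._instances:
            cls._instances[name] = cls._registry[name]()
        return cls._instances[name]

class ElasticXS:  # hypothetical cross-section class
    pass

CrossSectionFactory.register("elastic", ElasticXS)
a = CrossSectionFactory.get("elastic")
b = CrossSectionFactory.get("elastic")
assert a is b  # two clients share one object; no duplication
```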

  17. GEANT4 and Secondary Particle Production

    NASA Technical Reports Server (NTRS)

    Patterson, Jeff

    2004-01-01

    GEANT4 is a Monte Carlo toolkit developed by the high energy physics community (CERN, SLAC, etc.) to perform simulations of complex particle detectors. GEANT4 is the ideal tool to study radiation transport and should be applied to space environments and the complex geometries of modern-day spacecraft.

  18. A Virtual Geant4 Environment

    NASA Astrophysics Data System (ADS)

    Iwai, Go

    2015-12-01

    We describe the development of an environment for Geant4 consisting of an application and data that provide users with a more efficient way to access Geant4 applications without having to download and build the software locally. The environment is platform neutral and offers the users near-real time performance. In addition, the environment consists of data and Geant4 libraries built using low-level virtual machine (LLVM) tools which can produce bitcode that can be embedded in HTML and accessed via a browser. The bitcode is downloaded to the local machine via the browser and can then be configured by the user. This approach provides a way of minimising the risk of leaking potentially sensitive data used to construct the Geant4 model and application in the medical domain for treatment planning. We describe several applications that have used this approach and compare their performance with that of native applications. We also describe potential user communities that could benefit from this approach.

  19. Recent improvements on the description of hadronic interactions in Geant4

    NASA Astrophysics Data System (ADS)

    Dotti, A.; Apostolakis, J.; Folger, G.; Grichine, V.; Ivanchenko, V.; Kosov, M.; Ribon, A.; Uzhinsky, V.; Wright, D. H.

    2011-04-01

    We present an overview of recent improvements of hadronic models in Geant4 for the physics configurations (Physics Lists) relevant to applications in high energy experiments. During the last year the improvements have concentrated on the study of unphysical discontinuities in calorimeter observables in the transition regions between the models used in Physics Lists. The microscopic origins of these have been investigated, and possible improvements of the Geant4 code are currently under validation. In this paper we discuss the status of the latest version of Geant4 with emphasis on the most promising new developments, namely the Fritiof-based and CHIPS Physics Lists.

  20. Recent Improvements on the Description of Hadronic Interactions in Geant4

    SciTech Connect

    Dotti, A.; Apostolakis, J.; Folger, G.; Grichine, V.; Ivanchenko, V.; Kosov, M.; Ribon, A.; Uzhinsky, V.; Wright, D.H.; /SLAC

    2012-06-07

    We present an overview of recent improvements of hadronic models in Geant4 for the physics configurations (Physics Lists) relevant to applications in high energy experiments. During the last year the improvements have concentrated on the study of unphysical discontinuities in calorimeter observables in the transition regions between the models used in Physics Lists. The microscopic origins of these have been investigated, and possible improvements of the Geant4 code are currently under validation. In this paper we discuss the status of the latest version of Geant4 with emphasis on the most promising new developments, namely the Fritiof-based and CHIPS Physics Lists.

  1. Validation of nuclear models in Geant4 using the dose distribution of a 177 MeV proton pencil beam.

    PubMed

    Hall, David C; Makarova, Anastasia; Paganetti, Harald; Gottschalk, Bernard

    2016-01-01

    A proton pencil beam is associated with a surrounding low-dose envelope, originating from nuclear interactions. It is important for treatment planning systems to accurately model this envelope when performing dose calculations for pencil beam scanning treatments, and Monte Carlo (MC) codes are commonly used for this purpose. This work aims to validate the nuclear models employed by the Geant4 MC code, by comparing the simulated absolute dose distribution to a recent experiment of a 177 MeV proton pencil beam stopping in water. Striking agreement is observed over five orders of magnitude, with both the shape and normalisation well modelled. The normalisations of two depth dose curves are lower than experiment, though this could be explained by an experimental positioning error. The Geant4 neutron production model is also verified in the distal region. The entrance dose is poorly modelled, suggesting an unaccounted upstream source of low-energy protons. Recommendations are given for a follow-up experiment which could resolve these issues.

  2. Validation of nuclear models in Geant4 using the dose distribution of a 177 MeV proton pencil beam

    NASA Astrophysics Data System (ADS)

    Hall, David C.; Makarova, Anastasia; Paganetti, Harald; Gottschalk, Bernard

    2016-01-01

    A proton pencil beam is associated with a surrounding low-dose envelope, originating from nuclear interactions. It is important for treatment planning systems to accurately model this envelope when performing dose calculations for pencil beam scanning treatments, and Monte Carlo (MC) codes are commonly used for this purpose. This work aims to validate the nuclear models employed by the Geant4 MC code, by comparing the simulated absolute dose distribution to a recent experiment of a 177 MeV proton pencil beam stopping in water. Striking agreement is observed over five orders of magnitude, with both the shape and normalisation well modelled. The normalisations of two depth dose curves are lower than experiment, though this could be explained by an experimental positioning error. The Geant4 neutron production model is also verified in the distal region. The entrance dose is poorly modelled, suggesting an unaccounted upstream source of low-energy protons. Recommendations are given for a follow-up experiment which could resolve these issues.

  4. Spallation Source Modelling for an ADS Using the MCNPX and GEANT4 Packages for Sensitivity Analysis of Reactivity

    NASA Astrophysics Data System (ADS)

    Antolin, M. Q.; Marinho, F.; Palma, D. A. P.; Martinez, A. S.

    2014-04-01

    A simulation of the time evolution of the MYRRHA conceptual reactor was developed. The SERPENT code was used to simulate the nuclear fuel depletion, and the spallation source which drives the system was simulated using both the MCNPX and GEANT4 packages. The obtained results for the neutron energy spectrum from spallation are consistent with each other and were used as input for the SERPENT code, which simulated a constant-power operation regime. The results show that the criticality of the system is not sensitive to the spallation models employed, and only relatively small deviations were observed with respect to the inverse kinetics model derived from the point kinetics equations proposed by Gandini.

  5. Simulations of nuclear resonance fluorescence in GEANT4

    NASA Astrophysics Data System (ADS)

    Lakshmanan, Manu N.; Harrawood, Brian P.; Rusev, Gencho; Agasthya, Greeshma A.; Kapadia, Anuj J.

    2014-11-01

    The nuclear resonance fluorescence (NRF) technique has been used effectively to identify isotopes based on their nuclear energy levels. Specific examples of its modern-day applications include detecting spent nuclear waste and cargo scanning for homeland security. The experimental designs for these NRF applications can be optimized more efficiently using Monte Carlo simulations before an experiment is implemented. One of the most widely used Monte Carlo physics simulations is the open-source toolkit GEANT4. However, NRF physics has not been incorporated into the GEANT4 simulation toolkit in publicly available software. Here we describe the development and testing of an NRF simulation in GEANT4. We describe in depth the development and architecture of this software for the simulation of NRF in any isotope in GEANT4, as well as verification and validation testing of the simulation for NRF in boron. In the verification testing, the simulation agreed with the analytical model to within 0.6% for boron and iron. In the validation testing, the simulation agreed with the experimental measurements for boron to within 20.5%, with the difference likely due to small uncertainties in beam polarization, energy distribution, and detector composition.
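    An NRF line is, to first approximation, a Lorentzian (Breit-Wigner) resonance in the photon absorption cross section. A toy sketch with hypothetical level parameters; Doppler broadening, which matters for real NRF lines, is neglected:

```python
def breit_wigner(E, E_r, gamma, sigma_0):
    # Lorentzian (Breit-Wigner) resonance shape: peak cross section
    # sigma_0 at resonance energy E_r, full width at half maximum gamma.
    half = gamma / 2.0
    return sigma_0 * half**2 / ((E - E_r)**2 + half**2)

# Hypothetical level: E_r = 2.0 MeV, gamma = 0.2 MeV, sigma_0 = 10 b.
peak = breit_wigner(2.0, E_r=2.0, gamma=0.2, sigma_0=10.0)      # ~sigma_0
half_max = breit_wigner(2.1, E_r=2.0, gamma=0.2, sigma_0=10.0)  # ~sigma_0/2
```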

  6. Galactic Cosmic Rays and Lunar Secondary Particles from Solar Minimum to Maximum: CRaTER Observations and Geant4 Modeling

    NASA Astrophysics Data System (ADS)

    Looper, M. D.; Mazur, J. E.; Blake, J. B.; Spence, H. E.; Schwadron, N.; Golightly, M. J.; Case, A. W.; Kasper, J. C.; Townsend, L. W.; Wilson, J. K.

    2014-12-01

    The Lunar Reconnaissance Orbiter mission was launched in 2009 during the recent deep and extended solar minimum, with the highest galactic cosmic ray (GCR) fluxes observed since the beginning of the space era. Its Cosmic Ray Telescope for the Effects of Radiation (CRaTER) instrument was designed to measure the spectra of energy deposits in silicon detectors shielded behind pieces of tissue equivalent plastic, simulating the self-shielding provided by an astronaut's body around radiation-sensitive organs. The CRaTER data set now covers the evolution of the GCR environment near the moon during the first five years of development of the present solar cycle. We will present these observations, along with Geant4 modeling to illustrate the varying particle contributions to the energy-deposit spectra. CRaTER has also measured protons traveling up from the lunar surface after their creation during GCR interactions with surface material, and we will report observations and modeling of the energy and angular distributions of these "albedo" protons.

  7. The Lunar Radiation Environment: LRO/CRaTER Observations and Geant4 Modeling

    NASA Astrophysics Data System (ADS)

    Looper, M. D.; Mazur, J.; Blake, J. B.; Spence, H. E.; Golightly, M.; Case, A. W.

    2010-12-01

    The Cosmic Ray Telescope for the Effects of Radiation (CRaTER) has been in orbit around the moon aboard the Lunar Reconnaissance Orbiter (LRO) for over a year. The purpose of CRaTER is to measure the radiation environment that will be experienced, in particular, by astronauts on and near the lunar surface; to that end, CRaTER consists of a "telescope" of six silicon solid state detectors arranged in three pairs, with two large blocks of Tissue-Equivalent Plastic between pairs to represent the shielding provided by the human body. The data we have collected to date are complex, and to understand our observations we have performed extensive modeling of the "albedo" particles produced by interactions of primary cosmic rays with the lunar surface and with the spacecraft itself, and of the response of the sensor to both primary and albedo particles. We will present measurements of the Linear Energy Transfer (LET) and dose from CRaTER, and will show more generic LET and dose spectra, using our models to remove the effects specific to the CRaTER sensor geometry and spacecraft environment (shielding, locally-produced albedo), for lunar orbit and at the lunar surface.
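    The conversion from detector energy deposits to the dose and LET spectra mentioned above follows from two elementary relations, sketched here with illustrative function names (not the CRaTER analysis code):

```python
MEV_TO_J = 1.602176634e-13  # joules per MeV

def dose_gray(edep_mev, mass_kg):
    # Absorbed dose in Gy: total deposited energy (J) per detector mass (kg).
    return edep_mev * MEV_TO_J / mass_kg

def let_kev_per_um(edep_kev, path_um):
    # LET proxy used for thin silicon detectors: energy deposited per
    # unit path length of the particle through the sensitive volume.
    return edep_kev / path_um
```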

  8. Geant4 Applications in Space

    SciTech Connect

    Asai, M.; /SLAC

    2007-11-07

    Use of Geant4 is rapidly expanding in space application domain. I try to overview three major application areas of Geant4 in space, which are apparatus simulation for pre-launch design and post-launch analysis, planetary scale simulation for radiation spectra and surface and sub-surface explorations, and micro-dosimetry simulation for single event study and radiation-hardening of semiconductor devices. Recently, not only the mission dependent applications but also various multi-purpose or common tools built on top of Geant4 are also widely available. I overview some of such tools as well. The Geant4 Collaboration identifies that the space applications are now one of the major driving forces of the further developments and refinements of Geant4 toolkit. Highlights of such developments are introduced.

  9. An Overview of the Geant4 Toolkit

    NASA Astrophysics Data System (ADS)

    Apostolakis, John; Wright, Dennis H.

    2007-03-01

    Geant4 is a toolkit for the simulation of the transport of radiation through matter. With a flexible kernel and a choice of different physics modeling options, it has been tailored to the requirements of a wide range of applications. With the toolkit a user can describe a setup's or detector's geometry and materials, navigate inside it, simulate the physical interactions using a choice of physics engines with underlying cross sections and models, and visualise and store the results. Physics models describing electromagnetic and hadronic interactions are provided, as are decays and processes for optical photons. Several models, with different precision and performance, are available for many processes. The toolkit includes coherent physics model configurations, which are called physics lists. Users can choose an existing physics list or create their own, depending on their requirements and the application area. A clear structure and readable code enable the user to investigate the origin of physics results. Application areas include detector simulation and background simulation in high energy physics experiments, simulation of accelerator setups, studies in medical imaging and treatment, and the study of the effects of solar radiation on spacecraft instruments.
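    The physics-list concept above, a named, coherent bundle of per-particle process choices, can be sketched as a toy registry. The class, particle and process names are hypothetical, not Geant4 classes:

```python
class ToyPhysicsList:
    # Toy analogue of a Geant4 physics list: a named, coherent mapping
    # from particle type to the processes used to simulate it.
    def __init__(self, name):
        self.name = name
        self.processes = {}  # particle -> ordered list of process names

    def register(self, particle, process):
        self.processes.setdefault(particle, []).append(process)

# Compose a hypothetical electromagnetic-only configuration.
em = ToyPhysicsList("em_standard_toy")
em.register("e-", "ionisation")
em.register("e-", "bremsstrahlung")
em.register("gamma", "compton")
em.register("gamma", "photoelectric")
```

    A user would pick such a pre-built configuration or assemble their own, exactly as the abstract describes for real physics lists.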

  10. Space and Medical Applications of the Geant4 Simulation Toolkit

    NASA Astrophysics Data System (ADS)

    Perl, Joseph

    2008-10-01

    Geant4 is a toolkit to simulate the passage of particles through matter. While Geant4 was developed for High Energy Physics (HEP), applications now include nuclear, medical and space physics. Medical applications have been increasing rapidly due to the overall growth of Monte Carlo methods in medical physics and the unique qualities of Geant4 as an all-particle code able to handle complex geometry, motion and fields, with the flexibility of modern programming and open, free source code. Work has included characterizing beams and sources, treatment planning and imaging. The all-particle nature of Geant4 has made it popular for the newest modes of radiation treatment: proton and particle therapy. Geant4 has been used by ESA, NASA and JAXA to study radiation effects on spacecraft and personnel. The flexibility of Geant4 has enabled teams to incorporate it into their own applications (the SPENVIS MULASSIS space environment tool from QinetiQ and ESA, the RADSAFE simulation from Vanderbilt University and NASA). We provide an overview of applications and discuss how Geant4 has responded to specific challenges of moving from HEP to medical and space physics, including recent work to extend Geant4's energy range to low-dose radiobiology.

  11. Analysis of the physical interactions of therapeutic proton beams in water with the use of Geant4 Monte Carlo calculations.

    PubMed

    Morávek, Zdenek; Bogner, Ludwig

    2009-01-01

    The processes that occur when protons traverse a medium are investigated theoretically for the full therapeutic range of energies [20 MeV, 220 MeV]. The investigation is undertaken using the Geant4 toolkit for a water medium. The beam is simulated only inside the phantom; effects of the beamline are included in the overall beam properties as lateral width and momentum bandwidth. Every energy deposition is catalogued according to the particle and the process that caused it, and the catalogued depositions are analysed statistically. Only a few important processes, such as proton ionisation and nuclear scattering (elastic/inelastic), constitute the main features of the energy distribution. At the same time, processes concerning electrons are invoked very often without an obvious effect on the result. Such processes can therefore be approximated in the simulation codes in order to improve the performance of the code. Neutron depositions are most important before the Bragg peak, yet they are an order of magnitude smaller than those of protons. In the region behind the Bragg peak only a small number of neutrons is created in the simulation, and their energy contribution through secondary protons is orders of magnitude smaller than the effect of proton-produced secondary protons within the Bragg peak. Hence, the effects of neutrons created in the calculation volume can be neglected. PMID:19761094

  12. First statistical analysis of Geant4 quality software metrics

    NASA Astrophysics Data System (ADS)

    Ronchieri, Elisabetta; Grazia Pia, Maria; Giacomini, Francesco

    2015-12-01

    Geant4 is a simulation system of particle transport through matter, widely used in several experimental areas from high energy physics and nuclear experiments to medical studies. Some of its applications may involve critical use cases; therefore they would benefit from an objective assessment of the software quality of Geant4. In this paper, we provide a first statistical evaluation of software metrics data related to a set of Geant4 physics packages. The analysis aims at identifying risks for Geant4 maintainability, which would benefit from being addressed at an early stage. The findings of this pilot study set the grounds for further extensions of the analysis to the whole of Geant4 and to other high energy physics software systems.

  13. Geant4 Computing Performance Benchmarking and Monitoring

    DOE PAGES

    Dotti, Andrea; Elvira, V. Daniel; Folger, Gunter; Genser, Krzysztof; Jun, Soon Yung; Kowalkowski, James B.; Paterno, Marc

    2015-12-23

    Performance evaluation and analysis of large scale computing applications is essential for optimal use of resources. As detector simulation is one of the most compute intensive tasks and Geant4 is the simulation toolkit most widely used in contemporary high energy physics (HEP) experiments, it is important to monitor Geant4 through its development cycle for changes in computing performance and to identify problems and opportunities for code improvements. All Geant4 development and public releases are being profiled with a set of applications that utilize different input event samples, physics parameters, and detector configurations. Results from multiple benchmarking runs are compared to previous public and development reference releases to monitor CPU and memory usage. Observed changes are evaluated and correlated with code modifications. Besides the full summary of call stack and memory footprint, a detailed call graph analysis is available to Geant4 developers for further analysis. The set of software tools used in the performance evaluation procedure, both in sequential and multi-threaded modes, include FAST, IgProf and Open|Speedshop. In conclusion, the scalability of the CPU time and memory performance in multi-threaded application is evaluated by measuring event throughput and memory gain as a function of the number of threads for selected event samples.
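
The scalability figures described above reduce to two quantities per run: event throughput and the memory gain of one multi-threaded process over the equivalent set of sequential processes. A minimal sketch with hypothetical numbers, not actual profiling output:

```python
def throughput(n_events, wall_seconds):
    """Event throughput in events per second."""
    return n_events / wall_seconds

def memory_gain(seq_rss_mb, n_threads, mt_rss_mb):
    """Memory saved by multi-threading versus n independent processes:
    ratio of the summed sequential footprints to the shared MT footprint."""
    return (seq_rss_mb * n_threads) / mt_rss_mb

# Hypothetical profiling results for a 1000-event sample.
t1 = throughput(1000, 200.0)            # 1 worker thread
t8 = throughput(1000, 30.0)             # 8 worker threads
speedup = t8 / t1
gain = memory_gain(1500.0, 8, 2400.0)   # geometry/physics tables shared
```

A gain well above 1 reflects the event-level parallelism design, where read-only tables are shared among worker threads.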

  16. GEANT4 distributed computing for compact clusters

    NASA Astrophysics Data System (ADS)

    Harrawood, Brian P.; Agasthya, Greeshma A.; Lakshmanan, Manu N.; Raterman, Gretchen; Kapadia, Anuj J.

    2014-11-01

    A new technique for distribution of GEANT4 processes is introduced to simplify running a simulation in a parallel environment such as a tightly coupled computer cluster. Using a new C++ class derived from the GEANT4 toolkit, multiple runs forming a single simulation are managed across a local network of computers with a simple inter-node communication protocol. The class is integrated with the GEANT4 toolkit and is designed to scale from a single symmetric multiprocessing (SMP) machine to compact clusters ranging in size from tens to thousands of nodes. User-designed 'work tickets' are distributed to clients using a client-server work flow model to specify the parameters for each individual run of the simulation. The new g4DistributedRunManager class was developed and thoroughly tested in the course of our Neutron Stimulated Emission Computed Tomography (NSECT) experiments. It will be useful for anyone running GEANT4 over large discrete data sets, such as covering a range of angles in computed tomography, calculating dose delivery with multiple fractions, or simply speeding the throughput of a single model.
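
The client-server 'work ticket' flow can be sketched as a queue the server drains as clients request work. The ticket fields and function names below are illustrative assumptions, not the actual g4DistributedRunManager protocol:

```python
from collections import deque

def make_tickets(angles, base_seed=1234):
    """One 'work ticket' per run; here each run covers one CT-style
    projection angle.  The ticket fields are hypothetical."""
    return deque({"run": i, "seed": base_seed + i, "angle_deg": a}
                 for i, a in enumerate(angles))

def serve(tickets, n_clients):
    """Hand tickets out to clients in round-robin order as they ask for work."""
    assignments = {c: [] for c in range(n_clients)}
    client = 0
    while tickets:
        assignments[client].append(tickets.popleft())
        client = (client + 1) % n_clients
    return assignments

# 180 projection angles in 2-degree steps, spread over 8 client nodes.
tickets = make_tickets(angles=[i * 2.0 for i in range(180)])
work = serve(tickets, n_clients=8)
```

Giving each ticket its own seed keeps the distributed runs statistically independent when the results are merged.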

  17. MCNP5 and GEANT4 comparisons for preliminary Fast Neutron Pencil Beam design at the University of Utah TRIGA system

    NASA Astrophysics Data System (ADS)

    Adjei, Christian Amevi

    The main objective of this thesis is twofold. The first objective was to develop a model for meaningful benchmarking of different versions of GEANT4 against an experimental set-up and MCNP5, pertaining to photon transport and interactions. The second objective was to develop a preliminary design of a Fast Neutron Pencil Beam (FNPB) facility applicable to the University of Utah research reactor (UUTR), using MCNP5 and GEANT4. Three GEANT4 code versions, GEANT4.9.4, GEANT4.9.3, and GEANT4.9.2, were compared to MCNP5 and to experimental measurements of gamma attenuation in air. The average gamma dose rate was measured in the laboratory at various distances from a shielded cesium source using a Ludlum Model 19 portable NaI detector. As expected, the gamma dose rate decreased with distance. All three GEANT4 code versions agreed well with both the experimental data and the MCNP5 simulation. Additionally, a simple GEANT4 and MCNP5 model was developed to compare the codes' agreement for neutron interactions in various materials. The preliminary FNPB design was developed using MCNP5; a semi-accurate model was developed using GEANT4 (because GEANT4 does not support reactor physics modeling, the reactor was represented as a surface neutron source, hence a semi-accurate model). Based on the MCNP5 model, the fast neutron flux in a sample holder of the FNPB is 6.52×10⁷ n/cm²s, one order of magnitude lower than at the large fast neutron pencil beam facilities existing elsewhere. The MCNP5 model-based neutron spectrum indicates that the maximum expected fast neutron flux occurs at a neutron energy of ~1 MeV. In addition, the MCNP5 model provided information on the gamma flux to be expected in this preliminary FNPB design; specifically, in the sample holder the gamma flux is expected to be around 10⁸ γ/cm²s, delivering a gamma dose of 4.54×10³ rem/hr. This value is one to two orders of magnitude below the gamma
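
To first order, the measured fall-off of gamma dose rate with distance from a shielded point source follows the inverse-square law. A sketch with a hypothetical 1 m reference reading (air attenuation and shield scatter neglected):

```python
def dose_rate(ref_rate, ref_dist_m, dist_m):
    """Inverse-square estimate for a point source: the rate scales with
    (reference distance / distance)^2."""
    return ref_rate * (ref_dist_m / dist_m) ** 2

# Hypothetical reference reading of 10 uSv/h at 1 m from the source;
# the values at 2 m and 4 m follow the 1/r^2 trend seen in the experiment.
readings = {d: dose_rate(10.0, 1.0, d) for d in (1.0, 2.0, 4.0)}
```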

  18. Simulation of Cold Neutron Experiments using GEANT4

    NASA Astrophysics Data System (ADS)

    Frlez, Emil; Hall, Joshua; Root, Melinda; Baessler, Stefan; Pocanic, Dinko

    2013-10-01

    We review the available GEANT4 physics processes for cold neutrons in the energy range 1-100 meV. We consider the cases of a neutron beam interacting with (i) para- and ortho-polarized liquid hydrogen, (ii) aluminum, and (iii) carbon tetrachloride (CCl4) targets. Scattering, thermal, and absorption cross sections used by the GEANT4 and MCNP6 libraries are compared with the National Nuclear Data Center (NNDC) compilation. The NPDGamma detector simulation is presented as an example of the implementation of the resulting GEANT4 code. This work is supported by NSF grant PHY-0970013.
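
In this energy range, absorption cross sections in the evaluated libraries scale approximately as 1/v, i.e. with the inverse square root of the neutron energy. A sketch scaling from the standard 25.3 meV thermal reference point; the 100 barn value is a placeholder, not a tabulated cross section for any of the targets above:

```python
import math

E_TH_meV = 25.3  # standard thermal reference energy (2200 m/s neutrons)

def sigma_abs(energy_meV, sigma_thermal_barn):
    """1/v scaling of an absorption cross section:
    sigma(E) = sigma_thermal * sqrt(E_thermal / E)."""
    return sigma_thermal_barn * math.sqrt(E_TH_meV / energy_meV)

# A slower 1 meV cold neutron sees a larger absorption cross section
# than a thermal neutron (placeholder thermal value of 100 barn).
sigma_cold = sigma_abs(1.0, 100.0)
```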

  19. SU-E-T-565: RAdiation Resistance of Cancer CElls Using GEANT4 DNA: RACE

    SciTech Connect

    Perrot, Y; Payno, H; Delage, E; Maigne, L

    2014-06-01

    Purpose: The objective of the RACE project is to develop a comparison between Monte Carlo simulations using the Geant4-DNA toolkit and measurements of radiation damage on 3D melanoma and chondrosarcoma culture cells coupled with gadolinium nanoparticles. Here we report the current status of the simulation developments. Methods: Monte Carlo studies are driven using the Geant4 toolkit and the Geant4-DNA extension. In order to model the geometry of a cell population, the open-source CPOP++ program is being developed for the geometrical representation of 3D cell populations, including a specific cell mesh coupled with a multi-agent system. Each cell includes a cytoplasm and a nucleus. The correct modeling of the cell population has been validated against confocal microscopy images of spheroids. The Geant4 Livermore physics models are used to simulate the interactions of a 250 keV X-ray beam and the production of secondaries from gadolinium nanoparticles assumed to be fixed on the cell membranes. Geant4-DNA processes are used to simulate the interactions of charged particles with the cells. An atomistic description of the DNA molecule, from PDB (Protein Data Bank) files, is provided by the PDB4DNA Geant4 user application we developed to score energy depositions in DNA base pairs and sugar-phosphate groups. Results: At the microscopic level, our simulations enable assessing the microscopic energy distribution in each compartment of a realistic 3D cell population. Dose enhancement factors due to the presence of gadolinium nanoparticles can be estimated. At the nanometer scale, direct damages to nuclear DNA are also estimated. Conclusion: We successfully simulated the impact of direct radiation on a realistic 3D cell population model compatible with microdosimetry calculations using the Geant4-DNA toolkit. Upcoming validation and the future integration of the radiochemistry module of Geant4-DNA will make it possible to correlate clusters of ionizations with in vitro

  20. Radiation Effects Investigations Based on Atmospheric Radiation Model (ATMORAD) Considering GEANT4 Simulations of Extensive Air Showers and Solar Modulation Potential.

    PubMed

    Hubert, Guillaume; Cheminet, Adrien

    2015-07-01

    The natural radiative atmospheric environment is composed of secondary cosmic rays produced when primary cosmic rays hit the atmosphere. Understanding atmospheric radiation and its dynamics is essential for evaluating single event effects, so that radiation risks in aviation and the space environment (space weather) can be assessed. In this article, we present an atmospheric radiation model, named ATMORAD (Atmospheric Radiation), which is based on GEANT4 simulations of extensive air showers according to primary spectra that depend only on the solar modulation potential (force-field approximation). The solar modulation potential can then be deduced from neutron spectrometer measurements and ATMORAD. Some comparisons between our methodology and standard approaches or measurements are also discussed. This work demonstrates the potential for using simulations of extensive air showers and neutron spectroscopy to monitor solar activity.
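
The force-field approximation mentioned above maps a local interstellar spectrum to the spectrum at Earth through the single modulation potential φ. A sketch of the standard proton formulation (Z = A = 1) with a toy power-law interstellar spectrum; the spectral index and flux units are illustrative:

```python
E0 = 0.938  # GeV, proton rest energy

def j_lis(E):
    """Toy power-law local interstellar proton spectrum (arbitrary units)."""
    return (E + E0) ** -2.7

def j_modulated(E, phi):
    """Force-field approximation for protons: the LIS evaluated at E + phi,
    rescaled by the ratio of relativistic momentum factors.
    E is kinetic energy in GeV, phi the modulation potential in GV."""
    return (j_lis(E + phi) * E * (E + 2 * E0)
            / ((E + phi) * (E + phi + 2 * E0)))

# A larger modulation potential (more active Sun) suppresses low-energy flux.
quiet, active = j_modulated(0.5, 0.4), j_modulated(0.5, 1.0)
```

Inverting this relation against measured neutron spectra is what allows φ, and hence solar activity, to be monitored.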

  1. Geant4 Monte Carlo simulation of energy loss and transmission and ranges for electrons, protons and ions

    NASA Astrophysics Data System (ADS)

    Ivantchenko, Vladimir

    Geant4 is a toolkit for Monte Carlo simulation of particle transport, originally developed for applications in high-energy physics with a focus on experiments at the Large Hadron Collider (CERN, Geneva). The transparency and flexibility of the code have spread its use to other fields of research, e.g. radiotherapy and space science. The tool provides the ability to simulate complex geometries, transport in electric and magnetic fields, and a variety of physics models of the interaction of particles with media. Geant4 has been used for the simulation of radiation effects for a number of space missions. Recent upgrades of the toolkit released in December 2009 include a new model for ion electronic stopping power based on the revised version of the ICRU 73 Report, increasing the accuracy of the simulation of ion transport. In the current work we present the status of the Geant4 electromagnetic package for the simulation of particle energy loss, ranges, and transmission. This has a direct implication for the simulation of ground-testing setups at existing European facilities and for the simulation of radiation effects in space. A number of improvements were introduced for electron and proton transport, followed by a thorough validation. It was the aim of the present study to validate the ranges against reference data from the United States National Institute of Standards and Technology (NIST) ESTAR, PSTAR and ASTAR databases. We compared Geant4 and NIST ranges of electrons using different Geant4 models. The best agreement was found for Penelope, except at very low energies in heavy materials, where the Standard package gave better results. Geant4 proton ranges in water agreed with NIST within 1%. The validation of the new ion model is performed against recent data on Bragg peak position in water. The data on transmission of carbon ions through various absorbers followed by a Bragg peak in water demonstrate that the new Geant4 model significantly improves the precision of ion ranges. The absolute accuracy of ion range
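
Range validation of this kind reduces to comparing a computed range against a reference value and quoting the relative deviation. A sketch using the approximate Bragg-Kleeman rule for protons in water; the fit constants are rounded literature values, not NIST PSTAR data:

```python
def csda_range_cm(E_MeV, alpha=0.0022, p=1.77):
    """Bragg-Kleeman rule R = alpha * E^p for protons in water.
    The constants are approximate literature values for illustration."""
    return alpha * E_MeV ** p

def percent_deviation(simulated, reference):
    """Signed relative deviation in percent."""
    return 100.0 * (simulated - reference) / reference

ref = csda_range_cm(150.0)          # roughly 15.6 cm of water at 150 MeV
dev = percent_deviation(15.8, ref)  # hypothetical simulated range
```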

  2. Validation of Geant4 Hadronic Generators versus Thin Target Data

    SciTech Connect

    Banerjee, S.; Folger, G.; Ivanchenko, A.; Ivanchenko, V.N.; Kossov, M.; Quesada, J.M.; Schalicke, A.; Uzhinsky, V.; Wenzel, H.; Wright, D.H.; Yarba, J.; /Fermilab

    2012-04-19

    The GEANT4 toolkit is widely used for the simulation of high energy physics (HEP) experiments, in particular those at the Large Hadron Collider (LHC). The requirements of robustness, stability, and quality of simulation for the LHC are demanding. This requires an accurate description of hadronic interactions for a wide range of targets over a large energy range, from stopped-particle reactions to low energy nuclear interactions to interactions at the TeV energy scale. This is achieved within the Geant4 toolkit by combining a number of models, each of which is valid within a certain energy domain. Comparison of these models to thin target data over a large energy range indicates the strengths and weaknesses of the model descriptions and the energy range over which each model is valid. Software has been developed to handle the large number of validation tests required to provide the feedback needed to improve the models. An automated process for carrying out the validation and storing/displaying the results is being developed and will be discussed.
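
Each thin-target validation test ultimately reduces to a statistical comparison of a simulated distribution against measured points. A minimal chi-square sketch with invented numbers (one plausible comparison statistic, not the project's actual test suite):

```python
def chi_square(model, data, sigma):
    """Bin-by-bin chi-square between a model prediction and measured data."""
    return sum(((m - d) / s) ** 2 for m, d, s in zip(model, data, sigma))

def reduced_chi_square(model, data, sigma, n_params=0):
    """Chi-square per degree of freedom; values near 1 indicate agreement
    within the quoted measurement errors."""
    return chi_square(model, data, sigma) / (len(data) - n_params)

# Toy three-bin spectrum comparison (invented numbers).
chi2_dof = reduced_chi_square([10.0, 8.0, 5.0], [11.0, 7.5, 5.5],
                              [1.0, 1.0, 1.0])
```

Automating such per-bin comparisons over many datasets and releases is exactly what makes regular, release-synchronized validation tractable.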

  3. GEANT4 Simulation of the NPDGamma Experiment

    NASA Astrophysics Data System (ADS)

    Frlez, Emil

    2014-03-01

    The n⃗ + p → d + γ experiment, currently taking data at the Oak Ridge SNS facility, is a high-precision measurement of weak nuclear forces at low energies. By detecting the correlation between the cold neutron spin and the photon direction in the capture of neutrons on a liquid hydrogen (LH) target, the experiment is sensitive to the properties of the neutral weak current. We have written a GEANT4 Monte Carlo simulation of the NPDGamma detector that, in addition to the active CsI detectors, also includes the different targets and passive materials as well as the beam line elements. The neutron beam energy spectrum, its profiles, divergences, and time-of-flight are simulated in detail. We have used the code to cross-calibrate the positions of the (i) polarized LH target, (ii) aluminum target, and (iii) CCl4 target. The responses of the 48 CsI detectors in the simulation were fixed using data taken on the LH target. Neutron absorption as well as scattering and thermal processes were turned on in the GEANT4 physics lists. We use the results to simulate in detail the data obtained with the different targets used in the experiment within a comprehensive analysis. This work is supported by NSF grant PHY-1307328.

  4. Identifying key surface parameters for optical photon transport in GEANT4/GATE simulations.

    PubMed

    Nilsson, Jenny; Cuplov, Vesna; Isaksson, Mats

    2015-09-01

    For a scintillator used for spectrometry, the generation, transport and detection of optical photons have a great impact on the energy spectrum resolution. A complete Monte Carlo model of a scintillator includes coupled ionizing particle and optical photon transport, which can be simulated with the GEANT4 code. The GEANT4 surface parameters control the physics processes an optical photon undergoes when reaching the surface of a volume. In this work the impact of each surface parameter on the optical transport was studied by looking at the optical spectrum: the number of detected optical photons per ionizing source particle from a large plastic scintillator, i.e. the output signal. All simulations were performed using GATE v6.2 (GEANT4 Application for Tomographic Emission). The surface parameter finish (polished, ground, front-painted or back-painted) showed the greatest impact on the optical spectrum, whereas the surface parameter σ(α), which controls the surface roughness, had a relatively small impact. It was also shown how the surface parameters reflectivity and reflectivity types (specular spike, specular lobe, Lambertian and backscatter) changed the optical spectrum depending on the probability for reflection and the combination of reflectivity types. A change in the optical spectrum will ultimately have an impact on a simulated energy spectrum. By studying the optical spectra presented in this work, a GEANT4 user can predict the shift in an optical spectrum caused by the alteration of a specific surface parameter.

  5. Visualization drivers for Geant4

    SciTech Connect

    Beretvas, Andy; /Fermilab

    2005-10-01

    This document is on Geant4 visualization tools (drivers), evaluating the pros and cons of each option, including recommendations on which tools to support at Fermilab for different applications. Four visualization drivers are evaluated: OpenGL, HepRep, DAWN and VRML. All have good features: OpenGL provides graphic output without an intermediate file; HepRep provides menus to assist the user; DAWN produces high-quality plots and, even for large files, produces output quickly; VRML uses the smallest disk space for intermediate files. Large experiments at Fermilab will want to write their own displays, and should make those displays graphics-independent. Medium-sized experiments will probably want to use HepRep because of its menu support. Smaller-scale experiments will want to use OpenGL in the spirit of having immediate response, good quality output, and keeping things simple.

  6. Geant4 - Towards major release 10

    NASA Astrophysics Data System (ADS)

    Cosmo, G.; Geant4 Collaboration

    2014-06-01

    The Geant4 simulation toolkit reached maturity in the middle of the previous decade, providing a wide variety of established features coherently aggregated in a software product that has become the standard for detector simulation in HEP and is used in a variety of other application domains. We review the most recent capabilities introduced in the kernel, highlighting those being prepared for the next major release (version 10.0), scheduled for the end of 2013. A significant new feature of this release will be the integration of multi-threaded processing, targeting efficient use of modern many-core system architectures and minimization of the memory footprint by exploiting event-level parallelism. We discuss its design features and its impact on the existing API and user interface of Geant4. Revisions balance the need to preserve backwards compatibility against consolidating and improving the interfaces, taking into account requirements from the multi-threaded extensions and from the evolution of the data processing models of the LHC experiments.

  7. Probing Planetary Bodies for Subsurface Volatiles: GEANT4 Models of Gamma Ray, Fast, Epithermal, and Thermal Neutron Response to Active Neutron Illumination

    NASA Astrophysics Data System (ADS)

    Chin, G.; Sagdeev, R.; Su, J. J.; Murray, J.

    2014-12-01

    Using an active source of neutrons as an in situ probe of a planetary body has proven to be a powerful tool for extracting information about the presence, abundance, and location of subsurface volatiles without the need for drilling. The Dynamic Albedo of Neutrons (DAN) instrument on Curiosity is an example of such an instrument; it is designed to detect the location and abundance of hydrogen within the top 50 cm of the Martian surface. DAN works by sending a pulse of neutrons towards the ground beneath the rover and detecting the reflected neutrons. The intensity and time of arrival of the reflection depend on the proportion of water, while the time the pulse takes to reach the detector is a function of the depth at which the water is located. Similar instruments can also be effective probes at the polar regions of the Moon or on asteroids as a way of detecting sequestered volatiles. We present the results of GEANT4 particle simulation models of gamma ray, fast, epithermal, and thermal neutron responses to active neutron illumination. The results are parameterized by hydrogen abundance and by the stratification and depth of volatile layers, versus the distribution of neutron and gamma ray energy reflections. Models will be presented that approximate Martian, lunar, and asteroid environments and would be useful tools for assessing utility for future NASA exploration missions to these types of planetary bodies.

  8. Geant4 simulations of the neutron production and transport in the n_TOF spallation target

    NASA Astrophysics Data System (ADS)

    Lerendegui-Marco, J.; Cortés-Giraldo, M. A.; Guerrero, C.; Quesada, J. M.

    2016-11-01

    The neutron production and transport in the spallation target of the n_TOF facility at CERN have been simulated with Geant4. The results obtained with the different hadronic physics lists provided by Geant4 have been compared with the experimental neutron flux in n_TOF-EAR1. The best overall agreement in both the absolute value and the energy dependence of the flux, from thermal to 1 GeV, is obtained with the INCL++ model coupled with the Fritiof model (FTFP). This physics list has thus been used to simulate and study the main features of the new n_TOF-EAR2 beam line, currently in its commissioning phase.

  9. A Monte Carlo pencil beam scanning model for proton treatment plan simulation using GATE/GEANT4.

    PubMed

    Grevillot, L; Bertrand, D; Dessy, F; Freud, N; Sarrut, D

    2011-08-21

    This work proposes a generic method for modeling scanned ion beam delivery systems, without simulation of the treatment nozzle and based exclusively on beam data library (BDL) measurements required for treatment planning systems (TPS). To this aim, new tools dedicated to treatment plan simulation were implemented in the Gate Monte Carlo platform. The method was applied to a dedicated nozzle from IBA for proton pencil beam scanning delivery. Optical and energy parameters of the system were modeled using a set of proton depth-dose profiles and spot sizes measured at 27 therapeutic energies. For further validation of the beam model, specific 2D and 3D plans were produced and then measured with appropriate dosimetric tools. Dose contributions from secondary particles produced by nuclear interactions were also investigated using field size factor experiments. Pristine Bragg peaks were reproduced with 0.7 mm range and 0.2 mm spot size accuracy. A 32 cm range spread-out Bragg peak with 10 cm modulation was reproduced with 0.8 mm range accuracy and a maximum point-to-point dose difference of less than 2%. A 2D test pattern consisting of a combination of homogeneous and high-gradient dose regions passed a 2%/2 mm gamma index comparison for 97% of the points. In conclusion, the generic modeling method proposed for scanned ion beam delivery systems was applicable to an IBA proton therapy system. The key advantage of the method is that it only requires BDL measurements of the system. The validation tests performed so far demonstrated that the beam model achieves clinical performance, paving the way for further studies toward TPS benchmarking. The method involves new sources that are available in the new Gate release V6.1 and could be further applied to other particle therapy systems delivering protons or other types of ions like carbon.
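
The 2%/2 mm gamma-index criterion quoted above combines dose difference and distance-to-agreement into a single pass/fail figure per point. A simplified 1D sketch with toy profile values; clinical implementations search in 3D and interpolate between reference points:

```python
def gamma_index_1d(x_eval_mm, d_eval, ref_points, dd=0.02, dta_mm=2.0):
    """Simplified 1D gamma index: minimum over reference points of the
    combined dose-difference / distance-to-agreement metric.
    A point passes the dd/dta criterion when the returned value is <= 1."""
    best = float("inf")
    for x_ref, d_ref in ref_points:
        dist_term = ((x_eval_mm - x_ref) / dta_mm) ** 2
        dose_term = ((d_eval - d_ref) / (dd * d_ref)) ** 2
        best = min(best, (dist_term + dose_term) ** 0.5)
    return best

# Toy reference profile: (position in mm, relative dose).
ref = [(0.0, 1.00), (1.0, 0.99), (2.0, 0.97)]
g = gamma_index_1d(0.0, 1.01, ref)  # 1% dose error at the matching position
```

The quoted 97% pass rate corresponds to the fraction of evaluated points with a gamma value at or below 1.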

  10. Comparative study of dose distributions and cell survival fractions for 1H, 4He, 12C and 16O beams using Geant4 and Microdosimetric Kinetic model

    NASA Astrophysics Data System (ADS)

    Burigo, Lucas; Pshenichnov, Igor; Mishustin, Igor; Bleicher, Marcus

    2015-04-01

    Depth and radial dose profiles for therapeutic 1H, 4He, 12C and 16O beams are calculated using the Geant4-based Monte Carlo model for Heavy-Ion Therapy (MCHIT). 4He and 16O ions are presented as alternative options to 1H and 12C broadly used for ion-beam cancer therapy. Biological dose profiles and survival fractions of cells are estimated using the modified Microdosimetric Kinetic model. Depth distributions of cell survival of healthy tissues, assuming 10% and 50% survival of tumor cells, are calculated for 6 cm SOBPs at two tumor depths and for different tissues radiosensitivities. It is found that the optimal ion choice depends on (i) depth of the tumor, (ii) dose levels and (iii) the contrast of radiosensitivities of tumor and surrounding healthy tissues. Our results indicate that 12C and 16O ions are more appropriate to spare healthy tissues in the case of a more radioresistant tumor at moderate depths. On the other hand, a sensitive tumor surrounded by more resistant tissues can be better treated with 1H and 4He ions. In general, 4He beam is found to be a good candidate for therapy. It better spares healthy tissues in all considered cases compared to 1H. Besides, the dose conformation is improved for deep-seated tumors compared to 1H, and the damage to surrounding healthy tissues is reduced compared to heavier ions due to the lower impact of nuclear fragmentation. No definite advantages of 16O with respect to 12C ions are found in this study.
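
For a given dose, cell survival in this kind of calculation takes the linear-quadratic form. A generic LQ sketch with toy coefficients; this is not the full Microdosimetric Kinetic model, and the alpha/beta values are not fitted parameters:

```python
import math

def survival(dose_gy, alpha, beta):
    """Linear-quadratic survival fraction S = exp(-alpha*D - beta*D^2)."""
    return math.exp(-alpha * dose_gy - beta * dose_gy ** 2)

def dose_for_survival(target_s, alpha, beta):
    """Dose giving survival target_s: positive root of
    beta*D^2 + alpha*D + ln(target_s) = 0."""
    c = -math.log(target_s)
    return (-alpha + math.sqrt(alpha ** 2 + 4.0 * beta * c)) / (2.0 * beta)

# Toy coefficients: dose needed for 10% cell survival.
d10 = dose_for_survival(0.10, alpha=0.3, beta=0.03)
```

In the MKM the effective alpha grows with LET, which is what differentiates the ions compared in the study at equal physical dose.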

  11. Geant4 application in a Web browser

    NASA Astrophysics Data System (ADS)

    Garnier, Laurent; Geant4 Collaboration

    2014-06-01

    Geant4 is a toolkit for the simulation of the passage of particles through matter. The Geant4 visualization system supports many drivers, including OpenGL[1], OpenInventor, HepRep[2], DAWN[3], VRML, RayTracer, gMocren[4] and ASCIITree, with diverse and complementary functionalities. Web applications have an increasing role in our work, and thanks to emerging frameworks such as Wt [5], a web application can be built on top of a C++ application without rewriting all the code. Because the Geant4 toolkit's visualization and user interface modules are well decoupled from the rest of Geant4, it is straightforward to adapt these modules to render in a web application instead of a computer's native window manager. The API of the Wt framework closely matches that of Qt [6], so our experience in building the Qt driver benefits the Wt driver. Porting a Geant4 application to a web application is easy: with minimal effort, Geant4 users can replicate this process to share their own Geant4 applications in a web browser.

  12. A study of the runaway relativistic electron avalanche and the feedback theory using GEANT4

    NASA Astrophysics Data System (ADS)

    Broberg Skeltved, Alexander; Østgaard, Nikolai; Carlson, Brant; Gjesteland, Thomas

    2014-05-01

    This study investigates the Runaway Relativistic Electron Avalanche (RREA) and the feedback process, as well as the production of bremsstrahlung photons from runaway electrons (REs). These processes are important for understanding the production of the intense bursts of gamma-rays known as Terrestrial Gamma-ray Flashes (TGFs). Results are obtained from Monte Carlo (MC) simulations using the GEometry ANd Tracking 4 (GEANT4) programming toolkit. The simulations take into account the effects of electron ionisation, electron-on-electron scattering (Møller scattering), and positron and photon interactions in the 250 eV-100 GeV energy range. Several physics libraries, or 'physics lists', are provided with GEANT4 to implement these physics processes in the simulations. We give a detailed analysis of the electron and feedback multiplication, in particular the avalanche lengths, Λ, the energy distribution, and the feedback factor, γ. We also find that our results vary significantly depending on which physics list we implement. In order to verify our results and the GEANT4 programming toolkit, we compare them to previous results from existing models. In addition we present the ratio of the production of bremsstrahlung photons to runaway electrons. From this ratio we obtain the parameter α, which describes the electron-to-photon relation.
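
The avalanche multiplication analysed above grows exponentially with the avalanche length, and relativistic feedback multiplies successive avalanche generations by the feedback factor. A sketch of both scalings with toy parameters (the numbers are illustrative, not fitted simulation results):

```python
import math

def avalanche_electrons(n_seed, path_m, avalanche_length_m):
    """RREA exponential multiplication: N = N0 * exp(L / lambda)."""
    return n_seed * math.exp(path_m / avalanche_length_m)

def feedback_total(n_per_avalanche, gamma, generations):
    """Electrons summed over feedback generations.  For gamma < 1 the
    series converges towards n / (1 - gamma); for gamma >= 1 it diverges,
    corresponding to a self-sustained discharge."""
    return sum(n_per_avalanche * gamma ** k for k in range(generations + 1))

# Toy numbers: 10 seed electrons developing over three avalanche lengths.
n = avalanche_electrons(10, path_m=600.0, avalanche_length_m=200.0)
```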

  13. Assessment of Geant4 Prompt-Gamma Emission Yields in the Context of Proton Therapy Monitoring.

    PubMed

    Pinto, Marco; Dauvergne, Denis; Freud, Nicolas; Krimmer, Jochen; Létang, Jean M; Testa, Etienne

    2016-01-01

    Monte Carlo tools have long been used to assist the research and development of solutions for proton therapy monitoring. The present work focuses on the prompt-gamma emission yields by comparing experimental data with the outcomes of the current version of Geant4 using all applicable proton inelastic models. For the case under study and using the binary cascade model, it was found that Geant4 overestimates the prompt-gamma emission yields by 40.2 ± 0.3%, even though it predicts the prompt-gamma profile length of the experimental profile accurately. In addition, the default implementations of all proton inelastic models overestimate the number of prompt gammas emitted. Finally, a set of built-in options and physically sound Geant4 source code changes have been tested in order to try to reduce the observed discrepancy. A satisfactory agreement was found when using the QMD model with a wave packet width equal to 1.3 fm². PMID:26858937

  15. A macroscopic and microscopic study of radon exposure using Geant4 and MCNPX to estimate dose rates and DNA damage

    NASA Astrophysics Data System (ADS)

    van den Akker, Mary Evelyn

    Radon is considered the second-leading cause of lung cancer after smoking. Epidemiological studies have been conducted in miner cohorts as well as general populations to estimate the risks associated with high and low dose exposures. There are problems with extrapolating risk estimates to low dose exposures, mainly that the dose-response curve at low doses is not well understood. Calculated dosimetric quantities give average energy depositions in an organ or a whole body, but morphological features of an individual can affect these values. As opposed to human phantom models, Computed Tomography (CT) scans provide unique, patient-specific geometries that are valuable in modeling the radiological effects of the short-lived radon progeny sources. The Monte Carlo particle transport code Geant4 was used with the CT scan data to model radon inhalation in the main bronchial bifurcation. The equivalent dose rates are near the lower bounds of estimates found in the literature, depending on source volume. To complement the macroscopic study, simulations were run in a small tissue volume using the Geant4-DNA toolkit. In this extension of Geant4, intended to simulate direct physical interactions at the cellular level, the particle track structure of the radon progeny alphas can be analyzed to estimate the damage that can occur in sensitive cellular structures such as the DNA molecule. These estimates of DNA double strand breaks are lower than those found in other Geant4-DNA studies. Further refinements of the microscopic model are at the cutting edge of nanodosimetry research.
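
    The step from a scored absorbed dose to the equivalent dose rates discussed above uses ICRP radiation weighting factors; a minimal sketch follows. The absorbed dose rate is a hypothetical value, not a result of this study.

```python
# Radiation weighting factors w_R from ICRP Publication 103
W_R = {"photon": 1.0, "electron": 1.0, "alpha": 20.0}

def equivalent_dose_rate(absorbed_gy_per_h, particle):
    """Equivalent dose rate (Sv/h) = w_R * absorbed dose rate (Gy/h)."""
    return W_R[particle] * absorbed_gy_per_h

# Hypothetical absorbed dose rate scored in the bronchial wall:
d_alpha = 1.5e-6  # Gy/h from radon-progeny alpha particles
print(f"{equivalent_dose_rate(d_alpha, 'alpha'):.1e} Sv/h")
```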

  16. Alpha Coincidence Spectroscopy studied with GEANT4

    SciTech Connect

    Dion, Michael P.; Miller, Brian W.; Tatishvili, Gocha; Warren, Glen A.

    2013-11-02

    The high-energy side of peaks in alpha spectra, e.g. 241Am, as measured with a silicon detector has structure caused mainly by alpha-conversion-electron and, to a lesser extent, alpha-gamma coincidences. We compare GEANT4 simulation results to 241Am alpha spectroscopy measurements with a passivated implanted planar silicon detector. A large discrepancy between the measurements and simulations suggests that the GEANT4 photon evaporation database for 237Np (the daughter of 241Am decay) does not accurately describe the conversion electron spectrum. We describe how to improve the agreement between GEANT4 and alpha spectroscopy for actinides of interest by including experimental measurements of conversion electron spectroscopy in the photon evaporation database.

  17. Geant4 VMC 3.0

    NASA Astrophysics Data System (ADS)

    Hřivnáčová, I.; Gheata, A.

    2015-12-01

    Virtual Monte Carlo (VMC) [1] provides an abstract interface into Monte Carlo transport codes. A user VMC based application, independent of the specific Monte Carlo code, can then be run with any of the supported simulation programs. Developed by the ALICE Offline Project and further included in ROOT [2], the interface and implementations have reached stability during the last decade and have become a foundation for other detector simulation frameworks, the FAIR facility experiments framework being among the first and largest. Geant4 VMC [3], which provides the implementation of the VMC interface for Geant4 [4], is in continuous maintenance and development, driven by the evolution of Geant4 on one side and user requirements on the other. Besides the implementation of the VMC interface, Geant4 VMC also provides a set of examples that demonstrate the use of VMC to new users and also serve for testing purposes. Since major release 2.0, it includes the G4Root navigator package, which implements an interface that allows one to run a Geant4 simulation using a ROOT geometry. The release of Geant4 version 10.00 with the integration of multithreading processing has triggered the development of the next major version of Geant4 VMC (version 3.0), which was released in November 2014. A beta version, available for user testing since March, has helped its consolidation and improvement. We will review the new capabilities introduced in this major version, in particular the integration of multithreading into the VMC design, its impact on the Geant4 VMC and G4Root packages, and the introduction of a new package, MTRoot, providing utility functions for ROOT parallel output in independent files with necessary additions for thread-safety. Migration of user applications to multithreading that preserves the ease of use of VMC will also be discussed.
We will also report on the introduction of a new CMake [5] based build system, the migration to ROOT major release 6 and the

  18. GEANT4 simulations of the DANCE array

    NASA Astrophysics Data System (ADS)

    Jandel, M.; Bredeweg, T. A.; Couture, A.; Fowler, M. M.; Bond, E. M.; Chadwick, M. B.; Clement, R. R. C.; Esch, E.-I.; O'Donnell, J. M.; Reifarth, R.; Rundberg, R. S.; Ullmann, J. L.; Vieira, D. J.; Wilhelmy, J. B.; Wouters, J. M.; Macri, R. A.; Wu, C. Y.; Becker, J. A.

    2007-08-01

    The Detector for Advanced Neutron Capture Experiments (DANCE) at Los Alamos National Laboratory (LANL) is used for neutron capture cross-section measurements. Its high granularity of 160 BaF2 detectors in a 4π geometry allows for highly efficient detection of prompt γ-rays following a neutron capture. The performance of the detector was simulated using the GEANT4 Monte Carlo code. The model includes all 160 BaF2 crystals with realistic dimensions and geometry. The 6LiH shell, beam pipe, crystal wrapping material, aluminum holders, photomultiplier material and materials of the calibration sources were included in the simulation. Simulated γ-ray energy and total γ-ray energy spectra gated on cluster and crystal multiplicities were compared to measured data using 88Y, 60Co and 22Na calibration sources. Good agreement was achieved. The total efficiency and peak-to-total ratio as a function of γ-ray energy were established for mono-energetic γ-rays.
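
    The peak-to-total ratio quoted for mono-energetic γ-rays is, in essence, the fraction of detected events that deposit the full photon energy; a toy sketch of how it can be scored from simulated energy deposits follows. The sample values are made up, not DANCE data.

```python
def peak_to_total(deposits_mev, e_gamma, window=0.05):
    """Peak-to-total ratio: detected events depositing within
    +/- `window` MeV of the full gamma energy, divided by all
    events with a non-zero energy deposit."""
    detected = [e for e in deposits_mev if e > 0.0]
    in_peak = [e for e in detected if abs(e - e_gamma) < window]
    return len(in_peak) / len(detected)

# Toy deposited energies for a 1.17 MeV gamma line:
sample = [1.17, 1.16, 0.35, 0.60, 1.17, 0.0, 0.90, 1.18]
print(round(peak_to_total(sample, 1.17), 3))  # 4 of 7 detected -> 0.571
```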

  19. Space Earthquake Perturbation Simulation (SEPS) an application based on Geant4 tools to model and simulate the interaction between the Earthquake and the particle trapped on the Van Allen belt

    NASA Astrophysics Data System (ADS)

    Ambroglini, Filippo; Jerome Burger, William; Battiston, Roberto; Vitale, Vincenzo; Zhang, Yu

    2014-05-01

    During the last decades, a few space experiments revealed anomalous bursts of charged particles, mainly electrons with energy larger than a few MeV. A possible source of these bursts is the low-frequency seismo-electromagnetic emissions, which can cause the precipitation of the electrons from the lower boundary of their inner belt. Studies of these bursts also reported a short-term pre-seismic excess. Starting from simulation tools traditionally used in high energy physics, we developed a dedicated application, SEPS (Space Earthquake Perturbation Simulation), based on the Geant4 toolkit and the PLANETOCOSMICS program, able to model and simulate the electromagnetic interaction between the earthquake and the particles trapped in the inner Van Allen belt. With SEPS one can study the transport of particles trapped in the Van Allen belts through the Earth's magnetic field, also taking into account possible interactions with the Earth's atmosphere. SEPS provides the possibility of testing different models of interaction between electromagnetic waves and trapped particles, defining the mechanism of interaction as well as shaping the area in which it takes place, assessing the effects of perturbations in the magnetic field on the particle paths, performing back-tracking analysis, and modelling the interaction with electric fields. SEPS is at an advanced stage of development, so that it can already be exploited to test in detail the results of correlation analyses between particle bursts and earthquakes based on NOAA and SAMPEX data. The test was performed both with a full simulation analysis (tracking particles from the position of the earthquake to check for paths compatible with the observed burst) and with a back-tracking analysis (tracking particles from the burst detection point and checking their compatibility with the position of the associated earthquake).

  20. GEANT4 Simulation of Hadronic Interactions at 8-GeV/C to 10-GeV/C: Response to the HARP-CDP Group

    SciTech Connect

    Uzhinsky, V.; Apostolakis, J.; Folger, G.; Ivanchenko, V.N.; Kossov, M.V.; Wright, D.H. (SLAC)

    2011-11-21

    The results of the HARP-CDP group on the comparison of GEANT4 Monte Carlo predictions versus experimental data are discussed. It is shown that the problems observed by the group are caused by an incorrect implementation of old features at the programming level, and by a lack of the nucleon Fermi motion in the simulation of quasielastic scattering. These drawbacks are not due to the physical models used. They do not manifest themselves in the most important applications of the GEANT4 toolkit.

  1. Validation of the GEANT4 simulation of bremsstrahlung from thick targets below 3 MeV

    NASA Astrophysics Data System (ADS)

    Pandola, L.; Andenna, C.; Caccia, B.

    2015-05-01

    The bremsstrahlung spectra produced by electrons impinging on thick targets are simulated using the GEANT4 Monte Carlo toolkit. Simulations are validated against experimental data available in the literature for energies between 0.5 and 2.8 MeV for Al and Fe targets, and at 70 keV for Al, Ag, W and Pb targets. The energy spectra for the different configurations of emission angles, energies and targets are considered. Simulations are performed using the three alternative sets of electromagnetic models that are available in GEANT4 to describe bremsstrahlung. At the higher energies (0.5-2.8 MeV) of electrons impinging on Al and Fe targets, GEANT4 is able to reproduce the spectral shapes and the integral photon emission in the forward direction. The agreement is within 10-30%, depending on energy, emission angle and target material. The physics model based on the Penelope Monte Carlo code is in slightly better agreement with the measured data than the other two. However, all models over-estimate the photon emission in the backward hemisphere. For the lower energy study (70 keV), which includes higher-Z targets, all models systematically under-estimate the total photon yield, with agreement between 10% and 50%. The results of this work are of potential interest for medical physics applications, where knowledge of the energy spectra and angular distributions of photons is needed for accurate dose calculations with Monte Carlo and other fluence-based methods.

  2. GEANT4 for breast dosimetry: parameters optimization study

    NASA Astrophysics Data System (ADS)

    Fedon, C.; Longo, F.; Mettivier, G.; Longo, R.

    2015-08-01

    Mean glandular dose (MGD) is the main dosimetric quantity in mammography. MGD evaluation is obtained by multiplying the entrance skin air kerma (ESAK) by normalized glandular dose (DgN) coefficients. While ESAK is an empirical quantity, DgN coefficients can only be estimated with Monte Carlo (MC) methods. Thus, a benchmark of MC parameters is needed for effectively evaluating DgN coefficients. GEANT4 is a MC toolkit suitable for medical purposes that offers users several computational choices. In this work we investigate the GEANT4 performance by testing the main PhysicsLists for medical applications. Four electromagnetic PhysicsLists were implemented: the linear attenuation coefficients were calculated for breast glandularities of 0%, 50% and 100% in the energy range 8-50 keV, and DgN coefficients were evaluated. The results were compared with published data. Fit equations for the estimation of the G-factor parameter, introduced in the literature for converting the dose delivered in the heterogeneous medium to that in the glandular tissue, are proposed, and the application of this parameter interaction-by-interaction or retrospectively is discussed. G4EmLivermorePhysicsList shows the best agreement for the linear attenuation coefficients, both with theoretical values and with published data. Moreover, an excellent correlation factor (r² > 0.99) is found for the DgN coefficients with the literature. The final goal of this study is to identify, for the first time, a benchmark of parameters that could be useful for future breast dosimetry studies with GEANT4.
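
    The MGD relation described above is a simple product of a measured and a simulated quantity; a minimal sketch follows, with illustrative (not paper-derived) ESAK and DgN values.

```python
def mean_glandular_dose(esak_mgy, dgn):
    """MGD (mGy) = entrance skin air kerma (mGy) x DgN coefficient
    (mGy of glandular dose per mGy of air kerma)."""
    return esak_mgy * dgn

# Illustrative numbers for a 50% glandularity breast:
esak = 7.0    # mGy, measured entrance skin air kerma
dgn = 0.18    # mGy/mGy, Monte Carlo derived DgN (hypothetical)
print(round(mean_glandular_dose(esak, dgn), 2), "mGy")
```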

  3. GEANT4 for breast dosimetry: parameters optimization study.

    PubMed

    Fedon, C; Longo, F; Mettivier, G; Longo, R

    2015-08-21

    Mean glandular dose (MGD) is the main dosimetric quantity in mammography. MGD evaluation is obtained by multiplying the entrance skin air kerma (ESAK) by normalized glandular dose (DgN) coefficients. While ESAK is an empirical quantity, DgN coefficients can only be estimated with Monte Carlo (MC) methods. Thus, a benchmark of MC parameters is needed for effectively evaluating DgN coefficients. GEANT4 is a MC toolkit suitable for medical purposes that offers users several computational choices. In this work we investigate the GEANT4 performance by testing the main PhysicsLists for medical applications. Four electromagnetic PhysicsLists were implemented: the linear attenuation coefficients were calculated for breast glandularities of 0%, 50% and 100% in the energy range 8-50 keV, and DgN coefficients were evaluated. The results were compared with published data. Fit equations for the estimation of the G-factor parameter, introduced in the literature for converting the dose delivered in the heterogeneous medium to that in the glandular tissue, are proposed, and the application of this parameter interaction-by-interaction or retrospectively is discussed. G4EmLivermorePhysicsList shows the best agreement for the linear attenuation coefficients, both with theoretical values and with published data. Moreover, an excellent correlation factor (r2 > 0.99) is found for the DgN coefficients with the literature. The final goal of this study is to identify, for the first time, a benchmark of parameters that could be useful for future breast dosimetry studies with GEANT4. PMID:26267405

  4. GEANT4 simulation of cyclotron radioisotope production in a solid target.

    PubMed

    Poignant, F; Penfold, S; Asp, J; Takhar, P; Jackson, P

    2016-05-01

    The use of radioisotopes in nuclear medicine is essential for diagnosing and treating cancer. The optimization of their production is a key factor in maximizing the production yield and minimizing the associated costs. An efficient approach to this problem is the use of Monte Carlo simulations prior to experimentation. By predicting isotope yields, one can study the expected activity of the isotope of interest for different energy ranges. One can also study the contamination of the target with other radioisotopes, especially undesired radioisotopes of the desired chemical element, which are difficult to separate from the irradiated target and may increase the dose when delivering the radiopharmaceutical product to the patient. The aim of this work is to build and validate a Monte Carlo simulation platform using the GEANT4 toolkit to model the solid target system of the South Australian Health and Medical Research Institute (SAHMRI) GE Healthcare PETtrace cyclotron. It includes a GEANT4 Graphical User Interface (GUI) where the user can modify simulation parameters such as the energy, shape and current of the proton beam, the target geometry and material, the foil geometry and material, and the time of irradiation. The paper describes the simulation and presents a comparison of simulated and experimental/theoretical yields for various nuclear reactions on an enriched nickel-64 target using the GEANT4 physics model QGSP_BIC_AllHP, a model recently developed to evaluate with high precision the interaction of protons with energies below 200 MeV, available in Geant4 version 10.1. The simulation yield of the (64)Ni(p,n)(64)Cu reaction was found to be 7.67±0.074 mCi·μA(-1) for a target energy range of 9-12 MeV. Szelecsenyi et al. (1993) give a theoretical yield of 6.71 mCi·μA(-1) and an experimental yield of 6.38 mCi·μA(-1). The (64)Ni(p,n)(64)Cu cross section obtained with the simulation was also verified against the yield predicted from the nuclear database TENDL and
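
    For orientation, a yield quoted in mCi·μA⁻¹ can be turned into an end-of-bombardment activity once a beam current and irradiation time are chosen. The sketch below treats the 7.67 mCi·μA⁻¹ figure as a saturation yield, which is an assumption made here purely for illustration; the current and run time are hypothetical.

```python
import math

T_HALF_CU64_H = 12.7  # 64Cu half-life in hours

def eob_activity(sat_yield_mci_per_ua, current_ua, t_irr_h):
    """End-of-bombardment activity (mCi), treating the quoted yield
    as a saturation yield (illustrative assumption):
    A = Y_sat * I * (1 - exp(-lambda * t))."""
    lam = math.log(2) / T_HALF_CU64_H
    return sat_yield_mci_per_ua * current_ua * (1 - math.exp(-lam * t_irr_h))

# 7.67 mCi/uA (simulated yield from the abstract), 40 uA beam, 4 h run:
print(round(eob_activity(7.67, 40.0, 4.0), 1), "mCi")
```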

  5. Application of dynamic Monte Carlo technique in proton beam radiotherapy using Geant4 simulation toolkit

    NASA Astrophysics Data System (ADS)

    Guan, Fada

    The Monte Carlo method has been successfully applied to simulating particle transport problems. Most Monte Carlo simulation tools are static: they can only perform static simulations for problems with fixed physics and geometry settings. Proton therapy, however, is a dynamic treatment technique in clinical application. In this research, we developed a method to perform dynamic Monte Carlo simulation of proton therapy using the Geant4 simulation toolkit. A passive-scattering treatment nozzle equipped with a rotating range modulation wheel was modeled in this research. One important application of the Monte Carlo simulation is to predict the spatial dose distribution in the target geometry. For simplification, a mathematical model of a human body is usually used as the target, but then only the average dose over a whole organ or tissue can be obtained, rather than the accurate spatial dose distribution. In this research, we developed a method using MATLAB to convert the medical images of a patient from CT scanning into a patient voxel geometry. Hence, if the patient voxel geometry is used as the target in the Monte Carlo simulation, the accurate spatial dose distribution in the target can be obtained. The data analysis tool ROOT was used to score the simulation results during a Geant4 simulation and to analyze the data and plot results after the simulation. Finally, we successfully obtained the accurate spatial dose distribution in part of a human body after treating a patient with prostate cancer using proton therapy.
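
    The CT-to-voxel-geometry conversion described above amounts to mapping each Hounsfield value to a material (and density) before handing the voxel array to the transport code; a coarse sketch follows. The thresholds are illustrative only (real conversions, e.g. the Schneider parameterisation, use much finer HU-to-density ramps).

```python
def hu_to_material(hu):
    """Map a CT Hounsfield value to a coarse material label,
    using illustrative thresholds (not a clinical calibration)."""
    if hu < -950:
        return "air"
    if hu < -120:
        return "lung"
    if hu < 100:
        return "soft_tissue"
    return "bone"

ct_row = [-1000, -700, 40, 300]  # one row of HU values from a CT slice
print([hu_to_material(hu) for hu in ct_row])
# ['air', 'lung', 'soft_tissue', 'bone']
```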

  6. geant4 hadronic cascade models analysis of proton and charged pion transverse momentum spectra from p + Cu and Pb collisions at 3, 8, and 15 GeV/c

    SciTech Connect

    Abdel-Waged, Khaled; Felemban, Nuha; Uzhinskii, V. V.

    2011-07-15

    We describe how various hadronic cascade models, which are implemented in the geant4 toolkit, describe proton and charged pion transverse momentum spectra from p + Cu and Pb collisions at 3, 8, and 15 GeV/c, recently measured in the hadron production (HARP) experiment at CERN. The Binary, ultrarelativistic quantum molecular dynamics (UrQMD) and modified FRITIOF (FTF) hadronic cascade models are chosen for investigation. The first two models are based on limited (Binary) and branched (UrQMD) binary scattering between cascade particles which can be either a baryon or meson, in the three-dimensional space of the nucleus, while the latter (FTF) considers collective interactions between nucleons only, on the plane of impact parameter. It is found that the slow (pT ≤ 0.3 GeV/c) proton spectra are quite sensitive to the different treatments of cascade pictures, while the fast (pT > 0.3 GeV/c) proton spectra are not strongly affected by the differences between the FTF and UrQMD models. It is also shown that the UrQMD and FTF combined with Binary (FTFB) models could reproduce both proton and charged pion spectra from p + Cu and Pb collisions at 3, 8, and 15 GeV/c with the same accuracy.

  7. Behaviors of the percentage depth dose curves along the beam axis of a phantom filled with different clinical PTO objects, a Monte Carlo Geant4 study

    NASA Astrophysics Data System (ADS)

    EL Bakkali, Jaafar; EL Bardouni, Tarek; Safavi, Seyedmostafa; Mohammed, Maged; Saeed, Mroan

    2016-08-01

    The aim of this work is to assess the ability of the Monte Carlo Geant4 code to reproduce the real percentage depth dose (PDD) curves generated in phantoms which mimic three important clinical treatment situations: lung slab, bone slab, and bone-lung slab geometries. It is hoped that this work will lead to a better understanding of dose distributions in an inhomogeneous medium, and help identify any limitations of the dose calculation algorithm implemented in the Geant4 code. For this purpose, the PDD dosimetric functions associated with the three clinical situations described above were compared to one produced in a homogeneous water phantom. Our results show, firstly, that the Geant4 simulation produces artifacts in the shape of the calculated PDD curve of the first physical test object (PTO), and that it is not able to successfully predict dose values in regions near the boundaries between two different materials. This is due to the electron transport algorithm and is well known as the interface-artifact phenomenon. To deal with this issue, we added and optimized the StepMax parameter in the dose calculation program; consequently, the artifacts due to electron transport nearly disappeared. However, the Geant4 simulation becomes very slow when we attempt to resolve the electron artifact problems completely by using a smaller value of the electron StepMax parameter. After electron transport optimization, our results demonstrate the medium-level capability of the Geant4 code to model dose distributions in clinical PTO objects.
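
    For reference, a PDD curve like those compared above is simply a depth-dose profile normalised to its maximum; a minimal sketch with toy values:

```python
def percentage_depth_dose(depth_dose):
    """Normalise a depth-dose curve to its maximum:
    PDD(z) = 100 * D(z) / D_max."""
    d_max = max(depth_dose)
    return [100.0 * d / d_max for d in depth_dose]

# Toy dose values sampled along the beam axis (arbitrary units):
print(percentage_depth_dose([0.6, 1.0, 0.9, 0.7]))
# [60.0, 100.0, 90.0, 70.0]
```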

  8. An XML description of detector geometries for GEANT4

    NASA Astrophysics Data System (ADS)

    Figgins, J.; Walker, B.; Comfort, J. R.

    2006-12-01

    A code has been developed that enables the geometry of detectors to be specified easily and flexibly in the XML language, for use in the Monte Carlo program GEANT4. The user can provide clear documentation of the geometry without being proficient in the C++ language of GEANT4. The features and some applications are discussed.

  9. A generic x-ray tracing toolbox in Geant4

    NASA Astrophysics Data System (ADS)

    Vacanti, Giuseppe; Buis, Ernst-Jan; Collon, Maximilien; Beijersbergen, Marco; Kelly, Chris

    2009-05-01

    We have developed a generic X-ray tracing toolbox based on Geant4, a generic simulation toolkit. By leveraging the facilities available in Geant4, we are able to design and analyze complex X-ray optical systems. In this article we describe our toolbox and how it is being applied to support the development of silicon pore optics for IXO.

  10. SU-E-T-347: Validation of the Condensed History Algorithm of Geant4 Using the Fano Test

    SciTech Connect

    Lee, H; Mathis, M; Sawakuchi, G

    2014-06-01

    Purpose: To validate the condensed history algorithm and physics of the Geant4 Monte Carlo toolkit for simulations of ionization chambers (ICs). This study is the first step to validate Geant4 for calculations of photon beam quality correction factors under the presence of a strong magnetic field for magnetic resonance guided linac system applications. Methods: The electron transport and boundary crossing algorithms of Geant4 version 9.6.p02 were tested under Fano conditions using the Geant4 example/application FanoCavity. User-defined parameters of the condensed history and multiple scattering algorithms were investigated under Fano test conditions for three scattering models (physics lists): G4UrbanMscModel95 (PhysListEmStandard-option3), G4GoudsmitSaundersonMsc (PhysListEmStandard-GS), and G4WentzelVIModel/G4CoulombScattering (PhysListEmStandard-WVI). Simulations were conducted using monoenergetic photon beams, ranging from 0.5 to 7 MeV and emphasizing energies from 0.8 to 3 MeV. Results: The GS and WVI physics lists provided consistent Fano test results (within ±0.5%) for maximum step sizes under 0.01 mm at 1.25 MeV, with improved performance at 3 MeV (within ±0.25%). The option3 physics list provided consistent Fano test results (within ±0.5%) for maximum step sizes above 1 mm. Optimal parameters for the option3 physics list were 10 km maximum step size with default values for other user-defined parameters: 0.2 dRoverRange, 0.01 mm final range, 0.04 range factor, 2.5 geometrical factor, and 1 skin. Simulations using the option3 physics list were ∼70 – 100 times faster compared to GS and WVI under optimal parameters. Conclusion: This work indicated that the option3 physics list passes the Fano test within ±0.5% when using a maximum step size of 10 km for energies suitable for IC calculations in a 6 MV spectrum without extensive computational times. Optimal user-defined parameters using the option3 physics list will be used in future IC simulations to
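
    The ±0.5% pass criterion used above reduces to a relative deviation between the scored cavity dose and the analytic Fano expectation; a trivial sketch follows (the dose values are hypothetical).

```python
def fano_deviation(scored_dose, expected_dose):
    """Percent deviation of the scored cavity dose from the
    analytic Fano expectation; the test passes within +/-0.5%."""
    return 100.0 * (scored_dose / expected_dose - 1.0)

dev = fano_deviation(1.003, 1.000)   # hypothetical doses (Gy)
print(f"deviation = {dev:+.2f}% -> {'pass' if abs(dev) <= 0.5 else 'fail'}")
```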

  11. Development of a Geant4 based Monte Carlo Algorithm to evaluate the MONACO VMAT treatment accuracy.

    PubMed

    Fleckenstein, Jens; Jahnke, Lennart; Lohr, Frank; Wenz, Frederik; Hesser, Jürgen

    2013-02-01

    A method to evaluate the dosimetric accuracy of volumetric modulated arc therapy (VMAT) treatment plans, generated with the MONACO™ (version 3.0) treatment planning system in realistic CT-data, with an independent Geant4 based dose calculation algorithm is presented. To this end, a model of an Elekta Synergy linear accelerator treatment head with an MLCi2 multileaf collimator was implemented in Geant4. The time dependent linear accelerator components were modeled by importing either logfiles of an actual plan delivery or a DICOM-RT plan sequence. Absolute dose calibration, depending on a reference measurement, was applied. The MONACO as well as the Geant4 treatment head model was commissioned with lateral profiles and depth dose curves of square fields in water and with film measurements in inhomogeneous phantoms. A VMAT treatment plan for a patient with a thoracic tumor and a VMAT treatment plan of a patient who received treatment in the thoracic spine region, including metallic implants, were used for evaluation. For both MONACO and Geant4, depth dose curves and lateral profiles of square fields met a mean local gamma (2%, 2 mm) tolerance criterion in more than 95% of points for all fields. Film measurements in inhomogeneous phantoms with a global gamma of (3%, 3 mm) showed a pass rate above 95% in all voxels receiving more than 25% of the maximum dose. A dose-volume-histogram comparison of the VMAT patient treatment plans showed mean deviations between Geant4 and MONACO of -0.2% (first patient) and 2.0% (second patient) for the PTVs, and (0.5±1.0)% and (1.4±1.1)% for the organs at risk, in relation to the prescription dose. The presented method can be used to validate VMAT dose distributions generated by a large number of small segments in regions with high electron density gradients. The MONACO dose distributions showed good agreement with Geant4 and film measurements within the simulation and measurement errors.
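
    The (2%, 2mm) and (3%, 3mm) figures refer to the gamma-index test of Low et al.: for each reference point, the dose difference and distance to agreement are combined and minimised over the evaluated distribution, and a point passes when the result is ≤ 1. A simplified 1D global-gamma sketch (toy profiles, not the paper's data):

```python
import math

def gamma_index(ref, evl, spacing_mm, dose_tol=0.02, dta_mm=2.0):
    """1D global gamma index: for each reference point, minimise
    sqrt((dist/DTA)^2 + (dose_diff/tol)^2) over evaluated points.
    dose_tol is a fraction of the maximum reference dose."""
    d_max = max(ref)
    gammas = []
    for i, dr in enumerate(ref):
        best = float("inf")
        for j, de in enumerate(evl):
            dist = (i - j) * spacing_mm
            ddose = (de - dr) / (dose_tol * d_max)
            best = min(best, math.hypot(dist / dta_mm, ddose))
        gammas.append(best)
    return gammas

ref = [0.2, 0.5, 1.0, 0.5, 0.2]
evl = [0.2, 0.51, 1.0, 0.49, 0.2]      # toy profiles, 1 mm spacing
g = gamma_index(ref, evl, spacing_mm=1.0)
print(f"pass rate: {sum(v <= 1 for v in g) / len(g):.0%}")
```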

  12. CMS validation Experience: Test-beam 2004 data vs Geant4

    NASA Astrophysics Data System (ADS)

    Piperov, Stefan

    2007-03-01

    A comparison between the Geant4 Monte-Carlo simulation of CMS Detector's Calorimetric System and data from the 2004 Test-Beam at CERN's SPS H2 beam-line is presented. The overall simulated response agrees quite well with the measured response. Slight differences in the longitudinal shower profiles between the MC predictions made with different Physics Lists are observed.

  13. CMS validation experience: Test-beam 2004 data vs GEANT4

    SciTech Connect

    Piperov, Stefan (Fermilab; Institute for Nuclear Research, Sofia)

    2007-01-01

    A comparison between the Geant4 Monte-Carlo simulation of CMS Detector's Calorimetric System and data from the 2004 Test-Beam at CERN's SPS H2 beam-line is presented. The overall simulated response agrees quite well with the measured response. Slight differences in the longitudinal shower profiles between the MC predictions made with different Physics Lists are observed.

  14. Refactoring, reengineering and evolution: paths to Geant4 uncertainty quantification and performance improvement

    NASA Astrophysics Data System (ADS)

    Batič, M.; Begalli, M.; Han, M.; Hauf, S.; Hoff, G.; Kim, C. H.; Kuster, M.; Pia, M. G.; Saracco, P.; Seo, H.; Weidenspointner, G.; Zoglauer, A.

    2012-12-01

    Ongoing investigations into improving Geant4 accuracy and computational performance by refactoring and reengineering parts of the code are discussed. Issues in refactoring that are specific to the domain of physics simulation are identified and their impact is elucidated. Preliminary quantitative results are reported.

  15. A Student Project to use Geant4 Simulations for a TMS-PET combination

    NASA Astrophysics Data System (ADS)

    Altamirano, A.; Chamorro, A.; Hurtado, K.; Romero, C.; Rueda, A.; Solano Salinas, C. J.; Wahl, D.; Zamudio, A.

    2007-10-01

    Geant4 is one of the most powerful tools for MC simulation of detectors and their applications. We present a student project to simulate a combined Transcranial Magnetic Stimulation-Positron Emission Tomography (TMS-PET) system using Geant4. This project aims to study PET-TMS systems by implementing a model for the brain response to the TMS pulse and studying the simulated PET response. In order to increase the speed of the simulations we parallelise our programs and investigate the possibility of using GRID computing.

  16. A Student Project to use Geant4 Simulations for a TMS-PET combination

    SciTech Connect

    Altamirano, A.; Chamorro, A.; Hurtado, K.; Romero, C.; Wahl, D.; Zamudio, A.; Rueda, A.; Solano Salinas, C. J.

    2007-10-26

    Geant4 is one of the most powerful tools for MC simulation of detectors and their applications. We present a student project to simulate a combined Transcranial Magnetic Stimulation-Positron Emission Tomography (TMS-PET) system using Geant4. This project aims to study PET-TMS systems by implementing a model for the brain response to the TMS pulse and studying the simulated PET response. In order to increase the speed of the simulations we parallelise our programs and investigate the possibility of using GRID computing.

  17. A CUDA Monte Carlo simulator for radiation therapy dosimetry based on Geant4

    NASA Astrophysics Data System (ADS)

    Henderson, N.; Murakami, K.; Amako, K.; Asai, M.; Aso, T.; Dotti, A.; Kimura, A.; Gerritsen, M.; Kurashige, H.; Perl, J.; Sasaki, T.

    2014-06-01

    Geant4 is a large-scale particle physics package that facilitates every aspect of particle transport simulation. This includes, but is not limited to, geometry description, material definition, tracking of particles passing through and interacting with matter, storage of event data, and visualization. As more detailed and complex simulations are required in different application domains, there is much interest in adapting the code for parallel and multi-core architectures. Parallelism can be achieved by tracking many particles at the same time. The complexity in the context of a GPU/CUDA adaptation is the highly serialized nature of the Geant4 package and the presence of large lookup tables that guide the simulation. This work presents G4CU, a CUDA implementation of the core Geant4 algorithm adapted for dose calculations in radiation therapy. For these applications the geometry is a block of voxels and the physics is limited to low energy electromagnetic physics. These features allow efficient tracking of many particles in parallel on the GPU. Experiments with radiotherapy simulations in G4CU demonstrate speedups of about 40 times over Geant4.

  18. Geant4-based radiation hazard assessment for human exploration missions

    NASA Astrophysics Data System (ADS)

    Bernabeu, J.; Casanova, I.

    Human exploration missions in the Solar System will spend most of their time in deep space, without the shielding of the Earth's magnetic field, and hence directly exposed to Galactic Cosmic Rays (GCR) and Solar Particle Events (SPE). Both GCR and SPE fluences have been used to calculate the dose deposited in a water slab phantom using MULASSIS, a program based on the Geant4 Monte Carlo particle transport code. Doses from several extreme SPE events and from a GCR model are calculated for different shielding materials and thicknesses using a planar slab geometry, and compared to current dose limits for space operations. A cross-comparison of MULASSIS with HZETRN (a deterministic code) has also been performed for SPE and GCR environments, showing overall reasonable agreement between the two codes.

  19. GGEMS-Brachy: GPU GEant4-based Monte Carlo simulation for brachytherapy applications

    NASA Astrophysics Data System (ADS)

    Lemaréchal, Yannick; Bert, Julien; Falconnet, Claire; Després, Philippe; Valeri, Antoine; Schick, Ulrike; Pradier, Olivier; Garcia, Marie-Paule; Boussion, Nicolas; Visvikis, Dimitris

    2015-07-01

    In brachytherapy, plans are routinely calculated using the AAPM TG43 formalism which considers the patient as a simple water object. An accurate modeling of the physical processes considering patient heterogeneity using Monte Carlo simulation (MCS) methods is currently too time-consuming and computationally demanding to be routinely used. In this work we implemented and evaluated an accurate and fast MCS on Graphics Processing Units (GPU) for brachytherapy low dose rate (LDR) applications. A previously proposed Geant4 based MCS framework implemented on GPU (GGEMS) was extended to include a hybrid GPU navigator, allowing navigation within voxelized patient-specific images and analytically modeled 125I seeds used in LDR brachytherapy. In addition, dose scoring based on a track length estimator, including uncertainty calculations, was incorporated. The implemented GGEMS-brachy platform was validated using a comparison with Geant4 simulations and reference datasets. Finally, a comparative dosimetry study based on the current clinical standard (TG43) and the proposed platform was performed on twelve prostate cancer patients undergoing LDR brachytherapy. Considering patient 3D CT volumes of 400 × 250 × 65 voxels and an average of 58 implanted seeds, the mean patient dosimetry study run time for a 2% dose uncertainty was 9.35 s (≈500 ms per 10⁶ simulated particles) and 2.5 s when using one and four GPUs, respectively. The performance of the proposed GGEMS-brachy platform makes Monte Carlo simulation-based dosimetry studies compatible with clinical practice in brachytherapy. Although the proposed platform was evaluated for prostate cancer, it is equally applicable to other LDR brachytherapy clinical applications. Future extensions will allow its application in high dose rate brachytherapy applications.
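As an illustration of the track length estimator mentioned above, here is a minimal 1-D sketch (the μ_en/ρ values and geometry are placeholders, not GGEMS code): each voxel's tally receives energy × (μ_en/ρ) × track length through the voxel, so every photon path scores dose even where no discrete interaction occurs, which is what makes the estimator fast:

```python
import numpy as np

def tle_score_1d(z0, z1, energy, voxel_edges, mu_en_rho):
    """Score a straight photon track from z0 to z1 into a 1-D voxel grid
    with the track length estimator (TLE): each voxel's dose tally gets
    E * (mu_en/rho) * (track length inside the voxel), instead of waiting
    for a discrete energy deposition there. mu_en_rho values below are
    illustrative placeholders, not tabulated data.
    """
    dose = np.zeros(len(voxel_edges) - 1)
    for i in range(len(dose)):
        lo, hi = voxel_edges[i], voxel_edges[i + 1]
        overlap = max(0.0, min(z1, hi) - max(z0, lo))  # track length in voxel
        dose[i] = energy * mu_en_rho[i] * overlap
    return dose

edges = np.linspace(0.0, 5.0, 6)          # five 1-cm voxels
mu = np.full(5, 0.03)                     # cm^2/g, placeholder value
d = tle_score_1d(0.5, 3.2, energy=0.0355, voxel_edges=edges, mu_en_rho=mu)
```

Every simulated photon contributes to many voxels per track, which is why a TLE reaches a given dose uncertainty with far fewer histories than analog scoring.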

  20. artG4: A Generic Framework for Geant4 Simulations

    SciTech Connect

    Arvanitis, Tasha; Lyon, Adam

    2014-01-01

    A small experiment must devote its limited computing expertise to writing physics code directly applicable to the experiment. A software 'framework' is essential for providing an infrastructure that makes writing the physics-relevant code easy. In this paper, we describe a highly modular and easy-to-use framework for writing Geant4-based simulations called 'artg4'. This framework is a layer on top of the art framework.

  1. Validation of a small-animal PET simulation using GAMOS: a GEANT4-based framework.

    PubMed

    Cañadas, M; Arce, P; Rato Mendes, P

    2011-01-01

    Monte Carlo-based modelling is a powerful tool to help in the design and optimization of positron emission tomography (PET) systems. The performance of these systems depends on several parameters, such as detector physical characteristics, shielding or electronics, whose effects can be studied on the basis of realistic simulated data. The aim of this paper is to validate a comprehensive study of the Raytest ClearPET small-animal PET scanner using a new Monte Carlo simulation platform which has been developed at CIEMAT (Madrid, Spain), called GAMOS (GEANT4-based Architecture for Medicine-Oriented Simulations). This toolkit, based on the GEANT4 code, was originally designed to cover multiple applications in the field of medical physics from radiotherapy to nuclear medicine, but has since been applied by some of its users in other fields of physics, such as neutron shielding, space physics, high energy physics, etc. Our simulation model includes the relevant characteristics of the ClearPET system, namely, the double layer of scintillator crystals in phoswich configuration, the rotating gantry, the presence of intrinsic radioactivity in the crystals or the storage of single events for an off-line coincidence sorting. Simulated results are contrasted with experimental acquisitions including studies of spatial resolution, sensitivity, scatter fraction and count rates in accordance with the National Electrical Manufacturers Association (NEMA) NU 4-2008 protocol. Spatial resolution results showed a discrepancy between simulated and measured values equal to 8.4% (with a maximum FWHM difference over all measurement directions of 0.5 mm). Sensitivity results differ less than 1% for a 250-750 keV energy window. 
Simulated and measured count rates agree well within a wide range of activities, including under electronic saturation of the system (the measured peak of total coincidences, for the mouse-sized phantom, was 250.8 kcps reached at 0.95 MBq mL⁻¹ and the simulated peak

  2. Simulation of Auger electron emission from nanometer-size gold targets using the Geant4 Monte Carlo simulation toolkit

    NASA Astrophysics Data System (ADS)

    Incerti, S.; Suerfu, B.; Xu, J.; Ivantchenko, V.; Mantero, A.; Brown, J. M. C.; Bernal, M. A.; Francis, Z.; Karamitros, M.; Tran, H. N.

    2016-04-01

    A revised atomic deexcitation framework for the Geant4 general purpose Monte Carlo toolkit capable of simulating full Auger deexcitation cascades was implemented in the June 2015 release (version 10.2 Beta). An overview of this refined framework and testing of its capabilities is presented for the irradiation of gold nanoparticles (NP) with keV photon and MeV proton beams. The resultant energy spectra of secondary particles created within and that escape the NP are analyzed and discussed. It is anticipated that this new functionality will improve and increase the use of Geant4 in medical physics, radiobiology, nanomedicine research and other low-energy physics fields.

  3. Application of automated weight windows to spallation neutron source shielding calculations using Geant4

    NASA Astrophysics Data System (ADS)

    Stenander, John; DiJulio, Douglas D.

    2015-10-01

    We present an implementation of a general weight-window generator for global variance reduction in Geant4 based applications. The implementation is flexible and can be easily adjusted to a user-defined model. In this work, the weight-window generator was applied to calculations based on an instrument shielding model of the European Spallation Source, which is currently under construction in Lund, Sweden. The results and performance of the implemented methods were evaluated through the definition of two figures of merit. It was found that the biased simulations showed an overall improvement in performance compared to the unbiased simulations. The present work demonstrates both the suitability of the generator method and Geant4 for these types of calculations.
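The particle-side mechanics of a weight window can be sketched briefly. In this hypothetical example (the thresholds are illustrative; a generator like the one described above would derive them per region from an estimated importance map), heavy particles are split and light ones play Russian roulette:

```python
import random

def apply_weight_window(weight, w_low, w_high, rng=random):
    """Apply a weight window to one particle.

    Above the window: split into copies so each child weight falls inside.
    Below the window: play Russian roulette -- survive with probability
    weight / w_survive and be bumped up to w_survive, else terminate.
    Returns the list of (possibly zero) resulting particle weights.
    Thresholds here are illustrative placeholders.
    """
    w_survive = (w_low + w_high) / 2.0
    if weight > w_high:                    # split: conserves total weight
        n = int(weight / w_high) + 1
        return [weight / n] * n
    if weight < w_low:                     # roulette: unbiased on average
        if rng.random() < weight / w_survive:
            return [w_survive]
        return []
    return [weight]                        # inside the window: unchanged
```

Splitting conserves the total weight exactly, while roulette conserves it only in expectation; together they keep the particle population roughly uniform in statistical importance across the geometry.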

  4. Calculation of Coincidence Summing Correction Factors for an HPGe detector using GEANT4.

    PubMed

    Giubrone, G; Ortiz, J; Gallardo, S; Martorell, S; Bas, M C

    2016-07-01

    The aim of this paper was to calculate the True Coincidence Summing Correction Factors (TSCFs) for an HPGe coaxial detector in order to correct for the summing effect caused by the presence of ⁸⁸Y and ⁶⁰Co in a multigamma source used to obtain an efficiency calibration curve. Results were obtained for three volumetric sources using the Monte Carlo toolkit GEANT4. The first part of this paper deals with modeling the detector in order to obtain a simulated full-energy peak efficiency curve. A quantitative comparison between the measured and simulated values was made across the entire energy range under study. The TSCFs were calculated for ⁸⁸Y and ⁶⁰Co using the full-energy peak efficiencies obtained with GEANT4. This methodology was subsequently applied to ¹³⁴Cs, which presents a complex decay scheme. PMID:27085040

  5. Monte Carlo simulation of a photodisintegration of ³H experiment in Geant4

    NASA Astrophysics Data System (ADS)

    Gray, Isaiah

    2013-10-01

    An upcoming experiment involving photodisintegration of ³H at the High Intensity Gamma-Ray Source facility at Duke University has been simulated in the software package Geant4. CAD models of silicon detectors and wire chambers were imported from Autodesk Inventor using the program FastRad and the Geant4 GDML importer. Sensitive detectors were associated with the appropriate logical volumes in the exported GDML file so that changes in detector geometry will be easily manifested in the simulation. Probability distribution functions for the energy and direction of outgoing protons were generated using numerical tables from previous theory, and energies and directions were sampled from these distributions using a rejection sampling algorithm. The simulation will be a useful tool to optimize detector geometry, estimate background rates, and test data analysis algorithms. This work was supported by the Triangle Universities Nuclear Laboratory REU program at Duke University.
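Rejection sampling, as used above for the outgoing-proton distributions, is simple to sketch. The sin²θ shape below is a stand-in (illustrative of an E1-type angular distribution), not the actual theory tables from the experiment:

```python
import math
import random

def rejection_sample(pdf, x_min, x_max, pdf_max, rng=random):
    """Draw one sample from an arbitrary (unnormalised) pdf by rejection:
    propose x uniformly on [x_min, x_max], then accept with probability
    pdf(x) / pdf_max. pdf_max must bound the pdf on the interval, or the
    sampled distribution is distorted.
    """
    while True:
        x = rng.uniform(x_min, x_max)
        if rng.uniform(0.0, pdf_max) < pdf(x):
            return x

# Sample outgoing-proton polar angles from a sin^2(theta) shape
# (a placeholder angular distribution, not the experiment's tables).
random.seed(1)
angles = [rejection_sample(lambda t: math.sin(t) ** 2, 0.0, math.pi, 1.0)
          for _ in range(10_000)]
```

Rejection sampling needs no inverse CDF, which is why it suits distributions given only as numerical tables; its cost is the fraction of proposals rejected.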

  6. The GEANT4 toolkit for microdosimetry calculations: application to microbeam radiation therapy (MRT).

    PubMed

    Spiga, J; Siegbahn, E A; Bräuer-Krisch, E; Randaccio, P; Bravin, A

    2007-11-01

    Theoretical dose distributions for microbeam radiation therapy (MRT) are computed in this paper using the GEANT4 Monte Carlo (MC) simulation toolkit. MRT is an innovative experimental radiotherapy technique carried out using an array of parallel microbeams of synchrotron-wiggler-generated x rays. Although the biological mechanisms underlying the effects of microbeams are still largely unknown, the effectiveness of MRT can be traced back to the natural ability of normal tissues to rapidly repair small damage to the vasculature, and to the lack of a similar healing process in tumoral tissues. Contrary to conventional therapy, in which each beam is at least several millimeters wide, the narrowness of the microbeams allows a rapid regeneration of the blood vessels along the beams' trajectories. For this reason the calculation of the "valley" dose is of crucial importance and the correct use of MC codes for such purposes must be understood. GEANT4 offers, in addition to the standard libraries, a specialized package specifically designed to deal with electromagnetic interactions of particles with matter for energies down to 250 eV. This package implements two different approaches for electron and photon transport, one based on evaluated data libraries, the other adopting analytical models. These features are exploited to cross-check theoretical computations for MRT. The lateral and depth dose profiles are studied for the irradiation of a 20 cm diameter, 20 cm long cylindrical phantom, with cylindrical sources of different size and energy. Microbeam arrays are simulated with the aid of superposition algorithms, and the ratios of peak-to-valley doses are computed for typical cases used in preclinical assays. Dose profiles obtained using the GEANT4 evaluated data libraries and analytical models are compared with simulation results previously obtained using the PENELOPE code. The results show that dose profiles computed with GEANT4's analytical model are almost

  7. Introducing Third-Year Undergraduates to GEANT4 Simulations of Light Transport and Collection in Scintillation Materials

    ERIC Educational Resources Information Center

    Riggi, Simone; La Rocca, Paola; Riggi, Francesco

    2011-01-01

    GEANT4 simulations of the processes affecting the transport and collection of optical photons generated inside a scintillation detector were carried out, with the aim to complement the educational material offered by textbooks to third-year physics undergraduates. Two typical situations were considered: a long scintillator strip with and without a…

  8. Consistency evaluation between EGSnrc and Geant4 charged particle transport in an equilibrium magnetic field

    NASA Astrophysics Data System (ADS)

    Yang, Y. M.; Bednarz, B.

    2013-02-01

    Following the proposal by several groups to integrate magnetic resonance imaging (MRI) with radiation therapy, much attention has been afforded to examining the impact of strong (on the order of a Tesla) transverse magnetic fields on photon dose distributions. The effect of the magnetic field on dose distributions must be considered in order to take full advantage of the benefits of real-time intra-fraction imaging. In this investigation, we compared the handling of particle transport in magnetic fields between two Monte Carlo codes, EGSnrc and Geant4, to analyze various aspects of their electromagnetic transport algorithms; both codes are well-benchmarked for medical physics applications in the absence of magnetic fields. A water-air-water slab phantom and a water-lung-water slab phantom were used to highlight dose perturbations near high- and low-density interfaces. We have implemented a method of calculating the Lorentz force in EGSnrc based on theoretical models in the literature, and show very good consistency between the two Monte Carlo codes. This investigation further demonstrates the importance of accurate dosimetry for MRI-guided radiation therapy (MRIgRT), and facilitates the integration of a ViewRay MRIgRT system in the University of Wisconsin-Madison's Radiation Oncology Department.

  9. Consistency evaluation between EGSnrc and Geant4 charged particle transport in an equilibrium magnetic field.

    PubMed

    Yang, Y M; Bednarz, B

    2013-02-21

    Following the proposal by several groups to integrate magnetic resonance imaging (MRI) with radiation therapy, much attention has been afforded to examining the impact of strong (on the order of a Tesla) transverse magnetic fields on photon dose distributions. The effect of the magnetic field on dose distributions must be considered in order to take full advantage of the benefits of real-time intra-fraction imaging. In this investigation, we compared the handling of particle transport in magnetic fields between two Monte Carlo codes, EGSnrc and Geant4, to analyze various aspects of their electromagnetic transport algorithms; both codes are well-benchmarked for medical physics applications in the absence of magnetic fields. A water-air-water slab phantom and a water-lung-water slab phantom were used to highlight dose perturbations near high- and low-density interfaces. We have implemented a method of calculating the Lorentz force in EGSnrc based on theoretical models in the literature, and show very good consistency between the two Monte Carlo codes. This investigation further demonstrates the importance of accurate dosimetry for MRI-guided radiation therapy (MRIgRT), and facilitates the integration of a ViewRay MRIgRT system in the University of Wisconsin-Madison's Radiation Oncology Department.

  10. GEANT4 simulations of Cherenkov reaction history diagnostics.

    PubMed

    Rubery, M S; Horsfield, C J; Herrmann, H W; Kim, Y; Mack, J M; Young, C S; Caldwell, S E; Evans, S C; Sedilleo, T J; McEvoy, A; Miller, E K; Stoeffl, W; Ali, Z; Toebbe, J

    2010-10-01

    This paper compares the results from a GEANT4 simulation of the gas Cherenkov detector 1 (GCD1) with previous simulations and experimental data from the Omega laser facility. The GCD1 collects gammas emitted during a deuterium-tritium capsule implosion and converts them, through several processes, to Cherenkov light. Photon signals are recorded using subnanosecond photomultiplier tubes, producing burn reaction histories. The GEANT4 GCD1 simulation is first benchmarked against ACCEPT, an Integrated Tiger Series code, with good agreement. The simulation is subsequently compared with data from the Omega laser facility, where experiments have been performed to measure the effects of Hohlraum materials on reaction history signals, in preparation for experiments at the National Ignition Facility. PMID:21033850

  11. GEANT4 simulations of Cherenkov reaction history diagnostics

    SciTech Connect

    Rubery, M. S.; Horsfield, C. J.; Herrmann, H. W.; Kim, Y.; Mack, J. M.; Young, C. S.; Caldwell, S. E.; Evans, S. C.; Sedilleo, T. J.; McEvoy, A.; Miller, E. K.; Stoeffl, W.; Ali, Z.

    2010-10-15

    This paper compares the results from a GEANT4 simulation of the gas Cherenkov detector 1 (GCD1) with previous simulations and experimental data from the Omega laser facility. The GCD1 collects gammas emitted during a deuterium-tritium capsule implosion and converts them, through several processes, to Cherenkov light. Photon signals are recorded using subnanosecond photomultiplier tubes, producing burn reaction histories. The GEANT4 GCD1 simulation is first benchmarked against ACCEPT, an Integrated Tiger Series code, with good agreement. The simulation is subsequently compared with data from the Omega laser facility, where experiments have been performed to measure the effects of Hohlraum materials on reaction history signals, in preparation for experiments at the National Ignition Facility.

  12. GEANT4 simulations of Cherenkov reaction history diagnostics

    NASA Astrophysics Data System (ADS)

    Rubery, M. S.; Horsfield, C. J.; Herrmann, H. W.; Kim, Y.; Mack, J. M.; Young, C. S.; Caldwell, S. E.; Evans, S. C.; Sedilleo, T. J.; McEvoy, A.; Miller, E. K.; Stoeffl, W.; Ali, Z.; Toebbe, J.

    2010-10-01

    This paper compares the results from a GEANT4 simulation of the gas Cherenkov detector 1 (GCD1) with previous simulations and experimental data from the Omega laser facility. The GCD1 collects gammas emitted during a deuterium-tritium capsule implosion and converts them, through several processes, to Cherenkov light. Photon signals are recorded using subnanosecond photomultiplier tubes, producing burn reaction histories. The GEANT4 GCD1 simulation is first benchmarked against ACCEPT, an Integrated Tiger Series code, with good agreement. The simulation is subsequently compared with data from the Omega laser facility, where experiments have been performed to measure the effects of Hohlraum materials on reaction history signals, in preparation for experiments at the National Ignition Facility.

  13. Transmission Efficiency of the Sage Spectrometer Using GEANT4

    NASA Astrophysics Data System (ADS)

    Cox, D. M.; Herzberg, R.-D.; Papadakis, P.; Ali, F.; Butler, P. A.; Cresswell, J. R.; Mistry, A.; Sampson, J.; Seddon, D. A.; Thornhill, J.; Wells, D.; Konki, J.; Greenlees, P. T.; Rahkila, P.; Pakarinen, J.; Sandzelius, M.; Sorri, J.; Julin, R.; Coleman-Smith, P. J.; Lazarus, I. H.; Letts, S. C.; Simpson, J.; Pucknell, V. F. E.

    2014-09-01

    The new SAGE spectrometer allows simultaneous electron and γ-ray in-beam studies of heavy nuclei. A comprehensive GEANT4 simulation suite has been created for the SAGE spectrometer. This includes both the silicon detectors for electron detection and the germanium detectors for γ-ray detection. The simulation can be used for a wide variety of tests with the aim of better understanding the behaviour of SAGE. A number of aspects of electron transmission are presented here.

  14. Geant4 simulations on Compton scattering of laser photons on relativistic electrons

    SciTech Connect

    Filipescu, D.; Utsunomiya, H.; Gheorghe, I.; Glodariu, T.; Tesileanu, O.; Shima, T.; Takahisa, K.; Miyamoto, S.

    2015-02-24

    Using Geant4, a complex simulation code of the interaction between laser photons and relativistic electrons was developed. We implemented physically constrained electron beam emittance and spatial distribution parameters, and we also considered a Gaussian laser beam. The code was tested against experimental data produced at the γ-ray beam line GACKO (Gamma Collaboration Hutch of Konan University) of the synchrotron radiation facility NewSUBARU. Here we will discuss the implications of transverse misalignments of the collimation system relative to the electron beam axis.

  15. Comparison of MCNPX and GEANT4 to Predict the Contribution of Non-elastic Nuclear Interactions to Absorbed Dose in Water, PMMA and A150

    SciTech Connect

    Shtejer, K.; Arruda-Neto, J. D. T.; Rodrigues, T. E.; Schulte, R.; Wroe, A.; Menezes, M. O. de; Moralles, M.

    2008-08-11

    Proton-induced non-elastic nuclear reactions play an important role in the dose distribution of clinically used proton beams as they deposit dose of high biological effectiveness both within the primary beam path as well as outside the beam to untargeted tissues. Non-elastic nuclear reactions can be evaluated using transport codes based on the Monte Carlo method. In this work, we have utilized the Los Alamos code MCNPX and the CERN GEANT4 toolkit, which are currently the most widely used Monte Carlo programs for proton radiation transport simulations in medical physics, to study the contribution of non-elastic nuclear interactions to the absorbed dose of proton beams in the therapeutic energy range. The impact of the different theoretical models available to describe the nuclear reaction process was investigated. The contribution of secondary particles from non-elastic nuclear reactions was calculated in three materials relevant in radiotherapy applications: water, PMMA and A150. The results show that the calculated contributions of secondary particles heavier than protons to the absorbed dose differ among the nuclear reaction models. The MCNPX calculations give rise to a larger contribution of d, t, α and ³He to the total dose than the GEANT4 physics models chosen in this work.

  16. Antinucleus-Nucleus Cross Sections Implemented in Geant4

    SciTech Connect

    Uzhinsky, V.; Apostolakis, J.; Galoyan, A.; Folger, G.; Grichine, V.M.; Ivanchenko, V.N.; Wright, D.H.; /SLAC

    2012-04-26

    Cross sections of antinucleus (p̄, d̄, t̄, anti-³He, anti-⁴He) interactions with nuclei in the energy range 100 MeV/c to 1000 GeV/c per antinucleon are calculated in the Glauber approximation, which provides a good description of all known p̄A cross sections. The results were obtained using a new parameterization of the total and elastic p̄p cross sections. Simple parameterizations of the antinucleus-nucleus cross sections are proposed for use in estimating the efficiency of antinucleus detection and tracking in cosmic-ray and accelerator experiments. These parameterizations are implemented in the Geant4 toolkit.

  17. Extension of PENELOPE to protons: Simulation of nuclear reactions and benchmark with Geant4

    SciTech Connect

    Sterpin, E.; Sorriaux, J.; Vynckier, S.

    2013-11-15

    Purpose: To describe the implementation of nuclear reactions in the extension of the Monte Carlo (MC) code PENELOPE to protons (PENH) and to benchmark it against Geant4. Methods: PENH is based on mixed-simulation mechanics for both elastic and inelastic electromagnetic (EM) collisions. The adopted differential cross sections for EM elastic collisions are calculated using the eikonal approximation with the Dirac–Hartree–Fock–Slater atomic potential. Cross sections for EM inelastic collisions are computed within the relativistic Born approximation, using the Sternheimer–Liljequist model of the generalized oscillator strength. Nuclear elastic and inelastic collisions were simulated explicitly using the Scattering Analysis Interactive Dial-in (SAID) database for ¹H and ICRU 63 data for ¹²C, ¹⁴N, ¹⁶O, ³¹P, and ⁴⁰Ca. Secondary protons, alphas, and deuterons were all simulated as protons, with the energy adapted to ensure consistent range. Prompt gamma emission can also be simulated upon user request. Simulations were performed in a water phantom with nuclear interactions switched off or on, and integral depth–dose distributions were compared. Binary-cascade and precompound models were used for Geant4. Initial energies of 100 and 250 MeV were considered. For cases with no nuclear interactions simulated, additional simulations in a water phantom with tight resolution (1 mm in all directions) were performed with FLUKA. Finally, integral depth–dose distributions for a 250 MeV energy were computed with Geant4 and PENH in a homogeneous phantom with, first, ICRU striated muscle and, second, ICRU compact bone. Results: For simulations with EM collisions only, integral depth–dose distributions were within 1%/1 mm for doses higher than 10% of the Bragg-peak dose. For central-axis depth–dose and lateral profiles in a phantom with tight resolution, there are significant deviations between Geant4 and PENH (up to 60%/1 cm for depth
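The 1%/1 mm agreement quoted above is the kind of criterion a gamma-index analysis formalises. Here is a minimal 1-D sketch, with toy Gaussian depth-dose curves rather than real Bragg curves, and not the paper's actual comparison code:

```python
import numpy as np

def gamma_1d(z, d_ref, d_eval, dd=0.01, dta=0.1):
    """1-D gamma index: for each reference point, minimise over the
    evaluated curve sqrt((dose diff / (dd * d_max))^2 + (distance / dta)^2);
    a point passes when gamma <= 1. dd=0.01 and dta=0.1 cm encode a
    1%/1 mm criterion like the one quoted above."""
    d_max = d_ref.max()
    return np.array([
        np.sqrt(((d_eval - di) / (dd * d_max)) ** 2
                + ((z - zi) / dta) ** 2).min()
        for zi, di in zip(z, d_ref)
    ])

z = np.linspace(0.0, 30.0, 301)            # depth (cm) on a 1 mm grid
d_ref = np.exp(-0.5 * (z - 25.0) ** 2)     # toy depth-dose curve
d_shift = np.exp(-0.5 * (z - 25.01) ** 2)  # same curve shifted by 0.1 mm
pass_rate = (gamma_1d(z, d_ref, d_shift) <= 1.0).mean()
```

The combined metric is what lets a steep dose gradient pass on the distance criterion even where the point-wise dose difference alone would exceed 1%.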

  18. BC404 scintillators as gamma locators studied via Geant4 simulations

    NASA Astrophysics Data System (ADS)

    Cortés, M. L.; Hoischen, R.; Eisenhauer, K.; Gerl, J.; Pietralla, N.

    2014-05-01

    In many applications in industry and academia, an accurate determination of the direction from where gamma rays are emitted is either needed or desirable. Ion-beam therapy treatments, the search for orphan sources, and homeland security applications are examples of fields that can benefit from directional sensitivity to gamma radiation. Scintillation detectors are a good option for these types of applications as they have relatively low cost, are easy to handle and can be produced in a wide range of sizes. In this work a Geant4 simulation was developed to study the directional sensitivity of different BC404 scintillator geometries and arrangements. The simulation includes all the physical processes relevant for gamma detection in a scintillator. In particular, the creation and propagation of optical photons inside the scintillator was included. A simplified photomultiplier tube model was also simulated. The physical principle exploited is the angular dependence of the shape of the energy spectrum obtained from thin scintillator layers when irradiated from different angles. After an experimental confirmation of the working principle of the device and a check of the simulation, the possibilities and limitations of directional sensitivity to gamma radiation using scintillator layers were tested. For this purpose, point-like sources of typical energies expected in ion-beam therapy were used. Optimal scintillator thicknesses for different energies were determined and the setup efficiencies calculated. The use of arrays of scintillators to reconstruct the direction of incoming gamma rays was also studied. For this case, a spherical source emitting Bremsstrahlung radiation was used together with a setup consisting of scintillator layers. The capability of this setup to identify the center of the extended source was studied together with its angular resolution.

  19. BoGEMMS: the Bologna Geant4 multi-mission simulator

    NASA Astrophysics Data System (ADS)

    Bulgarelli, A.; Fioretti, V.; Malaguti, P.; Trifoglio, M.; Gianotti, F.

    2012-07-01

    BoGEMMS (Bologna Geant4 Multi-Mission Simulator) is a software project for the fast simulation of payloads on board scientific satellites, developed at INAF/IASF Bologna for prompt background evaluation. By exploiting the Geant4 set of libraries, BoGEMMS allows the user to interactively set the geometrical and physical parameters (e.g. physics list, materials and thicknesses), recording the interactions (e.g. energy deposit, position, interacting particle) in NASA FITS and CERN ROOT format output files and filtering the output as a real observation in space, to finally produce the detected background count rate and spectra. Four different types of output can be produced by BoGEMMS, capturing different aspects of the interactions. The simulator can also run in parallel jobs and store the results in a centralized server via the xrootd protocol. BoGEMMS is a multi-mission tool, designed to be applied to any high-energy mission for which shielding and instrument performance analysis is required.

  20. Assessment of a new multileaf collimator concept using GEANT4 Monte Carlo simulations.

    PubMed

    Tacke, Martin B; Szymanowski, Hanitra; Oelfke, Uwe; Schulze, Carsten; Nuss, Susanne; Wehrwein, Eugen; Leidenberger, Stefan

    2006-04-01

    The aim of the work was to investigate the dosimetric properties of a new multileaf collimator (MLC) concept with the help of Monte Carlo (MC) simulations prior to the production of a prototype. The geometrical design of the MLC was implemented in the MC code GEANT4. For the simulation of a 6 MV treatment beam, an experimentally validated phase space and a virtual spatial Gaussian-shaped model placed at the origin were used. For the simulation of the geometry in GEANT4, the jaws and the two leaf packages were implemented with the help of computer-aided design data. First, transmission values for different tungsten alloys were extracted using the simulation codes GEANT4 and BEAMnrc and compared to experimental measurements. In a second step, high-resolution simulations were performed to detect the leakage at the depth of maximum dose. The 20%-80% penumbra along the travel direction of the leaves was determined using 10 × 10 cm² fields shifted along the x- and y-axis. The simulated results were compared with measured data. The simulation of the transmission values for different tungsten alloys showed good agreement with the experimental measurements (within 2.0%). This enabled an accurate estimation of the attenuation coefficient for the various leaf materials. Simulations with varying width of the spatial Gaussian distribution showed that the leakage and the penumbra depend very much on this parameter: for instance, for widths of 2 and 4 mm, the interleaf leakage is below 0.3% and 0.75%, respectively. The results for the leakage and the penumbra (4.7 ± 0.5 mm) are in good agreement with the measurements. This study showed that GEANT4 is appropriate for the investigation of the dosimetric properties of a multileaf collimator. In particular, a quantification of the leakage, the penumbra, and the tongue-and-groove effect and an evaluation of the influence of beam parameters such as the width of the Gaussian distribution was possible.

  1. Thermal neutron response of a boron-coated GEM detector via GEANT4 Monte Carlo code.

    PubMed

    Jamil, M; Rhee, J T; Kim, H G; Ahmad, Farzana; Jeon, Y J

    2014-10-22

    In this work, we report the design configuration and the performance of a hybrid Gas Electron Multiplier (GEM) detector. In order to make the detector sensitive to thermal neutrons, the forward electrode of the GEM has been coated with enriched boron-10, which works as a neutron converter. A 5 × 5 cm² GEM configuration has been used for the thermal neutron studies. The response of the detector has been estimated using the GEANT4 MC code with two different physics lists. With the QGSP_BIC_HP physics list, the neutron detection efficiency was determined to be about 3%, while with the QGSP_BERT_HP physics list the efficiency was around 2.5%, at an incident thermal neutron energy of 25 meV. These results show that coating the GEM with a boron converter improves its efficiency for thermal neutron detection.
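Efficiencies such as the 3% quoted above come from counting detected events against incident ones, and their statistical uncertainty follows the binomial law. A quick sketch (the counts are invented round numbers, not taken from the paper):

```python
import math

def efficiency(detected, incident):
    """Detection efficiency and its binomial standard error:
    eps = k/N, sigma = sqrt(eps * (1 - eps) / N) -- the usual way a
    simulated counting efficiency is quoted. The counts passed in
    below are made-up illustrative numbers."""
    eps = detected / incident
    sigma = math.sqrt(eps * (1.0 - eps) / incident)
    return eps, sigma

eps, sigma = efficiency(3000, 100_000)
print(f"{100 * eps:.2f}% +/- {100 * sigma:.2f}%")
```

The binomial error shrinks as 1/sqrt(N), so quoting an efficiency difference of 3% versus 2.5% as significant implicitly requires enough simulated histories for sigma to be well below 0.5%.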

  2. Simulation of a Helical Channel using GEANT4

    SciTech Connect

    Elvira, V. D.; Lebrun, P.; Spentzouris, P.

    2001-02-01

    We present a simulation of a 72 m long cooling channel proposed by V. Balbekov based on the helical cooling concept developed by Ya. Derbenev. LiH wedge absorbers provide the energy loss mechanism and 201 MHz cavities are used for re-acceleration. They are placed inside a main solenoidal field to focus the beam. A helical field with an amplitude of 0.3 T and a period of 1.8 m provides momentum dispersion for emittance exchange. The simulation is performed using GEANT4. The total fractional transmission is 0.85, and the transverse, longitudinal, and 3-D cooling factors are 3.75, 2.27, and 14.61, respectively. Some version of this helical channel could eventually be used to replace the first section of the double flip channel to keep the longitudinal emittance under control and increase transmission. Although this is an interesting option, the technical challenges are still significant.

  3. Positron Production at JLab Simulated Using Geant4

    SciTech Connect

    Kossler, W. J.; Long, S. S.

    2009-09-02

    The results of a Geant4 Monte Carlo study of the production of slow positrons using a 140 MeV electron beam, which might be available at Jefferson Lab, are presented. Positrons are produced by pair production from the gamma rays generated by bremsstrahlung in the target, which also serves as the stopping medium for the positrons. Positrons that diffuse to the surface of the stopping medium are assumed to be ejected due to a negative work function. Here the target and moderator are combined into one piece. For an osmium target/moderator 3 cm long with transverse dimensions of 1 cm by 1 mm, we obtain a slow positron yield of about 8.5×10¹⁰/(s·mA). If these positrons were remoderated and re-emitted with a 23% probability, we would obtain 2×10¹⁰/(s·mA) in a micro-beam.
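
    As a quick arithmetic check of the quoted numbers, the micro-beam yield follows directly from the slow-positron yield and the assumed re-emission probability:

```python
# Values quoted in the abstract above:
slow_yield = 8.5e10        # slow positrons per (s*mA)
remoderation_prob = 0.23   # assumed re-emission probability
microbeam_yield = slow_yield * remoderation_prob  # ~1.96e10 per (s*mA)
```

    This reproduces the rounded ~2×10¹⁰/(s·mA) figure given in the abstract.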

  4. Nuclear spectroscopy with Geant4: Proton and neutron emission & radioactivity

    NASA Astrophysics Data System (ADS)

    Sarmiento, L. G.; Rudolph, D.

    2016-07-01

    With the aid of a novel combination of existing equipment - JYFLTRAP and the TASISpec decay station - it is possible to perform very clean, quantum-state-selective, high-resolution particle-γ decay spectroscopy. We intend to determine the branching ratio of the ℓ = 9 proton emission from the Iπ = 19/2-, 3174-keV isomer in the N = Z - 1 nucleus 53Co. The study aims to initiate a series of similar experiments along the proton dripline, thereby providing unique insights into "open quantum systems". The technique has been pioneered in case studies using SHIPTRAP and TASISpec at GSI. Newly available radioactive decay modes in Geant4 simulations will be used to corroborate the anticipated experimental results.

  5. Enhancement and validation of Geant4 Brachytherapy application on clinical HDR 192Ir source

    NASA Astrophysics Data System (ADS)

    Ababneh, Eshraq; Dababneh, Saed; Qatarneh, Sharif; Wadi-Ramahi, Shada

    2014-10-01

    The Geant4 Monte Carlo (MC) Brachytherapy example was adapted and enhanced, and several analysis techniques were developed. The simulation studies the isodose distribution of the total, primary and scattered doses around a Nucletron microSelectron 192Ir source. Different phantom materials were used (water, tissue and bone) and the calculation was conducted at various depths and planes. The work provides an early estimate of the number of primary events required to achieve a given uncertainty at a given distance in the otherwise CPU-intensive and time-consuming clinical MC calculation. The adaptations of the Geant4 toolkit and the enhancements introduced to the code are all validated, including the comprehensive decay of the 192Ir source, the materials used to build the geometry, the geometry itself, and the calculated scatter-to-primary dose ratio. The simulation quantitatively illustrates that the scattered dose in the bone medium is larger than in water and tissue. As the distance from the source increases, the scatter contribution to the dose becomes more significant as the primary dose decreases. The developed code can be viewed as a platform containing a detailed dose calculation model for the clinical application of HDR 192Ir brachytherapy.
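
    The "required number of primary events for a given uncertainty" estimate rests on the usual Monte Carlo scaling: the relative statistical uncertainty of a scored quantity falls as 1/√N. A minimal sketch of that scaling, with hypothetical pilot-run numbers rather than the paper's:

```python
import math

def primaries_needed(n_pilot, sigma_pilot, sigma_target):
    """Primary events needed to reach sigma_target, given a pilot run,
    assuming relative uncertainty scales as 1/sqrt(N)."""
    return math.ceil(n_pilot * (sigma_pilot / sigma_target) ** 2)

# Hypothetical: a 1e6-event pilot run gives 4% uncertainty at some voxel;
# reaching 1% requires 16x more primaries.
n = primaries_needed(1_000_000, 0.04, 0.01)
```

    Halving the target uncertainty thus quadruples the required number of events, which is why such an early estimate is valuable before launching a full clinical calculation.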

  6. Development and validation of a GEANT4 radiation transport code for CT dosimetry.

    PubMed

    Carver, D E; Kost, S D; Fernald, M J; Lewis, K G; Fraser, N D; Pickens, D R; Price, R R; Stabin, M G

    2015-04-01

    The authors have created a radiation transport code using the GEANT4 Monte Carlo toolkit to simulate pediatric patients undergoing CT examinations. The focus of this paper is to validate their simulation with real-world physical dosimetry measurements using two independent techniques. Exposure measurements were made with a standard 100-mm CT pencil ionization chamber, and absorbed doses were also measured using optically stimulated luminescent (OSL) dosimeters. Measurements were made in air, in a standard 16-cm acrylic head phantom, and in a standard 32-cm acrylic body phantom. Physical dose measurements determined from the ionization chamber in air for 100 and 120 kVp beam energies were used to derive photon-fluence calibration factors. Both ion chamber and OSL measurement results provide useful comparisons in the validation of the Monte Carlo simulations. It was found that simulated and measured CTDI values were within an overall average of 6% of each other. PMID:25706135
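
    For reference, such CTDI comparisons rest on standard definitions. The sketch below uses hypothetical pencil-chamber readings (not the paper's data) to illustrate the conventional weighted-CTDI combination and the percent-difference metric:

```python
def ctdi_w(center, periphery):
    """Weighted CTDI [mGy] from 100-mm pencil-chamber CTDI100 values,
    using the standard 1/3 center + 2/3 periphery weighting."""
    return center / 3.0 + 2.0 * periphery / 3.0

def percent_diff(simulated, measured):
    """Percent deviation of a simulated value from a measurement."""
    return 100.0 * abs(simulated - measured) / measured

# Hypothetical head-phantom readings [mGy]:
w = ctdi_w(30.0, 33.0)      # 32.0 mGy
d = percent_diff(30.1, w)   # ~5.9%, of the order reported above
```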

  7. Development and validation of a GEANT4 radiation transport code for CT dosimetry

    PubMed Central

    Carver, DE; Kost, SD; Fernald, MJ; Lewis, KG; Fraser, ND; Pickens, DR; Price, RR; Stabin, MG

    2014-01-01

    We have created a radiation transport code using the GEANT4 Monte Carlo toolkit to simulate pediatric patients undergoing CT examinations. The focus of this paper is to validate our simulation with real-world physical dosimetry measurements using two independent techniques. Exposure measurements were made with a standard 100-mm CT pencil ionization chamber, and absorbed doses were also measured using optically stimulated luminescent (OSL) dosimeters. Measurements were made in air, a standard 16-cm acrylic head phantom, and a standard 32-cm acrylic body phantom. Physical dose measurements determined from the ionization chamber in air for 100 and 120 kVp beam energies were used to derive photon-fluence calibration factors. Both ion chamber and OSL measurement results provide useful comparisons in the validation of our Monte Carlo simulations. We found that simulated and measured CTDI values were within an overall average of 6% of each other. PMID:25706135

  8. Nuclear reaction measurements on tissue-equivalent materials and GEANT4 Monte Carlo simulations for hadrontherapy

    NASA Astrophysics Data System (ADS)

    De Napoli, M.; Romano, F.; D'Urso, D.; Licciardello, T.; Agodi, C.; Candiano, G.; Cappuzzello, F.; Cirrone, G. A. P.; Cuttone, G.; Musumarra, A.; Pandola, L.; Scuderi, V.

    2014-12-01

    When a carbon beam interacts with human tissues, many secondary fragments are produced in the tumor region and the surrounding healthy tissues. Precise dose calculations in hadrontherapy therefore require Monte Carlo tools equipped with complex nuclear reaction models. To obtain realistic predictions, however, simulation codes must be validated against experimental results; the wider the dataset, the more finely the models can be tuned. Since no fragmentation data for tissue-equivalent materials at Fermi energies are available in the literature, we measured the secondary fragments produced by the interaction of a 55.6 MeV/u 12C beam with thick muscle and cortical bone targets. Three reaction models used by the Geant4 Monte Carlo code, the Binary Light Ion Cascade, the Quantum Molecular Dynamics and the Liège Intranuclear Cascade, have been benchmarked against the collected data. In this work we present the experimental results and discuss the predictive power of the above-mentioned models.

  9. Validating Geant4 Versions 7.1 and 8.3 Against 6.1 for BaBar

    SciTech Connect

    Banerjee, Swagato; Brown, David N.; Chen, Chunhui; Cote, David; Dubois-Felsmann, Gregory P.; Gaponenko, Igor; Kim, Peter C.; Lockman, William S.; Neal, Homer A.; Simi, Gabriele; Telnov, Alexandre V.; Wright, Dennis H.; /SLAC

    2011-11-08

    Since 2005 and 2006, respectively, Geant4 versions 7.1 and 8.3 have been available, providing: improvements in modeling of multiple scattering; corrections to muon ionization and improved MIP signature; widening of the core of electromagnetic shower shape profiles; newer implementation of elastic scattering for hadronic processes; detailed implementation of the Bertini cascade model for kaons and lambdas; and updated hadronic cross-sections from calorimeter beam tests. The effects of these changes in simulation are studied in terms of closer agreement of simulation using Geant4 versions 7.1 and 8.3 as compared to Geant4 version 6.1 with respect to data distributions of: the hit residuals of tracks in the BABAR silicon vertex tracker; the photon and K_L^0 shower shapes in the electromagnetic calorimeter; the ratio of energy deposited in the electromagnetic calorimeter and the flux return of the magnet instrumented with a muon detection system composed of resistive plate chambers and limited-streamer tubes; and the muon identification efficiency in the muon detector system of the BABAR detector.

  10. Geant4 Application for Simulating the Propagation of Cosmic Rays through the Earth's Magnetosphere

    NASA Astrophysics Data System (ADS)

    Desorgher, L.; Flueckiger, E.O.; Buetikofer, R.; Moser, M.R.

    2003-07-01

    We have developed a Geant4 application to simulate the propagation of cosmic rays through the Earth's magnetosphere. The application computes the motion of charged particles through advanced magnetospheric magnetic field models such as the Tsyganenko 2001 model. It allows the user to determine cosmic ray cutoff rigidities and asymptotic directions of incidence for user-defined observing positions, directions and times. By using the new generation of Tsyganenko models, we can analyse the variation of cutoff rigidities and asymptotic directions during magnetic storms as a function of the Dst index and of the solar wind dynamic pressure. The paper describes the application, in particular its visualisation potential, and presents simulation results. Acknowledgments. This work was supported by the Swiss National Science Foundation, grant 20-67092.01, and by the QINETIQ contract CU009-0000028872 in the frame of the ESA/ESTEC SEPTIMESS project.
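
    The application obtains cutoff rigidities by tracing trajectories through the Tsyganenko field models; as a far cruder point of reference, the Störmer dipole approximation gives a closed-form vertical cutoff. A sketch of that textbook formula (not the paper's method):

```python
import math

def stormer_vertical_cutoff(geomag_lat_deg):
    """Vertical cutoff rigidity [GV] at the Earth's surface in the
    Stormer dipole approximation: Rc = 14.9 * cos^4(geomagnetic latitude)."""
    return 14.9 * math.cos(math.radians(geomag_lat_deg)) ** 4

rc_equator = stormer_vertical_cutoff(0.0)   # ~14.9 GV at the geomagnetic equator
rc_mid_lat = stormer_vertical_cutoff(50.0)  # a few GV at mid latitudes
```

    The strong cos⁴ dependence is why low-latitude stations see only the hardest part of the cosmic-ray spectrum, while the trajectory-tracing approach is needed to capture storm-time deviations from this dipole picture.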

  11. Geant4-DNA simulations using complex DNA geometries generated by the DnaFabric tool

    NASA Astrophysics Data System (ADS)

    Meylan, S.; Vimont, U.; Incerti, S.; Clairand, I.; Villagrasa, C.

    2016-07-01

    Several DNA representations are used to study radio-induced complex DNA damage, depending on the approach and the required level of granularity. Among these approaches, the mechanistic one requires the most resolved DNA models, which can go down to atomistic DNA descriptions. The complexity of such DNA models makes them hard to modify and adapt in order to take different biological conditions into account. The DnaFabric project was started to provide a tool to generate, visualise and modify such complex DNA models. In the current version of DnaFabric, the models can be exported to the Geant4 code to be used as targets in the Monte Carlo simulation. In this work, the project was used to generate two DNA fibre models corresponding to two DNA compaction levels, representing heterochromatin and euchromatin. The fibres were imported into a Geant4 application where computations were performed to estimate the influence of DNA compaction on the amount of calculated DNA damage. The relative difference of the DNA damage computed in the two fibres for the same number of projectiles was found to be constant and equal to 1.3 for the considered primary particles (protons from 300 keV to 50 MeV). However, if only the tracks hitting the DNA target are taken into account, then the relative difference is larger at low energies and decreases to reach zero around 10 MeV. The computations were performed with models that contain up to 18,000 DNA nucleotide pairs. Nevertheless, DnaFabric will be extended to manipulate multi-scale models that go from the molecular to the cellular level.

  12. GEANT4 simulation of APEX background radiation and shielding

    NASA Astrophysics Data System (ADS)

    Kaluarachchi, Maduka M.; Cates, Gordon D.; Wojtsekhowski, B.

    2015-04-01

    The A' Experiment (APEX), which is approved to run in Hall A at the Thomas Jefferson National Accelerator Facility (JLab), will search for a new vector boson that is hypothesized to be a possible force carrier that couples to dark matter. APEX will be sensitive to the mass range of 65 MeV to 550 MeV, and high sensitivity will be achieved by means of a high-intensity 100 μA beam on a 0.5 g/cm² tungsten target, resulting in very high luminosity. The experiment should be able to observe the A' with a coupling constant α' ~ 1 × 10⁷ times smaller than the electromagnetic coupling constant α. To deal safely with such enormous intensity and luminosity, a full radiation analysis must be used to help with the design of proper radiation shielding. The purpose of this talk is to present preliminary results obtained by simulating the radiation background of the APEX experiment using the 3D Monte Carlo transport code Geant4. Included in the simulation is a detailed Hall A setup: the hall, spectrometers and shield house, beam dump, beam line, septum magnet with its field, as well as the production target. The results were compared to the APEX test run data and used in the development of the radiation shielding for sensitive electronics.

  13. GEANT4 Simulation of Neutron Detector for DAMPE

    NASA Astrophysics Data System (ADS)

    He, M.; Ma, T.; Chang, J.; Zhang, Y.; Huang, Y. Y.; Zang, J. J.; Wu, J.; Dong, T. K.

    2016-01-01

    In recent decades, dark matter has become a central topic in astronomical research, with related theoretical studies and experimental projects developing rapidly. China's Dark Matter Particle Explorer (DAMPE) was proposed against this background. Since the probe targets high-energy electrons, appropriate methods must be adopted to distinguish them from protons, in order to reduce the probability that other charged particles (e.g. protons) are mistaken for electrons. Experiments show that the hadronic shower of a high-energy proton in the BGO electromagnetic calorimeter, which is usually accompanied by the emission of a large number of secondary neutrons, differs significantly from the electromagnetic shower of a high-energy electron. By detecting the secondary neutron signal emerging from the bottom of the BGO electromagnetic calorimeter, together with the shower shape of the incident particle in the calorimeter, we can effectively determine whether the incident particle is a high-energy proton or an electron. This paper introduces the structure and detection principle of the DAMPE neutron detector. We use the Monte Carlo method with the GEANT4 software to simulate the signals produced by protons and electrons at characteristic energies in the neutron detector, and summarize the neutron detector's ability to distinguish protons from electrons under different electron acceptance efficiencies.

  14. Optimization of a photoneutron source based on 10 MeV electron beam using Geant4 Monte Carlo code

    NASA Astrophysics Data System (ADS)

    Askri, Boubaker

    2015-10-01

    The Geant4 Monte Carlo code has been used to design and optimize a simple and compact neutron source based on a 10 MeV electron beam impinging on a tungsten target adjoined to a beryllium target. For this purpose, a precise photonuclear reaction cross-section model taken from the International Atomic Energy Agency (IAEA) database was linked to Geant4 to accurately simulate the interaction of low-energy bremsstrahlung photons with the beryllium material. A benchmark test showed good agreement between the emitted neutron flux spectra predicted by the Geant4 and Fluka codes for a beryllium cylinder bombarded with a 5 MeV photon beam. The source optimization was achieved through a two-stage Monte Carlo simulation. In the first stage, the distributions of the seven phase-space coordinates of the bremsstrahlung photons at the boundaries of the tungsten target were determined. In the second stage, events corresponding to photons emitted according to these distributions were tracked. A neutron yield of 4.8 × 10¹⁰ neutrons/mA/s was obtained at 20 cm from the beryllium target. A thermal neutron yield of 1.5 × 10⁹ neutrons/mA/s was obtained after introducing a spherical shell of polyethylene as a neutron moderator.
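
    The two-stage scheme amounts to recording the photon phase space at the target boundary and then re-sampling it, instead of re-tracking electrons in every run. A minimal sketch of the sampling stage (illustrative code with a hypothetical four-bin energy spectrum, not the author's implementation) using inverse-CDF lookup on a recorded histogram:

```python
import bisect
import random

def make_sampler(bin_edges, counts, rng=random.random):
    """Return a function that samples values from a recorded histogram
    by inverse-CDF lookup, uniform within the chosen bin."""
    total = float(sum(counts))
    cdf, acc = [], 0.0
    for c in counts:
        acc += c / total
        cdf.append(acc)
    def sample():
        i = bisect.bisect_left(cdf, rng())   # pick a bin from the CDF
        lo, hi = bin_edges[i], bin_edges[i + 1]
        return lo + (hi - lo) * rng()        # position within the bin
    return sample

# Hypothetical bremsstrahlung spectrum, 0-10 MeV in four bins:
edges = [0.0, 2.5, 5.0, 7.5, 10.0]
sample = make_sampler(edges, [40, 30, 20, 10])
energy = sample()   # one photon energy in MeV
```

    In the real application this would be done for all seven recorded phase-space coordinates, with correlations between them preserved.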

  15. Radiation quality of cosmic ray nuclei studied with Geant4-based simulations

    NASA Astrophysics Data System (ADS)

    Burigo, Lucas N.; Pshenichnov, Igor A.; Mishustin, Igor N.; Bleicher, Marcus

    2014-04-01

    In future deep-space missions, a spacecraft will be exposed to a non-negligible flux of the high charge and energy (HZE) particles present in the galactic cosmic rays (GCR). One of the major concerns of manned missions is the impact on humans of the complex radiation fields which result from the interactions of HZE particles with the spacecraft materials. The radiation quality of several ions representative of the GCR is investigated by calculating microdosimetry spectra. A Geant4-based Monte Carlo model for Heavy Ion Therapy (MCHIT) is used to simulate microdosimetry data for HZE particles in extended media, where fragmentation reactions play a role. Our model is able to reproduce measured microdosimetry spectra for H, He, Li, C and Si in the energy range of 150-490 MeV/u. The effect of nuclear fragmentation on the relative biological effectiveness (RBE) of He, Li and C is estimated and found to be below 10%.

  16. ALGEBRA: ALgorithm for the heterogeneous dosimetry based on GEANT4 for BRAchytherapy.

    PubMed

    Afsharpour, H; Landry, G; D'Amours, M; Enger, S; Reniers, B; Poon, E; Carrier, J-F; Verhaegen, F; Beaulieu, L

    2012-06-01

    Task group 43 (TG43)-based dosimetry algorithms are efficient for brachytherapy dose calculation in water. However, human tissues have chemical compositions and densities different from water. Moreover, the mutual shielding effect of seeds on each other (interseed attenuation) is neglected in TG43-based dosimetry platforms. The scientific community has expressed the need for an accurate dosimetry platform in brachytherapy. The purpose of this paper is to present ALGEBRA, a Monte Carlo platform for dosimetry in brachytherapy which is sufficiently fast and accurate for clinical and research purposes. ALGEBRA is based on the GEANT4 Monte Carlo code and is capable of handling the DICOM RT standard to recreate a virtual model of the treated site. Here, the performance of ALGEBRA is presented for the special case of LDR brachytherapy in permanent prostate and breast seed implants. However, the algorithm is also capable of handling other treatments, such as HDR brachytherapy.

  17. Monte Carlo simulation by GEANT 4 and GESPECOR of in situ gamma-ray spectrometry measurements.

    PubMed

    Chirosca, Alecsandru; Suvaila, Rares; Sima, Octavian

    2013-11-01

    The application of the GEANT 4 and GESPECOR Monte Carlo simulation codes for the efficiency calibration of in situ gamma-ray spectrometry was studied. The long computing time required by GEANT 4 prevents its use in routine simulations. Due to the application of variance reduction techniques, GESPECOR is much faster. In this code, specific procedures for incorporating the depth profile of the activity were implemented. In addition, procedures for evaluating the effect of non-homogeneity of the source were developed. The code was validated by comparison with test simulations carried out with GEANT 4 and by comparison with published results. PMID:23566809

  18. GATE - Geant4 Application for Tomographic Emission: a simulation toolkit for PET and SPECT

    PubMed Central

    Jan, S.; Santin, G.; Strul, D.; Staelens, S.; Assié, K.; Autret, D.; Avner, S.; Barbier, R.; Bardiès, M.; Bloomfield, P. M.; Brasse, D.; Breton, V.; Bruyndonckx, P.; Buvat, I.; Chatziioannou, A. F.; Choi, Y.; Chung, Y. H.; Comtat, C.; Donnarieix, D.; Ferrer, L.; Glick, S. J.; Groiselle, C. J.; Guez, D.; Honore, P.-F.; Kerhoas-Cavata, S.; Kirov, A. S.; Kohli, V.; Koole, M.; Krieguer, M.; van der Laan, D. J.; Lamare, F.; Largeron, G.; Lartizien, C.; Lazaro, D.; Maas, M. C.; Maigne, L.; Mayet, F.; Melot, F.; Merheb, C.; Pennacchio, E.; Perez, J.; Pietrzyk, U.; Rannou, F. R.; Rey, M.; Schaart, D. R.; Schmidtlein, C. R.; Simon, L.; Song, T. Y.; Vieira, J.-M.; Visvikis, D.; Van de Walle, R.; Wieërs, E.; Morel, C.

    2012-01-01

    Monte Carlo simulation is an essential tool in emission tomography that can assist in the design of new medical imaging devices, the optimization of acquisition protocols, and the development or assessment of image reconstruction algorithms and correction techniques. GATE, the Geant4 Application for Tomographic Emission, encapsulates the Geant4 libraries to achieve a modular, versatile, scripted simulation toolkit adapted to the field of nuclear medicine. In particular, GATE allows the description of time-dependent phenomena such as source or detector movement, and source decay kinetics. This feature makes it possible to simulate time curves under realistic acquisition conditions and to test dynamic reconstruction algorithms. This paper gives a detailed description of the design and development of GATE by the OpenGATE collaboration, whose continuing objective is to improve, document, and validate GATE by simulating commercially available imaging systems for PET and SPECT. A large effort is also invested in the ability and the flexibility to model novel detection systems or systems still under design. A public release of GATE licensed under the GNU Lesser General Public License can be downloaded at the address http://www-lphe.epfl.ch/GATE/. Two benchmarks developed for PET and SPECT to test the installation of GATE and to serve as a tutorial for the users are presented. Extensive validation of the GATE simulation platform has been started, comparing simulations and measurements on commercially available acquisition systems. References to those results are listed. The future prospects toward the gridification of GATE and its extension to other domains such as dosimetry are also discussed. PMID:15552416

  19. Interaction of Fast Nucleons with Actinide Nuclei Studied with GEANT4

    NASA Astrophysics Data System (ADS)

    Malyshkin, Yu.; Pshenichnov, I.; Mishustin, I.; Greiner, W.

    2014-04-01

    We model interactions of protons and neutrons with energies from 1 to 1000 MeV with 241Am and 243Am nuclei. The calculations are performed with the Monte Carlo model for Accelerator Driven Systems (MCADS), which we developed on the basis of version 9.4 of the GEANT4 toolkit. This toolkit is widely used to simulate the propagation of particles in various materials containing nuclei up to uranium. After several extensions, we apply this toolkit also to proton- and neutron-induced reactions on Am. The fission and radiative neutron capture cross sections, neutron multiplicities and distributions of fission fragments were calculated for 241Am and 243Am and compared with experimental data. As demonstrated, the fission of americium by energetic protons with energies above 20 MeV can be well described by the Intra-Nuclear Cascade Liège (INCL) model combined with the fission-evaporation model ABLA. The calculated average numbers of fission neutrons and mass distributions of fission products agree well with the corresponding data. However, proton-induced fission below 20 MeV is described less accurately. This is attributed to the limitations of the Intra-Nuclear Cascade model at low projectile energies.

  20. Use of GEANT4 vs. MCNPX for the characterization of a boron-lined neutron detector

    NASA Astrophysics Data System (ADS)

    van der Ende, B. M.; Atanackovic, J.; Erlandson, A.; Bentoumi, G.

    2016-06-01

    This work compares GEANT4 with MCNPX in the characterization of a boron-lined neutron detector. The neutron energy range simulated in this work (0.025 eV to 20 MeV) is the traditional domain of MCNP simulations. This paper addresses the question: how well can GEANT4 and MCNPX be employed for detailed thermal neutron detector characterization? To answer this, GEANT4 and MCNPX have been employed to simulate the detector response to a 252Cf energy spectrum point source, as well as to simulate mono-energetic parallel-beam source geometries. The 252Cf energy spectrum simulation results demonstrate agreement in detector count rate within 3% between the two packages, with the MCNPX results being generally closer to experiment than those from GEANT4. The mono-energetic source simulations demonstrate agreement in detector response within 5% between the two packages for all neutron energies, and within 1% for neutron energies between 100 eV and 5 MeV. Cross-checks between the two types of simulations using ISO 8529 252Cf energy bins demonstrate that the MCNPX results are more self-consistent than the GEANT4 results, by 3-4%.

  1. The Simulation of AN Imaging Gamma-Ray Compton Backscattering Device Using GEANT4

    NASA Astrophysics Data System (ADS)

    Flechas, D.; Sarmiento, L. G.; Cristancho, F.; Fajardo, E.

    2014-02-01

    A gamma-backscattering imaging device dubbed the Compton Camera, developed at GSI (Darmstadt, Germany) and modified and studied at the Nuclear Physics Group of the National University of Colombia in Bogotá, uses the back-to-back emission of two gamma rays in positron annihilation to construct a two-dimensional image that represents the distribution of matter in the field of view of the camera. This imaging capability can be used in a host of different situations, for example, to identify and study deposition and structural defects, or to help locate concealed objects, to name just two cases. In order to increase the understanding of the response of the Compton Camera and, in particular, of its image formation process, and to assist in the data analysis, a simulation of the camera was developed using the GEANT4 simulation toolkit. In this work, the images resulting from different experimental conditions are shown. The simulated images and their comparison with the experimental ones already suggest methods to improve the present experimental device.

  2. Geant4 simulation of the solar neutron telescope at Sierra Negra, Mexico

    NASA Astrophysics Data System (ADS)

    González, L. X.; Sánchez, F.; Valdés-Galicia, J. F.

    2010-02-01

    The solar neutron telescope (SNT) at Sierra Negra (19.0°N, 97.3°W, 4580 m a.s.l.) is part of a worldwide network of similar detectors (Valdés-Galicia et al., (2004) [1]). This SNT has an area of 4 m²; it is composed of four 1 m × 1 m × 30 cm plastic scintillators (Sci). The telescope is completely surrounded by anti-coincidence proportional counters (PRCs) to separate charged particles from the neutron flux. In order to discard the photon background, it is shielded on its sides by 10-mm-thick iron plates and on its top by 5-mm lead plates. It is capable of registering four different channels corresponding to four energy-deposition thresholds: E > 30, > 60, > 90 and > 120 MeV. The arrival direction of neutrons is determined by gondolas of PRCs in electronic coincidence; four layers of these gondolas are orthogonally located underneath the SNT, two in the NS direction and two in the EW direction. We present here simulations of the detector response to neutrons, protons, electrons and gammas in the energy range from 100 to 1000 MeV. We report on the detector efficiency and on its angular resolution for particles impinging on the device at different zenith angles. The simulation code was written using the Geant4 package (Agostinelli et al., (2003) [2]), taking into account all relevant physical processes.

  3. Yields of positron and positron emitting nuclei for proton and carbon ion radiation therapy: a simulation study with GEANT4.

    PubMed

    Lau, Andy; Chen, Yong; Ahmad, Salahuddin

    2012-01-01

    A Monte Carlo application is developed with the GEANT4 Toolkit to investigate the yields of the positron-emitting nuclei (PEN) used in proton and carbon ion range verification techniques. A base physics list was constructed and used to simulate protons and carbon ions incident on a PMMA or water phantom as pencil-like beams. In each simulation, the total yields of PEN are counted, and both the PEN and their associated positron depth distributions are recorded and compared to the incident radiation's Bragg peak. Alterations to the physics list are then performed to investigate the dependence of the PEN yields on the choice of physics list. We conclude that the yields of PEN can be estimated using the physics list presented here for range verification of incident protons and carbon ions.

  4. Application of GEANT4 in the Development of New Radiation Therapy Treatment Methods

    NASA Astrophysics Data System (ADS)

    Brahme, Anders; Gudowska, Irena; Larsson, Susanne; Andreassen, Björn; Holmberg, Rickard; Svensson, Roger; Ivanchenko, Vladimir; Bagulya, Alexander; Grichine, Vladimir; Starkov, Nikolay

    2006-04-01

    There is very fast development of new radiation treatment methods today, from the advanced use of intensity-modulated photon and electron beams to light-ion therapy with treatment units based on narrow scanned beams. Accurate radiation transport calculations are a key requisite for these developments, and Geant4 is a very useful Monte Carlo code for the accurate design of new treatment units. Today we can not only image the tumor by PET-CT before the treatment, but also determine the tumor's sensitivity to radiation and even measure in vivo the delivered absorbed dose in three dimensions in the patient. With such methods, accurate Monte Carlo calculations will make radiation therapy an almost exact science, where curative doses can be calculated based on patient-individual response data. In the present study, results from the application of Geant4 are discussed and comparisons between Geant4 and experimental and other Monte Carlo data are presented.

  5. Photon energy absorption coefficients for nuclear track detectors using Geant4 Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Singh, Vishwanath P.; Medhat, M. E.; Badiger, N. M.

    2015-01-01

    Geant4 Monte Carlo code simulations were used to resolve experimental and theoretical complications in the calculation of mass energy-absorption coefficients of elements, air, and compounds. The mass energy-absorption coefficients for nuclear track detectors were computed for the first time using the Geant4 Monte Carlo code for energies from 1 keV to 20 MeV. Very good agreement between the simulated mass energy-absorption coefficients for carbon, nitrogen, silicon, sodium iodide and the nuclear track detectors and the values reported in the literature was observed. Kerma relative to air for energies of 1 keV-20 MeV, and energy-absorption buildup factors for energies of 50 keV-10 MeV up to penetration depths of 10 mfp, were also calculated for the selected nuclear track detectors to evaluate the absorption of the gamma photons. Geant4 simulation can be utilized for the estimation of mass energy-absorption coefficients in elements and composite materials.
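
    Tabulated coefficients of this kind are conventionally interpolated log-log in energy between grid points. A sketch of that interpolation with a hypothetical three-point table (illustrative values, not the simulated coefficients):

```python
import math

def loglog_interp(e, energies, values):
    """Log-log interpolation of a tabulated coefficient at energy e."""
    for i in range(len(energies) - 1):
        if energies[i] <= e <= energies[i + 1]:
            x0, x1 = math.log(energies[i]), math.log(energies[i + 1])
            y0, y1 = math.log(values[i]), math.log(values[i + 1])
            t = (math.log(e) - x0) / (x1 - x0)
            return math.exp(y0 + t * (y1 - y0))
    raise ValueError("energy outside the tabulated range")

# Hypothetical mu_en/rho grid [cm^2/g] at 0.1, 1 and 10 MeV:
mu = loglog_interp(0.3, [0.1, 1.0, 10.0], [0.025, 0.030, 0.020])
```

    Log-log interpolation is preferred because photon interaction coefficients vary over orders of magnitude and are close to power laws between grid points.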

  6. Dose conversion coefficients for ICRP110 voxel phantom in the Geant4 Monte Carlo code

    NASA Astrophysics Data System (ADS)

    Martins, M. C.; Cordeiro, T. P. V.; Silva, A. X.; Souza-Santos, D.; Queiroz-Filho, P. P.; Hunt, J. G.

    2014-02-01

    The reference adult male voxel phantom recommended by International Commission on Radiological Protection (ICRP) Publication 110 was implemented in the Geant4 Monte Carlo code. Geant4 was used to calculate the Dose Conversion Coefficients (DCCs) published in the Annals of the ICRP, expressed as dose deposited in organs per unit air kerma, for photons, electrons and neutrons. In this work, the AP and PA irradiation geometries of the ICRP male phantom were simulated for the purpose of benchmarking the Geant4 code. Monoenergetic photons were simulated between 15 keV and 10 MeV, and the results were compared with ICRP 110, the VMC Monte Carlo code and the available literature data, showing good agreement.

  7. Distributions of deposited energy and ionization clusters around ion tracks studied with Geant4 toolkit

    NASA Astrophysics Data System (ADS)

    Burigo, Lucas; Pshenichnov, Igor; Mishustin, Igor; Hilgers, Gerhard; Bleicher, Marcus

    2016-05-01

    The Geant4-based Monte Carlo model for Heavy-Ion Therapy (MCHIT) was extended to study the patterns of energy deposition at sub-micrometer distance from individual ion tracks. Dose distributions for low-energy 1H, 4He, 12C and 16O ions measured in several experiments are well described by the model in a broad range of radial distances, from 0.5 to 3000 nm. Despite the fact that such distributions are characterized by long tails, a dominant fraction of deposited energy (∼80%) is confined within a radius of about 10 nm. The probability distributions of clustered ionization events in nanoscale volumes of water traversed by 1H, 2H, 4He, 6Li, 7Li, and 12C ions are also calculated. A good agreement of calculated ionization cluster-size distributions with the corresponding experimental data suggests that the extended MCHIT can be used to characterize stochastic processes of energy deposition to sensitive cellular structures.

  8. Validation of a dose deposited by low-energy photons using GATE/GEANT4.

    PubMed

    Thiam, C O; Breton, V; Donnarieix, D; Habib, B; Maigne, L

    2008-06-01

    The GATE Monte Carlo simulation platform based on the Geant4 toolkit has become a widely used tool for simulating PET and SPECT imaging devices. In this paper, we explore its relevance for dosimetry of the low-energy 125I photon brachytherapy sources used to treat prostate cancers. To that end, three 125I sources widely used in prostate cancer brachytherapy treatment were modelled. GATE simulations reproducing reference dosimetric observables such as the radial dose function g(r), the anisotropy function F(r, theta) and the dose-rate constant (Lambda) were performed in liquid water. The calculations were split across the EGEE grid infrastructure to reduce the computing time of the simulations. The results were compared with other relevant Monte Carlo results and with the measurements adopted as recommended values by the AAPM Task Group 43. GATE results agree with the consensus values published by AAPM Task Group 43 to better than 2%, demonstrating that GATE is a relevant tool for studying the dose induced by low-energy photons.
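The quantities g(r), F(r, theta) and Lambda enter the AAPM TG-43 dose-rate formalism; a sketch of the transverse-axis, line-source form is below. The dose-rate constant and g(r) values are illustrative placeholders, not TG-43 consensus data:

```python
import math

# AAPM TG-43 dose rate on the transverse axis (theta = 90 deg), using the
# line-source geometry function:
#   Ddot(r) = Sk * Lambda * [G_L(r)/G_L(r0)] * g(r) * F(r, 90)
# Lambda and g(r) below are placeholders, not consensus values.

def geometry_factor_line(r_cm, L_cm=0.3):
    """Line-source geometry function on the transverse axis."""
    beta = 2.0 * math.atan(L_cm / (2.0 * r_cm))
    return beta / (L_cm * r_cm)

def dose_rate(sk, lam, r_cm, g_r, F=1.0, r0_cm=1.0, L_cm=0.3):
    geom = geometry_factor_line(r_cm, L_cm) / geometry_factor_line(r0_cm, L_cm)
    return sk * lam * geom * g_r * F

sk = 1.0     # air-kerma strength, U
lam = 0.965  # dose-rate constant, cGy/(h*U) -- illustrative
print(round(dose_rate(sk, lam, r_cm=2.0, g_r=0.85), 4))
```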

  9. Determination of age specific ¹³¹I S-factor values for thyroid using anthropomorphic phantom in Geant4 simulations.

    PubMed

    Rahman, Ziaur; Ahmad, Syed Bilal; Mirza, Sikander M; Arshed, Waheed; Mirza, Nasir M; Ahmed, Waheed

    2014-08-01

    Using an anthropomorphic phantom in Geant4, the β- and γ-absorbed fractions and the energy absorbed per event due to (131)I activity in the thyroid were determined for individuals of various age groups and for different geometrical models. In the case of (131)I β-particles, the absorbed fraction increased from 0.88 to 0.97 with fetus age. The maximum difference in absorbed energy per decay between soft tissue and water is 7.2% for γ-rays and 0.4% for β-particles. The new mathematical MIRD embedded in Geant4 (MEG) and two-lobe ellipsoidal models developed in this work yield S-factor values 4.3% and 2.9% lower, respectively, than the ORNL data.

  10. SU-E-J-72: Geant4 Simulations of Spot-Scanned Proton Beam Treatment Plans

    SciTech Connect

    Kanehira, T; Sutherland, K; Matsuura, T; Umegaki, K; Shirato, H

    2014-06-01

    Purpose: To evaluate density inhomogeneities which can affect dose distributions in real-time image gated spot-scanning proton therapy (RGPT), a dose calculation system based on Geant4 was developed using spot position data from the treatment planning system VQA (Hitachi Ltd., Tokyo). Methods: A Geant4 application was developed to simulate spot-scanned proton beams at Hokkaido University Hospital. A CT scan (0.98 × 0.98 × 1.25 mm) was performed for prostate cancer treatment with three or four inserted gold markers (diameter 1.5 mm, volume 1.77 mm3) in or near the target tumor. The CT data was read into VQA. A spot scanning plan was generated and exported to text files specifying the beam energy and position of each spot. The text files were converted and read into our Geant4-based software. The spot position was converted into steering magnet field strength (in Tesla) for our beam nozzle. Individual protons were tracked from the vacuum chamber, through the helium chamber, steering magnets, dose monitors, etc., in a straight, horizontal line. The patient CT data was converted into materials with variable density and placed in a parametrized volume at the isocenter. Gold fiducial markers were represented in the CT data by two adjacent voxels (volume 2.38 mm3). 600,000 proton histories were tracked for each target spot. As one beam contained about 1,000 spots, approximately 600 million histories were recorded for each beam on a blade server. Two plans were considered: a two-beam horizontally opposed plan (90 and 270 degrees) and a three-beam plan (0, 90 and 270 degrees). Results: We were able to convert spot scanning plans from VQA and simulate them with our Geant4-based code. Our system can be used to evaluate the dose reduction caused by gold markers used for RGPT. Conclusion: Our Geant4 application is able to calculate dose distributions for spot-scanned proton therapy.

  11. Estimation of photoneutron yield in linear accelerator with different collimation systems by Geant4 and MCNPX simulation codes

    NASA Astrophysics Data System (ADS)

    Kim, Yoon Sang; Khazaei, Zeinab; Ko, Junho; Afarideh, Hossein; Ghergherehchi, Mitra

    2016-04-01

    At present, the bremsstrahlung photon beams produced by linear accelerators are the most commonly employed method of radiotherapy for tumor treatments. A photoneutron source based on three different energies (6, 10 and 15 MeV) of a linac electron beam was designed by means of Geant4 and Monte Carlo N-Particle eXtended (MCNPX) simulation codes. To obtain maximum neutron yield, two arrangements for the photoneutron converter were studied: (a) without a collimator, and (b) placement of the converter after the collimator. The maximum photon intensities in tungsten were 0.73, 1.24 and 2.07 photon/e at 6, 10 and 15 MeV, respectively. There was no considerable increase in the photon fluence spectra from 6 to 15 MeV at the optimum thickness between 0.8 mm and 2 mm of tungsten. The optimum dimensions of the collimator were determined to be a length of 140 mm with an aperture of 5 mm × 70 mm for iron in a slit shape. According to the neutron yield, the best thickness obtained for the studied materials was 30 mm. The number of neutrons generated in BeO reached its maximum at 6 MeV, unlike in Be, where the highest number of neutrons was observed at 15 MeV. Statistical uncertainty in all simulations was less than 0.3% and 0.05% for MCNPX and the standard electromagnetic (EM) physics packages of Geant4, respectively. Differences among spectra in various regions are due to different cross-section and stopping power data and different simulations of the physics processes.

  12. Estimation of photoneutron yield in linear accelerator with different collimation systems by Geant4 and MCNPX simulation codes.

    PubMed

    Kim, Yoon Sang; Khazaei, Zeinab; Ko, Junho; Afarideh, Hossein; Ghergherehchi, Mitra

    2016-04-01

    At present, the bremsstrahlung photon beams produced by linear accelerators are the most commonly employed method of radiotherapy for tumor treatments. A photoneutron source based on three different energies (6, 10 and 15 MeV) of a linac electron beam was designed by means of Geant4 and Monte Carlo N-Particle eXtended (MCNPX) simulation codes. To obtain maximum neutron yield, two arrangements for the photoneutron converter were studied: (a) without a collimator, and (b) placement of the converter after the collimator. The maximum photon intensities in tungsten were 0.73, 1.24 and 2.07 photon/e at 6, 10 and 15 MeV, respectively. There was no considerable increase in the photon fluence spectra from 6 to 15 MeV at the optimum thickness between 0.8 mm and 2 mm of tungsten. The optimum dimensions of the collimator were determined to be a length of 140 mm with an aperture of 5 mm × 70 mm for iron in a slit shape. According to the neutron yield, the best thickness obtained for the studied materials was 30 mm. The number of neutrons generated in BeO reached its maximum at 6 MeV, unlike in Be, where the highest number of neutrons was observed at 15 MeV. Statistical uncertainty in all simulations was less than 0.3% and 0.05% for MCNPX and the standard electromagnetic (EM) physics packages of Geant4, respectively. Differences among spectra in various regions are due to different cross-section and stopping power data and different simulations of the physics processes.

  14. The GEANT4 toolkit capability in the hadron therapy field: simulation of a transport beam line

    NASA Astrophysics Data System (ADS)

    Cirrone, G. A. P.; Cuttone, G.; Di Rosa, F.; Raffaele, L.; Russo, G.; Guatelli, S.; Pia, M. G.

    2006-01-01

    At the Laboratori Nazionali del Sud of the Istituto Nazionale di Fisica Nucleare in Catania (Sicily, Italy), the first Italian hadron therapy facility, named CATANA (Centro di AdroTerapia ed Applicazioni Nucleari Avanzate), has been realized. Inside CATANA, 62 MeV proton beams, accelerated by a superconducting cyclotron, are used for the radiotherapeutic treatment of some types of ocular tumours. Therapy with hadron beams still represents a pioneering technique, and only a few centers worldwide can provide this advanced specialized cancer treatment. On the basis of the experience gained so far, and considering the future hadron-therapy facilities to be developed (Rinecker, Munich, Germany; Heidelberg/GSI, Darmstadt, Germany; PSI, Villigen, Switzerland; CNAO, Pavia, Italy; Centro di Adroterapia, Catania, Italy), we decided to develop a Monte Carlo application based on the GEANT4 toolkit for the design, realization and optimization of a proton-therapy beam line. Another aim of our project is to provide a general tool able to study the interactions of hadrons with human tissue and to test the analytically based treatment planning systems currently used in routine practice. All the typical elements of a hadron-therapy line, such as diffusers, range shifters, collimators and detectors, were modelled. In particular, we simulated a Markus-type ionization chamber and a Gaf Chromic film as dosimeters to reconstruct the depth (Bragg peak and Spread Out Bragg Peak) and lateral dose distributions, respectively. We validated our simulated detectors by comparing the results with the experimental data available in our facility.

  15. The local skin dose conversion coefficients of electrons, protons and alpha particles calculated using the Geant4 code.

    PubMed

    Zhang, Bintuan; Dang, Bingrong; Wang, Zhuanzi; Wei, Wei; Li, Wenjian

    2013-10-01

    The skin tissue-equivalent slab reported in the International Commission on Radiological Protection (ICRP) Publication 116 to calculate the localised skin dose conversion coefficients (LSDCCs) was adopted into the Monte Carlo transport code Geant4. The Geant4 code was then utilised for computation of LSDCCs due to a circular parallel beam of monoenergetic electrons, protons and alpha particles <10 MeV. The computed LSDCCs for both electrons and alpha particles are found to be in good agreement with the results using the MCNPX code of ICRP 116 data. The present work thus validates the LSDCC values for both electrons and alpha particles using the Geant4 code.

  16. Background simulation of the X-ray detectors using Geant4 toolkit

    NASA Astrophysics Data System (ADS)

    Sarkar, R.; Mandal, S.; Nandi, A.; Debnath, D.; Chakrabarti, S. K.; Rao, A. R.

    We have studied the background noise of X-ray detectors using the Geant4 simulation toolkit. The main source of background for X-ray detectors in low Earth orbit is the diffuse cosmic X-ray background. We have calculated the background spectrum for the CZT detector of ASTROSAT as well as for the phoswich detector of RT-2. We have also studied the importance of the veto detector in reducing Compton-induced background photons. In this simulation we have also optimized the passive shielding to minimize the detector weight while keeping the background counts within the allowed limit.

  17. Application of Geant4 in routine close geometry gamma spectroscopy for environmental samples.

    PubMed

    Dababneh, Saed; Al-Nemri, Ektimal; Sharaf, Jamal

    2014-08-01

    This work examines the utilization of Geant4 to practically achieve crucial corrections, in close geometry, for self-absorption and true coincidence summing in gamma-ray spectrometry of environmental samples, namely soil and water. After validation, different simulation options have been explored and compared. The simulation was used to correct for self-absorption effects, and to establish a summing-free efficiency curve, thus overcoming limitations and uncertainties imposed by conventional calibration standards. To be applicable in busy laboratories, simulation results were introduced into the conventional software Genie 2000 in order to be reliably used in everyday routine measurements.

  18. Wurtzite Gallium Nitride as a scintillator detector for alpha particles (a Geant4 simulation)

    NASA Astrophysics Data System (ADS)

    Taheri, A.; Sheidaiy, M.

    2015-05-01

    Gallium Nitride has become a very popular material in electronics and optoelectronics. Because of its interesting properties, it is suitable for a wide range of applications. The material also shows very good scintillation properties, which make it a possible candidate for use as a scintillation detector for charged particles. In this work, we simulated the scintillation and optical properties of gallium nitride in the presence of alpha particles using Geant4. The results show that gallium nitride can be an appropriate choice for this purpose.

  19. Integration of the low-energy particle track simulation code in Geant4

    NASA Astrophysics Data System (ADS)

    Arce, Pedro; Muñoz, Antonio; Moraleda, Montserrat; Gomez Ros, José María; Blanco, Fernando; Perez, José Manuel; García, Gustavo

    2015-08-01

    The Low-Energy Particle Track Simulation code (LEPTS) is a Monte Carlo code developed to simulate the damage caused by radiation at molecular level. The code is based on experimental data of scattering cross sections, both differential and integral, and energy loss data, complemented with theoretical calculations. It covers the interactions of electrons and positrons from energies of 10 keV down to 0.1 eV in different biologically relevant materials. In this article we briefly mention the main characteristics of this code and we present its integration within the Geant4 Monte Carlo toolkit.

  20. Simulation of positron backscattering and implantation profiles using Geant4 code

    NASA Astrophysics Data System (ADS)

    Huang, Shi-Juan; Pan, Zi-Wen; Liu, Jian-Dang; Han, Rong-Dian; Ye, Bang-Jiao

    2015-10-01

    For the proper interpretation of experimental data produced by the slow positron beam technique, the positron implantation properties are studied carefully using the latest Geant4 code. The simulated backscattering coefficients, implantation profiles, and median implantation depths for mono-energetic positrons with energies from 1 keV to 50 keV normally incident on different crystals are reported. Compared with previous experimental results, our simulated backscattering coefficients are in reasonable agreement, and we think that the accuracy may be related to the structures of the host materials in the Geant4 code. Based on the reasonable simulated backscattering coefficients, the adjustable parameters of the implantation profiles, which depend on the materials and implantation energies, are obtained. Most importantly, we calculate the positron backscattering coefficients and median implantation depths in amorphous polymers for the first time, and our simulations are in fairly good agreement with previous experimental results. Project supported by the National Natural Science Foundation of China (Grant Nos. 11175171 and 11105139).
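Slow-positron implantation profiles are commonly parameterised by a Makhovian distribution; a sketch with an illustrative shape parameter m = 2 and depth parameter z0 (the fitted values are material- and energy-dependent):

```python
import math

# Makhovian implantation profile, commonly fitted to Monte Carlo depth
# distributions: P(z) = (m * z**(m-1) / z0**m) * exp(-(z/z0)**m).
# The CDF is 1 - exp(-(z/z0)**m), so the median is z0 * ln(2)**(1/m).
# z0 = 100 nm and m = 2 below are illustrative, not fitted values.

def makhov_profile(z_nm, z0_nm, m=2.0):
    return (m * z_nm ** (m - 1) / z0_nm ** m) * math.exp(-((z_nm / z0_nm) ** m))

def median_depth(z0_nm, m=2.0):
    """Analytic median of the Makhov distribution."""
    return z0_nm * math.log(2.0) ** (1.0 / m)

z0 = 100.0  # nm, illustrative
print(round(median_depth(z0), 2))
```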

  1. Optimisation of a dual head semiconductor Compton camera using Geant4

    NASA Astrophysics Data System (ADS)

    Harkness, L. J.; Boston, A. J.; Boston, H. C.; Cooper, R. J.; Cresswell, J. R.; Grint, A. N.; Nolan, P. J.; Oxley, D. C.; Scraggs, D. P.; Beveridge, T.; Gillam, J.; Lazarus, I.

    2009-06-01

    Conventional medical gamma-ray camera systems utilise mechanical collimation to provide information on the position of an incident gamma-ray photon. Systems that use electronic collimation with Compton image reconstruction techniques have the potential to offer huge improvements in sensitivity. Position-sensitive high-purity germanium (HPGe) detector systems are being evaluated as part of a single photon emission computed tomography (SPECT) Compton camera system. Data have been acquired from the orthogonally segmented planar SmartPET detectors, operated in Compton camera mode. The minimum gamma-ray energy which can be imaged by the current system in Compton camera configuration is 244 keV, due to the 20 mm thickness of the first scatter detector, which causes large gamma-ray absorption. A simulation package for the optimisation of a new semiconductor Compton camera has been developed using the Geant4 toolkit. This paper shows results of preliminary analysis of the validated Geant4 simulation for 141 keV, a typical SPECT gamma-ray energy.
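Electronic collimation recovers the scattering angle of each event from Compton kinematics. A minimal sketch assuming ideal energy resolution and full absorption in the second detector:

```python
import math

# Compton-camera angle reconstruction from the energy deposited in the
# scatter detector:
#   cos(theta) = 1 - m_e*c^2 * (1/E_scattered - 1/E_0),
# with E_scattered = E_0 - E_deposited. Energies in keV.

MEC2 = 511.0  # electron rest energy, keV

def compton_angle_deg(e0_kev, e_dep_kev):
    e_sc = e0_kev - e_dep_kev
    cos_t = 1.0 - MEC2 * (1.0 / e_sc - 1.0 / e0_kev)
    if not -1.0 <= cos_t <= 1.0:
        raise ValueError("kinematically forbidden energy deposit")
    return math.degrees(math.acos(cos_t))

# A 141 keV (SPECT) photon depositing 30 keV in the scatter detector
print(round(compton_angle_deg(141.0, 30.0), 1))
```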

  2. A Compton camera application for the GAMOS GEANT4-based framework

    NASA Astrophysics Data System (ADS)

    Harkness, L. J.; Arce, P.; Judson, D. S.; Boston, A. J.; Boston, H. C.; Cresswell, J. R.; Dormand, J.; Jones, M.; Nolan, P. J.; Sampson, J. A.; Scraggs, D. P.; Sweeney, A.; Lazarus, I.; Simpson, J.

    2012-04-01

    Compton camera systems can be used to image sources of gamma radiation in a variety of applications such as nuclear medicine, homeland security and nuclear decommissioning. To locate gamma-ray sources, a Compton camera employs electronic collimation, utilising Compton kinematics to reconstruct the paths of gamma rays which interact within the detectors. The main benefit of this technique is the ability to accurately identify and locate sources of gamma radiation within a wide field of view, vastly improving the efficiency and specificity over existing devices. Potential advantages of this imaging technique, along with advances in detector technology, have brought about a rapidly expanding area of research into the optimisation of Compton camera systems, which relies on significant input from Monte-Carlo simulations. In this paper, the functionality of a Compton camera application that has been integrated into GAMOS, the GEANT4-based Architecture for Medicine-Oriented Simulations, is described. The application simplifies the use of GEANT4 for Monte-Carlo investigations by employing a script based language and plug-in technology. To demonstrate the use of the Compton camera application, simulated data have been generated using the GAMOS application and acquired through experiment for a preliminary validation, using a Compton camera configured with double sided high purity germanium strip detectors. Energy spectra and reconstructed images for the data sets are presented.

  3. Efficient voxel navigation for proton therapy dose calculation in TOPAS and Geant4

    PubMed Central

    Schümann, J.; Paganetti, H.; Shin, J.; Faddegon, B.; Perl, J.

    2012-01-01

    A key task within all Monte Carlo particle transport codes is navigation: the calculation, at each particle step, of which volume the particle may be leaving and which volume it may be entering. Navigation should be optimized to the specific geometry at hand. For patient dose calculation, this geometry generally involves voxelized computed tomography (CT) data. We investigated the efficiency of navigation algorithms on the voxel geometry parameterizations currently available in the Monte Carlo simulation package Geant4: G4VPVParameterisation, G4VNestedParameterisation and G4PhantomParameterisation, the latter with and without boundary skipping, a method where neighboring voxels with the same Hounsfield Unit are combined into one larger voxel. A fourth parameterization approach (MGHParameterization), developed in-house before the latter two parameterizations became available in Geant4, was also included in this study. All simulations were performed using TOPAS, a TOol for PArticle Simulations layered on top of Geant4. Runtime comparisons were performed on three distinct patient CT data sets: a head and neck, a liver, and a prostate patient. We included an additional version of these three patients in which all voxels, including the air voxels outside of the patient, were uniformly set to water. The G4VPVParameterisation offers two optimization options: one is 60-150 times slower in simulation speed; the other is comparable in speed but requires 15-19 times more memory than the other parameterizations. We found the average CPU time used for the simulation relative to G4VNestedParameterisation to be 1.014 for G4PhantomParameterisation without boundary skipping and 1.015 for MGHParameterization. The average run-time ratio for G4PhantomParameterisation with and without boundary skipping for our heterogeneous data was 0.97:1. The calculated dose distributions agreed with the reference distribution for all but the G4Phantom
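Boundary skipping, as described above, merges neighboring voxels that share the same Hounsfield Unit so the navigator crosses fewer boundaries. A 1-D sketch of the merge as a run-length encoding (the actual Geant4 implementation works on the parameterised 3-D phantom):

```python
# Merge consecutive voxels in a row that share the same Hounsfield Unit
# into one larger voxel, so a straight-through particle needs fewer
# boundary crossings. This is a 1-D illustration only.

def merge_equal_voxels(hu_row, voxel_size_mm):
    """Return (hu, merged_length_mm) runs for a row of equal-sized voxels."""
    runs = []
    for hu in hu_row:
        if runs and runs[-1][0] == hu:
            runs[-1] = (hu, runs[-1][1] + voxel_size_mm)
        else:
            runs.append((hu, voxel_size_mm))
    return runs

# Soft tissue (0, 40 HU) next to air (-1000 HU); 9 voxels become 3 runs
row = [0, 0, 0, 40, 40, -1000, -1000, -1000, -1000]
print(merge_equal_voxels(row, 1.0))
```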

  4. Calculation of extrapolation curves in the 4π(LS)β-γ coincidence technique with the Monte Carlo code Geant4.

    PubMed

    Bobin, C; Thiam, C; Bouchard, J

    2016-03-01

    At LNE-LNHB, a liquid scintillation (LS) detection setup designed for Triple to Double Coincidence Ratio (TDCR) measurements is also used in the β-channel of a 4π(LS)β-γ coincidence system. This LS counter, based on 3 photomultipliers, was first modeled using the Monte Carlo code Geant4 to enable the simulation of optical photons produced by scintillation and Cerenkov effects. This stochastic modeling was especially designed for the calculation of double and triple coincidences between photomultipliers in TDCR measurements. In the present paper, the TDCR-Geant4 model is extended to 4π(LS)β-γ coincidence counting by the addition of a γ-channel, to enable simulation of the efficiency-extrapolation technique. This simulation tool aims at predicting systematic biases in activity determination due to possible non-linearity of efficiency-extrapolation curves. First results are described for the standardization of (59)Fe. The variation of the γ-efficiency in the β-channel due to Cerenkov emission is investigated for the activity measurement of (54)Mn. The problem of non-linearity between β-efficiencies is illustrated for the efficiency-tracing technique in the activity measurement of (14)C using (60)Co as a tracer.
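In the efficiency-extrapolation technique, the quantity Nbeta*Ngamma/Nc is plotted against (1 - eff_beta)/eff_beta and extrapolated linearly to eff_beta -> 1; the intercept estimates the activity. A minimal sketch on synthetic, bias-free counts (the paper's point is precisely to predict when this line is not straight):

```python
# 4pi(LS)beta-gamma efficiency extrapolation: with eff_beta = Nc/Ngamma,
# fit Nbeta*Ngamma/Nc versus x = (1 - eff_beta)/eff_beta with a straight
# line; the intercept at x = 0 estimates the activity N0.

def extrapolate_activity(n_beta, n_gamma, n_coinc):
    xs, ys = [], []
    for nb, ng, nc in zip(n_beta, n_gamma, n_coinc):
        eff_b = nc / ng
        xs.append((1.0 - eff_b) / eff_b)
        ys.append(nb * ng / nc)
    # least-squares straight line; the intercept is the activity estimate
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return (sy - slope * sx) / n

# Synthetic, bias-free counts for a true activity of 1000 s^-1
true_n0, eff_g = 1000.0, 0.08
data = []
for eff_b in (0.80, 0.85, 0.90, 0.95):
    data.append((true_n0 * eff_b, true_n0 * eff_g, true_n0 * eff_b * eff_g))
n_beta, n_gamma, n_coinc = zip(*data)
print(round(extrapolate_activity(n_beta, n_gamma, n_coinc), 1))
```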

  5. Signal pulse emulation for scintillation detectors using Geant4 Monte Carlo with light tracking simulation

    NASA Astrophysics Data System (ADS)

    Ogawara, R.; Ishikawa, M.

    2016-07-01

    The anode pulse of a photomultiplier tube (PMT) coupled to a scintillator is used for pulse shape discrimination (PSD) analysis. We have developed a novel emulation technique for the PMT anode pulse based on optical photon transport and a PMT response function. The photon transport was calculated using the Geant4 Monte Carlo code, and the response function was obtained with a BC408 organic scintillator. With suitable scintillation properties, the percentage RMS differences between the measured and simulated pulses were 2.41%, 2.58% and 2.16% for GSO:Ce (0.4, 1.0 and 1.5 mol%), 2.01% for LaBr3:Ce and 3.32% for BGO. The proposed technique demonstrates high reproducibility of the measured pulse and can be applied to simulation studies of various radiation measurements.
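The figure of merit quoted above is a percentage RMS difference between measured and emulated pulses; the exact normalisation is not given in the abstract, so the measured-pulse-normalised form below is an assumption:

```python
# Percentage RMS difference between a measured and an emulated pulse,
# normalised to the measured pulse (this normalisation is an assumption,
# not taken from the paper). Pulses are sampled on the same time grid.

def percent_rms_difference(measured, simulated):
    if len(measured) != len(simulated):
        raise ValueError("pulses must be sampled on the same grid")
    num = sum((m - s) ** 2 for m, s in zip(measured, simulated))
    den = sum(m * m for m in measured)
    return 100.0 * (num / den) ** 0.5

# Toy pulses: the simulated one is the measured pulse, slightly distorted
measured = [0.0, 0.2, 0.8, 1.0, 0.7, 0.4, 0.2, 0.1]
simulated = [0.0, 0.21, 0.78, 1.0, 0.71, 0.39, 0.2, 0.1]
print(round(percent_rms_difference(measured, simulated), 2))
```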

  6. Signal pulse emulation for scintillation detectors using Geant4 Monte Carlo with light tracking simulation.

    PubMed

    Ogawara, R; Ishikawa, M

    2016-07-01

    The anode pulse of a photomultiplier tube (PMT) coupled to a scintillator is used for pulse shape discrimination (PSD) analysis. We have developed a novel emulation technique for the PMT anode pulse based on optical photon transport and a PMT response function. The photon transport was calculated using the Geant4 Monte Carlo code, and the response function was obtained with a BC408 organic scintillator. With suitable scintillation properties, the percentage RMS differences between the measured and simulated pulses were 2.41%, 2.58% and 2.16% for GSO:Ce (0.4, 1.0 and 1.5 mol%), 2.01% for LaBr3:Ce and 3.32% for BGO. The proposed technique demonstrates high reproducibility of the measured pulse and can be applied to simulation studies of various radiation measurements.

  7. Comparison of MCNPX and Geant4 proton energy deposition predictions for clinical use

    PubMed Central

    Titt, U.; Bednarz, B.; Paganetti, H.

    2012-01-01

    Several different Monte Carlo codes are currently used at proton therapy centers to improve dose predictions over standard methods based on analytical or semi-empirical dose algorithms. There is a need to better ascertain the differences between proton dose predictions from the available Monte Carlo codes. In this investigation Geant4 and MCNPX, the two most-utilized Monte Carlo codes for proton therapy applications, were used to predict energy deposition distributions in a variety of geometries: simple water phantoms, water phantoms with complex inserts, and a voxelized geometry based on clinical CT data. A gamma analysis was used to evaluate the differences between the predictions of the two codes. The results show that in all cases the agreement was better than clinical acceptance criteria.
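The gamma analysis combines a dose-difference and a distance-to-agreement criterion; a point passes if its gamma value is at most 1. A minimal 1-D sketch with the common 3%/3 mm criteria (the clinical analysis is 3-D):

```python
import math

# 1-D gamma analysis: for each reference point, gamma is the minimum over
# evaluated points of sqrt((dist/DTA)^2 + (dDose/DD)^2), with the dose
# criterion taken relative to the reference maximum (global normalisation).

def gamma_index(ref, evaluated, spacing_mm, dd_frac=0.03, dta_mm=3.0):
    d_max = max(ref)
    gammas = []
    for i, dr in enumerate(ref):
        best = float("inf")
        for j, de in enumerate(evaluated):
            dist = abs(i - j) * spacing_mm
            ddose = (de - dr) / (dd_frac * d_max)
            best = min(best, math.hypot(dist / dta_mm, ddose))
        gammas.append(best)
    return gammas

# Toy profiles on a 1 mm grid; small perturbations should pass 3%/3 mm
ref = [0.1, 0.5, 1.0, 0.5, 0.1]
ev = [0.1, 0.52, 0.99, 0.48, 0.1]
print(all(g <= 1.0 for g in gamma_index(ref, ev, spacing_mm=1.0)))
```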

  8. GEANT4 Application for the Simulation of the Head of a Siemens Primus Linac

    SciTech Connect

    Cortes-Giraldo, M. A.; Quesada, J. M.; Gallardo, M. I.

    2010-04-26

    The Monte Carlo simulation of the head of a Siemens Primus Linac used at Virgen Macarena Hospital (Sevilla, Spain) has been performed using the code GEANT4, version 9.2. In this work, the main features of the application built by our group are presented, focusing on optimization of the simulation performance. The geometry, including the water phantom, has been entirely wrapped by a shielding volume which discards all particles escaping far away through its walls; this saves a factor of four in simulation time. An interface to read and write phase-space files in IAEA format has also been developed to save CPU time in our simulations. Finally, some calculations of the dose absorbed in the water phantom have been done and compared with the results given by EGSnrc and with experimental data obtained for the calibration of the machine.

  9. Geant4 Predictions of Energy Spectra in Typical Space Radiation Environment

    NASA Technical Reports Server (NTRS)

    Sabra, M. S.; Barghouty, A. F.

    2014-01-01

    Accurate knowledge of energy spectra inside spacecraft is important for protecting astronauts as well as sensitive electronics from the harmful effects of space radiation. Such knowledge allows one to confidently map the radiation environment inside the vehicle. The purpose of this talk is to present preliminary calculations of energy spectra inside a spherical shell shield and behind a slab in a typical space radiation environment, using the 3D Monte Carlo transport code Geant4. We have simulated proton and iron isotropic sources and beams impinging on aluminum and gallium arsenide (GaAs) targets at energies of 0.2, 0.6, 1, and 10 GeV/u. If time permits, other radiation sources and beams (alpha, C, O) and targets (C, Si, Ge, water) will be presented. The results are compared to ground-based measurements where available.

  10. Geant4 calculations for space radiation shielding material Al2O3

    NASA Astrophysics Data System (ADS)

    Capali, Veli; Acar Yesil, Tolga; Kaya, Gokhan; Kaplan, Abdullah; Yavuz, Mustafa; Tilki, Tahir

    2015-07-01

    Aluminium Oxide, Al2O3, is among the most widely used materials in engineering applications. It is a significant aluminium compound, valued for its hardness and, owing to its high melting point, as a refractory material. It has engineering applications in diverse fields such as ballistic armour systems, wear components, electrical and electronic substrates, automotive parts, components for the electrical industry and aero-engines. It is also used as a dosimeter for radiation protection and therapy applications because of its optically stimulated luminescence properties. In this study, stopping powers and penetrating distances have been calculated for alpha particles, protons, electrons and gamma rays in the space radiation shielding material Al2O3 for incident energies of 1 keV-1 GeV using the GEANT4 calculation code.
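Penetrating distances follow from stopping powers through the continuous-slowing-down approximation, R(E0) = integral from 0 to E0 of dE / S(E). A sketch with a toy power-law stopping power, not Geant4 output:

```python
# CSDA penetration distance from a stopping-power curve:
#   R(E0) = integral_0^E0 dE / S(E)
# integrated by the trapezoidal rule from a small cutoff. S(E) here is a
# toy power law in MeV/cm, not a Geant4 or tabulated value.

def csda_range(e0_mev, stopping_power, n_steps=10000):
    e_min = e0_mev * 1e-4  # avoid E = 0 where S may diverge or vanish
    h = (e0_mev - e_min) / n_steps
    total = 0.0
    for i in range(n_steps + 1):
        e = e_min + i * h
        w = 0.5 if i in (0, n_steps) else 1.0
        total += w / stopping_power(e)
    return total * h  # cm

def toy_s(e_mev):
    return 100.0 * e_mev ** -0.8  # MeV/cm, decreasing with energy

print(round(csda_range(10.0, toy_s), 3))
```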

  11. Efficiency transfer using the GEANT4 code of CERN for HPGe gamma spectrometry.

    PubMed

    Chagren, S; Ben Tekaya, M; Reguigui, N; Gharbi, F

    2016-01-01

    In this work we apply the GEANT4 code of CERN to calculate the peak efficiency in High Purity Germanium (HPGe) gamma spectrometry using three different procedures. The first is a direct calculation. The second corresponds to the usual case of efficiency transfer between two different configurations at a constant emission energy, assuming a reference point-source detection configuration. The third, a new procedure, consists of transferring the peak efficiency between two detection configurations emitting the gamma ray at different energies, assuming a "virtual" reference point detection configuration. No pre-optimization of the detector's geometrical characteristics was performed before the transfer, in order to test the ability of the efficiency transfer to reduce the effect of uncertainty in their real magnitude on the quality of the transferred efficiency. The calculated and measured efficiencies were found to be in good agreement for the two investigated efficiency-transfer methods. This agreement shows that the Monte Carlo method, and especially the GEANT4 code, constitutes an efficient tool for obtaining accurate detection efficiency values. The second efficiency-transfer procedure is useful for calibrating an HPGe gamma detector at any emission energy for a voluminous source, using a point-source detection efficiency at a different energy as the reference efficiency. The calculations performed in this work were applied to the measurement exercise of the EUROMET428 project, in which full-energy-peak efficiencies were evaluated in the energy range 60-2000 keV for a typical coaxial p-type HPGe detector and several source configurations: point sources located at various distances from the detector and a cylindrical box containing three matrices.
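The transfer procedures above scale a measured reference efficiency by a ratio of simulated efficiencies; the numbers below are hypothetical:

```python
# Efficiency transfer: a measured reference full-energy-peak efficiency
# is scaled by the ratio of Monte Carlo efficiencies computed for the
# target and reference configurations:
#   eff_target = eff_ref_measured * (eff_target_MC / eff_ref_MC)
# All efficiency values below are hypothetical.

def transfer_efficiency(eff_ref_meas, eff_ref_mc, eff_target_mc):
    return eff_ref_meas * (eff_target_mc / eff_ref_mc)

# Hypothetical: point source at 10 cm (reference) -> voluminous source
eff_ref_meas = 2.10e-3   # measured, reference configuration
eff_ref_mc = 2.04e-3     # simulated, same reference configuration
eff_target_mc = 8.60e-3  # simulated, voluminous-source configuration
print(round(transfer_efficiency(eff_ref_meas, eff_ref_mc, eff_target_mc), 6))
```

The point of the ratio is that geometry mis-specifications common to both simulations largely cancel, which is why the authors could skip pre-optimizing the detector model.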

  13. A Geant4 simulation of the depth dose percentage in brain tumors treatments using protons and carbon ions

    NASA Astrophysics Data System (ADS)

    José A. Diaz, M.; Torres, D. A.

    2016-07-01

    The deposited energy and dose distributions of proton and carbon-ion beams in a human head are simulated using the free Geant4 toolkit and the ROOT C++ data-analysis package. The present work shows a methodology for understanding the microscopic processes occurring in a hadron-therapy session using advanced simulation tools.

  14. Sensitivity analysis for liver iron measurement through neutron stimulated emission computed tomography: a Monte Carlo study in GEANT4.

    PubMed

    Agasthya, G A; Harrawood, B C; Shah, J P; Kapadia, A J

    2012-01-01

    Neutron stimulated emission computed tomography (NSECT) is being developed as a non-invasive imaging modality to detect and quantify iron overload in the human liver. NSECT uses gamma photons emitted by the inelastic interaction between monochromatic fast neutrons and iron nuclei in the body to detect and quantify the disease. Previous simulated and physical experiments with phantoms have shown that NSECT has the potential to accurately diagnose iron overload with reasonable levels of radiation dose. In this work, we describe the results of a simulation study conducted to determine the sensitivity of the NSECT system for hepatic iron quantification in patients of different sizes. A GEANT4 simulation of the NSECT system was developed with a human liver and two torso sizes corresponding to small and large patients. The iron concentration in the liver ranged between 0.5 and 20 mg g(-1), corresponding to clinically reported iron levels in iron-overloaded patients. High-purity germanium gamma detectors were simulated to detect the emitted gamma spectra, which were background corrected using suitable water phantoms and analyzed to determine the minimum detectable level (MDL) of iron and the sensitivity of the NSECT system. These analyses indicate that for a small patient (torso major axis = 30 cm) the MDL is 0.5 mg g(-1) and sensitivity is ∼13 ± 2 Fe counts/mg/mSv and for a large patient (torso major axis = 40 cm) the values are 1 mg g(-1) and ∼5 ± 1 Fe counts/mg/mSv, respectively. The results demonstrate that the MDL for both patient sizes lies within the clinically significant range for human iron overload.
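    A minimum-detectable-level analysis like the one above is commonly grounded in the counting statistics of the background-corrected spectrum. One standard approach is the Currie detection limit; the calibration numbers below are illustrative, not taken from the paper:

```python
import math

def currie_detection_limit(background_counts):
    """Currie detection limit L_D (95% confidence) in net counts for a
    counting measurement with a paired background estimate."""
    return 2.71 + 4.65 * math.sqrt(background_counts)

def min_detectable_iron(background_counts, counts_per_mg):
    """Minimum detectable iron mass given a net-count response per mg
    of iron (illustrative calibration factor)."""
    return currie_detection_limit(background_counts) / counts_per_mg

# Illustrative values: 400 background counts under the iron line,
# 60 net counts per mg of iron.
mdl_mg = min_detectable_iron(400.0, 60.0)
print(f"{mdl_mg:.2f} mg")
```

The MDL worsens for larger patients because attenuation lowers the counts-per-mg response while the background stays comparable, matching the trend reported above.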

  15. Sensitivity analysis for liver iron measurement through neutron stimulated emission computed tomography: a Monte Carlo study in GEANT4

    NASA Astrophysics Data System (ADS)

    Agasthya, G. A.; Harrawood, B. C.; Shah, J. P.; Kapadia, A. J.

    2012-01-01

    Neutron stimulated emission computed tomography (NSECT) is being developed as a non-invasive imaging modality to detect and quantify iron overload in the human liver. NSECT uses gamma photons emitted by the inelastic interaction between monochromatic fast neutrons and iron nuclei in the body to detect and quantify the disease. Previous simulated and physical experiments with phantoms have shown that NSECT has the potential to accurately diagnose iron overload with reasonable levels of radiation dose. In this work, we describe the results of a simulation study conducted to determine the sensitivity of the NSECT system for hepatic iron quantification in patients of different sizes. A GEANT4 simulation of the NSECT system was developed with a human liver and two torso sizes corresponding to small and large patients. The iron concentration in the liver ranged between 0.5 and 20 mg g-1 (all iron concentrations in mg g-1 refer to wet-weight concentrations), corresponding to clinically reported iron levels in iron-overloaded patients. High-purity germanium gamma detectors were simulated to detect the emitted gamma spectra, which were background corrected using suitable water phantoms and analyzed to determine the minimum detectable level (MDL) of iron and the sensitivity of the NSECT system. These analyses indicate that for a small patient (torso major axis = 30 cm) the MDL is 0.5 mg g-1 and sensitivity is ∼13 ± 2 Fe counts/mg/mSv and for a large patient (torso major axis = 40 cm) the values are 1 mg g-1 and ∼5 ± 1 Fe counts/mg/mSv, respectively. The results demonstrate that the MDL for both patient sizes lies within the clinically significant range for human iron overload.

  16. Feasibility of using Geant4 Monte Carlo simulation for IMRT dose calculations for the Novalis Tx with a HD-120 multi-leaf collimator

    NASA Astrophysics Data System (ADS)

    Jung, Hyunuk; Shin, Jungsuk; Chung, Kwangzoo; Han, Youngyih; Kim, Jinsung; Choi, Doo Ho

    2015-05-01

    The aim of this study was to develop an independent dose-verification system using a Monte Carlo (MC) calculation method for intensity-modulated radiation therapy (IMRT) conducted with a Varian Novalis Tx (Varian Medical Systems, Palo Alto, CA, USA) equipped with a high-definition multi-leaf collimator (HD-120 MLC). The Geant4 framework was used to implement a dose-calculation system that accurately predicted the delivered dose. For this purpose, the Novalis Tx Linac head was modeled according to specifications acquired from the manufacturer. Subsequently, MC simulations were performed by varying the mean energy, energy spread, and electron spot radius to determine optimal values for irradiation with 6-MV X-ray beams using the Novalis Tx system. Computed percentage depth-dose curves (PDDs) and lateral profiles were compared to measurements obtained with an ionization chamber (CC13). To validate the IMRT simulation using the MC model we developed, we calculated a simple IMRT field and compared the result with EBT3 film measurements in a water-equivalent solid phantom. Clinical cases, such as prostate cancer treatment plans, were then selected, and MC simulations were performed. The accuracy of the simulation was assessed against the EBT3 film measurements using a gamma-index criterion. The optimal MC model parameters to specify the beam characteristics were a 6.8-MeV mean energy, a 0.5-MeV energy spread, and a 3-mm electron spot radius. The accuracy of these parameters was determined by comparison of MC simulations with measurements. The PDDs and the lateral profiles of the MC simulation deviated from the measurements by 1% and 2%, respectively, on average. The computed simple MLC fields agreed with the EBT3 measurements with a 95% passing rate under a 3%/3-mm gamma-index criterion. Additionally, in applying our model to clinical IMRT plans, we found that the MC calculations and the EBT3 measurements agreed well with a passing rate of greater
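    The gamma-index test used above combines a dose-difference criterion with a distance-to-agreement criterion. A minimal 1D sketch (3%/3 mm, global normalization; the profiles are illustrative, not the paper's data):

```python
import math

def gamma_index_1d(ref, evl, spacing_mm, dose_crit=0.03, dist_mm=3.0):
    """1D global gamma index: for each reference point, minimize the
    combined dose-difference / distance metric over evaluated points.
    A point passes when its gamma value is <= 1."""
    d_max = max(ref)  # global normalization dose
    gammas = []
    for i, dr in enumerate(ref):
        best = float("inf")
        for j, de in enumerate(evl):
            dd = (de - dr) / (dose_crit * d_max)   # dose term
            dx = (j - i) * spacing_mm / dist_mm    # distance term
            best = min(best, math.hypot(dd, dx))
        gammas.append(best)
    return gammas

# Illustrative reference and evaluated dose profiles, 1 mm spacing
ref = [1.0, 0.98, 0.95, 0.60, 0.20]
evl = [1.01, 0.97, 0.94, 0.62, 0.21]
g = gamma_index_1d(ref, evl, spacing_mm=1.0)
passing = sum(x <= 1.0 for x in g) / len(g)
print(passing)
```

Clinical comparisons use the same metric on 2D film planes or 3D dose grids; only the neighborhood search grows.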

  17. Benchmarking and validation of a Geant4-SHADOW Monte Carlo simulation for dose calculations in microbeam radiation therapy.

    PubMed

    Cornelius, Iwan; Guatelli, Susanna; Fournier, Pauline; Crosbie, Jeffrey C; Sanchez Del Rio, Manuel; Bräuer-Krisch, Elke; Rosenfeld, Anatoly; Lerch, Michael

    2014-05-01

    Microbeam radiation therapy (MRT) is a synchrotron-based radiotherapy modality that uses high-intensity beams of spatially fractionated radiation to treat tumours. The rapid evolution of MRT towards clinical trials demands accurate treatment planning systems (TPS), as well as independent tools for the verification of TPS calculated dose distributions in order to ensure patient safety and treatment efficacy. Monte Carlo computer simulation represents the most accurate method of dose calculation in patient geometries and is best suited for the purpose of TPS verification. A Monte Carlo model of the ID17 biomedical beamline at the European Synchrotron Radiation Facility has been developed, including recent modifications, using the Geant4 Monte Carlo toolkit interfaced with the SHADOW X-ray optics and ray-tracing libraries. The code was benchmarked by simulating dose profiles in water-equivalent phantoms subject to irradiation by broad-beam (without spatial fractionation) and microbeam (with spatial fractionation) fields, and comparing against those calculated with a previous model of the beamline developed using the PENELOPE code. Validation against additional experimental dose profiles in water-equivalent phantoms subject to broad-beam irradiation was also performed. Good agreement between codes was observed, with the exception of out-of-field doses and toward the field edge for larger field sizes. Microbeam results showed good agreement between both codes and experimental results within uncertainties. Results of the experimental validation showed agreement for different beamline configurations. The asymmetry in the out-of-field dose profiles due to polarization effects was also investigated, yielding important information for the treatment planning process in MRT. This work represents an important step in the development of a Monte Carlo-based independent verification tool for treatment planning in MRT.

  18. Geant4 simulation of the CERN-EU high-energy reference field (CERF) facility.

    PubMed

    Prokopovich, D A; Reinhard, M I; Cornelius, I M; Rosenfeld, A B

    2010-09-01

    The CERN-EU high-energy reference field facility is used for testing and calibrating both active and passive radiation dosemeters for radiation protection applications in space and aviation. Through a combination of a primary particle beam, a target, and a suitably designed shielding configuration, the facility is able to reproduce the neutron component of the high-altitude radiation field relevant to the jet aviation industry. Simulations of the facility using the GEANT4 (GEometry ANd Tracking) toolkit provide an improved understanding of the neutron particle fluence, as well as the particle fluence of the other radiation components present. The secondary particle fluence as a function of the primary particle fluence incident on the target, and the associated dose-equivalent rates, were determined at the 20 designated irradiation positions available at the facility. Comparisons of the simulated results are made with previously published simulations obtained using the FLUKA Monte Carlo code, as well as with experimental results of the neutron fluence obtained with a Bonner sphere spectrometer.
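    Dose-equivalent rates like those determined above are conventionally obtained by folding the simulated fluence spectrum with fluence-to-dose-equivalent conversion coefficients. A sketch of the folding step; the three-bin spectrum and coefficients are illustrative placeholders, not tabulated ICRP values:

```python
def dose_equivalent_rate(fluence_rates, conv_coeffs):
    """Fold binned fluence rates (cm^-2 s^-1) with fluence-to-dose-
    equivalent conversion coefficients (pSv cm^2); returns pSv/s."""
    return sum(f * h for f, h in zip(fluence_rates, conv_coeffs))

# Illustrative 3-bin neutron spectrum: thermal / MeV-region / high-energy
fluence = [5.0, 2.0, 0.5]        # cm^-2 s^-1 per bin
coeffs  = [10.0, 400.0, 300.0]   # pSv cm^2 per bin (placeholders)
print(dose_equivalent_rate(fluence, coeffs))
```

In practice the binning is much finer and the coefficients are interpolated from published tables; the sum above is the whole idea.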

  19. GEANT4 SIMULATIONS OF GAMMA-RAY EMISSION FROM ACCELERATED PARTICLES IN SOLAR FLARES

    SciTech Connect

    Tang Shichao; Smith, David M.

    2010-10-01

    Gamma-ray spectroscopy provides diagnostics of particle acceleration in solar flares, but care must be taken when interpreting the spectra due to effects of the angular distribution of the accelerated particles (such as relativistic beaming) and Compton reprocessing of the radiation in the solar atmosphere. In this paper, we use the GEANT4 Monte Carlo package to simulate the interactions of accelerated electrons and protons and study the effects of these interactions on the gamma rays resulting from electron bremsstrahlung and pion decay. We consider the ratio of the 511 keV annihilation-line flux to the continuum at 200 keV and in the energy band just above the nuclear de-excitation lines (8-15 MeV) as a diagnostic of the accelerated particles and a point of comparison with data from the X17 flare of 2003 October 28. We also find that pion secondaries from accelerated protons produce a positron annihilation line component at a depth of ~10 g cm^-2 and that the subsequent Compton scattering of the 511 keV photons produces a continuum that can mimic the spectrum expected from the 3γ decay of orthopositronium.

  20. Study on gamma response function of EJ301 organic liquid scintillator with GEANT4 and FLUKA

    NASA Astrophysics Data System (ADS)

    Zhang, Su-Ya-La-Tu; Chen, Zhi-Qiang; Han, Rui; Liu, Xing-Quan; Wada, R.; Lin, Wei-Ping; Jin, Zeng-Xue; Xi, Yin-Yin; Liu, Jian-Li; Shi, Fu-Dong

    2013-12-01

    The gamma response function is required for the energy calibration of an EJ301 organic liquid scintillator detector (5 cm in diameter and 20 cm in height) by means of gamma sources. The GEANT4 and FLUKA Monte Carlo simulation packages were used to simulate the response function of the detector for standard 22Na, 60Co, and 137Cs gamma sources. The simulated results showed good agreement with the experimental data once the energy resolution function was incorporated into the simulation codes. The energy resolution and the position of the maximum Compton electron energy were obtained by comparing the measured light-output distribution with the simulated one. The energy resolution of the detector varied from 21.2% to 12.4% for electrons in the energy region from 0.341 MeV to 1.12 MeV. The accurate position of the maximum Compton electron energy was determined to be at 81% of the maximum height of the Compton edge distribution. In addition, the relation between the electron energy calibration and the effective neutron detection thresholds is described in detail. The present results indicate that both packages are suited for studying the gamma response function of the EJ301 detector.
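    The "maximum Compton electron energy" used for calibration above follows from Compton kinematics: T_max = 2E² / (m_e c² + 2E) for a photon of energy E. A quick check against the standard calibration sources (energies in MeV; the 0.341-1.12 MeV electron range quoted above matches the 22Na 0.511 MeV and 60Co 1.332 MeV edges):

```python
M_E_C2 = 0.511  # electron rest energy, MeV

def compton_edge(e_gamma):
    """Maximum energy (MeV) transferred to an electron by Compton
    scattering of a photon of energy e_gamma (MeV), i.e. backscatter."""
    return 2.0 * e_gamma**2 / (M_E_C2 + 2.0 * e_gamma)

# 22Na (0.511 MeV), 137Cs (0.662 MeV) and 60Co (1.332 MeV) lines
for e in (0.511, 0.662, 1.332):
    print(f"{e:.3f} MeV gamma -> Compton edge {compton_edge(e):.3f} MeV")
```

Plotting the simulated electron spectrum smeared with the fitted resolution is what places the edge at 81% of the Compton-distribution maximum.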

  1. Ion therapy for uveal melanoma in new human eye phantom based on GEANT4 toolkit.

    PubMed

    Mahdipour, Seyed Ali; Mowlavi, Ali Asghar

    2016-01-01

    Radiotherapy with ion beams such as protons and carbon has been used for the treatment of uveal melanoma for many years. In this research, we have developed a new phantom of the human eye for Monte Carlo simulation of tumor treatment with the GEANT4 toolkit. Total depth-dose profiles for proton, alpha, and carbon incident beams with the same range have been calculated in the phantom. Moreover, the deposited energy of the secondary particles for each of the primary beams is calculated. The dose curves are compared for 47.8 MeV protons, 190.1 MeV alphas, and 1060 MeV carbon ions, which have the same range in the target region, reaching the center of the tumor. The passively scattered spread-out Bragg peak (SOBP) for each incident beam, as well as the flux curves of the secondary particles including neutrons, gammas, and positrons, has been calculated and compared for the primary beams. The high sharpness of the carbon beam's Bragg peak with low lateral broadening is the benefit of this beam in hadron therapy, but it has the disadvantages of dose leakage in the tail beyond its Bragg peak and a high intensity of neutron production. The proton beam, however, which conforms well to the tumor shape owing to the beam broadening caused by scattering, can be a good choice for large tumors. PMID:26831752

  2. VIDA: a voxel-based dosimetry method for targeted radionuclide therapy using Geant4.

    PubMed

    Kost, Susan D; Dewaraja, Yuni K; Abramson, Richard G; Stabin, Michael G

    2015-02-01

    We have developed the Voxel-Based Internal Dosimetry Application (VIDA) to provide patient-specific dosimetry in targeted radionuclide therapy performing Monte Carlo simulations of radiation transport with the Geant4 toolkit. The code generates voxel-level dose rate maps using anatomical and physiological data taken from individual patients. Voxel level dose rate curves are then fit and integrated to yield a spatial map of radiation absorbed dose. In this article, we present validation studies using established dosimetry results, including self-dose factors (DFs) from the OLINDA/EXM program for uniform activity in unit density spheres and organ self- and cross-organ DFs in the Radiation Dose Assessment Resource (RADAR) reference adult phantom. The comparison with reference data demonstrated agreement within 5% for self-DFs to spheres and reference phantom source organs for four common radionuclides used in targeted therapy ((131)I, (90)Y, (111)In, (177)Lu). Agreement within 9% was achieved for cross-organ DFs. We also present dose estimates to normal tissues and tumors from studies of two non-Hodgkin Lymphoma patients treated by (131)I radioimmunotherapy, with comparison to results generated independently with another dosimetry code. A relative difference of 12% or less was found between methods for mean absorbed tumor doses accounting for tumor regression.
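    The "fit and integrated" step described above is often done per voxel with a mono-exponential dose-rate model, whose time integral has a closed form: integrating D0·exp(−λt) from 0 to infinity gives D0/λ. A sketch with illustrative numbers, not patient data:

```python
import math

def absorbed_dose_monoexp(d0_gy_per_h, eff_half_life_h):
    """Integrate a mono-exponential dose-rate curve D0*exp(-lambda*t)
    from t=0 to infinity: total absorbed dose = D0 / lambda (Gy)."""
    lam = math.log(2.0) / eff_half_life_h
    return d0_gy_per_h / lam

# Illustrative voxel: initial dose rate 0.05 Gy/h and an effective
# half-life of 60 h (physical decay plus biological clearance).
total_dose = absorbed_dose_monoexp(0.05, 60.0)
print(f"{total_dose:.2f} Gy")
```

Multi-exponential or trapezoidal-plus-tail integration is used when the sampled dose-rate points do not follow a single exponential.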

  3. VIDA: A Voxel-Based Dosimetry Method for Targeted Radionuclide Therapy Using Geant4

    PubMed Central

    Dewaraja, Yuni K.; Abramson, Richard G.; Stabin, Michael G.

    2015-01-01

    Abstract We have developed the Voxel-Based Internal Dosimetry Application (VIDA) to provide patient-specific dosimetry in targeted radionuclide therapy performing Monte Carlo simulations of radiation transport with the Geant4 toolkit. The code generates voxel-level dose rate maps using anatomical and physiological data taken from individual patients. Voxel level dose rate curves are then fit and integrated to yield a spatial map of radiation absorbed dose. In this article, we present validation studies using established dosimetry results, including self-dose factors (DFs) from the OLINDA/EXM program for uniform activity in unit density spheres and organ self- and cross-organ DFs in the Radiation Dose Assessment Resource (RADAR) reference adult phantom. The comparison with reference data demonstrated agreement within 5% for self-DFs to spheres and reference phantom source organs for four common radionuclides used in targeted therapy (131I, 90Y, 111In, 177Lu). Agreement within 9% was achieved for cross-organ DFs. We also present dose estimates to normal tissues and tumors from studies of two non-Hodgkin Lymphoma patients treated by 131I radioimmunotherapy, with comparison to results generated independently with another dosimetry code. A relative difference of 12% or less was found between methods for mean absorbed tumor doses accounting for tumor regression. PMID:25594357

  4. Monte Carlo study of a 3D Compton imaging device with GEANT4

    NASA Astrophysics Data System (ADS)

    Lenti, M.; Veltri, M.

    2011-10-01

    In this paper we investigate, with a detailed Monte Carlo simulation based on Geant4, the novel approach of Lenti (2008) [1] to 3D imaging with photon scattering. A monochromatic and well-collimated gamma beam is used to illuminate the object to be imaged, and the Compton-scattered photons are detected by means of a surrounding germanium strip detector. The impact position and the energy of the photons are measured with high precision, and the scattering position along the beam axis is calculated. As an application of this technique we study the case of brain imaging, but the results can be applied as well to situations where a lighter object, with localized variations of density, is embedded in a denser container. We report the attainable sensitivity in the detection of density variations as a function of the beam energy, the depth inside the object, and the size and density of the inclusions. Using a 600 keV gamma beam, for an inclusion with a density increase of 30% with respect to the surrounding tissue and a thickness along the beam of 5 mm, we obtain at midbrain position a resolution of about 2 mm and a contrast of 12%. In addition, the simulation indicates that for the same gamma-beam energy a complete brain scan would result in an effective dose of about 1 mSv.
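    Recovering the scattering position along the beam axis relies on the Compton relation between scattered-photon energy and angle, E' = E / (1 + (E/m_ec²)(1 − cos θ)): inverting it gives θ from the measured energy, and combining θ with the measured impact point locates the scattering vertex on the beam line. A kinematics-only sketch (the geometry is illustrative, not the paper's detector):

```python
import math

M_E_C2 = 511.0  # electron rest energy, keV

def scatter_angle(e0_kev, e_scattered_kev):
    """Invert the Compton formula to get the scattering angle (rad)
    from the beam energy and the measured scattered-photon energy."""
    cos_theta = 1.0 - M_E_C2 * (1.0 / e_scattered_kev - 1.0 / e0_kev)
    return math.acos(cos_theta)

# A 600 keV beam photon detected with 400 keV after scattering:
theta = scatter_angle(600.0, 400.0)
print(f"{math.degrees(theta):.1f} deg")
```

With the strip detector's energy resolution, the uncertainty on θ (and hence on the reconstructed depth) is driven by dE'/dθ at the working angle.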

  5. GATE as a GEANT4-based Monte Carlo platform for the evaluation of proton pencil beam scanning treatment plans.

    PubMed

    Grevillot, L; Bertrand, D; Dessy, F; Freud, N; Sarrut, D

    2012-07-01

    Active scanning delivery systems take full advantage of ion beams to best conform to the tumor and to spare surrounding healthy tissues; however, it is also a challenging technique for quality assurance. In this perspective, we upgraded the GATE/GEANT4 Monte Carlo platform in order to recalculate the treatment planning system (TPS) dose distributions for active scanning systems. A method that allows evaluating the TPS dose distributions with the GATE Monte Carlo platform has been developed and applied to the XiO TPS (Elekta), for the IBA proton pencil beam scanning (PBS) system. First, we evaluated the specificities of each dose engine. A dose-conversion scheme that allows one to convert dose to medium into dose to water was implemented within GATE. Specific test cases in homogeneous and heterogeneous configurations allowed for the estimation of the differences between the beam models implemented in XiO and GATE. Finally, dose distributions of a prostate treatment plan were compared. In homogeneous media, a satisfactory agreement was generally obtained between XiO and GATE. The maximum stopping power difference of 3% occurred in a human tissue of 0.9 g cm(-3) density and led to a significant range shift. Comparisons in heterogeneous configurations pointed out the limits of the TPS dose calculation accuracy and the superiority of Monte Carlo simulations. The necessity of computing dose to water in our Monte Carlo code for comparisons with TPSs is also presented. Finally, the new capabilities of the platform are applied to a prostate treatment plan and dose differences between both dose engines are analyzed in detail. This work presents a generic method to compare TPS dose distributions with the GATE Monte Carlo platform. It is noteworthy that GATE is also a convenient tool for imaging applications, therefore opening new research possibilities for the PBS modality.
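    The dose-to-medium to dose-to-water conversion mentioned above is conventionally done with the Bragg-Gray relation D_w = D_m · (S/ρ)_w,m, scaling by the water-to-medium mass-stopping-power ratio. A one-line sketch; the ratio value is an illustrative placeholder, not a tabulated one:

```python
def dose_to_water(dose_medium_gy, stopping_power_ratio_w_m):
    """Convert dose-to-medium (Gy) to dose-to-water (Gy) by scaling
    with the water-to-medium mass stopping power ratio."""
    return dose_medium_gy * stopping_power_ratio_w_m

# Illustrative: 2.00 Gy scored in a soft-tissue voxel, ratio 1.03
print(dose_to_water(2.00, 1.03))
```

In a Monte Carlo engine the ratio is evaluated per voxel (and, strictly, per particle spectrum), which is why the conversion was built into the transport code rather than applied as a global factor.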

  6. Geant4 simulation for a study of a possible use of carbon ion pencil beams for the treatment of ocular melanomas with the active scanning system at CNAO

    NASA Astrophysics Data System (ADS)

    Farina, E.; Piersimoni, P.; Riccardi, C.; Rimoldi, A.; Tamborini, A.; Ciocca, M.

    2015-12-01

    The aim of this work was to study a possible use of carbon-ion pencil beams (delivered with the active scanning modality) for the treatment of ocular melanomas at the Centro Nazionale di Adroterapia Oncologica (CNAO). The promising aspect of carbon-ion radiotherapy for the treatment of this disease lies in its superior relative radio-biological effectiveness (RBE). The Monte Carlo (MC) Geant4 10.00 toolkit was used to simulate the complete CNAO extraction beamline, with the active and passive components along it. A modeled human-eye detector, including a realistic target tumor volume, was used as the target. Cross-checks with previous studies at CNAO using protons allowed comparisons of the possible benefits of such a technique with respect to proton beams. Experimental data on the transverse distributions of proton and carbon-ion beams were used to validate the simulation.

  7. Geant4 simulations on medical Linac operation at 18 MV: Experimental validation based on activation foils

    NASA Astrophysics Data System (ADS)

    Vagena, E.; Stoulos, S.; Manolopoulou, M.

    2016-03-01

    The operation of a medical linear accelerator was simulated using the Geant4 code in order to study the characteristics of an 18 MV photon beam. Simulations showed that (a) the photon spectrum at the isocenter is not influenced by changes in the primary electron beam's energy distribution and spatial spread; (b) 98% of the photon energy fluence scored at the isocenter is primary photons that have only interacted with the target; (c) the number of contaminant electrons is not negligible, fluctuating around 5×10-5 per primary electron, or 2.40×10-3 per photon, at the isocenter; (d) the number of neutrons created by (γ, n) reactions is 3.13×10-6 per primary electron, or 1.50×10-3 per photon, at the isocenter; (e) a flattening-filter-free beam needs fewer primary electrons to deliver the same photon fluence at the isocenter than normal flattening-filter operation; (f) there is no significant increase of the surface dose due to contaminant electrons when the flattening filter is removed; and (g) comparing the neutron fluences per incident electron for the flattened and unflattened beams, the neutron fluence is 7% higher for the unflattened beam. To validate the simulation results, the total neutron and photon fluences at the isocenter were measured using nickel, indium, and natural-uranium activation foils. For the photon fluence, the percentage difference between simulations and measurements was 1.26% for the uranium foil and 2.45% for the indium foil, while for neutrons the discrepancy was higher, up to 8.0%. The photon and neutron fluences of the simulated experiments fall within ±1 and ±2 sigma, respectively, of the ones obtained experimentally.
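    Activation-foil fluence measurements like those above rest on the activation equation: a foil with N target nuclei and cross-section σ exposed to a fluence rate φ builds up activity A(t) = Nσφ(1 − e^(−λt)) during irradiation. Inverting for φ, with illustrative (non-tabulated) foil parameters:

```python
import math

def fluence_rate(activity_bq, n_atoms, sigma_cm2, lam_per_s, t_irr_s):
    """Invert A = N*sigma*phi*(1 - exp(-lambda*t_irr)) for the fluence
    rate phi (cm^-2 s^-1) from the end-of-irradiation activity."""
    saturation = 1.0 - math.exp(-lam_per_s * t_irr_s)
    return activity_bq / (n_atoms * sigma_cm2 * saturation)

# Illustrative foil: 1e21 atoms, cross-section 1 barn = 1e-24 cm^2,
# product half-life 54 min, 10-minute irradiation, 50 Bq measured.
lam = math.log(2.0) / (54.0 * 60.0)
phi = fluence_rate(50.0, 1e21, 1e-24, lam, 600.0)
print(f"{phi:.3e}")
```

In a real measurement the spectrum-averaged cross-section, decay during cooling, and counting efficiency all enter the same balance equation.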

  8. The simulation of the LANFOS-H food radiation contamination detector using Geant4 package

    NASA Astrophysics Data System (ADS)

    Piotrowski, Lech Wiktor; Casolino, Marco; Ebisuzaki, Toshikazu; Higashide, Kazuhiro

    2015-02-01

    The recent incident at the Fukushima power plant caused growing concern about radiation contamination and resulted in lowering the Japanese limits for the permitted amount of 137Cs in food to 100 Bq/kg. To increase safety and ease this concern we are developing LANFOS (Large Food Non-destructive Area Sampler), a compact, easy-to-use detector for the assessment of radiation in food. The LANFOS-H described in this paper has 4π coverage to assess the amount of 137Cs present, separating it from possible 40K contamination of the food. Food samples therefore do not have to be pre-processed prior to a test and can be consumed after measurement. It is designed for use by non-professionals in homes and small institutions such as schools, showing the safety of the samples, but it can also be utilized by specialists, providing a radiation spectrum. Proper assessment of radiation in food with the apparatus requires estimation of the gamma conversion factor of the detectors, i.e. how many gamma photons will produce a signal. In this paper we show results of the Monte Carlo estimation of this factor for various approximated shapes of fish, vegetables, and amounts of rice, performed with the Geant4 package. We find that the conversion factor combined from all the detectors is similar for all food types, around 37%, varying by at most 5% with sample length, much less than for the individual detectors. Different inclinations and positions of the samples in the detector introduce an uncertainty of 1.4%. This small uncertainty validates the concept of a 4π non-destructive apparatus.
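    With the conversion (detection-efficiency) factor in hand, the specific activity follows directly from the net count rate: A (Bq/kg) = counts / (live time × efficiency × mass). A sketch with illustrative numbers; the 37% efficiency echoes the combined figure quoted above, while the counts and mass are made up:

```python
def specific_activity(net_counts, live_time_s, efficiency, mass_kg):
    """Specific activity in Bq/kg from net peak counts, counting live
    time, detection efficiency, and sample mass."""
    return net_counts / (live_time_s * efficiency * mass_kg)

# Illustrative: 2220 net 137Cs counts in 600 s from a 1.0 kg sample,
# 37% combined detection efficiency.
a = specific_activity(2220.0, 600.0, 0.37, 1.0)
print(a, "Bq/kg")  # compare against the 100 Bq/kg 137Cs limit
```

Keeping the combined factor nearly shape-independent is precisely what lets a non-professional user skip sample preparation.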

  9. Investigation of OPET Performance Using GATE, a Geant4-Based Simulation Software.

    PubMed

    Rannou, Fernando R; Kohli, Vandana; Prout, David L; Chatziioannou, Arion F

    2004-10-01

    A combined optical positron emission tomography (OPET) system is capable of both optical and PET imaging in the same setting, and it can provide information/interpretation not possible in single-mode imaging. The scintillator array here serves the dual function of coupling the optical signal from bioluminescence/fluorescence to the photodetector and also of channeling optical scintillations from the gamma rays. We report simulation results of the PET part of OPET using GATE, a Geant4 simulation package. The purpose of this investigation is the definition of the geometric parameters of the OPET tomograph. OPET is composed of six detector blocks arranged in a hexagonal ring-shaped pattern with an inner radius of 15.6 mm. Each detector consists of a two-dimensional array of 8 × 8 scintillator crystals each measuring 2 × 2 × 10 mm(3). Monte Carlo simulations were performed using the GATE software to measure absolute sensitivity, depth of interaction, and spatial resolution for two ring configurations, with and without gantry rotations, two crystal materials, and several crystal lengths. Images were reconstructed with filtered backprojection after angular interleaving and transverse one-dimensional interpolation of the sinogram. We report absolute sensitivities nearly seven times that of the prototype microPET at the center of field of view and 2.0 mm tangential and 2.3 mm radial resolutions with gantry rotations up to an 8.0 mm radial offset. These performance parameters indicate that the imaging spatial resolution and sensitivity of the OPET system will be suitable for high-resolution and high-sensitivity small-animal PET imaging.

  10. GEANT4 used for neutron beam design of a neutron imaging facility at TRIGA reactor in Morocco

    NASA Astrophysics Data System (ADS)

    Ouardi, A.; Machmach, A.; Alami, R.; Bensitel, A.; Hommada, A.

    2011-09-01

    Neutron imaging has a broad scope of applications and has played a pivotal role in visualizing and quantifying hydrogenous masses in metallic matrices. The field continues to expand into new applications with the installation of new neutron imaging facilities. In this scope, a neutron imaging facility for computed tomography and real-time neutron radiography is currently being developed around the 2.0 MW TRIGA MARK-II reactor at the Maamora Nuclear Research Center in Morocco (Reuscher et al., 1990 [1]; de Menezes et al., 2003 [2]; Deinert et al., 2005 [3]). The neutron imaging facility consists of a neutron collimator, a real-time neutron imaging system, and image-processing systems. In order to reduce the gamma-ray content in the neutron beam, the tangential channel was selected. For a power of 250 kW, the corresponding thermal neutron flux measured at the inlet of the tangential channel is around 3×10^11 n/cm^2/s. The facility will be based on a conical neutron collimator with two circular diaphragms with diameters of 4 and 2 cm, corresponding to L/D ratios of 165 and 325, respectively. These diaphragm sizes allow a compromise between good flux and an efficient L/D ratio. A convergent-divergent collimator geometry has been adopted. The beam line consists of a gamma filter, a fast-neutron filter, a neutron moderator, neutron and gamma shutters, biological shielding around the collimator, and several stages of neutron collimation. Monte Carlo calculations with the fully 3D code GEANT4 were used to design the neutron beam line (http://www.info.cern.ch/asd/geant4/geant4.html [4]). To enhance the quality of the thermal neutron beam, several materials, mainly bismuth (Bi) and sapphire (Al2O3), were examined as gamma and neutron filters, respectively. The GEANT4 simulations showed that gammas and epithermal and fast neutrons could be filtered using the bismuth (Bi) and sapphire (Al2O3) filters, respectively. To get a good cadmium ratio, GEANT4 simulations were used to
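    The collimation quality of such a beamline is characterized by the L/D ratio (collimator length over diaphragm diameter), and the geometric unsharpness of a radiograph scales as u = l / (L/D), where l is the object-to-detector distance. A sketch using the two diaphragm diameters quoted above; the 660 cm length is an assumption chosen to roughly reproduce the quoted ratios, not a figure from the paper:

```python
def l_over_d(length_cm, diaphragm_cm):
    """Collimation ratio of a pinhole-type neutron collimator."""
    return length_cm / diaphragm_cm

def unsharpness_mm(obj_detector_mm, l_d_ratio):
    """Geometric unsharpness u = l / (L/D) of the radiograph."""
    return obj_detector_mm / l_d_ratio

# Assumed 660 cm collimator with the 4 cm and 2 cm diaphragms:
print(l_over_d(660.0, 4.0))   # 165, matching the quoted ratio
print(l_over_d(660.0, 2.0))   # 330, close to the quoted 325
# Unsharpness for an object 50 mm from the detector at L/D = 165:
print(round(unsharpness_mm(50.0, 165.0), 3))
```

The smaller diaphragm therefore sharpens the image at the cost of fluence, which is the flux-versus-L/D compromise the abstract describes.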

  11. Compton polarimeter as a focal plane detector for hard X-ray telescope: sensitivity estimation with Geant4 simulations

    NASA Astrophysics Data System (ADS)

    Chattopadhyay, T.; Vadawale, S. V.; Pendharkar, J.

    2013-04-01

    X-ray polarimetry can be an important tool for investigating various physical processes, as well as their geometries, at celestial X-ray sources. However, X-ray polarimetry has not progressed much compared to spectroscopy, timing and imaging, mainly because of its extremely photon-hungry nature, which severely limits the sensitivity of X-ray polarimeters. The great improvement in sensitivity in spectroscopy and imaging was made possible by focusing X-ray optics, which are effective only in the soft X-ray energy range. A similar improvement in the sensitivity of polarisation measurements at soft X-ray energies is expected in the near future with the advent of GEM-based photoelectric polarimeters. However, at energies >10 keV, even the spectroscopic and imaging sensitivities of X-ray detectors are limited by the lack of focusing optics. Hard X-ray polarimetry has therefore so far remained a largely unexplored area. On the other hand, the polarisation degree is typically expected to increase at higher energies, where radiation from non-thermal processes constitutes the dominant fraction. Polarisation measurements in hard X-rays can thus yield significant insights into such processes. With the recent availability of hard X-ray optics (e.g. with the upcoming NuSTAR and Astro-H missions), which can focus X-rays from 5 keV to 80 keV, the sensitivity of X-ray detectors in the hard X-ray range is expected to improve significantly. In this context we explore the feasibility of a focal plane hard X-ray polarimeter based on Compton scattering, with a thin plastic scatterer surrounded by a cylindrical array of scintillator detectors. We have carried out detailed Geant4 simulations to estimate the modulation factor for a 100% polarized beam as well as the polarimetric efficiency of this configuration. We have also validated these results with a semi-analytical approach.
Here we present the initial results of the polarisation sensitivities of such a focal plane Compton polarimeter coupled with the reflection efficiency of present-era hard X

  12. Simulation of the radiation exposure in space during a large solar energetic particle event with GEANT4

    NASA Astrophysics Data System (ADS)

    Matthiä, Daniel; Berger, Thomas; Puchalska, Monika; Reitz, Guenther

    in August 1972 in the energy range from 45 MeV to 1 GeV. The transport calculations of the energetic particles through the shielding and the phantom model were performed using the Monte-Carlo code GEANT4.

  13. TH-E-BRE-01: A 3D Solver of Linear Boltzmann Transport Equation Based On a New Angular Discretization Method with Positivity for Photon Dose Calculation Benchmarked with Geant4

    SciTech Connect

    Hong, X; Gao, H

    2014-06-15

    Purpose: The Linear Boltzmann Transport Equation (LBTE), solved through the statistical Monte Carlo (MC) method, provides accurate dose calculation in radiotherapy. This work investigates an alternative: solving the LBTE accurately with a deterministic numerical method, which may offer an advantage in computational speed over MC. Methods: Instead of using traditional spherical harmonics to approximate the angular scattering kernel, our deterministic numerical method directly computes angular scattering weights, based on a new angular discretization method that utilizes the linear finite element method on a local triangulation of the unit angular sphere. As a result, our angular discretization method has the unique advantage of positivity, i.e., it maintains all scattering weights nonnegative at all times, which is physically correct. Moreover, our method is local in angular space and therefore handles anisotropic scattering, such as forward-peaked scattering, well. To be compatible with image-guided radiotherapy, the spatial variables are discretized on a structured grid with the standard diamond scheme. After discretization, an improved source-iteration method is utilized for solving the linear system without saving the linear system to memory. The accuracy of our 3D solver is validated using analytic solutions and benchmarked with Geant4, a popular MC solver. Results: The differences between the Geant4 solutions and our solutions were less than 1.5% for various testing cases that mimic practical situations. More details are available in the supporting document. Conclusion: We have developed a 3D LBTE solver based on a new angular discretization method that guarantees the positivity of scattering weights for physical correctness, and it has been benchmarked with Geant4 for photon dose calculation.
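
The source-iteration scheme and the diamond (difference) scheme mentioned above can be illustrated on a toy one-group, one-dimensional slab problem with isotropic scattering. This is a sketch of the iteration idea only, not the paper's 3D solver; all cross sections, the source and the slab size are made-up values:

```python
import numpy as np

# Toy 1D one-group slab, vacuum boundaries, isotropic scattering,
# solved by source iteration with diamond differencing (all
# parameters are illustrative, not from the paper).
L_slab, nx = 10.0, 200          # slab width (cm), spatial cells
sigma_t, sigma_s = 1.0, 0.5     # total / scattering cross sections (1/cm)
q_ext = 1.0                     # uniform isotropic external source
dx = L_slab / nx

# S8 Gauss-Legendre angular quadrature on mu in [-1, 1]
mu, w = np.polynomial.legendre.leggauss(8)

phi = np.zeros(nx)              # scalar flux
for it in range(500):
    src = 0.5 * (sigma_s * phi + q_ext)    # isotropic emission density
    phi_new = np.zeros(nx)
    for m, wm in zip(mu, w):
        psi_in = 0.0                        # vacuum boundary
        cells = range(nx) if m > 0 else range(nx - 1, -1, -1)
        for i in cells:
            # diamond difference: psi_cell = (psi_in + psi_out) / 2
            psi_cell = (abs(m) * psi_in + 0.5 * dx * src[i]) \
                       / (abs(m) + 0.5 * dx * sigma_t)
            psi_in = 2.0 * psi_cell - psi_in
            phi_new[i] += wm * psi_cell
    converged = np.max(np.abs(phi_new - phi)) < 1e-8
    phi = phi_new
    if converged:
        break

print(phi[nx // 2])   # approaches the infinite-medium value q/(sigma_t - sigma_s)
```

Since the scattering ratio here is 0.5, the iteration converges quickly; for highly scattering media the convergence of plain source iteration degrades, which is why accelerated variants (like the "improved" method the abstract refers to) are used in practice.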

  14. Computed Pion Yields from a Tantalum Rod Target: Comparing MARS15 and GEANT4 Across Proton Energies

    NASA Astrophysics Data System (ADS)

    Brooks, S. J.; Walaron, K. A.

    2006-05-01

    The choice of proton driver energy is an important variable in maximising the pion flux available in later stages of the neutrino factory. Simulations of pion production using a range of energies are presented and cross-checked for reliability between the codes MARS15 and GEANT4. The distributions are combined with postulated apertures for the pion decay channel and muon front-end to estimate the usable muon flux after capture losses. Resolution of discrepancies between the codes awaits experimental data in the required energy range.

  15. A comparison of the measured responses of a tissue-equivalent proportional counter to high energy heavy (HZE) particles and those simulated using the Geant4 Monte Carlo code

    PubMed Central

    Taddei, Phillip J.; Zhao, Zhongxiang; Borak, Thomas B.

    2010-01-01

    Monte Carlo simulations of heavy ion interactions using the Geant4 toolkit were compared with measurements of energy deposition in a spherical tissue-equivalent proportional counter (TEPC). A spherical cavity with a physical diameter of 12.7 mm was filled with propane-based tissue-equivalent gas and surrounded by a wall of A-150 tissue-equivalent plastic that was 2.54 mm thick. Measurements and Monte Carlo simulations were used to record the energy deposition and the trajectory of the incident particle on an event-by-event basis for ions ranging in atomic number from 2 (4He) to 26 (56Fe) and in energy from 200 MeV/nucleon to 1000 MeV/nucleon. In the simulations, tracking of secondary electrons was terminated when the range of an electron was below a specified threshold. The effects of range cuts for electrons at 0.5 μm, 1 μm, 10 μm, and 100 μm were evaluated. To simulate energy deposition influenced by large numbers of low energy electrons with large transverse momentum, it was necessary to track electrons down to range cuts of 10 μm or less. The Geant4 simulated data closely matched the measured data acquired using a TEPC for incident particles traversing the center of the detector as well as near the gas-wall interface. Values of frequency-mean lineal energy and dose-mean lineal energy were within 8% of the measured data. The production of secondary particles in the aluminum vacuum chamber had no effect on the response of the TEPC for 56Fe at 1000 MeV/nucleon. The results of this study confirm that Geant4 can simulate patterns of energy deposition for existing microdosimeters and is valuable for improving the design of a new generation of detectors used for space dosimetry and for characterizing particle beams used in hadron radiotherapy. PMID:20862212
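
The frequency-mean and dose-mean lineal energies compared above are defined per event over the energy-imparted spectrum. A minimal sketch of those definitions, using a synthetic event spectrum (the site diameter and the lognormal spectrum are assumptions for illustration, not the paper's data):

```python
import numpy as np

# Lineal energy y = eps / l_bar, where l_bar = (2/3) d is the mean
# chord length of a sphere of diameter d (Cauchy's theorem).
d_um = 2.0                                # simulated site diameter (um), assumed
l_bar = 2.0 / 3.0 * d_um                  # mean chord length (um)

rng = np.random.default_rng(42)
eps_keV = rng.lognormal(mean=2.0, sigma=0.8, size=100_000)  # synthetic deposits

y = eps_keV / l_bar                       # lineal energy per event (keV/um)
y_F = y.mean()                            # frequency-mean lineal energy
y_D = (y ** 2).sum() / y.sum()            # dose-mean lineal energy
print(y_F, y_D)
```

The dose-mean value weights each event by its energy deposit, so y_D always exceeds y_F for any non-degenerate spectrum.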

  16. Inhomogeneity effect in Varian Trilogy Clinac iX 10 MV photon beam using EGSnrc and Geant4 code system

    NASA Astrophysics Data System (ADS)

    Yani, S.; Rhani, M. F.; Haryanto, F.; Arif, I.

    2016-08-01

    Treatment fields consist of tissues other than water-equivalent tissue (soft tissue, bones, lungs, etc.). The inhomogeneity effect can be investigated by Monte Carlo (MC) simulation; MC simulation of the radiation transport in an absorbing medium is the most accurate method for dose calculation in radiotherapy. The aim of this work is to evaluate the effect of an inhomogeneous phantom on dose calculations in photon beam radiotherapy obtained by different MC codes. The MC code systems EGSnrc and Geant4 were used in this study. The inhomogeneous phantom measures 39.5 × 30.5 × 30 cm3 and is made of 4 material slices (12.5 cm water, 10 cm aluminium, 5 cm lung and 12.5 cm water). Simulations were performed for a field size of 4 × 4 cm2 at SSD 100 cm. The spectral distribution of the Varian Trilogy Clinac iX 10 MV beam was used. Percent depth dose (PDD) and dose profiles were investigated in this research. The effects of inhomogeneities on radiation dose distributions depend on the amount, density and atomic number of the inhomogeneity, as well as on the quality of the photon beam. Good agreement between the dose distributions from the EGSnrc and Geant4 code systems in the inhomogeneous phantom was observed, with dose differences around 5% and 7% for depth doses and dose profiles, respectively.

  17. DagSolid: a new Geant4 solid class for fast simulation in polygon-mesh geometry.

    PubMed

    Han, Min Cheol; Kim, Chan Hyeong; Jeong, Jong Hwi; Yeom, Yeon Soo; Kim, SungHoon; Wilson, Paul P H; Apostolakis, John

    2013-07-01

    Even though a computer-aided design (CAD)-based geometry can be directly implemented in Geant4 as polygon-mesh using the G4TessellatedSolid class, the computation speed becomes very slow, especially when the geometry is composed of a large number of facets. To address this problem, in the present study, a new Geant4 solid class, named DagSolid, was developed based on the direct accelerated geometry for Monte Carlo (DAGMC) library, which provides ray-tracing acceleration functions. To develop the DagSolid class, the new solid class was derived from the G4VSolid class, and its ray-tracing functions were linked to the corresponding functions of the DAGMC library. The results of this study show that the use of the DagSolid class drastically improves the computation speed. The improvement was more significant when there were more facets, meaning that the DagSolid class can be used more effectively for complicated geometries with many facets than for simple geometries. The maximum difference in computation speed was 1562 and 680 times for Geantino and ChargedGeantino, respectively. For real particles (gammas, electrons, neutrons, and protons), the difference in computation speed was less significant, but still within the range of 53-685 times depending on the type of beam particles simulated. PMID:23771063

  18. Performance evaluation of multi sampling ionization chamber for heavy ion beams by comparison with GEANT4 simulation

    NASA Astrophysics Data System (ADS)

    Kanke, Yuki; Himac H093 Collaboration

    2014-09-01

    In high-energy heavy-ion accelerator facilities, multi-sampling ionization chambers are often used for identification of the atomic number Z by detecting the energy deposited in them. In a study at GSI, a picture in which secondary electrons (δ rays) escape from the ionization chamber explains the experimental pulse-height resolution data. If this picture is correct, the pulse-height resolution should depend on the effective area of the ionization chamber. The experiment was performed at NIRS-HIMAC. The pulse-height resolutions of two ionization chambers with different effective areas were compared using a 400-MeV/u Ni beam and its fragments. A difference in the pulse-height resolutions was observed. By comparison with a GEANT4 simulation including δ-ray emission, the performance of the ionization chamber has been evaluated.

  19. Design of Cherenkov bars for the optical part of the time-of-flight detector in Geant4.

    PubMed

    Nozka, L; Brandt, A; Rijssenbeek, M; Sykora, T; Hoffman, T; Griffiths, J; Steffens, J; Hamal, P; Chytka, L; Hrabovsky, M

    2014-11-17

    We present the results of studies devoted to the development and optimization of the optical part of a high precision time-of-flight (TOF) detector for the Large Hadron Collider (LHC). This work was motivated by a proposal to use such a detector in conjunction with a silicon detector to tag and measure protons from interactions of the type p + p → p + X + p, where the two outgoing protons are scattered in the very forward directions. The fast timing detector uses fused silica (quartz) bars that emit Cherenkov radiation as a relativistic particle passes through; the emitted Cherenkov photons are detected by, for instance, a micro-channel plate multi-anode photomultiplier tube (MCP-PMT). Several possible designs are implemented in Geant4 and studied for timing optimization as a function of the photon arrival time and the number of Cherenkov photons reaching the photo-sensor.
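
For orientation, the Cherenkov emission angle and an order-of-magnitude photon yield in fused silica follow from the Frank-Tamm relation. The refractive index, the wavelength window and β = 1 are assumptions for this sketch, not values from the study:

```python
import math

# Cherenkov angle: cos(theta_c) = 1 / (n * beta)
n = 1.46          # fused silica refractive index (approximate, dispersion ignored)
beta = 1.0        # fully relativistic particle, assumed
theta_c = math.acos(1.0 / (n * beta))

# Frank-Tamm photon yield between two wavelengths (Z = 1 particle):
# dN/dx = 2*pi*alpha * sin^2(theta_c) * (1/lambda1 - 1/lambda2)
alpha = 1.0 / 137.036
lam1, lam2 = 300e-7, 600e-7     # wavelength window in cm (300-600 nm)
dN_dx = 2 * math.pi * alpha * math.sin(theta_c) ** 2 * (1 / lam1 - 1 / lam2)

print(math.degrees(theta_c), dN_dx)   # ~47 degrees, a few hundred photons/cm
```

The yield of a few hundred photons per centimetre of bar is what makes a short quartz radiator viable when read out by a high-gain MCP-PMT.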

  20. Calculation of direct effects of 60Co gamma rays on the different DNA structural levels: A simulation study using the Geant4-DNA toolkit

    NASA Astrophysics Data System (ADS)

    Tajik, Marjan; Rozatian, Amir S. H.; Semsarha, Farid

    2015-03-01

    In this study, simple single strand breaks (SSB) and double strand breaks (DSB) due to the direct effects of the secondary electron spectrum of 60Co gamma rays on different organizational levels of a volume model of the B-DNA conformation have been calculated using the Geant4-DNA toolkit. The results of this study for the direct DSB yield show good agreement with other theoretical and experimental results obtained with both photons and their secondary electrons; in the case of SSB, however, a noticeable difference can be observed. Moreover, given the almost constant yields of direct strand breaks across the different structural levels of the DNA calculated in this work and compared with some theoretical studies, it can be deduced that the direct strand break yields depend mainly on the primary double helix structure of the DNA, and that the higher-order structures have no noticeable effect on direct DNA damage induction by 60Co gamma rays. In contrast, a direct dependency between the direct SSB and DSB yields and the volume of the DNA structure has been found. A further study on the histone proteins showed that they can play an important role in trapping low energy electrons without any significant effect on direct DNA strand break induction, at least in the range of energies used in the current study.

  1. A GAMOS plug-in for GEANT4 based Monte Carlo simulation of radiation-induced light transport in biological media.

    PubMed

    Glaser, Adam K; Kanick, Stephen C; Zhang, Rongxiao; Arce, Pedro; Pogue, Brian W

    2013-05-01

    We describe a tissue optics plug-in that interfaces with the GEANT4/GAMOS Monte Carlo (MC) architecture, providing a means of simulating radiation-induced light transport in biological media for the first time. Specifically, we focus on the simulation of light transport due to the Čerenkov effect (light emission from charged particles traveling faster than the local speed of light in a given medium), a phenomenon which requires accurate modeling of both the high energy particle and the subsequent optical photon transport, a dynamically coupled process that is not well described by any current MC framework. The results of validation simulations show excellent agreement with currently employed biomedical optics MC codes (i.e., Monte Carlo for Multi-Layered media (MCML), Mesh-based Monte Carlo (MMC), and diffusion theory), and examples relevant to recent studies into the detection of Čerenkov light from an external radiation beam or radionuclide are presented. While the work presented within this paper focuses on radiation-induced light transport, the core features and robust flexibility of the plug-in make it also extensible to more conventional biomedical optics simulations. The plug-in, user guide, example files, as well as the files necessary to reproduce the validation simulations described within this paper are available online at http://www.dartmouth.edu/optmed/research-projects/monte-carlo-software.

  2. Ray tracing simulations for the wide-field x-ray telescope of the Einstein Probe mission based on Geant4 and XRTG4

    NASA Astrophysics Data System (ADS)

    Zhao, Donghua; Zhang, Chen; Yuan, Weimin; Willingale, Richard; Ling, Zhixing; Feng, Hua; Li, Hong; Ji, Jianfeng; Wang, Wenxin; Zhang, Shuangnan

    2014-07-01

    Einstein Probe (EP) is a proposed small scientific satellite dedicated to time-domain astrophysics working in the soft X-ray band. It will discover transients and monitor variable objects in 0.5-4 keV, for which it will employ a very large instantaneous field-of-view (60° × 60°), along with moderate spatial resolution (FWHM ˜ 5 arcmin). Its wide-field imaging capability will be achieved by using established technology in novel lobster-eye optics. In this paper, we present Monte-Carlo simulations of the focusing capabilities of EP's Wide-field X-ray Telescope (WXT). The simulations are performed using Geant4 together with XRTG4, an X-ray tracing extension developed by cosine (http://cosine.nl/). Our work is the first step toward building a comprehensive model with which the design of the X-ray optics and the ultimate sensitivity of the instrument can be optimized by simulating the X-ray tracing and the radiation environment of the system, including the focal plane detector and the shielding, at the same time.

  3. Organ doses from hepatic radioembolization with 90Y, 153Sm, 166Ho and 177Lu: A Monte Carlo simulation study using Geant4

    NASA Astrophysics Data System (ADS)

    Hashikin, N. A. A.; Yeong, C. H.; Guatelli, S.; Abdullah, B. J. J.; Ng, K. H.; Malaroda, A.; Rosenfeld, A. B.; Perkins, A. C.

    2016-03-01

    90Y-radioembolization is a palliative treatment for liver cancer. 90Y decays via beta emission, making imaging difficult due to the absence of gamma radiation. Since post-procedure imaging is crucial, several theranostic radionuclides have been explored as alternatives. However, exposure to gamma radiation throughout the treatment raises concern for the organs near the liver. A Geant4 Monte Carlo simulation using the MIRD Pamphlet 5 reference phantom was carried out. A spherical tumour with a 4.3 cm radius was modelled within the liver. 1.82 GBq of 90Y sources were isotropically distributed within the tumour, with no extrahepatic shunting. The simulation was repeated with 153Sm, 166Ho and 177Lu. The estimated tumour dose for all radionuclides was 262.9 Gy. A tumour dose equivalent to that of 1.82 GBq of 90Y can be achieved with 8.32, 5.83, and 4.44 GBq of 153Sm, 166Ho and 177Lu, respectively. Normal liver doses from the other radionuclides were lower than for 90Y, and hence beneficial for normal tissue sparing. The organ doses from 153Sm and 177Lu were relatively higher due to the higher gamma energy, but were still well below 1 Gy. 166Ho, 177Lu and 153Sm offer useful gamma emission for post-procedure imaging. They show potential as 90Y substitutes, delivering comparable tumour doses, lower normal liver doses, and doses to the other organs far below the tolerance limit.
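
The activity scaling behind the quoted substitute activities is linear: if a radionuclide deposits a given tumour dose per GBq administered, the activity matching a target dose is a simple ratio. A sketch using the numbers in the abstract (the dose-per-GBq values are back-calculated here for illustration, not given in the paper):

```python
# Required activity of a substitute radionuclide to match the tumour
# dose from 1.82 GBq of 90Y, assuming dose scales linearly with
# administered activity (no extrahepatic shunting).
target_dose_gy = 262.9

def required_activity_gbq(target_dose_gy, dose_per_gbq):
    return target_dose_gy / dose_per_gbq

# dose-per-GBq back-calculated from the quoted activities (assumption):
dose_per_gbq = {"90Y": 262.9 / 1.82, "153Sm": 262.9 / 8.32,
                "166Ho": 262.9 / 5.83, "177Lu": 262.9 / 4.44}
for nuc, dpg in dose_per_gbq.items():
    print(nuc, round(required_activity_gbq(target_dose_gy, dpg), 2))
```

The lower dose-per-GBq of the gamma-emitting substitutes is exactly why several-fold higher administered activities are needed to match the 90Y tumour dose.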

  4. SU-E-T-531: Performance Evaluation of Multithreaded Geant4 for Proton Therapy Dose Calculations in a High Performance Computing Facility

    SciTech Connect

    Shin, J; Coss, D; McMurry, J; Farr, J; Faddegon, B

    2014-06-01

    Purpose: To evaluate the efficiency of multithreaded Geant4 (Geant4-MT, version 10.0) for proton Monte Carlo dose calculations using a high performance computing facility. Methods: Geant4-MT was used to calculate 3D dose distributions in 1×1×1 mm3 voxels in a water phantom and in a patient's head for a 150 MeV proton beam covering approximately 5×5 cm2 in the water phantom. Three timestamps were measured on the fly to separately analyze the required time for initialization (which cannot be parallelized), the processing time of individual threads, and the completion time. Scalability of the averaged processing time per thread was calculated as a function of thread number (1, 100, 150, and 200) for both 1 M and 50 M histories. The total memory usage was recorded. Results: Simulations with 50 M histories were fastest with 100 threads, taking approximately 1.3 hours and 6 hours for the water phantom and the CT data, respectively, with better than 1.0% statistical uncertainty. The calculations show 1/N scalability in the event loops for both cases. The gains from parallel calculation started to decrease at 150 threads. The memory usage increases linearly with the number of threads. No critical failures were observed during the simulations. Conclusion: Multithreading in Geant4-MT decreased simulation time in proton dose distribution calculations by a factor of 64 and 54 at a near-optimal 100 threads for the water phantom and the patient data, respectively. Further simulations will be done to determine the efficiency at the optimal thread number. Considering the trend of computer architecture development, utilizing Geant4-MT for radiotherapy simulations is an excellent cost-effective alternative to a distributed batch queuing system. However, because the scalability depends highly on simulation details, i.e., the ratio of the processing time of one event to the waiting time to access the shared event queue, a performance evaluation as described is recommended.
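
The diminishing returns past ~100 threads are consistent with a non-parallelizable fraction of the run (initialization, shared-queue contention) in the sense of Amdahl's law. A sketch (the 1% serial fraction is a made-up illustrative value, not a figure measured in the abstract):

```python
def amdahl_speedup(serial_frac, n_threads):
    """Ideal speedup when a fraction of the run cannot be parallelized."""
    return 1.0 / (serial_frac + (1.0 - serial_frac) / n_threads)

for n in (1, 100, 150, 200):
    print(n, round(amdahl_speedup(0.01, n), 1))  # gains flatten: ~50, ~60, ~67
```

With a 1% serial fraction the speedup is capped at 100 regardless of thread count, which mirrors the observed factor-of-64 and factor-of-54 gains at 100 threads.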

  5. Technical Note: Implementation of biological washout processes within GATE/GEANT4—A Monte Carlo study in the case of carbon therapy treatments

    SciTech Connect

    Martínez-Rovira, I. Jouvie, C.; Jan, S.

    2015-04-15

    Purpose: The imaging of positron emitting isotopes produced during patient irradiation is the only in vivo method used for hadrontherapy dose monitoring in clinics nowadays. However, the accuracy of this method is limited by the loss of signal due to the metabolic decay processes (biological washout). In this work, a generic modeling of washout was incorporated into the GATE simulation platform. Additionally, the influence of the washout on the β+ activity distributions in terms of absolute quantification and spatial distribution was studied. Methods: First, the irradiation of a human head phantom with a 12C beam, so that a homogeneous dose distribution was achieved in the tumor, was simulated. The generated 11C and 15O distribution maps were used as β+ sources in a second simulation, where the PET scanner was modeled following a detailed Monte Carlo approach. The activity distributions obtained in the presence and absence of washout processes for several clinical situations were compared. Results: Results show that activity values are highly reduced (by a factor of 2) in the presence of washout. These processes have a significant influence on the shape of the PET distributions. Differences in the distal activity falloff position of 4 mm are observed for a tumor dose deposition of 1 Gy (T_ini = 0 min). However, in the case of high doses (3 Gy), the washout processes do not have a large effect on the position of the distal activity falloff (differences lower than 1 mm). The important role of the tumor washout parameters on the activity quantification was also evaluated. Conclusions: With this implementation, GATE/GEANT4 is the only open-source code able to simulate the full chain from the hadrontherapy irradiation to the PET dose monitoring including biological effects. Results show the strong impact of the washout processes, indicating that the development of better models and measurement of biological washout data are
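
Biological washout is commonly parameterized as a multi-exponential biological factor multiplying the physical decay (in the spirit of Mizuno-type models). The component fractions and half-lives below are placeholders for illustration, not the values implemented in GATE:

```python
import math

def washout_factor(t_s, components=((0.35, 2.0 * 60),
                                    (0.30, 140.0 * 60),
                                    (0.35, 3175.0 * 60))):
    """Biological washout: sum of (fraction, half-life [s]) exponentials.
    Fractions sum to 1; all values here are placeholder assumptions."""
    return sum(f * math.exp(-math.log(2) * t_s / t_half)
               for f, t_half in components)

def activity(t_s, a0=1.0, phys_half_life_s=20.36 * 60):  # 11C T1/2 ~ 20.4 min
    """Measured activity = physical decay x biological washout."""
    return a0 * math.exp(-math.log(2) * t_s / phys_half_life_s) \
              * washout_factor(t_s)

print(activity(0.0), activity(600.0))  # signal drops faster than physical decay
```

Because the fast component removes a sizeable fraction of the activity within minutes, the measured PET signal falls well below what physical decay alone would predict, consistent with the factor-of-2 reduction reported above.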

  6. GEANT4 Simulation for the Zenith Angle Dependence of Cosmic Muon Intensities at Two Different Geomagnetic Locations

    NASA Astrophysics Data System (ADS)

    Arslan, Halil; Bektasoglu, Mehmet

    2013-06-01

    The zenith angle dependence of the cosmic muon flux at sea level in the western, eastern, southern and northern azimuths has been investigated separately for Calcutta, India and Melbourne, Australia for muon momenta up to 500 GeV/c using the Geant4 simulation package. These two locations were selected because they differ significantly in geomagnetic cutoff rigidity. The exponent n, defined by the relation I(θ) = I(0°)cosⁿθ, was obtained for each azimuth in Calcutta and Melbourne. After establishing agreement between the simulation results and the experimental ones, the simulation study was extended to different azimuth angles and higher muon momenta. It was shown that the angular dependence of the cosmic muon intensity decreases with increasing muon momentum at both locations. Moreover, the exponent becomes independent of both geomagnetic location and azimuth angle for muons with momentum above 10 GeV/c, and it is nearly zero above 50 GeV/c. It can therefore be concluded that cosmic muons with momenta between 50 GeV/c and 500 GeV/c reach sea level almost isotropically.
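
The exponent n in I(θ) = I(0°) cosⁿθ can be extracted from angular intensity data with a log-log linear fit. A sketch on synthetic data (n = 2.15 is a typical sea-level figure used here as an assumption, not a value from the study):

```python
import numpy as np

theta = np.radians([15, 30, 45, 60, 75])    # zenith angles
n_true = 2.15                               # assumed "measured" exponent
I0 = 100.0
I = I0 * np.cos(theta) ** n_true            # synthetic intensities

# log(I / I0) = n * log(cos theta) -> slope of a degree-1 fit is n
n_fit = np.polyfit(np.log(np.cos(theta)), np.log(I / I0), 1)[0]
print(round(n_fit, 3))  # recovers ~2.15
```

In the momentum-resolved study above, the same fit applied per momentum bin would show n shrinking toward zero as the muon momentum grows.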

  7. DETECTORS AND EXPERIMENTAL METHODS: Study of neutron response for two hybrid RPC setups using the GEANT4 MC simulation approach

    NASA Astrophysics Data System (ADS)

    M., Jamil; Rhee T., J.; Jeon J., Y.

    2009-10-01

    The present article describes a detailed neutron simulation study in the energy range 10⁻¹⁰ MeV to 1.0 GeV for two different RPC configurations. The simulation studies were carried out using the GEANT4 MC code. Aluminum was utilized for the GND and readout strips of (a) the Bakelite-based and (b) the glass-based RPCs. For the former type of RPC setup, the neutron sensitivity for the isotropic source was Sn = 2.702 × 10⁻² at En = 1.0 GeV, while for the latter type, the neutron sensitivity for the same source was evaluated as Sn = 4.049 × 10⁻² at En = 1.0 GeV. These results were further compared with the previous RPC configuration in which copper was used for the ground and pickup pads. Additionally, Al was employed at the (GND+strips) of the phosphate glass RPC setup and compared with the copper-based phosphate glass RPC. Good agreement in sensitivity values was obtained between the current and previous simulation results.

  8. Calibration and GEANT4 Simulations of the Phase II Proton Compute Tomography (pCT) Range Stack Detector

    SciTech Connect

    Uzunyan, S. A.; Blazey, G.; Boi, S.; Coutrakon, G.; Dyshkant, A.; Francis, K.; Hedin, D.; Johnson, E.; Kalnins, J.; Zutshi, V.; Ford, R.; Rauch, J. E.; Rubinov, P.; Sellberg, G.; Wilson, P.; Naimuddin, M.

    2015-12-29

    Northern Illinois University in collaboration with Fermi National Accelerator Laboratory (FNAL) and Delhi University has been designing and building a proton CT scanner for applications in proton treatment planning. The Phase II proton CT scanner consists of eight planes of tracking detectors with two X and two Y coordinate measurements both before and after the patient. In addition, a range stack detector consisting of a stack of thin scintillator tiles, arranged in twelve eight-tile frames, is used to determine the water equivalent path length (WEPL) of each track through the patient. The X-Y coordinates and WEPL are required input for image reconstruction software to find the relative (proton) stopping power (RSP) value of each voxel in the patient and generate a corresponding 3D image. In this Note we describe tests conducted in 2015 at the proton beam at the Central DuPage Hospital in Warrenville, IL, focusing on the range stack calibration procedure and comparisons with the GEANT4 range stack simulation.
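
The WEPL determination described above reduces, in its simplest form, to summing water-equivalent thicknesses along the track. A minimal sketch of the definition (the tile thickness and RSP value are illustrative assumptions, not the scanner's specifications):

```python
def wepl_mm(thicknesses_mm, rsp):
    """Water-equivalent path length: sum of thickness x relative stopping power."""
    return sum(t * r for t, r in zip(thicknesses_mm, rsp))

# 12 frames x 8 tiles of an assumed 3.2 mm plastic scintillator with
# an assumed RSP of 1.038 relative to water:
n_tiles = 12 * 8
print(wepl_mm([3.2] * n_tiles, [1.038] * n_tiles))  # ~319 mm water-equivalent
```

In practice the calibration maps the tile in which the proton stops (and the light output) to a WEPL value, which is what the beam tests benchmarked against the GEANT4 simulation.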

  9. Simulation of the 6 MV Elekta Synergy Platform linac photon beam using Geant4 Application for Tomographic Emission

    PubMed Central

    Didi, Samir; Moussa, Abdelilah; Yahya, Tayalati; Mustafa, Zerfaoui

    2015-01-01

    The present work validates the Geant4 Application for Tomographic Emission Monte Carlo software for the simulation of the 6 MV photon beam delivered by the Elekta Synergy Platform medical linear accelerator treatment head. The simulation includes the major components of the linear accelerator (LINAC), with multi-leaf collimator, and a homogeneous water phantom. Calculations were performed for the photon beam with several treatment field sizes ranging from 5 cm × 5 cm to 30 cm × 30 cm at 100 cm distance from the source. The simulation was successfully validated by comparison with experimental distributions. Good agreement between simulations and measurements was observed, with dose differences of about 0.02% and 2.5% for depth doses and lateral dose profiles, respectively. This agreement was also emphasized by the Kolmogorov-Smirnov goodness-of-fit test and by gamma-index comparisons, where more than 99% of the points for all simulations fulfill the quality assurance criteria of 2 mm/2%. PMID:26500399
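
The 2 mm/2% gamma-index criterion used above can be sketched in one dimension. This is a simplified brute-force global-gamma implementation on synthetic profiles, not the evaluation tool used in the paper:

```python
import numpy as np

def gamma_1d(x_mm, d_ref, d_eval, dta_mm=2.0, dd_pct=2.0):
    """Simplified global 1D gamma index (Low et al. style, brute force)."""
    d_norm = dd_pct / 100.0 * d_ref.max()     # global dose criterion
    g = np.empty_like(d_ref)
    for i in range(len(x_mm)):
        dist = (x_mm - x_mm[i]) / dta_mm      # distance-to-agreement term
        dose = (d_eval - d_ref[i]) / d_norm   # dose-difference term
        g[i] = np.sqrt(dist ** 2 + dose ** 2).min()
    return g

x = np.linspace(0, 100, 501)                  # positions (mm)
ref = np.exp(-((x - 50) / 20) ** 2)           # synthetic reference profile
ev = ref * 1.01                               # evaluated profile, 1% offset
g = gamma_1d(x, ref, ev)
print((g <= 1.0).mean() * 100)                # pass rate: 100.0
```

A point passes when its gamma value is at most 1; the paper's ">99% of points" statement is exactly this pass rate computed over the measured and simulated distributions.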

  10. 3D polymer gel dosimetry and Geant4 Monte Carlo characterization of novel needle based X-ray source

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Sozontov, E.; Safronov, V.; Gutman, G.; Strumban, E.; Jiang, Q.; Li, S.

    2010-11-01

    In recent years, there have been a few attempts to develop low-energy x-ray radiation sources as alternatives to the conventional radioisotopes used in brachytherapy. So far, all efforts have centered on designing an interstitial miniaturized x-ray tube. Though direct irradiation of tumors looks very promising, the known insertable miniature x-ray tubes have many limitations: (a) difficulties with focusing and steering the electron beam to the target; (b) the necessity of cooling the target to increase x-ray production efficiency; (c) the impracticability of reducing the diameter of the miniaturized x-ray tube below 4 mm (the requirement to decrease the diameter of the x-ray tube and the need for a target cooling system are mutually exclusive); and (d) significant limitations in changing the shape and energy of the emitted radiation. The specific aim of this study is to demonstrate the feasibility of a new concept for an insertable low-energy needle x-ray device, based on simulation with the Geant4 Monte Carlo code, and to measure the dose rate distribution for low-energy (17.5 keV) x-ray radiation with 3D polymer gel dosimetry.

  11. Space radiation analysis: Radiation effects and particle interaction outside the Earth's magnetosphere using GRAS and GEANT4

    NASA Astrophysics Data System (ADS)

    Martinez, Lisandro M.; Kingston, Jennifer

    2012-03-01

    In order to explore the Moon and Mars it is necessary to investigate the hazards posed by the space environment, especially ionizing radiation. Much information has been presented in previous papers on radiation analysis inside the Earth's magnetosphere, but much of this work is not directly relevant to the interplanetary medium. This work explores the effect of radiation on humans inside structures such as the ISS and provides a detailed analysis of galactic cosmic rays (GCRs) and solar proton events (SPEs) using SPENVIS (Space Environment Effects and Information System) and CREME96 data files for particle flux outside the Earth's magnetosphere. The simulation was conducted using GRAS, a European Space Agency (ESA) software package based on GEANT4. Dose and equivalent dose have been calculated, as well as secondary particle effects and the GCR energy spectrum. The calculated total dose and equivalent dose indicate the risk that space radiation poses to the crew; these values were calculated for two different structures, the ISS and TransHab modules. The final results indicate the amount of radiation the astronauts can be expected to absorb during long-duration interplanetary flights, underscoring the importance of radiation shielding and of proper materials to reduce its effects.
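
Equivalent dose combines absorbed dose with radiation weighting factors, H = Σ_R w_R · D_R. A minimal sketch (the dose values are made up; the w_R values follow ICRP Publication 103, with a single representative neutron value as a simplification):

```python
# ICRP 103 radiation weighting factors (the neutron w_R is actually
# energy-dependent; ~20 near 1 MeV is used here as a simplification).
W_R = {"photon": 1.0, "electron": 1.0, "proton": 2.0,
       "alpha_heavy_ion": 20.0, "neutron_1MeV": 20.0}

def equivalent_dose_sv(absorbed_dose_gy_by_type):
    """H_T = sum over radiation types R of w_R * D_T,R (Sv when D is in Gy)."""
    return sum(W_R[p] * d for p, d in absorbed_dose_gy_by_type.items())

# hypothetical mission doses behind shielding (Gy):
print(equivalent_dose_sv({"photon": 0.05, "proton": 0.10,
                          "alpha_heavy_ion": 0.01}))  # 0.45 Sv
```

The heavy-ion weighting factor of 20 is why GCR heavy ions dominate the equivalent dose even when their absorbed dose contribution is small.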

  12. The investigation of prostatic calcifications using μ-PIXE analysis and their dosimetric effect in low dose rate brachytherapy treatments using Geant4.

    PubMed

    Pope, D J; Cutajar, D L; George, S P; Guatelli, S; Bucci, J A; Enari, K E; Miller, S; Siegele, R; Rosenfeld, A B

    2015-06-01

    Low dose rate brachytherapy is a widely used modality for the treatment of prostate cancer. Most clinical treatment planning systems currently in use approximate all tissue to water, neglecting inhomogeneities such as calcifications. The presence of prostatic calcifications may perturb the dose due to their higher photoelectric effect cross section in comparison to water. This study quantitatively evaluates the effect of prostatic calcifications on the dosimetric outcome of brachytherapy treatments, and its potential clinical consequences, by means of Monte Carlo simulations. Four pathological calcification samples were characterised with micro-particle induced x-ray emission (μ-PIXE) to determine their heavy elemental composition. Calcium, phosphorus and zinc were found to be the predominant heavy elements. Four clinical patient brachytherapy treatments were modelled using Geant4 based Monte Carlo simulations, in terms of the distribution of brachytherapy seeds and calcifications in the prostate. Local dose reductions of up to 30% were observed at the calcification boundary, depending on calcification size. Single large calcifications and closely spaced calculi caused local dose reductions of 30-60%. Individual calculi smaller than 0.5 mm in diameter showed minimal dosimetric impact; however, the effects of small or diffuse calcifications within the prostatic tissue could not be determined with the methods employed in the study. The simulation study showed varying reductions in common dosimetric parameters. D90 showed a reduction of 2-5%, regardless of calcification surface area and volume. The parameters V100, V150 and V200 were also reduced by as much as 3%, and on average by 1%. These reductions were also found to relate to the surface area and volume of the calcifications, which may have a significant dosimetric impact on brachytherapy treatment; however, such impacts depend strongly on specific factors
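The D90 and Vx figures quoted above are standard dose-volume-histogram metrics. A minimal sketch of how they can be computed from a flat list of per-voxel doses; the function names, toy dose values and prescription dose are illustrative, not from the study:

```python
def d_x(voxel_doses, x):
    """Dose received by at least x% of the volume (e.g. x=90 gives D90)."""
    s = sorted(voxel_doses, reverse=True)          # hottest voxels first
    idx = int(round(x / 100.0 * len(s))) - 1
    return s[max(idx, 0)]

def v_x(voxel_doses, prescription, x):
    """Fraction of volume receiving at least x% of the prescription dose."""
    threshold = x / 100.0 * prescription
    return sum(d >= threshold for d in voxel_doses) / len(voxel_doses)

doses = [100, 120, 150, 160, 90, 170, 210, 145, 130, 110]   # Gy, toy values
print(d_x(doses, 90))         # dose covering the hottest 90% of voxels
print(v_x(doses, 145, 100))   # V100 for an assumed 145 Gy prescription
```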

  13. Benchmarking the Geant4 full system simulation of an associated alpha-particle detector for use in a D-T neutron generator.

    PubMed

    Zhang, Xiaodong; Hayward, Jason P; Cates, Joshua W; Hausladen, Paul A; Laubach, Mitchell A; Sparger, Johnathan E; Donnald, Samuel B

    2012-08-01

    The position-sensitive alpha-particle detector used to provide the starting time and initial direction of D-T neutrons in a fast-neutron imaging system was simulated with a Geant4-based Monte Carlo program. The whole detector system, which consists of a YAP:Ce scintillator, a fiber-optic faceplate, a light guide, and a position-sensitive photomultiplier tube (PSPMT), was modeled, starting with incident D-T alphas. The scintillation photons, whose starting times follow the distribution of a scintillation decay curve, were produced and emitted uniformly into a solid angle of 4π along the track segments of the alpha and its secondaries. By tracking all photons and taking into account the quantum efficiency of the photocathode, the number of photoelectrons and their time and position distributions were obtained. Using a four-corner data reconstruction formula, flood images of the alpha detector with and without optical grease between the YAP scintillator and the fiber-optic faceplate were obtained, which agree with the experimental results. The reconstructed position uncertainties of incident alpha particles for the two cases are 1.198 mm and 0.998 mm, respectively, across the sensitive area of the detector. Simulation results also show that, compared with faceplates composed of 500 μm, 300 μm, and 100 μm fibers, the 10-μm-fiber faceplate is the best choice for position performance. In addition, a study of the background originating inside the D-T generator suggests that for 500-μm-thick YAP:Ce coated with 1-μm-thick aluminum, a very good signal-to-noise ratio can be expected through application of a simple threshold.
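The four-corner reconstruction mentioned above is commonly an Anger-logic centroid of the four corner signals of a PSPMT. A hedged sketch, not the authors' exact formula; the corner labelling and axis conventions are assumptions:

```python
def four_corner_position(a, b, c, d):
    """Return (x, y) in [-1, 1] from corner charges, assuming
    A = upper-left, B = upper-right, C = lower-left, D = lower-right."""
    total = a + b + c + d
    x = ((b + d) - (a + c)) / total   # right minus left
    y = ((a + b) - (c + d)) / total   # top minus bottom
    return x, y

print(four_corner_position(1.0, 1.0, 1.0, 1.0))   # centred event -> (0.0, 0.0)
print(four_corner_position(2.0, 2.0, 1.0, 1.0))   # event displaced toward the top
```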

  14. SU-E-T-519: Emission of Secondary Particles From a PMMA Phantom During Proton Irradiation: A Simulation Study with the Geant4 Monte Carlo Toolkit

    SciTech Connect

    Lau, A; Chen, Y; Ahmad, S

    2014-06-01

    Purpose: Proton therapy exhibits several advantages over photon therapy due to the depth-dose distributions of proton interactions within the target material. However, uncertainties associated with the proton beam range in the patient limit the advantage of proton therapy applications. To quantify beam range, positron-emitting nuclei (PEN) and prompt gamma (PG) techniques have been developed. These techniques use de-excitation photons to describe the location of the beam in the patient. To develop a detector system implementing the PG technique for range verification in proton therapy, we studied the yields, energy and angular distributions of the secondary particles emitted from a PMMA phantom. Methods: Proton pencil beams of various energies incident onto a PMMA phantom with dimensions of 5 x 5 x 50 cm3 were simulated with the Geant4 toolkit, using the standard electromagnetic packages as well as the packages based on the binary-cascade nuclear model. The emitted secondary particles were analyzed. Results: For 160 MeV incident protons, the yields of secondary neutrons and photons per 100 incident protons were ~6 and ~15, respectively. The secondary photon energy spectrum showed several energy peaks between 0 and 10 MeV. The peaks located between 4 and 6 MeV were attributed to direct proton interactions with 12C (~4.4 MeV) and 16O (~6 MeV), respectively. Most of the escaping secondary neutrons were found to have energies between 10 and 100 MeV. Isotropic emission was found for lower energy neutrons (<10 MeV) and for photons of all energies, while higher energy neutrons were emitted predominantly in the forward direction. The yields of emitted photons and neutrons increased with increasing incident proton energy. Conclusions: A detector system is currently being developed incorporating the yields, energy and angular distributions of secondary particles from proton interactions obtained from this study.

  15. Analysis of the track- and dose-averaged LET and LET spectra in proton therapy using the geant4 Monte Carlo code

    PubMed Central

    Guan, Fada; Peeler, Christopher; Bronk, Lawrence; Geng, Changran; Taleei, Reza; Randeniya, Sharmalee; Ge, Shuaiping; Mirkovic, Dragan; Grosshans, David; Mohan, Radhe; Titt, Uwe

    2015-01-01

    Purpose: The motivation of this study was to find and eliminate the cause of errors in dose-averaged linear energy transfer (LET) calculations from therapeutic protons in small targets, such as biological cell layers, calculated using the Geant4 Monte Carlo code. Furthermore, the purpose was also to provide a recommendation to select an appropriate LET quantity from Geant4 simulations to correlate with biological effectiveness of therapeutic protons. Methods: The authors developed a particle tracking step based strategy to calculate the average LET quantities (track-averaged LET, LETt and dose-averaged LET, LETd) using Geant4 for different tracking step size limits. A step size limit refers to the maximally allowable tracking step length. The authors investigated how the tracking step size limit influenced the calculated LETt and LETd of protons with six different step limits ranging from 1 to 500 μm in a water phantom irradiated by a 79.7-MeV clinical proton beam. In addition, the authors analyzed the detailed stochastic energy deposition information including fluence spectra and dose spectra of the energy-deposition-per-step of protons. As a reference, the authors also calculated the averaged LET and analyzed the LET spectra combining the Monte Carlo method and the deterministic method. Relative biological effectiveness (RBE) calculations were performed to illustrate the impact of different LET calculation methods on the RBE-weighted dose. Results: Simulation results showed that the step limit effect was small for LETt but significant for LETd. This resulted from differences in the energy-deposition-per-step between the fluence spectra and dose spectra at different depths in the phantom. Using the Monte Carlo particle tracking method in Geant4 can result in incorrect LETd calculation results in the dose plateau region for small step limits. The erroneous LETd results can be attributed to the algorithm to determine fluctuations in energy deposition along the
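The track- and dose-averaged LET estimators discussed above differ only in how the per-step LET values are weighted. A minimal sketch from (energy deposit, step length) pairs such as those scored in a Geant4 stepping action; values and units are illustrative:

```python
def let_track_and_dose(steps):
    """steps: iterable of (energy_deposit_keV, step_length_um) pairs.
    Returns (LET_t, LET_d) in keV/um.
    LET_t weights each step's e/l equally (fluence-like weighting);
    LET_d weights each step's e/l by the energy deposited in that step."""
    lets = [(e / l, e) for e, l in steps]
    let_t = sum(v for v, _ in lets) / len(lets)
    let_d = sum(v * e for v, e in lets) / sum(e for _, e in lets)
    return let_t, let_d

steps = [(1.0, 1.0), (4.0, 1.0)]     # two steps at 1 and 4 keV/um
let_t, let_d = let_track_and_dose(steps)
print(let_t)   # (1 + 4) / 2 = 2.5
print(let_d)   # (1*1 + 4*4) / (1 + 4) = 3.4
```

The example makes the step-limit sensitivity plausible: LETd is dominated by the high-deposit steps, so anything that changes the per-step energy fluctuations moves it far more than LETt.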

  16. Comparison of PHITS, GEANT4, and HIBRAC simulations of depth-dependent yields of β(+)-emitting nuclei during therapeutic particle irradiation to measured data.

    PubMed

    Rohling, Heide; Sihver, Lembit; Priegnitz, Marlen; Enghardt, Wolfgang; Fiedler, Fine

    2013-09-21

    For quality assurance in particle therapy, a non-invasive, in vivo range verification is highly desired. Particle therapy positron-emission-tomography (PT-PET) is, up to now, the only clinically proven method for this purpose. It makes use of the β(+)-activity produced by nuclear fragmentation processes between the therapeutic beam and the irradiated tissue during the irradiation. Since a direct comparison of β(+)-activity and dose is not feasible, a simulation of the expected β(+)-activity distribution is required. For this reason it is essential to have a quantitatively reliable code for simulating the yields of the β(+)-emitting nuclei at every position along the beam path. In this paper, results of the three-dimensional Monte Carlo simulation codes PHITS and GEANT4 and the one-dimensional deterministic simulation code HIBRAC are compared to measurements of the yields of the most abundant β(+)-emitting nuclei for carbon, lithium, helium, and proton beams. In general, PHITS underestimates the yields of positron emitters. GEANT4 delivers the most accurate results overall. HIBRAC and GEANT4 provide comparable results for carbon and proton beams. HIBRAC is considered a good candidate for implementation in clinical routine PT-PET.

  17. New data libraries and physics data management tools

    NASA Astrophysics Data System (ADS)

    Han, M.; Pia, M. G.; Augelli, M.; Hauf, S.; Kim, C. H.; Kuster, M.; Moneta, L.; Quintieri, L.; Saracco, P.; Seo, H.

    2011-12-01

    A number of physics data libraries for Monte Carlo simulation are reviewed. The development of a package for the management of physics data is described: its design, implementation and computational benchmarks. This package improves the data management tools originally developed for Geant4 electromagnetic physics models based on data libraries. The implementation exploits recent evolutions of the C++ libraries appearing in the C++0x draft, which are intended for inclusion in the next C++ ISO Standard. The new tools improve the computational performance of physics data management.

  18. Physical modelling in biomechanics.

    PubMed Central

    Koehl, M A R

    2003-01-01

    Physical models, like mathematical models, are useful tools in biomechanical research. Physical models enable investigators to explore parameter space in a way that is not possible using a comparative approach with living organisms: parameters can be varied one at a time to measure the performance consequences of each, while values and combinations not found in nature can be tested. Experiments using physical models in the laboratory or field can circumvent problems posed by uncooperative or endangered organisms. Physical models also permit some aspects of the biomechanical performance of extinct organisms to be measured. Use of properly scaled physical models allows detailed physical measurements to be made for organisms that are too small or fast to be easily studied directly. The process of physical modelling and the advantages and limitations of this approach are illustrated using examples from our research on hydrodynamic forces on sessile organisms, mechanics of hydraulic skeletons, food capture by zooplankton and odour interception by olfactory antennules. PMID:14561350

  19. Development of a randomized 3D cell model for Monte Carlo microdosimetry simulations

    SciTech Connect

    Douglass, Michael; Bezak, Eva; Penfold, Scott

    2012-06-15

    Purpose: The objective of the current work was to develop an algorithm for growing a macroscopic tumor volume from individual randomized quasi-realistic cells. The major physical and chemical components of the cell need to be modeled. It is intended to import the tumor volume into GEANT4 (and potentially other Monte Carlo packages) to simulate ionization events within the cell regions. Methods: A MATLAB® code was developed to produce a tumor coordinate system consisting of individual ellipsoidal cells randomized in their spatial coordinates, sizes, and rotations. An eigenvalue method using a mathematical equation to represent individual cells was used to detect overlapping cells. GEANT4 code was then developed to import the coordinate system into GEANT4 and populate it with individual cells of varying sizes and composed of the membrane, cytoplasm, reticulum, nucleus, and nucleolus. Each region is composed of chemically realistic materials. Results: The in-house developed MATLAB® code was able to grow semi-realistic cell distributions (~2 × 10⁸ cells in 1 cm³) in under 36 h. The cell distribution can be used in any number of Monte Carlo particle tracking toolkits including GEANT4, which has been demonstrated in this work. Conclusions: Using the cell distribution and GEANT4, the authors were able to simulate ionization events in the individual cell components resulting from 80 keV gamma radiation (the code is applicable to other particles and a wide range of energies). This virtual microdosimetry tool will allow for a more complete picture of cell damage to be developed.
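The cell-placement idea described above can be sketched with a rejection-sampling loop. This simplification reduces the study's randomized ellipsoids (with their eigenvalue overlap test) to spheres, so that overlap becomes a simple centre-distance check; all parameters are illustrative:

```python
import math
import random

def grow_distribution(n_cells, box, radius_range, max_tries=10000, seed=1):
    """Place up to n_cells non-overlapping spheres inside a cubic box.
    Each cell is (centre_xyz, radius); new cells that overlap an existing
    one are rejected and resampled."""
    random.seed(seed)
    cells = []
    tries = 0
    while len(cells) < n_cells and tries < max_tries:
        tries += 1
        r = random.uniform(*radius_range)
        c = tuple(random.uniform(r, box - r) for _ in range(3))   # stay inside box
        if all(math.dist(c, c2) >= r + r2 for c2, r2 in cells):
            cells.append((c, r))
    return cells

cells = grow_distribution(50, box=100.0, radius_range=(2.0, 5.0))
print(len(cells))   # number of cells successfully placed
```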

  20. The investigation of prostatic calcifications using μ-PIXE analysis and their dosimetric effect in low dose rate brachytherapy treatments using Geant4

    NASA Astrophysics Data System (ADS)

    Pope, D. J.; Cutajar, D. L.; George, S. P.; Guatelli, S.; Bucci, J. A.; Enari, K. E.; Miller, S.; Siegele, R.; Rosenfeld, A. B.

    2015-06-01

    Low dose rate brachytherapy is a widely used modality for the treatment of prostate cancer. Most clinical treatment planning systems currently in use approximate all tissue to water, neglecting inhomogeneities such as calcifications. The presence of prostatic calcifications may perturb the dose due to their higher photoelectric effect cross section in comparison to water. This study quantitatively evaluates the effect of prostatic calcifications on the dosimetric outcome of brachytherapy treatments, and its potential clinical consequences, by means of Monte Carlo simulations. Four pathological calcification samples were characterised with micro-particle induced x-ray emission (μ-PIXE) to determine their heavy elemental composition. Calcium, phosphorus and zinc were found to be the predominant heavy elements. Four clinical patient brachytherapy treatments were modelled using Geant4 based Monte Carlo simulations, in terms of the distribution of brachytherapy seeds and calcifications in the prostate. Local dose reductions of up to 30% were observed at the calcification boundary, depending on calcification size. Single large calcifications and closely spaced calculi caused local dose reductions of 30-60%. Individual calculi smaller than 0.5 mm in diameter showed minimal dosimetric impact; however, the effects of small or diffuse calcifications within the prostatic tissue could not be determined with the methods employed in the study. The simulation study showed varying reductions in common dosimetric parameters. D90 showed a reduction of 2-5%, regardless of calcification surface area and volume. The parameters V100, V150 and V200 were also reduced by as much as 3%, and on average by 1%. These reductions were also found to relate to the surface area and volume of the calcifications, which may have a significant dosimetric impact on brachytherapy treatment; however, such impacts depend strongly on specific factors

  2. Analysis of the track- and dose-averaged LET and LET spectra in proton therapy using the GEANT4 Monte Carlo code

    SciTech Connect

    Guan, Fada; Peeler, Christopher; Taleei, Reza; Randeniya, Sharmalee; Ge, Shuaiping; Mirkovic, Dragan; Mohan, Radhe; Titt, Uwe; Bronk, Lawrence; Geng, Changran; Grosshans, David

    2015-11-15

    Purpose: The motivation of this study was to find and eliminate the cause of errors in dose-averaged linear energy transfer (LET) calculations from therapeutic protons in small targets, such as biological cell layers, calculated using the GEANT4 Monte Carlo code. Furthermore, the purpose was also to provide a recommendation for selecting an appropriate LET quantity from GEANT4 simulations to correlate with the biological effectiveness of therapeutic protons. Methods: The authors developed a particle tracking step based strategy to calculate the average LET quantities (track-averaged LET, LETt, and dose-averaged LET, LETd) using GEANT4 for different tracking step size limits. A step size limit refers to the maximally allowable tracking step length. The authors investigated how the tracking step size limit influenced the calculated LETt and LETd of protons with six different step limits ranging from 1 to 500 μm in a water phantom irradiated by a 79.7-MeV clinical proton beam. In addition, the authors analyzed the detailed stochastic energy deposition information including fluence spectra and dose spectra of the energy-deposition-per-step of protons. As a reference, the authors also calculated the averaged LET and analyzed the LET spectra combining the Monte Carlo method and the deterministic method. Relative biological effectiveness (RBE) calculations were performed to illustrate the impact of different LET calculation methods on the RBE-weighted dose. Results: Simulation results showed that the step limit effect was small for LETt but significant for LETd. This resulted from differences in the energy-deposition-per-step between the fluence spectra and dose spectra at different depths in the phantom. Using the Monte Carlo particle tracking method in GEANT4 can result in incorrect LETd calculation results in the dose plateau region for small step limits. The erroneous LETd results can be attributed to the algorithm to

  3. The Cryogenic AntiCoincidence Detector for the ATHENA X-IFU: Design Aspects by Geant4 Simulation and Preliminary Characterization of the New Single Pixel

    NASA Astrophysics Data System (ADS)

    Macculi, C.; Argan, A.; D'Andrea, M.; Lotti, S.; Piro, L.; Biasotti, M.; Corsini, D.; Gatti, F.; Orlando, A.; Torrioli, G.

    2016-08-01

    The ATHENA observatory is the second large-class ESA mission, in the context of the Cosmic Vision 2015-2025 programme, scheduled for launch in 2028 to an orbit around L2. One of the two planned focal plane instruments is the X-ray Integral Field Unit (X-IFU), which will be able to perform simultaneous high-grade energy spectroscopy and imaging over a 5 arcmin FoV by means of a kilo-pixel array of transition-edge sensor (TES) microcalorimeters coupled to high-quality X-ray optics. The X-IFU sensitivity is degraded by the particle background, induced by primary protons of both solar and cosmic-ray origin and by secondary electrons. A Cryogenic AntiCoincidence (CryoAC) TES-based detector, located <1 mm below the TES array, will allow the mission to reach the background level that enables its scientific goals. The CryoAC is a 4-pixel detector made of silicon absorbers sensed by iridium TESs. We have currently achieved TRL = 3-4 at the single-pixel level, and have designed and developed two further prototypes in order to reach TRL = 4. The design of the CryoAC has also been optimized using the Geant4 simulation tool. Here we describe some results from the Geant4 simulations performed to optimize the design, as well as preliminary test results from the first of the two detectors, with 1 cm² of area and made of 65 Ir TESs.

  4. Geant4 simulation of zinc oxide nanowires in anodized aluminum oxide template as a low energy X-ray scintillator detector

    NASA Astrophysics Data System (ADS)

    Taheri, Ali; Saramad, Shahyar; Setayeshi, Saeed

    2013-02-01

    In this work, ZnO nanowires in an anodized aluminum oxide nanoporous template are proposed as an architecture for the development of a new generation of scintillator-based X-ray imagers. The optical response of crystalline, ordered ZnO nanowire arrays in a porous anodized aluminum oxide template under 20 keV X-ray illumination is simulated using the Geant4 Monte Carlo code. The results show that the anodized aluminum oxide template acts as a light guide, channeling the X-ray-induced optical photons along the detector thickness and reducing light scattering in the detector volume. This inexpensive and effective method can significantly improve the spatial resolution of scintillator-based X-ray imagers, especially in medical applications.

  5. Simulation, optimization and testing of a novel high spatial resolution X-ray imager based on Zinc Oxide nanowires in Anodic Aluminium Oxide membrane using Geant4

    NASA Astrophysics Data System (ADS)

    Esfandi, F.; Saramad, S.

    2015-07-01

    In this work, a new generation of scintillator-based X-ray imagers based on ZnO nanowires in an Anodized Aluminum Oxide (AAO) nanoporous template is characterized. The optical response of ordered ZnO nanowire arrays in a porous AAO template under low-energy X-ray illumination is simulated with the Geant4 Monte Carlo code and compared with experimental results. The results show that for 10 keV X-ray photons, by considering the light-guiding properties of zinc oxide inside the AAO template and a suitable selection of detector thickness and pore diameter, a spatial resolution of less than one micrometer and a detection efficiency of 66% are achievable. This novel nano-scintillator detector can offer many advantages for medical applications in the future.
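To first order, the detection-efficiency figure quoted above is a Beer-Lambert absorption fraction over the scintillator thickness. A back-of-envelope sketch; the attenuation coefficient below is an assumed, order-of-magnitude value for illustration, not a tabulated one:

```python
import math

def absorbed_fraction(mu_per_um, thickness_um):
    """Beer-Lambert absorbed fraction 1 - exp(-mu * t)."""
    return 1.0 - math.exp(-mu_per_um * thickness_um)

MU_10KEV = 0.011   # 1/um, assumed illustrative linear attenuation coefficient
for t_um in (25, 50, 100):   # candidate detector thicknesses, um
    print(t_um, round(absorbed_fraction(MU_10KEV, t_um), 3))
```

With the assumed coefficient, a 100 μm thick layer absorbs about two-thirds of the incident 10 keV photons, illustrating why detector thickness trades off directly against efficiency.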

  6. A polygon-surface reference Korean male phantom (PSRK-Man) and its direct implementation in Geant4 Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Kim, Chan Hyeong; Jeong, Jong Hwi; Bolch, Wesley E.; Cho, Kun-Woo; Hwang, Sung Bae

    2011-05-01

    Even though the hybrid phantom embodies both the anatomic reality of voxel phantoms and the deformability of stylized phantoms, it must be voxelized before it can be used in a Monte Carlo code for dose calculation or imaging simulation, which incurs the inherent limitations of voxel phantoms. In the present study, a voxel phantom named VKH-Man (Visible Korean Human-Man) was converted to a polygon-surface phantom (PSRK-Man, Polygon-Surface Reference Korean-Man), which was then adjusted to the Reference Korean data. Subsequently, the PSRK-Man polygon phantom was implemented directly, without any voxelization process, in the Geant4 Monte Carlo code for dose calculations. The calculated dose values and computation time were then compared with those of HDRK-Man (High Definition Reference Korean-Man), a corresponding voxel phantom adjusted to the same Reference Korean data from the same VKH-Man voxel phantom. Our results showed that the calculated dose values of the PSRK-Man surface phantom agreed well with those of the HDRK-Man voxel phantom. The calculation speed for the PSRK-Man polygon phantom, however, was 70-150 times slower than that of the HDRK-Man voxel phantom; that speed could still be acceptable in some applications, given that direct use of the surface phantom in Geant4 does not require a separate voxelization process. Computing speed could be enhanced in the future either by optimizing the Monte Carlo transport kernel for polygon surfaces or by using modern computing technologies such as grid computing and general-purpose computing on graphics processing units (GPGPU).

  7. Beyond Standard Model Physics

    SciTech Connect

    Bellantoni, L.

    2009-11-01

    There are many recent results from searches for fundamental new physics using the TeVatron, the SLAC b-factory and HERA. This talk quickly reviewed searches for pair-produced stop, for gauge-mediated SUSY breaking, for Higgs bosons in the MSSM and NMSSM models, for leptoquarks, and v-hadrons. There is a SUSY model which accommodates the recent astrophysical experimental results that suggest that dark matter annihilation is occurring in the center of our galaxy, and a relevant experimental result. Finally, model-independent searches at D0, CDF, and H1 are discussed.

  8. Ionospheric irregularity physics modelling

    SciTech Connect

    Ossakow, S.L.; Keskinen, M.J.; Zalesak, S.T.

    1982-01-01

    Theoretical and numerical simulation techniques have been employed to study ionospheric F region plasma cloud striation phenomena, equatorial spread F phenomena, and high latitude diffuse auroral F region irregularity phenomena. Each of these phenomena can cause scintillation effects. The results and ideas from these studies are state-of-the-art, agree well with experimental observations, and have induced experimentalists to look for theoretically predicted results. One conclusion that can be drawn from these studies is that ionospheric irregularity phenomena can be modelled from a first principles physics point of view. Theoretical and numerical simulation results from the aforementioned ionospheric irregularity areas will be presented.

  9. Simulating cosmic radiation absorption and secondary particle production of solar panel layers of Low Earth Orbit (LEO) satellite with GEANT4

    NASA Astrophysics Data System (ADS)

    Yiğitoğlu, Merve; Veske, Doğa; Nilüfer Öztürk, Zeynep; Bilge Demirköz, Melahat

    2016-07-01

    All devices operating in space are exposed to cosmic rays. The resulting radiation may cause fatal damage to the solid-state structure of devices, so the absorbed radiation dose and secondary particle production should be calculated carefully for each component before production. Solar panels are semiconductor solid-state devices and are very sensitive to radiation: even a short-term power cut-off may lead to total failure of the satellite, and even small doses of radiation can change the characteristics of solar cells. This deviation can be caused by rarer, highly energetic particles as well as by the total ionizing dose from the abundant low-energy particles. In this study, the solar panels planned for a specific LEO satellite, IMECE, are analyzed layer by layer. The Space Environment Information System (SPENVIS) database and the GEANT4 simulation software are used to simulate the layers of the panels. The results obtained from the simulation will be taken into account to determine the amount of radiation protection and resistance needed for the panels, or to revise their design.

  10. Physical Models of Cognition

    NASA Technical Reports Server (NTRS)

    Zak, Michail

    1994-01-01

    This paper presents and discusses physical models for simulating some aspects of neural intelligence and, in particular, the process of cognition. The main departure from the classical approach here is the utilization of a terminal version of classical dynamics introduced by the author earlier. Based upon violations of the Lipschitz condition at equilibrium points, terminal dynamics attains two new fundamental properties: it is spontaneous and nondeterministic. Special attention is focused on terminal neurodynamics as a particular architecture of terminal dynamics suitable for modeling information flows. Terminal neurodynamics possesses a well-organized probabilistic structure which can be analytically predicted, prescribed, and controlled, and which therefore presents a powerful tool for modeling real-life uncertainties. Two basic phenomena associated with the random behavior of neurodynamic solutions are exploited. The first is the stochastic attractor: a stable, stationary stochastic process to which the random solutions of a closed system converge. As a model of the cognition process, a stochastic attractor can be viewed as a universal tool for generalization and the formation of classes of patterns. The concept of the stochastic attractor is applied to model a collective brain paradigm explaining coordination between simple units of intelligence which perform a collective task without direct exchange of information. The second fundamental phenomenon discussed is terminal chaos, which occurs in open systems. Applications of terminal chaos to information fusion, as well as to the explanation and modeling of coordination among neurons in biological systems, are discussed. It should be emphasized that all the models of terminal neurodynamics are implementable in analog devices, which means that all the cognition processes discussed in the paper are reducible to the laws of Newtonian mechanics.

  11. MODELING PHYSICAL HABITAT PARAMETERS

    EPA Science Inventory

    Salmonid populations can be affected by alterations in stream physical habitat. Fish productivity is determined by the stream's physical habitat structure (channel form, substrate distribution, riparian vegetation), water quality, flow regime and inputs from the watershed (sedim...

  12. Evolutionary Industrial Physical Model Generation

    NASA Astrophysics Data System (ADS)

    Carrascal, Alberto; Alberdi, Amaia

    Both the complexity of and lack of knowledge about physical processes make physical model design an arduous task. Frequently, the only available information about a physical process is heuristic data obtained from experiments or, at best, a rough idea of the physical principles and laws that underlie it. The problem then becomes finding a mathematical expression which fits the data. Traditional approaches exist for tackling the inductive model search from data, such as regression, interpolation, the finite element method, etc. Nevertheless, these methods either are only able to solve a reduced number of simple model typologies, or yield black-box solutions that do not help clarify the analyzed physical process. In this paper a hybrid evolutionary approach to search for complex physical models is proposed. Tests carried out on a real-world industrial physical process (abrasive water jet machining) demonstrate the validity of this approach.
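    The inductive model-search idea above can be illustrated with a toy sketch; the random-mutation hill climb, the quadratic model family, and the synthetic data are all invented for illustration and are not the hybrid evolutionary method of the paper:

```python
import random

# Toy stand-in for evolutionary model search: random-mutation hill climbing
# of the coefficients of y = a*x^2 + b*x against synthetic "process" data.
random.seed(0)
data = [(x, 3.0 * x * x - 2.0 * x) for x in range(-5, 6)]

def error(a, b):
    """Sum of squared residuals of the candidate model over the data."""
    return sum((a * x * x + b * x - y) ** 2 for x, y in data)

best = (0.0, 0.0)
for _ in range(5000):
    # Mutate the current best candidate and keep it only if it fits better.
    cand = (best[0] + random.gauss(0, 0.1), best[1] + random.gauss(0, 0.1))
    if error(*cand) < error(*best):
        best = cand

print(f"fitted a = {best[0]:.2f}, b = {best[1]:.2f}")
```

A real evolutionary approach would maintain a population and evolve the model structure itself, not just the coefficients; this sketch shows only the mutate-and-select loop.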

  13. SU-E-T-290: Secondary Dose Monitoring Using Scintillating Fibers in Proton Therapy of Prostate Cancer: A Geant4 Monte Carlo Simulation

    SciTech Connect

    Tesfamicael, B; Gueye, P; Lyons, D; Avery, S; Mahesh, M

    2014-06-01

    Purpose: To monitor the secondary dose distribution originating from a water phantom during proton therapy of prostate cancer using scintillating fibers. Methods: The Geant4 Monte Carlo toolkit version 9.6.p02 was used to simulate proton-therapy-based treatments of prostate cancer. Two cases were studied. In the first case, 8 × 8 = 64 equally spaced fibers inside three 4 × 4 × 2.54 cm{sup 3} DuPont™ Delrin blocks were used to monitor the emission of secondary particles in the transverse (left and right) and distal regions relative to the beam direction. In the second case, a scintillating block with a thickness of 2.54 cm and the same vertical and longitudinal dimensions as the water phantom was used. Geometrical cuts were used to extract the energy deposited in each fiber and in the scintillating block. Results: The transverse dose distributions from secondary particles in the two cases agree to within 5%, with very good symmetry. The deposited energy gradually increases as one moves from the peripheral rows of fibers towards the center of the block (aligned with the center of the prostate), and decreases as one moves from the frontal to the distal region of the block. The ratio of the dose to the prostate to the dose in the middle two rows of fibers showed a linear relationship with a slope of (−3.55 ± 2.26) × 10{sup −5} MeV per treatment Gy. The distal detectors recorded very little deposited energy, due to attenuation in the water. Conclusion: With good calibration and a well-defined correlation between the dose to the external fibers and the dose to the prostate, such fibers can be used for real-time dose verification of the target.
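    The linear dose calibration described above can be sketched as a least-squares fit; the noise level and dose range below are illustrative, with only the order of magnitude of the slope borrowed from the abstract:

```python
import numpy as np

# Hypothetical calibration sketch: relate the energy deposited in the middle
# fiber rows to the treatment dose with a linear least-squares fit.
rng = np.random.default_rng(0)
treatment_dose_gy = np.linspace(1.0, 2.0, 20)   # assumed per-fraction doses
true_slope = 3.55e-5                            # MeV per treatment Gy (illustrative)
fiber_signal_mev = true_slope * treatment_dose_gy + rng.normal(0, 1e-7, 20)

slope, intercept = np.polyfit(treatment_dose_gy, fiber_signal_mev, 1)
print(f"fitted slope: {slope:.2e} MeV/Gy")
```

Once such a slope is established, an external fiber reading can be inverted to verify the dose delivered to the target in real time.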

  14. GEANT4 simulation of a scintillating-fibre tracker for the cosmic-ray muon tomography of legacy nuclear waste containers

    NASA Astrophysics Data System (ADS)

    Clarkson, A.; Hamilton, D. J.; Hoek, M.; Ireland, D. G.; Johnstone, J. R.; Kaiser, R.; Keri, T.; Lumsden, S.; Mahon, D. F.; McKinnon, B.; Murray, M.; Nutbeam-Tuffs, S.; Shearer, C.; Staines, C.; Yang, G.; Zimmerman, C.

    2014-05-01

    Cosmic-ray muons are highly penetrating charged particles that are observed at sea level with a flux of approximately one per square centimetre per minute. They interact with matter primarily through Coulomb scattering, which is exploited in the field of muon tomography to image shielded objects in a wide range of applications. In this paper, simulation studies are presented that assess the feasibility of a scintillating-fibre tracker system for use in the identification and characterisation of nuclear materials stored within industrial legacy waste containers. A system consisting of a pair of tracking modules above and a pair below the volume to be assayed is simulated within the GEANT4 framework using a range of potential fibre pitches and module separations. Each module comprises two orthogonal planes of fibres that allow the reconstruction of the initial and Coulomb-scattered muon trajectories. A likelihood-based image reconstruction algorithm has been developed that allows the container content to be determined with respect to the scattering density λ, a parameter which is related to the atomic number Z of the scattering material. Images reconstructed from this simulation are presented for a range of anticipated scenarios that highlight the expected image resolution and the potential of this system for the identification of high-Z materials within a shielded, concrete-filled container. First results from a constructed prototype system are presented in comparison with those from a detailed simulation. Excellent agreement between experimental data and simulation is observed, showing clear discrimination between the different materials assayed throughout.
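    The scattering-density parameter λ can be illustrated with a simplified Highland-formula estimate (the logarithmic correction term is omitted); the radiation lengths, muon momentum, and 10 cm depth below are textbook/assumed values, not taken from the simulation:

```python
import math

# Illustrative scattering-density estimate: the RMS multiple-scattering angle
# from the (simplified) Highland formula, squared and normalised per unit depth.
P_MEV = 3000.0      # assumed muon momentum, MeV/c (typical cosmic-ray muon)
BETA = 1.0          # relativistic muons, beta ~ 1
DEPTH_CM = 10.0     # assumed traversed thickness

radiation_length_cm = {"concrete": 11.55, "iron": 1.757,
                       "lead": 0.5612, "uranium": 0.3166}

def scattering_density(x0_cm, depth_cm=DEPTH_CM):
    theta0 = (13.6 / (BETA * P_MEV)) * math.sqrt(depth_cm / x0_cm)  # radians
    return (theta0 * 1e3) ** 2 / depth_cm                           # mrad^2 / cm

for material, x0 in radiation_length_cm.items():
    print(f"{material:9s} lambda = {scattering_density(x0):7.2f} mrad^2/cm")
```

High-Z materials have short radiation lengths and therefore much larger λ, which is what makes the discrimination of uranium from concrete possible in the reconstructed images.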

  15. Building Mental Models by Dissecting Physical Models

    ERIC Educational Resources Information Center

    Srivastava, Anveshna

    2016-01-01

    When students build physical models from prefabricated components to learn about model systems, there is an implicit trade-off between the physical degrees of freedom in building the model and the intensity of instructor supervision needed. Models that are too flexible, permitting multiple possible constructions, require greater supervision to…

  16. Physical Modeling of the Piano

    NASA Astrophysics Data System (ADS)

    Giordano, N.; Jiang, M.

    2004-12-01

    A project aimed at constructing a physical model of the piano is described. Our goal is to calculate the sound produced by the instrument entirely from Newton's laws. The structure of the model is described along with experiments that augment and test the model calculations. The state of the model and what can be learned from it are discussed.

  17. TH-A-19A-05: Modeling Physics Properties and Biologic Effects Induced by Proton and Helium Ions

    SciTech Connect

    Taleei, R; Titt, U; Peeler, C; Guan, F; Mirkovic, D; Grosshans, D; Mohan, R

    2014-06-15

    Purpose: Currently, protons and carbon ions are used for cancer treatment. More recently, other light ions, including helium ions, have shown interesting physical and biological properties. The purpose of this work is to study the biological and physical properties of helium ions (He-3) in comparison to protons. Methods: Monte Carlo simulations with FLUKA, GEANT4 and MCNPX were used to calculate proton and He-3 dose distributions in water phantoms. The energy spectra of proton and He-3 beams were calculated with high resolution for use in biological models. The repair-misrepair-fixation (RMF) model was subsequently used to calculate the RBE. Results: The proton Bragg curve calculations show good agreement between the three general-purpose Monte Carlo codes. In contrast, the He-3 Bragg curve calculations show disagreement in the magnitude of the Bragg peak between FLUKA and the other two Monte Carlo codes. The differences in the magnitude of the Bragg peak are mainly due to the discrepancy in the secondary fragmentation cross sections used by the codes. The RBE for V79 cell lines is about 0.96 and 0.98 at the entrance of the proton and He-3 depth-dose curves, respectively. The RBE increases to 1.06 and 1.59 at the Bragg peak for protons and He-3 ions, respectively. The results demonstrated that LET, microdosimetric parameters (such as dose-mean lineal energy) and RBE are nearly constant along the plateau region of the Bragg curve, while all parameters increase within the Bragg peak and at the distal edge for both protons and He-3 ions. Conclusion: The Monte Carlo codes should revise their fragmentation cross sections to more accurately simulate the physical properties of He-3 ions. The increase in RBE at the Bragg peak is higher for He-3 ions than for proton beams.

  18. Physical Modeling of Microtubules Network

    NASA Astrophysics Data System (ADS)

    Allain, Pierre; Kervrann, Charles

    2014-10-01

    Microtubules (MT) are highly dynamic tubulin polymers that are involved in many cellular processes such as mitosis, intracellular cell organization and vesicular transport. Nevertheless, the modeling of cytoskeleton and MT dynamics based on physical properties is difficult to achieve. Using the Euler-Bernoulli beam theory, we propose to model the rigidity of microtubules on a physical basis using forces, mass and acceleration. In addition, we link microtubules growth and shrinkage to the presence of molecules (e.g. GTP-tubulin) in the cytosol. The overall model enables linking cytosol to microtubules dynamics in a constant state space thus allowing usage of data assimilation techniques.
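    The Euler-Bernoulli description of microtubule rigidity can be illustrated with the classic cantilever result δ = F·L³/(3EI); the flexural rigidity and load below are order-of-magnitude assumptions for illustration, not values from the paper:

```python
# Euler-Bernoulli estimate of microtubule stiffness: tip deflection of a
# cantilevered filament under a point load, delta = F * L^3 / (3 * EI).
EI = 2.0e-23    # N m^2, assumed microtubule flexural rigidity (order of magnitude)
L = 10e-6       # m, assumed 10-micron filament length
F = 0.1e-12     # N, assumed 0.1 pN tip force

delta = F * L**3 / (3 * EI)
print(f"tip deflection: {delta * 1e6:.2f} um")
```

Even a sub-piconewton force bends a 10 µm filament by over a micron, which is why a dynamic beam model (rather than a rigid-rod model) is needed to capture microtubule mechanics; note the formula is only valid in the small-deflection regime.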

  19. Physics beyond the standard model

    SciTech Connect

    Womersley, J.

    2000-01-24

    The author briefly summarizes the prospects for extending the understanding of physics beyond the standard model within the next five years. He interprets ``beyond the standard model'' to mean the physics of electroweak symmetry breaking, including the standard model Higgs boson. The nature of this TeV-scale new physics is perhaps the most crucial question facing high-energy physics, but one should recall (neutrino oscillations) that there is ample evidence for interesting physics in the flavour sector too. In the next five years, before the LHC starts operations, the facilities available will be LEP2, HERA and the Fermilab Tevatron. He devotes a bit more time to the Tevatron as it is a new initiative for United Kingdom institutions. The Tevatron schedule now calls for data taking in Run II, using two upgraded detectors, to begin on March 1, 2001, with 2 fb{sup {minus}1} accumulated in the first two years. A nine-month shutdown will follow, to allow new silicon detector layers to be installed, and then running will resume with a goal of accumulating 15 fb{sup {minus}1} (or more) by 2006.

  20. Standard Model of Particle Physics--a health physics perspective.

    PubMed

    Bevelacqua, J J

    2010-11-01

    The Standard Model of Particle Physics is reviewed with an emphasis on its relationship to the physics supporting the health physics profession. Concepts important to health physics are emphasized and specific applications are presented. The capability of the Standard Model to provide health physics relevant information is illustrated with application of conservation laws to neutron and muon decay and in the calculation of the neutron mean lifetime.
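    As an example of the conservation-law applications mentioned above, energy conservation applied to free-neutron beta decay, n → p + e⁻ + ν̄, gives the decay Q-value directly from the rest masses (rounded PDG values):

```python
# Q-value of free-neutron beta decay from rest masses (MeV/c^2, rounded):
# energy conservation fixes the total kinetic energy shared by the products.
M_N = 939.565   # neutron
M_P = 938.272   # proton
M_E = 0.511     # electron

q_value = M_N - M_P - M_E
print(f"Q = {q_value:.3f} MeV")   # ~0.782 MeV, shared by electron and antineutrino
```

This is the kind of back-of-the-envelope Standard Model calculation that feeds directly into health physics quantities such as beta end-point energies.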

  1. Building mental models by dissecting physical models.

    PubMed

    Srivastava, Anveshna

    2016-01-01

    When students build physical models from prefabricated components to learn about model systems, there is an implicit trade-off between the physical degrees of freedom in building the model and the intensity of instructor supervision needed. Models that are too flexible, permitting multiple possible constructions, require greater supervision to ensure focused learning; models that are too constrained require less supervision, but can be constructed mechanically, with little to no conceptual engagement. We propose "model-dissection" as an alternative to "model-building," whereby instructors could make efficient use of supervisory resources, while simultaneously promoting focused learning. We report empirical results from a study conducted with biology undergraduate students, where we demonstrate that asking them to "dissect" out specific conceptual structures from an already built 3D physical model leads to a significantly greater improvement in performance than asking them to build the 3D model from simpler components. Using questionnaires to measure understanding both before and after model-based interventions for two cohorts of students, we find that both the "builders" and the "dissectors" improve in the post-test, but it is the latter group who show statistically significant improvement. These results, in addition to the intrinsic time-efficiency of "model dissection," suggest that it could be a valuable pedagogical tool. PMID:26712513

  2. Accelerator physics and modeling: Proceedings

    SciTech Connect

    Parsa, Z.

    1991-01-01

    This report contains papers on the following topics: Physics of high brightness beams; radio frequency beam conditioner for fast-wave free-electron generators of coherent radiation; wake-field and space-charge effects on high brightness beams. Calculations and measured results for BNL-ATF; non-linear orbit theory and accelerator design; general problems of modeling for accelerators; development and application of dispersive soft ferrite models for time-domain simulation; and bunch lengthening in the SLC damping rings.

  3. Accelerator physics and modeling: Proceedings

    SciTech Connect

    Parsa, Z.

    1991-12-31

    This report contains papers on the following topics: Physics of high brightness beams; radio frequency beam conditioner for fast-wave free-electron generators of coherent radiation; wake-field and space-charge effects on high brightness beams. Calculations and measured results for BNL-ATF; non-linear orbit theory and accelerator design; general problems of modeling for accelerators; development and application of dispersive soft ferrite models for time-domain simulation; and bunch lengthening in the SLC damping rings.

  4. Physical and mathematical cochlear models

    NASA Astrophysics Data System (ADS)

    Lim, Kian-Meng

    2000-10-01

    The cochlea is an intricate organ in the inner ear responsible for our hearing. Besides acting as a transducer to convert mechanical sound vibrations to electrical neural signals, the cochlea also amplifies and separates the sound signal into its spectral components for further processing in the brain. It operates over a broad band of frequencies and a huge dynamic range of input while maintaining a low power consumption. The present research takes the approach of building cochlear models to study and understand the underlying mechanics involved in the functioning of the cochlea. Both physical and mathematical models of the cochlea are constructed. The physical model is a first attempt to build a life-sized replica of the human cochlea using advanced micro-machining techniques. The model takes a modular design, with a removable silicon-wafer based partition membrane encapsulated in a plastic fluid chamber. Preliminary measurements in the model are obtained and they compare roughly with simulation results. Parametric studies on the design parameters of the model lead to an improved design of the model. The studies also revealed that the width and orthotropy of the basilar membrane in the cochlea have significant effects on the sharply tuned responses observed in the biological cochlea. The mathematical model is a physiologically based model that includes three-dimensional viscous fluid flow and a tapered partition with variable properties along its length. A hybrid asymptotic and numerical method provides a uniformly valid and efficient solution to the short and long wave regions in the model. Both linear and non-linear activity are included in the model to simulate the active cochlea. The mathematical model has successfully reproduced many features of the response in the biological cochlea, as observed in experimental measurements performed on animals. These features include sharply tuned frequency responses, significant amplification with inclusion of activity

  5. Physical modeling of the piano

    NASA Astrophysics Data System (ADS)

    Giordano, N.; Jiang, M.

    2003-10-01

    Over the past several years, this project has been aimed at constructing a physical model of the piano. The goal is to use Newton's laws to describe the motion of the hammers, strings, soundboard, and surrounding air, and thereby calculate the sound produced by the instrument entirely from first principles. The structure of the model is described, along with experiments that have provided essential tests and guidance to the calculations. The state of the model and, especially, how this work can lead to new insights and understanding into the piano are discussed. In many cases the work and the specific questions addressed along the way have followed paths initially inspired and developed by Gabriel Weinreich. [Work supported by NSF.]
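    The first-principles approach described here starts from the wave equation for a string; a minimal finite-difference sketch of that core step (with assumed unit parameters, and without the hammer, stiffness, or damping terms of the full model) is:

```python
import numpy as np

# Minimal finite-difference sketch of the core of a string physical model:
# the 1-D wave equation y_tt = c^2 * y_xx on a fixed-end string. A full piano
# model adds hammer forcing, string stiffness, damping, and soundboard coupling.
N, STEPS = 100, 200
c, dx = 1.0, 1.0 / N
dt = 0.5 * dx / c                 # CFL-stable time step
r2 = (c * dt / dx) ** 2

x = np.linspace(0.0, 1.0, N + 1)
y = np.sin(np.pi * x)             # initial displacement: fundamental mode
y_prev = y.copy()                 # zero initial velocity (first-order start)

for _ in range(STEPS):
    y_next = np.zeros_like(y)     # endpoints stay clamped at zero
    y_next[1:-1] = (2 * y[1:-1] - y_prev[1:-1]
                    + r2 * (y[2:] - 2 * y[1:-1] + y[:-2]))
    y_prev, y = y, y_next

print(f"max displacement after {STEPS} steps: {np.abs(y).max():.3f}")
```

The leapfrog update is stable for c·dt/dx ≤ 1, so the mode oscillates without growing; the sound is then obtained by coupling the bridge force to a soundboard model.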

  6. Modelization for Electromagnetic Electron Scattering at Low Energies for Radiotherapy Applications.

    NASA Astrophysics Data System (ADS)

    Nazaryan, Vahagn; Gueye, Paul

    2006-03-01

    Since the release of the GEANT4 particle simulation toolkit in 2003, there has been growing interest in its applications to medical physics. The applicability of GEANT4 to radiotherapy has been the subject of several investigations in recent years, and it has been found to be of great use. Its low-energy model allows electromagnetic interaction simulations down to 250 eV. The electron physics data are obtained from the Lawrence Livermore National Laboratory's Evaluated Electron Data Library (EEDL). At very low energies (below 10 MeV), some of the tabulated data in EEDL have large uncertainties (more than 50%) and rely on various extrapolations into energy regions where there are no experimental data. We have investigated the impact of variations in these cross-section data on radiotherapy applications. Our study suggests a strong need for better theoretical models of electron interactions with matter at these energies, and the necessity of new and more reliable experimental data. Progress towards such a theoretical model will be presented.

  7. Cabin Environment Physics Risk Model

    NASA Technical Reports Server (NTRS)

    Mattenberger, Christopher J.; Mathias, Donovan Leigh

    2014-01-01

    This paper presents a Cabin Environment Physics Risk (CEPR) model that predicts the time for an initial failure of Environmental Control and Life Support System (ECLSS) functionality to propagate into a hazardous environment and trigger a loss-of-crew (LOC) event. This physics-of-failure model allows a probabilistic risk assessment of a crewed spacecraft to account for the cabin environment, which can serve as a buffer to protect the crew during an abort from orbit and ultimately enable a safe return. The results of the CEPR model replace the assumption that failure of the crew-critical ECLSS functionality causes LOC instantly, and provide a more accurate representation of the spacecraft's risk posture. The instant-LOC assumption is shown to be excessively conservative and, moreover, can impact the relative risk drivers identified for the spacecraft. This, in turn, could lead the design team to allocate mass for equipment to reduce overly conservative risk estimates in a suboptimal configuration, which inherently increases the overall risk to the crew. For example, available mass could be poorly used to add redundant ECLSS components that have a negligible benefit but appear to make the vehicle safer due to poor assumptions about the propagation time of ECLSS failures.

  8. A Multivariate Model of Physics Problem Solving

    ERIC Educational Resources Information Center

    Taasoobshirazi, Gita; Farley, John

    2013-01-01

    A model of expertise in physics problem solving was tested on undergraduate science, physics, and engineering majors enrolled in an introductory-level physics course. Structural equation modeling was used to test hypothesized relationships among variables linked to expertise in physics problem solving including motivation, metacognitive planning,…

  9. Sensitivity study of proton radiography and comparison with kV and MV x-ray imaging using GEANT4 Monte Carlo simulations

    NASA Astrophysics Data System (ADS)

    Depauw, Nicolas; Seco, Joao

    2011-04-01

    The imaging sensitivity of proton radiography has been studied and compared with kV and MV x-ray imaging using Monte Carlo simulations. A phantom was specifically modeled using 21 different material inserts with densities ranging from 0.001 to 1.92 g cm-3. These simulations were run using the MGH double scattered proton beam, scanned pencil proton beams from 200 to 490 MeV, as well as pure 50 keV, 100 keV, 1 MeV and 2 MeV gamma x-ray beams. In order to compare the physics implied in both proton and photon radiography without being biased by the current state of the art in detector technology, the detectors were considered perfect. Along with spatial resolution, the contrast-to-noise ratio was evaluated and compared for each material. These analyses were performed using radiographic images that took into account the following: only primary protons, both primary and secondary protons, and both contributions while performing angular and energetic cuts. Additionally, tissue-to-tissue contrasts in an actual lung cancer patient case were studied for simulated proton radiographs and compared against the original kV x-ray image which corresponds to the current patient set-up image in the proton clinic. This study highlights the poorer spatial resolution of protons versus x-rays for radiographic imaging purposes, and the excellent density resolution of proton radiography. Contrasts around the tumor are higher using protons in a lung cancer patient case. The high-density resolution of proton radiography is of great importance for specific tumor diagnostics, such as in lung cancer, where x-ray radiography operates poorly. Furthermore, the use of daily proton radiography prior to proton therapy would ameliorate patient set-up while reducing the absorbed dose delivered through imaging.
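    The contrast-to-noise ratio used in the comparison above is typically computed as CNR = |S_insert − S_bg| / σ_bg; a toy sketch on synthetic detector counts (all numbers invented for illustration):

```python
import numpy as np

# Contrast-to-noise ratio sketch for comparing imaging modalities on a
# phantom: CNR = |mean(insert) - mean(background)| / std(background).
rng = np.random.default_rng(1)
background = rng.normal(100.0, 2.0, 10_000)   # detector counts, arbitrary units
insert = rng.normal(110.0, 2.0, 10_000)       # hypothetical material insert

cnr = abs(insert.mean() - background.mean()) / background.std()
print(f"CNR = {cnr:.1f}")
```

In the study, such a figure of merit is evaluated per material insert and per modality (protons vs. kV/MV x rays), which is how the superior density resolution of proton radiography is quantified.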

  10. Modeling QCD for Hadron Physics

    SciTech Connect

    Tandy, P. C.

    2011-10-24

    We review the approach to modeling soft hadron physics observables based on the Dyson-Schwinger equations of QCD. The focus is on light quark mesons and in particular the pseudoscalar and vector ground states, their decays and electromagnetic couplings. We detail the wide variety of observables that can be correlated by a ladder-rainbow kernel with one infrared parameter fixed to the chiral quark condensate. A recently proposed novel perspective in which the quark condensate is contained within hadrons and not the vacuum is mentioned. The valence quark parton distributions, in the pion and kaon, as measured in the Drell Yan process, are investigated with the same ladder-rainbow truncation of the Dyson-Schwinger and Bethe-Salpeter equations.

  11. Modeling QCD for Hadron Physics

    NASA Astrophysics Data System (ADS)

    Tandy, P. C.

    2011-10-01

    We review the approach to modeling soft hadron physics observables based on the Dyson-Schwinger equations of QCD. The focus is on light quark mesons and in particular the pseudoscalar and vector ground states, their decays and electromagnetic couplings. We detail the wide variety of observables that can be correlated by a ladder-rainbow kernel with one infrared parameter fixed to the chiral quark condensate. A recently proposed novel perspective in which the quark condensate is contained within hadrons and not the vacuum is mentioned. The valence quark parton distributions, in the pion and kaon, as measured in the Drell Yan process, are investigated with the same ladder-rainbow truncation of the Dyson-Schwinger and Bethe-Salpeter equations.

  12. Physics modeling support contract: Final report

    SciTech Connect

    Not Available

    1987-09-30

    This document is the final report for the Physics Modeling Support contract between TRW, Inc. and the Lawrence Livermore National Laboratory for fiscal year 1987. It consists of the following projects: TIBER physics modeling and systems code development; advanced blanket modeling task; time-dependent modeling; and free electron maser for TIBER II.

  13. Model Formulation for Physics Problem Solving. Draft.

    ERIC Educational Resources Information Center

    Novak, Gordon S., Jr.

    The major task in solving a physics problem is to construct an appropriate model of the problem in terms of physical principles. The functions performed by such a model, the information which needs to be represented, and the knowledge used in selecting and instantiating an appropriate model are discussed. An example of a model for a mechanics…

  14. Evaluating a Model of Youth Physical Activity

    PubMed Central

    Heitzler, Carrie D.; Lytle, Leslie A.; Erickson, Darin J.; Barr-Anderson, Daheia; Sirard, John R.; Story, Mary

    2011-01-01

    Objective: To explore the relationship between social influences, self-efficacy, enjoyment, and barriers and physical activity. Methods: Structural equation modeling examined relationships between parent and peer support, parent physical activity, individual perceptions, and objectively measured physical activity using accelerometers among a sample of youth aged 10–17 years (N=720). Results: Peer support, parent physical activity, and perceived barriers were directly related to youth activity. The proposed model accounted for 14.7% of the variance in physical activity. Conclusions: The results demonstrate a need to further explore additional individual, social, and environmental factors that may influence youth's regular participation in physical activity. PMID:20524889

  15. Physical modeling of Tibetan bowls

    NASA Astrophysics Data System (ADS)

    Antunes, Jose; Inacio, Octavio

    2001-05-01

    Tibetan bowls produce rich penetrating sounds, used in musical contexts and to induce a state of relaxation for meditation or therapy purposes. To understand the dynamics of these instruments under impact and rubbing excitation, we developed a simulation method based on the modal approach, following our previous papers on physical modeling of plucked/bowed strings and impacted/bowed bars. This technique is based on a compact representation of the system dynamics, in terms of the unconstrained bowl modes. Nonlinear contact/friction interaction forces, between the exciter (puja) and the bowl, are computed at each time step and projected on the bowl modal basis, followed by step integration of the modal equations. We explore the behavior of two different-sized bowls, for extensive ranges of excitation conditions (contact/friction parameters, normal force, and tangential puja velocity). Numerical results and experiments show that various self-excited motions may arise depending on the playing conditions and, mainly, on the contact/friction interaction parameters. Indeed, triggering of a given bowl modal frequency mainly depends on the puja material. Computed animations and experiments demonstrate that self-excited modes spin, following the puja motion. Accordingly, the sensed pressure field pulsates, with frequency controlled by the puja spinning velocity and the spatial pattern of the singing mode.

  16. A qualitative model of physical fields

    SciTech Connect

    Lundell, M.

    1996-12-31

    A qualitative model of the spatio-temporal behaviour of distributed parameter systems based on physical fields is presented. Field-based models differ from the object-based models normally used in qualitative physics by treating parameters as continuous entities instead of as attributes of discrete objects. This is especially suitable for natural physical systems, e.g. in ecology. The model is divided into a static and a dynamic part. The static model describes the distribution of each parameter as a qualitative physical field. Composite fields are constructed from intersection models of pairs of fields. The dynamic model describes processes acting on the fields, and qualitative relationships between parameters. Spatio-temporal behaviour is modelled by interacting temporal processes, influencing single points in space, and spatial processes that gradually spread temporal processes over space. We give an example of a qualitative model of a natural physical system and discuss the ambiguities that arise during simulation.

  17. NUMERICAL MODELING OF FINE SEDIMENT PHYSICAL PROCESSES.

    USGS Publications Warehouse

    Schoellhamer, David H.

    1985-01-01

    Fine sediment in channels, rivers, estuaries, and coastal waters undergoes several physical processes including flocculation, floc disruption, deposition, bed consolidation, and resuspension. This paper presents a conceptual model and reviews mathematical models of these physical processes. Several general fine sediment models that simulate some of these processes are reviewed. These general models do not directly simulate flocculation and floc disruption, but the conceptual model and existing functions are shown to adequately model these two processes for one set of laboratory data.
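    As a minimal example of one of the reviewed processes, a Krone-type deposition term reduces suspended concentration exponentially with settling velocity over depth; the parameter values below are assumed for illustration only:

```python
import math

# Simplified deposition sketch: dC/dt = -(ws / h) * C, i.e. suspended
# concentration decays as flocs with settling velocity ws fall out of a
# water column of depth h (a Krone-type deposition term, shear stress ignored).
ws = 0.5e-3     # m/s, assumed floc settling velocity
h = 2.0         # m, assumed water depth
c0 = 200.0      # mg/L, assumed initial suspended concentration

def concentration(t_seconds):
    return c0 * math.exp(-ws / h * t_seconds)

print(f"after 1 hour: {concentration(3600.0):.1f} mg/L")
```

A full fine-sediment model would make the deposition rate depend on bed shear stress and add resuspension and consolidation source terms.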

  18. Determination and Fabrication of New Shield Super Alloys Materials for Nuclear Reactor Safety by Experiments and Cern-Fluka Monte Carlo Simulation Code, Geant4 and WinXCom

    NASA Astrophysics Data System (ADS)

    Aygun, Bünyamin; Korkut, Turgay; Karabulut, Abdulhalik

    2016-05-01

    Despite the possible depletion of fossil fuels, increasing energy needs mean that the use of radiation tends to increase, and the security-focused debate about planned nuclear power plants continues. The objective of this work is to prevent radiation from nuclear reactors spreading into the environment. To this end, we produced new, higher-performance shielding materials that retain high levels of radiation during reactor operation. Additives used in the new shielding materials include iron (Fe), rhenium (Re), nickel (Ni), chromium (Cr), boron (B), copper (Cu), tungsten (W), tantalum (Ta) and boron carbide (B4C). The experimental results indicated that these materials are good shields against gamma rays and neutrons. The powder metallurgy technique was used to produce the new shielding materials. The CERN-FLUKA Monte Carlo simulation code, Geant4 and WinXCom were used to determine the component percentages of the high-temperature-resistant materials for shielding high-level fast neutrons and gamma rays. Super alloys were produced, and then experimental fast-neutron dose-equivalent measurements and gamma-radiation absorption measurements of the new shielding materials were carried out. The products can be used safely not only in reactors but also in nuclear medicine treatment rooms, for the storage of nuclear waste, in nuclear research laboratories, and against cosmic radiation in space vehicles.

  19. Physics models of centriole replication.

    PubMed

    Cheng, Kang; Zou, Changhua

    2006-01-01

    Our previous pre-clinic experimental results have showed that the epithelialization can be enhanced by the externally applied rectangular pulsed electrical current stimulation (RPECS). The results are clinically significant for patients, especially for those difficult patients whose skin wounds need long periods to heal. However, the results also raise questions: How does the RPECS accelerate the epithelium cell proliferation? To answer these questions, we have previously developed several models for animal cells, in a view of physics, to explain mechanisms of mitosis and cytokinesis at a cellular level, and separation of nucleotide sequences and the unwinding of a double helix during DNA replication at a bio-molecular level. In this paper, we further model the mechanism of centriole replication during a natural and normal mitosis and cytokinesis to explore the mechanism of epithelialization enhanced with the externally applied RPECS at a bio-molecular level. Our models suggest: (1) Centriole replication is an information flowing. The direction of the information flowing is from centrioles to centrioles based on a cylindrical template of 9 x 3 protein microtubules (MTs) pattern. (2) A spontaneous and strong electromagnetic field (EMF) force is a pushing force that separates a mother and a daughter centrioles in centrosomes or in cells, while a pulling force of interacting fibers and pericentriolar materials delivers new babies. The newly born babies inherit the pattern information from their mother(s) and grow using microtubule fragments that come through the centrosome pores. A daughter centriole is always born and grows along stronger EMF. The EMF mostly determines centrioles positions and plays key role in centriole replication. 
We also hypothesize that normal centriole replication in epithelium cells could not be disturbed in the centrosome by our RPECS, because the centrioles have two non-conducting envelopes (the cell and centrosome membranes) that protect

  20. Evaluating a Model of Youth Physical Activity

    ERIC Educational Resources Information Center

    Heitzler, Carrie D.; Lytle, Leslie A.; Erickson, Darin J.; Barr-Anderson, Daheia; Sirard, John R.; Story, Mary

    2010-01-01

    Objective: To explore the relationship between social influences, self-efficacy, enjoyment, and barriers and physical activity. Methods: Structural equation modeling examined relationships between parent and peer support, parent physical activity, individual perceptions, and objectively measured physical activity using accelerometers among a…

  1. Dosimetry for electron Intra-Operative RadioTherapy: Comparison of output factors obtained through alanine/EPR pellets, ionization chamber and Monte Carlo-GEANT4 simulations for IORT mobile dedicate accelerator

    NASA Astrophysics Data System (ADS)

    Marrale, Maurizio; Longo, Anna; Russo, Giorgio; Casarino, Carlo; Candiano, Giuliana; Gallo, Salvatore; Carlino, Antonio; Brai, Maria

    2015-09-01

    In this work a comparison between the responses of alanine dosimeters and a Markus ionization chamber was carried out for measurements of the output factors (OFs) of electron beams produced by a linear accelerator used for Intra-Operative Radiation Therapy (IORT). Output factors for conventional high-energy electron beams are normally measured using an ionization chamber according to international dosimetry protocols. However, the electron beams used in IORT have dose-per-pulse, energy-spectrum and angular-distribution characteristics quite different from the beams usually used in external radiotherapy, so direct application of international dosimetry protocols may introduce additional uncertainties into dosimetric determinations. The high dose per pulse could lead to inaccuracy in dose measurements with an ionization chamber, due to overestimation of the ks recombination factor. Furthermore, the electron fields obtained with IORT-dedicated applicators have a wider energy spectrum and a wider angular distribution than conventional fields, due to the presence of electrons scattered by the applicator's wall. For this reason, a dosimetry system should have minimal dependence on the beam energy and on the angle of incidence of the electrons. This becomes particularly critical for small and bevelled applicators. All of these reasons led us to investigate detectors other than the ionization chamber for measuring the OFs. Furthermore, the complete characterization of the radiation field can also be accomplished by Monte Carlo simulations, which yield detailed information on dose distributions. In this work we compare the output factors obtained by means of alanine dosimeters and a Markus ionization chamber. The comparison is completed by Monte Carlo calculations of the OFs using the Geant4 application "iort_therapy". The results show good agreement between the responses of the alanine pellets and the Markus

  2. SU-E-T-289: Scintillating Fiber Based In-Vivo Dose Monitoring System to the Rectum in Proton Therapy of Prostate Cancer: A Geant4 Monte Carlo Simulation

    SciTech Connect

    Tesfamicael, B; Gueye, P; Lyons, D; Mahesh, M; Avery, S

    2014-06-01

    Purpose: To construct a dose monitoring system based on an endorectal balloon coupled to thin scintillating fibers to study the dose delivered to the rectum during prostate cancer proton therapy. Methods: The Geant4 Monte Carlo toolkit version 9.6p02 was used to simulate prostate cancer proton therapy treatments with an endorectal balloon (for immobilization of a 2.9 cm diameter prostate gland) and a set of 34 scintillating fibers symmetrically placed around the balloon and perpendicular to the proton beam direction (for dosimetry measurements). Results: A linear response of the fibers to the dose delivered was observed within <2%, a property that makes them good candidates for real-time dosimetry. The results show that the closest fiber recorded about 1/3 of the dose to the target, with a 1/r{sup 2} decrease in the dose distribution toward the frontal and distal top fibers. A very low dose was recorded by the bottom fibers (about 45 times lower), a clear indication that the overall volume of the rectal wall exposed to a higher dose is relatively minimized. Further analysis indicated a simple scaling relationship between the dose to the prostate and the dose to the top fibers (a linear fit gave a slope of −0.07±0.07 MeV per treatment Gy). Conclusion: Thin, long (1 mm × 1 mm × 100 cm) scintillating fibers were found to be ideal for real-time in-vivo dose measurement to the rectum during prostate cancer proton therapy. The linear response of the fibers to the dose delivered makes them good candidates as dosimeters. With thorough calibration and a well-defined correlation between the dose to the target and the dose to the fibers, such dosimeters can be used for real-time dose verification to the target.
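    The linearity check reported above can be sketched with a least-squares fit; the readings below are illustrative numbers (not the study's data), and the 2% figure is the tolerance quoted in the abstract:

```python
import numpy as np

# Hypothetical fiber readings (arbitrary units) versus delivered dose (Gy);
# the fit checks that deviations from linearity stay within the ~2% level
# quoted in the abstract.
dose = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
signal = np.array([0.51, 1.01, 1.98, 4.05, 7.97])

# Least-squares linear fit: signal ~ slope * dose + intercept
slope, intercept = np.polyfit(dose, signal, 1)

# Maximum relative deviation of the readings from the fitted line
fit = slope * dose + intercept
max_dev = float(np.max(np.abs(signal - fit) / signal))
print(f"slope = {slope:.3f}, max relative deviation = {max_dev:.1%}")
```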

  3. Are Physical Education Majors Models for Fitness?

    ERIC Educational Resources Information Center

    Kamla, James; Snyder, Ben; Tanner, Lori; Wash, Pamela

    2012-01-01

    The National Association of Sport and Physical Education (NASPE) (2002) has taken a firm stance on the importance of adequate fitness levels of physical education teachers stating that they have the responsibility to model an active lifestyle and to promote fitness behaviors. Since the NASPE declaration, national initiatives like Let's Move…

  4. PIXE simulation: Models, methods and technologies

    SciTech Connect

    Batic, M.; Pia, M. G.; Saracco, P.; Weidenspointner, G.

    2013-04-19

    The simulation of PIXE (Particle Induced X-ray Emission) is discussed in the context of general-purpose Monte Carlo systems for particle transport. Dedicated PIXE codes are mainly concerned with the application of the technique to elemental analysis, but they lack the capability of dealing with complex experimental configurations. General-purpose Monte Carlo codes provide powerful tools to model the experimental environment in great detail, but so far they have provided limited functionality for PIXE simulation. This paper reviews recent developments that have endowed the Geant4 simulation toolkit with advanced capabilities for PIXE simulation, and related efforts for quantitative validation of cross sections and other physical parameters relevant to PIXE simulation.

  5. Modeling Physics with Easy Java Simulations

    ERIC Educational Resources Information Center

    Christian, Wolfgang; Esquembre, Francisco

    2007-01-01

    Modeling has been shown to correct weaknesses of traditional instruction by engaging students in the design of physical models to describe, explain, and predict phenomena. Although the modeling method can be used without computers, the use of computers allows students to study problems that are difficult and time consuming, to visualize their…

  6. The trinucleons: Physical observables and model properties

    SciTech Connect

    Gibson, B.F.

    1992-01-01

    Our progress in understanding the properties of {sup 3}H and {sup 3}He in terms of a nonrelativistic Hamiltonian picture employing realistic nuclear forces is reviewed. Trinucleon model properties are summarized for a number of contemporary force models, and predictions for physical observables are presented. Disagreements between theoretical model results and experimental results are highlighted.

  7. The trinucleons: Physical observables and model properties

    SciTech Connect

    Gibson, B.F.

    1992-05-01

    Our progress in understanding the properties of {sup 3}H and {sup 3}He in terms of a nonrelativistic Hamiltonian picture employing realistic nuclear forces is reviewed. Trinucleon model properties are summarized for a number of contemporary force models, and predictions for physical observables are presented. Disagreements between theoretical model results and experimental results are highlighted.

  8. Physics of the Quark Model

    ERIC Educational Resources Information Center

    Young, Robert D.

    1973-01-01

    Discusses the charge independence, wavefunctions, magnetic moments, and high-energy scattering of hadrons on the basis of group theory and nonrelativistic quark model with mass spectrum calculated by first-order perturbation theory. The presentation is explainable to advanced undergraduate students. (CC)

  9. The Standard Model of Nuclear Physics

    NASA Astrophysics Data System (ADS)

    Detmold, William

    2015-04-01

    At its core, nuclear physics, which describes the properties and interactions of hadrons, such as protons and neutrons, and atomic nuclei, arises from the Standard Model of particle physics. However, the complexities of nuclei result in severe computational difficulties that have historically prevented the calculation of central quantities in nuclear physics directly from this underlying theory. The availability of petascale (and prospect of exascale) high performance computing is changing this situation by enabling us to extend the numerical techniques of lattice Quantum Chromodynamics (LQCD), applied successfully in particle physics, to the more intricate dynamics of nuclear physics. In this talk, I will discuss this revolution and the emerging understanding of hadrons and nuclei within the Standard Model.

  10. PHYSICAL MODELING OF CONTRACTED FLOW.

    USGS Publications Warehouse

    Lee, Jonathan K.

    1987-01-01

    Experiments on steady flow over uniform grass roughness through centered single-opening contractions were conducted in the Flood Plain Simulation Facility at the U. S. Geological Survey's Gulf Coast Hydroscience Center near Bay St. Louis, Miss. The experimental series was designed to provide data for calibrating and verifying two-dimensional, vertically averaged surface-water flow models used to simulate flow through openings in highway embankments across inundated flood plains. Water-surface elevations, point velocities, and vertical velocity profiles were obtained at selected locations for design discharges ranging from 50 to 210 cfs. Examples of observed water-surface elevations and velocity magnitudes at basin cross-sections are presented.

  11. Waste Feed Evaporation Physical Properties Modeling

    SciTech Connect

    Daniel, W.E.

    2003-08-25

    This document describes the waste feed evaporator modeling work done under the Waste Feed Evaporation and Physical Properties Modeling test specification in support of the Hanford River Protection Project (RPP) Waste Treatment Plant (WTP) project. A private database (ZEOLITE) was developed and used in this work in order to include the behavior of aluminosilicates such as NAS-gel in the OLI/ESP simulations, in addition to the development of the mathematical models. Mathematical models were developed that describe certain physical properties in the Hanford RPP-WTP waste feed evaporator process (FEP). In particular, models were developed for the feed stream to the first ultra-filtration step characterizing its heat capacity, thermal conductivity, and viscosity, as well as the density of the evaporator contents. The scope of the task was expanded to include the volume reduction factor across the waste feed evaporator (total evaporator feed volume/evaporator bottoms volume). All the physical properties were modeled as functions of the waste feed composition, temperature, and the high-level waste recycle volumetric flow rate relative to that of the waste feed. The goal for the mathematical models was to reproduce the physical property values predicted by the simulation. The simulation model approximating the FEP process used to develop the correlations was relatively complex, and not possible to duplicate within the scope of the bench-scale evaporation experiments. Therefore, simulants were made for 13 design points (a subset of the points used in the model fits) using the compositions of the ultra-filtration feed streams as predicted by the simulation model. The chemistry and physical properties of the supernate (the modeled stream) as predicted by the simulation were compared with the analytical results of the experimental simulant work as a method of validating the simulation software.

  12. Physical Modelling of Sedimentary Basin

    SciTech Connect

    Yuen, David A.

    2003-04-24

    The main goals of the first three years have been achieved, i.e., the development of particle-based and continuum-based algorithms for cross-scale/up-scale analysis of complex fluid flows. The U. Minnesota team has focused on particle-based methods, wavelets (Rustad et al., 2001) and visualization, and has had great success with the dissipative and fluid particle dynamics algorithms, as applied to colloidal, polymeric and biological systems, wavelet filtering and visualization endeavors. We have organized two sessions in nonlinear geophysics at the A.G.U. Fall Meeting (2000, 2002), which have synergistically stimulated the community and promoted cross-disciplinary efforts in the geosciences. The LANL team has succeeded with continuum-based algorithms, in particular, fractal interpolating functions (fif). These have been applied to 1-D flow and transport equations (Travis, 2000; 2002) as a proof of principle, providing solutions that capture dynamics at all scales. In addition, the fif representations can be integrated to provide sub-grid-scale homogenization, which can be used in more traditional finite difference or finite element solutions of porous flow and transport. Another useful tool for fluid flow problems is the ability to solve inverse problems, that is: given present-time observations of a fluid flow, what was the initial state of that fluid system? We have demonstrated this capability for a large-scale problem of 3-D flow in the Earth's crust (Bunge, Hagelberg & Travis, 2002). Use of the adjoint method for sensitivity analysis (Marchuk, 1995) to compute derivatives of models makes the large-scale inversion feasible in 4-D, space and time. Further, a framework for simulating complex fluid flow in the Earth's crust has been implemented (Dutrow et al, 2001). The remaining task of the first three-year campaign is to extend the implementation of the fif formalism to our 2-D and 3-D computer codes, which is straightforward, but involved.

  13. Particle radiation transport and effects models from research to space weather operations

    NASA Astrophysics Data System (ADS)

    Santin, Giovanni; Nieminen, Petteri; Rivera, Angela; Ibarmia, Sergio; Truscott, Pete; Lei, Fan; Desorgher, Laurent; Ivanchenko, Vladimir; Kruglanski, Michel; Messios, Neophytos

    Assessment of risk from potential radiation-induced effects to space systems requires knowledge of both the conditions of the radiation environment and of the impact of radiation on sensitive spacecraft elements. During sensitivity analyses, test data are complemented by models to predict how external radiation fields are transported and modified in spacecraft materials. Radiation transport is still itself a subject of research and models are continuously improved to describe the physical interactions that take place when particles pass through shielding materials or hit electronic systems or astronauts, sometimes down to nanometre-scale interactions of single particles with deep sub-micron technologies or DNA structures. In recent years, though, such radiation transport models are transitioning from being a research subject by itself, to being widely used in the space engineering domain and finally being directly applied in the context of operation of space weather services. A significant "research to operations" (R2O) case is offered by Geant4, an open source toolkit initially developed and used in the context of fundamental research in high energy physics. Geant4 is also being used in the space domain, e.g. for modelling detector responses in science payloads, but also for studying the radiation environment itself, with subjects ranging from cosmic rays, to solar energetic particles in the heliosphere, to geomagnetic shielding. Geant4-based tools are now becoming more and more integrated in spacecraft design procedures, also through user friendly interfaces such as SPENVIS. Some examples are given by MULASSIS, offering multi-layered shielding analysis capabilities in realistic spacecraft materials, or GEMAT, focused on micro-dosimetry in electronics, or PLANETOCOSMICS, describing the interaction of the space environment with planetary magnetospheres and atmospheres, or GRAS, providing a modular and easy to use interface to various analysis types in simple or

  14. Physical-statistical modeling in geophysics

    NASA Astrophysics Data System (ADS)

    Berliner, L. Mark

    2003-12-01

    Two powerful formulas have been available to scientists for more than two centuries: Newton's second law, providing a foundation for classical physics, and Bayes's theorem, prescribing probabilistic learning about unknown quantities based on observations. For the most part the use of these formulas has been separated, with Newton being the more dominant in geophysics. This separation is arguably surprising since numerous sources of uncertainty arise in the application of classical physics in complex situations. One explanation for the separation is the difficulty in implementing Bayesian analysis in complex settings. However, recent advances in both modeling strategies and computational tools have contributed to a significant change in the scope and feasibility of Bayesian analysis. This paradigm provides opportunities for the combination of physical reasoning and observational data in a coherent analysis framework but in a fashion which manages the uncertainties in both information sources. A key to the modeling is the hierarchical viewpoint, in which separate statistical models are developed for the process variables studied and for the observations conditional on those variables. Modeling process variables in this way enables the incorporation of physics across a spectrum of levels of intensity, ranging from a qualitative use of physical reasoning to a strong reliance on numerical models. Selected examples from this spectrum are reviewed. "So far as the laws of mathematics refer to reality, they are not certain. And so far as they are certain, they do not refer to reality." (Albert Einstein, 1921)
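    The hierarchical idea, a physics-based prior for a process variable combined with a noisy observation via Bayes's theorem, can be sketched in the conjugate Gaussian case (all numbers below are illustrative):

```python
# Conjugate Gaussian update: a physics-based prior N(prior_mean, prior_var)
# for a process variable, combined with one observation of variance obs_var.
def gaussian_update(prior_mean, prior_var, obs, obs_var):
    post_var = 1.0 / (1.0 / prior_var + 1.0 / obs_var)
    post_mean = post_var * (prior_mean / prior_var + obs / obs_var)
    return post_mean, post_var

# Hypothetical numbers: the physical model predicts 10.0 with variance 4.0;
# an instrument observes 12.0 with variance 1.0
mean, var = gaussian_update(10.0, 4.0, 12.0, 1.0)
print(mean, var)  # posterior pulled toward the more precise observation
```

The posterior precision is the sum of the two precisions, so the more precise information source dominates, which is exactly how the framework "manages the uncertainties in both information sources."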

  15. Monte Carlo investigation of the increased radiation deposition due to gold nanoparticles using kilovoltage and megavoltage photons in a 3D randomized cell model

    SciTech Connect

    Douglass, Michael; Bezak, Eva; Penfold, Scott

    2013-07-15

    Purpose: Investigation of increased radiation dose deposition due to gold nanoparticles (GNPs) using a 3D computational cell model during x-ray radiotherapy. Methods: Two GNP simulation scenarios were set up in Geant4: a single 400 nm diameter gold cluster randomly positioned in the cytoplasm, and a 300 nm gold layer around the nucleus of the cell. Using an 80 kVp photon beam, the effect of GNPs on the dose deposition in five modeled regions of the cell, including cytoplasm, membrane, and nucleus, was simulated. Two Geant4 physics lists were tested: the default Livermore and a custom-built Livermore/DNA hybrid physics list. 10{sup 6} particles were simulated in a geometry of 840 cells. Each cell was randomly placed with random orientation and a diameter varying between 9 and 13 {mu}m. A mathematical algorithm was used to ensure that none of the 840 cells overlapped. The energy dependence of the GNP physical dose enhancement effect was calculated by simulating the dose deposition in the cells with two energy spectra, 80 kVp and 6 MV. The contribution from Auger electrons was investigated by comparing the two GNP simulation scenarios while activating and deactivating atomic de-excitation processes in Geant4. Results: The physical dose enhancement ratio (DER) of GNPs was calculated using the Monte Carlo model. The model demonstrated that the DER depends on the amount of gold and the position of the gold cluster within the cell. Individual cell regions experienced a statistically significant (p < 0.05) change in absorbed dose (DER between 1 and 10) depending on the type of gold geometry used. The DER resulting from gold clusters attached to the cell nucleus had the more significant effect of the two cases (DER {approx} 55). The DER value calculated at 6 MV was shown to be at least an order of magnitude smaller than the DER values calculated for the 80 kVp spectrum. 
Based on simulations, when 80 kVp photons are used, Auger electrons have a statistically insignificant (p
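    The non-overlap constraint on the randomized cell geometry can be sketched with rejection sampling; only the 9-13 {mu}m diameter range comes from the abstract, while the box size, cell count, and acceptance test below are illustrative assumptions:

```python
import random

# Rejection-sampling sketch of non-overlapping cell placement (the 9-13
# micron diameter range is from the abstract; the box size and cell count
# here are illustrative assumptions, not the paper's values).
random.seed(42)

def place_cells(n, box=200.0, d_min=9.0, d_max=13.0, max_tries=100000):
    cells = []  # (x, y, z, radius), all in microns
    tries = 0
    while len(cells) < n and tries < max_tries:
        tries += 1
        r = random.uniform(d_min, d_max) / 2.0
        x, y, z = (random.uniform(r, box - r) for _ in range(3))
        # accept only if the candidate sphere overlaps no accepted sphere
        if all((x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2 >= (r + cr) ** 2
               for cx, cy, cz, cr in cells):
            cells.append((x, y, z, r))
    return cells

cells = place_cells(100)
print(len(cells), "non-overlapping cells placed")
```

Rejection sampling is adequate here because the packing fraction is low (well under 1% in this sketch); denser packings would need a smarter algorithm.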

  16. Monte Carlo modeling provides accurate calibration factors for radionuclide activity meters.

    PubMed

    Zagni, F; Cicoria, G; Lucconi, G; Infantino, A; Lodi, F; Marengo, M

    2014-12-01

    Accurate determination of calibration factors for radionuclide activity meters is crucial for quantitative studies and in the optimization step of radiation protection, as these detectors are widespread in radiopharmacy and nuclear medicine facilities. In this work we developed a Monte Carlo model of a widely used activity meter, using the Geant4 simulation toolkit. More precisely, the "PENELOPE" EM physics models were employed. The model was validated by means of several certified sources, traceable to primary activity standards, and other sources locally standardized with spectrometry measurements, plus other experimental tests. Great care was taken to accurately reproduce the geometrical details of the gas chamber and of the activity sources, each of which is different in shape and enclosed in a unique container. Both the relative calibration factors and the ionization currents obtained with simulations were compared against experimental measurements; further tests were carried out, such as comparison of the relative response of the chamber for a source placed at different positions. The results showed a satisfactory level of accuracy in the energy range of interest, with discrepancies lower than 4% for all the tested parameters. This shows that accurate Monte Carlo modeling of this type of detector is feasible using the low-energy physics models embedded in Geant4. The obtained Monte Carlo model provides a powerful tool for first-instance determination of new calibration factors for non-standard radionuclides and for custom containers, when a reference source is not available. Moreover, the model provides an experimental setup for further research and optimization with regard to materials and geometrical details of the measuring setup, such as the ionization chamber itself or the container configuration.
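    As a toy illustration of first-instance factor determination (a generic scaling assumption, not the paper's actual procedure): if the calibration factor is taken as proportional to the simulated chamber current per unit activity, a new nuclide's factor follows from the simulated current ratio against a reference nuclide:

```python
# Toy sketch (a generic assumption, not the paper's exact procedure): treat
# the calibration factor as proportional to the simulated ionization current
# per unit activity, so a new nuclide's factor scales with the current ratio.
def calibration_factor(current_new, current_ref, factor_ref):
    return factor_ref * current_new / current_ref

# Hypothetical simulated currents per MBq (arbitrary units)
f_new = calibration_factor(current_new=1.6, current_ref=2.0, factor_ref=1.0)
print(f_new)
```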

  17. Review of Some Promising Fractional Physical Models

    NASA Astrophysics Data System (ADS)

    Tarasov, Vasily E.

    2013-04-01

    Fractional dynamics is a field of study in physics and mechanics investigating the behavior of objects and systems that are characterized by power-law nonlocality, power-law long-term memory or fractal properties, by using integration and differentiation of non-integer order, i.e., methods of the fractional calculus. This paper is a review of physical models that look very promising for the future development of fractional dynamics. We give a short introduction to fractional calculus as a theory of integration and differentiation of non-integer order, and discuss some applications of integro-differentiation of fractional order in physics. Models of discrete systems with memory, lattices with long-range inter-particle interaction, and dynamics of fractal media are presented. Quantum analogs of fractional derivatives and models of open nanosystems with memory are also discussed.
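    A standard discretization in fractional dynamics is the Grünwald-Letnikov approximation of a fractional derivative; the sketch below (not from the paper) checks it against the known half-derivative of f(t) = t, which is 2*sqrt(t/pi):

```python
import math

# Grünwald-Letnikov (GL) approximation of a fractional derivative of order
# alpha: D^alpha f(t) ~ h^(-alpha) * sum_k w_k f(t - k*h), with weights
# w_0 = 1 and w_k = w_{k-1} * (1 - (alpha + 1)/k). Illustrative sketch.
def gl_fractional_derivative(f, alpha, t, h=1e-3):
    n = round(t / h)
    total, w = 0.0, 1.0
    for k in range(n + 1):
        total += w * f(t - k * h)
        w *= 1.0 - (alpha + 1.0) / (k + 1.0)
    return total / h ** alpha

# Half-derivative of f(t) = t at t = 1; the exact value is 2*sqrt(t/pi)
approx = gl_fractional_derivative(lambda x: x, 0.5, 1.0)
exact = 2.0 * math.sqrt(1.0 / math.pi)
print(approx, exact)
```

The GL sum makes the power-law memory explicit: the derivative at time t depends on the whole history f(t - k*h), with slowly decaying weights.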

  18. Simplified models for LHC new physics searches

    NASA Astrophysics Data System (ADS)

    Alves, Daniele; Arkani-Hamed, Nima; Arora, Sanjay; Bai, Yang; Baumgart, Matthew; Berger, Joshua; Buckley, Matthew; Butler, Bart; Chang, Spencer; Cheng, Hsin-Chia; Cheung, Clifford; Sekhar Chivukula, R.; Cho, Won Sang; Cotta, Randy; D'Alfonso, Mariarosaria; El Hedri, Sonia; Essig, Rouven (Editor); Evans, Jared A.; Fitzpatrick, Liam; Fox, Patrick; Franceschini, Roberto; Freitas, Ayres; Gainer, James S.; Gershtein, Yuri; Gray, Richard; Gregoire, Thomas; Gripaios, Ben; Gunion, Jack; Han, Tao; Haas, Andy; Hansson, Per; Hewett, JoAnne; Hits, Dmitry; Hubisz, Jay; Izaguirre, Eder; Kaplan, Jared; Katz, Emanuel; Kilic, Can; Kim, Hyung-Do; Kitano, Ryuichiro; Koay, Sue Ann; Ko, Pyungwon; Krohn, David; Kuflik, Eric; Lewis, Ian; Lisanti, Mariangela (Editor); Liu, Tao; Liu, Zhen; Lu, Ran; Luty, Markus; Meade, Patrick; Morrissey, David; Mrenna, Stephen; Nojiri, Mihoko; Okui, Takemichi; Padhi, Sanjay; Papucci, Michele; Park, Michael; Park, Myeonghun; Perelstein, Maxim; Peskin, Michael; Phalen, Daniel; Rehermann, Keith; Rentala, Vikram; Roy, Tuhin; Ruderman, Joshua T.; Sanz, Veronica; Schmaltz, Martin; Schnetzer, Stephen; Schuster, Philip (Editor); Schwaller, Pedro; Schwartz, Matthew D.; Schwartzman, Ariel; Shao, Jing; Shelton, Jessie; Shih, David; Shu, Jing; Silverstein, Daniel; Simmons, Elizabeth; Somalwar, Sunil; Spannowsky, Michael; Spethmann, Christian; Strassler, Matthew; Su, Shufang; Tait, Tim (Editor); Thomas, Brooks; Thomas, Scott; Toro, Natalia (Editor); Volansky, Tomer; Wacker, Jay (Editor); Waltenberger, Wolfgang; Yavin, Itay; Yu, Felix; Zhao, Yue; Zurek, Kathryn; LHC New Physics Working Group

    2012-10-01

    This document proposes a collection of simplified models relevant to the design of new-physics searches at the Large Hadron Collider (LHC) and the characterization of their results. Both ATLAS and CMS have already presented some results in terms of simplified models, and we encourage them to continue and expand this effort, which supplements both signature-based results and benchmark model interpretations. A simplified model is defined by an effective Lagrangian describing the interactions of a small number of new particles. Simplified models can equally well be described by a small number of masses and cross-sections. These parameters are directly related to collider physics observables, making simplified models a particularly effective framework for evaluating searches and a useful starting point for characterizing positive signals of new physics. This document serves as an official summary of the results from the ‘Topologies for Early LHC Searches’ workshop, held at SLAC in September of 2010, the purpose of which was to develop a set of representative models that can be used to cover all relevant phase space in experimental searches. Particular emphasis is placed on searches relevant for the first ~50-500 pb-1 of data and those motivated by supersymmetric models. This note largely summarizes material posted at http://lhcnewphysics.org/, which includes simplified model definitions, Monte Carlo material, and supporting contacts within the theory community. We also comment on future developments that may be useful as more data is gathered and analyzed by the experiments.

  19. Simplified Models for LHC New Physics Searches

    SciTech Connect

    Alves, Daniele; Arkani-Hamed, Nima; Arora, Sanjay; Bai, Yang; Baumgart, Matthew; Berger, Joshua; Buckley, Matthew; Butler, Bart; Chang, Spencer; Cheng, Hsin-Chia; Cheung, Clifford; Chivukula, R.Sekhar; Cho, Won Sang; Cotta, Randy; D'Alfonso, Mariarosaria; El Hedri, Sonia; Essig, Rouven,; Evans, Jared A.; Fitzpatrick, Liam; Fox, Patrick; Franceschini, Roberto; /more authors..

    2012-06-01

    This document proposes a collection of simplified models relevant to the design of new-physics searches at the LHC and the characterization of their results. Both ATLAS and CMS have already presented some results in terms of simplified models, and we encourage them to continue and expand this effort, which supplements both signature-based results and benchmark model interpretations. A simplified model is defined by an effective Lagrangian describing the interactions of a small number of new particles. Simplified models can equally well be described by a small number of masses and cross-sections. These parameters are directly related to collider physics observables, making simplified models a particularly effective framework for evaluating searches and a useful starting point for characterizing positive signals of new physics. This document serves as an official summary of the results from the 'Topologies for Early LHC Searches' workshop, held at SLAC in September of 2010, the purpose of which was to develop a set of representative models that can be used to cover all relevant phase space in experimental searches. Particular emphasis is placed on searches relevant for the first {approx} 50-500 pb{sup -1} of data and those motivated by supersymmetric models. This note largely summarizes material posted at http://lhcnewphysics.org/, which includes simplified model definitions, Monte Carlo material, and supporting contacts within the theory community. We also comment on future developments that may be useful as more data is gathered and analyzed by the experiments.

  20. Comparisons between physical model and numerical model results

    SciTech Connect

    Sagasta, P.F.

    1986-04-01

    Physical modeling scaling laws provide the opportunity to compare results among numerical modeling programs, including two- and three-dimensional interactive raytracing and more sophisticated wave-equation-approximation methods, and seismic data collected over a known, three-dimensional model in a water tank. The sixfold, closely spaced common-midpoint water-tank data modeled for this study simulate a standard marine three-dimensional survey shot over a three-layered physical model (a structured upper layer overlying two flat layers). Using modeling theory, the physical-tank model dimensions scale to realistic exploration dimensions, and the ultrasonic frequencies scale to seismic frequencies of 2-60 Hz. A comparison of P and converted-S events and amplitudes among these physical tank data and numerical modeling results illustrates many of the advantages and limitations of modeling methods available to the exploration geophysicist. The ability of three-dimensional raytracing to model off-line events and more closely predict waveform phase due to geometric effects shows the greater usefulness of three-dimensional modeling methods over two-dimensional methods in seismic interpretation. Forward modeling of P-to-Sv-converted events and multiples predicts their presence in the seismic data. The geometry of the physical model leads to examples where raytracing approximations are limited and the more time-consuming finite-element technique is useful for better understanding wave propagation within the physical model. All of the numerical modeling programs used show limitations in matching the amplitudes and phase of events in the physical-model seismic data.

  1. Physical and stochastic models of earthquake clustering

    NASA Astrophysics Data System (ADS)

    Console, Rodolfo; Murru, Maura; Catalli, Flaminia

    2006-04-01

    The phenomenon of earthquake clustering, i.e., the increase of occurrence probability for seismic events close in space and time to previous earthquakes, has been modeled both by statistical and by physical processes. From a statistical viewpoint, the so-called epidemic-type aftershock sequence (ETAS) model introduced by Ogata in 1988 and its variations have become fairly well known in the seismological community. Tests on real seismicity and comparison with a plain time-independent Poissonian model through likelihood-based methods have reliably proved their validity. On the other hand, in the last decade many papers have been published on the so-called Coulomb stress change principle, based on the theory of elasticity, showing qualitatively that an increase of the Coulomb stress in a given area is usually associated with an increase of seismic activity. More specifically, the rate-and-state theory developed by Dieterich in the '90s has been able to give a physical justification to the phenomenon known as the Omori law. According to this law, a mainshock is followed by a series of aftershocks whose frequency decreases in time as an inverse power law. In this study we give an outline of the above-mentioned stochastic and physical models, and build an approach by which these models can be merged into a single algorithm and statistically tested. The application to the seismicity of Japan from 1970 to 2003 shows that the new model incorporating the physical concept of the rate-and-state theory performs no worse than the purely stochastic model with only two free parameters. The numerical results obtained in these applications are related to physical characteristics of the model, such as the stress change produced by an earthquake close to its edges and the A and σ parameters of the rate-and-state constitutive law.
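    The inverse power-law decay of aftershock frequency can be written as the modified Omori law, n(t) = K / (t + c)^p; the sketch below (parameter values illustrative, not fitted to Japanese seismicity) integrates it numerically to get expected aftershock counts:

```python
# Modified Omori law for the aftershock rate, n(t) = K / (t + c)^p
# (parameter values below are illustrative, not fitted values).
def omori_rate(t, K=100.0, c=0.1, p=1.1):
    return K / (t + c) ** p

# Expected number of aftershocks in [t1, t2], by midpoint-rule integration
def expected_count(t1, t2, steps=10000):
    dt = (t2 - t1) / steps
    return sum(omori_rate(t1 + (i + 0.5) * dt) * dt for i in range(steps))

day1 = expected_count(0.0, 1.0)   # first day after the mainshock
week = expected_count(0.0, 7.0)   # first week
print(day1, week)
```

For p > 1 the integral converges as t2 grows, so most of the expected aftershocks occur soon after the mainshock, which is the clustering behavior the statistical and physical models both try to capture.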

  2. A physical analogue of the Schelling model

    NASA Astrophysics Data System (ADS)

    Vinković, Dejan; Kirman, Alan

    2006-12-01

    We present a mathematical link between Schelling's socio-economic model of segregation and the physics of clustering. We replace the economic concept of "utility" by the physics concept of a particle's internal energy. As a result cluster dynamics is driven by the "surface tension" force. The resultant segregated areas can be very large and can behave like spherical "liquid" droplets or as a collection of static clusters in "frozen" form. This model will hopefully provide a useful framework for studying many spatial economic phenomena that involve individuals making location choices as a function of the characteristics and choices of their neighbors.
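
The utility-to-energy mapping described above can be caricatured in a few lines. This is an illustrative sketch only: the lattice, the definition of "internal energy" as the count of unlike neighbours, and the move-acceptance rule are assumptions for exposition, not the authors' model.

```python
import random

def energy(grid, i, j):
    """Agent's 'internal energy': occupied neighbours (periodic) of a different type."""
    n = len(grid)
    me = grid[i][j]
    return sum(
        1
        for di in (-1, 0, 1)
        for dj in (-1, 0, 1)
        if (di or dj) and grid[(i + di) % n][(j + dj) % n] not in (0, me)
    )

def step(grid, rng):
    """Move one random agent to a random empty site if its energy does not rise."""
    n = len(grid)
    agents = [(i, j) for i in range(n) for j in range(n) if grid[i][j]]
    empties = [(i, j) for i in range(n) for j in range(n) if not grid[i][j]]
    (i, j), (ei, ej) = rng.choice(agents), rng.choice(empties)
    before = energy(grid, i, j)
    grid[ei][ej], grid[i][j] = grid[i][j], 0   # tentative move
    if energy(grid, ei, ej) > before:          # reject moves that raise energy
        grid[i][j], grid[ei][ej] = grid[ei][ej], 0
```

Repeated application of `step` lowers the total boundary between the two agent types, which is the discrete analogue of the surface-tension-driven coarsening the paper describes.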

  3. Waste glass melter numerical and physical modeling

    SciTech Connect

    Eyler, L.L.; Peters, R.D.; Lessor, D.L.; Lowery, P.S.; Elliott, M.L.

    1991-10-01

    Results of physical and numerical simulation modeling of high-level liquid waste vitrification melters are presented. Physical modeling uses simulant fluids in laboratory testing. Visualization results provide insight into convective melt flow patterns from which information is derived to support performance estimation of operating melters and data to support numerical simulation. Numerical simulation results of several melter configurations are presented. These are in support of programs to evaluate melter operation characteristics and performance. Included are investigations into power skewing and alternating current electric field phase angle in a dual electrode pair reference design and bi-modal convective stability in an advanced design. 9 refs., 9 figs., 1 tab.

  4. Plasma simulation studies using multilevel physics models

    SciTech Connect

    Park, W.; Belova, E.V.; Fu, G.Y.

    2000-01-19

    The question of how to proceed toward ever more realistic plasma simulation studies using ever increasing computing power is addressed. The answer presented here is the M3D (Multilevel 3D) project, which has developed a code package with a hierarchy of physics levels that resolve increasingly complete subsets of phase-spaces and are thus increasingly more realistic. The rationale for the multilevel physics models is given. Each physics level is described and examples of its application are given. The existing physics levels are fluid models (3D configuration space), namely magnetohydrodynamic (MHD) and two-fluids; and hybrid models, namely gyrokinetic-energetic-particle/MHD (5D energetic particle phase-space), gyrokinetic-particle-ion/fluid-electron (5D ion phase-space), and full-kinetic-particle-ion/fluid-electron level (6D ion phase-space). Resolving electron phase-space (5D or 6D) remains a future project. Phase-space-fluid models are not used in favor of delta f particle models. A practical and accurate nonlinear fluid closure for noncollisional plasmas seems not likely in the near future.

  5. Topos models for physics and topos theory

    SciTech Connect

    Wolters, Sander

    2014-08-15

    What is the role of topos theory in the topos models for quantum theory as used by Isham, Butterfield, Döring, Heunen, Landsman, Spitters, and others? In other words, what is the interplay between physical motivation for the models and the mathematical framework used in these models? Concretely, we show that the presheaf topos model of Butterfield, Isham, and Döring resembles classical physics when viewed from the internal language of the presheaf topos, similar to the copresheaf topos model of Heunen, Landsman, and Spitters. Both the presheaf and copresheaf models provide a “quantum logic” in the form of a complete Heyting algebra. Although these algebras are natural from a topos theoretic stance, we seek a physical interpretation for the logical operations. Finally, we investigate dynamics. In particular, we describe how an automorphism on the operator algebra induces a homeomorphism (or isomorphism of locales) on the associated state spaces of the topos models, and how elementary propositions and truth values transform under the action of this homeomorphism. Also with dynamics the focus is on the internal perspective of the topos.

  6. Dilution physics modeling: Dissolution/precipitation chemistry

    SciTech Connect

    Onishi, Y.; Reid, H.C.; Trent, D.S.

    1995-09-01

    This report documents progress made to date on integrating dilution/precipitation chemistry and new physical models into the TEMPEST thermal-hydraulics computer code. Implementation of dissolution/precipitation chemistry models is necessary for predicting nonhomogeneous, time-dependent, physical/chemical behavior of tank wastes with and without a variety of possible engineered remediation and mitigation activities. Such behavior includes chemical reactions, gas retention, solids resuspension, solids dissolution and generation, solids settling/rising, and convective motion of physical and chemical species. Thus this model development is important from the standpoint of predicting the consequences of various engineered activities, such as mitigation by dilution, retrieval, or pretreatment, that can affect safe operations. The integration of a dissolution/precipitation chemistry module allows the various phase species concentrations to enter into the physical calculations that affect the TEMPEST hydrodynamic flow calculations. The yield strength model of non-Newtonian sludge correlates yield to a power function of solids concentration. Likewise, shear stress is concentration-dependent, and the dissolution/precipitation chemistry calculations develop the species concentration evolution that produces fluid flow resistance changes. Dilution of waste with pure water, molar concentrations of sodium hydroxide, and other chemical streams can be analyzed for the reactive species changes and hydrodynamic flow characteristics.

  7. Service Learning In Physics: The Consultant Model

    NASA Astrophysics Data System (ADS)

    Guerra, David

    2005-04-01

    Each year thousands of students across the country and across the academic disciplines participate in service learning. Unfortunately, with no clear model for integrating community service into the physics curriculum, there are very few physics students engaged in service learning. To overcome this shortfall, a consultant-based service-learning program has been developed and successfully implemented at Saint Anselm College (SAC). As consultants, students in upper-level physics courses apply their problem-solving skills in the service of others. Most recently, SAC students provided technical and managerial support to a group from Girls Inc., a national empowerment program for girls in high-risk, underserved areas, that was participating in the national FIRST Lego League robotics competition. In their role as consultants the SAC students provided technical information through brainstorming sessions and helped the girls stay on task with project management techniques, like milestone charting. This consultant model of service learning provides technical support to groups that may not have a great deal of resources and gives physics students a way to improve their interpersonal skills, test their technical expertise, and better define the marketable skill set they are developing through the physics curriculum.

  8. Transforming teacher knowledge: Modeling instruction in physics

    NASA Astrophysics Data System (ADS)

    Cabot, Lloyd H.

    I show that the Modeling physics curriculum is readily accommodated by most teachers in preference to traditional didactic pedagogies. This is so, at least in part, because Modeling focuses on a small set of connected models embedded in a self-consistent theoretical framework and thus is closely congruent with human cognition, which in this context generates mental models of physical phenomena as both predictive and explanatory devices. Whether a teacher fully implements the Modeling pedagogy depends on the depth of the teacher's commitment to inquiry-based instruction, specifically Modeling instruction, as a means of promoting student understanding of Newtonian mechanics. Moreover, this commitment trumps all other characteristics: teacher educational background, content coverage issues, student achievement data, district or state learning standards, and district or state student assessments. Indeed, distinctive differences exist in how Modeling teachers deliver their curricula, and some teachers are measurably more effective than others in their delivery, but they all share an unshakable belief in the efficacy of inquiry-based, constructivist-oriented instruction. The Modeling Workshops' pedagogy, duration, and social interactions impact teachers' self-identification as members of a professional community. Finally, I discuss the consequences my research may have for the Modeling Instruction program designers and for designers of professional development programs generally.

  9. Full-waveform modeling and inversion of physical model data

    NASA Astrophysics Data System (ADS)

    Cai, Jian; Zhang, Jie

    2016-08-01

    Because full elastic waveform inversion requires considerable computation time for forward modeling and inversion, acoustic waveform inversion is often applied to marine data for reducing the computational time. To understand the validity of the acoustic approximation, we study data collected from an ultrasonic laboratory with a known physical model by applying elastic and acoustic waveform modeling and acoustic waveform inversion. This study enables us to evaluate waveform differences quantitatively between synthetics and real data from the same physical model and to understand the effects of different objective functions in addressing the waveform differences for full-waveform inversion. Because the materials used in the physical experiment are viscoelastic, we find that both elastic and acoustic synthetics differ substantially from the physical data over offset in true amplitude. If attenuation is taken into consideration, the amplitude versus offset (AVO) of viscoelastic synthetics more closely approximates the physical data. To mitigate the effect of amplitude differences, we apply trace normalization to both synthetics and physical data in acoustic full-waveform inversion. The objective function is equivalent to minimizing the phase differences with indirect contributions from the amplitudes. We observe that trace normalization helps to stabilize the inversion and obtain more accurate model solutions for both synthetics and physical data.
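
Trace normalization as described above can be sketched as follows. This is an illustrative sketch only: the L2 normalization and the toy traces are assumptions for exposition, not the authors' implementation.

```python
import math

def normalize_trace(trace):
    """Scale a seismic trace to unit L2 norm so that amplitude differences
    between synthetics and data cancel, leaving the misfit dominated by phase."""
    norm = math.sqrt(sum(s * s for s in trace))
    if norm == 0.0:
        return list(trace)
    return [s / norm for s in trace]

# Two traces differing only by a constant amplitude factor become identical
# after normalization, so their residual no longer penalizes amplitude.
synthetic = normalize_trace([0.0, 1.0, -2.0, 1.0])
observed = normalize_trace([0.0, 3.0, -6.0, 3.0])  # same waveform, 3x amplitude
```

This is why the normalized objective function is effectively a phase misfit with only indirect amplitude contributions, as the abstract notes.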

  10. Modelling Students' Construction of Energy Models in Physics.

    ERIC Educational Resources Information Center

    Devi, Roshni; And Others

    1996-01-01

    Examines students' construction of experimentation models for physics theories in energy storage, transformation, and transfers involving electricity and mechanics. Student problem-solving dialogs and artificial intelligence modeling of these processes are analyzed. Construction of models established relations between elements with linear causal…

  11. Physics Beyond the Standard Model: Supersymmetry

    SciTech Connect

    Nojiri, M.M.; Plehn, T.; Polesello, G.; Alexander, John M.; Allanach, B.C.; Barr, Alan J.; Benakli, K.; Boudjema, F.; Freitas, A.; Gwenlan, C.; Jager, S.; /CERN /LPSC, Grenoble

    2008-02-01

    This collection of studies on new physics at the LHC constitutes the report of the supersymmetry working group at the Workshop 'Physics at TeV Colliders', Les Houches, France, 2007. They cover the wide spectrum of phenomenology in the LHC era, from alternative models and signatures to the extraction of relevant observables, the study of the MSSM parameter space and finally to the interplay of LHC observations with additional data expected on a similar time scale. The special feature of this collection is that while not each of the studies is explicitly performed together by theoretical and experimental LHC physicists, all of them were inspired by and discussed in this particular environment.

  12. Prototyping of cerebral vasculature physical models

    PubMed Central

    Khan, Imad S.; Kelly, Patrick D.; Singer, Robert J.

    2014-01-01

    Background: Prototyping of cerebral vasculature models through stereolithographic methods has the ability to depict the 3D structures of complicated aneurysms with high accuracy. We describe the method to manufacture such a model and review some of its uses in the context of treatment planning, research, and surgical training. Methods: We prospectively used the data from the rotational angiography of a 40-year-old female who presented with an unruptured right paraclinoid aneurysm. The 3D virtual model was then converted to a physical life-sized model. Results: The model constructed was shown to be a very accurate depiction of the aneurysm and its associated vasculature. It was found to be useful, among other things, for surgical training and as a patient education tool. Conclusion: With improving and more widespread printing options, these models have the potential to become an important part of research and training modalities. PMID:24678427

  13. Physical Modeling of the Composting Ecosystem †

    PubMed Central

    Hogan, J. A.; Miller, F. C.; Finstein, M. S.

    1989-01-01

    A composting physical model with an experimental chamber with a working volume of 14 × 10³ cm³ (0.5 ft³) was designed to avoid the exaggerated conductive heat loss that results from an outer surface-area-to-volume ratio that is disproportionately large relative to field-scale piles. In the physical model, conductive flux (rate of heat flow through chamber surfaces) was made constant and slight through a combination of insulation and temperature control of the surrounding air. This control was based on the instantaneous conductive flux, as calculated from temperature differentials via a conductive heat flow model. An experiment was performed over a 10-day period in which control of the composting process was based on ventilative heat removal in reference to a microbially favorable temperature ceiling (temperature feedback). By using the conduction control system (surrounding air temperature controlled), 2.4% of the total heat evolved from the chamber was through conduction, whereas the remainder was through the ventilative mechanisms of the latent heat of vaporization and the sensible temperature increase of air. By comparison, with insulation alone (the conduction control system was not used) conduction accounted for 33.5% of the total heat evolved. This difference in conduction resulted in substantial behavioral differences with respect to the temperature of the composting matrix and the amount of water removed. By emphasizing the slight conduction system (2.4% of total heat flow) as being a better representative of field conditions, a comparison was made between composting system behavior in the laboratory physical model and field-scale piles described in earlier reports. Numerous behavioral patterns were qualitatively similar in the laboratory and field (e.g., temperature gradient, O₂ content, and water removal). It was concluded that field-scale composting system behavior can be simulated reasonably faithfully in the physical model. PMID:16347903
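
In its simplest steady-state form, the conductive heat flow computed from temperature differentials reduces to Fourier's law, Q = kA(T_in − T_out)/L. The function and values below are an illustrative sketch of that calculation, not the authors' control code.

```python
def conductive_flux(k, area, t_inner, t_outer, thickness):
    """Steady one-dimensional conductive heat flow (W) through a chamber wall:
    Q = k * A * (T_in - T_out) / L.  All parameter values here are illustrative."""
    return k * area * (t_inner - t_outer) / thickness

# Driving the surrounding-air temperature toward the matrix temperature makes
# the conductive flux slight, which is the idea behind the control scheme.
matched = conductive_flux(0.04, 0.3, 55.0, 55.0, 0.05)   # no gradient -> no flow
unmatched = conductive_flux(0.04, 0.3, 55.0, 25.0, 0.05)
```
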

  14. Introduction. Stochastic physics and climate modelling.

    PubMed

    Palmer, T N; Williams, P D

    2008-07-28

    Finite computing resources limit the spatial resolution of state-of-the-art global climate simulations to hundreds of kilometres. In neither the atmosphere nor the ocean are small-scale processes such as convection, clouds and ocean eddies properly represented. Climate simulations are known to depend, sometimes quite strongly, on the resulting bulk-formula representation of unresolved processes. Stochastic physics schemes within weather and climate models have the potential to represent the dynamical effects of unresolved scales in ways which conventional bulk-formula representations are incapable of so doing. The application of stochastic physics to climate modelling is a rapidly advancing, important and innovative topic. The latest research findings are gathered together in the Theme Issue for which this paper serves as the introduction.
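
A stochastic physics scheme in the sense described here perturbs the bulk-formula (parametrized) tendency with random noise. The multiplicative-noise form and amplitude below are assumptions for illustration, loosely in the spirit of operational schemes rather than any specific one.

```python
import random

def stochastic_tendency(deterministic_tendency, rng, sigma=0.3):
    """Perturb a parametrized tendency T with multiplicative noise:
    T' = (1 + e) * T, with e drawn uniformly from (-sigma, sigma).
    sigma and the uniform distribution are illustrative choices."""
    e = rng.uniform(-sigma, sigma)
    return (1.0 + e) * deterministic_tendency
```

Replacing a single deterministic bulk formula with an ensemble of such perturbed tendencies is one way to represent the variability of unresolved scales.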

  15. Some Generalized Physical Models Through Homographic Group

    NASA Astrophysics Data System (ADS)

    Agop, Maricel; Gavriluţ, Alina

    2015-10-01

    In the present paper some generalized physical models are established using differential and integral elements of the geometry of the homographic group. The generalization of the hyperbolic motions (with constant acceleration) on the Minkowskian space-time and of the classical Kepler problem is analyzed using a variational principle of Matzner-Misner type. In this way, the Skyrme theory is placed in inherent continuity with the Newtonian natural philosophy.

  16. Physical modelling of failure in composites.

    PubMed

    Talreja, Ramesh

    2016-07-13

    Structural integrity of composite materials is governed by failure mechanisms that initiate at the scale of the microstructure. The local stress fields evolve with the progression of the failure mechanisms. Within the full span from initiation to criticality of the failure mechanisms, the governing length scales in a fibre-reinforced composite change from the fibre size to the characteristic fibre-architecture sizes, and eventually to a structural size, depending on the composite configuration and structural geometry as well as the imposed loading environment. Thus, a physical modelling of failure in composites must necessarily be of multi-scale nature, although not always with the same hierarchy for each failure mode. With this background, the paper examines the currently available main composite failure theories to assess their ability to capture the essential features of failure. A case is made for an alternative in the form of physical modelling and its skeleton is constructed based on physical observations and systematic analysis of the basic failure modes and associated stress fields and energy balances. This article is part of the themed issue 'Multiscale modelling of the structural integrity of composite materials'. PMID:27242307

  17. Physical models of polarization mode dispersion

    SciTech Connect

    Menyuk, C.R.; Wai, P.K.A.

    1995-12-31

    The effect of randomly varying birefringence on light propagation in optical fibers is studied theoretically in the parameter regime that will be used for long-distance communications. In this regime, the birefringence is large and varies very rapidly in comparison to the nonlinear and dispersive scale lengths. We determine the polarization mode dispersion, and we show that physically realistic models yield the same result for polarization mode dispersion as earlier heuristic models that were introduced by Poole. We also prove an ergodic theorem.

  18. Statistical physical models of cellular motility

    NASA Astrophysics Data System (ADS)

    Banigan, Edward J.

    Cellular motility is required for a wide range of biological behaviors and functions, and the topic poses a number of interesting physical questions. In this work, we construct and analyze models of various aspects of cellular motility using tools and ideas from statistical physics. We begin with a Brownian dynamics model for actin-polymerization-driven motility, which is responsible for cell crawling and "rocketing" motility of pathogens. Within this model, we explore the robustness of self-diffusiophoresis, which is a general mechanism of motility. Using this mechanism, an object such as a cell catalyzes a reaction that generates a steady-state concentration gradient that propels the object in a particular direction. We then apply these ideas to a model for depolymerization-driven motility during bacterial chromosome segregation. We find that depolymerization and protein-protein binding interactions alone are sufficient to robustly pull a chromosome, even against large loads. Next, we investigate how forces and kinetics interact during eukaryotic mitosis with a many-microtubule model. Microtubules exert forces on chromosomes, but since individual microtubules grow and shrink in a force-dependent way, these forces lead to bistable collective microtubule dynamics, which provides a mechanism for chromosome oscillations and microtubule-based tension sensing. Finally, we explore kinematic aspects of cell motility in the context of the immune system. We develop quantitative methods for analyzing cell migration statistics collected during imaging experiments. We find that during chronic infection in the brain, T cells run and pause stochastically, following the statistics of a generalized Levy walk. These statistics may contribute to immune function by mimicking an evolutionarily conserved efficient search strategy. Additionally, we find that naive T cells migrating in lymph nodes also obey non-Gaussian statistics. Altogether, our work demonstrates how physical
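
Run lengths in a generalized Levy walk follow a power-law distribution, which can be sampled by inverse-transform sampling. The exponent and scale below are illustrative assumptions, not the values fitted to the T-cell trajectories in this work.

```python
import random

def levy_run_length(rng, l_min=1.0, mu=2.0):
    """Draw a run length from the power law p(l) ~ l**(-mu) for l >= l_min,
    via inverse transform: l = l_min * (1 - u)**(-1 / (mu - 1)), u ~ U[0, 1).
    l_min and mu are illustrative, not fitted parameters."""
    u = rng.random()
    return l_min * (1.0 - u) ** (-1.0 / (mu - 1.0))
```

The heavy tail of this distribution (occasional very long runs among many short ones) is what distinguishes Levy-walk migration statistics from Brownian motion.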

  19. Physical vs. Mathematical Models in Rock Mechanics

    NASA Astrophysics Data System (ADS)

    Morozov, I. B.; Deng, W.

    2013-12-01

    One of the less noted challenges in understanding the mechanical behavior of rocks at both in situ and lab conditions is the character of theoretical approaches being used. Currently, the emphasis is made on spatial averaging theories (homogenization and numerical models of microstructure), empirical models for temporal behavior (material memory, compliance functions and complex moduli), and mathematical transforms (Laplace and Fourier) used to infer the Q-factors and 'relaxation mechanisms'. In geophysical applications, we have to rely on such approaches for very broad spatial and temporal scales which are not available in experiments. However, the above models often make insufficient use of physics and utilize, for example, the simplified 'correspondence principle' instead of the laws of viscosity and friction. As a result, the commonly-used time- and frequency-dependent (visco)elastic moduli represent apparent properties related to the measurement procedures and not necessarily to material properties. Predictions made from such models may therefore be inaccurate or incorrect when extrapolated beyond the lab scales. To overcome the above challenge, we need to utilize the methods of micro- and macroscopic mechanics and thermodynamics known in theoretical physics. This description is rigorous and accurate, uses only partial differential equations, and allows straightforward numerical implementations. One important observation from the physical approach is that the analysis should always be done for the specific geometry and parameters of the experiment. Here, we illustrate these methods on axial deformations of a cylindrical rock sample in the lab. A uniform, isotropic elastic rock with a thermoelastic effect is considered in four types of experiments: 1) axial extension with free transverse boundary, 2) pure axial extension with constrained transverse boundary, 3) pure bulk expansion, and 4) axial loading harmonically varying with time. In each of these cases, an

  20. A physical model of Titan's clouds

    NASA Technical Reports Server (NTRS)

    Toon, O. B.; Pollack, J. B.; Turco, R. P.

    1980-01-01

    A physical model of the formation and growth of aerosols in the atmosphere of Titan has been constructed in light of the observed correlation between variations in Titan's albedo and the sunspot cycle. The model was developed to fit spectral observations of deep methane bands, pressures, temperature distributions, and cloud structure, and is based on a one-dimensional physical-chemical model developed to simulate the earth's stratospheric aerosol layer. Sensitivity tests reveal the model parameters to be relatively insensitive to particle shape but sensitive to particle density, with high particle densities requiring larger aerosol mass production rates to produce compatible clouds. Solution of the aerosol continuity equations for particles of sizes 13 Å to about 3 microns indicates the importance of a warm upper atmosphere and a high-altitude mass injection layer, and the production of aerosols at very low aerosol optical depths. Limits are obtained for the chemical production of aerosol mass and the eddy diffusion coefficient, and it is found that an increase in mass input causes a decrease in mean particle size.

  1. Beyond the standard model of particle physics.

    PubMed

    Virdee, T S

    2016-08-28

    The Large Hadron Collider (LHC) at CERN and its experiments were conceived to tackle open questions in particle physics. The mechanism of the generation of mass of fundamental particles has been elucidated with the discovery of the Higgs boson. It is clear that the standard model is not the final theory. The open questions still awaiting clues or answers, from the LHC and other experiments, include: What is the composition of dark matter and of dark energy? Why is there more matter than anti-matter? Are there more space dimensions than the familiar three? What is the path to the unification of all the fundamental forces? This talk will discuss the status of, and prospects for, the search for new particles, symmetries and forces in order to address the open questions. This article is part of the themed issue 'Unifying physics and technology in light of Maxwell's equations'. PMID:27458261

  3. Models in Physics, Models for Physics Learning, and Why the Distinction May Matter in the Case of Electric Circuits

    ERIC Educational Resources Information Center

    Hart, Christina

    2008-01-01

    Models are important both in the development of physics itself and in teaching physics. Historically, the consensus models of physics have come to embody particular ontological assumptions and epistemological commitments. Educators have generally assumed that the consensus models of physics, which have stood the test of time, will also work well…

  4. Structured physical examination data: a modeling challenge.

    PubMed

    Doupi, P; van Ginneken, A M

    2001-01-01

    The success of systems facilitating collection of structured data by clinicians is largely dependent on the flexibility of the interface. The Open Record for CAre (ORCA) makes use of a generic model to support knowledge-based structured data entry for a variety of medical domains. An endeavor undertaken recently aimed to cover the broader area of Physical Examination by expanding the contents of the knowledge base. The model was found to be adequately expressive for supporting this task. Maintaining the balance between flexibility of the interface and constraints dictated by reliable retrieval, however, proved to be a considerable challenge. In this paper we illustrate through specific examples the effect of this trade-off on the modeling process, together with the rationale for the chosen solutions and suggestions for future research focus.

  5. Physical modelling of the nuclear pore complex

    PubMed Central

    Fassati, Ariberto; Ford, Ian J.; Hoogenboom, Bart W.

    2013-01-01

    Physically interesting behaviour can arise when soft matter is confined to nanoscale dimensions. A highly relevant biological example of such a phenomenon is the Nuclear Pore Complex (NPC) found perforating the nuclear envelope of eukaryotic cells. In the central conduit of the NPC, of ∼30–60 nm diameter, a disordered network of proteins regulates all macromolecular transport between the nucleus and the cytoplasm. In spite of a wealth of experimental data, the selectivity barrier of the NPC has yet to be explained fully. Experimental and theoretical approaches are complicated by the disordered and heterogeneous nature of the NPC conduit. Modelling approaches have focused on the behaviour of the partially unfolded protein domains in the confined geometry of the NPC conduit, and have demonstrated that within the range of parameters thought relevant for the NPC, widely varying behaviour can be observed. In this review, we summarise recent efforts to physically model the NPC barrier and function. We illustrate how attempts to understand NPC barrier function have employed many different modelling techniques, each of which have contributed to our understanding of the NPC.

  6. Ionospheric irregularity physics modelling. Memorandum report

    SciTech Connect

    Ossakow, S.L.; Keskinen, M.J.; Zalesak, S.T.

    1982-02-09

    Theoretical and numerical simulation techniques have been employed to study ionospheric F region plasma cloud striation phenomena, equatorial spread F phenomena, and high latitude diffuse auroral F region irregularity phenomena. Each of these phenomena can cause scintillation effects. The results and ideas from these studies are state-of-the-art, agree well with experimental observations, and have induced experimentalists to look for theoretically predicted results. One conclusion that can be drawn from these studies is that ionospheric irregularity phenomena can be modelled from a first principles physics point of view. Theoretical and numerical simulation results from the aforementioned ionospheric irregularity areas will be presented.

  7. Physical modeling synthesis of recorder sound

    NASA Astrophysics Data System (ADS)

    Shiraiwa, Hiroko; Kishi, Kenshi; Nakamura, Isao

    2003-04-01

    A time-domain simulation of the soprano baroque recorder based on the digital waveguide model (DWM) and an air-reed model is introduced. The air-reed model is developed upon the negative acoustic displacement model (NADM), which was proposed for the organ flue-pipe simulation [Adachi, Proc. of ISMA 1997, pp. 251-260], based on the semiempirical model by Fletcher [Fletcher and Rossing, The Physics of Musical Instruments, 2nd ed. (Springer, Berlin, 2000)]. Two models are proposed to couple the DWM and the NADM. The jet amplification coefficient is remodeled for the application of the NADM to the recorder, in light of recent experimental reports [Yoshikawa and Arimoto, Proc. of ISMA 2001, pp. 309-312]. The simulation results are presented in terms of the mode transient characteristics and the spectral characteristics of the synthesized sounds. They indicate that the NADM is not sufficient to describe the realistic mode transients of the recorder, although the synthesized sounds maintain a timbre closely resembling that of the recorder.
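
A digital waveguide in its simplest form is a pair of delay lines carrying right- and left-travelling waves, reflected at the ends. The toy open-open tube below is only a sketch of the DWM idea, not the paper's coupled DWM plus air-reed (NADM) model.

```python
from collections import deque

# Toy one-dimensional digital waveguide: two N-sample delay lines, with
# sign-inverting reflection at both (open) ends.  Illustrative only.
def waveguide_step(right, left):
    """Advance the waveguide one sample; return the waves arriving at each end."""
    out_r = right[-1]          # wave arriving at the far open end
    out_l = left[0]            # wave arriving at the near open end
    right.rotate(1)            # shift the right-travelling line one sample
    left.rotate(-1)            # shift the left-travelling line one sample
    right[0] = -out_l          # open-end reflection: pressure wave inverts
    left[-1] = -out_r
    return out_r, out_l
```

With two inverting reflections the round trip takes 2N samples with no net sign change, so a pulse recurs with period 2N and the fundamental frequency is fs/(2N).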

  8. Modelling biological complexity: a physical scientist's perspective.

    PubMed

    Coveney, Peter V; Fowler, Philip W

    2005-09-22

  9. Modelling biological complexity: a physical scientist's perspective

    PubMed Central

    Coveney, Peter V; Fowler, Philip W

    2005-01-01

    We discuss the modern approaches of complexity and self-organization to understanding dynamical systems and how these concepts can inform current interest in systems biology. From the perspective of a physical scientist, it is especially interesting to examine how the differing weights given to philosophies of science in the physical and biological sciences impact the application of the study of complexity. We briefly describe how the dynamics of the heart and circadian rhythms, canonical examples of systems biology, are modelled by sets of nonlinear coupled differential equations, which have to be solved numerically. A major difficulty with this approach is that all the parameters within these equations are not usually known. Coupled models that include biomolecular detail could help solve this problem. Coupling models across large ranges of length- and time-scales is central to describing complex systems and therefore to biology. Such coupling may be performed in at least two different ways, which we refer to as hierarchical and hybrid multiscale modelling. While limited progress has been made in the former case, the latter is only beginning to be addressed systematically. These modelling methods are expected to bring numerous benefits to biology, for example, the properties of a system could be studied over a wider range of length- and time-scales, a key aim of systems biology. Multiscale models couple behaviour at the molecular biological level to that at the cellular level, thereby providing a route for calculating many unknown parameters as well as investigating the effects at, for example, the cellular level, of small changes at the biomolecular level, such as a genetic mutation or the presence of a drug. The modelling and simulation of biomolecular systems is itself very computationally intensive; we describe a recently developed hybrid continuum-molecular model, HybridMD, and its associated molecular insertion algorithm, which point the way towards the

  10. Detailed Physical Trough Model for NREL's Solar Advisor Model: Preprint

    SciTech Connect

    Wagner, M. J.; Blair, N.; Dobos, A.

    2010-10-01

    Solar Advisor Model (SAM) is a free software package made available by the National Renewable Energy Laboratory (NREL), Sandia National Laboratory, and the US Department of Energy. SAM contains hourly system performance and economic models for concentrating solar power (CSP) systems, photovoltaic, solar hot-water, and generic fuel-use technologies. Versions of SAM prior to 2010 included only the parabolic trough model based on Excelergy. This model uses top-level empirical performance curves to characterize plant behavior, and is thus limited in predictive capability for new technologies or component configurations. To address this and other functionality challenges, a new trough model, derived from physical first principles, was commissioned to supplement the Excelergy-based empirical model. This new 'physical model' characterizes the performance of the whole parabolic trough plant by replacing empirical curve-fit relationships with more detailed calculations where practical. The resulting model matches the annual performance of the SAM empirical model (which has been previously verified with plant data) while maintaining run-times compatible with parametric analysis, adding flexibility in modeled system configurations, and providing more detailed performance calculations in the solar field, power block, piping, and storage subsystems.

  11. Physics model for wringing of wet cloth

    NASA Astrophysics Data System (ADS)

    Dany Rahmayanti, Handika; Utami, Fisca Dian; Abdullah, Mikrajuddin

    2016-11-01

    One activity that has been performed by human beings for a long time is washing clothes. Before the invention of the washing machine, clothes were washed by hand and then wrung before drying in the open air. When observed carefully, the wringing of cloth presents some interesting phenomena. However, there are no reports on the physical modelling of this very old activity. This paper reports a simple model to explain the discharge of water from clothes when squeezed. A simple tool was also designed to retrieve data to confirm the theory. We found that the theoretical predictions accurately explained the experimental results. The experiments were conducted on two types of cloth: towels and batik cloth. We also obtained a universal curve to which all the data converged.

  12. Semi-Empirical Modeling of SLD Physics

    NASA Technical Reports Server (NTRS)

    Wright, William B.; Potapczuk, Mark G.

    2004-01-01

    The effects of supercooled large droplets (SLD) in icing have been an area of much interest in recent years. As part of this effort, the assumptions used for ice accretion software have been reviewed. A literature search was performed to determine advances from other areas of research that could be readily incorporated. Experimental data in the SLD regime was also analyzed. A semi-empirical computational model is presented which incorporates first order physical effects of large droplet phenomena into icing software. This model has been added to the LEWICE software. Comparisons are then made to SLD experimental data that has been collected to date. Results will be presented for the comparison of water collection efficiency, ice shape and ice mass.

  13. Physics-based models of the plasmasphere

    SciTech Connect

    Jordanova, Vania K; Pierrard, Vivane; Goldstein, Jerry; Andr'e, Nicolas; Lemaire, Joseph F; Liemohn, Mike W; Matsui, H

    2008-01-01

    We describe recent progress in physics-based models of the plasmasphere using the fluid and the kinetic approaches. Global modeling of the dynamics and influence of the plasmasphere is presented. Results from global plasmasphere simulations are used to understand and quantify (i) the electric potential pattern and evolution during geomagnetic storms, and (ii) the influence of the plasmasphere on the excitation of electromagnetic ion cyclotron (EMIC) waves and precipitation of energetic ions in the inner magnetosphere. The interactions of the plasmasphere with the ionosphere and the other regions of the magnetosphere are pointed out. We show the results of simulations for the formation of the plasmapause and discuss the influence of plasmaspheric wind and of ultra low frequency (ULF) waves on the transport of plasmaspheric material. Theoretical formulations used to model the electric field and plasma distribution in the plasmasphere are given. Model predictions are compared to recent CLUSTER and IMAGE observations, as well as to results of earlier models and satellite observations.

  14. 3-D physical models of amitosis (cytokinesis).

    PubMed

    Cheng, Kang; Zou, Changhua

    2005-01-01

    Based on Newton's laws, extended Coulomb's law and published biological data, we develop our 3-D physical models of natural and normal amitosis (cytokinesis) for prokaryotes (bacterial cells) in M phase. We propose the following hypotheses. Chromosome rings exclusion: no normally and naturally replicated chromosome rings (RCR) can occupy the same prokaryote, a bacterial cell. The RCR produce spontaneous and strong electromagnetic fields (EMF), which can be altered environmentally, in protoplasm and cortex. The EMF is approximately a repulsive quasi-static electric (slowly varying and mostly electric) field (EF). The EF forces between the RCR are strong enough, and orderly accumulate contractile proteins that divide the prokaryotes in the cell cortex of the division plane or directly split the cell compartment envelope longitudinally. The radial component of the EF forces could also make furrows or cleavages of prokaryotes. The EF distribution controls the protoplasm partition and completes the amitosis (cytokinesis). After the cytokinesis, the spontaneous and strong EF disappears because the net charge accumulation becomes weak in the protoplasm. The exclusion arises because the two sets of informative objects (RCR) have identical DNA code information and are electromagnetically identical, and therefore repel each other. We also compare divisions among eukaryotes, prokaryotes, mitochondria and chloroplasts and propose our hypothesis: the principles of our models apply to divisions of mitochondria and chloroplasts of eukaryotes too, because these division mechanisms are closer than others from a physics point of view. Though we develop our model using 1 division plane (i.e., 1 cell is divided into 2 cells) as an example, the principle of our model applies to cases with multiple division planes (i.e., 1 cell is divided into multiple cells) too.

  15. Propulsion Physics Using the Chameleon Density Model

    NASA Technical Reports Server (NTRS)

    Robertson, Glen A.

    2011-01-01

    To grow as a spacefaring race, future spaceflight systems will require a new theory of propulsion, specifically one that does not require mass ejection yet does not limit the high thrust necessary to accelerate within or beyond our solar system and return within a normal work period or lifetime. The Chameleon Density Model (CDM) is one such model that could provide new paths in propulsion toward this end. The CDM is based on Chameleon Cosmology, a dark matter theory introduced by Khoury and Weltman in 2004 and so named because it is hidden within known physics; the Chameleon field represents a scalar field within and about an object, even in the vacuum. The CDM relates to density changes in the Chameleon field, where the density changes are related to matter accelerations within and about an object. These density changes in turn change how an object couples to its environment, so that thrust is achieved by causing a differential in the environmental coupling about an object. As a demonstration that the CDM fits within known propulsion physics, this paper uses the model to estimate the thrust from a solid rocket motor. Under the CDM, a solid rocket constitutes a two-body system, i.e., the changing density of the rocket and the changing density in the nozzle arising from the accelerated mass, and the interactions between these systems cause a differential coupling to the local gravity environment of the Earth. It is shown that the resulting differential in coupling produces a calculated value for the thrust nearly equivalent to the conventional thrust model used in Sutton and Biblarz, Rocket Propulsion Elements. Embedded in the equations are the Universe energy scale factor, the reduced Planck mass and the Planck length, which relate the large Universe scale to the subatomic scale.

  16. Tactile Teaching: Exploring Protein Structure/Function Using Physical Models

    ERIC Educational Resources Information Center

    Herman, Tim; Morris, Jennifer; Colton, Shannon; Batiza, Ann; Patrick, Michael; Franzen, Margaret; Goodsell, David S.

    2006-01-01

    The technology now exists to construct physical models of proteins based on atomic coordinates of solved structures. We review here our recent experiences in using physical models to teach concepts of protein structure and function at both the high school and the undergraduate levels. At the high school level, physical models are used in a…

  17. Computer Integrated Manufacturing: Physical Modelling Systems Design. A Personal View.

    ERIC Educational Resources Information Center

    Baker, Richard

    A computer-integrated manufacturing (CIM) Physical Modeling Systems Design project was undertaken in a time of rapid change in the industrial, business, technological, training, and educational areas in Australia. A specification of a manufacturing physical modeling system was drawn up. Physical modeling provides a flexibility and configurability…

  18. Fuzzy modelling of Atlantic salmon physical habitat

    NASA Astrophysics Data System (ADS)

    St-Hilaire, André; Mocq, Julien; Cunjak, Richard

    2015-04-01

    Fish habitat models typically attempt to quantify the amount of available river habitat for a given fish species under various flow and hydraulic conditions. To achieve this, the preferred range of values of key physical habitat variables (e.g. water level, velocity, substrate diameter) for the targeted fish species needs to be modelled. In this context, we developed several sets of habitat suitability indices for three Atlantic salmon life stages (young-of-the-year (YOY), parr, spawning adults) with the help of fuzzy logic modelling. Using the knowledge of twenty-seven experts from both sides of the Atlantic Ocean, we defined fuzzy sets of four variables (depth, substrate size, velocity and Habitat Suitability Index, or HSI) and associated fuzzy rules. When applied to the Romaine River (Canada), median curves of standardized Weighted Usable Area (WUA) were calculated and a confidence interval was obtained by bootstrap resampling. Despite the large range of WUA covered by the expert WUA curves, confidence intervals were relatively narrow: an average width of 0.095 (on a scale of 0 to 1) for spawning habitat, 0.155 for parr rearing habitat and 0.160 for YOY rearing habitat. When considering an environmental flow value corresponding to 90% of the maximum reached by the WUA curve, results seem acceptable for the Romaine River. Overall, this proposed fuzzy logic method seems suitable for modelling habitat availability for the three life stages, while also providing an estimate of uncertainty in salmon preferences.
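The fuzzy-logic machinery the abstract describes can be sketched minimally. This is an illustrative assumption throughout: the triangular membership breakpoints, the choice of variables, and the single min-rule below are hypothetical, not the fuzzy sets or rules elicited from the twenty-seven experts.

```python
# Hedged sketch of a Mamdani-style fuzzy habitat-suitability rule.
# All breakpoints below are hypothetical, for illustration only.

def tri(x, a, b, c):
    """Triangular membership function peaking at b over support [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def suitability(depth_m, velocity_ms):
    """Degree to which (depth, velocity) is 'suitable' for a parr-like stage."""
    mu_depth = tri(depth_m, 0.1, 0.4, 0.9)      # hypothetical depth set
    mu_vel = tri(velocity_ms, 0.1, 0.5, 1.2)    # hypothetical velocity set
    # Fuzzy AND of the rule antecedents via the min operator.
    return min(mu_depth, mu_vel)

print(suitability(0.4, 0.5))  # both memberships at their peak -> 1.0
```

Summing such suitabilities, weighted by cell area, over a modelled reach is what yields a Weighted Usable Area curve as a function of flow.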

  19. Compass models: Theory and physical motivations

    NASA Astrophysics Data System (ADS)

    Nussinov, Zohar; van den Brink, Jeroen

    2015-01-01

    Compass models are theories of matter in which the couplings between the internal spin (or other relevant field) components are inherently spatially (typically, direction) dependent. A simple illustrative example is furnished by the 90° compass model on a square lattice in which only couplings of the form τixτjx (where {τia}a denote Pauli operators at site i ) are associated with nearest-neighbor sites i and j separated along the x axis of the lattice while τiyτjy couplings appear for sites separated by a lattice constant along the y axis. Similar compass-type interactions can appear in diverse physical systems. For instance, compass models describe Mott insulators with orbital degrees of freedom where interactions sensitively depend on the spatial orientation of the orbitals involved as well as the low-energy effective theories of frustrated quantum magnets, and a host of other systems such as vacancy centers, and cold atomic gases. The fundamental interdependence between internal (spin, orbital, or other) and external (i.e., spatial) degrees of freedom which underlies compass models generally leads to very rich behaviors, including the frustration of (semi-)classical ordered states on nonfrustrated lattices, and to enhanced quantum effects, prompting, in certain cases, the appearance of zero-temperature quantum spin liquids. As a consequence of these frustrations, new types of symmetries and their associated degeneracies may appear. These intermediate symmetries lie midway between the extremes of global symmetries and local gauge symmetries and lead to effective dimensional reductions. In this article, compass models are reviewed in a unified manner, paying close attention to exact consequences of these symmetries and to thermal and quantum fluctuations that stabilize orders via order-out-of-disorder effects. This is complemented by a survey of numerical results. In addition to reviewing past works, a number of other models are introduced and new results
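The defining coupling structure above (τ^x τ^x on bonds along x, τ^y τ^y on bonds along y) can be written down explicitly. In this sketch the 2x2 open-boundary cluster, the overall sign, and the coupling J = 1 are assumptions for illustration, not choices made in the review.

```python
import numpy as np

# Hedged sketch: the 90-degree compass Hamiltonian on a tiny 2x2 cluster,
# H = -sum_x-bonds tau^x_i tau^x_j - sum_y-bonds tau^y_i tau^y_j.

sx = np.array([[0, 1], [1, 0]], dtype=complex)   # Pauli tau^x
sy = np.array([[0, -1j], [1j, 0]])               # Pauli tau^y
I2 = np.eye(2)

def site_op(single, site, n=4):
    """Embed a single-site operator at `site` in an n-site tensor product."""
    out = np.array([[1.0 + 0j]])
    for k in range(n):
        out = np.kron(out, single if k == site else I2)
    return out

# Sites laid out row-major: 0 1 / 2 3.
H = np.zeros((16, 16), dtype=complex)
for i, j in [(0, 1), (2, 3)]:        # nearest neighbours along x
    H -= site_op(sx, i) @ site_op(sx, j)
for i, j in [(0, 2), (1, 3)]:        # nearest neighbours along y
    H -= site_op(sy, i) @ site_op(sy, j)

evals = np.linalg.eigvalsh(H)
print(evals[0])  # ground-state energy of this tiny cluster
```

Because the x-bond and y-bond terms do not commute, even this four-site example already exhibits the competition between directions that frustrates simple ordered states.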

  20. A Holoinformational Model of the Physical Observer

    NASA Astrophysics Data System (ADS)

    Biase, Francisco Di

    2013-09-01

    The author proposes a holoinformational view of the observer based on the holonomic theory of brain/mind function and quantum brain dynamics developed by Karl Pribram, Sir John Eccles, R.L. Amoroso, Hameroff, Jibu and Yasue, and on the quantum-holographic and holomovement theory of David Bohm. This conceptual framework is integrated with the nonlocal information properties of the Quantum Field Theory of Umezawa, with the concepts of negentropy, order, and organization developed by Shannon, Wiener, Szilard and Brillouin, and with the theories of self-organization and complexity of Prigogine, Atlan, Jantsch and Kauffman. Wheeler's "it from bit" concept of a participatory universe, and the developments in the physics of information made by Zurek and others with the concepts of statistical entropy and algorithmic entropy, related to the number of bits being processed in the mind of the observer, are also considered. This new synthesis gives a self-organizing quantum nonlocal informational basis for a new model of awareness in a participatory universe. In this synthesis, awareness is conceived as meaningful quantum nonlocal information interconnecting the brain and the cosmos through a holoinformational unified field integrating the nonlocal holistic (quantum) and local (Newtonian) domains. We propose that the cosmology of the physical observer is this unified nonlocal quantum-holographic cosmos manifesting itself through awareness, interconnecting in a participatory, holistic and indivisible way the human mind-brain to all levels of the self-organizing holographic anthropic multiverse.

  1. Physical and Statistical Modeling of Saturn's Troposphere

    NASA Astrophysics Data System (ADS)

    Yanamandra-Fisher, Padmavati A.; Braverman, Amy J.; Orton, Glenn S.

    2002-12-01

    The 5.2-μm atmospheric window on Saturn is dominated by thermal radiation and weak gaseous absorption, with a 20% contribution from sunlight reflected from clouds. The striking variability displayed by Saturn's clouds at 5.2 μm and the detection of PH3 (an atmospheric tracer) variability near or below the 2-bar level and possibly at lower pressures provide salient constraints on the dynamical organization of Saturn's atmosphere by constraining the strength of vertical motions at two levels across the disk. We analyse the 5.2-μm spectra of Saturn by utilising two independent methods: (a) physical models based on the relevant atmospheric parameters and (b) statistical analysis, based on principal components analysis (PCA), to determine the influence of the variation of phosphine and the opacity of clouds deep within Saturn's atmosphere to understand the dynamics in its atmosphere.
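The second of the two analysis methods named above, principal components analysis, can be sketched compactly via the singular value decomposition. The synthetic "spectra" below are random stand-ins, an assumption made purely so the example is self-contained; only the decomposition itself is illustrated, not the paper's Saturn data.

```python
import numpy as np

# Hedged sketch of PCA on a stack of spectra (rows) over wavelength bins
# (columns). The data here are synthetic placeholders.

rng = np.random.default_rng(0)
spectra = rng.normal(size=(50, 8))          # 50 spectra, 8 wavelength bins
centered = spectra - spectra.mean(axis=0)   # PCA requires mean-centering
U, s, Vt = np.linalg.svd(centered, full_matrices=False)

explained = s**2 / np.sum(s**2)             # fraction of variance per component
scores = centered @ Vt.T                    # projection onto principal axes
print(explained[:3])                        # leading variance fractions
```

In a study like this one, the leading components would then be inspected for correlation with physical quantities such as phosphine abundance or cloud opacity.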

  2. Outstanding questions: physics beyond the Standard Model.

    PubMed

    Ellis, John

    2012-02-28

    The Standard Model of particle physics agrees very well with experiment, but many important questions remain unanswered; among them are the following. What is the origin of particle masses and are they due to a Higgs boson? How does one understand the number of species of matter particles and how do they mix? What is the origin of the difference between matter and antimatter, and is it related to the origin of the matter in the Universe? What is the nature of the astrophysical dark matter? How does one unify the fundamental interactions? How does one quantize gravity? In this article, I introduce these questions and discuss how they may be addressed by experiments at the Large Hadron Collider, with particular attention to the search for the Higgs boson and supersymmetry. PMID:22253238

  3. A Conceptual Model of Observed Physical Literacy

    ERIC Educational Resources Information Center

    Dudley, Dean A.

    2015-01-01

    Physical literacy is a concept that is gaining greater acceptance around the world with the United Nations Educational, Cultural, and Scientific Organization (2013) recognizing it as one of several central tenets in a quality physical education framework. However, previous attempts to understand progression in physical literacy learning have been…

  4. Physical modeling of transverse drainage mechanisms

    NASA Astrophysics Data System (ADS)

    Douglass, J. C.; Schmeeckle, M. W.

    2005-12-01

    Streams that incise across bedrock highlands such as anticlines, upwarps, cuestas, or horsts are termed transverse drainages. Their relevance today involves such diverse matters as highway and dam construction decisions, location of wildlife corridors, better-informed sediment budgets, and detailed studies into developmental histories of late Cenozoic landscapes. The transient conditions responsible for transverse drainage incision have been extensively studied on a case-by-case basis, and the dominant mechanisms proposed include antecedence, superimposition, overflow, and piracy. Modeling efforts have been limited to antecedence, and as such the specific erosional conditions required for transverse drainage incision, with respect to the individual mechanisms, remain poorly understood. In this study, fifteen experiments attempted to simulate the four mechanisms and were constructed on a 9.15 m long, 2.1 m wide, and 0.45 m deep stream table. Experiments lasted between 50 and 220 minutes. The stream table was filled with seven tons of sediment consisting of a silt and clay (30%) and a fine to coarse sand (70%) mixture. The physical models highlighted the importance of downstream aggradation with regard to antecedent incision versus possible defeat and diversion. The overflow experiments indicate that retreating knickpoints across a basin outlet produce a high probability of downstream flooding when associated with a deep lake. Misters used in a couple of experiments illustrate a potential complication with regard to piracy driven by headward erosion. Relatively level, asymmetrically sloped ridges allow the drainage divide across the ridge to retreat through headward erosion, but retreat is hindered when the ridge's apex undulates or when the ridge is symmetrically sloped. Although these physical models cannot strictly simulate natural transverse drainages, the observed processes, their development over time, and the resultant landforms roughly emulate their natural counterparts.

  5. The Role of Various Curriculum Models on Physical Activity Levels

    ERIC Educational Resources Information Center

    Culpepper, Dean O.; Tarr, Susan J.; Killion, Lorraine E.

    2011-01-01

    Researchers have suggested that physical education curricula can be highly effective in increasing physical activity levels at school (Sallis & Owen, 1999). The purpose of this study was to investigate the impact of various curriculum models on physical activity. Total steps were measured on 1,111 subjects and three curriculum models were studied…

  6. A Structural Equation Model of Expertise in College Physics

    ERIC Educational Resources Information Center

    Taasoobshirazi, Gita; Carr, Martha

    2009-01-01

    A model of expertise in physics was tested on a sample of 374 college students in 2 different level physics courses. Structural equation modeling was used to test hypothesized relationships among variables linked to expert performance in physics including strategy use, pictorial representation, categorization skills, and motivation, and these…

  7. Models for Curriculum and Pedagogy in Elementary School Physical Education

    ERIC Educational Resources Information Center

    Kulinna, Pamela Hodges

    2008-01-01

    The purpose of this article is to review current models for curriculum and pedagogy used in elementary school physical education programs. Historically, physical educators have developed and used a multiactivity curriculum in order to educate students through physical movement. More recently, a variety of alternative curricular models have been…

  8. A Structural Equation Model of Conceptual Change in Physics

    ERIC Educational Resources Information Center

    Taasoobshirazi, Gita; Sinatra, Gale M.

    2011-01-01

    A model of conceptual change in physics was tested on introductory-level, college physics students. Structural equation modeling was used to test hypothesized relationships among variables linked to conceptual change in physics including an approach goal orientation, need for cognition, motivation, and course grade. Conceptual change in physics…

  9. Evolution and physics in comparative protein structure modeling.

    PubMed

    Fiser, András; Feig, Michael; Brooks, Charles L; Sali, Andrej

    2002-06-01

    From a physical perspective, the native structure of a protein is a consequence of physical forces acting on the protein and solvent atoms during the folding process. From a biological perspective, the native structure of proteins is a result of evolution over millions of years. Correspondingly, there are two types of protein structure prediction methods, de novo prediction and comparative modeling. We review comparative protein structure modeling and discuss the incorporation of physical considerations into the modeling process. A good starting point for achieving this aim is provided by comparative modeling by satisfaction of spatial restraints. Incorporation of physical considerations is illustrated by an inclusion of solvation effects into the modeling of loops.

  10. Engaging Students In Modeling Instruction for Introductory Physics

    NASA Astrophysics Data System (ADS)

    Brewe, Eric

    2016-05-01

    Teaching introductory physics is arguably one of the most important things that a physics department does. It is the primary way that students from other science disciplines engage with physics and it is the introduction to physics for majors. Modeling instruction is an active learning strategy for introductory physics built on the premise that science proceeds through the iterative process of model construction, development, deployment, and revision. We describe the role that participating in authentic modeling has in learning and then explore how students engage in this process in the classroom. In this presentation, we provide a theoretical background on models and modeling and describe how these theoretical elements are enacted in the introductory university physics classroom. We provide both quantitative and video data to link the development of a conceptual model to the design of the learning environment and to student outcomes. This work is supported in part by DUE #1140706.

  11. Modelling Mathematical Reasoning in Physics Education

    ERIC Educational Resources Information Center

    Uhden, Olaf; Karam, Ricardo; Pietrocola, Mauricio; Pospiech, Gesche

    2012-01-01

    Many findings from research as well as reports from teachers describe students' problem solving strategies as manipulation of formulas by rote. The resulting dissatisfaction with quantitative physical textbook problems seems to influence the attitude towards the role of mathematics in physics education in general. Mathematics is often seen as a…

  12. Teacher Fidelity to One Physical Education Curricular Model

    ERIC Educational Resources Information Center

    Kloeppel, Tiffany; Kulinna, Pamela Hodges; Stylianou, Michalis; van der Mars, Hans

    2013-01-01

    This study addressed teachers' fidelity to one Physical Education curricular model. The theoretical framework guiding this study included professional development and fidelity to curricular models. In this study, teachers' fidelity to the Dynamic Physical Education (DPE) curricular model was measured for high and nonsupport district groups.…

  13. Intentional Development: A Model to Guide Lifelong Physical Activity

    ERIC Educational Resources Information Center

    Cherubini, Jeffrey M.

    2009-01-01

    Framed in the context of researching influences on physical activity and actually working with individuals and groups seeking to initiate, increase or maintain physical activity, the purpose of this review is to present the model of Intentional Development as a multi-theoretical approach to guide research and applied work in physical activity.…

  14. SHERLOCK: A quasi-model-independent new physics search strategy.

    NASA Astrophysics Data System (ADS)

    Knuteson, Bruce

    2000-04-01

    We develop a quasi-model-independent prescription for searching for physics responsible for the electroweak symmetry breaking in the Standard Model, and show a preliminary version of what we find when this prescription is applied to the DZero data.

  15. TOWARD EFFICIENT RIPARIAN RESTORATION: INTEGRATING ECONOMIC, PHYSICAL, AND BIOLOGICAL MODELS

    EPA Science Inventory

    This paper integrates economic, biological, and physical models to determine the efficient combination and spatial allocation of conservation efforts for water quality protection and salmonid habitat enhancement in the Grande Ronde basin, Oregon. The integrated modeling system co...

  16. LCDD: A complete detector description package

    NASA Astrophysics Data System (ADS)

    Graf, Norman; McCormick, Jeremy

    2015-07-01

    LCDD has been developed to provide a complete detector description package for physics detector simulations using Geant4. All aspects of the experimental setup, such as the physical geometry, magnetic fields, and sensitive detector readouts, as well as control of the physics simulations, such as physics processes, interaction models and kinematic limits, are defined at runtime. Users are therefore able to concentrate on the design of the detector system without having to master the intricacies of C++ programming or being proficient in setting up their own Geant4 application. We describe both the XML-based file format and the processors which communicate this information to the underlying Geant4 simulation toolkit.

  17. An Empirical-Mathematical Modelling Approach to Upper Secondary Physics

    ERIC Educational Resources Information Center

    Angell, Carl; Kind, Per Morten; Henriksen, Ellen K.; Guttersrud, Oystein

    2008-01-01

    In this paper we describe a teaching approach focusing on modelling in physics, emphasizing scientific reasoning based on empirical data and using the notion of multiple representations of physical phenomena as a framework. We describe modelling activities from a project (PHYS 21) and relate some experiences from implementation of the modelling…

  18. Modeling the Discrimination Power of Physics Items

    ERIC Educational Resources Information Center

    Mesic, Vanes

    2011-01-01

    For the purposes of tailoring physics instruction in accordance with the needs and abilities of the students it is useful to explore the knowledge structure of students of different ability levels. In order to precisely differentiate the successive, characteristic states of student achievement it is necessary to use test items that possess…

  19. Testing a Theoretical Model of Immigration Transition and Physical Activity.

    PubMed

    Chang, Sun Ju; Im, Eun-Ok

    2015-01-01

    The purposes of the study were to develop a theoretical model to explain the relationships between immigration transition and midlife women's physical activity and test the relationships among the major variables of the model. A theoretical model, which was developed based on transitions theory and the midlife women's attitudes toward physical activity theory, consists of 4 major variables, including length of stay in the United States, country of birth, level of acculturation, and midlife women's physical activity. To test the theoretical model, a secondary analysis with data from 127 Hispanic women and 123 non-Hispanic (NH) Asian women in a national Internet study was used. Among the major variables of the model, length of stay in the United States was negatively associated with physical activity in Hispanic women. Level of acculturation in NH Asian women was positively correlated with women's physical activity. Country of birth and level of acculturation were significant factors that influenced physical activity in both Hispanic and NH Asian women. The findings support the theoretical model that was developed to examine relationships between immigration transition and physical activity; it shows that immigration transition can play an essential role in influencing health behaviors of immigrant populations in the United States. The theoretical model can be widely used in nursing practice and research that focus on immigrant women and their health behaviors. Health care providers need to consider the influences of immigration transition to promote immigrant women's physical activity. PMID:26502554

  1. Simple universal models capture all classical spin physics.

    PubMed

    De las Cuevas, Gemma; Cubitt, Toby S

    2016-03-11

    Spin models are used in many studies of complex systems because they exhibit rich macroscopic behavior despite their microscopic simplicity. Here, we prove that all the physics of every classical spin model is reproduced in the low-energy sector of certain "universal models," with at most polynomial overhead. This holds for classical models with discrete or continuous degrees of freedom. We prove necessary and sufficient conditions for a spin model to be universal and show that one of the simplest and most widely studied spin models, the two-dimensional Ising model with fields, is universal. Our results may facilitate physical simulations of Hamiltonians with complex interactions.
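
    The universal model singled out in this record, the two-dimensional Ising model with fields, has an energy function simple enough to state in a few lines. A minimal sketch (open boundary conditions, a uniform coupling J and field h are illustrative choices, not assumptions of the paper):

```python
def ising_energy(spins, J=1.0, h=0.0):
    """Energy of a 2D Ising configuration with a uniform field:
    E = -J * sum_<ij> s_i * s_j - h * sum_i s_i  (open boundaries)."""
    n_rows, n_cols = len(spins), len(spins[0])
    E = 0.0
    for r in range(n_rows):
        for c in range(n_cols):
            s = spins[r][c]
            if c + 1 < n_cols:          # bond to the right neighbor
                E -= J * s * spins[r][c + 1]
            if r + 1 < n_rows:          # bond to the lower neighbor
                E -= J * s * spins[r + 1][c]
            E -= h * s                  # field term
    return E
```

    For example, a fully aligned 2x2 lattice with J = 1 and h = 0.5 has four satisfied bonds and total energy -4 - 2 = -6.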

  2. Numerical modelling for intense laser physics

    SciTech Connect

    Audit, Edouard; Schurtz, Guy

    2007-04-06

    The recent start-up of large intense laser facilities such as the Ligne d'Integration Laser (LIL) or the LULI2000, and the arrival in the near future of the Laser Megajoule (LMJ), open great prospects for laboratory astrophysics, dense matter studies and inertial fusion. To make the most of these opportunities, several teams have set up a program which aims at satisfying simulation needs in the fields of Astrophysics, Hot Dense Matter and Inertial Confinement Fusion. A large part of the scientific production in these fields relies upon simulations of complex unsteady hydro flows, coupled to non-equilibrium transport and chemical kinetics. As the characteristic time scales of transport may be much shorter than the fluid time scale, implicit numerical methods are often required. Atomic physics data, in particular equations of state and opacities, are key and critical ingredients for the simulations done in stellar physics, laboratory astrophysics and many other fields of astrophysics. We will show the different codes used in the various fields of the project and the different methods used to capture the desired physics. We will also present ODALISC, a new opacity database aiming at providing the community with spectral opacities and numerical tools to use them efficiently in radiation-hydrodynamics codes.

  3. A model-based view of physics for computational activities in the introductory physics course

    NASA Astrophysics Data System (ADS)

    Buffler, Andy; Pillay, Seshini; Lubben, Fred; Fearick, Roger

    2008-04-01

    A model-based view of physics provides a framework within which computational activities may be structured so as to present to students an authentic representation of physics as a discipline. The use of the framework in teaching computation at the introductory physics level is illustrated by a case study based on the simultaneous translation and rotation of a disk-shaped spaceship. Student responses to an interactive worksheet are used to support guidelines for the design of computational tasks to enhance the understanding of physical systems through numerical problem solving.
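
    A computational task of the kind described, simultaneous translation and rotation of a disk, reduces to integrating Newton's second law for the center of mass alongside its rotational analogue. A minimal semi-implicit Euler sketch (a constant force through the center of mass and a constant torque are illustrative simplifications, not the worksheet's actual scenario):

```python
def simulate_disk(m, I, F, tau, dt, steps):
    """Semi-implicit Euler integration of a rigid disk starting from rest:
    translation under a constant force F and rotation under a constant
    torque tau. Returns the final (x, v, theta, omega)."""
    x = v = theta = omega = 0.0
    for _ in range(steps):
        v += (F / m) * dt        # a = F/m
        omega += (tau / I) * dt  # alpha = tau/I
        x += v * dt              # update positions with the new velocities
        theta += omega * dt
    return x, v, theta, omega
```

    Comparing the numerical x with the analytic x = F t^2 / 2m exposes a discretization overshoot of roughly F t dt / 2m, itself a useful discussion point with students.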

  4. Relativistic models in nuclear and particle physics

    SciTech Connect

    Coester, F.

    1988-01-01

    A comparative overview is presented of different approaches to the construction of phenomenological dynamical models that respect basic principles of quantum theory and relativity. Wave functions defined as matrix elements of products of field operators, on the one hand, and wave functions defined as representatives of state vectors in model Hilbert spaces, on the other, are related differently to observables; the dynamical models for these wave functions each have distinct advantages and disadvantages. 34 refs.

  5. Physics-Based Imaginary Potential and Incoherent Current Models for RTD Simulation Using Optical Model

    NASA Astrophysics Data System (ADS)

    Sharifi, M. J.; Navi, Keivan

    In this study, a physics-based model for calculating the incoherent current of a Resonant Tunneling Diode (RTD) is introduced, based on the metastable states of the RTD. A physics-based model for the imaginary potential is also introduced, which captures the full position, bias, energy and temperature dependence of the imaginary potential. By incorporating these two physics-based models, the conventional optical model becomes a completely physics-based approach to RTD simulation.

  6. Engineered Barrier System: Physical and Chemical Environment Model

    SciTech Connect

    D. M. Jolley; R. Jarek; P. Mariner

    2004-02-09

    The conceptual and predictive models documented in this Engineered Barrier System: Physical and Chemical Environment Model report describe the evolution of the physical and chemical conditions within the waste emplacement drifts of the repository. The modeling approaches and model output data will be used in the total system performance assessment (TSPA-LA) to assess the performance of the engineered barrier system and the waste form. These models evaluate the range of potential water compositions within the emplacement drifts, resulting from the interaction of introduced materials and minerals in dust with water seeping into the drifts and with aqueous solutions forming by deliquescence of dust (as influenced by atmospheric conditions), and from thermal-hydrological-chemical (THC) processes in the drift. These models also consider the uncertainty and variability in water chemistry inside the drift and the compositions of introduced materials within the drift. This report develops and documents a set of process- and abstraction-level models that constitute the engineered barrier system: physical and chemical environment model. Where possible, these models use information directly from other process model reports as input, which promotes integration among process models used for total system performance assessment. Specific tasks and activities of modeling the physical and chemical environment are included in the technical work plan "Technical Work Plan for: In-Drift Geochemistry Modeling" (BSC 2004 [DIRS 166519]). As described in the technical work plan, the development of this report is coordinated with the development of other engineered barrier system analysis model reports.

  7. Hidden sector DM models and Higgs physics

    SciTech Connect

    Ko, P.

    2014-06-24

    We present an extension of the standard model to a dark sector with an unbroken local dark U(1)_X symmetry. Including various singlet portal interactions provided by the standard model Higgs, right-handed neutrinos and kinetic mixing, we show that the model can address most of the phenomenological issues (inflation, neutrino mass and mixing, baryon number asymmetry, dark matter, direct/indirect dark matter searches, some small-scale puzzles of standard collisionless cold dark matter, vacuum stability of the standard model Higgs potential, dark radiation) and be regarded as an alternative to the standard model. The Higgs signal strength is equal to one as in the standard model for the unbroken U(1)_X case with a scalar dark matter, but it could be less than one independent of decay channels if the dark matter is a dark sector fermion or if U(1)_X is spontaneously broken, because of a mixing with a new neutral scalar boson in the models.

  8. Physical and Mathematical Modeling in Experimental Papers.

    PubMed

    Möbius, Wolfram; Laan, Liedewij

    2015-12-17

    An increasing number of publications include modeling. Often, such studies help us to gain a deeper insight into the phenomena studied and break down barriers between experimental and theoretical communities. However, combining experimental and theoretical work is challenging for authors, reviewers, and readers. To help maximize the usefulness and impact of combined theoretical and experimental research, this Primer describes the purpose, usefulness, and different types of models and addresses the practical aspect of integrated publications by outlining characteristics of good modeling, presentation, and fruitful collaborations. PMID:26687351

  10. Educational Value and Models-Based Practice in Physical Education

    ERIC Educational Resources Information Center

    Kirk, David

    2013-01-01

    A models-based approach has been advocated as a means of overcoming the serious limitations of the traditional approach to physical education. One of the difficulties with this approach is that physical educators have sought to use it to achieve diverse and sometimes competing educational benefits, and these wide-ranging aspirations are rarely if…

  11. A Model of Physical Performance for Occupational Tasks.

    ERIC Educational Resources Information Center

    Hogan, Joyce

    This report acknowledges the problems faced by industrial/organizational psychologists who must make personnel decisions involving physically demanding jobs. The scarcity of criterion-related validation studies and the difficulty of generalizing validity are considered, and a model of physical performance that builds on Fleishman's (1984)…

  12. Early Childhood Educators' Experience of an Alternative Physical Education Model

    ERIC Educational Resources Information Center

    Tsangaridou, Niki; Genethliou, Nicholas

    2016-01-01

    Alternative instructional and curricular models are regarded as more comprehensive and suitable approaches to providing quality physical education (Kulinna 2008; Lund and Tannehill 2010; McKenzie and Kahan 2008; Metzler 2011; Quay and Peters 2008). The purpose of this study was to describe the impact of the Early Steps Physical Education…

  13. Physical models of giant subaqueous rock avalanches

    NASA Astrophysics Data System (ADS)

    De Blasio, F. V.

    2011-12-01

    Large subaqueous rock avalanches are characterized by horizontal run-outs approximately ten times longer than the fall height. It is shown that this mobility is somewhat puzzling, as it corresponds to a decrease of the effective friction coefficient by a factor of 10-50 compared to bare rock. Two dynamical models are therefore introduced to explain the observed mobility. In the first model, the fast-moving fragmented rock avalanche is subjected to a lift force that makes it hydroplane, avoiding contact with the sea floor. In the second model, the fragmented material ingests water, transforming into a non-Newtonian fluid that progressively reduces its shear strength. Both models give peak velocities of 65-70 m/s, which implies a high potential for tsunami generation.

  14. Discrete mathematical physics and particle modeling

    NASA Astrophysics Data System (ADS)

    Greenspan, D.

    The theory and application of the arithmetic approach to the foundations of both Newtonian and special relativistic mechanics are explored. Using only arithmetic, a reformulation of the Newtonian approach is given for: gravity; particle modeling of solids, liquids, and gases; conservative modeling of laminar and turbulent fluid flow, heat conduction, and elastic vibration; and nonconservative modeling of heat convection, shock-wave generation, the liquid drop problem, porous flow, the interface motion of a melting solid, soap films, string vibrations, and solitons. An arithmetic reformulation of special relativistic mechanics is given for theory in one space dimension, relativistic harmonic oscillation, and theory in three space dimensions. A speculative quantum mechanical model of vibrations in the water molecule is also discussed.

  15. Massive Stars: Input Physics and Stellar Models

    NASA Astrophysics Data System (ADS)

    El Eid, M. F.; The, L.-S.; Meyer, B. S.

    2009-10-01

    We present a general overview of the structure and evolution of massive stars of masses ≥12 M⊙ during their pre-supernova stages. We think it is worth reviewing this topic owing to the crucial role of massive stars in astrophysics, especially in the evolution of galaxies and the universe. We have performed several test computations with the aim to analyze and discuss many physical uncertainties still encountered in massive-star evolution. In particular, we explore the effects of mass loss, convection, rotation, the 12C(α, γ)16O reaction and initial metallicity. We also compare and analyze the similarities and differences among various works and ours. Finally, we present useful comments on the nucleosynthesis from massive stars concerning the s-process and the yields for 26Al and 60Fe.

  16. A Physically Based Coupled Chemical and Physical Weathering Model for Simulating Soilscape Evolution

    NASA Astrophysics Data System (ADS)

    Willgoose, G. R.; Welivitiya, D.; Hancock, G. R.

    2015-12-01

    A critical missing link in existing landscape evolution models is a dynamic soil-evolution model in which soils co-evolve with the landform. Work by the authors over the last decade has demonstrated a computationally manageable model for soil profile evolution (soilscape evolution) based on physical weathering. For chemical weathering it is clear that full geochemistry models such as CrunchFlow and PHREEQC are too computationally intensive to be coupled to existing soilscape and landscape evolution models. This paper presents a simplification of CrunchFlow chemistry and physics that makes the task feasible, and generalises it for hillslope geomorphology applications. Results from this simplified model will be compared with field data for soil pedogenesis. Other researchers have previously proposed a number of very simple weathering functions (e.g. exponential, humped, reverse exponential) as conceptual models of the in-profile weathering process. The paper will show that all of these functions are possible for specific combinations of in-soil environmental, geochemical and geologic conditions, and the presentation will outline the key variables controlling which of these conceptual models can be realistic models of in-profile processes and under what conditions. The presentation will finish by discussing the coupling of this model with a physical weathering model, and will show sample results from our SSSPAM soilscape evolution model to illustrate the implications of including chemical weathering in the soilscape evolution model.
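
    The conceptual in-profile weathering functions the paper tests (exponential, humped, and so on) are simple closed forms. A sketch of two of them, with hypothetical parameters w0 (surface rate), d0 (decay depth) and dp (depth of peak rate) that are not taken from the paper:

```python
import math

def weathering_rate(depth, form="exponential", w0=1.0, d0=0.5, dp=0.3):
    """Conceptual in-profile weathering-rate functions (hypothetical
    parameterization): 'exponential' decays monotonically from the
    surface; 'humped' is zero at the surface and peaks at depth dp."""
    if form == "exponential":
        return w0 * math.exp(-depth / d0)
    if form == "humped":
        # Normalized so the peak value at depth dp equals w0.
        return w0 * (depth / dp) * math.exp(1.0 - depth / dp)
    raise ValueError("unknown weathering function: %s" % form)
```

    The paper's point is that each such shape emerges from particular combinations of in-soil environmental, geochemical and geologic conditions rather than being universal.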

  17. Propulsion Physics Under the Changing Density Field Model

    NASA Technical Reports Server (NTRS)

    Robertson, Glen A.

    2011-01-01

    To grow as a spacefaring race, future spaceflight systems will require new propulsion physics: specifically, a propulsion physics model that does not require mass ejection, yet does not limit the high thrust necessary to accelerate within or beyond our solar system and return within a normal work period or lifetime. In 2004, Khoury and Weltman produced a density-dependent cosmology theory they called Chameleon Cosmology because, by its nature, it is hidden within known physics. This theory represents a scalar field within and about an object, even in vacuum. These scalar fields can be viewed as vacuum energy fields with definable densities that permeate all matter, having implications for dark matter/energy with universe acceleration properties and implying a new force mechanism for propulsion physics. Using Chameleon Cosmology, the author has developed a new propulsion physics model, called the Changing Density Field (CDF) Model. In this model, changes in the density of these fields are related to the acceleration of matter within an object; these density changes in turn change how the object couples to the surrounding density fields. Thrust is achieved by causing a differential in the coupling to these density fields about an object. Since the model indicates that the density of the density field in an object can be changed by internal mass acceleration, even without exhausting mass, the CDF model implies a new propellant-less propulsion physics model.

  18. Standard model status (in search of "new physics")

    SciTech Connect

    Marciano, W.J.

    1993-03-01

    A perspective on successes and shortcomings of the standard model is given. The complementarity between direct high energy probes of new physics and lower energy searches via precision measurements and rare reactions is described. Several illustrative examples are discussed.

  20. Evaluation and development of physically-based embankment breach models

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The CEATI Dam Safety Interest Group (DSIG) working group on embankment erosion and breach modelling has evaluated three physically-based numerical models used to simulate embankment erosion and breach development. The three models identified by the group were considered to be good candidates for fu...

  1. Kinetic exchange models: From molecular physics to social science

    NASA Astrophysics Data System (ADS)

    Patriarca, Marco; Chakraborti, Anirban

    2013-08-01

    We discuss several multi-agent models that have their origin in the kinetic exchange theory of statistical mechanics and have been recently applied to a variety of problems in the social sciences. This class of models can be easily adapted for simulations in areas other than physics, such as the modeling of income and wealth distributions in economics and opinion dynamics in sociology.
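
    A basic kinetic wealth-exchange simulation of the kind surveyed takes only a few lines: random pairs of agents pool their wealth and re-split it by a uniform random fraction, conserving the total. The specific re-split rule, pair selection and seed below are one common variant chosen for illustration, not a model from the paper:

```python
import random

def kinetic_exchange(wealth, steps, seed=0):
    """Minimal kinetic wealth-exchange sketch: at each step a random
    pair pools its wealth and re-splits it by a uniform random
    fraction; the total wealth is conserved."""
    rng = random.Random(seed)
    w = list(wealth)
    n = len(w)
    for _ in range(steps):
        i, j = rng.randrange(n), rng.randrange(n)
        if i == j:
            continue
        pool = w[i] + w[j]
        eps = rng.random()
        w[i], w[j] = eps * pool, (1.0 - eps) * pool
    return w
```

    Starting from equal wealth, repeated exchanges of this type are known to relax toward an exponential (Boltzmann-Gibbs-like) wealth distribution, which is the molecular-physics analogy the article builds on.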

  2. Harmony Theory: Problem Solving, Parallel Cognitive Models, and Thermal Physics.

    ERIC Educational Resources Information Center

    Smolensky, Paul; Riley, Mary S.

    This document consists of three papers. The first, "A Parallel Model of (Sequential) Problem Solving," describes a parallel model designed to solve a class of relatively simple problems from elementary physics and discusses implications for models of problem-solving in general. It is shown that one of the most salient features of problem solving,…

  3. Rock Physics Modeling of Carbonate Sediments

    NASA Astrophysics Data System (ADS)

    Ruiz, F. J.; Dvorkin, J.; Nur, A.

    2006-12-01

    We offer an effective-medium model for estimating the elastic properties of high-porosity marine carbonate sediment. This model treats carbonate as a pack of porous elastic grains. The effective elastic moduli of the grains are calculated using the Differential Effective Medium model (DEM) where the ellipsoidal inclusions have a fixed aspect ratio and are filled with sea water. Then the elastic moduli of a pack of these grains are calculated using a modified (scaled to the critical porosity) upper Hashin-Shtrikman bound. We find that the best match between the model-predicted compressional and shear-wave velocity and ODP data from three wells is achieved for the aspect ratio 0.25. We also examine a laboratory data set for low-porosity consolidated carbonate rock. In this case we treat the grains as solid without inclusions and then use DEM to calculate the effective bulk and shear moduli of the whole rock. The best fit to the experimental data is achieved for the pore aspect ratio in the range between 0.1 and 0.2. These effective medium predictions also match the empirical Raymer's (1980) equation applied to pure calcite rock. The basic conclusion is that in spite of the apparent wide variation in the shape and size distribution of pores in carbonate, its elastic properties can be predicted by assuming a single aspect ratio (shape) of the pores. The combination of the above two models provides a predictive estimate for the elastic-wave velocity of calcite sediment (at least for the data under examination) in a wide porosity range between zero and almost 100% porosity. It is important to emphasize that our effective-medium approach assigns finite non-zero values to the shear modulus of high-porosity marine sediment unlike the suspension model commonly used in such depositional setting.
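
    The modified Hashin-Shtrikman bound used here is a critical-porosity rescaling of the standard upper bound, which for the bulk modulus of a two-phase medium (phase 1 the stiffer) reads K_HS+ = K1 + f2 / [(K2 - K1)^-1 + f1 (K1 + 4G1/3)^-1]. A sketch of the standard (unmodified) bound only; any moduli passed in are the caller's assumptions, not values from the paper:

```python
def hs_upper_bulk(f1, K1, G1, K2):
    """Hashin-Shtrikman upper bound on the bulk modulus of a two-phase
    mixture. Phase 1 (volume fraction f1, moduli K1, G1) is the stiffer
    phase; phase 2 (fraction 1 - f1, bulk modulus K2) may be a fluid."""
    f2 = 1.0 - f1
    return K1 + f2 / (1.0 / (K2 - K1) + f1 / (K1 + 4.0 * G1 / 3.0))
```

    For a calcite-like solid mixed with sea water, the bound falls between the fluid and mineral bulk moduli and reduces to the mineral value at zero porosity, consistent with the porosity-range behavior described above.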

  4. Value-Added Modeling in Physical Education

    ERIC Educational Resources Information Center

    Hushman, Glenn; Hushman, Carolyn

    2015-01-01

    The educational reform movement in the United States has resulted in a variety of states moving toward a system of value-added modeling (VAM) to measure a teacher's contribution to student achievement. Recently, many states have begun using VAM scores as part of a larger system to evaluate teacher performance. In the past decade, only "core…

  5. Numerical calculations of cosmic ray cascade in the Earth's atmosphere using different particle interaction models

    NASA Astrophysics Data System (ADS)

    Nesterenok, A. V.; Naidenov, V. O.

    2015-12-01

    The interaction of primary cosmic rays with the Earth's atmosphere is investigated using the simulation toolkit GEANT4. Two reference lists of physical processes - QGSP_BIC_HP and FTFP_BERT_HP - are used in the simulations of cosmic ray cascade in the atmosphere. The cosmic ray neutron fluxes are calculated for mean level of solar activity, high geomagnetic latitudes and sea level. The calculated fluxes are compared with the published results of other analogous simulations and with experimental data.

  6. Three-Dimensional Ultrasound-Derived Physical Mitral Valve Modeling

    PubMed Central

    Witschey, Walter RT; Pouch, Alison M; McGarvey, Jeremy R; Ikeuchi, Kaori; Contijoch, Francisco; Levack, Melissa M; Yushkevick, Paul A; Sehgal, Chandra M; Jackson, Benjamin; Gorman, Robert C; Gorman, Joseph H

    2015-01-01

    Purpose: Advances in mitral valve repair and adoption have been partly attributed to improvements in echocardiographic imaging technology. To further educate and guide repair surgery, we have developed a methodology to quickly produce physical models of the valve using novel 3D echocardiographic imaging software in combination with stereolithographic printing. Description: Quantitative virtual mitral valve shape models were developed from 3D transesophageal echocardiographic images using software based on semi-automated image segmentation and continuous medial representation (cm-rep) algorithms. These quantitative virtual shape models were then used as input to a commercially available stereolithographic printer to generate a physical model of each valve at end systole and end diastole. Evaluation: Physical models of normal and diseased valves (ischemic mitral regurgitation and myxomatous degeneration) were constructed. There was good correspondence between the virtual shape models and physical models. Conclusions: It was feasible to create a physical model of mitral valve geometry under normal, ischemic and myxomatous valve conditions using 3D printing of 3D echocardiographic data. Printed valves have the potential to guide surgical therapy for mitral valve disease. PMID:25087790

  7. Search for physics beyond the Standard Model using jet observables

    NASA Astrophysics Data System (ADS)

    Kousouris, Konstantinos

    2015-11-01

    Jet observables have been exploited extensively during the LHC Run 1 to search for physics beyond the Standard Model. In this article, the most recent results from the ATLAS and CMS collaborations are summarized. Data from proton-proton collisions at 7 and 8 TeV center-of-mass energy have been analyzed to study monojet, dijet, and multijet final states, searching for a variety of new physics signals that include colored resonances, contact interactions, extra dimensions, and supersymmetric particles. The exhaustive searches with jets in Run 1 did not reveal any signal, and the results were used to put stringent exclusion limits on the new physics models.

  8. Validation and upgrading of physically based mathematical models

    NASA Technical Reports Server (NTRS)

    Duval, Ronald

    1992-01-01

    The validation of the results of physically-based mathematical models against experimental results was discussed. Systematic techniques are used for: (1) isolating subsets of the simulator mathematical model and comparing the response of each subset to its experimental response for the same input conditions; (2) evaluating the response error to determine whether it is the result of incorrect parameter values, incorrect structure of the model subset, or unmodeled external effects of cross coupling; and (3) modifying and upgrading the model and its parameter values to determine the most physically appropriate combination of changes.

  9. Role Modeling Attitudes, Physical Activity and Fitness Promoting Behaviors of Prospective Physical Education Specialists and Non-Specialists.

    ERIC Educational Resources Information Center

    Cardinal, Bradley J.; Cardinal, Marita K.

    2002-01-01

    Compared the role modeling attitudes and physical activity and fitness promoting behaviors of undergraduate students majoring in physical education and in elementary education. Student teacher surveys indicated that physical education majors had more positive attitudes toward role modeling physical activity and fitness promoting behaviors and…

  10. Characterizing, modeling, and addressing gender disparities in introductory college physics

    NASA Astrophysics Data System (ADS)

    Kost-Smith, Lauren Elizabeth

    2011-12-01

    The underrepresentation and underperformance of females in physics has been well documented and has long concerned policy-makers, educators, and the physics community. In this thesis, we focus on gender disparities in the first- and second-semester introductory, calculus-based physics courses at the University of Colorado. Success in these courses is critical for future study and careers in physics (and other sciences). Using data gathered from roughly 10,000 undergraduate students, we identify and model gender differences in the introductory physics courses in three areas: student performance, retention, and psychological factors. We observe gender differences on several measures in the introductory physics courses: females are less likely to take a high school physics course than males and have lower standardized mathematics test scores; males outscore females on both pre- and post-course conceptual physics surveys and in-class exams; and males have more expert-like attitudes and beliefs about physics than females. These background differences of males and females account for 60% to 70% of the gender gap that we observe on a post-course survey of conceptual physics understanding. In analyzing underlying psychological factors of learning, we find that female students report lower self-confidence related to succeeding in the introductory courses (self-efficacy) and are less likely to report seeing themselves as a "physics person". Students' self-efficacy beliefs are significant predictors of their performance, even when measures of physics and mathematics background are controlled, and account for an additional 10% of the gender gap. Informed by results from these studies, we implemented and tested a psychological, self-affirmation intervention aimed at enhancing female students' performance in Physics 1. Self-affirmation reduced the gender gap in performance on both in-class exams and the post-course conceptual physics survey. Further, the benefit of the self…

  11. A GLOBAL PHYSICAL MODEL FOR CEPHEIDS

    SciTech Connect

    Pejcha, Ondrej; Kochanek, Christopher S. E-mail: ckochanek@astronomy.ohio-state.edu

    2012-04-01

    We perform a global fit to ~5000 radial velocity and ~177,000 magnitude measurements in 29 photometric bands covering 0.3 μm to 8.0 μm distributed among 287 Galactic, Large Magellanic Cloud, and Small Magellanic Cloud Cepheids with P > 10 days. We assume that the Cepheid light curves and radial velocities are fully characterized by distance, reddening, and time-dependent radius and temperature variations. We construct phase curves of radius and temperature for periods between 10 and 100 days, which yield light-curve templates for all our photometric bands and can be easily generalized to any additional band. With only four to six parameters per Cepheid, depending on the existence of velocity data and the amount of freedom in the distance, the models have typical rms light and velocity curve residuals of 0.05 mag and 3.5 km s^-1. The model derives the mean Cepheid spectral energy distribution and its derivative with respect to temperature, which deviate from a blackbody in agreement with metal-line and molecular opacity effects. We determine a mean reddening law toward the Cepheids in our sample, which is not consistent with standard assumptions in either the optical or near-IR. Based on stellar atmosphere models, we predict the biases in distance, reddening, and temperature determinations due to the metallicity and quantify the metallicity signature expected for our fit residuals. The observed residuals as a function of wavelength show clear differences between the individual galaxies, which are compatible with these predictions. In particular, we find that metal-poor Cepheids appear hotter. Finally, we provide a framework for optimally selecting filters that yield the smallest overall errors in Cepheid parameter determination or filter combinations for suppressing or enhancing the metallicity effects on distance determinations. We make our templates publicly available.
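
    The blackbody from which the derived Cepheid spectral energy distribution is found to deviate is the Planck function B_lambda(T); a minimal evaluation in SI units:

```python
import math

# Physical constants (SI)
H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck_lambda(wavelength_m, T):
    """Planck spectral radiance B_lambda(T) in W m^-3 sr^-1; the
    paper's point is that the fitted Cepheid SED *deviates* from this
    blackbody form because of metal-line and molecular opacity."""
    x = H * C / (wavelength_m * KB * T)
    return (2.0 * H * C ** 2 / wavelength_m ** 5) / math.expm1(x)
```

    For a Cepheid-like effective temperature, the radiance near the optical peak dwarfs the mid-infrared tail, which is why multi-band coverage from 0.3 μm to 8.0 μm constrains the temperature variations so well.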

  12. Application of physical parameter identification to finite element models

    NASA Technical Reports Server (NTRS)

    Bronowicki, Allen J.; Lukich, Michael S.; Kuritz, Steven P.

    1986-01-01

    A time domain technique for matching response predictions of a structural dynamic model to test measurements is developed. Significance is attached to prior estimates of physical model parameters and to experimental data. The Bayesian estimation procedure allows confidence levels in predicted physical and modal parameters to be obtained. Structural optimization procedures are employed to minimize an error functional, with physical model parameters describing the finite element model as design variables. The number of complete FEM analyses is reduced using approximation concepts, including the recently developed convoluted Taylor series approach. The error functional is represented in closed form by converting free decay test data to a time series model using Prony's method. The technique is demonstrated on the simulated response of a simple truss structure.
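Prony's method, mentioned above, converts a free-decay record into a small parametric model by linear prediction: fit an autoregressive relation, then read damping and frequency off the roots of its characteristic polynomial. A minimal order-2 sketch (variable names are ours; this is not the paper's implementation):

```python
import cmath
import math

def prony2(y):
    """Order-2 Prony fit: estimate per-sample damping d and angular
    frequency w of y[n] ~ exp(-d*n) * cos(w*n) via linear prediction."""
    # Linear prediction y[n] = a1*y[n-1] + a2*y[n-2]; least squares via
    # the 2x2 normal equations, solved with Cramer's rule.
    s11 = s12 = s22 = b1 = b2 = 0.0
    for n in range(2, len(y)):
        x1, x2, t = y[n - 1], y[n - 2], y[n]
        s11 += x1 * x1
        s12 += x1 * x2
        s22 += x2 * x2
        b1 += x1 * t
        b2 += x2 * t
    det = s11 * s22 - s12 * s12
    a1 = (b1 * s22 - b2 * s12) / det
    a2 = (s11 * b2 - s12 * b1) / det
    # Roots of the characteristic polynomial z^2 - a1*z - a2 = 0 encode
    # the mode: |z| gives the decay, arg(z) the frequency.
    z = (a1 + cmath.sqrt(a1 * a1 + 4.0 * a2)) / 2.0
    return -math.log(abs(z)), abs(cmath.phase(z))

# Synthetic free-decay record: d = 0.05 per sample, w = 0.8 rad/sample
y = [math.exp(-0.05 * n) * math.cos(0.8 * n) for n in range(200)]
d, w = prony2(y)
```

Higher-order fits follow the same pattern with a larger linear-prediction system and a polynomial root finder.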

  13. The limitations of mathematical modeling in high school physics education

    NASA Astrophysics Data System (ADS)

    Forjan, Matej

    The theme of the doctoral dissertation falls within the scope of didactics of physics. A theoretical analysis is presented of the key constraints that occur in transferring the mathematical modeling of dynamical systems into the field of physics education in secondary schools. In an effort to explore the extent to which current physics education promotes understanding of models and modeling, we analyze the curriculum and the three most commonly used textbooks for high school physics. We focus primarily on the representation of the various stages of modeling in the solved tasks in textbooks and on the presentation of certain simplifications and idealizations which are frequently used in high school physics. We show that one of the textbooks in most cases presents the simplifications fairly and reasonably, while the other two do not explain half of the analyzed simplifications. It also turns out that the vast majority of solved tasks in all the textbooks do not explicitly present model assumptions, from which we can conclude that in high school physics students do not sufficiently develop a sense for simplifications and idealizations, which is a key part of the conceptual phase of modeling. The prior knowledge of students also matters for introducing the modeling of dynamical systems, so we performed an empirical study of the extent to which high school students are able to understand the time evolution of some dynamical systems in the field of physics. The results show that students have a very weak understanding of the dynamics of systems in which feedback is present, independent of their year or final grade in physics and mathematics. When modeling dynamical systems in high school physics we also encounter limitations resulting from students' lack of mathematical knowledge, as they do not know how to solve differential equations analytically. We show that when dealing with one-dimensional dynamical systems

  14. A physical corrosion model for bioabsorbable metal stents.

    PubMed

    Grogan, J A; Leen, S B; McHugh, P E

    2014-05-01

    Absorbable metal stents (AMSs) are an emerging technology in the treatment of heart disease. Computational modelling of AMS performance will facilitate the development of this technology. In this study a physical corrosion model is developed for AMSs based on the finite element method and adaptive meshing. The model addresses a gap between currently available phenomenological corrosion models for AMSs and physical corrosion models that have been developed for simpler geometries than those of a stent. The model developed in this study captures the changing surface of a corroding three-dimensional AMS structure for the case of diffusion-controlled corrosion. Comparisons are made between model predictions and those of previously developed phenomenological corrosion models for AMSs in terms of predicted device geometry and mechanical performance during corrosion. Relationships between alloy solubility and diffusivity in the corrosion environment and device performance during corrosion are also investigated.
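For intuition on the diffusion-controlled regime the model targets, the recession of a plate surface into a semi-infinite stagnant medium has a simple closed form. The sketch below uses illustrative parameter values that are assumptions of ours, not values from the paper:

```python
import math

# One-dimensional, diffusion-controlled corrosion sketch: dissolved metal
# ions leave a plate surface into a semi-infinite stagnant medium.  The
# surface flux is J(t) = C_s * sqrt(D / (pi * t)), so the recession depth
# grows with the square root of time.

def recession_depth(c_s, diff, rho, t):
    """Thickness lost from one face after time t (consistent SI units)."""
    return (2.0 * c_s / rho) * math.sqrt(diff * t / math.pi)

c_s = 260.0      # kg/m^3, assumed solubility limit at the surface
diff = 1.0e-9    # m^2/s, assumed ion diffusivity in the medium
rho = 1740.0     # kg/m^3, density of magnesium
loss_1day = recession_depth(c_s, diff, rho, 86400.0)
```

The sqrt(t) scaling is the signature of diffusion control: quadrupling the exposure time doubles the lost thickness. A finite element treatment like the paper's is needed once the geometry is a real stent strut rather than a flat face.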

  15. "Let's get physical": advantages of a physical model over 3D computer models and textbooks in learning imaging anatomy.

    PubMed

    Preece, Daniel; Williams, Sarah B; Lam, Richard; Weller, Renate

    2013-01-01

    Three-dimensional (3D) information plays an important part in medical and veterinary education. Appreciating complex 3D spatial relationships requires a strong foundational understanding of anatomy and mental 3D visualization skills. Novel learning resources have been introduced to anatomy training to achieve this. Objective evaluation of their comparative efficacies remains scarce in the literature. This study developed and evaluated the use of a physical model in demonstrating the complex spatial relationships of the equine foot. It was hypothesized that the newly developed physical model would be more effective for students to learn magnetic resonance imaging (MRI) anatomy of the foot than textbooks or computer-based 3D models. Third year veterinary medicine students were randomly assigned to one of three teaching aid groups (physical model; textbooks; 3D computer model). The comparative efficacies of the three teaching aids were assessed through students' abilities to identify anatomical structures on MR images. Overall mean MRI assessment scores were significantly higher in students utilizing the physical model (86.39%) compared with students using textbooks (62.61%) and the 3D computer model (63.68%) (P < 0.001), with no significant difference between the textbook and 3D computer model groups (P = 0.685). Student feedback was also more positive in the physical model group compared with both the textbook and 3D computer model groups. Our results suggest that physical models may hold a significant advantage over alternative learning resources in enhancing visuospatial and 3D understanding of complex anatomical architecture, and that 3D computer models have significant limitations with regards to 3D learning.

  16. Applying Transtheoretical Model to Promote Physical Activities Among Women

    PubMed Central

    Pirzadeh, Asiyeh; Mostafavi, Firoozeh; Ghofranipour, Fazllolah; Feizi, Awat

    2015-01-01

    Background: Physical activity is one of the most important indicators of health in communities, but different studies conducted in the provinces of Iran have shown that inactivity is prevalent, especially among women. Objectives: Inadequate regular physical activity among women, the importance of education in promoting physical activity, and the lack of studies on women using the transtheoretical model persuaded us to conduct this study with the aim of determining the application of the transtheoretical model in promoting physical activity among women of Isfahan. Materials and Methods: This was a quasi-experimental study conducted on 141 women residing in Isfahan, Iran, who were randomly divided into case and control groups. In addition to demographic information, their physical activity and the constructs of the transtheoretical model (stages of change, processes of change, decisional balance, and self-efficacy) were measured at 3 time points: preintervention, 3 months, and 6 months after the intervention. The obtained data were analyzed through the t test and repeated measures ANOVA using SPSS version 16. Results: The results showed that education based on the transtheoretical model significantly increased physical activity in the case group over time, in both intensive physical activity and walking. A high percentage of participants also showed progress through the stages of change and in the means of the processes of change, as well as the pros and cons. On the whole, a significant difference was observed over time in the case group (P < 0.01). Conclusions: This study showed that interventions based on the transtheoretical model can promote physical activity behavior among women. PMID:26834796

  17. Technical Manual for the SAM Physical Trough Model

    SciTech Connect

    Wagner, M. J.; Gilman, P.

    2011-06-01

    NREL, in conjunction with Sandia National Laboratories and the U.S. Department of Energy, developed the System Advisor Model (SAM) analysis tool for renewable energy system performance and economic analysis. This paper documents the technical background and engineering formulation for one of SAM's two parabolic trough system models. The Physical Trough model calculates performance relationships based on physical first principles where possible, allowing the modeler to predict electricity production for a wider range of component geometries than is possible in the Empirical Trough model. This document describes the major parabolic trough plant subsystems in detail, including the solar field, power block, thermal storage, piping, auxiliary heating, and control systems. The model makes use of both existing subsystem performance modeling approaches and new approaches developed specifically for SAM.

  18. Spin-foam models and the physical scalar product

    SciTech Connect

    Alesci, Emanuele; Noui, Karim; Sardelli, Francesco

    2008-11-15

    This paper aims at clarifying the link between loop quantum gravity and spin-foam models in four dimensions. Starting from the canonical framework, we construct an operator P acting on the space of cylindrical functions Cyl(γ), where γ is the four-simplex graph, such that its matrix elements are, up to some normalization factors, the vertex amplitude of spin-foam models. The spin-foam models we are considering are the topological model, the Barrett-Crane model, and the Engle-Pereira-Rovelli model. If one of these spin-foam models provides a covariant quantization of gravity, then the associated operator P should be the so-called "projector" into physical states and its matrix elements should give the physical scalar product. We discuss the possibility of extending the action of P to any cylindrical functions on the space manifold.

  19. Model Rocketry in the 21st-Century Physics Classroom

    NASA Astrophysics Data System (ADS)

    Horst, Ken

    2004-10-01

    Model rocketry has changed since my introduction to it as an eighth-grade student. Two of these changes are important for the use of rocketry in the physics classroom. First, simulation software, which is relatively inexpensive and very powerful, allows students to create and fly virtual models of their rocket designs. Second, lightweight and sophisticated electronics are available for logging flight data and for controlling flight operations such as deploying parachutes. In this technology-rich context, designing, building, and flying model rockets can capture the interest of today's physics students.

  20. Quantum monadology: a consistent world model for consciousness and physics.

    PubMed

    Nakagomi, Teruaki

    2003-04-01

    The NL world model presented in the previous paper is embodied by use of relativistic quantum mechanics, which reveals the significance of the reduction of quantum states and the relativity principle, and locates consciousness and the concept of flowing time consistently in physics. This model provides a consistent framework to solve apparent incompatibilities between consciousness (as our interior experience) and matter (as described by quantum mechanics and relativity theory). Does matter have an inside? What is the flowing time now? Does physics allow the indeterminism by volition? The problem of quantum measurement is also resolved in this model.

  1. Snyder-de Sitter model from two-time physics

    SciTech Connect

    Carrisi, M. C.; Mignemi, S.

    2010-11-15

    We show that the symplectic structure of the Snyder model on a de Sitter background can be derived from two-time physics in seven dimensions and propose a Hamiltonian for a free particle consistent with the symmetries of the model.

  2. Investigating Student Understanding of Quantum Physics: Spontaneous Models of Conductivity.

    ERIC Educational Resources Information Center

    Wittmann, Michael C.; Steinberg, Richard N.; Redish, Edward F.

    2002-01-01

    Investigates student reasoning about models of conduction. Reports that students often are unable to account for the existence of free electrons in a conductor and create models that lead to incorrect predictions and responses contradictory to expert descriptions of the physics involved. (Contains 36 references.) (Author/YDS)

  3. Rock.XML - Towards a library of rock physics models

    NASA Astrophysics Data System (ADS)

    Jensen, Erling Hugo; Hauge, Ragnar; Ulvmoen, Marit; Johansen, Tor Arne; Drottning, Åsmund

    2016-08-01

    Rock physics modelling provides tools for correlating physical properties of rocks and their constituents to the geophysical observations we measure on a larger scale. Many different theoretical and empirical models exist to cover the range of different types of rocks. However, upon reviewing these, we see that they are all built around a few main concepts. Based on this observation, we propose a format for digitally storing the specifications of rock physics models, which we have named Rock.XML. It contains not only data about the various constituents, but also the theories and how they are used to combine these building blocks into a representative model for a particular rock. The format is based on the Extensible Markup Language XML, making it flexible enough to handle complex models as well as scalable towards extending it with new theories and models. This technology has great advantages for documenting and exchanging models in an unambiguous way between people and between software. Rock.XML can become a platform for creating a library of rock physics models, making them more accessible to everyone.
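As an illustration of the concept only (the element and attribute names below are our invention, not the published Rock.XML schema), a fragment listing constituents and a combining theory can be parsed with a standard XML library and fed into a simple mixing law:

```python
import xml.etree.ElementTree as ET

# Hypothetical Rock.XML-style fragment: constituents plus the theory used
# to combine them into a representative rock model.
doc = """
<rock name="brine-saturated sandstone">
  <constituent name="quartz" fraction="0.80" bulkModulus="37.0"/>
  <constituent name="brine" fraction="0.20" bulkModulus="2.5"/>
  <theory name="VoigtAverage"/>
</rock>
"""

root = ET.fromstring(doc)
parts = [(c.get("name"), float(c.get("fraction")), float(c.get("bulkModulus")))
         for c in root.findall("constituent")]

# Voigt (volume-weighted arithmetic) average of the bulk moduli in GPa,
# the simplest of the mixing theories such a file could name.
k_voigt = sum(f * k for _, f, k in parts)
```

A real library would dispatch on the `theory` element to pick among Voigt, Reuss, Hashin-Shtrikman, or more elaborate effective-medium models.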

  4. Beyond Standard Model Physics: At the Frontiers of Cosmology and Particle Physics

    NASA Astrophysics Data System (ADS)

    Lopez-Suarez, Alejandro O.

    I begin to write this thesis at a time of great excitement in the fields of cosmology and particle physics. The aim of this thesis is to study and search for beyond the standard model (BSM) physics in the cosmological and high energy particle fields. There are two main questions which this thesis aims to address: 1) what can we learn about the inflationary epoch utilizing the pioneering gravitational wave detector Adv. LIGO? and 2) what are the dark matter particle properties and interactions with the standard model particles? This thesis will focus on advances in answering both questions.

  5. Using resource graphs to model learning in physics.

    NASA Astrophysics Data System (ADS)

    Wittmann, Michael

    2007-04-01

    Physics education researchers have many valuable ways of describing student reasoning while learning physics. One can describe the correct physics and look at specific student difficulties, for example, though that doesn't quite address the issue of how the latter develops into the former. A recent model (building on work by A.A. diSessa and D. Hammer) is to use resource graphs, which are networks of connected, small-scale ideas that describe reasoning about a specific physics topic in a specific physics context. We can compare resource graphs before and after instruction to represent conceptual changes that occur during learning. The representation describes several well documented forms of conceptual change and suggests others. I will apply the resource graphs representation to describe reasoning about energy loss in quantum tunneling. I will end the talk with a brief discussion (in the context of Newton's Laws) of how a resource perspective affects our instructional choices.

  6. The Effects of a Model-Based Physics Curriculum Program with a Physics First Approach: A Causal-Comparative Study

    ERIC Educational Resources Information Center

    Liang, Ling L.; Fulmer, Gavin W.; Majerich, David M.; Clevenstine, Richard; Howanski, Raymond

    2012-01-01

    The purpose of this study is to examine the effects of a model-based introductory physics curriculum on conceptual learning in a Physics First (PF) Initiative. This is the first comparative study in physics education that applies the Rasch modeling approach to examine the effects of a model-based curriculum program combined with PF in the United…

  7. Search for Beyond the Standard Model Physics at D0

    SciTech Connect

    Kraus, James

    2011-08-01

    The standard model (SM) of particle physics has been remarkably successful at predicting the outcomes of particle physics experiments, but there are reasons to expect new physics at the electroweak scale. Over the last several years, there have been a number of searches for beyond the standard model (BSM) physics at D0. Here, we limit our focus to three: searches for diphoton events with large missing transverse energy (E_T), searches for leptonic jets and E_T, and searches for single vector-like quarks. We have discussed three recent searches at D0. There are many more, including limits on a heavy neutral gauge boson in the ee channel, a search for scalar top quarks, a search for quirks, and limits on a new resonance decaying to WW or WZ.

  8. A physical model of Titan's aerosols.

    PubMed

    Toon, O B; McKay, C P; Griffith, C A; Turco, R P

    1992-01-01

    Microphysical simulations of Titan's stratospheric haze show that aerosol microphysics is linked to organized dynamical processes. The detached haze layer may be a manifestation of 1 cm sec⁻¹ vertical velocities at altitudes above 300 km. The hemispherical asymmetry in the visible albedo may be caused by 0.05 cm sec⁻¹ vertical velocities at altitudes of 150 to 200 km; we predict contrast reversal beyond 0.6 micrometer. Tomasko and Smith's (1982, Icarus 51, 65-95) model, in which a layer of large particles above 220 km altitude is responsible for the high forward scattering observed by Rages and Pollack (1983, Icarus 55, 50-62), is a natural outcome of the detached haze layer being produced by rising motions if aerosol mass production occurs primarily below the detached haze layer. The aerosol's electrical charge is critical for the particle size and optical depth of the haze. The geometric albedo, particularly in the ultraviolet and near infrared, requires that the particle size be near 0.15 micrometer down to altitudes below 100 km, which is consistent with polarization observations (Tomasko and Smith 1982; West and Smith 1991, Icarus 90, 330-333). Above about 400 km and below about 150 km, Yung et al.'s (1984, Astrophys. J. Suppl. Ser. 55, 465-506) diffusion coefficients are too small. Dynamical processes control the haze particles below about 150 km. The relatively large eddy diffusion coefficients in the lower stratosphere result in a vertically extensive region with nonuniform mixing ratios of condensable gases, so that most hydrocarbons may condense very near the tropopause rather than tens of kilometers above it. The optical depths of hydrocarbon clouds are probably less than one, requiring that abundant gases such as ethane condense on a subset of the haze particles to create relatively large, rapidly removed particles. The wavelength dependence of the optical radius is calculated for use in analyzing observations of the geometric albedo. The lower

  9. GEANT4 simulations for low energy proton computerized tomography.

    PubMed

    Milhoretto, Edney; Schelin, Hugo R; Setti, João A P; Denyak, Valery; Paschuk, Sergei A; Evseev, Ivan G; de Assis, Joaquim T; Yevseyeva, O; Lopes, Ricardo T; Vinagre Filho, Ubirajara M

    2010-01-01

    This work presents the recent results of computer simulations for the low energy proton beam tomographic scanner installed at the cyclotron CV-28 of IEN/CNEN. New computer simulations were performed in order to adjust the parameters of the previous simulation to the first experimental results and to understand some specific effects that affected the form of the final proton energy spectra. To do this, the energy and angular spread of the initial proton beam were added, and the virtual phantom geometry was specified more accurately in relation to the real one. As a result, a more realistic view of the measurements was achieved.

  10. Physical and numerical modeling of Joule-heated melters

    NASA Astrophysics Data System (ADS)

    Eyler, L. L.; Skarda, R. J.; Crowder, R. S., III; Trent, D. S.; Reid, C. R.; Lessor, D. L.

    1985-10-01

    The Joule-heated ceramic-lined melter is an integral part of the high level waste immobilization process under development by the US Department of Energy. Scaleup and design of this waste glass melting furnace requires an understanding of the relationships between melting cavity design parameters and the furnace performance characteristics such as mixing, heat transfer, and electrical requirements. Developing empirical models of these relationships through actual melter testing with numerous designs would be a very costly and time consuming task. Additionally, the Pacific Northwest Laboratory (PNL) has been developing numerical models that simulate a Joule-heated melter for analyzing melter performance. This report documents the method used and results of this modeling effort. Numerical modeling results are compared with the more conventional, physical modeling results to validate the approach. Also included are the results of numerically simulating an operating research melter at PNL. Physical modeling results used for qualifying the simulation capabilities of the melter code included: (1) a melter with a single pair of electrodes and (2) a melter with a dual pair (two pairs) of electrodes. The physical model of the melter having two electrode pairs utilized a configuration with primary and secondary electrodes. The principal melter parameters (the ratio of power applied to each electrode pair, modeling fluid depth, electrode spacing) were varied in nine tests of the physical model during FY85. Code predictions were made for five of these tests. Voltage drops, temperature field data, and electric field data varied in their agreement with the physical modeling results, but in general were judged acceptable.

  12. Mathematical modeling and physical reality in noncovalent interactions.

    PubMed

    Politzer, Peter; Murray, Jane S; Clark, Timothy

    2015-03-01

    The Hellmann-Feynman theorem provides a straightforward interpretation of noncovalent bonding in terms of Coulombic interactions, which encompass polarization (and accordingly include dispersion). Exchange, Pauli repulsion, orbitals, etc., are part of the mathematics of obtaining the system's wave function and subsequently its electronic density. They do not correspond to physical forces. Charge transfer, in the context of noncovalent interactions, is equivalent to polarization. The key point is that mathematical models must not be confused with physical reality. PMID:25697332

  13. A physical data model for fields and agents

    NASA Astrophysics Data System (ADS)

    de Jong, Kor; de Bakker, Merijn; Karssenberg, Derek

    2016-04-01

    Two approaches exist in simulation modeling: agent-based and field-based modeling. In agent-based (or individual-based) simulation modeling, the entities representing the system's state are represented by objects, which are bounded in space and time. Individual objects, like an animal, a house, or a more abstract entity like a country's economy, have properties representing their state. In an agent-based model this state is manipulated. In field-based modeling, the entities representing the system's state are represented by fields. Fields capture the state of a continuous property within a spatial extent, examples of which are elevation, atmospheric pressure, and water flow velocity. With respect to the technology used to create these models, the domains of agent-based and field-based modeling have often been separate worlds. In environmental modeling, widely used logical data models include feature data models for point, line and polygon objects, and the raster data model for fields. Simulation models are often either agent-based or field-based, even though the modeled system might contain both entities that are better represented by individuals and entities that are better represented by fields. We think that the reason for this dichotomy in kinds of models might be that the traditional object and field data models underlying those models are relatively low level. We have developed a higher level conceptual data model for representing both non-spatial and spatial objects, and spatial fields (De Bakker et al. 2016). Based on this conceptual data model we designed a logical and physical data model for representing many kinds of data, including the kinds used in earth system modeling (e.g. hydrological and ecological models). The goal of this work is to be able to create high level code and tools for the creation of models in which entities are representable by both objects and fields. 
Our conceptual data model is capable of representing the traditional feature data
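A minimal sketch of such a higher-level container, with class names of our own choosing rather than those of the paper, might hold agents and fields side by side:

```python
# Agents carry per-object property values; fields carry a gridded value
# for every cell of a spatial extent.  One "phenomenon" holds both.

class Agent:
    def __init__(self, name, **properties):
        self.name = name
        self.properties = properties      # e.g. mass, age, position

class Field:
    def __init__(self, name, nrows, ncols, fill=0.0):
        self.name = name
        self.shape = (nrows, ncols)
        self.values = [[fill] * ncols for _ in range(nrows)]

    def mean(self):
        cells = [v for row in self.values for v in row]
        return sum(cells) / len(cells)

class Phenomenon:
    """One modelled system holding objects and fields side by side."""
    def __init__(self):
        self.agents, self.fields = {}, {}

    def add(self, entity):
        target = self.agents if isinstance(entity, Agent) else self.fields
        target[entity.name] = entity

world = Phenomenon()
world.add(Agent("cow_1", mass=550.0))
world.add(Field("elevation", 2, 3, fill=100.0))
```

A model rule can then read from a field at an agent's location or write an agent's influence back into a field, which is exactly the kind of mixed operation the traditional raster-versus-feature split makes awkward.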

  14. Female role models in physics education in Ireland

    NASA Astrophysics Data System (ADS)

    Chormaic, Síle Nic; Fee, Sandra; Tobin, Laura; Hennessy, Tara

    2013-03-01

    In this paper we consider the statistics on undergraduate student representation in Irish universities and look at student numbers in secondary (high) schools in one region in Ireland. There seems to be no significant change in female participation in physics from 2002 to 2011. Additionally, we have studied the influence of an educator's gender on the prevalence of girls studying physics in secondary schools in Co. Louth, Ireland, and at the postgraduate level in Irish universities. It would appear that strong female role models have a positive influence and lead to an increase in girls' participation in physics.

  15. Source signature and acoustic field of seismic physical modeling

    NASA Astrophysics Data System (ADS)

    Lin, Q.; Jackson, C.; Tang, G.; Burbach, G.

    2004-12-01

    As an important tool of seismic research and exploration, seismic physical modeling simulates real-world data acquisition by scaling the model, the acquisition parameters, and some features of the source generated by a transducer. Unlike numerical simulation, where a point source is easily realized, the transducer cannot be made small enough to approximate a point source in physical modeling, and therefore yields a different source signature than the sources applied in field data acquisition. To better understand physical modeling data, characterizing the wave field generated by ultrasonic transducers is desirable and helpful. In this study, we explore several aspects of source characterization, including radiation pattern, directivity, sensitivity, and frequency response. We also examine how to improve acquired data quality, such as minimizing ambient noise, using an encoded chirp to prevent ringing, applying deterministic deconvolution to enhance data resolution, and t-P filtering to remove linear events. We found that the transducers and their wave fields, the modeling system performance, and the material properties of the model and their coupling conditions all play roles in physical modeling data acquisition.
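Of the processing steps listed, deterministic deconvolution is the simplest to sketch: divide the trace spectrum by the measured source-signature spectrum, stabilized by a water-level floor. The toy wavelet and parameter names below are ours, not the study's:

```python
import cmath

def dft(x, inverse=False):
    """Naive O(n^2) discrete Fourier transform (fine for a demo)."""
    n = len(x)
    sign = 1 if inverse else -1
    out = [sum(x[k] * cmath.exp(sign * 2j * cmath.pi * j * k / n)
               for k in range(n)) for j in range(n)]
    return [v / n for v in out] if inverse else out

def deconvolve(trace, wavelet, water=1e-3):
    """Water-level deterministic deconvolution of trace by wavelet."""
    T, W = dft(trace), dft(wavelet)
    floor = water * max(abs(w) for w in W)
    R = [t * w.conjugate() / max(abs(w) ** 2, floor ** 2)
         for t, w in zip(T, W)]
    return [v.real for v in dft(R, inverse=True)]

# Toy example: a single reflector at sample 5, convolved with a short wavelet
n = 32
wavelet = [0.0] * n
wavelet[0], wavelet[1], wavelet[2] = 1.0, -0.6, 0.2
trace = [0.0] * n
for k, w in enumerate(wavelet):
    trace[(5 + k) % n] += w          # circular convolution with a unit spike
est = deconvolve(trace, wavelet)
```

With a noiseless trace and a wavelet whose spectrum never approaches zero, the division recovers the reflectivity spike exactly; the water-level floor matters once noise or spectral notches appear.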

  16. A Goddard Multi-Scale Modeling System with Unified Physics

    NASA Technical Reports Server (NTRS)

    Tao, W.K.; Anderson, D.; Atlas, R.; Chern, J.; Houser, P.; Hou, A.; Lang, S.; Lau, W.; Peters-Lidard, C.; Kakar, R.; Kumar, S.; Lapenta, W.; Li, X.; Matsui, T.; Rienecker, M.; Shen, B.W.; Shi, J.J.; Simpson, J.; Zeng, X.

    2008-01-01

    Numerical cloud resolving models (CRMs), which are based on the non-hydrostatic equations of motion, have been extensively applied to cloud-scale and mesoscale processes during the past four decades. Recent GEWEX Cloud System Study (GCSS) model comparison projects have indicated that CRMs agree with observations in simulating various types of clouds and cloud systems from different geographic locations. Cloud resolving models now provide statistical information useful for developing more realistic physically based parameterizations for climate models and numerical weather prediction models. It is also expected that Numerical Weather Prediction (NWP) and regional scale models can be run at grid sizes similar to cloud resolving models through nesting techniques. Current and future NASA satellite programs can provide cloud, precipitation, aerosol and other data at very fine spatial and temporal scales. Using these satellite data to improve the understanding of the physical processes responsible for the variation in global and regional climate and hydrological systems requires a coupled global circulation model (GCM) and cloud-scale model (termed a super-parameterization or multi-scale modeling framework, MMF). The use of a GCM will enable global coverage, and the use of a CRM will allow for better and more sophisticated physical parameterization. NASA satellites and field campaigns can provide initial conditions as well as validation through the use of Earth satellite simulators. At Goddard, we have developed a multi-scale modeling system with unified physics. The modeling system consists of a coupled GCM-CRM (or MMF), a state-of-the-art Weather Research and Forecasting (WRF) model, and a cloud-resolving model (the Goddard Cumulus Ensemble model). In these models, the same microphysical schemes (2ICE, several 3ICE), radiation (including explicitly calculated cloud optical properties), and surface models are applied. In addition, a comprehensive unified Earth Satellite

  17. A Physical Model of Electron Radiation Belts of Saturn

    NASA Astrophysics Data System (ADS)

    Lorenzato, L.; Sicard-Piet, A.; Bourdarie, S.

    2012-04-01

    Radiation belts cause irreversible damage to the materials of on-board instruments. That is why, for two decades, ONERA has carried out studies of the radiation belts of magnetized planets. First, in the 1990s, the development of a physical model named Salammbô produced a model of the radiation belts of the Earth. Then, over recent years, analysis of the magnetosphere of Jupiter and of in-situ data (Pioneer, Voyager, Galileo) allowed a physical model of the radiation belts of Jupiter to be built. Drawing on the Cassini era and all the information collected, this study adapts the Salammbô jovian radiation belt model to the Saturn environment. Indeed, some physical processes present in the kronian magnetosphere are similar to those present in the magnetosphere of Jupiter (radial diffusion; interaction of energetic electrons with rings, moons and the atmosphere; synchrotron emission). However, some physical processes have to be added to the kronian model (compared to the jovian model) because of the particularities of the magnetosphere of Saturn: the interaction of energetic electrons with neutral particles from Enceladus, and wave-particle interaction. This last process has been studied in detail through analysis of Cassini/RPWS (Radio and Plasma Wave Science) data. The major importance of wave-particle interaction is now well known in the case of the radiation belts of the Earth, but it is important to investigate its role in the case of Saturn. The importance of each physical process has therefore been studied, and analysis of Cassini MIMI-LEMMS and CAPS data allows a model boundary condition to be built (at L = 6). Finally, the results of this study lead to a kronian electron radiation belt model including radial diffusion, interactions of energetic electrons with rings, moons and neutral particles, and wave-particle interaction (interactions of electrons with atmospheric particles and synchrotron emission are too weak to be taken into account in this model). Then, to

  18. Messages on Flavour Physics Beyond the Standard Model

    NASA Astrophysics Data System (ADS)

    Buras, Andrzej J.

    2008-12-01

    We present a brief summary of the main results on flavour physics beyond the Standard Model that have been obtained in 2008 by my collaborators and myself in my group at TUM. In particular we list main messages coming from our analyses of flavour and CP-violating processes in Supersymmetry, Littlest Higgs model with T-Parity and a warped extra dimension model with custodial protection for the flavour diagonal and non-diagonal Z boson couplings.

  19. LETTER: Statistical physics of the Schelling model of segregation

    NASA Astrophysics Data System (ADS)

    Dall'Asta, L.; Castellano, C.; Marsili, M.

    2008-07-01

    We investigate the static and dynamic properties of a celebrated model of social segregation, providing a complete explanation of the mechanisms leading to segregation in both one- and two-dimensional systems. Standard statistical physics methods shed light on the rich phenomenology of this simple model, which exhibits static phase transitions typical of kinetically constrained models, non-trivial coarsening as in driven-particle systems, and percolation-related phenomena.

  20. Reading Time as Evidence for Mental Models in Understanding Physics

    NASA Astrophysics Data System (ADS)

    Brookes, David T.; Mestre, José; Stine-Morrow, Elizabeth A. L.

    2007-11-01

    We present results of a reading study that show the usefulness of probing physics students' cognitive processing by measuring reading time. According to contemporary discourse theory, when people read a text, a network of associated inferences is activated to create a mental model. If the reader encounters an idea in the text that conflicts with existing knowledge, the construction of a coherent mental model is disrupted and reading times are prolonged, as measured using a simple self-paced reading paradigm. We used this effect to study how "non-Newtonian" and "Newtonian" students create mental models of conceptual systems in physics as they read texts related to the ideas of Newton's third law, energy, and momentum. We found significant effects of prior knowledge state on patterns of reading time, suggesting that students attempt to actively integrate physics texts with their existing knowledge.

  1. Transport in Polymer-Electrolyte Membranes I. Physical Model

    SciTech Connect

    Weber, Adam Z.; Newman, John

    2003-06-02

    In this paper, a physical model is developed that is semiphenomenological and takes into account Schroeder's paradox. Using the wealth of knowledge contained in the literature regarding polymer-electrolyte membranes as a basis, a novel approach is taken in tying together all of the data into a single coherent theory. This approach involves describing the structural changes of the membrane due to water content, and casting this in terms of capillary phenomena. By treating the membrane in this fashion, Schroeder's paradox can be elucidated. Along with the structural changes, two different transport mechanisms are presented and discussed. These mechanisms, along with the membrane's structural changes, comprise the complete physical model of the membrane. The model is shown to agree qualitatively with different membranes and different membrane forms, and is applicable to modeling perfluorinated sulfonic acid and similar membranes. It is also the first physically based comprehensive model of transport in a membrane that includes a physical description of Schroeder's paradox, and it bridges the gap between the two types of macroscopic models currently in the literature.

  2. Combined physical and chemical nonequilibrium transport model for solution conduits

    NASA Astrophysics Data System (ADS)

    Field, Malcolm S.; Leij, Feike J.

    2014-02-01

    Solute transport in karst aquifers is primarily constrained to relatively complex and inaccessible solution conduits where transport is often rapid, turbulent, and at times constrictive. Breakthrough curves generated from tracer tests in solution conduits are typically positively skewed, with long tails evident. Physical nonequilibrium models are now routinely employed to fit breakthrough curves for tracer tests in solution conduits. Chemical nonequilibrium processes are likely to be important as well, however. In addition to partitioning between different flow domains, there may also be equilibrium and nonequilibrium partitioning between the aqueous and solid phases. A combined physical and chemical nonequilibrium (PCNE) model was developed for an instantaneous release, similar to the model developed by Leij and Bradford (2009) for a pulse release. The PCNE model partitions the open space in solution conduits into mobile and immobile flow regions, with first-order mass transfer between the two regions, to represent physical nonequilibrium in the conduit. Partitioning between the aqueous and solid phases proceeds either as an equilibrium process or as a first-order process and represents chemical nonequilibrium in both the mobile and immobile regions. Application of the model to three example breakthrough curves demonstrates its applicability to tracer tests conducted in karst aquifers, with exceptionally good model fits to the data. The three examples, each from a different state in the United States, exhibit very different velocities, dispersions, and other transport properties, with most of the transport occurring via the fraction of mobile water. Fitting the model suggests a potentially important interaction of physical and chemical nonequilibrium processes.
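
    The physical-nonequilibrium building block of such models, first-order mass transfer between a mobile and an immobile flow region, can be sketched in a few lines. The parameter names and values below are illustrative, not those of the PCNE model itself.

```python
# First-order mass transfer between a mobile and an immobile flow region,
# integrated with a simple Euler scheme. All parameters are illustrative.

def simulate(c_mobile=1.0, c_immobile=0.0, alpha=0.5, beta=0.6,
             dt=1e-3, t_end=10.0):
    """alpha: exchange-rate coefficient; beta: mobile-region volume fraction."""
    for _ in range(int(t_end / dt)):
        flux = alpha * (c_mobile - c_immobile)   # mobile -> immobile transfer
        c_mobile -= flux * dt / beta
        c_immobile += flux * dt / (1.0 - beta)
    return c_mobile, c_immobile
```

    Both regions relax to the volume-weighted mean concentration (0.6 here), and the total mass beta*c_mobile + (1-beta)*c_immobile is conserved at every step.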

  3. A mathematical look at a physical power prediction model

    SciTech Connect

    Landberg, L.

    1997-12-31

    This paper takes a mathematical look at a physical model used to predict the power produced from wind farms. The aim is to see whether simple mathematical expressions can replace the original equations, and to give guidelines as to where simplifications can be made and where they cannot. The paper shows that there is a linear dependence between the geostrophic wind and the wind at the surface, but also that great care must be taken in the selection of the models, since physical dependencies play a very important role, e.g. through the dependence of the turning of the wind on the wind speed.
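
    As a toy illustration of the kind of simplification discussed, a linear surface-wind estimate with a speed-dependent turning angle might look as follows. Both the slope and the turning law are hypothetical placeholders, not the paper's fitted relations.

```python
# Hypothetical linear simplification: surface wind speed as a fixed
# fraction of the geostrophic speed, with a turning angle that decreases
# as the wind strengthens. Coefficients are illustrative placeholders.

def surface_wind(geostrophic_speed, slope=0.5, turn_deg_at_10ms=20.0):
    """Return (surface speed, turning angle in degrees)."""
    speed = slope * geostrophic_speed
    # stronger winds turn less (illustrative inverse dependence)
    turn = turn_deg_at_10ms * 10.0 / max(geostrophic_speed, 1.0)
    return speed, turn
```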

  4. Physics-based model for electro-chemical process

    SciTech Connect

    Zhang, Jinsuo

    2013-07-01

    Considering the kinetics of electrochemical reactions and mass transfer at and near the surface of the electrode, a physics-based separation model for separating actinides from fission products in an electro-refiner is developed. The model, taking into account the physical, chemical and electrochemical processes at the electrode surface, can be applied to study electrorefining kinetics. One of the methods used for validation has been to apply the developed model to the computation of the cyclic voltammetry process of PuCl{sub 3} and UCl{sub 3} at a solid electrode in molten KCl-LiCl. The computed results are similar to experimental measurements. The separation model can be applied to predict material flows under normal and abnormal operating conditions. Parametric studies can be conducted based on the model to identify the most important factors that affect the electrorefining process.

  5. Highly physical penumbra solar radiation pressure modeling with atmospheric effects

    NASA Astrophysics Data System (ADS)

    Robertson, Robert; Flury, Jakob; Bandikova, Tamara; Schilling, Manuel

    2015-10-01

    We present a new method for highly physical solar radiation pressure (SRP) modeling in Earth's penumbra. The fundamental geometry and approach mirrors past work, where the solar radiation field is modeled using a number of light rays, rather than treating the Sun as a single point source. However, we aim to clarify this approach, simplify its implementation, and model previously overlooked factors. The complex geometries involved in modeling penumbra solar radiation fields are described in a more intuitive and complete way to simplify implementation. Atmospheric effects are tabulated to significantly reduce computational cost. We present new, more efficient and accurate approaches to modeling atmospheric effects which allow us to consider the high spatial and temporal variability in lower atmospheric conditions. Modeled penumbra SRP accelerations for the Gravity Recovery and Climate Experiment (GRACE) satellites are compared to the sub-nm/s2 precision GRACE accelerometer data. Comparisons to accelerometer data and a traditional penumbra SRP model illustrate the improved accuracy which our methods provide. Sensitivity analyses illustrate the significance of various atmospheric parameters and modeled effects on penumbra SRP. While this model is more complex than a traditional penumbra SRP model, we demonstrate its utility and propose that a highly physical model which considers atmospheric effects should be the basis for any simplified approach to penumbra SRP modeling.
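
    For contrast with the ray-based approach described above, the simplest penumbra geometry computes the visible fraction of the solar disk from the circle-circle overlap of the apparent solar and Earth disks. The sketch below uses this generic geometric formulation; the atmospheric refraction and extinction effects central to the model above are deliberately omitted.

```python
# Visible fraction of the solar disk: overlap of the apparent solar disk
# (angular radius rs) and the occulting Earth disk (angular radius re),
# separated by the angle d. Generic geometry only; no atmospheric effects.
import math

def visible_fraction(rs, re, d):
    """Fraction of solar-disk area not occulted (all angles in radians)."""
    if d >= rs + re:
        return 1.0                        # full sunlight
    if d <= abs(re - rs) and re >= rs:
        return 0.0                        # umbra: Sun fully occulted
    if d <= abs(re - rs):
        return 1.0 - (re / rs) ** 2       # annular: Earth disk inside Sun disk
    # lens-shaped overlap area of two intersecting circles
    a1 = rs * rs * math.acos((d * d + rs * rs - re * re) / (2.0 * d * rs))
    a2 = re * re * math.acos((d * d + re * re - rs * rs) / (2.0 * d * re))
    a3 = 0.5 * math.sqrt((-d + rs + re) * (d + rs - re)
                         * (d - rs + re) * (d + rs + re))
    return 1.0 - (a1 + a2 - a3) / (math.pi * rs * rs)
```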

  6. The Martian surface radiation environment - a comparison of models and MSL/RAD measurements

    NASA Astrophysics Data System (ADS)

    Matthiä, Daniel; Ehresmann, Bent; Lohf, Henning; Köhler, Jan; Zeitlin, Cary; Appel, Jan; Sato, Tatsuhiko; Slaba, Tony; Martin, Cesar; Berger, Thomas; Boehm, Eckart; Boettcher, Stephan; Brinza, David E.; Burmeister, Soenke; Guo, Jingnan; Hassler, Donald M.; Posner, Arik; Rafkin, Scot C. R.; Reitz, Günther; Wilson, John W.; Wimmer-Schweingruber, Robert F.

    2016-03-01

    Context: The Radiation Assessment Detector (RAD) on the Mars Science Laboratory (MSL) has been measuring the radiation environment on the surface of Mars since August 6th, 2012. MSL-RAD is the first instrument to provide detailed information about charged and neutral particle spectra and dose rates on the Martian surface, and one of the primary objectives of the RAD investigation is to help improve and validate current radiation transport models. Aims: By applying different numerical transport models with boundary conditions derived from the MSL-RAD environment, the goal of this work was both to provide predictions for the particle spectra and the radiation exposure on the Martian surface, complementing the sensitive range of RAD, and, at the same time, to validate the results against the experimental data, where applicable. Such validated models can be used to predict dose rates for future manned missions as well as for performing shield optimization studies. Methods: Several particle transport models (GEANT4, PHITS, HZETRN/OLTARIS) were used to predict the particle flux and the corresponding radiation environment caused by galactic cosmic radiation on Mars. From the calculated particle spectra the dose rates on the surface are estimated. Results: Calculations of particle spectra and dose rates induced by galactic cosmic radiation on the Martian surface are presented. Although good agreement is found in many cases for the different transport codes, GEANT4, PHITS, and HZETRN/OLTARIS, some models still show large, sometimes order-of-magnitude discrepancies in certain particle spectra. We have found that RAD data are helping to make better choices of input parameters and physical models. Elements of these validated models can be applied to more detailed studies of how the radiation environment is influenced by solar modulation, the Martian atmosphere and soil, and changes due to the Martian seasonal pressure cycle. By extending the range of the calculated particle spectra with respect to

  7. Hadronic Shower Validation Experience for the ATLAS End-Cap Calorimeter

    NASA Astrophysics Data System (ADS)

    Kiryunin, A. E.; Salihagić, D.

    2007-03-01

    Validation of GEANT4 hadronic physics models is carried out by comparing experimental data from beam tests of modules of the ATLAS end-cap calorimeters with GEANT4 based simulations. Two physics lists (LHEP and QGSP) for the simulation of hadronic showers are evaluated. Calorimeter performance parameters like the energy resolution and response for charged pions and shapes of showers are studied. Comparison with GEANT3 predictions is done as well.

  9. Coarse-grained, foldable, physical model of the polypeptide chain

    PubMed Central

    Chakraborty, Promita; Zuckermann, Ronald N.

    2013-01-01

    Although nonflexible, scaled molecular models like Pauling–Corey’s and its descendants have made significant contributions in structural biology research and pedagogy, recent technical advances in 3D printing and electronics make it possible to go one step further in designing physical models of biomacromolecules: to make them conformationally dynamic. We report here the design, construction, and validation of a flexible, scaled, physical model of the polypeptide chain, which accurately reproduces the bond rotational degrees of freedom in the peptide backbone. The coarse-grained backbone model consists of repeating amide and α-carbon units, connected by mechanical bonds (corresponding to φ and ψ) that include realistic barriers to rotation that closely approximate those found at the molecular scale. Longer-range hydrogen-bonding interactions are also incorporated, allowing the chain to readily fold into stable secondary structures. The model is easily constructed with readily obtainable parts and promises to be a tremendous educational aid to the intuitive understanding of chain folding as the basis for macromolecular structure. Furthermore, this physical model can serve as the basis for linking tangible biomacromolecular models directly to the vast array of existing computational tools to provide an enhanced and interactive human–computer interface. PMID:23898168

  10. Evaluating performances of simplified physically based landslide susceptibility models.

    NASA Astrophysics Data System (ADS)

    Capparelli, Giovanna; Formetta, Giuseppe; Versace, Pasquale

    2015-04-01

    Rainfall-induced shallow landslides cause significant damage, involving loss of life and property. Predicting locations susceptible to shallow landslides is a complex task that involves many disciplines: hydrology, geotechnical science, geomorphology, and statistics. Usually, two main approaches are used to accomplish this task: statistical or physically based models. This paper presents a package of GIS-based models for landslide susceptibility analysis, integrated into the NewAge-JGrass hydrological model using the Object Modeling System (OMS) modeling framework. The package includes three simplified physically based models for landslide susceptibility analysis (M1, M2, and M3) and a component for model verification. It computes eight goodness-of-fit (GOF) indices by comparing model results and measured data pixel by pixel. Moreover, the integration into NewAge-JGrass allows the use of other components, such as geographic information system tools to manage input-output processes and automatic calibration algorithms to estimate model parameters. The system offers the possibility to investigate and fairly compare the quality and robustness of models and model parameters, according to a procedure that includes: i) model parameter estimation by optimizing each GOF index separately, ii) model evaluation in the ROC plane using each optimal parameter set, and iii) GOF robustness evaluation by assessing sensitivity to input parameter variation. This procedure was repeated for all three models. The system was applied to a case study in Calabria (Italy) along the Salerno-Reggio Calabria highway, between Cosenza and the Altilia municipality. The analysis showed that, among all the optimized indices and all three models, Average Index (AI) optimization coupled with model M3 is the best modeling solution for our test case. This research was funded by PON Project No. 01_01503 "Integrated Systems for Hydrogeological Risk
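
    A pixel-by-pixel verification step of this kind reduces to confusion-matrix bookkeeping. The sketch below computes two common goodness-of-fit indices, accuracy and the true skill statistic, as generic stand-ins; they are not necessarily among the eight indices implemented in the package.

```python
# Pixel-by-pixel verification sketch: confusion matrix of predicted vs.
# observed unstable pixels, plus two common goodness-of-fit indices.

def confusion(pred, obs):
    """Counts of true/false positives/negatives over paired pixels."""
    tp = sum(1 for p, o in zip(pred, obs) if p and o)
    tn = sum(1 for p, o in zip(pred, obs) if not p and not o)
    fp = sum(1 for p, o in zip(pred, obs) if p and not o)
    fn = sum(1 for p, o in zip(pred, obs) if not p and o)
    return tp, tn, fp, fn

def gof_indices(pred, obs):
    tp, tn, fp, fn = confusion(pred, obs)
    accuracy = (tp + tn) / len(obs)
    tss = tp / (tp + fn) - fp / (fp + tn)   # true skill statistic
    return accuracy, tss
```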

  11. Compressive sensing as a paradigm for building physics models

    NASA Astrophysics Data System (ADS)

    Nelson, Lance J.; Hart, Gus L. W.; Zhou, Fei; Ozoliņš, Vidvuds

    2013-01-01

    The widely accepted intuition that the important properties of solids are determined by a few key variables underpins many methods in physics. Though this reductionist paradigm is applicable in many physical problems, its utility can be limited because the intuition for identifying the key variables often does not exist or is difficult to develop. Machine learning algorithms (genetic programming, neural networks, Bayesian methods, etc.) attempt to eliminate the a priori need for such intuition but often do so with increased computational burden and human time. A recently developed technique in the field of signal processing, compressive sensing (CS), provides a simple, general, and efficient way of finding the key descriptive variables. CS is a powerful paradigm for model building; we show that its models are more physical and predict more accurately than current state-of-the-art approaches and can be constructed at a fraction of the computational cost and user effort.
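
    The idea can be demonstrated with a minimal sparse-recovery sketch: L1-regularized least squares solved by iterative soft-thresholding (ISTA) picks out the few nonzero coefficients of a model from many candidates. This is a generic textbook formulation, not the specific compressive-sensing machinery used by the authors, and all parameter values are illustrative.

```python
# Toy sparse recovery via L1-regularized least squares (ISTA).
# Generic textbook algorithm; parameters are illustrative.

def ista(A, y, lam=0.05, step=0.01, iters=5000):
    """Minimize 0.5*||Ax - y||^2 + lam*||x||_1 by proximal gradient."""
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(iters):
        # residual r = Ax - y and gradient g = A^T r
        r = [sum(A[i][j] * x[j] for j in range(n)) - y[i] for i in range(m)]
        g = [sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
        x = [xi - step * gi for xi, gi in zip(x, g)]
        # soft-thresholding step promotes sparsity
        x = [(1.0 if xi > 0 else -1.0) * max(abs(xi) - step * lam, 0.0)
             for xi in x]
    return x
```

    On a noiseless overdetermined toy problem, the recovered coefficients match the sparse generating model up to a small L1-induced bias.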

  12. Filamentous Phages As a Model System in Soft Matter Physics.

    PubMed

    Dogic, Zvonimir

    2016-01-01

    Filamentous phages have unique physical properties, such as uniform particle lengths, that are not found in other model systems of rod-like colloidal particles. Consequently, suspensions of such phages have provided powerful model systems that have advanced our understanding of soft matter physics in general and liquid crystals in particular. We describe some of these advances. In particular, we briefly summarize how suspensions of filamentous phages have provided valuable insight into the field of colloidal liquid crystals. We also describe recent experiments on filamentous phages that have elucidated a robust pathway for the assembly of 2D membrane-like materials. Finally, we outline unique structural properties of filamentous phages that have so far remained largely unexplored yet have the potential to further advance soft matter physics and materials science. PMID:27446051

  13. An Introduction to the Standard Model of Particle Physics

    NASA Astrophysics Data System (ADS)

    Cottingham, W. Noel; Greenwood, Derek A.

    1999-01-01

    This graduate textbook provides a concise, accessible introduction to the Standard Model of particle physics. Theoretical concepts are developed clearly and carefully throughout the book--from the electromagnetic and weak interactions of leptons and quarks to the strong interactions of quarks. Chapters developing the theory are interspersed with chapters describing some of the wealth of experimental data supporting the model. The book assumes only the standard mathematics taught in an undergraduate physics course; more sophisticated mathematical ideas are developed in the text and in appendices. For graduate students in particle physics and physicists working in other fields who are interested in the current understanding of the ultimate constituents of matter, this textbook provides a lucid and up-to-date introduction.

  14. A stochastic physical system approach to modeling river water quality

    NASA Astrophysics Data System (ADS)

    Curi, W. F.; Unny, T. E.; Kay, J. J.

    1995-06-01

    In this paper, concepts of network thermodynamics are applied to a river water quality model based on the Streeter-Phelps equations, in order to identify the corresponding physical components and their topology. The randomness in the parameters, input coefficients, and initial conditions is then modeled by Gaussian white noise. From the stochastic components of the physical-system description of the problem and concepts of physical system theory, a set of stochastic differential equations can be generated automatically in a computer, and recent developments in the automatic formulation of moment equations based on Ito calculus can be used. This procedure is illustrated through the solution of an example stochastic river water quality problem, and it is also shown how other related problems with different configurations can be solved automatically in a computer using a single piece of software.
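
    The deterministic Streeter-Phelps system underlying the model has a well-known closed form for biochemical oxygen demand L(t) and oxygen deficit D(t); the stochastic version perturbs parameters and inputs with white noise, which this sketch omits.

```python
# Closed-form Streeter-Phelps solution: deoxygenation rate kd,
# reaeration rate ka (ka != kd assumed). Deterministic only.
import math

def streeter_phelps(L0, D0, kd, ka, t):
    """Return (BOD L, oxygen deficit D) at time t."""
    L = L0 * math.exp(-kd * t)
    D = (kd * L0 / (ka - kd)) * (math.exp(-kd * t) - math.exp(-ka * t)) \
        + D0 * math.exp(-ka * t)
    return L, D
```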

  17. Monte Carlo modeling in CT-based geometries: dosimetry for biological modeling experiments with particle beam radiation.

    PubMed

    Diffenderfer, Eric S; Dolney, Derek; Schaettler, Maximilian; Sanzari, Jenine K; McDonough, James; Cengel, Keith A

    2014-03-01

    The space radiation environment imposes increased dangers of exposure to ionizing radiation, particularly during a solar particle event (SPE). These events consist primarily of low energy protons that produce a highly inhomogeneous dose distribution. Due to this inherent dose heterogeneity, experiments designed to investigate the radiobiological effects of SPE radiation present difficulties in evaluating and interpreting dose to sensitive organs. To address this challenge, we used the Geant4 Monte Carlo simulation framework to develop dosimetry software that uses computed tomography (CT) images and provides radiation transport simulations incorporating all relevant physical interaction processes. We found that this simulation accurately predicts measured data in phantoms and can be applied to model dose in radiobiological experiments with animal models exposed to charged particle (electron and proton) beams. This study clearly demonstrates the value of Monte Carlo radiation transport methods for two critically interrelated uses: (i) determining the overall dose distribution and dose levels to specific organ systems for animal experiments with SPE-like radiation, and (ii) interpreting the effect of random and systematic variations in experimental variables (e.g. animal movement during long exposures) on the dose distributions and consequent biological effects from SPE-like radiation exposure. The software developed and validated in this study represents a critically important new tool that allows integration of computational and biological modeling for evaluating the biological outcomes of exposures to inhomogeneous SPE-like radiation dose distributions, and has potential applications for other environmental and therapeutic exposure simulations.
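
    The core Monte Carlo idea, transporting particles through a voxelized geometry and tallying energy per voxel, can be caricatured in a few lines. This toy version attenuates a mono-energetic pencil beam and deposits all energy at the first interaction; a real Geant4 simulation tracks secondaries and the full set of physics processes. All names and values here are illustrative.

```python
# Toy Monte Carlo dose tally in a voxelized (CT-like) slab: each particle
# walks voxel to voxel and deposits its full energy where it first interacts.
import math
import random

def transport(mu, dx, n_particles=10000, e0=1.0, seed=1):
    """mu: per-voxel attenuation coefficients (1/cm); dx: voxel size (cm)."""
    random.seed(seed)
    dose = [0.0] * len(mu)
    for _ in range(n_particles):
        for i, m in enumerate(mu):
            # interaction probability inside one voxel of thickness dx
            if random.random() < 1.0 - math.exp(-m * dx):
                dose[i] += e0            # deposit everything locally
                break                    # particle history ends
    return dose
```

    With uniform attenuation the tallied dose falls off from the entrance voxel inward, as expected for an exponentially attenuated beam.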

  18. Scratch as a computational modelling tool for teaching physics

    NASA Astrophysics Data System (ADS)

    Lopez, Victor; Hernandez, Maria Isabel

    2015-05-01

    The Scratch online authoring tool, which features a simple programming language adapted for primary and secondary students, is being used more and more in schools, as it offers students and teachers the opportunity to build scientific models and evaluate their behaviour, just as can be done with computational modelling programs. In this article, we briefly discuss why Scratch could be a useful tool for computational modelling in the primary or secondary physics classroom, and we present practical examples of how it can be used to build a model.
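
    A typical classroom model of this kind is a frame-by-frame (Euler) update of velocity and position, written here in Python for concreteness; each commented line maps onto a Scratch "change variable by" block. Names and values are illustrative.

```python
# Frame-by-frame (Euler) model of a falling ball, the sort of simulation
# a Scratch project builds with "forever" and "change ... by" blocks.

def falling_ball(y0=100.0, v0=0.0, g=9.8, dt=0.1, steps=10):
    y, v = y0, v0
    trajectory = [y]
    for _ in range(steps):
        v -= g * dt        # change v by -g*dt  (gravity)
        y += v * dt        # change y by v*dt   (motion)
        trajectory.append(y)
    return trajectory
```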

  19. Speedminton: Using the Tactical Games Model in Secondary Physical Education

    ERIC Educational Resources Information Center

    Oh, Hyun-Ju; Bullard, Susan; Hovatter, Rhonda

    2011-01-01

    Teaching and learning of sport and sports-related games dominates the curriculum in most secondary physical education programs in America. For many secondary school students, playing games can be exciting and lead to a lifetime of participation in sport-related activities. Using the Tactical Games Model (TGM) (Mitchell et al., 2006) to teach the…

  20. Physical-scale models of engineered log jams in rivers

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Stream restoration and river engineering projects are employing engineered log jams increasingly for stabilization and in-stream improvements. To further advance the design of these structures and their morphodynamic effects on corridors, the basis for physical-scale models of rivers with engineere...

  1. Advanced Ground Systems Maintenance Physics Models for Diagnostics Project

    NASA Technical Reports Server (NTRS)

    Harp, Janicce Leshay

    2014-01-01

    The project will use high-fidelity physics models and simulations to simulate real-time operations of cryogenic systems and calculate the status/health of those systems. The project enables the delivery of system health advisories to ground system operators. The capability will also be used to conduct planning and analysis of cryogenic system operations.

  2. Project Physics Text 5, Models of the Atom.

    ERIC Educational Resources Information Center

    Harvard Univ., Cambridge, MA. Harvard Project Physics.

    Basic atomic theories are presented in this fifth unit of the Project Physics text for use by senior high students. The chemical basis of atomic models in the early years of the 19th century is discussed in connection with Dalton's theory, atomic properties, and periodic tables. The discovery of electrons is described by using cathode rays, Millikan's…

  3. Project Physics Tests 5, Models of the Atom.

    ERIC Educational Resources Information Center

    Harvard Univ., Cambridge, MA. Harvard Project Physics.

    Test items relating to Project Physics Unit 5 are presented in this booklet. Included are 70 multiple-choice and 23 problem-and-essay questions. Concepts of atomic model are examined on aspects of relativistic corrections, electron emission, photoelectric effects, Compton effect, quantum theories, electrolysis experiments, atomic number and mass,…

  4. Aspects of the Cognitive Model of Physics Problem Solving.

    ERIC Educational Resources Information Center

    Brekke, Stewart E.

    Various aspects of the cognitive model of physics problem solving are discussed in detail including relevant cues, encoding, memory, and input stimuli. The learning process involved in the recognition of familiar and non-familiar sensory stimuli is highlighted. Its four components include selection, acquisition, construction, and integration. The…

  5. Evaluation of an Interdisciplinary, Physically Active Lifestyle Course Model

    ERIC Educational Resources Information Center

    Fede, Marybeth H.

    2009-01-01

    The purpose of this study was to evaluate a fit for life program at a university and to use the findings from an extensive literature review, consultations with formative and summative committees, and data collection to develop an interdisciplinary, physically active lifestyle (IPAL) course model. To address the 5 research questions examined in…

  6. TOWARD EFFICIENT RIPARIAN RESTORATION: INTEGRATING ECONOMIC, PHYSICAL, AND BIOLOGICAL MODELS

    EPA Science Inventory

    This paper integrates economic, biological, and physical models to explore the efficient combination and spatial allocation of conservation efforts to protect water quality and increase salmonid populations in the Grande Ronde basin, Oregon. We focus on the effects of shade on wa...

  7. PHYSICAL AND NUMERICAL MODELING OF ASD EXHAUST DISPERSION AROUND HOUSES

    EPA Science Inventory

    The report discusses the use of a wind tunnel to physically model the dispersion of exhaust plumes from active soil depressurization (ASD) radon mitigation systems in houses. The testing studied the effects of exhaust location (grade level vs. above the eave), house height, roo...

  8. The Role of Computer-Aided Modelling in Learning Physics.

    ERIC Educational Resources Information Center

    Niedderer, H.; And Others

    1991-01-01

    Described is how iconic model-building software can be used to help students gain a deeper qualitative conceptual understanding of physics concepts. The program, STELLA, links research about misconceptions and new teaching strategies with the use of modern information technology tools. (31 references) (KR)

  9. Linear Sigma Model Toolshed for D-brane Physics

    SciTech Connect

    Hellerman, Simeon

    2001-08-23

    Building on earlier work, we construct linear sigma models for strings on curved spaces in the presence of branes. Our models include an extremely general class of brane-worldvolume gauge field configurations. We explain in an accessible manner the mathematical ideas which suggest appropriate worldsheet interactions for generating a given open string background. This construction provides an explanation for the appearance of the derived category in D-brane physics, complementary to that of recent work of Douglas.

  10. Plasma physics modeling and the Cray-2 multiprocessor

    SciTech Connect

    Killeen, J.

    1985-01-01

    The importance of computer modeling in the magnetic fusion energy research program is discussed. The need for the most advanced supercomputers is described. To meet the demand for more powerful scientific computers to solve larger and more complicated problems, the computer industry is developing multiprocessors. The role of the Cray-2 in plasma physics modeling is discussed with some examples. 28 refs., 2 figs., 1 tab.

  11. Two-fluid model for heavy electron physics

    NASA Astrophysics Data System (ADS)

    Yang, Yi-feng

    2016-07-01

    The two-fluid model is a phenomenological description of the gradual change of the itinerant and local characters of f-electrons with temperature and other tuning parameters and has been quite successful in explaining many unusual and puzzling experimental observations in heavy electron materials. We review some of these results and discuss possible implications of the two-fluid model in understanding the microscopic origin of heavy electron physics.

  12. Childhood physical abuse and midlife physical health: Testing a multi-pathway life course model

    PubMed Central

    Springer, K. W.

    2009-01-01

    Although prior research has established that childhood abuse adversely affects midlife physical health outcomes, it is unclear how abuse continues to harm health decades after the abuse has ended. In this project, I assess four life course pathways (behavioral, emotional, cognitive, and social relations) that plausibly link childhood physical abuse to three midlife physical health outcomes (bronchitis diagnosis, ulcer diagnosis, and general physical health). These three outcomes are etiologically distinct, leading to unique testable hypotheses. Multivariate models controlling for childhood background and early adversity were estimated using data from over 3,000 respondents in the Wisconsin Longitudinal Study, USA. The results indicate that midlife social relations and cognition do not function as pathways for any outcome. However, smoking is a crucial pathway connecting childhood abuse with bronchitis; mental health is important for ulcers; and BMI, smoking, and mental health are paramount for general physical health. These findings suggest that abuse survivors’ coping mechanisms can lead to an array of midlife health problems. Furthermore, the results validate the use of etiologically distinct outcomes for understanding plausible causal pathways when using cross-sectional data. PMID:19446943

  13. Childhood physical abuse and midlife physical health: testing a multi-pathway life course model.

    PubMed

    Springer, Kristen W

    2009-07-01

    Although prior research has established that childhood abuse adversely affects midlife physical health, it is unclear how abuse continues to harm health decades after the abuse has ended. In this project, I assess four life course pathways (health behaviors, cognition, mental health, and social relations) that plausibly link childhood physical abuse to three midlife physical health outcomes (bronchitis diagnosis, ulcer diagnosis, and general physical health). These three outcomes are etiologically distinct, leading to unique testable hypotheses. Multivariate models controlling for childhood background and early adversity were estimated using data from over 3000 respondents in the Wisconsin Longitudinal Study, USA. The results indicate that midlife social relations and cognition do not function as pathways for any outcome. However, smoking is a crucial pathway connecting childhood abuse with bronchitis; mental health is important for ulcers; and BMI, smoking, and mental health are paramount for general physical health. These findings suggest that abuse survivors' coping mechanisms can lead to an array of midlife health problems. Furthermore, the results validate the use of etiologically distinct outcomes for understanding plausible causal pathways when using cross-sectional data.

  14. Physical-Socio-Economic Modeling of Climate Change

    NASA Astrophysics Data System (ADS)

    Chamberlain, R. G.; Vatan, F.

    2008-12-01

    Because of the global nature of climate change, any assessment of the effects of plans, policies, and response to climate change demands a model that encompasses the entire Earth System, including socio-economic factors. Physics-based climate models of the factors that drive global temperatures, rainfall patterns, and sea level are necessary but not sufficient to guide decision making. Actions taken by farmers, industrialists, environmentalists, politicians, and other policy makers may result in large changes to economic factors, international relations, food production, disease vectors, and beyond. These consequences will not be felt uniformly around the globe or even across a given region. Policy models must comprehend all of these considerations. Combining physics-based models of the Earth's climate and biosphere with societal models of population dynamics, economics, and politics is a grand challenge with high stakes. We propose to leverage our recent advances in modeling and simulation of military stability and reconstruction operations to build models that address all these areas of concern. Following over twenty years' experience of successful combat simulation, JPL has started developing Minerva, which will add demographic, economic, political, and media/information models to capabilities that already exist. With these new models, for which we have design concepts, it will be possible to address a very wide range of potential national and international problems that were previously inaccessible. Our climate change model builds on Minerva and expands the geographical horizon from playboxes containing regions and neighborhoods to the entire globe. This system consists of a collection of interacting simulation models that specialize in different aspects of the global situation. They will each contribute to and draw from a pool of shared data. The basic models are: the physical model; the demographic model; the political model; the economic model; and the media

  15. Evaluating performances of simplified physically based models for landslide susceptibility

    NASA Astrophysics Data System (ADS)

    Formetta, G.; Capparelli, G.; Versace, P.

    2015-12-01

    Rainfall-induced shallow landslides cause loss of life and significant damage to private and public property, transportation systems, etc. Predicting locations susceptible to shallow landslides is a complex task that involves many disciplines: hydrology, geotechnical science, geomorphology, and statistics. Two main approaches are usually used to accomplish this task: statistical or physically based models. Reliable model application involves automatic parameter calibration, objective quantification of the quality of susceptibility maps, and model sensitivity analysis. This paper presents a methodology to systematically and objectively calibrate, verify, and compare different models and different model performance indicators in order to identify and select the models whose behavior is most reliable for a given case study. The procedure was implemented in a package of models for landslide susceptibility analysis and integrated into the NewAge-JGrass hydrological model. The package includes three simplified physically based models for landslide susceptibility analysis (M1, M2, and M3) and a component for model verification. It computes eight goodness-of-fit indices by comparing model results and measurement data pixel by pixel. Moreover, the package's integration in NewAge-JGrass allows the use of other components, such as geographic information system tools to manage input-output processes and automatic calibration algorithms to estimate model parameters. The system was applied to a case study in Calabria (Italy) along the Salerno-Reggio Calabria highway, between Cosenza and the Altilia municipality. The analysis showed that, among all the optimized indices and all three models, optimizing the distance to perfect classification in the receiver operating characteristic plane (D2PC) coupled with model M3 is the best modeling solution for our test case.
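
    The D2PC index mentioned above has a simple geometric reading: it is the Euclidean distance of a classifier's (false positive rate, true positive rate) point from the perfect-classification corner (0, 1) of the ROC plane. A minimal sketch, with illustrative confusion-matrix counts that are not from the study:

```python
def d2pc(tp, fn, fp, tn):
    """Distance to perfect classification in the ROC plane.

    Perfect classification sits at (FPR, TPR) = (0, 1); D2PC is the
    Euclidean distance from that corner, so smaller is better.
    """
    tpr = tp / (tp + fn)          # sensitivity (true positive rate)
    fpr = fp / (fp + tn)          # 1 - specificity (false positive rate)
    return ((1.0 - tpr) ** 2 + fpr ** 2) ** 0.5

# A perfect pixel-by-pixel susceptibility map scores 0.
print(d2pc(tp=90, fn=10, fp=20, tn=80))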

  16. Pre-Service Physics Teachers' Knowledge of Models and Perceptions of Modelling

    ERIC Educational Resources Information Center

    Ogan-Bekiroglu, Feral

    2006-01-01

    One of the purposes of this study was to examine the differences between knowledge of pre-service physics teachers who experienced model-based teaching in pre-service education and those who did not. Moreover, it was aimed to determine pre-service physics teachers' perceptions of modelling. Posttest-only control group experimental design was used…

  17. Rock Physics Models of Biofilm Growth in Porous Media

    NASA Astrophysics Data System (ADS)

    Jaiswal, P.; alhadhrami, F. M.; Atekwana, E. A.

    2013-12-01

    Recent studies suggest the potential to use acoustic techniques to image biofilm growth in porous media. Nonetheless, the interpretation of the seismic response to biofilm growth and development remains speculative because of the lack of quantitative petrophysical models that can relate changes in biofilm saturation to changes in seismic attributes. Here, we report our efforts in developing quantitative rock physics models that relate biofilm saturation to the increasing and decreasing P-wave velocity (VP) and amplitudes recorded in the Davis et al. [2010] physical scale experiment. We adapted rock physics models developed for modeling gas hydrates in unconsolidated sediments. Two distinct growth models, which appear to be a function of pore throat size, are needed to explain the experimental data. First, introduction of biofilm as an additional mineral grain in the sediment matrix (load-bearing mode) is needed to explain the increasing time-lapse VP. Second, introduction of biofilm as part of the pore fluid (pore-filling mode) is required to explain the decreasing time-lapse VP. To explain the time-lapse VP, up to 15% of the pore volume was required to be saturated with biofilm. The recorded seismic amplitudes, which can be expressed as a function of porosity, permeability and grain size, showed a monotonic time-lapse decay except on Day 3 at a few selected locations, where it increased. Since porosity changes are constrained by VP, the amplitude increase could be modeled by increasing hydraulic conductivity. Time-lapse VP at locations with increasing amplitudes suggests that these locations have a load-bearing growth style. We conclude that permeability can increase by up to 10% at low (~2%) biofilm saturation in load-bearing growth style due to the development of channels within the biofilm structure. Developing a rock physics model for the biofilm growth in general may help create a field guide for interpreting porosity and permeability changes in bioremediation, MEOR and
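
    The contrast between the two growth modes can be illustrated with the standard stiff/soft mixing bounds of rock physics. This is a hedged sketch, not the gas-hydrate-derived model the authors adapted: a load-bearing phase pushes the effective moduli toward the Voigt (iso-strain) bound, while a pore-filling phase pulls them toward the Reuss (iso-stress) bound. The moduli, density, and 15% fraction below are illustrative numbers only:

```python
def voigt(f, m1, m2):
    """Stiff (iso-strain) average: appropriate for a load-bearing phase."""
    return f * m1 + (1.0 - f) * m2

def reuss(f, m1, m2):
    """Soft (iso-stress) average: appropriate for a pore-filling phase."""
    return 1.0 / (f / m1 + (1.0 - f) / m2)

def vp(k, mu, rho):
    """P-wave velocity from bulk modulus k, shear modulus mu, density rho (SI)."""
    return ((k + 4.0 * mu / 3.0) / rho) ** 0.5

# Illustrative values (NOT from the study): 15% soft biofilm phase mixed
# with a quartz-like frame, same bulk density for both mixing modes.
F, K_BIO, MU_BIO, K_QTZ, MU_QTZ, RHO = 0.15, 2.0e9, 0.1e9, 37.0e9, 44.0e9, 2200.0
vp_load = vp(voigt(F, K_BIO, K_QTZ), voigt(F, MU_BIO, MU_QTZ), RHO)
vp_pore = vp(reuss(F, K_BIO, K_QTZ), reuss(F, MU_BIO, MU_QTZ), RHO)
print(vp_load, vp_pore)   # load-bearing mixing is always the stiffer bound
```

The qualitative point is the ordering: the same biofilm fraction raises VP far more in load-bearing mode than in pore-filling mode, mirroring the increasing vs. decreasing time-lapse VP in the abstract.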

  18. The ESA Meteoroid Model 2010: Enhanced Physical Model

    NASA Astrophysics Data System (ADS)

    Dikarev, Valeri; Mints, Alexey; Drolshagen, Gerhard

    The orbital distributions of meteoroids in interplanetary space are revised in the ESA meteoroid model. In the present update, the chemical composition of the meteoroids is simulated in more detail than in the previous meteoroid models. Silicate and carbonaceous fractions are introduced for all meteoroid populations, and in addition to asteroids and Jupiter-crossing comets, comet 2P/Encke is added as a source. The orbital evolution under planetary gravity, Poynting-Robertson effect and mutual collisions is simulated using analytical approximations. Infrared observations of the zodiacal cloud by the COBE DIRBE instrument, in situ flux measurements by the dust detectors on board Galileo, Ulysses, Pioneer 11 and Helios-1 spacecraft, and the crater size distributions on lunar rock samples retrieved by the Apollo missions are incorporated in the model.

  19. Application of physical parameter identification to finite-element models

    NASA Technical Reports Server (NTRS)

    Bronowicki, Allen J.; Lukich, Michael S.; Kuritz, Steven P.

    1987-01-01

    The time domain parameter identification method described previously is applied to TRW's Large Space Structure Truss Experiment. Only control sensors and actuators are employed in the test procedure. The fit of the linear structural model to the test data is improved by more than an order of magnitude using a physically reasonable parameter set. The electromagnetic control actuators are found to contribute significant damping due to a combination of eddy current and back electromotive force (EMF) effects. Uncertainties in both estimated physical parameters and modal behavior variables are given.
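
    As a hedged illustration of extracting a physical damping parameter from response data (this is the classical logarithmic-decrement method, not the time-domain identification procedure of the paper):

```python
import math

def log_decrement(peaks):
    """Estimate the damping ratio of a lightly damped mode from the
    successive peak amplitudes of its free-decay response."""
    n = len(peaks) - 1
    delta = math.log(peaks[0] / peaks[-1]) / n      # average log decrement
    return delta / math.sqrt(4.0 * math.pi ** 2 + delta ** 2)

# Synthetic free decay with a known 2% damping ratio:
zeta = 0.02
delta = 2.0 * math.pi * zeta / math.sqrt(1.0 - zeta ** 2)
peaks = [math.exp(-delta * k) for k in range(6)]
print(round(log_decrement(peaks), 4))   # recovers the 2% damping ratio
```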

  20. Future high precision experiments and new physics beyond Standard Model

    SciTech Connect

    Luo, Mingxing

    1993-04-01

    High precision (< 1%) electroweak experiments that have been done or are likely to be done in this decade are examined on the basis of Standard Model (SM) predictions of fourteen weak neutral current observables and fifteen W and Z properties to the one-loop level, the implications of the corresponding experimental measurements to various types of possible new physics that enter at the tree or loop level were investigated. Certain experiments appear to have special promise as probes of the new physics considered here.

  2. Model Independent Search For New Physics At The Tevatron

    SciTech Connect

    Choudalakis, Georgios

    2008-04-01

    The Standard Model of elementary particles cannot be the final theory. There are theoretical reasons to expect the appearance of new physics, possibly at the energy scale of a few TeV. Several possible theories of new physics have been proposed, each with unknown probability of being confirmed. Instead of arbitrarily choosing to examine one of those theories, this thesis is about searching for any sign of new physics in a model-independent way. This search is performed at the Collider Detector at Fermilab (CDF). The Standard Model prediction is implemented in all final states simultaneously, and an array of statistical probes is employed to search for significant discrepancies between data and prediction. The probes are sensitive to overall population discrepancies, shape disagreements in distributions of kinematic quantities of final particles, excesses of events of large total transverse momentum, and local excesses of data expected from resonances due to new massive particles. The result of this search, first in 1 fb-1 and then in 2 fb-1, is null: no significant evidence of new physics was found.

  3. Evaluating nuclear physics inputs in core-collapse supernova models

    SciTech Connect

    Lentz, Eric J; Hix, William Raphael; Baird, Mark L; Messer, Bronson; Mezzacappa, Anthony

    2010-01-01

    Core-collapse supernova models depend on the details of the nuclear and weak interaction physics inputs just as they depend on the details of the macroscopic physics (transport, hydrodynamics, etc.), numerical methods, and progenitors. We present the results of our ongoing comparison studies of nuclear and weak interaction physics inputs to core-collapse supernova models using the spherically symmetric, general relativistic, neutrino radiation hydrodynamics code Agile-Boltztran. We focus on comparisons of the effects of the nuclear EoS and the effects of improving the opacities, particularly neutrino-nucleon interactions. We also investigate the feedback between different EoSs and opacities in the context of different progenitors.

  4. Precision Higgs Boson Physics and Implications for Beyond the Standard Model Physics Theories

    SciTech Connect

    Wells, James

    2015-06-10

    The discovery of the Higgs boson is one of science's most impressive recent achievements. We have taken a leap forward in understanding what is at the heart of elementary particle mass generation. We now have a significant opportunity to develop even deeper understanding of how the fundamental laws of nature are constructed. As such, we need intense focus from the scientific community to put this discovery in its proper context, to realign and narrow our understanding of viable theory based on this positive discovery, and to detail the implications the discovery has for theories that attempt to answer questions beyond what the Standard Model can explain. This project's first main objective is to develop a state-of-the-art analysis of precision Higgs boson physics. This is to be done in the tradition of the electroweak precision measurements of the LEP/SLC era. Indeed, the electroweak precision studies of the past are necessary inputs to the full precision Higgs program. Calculations will be presented to the community of Higgs boson observables that detail just how well various couplings of the Higgs boson can be measured, and more. These will be carried out using state-of-the-art theory computations coupled with the new experimental results coming in from the LHC. The project's second main objective is to utilize the results obtained from LHC Higgs boson experiments and the precision analysis, along with the direct search studies at LHC, and discern viable theories of physics beyond the Standard Model that unify physics to a deeper level. Studies will be performed on supersymmetric theories, theories of extra spatial dimensions (and related theories, such as compositeness), and theories that contain hidden sector states uniquely accessible to the Higgs boson.
In addition, if data becomes incompatible with the Standard Model's low-energy effective Lagrangian, new physics theories will be developed that explain the anomaly and put it into a more unified framework beyond

  5. Systems and models with anticipation in physics and its applications

    NASA Astrophysics Data System (ADS)

    Makarenko, A.

    2012-11-01

    Investigations of recent physical processes and real applications of models require ever more refined models that incorporate new properties. One such property is anticipation, i.e., taking advanced effects into account. We consider a special kind of advanced system, namely the strong anticipatory systems introduced by D. Dubois. Some definitions, examples, and peculiarities of the solutions are described. The main feature is the presumable multivaluedness of the solutions. Possible physical examples of such systems are proposed: self-organization problems; dynamical chaos; synchronization; advanced potentials; structures at the micro-, meso-, and macro-levels; cellular automata; computing; neural network theory. Applications to modeling social, economic, technical, and natural systems are also described.

  6. Unifying wildfire models from ecology and statistical physics.

    PubMed

    Zinck, Richard D; Grimm, Volker

    2009-11-01

    Understanding the dynamics of wildfire regimes is crucial for both regional forest management and predicting global interactions between fire regimes and climate. Accordingly, spatially explicit modeling of forest fire ecosystems is a very active field of research, including both generic and highly specific models. There is, however, a second field in which wildfire has served as a metaphor for more than 20 years: statistical physics. So far, there has been only limited interaction between these two fields of wildfire modeling. Here we show that two typical generic wildfire models from ecology are structurally equivalent to the most commonly used model from statistical physics. All three models can be unified to a single model in which they appear as special cases of regrowth-dependent flammability. This local "ecological memory" of former fire events is key to self-organization in wildfire ecosystems. The unified model is able to reproduce three different patterns observed in real boreal forests: fire size distributions, fire shapes, and a hump-shaped relationship between disturbance intensity (average annual area burned) and diversity of succession stages. The unification enables us to bring together insights from both disciplines in a novel way and to identify limitations that provide starting points for further research.
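
    The key mechanism of the unified model, regrowth-dependent flammability ("ecological memory"), can be sketched as a toy cellular automaton. Grid size, maturity age, and the one-strike-per-step lightning rate below are arbitrary choices, not parameters of the published model:

```python
import random

N, T_MAT = 64, 50        # grid size; age at which a cell is fully flammable
random.seed(0)           # reproducible run
age = [[random.randint(0, T_MAT) for _ in range(N)] for _ in range(N)]
total_burned = 0

def spread(x, y, burned):
    """Fire spread with regrowth-dependent flammability: the probability
    that a cell catches fire grows with its time since the last burn."""
    stack = [(x, y)]
    while stack:
        i, j = stack.pop()
        if (i, j) in burned:
            continue
        if random.random() < min(1.0, age[i][j] / T_MAT):
            burned.add((i, j))
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                stack.append(((i + di) % N, (j + dj) % N))

for t in range(200):                        # one lightning strike per step
    burned = set()
    spread(random.randrange(N), random.randrange(N), burned)
    total_burned += len(burned)
    for i in range(N):
        for j in range(N):                  # burned cells reset, others regrow
            age[i][j] = 0 if (i, j) in burned else age[i][j] + 1
```

Young (recently burned) patches act as firebreaks, which is the self-organizing feedback the abstract identifies; fire-size statistics emerge from the interplay of regrowth and strikes.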

  8. Physically-based modeling and simulation of extraocular muscles.

    PubMed

    Wei, Qi; Sueda, Shinjiro; Pai, Dinesh K

    2010-12-01

    Dynamic simulation of human eye movements, with realistic physical models of extraocular muscles (EOMs), may greatly advance our understanding of the complexities of the oculomotor system and aid in treatment of visuomotor disorders. In this paper we describe the first three dimensional (3D) biomechanical model which can simulate the dynamics of ocular motility at interactive rates. We represent EOMs using "strands", which are physical primitives that can model an EOM's complex nonlinear anatomical and physiological properties. Contact between the EOMs, the globe, and orbital structures can be explicitly modeled. Several studies were performed to assess the validity and utility of the model. EOM deformation during smooth pursuit was simulated and compared with published experimental data; the model reproduces qualitative features of the observed nonuniformity. The model is able to reproduce realistic saccadic trajectories when the lateral rectus muscle was driven by published measurements of abducens neuron discharge. Finally, acute superior oblique palsy, a pathological condition, was simulated to further evaluate the system behavior; the predicted deviation patterns agree qualitatively with experimental observations. This example also demonstrates potential clinical applications of such a model. PMID:20868704

  9. Evaluating plume dispersion models: Expanding the practice to include the model physics

    SciTech Connect

    Weil, J.C.

    1994-12-31

    Plume dispersion models are used in a variety of air-quality applications including the determination of source emission limits, new source sites, etc. The cost of pollution control and siting has generated much interest in model evaluation and accuracy. Two questions are of primary concern: (1) How well does a model predict the high ground-level concentrations (GLCs) that are necessary in assessing compliance with air-quality regulations? This prompts an operational performance evaluation; (2) Is the model based on sound physical principles and does it give good predictions for the "right" reasons? This prompts a model physics evaluation. Although air-quality managers are interested primarily in operational performance, model physics should be an equally important issue. The purpose in establishing good physics is to build confidence in model predictions beyond the limited experimental range, i.e., for new source applications.

  10. Microphysics in Multi-scale Modeling System with Unified Physics

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo

    2012-01-01

    Recently, a multi-scale modeling system with unified physics was developed at NASA Goddard. It consists of (1) a cloud-resolving model (the Goddard Cumulus Ensemble model, GCE model), (2) a regional-scale model (the NASA unified Weather Research and Forecasting model, WRF), (3) a coupled CRM and global model (the Goddard Multi-scale Modeling Framework, MMF), and (4) a land modeling system. The same microphysical processes, long- and short-wave radiative transfer, land processes, and the explicit cloud-radiation and cloud-land surface interactive processes are applied in this multi-scale modeling system. This modeling system has been coupled with a multi-satellite simulator to use NASA high-resolution satellite data to identify the strengths and weaknesses of cloud and precipitation processes simulated by the model. In this talk, a review of developments and applications of the multi-scale modeling system will be presented. In particular, the microphysics development and its performance within the multi-scale modeling system will be presented.

  11. Physical security and vulnerability modeling for infrastructure facilities.

    SciTech Connect

    Nozick, Linda Karen; Jones, Dean A.; Davis, Chad Edward; Turnquist, Mark Alan

    2006-07-01

    A model of malicious intrusions in infrastructure facilities is developed, using a network representation of the system structure together with Markov models of intruder progress and strategy. This structure provides an explicit mechanism to estimate the probability of successful breaches of physical security, and to evaluate potential improvements. Simulation is used to analyze varying levels of imperfect information on the part of the intruders in planning their attacks. An example of an intruder attempting to place an explosive device on an airplane at an airport gate illustrates the structure and potential application of the model.
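
    A stripped-down version of the Markov idea can be sketched as follows; the layered-chain structure and the parameter names are hypothetical illustrations, not the network model of the report:

```python
def breach_probability(p_adv, p_det, layers):
    """Chance an intruder defeats all security layers before being detected.

    Hypothetical chain: from each layer the intruder advances with
    probability p_adv, is detected (an absorbing failure) with p_det,
    and otherwise stays and retries. Solving the fixed point
        v[k] = p_adv * v[k+1] + (1 - p_adv - p_det) * v[k]
    gives v[k] = (p_adv / (p_adv + p_det)) * v[k+1], i.e. a per-layer
    success factor raised to the number of layers.
    """
    return (p_adv / (p_adv + p_det)) ** layers

print(breach_probability(0.3, 0.1, 3))   # 0.75 ** 3 = 0.421875
```

The same structure shows how a potential improvement is evaluated: raising detection at any layer shrinks the per-layer factor, and adding a layer multiplies the breach probability by it again.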

  12. Chemical and physical models of the Jovian subnebula

    NASA Technical Reports Server (NTRS)

    Lewis, J. S.

    1982-01-01

    A semiempirical physical model of the Jovian subnebula was developed by analogy with the primitive solar nebula itself. The chemical aspects of this model are developed according to the principles developed in the study of the thermochemistry and gas kinetic behavior of the solar nebula, but with important modifications to take into account the higher pressures and densities in the Jovian subnebula. The bulk compositions and densities of the inner satellites of Jupiter are calculated. It is proposed that Europa differs from Io chiefly in that it has suffered a less severe thermal history. The general features of this model are applicable with minor modification to the systems of Saturn and Uranus.

  13. Physics-Based Reactive Burn Model: Grain Size Effects

    NASA Astrophysics Data System (ADS)

    Lu, X.; Hamate, Y.; Horie, Y.

    2007-12-01

    We have been developing a physics-based reactive burn (PBRB) model, formulated on the concept of a statistical hot spot cell. In the model, the essential thermomechanical and physicochemical features are explicitly modeled. In this paper, we have extended the statistical hot spot model to explicitly describe the ignition and growth of hot spots. In particular, grain size effects are explicitly delineated through the introduction of a grain-size-dependent hot-region thickness, energy deposition criterion, and specific surface area. Besides the linear relationships of the run distance to detonation and the critical diameter with the reciprocal specific surface area of heterogeneous explosives (HE), which are based on the original model and discussed in a parallel paper at this meeting, parametric studies have shown that the extended PBRB model can predict a non-monotonic variation of shock sensitivity with grain size, as observed by Moulard et al.

  14. An integrated physical and biological model for anaerobic lagoons.

    PubMed

    Wu, Binxin; Chen, Zhenbin

    2011-04-01

    A computational fluid dynamics (CFD) model that integrates physical and biological processes for anaerobic lagoons is presented. In the model development, turbulence is represented using a transition k-ω model, heat conduction and solar radiation are included in the thermal model, biological oxygen demand (BOD) reduction is characterized by first-order kinetics, and methane yield rate is expressed as a linear function of temperature. A test of the model applicability is conducted in a covered lagoon digester operated under tropical climate conditions. The commercial CFD software, ANSYS-Fluent, is employed to solve the integrated model. The simulation procedures include solving fluid flow and heat transfer, predicting local resident time based on the converged flow fields, and calculating the BOD reduction and methane production. The simulated results show that monthly methane production varies insignificantly, but the time to achieve a 99% BOD reduction in January is much longer than that in July.
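
    The first-order BOD kinetics in the model imply BOD(t) = BOD0 * exp(-kt), so the time to a 99% reduction is ln(100)/k; a cooler month lowers k and stretches that time, consistent with the January/July contrast reported. A sketch with an Arrhenius-type temperature correction (theta = 1.047 is a common wastewater convention; all numbers are illustrative, not the paper's calibration):

```python
import math

def bod_remaining(bod0, k, t):
    """First-order BOD decay: BOD(t) = BOD0 * exp(-k * t)."""
    return bod0 * math.exp(-k * t)

def time_to_reduction(k, fraction=0.99):
    """Days needed to remove the given fraction of BOD at rate k (1/day)."""
    return -math.log(1.0 - fraction) / k

def k_at_temperature(k20, temp_c, theta=1.047):
    """Illustrative Arrhenius-type correction of the 20 C rate constant."""
    return k20 * theta ** (temp_c - 20.0)

k_jan = k_at_temperature(0.25, 15.0)   # cooler month -> slower kinetics
k_jul = k_at_temperature(0.25, 28.0)
print(time_to_reduction(k_jan), time_to_reduction(k_jul))
```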

  15. Physical modeling of traffic with stochastic cellular automata

    SciTech Connect

    Schreckenberg, M.; Nagel, K.

    1995-09-01

    A new type of probabilistic cellular automaton for the physical description of single and multilane traffic is presented. In this model space, time and the velocity of the cars are represented by integer numbers (as usual in cellular automata) with local update rules for the velocity. The model is very efficient for both numerical simulations and analytical investigations. The numerical results from extensive simulations reproduce very well data taken from real traffic (e.g. fundamental diagrams). Several analytical results for the model are presented as well as new approximation schemes for stationary traffic. In addition the relation to continuum hydrodynamic theory (Lighthill-Whitham) and the follow-the-leader models is discussed. The model is part of an interdisciplinary research program in Northrhine-Westfalia (``NRW Forschungsverbund Verkehrssimulation``) for the construction of a large scale microsimulation model for network traffic, supported by the government of NRW.
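The single-lane update rules of this cellular automaton (the Nagel-Schreckenberg model) are simple enough to sketch directly; the parameter values below are illustrative defaults, not taken from the paper:

```python
import random

def nasch_step(pos, vel, road_len, vmax=5, p_slow=0.3, rng=random):
    """One parallel update of the Nagel-Schreckenberg single-lane CA.
    pos: sorted cell indices on a periodic road; vel: matching integer velocities."""
    n = len(pos)
    new_vel = []
    for i in range(n):
        gap = (pos[(i + 1) % n] - pos[i]) % road_len  # cells to the car ahead
        if n == 1:
            gap = road_len                # a lone car sees an empty road
        v = min(vel[i] + 1, vmax)         # 1. acceleration
        v = min(v, gap - 1)               # 2. braking to avoid collision
        if v > 0 and rng.random() < p_slow:
            v -= 1                        # 3. random slowdown (stochastic element)
        new_vel.append(max(v, 0))
    new_pos = [(pos[i] + new_vel[i]) % road_len for i in range(n)]  # 4. motion
    return new_pos, new_vel
```

Iterating this step and averaging flow versus density reproduces the fundamental-diagram behavior the abstract mentions.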

  16. A Framework for Understanding Physics Students' Computational Modeling Practices

    NASA Astrophysics Data System (ADS)

    Lunk, Brandon Robert

    With the growing push to include computational modeling in the physics classroom, we are faced with the need to better understand students' computational modeling practices. While existing research on programming comprehension explores how novices and experts generate programming algorithms, little of this discusses how domain content knowledge, and physics knowledge in particular, can influence students' programming practices. In an effort to better understand this issue, I have developed a framework for modeling these practices based on a resource stance towards student knowledge. A resource framework models knowledge as the activation of vast networks of elements called "resources." Much like neurons in the brain, resources that become active can trigger cascading events of activation throughout the broader network. This model emphasizes the connectivity between knowledge elements and provides a description of students' knowledge base. Together with resources, the concepts of "epistemic games" and "frames" provide a means for addressing the interaction between content knowledge and practices. Although this framework has generally been limited to describing conceptual and mathematical understanding, it also provides a means for addressing students' programming practices. In this dissertation, I will demonstrate this facet of a resource framework as well as fill in an important missing piece: a set of epistemic games that can describe students' computational modeling strategies. The development of this theoretical framework emerged from the analysis of video data of students generating computational models during the laboratory component of a Matter & Interactions: Modern Mechanics course. Student participants across two semesters were recorded as they worked in groups to fix pre-written computational models that were initially missing key lines of code.
Analysis of this video data showed that the students' programming practices were highly influenced by

  17. Influence of a health-related physical fitness model on students' physical activity, perceived competence, and enjoyment.

    PubMed

    Fu, You; Gao, Zan; Hannon, James; Shultz, Barry; Newton, Maria; Sibthorp, Jim

    2013-12-01

    This study was designed to explore the effects of a health-related physical fitness physical education model on students' physical activity, perceived competence, and enjoyment. 61 students (25 boys, 36 girls; M age = 12.6 yr., SD = 0.6) were assigned to two groups (health-related physical fitness physical education group, and traditional physical education group), and participated in one 50-min. weekly basketball class for 6 wk. Students' in-class physical activity was assessed using NL-1000 pedometers. The physical subscale of the Perceived Competence Scale for Children was employed to assess perceived competence, and children's enjoyment was measured using the Sport Enjoyment Scale. The findings suggest that students in the intervention group increased their perceived competence, enjoyment, and physical activity over a 6-wk. intervention, while the comparison group simply increased physical activity over time. Children in the intervention group had significantly greater enjoyment.

  18. A Goddard Multi-Scale Modeling System with Unified Physics

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo

    2010-01-01

    A multi-scale modeling system with unified physics has been developed at NASA Goddard Space Flight Center (GSFC). The system consists of an MMF, the coupled NASA Goddard finite-volume GCM (fvGCM) and Goddard Cumulus Ensemble model (GCE, a CRM); the state-of-the-art Weather Research and Forecasting model (WRF) and the stand-alone GCE. These models can share the same microphysical schemes, radiation (including explicitly calculated cloud optical properties), and surface models that have been developed, improved and tested for different environments. In this talk, I will present: (1) A brief review of the GCE model and its applications to the impact of aerosols on deep precipitation processes, (2) The Goddard MMF and the major difference between two existing MMFs (CSU MMF and Goddard MMF), and preliminary results (the comparison with traditional GCMs), and (3) A discussion of the Goddard WRF version (its developments and applications). We are also performing the inline tracer calculation to comprehend the physical processes (i.e., boundary layer and each quadrant in the boundary layer) related to the development and structure of hurricanes and mesoscale convective systems. In addition, high-resolution (spatial, 2 km, and temporal, 1 minute) visualization showing the model results will be presented.

  19. A physically based model of global freshwater surface temperature

    NASA Astrophysics Data System (ADS)

    Beek, Ludovicus P. H.; Eikelboom, Tessa; Vliet, Michelle T. H.; Bierkens, Marc F. P.

    2012-09-01

    Temperature determines a range of physical properties of water and exerts a strong control on surface water biogeochemistry. Thus, in freshwater ecosystems the thermal regime directly affects the geographical distribution of aquatic species through their growth and metabolism and indirectly through their tolerance to parasites and diseases. Models used to predict surface water temperature range between physically based deterministic models and statistical approaches. Here we present the initial results of a physically based deterministic model of global freshwater surface temperature. The model adds a surface water energy balance to river discharge modeled by the global hydrological model PCR-GLOBWB. In addition to advection of energy from direct precipitation, runoff, and lateral exchange along the drainage network, energy is exchanged between the water body and the atmosphere by shortwave and longwave radiation and sensible and latent heat fluxes. Also included are ice formation and its effect on heat storage and river hydraulics. We use the coupled surface water and energy balance model to simulate global freshwater surface temperature at daily time steps with a spatial resolution of 0.5° on a regular grid for the period 1976-2000. We opt to parameterize the model with globally available data and apply it without calibration in order to preserve its physical basis with the outlook of evaluating the effects of atmospheric warming on freshwater surface temperature. We validate our simulation results with daily temperature data from rivers and lakes (U.S. Geological Survey (USGS), limited to the USA) and compare mean monthly temperatures with those recorded in the Global Environment Monitoring System (GEMS) data set. Results show that the model is able to capture the mean monthly surface temperature for the majority of the GEMS stations, while the interannual variability as derived from the USGS and NOAA data was captured reasonably well. Results are poorest for
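The surface water energy balance described here can be illustrated with a minimal explicit time step; the flux values and the crude 0 °C ice floor below are illustrative assumptions, not the PCR-GLOBWB formulation:

```python
RHO_W = 1000.0   # water density [kg m^-3]
CP_W = 4186.0    # specific heat of water [J kg^-1 K^-1]

def step_water_temp(T, depth, sw_net, lw_net, sensible, latent, dt=86400.0):
    """Advance surface water temperature [deg C] by one explicit time step.
    Fluxes in W m^-2: radiative terms positive into the water column,
    sensible/latent positive out of it; depth is the mixed water depth in m."""
    net_flux = sw_net + lw_net - sensible - latent
    dT = net_flux * dt / (RHO_W * CP_W * depth)
    return max(T + dT, 0.0)  # crude stand-in for ice formation: floor at 0 deg C
```

For example, a net input of 50 W m^-2 over a day warms a 2 m column by roughly 0.5 °C, which gives a feel for the daily time stepping the model uses.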

  20. Dynamic inverse models in human-cyber-physical systems

    NASA Astrophysics Data System (ADS)

    Robinson, Ryan M.; Scobee, Dexter R. R.; Burden, Samuel A.; Sastry, S. Shankar

    2016-05-01

    Human interaction with the physical world is increasingly mediated by automation. This interaction is characterized by dynamic coupling between robotic (i.e. cyber) and neuromechanical (i.e. human) decision-making agents. Guaranteeing performance of such human-cyber-physical systems will require predictive mathematical models of this dynamic coupling. Toward this end, we propose a rapprochement between robotics and neuromechanics premised on the existence of internal forward and inverse models in the human agent. We hypothesize that, in tele-robotic applications of interest, a human operator learns to invert automation dynamics, directly translating from desired task to required control input. By formulating the model inversion problem in the context of a tracking task for a nonlinear control system in control-affine form, we derive criteria for exponential tracking and show that the resulting dynamic inverse model generally renders a portion of the physical system state (i.e., the internal dynamics) unobservable from the human operator's perspective. Under stability conditions, we show that the human can achieve exponential tracking without formulating an estimate of the system's state so long as they possess an accurate model of the system's dynamics. These theoretical results are illustrated using a planar quadrotor example. We then demonstrate that the automation can intervene to improve performance of the tracking task by solving an optimal control problem. Performance is guaranteed to improve under the assumption that the human learns and inverts the dynamic model of the altered system. We conclude with a discussion of practical limitations that may hinder exact dynamic model inversion.

  1. SU-E-T-53: Benchmarking a Monte Carlo Model for Patient Plane Leakage Calculations of Low Energy 6MV Unique Linacs

    SciTech Connect

    Constantin, M; Sawkey, D; Johnsen, S; Hsu, H

    2014-06-01

    Purpose: To validate the physics parameters of a Monte Carlo model for patient plane leakage calculations on the 6MV Unique linac by comparing the simulations against IEC patient plane leakage measurements. The benchmarked model can further be used for shielding design optimization, to predict leakage in the proximity of intended treatment fields, reduce the system weight and cost, and improve components reliability. Methods: The treatment head geometry of the Unique linac was simulated in Geant4 (v9.4.p02 with “Opt3” standard electromagnetic physics list) based on CAD drawings of all collimation and shielding components projected from the target to the area within 2m from isocenter. A 4×4m2 scorer was inserted 1m from the target in the patient plane and multiple phase space files were recorded by performing a 40-node computing cluster simulation on the EC2 cloud. The photon energy fluence was calculated relative to the value at isocenter for a 10×10cm2 field using 10×10mm2 bins. Tungsten blocks were parked accordingly to represent MLC120. The secondary particle contamination to patient plane was eliminated by “killing” those particles prior to the primary collimator entrance using a “kill-plane”, which represented the upper head shielding components not being modeled. Both IEC patient-plane leakage and X/Y-jaws transmission were simulated. Results: The contribution of photons to energy fluence was 0.064% on average, in excellent agreement with the experimental data available at 0.5, 1.0, and 1.5m from isocenter, characterized by an average leakage of 0.045% and a maximum leakage of 0.085%. X- and Y-jaws transmissions of 0.43% and 0.44% were found in good agreement with measurements of 0.48% and 0.43%, respectively. Conclusion: A Geant4 model based on energy fluence calculations for the 6MV Unique linac was created and validated using IEC patient plane leakage measurements. The “kill-plane” has effectively eliminated electron contamination to

  2. A minimal physical model captures the shapes of crawling cells

    NASA Astrophysics Data System (ADS)

    Tjhung, E.; Tiribocchi, A.; Marenduzzo, D.; Cates, M. E.

    2015-01-01

    Cell motility in higher organisms (eukaryotes) is crucial to biological functions ranging from wound healing to immune response, and is also implicated in diseases such as cancer. For cells crawling on hard surfaces, significant insights into motility have been gained from experiments replicating such motion in vitro. Such experiments show that crawling uses a combination of actin treadmilling (polymerization), which pushes the front of a cell forward, and myosin-induced stress (contractility), which retracts the rear. Here we present a simplified physical model of a crawling cell, consisting of a droplet of active polar fluid with contractility throughout, but treadmilling confined to a thin layer near the supporting wall. The model shows a variety of shapes and/or motility regimes, some closely resembling cases seen experimentally. Our work strongly supports the view that cellular motility exploits autonomous physical mechanisms whose operation does not need continuous regulatory effort.

  3. Physical model assisted probability of detection in nondestructive evaluation

    SciTech Connect

    Li, M.; Meeker, W. Q.; Thompson, R. B.

    2011-06-23

    Nondestructive evaluation is used widely in many engineering and industrial areas to detect defects or flaws such as cracks inside parts or structures during manufacturing or for products in service. The standard statistical model is a simple empirical linear regression between the (possibly transformed) signal response variables and the (possibly transformed) explanatory variables. For some applications, such a simple empirical approach is inadequate. An important alternative approach is to use knowledge of the physics of the inspection process to provide information about the underlying relationship between the response and explanatory variables. Use of such knowledge can greatly increase the power and accuracy of the statistical analysis and enable, when needed, proper extrapolation outside the range of the observed explanatory variables. This paper describes a set of physical model-assisted analyses to study the capability of two different ultrasonic testing inspection methods to detect synthetic hard alpha inclusion and flat-bottom hole defects in a titanium forging disk.

  4. Model of cosmology and particle physics at an intermediate scale

    SciTech Connect

    Bastero-Gil, M.; Di Clemente, V.; King, S. F.

    2005-05-15

    We propose a model of cosmology and particle physics in which all relevant scales arise in a natural way from an intermediate string scale. We are led to assign the string scale to the intermediate scale M_* ≈ 10^13 GeV by four independent pieces of physics: electroweak symmetry breaking; the μ parameter; the axion scale; and the neutrino mass scale. The model involves hybrid inflation with the waterfall field N being responsible for generating the μ term, the right-handed neutrino mass scale, and the Peccei-Quinn symmetry breaking scale. The large scale structure of the Universe is generated by the lightest right-handed sneutrino playing the role of a coupled curvaton. We show that the correct curvature perturbations may be successfully generated providing the lightest right-handed neutrino is weakly coupled in the seesaw mechanism, consistent with sequential dominance.

  5. A Linearization Approach for Rational Nonlinear Models in Mathematical Physics

    NASA Astrophysics Data System (ADS)

    Van Gorder, Robert A.

    2012-04-01

    In this paper, a novel method for linearization of rational second order nonlinear models is discussed. In particular, we discuss an application of the δ expansion method (created to deal with problems in Quantum Field Theory) which will enable both the linearization and perturbation expansion of such equations. Such a method allows for one to quickly obtain the order zero perturbation theory in terms of certain special functions which are governed by linear equations. Higher order perturbation theories can then be obtained in terms of such special functions. One benefit to such a method is that it may be applied even to models without small physical parameters, as the perturbation is given in terms of the degree of nonlinearity, rather than any physical parameter. As an application, we discuss a method of linearizing the six Painlevé equations by an application of the method. In addition to highlighting the benefits of the method, we discuss certain shortcomings of the method.

  6. Physics validation studies for muon collider detector background simulations

    SciTech Connect

    Morris, Aaron Owen; /Northern Illinois U.

    2011-07-01

    Within the broad discipline of physics, the study of the fundamental forces of nature and the most basic constituents of the universe belongs to the field of particle physics. While frequently referred to as 'high-energy physics,' or by the acronym 'HEP,' particle physics is not driven just by the quest for ever-greater energies in particle accelerators. Rather, particle physics is seen as having three distinct areas of focus: the cosmic, intensity, and energy frontiers. These three frontiers all provide different, but complementary, views of the basic building blocks of the universe. Currently, the energy frontier is the realm of hadron colliders like the Tevatron at Fermi National Accelerator Laboratory (Fermilab) or the Large Hadron Collider (LHC) at CERN. While the LHC is expected to be adequate for explorations up to 14 TeV for the next decade, the long development lead time for modern colliders necessitates research and development efforts in the present for the next generation of colliders. This paper focuses on one such next-generation machine: a muon collider. Specifically, this paper focuses on Monte Carlo simulations of beam-induced backgrounds vis-a-vis detector region contamination. Initial validation studies of a few muon collider physics background processes using G4beamline have been undertaken and results presented. While these investigations have revealed a number of hurdles to getting G4beamline up to the level of more established simulation suites, such as MARS, the close communication between us, as users, and the G4beamline developer, Tom Roberts, has allowed for rapid implementation of user-desired features. The main example of user-desired feature implementation, as it applies to this project, is Bethe-Heitler muon production. Regarding the neutron interaction issues, we continue to study the specifics of how GEANT4 implements nuclear interactions. 
The GEANT4 collaboration has been contacted regarding the minor discrepancies in the neutron

  7. Advancing reservoir operation description in physically based hydrological models

    NASA Astrophysics Data System (ADS)

    Anghileri, Daniela; Giudici, Federico; Castelletti, Andrea; Burlando, Paolo

    2016-04-01

    Recent decades have seen significant advances in our capacity to characterize and reproduce hydrological processes within physically based models. Yet, when the human component is considered (e.g. reservoirs, water distribution systems), the associated decisions are generally modeled with very simplistic rules, which might underperform in reproducing the actual operators' behaviour on a daily or sub-daily basis. For example, reservoir operations are usually described by a target-level rule curve, which represents the level that the reservoir should track during normal operating conditions. The associated release decision is determined by the current state of the reservoir relative to the rule curve. This modeling approach can reasonably reproduce the seasonal water volume shift due to reservoir operation. Still, it cannot capture more complex decision making processes in response, e.g., to the fluctuations of energy prices and demands, the temporal unavailability of power plants or varying amount of snow accumulated in the basin. In this work, we link a physically explicit hydrological model with detailed hydropower behavioural models describing the decision making process by the dam operator. In particular, we consider two categories of behavioural models: explicit or rule-based behavioural models, where reservoir operating rules are empirically inferred from observational data, and implicit or optimization based behavioural models, where, following a normative economic approach, the decision maker is represented as a rational agent maximising a utility function. We compare these two alternate modelling approaches on the real-world water system of Lake Como catchment in the Italian Alps. The water system is characterized by the presence of 18 artificial hydropower reservoirs generating almost 13% of the Italian hydropower production. Results show to what extent the hydrological regime in the catchment is affected by different behavioural models and reservoir

  8. Explore Physics Beyond the Standard Model with GLAST

    SciTech Connect

    Lionetto, A. M.

    2007-07-12

    We give an overview of the possibility of GLAST to explore theories beyond the Standard Model of particle physics. Among the wide taxonomy we will focus in particular on low scale supersymmetry and theories with extra space-time dimensions. These theories give a suitable dark matter candidate whose interactions and composition can be studied using a gamma ray probe. We show the possibility of GLAST to disentangle such exotic signals from a standard production background.

  9. Qweak, N → Δ, and physics beyond the standard model

    NASA Astrophysics Data System (ADS)

    Leacock, J.

    2014-01-01

    The data-taking phase of the Qweak experiment ended in May of 2012 at the Thomas Jefferson National Accelerator Facility. Qweak aims to measure the weak charge of the proton, Q_W^p, via parity-violating elastic electron-proton scattering. The expected value of Q_W^p is fortuitously suppressed, which leads to an increased sensitivity to physics beyond the Standard Model.

  10. Physics models in the toroidal transport code PROCTR

    SciTech Connect

    Howe, H.C.

    1990-08-01

    The physics models that are contained in the toroidal transport code PROCTR are described in detail. Time- and space-dependent models are included for the plasma hydrogenic-ion, helium, and impurity densities, the electron and ion temperatures, the toroidal rotation velocity, and the toroidal current profile. Time- and depth-dependent models for the trapped and mobile hydrogenic particle concentrations in the wall and a time-dependent point model for the number of particles in the limiter are also included. Time-dependent models for neutral particle transport, neutral beam deposition and thermalization, fusion heating, impurity radiation, pellet injection, and the radial electric potential are included and recalculated periodically as the time-dependent models evolve. The plasma solution is obtained either in simple flux coordinates, where the radial shift of each elliptical, toroidal flux surface is included to maintain an approximate pressure equilibrium, or in general three-dimensional torsatron coordinates represented by series of helical harmonics. The detailed coupling of the plasma, scrape-off layer, limiter, and wall models through the neutral transport model makes PROCTR especially suited for modeling of recycling and particle control in toroidal plasmas. The model may also be used in a steady-state profile analysis mode for studying energy and particle balances starting with measured plasma profiles.

  11. A Multi-Scale Modeling System with Unified Physics

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo

    2008-01-01

    Numerical cloud models, which are based on the non-hydrostatic equations of motion, have been extensively applied to cloud-scale and mesoscale processes during the past four decades. Because cloud-scale dynamics are treated explicitly, uncertainties stemming from convection, which has to be parameterized in (hydrostatic) large-scale models, are obviated, or at least mitigated, in cloud models. Global models will use the non-hydrostatic framework when their horizontal resolution becomes about 10 km, the theoretical limit for the hydrostatic approximation. This juncture will be reached one to two decades from now. In recent years, exponentially increasing computer power has extended cloud-resolving-model integrations from hours to months and the number of computational grid points from less than a thousand to close to ten million. Three-dimensional models are now more prevalent. Much attention is devoted to precipitating cloud systems where the crucial 1-km scales are resolved in horizontal domains as large as 10,000 km in two dimensions, and 1,000 x 1,000 km2 in three dimensions. Cloud-resolving models now provide statistical information useful for developing more realistic physically based parameterizations for climate models and numerical weather prediction models. It is also expected that NWP and mesoscale models can be run at grid sizes similar to cloud-resolving models through nesting techniques.

  12. On Wiener filtering and the physics behind statistical modeling.

    PubMed

    Marbach, Ralf

    2002-01-01

    The closed-form solution of the so-called statistical multivariate calibration model is given in terms of the pure component spectral signal, the spectral noise, and the signal and noise of the reference method. The "statistical" calibration model is shown to be as much grounded on the physics of the pure component spectra as any of the "physical" models. There are no fundamental differences between the two approaches since both are merely different attempts to realize the same basic idea, viz., the spectrometric Wiener filter. The concept of the application-specific signal-to-noise ratio (SNR) is introduced, which is a combination of the two SNRs from the reference and the spectral data. Both are defined and the central importance of the latter for the assessment and development of spectroscopic instruments and methods is explained. Other statistics like the correlation coefficient, prediction error, slope deficiency, etc., are functions of the SNR. Spurious correlations and other practically important issues are discussed in quantitative terms. Most important, it is shown how to use a priori information about the pure component spectra and the spectral noise in an optimal way, thereby making the distinction between statistical and physical calibrations obsolete and combining the best of both worlds. Companies and research groups can use this article to realize significant savings in cost and time for development efforts.

  13. A simple physical model for deep moonquake occurrence times

    USGS Publications Warehouse

    Weber, R.C.; Bills, B.G.; Johnson, C.L.

    2010-01-01

    The physical process that results in moonquakes is not yet fully understood. The periodic occurrence times of events from individual clusters are clearly related to tidal stress, but also exhibit departures from the temporal regularity this relationship would seem to imply. Even simplified models that capture some of the relevant physics require a large number of variables. However, a single, easily accessible variable - the time interval I(n) between events - can be used to reveal behavior not readily observed using typical periodicity analyses (e.g., Fourier analyses). The delay-coordinate (DC) map, a particularly revealing way to display data from a time series, is a map of successive intervals: I(n+1) plotted vs. I(n). We use a DC approach to characterize the dynamics of moonquake occurrence. Moonquake-like DC maps can be reproduced by combining sequences of synthetic events that occur with variable probability at tidal periods. Though this model gives a good description of what happens, it has little physical content, thus providing only little insight into why moonquakes occur. We investigate a more mechanistic model. In this study, we present a series of simple models of deep moonquake occurrence, with consideration of both tidal stress and stress drop during events. We first examine the behavior of inter-event times in a delay-coordinate context, and then examine the output, in that context, of a sequence of simple models of tidal forcing and stress relief. We find, as might be expected, that the stress relieved by moonquakes influences their occurrence times. Our models may also provide an explanation for the opposite-polarity events observed at some clusters. © 2010.
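The delay-coordinate construction itself is a one-liner over the event catalog; this sketch only assumes a sorted list of occurrence times:

```python
def delay_coordinate_pairs(event_times):
    """Return (I(n), I(n+1)) pairs of successive inter-event intervals,
    the points of the delay-coordinate (DC) map described above."""
    intervals = [t2 - t1 for t1, t2 in zip(event_times, event_times[1:])]
    return list(zip(intervals, intervals[1:]))
```

Plotting these pairs as a scatter (I(n) on the x-axis, I(n+1) on the y-axis) gives the DC map used to compare observed moonquake catalogs with synthetic event sequences.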

  14. Boltzmann-Arrhenius-Zhurkov (BAZ) Model in Physics-of-Failure Problems

    NASA Astrophysics Data System (ADS)

    Suhir, E.; Kang, S.-M.

    2013-05-01

    The Boltzmann-Arrhenius-Zhurkov (BAZ) model enables one to obtain a simple, easy-to-use and physically meaningful formula for the evaluation of the probability of failure (PoF) of a material after a given time in operation at a given temperature and under a given stress (not necessarily mechanical). It is shown that material degradation (aging, damage accumulation, flaw propagation, etc.) can be viewed, when the BAZ model is considered, as a Markovian process, and that the BAZ model can be obtained as the steady-state solution to the Fokker-Planck equation in the theory of Markovian processes. It is also shown that the BAZ model addresses the worst-case and reasonably conservative situation, in which the highest PoF is expected. It is suggested, therefore, that the transient period preceding the condition addressed by the steady-state BAZ model need not be accounted for in engineering evaluations. However, when there is an interest in understanding the physics of the transient degradation process, the obtained solution to the Fokker-Planck equation can be used for this purpose.
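A minimal sketch of the lifetime formula underlying the BAZ approach (the Zhurkov equation, paired here with a generic exponential steady-state PoF) might look as follows; the parameter values in the test case are hypothetical, not from the paper:

```python
import math

K_B = 8.617e-5  # Boltzmann constant [eV/K]

def zhurkov_lifetime(tau0, u0, gamma, stress, temp_k):
    """Zhurkov mean time to failure: tau = tau0 * exp((U0 - gamma*s) / (kB*T)).
    u0 is the activation energy [eV]; gamma*stress [eV] lowers the barrier;
    tau0 is an attempt-time prefactor [s]; temp_k is temperature [K]."""
    return tau0 * math.exp((u0 - gamma * stress) / (K_B * temp_k))

def prob_failure(t, tau):
    """Exponential (steady-state) probability of failure by time t."""
    return 1.0 - math.exp(-t / tau)
```

Raising the temperature or the applied stress shrinks tau exponentially, which is the sensitivity the abstract exploits to evaluate PoF at a given time, temperature and stress.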

  15. Physical and mathematical modeling of antimicrobial photodynamic therapy

    NASA Astrophysics Data System (ADS)

    Bürgermeister, Lisa; López, Fernando Romero; Schulz, Wolfgang

    2014-07-01

    Antimicrobial photodynamic therapy (aPDT) is a promising method to treat local bacterial infections. The therapy is painless and does not cause bacterial resistances. However, there are gaps in understanding the dynamics of the processes, especially in periodontal treatment. This work describes the advances in fundamental physical and mathematical modeling of aPDT used for interpretation of experimental evidence. The result is a two-dimensional model of aPDT in a dental pocket phantom model. In this model, the propagation of laser light and the kinetics of the chemical reactions are described as coupled processes. The laser light induces the chemical processes depending on its intensity. As a consequence of the chemical processes, the local optical properties and distribution of laser light change as well as the reaction rates. The mathematical description of these coupled processes will help to develop treatment protocols and is the first step toward an inline feedback system for aPDT users.

  16. Reduced-Order Modeling: New Approaches for Computational Physics

    NASA Technical Reports Server (NTRS)

    Beran, Philip S.; Silva, Walter A.

    2001-01-01

    In this paper, we review the development of new reduced-order modeling techniques and discuss their applicability to various problems in computational physics. Emphasis is given to methods based on Volterra series representations and the proper orthogonal decomposition. Results are reported for different nonlinear systems to provide clear examples of the construction and use of reduced-order models, particularly in the multi-disciplinary field of computational aeroelasticity. Unsteady aerodynamic and aeroelastic behaviors of two-dimensional and three-dimensional geometries are described. Large increases in computational efficiency are obtained through the use of reduced-order models, thereby justifying the initial computational expense of constructing these models and motivating their use for multi-disciplinary design analysis.
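Of the two techniques named above, the proper orthogonal decomposition is the easier to sketch: the POD basis consists of the leading left singular vectors of a snapshot matrix. This is a generic illustration, not the authors' implementation:

```python
import numpy as np

def pod_basis(snapshots, r):
    """Proper orthogonal decomposition: leading r modes of a snapshot matrix.
    snapshots: (n_dof, n_snapshots) array of solution states, one per column."""
    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    return U[:, :r], s  # orthonormal modes and singular values (energy content)

def reduce_and_reconstruct(snapshots, r):
    """Project snapshots onto the r-mode basis and lift back to full space."""
    Phi, _ = pod_basis(snapshots, r)
    coeffs = Phi.T @ snapshots   # reduced (generalized) coordinates
    return Phi @ coeffs          # rank-r reconstruction
```

The computational saving comes from evolving only the r reduced coordinates in time instead of the full n_dof state, after the one-time expense of collecting snapshots and building the basis.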

  17. Modelling the physics in iterative reconstruction for transmission computed tomography

    PubMed Central

    Nuyts, Johan; De Man, Bruno; Fessler, Jeffrey A.; Zbijewski, Wojciech; Beekman, Freek J.

    2013-01-01

    There is an increasing interest in iterative reconstruction (IR) as a key tool to improve quality and increase the applicability of X-ray CT imaging. IR can significantly reduce patient dose, provides the flexibility to reconstruct images from arbitrary X-ray system geometries, and allows detailed models of photon transport and detection physics to be included, accurately correcting for a wide variety of image-degrading effects. This paper reviews discretisation issues and the modelling of finite spatial resolution, Compton scatter in the scanned object, data noise and the energy spectrum. Widespread implementation of IR with highly accurate model-based correction, however, still requires significant effort. In addition, new hardware will provide new opportunities and challenges to improve CT with new modelling. PMID:23739261
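The core IR idea the review covers can be illustrated with a toy linear model: the scanner is written as y = A x, where the system matrix A encodes geometry and physics, and the image x is refined by repeatedly comparing forward-projected estimates with the measured data. The sketch below is a plain SIRT/Landweber-style update on a random stand-in for A, not any specific algorithm from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
n_pix, n_rays = 16, 64
A = rng.random((n_rays, n_pix))          # stand-in for a real projector model
x_true = rng.random(n_pix)
y = A @ x_true                           # noiseless measurements for the sketch

x = np.zeros(n_pix)
step = 1.0 / np.linalg.norm(A, 2) ** 2   # step size that guarantees convergence
for _ in range(5000):
    residual = y - A @ x                 # mismatch between data and forward model
    x = x + step * (A.T @ residual)      # gradient step on the data-fit term

err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
```

The physics modelling discussed in the review (finite resolution, scatter, noise, spectrum) enters through a more realistic A and a statistically weighted data-fit term; the iterative structure stays the same.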

  18. A physically-based abrasive wear model for composite materials

    SciTech Connect

    Lee, Gun Y.; Dharan, C.K.H.; Ritchie, Robert O.

    2001-05-01

    A simple physically-based model for the abrasive wear of composite materials is presented based on the mechanics and mechanisms associated with sliding wear in soft (ductile) matrix composites containing hard (brittle) reinforcement particles. The model is based on the assumption that any portion of the reinforcement that is removed as wear debris cannot contribute to the wear resistance of the matrix material. The size of this non-contributing portion of the reinforcement is estimated by modeling the three primary wear mechanisms, specifically plowing, interfacial cracking, and particle removal. Critical variables describing the role of the reinforcement, such as its relative size and the nature of the matrix/reinforcement interface, are characterized by a single contribution coefficient, C. Predictions are compared with the results of experimental two-body (pin-on-drum) abrasive wear tests performed on a model aluminum particulate-reinforced epoxy matrix composite material.
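One common way to express this kind of model is an inverse rule of mixtures in which the composite's wear resistance (inverse wear rate) sums matrix and reinforcement contributions, with the reinforcement term scaled by the contribution coefficient C (C < 1 because debris-forming portions of the particles cannot resist wear). The functional form below is an illustrative reading of the abstract, not the paper's exact equations, and all numbers are hypothetical.

```python
def composite_wear_rate(w_matrix, w_reinf, v_reinf, C):
    """Composite wear rate from an inverse rule of mixtures (sketch).

    w_matrix, w_reinf : wear rates of matrix and reinforcement alone
    v_reinf           : reinforcement volume fraction (0..1)
    C                 : contribution coefficient (1 = fully effective particles)
    """
    v_matrix = 1.0 - v_reinf
    resistance = v_matrix / w_matrix + C * v_reinf / w_reinf
    return 1.0 / resistance

# Example: a fast-wearing epoxy matrix with 30% hard particles.
soft, hard = 1.0, 0.05                  # hypothetical wear rates (a.u.)
ideal = composite_wear_rate(soft, hard, 0.3, C=1.0)
poor_bond = composite_wear_rate(soft, hard, 0.3, C=0.2)
```

A weak matrix/reinforcement interface (small C) pushes the composite back toward matrix-like wear, which is the qualitative trend the single-coefficient formulation is designed to capture.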

  19. Possibilities: A framework for modeling students' deductive reasoning in physics

    NASA Astrophysics Data System (ADS)

    Gaffney, Jonathan David Housley

    Students often make errors when trying to solve qualitative or conceptual physics problems, and while many successful instructional interventions have been generated to prevent such errors, the process of deduction that students use when solving physics problems has not been thoroughly studied. In an effort to better understand that reasoning process, I have developed a new framework, which is based on the mental models framework in psychology championed by P. N. Johnson-Laird. My new framework models how students search possibility space when thinking about conceptual physics problems and suggests that errors arise from failing to flesh out all possibilities. It further suggests that instructional interventions should focus on making apparent those possibilities, as well as all physical consequences those possibilities would incur. The possibilities framework emerged from the analysis of data from a unique research project specifically designed for the purpose of understanding how students use deductive reasoning. In the selection task, participants were given a physics problem along with three written possible solutions, with the goal of identifying which one of the three possible solutions was correct. Each participant was also asked to identify the errors in the incorrect solutions. For the study presented in this dissertation, participants not only performed the selection task individually on four problems, but they were also placed into groups of two or three and asked to discuss with each other the reasoning they used in making their choices and attempt to reach a consensus about which solution was correct. Finally, those groups were asked to work together to perform the selection task on three new problems. The possibilities framework appropriately models the reasoning that students use, and it makes useful predictions about potentially helpful instructional interventions. The study reported in this dissertation emphasizes the useful insight the possibilities framework provides into students' deductive reasoning.

  20. A Physical Model of the Metric Expansion of Space

    NASA Astrophysics Data System (ADS)

    Laubenstein, John

    2010-02-01

    At the heart of IWPD's Scale Metrics (ISM) theory is the realization that any orthogonal relationship may be equivalently expressed as a linear relationship multiplied by a mathematical scalar. This has significance in the relationship of a worldline to its 4-Velocity and observed 3-Velocity, as well as in understanding the divergence between energy and momentum as invariant mass increases. Spacetime may be depicted by taking the time dimension within four-dimensional spacetime and rotating it until it becomes embedded as a line segment (or ring) within the three spatial dimensions. This allows velocity and momentum to be determined based upon a linear subtraction of physical entities multiplied by a mathematical scalar (X). We will provide evidence supporting the mathematical and physical significance of this scaling factor along with the benefits of ISM theory. This model provides a physical explanation of the metric expansion of space and defines the initial singularity present at the earliest moment of the universe. ISM theory addresses many of the current challenges in physics and makes predictions that are testable with technologies currently in place.