Science.gov

Sample records for geant4 software design

  1. First statistical analysis of Geant4 quality software metrics

    NASA Astrophysics Data System (ADS)

    Ronchieri, Elisabetta; Grazia Pia, Maria; Giacomini, Francesco

    2015-12-01

    Geant4 is a simulation system of particle transport through matter, widely used in several experimental areas from high energy physics and nuclear experiments to medical studies. Some of its applications may involve critical use cases; therefore they would benefit from an objective assessment of the software quality of Geant4. In this paper, we provide a first statistical evaluation of software metrics data related to a set of Geant4 physics packages. The analysis aims at identifying risks for Geant4 maintainability, which would benefit from being addressed at an early stage. The findings of this pilot study set the grounds for further extensions of the analysis to the whole of Geant4 and to other high energy physics software systems.
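A statistical evaluation of software metrics such as the one described above typically starts from simple descriptive statistics per package. The following sketch is purely illustrative (the metric, threshold, and values are invented, not taken from the paper):

```python
# Hypothetical illustration: summarising a software metric (here, cyclomatic
# complexity per function) with descriptive statistics, as a statistical
# analysis of code quality might begin. All values are invented.
from statistics import mean, median, pstdev

def summarise(metric_values):
    """Return basic descriptive statistics for a list of metric values."""
    return {
        "n": len(metric_values),
        "mean": mean(metric_values),
        "median": median(metric_values),
        "stdev": pstdev(metric_values),
        "max": max(metric_values),
    }

# Invented complexity values for the functions of one package:
complexities = [1, 2, 2, 3, 5, 8, 13, 4, 2, 1]
stats = summarise(complexities)
# Functions above a risk threshold (e.g. complexity > 10) flag maintainability risk:
risky = [c for c in complexities if c > 10]
```

In practice such per-package summaries would be compared across packages to locate where maintainability risk concentrates.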

  2. A Software Toolkit to Study Systematic Uncertainties of the Physics Models of the Geant4 Simulation Package

    SciTech Connect

    Genser, Krzysztof; Hatcher, Robert; Perdue, Gabriel; Wenzel, Hans; Yarba, Julia; Kelsey, Michael; Wright, Dennis H.

    2016-11-10

    The Geant4 toolkit is used to model interactions between particles and matter. Geant4 employs a set of validated physics models that span a wide range of interaction energies. These models are tuned to cover a large variety of possible applications. This raises the critical question of what uncertainties are associated with the Geant4 physics model, or group of models, involved in a simulation project. To address the challenge, we have designed and implemented a comprehensive, modular, user-friendly software toolkit that allows the variation of one or more parameters of one or more Geant4 physics models involved in simulation studies. It also enables analysis of multiple variants of the resulting physics observables of interest in order to estimate the uncertainties associated with the simulation model choices. Key functionalities of the toolkit are presented in this paper and are illustrated with selected results.
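The core idea of such a parameter-variation study can be sketched in a few lines. This is not the toolkit's actual API; `toy_observable` is a stand-in for a full Geant4 simulation, and all numbers are illustrative:

```python
# A minimal sketch of model-parameter uncertainty estimation: vary a physics-model
# parameter over several values, recompute an observable for each variant, and
# take the spread across variants as an uncertainty estimate.

def toy_observable(scale):
    """Stand-in for a simulated observable that depends on a model parameter."""
    return 100.0 * scale  # e.g. mean energy deposit for parameter value `scale`

nominal = 1.0
variants = [0.9, 1.0, 1.1]                      # parameter values to scan
results = {s: toy_observable(s) for s in variants}
spread = max(results.values()) - min(results.values())
rel_uncertainty = spread / results[nominal] / 2  # half-spread, relative to nominal
```

A real study would replace `toy_observable` with repeated simulation runs and compare full distributions rather than a single number.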

  3. SOLPEX x-ray polarimeter detector luminescence background calculated using Geant4 simulation software

    NASA Astrophysics Data System (ADS)

    Gorgolewski, Aleksander; Barylak, Jaromir; Steślicki, Marek; Szaforz, Żaneta; Bąkała, Jarosław

    2016-09-01

    The Soft X-ray Solar polarimeter-spectrometer (SOLPEX) experiment is planned to be placed in Roscosmos' Multipurpose Laboratory Module "NAUKA" on the International Space Station (ISS) in 2019. The experiment is designed to detect the polarization and X-ray spectra of solar flares. Due to the very high (few percent) linear polarization detection limit, accurate background estimation and modeling are crucial. In calculating the background, the photoelectric effect, Compton scattering and bremsstrahlung were taken into account. The luminescence background from particles produced in solar flares was simulated using Geant4. Additionally, theoretical spectra were modeled in order to simulate the full SOLPEX detector response for M5 and X1 class solar flares.

  4. The GEANT4 Visualisation System

    SciTech Connect

    Allison, J.; Asai, M.; Barrand, G.; Donszelmann, M.; Minamimoto, K.; Tanaka, S.; Tcherniaev, E.; Tinslay, J.; /SLAC

    2007-11-02

    The Geant4 Visualization System is a multi-driver graphics system designed to serve the Geant4 Simulation Toolkit. It is aimed at the visualization of Geant4 data, primarily detector descriptions and simulated particle trajectories and hits. It can handle a variety of graphical technologies simultaneously and interchangeably, allowing the user to choose the visual representation most appropriate to requirements. It conforms to the low-level Geant4 abstract graphical user interfaces and introduces new abstract classes from which the various drivers are derived and that can be straightforwardly extended, for example, by the addition of a new driver. It makes use of an extendable class library of models and filters for data representation and selection. The Geant4 Visualization System supports a rich set of interactive commands based on the Geant4 command system. It is included in the Geant4 code distribution and maintained and documented like other components of Geant4.
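The multi-driver design described above rests on a classic abstract-interface pattern: every graphics backend derives from a common base class, so adding a driver means deriving one class. The sketch below illustrates the pattern in Python (Geant4 itself is C++, and these class names are invented, not Geant4's):

```python
# Schematic sketch of the multi-driver visualization pattern: all drivers
# implement the same drawing interface, so backends are interchangeable.
from abc import ABC, abstractmethod

class VisDriver(ABC):
    """Abstract base: each graphics technology supplies its own implementation."""
    @abstractmethod
    def draw_trajectory(self, points):
        ...

class AsciiDriver(VisDriver):
    def draw_trajectory(self, points):
        return " -> ".join(f"({x},{y})" for x, y in points)

class CountingDriver(VisDriver):
    def draw_trajectory(self, points):
        return f"{len(points)} points"

# A visualization manager can drive any registered backend interchangeably:
drivers = [AsciiDriver(), CountingDriver()]
outputs = [d.draw_trajectory([(0, 0), (1, 2)]) for d in drivers]
```

The same trajectory data is rendered by every backend without the caller knowing which driver is active, which is what lets the user switch representations freely.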

  5. Geant4 Applications in Space

    SciTech Connect

    Asai, M.; /SLAC

    2007-11-07

    Use of Geant4 is rapidly expanding in the space application domain. I overview three major application areas of Geant4 in space: apparatus simulation for pre-launch design and post-launch analysis, planetary-scale simulation for radiation spectra and surface and sub-surface exploration, and micro-dosimetry simulation for single-event studies and radiation hardening of semiconductor devices. Recently, not only mission-dependent applications but also various multi-purpose or common tools built on top of Geant4 have become widely available; I overview some of these tools as well. The Geant4 Collaboration identifies space applications as one of the major driving forces for further development and refinement of the Geant4 toolkit. Highlights of such developments are introduced.

  6. A Virtual Geant4 Environment

    NASA Astrophysics Data System (ADS)

    Iwai, Go

    2015-12-01

    We describe the development of an environment for Geant4 consisting of an application and data that provide users with a more efficient way to access Geant4 applications without having to download and build the software locally. The environment is platform neutral and offers users near-real-time performance. In addition, the environment consists of data and Geant4 libraries built using low-level virtual machine (LLVM) tools, which can produce bitcode that can be embedded in HTML and accessed via a browser. The bitcode is downloaded to the local machine via the browser and can then be configured by the user. This approach provides a way of minimising the risk of leaking potentially sensitive data used to construct the Geant4 model and application in the medical domain for treatment planning. We describe several applications that have used this approach and compare their performance with that of native applications. We also describe potential user communities that could benefit from this approach.

  7. Design of Cherenkov bars for the optical part of the time-of-flight detector in Geant4.

    PubMed

    Nozka, L; Brandt, A; Rijssenbeek, M; Sykora, T; Hoffman, T; Griffiths, J; Steffens, J; Hamal, P; Chytka, L; Hrabovsky, M

    2014-11-17

    We present the results of studies devoted to the development and optimization of the optical part of a high precision time-of-flight (TOF) detector for the Large Hadron Collider (LHC). This work was motivated by a proposal to use such a detector in conjunction with a silicon detector to tag and measure protons from interactions of the type p + p → p + X + p, where the two outgoing protons are scattered in the very forward directions. The fast timing detector uses fused silica (quartz) bars that emit Cherenkov radiation as a relativistic particle passes through, and the emitted Cherenkov photons are detected by, for instance, a micro-channel plate multi-anode Photomultiplier Tube (MCP-PMT). Several possible designs are implemented in Geant4 and studied for timing optimization as a function of the photon arrival time and the number of Cherenkov photons reaching the photo-sensor.
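The physics behind the photon yield in such bars is the Cherenkov emission condition: light is emitted at an angle given by cos θ_c = 1/(nβ), so photons exist only for β > 1/n. A quick numerical check, using a typical refractive index for fused silica (a generic figure, not a value from the paper):

```python
# Illustrative check of the Cherenkov emission condition in a fused-silica bar.
import math

def cherenkov_angle_deg(n, beta):
    """Cherenkov angle in degrees, or None below the emission threshold."""
    if beta * n <= 1.0:
        return None                    # below threshold: no Cherenkov light
    return math.degrees(math.acos(1.0 / (n * beta)))

n_quartz = 1.46                        # typical for fused silica, visible light
angle = cherenkov_angle_deg(n_quartz, beta=1.0)   # ultra-relativistic particle
below = cherenkov_angle_deg(n_quartz, beta=0.5)   # below threshold
```

For an ultra-relativistic proton the emission angle in quartz comes out near 47 degrees, which fixes the geometry of photon propagation down the bar toward the photo-sensor.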

  8. Recent developments in Geant4

    SciTech Connect

    Allison, J.; Amako, K.; Apostolakis, J.; Arce, P.; Asai, M.; Aso, T.; Bagli, E.; Bagulya, A.; Banerjee, S.; Barrand, G.; Beck, B. R.; Bogdanov, A. G.; Brandt, D.; Brown, J. M. C.; Burkhardt, H.; Canal, Ph.; Cano-Ott, D.; Chauvie, S.; Cho, K.; Cirrone, G. A. P.; Cooperman, G.; Cortés-Giraldo, M. A.; Cosmo, G.; Cuttone, G.; Depaola, G.; Desorgher, L.; Dong, X.; Dotti, A.; Elvira, V. D.; Folger, G.; Francis, Z.; Galoyan, A.; Garnier, L.; Gayer, M.; Genser, K. L.; Grichine, V. M.; Guatelli, S.; Guèye, P.; Gumplinger, P.; Howard, A. S.; Hřivnáčová, I.; Hwang, S.; Incerti, S.; Ivanchenko, A.; Ivanchenko, V. N.; Jones, F. W.; Jun, S. Y.; Kaitaniemi, P.; Karakatsanis, N.; Karamitrosi, M.; Kelsey, M.; Kimura, A.; Koi, T.; Kurashige, H.; Lechner, A.; Lee, S. B.; Longo, F.; Maire, M.; Mancusi, D.; Mantero, A.; Mendoza, E.; Morgan, B.; Murakami, K.; Nikitina, T.; Pandola, L.; Paprocki, P.; Perl, J.; Petrović, I.; Pia, M. G.; Pokorski, W.; Quesada, J. M.; Raine, M.; Reis, M. A.; Ribon, A.; Ristić Fira, A.; Romano, F.; Russo, G.; Santin, G.; Sasaki, T.; Sawkey, D.; Shin, J. I.; Strakovsky, I. I.; Taborda, A.; Tanaka, S.; Tomé, B.; Toshito, T.; Tran, H. N.; Truscott, P. R.; Urban, L.; Uzhinsky, V.; Verbeke, J. M.; Verderi, M.; Wendt, B. L.; Wenzel, H.; Wright, D. H.; Wright, D. M.; Yamashita, T.; Yarba, J.; Yoshida, H.

    2016-07-01

    Geant4 is a software toolkit for the simulation of the passage of particles through matter. It is used by a large number of experiments and projects in a variety of application domains, including high energy physics, astrophysics and space science, medical physics and radiation protection. Over the past several years, major changes have been made to the toolkit in order to accommodate the needs of these user communities, and to efficiently exploit the growth of computing power made available by advances in technology. The adaptation of Geant4 to multithreading, advances in physics, detector modeling and visualization, extensions to the toolkit, including biasing and reverse Monte Carlo, and tools for physics and release validation are discussed here.

  9. Recent developments in GEANT4

    NASA Astrophysics Data System (ADS)

    Allison, J.; Amako, K.; Apostolakis, J.; Arce, P.; Asai, M.; Aso, T.; Bagli, E.; Bagulya, A.; Banerjee, S.; Barrand, G.; Beck, B. R.; Bogdanov, A. G.; Brandt, D.; Brown, J. M. C.; Burkhardt, H.; Canal, Ph.; Cano-Ott, D.; Chauvie, S.; Cho, K.; Cirrone, G. A. P.; Cooperman, G.; Cortés-Giraldo, M. A.; Cosmo, G.; Cuttone, G.; Depaola, G.; Desorgher, L.; Dong, X.; Dotti, A.; Elvira, V. D.; Folger, G.; Francis, Z.; Galoyan, A.; Garnier, L.; Gayer, M.; Genser, K. L.; Grichine, V. M.; Guatelli, S.; Guèye, P.; Gumplinger, P.; Howard, A. S.; Hřivnáčová, I.; Hwang, S.; Incerti, S.; Ivanchenko, A.; Ivanchenko, V. N.; Jones, F. W.; Jun, S. Y.; Kaitaniemi, P.; Karakatsanis, N.; Karamitrosi, M.; Kelsey, M.; Kimura, A.; Koi, T.; Kurashige, H.; Lechner, A.; Lee, S. B.; Longo, F.; Maire, M.; Mancusi, D.; Mantero, A.; Mendoza, E.; Morgan, B.; Murakami, K.; Nikitina, T.; Pandola, L.; Paprocki, P.; Perl, J.; Petrović, I.; Pia, M. G.; Pokorski, W.; Quesada, J. M.; Raine, M.; Reis, M. A.; Ribon, A.; Ristić Fira, A.; Romano, F.; Russo, G.; Santin, G.; Sasaki, T.; Sawkey, D.; Shin, J. I.; Strakovsky, I. I.; Taborda, A.; Tanaka, S.; Tomé, B.; Toshito, T.; Tran, H. N.; Truscott, P. R.; Urban, L.; Uzhinsky, V.; Verbeke, J. M.; Verderi, M.; Wendt, B. L.; Wenzel, H.; Wright, D. H.; Wright, D. M.; Yamashita, T.; Yarba, J.; Yoshida, H.

    2016-11-01

    GEANT4 is a software toolkit for the simulation of the passage of particles through matter. It is used by a large number of experiments and projects in a variety of application domains, including high energy physics, astrophysics and space science, medical physics and radiation protection. Over the past several years, major changes have been made to the toolkit in order to accommodate the needs of these user communities, and to efficiently exploit the growth of computing power made available by advances in technology. The adaptation of GEANT4 to multithreading, advances in physics, detector modeling and visualization, extensions to the toolkit, including biasing and reverse Monte Carlo, and tools for physics and release validation are discussed here.

  10. Recent developments in Geant4

    DOE PAGES

    Allison, J.; Amako, K.; Apostolakis, J.; ...

    2016-07-01

    Geant4 is a software toolkit for the simulation of the passage of particles through matter. It is used by a large number of experiments and projects in a variety of application domains, including high energy physics, astrophysics and space science, medical physics and radiation protection. Over the past several years, major changes have been made to the toolkit in order to accommodate the needs of these user communities, and to efficiently exploit the growth of computing power made available by advances in technology. The adaptation of Geant4 to multithreading, advances in physics, detector modeling and visualization, extensions to the toolkit, including biasing and reverse Monte Carlo, and tools for physics and release validation are discussed here.

  11. A CAD interface for GEANT4.

    PubMed

    Poole, C M; Cornelius, I; Trapp, J V; Langton, C M

    2012-09-01

    Often CAD models already exist for parts of a geometry being simulated using GEANT4. Direct import of these CAD models into GEANT4, however, may not be possible, and complex components may be difficult to define via other means. Solutions that allow users to work around the limited support in the GEANT4 toolkit for loading predefined CAD geometries have been presented by others; however, these solutions require intermediate file-format conversion using commercial software. Herein we describe a technique that allows CAD models to be loaded directly as geometry without the need for commercial software or intermediate file-format conversion. The robustness of the interface was tested using a set of CAD models of varying complexity; for the models used in testing, no import errors were reported and all geometry was found to be navigable by GEANT4.

  12. Validation of Hadronic Models in GEANT4

    SciTech Connect

    Koi, Tatsumi; Wright, Dennis H.; Folger, Gunter; Ivanchenko, Vladimir; Kossov, Mikhail; Starkov, Nikolai; Heikkinen, Aatos; Truscott, Peter; Lei, Fan; Wellisch, Hans-Peter

    2007-09-26

    Geant4 is a software toolkit for the simulation of the passage of particles through matter. It has abundant hadronic models, from thermal neutron interactions to ultra-relativistic hadrons. An overview of validations of Geant4 hadronic physics is presented, based on thin-target measurements. In most cases, good agreement is observed between Monte Carlo predictions and experimental data; however, several problems have been detected which require some improvement in the models.

  13. Designing a new type of neutron detector for neutron and gamma-ray discrimination via GEANT4.

    PubMed

    Shan, Qing; Chu, Shengnan; Ling, Yongsheng; Cai, Pingkun; Jia, Wenbao

    2016-04-01

    The design of a new type of neutron detector, consisting of a fast-neutron converter, a plastic scintillator, and a Cherenkov detector, to discriminate 14-MeV fast neutrons from gamma rays in a pulsed n-γ mixed field and to monitor their fluxes is reported in this study. Both neutrons and gamma rays can produce fluorescence in the scintillator when they are incident on the detector. However, only the secondary charged particles of the gamma rays can produce Cherenkov light in the Cherenkov detector. The neutron and gamma-ray fluxes can therefore be calculated by measuring the fluorescence and Cherenkov light. The GEANT4 Monte Carlo simulation toolkit is used to simulate the whole process occurring in the detector and to obtain its optimum parameters. Analysis of the simulation results leads to a calculation method for the neutron flux. This method is verified by calculating the neutron fluxes for pulsed n-γ mixed fields with different n/γ ratios, and the results show that the relative errors of all calculations are <5%.
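The discrimination logic above amounts to two linear equations in two unknown fluxes: both species contribute to the scintillation signal, but only gammas contribute Cherenkov light. A hedged back-of-envelope sketch (all response coefficients and signal values below are invented, not from the paper):

```python
# Unfolding neutron and gamma fluxes from two detector signals:
#   s_scint = a_n*phi_n + a_g*phi_g   (scintillator sees both)
#   s_cher  = c_g*phi_g               (Cherenkov sees only gammas)

def unfold_fluxes(s_scint, s_cher, a_n, a_g, c_g):
    """Solve the two-equation system for (phi_n, phi_g)."""
    phi_g = s_cher / c_g
    phi_n = (s_scint - a_g * phi_g) / a_n
    return phi_n, phi_g

# Invented response coefficients and measured signals:
phi_n, phi_g = unfold_fluxes(s_scint=260.0, s_cher=30.0,
                             a_n=2.0, a_g=1.0, c_g=0.5)
```

In the actual work the response coefficients would come from the GEANT4 simulation of the detector, which is what makes the flux calculation possible.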

  14. Geant4 - Towards major release 10

    NASA Astrophysics Data System (ADS)

    Cosmo, G.; Geant4 Collaboration

    2014-06-01

    The Geant4 simulation toolkit reached maturity in the middle of the previous decade, providing a wide variety of established features coherently aggregated in a software product, which has become the standard for detector simulation in HEP and is used in a variety of other application domains. We review the most recent capabilities introduced in the kernel, highlighting those being prepared for the next major release (version 10.0), scheduled for the end of 2013. A significant new feature of this release will be the integration of multi-threaded processing, aimed at efficient use of modern many-core system architectures and at minimizing the memory footprint by exploiting event-level parallelism. We discuss its design features and its impact on the existing API and user interface of Geant4. Revisions are made to balance the need to preserve backwards compatibility against consolidating and improving the interfaces, taking into account requirements from the multithreaded extension and from the evolution of the data-processing models of the LHC experiments.

  15. GEANT4: Applications in High Energy Physics

    SciTech Connect

    Mahmood, Tariq; Zafar, Abrar Ahmed; Hussain, Talib; Rashid, Haris

    2007-02-14

    GEANT4 is a detector simulation toolkit aimed mainly at experimental high energy physics. In this paper we give an overview of this software, with special reference to its applications in high energy physics experiments. A brief account of the physics process methods is given, and the object-oriented nature of the simulation toolkit is highlighted.

  16. Geant4 Computing Performance Benchmarking and Monitoring

    DOE PAGES

    Dotti, Andrea; Elvira, V. Daniel; Folger, Gunter; ...

    2015-12-23

    Performance evaluation and analysis of large scale computing applications is essential for optimal use of resources. As detector simulation is one of the most compute intensive tasks and Geant4 is the simulation toolkit most widely used in contemporary high energy physics (HEP) experiments, it is important to monitor Geant4 through its development cycle for changes in computing performance and to identify problems and opportunities for code improvements. All Geant4 development and public releases are being profiled with a set of applications that utilize different input event samples, physics parameters, and detector configurations. Results from multiple benchmarking runs are compared to previous public and development reference releases to monitor CPU and memory usage. Observed changes are evaluated and correlated with code modifications. Besides the full summary of call stack and memory footprint, a detailed call graph analysis is available to Geant4 developers for further analysis. The set of software tools used in the performance evaluation procedure, both in sequential and multi-threaded modes, include FAST, IgProf and Open|Speedshop. In conclusion, the scalability of the CPU time and memory performance in multi-threaded application is evaluated by measuring event throughput and memory gain as a function of the number of threads for selected event samples.

  17. Geant4 Computing Performance Benchmarking and Monitoring

    SciTech Connect

    Dotti, Andrea; Elvira, V. Daniel; Folger, Gunter; Genser, Krzysztof; Jun, Soon Yung; Kowalkowski, James B.; Paterno, Marc

    2015-12-23

    Performance evaluation and analysis of large scale computing applications is essential for optimal use of resources. As detector simulation is one of the most compute intensive tasks and Geant4 is the simulation toolkit most widely used in contemporary high energy physics (HEP) experiments, it is important to monitor Geant4 through its development cycle for changes in computing performance and to identify problems and opportunities for code improvements. All Geant4 development and public releases are being profiled with a set of applications that utilize different input event samples, physics parameters, and detector configurations. Results from multiple benchmarking runs are compared to previous public and development reference releases to monitor CPU and memory usage. Observed changes are evaluated and correlated with code modifications. Besides the full summary of call stack and memory footprint, a detailed call graph analysis is available to Geant4 developers for further analysis. The set of software tools used in the performance evaluation procedure, both in sequential and multi-threaded modes, include FAST, IgProf and Open|Speedshop. In conclusion, the scalability of the CPU time and memory performance in multi-threaded application is evaluated by measuring event throughput and memory gain as a function of the number of threads for selected event samples.
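The two multi-threaded scaling figures named in the abstract, event throughput and memory gain versus thread count, reduce to simple ratios against the single-thread baseline. An illustrative calculation (the timings and memory numbers are invented):

```python
# Throughput speedup and memory gain as functions of thread count.

def speedup(events, seconds_by_threads):
    """Event throughput of each run relative to the single-thread run."""
    base = events / seconds_by_threads[1]
    return {t: (events / s) / base for t, s in seconds_by_threads.items()}

def memory_gain(rss_by_threads):
    """How much less memory N threads use than N independent processes."""
    per_process = rss_by_threads[1]
    return {t: (t * per_process) / rss for t, rss in rss_by_threads.items()}

timings = {1: 100.0, 2: 52.0, 4: 27.0}   # seconds for a fixed event sample
rss = {1: 1.0, 2: 1.2, 4: 1.6}           # resident memory in GB
s = speedup(1000, timings)
g = memory_gain(rss)
```

A memory gain well above 1 at high thread counts is the payoff of multithreading: threads share geometry and physics tables that separate processes would each duplicate.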

  18. Geant4-DNA: overview and recent developments

    NASA Astrophysics Data System (ADS)

    Štěpán, Václav

    software already available for download, as well as future perspectives, will be presented, on behalf of the Geant4-DNA Collaboration.

  19. GEANT4 distributed computing for compact clusters

    NASA Astrophysics Data System (ADS)

    Harrawood, Brian P.; Agasthya, Greeshma A.; Lakshmanan, Manu N.; Raterman, Gretchen; Kapadia, Anuj J.

    2014-11-01

    A new technique for the distribution of GEANT4 processes is introduced to simplify running a simulation in a parallel environment such as a tightly coupled computer cluster. Using a new C++ class derived from the GEANT4 toolkit, multiple runs forming a single simulation are managed across a local network of computers with a simple inter-node communication protocol. The class is integrated with the GEANT4 toolkit and is designed to scale from a single symmetric multiprocessing (SMP) machine to compact clusters ranging in size from tens to thousands of nodes. User-designed 'work tickets' are distributed to clients using a client-server work flow model to specify the parameters for each individual run of the simulation. The new g4DistributedRunManager class was developed and well tested in the course of our Neutron Stimulated Emission Computed Tomography (NSECT) experiments. It will be useful for anyone running GEANT4 for large discrete data sets, such as covering a range of angles in computed tomography, calculating dose delivery with multiple fractions, or simply increasing the throughput of a single model.
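The work-ticket model described above can be sketched in-process with a plain queue: the server holds tickets (per-run parameters, such as a tomography angle), and each client repeatedly takes one and runs it until the queue drains. This is an invented illustration of the flow, not the g4DistributedRunManager API:

```python
# Minimal in-process sketch of client-server work-ticket distribution.
from queue import Queue, Empty

def make_tickets(angles):
    """Server side: one ticket per simulation run."""
    q = Queue()
    for run_id, angle in enumerate(angles):
        q.put({"run": run_id, "angle_deg": angle})
    return q

def client(q, results):
    """Client side: drain tickets, simulating one run per ticket."""
    while True:
        try:
            ticket = q.get_nowait()
        except Empty:
            return
        results.append((ticket["run"], f"simulated at {ticket['angle_deg']} deg"))

tickets = make_tickets([0, 45, 90, 135])
results = []
client(tickets, results)   # one client here; many would pull in parallel
```

In the real system the queue sits behind a network protocol and many client nodes pull tickets concurrently, which is what gives near-linear scaling for large discrete data sets.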

  20. The Cryogenic AntiCoincidence Detector for the ATHENA X-IFU: Design Aspects by Geant4 Simulation and Preliminary Characterization of the New Single Pixel

    NASA Astrophysics Data System (ADS)

    Macculi, C.; Argan, A.; D'Andrea, M.; Lotti, S.; Piro, L.; Biasotti, M.; Corsini, D.; Gatti, F.; Orlando, A.; Torrioli, G.

    2016-08-01

    The ATHENA observatory is the second large-class ESA mission, in the context of Cosmic Vision 2015-2025, scheduled to be launched in 2028 into an L2 orbit. One of the two planned focal plane instruments is the X-ray Integral Field Unit (X-IFU), which will be able to perform simultaneous high-grade energy spectroscopy and imaging over a 5 arcmin FoV by means of a kilo-pixel array of transition-edge sensor (TES) microcalorimeters, coupled to high-quality X-ray optics. The X-IFU sensitivity is degraded by the particle background, induced by primary protons of both solar and cosmic-ray origin and by secondary electrons. A Cryogenic AntiCoincidence (CryoAC) TES-based detector, located <1 mm below the TES array, will allow the mission to reach the background level that enables its scientific goals. The CryoAC is a 4-pixel detector made of silicon absorbers sensed by iridium TESs. We currently achieve TRL = 3-4 at the single-pixel level, and have designed and developed two further prototypes in order to reach TRL = 4. The design of the CryoAC has also been optimized using the Geant4 simulation tool. Here we describe some results from the Geant4 simulations performed to optimize the design, and preliminary test results from the first of the two detectors, of 1 cm2 area, made of 65 Ir TESs.

  1. The Geant4 Bertini Cascade

    SciTech Connect

    Wright, D. H.; Kelsey, M. H.

    2015-12-01

    One of the medium energy hadron–nucleus interaction models in the Geant4 simulation toolkit is based partly on the Bertini intranuclear cascade model. Since its initial appearance in the toolkit, this model has been largely re-written in order to extend its physics capabilities and to reduce its memory footprint. Physics improvements include extensions in applicable energy range and incident particle types, and improved hadron–nucleon cross-sections and angular distributions. Interfaces have also been developed which allow the model to be coupled with other Geant4 models at lower and higher energies. The inevitable speed reductions due to enhanced physics have been mitigated by memory and CPU efficiency improvements. Details of these improvements, along with selected comparisons of the model to data, are discussed.

  2. Benchmarking Geant4 for spallation neutron source calculations

    NASA Astrophysics Data System (ADS)

    DiJulio, Douglas D.; Batkov, Konstantin; Stenander, John; Cherkashyna, Nataliia; Bentley, Phillip M.

    2016-09-01

    Geant4 is becoming increasingly used for radiation transport simulations of spallation neutron sources and related components. Historically, the code has seen little usage in this field, and it is of general interest to investigate the suitability of Geant4 for such applications. For this purpose, we carried out Geant4 calculations based on simple spallation source geometries and also with the European Spallation Source Technical Design Report target and moderator configuration. The results are compared to calculations performed with the Monte Carlo N-Particle eXtended code. The comparisons are carried out over the full spallation neutron source energy spectrum, from sub-eV energies up to thousands of MeV. Our preliminary results reveal that there is generally good agreement between the simulations using both codes. Additionally, we have implemented a general weight-window generator for Geant4-based applications and present some results of the method applied to the ESS target model.
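The weight-window technique mentioned at the end is a standard variance-reduction scheme: a particle whose statistical weight rises above the window is split into copies, and one that falls below it plays Russian roulette. A schematic sketch, with invented window bounds and weights (this is not Geant4's biasing API):

```python
# Weight-window variance reduction: splitting above the window, roulette below.
import random

def apply_weight_window(weight, w_low, w_high, rng=random.random):
    """Return the list of surviving particle weights."""
    if weight > w_high:                    # split into n copies, conserving weight
        n = int(weight // w_high) + 1
        return [weight / n] * n
    if weight < w_low:                     # Russian roulette, conserving weight
        survival = weight / w_low          # on average
        return [w_low] if rng() < survival else []
    return [weight]

split = apply_weight_window(5.0, w_low=0.5, w_high=2.0)   # heavy particle: split
kept = apply_weight_window(1.0, w_low=0.5, w_high=2.0)    # inside window: unchanged
```

Both branches preserve the expected total weight, so the estimate stays unbiased while the population of tracked particles is steered toward the regions of interest.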

  3. Nuclear spectroscopy with Geant4. The superheavy challenge

    NASA Astrophysics Data System (ADS)

    Sarmiento, Luis G.

    2016-12-01

    The simulation toolkit Geant4 was originally developed at CERN for high-energy physics. Over the years it has established itself as a Swiss army knife, not only in particle physics: it has seen accelerated expansion towards nuclear physics and, more recently, medical imaging and γ- and ion-therapy, to mention but a handful of new applications. Geant4 has been validated extensively across many particles, ions, materials, and physical processes, with typically several different models to choose from. Unfortunately, atomic nuclei with atomic number Z > 100 are not properly supported. This is likely due to the relative novelty of the field, its comparably small user base, and scarce evaluated experimental data. To circumvent this situation, different workarounds have been used over the years. In this work the simulation toolkit Geant4 is introduced with its different components, and the effort to bring the software to the heavy and superheavy region is described.

  4. Introduction to the Geant4 Simulation toolkit

    SciTech Connect

    Guatelli, S.; Cutajar, D.; Rosenfeld, A. B.; Oborn, B.

    2011-05-05

    Geant4 is a Monte Carlo simulation toolkit describing the interactions of particles with matter. Geant4 is widely used in radiation physics research, from high energy physics to medical physics and space science, thanks to its sophisticated physics component coupled with advanced functionality in geometry description. Geant4 is widely used at the Centre for Medical Radiation Physics (CMRP) at the University of Wollongong to characterise and optimise novel detector concepts, radiotherapy treatments, and imaging solutions. This lecture consists of an introduction to the Monte Carlo method and to Geant4. Particular attention is devoted to the Geant4 physics component and to the physics models describing electromagnetic and hadronic interactions. The second part of the lecture focuses on the methodology for developing a Geant4 simulation application.

  5. The Geant4 physics validation repository

    DOE PAGES

    Wenzel, H.; Yarba, J.; Dotti, A.

    2015-12-23

    The Geant4 collaboration regularly performs validation and regression tests. The results are stored in a central repository and can be easily accessed via a web application. In this article we describe the Geant4 physics validation repository, which consists of a relational database storing experimental data and Geant4 test results, a Java API, and a web application. Lastly, the functionality of these components and the technology choices we made are also described.
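A toy sketch of such a repository's storage layer: a relational table keyed by release and observable, queried for regression comparison across releases. The schema and values below are invented, using SQLite only to keep the example self-contained:

```python
# Minimal relational store for test results, plus a regression query.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE results (
    release TEXT, test TEXT, observable TEXT, value REAL)""")
rows = [
    ("10.6", "thin-target", "mean_multiplicity", 3.1),
    ("10.7", "thin-target", "mean_multiplicity", 3.3),
]
con.executemany("INSERT INTO results VALUES (?, ?, ?, ?)", rows)

# Regression check: compare one observable across two releases.
cur = con.execute(
    "SELECT release, value FROM results WHERE test=? AND observable=?"
    " ORDER BY release",
    ("thin-target", "mean_multiplicity"))
by_release = dict(cur.fetchall())
drift = by_release["10.7"] - by_release["10.6"]
```

The real repository additionally stores experimental reference data alongside simulation results, so the same query pattern supports both validation (against data) and regression (against previous releases).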

  6. Visualization drivers for Geant4

    SciTech Connect

    Beretvas, Andy; /Fermilab

    2005-10-01

    This document is on Geant4 visualization tools (drivers), evaluating the pros and cons of each option and including recommendations on which tools to support at Fermilab for different applications. Four visualization drivers are evaluated: OpenGL, HepRep, DAWN and VRML. All have good features: OpenGL provides graphic output without an intermediate file; HepRep provides menus to assist the user; DAWN produces high-quality plots and, even for large files, produces output quickly; VRML uses the smallest disk space for intermediate files. Large experiments at Fermilab will want to write their own display, and should make that display graphics-independent. Medium-scale experiments will probably want to use HepRep because of its menu support. Smaller-scale experiments will want to use OpenGL, in the spirit of having immediate response, good-quality output, and keeping things simple.

  7. GEANT4 and Secondary Particle Production

    NASA Technical Reports Server (NTRS)

    Patterson, Jeff

    2004-01-01

    GEANT4 is a Monte Carlo toolkit developed by the high energy physics community (CERN, SLAC, etc.) to perform simulations of complex particle detectors. GEANT4 is an ideal tool to study radiation transport and should be applied to space environments and the complex geometries of modern-day spacecraft.

  8. Implementing NRF Physics in Geant4

    SciTech Connect

    Jordan, David V.; Warren, Glen A.

    2006-07-01

    The Geant4 radiation transport Monte Carlo code toolkit currently does not support nuclear resonance fluorescence (NRF). After a brief review of NRF physics, plans for implementing this physics process in Geant4, and validating the output of the code, are described. The plans will be executed as Task 3 of project 50799, "Nuclear Resonance Fluorescence Signatures (NuRFS)".

  9. Geant4 application in a Web browser

    NASA Astrophysics Data System (ADS)

    Garnier, Laurent; Geant4 Collaboration

    2014-06-01

    Geant4 is a toolkit for the simulation of the passage of particles through matter. The Geant4 visualization system supports many drivers, including OpenGL[1], OpenInventor, HepRep[2], DAWN[3], VRML, RayTracer, gMocren[4] and ASCIITree, with diverse and complementary functionalities. Web applications have an increasing role in our work, and thanks to emerging frameworks such as Wt [5], a web application can be built on top of a C++ application without rewriting all the code. Because the Geant4 toolkit's visualization and user-interface modules are well decoupled from the rest of Geant4, it is straightforward to adapt these modules to render in a web application instead of a computer's native window manager. The API of the Wt framework closely matches that of Qt [6], so our experience in building the Qt driver benefits the Wt driver. Porting a Geant4 application to a web application is easy, and with minimal effort Geant4 users can replicate this process to share their own Geant4 applications in a web browser.

  10. Development of an Interface for Using EGS4 Physics Processes in Geant4

    SciTech Connect

    Murakami, K.

    2004-01-21

For a simulation system, the variety of physics processes implemented is one of its most important functionalities. In that sense, Geant4 is one of the most powerful simulation toolkits. Its flexibility and extensibility, brought by the object-oriented approach, make it possible to easily assimilate external simulation packages into the Geant4 system as modules of physics processes. We developed an interface for using EGS4, another well-known simulation package for electromagnetic physics, within Geant4. By means of this interface, EGS4 users can share Geant4's powerful resources, such as geometry and tracking. The interface also provides a common environment for comparison tests between EGS4 and Geant4. In this paper, we describe our design and implementation of the interface.

  11. Alpha Coincidence Spectroscopy studied with GEANT4

    SciTech Connect

    Dion, Michael P.; Miller, Brian W.; Tatishvili, Gocha; Warren, Glen A.

    2013-11-02

The high-energy side of peaks in alpha spectra, e.g. of 241Am, as measured with a silicon detector has structure caused mainly by alpha-conversion-electron coincidences and, to some extent, alpha-gamma coincidences. We compare GEANT4 simulation results to 241Am alpha spectroscopy measurements with a passivated implanted planar silicon detector. A large discrepancy between the measurements and simulations suggests that the GEANT4 photon evaporation database for 237Np (the daughter of 241Am decay) does not accurately describe the conversion electron spectrum. We describe how to improve the agreement between GEANT4 and alpha spectroscopy for actinides of interest by incorporating experimental conversion electron spectroscopy measurements into the photon evaporation database.

  12. Monte Carlo simulation of the ELIMED beamline using Geant4

    NASA Astrophysics Data System (ADS)

    Pipek, J.; Romano, F.; Milluzzo, G.; Cirrone, G. A. P.; Cuttone, G.; Amico, A. G.; Margarone, D.; Larosa, G.; Leanza, R.; Petringa, G.; Schillaci, F.; Scuderi, V.

    2017-03-01

In this paper, we present a Geant4-based Monte Carlo application for simulation of the ELIMED beamline [1-6], including its features and several preliminary results. We developed the application to aid the design of the beamline, to estimate various beam characteristics, and to assess the amount of secondary radiation. In the future, an enhanced version of this application will support beamline users when preparing their experiments.

  13. Comparison of GEANT4 very low energy cross section models with experimental data in water

    SciTech Connect

    Incerti, S.; Ivanchenko, A.; Karamitros, M.; Mantero, A.; Moretto, P.; Tran, H. N.; Mascialino, B.; Champion, C.; Ivanchenko, V. N.; Bernal, M. A.; Francis, Z.; Villagrasa, C.; Baldacchino, G.; Gueye, P.; Capra, R.; Nieminen, P.; Zacharatou, C.

    2010-09-15

Purpose: The GEANT4 general-purpose Monte Carlo simulation toolkit is able to simulate physical interaction processes of electrons, hydrogen and helium atoms with charge states (H⁰, H⁺) and (He⁰, He⁺, He²⁺), respectively, in liquid water, the main component of biological systems, down to the electron volt regime and the submicrometer scale, providing GEANT4 users with the so-called "GEANT4-DNA" physics models suitable for microdosimetry simulation applications. The corresponding software has recently been re-engineered in order to provide GEANT4 users with a coherent and unique approach to the simulation of electromagnetic interactions within the GEANT4 toolkit framework (since GEANT4 version 9.3 beta). This work presents a quantitative comparison of these physics models with a collection of experimental data in water collected from the literature. Methods: An evaluation of the closeness between the total and differential cross section models available in the GEANT4 toolkit for microdosimetry and experimental reference data is performed using a dedicated statistical toolkit that includes the Kolmogorov-Smirnov statistical test. The authors used experimental data acquired in water vapor, as direct measurements in the liquid phase are not yet available in the literature. Comparisons with several recommendations are also presented. Results: The authors have assessed the compatibility of experimental data with GEANT4 microdosimetry models by means of quantitative methods. The results show that microdosimetric measurements in liquid water are necessary to assess quantitatively the validity of the software implementation for the liquid water phase. Nevertheless, a comparison with existing experimental data in water vapor provides a qualitative appreciation of the plausibility of the simulation models. The existing reference data themselves should undergo a critical interpretation and selection, as some of the series exhibit significant
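The statistical comparison described in Methods rests on the two-sample Kolmogorov-Smirnov test. Below is a minimal, stdlib-only sketch of the D statistic; the synthetic Gaussian samples merely stand in for cross-section data, and this is not the dedicated statistical toolkit used in the paper.

```python
import random

def ks_statistic(sample_a, sample_b):
    """Two-sample Kolmogorov-Smirnov D: maximum distance between the
    empirical CDFs of the two samples (merge-walk over sorted values)."""
    a, b = sorted(sample_a), sorted(sample_b)
    n, m = len(a), len(b)
    d = 0.0
    i = j = 0
    while i < n and j < m:
        if a[i] <= b[j]:
            i += 1
        else:
            j += 1
        d = max(d, abs(i / n - j / m))
    return d

random.seed(42)
# Stand-in "model" and "data" distributions (illustrative only)
model = [random.gauss(0.0, 1.0) for _ in range(2000)]
data_close = [random.gauss(0.0, 1.0) for _ in range(2000)]
data_far = [random.gauss(0.5, 1.0) for _ in range(2000)]
print(ks_statistic(model, data_close))  # small D: compatible distributions
print(ks_statistic(model, data_far))    # larger D: shifted distribution
```

A small D means the two empirical distributions are statistically compatible; the test makes no assumption about the underlying distribution shape.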

  14. Geant4: A Simulation Toolkit for the Passage of Particles through Matter

    NASA Astrophysics Data System (ADS)

    Geant4 Collaboration

    2010-10-01

Geant4 is a toolkit for simulating the passage of particles through matter. It includes a complete range of functionality including tracking, geometry, physics models and hits. The physics processes offered cover a comprehensive range, including electromagnetic, hadronic and optical processes, a large set of long-lived particles, materials and elements, over a wide energy range starting, in some cases, from 250 eV and extending in others to the TeV energy range. It has been designed and constructed to expose the physics models utilised, to handle complex geometries, and to enable its easy adaptation for optimal use in different sets of applications. The toolkit is the result of a worldwide collaboration of physicists and software engineers. It has been created exploiting software engineering and object-oriented technology and implemented in the C++ programming language. It has been used in applications in particle physics, nuclear physics, accelerator design, space engineering and medical physics.

  15. Geant4 models for space radiation environment.

    NASA Astrophysics Data System (ADS)

    Ivantchenko, Anton; Nieminen, Petteri; Incerti, Sebastien; Santin, Giovanni; Ivantchenko, Vladimir; Grichine, Vladimir; Allison, John

The space radiation environment includes a wide variety of particles, from electrons to heavy ions. In order to correctly predict the dose received by astronauts and devices, the simulation models must have good applicability and produce accurate results from 10 MeV/u up to 10 GeV/u, where the most hazardous particles are present in the spectra. Appropriate models should also provide a good description of electromagnetic interactions down to very low energies (10 eV/u - 10 MeV/u) for understanding the damage mechanisms due to long-term low doses. Predictions of biological dose during long interplanetary journeys also need models for hadronic interactions of energetic heavy ions extending to higher energies (10 GeV/u - 100 GeV/u, and possibly up to 1 TeV/u). Geant4 is a powerful toolkit which in some areas well surpasses the needs of space radiation studies, while in others it is being developed and/or validated to properly cover the modelling requirements outlined above. Our activities in ESA projects deal with the research and development of both Geant4 hadronic and electromagnetic physics. Recently the scope of verification tests and benchmarks has been extended. Hadronic tests and benchmarks run proton, pion, and ion interactions with matter at various energies. In the Geant4 hadronic sub-libraries, the most accurate cross sections have been identified and selected as defaults for all particle types relevant to space applications. Significant developments were carried out for ion-ion interaction models. These now allow one to perform Geant4 simulations for all particle types and energies relevant to space applications. For the validation of ion models, the hadronic testing suite for ion interactions was significantly extended. In this work the results of benchmarking versus data in a wide energy range for projectile protons and ions will be shown and discussed. Here we show results of the test runs and their precision. Recommendations for Geant4

  16. BoGEMMS: the Bologna Geant4 multi-mission simulator

    NASA Astrophysics Data System (ADS)

    Bulgarelli, A.; Fioretti, V.; Malaguti, P.; Trifoglio, M.; Gianotti, F.

    2012-07-01

BoGEMMS (Bologna Geant4 Multi-Mission Simulator) is a software project, developed at INAF/IASF Bologna, for fast simulation of payloads on board scientific satellites for prompt background evaluation. By exploiting the Geant4 set of libraries, BoGEMMS allows users to interactively set the geometrical and physical parameters (e.g. physics list, materials and thicknesses), to record the interactions (e.g. energy deposit, position, interacting particle) in NASA FITS and CERN ROOT format output files, and to filter the output as in a real observation in space, to finally produce the detected background count rate and spectra. Four different types of output can be produced by BoGEMMS, capturing different aspects of the interactions. The simulator can also run parallel jobs and store the results on a centralized server via the xrootd protocol. BoGEMMS is a multi-mission tool, designed to be applied to any high-energy mission for which analysis of the shielding and instrument performance is required.

  17. Geometry Optimization in NOvA with Geant4

    NASA Astrophysics Data System (ADS)

    Nguyen, Vivan; Messier, Mark; NOvA Collaboration

    2013-10-01

NOvA is a neutrino beam experiment designed to detect neutrino oscillations. There are two detectors, placed at distances of 1 km and 810 km from the proton target. The detectors are made of PVC filled with liquid scintillator. In simulating the experiment, an important aspect is the detector geometry, which is input to Geant4 using the GDML markup language. I will present studies in which the geometry description was systematically varied to find a configuration that preserved the modeling accuracy required by the experiment while minimizing the CPU time required for the simulation. This work was supported by the REU Program of the National Science Foundation under Award PHY-1156540.

  18. Validation of Geant4 Hadronic Generators versus Thin Target Data

    SciTech Connect

    Banerjee, S.; Folger, G.; Ivanchenko, A.; Ivanchenko, V.N.; Kossov, M.; Quesada, J.M.; Schalicke, A.; Uzhinsky, V.; Wenzel, H.; Wright, D.H.; Yarba, J.; /Fermilab

    2012-04-19

The GEANT4 toolkit is widely used for simulation of high energy physics (HEP) experiments, in particular those at the Large Hadron Collider (LHC). The requirements of robustness, stability and quality of simulation for the LHC are demanding. They call for an accurate description of hadronic interactions for a wide range of targets over a large energy range, from stopped-particle reactions and low energy nuclear interactions up to interactions at the TeV energy scale. This is achieved within the Geant4 toolkit by combining a number of models, each of which is valid within a certain energy domain. Comparison of these models to thin target data over a large energy range indicates the strengths and weaknesses of the model descriptions and the energy range over which each model is valid. Software has been developed to handle the large number of validation tests required to provide the feedback needed to improve the models. An automated process for carrying out the validation and storing/displaying the results is being developed and will be discussed.

  19. Modeling of microporous silicon betaelectric converter with 63Ni plating in GEANT4 toolkit*

    NASA Astrophysics Data System (ADS)

    Zelenkov, P. V.; Sidorov, V. G.; Lelekov, E. T.; Khoroshko, A. Y.; Bogdanov, S. V.; Lelekov, A. T.

    2016-04-01

A model of the electron-hole pair generation rate distribution in the semiconductor is needed to optimize the parameters of a microporous silicon betaelectric converter that uses 63Ni isotope radiation. Using the Monte Carlo methods of the GEANT4 software with ultra-low-energy electron physics models, this distribution in silicon was calculated and approximated with an exponential function. The optimal pore configuration was estimated.
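The exponential approximation of a generation-rate profile can be illustrated with a toy Monte Carlo: sample deposition depths from an assumed exponential law, histogram them, and recover the attenuation coefficient with a log-linear least-squares fit. The coefficient and geometry below are invented for illustration, not the 63Ni/silicon values.

```python
import math
import random

random.seed(1)
TRUE_MU = 2.5      # assumed attenuation coefficient, 1/um (illustrative)
N = 200000         # number of toy histories
BIN = 0.1          # depth bin width, um
NBINS = 40

# Toy Monte Carlo: each "electron" deposits at an exponentially distributed
# depth, mimicking a generation-rate profile G(x) ~ exp(-mu * x)
hist = [0] * NBINS
for _ in range(N):
    x = random.expovariate(TRUE_MU)
    k = int(x / BIN)
    if k < NBINS:
        hist[k] += 1

# Log-linear least-squares fit: log(counts) vs depth has slope -mu
xs = [(k + 0.5) * BIN for k in range(NBINS) if hist[k] > 0]
ys = [math.log(hist[k]) for k in range(NBINS) if hist[k] > 0]
n = len(xs)
sx, sy = sum(xs), sum(ys)
sxx = sum(x * x for x in xs)
sxy = sum(x * y for x, y in zip(xs, ys))
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
mu_fit = -slope
print("fitted mu:", round(mu_fit, 2))
```

With enough histories the fitted coefficient converges on the one used to generate the depths, which is the sense in which the paper's exponential approximation summarizes the simulated distribution.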

  20. Geant4-Simulations for cellular dosimetry in nuclear medicine.

    PubMed

    Freudenberg, Robert; Wendisch, Maria; Kotzerke, Jörg

    2011-12-01

The application of unsealed radionuclides in radiobiological experiments can lead to intracellular radionuclide uptake and an increased absorbed dose. Accurate dose quantification is essential to assess the observed radiobiological effects, but due to the small cellular dimensions direct dose measurement is impossible. We demonstrate the application of Monte Carlo simulations for dose calculation. Dose calculations were performed using the Geant4 Monte Carlo toolkit, for which typical experimental situations were modeled. Dose distributions inside wells were simulated for different radionuclides, and S values were simulated for spherical cells and cell monolayers of different diameters. Concomitantly, experiments were performed using the PC Cl3 cell line with mediated radionuclide uptake, and cellular survival was measured for various activity distributions. We obtained S values for the dose distribution inside the wells. Calculated S values for a single cell are in good agreement with S values provided in the literature (ratio 0.87 to 1.07). The cross-dose is up to ten times higher for Y-90. The concomitantly performed cellular experiments confirm the dose calculations. Furthermore, the necessity of correct dose calculation for the assessment of radiobiological effects after application of unsealed radionuclides was shown, thereby demonstrating the feasibility of using Geant4.
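The S values at the heart of such cellular dosimetry are, in the MIRD formalism, the absorbed dose per decay: S = E·φ/m. A hedged sketch for a spherical cell follows; the mean energy and absorbed fraction are purely illustrative placeholders, not the data of 90Y or any other nuclide.

```python
import math

EV_TO_J = 1.602176634e-19  # electron volt in joules

def s_value(mean_energy_eV, absorbed_fraction, radius_um, density_g_cm3=1.0):
    """MIRD-style S value (Gy per decay) for a spherical target:
    S = E * phi / m, with m the target mass in kg."""
    r_cm = radius_um * 1e-4
    mass_kg = density_g_cm3 * (4.0 / 3.0) * math.pi * r_cm ** 3 * 1e-3
    energy_J = mean_energy_eV * EV_TO_J
    return energy_J * absorbed_fraction / mass_kg

# Illustrative numbers only (NOT nuclide data): 100 keV mean emission energy,
# 5% self-absorbed fraction, 7 um cell radius, unit density
s = s_value(100e3, 0.05, 7.0)
print("S value (Gy/decay):", s)
```

Monte Carlo codes such as Geant4 earn their keep in the absorbed fraction φ, which depends on the emission spectrum and the source-target geometry; the rest of the S-value arithmetic is as simple as shown.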

  1. Implementing dosimetry in GATE: dose-point kernel validation with GEANT4 4.8.1.

    PubMed

    Ferrer, Ludovic; Chouin, Nicolas; Bitar, Abdalkader; Lisbona, Albert; Bardiès, Manuel

    2007-02-01

GATE is a recent Monte Carlo code, based on GEANT4, used in nuclear medicine mainly for imaging and detector design. Our goal was to implement dosimetry within GATE (i.e., to combine the excellent potential of GATE for image modeling with GEANT4's dosimetric capabilities). The latest release of GEANT4 (4.8.1) completely revised the electron multiple scattering propagation algorithm. In this work, we calculated dose point kernels (DPK) for 0.01, 0.05, 0.1, 1, and 3 MeV monoenergetic electrons. We then compared our results with data obtained with another Monte Carlo code (MCNPX) or from the reference publication of Berger and Seltzer. To facilitate comparison, all calculated dose distributions were scaled to the corresponding R(CSDA), as given by the ESTAR NIST web database. Some GEANT4 parameters (i.e., Stepmax) or the shell thickness had to be adjusted in order to achieve good agreement for energies below 1 MeV. For all energies except 10 keV, the calculated DPKs do not differ significantly from the reference, as assessed by a Kolmogorov-Smirnov test. This preliminary step allowed us to consider the integration of GEANT4 dosimetric capabilities within the GATE framework.
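Scaling dose-point kernels to R(CSDA), as done here to facilitate comparison, amounts to histogramming energy deposits versus r/R_CSDA so that kernels for different energies share a common axis. A toy sketch with uniform synthetic deposits follows; the range value is illustrative, not an ESTAR lookup.

```python
import random

random.seed(7)

def scaled_dpk(deposits, r_csda, nbins=10):
    """Histogram (radius, energy) deposits versus r / R_CSDA and return
    the fraction of total energy in each scaled radial bin."""
    hist = [0.0] * nbins
    for r, e in deposits:
        x = r / r_csda
        if x < 1.0:
            hist[int(x * nbins)] += e
    total = sum(hist)
    return [h / total for h in hist]

# Toy deposits: radii uniform in [0, R), energies uniform; R is illustrative
R = 0.44
deposits = [(random.uniform(0.0, R), random.uniform(0.5, 1.5))
            for _ in range(10000)]
dpk = scaled_dpk(deposits, R)
print([round(f, 3) for f in dpk])
```

Real kernels are of course far from uniform in radius; the point of the sketch is only the normalization step that makes kernels from different codes and energies directly comparable.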

  2. SU-E-J-72: Geant4 Simulations of Spot-Scanned Proton Beam Treatment Plans

    SciTech Connect

    Kanehira, T; Sutherland, K; Matsuura, T; Umegaki, K; Shirato, H

    2014-06-01

Purpose: To evaluate density inhomogeneities which can affect dose distributions for real-time image gated spot-scanning proton therapy (RGPT), a dose calculation system based on Geant4 was developed using spot position data from the treatment planning system VQA (Hitachi Ltd., Tokyo). Methods: A Geant4 application was developed to simulate spot-scanned proton beams at Hokkaido University Hospital. A CT scan (0.98 × 0.98 × 1.25 mm) was performed for prostate cancer treatment with three or four inserted gold markers (diameter 1.5 mm, volume 1.77 mm³) in or near the target tumor. The CT data was read into VQA. A spot scanning plan was generated and exported to text files specifying the beam energy and position of each spot. The text files were converted and read into our Geant4-based software. The spot position was converted into steering magnet field strength (in tesla) for our beam nozzle. Individual protons were tracked from the vacuum chamber, through the helium chamber, steering magnets, dose monitors, etc., in a straight, horizontal line. The patient CT data was converted into materials with variable density and placed in a parametrized volume at the isocenter. Gold fiducial markers were represented in the CT data by two adjacent voxels (volume 2.38 mm³). 600,000 proton histories were tracked for each target spot. As one beam contained about 1,000 spots, approximately 600 million histories were recorded for each beam on a blade server. Two plans were considered: two-beam horizontal opposed (90 and 270 degrees) and three-beam (0, 90 and 270 degrees). Results: We are able to convert spot scanning plans from VQA and simulate them with our Geant4-based code. Our system can be used to evaluate the dose reduction caused by the gold markers used for RGPT. Conclusion: Our Geant4 application is able to calculate dose distributions for spot-scanned proton therapy.

  3. Geant4 simulations of a wide-angle x-ray focusing telescope

    NASA Astrophysics Data System (ADS)

    Zhao, Donghua; Zhang, Chen; Yuan, Weimin; Zhang, Shuangnan; Willingale, Richard; Ling, Zhixing

    2017-03-01

    The rapid development of X-ray astronomy has been made possible by widely deploying X-ray focusing telescopes on board many X-ray satellites. Geant4 is a very powerful toolkit for Monte Carlo simulations and has remarkable abilities to model complex geometrical configurations. However, the library of physical processes available in Geant4 lacks a description of the reflection of X-ray photons at a grazing incident angle which is the core physical process in the simulation of X-ray focusing telescopes. The scattering of low-energy charged particles from the mirror surfaces is another noteworthy process which is not yet incorporated into Geant4. Here we describe a Monte Carlo model of a simplified wide-angle X-ray focusing telescope adopting lobster-eye optics and a silicon detector using the Geant4 toolkit. With this model, we simulate the X-ray tracing, proton scattering and background detection. We find that: (1) the effective area obtained using Geant4 is in agreement with that obtained using Q software with an average difference of less than 3%; (2) X-rays are the dominant background source below 10 keV; (3) the sensitivity of the telescope is better by at least one order of magnitude than that of a coded mask telescope with the same physical dimensions; (4) the number of protons passing through the optics and reaching the detector by Firsov scattering is about 2.5 times that of multiple scattering for the lobster-eye telescope.

  4. artG4: A Generic Framework for Geant4 Simulations

    SciTech Connect

    Arvanitis, Tasha; Lyon, Adam

    2014-01-01

    A small experiment must devote its limited computing expertise to writing physics code directly applicable to the experiment. A software 'framework' is essential for providing an infrastructure that makes writing the physics-relevant code easy. In this paper, we describe a highly modular and easy to use framework for writing Geant4 based simulations called 'artg4'. This framework is a layer on top of the art framework.

  5. GEANT4 Simulation of Neutron Detector for DAMPE

    NASA Astrophysics Data System (ADS)

    He, M.; Ma, T.; Chang, J.; Zhang, Y.; Huang, Y. Y.; Zang, J. J.; Wu, J.; Dong, T. K.

    2016-01-01

Over recent decades, dark matter has gradually become a hot topic in astronomical research, and related theoretical and experimental projects change with each passing day. The Dark Matter Particle Explorer (DAMPE) of our country was proposed against this background. As the probed objects include high energy electrons, appropriate methods must be adopted to distinguish them from protons, in order to reduce the probability of other charged particles (e.g. protons) being mistaken for electrons. Experiments show that the hadronic shower of a high energy proton in the BGO electromagnetic calorimeter, which is usually accompanied by the emission of a large number of secondary neutrons, is significantly different from the electromagnetic shower of a high energy electron. Through the detection of the secondary neutron signal emerging from the bottom of the BGO electromagnetic calorimeter and the shower shape of the incident particle in the calorimeter, we can effectively distinguish whether the incident particle is a high energy proton or an electron. This paper introduces the structure and detection principle of the DAMPE neutron detector. We use the Monte Carlo method with the GEANT4 software to simulate the signals produced by protons and electrons at characteristic energies in the neutron detector, and finally summarize the neutron detector's ability to distinguish protons from electrons under different electron acceptance efficiencies.
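The discrimination scheme described above, counting secondary-neutron hits to separate protons from electrons at a chosen electron acceptance efficiency, can be sketched as a toy model. The mean hit counts per event below are invented for illustration and are not DAMPE simulation results.

```python
import math
import random

random.seed(3)

def poisson(lam):
    """Knuth's algorithm: count uniforms until their product drops
    below e^-lam."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p < L:
            return k
        k += 1

# Invented mean neutron-detector hit counts per event (illustration only):
# electrons produce few secondary neutrons, protons many
electrons = [poisson(0.3) for _ in range(20000)]
protons = [poisson(8.0) for _ in range(20000)]

# Smallest hit-count cut that keeps at least 95% of electrons
for cut in range(20):
    eff = sum(1 for h in electrons if h <= cut) / len(electrons)
    if eff >= 0.95:
        break
contamination = sum(1 for h in protons if h <= cut) / len(protons)
print("cut:", cut, "electron efficiency:", round(eff, 3),
      "proton contamination:", round(contamination, 4))
```

The trade-off the paper studies is exactly this: loosening the cut raises the electron acceptance efficiency but lets more proton-induced showers through.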

  6. GEANT4 Simulation of Neutron Detector for DAMPE

    NASA Astrophysics Data System (ADS)

    Ming, He; Tao, Ma; Jin, Chang; Yan, Zhang; Yong-yi, Huang; Jing-jing, Zang; Jian, Wu; Tie-kuang, Dong

    2016-10-01

In recent decades, dark matter has gradually become a hot topic in astronomical research, and the related theoretical research and experimental projects are updated with each passing day. The Dark Matter Particle Explorer (DAMPE) of our country was proposed under this background. As the detected objects involve high-energy electrons, appropriate methods must be taken to distinguish them from protons, in order to reduce the probability of other charged particles (for example protons) being mistaken for electrons. Experiments show that the hadron shower of a high-energy proton in the BGO (Bismuth Germanium Oxide) calorimeter, which is usually accompanied by the emission of a large number of secondary neutrons, is significantly different from the electromagnetic shower of a high-energy electron. Through the detection of secondary neutron signals emerging from the bottom of the BGO calorimeter, and the shower shape of incident particles in the BGO calorimeter, we can effectively distinguish whether the incident particles are high-energy protons or electrons. This paper introduces the structure and detection principle of the DAMPE neutron detector. We use the Monte Carlo method and the GEANT4 software to simulate the signals produced by protons and electrons at the characteristic energy in the neutron detector, and finally summarize the neutron detector's ability to distinguish protons from electrons under different electron acceptance efficiencies.

  7. CAD-based Automatic Modeling Method for Geant4 geometry model Through MCAM

    NASA Astrophysics Data System (ADS)

Wang, Dong; Nie, Fanzhi; Wang, Guozhong; Long, Pengcheng; LV, Zhongliang

    2014-06-01

Geant4 is a widely used Monte Carlo transport simulation package. Before a Geant4 calculation, the geometry model must be established, described either in the Geometry Description Markup Language (GDML) or in C++. However, it is time-consuming and error-prone to describe models in GDML manually. Automatic modeling methods have been developed recently, but most existing modeling programs have shortcomings; in particular, some of them are not accurate or are tied to a specific CAD format. To convert CAD models to GDML accurately, a CAD-based modeling method for Geant4 was developed that automatically converts complex CAD geometry models into GDML geometry models. The essence of this method is mediating between the boundary representation (B-REP) used by CAD models and the constructive solid geometry (CSG) used by GDML models. First, the CAD model is decomposed into several simple solids, each having only one closed shell. Each simple solid is then decomposed into a set of convex shells, and the corresponding GDML convex basic solids are generated from the boundary surfaces obtained from the topological characteristics of each convex shell. After the generation of these solids, the GDML model is assembled with a series of Boolean operations. This method was adopted in the CAD/Image-based Automatic Modeling Program for Neutronics & Radiation Transport (MCAM) and tested with several models, including the examples in the Geant4 installation package. The results showed that this method can convert standard CAD models accurately and can be used for Geant4 automatic modeling.
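The CSG side of the conversion, composing basic solids with Boolean operations as GDML does, can be illustrated with a minimal point-membership sketch. The solid shapes and sizes below are arbitrary, and the B-REP decomposition step itself is well beyond this sketch.

```python
# Minimal CSG point-membership sketch: solids are predicates on 3D points,
# Boolean union/subtraction are combinators over those predicates.

def box(cx, cy, cz, hx, hy, hz):
    """Axis-aligned box centered at (cx,cy,cz) with half-widths hx,hy,hz."""
    return lambda p: (abs(p[0] - cx) <= hx and
                      abs(p[1] - cy) <= hy and
                      abs(p[2] - cz) <= hz)

def sphere(cx, cy, cz, r):
    """Sphere centered at (cx,cy,cz) with radius r."""
    return lambda p: (p[0]-cx)**2 + (p[1]-cy)**2 + (p[2]-cz)**2 <= r * r

def union(a, b):
    return lambda p: a(p) or b(p)

def subtract(a, b):
    return lambda p: a(p) and not b(p)

# A box with a spherical cavity, as a GDML-style Boolean solid would compose it
solid = subtract(box(0, 0, 0, 1, 1, 1), sphere(0, 0, 0, 0.5))
print(solid((0.9, 0.0, 0.0)))  # inside box, outside cavity
print(solid((0.1, 0.0, 0.0)))  # inside cavity
```

GDML expresses the same idea declaratively (`<union>`, `<subtraction>`, `<intersection>` elements over named solids); the automatic converter's job is producing that composition from the decomposed CAD shells.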

  8. Geant4 validation with CMS calorimeters test-beam data

    SciTech Connect

    Piperov, Stefan; /Sofiya, Inst. Nucl. Res. /Fermilab

    2008-08-01

The CMS experiment uses Geant4 for Monte Carlo simulation of the detector setup. Validation of the physics processes describing hadronic showers is a major concern in view of obtaining a proper description of jets and missing energy for signal and background events. This is done by carrying out extensive test-beam studies using prototypes or real detector modules of the CMS calorimeter. These data are matched against Geant4 predictions. Tuning of the Geant4 models is carried out, and the steps to be used in reproducing detector signals are defined, in view of measurements of energy response, energy resolution, and transverse and longitudinal shower profiles for a variety of hadron beams over a broad momentum spectrum from 2 to 300 GeV/c.
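The energy response and resolution figures extracted from such test-beam comparisons can be mimicked with a toy smeared-calorimeter model. The stochastic and constant terms below are illustrative placeholders, not CMS calorimeter parameters.

```python
import math
import random

random.seed(5)

def simulate_signals(e_beam, n=5000, stochastic=1.2, constant=0.05):
    """Toy calorimeter: Gaussian-smeared signals with fractional width
    sigma/E = sqrt((stochastic/sqrt(E))^2 + constant^2).
    Terms are illustrative placeholders, not fitted CMS values."""
    frac = math.sqrt((stochastic / math.sqrt(e_beam)) ** 2 + constant ** 2)
    return [random.gauss(e_beam, frac * e_beam) for _ in range(n)]

results = {}
for e in (5.0, 50.0, 300.0):
    sig = simulate_signals(e)
    mean = sum(sig) / len(sig)
    var = sum((s - mean) ** 2 for s in sig) / (len(sig) - 1)
    # response = mean signal / beam energy; resolution = sigma / mean
    results[e] = (mean / e, math.sqrt(var) / mean)
    print(e, "response:", round(results[e][0], 3),
          "resolution:", round(results[e][1], 3))
```

Validation then consists of checking that the simulated response and resolution curves, extracted exactly this way from Geant4 output, track the test-beam measurements across the momentum spectrum.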

  9. GEANT4 simulation of APEX background radiation and shielding

    NASA Astrophysics Data System (ADS)

    Kaluarachchi, Maduka M.; Cates, Gordon D.; Wojtsekhowski, B.

    2015-04-01

The A′ Experiment (APEX), which is approved to run in Hall A of the Thomas Jefferson National Accelerator Facility (JLab), will search for a new vector boson that is hypothesized to be a possible force carrier coupling to dark matter. APEX should be sensitive to the mass range of 65 MeV to 550 MeV, and high sensitivity will be achieved by means of a high intensity 100 μA beam on a 0.5 g/cm² tungsten target, resulting in very high luminosity. The experiment should be able to observe the A′ with a coupling constant α′ about 1 × 10⁷ times smaller than the electromagnetic coupling constant α. To deal safely with such enormous intensity and luminosity, a full radiation analysis must be used to help with the design of proper radiation shielding. The purpose of this talk is to present preliminary results obtained by simulating the radiation background of the APEX experiment using the 3D Monte Carlo transport code Geant4. Included in the simulation is a detailed Hall A setup: the hall, spectrometers and shield house, beam dump, beam line, septum magnet with its field, as well as the production target. The results were compared to the APEX test run data and used in the development of the radiation shielding for sensitive electronics.

  10. Preliminary Investigation of Microdosimetric Track Structure Physics Models in Geant4-DNA and RITRACKS.

    PubMed

    Douglass, Michael; Penfold, Scott; Bezak, Eva

    2015-01-01

    The major differences between the physics models in Geant4-DNA and RITRACKS Monte Carlo packages are investigated. Proton and electron ionisation interactions and electron excitation interactions in water are investigated in the current work. While these packages use similar semiempirical physics models for inelastic cross-sections, the implementation of these models is demonstrated to be significantly different. This is demonstrated in a simple Monte Carlo simulation designed to identify differences in interaction cross-sections.

  11. Preliminary Investigation of Microdosimetric Track Structure Physics Models in Geant4-DNA and RITRACKS

    PubMed Central

    Bezak, Eva

    2015-01-01

    The major differences between the physics models in Geant4-DNA and RITRACKS Monte Carlo packages are investigated. Proton and electron ionisation interactions and electron excitation interactions in water are investigated in the current work. While these packages use similar semiempirical physics models for inelastic cross-sections, the implementation of these models is demonstrated to be significantly different. This is demonstrated in a simple Monte Carlo simulation designed to identify differences in interaction cross-sections. PMID:26124856

  12. Comparison of electron scattering algorithms in Geant4.

    PubMed

    Sawkey, D; Constantin, M; Svatos, M

    2012-06-07

Electron scattering algorithms in Geant4 versions 9.4 and 9.5 were benchmarked by comparing scattered distributions against previously measured values at 13 and 20 MeV, for low, intermediate, and high atomic number materials. Several scattering models were used: versions 93 and 95 of the Urban model, with different step size limits near boundaries; Goudsmit-Saunderson multiple scattering; and single scattering. The Urban93 and Urban95 models with a large step size limit (as in the Option 0 physics list) were found to give results most closely matching the experimental ones. Scattered distributions using the Urban models were all narrower than measured, by up to 6%, consistent with previously published simulations using EGSnrc. This is suggestive of a systematic difference between simulation and measurement. The magnitudes of the differences were similar to previously published results using Geant4, although there were differences in detail; in particular, the current results were typically 2% narrower than those values. Results with the more restrictive step size limit in Option 3 were even narrower, and close to those obtained with single scattering. The Goudsmit-Saunderson multiple scattering model produced distributions up to 15% different from measurement in Geant4 version 9.5 and up to 45% different in Geant4 version 9.4.

  13. Beam Simulation Tools for GEANT4 (BT-V1.0). User's Guide

    SciTech Connect

    Elvira, V. Daniel; Lebrum, P.; Spentzouris, P.

    2002-12-02

Geant4 is a toolkit developed by a collaboration of physicists and computer professionals in the high energy physics field for simulation of the passage of particles through matter. The motivation for the development of the Beam Tools is to extend Geant4 applications to accelerator physics. The Beam Tools are a set of C++ classes designed to facilitate the simulation of accelerator elements: r.f. cavities, magnets, absorbers, etc. These elements are constructed from Geant4 solid volumes like boxes, tubes, trapezoids, or spheres. There are many computer programs for beam physics simulations, but Geant4 is ideal for modeling a beam through a material or integrating a beam line with a complex detector. There are many such examples in the current international high energy physics programs. For instance, an essential part of the R&D associated with the Neutrino Source/Muon Collider accelerator is the ionization cooling channel, a section of the system intended to reduce the size of the muon beam in phase space. The ionization cooling technique uses a combination of linacs and light absorbers to reduce the transverse momentum and size of the beam, while keeping the longitudinal momentum constant. The MuCool/MICE (muon cooling) experiments need accurate simulations of the beam transport through the cooling channel in addition to a detailed simulation of the detectors designed to measure the size of the beam. The accuracy of the models for physics processes associated with muon ionization and multiple scattering is critical in this type of application. Another example is the simulation of the interaction region in future accelerators. The high luminosity and background environments expected at the Next Linear Collider (NLC) and the Very Large Hadron Collider (VLHC) place great demands on the detectors, which may be optimized by means of a simulation of the detector-accelerator interface.

  14. Geant4-DNA simulation of electron slowing-down spectra in liquid water

    NASA Astrophysics Data System (ADS)

    Incerti, S.; Kyriakou, I.; Tran, H. N.

    2017-04-01

    This work presents the simulation of monoenergetic electron slowing-down spectra in liquid water by the Geant4-DNA extension of the Geant4 Monte Carlo toolkit (release 10.2p01). These spectra are simulated for several incident energies using the most recent Geant4-DNA physics models, and they are compared to literature data. The influence of Auger electron production is discussed. For the first time, a dedicated Geant4-DNA example allowing such simulations is described and is provided to Geant4 users, allowing further verification of Geant4-DNA track structure simulation capabilities.

  15. Geant4 models for simulation of hadron/ion nuclear interactions at moderate and low energies.

    NASA Astrophysics Data System (ADS)

    Ivantchenko, Anton; Ivanchenko, Vladimir; Quesada, Jose-Manuel; Wright, Dennis

    The Geant4 toolkit is intended for Monte Carlo simulation of particle transport in media. It was initially designed for High Energy Physics purposes such as experiments at the Large Hadron Collider (LHC) at CERN. The toolkit offers a set of models allowing effective simulation of cosmic ray interactions with different materials. For moderate and low energy hadron/ion interactions with nuclei there are a number of competitive models: the Binary and Bertini intra-nuclear cascade models, the quantum molecular dynamics model (QMD), the INCL/ABLA cascade model, and the Chiral Invariant Phase Space Decay model (CHIPS). We report the status of these models for the recent version of Geant4 (release 9.3, December 2009). The Bertini cascade internal cross sections were upgraded. The native Geant4 precompound and de-excitation models were used in the Binary cascade and QMD. They were significantly improved, including emission of light fragments, the Fermi break-up model, the General Evaporation Model (GEM), the multi-fragmentation model, and the fission model. Comparisons between model predictions and thin-target data for neutron, proton, light-ion, and isotope production are presented and discussed. These validations focus on target materials important for space missions.

  16. Application of Geant4 in routine close geometry gamma spectroscopy for environmental samples.

    PubMed

    Dababneh, Saed; Al-Nemri, Ektimal; Sharaf, Jamal

    2014-08-01

    This work examines the utilization of Geant4 to practically achieve crucial corrections, in close geometry, for self-absorption and true coincidence summing in gamma-ray spectrometry of environmental samples, namely soil and water. After validation, different simulation options have been explored and compared. The simulation was used to correct for self-absorption effects, and to establish a summing-free efficiency curve, thus overcoming limitations and uncertainties imposed by conventional calibration standards. To be applicable in busy laboratories, simulation results were introduced into the conventional software Genie 2000 in order to be reliably used in everyday routine measurements.
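
    The corrections described above enter the usual activity equation as multiplicative factors on the full-energy-peak efficiency. A minimal sketch with illustrative numbers (none taken from the paper or from Genie 2000):

    ```python
    # Applying Monte Carlo-derived correction factors in a gamma-spectrometry
    # activity calculation. Factor names and values are illustrative
    # assumptions, not outputs of the paper's Geant4 model.

    def activity_bq(net_counts, live_time_s, efficiency, gamma_yield,
                    c_self_abs=1.0, c_summing=1.0):
        """Activity in Bq; the corrections scale the apparent efficiency."""
        eff_corrected = efficiency * c_self_abs * c_summing
        return net_counts / (live_time_s * eff_corrected * gamma_yield)

    # Example: 10 000 net counts in 3600 s, 2% full-energy-peak efficiency,
    # 85% emission probability, simulated corrections for a close geometry.
    a = activity_bq(10_000, 3600, 0.02, 0.85, c_self_abs=0.92, c_summing=0.88)
    ```

    In close geometry both factors are below unity, so neglecting them would underestimate the activity.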

  17. Design software for reuse

    NASA Technical Reports Server (NTRS)

    Tracz, Will

    1990-01-01

    Viewgraphs are presented on the designing of software for reuse. Topics include terminology, software reuse maxims, the science of programming, an interface design example, a modularization example, and reuse and implementation guidelines.

  18. GEANT4 simulations of Cherenkov reaction history diagnostics

    SciTech Connect

    Rubery, M. S.; Horsfield, C. J.; Herrmann, H. W.; Kim, Y.; Mack, J. M.; Young, C. S.; Caldwell, S. E.; Evans, S. C.; Sedilleo, T. J.; McEvoy, A.; Miller, E. K.; Stoeffl, W.; Ali, Z.

    2010-10-15

    This paper compares the results from a GEANT4 simulation of the gas Cherenkov detector 1 (GCD1) with previous simulations and experimental data from the Omega laser facility. The GCD1 collects gammas emitted during a deuterium-tritium capsule implosion and converts them, through several processes, to Cherenkov light. Photon signals are recorded using subnanosecond photomultiplier tubes, producing burn reaction histories. The GEANT4 GCD1 simulation is first benchmarked against ACCEPT, an integrated tiger series code, with good agreement. The simulation is subsequently compared with data from the Omega laser facility, where experiments have been performed to measure the effects of Hohlraum materials on reaction history signals, in preparation for experiments at the National Ignition Facility.

  19. Accurate simulations of TEPC neutron spectra using Geant4

    NASA Astrophysics Data System (ADS)

    Taylor, G. C.; Hawkes, N. P.; Shippen, A.

    2015-11-01

    A Geant4 model of a tissue-equivalent proportional counter (TEPC) has been developed in which the calculated output spectrum exhibits unparalleled agreement with experiment for monoenergetic neutron fields at several energies below 20 MeV. The model uses the standard release of the Geant4 9.6 p2 code, but with a non-standard neutron cross section file as provided by Mendoza et al., and with the environment variable options recommended by the same authors. This configuration was found to produce significant improvements in the alpha-dominated region of the calculated response. In this paper, these improvements are presented, and the post-processing required to convert deposited energy into the number of ion pairs (which is the quantity actually measured experimentally) is discussed.
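
    The post-processing from deposited energy to ion pairs mentioned above is, to first order, a division by the gas mean W-value. A sketch assuming W ≈ 30 eV, a typical figure for tissue-equivalent gas rather than the paper's exact value:

    ```python
    # Converting deposited energy into an expected number of ion pairs via a
    # mean W-value. W = 30 eV is an assumed round number, not the paper's.
    W_EV = 30.0  # mean energy expended per ion pair, eV (assumed)

    def ion_pairs(e_dep_kev):
        """Expected ion-pair count for a deposited energy given in keV."""
        return e_dep_kev * 1e3 / W_EV

    # A 150 keV deposit yields ~5000 ion pairs under this assumption.
    n = ion_pairs(150.0)
    ```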

  20. Instructional Software Design Principles.

    ERIC Educational Resources Information Center

    Hazen, Margret

    1985-01-01

    Discusses learner/computer interaction, learner control, sequencing of instructional events, and graphic screen design as effective principles for the design of instructional software, including tutorials. (MBR)

  1. GEANT4 Tuning For pCT Development

    NASA Astrophysics Data System (ADS)

    Yevseyeva, Olga; de Assis, Joaquim T.; Evseev, Ivan; Schelin, Hugo R.; Paschuk, Sergei A.; Milhoretto, Edney; Setti, João A. P.; Díaz, Katherin S.; Hormaza, Joel M.; Lopes, Ricardo T.

    2011-08-01

    Proton beams in medical applications deal with relatively thick targets like the human head or trunk. Thus, the fidelity of proton computed tomography (pCT) simulations as a tool for proton therapy planning depends, in the general case, on the accuracy of results obtained for proton interactions with thick absorbers. GEANT4 simulations of proton energy spectra after passing through thick absorbers do not agree well with existing experimental data, as shown previously. Moreover, the spectra simulated for the Bethe-Bloch domain showed an unexpected sensitivity to the choice of low-energy electromagnetic models during code execution. These observations were made with GEANT4 version 8.2 during our simulations for pCT. This work describes in more detail the simulations of proton passage through aluminum absorbers of varied thickness. The simulations were done by modifying only the geometry in the Hadrontherapy Example, for all available choices of electromagnetic physics models. As the most probable reasons for these effects are either some specific feature of the code or some implicit parameters in the GEANT4 manual, we continued our study with version 9.2 of the code. Some improvements over our previous results were obtained. The simulations were performed with further applications to pCT development in mind.
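
    For reference, the Bethe-Bloch domain mentioned above is the regime where the first-order Bethe formula describes the mean stopping power well. A textbook sketch for protons in aluminum follows (the standard formula, not GEANT4's implementation; shell and density corrections are omitted):

    ```python
    import math

    # First-order Bethe mass stopping power for a proton, with standard
    # constants: K = 4*pi*N_A*r_e^2*m_e*c^2 = 0.307075 MeV cm^2/mol.
    K = 0.307075   # MeV cm^2 / mol
    ME = 0.511     # electron rest energy, MeV
    MP = 938.272   # proton rest energy, MeV

    def bethe_dedx(t_mev, z=13, a=26.98, i_ev=166.0):
        """Mass stopping power (MeV cm^2/g) for a proton of kinetic
        energy t_mev in a material with atomic number z, mass a, and
        mean excitation energy i_ev (defaults: aluminum)."""
        gamma = 1.0 + t_mev / MP
        beta2 = 1.0 - 1.0 / gamma ** 2
        bg2 = gamma ** 2 - 1.0                 # (beta*gamma)^2
        tmax = 2 * ME * bg2 / (1 + 2 * gamma * ME / MP + (ME / MP) ** 2)
        i_mev = i_ev * 1e-6
        arg = 2 * ME * bg2 * tmax / i_mev ** 2
        return K * (z / a) / beta2 * (0.5 * math.log(arg) - beta2)

    # ~5.7 MeV cm^2/g for a 100 MeV proton in aluminum, close to
    # tabulated stopping-power values.
    print(round(bethe_dedx(100.0), 2))
    ```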

  2. Measuring software design

    NASA Technical Reports Server (NTRS)

    1986-01-01

    An extensive series of studies of software design measures conducted by the Software Engineering Laboratory is described. Included are the objectives and results of the studies, the method used to perform the studies, and the problems encountered. The document should be useful to researchers planning similar studies as well as to managers and designers concerned with applying quantitative design measures.

  3. GEANT4 for breast dosimetry: parameters optimization study.

    PubMed

    Fedon, C; Longo, F; Mettivier, G; Longo, R

    2015-08-21

    Mean glandular dose (MGD) is the main dosimetric quantity in mammography. MGD is evaluated by multiplying the entrance skin air kerma (ESAK) by normalized glandular dose (DgN) coefficients. While ESAK is an empirical quantity, DgN coefficients can only be estimated with Monte Carlo (MC) methods. Thus, a benchmark of MC parameters is needed for effectively evaluating DgN coefficients. GEANT4 is a MC toolkit suitable for medical purposes that offers users several computational choices. In this work we investigate GEANT4's performance by testing the main PhysicsLists for medical applications. Four electromagnetic PhysicsLists were implemented: linear attenuation coefficients were calculated for breast glandularities of 0%, 50%, and 100% in the energy range 8-50 keV, and DgN coefficients were evaluated. The results were compared with published data. Fit equations for the estimation of the G-factor parameter, introduced in the literature for converting the dose delivered in the heterogeneous medium to that in the glandular tissue, are proposed, and the application of this parameter interaction-by-interaction or retrospectively is discussed. G4EmLivermorePhysicsList shows the best agreement of the linear attenuation coefficients with both theoretical values and published data. Moreover, an excellent correlation (r² > 0.99) with the literature is found for the DgN coefficients. The final goal of this study is to identify, for the first time, a benchmark of parameters that could be useful for future breast dosimetry studies with GEANT4.
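
    The dose chain described above reduces to a single multiplication; the DgN value below is a placeholder assumption, not one of the paper's coefficients:

    ```python
    # MGD = ESAK * DgN, with DgN supplied by a Monte Carlo calculation.
    # The example DgN of 0.22 mGy/mGy is illustrative only.

    def mean_glandular_dose(esak_mgy, dgn):
        """MGD in mGy from entrance skin air kerma (mGy) and a
        normalized glandular dose coefficient (mGy per mGy of ESAK)."""
        return esak_mgy * dgn

    mgd = mean_glandular_dose(5.0, 0.22)  # -> 1.1 mGy for this assumed DgN
    ```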

  4. GEANT4 for breast dosimetry: parameters optimization study

    NASA Astrophysics Data System (ADS)

    Fedon, C.; Longo, F.; Mettivier, G.; Longo, R.

    2015-08-01

    Mean glandular dose (MGD) is the main dosimetric quantity in mammography. MGD is evaluated by multiplying the entrance skin air kerma (ESAK) by normalized glandular dose (DgN) coefficients. While ESAK is an empirical quantity, DgN coefficients can only be estimated with Monte Carlo (MC) methods. Thus, a benchmark of MC parameters is needed for effectively evaluating DgN coefficients. GEANT4 is a MC toolkit suitable for medical purposes that offers users several computational choices. In this work we investigate GEANT4's performance by testing the main PhysicsLists for medical applications. Four electromagnetic PhysicsLists were implemented: linear attenuation coefficients were calculated for breast glandularities of 0%, 50%, and 100% in the energy range 8-50 keV, and DgN coefficients were evaluated. The results were compared with published data. Fit equations for the estimation of the G-factor parameter, introduced in the literature for converting the dose delivered in the heterogeneous medium to that in the glandular tissue, are proposed, and the application of this parameter interaction-by-interaction or retrospectively is discussed. G4EmLivermorePhysicsList shows the best agreement of the linear attenuation coefficients with both theoretical values and published data. Moreover, an excellent correlation (r² > 0.99) with the literature is found for the DgN coefficients. The final goal of this study is to identify, for the first time, a benchmark of parameters that could be useful for future breast dosimetry studies with GEANT4.

  5. Calibration of the radiation monitor onboard Akebono using Geant4

    NASA Astrophysics Data System (ADS)

    Asai, Keiko; Takashima, Takeshi; Koi, Tatsumi; Nagai, Tsugunobu

    Natural high-energy electrons and protons (keV-MeV) in space contaminate each other's data. In order to calibrate the energy ranges and to remove data contamination on the radiation monitor (RDM) onboard the Japanese satellite Akebono (EXOS-D), the detector is investigated using the Geant4 simulation toolkit for computational particle tracing. The semi-polar orbiting Akebono, launched in February 1989, is still active. The satellite has observed the space environment at altitudes of several thousand km. The RDM instrument onboard Akebono monitors energetic particles in the Earth's radiation belt and provides important data accumulated over about two solar cycles. The RDM data cover electrons in three energy channels of ≥ 0.3 MeV, protons in three energy channels of ≥ 30 MeV, and alpha particles in one energy channel of 15-45 MeV. The energy ranges, however, are based on information from about 20 years ago, so the data appear to contain some errors. In addition, the electron and proton data contaminate each other. In particular, it is noticed that the electron data are contaminated by solar protons, but the amount of contamination is not known quantitatively. We therefore need data calibration in order to correct the energy ranges and to remove the contamination. The Geant4 simulation gives information on the trajectories of incident and secondary particles as they interact with materials. We examine the RDM using the Geant4 simulation. We find from the results that relativistic MeV electrons behave quite complicatedly because of particle-material interactions in the instrument. The results indicate that detection and contamination efficiencies are energy dependent. This study compares the electron data from Akebono RDM with simultaneous CRRES observations and attempts to derive correction values for each of the energy channels.

  6. Neutron shielding for a new projected proton therapy facility: A Geant4 simulation study.

    PubMed

    Cadini, Francesco; Bolst, David; Guatelli, Susanna; Beltran, Chris; Jackson, Michael; Rosenfeld, Anatoly B

    2016-12-01

    In this work, we used the Monte Carlo-based Geant4 simulation toolkit to calculate the ambient dose equivalents due to the secondary neutron field produced in a new projected proton therapy facility. In particular, the facility geometry was modeled in Geant4 based on the CAD design. Proton beams were generated with an energy of 250 MeV in the gantry rooms at different angles with respect to the patient; a fixed 250 MeV proton beam was also modeled. The ambient dose equivalent was calculated in several locations of interest inside and outside the facility, for different scenarios. The simulation results were compared qualitatively to previous work on an existing facility bearing some similarities to the design under study, showing that the ambient dose equivalent ranges obtained are reasonable. The ambient dose equivalents calculated by means of the Geant4 simulation were compared to the Australian regulatory limits and showed that the new facility will not pose health risks to the public or staff, with a maximum equivalent dose rate of 7.9 mSv/y in the control rooms and maze exit areas and 1.3·10⁻¹ mSv/y close to the walls outside the facility, under very conservative assumptions. This work represents the first neutron shielding verification analysis of a new projected proton therapy facility and, as such, may serve as a new source of comparison and validation for the international community, besides confirming the viability of the project from a radioprotection point of view.

  7. Antinucleus-Nucleus Cross Sections Implemented in Geant4

    SciTech Connect

    Uzhinsky, V.; Apostolakis, J.; Galoyan, A.; Folger, G.; Grichine, V.M.; Ivanchenko, V.N.; Wright, D.H.; /SLAC

    2012-04-26

    Cross sections for the interactions of antinuclei (p̄, d̄, t̄, anti-³He, anti-⁴He) with nuclei in the energy range 100 MeV/c to 1000 GeV/c per antinucleon are calculated in the Glauber approximation, which provides a good description of all known p̄A cross sections. The results were obtained using a new parameterization of the total and elastic p̄p cross sections. Simple parameterizations of the antinucleus-nucleus cross sections are proposed for use in estimating the efficiency of antinucleus detection and tracking in cosmic ray and accelerator experiments. These parameterizations are implemented in the Geant4 toolkit.

  8. Progress in Hadronic Physics Modelling in Geant4

    SciTech Connect

    Apostolakis, John; Folger, Gunter; Grichine, Vladimir; Heikkinen, Aatos; Howard, Alexander; Ivanchenko, Vladimir; Kaitaniemi, Pekka; Koi, Tatsumi; Kosov, Mikhail; Quesada, Jose Manuel; Ribon, Alberto; Uzhinsky, Vladimir; Wright, Dennis; /SLAC

    2011-11-28

    Geant4 offers a set of models to simulate hadronic showers in calorimeters. Recent improvements to several models relevant to the modelling of hadronic showers are discussed. These include improved cross sections, a revision of the FTF model, the addition of quasi-elastic scattering to the QGS model, and enhancements in the nuclear precompound and de-excitation models. The validation of physics models against thin target experiments has been extended especially in the energy region 10 GeV and below. Examples of new validation results are shown.

  9. Distributed geant4 simulation in medical and space science applications using DIANE framework and the GRID

    NASA Astrophysics Data System (ADS)

    Mościcki, Jakub T.; Guatelli, Susanna; Mantero, Alfonso; Pia, M. G.

    2003-09-01

    Distributed computing is one of the most important trends in IT, which has recently gained significance for large-scale scientific applications. The Distributed Analysis Environment (DIANE) [1] is an R&D study, conducted at CERN, focusing on semi-interactive parallel and remote data analysis and simulation. DIANE provides the necessary software infrastructure for parallel scientific applications in the master-worker model. Advanced error-recovery policies, automatic book-keeping of distributed jobs, and on-line monitoring and control tools are provided. DIANE makes transparent use of a number of different middleware implementations, such as load-balancing services (LSF, PBS, GRID Resource Broker, Condor) and security services (GSI, Kerberos, openssh). A number of distributed Geant4 simulations have been deployed and tested, ranging from interactive radiotherapy treatment planning using dedicated clusters in hospitals to globally distributed simulations of astrophysics experiments using the European Data Grid middleware. This paper describes the general concepts behind the DIANE framework and the results of the first tests with distributed Geant4 simulations.

  10. Geant4 supplied parameters for gamma reaction history at NIF

    NASA Astrophysics Data System (ADS)

    Rubery, Michael; Horsfield, Colin; Herrmann, Hans; Kim, Yongho; Mack, Joe; Young, Carl; Evans, Scott; Sedillo, Tom; Miller, Kirk; Stoeffl, Wolfgang; Grafil, Elliot

    2011-10-01

    The GRH diagnostics at NIF and Omega report ICF burn parameters through detection of multi-MeV γ emissions. Of particular interest is 'γ bang-time' (GBT), defined as the temporal separation between light impacting the capsule and the peak in the nuclear reaction history; GBT can constrain shock and compression parameters and indicate fuel/ablator mix. Early NIF commissioning experiments have identified contributions to GRH signals from (n,n'γ) reactions with the remaining capsule ablator, hohlraum, and thermo-mechanical package, outside the fuel hot-spot region. Such contributions are mitigated by increasing the Cherenkov threshold above the energy of these emissions. The pressure adjustment modifies parameters important to GBT, such as cell time-of-flight and detector FWHM; corrections simulated using Geant4 are presented, using models experimentally validated at Duke University. Beyond GBT, studies suggest GRH may be capable of recording ablator ρR, unfolding the DT γ spectrum, and inferring the DTγ/DTn branching ratio. All calculations rely on the energy-resolved intensity response as a function of gas pressure. Geant4 response simulations, together with LANL calculations using the experimentally validated ACCEPT code, are also presented.
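
    The pressure-threshold trade-off described above follows from the gas refractivity scaling roughly linearly with pressure, n(P) ≈ 1 + kP: raising the pressure lowers the electron Cherenkov threshold. A sketch with an assumed refractivity constant (a round number of the order of CO2 at STP, not a value from the paper):

    ```python
    import math

    # Cherenkov threshold for an electron in a gas whose refractivity
    # scales linearly with pressure. The constant k is an assumption.
    ME = 0.511  # electron rest energy, MeV

    def cherenkov_threshold_mev(pressure_atm, k=4.5e-4):
        """Kinetic energy (MeV) above which an electron radiates:
        beta > 1/n, i.e. gamma_th = n / sqrt(n^2 - 1)."""
        n = 1.0 + k * pressure_atm
        gamma_th = n / math.sqrt(n * n - 1.0)
        return ME * (gamma_th - 1.0)

    # Raising the pressure lowers the threshold, which is how a gas
    # Cherenkov detector selects which gamma-induced electrons count.
    assert cherenkov_threshold_mev(2.0) < cherenkov_threshold_mev(1.0)
    ```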

  11. Evaluation of open MPI and MPICH2 performances for the computation time in proton therapy dose calculations with Geant4

    NASA Astrophysics Data System (ADS)

    Kazemi, M.; Afarideh, H.; Riazi, Z.

    2015-11-01

    The aim of this work is to use a better parallel software structure to improve the performance of the Monte Carlo Geant4 code in proton treatment planning. The hadron therapy simulation is rewritten to run in parallel on shared-memory multiprocessor systems using the Message-Passing Interface (MPI). The speedup of the code has been studied using two MPI-compliant libraries, Open MPI and MPICH2, separately. The speedup results are almost linear for both Open MPI and MPICH2; the latter was chosen because of its better characteristics and lower computation time. The Geant4 parameters, including the step limiter and the set cut, have been analyzed to minimize the simulation time as much as possible. For a reasonable compromise between the spatial dose distribution and the calculation time, the time reduction factor reaches about 157.
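
    The event-level parallelism exploited here is not tied to MPI: independent particle histories can be farmed out to workers and the partial tallies summed. The sketch below illustrates the same master-worker split with Python's standard multiprocessing module; the toy simulate_chunk stands in for a Geant4 run with its own seed and is an assumption of this sketch, not code from the paper.

    ```python
    from multiprocessing import Pool
    import random

    def simulate_chunk(args):
        """Worker: run n_events toy histories with an independent seed."""
        seed, n_events = args
        rng = random.Random(seed)
        # toy observable standing in for a dose tally
        return sum(rng.random() for _ in range(n_events))

    def run_parallel(total_events, n_workers=4):
        """Master: split the event loop across workers, sum partial tallies."""
        per_worker = total_events // n_workers
        jobs = [(seed, per_worker) for seed in range(n_workers)]
        with Pool(n_workers) as pool:
            partial = pool.map(simulate_chunk, jobs)
        return sum(partial)

    if __name__ == "__main__":
        print(run_parallel(100_000))
    ```

    Because histories are independent, the workload scales almost linearly with worker count, which matches the near-linear speedup the paper reports for both MPI libraries.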

  12. A modular Geant4 model of Leksell Gamma Knife Perfexion™

    NASA Astrophysics Data System (ADS)

    Pipek, J.; Novotný, J.; Novotný, J., Jr.; Kozubíková, P.

    2014-12-01

    This work presents a Monte Carlo model of Leksell Gamma Knife Perfexion as well as the main parameters of the dose distribution in the standard phantom obtained using this model. The model is developed in the Geant4 simulation toolkit in a modular way which enables its reuse in other Perfexion studies. Large phase space files were created, containing particles that are entering the inner machine cavity after being transported through the collimation system. All 14 output factors of the machine and effective output factors for both the 4 mm (0.830 ± 0.009) and 8 mm (0.921 ± 0.004) collimators were calculated. Dose profiles along the main axes are also included for each collimator size. All results are compared to the values obtained from the treatment planning system, from experiments, and from other Monte Carlo models.
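
    Output factors like the 0.830 ± 0.009 quoted above are dose ratios relative to the reference collimator, with the statistical uncertainty propagated in quadrature. A minimal sketch with hypothetical dose tallies (the paper's raw tallies are not given here):

    ```python
    import math

    # Output factor = D(collimator) / D(reference), with relative
    # uncertainties added in quadrature. Input numbers are hypothetical.

    def output_factor(d_coll, sd_coll, d_ref, sd_ref):
        """Return (factor, absolute uncertainty) for two dose tallies."""
        f = d_coll / d_ref
        rel = math.hypot(sd_coll / d_coll, sd_ref / d_ref)
        return f, f * rel

    # Hypothetical tallies chosen to land near the quoted 4 mm value.
    f, u = output_factor(83.0, 0.8, 100.0, 0.5)
    ```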

  13. Simulation of a Helical Channel using GEANT4

    SciTech Connect

    Elvira, V. D.; Lebrun, P.; Spentzouris, P.

    2001-02-01

    We present a simulation of a 72 m long cooling channel proposed by V. Balbekov based on the helical cooling concept developed by Ya. Derbenev. LiH wedge absorbers provide the energy loss mechanism and 201 MHz cavities are used for re-acceleration. They are placed inside a main solenoidal field to focus the beam. A helical field with an amplitude of 0.3 T and a period of 1.8 m provides momentum dispersion for emittance exchange. The simulation is performed using GEANT4. The total fractional transmission is 0.85, and the transverse, longitudinal, and 3-D cooling factors are 3.75, 2.27, and 14.61, respectively. Some version of this helical channel could eventually be used to replace the first section of the double flip channel to keep the longitudinal emittance under control and increase transmission. Although this is an interesting option, the technical challenges are still significant.

  14. A modular Geant4 model of Leksell Gamma Knife Perfexion™.

    PubMed

    Pipek, J; Novotný, J; Novotný, J; Kozubíková, P

    2014-12-21

    This work presents a Monte Carlo model of Leksell Gamma Knife Perfexion as well as the main parameters of the dose distribution in the standard phantom obtained using this model. The model is developed in the Geant4 simulation toolkit in a modular way which enables its reuse in other Perfexion studies. Large phase space files were created, containing particles that are entering the inner machine cavity after being transported through the collimation system. All 14 output factors of the machine and effective output factors for both the 4 mm (0.830 ± 0.009) and 8 mm (0.921 ± 0.004) collimators were calculated. Dose profiles along the main axes are also included for each collimator size. All results are compared to the values obtained from the treatment planning system, from experiments, and from other Monte Carlo models.

  15. Simulation loop between cad systems, GEANT-4 and GeoModel: Implementation and results

    NASA Astrophysics Data System (ADS)

    Sharmazanashvili, A.; Tsutskiridze, Niko

    2016-09-01

    Comparative analysis of simulated and as-built geometry descriptions of a detector is an important field of study for data-vs-Monte-Carlo discrepancies. Shape consistency and level of detail are not critical, whereas the adequacy of the volumes and weights of detector components is essential for tracking. There are two main reasons for faults in simulation geometry descriptions: (1) differences between the simulated and as-built geometry descriptions; (2) internal inaccuracies of geometry transformations added by the simulation software infrastructure itself. The Georgian engineering team developed a hub on the basis of the CATIA platform, together with several tools that enable the different descriptions used by simulation packages to be read into CATIA: XML->CATIA, VP1->CATIA, GeoModel->CATIA, and Geant4->CATIA. As a result it becomes possible to compare the different descriptions with each other using the full power of CATIA and to investigate both classes of geometry-description faults. The paper presents the results of case studies of the ATLAS Coils and End-Cap toroid structures.

  16. Calibration and GEANT4 Simulations of the Phase II Proton Compute Tomography (pCT) Range Stack Detector

    SciTech Connect

    Uzunyan, S. A.; Blazey, G.; Boi, S.; Coutrakon, G.; Dyshkant, A.; Francis, K.; Hedin, D.; Johnson, E.; Kalnins, J.; Zutshi, V.; Ford, R.; Rauch, J. E.; Rubinov, P.; Sellberg, G.; Wilson, P.; Naimuddin, M.

    2015-12-29

    Northern Illinois University in collaboration with Fermi National Accelerator Laboratory (FNAL) and Delhi University has been designing and building a proton CT scanner for applications in proton treatment planning. The Phase II proton CT scanner consists of eight planes of tracking detectors with two X and two Y coordinate measurements both before and after the patient. In addition, a range stack detector consisting of a stack of thin scintillator tiles, arranged in twelve eight-tile frames, is used to determine the water equivalent path length (WEPL) of each track through the patient. The X-Y coordinates and WEPL are required input for image reconstruction software to find the relative (proton) stopping power (RSP) value of each voxel in the patient and generate a corresponding 3D image. In this Note we describe tests conducted in 2015 at the proton beam at the Central DuPage Hospital in Warrenville, IL, focusing on the range stack calibration procedure and comparisons with the GEANT4 range stack simulation.

  17. CASE: Software design technologies

    SciTech Connect

    Kalyanov, G.N.

    1994-05-01

    CASE (Computer-Aided Software Engineering) is a set of methodologies for software design, development, and maintenance supported by a complex of interconnected automation tools. CASE is a set of tools for the programmer, analyst, and developer for the automation of software design and development. Today, CASE has become an independent discipline in software engineering that has given rise to a powerful CASE industry made up of hundreds of firms and companies of various kinds. They include companies that develop tools for software analysis and design and have a wide network of distributors and dealers, firms that develop specialized tools for narrow subject areas or for individual stages of the software life cycle, firms that organize seminars and courses for specialists, consulting firms, which demonstrate the practical power of CASE toolkits for specific applications, and companies specializing in the publication of periodicals and bulletins on CASE. The principal purchasers of CASE toolkits abroad are military organizations, data-processing centers, and commercial software developers.

  18. Software architecture design domain

    SciTech Connect

    White, S.A.

    1996-12-31

    Software architectures can provide a basis for the capture and subsequent reuse of design knowledge. The goal of software architecture is to allow the design of a system to take place at a higher level of abstraction: a level concerned with components, connections, constraints, and rationale. This architectural view of software adds a new layer of abstraction to the traditional design phase of software development. It has resulted in a flurry of activity towards techniques, tools, and architectural design languages developed specifically to assist with this activity. An analysis of architectural descriptions, even though they differ in notation, shows a common set of key constructs that are present across widely varying domains. These common aspects form a core set of constructs that should belong to any ADL in order for the language to offer the ability to specify software systems at the architectural level. This analysis also revealed a second set of constructs that expand the first set, improving its syntax and semantics. These constructs are classified according to whether they provide representation and analysis support for architectures belonging to many varying application domains (domain-independent constructs) or to a particular application domain (domain-dependent constructs). This paper presents the constructs of these two classes and their placement in the architecture design domain, and shows how they may be used to classify, select, and analyze proclaimed architectural design languages (ADLs).

  19. Diffusion-controlled reactions modeling in Geant4-DNA

    SciTech Connect

    Karamitros, M.; Luan, S.; Bernal, M.A.; Allison, J.; Baldacchino, G.; Davidkova, M.; Francis, Z.; Friedland, W.; Ivantchenko, V.; Ivantchenko, A.; Mantero, A.; Nieminem, P.; Santin, G.; Tran, H.N.; Stepan, V.; Incerti, S.

    2014-10-01

    Context: Under irradiation, a biological system undergoes a cascade of chemical reactions that can lead to an alteration of its normal operation. There are different types of radiation and many competing reactions, so the kinetics of the chemical species is extremely complex. Simulation then becomes a powerful tool which, by describing the basic principles of chemical reactions, can reveal the dynamics of the macroscopic system. To understand the dynamics of biological systems under radiation, there have been on-going efforts since the 1980s by several research groups to establish a mechanistic model describing all the physical, chemical, and biological phenomena that follow the irradiation of single cells. This approach is generally divided into a succession of stages that follow each other in time: (1) the physical stage, where the ionizing particles interact directly with the biological material; (2) the physico-chemical stage, where the targeted molecules release their energy by dissociating, creating new chemical species; (3) the chemical stage, where the new chemical species interact with each other or with the biomolecules; (4) the biological stage, where the repair mechanisms of the cell come into play. This article focuses on the modeling of the chemical stage. Method: This article presents a general method for speeding up chemical reaction simulations in fluids, based on the Smoluchowski equation and Monte Carlo methods, in which all molecules are explicitly simulated and the solvent is treated as a continuum. The model describes diffusion-controlled reactions. This method has been implemented in Geant4-DNA. The keys to the new algorithm include: (1) the combination of a method to compute time steps dynamically with a Brownian bridge process to account for chemical reactions, which avoids costly fixed-time-step simulations; (2) a k-d tree data structure for quickly locating, for a given molecule, its closest reactants. The
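
    The two algorithmic keys listed in this abstract can be sketched in miniature: choose a time step from the distance between the closest pair of reactants, then diffuse explicit molecules with Gaussian Brownian steps. All constants below are illustrative, and a brute-force nearest-pair search stands in for the paper's k-d tree.

    ```python
    import math
    import random

    D = 2.8e-9        # diffusion coefficient, m^2/s (illustrative)
    R_REACT = 1.0e-9  # reaction radius, m (illustrative)

    def nearest_pair_distance(positions):
        """Brute-force closest-pair distance (the paper uses a k-d tree)."""
        best = math.inf
        for i in range(len(positions)):
            for j in range(i + 1, len(positions)):
                best = min(best, math.dist(positions[i], positions[j]))
        return best

    def dynamic_time_step(positions):
        """Pick dt so the RMS diffusive displacement matches the gap
        between the closest pair and the reaction radius: <r^2> = 6*D*dt."""
        gap = max(nearest_pair_distance(positions) - R_REACT, R_REACT)
        return gap ** 2 / (6.0 * D)

    def brownian_step(positions, dt, rng):
        """Free Brownian diffusion: each coordinate gets a Gaussian kick."""
        sigma = math.sqrt(2.0 * D * dt)
        return [tuple(x + rng.gauss(0.0, sigma) for x in p) for p in positions]

    rng = random.Random(1)
    mols = [(0.0, 0.0, 0.0), (5e-9, 0.0, 0.0), (0.0, 8e-9, 0.0)]
    dt = dynamic_time_step(mols)   # large when molecules are far apart
    mols = brownian_step(mols, dt, rng)
    ```

    The dynamic step is the point of the scheme: when molecules are far apart the simulation takes long strides, and the step shrinks only as a pair approaches the reaction radius.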

  20. Diffusion-controlled reactions modeling in Geant4-DNA

    NASA Astrophysics Data System (ADS)

    Karamitros, M.; Luan, S.; Bernal, M. A.; Allison, J.; Baldacchino, G.; Davidkova, M.; Francis, Z.; Friedland, W.; Ivantchenko, V.; Ivantchenko, A.; Mantero, A.; Nieminen, P.; Santin, G.; Tran, H. N.; Stepan, V.; Incerti, S.

    2014-10-01

    Context Under irradiation, a biological system undergoes a cascade of chemical reactions that can alter its normal operation. There are different types of radiation and many competing reactions, so the kinetics of the chemical species is extremely complex. Simulation then becomes a powerful tool which, by describing the basic principles of chemical reactions, can reveal the dynamics of the macroscopic system. To understand the dynamics of biological systems under radiation, several research groups have been working since the 1980s to establish a mechanistic model that describes all the physical, chemical and biological phenomena following the irradiation of single cells. This approach is generally divided into a succession of stages that follow each other in time: (1) the physical stage, where the ionizing particles interact directly with the biological material; (2) the physico-chemical stage, where the targeted molecules release their energy by dissociating, creating new chemical species; (3) the chemical stage, where the new chemical species interact with each other or with the biomolecules; (4) the biological stage, where the repair mechanisms of the cell come into play. This article focuses on the modeling of the chemical stage. Method This article presents a general method for speeding up chemical reaction simulations in fluids, based on the Smoluchowski equation and Monte Carlo methods, in which all molecules are explicitly simulated and the solvent is treated as a continuum. The model describes diffusion-controlled reactions. This method has been implemented in Geant4-DNA. The keys to the new algorithm include: (1) the combination of a method to compute time steps dynamically with a Brownian bridge process to account for chemical reactions, which avoids costly fixed-time-step simulations; (2) a k-d tree data structure for quickly locating, for a given molecule, its closest reactants.
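    The two algorithmic keys named in the abstract, the Brownian-bridge reaction check and the nearest-reactant lookup, can be sketched as follows. This is a minimal illustration, not the Geant4-DNA implementation: the crossing probability uses the standard one-dimensional Brownian-bridge form, `D_sum` stands for the summed diffusion coefficients of the two partners, and a brute-force search stands in for the k-d tree.

    ```python
    import math

    def brownian_bridge_hit_prob(d1, d2, D_sum, dt, R):
        """Probability that two diffusing partners, at center-to-center
        distances d1 (start of step) and d2 (end of step), came within the
        reaction radius R at some moment during a step of length dt.
        One-dimensional Brownian-bridge crossing formula (a common sketch,
        not necessarily the exact Geant4-DNA expression)."""
        if d1 <= R or d2 <= R:
            return 1.0  # already in contact at one of the endpoints
        return math.exp(-(d1 - R) * (d2 - R) / (D_sum * dt))

    def nearest_reactant(pos, reactants):
        """Brute-force closest-reactant lookup; Geant4-DNA uses a k-d tree
        to make this query logarithmic instead of linear."""
        return min(reactants, key=lambda r: math.dist(pos, r))
    ```

    Sampling a uniform random number against the bridge probability decides whether the pair reacts during the step, which is what allows long, dynamically chosen time steps instead of many short fixed ones.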

  1. GATE - Geant4 Application for Tomographic Emission: a simulation toolkit for PET and SPECT

    PubMed Central

    Jan, S.; Santin, G.; Strul, D.; Staelens, S.; Assié, K.; Autret, D.; Avner, S.; Barbier, R.; Bardiès, M.; Bloomfield, P. M.; Brasse, D.; Breton, V.; Bruyndonckx, P.; Buvat, I.; Chatziioannou, A. F.; Choi, Y.; Chung, Y. H.; Comtat, C.; Donnarieix, D.; Ferrer, L.; Glick, S. J.; Groiselle, C. J.; Guez, D.; Honore, P.-F.; Kerhoas-Cavata, S.; Kirov, A. S.; Kohli, V.; Koole, M.; Krieguer, M.; van der Laan, D. J.; Lamare, F.; Largeron, G.; Lartizien, C.; Lazaro, D.; Maas, M. C.; Maigne, L.; Mayet, F.; Melot, F.; Merheb, C.; Pennacchio, E.; Perez, J.; Pietrzyk, U.; Rannou, F. R.; Rey, M.; Schaart, D. R.; Schmidtlein, C. R.; Simon, L.; Song, T. Y.; Vieira, J.-M.; Visvikis, D.; Van de Walle, R.; Wieërs, E.; Morel, C.

    2012-01-01

    Monte Carlo simulation is an essential tool in emission tomography that can assist in the design of new medical imaging devices, the optimization of acquisition protocols, and the development or assessment of image reconstruction algorithms and correction techniques. GATE, the Geant4 Application for Tomographic Emission, encapsulates the Geant4 libraries to achieve a modular, versatile, scripted simulation toolkit adapted to the field of nuclear medicine. In particular, GATE allows the description of time-dependent phenomena such as source or detector movement, and source decay kinetics. This feature makes it possible to simulate time curves under realistic acquisition conditions and to test dynamic reconstruction algorithms. This paper gives a detailed description of the design and development of GATE by the OpenGATE collaboration, whose continuing objective is to improve, document, and validate GATE by simulating commercially available imaging systems for PET and SPECT. Considerable effort is also invested in the ability and the flexibility to model novel detection systems or systems still under design. A public release of GATE licensed under the GNU Lesser General Public License can be downloaded at the address http://www-lphe.epfl.ch/GATE/. Two benchmarks developed for PET and SPECT to test the installation of GATE and to serve as a tutorial for the users are presented. Extensive validation of the GATE simulation platform has been started, comparing simulations and measurements on commercially available acquisition systems. References to those results are listed. The future prospects toward the gridification of GATE and its extension to other domains such as dosimetry are also discussed. PMID:15552416

  2. Thermal neutron response of a boron-coated GEM detector via GEANT4 Monte Carlo code.

    PubMed

    Jamil, M; Rhee, J T; Kim, H G; Ahmad, Farzana; Jeon, Y J

    2014-10-22

    In this work, we report the design configuration and the performance of the hybrid Gas Electron Multiplier (GEM) detector. In order to make the detector sensitive to thermal neutrons, the forward electrode of the GEM has been coated with enriched boron-10 material, which works as a neutron converter. A 5×5 cm² GEM configuration has been used for the thermal neutron studies. The response of the detector has been estimated using the GEANT4 MC code with two different physics lists. Using the QGSP_BIC_HP physics list, the neutron detection efficiency was determined to be about 3%, while with the QGSP_BERT_HP physics list the efficiency was around 2.5%, at an incident thermal neutron energy of 25 meV. The higher response of the detector demonstrates that coating the GEM with a boron converter improves its efficiency for thermal neutron detection.
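    The quoted efficiencies (~3% and ~2.5%) come from counting detected events in a simulated sample of incident neutrons; a minimal sketch of that estimate, with its binomial statistical uncertainty, is below (the counts are hypothetical, not from the paper).

    ```python
    import math

    def detection_efficiency(n_detected, n_incident):
        """Detection efficiency from a counting simulation, together with
        the binomial standard error on the estimate."""
        eff = n_detected / n_incident
        err = math.sqrt(eff * (1.0 - eff) / n_incident)
        return eff, err

    # Hypothetical counts: 3000 detections out of 100000 incident neutrons.
    eff, err = detection_efficiency(3000, 100000)
    ```

    The binomial error term makes explicit how many primaries must be simulated before a 3% vs 2.5% difference between physics lists is statistically meaningful.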

  3. Track structure modeling in liquid water: A review of the Geant4-DNA very low energy extension of the Geant4 Monte Carlo simulation toolkit.

    PubMed

    Bernal, M A; Bordage, M C; Brown, J M C; Davídková, M; Delage, E; El Bitar, Z; Enger, S A; Francis, Z; Guatelli, S; Ivanchenko, V N; Karamitros, M; Kyriakou, I; Maigne, L; Meylan, S; Murakami, K; Okada, S; Payno, H; Perrot, Y; Petrovic, I; Pham, Q T; Ristic-Fira, A; Sasaki, T; Štěpán, V; Tran, H N; Villagrasa, C; Incerti, S

    2015-12-01

    Understanding the fundamental mechanisms involved in the induction of biological damage by ionizing radiation remains a major challenge of today's radiobiology research. The Monte Carlo simulation of physical, physicochemical and chemical processes involved may provide a powerful tool for the simulation of early damage induction. The Geant4-DNA extension of the general purpose Monte Carlo Geant4 simulation toolkit aims to provide the scientific community with an open source access platform for the mechanistic simulation of such early damage. This paper presents the most recent review of the Geant4-DNA extension, as available to Geant4 users since June 2015 (release 10.2 Beta). In particular, the review includes the description of new physical models for the description of electron elastic and inelastic interactions in liquid water, as well as new examples dedicated to the simulation of physicochemical and chemical stages of water radiolysis. Several implementations of geometrical models of biological targets are presented as well, and the list of Geant4-DNA examples is described.

  4. Comparison of GATE/GEANT4 with EGSnrc and MCNP for electron dose calculations at energies between 15 keV and 20 MeV

    NASA Astrophysics Data System (ADS)

    Maigne, L.; Perrot, Y.; Schaart, D. R.; Donnarieix, D.; Breton, V.

    2011-02-01

    The GATE Monte Carlo simulation platform based on the GEANT4 toolkit has come into widespread use for simulating positron emission tomography (PET) and single photon emission computed tomography (SPECT) imaging devices. Here, we explore its use for calculating electron dose distributions in water. Mono-energetic electron dose point kernels and pencil beam kernels in water are calculated for different energies between 15 keV and 20 MeV by means of GATE 6.0, which makes use of the GEANT4 version 9.2 Standard Electromagnetic Physics Package. The results are compared to the well-validated codes EGSnrc and MCNP4C. It is shown that recent improvements made to the GEANT4/GATE software result in significantly better agreement with the other codes. We furthermore illustrate several issues of general interest to GATE and GEANT4 users who wish to perform accurate simulations involving electrons. Provided that the electron step size is sufficiently restricted, GATE 6.0 and EGSnrc dose point kernels are shown to agree to within less than 3% of the maximum dose between 50 keV and 4 MeV, while pencil beam kernels are found to agree to within less than 4% of the maximum dose between 15 keV and 20 MeV.
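    The agreement criterion used above ("within X% of the maximum dose") normalizes point-wise differences to the peak dose rather than to the local dose, which avoids inflating discrepancies in low-dose tails. A minimal sketch of that metric for two sampled kernels (hypothetical helper name and data):

    ```python
    def max_dose_agreement(dose_a, dose_b):
        """Largest absolute difference between two sampled dose curves,
        expressed as a percentage of the maximum dose -- the normalization
        used when two codes 'agree to within X% of the maximum dose'."""
        d_max = max(max(dose_a), max(dose_b))
        worst = max(abs(a - b) for a, b in zip(dose_a, dose_b))
        return 100.0 * worst / d_max
    ```

    With this global normalization, a 2-unit disagreement on a curve peaking at 100 units scores 2%, even if it occurs where the local dose is only a few units.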

  5. Comparison of GATE/GEANT4 with EGSnrc and MCNP for electron dose calculations at energies between 15 keV and 20 MeV.

    PubMed

    Maigne, L; Perrot, Y; Schaart, D R; Donnarieix, D; Breton, V

    2011-02-07

    The GATE Monte Carlo simulation platform based on the GEANT4 toolkit has come into widespread use for simulating positron emission tomography (PET) and single photon emission computed tomography (SPECT) imaging devices. Here, we explore its use for calculating electron dose distributions in water. Mono-energetic electron dose point kernels and pencil beam kernels in water are calculated for different energies between 15 keV and 20 MeV by means of GATE 6.0, which makes use of the GEANT4 version 9.2 Standard Electromagnetic Physics Package. The results are compared to the well-validated codes EGSnrc and MCNP4C. It is shown that recent improvements made to the GEANT4/GATE software result in significantly better agreement with the other codes. We furthermore illustrate several issues of general interest to GATE and GEANT4 users who wish to perform accurate simulations involving electrons. Provided that the electron step size is sufficiently restricted, GATE 6.0 and EGSnrc dose point kernels are shown to agree to within less than 3% of the maximum dose between 50 keV and 4 MeV, while pencil beam kernels are found to agree to within less than 4% of the maximum dose between 15 keV and 20 MeV.

  6. Recent improvements on the description of hadronic interactions in Geant4

    NASA Astrophysics Data System (ADS)

    Dotti, A.; Apostolakis, J.; Folger, G.; Grichine, V.; Ivanchenko, V.; Kosov, M.; Ribon, A.; Uzhinsky, V.; Wright, D. H.

    2011-04-01

    We present an overview of recent improvements of hadronic models in Geant4 for the physics configurations (Physics Lists) relevant to applications in high energy experiments. During the last year, the improvements have concentrated on the study of unphysical discontinuities in calorimeter observables in the transition regions between the models used in Physics Lists. The microscopic origin of these discontinuities has been investigated, and possible improvements to the Geant4 code are currently under validation. In this paper we discuss the status of the latest version of Geant4, with emphasis on the most promising new developments, namely the Fritiof-based and CHIPS Physics Lists.

  7. Recent Improvements on the Description of Hadronic Interactions in Geant4

    SciTech Connect

    Dotti, A.; Apostolakis, J.; Folger, G.; Grichine, V.; Ivanchenko, V.; Kosov, M.; Ribon, A.; Uzhinsky, V.; Wright, D.H.; /SLAC

    2012-06-07

    We present an overview of recent improvements of hadronic models in Geant4 for the physics configurations (Physics Lists) relevant to applications in high energy experiments. During the last year, the improvements have concentrated on the study of unphysical discontinuities in calorimeter observables in the transition regions between the models used in Physics Lists. The microscopic origin of these discontinuities has been investigated, and possible improvements to the Geant4 code are currently under validation. In this paper we discuss the status of the latest version of Geant4, with emphasis on the most promising new developments, namely the Fritiof-based and CHIPS Physics Lists.

  8. Experimental quantification of Geant4 PhysicsList recommendations: methods and results

    NASA Astrophysics Data System (ADS)

    Basaglia, Tullio; Han, Min Cheol; Hoff, Gabriela; Kim, Chan Hyeong; Kim, Sung Hun; Grazia Pia, Maria; Saracco, Paolo

    2015-12-01

    The Geant4 physics_lists package encompasses predefined selections of physics processes and models to be used in simulation applications. Limited documentation is available in the literature about Geant4 pre-packaged PhysicsLists and their validation. The reports in the literature mainly concern specific use cases. This paper documents the epistemological grounds for the validation of Geant4 pre-packaged PhysicsLists (and their accessory classes, Builders and PhysicsConstructors) and some examples of the authors' scientific activity on this subject.

  9. Study on GEANT4 code applications to dose calculation using imaging data

    NASA Astrophysics Data System (ADS)

    Lee, Jeong Ok; Kang, Jeong Ku; Kim, Jhin Kee; Kwon, Hyeong Cheol; Kim, Jung Soo; Kim, Bu Gil; Jeong, Dong Hyeok

    2015-07-01

    The use of the GEANT4 code has increased in the medical field. Various studies have calculated patient dose distributions by using the GEANT4 code with imaging data. In the present study, Monte Carlo simulations based on DICOM data were performed to calculate the dose absorbed in the patient's body. Various visualization tools are installed in the GEANT4 code to display the detector construction; however, the display of DICOM images is limited, and displaying dose distributions on the imaging data of the patient is difficult. Recently, the gMocren code, a volume visualization tool for GEANT4 simulation, was developed and has been used for volume visualization of image files. In this study, imaging of the dose distributions absorbed in the patients was performed using the gMocren code. Dosimetric evaluations were carried out using thermoluminescent dosimeters and film dosimetry to verify the calculated results.

  10. Geant4 validation of neutron production on thick targets bombarded with 120 GeV protons

    NASA Astrophysics Data System (ADS)

    Sabra, Mohammad S.

    2015-09-01

    Neutron energy spectra and angular distributions are calculated for 120 GeV protons on thick graphite, aluminum, copper, and tungsten targets using relevant physics models within the Monte-Carlo simulation package Geant4. The calculations are compared to data from a recent experiment. Discrepancies between the experimental data and the Geant4 models are observed; they suggest that improvements to the intra- (INC) and inter-nuclear cascade processes employed by the models are required.

  11. Designing Educational Software for Tomorrow.

    ERIC Educational Resources Information Center

    Harvey, Wayne

    Designed to address the management and use of computer software in education and training, this paper explores both good and poor software design, calling for improvements in the quality of educational software by attending to design considerations that are based on general principles of learning rather than specific educational objectives. This…

  12. Physical models implemented in the GEANT4-DNA extension of the GEANT-4 toolkit for calculating initial radiation damage at the molecular level.

    PubMed

    Villagrasa, C; Francis, Z; Incerti, S

    2011-02-01

    The ROSIRIS project aims to study the radiobiology of integrated systems for medical treatment optimisation using ionising radiations and evaluate the associated risk. In the framework of this project, one research focus is the interpretation of the initial radio-induced damage in DNA created by ionising radiation (and detected by γ-H2AX foci analysis) from the track structure of the incident particles. In order to calculate the track structure of ionising particles at a nanometric level, the Geant4 Monte Carlo toolkit was used. Geant4 (Object Oriented Programming Architecture in C++) offers a common platform, available free to all users and relatively easy to use. Nevertheless, the current low-energy threshold for electromagnetic processes in GEANT4 is set to 1 keV (250 eV using the Livermore processes), which is an unsuitable value for nanometric applications. To lower this energy threshold, the necessary interaction processes and models were identified, and the corresponding available cross sections collected from the literature. They are mostly based on the plane-wave Born approximation (first Born approximation, or FBA) for inelastic interactions and on semi-empirical models for energies where the FBA fails (at low energies). In this paper, the extensions that have been introduced into the 9.3 release of the Geant4 toolkit are described, the so-called Geant4-DNA extension, including a set of processes and models adapted in this study and permitting the simulation of electron (8 eV-1 MeV), proton (100 eV-100 MeV) and alpha particle (1 keV-10 MeV) interactions in liquid water.

  13. Validation of a small-animal PET simulation using GAMOS: a GEANT4-based framework

    NASA Astrophysics Data System (ADS)

    Cañadas, M.; Arce, P.; Rato Mendes, P.

    2011-01-01

    Monte Carlo-based modelling is a powerful tool to help in the design and optimization of positron emission tomography (PET) systems. The performance of these systems depends on several parameters, such as detector physical characteristics, shielding or electronics, whose effects can be studied on the basis of realistic simulated data. The aim of this paper is to validate a comprehensive study of the Raytest ClearPET small-animal PET scanner using a new Monte Carlo simulation platform which has been developed at CIEMAT (Madrid, Spain), called GAMOS (GEANT4-based Architecture for Medicine-Oriented Simulations). This toolkit, based on the GEANT4 code, was originally designed to cover multiple applications in the field of medical physics from radiotherapy to nuclear medicine, but has since been applied by some of its users in other fields of physics, such as neutron shielding, space physics, high energy physics, etc. Our simulation model includes the relevant characteristics of the ClearPET system, namely, the double layer of scintillator crystals in phoswich configuration, the rotating gantry, the presence of intrinsic radioactivity in the crystals and the storage of single events for off-line coincidence sorting. Simulated results are contrasted with experimental acquisitions including studies of spatial resolution, sensitivity, scatter fraction and count rates in accordance with the National Electrical Manufacturers Association (NEMA) NU 4-2008 protocol. Spatial resolution results showed a discrepancy between simulated and measured values equal to 8.4% (with a maximum FWHM difference over all measurement directions of 0.5 mm). Sensitivity results differ by less than 1% for a 250-750 keV energy window. Simulated and measured count rates agree well within a wide range of activities, including under electronic saturation of the system (the measured peak of total coincidences, for the mouse-sized phantom, was 250.8 kcps, reached at 0.95 MBq/mL, and the simulated peak was

  14. Geant4 Simulations of SuperCDMS iZip Detector Charge Carrier Propagation and FET Readout

    NASA Astrophysics Data System (ADS)

    Agnese, Rob

    2013-04-01

    The SuperCDMS experiment aims to directly detect dark matter particles called WIMPs (Weakly Interacting Massive Particles). The detectors collect phonon and ionization energy of incident particles for analysis. The SuperCDMS Detector Monte Carlo group is implementing low temperature phonon and ionization simulations in Geant4 in order to study the response of the detectors to incident events. Phonons and electron-hole pairs are tracked in a low temperature crystal detector. The resulting TES phonon readout, as well as the FET charge readout, are simulated. The Geant4 framework is well-suited to these tasks. The charge transport in the presence of a complex electric field is performed by calculating a tetrahedral mesh of potentials across the crystal volume. To calculate the FET readout, the Shockley-Ramo theorem is applied to simulate the current in the FET. The focus of this presentation will be on incorporating and using the software package Qhull to calculate a tetrahedral mesh from known potentials, and then using barycentric coordinates to perform a linear interpolation to calculate the field. After calculating the field at each charge carrier's position, the Shockley-Ramo theorem is applied and the previous triangulation technique is performed to simulate the FET response.
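    The barycentric interpolation step described above can be sketched in a few lines: express the query point in the barycentric coordinates of its enclosing tetrahedron, then weight the four nodal values. This is an illustrative stand-alone version (plain Cramer's rule, no Qhull, hypothetical function names), not the SuperCDMS code.

    ```python
    def det3(m):
        """Determinant of a 3x3 matrix given as nested lists."""
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
              - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
              + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    def barycentric_interpolate(p, verts, values):
        """Linearly interpolate a scalar field (e.g. electric potential) at
        point p inside a tetrahedron with vertices verts[0..3] carrying
        nodal values values[0..3], via barycentric coordinates."""
        v0, v1, v2, v3 = verts
        # Columns of T are the edge vectors v1-v0, v2-v0, v3-v0.
        T = [[v1[i] - v0[i], v2[i] - v0[i], v3[i] - v0[i]] for i in range(3)]
        r = [p[i] - v0[i] for i in range(3)]
        dT = det3(T)
        # Cramer's rule solves T * (l1, l2, l3) = r.
        lam = []
        for col in range(3):
            Tc = [row[:] for row in T]
            for i in range(3):
                Tc[i][col] = r[i]
            lam.append(det3(Tc) / dT)
        l1, l2, l3 = lam
        weights = [1.0 - l1 - l2 - l3, l1, l2, l3]
        return sum(w * v for w, v in zip(weights, values))
    ```

    A barycentric coordinate outside [0, 1] signals that the point lies outside the tetrahedron, which is how a mesh walk decides which neighboring cell to test next.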

  15. Implementation of new physics models for low energy electrons in liquid water in Geant4-DNA.

    PubMed

    Bordage, M C; Bordes, J; Edel, S; Terrissol, M; Franceries, X; Bardiès, M; Lampe, N; Incerti, S

    2016-12-01

    A new alternative set of elastic and inelastic cross sections has been added to the very low energy extension of the Geant4 Monte Carlo simulation toolkit, Geant4-DNA, for the simulation of electron interactions in liquid water. These cross sections have been obtained from the CPA100 Monte Carlo track structure code, which has been a reference in the microdosimetry community for many years. They are compared to the default Geant4-DNA cross sections and show better agreement with published data. In order to verify the correct implementation of the CPA100 cross section models in Geant4-DNA, simulations of the number of interactions and ranges were performed using Geant4-DNA with this new set of models, and the results were compared with corresponding results from the original CPA100 code. Good agreement is observed between the implementations, with relative differences lower than 1% regardless of the incident electron energy. Useful quantities related to the deposited energy at the scale of the cell or the organ of interest for internal dosimetry, like dose point kernels, are also calculated using these new physics models. They are compared with results obtained using the well-known Penelope Monte Carlo code.

  16. Use of GEANT4 vs. MCNPX for the characterization of a boron-lined neutron detector

    NASA Astrophysics Data System (ADS)

    van der Ende, B. M.; Atanackovic, J.; Erlandson, A.; Bentoumi, G.

    2016-06-01

    This work compares GEANT4 with MCNPX in the characterization of a boron-lined neutron detector. The neutron energy ranges simulated in this work (0.025 eV to 20 MeV) are the traditional domain of MCNP simulations. This paper addresses the question, how well can GEANT4 and MCNPX be employed for detailed thermal neutron detector characterization? To answer this, GEANT4 and MCNPX have been employed to simulate detector response to a 252Cf energy spectrum point source, as well as to simulate mono-energetic parallel beam source geometries. The 252Cf energy spectrum simulation results demonstrate agreement in detector count rate within 3% between the two packages, with the MCNPX results being generally closer to experiment than are those from GEANT4. The mono-energetic source simulations demonstrate agreement in detector response within 5% between the two packages for all neutron energies, and within 1% for neutron energies between 100 eV and 5 MeV. Cross-checks between the two types of simulations using ISO-8529 252Cf energy bins demonstrate that MCNPX results are more self-consistent than are GEANT4 results, by 3-4%.

  17. Review of Geant4-DNA applications for micro and nanoscale simulations.

    PubMed

    Incerti, S; Douglass, M; Penfold, S; Guatelli, S; Bezak, E

    2016-10-01

    Emerging radiotherapy treatments including targeted particle therapy, hadron therapy or radiosensitisation of cells by high-Z nanoparticles demand the theoretical determination of radiation track structure at the nanoscale. This is essential in order to evaluate radiation damage at the cellular and DNA level. Since 2007, Geant4 offers physics models to describe particle interactions in liquid water at the nanometre level through the Geant4-DNA Package. This package currently provides a complete set of models describing the event-by-event electromagnetic interactions of particles with liquid water, as well as developments for the modelling of water radiolysis. Since its release, Geant4-DNA has been adopted as an investigational tool in kV and MV external beam radiotherapy, hadron therapies using protons and heavy ions, targeted therapies and radiobiology studies. It has been benchmarked with respect to other track structure Monte Carlo codes and, where available, against reference experimental measurements. While Geant4-DNA physics models and radiolysis modelling functionalities have already been described in detail in the literature, this review paper summarises and discusses a selection of representative papers with the aim of providing an overview of a) geometrical descriptions of biological targets down to the DNA size, and b) the full spectrum of current micro- and nano-scale applications of Geant4-DNA.

  18. Geant4 simulation of the CERN-EU high-energy reference field (CERF) facility.

    PubMed

    Prokopovich, D A; Reinhard, M I; Cornelius, I M; Rosenfeld, A B

    2010-09-01

    The CERN-EU high-energy reference field facility is used for testing and calibrating both active and passive radiation dosemeters for radiation protection applications in space and aviation. Through a combination of a primary particle beam, a target and a suitably designed shielding configuration, the facility is able to reproduce the neutron component of the high altitude radiation field relevant to the jet aviation industry. Simulations of the facility using the GEANT4 (GEometry ANd Tracking) toolkit provide an improved understanding of the neutron particle fluence as well as the particle fluence of other radiation components present. The secondary particle fluence as a function of the primary particle fluence incident on the target and the associated dose equivalent rates were determined at the 20 designated irradiation positions available at the facility. Comparisons of the simulated results with previously published simulations obtained using the FLUKA Monte Carlo code, as well as with experimental results of the neutron fluence obtained with a Bonner sphere spectrometer, are made.

  19. Applying Software Design Methodology to Instructional Design

    ERIC Educational Resources Information Center

    East, J. Philip

    2004-01-01

    The premise of this paper is that computer science has much to offer the endeavor of instructional improvement. Software design processes employed in computer science for developing software can be used for planning instruction and should improve instruction in much the same manner that design processes appear to have improved software. Techniques…

  20. Microdosimetry of the Auger electron emitting 123I radionuclide using Geant4-DNA simulations.

    PubMed

    Fourie, H; Newman, R T; Slabbert, J P

    2015-04-21

    Microdosimetric calculations of the Auger electron emitter (123)I were done in liquid water spheres using the Geant4 toolkit. The electron emission spectrum of (123)I produced by Geant4 is presented. Energy deposition and corresponding S-values were calculated to investigate the influence of the sub-cellular localization of the Auger emitter. It was found that S-values calculated by the Geant4 toolkit are generally lower than the values calculated by other Monte Carlo codes for the (123)I radionuclide. The differences in the compared S-values are mainly due to the different particle emission spectra employed by the respective computational codes, and emphasize the influence of the spectra on dosimetry calculations.

  1. Influence of Geant4 parameters on dose distribution and computation time for carbon ion therapy simulation.

    PubMed

    Zahra, Nabil; Frisson, Thibault; Grevillot, Loic; Lautesse, Philippe; Sarrut, David

    2010-10-01

    The aim of this work was to study the influence of Geant4 parameters on dose distribution and computation time for simulations of carbon ion therapy. The study was done using Geant4 version 9.0. The dose distributions in water for incident monoenergetic carbon ion beams of 300 MeV/u were compared for different values of the secondary particle production threshold and different step limits. Variations of about 2 mm in the depth-dose distribution were observed in some cases, which induced a 30% variation of the dose deposited in the Bragg peak region. Other tests were done using Geant4 version 9.2 to verify the results from this study. The two versions provided converging results and led to the same conclusions.

  2. Validation of Geant4 physics models for 56Fe ion beam in various media

    NASA Astrophysics Data System (ADS)

    Jalota, Summit; Kumar, Ashavani

    2012-11-01

    The depth-dose distribution of a 56Fe ion beam has been studied in water, polyethylene, nextel, kevlar and aluminum media. The dose reduction versus areal depth is also calculated for 56Fe ions in carbon, polyethylene and aluminum using the Monte Carlo simulation toolkit Geant4. This study presents the validation of physics models available in Geant4 by comparing the simulated results with the experimental data available in the literature. Simulations are performed using the binary cascade (BIC), abrasion-ablation (AA) and quantum molecular dynamics (QMD) models integrated into Geant4. Deviations from experimental results may be due to the selection of simple geometry. This paper also addresses the differences in the simulated results from the various models.

  3. Calculation of electron Dose Point Kernel in water with GEANT4 for medical application

    SciTech Connect

    Guimaraes, C. C.; Sene, F. F.; Martinelli, J. R.

    2009-06-03

    The rapid insertion of new technologies in medical physics in recent years, especially in nuclear medicine, has been followed by a great development of faster Monte Carlo algorithms. GEANT4 is a Monte Carlo toolkit that provides the tools to simulate particle transport through matter. In this work, GEANT4 was used to calculate the dose point kernel (DPK) for monoenergetic electrons in water, which is an important reference medium for nuclear medicine. The three different physical models of electromagnetic interactions provided by GEANT4 - Low Energy, Penelope and Standard - were employed. To verify the adequacy of these models, the results were compared with references from the literature. For all energies and physical models, the agreement between calculated DPKs and reported values is satisfactory.
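    A dose point kernel is typically scored by binning the energy deposits of many electron histories from an isotropic point source into concentric spherical shells and normalizing by shell volume. A minimal sketch of that scoring step (hypothetical function and data; mass density and the usual scaled-radius normalization are omitted for brevity):

    ```python
    import math

    def dose_point_kernel(deposits, shell_width, n_shells):
        """Bin point-like energy deposits, given as (radius, energy) pairs
        measured from the source, into concentric spherical shells and
        divide by shell volume, yielding energy density versus radius."""
        kernel = [0.0] * n_shells
        for r, e in deposits:
            i = int(r / shell_width)
            if i < n_shells:
                kernel[i] += e
        for i in range(n_shells):
            r_in = i * shell_width
            r_out = (i + 1) * shell_width
            volume = 4.0 / 3.0 * math.pi * (r_out**3 - r_in**3)
            kernel[i] /= volume
        return kernel
    ```

    Dividing by shell volume (rather than just counting) is what makes kernels from different codes directly comparable, since each bin then approximates the local energy density at that radius.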

  4. Comparison of hadron shower data in the PAMELA experiment with Geant 4 simulations

    NASA Astrophysics Data System (ADS)

    Alekseev, V. V.; Dunaeva, O. A.; Bogomolov, Yu V.; Lukyanov, A. D.; Malakhov, V. V.; Mayorov, A. G.; Rodenko, S. A.

    2017-01-01

    The sampling imaging electromagnetic calorimeter of ≈ 16.3 radiation lengths and ≈ 0.6 nuclear interaction lengths was designed and constructed by the PAMELA collaboration as a part of the large magnetic spectrometer PAMELA. The calorimeter consists of 44 single-sided silicon sensor planes interleaved with 22 plates of tungsten absorber (each tungsten layer 0.26 cm thick). The silicon planes are composed of a 3 × 3 matrix of silicon detectors, each segmented into 32 read-out strips with a pitch of 2.4 mm. The strips of two consecutive layers are oriented orthogonally and therefore provide two-dimensional spatial information. Due to the high granularity, the development of hadronic showers can be studied with good precision. In this work, Monte Carlo simulations (based on Geant4) were performed using different available models, including detector and physical effects, and compared with the experimental data obtained in near-Earth orbit. The response of the PAMELA calorimeter to hadronic showers was investigated, including the total energy release in the calorimeter and the transverse shower profile characteristics.

  5. Modeling the tagged-neutron UXO identification technique using the Geant4 toolkit

    SciTech Connect

    Zhou Y.; Mitra S.; Zhu X.; Wang Y.

    2011-10-16

    It is proposed to use 14 MeV neutrons tagged by the associated particle neutron time-of-flight technique (APnTOF) to identify the fillers of unexploded ordnance (UXO) by characterizing their carbon, nitrogen and oxygen contents. To facilitate the design and construction of a prototype system, a preliminary simulation model was developed using the Geant4 toolkit. This work established the toolkit environment for (a) generating tagged neutrons, (b) their transport and interactions within a sample to induce emission and detection of characteristic gamma-rays, and (c) 2D and 3D image reconstruction of the interrogated object using the neutron and gamma-ray time-of-flight information. Using this model, the article demonstrates the novelty of the tagged-neutron approach for extracting useful signals with high signal-to-background discrimination of an object of interest from its environment. Simulations indicated that a UXO filled with the explosive RDX, hexogen (C{sub 3}H{sub 6}O{sub 6}N{sub 6}), can be identified to a depth of 20 cm when buried in soil.
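The geometric core of tagged-neutron imaging can be illustrated with a few lines of Python. This is a hedged simplification, not the article's reconstruction code: the alpha detector is assumed to fix the neutron direction exactly, the speed value (~5.1 cm/ns for a 14 MeV neutron) is approximate, and the gamma's return flight time is neglected.

```python
def interaction_point(direction, speed_cm_per_ns, tof_ns, origin=(0.0, 0.0, 0.0)):
    """Tagged-neutron localization sketch: the associated alpha particle
    fixes the neutron's direction (a unit vector), and the measured
    time-of-flight fixes the distance travelled, so the interaction
    point is origin + speed * tof along that direction.  The gamma's
    return flight time is ignored in this simplification."""
    d = speed_cm_per_ns * tof_ns
    return tuple(o + d * u for o, u in zip(origin, direction))
```

Accumulating such points event by event, weighted by the gamma-ray signature, yields the 2D/3D images the abstract refers to.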

  6. Development of a Geant4 based Monte Carlo Algorithm to evaluate the MONACO VMAT treatment accuracy.

    PubMed

    Fleckenstein, Jens; Jahnke, Lennart; Lohr, Frank; Wenz, Frederik; Hesser, Jürgen

    2013-02-01

    A method is presented to evaluate the dosimetric accuracy of volumetric modulated arc therapy (VMAT) treatment plans, generated with the MONACO™ (version 3.0) treatment planning system, in realistic CT data with an independent Geant4-based dose calculation algorithm. For this purpose, a model of an Elekta Synergy linear accelerator treatment head with an MLCi2 multileaf collimator was implemented in Geant4. The time-dependent linear accelerator components were modeled by importing either log files of an actual plan delivery or a DICOM-RT plan sequence. Absolute dose calibration, based on a reference measurement, was applied. Both the MONACO and the Geant4 treatment head models were commissioned with lateral profiles and depth dose curves of square fields in water and with film measurements in inhomogeneous phantoms. A VMAT treatment plan for a patient with a thoracic tumor and a VMAT treatment plan of a patient who received treatment in the thoracic spine region, including metallic implants, were used for evaluation. For both MONACO and Geant4, depth dose curves and lateral profiles of square fields satisfied a local gamma (2%, 2 mm) tolerance criterion with a mean agreement of more than 95% for all fields. Film measurements in inhomogeneous phantoms with a global gamma of (3%, 3 mm) showed a pass rate above 95% in all voxels receiving more than 25% of the maximum dose. A dose-volume-histogram comparison of the VMAT patient treatment plans showed mean deviations between Geant4 and MONACO of -0.2% (first patient) and 2.0% (second patient) for the PTVs, and (0.5±1.0)% and (1.4±1.1)% for the organs at risk, relative to the prescription dose. The presented method can be used to validate VMAT dose distributions generated by a large number of small segments in regions with high electron density gradients. The MONACO dose distributions showed good agreement with Geant4 and film measurements within the simulation and measurement errors.
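The gamma criteria quoted above combine a dose difference with a distance-to-agreement. A minimal 1-D sketch of the *global* variant (dose tolerance relative to the reference maximum) looks like this; real gamma analysis is done on 2-D/3-D dose grids with interpolation, which this toy version omits.

```python
import math

def gamma_pass_rate(ref, evl, spacing_mm, dose_tol=0.03, dta_mm=3.0):
    """Global 1-D gamma analysis: for each reference point, gamma is the
    minimum over evaluated points of the combined dose-difference /
    distance-to-agreement metric; a point passes if gamma <= 1.
    The dose criterion is normalized to the reference maximum."""
    d_max = max(ref)
    passed = 0
    for i, d_ref in enumerate(ref):
        g2_min = min(((j - i) * spacing_mm / dta_mm) ** 2
                     + ((d_evl - d_ref) / (dose_tol * d_max)) ** 2
                     for j, d_evl in enumerate(evl))
        if math.sqrt(g2_min) <= 1.0:
            passed += 1
    return passed / len(ref)
```

Switching to the local criterion means replacing `d_max` with `d_ref` in the dose term.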

  7. Modeling the relativistic runaway electron avalanche and the feedback mechanism with GEANT4

    PubMed Central

    Skeltved, Alexander Broberg; Østgaard, Nikolai; Carlson, Brant; Gjesteland, Thomas; Celestin, Sebastien

    2014-01-01

    This paper presents the first study that uses the GEometry ANd Tracking 4 (GEANT4) toolkit to do quantitative comparisons with other modeling results related to the production of terrestrial gamma ray flashes and high-energy particle emission from thunderstorms. We will study the relativistic runaway electron avalanche (RREA) and the relativistic feedback process, as well as the production of bremsstrahlung photons from runaway electrons. The Monte Carlo simulations take into account the effects of electron ionization, electron by electron (Møller), and electron by positron (Bhabha) scattering as well as the bremsstrahlung process and pair production, in the 250 eV to 100 GeV energy range. Our results indicate that the multiplication of electrons during the development of RREAs and under the influence of feedback are consistent with previous estimates. This is important to validate GEANT4 as a tool to model RREAs and feedback in homogeneous electric fields. We also determine the ratio of bremsstrahlung photons to energetic electrons Nγ/Ne. We then show that the ratio has a dependence on the electric field, which can be expressed by the avalanche time τ(E) and the bremsstrahlung coefficient α(ε). In addition, we present comparisons of GEANT4 simulations performed with a “standard” and a “low-energy” physics list both validated in the 1 keV to 100 GeV energy range. This comparison shows that the choice of physics list used in GEANT4 simulations has a significant effect on the results. Key Points: testing the feedback mechanism with GEANT4; validating the GEANT4 programming toolkit; studying the ratio of bremsstrahlung photons to electrons at TGF source altitude. PMID: 26167437
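The claimed dependence of Nγ/Ne on the avalanche time τ and bremsstrahlung coefficient α admits a simple toy derivation, sketched below. This is an illustrative consistency check under assumed exponential growth, not the authors' exact formula: if the electron number grows as exp(t/τ) and each electron radiates photons at rate α, the ratio saturates at α·τ.

```python
import math

def photon_to_electron_ratio(alpha, tau, t):
    """If the runaway-electron number grows as N_e(t) = exp(t / tau)
    (starting from a single seed) and photons are radiated at a
    constant rate alpha per electron, then integrating
    dN_gamma/dt = alpha * N_e gives N_gamma = alpha * tau * (N_e - 1),
    so N_gamma / N_e -> alpha * tau for a well-developed avalanche."""
    n_e = math.exp(t / tau)
    n_gamma = alpha * tau * (n_e - 1.0)
    return n_gamma / n_e
```

Since τ depends on the electric field, this already reproduces the qualitative field dependence of the ratio.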

  8. DagSolid: a new Geant4 solid class for fast simulation in polygon-mesh geometry.

    PubMed

    Han, Min Cheol; Kim, Chan Hyeong; Jeong, Jong Hwi; Yeom, Yeon Soo; Kim, SungHoon; Wilson, Paul P H; Apostolakis, John

    2013-07-07

    Even though a computer-aided design (CAD)-based geometry can be directly implemented in Geant4 as a polygon mesh using the G4TessellatedSolid class, the computation speed becomes very slow, especially when the geometry is composed of a large number of facets. To address this problem, in the present study a new Geant4 solid class, named DagSolid, was developed based on the direct accelerated geometry for Monte Carlo (DAGMC) library, which provides ray-tracing acceleration functions. To develop the DagSolid class, the new solid class was derived from the G4VSolid class, and its ray-tracing functions were linked to the corresponding functions of the DAGMC library. The results of this study show that the use of the DagSolid class drastically improves the computation speed. The improvement was more significant when there were more facets, meaning that the DagSolid class is more effective for complicated geometries with many facets than for simple geometries. The maximum speedup was a factor of 1562 for Geantino and 680 for ChargedGeantino. For real particles (gammas, electrons, neutrons, and protons), the speedup was smaller, but still in the range of 53-685 times, depending on the type of beam particle simulated.

  9. A Student Project to use Geant4 Simulations for a TMS-PET combination

    SciTech Connect

    Altamirano, A.; Chamorro, A.; Hurtado, K.; Romero, C.; Wahl, D.; Zamudio, A.; Rueda, A.; Solano Salinas, C. J.

    2007-10-26

    Geant4 is one of the most powerful tools for MC simulation of detectors and their applications. We present a student project to simulate a combined Transcranial Magnetic Stimulation-Positron Emission Tomography (TMS-PET) system using Geant4. This project aims to study PET-TMS systems by implementing a model for the brain response to the TMS pulse and studying the simulated PET response. In order to increase the speed of the simulations we parallelise our programs and investigate the possibility of using GRID computing.

  10. SU-E-T-565: RAdiation Resistance of Cancer CElls Using GEANT4 DNA: RACE

    SciTech Connect

    Perrot, Y; Payno, H; Delage, E; Maigne, L

    2014-06-01

    Purpose: The objective of the RACE project is to develop a comparison between Monte Carlo simulation using the Geant4-DNA toolkit and measurements of radiation damage on 3D melanoma and chondrosarcoma cell cultures coupled with gadolinium nanoparticles. Here we present the status of the developments regarding simulations. Methods: Monte Carlo studies are driven using the Geant4 toolkit and the Geant4-DNA extension. In order to model the geometry of a cell population, the open-source CPOP++ program is being developed for the geometrical representation of 3D cell populations, including a specific cell mesh coupled with a multi-agent system. Each cell includes a cytoplasm and a nucleus. The modeling of the cell population has been validated against confocal microscopy images of spheroids. The Geant4 Livermore physics models are used to simulate the interactions of a 250 keV X-ray beam and the production of secondaries from gadolinium nanoparticles assumed to be fixed on the cell membranes. Geant4-DNA processes are used to simulate the interactions of charged particles with the cells. An atomistic description of the DNA molecule, from PDB (Protein Data Bank) files, is provided by the PDB4DNA Geant4 user application we developed to score energy depositions in DNA base pairs and sugar-phosphate groups. Results: At the microscopic level, our simulations enable assessing the microscopic energy distribution in each compartment of a realistic 3D cell population. Dose enhancement factors due to the presence of gadolinium nanoparticles can be estimated. At the nanometer scale, direct damage to nuclear DNA is also estimated. Conclusion: We successfully simulated the impact of direct radiation on a realistic 3D cell population model compatible with microdosimetry calculations using the Geant4-DNA toolkit. Upcoming validation and the future integration of the radiochemistry module of Geant4-DNA will make it possible to correlate clusters of ionizations with in vitro

  11. Simulating cosmic radiation absorption and secondary particle production of solar panel layers of Low Earth Orbit (LEO) satellite with GEANT4

    NASA Astrophysics Data System (ADS)

    Yiǧitoǧlu, Merve; Veske, Doǧa; Nilüfer Öztürk, Zeynep; Bilge Demirköz, Melahat

    2016-07-01

    All devices operating in space are exposed to cosmic rays during their operation. The resulting radiation may cause fatal damage to the solid-state structure of devices, so the absorbed radiation dose and secondary particle production for each component should be calculated carefully before production. Solar panels are semiconductor solid-state devices and are very sensitive to radiation. Even a short-term power cut-off may result in total failure of the satellite, and even small doses of radiation can change the characteristics of solar cells. This deviation can be caused by rarer, highly energetic particles as well as by the total ionizing dose from the abundant low-energy particles. In this study, the solar panels planned for a specific LEO satellite, IMECE, are analyzed layer by layer. The Space Environment Information System (SPENVIS) database and the GEANT4 simulation software are used to simulate the layers of the panels. The results obtained from the simulation will be taken into account to determine the amount of radiation protection and resistance needed for the panels, or to revise the design of the panels.

  12. Optimization of {sup 6}LiF:ZnS(Ag) Scintillator Light Yield Using Geant4

    SciTech Connect

    Yehuda-Zada, Y.; Pritchard, K.; Ziegler, J.B.; Cooksey, C.; Siebein, K.; Jackson, M.; Hurlbut, C.; Kadmon, Y.; Cohen, Y.; Maliszewskyj, N.C.; Ibberson, R.M.; Majkrzak, C.F.; Orion, Y.; Osovizky, A.

    2015-07-01

    Neutrons provide an effective tool to probe the structure of materials. Neutron diffraction is a method to determine the atomic and magnetic structure of a material based on neutron scattering: a collimated incident beam of thermal neutrons hits the examined sample, and the resulting diffraction pattern provides information on the structure of the material. Research to develop a novel cold neutron detector for the Chromatic Analysis Neutron Diffractometer Or Reflectometer (CANDOR) is underway at the NIST Center for Neutron Research. The system's unique design aims to provide more than ten-fold faster analysis of materials than conventional systems. Achieving this fast analysis requires a large number of neutron detectors. A key design constraint for this detector is the thickness of the neutron-sensitive element. This is met using {sup 6}LiF:ZnS(Ag) scintillation material with embedded wavelength-shifting (WLS) fibers conducting scintillation light to silicon photomultiplier photo-sensors. The detector sensitivity is determined by both the neutron capture probability ({sup 6}Li density) and the detectable light output produced by the ZnS(Ag) ionization, the latter of which is hindered by fluorescence absorption of the scintillation light by the ZnS. Tradeoffs between the neutron capture probability, stimulated light production and light attenuation were studied to determine the optimal stoichiometry of the {sup 6}LiF and ZnS(Ag), as well as the volume ratio of scintillator and fiber. Simulations were performed using the GEANT4 Monte Carlo package in order to optimize the detector design. GEANT4 enables investigation of the neutron interaction with the detector, the ionization process and the light transfer process following the nuclear reaction. The series of conversions required for this detector were modelled: - A cold neutron enters the sensor and is captured by {sup 6}Li in the scintillator mixture ({sup 6}Li (n,α) {sup 3}H
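The capture-versus-light tradeoff described above can be caricatured with two exponentials. This is a deliberately simplified sketch (slab geometry, a single attenuation coefficient, mean escape path of half the thickness), not the GEANT4 optimization itself; the function and parameter names are illustrative.

```python
import math

def capture_probability(macroscopic_sigma_per_cm, thickness_cm):
    """Neutron capture probability in a slab: 1 - exp(-Sigma * t)."""
    return 1.0 - math.exp(-macroscopic_sigma_per_cm * thickness_cm)

def light_survival(attenuation_per_cm, path_cm):
    """Fraction of scintillation light surviving self-absorption
    (Beer-Lambert law)."""
    return math.exp(-attenuation_per_cm * path_cm)

def detection_figure_of_merit(sigma, atten, thickness):
    """Toy figure of merit: capture probability times the light that
    survives an average escape path of half the slab thickness.
    Thicker or 6Li-richer mixtures capture more neutrons but lose
    more light to absorption, hence the optimum scanned in GEANT4."""
    return capture_probability(sigma, thickness) * light_survival(atten, thickness / 2.0)
```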

  13. Geant4-based Monte Carlo simulations on GPU for medical applications.

    PubMed

    Bert, Julien; Perez-Ponce, Hector; El Bitar, Ziad; Jan, Sébastien; Boursier, Yannick; Vintache, Damien; Bonissent, Alain; Morel, Christian; Brasse, David; Visvikis, Dimitris

    2013-08-21

    Monte Carlo simulation (MCS) plays a key role in medical applications, especially for emission tomography and radiotherapy. However, MCS is also associated with long calculation times that prevent its use in routine clinical practice. Recently, graphics processing units (GPUs) have become, in many domains, a low-cost alternative for the acquisition of high computational power. The objective of this work was to develop an efficient framework for the implementation of MCS on GPU architectures. Geant4 was chosen as the MCS engine given the large variety of physics processes available for targeting different medical imaging and radiotherapy applications. In addition, Geant4 is the MCS engine behind GATE, which is currently the most popular simulation platform for medical applications. We propose the definition of a global strategy and associated structures for such a GPU-based simulation implementation. Different photon and electron physics effects are resolved on the fly directly on the GPU without any approximations with respect to Geant4. Validations have shown equivalence in the underlying photon and electron physics processes between the Geant4 and the GPU codes, with a speedup factor of 80-90. More clinically realistic simulations in emission and transmission imaging led to acceleration factors of 400 and 800, respectively, compared to corresponding GATE simulations.

  14. Identifying key surface parameters for optical photon transport in GEANT4/GATE simulations.

    PubMed

    Nilsson, Jenny; Cuplov, Vesna; Isaksson, Mats

    2015-09-01

    For a scintillator used for spectrometry, the generation, transport and detection of optical photons have a great impact on the energy spectrum resolution. A complete Monte Carlo model of a scintillator includes coupled ionizing-particle and optical-photon transport, which can be simulated with the GEANT4 code. The GEANT4 surface parameters control the physics processes an optical photon undergoes when reaching the surface of a volume. In this work the impact of each surface parameter on the optical transport was studied by looking at the optical spectrum: the number of detected optical photons per ionizing source particle from a large plastic scintillator, i.e. the output signal. All simulations were performed using GATE v6.2 (GEANT4 Application for Tomographic Emission). The surface parameter finish (polished, ground, front-painted or back-painted) showed the greatest impact on the optical spectrum, whereas the surface parameter σ(α), which controls the surface roughness, had a relatively small impact. It was also shown how the surface parameters reflectivity and reflectivity type (specular spike, specular lobe, Lambertian and backscatter) changed the optical spectrum depending on the probability of reflection and the combination of reflectivity types. A change in the optical spectrum will ultimately have an impact on a simulated energy spectrum. By studying the optical spectra presented in this work, a GEANT4 user can predict the shift in an optical spectrum caused by the alteration of a specific surface parameter.
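Two of the reflectivity types mentioned above differ in how the outgoing photon direction is sampled, which can be sketched in a few lines. This is a hedged simplification of the GEANT4 surface model, assuming the outward surface normal points along +z; the function name is our own.

```python
import math
import random

def reflect_z(incident, finish, rng=random):
    """Outgoing direction for an optical photon hitting a surface whose
    outward normal is +z (a simplification): 'specular_spike' mirrors
    the direction about the normal, while 'lambertian' samples a
    cosine-weighted direction in the upper hemisphere, independent of
    the incident direction."""
    ix, iy, iz = incident
    if finish == "specular_spike":
        return (ix, iy, -iz)
    if finish == "lambertian":
        u1, u2 = rng.random(), rng.random()
        r, phi = math.sqrt(u1), 2.0 * math.pi * u2
        return (r * math.cos(phi), r * math.sin(phi), math.sqrt(1.0 - u1))
    raise ValueError(finish)
```

The specular lobe and backscatter types vary the same idea: the lobe smears the mirrored direction around by σ(α), and backscatter returns the photon along its incoming path.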

  15. Applications of the Monte Carlo method in nuclear physics using the GEANT4 toolkit

    SciTech Connect

    Moralles, Mauricio; Guimaraes, Carla C.; Menezes, Mario O.; Bonifacio, Daniel A. B.; Okuno, Emico; Guimaraes, Valdir; Murata, Helio M.; Bottaro, Marcio

    2009-06-03

    The capabilities of personal computers allow the application of Monte Carlo methods to simulate very complex problems that involve the transport of particles through matter. Among the several codes commonly employed in nuclear physics problems, GEANT4 has received great attention in recent years, mainly due to its flexibility and the possibility for users to extend it. Unlike other Monte Carlo codes, GEANT4 is a toolkit written in an object-oriented language (C++) that includes the mathematical engines of several physical processes, suitable for the transport of practically all types of particles and heavy ions. GEANT4 also provides several tools to define materials, geometry, radiation sources, particle beams, electromagnetic fields, and graphical visualization of the experimental setup. After a brief description of the GEANT4 toolkit, this presentation reports investigations carried out by our group involving simulations in the areas of dosimetry, nuclear instrumentation and medical physics. The physical processes available for photons, electrons, positrons and heavy ions were used in these simulations.

  16. Modeling the relativistic runaway electron avalanche and the feedback mechanism with GEANT4.

    PubMed

    Skeltved, Alexander Broberg; Østgaard, Nikolai; Carlson, Brant; Gjesteland, Thomas; Celestin, Sebastien

    2014-11-01

    This paper presents the first study that uses the GEometry ANd Tracking 4 (GEANT4) toolkit to do quantitative comparisons with other modeling results related to the production of terrestrial gamma ray flashes and high-energy particle emission from thunderstorms. We will study the relativistic runaway electron avalanche (RREA) and the relativistic feedback process, as well as the production of bremsstrahlung photons from runaway electrons. The Monte Carlo simulations take into account the effects of electron ionization, electron by electron (Møller), and electron by positron (Bhabha) scattering as well as the bremsstrahlung process and pair production, in the 250 eV to 100 GeV energy range. Our results indicate that the multiplication of electrons during the development of RREAs and under the influence of feedback are consistent with previous estimates. This is important to validate GEANT4 as a tool to model RREAs and feedback in homogeneous electric fields. We also determine the ratio of bremsstrahlung photons to energetic electrons Nγ/Ne. We then show that the ratio has a dependence on the electric field, which can be expressed by the avalanche time τ(E) and the bremsstrahlung coefficient α(ε). In addition, we present comparisons of GEANT4 simulations performed with a "standard" and a "low-energy" physics list both validated in the 1 keV to 100 GeV energy range. This comparison shows that the choice of physics list used in GEANT4 simulations has a significant effect on the results.

  17. CMS validation experience: Test-beam 2004 data vs GEANT4

    SciTech Connect

    Piperov, Stefan; /Fermilab /Sofiya, Inst. Nucl. Res.

    2007-01-01

    A comparison between the Geant4 Monte-Carlo simulation of CMS Detector's Calorimetric System and data from the 2004 Test-Beam at CERN's SPS H2 beam-line is presented. The overall simulated response agrees quite well with the measured response. Slight differences in the longitudinal shower profiles between the MC predictions made with different Physics Lists are observed.

  18. Reflight certification software design specifications

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The PDSS/IMC Software Design Specification for the Payload Development Support System (PDSS)/Image Motion Compensator (IMC) is contained. The PDSS/IMC is to be used for checkout and verification of the IMC flight hardware and software by NASA/MSFC.

  19. Monte-Carlo modelling and verification of photoluminescence of Gd2O3:Eu scintillator by using the GEANT4 simulation code

    NASA Astrophysics Data System (ADS)

    Cho, Gyu-Seok; Kim, Kum-Bae; Choi, Sang-Hyoun; Song, Yong-Keun; Lee, Soon-Sung

    2017-01-01

    Recently, Monte Carlo methods have been used to optimize the design and modeling of radiation detectors. However, most Monte Carlo codes implement only a fixed and simple optical physics, and the effect of the signal readout devices is not considered because of the limitations of their geometry functions; these disadvantages prevent detailed modeling of scintillator detectors. Modeling a comprehensive and extensive detector system has been reported to be feasible when the optical physics model of the GEometry ANd Tracking 4 (GEANT4) simulation code is used. In this study, we modeled a Gd2O3:Eu scintillator using the GEANT4 simulation code and compared the results with measurement data. To obtain the measurement data, we synthesized the Gd2O3:Eu scintillator using the solution combustion method and evaluated its characteristics using X-ray diffraction and photoluminescence. We imported the measured data into the GEANT4 code because GEANT4 cannot simulate the fluorescence phenomenon itself. The imported data were used as an energy distribution for optical photon generation based on the energy deposited in the scintillator. As a result of the simulation, a strong emission peak consistent with the measured data was observed at 611 nm, and the overall trends of the spectrum agreed with the measured data. This result is significant because the characteristics of the scintillator are equally well implemented in the simulation, indicating a valuable improvement in the modeling of scintillator-based radiation detectors.
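Feeding a measured emission spectrum to an optical-photon generator amounts to sampling wavelengths from the measured intensity distribution. A minimal sketch of that idea, via inversion of the cumulative distribution, is shown below; the function name and the interface are our own, not the paper's code.

```python
import bisect
import random

def sample_wavelength(wavelengths_nm, intensities, rng=random):
    """Draw an emission wavelength from a measured spectrum by
    inverting its cumulative distribution - the same idea as using
    measured photoluminescence data to drive optical-photon generation
    when the transport code cannot model the fluorescence itself."""
    cumulative, total = [], 0.0
    for w in intensities:
        total += w
        cumulative.append(total)
    u = rng.random() * total
    return wavelengths_nm[bisect.bisect_right(cumulative, u)]
```

With the measured Gd2O3:Eu spectrum as input, most sampled photons would fall near the dominant 611 nm line, reproducing the peak seen in both measurement and simulation.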

  20. Software design and documentation language

    NASA Technical Reports Server (NTRS)

    Kleine, H.

    1977-01-01

    A communications medium to support the design and documentation of complex software applications is studied. The medium also provides the following: (1) a processor which can convert design specifications into an intelligible, informative machine reproducible document; (2) a design and documentation language with forms and syntax that are simple, unrestrictive, and communicative; and (3) methodology for effective use of the language and processor.

  1. The GEANT4 toolkit capability in the hadron therapy field: simulation of a transport beam line

    NASA Astrophysics Data System (ADS)

    Cirrone, G. A. P.; Cuttone, G.; Di Rosa, F.; Raffaele, L.; Russo, G.; Guatelli, S.; Pia, M. G.

    2006-01-01

    At the Laboratori Nazionali del Sud of the Istituto Nazionale di Fisica Nucleare in Catania (Sicily, Italy), the first Italian hadron therapy facility, named CATANA (Centro di AdroTerapia ed Applicazioni Nucleari Avanzate), has been realized. At CATANA, 62 MeV proton beams, accelerated by a superconducting cyclotron, are used for the radiotherapeutic treatment of some types of ocular tumours. Therapy with hadron beams still represents a pioneering technique, and only a few centers worldwide can provide this advanced specialized cancer treatment. On the basis of the experience gained so far, and considering the future hadron-therapy facilities to be developed (Rinecker, Munich, Germany; Heidelberg/GSI, Darmstadt, Germany; PSI, Villigen, Switzerland; CNAO, Pavia, Italy; Centro di Adroterapia, Catania, Italy), we decided to develop a Monte Carlo application based on the GEANT4 toolkit for the design, realization and optimization of a proton-therapy beam line. Another goal of our project is to provide a general tool able to study the interactions of hadrons with human tissue and to test the analytically based treatment planning systems currently used in routine practice. All the typical elements of a hadron-therapy line, such as diffusers, range shifters, collimators and detectors, were modelled. In particular, we simulated the Markus-type ionization chamber and a GafChromic film as dosimeters to reconstruct the depth (Bragg peak and spread-out Bragg peak) and lateral dose distributions, respectively. We validated our simulated detectors by comparing the results with the experimental data available at our facility.

  2. The simulation of the LANFOS-H food radiation contamination detector using Geant4 package

    NASA Astrophysics Data System (ADS)

    Piotrowski, Lech Wiktor; Casolino, Marco; Ebisuzaki, Toshikazu; Higashide, Kazuhiro

    2015-02-01

    The recent incident at the Fukushima power plant caused growing concern about radiation contamination and resulted in the Japanese limit for the permitted amount of 137Cs in food being lowered to 100 Bq/kg. To increase safety and ease this concern, we are developing LANFOS (Large Food Non-destructive Area Sampler), a compact, easy-to-use detector for the assessment of radiation in food. The LANFOS-H described in this paper has 4π coverage to assess the amount of 137Cs present, separating it from possible 40K food contamination. Food samples therefore do not have to be pre-processed prior to a test and can be consumed after measurement. It is designed for use by non-professionals in homes and small institutions such as schools, indicating the safety of the samples, but it can also be utilized by specialists, as it provides a radiation spectrum. Proper assessment of radiation in food with the apparatus requires estimation of the γ conversion factor of the detectors: how many γ photons will produce a signal. In this paper we show the results of a Monte Carlo estimation of this factor for various approximated shapes of fish, vegetables and amounts of rice, performed with the Geant4 package. We find that the conversion factor combined from all the detectors is similar for all food types, around 37%, varying by at most 5% with sample length, much less than for individual detectors. Different inclinations and positions of samples in the detector introduce an uncertainty of 1.4%. This small uncertainty validates the concept of a 4π non-destructive apparatus.

  3. Introduction to Software Design

    DTIC Science & Technology

    1989-01-01

    Sincovec wrote the first version of this module and, in so doing, helped define what a module should be. In the two years since Professor Budgen... commonly recognized. f. Design for reuse. In order from highly desirable to undesirable, these are: ... Reuse represents a rather ill-defined and poorly... be used effectively with well-defined design representations, and so there... This approach is particularly significant for larger systems, where the use of a...

  4. Automating Software Design Metrics.

    DTIC Science & Technology

    1984-02-01

    ...declaration to be associated with it. Second, Byron tools can produce useful output from incomplete specifications. These advantages over pure Ada are... implemented. Implementation-independent details are included in Section 2.2. Requirements and design information for the DARTS implementation of both... The Intelligence Content (I) is an estimate of the Potential Volume. It is independent of the language used and is expected to be invariant over...

  5. Calculation of Coincidence Summing Correction Factors for an HPGe detector using GEANT4.

    PubMed

    Giubrone, G; Ortiz, J; Gallardo, S; Martorell, S; Bas, M C

    2016-07-01

    The aim of this paper was to calculate the True Coincidence Summing Correction Factors (TSCFs) for an HPGe coaxial detector in order to correct for the summing effect caused by the presence of (88)Y and (60)Co in a multigamma source used to obtain an efficiency calibration curve. Results were obtained for three volumetric sources using the Monte Carlo toolkit GEANT4. The first part of this paper deals with modeling the detector in order to obtain a simulated full-energy peak efficiency curve. A quantitative comparison between the measured and simulated values was made across the entire energy range under study. The True Coincidence Summing Correction Factors were calculated for (88)Y and (60)Co using the full-peak efficiencies obtained with GEANT4. This methodology was subsequently applied to (134)Cs, which presents a complex decay scheme.
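For the simplest case, the correction has a closed form worth spelling out. The sketch below covers an idealized two-photon cascade (for example the 1173/1332 keV pair of 60Co), assuming a 100% cascade branch and no angular correlation; real TSCF calculations, as in the paper, handle full decay schemes and use simulated efficiencies.

```python
def tscf_two_gamma(eps_total_companion):
    """For an ideal two-photon cascade, a full-energy-peak count of one
    photon is lost whenever the companion photon deposits any energy at
    all, which happens with the companion's *total* efficiency eps_t.
    The measured peak is thus reduced by a factor (1 - eps_t), and the
    multiplicative correction factor is 1 / (1 - eps_t)."""
    return 1.0 / (1.0 - eps_total_companion)
```

For a close source geometry with eps_t around 10%, the peak area must be corrected upward by about 11%, which is why summing matters most for volumetric sources near the detector.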

  6. Geant4 simulations of the neutron production and transport in the n_TOF spallation target

    NASA Astrophysics Data System (ADS)

    Lerendegui-Marco, J.; Cortés-Giraldo, M. A.; Guerrero, C.; Quesada, J. M.

    2016-11-01

    The neutron production and transport in the spallation target of the n_TOF facility at CERN have been simulated with Geant4. The results obtained with the different hadronic physics lists provided by Geant4 have been compared with the experimental neutron flux in n_TOF-EAR1. The best overall agreement in both the absolute value and the energy dependence of the flux, from thermal energies to 1 GeV, is obtained with the INCL++ model coupled with the Fritiof model (FTFP). This physics list has therefore been used to simulate and study the main features of the new n_TOF-EAR2 beam line, currently in its commissioning phase.

  7. Modification of source contribution in PALS by simulation using Geant4 code

    NASA Astrophysics Data System (ADS)

    Ning, Xia; Cao, Xingzhong; Li, Chong; Li, Demin; Zhang, Peng; Gong, Yihao; Xia, Rui; Wang, Baoyi; Wei, Long

    2017-04-01

    The contribution of the positron source to the results of positron annihilation lifetime spectroscopy (PALS) is simulated using the Geant4 code. The geometry of the PALS measurement system is a sandwich structure: the 22Na radiation source is encapsulated in Kapton films, and the specimens are attached to the outside of the films. The probabilities of a positron being annihilated in the films or in the targets, and the effect of positrons reflected back from the specimen surface, are simulated. The probability of a positron annihilating in the film depends on the species of the targets and the source film thickness. The simulation result is in reasonable agreement with the available experimental data. Thus, modification of the source contribution as calculated by Geant4 is viable, and it is beneficial for the analysis of PALS results.

  8. Comparison of Geant4-DNA simulation of S-values with other Monte Carlo codes

    NASA Astrophysics Data System (ADS)

    André, T.; Morini, F.; Karamitros, M.; Delorme, R.; Le Loirec, C.; Campos, L.; Champion, C.; Groetz, J.-E.; Fromm, M.; Bordage, M.-C.; Perrot, Y.; Barberet, Ph.; Bernal, M. A.; Brown, J. M. C.; Deleuze, M. S.; Francis, Z.; Ivanchenko, V.; Mascialino, B.; Zacharatou, C.; Bardiès, M.; Incerti, S.

    2014-01-01

    Monte Carlo simulations of S-values have been carried out with the Geant4-DNA extension of the Geant4 toolkit. The S-values have been simulated for monoenergetic electrons with energies ranging from 0.1 keV up to 20 keV, in liquid water spheres (for four radii, chosen between 10 nm and 1 μm), and for electrons emitted by five isotopes of iodine (131, 132, 133, 134 and 135), in liquid water spheres of varying radius (from 15 μm up to 250 μm). The results have been compared to those obtained from other Monte Carlo codes and from other published data. A Kolmogorov-Smirnov test confirmed the statistical compatibility of all simulation results.
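The quantity being compared has a simple definition that a short worked example makes concrete. This is a generic MIRD-style sketch under stated assumptions (self-irradiation of a unit-density water sphere), not the paper's scoring code; the absorbed fraction would come from the Monte Carlo transport.

```python
import math

MEV_TO_J = 1.602176634e-13  # joules per MeV

def s_value_gy_per_decay(energy_per_decay_mev, absorbed_fraction, target_mass_kg):
    """MIRD-style S value: mean energy emitted per decay, times the
    fraction absorbed in the target, divided by the target mass.
    With energy in MeV and mass in kg, the result is Gy per decay."""
    return energy_per_decay_mev * MEV_TO_J * absorbed_fraction / target_mass_kg

def water_sphere_mass_kg(radius_m, density_kg_m3=1000.0):
    """Mass of a liquid-water sphere, the target geometry used in
    these S-value comparisons."""
    return density_kg_m3 * 4.0 / 3.0 * math.pi * radius_m ** 3
```

For nanometre-scale spheres the mass is tiny, so even keV electrons give very large doses per decay, which is why the codes must agree at the sub-percent level to be useful here.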

  9. Application of automated weight windows to spallation neutron source shielding calculations using Geant4

    NASA Astrophysics Data System (ADS)

    Stenander, John; DiJulio, Douglas D.

    2015-10-01

    We present an implementation of a general weight-window generator for global variance reduction in Geant4-based applications. The implementation is flexible and can be easily adjusted to a user-defined model. In this work, the weight-window generator was applied to calculations based on an instrument shielding model of the European Spallation Source, which is currently under construction in Lund, Sweden. The results and performance of the implemented methods were evaluated through the definition of two figures of merit. It was found that the biased simulations showed an overall improvement in performance compared to the unbiased simulations. The present work demonstrates the suitability of both the generator method and Geant4 for these types of calculations.
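The core of any weight-window scheme is a per-particle split-or-roulette decision that keeps statistical weights inside a window while conserving weight in expectation. A minimal sketch, assuming a single global window [w_low, w_high] rather than the space- and energy-dependent windows a real generator produces (the function name and survival-weight choice are illustrative, not Geant4 API):

```python
def apply_weight_window(weight, w_low, w_high, rng):
    """Return the list of particle weights after the weight-window check.

    - Above the window: split into n copies of equal weight.
    - Below the window: Russian roulette; the survivor's weight is
      boosted so the expected total weight is conserved.
    - Inside the window: the particle is left untouched.
    rng: callable returning a uniform random number in [0, 1).
    """
    if weight > w_high:                       # split
        n = int(weight / w_high) + 1
        return [weight / n] * n
    if weight < w_low:                        # Russian roulette
        w_survive = 0.5 * (w_low + w_high)
        if rng() < weight / w_survive:
            return [w_survive]                # survives with boosted weight
        return []                             # killed
    return [weight]
```

Both branches conserve weight: splitting exactly, roulette in expectation, since (weight / w_survive) * w_survive = weight.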

  10. In-flight second order correction of PAMELA calorimeter characteristics (for simulation in Geant4)

    NASA Astrophysics Data System (ADS)

    Dunaeva, O. A.; Alekseev, V. V.; Bogomolov, Yu V.; Lukyanov, A. D.; Malakhov, V. V.; Mayorov, A. G.; Rodenko, S. A.

    2017-01-01

    Simulation of the PAMELA spectrometer characteristics is performed with a dedicated program, accepted by the PAMELA collaboration and based on the Geant4 package, which requires detailed information about the geometry, materials, etc. of the scientific equipment. These data are taken from manufacturers or obtained from various ground-based tests, including accelerator tests. We propose a method for in-flight verification of the calorimeter characteristics. To calculate them, we select relativistic protons passing through the whole spectrometer without interactions. We obtain correction values from a comparison of experimental data and simulation, under the assumption that electromagnetic processes are modeled in Geant4 with high precision. As a result, the characteristics of the silicon detectors (the sensitive part) are verified. The correction factor is 2.0 ± 0.3% with respect to the original value.

  11. Simulation of the production rates of cosmogenic nuclides on the Moon based on Geant4

    NASA Astrophysics Data System (ADS)

    Li, Yong; Zhang, Xiaoping; Dong, Wudong; Ren, Zhongzhou; Dong, Tiekuang; Xu, Aoao

    2017-02-01

    A numerical simulation model is built to simulate the production of cosmogenic nuclides based on Geant4 (GEometry ANd Tracking). Some modifications have been made to the cross sections in Geant4 using experimental data or other appropriate models, and the contributions of all secondary particles produced by cosmic rays are included in our simulation model. Our simulation results suggest a substantial contribution of secondary charged pions to the production rates of 10Be and 14C, as high as 21.04% for 10Be and 21.36% for 14C. Within one set of self-consistent parameters, the simulated production rates of the cosmogenic nuclides 53Mn, 36Cl, 41Ca, 26Al, 10Be, and 14C agree well with the measured data from the Apollo 15 drill core. This model provides users with a validated approach to studying the production of cosmogenic nuclides on planetary surfaces and in meteorites.
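A production rate of the kind computed above is, in essence, the flux-weighted integral of the reaction cross section over energy, P = N · ∫ σ(E) φ(E) dE. A hedged sketch on a discrete energy grid (the function name and trapezoidal quadrature are our own; the paper's actual calculation is done by full particle transport in Geant4):

```python
def production_rate(n_atoms, energies, flux, xsec):
    """Production rate P = N * integral of sigma(E) * phi(E) dE,
    evaluated by the trapezoidal rule on a shared energy grid.

    n_atoms : number of target atoms N
    energies: ascending grid of energies E_i
    flux    : differential flux phi(E_i) at each grid point
    xsec    : cross section sigma(E_i) at each grid point
    """
    total = 0.0
    for i in range(len(energies) - 1):
        de = energies[i + 1] - energies[i]
        # average the integrand sigma*phi over the bin
        integrand = 0.5 * (xsec[i] * flux[i] + xsec[i + 1] * flux[i + 1])
        total += integrand * de
    return n_atoms * total
```

In a full model the sum runs separately over each projectile species (protons, neutrons, pions), which is how the pion contribution quoted above can be isolated.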

  12. Shuttle mission simulator software conceptual design

    NASA Technical Reports Server (NTRS)

    Burke, J. F.

    1973-01-01

    Software conceptual designs (SCD) are presented for meeting the simulator requirements for the shuttle missions. The major areas of the SCD discussed include: malfunction insertion, flight software, applications software, systems software, and computer complex.

  13. Estimation of photoneutron yield in linear accelerator with different collimation systems by Geant4 and MCNPX simulation codes.

    PubMed

    Kim, Yoon Sang; Khazaei, Zeinab; Ko, Junho; Afarideh, Hossein; Ghergherehchi, Mitra

    2016-04-07

    At present, the bremsstrahlung photon beams produced by linear accelerators are the most commonly employed method of radiotherapy for tumor treatments. A photoneutron source based on three different energies (6, 10 and 15 MeV) of a linac electron beam was designed by means of the Geant4 and Monte Carlo N-Particle eXtended (MCNPX) simulation codes. To obtain the maximum neutron yield, two arrangements of the photoneutron converter were studied: (a) without a collimator, and (b) with the converter placed after the collimator. The maximum photon intensities in tungsten were 0.73, 1.24 and 2.07 photon/e at 6, 10 and 15 MeV, respectively. There was no considerable increase in the photon fluence spectra from 6 to 15 MeV at the optimum tungsten thickness of between 0.8 mm and 2 mm. The optimum dimensions of the collimator were determined to be a length of 140 mm with an aperture of 5 mm × 70 mm for iron in a slit shape. According to the neutron yield, the best thickness obtained for the studied materials was 30 mm. The number of neutrons generated in BeO reached its maximum value at 6 MeV, unlike that in Be, where the highest number of neutrons was observed at 15 MeV. Statistical uncertainty in all simulations was less than 0.3% and 0.05% for MCNPX and the standard electromagnetic (EM) physics packages of Geant4, respectively. Differences among spectra in various regions are due to the different cross-section and stopping-power data and the different simulations of the physics processes.

  14. Structural Analysis and Design Software

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Collier Research and Development Corporation received a one-of-a-kind computer code for designing exotic hypersonic aircraft called ST-SIZE in the first-ever Langley Research Center software copyright license agreement. Collier transformed the NASA computer code into a commercial software package called HyperSizer, which integrates with other private-sector finite element modeling and finite element analysis structural programs. ST-SIZE was chiefly conceived as a means to improve and speed the structural design of a future aerospace plane for the Langley Hypersonic Vehicles Office. Incorporating the NASA computer code into HyperSizer has enabled the company to also apply the software to applications other than aerospace, including improved design and construction for offices, marine structures, cargo containers, commercial and military aircraft, rail cars, and a host of everyday consumer products.

  15. Comparisons of Electron and Muon Signals in the Atlas Liquid Argon Calorimeters with GEANT4 Simulations

    NASA Astrophysics Data System (ADS)

    Benchekroun, D.; Karpetian, G.; Mazini, R.; Kiryunin, A.; Salihagic, D.; Strizenec, P.; Kish, J.; Kordas, K.; Parrour, G.; Leltchouk, M.; Negroni, S.; Seligman, W.; Loch, P.; Soukharev, A.

    2002-01-01

    Signals from electrons and muons taken at testbeams with different modules of the ATLAS Liquid Argon Calorimeter have been compared to corresponding simulations using the GEANT4 toolkit. These simulations have also been compared in some detail with GEANT3 based predictions. Results for signal linearity, energy resolution, and shower shapes all generally indicate a good agreement between experiment and the two simulation packages, typically at the level of a few percent.

  16. Calculation of self-shielding factor for neutron activation experiments using GEANT4 and MCNP

    NASA Astrophysics Data System (ADS)

    Romero-Barrientos, Jaime; Molina, F.; Aguilera, Pablo; Arellano, H. F.

    2016-07-01

    The neutron self-shielding factor G as a function of the neutron energy was obtained for 14 pure metallic samples in 1000 isolethargic energy bins from 1 × 10⁻⁵ eV to 2 × 10⁷ eV using Monte Carlo simulations in GEANT4 and MCNP6. The comparison of these two Monte Carlo codes shows small differences in the final self-shielding factor, mostly due to the different cross-section databases that each program uses.
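Isolethargic binning means bin edges with a constant ratio E_{i+1}/E_i, i.e. equal widths in lethargy u = ln(E_ref/E). A small sketch that generates the 1000-bin grid quoted above (illustrative code, not taken from the paper):

```python
def isolethargic_bins(e_min, e_max, n):
    """Return n+1 bin edges spanning [e_min, e_max] with n bins of
    equal lethargy width, i.e. a constant edge ratio e_{i+1}/e_i."""
    ratio = (e_max / e_min) ** (1.0 / n)
    return [e_min * ratio ** i for i in range(n + 1)]
```

With `isolethargic_bins(1e-5, 2e7, 1000)` every bin covers the same logarithmic interval, which is the natural choice for neutron spectra spanning twelve decades in energy.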

  17. NEUTRON GENERATOR FACILITY AT SFU: GEANT4 DOSE RATE PREDICTION AND VERIFICATION.

    PubMed

    Williams, J; Chester, A; Domingo, T; Rizwan, U; Starosta, K; Voss, P

    2016-11-01

    Detailed dose rate maps for a neutron generator facility at Simon Fraser University were produced via the GEANT4 Monte Carlo framework. Predicted neutron dose rates throughout the facility were compared with radiation survey measurements made during the facility commissioning process. When accounting for thermal neutrons, the prediction and measurement agree within a factor of 2 or better in most survey locations, and within 10 % inside the vault housing the neutron generator.

  18. The local skin dose conversion coefficients of electrons, protons and alpha particles calculated using the Geant4 code.

    PubMed

    Zhang, Bintuan; Dang, Bingrong; Wang, Zhuanzi; Wei, Wei; Li, Wenjian

    2013-10-01

    The skin tissue-equivalent slab reported in International Commission on Radiological Protection (ICRP) Publication 116 for calculating localised skin dose conversion coefficients (LSDCCs) was adopted into the Monte Carlo transport code Geant4. The Geant4 code was then utilised to compute LSDCCs for a circular parallel beam of monoenergetic electrons, protons and alpha particles <10 MeV. The computed LSDCCs for both electrons and alpha particles are found to be in good agreement with the ICRP 116 results obtained with the MCNPX code. The present work thus validates the LSDCC values for both electrons and alpha particles using the Geant4 code.

  19. Geant4 simulations of STIX Caliste-SO detector's response to solar X-ray radiation

    NASA Astrophysics Data System (ADS)

    Barylak, Jaromir; Barylak, Aleksandra; Mrozek, Tomasz; Steślicki, Marek; Podgórski, Piotr; Netzel, Henryka

    The Spectrometer/Telescope for Imaging X-rays (STIX) is part of the Solar Orbiter (SO) science payload. SO will be launched in October 2018 and, after a three-year cruise phase, will reach an orbit with a perihelion distance of 0.3 AU. STIX is a Fourier imager equipped with pairs of grids that comprise the flare hard X-ray tomograph. Similar imager types have been used in the past (e.g., RHESSI, Yohkoh/HXT), but STIX will incorporate Moiré modulation and a new type of pixelized detector with a CdTe sensor. We developed a method of modeling these detectors' response matrix (DRM) using Geant4 simulations of X-ray photon interactions with CdTe crystals. Taking into account known detector effects (Fano noise, hole tailing, etc.), we modeled the resulting spectra with high accuracy. Comparison of Caliste-SO laboratory measurements of the 241Am decay spectrum with our results shows very good agreement. The modeling based on the Geant4 simulations significantly improves our understanding of the detector response to X-ray photons. The developed methodology provides the opportunity for detailed simulation of the whole instrument response, with its complicated geometry and with secondary radiation from cosmic-ray particles taken into account. Moreover, we are developing Geant4 simulations of the aging effects that degrade the detector's performance.
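One building block of such a detector response matrix is the broadening of a deposited energy into a measured-energy distribution across spectral channels. A simplified sketch using a single Gaussian of fixed FWHM (the real Caliste-SO modeling also includes Fano statistics and hole tailing; the function below is illustrative only):

```python
import math

def gaussian_response(e_dep, channels, fwhm):
    """Spread a deposited energy into normalized channel probabilities
    using a Gaussian of the given FWHM (detector energy resolution).

    e_dep   : deposited energy (same units as the channel centers)
    channels: list of channel center energies
    fwhm    : full width at half maximum of the resolution function
    """
    sigma = fwhm / 2.3548                 # FWHM = 2*sqrt(2*ln 2) * sigma
    weights = [math.exp(-0.5 * ((c - e_dep) / sigma) ** 2) for c in channels]
    norm = sum(weights)
    return [w / norm for w in weights]
```

Stacking one such row per simulated deposited energy yields a response matrix that maps a Geant4 energy-deposit spectrum onto a predicted measured spectrum.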

  20. GMC: a GPU implementation of a Monte Carlo dose calculation based on Geant4.

    PubMed

    Jahnke, Lennart; Fleckenstein, Jens; Wenz, Frederik; Hesser, Jürgen

    2012-03-07

    We present a GPU implementation called GMC (GPU Monte Carlo) of the low-energy (<100 GeV) electromagnetic part of the Geant4 Monte Carlo code using the NVIDIA® CUDA programming interface. The classes for electron and photon interactions as well as a new parallel particle transport engine were implemented. A particle is processed not in a history-by-history manner but rather by an interaction-by-interaction method. Every history is divided into steps that are then calculated in parallel by different kernels. The geometry package is currently limited to voxelized geometries. A modified parallel Mersenne twister was used to generate random numbers, and a random-number repetition method on the GPU was introduced. All phantom results showed very good agreement between GPU and CPU simulation, with gamma pass rates of >97.5% for a 2%/2 mm gamma criterion. The mean acceleration on one GTX 580 for all cases compared to Geant4 on one CPU core was 4860. The mean number of histories per millisecond on the GPU for all cases was 658, leading to a total simulation time for one intensity-modulated radiation therapy dose distribution of 349 s. In conclusion, Geant4-based Monte Carlo dose calculations were significantly accelerated on the GPU.
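The 2%/2 mm gamma criterion quoted above combines a dose difference and a distance-to-agreement into a single index per reference point; a point passes when the index is ≤ 1. A naive 1D sketch (brute-force search over all test points, dose differences normalized to the local reference dose; names and simplifications are our own):

```python
def gamma_pass_rate(positions, ref, test, dose_tol, dist_tol):
    """Fraction of reference points passing a 1D gamma test.

    positions: shared coordinate grid (e.g. in mm)
    ref, test: dose values on that grid (must be nonzero in ref)
    dose_tol : fractional dose tolerance (0.02 for 2%)
    dist_tol : distance-to-agreement (2.0 for 2 mm)
    """
    passed = 0
    for x_r, d_r in zip(positions, ref):
        gamma_sq = min(
            ((x_t - x_r) / dist_tol) ** 2
            + ((d_t - d_r) / (dose_tol * d_r)) ** 2
            for x_t, d_t in zip(positions, test)
        )
        if gamma_sq <= 1.0:
            passed += 1
    return passed / len(ref)
```

Production gamma tools interpolate the test distribution between grid points and work in 3D, but the pass-or-fail logic is the same.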

  1. Application of TDCR-Geant4 modeling to standardization of 63Ni.

    PubMed

    Thiam, C; Bobin, C; Chauvenet, B; Bouchard, J

    2012-09-01

    As an alternative to the classical TDCR model applied to liquid scintillation (LS) counting, a stochastic approach based on the Geant4 toolkit is presented for the simulation of light emission inside the dedicated three-photomultiplier detection system. To this end, the Geant4 modeling includes a comprehensive description of the optical properties associated with each material constituting the optical chamber. The objective is to simulate the propagation of optical photons from their creation in the LS cocktail to the production of photoelectrons in the photomultipliers. First validated for the case of radionuclide standardization based on Cerenkov emission, the scintillation process has been added to the TDCR-Geant4 modeling using the Birks expression in order to account for the light-emission nonlinearity owing to ionization quenching. The scintillation yield of the commercial Ultima Gold LS cocktail has been determined from double-coincidence detection efficiencies obtained for 60Co and 54Mn with the 4π(LS)β-γ coincidence method. In this paper, the stochastic TDCR modeling is applied to the standardization of 63Ni (pure β- emitter; Emax = 66.98 keV) and the activity concentration is compared with the result given by the classical model.
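The Birks expression accounts for ionization quenching by saturating the light output at high ionization density: dL/dx = S·(dE/dx)/(1 + kB·dE/dx). A toy sketch summing this over discretized track segments (hypothetical names; the actual TDCR-Geant4 modeling applies the expression per simulation step):

```python
def birks_light_yield(de_dx_samples, dx, scint_yield, kb):
    """Total scintillation light along a track using Birks' law.

    de_dx_samples: stopping power dE/dx sampled on equal track segments
    dx           : segment length
    scint_yield  : S, light produced per unit deposited energy (kb = 0 limit)
    kb           : Birks constant; larger kb means stronger quenching
    """
    return sum(scint_yield * s / (1.0 + kb * s) * dx for s in de_dx_samples)
```

For kb = 0 the yield reduces to S times the deposited energy; for a low-energy beta emitter such as 63Ni, where dE/dx is large near the track end, the quenched yield falls visibly below that linear value.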

  2. Geant4 Model Validation of Compton Suppressed System for Process monitoring of Spent Fuel

    SciTech Connect

    Bender, Sarah; Unlu, Kenan; Orton, Christopher R.; Schwantes, Jon M.

    2013-05-01

    Nuclear material accountancy is of continuous concern for the regulatory, safeguards, and verification communities. In particular, spent nuclear fuel reprocessing facilities pose one of the most difficult accountancy challenges: monitoring highly radioactive, fluid sample streams in near real-time. The Multi-Isotope Process monitor will allow for near-real-time indication of process alterations using passive gamma-ray detection coupled with multivariate analysis techniques to guard against potential material diversion or to enhance domestic process monitoring. The Compton continuum from the dominant 661.7 keV 137Cs fission product peak obscures lower energy lines which could be used for spectral and multivariate analysis. Compton suppression may be able to mitigate the challenges posed by the high continuum caused by scattering. A Monte Carlo simulation using the Geant4 toolkit is being developed to predict the expected suppressed spectrum from spent fuel samples to estimate the reduction in the Compton continuum. Despite the lack of timing information between decay events in the particle management of Geant4, encouraging results were recorded utilizing only the information within individual decays without accounting for accidental coincidences. The model has been validated with single and cascade decay emitters in two steps: as an unsuppressed system and with suppression activated. Results of the Geant4 model validation will be presented.
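The suppression logic itself is an anticoincidence veto: a germanium event is rejected when the surrounding shield records a coincident hit, which removes Compton-scattered photons that escaped the primary detector. A minimal event-by-event sketch (hypothetical data layout; as the abstract notes, the real model must also consider timing and accidental coincidences):

```python
def suppressed_spectrum(events, threshold=0.0):
    """Build unsuppressed and suppressed energy lists from coincidence data.

    events   : iterable of (e_ge, e_shield) pairs, the energies deposited
               in the germanium detector and the suppression shield
    threshold: shield energy above which the veto fires
    Returns (unsuppressed, suppressed) lists of germanium energies.
    """
    unsuppressed, suppressed = [], []
    for e_ge, e_shield in events:
        if e_ge <= 0.0:
            continue                       # no germanium signal: not an event
        unsuppressed.append(e_ge)
        if e_shield <= threshold:          # no coincident shield hit: keep
            suppressed.append(e_ge)
    return unsuppressed, suppressed
```

Histogramming both lists and taking their ratio gives the continuum-reduction factor that the Geant4 model is meant to predict.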

  3. Optical simulation of monolithic scintillator detectors using GATE/GEANT4.

    PubMed

    van der Laan, D J Jan; Schaart, Dennis R; Maas, Marnix C; Beekman, Freek J; Bruyndonckx, Peter; van Eijk, Carel W E

    2010-03-21

    Much research is being conducted on position-sensitive scintillation detectors for medical imaging, particularly for emission tomography. Monte Carlo simulations play an essential role in many of these research activities. As the scintillation process, the transport of scintillation photons through the crystal(s), and the conversion of these photons into electronic signals each have a major influence on the detector performance, all of these processes may need to be incorporated in the model to obtain accurate results. In this work the optical and scintillation models of the GEANT4 simulation toolkit are validated by comparing simulations and measurements on monolithic scintillator detectors for high-resolution positron emission tomography (PET). We have furthermore made the GEANT4 optical models available within the user-friendly GATE simulation platform (as of version 3.0). It is shown how the necessary optical input parameters can be determined with sufficient accuracy. The results show that the optical physics models of GATE/GEANT4 enable accurate prediction of the spatial and energy resolution of monolithic scintillator PET detectors.

  4. Comparison of GEANT4 Simulations with Experimental Data for Thick Al Absorbers

    SciTech Connect

    Yevseyeva, Olga; Assis, Joaquim de; Diaz, Katherin; Lopes, Ricardo

    2009-06-03

    Proton beams in medical applications deal with relatively thick targets like the human head or trunk. Therefore, relatively small differences in the total proton stopping power given, for example, by the different models provided by GEANT4 can lead to significant disagreements in the final proton energy spectra when integrated along lengthy proton trajectories. This work presents proton energy spectra obtained by GEANT4.8.2 simulations using the ICRU49, Ziegler1985 and Ziegler2000 models for 19.68 MeV protons passing through a number of Al absorbers of various thicknesses. The spectra were compared with experimental data, with TRIM/SRIM2008 and MCNPX2.4.0 simulations, and with the Payne analytical solution for the transport equation in the Fokker-Planck approximation. It is shown that the MCNPX simulations reproduce all experimental spectra reasonably well. For the relatively thin targets all the methods give practically identical results, but this is not the case for the thick absorbers. It should be noted that all the spectra were measured at proton energies significantly above 2 MeV, i.e., in the so-called 'Bethe-Bloch region'. Therefore the observed disagreements among GEANT4 results simulated with different models are somewhat unexpected. Further studies are necessary for better understanding and definitive conclusions.
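The sensitivity to the stopping-power model can be seen already in the continuous-slowing-down approximation, where the mean exit energy follows from integrating dE/dx step by step through the absorber. A toy sketch (illustrative names; real codes add energy-loss straggling, which is what produces the measured spectra rather than a single exit energy):

```python
def residual_energy(e0, thickness, stopping_power, steps=10000):
    """Mean residual energy after an absorber in the continuous-
    slowing-down approximation: E -> E - S(E)*dx per step.

    e0            : initial kinetic energy
    thickness     : absorber thickness
    stopping_power: callable S(E) returning energy loss per unit length
    Returns 0.0 if the particle stops inside the absorber.
    """
    dx = thickness / steps
    e = e0
    for _ in range(steps):
        e -= stopping_power(e) * dx
        if e <= 0.0:
            return 0.0
    return e
```

Swapping in two slightly different S(E) tables and comparing the exit energies after a thick absorber illustrates how small stopping-power differences accumulate along the trajectory.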

  5. GEANT4 simulations of the n_TOF spallation source and their benchmarking

    NASA Astrophysics Data System (ADS)

    Lo Meo, S.; Cortés-Giraldo, M. A.; Massimi, C.; Lerendegui-Marco, J.; Barbagallo, M.; Colonna, N.; Guerrero, C.; Mancusi, D.; Mingrone, F.; Quesada, J. M.; Sabate-Gilarte, M.; Vannini, G.; Vlachoudis, V.

    2015-12-01

    Neutron production and transport in the spallation target of the n_TOF facility at CERN have been simulated with GEANT4. The results obtained with different models of high-energy nucleon-nucleus interaction have been compared with the measured characteristics of the neutron beam, in particular the flux and its dependence on neutron energy, measured in the first experimental area. The best agreement at present, within 20% for the absolute value of the flux and within a few percent for the energy dependence over the whole energy range from thermal to 1 GeV, is obtained with the INCL++ model coupled with the GEANT4 native de-excitation model. All other available models overestimate the n_TOF neutron flux by a larger factor, of up to 70%. The simulations are also able to accurately reproduce the neutron beam energy resolution function, which is essentially determined by the moderation time inside the target/moderator assembly. The results reported here provide confidence in the use of GEANT4 for simulations of spallation neutron sources.

  6. Assessment of Geant4 Prompt-Gamma Emission Yields in the Context of Proton Therapy Monitoring

    PubMed Central

    Pinto, Marco; Dauvergne, Denis; Freud, Nicolas; Krimmer, Jochen; Létang, Jean M.; Testa, Etienne

    2016-01-01

    Monte Carlo tools have long been used to assist the research and development of solutions for proton therapy monitoring. The present work focuses on the prompt-gamma emission yields by comparing experimental data with the outcomes of the current version of Geant4 using all applicable proton inelastic models. For the case under study and using the binary cascade model, it was found that Geant4 overestimates the prompt-gamma emission yields by 40.2 ± 0.3%, even though it accurately predicts the length of the experimental prompt-gamma profile. In addition, the default implementations of all proton inelastic models show an overestimation in the number of prompt gammas emitted. Finally, a set of built-in options and physically sound Geant4 source code changes have been tested in order to try to reduce the observed discrepancy. A satisfactory agreement was found when using the QMD model with a wave packet width equal to 1.3 fm². PMID:26858937

  7. Assessment of Geant4 Prompt-Gamma Emission Yields in the Context of Proton Therapy Monitoring.

    PubMed

    Pinto, Marco; Dauvergne, Denis; Freud, Nicolas; Krimmer, Jochen; Létang, Jean M; Testa, Etienne

    2016-01-01

    Monte Carlo tools have long been used to assist the research and development of solutions for proton therapy monitoring. The present work focuses on the prompt-gamma emission yields by comparing experimental data with the outcomes of the current version of Geant4 using all applicable proton inelastic models. For the case under study and using the binary cascade model, it was found that Geant4 overestimates the prompt-gamma emission yields by 40.2 ± 0.3%, even though it accurately predicts the length of the experimental prompt-gamma profile. In addition, the default implementations of all proton inelastic models show an overestimation in the number of prompt gammas emitted. Finally, a set of built-in options and physically sound Geant4 source code changes have been tested in order to try to reduce the observed discrepancy. A satisfactory agreement was found when using the QMD model with a wave packet width equal to 1.3 fm².

  8. FIB Microfabrication Software Design Considerations

    NASA Astrophysics Data System (ADS)

    Thompson, W.; Bowe, T.; Morlock, S.; Moskowitz, A.; Plourde, G.; Spaulding, G.; Scialdone, C.; Tsiang, E.

    1986-06-01

    Profit margins on high-volume ICs, such as the 256-K DRAM, are now inadequate. U.S. and foreign manufacturers cannot fully recover the ICs' engineering costs before a new round of product competition begins. Consequently, some semiconductor manufacturers are seeking less competitive designs with healthier, longer lasting profitability. These designs must be converted quickly from CAD to functional circuits in order for profits to be realized. For ultrahigh-performance devices, customized circuits, and rapid verification of design, FIB (focused ion beam) systems provide a viable alternative to the lengthy process of producing a large mask set. Early models of FIB equipment did not require sophisticated software. However, as FIB technology approaches adolescence, it must be supported by software that gives the user a friendly system, the flexibility to design a wide variety of circuits, and good growth potential for tomorrow's ICs. Presented here is an overview of IBT's MicroFocus 150 hardware, followed by descriptions of several MicroFocus software modules. Data preparation techniques from IBCAD formats to chip layout are compared to the more conventional lithographies. The MicroFocus 150 schemes for user interfacing, error logging, calibration, and subsystem control are given. The MicroFocus's pattern generator and bit-slice software are explained. IBT's FIB patterning algorithms, which allow the fabrication of unique device types, are reviewed.

  9. MaGe: a Geant4-Based Monte Carlo Application Framework for Low-Background Germanium Experiments

    SciTech Connect

    Boswell, Melissa; Chan, Yuen-Dat; Detwiler, Jason A.; Finnerty, Padraic; Henning, Reyco; Gehman, Victor M.; Johnson, Rob A.; Jordan, David V.; Kazkaz, Kareem; Knapp, Markus; Kroninger, Kevin; Lenz, Daniel; Leviner, Lance; Liu, Jing; Liu, Xiang; MacMullin, Sean; Marino, Michael G.; Mokhtarani, Akbar; Pandola, Luciano; Schubert, Alexis G.; Schubert, Jens; Tomei, Claudia; Volynets, Oleksandr

    2011-06-01

    We describe a physics simulation software framework, MaGe, that is based on the GEANT4 simulation toolkit. MaGe is used to simulate the response of ultra-low radioactive background radiation detectors to ionizing radiation, specifically for the MAJORANA and GERDA neutrinoless double-beta decay experiments. MAJORANA and GERDA use high-purity germanium technology to search for the neutrinoless double-beta decay of the 76Ge isotope, and MaGe is jointly developed by these two collaborations. The MaGe framework contains simulated geometries of common objects, prototypes, test stands, and the actual experiments. It also implements customized event generators, GEANT4 physics lists, and output formats. All of these features are available as class libraries that are typically compiled into a single executable. The user selects the particular experimental setup implementation at run-time via macros. The combination of all these common classes into one framework reduces duplication of effort, eases comparison between simulated data and experiment, and simplifies the addition of new detectors to be simulated. This paper focuses on the software framework, custom event generators, and physics lists.

  10. GEANT4 calculations of neutron dose in radiation protection using a homogeneous phantom and a Chinese hybrid male phantom.

    PubMed

    Geng, Changran; Tang, Xiaobin; Guan, Fada; Johns, Jesse; Vasudevan, Latha; Gong, Chunhui; Shu, Diyun; Chen, Da

    2016-03-01

    The purpose of this study is to verify the feasibility of applying GEANT4 (version 10.01) to neutron dose calculations in radiation protection by comparing the calculation results with MCNP5. The depth dose distributions are investigated in a homogeneous phantom, and the fluence-to-dose conversion coefficients are calculated for different organs in the Chinese hybrid male phantom for neutrons with energy ranging from 1 × 10⁻⁹ to 10 MeV. By comparing the simulation results between GEANT4 and MCNP5, it is shown that, using the high-precision (HP) neutron physics list, GEANT4 produces the closest simulation results to MCNP5. However, differences can be observed when the neutron energy is lower than 1 × 10⁻⁶ MeV. Activating thermal scattering with an S matrix correction in GEANT4 with HP and in MCNP5 in the thermal energy range reduces the difference between the two codes.

  11. Monte Carlo calculations of thermal neutron capture in gadolinium: a comparison of GEANT4 and MCNP with measurements.

    PubMed

    Enger, Shirin A; Munck af Rosenschöld, Per; Rezaei, Arash; Lundqvist, Hans

    2006-02-01

    GEANT4 is a Monte Carlo code originally implemented for high-energy physics applications and is well known for particle transport at high energies. The capacity of GEANT4 to simulate neutron transport in the thermal energy region is not equally well known. The aim of this article is to compare MCNP, a code commonly used in low-energy neutron transport calculations, and GEANT4 with experimental results, and to select the suitable code for gadolinium neutron capture applications. To account for thermal neutron scattering from chemically bound atoms [S(alpha,beta)] in biological materials, a comparison of the thermal neutron fluence in a tissue-like poly(methylmethacrylate) phantom is made with MCNP4B, GEANT4 6.0 patch 1, and measurements from the neutron capture therapy (NCT) facility at Studsvik, Sweden. The fluence measurements agreed with the MCNP results calculated with S(alpha,beta). The location of the thermal neutron peak calculated with MCNP without S(alpha,beta), and with GEANT4, is shifted by about 0.5 cm towards a shallower depth and is 25%-30% lower in amplitude. The dose distribution from the gadolinium neutron capture reaction is then simulated with MCNP and compared with measured data. The simulations made by MCNP agree well with experimental results. As long as thermal neutron scattering from chemically bound atoms is not included in GEANT4, it is not suitable for NCT applications.

  12. Electron slowing-down spectra in water for electron and photon sources calculated with the Geant4-DNA code.

    PubMed

    Vassiliev, Oleg N

    2012-02-21

    Recently, a very low energy extension was added to the Monte Carlo simulation toolkit Geant4. It is intended for radiobiological modeling and is referred to as Geant4-DNA. Its performance, however, has not been systematically benchmarked in terms of transport characteristics. This study reports on the electron slowing-down spectra and mean energy per ion pair, the W-value, in water for monoenergetic electron and photon sources calculated with Geant4-DNA. These quantities depend on electron energy, but not on spatial or angular variables which makes them a good choice for testing the model of energy transfer processes. The spectra also have a scientific value for radiobiological modeling as they describe the energy distribution of electrons entering small volumes, such as the cell nucleus. Comparisons of Geant4-DNA results with previous studies showed overall good agreement. Some differences in slowing-down spectra between Geant4-DNA and previous studies were found at 100 eV and at approximately 500 eV that were attributed to approximations in models of vibrational excitations and atomic de-excitation after ionization by electron impact. We also found that the high-energy part of the Geant4-DNA spectrum for a 1 keV electron source was higher, and the asymptotic high-energy W-value was lower than previous studies reported.

  13. Absorbed dose estimations of 131I for critical organs using the GEANT4 Monte Carlo simulation code

    NASA Astrophysics Data System (ADS)

    Rahman, Ziaur; ur Rehman, Shakeel; Arshed, Waheed; Mirza, Nasir M.; Rashid, Abdul; Zeb, Jahan

    2012-11-01

    The aim of this study is to compare the absorbed doses to critical organs from 131I calculated with the MIRD (Medical Internal Radiation Dose) method with the corresponding predictions made by GEANT4 simulations. S-values (mean absorbed dose rate per unit activity) and energy deposition per decay of 131I for critical organs at various ages have also been estimated, using a standard cylindrical phantom comprising water and ICRP soft-tissue material. The effect of the volume reduction of the thyroid during radiation therapy on the calculated absorbed dose is also estimated using GEANT4. Photon-specific energy deposition in the other organs of the neck, due to 131I decay in the thyroid, has also been estimated. The maximum relative difference between MIRD and the GEANT4 simulation results is 5.64% for the critical organs of an adult. Excellent agreement was found between the results for water and for ICRP soft tissue using the cylindrical model. S-values are tabulated for the critical organs of 131I for 1-, 5-, 10-, 15- and 18-year-old (adult) individuals. S-values for cylindrical thyroids of different sizes show a 3.07% relative difference between GEANT4 and the Siegel & Stabin results. Comparison of values measured with an ionization chamber at 0.5 and 1 m from the neck with GEANT4-based Monte Carlo simulation results shows good agreement. This study shows that the GEANT4 code is an important tool for internal dosimetry calculations.
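In the MIRD schema the absorbed dose to a target organ is the sum over source organs of the cumulated activity times the S-value: D(target) = sum over sources of Ã(source) · S(target ← source). A minimal sketch with hypothetical organ names and toy numbers (illustrative only, not the paper's data):

```python
def absorbed_dose(cumulated_activity, s_values):
    """MIRD absorbed dose to one target organ.

    cumulated_activity: dict source organ -> A-tilde (e.g. in Bq*s)
    s_values          : dict source organ -> S(target <- source)
                        (e.g. in Gy per Bq*s) for the chosen target
    """
    return sum(cumulated_activity[src] * s for src, s in s_values.items())
```

Monte Carlo codes such as GEANT4 enter this schema by providing the S-values, i.e. the mean energy deposited in the target per decay in each source region divided by the target mass.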

  14. Software Performs Complex Design Analysis

    NASA Technical Reports Server (NTRS)

    2008-01-01

    Designers use computational fluid dynamics (CFD) to gain greater understanding of the fluid flow phenomena involved in components being designed. They also use finite element analysis (FEA) as a tool to help gain greater understanding of the structural response of components to loads, stresses and strains, and to predict failure modes. Automated CFD and FEA engineering design has centered on shape optimization, which has been hindered by two major problems: (1) inadequate shape parameterization algorithms, and (2) inadequate algorithms for CFD and FEA grid modification. Working with software engineers at Stennis Space Center, a NASA commercial partner, Optimal Solutions Software LLC, was able to utilize its one-of-a-kind arbitrary shape deformation (ASD) capability, a major advancement in solving these two problems, to optimize the shapes of complex pipe components that transport highly sensitive fluids. The ASD technology solves the problem of inadequate shape parameterization algorithms by allowing CFD designers to freely create their own shape parameters, thereby eliminating the restriction of only being able to use computer-aided design (CAD) parameters. The problem of inadequate algorithms for CFD grid modification is solved by the fact that the new software performs a smooth volumetric deformation, which eliminates the extremely costly process of having to remesh the grid for every desired shape change. The program can perform a design change in a markedly reduced amount of time, whereas the traditional process of returning to the CAD model to reshape and then remesh has been known to take hours, days, even weeks or months, depending upon the size of the model.

  15. Model-based software design

    NASA Technical Reports Server (NTRS)

    Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui; Yenne, Britt; Vansickle, Larry; Ballantyne, Michael

    1992-01-01

    Domain-specific knowledge is required to create specifications, generate code, and understand existing systems. Our approach to automating software design is based on instantiating an application domain model with industry-specific knowledge and then using that model to achieve the operational goals of specification elicitation and verification, reverse engineering, and code generation. Although many different specification models can be created from any particular domain model, each specification model is consistent and correct with respect to the domain model.

  16. Extension of PENELOPE to protons: Simulation of nuclear reactions and benchmark with Geant4

    SciTech Connect

    Sterpin, E.; Sorriaux, J.; Vynckier, S.

    2013-11-15

    Purpose: To describe the implementation of nuclear reactions in the extension of the Monte Carlo (MC) code PENELOPE to protons (PENH), and to benchmark it against Geant4. Methods: PENH is based on mixed-simulation mechanics for both elastic and inelastic electromagnetic (EM) collisions. The adopted differential cross sections for EM elastic collisions are calculated using the eikonal approximation with the Dirac–Hartree–Fock–Slater atomic potential. Cross sections for EM inelastic collisions are computed within the relativistic Born approximation, using the Sternheimer–Liljequist model of the generalized oscillator strength. Nuclear elastic and inelastic collisions were simulated using the Scattering Analysis Interactive Dial-in (SAID) database for {sup 1}H and ICRU 63 data for {sup 12}C, {sup 14}N, {sup 16}O, {sup 31}P, and {sup 40}Ca. Secondary protons, alphas, and deuterons were all simulated as protons, with the energy adapted to ensure a consistent range. Prompt gamma emission can also be simulated upon user request. Simulations were performed in a water phantom with nuclear interactions switched off or on, and integral depth–dose distributions were compared. Binary-cascade and precompound models were used for Geant4. Initial energies of 100 and 250 MeV were considered. For the cases with no nuclear interactions simulated, additional simulations in a water phantom with tight resolution (1 mm in all directions) were performed with FLUKA. Finally, integral depth–dose distributions for a 250 MeV beam were computed with Geant4 and PENH in homogeneous phantoms of, first, ICRU striated muscle and, second, ICRU compact bone. Results: For simulations with EM collisions only, integral depth–dose distributions agreed within 1%/1 mm for doses higher than 10% of the Bragg-peak dose. For central-axis depth–dose and lateral profiles in a phantom with tight resolution, there are significant deviations between Geant4 and PENH (up to 60%/1 cm for depth

  17. Validation of the GEANT4 simulation of bremsstrahlung from thick targets below 3 MeV

    NASA Astrophysics Data System (ADS)

    Pandola, L.; Andenna, C.; Caccia, B.

    2015-05-01

    The bremsstrahlung spectra produced by electrons impinging on thick targets are simulated using the GEANT4 Monte Carlo toolkit. Simulations are validated against experimental data available in the literature in the energy range between 0.5 and 2.8 MeV for Al and Fe targets, and at 70 keV for Al, Ag, W and Pb targets. The energy spectra for different configurations of emission angle, energy and target are considered. Simulations are performed using the three alternative sets of electromagnetic models available in GEANT4 to describe bremsstrahlung. At the higher energies (0.5-2.8 MeV) of electrons impinging on Al and Fe targets, GEANT4 is able to reproduce the spectral shapes and the integral photon emission in the forward direction; the agreement is within 10-30%, depending on energy, emission angle and target material. The physics model based on the Penelope Monte Carlo code is in slightly better agreement with the measured data than the other two. However, all models over-estimate the photon emission in the backward hemisphere. For the lower-energy study (70 keV), which includes higher-Z targets, all models systematically under-estimate the total photon yield, with agreement between 10% and 50%. The results of this work are of potential interest for medical physics applications, where knowledge of the energy spectra and angular distributions of photons is needed for accurate dose calculations with Monte Carlo and other fluence-based methods.

  18. Comparison of electromagnetic and hadronic models generated using Geant 4 with antiproton dose measured in CERN.

    PubMed

    Tavakoli, Mohammad Bagher; Reiazi, Reza; Mohammadi, Mohammad Mehdi; Jabbari, Keyvan

    2015-01-01

    After the idea of antiproton cancer treatment was proposed in 1984, many experiments were launched to investigate the physical and radiobiological properties of antiprotons arising from their annihilation reactions. One of these experiments was performed at the European Organization for Nuclear Research (CERN) using the Antiproton Decelerator. Its ultimate goal was to assess the dosimetric and radiobiological properties of antiproton beams in order to estimate the suitability of antiprotons for radiotherapy. One difficulty was the long-term unavailability of the antiproton beam at CERN, so verification of Monte Carlo codes against the measured antiproton depth dose could be useful. Among available simulation codes, Geant4 provides acceptable flexibility and extensibility, which has progressively led to the development of novel Geant4 applications in research domains, especially the modelling of the biological effects of ionizing radiation at the sub-cellular scale. In this study, the depth dose corresponding to the CERN antiproton beam energy was calculated with Geant4 using all the standard physics lists currently available and benchmarked for other use cases. Overall, none of the standard physics lists was able to reproduce the antiproton percentage depth dose. Although results with some models were promising, the Bragg-peak level remained the point of concern for our study. It is concluded that the Bertini model with high-precision neutron tracking (QGSP_BERT_HP) best matches the experimental data, though it is also the slowest of the physics lists in simulating events.

  19. GEANT4 simulation of the effects of Doppler energy broadening in Compton imaging.

    PubMed

    Uche, C Z; Cree, M J; Round, W H

    2011-09-01

    A Monte Carlo approach was used to study the effects of Doppler energy broadening on Compton camera performance. The GEANT4 simulation toolkit was used to model the radiation transport and interactions with matter in a simulated Compton camera. The low-energy electromagnetic physics model of GEANT4 incorporating the Doppler broadening developed by Longo et al. was used in the simulations. The camera had a 9 × 9 cm scatterer and a 10 × 10 cm absorber with a scatterer-to-absorber separation of 5 cm. Only the effects of Doppler broadening were taken into consideration; the effects of scatterer and absorber thickness and pixelation were neglected, so a 'perfect' Compton camera was assumed. The scatterer material was either silicon or germanium, and the absorber material was cadmium zinc telluride. Simulations were performed for point sources 10 cm in front of the scatterer. The results of the simulations validated the use of the low-energy model of GEANT4. As expected, Doppler broadening was found to degrade the Compton camera imaging resolution. For a 140.5 keV source, the full-width-at-half-maximum (FWHM) of the point-source image with a silicon scatterer was 0.58 mm without Doppler broadening; this degraded to 7.1 mm when Doppler broadening was introduced, and further to 12.3 mm when a germanium scatterer was used instead of silicon. For a 511 keV source the FWHM was better than for the 140.5 keV source: 2.4 mm for a silicon scatterer and 4.6 mm for a germanium scatterer. Our result for silicon at 140.5 keV is in very good agreement with that published by An et al.
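A Compton camera reconstructs the scatter angle from the measured energies via the Compton formula, so any smearing of the scattered-photon energy (Doppler broadening) translates directly into an angular, hence spatial, error. A sketch of that kinematics; the 1 keV energy smear is an illustrative stand-in for the Doppler effect, not a value from the paper:

```python
import math

MEC2 = 511.0  # electron rest energy, keV

def scattered_energy(e0_kev, theta_rad):
    """Compton formula: photon energy after scattering through theta."""
    return e0_kev / (1.0 + (e0_kev / MEC2) * (1.0 - math.cos(theta_rad)))

def scatter_angle(e0_kev, e_scat_kev):
    """Invert the Compton formula to reconstruct the scatter angle,
    as a Compton camera does from the deposited energies."""
    cos_t = 1.0 - MEC2 * (1.0 / e_scat_kev - 1.0 / e0_kev)
    return math.acos(max(-1.0, min(1.0, cos_t)))

def angle_shift(e0, theta, de=1.0):
    """Angular error produced by a small smear de of the measured energy."""
    e_s = scattered_energy(e0, theta)
    return abs(scatter_angle(e0, e_s + de) - theta)

# At 511 keV and 90 degrees the scattered photon carries half the energy.
print(scattered_energy(511.0, math.pi / 2))  # ~255.5 keV
# The same 1 keV smear deflects the reconstructed angle more at 140.5 keV
# than at 511 keV, consistent with the better FWHM reported at 511 keV.
print(math.degrees(angle_shift(140.5, math.pi / 2)),
      math.degrees(angle_shift(511.0, math.pi / 2)))
```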

  20. Simulation of the 6 MV Elekta Synergy Platform linac photon beam using Geant4 Application for Tomographic Emission.

    PubMed

    Didi, Samir; Moussa, Abdelilah; Yahya, Tayalati; Mustafa, Zerfaoui

    2015-01-01

    The present work validates the Geant4 Application for Tomographic Emission (GATE) Monte Carlo software for the simulation of a 6 MV photon beam delivered by the treatment head of an Elekta Synergy Platform medical linear accelerator. The simulation includes the major components of the linear accelerator (LINAC) with multi-leaf collimator and a homogeneous water phantom. Calculations were performed for the photon beam with several treatment field sizes ranging from 5 cm × 5 cm to 30 cm × 30 cm at 100 cm distance from the source. The simulation was successfully validated by comparison with experimental distributions. Good agreement between simulations and measurements was observed, with dose differences of about 0.02% for depth doses and 2.5% for lateral dose profiles. This agreement was also confirmed by the Kolmogorov-Smirnov goodness-of-fit test and by gamma-index comparisons, in which more than 99% of the points for all simulations fulfil the 2 mm/2% quality assurance criteria.
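The 2 mm/2% gamma-index test quoted above combines a dose-difference and a distance-to-agreement criterion into one pass/fail metric per point. A minimal 1D sketch of the global-gamma computation; the Gaussian profiles are illustrative stand-ins, not the paper's measured data:

```python
import math

def gamma_index_1d(x_ref, d_ref, x_eval, d_eval, dta_mm=2.0, dd_frac=0.02):
    """1D gamma analysis with a global 2 mm / 2% criterion: for each
    reference point, minimise the combined distance/dose metric over the
    evaluated curve; gamma <= 1 means the point passes."""
    d_max = max(d_ref)
    gammas = []
    for xr, dr in zip(x_ref, d_ref):
        g2 = min(((xe - xr) / dta_mm) ** 2
                 + ((de - dr) / (dd_frac * d_max)) ** 2
                 for xe, de in zip(x_eval, d_eval))
        gammas.append(math.sqrt(g2))
    return gammas

# Illustrative profiles: a Gaussian lateral dose profile and a slightly
# shifted, rescaled copy standing in for the simulation.
x = [i * 0.5 - 50.0 for i in range(201)]                  # position, mm
measured = [100.0 * math.exp(-xi**2 / 450.0) for xi in x]
simulated = [100.5 * math.exp(-(xi - 0.5)**2 / 450.0) for xi in x]
g = gamma_index_1d(x, measured, x, simulated)
passing = sum(1 for gi in g if gi <= 1.0) / len(g)
print(f"pass rate: {100.0 * passing:.1f}%")
```

Clinical gamma tools work the same way in 2D/3D, with interpolation of the evaluated distribution between grid points.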

  1. Simulation of the 6 MV Elekta Synergy Platform linac photon beam using Geant4 Application for Tomographic Emission

    PubMed Central

    Didi, Samir; Moussa, Abdelilah; Yahya, Tayalati; Mustafa, Zerfaoui

    2015-01-01

    The present work validates the Geant4 Application for Tomographic Emission (GATE) Monte Carlo software for the simulation of a 6 MV photon beam delivered by the treatment head of an Elekta Synergy Platform medical linear accelerator. The simulation includes the major components of the linear accelerator (LINAC) with multi-leaf collimator and a homogeneous water phantom. Calculations were performed for the photon beam with several treatment field sizes ranging from 5 cm × 5 cm to 30 cm × 30 cm at 100 cm distance from the source. The simulation was successfully validated by comparison with experimental distributions. Good agreement between simulations and measurements was observed, with dose differences of about 0.02% for depth doses and 2.5% for lateral dose profiles. This agreement was also confirmed by the Kolmogorov–Smirnov goodness-of-fit test and by gamma-index comparisons, in which more than 99% of the points for all simulations fulfil the 2 mm/2% quality assurance criteria. PMID:26500399

  2. GEANT4 simulation of cyclotron radioisotope production in a solid target.

    PubMed

    Poignant, F; Penfold, S; Asp, J; Takhar, P; Jackson, P

    2016-05-01

    The use of radioisotopes in nuclear medicine is essential for diagnosing and treating cancer. Optimizing their production is a key factor in maximizing the production yield and minimizing the associated costs. An efficient approach to this problem is the use of Monte Carlo simulations prior to experimentation. By predicting isotope yields, one can study the expected activity of the isotope of interest for different energy ranges. One can also study the contamination of the target with other radioisotopes, especially undesired radioisotopes of the desired chemical element, which are difficult to separate from the irradiated target and may increase the dose when the radiopharmaceutical product is delivered to the patient. The aim of this work is to build and validate a Monte Carlo simulation platform using the GEANT4 toolkit to model the solid-target system of the South Australian Health and Medical Research Institute (SAHMRI) GE Healthcare PETtrace cyclotron. It includes a GEANT4 graphical user interface (GUI) in which the user can modify simulation parameters such as the energy, shape and current of the proton beam, the target geometry and material, the foil geometry and material, and the time of irradiation. The paper describes the simulation and presents a comparison of simulated and experimental/theoretical yields for various nuclear reactions on an enriched nickel-64 target using the GEANT4 physics model QGSP_BIC_AllHP, a model available in Geant4 version 10.1 and recently developed to evaluate with high precision the interactions of protons with energies below 200 MeV. The simulated yield of the (64)Ni(p,n)(64)Cu reaction was found to be 7.67±0.074 mCi·μA(-1) for a target energy range of 9-12 MeV. Szelecsenyi et al. (1993) give a theoretical yield of 6.71 mCi·μA(-1) and an experimental yield of 6.38 mCi·μA(-1). The (64)Ni(p,n)(64)Cu cross section obtained with the simulation was also verified against the yield predicted from the nuclear database TENDL and
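Once a yield like the mCi·µA⁻¹ figure above is known, the activity at the end of bombardment follows from the usual saturation law A = Y_sat · I · (1 − e^(−λt)). A sketch under one common convention (Y_sat as the saturation yield per unit current); the saturation yield and beam parameters below are illustrative, not SAHMRI's operating values, though the 12.7 h Cu-64 half-life is real:

```python
import math

def eob_activity(sat_yield_mci_per_ua, current_ua, t_irr_h, half_life_h):
    """End-of-bombardment activity from a saturation yield:
    A = Y_sat * I * (1 - exp(-lambda * t_irr))."""
    lam = math.log(2) / half_life_h  # decay constant, h^-1
    return sat_yield_mci_per_ua * current_ua * (1.0 - math.exp(-lam * t_irr_h))

# Illustrative run: hypothetical 120 mCi/uA saturation yield, 40 uA beam,
# 4 h irradiation, Cu-64 half-life 12.7 h.
a = eob_activity(sat_yield_mci_per_ua=120.0, current_ua=40.0,
                 t_irr_h=4.0, half_life_h=12.7)
print(f"EOB activity: {a:.0f} mCi")
```

The short-irradiation limit recovers the linear yield regime (A ≈ Y_sat · I · λt), which is why yields are often quoted per µA·h instead.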

  3. Application of Geant4 simulation for analysis of soil carbon inelastic neutron scattering measurements.

    PubMed

    Yakubova, Galina; Kavetskiy, Aleksandr; Prior, Stephen A; Torbert, H Allen

    2016-07-01

    Inelastic neutron scattering (INS) was applied to determine soil carbon content. Due to the non-uniform depth distribution of soil carbon, the correlation of INS signals with a single soil carbon parameter is not obvious; however, Monte Carlo simulation (Geant4) demonstrates a proportionality between INS signals and the average carbon weight percent in a ~10 cm layer for any carbon depth profile. Comparison of INS and dry combustion measurements confirms this conclusion. Thus, INS measurements yield the value of this soil carbon parameter.

  4. Geant4 simulations on Compton scattering of laser photons on relativistic electrons

    SciTech Connect

    Filipescu, D.; Utsunomiya, H.; Gheorghe, I.; Glodariu, T.; Tesileanu, O.; Shima, T.; Takahisa, K.; Miyamoto, S.

    2015-02-24

    Using Geant4, a complex simulation code of the interaction between laser photons and relativistic electrons was developed. We implemented physically constrained electron-beam emittance and spatial distribution parameters, and we also considered a Gaussian laser beam. The code was tested against experimental data produced at the γ-ray beam line GACKO (Gamma Collaboration Hutch of Konan University) of the synchrotron radiation facility NewSUBARU. Here we discuss the implications of transverse misalignments of the collimation system relative to the electron beam axis.
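The γ-ray energies produced in such laser Compton scattering follow from the standard inverse-Compton kinematics: for head-on collision and exact back-scattering, E_max ≈ 4γ²E_L / (1 + 4γE_L/m_ec²). A sketch of that formula; the ~1 GeV electron and 1064 nm laser values are generic illustrations, not the NewSUBARU operating point:

```python
def inverse_compton_emax(e_electron_mev, e_laser_ev):
    """Maximum (head-on, back-scattered) photon energy in laser Compton
    scattering: E_max = 4*gamma^2*E_L / (1 + 4*gamma*E_L/(m_e c^2))."""
    me = 0.511                       # electron rest energy, MeV
    gamma = e_electron_mev / me      # Lorentz factor
    el_mev = e_laser_ev * 1e-6       # laser photon energy in MeV
    return 4 * gamma**2 * el_mev / (1 + 4 * gamma * el_mev / me)

# Illustrative: ~1 GeV electrons against a 1.17 eV (1064 nm) laser photon
# give a gamma-ray beam in the tens of MeV.
print(f"{inverse_compton_emax(1000.0, 1.17):.1f} MeV")
```

Collimation selects photons near this kinematic maximum, which is why transverse misalignments of the collimator directly distort the measured spectrum.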

  5. Summing-coincidence corrections with Geant4 in routine measurements by γ spectrometry of environmental samples.

    PubMed

    Quintana, B; Montes, C

    2014-05-01

    In this work, we describe a method to quantitatively evaluate true-coincidence-summing effects by making use of the Geant4 toolkit, which incorporates an emulation of the radionuclide disintegration scheme. To check the capabilities of the method, we first validated the simulated corrections against those obtained experimentally for radionuclides such as (60)Co, (152)Eu and (133)Ba. We then evaluated the effect of summing corrections for some radionuclides included in two intercomparison exercises, concluding that the results improved when the method described here was used.

  6. Automating software design system DESTA

    NASA Technical Reports Server (NTRS)

    Lovitsky, Vladimir A.; Pearce, Patricia D.

    1992-01-01

    'DESTA' is the acronym for the Dialogue Evolutionary Synthesizer of Turnkey Algorithms by means of a natural language (Russian or English) functional specification of algorithms or software being developed. DESTA represents the computer-aided and/or automatic artificial intelligence 'forgiving' system which provides users with software tools support for algorithm and/or structured program development. The DESTA system is intended to provide support for the higher levels and earlier stages of engineering design of software in contrast to conventional Computer Aided Design (CAD) systems which provide low level tools for use at a stage when the major planning and structuring decisions have already been taken. DESTA is a knowledge-intensive system. The main features of the knowledge are procedures, functions, modules, operating system commands, batch files, their natural language specifications, and their interlinks. The specific domain for the DESTA system is a high level programming language like Turbo Pascal 6.0. The DESTA system is operational and runs on an IBM PC computer.

  7. Geant4 Monte Carlo simulation of energy loss and transmission and ranges for electrons, protons and ions

    NASA Astrophysics Data System (ADS)

    Ivantchenko, Vladimir

    Geant4 is a toolkit for the Monte Carlo simulation of particle transport, originally developed for applications in high-energy physics with a focus on experiments at the Large Hadron Collider (CERN, Geneva). The transparency and flexibility of the code have spread its use to other fields of research, e.g. radiotherapy and space science. The toolkit provides the possibility to simulate complex geometries, transport in electric and magnetic fields, and a variety of physics models for the interaction of particles with media. Geant4 has been used to simulate radiation effects for a number of space missions. Recent upgrades of the toolkit, released in December 2009, include a new model for ion electronic stopping power based on the revised version of ICRU Report 73, increasing the accuracy of the simulation of ion transport. In the current work we present the status of the Geant4 electromagnetic package for the simulation of particle energy loss, ranges and transmission. This has direct implications for the simulation of ground-testing setups at existing European facilities and of radiation effects in space. A number of improvements were introduced for electron and proton transport, followed by a thorough validation. The aim of the present study was to validate the computed ranges against reference data from the United States National Institute of Standards and Technology (NIST) ESTAR, PSTAR and ASTAR databases. We compared Geant4 and NIST ranges of electrons using different Geant4 models; the best agreement was found for Penelope, except at very low energies in heavy materials, where the Standard package gave better results. Geant4 proton ranges in water agreed with NIST within 1%. The validation of the new ion model is performed against recent data on the Bragg-peak position in water. The data on transmission of carbon ions through various absorbers, followed by the Bragg peak in water, demonstrate that the new Geant4 model significantly improves the precision of the ion range. The absolute accuracy of ion range
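The range validation discussed above rests on the CSDA range, the integral of the reciprocal stopping power over energy. For protons in water this integral is well approximated by the empirical Bragg-Kleeman rule R = αE^p; a sketch with commonly quoted fit constants (the α and p values are generic literature fits, not the paper's):

```python
def bragg_kleeman_range(e_mev, alpha=0.0022, p=1.77):
    """Empirical Bragg-Kleeman approximation to the proton CSDA range in
    water: R = alpha * E^p, with R in cm and E in MeV. alpha and p are
    fitted constants (illustrative values from the literature)."""
    return alpha * e_mev ** p

# A 150 MeV proton stops near 15-16 cm depth in water, the clinically
# relevant regime for the Bragg-peak comparisons described above.
print(f"{bragg_kleeman_range(150.0):.1f} cm")
```

Databases such as NIST PSTAR tabulate the full numerical integral instead; the power law is only a convenient cross-check.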

  8. Evaluation of proton inelastic reaction models in Geant4 for prompt gamma production during proton radiotherapy.

    PubMed

    Jeyasugiththan, Jeyasingam; Peterson, Stephen W

    2015-10-07

    During proton beam radiotherapy, discrete secondary prompt gamma rays are induced by inelastic nuclear reactions between protons and nuclei in the human body. In recent years, the Geant4 Monte Carlo toolkit has played an important role in the development of devices for real-time dose-range verification using prompt gamma radiation. Unfortunately, the default physics models in Geant4 do not reliably replicate the measured prompt gamma emission, so determining a suitable physics model for low-energy proton inelastic interactions would boost the accuracy of prompt gamma simulations. Among the built-in physics models, we found that the precompound model with a modified initial exciton state of 2 (1 particle, 1 hole) produced more accurate discrete gamma lines from the most important elements found within the body, such as 16O, 12C and 14N, when compared with the available gamma production cross-section data. Using the modified physics model, we investigated the prompt gamma spectra produced in a water phantom by a 200 MeV pencil beam of protons. The spectra were obtained using a LaBr3 detector with a time-of-flight (TOF) window and a BGO active shield to reduce the secondary neutron and gamma background. The simulations show that a 2 ns TOF window could reject 99% of the secondary neutron flux hitting the detector. The results show that using both timing and active shielding can remove up to 85% of the background radiation, which includes a 33% reduction by BGO subtraction.
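The TOF discrimination works because prompt gammas travel at c while even fast neutrons are markedly slower, so they arrive at the detector well outside a nanosecond-scale window. A kinematic sketch; the 30 cm flight path and 50 MeV neutron energy are illustrative assumptions, not the paper's geometry:

```python
import math

C = 2.998e8      # speed of light, m/s
MN = 939.565     # neutron rest energy, MeV

def neutron_beta(t_mev):
    """Relativistic speed (v/c) of a neutron with kinetic energy T."""
    gamma = 1.0 + t_mev / MN
    return math.sqrt(1.0 - 1.0 / gamma**2)

def flight_time_ns(distance_m, beta):
    """Time of flight over a straight path at speed beta*c."""
    return distance_m / (beta * C) * 1e9

d = 0.30  # illustrative source-to-detector distance, m
t_gamma = flight_time_ns(d, 1.0)
t_neutron = flight_time_ns(d, neutron_beta(50.0))  # 50 MeV neutron
print(f"gamma: {t_gamma:.2f} ns, neutron: {t_neutron:.2f} ns")
```

Here the neutron lags the gamma by roughly 2 ns over 30 cm, so gating a ~2 ns window on the prompt-gamma arrival time rejects most such neutrons, consistent with the 99% rejection quoted above.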

  9. Simulation of positron backscattering and implantation profiles using Geant4 code

    NASA Astrophysics Data System (ADS)

    Huang, Shi-Juan; Pan, Zi-Wen; Liu, Jian-Dang; Han, Rong-Dian; Ye, Bang-Jiao

    2015-10-01

    For the proper interpretation of the experimental data produced by the slow-positron-beam technique, the positron implantation properties are studied carefully using the latest Geant4 code. The simulated backscattering coefficients, implantation profiles, and median implantation depths for mono-energetic positrons with energies from 1 keV to 50 keV normally incident on different crystals are reported. Compared with previous experimental results, our simulated backscattering coefficients are in reasonable agreement, and we think that the accuracy may be related to the structures of the host materials in the Geant4 code. Based on the reasonable simulated backscattering coefficients, the adjustable parameters of the implantation profiles, which depend on material and implantation energy, are obtained. Most importantly, we calculate the positron backscattering coefficients and median implantation depths in amorphous polymers for the first time, and our simulations are in fairly good agreement with previous experimental results. Project supported by the National Natural Science Foundation of China (Grant Nos. 11175171 and 11105139).
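Positron implantation profiles of the kind fitted above are conventionally described by a Makhovian distribution P(z) = (m z^(m-1) / z0^m) exp(−(z/z0)^m), with the mean depth scaling empirically as z̄ = (A/ρ)·Eⁿ. A sketch of the related depth quantities; the A, n and m values are generic illustrative choices, not the parameters extracted in the paper:

```python
import math

def mean_depth_nm(e_kev, a=40.0, n=1.6, rho=1.0):
    """Empirical scaling of the mean implantation depth, z_bar = (A/rho)*E^n.
    A, n and the density rho here are illustrative placeholder values."""
    return (a / rho) * e_kev ** n

def makhov_median_depth(z0_nm, m=2.0):
    """Median of the Makhovian profile P(z): z_med = z0 * (ln 2)^(1/m)."""
    return z0_nm * math.log(2) ** (1.0 / m)

zbar = mean_depth_nm(10.0)                      # 10 keV positrons
z0 = zbar / math.gamma(1.0 + 1.0 / 2.0)         # z0 from the mean, for m = 2
zmed = makhov_median_depth(z0)
print(f"mean: {zbar:.0f} nm, median: {zmed:.0f} nm")
```

For m = 2 the median lies below the mean, which is why median implantation depths (as reported in the abstract) differ systematically from the mean-depth scaling law.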

  10. Application of dynamic Monte Carlo technique in proton beam radiotherapy using Geant4 simulation toolkit

    NASA Astrophysics Data System (ADS)

    Guan, Fada

    The Monte Carlo method has been successfully applied to simulating particle transport problems. Most Monte Carlo simulation tools are static: they can only perform simulations of problems with fixed physics and geometry settings, whereas proton therapy is a dynamic treatment technique in clinical application. In this research, we developed a method to perform dynamic Monte Carlo simulation of proton therapy using the Geant4 simulation toolkit. A passive-scattering treatment nozzle equipped with a rotating range-modulation wheel was modelled. One important application of the Monte Carlo simulation is to predict the spatial dose distribution in the target geometry. For simplicity, a mathematical model of the human body is usually used as the target, but then only the average dose over a whole organ or tissue can be obtained, rather than an accurate spatial dose distribution. We therefore developed a method, using MATLAB, to convert the medical images of a patient from CT scanning into a patient voxel geometry; when this voxel geometry is used as the target in the Monte Carlo simulation, the accurate spatial dose distribution in the target can be obtained. The data analysis tool ROOT was used to score the simulation results during a Geant4 run and to analyze the data and plot results after the simulation. Finally, we successfully obtained the accurate spatial dose distribution in part of a human body for a patient with prostate cancer treated using proton therapy.

  11. Comparison between EGSnrc, Geant4, MCNP5 and Penelope for mono-energetic electron beams.

    PubMed

    Archambault, John Paul; Mainegra-Hing, Ernesto

    2015-07-07

    A simple geometry is chosen to highlight similarities and differences among the electron transport algorithms implemented in four Monte Carlo codes commonly used in radiation physics. The energy deposited in a water-filled sphere by mono-energetic electron beams was calculated using EGSnrc, Geant4, MCNP5 and Penelope as the radius of the sphere varied from 0.25 cm to 4.5 cm, for beam energies of 0.5 MeV, 1.0 MeV and 5.0 MeV. The calculations were performed in single-scattering mode (where applicable) and in condensed-history mode. Good agreement is found for the single-scattering calculations, except for the in-air case at 0.5 MeV, where differences between EGSnrc and Penelope increase with decreasing radius, up to 5%. Differences between results calculated with the default user settings and with each code's own single-scattering mode are under 5% for all codes when the sphere is surrounded by vacuum; however, large differences occur for Geant4, MCNP5 and Penelope when air is introduced around the sphere. Finally, the parameters associated with the multiple-scattering algorithms were tuned, reducing these differences to below 10% for these codes at the expense of increased computation time.

  12. Multi-scale hybrid models for radiopharmaceutical dosimetry with Geant4.

    PubMed

    Marcatili, S; Villoing, D; Garcia, M P; Bardiès, M

    2014-12-21

    The accuracy of radiopharmaceutical absorbed-dose distributions computed through Monte Carlo (MC) simulations is mostly limited by the low spatial resolution of the 3D imaging techniques used to define the simulation geometry. This issue persists even with realistic hybrid models built using polygonal meshes and/or NURBS, as they must be simulated in voxelized form to keep computation times reasonable. The existing trade-off between voxel size and simulation speed leads, on the one hand, to an overestimation of the size of small radiosensitive structures such as the skin or the walls of hollow organs and, on the other, to an unnecessarily detailed voxelization of large, homogeneous structures. We developed a set of computational tools based on VTK and Geant4 to build multi-resolution organ models. Our aim is to use different voxel sizes to represent anatomical regions of different clinical relevance: the MC implementation of these models is expected to improve spatial resolution in specific anatomical structures without significantly affecting simulation speed. Here we present the tools developed through a proof-of-principle example. Our approach is validated against the standard Geant4 technique for the simulation of voxel geometries.

  13. Local dose enhancement of proton therapy by ceramic oxide nanoparticles investigated with Geant4 simulations.

    PubMed

    McKinnon, Sally; Guatelli, Susanna; Incerti, Sebastien; Ivanchenko, Vladimir; Konstantinov, Konstantin; Corde, Stéphanie; Lerch, Michael; Tehei, Moeava; Rosenfeld, Anatoly

    2016-12-01

    Nanoparticles (NPs) have been shown to enhance X-ray radiotherapy and proton therapy of cancer. The effectiveness of radiation damage is enhanced in the presence of high-atomic-number (high-Z) NPs due to the increased production of low-energy, higher-linear-energy-transfer (LET) secondary electrons when NPs are selectively internalized by tumour cells. This work quantifies, for the first time in proton therapy, the local dose enhancement produced in the target tumour by the high-Z ceramic oxide NPs Ta2O5 and CeO2, by means of Geant4 simulations. The dose enhancement produced by the ceramic oxides is compared against gold NPs. The energy deposition on a nanoscale around a single nanoparticle of 100 nm diameter is investigated using the Geant4-DNA extension to model particle interactions in the water medium. Enhancement of the energy deposition in nano-sized shells of water local to the NP boundary, ranging between 14% and 27%, was observed for proton energies of 5 MeV and 50 MeV, depending on the NP material. Enhancement of electron production and energy deposition can be correlated to the direct DNA damage mechanism if the NP is in close proximity to the nucleus.

  14. Evaluation of proton inelastic reaction models in Geant4 for prompt gamma production during proton radiotherapy

    NASA Astrophysics Data System (ADS)

    Jeyasugiththan, Jeyasingam; Peterson, Stephen W.

    2015-10-01

    During proton beam radiotherapy, discrete secondary prompt gamma rays are induced by inelastic nuclear reactions between protons and nuclei in the human body. In recent years, the Geant4 Monte Carlo toolkit has played an important role in the development of devices for real-time dose-range verification using prompt gamma radiation. Unfortunately, the default physics models in Geant4 do not reliably replicate the measured prompt gamma emission, so determining a suitable physics model for low-energy proton inelastic interactions would boost the accuracy of prompt gamma simulations. Among the built-in physics models, we found that the precompound model with a modified initial exciton state of 2 (1 particle, 1 hole) produced more accurate discrete gamma lines from the most important elements found within the body, such as 16O, 12C and 14N, when compared with the available gamma production cross-section data. Using the modified physics model, we investigated the prompt gamma spectra produced in a water phantom by a 200 MeV pencil beam of protons. The spectra were obtained using a LaBr3 detector with a time-of-flight (TOF) window and a BGO active shield to reduce the secondary neutron and gamma background. The simulations show that a 2 ns TOF window could reject 99% of the secondary neutron flux hitting the detector. The results show that using both timing and active shielding can remove up to 85% of the background radiation, which includes a 33% reduction by BGO subtraction.

  15. Carbon fragmentation measurements and validation of the Geant4 nuclear reaction models for hadrontherapy.

    PubMed

    De Napoli, M; Agodi, C; Battistoni, G; Blancato, A A; Cirrone, G A P; Cuttone, G; Giacoppo, F; Morone, M C; Nicolosi, D; Pandola, L; Patera, V; Raciti, G; Rapisarda, E; Romano, F; Sardina, D; Sarti, A; Sciubba, A; Scuderi, V; Sfienti, C; Tropea, S

    2012-11-21

    Nuclear fragmentation measurements are necessary when using heavy-ion beams in hadrontherapy to predict the effects of the ions' nuclear interactions within the human body. They are also fundamental for validating and improving the Monte Carlo codes used in planning tumor treatments. To date, only a very limited set of carbon fragmentation cross sections has been measured and, in particular, to our knowledge no double-differential fragmentation cross sections at intermediate energies are available in the literature. In this work, we have measured the double-differential cross sections and the angular distributions of the secondary fragments produced in (12)C fragmentation at 62 A MeV on a thin carbon target. The experimental data have been used to benchmark the prediction capability of the Geant4 Monte Carlo code at intermediate energies, where it had never been tested before. In particular, we have compared the experimental data with the predictions of two Geant4 nuclear reaction models: the Binary Light Ion Cascade and the Quantum Molecular Dynamics model. The comparison shows that the Binary Light Ion Cascade approximates the angular distributions of the fragment production cross sections better than the Quantum Molecular Dynamics model. However, the discrepancies observed between the experimental data and the Monte Carlo simulations lead to the conclusion that the prediction capability of both models needs to be improved at intermediate energies.

  16. Carbon fragmentation measurements and validation of the Geant4 nuclear reaction models for hadrontherapy

    NASA Astrophysics Data System (ADS)

    De Napoli, M.; Agodi, C.; Battistoni, G.; Blancato, A. A.; Cirrone, G. A. P.; Cuttone, G.; Giacoppo, F.; Morone, M. C.; Nicolosi, D.; Pandola, L.; Patera, V.; Raciti, G.; Rapisarda, E.; Romano, F.; Sardina, D.; Sarti, A.; Sciubba, A.; Scuderi, V.; Sfienti, C.; Tropea, S.

    2012-11-01

    Nuclear fragmentation measurements are necessary when using heavy-ion beams in hadrontherapy to predict the effects of the ion nuclear interactions within the human body. Moreover, they are also fundamental to validate and improve the Monte Carlo codes used in planning tumor treatments. To date, only a very limited set of carbon fragmentation cross sections has been measured, and in particular, to our knowledge, no double-differential fragmentation cross sections at intermediate energies are available in the literature. In this work, we have measured the double-differential cross sections and the angular distributions of the secondary fragments produced in 12C fragmentation at 62 A MeV on a thin carbon target. The experimental data have been used to benchmark the predictive capability of the Geant4 Monte Carlo code at intermediate energies, where it had never been tested before. In particular, we have compared the experimental data with the predictions of two Geant4 nuclear reaction models: the Binary Light Ions Cascade and the Quantum Molecular Dynamics model. The comparison shows that the Binary Light Ions Cascade reproduces the angular distributions of the fragment production cross sections better than the Quantum Molecular Dynamics model. However, the discrepancies observed between the experimental data and the Monte Carlo simulations lead to the conclusion that the predictive capability of both models needs to be improved at intermediate energies.
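
    To make the quantity concrete: a double-differential cross section is typically extracted from raw counts by normalizing to the number of beam ions, the target areal number density, the solid-angle and energy bin widths, and the detection efficiency. The sketch below illustrates that bookkeeping with invented numbers; none of the values are the measured data from this paper.

```python
def double_diff_xs(counts, n_beam, target_thickness_cm, target_density_g_cm3,
                   molar_mass_g, delta_omega_sr, delta_E_MeV, efficiency):
    """d^2(sigma)/(dOmega dE) in mb/(sr MeV) from raw counts.

    Target areal number density: n_t = rho * t * N_A / M  [nuclei/cm^2].
    """
    N_A = 6.02214076e23
    n_t = target_density_g_cm3 * target_thickness_cm * N_A / molar_mass_g
    sigma_cm2 = counts / (n_beam * n_t * delta_omega_sr * delta_E_MeV * efficiency)
    return sigma_cm2 * 1e27  # 1 mb = 1e-27 cm^2

# Hypothetical run: 1200 counts in a 1 msr, 5 MeV bin behind a 100 um carbon
# target, 1e9 incident ions, 80% efficiency.
xs = double_diff_xs(counts=1200, n_beam=1e9, target_thickness_cm=0.01,
                    target_density_g_cm3=1.8, molar_mass_g=12.0,
                    delta_omega_sr=1e-3, delta_E_MeV=5.0, efficiency=0.8)
```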

  17. Geant4 studies of the CNAO facility system for hadrontherapy treatment of uveal melanomas

    NASA Astrophysics Data System (ADS)

    Rimoldi, A.; Piersimoni, P.; Pirola, M.; Riccardi, C.

    2014-06-01

    The Italian National Centre of Hadrontherapy for Cancer Treatment (CNAO - Centro Nazionale di Adroterapia Oncologica) in Pavia, Italy, started treating selected cancers with its first patients in late 2011. In the coming months, CNAO plans to activate a new dedicated treatment line for the irradiation of uveal melanomas using the available active beam scanning. The beam characteristics and the experimental setup must be tuned to reach the precision required for such treatments. A collaboration between the CNAO Foundation, the University of Pavia and INFN began in 2011 to study the feasibility of these specialised treatments by implementing a MC simulation of the beam transport line and comparing the simulation results with measurements at CNAO. The goal is to optimise an eye-dedicated beam transport line and to find the best conditions for ocular melanoma irradiation. This paper describes the Geant4 toolkit simulation of the CNAO setup, together with a model of a human eye containing a tumour. The Geant4 application could also be used to test possible treatment planning systems. Simulation results illustrate the possibility of adapting the CNAO standard beam transport line by optimising the position of the isocentre and adding passive elements to better shape the beam for this dedicated application.

  18. Comparisons of hadrontherapy-relevant data to nuclear interaction codes in the Geant4 toolkit

    NASA Astrophysics Data System (ADS)

    Braunn, B.; Boudard, A.; Colin, J.; Cugnon, J.; Cussol, D.; David, J. C.; Kaitaniemi, P.; Labalme, M.; Leray, S.; Mancusi, D.

    2013-03-01

    Comparisons between experimental data, INCL and other nuclear models available in the Geant4 toolkit are presented. The data used for the comparisons come from a fragmentation experiment performed at the GANIL facility. The main purpose of this experiment was to measure production rates and angular distributions of particles emitted from the collision of a 95 A MeV 12C beam with thick PMMA (plastic) targets. The latest version of the Intra Nuclear Cascade of Liege code, extended to nucleus-nucleus collisions for ion beam therapy applications, is described. This code, as well as JQMD and the Geant4 binary cascade, has been compared with these hadrontherapy-oriented experimental data. The comparisons exhibit an overall qualitative agreement between the models and the experimental data. At a quantitative level, however, none of these three models manages to reproduce all the data precisely. The nucleus-nucleus extension of INCL, although not yet predictive enough for ion beam therapy applications, has nevertheless proven competitive with other nuclear collision codes.

  19. Software engineering and Ada in design

    NASA Technical Reports Server (NTRS)

    Oneill, Don

    1986-01-01

    Modern software engineering promises significant reductions in software costs and improvements in software quality. The Ada language is the focus for these software methodology and tool improvements. The IBM FSD approach is examined, including the software engineering practices that guide the systematic design and development of software products and the management of the software process. The revised Ada design language adaptation is presented. This four-level design methodology is detailed, including the purpose of each level, the management strategy that integrates the software design activity with the program milestones, and the technical strategy that maps the Ada constructs to each level of design. A complete description of each design level is provided along with specific design language recording guidelines for each level. Finally, some testimony is offered on education, tools, architecture, and metrics resulting from project use of the four-level Ada design language adaptation.

  20. Calculation of extrapolation curves in the 4π(LS)β-γ coincidence technique with the Monte Carlo code Geant4.

    PubMed

    Bobin, C; Thiam, C; Bouchard, J

    2016-03-01

    At LNE-LNHB, a liquid scintillation (LS) detection setup designed for Triple to Double Coincidence Ratio (TDCR) measurements is also used in the β-channel of a 4π(LS)β-γ coincidence system. This LS counter, based on three photomultipliers, was first modeled using the Monte Carlo code Geant4 to enable the simulation of optical photons produced by scintillation and Cerenkov effects. This stochastic modeling was especially designed for the calculation of double and triple coincidences between photomultipliers in TDCR measurements. In the present paper, the TDCR-Geant4 model is extended to 4π(LS)β-γ coincidence counting by the addition of a γ-channel, enabling the simulation of the efficiency-extrapolation technique. This simulation tool aims at predicting systematic biases in activity determination due to possible non-linearity of efficiency-extrapolation curves. First results are described for the standardization of (59)Fe. The variation of the γ-efficiency in the β-channel due to Cerenkov emission is investigated for the activity measurements of (54)Mn. The problem of non-linearity between β-efficiencies is illustrated for the efficiency-tracing technique applied to the activity measurements of (14)C using (60)Co as a tracer.
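
    The efficiency-extrapolation technique mentioned above can be sketched in a few lines: in 4πβ-γ coincidence counting one plots the quantity N_β·N_γ/N_c against the β-channel inefficiency parameter (1 − ε_β)/ε_β, with ε_β = N_c/N_γ, and extrapolates a (nominally linear) fit to zero inefficiency to estimate the activity N0. The count rates below are synthetic, generated from an assumed "true" activity of 1000 s⁻¹ with a perfectly linear response, so the fit recovers 1000; this is a toy illustration, not the LNE-LNHB analysis code.

```python
def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

def extrapolate_activity(n_beta, n_gamma, n_coinc):
    """Linear efficiency extrapolation: intercept at zero beta-inefficiency."""
    xs = [(g - c) / c for g, c in zip(n_gamma, n_coinc)]        # (1-eff)/eff
    ys = [b * g / c for b, g, c in zip(n_beta, n_gamma, n_coinc)]
    slope, intercept = linear_fit(xs, ys)
    return intercept                                             # estimated N0

# Synthetic data: four measurement points with varying beta efficiency.
true_n0, eff_gamma = 1000.0, 0.1
betas = [0.95, 0.90, 0.85, 0.80]
n_beta = [true_n0 * e for e in betas]
n_gamma = [true_n0 * eff_gamma] * len(betas)
n_coinc = [true_n0 * e * eff_gamma for e in betas]
n0 = extrapolate_activity(n_beta, n_gamma, n_coinc)
```

    A non-linear efficiency curve would show up here as a biased intercept, which is exactly the systematic effect the Geant4 model is built to predict.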

  1. Software Design Improvements. Part 1; Software Benefits and Limitations

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R.; Packard, Michael H.; Ziemianski, Tom

    1997-01-01

    Computer hardware and associated software have been used for many years to process accounting information, to analyze test data and to perform engineering analysis. Now computers and software also control everything from automobiles to washing machines, and the number and type of applications are growing at an exponential rate. The size of individual programs has shown similar growth. Furthermore, software and hardware are used to monitor and/or control potentially dangerous products and safety-critical systems. These uses include everything from airplanes and braking systems to medical devices and nuclear plants. The questions are: how can this hardware and software be made more reliable? How can software quality be improved? What methodology is needed, for large and small software products alike, to improve the design, and how can software be verified?

  2. FLOWTRAN-TF software design

    SciTech Connect

    Aleman, S.E.; Flach, G.P.; Hamm, L.L.; Lee, S.Y.; Smith, F.G. III.

    1993-02-01

    FLOWTRAN-TF was created to analyze an individual Mk22 fuel assembly during a large break Loss Of Coolant Accident (LOCA) scenario involving the Savannah River Site K-reactor after the initial few seconds of the transient. During the initial few seconds reactor cooling is limited by the static or Ledinegg flow instability phenomenon. The predecessor FLOWTRAN code was developed to analyze this portion of a LOCA. In the several seconds following the break, a significant fraction of the reactor coolant inventory leaks out of the break, Emergency Cooling System (ECS) flow is initiated, and air enters the primary coolant circulation loops. Reactor fuel assemblies are cooled by a low flowrate air-water downflow. Existing commercial nuclear industry thermal-hydraulic codes were judged inadequate for detailed modeling of a Mk22 fuel assembly because the application involves a ribbed annular geometry, low pressure, downflow and an air-water mixture. FLOWTRAN-TF is a two-phase thermal-hydraulics code of similar technology to existing commercial codes such as RELAP and TRAC but customized for Savannah River Site applications. The main features and capabilities of FLOWTRAN-TF are detailed Mk22 fuel assembly ribbed annular geometry; conjugate heat transfer; detailed neutronic power distribution; three-dimensional heat conduction in Mk22 fuel and target tubes; two-dimensional coolant flow in channels (axial, azimuthal); single-phase and/or two-phase fluid (gas, liquid and/or gas-liquid); two-component (air, water); constitutive models applicable to low pressure air-water downflow in ribbed annular channels. The design of FLOWTRAN-TF is described in detail in this report, which serves as the Software Design Report in accordance with Quality Assurance Procedure IV-4, Rev. 0, "Software Design and Implementation", in the 1Q34 manual.

  4. Comparative studies on shielding properties of some steel alloys using Geant4, MCNP, WinXCOM and experimental results

    NASA Astrophysics Data System (ADS)

    Singh, Vishwanath P.; Medhat, M. E.; Shirmardi, S. P.

    2015-01-01

    The mass attenuation coefficients μ/ρ and effective atomic numbers Zeff of some carbon steel and stainless steel alloys have been calculated using the Geant4 and MCNP simulation codes for different gamma-ray energies: 279.1 keV, 661.6 keV, 662 keV, 1115.5 keV, 1173 keV and 1332 keV. The simulated Zeff values from the Geant4 and MCNP codes have been compared with the available experimental results and with theoretical WinXCom values, and good agreement has been observed. The agreement of the simulated μ/ρ and Zeff values signifies that either simulation process can be used to determine the gamma-ray interaction properties of the alloys at energies for which analogous experimental results are not available. Such studies have various applications, for example in radiation dosimetry, medicine and radiation shielding.
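
    For reference, the standard way to combine elemental data into alloy values (whatever the codes do internally) is the mixture rule for μ/ρ and the ratio of total atomic to electronic cross sections for Zeff. The sketch below uses an SS304-like weight composition and placeholder μ/ρ inputs, not WinXCOM data.

```python
def mixture_mu_rho(weights, mu_rho):
    """Mass attenuation coefficient of a mixture: sum of w_i * (mu/rho)_i."""
    return sum(w * m for w, m in zip(weights, mu_rho))

def z_eff(weights, Z, A, mu_rho):
    """Effective atomic number: total atomic / electronic cross section."""
    n = [w / a for w, a in zip(weights, A)]   # relative atoms per gram
    f = [x / sum(n) for x in n]               # number fractions
    sigma_a = sum(fi * ai * mi for fi, ai, mi in zip(f, A, mu_rho))
    sigma_e = sum(fi * (ai / zi) * mi for fi, zi, ai, mi in zip(f, Z, A, mu_rho))
    return sigma_a / sigma_e

# SS304-like composition by weight (Fe/Cr/Ni); mu/rho values are placeholders
# of roughly the right magnitude for ~662 keV photons, in cm^2/g.
w = [0.70, 0.19, 0.11]
Z = [26, 24, 28]
A = [55.845, 51.996, 58.693]
mu = [0.0739, 0.0732, 0.0746]
alloy_mu = mixture_mu_rho(w, mu)
alloy_z = z_eff(w, Z, A, mu)
```

    A quick sanity check on the formula: for a single pure element the expression collapses to that element's Z.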

  5. New photodisintegration model of GEANT4 for the dγ → np reaction with a dibaryon effective field theory

    NASA Astrophysics Data System (ADS)

    Shin, Jae Won; Hyun, Chang Ho

    2016-09-01

    We develop a new hadronic model for GEANT4 that is specialized for the disintegration of the deuteron by photons, dγ → np. For the description of two-nucleon interactions, we employ a pionless effective field theory with dibaryon fields (dEFT). We apply the new model of GEANT4 (G4dEFT) to the calculations of the total and the differential cross sections in dγ → np and compare the results with empirical data. As an application of the new model, we calculate the neutron yield from the γ+CD2 process. G4dEFT predicts peaks for the neutron yield, but the existing model of GEANT4 does not show such behavior.

  6. Galactic Cosmic Rays and Lunar Secondary Particles from Solar Minimum to Maximum: CRaTER Observations and Geant4 Modeling

    NASA Astrophysics Data System (ADS)

    Looper, M. D.; Mazur, J. E.; Blake, J. B.; Spence, H. E.; Schwadron, N.; Golightly, M. J.; Case, A. W.; Kasper, J. C.; Townsend, L. W.; Wilson, J. K.

    2014-12-01

    The Lunar Reconnaissance Orbiter mission was launched in 2009 during the recent deep and extended solar minimum, with the highest galactic cosmic ray (GCR) fluxes observed since the beginning of the space era. Its Cosmic Ray Telescope for the Effects of Radiation (CRaTER) instrument was designed to measure the spectra of energy deposits in silicon detectors shielded behind pieces of tissue equivalent plastic, simulating the self-shielding provided by an astronaut's body around radiation-sensitive organs. The CRaTER data set now covers the evolution of the GCR environment near the moon during the first five years of development of the present solar cycle. We will present these observations, along with Geant4 modeling to illustrate the varying particle contributions to the energy-deposit spectra. CRaTER has also measured protons traveling up from the lunar surface after their creation during GCR interactions with surface material, and we will report observations and modeling of the energy and angular distributions of these "albedo" protons.

  7. Geant4 Simulation of A Multi-layered target for the Study of Neutron-Unbound Nuclei

    NASA Astrophysics Data System (ADS)

    Gueye, Paul; Freeman, Jessica; Frank, Nathan; Thoennessen, Michael; MONA Collaboration

    2013-10-01

    The MoNA/LISA setup at the National Superconducting Cyclotron Laboratory at Michigan State University has, for the past decade, provided an avenue to study the nuclear structure of unbound states/nuclei at and beyond the neutron dripline using secondary beams from the Coupled Cyclotron Facility. A new multi-layered Si/Be active target is being designed specifically to study neutron-unbound nuclei. In these experiments the decay energy is reconstructed from fragment-neutron coincidence measurements that are typically low in count rate. The multi-layered target will allow the use of thicker targets to increase the reaction rates, thus enabling the study of currently out-of-reach nuclei such as 21C, 23C and 24N. The Geant4 Monte Carlo toolkit is currently used to model the physics processes within the multi-layered target and the expected invariant mass distributions. A description of the experimental setup and simulation work will be discussed. This work is supported by the Department of Energy National Nuclear Security Administration under Award Number DE-NA0000979.
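
    The decay-energy reconstruction from fragment-neutron coincidences is the invariant-mass method: E_decay = sqrt((E_f + E_n)² − |p_f + p_n|²) − m_f − m_n. A minimal sketch, with hypothetical masses and momenta (MeV and MeV/c) rather than values from the experiment:

```python
import math

def decay_energy(m_f, m_n, p_f, p_n):
    """Decay energy (MeV) from fragment and neutron momentum 3-vectors (MeV/c).

    E_decay = sqrt((E_f + E_n)^2 - |p_f + p_n|^2) - m_f - m_n
    """
    E_f = math.sqrt(m_f ** 2 + sum(p * p for p in p_f))
    E_n = math.sqrt(m_n ** 2 + sum(p * p for p in p_n))
    p_tot_sq = sum((a + b) ** 2 for a, b in zip(p_f, p_n))
    return math.sqrt((E_f + E_n) ** 2 - p_tot_sq) - m_f - m_n

# Hypothetical event: heavy fragment (~18 GeV rest mass) and neutron moving
# along the beam axis with slightly different velocities.
e_dec = decay_energy(18000.0, 939.565, (0.0, 0.0, 6000.0), (0.0, 0.0, 350.0))
```

    When fragment and neutron have identical velocities the decay energy is zero; any relative motion in the final state makes it positive, which is why thicker targets (more statistics at the cost of resolution) need the kind of Geant4 modeling described above.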

  8. Efficient voxel navigation for proton therapy dose calculation in TOPAS and Geant4

    NASA Astrophysics Data System (ADS)

    Schümann, J.; Paganetti, H.; Shin, J.; Faddegon, B.; Perl, J.

    2012-06-01

    A key task within all Monte Carlo particle transport codes is ‘navigation’, the calculation to determine at each particle step what volume the particle may be leaving and what volume the particle may be entering. Navigation should be optimized to the specific geometry at hand. For patient dose calculation, this geometry generally involves voxelized computed tomography (CT) data. We investigated the efficiency of navigation algorithms on currently available voxel geometry parameterizations in the Monte Carlo simulation package Geant4: G4VPVParameterisation, G4VNestedParameterisation and G4PhantomParameterisation, the last with and without boundary skipping, a method where neighboring voxels with the same Hounsfield unit are combined into one larger voxel. A fourth parameterization approach (MGHParameterization), developed in-house before the latter two parameterizations became available in Geant4, was also included in this study. All simulations were performed using TOPAS, a tool for particle simulations layered on top of Geant4. Runtime comparisons were made on three distinct patient CT data sets: a head and neck, a liver and a prostate patient. We included in the runtime study an additional version of these three patients in which all voxels, including the air voxels outside of the patient, were uniformly set to water. The G4VPVParameterisation offers two optimization options. One option results in a 60-150 times slower simulation speed. The other is comparable in speed but requires 15-19 times more memory than the other parameterizations. We found the average CPU time used for the simulation relative to G4VNestedParameterisation to be 1.014 for G4PhantomParameterisation without boundary skipping and 1.015 for MGHParameterization. The average runtime ratio for G4PhantomParameterisation with and without boundary skipping for our heterogeneous data was equal to 0.97:1. The calculated dose distributions agreed with the reference distribution for all but the G4

  9. Efficient voxel navigation for proton therapy dose calculation in TOPAS and Geant4.

    PubMed

    Schümann, J; Paganetti, H; Shin, J; Faddegon, B; Perl, J

    2012-06-07

    A key task within all Monte Carlo particle transport codes is 'navigation', the calculation to determine at each particle step what volume the particle may be leaving and what volume the particle may be entering. Navigation should be optimized to the specific geometry at hand. For patient dose calculation, this geometry generally involves voxelized computed tomography (CT) data. We investigated the efficiency of navigation algorithms on currently available voxel geometry parameterizations in the Monte Carlo simulation package Geant4: G4VPVParameterisation, G4VNestedParameterisation and G4PhantomParameterisation, the last with and without boundary skipping, a method where neighboring voxels with the same Hounsfield unit are combined into one larger voxel. A fourth parameterization approach (MGHParameterization), developed in-house before the latter two parameterizations became available in Geant4, was also included in this study. All simulations were performed using TOPAS, a tool for particle simulations layered on top of Geant4. Runtime comparisons were made on three distinct patient CT data sets: a head and neck, a liver and a prostate patient. We included in the runtime study an additional version of these three patients in which all voxels, including the air voxels outside of the patient, were uniformly set to water. The G4VPVParameterisation offers two optimization options. One option results in a 60-150 times slower simulation speed. The other is comparable in speed but requires 15-19 times more memory than the other parameterizations. We found the average CPU time used for the simulation relative to G4VNestedParameterisation to be 1.014 for G4PhantomParameterisation without boundary skipping and 1.015 for MGHParameterization. The average runtime ratio for G4PhantomParameterisation with and without boundary skipping for our heterogeneous data was equal to 0.97:1. The calculated dose distributions agreed with the reference distribution for all but the G4Phantom
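
    The "boundary skipping" idea described in the abstract (merging neighboring voxels with the same Hounsfield unit into one larger voxel) is essentially a run-length merge along the navigation direction: fewer boundaries means fewer geometry queries per step. The following toy illustrates the concept for a single 1D row of voxels; it is not the Geant4 G4PhantomParameterisation implementation.

```python
def merge_voxels(hu_row, voxel_size_mm):
    """Collapse a 1D row of HU values into (hu, segment_length_mm) pairs.

    Consecutive voxels with identical HU are merged into one larger step,
    mimicking the boundary-skipping optimization.
    """
    segments = []
    for hu in hu_row:
        if segments and segments[-1][0] == hu:
            # extend the current segment instead of starting a new boundary
            segments[-1] = (hu, segments[-1][1] + voxel_size_mm)
        else:
            segments.append((hu, voxel_size_mm))
    return segments

# Water / soft tissue / air run of 2 mm voxels: nine voxels, three boundaries.
row = [0, 0, 0, 40, 40, -1000, -1000, -1000, -1000]
merged = merge_voxels(row, 2.0)
```

    In a uniform "all water" phantom like the one used in the runtime study, the entire row collapses to a single segment, which is why such geometries are the best case for boundary skipping.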

  10. Working Notes from the 1992 AAAI Workshop on Automating Software Design. Theme: Domain Specific Software Design

    NASA Technical Reports Server (NTRS)

    Keller, Richard M. (Editor); Barstow, David; Lowry, Michael R.; Tong, Christopher H.

    1992-01-01

    The goal of this workshop is to identify different architectural approaches to building domain-specific software design systems and to explore issues unique to domain-specific (vs. general-purpose) software design. Some general issues that cut across the particular software design domain include: (1) knowledge representation, acquisition, and maintenance; (2) specialized software design techniques; and (3) user interaction and user interface.

  11. Simulation of Auger electron emission from nanometer-size gold targets using the Geant4 Monte Carlo simulation toolkit

    NASA Astrophysics Data System (ADS)

    Incerti, S.; Suerfu, B.; Xu, J.; Ivantchenko, V.; Mantero, A.; Brown, J. M. C.; Bernal, M. A.; Francis, Z.; Karamitros, M.; Tran, H. N.

    2016-04-01

    A revised atomic deexcitation framework for the Geant4 general-purpose Monte Carlo toolkit, capable of simulating full Auger deexcitation cascades, was implemented in the June 2015 release (version 10.2 Beta). An overview of this refined framework and testing of its capabilities is presented for the irradiation of gold nanoparticles (NP) with keV photon and MeV proton beams. The resultant energy spectra of secondary particles created within, and escaping, the NP are analyzed and discussed. It is anticipated that this new functionality will improve and increase the use of Geant4 in medical physics, radiobiology, nanomedicine research and other low-energy physics fields.

  12. CHIPS_TPT models for exclusive Geant4 simulation of neutron-nuclear reactions at low energies

    NASA Astrophysics Data System (ADS)

    Kosov, Mikhail V.; Kudinov, Ilya V.; Savin, Dmitry I.

    2014-03-01

    A novel TPT code (Toolkit for Particle Transport), which is included in the CHIPS_TPT physics list for Geant4 simulations, is briefly overviewed. The underlying concept of exclusive modelling is introduced, and its beneficial features are illustrated with several examples. Widely used neutron Monte Carlo codes such as MCNP and Geant4/HP are based on inclusive algorithms that independently model the neutron state change and the production of secondary particles during tracking. The exclusive approach implemented in TPT overcomes this unphysical separation and makes it possible to account for kinematic restrictions as well as the correlated emission of gamma-rays and secondaries.

  13. Validation of GEANT4 simulations for 62,63Zn yield estimation in proton induced reactions of natural copper

    NASA Astrophysics Data System (ADS)

    Rostampour, Malihe; Sadeghi, Mahdi; Aboudzadeh, Mohammadreza; Hamidi, Saeid; Hosseini, Seyedeh Fatemeh

    2017-03-01

    A useful approach to optimizing radioisotope production is the use of Monte Carlo simulations prior to experimentation. In this paper, the GEANT4 code was employed to calculate the saturation yields of 62,63Zn from proton-induced reactions on natural copper, enriched 63Cu and 65Cu. In addition, the saturation yields of the investigated radionuclides were calculated using stopping powers from SRIM-2013 and reported experimental cross-section data. The simulated saturation yields were compared with experimental values. The good agreement between the experimental and corresponding simulated data demonstrates that GEANT4 provides a suitable tool for simulating radionuclide production by proton irradiation.
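
    The stopping-power-plus-cross-section route mentioned above follows the standard thick-target yield integral: the production rate per beam particle is the integral of σ(E)/S(E) over the energy lost in the target, times the target nuclei per gram, and at saturation the activity equals this production rate. The sketch below integrates a made-up excitation function with the trapezoidal rule; the cross sections, stopping powers and beam parameters are placeholders, not SRIM-2013 or measured values.

```python
N_A = 6.02214076e23

def saturation_yield_Bq_per_uA(energies_MeV, sigma_mb, stopping_MeV_cm2_per_g,
                               molar_mass_g, abundance=1.0):
    """Saturation activity per microampere of proton beam (trapezoidal rule)."""
    beam_rate = 6.241509e12                  # protons/s per uA (1e-6 / e)
    integral = 0.0
    for i in range(len(energies_MeV) - 1):
        # sigma [cm^2] / S [MeV cm^2/g] integrated over dE [MeV] -> grams
        f0 = sigma_mb[i] * 1e-27 / stopping_MeV_cm2_per_g[i]
        f1 = sigma_mb[i + 1] * 1e-27 / stopping_MeV_cm2_per_g[i + 1]
        integral += 0.5 * (f0 + f1) * (energies_MeV[i + 1] - energies_MeV[i])
    nuclei_per_g = abundance * N_A / molar_mass_g
    return beam_rate * nuclei_per_g * integral   # reactions/s = Bq at saturation

E = [10, 15, 20, 25, 30]              # MeV
sig = [50, 120, 180, 140, 90]         # mb, placeholder excitation function
S = [40, 32, 27, 24, 21]              # MeV cm^2/g, placeholder stopping power
y = saturation_yield_Bq_per_uA(E, sig, S, molar_mass_g=63.55, abundance=0.6915)
```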

  14. Software Prototyping: Designing Systems for Users.

    ERIC Educational Resources Information Center

    Spies, Phyllis Bova

    1983-01-01

    Reports on major change in computer software development process--the prototype model, i.e., implementation of skeletal system that is enhanced during interaction with users. Expensive and unreliable software, software design errors, traditional development approach, resources required for prototyping, success stories, and systems designer's role…

  15. Geant4 Predictions of Energy Spectra in Typical Space Radiation Environment

    NASA Technical Reports Server (NTRS)

    Sabra, M. S.; Barghouty, A. F.

    2014-01-01

    Accurate knowledge of energy spectra inside spacecraft is important for protecting astronauts as well as sensitive electronics from the harmful effects of space radiation. Such knowledge allows one to confidently map the radiation environment inside the vehicle. The purpose of this talk is to present preliminary calculations of energy spectra inside a spherical shell shielding and behind a slab in a typical space radiation environment using the 3D Monte Carlo transport code Geant4. We have simulated proton and iron isotropic sources and beams impinging on aluminum and gallium arsenide (GaAs) targets at energies of 0.2, 0.6, 1, and 10 GeV/u. If time permits, other radiation sources and beams (α, C, O) and targets (C, Si, Ge, water) will be presented. The results are compared to ground-based measurements where available.

  16. Study of Cosmic Ray Muon Lateral Distribution with Geant4 Simulation

    NASA Astrophysics Data System (ADS)

    Sarajlic, Olesya; He, Xiaochun

    2016-09-01

    Cosmic ray radiation has a galactic origin and consists primarily of protons with a small percentage of heavier nuclei. The primary cosmic ray particles interact with molecules in the atmosphere and produce showers of secondary particles at about 15 km altitude. In recent years, with the advancement of particle detection technology, there has been growing interest in exploring applications of cosmic ray muons, ranging from homeland security to correlation studies with atmospheric weather. A Geant4-based cosmic ray shower simulation has been developed to study secondary cosmic ray particle showers through the full range of the Earth's atmosphere. In this talk, the diurnal and latitudinal variations of muon lateral distributions will be presented.

  17. Geant4 calculations for space radiation shielding material Al2O3

    NASA Astrophysics Data System (ADS)

    Capali, Veli; Acar Yesil, Tolga; Kaya, Gokhan; Kaplan, Abdullah; Yavuz, Mustafa; Tilki, Tahir

    2015-07-01

    Aluminium oxide, Al2O3, is one of the most widely used materials in engineering applications. It is a significant aluminium compound because of its hardness, and a useful refractory material owing to its high melting point. It has engineering applications in diverse fields such as ballistic armour systems, wear components, electrical and electronic substrates, automotive parts, components for the electrical industry and aero-engines. It is also used as a dosimeter for radiation protection and therapy applications owing to its optically stimulated luminescence properties. In this study, stopping powers and penetrating distances have been calculated for alpha, proton, electron and gamma particles in the space radiation shielding material Al2O3 for incident energies of 1 keV - 1 GeV using the GEANT4 calculation code.
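
    For charged particles, the penetrating distance quoted in such studies is essentially the CSDA range, obtained by integrating the reciprocal stopping power over energy, R = ∫ dE'/S(E'). The sketch below performs that integral with the trapezoidal rule over a made-up stopping-power table; the values are placeholders, not GEANT4 or SRIM output for Al2O3.

```python
def csda_range_cm(energies_MeV, stopping_MeV_per_cm):
    """CSDA range: trapezoidal integral of 1/S(E) over the tabulated energies."""
    r = 0.0
    for i in range(len(energies_MeV) - 1):
        inv0 = 1.0 / stopping_MeV_per_cm[i]
        inv1 = 1.0 / stopping_MeV_per_cm[i + 1]
        r += 0.5 * (inv0 + inv1) * (energies_MeV[i + 1] - energies_MeV[i])
    return r

E = [1, 2, 5, 10, 20, 50]            # MeV
S = [1050, 640, 330, 200, 120, 60]   # MeV/cm, placeholder proton-like values
r = csda_range_cm(E, S)              # range accumulated between 1 and 50 MeV
```

    Because S(E) falls with energy for protons in this regime, most of the range is accumulated at the high-energy end of the table, which the trapezoidal sum makes explicit.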

  18. Geant4.10 simulation of geometric model for metaphase chromosome

    NASA Astrophysics Data System (ADS)

    Rafat-Motavalli, L.; Miri-Hakimabad, H.; Bakhtiyari, E.

    2016-04-01

    In this paper, a geometric model of the metaphase chromosome is presented. The model is constructed according to the packing ratio and the dimensions of the structure, from the nucleosome up to the chromosome. A B-DNA base pair is used to construct the 200 base pairs of a nucleosome. Each chromatin fiber loop, which is the unit of repeat, contains 49,200 bp. This geometry has been implemented in the Geant4.10 Monte Carlo simulation toolkit and can be extended to whole metaphase chromosomes and to any application in which a DNA geometrical model is needed. The chromosome base pairs, chromosome lengths, and relative lengths of the chromosomes are calculated. The calculated relative lengths are compared to the relative lengths of human chromosomes.
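
    The relative-length bookkeeping in such a model is straightforward: each chromosome's base-pair count (loops times the 49,200 bp loop size quoted above) divided by the total of the set. The loop counts below are illustrative, not the paper's values.

```python
LOOP_BP = 49_200  # base pairs per chromatin-fiber loop (from the abstract)

def relative_lengths(loops_per_chromosome):
    """Fraction of total base pairs carried by each chromosome."""
    bp = [n * LOOP_BP for n in loops_per_chromosome]
    total = sum(bp)
    return [b / total for b in bp]

# Hypothetical four-chromosome set, sized by number of loops.
rel = relative_lengths([5000, 4900, 4000, 3900])
```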

  19. ALGEBRA: ALgorithm for the heterogeneous dosimetry based on GEANT4 for BRAchytherapy.

    PubMed

    Afsharpour, H; Landry, G; D'Amours, M; Enger, S; Reniers, B; Poon, E; Carrier, J-F; Verhaegen, F; Beaulieu, L

    2012-06-07

    Task group 43 (TG43)-based dosimetry algorithms are efficient for brachytherapy dose calculation in water. However, human tissues have chemical compositions and densities different from water. Moreover, the mutual shielding effect of seeds on each other (interseed attenuation) is neglected in TG43-based dosimetry platforms. The scientific community has expressed the need for an accurate dosimetry platform in brachytherapy. The purpose of this paper is to present ALGEBRA, a Monte Carlo platform for dosimetry in brachytherapy which is sufficiently fast and accurate for clinical and research purposes. ALGEBRA is based on the GEANT4 Monte Carlo code and is capable of handling the DICOM RT standard to recreate a virtual model of the treated site. Here, the performance of ALGEBRA is presented for the special case of LDR brachytherapy in permanent prostate and breast seed implants. However, the algorithm is also capable of handling other treatments such as HDR brachytherapy.
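
    For context, the TG43 formalism that Monte Carlo platforms like ALGEBRA are benchmarked against reduces, for a point source, to D(r) = S_k · Λ · (r0/r)² · g(r) · φ_an, with r0 = 1 cm. The sketch below evaluates that expression with a toy radial dose function g(r); the g(r) table, Λ and S_k values are invented placeholders, not consensus data for any real seed.

```python
def interp(x, xs, ys):
    """Piecewise-linear interpolation (xs ascending, x within the table)."""
    for i in range(len(xs) - 1):
        if xs[i] <= x <= xs[i + 1]:
            t = (x - xs[i]) / (xs[i + 1] - xs[i])
            return ys[i] + t * (ys[i + 1] - ys[i])
    raise ValueError("x outside table")

def tg43_point_dose_rate(r_cm, S_k, Lambda, g_table, phi_an=1.0):
    """Point-source TG43 dose rate: S_k * Lambda * (r0/r)^2 * g(r) * phi_an,
    with the reference distance r0 = 1 cm."""
    xs, ys = g_table
    return S_k * Lambda * (1.0 / r_cm) ** 2 * interp(r_cm, xs, ys) * phi_an

# Placeholder radial dose function, normalized to g(1 cm) = 1.
g_table = ([0.5, 1.0, 2.0, 3.0, 5.0], [1.04, 1.00, 0.87, 0.75, 0.50])
d = tg43_point_dose_rate(2.0, S_k=0.5, Lambda=0.965, g_table=g_table)
```

    Tissue heterogeneity and interseed attenuation enter nowhere in this formula, which is precisely the gap a full Monte Carlo dose engine closes.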

  20. GEANT4 calibration of gamma spectrometry efficiency for measurements of airborne radioactivity on filter paper.

    PubMed

    Alrefae, Tareq

    2014-11-01

    A simple method of efficiency calibration for gamma spectrometry was performed. The method, which focused on measuring airborne radioactivity collected on filter paper, was based on Monte Carlo simulations using the GEANT4 toolkit. Experimentally, the efficiency values of an HPGe detector were calculated for a multi-gamma disk source. These efficiency values were compared to their counterparts produced by a computer code that simulated the experimental conditions. The comparison revealed biases of 24, 10, 1, 3, 7, and 3% for the radionuclides (photon energies in keV) (139)Ce (166), (113)Sn (392), (137)Cs (662), (60)Co (1,173), (60)Co (1,333), and (88)Y (1,836), respectively. The output of the simulation code was in acceptable agreement with the experimental findings, thus validating the proposed method.

  1. Application of GEANT4 simulation on calibration of HPGe detectors for cylindrical environmental samples.

    PubMed

    Nikolic, J D; Jokovic, D; Todorovic, D; Rajacic, M

    2014-06-01

    The determination of radionuclide activity concentration requires prior knowledge of the full-energy peak (FEP) efficiency at all photon energies for a given measuring geometry. This problem has been partially solved by procedures based on Monte Carlo simulations, developed to complement the experimental calibration procedures used in gamma-ray measurements of environmental samples. The aim of this article is to apply GEANT4 simulation to the calibration of two HPGe detectors for the measurement of liquid and soil-like samples in cylindrical geometry. The efficiencies obtained from the simulation were compared with experimental results and applied to a realistic measurement. Measurement uncertainties for both the simulated and experimental values were estimated in order to check whether the results of the realistic measurement fall within acceptable limits. The trueness of the results was checked using the known activity of the measured samples provided by the IAEA.

  2. Investigation of cosmic-ray induced background of Germanium gamma spectrometer using GEANT4 simulation.

    PubMed

    Hung, Nguyen Quoc; Hai, Vo Hong; Nomachi, Masaharu

    2017-03-01

    In this article, the GEANT4 Monte Carlo simulation toolkit was used to study the cosmic-ray-induced background in a High-Purity Germanium (HPGe) gamma spectrometer over a wide energy range, up to 100 MeV. Natural radiation background measurements of the spectrometer were carried out in the energy region from 0.04 to 50 MeV. The simulated cosmic-ray-induced background of the Ge detector was evaluated by comparison with the measured data. The contribution of various cosmic-ray components, including muons, neutrons, protons, electrons, positrons and photons, was investigated. We also analyzed secondary particle showers induced by the muonic component.

  3. ROSI and GEANT4 - A comparison in the context of high energy X-ray physics

    NASA Astrophysics Data System (ADS)

    Kiunke, Markus; Stritt, Carina; Schielein, Richard; Sukowski, Frank; Hölzing, Astrid; Zabler, Simon; Hofmann, Jürgen; Flisch, Alexander; Kasperl, Stefan; Sennhauser, Urs; Hanke, Randolf

    2016-06-01

    This work compares two popular MC simulation frameworks, ROSI (Roentgen Simulation) and GEANT4 (Geometry and Tracking, in its fourth version), in the context of X-ray physics. The comparison is performed with the help of a parameter study covering energy, material and length variations. While the total deposited energy and the contribution of Compton scattering show good accordance between all simulated configurations, all other physical effects exhibit large deviations between the data sets. These discrepancies between the simulations are shown to originate from the different cross-section databases used in the frameworks, whereas the overall simulation mechanics appear to have no influence on the agreement of the simulations. A scan over energy, length and material shows that the energy and material parameters have a significant influence on the agreement of the simulation results, while the length parameter shows no noticeable influence on the deviations between the data sets.

  4. Signal pulse emulation for scintillation detectors using Geant4 Monte Carlo with light tracking simulation.

    PubMed

    Ogawara, R; Ishikawa, M

    2016-07-01

    The anode pulse of a photomultiplier tube (PMT) coupled with a scintillator is used for pulse shape discrimination (PSD) analysis. We have developed a novel emulation technique for the PMT anode pulse based on optical photon transport and a PMT response function. The photon transport was calculated using the Geant4 Monte Carlo code, and the response function was determined with a BC408 organic scintillator. The percentage RMS values of the difference between the measured and simulated pulses, using suitable scintillation properties, were 2.41%, 2.58%, 2.16%, 2.01%, and 3.32% for the GSO:Ce (0.4, 1.0, and 1.5 mol%), LaBr3:Ce and BGO scintillators, respectively. The proposed technique demonstrates high reproducibility of the measured pulse and can be applied to simulation studies of various radiation measurements.
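    One plausible reading of the "percentage RMS value of the difference" quoted above is the RMS of the point-wise pulse difference normalized to the measured peak amplitude; the paper's exact definition may differ. A minimal sketch with made-up waveform samples:

```python
import math

# Sketch (assumed definition, not necessarily the paper's): percentage RMS
# of the point-wise difference between measured and simulated pulses,
# normalized to the measured pulse's peak amplitude.

def percent_rms_difference(measured, simulated):
    """Return 100 * RMS(simulated - measured) / max|measured|."""
    mse = sum((s - m) ** 2 for m, s in zip(measured, simulated)) / len(measured)
    return 100.0 * math.sqrt(mse) / max(abs(v) for v in measured)

# Made-up samples standing in for digitized anode waveforms:
measured  = [0.0, 0.5, 1.0, 0.7, 0.3, 0.1]
simulated = [0.0, 0.52, 0.98, 0.71, 0.29, 0.1]
print(round(percent_rms_difference(measured, simulated), 2))  # -> 1.29
```

A value of a few percent, as here, corresponds to the level of agreement reported in the abstract.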

  5. Development and validation of a GEANT4 radiation transport code for CT dosimetry.

    PubMed

    Carver, D E; Kost, S D; Fernald, M J; Lewis, K G; Fraser, N D; Pickens, D R; Price, R R; Stabin, M G

    2015-04-01

    The authors have created a radiation transport code using the GEANT4 Monte Carlo toolkit to simulate pediatric patients undergoing CT examinations. The focus of this paper is to validate their simulation with real-world physical dosimetry measurements using two independent techniques. Exposure measurements were made with a standard 100-mm CT pencil ionization chamber, and absorbed doses were also measured using optically stimulated luminescent (OSL) dosimeters. Measurements were made in air with a standard 16-cm acrylic head phantom and with a standard 32-cm acrylic body phantom. Physical dose measurements determined from the ionization chamber in air for 100 and 120 kVp beam energies were used to derive photon-fluence calibration factors. Both ion chamber and OSL measurement results provide useful comparisons in the validation of the Monte Carlo simulations. It was found that simulated and measured CTDI values were within an overall average of 6% of each other.

  6. ALGEBRA: ALgorithm for the heterogeneous dosimetry based on GEANT4 for BRAchytherapy

    NASA Astrophysics Data System (ADS)

    Afsharpour, H.; Landry, G.; D'Amours, M.; Enger, S.; Reniers, B.; Poon, E.; Carrier, J.-F.; Verhaegen, F.; Beaulieu, L.

    2012-06-01

    Task group 43 (TG43)-based dosimetry algorithms are efficient for brachytherapy dose calculation in water. However, human tissues have chemical compositions and densities different from those of water. Moreover, the mutual shielding effect of seeds on each other (interseed attenuation) is neglected in TG43-based dosimetry platforms. The scientific community has expressed the need for an accurate dosimetry platform in brachytherapy. The purpose of this paper is to present ALGEBRA, a Monte Carlo platform for dosimetry in brachytherapy which is sufficiently fast and accurate for clinical and research purposes. ALGEBRA is based on the GEANT4 Monte Carlo code and is capable of handling the DICOM RT standard to recreate a virtual model of the treated site. Here, the performance of ALGEBRA is presented for the special case of LDR brachytherapy in permanent prostate and breast seed implants. However, the algorithm is also capable of handling other treatments such as HDR brachytherapy.

  7. Simulating the DESCANT Neutron Detection Array with the Geant4 Toolkit

    NASA Astrophysics Data System (ADS)

    Turko, Joseph; Bildstein, Vinzenz; Rand, Evan; Maclean, Andrew; Garrett, Paul; Griffin Collaboration Collaboration

    2016-09-01

    The DEuterated SCintillator Array for Neutron Tagging (DESCANT) is a newly developed high-efficiency neutron detection array composed of 70 hexagonal deuterated scintillators. Due to the anisotropic nature of elastic (n,d) scattering, the pulse-height spectrum of a deuterated scintillator contains a forward-peaked structure that can be used to determine the energy of the incident neutron without using traditional time-of-flight methods. Simulations of the array are crucial in order to interpret the DESCANT pulse heights, determine the efficiencies of the array, and examine its capabilities for conducting various nuclear decay experiments. To achieve this, we plan: (i) a verification of the low-energy hadronic physics packages in Geant4, (ii) a comparison of simulated spectra with data from a simple cylindrical ``test can'' detector geometry, (iii) an expansion of the simulated light response to a prototype DESCANT detector, and (iv) a simulation of the entire DESCANT array. NSERC, CFI.

  8. Nuclear reaction measurements on tissue-equivalent materials and GEANT4 Monte Carlo simulations for hadrontherapy

    NASA Astrophysics Data System (ADS)

    De Napoli, M.; Romano, F.; D'Urso, D.; Licciardello, T.; Agodi, C.; Candiano, G.; Cappuzzello, F.; Cirrone, G. A. P.; Cuttone, G.; Musumarra, A.; Pandola, L.; Scuderi, V.

    2014-12-01

    When a carbon beam interacts with human tissues, many secondary fragments are produced in the tumor region and the surrounding healthy tissues. Therefore, precise dose calculations in hadrontherapy require Monte Carlo tools equipped with complex nuclear reaction models. To get realistic predictions, however, simulation codes must be validated against experimental results; the wider the dataset, the more finely the models can be tuned. Since no fragmentation data for tissue-equivalent materials at Fermi energies are available in the literature, we measured secondary fragments produced by the interaction of a 55.6 MeV u-1 12C beam with thick muscle and cortical bone targets. Three reaction models used by the Geant4 Monte Carlo code, the Binary Light Ions Cascade, the Quantum Molecular Dynamic and the Liege Intranuclear Cascade, have been benchmarked against the collected data. In this work we present the experimental results and discuss the predictive power of the above-mentioned models.

  9. Radiation quality of cosmic ray nuclei studied with Geant4-based simulations

    NASA Astrophysics Data System (ADS)

    Burigo, Lucas N.; Pshenichnov, Igor A.; Mishustin, Igor N.; Bleicher, Marcus

    2014-04-01

    In future deep-space missions, a spacecraft will be exposed to a non-negligible flux of high charge and energy (HZE) particles present in the galactic cosmic rays (GCR). One of the major concerns of manned missions is the impact on humans of the complex radiation fields which result from the interactions of HZE particles with the spacecraft materials. The radiation quality of several ions representing GCR is investigated by calculating microdosimetry spectra. A Geant4-based Monte Carlo model for Heavy Ion Therapy (MCHIT) is used to simulate microdosimetry data for HZE particles in extended media, where fragmentation reactions play a role. Our model is able to reproduce measured microdosimetry spectra for H, He, Li, C and Si in the energy range of 150-490 MeV/u. The effect of nuclear fragmentation on the relative biological effectiveness (RBE) of He, Li and C is estimated and found to be below 10%.

  10. Nuclear reaction measurements on tissue-equivalent materials and GEANT4 Monte Carlo simulations for hadrontherapy.

    PubMed

    De Napoli, M; Romano, F; D'Urso, D; Licciardello, T; Agodi, C; Candiano, G; Cappuzzello, F; Cirrone, G A P; Cuttone, G; Musumarra, A; Pandola, L; Scuderi, V

    2014-12-21

    When a carbon beam interacts with human tissues, many secondary fragments are produced in the tumor region and the surrounding healthy tissues. Therefore, precise dose calculations in hadrontherapy require Monte Carlo tools equipped with complex nuclear reaction models. To get realistic predictions, however, simulation codes must be validated against experimental results; the wider the dataset, the more finely the models can be tuned. Since no fragmentation data for tissue-equivalent materials at Fermi energies are available in the literature, we measured secondary fragments produced by the interaction of a 55.6 MeV u(-1) (12)C beam with thick muscle and cortical bone targets. Three reaction models used by the Geant4 Monte Carlo code, the Binary Light Ions Cascade, the Quantum Molecular Dynamic and the Liege Intranuclear Cascade, have been benchmarked against the collected data. In this work we present the experimental results and discuss the predictive power of the above-mentioned models.

  11. Geant4 simulation of the n_TOF-EAR2 neutron beam: Characteristics and prospects

    NASA Astrophysics Data System (ADS)

    Lerendegui-Marco, J.; Lo Meo, S.; Guerrero, C.; Cortés-Giraldo, M. A.; Massimi, C.; Quesada, J. M.; Barbagallo, M.; Colonna, N.; Mancusi, D.; Mingrone, F.; Sabaté-Gilarte, M.; Vannini, G.; Vlachoudis, V.

    2016-04-01

    The characteristics of the neutron beam at the new n_TOF-EAR2 facility have been simulated with the Geant4 code with the aim of providing useful data for both the analysis and the planning of upcoming measurements. The spatial and energy distributions of the neutrons, the resolution function and the in-beam γ-ray background have been studied in detail, and their implications for the forthcoming experiments are discussed. The results confirm that, with this new short (18.5 m flight path) beam line, reaching an instantaneous neutron flux beyond 10⁵ n/μs/pulse in the keV region, n_TOF is one of the few facilities where challenging measurements, involving in particular short-lived radioisotopes, can be performed.

  12. Evaluation of a commercial MRI Linac based Monte Carlo dose calculation algorithm with GEANT 4

    SciTech Connect

    Ahmad, Syed Bilal; Sarfehnia, Arman; Kim, Anthony; Sahgal, Arjun; Keller, Brian; Paudel, Moti Raj; Hissoiny, Sami

    2016-02-15

    Purpose: This paper provides a comparison between a fast, commercial, in-patient Monte Carlo dose calculation algorithm (GPUMCD) and GEANT4. It also evaluates the dosimetric impact of the application of an external 1.5 T magnetic field. Methods: A stand-alone version of the Elekta™ GPUMCD algorithm, to be used within the Monaco treatment planning system to model dose for the Elekta™ magnetic resonance imaging (MRI) Linac, was compared against GEANT4 (v10.1). This was done in the presence or absence of a 1.5 T static magnetic field directed orthogonally to the radiation beam axis. Phantoms with material compositions of water, ICRU lung, ICRU compact-bone, and titanium were used for this purpose. Beams of 2 MeV monoenergetic photons, as well as a 7 MV histogrammed spectrum representing the MRI Linac spectrum, were emitted from a point source at a nominal source-to-surface distance of 142.5 cm. Field sizes ranged from 1.5 × 1.5 to 10 × 10 cm². Dose scoring was performed using a 3D grid comprising 1 mm³ voxels. The production thresholds were equivalent for both codes. Results were analyzed based upon a voxel-by-voxel dose difference between the two codes and also using a volumetric gamma analysis. Results: Comparisons were drawn from central-axis depth doses, cross-beam profiles, and isodose contours. Both in the presence and absence of a 1.5 T static magnetic field, the relative differences in doses scored along the beam central axis were less than 1% for the homogeneous water phantom, and all results matched within a maximum of ±2% for heterogeneous phantoms. Volumetric gamma analysis indicated that more than 99% of the examined volume passed gamma criteria of 2%/2 mm (dose difference and distance to agreement, respectively). These criteria were chosen because the minimum primary statistical uncertainty in dose-scoring voxels was 0.5%. The presence of the magnetic field affects the dose at the interface depending upon the density of the material

  13. Comparison of MCNPX and Geant4 proton energy deposition predictions for clinical use

    PubMed Central

    Titt, U.; Bednarz, B.; Paganetti, H.

    2012-01-01

    Several different Monte Carlo codes are currently being used at proton therapy centers to improve upon dose predictions over standard methods using analytical or semi-empirical dose algorithms. There is a need to better ascertain the differences between proton dose predictions from the different available Monte Carlo codes. In this investigation, Geant4 and MCNPX, the two most-utilized Monte Carlo codes for proton therapy applications, were used to predict energy deposition distributions in a variety of geometries, comprising simple water phantoms, water phantoms with complex inserts, and a voxelized geometry based on clinical CT data. Gamma analysis was used to evaluate the differences between the predictions of the two codes. The results show that in all cases the agreement was better than clinical acceptance criteria. PMID:22996039
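    Gamma analysis, used in this record and the preceding one, combines a dose-difference criterion and a distance-to-agreement (DTA) criterion into a single metric; a point passes when γ ≤ 1. A minimal 1D sketch under assumed 2%/2 mm criteria, with hypothetical dose profiles (not data from either study):

```python
import math

# Minimal 1D gamma-index sketch. A point passes when gamma <= 1 for
# criteria such as 2% dose difference and 2 mm distance-to-agreement.
# All profiles and criteria here are illustrative assumptions.

def gamma_index(eval_pos_mm, eval_dose, ref_profile,
                dose_crit=0.02, dta_mm=2.0, dose_norm=1.0):
    """ref_profile: list of (position_mm, dose) samples of the reference."""
    return min(
        math.sqrt(((eval_pos_mm - r_pos) / dta_mm) ** 2
                  + ((eval_dose - r_dose) / (dose_crit * dose_norm)) ** 2)
        for r_pos, r_dose in ref_profile
    )

# Slowly falling reference profile sampled every 0.5 mm (made-up data):
ref = [(x * 0.5, 1.0 - 0.01 * x) for x in range(20)]
g = gamma_index(eval_pos_mm=3.0, eval_dose=0.945, ref_profile=ref)
print(g <= 1.0)  # the evaluated point passes the 2%/2 mm criterion
```

In practice the search is done over a 3D dose grid, often with interpolation of the reference distribution, but the pass/fail logic is the same.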

  14. Flight Software Design Choices Based on Criticality

    NASA Technical Reports Server (NTRS)

    Lee, Earl

    1999-01-01

    This slide presentation reviews the rationale behind flight software design as a function of criticality. The requirements of human-rated systems imply high criticality for the flight support software, since human life depends on its correct operation. Flexibility can be permitted when the consequences of software failure are not life-threatening. This is also relevant when selecting Commercial Off the Shelf (COTS) software.

  15. GGEMS-Brachy: GPU GEant4-based Monte Carlo simulation for brachytherapy applications

    NASA Astrophysics Data System (ADS)

    Lemaréchal, Yannick; Bert, Julien; Falconnet, Claire; Després, Philippe; Valeri, Antoine; Schick, Ulrike; Pradier, Olivier; Garcia, Marie-Paule; Boussion, Nicolas; Visvikis, Dimitris

    2015-07-01

    In brachytherapy, plans are routinely calculated using the AAPM TG43 formalism which considers the patient as a simple water object. An accurate modeling of the physical processes considering patient heterogeneity using Monte Carlo simulation (MCS) methods is currently too time-consuming and computationally demanding to be routinely used. In this work we implemented and evaluated an accurate and fast MCS on Graphics Processing Units (GPU) for brachytherapy low dose rate (LDR) applications. A previously proposed Geant4 based MCS framework implemented on GPU (GGEMS) was extended to include a hybrid GPU navigator, allowing navigation within voxelized patient specific images and analytically modeled 125I seeds used in LDR brachytherapy. In addition, dose scoring based on track length estimator including uncertainty calculations was incorporated. The implemented GGEMS-brachy platform was validated using a comparison with Geant4 simulations and reference datasets. Finally, a comparative dosimetry study based on the current clinical standard (TG43) and the proposed platform was performed on twelve prostate cancer patients undergoing LDR brachytherapy. Considering patient 3D CT volumes of 400 × 250 × 65 voxels and an average of 58 implanted seeds, the mean patient dosimetry study run time for a 2% dose uncertainty was 9.35 s (≈500 ms per 10⁶ simulated particles) and 2.5 s when using one and four GPUs, respectively. The performance of the proposed GGEMS-brachy platform allows envisaging the use of Monte Carlo simulation based dosimetry studies in brachytherapy compatible with clinical practice. Although the proposed platform was evaluated for prostate cancer, it is equally applicable to other LDR brachytherapy clinical applications. Future extensions will allow its application in high dose rate brachytherapy applications.

  16. Efficiency transfer using the GEANT4 code of CERN for HPGe gamma spectrometry.

    PubMed

    Chagren, S; Ben Tekaya, M; Reguigui, N; Gharbi, F

    2016-01-01

    In this work we apply the GEANT4 code of CERN to calculate the peak efficiency in High-Purity Germanium (HPGe) gamma spectrometry using three different procedures. The first is a direct calculation. The second corresponds to the usual case of efficiency transfer between two different configurations at a constant emission energy, assuming a reference point-source detection configuration. The third, a new procedure, consists of transferring the peak efficiency between two detection configurations emitting the gamma ray at different energies, assuming a "virtual" reference point-source detection configuration. No pre-optimization of the detector's geometrical characteristics was performed before the transfer, in order to test the ability of the efficiency transfer to limit the effect of uncertainty in their true values on the quality of the transferred efficiency. The calculated and measured efficiencies were found to be in good agreement for the two investigated transfer methods. This agreement shows that the Monte Carlo method, and especially the GEANT4 code, constitutes an efficient tool for obtaining accurate detection efficiency values. The second efficiency transfer procedure is useful for calibrating an HPGe gamma detector for a voluminous source at any emission energy, using the detection efficiency of a point source emitting at a different energy as the reference. The calculations performed in this work were applied to the measurement exercise of the EUROMET428 project, in which the full-energy-peak efficiencies in the energy range 60-2000 keV were evaluated for a typical coaxial p-type HPGe detector and several types of source configuration: point sources located at various distances from the detector and a cylindrical box containing three matrices.
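    The efficiency-transfer idea can be sketched as rescaling a measured reference efficiency by the Monte Carlo ratio between the target and reference configurations, so that systematic errors in the detector model largely cancel in the ratio. A minimal sketch with hypothetical efficiency values (names and numbers are illustrative):

```python
# Hedged sketch of efficiency transfer: a measured reference efficiency is
# rescaled by the Monte Carlo ratio between target and reference
# configurations. All values below are made up for illustration.

def transfer_efficiency(eff_ref_measured, eff_ref_mc, eff_target_mc):
    """Transferred efficiency = measured reference * MC(target) / MC(reference).
    Systematic detector-model errors largely cancel in the MC ratio."""
    return eff_ref_measured * (eff_target_mc / eff_ref_mc)

# Example: point source measured at one energy; transfer to a voluminous
# cylindrical geometry simulated at a different emission energy:
eff = transfer_efficiency(eff_ref_measured=0.0105,  # experiment, reference
                          eff_ref_mc=0.0100,        # MC, reference config
                          eff_target_mc=0.0040)     # MC, target config
print(round(eff, 5))  # -> 0.0042
```

The new procedure described in the abstract corresponds to letting the reference and target configurations differ in emission energy as well as geometry.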

  17. Voxel model of individual cells and its implementation in microdosimetric calculations using GEANT4.

    PubMed

    Sihver, Lembit; Ni, Jie; Sun, Liang; Kong, Dong; Ren, Yuanyuan; Gu, Siyi

    2014-08-01

    Accurate dosimetric calculations at cellular and sub-cellular levels are crucial to gain an increased understanding of the interactions of ionizing radiation with a cell, its nucleus and its cytoplasm. Ion microbeams provide a superior opportunity to irradiate small biological samples, e.g. DNA and cells, and to compare their response with computer simulations. However, the phantoms used to simulate small biological samples at the cellular level are often simplified to simple volumes filled with water. As a first step toward improving the comparison of measured cell responses to ionizing radiation with model calculations, a realistic voxel model of a KB cell was constructed and used together with an already constructed Geometry And Tracking 4 (GEANT4) model of the horizontal microbeam line of the 3.5 MV Van de Graaff accelerator at the Centre d'Etudes Nucléaires de Bordeaux-Gradignan (CENBG), France. The microbeam model was then implemented in GEANT4 for simulations of the average number of particles hitting an irradiated cell when a specified number of particles are produced in the beam line. The results show that when irradiating the developed voxel model of a KB cell with 200 α particles, with a nominal energy of 3 MeV in the beam line and 2.34 MeV at the cell entrance, 100 particles hit the cell on average. The mean specific energy is 0.209 ± 0.019 Gy in the nucleus and 0.044 ± 0.001 Gy in the cytoplasm. These results are in agreement with previously published data, which indicates that this model could serve as a reference model for dosimetric calculations in radiobiological experiments, and that the proposed method could be applied to build a cell model database.
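    Specific energies like those quoted above follow directly from z = ε/m. As a back-of-envelope check with assumed (hypothetical) values for the deposited energy and the nucleus volume, neither of which is given in the abstract:

```python
# Back-of-envelope sketch: specific energy z = deposited energy / mass,
# converting MeV deposited in a small water-density volume to gray.
# The deposit and volume below are assumed values, not from the paper.

MEV_TO_JOULE = 1.602176634e-13

def specific_energy_gray(deposited_mev, volume_um3, density_g_cm3=1.0):
    """z in Gy for an energy deposit in a small water-density volume."""
    mass_kg = density_g_cm3 * 1e3 * volume_um3 * 1e-18  # g/cm^3 -> kg/m^3, um^3 -> m^3
    return deposited_mev * MEV_TO_JOULE / mass_kg

# An alpha depositing ~0.3 MeV in a nucleus of ~230 um^3 gives a specific
# energy of the order of the 0.209 Gy quoted in the abstract:
z = specific_energy_gray(deposited_mev=0.3, volume_um3=230.0)
print(round(z, 3))  # -> 0.209
```

This is only a consistency check on orders of magnitude; the paper's values come from full track-structure simulation in the voxelised cell.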

  18. GGEMS-Brachy: GPU GEant4-based Monte Carlo simulation for brachytherapy applications.

    PubMed

    Lemaréchal, Yannick; Bert, Julien; Falconnet, Claire; Després, Philippe; Valeri, Antoine; Schick, Ulrike; Pradier, Olivier; Garcia, Marie-Paule; Boussion, Nicolas; Visvikis, Dimitris

    2015-07-07

    In brachytherapy, plans are routinely calculated using the AAPM TG43 formalism which considers the patient as a simple water object. An accurate modeling of the physical processes considering patient heterogeneity using Monte Carlo simulation (MCS) methods is currently too time-consuming and computationally demanding to be routinely used. In this work we implemented and evaluated an accurate and fast MCS on Graphics Processing Units (GPU) for brachytherapy low dose rate (LDR) applications. A previously proposed Geant4 based MCS framework implemented on GPU (GGEMS) was extended to include a hybrid GPU navigator, allowing navigation within voxelized patient specific images and analytically modeled (125)I seeds used in LDR brachytherapy. In addition, dose scoring based on track length estimator including uncertainty calculations was incorporated. The implemented GGEMS-brachy platform was validated using a comparison with Geant4 simulations and reference datasets. Finally, a comparative dosimetry study based on the current clinical standard (TG43) and the proposed platform was performed on twelve prostate cancer patients undergoing LDR brachytherapy. Considering patient 3D CT volumes of 400  × 250  × 65 voxels and an average of 58 implanted seeds, the mean patient dosimetry study run time for a 2% dose uncertainty was 9.35 s (≈500 ms 10(-6) simulated particles) and 2.5 s when using one and four GPUs, respectively. The performance of the proposed GGEMS-brachy platform allows envisaging the use of Monte Carlo simulation based dosimetry studies in brachytherapy compatible with clinical practice. Although the proposed platform was evaluated for prostate cancer, it is equally applicable to other LDR brachytherapy clinical applications. Future extensions will allow its application in high dose rate brachytherapy applications.

  19. The effects of mapping CT images to Monte Carlo materials on GEANT4 proton simulation accuracy

    SciTech Connect

    Barnes, Samuel; McAuley, Grant; Slater, James; Wroe, Andrew

    2013-04-15

    Purpose: Monte Carlo simulations of radiation therapy require conversion from Hounsfield units (HU) in CT images to an exact tissue composition and density. The number of discrete densities (or density bins) used in this mapping affects the simulation accuracy, execution time, and memory usage in GEANT4 and other Monte Carlo codes. The relationship between the number of density bins and CT noise was examined in general for all simulations that use HU-to-density conversion. Additionally, the effect of this on simulation accuracy was examined for proton radiation. Methods: Relative uncertainty from CT noise was compared with uncertainty from density binning to determine an upper limit on the number of density bins required in the presence of CT noise. Error-propagation analysis was also performed on continuous-slowing-down-approximation range calculations to determine the proton range uncertainty caused by density binning. These results were verified with Monte Carlo simulations. Results: In the presence of even modest CT noise (5 HU or 0.5%), 450 density bins were found to cause only a 5% increase in the density uncertainty (i.e., 95% of the density uncertainty from CT noise, 5% from binning). Larger numbers of density bins are not required, as CT noise prevents increased density accuracy; this applies across all types of Monte Carlo simulations. Examining uncertainty in proton range, only 127 density bins are required for a proton range error of <0.1 mm in most tissue and <0.5 mm in low-density tissue (e.g., lung). Conclusions: By considering CT noise and actual range uncertainty, the number of required density bins can be restricted to a very modest 127, depending on the application. Reducing the number of density bins provides large memory and execution time savings in GEANT4 and other Monte Carlo packages.
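    The HU-to-density binning discussed above can be sketched as follows; the linear HU-to-density ramp, range limits, and 127-bin choice are illustrative assumptions for the sketch, not the paper's calibration curve:

```python
# Sketch (assumed calibration, not the paper's): mapping Hounsfield units
# to mass density with a fixed number of density bins, as a Monte Carlo
# material-mapping preprocessing step might do.

def make_density_bins(hu_min=-1000, hu_max=2000, n_bins=127,
                      slope=0.001, intercept=1.0):
    """Return uniform bin edges in HU and the density assigned to each bin,
    using a simple illustrative linear HU-to-density ramp (g/cm^3)."""
    width = (hu_max - hu_min) / n_bins
    edges = [hu_min + i * width for i in range(n_bins + 1)]
    centers = [hu_min + (i + 0.5) * width for i in range(n_bins)]
    densities = [max(0.001, intercept + slope * c) for c in centers]
    return edges, densities

def hu_to_density(hu, edges, densities):
    """Clamp HU into range and look up the binned density (g/cm^3)."""
    hu = min(max(hu, edges[0]), edges[-1])
    i = min(int((hu - edges[0]) / (edges[1] - edges[0])), len(densities) - 1)
    return densities[i]

edges, dens = make_density_bins()
print(round(hu_to_density(0, edges, dens), 3))  # -> 1.004 (near water)
```

Every voxel whose HU falls in the same bin is assigned the same material and density, which is exactly why fewer bins save memory at the cost of density resolution.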

  20. Geant4-DNA simulations using complex DNA geometries generated by the DnaFabric tool

    NASA Astrophysics Data System (ADS)

    Meylan, S.; Vimont, U.; Incerti, S.; Clairand, I.; Villagrasa, C.

    2016-07-01

    Several DNA representations are used to study radio-induced complex DNA damage, depending on the approach and the required level of granularity. Among all approaches, the mechanistic one requires the most resolved DNA models, which can go down to atomistic DNA descriptions. The complexity of such DNA models makes them hard to modify and adapt in order to take into account different biological conditions. The DnaFabric project was started to provide a tool to generate, visualise and modify such complex DNA models. In the current version of DnaFabric, the models can be exported to the Geant4 code to be used as targets in Monte Carlo simulations. In this work, the project was used to generate two DNA fibre models corresponding to two DNA compaction levels, representing heterochromatin and euchromatin. The fibres were imported into a Geant4 application where computations were performed to estimate the influence of the DNA compaction on the amount of calculated DNA damage. The relative difference in the DNA damage computed in the two fibres for the same number of projectiles was found to be constant and equal to 1.3 for the considered primary particles (protons from 300 keV to 50 MeV). However, if only the tracks hitting the DNA target are taken into account, the relative difference is larger at low energies and decreases to reach zero around 10 MeV. The computations were performed with models that contain up to 18,000 DNA nucleotide pairs. Nevertheless, DnaFabric will be extended to manipulate multi-scale models that go from the molecular to the cellular level.

  1. Introducing Third-Year Undergraduates to GEANT4 Simulations of Light Transport and Collection in Scintillation Materials

    ERIC Educational Resources Information Center

    Riggi, Simone; La Rocca, Paola; Riggi, Francesco

    2011-01-01

    GEANT4 simulations of the processes affecting the transport and collection of optical photons generated inside a scintillation detector were carried out, with the aim to complement the educational material offered by textbooks to third-year physics undergraduates. Two typical situations were considered: a long scintillator strip with and without a…

  2. A macroscopic and microscopic study of radon exposure using Geant4 and MCNPX to estimate dose rates and DNA damage

    NASA Astrophysics Data System (ADS)

    van den Akker, Mary Evelyn

    Radon is considered the second-leading cause of lung cancer after smoking. Epidemiological studies have been conducted in miner cohorts as well as general populations to estimate the risks associated with high- and low-dose exposures. There are problems with extrapolating risk estimates to low-dose exposures, mainly that the dose-response curve at low doses is not well understood. Calculated dosimetric quantities give average energy depositions in an organ or a whole body, but morphological features of an individual can affect these values. As opposed to generic human phantom models, Computed Tomography (CT) scans provide unique, patient-specific geometries that are valuable in modeling the radiological effects of the short-lived radon progeny sources. The Monte Carlo particle transport code Geant4 was used with the CT scan data to model radon inhalation in the main bronchial bifurcation. The equivalent dose rates are near the lower bounds of estimates found in the literature, depending on the source volume. To complement the macroscopic study, simulations were run in a small tissue volume using the Geant4-DNA toolkit. As an extension of Geant4 meant to simulate direct physical interactions at the cellular level, it allows the particle track structure of the radon progeny alphas to be analyzed to estimate the damage that can occur in sensitive cellular structures like the DNA molecule. These estimates of DNA double-strand breaks are lower than those found in previous Geant4-DNA studies. Further refinements of the microscopic model are at the cutting edge of nanodosimetry research.

  3. Microdosimetry of alpha particles for simple and 3D voxelised geometries using MCNPX and Geant4 Monte Carlo codes.

    PubMed

    Elbast, M; Saudo, A; Franck, D; Petitot, F; Desbrée, A

    2012-07-01

    Microdosimetry using Monte Carlo simulation is a suitable technique to describe the stochastic nature of energy deposition by alpha particles at the cellular level. Because of its short range, the energy imparted by this particle to the targets is highly non-uniform. Thus, to achieve accurate dosimetric results, the modelling of the geometry should be as realistic as possible. The objectives of the present study were to validate the use of the MCNPX and Geant4 Monte Carlo codes for microdosimetric studies using simple and three-dimensional voxelised geometries, and to study their limits of validity in the latter case. To that aim, the specific energy (z) deposited in the cell nucleus, the single-hit density of specific energy f(1)(z) and the mean specific energy were calculated. Results show good agreement with the literature for the simple geometry; the maximum percentage difference found is <6%. For the voxelised phantom, the study of the voxel size highlighted that the shape of the curve f(1)(z) obtained with MCNPX for voxel sizes <1 µm differs significantly from that of the non-voxelised geometry. With Geant4, only small differences are observed whatever the voxel size. Below 1 µm, the use of Geant4 is therefore required. However, the calculation time is 10 times higher with Geant4 than with the MCNPX code under the same conditions.

  4. Implementation of the n-body Monte-Carlo event generator into the Geant4 toolkit for photonuclear studies

    NASA Astrophysics Data System (ADS)

    Luo, Wen; Lan, Hao-yang; Xu, Yi; Balabanski, Dimiter L.

    2017-03-01

    A data-based Monte Carlo simulation algorithm, Geant4-GENBOD, was developed by coupling the n-body Monte Carlo event generator to the Geant4 toolkit, aiming at accurate simulations of specific photonuclear reactions for diverse photonuclear physics studies. Geant4-GENBOD calculations were compared with reported measurements of photo-neutron production cross-sections and yields, and with reported energy spectra of the 6Li(n,α)t reaction. Good agreement between the calculations and the experimental data was found, validating the developed program. Furthermore, simulations of the 92Mo(γ,p) reaction of astrophysics relevance and of the photo-neutron production of 99Mo/99mTc and 225Ra/225Ac radioisotopes were investigated, demonstrating the applicability of this program. We conclude that Geant4-GENBOD is a reliable tool for the study of emerging experimental programs at high-intensity γ-beam laboratories, such as the Extreme Light Infrastructure - Nuclear Physics facility and the High Intensity Gamma-Ray Source at Duke University.

  5. Geant4 physics processes for microdosimetry simulation: Very low energy electromagnetic models for protons and heavy ions in silicon

    NASA Astrophysics Data System (ADS)

    Valentin, A.; Raine, M.; Gaillardin, M.; Paillet, P.

    2012-09-01

    The Geant4-DNA extension of the Geant4 Monte Carlo simulation toolkit aims at modeling early biological damages induced by ionizing radiation at the DNA scale, and it can now track particles down to very low energies in liquid water. New models, called "MuElec", have been implemented for microelectronic applications following the same initial theory, to track low energy electrons in silicon. This paper presents the extension of these MuElec models to incident protons and heavy ions in silicon. First, the theory of the model is presented. The resulting cross sections and stopping powers are compared with data from the literature. The model is then implemented in Geant4 and used to simulate proton tracks. Various physical quantities are extracted from the simulation, and compared with data from the literature and with results from simulation using other Geant4 models. It is shown that the generation of low-energy electrons results in more physically meaningful low-energy secondary electron tracks, which significantly modifies the proton and ion track core on the nanometer scale.

  6. Using Software Design Methods in CALL

    ERIC Educational Resources Information Center

    Ward, Monica

    2006-01-01

    The phrase "software design" is not one that arouses the interest of many CALL practitioners, particularly those from a humanities background. However, software design essentials are simply logical ways of going about designing a system. The fundamentals include modularity, anticipation of change, generality and an incremental approach. While CALL…

  7. Language and Program for Documenting Software Design

    NASA Technical Reports Server (NTRS)

    Kleine, H.; Zepko, T. M.

    1986-01-01

    Software Design and Documentation Language (SDDL) provides effective communication medium to support design and documentation of complex software applications. SDDL supports communication among all members of software design team and provides for production of informative documentation on design effort. Use of SDDL-generated document to analyze design makes it possible to eliminate many errors not detected until coding and testing attempted. SDDL processor program translates designer's creative thinking into effective document for communication. Processor performs as many automatic functions as possible, freeing designer's energy for creative effort. SDDL processor program written in PASCAL.

  8. Knowledge modeling for software design

    NASA Technical Reports Server (NTRS)

    Shaw, Mildred L. G.; Gaines, Brian R.

    1992-01-01

    This paper develops a modeling framework for systems engineering that encompasses systems modeling, task modeling, and knowledge modeling, and allows knowledge engineering and software engineering to be seen as part of a unified developmental process. This framework is used to evaluate what novel contributions the 'knowledge engineering' paradigm has made and how these impact software engineering.

  9. Evaluation on Geant4 Hadronic Models for Pion Minus, Pion Plus and Neutron Particles as Major Antiproton Annihilation Products.

    PubMed

    Tavakoli, Mohammad Bagher; Mohammadi, Mohammad Mehdi; Reiazi, Reza; Jabbari, Keyvan

    2015-01-01

    Geant4 is an open source simulation toolkit based on C++ whose advantages have progressively led to applications in many research domains, especially modeling the biological effects of ionizing radiation at the sub-cellular scale. However, it was shown that Geant4 does not give a reasonable result in the prediction of antiproton dose, especially in the Bragg peak. One of the reasons could be the lack of a reliable physics model to predict the final states of annihilation products like pions. Considering the fact that most of the antiproton deposited dose results from high-LET nuclear fragments following pion interactions with surrounding nucleons, we reproduced depth-dose curves for the most probable energy range of pions and for neutrons using Geant4. We consider this work one of the steps toward understanding the origin of the error and, finally, verification of Geant4 for antiproton tracking. Geant4 toolkit version 9.4.6.p01 and Fluka version 2006.3 were used to reproduce the depth-dose curves of 220 MeV pions (both negative and positive) and 70 MeV neutrons. The geometry applied in the simulations consists of a 20 × 20 × 20 cm(3) water tank, similar to that used at CERN for antiproton relative dose measurements. Different physics lists, including Quark-Gluon String Precompound (QGSP)_Binary Cascade (BIC)_HP, the recommended setting for hadron therapy, were used. In the case of pions, Geant4 resulted in at least 5% dose discrepancy between different physics lists at depths close to the entrance point; up to 15% discrepancy was found in some cases, such as QBBC compared to QGSP_BIC_HP. A significant difference was observed in the dose profiles of different Geant4 physics lists at small depths for a beam of pions. In the case of neutrons, a large dose discrepancy was observed when the LHEP or LHEP_EMV lists were applied; the magnitude of this dose discrepancy could be even 50% greater than the dose calculated by LHEP (or LHEP_EMV) at larger depths.

  10. Influence of thyroid volume reduction on absorbed dose in 131I therapy studied by using Geant4 Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Rahman, Ziaur; Mirza, Sikander M.; Arshed, Waheed; Mirza, Nasir M.; Ahmed, Waheed

    2014-05-01

    A simulation study has been performed to quantify the effect of volume reduction on the thyroid absorbed dose per decay and to investigate the variation of energy deposition per decay due to the β- and γ-activity of 131I with the volume/mass of the thyroid, for water, ICRP- and ICRU-soft tissue taken as the thyroid material. A Monte Carlo model of the thyroid was constructed in the Geant4 radiation transport simulation toolkit to compute the β- and γ-absorbed dose in the simulated thyroid phantom for various values of its volume. The effect of the size and shape of the thyroid on the energy deposition per decay has also been studied by using spherical, ellipsoidal and cylindrical models for the thyroid and varying its volume in the 1-25 cm3 range. The relative differences between the Geant4 results for the different models, and between the Geant4 and MCNP results, lie well below 1.870%. The maximum relative difference among the Geant4 results for water and the ICRP and ICRU soft tissues is not more than 0.225%. S-values for the ellipsoidal, spherical and cylindrical thyroid models were estimated, and the relative difference with published results lies within 3.095%. The absorbed fraction values for beta particles show good agreement with published values, within 2.105% deviation. The Geant4-based absorbed fractions for gammas again show good agreement with the corresponding MCNP and EGS4 results (±6.667%) but are 29.032% higher than the MIRD calculated values. Consistent with previous studies, the reduction of the thyroid volume is found to have a substantial effect on the absorbed dose. Geant4 simulations confirm the dose dependence on the volume/mass of the thyroid in agreement with MCNP and EGS4 computed values but differ substantially from the MIRD8 data. Therefore, inclusion of the size/mass dependence is indicated in 131I radiotherapy of the thyroid.
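
    The S-value discussed above is, by definition, the absorbed dose per decay, S = Δ·φ/m, which makes the mass dependence explicit: at fixed energy per decay and absorbed fraction, halving the thyroid mass doubles the dose per decay. A minimal sketch (illustrative Python; the numeric inputs in the usage are examples, not the paper's values):

```python
MEV_TO_J = 1.602e-13  # joules per MeV

def s_value(mean_energy_per_decay_MeV, absorbed_fraction, mass_g):
    """Absorbed dose per decay (Gy/decay): S = Delta * phi / m,
    with Delta the mean energy emitted per decay, phi the absorbed
    fraction, and m the target mass."""
    joules = mean_energy_per_decay_MeV * MEV_TO_J
    return joules * absorbed_fraction / (mass_g * 1e-3)  # grams -> kg
```

    For example, with the same emitted energy and absorbed fraction, `s_value(e, phi, 10.0)` is exactly twice `s_value(e, phi, 20.0)`, which is the volume-reduction effect the study quantifies.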

  11. Commissioning of 6 MV medical linac for dynamic MLC-based IMRT on Monte Carlo code GEANT4.

    PubMed

    Okamoto, Hiroyuki; Fujita, Yukio; Sakama, Kyoko; Saitoh, Hidetoshi; Kanai, Tatsuaki; Itami, Jun; Kohno, Toshiyuki

    2014-07-01

    Monte Carlo simulation is the most accurate tool for calculating dose distributions. In particular, the Electron Gamma Shower (EGS) computer code has been widely used for multi-purpose research in radiotherapy, but the Monte Carlo code GEANT4 (GEometry ANd Tracking) is rarely used for radiotherapy with photon beams and needs to be verified further under various irradiation conditions, particularly multi-leaf collimator-based intensity-modulated radiation therapy (MLC-based IMRT). In this study, GEANT4 was used to model a 6 MV linac for dynamic MLC-based IMRT. To verify the modeling of our linac, we compared the calculated data with the measured depth dose for a 10 × 10 cm(2) field and the measured dose profile for a 35 × 35 cm(2) field. Moreover, 120 MLC leaves were modeled in GEANT4. Five tests of the MLC modeling were performed: (I) MLC transmission, (II) MLC transmission profile including intra- and inter-leaf leakage, (III) tongue-and-groove leakage, (IV) simple fields of different sizes shaped by the MLC and (V) a dynamic MLC-based IMRT field. For all tests, the calculations were compared with ionization chamber and radiographic film measurements. The calculations agreed with the measurements: MLC transmission by calculation and by measurement was 1.76 ± 0.01% and 1.87 ± 0.01%, respectively. In the gamma evaluation method (3%/3 mm), the pass rates of tests (IV) and (V) were 98.5% and 97.0%, respectively. Furthermore, tongue-and-groove leakage could be calculated by GEANT4 and agreed with the film measurements. A procedure for commissioning dynamic MLC-based IMRT in GEANT4 is proposed in this study.
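
    The gamma evaluation method used above combines a dose-difference criterion (3% of the maximum) with a distance-to-agreement criterion (3 mm) into a single index; a point passes when gamma ≤ 1. A minimal 1D sketch (illustrative Python with global normalization; the profiles in the usage are made up, not the paper's data):

```python
import math

def gamma_index_1d(ref_pos, ref_dose, eval_pos, eval_dose,
                   dose_tol=0.03, dist_tol=3.0):
    """1D global gamma index: dose_tol is a fraction of the reference
    maximum, dist_tol is the distance-to-agreement criterion in mm."""
    d_max = max(ref_dose)
    gammas = []
    for rp, rd in zip(ref_pos, ref_dose):
        best = float("inf")
        for ep, ed in zip(eval_pos, eval_dose):
            dd = (ed - rd) / (dose_tol * d_max)   # dose-difference term
            dx = (ep - rp) / dist_tol             # distance term
            best = min(best, math.hypot(dd, dx))
        gammas.append(best)
    return gammas

def pass_rate(gammas):
    """Percentage of reference points with gamma <= 1."""
    return 100.0 * sum(g <= 1.0 for g in gammas) / len(gammas)
```

    Identical calculated and measured profiles give a gamma of zero everywhere and a 100% pass rate; the 98.5% and 97.0% figures quoted in the abstract are this pass rate evaluated over the measured field.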

  12. LSST control software component design

    NASA Astrophysics Data System (ADS)

    Lotz, Paul J.; Dubois-Felsmann, Gregory P.; Lim, Kian-Tat; Johnson, Tony; Chandrasekharan, Srinivasan; Mills, David; Daly, Philip; Schumacher, Germán.; Delgado, Francisco; Pietrowicz, Steve; Selvy, Brian; Sebag, Jacques; Marshall, Stuart; Sundararaman, Harini; Contaxis, Christopher; Bovill, Robert; Jenness, Tim

    2016-08-01

    Construction of the Large Synoptic Survey Telescope system involves several different organizations, a situation that poses many challenges for the software integration of the components. To ensure commonality for the purposes of usability, maintainability, and robustness, the LSST software teams have agreed on the following for system software components: a summary state machine, a manner of managing settings, a flexible solution for specifying controller/controllee relationships reliably as needed, and a paradigm for responding to and communicating alarms. This paper describes these agreed solutions and the factors that motivated them.
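
    A summary state machine of the kind agreed above gives every component the same coarse lifecycle and makes illegal command sequences detectable at integration time. A minimal sketch (illustrative Python; the state and command names below are hypothetical placeholders, not the LSST interface definition):

```python
# Hypothetical states and commands for illustration only; the actual
# LSST component interface is defined by the project's documentation.
TRANSITIONS = {
    ("OFFLINE", "enterControl"): "STANDBY",
    ("STANDBY", "start"): "DISABLED",
    ("DISABLED", "enable"): "ENABLED",
    ("ENABLED", "disable"): "DISABLED",
    ("DISABLED", "standby"): "STANDBY",
    ("STANDBY", "exitControl"): "OFFLINE",
}

class Component:
    """A software component governed by a shared summary state machine."""

    def __init__(self):
        self.state = "OFFLINE"

    def command(self, cmd):
        """Apply a command; reject it if no transition is defined."""
        key = (self.state, cmd)
        if key not in TRANSITIONS:
            raise ValueError(f"command {cmd!r} rejected in state {self.state}")
        self.state = TRANSITIONS[key]
        return self.state
```

    Because every component shares the one transition table, a central controller can reason about any component's availability from its summary state alone.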

  13. Software design studies emphasizing Project LOGOS

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The results of a research project on the development of computer software are presented. Research funds of $200,000 were expended over a three year period for software design and projects in connection with Project LOGOS (computer-aided design and certification of computing systems). Abstracts of theses prepared during the project are provided.

  14. An empirical study of software design practices

    NASA Technical Reports Server (NTRS)

    Card, David N.; Church, Victor E.; Agresti, William W.

    1986-01-01

    Software engineers have developed a large body of software design theory and folklore, much of which was never validated. The results of an empirical study of software design practices in one specific environment are presented. The practices examined affect module size, module strength, data coupling, descendant span, unreferenced variables, and software reuse. Measures characteristic of these practices were extracted from 887 FORTRAN modules developed for five flight dynamics software projects monitored by the Software Engineering Laboratory (SEL). The relationship of these measures to cost and fault rate was analyzed using a contingency table procedure. The results show that some recommended design practices, despite their intuitive appeal, are ineffective in this environment, whereas others are very effective.
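
    The contingency table procedure referred to above cross-tabulates a design measure (e.g. module size) against an outcome (e.g. fault rate) and tests for association. A minimal sketch for a 2×2 table (illustrative Python; the counts in the usage are made up, not the SEL data):

```python
def chi_square_2x2(a, b, c, d):
    """Chi-square statistic for the 2x2 contingency table [[a, b], [c, d]],
    e.g. rows = small/large modules, columns = low/high fault rate.
    Uses the closed form n*(ad - bc)^2 / ((a+b)(c+d)(a+c)(b+d))."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
```

    A statistic of zero means the practice and the outcome are independent in the sample; large values (compared against the chi-square distribution with one degree of freedom) indicate the practice is associated with cost or fault rate.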

  15. GEANT4 Simulations of Gamma-Gamma Angular Correlations with GRIFFIN

    NASA Astrophysics Data System (ADS)

    Natzke, Connor; Griffin Collaboration

    2016-09-01

    The structure of very neutron-rich isotopes has been of recent experimental interest for both nuclear astrophysics and fundamental nuclear structure investigations. In beta-minus decay specifically, beta-delayed gamma cascades can help to shed light on the spin and parity of the states involved. One of the world's most powerful decay spectroscopy tools is the Gamma-Ray Infrastructure For Fundamental Investigations of Nuclei (GRIFFIN) spectrometer at TRIUMF-ISAC in Vancouver, Canada. To investigate the feasibility of these experimental studies, GEANT4 simulations of neutron-rich nuclei are critical, as they are able to provide realistic estimates of what the experimental results may look like. The first such nucleus investigated was 44P, and both the temporal and angular γγ correlations were extracted. Furthermore, the simulations were used to model various multipole decay possibilities, which provides a powerful tool for analyzing data collected at such facilities. In the future, the Facility for Rare Isotope Beams (FRIB) at MSU will be an ideal site for such studies on the most exotic nuclei.

  16. Simulation of ultrasoft X-rays induced DNA damage using the Geant4 Monte Carlo toolkit

    NASA Astrophysics Data System (ADS)

    Tajik, Marjan; Rozatian, Amir S. H.; Semsarha, Farid

    2015-01-01

    In this study, the total yields of SSBs and DSBs induced by monoenergetic electrons with energies of 0.28-4.55 keV, corresponding to ultrasoft X-ray energies, have been calculated in the Charlton and Humm volume model using the Geant4-DNA toolkit and compared with theoretical and experimental data. A reasonable agreement between the results obtained in the present study and the experimental and theoretical data of previous studies showed the efficiency of this model, in spite of its simplicity, in estimating the total yield of strand breaks. It has also been found that in the low-energy region the total SSB yield remains nearly constant while the DSB yield increases with decreasing energy. Moreover, a direct dependency between DSB induction, the RBE value and the mean lineal energy, a microdosimetric quantity, has been observed. In addition, it has become clear that using a threshold energy of 10.79 eV to calculate the total strand break yields gives better agreement with the experiments, while a threshold of 17.5 eV shows a large discrepancy.
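
    Threshold-based strand-break scoring of the kind compared above can be sketched as follows: a backbone energy deposition above the threshold makes a single-strand break, and two breaks on opposite strands within a few base pairs count as a DSB. This is illustrative Python; the 10 bp separation criterion is a common convention assumed here, not a value taken from the paper:

```python
def classify_breaks(hits, threshold_eV=10.79, dsb_max_sep=10):
    """hits: list of (strand, base_pair_index, energy_eV) depositions on
    the sugar-phosphate backbone. A deposition at or above threshold_eV
    makes a strand break; two breaks on opposite strands within
    dsb_max_sep base pairs pair up into one DSB. Returns (ssb, dsb)."""
    breaks = sorted({(s, i) for s, i, e in hits if e >= threshold_eV},
                    key=lambda b: b[1])
    dsb = 0
    used = set()
    for s, i in breaks:
        if (s, i) in used:
            continue
        partner = next(((s2, i2) for s2, i2 in breaks
                        if s2 != s and abs(i2 - i) <= dsb_max_sep
                        and (s2, i2) not in used), None)
        if partner is not None:
            dsb += 1
            used.update({(s, i), partner})
    ssb = len(breaks) - 2 * dsb
    return ssb, dsb
```

    Raising the threshold from 10.79 eV to 17.5 eV removes the lowest-energy depositions from the break set, which is why the abstract finds the total yields so sensitive to this parameter.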

  17. Geant4 simulation study of Indian National Gamma Array at TIFR

    NASA Astrophysics Data System (ADS)

    Saha, S.; Palit, R.; Sethi, J.; Biswas, S.; Singh, P.

    2016-03-01

    A Geant4 simulation code for the Indian National Gamma Array (INGA), consisting of 24 Compton-suppressed clover high-purity germanium (HPGe) detectors, has been developed. The calculated properties, in the energy range of interest for nuclear γ-ray spectroscopy, are spectral distributions for various standard radioactive sources, intrinsic peak efficiencies and peak-to-total (P/T) ratios in various configurations such as singles, add-back and Compton-suppressed mode. The principle of operation of the detectors in add-back and Compton-suppression mode has been reproduced in the simulation. The reliability of the calculation is checked by comparison with experimental data for various γ-ray energies up to 5 MeV. The comparison between simulation results and experimental data demonstrates the need to incorporate the exact geometry of the clover detectors, the anti-Compton shields and other surrounding materials in the array to explain the detector response to γ-rays. Several experimental effects are also investigated, including the geometrical correction to the angular distribution, the crosstalk probability and the impact of heavy-metal collimators between the target and the array on the P/T ratio.
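
    In add-back mode, the energies deposited in the crystals of one clover during the same event are summed, recovering the full photon energy when it Compton-scatters between neighbouring crystals. The bookkeeping can be sketched as follows (illustrative Python, not the INGA analysis code):

```python
def addback(crystal_hits):
    """crystal_hits: list of (clover_id, energy_keV) for one event.
    Sum the energies of all crystals firing within the same clover
    ('add-back' mode); returns {clover_id: summed_energy_keV}."""
    totals = {}
    for clover_id, energy_keV in crystal_hits:
        totals[clover_id] = totals.get(clover_id, 0.0) + energy_keV
    return totals
```

    For example, a 511 keV photon that deposits 300 keV in one crystal and 211 keV in a neighbour of the same clover is restored to the full-energy peak, which is how add-back improves the P/T ratio discussed in the abstract.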

  18. VIDA: A Voxel-Based Dosimetry Method for Targeted Radionuclide Therapy Using Geant4

    PubMed Central

    Dewaraja, Yuni K.; Abramson, Richard G.; Stabin, Michael G.

    2015-01-01

    We have developed the Voxel-Based Internal Dosimetry Application (VIDA) to provide patient-specific dosimetry in targeted radionuclide therapy performing Monte Carlo simulations of radiation transport with the Geant4 toolkit. The code generates voxel-level dose rate maps using anatomical and physiological data taken from individual patients. Voxel level dose rate curves are then fit and integrated to yield a spatial map of radiation absorbed dose. In this article, we present validation studies using established dosimetry results, including self-dose factors (DFs) from the OLINDA/EXM program for uniform activity in unit density spheres and organ self- and cross-organ DFs in the Radiation Dose Assessment Resource (RADAR) reference adult phantom. The comparison with reference data demonstrated agreement within 5% for self-DFs to spheres and reference phantom source organs for four common radionuclides used in targeted therapy (131I, 90Y, 111In, 177Lu). Agreement within 9% was achieved for cross-organ DFs. We also present dose estimates to normal tissues and tumors from studies of two non-Hodgkin Lymphoma patients treated by 131I radioimmunotherapy, with comparison to results generated independently with another dosimetry code. A relative difference of 12% or less was found between methods for mean absorbed tumor doses accounting for tumor regression. PMID:25594357

  19. Modeling of x-ray fluorescence using MCNPX and Geant4

    SciTech Connect

    Rajasingam, Akshayan; Hoover, Andrew S; Fensin, Michael L; Tobin, Stephen J

    2009-01-01

    X-Ray Fluorescence (XRF) is one of thirteen non-destructive assay techniques being researched for the purpose of quantifying the Pu mass in used fuel assemblies. The modeling portion of this research will be conducted with the MCNPX transport code. The research presented here was undertaken to test the capability of MCNPX so that it can be used to benchmark measurements made at ORNL and to give confidence in the application of MCNPX as a predictive tool for the expected capability of XRF in the context of used fuel assemblies. The main focus of this paper is a code-to-code comparison between MCNPX and the Geant4 code. Since XRF in used fuel is driven by photon emission and beta decay of fission fragments, both source terms were independently researched. Simple cases and used-fuel cases were modeled for both source terms. In order to prepare for benchmarking against experiments, it was necessary to determine the relative significance of the various fission fragments for producing X-rays.

  20. Distributions of deposited energy and ionization clusters around ion tracks studied with Geant4 toolkit.

    PubMed

    Burigo, Lucas; Pshenichnov, Igor; Mishustin, Igor; Hilgers, Gerhard; Bleicher, Marcus

    2016-05-21

    The Geant4-based Monte Carlo model for Heavy-Ion Therapy (MCHIT) was extended to study the patterns of energy deposition at sub-micrometer distance from individual ion tracks. Dose distributions for low-energy (1)H, (4)He, (12)C and (16)O ions measured in several experiments are well described by the model in a broad range of radial distances, from 0.5 to 3000 nm. Despite the fact that such distributions are characterized by long tails, a dominant fraction of deposited energy (∼80%) is confined within a radius of about 10 nm. The probability distributions of clustered ionization events in nanoscale volumes of water traversed by (1)H, (2)H, (4)He, (6)Li, (7)Li, and (12)C ions are also calculated. A good agreement of calculated ionization cluster-size distributions with the corresponding experimental data suggests that the extended MCHIT can be used to characterize stochastic processes of energy deposition to sensitive cellular structures.

  1. Radial dose distributions from protons of therapeutic energies calculated with Geant4-DNA.

    PubMed

    Wang, He; Vassiliev, Oleg N

    2014-07-21

    Models based on the amorphous track structure approximation have been successful in predicting the biological effects of heavy charged particles. Development of such models remains an active area of research that includes applications to hadrontherapy. In such models, the radial distribution of the dose deposited by delta electrons and directly by the particle is the main characteristic of track structure. We calculated these distributions with Geant4-DNA Monte Carlo code for protons in the energy range from 10 to 100 MeV. These results were approximated by a simple formula that combines the well-known inverse square distance dependence with two factors that eliminate the divergence of the radial dose integral at both small and large distances. A clear physical interpretation is given to the asymptotic behaviour of the radial dose distribution resulting from these two factors. The proposed formula agrees with the Monte Carlo data within 10% for radial distances of up to 10 μm, which corresponds to a dose range covering over eight orders of magnitude. Differences between our results and those of previously published analytical models are discussed.

  2. Radial dose distributions from protons of therapeutic energies calculated with Geant4-DNA

    NASA Astrophysics Data System (ADS)

    Wang, He; Vassiliev, Oleg N.

    2014-07-01

    Models based on the amorphous track structure approximation have been successful in predicting the biological effects of heavy charged particles. Development of such models remains an active area of research that includes applications to hadrontherapy. In such models, the radial distribution of the dose deposited by delta electrons and directly by the particle is the main characteristic of track structure. We calculated these distributions with Geant4-DNA Monte Carlo code for protons in the energy range from 10 to 100 MeV. These results were approximated by a simple formula that combines the well-known inverse square distance dependence with two factors that eliminate the divergence of the radial dose integral at both small and large distances. A clear physical interpretation is given to the asymptotic behaviour of the radial dose distribution resulting from these two factors. The proposed formula agrees with the Monte Carlo data within 10% for radial distances of up to 10 μm, which corresponds to a dose range covering over eight orders of magnitude. Differences between our results and those of previously published analytical models are discussed.
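
    The abstract describes, without giving the formula, an inverse-square radial dose with two factors that remove the divergence of the radial dose integral at small and large distances. A sketch of a parametrization of that kind (illustrative Python; the regularized core and exponential cutoff below are assumed forms chosen to show the behaviour, not the authors' actual factors or fitted parameters):

```python
import math

def radial_dose(r, c=1.0, r_min=1e-3, r_max=5.0):
    """Illustrative radial dose in arbitrary units (r in micrometers):
    inverse-square law with a small-radius regularization (r_min) and a
    large-radius exponential cutoff (r_max). Both cutoffs are assumptions."""
    return c / (r * r + r_min * r_min) * math.exp(-r / r_max)

def integrated_dose(r_lo, r_hi, steps=50000):
    """Midpoint-rule integral of 2*pi*r*D(r) dr, i.e. the energy deposited
    per unit track length between radii r_lo and r_hi."""
    h = (r_hi - r_lo) / steps
    total = 0.0
    for i in range(steps):
        r = r_lo + (i + 0.5) * h
        total += 2 * math.pi * r * radial_dose(r) * h
    return total
```

    Without the two factors, the integral of 2πr·D(r) would diverge logarithmically at both ends; with them, almost all of the energy lies inside the cutoff radius, mirroring the asymptotic behaviour the paper interprets physically.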

  3. Optimization of a general-purpose, actively scanned proton beamline for ocular treatments: Geant4 simulations.

    PubMed

    Piersimoni, Pierluigi; Rimoldi, Adele; Riccardi, Cristina; Pirola, Michele; Molinelli, Silvia; Ciocca, Mario

    2015-03-08

    The Italian National Center for Hadrontherapy (CNAO, Centro Nazionale di Adroterapia Oncologica), a synchrotron-based hospital facility, started the treatment of patients within selected clinical trials in late 2011 and 2012 with actively scanned proton and carbon ion beams, respectively. The activation of a new clinical protocol for the irradiation of uveal melanoma using the existing general-purpose proton beamline is foreseen for late 2014. Beam characteristics and the patient treatment setup need to be tuned to meet the specific requirements of such a treatment technique. The aim of this study is to optimize the CNAO transport beamline by adding passive components and minimizing the air gap to achieve the optimal conditions for ocular tumor irradiation. The CNAO setup with the active and passive components along the transport beamline, as well as a human-eye-modeled detector also including a realistic target volume, was simulated using the Monte Carlo Geant4 toolkit. The strong reduction of the air gap between the nozzle and the patient skin, as well as the insertion of a range shifter plus a patient-specific brass collimator at a short distance from the eye, were found to be effective measures to implement. In perspective, this simulation toolkit could also be used as a benchmark for future developments and testing purposes on commercial treatment planning systems.

  4. Ion therapy for uveal melanoma in new human eye phantom based on GEANT4 toolkit.

    PubMed

    Mahdipour, Seyed Ali; Mowlavi, Ali Asghar

    2016-01-01

    Radiotherapy with ion beams such as protons and carbon has been used for the treatment of uveal melanoma of the eye for many years. In this research, we have developed a new phantom of the human eye for Monte Carlo simulation of tumor treatment with the GEANT4 toolkit. Total depth-dose profiles for proton, alpha, and carbon incident beams with the same range have been calculated in the phantom. Moreover, the energy deposited by the secondary particles of each primary beam is calculated. The dose curves are compared for 47.8 MeV protons, 190.1 MeV alphas, and 1060 MeV carbon ions, which have the same range in the target region, reaching the center of the tumor. The passively scattered spread-out Bragg peak (SOBP) for each incident beam, as well as the flux curves of the secondary particles including neutrons, gammas, and positrons, has been calculated and compared for the primary beams. The high sharpness of the carbon beam's Bragg peak with low lateral broadening is the benefit of this beam in hadrontherapy, but it has the disadvantages of dose leakage in the tail after its Bragg peak and a high intensity of neutron production. However, the proton beam, which conforms well to the tumor shape owing to the beam broadening caused by scattering, can be a good choice for large tumors.

  5. VIDA: a voxel-based dosimetry method for targeted radionuclide therapy using Geant4.

    PubMed

    Kost, Susan D; Dewaraja, Yuni K; Abramson, Richard G; Stabin, Michael G

    2015-02-01

    We have developed the Voxel-Based Internal Dosimetry Application (VIDA) to provide patient-specific dosimetry in targeted radionuclide therapy performing Monte Carlo simulations of radiation transport with the Geant4 toolkit. The code generates voxel-level dose rate maps using anatomical and physiological data taken from individual patients. Voxel level dose rate curves are then fit and integrated to yield a spatial map of radiation absorbed dose. In this article, we present validation studies using established dosimetry results, including self-dose factors (DFs) from the OLINDA/EXM program for uniform activity in unit density spheres and organ self- and cross-organ DFs in the Radiation Dose Assessment Resource (RADAR) reference adult phantom. The comparison with reference data demonstrated agreement within 5% for self-DFs to spheres and reference phantom source organs for four common radionuclides used in targeted therapy ((131)I, (90)Y, (111)In, (177)Lu). Agreement within 9% was achieved for cross-organ DFs. We also present dose estimates to normal tissues and tumors from studies of two non-Hodgkin Lymphoma patients treated by (131)I radioimmunotherapy, with comparison to results generated independently with another dosimetry code. A relative difference of 12% or less was found between methods for mean absorbed tumor doses accounting for tumor regression.
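
    The fit-and-integrate step described above turns a handful of measured voxel dose rates into a cumulative absorbed dose. A minimal sketch, assuming (hypothetically) a mono-exponential dose rate model D(t) = D0·exp(-λt), for which the integral from zero to infinity is D0/λ (the paper does not state which fitting function VIDA uses):

```python
import math

def fit_monoexponential(times, rates):
    """Least-squares fit of ln(rate) = ln(D0) - lam * t, i.e. a straight
    line through the log of the dose rate samples. Returns (D0, lam)."""
    n = len(times)
    ys = [math.log(r) for r in rates]
    t_bar = sum(times) / n
    y_bar = sum(ys) / n
    slope = (sum((t - t_bar) * (y - y_bar) for t, y in zip(times, ys))
             / sum((t - t_bar) ** 2 for t in times))
    lam = -slope
    d0 = math.exp(y_bar + lam * t_bar)
    return d0, lam

def absorbed_dose(d0, lam):
    """Closed-form integral of D0 * exp(-lam * t) from t = 0 to infinity."""
    return d0 / lam
```

    Running this per voxel over the time-series of dose rate maps yields the spatial map of absorbed dose the abstract describes.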

  6. Interaction of Fast Nucleons with Actinide Nuclei Studied with GEANT4

    NASA Astrophysics Data System (ADS)

    Malyshkin, Yu.; Pshenichnov, I.; Mishustin, I.; Greiner, W.

    2014-04-01

    We model interactions of protons and neutrons with energies from 1 to 1000 MeV with 241Am and 243Am nuclei. The calculations are performed with the Monte Carlo model for Accelerator Driven Systems (MCADS), which we developed based on version 9.4 of the GEANT4 toolkit. This toolkit is widely used to simulate the propagation of particles in various materials containing nuclei up to uranium; after several extensions, we apply it also to proton- and neutron-induced reactions on Am. The fission and radiative neutron capture cross sections, neutron multiplicities and distributions of fission fragments were calculated for 241Am and 243Am and compared with experimental data. As demonstrated, the fission of americium by energetic protons with energies above 20 MeV is well described by the Intra-Nuclear Cascade Liège (INCL) model combined with the fission-evaporation model ABLA. The calculated average numbers of fission neutrons and mass distributions of fission products agree well with the corresponding data. However, proton-induced fission below 20 MeV is described less accurately, which is attributed to the limitations of the Intra-Nuclear Cascade model at low projectile energies.

  7. Ion therapy for uveal melanoma in new human eye phantom based on GEANT4 toolkit

    SciTech Connect

    Mahdipour, Seyed Ali; Mowlavi, Ali Asghar

    2016-07-01

    Radiotherapy with ion beams such as protons and carbon has been used for the treatment of uveal melanoma of the eye for many years. In this research, we have developed a new phantom of the human eye for Monte Carlo simulation of tumor treatment with the GEANT4 toolkit. Total depth-dose profiles for proton, alpha, and carbon incident beams with the same range have been calculated in the phantom. Moreover, the energy deposited by the secondary particles of each primary beam is calculated. The dose curves are compared for 47.8 MeV protons, 190.1 MeV alphas, and 1060 MeV carbon ions, which have the same range in the target region, reaching the center of the tumor. The passively scattered spread-out Bragg peak (SOBP) for each incident beam, as well as the flux curves of the secondary particles including neutrons, gammas, and positrons, has been calculated and compared for the primary beams. The high sharpness of the carbon beam's Bragg peak with low lateral broadening is the benefit of this beam in hadrontherapy, but it has the disadvantages of dose leakage in the tail after its Bragg peak and a high intensity of neutron production. However, the proton beam, which conforms well to the tumor shape owing to the beam broadening caused by scattering, can be a good choice for large tumors.

  8. Software Updates: Web Design--Software that Makes It Easy!

    ERIC Educational Resources Information Center

    Pattridge, Gregory C.

    2002-01-01

    This article discusses Web design software that provides an easy-to-use interface. The "Netscape Communicator" is highlighted for beginning Web page construction and step-by-step instructions are provided for starting out, page colors and properties, indents, bulleted lists, tables, adding links, navigating long documents, creating e-mail links,…

  9. Domain specific software design for decision aiding

    NASA Technical Reports Server (NTRS)

    Keller, Kirby; Stanley, Kevin

    1992-01-01

    McDonnell Aircraft Company (MCAIR) is involved in many large multi-discipline design and development efforts of tactical aircraft. These involve a number of design disciplines that must be coordinated to produce an integrated design and a successful product. Our interpretation of a domain specific software design (DSSD) is that of a representation or framework that is specialized to support a limited problem domain. A DSSD is an abstract software design that is shaped by the problem characteristics. This parallels the theme of object-oriented analysis and design of letting the problem model directly drive the design. The DSSD concept extends the notion of software reusability to include representations or frameworks. It supports the entire software life cycle and specifically leads to improved prototyping capability, supports system integration, and promotes reuse of software designs and supporting frameworks. The example presented in this paper is the task network architecture or design which was developed for the MCAIR Pilot's Associate program. The task network concept supported both module development and system integration within the domain of operator decision aiding. It is presented as an instance where a software design exhibited many of the attributes associated with DSSD concept.

  10. Automating the design of scientific computing software

    NASA Technical Reports Server (NTRS)

    Kant, Elaine

    1992-01-01

    SINAPSE is a domain-specific software design system that generates code from specifications of equations and algorithm methods. This paper describes the system's design techniques (planning in a space of knowledge-based refinement and optimization rules), user interaction style (user has option to control decision making), and representation of knowledge (rules and objects). It also summarizes how the system knowledge has evolved over time and suggests some issues in building software design systems to facilitate reuse.

  11. Empirical studies of design software: Implications for software engineering environments

    NASA Technical Reports Server (NTRS)

    Krasner, Herb

    1988-01-01

    The empirical studies team of MCC's Design Process Group conducted three studies in 1986-87 in order to gather data on professionals designing software systems in a range of situations. The first study (the Lift Experiment) used thinking-aloud protocols in a controlled laboratory setting to study the cognitive processes of individual designers. The second study (the Object Server Project) involved the observation, videotaping, and data collection of a design team of a medium-sized development project over several months in order to study team dynamics. The third study (the Field Study) involved interviews with personnel from 19 large development projects in the MCC shareholder companies in order to study how the process of design is affected by organizational and project behavior. The focus of this report is on key observations of the design process (at several levels) and their implications for the design of environments.

  12. Designing Control System Application Software for Change

    NASA Technical Reports Server (NTRS)

    Boulanger, Richard

    2001-01-01

    The Unified Modeling Language (UML) was used to design the Environmental Systems Test Stand (ESTS) control system software. The UML was chosen for its ability to facilitate a clear dialog between software designer and customer, from which requirements are discovered and documented in a manner which transposes directly to program objects. Applying the UML to control system software design has resulted in a baseline set of documents from which change, and the effort of that change, can be accurately measured. As the Environmental Systems Test Stand evolves, accurate estimates of the time and effort required to change the control system software will be made. Accurate quantification of the cost of software change can be made before implementation, improving schedule and budget accuracy.

  13. Assessment of patient dose reduction by bismuth shielding in CT using measurements, GEANT4 and MCNPX simulations.

    PubMed

    Mendes, M; Costa, F; Figueira, C; Madeira, P; Teles, P; Vaz, P

    2015-07-01

    This work reports on the use of two different Monte Carlo codes (GEANT4 and MCNPX) for assessing the dose reduction achieved with bismuth shields in computed tomography (CT) procedures, in order to protect radiosensitive organs such as the eye lens, thyroid and breast. Measurements were performed using head and body PMMA phantoms and an ionisation chamber placed in five different positions of the phantom. Simulations were performed to estimate Computed Tomography Dose Index values using GEANT4 and MCNPX. The relative differences between measurements and simulations were <10%. The dose reduction arising from the use of bismuth shielding ranges from 2 to 45%, depending on the position of the bismuth shield. The percentage of dose reduction was most significant for the area covered by the bismuth shielding (36% for the eye lens, 39% for the thyroid and 45% for the breast shields).
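
    The two figures of merit used above, the percentage dose reduction from a shield and the relative measurement/simulation difference, are simple ratios. A small sketch with hypothetical CTDI-style readings (the paper's raw dose values are not reproduced here):

    ```python
    def percent_reduction(dose_unshielded_mgy, dose_shielded_mgy):
        """Percentage dose reduction obtained by adding a bismuth shield."""
        return 100.0 * (dose_unshielded_mgy - dose_shielded_mgy) / dose_unshielded_mgy

    def relative_difference(measured, simulated):
        """Relative measurement/simulation difference (%), as used to validate the codes."""
        return 100.0 * abs(measured - simulated) / measured

    # Hypothetical readings in mGy, chosen only to illustrate the arithmetic.
    breast_unshielded, breast_shielded = 20.0, 11.0
    print(round(percent_reduction(breast_unshielded, breast_shielded), 1))  # → 45.0
    ```

    With these invented numbers the reduction happens to match the 45% reported for the breast shield; the actual values depend on scanner, phantom and chamber position.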

  14. Geant4 simulations of the Gamma Reaction History Diagnostic at the NIF, Omega and HIGS calibration facility

    NASA Astrophysics Data System (ADS)

    Rubery, Michael; Horsfield, Colin; Herrmann, Hans; Kim, Yong Ho; Mack, Joseph; Young, Carlton; Caldwell, Steven; Evans, Scott; Sedillo, Tom; McEvoy, Aaron; Miller, Kirk; Stoeffl, Wolfgang; Ali, Zaheer; Grafil, Elliott

    2010-11-01

    This paper discusses the development of a Geant4 model of the Gamma Reaction History (GRH) diagnostic at NIF and Omega, Inertial Confinement Fusion (ICF) laser facilities. The GRH diagnostic has been developed to measure bang-time and burn-width parameters for ICF implosions at both facilities; further investigations have also shown that measurements such as ablator areal density and ion temperature may also be possible. Absolute gamma calibration experiments have been performed at the High Intensity Gamma Source (HIGS) facility at Duke University to increase confidence in parameters supplied by simulation, for use in calculations at both laser facilities. A comparison between HIGS data, Geant4 and the ITS ACCEPT code will be presented along with other important GRH properties, such as the temporal unit response function, peak-timing shift and Cherenkov production profile, all as a function of pressure and incident gamma energy.

  15. Does software design complexity affect maintenance effort?

    NASA Technical Reports Server (NTRS)

    Epping, Andreas; Lott, Christopher M.

    1994-01-01

    The design complexity of a software system may be characterized within a refinement level (e.g., data flow among modules), or between refinement levels (e.g., traceability between the specification and the design). We analyzed an existing set of data from NASA's Software Engineering Laboratory to test whether changing software modules with high design complexity requires more personnel effort than changing modules with low design complexity. By analyzing variables singly, we identified strong correlations between software design complexity and change effort for error corrections performed during the maintenance phase. By analyzing variables in combination, we found patterns which identify modules in which error corrections were costly to perform during the acceptance test phase.
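
    The single-variable analysis described above amounts to computing a correlation between a design-complexity metric and change effort. A minimal sketch with invented module data (the SEL data set itself is not reproduced here):

    ```python
    import math

    def pearson_r(xs, ys):
        """Pearson correlation coefficient between two equal-length samples."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
        sy = math.sqrt(sum((y - my) ** 2 for y in ys))
        return cov / (sx * sy)

    # Invented per-module data: design-complexity score vs. hours to correct an error.
    complexity = [2, 5, 7, 9, 12, 15]
    effort_hours = [3, 8, 12, 19, 24, 33]
    r = pearson_r(complexity, effort_hours)
    print(round(r, 3))
    ```

    A value of r near 1 is the kind of "strong correlation between software design complexity and change effort" the study reports; the real analysis of course also controls for module size and development phase.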

  16. Intelligent Detector Design

    SciTech Connect

    Graf, N.A.; /SLAC

    2012-06-11

    As the complexity and resolution of imaging detectors increase, the need for detailed simulation of the experimental setup also becomes more important. Designing the detectors requires efficient tools to simulate the detector response and reconstruct the events. We have developed efficient and flexible tools for detailed physics and detector response simulation as well as event reconstruction and analysis. The primary goal has been to develop a software toolkit and computing infrastructure to allow physicists from universities and labs to quickly and easily conduct physics analyses and contribute to detector research and development. The application harnesses the full power of the Geant4 toolkit without requiring the end user to have any experience with either Geant4 or C++, thereby allowing the user to concentrate on the physics of the detector system.

  17. CMD-3 detector offline software development

    NASA Astrophysics Data System (ADS)

    Anisenkov, A.; Ignatov, F.; Pirogov, S.; Sibidanov, A.; Viduk, S.; Zaytsev, A.

    2010-04-01

    CMD-3 is the general-purpose cryogenic magnetic detector for the VEPP-2000 electron-positron collider, which is being commissioned at the Budker Institute of Nuclear Physics (BINP, Novosibirsk, Russia). The main aspects of the physics program of the experiment are precision measurements of hadronic cross sections, study of known and search for new vector mesons, study of the n̄n and p̄p production cross sections in the vicinity of the threshold, and search for exotic hadrons in the region of center-of-mass energy below 2 GeV. This contribution gives a general design overview and the status of implementation of the CMD-3 offline software for reconstruction, simulation, visualization and storage management. Software design standards for this project are object-oriented programming techniques, C++ as the main language, Geant4 as the only simulation tool, Geant4-based detector geometry description, CLHEP-based primary generators, the ROOT toolbox as the persistency manager and Scientific Linux as the main platform. A dedicated software development framework (Cmd3Fwk) was implemented in order to be the basic software integration solution and a high-level persistency manager. The key features of the framework are modularity, dynamic data-processing-chain handling according to the XML configuration of reconstruction modules, and on-demand data provisioning mechanisms.
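
    The idea of an XML-configured, modular processing chain can be illustrated with a minimal sketch. The module names, the `process(event)` interface, and the config layout below are invented for illustration; they are not the actual Cmd3Fwk API.

    ```python
    import xml.etree.ElementTree as ET

    REGISTRY = {}  # hypothetical module registry, keyed by class name

    def register(cls):
        REGISTRY[cls.__name__] = cls
        return cls

    @register
    class Calibrate:
        def process(self, event):
            event["calibrated"] = True
            return event

    @register
    class Reconstruct:
        def process(self, event):
            event["tracks"] = len(event.get("hits", []))
            return event

    CONFIG = """
    <chain>
      <module name="Calibrate"/>
      <module name="Reconstruct"/>
    </chain>
    """

    def build_chain(xml_text):
        """Instantiate the processing chain named in an XML configuration."""
        root = ET.fromstring(xml_text)
        return [REGISTRY[m.get("name")]() for m in root.findall("module")]

    chain = build_chain(CONFIG)
    event = {"hits": [1, 2, 3]}
    for module in chain:
        event = module.process(event)
    print(event["tracks"])  # → 3
    ```

    Reordering or swapping modules then requires only an edit to the XML, which is the "dynamic data processing chain handling" the abstract refers to.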

  18. PDB4DNA: Implementation of DNA geometry from the Protein Data Bank (PDB) description for Geant4-DNA Monte-Carlo simulations

    NASA Astrophysics Data System (ADS)

    Delage, E.; Pham, Q. T.; Karamitros, M.; Payno, H.; Stepan, V.; Incerti, S.; Maigne, L.; Perrot, Y.

    2015-07-01

    This paper describes PDB4DNA, a new Geant4 user application based on an independent, cross-platform, free and open-source C++ library, PDBlib, which enables the use of an atomic-level description of the DNA molecule in Geant4 Monte Carlo particle transport simulations. For the evaluation of direct damage induced on the DNA molecule by ionizing particles, the application makes use of an algorithm able to determine the atom in the DNA molecule closest to each energy deposition. Both the PDB4DNA application and the PDBlib library are available as free and open source under the Geant4 license.
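
    The closest-atom step is a nearest-neighbour search over the parsed atomic coordinates. A brute-force sketch follows; the atom names and coordinates are invented stand-ins for a parsed PDB file, and a real implementation over thousands of atoms would use a spatial index rather than a linear scan.

    ```python
    import math

    # Invented (name, (x, y, z)) atom records in nanometres, standing in for PDB data.
    atoms = [
        ("P",   (0.0, 0.0, 0.0)),
        ("C1'", (0.4, 0.1, 0.0)),
        ("N3",  (0.9, 0.5, 0.2)),
    ]

    def closest_atom(atoms, hit):
        """Return the atom record nearest to an energy-deposition position."""
        return min(atoms, key=lambda atom: math.dist(atom[1], hit))

    energy_deposition = (0.85, 0.45, 0.1)   # invented Geant4 step position
    name, _ = closest_atom(atoms, energy_deposition)
    print(name)  # → N3
    ```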

  19. A new approach to PLC software design.

    PubMed

    Kandare, Gregor; Godena, Giovanni; Strmcnik, Stanko

    2003-04-01

    This paper presents a model-based approach to PLC software development. The essence of this approach is the introduction of a new procedural modeling language called ProcGraph. In contrast to commonly used methods, ProcGraph deals with the procedural aspect of the control system and allows software specification at a higher level of abstraction. The modeling language has been supported with the development of a software tool which facilitates graphical model design and automatic code generation. The specification notation has been tested in the development of software for industrial applications. The supporting tool has been tested in a laboratory environment.

  20. Geant4 simulations on medical Linac operation at 18 MV: Experimental validation based on activation foils

    NASA Astrophysics Data System (ADS)

    Vagena, E.; Stoulos, S.; Manolopoulou, M.

    2016-03-01

    The operation of a medical linear accelerator was simulated with the Geant4 code in order to study the characteristics of an 18 MV photon beam. Simulations showed that (a) the photon spectrum at the isocenter is not influenced by changes in the primary electron beam's energy distribution and spatial spread; (b) 98% of the photon energy fluence scored at the isocenter is primary photons that have only interacted with the target; (c) the number of contaminant electrons is not negligible, since it fluctuated around 5×10^-5 per primary electron, or 2.40×10^-3 per photon at the isocenter; (d) the number of neutrons created by (γ, n) reactions is 3.13×10^-6 per primary electron, or 1.50×10^-3 per photon at the isocenter; (e) a flattening-filter-free beam needs fewer primary electrons to deliver the same photon fluence at the isocenter than operation with the flattening filter; (f) there is no significant increase of the surface dose due to contaminant electrons when the flattening filter is removed; (g) comparing the neutron fluences per incident electron for the flattened and unflattened beams, the neutron fluence is 7% higher for the unflattened beam. To validate the simulation results, the total neutron and photon fluences at the isocenter were measured using nickel, indium, and natural uranium activation foils. For the photon fluences, the difference between simulations and measurements was 1.26% for the uranium foil and 2.45% for the indium foil, while for neutrons the discrepancy is higher, up to 8.0%. The photon and neutron fluences of the simulated experiments fall within ±1 and ±2 sigma, respectively, of the ones obtained experimentally.

  1. Simulation of a 6 MV Elekta Precise Linac photon beam using GATE/GEANT4.

    PubMed

    Grevillot, L; Frisson, T; Maneval, D; Zahra, N; Badel, J-N; Sarrut, D

    2011-02-21

    The GEANT4-based GATE Monte Carlo (MC) platform was initially focused on PET and SPECT simulations. The new release v6.0 (February 2010) proposes new tools dedicated to radiation therapy simulations. In this work, we investigated part of this extension and propose a general methodology for Linac simulations. Details of the modeling of a 6 MV photon beam delivered by an Elekta Precise Linac, with radiation fields ranging from 5 × 5 to 30 × 30 cm^2 at the isocenter, are presented. Comparisons were performed with measurements in water. The simulations were performed in two stages: first, the patient-independent part was simulated and a phase space (PhS) was built above the secondary collimator. Then, a multiple source model (MSM) derived from the PhS was proposed to simulate the photon fluence interacting with the patient-dependent part. The selective bremsstrahlung splitting (SBS) variance reduction technique proposed in GATE was used in order to speed up the accelerator head simulation. Further investigations showed that the SBS can be safely used without biasing the simulations. Additional comparisons with full simulations performed on the EGEE grid, in a single stage from the electron source to the water phantom, allowed the evaluation of the MSM. The proposed MSM allowed depth-dose and transverse profiles to be calculated in 48 hours on a single 2.8 GHz CPU, with a statistical uncertainty of 0.8% for a 10 × 10 cm^2 radiation field, using voxels of 5 × 5 × 5 mm^3. Good agreement between simulations and measurements in water was observed, with dose differences of about 1% and 2% for depth doses and dose profiles, respectively. Additional gamma index comparisons were performed; more than 90% of the points for all simulations passed the 3%/3 mm gamma criterion. To our knowledge, this feasibility study is the first to illustrate the potential of GATE for external radiotherapy applications.
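
    The 3%/3 mm gamma-index pass rate used above can be sketched for one-dimensional profiles. This is a minimal global-gamma implementation in the spirit of Low et al., not GATE's code, and the two dose profiles are invented, differing by a 1% scaling:

    ```python
    import math

    def gamma_pass_rate(positions_mm, ref_dose, eval_dose, dose_tol=0.03, dist_mm=3.0):
        """Fraction of reference points with global gamma <= 1 (3%/3 mm criterion).
        Dose differences are normalised to dose_tol times the max reference dose."""
        d_norm = dose_tol * max(ref_dose)
        passed = 0
        for x_r, d_r in zip(positions_mm, ref_dose):
            gamma = min(
                math.hypot((d_e - d_r) / d_norm, (x_e - x_r) / dist_mm)
                for x_e, d_e in zip(positions_mm, eval_dose)
            )
            passed += gamma <= 1.0
        return passed / len(ref_dose)

    # Invented measured vs. simulated depth-dose profiles (mm, arbitrary dose units).
    positions = [float(i) for i in range(0, 101, 2)]
    measured = [100.0 * math.exp(-((x - 30.0) / 25.0) ** 2) for x in positions]
    simulated = [1.01 * d for d in measured]
    print(gamma_pass_rate(positions, measured, simulated))  # → 1.0
    ```

    A 1% global dose offset stays well inside the 3% tolerance, so every point passes; real profile comparisons fail points near steep gradients and field edges first.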

  3. BC404 scintillators as gamma locators studied via Geant4 simulations

    NASA Astrophysics Data System (ADS)

    Cortés, M. L.; Hoischen, R.; Eisenhauer, K.; Gerl, J.; Pietralla, N.

    2014-05-01

    In many applications in industry and academia, an accurate determination of the direction from which gamma rays are emitted is either needed or desirable. Ion-beam therapy treatments, the search for orphan sources, and homeland security applications are examples of fields that can benefit from directional sensitivity to gamma radiation. Scintillation detectors are a good option for these types of applications as they have relatively low cost, are easy to handle and can be produced in a large range of sizes. In this work a Geant4 simulation was developed to study the directional sensitivity of different BC404 scintillator geometries and arrangements. The simulation includes all the physical processes relevant for gamma detection in a scintillator; in particular, the creation and propagation of optical photons inside the scintillator was included. A simplified photomultiplier tube model was also simulated. The physical principle exploited is the angular dependence of the shape of the energy spectrum obtained from thin scintillator layers when irradiated from different angles. After an experimental confirmation of the working principle of the device and a check of the simulation, the possibilities and limitations of directional sensitivity to gamma radiation using scintillator layers were tested. For this purpose, point-like sources of typical energies expected in ion-beam therapy were used. Optimal scintillator thicknesses for different energies were determined and the setup efficiencies calculated. The use of arrays of scintillators to reconstruct the direction of incoming gamma rays was also studied. For this case, a spherical source emitting bremsstrahlung radiation was used together with a setup consisting of scintillator layers. The capability of this setup to identify the center of the extended source was studied together with its angular resolution.
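
    The geometric core of the angular dependence is that a gamma entering a thin slab at angle θ from the normal traverses a chord of length t/cos θ, so oblique photons are more likely to interact and deposit more energy, shifting the spectrum shape. A deterministic sketch (the attenuation coefficient is an assumed round number, not a BC404 datum):

    ```python
    import math

    def interaction_prob(theta_deg, thickness_cm=0.5, mu_cm=0.2):
        """Probability that a gamma interacts in a thin slab entered at angle
        theta from the normal: the path length grows as t / cos(theta), which
        is the spectral-shape effect thin scintillator layers exploit."""
        path = thickness_cm / math.cos(math.radians(theta_deg))
        return 1.0 - math.exp(-mu_cm * path)

    p_normal = interaction_prob(0)    # normal incidence
    p_oblique = interaction_prob(60)  # 60 degrees: chord twice as long
    print(p_oblique > p_normal)  # → True
    ```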

  4. Validating Geant4 Versions 7.1 and 8.3 Against 6.1 for BaBar

    SciTech Connect

    Banerjee, Swagato; Brown, David N.; Chen, Chunhui; Cote, David; Dubois-Felsmann, Gregory P.; Gaponenko, Igor; Kim, Peter C.; Lockman, William S.; Neal, Homer A.; Simi, Gabriele; Telnov, Alexandre V.; Wright, Dennis H.; /SLAC

    2011-11-08

    Since 2005 and 2006, respectively, Geant4 versions 7.1 and 8.3 have been available, providing: improvements in the modeling of multiple scattering; corrections to muon ionization and an improved MIP signature; widening of the core of electromagnetic shower shape profiles; a newer implementation of elastic scattering for hadronic processes; a detailed implementation of the Bertini cascade model for kaons and lambdas; and updated hadronic cross sections from calorimeter beam tests. The effects of these changes are studied in terms of the closer agreement of simulations using Geant4 versions 7.1 and 8.3, as compared to Geant4 version 6.1, with data distributions of: the hit residuals of tracks in the BABAR silicon vertex tracker; the photon and K_L^0 shower shapes in the electromagnetic calorimeter; the ratio of energy deposited in the electromagnetic calorimeter and the flux return of the magnet, instrumented with a muon detection system composed of resistive plate chambers and limited-streamer tubes; and the muon identification efficiency in the muon detector system of the BABAR detector.

  5. Optimization of a photoneutron source based on 10 MeV electron beam using Geant4 Monte Carlo code

    NASA Astrophysics Data System (ADS)

    Askri, Boubaker

    2015-10-01

    The Geant4 Monte Carlo code has been used to design and optimize a simple and compact neutron source based on a 10 MeV electron beam impinging on a tungsten target adjoined to a beryllium target. For this purpose, a precise photonuclear reaction cross-section model issued from the International Atomic Energy Agency (IAEA) database was linked to Geant4 to accurately simulate the interaction of low-energy bremsstrahlung photons with the beryllium material. A benchmark test showed good agreement between the emitted neutron flux spectra predicted by the Geant4 and Fluka codes for a beryllium cylinder bombarded with a 5 MeV photon beam. The source optimization was achieved through a two-stage Monte Carlo simulation. In the first stage, the distributions of the seven phase-space coordinates of the bremsstrahlung photons at the boundaries of the tungsten target were determined. In the second stage, events corresponding to photons emitted according to these distributions were tracked. A neutron yield of 4.8 × 10^10 neutrons/mA/s was obtained at 20 cm from the beryllium target. A thermal neutron yield of 1.5 × 10^9 neutrons/mA/s was obtained after introducing a spherical shell of polyethylene as a neutron moderator.
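
    The second stage of such a two-stage simulation re-samples particles from the distributions recorded in the first stage. One common way to do that is inverse-CDF sampling from a recorded histogram, sketched below; the bin edges and counts are invented, not the paper's bremsstrahlung spectrum.

    ```python
    import bisect
    import random

    def make_sampler(edges, counts):
        """Return a sampler for a recorded 1D histogram: choose a bin with
        probability proportional to its count (inverse CDF over the running
        bin totals), then draw uniformly within the chosen bin."""
        cum, total = [], 0.0
        for c in counts:
            total += c
            cum.append(total)
        def sample(rng=random):
            u = rng.uniform(0.0, total)
            i = bisect.bisect_left(cum, u)
            return rng.uniform(edges[i], edges[i + 1])
        return sample

    # Invented bremsstrahlung-energy histogram (MeV bin edges), peaked at low energy.
    edges = [0.0, 2.0, 4.0, 6.0, 8.0, 10.0]
    counts = [50, 25, 12, 8, 5]
    random.seed(1)
    draw = make_sampler(edges, counts)
    energies = [draw() for _ in range(1000)]
    low = sum(e < 2.0 for e in energies)
    print(low > 400)  # half the histogram weight sits in the first bin
    ```

    A full phase-space re-sampler must of course preserve the correlations among all seven coordinates, not sample each marginal independently as this 1D sketch does.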

  6. Validation of the Geant4 Monte Carlo package for X-ray fluorescence spectroscopy in triaxial geometry

    NASA Astrophysics Data System (ADS)

    Amaro, Pedro; Santos, José Paulo; Samouco, Ana; Adão, Ricardo; Martins, Luís Souto; Weber, Sebastian; Tashenov, Stanislav; Carvalho, Maria Luisa; Pessanha, Sofia

    2017-04-01

    In this study, we investigated the potential of the Geant4 Monte Carlo simulation package for retrieving accurate elemental concentrations from energy-dispersive X-ray fluorescence spectra. For this purpose, we implemented a Geant4 code that simulates an energy-dispersive X-ray fluorescence spectrometer in a triaxial geometry. In parallel, we also performed measurements in a spectrometer with the same geometry for validation of the present code. This spectrometer allows low limits of detection and permits an effective comparison of elemental concentrations down to tens of parts per million. Several standard reference materials of light, medium and heavy matrices were employed in order to attest the validity of the simulations for several values of averaged atomic number. We observed good agreement, to better than 25%, for most fluorescence lines of interest, and for all materials. Discrepancies were observed at the multiple Compton scattering tail. We thus conclude from this experimental and theoretical study that the present Geant4 code can be incorporated in a quantitative method for the determination of trace elements in a triaxial-type spectrometer.

  7. GENII Version 2 Software Design Document

    SciTech Connect

    Napier, Bruce A.; Strenge, Dennis L.; Ramsdell, James V.; Eslinger, Paul W.; Fosmire, Christian J.

    2004-03-08

    This document describes the architectural design for the GENII-V2 software package. This document defines details of the overall structure of the software, the major software components, their data file interfaces, and specific mathematical models to be used. The design represents a translation of the requirements into a description of the software structure, software components, interfaces, and necessary data. The design focuses on the major components and data communication links that are key to the implementation of the software within the operating framework. The purpose of the GENII-V2 software package is to provide the capability to perform dose and risk assessments of environmental releases of radionuclides. The software also has the capability of calculating environmental accumulation and radiation doses from surface water, groundwater, and soil (buried waste) media when an input concentration of radionuclide in these media is provided. This report represents a detailed description of the capabilities of the software product with exact specifications of mathematical models that form the basis for the software implementation and testing efforts. This report also presents a detailed description of the overall structure of the software package, details of main components (implemented in the current phase of work), details of data communication files, and content of basic output reports. The GENII system includes the capabilities for calculating radiation doses following chronic and acute releases. Radionuclide transport via air, water, or biological activity may be considered. Air transport options include both puff and plume models, each allowing use of an effective stack height or calculation of plume rise from buoyant or momentum effects (or both). Building wake effects can be included in acute atmospheric release scenarios. The code provides risk estimates for health effects to individuals or populations; these can be obtained using the code by applying
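
    The plume-model family mentioned above is typified by the standard Gaussian plume equation for a continuous point release. A minimal ground-level sketch follows; the source strength, wind speed, dispersion coefficients and stack height are invented, and GENII's actual atmospheric models include many additional effects (plume rise, building wake, depletion).

    ```python
    import math

    def gaussian_plume_ground(q_bq_s, u_m_s, sigma_y, sigma_z, y_m, stack_h_m):
        """Ground-level (z = 0) air concentration (Bq/m^3) from a continuous
        point release: standard Gaussian plume with total ground reflection,
        C = Q / (2 pi u sy sz) * exp(-y^2/2sy^2) * 2 exp(-H^2/2sz^2)."""
        lateral = math.exp(-y_m ** 2 / (2.0 * sigma_y ** 2))
        vertical = 2.0 * math.exp(-stack_h_m ** 2 / (2.0 * sigma_z ** 2))
        return q_bq_s / (2.0 * math.pi * u_m_s * sigma_y * sigma_z) * lateral * vertical

    # Invented release: 1e6 Bq/s, 5 m/s wind, sigmas at some downwind distance, 30 m stack.
    c_centerline = gaussian_plume_ground(1.0e6, 5.0, 80.0, 40.0, 0.0, 30.0)
    c_offaxis = gaussian_plume_ground(1.0e6, 5.0, 80.0, 40.0, 100.0, 30.0)
    print(c_centerline > c_offaxis)  # concentration falls off the plume centerline
    ```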

  8. Software design and documentation language, revision 1

    NASA Technical Reports Server (NTRS)

    Kleine, H.

    1979-01-01

    The Software Design and Documentation Language (SDDL), developed to provide an effective communications medium to support the design and documentation of complex software applications, is described. Features of the system include: (1) a processor which can convert design specifications into an intelligible, informative, machine-reproducible document; (2) a design and documentation language with forms and syntax that are simple, unrestrictive, and communicative; and (3) a methodology for effective use of the language and processor. The SDDL processor is written in the SIMSCRIPT II programming language and is implemented on the UNIVAC 1108, the IBM 360/370, and Control Data machines.

  9. Space Software for Automotive Design

    NASA Technical Reports Server (NTRS)

    1988-01-01

    John Thousand of Wolverine Western Corp. put his aerospace group to work on an unfamiliar job: designing a brake drum using computer design techniques. Computer design involves creating a mathematical model of a product and analyzing its effectiveness in simulated operation. The technique enables study of the performance and structural behavior of a number of different designs before settling on a final configuration. Wolverine employees attacked a traditional brake drum problem, the sudden buildup of heat during fast and repeated braking. The part of the brake drum that is not confined tends to change its shape under a combination of heat, physical pressure and rotational forces, a condition known as bellmouthing. Since bellmouthing is a major factor in braking effectiveness, a solution to the problem would be a major advance in automotive engineering. A former NASA employee, by then at Wolverine, knew of a series of NASA computer programs ideally suited to confronting bellmouthing. Originally developed as aids to rocket engine nozzle design, the programs are capable of analyzing problems generated in a rocket engine nozzle or an automotive brake drum by heat, expansion, pressure and rotational forces. Use of these computer programs led to a new brake drum concept featuring a more durable axle and heat-transfer ribs, or fins, on the hub of the drum.

  10. SDDL- SOFTWARE DESIGN AND DOCUMENTATION LANGUAGE

    NASA Technical Reports Server (NTRS)

    Kleine, H.

    1994-01-01

    Effective, efficient communication is an essential element of the software development process. The Software Design and Documentation Language (SDDL) provides an effective communication medium to support the design and documentation of complex software applications. SDDL supports communication between all the members of a software design team and provides for the production of informative documentation on the design effort. Even when an entire development task is performed by a single individual, it is important to explicitly express and document communication between the various aspects of the design effort including concept development, program specification, program development, and program maintenance. SDDL ensures that accurate documentation will be available throughout the entire software life cycle. SDDL offers an extremely valuable capability for the design and documentation of complex programming efforts ranging from scientific and engineering applications to data management and business systems. Throughout the development of a software design, the SDDL-generated Software Design Document always represents the definitive word on the current status of the ongoing, dynamic design development process. The document is easily updated and readily accessible in a familiar, informative form to all members of the development team. This makes the Software Design Document an effective instrument for reconciling misunderstandings and disagreements in the development of design specifications, engineering support concepts, and the software design itself. Using the SDDL-generated document to analyze the design makes it possible to eliminate many errors that might not be detected until coding and testing is attempted. As a project management aid, the Software Design Document is useful for monitoring progress and for recording task responsibilities. SDDL is a combination of language, processor, and methodology. The SDDL syntax consists of keywords to invoke design structures…
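
    The keyword-driven approach can be illustrated with a toy processor. The keyword set and output formatting below are hypothetical (real SDDL keywords differ), but the sketch shows how matched structure keywords let a processor both indent a design listing and flag structural errors before any coding begins.

    ```python
    # Hypothetical keyword pairs standing in for SDDL design structures.
    OPENERS = {"PROGRAM", "IF", "LOOP"}
    CLOSERS = {"ENDPROGRAM": "PROGRAM", "ENDIF": "IF", "ENDLOOP": "LOOP"}

    def format_design(lines, indent="   "):
        """Indent a flat design specification according to its structure
        keywords, raising on mismatched closers -- the kind of error a
        design-language processor can catch early."""
        out, stack = [], []
        for line in lines:
            word = line.split()[0] if line.split() else ""
            if word in CLOSERS:
                if not stack or stack[-1] != CLOSERS[word]:
                    raise ValueError(f"mismatched {word}")
                stack.pop()
            out.append(indent * len(stack) + line)
            if word in OPENERS:
                stack.append(word)
        return out

    spec = [
        "PROGRAM update_inventory",
        "LOOP for each transaction",
        "IF quantity on hand is low",
        "issue reorder request",
        "ENDIF",
        "ENDLOOP",
        "ENDPROGRAM",
    ]
    print("\n".join(format_design(spec)))
    ```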

  11. Computer-aided software development process design

    NASA Technical Reports Server (NTRS)

    Lin, Chi Y.; Levary, Reuven R.

    1989-01-01

    The authors describe an intelligent tool designed to aid managers of software development projects in planning, managing, and controlling the development process of medium- to large-scale software projects. Its purpose is to reduce uncertainties in the budget, personnel, and schedule planning of software development projects. It is based on a dynamic model of the software development and maintenance life-cycle process. This dynamic process is composed of a number of time-varying, interacting developmental phases, each characterized by its intended functions and requirements. System dynamics is used as the modeling methodology. The resulting Software LIfe-Cycle Simulator (SLICS) and the hybrid expert simulation system of which it is a subsystem are described.

  12. Design of software engineering teaching website

    NASA Astrophysics Data System (ADS)

    Li, Yuxiang; Liu, Xin; Zhang, Guangbin; Liu, Xingshun; Gao, Zhenbo

    "Software engineering" is different from general professional courses: it arose in response to the software crisis and the development of the software industry, and it is a theoretical course and, especially, a practical one. However, due to the characteristics of the software engineering curriculum, in day-to-day teaching students may find the theory boring, show low interest in learning, and obtain poor test results, among other problems. The ASP.NET design technique is adopted and an Access 2007 database is used to design and implement a "Software Engineering" teaching website. The system's features include theoretical teaching, case teaching, practical teaching, teaching interaction, a database, a test item bank, announcements, etc., which can enhance the vitality, interest and interactivity of learning.

  13. Early-Stage Software Design for Usability

    ERIC Educational Resources Information Center

    Golden, Elspeth

    2010-01-01

    In spite of the goodwill and best efforts of software engineers and usability professionals, systems continue to be built and released with glaring usability flaws that are costly and difficult to fix after the system has been built. Although user interface (UI) designers, be they usability or design experts, communicate usability requirements to…

  14. Software Design Improvements. Part 2; Software Quality and the Design and Inspection Process

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R.; Packard, Michael H.; Ziemianski, Tom

    1997-01-01

The application of assurance engineering techniques improves the duration of failure-free performance of software. The totality of features and characteristics of a software product determines its ability to satisfy customer needs. Software in safety-critical systems is very important to NASA. We follow the System Safety Working Group's definition of system safety software: 'The optimization of system safety in the design, development, use and maintenance of software and its integration with safety-critical systems in an operational environment.' 'If it is not safe, say so' has become our motto. This paper surveys methods that have been used by NASA to make software design improvements by focusing on software quality and the design and inspection process.

  15. An Analysis of Software Design Methodologies

    DTIC Science & Technology

    1979-08-01

Technical Report 401: "An Analysis of Software Design Methodologies", by H. Rudy Ramsey, Michael E. Atwood, and Gary D. Campbell (Science Applications, Incorporated); submitted by Edgar M. Johnson, Chief, Human Factors. ... expressed by members of the Integrated Software Research and Development Working Group (ISRAD). The authors are indebted to Martha Cichelli, Margaret

  16. Monte Carlo simulation and scatter correction of the GE Advance PET scanner with SimSET and Geant4

    NASA Astrophysics Data System (ADS)

    Barret, Olivier; Carpenter, T. Adrian; Clark, John C.; Ansorge, Richard E.; Fryer, Tim D.

    2005-10-01

    For Monte Carlo simulations to be used as an alternative solution to perform scatter correction, accurate modelling of the scanner as well as speed is paramount. General-purpose Monte Carlo packages (Geant4, EGS, MCNP) allow a detailed description of the scanner but are not efficient at simulating voxel-based geometries (patient images). On the other hand, dedicated codes (SimSET, PETSIM) will perform well for voxel-based objects but will be poor in their capacity of simulating complex geometries such as a PET scanner. The approach adopted in this work was to couple a dedicated code (SimSET) with a general-purpose package (Geant4) to have the efficiency of the former and the capabilities of the latter. The combined SimSET+Geant4 code (SimG4) was assessed on the GE Advance PET scanner and compared to the use of SimSET only. A better description of the resolution and sensitivity of the scanner and of the scatter fraction was obtained with SimG4. The accuracy of scatter correction performed with SimG4 and SimSET was also assessed from data acquired with the 20 cm NEMA phantom. SimG4 was found to outperform SimSET and to give slightly better results than the GE scatter correction methods installed on the Advance scanner (curve fitting and scatter modelling for the 300-650 keV and 375-650 keV energy windows, respectively). In the presence of a hot source close to the edge of the field of view (as found in oxygen scans), the GE curve-fitting method was found to fail whereas SimG4 maintained its performance.
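As a rough illustration of one of the quantities compared above, the scatter fraction can be estimated directly from simulated coincidences if each event is tagged as scattered or unscattered. A minimal Python sketch under that assumption (the event structure and numbers are invented, not SimG4 output):

```python
import random

def scatter_fraction(events):
    """Fraction of accepted coincidences involving at least one scattered photon."""
    scattered = sum(1 for e in events if e["scattered"])
    return scattered / len(events)

# Toy event list: each coincidence records whether either annihilation
# photon Compton-scattered in the phantom before detection.
random.seed(1)
events = [{"scattered": random.random() < 0.35} for _ in range(100_000)]
print(f"scatter fraction = {scatter_fraction(events):.3f}")
```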

  17. SU-E-T-427: Feasibility Study for Evaluation of IMRT Dose Distribution Using Geant4-Based Automated Algorithms

    SciTech Connect

    Choi, H; Shin, W; Testa, M; Min, C; Kim, J

    2015-06-15

Purpose: For intensity-modulated radiation therapy (IMRT) treatment planning validation using Monte Carlo (MC) simulations, a precise and automated procedure is necessary to evaluate the patient dose distribution. The aim of this study is to develop an automated algorithm for IMRT simulations using DICOM files and to evaluate the patient dose based on 4D simulation using the Geant4 MC toolkit. Methods: The head of a clinical linac (Varian Clinac 2300 IX) was modeled in Geant4 along with particular components such as the flattening filter and the multi-leaf collimator (MLC). Patient information and the position of the MLC were imported from the DICOM-RT interface. For each position of the MLC, a step-and-shoot technique was adopted. PDDs and lateral profiles were simulated in a water phantom (50×50×40 cm³) and compared to measurement data. We used a lung phantom, and MC dose calculations were compared to the clinical treatment plan used at the Seoul National University Hospital. Results: In order to reproduce the measurement data, we tuned three free parameters: the mean and standard deviation of the primary electron beam energy and the beam spot size. For 6 MV these parameters were found to be 5.6 MeV, 0.2378 MeV, and 1 mm FWHM, respectively. The average dose difference between measurements and simulations was less than 2% for PDDs and radial profiles. The lung phantom study showed fairly good agreement between the MC and planning doses despite some unavoidable statistical fluctuation. Conclusion: The current feasibility study using the lung phantom shows the potential for IMRT dose validation using 4D MC simulations with the Geant4 toolkit. This research was supported by the Korea Institute of Nuclear Safety and Development of Measurement Standards for Medical Radiation funded by the Korea Research Institute of Standards and Science. (KRISS-2015-15011032)
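The quoted agreement between measurement and simulation (average dose difference below 2% for PDDs) amounts to averaging the point-by-point relative differences. A hedged sketch of that comparison, with hypothetical normalized depth-dose samples:

```python
def mean_percent_difference(measured, simulated):
    """Average |sim - meas| / meas, in percent, over paired depth-dose samples."""
    assert len(measured) == len(simulated)
    diffs = [abs(s - m) / m * 100.0 for m, s in zip(measured, simulated)]
    return sum(diffs) / len(diffs)

# Hypothetical normalized PDD samples (relative dose vs depth).
measured  = [100.0, 98.5, 95.2, 90.1, 84.7]
simulated = [100.0, 98.1, 95.9, 89.5, 85.2]
print(f"mean difference = {mean_percent_difference(measured, simulated):.2f}%")
```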

  19. Studying the response of a plastic scintillator to gamma rays using the Geant4 Monte Carlo code.

    PubMed

    Ghadiri, Rasoul; Khorsandi, Jamshid

    2015-05-01

To determine the gamma-ray response function of an NE-102 scintillator and to investigate the gamma spectra resulting from the transport of optical photons, we simulated an NE-102 scintillator using the Geant4 code. The results of the simulation were compared with experimental data, and good consistency between the two was observed. In addition, the time and spatial distributions, along with the energy distribution and surface treatments of scintillation detectors, were calculated. This simulation enables us to optimize the position of the photomultiplier tube (or photodiodes) to yield the best coupling to the detector.

20. Intercomparison of Monte Carlo Radiation Transport Codes MCNPX, GEANT4, and FLUKA for Simulating Proton Radiotherapy of the Eye

    PubMed Central

    Randeniya, S. D.; Taddei, P. J.; Newhauser, W. D.; Yepes, P.

    2010-01-01

Monte Carlo simulations of an ocular treatment beam-line consisting of a nozzle and a water phantom were carried out using MCNPX, GEANT4, and FLUKA to compare the dosimetric accuracy and the simulation efficiency of the codes. Simulated central-axis percent depth-dose profiles and cross-field dose profiles were compared with experimentally measured data. Simulation speed was evaluated by comparing the number of proton histories simulated per second using each code. The results indicate that all three Monte Carlo transport codes calculate sufficiently accurate proton dose distributions in the eye and that the FLUKA transport code has the highest simulation efficiency. PMID:20865141

  1. Creation of a Geant4 Muon Tomography Package for Imaging of Nuclear Fuel in Dry Cask Storage

    SciTech Connect

    Tsoukalas, Lefteri H.

    2016-03-01

    This is the final report of the NEUP project “Creation of a Geant4 Muon Tomography Package for Imaging of Nuclear Fuel in Dry Cask Storage”, DE-NE0000695. The project started on December 1, 2013 and this report covers the period December 1, 2013 through November 30, 2015. The project was successfully completed and this report provides an overview of the main achievements, results and findings throughout the duration of the project. Additional details can be found in the main body of this report and on the individual Quarterly Reports and associated Deliverables of the project, uploaded in PICS-NE.

  2. SU-E-T-347: Validation of the Condensed History Algorithm of Geant4 Using the Fano Test

    SciTech Connect

    Lee, H; Mathis, M; Sawakuchi, G

    2014-06-01

Purpose: To validate the condensed history algorithm and physics of the Geant4 Monte Carlo toolkit for simulations of ionization chambers (ICs). This study is the first step to validate Geant4 for calculations of photon beam quality correction factors under the presence of a strong magnetic field for magnetic resonance guided linac system applications. Methods: The electron transport and boundary crossing algorithms of Geant4 version 9.6.p02 were tested under Fano conditions using the Geant4 example/application FanoCavity. User-defined parameters of the condensed history and multiple scattering algorithms were investigated under Fano test conditions for three scattering models (physics lists): G4UrbanMscModel95 (PhysListEmStandard-option3), G4GoudsmitSaundersonMsc (PhysListEmStandard-GS), and G4WentzelVIModel/G4CoulombScattering (PhysListEmStandard-WVI). Simulations were conducted using monoenergetic photon beams, ranging from 0.5 to 7 MeV and emphasizing energies from 0.8 to 3 MeV. Results: The GS and WVI physics lists provided consistent Fano test results (within ±0.5%) for maximum step sizes under 0.01 mm at 1.25 MeV, with improved performance at 3 MeV (within ±0.25%). The option3 physics list provided consistent Fano test results (within ±0.5%) for maximum step sizes above 1 mm. Optimal parameters for the option3 physics list were a 10 km maximum step size with default values for the other user-defined parameters: 0.2 dRoverRange, 0.01 mm final range, 0.04 range factor, 2.5 geometrical factor, and 1 skin. Simulations using the option3 physics list were ~70–100 times faster compared to GS and WVI under optimal parameters. Conclusion: This work indicated that the option3 physics list passes the Fano test within ±0.5% when using a maximum step size of 10 km for energies suitable for IC calculations in a 6 MV spectrum, without extensive computational times. Optimal user-defined parameters using the option3 physics list will be used in future IC simulations.
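The pass criterion used above can be stated simply: under Fano conditions the simulated cavity dose divided by the theoretical expectation should equal unity within the quoted tolerance. A small sketch of such a check (the ratios below are made up for illustration):

```python
def fano_passes(dose_ratio, tolerance=0.005):
    """Under Fano conditions the simulated cavity dose divided by the
    theoretical value should be unity; pass if within +/- tolerance."""
    return abs(dose_ratio - 1.0) <= tolerance

# Hypothetical ratios for three step-size settings of a scattering model.
for step_mm, ratio in [(0.01, 1.0021), (1.0, 0.9987), (100.0, 0.9893)]:
    print(f"max step {step_mm} mm: ratio {ratio:.4f} -> "
          f"{'pass' if fano_passes(ratio) else 'fail'}")
```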

  3. The MONSOON Generic Pixel Server software design

    NASA Astrophysics Data System (ADS)

    Buchholz, Nick C.; Daly, Philip N.

    2004-09-01

MONSOON is the next generation OUV-IR controller development project being conducted at NOAO. MONSOON was designed from the start as an "architecture" that provides the flexibility to handle multiple detector types, rather than as a set of specific hardware to control a particular detector. The hardware design was done with maintainability and scalability as key factors. We have, wherever possible, chosen commercial off-the-shelf components rather than in-house or proprietary systems. From first principles, the software design had to be configurable in order to handle many detector types and focal plane configurations. The MONSOON software is multi-layered, with simulation of the hardware built in. By keeping the details of hardware interfaces confined to only two libraries, and by strict conformance to a set of interface control documents, the MONSOON software is usable with other hardware systems with minimal change. In addition, the design provides that focal-plane-specific details are confined to routines that are selected at load time. At the top level, the MONSOON Supervisor Level (MSL), we use the GPX dictionary, a defined interface to the software system that instruments and high-level software can use to control and query the system. Below this are PAN-DHE pairs that interface directly with portions of the focal plane. The number of PAN-DHE pairs can be scaled up to increase channel counts and processing speed or to handle larger focal planes. The range of detector applications supported runs from single-detector lab systems and four-detector IR systems like NEWFIRM up to 500-CCD focal planes like LSST. In this paper we discuss the design of the PAN software and its interaction with the detector head electronics.

  4. Preliminary design of the redundant software experiment

    NASA Technical Reports Server (NTRS)

    Campbell, Roy; Deimel, Lionel; Eckhardt, Dave, Jr.; Kelly, John; Knight, John; Lauterbach, Linda; Lee, Larry; Mcallister, Dave; Mchugh, John

    1985-01-01

The goal of the present experiment is to characterize the fault distributions of highly reliable software replicates, constructed using techniques and environments similar to those used in contemporary industrial software facilities. The fault distributions and their effect on the reliability of fault-tolerant configurations of the software will be determined through extensive life testing of the replicates against carefully constructed, randomly generated test data. Each detected error will be carefully analyzed to provide insight into its nature and cause. A direct objective is to develop techniques for reducing the intensity of coincident errors, thus increasing the reliability gain which can be achieved with fault tolerance. Data on the reliability gains realized and the cost of the fault-tolerant configurations can be used to design a companion experiment to determine the cost-effectiveness of the fault-tolerant strategy. Finally, the data and analysis produced by this experiment will be valuable to the software engineering community as a whole because they will provide useful insight into the nature and cause of hard-to-find, subtle faults which escape standard software engineering validation techniques and thus persist far into the software life cycle.

  5. User Interface Design for Dynamic Geometry Software

    ERIC Educational Resources Information Center

    Kortenkamp, Ulrich; Dohrmann, Christian

    2010-01-01

    In this article we describe long-standing user interface issues with Dynamic Geometry Software and common approaches to address them. We describe first prototypes of multi-touch-capable DGS. We also give some hints on the educational benefits of proper user interface design.

  6. Teacher-Driven Design of Educational Software.

    ERIC Educational Resources Information Center

    Carlson, Patricia A.

This paper reflects on the author's participation in two government-sponsored educational software development projects that used a holistic design paradigm in which classroom formative assessment and teacher input played a critical role in the development process. The two projects were: R-WISE (Reading and Writing in a Supportive Environment)--a…

  7. Simulation and Digitization of a Gas Electron Multiplier Detector Using Geant4 and an Object-Oriented Digitization Program

    NASA Astrophysics Data System (ADS)

    McMullen, Timothy; Liyanage, Nilanga; Xiong, Weizhi; Zhao, Zhiwen

    2017-01-01

    Our research has focused on simulating the response of a Gas Electron Multiplier (GEM) detector using computational methods. GEM detectors provide a cost effective solution for radiation detection in high rate environments. A detailed simulation of GEM detector response to radiation is essential for the successful adaption of these detectors to different applications. Using Geant4 Monte Carlo (GEMC), a wrapper around Geant4 which has been successfully used to simulate the Solenoidal Large Intensity Device (SoLID) at Jefferson Lab, we are developing a simulation of a GEM chamber similar to the detectors currently used in our lab. We are also refining an object-oriented digitization program, which translates energy deposition information from GEMC into electronic readout which resembles the readout from our physical detectors. We have run the simulation with beta particles produced by the simulated decay of a 90Sr source, as well as with a simulated bremsstrahlung spectrum. Comparing the simulation data with real GEM data taken under similar conditions is used to refine the simulation parameters. Comparisons between results from the simulations and results from detector tests will be presented.
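A digitization step of this kind typically maps an energy deposit to primary ionization electrons, applies the avalanche gain, and converts the amplified charge to ADC counts with pedestal and noise. The sketch below follows that general chain; all parameter values are illustrative placeholders, not calibrations of the detectors described above:

```python
import random

def digitize(edep_keV, w_eV=26.0, gain=8000.0, adc_per_electron=0.002,
             pedestal=50, noise_adc=2.0, rng=random):
    """Toy digitization: energy deposit -> primary ionization electrons ->
    avalanche gain -> ADC counts with pedestal and Gaussian noise.
    All parameter values here are invented for illustration."""
    n_primary = int(edep_keV * 1000.0 / w_eV)   # ionization electrons in the gas
    n_avalanche = n_primary * gain              # after GEM amplification
    signal = n_avalanche * adc_per_electron     # charge-to-ADC conversion
    return max(0, int(round(pedestal + signal + rng.gauss(0.0, noise_adc))))

rng = random.Random(42)
print(digitize(2.5, rng=rng))   # a 2.5 keV deposit, e.g. from a beta particle
```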

  8. Validation of a Geant4 model of the X-ray fluorescence microprobe at the Australian Synchrotron.

    PubMed

    Dimmock, Matthew Richard; de Jonge, Martin Daly; Howard, Daryl Lloyd; James, Simon Alexander; Kirkham, Robin; Paganin, David Maurice; Paterson, David John; Ruben, Gary; Ryan, Chris Gregory; Brown, Jeremy Michael Cooney

    2015-03-01

A Geant4 Monte Carlo simulation of the X-ray fluorescence microprobe (XFM) end-station at the Australian Synchrotron has been developed. The simulation is required for optimization of the scan configuration and reconstruction algorithms. As part of the simulation process, a Gaussian beam model was developed. Experimental validation of this simulation has tested the efficacy for use of the low-energy physics models in Geant4 for this synchrotron-based technique. The observed spectral distributions calculated in the 384-pixel Maia detector, positioned in the standard back-scatter configuration, were compared with those obtained from experiments performed at three incident X-ray beam energies: 18.5, 11.0 and 6.8 keV. The reduced chi-squared (χ²_red) was calculated for the scatter and fluorescence regions of the spectra and demonstrates that the simulations successfully reproduce the scatter distributions. Discrepancies were shown to occur in the multiple-scatter tail of the Compton continuum. The model was shown to be particularly sensitive to the impurities present in the beryllium window of the Maia detector, and their concentrations were optimized to improve the χ²_red parametrization in the low-energy fluorescence regions of the spectra.
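The reduced chi-squared figure of merit used above can be computed per spectral region assuming Poisson counting statistics (variance equal to the expected counts). A minimal sketch with invented bin contents:

```python
def reduced_chi_squared(observed, expected, n_fit_params=0):
    """Reduced chi-squared between measured and simulated spectra, assuming
    Poisson counting errors (variance = expected counts per bin)."""
    terms = [(o - e) ** 2 / e for o, e in zip(observed, expected) if e > 0]
    dof = len(terms) - n_fit_params
    return sum(terms) / dof

# Hypothetical counts in a few fluorescence-region bins.
measured  = [105, 212, 298, 405]
simulated = [100, 220, 300, 400]
print(f"chi2_red = {reduced_chi_squared(measured, simulated):.3f}")
```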

  9. Validation of nuclear models in Geant4 using the dose distribution of a 177 MeV proton pencil beam.

    PubMed

    Hall, David C; Makarova, Anastasia; Paganetti, Harald; Gottschalk, Bernard

    2016-01-07

    A proton pencil beam is associated with a surrounding low-dose envelope, originating from nuclear interactions. It is important for treatment planning systems to accurately model this envelope when performing dose calculations for pencil beam scanning treatments, and Monte Carlo (MC) codes are commonly used for this purpose. This work aims to validate the nuclear models employed by the Geant4 MC code, by comparing the simulated absolute dose distribution to a recent experiment of a 177 MeV proton pencil beam stopping in water. Striking agreement is observed over five orders of magnitude, with both the shape and normalisation well modelled. The normalisations of two depth dose curves are lower than experiment, though this could be explained by an experimental positioning error. The Geant4 neutron production model is also verified in the distal region. The entrance dose is poorly modelled, suggesting an unaccounted upstream source of low-energy protons. Recommendations are given for a follow-up experiment which could resolve these issues.

  10. Layered mass geometry: a novel technique to overlay seeds and applicators onto patient geometry in Geant4 brachytherapy simulations

    NASA Astrophysics Data System (ADS)

    Enger, Shirin A.; Landry, Guillaume; D'Amours, Michel; Verhaegen, Frank; Beaulieu, Luc; Asai, Makoto; Perl, Joseph

    2012-10-01

A problem faced by all Monte Carlo (MC) particle transport codes is how to handle overlapping geometries. The Geant4 MC toolkit allows the user to create parallel geometries within a single application. In Geant4 the standard mass-containing geometry is defined in a simulation volume called the World Volume. Separate parallel geometries can be defined in parallel worlds, that is, alternate three-dimensional simulation volumes that share the same coordinate system with the World Volume for geometrical event biasing, scoring of radiation interactions, and/or the creation of hits in detailed readout structures. Until recently, only one of those worlds could contain mass, so these parallel worlds provided no solution to simplify a complex geometric overlay issue in brachytherapy, namely the overlap of radiation sources and applicators with a CT-based patient geometry. The standard method to handle seed and applicator overlay in MC requires removing CT voxels whose boundaries would intersect sources, placing the sources into the resulting void and then backfilling the remaining space of the void with a relevant material. The backfilling process may degrade the accuracy of patient representation, and the geometrical complexity of the technique precludes using fast and memory-efficient coding techniques that have been developed for regular voxel geometries. The patient must be represented by the less memory- and CPU-efficient Geant4 voxel placement technique, G4PVPlacement, rather than the more efficient G4NestedParameterization (G4NestedParam). We introduce for the first time a Geant4 feature developed to solve this issue: Layered Mass Geometry (LMG), whereby both the standard world (the CT-based patient geometry) and the parallel world (seeds and applicators) may now have mass. For any area where mass is present in the parallel world, the parallel mass is used. Elsewhere, the mass of the standard world is used. With LMG the user no longer needs to remove patient CT voxels that would intersect sources.
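The LMG priority rule can be paraphrased as: wherever the parallel world defines mass, it wins; elsewhere the standard world's material applies. A toy lookup under that rule (hypothetical names and data, not the Geant4 API):

```python
def effective_material(voxel, parallel_world, standard_world):
    """Layered-mass lookup: parallel-world mass (seed/applicator) takes
    precedence over the CT-derived patient material at the same location."""
    return parallel_world.get(voxel) or standard_world[voxel]

# Hypothetical 1D 'geometry': CT materials with a seed overlaid on voxel 2.
standard_world = {0: "soft_tissue", 1: "soft_tissue", 2: "soft_tissue", 3: "bone"}
parallel_world = {2: "Ir-192_seed"}

print([effective_material(v, parallel_world, standard_world) for v in range(4)])
```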

  12. SU-E-T-521: Investigation of the Uncertainties Involved in Secondary Neutron/gamma Production in Geant4/MCNP6 Monte Carlo Codes for Proton Therapy Application

    SciTech Connect

    Mirzakhanian, L; Enger, S; Giusti, V

    2015-06-15

Purpose: A major concern in proton therapy is the production of secondary neutrons causing secondary cancers, especially in young adults and children. The most widely used Monte Carlo codes in proton therapy are Geant4 and MCNP. However, the default versions of Geant4 and MCNP6 do not have suitable cross sections or physical models to properly handle secondary particle production in the proton energy ranges used for therapy. In this study, the default versions of Geant4 and MCNP6 were modified to better handle the production of secondaries by adding the TENDL-2012 cross-section library. Methods: In-water proton depth-dose was measured at the "The Svedberg Laboratory" in Uppsala (Sweden). The proton beam was mono-energetic with a mean energy of 178.25±0.2 MeV. The measurement set-up was simulated by Geant4 version 10.00 (default and modified versions) and MCNP6. Proton depth-dose, primary and secondary particle fluence, and neutron equivalent dose were calculated. In the case of Geant4, the secondary particle fluence was filtered by all the physics processes to identify the main process responsible for the difference between the default and modified versions. Results: The proton depth-dose curves and primary proton fluence show a good agreement between both Geant4 versions and MCNP6. With respect to the modified version, default Geant4 underestimates the production of secondary neutrons while overestimating that of gammas. The "ProtonInElastic" process was identified as the main process responsible for the difference between the two versions. MCNP6 shows higher neutron production and lower gamma production than both Geant4 versions. Conclusion: Despite the good agreement on the proton depth-dose curve and primary proton fluence, there is a significant discrepancy in secondary neutron production between MCNP6 and both versions of Geant4. Further studies are thus in order to find the possible cause of this discrepancy or more accurate cross-sections/models to handle the nuclear interactions.
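The neutron equivalent dose mentioned in the methods is conventionally obtained by folding the scored neutron fluence spectrum with fluence-to-dose conversion coefficients. A hedged sketch of that folding step (the bin values and coefficients below are invented; real coefficients come from ICRP tabulations):

```python
def neutron_equivalent_dose_pSv(fluence_spectrum, coefficients):
    """Fold a binned neutron fluence spectrum (n/cm^2 per bin) with
    fluence-to-equivalent-dose conversion coefficients (pSv.cm^2 per bin)."""
    return sum(f * h for f, h in zip(fluence_spectrum, coefficients))

# Hypothetical three-bin spectrum (thermal, MeV-range, high-energy) with
# made-up coefficient values for illustration only.
fluence = [1.0e4, 5.0e3, 2.0e2]      # n/cm^2
h_coeff = [10.0, 400.0, 500.0]       # pSv.cm^2
print(f"{neutron_equivalent_dose_pSv(fluence, h_coeff):.3g} pSv")
```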

  13. CRISP90 - SOFTWARE DESIGN ANALYZER SYSTEM

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1994-01-01

The CRISP90 Software Design Analyzer System, an update of CRISP-80, is a set of programs forming a software design and documentation tool which supports top-down, hierarchic, modular, structured design and programming methodologies. The quality of a computer program can often be significantly influenced by the design medium in which the program is developed. The medium must foster the expression of the programmer's ideas easily and quickly, and it must permit flexible and facile alterations, additions, and deletions to these ideas as the design evolves. The CRISP90 software design analyzer system was developed to provide the PDL (Programmer Design Language) programmer with such a design medium. A program design using CRISP90 consists of short, English-like textual descriptions of data, interfaces, and procedures that are embedded in a simple, structured, modular syntax. The display is formatted into two-dimensional, flowchart-like segments for a graphic presentation of the design. Together with a good interactive full-screen editor or word processor, the CRISP90 design analyzer becomes a powerful tool for the programmer. In addition to being a text formatter, the CRISP90 system prepares material that would be tedious and error-prone to extract manually, such as a table of contents, module directory, structure (tier) chart, cross-references, and a statistics report on the characteristics of the design. Referenced modules are marked by schematic logic symbols to show conditional, iterative, and/or concurrent invocation in the program. A keyword usage profile can be generated automatically and glossary definitions inserted into the output documentation. Another feature is the capability to detect changes that were made between versions. Thus, "change-bars" can be placed in the output document along with a list of changed pages and a version history report. Also, items may be marked as "to be determined" and each will appear on a special table until the item is resolved.
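The version-change detection described above (flagging lines for "change-bars") can be illustrated with a standard sequence comparison. A small sketch using Python's difflib, with an invented PDL-like snippet:

```python
import difflib

def changed_lines(old_text, new_text):
    """Return the 1-based line numbers in the new version that differ from
    the old one -- the lines a design tool would flag with change-bars."""
    sm = difflib.SequenceMatcher(a=old_text.splitlines(), b=new_text.splitlines())
    changed = []
    for tag, _, _, j1, j2 in sm.get_opcodes():
        if tag != "equal":
            changed.extend(range(j1 + 1, j2 + 1))
    return changed

old = "PROCEDURE init\n  open files\n  read config\nEND"
new = "PROCEDURE init\n  open files\n  read config file\n  log startup\nEND"
print(changed_lines(old, new))   # -> [3, 4]
```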

  14. Automating risk analysis of software design models.

    PubMed

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.
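The identification/mitigation pairing that AutSEC automates can be caricatured as a lookup from design-element types to known threats, and from threats to advice. The sketch below is only in the spirit of such tools; the element types, threats, and advice strings are invented, not AutSEC's actual data model:

```python
# Toy identification/mitigation lookup; all entries are illustrative.
IDENTIFICATION = {
    "external_input": ["tampering", "injection"],
    "data_store":     ["information_disclosure"],
}
MITIGATION = {
    "tampering": "validate and authenticate all external messages",
    "injection": "use parameterized queries and input sanitization",
    "information_disclosure": "encrypt data at rest and restrict access",
}

def analyze(design_elements):
    """Map each element of a data-flow diagram to threats and advice."""
    report = {}
    for name, kind in design_elements.items():
        threats = IDENTIFICATION.get(kind, [])
        report[name] = [(t, MITIGATION[t]) for t in threats]
    return report

design = {"login_form": "external_input", "user_db": "data_store"}
for element, findings in analyze(design).items():
    for threat, advice in findings:
        print(f"{element}: {threat} -> {advice}")
```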

  15. Empirical studies of software design: Implications for SSEs

    NASA Technical Reports Server (NTRS)

    Krasner, Herb

    1988-01-01

    Implications for Software Engineering Environments (SEEs) are presented in viewgraph format for characteristics of projects studied; significant problems and crucial problem areas in software design for large systems; layered behavioral model of software processes; implications of field study results; software project as an ecological system; results of the LIFT study; information model of design exploration; software design strategies; results of the team design study; and a list of publications.

  16. Advanced Extravehicular Mobility Unit Informatics Software Design

    NASA Technical Reports Server (NTRS)

    Wright, Theodore

    2014-01-01

    This is a description of the software design for the 2013 edition of the Advanced Extravehicular Mobility Unit (AEMU) Informatics computer assembly. The Informatics system is an optional part of the space suit assembly. It adds a graphical interface for displaying suit status, timelines, procedures, and caution and warning information. In the future it will display maps with GPS position data, and video and still images captured by the astronaut.

  17. Dose point kernels in liquid water: an intra-comparison between GEANT4-DNA and a variety of Monte Carlo codes.

    PubMed

    Champion, C; Incerti, S; Perrot, Y; Delorme, R; Bordage, M C; Bardiès, M; Mascialino, B; Tran, H N; Ivanchenko, V; Bernal, M; Francis, Z; Groetz, J-E; Fromm, M; Campos, L

    2014-01-01

    Modeling radiation-induced effects in biological media still requires accurate physics models to describe in detail the interactions induced by all the charged particles present in the irradiated medium. These interactions include inelastic as well as elastic processes. To check the accuracy of the very low energy models recently implemented in the GEANT4 toolkit for modeling electron slowing-down in liquid water, the simulation of electron dose point kernels remains the preferential test. In this context, we report normalized radial dose profiles, for mono-energetic point sources, computed in liquid water using the very low energy "GEANT4-DNA" physics processes available in the GEANT4 toolkit. In the present study, we report an extensive intra-comparison with profiles obtained by a large selection of existing and well-documented Monte Carlo codes, namely EGSnrc, PENELOPE, CPA100, FLUKA and MCNPX.

  18. Software archeology: a case study in software quality assurance and design

    SciTech Connect

    Macdonald, John M; Lloyd, Jane A; Turner, Cameron J

    2009-01-01

    Ideally, quality is designed into software, just as quality is designed into hardware. However, when dealing with legacy systems, demonstrating that the software meets required quality standards may be difficult to achieve. As the need to demonstrate the quality of existing software was recognized at Los Alamos National Laboratory (LANL), an effort was initiated to uncover and demonstrate that legacy software met the required quality standards. This effort led to the development of a reverse engineering approach referred to as software archaeology. This paper documents the software archaeology approaches used at LANL to document legacy software systems. A case study for the Robotic Integrated Packaging System (RIPS) software is included.

  19. Ray tracing simulations for the wide-field x-ray telescope of the Einstein Probe mission based on Geant4 and XRTG4

    NASA Astrophysics Data System (ADS)

    Zhao, Donghua; Zhang, Chen; Yuan, Weimin; Willingale, Richard; Ling, Zhixing; Feng, Hua; Li, Hong; Ji, Jianfeng; Wang, Wenxin; Zhang, Shuangnan

    2014-07-01

    Einstein Probe (EP) is a proposed small scientific satellite dedicated to time-domain astrophysics working in the soft X-ray band. It will discover transients and monitor variable objects in 0.5-4 keV, for which it will employ a very large instantaneous field-of-view (60° × 60°), along with moderate spatial resolution (FWHM ˜ 5 arcmin). Its wide-field imaging capability will be achieved by using established technology in novel lobster-eye optics. In this paper, we present Monte-Carlo simulations for the focusing capabilities of EP's Wide-field X-ray Telescope (WXT). The simulations are performed using Geant4 together with an X-ray tracer developed by cosine (http://cosine.nl/). Our work is the first step toward building a comprehensive model with which the design of the X-ray optics and the ultimate sensitivity of the instrument can be optimized by simulating the X-ray tracing and radiation environment of the system, including the focal plane detector and the shielding at the same time.

  20. Probing Planetary Bodies for Subsurface Volatiles: GEANT4 Models of Gamma Ray, Fast, Epithermal, and Thermal Neutron Response to Active Neutron Illumination

    NASA Astrophysics Data System (ADS)

    Chin, G.; Sagdeev, R.; Su, J. J.; Murray, J.

    2014-12-01

    Using an active source of neutrons as an in situ probe of a planetary body has proven to be a powerful tool to extract information about the presence, abundance, and location of subsurface volatiles without the need for drilling. The Dynamic Albedo of Neutrons (DAN) instrument on Curiosity is an example of such an instrument and is designed to detect the location and abundance of hydrogen within the top 50 cm of the Martian surface. DAN works by sending a pulse of neutrons towards the ground beneath the rover and detecting the reflected neutrons. The intensity and time of arrival of the reflection depend on the proportion of water, while the time the pulse takes to reach the detector is a function of the depth at which the water is located. Similar instruments can also be effective probes at the polar regions of the Moon or on asteroids as a way of detecting sequestered volatiles. We present the results of GEANT4 particle simulation models of gamma ray, fast, epithermal, and thermal neutron responses to active neutron illumination. The results are parameterized by hydrogen abundance and by the stratification and depth of volatile layers, versus the distribution of neutron and gamma ray energy reflections. Models will be presented to approximate Martian, lunar, and asteroid environments and will be useful tools for assessing utility for future NASA exploration missions to these types of planetary bodies.

  1. Technical Note: Improvements in GEANT4 energy-loss model and the effect on low-energy electron transport in liquid water

    SciTech Connect

    Kyriakou, I.; Incerti, S.

    2015-07-15

    Purpose: The GEANT4-DNA physics models are upgraded by a more accurate set of electron cross sections for ionization and excitation in liquid water. The impact of the new developments on low-energy electron transport simulations by the GEANT4 Monte Carlo toolkit is examined for improving its performance in dosimetry applications at the subcellular and nanometer level. Methods: The authors provide an algorithm for an improved implementation of the Emfietzoglou model dielectric response function of liquid water used in the GEANT4-DNA existing model. The algorithm redistributes the imaginary part of the dielectric function to ensure a physically motivated behavior at the binding energies, while retaining all the advantages of the original formulation, e.g., the analytic properties and the fulfillment of the f-sum-rule. In addition, refinements in the exchange and perturbation corrections to the Born approximation used in the GEANT4-DNA existing model are also made. Results: The new ionization and excitation cross sections are significantly different from those of the GEANT4-DNA existing model. In particular, excitations are strongly enhanced relative to ionizations, resulting in higher W-values and less diffusive dose-point-kernels at sub-keV electron energies. Conclusions: An improved energy-loss model for the excitation and ionization of liquid water by low-energy electrons has been implemented in GEANT4-DNA. The suspiciously low W-values and the unphysical long tail in the dose-point-kernel have been corrected owing to a different partitioning of the dielectric function.
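
    The f-sum-rule constraint mentioned above can be illustrated numerically. A sketch using an assumed single-oscillator Drude energy-loss function with round-number parameters, not the fitted Emfietzoglou model for liquid water:

```python
# Illustrative numeric check (not the authors' code) of the f-sum rule that a
# dielectric model must preserve: for an energy-loss function Im[-1/eps(E)],
#     integral of E * Im[-1/eps(E)] dE  =  (pi/2) * Ep^2,
# where Ep is the plasmon energy. A single Drude oscillator satisfies it
# exactly; the parameters below are assumed round numbers.
import numpy as np

Ep, E0, g = 21.4, 21.0, 10.0   # eV: plasmon energy, peak position, width

def elf(E):
    """Single-oscillator Drude energy-loss function Im[-1/eps(E)]."""
    return Ep**2 * g * E / ((E**2 - E0**2)**2 + (g * E)**2)

# Trapezoidal integration of E * elf(E); the integrand decays like 1/E^2,
# so truncating at 100 keV leaves only a ~1e-4 relative tail.
E = np.linspace(1e-4, 1e5, 2_000_000)
y = E * elf(E)
integral = float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(E)))
expected = 0.5 * np.pi * Ep**2

print(integral, expected)
```

    The actual GEANT4-DNA model uses a sum of such oscillator terms fitted to optical data, and the redistribution described in the abstract must keep this integral invariant.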

  2. GEANT4 Simulation of Hadronic Interactions at 8-GeV/C to 10-GeV/C: Response to the HARP-CDP Group

    SciTech Connect

    Uzhinsky, V.; Apostolakis, J.; Folger, G.; Ivanchenko, V.N.; Kossov, M.V.; Wright, D.H.; /SLAC

    2011-11-21

    The results of the HARP-CDP group on the comparison of GEANT4 Monte Carlo predictions versus experimental data are discussed. It is shown that the problems observed by the group are caused by an incorrect implementation of old features at the programming level, and by a lack of the nucleon Fermi motion in the simulation of quasielastic scattering. These drawbacks are not due to the physical models used. They do not manifest themselves in the most important applications of the GEANT4 toolkit.

  3. j5 DNA assembly design automation software.

    PubMed

    Hillson, Nathan J; Rosengarten, Rafael D; Keasling, Jay D

    2012-01-20

    Recent advances in Synthetic Biology have yielded standardized and automatable DNA assembly protocols that enable a broad range of biotechnological research and development. Unfortunately, the experimental design required for modern scar-less multipart DNA assembly methods is frequently laborious, time-consuming, and error-prone. Here, we report the development and deployment of a web-based software tool, j5, which automates the design of scar-less multipart DNA assembly protocols including SLIC, Gibson, CPEC, and Golden Gate. The key innovations of the j5 design process include cost optimization, leveraging DNA synthesis when cost-effective to do so, the enforcement of design specification rules, hierarchical assembly strategies to mitigate likely assembly errors, and the instruction of manual or automated construction of scar-less combinatorial DNA libraries. Using a GFP expression testbed, we demonstrate that j5 designs can be executed with the SLIC, Gibson, or CPEC assembly methods, used to build combinatorial libraries with the Golden Gate assembly method, and applied to the preparation of linear gene deletion cassettes for E. coli. The DNA assembly design algorithms reported here are generally applicable to broad classes of DNA construction methodologies and could be implemented to supplement other DNA assembly design tools. Taken together, these innovations save researchers time and effort, reduce the frequency of user design errors and off-target assembly products, decrease research costs, and enable scar-less multipart and combinatorial DNA construction at scales unfeasible without computer-aided design.
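
    The "leverage DNA synthesis when cost-effective" idea can be sketched as a per-fragment cost comparison; the prices and fragment names below are invented for illustration, and j5's actual cost model is considerably richer:

```python
# Hypothetical sketch of one j5 design idea from the abstract: for each
# assembly fragment, choose direct DNA synthesis when it is cheaper than
# PCR-based preparation. All costs are made-up illustrative numbers.

SYNTHESIS_PER_BP = 0.25   # $/bp, assumed
PCR_FLAT = 12.0           # $ per fragment (primers + reagents), assumed

def cheapest_route(length_bp):
    """Return (route, cost) for the cheaper of synthesis vs PCR."""
    synth = SYNTHESIS_PER_BP * length_bp
    pcr = PCR_FLAT
    return ("synthesis", synth) if synth < pcr else ("PCR", pcr)

fragments = {"promoter": 40, "gfp_orf": 720, "terminator": 60}
plan = {name: cheapest_route(n) for name, n in fragments.items()}
print(plan)  # short fragments -> synthesis, long ones -> PCR
```

    The same comparison generalizes to more routes (oligo embedding, reuse of existing parts) by taking the minimum over all applicable cost functions.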

  4. Assessment and improvements of Geant4 hadronic models in the context of prompt-gamma hadrontherapy monitoring

    NASA Astrophysics Data System (ADS)

    Dedes, G.; Pinto, M.; Dauvergne, D.; Freud, N.; Krimmer, J.; Létang, J. M.; Ray, C.; Testa, E.

    2014-04-01

    Monte Carlo simulations are nowadays essential tools for a wide range of research topics in the field of radiotherapy. They also play an important role in the effort to develop a real-time monitoring system for quality assurance in proton and carbon ion therapy, by means of prompt-gamma detection. The internal theoretical nuclear models of Monte Carlo simulation toolkits are of decisive importance for the accurate description of neutral or charged particle emission, produced by nuclear interactions between beam particles and target nuclei. We assess the performance of Geant4 nuclear models in the context of prompt-gamma emission, comparing them with experimental data from proton and carbon ion beams. As has been shown in the past and further indicated in our study, the prompt-gamma yields are consistently overestimated by Geant4 by about 100% to 200% over an energy range from 80 to 310 MeV/u for the case of 12C, and to a lesser extent for 160 MeV protons. Furthermore, we focus on the quantum molecular dynamics (QMD) modeling of ion-ion collisions, in order to optimize its description of light nuclei, which are abundant in the human body and mainly anticipated in hadrontherapy applications. The optimization has been performed by benchmarking QMD free parameters against well established nuclear properties. In addition, we study the effect of this optimization on charged particle emission. Using the proposed parameter values, discrepancies are reduced to less than 70%, with the highest values attributed to the nucleon-ion induced prompt-gammas. This conclusion, also confirmed by the disagreement we observe in the case of proton beams, indicates the need for further investigation of nuclear models which describe proton and neutron induced nuclear reactions.

  5. An investigation on the radiation sensitivity of DNA conformations to 60Co gamma rays by using Geant4 toolkit

    NASA Astrophysics Data System (ADS)

    Semsarha, F.; Goliaei, B.; Raisali, G.; Khalafi, H.; Mirzakhanian, L.

    2014-03-01

    To investigate the impact of conformational properties of the genetic material of living cells on radiation-induced DNA damage, single strand breaks (SSB), double strand breaks (DSB) and some microdosimetric quantities of the A, B and Z-DNA conformations caused by 60Co gamma rays have been calculated. Based on a previous B-DNA geometrical model, models of the A and Z forms have been developed. Simple 34 base pair segments of each model, repeated in high number, and the secondary electron spectrum of 60Co gamma rays have been simulated in a volume of a typical animal cell nucleus. All simulations in this study have been performed using the Geant4 (GEometry ANd Tracking 4)-DNA extension of the Geant4 toolkit. The results showed that B-DNA has the lowest yield of single strand breaks, with 2.23 × 10⁻¹⁰ Gy⁻¹ Da⁻¹ and 1.0 × 10⁻¹¹ Gy⁻¹ Da⁻¹ for the SSB and DSB damage yields, respectively. A-DNA has the highest SSB yield with 3.59 × 10⁻¹⁰ Gy⁻¹ Da⁻¹ and Z-DNA has the highest DSB yield with 1.8 × 10⁻¹¹ Gy⁻¹ Da⁻¹. It has been concluded that there is a direct correlation between the hit probability, mean specific imparted energy and SSB yield in each model of DNA. Moreover, there is a direct correlation between the DSB yield and both the mean lineal energy and topological characteristics of each model.
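
    SSB/DSB tallies in such track-structure studies are typically obtained by clustering individual strand-break positions. A sketch of the conventional classification rule (two breaks on opposite strands within 10 bp form one DSB; the authors' exact criterion is not stated in the abstract):

```python
# Sketch of the standard strand-break classification (not the authors' code):
# two breaks on opposite strands within DSB_MAX_SEPARATION_BP base pairs
# pair into one double strand break (DSB); every unpaired break is an SSB.

DSB_MAX_SEPARATION_BP = 10

def classify_breaks(breaks):
    """breaks: list of (bp_position, strand) with strand 0 or 1.
    Returns (n_ssb, n_dsb)."""
    breaks = sorted(breaks)
    used = [False] * len(breaks)
    dsb = 0
    for i, (pos_i, strand_i) in enumerate(breaks):
        if used[i]:
            continue
        for j in range(i + 1, len(breaks)):
            pos_j, strand_j = breaks[j]
            if pos_j - pos_i > DSB_MAX_SEPARATION_BP:
                break               # positions are sorted; no later match
            if not used[j] and strand_j != strand_i:
                used[i] = used[j] = True
                dsb += 1
                break
        else:
            pass
    ssb = used.count(False)
    return ssb, dsb

# Breaks at bp 5 (strand 0) and bp 9 (strand 1) pair into one DSB;
# the isolated break at bp 40 stays an SSB.
print(classify_breaks([(5, 0), (9, 1), (40, 0)]))  # -> (1, 1)
```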

  6. In-beam quality assurance using induced β+ activity in hadrontherapy: a preliminary physical requirements study using Geant4

    NASA Astrophysics Data System (ADS)

    Lestand, L.; Montarou, G.; Force, P.; Pauna, N.

    2012-10-01

    Light and heavy ion particle therapy, mainly by means of protons and carbon ions, represents an advantageous treatment modality for deep-seated and/or radioresistant tumours. An in-beam quality assurance principle is based on the detection of secondary particles induced by nuclear fragmentations between projectile and target nuclei. Three different strategies are currently under investigation: prompt γ-ray imaging, proton interaction vertex imaging and in-beam positron emission tomography. Geant4 simulations have been performed first in order to assess the accuracy of some hadronic models in reproducing experimental data. Two different kinds of data have been considered: β+-emitting isotopes and prompt γ-ray production rates. On the one hand, simulations reproduce experimental β+-emitting isotope production rates to an accuracy of 24%. Moreover, the simulated β+-emitting nuclei production rate as a function of depth reproduces well the peak-to-plateau ratio of the experimental data. On the other hand, by tuning the tolerance factor of the photon evaporation model available in Geant4, we significantly reduce prompt γ-ray production rates until very good agreement is reached with experimental data. We have then estimated the total numbers of induced annihilation photons and prompt γ rays for a simple treatment plan of ∼1 physical Gy in a homogeneous equivalent soft tissue tumour (6 cm depth, 4 cm radius and 2 cm wide). The average yields during a 45 s irradiation in a 4π solid angle are ∼2 × 10⁶ annihilation photon pairs and 10⁸ single prompt γ rays whose energies range from a few keV to 10 MeV.

  7. In-beam quality assurance using induced β(+) activity in hadrontherapy: a preliminary physical requirements study using Geant4.

    PubMed

    Lestand, L; Montarou, G; Force, P; Pauna, N

    2012-10-21

    Light and heavy ion particle therapy, mainly by means of protons and carbon ions, represents an advantageous treatment modality for deep-seated and/or radioresistant tumours. An in-beam quality assurance principle is based on the detection of secondary particles induced by nuclear fragmentations between projectile and target nuclei. Three different strategies are currently under investigation: prompt γ-ray imaging, proton interaction vertex imaging and in-beam positron emission tomography. Geant4 simulations have been performed first in order to assess the accuracy of some hadronic models in reproducing experimental data. Two different kinds of data have been considered: β(+)-emitting isotopes and prompt γ-ray production rates. On the one hand, simulations reproduce experimental β(+)-emitting isotope production rates to an accuracy of 24%. Moreover, the simulated β(+)-emitting nuclei production rate as a function of depth reproduces well the peak-to-plateau ratio of the experimental data. On the other hand, by tuning the tolerance factor of the photon evaporation model available in Geant4, we significantly reduce prompt γ-ray production rates until very good agreement is reached with experimental data. We have then estimated the total numbers of induced annihilation photons and prompt γ rays for a simple treatment plan of ∼1 physical Gy in a homogeneous equivalent soft tissue tumour (6 cm depth, 4 cm radius and 2 cm wide). The average yields during a 45 s irradiation in a 4π solid angle are ∼2 × 10(6) annihilation photon pairs and 10(8) single prompt γ rays whose energies range from a few keV to 10 MeV.

  8. Assessment and improvements of Geant4 hadronic models in the context of prompt-gamma hadrontherapy monitoring.

    PubMed

    Dedes, G; Pinto, M; Dauvergne, D; Freud, N; Krimmer, J; Létang, J M; Ray, C; Testa, E

    2014-04-07

    Monte Carlo simulations are nowadays essential tools for a wide range of research topics in the field of radiotherapy. They also play an important role in the effort to develop a real-time monitoring system for quality assurance in proton and carbon ion therapy, by means of prompt-gamma detection. The internal theoretical nuclear models of Monte Carlo simulation toolkits are of decisive importance for the accurate description of neutral or charged particle emission, produced by nuclear interactions between beam particles and target nuclei. We assess the performance of Geant4 nuclear models in the context of prompt-gamma emission, comparing them with experimental data from proton and carbon ion beams. As has been shown in the past and further indicated in our study, the prompt-gamma yields are consistently overestimated by Geant4 by about 100% to 200% over an energy range from 80 to 310 MeV/u for the case of (12)C, and to a lesser extent for 160 MeV protons. Furthermore, we focus on the quantum molecular dynamics (QMD) modeling of ion-ion collisions, in order to optimize its description of light nuclei, which are abundant in the human body and mainly anticipated in hadrontherapy applications. The optimization has been performed by benchmarking QMD free parameters against well established nuclear properties. In addition, we study the effect of this optimization on charged particle emission. Using the proposed parameter values, discrepancies are reduced to less than 70%, with the highest values attributed to the nucleon-ion induced prompt-gammas. This conclusion, also confirmed by the disagreement we observe in the case of proton beams, indicates the need for further investigation of nuclear models which describe proton and neutron induced nuclear reactions.

  9. Optomechanical design software for segmented mirrors

    NASA Astrophysics Data System (ADS)

    Marrero, Juan

    2016-08-01

    The software package presented in this paper, still under development, was created to help analyze the influence of the many parameters involved in the design of a large segmented mirror telescope. In summary, it is a set of tools which were added to a common framework as they were needed. Great emphasis has been placed on the graphical presentation, as scientific visualization nowadays cannot be conceived without the use of a helpful 3D environment, showing the analyzed system as close to reality as possible. Use of third-party software packages is limited to ANSYS, which should be available in the system only if the FEM results are needed. Among the various functionalities of the software, the following are worth mentioning here: automatic 3D model construction of a segmented mirror from a set of parameters, geometric ray tracing, automatic 3D model construction of a telescope structure around the defined mirrors from a set of parameters, segmented mirror human access assessment, analysis of integration tolerances, assessment of segment collisions, structural deformation under gravity and thermal variation, mirror support system analysis including warping harness mechanisms, etc.

  10. COG Software Architecture Design Description Document

    SciTech Connect

    Buck, R M; Lent, E M

    2009-09-21

    This COG Software Architecture Design Description Document describes the organization and functionality of the COG Multiparticle Monte Carlo Transport Code for radiation shielding and criticality calculations, at a level of detail suitable for guiding a new code developer in the maintenance and enhancement of COG. The intended audience also includes managers and scientists and engineers who wish to have a general knowledge of how the code works. This Document is not intended for end-users. This document covers the software implemented in the standard COG Version 10, as released through RSICC and IAEA. Software resources provided by other institutions will not be covered. This document presents the routines grouped by modules and in the order of the three processing phases. Some routines are used in multiple phases. The routine description is presented once - the first time the routine is referenced. Since this is presented at the level of detail for guiding a new code developer, only the routines invoked by another routine that are significant for the processing phase that is being detailed are presented. An index to all routines detailed is included. Tables for the primary data structures are also presented.

  11. Thalmann Algorithm Decompression Table Generation Software Design Document

    DTIC Science & Technology

    2010-09-01

    (U) Thalmann Algorithm Decompression Table Generation Software Design Document, Navy Experimental Diving Unit. [Abstract not available; the record contains only report-form fragments, including a contents entry for the Decompression Table Generator (TBLP7R…).]

  12. SU-E-T-531: Performance Evaluation of Multithreaded Geant4 for Proton Therapy Dose Calculations in a High Performance Computing Facility

    SciTech Connect

    Shin, J; Coss, D; McMurry, J; Farr, J; Faddegon, B

    2014-06-01

    Purpose: To evaluate the efficiency of multithreaded Geant4 (Geant4-MT, version 10.0) for proton Monte Carlo dose calculations using a high performance computing facility. Methods: Geant4-MT was used to calculate 3D dose distributions in 1×1×1 mm³ voxels in a water phantom and a patient's head with a 150 MeV proton beam covering approximately 5×5 cm² in the water phantom. Three timestamps were measured on the fly to separately analyze the required time for initialization (which cannot be parallelized), the processing time of individual threads, and the completion time. Scalability of the averaged processing time per thread was calculated as a function of thread number (1, 100, 150, and 200) for both 1 M and 50 M histories. The total memory usage was recorded. Results: Simulations with 50 M histories were fastest with 100 threads, taking approximately 1.3 hours and 6 hours for the water phantom and the CT data, respectively, with better than 1.0% statistical uncertainty. The calculations show 1/N scalability in the event loops for both cases. The gains from parallel calculations started to decrease with 150 threads. The memory usage increases linearly with the number of threads. No critical failures were observed during the simulations. Conclusion: Multithreading in Geant4-MT decreased simulation time in proton dose distribution calculations by a factor of 64 and 54 at a near-optimal 100 threads for the water phantom and the patient's data, respectively. Further simulations will be done to determine the efficiency at the optimal thread number. Considering the trend of computer architecture development, utilizing Geant4-MT for radiotherapy simulations is an excellent cost-effective alternative to a distributed batch queuing system. However, because the scalability depends highly on simulation details, i.e., the ratio of the processing time of one event versus the waiting time to access the shared event queue, a performance evaluation as described is recommended.
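
    The reported behavior (a serial initialization phase, 1/N event-loop scaling, diminishing gains past 100 threads) is the classic Amdahl's-law pattern. A sketch with an assumed serial fraction, not the paper's measured timings:

```python
# Amdahl's law: only the parallelizable fraction of the run scales as 1/N,
# so the overall speedup saturates at 1/serial_fraction.

def amdahl_speedup(serial_fraction, n_threads):
    """Overall speedup when only (1 - serial_fraction) of the work scales 1/N."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_threads)

s = 0.005  # assume 0.5% of single-thread wall time is serial initialization
for n in (1, 50, 100, 150, 200):
    print(n, round(amdahl_speedup(s, n), 1))
# speedup saturates near 1/s = 200 no matter how many threads are added
```

    With a serial fraction of a fraction of a percent, the marginal gain per added thread is already small beyond ~100 threads, matching the trend the authors observe.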

  13. Methodology for Designing Fault-Protection Software

    NASA Technical Reports Server (NTRS)

    Barltrop, Kevin; Levison, Jeffrey; Kan, Edwin

    2006-01-01

    A document describes a methodology for designing fault-protection (FP) software for autonomous spacecraft. The methodology embodies and extends established engineering practices in the technical discipline of Fault Detection, Diagnosis, Mitigation, and Recovery; and has been successfully implemented in the Deep Impact spacecraft, a NASA Discovery mission. Based on established concepts of Fault Monitors and Responses, this FP methodology extends the notions of Opinion, Symptom, Alarm (aka Fault), and Response with numerous new notions, sub-notions, software constructs, and logic and timing gates. For example, a Monitor generates a RawOpinion, which graduates into an Opinion, categorized as no-opinion, acceptable, or unacceptable. RaiseSymptom, ForceSymptom, and ClearSymptom govern the establishment of a Symptom and its mapping to an Alarm (aka Fault). Local Response is distinguished from FP System Response. A 1-to-n and n-to-1 mapping is established among Monitors, Symptoms, and Responses. Responses are categorized by device versus by function. Responses operate in tiers, where the early tiers attempt to resolve the Fault in a localized step-by-step fashion, relegating more system-level responses to later tiers. Recovery actions are gated by epoch recovery timing, enabling strategy, urgency, a MaxRetry gate, hardware availability, hazardous versus ordinary fault, and many other priority gates. This methodology is systematic and logical, and uses multiple linked tables, parameter files, and recovery command sequences. The credibility of the FP design is proven via a fault-tree analysis "top-down" approach and a functional failure-modes-and-effects analysis "bottom-up" approach. Via this process, the mitigation and recovery strategies per Fault Containment Region scope the FP architecture (in width versus depth).
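
    A minimal sketch of the Monitor, Opinion, Symptom, tiered-Response flow described above; the names follow the abstract, but the logic, thresholds, and persistence gate are invented for illustration:

```python
# Toy fault-protection pipeline (illustrative, not the Deep Impact design):
# a Monitor forms opinions about readings, persistent bad opinions raise a
# Symptom/Alarm, and Responses are tried tier by tier until one clears it.

def monitor(reading, limit=100.0):
    """Return an opinion about one telemetry reading."""
    if reading is None:
        return "no-opinion"
    return "acceptable" if reading <= limit else "unacceptable"

def run_fp(readings, persistence=2):
    """Raise a symptom only after `persistence` consecutive bad opinions."""
    bad_streak = 0
    for r in readings:
        if monitor(r) == "unacceptable":
            bad_streak += 1
            if bad_streak >= persistence:
                return respond("OVER_LIMIT")
        else:
            bad_streak = 0
    return "nominal"

def respond(alarm):
    # Tiered responses: local fix first, then device-level, then safing.
    tiers = ["reset_channel", "swap_to_backup_device", "enter_safe_mode"]
    for action in tiers:
        if attempt(action, alarm):        # stop at the first tier that works
            return action
    return "enter_safe_mode"

def attempt(action, alarm):
    return action == "swap_to_backup_device"  # pretend tier 2 clears the fault

print(run_fp([90, 120, 130]))  # -> swap_to_backup_device
```

    The persistence counter plays the role of a timing gate, and the tier loop mirrors the localized-first, system-level-last response ordering.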

  14. ClassCompass: A Software Design Mentoring System

    ERIC Educational Resources Information Center

    Coelho, Wesley; Murphy, Gail

    2007-01-01

    Becoming a quality software developer requires practice under the guidance of an expert mentor. Unfortunately, in most academic environments, there are not enough experts to provide any significant design mentoring for software engineering students. To address this problem, we present a collaborative software design tool intended to maximize an…

  15. Geant4 Simulations of the SuperCDMS iZIP Detector Charge Carrier Propagation and FET Readout

    NASA Astrophysics Data System (ADS)

    Agnese, R.; Brandt, D.; Asai, M.; Cabrera, B.; Leman, S.; McCarthy, K.; Redl, P.; Saab, T.; Wright, D.

    2014-09-01

    The SuperCDMS experiment aims to directly detect dark matter particles called WIMPs (weakly interacting massive particles). The detectors measure phonon and ionization energy due to nuclear and electron recoils from incident particles. The SuperCDMS Detector Monte Carlo group uses Geant4 to simulate electron-hole pairs and low-temperature phonons. We use these simulations in order to study energy deposition in the detectors. Phonons and electron-hole pairs are tracked in a crystal detector. Because of the band structure of the crystals, the electrons undergo oblique propagation. The charge electrodes on each side of the detector are biased at different voltages while the phonon sensors are grounded. This creates a nearly uniform electric field through the bulk of the detector, with a complex shape near the surfaces. The electric field is calculated by interpolating on a tetrahedral mesh. The resulting TES phonon readout, as well as the FET charge readout, are simulated. To calculate the FET readout, the Shockley-Ramo theorem is applied to simulate the current in the FET. The goal of this paper is to describe the theory and implementation of calculating the electric field, performing the charge carrier propagation, and simulating the FET readout of the SuperCDMS detectors.
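
    The Shockley-Ramo step can be illustrated for the simplest geometry. A sketch assuming a parallel-plate electrode pair, whose weighting field is 1/d, with made-up drift parameters, not the SuperCDMS tetrahedral-mesh field:

```python
# Shockley-Ramo theorem: a carrier of charge q moving with velocity v induces
# a current i = q * v . E_w on an electrode, where E_w is that electrode's
# weighting field (electrode at unit potential, all others grounded).
# For parallel plates separated by d, E_w = 1/d. Numbers below are assumed.

Q_E = 1.602e-19        # C, elementary charge

def induced_current(q, v, d):
    """Ramo current on one plate of a parallel-plate pair (E_w = 1/d)."""
    return q * v / d

d = 0.025              # m, assumed 25 mm crystal thickness
v = 2.0e4              # m/s, assumed drift speed
i = induced_current(Q_E, v, d)

# Integrating i over the full drift time d/v recovers exactly the carrier's
# charge q, independent of the drift speed:
q_collected = i * (d / v)
print(i, q_collected)
```

    In the real detector the weighting field is non-uniform near the surfaces, so the induced current is computed along the carrier's simulated trajectory rather than from this closed form.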

  16. Angular and energy distribution for parent primaries of cosmic muons at the sea level using Geant4

    NASA Astrophysics Data System (ADS)

    Arslan, Halil; Bektasoglu, Mehmet

    2015-04-01

    The angular and energy distributions of the primary cosmic rays that are responsible for the muons reaching the sea level have been estimated using the Geant4 simulation package. The models used in the simulations were tested by comparing the simulation results for the differential muon flux with the BESS measurements performed in Lynn Lake, Canada. Then, the direct relationship between the propagation directions of the muons and those of the responsible primary particles has been investigated. The median energies for the parent primaries of vertical muons reaching the sea level with threshold energies (Eμ) in the range 0.5-300 GeV were obtained. Simulation results for the median primary energies, 15.5Eμ and 11.2Eμ for Eμ = 14 GeV and Eμ = 100 GeV respectively, have been found to be in good agreement with the literature. Furthermore, the median primary energies for low energy muons at large zenith angles have been seen to be relatively higher than those for muons at smaller zenith angles.
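
    The notion of a median parent-primary energy can be illustrated in closed form for an idealized power-law primary spectrum; the medians quoted above come from the full Geant4 simulation, not from this formula:

```python
# Median of an idealized power-law spectrum dN/dE ~ E^-gamma above a
# threshold e_min. The cumulative fraction above e_min is
#     F(E) = 1 - (E / e_min)**(1 - gamma),
# and solving F = 1/2 gives E_median = e_min * 2**(1/(gamma-1)).

GAMMA = 2.7   # approximate spectral index of the primary cosmic-ray flux

def median_energy(e_min, gamma=GAMMA):
    """Median of a power-law dN/dE ~ E^-gamma truncated below e_min."""
    return e_min * 2.0 ** (1.0 / (gamma - 1.0))

print(median_energy(10.0))   # GeV; about 1.5x the threshold for gamma = 2.7
```

    The simulated medians (15.5 Eμ, 11.2 Eμ) are far above this idealized value because a primary shares its energy among many shower particles, so much more energetic primaries are needed to produce a muon of a given threshold energy.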

  17. DETECTORS AND EXPERIMENTAL METHODS: Study of neutron response for two hybrid RPC setups using the GEANT4 MC simulation approach

    NASA Astrophysics Data System (ADS)

    Jamil, M.; Rhee, J. T.; Jeon, Y. J.

    2009-10-01

    The present article describes a detailed neutron simulation study in the energy range 10⁻¹⁰ MeV to 1.0 GeV for two different RPC configurations. The simulation studies were carried out using the GEANT4 MC code. Aluminum was used for the GND and readout strips of (a) the Bakelite-based and (b) the glass-based RPCs. For the former type of RPC setup, the neutron sensitivity for the isotropic source was Sn = 2.702 × 10⁻² at En = 1.0 GeV, while for the latter type of RPC, the neutron sensitivity for the same source was evaluated as Sn = 4.049 × 10⁻² at En = 1.0 GeV. These results were further compared with a previous RPC configuration in which copper was used for the ground and pickup pads. Additionally, Al was employed at (GND+strips) of the phosphate-glass RPC setup and compared with the copper-based phosphate-glass RPC. Good agreement was obtained between the sensitivity values of the current and previous simulation results.

  18. G4MoNA - A Geant4 Simulation for unbound nuclides detected with MoNA/LISA

    NASA Astrophysics Data System (ADS)

    Gueye, Paul; Freeman, Jessica; Frank, Nathan; MoNA Collaboration

    2017-01-01

    The MoNA Collaboration has conducted a plethora of experiments to study unbound nuclei near the neutron dripline using the invariant mass technique since 2005. These experiments used a variety of secondary beams from the Coupled Cyclotron Facility of the National Superconducting Cyclotron Laboratory. The experimental setup consists of a large-gap superconducting Sweeper magnet for charged fragment separation and the MoNA/LISA neutron detector arrays for neutron detection. Recently, a multi-layered Si/Be segmented target consisting of three 700 mg/cm² thick 9Be slabs and four 140 μm Si detectors was added to the setup. This target improves the resolution of the reconstructed decay energy spectra of the unbound nuclides. The Geant4 Monte Carlo simulation toolkit was used to develop a complete realistic model of the setup, including a new class to treat the decay of unbound nuclei, the Si/Be segmented target, the MoNA/LISA arrays, and the charged fragment detector systems. Comparison between simulated and experimental data will be presented. DoE NNSA - DE-NA0000979.

  19. PD5: A General Purpose Library for Primer Design Software

    PubMed Central

    Riley, Michael C.; Aubrey, Wayne; Young, Michael; Clare, Amanda

    2013-01-01

    Background Complex PCR applications for large genome-scale projects require fast, reliable and often highly sophisticated primer design software applications. Presently, such applications use pipelining methods to utilise many third party applications and this involves file parsing, interfacing and data conversion, which is slow and prone to error. A fully integrated suite of software tools for primer design would considerably improve the development time, the processing speed, and the reliability of bespoke primer design software applications. Results The PD5 software library is an open-source collection of classes and utilities, providing a complete collection of software building blocks for primer design and analysis. It is written in object-oriented C++ with an emphasis on classes suitable for efficient and rapid development of bespoke primer design programs. The modular design of the software library simplifies the development of specific applications and also integration with existing third party software where necessary. We demonstrate several applications created using this software library that have already proved to be effective, but we view the project as a dynamic environment for building primer design software and it is open for future development by the bioinformatics community. Therefore, the PD5 software library is published under the terms of the GNU General Public License, which guarantee access to source-code and allow redistribution and modification. Conclusions The PD5 software library is downloadable from Google Code and the accompanying Wiki includes instructions and examples: http://code.google.com/p/primer-design PMID:24278254

  20. Analysis of the track- and dose-averaged LET and LET spectra in proton therapy using the geant4 Monte Carlo code

    PubMed Central

    Guan, Fada; Peeler, Christopher; Bronk, Lawrence; Geng, Changran; Taleei, Reza; Randeniya, Sharmalee; Ge, Shuaiping; Mirkovic, Dragan; Grosshans, David; Mohan, Radhe; Titt, Uwe

    2015-01-01

Purpose: The motivation of this study was to find and eliminate the cause of errors in dose-averaged linear energy transfer (LET) calculations from therapeutic protons in small targets, such as biological cell layers, calculated using the Geant4 Monte Carlo code. Furthermore, the purpose was also to provide a recommendation to select an appropriate LET quantity from Geant4 simulations to correlate with biological effectiveness of therapeutic protons. Methods: The authors developed a particle tracking step based strategy to calculate the average LET quantities (track-averaged LET, LETt and dose-averaged LET, LETd) using Geant4 for different tracking step size limits. A step size limit refers to the maximally allowable tracking step length. The authors investigated how the tracking step size limit influenced the calculated LETt and LETd of protons with six different step limits ranging from 1 to 500 μm in a water phantom irradiated by a 79.7-MeV clinical proton beam. In addition, the authors analyzed the detailed stochastic energy deposition information including fluence spectra and dose spectra of the energy-deposition-per-step of protons. As a reference, the authors also calculated the averaged LET and analyzed the LET spectra combining the Monte Carlo method and the deterministic method. Relative biological effectiveness (RBE) calculations were performed to illustrate the impact of different LET calculation methods on the RBE-weighted dose. Results: Simulation results showed that the step limit effect was small for LETt but significant for LETd. This resulted from differences in the energy-deposition-per-step between the fluence spectra and dose spectra at different depths in the phantom. Using the Monte Carlo particle tracking method in Geant4 can result in incorrect LETd calculation results in the dose plateau region for small step limits. The erroneous LETd results can be attributed to the algorithm to determine fluctuations in energy deposition along the…
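    The track- and dose-averaged LET quantities discussed above follow from the same per-step energy deposits, weighted differently. A minimal sketch of the usual step-based definitions, with illustrative toy steps rather than the paper's data:

```python
# Track- and dose-averaged LET from per-step energy deposits,
# using the standard step-based definitions:
#   LET_t = Σ ε_i / Σ l_i          (track-length weighting)
#   LET_d = Σ ε_i·(ε_i/l_i) / Σ ε_i (dose weighting)
# The step data below are illustrative, not from the paper.
def averaged_let(steps):
    """steps: list of (energy_deposit_keV, step_length_um) tuples."""
    total_e = sum(e for e, l in steps)
    total_l = sum(l for e, l in steps)
    let_t = total_e / total_l                            # keV/μm
    let_d = sum(e * (e / l) for e, l in steps) / total_e  # keV/μm
    return let_t, let_d

steps = [(1.0, 1.0), (4.0, 1.0)]  # two steps with LETs of 1 and 4 keV/μm
lt, ld = averaged_let(steps)
print(lt, ld)  # LET_d weights the high-LET step more strongly than LET_t
```

    The example makes the abstract's point concrete: LETd is more sensitive than LETt to how energy deposits are partitioned into steps, since each deposit enters LETd quadratically.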

  1. Educational software usability: Artifact or Design?

    PubMed

    Van Nuland, Sonya E; Eagleson, Roy; Rogers, Kem A

    2017-03-01

    Online educational technologies and e-learning tools are providing new opportunities for students to learn worldwide, and they continue to play an important role in anatomical sciences education. Yet, as we shift to teaching online, particularly within the anatomical sciences, it has become apparent that e-learning tool success is based on more than just user satisfaction and preliminary learning outcomes-rather it is a multidimensional construct that should be addressed from an integrated perspective. The efficiency, effectiveness and satisfaction with which a user can navigate an e-learning tool is known as usability, and represents a construct which we propose can be used to quantitatively evaluate e-learning tool success. To assess the usability of an e-learning tool, usability testing should be employed during the design and development phases (i.e., prior to its release to users) as well as during its delivery (i.e., following its release to users). However, both the commercial educational software industry and individual academic developers in the anatomical sciences have overlooked the added value of additional usability testing. Reducing learner frustration and anxiety during e-learning tool use is essential in ensuring e-learning tool success, and will require a commitment on the part of the developers to engage in usability testing during all stages of an e-learning tool's life cycle. Anat Sci Educ 10: 190-199. © 2016 American Association of Anatomists.

  2. Engineering Software Suite Validates System Design

    NASA Technical Reports Server (NTRS)

    2007-01-01

EDAptive Computing Inc.'s (ECI) EDAstar engineering software tool suite, created to capture and validate system design requirements, was significantly funded by NASA's Ames Research Center through five Small Business Innovation Research (SBIR) contracts. These programs specifically developed Syscape, used to capture executable specifications of multi-disciplinary systems, and VectorGen, used to automatically generate tests to ensure system implementations meet specifications. According to the company, the VectorGen tests considerably reduce the time and effort required to validate implementation of components, thereby ensuring their safe and reliable operation. EDASHIELD, an additional product offering from ECI, can be used to diagnose, predict, and correct errors after a system has been deployed, using EDAstar-created models. Initial commercialization for EDAstar included application by a large prime contractor in a military setting, and customers include various branches within the U.S. Department of Defense, industry giants like the Lockheed Martin Corporation, Science Applications International Corporation, and Ball Aerospace and Technologies Corporation, as well as NASA's Langley and Glenn Research Centers.

  3. Measuring the development process: A tool for software design evaluation

    NASA Technical Reports Server (NTRS)

    Moy, S. S.

    1980-01-01

    The design metrics evaluator (DME), a component of an automated software design analysis system, is described. The DME quantitatively evaluates software design attributes. Its use directs attention to areas of a procedure, module, or complete program having a high potential for error.

  4. Designing Flexible Software for the "Electronic Board."

    ERIC Educational Resources Information Center

    Hativa, Nira

    1984-01-01

    Argues that software for electronic boards should address a variety of teaching styles, student abilities and ages, class textbooks, teaching objectives, and learning environments for flexibility of use. The software features that contribute to flexibility include frequent stops, options for going backwards, inter- and intra-unit jumps, and…

  5. NASA software specification and evaluation system design, part 2

    NASA Technical Reports Server (NTRS)

    1976-01-01

    A survey and analysis of the existing methods, tools and techniques employed in the development of software are presented along with recommendations for the construction of reliable software. Functional designs for software specification language, and the data base verifier are presented.

  6. SEPAC flight software detailed design specifications, volume 1

    NASA Technical Reports Server (NTRS)

    1982-01-01

The detailed design specifications (as built) for the SEPAC Flight Software are defined. The design includes a description of the total software system and of each individual module within the system. The design specifications describe the decomposition of the software system into its major components. The system structure is expressed in the following forms: the control-flow hierarchy of the system, the data-flow structure of the system, the task hierarchy, the memory structure, and the software to hardware configuration mapping. The component design description includes details on the following elements: register conventions, module (subroutine) invocation, module functions, interrupt servicing, data definitions, and database structure.

  7. Development of a Geant4 application to characterise a prototype neutron detector based on three orthogonal (3)He tubes inside an HDPE sphere.

    PubMed

    Gracanin, V; Guatelli, S; Prokopovich, D; Rosenfeld, A B; Berry, A

    2017-01-01

    The Bonner Sphere Spectrometer (BSS) system is a well-established technique for neutron dosimetry that involves detection of thermal neutrons within a range of hydrogenous moderators. BSS detectors are often used to perform neutron field surveys in order to determine the ambient dose equivalent H*(10) and estimate health risk to personnel. There is a potential limitation of existing neutron survey techniques, since some detectors do not consider the direction of the neutron field, which can result in overly conservative estimates of dose in neutron fields. This paper shows the development of a Geant4 simulation application to characterise a prototype neutron detector based on three orthogonal (3)He tubes inside a single HDPE sphere built at the Australian Nuclear Science and Technology Organisation (ANSTO). The Geant4 simulation has been validated with respect to experimental measurements performed with an Am-Be source.

  8. Determination of age specific ¹³¹I S-factor values for thyroid using anthropomorphic phantom in Geant4 simulations.

    PubMed

    Rahman, Ziaur; Ahmad, Syed Bilal; Mirza, Sikander M; Arshed, Waheed; Mirza, Nasir M; Ahmed, Waheed

    2014-08-01

Using an anthropomorphic phantom in Geant4, the β- and γ-absorbed fractions and the energy absorbed per event due to (131)I activity in the thyroid have been determined for individuals of various age groups and geometrical models. In the case of (131)I β-particles, the values of the absorbed fraction increased from 0.88 to 0.97 with fetal age. The maximum difference in absorbed energy per decay between soft tissue and water is 7.2% for γ-rays and 0.4% for β-particles. The new mathematical MIRD embedded in Geant4 (MEG) and two-lobe ellipsoidal models developed in this work have 4.3% and 2.9% lower S-factor values, respectively, compared with the ORNL data.
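    The S-factor compared above combines exactly the quantities the abstract mentions: the mean energy emitted per decay for each radiation type, the absorbed fraction in the target, and the target mass. A minimal MIRD-style sketch; the emission energies, absorbed fractions, and thyroid mass below are illustrative placeholders, not the paper's values:

```python
# MIRD-style S-factor:  S = Σ_i Δ_i·φ_i / m
# Δ_i: mean energy emitted per decay (MeV), φ_i: absorbed fraction,
# m: target mass (kg).  All numbers here are illustrative only.
MEV_TO_J = 1.602176634e-13

def s_factor(emissions, mass_kg):
    """emissions: list of (delta_MeV_per_decay, absorbed_fraction).
    Returns absorbed dose per decay in Gy."""
    absorbed_J = sum(delta * MEV_TO_J * phi for delta, phi in emissions)
    return absorbed_J / mass_kg

# e.g. a beta component of 0.19 MeV/decay with φ = 0.95 and a gamma
# component of 0.38 MeV/decay with φ = 0.05, in a 20 g thyroid:
s = s_factor([(0.19, 0.95), (0.38, 0.05)], 0.020)
```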

  9. WinTRAX: A raytracing software package for the design of multipole focusing systems

    NASA Astrophysics Data System (ADS)

    Grime, G. W.

    2013-07-01

    The software package TRAX was a simulation tool for modelling the path of charged particles through linear cylindrical multipole fields described by analytical expressions and was a development of the earlier OXRAY program (Grime and Watt, 1983; Grime et al., 1982) [1,2]. In a 2005 comparison of raytracing software packages (Incerti et al., 2005) [3], TRAX/OXRAY was compared with Geant4 and Zgoubi and was found to give close agreement with the more modern codes. TRAX was a text-based program which was only available for operation in a now rare VMS workstation environment, so a new program, WinTRAX, has been developed for the Windows operating system. This implements the same basic computing strategy as TRAX, and key sections of the code are direct translations from FORTRAN to C++, but the Windows environment is exploited to make an intuitive graphical user interface which simplifies and enhances many operations including system definition and storage, optimisation, beam simulation (including with misaligned elements) and aberration coefficient determination. This paper describes the program and presents comparisons with other software and real installations.
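    Raytracing through the analytic multipole fields mentioned above reduces, to first order, to multiplying each ray by an element transfer matrix. A sketch for a single focusing quadrupole plane, with an assumed focusing strength and length (not values from WinTRAX):

```python
# First-order ray transfer through a magnetic quadrupole, the kind of
# element a program like TRAX/WinTRAX traces through.  In the focusing
# plane M = [[cos(kL), sin(kL)/k], [-k·sin(kL), cos(kL)]].
# k and L below are illustrative, not from the paper.
import math

def quad_focusing_matrix(k, length):
    c, s = math.cos(k * length), math.sin(k * length)
    return [[c, s / k], [-k * s, c]]

def trace(matrix, ray):
    """ray = (x [m], x' [rad]); returns the transformed ray."""
    x, xp = ray
    return (matrix[0][0] * x + matrix[0][1] * xp,
            matrix[1][0] * x + matrix[1][1] * xp)

m = quad_focusing_matrix(k=10.0, length=0.1)   # k in m⁻¹, L in m
x_out, xp_out = trace(m, (1e-3, 0.0))          # 1 mm off-axis, parallel ray
```

    A full raytracer integrates the equations of motion through the actual field profile (including fringe fields and misalignments, as WinTRAX does); the matrix form is the idealized limit it should reproduce.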

  10. GATE as a GEANT4-based Monte Carlo platform for the evaluation of proton pencil beam scanning treatment plans.

    PubMed

    Grevillot, L; Bertrand, D; Dessy, F; Freud, N; Sarrut, D

    2012-07-07

    Active scanning delivery systems take full advantage of ion beams to best conform to the tumor and to spare surrounding healthy tissues; however, it is also a challenging technique for quality assurance. In this perspective, we upgraded the GATE/GEANT4 Monte Carlo platform in order to recalculate the treatment planning system (TPS) dose distributions for active scanning systems. A method that allows evaluating the TPS dose distributions with the GATE Monte Carlo platform has been developed and applied to the XiO TPS (Elekta), for the IBA proton pencil beam scanning (PBS) system. First, we evaluated the specificities of each dose engine. A dose-conversion scheme that allows one to convert dose to medium into dose to water was implemented within GATE. Specific test cases in homogeneous and heterogeneous configurations allowed for the estimation of the differences between the beam models implemented in XiO and GATE. Finally, dose distributions of a prostate treatment plan were compared. In homogeneous media, a satisfactory agreement was generally obtained between XiO and GATE. The maximum stopping power difference of 3% occurred in a human tissue of 0.9 g cm(-3) density and led to a significant range shift. Comparisons in heterogeneous configurations pointed out the limits of the TPS dose calculation accuracy and the superiority of Monte Carlo simulations. The necessity of computing dose to water in our Monte Carlo code for comparisons with TPSs is also presented. Finally, the new capabilities of the platform are applied to a prostate treatment plan and dose differences between both dose engines are analyzed in detail. This work presents a generic method to compare TPS dose distributions with the GATE Monte Carlo platform. It is noteworthy that GATE is also a convenient tool for imaging applications, therefore opening new research possibilities for the PBS modality.
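    The dose-conversion scheme mentioned above rests on a simple relation: dose to water equals dose to medium scaled by the water-to-medium mass stopping-power ratio. A minimal sketch with an illustrative ratio (the actual ratios depend on material and proton energy):

```python
# Dose-to-medium → dose-to-water conversion of the kind described for
# GATE: D_water = D_medium × (S/ρ)^water_medium, the water-to-medium
# mass stopping-power ratio.  The ratio value below is illustrative.
def dose_to_water(dose_medium_gy, stopping_power_ratio_w_m):
    return dose_medium_gy * stopping_power_ratio_w_m

# A 3% stopping-power difference (ratio 1.03), as in the tissue cited
# in the abstract, shifts a 2 Gy voxel by 0.06 Gy:
d_w = dose_to_water(2.0, 1.03)
```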

  11. Benchmarking and validation of a Geant4-SHADOW Monte Carlo simulation for dose calculations in microbeam radiation therapy.

    PubMed

    Cornelius, Iwan; Guatelli, Susanna; Fournier, Pauline; Crosbie, Jeffrey C; Sanchez Del Rio, Manuel; Bräuer-Krisch, Elke; Rosenfeld, Anatoly; Lerch, Michael

    2014-05-01

    Microbeam radiation therapy (MRT) is a synchrotron-based radiotherapy modality that uses high-intensity beams of spatially fractionated radiation to treat tumours. The rapid evolution of MRT towards clinical trials demands accurate treatment planning systems (TPS), as well as independent tools for the verification of TPS calculated dose distributions in order to ensure patient safety and treatment efficacy. Monte Carlo computer simulation represents the most accurate method of dose calculation in patient geometries and is best suited for the purpose of TPS verification. A Monte Carlo model of the ID17 biomedical beamline at the European Synchrotron Radiation Facility has been developed, including recent modifications, using the Geant4 Monte Carlo toolkit interfaced with the SHADOW X-ray optics and ray-tracing libraries. The code was benchmarked by simulating dose profiles in water-equivalent phantoms subject to irradiation by broad-beam (without spatial fractionation) and microbeam (with spatial fractionation) fields, and comparing against those calculated with a previous model of the beamline developed using the PENELOPE code. Validation against additional experimental dose profiles in water-equivalent phantoms subject to broad-beam irradiation was also performed. Good agreement between codes was observed, with the exception of out-of-field doses and toward the field edge for larger field sizes. Microbeam results showed good agreement between both codes and experimental results within uncertainties. Results of the experimental validation showed agreement for different beamline configurations. The asymmetry in the out-of-field dose profiles due to polarization effects was also investigated, yielding important information for the treatment planning process in MRT. This work represents an important step in the development of a Monte Carlo-based independent verification tool for treatment planning in MRT.

  12. Use of the GEANT4 Monte Carlo to determine three-dimensional dose factors for radionuclide dosimetry

    NASA Astrophysics Data System (ADS)

    Amato, Ernesto; Italiano, Antonio; Minutoli, Fabio; Baldari, Sergio

    2013-04-01

The voxel-level dosimetry is the simplest and most common approach to internal dosimetry of nonuniform distributions of activity within the human body. The aim of this work was to obtain the dose "S" factors (mGy/(MBq·s)) at the voxel level for eight beta and beta-gamma emitting radionuclides commonly used in nuclear medicine diagnostic and therapeutic procedures. We developed a Monte Carlo simulation in GEANT4 of a region of soft tissue as defined by the ICRP, divided into 11×11×11 cubic voxels, 3 mm on a side. The simulation used the parameterizations of the electromagnetic interaction optimized for low energy (EEDL, EPDL). The decay of each radionuclide (32P, 90Y, 99mTc, 177Lu, 131I, 153Sm, 186Re, 188Re) was simulated as homogeneously distributed within the central voxel (0,0,0), and the energy deposited in the surrounding voxels was averaged over the 8 octants of three-dimensional space, for reasons of symmetry. The results obtained were compared with those available in the literature. While the iodine deviations remain within 16%, for phosphorus, a pure beta emitter, the agreement is very good for self-dose (0,0,0) and good for the dose to first neighbors, while differences are observed ranging from -60% to +100% for voxels far distant from the source. The existence of significant differences in the percentage calculation of the voxel S factors, especially for pure beta emitters such as 32P or 90Y, has already been highlighted by other authors. These data can usefully extend the voxel-based dosimetric approach to other radionuclides not covered in the available literature.
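    The octant averaging mentioned above exploits the symmetry of a source centred at (0,0,0): the dose at voxel (i,j,k) should equal that at every sign combination (±i,±j,±k), so averaging the mirror voxels reduces statistical noise. A small sketch with toy dose values (not the paper's data):

```python
# Octant averaging for voxel S-factors: by symmetry the dose at (i,j,k)
# equals that at (±i,±j,±k), so the 8 mirror voxels can be averaged.
# The dose values below are illustrative toy data.
from itertools import product

def octant_average(dose, i, j, k):
    mirrors = {(si * i, sj * j, sk * k)
               for si, sj, sk in product((1, -1), repeat=3)}
    # a set is used because on-axis voxels mirror onto themselves
    vals = [dose[m] for m in mirrors]
    return sum(vals) / len(vals)

dose = {(1, 1, 1): 1.0, (-1, 1, 1): 1.2, (1, -1, 1): 0.8, (-1, -1, 1): 1.0,
        (1, 1, -1): 1.1, (-1, 1, -1): 0.9, (1, -1, -1): 1.0, (-1, -1, -1): 1.0}
avg = octant_average(dose, 1, 1, 1)
```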

  13. GATE as a GEANT4-based Monte Carlo platform for the evaluation of proton pencil beam scanning treatment plans

    NASA Astrophysics Data System (ADS)

    Grevillot, L.; Bertrand, D.; Dessy, F.; Freud, N.; Sarrut, D.

    2012-07-01

    Active scanning delivery systems take full advantage of ion beams to best conform to the tumor and to spare surrounding healthy tissues; however, it is also a challenging technique for quality assurance. In this perspective, we upgraded the GATE/GEANT4 Monte Carlo platform in order to recalculate the treatment planning system (TPS) dose distributions for active scanning systems. A method that allows evaluating the TPS dose distributions with the GATE Monte Carlo platform has been developed and applied to the XiO TPS (Elekta), for the IBA proton pencil beam scanning (PBS) system. First, we evaluated the specificities of each dose engine. A dose-conversion scheme that allows one to convert dose to medium into dose to water was implemented within GATE. Specific test cases in homogeneous and heterogeneous configurations allowed for the estimation of the differences between the beam models implemented in XiO and GATE. Finally, dose distributions of a prostate treatment plan were compared. In homogeneous media, a satisfactory agreement was generally obtained between XiO and GATE. The maximum stopping power difference of 3% occurred in a human tissue of 0.9 g cm-3 density and led to a significant range shift. Comparisons in heterogeneous configurations pointed out the limits of the TPS dose calculation accuracy and the superiority of Monte Carlo simulations. The necessity of computing dose to water in our Monte Carlo code for comparisons with TPSs is also presented. Finally, the new capabilities of the platform are applied to a prostate treatment plan and dose differences between both dose engines are analyzed in detail. This work presents a generic method to compare TPS dose distributions with the GATE Monte Carlo platform. It is noteworthy that GATE is also a convenient tool for imaging applications, therefore opening new research possibilities for the PBS modality.

  14. Energy deposition in small-scale targets of liquid water using the very low energy electromagnetic physics processes of the Geant4 toolkit

    NASA Astrophysics Data System (ADS)

    Incerti, S.; Champion, C.; Tran, H. N.; Karamitros, M.; Bernal, M.; Francis, Z.; Ivanchenko, V.; Mantero, A.; Members of Geant4-DNA Collaboration

    2013-07-01

    In the perspective of building an open source simulation platform dedicated to the modelling of early biological molecular damages due to ionising radiation at the DNA scale, the general-purpose Geant4 Monte Carlo simulation toolkit has been recently extended with specific very low energy electromagnetic physics processes for liquid water medium. These processes - also called “Geant4-DNA” processes - simulate the physical interactions induced by electrons, hydrogen and helium atoms of different charge states. The present work reports on the energy deposit distributions obtained for incident electrons, protons and alpha particles in nanometre-size volumes comparable to those present in the genetic material of mammalian cells. The frequency distributions of the energy deposition obtained for three typical geometries of nanometre-size cylindrical targets placed in a spherical phantom are found to be in reasonable agreement with prior works. Furthermore, we present a combination of the Geant4-DNA processes with a simplified geometrical model of a cellular nucleus allowing the evaluation of energy deposits in volumes of biological interest.
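    The frequency distributions reported above are histograms of the energy deposited per event in a nanometric target. A minimal sketch of how such a distribution is tallied, with invented deposit values and an arbitrary bin width:

```python
# Frequency distribution of energy deposits per event in a small target,
# the quantity the abstract compares against prior works.
# The deposits and bin width below are illustrative only.
from collections import Counter

def deposit_frequency(deposits_eV, bin_width_eV):
    """Histogram per-event energy deposits into fixed-width bins;
    returns {bin_lower_edge_eV: relative frequency}."""
    bins = Counter((d // bin_width_eV) * bin_width_eV for d in deposits_eV)
    n = len(deposits_eV)
    return {edge: count / n for edge, count in sorted(bins.items())}

freq = deposit_frequency([12, 37, 55, 18, 41, 60, 22, 39], bin_width_eV=20)
```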

  15. Comparison of PHITS, GEANT4, and HIBRAC simulations of depth-dependent yields of β(+)-emitting nuclei during therapeutic particle irradiation to measured data.

    PubMed

    Rohling, Heide; Sihver, Lembit; Priegnitz, Marlen; Enghardt, Wolfgang; Fiedler, Fine

    2013-09-21

    For quality assurance in particle therapy, a non-invasive, in vivo range verification is highly desired. Particle therapy positron-emission-tomography (PT-PET) is the only clinically proven method up to now for this purpose. It makes use of the β(+)-activity produced during the irradiation by the nuclear fragmentation processes between the therapeutic beam and the irradiated tissue. Since a direct comparison of β(+)-activity and dose is not feasible, a simulation of the expected β(+)-activity distribution is required. For this reason it is essential to have a quantitatively reliable code for the simulation of the yields of the β(+)-emitting nuclei at every position of the beam path. In this paper results of the three-dimensional Monte-Carlo simulation codes PHITS, GEANT4, and the one-dimensional deterministic simulation code HIBRAC are compared to measurements of the yields of the most abundant β(+)-emitting nuclei for carbon, lithium, helium, and proton beams. In general, PHITS underestimates the yields of positron-emitters. With GEANT4 the overall most accurate results are obtained. HIBRAC and GEANT4 provide comparable results for carbon and proton beams. HIBRAC is considered as a good candidate for the implementation to clinical routine PT-PET.

  16. Development of Distributed Computing Systems Software Design Methodologies.

    DTIC Science & Technology

    1982-11-05

Final report on the development of distributed computing systems software design methodologies, by Stephen S. Yau, Northwestern University, Evanston, IL, Department of Electrical Engineering.

  17. Design Features of Pedagogically-Sound Software in Mathematics.

    ERIC Educational Resources Information Center

    Haase, Howard; And Others

    Weaknesses in educational software currently available in the domain of mathematics are discussed. A technique that was used for the design and production of mathematics software aimed at improving problem-solving skills which combines sound pedagogy and innovative programming is presented. To illustrate the design portion of this technique, a…

  18. Software design for professional risk evaluation

    NASA Astrophysics Data System (ADS)

    Ionescu, V.; Calea, G.; Amza, G.; Iacobescu, G.; Nitoi, D.; Dimitrescu, A.

    2016-08-01

Professional risk evaluation is a complex activity involving every economic operator, with important repercussions on health and safety at work. The article presents an innovative study method for professional risk analysis in which cumulative working posts are evaluated. The work presents new software that helps bring together all the working positions of a complex organizational system and analyze them in order to evaluate possible risks. Using this software, multiple analyses can be performed: risk estimation, risk evaluation, estimation of residual risks and, finally, a search for risk reduction measures.

  19. Designing the Undesignable: Social Software and Control

    ERIC Educational Resources Information Center

    Dron, Jon

    2007-01-01

    Social software, such as blogs, wikis, tagging systems and collaborative filters, treats the group as a first-class object within the system. Drawing from theories of transactional distance and control, this paper proposes a model of e-learning that extends traditional concepts of learner-teacher-content interactions to include these emergent…

  20. Team Software Development for Aerothermodynamic and Aerodynamic Analysis and Design

    NASA Technical Reports Server (NTRS)

    Alexandrov, N.; Atkins, H. L.; Bibb, K. L.; Biedron, R. T.; Carpenter, M. H.; Gnoffo, P. A.; Hammond, D. P.; Jones, W. T.; Kleb, W. L.; Lee-Rausch, E. M.

    2003-01-01

    A collaborative approach to software development is described. The approach employs the agile development techniques: project retrospectives, Scrum status meetings, and elements of Extreme Programming to efficiently develop a cohesive and extensible software suite. The software product under development is a fluid dynamics simulator for performing aerodynamic and aerothermodynamic analysis and design. The functionality of the software product is achieved both through the merging, with substantial rewrite, of separate legacy codes and the authorship of new routines. Examples of rapid implementation of new functionality demonstrate the benefits obtained with this agile software development process. The appendix contains a discussion of coding issues encountered while porting legacy Fortran 77 code to Fortran 95, software design principles, and a Fortran 95 coding standard.

  1. The application of image processing software: Photoshop in environmental design

    NASA Astrophysics Data System (ADS)

    Dong, Baohua; Zhang, Chunmi; Zhuo, Chen

    2011-02-01

In the process of environmental design and creation, the design sketch holds a very important position in that it not only illuminates the design's idea and concept but also shows the design's visual effects to the client. In the field of environmental design, computer aided design has made significant improvement. Many types of specialized design software for environmental performance drawings and post artistic processing have been implemented. Additionally, with the use of this software, working efficiency has greatly increased and drawings have become more specific and more specialized. By analyzing the application of Photoshop image processing software in environmental design, and comparing and contrasting traditional hand drawing with drawing using modern technology, this essay further explores how computer technology can play a bigger role in environmental design.

  2. Designing Distributed Learning Environments with Intelligent Software Agents

    ERIC Educational Resources Information Center

    Lin, Fuhua, Ed.

    2005-01-01

    "Designing Distributed Learning Environments with Intelligent Software Agents" reports on the most recent advances in agent technologies for distributed learning. Chapters are devoted to the various aspects of intelligent software agents in distributed learning, including the methodological and technical issues on where and how intelligent agents…

  3. Training Software Developers and Designers to Conduct Usability Evaluations

    ERIC Educational Resources Information Center

    Skov, Mikael Brasholt; Stage, Jan

    2012-01-01

    Many efforts to improve the interplay between usability evaluation and software development rely either on better methods for conducting usability evaluations or on better formats for presenting evaluation results in ways that are useful for software designers and developers. Both of these approaches depend on a complete division of work between…

  4. Designing Computer Software for Problem-Solving Instruction.

    ERIC Educational Resources Information Center

    Duffield, Judith A.

    1991-01-01

    Discusses factors that might influence the effectiveness of computer software designed to teach problem solving. Topics discussed include the structure of knowledge; transfer of training; computers and problem solving instruction; human-computer interactions; and types of software, including drill and practice programs, tutorials, instructional…

  5. Analysis of the track- and dose-averaged LET and LET spectra in proton therapy using the GEANT4 Monte Carlo code

    SciTech Connect

    Guan, Fada; Peeler, Christopher; Taleei, Reza; Randeniya, Sharmalee; Ge, Shuaiping; Mirkovic, Dragan; Mohan, Radhe; Titt, Uwe; Bronk, Lawrence; Geng, Changran; Grosshans, David

    2015-11-15

Purpose: The motivation of this study was to find and eliminate the cause of errors in dose-averaged linear energy transfer (LET) calculations from therapeutic protons in small targets, such as biological cell layers, calculated using the GEANT4 Monte Carlo code. Furthermore, the purpose was also to provide a recommendation to select an appropriate LET quantity from GEANT4 simulations to correlate with biological effectiveness of therapeutic protons. Methods: The authors developed a particle tracking step based strategy to calculate the average LET quantities (track-averaged LET, LETt and dose-averaged LET, LETd) using GEANT4 for different tracking step size limits. A step size limit refers to the maximally allowable tracking step length. The authors investigated how the tracking step size limit influenced the calculated LETt and LETd of protons with six different step limits ranging from 1 to 500 μm in a water phantom irradiated by a 79.7-MeV clinical proton beam. In addition, the authors analyzed the detailed stochastic energy deposition information including fluence spectra and dose spectra of the energy-deposition-per-step of protons. As a reference, the authors also calculated the averaged LET and analyzed the LET spectra combining the Monte Carlo method and the deterministic method. Relative biological effectiveness (RBE) calculations were performed to illustrate the impact of different LET calculation methods on the RBE-weighted dose. Results: Simulation results showed that the step limit effect was small for LETt but significant for LETd. This resulted from differences in the energy-deposition-per-step between the fluence spectra and dose spectra at different depths in the phantom. Using the Monte Carlo particle tracking method in GEANT4 can result in incorrect LETd calculation results in the dose plateau region for small step limits. The erroneous LETd results can be attributed to the algorithm to…

  6. SWEPP Gamma-Ray Spectrometer System software design description

    SciTech Connect

    Femec, D.A.; Killian, E.W.

    1994-08-01

    To assist in the characterization of the radiological contents of contract-handled waste containers at the Stored Waste Examination Pilot Plant (SWEPP), the SWEPP Gamma-Ray Spectrometer (SGRS) System has been developed by the Radiation Measurements and Development Unit of the Idaho National Engineering Laboratory. The SGRS system software controls turntable and detector system activities. In addition to determining the concentrations of gamma-ray-emitting radionuclides, this software also calculates attenuation-corrected isotopic mass ratios of-specific interest. This document describes the software design for the data acquisition and analysis software associated with the SGRS system.

  7. Dose calculations at high altitudes and in deep space with GEANT4 using BIC and JQMD models for nucleus nucleus reactions

    NASA Astrophysics Data System (ADS)

    Sihver, L.; Matthiä, D.; Koi, T.; Mancusi, D.

    2008-10-01

Radiation exposure of aircrew is more and more recognized as an occupational hazard. The ionizing environment at standard commercial aircraft flight altitudes consists mainly of secondary particles, of which the neutrons give a major contribution to the dose equivalent. Accurate estimations of neutron spectra in the atmosphere are therefore essential for correct calculations of aircrew doses. Energetic solar particle events (SPE) could also lead to significantly increased dose rates, especially on routes close to the North Pole, e.g. for flights between Europe and the USA. It is also well known that the radiation environment encountered by personnel aboard low Earth orbit (LEO) spacecraft or aboard a spacecraft traveling outside the Earth's protective magnetosphere is much harsher compared with that within the atmosphere, since the personnel are exposed to radiation from both galactic cosmic rays (GCR) and SPE. The relative contribution to the dose from GCR when traveling outside the Earth's magnetosphere, e.g. to the Moon or Mars, is even greater, and reliable and accurate particle and heavy ion transport codes are essential to calculate the radiation risks for both aircrew and personnel on spacecraft. We have therefore performed calculations of neutron distributions in the atmosphere, total dose equivalents, and quality factors at different depths in a water sphere in an imaginary spacecraft during solar minimum in a geosynchronous orbit. The calculations were performed with the GEANT4 Monte Carlo (MC) code using both the binary cascade (BIC) model, which is part of the standard GEANT4 package, and the JQMD model, which is used in the particle and heavy ion transport code PHITS.

  8. Pvarray: A software tool for photovoltaic array design

    NASA Technical Reports Server (NTRS)

    Burger, D. R.

    1985-01-01

The application of PVARRAY, a software program for the design of photovoltaic arrays, is described. Results of sample parametric studies on array configurations are presented. It is concluded that PVARRAY could simulate a variety of configurations.

  9. Geant4 simulation of the Elekta XVI kV CBCT unit for accurate description of potential late toxicity effects of image-guided radiotherapy.

    PubMed

    Brochu, F M; Burnet, N G; Jena, R; Plaistow, R; Parker, M A; Thomas, S J

    2014-12-21

This paper describes the modelling of the Elekta XVI Cone Beam Computed Tomography (CBCT) machine components with Geant4 and its validation against calibration data taken for two commonly used machine setups. Preliminary dose maps of simulated CBCTs derived from this modelling work are presented. This study is the first step of a research project, GHOST, which aims to improve the understanding of late toxicity risk in external beam radiotherapy patients by simulating dose depositions integrated from different sources (imaging, treatment beam) over the entire treatment plan. Second cancer risk will then be derived from different models relating irradiation dose to second cancer risk.

  10. Geant4 simulation of the Elekta XVI kV CBCT unit for accurate description of potential late toxicity effects of image-guided radiotherapy

    NASA Astrophysics Data System (ADS)

    Brochu, F. M.; Burnet, N. G.; Jena, R.; Plaistow, R.; Parker, M. A.; Thomas, S. J.

    2014-12-01

This paper describes the modelling of the Elekta XVI Cone Beam Computed Tomography (CBCT) machine components with Geant4 and its validation against calibration data taken for two commonly used machine setups. Preliminary dose maps of simulated CBCTs derived from this modelling work are presented. This study is the first step of a research project, GHOST, which aims to improve the understanding of late toxicity risk in external beam radiotherapy patients by simulating dose depositions integrated from different sources (imaging, treatment beam) over the entire treatment plan. Second cancer risk will then be derived from different models relating irradiation dose to second cancer risk.

  11. The reduction techniques of the particle background for the ATHENA X-IFU instrument at L2 orbit: Geant4 and the CryoAC

    NASA Astrophysics Data System (ADS)

Macculi, C.; Piro, L.; Gatti, F.; Lotti, S.; Argan, A.; Laurenza, M.; D'Andrea, M.; Torrioli, G.; Biasotti, M.; Corsini, D.; Orlando, A.; Mineo, T.; D'Ai, A.; Molendi, S.; Gastaldello, F.; Bulgarelli, A.; Fioretti, V.; Jacquey, C.; Laurent, P.

    2015-09-01

We present the particle background reduction techniques aimed at increasing the X-IFU sensitivity, which is degraded by primary protons of both solar and cosmic-ray origin and by secondary electrons. The adopted solutions involve Monte Carlo simulations with the Geant4 toolkit, applied to the payload mass model to estimate the "expected" background in the L2 orbit and combined with ray tracing to evaluate the soft-proton component focused by the optics onto the main detector, together with the development of an active Cryogenic AntiCoincidence detector and a passive electron shield to meet the scientific requirements.

  12. Geant4-based Simulation Study of PEP-II Beam Backgrounds in the BABAR Detector at the SLAC B-Factory

    SciTech Connect

    Lockman, W.S.; Kozanecki, W.; Campbell, B.; Robertson, S.H.; Bondioli, M.; Calderini, G.; Barlow, N.; Edgar, C.L.; Aston, D.; Bower, G.; Cristinziani, M.; Fieguth, T.; Wright, D.H.; Petersen, B.A.; Blount, N.L.; Strom, D.; /Oregon U.

    2005-06-07

    To improve the understanding of accelerator-induced backgrounds at the SLAC B-Factory, we simulate lost particle backgrounds in the BABAR detector originating from beam-gas interactions and radiative Bhabha scatters. We have extended the GEANT4-based BABAR detector simulation to include beam-line components and magnetic fields up to 8.5 m away from the interaction point. We describe the simulation model and then compare preliminary predicted background levels with measurements from dedicated single- and colliding-beam experiments.

  13. Software design for automated assembly of truss structures

    NASA Technical Reports Server (NTRS)

    Herstrom, Catherine L.; Grantham, Carolyn; Allen, Cheryl L.; Doggett, William R.; Will, Ralph W.

    1992-01-01

Concern over the limited extravehicular activity time has increased the interest in performing in-space assembly and construction operations with automated robotic systems. A technique being considered at LaRC is a supervised-autonomy approach, which can be monitored by an Earth-based supervisor that intervenes only when the automated system encounters a problem. A test-bed to support evaluation of the hardware and software requirements for supervised-autonomy assembly methods was developed. This report describes the design of the software system necessary to support the assembly process. The software is hierarchical and supports both automated assembly operations and supervisor error-recovery procedures, including the capability to pause and reverse any operation. The software design serves as a model for the development of software for more sophisticated automated systems and as a test-bed for evaluation of new concepts and hardware components.
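The pause-and-reverse capability described above can be sketched with a command pattern in which every operation records its own inverse. This is an illustrative sketch (the class and operation names are hypothetical, not the LaRC system's API):

```python
# Illustrative sketch of reversible, supervisor-pausable assembly operations.
class Operation:
    def __init__(self, name, do, undo):
        self.name, self.do, self.undo = name, do, undo

class AssemblySequencer:
    def __init__(self):
        self.history = []   # completed operations, most recent last
        self.paused = False

    def execute(self, op):
        if self.paused:
            raise RuntimeError("sequencer paused; supervisor action required")
        op.do()
        self.history.append(op)

    def reverse_last(self):
        # Error recovery: undo the most recent completed operation.
        op = self.history.pop()
        op.undo()
        return op.name

# Example: "install strut" toggles a slot in a toy truss model.
truss = {"node_3_4": None}
install = Operation(
    "install strut 3-4",
    do=lambda: truss.__setitem__("node_3_4", "strut"),
    undo=lambda: truss.__setitem__("node_3_4", None),
)

seq = AssemblySequencer()
seq.execute(install)
assert truss["node_3_4"] == "strut"
seq.reverse_last()          # back the strut out
assert truss["node_3_4"] is None
```

Because each operation carries its inverse, the supervisor can unwind any prefix of the assembly sequence by repeatedly calling `reverse_last()`.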

  14. Designing application software in wide area network settings

    NASA Technical Reports Server (NTRS)

    Makpangou, Mesaac; Birman, Ken

    1990-01-01

    Progress in methodologies for developing robust local area network software has not been matched by similar results for wide area settings. The design of application software spanning multiple local area environments is examined. For important classes of applications, simple design techniques are presented that yield fault tolerant wide area programs. An implementation of these techniques as a set of tools for use within the ISIS system is described.

  15. Evaluation of commercially available lighting design software

    SciTech Connect

    McConnell, D.G.

    1990-09-01

    This report addresses the need for commercially available lighting design computer programs and evaluates several of these programs. Sandia National Laboratories uses these programs to provide lighting designs for exterior closed-circuit television camera intrusion detection assessment for high-security perimeters.

  16. Acquiring Software Design Schemas: A Machine Learning Perspective

    NASA Technical Reports Server (NTRS)

    Harandi, Mehdi T.; Lee, Hing-Yan

    1991-01-01

In this paper, we describe an approach based on machine learning that acquires software design schemas from design cases of existing applications. An overview of the technique, design representation, and acquisition system is presented. The paper also addresses issues associated with generalizing common features, such as biases. The generalization process is illustrated using an example.

  17. An overview of very high level software design methods

    NASA Technical Reports Server (NTRS)

    Asdjodi, Maryam; Hooper, James W.

    1988-01-01

    Very High Level design methods emphasize automatic transfer of requirements to formal design specifications, and/or may concentrate on automatic transformation of formal design specifications that include some semantic information of the system into machine executable form. Very high level design methods range from general domain independent methods to approaches implementable for specific applications or domains. Applying AI techniques, abstract programming methods, domain heuristics, software engineering tools, library-based programming and other methods different approaches for higher level software design are being developed. Though one finds that a given approach does not always fall exactly in any specific class, this paper provides a classification for very high level design methods including examples for each class. These methods are analyzed and compared based on their basic approaches, strengths and feasibility for future expansion toward automatic development of software systems.

  18. On Designing Lightweight Threads for Substrate Software

    NASA Technical Reports Server (NTRS)

    Haines, Matthew

    1997-01-01

    Existing user-level thread packages employ a 'black box' design approach, where the implementation of the threads is hidden from the user. While this approach is often sufficient for application-level programmers, it hides critical design decisions that system-level programmers must be able to change in order to provide efficient service for high-level systems. By applying the principles of Open Implementation Analysis and Design, we construct a new user-level threads package that supports common thread abstractions and a well-defined meta-interface for altering the behavior of these abstractions. As a result, system-level programmers will have the advantages of using high-level thread abstractions without having to sacrifice performance, flexibility or portability.

  19. Making software get along: integrating optical and mechanical design programs

    NASA Astrophysics Data System (ADS)

    Shackelford, Christie J.; Chinnock, Randal B.

    2001-03-01

As modern optomechanical engineers, we have the good fortune of having very sophisticated software programs available to us. The current optical design, mechanical design, industrial design, and CAM programs are very powerful tools with some very desirable features. However, no one program can do everything necessary to complete an entire optomechanical system design. Each program has a unique set of features and benefits, and typically two or more will be used during the product development process. At a minimum, an optical design program and a mechanical CAD package will be employed. As we strive for efficient, cost-effective, and rapid progress in our development projects, we must use these programs to their full advantage, while keeping redundant tasks to a minimum. Together, these programs offer the promise of a `seamless' flow of data from concept all the way to the download of part designs directly to the machine shop for fabrication. In reality, transferring data from one software package to the next is often frustrating. Overcoming these problems takes some know-how, a bit of creativity, and a lot of persistence. This paper describes a complex optomechanical development effort in which a variety of software tools were used from the concept stage to prototyping. It will describe what software was used for each major design task, how we learned to use them together to best advantage, and how we overcame the frustrations of software that didn't get along.

  20. Comparison of MCNPX and GEANT4 to Predict the Contribution of Non-elastic Nuclear Interactions to Absorbed Dose in Water, PMMA and A150

    NASA Astrophysics Data System (ADS)

    Shtejer, K.; Arruda-Neto, J. D. T.; Schulte, R.; Wroe, A.; Rodrigues, T. E.; de Menezes, M. O.; Moralles, M.; Guzmán, F.; Manso, M. V.

    2008-08-01

Proton induced non-elastic nuclear reactions play an important role in the dose distribution of clinically used proton beams as they deposit dose of high biological effectiveness both within the primary beam path as well as outside the beam to untargeted tissues. Non-elastic nuclear reactions can be evaluated using transport codes based on the Monte Carlo method. In this work, we have utilized the Los Alamos code MCNPX and the CERN GEANT4 toolkit, which are currently the most widely used Monte Carlo programs for proton radiation transport simulations in medical physics, to study the contribution of non-elastic nuclear interactions to the absorbed dose of proton beams in the therapeutic energy range. The impact of different available theoretical models to address the nuclear reaction process was investigated. The contribution of secondary particles from non-elastic nuclear reactions was calculated in three materials relevant in radiotherapy applications: water, PMMA and A150. The results show that there are differences in the calculated contribution of the secondary particles heavier than protons to the absorbed dose among the different approaches used to model the nuclear reactions. The MCNPX calculations give rise to a larger contribution of d, t, α and ³He to the total dose compared to the GEANT4 physical models chosen in this work.

  1. Distributions of secondary particles in proton and carbon-ion therapy: a comparison between GATE/Geant4 and FLUKA Monte Carlo codes.

    PubMed

    Robert, C; Dedes, G; Battistoni, G; Böhlen, T T; Buvat, I; Cerutti, F; Chin, M P W; Ferrari, A; Gueth, P; Kurz, C; Lestand, L; Mairani, A; Montarou, G; Nicolini, R; Ortega, P G; Parodi, K; Prezado, Y; Sala, P R; Sarrut, D; Testa, E

    2013-05-07

Monte Carlo simulations play a crucial role for in-vivo treatment monitoring based on PET and prompt gamma imaging in proton and carbon-ion therapies. The accuracy of the nuclear fragmentation models implemented in these codes might affect the quality of the treatment verification. In this paper, we investigate the nuclear models implemented in GATE/Geant4 and FLUKA by comparing the angular and energy distributions of secondary particles exiting a homogeneous target of PMMA. Comparison results were restricted to fragmentation of (16)O and (12)C. Despite the very simple target and set-up, substantial discrepancies were observed between the two codes. For instance, the number of high energy (>1 MeV) prompt gammas exiting the target was about twice as large with GATE/Geant4 as with FLUKA, both for proton and carbon ion beams. Such differences were not observed for the predicted annihilation photon production yields, for which ratios of 1.09 and 1.20 were obtained between GATE and FLUKA for the proton beam and the carbon ion beam, respectively. For neutrons and protons, discrepancies from 14% (exiting protons, carbon-ion beam) to 57% (exiting neutrons, proton beam) have been identified in production yields as well as in the energy spectra for neutrons.

  2. Comparison of MCNPX and GEANT4 to Predict the Contribution of Non-elastic Nuclear Interactions to Absorbed Dose in Water, PMMA and A150

    SciTech Connect

    Shtejer, K.; Arruda-Neto, J. D. T.; Rodrigues, T. E.; Schulte, R.; Wroe, A.; Menezes, M. O. de; Moralles, M.

    2008-08-11

Proton induced non-elastic nuclear reactions play an important role in the dose distribution of clinically used proton beams as they deposit dose of high biological effectiveness both within the primary beam path as well as outside the beam to untargeted tissues. Non-elastic nuclear reactions can be evaluated using transport codes based on the Monte Carlo method. In this work, we have utilized the Los Alamos code MCNPX and the CERN GEANT4 toolkit, which are currently the most widely used Monte Carlo programs for proton radiation transport simulations in medical physics, to study the contribution of non-elastic nuclear interactions to the absorbed dose of proton beams in the therapeutic energy range. The impact of different available theoretical models to address the nuclear reaction process was investigated. The contribution of secondary particles from non-elastic nuclear reactions was calculated in three materials relevant in radiotherapy applications: water, PMMA and A150. The results show that there are differences in the calculated contribution of the secondary particles heavier than protons to the absorbed dose among the different approaches used to model the nuclear reactions. The MCNPX calculations give rise to a larger contribution of d, t, α and ³He to the total dose compared to the GEANT4 physical models chosen in this work.

  3. An NAFP Project: Use of Object Oriented Methodologies and Design Patterns to Refactor Software Design

    NASA Technical Reports Server (NTRS)

    Shaykhian, Gholam Ali; Baggs, Rhoda

    2007-01-01

    In the early problem-solution era of software programming, functional decompositions were mainly used to design and implement software solutions. In functional decompositions, functions and data are introduced as two separate entities during the design phase, and are followed as such in the implementation phase. Functional decompositions make use of refactoring through optimizing the algorithms, grouping similar functionalities into common reusable functions, and using abstract representations of data where possible; all these are done during the implementation phase. This paper advocates the usage of object-oriented methodologies and design patterns as the centerpieces of refactoring software solutions. Refactoring software is a method of changing software design while explicitly preserving its external functionalities. The combined usage of object-oriented methodologies and design patterns to refactor should also benefit the overall software life cycle cost with improved software.

  4. Software For Design Of Life-Support Systems

    NASA Technical Reports Server (NTRS)

    Rudokas, Mary R.; Cantwell, Elizabeth R.; Robinson, Peter I.; Shenk, Timothy W.

    1991-01-01

    Design Assistant Workstation (DAWN) computer program is prototype of expert software system for analysis and design of regenerative, physical/chemical life-support systems that revitalize air, reclaim water, produce food, and treat waste. Incorporates both conventional software for quantitative mathematical modeling of physical, chemical, and biological processes and expert system offering user stored knowledge about materials and processes. Constructs task tree as it leads user through simulated process, offers alternatives, and indicates where alternative not feasible. Also enables user to jump from one design level to another.

  5. Software design to facilitate information transfer at hospital discharge.

    PubMed

    Nace, G Stephen; Graumlich, James F; Aldag, Jean C

    2006-01-01

Discharge communication between inpatient and outpatient physicians is often an inefficient and error-prone process. Adverse events result from poor communication at the time of discharge. The objective of this study was to describe the development of discharge software to overcome communication barriers. The secondary objective was to assess factors that influence the time to complete tasks with the software. Methods included a performance-improvement model and a database analysis of 336 discharges. Software design specifications included computerised physician order entry, immediate utility, minimal development and deployment costs, acceptability to physician-users, and satisfaction of primary care physicians, patients and pharmacists. Design features included simple 'just-in-time' prompts and point-of-care prescribing resources. The dependent variable for analysis was physician time to complete discharge prescriptions and instructions while using the software. General linear and mixed-effects regression models adjusted for physician effects and other predictors. Results revealed that physician factors significantly affected the time to complete a discharge while using the software. As the number of accesses (log-ins) and the amount of free-text typing increased, the time to complete the computerised discharge increased. Patient-related factors that increased physician time were discharge diagnoses, prescriptions and length of stay. In conclusion, discharge software can help inpatient physicians transfer timely, complete and legible information to outpatient physicians, pharmacists and patients. Physician and patient factors influence the time to complete discharges using the software.

  6. [Development of a software for 3D virtual phantom design].

    PubMed

    Zou, Lian; Xie, Zhao; Wu, Qi

    2014-02-01

In this paper, we present a 3D virtual phantom design software, which was developed based on object-oriented programming methodology and dedicated to medical physics research. This software, named Magical Phantom (MPhantom), is composed of a 3D visual builder module and a virtual CT scanner. The users can conveniently construct any complex 3D phantom, and then export the phantom as DICOM 3.0 CT images. MPhantom is a user-friendly and powerful software for 3D phantom configuration, and has passed a real-scene application test. MPhantom will accelerate Monte Carlo simulation for dose calculation in radiation therapy and X-ray imaging reconstruction algorithm research.

  7. Software For Computer-Aided Design Of Control Systems

    NASA Technical Reports Server (NTRS)

    Wette, Matthew

    1994-01-01

    Computer Aided Engineering System (CAESY) software developed to provide means to evaluate methods for dealing with users' needs in computer-aided design of control systems. Interpreter program for performing engineering calculations. Incorporates features of both Ada and MATLAB. Designed to be flexible and powerful. Includes internally defined functions, procedures and provides for definition of functions and procedures by user. Written in C language.

  8. Calico: An Early-Phase Software Design Tool

    ERIC Educational Resources Information Center

    Mangano, Nicolas Francisco

    2013-01-01

    When developers are faced with a design challenge, they often turn to the whiteboard. This is typical during the conceptual stages of software design, when no code is in existence yet. It may also happen when a significant code base has already been developed, for instance, to plan new functionality or discuss optimizing a key component. While…

  9. Designing Prediction Tasks in a Mathematics Software Environment

    ERIC Educational Resources Information Center

    Brunström, Mats; Fahlgren, Maria

    2015-01-01

    There is a recognised need in mathematics teaching for new kinds of tasks which exploit the affordances provided by new technology. This paper focuses on the design of prediction tasks to foster student reasoning about exponential functions in a mathematics software environment. It draws on the first iteration of a design based research study…

  10. Certification trails and software design for testability

    NASA Technical Reports Server (NTRS)

    Sullivan, Gregory F.; Wilson, Dwight S.; Masson, Gerald M.

    1993-01-01

    Design techniques which may be applied to make program testing easier were investigated. Methods for modifying a program to generate additional data which we refer to as a certification trail are presented. This additional data is designed to allow the program output to be checked more quickly and effectively. Certification trails were described primarily from a theoretical perspective. A comprehensive attempt to assess experimentally the performance and overall value of the certification trail method is reported. The method was applied to nine fundamental, well-known algorithms for the following problems: convex hull, sorting, huffman tree, shortest path, closest pair, line segment intersection, longest increasing subsequence, skyline, and voronoi diagram. Run-time performance data for each of these problems is given, and selected problems are described in more detail. Our results indicate that there are many cases in which certification trails allow for significantly faster overall program execution time than a 2-version programming approach, and also give further evidence of the breadth of applicability of this method.
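As an illustration of the certification-trail idea (a sketch under the paper's general description, not its actual implementation): a sorting routine can emit the sorting permutation as its trail, letting a checker validate the output far faster than re-sorting from scratch.

```python
def sort_with_trail(xs):
    # The trail is the permutation that sorts xs, produced as a by-product.
    trail = sorted(range(len(xs)), key=lambda i: xs[i])
    return [xs[i] for i in trail], trail

def check_with_trail(xs, ys, trail):
    # Fast check: the trail must be a permutation of the indices, it must
    # map xs onto ys, and ys must be nondecreasing. (The permutation test
    # uses sorted() for brevity; a seen-array would make it strictly linear.)
    if sorted(trail) != list(range(len(xs))):
        return False
    if [xs[i] for i in trail] != ys:
        return False
    return all(ys[k] <= ys[k + 1] for k in range(len(ys) - 1))

ys, trail = sort_with_trail([3, 1, 2])
assert ys == [1, 2, 3]
assert check_with_trail([3, 1, 2], ys, trail)
```

The checker never re-derives the answer itself; it only verifies consistency between input, output, and trail, which is what makes the second "version" cheaper than full 2-version programming.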

  11. Software requirements flow-down and preliminary software design for the G-CLEF spectrograph

    NASA Astrophysics Data System (ADS)

    Evans, Ian N.; Budynkiewicz, Jamie A.; DePonte Evans, Janet; Miller, Joseph B.; Onyuksel, Cem; Paxson, Charles; Plummer, David A.

    2016-08-01

The Giant Magellan Telescope (GMT)-Consortium Large Earth Finder (G-CLEF) is a fiber-fed, precision radial velocity (PRV) optical echelle spectrograph that will be the first light instrument on the GMT. The G-CLEF instrument device control subsystem (IDCS) provides software control of the instrument hardware, including the active feedback loops that are required to meet the G-CLEF PRV stability requirements. The IDCS is also tasked with providing operational support packages that include data reduction pipelines and proposal preparation tools. A formal, but ultimately pragmatic, approach is being used to establish a complete and correct set of requirements for both the G-CLEF device control and operational support packages. The device control packages must integrate tightly with the state-machine driven software and controls reference architecture designed by the GMT Organization. A model-based systems engineering methodology is being used to develop a preliminary design that meets these requirements. Through this process we have identified some lessons that have general applicability to the development of software for ground-based instrumentation. For example, tasking an individual with overall responsibility for science/software/hardware integration is a key step to ensuring effective integration between these elements. An operational concept document that includes detailed routine and non-routine operational sequences should be prepared in parallel with the hardware design process to tie together these elements and identify any gaps. Appropriate time-phasing of the hardware and software design phases is important, but revisions to driving requirements that impact software requirements and preliminary design are inevitable. Such revisions must be carefully managed to ensure efficient use of resources.

  12. Cluster computing software for GATE simulations.

    PubMed

    De Beenhouwer, Jan; Staelens, Steven; Kruecker, Dirk; Ferrer, Ludovic; D'Asseler, Yves; Lemahieu, Ignace; Rannou, Fernando R

    2007-06-01

    Geometry and tracking (GEANT4) is a Monte Carlo package designed for high energy physics experiments. It is used as the basis layer for Monte Carlo simulations of nuclear medicine acquisition systems in GEANT4 Application for Tomographic Emission (GATE). GATE allows the user to realistically model experiments using accurate physics models and time synchronization for detector movement through a script language contained in a macro file. The downside of this high accuracy is long computation time. This paper describes a platform independent computing approach for running GATE simulations on a cluster of computers in order to reduce the overall simulation time. Our software automatically creates fully resolved, nonparametrized macros accompanied with an on-the-fly generated cluster specific submit file used to launch the simulations. The scalability of GATE simulations on a cluster is investigated for two imaging modalities, positron emission tomography (PET) and single photon emission computed tomography (SPECT). Due to a higher sensitivity, PET simulations are characterized by relatively high data output rates that create rather large output files. SPECT simulations, on the other hand, have lower data output rates but require a long collimator setup time. Both of these characteristics hamper scalability as a function of the number of CPUs. The scalability of PET simulations is improved here by the development of a fast output merger. The scalability of SPECT simulations is improved by greatly reducing the collimator setup time. Accordingly, these two new developments result in higher scalability for both PET and SPECT simulations and reduce the computation time to more practical values.
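The job-splitting idea behind such cluster tools can be sketched as follows. This is an illustrative sketch only: the macro commands emitted here are placeholders, not exact GATE macro syntax. One long acquisition is divided into N independent time slices, each with distinct random seeds, so the per-job outputs can be merged afterwards.

```python
# Sketch: split one simulation of `total_s` seconds into n_jobs independent
# job macros, each covering its own time slice with its own RNG seeds.
def split_macro(total_s, n_jobs, base_seed=12345):
    slice_s = total_s / n_jobs
    macros = []
    for j in range(n_jobs):
        t0 = j * slice_s
        macros.append("\n".join([
            # Distinct seeds per job keep the slices statistically independent.
            f"/random/setSeeds {base_seed + j} {base_seed + 2 * j + 1}",
            f"/timing/setTimeStart {t0:.3f} s",
            f"/timing/setTimeStop {t0 + slice_s:.3f} s",
        ]))
    return macros

jobs = split_macro(total_s=600.0, n_jobs=4)
assert len(jobs) == 4
assert "/timing/setTimeStart 150.000 s" in jobs[1]
```

Slicing along the acquisition time axis preserves detector-movement synchronization within each job, which is why time windows rather than event counts are the natural unit of work here.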

  13. Roles in Innovative Software Teams: A Design Experiment

    NASA Astrophysics Data System (ADS)

    Aaen, Ivan

    With inspiration from role-play and improvisational theater, we are developing a framework for innovation in software teams called Essence. Based on agile principles, Essence is designed for teams of developers and an onsite customer. This paper reports from teaching experiments inspired by design science, where we tried to assign differentiated roles to team members. The experiments provided valuable insights into the design of roles in Essence. These insights are used for redesigning how roles are described and conveyed in Essence.

  14. Mission design applications of QUICK. [software for interactive trajectory calculation

    NASA Technical Reports Server (NTRS)

    Skinner, David L.; Bass, Laura E.; Byrnes, Dennis V.; Cheng, Jeannie T.; Fordyce, Jess E.; Knocke, Philip C.; Lyons, Daniel T.; Pojman, Joan L.; Stetson, Douglas S.; Wolf, Aron A.

    1990-01-01

    An overview of an interactive software environment for space mission design termed QUICK is presented. This stand-alone program provides a programmable FORTRAN-like calculator interface to a wide range of both built-in and user defined functions. QUICK has evolved into a general-purpose software environment that can be intrinsically and dynamically customized for a wide range of mission design applications. Specific applications are described for some space programs, e.g., the earth-Venus-Mars mission, the Cassini mission to Saturn, the Mars Observer, the Galileo Project, and the Magellan Spacecraft.

  15. Channeling efficiency dependence on bending radius and thermal vibration amplitude of the model for the channeling of high-energy particles in straight and bent crystals implemented in Geant4

    NASA Astrophysics Data System (ADS)

    Bagli, Enrico; Asai, Makoto; Dotti, Andrea; Guidi, Vincenzo; Verderi, Marc

    2015-07-01

    Monte Carlo simulations of the interaction of particles with matter are usually done with toolkits such as Geant4. A model of the interaction of high-energy particles with straight and bent crystals, suitable for implementation in Geant4, was developed and implemented. The model relies on the continuum potential approximation. We present how the Geant4 model's description of the orientational effects varies with the physical parameters used to calculate the interplanar potential. The simulations are capable of reproducing the variation of the channeling efficiency as a function of the thermal vibration amplitude and the bending radius of a bent Si strip. The study can be useful for simulating the channeling effect in experiments at GeV/c energies.

  16. Software Design Methodology Migration for a Distributed Ground System

    NASA Technical Reports Server (NTRS)

    Ritter, George; McNair, Ann R. (Technical Monitor)

    2002-01-01

    The Marshall Space Flight Center's (MSFC) Payload Operations Center (POC) ground system has been developed and has evolved over a period of about 10 years. During this time the software processes have migrated from more traditional to more contemporary development processes. The new software processes still emphasize requirements capture, software configuration management, design documentation, and making sure the products that have been developed are accountable to the initial requirements. This paper gives an overview of how the software processes have evolved, highlighting the positives as well as the negatives. In addition, we mention the COTS tools that have been integrated into the processes and how they have provided value to the project.

  17. SU-E-T-81: Comparison of Microdosimetric Quantities Calculated Using the Track Structure Monte Carlo Algorithms Geant4-DNA and NOREC

    SciTech Connect

    Lucido, J; Popescu, I; Moiseenko, V

    2014-06-01

    Purpose: Microdosimetric quantities, such as the lineal energy, have been shown to correlate with the biological response to radiation and the relative biological effect of different radiation types. Track-structure Monte Carlo simulations are an important tool for investigating these responses and for developing mechanistic models to explain them. However, some of the cross-sectional data used in these algorithms have large uncertainties; thus, it is important to investigate how the implementation of the different codes affects the quantities of interest. Methods: Two of the most widely used publicly available track-structure Monte Carlo codes, Geant4-DNA and NOREC, were used to generate electron tracks for two particle sources. One source was a mono-energetic parallel beam of electrons with energies from 5 to 500 keV, and the lineal energy for each track was calculated in 1-mm spheres arranged in planar arrays at multiple distances from the source. The second source was a mono-energetic, uniformly distributed, isotropic source, and the lineal energy was scored in a single 30-mm sphere for energies between 300 eV and 5 keV. Results: The dose-mean lineal energies for the parallel-beam simulations almost all agreed within 5%. For the uniformly distributed source, there was strong agreement between the algorithms at the lowest energies; the Geant4-DNA simulations showed slightly more high-energy events for more energetic electrons, but the dose-mean lineal energy agreed to within 4% for all energies. Conclusion: While there were slight differences in the results between the codes, these were consistent with previous studies of the stopping power and angular scattering distributions. Importantly, the computation time for Geant4-DNA was larger than for NOREC, largely because of approximations used in NOREC for energies below 10 eV. This study shows that these approximations do not have a major impact on the microdosimetry at the energy and length scales investigated.
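
    For reference, the figure of merit compared above follows directly from the per-event energy deposits: the lineal energy of a single event is y = ε/l̄, where the mean chord length of a sphere of diameter d is l̄ = 2d/3 (Cauchy's theorem). A minimal sketch of the frequency-mean and dose-mean estimators (a generic helper written for illustration, not code from either toolkit):

```python
def lineal_energy_stats(event_energies_keV, sphere_diameter_um):
    """Frequency-mean and dose-mean lineal energy from single-event deposits.

    y_i = eps_i / l_bar, with mean chord length l_bar = 2d/3 for a sphere.
    Energies in keV and diameter in micrometers give y in keV/um.
    """
    l_bar = 2.0 * sphere_diameter_um / 3.0
    y = [e / l_bar for e in event_energies_keV]
    y_f = sum(y) / len(y)                  # frequency-mean lineal energy
    y_d = sum(v * v for v in y) / sum(y)   # dose-mean (energy-weighted)
    return y_f, y_d
```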

  18. Feasibility of using Geant4 Monte Carlo simulation for IMRT dose calculations for the Novalis Tx with a HD-120 multi-leaf collimator

    NASA Astrophysics Data System (ADS)

    Jung, Hyunuk; Shin, Jungsuk; Chung, Kwangzoo; Han, Youngyih; Kim, Jinsung; Choi, Doo Ho

    2015-05-01

    The aim of this study was to develop an independent dose verification system by using a Monte Carlo (MC) calculation method for intensity modulated radiation therapy (IMRT) conducted by using a Varian Novalis Tx (Varian Medical Systems, Palo Alto, CA, USA) equipped with a high-definition multi-leaf collimator (HD-120 MLC). The Geant4 framework was used to implement a dose calculation system that accurately predicted the delivered dose. For this purpose, the Novalis Tx Linac head was modeled according to the specifications acquired from the manufacturer. Subsequently, MC simulations were performed by varying the mean energy, energy spread, and electron spot radius to determine optimum values of irradiation with 6-MV X-ray beams by using the Novalis Tx system. Computed percentage depth dose curves (PDDs) and lateral profiles were compared to the measurements obtained by using an ionization chamber (CC13). To validate the IMRT simulation by using the MC model we developed, we calculated a simple IMRT field and compared the result with the EBT3 film measurements in a water-equivalent solid phantom. Clinical cases, such as prostate cancer treatment plans, were then selected, and MC simulations were performed. The accuracy of the simulation was assessed against the EBT3 film measurements by using a gamma-index criterion. The optimal MC model parameters to specify the beam characteristics were a 6.8-MeV mean energy, a 0.5-MeV energy spread, and a 3-mm electron spot radius. The accuracy of these parameters was determined by comparison of MC simulations with measurements. The PDDs and the lateral profiles of the MC simulation deviated from the measurements by 1% and 2%, respectively, on average. The computed simple MLC fields agreed with the EBT3 measurements with a 95% passing rate under a 3%/3-mm gamma-index criterion. 
Additionally, in applying our model to clinical IMRT plans, we found that the MC calculations and the EBT3 measurements agreed well with a passing rate of greater
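
    The gamma-index criterion quoted above combines a dose-difference tolerance (3% of the reference maximum, global normalisation) with a distance-to-agreement tolerance (3 mm); a point passes when the minimum combined deviation is at most 1. A brute-force 1D sketch of that standard definition (illustrative only, not the evaluation code used in the study):

```python
import math

def gamma_index_1d(ref, evl, spacing_mm, dose_crit=0.03, dta_mm=3.0):
    """Global 1D gamma index (3%/3 mm by default), simplified sketch.

    ref/evl: reference and evaluated dose samples on the same uniform grid.
    The dose criterion is a fraction of the reference maximum.
    """
    d_max = max(ref)
    gammas = []
    for i, d_ref in enumerate(ref):
        best = float("inf")
        for j, d_ev in enumerate(evl):
            dist = abs(i - j) * spacing_mm            # spatial deviation
            diff = d_ev - d_ref                       # dose deviation
            g2 = (dist / dta_mm) ** 2 + (diff / (dose_crit * d_max)) ** 2
            best = min(best, g2)
        gammas.append(math.sqrt(best))
    passing_rate = sum(g <= 1.0 for g in gammas) / len(gammas)
    return gammas, passing_rate
```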

  19. A design methodology for portable software on parallel computers

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Miller, Keith W.; Chrisman, Dan A.

    1993-01-01

    This final report for research that was supported by grant number NAG-1-995 documents our progress in addressing two difficulties in parallel programming. The first difficulty is developing software that will execute quickly on a parallel computer. The second difficulty is transporting software between dissimilar parallel computers. In general, we expect that more hardware-specific information will be included in software designs for parallel computers than in designs for sequential computers. This inclusion is an instance of portability being sacrificed for high performance. New parallel computers are being introduced frequently. Trying to keep one's software on the current high performance hardware, a software developer almost continually faces yet another expensive software transportation. The problem of the proposed research is to create a design methodology that helps designers to more precisely control both portability and hardware-specific programming details. The proposed research emphasizes programming for scientific applications. We completed our study of the parallelizability of a subsystem of the NASA Earth Radiation Budget Experiment (ERBE) data processing system. This work is summarized in section two. A more detailed description is provided in Appendix A ('Programming Practices to Support Eventual Parallelism'). Mr. Chrisman, a graduate student, wrote and successfully defended a Ph.D. dissertation proposal which describes our research associated with the issues of software portability and high performance. The list of research tasks are specified in the proposal. The proposal 'A Design Methodology for Portable Software on Parallel Computers' is summarized in section three and is provided in its entirety in Appendix B. We are currently studying a proposed subsystem of the NASA Clouds and the Earth's Radiant Energy System (CERES) data processing system. This software is the proof-of-concept for the Ph.D. dissertation. 
We have implemented and measured

  20. Designing visualization software for ships and robotic vehicles

    NASA Astrophysics Data System (ADS)

    Schwehr, Kurt D.; Derbes, Alexander; Edwards, Laurence; Nguyen, Laurent; Zbinden, Eric

    2005-03-01

    One of the challenges of visualization software design is providing real-time tools capable of concurrently displaying data that varies temporally and in scale from kilometers to micrometers, such as the data prevalent in planetary exploration and deep-sea marine research. The Viz software developed by NASA Ames and the additions of the X-Core extensions solve this problem by providing a flexible framework for rapidly developing visualization software capable of accessing and displaying large dynamic data sets. This paper describes the Viz/X-Core design and illustrates the operation of both systems over a number of deployments ranging from marine research to Martian exploration. Highlights include a 2002 integration with live ship operations and the Mars Exploration Rovers Spirit and Opportunity.

  1. Determination of the Thickness of the Back Dead-Layer of GRETINA Crystals via Comparisons of Measured Photopeak Efficiencies with GEANT4 Simulations

    NASA Astrophysics Data System (ADS)

    Jarvis, L. R.; Stine, C. G.; Riley, L. A.

    2016-09-01

    Measurements of the photopeak efficiency of the GRETINA array up to 3.5 MeV made at the National Superconducting Cyclotron Laboratory with 152Eu and 56Co sources were compared with GEANT4 simulations. We developed a method of determining the average thickness of the back dead layers of the GRETINA crystals by considering the partial photopeak efficiencies of events including gamma-ray interactions in the back slice of the crystals. The impact of dead-layer thicknesses on the accuracy of simulated photopeak efficiencies and the ratio of photopeak counts measured in the two GRETINA crystal types is discussed. This work was supported by the National Science Foundation under Grant Nos. PHY-1303480 and PHY-1102511 and by the US Department of Energy under Grant No. DE-AC02-05CH11231.

  2. Simulation, optimization and testing of a novel high spatial resolution X-ray imager based on Zinc Oxide nanowires in Anodic Aluminium Oxide membrane using Geant4

    NASA Astrophysics Data System (ADS)

    Esfandi, F.; Saramad, S.

    2015-07-01

    In this work, a new generation of scintillator-based X-ray imagers using ZnO nanowires in an Anodized Aluminum Oxide (AAO) nanoporous template is characterized. The optical response of ordered ZnO nanowire arrays in a porous AAO template under low-energy X-ray illumination is simulated with the Geant4 Monte Carlo code and compared with experimental results. The results show that for 10 keV X-ray photons, by considering the light-guiding properties of zinc oxide inside the AAO template and a suitable choice of detector thickness and pore diameter, a spatial resolution of less than one micrometer and a detection efficiency of 66% are achievable. This novel nano-scintillator detector can have many advantages for medical applications in the future.

  3. Monte Carlo simulation of ruthenium eye plaques with GEANT4: influence of multiple scattering algorithms, the spectrum and the geometry on depth dose profiles

    NASA Astrophysics Data System (ADS)

    Sommer, H.; Ebenau, M.; Spaan, B.; Eichmann, M.

    2017-03-01

    Previous studies show remarkable differences in the simulation of electron depth dose profiles of ruthenium eye plaques. We examined the influence of the scoring and simulation geometry, the source spectrum and the multiple scattering algorithm on the depth dose profile using GEANT4. The simulated absolute dose deposition agrees with absolute dose data from the manufacturer within the measurement uncertainty. Variations in the simulation geometry as well as the source spectrum have only a small influence on the depth dose profiles. However, the multiple scattering algorithms have the largest influence on the depth dose profiles. They deposit up to 20% less dose compared to the single scattering implementation. We recommend researchers who are interested in simulating low- to medium-energy electrons to examine their simulation under the influence of different multiple scattering settings. Since the simulation and scoring geometry as well as the exact physics settings are best described by the source code of the application, we made the code publicly available.

  4. Radiation Effects Investigations Based on Atmospheric Radiation Model (ATMORAD) Considering GEANT4 Simulations of Extensive Air Showers and Solar Modulation Potential.

    PubMed

    Hubert, Guillaume; Cheminet, Adrien

    2015-07-01

    The natural radiative atmospheric environment is composed of secondary cosmic rays produced when primary cosmic rays hit the atmosphere. Understanding atmospheric radiations and their dynamics is essential for evaluating single event effects, so that radiation risks in aviation and the space environment (space weather) can be assessed. In this article, we present an atmospheric radiation model, named ATMORAD (Atmospheric Radiation), which is based on GEANT4 simulations of extensive air showers according to primary spectra that depend only on the solar modulation potential (force-field approximation). Based on neutron spectrometry, solar modulation potential can be deduced using neutron spectrometer measurements and ATMORAD. Some comparisons between our methodology and standard approaches or measurements are also discussed. This work demonstrates the potential for using simulations of extensive air showers and neutron spectroscopy to monitor solar activity.
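
    In the force-field approximation mentioned above, all of solar modulation is collapsed into a single potential φ: the flux at kinetic energy E per nucleon near Earth is the local interstellar spectrum evaluated at E + (|Z|/A)φ, rescaled by a kinematic factor. A minimal sketch of that standard relation (function and parameter names are our own, not ATMORAD's):

```python
def modulated_flux(E_GeV, phi_GV, j_lis, Z=1, A=1, E0=0.938):
    """Force-field approximation for the modulated differential flux.

    E_GeV:  kinetic energy per nucleon [GeV]
    phi_GV: solar modulation potential [GV]
    j_lis:  callable giving the local interstellar spectrum at an energy
    E0:     nucleon rest-mass energy [GeV]
    """
    phi = (abs(Z) / A) * phi_GV          # mean energy loss per nucleon [GeV]
    E_lis = E_GeV + phi
    factor = (E_GeV * (E_GeV + 2.0 * E0)) / (E_lis * (E_lis + 2.0 * E0))
    return j_lis(E_lis) * factor
```

    Fitting φ so that the modulated spectrum reproduces ground-level neutron spectrometer data is what lets such a model monitor solar activity.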

  5. Efficiency calibration and coincidence summing correction for a large volume (946 cm³) LaBr3(Ce) detector: GEANT4 simulations and experimental measurements.

    PubMed

    Dhibar, M; Mankad, D; Mazumdar, I; Kumar, G Anil

    2016-12-01

    The paper describes studies on efficiency calibration and coincidence summing correction for a 3.5″×6″ cylindrical LaBr3(Ce) detector. GEANT4 simulations were performed with point sources, namely ⁶⁰Co, ⁹⁴Nb, ²⁴Na, ⁴⁶Sc and ²²Na. The simulated efficiencies, extracted using ⁶⁰Co, ⁹⁴Nb, ²⁴Na and ⁴⁶Sc, which emit coincident gamma rays with the same decay intensities, were corrected for coincidence summing by applying the method proposed by Vidmar et al. (2003). The method was applied, for the first time, to correcting the simulated efficiencies extracted using ²²Na, which emits coincident gamma rays with different decay intensities. The measured results obtained using ⁶⁰Co and ²²Na were found to be in good agreement with the simulated results.
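
    The summing-out effect being corrected arises because the partner gamma ray of a cascade can deposit energy in the same event, moving counts out of the full-energy peak. For a simple two-gamma cascade the first-order correction divides the apparent peak efficiency by (1 − ε_total) of the partner, as in this simplified textbook sketch (angular correlations neglected; this is not the Vidmar et al. (2003) formalism itself, which handles more general cases):

```python
def summing_out_correction(eps_peak_apparent, eps_total_partner):
    """First-order summing-out correction for a two-gamma cascade.

    The apparent peak efficiency is eps_peak * (1 - eps_total_partner),
    because events where the partner gamma also deposits energy leave the
    full-energy peak; dividing by (1 - eps_total_partner) recovers the
    true peak efficiency.
    """
    return eps_peak_apparent / (1.0 - eps_total_partner)
```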

  6. Monte Carlo simulation of ruthenium eye plaques with GEANT4: influence of multiple scattering algorithms, the spectrum and the geometry on depth dose profiles.

    PubMed

    Sommer, H; Ebenau, M; Spaan, B; Eichmann, M

    2017-03-07

    Previous studies show remarkable differences in the simulation of electron depth dose profiles of ruthenium eye plaques. We examined the influence of the scoring and simulation geometry, the source spectrum and the multiple scattering algorithm on the depth dose profile using GEANT4. The simulated absolute dose deposition agrees with absolute dose data from the manufacturer within the measurement uncertainty. Variations in the simulation geometry as well as the source spectrum have only a small influence on the depth dose profiles. However, the multiple scattering algorithms have the largest influence on the depth dose profiles. They deposit up to 20% less dose compared to the single scattering implementation. We recommend researchers who are interested in simulating low- to medium-energy electrons to examine their simulation under the influence of different multiple scattering settings. Since the simulation and scoring geometry as well as the exact physics settings are best described by the source code of the application, we made the code publicly available.

  7. Designing Computer Software To Minimize the Need for Employee Training.

    ERIC Educational Resources Information Center

    Winiecki, Donald J.

    2000-01-01

    Discusses problems that arise when computer software users have to learn a new system while maintaining productivity. Highlights include active learning; a constructivist view; Vygotsky's zone of proximal development; and a model called Design for Learnability (DesiL) that focuses the performance technologist on an ethnomethodological study of…

  8. Peeling the Onion: Okapi System Architecture and Software Design Issues.

    ERIC Educational Resources Information Center

    Jones, S.; And Others

    1997-01-01

    Discusses software design issues for Okapi, an information retrieval system that incorporates both search engine and user interface and supports weighted searching, relevance feedback, and query expansion. The basic search system, adjacency searching, and moving toward a distributed system are discussed. (Author/LRW)

  9. Competing Ideologies in Software Design for Computer-Aided Composition.

    ERIC Educational Resources Information Center

    LeBlanc, Paul

    1990-01-01

    Illustrates the ideological foundations of computer-assisted composition (CAC) software by comparing two computer-writing aids: "Grammatik II" and "Interchange." Notes that many of the best CAC programs have come out of the composition classroom. Argues that English departments must support and reward those who work in CAC design. (RS)

  10. Art & Design Software Development Using IBM Handy (A Personal Experience).

    ERIC Educational Resources Information Center

    McWhinnie, Harold J.

    This paper presents some of the results from a course in art and design. The course involved the use of simple computer programs for the arts. Attention was geared to the development of graphic components for educational software. The purpose of the course was to provide, through lectures and extensive hands on experience, a basic introduction to…

  11. The Status of Presentation Software and Graphic Design Training.

    ERIC Educational Resources Information Center

    Chalupa, Marilyn R.; Sormunen, Carolee

    1996-01-01

    Usable surveys were received from 282 (23.5%) subscribers to a computer magazine, of whom 91.5% had taught themselves the use of presentation software. Less than 40% had graphic design training; about half recognized a need for it. Printed documentation was the most common means of support. (SK)

  12. Designing for User Cognition and Affect in Software Instructions

    ERIC Educational Resources Information Center

    van der Meij, Hans

    2008-01-01

    In this paper we examine how to design software instructions for user cognition and affect. A basic and a co-user manual are compared. The first provides fundamental support for both; the second includes a buddy to further optimize support for user affect. The basic manual was faster and judged as easier to process than the co-user manual. In…

  13. QUICK - An interactive software environment for engineering design

    NASA Technical Reports Server (NTRS)

    Skinner, David L.

    1989-01-01

    QUICK, an interactive software environment for engineering design, provides a programmable FORTRAN-like calculator interface to a wide range of data structures as well as both built-in and user-created functions. QUICK also provides direct access to the operating systems of eight different machine architectures. The evolution of QUICK and a brief overview of the current version are presented.

  14. Issues in Software Engineering of Relevance to Instructional Design

    ERIC Educational Resources Information Center

    Douglas, Ian

    2006-01-01

    Software engineering is popularly misconceived as being an upmarket term for programming. In a way, this is akin to characterizing instructional design as the process of creating PowerPoint slides. In both these areas, the construction of systems, whether they are learning or computer systems, is only one part of a systematic process. The most…

  15. Design and implementation of the mobility assessment tool: software description

    PubMed Central

    2013-01-01

    Background In previous work, we described the development of an 81-item video-animated tool for assessing mobility. In response to criticism levied during a pilot study of this tool, we sought to develop a new version built upon a flexible framework for designing and administering the instrument. Results Rather than constructing a self-contained software application with a hard-coded instrument, we designed an XML schema capable of describing a variety of psychometric instruments. The new version of our video-animated assessment tool was then defined fully within the context of a compliant XML document. Two software applications—one built in Java, the other in Objective-C for the Apple iPad—were then built that could present the instrument described in the XML document and collect participants’ responses. Separating the instrument’s definition from the software application implementing it allowed for rapid iteration and easy, reliable definition of variations. Conclusions Defining instruments in a software-independent XML document simplifies the process of defining instruments and variations and allows a single instrument to be deployed on as many platforms as there are software applications capable of interpreting the instrument, thereby broadening the potential target audience for the instrument. Continued work will be done to further specify and refine this type of instrument specification with a focus on spurring adoption by researchers in gerontology and geriatric medicine. PMID:23879716
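
    As a toy illustration of the instrument/application split described above, the sketch below parses a small instrument document into platform-neutral dictionaries that any front end (Java desktop, iPad app, ...) could render. The schema shown is invented for illustration and is not the authors' actual XML schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical instrument definition, standing in for the real schema.
INSTRUMENT_XML = """<instrument name="mobility-demo">
  <item id="q1" video="walk.mp4">
    <prompt>Could you walk across a room?</prompt>
    <response value="0">Unable</response>
    <response value="1">With help</response>
    <response value="2">Without help</response>
  </item>
</instrument>"""

def load_items(xml_text):
    """Turn an instrument document into plain dicts a renderer can display."""
    root = ET.fromstring(xml_text)
    items = []
    for item in root.iter("item"):
        items.append({
            "id": item.get("id"),
            "video": item.get("video"),
            "prompt": item.findtext("prompt"),
            "responses": {r.get("value"): r.text for r in item.iter("response")},
        })
    return items
```

    Keeping the definition in data rather than code is what allows rapid iteration on instrument variants without rebuilding either application.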

  16. Advanced Spacesuit Informatics Software Design for Power, Avionics and Software Version 2.0

    NASA Technical Reports Server (NTRS)

    Wright, Theodore W.

    2016-01-01

    A description of the software design for the 2016 edition of the Informatics computer assembly of NASA's Advanced Extravehicular Mobility Unit (AEMU), also called the Advanced Spacesuit. The Informatics system is an optional part of the spacesuit assembly. It adds a graphical interface for displaying suit status, timelines, procedures, and warning information. It also provides an interface to the suit-mounted camera for recording still images, video, and audio field notes.

  17. [Software Design for a Portable Ultrasound Bone Densitometer].

    PubMed

    Deng, Jiangjun; Ding, Jie; Xu, Shijie; Geng, Ruihua; He, Aijun

    2015-10-01

    In order to meet the requirements of ultrasound bone density measurement, we designed software based on Visual Studio C++ 2008. The software includes interface design, acquisition and control, data processing and parameter extraction, and data storage and printing. A well-designed human-computer interface (HCI) gives users a convenient experience. Automatic gain control (AGC) and digital filtering improve the precision effectively. In addition, the waveform can be observed clearly in real time. By using USB communication, we can send control commands to the acquisition hardware and retrieve data efficiently, which shortens the measuring time. We then calculate the speed of sound (SOS) and broadband ultrasound attenuation (BUA). Patients' information is accessed through an XML document. Finally, the software offers a printing function.
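
    The two reported parameters are standard quantities: SOS is the propagation distance divided by the ultrasound transit time, and BUA is the slope of a linear fit of attenuation (in dB) against frequency, conventionally over roughly 0.2-0.6 MHz. A sketch using those textbook definitions (our own helper functions, not the device's code):

```python
def speed_of_sound(path_length_mm, transit_time_us):
    """SOS in m/s from the through-transmission transit time."""
    return (path_length_mm / 1000.0) / (transit_time_us * 1e-6)

def broadband_ultrasound_attenuation(freq_MHz, atten_dB):
    """BUA in dB/MHz: least-squares slope of attenuation vs. frequency."""
    n = len(freq_MHz)
    mx = sum(freq_MHz) / n
    my = sum(atten_dB) / n
    num = sum((x - mx) * (y - my) for x, y in zip(freq_MHz, atten_dB))
    den = sum((x - mx) ** 2 for x in freq_MHz)
    return num / den
```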

  18. Automated Theorem Proving in High-Quality Software Design

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Swanson, Keith (Technical Monitor)

    2001-01-01

    The amount and complexity of software developed during the last few years has increased tremendously. In particular, programs are being used more and more in embedded systems (from car brakes to plant control). Many of these applications are safety-relevant, i.e., a malfunction of hardware or software can cause severe damage or loss. Tremendous risks are typically present in the areas of aviation, (nuclear) power plants, or (chemical) plant control. Here, even small problems can lead to thousands of casualties and huge financial losses. Large financial risks also exist when computer systems are used in the areas of telecommunication (telephone, electronic commerce) or space exploration. Computer applications in these areas are not only subject to safety considerations; security issues are important as well. All these systems must be designed and developed to guarantee high quality with respect to safety and security. Even in an industrial setting which is (or at least should be) aware of the high requirements of software engineering, many incidents occur. For example, the Warsaw Airbus crash was caused by an incomplete requirements specification. Uncontrolled reuse of an Ariane 4 software module was the reason for the Ariane 5 disaster. Some recent incidents in the telecommunication area, like the illegal "cloning" of smart cards for D2 GSM mobile phones, or the extraction of (secret) passwords from German T-Online users, show that serious flaws can happen in this area as well. Due to the inherent complexity of computer systems, most authors claim that only a rigorous application of formal methods in all stages of the software life cycle can ensure high quality of the software and lead to truly safe and secure systems. In this paper, we examine to what extent automated theorem proving can contribute to a more widespread application of formal methods and their tools, and what automated theorem provers (ATPs) must provide in order to be useful.

  19. Critical Design Decisions of The Planck LFI Level 1 Software

    NASA Astrophysics Data System (ADS)

    Morisset, N.; Rohlfs, R.; Türler, M.; Meharga, M.; Binko, P.; Beck, M.; Frailis, M.; Zacchei, A.

    2010-12-01

    The PLANCK satellite, with two on-board instruments, a Low Frequency Instrument (LFI) and a High Frequency Instrument (HFI), was launched on May 14th, 2009, on an Ariane 5. The ISDC Data Centre for Astrophysics in Versoix, Switzerland has developed and maintains the Planck LFI Level 1 software for the Data Processing Centre (DPC) in Trieste, Italy. The main tasks of the Level 1 processing are to retrieve the daily available scientific and housekeeping (HK) data of the LFI instrument, the Sorption Cooler and the 4K Cooler from the Mission Operation Centre (MOC) in Darmstadt; to sort them by time and by type (detector, observing mode, etc.); to extract the spacecraft attitude information from auxiliary files; to flag the data according to several criteria; and to archive the resulting Time Ordered Information (TOI), which will then be used to produce maps of the sky in different spectral bands. The output of the Level 1 software is a set of TOI files in FITS format, later ingested into the Data Management Component (DMC) database. This software has been used during different phases of the LFI instrument development. We started by reusing some ISDC components for the LFI Qualification Model (QM) and then completely reworked the software for the Flight Model (FM). This rework was motivated by critical design decisions taken jointly with the DPC. The main questions were: a) the choice of the data format: FITS or DMC? b) the design of the pipelines: use of the Planck Process Coordinator (ProC) or a simple Perl script? c) do we adapt the existing QM software or do we restart from scratch? The timeline and available manpower were also important issues to be taken into account. We present here the orientation of our choices and discuss their pertinence based on the experience of the final pre-launch tests and the start of real Planck LFI operations.

  20. A requirements specification for a software design support system

    NASA Technical Reports Server (NTRS)

    Noonan, Robert E.

    1988-01-01

    Most existing software design systems (SDSS) support the use of only a single design methodology. A good SDSS should support a wide variety of design methods and languages including structured design, object-oriented design, and finite state machines. It might seem that a multiparadigm SDSS would be expensive in both time and money to construct. However, it is proposed that instead an extensible SDSS that directly implements only minimal database and graphical facilities be constructed. In particular, it should not directly implement tools to facilitate language definition and analysis. It is believed that such a system could be rapidly developed and put into limited production use, with the experience gained used to refine and evolve the system over time.

  1. Verifying Architectural Design Rules of the Flight Software Product Line

    NASA Technical Reports Server (NTRS)

    Ganesan, Dharmalingam; Lindvall, Mikael; Ackermann, Chris; McComas, David; Bartholomew, Maureen

    2009-01-01

    This paper presents experiences of verifying architectural design rules of the NASA Core Flight Software (CFS) product line implementation. The goal of the verification is to check whether the implementation is consistent with the CFS architectural rules derived from the developer's guide. The results indicate that consistency checking helps in a) identifying architecturally significant deviations that eluded code reviews, b) clarifying the design rules to the team, and c) assessing the overall implementation quality. Furthermore, it helps connect business goals to architectural principles and to the implementation. This paper is the first step in the definition of a method for analyzing and evaluating product line implementations from an architecture-centric perspective.

  2. Requirements Management System Browser (RMSB) software design description

    SciTech Connect

    Frank, D.D.

    1996-09-30

    The purpose of this document is to provide an "as-built" design description for the Requirements Management System Browser (RMSB) application. The Graphical User Interface (GUI) and database structure design are described for the RMSB application, referred to as the "Browser." The RMSB application provides an easy-to-use PC-based interface for browsing systems engineering data stored and managed in a UNIX software application. The systems engineering data include functions, requirements, and architectures that make up the Tank Waste Remediation System (TWRS) technical baseline.

  3. Software and Physics Simulation at Belle II

    NASA Astrophysics Data System (ADS)

    Fulsom, Bryan; Belle II Collaboration

    2016-03-01

    The Belle II experiment at the SuperKEKB collider in Tsukuba, Japan, will start taking physics data in 2018 and will accumulate 50 ab⁻¹ of e⁺e⁻ collision data, about 50 times larger than the data set of the earlier Belle experiment. The new detector will use GEANT4 for Monte Carlo simulation and an entirely new software and reconstruction system based on modern computing tools. Examples of physics simulation including beam background overlays will be described.

  4. Autonomous robot vision software design using Matlab toolboxes

    NASA Astrophysics Data System (ADS)

    Tedder, Maurice; Chung, Chan-Jin

    2004-10-01

    The purpose of this paper is to introduce a cost-effective way to design robot vision and control software using Matlab for an autonomous robot designed to compete in the 2004 Intelligent Ground Vehicle Competition (IGVC). The goal of the autonomous challenge event is for the robot to autonomously navigate an outdoor obstacle course bounded by solid and dashed lines on the ground. Visual input data is provided by a DV camcorder at 160 x 120 pixel resolution. The design of this system involved writing an image-processing algorithm using hue, saturation, and brightness (HSB) color filtering and Matlab image-processing functions to extract the centroid, area, and orientation of the connected regions from the scene. These feature vectors are then mapped to linguistic variables that describe the objects in the world environment model. The linguistic variables act as inputs to a fuzzy logic controller designed using the Matlab fuzzy logic toolbox, which provides the knowledge and intelligence component necessary to achieve the desired goal. Java provides the central interface to the robot motion control and image acquisition components. Field test results indicate that the Matlab-based solution allows for rapid software design, development, and modification of our robot system.
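
    The color-filtering and region-feature step described above can be sketched in pure Python as a stand-in for the Matlab toolbox calls: threshold pixels by hue, saturation, and value, then take the centroid of the surviving pixels. A real pipeline would also run connected-component labelling to get per-region area and orientation; the thresholds below are illustrative.

```python
import colorsys

def centroid_of_hue(image_rgb, hue_lo, hue_hi, min_sat=0.3, min_val=0.2):
    """Centroid (row, col) of pixels whose hue lies in [hue_lo, hue_hi].

    image_rgb: nested list of (r, g, b) tuples with components in 0..1.
    Saturation/value floors reject dark or washed-out pixels. Returns
    None when no pixel passes the filter.
    """
    rows = cols = count = 0
    for r_idx, row in enumerate(image_rgb):
        for c_idx, (r, g, b) in enumerate(row):
            h, s, v = colorsys.rgb_to_hsv(r, g, b)
            if hue_lo <= h <= hue_hi and s >= min_sat and v >= min_val:
                rows += r_idx
                cols += c_idx
                count += 1
    if count == 0:
        return None
    return rows / count, cols / count
```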

  5. CARES/Life Software for Designing More Reliable Ceramic Parts

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Powers, Lynn M.; Baker, Eric H.

    1997-01-01

    Products made from advanced ceramics show great promise for revolutionizing aerospace and terrestrial propulsion and power generation. However, ceramic components are difficult to design because brittle materials in general have widely varying strength values. The CARES/Life software eases this task by providing a tool to optimize the design and manufacture of brittle material components using probabilistic reliability analysis techniques. Probabilistic component design involves predicting the probability of failure for a thermomechanically loaded component from specimen rupture data. Typically, these experiments are performed using many simple-geometry flexural or tensile test specimens. A static, dynamic, or cyclic load is applied to each specimen until fracture. Statistical strength and SCG (fatigue) parameters are then determined from these data. Using these parameters and the results obtained from a finite element analysis, the time-dependent reliability for a complex component geometry and loading is then predicted. Appropriate design changes are made until an acceptable probability of failure has been reached.
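
    The probabilistic design idea above rests on Weibull strength statistics for brittle materials. A minimal sketch of the two-parameter Weibull failure probability for a uniformly stressed specimen (the parameter values are invented for illustration; CARES/Life itself handles multiaxial stresses, volume/area scaling, and time dependence):

```python
import math

def weibull_failure_prob(stress_mpa, sigma0_mpa, m):
    """Two-parameter Weibull failure probability for a uniformly
    stressed brittle specimen: Pf = 1 - exp(-(sigma/sigma0)^m),
    where sigma0 is the characteristic strength and m the Weibull
    modulus (higher m = less strength scatter)."""
    return 1.0 - math.exp(-((stress_mpa / sigma0_mpa) ** m))

m, sigma0 = 10.0, 400.0   # hypothetical ceramic
# At the characteristic strength, Pf = 1 - 1/e ~ 63.2% by definition
print(round(weibull_failure_prob(400.0, sigma0, m), 3))   # 0.632
```

    Iterating a design until `weibull_failure_prob` at the peak service stress drops below a target is the simplest form of the "design changes until an acceptable probability of failure" loop the abstract describes.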

  6. Computational protein design: the Proteus software and selected applications.

    PubMed

    Simonson, Thomas; Gaillard, Thomas; Mignon, David; Schmidt am Busch, Marcel; Lopes, Anne; Amara, Najette; Polydorides, Savvas; Sedano, Audrey; Druart, Karen; Archontis, Georgios

    2013-10-30

    We describe an automated procedure for protein design, implemented in a flexible software package, called Proteus. System setup and calculation of an energy matrix are done with the XPLOR modeling program and its sophisticated command language, supporting several force fields and solvent models. A second program provides algorithms to search sequence space. It allows a decomposition of the system into groups, which can be combined in different ways in the energy function, for both positive and negative design. The whole procedure can be controlled by editing 2-4 scripts. Two applications consider the tyrosyl-tRNA synthetase enzyme and its successful redesign to bind both O-methyl-tyrosine and D-tyrosine. For the latter, we present Monte Carlo simulations where the D-tyrosine concentration is gradually increased, displacing L-tyrosine from the binding pocket and yielding the binding free energy difference, in good agreement with experiment. Complete redesign of the Crk SH3 domain is presented. The top 10000 sequences are all assigned to the correct fold by the SUPERFAMILY library of Hidden Markov Models. Finally, we report the acid/base behavior of the SNase protein. Sidechain protonation is treated as a form of mutation; it is then straightforward to perform constant-pH Monte Carlo simulations, which yield good agreement with experiment. Overall, the software can be used for a wide range of applications, producing not only native-like sequences but also thermodynamic properties with errors that appear comparable to other current software packages.

  7. Organ doses from hepatic radioembolization with 90Y, 153Sm, 166Ho and 177Lu: A Monte Carlo simulation study using Geant4

    NASA Astrophysics Data System (ADS)

    Hashikin, N. A. A.; Yeong, C. H.; Guatelli, S.; Abdullah, B. J. J.; Ng, K. H.; Malaroda, A.; Rosenfeld, A. B.; Perkins, A. C.

    2016-03-01

    90Y-radioembolization is a palliative treatment for liver cancer. 90Y decays via beta emission, making imaging difficult due to the absence of gamma radiation. Since post-procedure imaging is crucial, several theranostic radionuclides have been explored as alternatives. However, exposure to gamma radiation throughout the treatment raises concern for the organs near the liver. A Geant4 Monte Carlo simulation using the MIRD Pamphlet 5 reference phantom was carried out. A spherical tumour with 4.3 cm radius was modelled within the liver. 1.82 GBq of 90Y sources were isotropically distributed within the tumour, with no extrahepatic shunting. The simulation was repeated with 153Sm, 166Ho and 177Lu. The estimated tumour dose for 90Y was 262.9 Gy. A tumour dose equivalent to that of 1.82 GBq 90Y can be achieved with 8.32, 5.83, and 4.44 GBq of 153Sm, 166Ho and 177Lu, respectively. Normal liver doses from the other radionuclides were lower than from 90Y, hence beneficial for normal tissue sparing. The organ doses from 153Sm and 177Lu were relatively higher due to higher gamma energy, but were still well below 1 Gy. 166Ho, 177Lu and 153Sm offer useful gamma emission for post-procedure imaging. They show potential as 90Y substitutes, delivering comparable tumour doses, lower normal liver doses, and doses to other organs far below the tolerance limit.
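
    The activity equivalences quoted above follow from the linearity of absorbed dose in administered activity: the required activity is the target dose divided by the dose delivered per unit activity. A small Python sketch; the per-GBq tumour doses for the alternative radionuclides are back-derived from the abstract's own figures purely for illustration, not taken from the paper.

```python
# Target tumour dose achieved by the reference 90Y administration
target_dose_gy = 262.9          # Gy, from 1.82 GBq of 90Y

# Mean tumour dose per unit administered activity (Gy/GBq).  The 90Y
# value follows from the abstract (262.9 Gy / 1.82 GBq); the others
# are back-derived from the reported equivalent activities.
dose_per_gbq = {
    "90Y":   262.9 / 1.82,
    "153Sm": 31.60,
    "166Ho": 45.09,
    "177Lu": 59.21,
}

# Absorbed dose scales linearly with activity, so the activity needed
# to match the 90Y tumour dose is target dose / (dose per GBq).
required = {rn: target_dose_gy / d for rn, d in dose_per_gbq.items()}
for rn, a in required.items():
    print(f"{rn}: {a:.2f} GBq")
```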

  8. Calculation of direct effects of 60Co gamma rays on the different DNA structural levels: A simulation study using the Geant4-DNA toolkit

    NASA Astrophysics Data System (ADS)

    Tajik, Marjan; Rozatian, Amir S. H.; Semsarha, Farid

    2015-03-01

    In this study, simple single strand breaks (SSB) and double strand breaks (DSB) due to direct effects of the secondary electron spectrum of 60Co gamma rays on different organizational levels of a volume model of the B-DNA conformation have been calculated using the Geant4-DNA toolkit. The result of this study for the direct DSB yield shows good agreement with other theoretical and experimental results obtained with both photons and their secondary electrons; in the case of SSB, however, a noticeable difference can be observed. Moreover, given the almost constant yields of direct strand breaks across the different structural levels of the DNA calculated in this work and compared with some theoretical studies, it can be deduced that direct strand break yields depend mainly on the primary double helix structure of the DNA, and that higher-order structures have no noticeable effect on direct DNA damage induction by 60Co gamma rays. In contrast, a direct dependency between the direct SSB and DSB yields and the volume of the DNA structure has been found. A further study of the histone proteins showed that they can play an important role in trapping low energy electrons without any significant effect on direct DNA strand break induction, at least in the range of energies used in the current study.

  9. Geant4 simulation of clinical proton and carbon ion beams for the treatment of ocular melanomas with the full 3-D pencil beam scanning system

    SciTech Connect

    Farina, Edoardo; Riccardi, Cristina; Rimoldi, Adele; Tamborini, Aurora; Piersimoni, Pierluigi; Ciocca, Mario

    2015-07-01

    This work investigates the possibility of using carbon ion beams, delivered with active scanning modality, for the treatment of ocular melanomas at the Centro Nazionale di Adroterapia Oncologica (CNAO) in Pavia. Radiotherapy with carbon ions offers several advantages over radiotherapy with protons or photons, such as a higher relative radio-biological effectiveness (RBE) and a dose release better localized to the tumor. The Monte Carlo (MC) Geant4 10.00 patch-03 toolkit is used to reproduce the complete CNAO extraction beam line, including all the active and passive components characterizing it. The simulation of proton and carbon ion beams and of the scanned radiation field is validated against CNAO experimental data. For the irradiation study of the ocular melanoma, an eye-detector representing a model of a human eye is implemented in the simulation. Each element of the eye is reproduced with its chemical and physical properties. Inside the eye-detector a realistic tumor volume is placed and used as the irradiation target. A comparison between proton and carbon ion eye irradiations makes it possible to study the treatment benefits of using carbon ions instead of protons. (authors)

  10. A Geant4-based Simulation to Evaluate the Feasibility of Using Nuclear Resonance Fluorescence (NRF) in Determining Atomic Compositions of Body Tissue in Cancer Diagnostics and Irradiation

    NASA Astrophysics Data System (ADS)

    Gilbo, Yekaterina; Wijesooriya, Krishni; Liyanage, Nilanga

    2017-01-01

    Customarily applied in homeland security for identifying concealed explosives and chemical weapons, NRF (Nuclear Resonance Fluorescence) may have high potential in determining atomic compositions of body tissue. High energy photons incident on a target excite the target nuclei causing characteristic re-emission of resonance photons. As the nuclei of each isotope have well-defined excitation energies, NRF uniquely indicates the isotopic content of the target. NRF radiation corresponding to nuclear isotopes present in the human body is emitted during radiotherapy based on Bremsstrahlung photons generated in a linear electron accelerator. We have developed a Geant4 simulation in order to help assess NRF capabilities in detecting, mapping, and characterizing tumors. We have imported a digital phantom into the simulation using anatomical data linked to known chemical compositions of various tissues. Work is ongoing to implement the University of Virginia's cancer center treatment setup and patient geometry, and to collect and analyze the simulation's physics quantities to evaluate the potential of NRF for medical imaging applications. Preliminary results will be presented.

  11. Systems biology driven software design for the research enterprise

    PubMed Central

    Boyle, John; Cavnor, Christopher; Killcoyne, Sarah; Shmulevich, Ilya

    2008-01-01

    Background In systems biology, and many other areas of research, there is a need for the interoperability of tools and data sources that were not originally designed to be integrated. Due to the interdisciplinary nature of systems biology, and its association with high throughput experimental platforms, there is an additional need to continually integrate new technologies. As scientists work in isolated groups, integration with other groups is rarely a consideration when building the required software tools. Results We illustrate an approach, through the discussion of a purpose built software architecture, which allows disparate groups to reuse tools and access data sources in a common manner. The architecture allows for: the rapid development of distributed applications; interoperability, so it can be used by a wide variety of developers and computational biologists; development using standard tools, so that it is easy to maintain and does not require a large development effort; extensibility, so that new technologies and data types can be incorporated; and non-intrusive development, insofar as researchers need not adhere to a pre-existing object model. Conclusion By using a relatively simple integration strategy, based upon a common identity system and dynamically discovered interoperable services, a light-weight software architecture can become the focal point through which scientists can both access and analyse the plethora of experimentally derived data. PMID:18578887

  12. Software Package Completed for Alloy Design at the Atomic Level

    NASA Technical Reports Server (NTRS)

    Bozzolo, Guillermo H.; Noebe, Ronald D.; Abel, Phillip B.; Good, Brian S.

    2001-01-01

    As a result of a multidisciplinary effort involving solid-state physics, quantum mechanics, and materials and surface science, the first version of a software package dedicated to the atomistic analysis of multicomponent systems was recently completed. Based on the BFS (Bozzolo, Ferrante, and Smith) method for the calculation of alloy and surface energetics, this package includes modules devoted to the analysis of many essential features that characterize any given alloy or surface system, including (1) surface structure analysis, (2) surface segregation, (3) surface alloying, (4) bulk crystalline material properties and atomic defect structures, and (5) thermal processes that allow us to perform phase diagram calculations. All the modules of this Alloy Design Workbench 1.0 (ADW 1.0) are designed to run in PC and workstation environments, and their operation and performance are substantially linked to the needs of the user and the specific application.

  13. De novo gene synthesis design using TmPrime software.

    PubMed

    Li, Mo-Huang; Bode, Marcus; Huang, Mo Chao; Cheong, Wai Chye; Lim, Li Shi

    2012-01-01

    This chapter presents TmPrime, a computer program to design oligonucleotides for both ligase chain reaction (LCR)- and polymerase chain reaction (PCR)-based de novo gene synthesis. The program divides a long input DNA sequence based on user-specified melting temperatures and assembly conditions, and dynamically optimizes the length of oligonucleotides to achieve homologous melting temperatures. The output reports the melting temperatures, oligonucleotide sequences, and potential formation of secondary structures in a PDF file, which is sent to the user via e-mail. The program also provides functions for sequence pooling, to separate long genes into smaller pieces for multipool assembly, and for codon optimization for expression based on the highest organism-specific codon frequency. This software has been successfully used in the design and synthesis of various genes with total length >20 kbp. This program is freely available at http://prime.ibn.a-star.edu.sg.
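
    The core idea, cutting a long sequence into oligonucleotides of homogeneous melting temperature, can be illustrated with a far cruder Tm estimate than TmPrime's thermodynamic model. A hypothetical Python sketch using the Wallace rule; the greedy splitter and the toy sequence are for illustration only:

```python
def wallace_tm(seq):
    """Rough melting temperature by the Wallace rule: 2 degC per A/T
    and 4 degC per G/C.  Much cruder than TmPrime's model, but enough
    to show the homogeneous-Tm splitting idea."""
    s = seq.upper()
    return 2 * (s.count("A") + s.count("T")) + 4 * (s.count("G") + s.count("C"))

def split_by_tm(gene, target_tm):
    """Greedily cut a long sequence into oligos, each extended until
    its Tm first reaches the user-specified target."""
    oligos, i = [], 0
    while i < len(gene):
        j = i + 1
        while j < len(gene) and wallace_tm(gene[i:j]) < target_tm:
            j += 1
        oligos.append(gene[i:j])
        i = j
    return oligos

gene = "ATGCGC" * 20                 # toy 120-bp input sequence
parts = split_by_tm(gene, 40)
print(len(parts), wallace_tm(parts[0]))
```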

  14. Trainers and Software Designers: The Case for Togetherness.

    ERIC Educational Resources Information Center

    Lippincott, Jenifer

    1998-01-01

    Offers three strategies that will make the job of training employees to use new software easier: (1) understand the business need that the software is addressing; (2) synchronize the development of training and support materials with the software development cycle; and (3) choose the appropriate training approach for the software application.…

  15. Learning & Personality Types: A Case Study of a Software Design Course

    ERIC Educational Resources Information Center

    Ahmed, Faheem; Campbell, Piers; Jaffar, Ahmad; Alkobaisi, Shayma; Campbell, Julie

    2010-01-01

    The software industry has continued to grow over the past decade and there is now a need to provide education and hands-on training to students in various phases of software life cycle. Software design is one of the vital phases of the software development cycle. Psychological theories assert that not everybody is fit for all kind of tasks as…

  16. Pedagogy Embedded in Educational Software Design: Report of a Case Study.

    ERIC Educational Resources Information Center

    Hinostroza, J. Enrique; Mellar, Harvey

    2001-01-01

    Discussion of educational software focuses on a model of educational software that was derived from a case study of two elementary school teachers participating in a software design process. Considers human-computer interface, interaction, software browsing strategies, and implications for teacher training. (Author/LRW)

  17. Validation of GEANT4 simulations for percentage depth dose calculations in heterogeneous media by using small photon beams from the 6-MV Cyberknife: Comparison with photon beam dosimetry with EBT2 film

    NASA Astrophysics Data System (ADS)

    Lee, Chung Il; Yoon, Sei-Chul; Shin, Jae Won; Hong, Seung-Woo; Suh, Tae Suk; Min, Kyung Joo; Lee, Sang Deok; Chung, Su Mi; Jung, Jae-Yong

    2015-04-01

    Percentage depth dose (PDD) distributions in heterogeneous phantoms with lung and soft bone equivalent media are studied by using the GEANT4 Monte Carlo code. For lung equivalent media, Balsa wood is used, and for soft bone equivalent media, a compound material with epoxy resin, hardener and calcium carbonate is used. Polystyrene slabs put together with these materials are used as a heterogeneous phantom. Dose measurements are performed with Gafchromic EBT2 film by using photon beams from the 6-MV CyberKnife at the Seoul Uridul Hospital. The cone sizes of the photon beams are varied from 5 to 10 to 30 mm. When the Balsa wood is inserted in the phantom, the dose measured with EBT2 film is found to be significantly different from the dose without the EBT2 film in and beyond the Balsa wood region, particularly for small field sizes. On the other hand, when the soft bone equivalent material is inserted in the phantom, the discrepancy between the dose measured with EBT2 film and the dose without EBT2 film can be seen only in the region of the soft bone equivalent material. GEANT4 simulations are done with and without the EBT2 film to compare the simulation results with measurements. The GEANT4 simulations including EBT2 film are found to agree well with the measurements for all the cases within an error of 2.2%. The results of the present study show that GEANT4 gives reasonable results for the PDD calculations in heterogeneous media when using photon beams produced by the 6-MV CyberKnife.

  18. Software.

    ERIC Educational Resources Information Center

    Journal of Chemical Education, 1989

    1989-01-01

    Presented are reviews of two computer software packages for Apple II computers; "Organic Spectroscopy," and "Videodisc Display Program" for use with "The Periodic Table Videodisc." A sample spectrograph from "Organic Spectroscopy" is included. (CW)

  19. A software tool to design thermal barrier coatings

    NASA Technical Reports Server (NTRS)

    Petrus, G.; Ferguson, B. L.

    1995-01-01

    This paper summarizes work completed for a NASA Phase 1 SBIR program which demonstrated the feasibility of developing a software tool to aid in the design of thermal barrier coating (TBC) systems. Toward this goal, three tasks were undertaken and completed. Task 1 involved the development of a database containing the pertinent thermal and mechanical property data for the top coat, bond coat and substrate materials that comprise a TBC system. Task 2 involved the development of an automated set-up program for generating two-dimensional (2D) finite element models of TBC systems. Most importantly, Task 3 involved the generation of a rule base to aid in the design of a TBC system. These rules were based on a factorial design of experiments involving FEM results, and were generated using a Yates analysis. A previous study has indicated the suitability and benefit of applying finite element analysis to perform computer-based experiments to decrease but not eliminate physical experiments on TBCs. This program proved feasibility by expanding on these findings by developing a larger knowledge base and developing a procedure to extract rules to aid in TBC design.
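
    The Yates analysis mentioned above is the classic algorithm for extracting factor and interaction effects from a full 2^k factorial experiment: k passes of pairwise sums and differences over the responses in standard order. A sketch with invented response values (not the paper's FEM data):

```python
def yates(responses):
    """Yates' algorithm for a full 2^k factorial design.  Input:
    responses in standard (Yates) order; output: the column of
    contrasts.  Dividing the first contrast by 2^k gives the grand
    mean; dividing the rest by 2^(k-1) gives the effects."""
    y = list(responses)
    k = len(y).bit_length() - 1
    assert len(y) == 2 ** k, "length must be a power of two"
    for _ in range(k):
        sums = [y[i] + y[i + 1] for i in range(0, len(y), 2)]
        diffs = [y[i + 1] - y[i] for i in range(0, len(y), 2)]
        y = sums + diffs
    return y

# 2^2 example, invented responses in standard order: (1), a, b, ab
contrasts = yates([10, 14, 12, 18])
k = 2
mean = contrasts[0] / 2 ** k
effects = [c / 2 ** (k - 1) for c in contrasts[1:]]   # A, B, AB
print(mean, effects)   # 13.5 [5.0, 3.0, 1.0]
```

    Thresholding the resulting effects is one simple way to turn a designed experiment into if-then design rules of the kind the abstract describes.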

  20. A software tool to design thermal barrier coatings

    NASA Technical Reports Server (NTRS)

    Petrus, Gregory; Ferguson, B. Lynn

    1995-01-01

    This paper summarizes work completed for a NASA Phase 1 SBIR program which demonstrated the feasibility of developing a software tool to aid in the design of thermal barrier coating (TBC) systems. Toward this goal, three tasks were undertaken and completed. Task 1 involved the development of a database containing the pertinent thermal and mechanical property data for the top coat, bond coat and substrate materials that comprise a TBC system. Task 2 involved the development of an automated set-up program for generating two-dimensional (2D) finite element models of TBC systems. Most importantly, task 3 involved the generation of a rule base to aid in the design of a TBC system. These rules were based on a factorial design of experiments involving FEM results and were generated using a Yates analysis. A previous study had indicated the suitability and benefit of applying finite element analysis to perform computer-based experiments to decrease but not eliminate physical experiments on TBCs. This program proved feasibility by expanding on these findings by developing a larger knowledge base and developing a procedure to extract rules to aid in TBC design.

  1. Reducing the complexity of the software design process with object-oriented design

    NASA Technical Reports Server (NTRS)

    Schuler, M. P.

    1991-01-01

    Designing software is a complex process. How object-oriented design (OOD), coupled with formalized documentation and tailored object diagraming techniques, can reduce the complexity of the software design process is described and illustrated. The described OOD methodology uses a hierarchical decomposition approach in which parent objects are decomposed into layers of lower level child objects. A method of tracking the assignment of requirements to design components is also included. Increases in the reusability, portability, and maintainability of the resulting products are also discussed. This method was built on a combination of existing technology, teaching experience, consulting experience, and feedback from design method users. The discussed concepts are applicable to hierarchical OOD processes in general. Emphasis is placed on improving the design process by documenting the details of the procedures involved and incorporating improvements into those procedures as they are developed.

  2. Efficiency corrections in determining the (137)Cs inventory of environmental soil samples by using relative measurement method and GEANT4 simulations.

    PubMed

    Li, Gang; Liang, Yongfei; Xu, Jiayun; Bai, Lixin

    2015-08-01

    The determination of (137)Cs inventory is widely used to estimate the soil erosion or deposition rate. The generally used method to determine the activity of volumetric samples is the relative measurement method, which employs a calibration standard sample with accurately known activity. This method has great advantages in accuracy and operation only when there is a small difference in elemental composition, sample density and geometry between the measured samples and the calibration standard. Otherwise it needs additional efficiency corrections in the calculating process. Monte Carlo simulations can handle these correction problems easily, with lower financial cost and higher accuracy. This work presents a detailed description of the simulation and calibration procedure for a conventionally used commercial P-type coaxial HPGe detector with cylindrical sample geometry. The effects of sample elemental composition, density and geometry were discussed in detail and calculated in terms of efficiency correction factors. The effect of sample placement was also analyzed; the results indicate that the radioactive nuclides and sample density are not absolutely uniformly distributed along the axial direction. Finally, a unified binary quadratic functional relationship for the efficiency correction factors as a function of sample density and height was obtained by the least squares fitting method. This function covers sample density and height ranges of 0.8-1.8 g/cm(3) and 3.0-7.25 cm, respectively. The efficiency correction factors calculated by the fitted function are in good agreement with those obtained by the GEANT4 simulations, with the determination coefficient value greater than 0.9999. The results obtained in this paper make the above-mentioned relative measurements more accurate and efficient in the routine radioactive analysis of environmental cylindrical soil samples.
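
    The fitting step described above can be reproduced in outline: build the design matrix of a binary quadratic in density and height and solve by least squares. Only the functional form and the density/height ranges below come from the abstract; the coefficients and sample grid are invented for illustration.

```python
import numpy as np

# Hypothetical correction-factor samples over the quoted ranges
# (0.8-1.8 g/cm^3, 3.0-7.25 cm); the "true" coefficients are invented.
rho = np.linspace(0.8, 1.8, 11)
h = np.linspace(3.0, 7.25, 9)
R, H = np.meshgrid(rho, h)
true = [1.02, -0.10, 0.03, 0.02, -0.002, 0.005]   # c0..c5, hypothetical

def quad(c, r, z):
    """Binary quadratic f(rho, h) = c0 + c1*rho + c2*h
                                  + c3*rho^2 + c4*h^2 + c5*rho*h."""
    return c[0] + c[1]*r + c[2]*z + c[3]*r**2 + c[4]*z**2 + c[5]*r*z

F = quad(true, R, H)                  # synthetic correction factors

# Least-squares fit of the six coefficients
A = np.column_stack([np.ones(R.size), R.ravel(), H.ravel(),
                     R.ravel()**2, H.ravel()**2, (R * H).ravel()])
coef, *_ = np.linalg.lstsq(A, F.ravel(), rcond=None)

resid = F.ravel() - A @ coef
r2 = 1 - resid.var() / F.ravel().var()   # determination coefficient
print(round(r2, 6))
```

    On noise-free synthetic data the fit recovers the coefficients essentially exactly; on real simulated efficiencies the residuals would reflect Monte Carlo statistics, which is where the paper's R² > 0.9999 criterion applies.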

  3. SU-E-T-203: Comparison of a Commercial MRI-Linear Accelerator Based Monte Carlo Dose Calculation Algorithm and Geant4

    SciTech Connect

    Ahmad, S; Sarfehnia, A; Paudel, M; Sahgal, A; Keller, B; Hissoiny, S

    2015-06-15

    Purpose: An MRI-linear accelerator is currently being developed by the vendor Elekta™. The treatment planning system that will be used to model dose for this unit uses a Monte Carlo dose calculation algorithm, GPUMCD, that allows for the application of a magnetic field. We tested this radiation transport code against the independent Monte Carlo toolkit Geant4 (v.4.10.01), both with and without the magnetic field applied. Methods: The setup comprised a 6 MeV mono-energetic photon beam emerging from a point source impinging on a homogeneous water phantom at 100 cm SSD. The comparisons were drawn from the percentage depth doses (PDD) for three different field sizes (1.5 x 1.5 cm{sup 2}, 5 x 5 cm{sup 2}, 10 x 10 cm{sup 2}) and dose profiles at various depths. A 1.5 T magnetic field was applied perpendicular to the direction of the beam. The transport thresholds were kept the same for both codes. Results: All of the normalized PDDs and profiles agreed within ±1%. In the presence of the magnetic field, PDDs rise more quickly, reducing the depth of maximum dose. Near the beam exit point in the phantom a hot spot is created due to the electron return effect. This effect is more pronounced for the larger field sizes. Profiles selected parallel to the external field show no effect; however, the ones selected perpendicular to the direction of the applied magnetic field are shifted towards the direction of the Lorentz force applied by the magnetic field on the secondary electrons. It is observed that these profiles are not symmetric, which indicates a lateral build-up of the dose. Conclusion: There is good general agreement between the PDDs/profiles calculated by both algorithms thus far. We are proceeding towards clinically relevant comparisons in a heterogeneous phantom for polyenergetic beams. Funding for this work has been provided by Elekta.

  4. Comparative study of dose distributions and cell survival fractions for 1H, 4He, 12C and 16O beams using Geant4 and Microdosimetric Kinetic model

    NASA Astrophysics Data System (ADS)

    Burigo, Lucas; Pshenichnov, Igor; Mishustin, Igor; Bleicher, Marcus

    2015-04-01

    Depth and radial dose profiles for therapeutic 1H, 4He, 12C and 16O beams are calculated using the Geant4-based Monte Carlo model for Heavy-Ion Therapy (MCHIT). 4He and 16O ions are presented as alternative options to 1H and 12C broadly used for ion-beam cancer therapy. Biological dose profiles and survival fractions of cells are estimated using the modified Microdosimetric Kinetic model. Depth distributions of cell survival of healthy tissues, assuming 10% and 50% survival of tumor cells, are calculated for 6 cm SOBPs at two tumor depths and for different tissues radiosensitivities. It is found that the optimal ion choice depends on (i) depth of the tumor, (ii) dose levels and (iii) the contrast of radiosensitivities of tumor and surrounding healthy tissues. Our results indicate that 12C and 16O ions are more appropriate to spare healthy tissues in the case of a more radioresistant tumor at moderate depths. On the other hand, a sensitive tumor surrounded by more resistant tissues can be better treated with 1H and 4He ions. In general, 4He beam is found to be a good candidate for therapy. It better spares healthy tissues in all considered cases compared to 1H. Besides, the dose conformation is improved for deep-seated tumors compared to 1H, and the damage to surrounding healthy tissues is reduced compared to heavier ions due to the lower impact of nuclear fragmentation. No definite advantages of 16O with respect to 12C ions are found in this study.
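
    The survival-fraction endpoint of such calculations is the linear-quadratic (LQ) relation, whose parameters the Microdosimetric Kinetic model supplies from microdosimetric spectra per ion, energy, and tissue. A minimal sketch of the LQ step only, with generic parameters that are not the paper's values:

```python
import math

def survival(dose_gy, alpha, beta):
    """Linear-quadratic cell survival: S = exp(-(alpha*D + beta*D^2))."""
    return math.exp(-(alpha * dose_gy + beta * dose_gy ** 2))

def dose_for_survival(target_s, alpha, beta):
    """Invert the LQ relation: solve beta*D^2 + alpha*D + ln(S) = 0
    for the positive root, i.e. the dose giving a target survival."""
    c = math.log(target_s)           # negative for S < 1
    return (-alpha + math.sqrt(alpha * alpha - 4 * beta * c)) / (2 * beta)

# Illustrative LQ parameters (Gy^-1, Gy^-2) -- invented, not the MK
# model's output for any of the beams in the paper.
a, b = 0.18, 0.028
d10 = dose_for_survival(0.10, a, b)   # dose for 10% survival
print(round(d10, 2))
```

    The paper's 10% and 50% tumor-survival scenarios correspond to evaluating exactly this inversion with MK-derived parameters, then reading off the collateral survival of healthy tissue along the depth profile.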

  5. Benchmarking the Geant4 full system simulation of an associated alpha-particle detector for use in a D-T neutron generator.

    PubMed

    Zhang, Xiaodong; Hayward, Jason P; Cates, Joshua W; Hausladen, Paul A; Laubach, Mitchell A; Sparger, Johnathan E; Donnald, Samuel B

    2012-08-01

    The position-sensitive alpha-particle detector used to provide the starting time and initial direction of D-T neutrons in a fast-neutron imaging system was simulated with a Geant4-based Monte Carlo program. The whole detector system, which consists of a YAP:Ce scintillator, a fiber-optic faceplate, a light guide, and a position-sensitive photo-multiplier tube (PSPMT), was modeled, starting with incident D-T alphas. The scintillation photons, whose starting time follows the distribution of a scintillation decay curve, were produced and emitted uniformly into a solid angle of 4π along the track segments of the alpha and its secondaries. Through tracking all photons and taking into account the quantum efficiency of the photocathode, the number of photoelectrons and their time and position distributions were obtained. Using a four-corner data reconstruction formula, the flood images of the alpha detector with and without optical grease between the YAP scintillator and the fiber-optic faceplate were obtained, which show agreement with the experimental results. The reconstructed position uncertainties of incident alpha particles for the two cases are 1.198 mm and 0.998 mm, respectively, across the sensitive area of the detector. Simulation results also show that, compared with other faceplates composed of 500 μm, 300 μm, and 100 μm fibers, the 10-μm-fiber faceplate is the best choice for building a detector with better position performance. In addition, the study of the background originating inside the D-T generator suggests that for 500-μm-thick YAP:Ce coated with 1-μm-thick aluminum, a very good signal-to-noise ratio can be expected through application of a simple threshold.
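
    Four-corner reconstruction for position-sensitive PMTs is commonly written in the Anger-logic form: normalized differences of corner-anode charge sums. The paper does not give its exact formula or corner labelling, so the sketch below is an assumption using one common convention:

```python
def four_corner_position(qa, qb, qc, qd):
    """Anger-logic style position estimate from the charges on a
    PSPMT's four corner anodes.  Convention assumed here: A and B on
    the top, B and D on the right; the result lies in [-1, 1] in each
    coordinate and must still be mapped to physical detector units."""
    total = qa + qb + qc + qd
    x = ((qb + qd) - (qa + qc)) / total
    y = ((qa + qb) - (qc + qd)) / total
    return x, y

# Equal charge on all four corners -> event in the centre
print(four_corner_position(1.0, 1.0, 1.0, 1.0))
```

    Applying this event by event to the simulated photoelectron charge distributions is what produces the flood images the abstract compares against measurement.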

  6. SU-E-T-290: Secondary Dose Monitoring Using Scintillating Fibers in Proton Therapy of Prostate Cancer: A Geant4 Monte Carlo Simulation

    SciTech Connect

    Tesfamicael, B; Gueye, P; Lyons, D; Avery, S; Mahesh, M

    2014-06-01

    Purpose: To monitor the secondary dose distribution originating from a water phantom during proton therapy of prostate cancer using scintillating fibers. Methods: The Geant4 Monte Carlo toolkit version 9.6.p02 was used to simulate prostate cancer proton therapy based treatments. Two cases were studied. In the first case, 8 × 8 = 64 equally spaced fibers inside three 4 × 4 × 2.54 cm{sup 3} DuPont™ Delrin blocks were used to monitor the emission of secondary particles in the transverse (left and right) and distal regions relative to the beam direction. In the second case, a scintillating block with a thickness of 2.54 cm and equal vertical and longitudinal dimensions as the water phantom was used. Geometrical cuts were used to extract the energy deposited in each fiber and the scintillating block. Results: The transverse dose distributions from secondary particles in both cases agree within <5% and with a very good symmetry. The energy deposited not only gradually increases as one moves from the peripheral row fibers towards the center of the block (aligned with the center of the prostate) but also decreases as one goes from the frontal to distal region of the block. The ratio of the doses from the prostate to the ones in the middle two rows of fibers showed a linear relationship with a slope of (−3.55±2.26) × 10{sup −5} MeV per treatment Gy. The distal detectors recorded a very small energy deposited due to water attenuation. Conclusion: With a good calibration and the ability to define a good correlation between the dose to the external fibers and the prostate, such fibers can be used for real-time dose verification to the target.

  7. SU-E-T-519: Emission of Secondary Particles From a PMMA Phantom During Proton Irradiation: A Simulation Study with the Geant4 Monte Carlo Toolkit

    SciTech Connect

    Lau, A; Chen, Y; Ahmad, S

    2014-06-01

    Purpose: Proton therapy exhibits several advantages over photon therapy due to depth-dose distributions from proton interactions within the target material. However, uncertainties associated with the proton beam range in the patient limit the advantage of proton therapy applications. To quantify beam range, positron-emitting nuclei (PEN) and prompt gamma (PG) techniques have been developed. These techniques use de-excitation photons to describe the location of the beam in the patient. To develop a detector system for implementing the PG technique for range verification applications in proton therapy, we studied the yields, energy and angular distributions of the secondary particles emitted from a PMMA phantom. Methods: Proton pencil beams of various energies incident onto a PMMA phantom with dimensions of 5 x 5 x 50 cm3 were used for simulation with the Geant4 toolkit using the standard electromagnetic packages as well as the packages based on the binary-cascade nuclear model. The emitted secondary particles were analyzed. Results: For 160 MeV incident protons, the yields of secondary neutrons and photons per 100 incident protons were ~6 and ~15, respectively. The secondary photon energy spectrum showed several energy peaks in the range between 0 and 10 MeV. The energy peaks located between 4 and 6 MeV were attributed to direct proton interactions with 12C (~4.4 MeV) and 16O (~6 MeV), respectively. Most of the escaping secondary neutrons were found to have energies between 10 and 100 MeV. Isotropic emissions were found for lower energy neutrons (<10 MeV) and for photons of all energies, while higher energy neutrons were emitted predominantly in the forward direction. The yields of emitted photons and neutrons increased with increasing incident proton energy. Conclusions: A detector system is currently being developed incorporating the yields, energy and angular distributions of secondary particles from proton interactions obtained from this study.

  8. Comparative study of dose distributions and cell survival fractions for 1H, 4He, 12C and 16O beams using Geant4 and Microdosimetric Kinetic model.

    PubMed

    Burigo, Lucas; Pshenichnov, Igor; Mishustin, Igor; Bleicher, Marcus

    2015-04-21

    Depth and radial dose profiles for therapeutic (1)H, (4)He, (12)C and (16)O beams are calculated using the Geant4-based Monte Carlo model for Heavy-Ion Therapy (MCHIT). (4)He and (16)O ions are presented as alternative options to the (1)H and (12)C broadly used for ion-beam cancer therapy. Biological dose profiles and survival fractions of cells are estimated using the modified Microdosimetric Kinetic model. Depth distributions of cell survival of healthy tissues, assuming 10% and 50% survival of tumor cells, are calculated for 6 cm SOBPs at two tumor depths and for different tissue radiosensitivities. It is found that the optimal ion choice depends on (i) the depth of the tumor, (ii) dose levels and (iii) the contrast in radiosensitivity between the tumor and surrounding healthy tissues. Our results indicate that (12)C and (16)O ions are more appropriate to spare healthy tissues in the case of a more radioresistant tumor at moderate depths. On the other hand, a sensitive tumor surrounded by more resistant tissues can be better treated with (1)H and (4)He ions. In general, the (4)He beam is found to be a good candidate for therapy: it spares healthy tissues better than (1)H in all considered cases, its dose conformation for deep-seated tumors is improved compared to (1)H, and the damage to surrounding healthy tissues is reduced compared to heavier ions due to the lower impact of nuclear fragmentation. No definite advantages of (16)O with respect to (12)C ions are found in this study.
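The Microdosimetric Kinetic model ultimately expresses cell survival through the linear-quadratic form S = exp(-alpha*D - beta*D^2), with the radiation-quality dependence entering through alpha. A minimal sketch of that survival/dose relationship, with illustrative alpha and beta values that are not taken from the paper:

```python
import math

def survival_fraction(dose_gy: float, alpha: float, beta: float) -> float:
    """Linear-quadratic cell survival S = exp(-alpha*D - beta*D^2).
    In the MK model, alpha is predicted from microdosimetric spectra;
    here alpha and beta are plain inputs."""
    return math.exp(-alpha * dose_gy - beta * dose_gy ** 2)

def dose_for_survival(target_s: float, alpha: float, beta: float) -> float:
    """Invert the LQ curve: solve beta*D^2 + alpha*D + ln(S) = 0 for D > 0."""
    ln_s = math.log(target_s)
    if beta == 0:
        return -ln_s / alpha
    return (-alpha + math.sqrt(alpha ** 2 - 4 * beta * ln_s)) / (2 * beta)

# Illustrative photon-like LQ parameters (assumed, not from the paper):
alpha, beta = 0.15, 0.05  # Gy^-1, Gy^-2
d10 = dose_for_survival(0.10, alpha, beta)  # dose giving 10% survival
d50 = dose_for_survival(0.50, alpha, beta)  # dose giving 50% survival
print(f"D(10%) ~ {d10:.2f} Gy, D(50%) ~ {d50:.2f} Gy")
```

The 10% and 50% tumor-survival levels used in the paper correspond to fixed target doses in this picture; the study's comparison then asks what those doses do to the surrounding healthy tissue for each ion species.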

  9. The investigation of prostatic calcifications using μ-PIXE analysis and their dosimetric effect in low dose rate brachytherapy treatments using Geant4.

    PubMed

    Pope, D J; Cutajar, D L; George, S P; Guatelli, S; Bucci, J A; Enari, K E; Miller, S; Siegele, R; Rosenfeld, A B

    2015-06-07

    Low dose rate brachytherapy is a widely used modality for the treatment of prostate cancer. Most clinical treatment planning systems currently in use approximate all tissue to water, neglecting the existence of inhomogeneities, such as calcifications. The presence of prostatic calcifications may perturb the dose due to the higher photoelectric effect cross section in comparison to water. This study quantitatively evaluates the effect of prostatic calcifications on the dosimetric outcome of brachytherapy treatments by means of Monte Carlo simulations and its potential clinical consequences. Four pathological calcification samples were characterised with micro-particle induced x-ray emission (μ-PIXE) to determine their heavy elemental composition. Calcium, phosphorus and zinc were found to be the predominant heavy elements in the calcification composition. Four clinical patient brachytherapy treatments were modelled using Geant4 based Monte Carlo simulations, in terms of the distribution of brachytherapy seeds and calcifications in the prostate. Dose reductions were observed to be up to 30% locally to the calcification boundary, calcification size dependent. Single large calcifications and closely placed calculi caused local dose reductions of between 30-60%. Individual calculi smaller than 0.5 mm in diameter showed minimal dosimetric impact, however, the effects of small or diffuse calcifications within the prostatic tissue could not be determined using the methods employed in the study. The simulation study showed a varying reduction on common dosimetric parameters. D90 showed a reduction of 2-5%, regardless of calcification surface area and volume. The parameters V100, V150 and V200 were also reduced by as much as 3% and on average by 1%. These reductions were also found to relate to the surface area and volume of calcifications, which may have a significant dosimetric impact on brachytherapy treatment, however, such impacts depend strongly on specific factors

  10. Technical Note: Implementation of biological washout processes within GATE/GEANT4—A Monte Carlo study in the case of carbon therapy treatments

    SciTech Connect

    Martínez-Rovira, I.; Jouvie, C.; Jan, S.

    2015-04-15

    Purpose: The imaging of positron emitting isotopes produced during patient irradiation is the only in vivo method used for hadrontherapy dose monitoring in clinics nowadays. However, the accuracy of this method is limited by the loss of signal due to the metabolic decay processes (biological washout). In this work, a generic modeling of washout was incorporated into the GATE simulation platform. Additionally, the influence of the washout on the β{sup +} activity distributions in terms of absolute quantification and spatial distribution was studied. Methods: First, the irradiation of a human head phantom with a {sup 12}C beam, so that a homogeneous dose distribution was achieved in the tumor, was simulated. The generated {sup 11}C and {sup 15}O distribution maps were used as β{sup +} sources in a second simulation, where the PET scanner was modeled following a detailed Monte Carlo approach. The activity distributions obtained in the presence and absence of washout processes for several clinical situations were compared. Results: Results show that activity values are highly reduced (by a factor of 2) in the presence of washout. These processes have a significant influence on the shape of the PET distributions. Differences in the distal activity falloff position of 4 mm are observed for a tumor dose deposition of 1 Gy (T{sub ini} = 0 min). However, in the case of high doses (3 Gy), the washout processes do not have a large effect on the position of the distal activity falloff (differences lower than 1 mm). The important role of the tumor washout parameters on the activity quantification was also evaluated. Conclusions: With this implementation, GATE/GEANT4 is the only open-source code able to simulate the full chain from the hadrontherapy irradiation to the PET dose monitoring including biological effects. Results show the strong impact of the washout processes, indicating that the development of better models and measurement of biological washout data are
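The abstract does not spell out the washout model that was implemented. A common generic form in the PET range-monitoring literature multiplies the physical decay by a multi-component exponential clearance factor; the sketch below uses that form with two components. The component fractions and clearance half-lives are illustrative assumptions, not the paper's fitted values; only the 20.4 min physical half-life of 11C is a known constant.

```python
import math

LN2 = math.log(2)

def washout_factor(t_min, mf=0.35, tf_min=2.0, ts_min=140.0):
    """Two-component biological clearance factor
    W(t) = Mf*exp(-ln2*t/Tf) + Ms*exp(-ln2*t/Ts), with Mf + Ms = 1.
    Fractions and half-lives here are illustrative, not fitted values."""
    ms = 1.0 - mf
    return mf * math.exp(-LN2 * t_min / tf_min) + ms * math.exp(-LN2 * t_min / ts_min)

def activity(t_min, a0=1.0, half_life_min=20.4, biological=True):
    """Remaining 11C activity at time t, with or without washout
    (physical half-life of 11C: 20.4 min)."""
    phys = a0 * math.exp(-LN2 * t_min / half_life_min)
    return phys * washout_factor(t_min) if biological else phys

for t in (5, 10, 20):
    print(t, activity(t, biological=False), activity(t))
```

This kind of factorized model makes the abstract's observation plausible: the washout term suppresses the absolute activity substantially while its spatial dependence (through tissue-specific parameters) reshapes the PET distribution.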

  11. OpenPET Hardware, Firmware, Software, and Board Design Files

    SciTech Connect

    Abu-Nimeh, Faisal; Choong, Woon-Seng; Moses, William W.; Peng, Qiyu

    2016-03-29

    OpenPET is an open source, flexible, high-performance, and modular data acquisition system for a variety of applications. The OpenPET electronics are capable of reading analog voltage or current signals from a wide variety of sensors. The electronics boards make extensive use of field programmable gate arrays (FPGAs) to provide flexibility and scalability. Firmware and software for the FPGAs and computer are used to control and acquire data from the system. The command and control flow is similar to the data flow; however, commands are initiated from the computer and propagate down a tree topology (i.e., from top to bottom). Each node in the tree discovers its parent and children, and all addresses are configured accordingly. A user (or a script) initiates a command from the computer. This command is translated and encoded for the corresponding child (e.g., SB, MB, DB, etc.). Each node then passes the command on to its corresponding child(ren) by looking at the destination address. Finally, once the command reaches its desired destination(s), the corresponding node(s) execute(s) the command and send(s) a reply, if required. All the firmware, software, and the electronics board design files are distributed through the OpenPET website (http://openpet.lbl.gov).
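The tree-style command routing described above can be sketched in a few lines: a command entered at the computer (root) is forwarded down through intermediate boards to the node whose address matches the destination, which executes it and replies. The node names echo the board abbreviations in the abstract (SB, MB, DB), but the addressing scheme and API below are illustrative, not OpenPET's actual protocol.

```python
class Node:
    """One board in the command tree (computer, SB, MB, or DB)."""

    def __init__(self, name):
        self.name = name
        self.children = []

    def add(self, child):
        self.children.append(child)
        return child

    def route(self, dest, command, trace):
        """Forward a command toward `dest`; return the reply when executed."""
        trace.append(self.name)
        if self.name == dest:
            return f"{self.name}: executed {command!r}"
        for child in self.children:
            reply = child.route(dest, command, trace)
            if reply is not None:
                return reply
        trace.pop()  # dest is not in this subtree; backtrack
        return None

# Build a small tree: computer -> support board -> multiplexer -> detector boards.
root = Node("computer")
mb = root.add(Node("SB0")).add(Node("MB0"))
mb.add(Node("DB0"))
mb.add(Node("DB1"))

trace = []
reply = root.route("DB1", "set_threshold 42", trace)
print(reply)
print(" -> ".join(trace))
```

In the real system each node would compare the destination address against its configured subtree rather than searching, but the top-to-bottom forwarding and bottom-up reply are the same idea.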

  12. Design of Timing System Software on EAST-NBI

    NASA Astrophysics Data System (ADS)

    Zhao, Yuan-Zhe; Hu, Chun-Dong; Sheng, Peng; Zhang, Xiao-Dan; Wu, De-Yun; Cui, Qing-Long

    2013-10-01

    The Neutral Beam Injector (NBI) is one of the main plasma heating and plasma current drive methods for the Experimental Advanced Superconducting Tokamak (EAST). A control system was designed to monitor the NBI experiment, control all the power supplies, and provide data acquisition and networking. As an important part of the NBI control system, the timing system (TS) provides a unified clock for all subsystems of NBI. TS controls the input/output services of digital and analog signals, and sends feedback messages to the control server to provide alarm and interlock protection. The TS software runs on a Windows system, is coded in LabVIEW, and uses a client/server mode, multithreading, and cyclic redundancy check technology. The experimental results have proved that TS provides a stable and reliable clock to the subsystems of NBI and contributes to the safety of the whole NBI system.

  13. Designing a Software Tool for Fuzzy Logic Programming

    NASA Astrophysics Data System (ADS)

    Abietar, José M.; Morcillo, Pedro J.; Moreno, Ginés

    2007-12-01

    Fuzzy Logic Programming is an interesting and still growing research area that agglutinates the efforts for introducing fuzzy logic into logic programming (LP), in order to incorporate more expressive resources into such languages for dealing with uncertainty and approximated reasoning. The multi-adjoint logic programming approach is a recent and extremely flexible fuzzy logic paradigm for which, unfortunately, we have not found practical tools implemented so far. In this work, we describe a prototype system which is able to directly translate fuzzy logic programs into Prolog code in order to safely execute these residual programs inside any standard Prolog interpreter in a completely transparent way for the final user. We think that the development of such fuzzy languages and programming tools might play an important role in the design of advanced software applications for computational physics, chemistry, mathematics, medicine, industrial control and so on.

  14. Software Would Largely Automate Design of Kalman Filter

    NASA Technical Reports Server (NTRS)

    Chuang, Jason C. H.; Negast, William J.

    2005-01-01

    Embedded Navigation Filter Automatic Designer (ENFAD) is a computer program being developed to automate the most difficult tasks in designing embedded software to implement a Kalman filter in a navigation system. The most difficult tasks are selection of error states of the filter and tuning of filter parameters, which are time-consuming trial-and-error tasks that require expertise and rarely yield optimum results. An optimum selection of error states and filter parameters depends on navigation-sensor and vehicle characteristics, and on filter processing time. ENFAD would include a simulation module that would incorporate all possible error states with respect to a given set of vehicle and sensor characteristics. The first of two iterative optimization loops would vary the selection of error states until the best filter performance was achieved in Monte Carlo simulations. For a fixed selection of error states, the second loop would vary the filter parameter values until an optimal performance value was obtained. Design constraints would be satisfied in the optimization loops. Users would supply vehicle and sensor test data that would be used to refine digital models in ENFAD. Filter processing time and filter accuracy would be computed by ENFAD.
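The two nested loops described above can be sketched as an exhaustive search: an outer loop over candidate error-state selections and, for each selection, an inner loop tuning a filter parameter. The candidate state names, the parameter grid, and the scoring function below are illustrative stand-ins for ENFAD's Monte Carlo filter-performance evaluation, not its actual design.

```python
import itertools
import random

random.seed(0)

# Hypothetical candidate error states and a one-parameter tuning grid.
CANDIDATE_STATES = ["gyro_bias", "accel_bias", "clock_drift"]
PARAM_GRID = [0.01, 0.1, 1.0]  # e.g. a process-noise scale factor

def simulate_performance(states, q_scale):
    """Stand-in for a Monte Carlo performance run (lower is better):
    here, an arbitrary penalty plus a little simulated noise."""
    penalty = abs(len(states) - 2) + abs(q_scale - 0.1)
    return penalty + random.uniform(0, 0.01)

best = None
for r in range(1, len(CANDIDATE_STATES) + 1):      # outer loop: error-state selection
    for states in itertools.combinations(CANDIDATE_STATES, r):
        for q in PARAM_GRID:                       # inner loop: parameter tuning
            score = simulate_performance(states, q)
            if best is None or score < best[0]:
                best = (score, states, q)

print("best error states:", best[1], "q_scale:", best[2])
```

A real tool would replace the brute-force grids with smarter search and would enforce design constraints (e.g. filter processing time) inside the loops, as the abstract notes.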

  15. Orion Relative Navigation Flight Software Analysis and Design

    NASA Technical Reports Server (NTRS)

    D'Souza, Chris; Christian, John; Zanetti, Renato

    2011-01-01

    The Orion Relative Navigation System has sought to take advantage of the latest developments in sensor and algorithm technology while living under the constraints of mass, power, volume, and throughput. In particular, the only sensor specifically designed for relative navigation is the Vision Navigation System (VNS), a lidar-based sensor. But it uses the Star Trackers, GPS (when available) and IMUs, which are part of the overall Orion navigation sensor suite, to produce a relative state accurate enough to dock with the ISS. The Orion Relative Navigation System has significantly matured as the program has evolved from the design phase to the flight software implementation phase. With the development of the VNS system and the STORRM flight test of the Orion Relative Navigation hardware, much of the performance of the system will be characterized before the first flight. However challenges abound, not the least of which is the elimination of the RF range and range-rate system, along with the development of the FSW in the Matlab/Simulink/Stateflow environment. This paper will address the features and the rationale for the Orion Relative Navigation design, the performance of the FSW in a 6-DOF environment, and the initial results of the hardware performance from the STORRM flight.

  16. NASA software specification and evaluation system design, part 1

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The research to develop methods for reducing the effort expended in software development and verification is reported. The development of a formal software requirements methodology, a formal specifications language, a programming language, a language preprocessor, and code analysis tools are discussed.

  17. Technical Data Exchange Software Tools Adapted to Distributed Microsatellite Design

    NASA Astrophysics Data System (ADS)

    Pache, Charly

    2002-01-01

    One critical issue concerning distributed design of satellites is the collaborative work it requires. In particular, the exchange of data between each group responsible for each subsystem can be complex and very time-consuming. The goal of this paper is to present a collaborative design tool, the SSETI Design Model (SDM), specifically developed for enabling satellite distributed design. SDM is actually used in the ongoing Student Space Exploration & Technology (SSETI) initiative (www.sseti.net). SSETI is led by the European Space Agency (ESA) outreach office (http://www.estec.esa.nl/outreach), involving student groups from all over Europe for design, construction and launch of a microsatellite. The first part of this paper presents the current version of the SDM tool, a collection of linked Microsoft Excel worksheets, one for each subsystem. An overview of the project framework/structure is given, explaining the different actors, the flows between them, as well as the different types of data and the links - formulas - between data sets. Unified Modeling Language (UML) diagrams give an overview of the different parts. Then the SDM's functionalities, developed in VBA scripts (Visual Basic for Applications), are introduced, as well as the interactive features, user interfaces and administration tools. The second part discusses the capabilities and limitations of SDM's current version. Taking into account these capabilities and limitations, the third part outlines the next version of SDM, a web-oriented, database-driven evolution of the current version. This new approach will enable real-time data exchange and processing between the different actors of the mission. Comprehensive UML diagrams will guide the audience through the entire modeling process of such a system. Tradeoff simulation capabilities, security, reliability, hardware and software issues will also be thoroughly discussed.

  18. Usability and Children's Software: A User-Centered Design Methodology.

    ERIC Educational Resources Information Center

    Robertson, Jenifer Wals

    1994-01-01

    Addresses usability issues pertaining to the purpose of educational software, followed by suggestions for ways in which educational software can meet the language, physical, social, and cognitive needs of children. Guidelines and recommendations are provided for adapting usability engineering and testing procedures to educational software to…

  19. Digital Modeling in Design Foundation Coursework: An Exploratory Study of the Effectiveness of Conceptual Design Software

    ERIC Educational Resources Information Center

    Guidera, Stan; MacPherson, D. Scot

    2008-01-01

    This paper presents the results of a study that was conducted to identify and document student perceptions of the effectiveness of computer modeling software introduced in a design foundations course that had previously utilized only conventional manually-produced representation techniques. Rather than attempt to utilize a production-oriented CAD…

  20. Evaluating a digital ship design tool prototype: Designers' perceptions of novel ergonomics software.

    PubMed

    Mallam, Steven C; Lundh, Monica; MacKinnon, Scott N

    2017-03-01

    Computer-aided solutions are essential for naval architects to manage and optimize technical complexities when developing a ship's design. Although there are an array of software solutions aimed to optimize the human element in design, practical ergonomics methodologies and technological solutions have struggled to gain widespread application in ship design processes. This paper explores how a new ergonomics technology is perceived by naval architecture students using a mixed-methods framework. Thirteen Naval Architecture and Ocean Engineering Masters students participated in the study. Overall, participants perceived the software and its embedded ergonomics tools to benefit their design work, increasing their empathy and ability to understand the work environment and work demands end-users face. However, participants questioned whether ergonomics could be practically and efficiently implemented under real-world project constraints. This revealed underlying social biases and a fundamental lack of understanding in engineering postgraduate students regarding applied ergonomics in naval architecture.

  1. Integrated Software for Analyzing Designs of Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Philips, Alan D.

    2003-01-01

    Launch Vehicle Analysis Tool (LVA) is a computer program for preliminary design structural analysis of launch vehicles. Before LVA was developed, in order to analyze the structure of a launch vehicle, it was necessary to estimate its weight, feed this estimate into a program to obtain pre-launch and flight loads, then feed these loads into structural and thermal analysis programs to obtain a second weight estimate. If the first and second weight estimates differed, it was necessary to reiterate these analyses until the solution converged. This process generally took six to twelve person-months of effort. LVA incorporates a text-to-structural-layout converter, configuration drawing, mass properties generation, pre-launch and flight loads analysis, loads output plotting, direct-solution structural analysis, and thermal analysis subprograms. These subprograms are integrated in LVA so that solutions can be iterated automatically. LVA incorporates expert-system software that makes fundamental design decisions without intervention by the user. It also includes unique algorithms based on extensive research. The total integration of analysis modules drastically reduces the need for interaction with the user. A typical solution can be obtained in 30 to 60 minutes. Subsequent runs can be done in less than two minutes.

  2. Facilitating Controlled Tests of Website Design Changes Using Aspect-Oriented Software Development and Software Product Lines

    NASA Astrophysics Data System (ADS)

    Cámara, Javier; Kobsa, Alfred

    Controlled online experiments in which envisaged changes to a website are first tested live with a small subset of site visitors have proven to predict the effects of these changes quite accurately. However, these experiments often require expensive infrastructure and are costly in terms of development effort. This paper advocates a systematic approach to the design and implementation of such experiments in order to overcome the aforementioned drawbacks by making use of Aspect-Oriented Software Development and Software Product Lines.

  3. MHTool User's Guide - Software for Manufactured Housing Structural Design

    SciTech Connect

    W. D. Richins

    2005-07-01

    Since the late 1990s, the Department of Energy's Idaho National Laboratory (INL) has worked with the US Department of Housing and Urban Development (HUD), the Manufactured Housing Institute (MHI), the National Institute of Standards and Technology (NIST), the National Science Foundation (NSF), and an industry committee to measure the response of manufactured housing to both artificial and natural wind loads and to develop a computational desktop tool to optimize the structural performance of manufactured housing to HUD Code loads. MHTool is the result of an 8-year intensive testing and verification effort using single and double section homes. MHTool is the first fully integrated structural analysis software package specifically designed for manufactured housing. To use MHTool, industry design engineers will enter information (geometries, materials, connection types, etc.) describing the structure of a manufactured home, creating a base model. Windows, doors, and interior walls can be added to the initial design. Engineers will input the loads required by the HUD Code (wind, snow loads, interior live loads, etc.) and run an embedded finite element solver to find walls or connections where stresses are either excessive or very low. The designer could, for example, substitute a less expensive and easier to install connection in areas with very low stress, then re-run the analysis for verification. If forces and stresses are still within HUD Code requirements, construction costs would be saved without sacrificing quality. Manufacturers can easily change geometries or component properties to optimize designs of various floor plans then submit MHTool input and output in place of calculations for DAPIA review. No change in the regulatory process is anticipated. MHTool, while not yet complete, is now ready for demonstration. The pre-BETA version (Build-16) was displayed at the 2005 National Congress & Expo for Manufactured & Modular Housing. 
Additional base models and an

  4. The investigation of prostatic calcifications using μ-PIXE analysis and their dosimetric effect in low dose rate brachytherapy treatments using Geant4

    NASA Astrophysics Data System (ADS)

    Pope, D. J.; Cutajar, D. L.; George, S. P.; Guatelli, S.; Bucci, J. A.; Enari, K. E.; Miller, S.; Siegele, R.; Rosenfeld, A. B.

    2015-06-01

    Low dose rate brachytherapy is a widely used modality for the treatment of prostate cancer. Most clinical treatment planning systems currently in use approximate all tissue to water, neglecting the existence of inhomogeneities, such as calcifications. The presence of prostatic calcifications may perturb the dose due to the higher photoelectric effect cross section in comparison to water. This study quantitatively evaluates the effect of prostatic calcifications on the dosimetric outcome of brachytherapy treatments by means of Monte Carlo simulations and its potential clinical consequences. Four pathological calcification samples were characterised with micro-particle induced x-ray emission (μ-PIXE) to determine their heavy elemental composition. Calcium, phosphorus and zinc were found to be the predominant heavy elements in the calcification composition. Four clinical patient brachytherapy treatments were modelled using Geant4 based Monte Carlo simulations, in terms of the distribution of brachytherapy seeds and calcifications in the prostate. Dose reductions were observed to be up to 30% locally to the calcification boundary, calcification size dependent. Single large calcifications and closely placed calculi caused local dose reductions of between 30-60%. Individual calculi smaller than 0.5 mm in diameter showed minimal dosimetric impact, however, the effects of small or diffuse calcifications within the prostatic tissue could not be determined using the methods employed in the study. The simulation study showed a varying reduction on common dosimetric parameters. D90 showed a reduction of 2-5%, regardless of calcification surface area and volume. The parameters V100, V150 and V200 were also reduced by as much as 3% and on average by 1%. These reductions were also found to relate to the surface area and volume of calcifications, which may have a significant dosimetric impact on brachytherapy treatment, however, such impacts depend strongly on specific factors

  5. Design and performance test of spacecraft test and operation software

    NASA Astrophysics Data System (ADS)

    Wang, Guohua; Cui, Yan; Wang, Shuo; Meng, Xiaofeng

    2011-06-01

    Main test processor (MTP) software is the key element of the Electrical Ground Support Equipment (EGSE) for spacecraft test and operation, used at the Chinese Academy of Space Technology (CAST) for years without major updates. With the increasing demand for more efficient and agile MTP software, a new MTP software was developed. It adopts a layered, plug-in based software architecture, whose core runtime server provides message queue management, shared memory management and process management services, forming the framework for a configurable, open-architecture system. To investigate the MTP software's performance, test cases for network response time, test sequence management capability and data-processing capability were introduced in detail. Test results show that the MTP software is general-purpose and has higher performance than the legacy one.

  6. Design consideration for design a flat and ring plastics part using Solidworks software

    NASA Astrophysics Data System (ADS)

    Amran, M. A. M.; Faizal, K. M.; Salleh, M. S.; Sulaiman, M. A.; Mohamad, E.

    2015-12-01

    Various design considerations for plastic injection moulded parts were applied at the initial stage to prevent defects in the end products. The objective of this project is therefore to design plastic injection moulded parts taking into consideration several factors such as draft angle, corner radius and location of the gate. In this project, a flat plastic part, a ring plastic part, and core inserts for the flat and ring plastic parts were designed using SolidWorks software. Each plastic part was drawn in sketching mode and the 3D solid model was then generated using various commands. Considerations such as draft angle and corner radius, together with the location of the gate, were taken into account at the design stage. Finally, the two plastic parts, with their respective inserts, were successfully designed using SolidWorks software. The flat and ring plastic parts were designed for future research studying weld lines, meld lines, air traps and the geometrical size of the product. By designing the flat and ring plastic parts with a core insert on each part, the complete mould design of a two-plate mould can be considered, since plastic injection parts need to be designed properly in order to avoid defects when the mould is made.

  7. Exploratory research for the development of a computer aided software design environment with the software technology program

    NASA Technical Reports Server (NTRS)

    Hardwick, Charles

    1991-01-01

    Field studies were conducted by MCC to determine areas of research of mutual interest to MCC and JSC. NASA personnel from the Information Systems Directorate and research faculty from UHCL/RICIS visited MCC in Austin, Texas to examine tools and applications under development in the MCC Software Technology Program. MCC personnel presented workshops in hypermedia, design knowledge capture, and design recovery on site at JSC for ISD personnel. The following programs were installed on workstations in the Software Technology Lab, NASA/JSC: (1) GERM (Graphic Entity Relations Modeler); (2) gIBIS (Graphic Issues Based Information System); and (3) DESIRE (Design Recovery tool). These applications were made available to NASA for inspection and evaluation. Programs developed in the MCC Software Technology Program run on the SUN workstation. The programs do not require special configuration, but they will require larger than usual amounts of disk space and RAM to operate properly.

  8. Revisiting software specification and design for large astronomy projects

    NASA Astrophysics Data System (ADS)

    Wiant, Scott; Berukoff, Steven

    2016-07-01

    The separation of science and engineering in the delivery of software systems overlooks the true nature of the problem being solved and the organization that will solve it. Use of a systems engineering approach to managing the requirements flow between these two groups, as between a customer and contractor, has been used with varying degrees of success by well-known entities such as the U.S. Department of Defense. However, treating science as the customer and engineering as the contractor leads to avoidable unfavorable consequences and missed opportunities. For example, the "problem" being solved is only partially specified through the requirements generation process, since that process focuses on detailed specifications guiding the parties to a technical solution. Equally important is the portion of the problem that is solved through the definition of processes and the staff interacting through them. This interchange between people and processes is often underrepresented and underappreciated. By concentrating on the full problem and collaborating on a strategy for its solution, a science-implementing organization can realize the benefits of driving towards common goals (not just requirements) and a cohesive solution to the entire problem. The initial phase of any project, when well executed, is often the most difficult yet most critical, and thus it is essential to employ a methodology that reinforces collaboration and leverages the full suite of capabilities within the team. This paper describes an integrated approach to specifying the needs induced by a problem and the design of its solution.

  9. LISP as an Environment for Software Design: Powerful and Perspicuous

    PubMed Central

    Blum, Robert L.; Walker, Michael G.

    1986-01-01

    The LISP language provides a useful set of features for prototyping knowledge-intensive, clinical applications software that is not found in most other programming environments. Medical computer programs that need large medical knowledge bases, such as programs for diagnosis, therapeutic consultation, education, simulation, and peer review, are hard to design, evolve continually, and often require major revisions. They necessitate an efficient and flexible program development environment. The LISP language and programming environments built around it are well suited for program prototyping. The lingua franca of artificial intelligence researchers, LISP facilitates building complex systems because it is simple yet powerful. Because of its simplicity, LISP programs can read, execute, modify and even compose other LISP programs at run time. Hence, it has been easy for system developers to create programming tools that greatly speed the program development process, and that may be easily extended by users. This has resulted in the creation of many useful graphical interfaces, editors, and debuggers, which facilitate the development of knowledge-intensive medical applications.

  10. Design study of Software-Implemented Fault-Tolerance (SIFT) computer

    NASA Technical Reports Server (NTRS)

    Wensley, J. H.; Goldberg, J.; Green, M. W.; Kutz, W. H.; Levitt, K. N.; Mills, M. E.; Shostak, R. E.; Whiting-Okeefe, P. M.; Zeidler, H. M.

    1982-01-01

    Software-implemented fault tolerant (SIFT) computer design for commercial aviation is reported. A SIFT design concept is addressed. Alternate strategies for physical implementation are considered. Hardware and software design correctness is addressed. System modeling and effectiveness evaluation are considered from a fault-tolerant point of view.

  11. Software Designers and Teachers as Evaluators of Computer-Based Learning Environments.

    ERIC Educational Resources Information Center

    Hakkinen, Paivi

    1996-01-01

    Describes a study conducted in Finland that investigated designers' and teachers' conceptions of learning and evaluation criteria for educational software. Results indicate that designers emphasized the appearance of software while teachers referred more to practical teaching arrangements, and that participative and collaborative design is needed.…

  12. Research and Design Issues Concerning the Development of Educational Software for Children. Technical Report No. 14.

    ERIC Educational Resources Information Center

    Char, Cynthia

    Several research and design issues to be considered when creating educational software were identified by a field test evaluation of three types of innovative software created at Bank Street College: (1) Probe, software for measuring and graphing temperature data; (2) Rescue Mission, a navigation game that illustrates the computer's use for…

  13. Teacher-Designed Software for Interactive Linear Equations: Concepts, Interpretive Skills, Applications & Word-Problem Solving.

    ERIC Educational Resources Information Center

    Lawrence, Virginia

    No longer just a user of commercial software, the 21st century teacher is a designer of interactive software based on theories of learning. This software, a comprehensive study of straightline equations, enhances conceptual understanding, sketching, graphic interpretive and word problem solving skills as well as making connections to real-life and…

  14. Object-oriented software design for the Mt. Wilson 100-inch Hooker telescope adaptive optics system

    NASA Astrophysics Data System (ADS)

    Schneider, Thomas G.

    2000-06-01

    The object-oriented software design paradigm was instrumental in the development of the Adoptics software used in the Hooker telescope's ADOPT adaptive optics system. The software runs on a Pentium-class PC host and eight DSP processors connected to the host's motherboard bus. C++ classes were created to implement most of the host software's functionality, with the object-oriented features of inheritance, encapsulation and abstraction being the most useful. Careful class design at the inception of the project allowed for the rapid addition of features without compromising the integrity of the software. Base class implementations include the DSP system, real-time graphical displays and opto-mechanical actuator control.

  15. Wake Turbulence Mitigation for Departures (WTMD) Prototype System - Software Design Document

    NASA Technical Reports Server (NTRS)

    Sturdy, James L.

    2008-01-01

    This document describes the software design of a prototype Wake Turbulence Mitigation for Departures (WTMD) system that was evaluated in shadow mode operation at the Saint Louis (KSTL) and Houston (KIAH) airports. This document describes the software that provides the system framework, communications, user displays, and hosts the Wind Forecasting Algorithm (WFA) software developed by the M.I.T. Lincoln Laboratory (MIT-LL). The WFA algorithms and software are described in a separate document produced by MIT-LL.

  16. TH-E-BRE-01: A 3D Solver of Linear Boltzmann Transport Equation Based On a New Angular Discretization Method with Positivity for Photon Dose Calculation Benchmarked with Geant4

    SciTech Connect

    Hong, X; Gao, H

    2014-06-15

    Purpose: The Linear Boltzmann Transport Equation (LBTE), solved through the statistical Monte Carlo (MC) method, provides accurate dose calculation in radiotherapy. This work investigates an alternative way of accurately solving the LBTE using a deterministic numerical method, owing to its possible advantage in computational speed over MC. Methods: Instead of using traditional spherical harmonics to approximate the angular scattering kernel, our deterministic numerical method directly computes angular scattering weights, based on a new angular discretization method that utilizes the linear finite element method on a local triangulation of the unit angular sphere. As a result, our angular discretization method has the unique advantage of positivity, i.e., it maintains all scattering weights nonnegative at all times, which is physically correct. Moreover, our method is local in angular space and therefore handles anisotropic scattering well, such as forward-peaked scattering. To be compatible with image-guided radiotherapy, the spatial variables are discretized on a structured grid with the standard diamond scheme. After discretization, an improved source-iteration method is utilized for solving the linear system without saving the linear system to memory. The accuracy of our 3D solver is validated using analytic solutions and benchmarked against Geant4, a popular MC solver. Results: The differences between Geant4 solutions and our solutions were less than 1.5% for various testing cases that mimic practical cases. More details are available in the supporting document. Conclusion: We have developed a 3D LBTE solver based on a new angular discretization method that guarantees the positivity of scattering weights for physical correctness, and it has been benchmarked against Geant4 for photon dose calculation.
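
    The matrix-free source-iteration step mentioned in the abstract can be illustrated with a toy fixed-point loop: apply a scattering operator to the current flux, add the fixed source, and repeat until convergence, never storing the system matrix. The one-dimensional operator and parameters below are simplified stand-ins for illustration, not the solver actually described.

```python
# Toy sketch of matrix-free source iteration: solve phi = q + S(phi)
# without ever assembling S as a matrix. The 1D nearest-neighbor
# "scattering" operator is a simplified stand-in for the real
# angular/spatial discretization described in the abstract.

def apply_scattering(phi, weight=0.4):
    """Each cell receives a nonnegative fraction of its neighbors' flux."""
    n = len(phi)
    out = [0.0] * n
    for i in range(n):
        left = phi[i - 1] if i > 0 else 0.0
        right = phi[i + 1] if i < n - 1 else 0.0
        out[i] = weight * 0.5 * (left + right)
    return out

def source_iteration(q, tol=1e-10, max_iter=1000):
    """Fixed-point iteration phi_{k+1} = q + S(phi_k)."""
    phi = list(q)  # initial guess: the uncollided source
    for _ in range(max_iter):
        new_phi = [qi + si for qi, si in zip(q, apply_scattering(phi))]
        if max(abs(a - b) for a, b in zip(new_phi, phi)) < tol:
            return new_phi
        phi = new_phi
    return phi

flux = source_iteration([1.0] * 8)  # converged flux, everywhere >= source
```

    Because the scattering weights are nonnegative and the operator's norm is below one, the iterates stay nonnegative and converge, mirroring the positivity property the abstract emphasizes.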

  17. IDEAS and App Development Internship in Hardware and Software Design

    NASA Technical Reports Server (NTRS)

    Alrayes, Rabab D.

    2016-01-01

    In this report, I will discuss the tasks and projects I completed while working as an electrical engineering intern during the spring semester of 2016 at NASA Kennedy Space Center. In the field of software development, I completed tasks for the G-O Caching Mobile App and the Asbestos Management Information System (AMIS) Web App. The G-O Caching Mobile App was written in HTML, CSS, and JavaScript on the Cordova framework, while the AMIS Web App is written in HTML, CSS, JavaScript, and C# on the AngularJS framework. My goals and objectives on these two projects were to produce an app with an eye-catching and intuitive User Interface (UI) that would attract more employees to participate; to produce a fully tested, fully functional app that supports workforce engagement and exploration; and to produce a fully tested, fully functional web app that assists technicians working in asbestos management. I also worked in hardware development on the Integrated Display and Environmental Awareness System (IDEAS) wearable technology project. My tasks on this project were focused on PCB design and camera integration. My goals and objectives for this project were to integrate fully functioning custom hardware extenders on the wearable technology headset, minimizing the size of the hardware on the smart glasses headset for maximum user comfort, and to integrate a fully functioning camera onto the headset. By the end of the semester, I had successfully developed four extender boards that minimize hardware on the headset, and had assisted in integrating a fully functioning camera into the system.

  18. SWEPP Assay System Version 2.0 software design description

    SciTech Connect

    East, L.V.; Marwil, E.S.

    1996-08-01

    The Idaho National Engineering Laboratory (INEL) Stored Waste Examination Pilot Plant (SWEPP) operations staff use nondestructive analysis methods to characterize the radiological contents of contact-handled radioactive waste containers. Containers of waste from Rocky Flats Environmental Technology Site and other Department of Energy (DOE) sites are currently stored at SWEPP. Before these containers can be shipped to the Waste Isolation Pilot Plant (WIPP), SWEPP must verify compliance with storage, shipping, and disposal requirements. This program has been in operation since 1985 at the INEL Radioactive Waste Management Complex (RWMC). One part of the SWEPP program measures neutron emissions from the containers and estimates the mass of plutonium and other transuranic (TRU) isotopes present. A Passive/Active Neutron (PAN) assay system developed at the Los Alamos National Laboratory is used to perform these measurements. A computer program named NEUT2 was originally used to perform the data acquisition and reduction functions for the neutron measurements. This program was originally developed at Los Alamos and extensively modified by a commercial vendor of PAN systems and by personnel at the INEL. NEUT2 uses the analysis methodology outlined, but no formal documentation exists on the program itself. The SWEPP Assay System (SAS) computer program replaced the NEUT2 program in early 1994. The SAS software was developed using an 'object model' approach and is documented in accordance with American National Standards Institute (ANSI) and Institute of Electrical and Electronic Engineers (IEEE) standards. The new program incorporates the basic analysis algorithms found in NEUT2. Additional functionality and improvements include a graphical user interface, the ability to change analysis parameters without program code modification, an 'object model' design approach and other features for improved flexibility and maintainability.

  19. User-Centered Design Guidelines for Collaborative Software for Intelligence Analysis

    SciTech Connect

    Scholtz, Jean; Endert, Alexander N.

    2014-08-01

    In this position paper we discuss the necessity of using User-Centered Design (UCD) methods in order to design collaborative software for the intelligence community. We present some standing issues in collaborative software based on existing work within the intelligence community. Based on this information we present opportunities to address some of these challenges.

  20. User-Centered Design Guidelines for Collaborative Software for Intelligence Analysis

    SciTech Connect

    Scholtz, Jean; Endert, Alexander

    2014-07-01

    In this position paper we discuss the necessity of using User-Centered Design (UCD) methods in order to design collaborative software for the intelligence community. We discuss a number of studies of collaboration in the intelligence community and use this information to provide some guidelines for collaboration software.

  1. Windows Calorimeter Control (WinCal) program computer software design description

    SciTech Connect

    Pertzborn, N.F.

    1997-03-26

    The Windows Calorimeter Control (WinCal) Program System Design Description contains a discussion of the design details for the WinCal product. Information in this document will assist a developer in maintaining the WinCal system. The content of this document follows the guidance in WHC-CM-3-10, Software Engineering Standards, Standard for Software User Documentation.

  2. Validation of mission critical software design and implementation using model checking

    NASA Technical Reports Server (NTRS)

    Pingree, P. J.; Mikk, E.; Holzmann, G.; Smith, M.; Dams, D.

    2002-01-01

    Model Checking conducts an exhaustive exploration of all possible behaviors of a software system design and as such can be used to detect defects in designs that are typically difficult to discover with conventional testing approaches.
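
    The exhaustive-exploration idea can be sketched as a breadth-first search over every reachable state of a transition system, checking a safety property in each one. This toy checker illustrates the principle only; it is not the production model-checking tooling referenced in the abstract.

```python
# Minimal explicit-state safety checker: enumerate all reachable
# states and test a property in each (a toy illustration of the
# exhaustive exploration that model checking performs).

from collections import deque

def check_safety(initial, successors, is_safe):
    """Return (True, None) if every reachable state is safe,
    otherwise (False, violating_state)."""
    seen = {initial}
    queue = deque([initial])
    while queue:
        state = queue.popleft()
        if not is_safe(state):
            return False, state  # counterexample state found
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return True, None

# Toy design: a counter that wraps modulo 5 can never reach 7.
ok, counterexample = check_safety(0, lambda s: [(s + 1) % 5],
                                  lambda s: s != 7)
```

    Unlike conventional testing, which samples behaviors, this visits every reachable state, so a passing result is a proof about the modeled design.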

  3. Application of software technology to a future spacecraft computer design

    NASA Technical Reports Server (NTRS)

    Labaugh, R. J.

    1980-01-01

    A study was conducted to determine how major improvements in spacecraft computer systems can be obtained from recent advances in hardware and software technology. Investigations into integrated circuit technology indicated that the CMOS/SOS chip set being developed for the Air Force Avionics Laboratory at Wright Patterson had the best potential for improving the performance of spaceborne computer systems. An integral part of the chip set is the bit slice arithmetic and logic unit. The flexibility allowed by microprogramming, combined with the software investigations, led to the specification of a baseline architecture and instruction set.

  4. A Framework for Designing Reliable Software-Intensive Systems

    DTIC Science & Technology

    2011-03-01

    Industrial and Manufacturing Engineering, Oregon State University, Corvallis, Oregon, USA; C. Smidts, Mechanical and Aerospace Engineering, Ohio State... Reporting Period (End): 11/30/2010. Program Manager: David Luginbuhl. Changes in Research Objectives: None. Changes in Program Manager: None... Activities between the two institutions: completed a mapping between elements of different Unified Modeling Language (UML) diagrams; formulated a software

  5. Classrooms as Test-Beds for Educational Software Design.

    ERIC Educational Resources Information Center

    Carlson, Patricia A.; And Others

    1996-01-01

    Describes efforts to adapt R-WISE (Reading and Writing in a Supportive Environment), a computer program developed for the military, for public education. Field testing at MacArthur High School (Texas) using ninth-grade classes is discussed, including fostering higher-order thinking skills, coping with change, and integrating software into the…

  6. Designing Better Camels: Developing Effective Documentation for Computer Software.

    ERIC Educational Resources Information Center

    Zacher, Candace M.

    This guide to the development of effective documentation for users of computer software begins by identifying five types of documentation, i.e., training manuals, user guides, tutorials, on-screen help comments, and troubleshooting manuals. Six steps in the development process are then outlined and briefly described: (1) planning and preparation;…

  7. The Impact of Social Software in Product Design Higher Education

    ERIC Educational Resources Information Center

    Hurn, Karl

    2012-01-01

    It is difficult to ignore the impact that Web 2.0 and the subsequent social software revolution has had on society in general, and young people in particular. Information is exchanged and interpreted extremely quickly and in ways that were not imagined 10 years ago. Universities are struggling to keep up with this new technology, with outdated…

  8. Software for quantitative analysis of radiotherapy: overview, requirement analysis and design solutions.

    PubMed

    Zhang, Lanlan; Hub, Martina; Mang, Sarah; Thieke, Christian; Nix, Oliver; Karger, Christian P; Floca, Ralf O

    2013-06-01

    Radiotherapy is a fast-developing discipline which plays a major role in cancer care. Quantitative analysis of radiotherapy data can improve the success of treatment and support the prediction of outcome. In this paper, we first identify functional, conceptual and general requirements for a software system for quantitative analysis of radiotherapy. Further, we present an overview of existing radiotherapy analysis software tools and check them against the stated requirements. As none of them could meet all of the demands presented herein, we analyzed possible conceptual problems and present software design solutions and recommendations to meet the stated requirements (e.g. algorithmic decoupling via a dose iterator pattern; analysis database design). As a proof of concept we developed a software library, "RTToolbox", following the presented design principles. The RTToolbox is available as an open source library and has already been tested in a larger-scale software system for different use cases. These examples demonstrate the benefit of the presented design principles.
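
    The "algorithmic decoupling via a dose iterator pattern" mentioned above can be illustrated as follows: analysis algorithms consume doses only through an iterator interface, so the same statistic works for any dose source. The class and function names below are hypothetical illustrations, not the actual RTToolbox API.

```python
# Illustration of decoupling analysis algorithms from dose storage via
# an iterator interface. Names (DoseIterator, ListDose, mean_dose) are
# hypothetical, not the real RTToolbox classes.

from abc import ABC, abstractmethod

class DoseIterator(ABC):
    """Any dose source: yields (dose_in_gray, voxel_volume) pairs."""
    @abstractmethod
    def __iter__(self):
        ...

class ListDose(DoseIterator):
    """In-memory dose source with a uniform voxel volume."""
    def __init__(self, doses, voxel_volume=1.0):
        self._doses = doses
        self._volume = voxel_volume
    def __iter__(self):
        return ((d, self._volume) for d in self._doses)

def mean_dose(source):
    """Volume-weighted mean dose; depends only on the iterator
    protocol, so any DoseIterator (file-backed, streamed, ...) works."""
    total = volume = 0.0
    for dose, vol in source:
        total += dose * vol
        volume += vol
    return total / volume
```

    The analysis code never touches the storage format, which is the decoupling the design recommendation aims for.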

  9. Designing Educational Software with Students through Collaborative Design Games: The We!Design&Play Framework

    ERIC Educational Resources Information Center

    Triantafyllakos, George; Palaigeorgiou, George; Tsoukalas, Ioannis A.

    2011-01-01

    In this paper, we present a framework for the development of collaborative design games that can be employed in participatory design sessions with students for the design of educational applications. The framework is inspired by idea generation theory and the design games literature, and guides the development of board games which, through the use…

  10. QUICK - AN INTERACTIVE SOFTWARE ENVIRONMENT FOR ENGINEERING DESIGN

    NASA Technical Reports Server (NTRS)

    Schlaifer, R. S.

    1994-01-01

    QUICK provides the computer user with the facilities of a sophisticated desk calculator which can perform scalar, vector and matrix arithmetic, propagate conic orbits, determine planetary and satellite coordinates and perform other related astrodynamic calculations within a Fortran-like environment. QUICK is an interpreter, thereby eliminating the need to use a compiler or a linker to run QUICK code. QUICK capabilities include options for automated printing of results, the ability to submit operating system commands on some systems, and access to a plotting package (MASL) and a text editor without leaving QUICK. Mathematical and programming features of QUICK include the ability to handle arbitrary algebraic expressions, the capability to define user functions in terms of other functions, built-in constants such as pi, direct access to useful COMMON areas, matrix capabilities, extensive use of double precision calculations, and the ability to automatically load user functions from a standard library. The MASL (Multi-mission Analysis Software Library) plotting package, included in the QUICK package, is a set of FORTRAN 77 compatible subroutines designed to facilitate the plotting of engineering data by allowing programmers to write plotting-device-independent applications. Its universality lies in the number of plotting devices it puts at the user's disposal. The MASL package of routines has proved very useful and easy to work with, yielding good plots for most new users on the first or second try. The functions provided include routines for creating histograms, "wire mesh" surface plots and contour plots as well as normal graphs with a large variety of axis types. The library has routines for plotting on cartesian, polar, log, mercator, cyclic, calendar, and stereographic axes, and for performing automatic or explicit scaling. The lengths of the axes of a plot are completely under the control of the program using the library. Programs written to use the MASL

  11. NSTX-U Digital Coil Protection System Software Detailed Design

    SciTech Connect

    2014-06-01

    The National Spherical Torus Experiment (NSTX) currently uses a collection of analog signal processing solutions for coil protection. Part of the NSTX Upgrade (NSTX-U) entails replacing these analog systems with a software solution running on a conventional computing platform. The new Digital Coil Protection System (DCPS) will replace the old systems entirely, while also providing an extensible framework that allows adding new functionality as desired.

  12. Software System Design for Large Scale, Spatially-explicit Agroecosystem Modeling

    SciTech Connect

    Wang, Dali; Nichols, Dr Jeff A; Kang, Shujiang; Post, Wilfred M; Liu, Sumang

    2012-01-01

    Recently, site-based agroecosystem models have been applied at the regional and state level to enable comprehensive analyses of the environmental sustainability of food and biofuel production. These large-scale, spatially-explicit simulations present computational challenges in software system design. Herein, we describe our software system design for large-scale, spatially-explicit agroecosystem modeling and data analysis. First, we describe the software design principles in three major phases: data preparation, high-performance simulation, and data management and analysis. Then, we use a case study at a regional intensive modeling area (RIMA) to demonstrate our system implementation and capability.

  13. Design of the software development and verification system (SWDVS) for shuttle NASA study task 35

    NASA Technical Reports Server (NTRS)

    Drane, L. W.; Mccoy, B. J.; Silver, L. W.

    1973-01-01

    An overview of the Software Development and Verification System (SWDVS) for the space shuttle is presented. The design considerations, goals, assumptions, and major features of the design are examined. A scenario that shows three persons involved in flight software development using the SWDVS in response to a program change request is developed. The SWDVS is described from the standpoint of different groups of people with different responsibilities in the shuttle program to show the functional requirements that influenced the SWDVS design. The software elements of the SWDVS that satisfy the requirements of the different groups are identified.

  14. The Gains Design Process: How to do Structured Design of User Interfaces in Any Software Environment

    NASA Astrophysics Data System (ADS)

    Lindeman, Martha J.

    This paper describes a user-interaction design process created and used by a consultant to solve two challenges: (1) how to decrease the need for changes in the user interface by subsequent system releases without doing big design up-front and (2) how to apply a structured user-interaction design process no matter when brought into a project or what software methodology was being used. The four design levels in the process parallel Beck and Fowler’s four planning levels described in their book Planning Extreme Programming. The design process is called “GAINS” because the user-interaction designer has only Attraction, Information and Navigation to connect users’ Goals with the project sponsors’ criteria for Success. Thus there are five questions, one for each letter of the acronym GAINS, asked at each of four levels of design: The first two design levels, Rough Plan and Big Plan, focus on business-process actions and objects that define users’ goals. The next two levels, Release Planning and Iteration Planning, focus on the user interface objects that support the tasks necessary to achieve those goals. Release Planning identifies the displays the user sees for each goal included in that release, and also the across-display navigation for the proposed functionality. Iteration Planning focuses at a lower level of interaction, such as the within-display navigation among controls. For a voice system, the word “sees” would be changed to “hears,” but the design process and the levels of focus are the same for user interfaces that are vision output (e.g., GUIs), voice output (e.g., VRs), or multimodal.

  15. Design and Implementation of Mapping Software: Developing Technology and Geography Skills in Two Different Learning Communities

    ERIC Educational Resources Information Center

    Friedman, Robert S.; Drakes, Jerri; Deek, Fadi P.

    2002-01-01

    A software development collaboration project designed to maximize the skill sets and interests of school children and teachers, educational software technologists and researchers, and college undergraduates is presented. The work brings elementary school children with college seniors and technology consultants to implement a problem-solving…

  16. A Buyer Behaviour Framework for the Development and Design of Software Agents in E-Commerce.

    ERIC Educational Resources Information Center

    Sproule, Susan; Archer, Norm

    2000-01-01

    Software agents are computer programs that run in the background and perform tasks autonomously as delegated by the user. This paper blends models from marketing research and findings from the field of decision support systems to build a framework for the design of software agents that support e-commerce buying applications. (Contains 35…

  17. Improving the quality of numerical software through user-centered design

    SciTech Connect

    Pancake, C. M., Oregon State University

    1998-06-01

    The software interface - whether graphical, command-oriented, menu-driven, or in the form of subroutine calls - shapes the user's perception of what software can do. It also establishes upper bounds on software usability. Numerical software interfaces typically are based on the designer's understanding of how the software should be used. That is a poor foundation for usability, since the features that are "instinctively right" from the developer's perspective are often the very ones that technical programmers find most objectionable or most difficult to learn. This paper discusses how numerical software interfaces can be improved by involving users more actively in design, a process known as user-centered design (UCD). While UCD requires extra organization and effort, it results in much higher levels of usability and can actually reduce software costs. This is true not just for graphical user interfaces, but for all software interfaces. Examples show how UCD improved the usability of a subroutine library, a command language, and an invocation interface.

  18. Evaluating the Software Design of a Complex System of Systems

    DTIC Science & Technology

    2010-01-01

    Architecture (LCA) milestone conducted at the SoS level might be the answer. According to the Rational Unified Process, the LCA marks the conclusion of... an SoS is not merely a roll-up of its constituent systems, so it is that an SoS-level LCA is more than a roll-up of lower-level LCA anchor points. A... for individual software packages and another for integrated builds. Although LCAs were conducted regularly at constituent systems levels, the focus

  19. As-built design specification for proportion estimate software subsystem

    NASA Technical Reports Server (NTRS)

    Obrien, S. (Principal Investigator)

    1980-01-01

    The Proportion Estimate Processor evaluates four estimation techniques in order to get an improved estimate of the proportion of a scene that is planted in a selected crop. The four techniques to be evaluated were provided by the techniques development section and are: (1) random sampling; (2) proportional allocation, relative count estimate; (3) proportional allocation, Bayesian estimate; and (4) sequential Bayesian allocation. The user is given two options for computation of the estimated mean square error. These are referred to as the cluster calculation option and the segment calculation option. The software for the Proportion Estimate Processor is operational on the IBM 3031 computer.
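
    Two of the four techniques listed can be sketched with textbook estimators. The Beta-Binomial posterior mean shown for the Bayesian case is a standard construction assumed here for illustration; it is not necessarily the processor's exact algorithm.

```python
# Sketches of two proportion estimators corresponding to the list above.
# The Bayesian version uses a standard Beta(a, b) prior; this is an
# assumed textbook form, not necessarily the processor's algorithm.

def random_sampling_proportion(k, n):
    """Technique (1): plain sample proportion of crop pixels."""
    return k / n

def bayes_proportion(k, n, a=1.0, b=1.0):
    """Posterior mean of the crop proportion: k crop pixels out of n
    sampled, under a Beta(a, b) prior -> (k + a) / (n + a + b)."""
    return (k + a) / (n + a + b)
```

    With a uniform prior (a = b = 1), 30 crop pixels in 100 samples gives 31/102 ≈ 0.304, slightly shrunk toward 0.5 relative to the raw proportion 0.3.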

  20. Framework Programmable Platform for the advanced software development workstation: Framework processor design document

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Ackley, Keith A.; Crump, Wes; Sanders, Les

    1991-01-01

    The design of the Framework Processor (FP) component of the Framework Programmable Software Development Platform (FFP) is described. The FFP is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software development environment. Guided by the model, this Framework Processor will take advantage of an integrated operating environment to provide automated support for the management and control of the software development process so that costly mistakes during the development phase can be eliminated.

  1. Geant4 simulation for a study of a possible use of carbon ions pencil beam for the treatment of ocular melanomas with the active scanning system at CNAO Centre

    SciTech Connect

    Farina, E.; Piersimoni, P.; Riccardi, C.; Rimoldi, A.; Tamborini, A.; Ciocca, M.

    2015-07-01

    The aim of this work is to validate a Geant4 application reproducing the CNAO (National Centre for Oncological Hadrontherapy) beamline and to study a possible use of carbon ion pencil beams for the treatment of ocular melanomas at the CNAO Centre. The promising aspect of carbon ion radiotherapy for the treatment of this disease lies in its superior relative radiobiological effectiveness (RBE). The Monte Carlo Geant4 toolkit is used to simulate the complete CNAO extraction beamline, with the active and passive components along it. A detector modeled on the human eye, including a realistic target tumor volume, is used as the target. Cross-checks with previous studies at CNAO using protons allow comparisons of the possible benefits of using such a technique with respect to proton beams. Before the eye-detector irradiation, a validation of the Geant4 simulation against CNAO experimental data is carried out with both carbon ions and protons. Important beam parameters, such as the transverse FWHM and the scanned radiation field's uniformity, are tested within the simulation and compared with experimental measurements at the CNAO Centre. The physical processes involved in secondary particle generation by carbon ions and protons in the eye-detector are reproduced, to take into account the dose, additional to that of the primary beam, given to the irradiated eye's tissues. A study of beam shaping is carried out to produce a uniform 3D dose distribution (shaped on the tumor) by the use of a spread-out Bragg peak. The eye-detector is then irradiated through a two-dimensional transverse beam scan at different depths. In the use case, the eye-detector is rotated by an angle of 40 deg. in the vertical direction, in order to misalign the tumor from the healthy tissues in front of it. The treatment uniformity on the tumor in the eye-detector is tested. For a more quantitative description of the deposited dose in the eye-detector and for the evaluation of the ratio between the dose deposited in the tumor and the other

  2. IGDS/TRAP Interface Program (ITIP). Software Design Document

    NASA Technical Reports Server (NTRS)

    Jefferys, Steve; Johnson, Wendell

    1981-01-01

    The preliminary design of the IGDS/TRAP Interface Program (ITIP) is described. The ITIP is implemented on the PDP 11/70 and interfaces directly with the Interactive Graphics Design System and the Data Management and Retrieval System. The program provides an efficient method for developing a network flow diagram. Performance requirements, operational requirements, and design requirements are discussed, along with the sources and types of input and the destinations and types of output. Information processing functions and data base requirements are also covered.

  3. An application of the IMC software to controller design for the JPL LSCL Experiment Facility

    NASA Technical Reports Server (NTRS)

    Zhu, Guoming; Skelton, Robert E.

    1993-01-01

    A software package which Integrates Model reduction and Controller design (The IMC software) is applied to design controllers for the JPL Large Spacecraft Control Laboratory Experiment Facility. Modal Cost Analysis is used for the model reduction, and various Output Covariance Constraints are guaranteed by the controller design. The main motivation is to find the controller with the 'best' performance with respect to output variances. Indeed it is shown that by iterating on the reduced order design model, the controller designed does have better performance than that obtained with the first model reduction.

  4. 'Ten Golden Rules' for Designing Software in Medical Education: Results from a Formative Evaluation of DIALOG.

    ERIC Educational Resources Information Center

    Jha, Vikram; Duffy, Sean

    2002-01-01

    Reports the results of an evaluation of Distance Interactive Learning in Obstetrics and Gynecology (DIALOG) which is an electronic program for continuing education. Presents 10 golden rules for designing software for medical practitioners. (Contains 26 references.) (Author/YDS)

  5. The Liquid Argon Software Toolkit (LArSoft): Goals, Status and Plan

    SciTech Connect

    Pordes, Rush; Snider, Erica

    2016-08-17

    LArSoft is a toolkit that provides a software infrastructure and algorithms for the simulation, reconstruction and analysis of events in Liquid Argon Time Projection Chambers (LArTPCs). It is used by the ArgoNeuT, LArIAT, MicroBooNE, DUNE (including the 35ton prototype and ProtoDUNE) and SBND experiments. The LArSoft collaboration provides an environment for the development, use, and sharing of code across experiments. The ultimate goal is to develop fully automatic processes for the reconstruction and analysis of LArTPC events. The toolkit is based on the art framework and has a well-defined architecture for interfacing to other packages, including the GEANT4 and GENIE simulation software and the Pandora software development kit for pattern recognition. It is designed to facilitate and support the evolution of algorithms, including their transition to new computing platforms. The development of the toolkit is driven by the scientific stakeholders involved. The core infrastructure includes standard definitions of types and constants, means to input experiment geometries as well as meta- and event-data in several formats, and relevant general utilities. Examples of algorithms experiments have contributed to date are: photon propagation; particle identification; hit finding; track finding and fitting; and electromagnetic shower identification and reconstruction. We report on the status of the toolkit and plans for future work.

  6. Heterogeneous Concurrent Modeling and Design in Java (Volume 2: Ptolemy II Software Architecture)

    DTIC Science & Technology

    2008-04-01

    Heterogeneous Concurrent Modeling and Design in Java (Volume 2: Ptolemy II Software Architecture) Christopher Brooks Edward A. Lee Xiaojun Liu...00-2008 4. TITLE AND SUBTITLE Heterogeneous Concurrent Modeling and Design in Java (Volume 2: Ptolemy II Software Architecture) 5a. CONTRACT...the State of California Micro Program, and the following companies: Agilent, Bosch, HSBC, Lockheed-Martin, National Instruments, and Toyota. PTOLEMY II

  7. User-Centered Design of Health Care Software Development: Towards a Cultural Change.

    PubMed

    Stanziola, Enrique; Uznayo, María Quispe; Ortiz, Juan Marcos; Simón, Mariana; Otero, Carlos; Campos, Fernando; Luna, Daniel

    2015-01-01

Health care software achieves better user efficiency, efficacy, and satisfaction when it is designed with its users' needs taken into account. However, it is not trivial to change the practice of software development to adopt user-centered design. In order to produce this change in the Health Informatics Department of the Hospital Italiano de Buenos Aires, a plan was devised and implemented. The article presents the steps of the plan, shows how the steps were carried out, and reflects on the lessons learned through the process.

  8. Drug Guru: a computer software program for drug design using medicinal chemistry rules.

    PubMed

    Stewart, Kent D; Shiroda, Melisa; James, Craig A

    2006-10-15

    Drug Guru (drug generation using rules) is a new web-based computer software program for medicinal chemists that applies a set of transformations, that is, rules, to an input structure. The transformations correspond to medicinal chemistry design rules-of-thumb taken from the historical lore of drug discovery programs. The output of the program is a list of target analogs that can be evaluated for possible future synthesis. A discussion of the features of the program is followed by an example of the software applied to sildenafil (Viagra) in generating ideas for target analogs for phosphodiesterase inhibition. Comparison with other computer-assisted drug design software is given.
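The rule-application loop described in the abstract can be sketched as follows. The rules and the text-based structure representation here are illustrative placeholders invented for this sketch, not actual Drug Guru transformations, which operate on full chemical structures rather than strings:

```python
import re

# Hypothetical medicinal-chemistry rules for illustration only: each rule is a
# (pattern, replacement, name) triple applied to a simplified SMILES-like string.
RULES = [
    (r"C\(=O\)OH", "C(=O)NH2", "carboxylic acid -> primary amide"),
    (r"OCH3", "OCF3", "methoxy -> trifluoromethoxy"),
    (r"Cl", "F", "chloro -> fluoro"),
]

def generate_analogs(structure):
    """Apply each transformation rule once to the input structure and
    collect the distinct analogs it produces."""
    analogs = []
    for pattern, replacement, name in RULES:
        candidate = re.sub(pattern, replacement, structure)
        if candidate != structure:
            analogs.append((name, candidate))
    return analogs

if __name__ == "__main__":
    # Benzoic acid in the simplified notation; only the first rule fires.
    for name, analog in generate_analogs("c1ccccc1C(=O)OH"):
        print(name, "->", analog)
```

The output is a list of candidate analogs that a chemist could triage for synthesis, mirroring the program's described workflow of rules in, target analogs out.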

  9. The family of standard hydrogen monitoring system computer software design description: Revision 2

    SciTech Connect

    Bender, R.M.

    1994-11-16

In March 1990, 23 waste tanks at the Hanford Nuclear Reservation were identified as having the potential for the buildup of gas to a flammable or explosive level. As a result of the potential for hydrogen gas buildup, a project was initiated to design a standard hydrogen monitoring system (SHMS) for use at any waste tank to analyze gas samples for hydrogen content. Since it was originally deployed three years ago, two variations of the original system have been developed: the SHMS-B and SHMS-C. All three are currently in operation at the tank farms and will be discussed in this document. To avoid confusion in this document, when a feature is common to all three of the SHMS variants, it will be referred to as "the family of SHMS." When it is specific to only one or two, they will be identified. The purpose of this computer software design document is to provide the following: the computer software requirements specification that documents the essential requirements of the computer software and its external interfaces; the computer software design description; the computer software user documentation for using and maintaining the computer software and any dedicated hardware; and the requirements for computer software design verification and validation.

  10. Integrated testing and verification system for research flight software design document

    NASA Technical Reports Server (NTRS)

    Taylor, R. N.; Merilatt, R. L.; Osterweil, L. J.

    1979-01-01

The NASA Langley Research Center is developing the MUST (Multipurpose User-oriented Software Technology) program to cut the cost of producing research flight software through a system of software support tools. The HAL/S language is the primary subject of the design. Boeing Computer Services Company (BCS) has designed an integrated verification and testing capability as part of MUST. Documentation, verification, and test options are provided, with special attention to real-time, multiprocessing issues. The needs of the entire software production cycle have been considered, with effective management and reduced lifecycle costs as foremost goals. Capabilities have been included in the design for static detection of data-flow anomalies involving communicating concurrent processes. Some types of ill-formed process synchronization and deadlock are also detected statically.

  11. Psychosocial Risks Generated By Assets Specific Design Software

    NASA Astrophysics Data System (ADS)

    Remus, Furtună; Angela, Domnariu; Petru, Lazăr

    2015-07-01

Human occupational activity results from the interaction between psycho-biological, socio-cultural, and organizational-occupational factors. Technological development, automation, and computerization, which are found in all branches of activity, the speed at which things develop, and their growing complexity require fewer physical aptitudes and more cognitive qualifications. The person included in the work process must in most cases adapt to the organizational-occupational situations specific to the demands of the job. The role of the programmer is essential in the process of producing commissioned software; truly brilliant ideas can only come from well-rested minds concentrated on their tasks. The actual requirements of these jobs, besides their many benefits and opportunities, also create a series of psycho-social risks, which can increase the level of stress during work activity, especially for those who work under pressure.

  12. Investigation into the development of computer aided design software for space based sensors

    NASA Technical Reports Server (NTRS)

    Pender, C. W.; Clark, W. L.

    1987-01-01

The described effort is phase one of the development of Computer Aided Design (CAD) software to be used to perform radiometric sensor design. The software package will be referred to as SCAD and is directed toward the preliminary design phase of space-based sensor systems. The approach being followed is to develop a modern, graphics-intensive, user-friendly software package using existing software as building blocks. The emphasis is on the development of a shell containing menus, smart defaults, and interfaces, which can accommodate a wide variety of existing application software packages. The shell will offer expected utilities such as graphics, tailored menus, and a variety of drivers for I/O devices. Following the development of the shell, the development of SCAD is planned chiefly as the selection and integration of appropriate building blocks. The phase one development activities have included: the selection of hardware to be used with SCAD; the determination of the scope of SCAD; the preliminary evaluation of a number of software packages for applicability to SCAD; determination of a method for achieving required capabilities where voids exist; and the establishment of a strategy for binding the software modules into an easy-to-use toolkit.

  13. Methods and software tools for design evaluation in population pharmacokinetics-pharmacodynamics studies.

    PubMed

    Nyberg, Joakim; Bazzoli, Caroline; Ogungbenro, Kay; Aliev, Alexander; Leonov, Sergei; Duffull, Stephen; Hooker, Andrew C; Mentré, France

    2015-01-01

Population pharmacokinetic (PK)-pharmacodynamic (PKPD) models are increasingly used in drug development and in academic research; hence, designing efficient studies is an important task. Following the first theoretical work on optimal design for nonlinear mixed-effects models, this research theme has grown rapidly. There are now several different software tools that implement an evaluation of the Fisher information matrix for population PKPD. We compared and evaluated the following five software tools: PFIM, PkStaMp, PopDes, PopED and POPT. The comparisons were performed using two models: a simple one-compartment warfarin PK model and a more complex PKPD model for pegylated interferon, with data on both concentration and response of viral load of hepatitis C virus. The results of the software tools were compared in terms of the standard error (SE) values of the parameters predicted by the software and the empirical SE values obtained via replicated clinical trial simulation and estimation. For the warfarin PK model and the pegylated interferon PKPD model, all software tools gave similar results. Interestingly, it was seen, for all software tools, that the simpler approximation to the Fisher information matrix, using the block diagonal matrix, provided predicted SE values that were closer to the empirical SE values than the more complicated approximation (the full matrix). For most PKPD models, using any of the available software tools will provide meaningful results, avoiding cumbersome simulation and allowing design optimization.
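The comparison between the full and block-diagonal approximations of the Fisher information matrix (FIM) reduces to a simple linear-algebra step: predicted standard errors are the square roots of the diagonal of the inverse FIM. A minimal sketch with made-up numbers (not values from the paper), showing how zeroing the cross term changes the predicted SEs:

```python
import math

def invert_2x2(m):
    """Invert a 2x2 matrix given as nested lists."""
    (a, b), (c, d) = m
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def standard_errors(fim):
    """Predicted SEs: square roots of the diagonal of the inverse FIM."""
    cov = invert_2x2(fim)
    return [math.sqrt(cov[i][i]) for i in range(2)]

# Full FIM with a cross term between the two parameters (illustrative values).
full = [[100.0, 10.0], [10.0, 25.0]]
# Block-diagonal approximation: the cross term is set to zero.
block = [[100.0, 0.0], [0.0, 25.0]]

print(standard_errors(full))
print(standard_errors(block))
```

With these numbers the block-diagonal approximation yields SEs of 0.1 and 0.2, while the full matrix yields slightly larger values, because the off-diagonal term inflates the parameter covariance on inversion.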

  14. The Implementation of Satellite Attitude Control System Software Using Object Oriented Design

    NASA Technical Reports Server (NTRS)

    Reid, W. Mark; Hansell, William; Phillips, Tom; Anderson, Mark O.; Drury, Derek

    1998-01-01

NASA established the Small Explorer (SMEX) program in 1988 to provide frequent opportunities for highly focused and relatively inexpensive space science missions. The SMEX program has produced five satellites, three of which have been successfully launched. The remaining two spacecraft are scheduled for launch within the coming year. NASA has recently developed a prototype for the next generation Small Explorer spacecraft (SMEX-Lite). This paper describes the object-oriented design (OOD) of the SMEX-Lite Attitude Control System (ACS) software. The SMEX-Lite ACS is three-axis controlled and is capable of performing sub-arc-minute pointing. This paper first describes the high-level requirements governing the SMEX-Lite ACS software architecture. Next, the context in which the software resides is explained. The paper describes the principles of encapsulation, inheritance, and polymorphism with respect to the implementation of an ACS software system. This paper also discusses the design of several ACS software components. Specifically, object-oriented designs are presented for sensor data processing, attitude determination, attitude control, and failure detection. Finally, this paper addresses the establishment of the ACS Foundation Class (AFC) Library. The AFC is a large software repository, requiring a minimal amount of code modification to produce ACS software for future projects.
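The OOD principles the abstract names (encapsulation, inheritance, polymorphism) can be illustrated with a toy sensor-processing hierarchy. The class and field names below are invented for illustration and do not reflect the actual SMEX-Lite code:

```python
from abc import ABC, abstractmethod

class Sensor(ABC):
    """Encapsulation: raw hardware access is hidden behind one interface."""

    @abstractmethod
    def read(self) -> dict:
        """Return a calibrated reading as a plain dictionary."""

class SunSensor(Sensor):
    """Inheritance: each concrete sensor specializes the base class."""

    def read(self):
        return {"type": "sun", "angle_deg": 12.5}

class Magnetometer(Sensor):
    def read(self):
        return {"type": "mag", "field_nT": [21000.0, -3000.0, 44000.0]}

def attitude_inputs(sensors):
    # Polymorphism: the attitude-determination step iterates over the base
    # type without knowing which concrete sensor produced each reading.
    return [s.read() for s in sensors]

if __name__ == "__main__":
    for reading in attitude_inputs([SunSensor(), Magnetometer()]):
        print(reading)
```

A foundation-class library in the sense described would collect such base classes so that a new mission swaps in its own subclasses with minimal code changes.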

  15. The Relationship between Software Design and Children's Engagement

    ERIC Educational Resources Information Center

    Buckleitner, Warren

    2006-01-01

    This study was an attempt to measure the effects of praise and reinforcement on children in a computer learning setting. A sorting game was designed to simulate 2 interaction styles. One style, called high computer control, provided frequent praise and coaching. The other, called high child control, had narration and praise toggled off. A…

  16. Software Developers' Attitudes toward User-Centered Design.

    ERIC Educational Resources Information Center

    Frick, Theodore; Boling, Elizabeth; Kim, Kyong-Jee; Oswald, Daniel; Zazelenchuk, Todd

    The concepts of usability and user-centered design (UCD) have grown in popularity over the past 20 years as measured by the number of research and mainstream articles devoted to their discussion. As with all new developments, however, there are always the questions of how things work in practice compared to theory. A survey of 83 software…

  17. A Dialogue and Social Software Perspective on Deep Learning Design

    ERIC Educational Resources Information Center

    Ravenscroft, Andrew; Boyle, Tom

    2010-01-01

    This article considers projects in Technology Enhanced Learning (TEL) that have focussed on designing digital tools that stimulate and support dialogue rich learning. These have emphasised collaborative thinking and meaning making in a rich and varied range of educational contexts. Technically, they have exploited AI, CSCL and HCI techniques, and…

  18. Framework Programmable Platform for the Advanced Software Development Workstation: Preliminary system design document

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Ackley, Keith A.; Crump, John W., IV; Henderson, Richard; Futrell, Michael T.

    1991-01-01

    The Framework Programmable Software Development Platform (FPP) is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software environment. Guided by the model, this system development framework will take advantage of an integrated operating environment to automate effectively the management of the software development process so that costly mistakes during the development phase can be eliminated. The focus here is on the design of components that make up the FPP. These components serve as supporting systems for the Integration Mechanism and the Framework Processor and provide the 'glue' that ties the FPP together. Also discussed are the components that allow the platform to operate in a distributed, heterogeneous environment and to manage the development and evolution of software system artifacts.

  19. Software/firmware design specification for 10-MWe solar-thermal central-receiver pilot plant

    SciTech Connect

    Ladewig, T.D.

    1981-03-01

    The software and firmware employed for the operation of the Barstow Solar Pilot Plant are completely described. The systems allow operator control of up to 2048 heliostats, and include the capability of operator-commanded control, graphic displays, status displays, alarm generation, system redundancy, and interfaces to the Operational Control System, the Data Acquisition System, and the Beam Characterization System. The requirements are decomposed into eleven software modules for execution in the Heliostat Array Controller computer, one firmware module for execution in the Heliostat Field Controller microprocessor, and one firmware module for execution in the Heliostat Controller microprocessor. The design of the modules to satisfy requirements, the interfaces between the computers, the software system structure, and the computers in which the software and firmware will execute are detailed. The testing sequence for validation of the software/firmware is described. (LEW)

  20. Independent Verification and Validation Of SAPHIRE 8 Software Design and Interface Design Project Number: N6423 U.S. Nuclear Regulatory Commission

    SciTech Connect

    Kent Norris

    2010-03-01

The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE software design and interface design is to assess the activities that result in the development, documentation, and review of a software design that meets the requirements defined in the software requirements documentation. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE had already been in production. IV&V reviewed the requirements specified in the NRC Form 189s to verify that these requirements were included in SAPHIRE's Software Verification and Validation Plan (SVVP) design specification.