Science.gov

Sample records for geant4 software design

  1. First statistical analysis of Geant4 quality software metrics

    NASA Astrophysics Data System (ADS)

    Ronchieri, Elisabetta; Grazia Pia, Maria; Giacomini, Francesco

    2015-12-01

    Geant4 is a simulation system of particle transport through matter, widely used in several experimental areas from high energy physics and nuclear experiments to medical studies. Some of its applications may involve critical use cases; therefore they would benefit from an objective assessment of the software quality of Geant4. In this paper, we provide a first statistical evaluation of software metrics data related to a set of Geant4 physics packages. The analysis aims at identifying risks for Geant4 maintainability, which would benefit from being addressed at an early stage. The findings of this pilot study set the grounds for further extensions of the analysis to the whole of Geant4 and to other high energy physics software systems.
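
The kind of statistical screening described above can be sketched in a few lines: compute summary statistics over a per-file metric and flag outliers as maintainability risks. The metric values below are invented for illustration; they are not Geant4 data.

```python
# Sketch: summary statistics over per-file software metrics, in the spirit of
# the pilot study above. The complexity values are hypothetical, not Geant4 data.
from statistics import mean, median, quantiles

# Hypothetical per-file cyclomatic complexity for one physics package
complexity = [3, 5, 4, 12, 7, 25, 6, 4, 9, 31, 5, 8]

def flag_outliers(values, k=1.5):
    """Flag values above Q3 + k*IQR as potential maintainability risks."""
    q1, _, q3 = quantiles(values, n=4)
    iqr = q3 - q1
    threshold = q3 + k * iqr
    return [v for v in values if v > threshold]

print("mean:", round(mean(complexity), 2))
print("median:", median(complexity))
print("flagged:", flag_outliers(complexity))
```

The Tukey fence (Q3 + 1.5·IQR) is one common, assumption-light way to single out files that merit early refactoring attention.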

  2. The GEANT4 Visualisation System

    SciTech Connect

    Allison, J.; Asai, M.; Barrand, G.; Donszelmann, M.; Minamimoto, K.; Tanaka, S.; Tcherniaev, E.; Tinslay, J.; /SLAC

    2007-11-02

    The Geant4 Visualization System is a multi-driver graphics system designed to serve the Geant4 Simulation Toolkit. It is aimed at the visualization of Geant4 data, primarily detector descriptions and simulated particle trajectories and hits. It can handle a variety of graphical technologies simultaneously and interchangeably, allowing the user to choose the visual representation most appropriate to requirements. It conforms to the low-level Geant4 abstract graphical user interfaces and introduces new abstract classes from which the various drivers are derived and that can be straightforwardly extended, for example, by the addition of a new driver. It makes use of an extendable class library of models and filters for data representation and selection. The Geant4 Visualization System supports a rich set of interactive commands based on the Geant4 command system. It is included in the Geant4 code distribution and maintained and documented like other components of Geant4.
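
The multi-driver design described above, one abstract interface with interchangeable concrete drivers, can be sketched as follows. The class and method names are illustrative, not the actual Geant4 class names.

```python
# Sketch: drivers derive from one abstract interface and are interchangeable
# at run time, as in the multi-driver visualization design described above.
from abc import ABC, abstractmethod

class VisDriver(ABC):
    @abstractmethod
    def draw(self, trajectory):
        ...

class AsciiDriver(VisDriver):
    def draw(self, trajectory):
        return "ascii:" + "->".join(trajectory)

class ListDriver(VisDriver):
    def draw(self, trajectory):
        return list(trajectory)

def render(driver: VisDriver, trajectory):
    return driver.draw(trajectory)   # same call, any driver

print(render(AsciiDriver(), ["vtx", "hit1", "hit2"]))  # ascii:vtx->hit1->hit2
```

Adding a new driver only requires deriving a new class; the calling code is untouched, which is the extension property the abstract-class design aims for.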

  3. MCNP5 and GEANT4 comparisons for preliminary Fast Neutron Pencil Beam design at the University of Utah TRIGA system

    NASA Astrophysics Data System (ADS)

    Adjei, Christian Amevi

    The main objective of this thesis is twofold. The first objective was to develop a model for meaningful benchmarking of different versions of GEANT4 against an experimental set-up and MCNP5 pertaining to photon transport and interactions. The second objective was to develop a preliminary design of a Fast Neutron Pencil Beam (FNPB) Facility applicable to the University of Utah research reactor (UUTR) using MCNP5 and GEANT4. Three GEANT4 code versions, GEANT4.9.4, GEANT4.9.3, and GEANT4.9.2, were compared to MCNP5 and to experimental measurements of gamma attenuation in air. The average gamma dose rate was measured in the laboratory at various distances from a shielded cesium source using a Ludlum model 19 portable NaI detector. As expected, the gamma dose rate decreased with distance. All three GEANT4 code versions agreed well with both the experimental data and the MCNP5 simulation. Additionally, a simple GEANT4 and MCNP5 model was developed to compare the codes' agreement for neutron interactions in various materials. A preliminary FNPB design was developed using MCNP5; a semi-accurate model was developed using GEANT4 (because GEANT4 does not support reactor physics modeling, the reactor was represented as a surface neutron source, hence a semi-accurate model). Based on the MCNP5 model, the fast neutron flux in a sample holder of the FNPB is 6.52×10⁷ n/cm²s, one order of magnitude lower than that of large fast neutron pencil beam facilities existing elsewhere. The MCNP5 model-based neutron spectrum indicates that the maximum fast neutron flux occurs at a neutron energy of ~1 MeV. In addition, the MCNP5 model provided information on the gamma flux to be expected in this preliminary FNPB design; specifically, in the sample holder the gamma flux is expected to be around 10⁸ γ/cm²s, delivering a gamma dose of 4.54×10³ rem/hr. This value is one to two orders of magnitude below the gamma
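
The distance trend the benchmark above verified is the point-source inverse-square law. A minimal sketch, with a made-up reference dose rate (the thesis does not quote one here):

```python
# Sketch: inverse-square falloff of gamma dose rate with distance from a
# point-like source. The 10 mrem/h at 1 m reference value is hypothetical.
def dose_rate(d_ref, r_ref, r):
    """Dose rate at distance r, given d_ref measured at reference distance r_ref."""
    return d_ref * (r_ref / r) ** 2

for r in (1.0, 2.0, 4.0):
    print(f"{r} m -> {dose_rate(10.0, 1.0, r):.3f} mrem/h")
```

Doubling the distance quarters the dose rate, which is the qualitative behaviour the NaI measurements and both codes reproduced.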

  4. Simulation of neutron production in heavy metal targets using Geant4 software

    NASA Astrophysics Data System (ADS)

    Baldin, A. A.; Berlev, A. I.; Kudashkin, I. V.; Mogildea, G.; Mogildea, M.; Paraipan, M.; Tyutyunnikov, S. I.

    2016-03-01

    Inelastic hadronic interactions in heavy targets have been simulated using Geant4 and compared with experimental data for thin and thick lead and uranium targets. Special attention is paid to neutron and fission fragment production. Good agreement in the description of proton-beam interaction with thick targets is demonstrated, which is important for the simulation of experiments aimed at the development of subcritical reactors.

  5. Geant4 Applications in Space

    SciTech Connect

    Asai, M.; /SLAC

    2007-11-07

    Use of Geant4 is rapidly expanding in the space application domain. I give an overview of three major application areas of Geant4 in space: apparatus simulation for pre-launch design and post-launch analysis, planetary-scale simulation for radiation spectra and surface and sub-surface explorations, and micro-dosimetry simulation for single-event studies and radiation-hardening of semiconductor devices. Recently, not only mission-dependent applications but also various multi-purpose or common tools built on top of Geant4 have become widely available. I overview some of these tools as well. The Geant4 Collaboration identifies space applications as one of the major driving forces of further development and refinement of the Geant4 toolkit. Highlights of such developments are introduced.

  6. GEANT4 Applications in Space

    NASA Astrophysics Data System (ADS)

    Asai, Makoto

    2008-06-01

    The use of Geant4 is rapidly expanding in the domain of space applications. I give an overview of three major application areas of Geant4 in space: apparatus simulation for pre-launch design and post-launch analysis, planetary-scale simulation for radiation spectra and surface and sub-surface explorations, and micro-dosimetry simulation for single-event studies and radiation-hardening of semiconductor devices. Recently, not only mission-dependent applications but also various multi-purpose or common tools built on top of Geant4 have become widely available. I overview some of these tools as well. The Geant4 Collaboration identifies space applications as now one of the major driving forces of further development and refinement of the Geant4 toolkit. Highlights of such developments are given.

  7. GEANT4 used for neutron beam design of a neutron imaging facility at TRIGA reactor in Morocco

    NASA Astrophysics Data System (ADS)

    Ouardi, A.; Machmach, A.; Alami, R.; Bensitel, A.; Hommada, A.

    2011-09-01

    Neutron imaging has a broad scope of applications and has played a pivotal role in visualizing and quantifying hydrogenous masses in metallic matrices. The field continues to expand into new applications with the installation of new neutron imaging facilities. In this scope, a neutron imaging facility for computed tomography and real-time neutron radiography is currently being developed around the 2.0 MW TRIGA MARK-II reactor at the Maamora Nuclear Research Center in Morocco (Reuscher et al., 1990 [1]; de Menezes et al., 2003 [2]; Deinert et al., 2005 [3]). The neutron imaging facility consists of a neutron collimator, a real-time neutron imaging system and image processing systems. In order to reduce the gamma-ray content in the neutron beam, the tangential channel was selected. For a power of 250 kW, the corresponding thermal neutron flux measured at the inlet of the tangential channel is around 3×10¹¹ n/cm²/s. This facility will be based on a conical neutron collimator with two circular diaphragms with diameters of 4 and 2 cm, corresponding to L/D ratios of 165 and 325, respectively. These diaphragm sizes offer a compromise between good flux and an efficient L/D ratio. A convergent-divergent collimator geometry has been adopted. The beam line consists of a gamma filter, a fast neutron filter, a neutron moderator, neutron and gamma shutters, biological shielding around the collimator and several stages of neutron collimation. Monte Carlo calculations with the fully 3D code GEANT4 were used to design the neutron beam line (http://www.info.cern.ch/asd/geant4/geant4.html [4]). To enhance the quality of the thermal neutron beam, several materials, mainly bismuth (Bi) and sapphire (Al₂O₃), were examined as gamma and neutron filters, respectively. The GEANT4 simulations showed that gamma rays and epithermal and fast neutrons can be filtered using bismuth (Bi) and sapphire (Al₂O₃) filters, respectively. To get a good cadmium ratio, GEANT4 simulations were used to
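
The L/D ratio quoted above is the standard collimation figure of merit: L is the aperture-to-image distance and D the diaphragm diameter, with higher L/D giving sharper images at the cost of flux. A quick consistency check on the abstract's numbers:

```python
# Sketch: collimator length implied by the quoted L/D ratios and diaphragm
# diameters (D = 4 cm at L/D = 165, D = 2 cm at L/D = 325), assuming L is the
# aperture-to-image distance.
def collimator_length(l_over_d, diameter_cm):
    return l_over_d * diameter_cm

for l_over_d, d in ((165, 4.0), (325, 2.0)):
    print(f"L/D={l_over_d}, D={d} cm -> L = {collimator_length(l_over_d, d):.0f} cm")
```

Both diaphragms imply nearly the same flight path (about 6.5 m), which is what one expects when two interchangeable apertures share one beam line.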

  8. Recent Developments in the Geant4 Hadronic Framework

    NASA Astrophysics Data System (ADS)

    Pokorski, Witold; Ribon, Alberto

    2014-06-01

    In this paper we present recent developments in the Geant4 hadronic framework. Geant4 is the main simulation toolkit used by the LHC experiments, and therefore a lot of effort is put into improving the physics models so that they have more predictive power. As a consequence, the code complexity increases, which requires constant improvement and optimization on the programming side. At the same time, we would like to review and eventually reduce the complexity of the hadronic software framework. As an example, a factory design pattern has been applied in Geant4 to avoid duplication of objects, such as cross sections, which can be used by several processes or physics models. This approach has also been applied to physics lists, to provide a flexible configuration mechanism at run time based on macro files. Moreover, these developments open up the future possibility of building Geant4 with only a specified sub-set of physics models. Another technical development focused on the reproducibility of the simulation, i.e. the possibility to repeat an event once the random generator status at the beginning of the event is known. This is crucial for debugging rare situations that may occur after long simulations. Moreover, reproducibility in normal, sequential Geant4 simulation is an important prerequisite for verifying the equivalence with multithreaded Geant4 simulations.
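
The factory idea described above, handing out one shared object per name so several models reuse it instead of duplicating it, can be sketched as a small registry. The class and names are illustrative, not the Geant4 API.

```python
# Sketch: a factory/registry that returns one shared cross-section object per
# name, avoiding the duplication the text describes. Illustrative names only.
class CrossSectionFactory:
    _instances = {}
    _builders = {}

    @classmethod
    def register(cls, name, builder):
        cls._builders[name] = builder

    @classmethod
    def get(cls, name):
        # build lazily on first request, then hand out the same object
        if name not in cls._instances:
            cls._instances[name] = cls._builders[name]()
        return cls._instances[name]

class ElasticXS:
    pass

CrossSectionFactory.register("elastic", ElasticXS)
a = CrossSectionFactory.get("elastic")
b = CrossSectionFactory.get("elastic")
print(a is b)  # True: two "models" share one cross-section object
```

The same registry shape also supports run-time configuration: a macro file can name the objects to build, which is the flexibility the abstract mentions for physics lists.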

  9. A Virtual Geant4 Environment

    NASA Astrophysics Data System (ADS)

    Iwai, Go

    2015-12-01

    We describe the development of an environment for Geant4 consisting of an application and data that provide users with a more efficient way to access Geant4 applications without having to download and build the software locally. The environment is platform-neutral and offers users near-real-time performance. In addition, the environment consists of data and Geant4 libraries built using low-level virtual machine (LLVM) tools, which can produce bitcode that can be embedded in HTML and accessed via a browser. The bitcode is downloaded to the local machine via the browser and can then be configured by the user. This approach provides a way of minimising the risk of leaking potentially sensitive data used to construct the Geant4 model and application in the medical domain for treatment planning. We describe several applications that have used this approach and compare their performance with that of native applications. We also describe potential user communities that could benefit from this approach.

  10. Design of Cherenkov bars for the optical part of the time-of-flight detector in Geant4.

    PubMed

    Nozka, L; Brandt, A; Rijssenbeek, M; Sykora, T; Hoffman, T; Griffiths, J; Steffens, J; Hamal, P; Chytka, L; Hrabovsky, M

    2014-11-17

    We present the results of studies devoted to the development and optimization of the optical part of a high precision time-of-flight (TOF) detector for the Large Hadron Collider (LHC). This work was motivated by a proposal to use such a detector in conjunction with a silicon detector to tag and measure protons from interactions of the type p + p → p + X + p, where the two outgoing protons are scattered in the very forward directions. The fast timing detector uses fused silica (quartz) bars that emit Cherenkov radiation as a relativistic particle passes through; the emitted Cherenkov photons are detected by, for instance, a micro-channel plate multi-anode photomultiplier tube (MCP-PMT). Several possible designs are implemented in Geant4 and studied to optimize timing as a function of photon arrival time and the number of Cherenkov photons reaching the photo-sensor. PMID:25402137
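
The photon count such bars deliver can be estimated from the Frank-Tamm relation. A minimal sketch, treating the fused-silica refractive index as constant over an assumed 300-600 nm sensitivity window (both simplifications; the paper's full simulation tracks dispersion and optics):

```python
# Sketch: Frank-Tamm estimate of Cherenkov photon yield per cm of track for a
# unit-charge relativistic particle in fused silica. Constant n and the
# wavelength window are simplifying assumptions.
import math

ALPHA = 1 / 137.036  # fine-structure constant

def photons_per_cm(n=1.46, beta=1.0, lam1_nm=300.0, lam2_nm=600.0):
    if beta * n <= 1.0:
        return 0.0  # below the Cherenkov threshold, no emission
    factor = 1.0 - 1.0 / (beta * n) ** 2
    per_m = 2 * math.pi * ALPHA * factor * (1 / (lam1_nm * 1e-9) - 1 / (lam2_nm * 1e-9))
    return per_m / 100.0

print(f"~{photons_per_cm():.0f} photons/cm")
```

The estimate lands at a few hundred photons per cm, the order of magnitude that makes quartz-bar timing with an MCP-PMT feasible.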

  11. A CAD interface for GEANT4.

    PubMed

    Poole, C M; Cornelius, I; Trapp, J V; Langton, C M

    2012-09-01

    Often CAD models already exist for parts of a geometry being simulated using GEANT4. Direct import of these CAD models into GEANT4, however, may not be possible, and complex components may be difficult to define via other means. Solutions that allow users to work around the limited support in the GEANT4 toolkit for loading predefined CAD geometries have been presented by others; however, these solutions require intermediate file format conversion using commercial software. Herein we describe a technique that allows CAD models to be loaded directly as geometry without the need for commercial software and intermediate file format conversion. Robustness of the interface was tested using a set of CAD models of various complexity; for the models used in testing, no import errors were reported and all geometry was found to be navigable by GEANT4. PMID:22956356
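
The first step any such interface needs is reading triangle facets out of a mesh file, the kind of data a GEANT4 tessellated solid can be built from. A minimal ASCII STL reader as an illustration; this is not the authors' interface, just the flavour of the parsing involved:

```python
# Sketch: read triangle facets from an ASCII STL mesh, the raw input a
# tessellated-solid geometry import would consume. Illustrative, not the
# interface from the paper.
def read_ascii_stl(text):
    """Return a list of triangles, each a list of three (x, y, z) tuples."""
    triangles, current = [], []
    for line in text.splitlines():
        parts = line.split()
        if parts and parts[0] == "vertex":
            current.append(tuple(float(v) for v in parts[1:4]))
            if len(current) == 3:
                triangles.append(current)
                current = []
    return triangles

stl = """solid demo
facet normal 0 0 1
  outer loop
    vertex 0 0 0
    vertex 1 0 0
    vertex 0 1 0
  endloop
endfacet
endsolid demo"""
print(read_ascii_stl(stl))
```

Each facet would then be registered with the simulation geometry, which is where the navigability testing mentioned in the abstract comes in.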

  12. Simulations of nuclear resonance fluorescence in GEANT4

    NASA Astrophysics Data System (ADS)

    Lakshmanan, Manu N.; Harrawood, Brian P.; Rusev, Gencho; Agasthya, Greeshma A.; Kapadia, Anuj J.

    2014-11-01

    The nuclear resonance fluorescence (NRF) technique has been used effectively to identify isotopes based on their nuclear energy levels. Specific examples of its modern-day applications include detecting spent nuclear waste and cargo scanning for homeland security. The experimental designs for these NRF applications can be more efficiently optimized using Monte Carlo simulations before the experiment is implemented. One of the most widely used Monte Carlo physics simulations is the open-source toolkit GEANT4. However, NRF physics has not been incorporated into the GEANT4 simulation toolkit in publicly available software. Here we describe the development and testing of an NRF simulation in GEANT4. We describe in depth the development and architecture of this software for the simulation of NRF in any isotope in GEANT4, as well as verification and validation testing of the simulation for NRF in boron. In the verification testing, the simulation showed agreement with the analytical model to within 0.6% difference for boron and iron. In the validation testing, the simulation showed agreement to within 20.5% difference with the experimental measurements for boron, with the percent difference likely due to small uncertainties in beam polarization, energy distribution, and detector composition.
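
The resonant line shape at the heart of NRF is a Breit-Wigner (Lorentzian) peaked at the nuclear level energy. A sketch of the relative cross-section shape, with illustrative energy and width rather than the boron levels used in the paper:

```python
# Sketch: Breit-Wigner (Lorentzian) resonance shape underlying NRF.
# e0 and gamma below are illustrative placeholders, not real level data.
def breit_wigner(e, e0, gamma, peak=1.0):
    """Relative cross section at energy e for a resonance at e0 with FWHM gamma."""
    half = gamma / 2.0
    return peak * half ** 2 / ((e - e0) ** 2 + half ** 2)

e0, gamma = 2.0, 0.01  # MeV, illustrative
print(breit_wigner(e0, e0, gamma))           # 1.0 at resonance
print(breit_wigner(e0 + gamma / 2, e0, gamma))  # ~0.5 at the half-width point
```

A simulation samples photon energies against this shape (folded with Doppler broadening and the beam's energy distribution) to decide when a resonant absorption occurs.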

  13. Designing a new type of neutron detector for neutron and gamma-ray discrimination via GEANT4.

    PubMed

    Shan, Qing; Chu, Shengnan; Ling, Yongsheng; Cai, Pingkun; Jia, Wenbao

    2016-04-01

    The design of a new type of neutron detector, consisting of a fast neutron converter, a plastic scintillator, and a Cherenkov detector, to discriminate 14-MeV fast neutrons and gamma rays in a pulsed n-γ mixed field and monitor their fluxes is reported in this study. Both neutrons and gamma rays can produce fluorescence in the scintillator when they are incident on the detector. However, only the secondary charged particles of the gamma rays can produce Cherenkov light in the Cherenkov detector. The neutron and gamma-ray fluxes can be calculated by measuring the fluorescence and Cherenkov light. The GEANT4 Monte Carlo simulation toolkit is used to simulate the whole process occurring in the detector, from which its optimum parameters are obtained. Analysis of the simulation results leads to a calculation method of neutron flux. This method is verified by calculating the neutron fluxes using pulsed n-γ mixed fields with different n/γ ratios, and the results show that the relative errors of all calculations are <5%. PMID:26844541
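
The discrimination logic above reduces to two linear equations: the scintillator light mixes both fluxes, while the Cherenkov light tags gammas only. A sketch with hypothetical response coefficients (in a real detector these calibration constants come from the GEANT4 model):

```python
# Sketch: recover neutron and gamma fluxes from the two measured light yields.
# a_n, a_g, c_g are hypothetical response coefficients (light per unit flux).
def solve_fluxes(scint, cherenkov, a_n, a_g, c_g):
    """scint = a_n*phi_n + a_g*phi_g ; cherenkov = c_g*phi_g (neutrons give none)."""
    phi_g = cherenkov / c_g
    phi_n = (scint - a_g * phi_g) / a_n
    return phi_n, phi_g

phi_n, phi_g = solve_fluxes(scint=130.0, cherenkov=40.0, a_n=1.0, a_g=0.5, c_g=0.2)
print(round(phi_n, 6), round(phi_g, 6))
```

Because the Cherenkov channel is blind to neutrons, the system is triangular and solves directly, no fitting required.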

  14. GEANT4-MT : bringing multi-threading into GEANT4 production

    NASA Astrophysics Data System (ADS)

    Ahn, Sunil; Apostolakis, John; Asai, Makoto; Brandt, Daniel; Cooperman, Gene; Cosmo, Gabriele; Dotti, Andrea; Dong, Xin; Jun, Soon Yung; Nowak, Andrzej

    2014-06-01

    GEANT4-MT is the multi-threaded version of the GEANT4 particle transport code.(1, 2) The key goals for the design of GEANT4-MT have been: a) to reduce the memory footprint of the multi-threaded application compared to the use of separate jobs and processes; b) to allow easy migration of existing applications; and c) to use many threads or cores efficiently, scaling up to tens and potentially hundreds of workers. The first public release of a GEANT4-MT prototype was made in 2011. We report on the revision of GEANT4-MT for inclusion in the production-level release scheduled for the end of 2013. This has involved significant re-engineering of the prototype in order to incorporate it into the main GEANT4 development line, and the porting of the GEANT4-MT threading code to additional platforms. In order to make the porting of applications as simple as possible, refinements addressed the needs of standalone applications. Further adaptations were created to improve the fit with the frameworks of High Energy Physics (HEP) experiments. We report on performance measurements on Intel Xeon™ and AMD Opteron™ processors, and the first trials of GEANT4-MT on the Intel Many Integrated Cores (MIC) architecture, in the form of the Xeon Phi™ co-processor.(3) These indicate near-linear scaling through about 200 threads on 60 cores, when holding fixed the number of events per thread.
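
Event-level parallelism of the kind described above can be sketched in a few lines: events are independent, so they can be distributed across worker threads and the results merged, with the total independent of the thread count. The "simulate_event" body is a stand-in for real tracking work.

```python
# Sketch: event-level parallelism in the spirit of GEANT4-MT, holding the
# number of events per worker fixed. simulate_event is a trivial stand-in.
from concurrent.futures import ThreadPoolExecutor

def simulate_event(event_id):
    return event_id * event_id  # stand-in for transporting one event

def run(n_threads, events_per_thread):
    events = range(n_threads * events_per_thread)
    with ThreadPoolExecutor(max_workers=n_threads) as pool:
        return sum(pool.map(simulate_event, events))

print(run(4, 5))   # same total regardless of how many threads share the work
```

The memory advantage of the threaded design comes from workers sharing read-only tables (geometry, cross sections) in one address space instead of each process carrying its own copy.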

  15. Geant4 software application for the simulation of cosmic ray showers in the Earth’s atmosphere

    NASA Astrophysics Data System (ADS)

    Paschalis, P.; Mavromichalaki, H.; Dorman, L. I.; Plainaki, C.; Tsirigkas, D.

    2014-11-01

    Galactic cosmic rays and solar energetic particles with sufficient rigidity to penetrate the geomagnetic field, enter the Earth’s atmosphere and interact with the electrons and the nuclei of its atoms and molecules. From the interactions with the nuclei, cascades of secondary particles are produced that can be detected by ground-based detectors such as neutron monitors and muon counters. The theoretical study of the details of the atmospheric showers is of great importance, since many applications, such as the dosimetry for the aviation crews, are based on it. In this work, a new application which can be used in order to study the showers of the secondary particles in the atmosphere is presented. This application is based on the Monte Carlo simulation techniques, performed by using the well-known Geant4 toolkit. We present a thorough analysis of the simulation’s critical points, including a description of the procedure applied in order to model the atmosphere and the geomagnetic field. Representative results obtained by the application are presented and future plans for the project are discussed.

  16. Geant4 - Towards major release 10

    NASA Astrophysics Data System (ADS)

    Cosmo, G.; Geant4 Collaboration

    2014-06-01

    The Geant4 simulation toolkit reached maturity in the middle of the previous decade, providing a wide variety of established features coherently aggregated in a software product which has become the standard for detector simulation in HEP and is used in a variety of other application domains. We review the most recent capabilities introduced in the kernel, highlighting those being prepared for the next major release (version 10.0), which is scheduled for the end of 2013. A significant new feature of this release will be the integration of multi-threaded processing, targeting efficient use of modern many-core system architectures and minimization of the memory footprint by exploiting event-level parallelism. We discuss its design features and impact on the existing API and user interface of Geant4. Revisions are made to balance the need to preserve backwards compatibility against consolidating and improving the interfaces, taking into account requirements from the multi-threaded extensions and from the evolution of the data processing models of the LHC experiments.

  17. GEANT4: Applications in High Energy Physics

    SciTech Connect

    Mahmood, Tariq; Zafar, Abrar Ahmed; Hussain, Talib; Rashid, Haris

    2007-02-14

    GEANT4 is a detector simulation toolkit aimed mainly at the study of experimental high energy physics. In this paper we give an overview of this software with special reference to its applications in high energy physics experiments. A brief description of process methods is given, and the object-oriented nature of the simulation toolkit is highlighted.

  18. GAMOS: A framework to do GEANT4 simulations in different physics fields with an user-friendly interface

    NASA Astrophysics Data System (ADS)

    Arce, Pedro; Ignacio Lagares, Juan; Harkness, Laura; Pérez-Astudillo, Daniel; Cañadas, Mario; Rato, Pedro; de Prado, María; Abreu, Yamiel; de Lorenzo, Gianluca; Kolstein, Machiel; Díaz, Angelina

    2014-01-01

    GAMOS is a software system for GEANT4-based simulation. It comprises a framework, a set of components providing functionality to simulation applications on top of the GEANT4 toolkit, and a collection of ready-made applications. It allows users to perform GEANT4-based simulations using a scripting language, without requiring the writing of C++ code. Moreover, the GAMOS design allows the extension of the existing functionality through user-supplied C++ classes. The main characteristics of GAMOS and its embedded functionality are described.

  19. Geant4 Computing Performance Benchmarking and Monitoring

    NASA Astrophysics Data System (ADS)

    Dotti, Andrea; Elvira, V. Daniel; Folger, Gunter; Genser, Krzysztof; Jun, Soon Yung; Kowalkowski, James B.; Paterno, Marc

    2015-12-01

    Performance evaluation and analysis of large scale computing applications is essential for optimal use of resources. As detector simulation is one of the most compute intensive tasks and Geant4 is the simulation toolkit most widely used in contemporary high energy physics (HEP) experiments, it is important to monitor Geant4 through its development cycle for changes in computing performance and to identify problems and opportunities for code improvements. All Geant4 development and public releases are being profiled with a set of applications that utilize different input event samples, physics parameters, and detector configurations. Results from multiple benchmarking runs are compared to previous public and development reference releases to monitor CPU and memory usage. Observed changes are evaluated and correlated with code modifications. Besides the full summary of call stack and memory footprint, a detailed call graph analysis is available to Geant4 developers for further analysis. The set of software tools used in the performance evaluation procedure, both in sequential and multi-threaded modes, include FAST, IgProf and Open|Speedshop. The scalability of the CPU time and memory performance in multi-threaded application is evaluated by measuring event throughput and memory gain as a function of the number of threads for selected event samples.
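
The two scalability figures the abstract closes with, event throughput and memory gain versus thread count, reduce to simple ratios. A sketch with invented run numbers for illustration:

```python
# Sketch: the scalability metrics quoted above. The run numbers (4000 events
# in 50 s; 1.2 GB for one 4-thread job vs 4 x 0.9 GB for separate jobs) are
# hypothetical, for illustration only.
def throughput(events, wall_seconds):
    """Event throughput in events per second."""
    return events / wall_seconds

def memory_gain(separate_jobs_mb, mt_mb):
    """Memory N separate processes would need relative to one multi-threaded job."""
    return separate_jobs_mb / mt_mb

print(throughput(4000, 50.0))          # events/s for the hypothetical run
print(memory_gain(4 * 900.0, 1200.0))  # x-fold memory saving
```

Plotting these two ratios as functions of the number of threads, for fixed event samples, is exactly the measurement procedure the profiling campaign describes.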

  20. Geant4 Computing Performance Benchmarking and Monitoring

    SciTech Connect

    Dotti, Andrea; Elvira, V. Daniel; Folger, Gunter; Genser, Krzysztof; Jun, Soon Yung; Kowalkowski, James B.; Paterno, Marc

    2015-12-23

    Performance evaluation and analysis of large scale computing applications is essential for optimal use of resources. As detector simulation is one of the most compute intensive tasks and Geant4 is the simulation toolkit most widely used in contemporary high energy physics (HEP) experiments, it is important to monitor Geant4 through its development cycle for changes in computing performance and to identify problems and opportunities for code improvements. All Geant4 development and public releases are being profiled with a set of applications that utilize different input event samples, physics parameters, and detector configurations. Results from multiple benchmarking runs are compared to previous public and development reference releases to monitor CPU and memory usage. Observed changes are evaluated and correlated with code modifications. Besides the full summary of call stack and memory footprint, a detailed call graph analysis is available to Geant4 developers for further analysis. The set of software tools used in the performance evaluation procedure, both in sequential and multi-threaded modes, include FAST, IgProf and Open|Speedshop. In conclusion, the scalability of the CPU time and memory performance in multi-threaded application is evaluated by measuring event throughput and memory gain as a function of the number of threads for selected event samples.

  1. Geant4 Computing Performance Benchmarking and Monitoring

    DOE PAGESBeta

    Dotti, Andrea; Elvira, V. Daniel; Folger, Gunter; Genser, Krzysztof; Jun, Soon Yung; Kowalkowski, James B.; Paterno, Marc

    2015-12-23

    Performance evaluation and analysis of large scale computing applications is essential for optimal use of resources. As detector simulation is one of the most compute intensive tasks and Geant4 is the simulation toolkit most widely used in contemporary high energy physics (HEP) experiments, it is important to monitor Geant4 through its development cycle for changes in computing performance and to identify problems and opportunities for code improvements. All Geant4 development and public releases are being profiled with a set of applications that utilize different input event samples, physics parameters, and detector configurations. Results from multiple benchmarking runs are compared to previous public and development reference releases to monitor CPU and memory usage. Observed changes are evaluated and correlated with code modifications. Besides the full summary of call stack and memory footprint, a detailed call graph analysis is available to Geant4 developers for further analysis. The set of software tools used in the performance evaluation procedure, both in sequential and multi-threaded modes, include FAST, IgProf and Open|Speedshop. In conclusion, the scalability of the CPU time and memory performance in multi-threaded application is evaluated by measuring event throughput and memory gain as a function of the number of threads for selected event samples.

  2. Geant4 VMC 3.0

    NASA Astrophysics Data System (ADS)

    Hřivnáčová, I.; Gheata, A.

    2015-12-01

    Virtual Monte Carlo (VMC) [1] provides an abstract interface into Monte Carlo transport codes. A user VMC-based application, independent of the specific Monte Carlo code, can then be run with any of the supported simulation programs. Developed by the ALICE Offline Project and later included in ROOT [2], the interface and implementations have reached stability during the last decade and have become a foundation for other detector simulation frameworks, the FAIR facility experiments' framework being among the first and largest. Geant4 VMC [3], which provides the implementation of the VMC interface for Geant4 [4], is in continuous maintenance and development, driven by the evolution of Geant4 on one side and requirements from users on the other. Besides the implementation of the VMC interface, Geant4 VMC also provides a set of examples that demonstrate the use of VMC to new users and also serve for testing purposes. Since major release 2.0, it includes the G4Root navigator package, which implements an interface that allows one to run a Geant4 simulation using a ROOT geometry. The release of Geant4 version 10.00, with its integration of multithreaded processing, triggered the development of the next major version of Geant4 VMC (version 3.0), which was released in November 2014. A beta version, available for user testing since March, helped its consolidation and improvement. We review the new capabilities introduced in this major version, in particular the integration of multithreading into the VMC design, its impact on the Geant4 VMC and G4Root packages, and the introduction of a new package, MTRoot, providing utility functions for ROOT parallel output in independent files with necessary additions for thread-safety. Migration of user applications to multithreading that preserves the ease of use of VMC will also be discussed. We will also report on the introduction of a new CMake [5] based build system, the migration to ROOT major release 6 and the

  3. The Cryogenic AntiCoincidence Detector for the ATHENA X-IFU: Design Aspects by Geant4 Simulation and Preliminary Characterization of the New Single Pixel

    NASA Astrophysics Data System (ADS)

    Macculi, C.; Argan, A.; D'Andrea, M.; Lotti, S.; Piro, L.; Biasotti, M.; Corsini, D.; Gatti, F.; Orlando, A.; Torrioli, G.

    2016-01-01

    The ATHENA observatory is the second large-class ESA mission in the context of the Cosmic Vision 2015-2025 programme, scheduled for launch in 2028 to an L2 orbit. One of the two planned focal plane instruments is the X-ray Integral Field Unit (X-IFU), which will be able to perform simultaneous high-grade energy spectroscopy and imaging over a 5 arcmin FoV by means of a kilo-pixel array of transition-edge sensor (TES) microcalorimeters coupled to high-quality X-ray optics. The X-IFU sensitivity is degraded by the particle background, induced by primary protons of both solar and cosmic-ray origin and by secondary electrons. A Cryogenic AntiCoincidence (CryoAC) TES-based detector, located less than 1 mm below the TES array, will allow the mission to reach the background level that enables its scientific goals. The CryoAC is a 4-pixel detector made of silicon absorbers sensed by iridium TESs. We currently achieve TRL = 3-4 at the single-pixel level. We have designed and developed two further prototypes in order to reach TRL = 4. The design of the CryoAC has also been optimized using the Geant4 simulation tool. Here we describe some results from the Geant4 simulations performed to optimize the design, and preliminary test results from the first of the two detectors, of 1 cm² area, made of 65 Ir TESs.

  4. The Cryogenic AntiCoincidence Detector for the ATHENA X-IFU: Design Aspects by Geant4 Simulation and Preliminary Characterization of the New Single Pixel

    NASA Astrophysics Data System (ADS)

    Macculi, C.; Argan, A.; D'Andrea, M.; Lotti, S.; Piro, L.; Biasotti, M.; Corsini, D.; Gatti, F.; Orlando, A.; Torrioli, G.

    2016-08-01

    The ATHENA observatory is the second large-class ESA mission in the context of the Cosmic Vision 2015-2025 programme, scheduled for launch in 2028 to an L2 orbit. One of the two planned focal plane instruments is the X-ray Integral Field Unit (X-IFU), which will be able to perform simultaneous high-grade energy spectroscopy and imaging over a 5 arcmin FoV by means of a kilo-pixel array of transition-edge sensor (TES) microcalorimeters coupled to high-quality X-ray optics. The X-IFU sensitivity is degraded by the particle background, induced by primary protons of both solar and cosmic-ray origin and by secondary electrons. A Cryogenic AntiCoincidence (CryoAC) TES-based detector, located less than 1 mm below the TES array, will allow the mission to reach the background level that enables its scientific goals. The CryoAC is a 4-pixel detector made of silicon absorbers sensed by iridium TESs. We currently achieve TRL = 3-4 at the single-pixel level. We have designed and developed two further prototypes in order to reach TRL = 4. The design of the CryoAC has also been optimized using the Geant4 simulation tool. Here we describe some results from the Geant4 simulations performed to optimize the design, and preliminary test results from the first of the two detectors, of 1 cm² area, made of 65 Ir TESs.

  5. Geant4-DNA: overview and recent developments

    NASA Astrophysics Data System (ADS)

    Štěpán, Václav

    software already available for download, as well as future perspectives, will be presented, on behalf of the Geant4-DNA Collaboration.

  6. The Geant4 Bertini Cascade

    NASA Astrophysics Data System (ADS)

    Wright, D. H.; Kelsey, M. H.

    2015-12-01

    One of the medium energy hadron-nucleus interaction models in the GEANT4 simulation toolkit is based partly on the Bertini intranuclear cascade model. Since its initial appearance in the toolkit, this model has been largely re-written in order to extend its physics capabilities and to reduce its memory footprint. Physics improvements include extensions in applicable energy range and incident particle types, and improved hadron-nucleon cross-sections and angular distributions. Interfaces have also been developed which allow the model to be coupled with other GEANT4 models at lower and higher energies. The inevitable speed reductions due to enhanced physics have been mitigated by memory and CPU efficiency improvements. Details of these improvements, along with selected comparisons of the model to data, are discussed.

  7. The Geant4 Bertini Cascade

    SciTech Connect

    Wright, D. H.; Kelsey, M. H.

    2015-12-01

    One of the medium energy hadron–nucleus interaction models in the Geant4 simulation toolkit is based partly on the Bertini intranuclear cascade model. Since its initial appearance in the toolkit, this model has been largely re-written in order to extend its physics capabilities and to reduce its memory footprint. Physics improvements include extensions in applicable energy range and incident particle types, and improved hadron–nucleon cross-sections and angular distributions. Interfaces have also been developed which allow the model to be coupled with other Geant4 models at lower and higher energies. The inevitable speed reductions due to enhanced physics have been mitigated by memory and CPU efficiency improvements. Details of these improvements, along with selected comparisons of the model to data, are discussed.

  8. Introduction to the Geant4 Simulation toolkit

    SciTech Connect

    Guatelli, S.; Cutajar, D.; Rosenfeld, A. B.; Oborn, B.

    2011-05-05

    Geant4 is a Monte Carlo simulation toolkit describing the interactions of particles with matter. Geant4 is widely used in radiation physics research, from High Energy Physics to medical physics and space science, thanks to its sophisticated physics component coupled with advanced functionality in geometry description. Geant4 is widely used at the Centre for Medical Radiation Physics (CMRP), at the University of Wollongong, to characterise and optimise novel detector concepts, radiotherapy treatments, and imaging solutions. This lecture consists of an introduction to the Monte Carlo method and to Geant4. Particular attention will be devoted to the Geant4 physics component and to the physics models describing electromagnetic and hadronic interactions. The second part of the lecture will focus on the methodology to adopt when developing a Geant4 simulation application.

  9. Introduction to the Geant4 Simulation toolkit

    NASA Astrophysics Data System (ADS)

    Guatelli, S.; Cutajar, D.; Oborn, B.; Rosenfeld, A. B.

    2011-05-01

    Geant4 is a Monte Carlo simulation toolkit describing the interactions of particles with matter. Geant4 is widely used in radiation physics research, from High Energy Physics to medical physics and space science, thanks to its sophisticated physics component coupled with advanced functionality in geometry description. Geant4 is widely used at the Centre for Medical Radiation Physics (CMRP), at the University of Wollongong, to characterise and optimise novel detector concepts, radiotherapy treatments, and imaging solutions. This lecture consists of an introduction to the Monte Carlo method and to Geant4. Particular attention will be devoted to the Geant4 physics component and to the physics models describing electromagnetic and hadronic interactions. The second part of the lecture will focus on the methodology to adopt when developing a Geant4 simulation application.

  10. The Geant4 Physics Validation Repository

    SciTech Connect

    Wenzel, H.; Yarba, J.; Dotti, A.

    2015-12-23

    The Geant4 collaboration regularly performs validation and regression tests. The results are stored in a central repository and can be easily accessed via a web application. In this article we describe the Geant4 physics validation repository, which consists of a relational database storing experimental data and Geant4 test results, a Java API, and a web application. The functionality of these components and the technology choices we made are also described.

  11. The Geant4 physics validation repository

    DOE PAGESBeta

    Wenzel, H.; Yarba, J.; Dotti, A.

    2015-01-01

    The Geant4 collaboration regularly performs validation and regression tests. The results are stored in a central repository and can be easily accessed via a web application. In this article we describe the Geant4 physics validation repository, which consists of a relational database storing experimental data and Geant4 test results, a Java API, and a web application. Lastly, the functionality of these components and the technology choices we made are described.

  12. The Geant4 physics validation repository

    NASA Astrophysics Data System (ADS)

    Wenzel, H.; Yarba, J.; Dotti, A.

    2015-12-01

    The Geant4 collaboration regularly performs validation and regression tests. The results are stored in a central repository and can be easily accessed via a web application. In this article we describe the Geant4 physics validation repository which consists of a relational database storing experimental data and Geant4 test results, a java API and a web application. The functionality of these components and the technology choices we made are also described.

  13. Visualization drivers for Geant4

    SciTech Connect

    Beretvas, Andy; /Fermilab

    2005-10-01

    This document is on Geant4 visualization tools (drivers), evaluating the pros and cons of each option, including recommendations on which tools to support at Fermilab for different applications. Four visualization drivers are evaluated: OpenGL, HepRep, DAWN and VRML. They all have good features: OpenGL provides graphic output without an intermediate file; HepRep provides menus to assist the user; DAWN provides high-quality plots and produces output quickly even for large files; VRML uses the smallest disk space for intermediate files. Large experiments at Fermilab will want to write their own display, and should proceed to make this display graphics-independent. Medium-sized experiments will probably want to use HepRep because of its menu support. Smaller-scale experiments will want to use OpenGL in the spirit of having immediate response, good-quality output and keeping things simple.

  14. Implementing NRF Physics in Geant4

    SciTech Connect

    Jordan, David V.; Warren, Glen A.

    2006-07-01

    The Geant4 radiation transport Monte Carlo code toolkit currently does not support nuclear resonance fluorescence (NRF). After a brief review of NRF physics, plans for implementing this physics process in Geant4, and validating the output of the code, are described. The plans will be executed as Task 3 of project 50799, "Nuclear Resonance Fluorescence Signatures (NuRFS)".

  15. GEANT4 and Secondary Particle Production

    NASA Technical Reports Server (NTRS)

    Patterson, Jeff

    2004-01-01

    GEANT4 is a Monte Carlo toolkit developed by the High Energy Physics community (CERN, SLAC, etc.) to perform simulations of complex particle detectors. GEANT4 is the ideal tool to study radiation transport and should be applied to space environments and the complex geometries of modern-day spacecraft.

  16. Alpha Coincidence Spectroscopy studied with GEANT4

    SciTech Connect

    Dion, Michael P.; Miller, Brian W.; Tatishvili, Gocha; Warren, Glen A.

    2013-11-02

    The high-energy side of peaks in alpha spectra, e.g. 241Am, as measured with a silicon detector has structure caused mainly by alpha-conversion electron and, to some extent, alpha-gamma coincidences. We compare GEANT4 simulation results to 241Am alpha spectroscopy measurements with a passivated implanted planar silicon detector. A large discrepancy between the measurements and simulations suggests that the GEANT4 photon evaporation database for 237Np (the daughter of 241Am decay) does not accurately describe the conversion electron spectrum. We describe how to improve the agreement between GEANT4 and alpha spectroscopy for actinides of interest by including experimental measurements of conversion electron spectroscopy in the photon evaporation database.

  17. Medical Applications of the Geant4 Toolkit

    NASA Astrophysics Data System (ADS)

    Agostinelli, S.; Chauvie, S.; Foppiano, F.; Garelli, S.; Marchetto, F.; Pia, M. G.; Nieminen, P.; Rolando, V.; Solano, A.

    A powerful and suitable tool for attacking the problem of the production and transport of different beams in biological matter is offered by the Geant4 Simulation Toolkit. Various activities in progress in the domain of medical applications are presented: studies on the calibration of brachytherapy sources and thermoluminescent dosimeters, studies of a complete 3-D inline dosimeter, development of general tools for a CT interface for treatment planning, studies involving neutron transport, etc. A novel approach, based on the Geant4 Toolkit, for the study of radiation damage at the cellular and DNA level is also presented.

  18. Comparison of GEANT4 very low energy cross section models with experimental data in water

    SciTech Connect

    Incerti, S.; Ivanchenko, A.; Karamitros, M.; Mantero, A.; Moretto, P.; Tran, H. N.; Mascialino, B.; Champion, C.; Ivanchenko, V. N.; Bernal, M. A.; Francis, Z.; Villagrasa, C.; Baldacchino, G.; Gueye, P.; Capra, R.; Nieminen, P.; Zacharatou, C.

    2010-09-15

    Purpose: The GEANT4 general-purpose Monte Carlo simulation toolkit is able to simulate physical interaction processes of electrons, hydrogen and helium atoms with charge states (H0, H+) and (He0, He+, He2+), respectively, in liquid water, the main component of biological systems, down to the electron volt regime and the submicrometer scale, providing GEANT4 users with the so-called "GEANT4-DNA" physics models suitable for microdosimetry simulation applications. The corresponding software has been recently re-engineered in order to provide GEANT4 users with a coherent and unique approach to the simulation of electromagnetic interactions within the GEANT4 toolkit framework (since GEANT4 version 9.3 beta). This work presents a quantitative comparison of these physics models with a collection of experimental data in water collected from the literature. Methods: An evaluation of the closeness between the total and differential cross section models available in the GEANT4 toolkit for microdosimetry and experimental reference data is performed using a dedicated statistical toolkit that includes the Kolmogorov-Smirnov statistical test. The authors used experimental data acquired in water vapor as direct measurements in the liquid phase are not yet available in the literature. Comparisons with several recommendations are also presented. Results: The authors have assessed the compatibility of experimental data with GEANT4 microdosimetry models by means of quantitative methods. The results show that microdosimetric measurements in liquid water are necessary to assess quantitatively the validity of the software implementation for the liquid water phase. Nevertheless, a comparison with existing experimental data in water vapor provides a qualitative appreciation of the plausibility of the simulation models. The existing reference data themselves should undergo a critical interpretation and selection, as some of the series exhibit significant
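
The two-sample Kolmogorov-Smirnov test used in such model-versus-data comparisons reduces to the maximum distance between two empirical CDFs. A minimal stdlib sketch (the function name and tie handling are ours, not taken from the statistical toolkit the authors used):

```python
# Two-sample Kolmogorov-Smirnov statistic: the maximum distance between
# the empirical CDFs of two samples (e.g. simulated vs. measured
# cross-section values). Pure standard library; illustrative only.
def ks_statistic(sample_a, sample_b):
    a, b = sorted(sample_a), sorted(sample_b)
    na, nb = len(a), len(b)
    i = j = 0
    d = 0.0
    while i < na and j < nb:
        x = min(a[i], b[j])
        # Advance past ties so identical values contribute no distance.
        while i < na and a[i] == x:
            i += 1
        while j < nb and b[j] == x:
            j += 1
        d = max(d, abs(i / na - j / nb))
    return d
```

A small D indicates compatible distributions; a full goodness-of-fit toolkit additionally converts D into a p-value.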

  19. Design Software

    NASA Technical Reports Server (NTRS)

    1991-01-01

    A NASA contractor and Small Business Innovation Research (SBIR) participant has converted its research into commercial software products for auto design, structural analysis and other applications. ViGYAN, Inc., utilizing the aeronautical research principle of computational fluid dynamics, has created - with VGRID3D and VPLOT3D - an easier alternative to conventional structured grids for fluid dynamic calculations.

  20. GEANT4 Simulation of the NPDGamma Experiment

    NASA Astrophysics Data System (ADS)

    Frlez, Emil

    2014-03-01

    The n⃗ + p → d + γ experiment, currently taking data at the Oak Ridge SNS facility, is a high-precision measurement of weak nuclear forces at low energies. Detecting the correlation between the cold neutron spin and photon direction in the capture of neutrons on a Liquid Hydrogen (LH) target, the experiment is sensitive to the properties of the neutral weak current. We have written a GEANT4 Monte Carlo simulation of the NPDGamma detector that, in addition to the active CsI detectors, also includes different targets and passive materials as well as the beam line elements. The neutron beam energy spectrum, its profiles, divergences, and time-of-flight are simulated in detail. We have used the code to cross-calibrate the positions of (i) the polarized LH target, (ii) an Aluminum target, and (iii) a CCl4 target. The responses of the 48 CsI detectors in the simulation were fixed using data taken on the LH target. Both neutron absorption as well as scattering and thermal processes were turned on in the GEANT4 physics lists. We use the results to simulate in detail the data obtained with the different targets used in the experiment within a comprehensive analysis. This work is supported by NSF grant PHY-1307328.

  1. Integration of g4tools in Geant4

    NASA Astrophysics Data System (ADS)

    Hřivnáčová, Ivana

    2014-06-01

    g4tools, originally part of the inlib and exlib packages, provides a very light and easy-to-install set of C++ classes that can be used to perform analysis in a Geant4 batch program. It allows the user to create and manipulate histograms and ntuples and to write them in supported file formats (ROOT, AIDA XML, CSV and HBOOK). It is integrated in Geant4 through analysis manager classes, thus providing a uniform interface to the g4tools objects and also hiding the differences between the classes for the different supported output formats. Moreover, additional features, such as histogram activation or support for Geant4 units, are implemented in the analysis classes following user requests. A set of Geant4 user interface commands allows the user to create histograms and set their properties interactively or in Geant4 macros. g4tools was first introduced in the Geant4 9.5 release, where its use was demonstrated in one basic example, and it is already used in a majority of the Geant4 examples within the Geant4 9.6 release. In this paper, we give an overview and the present status of the integration of g4tools in Geant4 and report on upcoming new features.

  2. Geant4-Simulations for cellular dosimetry in nuclear medicine.

    PubMed

    Freudenberg, Robert; Wendisch, Maria; Kotzerke, Jörg

    2011-12-01

    The application of unsealed radionuclides in radiobiological experiments can lead to intracellular radionuclide uptake and an increased absorbed dose. Accurate dose quantification is essential to assess observed radiobiological effects. Due to the small cellular dimensions, direct dose measurement is impossible. We demonstrate the application of Monte Carlo simulations for dose calculation. Dose calculations were performed using the Geant4 Monte Carlo toolkit, for which typical experimental situations were modelled. Dose distributions inside wells were simulated for different radionuclides, and S values were simulated for spherical cells and cell monolayers of different diameter. Concomitantly, experiments were performed using the PC Cl3 cell line with mediated radionuclide uptake, and cellular survival was measured for various activity distributions. We obtained S values for the dose distribution inside the wells. Calculated S values for a single cell are in good agreement with S values provided in the literature (ratio 0.87 to 1.07). The cross-dose is up to ten times higher for Y-90. The concomitantly performed cellular experiments confirm the dose calculation. Furthermore, the necessity of correct dose calculation was shown for the assessment of radiobiological effects after the application of unsealed radionuclides, thereby demonstrating the feasibility of using Geant4. PMID:21983023
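
The cellular S value in such calculations is the mean absorbed dose per decay, S = Δ·φ/m. A minimal sketch for the self-dose of a single spherical cell; the function names and the φ = 1 assumption are ours, not from the paper:

```python
# Hedged sketch: self-dose S value (Gy per decay) for a spherical cell,
# assuming the full mean emitted energy per decay is absorbed locally
# (absorbed fraction phi = 1) and unit-density tissue.
import math

MEV_TO_J = 1.602176634e-13  # joules per MeV

def sphere_mass_kg(radius_um, density_g_cm3=1.0):
    """Mass of a sphere of the given radius (micrometres)."""
    r_cm = radius_um * 1e-4
    volume_cm3 = 4.0 / 3.0 * math.pi * r_cm ** 3
    return volume_cm3 * density_g_cm3 * 1e-3  # g -> kg

def s_value_gy_per_decay(mean_energy_mev, radius_um, absorbed_fraction=1.0):
    """S = Delta * phi / m, with Delta the mean energy emitted per decay."""
    energy_j = mean_energy_mev * MEV_TO_J
    return energy_j * absorbed_fraction / sphere_mass_kg(radius_um)
```

For short-range beta emitters in isolated cells φ ≈ 1 is a reasonable first guess; Monte Carlo transport as in the paper is what supplies realistic absorbed fractions and the cross-dose between neighbouring cells.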

  3. Application of GEANT4 in the Development of New Radiation Therapy Treatment Methods

    NASA Astrophysics Data System (ADS)

    Brahme, Anders; Gudowska, Irena; Larsson, Susanne; Andreassen, Björn; Holmberg, Rickard; Svensson, Roger; Ivanchenko, Vladimir; Bagulya, Alexander; Grichine, Vladimir; Starkov, Nikolay

    2006-04-01

    There is a very fast development of new radiation treatment methods today, from advanced use of intensity modulated photon and electron beams to light ion therapy with treatment units based on narrow scanned beams. Accurate radiation transport calculations are a key requisite for these developments, where Geant4 is a very useful Monte Carlo code for the accurate design of new treatment units. Today we can not only image the tumor by PET-CT imaging before the treatment but also determine the tumor sensitivity to radiation and even measure in vivo the delivered absorbed dose in three dimensions in the patient. With such methods, accurate Monte Carlo calculations will make radiation therapy an almost exact science, where curative doses can be calculated based on patient-individual response data. In the present study, results from the application of Geant4 are discussed and comparisons between Geant4 and experimental and other Monte Carlo data are presented.

  4. Medical Applications of the Geant4 Simulation Toolkit

    NASA Astrophysics Data System (ADS)

    Perl, Joseph

    2008-03-01

    Geant4 is a toolkit for the simulation of the passage of particles through matter. While Geant4 was originally developed for High Energy Physics (HEP), applications now include Nuclear, Space and Medical Physics. Medical applications of Geant4 in North America and throughout the world have been increasing rapidly due to the overall growth of Monte Carlo use in Medical Physics and the unique qualities of Geant4 as an all-particle code able to handle complex geometry, motion and fields with the flexibility of modern programming and an open and free source code. Work has included characterizing beams and brachytherapy sources, treatment planning, retrospective studies, imaging and validation. This talk will provide an overview of these applications, with a focus on therapy, and will discuss how Geant4 has responded to the specific challenges of moving from HEP to Medical applications.

  5. Experimental spectra analysis in THM with the help of simulation based on the Geant4 framework

    NASA Astrophysics Data System (ADS)

    Li, Cheng-Bo; Wen, Qun-Gang; Zhou, Shu-Hua; Jiang, Zong-Jun; Fu, Yuan-Yong; Zhou, Jing; Meng, Qiu-Ying; Wang, Xiao-Lian

    2015-05-01

    The Coulomb barrier and electron screening cause difficulties in directly measuring nuclear reaction cross sections of charged particles at astrophysical energies. The Trojan-horse method (THM) has been introduced as a powerful indirect tool to overcome these difficulties. In order to better understand the experimental spectra, Geant4 is employed to simulate the method. The validity and reliability of the simulated data are examined by comparing the experimental data with the simulated results. The Geant4 simulation of the THM improves data analysis and is beneficial to the design of future related experiments. Supported by National Natural Science Foundation of China (11075218, 10575132) and Beijing Natural Science Foundation (1122017)

  6. An Overview of the Geant4 Toolkit

    SciTech Connect

    Apostolakis, John; Wright, Dennis H.

    2007-03-19

    Geant4 is a toolkit for the simulation of the transport of radiation through matter. With a flexible kernel and a choice between different physics modeling options, it has been tailored to the requirements of a wide range of applications. With the toolkit a user can describe a setup's or detector's geometry and materials, navigate inside it, simulate the physical interactions using a choice of physics engines, underlying physics cross-sections and models, and visualise and store results. Physics models describing electromagnetic and hadronic interactions are provided, as are decays and processes for optical photons. Several models, with different precision and performance, are available for many processes. The toolkit includes coherent physics model configurations, which are called physics lists. Users can choose an existing physics list or create their own, depending on their requirements and the application area. A clear structure and readable code enable the user to investigate the origin of physics results. Application areas include detector simulation and background simulation in High Energy Physics experiments, simulation of accelerator setups, studies in medical imaging and treatment, and the study of the effects of solar radiation on spacecraft instruments.

  7. An Overview of the GEANT4 Toolkit

    SciTech Connect

    Apostolakis, John; Wright, Dennis H.; /SLAC

    2007-10-05

    Geant4 is a toolkit for the simulation of the transport of radiation through matter. With a flexible kernel and a choice between different physics modeling options, it has been tailored to the requirements of a wide range of applications. With the toolkit a user can describe a setup's or detector's geometry and materials, navigate inside it, simulate the physical interactions using a choice of physics engines, underlying physics cross-sections and models, and visualize and store results. Physics models describing electromagnetic and hadronic interactions are provided, as are decays and processes for optical photons. Several models, with different precision and performance, are available for many processes. The toolkit includes coherent physics model configurations, which are called physics lists. Users can choose an existing physics list or create their own, depending on their requirements and the application area. A clear structure and readable code enable the user to investigate the origin of physics results. Application areas include detector simulation and background simulation in High Energy Physics experiments, simulation of accelerator setups, studies in medical imaging and treatment, and the study of the effects of solar radiation on spacecraft instruments.

  8. Simulation study of Fast Neutron Radiography using GEANT4

    NASA Astrophysics Data System (ADS)

    Bishnoi, S.; Thomas, R. G.; Sarkar, P. S.; Datar, V. M.; Sinha, A.

    2015-02-01

    Fast neutron radiography (FNR) is an important non-destructive technique for the imaging of thick bulk material. We are designing an FNR system using a laboratory-based 14 MeV D-T neutron generator [1]. Simulation studies have been carried out using the Monte Carlo based GEANT4 code to understand the response of the FNR system for various objects. Samples ranging from low-Z through metallic to high-Z materials were simulated for their radiographic images. The quality of the constructed neutron radiography images, in terms of the relative contrast ratio and the contrast-to-noise ratio, was investigated for its dependence on various parameters such as thickness, voids inside high/low-Z material, and low-Z material hidden behind high-Z material. We report here the potential and limitations of FNR for imaging different materials and a few configurations, and also the possible areas where FNR can be implemented.

  9. Beam simulation tools for GEANT4 (and neutrino source applications)

    SciTech Connect

    V.Daniel Elvira, Paul Lebrun and Panagiotis Spentzouris

    2002-12-03

    Geant4 is a toolkit developed by a collaboration of physicists and computer professionals in the High Energy Physics field for simulation of the passage of particles through matter. The motivation for the development of the Beam Tools is to extend the Geant4 applications to accelerator physics. Although there are many computer programs for beam physics simulations, Geant4 is ideal to model a beam going through material or a system with a beam line integrated with a complex detector. There are many examples in the current international High Energy Physics programs, such as studies related to a future Neutrino Factory, a Linear Collider, and a Very Large Hadron Collider.

  10. Simulation of Cold Neutron Experiments using GEANT4

    NASA Astrophysics Data System (ADS)

    Frlez, Emil; Hall, Joshua; Root, Melinda; Baessler, Stefan; Pocanic, Dinko

    2013-10-01

    We review the available GEANT4 physics processes for the cold neutrons in the energy range 1-100 meV. We consider the cases of the neutron beam interacting with (i) para- and ortho- polarized liquid hydrogen, (ii) Aluminum, and (iii) carbon tetrachloride (CCl4) targets. Scattering, thermal and absorption cross sections used by GEANT4 and MCNP6 libraries are compared with the National Nuclear Data Center (NNDC) compilation. NPDGamma detector simulation is presented as an example of the implementation of the resulting GEANT4 code. This work is supported by NSF grant PHY-0970013.
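
For pure absorbers in this cold-neutron energy range, the absorption cross section follows the 1/v law, so a tabulated thermal value (2200 m/s, 25.3 meV) can be scaled to other energies. A sketch under that assumption (the function names are ours, not from any of the libraries compared):

```python
# Hedged sketch: 1/v scaling of a neutron absorption cross section.
# sigma_abs ~ 1/v ~ 1/sqrt(E), anchored at the thermal reference point
# (2200 m/s, E = 25.3 meV) where evaluated values are tabulated.
import math

E_THERMAL_EV = 0.0253  # thermal reference energy in eV

def absorption_xs_barns(sigma_thermal_barns, energy_ev):
    """Scale a tabulated thermal absorption cross section to energy_ev."""
    return sigma_thermal_barns * math.sqrt(E_THERMAL_EV / energy_ev)
```

This scaling does not apply to the scattering cross section, which in polycrystalline materials at these energies is dominated by Bragg edges, hence the need for the library comparisons described above.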

  11. SU-E-J-72: Geant4 Simulations of Spot-Scanned Proton Beam Treatment Plans

    SciTech Connect

    Kanehira, T; Sutherland, K; Matsuura, T; Umegaki, K; Shirato, H

    2014-06-01

    Purpose: To evaluate density inhomogeneities which can affect dose distributions for real-time image gated spot-scanning proton therapy (RGPT), a dose calculation system, using spot position data from the treatment planning system VQA (Hitachi Ltd., Tokyo), was developed based on Geant4. Methods: A Geant4 application was developed to simulate spot-scanned proton beams at Hokkaido University Hospital. A CT scan (0.98 × 0.98 × 1.25 mm) was performed for prostate cancer treatment with three or four inserted gold markers (diameter 1.5 mm, volume 1.77 mm3) in or near the target tumor. The CT data was read into VQA. A spot scanning plan was generated and exported to text files specifying the beam energy and position of each spot. The text files were converted and read into our Geant4-based software. The spot position was converted into steering magnet field strength (in Tesla) for our beam nozzle. Individual protons were tracked from the vacuum chamber, through the helium chamber, steering magnets, dose monitors, etc., in a straight, horizontal line. The patient CT data was converted into materials with variable density and placed in a parametrized volume at the isocenter. Gold fiducial markers were represented in the CT data by two adjacent voxels (volume 2.38 mm3). 600,000 proton histories were tracked for each target spot. As one beam contained about 1,000 spots, approximately 600 million histories were recorded for each beam on a blade server. Two plans were considered: two-beam horizontal opposed (90 and 270 degrees) and three-beam (0, 90 and 270 degrees). Results: We are able to convert spot scanning plans from VQA and simulate them with our Geant4-based code. Our system can be used to evaluate the dose reduction caused by gold markers used for RGPT. Conclusion: Our Geant4 application is able to calculate dose distributions for spot-scanned proton therapy.
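
Converting a planned spot offset into a steering magnet field amounts to a small-angle kick scaled by the proton's magnetic rigidity Bρ = p/q. A toy sketch of that conversion; the geometry parameters and function names are ours, not the VQA or nozzle interface:

```python
# Hedged sketch: converting a lateral spot offset into a steering
# magnet field via magnetic rigidity. Toy geometry, not the actual
# Hokkaido beam nozzle model.
import math

MP_C2_MEV = 938.272088  # proton rest energy, MeV

def magnetic_rigidity_tm(kinetic_energy_mev):
    """Proton rigidity B*rho in tesla-metres: pc = sqrt(T(T + 2 m c^2))."""
    pc_mev = math.sqrt(kinetic_energy_mev * (kinetic_energy_mev + 2.0 * MP_C2_MEV))
    return pc_mev / 299.792458  # pc in MeV -> T*m for unit charge

def steering_field_tesla(kinetic_energy_mev, offset_m, drift_m, magnet_length_m):
    """Small-angle kick: theta = offset/drift = B*L/(B*rho)."""
    theta = offset_m / drift_m
    return theta * magnetic_rigidity_tm(kinetic_energy_mev) / magnet_length_m
```

The same rigidity expression is why each spot's field setting depends on the beam energy as well as its planned position.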

  12. Polycrystalline neutron scattering for Geant4: NXSG4

    NASA Astrophysics Data System (ADS)

    Kittelmann, T.; Boin, M.

    2015-04-01

    An extension to Geant4 based on the nxs library is presented. It has been implemented in order to include effects of low-energy neutron scattering in polycrystalline materials, and is made available to the scientific community.

  13. Geant4 electromagnetic physics updates for space radiation effects simulation

    NASA Astrophysics Data System (ADS)

    Ivantchenko, Anton; Nieminen, Petteri; Incerti, Sebastien; Santin, Giovanni; Ivantchenko, Vladimir; Grichine, Vladimir; Allison, John; Karamitos, Mathiew

    The Geant4 toolkit is used in many applications including space science studies. The new Geant4 version 10.0 released in December 2013 includes a major revision of the toolkit and offers multi-threaded mode for event level parallelism. At the same time, Geant4 electromagnetic and hadronic physics sub-libraries have been significantly updated. In order to validate the new and updated models Geant4 verification tests and benchmarks were extended. Part of these developments was sponsored by the European Space Agency in the context of research aimed at modelling radiation biological end effects. In this work, we present an overview of results of several benchmarks for electromagnetic physics models relevant to space science. For electromagnetic physics, recently Compton scattering, photoelectric effect, and Rayleigh scattering models have been improved and extended down to lower energies. Models of ionization and fluctuations have also been improved; special micro-dosimetry models for Silicon and liquid water were introduced; the main multiple scattering model was consolidated; and the atomic de-excitation module has been made available to all models. As a result, Geant4 predictions for space radiation effects obtained with different Physics Lists are in better agreement with the benchmark data than previous Geant4 versions. Here we present results of electromagnetic tests and models comparison in the energy interval 10 eV - 10 MeV.
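
The Compton scattering models benchmarked here are ultimately validated against the Klein-Nishina differential cross section; a direct evaluation of that formula (our own helper, not a Geant4 class):

```python
# Klein-Nishina differential cross section for Compton scattering,
# d(sigma)/d(Omega) in m^2/sr, evaluated directly from the formula.
import math

R_E = 2.8179403262e-15   # classical electron radius, m
ME_C2_MEV = 0.51099895   # electron rest energy, MeV

def klein_nishina(e_mev, theta_rad):
    """d(sigma)/d(Omega) for incident photon energy e_mev at angle theta."""
    k = e_mev / ME_C2_MEV
    p = 1.0 / (1.0 + k * (1.0 - math.cos(theta_rad)))  # ratio E'/E
    return 0.5 * R_E ** 2 * p ** 2 * (p + 1.0 / p - math.sin(theta_rad) ** 2)
```

In the forward direction p = 1 and the value reduces to r_e² independent of energy; at low photon energies the formula approaches the classical Thomson limit, which is one of the consistency checks such benchmarks exercise.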

  14. artG4: A Generic Framework for Geant4 Simulations

    SciTech Connect

    Arvanitis, Tasha; Lyon, Adam

    2014-01-01

    A small experiment must devote its limited computing expertise to writing physics code directly applicable to the experiment. A software 'framework' is essential for providing an infrastructure that makes writing the physics-relevant code easy. In this paper, we describe a highly modular and easy to use framework for writing Geant4 based simulations called 'artg4'. This framework is a layer on top of the art framework.

  15. Characterisation of a SAGe well detector using GEANT4 and LabSOCS

    NASA Astrophysics Data System (ADS)

    Britton, R.; Davies, A. V.

    2015-06-01

    This paper reports on the performance of a recently developed Small Anode Germanium (SAGe) well detector from Canberra Industries. This has been specifically designed to improve the energy resolution of the detector, such that it is comparable to the performance of broad-energy designs while achieving far higher efficiencies. Accurate efficiency characterisations and cascade summing correction factors are crucial for quantifying the radionuclides present in environmental samples, and these were calculated for the complex geometry posed by the well detector using two different methodologies. The first relied on Monte-Carlo simulations based upon the GEANT4 toolkit, and the second utilised Canberra Industries GENIE™ 2000 Gamma Analysis software in conjunction with a LabSOCS™ characterisation. Both were found to be in excellent agreement for all nuclides except for 152Eu, which presents a known issue in the Canberra software (all nuclides affected by this issue were well documented, and fixes are being developed). The correction factors were used to analyse two fully characterised reference samples, yielding results in good agreement with the accepted activity concentrations. Given the sensitivity of well type geometries to cascade summing, this represents a considerable achievement, and paves the way for the use of the SAGe well detector in analysis of 'real-world' environmental samples. With the efficiency increase when using the SAGe well in place of a BEGe, substantial reductions in the Minimum Detectable Activity (MDA) should be achievable for a range of nuclides.
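
The MDA gain from higher efficiency follows directly from the Currie formula, which converts a detection limit in counts into activity through efficiency, emission intensity and counting time. A sketch assuming the widely used approximation L_D = 2.71 + 4.65√B (the function name is ours):

```python
# Hedged sketch: Currie minimum detectable activity (MDA) at 95%
# confidence, using the common approximation L_D = 2.71 + 4.65*sqrt(B)
# for the detection limit given B background counts in the peak region.
import math

def currie_mda_bq(background_counts, efficiency, gamma_intensity, live_time_s):
    """MDA in Bq: detection limit in counts over (eff * intensity * time)."""
    l_d = 2.71 + 4.65 * math.sqrt(background_counts)
    return l_d / (efficiency * gamma_intensity * live_time_s)
```

At fixed background, doubling the full-energy-peak efficiency halves the MDA, which is the motivation for using the SAGe well in place of a BEGe for small environmental samples.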

  16. GEANT4 Simulation of Neutron Detector for DAMPE

    NASA Astrophysics Data System (ADS)

    He, M.; Ma, T.; Chang, J.; Zhang, Y.; Huang, Y. Y.; Zang, J. J.; Wu, J.; Dong, T. K.

    2016-01-01

    In recent decades, dark matter has become a major topic in astronomical research, with related theoretical and experimental work advancing rapidly. China's Dark Matter Particle Explorer (DAMPE) was proposed against this background. Because the probe targets high-energy electrons, appropriate methods must be used to distinguish them from protons, reducing the probability of other charged particles (e.g. protons) being mistaken for electrons. Experiments show that the hadronic shower of a high-energy proton in a BGO electromagnetic calorimeter, usually accompanied by the emission of a large number of secondary neutrons, differs significantly from the electromagnetic shower of a high-energy electron. By detecting the secondary-neutron signal emerging from the bottom of the BGO electromagnetic calorimeter, together with the shower shape of the incident particle within it, we can effectively determine whether an incident particle is a high-energy proton or an electron. This paper introduces the structure and detection principle of the DAMPE neutron detector. We use the Monte Carlo method with the GEANT4 software to simulate the signals produced in the neutron detector by protons and electrons at characteristic energies, and finally summarise the neutron detector's ability to distinguish protons from electrons at different electron acceptance efficiencies.

  17. Monte Carlo simulation of a photodisintegration of ³H experiment in Geant4

    NASA Astrophysics Data System (ADS)

    Gray, Isaiah

    2013-10-01

    An upcoming experiment involving photodisintegration of ³H at the High Intensity Gamma-Ray Source facility at Duke University has been simulated in the software package Geant4. CAD models of silicon detectors and wire chambers were imported from Autodesk Inventor using the program FastRad and the Geant4 GDML importer. Sensitive detectors were associated with the appropriate logical volumes in the exported GDML file so that changes in detector geometry will be easily manifested in the simulation. Probability distribution functions for the energy and direction of outgoing protons were generated using numerical tables from previous theory, and energies and directions were sampled from these distributions using a rejection sampling algorithm. The simulation will be a useful tool to optimize detector geometry, estimate background rates, and test data analysis algorithms. This work was supported by the Triangle Universities Nuclear Laboratory REU program at Duke University.
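
    The rejection-sampling step described above can be sketched as follows. The triangular energy distribution is a stand-in for the tabulated theory distributions, not the experiment's actual PDF.

    ```python
    import random

    def rejection_sample(pdf, x_min, x_max, pdf_max, rng=random.random):
        """Draw one value from an (unnormalised) 1-D pdf by rejection:
        propose x uniformly on [x_min, x_max], accept it with probability
        pdf(x) / pdf_max, and retry until a proposal is accepted."""
        while True:
            x = x_min + (x_max - x_min) * rng()
            if rng() * pdf_max <= pdf(x):
                return x

    # Stand-in outgoing-proton energy distribution: triangular, peaked at 6 MeV.
    triangle = lambda e: max(0.0, 1.0 - abs(e - 6.0) / 2.0)

    random.seed(42)
    energies = [rejection_sample(triangle, 4.0, 8.0, 1.0) for _ in range(5000)]
    ```

    Rejection sampling only needs the pdf up to a constant factor and an upper bound `pdf_max`, which is why it suits distributions given as numerical tables.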

  18. Progress in Geant4 Electromagnetic Physics Modelling and Validation

    NASA Astrophysics Data System (ADS)

    Apostolakis, J.; Asai, M.; Bagulya, A.; Brown, J. M. C.; Burkhardt, H.; Chikuma, N.; Cortes-Giraldo, M. A.; Elles, S.; Grichine, V.; Guatelli, S.; Incerti, S.; Ivanchenko, V. N.; Jacquemier, J.; Kadri, O.; Maire, M.; Pandola, L.; Sawkey, D.; Toshito, T.; Urban, L.; Yamashita, T.

    2015-12-01

    In this work we report on recent improvements in the electromagnetic (EM) physics models of Geant4 and new validations of EM physics. Improvements have been made in models of the photoelectric effect, Compton scattering, gamma conversion to electron and muon pairs, fluctuations of energy loss, multiple scattering, synchrotron radiation, and high energy positron annihilation. The results of these developments are included in the new Geant4 version 10.1 and in patches to previous versions 9.6 and 10.0 that are planned to be used for production for run-2 at LHC. The Geant4 validation suite for EM physics has been extended and new validation results are shown in this work. In particular, the effect of gamma-nuclear interactions on EM shower shape at LHC energies is discussed.

  19. Geant4 validation with CMS calorimeters test-beam data

    SciTech Connect

    Piperov, Stefan; /Sofiya, Inst. Nucl. Res. /Fermilab

    2008-08-01

    The CMS experiment uses Geant4 for the Monte Carlo simulation of the detector setup. Validation of the physics processes describing hadronic showers is a major concern in view of obtaining a proper description of jets and missing energy for signal and background events. This is done by carrying out extensive test-beam studies using prototypes or real detector modules of the CMS calorimeter. These data are matched against Geant4 predictions. Tuning of the Geant4 models is carried out, and the steps to be used in reproducing detector signals are defined, in view of measurements of energy response, energy resolution, and transverse and longitudinal shower profiles for a variety of hadron beams over a broad momentum spectrum from 2 to 300 GeV/c.

  20. GEANT4 simulation of APEX background radiation and shielding

    NASA Astrophysics Data System (ADS)

    Kaluarachchi, Maduka M.; Cates, Gordon D.; Wojtsekhowski, B.

    2015-04-01

    The A′ Experiment (APEX), which is approved to run in Hall A of the Thomas Jefferson National Accelerator Facility (JLab), will search for a new vector boson that is hypothesized to be a possible force carrier that couples to dark matter. APEX should be sensitive to the mass range of 65 MeV to 550 MeV, with high sensitivity achieved by means of a high-intensity 100 μA beam on a 0.5 g/cm² tungsten target, resulting in very high luminosity. The experiment should be able to observe the A′ with a coupling constant α′ ~ 1 × 10⁷ times smaller than the electromagnetic coupling constant α. To deal safely with such enormous intensity and luminosity, a full radiation analysis must be used to guide the design of proper radiation shielding. The purpose of this talk is to present preliminary results obtained by simulating the radiation background of the APEX experiment using the 3D Monte Carlo transport code Geant4. Included in the simulation is a detailed Hall A setup: the hall, the spectrometers and shield house, the beam dump, the beamline, the septa magnet with its field, and the production target. The results were compared to the APEX test-run data and used in the development of the radiation shielding for sensitive electronics.

  1. New Geant4 based simulation tools for space radiation shielding and effects analysis

    NASA Astrophysics Data System (ADS)

    Santin, G.; Nieminen, P.; Evans, H.; Daly, E.; Lei, F.; Truscott, P. R.; Dyer, C. S.; Quaghebeur, B.; Heynderickx, D.

    2003-09-01

    We present here a set of tools for space applications based on the Geant4 simulation toolkit, developed for radiation shielding analysis as part of the European Space Agency (ESA) activities in the Geant4 collaboration. The Sector Shielding Analysis Tool (SSAT) and the Materials and Geometry Association (MGA) utility are described first. An overview of the main features of the MUlti-LAyered Shielding SImulation Software tool (MULASSIS) follows. The tool is specifically addressed to shielding optimization and effects analysis. A Java interface allows the use of MULASSIS by the space community over the World Wide Web, integrated into the widely used SPENVIS package. The analysis of the particle-transport output automatically provides radiation fluence, ionising and NIEL dose, and effects analysis. ESA is currently funding the porting of these tools to a low-cost parallel processing facility using GRID technology under the ESA SpaceGRID initiative. Other present and future Geant4 projects related to the study of space-environment effects on spacecraft will also be presented.

  2. Preliminary Investigation of Microdosimetric Track Structure Physics Models in Geant4-DNA and RITRACKS

    PubMed Central

    Bezak, Eva

    2015-01-01

    The major differences between the physics models in Geant4-DNA and RITRACKS Monte Carlo packages are investigated. Proton and electron ionisation interactions and electron excitation interactions in water are investigated in the current work. While these packages use similar semiempirical physics models for inelastic cross-sections, the implementation of these models is demonstrated to be significantly different. This is demonstrated in a simple Monte Carlo simulation designed to identify differences in interaction cross-sections. PMID:26124856

  3. Design software for reuse

    NASA Technical Reports Server (NTRS)

    Tracz, Will

    1990-01-01

    Viewgraphs are presented on designing software for reuse. Topics include terminology, software reuse maxims, the science of programming, an interface design example, a modularization example, and reuse and implementation guidelines.

  4. Beam Simulation Tools for GEANT4 (BT-V1.0). User's Guide

    SciTech Connect

    Elvira, V. Daniel; Lebrum, P.; Spentzouris, P.

    2002-12-02

    Geant4 is a toolkit developed by a collaboration of physicists and computer professionals in the high-energy physics field for simulating the passage of particles through matter. The motivation for the development of the Beam Tools is to extend Geant4 applications to accelerator physics. The Beam Tools are a set of C++ classes designed to facilitate the simulation of accelerator elements: r.f. cavities, magnets, absorbers, etc. These elements are constructed from Geant4 solid volumes such as boxes, tubes, trapezoids, or spheres. There are many computer programs for beam-physics simulations, but Geant4 is ideal for modelling a beam through a material or for integrating a beam line with a complex detector. There are many such examples in the current international High Energy Physics programs. For instance, an essential part of the R&D associated with the Neutrino Source/Muon Collider accelerator is the ionization cooling channel, a section of the system intended to reduce the size of the muon beam in phase space. The ionization cooling technique uses a combination of linacs and light absorbers to reduce the transverse momentum and size of the beam, while keeping the longitudinal momentum constant. The MuCool/MICE (muon cooling) experiments need accurate simulations of the beam transport through the cooling channel in addition to a detailed simulation of the detectors designed to measure the size of the beam. The accuracy of the models for physics processes associated with muon ionization and multiple scattering is critical in this type of application. Another example is the simulation of the interaction region in future accelerators. The high-luminosity, high-background environments expected at the Next Linear Collider (NLC) and the Very Large Hadron Collider (VLHC) place great demands on the detectors, which may be optimized by means of a simulation of the detector-accelerator interface.

  5. The GEANT4 toolkit for microdosimetry calculations: application to microbeam radiation therapy (MRT).

    PubMed

    Spiga, J; Siegbahn, E A; Bräuer-Krisch, E; Randaccio, P; Bravin, A

    2007-11-01

    Theoretical dose distributions for microbeam radiation therapy (MRT) are computed in this paper using the GEANT4 Monte Carlo (MC) simulation toolkit. MRT is an innovative experimental radiotherapy technique carried out using an array of parallel microbeams of synchrotron-wiggler-generated x rays. Although the biological mechanisms underlying the effects of microbeams are still largely unknown, the effectiveness of MRT can be traced back to the natural ability of normal tissues to rapidly repair small damage to the vasculature, and to the lack of a similar healing process in tumoral tissues. Contrary to conventional therapy, in which each beam is at least several millimeters wide, the narrowness of the microbeams allows a rapid regeneration of the blood vessels along the beams' trajectories. For this reason the calculation of the "valley" dose is of crucial importance and the correct use of MC codes for such purposes must be understood. GEANT4 offers, in addition to the standard libraries, a specialized package specifically designed to deal with electromagnetic interactions of particles with matter for energies down to 250 eV. This package implements two different approaches for electron and photon transport, one based on evaluated data libraries, the other adopting analytical models. These features are exploited to cross-check theoretical computations for MRT. The lateral and depth dose profiles are studied for the irradiation of a 20 cm diameter, 20 cm long cylindrical phantom, with cylindrical sources of different size and energy. Microbeam arrays are simulated with the aid of superposition algorithms, and the ratios of peak-to-valley doses are computed for typical cases used in preclinical assays. Dose profiles obtained using the GEANT4 evaluated data libraries and analytical models are compared with simulation results previously obtained using the PENELOPE code. The results show that dose profiles computed with GEANT4's analytical model are almost
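
    The superposition approach to microbeam arrays mentioned above can be sketched by summing a single-beam lateral profile at a regular pitch and taking the peak-to-valley dose ratio. The flat-core-plus-exponential-tail profile and all numbers below are illustrative assumptions, not the paper's data.

    ```python
    import math

    def array_dose(x, pitch_um, n_beams, single_profile):
        """Lateral dose of a microbeam array, obtained by superposing one
        beam's lateral profile at a regular centre-to-centre pitch."""
        half = (n_beams - 1) / 2.0
        return sum(single_profile(x - (i - half) * pitch_um) for i in range(n_beams))

    def profile(x_um):
        """Hypothetical single-beam profile: flat 25 um core, scatter tail."""
        if abs(x_um) <= 12.5:
            return 1.0
        return 0.01 * math.exp(-abs(x_um) / 100.0)

    peak = array_dose(0.0, 200.0, 11, profile)      # centre of the middle beam
    valley = array_dose(100.0, 200.0, 11, profile)  # midway between two beams
    pvdr = peak / valley                            # peak-to-valley dose ratio
    ```

    The valley dose is dominated by the overlapping scatter tails of the neighbouring beams, which is why its accurate computation matters so much for MRT.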

  6. The GEANT4 toolkit for microdosimetry calculations: Application to microbeam radiation therapy (MRT)

    SciTech Connect

    Spiga, J.; Siegbahn, E. A.; Braeuer-Krisch, E.; Randaccio, P.; Bravin, A.

    2007-11-15

    Theoretical dose distributions for microbeam radiation therapy (MRT) are computed in this paper using the GEANT4 Monte Carlo (MC) simulation toolkit. MRT is an innovative experimental radiotherapy technique carried out using an array of parallel microbeams of synchrotron-wiggler-generated x rays. Although the biological mechanisms underlying the effects of microbeams are still largely unknown, the effectiveness of MRT can be traced back to the natural ability of normal tissues to rapidly repair small damage to the vasculature, and to the lack of a similar healing process in tumoral tissues. Contrary to conventional therapy, in which each beam is at least several millimeters wide, the narrowness of the microbeams allows a rapid regeneration of the blood vessels along the beams' trajectories. For this reason the calculation of the 'valley' dose is of crucial importance and the correct use of MC codes for such purposes must be understood. GEANT4 offers, in addition to the standard libraries, a specialized package specifically designed to deal with electromagnetic interactions of particles with matter for energies down to 250 eV. This package implements two different approaches for electron and photon transport, one based on evaluated data libraries, the other adopting analytical models. These features are exploited to cross-check theoretical computations for MRT. The lateral and depth dose profiles are studied for the irradiation of a 20 cm diameter, 20 cm long cylindrical phantom, with cylindrical sources of different size and energy. Microbeam arrays are simulated with the aid of superposition algorithms, and the ratios of peak-to-valley doses are computed for typical cases used in preclinical assays. Dose profiles obtained using the GEANT4 evaluated data libraries and analytical models are compared with simulation results previously obtained using the PENELOPE code. The results show that dose profiles computed with GEANT4's analytical model are almost

  7. Adaptation of GEANT4 to Monte Carlo dose calculations based on CT data.

    PubMed

    Jiang, H; Paganetti, H

    2004-10-01

    The GEANT4 Monte Carlo code provides many powerful functions for conducting particle transport simulations with great reliability and flexibility. However, as a general-purpose Monte Carlo code, not all of its functions were specifically designed and fully optimized for applications in radiation therapy. One of the primary issues is computational efficiency, which is especially critical when patient CT data have to be imported into the simulation model. In this paper we summarize the relevant aspects of the GEANT4 tracking and geometry algorithms and introduce our work on using the code to conduct dose calculations based on CT data. The emphasis is on modifications of the GEANT4 source code to meet the requirements for fast dose calculations. The major features include a quick voxel search algorithm, fast volume optimization, and the dynamic assignment of material density. These features are ready to be used for tracking the primary types of particles employed in radiation therapy, such as photons, electrons, and heavy charged particles. Recalculation of a proton therapy treatment plan generated by a commercial treatment planning program for a paranasal sinus case is presented as an example. PMID:15543788
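
    For a regular CT voxel grid, a quick voxel search reduces to direct index arithmetic rather than a generic geometry query. A sketch of that idea follows; the grid parameters are hypothetical and this is not GEANT4's actual implementation.

    ```python
    import math

    def voxel_index(point, origin, spacing, dims):
        """O(1) voxel lookup in a regular grid: integer-divide the offset
        from the grid origin by the voxel pitch along each axis.
        Returns (ix, iy, iz), or None if the point lies outside the grid."""
        idx = []
        for p, o, s, n in zip(point, origin, spacing, dims):
            i = int(math.floor((p - o) / s))
            if not 0 <= i < n:
                return None
            idx.append(i)
        return tuple(idx)

    # Hypothetical 512 x 512 x 100 grid of 1.0 x 1.0 x 2.5 mm voxels at the origin.
    grid = dict(origin=(0.0, 0.0, 0.0), spacing=(1.0, 1.0, 2.5), dims=(512, 512, 100))
    ```

    Constant-time lookup matters because every tracking step of every particle must locate the voxel (and hence the material density) it is in.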

  8. Simulation and modeling for the stand-off radiation detection system (SORDS) using GEANT4

    SciTech Connect

    Hoover, Andrew S; Wallace, Mark; Galassi, Mark; Mocko, Michal; Palmer, David; Schultz, Larry; Tornga, Shawn

    2009-01-01

    A Stand-Off Radiation Detection System (SORDS) is being developed through a joint effort by Raytheon, Los Alamos National Laboratory, Bubble Technology Industries, Radiation Monitoring Devices, and the Massachusetts Institute of Technology, for the Domestic Nuclear Detection Office (DNDO). The system is a mobile truck-based platform performing detection, imaging, and spectroscopic identification of gamma-ray sources. A Tri-Modal Imaging (TMI) approach combines active-mask coded aperture imaging, Compton imaging, and shadow imaging techniques. Monte Carlo simulation and modeling using the GEANT4 toolkit was used to generate realistic data for the development of imaging algorithms and associated software code.

  9. Accurate simulations of TEPC neutron spectra using Geant4

    NASA Astrophysics Data System (ADS)

    Taylor, G. C.; Hawkes, N. P.; Shippen, A.

    2015-11-01

    A Geant4 model of a tissue-equivalent proportional counter (TEPC) has been developed in which the calculated output spectrum exhibits unparalleled agreement with experiment for monoenergetic neutron fields at several energies below 20 MeV. The model uses the standard release of the Geant4 9.6 p2 code, but with a non-standard neutron cross section file as provided by Mendoza et al., and with the environment variable options recommended by the same authors. This configuration was found to produce significant improvements in the alpha-dominated region of the calculated response. In this paper, these improvements are presented, and the post-processing required to convert deposited energy into the number of ion pairs (which is the quantity actually measured experimentally) is discussed.
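
    The post-processing step mentioned at the end of the abstract amounts to dividing the deposited energy by a mean energy per ion pair W. A sketch follows; W = 30 eV is an assumed round value for illustration, since the real W depends on the gas and the particle type.

    ```python
    def mean_ion_pairs(e_dep_kev, w_ev=30.0):
        """Mean number of ion pairs produced by energy deposited in the gas,
        n = E_dep / W (fluctuations, e.g. Fano statistics, are ignored)."""
        return e_dep_kev * 1e3 / w_ev

    n = mean_ion_pairs(150.0)   # 150 keV deposited in the cavity gas
    ```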

  10. GEANT4 simulations of Cherenkov reaction history diagnostics.

    PubMed

    Rubery, M S; Horsfield, C J; Herrmann, H W; Kim, Y; Mack, J M; Young, C S; Caldwell, S E; Evans, S C; Sedilleo, T J; McEvoy, A; Miller, E K; Stoeffl, W; Ali, Z; Toebbe, J

    2010-10-01

    This paper compares the results from a GEANT4 simulation of the gas Cherenkov detector 1 (GCD1) with previous simulations and experimental data from the Omega laser facility. The GCD1 collects gammas emitted during a deuterium-tritium capsule implosion and converts them, through several processes, to Cherenkov light. Photon signals are recorded using subnanosecond photomultiplier tubes, producing burn reaction histories. The GEANT4 GCD1 simulation is first benchmarked against ACCEPT, an integrated tiger series code, with good agreement. The simulation is subsequently compared with data from the Omega laser facility, where experiments have been performed to measure the effects of Hohlraum materials on reaction history signals, in preparation for experiments at the National Ignition Facility. PMID:21033850

  11. GEANT4 simulations of Cherenkov reaction history diagnostics

    SciTech Connect

    Rubery, M. S.; Horsfield, C. J.; Herrmann, H. W.; Kim, Y.; Mack, J. M.; Young, C. S.; Caldwell, S. E.; Evans, S. C.; Sedilleo, T. J.; McEvoy, A.; Miller, E. K.; Stoeffl, W.; Ali, Z.

    2010-10-15

    This paper compares the results from a GEANT4 simulation of the gas Cherenkov detector 1 (GCD1) with previous simulations and experimental data from the Omega laser facility. The GCD1 collects gammas emitted during a deuterium-tritium capsule implosion and converts them, through several processes, to Cherenkov light. Photon signals are recorded using subnanosecond photomultiplier tubes, producing burn reaction histories. The GEANT4 GCD1 simulation is first benchmarked against ACCEPT, an integrated tiger series code, with good agreement. The simulation is subsequently compared with data from the Omega laser facility, where experiments have been performed to measure the effects of Hohlraum materials on reaction history signals, in preparation for experiments at the National Ignition Facility.

  12. Transmission Efficiency of the Sage Spectrometer Using GEANT4

    NASA Astrophysics Data System (ADS)

    Cox, D. M.; Herzberg, R.-D.; Papadakis, P.; Ali, F.; Butler, P. A.; Cresswell, J. R.; Mistry, A.; Sampson, J.; Seddon, D. A.; Thornhill, J.; Wells, D.; Konki, J.; Greenlees, P. T.; Rahkila, P.; Pakarinen, J.; Sandzelius, M.; Sorri, J.; Julin, R.; Coleman-Smith, P. J.; Lazarus, I. H.; Letts, S. C.; Simpson, J.; Pucknell, V. F. E.

    2014-09-01

    The new SAGE spectrometer allows simultaneous electron and γ-ray in-beam studies of heavy nuclei. A comprehensive GEANT4 simulation suite has been created for the SAGE spectrometer. This includes both the silicon detectors for electron detection and the germanium detectors for γ-ray detection. The simulation can be used for a wide variety of tests with the aim of better understanding the behaviour of SAGE. A number of aspects of electron transmission are presented here.

  13. Software Design Analyzer System

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1985-01-01

    The CRISP80 Software Design Analyzer System is a set of programs that supports top-down, hierarchic, modular, structured design and programming methodologies. CRISP80 allows the design to be expressed as a picture of the program.

  14. GEANT4 Tuning For pCT Development

    NASA Astrophysics Data System (ADS)

    Yevseyeva, Olga; de Assis, Joaquim T.; Evseev, Ivan; Schelin, Hugo R.; Paschuk, Sergei A.; Milhoretto, Edney; Setti, João A. P.; Díaz, Katherin S.; Hormaza, Joel M.; Lopes, Ricardo T.

    2011-08-01

    Proton beams in medical applications deal with relatively thick targets such as the human head or trunk. Thus, the fidelity of proton computed tomography (pCT) simulations as a tool for proton therapy planning depends, in the general case, on the accuracy of the results obtained for proton interactions with thick absorbers. As shown previously, GEANT4 simulations of proton energy spectra after passing through thick absorbers do not agree well with existing experimental data. Moreover, the spectra simulated for the Bethe-Bloch domain showed an unexpected sensitivity to the choice of low-energy electromagnetic models during code execution. These observations were made with GEANT4 version 8.2 during our simulations for pCT. This work describes in more detail the simulation of proton passage through aluminum absorbers of varied thickness. The simulations were done by modifying only the geometry in the Hadrontherapy Example, and for all available choices of electromagnetic physics models. As the most probable reason for these effects is some specific feature of the code, or some implicit parameter in the GEANT4 manual, we continued our study with version 9.2 of the code. Some improvements in comparison with our previous results were obtained. The simulations were performed with further applications to pCT development in mind.

  15. Electro and gamma nuclear physics in Geant4

    SciTech Connect

    J.P. Wellisch; M. Kossov; P. Degtyarenko

    2003-03-01

    An adequate description of electro- and gamma-nuclear physics is of utmost importance in studies of electron beam dumps and intense electron-beam accelerators. It is also mandatory for describing neutron backgrounds and activation in linear colliders. This physics was elaborated in Geant4 over the last year and has now entered the stage of practical application. In the Geant4 photo-nuclear database there are at present about 50 nuclei for which the photo-nuclear absorption cross-sections have been measured. Of these, data on 14 nuclei are used to parametrize the gamma-nuclear reaction cross-section. The resulting cross-section is a complex, factorized function of A and e = log(Eγ), where Eγ is the energy of the incident photon. Electro-nuclear reactions are so closely connected with photo-nuclear reactions that they are sometimes also called ''photo-nuclear''. The one-photon exchange mechanism dominates in electro-nuclear reactions, so the electron can be substituted by a flux of equivalent photons. Folding this flux with the gamma-nuclear cross-section, we arrive at an acceptable description of electro-nuclear physics. Final states in gamma- and electro-nuclear physics are described using chiral-invariant phase-space decay at low gamma or equivalent-photon energies, and a quark-gluon string model at high energies. We present the modelling of this physics in Geant4 and show results from practical applications.
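
    The folding of an equivalent-photon flux with a gamma-nuclear cross-section described above is a one-dimensional integral; the sketch below evaluates it with a trapezoidal rule. The 1/E flux shape and the Lorentzian giant-resonance-like cross-section are illustrative stand-ins, not Geant4's actual parametrisations.

    ```python
    def fold_flux_with_xsec(flux, xsec, e_min, e_max, n=2000):
        """Trapezoidal folding of a (virtual-)photon flux dN/dE with a
        photo-nuclear cross-section: sigma = integral of flux(E)*xsec(E) dE."""
        h = (e_max - e_min) / n
        total = 0.5 * (flux(e_min) * xsec(e_min) + flux(e_max) * xsec(e_max))
        for i in range(1, n):
            e = e_min + i * h
            total += flux(e) * xsec(e)
        return total * h

    # Illustrative shapes: a 1/E photon flux and a resonance peaked near 15 MeV
    # with a half-width of 3 MeV (arbitrary units).
    flux = lambda e: 1.0 / e
    resonance = lambda e: 1.0 / (1.0 + ((e - 15.0) / 3.0) ** 2)
    sigma = fold_flux_with_xsec(flux, resonance, 8.0, 30.0)
    ```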

  16. Measuring software design

    NASA Technical Reports Server (NTRS)

    1986-01-01

    An extensive series of studies of software design measures conducted by the Software Engineering Laboratory is described. Included are the objectives and results of the studies, the method used to perform the studies, and the problems encountered. The document should be useful to researchers planning similar studies as well as to managers and designers concerned with applying quantitative design measures.

  17. Calculation of HPGe efficiency for environmental samples: comparison of EFFTRAN and GEANT4

    NASA Astrophysics Data System (ADS)

    Nikolic, Jelena; Vidmar, Tim; Jokovic, Dejan; Rajacic, Milica; Todorovic, Dragana

    2014-11-01

    Determination of the full-energy-peak efficiency is one of the most important tasks to be performed before gamma spectrometry of environmental samples. Many methods, including measurement of specific reference materials, Monte Carlo simulations, efficiency transfer, and semi-empirical calculations, have been developed for this task. A Monte Carlo simulation based on the GEANT4 simulation package and the EFFTRAN efficiency-transfer software were applied to the efficiency calibration of three detectors routinely used in the Environment and Radiation Protection Laboratory of the Vinca Institute for Nuclear Sciences for the measurement of environmental samples. Efficiencies were calculated for water, soil, and aerosol samples. The aim of this paper is to perform efficiency calculations for HPGe detectors using both the GEANT4 simulation and the EFFTRAN efficiency-transfer software and to compare the results with experiment. This comparison should show how well the two methods agree with the experimentally obtained efficiencies of our measurement system and in which parts of the spectrum discrepancies appear. Detailed knowledge of the accuracy and precision of both methods should enable us to choose an appropriate method for each situation that arises in our and other laboratories on a daily basis.
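
    At a single energy, the efficiency-transfer idea can be summarised as a ratio correction: scale a measured reference efficiency by the computed sample-to-reference ratio. The sketch below uses made-up numbers and is not EFFTRAN's implementation.

    ```python
    def transfer_efficiency(eps_ref_measured, eps_ref_computed, eps_sample_computed):
        """Efficiency transfer in ratio form: the computed ratio corrects the
        measured reference efficiency for the change of geometry/matrix,
        so systematic model errors largely cancel in the ratio."""
        return eps_ref_measured * (eps_sample_computed / eps_ref_computed)

    # Made-up numbers at one energy: 4.0% measured on a point-source reference,
    # 3.8% computed for the reference and 3.1% for a soil-filled beaker geometry.
    eps = transfer_efficiency(0.040, 0.038, 0.031)
    ```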

  18. GEANT4 for breast dosimetry: parameters optimization study

    NASA Astrophysics Data System (ADS)

    Fedon, C.; Longo, F.; Mettivier, G.; Longo, R.

    2015-08-01

    Mean glandular dose (MGD) is the main dosimetric quantity in mammography. MGD is evaluated by multiplying the entrance skin air kerma (ESAK) by normalized glandular dose (DgN) coefficients. While ESAK is an empirical quantity, DgN coefficients can only be estimated with Monte Carlo (MC) methods. Thus, a benchmark of MC parameters is needed for effectively evaluating DgN coefficients. GEANT4 is a MC toolkit suitable for medical purposes that offers its users several computational choices. In this work we investigate GEANT4's performance by testing the main PhysicsLists for medical applications. Four electromagnetic PhysicsLists were implemented: linear attenuation coefficients were calculated for breast glandularities of 0%, 50%, and 100% in the energy range 8-50 keV, and DgN coefficients were evaluated. The results were compared with published data. Fit equations are proposed for estimating the G-factor, a parameter introduced in the literature for converting the dose delivered in the heterogeneous medium to that in the glandular tissue, and the application of this parameter interaction-by-interaction or retrospectively is discussed. G4EmLivermorePhysicsList shows the best agreement for the linear attenuation coefficients, both with theoretical values and with published data. Moreover, an excellent correlation factor (r² > 0.99) is found for the DgN coefficients with respect to the literature. The final goal of this study is to identify, for the first time, a benchmark of parameters that could be useful for future breast dosimetry studies with GEANT4.
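
    The two conversions described in the abstract, ESAK times DgN and the retrospective G-factor correction, are simple products; a sketch with illustrative numbers only:

    ```python
    def mean_glandular_dose(esak_mgy, dgn):
        """MGD = ESAK x DgN: entrance skin air kerma (mGy) times the
        normalized glandular dose coefficient (dimensionless, mGy/mGy)."""
        return esak_mgy * dgn

    def glandular_dose(dose_medium_mgy, g_factor):
        """Retrospective application of the G-factor: convert dose scored in
        the heterogeneous breast medium to dose to the glandular tissue."""
        return dose_medium_mgy * g_factor

    # Illustrative values: 5 mGy ESAK with DgN = 0.22; the 0.22 and 1.05
    # figures are assumptions for the example, not the paper's results.
    mgd = mean_glandular_dose(5.0, 0.22)
    dg = glandular_dose(2.0, 1.05)
    ```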

  19. GEANT4 for breast dosimetry: parameters optimization study.

    PubMed

    Fedon, C; Longo, F; Mettivier, G; Longo, R

    2015-08-21

    Mean glandular dose (MGD) is the main dosimetric quantity in mammography. MGD is evaluated by multiplying the entrance skin air kerma (ESAK) by normalized glandular dose (DgN) coefficients. While ESAK is an empirical quantity, DgN coefficients can only be estimated with Monte Carlo (MC) methods. Thus, a benchmark of MC parameters is needed for effectively evaluating DgN coefficients. GEANT4 is a MC toolkit suitable for medical purposes that offers its users several computational choices. In this work we investigate GEANT4's performance by testing the main PhysicsLists for medical applications. Four electromagnetic PhysicsLists were implemented: linear attenuation coefficients were calculated for breast glandularities of 0%, 50%, and 100% in the energy range 8-50 keV, and DgN coefficients were evaluated. The results were compared with published data. Fit equations are proposed for estimating the G-factor, a parameter introduced in the literature for converting the dose delivered in the heterogeneous medium to that in the glandular tissue, and the application of this parameter interaction-by-interaction or retrospectively is discussed. G4EmLivermorePhysicsList shows the best agreement for the linear attenuation coefficients, both with theoretical values and with published data. Moreover, an excellent correlation factor (r² > 0.99) is found for the DgN coefficients with respect to the literature. The final goal of this study is to identify, for the first time, a benchmark of parameters that could be useful for future breast dosimetry studies with GEANT4. PMID:26267405

  20. Particles Production in Extensive Air Showers: GEANT4 vs CORSIKA

    NASA Astrophysics Data System (ADS)

    Sabra, M. S.; Watts, J. W.; Christl, M. J.

    2014-09-01

    Air shower simulations are essential tools for the interpretation of Extensive Air Shower (EAS) measurements. The reliability of these codes is evaluated by comparison with equivalent simulation calculations and with experimental data (when available). In this work, we present GEANT4 calculations of particle production in EAS induced by primary protons and iron nuclei in the PeV (10¹⁵ eV) energy range. The calculations, using different hadronic models, are compared with the results from the well-known air shower simulation code CORSIKA, and the results of this comparison are discussed. This work is supported by the NASA Postdoctoral Program administered by Oak Ridge Associated Universities.

  1. Evaluation of open MPI and MPICH2 performances for the computation time in proton therapy dose calculations with Geant4

    NASA Astrophysics Data System (ADS)

    Kazemi, M.; Afarideh, H.; Riazi, Z.

    2015-11-01

    The aim of this work is to use a better parallel software structure to improve the performance of the Monte Carlo Geant4 code in proton treatment planning. The hadron therapy simulation is rewritten to run in parallel on shared-memory multiprocessor systems using the Message-Passing Interface (MPI). The speedup performance of the code has been studied with two MPI-compliant libraries, Open MPI and MPICH2, separately. The speedup results are almost linear for both Open MPI and MPICH2; the latter was chosen because of its better characteristics and lower computation time. The Geant4 parameters, including the step limiter and the set cut, have been analyzed to minimize the simulation time as much as possible. For a reasonable compromise between the spatial dose distribution and the calculation time, the time reduction coefficient reaches about 157.
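
    The scaling claim ("almost linear") can be made concrete with the standard speedup and parallel-efficiency metrics. A minimal sketch with hypothetical timings (not the paper's measurements):

```python
def speedup(t_serial: float, t_parallel: float) -> float:
    """Parallel speedup: ratio of serial to parallel wall-clock time."""
    return t_serial / t_parallel

def efficiency(t_serial: float, t_parallel: float, n_procs: int) -> float:
    """Parallel efficiency; values close to 1.0 indicate near-linear scaling."""
    return speedup(t_serial, t_parallel) / n_procs

# Hypothetical timings: 1000 s serial run vs 130 s on 8 MPI processes
print(round(speedup(1000, 130), 1))        # 7.7
print(round(efficiency(1000, 130, 8), 2))  # 0.96
```

    Independent Monte Carlo event batches make this an embarrassingly parallel workload, which is why near-linear speedup is the expected baseline for both MPI libraries.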

  2. Software Architecture Design Reasoning

    NASA Astrophysics Data System (ADS)

    Tang, Antony; van Vliet, Hans

    Despite recent advancements in software architecture knowledge management and design rationale modeling, industrial practice is behind in adopting these methods. The lack of empirical proof and of a practical process that practitioners can easily incorporate are among the hindrances to adoption. In particular, a process to support systematic design reasoning is not available. To rectify this issue, we propose a design reasoning process to help architects cope with an architectural design environment where design concerns are cross-cutting and diversified. We use an industrial case study to validate that the design reasoning process can help improve the quality of software architecture design. The results indicate that associating design concerns and identifying design options are important steps in design reasoning.

  3. Aircraft Design Software

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Successful commercialization of the AirCraft SYNThesis (ACSYNT) tool has resulted in the creation of Phoenix Integration, Inc. ACSYNT has been exclusively licensed to the company, an outcome of a seven-year, $3 million effort to provide unique software technology to a focused design engineering market. Ames Research Center formulated ACSYNT and, working with the Virginia Polytechnic Institute CAD Laboratory, began to design and code a computer-aided design capability for ACSYNT. Using a Joint Sponsored Research Agreement, Ames formed an industry-government-university alliance to improve and foster research and development for the software. As a result of the ACSYNT Institute, the software is becoming a predominant tool for aircraft conceptual design. ACSYNT has been successfully applied to high-speed civil transport configurations, subsonic transports, and supersonic fighters.

  4. Positron Production at JLab Simulated Using Geant4

    SciTech Connect

    Kossler, W. J.; Long, S. S.

    2009-09-02

    The results of a Geant4 Monte Carlo study of the production of slow positrons using a 140 MeV electron beam, which might be available at Jefferson Lab, are presented. Positrons are produced by pair production from the gamma rays produced by bremsstrahlung on the target, which is also the stopping medium for the positrons. Positrons which diffuse to the surface of the stopping medium are assumed to be ejected due to a negative work function. Here the target and moderator are combined into one piece. For an osmium target/moderator 3 cm long with transverse dimensions of 1 cm by 1 mm, we obtain a slow positron yield of about 8.5·10^10/(s·mA). If these positrons were remoderated and re-emitted with a 23% probability, we would obtain 2·10^10/(s·mA) in a micro-beam.
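
    The final micro-beam figure follows directly from the quoted slow-positron yield and the assumed remoderation probability; the arithmetic can be checked in a one-liner:

```python
slow_yield = 8.5e10   # slow positrons per (s·mA), from the abstract
remod_prob = 0.23     # assumed remoderation/re-emission probability

microbeam_yield = slow_yield * remod_prob
print(f"{microbeam_yield:.2e}")  # 1.96e+10, i.e. ~2e10/(s·mA)
```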

  5. Nuclear spectroscopy with Geant4: Proton and neutron emission & radioactivity

    NASA Astrophysics Data System (ADS)

    Sarmiento, L. G.; Rudolph, D.

    2016-07-01

    With the aid of a novel combination of existing equipment - JYFLTRAP and the TASISpec decay station - it is possible to perform very clean quantum-state selective, high-resolution particle-γ decay spectroscopy. We intend to study the determination of the branching ratio of the ℓ = 9 proton emission from the Iπ = 19/2-, 3174-keV isomer in the N = Z - 1 nucleus 53Co. The study aims to initiate a series of similar experiments along the proton dripline, thereby providing unique insights into "open quantum systems". The technique has been pioneered in case studies using SHIPTRAP and TASISpec at GSI. Newly available radioactive decay modes in Geant4 simulations are going to corroborate the anticipated experimental results.

  6. A modular Geant4 model of Leksell Gamma Knife Perfexion™

    NASA Astrophysics Data System (ADS)

    Pipek, J.; Novotný, J.; Novotný, J., Jr.; Kozubíková, P.

    2014-12-01

    This work presents a Monte Carlo model of Leksell Gamma Knife Perfexion as well as the main parameters of the dose distribution in the standard phantom obtained using this model. The model is developed in the Geant4 simulation toolkit in a modular way which enables its reuse in other Perfexion studies. Large phase space files were created, containing particles that are entering the inner machine cavity after being transported through the collimation system. All 14 output factors of the machine and effective output factors for both the 4 mm (0.830 ± 0.009) and 8 mm (0.921 ± 0.004) collimators were calculated. Dose profiles along the main axes are also included for each collimator size. All results are compared to the values obtained from the treatment planning system, from experiments, and from other Monte Carlo models.

  7. GEANT 4 simulation of (99)Mo photonuclear production in nanoparticles.

    PubMed

    Dikiy, N P; Dovbnya, A N; Fedorchenko, D V; Khazhmuradov, M A

    2016-08-01

    The GEANT 4 Monte Carlo simulation toolkit is used to study the kinematic recoil method of (99)Mo photonuclear production. Simulation for a bremsstrahlung photon spectrum with a maximum photon energy of 30 MeV showed that for MoO3 nanoparticles the escape fraction decreases from 0.24 to 0.08 as the nanoparticle size increases from 20 nm to 80 nm. For natural molybdenum and pure (100)Mo we obtained lower values: from 0.17 to 0.05. The generation of accompanying molybdenum nuclei is significantly lower for pure (100)Mo, about 3.6 nuclei per single (99)Mo nucleus, while natural molybdenum nanoparticles produce about 48 accompanying nuclei. We have also shown that for high-energy photons the escape fraction of (99)Mo decreases, while production of unwanted molybdenum isotopes is significantly higher. PMID:27156050

  8. Simulation of a Helical Channel using GEANT4

    SciTech Connect

    Elvira, V. D.; Lebrun, P.; Spentzouris, P.

    2001-02-01

    We present a simulation of a 72 m long cooling channel proposed by V. Balbekov based on the helical cooling concept developed by Ya. Derbenev. LiH wedge absorbers provide the energy loss mechanism and 201 MHz cavities are used for re-acceleration. They are placed inside a main solenoidal field to focus the beam. A helical field with an amplitude of 0.3 T and a period of 1.8 m provides momentum dispersion for emittance exchange. The simulation is performed using GEANT4. The total fractional transmission is 0.85, and the transverse, longitudinal, and 3-D cooling factors are 3.75, 2.27, and 14.61, respectively. Some version of this helical channel could eventually be used to replace the first section of the double flip channel to keep the longitudinal emittance under control and increase transmission. Although this is an interesting option, the technical challenges are still significant.

  9. A modular Geant4 model of Leksell Gamma Knife Perfexion™.

    PubMed

    Pipek, J; Novotný, J; Novotný, J; Kozubíková, P

    2014-12-21

    This work presents a Monte Carlo model of Leksell Gamma Knife Perfexion as well as the main parameters of the dose distribution in the standard phantom obtained using this model. The model is developed in the Geant4 simulation toolkit in a modular way which enables its reuse in other Perfexion studies. Large phase space files were created, containing particles that are entering the inner machine cavity after being transported through the collimation system. All 14 output factors of the machine and effective output factors for both the 4 mm (0.830 ± 0.009) and 8 mm (0.921 ± 0.004) collimators were calculated. Dose profiles along the main axes are also included for each collimator size. All results are compared to the values obtained from the treatment planning system, from experiments, and from other Monte Carlo models. PMID:25415510

  10. Geant4 Simulation of Air Showers using Thinning Method

    NASA Astrophysics Data System (ADS)

    Sabra, Mohammad S.; Watts, John W.; Christl, Mark J.

    2015-04-01

    Simulation of complete air showers induced by cosmic ray particles becomes prohibitive at extreme energies due to the large number of secondary particles. The computing time of such simulations roughly scales with the energy of the primary cosmic ray particle and becomes excessively large. To mitigate the problem, only a small fraction of the particles is tracked, and the whole shower is then reconstructed from this sample. This method is called thinning. Using this method in Geant4, we have simulated proton and iron air showers at extreme energies (E > 10^16 eV). Secondary particle densities are calculated and compared with the standard simulation program in this field, CORSIKA. This work is supported by the NASA Postdoctoral Program administered by Oak Ridge Associated Universities.
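
    The thinning idea can be sketched in a few lines: below a chosen threshold energy, keep a particle with probability proportional to its energy and give the survivor a compensating statistical weight, so energy flow is preserved in expectation while the tracked particle count shrinks. This is a simplified illustration of the general technique (in the spirit of the Hillas thinning used by CORSIKA), not the paper's actual implementation:

```python
import random

def thin(particles, e_thin, rng=random.Random(42)):
    """Statistical thinning: particles below e_thin survive with
    probability E/e_thin and carry weight scaled by e_thin/E, so the
    expected weighted energy is preserved while far fewer particles
    need to be tracked."""
    kept = []
    for energy, weight in particles:
        if energy >= e_thin:
            kept.append((energy, weight))          # always track energetic particles
        else:
            p = energy / e_thin
            if rng.random() < p:
                kept.append((energy, weight / p))  # survivor carries extra weight
    return kept

# Toy shower: 5 energetic particles plus 10000 low-energy ones (arbitrary units)
shower = [(1e6, 1.0)] * 5 + [(1e3, 1.0)] * 10000
sample = thin(shower, e_thin=1e5)
print(len(sample) < len(shower))  # True: only ~1% of low-energy particles kept
```

    The reconstruction step mentioned in the abstract then treats each surviving weighted particle as a stand-in for the discarded ones when computing secondary particle densities.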

  11. Aviation Design Software

    NASA Technical Reports Server (NTRS)

    1997-01-01

    DARcorporation developed a General Aviation CAD package through a Small Business Innovation Research contract from Langley Research Center. This affordable, user-friendly preliminary design system for General Aviation aircraft runs on the popular 486 IBM-compatible personal computers. Individuals taking the home-built approach, small manufacturers of General Aviation airplanes, as well as students and others interested in the analysis and design of aircraft are possible users of the package. The software can cut design and development time in half.

  12. Calibration and GEANT4 Simulations of the Phase II Proton Compute Tomography (pCT) Range Stack Detector

    SciTech Connect

    Uzunyan, S. A.; Blazey, G.; Boi, S.; Coutrakon, G.; Dyshkant, A.; Francis, K.; Hedin, D.; Johnson, E.; Kalnins, J.; Zutshi, V.; Ford, R.; Rauch, J. E.; Rubinov, P.; Sellberg, G.; Wilson, P.; Naimuddin, M.

    2015-12-29

    Northern Illinois University, in collaboration with Fermi National Accelerator Laboratory (FNAL) and Delhi University, has been designing and building a proton CT scanner for applications in proton treatment planning. The Phase II proton CT scanner consists of eight planes of tracking detectors with two X and two Y coordinate measurements both before and after the patient. In addition, a range stack detector consisting of a stack of thin scintillator tiles, arranged in twelve eight-tile frames, is used to determine the water equivalent path length (WEPL) of each track through the patient. The X-Y coordinates and WEPL are required input for image reconstruction software to find the relative (proton) stopping power (RSP) value of each voxel in the patient and generate a corresponding 3D image. In this Note we describe tests conducted in 2015 at the proton beam at the Central DuPage Hospital in Warrenville, IL, focusing on the range stack calibration procedure and comparisons with the GEANT4 range stack simulation.

  13. Monte Carlo application based on GEANT4 toolkit to simulate a laser-plasma electron beam line for radiobiological studies

    NASA Astrophysics Data System (ADS)

    Lamia, D.; Russo, G.; Casarino, C.; Gagliano, L.; Candiano, G. C.; Labate, L.; Baffigi, F.; Fulgentini, L.; Giulietti, A.; Koester, P.; Palla, D.; Gizzi, L. A.; Gilardi, M. C.

    2015-06-01

    We report on the development of a Monte Carlo application, based on the GEANT4 toolkit, for the characterization and optimization of electron beams for clinical applications produced by a laser-driven plasma source. The GEANT4 application is conceived so as to represent in the most general way the physical and geometrical features of a typical laser-driven accelerator. It is designed to provide standard dosimetric figures such as percentage dose depth curves, two-dimensional dose distributions and 3D dose profiles at different positions both inside and outside the interaction chamber. The application was validated by comparing its predictions to experimental measurements carried out on a real laser-driven accelerator. The work is aimed at optimizing the source, by using this novel application, for radiobiological studies and, in perspective, for medical applications.

  14. Balloon Design Software

    NASA Technical Reports Server (NTRS)

    Farley, Rodger

    2007-01-01

    PlanetaryBalloon Version 5.0 is a software package for the design of meridionally lobed planetary balloons. It operates in a Windows environment, and programming was done in Visual Basic 6. By including the effects of circular lobes with load tapes, skin mass, hoop and meridional stress, and elasticity in the structural elements, a more accurate balloon shape of practical construction can be determined as well as the room-temperature cut pattern for the gore shapes. The computer algorithm is formulated for sizing meridionally lobed balloons for any generalized atmosphere or planet. This also covers zero-pressure, over-pressure, and super-pressure balloons. Low circumferential loads with meridionally reinforced load tapes will produce shapes close to what are known as the "natural shape." The software allows for the design of constant angle, constant radius, or constant hoop stress balloons. It uses the desired payload capacity for given atmospheric conditions and determines the required volume, allowing users to design exactly to their requirements. The formulations are generalized to use any lift gas (or mixture of gases), any atmosphere, or any planet as described by the local acceleration of gravity. PlanetaryBalloon software has a comprehensive user manual that covers features including, but not limited to, buoyancy and super-pressure, convenient design equations, shape formulation, and orthotropic stress/strain.

  15. Diffusion-controlled reactions modeling in Geant4-DNA

    NASA Astrophysics Data System (ADS)

    Karamitros, M.; Luan, S.; Bernal, M. A.; Allison, J.; Baldacchino, G.; Davidkova, M.; Francis, Z.; Friedland, W.; Ivantchenko, V.; Ivantchenko, A.; Mantero, A.; Nieminem, P.; Santin, G.; Tran, H. N.; Stepan, V.; Incerti, S.

    2014-10-01

    Context: Under irradiation, a biological system undergoes a cascade of chemical reactions that can lead to an alteration of its normal operation. There are different types of radiation and many competing reactions; as a result, the kinetics of chemical species is extremely complex. Simulation then becomes a powerful tool which, by describing the basic principles of chemical reactions, can reveal the dynamics of the macroscopic system. To understand the dynamics of biological systems under radiation, on-going efforts have been carried out since the 80s by several research groups to establish a mechanistic model that consists of describing all the physical, chemical and biological phenomena following the irradiation of single cells. This approach is generally divided into a succession of stages that follow each other in time: (1) the physical stage, where the ionizing particles interact directly with the biological material; (2) the physico-chemical stage, where the targeted molecules release their energy by dissociating, creating new chemical species; (3) the chemical stage, where the new chemical species interact with each other or with the biomolecules; (4) the biological stage, where the repairing mechanisms of the cell come into play. This article focuses on the modeling of the chemical stage. Method: This article presents a general method of speeding up chemical reaction simulations in fluids, based on the Smoluchowski equation and Monte Carlo methods, where all molecules are explicitly simulated and the solvent is treated as a continuum. The model describes diffusion-controlled reactions. This method has been implemented in Geant4-DNA. The keys to the new algorithm include: (1) the combination of a method to compute time steps dynamically with a Brownian bridge process to account for chemical reactions, which avoids costly fixed-time-step simulations; (2) a k-d tree data structure for quickly locating, for a given molecule, its closest reactants. The
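
    The dynamic time-step idea can be sketched simply: choose a step short enough that a molecule's RMS diffusion displacement stays well below the distance to its nearest potential reactant. The sketch below uses a naive linear scan in place of the k-d tree described in the abstract, and the `safety` factor and values are illustrative assumptions:

```python
import math

def nearest_reactant_distance(molecule, reactants):
    """Naive nearest-neighbor search; a stand-in for the k-d tree
    lookup described in the abstract."""
    return min(math.dist(molecule, r) for r in reactants)

def dynamic_time_step(molecule, reactants, diff_coeff, safety=0.1):
    """Pick dt so the RMS 3D diffusion displacement sqrt(6*D*dt) is a
    small fraction (safety) of the distance to the closest reactant."""
    d = nearest_reactant_distance(molecule, reactants)
    return (safety * d) ** 2 / (6.0 * diff_coeff)

# Illustrative geometry: nearest reactant sits at distance 5 (arbitrary units)
mol = (0.0, 0.0, 0.0)
others = [(10.0, 0.0, 0.0), (0.0, 3.0, 4.0)]
dt = dynamic_time_step(mol, others, diff_coeff=1.0)
print(round(dt, 4))  # 0.0417
```

    When molecules are far apart the step grows and the simulation leaps ahead cheaply; when reactants are close the step shrinks, and the Brownian-bridge correction mentioned in the abstract accounts for encounters that could occur within a step.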

  16. Diffusion-controlled reactions modeling in Geant4-DNA

    SciTech Connect

    Karamitros, M.; Luan, S.; Bernal, M.A.; Allison, J.; Baldacchino, G.; Davidkova, M.; Francis, Z.; Friedland, W.; Ivantchenko, V.; Ivantchenko, A.; Mantero, A.; Nieminem, P.; Santin, G.; Tran, H.N.; Stepan, V.; Incerti, S.

    2014-10-01

    Context: Under irradiation, a biological system undergoes a cascade of chemical reactions that can lead to an alteration of its normal operation. There are different types of radiation and many competing reactions; as a result, the kinetics of chemical species is extremely complex. Simulation then becomes a powerful tool which, by describing the basic principles of chemical reactions, can reveal the dynamics of the macroscopic system. To understand the dynamics of biological systems under radiation, on-going efforts have been carried out since the 80s by several research groups to establish a mechanistic model that consists of describing all the physical, chemical and biological phenomena following the irradiation of single cells. This approach is generally divided into a succession of stages that follow each other in time: (1) the physical stage, where the ionizing particles interact directly with the biological material; (2) the physico-chemical stage, where the targeted molecules release their energy by dissociating, creating new chemical species; (3) the chemical stage, where the new chemical species interact with each other or with the biomolecules; (4) the biological stage, where the repairing mechanisms of the cell come into play. This article focuses on the modeling of the chemical stage. Method: This article presents a general method of speeding up chemical reaction simulations in fluids, based on the Smoluchowski equation and Monte Carlo methods, where all molecules are explicitly simulated and the solvent is treated as a continuum. The model describes diffusion-controlled reactions. This method has been implemented in Geant4-DNA. The keys to the new algorithm include: (1) the combination of a method to compute time steps dynamically with a Brownian bridge process to account for chemical reactions, which avoids costly fixed-time-step simulations; (2) a k-d tree data structure for quickly locating, for a given molecule, its closest reactants. The

  17. GATE - Geant4 Application for Tomographic Emission: a simulation toolkit for PET and SPECT

    PubMed Central

    Jan, S.; Santin, G.; Strul, D.; Staelens, S.; Assié, K.; Autret, D.; Avner, S.; Barbier, R.; Bardiès, M.; Bloomfield, P. M.; Brasse, D.; Breton, V.; Bruyndonckx, P.; Buvat, I.; Chatziioannou, A. F.; Choi, Y.; Chung, Y. H.; Comtat, C.; Donnarieix, D.; Ferrer, L.; Glick, S. J.; Groiselle, C. J.; Guez, D.; Honore, P.-F.; Kerhoas-Cavata, S.; Kirov, A. S.; Kohli, V.; Koole, M.; Krieguer, M.; van der Laan, D. J.; Lamare, F.; Largeron, G.; Lartizien, C.; Lazaro, D.; Maas, M. C.; Maigne, L.; Mayet, F.; Melot, F.; Merheb, C.; Pennacchio, E.; Perez, J.; Pietrzyk, U.; Rannou, F. R.; Rey, M.; Schaart, D. R.; Schmidtlein, C. R.; Simon, L.; Song, T. Y.; Vieira, J.-M.; Visvikis, D.; Van de Walle, R.; Wieërs, E.; Morel, C.

    2012-01-01

    Monte Carlo simulation is an essential tool in emission tomography that can assist in the design of new medical imaging devices, the optimization of acquisition protocols, and the development or assessment of image reconstruction algorithms and correction techniques. GATE, the Geant4 Application for Tomographic Emission, encapsulates the Geant4 libraries to achieve a modular, versatile, scripted simulation toolkit adapted to the field of nuclear medicine. In particular, GATE allows the description of time-dependent phenomena such as source or detector movement, and source decay kinetics. This feature makes it possible to simulate time curves under realistic acquisition conditions and to test dynamic reconstruction algorithms. This paper gives a detailed description of the design and development of GATE by the OpenGATE collaboration, whose continuing objective is to improve, document, and validate GATE by simulating commercially available imaging systems for PET and SPECT. Large effort is also invested in the ability and the flexibility to model novel detection systems or systems still under design. A public release of GATE licensed under the GNU Lesser General Public License can be downloaded at the address http://www-lphe.ep.ch/GATE/. Two benchmarks developed for PET and SPECT to test the installation of GATE and to serve as a tutorial for the users are presented. Extensive validation of the GATE simulation platform has been started, comparing simulations and measurements on commercially available acquisition systems. References to those results are listed. The future prospects toward the gridification of GATE and its extension to other domains such as dosimetry are also discussed. PMID:15552416

  18. Thermal neutron response of a boron-coated GEM detector via GEANT4 Monte Carlo code.

    PubMed

    Jamil, M; Rhee, J T; Kim, H G; Ahmad, Farzana; Jeon, Y J

    2014-10-22

    In this work, we report the design configuration and the performance of a hybrid Gas Electron Multiplier (GEM) detector. In order to make the detector sensitive to thermal neutrons, the forward electrode of the GEM has been coated with enriched boron-10 material, which works as a neutron converter. A 5×5 cm² GEM configuration has been used for the thermal neutron studies. The response of the detector has been estimated using the GEANT4 MC code with two different physics lists. With the QGSP_BIC_HP physics list, the neutron detection efficiency was determined to be about 3%, while with the QGSP_BERT_HP physics list the efficiency was around 2.5%, at an incident thermal neutron energy of 25 meV. The higher response of the detector shows that coating the GEM with a boron converter improves the efficiency of thermal neutron detection. PMID:25464183
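
    The quoted efficiencies are simple count ratios from the simulation: detected neutrons over incident neutrons. A minimal sketch with hypothetical counts chosen to match the reported ~3% (not the paper's actual event totals):

```python
def detection_efficiency(detected: int, incident: int) -> float:
    """Thermal neutron detection efficiency as a simple count ratio."""
    return detected / incident

# Hypothetical counts consistent with the ~3% reported for QGSP_BIC_HP
print(round(100 * detection_efficiency(3000, 100000), 1))  # 3.0
```

    The ~0.5 percentage-point gap between the two physics lists reflects their different hadronic models for the (n, alpha) capture on boron-10, not a change in detector geometry.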

  19. Track structure modeling in liquid water: A review of the Geant4-DNA very low energy extension of the Geant4 Monte Carlo simulation toolkit.

    PubMed

    Bernal, M A; Bordage, M C; Brown, J M C; Davídková, M; Delage, E; El Bitar, Z; Enger, S A; Francis, Z; Guatelli, S; Ivanchenko, V N; Karamitros, M; Kyriakou, I; Maigne, L; Meylan, S; Murakami, K; Okada, S; Payno, H; Perrot, Y; Petrovic, I; Pham, Q T; Ristic-Fira, A; Sasaki, T; Štěpán, V; Tran, H N; Villagrasa, C; Incerti, S

    2015-12-01

    Understanding the fundamental mechanisms involved in the induction of biological damage by ionizing radiation remains a major challenge of today's radiobiology research. The Monte Carlo simulation of physical, physicochemical and chemical processes involved may provide a powerful tool for the simulation of early damage induction. The Geant4-DNA extension of the general purpose Monte Carlo Geant4 simulation toolkit aims to provide the scientific community with an open source access platform for the mechanistic simulation of such early damage. This paper presents the most recent review of the Geant4-DNA extension, as available to Geant4 users since June 2015 (release 10.2 Beta). In particular, the review includes the description of new physical models for the description of electron elastic and inelastic interactions in liquid water, as well as new examples dedicated to the simulation of physicochemical and chemical stages of water radiolysis. Several implementations of geometrical models of biological targets are presented as well, and the list of Geant4-DNA examples is described. PMID:26653251

  20. Comparison of GATE/GEANT4 with EGSnrc and MCNP for electron dose calculations at energies between 15 keV and 20 MeV

    NASA Astrophysics Data System (ADS)

    Maigne, L.; Perrot, Y.; Schaart, D. R.; Donnarieix, D.; Breton, V.

    2011-02-01

    The GATE Monte Carlo simulation platform based on the GEANT4 toolkit has come into widespread use for simulating positron emission tomography (PET) and single photon emission computed tomography (SPECT) imaging devices. Here, we explore its use for calculating electron dose distributions in water. Mono-energetic electron dose point kernels and pencil beam kernels in water are calculated for different energies between 15 keV and 20 MeV by means of GATE 6.0, which makes use of the GEANT4 version 9.2 Standard Electromagnetic Physics Package. The results are compared to the well-validated codes EGSnrc and MCNP4C. It is shown that recent improvements made to the GEANT4/GATE software result in significantly better agreement with the other codes. We furthermore illustrate several issues of general interest to GATE and GEANT4 users who wish to perform accurate simulations involving electrons. Provided that the electron step size is sufficiently restricted, GATE 6.0 and EGSnrc dose point kernels are shown to agree to within less than 3% of the maximum dose between 50 keV and 4 MeV, while pencil beam kernels are found to agree to within less than 4% of the maximum dose between 15 keV and 20 MeV.
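
    The agreement figures quoted here ("within less than 3% of the maximum dose") use the maximum dose as the normalization for the point-by-point difference between two kernels. A minimal sketch of that metric, with illustrative depth-dose samples that are not data from the paper:

```python
def max_dose_deviation(dose_a, dose_b):
    """Largest absolute point-by-point difference between two dose
    kernels, expressed as a percentage of the maximum dose of the
    reference kernel (dose_a)."""
    d_max = max(dose_a)
    return 100.0 * max(abs(a - b) for a, b in zip(dose_a, dose_b)) / d_max

# Hypothetical depth-dose samples (arbitrary units)
gate = [0.0, 0.50, 1.0, 0.70, 0.20]
egs  = [0.0, 0.49, 1.0, 0.72, 0.21]
print(round(max_dose_deviation(gate, egs), 1))  # 2.0
```

    Normalizing by the maximum rather than the local dose avoids inflating the metric in the low-dose tail, where relative differences are large but clinically insignificant.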

  1. Comparison of GATE/GEANT4 with EGSnrc and MCNP for electron dose calculations at energies between 15 keV and 20 MeV.

    PubMed

    Maigne, L; Perrot, Y; Schaart, D R; Donnarieix, D; Breton, V

    2011-02-01

    The GATE Monte Carlo simulation platform based on the GEANT4 toolkit has come into widespread use for simulating positron emission tomography (PET) and single photon emission computed tomography (SPECT) imaging devices. Here, we explore its use for calculating electron dose distributions in water. Mono-energetic electron dose point kernels and pencil beam kernels in water are calculated for different energies between 15 keV and 20 MeV by means of GATE 6.0, which makes use of the GEANT4 version 9.2 Standard Electromagnetic Physics Package. The results are compared to the well-validated codes EGSnrc and MCNP4C. It is shown that recent improvements made to the GEANT4/GATE software result in significantly better agreement with the other codes. We furthermore illustrate several issues of general interest to GATE and GEANT4 users who wish to perform accurate simulations involving electrons. Provided that the electron step size is sufficiently restricted, GATE 6.0 and EGSnrc dose point kernels are shown to agree to within less than 3% of the maximum dose between 50 keV and 4 MeV, while pencil beam kernels are found to agree to within less than 4% of the maximum dose between 15 keV and 20 MeV. PMID:21239846

  2. Experimental quantification of Geant4 PhysicsList recommendations: methods and results

    NASA Astrophysics Data System (ADS)

    Basaglia, Tullio; Han, Min Cheol; Hoff, Gabriela; Kim, Chan Hyeong; Kim, Sung Hun; Grazia Pia, Maria; Saracco, Paolo

    2015-12-01

    The Geant4 physics_lists package encompasses predefined selections of physics processes and models to be used in simulation applications. Limited documentation is available in the literature about Geant4 pre-packaged PhysicsLists and their validation; the reports in the literature mainly concern specific use cases. This paper documents the epistemological grounds for the validation of Geant4 pre-packaged PhysicsLists (and their accessory classes, Builders and PhysicsConstructors) and some examples of the authors' scientific activity on this subject.

  3. Software-Design-Analyzer System

    NASA Technical Reports Server (NTRS)

    Tausworthe, Robert C.

    1991-01-01

    CRISP-90 software-design-analyzer system, update of CRISP-80, is set of computer programs constituting software tool for design and documentation of other software and supporting top-down, hierarchical, modular, structured methodologies for design and programming. Written in Microsoft QuickBasic.

  4. Designing Educational Software for Tomorrow.

    ERIC Educational Resources Information Center

    Harvey, Wayne

    Designed to address the management and use of computer software in education and training, this paper explores both good and poor software design, calling for improvements in the quality of educational software by attending to design considerations that are based on general principles of learning rather than specific educational objectives. This…

  5. Scatter in an uncollimated x-ray CT machine based on a Geant4 Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Wadeson, Nicola; Morton, Edward; Lionheart, William

    2010-04-01

    A high-speed motionless-gantry x-ray CT machine has been designed to allow 3D images to be collected in real time. By using multiple, switched x-ray sources and fixed detector rings, the time-consuming mechanical rotation of conventional CT machines can be removed. However, this design limits the possibility of detector collimation, since each detector must now be able to record the energy of x-ray beams from a number of different directions. The lack of collimation has implications for the reconstructed image due to an increase in the number of scattered photons recorded. A Monte Carlo computer simulation of the x-ray machine has been developed, using the Geant4 software toolkit, to analyse the behaviour of both Rayleigh and Compton scattered photons in airport baggage and medical applications. Four different scattering objects were analysed, based on 50 kVp, 100 kVp and 150 kVp spectra for a tungsten target. Two suitcase objects, a body phantom and a brain phantom were chosen as objects typical of airport baggage and medical CT. The results indicate that the level of scatter is negligible for a typical airport baggage application, since the majority of space in a suitcase consists of clothing, which has a low density; scatter contributes less than 1% of the image in all instances. However, due to the large amount of water in the human body, the levels of scatter in the medical cases are significantly higher, reaching 37% when the body phantom is analysed at 50 kVp.

  6. Hadronic models validation in GEANT4 with CALICE highly granular calorimeters

    NASA Astrophysics Data System (ADS)

    Ramilli, Marco; CALICE Collaboration

    2012-12-01

    The CALICE collaboration has constructed highly granular hadronic and electromagnetic calorimeter prototypes to evaluate technologies for use in detector systems at a future Linear Collider, and to validate hadronic shower models with unprecedented spatial segmentation. The electromagnetic calorimeter is a sampling structure of tungsten and silicon with 9720 readout channels. The hadron calorimeter uses 7608 small plastic scintillator cells individually read out with silicon photomultipliers. This high granularity opens up the possibility of precise three-dimensional shower reconstruction and of software compensation techniques to improve the energy resolution of the detector. We discuss the latest results of studies of shower shapes and shower properties and their comparison to the latest GEANT4 models for hadronic showers. A satisfactory agreement, at better than 5%, is found between data and simulations for most of the investigated variables. We show that applying software compensation methods based on reconstructed clusters improves the energy resolution for hadrons by about 15%. The next challenge for the CALICE calorimeters will be to validate the fourth dimension of hadronic showers, namely their time evolution.

  7. Study on GEANT4 code applications to dose calculation using imaging data

    NASA Astrophysics Data System (ADS)

    Lee, Jeong Ok; Kang, Jeong Ku; Kim, Jhin Kee; Kwon, Hyeong Cheol; Kim, Jung Soo; Kim, Bu Gil; Jeong, Dong Hyeok

    2015-07-01

    The use of the GEANT4 code has increased in the medical field. Various studies have calculated patient dose distributions by using the GEANT4 code with imaging data. In the present study, Monte Carlo simulations based on DICOM data were performed to calculate the dose absorbed in the patient's body. Various visualization tools are installed in the GEANT4 code to display the detector construction; however, their support for displaying DICOM images is limited, and displaying dose distributions overlaid on the patient's imaging data is difficult. Recently, the gMocren code, a volume visualization tool for GEANT4 simulation, was developed and has been used for volume visualization of image files. In this study, the dose distributions absorbed in the patients were imaged on the DICOM data by using the gMocren code. Dosimetric evaluations were carried out by using thermoluminescent dosimeters and film dosimetry to verify the calculated results.

  8. Apply Design Patterns to Refactor Software Design

    NASA Technical Reports Server (NTRS)

    Baggs, Rhoda; Shaykhian, Gholam Ali

    2007-01-01

    Refactoring software design is a method of changing a software design while explicitly preserving its functionality. The presented approach is to use design patterns as the basis for refactoring software design. The comparison of design solutions is illustrated through C++ programming language examples. Developing reusable components is also discussed; the paper shows that the construction of such components can diminish the added burden of both refactoring and the use of design patterns.
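    The paper's examples are in C++; as a language-neutral illustration, the same refactoring idea, replacing scattered conditionals with the Strategy pattern's interchangeable, reusable components, can be sketched in Python (all names invented):

```python
# Before refactoring: behaviour selected with conditionals inside the class.
class ReportBefore:
    def __init__(self, fmt):
        self.fmt = fmt

    def render(self, data):
        if self.fmt == "csv":
            return ",".join(map(str, data))
        elif self.fmt == "tsv":
            return "\t".join(map(str, data))
        raise ValueError(self.fmt)


# After refactoring with the Strategy pattern: each behaviour becomes
# an interchangeable component that can be reused elsewhere.
class CsvStrategy:
    def render(self, data):
        return ",".join(map(str, data))


class TsvStrategy:
    def render(self, data):
        return "\t".join(map(str, data))


class Report:
    def __init__(self, strategy):
        self.strategy = strategy

    def render(self, data):
        return self.strategy.render(data)
```

    Both versions behave identically, which is the point of refactoring: only the structure changes, and new formats can now be added without touching `Report`.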

  9. Physical models implemented in the GEANT4-DNA extension of the GEANT-4 toolkit for calculating initial radiation damage at the molecular level.

    PubMed

    Villagrasa, C; Francis, Z; Incerti, S

    2011-02-01

    The ROSIRIS project aims to study the radiobiology of integrated systems for medical treatment optimisation using ionising radiations and evaluate the associated risk. In the framework of this project, one research focus is the interpretation of the initial radio-induced damage in DNA created by ionising radiation (and detected by γ-H2AX foci analysis) from the track structure of the incident particles. In order to calculate the track structure of ionising particles at a nanometric level, the Geant4 Monte Carlo toolkit was used. Geant4 (Object Oriented Programming Architecture in C++) offers a common platform, available free to all users and relatively easy to use. Nevertheless, the current low-energy threshold for electromagnetic processes in GEANT4 is set to 1 keV (250 eV using the Livermore processes), which is an unsuitable value for nanometric applications. To lower this energy threshold, the necessary interaction processes and models were identified, and the corresponding available cross sections collected from the literature. They are mostly based on the plane-wave Born approximation (first Born approximation, or FBA) for inelastic interactions and on semi-empirical models for energies where the FBA fails (at low energies). In this paper, the extensions that have been introduced into the 9.3 release of the Geant4 toolkit are described, the so-called Geant4-DNA extension, including a set of processes and models adapted in this study and permitting the simulation of electron (8 eV-1 MeV), proton (100 eV-100 MeV) and alpha particle (1 keV-10 MeV) interactions in liquid water. PMID:21186212
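    The particle-energy ranges quoted in the abstract lend themselves to a simple applicability check; a minimal sketch using only those published ranges (function name and table layout are illustrative):

```python
# Applicability ranges of the Geant4-DNA models in liquid water, as
# given in the abstract (energies in eV).
DNA_RANGES_EV = {
    "e-":     (8.0,   1.0e6),    # electrons: 8 eV - 1 MeV
    "proton": (100.0, 100.0e6),  # protons: 100 eV - 100 MeV
    "alpha":  (1.0e3, 10.0e6),   # alpha particles: 1 keV - 10 MeV
}


def dna_model_applicable(particle, energy_ev):
    """True if Geant4-DNA provides models for this particle/energy."""
    lo, hi = DNA_RANGES_EV[particle]
    return lo <= energy_ev <= hi
```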

  10. Refined lateral energy correction functions for the KASCADE-Grande experiment based on Geant4 simulations

    NASA Astrophysics Data System (ADS)

    Gherghel-Lascu, A.; Apel, W. D.; Arteaga-Velázquez, J. C.; Bekk, K.; Bertaina, M.; Blümer, J.; Bozdog, H.; Brancus, I. M.; Cantoni, E.; Chiavassa, A.; Cossavella, F.; Daumiller, K.; de Souza, V.; Di Pierro, F.; Doll, P.; Engel, R.; Engler, J.; Fuchs, B.; Fuhrmann, D.; Gils, H. J.; Glasstetter, R.; Grupen, C.; Haungs, A.; Heck, D.; Hörandel, J. R.; Huber, D.; Huege, T.; Kampert, K.-H.; Kang, D.; Klages, H. O.; Link, K.; Łuczak, P.; Mathes, H. J.; Mayer, H. J.; Milke, J.; Mitrica, B.; Morello, C.; Oehlschläger, J.; Ostapchenko, S.; Palmieri, N.; Petcu, M.; Pierog, T.; Rebel, H.; Roth, M.; Schieler, H.; Schoo, S.; Schröder, F. G.; Sima, O.; Toma, G.; Trinchero, G. C.; Ulrich, H.; Weindl, A.; Wochele, J.; Zabierowski, J.

    2015-02-01

    In previous studies of KASCADE-Grande data, a Monte Carlo simulation code based on the GEANT3 program has been developed to describe the energy deposited by EAS particles in the detector stations. In an attempt to decrease the simulation time and ensure compatibility with the geometry description in standard KASCADE-Grande analysis software, several structural elements have been neglected in the implementation of the Grande station geometry. To improve the agreement between experimental and simulated data, a more accurate simulation of the response of the KASCADE-Grande detector is necessary. A new simulation code has been developed based on the GEANT4 program, including a realistic geometry of the detector station with structural elements that have not been considered in previous studies. The new code is used to study the influence of a realistic detector geometry on the energy deposited in the Grande detector stations by particles from EAS events simulated by CORSIKA. Lateral Energy Correction Functions are determined and compared with previous results based on GEANT3.

  11. Applying Software Design Methodology to Instructional Design

    ERIC Educational Resources Information Center

    East, J. Philip

    2004-01-01

    The premise of this paper is that computer science has much to offer the endeavor of instructional improvement. Software design processes employed in computer science for developing software can be used for planning instruction and should improve instruction in much the same manner that design processes appear to have improved software. Techniques…

  12. Desiderata for Linguistic Software Design

    ERIC Educational Resources Information Center

    Garretson, Gregory

    2008-01-01

    This article presents a series of guidelines both for researchers in search of software to be used in linguistic analysis and for programmers designing such software. A description of the intended audience and the types of software under consideration and a review of some relevant literature are followed by a discussion of several important…

  13. A Learning Software Design Competition.

    ERIC Educational Resources Information Center

    Hooper, Simon; Hokanson, Brad; Bernhardt, Paul; Johnson, Mark

    2002-01-01

    Explains the University of Minnesota Learning Software Design Competition, focusing on its goals and emphasis on innovation. Describes the review process to evaluate and judge the software, lists the winners, identifies a new class of educational software, and outlines plans for future competitions. (Author/LRW)

  14. Validation of a small-animal PET simulation using GAMOS: a GEANT4-based framework

    NASA Astrophysics Data System (ADS)

    Cañadas, M.; Arce, P.; Rato Mendes, P.

    2011-01-01

    Monte Carlo-based modelling is a powerful tool to help in the design and optimization of positron emission tomography (PET) systems. The performance of these systems depends on several parameters, such as detector physical characteristics, shielding or electronics, whose effects can be studied on the basis of realistic simulated data. The aim of this paper is to validate a comprehensive study of the Raytest ClearPET small-animal PET scanner using a new Monte Carlo simulation platform which has been developed at CIEMAT (Madrid, Spain), called GAMOS (GEANT4-based Architecture for Medicine-Oriented Simulations). This toolkit, based on the GEANT4 code, was originally designed to cover multiple applications in the field of medical physics from radiotherapy to nuclear medicine, but has since been applied by some of its users in other fields of physics, such as neutron shielding, space physics, high energy physics, etc. Our simulation model includes the relevant characteristics of the ClearPET system, namely, the double layer of scintillator crystals in phoswich configuration, the rotating gantry, the presence of intrinsic radioactivity in the crystals and the storage of single events for off-line coincidence sorting. Simulated results are contrasted with experimental acquisitions including studies of spatial resolution, sensitivity, scatter fraction and count rates in accordance with the National Electrical Manufacturers Association (NEMA) NU 4-2008 protocol. Spatial resolution results showed a discrepancy between simulated and measured values equal to 8.4% (with a maximum FWHM difference over all measurement directions of 0.5 mm). Sensitivity results differ by less than 1% for a 250-750 keV energy window. 
Simulated and measured count rates agree well within a wide range of activities, including under electronic saturation of the system (the measured peak of total coincidences, for the mouse-sized phantom, was 250.8 kcps reached at 0.95 MBq mL-1 and the simulated peak was
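    The NEMA NU 4 spatial-resolution figures compared above are FWHMs of point-source profiles. A minimal sketch of extracting the FWHM from a sampled, single-peaked profile by linear interpolation (illustrative, not the authors' code):

```python
def fwhm(xs, ys):
    """Full width at half maximum of a sampled, single-peaked profile,
    using linear interpolation between the samples that bracket the
    half-maximum level on each side of the peak."""
    half = max(ys) / 2.0
    left = right = None
    for i in range(1, len(ys)):                 # rising edge
        if ys[i - 1] < half <= ys[i]:
            t = (half - ys[i - 1]) / (ys[i] - ys[i - 1])
            left = xs[i - 1] + t * (xs[i] - xs[i - 1])
            break
    for i in range(len(ys) - 1, 0, -1):         # falling edge
        if ys[i] < half <= ys[i - 1]:
            t = (half - ys[i]) / (ys[i - 1] - ys[i])
            right = xs[i] - t * (xs[i] - xs[i - 1])
            break
    if left is None or right is None:
        raise ValueError("half maximum not crossed on both sides")
    return right - left
```

    Applying this to simulated and measured profiles and comparing the two FWHMs gives the kind of percentage discrepancy reported above.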

  15. SDDL: Software Design Documentation Language

    NASA Technical Reports Server (NTRS)

    Kleine, H.; Callender, D. E.; Zepko, T. M.

    1985-01-01

    Promotes effective communications between software designer and user. SDDL successful on tasks ranging from small, one-person informal projects to large projects of hundreds of formally published pages of design.

  16. Use of GEANT4 vs. MCNPX for the characterization of a boron-lined neutron detector

    NASA Astrophysics Data System (ADS)

    van der Ende, B. M.; Atanackovic, J.; Erlandson, A.; Bentoumi, G.

    2016-06-01

    This work compares GEANT4 with MCNPX in the characterization of a boron-lined neutron detector. The neutron energy range simulated in this work (0.025 eV to 20 MeV) is the traditional domain of MCNP simulations. This paper addresses the question: how well can GEANT4 and MCNPX be employed for detailed thermal neutron detector characterization? To answer this, GEANT4 and MCNPX have been employed to simulate the detector response to a 252Cf energy spectrum point source, as well as to simulate mono-energetic parallel-beam source geometries. The 252Cf energy spectrum simulation results demonstrate agreement in detector count rate within 3% between the two packages, with the MCNPX results being generally closer to experiment than those from GEANT4. The mono-energetic source simulations demonstrate agreement in detector response within 5% between the two packages for all neutron energies, and within 1% for neutron energies between 100 eV and 5 MeV. Cross-checks between the two types of simulations using ISO-8529 252Cf energy bins demonstrate that MCNPX results are more self-consistent than GEANT4 results, by 3-4%.

  17. Geant4 simulation of the CERN-EU high-energy reference field (CERF) facility.

    PubMed

    Prokopovich, D A; Reinhard, M I; Cornelius, I M; Rosenfeld, A B

    2010-09-01

    The CERN-EU high-energy reference field facility is used for testing and calibrating both active and passive radiation dosemeters for radiation protection applications in space and aviation. Through a combination of a primary particle beam, a target and a suitably designed shielding configuration, the facility is able to reproduce the neutron component of the high-altitude radiation field relevant to the jet aviation industry. Simulations of the facility using the GEANT4 (GEometry ANd Tracking) toolkit provide an improved understanding of the neutron particle fluence, as well as of the particle fluence of the other radiation components present. The secondary particle fluence as a function of the primary particle fluence incident on the target, and the associated dose equivalent rates, were determined at the 20 designated irradiation positions available at the facility. Comparisons of the simulated results with previously published simulations obtained using the FLUKA Monte Carlo code, as well as with experimental results of the neutron fluence obtained with a Bonner sphere spectrometer, are made. PMID:20511404

  18. Designing Good Educational Software.

    ERIC Educational Resources Information Center

    Kingman, James C.

    1984-01-01

    Describes eight characteristics of good educational software. They are: (1) educational soundness; (2) ease of use; (3) "bullet" proofing (preventing a program from coming to a premature halt); (4) clear instructions; (5) appropriate language; (6) appropriate frame size; (7) motivation; and (8) evaluation. (JN)

  19. Application of GEANT4 radiation transport toolkit to dose calculations in anthropomorphic phantoms.

    PubMed

    Rodrigues, P; Trindade, A; Peralta, L; Alves, C; Chaves, A; Lopes, M C

    2004-12-01

    In this paper, we present a novel implementation of a dose calculation application based on the GEANT4 Monte Carlo toolkit. Validation studies were performed with a homogeneous water phantom and an Alderson-Rando anthropomorphic phantom, both irradiated with high-energy photon beams produced by a clinical linear accelerator. As input, this tool requires computed tomography images, for automatic codification of voxel-based geometries, and phase-space distributions, to characterize the incident radiation field. Simulation results were compared with ionization chamber and thermoluminescent dosimetry data and with commercial treatment planning system calculations. In the homogeneous water phantom, overall agreement with measurements was within 1-2%. For the simulated anthropomorphic setups (thorax and head irradiation), mean differences between GEANT4 and TLD measurements were less than 2%. Significant differences between GEANT4 and a semi-analytical algorithm implemented in the treatment planning system were found in low-density regions, such as air cavities with strong electronic disequilibrium. PMID:15388147

  20. Calculation of electron Dose Point Kernel in water with GEANT4 for medical application

    NASA Astrophysics Data System (ADS)

    Guimarães, C. C.; Moralles, M.; Sene, F. F.; Martinelli, J. R.; Okuno, E.

    2009-06-01

    The rapid insertion of new technologies into medical physics in recent years, especially in nuclear medicine, has been accompanied by the development of ever faster Monte Carlo algorithms. GEANT4 is a Monte Carlo toolkit that provides the tools needed to simulate particle transport through matter. In this work, GEANT4 was used to calculate the dose point kernel (DPK) for monoenergetic electrons in water, which is an important reference medium for nuclear medicine. The three different physical models of electromagnetic interactions provided by GEANT4—Low Energy, Penelope and Standard—were employed. To verify the adequacy of these models, the results were compared with references from the literature. For all energies and physical models, the agreement between calculated DPKs and reported values is satisfactory.
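    A dose point kernel of this kind is typically scored by binning the energy deposited around a point source into concentric spherical shells and dividing by each shell's mass. A minimal sketch of that scoring step (names and binning are illustrative, not the authors' code):

```python
import math


def dose_point_kernel(deposits, r_max_cm, n_bins, density_g_cm3=1.0):
    """Histogram point-source energy deposits into concentric spherical
    shells and convert each to a dose.

    deposits: iterable of (radius_cm, energy_mev) pairs.
    Returns a list of (shell_midpoint_cm, dose_mev_per_g).
    """
    dr = r_max_cm / n_bins
    energy = [0.0] * n_bins
    for r, e in deposits:
        i = int(r / dr)
        if i < n_bins:
            energy[i] += e
    kernel = []
    for i in range(n_bins):
        r_in, r_out = i * dr, (i + 1) * dr
        shell_mass = density_g_cm3 * 4.0 / 3.0 * math.pi * (r_out**3 - r_in**3)
        kernel.append(((r_in + r_out) / 2.0, energy[i] / shell_mass))
    return kernel
```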

  1. Dose conversion coefficients for ICRP110 voxel phantom in the Geant4 Monte Carlo code

    NASA Astrophysics Data System (ADS)

    Martins, M. C.; Cordeiro, T. P. V.; Silva, A. X.; Souza-Santos, D.; Queiroz-Filho, P. P.; Hunt, J. G.

    2014-02-01

    The reference adult male voxel phantom recommended by International Commission on Radiological Protection (ICRP) Publication 110 was implemented in the Geant4 Monte Carlo code. Geant4 was used to calculate dose conversion coefficients (DCCs), expressed as dose deposited in organs per air kerma, for photons, electrons and neutrons, as published in the Annals of the ICRP. In this work, the AP and PA irradiation geometries of the ICRP male phantom were simulated for the purpose of benchmarking the Geant4 code. Monoenergetic photons were simulated between 15 keV and 10 MeV, and the results were compared with ICRP 110, the VMC Monte Carlo code and the available literature data, showing good agreement.

  2. Geant4 electromagnetic physics for the LHC and other HEP applications

    NASA Astrophysics Data System (ADS)

    Schälicke, Andreas; Bagulya, Alexander; Dale, Ørjan; Dupertuis, Frederic; Ivanchenko, Vladimir; Kadri, Omrane; Lechner, Anton; Maire, Michel; Tsagri, Mary; Urban, Laszlo

    2011-12-01

    An overview of the electromagnetic (EM) physics models available in the Geant4 toolkit is presented. Recent improvements have focused on the performance of detector simulation, motivated by results from large MC production exercises at the LHC. Significant effort has been spent on high-statistics validation of the EM physics. Geant4 EM physics has been consolidated by providing common interfaces for the EM standard (HEP-oriented) and EM low-energy (other application domains) models. This allows the combination of ultra-relativistic, relativistic and low-energy models for any Geant4 EM process; with such a combination, both precision and CPU performance are achieved for the simulation of EM interactions over a wide energy range. Thanks to the migration of the EM low-energy models to the common interface, additional capabilities have become available. Selected validation results are presented in this contribution.

  3. Microdosimetry of the Auger electron emitting 123I radionuclide using Geant4-DNA simulations

    NASA Astrophysics Data System (ADS)

    Fourie, H.; Newman, R. T.; Slabbert, J. P.

    2015-04-01

    Microdosimetric calculations of the Auger electron emitter 123I were done in liquid water spheres using the Geant4 toolkit. The electron emission spectrum of 123I produced by Geant4 is presented. Energy deposition and the corresponding S-values were calculated to investigate the influence of the sub-cellular localization of the Auger emitter. It was found that S-values calculated by the Geant4 toolkit are generally lower than the values calculated by other Monte Carlo codes for the 123I radionuclide. The differences in the compared S-values are mainly due to the different particle emission spectra employed by the respective computational codes and emphasize the influence of these spectra on dosimetry calculations.
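    The S-values discussed here follow the standard MIRD formalism: the absorbed dose to a target per decay in the source, S = (1/m_t) Σ_i E_i Y_i φ_i, where E_i and Y_i are the energy and yield of each emission and φ_i its absorbed fraction in the target. A minimal sketch of that sum (function name and example inputs are illustrative):

```python
def s_value_mev_per_g(emissions, absorbed_fractions, target_mass_g):
    """MIRD-style S-value in MeV/g per decay.

    emissions: list of (energy_mev, yield_per_decay) for each emission.
    absorbed_fractions: fraction of each emission's energy absorbed in
    the target, in the same order as `emissions`.
    """
    total = 0.0
    for (e_mev, yield_per_decay), phi in zip(emissions, absorbed_fractions):
        total += e_mev * yield_per_decay * phi
    return total / target_mass_g
```

    The spectrum dependence noted in the abstract enters through `emissions`: two codes with different 123I electron spectra will produce different S-values even with identical transport.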

  4. Microdosimetry of the Auger electron emitting 123I radionuclide using Geant4-DNA simulations.

    PubMed

    Fourie, H; Newman, R T; Slabbert, J P

    2015-04-21

    Microdosimetric calculations of the Auger electron emitter (123)I were done in liquid water spheres using the Geant4 toolkit. The electron emission spectrum of (123)I produced by Geant4 is presented. Energy deposition and the corresponding S-values were calculated to investigate the influence of the sub-cellular localization of the Auger emitter. It was found that S-values calculated by the Geant4 toolkit are generally lower than the values calculated by other Monte Carlo codes for the (123)I radionuclide. The differences in the compared S-values are mainly due to the different particle emission spectra employed by the respective computational codes and emphasize the influence of these spectra on dosimetry calculations. PMID:25825914

  5. Geant4 simulation of the response of phosphor screens for X-ray imaging

    NASA Astrophysics Data System (ADS)

    Pistrui-Maximean, S. A.; Freud, N.; Létang, J. M.; Koch, A.; Munier, B.; Walenta, A. H.; Montarou, G.; Babot, D.

    2006-07-01

    In order to predict and optimize the response of phosphor screens, it is important to understand the role played by the different physical processes inside the scintillator layer. A simulation model based on the Monte Carlo code Geant4 was developed to determine the Modulation Transfer Function (MTF) of phosphor screens at energies used in X-ray medical imaging and nondestructive testing applications. The visualization of the dose distribution inside the phosphor layer gives an insight into how the MTF is progressively degraded by X-ray and electron transport. The simulation model makes it possible to study the influence of physical and technological parameters on detector performance, as well as to design and optimize new detector configurations. Preliminary MTF measurements have been carried out, and agreement with experimental data has been found in the case of a commercial screen (Kodak Lanex Fine) at an X-ray tube potential of 100 kV. Further validation with other screens (transparent or granular) at different energies is under way.
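    The MTF determined in such simulations is the normalised magnitude of the Fourier transform of the line spread function (LSF). A minimal, self-contained sketch using a naive DFT (illustrative, not the authors' code):

```python
import math


def mtf_from_lsf(lsf):
    """Modulation transfer function as the normalised magnitude of the
    discrete Fourier transform of a sampled line spread function."""
    n = len(lsf)
    mags = []
    for k in range(n):
        re = sum(lsf[j] * math.cos(2 * math.pi * k * j / n) for j in range(n))
        im = -sum(lsf[j] * math.sin(2 * math.pi * k * j / n) for j in range(n))
        mags.append(math.hypot(re, im))
    return [m / mags[0] for m in mags]   # normalise to MTF(0) = 1
```

    A delta-like LSF gives a flat MTF of 1 at all frequencies, while a broad LSF (a blurred screen) suppresses the high-frequency components, which is exactly the degradation the dose-distribution visualisation above helps to explain.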

  6. Applying Software Design Methodology to Instructional Design

    NASA Astrophysics Data System (ADS)

    East, J. Philip

    2004-12-01

    The premise of this paper is that computer science has much to offer the endeavor of instructional improvement. Software design processes employed in computer science for developing software can be used for planning instruction and should improve instruction in much the same manner that design processes appear to have improved software. Techniques for examining the software development process can be applied to an examination of the instructional process. Furthermore, the computer science discipline is particularly well suited to these tasks. Thus, computer science can develop instructional design expertise for export to other disciplines to improve education in all disciplines and, eventually, at all levels.

  7. Modeling the relativistic runaway electron avalanche and the feedback mechanism with GEANT4

    PubMed Central

    Skeltved, Alexander Broberg; Østgaard, Nikolai; Carlson, Brant; Gjesteland, Thomas; Celestin, Sebastien

    2014-01-01

    This paper presents the first study that uses the GEometry ANd Tracking 4 (GEANT4) toolkit to do quantitative comparisons with other modeling results related to the production of terrestrial gamma ray flashes and high-energy particle emission from thunderstorms. We will study the relativistic runaway electron avalanche (RREA) and the relativistic feedback process, as well as the production of bremsstrahlung photons from runaway electrons. The Monte Carlo simulations take into account the effects of electron ionization, electron by electron (Møller), and electron by positron (Bhabha) scattering as well as the bremsstrahlung process and pair production, in the 250 eV to 100 GeV energy range. Our results indicate that the multiplication of electrons during the development of RREAs and under the influence of feedback are consistent with previous estimates. This is important to validate GEANT4 as a tool to model RREAs and feedback in homogeneous electric fields. We also determine the ratio of bremsstrahlung photons to energetic electrons Nγ/Ne. We then show that the ratio has a dependence on the electric field, which can be expressed by the avalanche time τ(E) and the bremsstrahlung coefficient α(ε). In addition, we present comparisons of GEANT4 simulations performed with a “standard” and a “low-energy” physics list both validated in the 1 keV to 100 GeV energy range. This comparison shows that the choice of physics list used in GEANT4 simulations has a significant effect on the results. Key points: testing the feedback mechanism with GEANT4; validating the GEANT4 programming toolkit; studying the ratio of bremsstrahlung photons to electrons at TGF source altitude. PMID:26167437
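    The avalanche multiplication underlying these results is exponential in the avalanche time τ(E) noted in the abstract: N(t) = N0·exp(t/τ). A minimal sketch of this growth law (function name illustrative; τ itself must come from simulation or theory for a given field):

```python
import math


def rrea_electrons(n0, t_s, tau_s):
    """Runaway electron count after time t_s for avalanche time tau_s,
    following the exponential RREA growth law N(t) = N0 * exp(t / tau)."""
    return n0 * math.exp(t_s / tau_s)
```

    Because the photon-to-electron ratio Nγ/Ne depends on τ(E) together with the bremsstrahlung coefficient, getting τ right in simulation is a prerequisite for predicting the photon yield at the TGF source.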

  8. Response of a proportional counter to 37Ar and 71Ge: Measured spectra versus Geant4 simulation

    NASA Astrophysics Data System (ADS)

    Abdurashitov, D. N.; Malyshkin, Yu. M.; Matushko, V. L.; Suerfu, B.

    2016-04-01

    The energy deposition spectra of 37Ar and 71Ge in a miniature proportional counter are measured and compared in detail to the model response simulated with Geant4. A certain modification of the Geant4 code, making it possible to trace the deexcitation of atomic shells properly, is suggested. Modified Geant4 is able to reproduce a response of particle detectors in detail in the keV energy range. This feature is very important for the laboratory experiments that search for massive sterile neutrinos as well as for dark matter searches that employ direct detection of recoil nuclei. This work demonstrates the reliability of Geant4 simulation at low energies.

  9. Software design and documentation language

    NASA Technical Reports Server (NTRS)

    Kleine, H.

    1980-01-01

    Language supports design and documentation of complex software. Included are: a design and documentation language for expressing design concepts; a processor that produces intelligible documentation based on design specifications; and a methodology for using the language and processor to create well-structured top-down programs and documentation. Processor is written in the SIMSCRIPT II.5 programming language for use on UNIVAC, IBM, and CDC machines.

  10. SU-E-T-565: RAdiation Resistance of Cancer CElls Using GEANT4 DNA: RACE

    SciTech Connect

    Perrot, Y; Payno, H; Delage, E; Maigne, L

    2014-06-01

    Purpose: The objective of the RACE project is to develop a comparison between Monte Carlo simulation using the Geant4-DNA toolkit and measurements of radiation damage on 3D melanoma and chondrosarcoma culture cells coupled with gadolinium nanoparticles. We present here the status of the developments regarding simulations. Methods: Monte Carlo studies are carried out using the Geant4 toolkit and the Geant4-DNA extension. In order to model the geometry of a cell population, the open-source CPOP++ program is being developed for the geometrical representation of 3D cell populations, including a specific cell mesh coupled with a multi-agent system. Each cell includes a cytoplasm and a nucleus. The correct modeling of the cell population has been validated against confocal microscopy images of spheroids. The Geant4 Livermore physics models are used to simulate the interactions of a 250 keV X-ray beam and the production of secondaries from gadolinium nanoparticles assumed to be fixed on the cell membranes. Geant4-DNA processes are used to simulate the interactions of charged particles with the cells. An atomistic description of the DNA molecule, from PDB (Protein Data Bank) files, is provided by the PDB4DNA Geant4 user application we developed to score energy depositions in DNA base pairs and sugar-phosphate groups. Results: At the microscopic level, our simulations enable assessing the microscopic energy distribution in each cell compartment of a realistic 3D cell population. Dose enhancement factors due to the presence of gadolinium nanoparticles can be estimated. At the nanometer scale, direct damage to nuclear DNA is also estimated. Conclusion: We successfully simulated the impact of direct radiation on a realistic 3D cell population model compatible with microdosimetry calculations using the Geant4-DNA toolkit. Upcoming validation and the future integration of the radiochemistry module of Geant4-DNA will make it possible to correlate clusters of ionizations with in vitro

  11. A Student Project to use Geant4 Simulations for a TMS-PET combination

    SciTech Connect

    Altamirano, A.; Chamorro, A.; Hurtado, K.; Romero, C.; Wahl, D.; Zamudio, A.; Rueda, A.; Solano Salinas, C. J.

    2007-10-26

    Geant4 is one of the most powerful tools for MC simulation of detectors and their applications. We present a student project to simulate a combined Transcranial Magnetic Stimulation-Positron Emission Tomography (TMS-PET) system using Geant4. This project aims to study PET-TMS systems by implementing a model for the brain response to the TMS pulse and studying the simulated PET response. In order to increase the speed of the simulations we parallelise our programs and investigate the possibility of using GRID computing.

  12. Echelle spectrograph software design aid

    NASA Technical Reports Server (NTRS)

    Dantzler, A. A.

    1985-01-01

    A method for mapping, to first order, the spectrograms that result from echelle spectrographic systems is discussed. An in-depth description of the principles behind the method are given so that software may be generated. Such software is an invaluable echelle spectrograph design aid. Results from two applications are discussed.

  13. Reflight certification software design specifications

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The PDSS/IMC Software Design Specification for the Payload Development Support System (PDSS)/Image Motion Compensator (IMC) is contained. The PDSS/IMC is to be used for checkout and verification of the IMC flight hardware and software by NASA/MSFC.

  14. Measurement of depth-dose of linear accelerator and simulation by use of Geant4 computer code

    PubMed Central

    Sardari, D.; Maleki, R.; Samavat, H.; Esmaeeli, A.

    2010-01-01

    Radiation therapy is an established method of cancer treatment. New technologies in cancer radiotherapy need a more accurate computation of the dose delivered in the radiotherapy treatment plan. This study presents some results of a Geant4-based application for simulation of the absorbed dose distribution given by a medical linear accelerator (LINAC). The LINAC geometry is accurately described in the Monte Carlo code with use of the accelerator manufacturer's specifications. The capability of the software for evaluating the dose distribution has been verified by comparisons with measurements in a water phantom; the comparisons were performed for percentage depth dose (PDD) and profiles for various field sizes and depths, for a 6-MV electron beam. Experimental and calculated dose values were in good agreement both in PDD and in transverse sections of the water phantom. PMID:24376926
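    The percentage depth dose (PDD) compared above follows the standard definition: the on-axis depth-dose curve normalised to 100% at the depth of maximum dose. A minimal sketch (function name illustrative):

```python
def percentage_depth_dose(doses_by_depth):
    """Convert an on-axis depth-dose curve to PDD: 100 * D(d) / D(d_max),
    where d_max is the depth of maximum dose."""
    d_max_dose = max(doses_by_depth)
    return [100.0 * d / d_max_dose for d in doses_by_depth]
```

    Comparing the measured and simulated curves after this normalisation removes any overall dose-calibration offset, so only the shape of the depth-dose distribution is tested.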

  15. Simulating cosmic radiation absorption and secondary particle production of solar panel layers of Low Earth Orbit (LEO) satellite with GEANT4

    NASA Astrophysics Data System (ADS)

    Yiǧitoǧlu, Merve; Veske, Doǧa; Nilüfer Öztürk, Zeynep; Bilge Demirköz, Melahat

    2016-07-01

    All devices which operate in space are exposed to cosmic rays during their operation. The resulting radiation may cause fatal damage in the solid-state structure of the devices, so the absorbed radiation dose and the secondary particle production should be calculated carefully for each component before production. Solar panels are semiconductor solid-state devices and are very sensitive to radiation. Even a short-term power cut-off may lead to a total failure of the satellite, and even small doses of radiation can change the characteristics of solar cells. This deviation can be caused by rarer, highly energetic particles as well as by the total ionizing dose from the abundant low-energy particles. In this study, the solar panels planned for a specific LEO satellite, IMECE, are analyzed layer by layer. The Space Environment Information System (SPENVIS) database and the GEANT4 simulation software are used to simulate the layers of the panels. The results obtained from the simulation will be taken into account to determine the amount of radiation protection and resistance needed for the panels, or to revise the design of the panels.

  16. Software design and documentation language

    NASA Technical Reports Server (NTRS)

    Kleine, H.

    1977-01-01

    A communications medium to support the design and documentation of complex software applications is studied. The medium also provides the following: (1) a processor which can convert design specifications into an intelligible, informative machine reproducible document; (2) a design and documentation language with forms and syntax that are simple, unrestrictive, and communicative; and (3) methodology for effective use of the language and processor.

  17. CMS validation experience: Test-beam 2004 data vs GEANT4

    SciTech Connect

    Piperov, Stefan; /Fermilab /Sofiya, Inst. Nucl. Res.

    2007-01-01

    A comparison between the Geant4 Monte-Carlo simulation of CMS Detector's Calorimetric System and data from the 2004 Test-Beam at CERN's SPS H2 beam-line is presented. The overall simulated response agrees quite well with the measured response. Slight differences in the longitudinal shower profiles between the MC predictions made with different Physics Lists are observed.

  18. Modeling the relativistic runaway electron avalanche and the feedback mechanism with GEANT4

    NASA Astrophysics Data System (ADS)

    Skeltved, Alexander Broberg; Østgaard, Nikolai; Carlson, Brant; Gjesteland, Thomas; Celestin, Sebastien

    2014-11-01

    This paper presents the first study that uses the GEometry ANd Tracking 4 (GEANT4) toolkit to make quantitative comparisons with other modeling results related to the production of terrestrial gamma ray flashes and high-energy particle emission from thunderstorms. We study the relativistic runaway electron avalanche (RREA) and the relativistic feedback process, as well as the production of bremsstrahlung photons from runaway electrons. The Monte Carlo simulations take into account the effects of electron ionization, electron-electron (Møller) scattering, and electron-positron (Bhabha) scattering, as well as the bremsstrahlung process and pair production, in the 250 eV to 100 GeV energy range. Our results indicate that the multiplication of electrons during the development of RREAs, and under the influence of feedback, is consistent with previous estimates. This is important for validating GEANT4 as a tool to model RREAs and feedback in homogeneous electric fields. We also determine the ratio of bremsstrahlung photons to energetic electrons Nγ/Ne. We then show that this ratio depends on the electric field, a dependence that can be expressed through the avalanche time τ(E) and the bremsstrahlung coefficient α(ɛ). In addition, we present comparisons of GEANT4 simulations performed with a "standard" and a "low-energy" physics list, both validated in the 1 keV to 100 GeV energy range. This comparison shows that the choice of physics list used in GEANT4 simulations has a significant effect on the results.
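The exponential multiplication at the heart of an RREA can be sketched with a toy calculation. The seed count, region length, and avalanche (e-folding) length below are illustrative assumptions, not values from the paper:

```python
import math

def rrea_multiplication(n0, distance_m, avalanche_length_m):
    """Exponential runaway-electron multiplication N(z) = N0 * exp(z / lambda).

    The avalanche (e-folding) length lambda depends on the electric field;
    the value used below is purely illustrative.
    """
    return n0 * math.exp(distance_m / avalanche_length_m)

# Illustrative numbers only: 100 seed electrons, a 1 km high-field region,
# and an assumed avalanche length of 100 m.
seeds = 100
n_final = rrea_multiplication(seeds, 1000.0, 100.0)
print(f"multiplication factor: {n_final / seeds:.1f}")  # e^10, about 22026
```

Comparing such an exponential fit against the simulated electron count is one way the multiplication consistency mentioned above can be checked.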

  19. Identifying key surface parameters for optical photon transport in GEANT4/GATE simulations.

    PubMed

    Nilsson, Jenny; Cuplov, Vesna; Isaksson, Mats

    2015-09-01

    For a scintillator used for spectrometry, the generation, transport and detection of optical photons have a great impact on the energy spectrum resolution. A complete Monte Carlo model of a scintillator includes a coupled ionizing particle and optical photon transport, which can be simulated with the GEANT4 code. The GEANT4 surface parameters control the physics processes an optical photon undergoes when reaching the surface of a volume. In this work the impact of each surface parameter on the optical transport was studied by looking at the optical spectrum: the number of detected optical photons per ionizing source particle from a large plastic scintillator, i.e. the output signal. All simulations were performed using GATE v6.2 (GEANT4 Application for Tomographic Emission). The surface parameter finish (polished, ground, front-painted or back-painted) showed the greatest impact on the optical spectrum, whereas the surface parameter σ(α), which controls the surface roughness, had a relatively small impact. It was also shown how the surface parameters reflectivity and reflectivity type (specular spike, specular lobe, Lambertian and backscatter) changed the optical spectrum depending on the probability for reflection and the combination of reflectivity types. A change in the optical spectrum will ultimately have an impact on a simulated energy spectrum. By studying the optical spectra presented in this work, a GEANT4 user can predict the shift in an optical spectrum caused by the alteration of a specific surface parameter. PMID:26046519
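The qualitative effect of the reflectivity type can be illustrated with a toy reflection model. This is a sketch with an assumed geometry and detector acceptance, not the GEANT4 UNIFIED surface model; the function name and parameters are hypothetical:

```python
import math, random

def detected_fraction(finish, n=200_000, half_angle_deg=10.0, seed=1):
    """Toy model: photons hit a surface at normal incidence and reflect.

    'specular_spike' reflects straight back along the normal; 'lambertian'
    samples the reflected polar angle from a cosine distribution
    (p(theta) ~ cos(theta)sin(theta), so theta = asin(sqrt(u))).  We count
    the fraction of reflected photons falling inside a detector cone of the
    given half-angle around the normal.  Illustrative only.
    """
    rng = random.Random(seed)
    cut = math.radians(half_angle_deg)
    hits = 0
    for _ in range(n):
        if finish == "specular_spike":
            theta = 0.0                              # mirror-like: straight back
        else:                                        # Lambertian diffuse reflection
            theta = math.asin(math.sqrt(rng.random()))
        if theta <= cut:
            hits += 1
    return hits / n

print(detected_fraction("specular_spike"))   # 1.0
print(detected_fraction("lambertian"))       # ~ sin^2(10 deg), about 0.03
```

Even this crude model shows why the choice of reflection type can reshape the number of photons reaching a readout device, and hence the optical spectrum.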

  20. Simulating response functions and pulse shape discrimination for organic scintillation detectors with Geant4

    NASA Astrophysics Data System (ADS)

    Hartwig, Zachary S.; Gumplinger, Peter

    2014-02-01

    We present new capabilities of the Geant4 toolkit that enable the precision simulation of organic scintillation detectors within a comprehensive Monte Carlo code for the first time. As of version 10.0-beta, the Geant4 toolkit models the data-driven photon production from any user-defined scintillator, photon transportation through arbitrarily complex detector geometries, and time-resolved photon detection at the light readout device. By fully specifying the optical properties and geometrical configuration of the detector, the user can simulate response functions, photon transit times, and pulse shape discrimination. These capabilities enable detector simulation within a larger experimental environment as well as computationally evaluating novel scintillators, detector geometry, and light readout configurations. We demonstrate agreement of Geant4 with the NRESP7 code and with experiments for the spectroscopy of neutrons and gammas in the ranges 0-20 MeV and 0.511-1.274 MeV, respectively, using EJ301-based organic scintillation detectors. We also show agreement between Geant4 and experimental modeling of the particle-dependent detector pulses that enable simulated pulse shape discrimination.

  1. Applications of the Monte Carlo method in nuclear physics using the GEANT4 toolkit

    SciTech Connect

    Moralles, Mauricio; Guimaraes, Carla C.; Menezes, Mario O.; Bonifacio, Daniel A. B.; Okuno, Emico; Guimaraes, Valdir; Murata, Helio M.; Bottaro, Marcio

    2009-06-03

    The capabilities of personal computers allow the application of Monte Carlo methods to simulate very complex problems involving the transport of particles through matter. Among the several codes commonly employed in nuclear physics problems, GEANT4 has received great attention in recent years, mainly due to its flexibility and the possibility of being improved by its users. Unlike other Monte Carlo codes, GEANT4 is a toolkit written in an object-oriented language (C++) that includes the mathematical engine of several physical processes, suitable for the transport of practically all types of particles and heavy ions. GEANT4 also has several tools to define materials, geometry, sources of radiation, beams of particles, electromagnetic fields, and graphical visualization of the experimental setup. After a brief description of the GEANT4 toolkit, this presentation reports investigations carried out by our group involving simulations in the areas of dosimetry, nuclear instrumentation and medical physics. The physical processes available for photons, electrons, positrons and heavy ions were used in these simulations.

  2. Applications of the Monte Carlo method in nuclear physics using the GEANT4 toolkit

    NASA Astrophysics Data System (ADS)

    Moralles, Maurício; Guimarães, Carla C.; Bonifácio, Daniel A. B.; Okuno, Emico; Murata, Hélio M.; Bottaro, Márcio; Menezes, Mário O.; Guimarães, Valdir

    2009-06-01

    The capabilities of personal computers allow the application of Monte Carlo methods to simulate very complex problems involving the transport of particles through matter. Among the several codes commonly employed in nuclear physics problems, GEANT4 has received great attention in recent years, mainly due to its flexibility and the possibility of being improved by its users. Unlike other Monte Carlo codes, GEANT4 is a toolkit written in an object-oriented language (C++) that includes the mathematical engine of several physical processes, suitable for the transport of practically all types of particles and heavy ions. GEANT4 also has several tools to define materials, geometry, sources of radiation, beams of particles, electromagnetic fields, and graphical visualization of the experimental setup. After a brief description of the GEANT4 toolkit, this presentation reports investigations carried out by our group involving simulations in the areas of dosimetry, nuclear instrumentation and medical physics. The physical processes available for photons, electrons, positrons and heavy ions were used in these simulations.

  3. Software design by reusing architectures

    NASA Technical Reports Server (NTRS)

    Bhansali, Sanjay; Nii, H. Penny

    1992-01-01

    Abstraction fosters reuse by providing a class of artifacts that can be instantiated or customized to produce a set of artifacts meeting different specific requirements. It is proposed that significant leverage can be obtained by abstracting software system designs and the design process. The result of such an abstraction is a generic architecture and a set of knowledge-based customization tools that can be used to instantiate the generic architecture. An approach for designing software systems based on this idea is described. The approach is illustrated through an implemented example, and its advantages and limitations are discussed.

  4. Shuttle mission simulator software conceptual design

    NASA Technical Reports Server (NTRS)

    Burke, J. F.

    1973-01-01

    Software conceptual designs (SCD) are presented for meeting the simulator requirements for the shuttle missions. The major areas of the SCD discussed include: malfunction insertion, flight software, applications software, systems software, and computer complex.

  5. The simulation of the LANFOS-H food radiation contamination detector using Geant4 package

    NASA Astrophysics Data System (ADS)

    Piotrowski, Lech Wiktor; Casolino, Marco; Ebisuzaki, Toshikazu; Higashide, Kazuhiro

    2015-02-01

    The recent incident at the Fukushima power plant caused growing concern about radiation contamination and resulted in lowering the Japanese limit for the permitted amount of 137Cs in food to 100 Bq/kg. To increase safety and ease this concern we are developing LANFOS (Large Food Non-destructive Area Sampler), a compact, easy-to-use detector for the assessment of radiation in food. The LANFOS-H described in this paper has 4π coverage to assess the amount of 137Cs present, separating it from possible 40K contamination. Food samples therefore do not have to be pre-processed prior to a test and can be consumed after measurement. It is designed for use by non-professionals in homes and small institutions such as schools, indicating the safety of the samples, but it can also be used by specialists, as it provides a radiation spectrum. Proper assessment of radiation in food requires estimation of the γ conversion factor of the detectors, i.e. how many γ photons will produce a signal. In this paper we show results of a Monte Carlo estimation of this factor for various approximated shapes of fish, vegetables and amounts of rice, performed with the Geant4 package. We find that the conversion factor combined from all the detectors is similar for all food types, around 37%, varying by at most 5% with sample length, much less than for the individual detectors. Different inclinations and positions of samples in the detector introduce an uncertainty of 1.4%. This small uncertainty validates the concept of a 4π non-destructive apparatus.
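As a rough sketch of what such a conversion factor means, the toy Monte Carlo below counts the fraction of emitted photons that both enter a scintillator and interact in it. The attenuation coefficient, thickness, and coverage are illustrative assumptions, not the LANFOS-H values:

```python
import math, random

def conversion_factor(mu_cm, thickness_cm, coverage=1.0, n=100_000, seed=2):
    """Toy estimate of a gamma 'conversion factor': the fraction of emitted
    photons that produce a signal.  A photon counts as detected if it enters
    the scintillator (geometric coverage) and then interacts in it, with
    interaction probability 1 - exp(-mu * t)."""
    rng = random.Random(seed)
    p_interact = 1.0 - math.exp(-mu_cm * thickness_cm)
    detected = 0
    for _ in range(n):
        if rng.random() < coverage and rng.random() < p_interact:
            detected += 1
    return detected / n

# Assumed inputs: mu = 0.06 /cm (plastic scintillator near 662 keV),
# 10 cm thickness, near-4pi geometric coverage of 95%.
print(f"{conversion_factor(0.06, 10.0, coverage=0.95):.3f}")
```

A full Geant4 estimate replaces the two probabilities with tracked geometry and cross sections, which is why sample shape enters the paper's result.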

  6. The GEANT4 toolkit capability in the hadron therapy field: simulation of a transport beam line

    NASA Astrophysics Data System (ADS)

    Cirrone, G. A. P.; Cuttone, G.; Di Rosa, F.; Raffaele, L.; Russo, G.; Guatelli, S.; Pia, M. G.

    2006-01-01

    At the Laboratori Nazionali del Sud of the Istituto Nazionale di Fisica Nucleare in Catania (Sicily, Italy), the first Italian hadron therapy facility, named CATANA (Centro di AdroTerapia ed Applicazioni Nucleari Avanzate), has been built. Inside CATANA, 62 MeV proton beams accelerated by a superconducting cyclotron are used for the radiotherapeutic treatment of some types of ocular tumours. Therapy with hadron beams is still a pioneering technique, and only a few centers worldwide can provide this advanced specialized cancer treatment. On the basis of the experience gained so far, and considering the future hadron-therapy facilities to be developed (Rinecker, Munich, Germany; Heidelberg/GSI, Darmstadt, Germany; PSI, Villigen, Switzerland; CNAO, Pavia, Italy; Centro di Adroterapia, Catania, Italy), we decided to develop a Monte Carlo application based on the GEANT4 toolkit for the design, realization and optimization of a proton-therapy beam line. Another goal of our project is to provide a general tool able to study the interactions of hadrons with human tissue and to test the analytically based treatment planning systems currently used in routine practice. All the typical elements of a hadron-therapy line, such as diffusers, range shifters, collimators and detectors, were modelled. In particular, we simulated a Markus-type ionization chamber and a GafChromic film as dosimeters to reconstruct the depth dose (Bragg peak and spread-out Bragg peak) and lateral dose distributions, respectively. We validated our simulated detectors by comparing the results with the experimental data available in our facility.

  7. Structural Analysis and Design Software

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Collier Research and Development Corporation received a one-of-a-kind computer code for designing exotic hypersonic aircraft, called ST-SIZE, in the first-ever Langley Research Center software copyright license agreement. Collier transformed the NASA computer code into a commercial software package called HyperSizer, which integrates with private-sector Finite Element Modeling and Finite Element Analysis structural analysis programs. ST-SIZE was chiefly conceived as a means to improve and speed up the structural design of a future aerospace plane for the Langley Hypersonic Vehicles Office. Incorporating the NASA computer code into HyperSizer has enabled the company to apply the software to applications beyond aerospace, including improved design and construction for offices, marine structures, cargo containers, commercial and military aircraft, rail cars, and a host of everyday consumer products.

  8. Software Design for Smile Analysis

    PubMed Central

    Sodagar, A.; Rafatjoo, R.; Gholami Borujeni, D.; Noroozi, H.; Sarkhosh, A.

    2010-01-01

    Introduction: Esthetics and attractiveness of the smile is one of the major demands in contemporary orthodontic treatment. In order to improve a smile design, it is necessary to record a “posed smile” as an intentional, non-pressured, static, natural and reproducible smile. The record should then be analyzed to determine its characteristics. In this study, we designed and introduce software to analyze the smile rapidly and precisely in order to produce an attractive smile for the patients. Materials and Methods: For this purpose, a practical study was performed to design the multimedia software “Smile Analysis”, which can receive patients’ photographs and videographs. After loading records into the software, the operator marks the points and lines displayed in the system’s guide and defines the correct scale for each image. Thirty-three variables are measured by the software and displayed on the report page. Reliability of measurements in both image and video was significantly high (α=0.7–1). Results: In order to evaluate intra-operator and inter-operator reliability, five cases were selected randomly. Statistical analysis showed that calculations performed by the Smile Analysis software were both valid and highly reliable (for both video and photo). Conclusion: The results obtained from smile analysis could be used in diagnosis, treatment planning and evaluation of treatment progress. PMID:21998792

  9. Development of a Geant4 based Monte Carlo Algorithm to evaluate the MONACO VMAT treatment accuracy.

    PubMed

    Fleckenstein, Jens; Jahnke, Lennart; Lohr, Frank; Wenz, Frederik; Hesser, Jürgen

    2013-02-01

    A method is presented to evaluate the dosimetric accuracy of volumetric modulated arc therapy (VMAT) treatment plans generated with the MONACO™ (version 3.0) treatment planning system in realistic CT data, using an independent Geant4-based dose calculation algorithm. For this purpose a model of an Elekta Synergy linear accelerator treatment head with an MLCi2 multileaf collimator was implemented in Geant4. The time-dependent linear accelerator components were modeled by importing either log files of an actual plan delivery or a DICOM-RT plan sequence. Absolute dose calibration, based on a reference measurement, was applied. Both the MONACO and the Geant4 treatment head models were commissioned with lateral profiles and depth dose curves of square fields in water and with film measurements in inhomogeneous phantoms. A VMAT treatment plan for a patient with a thoracic tumor and a VMAT treatment plan of a patient who received treatment in the thoracic spine region, including metallic implants, were used for evaluation. For both MONACO and Geant4, depth dose curves and lateral profiles of square fields met a mean local gamma (2%, 2 mm) tolerance criterion in more than 95% of points for all fields. Film measurements in inhomogeneous phantoms with a global gamma of (3%, 3 mm) showed a pass rate above 95% in all voxels receiving more than 25% of the maximum dose. A dose-volume-histogram comparison of the VMAT patient treatment plans showed mean deviations between Geant4 and MONACO of -0.2% (first patient) and 2.0% (second patient) for the PTVs, and (0.5±1.0)% and (1.4±1.1)% for the organs at risk, in relation to the prescription dose. The presented method can be used to validate VMAT dose distributions generated by a large number of small segments in regions with high electron-density gradients. The MONACO dose distributions showed good agreement with Geant4 and film measurements within the simulation and measurement errors. PMID:22921843
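The gamma (dose-difference / distance-to-agreement) criterion used in such comparisons can be sketched in 1D. This brute-force version is illustrative, not a clinical implementation; the sample curves are made up:

```python
import math

def gamma_index_1d(ref, evalu, spacing_mm, dose_tol=0.02, dist_tol_mm=2.0):
    """Minimal 1D global gamma analysis: dose tolerance as a fraction of the
    reference maximum, distance-to-agreement in mm.  Returns one gamma value
    per reference point; a point passes if gamma <= 1."""
    d_max = max(ref)
    gammas = []
    for i, dr in enumerate(ref):
        best = float("inf")
        for j, de in enumerate(evalu):
            dd = (de - dr) / (dose_tol * d_max)          # dose difference term
            dx = (j - i) * spacing_mm / dist_tol_mm      # distance term
            best = min(best, math.hypot(dd, dx))
        gammas.append(best)
    return gammas

# Hypothetical reference (planned) and evaluated (simulated) dose samples
ref  = [0.0, 50.0, 100.0, 80.0, 60.0]
meas = [0.0, 51.0, 100.0, 79.0, 61.0]
g = gamma_index_1d(ref, meas, spacing_mm=1.0)
pass_rate = sum(v <= 1.0 for v in g) / len(g)
print(f"pass rate: {pass_rate:.0%}")
```

The pass rates quoted in the abstract are the 2D/3D analogues of this per-point test, evaluated over profiles or voxel grids.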

  10. Managing Software Design and Design Changes

    NASA Technical Reports Server (NTRS)

    Loesh, R. E.

    1985-01-01

    A microprocessor-based system for document production, work scheduling, change control, and management information aids in the design, development, and control of software. Its main components are a Z80 microprocessor, floppy-disk and hard-disk drives, and a character printer. The system is linked to a large computer. The major software components are the control program monitor (CP/M), a text-editing and word-processing system, a work-breakdown-schedule processor, and a database management tool.

  11. FIB Microfabrication Software Design Considerations

    NASA Astrophysics Data System (ADS)

    Thompson, W.; Bowe, T.; Morlock, S.; Moskowitz, A.; Plourde, G.; Spaulding, G.; Scialdone, C.; Tsiang, E.

    1986-06-01

    Profit margins on high-volume ICs, such as the 256-K DRAM, are now inadequate. U.S. and foreign manufacturers cannot fully recover the ICs' engineering costs before a new round of product competition begins. Consequently, some semiconductor manufacturers are seeking less competitive designs with healthier, longer-lasting profitability. These designs must be converted quickly from CAD to functional circuits in order for profits to be realized. For ultrahigh-performance devices, customized circuits, and rapid verification of design, FIB (focused ion beam) systems provide a viable alternative to the lengthy process of producing a large mask set. Early models of FIB equipment did not require sophisticated software. However, as FIB technology approaches adolescence, it must be supported by software that gives the user a friendly system, the flexibility to design a wide variety of circuits, and good growth potential for tomorrow's ICs. Presented here is an overview of IBT's MicroFocus 150 hardware, followed by descriptions of several MicroFocus software modules. Data preparation techniques from IBCAD formats to chip layout are compared to the more conventional lithographies. The MicroFocus 150 schemes for user interfacing, error logging, calibration, and subsystem control are given. The MicroFocus's pattern generator and bit-slice software are explained. IBT's FIB patterning algorithms, which allow the fabrication of unique device types, are reviewed.

  12. Mass attenuation coefficients of composite materials by Geant4, XCOM and experimental data: comparative study

    NASA Astrophysics Data System (ADS)

    Medhat, M. E.; Singh, V. P.

    2014-09-01

    The main goal of the present study is to test the applicability of Geant4 electromagnetic models for studying mass attenuation coefficients of different types of composite materials at photon energies of 59.5, 80, 356, 661.6, 1173.2 and 1332.5 keV. The simulated mass attenuation coefficients were compared with the experimental and theoretical XCOM data for the same samples, and good agreement was observed. The results indicate that this procedure can be followed to determine gamma-ray attenuation data at several energies in different materials. The modeling of photon interaction parameters was standard for any type of composite sample. The Geant4 code can thus be used to obtain gamma-ray attenuation coefficients for samples at different energies, which may sometimes be impractical to determine experimentally.
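The quantity being compared can be recovered from a narrow-beam transmission simulation (or measurement) via the Beer-Lambert law. The counts, density, and thickness below are illustrative assumptions:

```python
import math

def mass_attenuation(i0, i, density_g_cm3, thickness_cm):
    """Recover the mass attenuation coefficient (cm^2/g) from a narrow-beam
    transmission result using Beer-Lambert: I = I0 * exp(-(mu/rho) * rho * t).
    i0 and i are the incident and transmitted counts."""
    return math.log(i0 / i) / (density_g_cm3 * thickness_cm)

# Illustrative numbers: 1e6 incident photons, 367,879 transmitted through
# 2 cm of a rho = 2.5 g/cm^3 composite.
print(f"{mass_attenuation(1_000_000, 367_879, 2.5, 2.0):.4f} cm^2/g")
```

In a Geant4 setup, i0 and i would come from counting primaries and unscattered photons crossing a scoring plane behind the sample.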

  13. Therapeutic dose simulation of a 6 MV Varian Linac photon beam using GEANT4

    NASA Astrophysics Data System (ADS)

    Salama, E.; Ali, A. S.; Khaled, N. E.; Radi, A.

    2015-10-01

    A program developed in C++ using GEANT4 libraries was used to simulate the gantry of a 6 MV high-energy photon linear accelerator (linac). The head of a clinical linear accelerator was simulated based on the manufacturer's detailed information. More than 2 × 10⁹ primary electrons were used to create the phase-space file. The percentage depth dose (PDD) and flatness/symmetry (lateral dose profiles) in a water phantom were evaluated. Comparisons between experimental and simulated data were carried out for three field sizes: 5 × 5, 10 × 10 and 15 × 15 cm². A relatively good agreement was found between computed and measured PDD. Electron contamination and the spatial distributions of both photons and electrons in the simulated beam were evaluated. Moreover, the lateral dose profiles obtained at 15, 50, and 100 mm depth are compatible with the measured values. The results show that GEANT4 is a promising Monte Carlo code for radiotherapy applications.
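A PDD curve is simply the axial dose normalized to its maximum, in percent. A minimal sketch, with made-up dose values rather than the paper's data:

```python
def percentage_depth_dose(dose_by_depth):
    """Normalise a depth-dose curve to its maximum, in percent -- the PDD on
    which simulated and measured curves are compared.  Input is a list of
    (depth_mm, dose) pairs, e.g. scored along the beam axis of a phantom."""
    d_max = max(d for _, d in dose_by_depth)
    return [(z, 100.0 * d / d_max) for z, d in dose_by_depth]

# Illustrative 6 MV-like axial doses (arbitrary units)
curve = [(0, 0.45), (15, 1.00), (50, 0.86), (100, 0.67), (150, 0.52)]
for z, pdd in percentage_depth_dose(curve):
    print(f"{z:4d} mm  {pdd:6.1f} %")
```

Agreement between two PDD curves is then judged point by point after both are normalized this way, which removes any absolute-dose scaling between simulation and measurement.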

  14. Calculation of Coincidence Summing Correction Factors for an HPGe detector using GEANT4.

    PubMed

    Giubrone, G; Ortiz, J; Gallardo, S; Martorell, S; Bas, M C

    2016-07-01

    The aim of this paper was to calculate the true coincidence summing correction factors (TSCFs) for an HPGe coaxial detector in order to correct for the summing effect caused by the presence of (88)Y and (60)Co in a multigamma source used to obtain an efficiency calibration curve. Results were obtained for three volumetric sources using the Monte Carlo toolkit GEANT4. The first part of this paper deals with modeling the detector in order to obtain a simulated full-energy-peak efficiency curve. A quantitative comparison between measured and simulated values was made across the entire energy range under study. The true coincidence summing correction factors were calculated for (88)Y and (60)Co using the full-peak efficiencies obtained with GEANT4. This methodology was subsequently applied to (134)Cs, which presents a complex decay scheme. PMID:27085040
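For the simplest case of a two-gamma cascade, the summing-out correction has a closed form into which a simulated total efficiency can be plugged. A sketch with an assumed efficiency value, not the paper's detector model:

```python
def tscf_two_gamma(total_eff_coincident):
    """True-coincidence summing-out correction for a two-gamma cascade
    (e.g. the 1173/1332 keV pair of 60Co): the full-energy peak of gamma-1
    loses events whenever the coincident gamma-2 deposits any energy, so
    N_true = N_measured * TSCF with TSCF = 1 / (1 - eps_T), where eps_T is
    the total efficiency for the coincident gamma (e.g. from a Monte Carlo
    model of the detector)."""
    return 1.0 / (1.0 - total_eff_coincident)

# Assumed total efficiency of 8% for the coincident line
print(f"TSCF = {tscf_two_gamma(0.08):.3f}")
```

Complex schemes such as (134)Cs involve many cascades and branching ratios, which is why a full Monte Carlo treatment, rather than this closed form, is needed there.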

  15. Estimation of photoneutron yield in linear accelerator with different collimation systems by Geant4 and MCNPX simulation codes

    NASA Astrophysics Data System (ADS)

    Kim, Yoon Sang; Khazaei, Zeinab; Ko, Junho; Afarideh, Hossein; Ghergherehchi, Mitra

    2016-04-01

    At present, the bremsstrahlung photon beams produced by linear accelerators are the most commonly employed method of radiotherapy for tumor treatment. A photoneutron source based on three different energies (6, 10 and 15 MeV) of a linac electron beam was designed by means of the Geant4 and Monte Carlo N-Particle eXtended (MCNPX) simulation codes. To obtain the maximum neutron yield, two arrangements of the photoneutron converter were studied: (a) without a collimator and (b) with the converter placed after the collimator. The maximum photon intensities in tungsten were 0.73, 1.24 and 2.07 photon/e at 6, 10 and 15 MeV, respectively. There was no considerable increase in the photon fluence spectra from 6 to 15 MeV at the optimum tungsten thickness of between 0.8 mm and 2 mm. The optimum dimensions of the collimator were determined to be a length of 140 mm with a slit-shaped aperture of 5 mm × 70 mm for iron. According to the neutron yield, the best thickness obtained for the studied materials was 30 mm. The number of neutrons generated in BeO reached its maximum value at 6 MeV, unlike in Be, where the highest number of neutrons was observed at 15 MeV. Statistical uncertainty in all simulations was less than 0.3% and 0.05% for MCNPX and the standard electromagnetic (EM) physics packages of Geant4, respectively. Differences among spectra in various regions are due to different cross-section and stopping-power data and different simulations of the physics processes.

  16. Estimation of photoneutron yield in linear accelerator with different collimation systems by Geant4 and MCNPX simulation codes.

    PubMed

    Kim, Yoon Sang; Khazaei, Zeinab; Ko, Junho; Afarideh, Hossein; Ghergherehchi, Mitra

    2016-04-01

    At present, the bremsstrahlung photon beams produced by linear accelerators are the most commonly employed method of radiotherapy for tumor treatment. A photoneutron source based on three different energies (6, 10 and 15 MeV) of a linac electron beam was designed by means of the Geant4 and Monte Carlo N-Particle eXtended (MCNPX) simulation codes. To obtain the maximum neutron yield, two arrangements of the photoneutron converter were studied: (a) without a collimator and (b) with the converter placed after the collimator. The maximum photon intensities in tungsten were 0.73, 1.24 and 2.07 photon/e at 6, 10 and 15 MeV, respectively. There was no considerable increase in the photon fluence spectra from 6 to 15 MeV at the optimum tungsten thickness of between 0.8 mm and 2 mm. The optimum dimensions of the collimator were determined to be a length of 140 mm with a slit-shaped aperture of 5 mm × 70 mm for iron. According to the neutron yield, the best thickness obtained for the studied materials was 30 mm. The number of neutrons generated in BeO reached its maximum value at 6 MeV, unlike in Be, where the highest number of neutrons was observed at 15 MeV. Statistical uncertainty in all simulations was less than 0.3% and 0.05% for MCNPX and the standard electromagnetic (EM) physics packages of Geant4, respectively. Differences among spectra in various regions are due to different cross-section and stopping-power data and different simulations of the physics processes. PMID:26975304

  17. R&D on co-working transport schemes in Geant4

    NASA Astrophysics Data System (ADS)

    Pia, M. G.; Saracco, P.; Sudhakar, M.; Zoglauer, A.; Augelli, M.; Gargioni, E.; Kim, C. H.; Quintieri, L.; de Queiroz Filho, P. P.; de Souza Santos, D.; Weidenspointner, G.; Begalli, M.

    2010-04-01

    A research and development (R&D) project related to the extension of the Geant4 toolkit has been recently launched to address fundamental methods in radiation transport simulation. The project focuses on simulation at different scales in the same experimental environment; this problem requires new methods across the current boundaries of condensed-random-walk and discrete transport schemes. The new developments have been motivated by experimental requirements in various domains, including nanodosimetry, astronomy and detector developments for high energy physics applications.

  18. Calculation of self-shielding factor for neutron activation experiments using GEANT4 and MCNP

    NASA Astrophysics Data System (ADS)

    Romero-Barrientos, Jaime; Molina, F.; Aguilera, Pablo; Arellano, H. F.

    2016-07-01

    The neutron self-shielding factor G as a function of neutron energy was obtained for 14 pure metallic samples in 1000 isolethargic energy bins from 1 × 10⁻⁵ eV to 2 × 10⁷ eV using Monte Carlo simulations in GEANT4 and MCNP6. The comparison of the two Monte Carlo codes shows small differences in the final self-shielding factor, mostly due to the different cross-section databases that each program uses.
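A textbook analytic limit is useful for sanity-checking such Monte Carlo results: for a slab in a perpendicular beam the self-shielding factor has a closed form. This is a simplified geometry, not the simulated one, and the cross-section values are illustrative:

```python
import math

def slab_self_shielding(sigma_t_cm, thickness_cm):
    """Self-shielding factor G for a slab in a perpendicular neutron beam:
    the average flux inside the sample divided by the incident flux,
    G = (1 - exp(-Sigma * t)) / (Sigma * t), where Sigma is the macroscopic
    total cross section (1/cm).  Thin or weakly absorbing samples give
    G -> 1 (no self-shielding)."""
    x = sigma_t_cm * thickness_cm
    if x == 0.0:
        return 1.0
    return (1.0 - math.exp(-x)) / x

print(f"{slab_self_shielding(0.5, 0.1):.4f}")  # thin sample: close to 1
print(f"{slab_self_shielding(0.5, 4.0):.4f}")  # thick sample: strong shielding
```

The energy dependence of G in the paper enters through the energy dependence of the cross section, which is exactly where the GEANT4/MCNP6 database differences show up.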

  19. Comparison of GEANT4 Simulations with Experimental Data for Thick Al Absorbers

    NASA Astrophysics Data System (ADS)

    Yevseyeva, Olga; de Assis, Joaquim; Evseev, Ivan; Schelin, Hugo; Paschuk, Sergei; Milhoretto, Edney; Setti, João; Díaz, Katherin; Hormaza, Joel; Lopes, Ricardo

    2009-06-01

    Proton beams in medical applications deal with relatively thick targets like the human head or trunk. Therefore, relatively small differences in the total proton stopping power given, for example, by the different models provided by GEANT4 can lead to significant disagreements in the final proton energy spectra when integrated along lengthy proton trajectories. This work presents proton energy spectra obtained by GEANT4.8.2 simulations using the ICRU49, Ziegler1985 and Ziegler2000 models for 19.68 MeV protons passing through Al absorbers of various thicknesses. The spectra were compared with the experimental data, with TRIM/SRIM2008 and MCNPX2.4.0 simulations, and with the Payne analytical solution for the transport equation in the Fokker-Planck approximation. It is shown that the MCNPX simulations reproduce all experimental spectra reasonably well. For the relatively thin targets all the methods give practically identical results, but this is not the case for the thick absorbers. It should be noted that all the spectra were measured at proton energies significantly above 2 MeV, i.e., in the so-called "Bethe-Bloch region". Therefore the observed disagreements among GEANT4 results simulated with different models are somewhat unexpected. Further studies are necessary for better understanding and definitive conclusions.

  20. Assessment of Geant4 Prompt-Gamma Emission Yields in the Context of Proton Therapy Monitoring.

    PubMed

    Pinto, Marco; Dauvergne, Denis; Freud, Nicolas; Krimmer, Jochen; Létang, Jean M; Testa, Etienne

    2016-01-01

    Monte Carlo tools have long been used to assist the research and development of solutions for proton therapy monitoring. The present work focuses on prompt-gamma emission yields, comparing experimental data with the outcomes of the current version of Geant4 using all applicable proton inelastic models. For the case under study, and using the binary cascade model, it was found that Geant4 overestimates the prompt-gamma emission yields by 40.2 ± 0.3%, even though it accurately predicts the prompt-gamma profile length of the experimental profile. In addition, the default implementations of all proton inelastic models overestimate the number of prompt gammas emitted. Finally, a set of built-in options and physically sound changes to the Geant4 source code were tested in order to reduce the observed discrepancy. A satisfactory agreement was found when using the QMD model with a wave packet width equal to 1.3 fm². PMID:26858937

  1. Monte Carlo modeling and validation of a proton treatment nozzle by using the Geant4 toolkit

    NASA Astrophysics Data System (ADS)

    Kim, Dae-Hyun; Kang, Young Nam; Suh, Tae-Suk; Shin, Jungwook; Kim, Jong Won; Yoo, Seung Hoon; Park, Seyjoon; Lee, Sang Hoon; Cho, Sungkoo; Shin, Dongho; Kim, Dae Yong; Lee, Se Byeong

    2012-10-01

    Modern commercial treatment planning systems for proton therapy use the pencil beam algorithm to calculate the absorbed dose. Although acceptable for clinical radiation treatment, the accuracy of this method is limited. Alternatively, the Monte Carlo method, which is relatively accurate in dose calculations, has recently been applied to proton therapy. To reduce the remaining uncertainty in proton therapy dose calculations, in the present study we employed Monte Carlo simulations and the Geant4 simulation toolkit to develop a model of a proton treatment nozzle. The results from a Geant4-based medical application of the proton treatment nozzle were compared to measured data. Simulations of the percentage depth dose profiles showed very good agreement, within 1 mm in distal range and 3 mm in modulated width. Moreover, the lateral dose profiles showed good agreement, within 3% in the central region of the field and within 10% in the penumbra regions. In this work, we showed that the Geant4 Monte Carlo model of a proton treatment nozzle can be used to calculate proton dose distributions accurately.

  2. GEANT4 simulations of the n_TOF spallation source and their benchmarking

    NASA Astrophysics Data System (ADS)

    Lo Meo, S.; Cortés-Giraldo, M. A.; Massimi, C.; Lerendegui-Marco, J.; Barbagallo, M.; Colonna, N.; Guerrero, C.; Mancusi, D.; Mingrone, F.; Quesada, J. M.; Sabate-Gilarte, M.; Vannini, G.; Vlachoudis, V.

    2015-12-01

    Neutron production and transport in the spallation target of the n_TOF facility at CERN have been simulated with GEANT4. The results obtained with different models of high-energy nucleon-nucleus interaction have been compared with the measured characteristics of the neutron beam, in particular the flux and its dependence on neutron energy, measured in the first experimental area. The best agreement at present, within 20% for the absolute value of the flux and within a few percent for the energy dependence over the whole energy range from thermal to 1 GeV, is obtained with the INCL++ model coupled with the GEANT4 native de-excitation model. All other available models overestimate the n_TOF neutron flux by larger factors, up to 70%. The simulations are also able to accurately reproduce the neutron beam energy resolution function, which is essentially determined by the moderation time inside the target/moderator assembly. The results reported here provide confidence in the use of GEANT4 for simulations of spallation neutron sources.

  3. Assessment of Geant4 Prompt-Gamma Emission Yields in the Context of Proton Therapy Monitoring

    PubMed Central

    Pinto, Marco; Dauvergne, Denis; Freud, Nicolas; Krimmer, Jochen; Létang, Jean M.; Testa, Etienne

    2016-01-01

    Monte Carlo tools have long been used to assist the research and development of solutions for proton therapy monitoring. The present work focuses on prompt-gamma emission yields by comparing experimental data with the outcomes of the current version of Geant4 using all applicable proton inelastic models. For the case under study and using the binary cascade model, Geant4 was found to overestimate the prompt-gamma emission yields by 40.2 ± 0.3%, even though it predicts the length of the experimental prompt-gamma profile accurately. In addition, the default implementations of all proton inelastic models overestimate the number of prompt gammas emitted. Finally, a set of built-in options and physically sound Geant4 source-code changes were tested in an attempt to reduce the observed discrepancy. A satisfactory agreement was found when using the QMD model with a wave packet width equal to 1.3 fm². PMID:26858937

  4. Application of TDCR-Geant4 modeling to standardization of 63Ni.

    PubMed

    Thiam, C; Bobin, C; Chauvenet, B; Bouchard, J

    2012-09-01

    As an alternative to the classical TDCR model applied to liquid scintillation (LS) counting, a stochastic approach based on the Geant4 toolkit is presented for the simulation of light emission inside the dedicated three-photomultiplier detection system. To this end, the Geant4 modeling includes a comprehensive description of optical properties associated with each material constituting the optical chamber. The objective is to simulate the propagation of optical photons from their creation in the LS cocktail to the production of photoelectrons in the photomultipliers. First validated for the case of radionuclide standardization based on Cerenkov emission, the scintillation process has been added to a TDCR-Geant4 modeling using the Birks expression in order to account for the light-emission nonlinearity owing to ionization quenching. The scintillation yield of the commercial Ultima Gold LS cocktail has been determined from double-coincidence detection efficiencies obtained for ⁶⁰Co and ⁵⁴Mn with the 4π(LS)β-γ coincidence method. In this paper, the stochastic TDCR modeling is applied for the case of the standardization of ⁶³Ni (pure β⁻ emitter; Emax = 66.98 keV) and the activity concentration is compared with the result given by the classical model. PMID:22436447
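
    The Birks expression mentioned above relates light output to ionization density: dL/dx = S·(dE/dx)/(1 + kB·dE/dx). A minimal numerical sketch, with placeholder values of S and kB rather than the Ultima Gold parameters determined in the paper:

```python
# Sketch of Birks-law ionization quenching: the light produced over a track
# step is reduced where dE/dx is large. S and kB below are illustrative
# placeholders, not fitted scintillator parameters.

def birks_light_yield(de_dx_profile, dx, S=1.0, kB=0.0096):
    """Total scintillation light for a track sampled in steps of length dx
    (dE/dx in MeV/cm, kB in cm/MeV): L = S * sum dE / (1 + kB * dE/dx)."""
    return S * sum(de_dx * dx / (1.0 + kB * de_dx) for de_dx in de_dx_profile)
```

    With kB = 0 the light is simply proportional to the deposited energy; a nonzero kB quenches densely ionizing track segments, which is the nonlinearity the TDCR-Geant4 modeling accounts for.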

  5. Modeling proton and alpha elastic scattering in liquid water in Geant4-DNA

    NASA Astrophysics Data System (ADS)

    Tran, H. N.; El Bitar, Z.; Champion, C.; Karamitros, M.; Bernal, M. A.; Francis, Z.; Ivantchenko, V.; Lee, S. B.; Shin, J. I.; Incerti, S.

    2015-01-01

    Elastic scattering of protons and alpha (α) particles by water molecules cannot be neglected at low incident energies. However, this physical process is currently not available in the "Geant4-DNA" extension of the Geant4 Monte Carlo simulation toolkit. In this work, we report theoretical differential and integral cross sections of the elastic scattering process for 100 eV-1 MeV incident protons and for 100 eV-10 MeV incident α particles in liquid water. The calculations are performed within the classical framework described by Everhart et al., Ziegler et al. and the ICRU 49 Report. We then propose an implementation of the corresponding classes in the Geant4-DNA toolkit for modeling the elastic scattering of protons and α particles. Stopping powers as well as ranges are also reported. It clearly appears that accounting for the elastic scattering process in the slowing down of the charged particles improves the agreement with the existing data, in particular with the ICRU recommendations.
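
    As a rough illustration of why elastic scattering is strongly forward-peaked and matters mostly at low energies, the toy sketch below evaluates a classical screened-Rutherford angular shape, dσ/dΩ ∝ 1/(1 - cosθ + 2η)², where η is a screening parameter. This is a generic textbook form with made-up constants, not the Everhart et al. or ICRU 49 calculations used in the paper.

```python
import math

# Toy screened-Rutherford differential cross section (arbitrary units).
# eta is a dimensionless screening parameter; k an overall normalization.
def screened_rutherford(theta, eta=1.0e-3, k=1.0):
    """Relative dsigma/dOmega for scattering angle theta (radians)."""
    return k / (1.0 - math.cos(theta) + 2.0 * eta) ** 2
```

    The screening term keeps the cross section finite in the forward direction; without it the classical Rutherford form diverges as theta approaches 0.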

  6. Comparison of dose distributions for Hounsfield number conversion methods in GEANT4

    NASA Astrophysics Data System (ADS)

    Kim, Hyung Dong; Kim, Byung Yong; Kim, Eng Chan; Yun, Sang Mo; Kang, Jeong Ku; Kim, Sung Kyu

    2014-06-01

    The conversion of patient computed tomography (CT) data to voxel phantoms is essential for CT-based Monte Carlo (MC) dose calculations, and incorrect assignments of materials and mass densities can lead to large errors in dose distributions. We investigated the effects of mass density and material assignments on GEANT4-based photon dose calculations. Three material conversion methods and four density conversion methods were compared for a lung tumor case. The dose calculations for 6-MV photon beams with a field size of 10 × 10 cm² were performed using 0.5 × 0.5 × 0.5 cm³ voxels with 1.2 × 10⁹ histories. The material conversion methods led to different material assignment percentages in the converted voxel regions. The GEANT4 example and the modified Schneider material conversion methods showed large local dose differences relative to the BEAMnrc default method for lung and other tissues. The mass density conversion methods showed only slight dose differences when water alone was used. Gaussian-like distributions, with mean values close to zero, were obtained when the reference method was compared with the other methods. The maximum dose difference of ~2% indicated that the dose distributions agreed relatively well. Material assignment methods probably have a more significant impact on dose distributions than mass density assignment methods. The study confirms that material assignment methods cause significant dose differences in GEANT4-based photon dose calculations.
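
    A Schneider-style conversion maps each CT voxel's Hounsfield number to a mass density via a piecewise-linear ramp and to a material via HU intervals. The sketch below is a deliberately coarse illustration with made-up breakpoints and slopes, not any of the three material or four density methods compared in the study:

```python
# Toy Hounsfield-number conversion: a two-segment linear density ramp plus
# coarse material binning. All breakpoints and slopes are illustrative.

def hu_to_density(hu):
    """Mass density (g/cm^3) from a simple two-segment linear ramp."""
    if hu <= 0:  # air .. water
        return max(0.001, 1.0 + hu / 1000.0)
    return 1.0 + hu * 0.0006  # soft tissue .. bone (illustrative slope)

def hu_to_material(hu):
    """Coarse material assignment by HU interval (illustrative bins)."""
    if hu < -900:
        return "Air"
    if hu < -100:
        return "Lung"
    if hu < 200:
        return "SoftTissue"
    return "Bone"
```

    Because material bins are discrete while the density ramp is continuous, two conversion schemes can agree on density yet disagree on material, which is consistent with the study's finding that material assignment dominates the dose differences.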

  7. Optical simulation of monolithic scintillator detectors using GATE/GEANT4.

    PubMed

    van der Laan, D J Jan; Schaart, Dennis R; Maas, Marnix C; Beekman, Freek J; Bruyndonckx, Peter; van Eijk, Carel W E

    2010-03-21

    Much research is being conducted on position-sensitive scintillation detectors for medical imaging, particularly for emission tomography. Monte Carlo simulations play an essential role in many of these research activities. Since the scintillation process, the transport of scintillation photons through the crystal(s), and the conversion of these photons into electronic signals each have a major influence on the detector performance, all of these processes may need to be incorporated in the model to obtain accurate results. In this work the optical and scintillation models of the GEANT4 simulation toolkit are validated by comparing simulations and measurements on monolithic scintillator detectors for high-resolution positron emission tomography (PET). We have furthermore made the GEANT4 optical models available within the user-friendly GATE simulation platform (as of version 3.0). It is shown how the necessary optical input parameters can be determined with sufficient accuracy. The results show that the optical physics models of GATE/GEANT4 enable accurate prediction of the spatial and energy resolution of monolithic scintillator PET detectors. PMID:20182005

  8. Software Performs Complex Design Analysis

    NASA Technical Reports Server (NTRS)

    2008-01-01

    Designers use computational fluid dynamics (CFD) to gain greater understanding of the fluid flow phenomena involved in components being designed. They also use finite element analysis (FEA) as a tool to help gain greater understanding of the structural response of components to loads, stresses and strains, and the prediction of failure modes. Automated CFD and FEA engineering design has centered on shape optimization, which has been hindered by two major problems: 1) inadequate shape parameterization algorithms, and 2) inadequate algorithms for CFD and FEA grid modification. Working with software engineers at Stennis Space Center, a NASA commercial partner, Optimal Solutions Software LLC, was able to utilize its one-of-a-kind arbitrary shape deformation (ASD) capability, a major advancement in solving these two problems, to optimize the shapes of complex pipe components that transport highly sensitive fluids. The ASD technology solves the problem of inadequate shape parameterization algorithms by allowing CFD designers to freely create their own shape parameters, eliminating the restriction of only being able to use the computer-aided design (CAD) parameters. The problem of inadequate algorithms for CFD grid modification is solved by the fact that the new software performs a smooth volumetric deformation. This eliminates the extremely costly process of having to remesh the grid for every shape change desired. The program can perform a design change in a markedly reduced amount of time; the traditional process of returning to the CAD model to reshape and then remesh has been known to take hours, days, even weeks or months, depending upon the size of the model.

  9. Model-based software design

    NASA Technical Reports Server (NTRS)

    Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui; Yenne, Britt; Vansickle, Larry; Ballantyne, Michael

    1992-01-01

    Domain-specific knowledge is required to create specifications, generate code, and understand existing systems. Our approach to automating software design is based on instantiating an application domain model with industry-specific knowledge and then using that model to achieve the operational goals of specification elicitation and verification, reverse engineering, and code generation. Although many different specification models can be created from any particular domain model, each specification model is consistent and correct with respect to the domain model.

  10. MaGe - a GEANT4-based Monte Carlo Application Framework for Low-background Germanium Experiments

    SciTech Connect

    Boswell, M.; Chan, Yuen-Dat; Detwiler, Jason A.; Finnerty, P.; Henning, R.; Gehman, Victor; Johnson, Robert A.; Jordan, David V.; Kazkaz, Kareem; Knapp, Markus; Kroninger, Kevin; Lenz, Daniel; Leviner, L.; Liu, Jing; Liu, Xiang; MacMullin, S.; Marino, Michael G.; Mokhtarani, A.; Pandola, Luciano; Schubert, Alexis G.; Schubert, J.; Tomei, Claudia; Volynets, Oleksandr

    2011-06-13

    We describe a physics simulation software framework, MaGe, that is based on the GEANT4 simulation toolkit. MaGe is used to simulate the response of ultra-low radioactive background radiation detectors to ionizing radiation, specifically the MAJORANA and GERDA neutrinoless double-beta decay experiments. MAJORANA and GERDA use high-purity germanium technology to search for the neutrinoless double-beta decay of the ⁷⁶Ge isotope, and MaGe is jointly developed between these two collaborations. The MaGe framework contains simulated geometries of common objects, prototypes, test stands, and the actual experiments. It also implements customized event generators, GEANT4 physics lists, and output formats. All of these features are available as class libraries that are typically compiled into a single executable. The user selects the particular experimental setup implementation at run-time via macros. The combination of all these common classes into one framework reduces duplication of efforts, eases comparison between simulated data and experiment, and simplifies the addition of new detectors to be simulated. This paper focuses on the software framework, custom event generators, and physics list.
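
    The run-time selection described above (one executable, setup chosen via macros) is essentially a registry pattern: geometry implementations register themselves in a common library and are instantiated by name. The Python sketch below illustrates the pattern generically; it is not MaGe's actual class structure, which is C++ built on GEANT4.

```python
# Generic registry pattern: detector geometries register under a name and
# are instantiated at run time from user input, not from compiled code.

class GeometryRegistry:
    _geometries = {}

    @classmethod
    def register(cls, name):
        def wrap(geom_cls):
            cls._geometries[name] = geom_cls
            return geom_cls
        return wrap

    @classmethod
    def build(cls, name):
        return cls._geometries[name]()

@GeometryRegistry.register("TestStand")
class TestStandGeometry:
    description = "bench-top test stand"

@GeometryRegistry.register("FullArray")
class FullArrayGeometry:
    description = "full detector array"

# Macro-like run-time selection by setup name
detector = GeometryRegistry.build("TestStand")
```

    In MaGe the same role is played by Geant4 macro commands parsed at run time.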

  11. GEANT4 calculations of neutron dose in radiation protection using a homogeneous phantom and a Chinese hybrid male phantom.

    PubMed

    Geng, Changran; Tang, Xiaobin; Guan, Fada; Johns, Jesse; Vasudevan, Latha; Gong, Chunhui; Shu, Diyun; Chen, Da

    2016-03-01

    The purpose of this study is to verify the feasibility of applying GEANT4 (version 10.01) to neutron dose calculations in radiation protection by comparing the calculation results with MCNP5. The depth dose distributions are investigated in a homogeneous phantom, and the fluence-to-dose conversion coefficients are calculated for different organs in the Chinese hybrid male phantom for neutrons with energies ranging from 1 × 10⁻⁹ to 10 MeV. The comparison of the simulation results shows that, using the high-precision (HP) neutron physics list, GEANT4 produces the results closest to MCNP5. However, differences can be observed when the neutron energy is lower than 1 × 10⁻⁶ MeV. Activating thermal scattering with the S-matrix correction in the thermal energy range, in both GEANT4 (with HP) and MCNP5, reduces the difference between the two codes. PMID:26156875
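
    Once tabulated, fluence-to-dose conversion coefficients are applied by folding them with the neutron fluence spectrum, D = Σᵢ φ(Eᵢ)·c(Eᵢ). A minimal sketch with made-up three-bin numbers (the real coefficients are per-organ and span the full 1 × 10⁻⁹ to 10 MeV range):

```python
# Sketch: folding a binned neutron fluence spectrum with fluence-to-dose
# conversion coefficients. All numbers below are illustrative only.

def fold_dose(fluence, coefficients):
    """Dose from per-bin fluence (cm^-2) and coefficients (pGy cm^2)."""
    assert len(fluence) == len(coefficients)
    return sum(phi * c for phi, c in zip(fluence, coefficients))

# Three illustrative energy bins (thermal, epithermal, fast)
fluence      = [1.0e6, 5.0e5, 2.0e5]   # n/cm^2 per bin
coefficients = [4.0,   8.0,   40.0]    # pGy*cm^2 (made-up values)

dose_pGy = fold_dose(fluence, coefficients)
```

    The per-bin coefficients are exactly what the phantom simulations above tabulate; applying them afterwards is a simple weighted sum.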

  12. Automating software design system DESTA

    NASA Technical Reports Server (NTRS)

    Lovitsky, Vladimir A.; Pearce, Patricia D.

    1992-01-01

    'DESTA' is the acronym for the Dialogue Evolutionary Synthesizer of Turnkey Algorithms by means of a natural language (Russian or English) functional specification of algorithms or software being developed. DESTA represents the computer-aided and/or automatic artificial intelligence 'forgiving' system which provides users with software tools support for algorithm and/or structured program development. The DESTA system is intended to provide support for the higher levels and earlier stages of engineering design of software in contrast to conventional Computer Aided Design (CAD) systems which provide low level tools for use at a stage when the major planning and structuring decisions have already been taken. DESTA is a knowledge-intensive system. The main features of the knowledge are procedures, functions, modules, operating system commands, batch files, their natural language specifications, and their interlinks. The specific domain for the DESTA system is a high level programming language like Turbo Pascal 6.0. The DESTA system is operational and runs on an IBM PC computer.

  13. Monte Carlo calculations of thermal neutron capture in gadolinium: a comparison of GEANT4 and MCNP with measurements.

    PubMed

    Enger, Shirin A; Munck af Rosenschöld, Per; Rezaei, Arash; Lundqvist, Hans

    2006-02-01

    GEANT4 is a Monte Carlo code originally implemented for high-energy physics applications and is well known for particle transport at high energies. The capacity of GEANT4 to simulate neutron transport in the thermal energy region is not equally well known. The aim of this article is to compare MCNP, a code commonly used in low-energy neutron transport calculations, and GEANT4 with experimental results, and to select the suitable code for gadolinium neutron capture applications. To account for the thermal neutron scattering from chemically bound atoms [S(α,β)] in biological materials, thermal neutron fluences in a tissue-like poly(methylmethacrylate) phantom are compared between MCNP4B, GEANT4 6.0 patch 1, and measurements from the neutron capture therapy (NCT) facility at Studsvik, Sweden. The fluence measurements agreed with the MCNP results calculated with S(α,β). The location of the thermal neutron peak calculated with MCNP without S(α,β), and with GEANT4, is shifted by about 0.5 cm towards a shallower depth and is 25%-30% lower in amplitude. The dose distribution from the gadolinium neutron capture reaction is then simulated by MCNP and compared with measured data. The simulations made by MCNP agree well with the experimental results. As long as thermal neutron scattering from chemically bound atoms is not included in GEANT4, it is not suitable for NCT applications. PMID:16532938

  14. Monte Carlo calculations of thermal neutron capture in gadolinium: A comparison of GEANT4 and MCNP with measurements

    SciTech Connect

    Enger, Shirin A.; Munck af Rosenschoeld, Per; Rezaei, Arash; Lundqvist, Hans

    2006-02-15

    GEANT4 is a Monte Carlo code originally implemented for high-energy physics applications and is well known for particle transport at high energies. The capacity of GEANT4 to simulate neutron transport in the thermal energy region is not equally well known. The aim of this article is to compare MCNP, a code commonly used in low-energy neutron transport calculations, and GEANT4 with experimental results, and to select the suitable code for gadolinium neutron capture applications. To account for the thermal neutron scattering from chemically bound atoms [S(α,β)] in biological materials, thermal neutron fluences in a tissue-like poly(methylmethacrylate) phantom are compared between MCNP4B, GEANT4 6.0 patch 1, and measurements from the neutron capture therapy (NCT) facility at Studsvik, Sweden. The fluence measurements agreed with the MCNP results calculated with S(α,β). The location of the thermal neutron peak calculated with MCNP without S(α,β), and with GEANT4, is shifted by about 0.5 cm towards a shallower depth and is 25%-30% lower in amplitude. The dose distribution from the gadolinium neutron capture reaction is then simulated by MCNP and compared with measured data. The simulations made by MCNP agree well with the experimental results. As long as thermal neutron scattering from chemically bound atoms is not included in GEANT4, it is not suitable for NCT applications.

  15. Software engineering and Ada in design

    NASA Technical Reports Server (NTRS)

    Oneill, Don

    1986-01-01

    Modern software engineering promises significant reductions in software costs and improvements in software quality. The Ada language is the focus for these software methodology and tool improvements. The IBM FSD approach, including the software engineering practices that guide the systematic design and development of software products and the management of the software process, is examined. The revised Ada design language adaptation is presented. This four-level design methodology is detailed, including the purpose of each level, the management strategy that integrates the software design activity with the program milestones, and the technical strategy that maps the Ada constructs to each level of design. A complete description of each design level is provided along with specific design-language recording guidelines for each level. Finally, some testimony is offered on education, tools, architecture, and metrics resulting from project use of the four-level Ada design language adaptation.

  16. A GEANT4 Monte Carlo simulation to describe the time response of a coupled SiPM and LYSO detection system

    NASA Astrophysics Data System (ADS)

    Leming, E.; De Santo, A.; Salvatore, F.; Camanzi, B.; Lohstroh, A.

    2014-06-01

    In recent years the silicon photomultiplier has been investigated as an alternative to the traditional photomultiplier tube in a range of applications, including time-of-flight positron emission tomography (TOF-PET). In this paper we discuss a GEANT4 simulation framework, which has been developed to drive the design of a scalable TOF-PET apparatus to be built at the Rutherford Appleton Laboratory, UK. The first results presented in this paper simulate the response of a Hamamatsu Multi-Pixel Photon Counter (S10362-33-050c) coupled to LYSO scintillating crystals, with a focus on the timing response of coincidence signals.

  17. Validation of the GEANT4 simulation of bremsstrahlung from thick targets below 3 MeV

    NASA Astrophysics Data System (ADS)

    Pandola, L.; Andenna, C.; Caccia, B.

    2015-05-01

    The bremsstrahlung spectra produced by electrons impinging on thick targets are simulated using the GEANT4 Monte Carlo toolkit. Simulations are validated against experimental data available in the literature for energies between 0.5 and 2.8 MeV for Al and Fe targets, and at 70 keV for Al, Ag, W and Pb targets. The energy spectra for the different configurations of emission angles, energies and targets are considered. Simulations are performed using the three alternative sets of electromagnetic models that are available in GEANT4 to describe bremsstrahlung. At the higher energies (0.5-2.8 MeV) of electrons impinging on Al and Fe targets, GEANT4 is able to reproduce the spectral shapes and the integral photon emission in the forward direction. The agreement is within 10-30%, depending on energy, emission angle and target material. The physics model based on the Penelope Monte Carlo code is in slightly better agreement with the measured data than the other two. However, all models overestimate the photon emission in the backward hemisphere. For the lower-energy study (70 keV), which includes higher-Z targets, all models systematically underestimate the total photon yield, with agreement between 10% and 50%. The results of this work are of potential interest for medical physics applications, where knowledge of the energy spectra and angular distributions of photons is needed for accurate dose calculations with Monte Carlo and other fluence-based methods.

  18. Comparison of electromagnetic and hadronic models generated using Geant 4 with antiproton dose measured in CERN

    PubMed Central

    Tavakoli, Mohammad Bagher; Reiazi, Reza; Mohammadi, Mohammad Mehdi; Jabbari, Keyvan

    2015-01-01

    After the idea of antiproton cancer treatment was proposed in 1984, many experiments were launched to investigate the physical and radiobiological properties of antiprotons arising from their annihilation reactions. One of these experiments was performed at the European Organization for Nuclear Research (CERN) using the Antiproton Decelerator. Its ultimate goal was to assess the dosimetric and radiobiological properties of antiproton beams in order to estimate the suitability of antiprotons for radiotherapy. One difficulty was the long-term unavailability of the antiproton beam at CERN, which makes the verification of Monte Carlo codes for simulating the antiproton depth dose useful. Among the available simulation codes, Geant4 provides acceptable flexibility and extensibility, which have progressively led to novel Geant4 applications in research domains, especially the modeling of the biological effects of ionizing radiation at the sub-cellular scale. In this study, the depth dose corresponding to the CERN antiproton beam energy was calculated with Geant4 using all the standard physics lists currently available and benchmarked for other use cases. Overall, none of the standard physics lists was able to reproduce the antiproton percentage depth dose. Although the results with some models were promising, the Bragg peak level remained the point of concern in our study. It is concluded that the Bertini model with high-precision neutron tracking (QGSP_BERT_HP) best matches the experimental data, though it is also the slowest of the physics lists for simulating events. PMID:26170558

  19. Validation of recent Geant4 physics models for application in carbon ion therapy

    NASA Astrophysics Data System (ADS)

    Lechner, A.; Ivanchenko, V. N.; Knobloch, J.

    2010-07-01

    Cancer treatment with energetic carbon ions has distinct advantages over proton or photon irradiation. In this paper we present a simulation model integrated into the Geant4 Monte Carlo toolkit (version 9.3) which enables the use of ICRU 73 stopping powers for ion transport calculations. For a few materials, revised ICRU 73 stopping power tables recently published by ICRU (P. Sigmund, A. Schinner, H. Paul, Errata and Addenda: ICRU Report 73 (Stopping of Ions Heavier than Helium), International Commission on Radiation Units and Measurements, 2009) were incorporated into Geant4, also covering media like water which are of importance in radiotherapeutical applications. We examine, with particular attention paid to the recent developments, the accuracy of current Geant4 models for simulating Bragg peak profiles of ¹²C ions incident on water and polyethylene targets. Simulated dose distributions are validated against experimental data available in the literature, where the focus is on beam energies relevant to ion therapy applications (90-400 MeV/u). A quantitative analysis is performed which addresses the precision of the Bragg peak position and proportional features of the dose distribution. It is shown that experimental peak positions can be reproduced within 0.2% of the particle range in the case of water, and within 0.9% in the case of polyethylene. The comparisons also demonstrate that the simulations accurately render the full width at half maximum (FWHM) of the measured Bragg peaks in water. For polyethylene slight deviations from experimental peak widths are partly attributed to systematic effects due to a simplified geometry model adopted in the simulation setup.
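
    The link between stopping-power tables such as ICRU 73 and the Bragg peak position is the continuous-slowing-down-approximation range, R(E₀) = ∫₀^{E₀} dE/S(E). The sketch below integrates a toy power-law stopping power for which the range is known in closed form; it does not use ICRU 73 data.

```python
# Sketch: CSDA range by trapezoidal integration of 1/S(E).
# The toy stopping power S(E) = k/E is an illustration, not tabulated data.

def csda_range(e0, stopping_power, n_steps=10000):
    """Integrate dE / S(E) from ~0 to e0 (energies in MeV, S in MeV/cm)."""
    e_min = e0 / n_steps  # avoid S(0) singularities in toy models
    h = (e0 - e_min) / n_steps
    total = 0.0
    for i in range(n_steps):
        e1 = e_min + i * h
        e2 = e1 + h
        total += 0.5 * (1.0 / stopping_power(e1) + 1.0 / stopping_power(e2)) * h
    return total

k = 100.0  # MeV^2/cm, illustrative constant
r = csda_range(200.0, lambda e: k / e)  # exact answer: e0^2 / (2k) = 200 cm
```

    With a tabulated S(E) in place of the toy form, the same integral predicts the peak depth that is compared against measurement above; a 0.2% range accuracy in water corresponds to sub-millimetre agreement at therapeutic energies.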

  20. Comparison of electromagnetic and hadronic models generated using Geant 4 with antiproton dose measured in CERN.

    PubMed

    Tavakoli, Mohammad Bagher; Reiazi, Reza; Mohammadi, Mohammad Mehdi; Jabbari, Keyvan

    2015-01-01

    After the idea of antiproton cancer treatment was proposed in 1984, many experiments were launched to investigate the physical and radiobiological properties of antiprotons arising from their annihilation reactions. One of these experiments was performed at the European Organization for Nuclear Research (CERN) using the Antiproton Decelerator. Its ultimate goal was to assess the dosimetric and radiobiological properties of antiproton beams in order to estimate the suitability of antiprotons for radiotherapy. One difficulty was the long-term unavailability of the antiproton beam at CERN, which makes the verification of Monte Carlo codes for simulating the antiproton depth dose useful. Among the available simulation codes, Geant4 provides acceptable flexibility and extensibility, which have progressively led to novel Geant4 applications in research domains, especially the modeling of the biological effects of ionizing radiation at the sub-cellular scale. In this study, the depth dose corresponding to the CERN antiproton beam energy was calculated with Geant4 using all the standard physics lists currently available and benchmarked for other use cases. Overall, none of the standard physics lists was able to reproduce the antiproton percentage depth dose. Although the results with some models were promising, the Bragg peak level remained the point of concern in our study. It is concluded that the Bertini model with high-precision neutron tracking (QGSP_BERT_HP) best matches the experimental data, though it is also the slowest of the physics lists for simulating events. PMID:26170558

  1. GEANT4 simulation of the effects of Doppler energy broadening in Compton imaging.

    PubMed

    Uche, C Z; Cree, M J; Round, W H

    2011-09-01

    A Monte Carlo approach was used to study the effects of Doppler energy broadening on Compton camera performance. The GEANT4 simulation toolkit was used to model the radiation transport and interactions with matter in a simulated Compton camera. The low-energy electromagnetic physics model of GEANT4 incorporating the Doppler broadening developed by Longo et al. was used in the simulations. The camera had a 9 × 9 cm scatterer and a 10 × 10 cm absorber with a scatterer-to-absorber separation of 5 cm. The modelling took only the effects of Doppler broadening into consideration; the effects of scatterer and absorber thickness and of pixelation were not taken into account, so a 'perfect' Compton camera was assumed. The scatterer materials were either silicon or germanium and the absorber material was cadmium zinc telluride. Simulations were done for point sources 10 cm in front of the scatterer. The results of the simulations validated the use of the low-energy model of GEANT4. As expected, Doppler broadening was found to degrade the Compton camera imaging resolution. For a 140.5 keV source, the resulting full-width-at-half-maximum (FWHM) of the point-source image without accounting for Doppler broadening, using a silicon scatterer, was 0.58 mm. This degraded to 7.1 mm when Doppler broadening was introduced, and degraded further to 12.3 mm when a germanium scatterer was used instead of silicon. For a 511 keV source, however, the FWHM was better than for the 140.5 keV source: it improved to 2.4 mm for a silicon scatterer and 4.6 mm for a germanium scatterer. Our result for silicon at 140.5 keV is in very good agreement with that published by An et al. PMID:21556971
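
    A Compton camera reconstructs each event on a cone whose opening angle follows from the Compton formula, cosθ = 1 - mₑc²(1/E′ - 1/E₀); Doppler broadening smears the measured energy split and hence the reconstructed angle. The sketch below implements only this kinematic relation with the textbook electron rest energy; the resolution figures above come from the full simulation, not from this formula.

```python
import math

ME_C2 = 511.0  # electron rest energy, keV

def compton_angle(e0, e_scattered):
    """Scattering angle (radians) from the incident photon energy e0 and
    the scattered photon energy e_scattered (both keV)."""
    cos_t = 1.0 - ME_C2 * (1.0 / e_scattered - 1.0 / e0)
    # clamp against floating-point excursions outside [-1, 1]
    return math.acos(max(-1.0, min(1.0, cos_t)))
```

    An energy-measurement error of a few keV translates into a larger angular (and hence spatial) blur at 140.5 keV than at 511 keV, which is consistent with the better FWHM reported for the 511 keV source.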

  2. Simulation of the 6 MV Elekta Synergy Platform linac photon beam using Geant4 Application for Tomographic Emission.

    PubMed

    Didi, Samir; Moussa, Abdelilah; Yahya, Tayalati; Mustafa, Zerfaoui

    2015-01-01

    The present work validates the Geant4 Application for Tomographic Emission Monte Carlo software for the simulation of a 6 MV photon beam delivered by the treatment head of an Elekta Synergy Platform medical linear accelerator. The simulation includes the major components of the linear accelerator (LINAC) with multi-leaf collimator and a homogeneous water phantom. Calculations were performed for the photon beam with several treatment field sizes ranging from 5 cm × 5 cm to 30 cm × 30 cm at a distance of 100 cm from the source. The simulation was successfully validated by comparison with experimental distributions. Good agreement between simulations and measurements was observed, with dose differences of about 0.02% and 2.5% for depth doses and lateral dose profiles, respectively. This agreement was also confirmed by the Kolmogorov-Smirnov goodness-of-fit test and by gamma-index comparisons, in which more than 99% of the points for all simulations fulfill the quality assurance criteria of 2 mm/2%. PMID:26500399
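
    The 2 mm/2% gamma-index criterion combines a distance-to-agreement and a dose-difference tolerance into one pass/fail number per point (γ ≤ 1 passes). Below is a simplified 1-D global-gamma sketch with made-up sample points, not the clinical implementation used in the paper:

```python
import math

# Simplified 1-D global gamma index: for an evaluated point, find the
# reference sample minimizing the combined distance/dose metric.
def gamma_index(x_eval, d_eval, ref_points, dta_mm=2.0, dd_frac=0.02, d_norm=100.0):
    """Minimum gamma over (position mm, dose) reference samples; <= 1 passes."""
    best = float("inf")
    for x_ref, d_ref in ref_points:
        g = math.sqrt(((x_eval - x_ref) / dta_mm) ** 2
                      + ((d_eval - d_ref) / (dd_frac * d_norm)) ** 2)
        best = min(best, g)
    return best

# Illustrative reference profile samples (position mm, dose %)
ref = [(0.0, 100.0), (1.0, 99.0), (2.0, 98.0)]
```

    The pass rate quoted above is then simply the fraction of evaluated points with γ ≤ 1; a denser reference sampling (or interpolation) tightens the estimate.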

  3. Space radiation analysis: Radiation effects and particle interaction outside the Earth's magnetosphere using GRAS and GEANT4

    NASA Astrophysics Data System (ADS)

    Martinez, Lisandro M.; Kingston, Jennifer

    2012-03-01

    In order to explore the Moon and Mars it is necessary to investigate the hazards due to the space environment, especially ionizing radiation. Much information has been presented in previous papers on radiation analysis inside the Earth's magnetosphere, but much of that work is not directly relevant to the interplanetary medium. This work explores the effect of radiation on humans inside structures such as the ISS and provides a detailed analysis of galactic cosmic rays (GCRs) and solar proton events (SPEs), using SPENVIS (Space Environment Effects and Information System) and CREME96 data files for the particle flux outside the Earth's magnetosphere. The simulation was conducted using GRAS, European Space Agency (ESA) software based on GEANT4. Dose and equivalent dose have been calculated, as well as secondary-particle effects and the GCR energy spectrum. The calculated total dose and equivalent dose indicate the risk that space radiation poses to the crew; these values are calculated for two different types of structure, the ISS and TransHab modules. The final results indicate the amount of radiation expected to be absorbed by the astronauts during long-duration interplanetary flights, underscoring the importance of radiation shielding and of using proper materials to reduce its effects.
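
    The step from absorbed dose to equivalent dose weights each particle type's contribution by an ICRP radiation weighting factor, H = Σ_R w_R·D_R. The sketch below uses the simple ICRP 103 factors for photons, electrons and protons; GCR mixtures also contain heavy ions (w_R = 20) and neutrons (energy-dependent w_R), omitted here for brevity:

```python
# ICRP 103 radiation weighting factors for a few particle types.
W_R = {"photon": 1.0, "electron": 1.0, "proton": 2.0}

def equivalent_dose(doses_by_particle):
    """Equivalent dose (Sv) from absorbed doses (Gy) per particle type:
    H = sum over particle types R of w_R * D_R."""
    return sum(W_R[p] * d for p, d in doses_by_particle.items())

# Illustrative absorbed doses (Gy), not the mission values from the study
h = equivalent_dose({"photon": 0.010, "proton": 0.005})
```

    This weighting is why a mixed GCR field can carry a substantially higher equivalent dose than its absorbed dose alone would suggest.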

  4. Simulation of the 6 MV Elekta Synergy Platform linac photon beam using Geant4 Application for Tomographic Emission

    PubMed Central

    Didi, Samir; Moussa, Abdelilah; Yahya, Tayalati; Mustafa, Zerfaoui

    2015-01-01

    The present work validates the Geant4 Application for Tomographic Emission Monte Carlo software for the simulation of a 6 MV photon beam delivered by the treatment head of an Elekta Synergy Platform medical linear accelerator. The simulation includes the major components of the linear accelerator (LINAC), including the multi-leaf collimator, and a homogeneous water phantom. Calculations were performed for the photon beam with several treatment field sizes ranging from 5 cm × 5 cm to 30 cm × 30 cm at 100 cm distance from the source. The simulation was successfully validated by comparison with experimental distributions. Good agreement between simulations and measurements was observed, with dose differences of about 0.02% and 2.5% for depth doses and lateral dose profiles, respectively. This agreement was also confirmed by the Kolmogorov–Smirnov goodness-of-fit test and by gamma-index comparisons, in which more than 99% of the points for all simulations fulfill the quality assurance criteria of 2 mm/2%. PMID:26500399
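
    The 2 mm/2% gamma-index criterion quoted above combines a dose-difference tolerance with a distance-to-agreement tolerance. A minimal 1D sketch of the test (Low et al. formalism; the toy depth-dose curve is hypothetical, not the paper's data):

```python
import numpy as np

def gamma_index(ref_pos, ref_dose, eval_pos, eval_dose,
                dose_tol=0.02, dist_tol=2.0):
    """1D gamma index: for each reference point, minimise the combined
    dose-difference / distance metric over all evaluated points.
    dose_tol is relative to the reference dose maximum; dist_tol in mm.
    A point passes the 2 mm/2% criterion when gamma <= 1."""
    d_max = ref_dose.max()
    gammas = []
    for r, d in zip(ref_pos, ref_dose):
        dd = (eval_dose - d) / (dose_tol * d_max)   # dose axis
        dr = (eval_pos - r) / dist_tol              # spatial axis
        gammas.append(np.sqrt(dd**2 + dr**2).min())
    return np.array(gammas)

x = np.linspace(0.0, 30.0, 61)        # depth, mm
dose = np.exp(-x / 15.0)              # toy depth-dose curve
g = gamma_index(x, dose, x, dose)     # identical profiles
pass_rate = float((g <= 1).mean() * 100)
print(f"{pass_rate:.0f}% of points pass")  # 100% of points pass
```

    Clinical implementations work on 2D/3D dose grids, but the pass/fail logic per point is the same.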

  5. Geant4 simulations on Compton scattering of laser photons on relativistic electrons

    SciTech Connect

    Filipescu, D.; Utsunomiya, H.; Gheorghe, I.; Glodariu, T.; Tesileanu, O.; Shima, T.; Takahisa, K.; Miyamoto, S.

    2015-02-24

    Using Geant4, a complex simulation code of the interaction between laser photons and relativistic electrons was developed. We implemented physically constrained electron beam emittance and spatial distribution parameters, and we also considered a Gaussian laser beam. The code was tested against experimental data produced at the γ-ray beam line GACKO (Gamma Collaboration Hutch of Konan University) of the synchrotron radiation facility NewSUBARU. Here we discuss the implications of transverse misalignments of the collimation system relative to the electron beam axis.

  6. Nuclear fragmentation reactions in extended media studied with Geant4 toolkit

    NASA Astrophysics Data System (ADS)

    Pshenichnov, Igor; Botvina, Alexander; Mishustin, Igor; Greiner, Walter

    2010-03-01

    It is well-known from numerous experiments that nuclear multifragmentation is a dominating mechanism for production of intermediate mass fragments in nucleus-nucleus collisions at energies above 100A MeV. In this paper we investigate the validity and performance of the Fermi break-up model and the statistical multifragmentation model implemented as parts of the Geant4 toolkit. We study the impact of violent nuclear disintegration reactions on the depth-dose profiles and yields of secondary fragments for beams of light and medium-weight nuclei propagating in extended media. Implications for ion-beam cancer therapy and shielding from cosmic radiation are discussed.

  7. Integration of the low-energy particle track simulation code in Geant4

    NASA Astrophysics Data System (ADS)

    Arce, Pedro; Muñoz, Antonio; Moraleda, Montserrat; Gomez Ros, José María; Blanco, Fernando; Perez, José Manuel; García, Gustavo

    2015-08-01

    The Low-Energy Particle Track Simulation code (LEPTS) is a Monte Carlo code developed to simulate the damage caused by radiation at molecular level. The code is based on experimental data of scattering cross sections, both differential and integral, and energy loss data, complemented with theoretical calculations. It covers the interactions of electrons and positrons from energies of 10 keV down to 0.1 eV in different biologically relevant materials. In this article we briefly mention the main characteristics of this code and we present its integration within the Geant4 Monte Carlo toolkit.

  8. Application of Geant4 simulation for analysis of soil carbon inelastic neutron scattering measurements.

    PubMed

    Yakubova, Galina; Kavetskiy, Aleksandr; Prior, Stephen A; Torbert, H Allen

    2016-07-01

    Inelastic neutron scattering (INS) was applied to determine soil carbon content. Due to the non-uniform depth distribution of soil carbon, the correlation between INS signals and a single soil carbon content parameter is not obvious; however, a proportionality between INS signals and the average carbon weight percent in a ~10 cm layer, for any carbon depth profile, is demonstrated using Monte Carlo simulation (Geant4). Comparison of INS and dry combustion measurements confirms this conclusion. Thus, INS measurements yield the value of this soil carbon parameter. PMID:27124122
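
    The "average carbon weight percent in a ~10 cm layer" above is simply the depth profile averaged over the top layer. An illustrative sketch with a hypothetical exponential carbon profile (trapezoidal integration; none of the numbers come from the paper):

```python
import numpy as np

def avg_carbon_wt_pct(depths_cm, carbon_wt_pct, layer_cm=10.0):
    """Average carbon weight percent over the top `layer_cm` of soil,
    by trapezoidal integration of an arbitrary depth profile."""
    z = np.asarray(depths_cm, dtype=float)
    c = np.asarray(carbon_wt_pct, dtype=float)
    m = z <= layer_cm
    zz, cc = z[m], c[m]
    area = float(np.sum(0.5 * (cc[1:] + cc[:-1]) * np.diff(zz)))
    return area / (zz[-1] - zz[0])

z = np.linspace(0.0, 30.0, 301)       # depth grid, cm
c = 3.0 * np.exp(-z / 12.0)           # hypothetical profile, 3 wt% at surface
avg = avg_carbon_wt_pct(z, c)
print(f"{avg:.2f} wt%")
```

    The paper's point is that the INS signal tracks this single layer-averaged number regardless of the profile shape.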

  9. GEANT4 simulation of cyclotron radioisotope production in a solid target.

    PubMed

    Poignant, F; Penfold, S; Asp, J; Takhar, P; Jackson, P

    2016-05-01

    The use of radioisotopes in nuclear medicine is essential for diagnosing and treating cancer. Optimizing their production is a key factor in maximizing the production yield and minimizing the associated costs. An efficient approach to this problem is the use of Monte Carlo simulations prior to experimentation. By predicting isotope yields, one can study the expected activity of the isotope of interest for different energy ranges. One can also study target contamination with other radioisotopes, especially undesired radioisotopes of the desired chemical element, which are difficult to separate from the irradiated target and might increase the dose when delivering the radiopharmaceutical product to the patient. The aim of this work is to build and validate a Monte Carlo simulation platform using the GEANT4 toolkit to model the solid target system of the South Australian Health and Medical Research Institute (SAHMRI) GE Healthcare PETtrace cyclotron. It includes a GEANT4 Graphical User Interface (GUI) where the user can modify simulation parameters such as the energy, shape and current of the proton beam, the target geometry and material, the foil geometry and material, and the time of irradiation. The paper describes the simulation and presents a comparison of simulated and experimental/theoretical yields for various nuclear reactions on an enriched nickel-64 target using the GEANT4 physics model QGSP_BIC_AllHP, a model recently developed to evaluate with high precision the interaction of protons with energies below 200 MeV, available in Geant4 version 10.1. The simulated yield of the (64)Ni(p,n)(64)Cu reaction was found to be 7.67±0.074 mCi·μA(-1) for a target energy range of 9-12 MeV. Szelecsenyi et al. (1993) gives a theoretical yield of 6.71 mCi·μA(-1) and an experimental yield of 6.38 mCi·μA(-1). The (64)Ni(p,n)(64)Cu cross section obtained with the simulation was also verified against the yield predicted from the nuclear database TENDL and
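
    Yields per microamp like those quoted above follow from standard activation arithmetic. A minimal thin-target sketch (a single cross-section value instead of integrating sigma(E) over the 9-12 MeV slowing-down range; all numbers are hypothetical, not SAHMRI data):

```python
import math

def eob_activity_mci_per_ua(sigma_mb, n_t_cm2, half_life_h, t_irr_h):
    """End-of-bombardment activity per uA of protons for a thin target:
    A = n_t * I * sigma * (1 - exp(-lambda * t_irr)).
    n_t_cm2: target atoms per cm^2 seen by the beam."""
    protons_per_s = 6.241e12                            # 1 uA of protons
    lam = math.log(2.0) / (half_life_h * 3600.0)        # decay constant, 1/s
    rate = n_t_cm2 * sigma_mb * 1e-27 * protons_per_s   # reactions/s
    a_bq = rate * (1.0 - math.exp(-lam * t_irr_h * 3600.0))
    return a_bq / 3.7e7                                 # Bq -> mCi

# hypothetical values loosely in the range of 64Ni(p,n)64Cu (T1/2 = 12.7 h)
y = eob_activity_mci_per_ua(700.0, 4.0e20, 12.7, 1.0)
print(f"{y:.2f} mCi/uA")
```

    A realistic thick-target yield, as computed by the simulation platform, integrates the energy-dependent cross section along the proton's slowing-down path in the target.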

  10. FLOWTRAN-TF software design

    SciTech Connect

    Aleman, S.E.; Flach, G.P.; Hamm, L.L.; Lee, S.Y.; Smith, F.G. III.

    1993-02-01

    FLOWTRAN-TF was created to analyze an individual Mk22 fuel assembly during a large break Loss Of Coolant Accident (LOCA) scenario involving the Savannah River Site K-reactor after the initial few seconds of the transient. During the initial few seconds reactor cooling is limited by the static or Ledinegg flow instability phenomenon. The predecessor FLOWTRAN code was developed to analyze this portion of a LOCA. In the several seconds following the break, a significant fraction of the reactor coolant inventory leaks out of the break, Emergency Cooling System (ECS) flow is initiated, and air enters the primary coolant circulation loops. Reactor fuel assemblies are cooled by a low flowrate air-water downflow. Existing commercial nuclear industry thermal-hydraulic codes were judged inadequate for detailed modeling of a Mk22 fuel assembly because the application involves a ribbed annular geometry, low pressure, downflow and an air-water mixture. FLOWTRAN-TF is a two-phase thermal-hydraulics code of similar technology to existing commercial codes such as RELAP and TRAC but customized for Savannah River Site applications. The main features and capabilities of FLOWTRAN-TF are: detailed Mk22 fuel assembly ribbed annular geometry; conjugate heat transfer; detailed neutronic power distribution; three-dimensional heat conduction in Mk22 fuel and target tubes; two-dimensional coolant flow in channels (axial, azimuthal); single-phase and/or two-phase fluid (gas, liquid and/or gas-liquid); two-component (air, water); constitutive models applicable to low pressure air-water downflow in ribbed annular channels. The design of FLOWTRAN-TF is described in detail in this report, which serves as the Software Design Report in accordance with Quality Assurance Procedure IV-4, Rev. 0 "Software Design and Implementation" in the 1Q34 manual.

  12. User interactive electric propulsion software design

    NASA Technical Reports Server (NTRS)

    Aston, Martha B.; Aston, Graeme; Brophy, John R.

    1989-01-01

    As electric propulsion technology matures from laboratory development to flight application, mission planners and spacecraft designers are increasingly required to determine the benefits and integration issues of using this propulsion capability. A computer software tool for supporting these analyses is presented. This tool combines detailed analytical models of electric propulsion engine performance and subsystem design with a software structure that is highly user-interactive and adaptable. The software design methodology used to develop this tool is presented in this paper.

  13. Software Design Improvements. Part 1; Software Benefits and Limitations

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R.; Packard, Michael H.; Ziemianski, Tom

    1997-01-01

    Computer hardware and associated software have been used for many years to process accounting information, to analyze test data and to perform engineering analysis. Now computers and software also control everything from automobiles to washing machines, and the number and type of applications are growing at an exponential rate. The size of individual programs has shown similar growth. Furthermore, software and hardware are used to monitor and/or control potentially dangerous products and safety-critical systems. These uses include everything from airplanes and braking systems to medical devices and nuclear plants. The questions are: how can this hardware and software be made more reliable? How can software quality be improved? What methodology needs to be applied to large and small software products to improve their design, and how can software be verified?

  14. Geant4 studies of the CNAO facility system for hadrontherapy treatment of uveal melanomas

    NASA Astrophysics Data System (ADS)

    Rimoldi, A.; Piersimoni, P.; Pirola, M.; Riccardi, C.

    2014-06-01

    The Italian National Centre of Hadrontherapy for Cancer Treatment (CNAO, Centro Nazionale di Adroterapia Oncologica) in Pavia, Italy, began treating its first patients for selected cancers in late 2011. In the coming months, CNAO plans to activate a new dedicated treatment line for the irradiation of uveal melanomas using the available active beam scan. The beam characteristics and the experimental setup must be tuned to reach the precision required for such treatments. A collaboration between the CNAO foundation, the University of Pavia and INFN began in 2011 to study the feasibility of these specialised treatments by implementing a MC simulation of the transport beam line and comparing the simulation results with measurements at CNAO. The goal is to optimise an eye-dedicated transport beam line and to find the best conditions for ocular melanoma irradiations. This paper describes the Geant4 toolkit simulation of the CNAO setup, together with a modelled human eye containing a tumour. The Geant4 application could also be used to test possible treatment planning systems. Simulation results illustrate the possibility of adapting the CNAO standard transport beam line by optimising the position of the isocentre and adding passive elements to better shape the beam for this dedicated study.

  15. A Compton camera application for the GAMOS GEANT4-based framework

    NASA Astrophysics Data System (ADS)

    Harkness, L. J.; Arce, P.; Judson, D. S.; Boston, A. J.; Boston, H. C.; Cresswell, J. R.; Dormand, J.; Jones, M.; Nolan, P. J.; Sampson, J. A.; Scraggs, D. P.; Sweeney, A.; Lazarus, I.; Simpson, J.

    2012-04-01

    Compton camera systems can be used to image sources of gamma radiation in a variety of applications such as nuclear medicine, homeland security and nuclear decommissioning. To locate gamma-ray sources, a Compton camera employs electronic collimation, utilising Compton kinematics to reconstruct the paths of gamma rays which interact within the detectors. The main benefit of this technique is the ability to accurately identify and locate sources of gamma radiation within a wide field of view, vastly improving the efficiency and specificity over existing devices. Potential advantages of this imaging technique, along with advances in detector technology, have brought about a rapidly expanding area of research into the optimisation of Compton camera systems, which relies on significant input from Monte-Carlo simulations. In this paper, the functionality of a Compton camera application that has been integrated into GAMOS, the GEANT4-based Architecture for Medicine-Oriented Simulations, is described. The application simplifies the use of GEANT4 for Monte-Carlo investigations by employing a script based language and plug-in technology. To demonstrate the use of the Compton camera application, simulated data have been generated using the GAMOS application and acquired through experiment for a preliminary validation, using a Compton camera configured with double sided high purity germanium strip detectors. Energy spectra and reconstructed images for the data sets are presented.
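
    The "Compton kinematics" step above reconstructs, for each two-site event, a cone of possible source directions from the energies deposited in the detectors. A minimal sketch of that angle calculation (plain kinematics, not the GAMOS API; the event energies are hypothetical):

```python
import math

ME_C2_KEV = 511.0  # electron rest energy

def compton_cone_angle_deg(e1_kev, e2_kev):
    """Opening angle of the Compton cone for a two-site event:
    e1 deposited by Compton scattering in the first detector,
    e2 fully absorbed in the second.
    cos(theta) = 1 - me*c^2 * (1/e2 - 1/(e1 + e2))."""
    cos_t = 1.0 - ME_C2_KEV * (1.0 / e2_kev - 1.0 / (e1_kev + e2_kev))
    return math.degrees(math.acos(cos_t))

# a 662 keV (137Cs) photon depositing 200 keV at the first interaction
angle = compton_cone_angle_deg(200.0, 462.0)
print(f"cone half-angle: {angle:.1f} deg")
```

    Image reconstruction then intersects many such cones to localise the source within the wide field of view.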

  16. Multi-scale hybrid models for radiopharmaceutical dosimetry with Geant4.

    PubMed

    Marcatili, S; Villoing, D; Garcia, M P; Bardiès, M

    2014-12-21

    The accuracy of radiopharmaceutical absorbed dose distributions computed through Monte Carlo (MC) simulations is mostly limited by the low spatial resolution of the 3D imaging techniques used to define the simulation geometry. This issue also persists with the implementation of realistic hybrid models built using polygonal meshes and/or NURBS, as they must be simulated in their voxel form in order to reduce computation times. The existing trade-off between voxel size and simulation speed leads, on one side, to an overestimation of the size of small radiosensitive structures such as the skin or hollow organ walls and, on the other, to an unnecessarily detailed voxelization of large, homogeneous structures. We developed a set of computational tools based on VTK and Geant4 in order to build multi-resolution organ models. Our aim is to use different voxel sizes to represent anatomical regions of different clinical relevance: the MC implementation of these models is expected to improve spatial resolution in specific anatomical structures without significantly affecting simulation speed. Here we present the tools developed through a proof-of-principle example. Our approach is validated against the standard Geant4 technique for the simulation of voxel geometries. PMID:25415621

  17. Enhancement and validation of Geant4 Brachytherapy application on clinical HDR 192Ir source

    NASA Astrophysics Data System (ADS)

    Ababneh, Eshraq; Dababneh, Saed; Qatarneh, Sharif; Wadi-Ramahi, Shada

    2014-10-01

    The Brachytherapy example associated with the Geant4 Monte Carlo (MC) toolkit was adapted and enhanced, and several analysis techniques were developed. The simulation studies the isodose distribution of the total, primary and scattered doses around a Nucletron microSelectron 192Ir source. Different phantom materials were used (water, tissue and bone) and the calculation was conducted at various depths and planes. The work provides an early estimate of the number of primary events required to ultimately achieve a given uncertainty at a given distance in the otherwise CPU- and time-consuming clinical MC calculation. The adaptations of the Geant4 toolkit and the enhancements introduced to the code are all validated, including the comprehensive decay of the 192Ir source, the materials used to build the geometry, the geometry itself and the calculated scatter-to-primary dose ratio. The simulation quantitatively illustrates that the scattered dose in the bone medium is larger than in water and tissue. As the distance from the source increases, the scatter contribution to dose becomes more significant as the primary dose decreases. The developed code could be viewed as a platform containing a detailed dose calculation model for the clinical application of HDR 192Ir in Brachytherapy.

  18. Benchmarking nuclear models of FLUKA and GEANT4 for carbon ion therapy.

    PubMed

    Böhlen, T T; Cerutti, F; Dosanjh, M; Ferrari, A; Gudowska, I; Mairani, A; Quesada, J M

    2010-10-01

    As carbon ions at therapeutic energies penetrate tissue, they undergo inelastic nuclear reactions and give rise to significant yields of secondary fragment fluences. An accurate prediction of the fluences resulting from primary carbon interactions in the patient's body is therefore necessary in order to precisely simulate the spatial dose distribution and the resulting biological effect. In this paper, the performance of the nuclear fragmentation models of the Monte Carlo transport codes FLUKA and GEANT4 is investigated in tissue-like media and for an energy regime relevant to therapeutic carbon ions. The ability of these Monte Carlo codes to reproduce experimental data on charge-changing cross sections and on integral and differential yields of secondary charged fragments is evaluated. For the fragment yields, the main focus is on the consideration of experimental approximations and uncertainties, such as the energy measurement by time-of-flight. For GEANT4, the hadronic models G4BinaryLightIonReaction and G4QMD are benchmarked together with some recently enhanced de-excitation models. For non-differential quantities, discrepancies of some tens of percent are found for both codes. For differential quantities, even larger deviations are found. Implications of these findings for the therapeutic use of carbon ions are discussed. PMID:20844337

  19. Simulation of positron backscattering and implantation profiles using Geant4 code

    NASA Astrophysics Data System (ADS)

    Huang, Shi-Juan; Pan, Zi-Wen; Liu, Jian-Dang; Han, Rong-Dian; Ye, Bang-Jiao

    2015-10-01

    For the proper interpretation of experimental data produced with the slow positron beam technique, positron implantation properties are studied carefully using the latest Geant4 code. The simulated backscattering coefficients, implantation profiles, and median implantation depths for mono-energetic positrons with energies ranging from 1 keV to 50 keV normally incident on different crystals are reported. Our simulated backscattering coefficients are in reasonable agreement with previous experimental results, and we suggest that the accuracy may be related to how the structures of the host materials are described in the Geant4 code. Based on the reasonable simulated backscattering coefficients, the adjustable parameters of the implantation profiles, which depend on the material and implantation energy, are obtained. Notably, we calculate positron backscattering coefficients and median implantation depths in amorphous polymers for the first time, and our simulations are in fairly good agreement with previous experimental results. Project supported by the National Natural Science Foundation of China (Grant Nos. 11175171 and 11105139).
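
    Implantation profiles of slow positrons are commonly parametrised by a Makhovian distribution, whose median depth follows in closed form. A small sketch with commonly quoted (but not this paper's fitted) parameter values:

```python
import math

def makhov_median_depth_nm(energy_kev, density_g_cm3, m=2.0, a=4.0, n=1.6):
    """Median implantation depth from a Makhovian profile
    P(z) = (m z^(m-1) / z0^m) exp(-(z/z0)^m),
    with mean depth zbar = (a/rho) * E^n (a in ug cm^-2 keV^-n gives
    zbar in units of 10 nm). The median is z0 * ln(2)^(1/m),
    where z0 = zbar / Gamma(1 + 1/m)."""
    zbar_nm = 10.0 * a * energy_kev**n / density_g_cm3
    z0 = zbar_nm / math.gamma(1.0 + 1.0 / m)
    return z0 * math.log(2.0) ** (1.0 / m)

# 10 keV positrons in silicon (rho ~ 2.33 g/cm^3)
depth = makhov_median_depth_nm(10.0, 2.33)
print(f"median depth ~ {depth:.0f} nm")
```

    The paper's material- and energy-dependent adjustable parameters play the role of m, a and n here.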

  20. Multi-scale hybrid models for radiopharmaceutical dosimetry with Geant4

    NASA Astrophysics Data System (ADS)

    Marcatili, S.; Villoing, D.; Garcia, M. P.; Bardiès, M.

    2014-12-01

    The accuracy of radiopharmaceutical absorbed dose distributions computed through Monte Carlo (MC) simulations is mostly limited by the low spatial resolution of the 3D imaging techniques used to define the simulation geometry. This issue also persists with the implementation of realistic hybrid models built using polygonal meshes and/or NURBS, as they must be simulated in their voxel form in order to reduce computation times. The existing trade-off between voxel size and simulation speed leads, on one side, to an overestimation of the size of small radiosensitive structures such as the skin or hollow organ walls and, on the other, to an unnecessarily detailed voxelization of large, homogeneous structures. We developed a set of computational tools based on VTK and Geant4 in order to build multi-resolution organ models. Our aim is to use different voxel sizes to represent anatomical regions of different clinical relevance: the MC implementation of these models is expected to improve spatial resolution in specific anatomical structures without significantly affecting simulation speed. Here we present the tools developed through a proof-of-principle example. Our approach is validated against the standard Geant4 technique for the simulation of voxel geometries.

  1. Evaluation of proton inelastic reaction models in Geant4 for prompt gamma production during proton radiotherapy

    NASA Astrophysics Data System (ADS)

    Jeyasugiththan, Jeyasingam; Peterson, Stephen W.

    2015-10-01

    During proton beam radiotherapy, discrete secondary prompt gamma rays are induced by inelastic nuclear reactions between protons and nuclei in the human body. In recent years, the Geant4 Monte Carlo toolkit has played an important role in the development of devices for real-time dose range verification using prompt gamma radiation. Unfortunately, the default physics models in Geant4 do not reliably replicate the measured prompt gamma emission. Determining a suitable physics model for low-energy proton inelastic interactions will boost the accuracy of prompt gamma simulations. Among the built-in physics models, we found that the precompound model with a modified initial exciton state of 2 (1 particle, 1 hole) produced more accurate discrete gamma lines from the most important elements found within the body, such as 16O, 12C and 14N, when compared with the available gamma production cross section data. Using the modified physics model, we investigated the prompt gamma spectra produced in a water phantom by a 200 MeV pencil beam of protons. The spectra were obtained using a LaBr3 detector with a time-of-flight (TOF) window and a BGO active shield to reduce the secondary neutron and gamma background. The simulations show that a 2 ns TOF window could reduce 99% of the secondary neutron flux hitting the detector. The results show that using both timing and active shielding can remove up to 85% of the background radiation, which includes a 33% reduction by BGO subtraction.
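
    The 2 ns TOF cut works because, over a short flight path, prompt gammas arrive at the speed of light while even fast neutrons lag by many nanoseconds. A back-of-the-envelope sketch (the 1 m flight path is hypothetical, not the paper's geometry):

```python
import math

C_M_PER_NS = 0.2998    # speed of light, m/ns
MN_C2_MEV = 939.565    # neutron rest energy

def arrival_time_ns(distance_m, neutron_ke_mev=None):
    """Time of flight over distance_m: a photon if neutron_ke_mev is None,
    otherwise a neutron of the given kinetic energy (relativistic)."""
    if neutron_ke_mev is None:
        return distance_m / C_M_PER_NS
    gamma = 1.0 + neutron_ke_mev / MN_C2_MEV
    beta = math.sqrt(1.0 - 1.0 / gamma**2)
    return distance_m / (beta * C_M_PER_NS)

d = 1.0                                  # m
t_gamma = arrival_time_ns(d)
t_n = arrival_time_ns(d, 10.0)           # 10 MeV neutron
print(f"gamma: {t_gamma:.1f} ns, neutron: {t_n:.1f} ns")
```

    Even a 10 MeV neutron arrives roughly 20 ns after the gammas here, so a window a few ns wide around the prompt-gamma arrival rejects most of the neutron flux.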

  2. Application of dynamic Monte Carlo technique in proton beam radiotherapy using Geant4 simulation toolkit

    NASA Astrophysics Data System (ADS)

    Guan, Fada

    The Monte Carlo method has been successfully applied to simulating particle transport problems. Most Monte Carlo simulation tools are static: they can only perform simulations for problems with fixed physics and geometry settings. Proton therapy, however, is a dynamic treatment technique in clinical application. In this research, we developed a method to perform dynamic Monte Carlo simulation of proton therapy using the Geant4 simulation toolkit. A passive-scattering treatment nozzle equipped with a rotating range modulation wheel was modeled. One important application of the Monte Carlo simulation is to predict the spatial dose distribution in the target geometry. For simplicity, a mathematical model of a human body is usually used as the target, but then only the average dose over a whole organ or tissue can be obtained rather than the accurate spatial dose distribution. In this research, we developed a method using MATLAB to convert the medical images of a patient from CT scanning into a patient voxel geometry. Hence, if the patient voxel geometry is used as the target in the Monte Carlo simulation, the accurate spatial dose distribution in the target can be obtained. The data analysis tool ROOT was used to score the simulation results during a Geant4 simulation and to analyze the data and plot results afterwards. Finally, we successfully obtained the accurate spatial dose distribution in part of a human body after treating a patient with prostate cancer using proton therapy.

  3. Evaluation of proton inelastic reaction models in Geant4 for prompt gamma production during proton radiotherapy.

    PubMed

    Jeyasugiththan, Jeyasingam; Peterson, Stephen W

    2015-10-01

    During proton beam radiotherapy, discrete secondary prompt gamma rays are induced by inelastic nuclear reactions between protons and nuclei in the human body. In recent years, the Geant4 Monte Carlo toolkit has played an important role in the development of devices for real-time dose range verification using prompt gamma radiation. Unfortunately, the default physics models in Geant4 do not reliably replicate the measured prompt gamma emission. Determining a suitable physics model for low-energy proton inelastic interactions will boost the accuracy of prompt gamma simulations. Among the built-in physics models, we found that the precompound model with a modified initial exciton state of 2 (1 particle, 1 hole) produced more accurate discrete gamma lines from the most important elements found within the body, such as 16O, 12C and 14N, when compared with the available gamma production cross section data. Using the modified physics model, we investigated the prompt gamma spectra produced in a water phantom by a 200 MeV pencil beam of protons. The spectra were obtained using a LaBr3 detector with a time-of-flight (TOF) window and a BGO active shield to reduce the secondary neutron and gamma background. The simulations show that a 2 ns TOF window could reduce 99% of the secondary neutron flux hitting the detector. The results show that using both timing and active shielding can remove up to 85% of the background radiation, which includes a 33% reduction by BGO subtraction. PMID:26389549

  4. Working Notes from the 1992 AAAI Workshop on Automating Software Design. Theme: Domain Specific Software Design

    NASA Technical Reports Server (NTRS)

    Keller, Richard M. (Editor); Barstow, David; Lowry, Michael R.; Tong, Christopher H.

    1992-01-01

    The goal of this workshop is to identify different architectural approaches to building domain-specific software design systems and to explore issues unique to domain-specific (vs. general-purpose) software design. Some general issues that cut across the particular software design domain include: (1) knowledge representation, acquisition, and maintenance; (2) specialized software design techniques; and (3) user interaction and user interface.

  5. Software Prototyping: Designing Systems for Users.

    ERIC Educational Resources Information Center

    Spies, Phyllis Bova

    1983-01-01

    Reports on major change in computer software development process--the prototype model, i.e., implementation of skeletal system that is enhanced during interaction with users. Expensive and unreliable software, software design errors, traditional development approach, resources required for prototyping, success stories, and systems designer's role…

  6. The impact of new Geant4-DNA cross section models on electron track structure simulations in liquid water

    NASA Astrophysics Data System (ADS)

    Kyriakou, I.; Šefl, M.; Nourry, V.; Incerti, S.

    2016-05-01

    The most recent release of the open source and general purpose Geant4 Monte Carlo simulation toolkit (Geant4 release 10.2) contains a new set of physics models in the Geant4-DNA extension for improving the modelling of low-energy electron transport in liquid water (<10 keV). This includes updated electron cross sections for excitation, ionization, and elastic scattering. In the present work, the impact of these developments on track-structure calculations is examined, providing the first comprehensive comparison against the default physics models of Geant4-DNA. Significant differences from the default models are found for the average path length and penetration distance, as well as for dose-point kernels, for electron energies below a few hundred eV. On the other hand, self-irradiation absorbed fractions for tissue-like volumes and low-energy electron sources (including some Auger emitters) reveal rather small differences (up to 15%) between the new and default Geant4-DNA models. These findings indicate that the new developments will mainly affect applications where the spatial pattern of interactions and energy deposition of very-low-energy electrons plays an important role, such as the modelling of the chemical and biophysical stages of radiation damage to cells.

  7. Flight Software Design Choices Based on Criticality

    NASA Technical Reports Server (NTRS)

    Lee, Earl

    1999-01-01

    This slide presentation reviews the rationale behind flight software design as a function of criticality. The requirements of human rated systems implies a high criticality for the flight support software. Human life is dependent on correct operation of the software. Flexibility should be permitted when the consequences of software failure are not life threatening. This is also relevant for selecting Commercial Off the Shelf (COTS) software.

  8. Purchasing Computer-Aided Design Software.

    ERIC Educational Resources Information Center

    Smith, Roger A.

    1992-01-01

    Presents a model for the purchase of computer-aided design (CAD) software: collect general information, observe CAD in use, arrange onsite demonstrations, select CAD software and hardware, and choose a vendor. (JOW)

  9. Calculation of extrapolation curves in the 4π(LS)β-γ coincidence technique with the Monte Carlo code Geant4.

    PubMed

    Bobin, C; Thiam, C; Bouchard, J

    2016-03-01

    At LNE-LNHB, a liquid scintillation (LS) detection setup designed for Triple to Double Coincidence Ratio (TDCR) measurements is also used in the β-channel of a 4π(LS)β-γ coincidence system. This LS counter, based on 3 photomultipliers, was first modeled using the Monte Carlo code Geant4 to enable the simulation of optical photons produced by scintillation and Cerenkov effects. This stochastic modeling was especially designed for the calculation of double and triple coincidences between photomultipliers in TDCR measurements. In the present paper, the TDCR-Geant4 model is extended to 4π(LS)β-γ coincidence counting to enable the simulation of the efficiency-extrapolation technique by the addition of a γ-channel. This simulation tool aims at predicting systematic biases in activity determination due to possible non-linearity of efficiency-extrapolation curves. First results are described for the standardization of (59)Fe. The variation of the γ-efficiency in the β-channel due to Cerenkov emission is investigated in the case of activity measurements of (54)Mn. The problem of non-linearity between β-efficiencies is illustrated in the case of the efficiency-tracing technique for activity measurements of (14)C using (60)Co as a tracer. PMID:26699674
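
    The efficiency-extrapolation technique described above can be sketched numerically: the apparent activity is fitted against an inefficiency parameter and extrapolated to perfect β-efficiency. The linear model and all numbers below are illustrative, not the paper's data.

```python
import numpy as np

def extrapolate_activity(n_beta, n_gamma, n_coinc):
    """Estimate the activity N0 by linear extrapolation of the apparent
    activity N_beta*N_gamma/N_c versus (1 - eps)/eps, where the
    beta-channel efficiency is eps = N_c/N_gamma."""
    eps = n_coinc / n_gamma
    x = (1.0 - eps) / eps
    y = n_beta * n_gamma / n_coinc
    slope, intercept = np.polyfit(x, y, 1)  # straight-line fit
    return intercept                        # activity at eps -> 1, i.e. x -> 0

# Synthetic, perfectly linear data: true N0 = 1000 s^-1, slope k = 50
N0, k = 1000.0, 50.0
eps = np.array([0.6, 0.7, 0.8, 0.9])
x = (1 - eps) / eps
n_gamma = np.full_like(eps, 2000.0)
n_coinc = eps * n_gamma
n_beta = (N0 + k * x) * n_coinc / n_gamma
print(round(extrapolate_activity(n_beta, n_gamma, n_coinc), 3))  # -> 1000.0
```

    A non-linear efficiency curve would make the fitted intercept deviate from the true activity, which is exactly the bias the TDCR-Geant4 model is meant to predict.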

  10. Comparative studies on shielding properties of some steel alloys using Geant4, MCNP, WinXCOM and experimental results

    NASA Astrophysics Data System (ADS)

    Singh, Vishwanath P.; Medhat, M. E.; Shirmardi, S. P.

    2015-01-01

    The mass attenuation coefficients, μ/ρ, and effective atomic numbers, Zeff, of some carbon steel and stainless steel alloys have been calculated using the Geant4 and MCNP simulation codes for different gamma-ray energies: 279.1 keV, 661.6 keV, 662 keV, 1115.5 keV, 1173 keV and 1332 keV. The simulated Zeff values from the Geant4 and MCNP codes have been compared with available experimental results and theoretical WinXCom values, and good agreement has been observed. The agreement of the simulated μ/ρ and Zeff values indicates that either simulation code can be used to determine the gamma-ray interaction properties of the alloys at energies where analogous experimental results are not available. Studies of this kind are useful for applications such as radiation dosimetry, medicine and radiation shielding.
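
    For a compound or alloy, WinXCom-style calculations combine elemental mass attenuation coefficients with the mixture rule (Bragg additivity), (μ/ρ)_mix = Σ w_i (μ/ρ)_i. A minimal sketch follows; the composition and elemental values are illustrative placeholders, not tabulated data.

```python
def mu_rho_mixture(weight_fractions, mu_rho_elements):
    """Mixture-rule mass attenuation coefficient.

    weight_fractions, mu_rho_elements: dicts keyed by element symbol."""
    assert abs(sum(weight_fractions.values()) - 1.0) < 1e-6
    return sum(w * mu_rho_elements[el] for el, w in weight_fractions.items())

# Hypothetical stainless-steel-like composition (by weight)
w = {"Fe": 0.70, "Cr": 0.19, "Ni": 0.11}
mu_rho = {"Fe": 0.0739, "Cr": 0.0757, "Ni": 0.0786}  # cm^2/g, illustrative
print(round(mu_rho_mixture(w, mu_rho), 5))  # -> 0.07476
```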

  11. 3D polymer gel dosimetry and Geant4 Monte Carlo characterization of novel needle based X-ray source

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Sozontov, E.; Safronov, V.; Gutman, G.; Strumban, E.; Jiang, Q.; Li, S.

    2010-11-01

    In recent years, there have been a few attempts to develop low-energy x-ray radiation sources as alternatives to the conventional radioisotopes used in brachytherapy. So far, all efforts have centered on the design of an interstitial miniaturized x-ray tube. Though direct irradiation of tumors looks very promising, the known insertable miniature x-ray tubes have many limitations: (a) difficulties with focusing and steering the electron beam to the target; (b) the necessity to cool the target to increase x-ray production efficiency; (c) the impracticability of reducing the diameter of the miniaturized x-ray tube below 4 mm (the requirement to decrease the diameter of the x-ray tube and the need to have a cooling system for the target are mutually exclusive); (d) significant limitations in changing the shape and energy of the emitted radiation. The specific aim of this study is to demonstrate the feasibility of a new concept for an insertable low-energy needle x-ray device, based on simulation with the Geant4 Monte Carlo code, and to measure the dose rate distribution for low-energy (17.5 keV) x-ray radiation with 3D polymer gel dosimetry.

  12. Galactic Cosmic Rays and Lunar Secondary Particles from Solar Minimum to Maximum: CRaTER Observations and Geant4 Modeling

    NASA Astrophysics Data System (ADS)

    Looper, M. D.; Mazur, J. E.; Blake, J. B.; Spence, H. E.; Schwadron, N.; Golightly, M. J.; Case, A. W.; Kasper, J. C.; Townsend, L. W.; Wilson, J. K.

    2014-12-01

    The Lunar Reconnaissance Orbiter mission was launched in 2009 during the recent deep and extended solar minimum, with the highest galactic cosmic ray (GCR) fluxes observed since the beginning of the space era. Its Cosmic Ray Telescope for the Effects of Radiation (CRaTER) instrument was designed to measure the spectra of energy deposits in silicon detectors shielded behind pieces of tissue-equivalent plastic, simulating the self-shielding provided by an astronaut's body around radiation-sensitive organs. The CRaTER data set now covers the evolution of the GCR environment near the Moon during the first five years of development of the present solar cycle. We will present these observations, along with Geant4 modeling to illustrate the varying particle contributions to the energy-deposit spectra. CRaTER has also measured protons traveling up from the lunar surface after their creation during GCR interactions with surface material, and we will report observations and modeling of the energy and angular distributions of these "albedo" protons.

  13. Efficient voxel navigation for proton therapy dose calculation in TOPAS and Geant4

    NASA Astrophysics Data System (ADS)

    Schümann, J.; Paganetti, H.; Shin, J.; Faddegon, B.; Perl, J.

    2012-06-01

    A key task within all Monte Carlo particle transport codes is ‘navigation’, the calculation to determine at each particle step what volume the particle may be leaving and what volume the particle may be entering. Navigation should be optimized to the specific geometry at hand. For patient dose calculation, this geometry generally involves voxelized computed tomography (CT) data. We investigated the efficiency of navigation algorithms on currently available voxel geometry parameterizations in the Monte Carlo simulation package Geant4: G4VPVParameterisation, G4VNestedParameterisation and G4PhantomParameterisation, the last with and without boundary skipping, a method where neighboring voxels with the same Hounsfield unit are combined into one larger voxel. A fourth parameterization approach (MGHParameterization), developed in-house before the latter two parameterizations became available in Geant4, was also included in this study. All simulations were performed using TOPAS, a tool for particle simulations layered on top of Geant4. Runtime comparisons were made on three distinct patient CT data sets: a head and neck, a liver and a prostate patient. We included an additional version of these three patients where all voxels, including the air voxels outside of the patient, were uniformly set to water in the runtime study. The G4VPVParameterisation offers two optimization options. One option has a 60-150 times slower simulation speed. The other is comparable in speed but requires 15-19 times more memory compared to the other parameterizations. We found the average CPU time used for the simulation relative to G4VNestedParameterisation to be 1.014 for G4PhantomParameterisation without boundary skipping and 1.015 for MGHParameterization. The average runtime ratio for G4PhantomParameterisation with and without boundary skipping for our heterogeneous data was equal to 0.97:1. The calculated dose distributions agreed with the reference distribution for all but the G4

  14. Efficient voxel navigation for proton therapy dose calculation in TOPAS and Geant4.

    PubMed

    Schümann, J; Paganetti, H; Shin, J; Faddegon, B; Perl, J

    2012-06-01

    A key task within all Monte Carlo particle transport codes is 'navigation', the calculation to determine at each particle step what volume the particle may be leaving and what volume the particle may be entering. Navigation should be optimized to the specific geometry at hand. For patient dose calculation, this geometry generally involves voxelized computed tomography (CT) data. We investigated the efficiency of navigation algorithms on currently available voxel geometry parameterizations in the Monte Carlo simulation package Geant4: G4VPVParameterisation, G4VNestedParameterisation and G4PhantomParameterisation, the last with and without boundary skipping, a method where neighboring voxels with the same Hounsfield unit are combined into one larger voxel. A fourth parameterization approach (MGHParameterization), developed in-house before the latter two parameterizations became available in Geant4, was also included in this study. All simulations were performed using TOPAS, a tool for particle simulations layered on top of Geant4. Runtime comparisons were made on three distinct patient CT data sets: a head and neck, a liver and a prostate patient. We included an additional version of these three patients where all voxels, including the air voxels outside of the patient, were uniformly set to water in the runtime study. The G4VPVParameterisation offers two optimization options. One option has a 60-150 times slower simulation speed. The other is comparable in speed but requires 15-19 times more memory compared to the other parameterizations. We found the average CPU time used for the simulation relative to G4VNestedParameterisation to be 1.014 for G4PhantomParameterisation without boundary skipping and 1.015 for MGHParameterization. The average runtime ratio for G4PhantomParameterisation with and without boundary skipping for our heterogeneous data was equal to 0.97:1. The calculated dose distributions agreed with the reference distribution for all but the G4Phantom
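
    The "boundary skipping" idea above amounts to run-length encoding along a voxel row: neighboring voxels sharing a Hounsfield unit collapse into one larger voxel, so the navigator crosses fewer boundaries. A minimal 1D sketch (the HU values are illustrative):

```python
from itertools import groupby

def merge_voxels(hu_row):
    """Run-length-encode a 1D row of HU values -> list of (hu, n_voxels)."""
    return [(hu, len(list(g))) for hu, g in groupby(hu_row)]

row = [0, 0, 0, 40, 40, -1000, -1000, -1000, -1000]
print(merge_voxels(row))  # -> [(0, 3), (40, 2), (-1000, 4)]
```

    In a uniform water phantom this collapses nearly the whole row, which is why boundary skipping pays off most on homogeneous data.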

  15. Simulation of Auger electron emission from nanometer-size gold targets using the Geant4 Monte Carlo simulation toolkit

    NASA Astrophysics Data System (ADS)

    Incerti, S.; Suerfu, B.; Xu, J.; Ivantchenko, V.; Mantero, A.; Brown, J. M. C.; Bernal, M. A.; Francis, Z.; Karamitros, M.; Tran, H. N.

    2016-04-01

    A revised atomic deexcitation framework for the Geant4 general-purpose Monte Carlo toolkit, capable of simulating full Auger deexcitation cascades, was implemented in the June 2015 release (version 10.2 Beta). An overview of this refined framework and testing of its capabilities is presented for the irradiation of gold nanoparticles (NPs) with keV photon and MeV proton beams. The resultant energy spectra of secondary particles created within, and escaping from, the NP are analyzed and discussed. It is anticipated that this new functionality will improve and broaden the use of Geant4 in medical physics, radiobiology, nanomedicine research and other low-energy physics fields.

  16. Molecular scale track structure simulations in liquid water using the Geant4-DNA Monte-Carlo processes.

    PubMed

    Francis, Z; Incerti, S; Capra, R; Mascialino, B; Montarou, G; Stepan, V; Villagrasa, C

    2011-01-01

    This paper presents a study of energy deposits induced by ionising particles in liquid water at the molecular scale. Particle track structures were generated using the Geant4-DNA processes of the Geant4 Monte-Carlo toolkit. These processes cover electrons (0.025 eV-1 MeV), protons (1 keV-100 MeV), hydrogen atoms (1 keV-100 MeV) and alpha particles (10 keV-40 MeV), including their different charge states. Electron ranges and lineal energies for protons were calculated in nanometric and micrometric volumes. PMID:20810287

  17. Using Software Design Methods in CALL

    ERIC Educational Resources Information Center

    Ward, Monica

    2006-01-01

    The phrase "software design" is not one that arouses the interest of many CALL practitioners, particularly those from a humanities background. However, software design essentials are simply logical ways of going about designing a system. The fundamentals include modularity, anticipation of change, generality and an incremental approach. While CALL…

  18. Language and Program for Documenting Software Design

    NASA Technical Reports Server (NTRS)

    Kleine, H.; Zepko, T. M.

    1986-01-01

    Software Design and Documentation Language (SDDL) provides effective communication medium to support design and documentation of complex software applications. SDDL supports communication among all members of software design team and provides for production of informative documentation on design effort. Use of SDDL-generated document to analyze design makes it possible to eliminate many errors not detected until coding and testing attempted. SDDL processor program translates designer's creative thinking into effective document for communication. Processor performs as many automatic functions as possible, freeing designer's energy for creative effort. SDDL processor program written in PASCAL.

  19. Knowledge modeling for software design

    NASA Technical Reports Server (NTRS)

    Shaw, Mildred L. G.; Gaines, Brian R.

    1992-01-01

    This paper develops a modeling framework for systems engineering that encompasses systems modeling, task modeling, and knowledge modeling, and allows knowledge engineering and software engineering to be seen as part of a unified developmental process. This framework is used to evaluate what novel contributions the 'knowledge engineering' paradigm has made and how these impact software engineering.

  20. Geant4 calculations for space radiation shielding material Al2O3

    NASA Astrophysics Data System (ADS)

    Capali, Veli; Acar Yesil, Tolga; Kaya, Gokhan; Kaplan, Abdullah; Yavuz, Mustafa; Tilki, Tahir

    2015-07-01

    Aluminium oxide, Al2O3, is one of the most widely used materials in engineering applications. It is a significant aluminium compound because of its hardness and, owing to its high melting point, serves as a refractory material. It has engineering applications in diverse fields such as ballistic armour systems, wear components, electrical and electronic substrates, automotive parts, components for the electric industry and aero-engines. It is also used as a dosimeter for radiation protection and therapy applications because of its optically stimulated luminescence properties. In this study, stopping powers and penetration distances have been calculated for alpha, proton, electron and gamma particles in the space radiation shielding material Al2O3 for incident energies of 1 keV - 1 GeV using the GEANT4 calculation code.
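
    Penetration distances of the kind computed above follow, in the continuous-slowing-down approximation, from integrating the reciprocal stopping power, R = ∫ dE / S(E). A minimal numerical sketch; the power-law S(E) is a made-up stand-in, not Geant4 output or tabulated data for Al2O3.

```python
import numpy as np

def csda_range(S, e0, e_min=1e-3, n=100_000):
    """Trapezoidal integration of R = integral dE / S(E) from e_min to e0."""
    e = np.linspace(e_min, e0, n)
    f = 1.0 / S(e)
    return float(np.sum((f[:-1] + f[1:]) * 0.5 * np.diff(e)))

# Toy power-law stopping power in MeV cm^2/g (illustrative only):
S = lambda e: 100.0 * e ** -0.8
r = csda_range(S, 10.0)   # analytic value for this S: 10**1.8 / 180
print(round(r, 4))        # -> 0.3505
```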

  1. Comparison of MCNPX and Geant4 proton energy deposition predictions for clinical use

    PubMed Central

    Titt, U.; Bednarz, B.; Paganetti, H.

    2012-01-01

    Several different Monte Carlo codes are currently being used at proton therapy centers to improve upon dose predictions over standard methods using analytical or semi-empirical dose algorithms. There is a need to better ascertain the differences between proton dose predictions from different available Monte Carlo codes. In this investigation Geant4 and MCNPX, the two most-utilized Monte Carlo codes for proton therapy applications, were used to predict energy deposition distributions in a variety of geometries, comprising simple water phantoms, water phantoms with complex inserts, and a voxelized geometry based on clinical CT data. Gamma analysis was used to evaluate the differences between the codes' predictions. The results show that in all cases the agreement was better than clinical acceptance criteria. PMID:22996039
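
    The gamma analysis used above combines a distance-to-agreement and a dose-difference criterion into a single pass/fail index. A minimal 1D global implementation; the 3 mm/3% criteria, grid and Gaussian "dose" profiles are illustrative choices, not the paper's data.

```python
import numpy as np

def gamma_index(x_ref, d_ref, x_eval, d_eval, dta=3.0, dd=0.03):
    """Global 1D gamma: for each reference point, minimize the combined
    distance/dose-difference metric over all evaluated points.
    dd is taken relative to the global maximum of the reference dose."""
    d_max = d_ref.max()
    gammas = []
    for xr, dr in zip(x_ref, d_ref):
        dist2 = ((x_eval - xr) / dta) ** 2
        dose2 = ((d_eval - dr) / (dd * d_max)) ** 2
        gammas.append(np.sqrt(np.min(dist2 + dose2)))
    return np.array(gammas)

x = np.linspace(0, 100, 201)            # mm, 0.5 mm grid
d1 = np.exp(-((x - 50) / 20) ** 2)      # reference "dose" profile
d2 = np.exp(-((x - 50.5) / 20) ** 2)    # evaluated profile, shifted 0.5 mm
g = gamma_index(x, d1, x, d2)
print(g.max() <= 1.0)                   # 0.5 mm shift passes 3 mm/3%
```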

  2. Signal pulse emulation for scintillation detectors using Geant4 Monte Carlo with light tracking simulation

    NASA Astrophysics Data System (ADS)

    Ogawara, R.; Ishikawa, M.

    2016-07-01

    The anode pulse of a photomultiplier tube (PMT) coupled with a scintillator is used for pulse shape discrimination (PSD) analysis. We have developed a novel emulation technique for the PMT anode pulse based on optical photon transport and a PMT response function. The photon transport was calculated using the Geant4 Monte Carlo code, and the response function was measured with a BC408 organic scintillator. The percentage RMS values of the difference between measured and simulated pulses, obtained with suitable scintillation properties for GSO:Ce (0.4, 1.0 and 1.5 mol%), LaBr3:Ce and BGO scintillators, were 2.41%, 2.58%, 2.16%, 2.01% and 3.32%, respectively. The proposed technique demonstrates high reproducibility of the measured pulse and can be applied to simulation studies of various radiation measurements.
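
    The emulation principle can be sketched as a convolution of simulated photon arrival times with a single-photoelectron (SPE) response. The decay time and SPE shape below are illustrative toys, not the BC408 or GSO values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
tau = 30.0                                   # ns, toy scintillation decay time
arrivals = rng.exponential(tau, size=5000)   # simulated photon arrival times

t = np.arange(0.0, 300.0, 0.5)               # ns time base, 0.5 ns bins
hist, _ = np.histogram(arrivals, bins=np.append(t, t[-1] + 0.5))

# Gaussian stand-in for the measured single-photoelectron response:
spe = np.exp(-((np.arange(0, 20, 0.5) - 5.0) ** 2) / 8.0)
pulse = np.convolve(hist, spe)[: len(t)]     # emulated anode pulse

print(len(pulse) == len(t), pulse.argmax() * 0.5 < 50.0)  # -> True True
```

    A measured SPE template and realistic photon-transport timing from Geant4 would replace the toy inputs in an actual study.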

  3. GEANT4 Application for the Simulation of the Head of a Siemens Primus Linac

    SciTech Connect

    Cortes-Giraldo, M. A.; Quesada, J. M.; Gallardo, M. I.

    2010-04-26

    The Monte Carlo simulation of the head of a Siemens Primus Linac used at Virgen Macarena Hospital (Sevilla, Spain) has been performed using the code GEANT4, version 9.2. In this work, the main features of the application built by our group are presented. They are mainly focused on optimizing the performance of the simulation. The geometry, including the water phantom, has been entirely wrapped in a shielding volume which discards all particles escaping far away through its walls. This saves a factor of four in simulation time. An interface to read and write phase-space files in IAEA format has also been developed to save CPU time in our simulations. Finally, some calculations of the dose absorbed in the water phantom have been performed and compared with the results given by EGSnrc and with experimental data obtained for the calibration of the machine.

  4. Geant4 simulation of the n_TOF-EAR2 neutron beam: Characteristics and prospects

    NASA Astrophysics Data System (ADS)

    Lerendegui-Marco, J.; Lo Meo, S.; Guerrero, C.; Cortés-Giraldo, M. A.; Massimi, C.; Quesada, J. M.; Barbagallo, M.; Colonna, N.; Mancusi, D.; Mingrone, F.; Sabaté-Gilarte, M.; Vannini, G.; Vlachoudis, V.

    2016-04-01

    The characteristics of the neutron beam at the new n_TOF-EAR2 facility have been simulated with the Geant4 code with the aim of providing useful data for both the analysis and the planning of upcoming measurements. The spatial and energy distributions of the neutrons, the resolution function and the in-beam γ-ray background have been studied in detail, and their implications for the forthcoming experiments are discussed. The results confirm that, with this new short (18.5 m flight path) beam line reaching an instantaneous neutron flux beyond 10^5 n/μs/pulse in the keV region, n_TOF is one of the few facilities where challenging measurements can be performed, in particular those involving short-lived radioisotopes.
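
    At a time-of-flight facility like EAR2, neutron kinetic energy follows from the flight path and the measured flight time via the relativistic relation E = m_n(γ − 1). A quick utility for the 18.5 m path (constants rounded; this is a sketch, not the facility's analysis code):

```python
import math

M_N = 939.565      # neutron rest mass, MeV/c^2
C = 0.299792458    # speed of light, m/ns

def tof_to_energy(t_ns, L=18.5):
    """Kinetic energy in MeV for a neutron covering L metres in t_ns ns."""
    beta = L / (t_ns * C)
    gamma = 1.0 / math.sqrt(1.0 - beta * beta)
    return M_N * (gamma - 1.0)

# A 450 ns flight over 18.5 m corresponds to a keV-to-MeV-region neutron:
print(round(tof_to_energy(450.0), 2))
```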

  5. Geant4 Predictions of Energy Spectra in Typical Space Radiation Environment

    NASA Technical Reports Server (NTRS)

    Sabra, M. S.; Barghouty, A. F.

    2014-01-01

    Accurate knowledge of energy spectra inside spacecraft is important for protecting astronauts as well as sensitive electronics from the harmful effects of space radiation. Such knowledge allows one to confidently map the radiation environment inside the vehicle. The purpose of this talk is to present preliminary calculations for energy spectra inside a spherical shell shielding and behind a slab in typical space radiation environment using the 3D Monte-Carlo transport code Geant4. We have simulated proton and iron isotropic sources and beams impinging on Aluminum and Gallium arsenide (GaAs) targets at energies of 0.2, 0.6, 1, and 10 GeV/u. If time permits, other radiation sources and beams (α, C, O) and targets (C, Si, Ge, water) will be presented. The results are compared to ground-based measurements where available.

  6. Geant4.10 simulation of geometric model for metaphase chromosome

    NASA Astrophysics Data System (ADS)

    Rafat-Motavalli, L.; Miri-Hakimabad, H.; Bakhtiyari, E.

    2016-04-01

    In this paper, a geometric model of metaphase chromosome is explained. The model is constructed according to the packing ratio and dimension of the structure from nucleosome up to chromosome. A B-DNA base pair is used to construct 200 base pairs of nucleosomes. Each chromatin fiber loop, which is the unit of repeat, has 49,200 bp. This geometry is entered in Geant4.10 Monte Carlo simulation toolkit and can be extended to the whole metaphase chromosomes and any application in which a DNA geometrical model is needed. The chromosome base pairs, chromosome length, and relative length of chromosomes are calculated. The calculated relative length is compared to the relative length of human chromosomes.
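
    The bookkeeping behind such a model is straightforward: base pairs per chromatin-fiber loop accumulate into chromosome lengths, and relative lengths are ratios to the total. The 49,200 bp per loop comes from the abstract; the per-chromosome loop counts below are hypothetical, not the paper's values.

```python
BP_PER_LOOP = 49_200   # base pairs per chromatin-fiber loop (from the model)

def relative_lengths(loops_per_chromosome):
    """Relative chromosome lengths from the number of loops in each one."""
    bp = [n * BP_PER_LOOP for n in loops_per_chromosome]
    total = sum(bp)
    return [b / total for b in bp]

rel = relative_lengths([5000, 3000, 2000])   # hypothetical loop counts
print([round(r, 2) for r in rel])            # -> [0.5, 0.3, 0.2]
```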

  7. ALGEBRA: ALgorithm for the heterogeneous dosimetry based on GEANT4 for BRAchytherapy.

    PubMed

    Afsharpour, H; Landry, G; D'Amours, M; Enger, S; Reniers, B; Poon, E; Carrier, J-F; Verhaegen, F; Beaulieu, L

    2012-06-01

    Task group 43 (TG43)-based dosimetry algorithms are efficient for brachytherapy dose calculation in water. However, human tissues have chemical compositions and densities different than water. Moreover, the mutual shielding effect of seeds on each other (interseed attenuation) is neglected in the TG43-based dosimetry platforms. The scientific community has expressed the need for an accurate dosimetry platform in brachytherapy. The purpose of this paper is to present ALGEBRA, a Monte Carlo platform for dosimetry in brachytherapy which is sufficiently fast and accurate for clinical and research purposes. ALGEBRA is based on the GEANT4 Monte Carlo code and is capable of handling the DICOM RT standard to recreate a virtual model of the treated site. Here, the performance of ALGEBRA is presented for the special case of LDR brachytherapy in permanent prostate and breast seed implants. However, the algorithm is also capable of handling other treatments such as HDR brachytherapy. PMID:22572100

  8. ALGEBRA: ALgorithm for the heterogeneous dosimetry based on GEANT4 for BRAchytherapy

    NASA Astrophysics Data System (ADS)

    Afsharpour, H.; Landry, G.; D'Amours, M.; Enger, S.; Reniers, B.; Poon, E.; Carrier, J.-F.; Verhaegen, F.; Beaulieu, L.

    2012-06-01

    Task group 43 (TG43)-based dosimetry algorithms are efficient for brachytherapy dose calculation in water. However, human tissues have chemical compositions and densities different than water. Moreover, the mutual shielding effect of seeds on each other (interseed attenuation) is neglected in the TG43-based dosimetry platforms. The scientific community has expressed the need for an accurate dosimetry platform in brachytherapy. The purpose of this paper is to present ALGEBRA, a Monte Carlo platform for dosimetry in brachytherapy which is sufficiently fast and accurate for clinical and research purposes. ALGEBRA is based on the GEANT4 Monte Carlo code and is capable of handling the DICOM RT standard to recreate a virtual model of the treated site. Here, the performance of ALGEBRA is presented for the special case of LDR brachytherapy in permanent prostate and breast seed implants. However, the algorithm is also capable of handling other treatments such as HDR brachytherapy.
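
    For context, the TG43 formalism that ALGEBRA is compared against reduces, for a point source, to D(r) = S_K · Λ · (r0/r)² · g(r) · φ_an(r). A minimal sketch with illustrative parameter values (not data for any real seed):

```python
def tg43_point_dose_rate(r, S_K=1.0, Lam=0.965, r0=1.0,
                         g=lambda r: 1.0, phi_an=lambda r: 1.0):
    """TG43 point-source dose rate (cGy/h) at radius r (cm).

    S_K: air-kerma strength (U); Lam: dose-rate constant (cGy/h/U);
    g, phi_an: radial dose function and 1D anisotropy function."""
    return S_K * Lam * (r0 / r) ** 2 * g(r) * phi_an(r)

# At the reference point r = 1 cm, with g = phi_an = 1:
print(round(tg43_point_dose_rate(1.0), 3))  # -> 0.965
```

    A Monte Carlo platform like ALGEBRA replaces the water-only assumptions baked into g(r) and φ_an(r) with explicit transport through patient tissues and neighboring seeds.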

  9. Development and validation of a GEANT4 radiation transport code for CT dosimetry

    PubMed Central

    Carver, DE; Kost, SD; Fernald, MJ; Lewis, KG; Fraser, ND; Pickens, DR; Price, RR; Stabin, MG

    2014-01-01

    We have created a radiation transport code using the GEANT4 Monte Carlo toolkit to simulate pediatric patients undergoing CT examinations. The focus of this paper is to validate our simulation with real-world physical dosimetry measurements using two independent techniques. Exposure measurements were made with a standard 100-mm CT pencil ionization chamber, and absorbed doses were also measured using optically stimulated luminescent (OSL) dosimeters. Measurements were made in air, a standard 16-cm acrylic head phantom, and a standard 32-cm acrylic body phantom. Physical dose measurements determined from the ionization chamber in air for 100 and 120 kVp beam energies were used to derive photon-fluence calibration factors. Both ion chamber and OSL measurement results provide useful comparisons in the validation of our Monte Carlo simulations. We found that simulated and measured CTDI values were within an overall average of 6% of each other. PMID:25706135

  10. Geant4 predictions of energy spectra in typical space radiation environment

    NASA Astrophysics Data System (ADS)

    Sabra, M. S.; Barghouty, A. F.

    2014-03-01

    Accurate knowledge of energy spectra inside spacecraft is important for protecting astronauts as well as sensitive electronics from the harmful effects of space radiation. Such knowledge allows one to confidently map the radiation environment inside the vehicle. The purpose of this talk is to present preliminary calculations for energy spectra inside a spherical shell shielding and behind a slab in typical space radiation environment using the 3D Monte-Carlo transport code Geant4. We have simulated proton and iron isotropic sources and beams impinging on Aluminum and Gallium arsenide (GaAs) targets at energies of 0.2, 0.6, 1, and 10 GeV/u. If time permits, other radiation sources and beams (α, C, O) and targets (C, Si, Ge, water) will be presented. The results are compared to ground-based measurements where available.

  11. ROSI and GEANT4 - A comparison in the context of high energy X-ray physics

    NASA Astrophysics Data System (ADS)

    Kiunke, Markus; Stritt, Carina; Schielein, Richard; Sukowski, Frank; Hölzing, Astrid; Zabler, Simon; Hofmann, Jürgen; Flisch, Alexander; Kasperl, Stefan; Sennhauser, Urs; Hanke, Randolf

    2016-06-01

    This work compares two popular MC simulation frameworks, ROSI (Roentgen Simulation) and GEANT4 (Geometry and Tracking, in its fourth version), in the context of X-ray physics. The comparison is performed with the help of a parameter study considering energy, material and length variations. While the total deposited energy as well as the contribution of Compton scattering show good accordance between all simulated configurations, all other physical effects exhibit large deviations between the data sets. These discrepancies between simulations are shown to originate from the different cross-section databases used in the frameworks, whereas the overall simulation mechanics appear to have no influence on the agreement of the simulations. A scan over energy, length and material shows that energy and material have a significant influence on the agreement of the simulation results, while the length parameter shows no noticeable influence on the deviations between the data sets.

  12. Performance of the Nab segmented silicon detectors: GEANT4 and data

    NASA Astrophysics Data System (ADS)

    Frlez, Emil; Nab Collaboration

    2015-10-01

    The Nab Collaboration has proposed to measure neutron β-decay correlation parameters a and b at the Oak Ridge National Laboratory using a custom superconducting spectrometer and novel Si detectors. Two large area 2-mm thick silicon detectors, each segmented into 127 hexagonal pixels, will be used to detect the proton and electron from cold neutron decay. We present GEANT4 Monte Carlo simulations of the Si detector energy and timing responses to electrons below 1 MeV and to 30 keV protons with realistic simulated amplified anode waveforms. Both the data acquired with a prototype detector at Los Alamos National Laboratory with radioactive sources and the synthetic waveforms are analyzed by the same code. Energy and timing responses of the Si detectors are discussed, with the MC waveforms calibrated to the decay constants, baselines, noise, gains, and timing offsets extracted from measured data, pixel by pixel. Work supported by NSF Grants PHY-1126683, 1205833, 1307328, 1506320, and others.

  13. Modeling spallation reactions in tungsten and uranium targets with the Geant4 toolkit

    NASA Astrophysics Data System (ADS)

    Malyshkin, Yury; Pshenichnov, Igor; Mishustin, Igor; Greiner, Walter

    2012-02-01

    We study primary and secondary reactions induced by 600 MeV proton beams in monolithic cylindrical targets made of natural tungsten and uranium by using Monte Carlo simulations with the Geant4 toolkit [1-3]. Bertini intranuclear cascade model, Binary cascade model and IntraNuclear Cascade Liège (INCL) with ABLA model [4] were used as calculational options to describe nuclear reactions. Fission cross sections, neutron multiplicity and mass distributions of fragments for 238U fission induced by 25.6 and 62.9 MeV protons are calculated and compared to recent experimental data [5]. Time distributions of neutron leakage from the targets and heat depositions are calculated. This project is supported by Siemens Corporate Technology.

  14. Nuclear reaction measurements on tissue-equivalent materials and GEANT4 Monte Carlo simulations for hadrontherapy

    NASA Astrophysics Data System (ADS)

    De Napoli, M.; Romano, F.; D'Urso, D.; Licciardello, T.; Agodi, C.; Candiano, G.; Cappuzzello, F.; Cirrone, G. A. P.; Cuttone, G.; Musumarra, A.; Pandola, L.; Scuderi, V.

    2014-12-01

    When a carbon beam interacts with human tissues, many secondary fragments are produced into the tumor region and the surrounding healthy tissues. Therefore, in hadrontherapy precise dose calculations require Monte Carlo tools equipped with complex nuclear reaction models. To get realistic predictions, however, simulation codes must be validated against experimental results; the wider the dataset is, the more the models are finely tuned. Since no fragmentation data for tissue-equivalent materials at Fermi energies are available in literature, we measured secondary fragments produced by the interaction of a 55.6 MeV/u 12C beam with thick muscle and cortical bone targets. Three reaction models used by the Geant4 Monte Carlo code, the Binary Light Ions Cascade, the Quantum Molecular Dynamic and the Liege Intranuclear Cascade, have been benchmarked against the collected data. In this work we present the experimental results and we discuss the predictive power of the above mentioned models.

  15. G4SiPM: A novel silicon photomultiplier simulation package for Geant4

    NASA Astrophysics Data System (ADS)

    Niggemann, Tim; Dietz-Laursonn, Erik; Hebbeker, Thomas; Künsken, Andreas; Lauscher, Markus; Merschmeyer, Markus

    2015-07-01

    The signal of silicon photomultipliers (SiPMs) depends not only on the number of incoming photons but also on thermal and correlated noise of which the latter is difficult to handle. Additionally, the properties of SiPMs vary with the supplied bias voltage and the ambient temperature. The purpose of the G4SiPM simulation package is the integration of a detailed SiPM simulation into Geant4 which is widely used in particle physics. The prediction of the G4SiPM simulation code is validated with a laboratory measurement of the dynamic range of a 3×3 mm2 SiPM with 3600 cells manufactured by Hamamatsu.
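
    The dynamic-range measurement such a simulation is validated against follows, to first order, the classic cell-saturation formula N_fired = N_cells·(1 − exp(−N_ph·PDE/N_cells)). A sketch using the 3600-cell count from the abstract; the photon detection efficiency (PDE) value is an assumed placeholder.

```python
import math

def fired_cells(n_photons, n_cells=3600, pde=0.35):
    """Mean number of fired SiPM cells for n_photons incident photons,
    ignoring noise and recovery-time effects."""
    return n_cells * (1.0 - math.exp(-n_photons * pde / n_cells))

print(round(fired_cells(100), 1))      # nearly linear regime -> 34.8
print(round(fired_cells(100000), 1))   # saturated, approaches 3600 -> 3599.8
```

    Correlated noise (crosstalk, afterpulsing), which this formula ignores, is precisely what a detailed package like G4SiPM adds.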

  16. Geant4 Simulations of SuperCDMS iZip Detector Charge Carrier Propagation

    NASA Astrophysics Data System (ADS)

    Agnese, Robert; Brandt, Daniel; Redl, Peter; Asai, Makoto; Faiez, Dana; Kelsey, Mike; Bagli, Enrico; Anderson, Adam; Schlupf, Chandler

    2014-03-01

    The SuperCDMS experiment uses germanium crystal detectors instrumented with ionization and phonon readout circuits to search for dark matter. In order to simulate the response of the detectors to particle interactions the SuperCDMS Detector Monte Carlo (DMC) group has been implementing the processes governing electrons and phonons at low temperatures in Geant4. The charge portion of the DMC simulates oblique propagation of the electrons through the L-valleys, propagation of holes through the Γ-valleys, inter-valley scattering, and emission of Neganov-Luke phonons in a complex applied electric field. The field is calculated by applying a directed walk search on a tetrahedral mesh of known potentials and then interpolating the value. This talk will present an overview of the DMC status and a comparison of the charge portion of the DMC to experimental data of electron-hole pair propagation in germanium.

  17. Comparing Geant4 hadronic models for the WENDI-II rem meter response function.

    PubMed

    Vanaudenhove, T; Dubus, A; Pauly, N

    2013-01-01

    The WENDI-II rem meter is one of the most popular neutron dosemeters used to assess a useful quantity in radiation protection, namely the ambient dose equivalent. This is due to its high sensitivity and to an energy response that approximately follows the conversion function between neutron fluence and ambient dose equivalent over the range from thermal energies to 5 GeV. Simulating the WENDI-II response function with the Geant4 toolkit is therefore well suited to comparing the low- and high-energy hadronic models provided by this Monte Carlo code. The results showed that the thermal treatment of hydrogen in polyethylene for neutrons below 4 eV has a great influence over the whole detector range. Above 19 MeV, both the Bertini Cascade and Binary Cascade models show a good correlation with the results found in the literature, while low-energy parameterised models are not suitable for this application. PMID:22972796
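    The ambient dose equivalent that the WENDI-II approximates is obtained by folding the neutron fluence spectrum with fluence-to-dose conversion coefficients. A minimal sketch of that folding; the coefficients and fluences below are hypothetical placeholders, not the ICRP/ICRU tabulated values:

```python
def ambient_dose_equivalent(fluence_per_bin, h_coefficients):
    """Fold a binned neutron fluence with fluence-to-H*(10) conversion
    coefficients: H*(10) = sum over bins of phi(E) * h(E)."""
    assert len(fluence_per_bin) == len(h_coefficients)
    return sum(phi * h for phi, h in zip(fluence_per_bin, h_coefficients))

phi = [1e4, 5e3, 1e3]    # neutrons per cm^2 in each energy bin (hypothetical)
h = [1e-5, 4e-5, 40e-5]  # pSv cm^2 per bin (hypothetical, rising with energy)
H = ambient_dose_equivalent(phi, h)
```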

  18. Radiation quality of cosmic ray nuclei studied with Geant4-based simulations

    NASA Astrophysics Data System (ADS)

    Burigo, Lucas N.; Pshenichnov, Igor A.; Mishustin, Igor N.; Bleicher, Marcus

    2014-04-01

    In future missions in deep space, a spacecraft will be exposed to a non-negligible flux of high charge and energy (HZE) particles present in the galactic cosmic rays (GCR). One of the major concerns of manned missions is the impact on humans of the complex radiation fields which result from the interactions of HZE particles with the spacecraft materials. The radiation quality of several ions representative of the GCR is investigated by calculating microdosimetry spectra. A Geant4-based Monte Carlo model for Heavy Ion Therapy (MCHIT) is used to simulate microdosimetry data for HZE particles in extended media where fragmentation reactions play a certain role. Our model is able to reproduce measured microdosimetry spectra for H, He, Li, C and Si in the energy range of 150-490 MeV/u. The effect of nuclear fragmentation on the relative biological effectiveness (RBE) of He, Li and C is estimated and found to be below 10%.
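    Microdosimetry spectra like those computed with MCHIT are usually summarised by the frequency-mean and dose-mean lineal energies. A sketch of these two standard estimators (illustrative, not MCHIT code):

```python
def frequency_mean_y(y_values, frequencies):
    """Frequency-mean lineal energy: y_F = sum(y*f) / sum(f)."""
    return sum(y * f for y, f in zip(y_values, frequencies)) / sum(frequencies)

def dose_mean_y(y_values, frequencies):
    """Dose-mean lineal energy: y_D = sum(y^2 * f) / sum(y * f); it weights
    higher lineal energies more strongly, tracking radiation quality."""
    num = sum(y * y * f for y, f in zip(y_values, frequencies))
    den = sum(y * f for y, f in zip(y_values, frequencies))
    return num / den
```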

  19. Evaluation using GEANT4 of the transit dose in the Tunisian gamma irradiator for insect sterilization.

    PubMed

    Mannai, K; Askri, B; Loussaief, A; Trabelsi, A

    2007-06-01

    A simulation study of the Tunisian Gamma Irradiation Facility for sterile insect release programs has been carried out using the GEANT4 Monte Carlo code of CERN. The dose was calculated and measured for high and low dose values inside the irradiation cell. The calculated high dose was in good agreement with measurements. However, a discrepancy between calculated and measured values occurs at the dose levels commonly used for sterilization of insects. We argue that this discrepancy is due to the transit dose absorbed during the displacement of targets from their initial position towards their irradiation position and the displacement of radiation source pencils from storage towards their irradiation position. The discrepancy is corrected by taking the transit dose into account. PMID:17395474

  20. GEANT4 calibration of gamma spectrometry efficiency for measurements of airborne radioactivity on filter paper.

    PubMed

    Alrefae, Tareq

    2014-11-01

    A simple method of efficiency calibration for gamma spectrometry was performed. This method, which focused on measuring airborne radioactivity collected on filter paper, was based on Monte Carlo simulations using the toolkit GEANT4. Experimentally, the efficiency values of an HPGe detector were calculated for a multi-gamma disk source. These efficiency values were compared to their counterparts produced by a computer code that simulated experimental conditions. Such comparison revealed biases of 24, 10, 1, 3, 7, and 3% for the radionuclides (photon energies in keV) of Ce (166), Sn (392), Cs (662), Co (1,173), Co (1,333), and Y (1,836), respectively. The output of the simulation code was in acceptable agreement with the experimental findings, thus validating the proposed method. PMID:25271933
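    The biases quoted above follow from the standard definitions of full-energy-peak efficiency and relative deviation. A short sketch with hypothetical numbers:

```python
def peak_efficiency(net_counts, activity_bq, live_time_s, gamma_intensity):
    """Full-energy-peak efficiency: detected peak counts divided by the
    number of photons emitted at that energy during the measurement."""
    return net_counts / (activity_bq * live_time_s * gamma_intensity)

def relative_bias(simulated_eff, measured_eff):
    """Signed relative deviation of a simulated efficiency from measurement."""
    return (simulated_eff - measured_eff) / measured_eff

eff = peak_efficiency(net_counts=1000, activity_bq=100,
                      live_time_s=100, gamma_intensity=1.0)  # 0.1
bias = relative_bias(0.124, 0.100)  # 0.24, i.e. a 24% bias
```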

  1. Development and validation of a GEANT4 radiation transport code for CT dosimetry.

    PubMed

    Carver, D E; Kost, S D; Fernald, M J; Lewis, K G; Fraser, N D; Pickens, D R; Price, R R; Stabin, M G

    2015-04-01

    The authors have created a radiation transport code using the GEANT4 Monte Carlo toolkit to simulate pediatric patients undergoing CT examinations. The focus of this paper is to validate their simulation with real-world physical dosimetry measurements using two independent techniques. Exposure measurements were made with a standard 100-mm CT pencil ionization chamber, and absorbed doses were also measured using optically stimulated luminescent (OSL) dosimeters. Measurements were made in air with a standard 16-cm acrylic head phantom and with a standard 32-cm acrylic body phantom. Physical dose measurements determined from the ionization chamber in air for 100 and 120 kVp beam energies were used to derive photon-fluence calibration factors. Both ion chamber and OSL measurement results provide useful comparisons in the validation of the Monte Carlo simulations. It was found that simulated and measured CTDI values were within an overall average of 6% of each other. PMID:25706135
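    The CTDI values compared in the validation follow the standard definitions: CTDI100 integrates the dose profile captured by the 100-mm pencil chamber and divides by the total nominal beam width, and the weighted CTDI combines centre and periphery phantom positions. A sketch of these definitions with illustrative numbers (not the authors' data):

```python
def ctdi_100(dose_samples_mGy, dz_mm, n_slices, slice_thickness_mm):
    """Approximate CTDI100: integral of the dose profile (sampled every
    dz_mm along the chamber) divided by the total beam width N*T."""
    return sum(dose_samples_mGy) * dz_mm / (n_slices * slice_thickness_mm)

def ctdi_w(center_mGy, periphery_mGy):
    """Weighted CTDI: one third centre plus two thirds periphery."""
    return center_mGy / 3.0 + 2.0 * periphery_mGy / 3.0
```

    For example, a 10 mGy centre and 20 mGy periphery reading gives a weighted CTDI of about 16.7 mGy.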

  2. Application of Design Patterns in Refactoring Software Design

    NASA Technical Reports Server (NTRS)

    Baggs, Rhoda; Shaykhian, Gholam Ali

    2007-01-01

    Refactoring software design is a method of changing a software design while explicitly preserving its functionality. The approach presented here is to use design patterns as the basis for refactoring software design. Design solutions are compared through C++ programming language examples to illustrate this approach. The development of reusable components is discussed, and the paper shows that the construction of such components can diminish the added burden of both refactoring and the use of design patterns.
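    As an illustration of the idea (in Python rather than the paper's C++, and with a hypothetical report-formatting example), a conditional chain can be refactored into the Strategy pattern while preserving its observable behaviour:

```python
# Before refactoring: behaviour is selected with a conditional chain.
def format_report_v1(values, style):
    if style == "csv":
        return ",".join(map(str, values))
    elif style == "tsv":
        return "\t".join(map(str, values))
    raise ValueError(style)

# After refactoring to the Strategy design pattern: each behaviour is an
# interchangeable object, so new styles are added without editing the
# caller, and the external behaviour is explicitly preserved.
class CsvStrategy:
    def render(self, values):
        return ",".join(map(str, values))

class TsvStrategy:
    def render(self, values):
        return "\t".join(map(str, values))

def format_report_v2(values, strategy):
    return strategy.render(values)
```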

  3. GGEMS-Brachy: GPU GEant4-based Monte Carlo simulation for brachytherapy applications

    NASA Astrophysics Data System (ADS)

    Lemaréchal, Yannick; Bert, Julien; Falconnet, Claire; Després, Philippe; Valeri, Antoine; Schick, Ulrike; Pradier, Olivier; Garcia, Marie-Paule; Boussion, Nicolas; Visvikis, Dimitris

    2015-07-01

    In brachytherapy, plans are routinely calculated using the AAPM TG43 formalism which considers the patient as a simple water object. An accurate modeling of the physical processes considering patient heterogeneity using Monte Carlo simulation (MCS) methods is currently too time-consuming and computationally demanding to be routinely used. In this work we implemented and evaluated an accurate and fast MCS on Graphics Processing Units (GPU) for brachytherapy low dose rate (LDR) applications. A previously proposed Geant4 based MCS framework implemented on GPU (GGEMS) was extended to include a hybrid GPU navigator, allowing navigation within voxelized patient specific images and analytically modeled 125I seeds used in LDR brachytherapy. In addition, dose scoring based on track length estimator including uncertainty calculations was incorporated. The implemented GGEMS-brachy platform was validated using a comparison with Geant4 simulations and reference datasets. Finally, a comparative dosimetry study based on the current clinical standard (TG43) and the proposed platform was performed on twelve prostate cancer patients undergoing LDR brachytherapy. Considering patient 3D CT volumes of 400  × 250  × 65 voxels and an average of 58 implanted seeds, the mean patient dosimetry study run time for a 2% dose uncertainty was 9.35 s (≈500 ms 10-6 simulated particles) and 2.5 s when using one and four GPUs, respectively. The performance of the proposed GGEMS-brachy platform allows envisaging the use of Monte Carlo simulation based dosimetry studies in brachytherapy compatible with clinical practice. Although the proposed platform was evaluated for prostate cancer, it is equally applicable to other LDR brachytherapy clinical applications. Future extensions will allow its application in high dose rate brachytherapy applications.
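    The quoted 9.35 s run time for a 2% dose uncertainty can be extrapolated to other uncertainty targets using the standard 1/sqrt(N) scaling of Monte Carlo statistics (a scaling-law assumption, not a GGEMS benchmark):

```python
def runtime_for_target(t_ref_s, sigma_ref, sigma_target):
    """Statistical uncertainty scales as 1/sqrt(N) with the number of
    histories, so run time scales with the square of the uncertainty
    reduction factor."""
    return t_ref_s * (sigma_ref / sigma_target) ** 2

# Halving the uncertainty from 2% to 1% costs four times the histories:
t_1pct = runtime_for_target(9.35, 0.02, 0.01)  # ~37.4 s on one GPU
```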

  4. GGEMS-Brachy: GPU GEant4-based Monte Carlo simulation for brachytherapy applications.

    PubMed

    Lemaréchal, Yannick; Bert, Julien; Falconnet, Claire; Després, Philippe; Valeri, Antoine; Schick, Ulrike; Pradier, Olivier; Garcia, Marie-Paule; Boussion, Nicolas; Visvikis, Dimitris

    2015-07-01

    In brachytherapy, plans are routinely calculated using the AAPM TG43 formalism which considers the patient as a simple water object. An accurate modeling of the physical processes considering patient heterogeneity using Monte Carlo simulation (MCS) methods is currently too time-consuming and computationally demanding to be routinely used. In this work we implemented and evaluated an accurate and fast MCS on Graphics Processing Units (GPU) for brachytherapy low dose rate (LDR) applications. A previously proposed Geant4 based MCS framework implemented on GPU (GGEMS) was extended to include a hybrid GPU navigator, allowing navigation within voxelized patient specific images and analytically modeled (125)I seeds used in LDR brachytherapy. In addition, dose scoring based on track length estimator including uncertainty calculations was incorporated. The implemented GGEMS-brachy platform was validated using a comparison with Geant4 simulations and reference datasets. Finally, a comparative dosimetry study based on the current clinical standard (TG43) and the proposed platform was performed on twelve prostate cancer patients undergoing LDR brachytherapy. Considering patient 3D CT volumes of 400  × 250  × 65 voxels and an average of 58 implanted seeds, the mean patient dosimetry study run time for a 2% dose uncertainty was 9.35 s (≈500 ms 10(-6) simulated particles) and 2.5 s when using one and four GPUs, respectively. The performance of the proposed GGEMS-brachy platform allows envisaging the use of Monte Carlo simulation based dosimetry studies in brachytherapy compatible with clinical practice. Although the proposed platform was evaluated for prostate cancer, it is equally applicable to other LDR brachytherapy clinical applications. Future extensions will allow its application in high dose rate brachytherapy applications. PMID:26061230

  5. Efficiency transfer using the GEANT4 code of CERN for HPGe gamma spectrometry.

    PubMed

    Chagren, S; Ben Tekaya, M; Reguigui, N; Gharbi, F

    2016-01-01

    In this work we apply the GEANT4 code of CERN to calculate the peak efficiency in High Purity Germanium (HPGe) gamma spectrometry using three different procedures. The first is a direct calculation. The second corresponds to the usual case of efficiency transfer between two different configurations at constant emission energy, assuming a reference point detection configuration. The third, a new procedure, consists of transferring the peak efficiency between two detection configurations emitting the gamma ray at different energies, assuming a "virtual" reference point detection configuration. No pre-optimization of the detector's geometrical characteristics was performed before the transfer, in order to test the ability of the efficiency transfer to reduce the effect of ignorance of their real magnitudes on the quality of the transferred efficiency. The obtained and measured efficiencies were found to be in good agreement for the two investigated methods of efficiency transfer. This agreement shows that the Monte Carlo method, and especially the GEANT4 code, constitutes an efficient tool for obtaining accurate detection efficiency values. The second investigated efficiency transfer procedure is useful for calibrating an HPGe gamma detector at any emission energy for a voluminous source, using a point-source detection efficiency at a different energy as the reference efficiency. The calculations performed in this work were applied to the measurement exercise of the EUROMET428 project, in which the full energy peak efficiencies in the energy range 60-2000 keV were evaluated for a typical coaxial p-type HPGe detector and several types of source configuration: point sources located at various distances from the detector and a cylindrical box containing three matrices. PMID:26623928
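    The transfer procedures described above rest on the idea that ratios of efficiencies between geometries are predicted more robustly than absolute values, so a measured reference efficiency can be scaled by a Monte Carlo ratio. A minimal sketch with hypothetical numbers:

```python
def transfer_efficiency(eff_ref_measured, eff_ref_mc, eff_target_mc):
    """Efficiency transfer: scale a measured reference efficiency by the
    Monte Carlo ratio between target and reference configurations, so
    systematic modelling errors largely cancel in the ratio."""
    return eff_ref_measured * (eff_target_mc / eff_ref_mc)

# Point-source reference measured at 0.050; Monte Carlo gives 0.048 for
# the reference and 0.024 for a voluminous source at another energy:
eff_target = transfer_efficiency(0.050, 0.048, 0.024)  # -> 0.025
```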

  6. Geant4-DNA simulations using complex DNA geometries generated by the DnaFabric tool

    NASA Astrophysics Data System (ADS)

    Meylan, S.; Vimont, U.; Incerti, S.; Clairand, I.; Villagrasa, C.

    2016-07-01

    Several DNA representations are used to study radio-induced complex DNA damage, depending on the approach and the required level of granularity. Among all approaches, the mechanistic one requires the most resolved DNA models, which can go down to atomistic DNA descriptions. The complexity of such DNA models makes them hard to modify and adapt in order to take into account different biological conditions. The DnaFabric project was started to provide a tool to generate, visualise and modify such complex DNA models. In the current version of DnaFabric, the models can be exported to the Geant4 code to be used as targets in the Monte Carlo simulation. In this work, the project was used to generate two DNA fibre models corresponding to two DNA compaction levels, representing heterochromatin and euchromatin. The fibres were imported into a Geant4 application where computations were performed to estimate the influence of DNA compaction on the amount of calculated DNA damage. The relative difference of the DNA damage computed in the two fibres for the same number of projectiles was found to be constant and equal to 1.3 for the considered primary particles (protons from 300 keV to 50 MeV). However, if only the tracks hitting the DNA target are taken into account, the relative difference is more important for low energies and decreases to reach zero around 10 MeV. The computations were performed with models that contain up to 18,000 DNA nucleotide pairs. Nevertheless, DnaFabric will be extended to manipulate multi-scale models that go from the molecular to the cellular levels.

  7. The effects of mapping CT images to Monte Carlo materials on GEANT4 proton simulation accuracy

    SciTech Connect

    Barnes, Samuel; McAuley, Grant; Slater, James; Wroe, Andrew

    2013-04-15

    Purpose: Monte Carlo simulations of radiation therapy require conversion from Hounsfield units (HU) in CT images to an exact tissue composition and density. The number of discrete densities (or density bins) used in this mapping affects the simulation accuracy, execution time, and memory usage in GEANT4 and other Monte Carlo code. The relationship between the number of density bins and CT noise was examined in general for all simulations that use HU conversion to density. Additionally, the effect of this on simulation accuracy was examined for proton radiation. Methods: Relative uncertainty from CT noise was compared with uncertainty from density binning to determine an upper limit on the number of density bins required in the presence of CT noise. Error propagation analysis was also performed on continuously slowing down approximation range calculations to determine the proton range uncertainty caused by density binning. These results were verified with Monte Carlo simulations. Results: In the presence of even modest CT noise (5 HU or 0.5%) 450 density bins were found to only cause a 5% increase in the density uncertainty (i.e., 95% of density uncertainty from CT noise, 5% from binning). Larger numbers of density bins are not required as CT noise will prevent increased density accuracy; this applies across all types of Monte Carlo simulations. Examining uncertainty in proton range, only 127 density bins are required for a proton range error of <0.1 mm in most tissue and <0.5 mm in low density tissue (e.g., lung). Conclusions: By considering CT noise and actual range uncertainty, the number of required density bins can be restricted to a very modest 127 depending on the application. Reducing the number of density bins provides large memory and execution time savings in GEANT4 and other Monte Carlo packages.
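    Two quantities drive the conclusion above: the density assigned to each bin and the quantisation error that binning introduces (bin width divided by sqrt(12) for a uniformly distributed error). A sketch with illustrative density ranges and bin counts:

```python
def bin_density(rho, rho_min, rho_max, n_bins):
    """Map a continuous density to the centre of its bin, clamping to
    the first and last bins at the range edges."""
    width = (rho_max - rho_min) / n_bins
    index = min(n_bins - 1, max(0, int((rho - rho_min) / width)))
    return rho_min + (index + 0.5) * width

def binning_sigma(rho_min, rho_max, n_bins):
    """Standard deviation of the quantisation error: bin width / sqrt(12)."""
    return ((rho_max - rho_min) / n_bins) / 12 ** 0.5
```

    With 127 bins over an illustrative 0-2 g/cm^3 range, the quantisation sigma is roughly 0.0045 g/cm^3, well below typical CT-noise-induced density uncertainty.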

  8. Software design studies emphasizing Project LOGOS

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The results of a research project on the development of computer software are presented. Research funds of $200,000 were expended over a three year period for software design and projects in connection with Project LOGOS (computer-aided design and certification of computing systems). Abstracts of theses prepared during the project are provided.

  9. A Geant4 simulation of the depth dose percentage in brain tumors treatments using protons and carbon ions

    NASA Astrophysics Data System (ADS)

    José A. Diaz, M.; Torres, D. A.

    2016-07-01

    The deposited energy and dose distribution of beams of protons and carbon ions in a head are simulated using the free tool package Geant4 and the data analysis package ROOT-C++. The present work shows a methodology for understanding the microscopic processes occurring in a session of hadron therapy using advanced simulation tools.
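    For protons, simulated depth dose curves are often sanity-checked against the approximate Bragg-Kleeman range rule R = alpha * E^p. A sketch using commonly quoted approximate fit constants for protons in water (an assumption for illustration, not parameters from this simulation):

```python
def proton_range_cm(energy_MeV, alpha=0.0022, p=1.77):
    """Bragg-Kleeman rule for the proton range in water, R = alpha * E**p,
    with approximate literature fit constants (alpha in cm, E in MeV)."""
    return alpha * energy_MeV ** p

# A 150 MeV beam stops near 16 cm depth, placing the Bragg peak there:
r150 = proton_range_cm(150.0)
```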

  10. Introducing Third-Year Undergraduates to GEANT4 Simulations of Light Transport and Collection in Scintillation Materials

    ERIC Educational Resources Information Center

    Riggi, Simone; La Rocca, Paola; Riggi, Francesco

    2011-01-01

    GEANT4 simulations of the processes affecting the transport and collection of optical photons generated inside a scintillation detector were carried out, with the aim of complementing the educational material offered by textbooks to third-year physics undergraduates. Two typical situations were considered: a long scintillator strip with and without a…

  11. Dosimetry characterization of 32P intravascular brachytherapy source wires using Monte Carlo codes PENELOPE and GEANT4.

    PubMed

    Torres, Javier; Buades, Manuel J; Almansa, Julio F; Guerrero, Rafael; Lallena, Antonio M

    2004-02-01

    Monte Carlo calculations using the codes PENELOPE and GEANT4 have been performed to characterize the dosimetric parameters of the new 20 mm long catheter-based 32P beta source manufactured by the Guidant Corporation. The dose distribution along the transverse axis and the two-dimensional dose rate table have been calculated. Also, the dose rate at the reference point, the radial dose function, and the anisotropy function were evaluated according to the adapted TG-60 formalism for cylindrical sources. PENELOPE and GEANT4 codes were first verified against previous results corresponding to the old 27 mm Guidant 32P beta source. The dose rate at the reference point for the unsheathed 27 mm source in water was calculated to be 0.215 +/- 0.001 cGy s(-1) mCi(-1), for PENELOPE, and 0.2312 +/- 0.0008 cGy s(-1) mCi(-1), for GEANT4. For the unsheathed 20 mm source, these values were 0.2908 +/- 0.0009 cGy s(-1) mCi(-1) and 0.311 +/- 0.001 cGy s(-1) mCi(-1), respectively. Also, a comparison with the limited data available on this new source is shown. We found non-negligible differences between the results obtained with PENELOPE and GEANT4. PMID:15000615

  12. A macroscopic and microscopic study of radon exposure using Geant4 and MCNPX to estimate dose rates and DNA damage

    NASA Astrophysics Data System (ADS)

    van den Akker, Mary Evelyn

    Radon is considered the second-leading cause of lung cancer after smoking. Epidemiological studies have been conducted in miner cohorts as well as general populations to estimate the risks associated with high and low dose exposures. There are problems with extrapolating risk estimates to low dose exposures, mainly that the dose-response curve at low doses is not well understood. Calculated dosimetric quantities give average energy depositions in an organ or a whole body, but morphological features of an individual can affect these values. As opposed to human phantom models, Computed Tomography (CT) scans provide unique, patient-specific geometries that are valuable in modeling the radiological effects of the short-lived radon progeny sources. The Monte Carlo particle transport code Geant4 was used with the CT scan data to model radon inhalation in the main bronchial bifurcation. The equivalent dose rates are near the lower bounds of estimates found in the literature, depending on source volume. To complement the macroscopic study, simulations were run in a small tissue volume using the Geant4-DNA toolkit. As an extension of Geant4 meant to simulate direct physical interactions at the cellular level, Geant4-DNA allows the particle track structure of the radon progeny alphas to be analyzed to estimate the damage that can occur in sensitive cellular structures such as the DNA molecule. These estimates of DNA double strand breaks are lower than those found in previous Geant4-DNA studies. Further refinements of the microscopic model are at the cutting edge of nanodosimetry research.
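    The equivalent dose rates referred to above weight the absorbed dose by the ICRP radiation weighting factor, which is 20 for alpha particles such as those emitted by the radon progeny. A minimal sketch:

```python
W_R_ALPHA = 20.0  # ICRP radiation weighting factor for alpha particles

def equivalent_dose_Sv(absorbed_dose_Gy, w_r=W_R_ALPHA):
    """Equivalent dose H = w_R * D (in Sv for D in Gy)."""
    return w_r * absorbed_dose_Gy

# A microgray of alpha absorbed dose corresponds to 20 microsieverts:
h = equivalent_dose_Sv(1e-6)
```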

  13. An empirical study of software design practices

    NASA Technical Reports Server (NTRS)

    Card, David N.; Church, Victor E.; Agresti, William W.

    1986-01-01

    Software engineers have developed a large body of software design theory and folklore, much of which was never validated. The results of an empirical study of software design practices in one specific environment are presented. The practices examined affect module size, module strength, data coupling, descendant span, unreferenced variables, and software reuse. Measures characteristic of these practices were extracted from 887 FORTRAN modules developed for five flight dynamics software projects monitored by the Software Engineering Laboratory (SEL). The relationship of these measures to cost and fault rate was analyzed using a contingency table procedure. The results show that some recommended design practices, despite their intuitive appeal, are ineffective in this environment, whereas others are very effective.
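    The contingency table procedure mentioned above tests whether a design practice and an outcome (say, low versus high fault rate) are independent. A minimal chi-square sketch with hypothetical counts:

```python
def chi_square(table):
    """Pearson chi-square statistic for an observed contingency table:
    expected cell counts come from the row and column totals."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    total = sum(row_totals)
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / total
            chi2 += (observed - expected) ** 2 / expected
    return chi2

# Hypothetical: rows = practice followed / not, cols = low / high faults.
stat = chi_square([[20, 10], [10, 20]])  # ~6.67, suggesting association
```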

  14. Evaluation on Geant4 Hadronic Models for Pion Minus, Pion Plus and Neutron Particles as Major Antiproton Annihilation Products

    PubMed Central

    Tavakoli, Mohammad Bagher; Mohammadi, Mohammad Mehdi; Reiazi, Reza; Jabbari, Keyvan

    2015-01-01

    Geant4 is an open source simulation toolkit based on C++ whose advantages have progressively led to applications in research domains, especially the modeling of the biological effects of ionizing radiation at the sub-cellular scale. However, it has been shown that Geant4 does not give reasonable results in the prediction of antiproton dose, especially in the Bragg peak. One of the reasons could be the lack of a reliable physics model to predict the final states of annihilation products such as pions. Considering that most of the antiproton deposited dose results from high-LET nuclear fragments following pion interactions with surrounding nucleons, we reproduced depth dose curves over the most probable energy range of pions and of neutrons using Geant4. We consider this work one of the steps towards understanding the origin of the error and, finally, verifying Geant4 for antiproton tracking. Geant4 toolkit version 9.4.6.p01 and Fluka version 2006.3 were used to reproduce the depth dose curves of 220 MeV pions (both negative and positive) and 70 MeV neutrons. The geometry applied in the simulations consisted of a 20 × 20 × 20 cm3 water tank, similar to that used at CERN for antiproton relative dose measurements. Different physics lists, including Quark-Gluon String Precompound (QGSP)_Binary Cascade (BIC)_HP, the recommended setting for hadron therapy, were used. In the case of pions, Geant4 resulted in at least 5% dose discrepancy between different physics lists at depths close to the entrance point. Discrepancies of up to 15% were found in some cases, such as QBBC compared to QGSP_BIC_HP. A significant difference was observed in the dose profiles of different Geant4 physics lists at small depths for a beam of pions. In the case of neutrons, a large dose discrepancy was observed when the LHEP or LHEP_EMV lists were applied. The magnitude of this dose discrepancy could be even 50% greater than the dose calculated by LHEP (or LHEP_EMV) at larger depths. We found that the effect of different Geant4 physics lists in

  15. Evaluation on Geant4 Hadronic Models for Pion Minus, Pion Plus and Neutron Particles as Major Antiproton Annihilation Products.

    PubMed

    Tavakoli, Mohammad Bagher; Mohammadi, Mohammad Mehdi; Reiazi, Reza; Jabbari, Keyvan

    2015-01-01

    Geant4 is an open source simulation toolkit based on C++ whose advantages have progressively led to applications in research domains, especially the modeling of the biological effects of ionizing radiation at the sub-cellular scale. However, it has been shown that Geant4 does not give reasonable results in the prediction of antiproton dose, especially in the Bragg peak. One of the reasons could be the lack of a reliable physics model to predict the final states of annihilation products such as pions. Considering that most of the antiproton deposited dose results from high-LET nuclear fragments following pion interactions with surrounding nucleons, we reproduced depth dose curves over the most probable energy range of pions and of neutrons using Geant4. We consider this work one of the steps towards understanding the origin of the error and, finally, verifying Geant4 for antiproton tracking. Geant4 toolkit version 9.4.6.p01 and Fluka version 2006.3 were used to reproduce the depth dose curves of 220 MeV pions (both negative and positive) and 70 MeV neutrons. The geometry applied in the simulations consisted of a 20 × 20 × 20 cm(3) water tank, similar to that used at CERN for antiproton relative dose measurements. Different physics lists, including Quark-Gluon String Precompound (QGSP)_Binary Cascade (BIC)_HP, the recommended setting for hadron therapy, were used. In the case of pions, Geant4 resulted in at least 5% dose discrepancy between different physics lists at depths close to the entrance point. Discrepancies of up to 15% were found in some cases, such as QBBC compared to QGSP_BIC_HP. A significant difference was observed in the dose profiles of different Geant4 physics lists at small depths for a beam of pions. In the case of neutrons, a large dose discrepancy was observed when the LHEP or LHEP_EMV lists were applied. The magnitude of this dose discrepancy could be even 50% greater than the dose calculated by LHEP (or LHEP_EMV) at larger depths. We found that the effect of different Geant4 physics lists in

  16. Conflict and Reconciliation in Software Design

    NASA Astrophysics Data System (ADS)

    Mandel, Eric

    2014-01-01

    Data analysis software is as open-ended and complex as the research it supports. The written specification is never the full story in an arena where users can’t always know what they want to do next. Requirements often are too vague or too concrete, missing or implicit. They sometimes conflict with one another. How can we design high quality software amidst these variables? In this talk, I will discuss provisional conclusions I have reached concerning software design, based on thirty years of experience developing astronomical software.

  17. Influence of thyroid volume reduction on absorbed dose in 131I therapy studied by using Geant4 Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Rahman, Ziaur; Mirza, Sikander M.; Arshed, Waheed; Mirza, Nasir M.; Ahmed, Waheed

    2014-05-01

    A simulation study has been performed to quantify the effect of volume reduction on the thyroid absorbed dose per decay and to investigate the variation of energy deposition per decay due to β- and γ-activity of 131I with the volume/mass of the thyroid, for water, ICRP- and ICRU-soft tissue taken as the thyroid material. A Monte Carlo model of the thyroid was constructed in the Geant4 radiation transport simulation toolkit to compute the β- and γ-absorbed dose in the simulated thyroid phantom for various values of its volume. The effect of the size and shape of the thyroid on energy deposition per decay has also been studied by using spherical, ellipsoidal and cylindrical models for the thyroid and varying its volume in the 1-25 cm3 range. The relative differences of the Geant4 results for the different models with each other and with MCNP results lie well below 1.870%. The maximum relative difference between the Geant4 results for water and those for ICRP and ICRU soft tissues is not more than 0.225%. S-values for the ellipsoidal, spherical and cylindrical thyroid models were estimated, and the relative difference with published results lies within 3.095%. The absorbed fraction values for beta particles show good agreement with published values, within a 2.105% deviation. The Geant4-based simulation results of absorbed fractions for gammas again show good agreement with the corresponding MCNP and EGS4 results (±6.667%) but are 29.032% higher than the MIRD-calculated values. Consistent with previous studies, the reduction of the thyroid volume is found to have a substantial effect on the absorbed dose. Geant4 simulations confirm the dose dependence on the volume/mass of the thyroid, in agreement with MCNP and EGS4 computed values but substantially different from MIRD8 data. Therefore, inclusion of size/mass dependence is indicated for 131I radiotherapy of the thyroid.
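    The mass dependence reported above is easiest to see for the beta component, where nearly all energy is absorbed locally and the S-value scales inversely with thyroid mass. A sketch (the 131I mean beta energy used is an approximate literature value, and gamma contributions and geometry effects are ignored):

```python
def s_value_nonpenetrating(mean_energy_MeV_per_decay, mass_g):
    """S-value for nonpenetrating emissions with absorbed fraction ~1:
    S = Delta / m, in MeV per gram per decay. Illustrative only."""
    return mean_energy_MeV_per_decay / mass_g

MEAN_BETA_E_I131 = 0.192  # MeV per decay, approximate

# Halving the thyroid mass doubles the beta S-value:
s_20g = s_value_nonpenetrating(MEAN_BETA_E_I131, 20.0)
s_10g = s_value_nonpenetrating(MEAN_BETA_E_I131, 10.0)
```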

  18. Domain specific software design for decision aiding

    NASA Technical Reports Server (NTRS)

    Keller, Kirby; Stanley, Kevin

    1992-01-01

    McDonnell Aircraft Company (MCAIR) is involved in many large multi-discipline design and development efforts of tactical aircraft. These involve a number of design disciplines that must be coordinated to produce an integrated design and a successful product. Our interpretation of a domain specific software design (DSSD) is that of a representation or framework that is specialized to support a limited problem domain. A DSSD is an abstract software design that is shaped by the problem characteristics. This parallels the theme of object-oriented analysis and design of letting the problem model directly drive the design. The DSSD concept extends the notion of software reusability to include representations or frameworks. It supports the entire software life cycle and specifically leads to improved prototyping capability, supports system integration, and promotes reuse of software designs and supporting frameworks. The example presented in this paper is the task network architecture or design which was developed for the MCAIR Pilot's Associate program. The task network concept supported both module development and system integration within the domain of operator decision aiding. It is presented as an instance where a software design exhibited many of the attributes associated with DSSD concept.

  19. Software Updates: Web Design--Software that Makes It Easy!

    ERIC Educational Resources Information Center

    Pattridge, Gregory C.

    2002-01-01

    This article discusses Web design software that provides an easy-to-use interface. The "Netscape Communicator" is highlighted for beginning Web page construction and step-by-step instructions are provided for starting out, page colors and properties, indents, bulleted lists, tables, adding links, navigating long documents, creating e-mail links,…

  20. Automating the design of scientific computing software

    NASA Technical Reports Server (NTRS)

    Kant, Elaine

    1992-01-01

    SINAPSE is a domain-specific software design system that generates code from specifications of equations and algorithm methods. This paper describes the system's design techniques (planning in a space of knowledge-based refinement and optimization rules), user interaction style (user has option to control decision making), and representation of knowledge (rules and objects). It also summarizes how the system knowledge has evolved over time and suggests some issues in building software design systems to facilitate reuse.

  1. Designing Control System Application Software for Change

    NASA Technical Reports Server (NTRS)

    Boulanger, Richard

    2001-01-01

The Unified Modeling Language (UML) was used to design the Environmental Systems Test Stand (ESTS) control system software. The UML was chosen for its ability to facilitate a clear dialog between software designer and customer, from which requirements are discovered and documented in a manner that transposes directly to program objects. Applying the UML to control system software design has resulted in a baseline set of documents from which change, and the effort of that change, can be accurately measured. As the Environmental Systems Test Stand evolves, accurate estimates of the time and effort required to change the control system software will be made. Accurate quantification of the cost of software change can be made before implementation, improving schedule and budget accuracy.

  2. Does software design complexity affect maintenance effort?

    NASA Technical Reports Server (NTRS)

    Epping, Andreas; Lott, Christopher M.

    1994-01-01

    The design complexity of a software system may be characterized within a refinement level (e.g., data flow among modules), or between refinement levels (e.g., traceability between the specification and the design). We analyzed an existing set of data from NASA's Software Engineering Laboratory to test whether changing software modules with high design complexity requires more personnel effort than changing modules with low design complexity. By analyzing variables singly, we identified strong correlations between software design complexity and change effort for error corrections performed during the maintenance phase. By analyzing variables in combination, we found patterns which identify modules in which error corrections were costly to perform during the acceptance test phase.

  3. Reflecting Indigenous Culture in Educational Software Design.

    ERIC Educational Resources Information Center

    Fleer, Marilyn

    1989-01-01

    Discusses research on Australian Aboriginal cognition which relates to the development of appropriate educational software. Describes "Tinja," a software program using familiar content and experiences, Aboriginal characters and cultural values, extensive graphics and animation, peer and group work, and open-ended design to help young children read…

  4. Empirical studies of design software: Implications for software engineering environments

    NASA Technical Reports Server (NTRS)

    Krasner, Herb

    1988-01-01

The empirical studies team of MCC's Design Process Group conducted three studies in 1986-87 in order to gather data on professionals designing software systems in a range of situations. The first study (the Lift Experiment) used thinking aloud protocols in a controlled laboratory setting to study the cognitive processes of individual designers. The second study (the Object Server Project) involved the observation, videotaping, and data collection of a design team of a medium-sized development project over several months in order to study team dynamics. The third study (the Field Study) involved interviews with the personnel from 19 large development projects at MCC shareholder companies in order to study how the process of design is affected by organizational and project behavior. The focus of this report will be on key observations of design process (at several levels) and their implications for the design of environments.

  5. The Simulation of AN Imaging Gamma-Ray Compton Backscattering Device Using GEANT4

    NASA Astrophysics Data System (ADS)

    Flechas, D.; Sarmiento, L. G.; Cristancho, F.; Fajardo, E.

    2014-02-01

A gamma-backscattering imaging device dubbed the Compton Camera, developed at GSI (Darmstadt, Germany) and modified and studied at the Nuclear Physics Group of the National University of Colombia in Bogotá, uses the back-to-back emission of two gamma rays in positron annihilation to construct a two-dimensional image that represents the distribution of matter in the camera's field of view. This imaging capability can be used in a host of different situations, for example, to identify and study deposition and structural defects, and to help locate concealed objects, to name just two cases. In order to increase the understanding of the response of the Compton Camera and, in particular, its image formation process, and to assist in the data analysis, a simulation of the camera was developed using the GEANT4 simulation toolkit. In this work, the images resulting from different experimental conditions are shown. The simulated images and their comparison with the experimental ones already suggest methods to improve the present experimental device.

  6. Modeling of x-ray fluorescence using MCNPX and Geant4

    SciTech Connect

    Rajasingam, Akshayan; Hoover, Andrew S; Fensin, Michael L; Tobin, Stephen J

    2009-01-01

X-Ray Fluorescence (XRF) is one of thirteen non-destructive assay techniques being researched for the purpose of quantifying the Pu mass in used fuel assemblies. The modeling portion of this research will be conducted with the MCNPX transport code. The research presented here was undertaken to test the capability of MCNPX so that it can be used to benchmark measurements made at ORNL and to give confidence in the application of MCNPX as a predictive tool of the expected capability of XRF in the context of used fuel assemblies. The main focus of this paper is a code-to-code comparison between the MCNPX and Geant4 codes. Since XRF in used fuel is driven by photon emission and beta decay of fission fragments, both source terms were independently researched. Simple cases and used fuel cases were modeled for both source terms. In order to prepare for benchmarking to experiments, it was necessary to determine the relative significance of the various fission fragments for producing X-rays.

  7. The radiation environment near the lunar surface: CRaTER observations and Geant4 simulations

    NASA Astrophysics Data System (ADS)

    Looper, M. D.; Mazur, J. E.; Blake, J. B.; Spence, H. E.; Schwadron, N. A.; Golightly, M. J.; Case, A. W.; Kasper, J. C.; Townsend, L. W.

    2013-04-01

    At the start of the Lunar Reconnaissance Orbiter mission in 2009, its Cosmic Ray Telescope for the Effects of Radiation instrument measured the radiation environment near the Moon during the recent deep solar minimum, when galactic cosmic rays (GCRs) were at the highest level observed during the space age. We present observations that show the combined effects of GCR primaries, secondary particles ("albedo") created by the interaction of GCRs with the lunar surface, and the interactions of these particles in the shielding material overlying the silicon solid-state detectors of the Cosmic Ray Telescope for the Effects of Radiation. We use Geant4 to model the energy and angular distribution of the albedo particles, and to model the response of the sensor to the various particle species reaching the 50 kilometer altitude of the Lunar Reconnaissance Orbiter. Using simulations to gain insight into the observations, we are able to present preliminary energy-deposit spectra for evaluation of the radiation environment's effects on other sensitive materials, whether biological or electronic, that would be exposed to a similar near-lunar environment.

  8. Validation of a dose deposited by low-energy photons using GATE/GEANT4.

    PubMed

    Thiam, C O; Breton, V; Donnarieix, D; Habib, B; Maigne, L

    2008-06-01

The GATE Monte Carlo simulation platform based on the Geant4 toolkit has become a widely used tool for simulating PET and SPECT imaging devices. In this paper, we explore its relevance for dosimetry of low-energy 125I photon brachytherapy sources used to treat prostate cancers. To that end, three 125-iodine sources widely used in prostate cancer brachytherapy treatment have been modelled. GATE simulations reproducing dosimetric reference observables such as the radial dose function g(r), the anisotropy function F(r, theta) and the dose-rate constant (Lambda) were performed in liquid water. The calculations were split across the EGEE grid infrastructure to reduce the computing time of the simulations. The results were compared to other relevant Monte Carlo results and to published measurements adopted as recommended values by the AAPM Task Group 43. GATE results agree with consensus values published by AAPM Task Group 43 with an accuracy better than 2%, demonstrating that GATE is a relevant tool for the study of the dose induced by low-energy photons. PMID:18490808
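The TG-43 observables named in this record combine into a single dose-rate equation. A minimal Python sketch of the 1D (point-source) form of the formalism follows; the exponential fit for g(r), the anisotropy value, and the seed parameters are illustrative placeholders, not the AAPM consensus data for any real 125I source.

```python
import math

def tg43_dose_rate(sk, Lam, r, g, phi_an, r0=1.0):
    """Dose rate (cGy/h) at radial distance r (cm) from a low-energy seed,
    using the 1D AAPM TG-43 formalism with a point-source geometry factor:
    D(r) = Sk * Lambda * (r0/r)**2 * g(r) * phi_an(r)."""
    geometry = (r0 / r) ** 2          # point-source approximation of G(r)/G(r0)
    return sk * Lam * geometry * g(r) * phi_an(r)

# Illustrative (non-consensus) parameterization of a generic 125I seed:
g = lambda r: math.exp(-0.15 * (r - 1.0))   # radial dose function, g(1 cm) = 1
phi = lambda r: 0.94                         # constant 1D anisotropy factor

# Dose rate at 2 cm for unit air-kerma strength and Lambda = 0.965 cGy/(h*U)
print(tg43_dose_rate(1.0, 0.965, 2.0, g, phi))
```

The 2D formalism replaces phi_an(r) with F(r, theta) and uses the line-source geometry factor; the structure of the evaluation is otherwise the same.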

  9. GEANT4 Application for the Simulation of the Head of a Siemens Primus Linac

    NASA Astrophysics Data System (ADS)

    Cortés-Giraldo, M. A.; Quesada, J. M.; Gallardo, M. I.

    2010-04-01

The Monte Carlo simulation of the head of a Siemens Primus Linac used at Virgen Macarena Hospital (Sevilla, Spain) has been performed using the code GEANT4 [1-2], version 9.2. In this work, the main features of the application built by our group are presented. They are mainly focused on optimizing the performance of the simulation. The geometry, including the water phantom, has been entirely wrapped by a shielding volume which discards all particles escaping through its walls. This saves a factor of four in the time spent by the simulation. An interface to read and write phase-space files in IAEA format has also been developed to save CPU time in our simulations [3-4]. Finally, some calculations of the dose absorbed in the water phantom have been done and compared with the results given by EGSnrc [5] and with experimental data obtained for the calibration of the machine.

  10. Interaction of Fast Nucleons with Actinide Nuclei Studied with GEANT4

    NASA Astrophysics Data System (ADS)

    Malyshkin, Yu.; Pshenichnov, I.; Mishustin, I.; Greiner, W.

    2014-04-01

We model interactions of protons and neutrons with energies from 1 to 1000 MeV with 241Am and 243Am nuclei. The calculations are performed with the Monte Carlo model for Accelerator Driven Systems (MCADS), which we developed based on version 9.4 of the GEANT4 toolkit. This toolkit is widely used to simulate the propagation of particles in various materials containing nuclei up to uranium. After several extensions we apply it also to proton- and neutron-induced reactions on Am. The fission and radiative neutron capture cross sections, neutron multiplicities and distributions of fission fragments were calculated for 241Am and 243Am and compared with experimental data. As demonstrated, the fission of americium by energetic protons with energies above 20 MeV can be well described by the Intra-Nuclear Cascade Liège (INCL) model combined with the fission-evaporation model ABLA. The calculated average numbers of fission neutrons and mass distributions of fission products agree well with the corresponding data. However, proton-induced fission below 20 MeV is described less accurately. This is attributed to the limitations of the Intra-Nuclear Cascade model at low projectile energies.

  11. Ion therapy for uveal melanoma in new human eye phantom based on GEANT4 toolkit.

    PubMed

    Mahdipour, Seyed Ali; Mowlavi, Ali Asghar

    2016-01-01

Radiotherapy with ion beams such as protons and carbon has been used for treatment of uveal melanoma of the eye for many years. In this research, we have developed a new phantom of the human eye for Monte Carlo simulation of tumor treatment with the GEANT4 toolkit. Total depth-dose profiles for proton, alpha, and carbon incident beams with the same ranges have been calculated in the phantom. Moreover, the deposited energy of the secondary particles for each of the primary beams is calculated. The dose curves are compared for 47.8 MeV proton, 190.1 MeV alpha, and 1060 MeV carbon ions that have the same range in the target region, reaching the center of the tumor. The passively scattered spread-out Bragg peak (SOBP) for each incident beam, as well as the flux curves of the secondary particles including neutrons, gammas, and positrons, has been calculated and compared for the primary beams. The high sharpness of the carbon beam's Bragg peak with low lateral broadening is the benefit of this beam in hadrontherapy, but it has the disadvantages of dose leakage in the tail after its Bragg peak and high intensity of neutron production. However, the proton beam, which conforms well to the tumor shape owing to the beam broadening caused by scattering, can be a good choice for large tumors. PMID:26831752
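The claim that these three beam energies share the same range can be checked with a back-of-the-envelope model. The sketch below uses the Bragg-Kleeman rule for protons in water (alpha ≈ 0.0022 cm/MeV^p, p ≈ 1.77 are common textbook approximations) and the rough A/Z**2 scaling for heavier ions at the same energy per nucleon; it is an order-of-magnitude estimate, not the Geant4 calculation from the record.

```python
def proton_range_cm(E_MeV, alpha=0.0022, p=1.77):
    """Bragg-Kleeman approximation R = alpha * E**p for the proton
    range in water (cm), with E in MeV."""
    return alpha * E_MeV ** p

def ion_range_cm(E_total_MeV, A, Z):
    """Rough scaling to a heavier ion: at the same energy per nucleon
    (same velocity), range scales approximately as A / Z**2 because
    stopping power scales as Z**2."""
    return (A / Z**2) * proton_range_cm(E_total_MeV / A)

for label, E, A, Z in [("proton", 47.8, 1, 1),
                       ("alpha", 190.1, 4, 2),
                       ("carbon", 1060.0, 12, 6)]:
    # all three come out near 2 cm, consistent with an eye-sized target
    print(f"{label}: {ion_range_cm(E, A, Z):.2f} cm")
```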

  12. Consistency evaluation between EGSnrc and Geant4 charged particle transport in an equilibrium magnetic field

    NASA Astrophysics Data System (ADS)

    Yang, Y. M.; Bednarz, B.

    2013-02-01

Following the proposal by several groups to integrate magnetic resonance imaging (MRI) with radiation therapy, much attention has been afforded to examining the impact of strong (on the order of a Tesla) transverse magnetic fields on photon dose distributions. The effect of the magnetic field on dose distributions must be considered in order to take full advantage of the benefits of real-time intra-fraction imaging. In this investigation, we compared the handling of particle transport in magnetic fields between two Monte Carlo codes, EGSnrc and Geant4, to analyze various aspects of their electromagnetic transport algorithms; both codes are well-benchmarked for medical physics applications in the absence of magnetic fields. A water-air-water slab phantom and a water-lung-water slab phantom were used to highlight dose perturbations near high- and low-density interfaces. We have implemented a method of calculating the Lorentz force in EGSnrc based on theoretical models in the literature, and show very good consistency between the two Monte Carlo codes. This investigation further demonstrates the importance of accurate dosimetry for MRI-guided radiation therapy (MRIgRT), and facilitates the integration of a ViewRay MRIgRT system in the University of Wisconsin-Madison's Radiation Oncology Department.
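One standard way to apply the Lorentz force during charged-particle stepping is a velocity rotation that conserves speed, as in the Boris scheme; a pure magnetic field does no work, so |v| must not change. The sketch below is a minimal nonrelativistic illustration with an arbitrary step size and a 1.5 T field (the scale of MRI-linac designs); it is not the algorithm EGSnrc or Geant4 actually implements.

```python
import math

def cross(a, b):
    """Cross product of two 3-vectors given as tuples."""
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def boris_push(v, B, q_over_m, dt):
    """One Boris rotation step for velocity v (m/s) in magnetic field B (T)
    with no electric field; the rotation preserves |v| exactly."""
    t = tuple(0.5 * q_over_m * dt * Bi for Bi in B)
    t2 = sum(ti * ti for ti in t)
    s = tuple(2.0 * ti / (1.0 + t2) for ti in t)
    vp = tuple(vi + ci for vi, ci in zip(v, cross(v, t)))
    return tuple(vi + ci for vi, ci in zip(v, cross(vp, s)))

q_over_m = -1.758820e11            # electron charge-to-mass ratio, C/kg
B = (0.0, 0.0, 1.5)                # uniform 1.5 T field along z
v = (1.0e7, 0.0, 0.0)              # nonrelativistic for simplicity
dt = 1e-13                         # arbitrary small time step, s
for _ in range(1000):
    v = boris_push(v, B, q_over_m, dt)
speed = math.sqrt(sum(vi * vi for vi in v))
print(speed)                        # unchanged: pure B field does no work
```

In a condensed-history code the rotation must additionally be interleaved with energy loss and multiple scattering, which is where the two codes' algorithms can differ.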

  13. Distributions of deposited energy and ionization clusters around ion tracks studied with Geant4 toolkit

    NASA Astrophysics Data System (ADS)

    Burigo, Lucas; Pshenichnov, Igor; Mishustin, Igor; Hilgers, Gerhard; Bleicher, Marcus

    2016-05-01

    The Geant4-based Monte Carlo model for Heavy-Ion Therapy (MCHIT) was extended to study the patterns of energy deposition at sub-micrometer distance from individual ion tracks. Dose distributions for low-energy 1H, 4He, 12C and 16O ions measured in several experiments are well described by the model in a broad range of radial distances, from 0.5 to 3000 nm. Despite the fact that such distributions are characterized by long tails, a dominant fraction of deposited energy (∼80%) is confined within a radius of about 10 nm. The probability distributions of clustered ionization events in nanoscale volumes of water traversed by 1H, 2H, 4He, 6Li, 7Li, and 12C ions are also calculated. A good agreement of calculated ionization cluster-size distributions with the corresponding experimental data suggests that the extended MCHIT can be used to characterize stochastic processes of energy deposition to sensitive cellular structures.
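As a baseline for the cluster-size distributions discussed above: if ionizations in a nanometre-scale volume occurred independently, the cluster size would be Poisson distributed. The toy sampler below (assumed mean of 2 ionizations per traversal, chosen arbitrarily) illustrates that baseline; real track-structure spectra such as those computed with MCHIT deviate from it because ionizations along a track are correlated.

```python
import math
import random

def cluster_size_distribution(mean_ionizations, n_events=100000, seed=1):
    """Empirical distribution of Poisson-distributed cluster sizes,
    sampled with Knuth's multiplication algorithm."""
    random.seed(seed)
    counts = {}
    for _ in range(n_events):
        k, p = 0, 1.0
        L = math.exp(-mean_ionizations)
        while True:
            p *= random.random()
            if p <= L:
                break
            k += 1
        counts[k] = counts.get(k, 0) + 1
    return {k: c / n_events for k, c in sorted(counts.items())}

dist = cluster_size_distribution(2.0)
print(dist[0])   # empirical P(0 ionizations), close to exp(-2) ~ 0.135
```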

  14. VIDA: A Voxel-Based Dosimetry Method for Targeted Radionuclide Therapy Using Geant4

    PubMed Central

    Dewaraja, Yuni K.; Abramson, Richard G.; Stabin, Michael G.

    2015-01-01

We have developed the Voxel-Based Internal Dosimetry Application (VIDA) to provide patient-specific dosimetry in targeted radionuclide therapy performing Monte Carlo simulations of radiation transport with the Geant4 toolkit. The code generates voxel-level dose rate maps using anatomical and physiological data taken from individual patients. Voxel level dose rate curves are then fit and integrated to yield a spatial map of radiation absorbed dose. In this article, we present validation studies using established dosimetry results, including self-dose factors (DFs) from the OLINDA/EXM program for uniform activity in unit density spheres and organ self- and cross-organ DFs in the Radiation Dose Assessment Resource (RADAR) reference adult phantom. The comparison with reference data demonstrated agreement within 5% for self-DFs to spheres and reference phantom source organs for four common radionuclides used in targeted therapy (131I, 90Y, 111In, 177Lu). Agreement within 9% was achieved for cross-organ DFs. We also present dose estimates to normal tissues and tumors from studies of two non-Hodgkin Lymphoma patients treated by 131I radioimmunotherapy, with comparison to results generated independently with another dosimetry code. A relative difference of 12% or less was found between methods for mean absorbed tumor doses accounting for tumor regression. PMID:25594357
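The fit-and-integrate step described above can be sketched for a single voxel. The example below assumes, purely for illustration, a mono-exponential washout fitted by log-linear least squares and then integrated analytically to absorbed dose; the sample times and dose rates are hypothetical, and VIDA's actual fitting functions may differ.

```python
import math

def fit_exp(times_h, rates):
    """Least-squares fit of ln(rate) = ln(A) - lam * t.
    Returns (A, lam) for the model rate(t) = A * exp(-lam * t)."""
    n = len(times_h)
    xs, ys = times_h, [math.log(r) for r in rates]
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    return math.exp(ybar - slope * xbar), -slope

# hypothetical dose-rate samples for one voxel (cGy/h at t hours post-injection)
t = [4.0, 24.0, 72.0, 144.0]
r = [9.0, 7.5, 4.5, 2.1]
A, lam = fit_exp(t, r)
dose = A / lam    # integral of A*exp(-lam*t) from 0 to infinity, cGy
print(round(dose, 1))
```

A voxel-level code repeats this fit independently for every voxel of the dose-rate map to build the absorbed-dose map.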

  15. Geant4 simulation study of Indian National Gamma Array at TIFR

    NASA Astrophysics Data System (ADS)

    Saha, S.; Palit, R.; Sethi, J.; Biswas, S.; Singh, P.

    2016-03-01

A Geant4 simulation code for the Indian National Gamma Array (INGA), consisting of 24 Compton-suppressed clover high-purity germanium (HPGe) detectors, has been developed. The calculated properties in the energy range of interest for nuclear γ-ray spectroscopy are spectral distributions for various standard radioactive sources, intrinsic peak efficiencies and peak-to-total (P/T) ratios in various configurations such as singles, add-back and Compton-suppressed mode. The principle of operation of the detectors in add-back and Compton suppression mode has been reproduced in the simulation. The reliability of the calculation is checked by comparison with experimental data for various γ-ray energies up to 5 MeV. The comparison between simulation results and experimental data demonstrates the need to incorporate the exact geometry of the clover detectors, the anti-Compton shields and other surrounding materials in the array to explain the detector response to the γ rays. Several experimental effects are also investigated, including the geometrical correction to angular distribution, the crosstalk probability and the impact of heavy-metal collimators between the target and the array on the P/T ratio.
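The add-back mode mentioned in this record can be illustrated with a toy event: the function below sums the per-crystal energy deposits of one clover so that a γ ray Compton-scattered between crystals is still recorded at its full energy. The 20 keV electronics threshold is a hypothetical value for illustration.

```python
def addback(crystal_hits, threshold_keV=20.0):
    """Add-back for one clover: sum the energies (keV) deposited in its
    four crystals in the same event; hits below threshold are ignored.
    Returns None when no crystal fired above threshold."""
    total = sum(e for e in crystal_hits if e >= threshold_keV)
    return total if total > 0 else None

# a 1332 keV photon splitting its energy between two adjacent crystals
event = [900.0, 432.0, 0.0, 0.0]
print(addback(event))   # full-energy peak recovered: 1332.0
print(max(event))       # singles mode would record only the 900.0 keV deposit
```

Compton suppression works the other way around: an event is vetoed entirely when the surrounding anti-Compton shield fires in coincidence.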

  16. Accuracy of the photon and electron physics in GEANT4 for radiotherapy applications

    SciTech Connect

    Poon, Emily; Verhaegen, Frank

    2005-06-15

    This work involves a validation of the photon and electron transport of the GEANT4 particle simulation toolkit for radiotherapy physics applications. We examine the cross sections and sampling algorithms of the three electromagnetic physics models in version 4.6.1 of the toolkit: Standard, Low-energy, and Penelope. The depth dose distributions in water for incident monoenergetic and clinical beams are compared to the EGSNRC results. In photon beam simulations, all three models agree with EGSNRC to within 2%, except for the buildup region. Larger deviations are found for incident electron beams, and the differences are affected by user-imposed electron step limitations. Particle distributions through thin layers of clinical target materials, and perturbation effects near high-Z and low-Z interfaces are also investigated. The electron step size artifacts observed in our studies indicate potential problems with the condensed history algorithm. A careful selection of physics processes and transport parameters is needed for optimum efficiency and accuracy.

  17. VIDA: a voxel-based dosimetry method for targeted radionuclide therapy using Geant4.

    PubMed

    Kost, Susan D; Dewaraja, Yuni K; Abramson, Richard G; Stabin, Michael G

    2015-02-01

    We have developed the Voxel-Based Internal Dosimetry Application (VIDA) to provide patient-specific dosimetry in targeted radionuclide therapy performing Monte Carlo simulations of radiation transport with the Geant4 toolkit. The code generates voxel-level dose rate maps using anatomical and physiological data taken from individual patients. Voxel level dose rate curves are then fit and integrated to yield a spatial map of radiation absorbed dose. In this article, we present validation studies using established dosimetry results, including self-dose factors (DFs) from the OLINDA/EXM program for uniform activity in unit density spheres and organ self- and cross-organ DFs in the Radiation Dose Assessment Resource (RADAR) reference adult phantom. The comparison with reference data demonstrated agreement within 5% for self-DFs to spheres and reference phantom source organs for four common radionuclides used in targeted therapy ((131)I, (90)Y, (111)In, (177)Lu). Agreement within 9% was achieved for cross-organ DFs. We also present dose estimates to normal tissues and tumors from studies of two non-Hodgkin Lymphoma patients treated by (131)I radioimmunotherapy, with comparison to results generated independently with another dosimetry code. A relative difference of 12% or less was found between methods for mean absorbed tumor doses accounting for tumor regression. PMID:25594357

  18. GEANT4 simulation of the angular dependence of TLD-based monitor response

    NASA Astrophysics Data System (ADS)

    Guimarães, C. C.; Moralles, M.; Okuno, E.

    2007-09-01

In this work, the response of thermoluminescent (TL) monitors to X-ray beams impinging on them at different angles was investigated and compared with results of simulations performed with the GEANT4 radiation transport toolkit. Each monitor contains four TL detectors (TLDs): two CaF2 pellets and two TLD-100 chips (one of each type within a lead filter and the other without). Monitors were irradiated free-in-air with narrow- and wide-spectrum X-ray beams with effective energies of 61 and 130 keV at angles of incidence of 0°, 30°, 45°, and 60°. Curves of TL response relative to air kerma as a function of photon effective energy for each detector, with and without filter, are used to correct the energy dependence of the TL response. Such curves were also obtained from the radiation energy stored in the TLDs as provided by the simulations. The attenuation increases with the angle of incidence, since the thickness of lead filter traversed by the beam also increases. As the monitor calibration is usually performed with beams impinging on the monitor at 0°, changes in the attenuation become a source of error in the energy determination and consequently in the value of dose equivalent obtained with the monitor. The changes in attenuation observed in experiments were corroborated by the Monte Carlo simulations.
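The oblique-path effect described above follows directly from exponential attenuation: a beam arriving at angle θ from normal incidence traverses a slant path t/cos θ through a flat filter. The sketch below uses placeholder values for the attenuation coefficient and filter thickness, not the monitor's actual lead-filter data.

```python
import math

def transmission(mu_cm, t_cm, angle_deg):
    """Narrow-beam transmission exp(-mu * path) through a flat filter of
    thickness t when the beam arrives at angle_deg from normal incidence;
    the traversed path grows as t / cos(angle)."""
    path = t_cm / math.cos(math.radians(angle_deg))
    return math.exp(-mu_cm * path)

mu = 20.0    # hypothetical linear attenuation coefficient, 1/cm
t = 0.05     # hypothetical 0.5 mm filter
for ang in (0, 30, 45, 60):
    # transmission drops monotonically as the incidence angle grows
    print(f"{ang:2d} deg: {transmission(mu, t, ang):.3f}")
```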

  19. Software support environment design knowledge capture

    NASA Technical Reports Server (NTRS)

    Dollman, Tom

    1990-01-01

    The objective of this task is to assess the potential for using the software support environment (SSE) workstations and associated software for design knowledge capture (DKC) tasks. This assessment will include the identification of required capabilities for DKC and hardware/software modifications needed to support DKC. Several approaches to achieving this objective are discussed and interim results are provided: (1) research into the problem of knowledge engineering in a traditional computer-aided software engineering (CASE) environment, like the SSE; (2) research into the problem of applying SSE CASE tools to develop knowledge based systems; and (3) direct utilization of SSE workstations to support a DKC activity.

  20. Intelligent Detector Design

    SciTech Connect

    Graf, N.A.; /SLAC

    2012-06-11

    As the complexity and resolution of imaging detectors increases, the need for detailed simulation of the experimental setup also becomes more important. Designing the detectors requires efficient tools to simulate the detector response and reconstruct the events. We have developed efficient and flexible tools for detailed physics and detector response simulation as well as event reconstruction and analysis. The primary goal has been to develop a software toolkit and computing infrastructure to allow physicists from universities and labs to quickly and easily conduct physics analyses and contribute to detector research and development. The application harnesses the full power of the Geant4 toolkit without requiring the end user to have any experience with either Geant4 or C++, thereby allowing the user to concentrate on the physics of the detector system.

  1. GENII Version 2 Software Design Document

    SciTech Connect

    Napier, Bruce A.; Strenge, Dennis L.; Ramsdell, James V.; Eslinger, Paul W.; Fosmire, Christian J.

    2004-03-08

This document describes the architectural design for the GENII-V2 software package. This document defines details of the overall structure of the software, the major software components, their data file interfaces, and specific mathematical models to be used. The design represents a translation of the requirements into a description of the software structure, software components, interfaces, and necessary data. The design focuses on the major components and data communication links that are key to the implementation of the software within the operating framework. The purpose of the GENII-V2 software package is to provide the capability to perform dose and risk assessments of environmental releases of radionuclides. The software also has the capability of calculating environmental accumulation and radiation doses from surface water, groundwater, and soil (buried waste) media when an input concentration of radionuclide in these media is provided. This report represents a detailed description of the capabilities of the software product with exact specifications of mathematical models that form the basis for the software implementation and testing efforts. This report also presents a detailed description of the overall structure of the software package, details of main components (implemented in the current phase of work), details of data communication files, and content of basic output reports. The GENII system includes the capabilities for calculating radiation doses following chronic and acute releases. Radionuclide transport via air, water, or biological activity may be considered. Air transport options include both puff and plume models, each allowing use of an effective stack height or calculation of plume rise from buoyant or momentum effects (or both). Building wake effects can be included in acute atmospheric release scenarios. The code provides risk estimates for health effects to individuals or populations; these can be obtained using the code by applying

  2. Software design and documentation language, revision 1

    NASA Technical Reports Server (NTRS)

    Kleine, H.

    1979-01-01

    The Software Design and Documentation Language (SDDL) developed to provide an effective communications medium to support the design and documentation of complex software applications is described. Features of the system include: (1) a processor which can convert design specifications into an intelligible, informative machine-reproducible document; (2) a design and documentation language with forms and syntax that are simple, unrestrictive, and communicative; and (3) methodology for effective use of the language and processor. The SDDL processor is written in the SIMSCRIPT II programming language and is implemented on the UNIVAC 1108, the IBM 360/370, and Control Data machines.

  3. Space Software for Automotive Design

    NASA Technical Reports Server (NTRS)

    1988-01-01

John Thousand of Wolverine Western Corp. put his aerospace group to work on an unfamiliar job: designing a brake drum using computer design techniques. Computer design involves creating a mathematical model of a product and analyzing its effectiveness in simulated operation. The technique enables study of the performance and structural behavior of a number of different designs before settling on a final configuration. Wolverine employees attacked a traditional brake drum problem, the sudden buildup of heat during fast and repeated braking. The part of the brake drum that is not confined tends to change its shape under a combination of heat, physical pressure and rotational forces, a condition known as bellmouthing. Since bellmouthing is a major factor in braking effectiveness, a solution to the problem would be a major advance in automotive engineering. A former NASA employee, now at Wolverine, knew of a series of NASA computer programs ideally suited to confronting bellmouthing. Originally developed as aids to rocket engine nozzle design, they are capable of analyzing problems generated in a rocket engine or automotive brake drum by heat, expansion, pressure and rotational forces. Use of these computer programs led to a new brake drum concept featuring a more durable axle and heat transfer ribs, or fins, on the hub of the drum.

  4. Assessment of patient dose reduction by bismuth shielding in CT using measurements, GEANT4 and MCNPX simulations.

    PubMed

    Mendes, M; Costa, F; Figueira, C; Madeira, P; Teles, P; Vaz, P

    2015-07-01

This work reports on the use of two different Monte Carlo codes (GEANT4 and MCNPX) for assessing the dose reduction achieved by bismuth shields in computed tomography (CT) procedures in order to protect radiosensitive organs such as the eye lens, thyroid and breast. Measurements were performed using head and body PMMA phantoms and an ionisation chamber placed in five different positions of the phantom. Simulations were performed to estimate Computed Tomography Dose Index (CTDI) values using GEANT4 and MCNPX. The relative differences between measurements and simulations were <10 %. The dose reduction arising from the use of bismuth shielding ranges from 2 to 45 %, depending on the position of the bismuth shield. The percentage of dose reduction was most significant for the area covered by the bismuth shielding (36 % for eye lens, 39 % for thyroid and 45 % for breast shields). PMID:25813483
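The five chamber positions mentioned above (one central hole and four peripheral holes of the PMMA phantom) feed the standard weighted-CTDI definition, CTDIw = (1/3)·CTDIcenter + (2/3)·mean(CTDIperiphery). The readings below are hypothetical numbers chosen only to show how a shield over one peripheral position translates into a percentage dose reduction.

```python
def ctdi_w(center, peripheral):
    """Weighted CTDI (mGy) from one central reading and the list of
    peripheral readings: (1/3)*center + (2/3)*mean(peripheral)."""
    return center / 3.0 + 2.0 * sum(peripheral) / (3.0 * len(peripheral))

def reduction_percent(unshielded, shielded):
    """Percentage dose reduction relative to the unshielded value."""
    return 100.0 * (unshielded - shielded) / unshielded

# hypothetical head-phantom readings (mGy); the shield covers one
# peripheral position, lowering its reading the most
base = ctdi_w(40.0, [44.0, 45.0, 43.0, 44.0])
shielded = ctdi_w(38.0, [30.0, 44.0, 42.0, 43.0])
print(f"{reduction_percent(base, shielded):.1f} % dose reduction")
```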

  5. Program Helps Design Tests Of Developmental Software

    NASA Technical Reports Server (NTRS)

    Hops, Jonathan

    1994-01-01

The computer program called "A Formal Test Representation Language and Tool for Functional Test Designs" (TRL) provides an automatic software tool and formal language used to implement the category-partition method and produce specifications of test cases in the testing phase of software development. The category-partition method is useful in defining the inputs, outputs, and purpose of the test-design phase, and combines the benefits of choosing normal cases with those of choosing cases that have error-exposing properties. Traceability is maintained quite easily by creating a test design for each objective in the test plan. The effort to transform test cases into procedures is simplified by use of the automatic software tool to create cases based on the test design. The method enables rapid elimination of undesired test cases from consideration and facilitates review of test designs by peer groups. Written in C language.
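The enumeration at the heart of the category-partition method is a cross product over partition choices. The sketch below uses a hypothetical specification for a file-copy command; a real TRL specification would also carry constraints that prune infeasible combinations before test cases are generated.

```python
from itertools import product

# Hypothetical category-partition spec: each category lists its choices.
categories = {
    "source":      ["exists", "missing", "unreadable"],
    "destination": ["new", "exists", "read-only dir"],
    "size":        ["empty", "small", "huge"],
}

def test_frames(spec):
    """Yield one test frame (dict of category -> choice) per combination
    of choices: the raw cross product, before constraint pruning."""
    names = list(spec)
    for combo in product(*(spec[n] for n in names)):
        yield dict(zip(names, combo))

frames = list(test_frames(categories))
print(len(frames))   # 3 * 3 * 3 = 27 candidate test cases
```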

  6. Banning design automation software implementation

    NASA Technical Reports Server (NTRS)

    Kuehlthau, R. L.

    1975-01-01

Research is reported on the development of a system of computer programs to aid engineering in the design, fabrication, and testing of large-scale integrated circuits, hybrid circuits, and printed circuit boards. The automatic layout programs, analysis programs, and interface programs are discussed.

  7. Computer-aided software development process design

    NASA Technical Reports Server (NTRS)

    Lin, Chi Y.; Levary, Reuven R.

    1989-01-01

The authors describe an intelligent tool designed to aid managers of software development projects in planning, managing, and controlling the development process of medium- to large-scale software projects. Its purpose is to reduce uncertainties in the budget, personnel, and schedule planning of software development projects. It is based on a dynamic model of the software development and maintenance life-cycle process. This dynamic process is composed of a number of time-varying, interacting developmental phases, each characterized by its intended functions and requirements. System dynamics is used as a modeling methodology. The resulting Software LIfe-Cycle Simulator (SLICS) and the hybrid expert simulation system of which it is a subsystem are described.

  8. SDDL- SOFTWARE DESIGN AND DOCUMENTATION LANGUAGE

    NASA Technical Reports Server (NTRS)

    Kleine, H.

    1994-01-01

    Effective, efficient communication is an essential element of the software development process. The Software Design and Documentation Language (SDDL) provides an effective communication medium to support the design and documentation of complex software applications. SDDL supports communication between all the members of a software design team and provides for the production of informative documentation on the design effort. Even when an entire development task is performed by a single individual, it is important to explicitly express and document communication between the various aspects of the design effort including concept development, program specification, program development, and program maintenance. SDDL ensures that accurate documentation will be available throughout the entire software life cycle. SDDL offers an extremely valuable capability for the design and documentation of complex programming efforts ranging from scientific and engineering applications to data management and business systems. Throughout the development of a software design, the SDDL-generated Software Design Document always represents the definitive word on the current status of the ongoing, dynamic design development process. The document is easily updated and readily accessible in a familiar, informative form to all members of the development team. This makes the Software Design Document an effective instrument for reconciling misunderstandings and disagreements in the development of design specifications, engineering support concepts, and the software design itself. Using the SDDL-generated document to analyze the design makes it possible to eliminate many errors that might not be detected until coding and testing is attempted. As a project management aid, the Software Design Document is useful for monitoring progress and for recording task responsibilities. SDDL is a combination of language, processor, and methodology. The SDDL syntax consists of keywords to invoke design structures

  9. Geant4 simulations on medical Linac operation at 18 MV: Experimental validation based on activation foils

    NASA Astrophysics Data System (ADS)

    Vagena, E.; Stoulos, S.; Manolopoulou, M.

    2016-03-01

    The operation of a medical linear accelerator was simulated using the Geant4 code in order to study the characteristics of an 18 MV photon beam. Simulations showed that (a) the photon spectrum at the isocenter is not influenced by changes in the primary electron beam's energy distribution and spatial spread; (b) 98% of the photon energy fluence scored at the isocenter comes from primary photons that have interacted only with the target; (c) the number of contaminant electrons is not negligible, fluctuating around 5×10⁻⁵ per primary electron, or 2.40×10⁻³ per photon, at the isocenter; (d) the number of neutrons created by (γ, n) reactions is 3.13×10⁻⁶ per primary electron, or 1.50×10⁻³ per photon, at the isocenter; (e) a flattening-filter-free beam needs fewer primary electrons to deliver the same photon fluence at the isocenter than normal operation with the flattening filter; (f) there is no significant increase of the surface dose due to contaminant electrons when the flattening filter is removed; and (g) comparing the neutron fluences per incident electron for the flattened and unflattened beams, the neutron fluence is 7% higher for the unflattened beam. To validate the simulation results, the total neutron and photon fluences at the isocenter field were measured using nickel, indium, and natural uranium activation foils. For the photon fluence, the difference between simulations and measurements was 1.26% for the uranium foil and 2.45% for the indium foil, while for neutrons the discrepancy was higher, up to 8.0%. The photon and neutron fluences of the simulated experiments fall within ±1 and ±2 sigma, respectively, of the ones obtained experimentally.

  10. BC404 scintillators as gamma locators studied via Geant4 simulations

    NASA Astrophysics Data System (ADS)

    Cortés, M. L.; Hoischen, R.; Eisenhauer, K.; Gerl, J.; Pietralla, N.

    2014-05-01

    In many applications in industry and academia, an accurate determination of the direction from where gamma rays are emitted is either needed or desirable. Ion-beam therapy treatments, the search for orphan sources, and homeland security applications are examples of fields that can benefit from directional sensitivity to gamma radiation. Scintillation detectors are a good option for these types of applications as they have relatively low cost, are easy to handle and can be produced in a large range of different sizes. In this work a Geant4 simulation was developed to study the directional sensitivity of different BC404 scintillator geometries and arrangements. The simulation includes all the physical processes relevant for gamma detection in a scintillator. In particular, the creation and propagation of optical photons inside the scintillator was included. A simplified photomultiplier tube model was also simulated. The physical principle exploited is the angular dependence of the shape of the energy spectrum obtained from thin scintillator layers when irradiated from different angles. After an experimental confirmation of the working principle of the device and a check of the simulation, the possibilities and limitations of directional sensitivity to gamma radiation using scintillator layers were tested. For this purpose, point-like sources of typical energies expected in ion-beam therapy were used. Optimal scintillator thicknesses for different energies were determined and the setup efficiencies calculated. The use of arrays of scintillators to reconstruct the direction of incoming gamma rays was also studied. For this case, a spherical source emitting Bremsstrahlung radiation was used together with a setup consisting of scintillator layers. The capability of this setup to identify the center of the extended source was studied together with its angular resolution.
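    The physical principle named above — the spectrum shape depending on incidence angle — rests on the chord length through a thin layer growing as 1/cos(θ). A minimal sketch under a simple exponential-attenuation assumption (the slab thickness and attenuation coefficient are illustrative values, not BC404 data):

```python
import math

def interaction_probability(thickness_mm, angle_deg, mu_per_mm):
    """Probability that a gamma ray interacts at least once while crossing a
    slab: the chord length grows as 1/cos(theta), which is the angular
    handle a thin scintillator layer exploits."""
    path = thickness_mm / math.cos(math.radians(angle_deg))
    return 1.0 - math.exp(-mu_per_mm * path)

p_normal = interaction_probability(5.0, 0.0, 0.01)    # head-on incidence
p_oblique = interaction_probability(5.0, 60.0, 0.01)  # 60 degrees off-axis
print(p_oblique > p_normal)  # -> True: longer path, more interactions
```

    The full simulation of course tracks energy deposits and optical photons rather than a single interaction probability, but the monotonic dependence on incidence angle is the same effect.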

  11. PDB4DNA: Implementation of DNA geometry from the Protein Data Bank (PDB) description for Geant4-DNA Monte-Carlo simulations

    NASA Astrophysics Data System (ADS)

    Delage, E.; Pham, Q. T.; Karamitros, M.; Payno, H.; Stepan, V.; Incerti, S.; Maigne, L.; Perrot, Y.

    2015-07-01

    This paper describes PDB4DNA, a new Geant4 user application, based on an independent, cross-platform, free and open-source C++ library called PDBlib, which enables the use of an atomic-level description of the DNA molecule in Geant4 Monte Carlo particle transport simulations. For the evaluation of direct damage induced on the DNA molecule by ionizing particles, the application uses an algorithm that determines the atom of the DNA molecule closest to each energy deposition. Both the PDB4DNA application and the PDBlib library are available as free and open source under the Geant4 license.
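    The closest-atom step described above can be sketched as a nearest-neighbour query over parsed atomic coordinates. This is a brute-force toy, not the PDBlib interface; the atom names and coordinates are invented:

```python
import math

# Toy atom list (name, x, y, z) standing in for coordinates parsed from a
# PDB file; the real PDBlib parsing and DNA geometry are not reproduced here.
atoms = [("P", 0.0, 0.0, 0.0), ("C1'", 1.5, 0.0, 0.0), ("N9", 0.0, 2.0, 0.0)]

def closest_atom(atoms, hit):
    """Brute-force nearest-neighbour search: return the atom nearest to an
    energy-deposition point. A production code would use a spatial index
    (e.g. a k-d tree or a uniform grid) instead of scanning every atom."""
    return min(atoms, key=lambda a: math.dist(a[1:], hit))

print(closest_atom(atoms, (1.2, 0.1, 0.0))[0])  # -> C1'
```

    Mapping each deposit to its nearest atom is what lets direct-damage scoring attribute energy to specific nucleotides.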

  12. Geant4 simulations of Solar-Orbiter STIX Caliste detectors’ response to solar X-ray radiation

    NASA Astrophysics Data System (ADS)

    Barylak, Jaromir; Barylak, Aleksandra; Mrozek, Tomasz; Steslicki, Marek; Podgorski, Piotr; Netzel, Henryka

    2015-08-01

    Spectrometer/Telescope for Imaging X-rays (STIX) is part of the Solar Orbiter (SO) science payload. SO, to be launched in October 2018, will reach a final orbit approaching the Sun to within 0.3 AU. STIX is a Fourier imager equipped with pairs of grids that together form the flare hard X-ray tomograph. Similar imagers have been flown before (e.g. RHESSI, Yohkoh/HXT), but STIX will incorporate Moiré modulation and a new type of pixelated detector. We developed a method of modeling these detectors’ response matrix (DRM) using Geant4 simulations of X-ray photon interactions with CdTe crystals. Taking into account known detector effects (Fano noise, hole tailing, etc.), we modeled the resulting spectra with high accuracy. Comparison of Caliste-SO laboratory measurements of the 241Am decay spectrum with our results shows excellent agreement (within 1-2%). Using the Geant4 tool, we proceed to model the ageing response of the detectors (several years in interplanetary space). The modeling based on the Geant4 simulations significantly improves our understanding of the detector response to X-ray photons and to secondary background emission due to particles. As an example, we present predicted X-ray spectra of solar flares obtained for several levels of detector degradation and for various distances of SO from the Sun.
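    Applying a detector response matrix of the kind modeled above amounts to folding a true photon spectrum through the matrix: each column redistributes one incident-energy bin over the measured-energy bins. A toy sketch with invented numbers (not the STIX Caliste DRM):

```python
# Toy 3x3 detector response matrix: column j gives the probability that a
# photon in true-energy bin j is recorded in each measured bin (photopeak
# plus a tailing component). Values are illustrative only.
drm = [
    [0.9, 0.2, 0.1],   # measured bin 0
    [0.1, 0.7, 0.2],   # measured bin 1
    [0.0, 0.1, 0.7],   # measured bin 2
]
true_spectrum = [100.0, 50.0, 10.0]   # incident photons per energy bin

def fold(drm, spectrum):
    """Fold a true spectrum through the response: measured[i] = sum_j R[i][j]*true[j]."""
    return [sum(row[j] * spectrum[j] for j in range(len(spectrum))) for row in drm]

print([round(x, 6) for x in fold(drm, true_spectrum)])  # -> [101.0, 47.0, 12.0]
```

    Detector ageing can then be represented as a change in the matrix entries (e.g. growing tails), leaving the folding step unchanged.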

  13. Optimization of a photoneutron source based on 10 MeV electron beam using Geant4 Monte Carlo code

    NASA Astrophysics Data System (ADS)

    Askri, Boubaker

    2015-10-01

    Geant4 Monte Carlo code has been used to conceive and optimize a simple and compact neutron source based on a 10 MeV electron beam impinging on a tungsten target adjoined to a beryllium target. For this purpose, a precise photonuclear reaction cross-section model issued from the International Atomic Energy Agency (IAEA) database was linked to Geant4 to accurately simulate the interaction of low energy bremsstrahlung photons with beryllium material. A benchmark test showed that a good agreement was achieved when comparing the emitted neutron flux spectra predicted by Geant4 and Fluka codes for a beryllium cylinder bombarded with a 5 MeV photon beam. The source optimization was achieved through a two-stage Monte Carlo simulation. In the first stage, the distributions of the seven phase space coordinates of the bremsstrahlung photons at the boundaries of the tungsten target were determined. In the second stage, events corresponding to photons emitted according to these distributions were tracked. A neutron yield of 4.8 × 10¹⁰ neutrons/mA/s was obtained at 20 cm from the beryllium target. A thermal neutron yield of 1.5 × 10⁹ neutrons/mA/s was obtained after introducing a spherical shell of polyethylene as a neutron moderator.
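    The two-stage scheme above — record a phase space at the converter boundary, then resample it instead of re-simulating the electron stage — can be sketched as follows. The spectrum shape and counts are invented (a crude exponential stand-in for bremsstrahlung); the only physical number used is the approximate Be(γ, n) reaction threshold of about 1.67 MeV:

```python
import random

rng = random.Random(42)

# Stage 1: "record" a phase space at the tungsten-target boundary. Each
# photon is reduced to a single energy sample here; a real phase-space file
# would also store the position and direction coordinates.
phase_space = [rng.expovariate(1.0 / 2.0) for _ in range(10000)]  # MeV, toy spectrum

# Stage 2: resample the recorded particles instead of re-running stage 1.
def fraction_above_threshold(phase_space, threshold_mev, n, rng):
    """Fraction of resampled photons above a reaction threshold; only these
    can drive (gamma, n) neutron production in the beryllium target."""
    hits = sum(1 for _ in range(n) if rng.choice(phase_space) > threshold_mev)
    return hits / n

frac = fraction_above_threshold(phase_space, 1.67, 5000, rng)  # ~Be(gamma, n) threshold
print(0.0 < frac < 1.0)  # -> True
```

    The payoff of the split is efficiency: the expensive electron-on-tungsten stage is simulated once, and the recorded distributions are reused for every optimization run of the second stage.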

  14. Validating Geant4 Versions 7.1 and 8.3 Against 6.1 for BaBar

    SciTech Connect

    Banerjee, Swagato; Brown, David N.; Chen, Chunhui; Cote, David; Dubois-Felsmann, Gregory P.; Gaponenko, Igor; Kim, Peter C.; Lockman, William S.; Neal, Homer A.; Simi, Gabriele; Telnov, Alexandre V.; Wright, Dennis H.; /SLAC

    2011-11-08

    Since 2005 and 2006, respectively, Geant4 versions 7.1 and 8.3 have been available, providing: improvements in the modeling of multiple scattering; corrections to muon ionization and an improved MIP signature; a widening of the core of electromagnetic shower shape profiles; a newer implementation of elastic scattering for hadronic processes; a detailed implementation of the Bertini cascade model for kaons and lambdas; and updated hadronic cross sections from calorimeter beam tests. The effects of these changes are studied in terms of the closer agreement of simulations using Geant4 versions 7.1 and 8.3, as compared to version 6.1, with data distributions of: the hit residuals of tracks in the BABAR silicon vertex tracker; the photon and K_L^0 shower shapes in the electromagnetic calorimeter; the ratio of energy deposited in the electromagnetic calorimeter and the flux return of the magnet instrumented with a muon detection system composed of resistive plate chambers and limited-streamer tubes; and the muon identification efficiency in the muon detector system of the BABAR detector.

  15. Software For Drawing Design Details Concurrently

    NASA Technical Reports Server (NTRS)

    Crosby, Dewey C., III

    1990-01-01

    Software system containing five computer-aided-design programs enables more than one designer to work on same part or assembly at same time. Reduces time necessary to produce design by implementing concept of parallel or concurrent detailing, in which all detail drawings documenting three-dimensional model of part or assembly produced simultaneously, rather than sequentially. Keeps various detail drawings consistent with each other and with overall design by distributing changes in each detail to all other affected details.

  16. Design of software engineering teaching website

    NASA Astrophysics Data System (ADS)

    Li, Yuxiang; Liu, Xin; Zhang, Guangbin; Liu, Xingshun; Gao, Zhenbo

    "Software engineering" is different from general professional courses: it was born of the need to overcome the software crisis and to adapt to the development of the software industry, and it is a theoretical course but, above all, a practical one. However, due to the particular characteristics of the software engineering curriculum, students in daily teaching may find the theoretical study boring and show low interest in learning, poor test results, and other problems. ASP.NET technology and an Access 2007 database were adopted to design and implement a "Software Engineering" teaching website. The system's main features include theoretical teaching, case teaching, practical teaching, teaching interaction, a database, a test item bank, announcements, etc., which can enhance the vitality, interest, and dynamism of learning.

  17. Early-Stage Software Design for Usability

    ERIC Educational Resources Information Center

    Golden, Elspeth

    2010-01-01

    In spite of the goodwill and best efforts of software engineers and usability professionals, systems continue to be built and released with glaring usability flaws that are costly and difficult to fix after the system has been built. Although user interface (UI) designers, be they usability or design experts, communicate usability requirements to…

  18. Software Design Improvements. Part 2; Software Quality and the Design and Inspection Process

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R.; Packard, Michael H.; Ziemianski, Tom

    1997-01-01

    The application of assurance engineering techniques improves the duration of failure-free performance of software. The totality of features and characteristics of a software product are what determine its ability to satisfy customer needs. Software in safety-critical systems is very important to NASA. We follow the System Safety Working Group's definition of system safety software: 'The optimization of system safety in the design, development, use and maintenance of software and its integration with safety-critical systems in an operational environment.' 'If it is not safe, say so' has become our motto. This paper goes over methods that have been used by NASA to make software design improvements by focusing on software quality and the design and inspection process.

  19. General purpose optimization software for engineering design

    NASA Technical Reports Server (NTRS)

    Vanderplaats, G. N.

    1990-01-01

    The author has developed several general purpose optimization programs over the past twenty years. The earlier programs were developed as research codes and served that purpose reasonably well. However, in taking the formal step from research to industrial application programs, several important lessons have been learned. Among these are the importance of clear documentation, immediate user support, and consistent maintenance. Most important has been the issue of providing software that gives a good, or at least acceptable, design at minimum computational cost. Here, the basic issues in developing optimization software for industrial applications are outlined, and issues of convergence rate, reliability, and relative minima are discussed. Considerable feedback has been received from users, and new software is being developed to respond to identified needs. The basic capabilities of this software are outlined. A major motivation for the development of commercial grade software is ease of use and flexibility, and these issues are discussed with reference to general multidisciplinary applications. It is concluded that design productivity can be significantly enhanced by the more widespread use of optimization as an everyday design tool.

  20. Monte Carlo simulation of MOSFET dosimeter for electron backscatter using the GEANT4 code.

    PubMed

    Chow, James C L; Leung, Michael K K

    2008-06-01

    The aim of this study is to investigate the influence of the body of the metal-oxide-semiconductor field effect transistor (MOSFET) dosimeter in measuring the electron backscatter from lead. The electron backscatter factor (EBF), which is defined as the ratio of the dose at the tissue-lead interface to the dose at the same point without the presence of backscatter, was calculated by Monte Carlo simulation using the GEANT4 code. Electron beams with energies of 4, 6, 9, and 12 MeV were used in the simulation. It was found that in the presence of the MOSFET body, the EBFs were underestimated by about 2%-0.9% for electron beam energies of 4-12 MeV, respectively. The decrease of this underestimation with increasing electron energy can be explained by the fact that the small MOSFET dosimeter, made mainly of epoxy and silicon, attenuates not only the electron fluence of the beam from upstream but also the electron backscatter generated by the lead underneath the dosimeter. However, this variation of the EBF underestimation is of the same order as the statistical uncertainties of the Monte Carlo simulations, which ranged from 1.3% to 0.8% for electron energies of 4-12 MeV, due to the small dosimetric volume. Such a small EBF deviation is therefore insignificant when the uncertainty of the Monte Carlo simulation is taken into account. Corresponding measurements were carried out, and deviations from the Monte Carlo results were within +/- 2%. Spectra of energy deposited by the backscattered electrons in dosimetric volumes with and without the lead and MOSFET were determined by Monte Carlo simulations. It was found that in both cases, when the MOSFET body is either present or absent in the simulation, deviations of electron energy spectra with and without the lead decrease with an increase of the electron beam energy. Moreover, the softer spectrum of the backscattered electrons when lead is present can result in a reduction of the MOSFET response due to stronger
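    The EBF defined above is a simple ratio of two dose tallies. A minimal sketch with invented tally values (not the paper's data):

```python
def backscatter_factor(dose_with_scatterer, dose_without):
    """Electron backscatter factor (EBF): dose at the tissue-lead interface
    divided by the dose at the same point with no backscattering medium."""
    return dose_with_scatterer / dose_without

# Illustrative dose tallies in arbitrary units (not values from the study).
ebf = backscatter_factor(1.56, 1.20)
print(round(ebf, 3))  # -> 1.3
```

    In a Monte Carlo workflow the two tallies come from paired runs, one with the lead slab present and one with it replaced by tissue, with all other geometry held fixed.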

  1. Evaluation of commercially available lighting design software

    SciTech Connect

    McConnell, D.

    1989-01-01

    For years the lighting industry has manually entered and manually performed calculations on the photometric data that is necessary for lighting designs. In the past few years many lighting manufacturers and private lighting design software companies have published computer programs to enter and perform these calculations. Sandia National Laboratories (SNL), and other interested organizations, are involved in outdoor lighting designs for Closed Circuit Television (CCTV) that require lighting design software programs. During the period when no commercial lighting design software programs existed, SNL first used a government agency's program and then developed an in-house program. The in-house program is very powerful but has limitations, so it is not feasible to distribute it to interested organizations. This program has been used extensively for many high security outdoor lighting design projects. There is still a demand for lighting design programs, so SNL has ordered several that are commercially available. These programs are being evaluated for two reasons: (1) to determine if their features are adequate to aid the user in lighting designs, and (2) to provide that information to SNL and other organizations. The information obtained in this paper is to be used to help an end user decide if a program is needed, and if so, to choose one. This paper presents the results of evaluations performed. 5 refs., 6 figs., 3 tabs.

  2. Photonic IC design software and process design kits

    NASA Astrophysics Data System (ADS)

    Korthorst, Twan; Stoffer, Remco; Bakker, Arjen

    2015-04-01

    This review discusses photonic IC design software tools, examines existing design flows for photonics design and how these fit different design styles and describes the activities in collaboration and standardization within the silicon photonics group from Si2 and by members of the PDAFlow Foundation to improve design flows. Moreover, it will address the lowering of access barriers to the technology by providing qualified process design kits (PDKs) and improved integration of photonic integrated circuit simulations, physical simulations, mask layout, and verification.

  3. Preliminary design of the redundant software experiment

    NASA Technical Reports Server (NTRS)

    Campbell, Roy; Deimel, Lionel; Eckhardt, Dave, Jr.; Kelly, John; Knight, John; Lauterbach, Linda; Lee, Larry; Mcallister, Dave; Mchugh, John

    1985-01-01

    The goal of the present experiment is to characterize the fault distributions of highly reliable software replicates, constructed using techniques and environments which are similar to those used in contemporary industrial software facilities. The fault distributions and their effect on the reliability of fault tolerant configurations of the software will be determined through extensive life testing of the replicates against carefully constructed, randomly generated test data. Each detected error will be carefully analyzed to provide insight into its nature and cause. A direct objective is to develop techniques for reducing the intensity of coincident errors, thus increasing the reliability gain which can be achieved with fault tolerance. Data on the reliability gains realized, and the cost of the fault tolerant configurations, can be used to design a companion experiment to determine the cost effectiveness of the fault tolerant strategy. Finally, the data and analysis produced by this experiment will be valuable to the software engineering community as a whole because it will provide useful insight into the nature and cause of hard-to-find, subtle faults which escape standard software engineering validation techniques and thus persist far into the software life cycle.
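    The stakes of coincident errors can be framed against the independent-failure baseline for a 2-out-of-3 voted configuration: voting only helps to the extent that replicate failures are uncorrelated. A toy Monte Carlo sketch (the failure probability and trial count are arbitrary choices, not experimental values):

```python
import random

rng = random.Random(7)

def vote(outputs):
    """2-out-of-3 majority vote over replicate outputs; None if no majority."""
    for o in outputs:
        if outputs.count(o) >= 2:
            return o
    return None

def triplex_failure_rate(p_fail, trials, rng):
    """Monte Carlo estimate of voted-system failure probability assuming
    independent replicate failures; the coincident (correlated) faults the
    experiment studies would break exactly this assumption."""
    failures = 0
    for _ in range(trials):
        outputs = ["bad" if rng.random() < p_fail else "good" for _ in range(3)]
        if vote(outputs) != "good":
            failures += 1
    return failures / trials

est = triplex_failure_rate(0.1, 20000, rng)
print(est < 0.1)  # -> True: voting beats one replicate when faults are independent
```

    Analytically the independent case gives 3p²(1-p) + p³ ≈ 0.028 for p = 0.1; measured coincident-fault intensities show how far real replicates fall short of that bound.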

  4. User Interface Design for Dynamic Geometry Software

    ERIC Educational Resources Information Center

    Kortenkamp, Ulrich; Dohrmann, Christian

    2010-01-01

    In this article we describe long-standing user interface issues with Dynamic Geometry Software and common approaches to address them. We describe first prototypes of multi-touch-capable DGS. We also give some hints on the educational benefits of proper user interface design.

  5. Teacher-Driven Design of Educational Software.

    ERIC Educational Resources Information Center

    Carlson, Patricia A.

    This paper reflects on the author's participation in two government-sponsored educational software development projects that used a holistic design paradigm in which classroom formative assessment and teacher input played a critical role in the development process. The two projects were: R-WISE (Reading and Writing in a Supportive Environment)--a…

  6. Computer Software Designs for College Science Courses.

    ERIC Educational Resources Information Center

    Jain, Duli C.; And Others

    1985-01-01

    Computer-assisted-instruction software was developed to supplement the conventional lecture-laboratory mode of instruction with another instructional aid for learning science in an individualized, nonthreatening environment. This development project was designed to teach physical concepts, mathematical techniques, and problem solving strategies.…

  7. Monte Carlo simulation and scatter correction of the GE Advance PET scanner with SimSET and Geant4

    NASA Astrophysics Data System (ADS)

    Barret, Olivier; Carpenter, T. Adrian; Clark, John C.; Ansorge, Richard E.; Fryer, Tim D.

    2005-10-01

    For Monte Carlo simulations to be used as an alternative solution to perform scatter correction, accurate modelling of the scanner as well as speed is paramount. General-purpose Monte Carlo packages (Geant4, EGS, MCNP) allow a detailed description of the scanner but are not efficient at simulating voxel-based geometries (patient images). On the other hand, dedicated codes (SimSET, PETSIM) will perform well for voxel-based objects but will be poor in their capacity of simulating complex geometries such as a PET scanner. The approach adopted in this work was to couple a dedicated code (SimSET) with a general-purpose package (Geant4) to have the efficiency of the former and the capabilities of the latter. The combined SimSET+Geant4 code (SimG4) was assessed on the GE Advance PET scanner and compared to the use of SimSET only. A better description of the resolution and sensitivity of the scanner and of the scatter fraction was obtained with SimG4. The accuracy of scatter correction performed with SimG4 and SimSET was also assessed from data acquired with the 20 cm NEMA phantom. SimG4 was found to outperform SimSET and to give slightly better results than the GE scatter correction methods installed on the Advance scanner (curve fitting and scatter modelling for the 300-650 keV and 375-650 keV energy windows, respectively). In the presence of a hot source close to the edge of the field of view (as found in oxygen scans), the GE curve-fitting method was found to fail whereas SimG4 maintained its performance.

  8. Monte Carlo simulation and scatter correction of the GE advance PET scanner with SimSET and Geant4.

    PubMed

    Barret, Olivier; Carpenter, T Adrian; Clark, John C; Ansorge, Richard E; Fryer, Tim D

    2005-10-21

    For Monte Carlo simulations to be used as an alternative solution to perform scatter correction, accurate modelling of the scanner as well as speed is paramount. General-purpose Monte Carlo packages (Geant4, EGS, MCNP) allow a detailed description of the scanner but are not efficient at simulating voxel-based geometries (patient images). On the other hand, dedicated codes (SimSET, PETSIM) will perform well for voxel-based objects but will be poor in their capacity of simulating complex geometries such as a PET scanner. The approach adopted in this work was to couple a dedicated code (SimSET) with a general-purpose package (Geant4) to have the efficiency of the former and the capabilities of the latter. The combined SimSET+Geant4 code (SimG4) was assessed on the GE Advance PET scanner and compared to the use of SimSET only. A better description of the resolution and sensitivity of the scanner and of the scatter fraction was obtained with SimG4. The accuracy of scatter correction performed with SimG4 and SimSET was also assessed from data acquired with the 20 cm NEMA phantom. SimG4 was found to outperform SimSET and to give slightly better results than the GE scatter correction methods installed on the Advance scanner (curve fitting and scatter modelling for the 300-650 keV and 375-650 keV energy windows, respectively). In the presence of a hot source close to the edge of the field of view (as found in oxygen scans), the GE curve-fitting method was found to fail whereas SimG4 maintained its performance. PMID:16204875
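    A simulation-based scatter correction of the kind assessed above ultimately rests on the simulation's ability to tag each coincidence as true or scattered, so the scattered component can be subtracted from the measured data. A deliberately minimal sketch (the scatter probability is an arbitrary stand-in, not the Advance scanner's scatter fraction):

```python
import random

rng = random.Random(5)

def scatter_fraction(n_events, p_scatter, rng):
    """Tag each simulated coincidence as scattered or true and report the
    scatter fraction; a simulation-based correction subtracts the scattered
    component (binned by detector pair in practice) from the measurement."""
    scattered = sum(1 for _ in range(n_events) if rng.random() < p_scatter)
    return scattered / n_events

sf = scatter_fraction(100_000, 0.35, rng)  # 0.35 is an illustrative probability
print(abs(sf - 0.35) < 0.01)  # -> True
```

    The coupling described in the abstract divides the labor along the same line: the dedicated code tracks photons efficiently through the voxelized object, and the general-purpose code handles their fate in the detailed scanner geometry.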

  9. Monte Carlo simulation of a PhosWatch detector using Geant4 for xenon isotope beta-gamma coincidence spectrum profile and detection efficiency calculations.

    PubMed

    Mekarski, P; Zhang, W; Ungar, K; Bean, M; Korpach, E

    2009-10-01

    A simulation tool has been developed using the Geant4 Toolkit to simulate a PhosWatch single-channel beta-gamma coincidence detection system, consisting of a CsI(Tl)/BC404 phoswich well detector and pulse shape analysis algorithms implemented in a digital signal processor. The tool can be used to simulate the detector's response for all the gamma rays and beta particles emitted from (135)Xe, (133m)Xe, (133)Xe, (131m)Xe and (214)Pb. Two- and three-dimensional beta-gamma coincidence spectra from the PhosWatch detector can be produced using the simulation tool. The accurately simulated spectra could be used to calculate the system coincidence detection efficiency for each xenon isotope, the corrections for the interference from the various spectral components from radon and xenon isotopes, and the system gain calibration. Also, it can generate two- and three-dimensional xenon reference spectra to test beta-gamma coincidence spectral deconvolution analysis software. PMID:19647444
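    The two-dimensional beta-gamma coincidence spectra mentioned above are, at bottom, 2-D histograms over (beta deposit, gamma deposit) pairs. A toy sketch; the event generator loosely mimics (133)Xe (346 keV beta endpoint in coincidence with the 81 keV gamma line), but the resolution and binning are invented:

```python
import random

rng = random.Random(3)

def coincidence_histogram(events, beta_bins, gamma_bins, beta_max, gamma_max):
    """Fill a 2-D beta-gamma coincidence histogram: rows index the beta
    (electron) energy deposit, columns the coincident gamma deposit.
    Out-of-range deposits are clamped into the edge bins."""
    hist = [[0] * gamma_bins for _ in range(beta_bins)]
    for e_beta, e_gamma in events:
        i = min(max(int(e_beta / beta_max * beta_bins), 0), beta_bins - 1)
        j = min(max(int(e_gamma / gamma_max * gamma_bins), 0), gamma_bins - 1)
        hist[i][j] += 1
    return hist

# Toy events: continuous beta spectrum (endpoint 346 keV, crudely uniform)
# in coincidence with an 81 keV gamma line smeared by an invented resolution.
events = [(rng.uniform(0.0, 346.0), rng.gauss(81.0, 5.0)) for _ in range(1000)]
hist = coincidence_histogram(events, 8, 8, 346.0, 160.0)
print(sum(map(sum, hist)))  # -> 1000: every event lands in exactly one bin
```

    Isotope identification then reduces to comparing such measured histograms against simulated reference histograms for each xenon isotope.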

  10. CRISP90 - SOFTWARE DESIGN ANALYZER SYSTEM

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1994-01-01

    The CRISP90 Software Design Analyzer System, an update of CRISP-80, is a set of programs forming a software design and documentation tool which supports top-down, hierarchic, modular, structured design and programming methodologies. The quality of a computer program can often be significantly influenced by the design medium in which the program is developed. The medium must foster the expression of the programmer's ideas easily and quickly, and it must permit flexible and facile alterations, additions, and deletions to these ideas as the design evolves. The CRISP90 software design analyzer system was developed to provide the PDL (Programmer Design Language) programmer with such a design medium. A program design using CRISP90 consists of short, English-like textual descriptions of data, interfaces, and procedures that are imbedded in a simple, structured, modular syntax. The display is formatted into two-dimensional, flowchart-like segments for a graphic presentation of the design. Together with a good interactive full-screen editor or word processor, the CRISP90 design analyzer becomes a powerful tool for the programmer. In addition to being a text formatter, the CRISP90 system prepares material that would be tedious and error prone to extract manually, such as a table of contents, module directory, structure (tier) chart, cross-references, and a statistics report on the characteristics of the design. Referenced modules are marked by schematic logic symbols to show conditional, iterative, and/or concurrent invocation in the program. A keyword usage profile can be generated automatically and glossary definitions inserted into the output documentation. Another feature is the capability to detect changes that were made between versions. Thus, "change-bars" can be placed in the output document along with a list of changed pages and a version history report. Also, items may be marked as "to be determined" and each will appear on a special table until the item is

  11. SU-E-T-347: Validation of the Condensed History Algorithm of Geant4 Using the Fano Test

    SciTech Connect

    Lee, H; Mathis, M; Sawakuchi, G

    2014-06-01

    Purpose: To validate the condensed history algorithm and physics of the Geant4 Monte Carlo toolkit for simulations of ionization chambers (ICs). This study is the first step to validate Geant4 for calculations of photon beam quality correction factors under the presence of a strong magnetic field for magnetic resonance guided linac system applications. Methods: The electron transport and boundary crossing algorithms of Geant4 version 9.6.p02 were tested under Fano conditions using the Geant4 example/application FanoCavity. User-defined parameters of the condensed history and multiple scattering algorithms were investigated under Fano test conditions for three scattering models (physics lists): G4UrbanMscModel95 (PhysListEmStandard-option3), G4GoudsmitSaundersonMsc (PhysListEmStandard-GS), and G4WentzelVIModel/G4CoulombScattering (PhysListEmStandard-WVI). Simulations were conducted using monoenergetic photon beams, ranging from 0.5 to 7 MeV and emphasizing energies from 0.8 to 3 MeV. Results: The GS and WVI physics lists provided consistent Fano test results (within ±0.5%) for maximum step sizes under 0.01 mm at 1.25 MeV, with improved performance at 3 MeV (within ±0.25%). The option3 physics list provided consistent Fano test results (within ±0.5%) for maximum step sizes above 1 mm. Optimal parameters for the option3 physics list were 10 km maximum step size with default values for other user-defined parameters: 0.2 dRoverRange, 0.01 mm final range, 0.04 range factor, 2.5 geometrical factor, and 1 skin. Simulations using the option3 physics list were ∼70 – 100 times faster compared to GS and WVI under optimal parameters. Conclusion: This work indicated that the option3 physics list passes the Fano test within ±0.5% when using a maximum step size of 10 km for energies suitable for IC calculations in a 6 MV spectrum without extensive computational times. Optimal user-defined parameters using the option3 physics list will be used in future IC simulations to

  12. Computed Pion Yields from a Tantalum Rod Target: Comparing MARS15 and GEANT4 Across Proton Energies

    NASA Astrophysics Data System (ADS)

    Brooks, S. J.; Walaron, K. A.

    2006-05-01

    The choice of proton driver energy is an important variable in maximising the pion flux available in later stages of the neutrino factory. Simulations of pion production using a range of energies are presented and cross-checked for reliability between the codes MARS15 and GEANT4. The distributions are combined with postulated apertures for the pion decay channel and muon front-end to estimate the usable muon flux after capture losses. Resolution of discrepancies between the codes awaits experimental data in the required energy range.

  13. Studying the response of a plastic scintillator to gamma rays using the Geant4 Monte Carlo code.

    PubMed

    Ghadiri, Rasoul; Khorsandi, Jamshid

    2015-05-01

To determine the gamma ray response function of an NE-102 scintillator and to investigate the gamma spectra due to the transport of optical photons, we simulated an NE-102 scintillator using the Geant4 code. The results of the simulation were compared with experimental data, and good consistency between the simulation and the data was observed. In addition, the time and spatial distributions, along with the energy distribution and surface treatments of scintillation detectors, were calculated. This simulation makes it possible to optimize the photomultiplier tube (or photodiode) position to yield the best coupling to the detector. PMID:25725326

  14. Automating Risk Analysis of Software Design Models

    PubMed Central

    Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P.

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688

  15. Automating risk analysis of software design models.

    PubMed

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688
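The identification trees described in the two records above can be pictured as trees whose leaves carry predicates tested against elements of a software design, collecting the threats that match. A hypothetical sketch (the node structure, predicates, and threat names are illustrative, not AutSEC's actual data model):

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """One node of a toy identification tree; leaves carry a predicate."""
    name: str
    predicate: callable = None          # leaf test against a design element
    children: list = field(default_factory=list)

def identify(node, element, found=None):
    """Depth-first collection of threat leaves whose predicate matches."""
    if found is None:
        found = []
    if node.predicate and node.predicate(element):
        found.append(node.name)
    for child in node.children:
        identify(child, element, found)
    return found

tree = Node("threats", children=[
    Node("SQL injection", predicate=lambda e: e.get("kind") == "db-query"),
    Node("plaintext credentials", predicate=lambda e: e.get("stores_secrets")),
])
print(identify(tree, {"kind": "db-query"}))  # ['SQL injection']
```

A companion mitigation tree would map each identified threat to candidate countermeasures, weighted by the specification and cost constraints the abstract mentions.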

  16. A comparison of the measured responses of a tissue-equivalent proportional counter to high energy heavy (HZE) particles and those simulated using the Geant4 Monte Carlo code

    PubMed Central

    Taddei, Phillip J.; Zhao, Zhongxiang; Borak, Thomas B.

    2010-01-01

Monte Carlo simulations of heavy ion interactions using the Geant4 toolkit were compared with measurements of energy deposition in a spherical tissue-equivalent proportional counter (TEPC). A spherical cavity with a physical diameter of 12.7 mm was filled with propane-based tissue-equivalent gas surrounded by a wall of A-150 tissue-equivalent plastic that was 2.54 mm thick. Measurements and Monte Carlo simulations were used to record the energy deposition and the trajectory of the incident particle on an event-by-event basis for ions ranging in atomic number from 2 (4He) to 26 (56Fe) and in energy from 200 MeV/nucleon to 1000 MeV/nucleon. In the simulations, tracking of secondary electrons was terminated when the range of an electron was below a specified threshold. The effects of range cuts for electrons at 0.5 μm, 1 μm, 10 μm, and 100 μm were evaluated. To simulate an energy deposition influenced by large numbers of low energy electrons with large transverse momentum, it was necessary to track electrons down to range cuts of 10 μm or less. The Geant4 simulated data closely matched the measured data acquired using a TEPC for incident particles traversing the center of the detector as well as near the gas-wall interface. Values of frequency mean lineal energy and dose mean lineal energy were within 8% of the measured data. The production of secondary particles in the aluminum vacuum chamber had no effect on the response of the TEPC for 56Fe at 1000 MeV/nucleon. The results of this study confirm that Geant4 can simulate patterns of energy deposition for existing microdosimeters and is valuable for improving the design of a new generation of detectors used for space dosimetry and for characterizing particle beams used in hadron radiotherapy. PMID:20862212

  17. A comparison of the measured responses of a tissue-equivalent proportional counter to high energy heavy (HZE) particles and those simulated using the Geant4 Monte Carlo code.

    PubMed

    Taddei, Phillip J; Zhao, Zhongxiang; Borak, Thomas B

    2008-10-01

Monte Carlo simulations of heavy ion interactions using the Geant4 toolkit were compared with measurements of energy deposition in a spherical tissue-equivalent proportional counter (TEPC). A spherical cavity with a physical diameter of 12.7 mm was filled with propane-based tissue-equivalent gas surrounded by a wall of A-150 tissue-equivalent plastic that was 2.54 mm thick. Measurements and Monte Carlo simulations were used to record the energy deposition and the trajectory of the incident particle on an event-by-event basis for ions ranging in atomic number from 2 ((4)He) to 26 ((56)Fe) and in energy from 200 MeV/nucleon to 1000 MeV/nucleon. In the simulations, tracking of secondary electrons was terminated when the range of an electron was below a specified threshold. The effects of range cuts for electrons at 0.5 μm, 1 μm, 10 μm, and 100 μm were evaluated. To simulate an energy deposition influenced by large numbers of low energy electrons with large transverse momentum, it was necessary to track electrons down to range cuts of 10 μm or less. The Geant4 simulated data closely matched the measured data acquired using a TEPC for incident particles traversing the center of the detector as well as near the gas-wall interface. Values of frequency mean lineal energy and dose mean lineal energy were within 8% of the measured data. The production of secondary particles in the aluminum vacuum chamber had no effect on the response of the TEPC for (56)Fe at 1000 MeV/nucleon. The results of this study confirm that Geant4 can simulate patterns of energy deposition for existing microdosimeters and is valuable for improving the design of a new generation of detectors used for space dosimetry and for characterizing particle beams used in hadron radiotherapy. PMID:20862212
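The frequency-mean and dose-mean lineal energies compared in the two records above (agreement within 8%) are standard microdosimetric summaries of event-by-event energy deposition. A sketch under the usual convention that the mean chord length of a sphere of diameter d is 2d/3; the sample events and diameter are made up:

```python
def lineal_energy_means(event_energies_keV, diameter_um):
    """Frequency-mean and dose-mean lineal energy for a spherical cavity."""
    mean_chord = 2.0 * diameter_um / 3.0                  # um
    y = [e / mean_chord for e in event_energies_keV]      # keV/um per event
    y_f = sum(y) / len(y)                                 # frequency mean
    y_d = sum(v * v for v in y) / sum(y)                  # dose mean
    return y_f, y_d

# Three illustrative events in a 2 um (tissue-equivalent) sphere:
y_f, y_d = lineal_energy_means([10.0, 20.0, 30.0], diameter_um=2.0)
```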

  18. Intelligent Software for System Design and Documentation

    NASA Technical Reports Server (NTRS)

    2002-01-01

    In an effort to develop a real-time, on-line database system that tracks documentation changes in NASA's propulsion test facilities, engineers at Stennis Space Center teamed with ECT International of Brookfield, WI, through the NASA Dual-Use Development Program to create the External Data Program and Hyperlink Add-on Modules for the promis*e software. Promis*e is ECT's top-of-the-line intelligent software for control system design and documentation. With promis*e the user can make use of the automated design process to quickly generate control system schematics, panel layouts, bills of material, wire lists, terminal plans and more. NASA and its testing contractors currently use promis*e to create the drawings and schematics at the E2 Cell 2 test stand located at Stennis Space Center.

  19. Advanced Extravehicular Mobility Unit Informatics Software Design

    NASA Technical Reports Server (NTRS)

    Wright, Theodore

    2014-01-01

    This is a description of the software design for the 2013 edition of the Advanced Extravehicular Mobility Unit (AEMU) Informatics computer assembly. The Informatics system is an optional part of the space suit assembly. It adds a graphical interface for displaying suit status, timelines, procedures, and caution and warning information. In the future it will display maps with GPS position data, and video and still images captured by the astronaut.

  20. Empirical studies of software design: Implications for SSEs

    NASA Technical Reports Server (NTRS)

    Krasner, Herb

    1988-01-01

    Implications for Software Engineering Environments (SEEs) are presented in viewgraph format for characteristics of projects studied; significant problems and crucial problem areas in software design for large systems; layered behavioral model of software processes; implications of field study results; software project as an ecological system; results of the LIFT study; information model of design exploration; software design strategies; results of the team design study; and a list of publications.

  1. Multitone radar design using software radio components

    NASA Astrophysics Data System (ADS)

    Mitra, Atindra K.

    2009-05-01

Recent developments in communications and RF technology have enabled system concept formulations and designs for low-cost radar systems using state-of-the-art software radio modules. One of the major benefits of using these RF communications products is the potential for generating frequency-agile waveforms that are re-programmable in real-time and can potentially adapt to a scattering environment. In addition, recent simulation results [1] indicate that this type of system enables the development and implementation of multi-function RF systems that yield good performance within embedded shared-spectrum environments. This paper investigates the design and implementation of software radar systems using commercially available software radio modules. Specifically, the potential for developing alternative multi-tone radar systems that provide significant levels of information with respect to embedded indoor scattering environments is discussed. This approach is developed via the transform domain waveform synthesis/design and implementation of OFDM (Orthogonal Frequency-Division Multiplexing) waveforms and shows good potential for the future development of cooperative multi-function RF systems.
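The transform-domain multi-tone synthesis mentioned above amounts to an inverse DFT over a chosen set of active subcarriers; a minimal sketch (the tone indices and frame length are illustrative):

```python
import cmath

def synthesize(active_tones, n=64):
    """Inverse-DFT an OFDM-style frame with unit weight on each active tone."""
    return [sum(cmath.exp(2j * cmath.pi * k * t / n) for k in active_tones) / n
            for t in range(n)]

wave = synthesize({3, 7, 11})  # complex baseband samples, length 64
```

Reprogramming the waveform then reduces to changing the set of active tones, which is what makes frequency-agile, shared-spectrum operation attractive on software radio hardware.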

  2. Validation of nuclear models in Geant4 using the dose distribution of a 177 MeV proton pencil beam

    NASA Astrophysics Data System (ADS)

    Hall, David C.; Makarova, Anastasia; Paganetti, Harald; Gottschalk, Bernard

    2016-01-01

    A proton pencil beam is associated with a surrounding low-dose envelope, originating from nuclear interactions. It is important for treatment planning systems to accurately model this envelope when performing dose calculations for pencil beam scanning treatments, and Monte Carlo (MC) codes are commonly used for this purpose. This work aims to validate the nuclear models employed by the Geant4 MC code, by comparing the simulated absolute dose distribution to a recent experiment of a 177 MeV proton pencil beam stopping in water. Striking agreement is observed over five orders of magnitude, with both the shape and normalisation well modelled. The normalisations of two depth dose curves are lower than experiment, though this could be explained by an experimental positioning error. The Geant4 neutron production model is also verified in the distal region. The entrance dose is poorly modelled, suggesting an unaccounted upstream source of low-energy protons. Recommendations are given for a follow-up experiment which could resolve these issues.

  3. Geant4 simulations of proton beam transport through a carbon or beryllium degrader and following a beam line.

    PubMed

    van Goethem, M J; van der Meer, R; Reist, H W; Schippers, J M

    2009-10-01

Monte Carlo simulations based on the Geant4 simulation toolkit were performed for the carbon wedge degrader used in the beam line at the Center of Proton Therapy of the Paul Scherrer Institute (PSI). The simulations are part of the beam line studies for the development and understanding of the GANTRY2 and OPTIS2 treatment facilities at PSI, but can also be applied to other beam lines. The simulated stopping power, momentum distributions at the degrader exit and beam line transmission have been compared to accurate benchmark measurements. Because the beam transport through magnetic elements is not easily modeled using Geant4, a connection to the TURTLE beam line simulation program was made. After adjusting the mean ionization potential of the carbon degrader material from 78 eV to 95 eV, we found an accurate match between simulations and benchmark measurements, so that the simulation model could be validated. We found that the degrader does not completely erase the initial beam phase space even at low degraded beam energies. Using the validation results, we present a study of the usability of beryllium as a degrader material (mean ionization potential 63.7 eV). We found an improvement in the transmission of 30-45%, depending on the degraded beam energy, the higher value for the lower energies. PMID:19741273
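The tuned mean ionization potential I (78 eV to 95 eV for carbon; 63.7 eV for beryllium) enters the stopping power through the logarithmic term of the Bethe formula, so raising I slightly lowers the stopping power. A rough sketch of that sensitivity, with constants simplified and density/shell corrections omitted (purely illustrative, not the Geant4 implementation):

```python
import math

def bethe_log_term(kinetic_MeV, I_eV, proton_mass_MeV=938.272):
    """Logarithmic factor ln(2 m_e c^2 beta^2 gamma^2 / I) for a proton."""
    gamma = 1.0 + kinetic_MeV / proton_mass_MeV
    beta2 = 1.0 - 1.0 / gamma ** 2
    m_e_c2_eV = 0.511e6  # electron rest energy in eV
    return math.log(2.0 * m_e_c2_eV * beta2 * gamma ** 2 / I_eV)

# Raising I from 78 eV to 95 eV reduces the log term by a few percent
# for a 100 MeV proton, i.e. a slightly lower stopping power.
ratio = bethe_log_term(100.0, 95.0) / bethe_log_term(100.0, 78.0)
```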

  4. First GEANT4-based simulation investigation of a Li-coated resistive plate chamber for low-energy neutrons

    NASA Astrophysics Data System (ADS)

    Rhee, J. T.; Jamil, M.; Jeon, Y. J.

    2013-08-01

A simulation study of the performance of a single-gap resistive plate chamber (RPC) coated with a Li layer for the detection of low energy neutrons was performed by means of the GEANT4 Monte Carlo code. Low energy neutrons were detected via the 7Li(n, α)3He nuclear reaction. To make the detector sensitive to low energy neutrons, Li-coating was employed on both the forward and backward electrodes of the converter. Low energy neutrons were transported onto the Li-coated RPC with the GEANT4 MC code. A detector with a converter area of 5×5 cm2 was utilized for this work. The detection response was evaluated as a function of incident neutron energy in the range of 25 MeV-100 MeV. The evaluated results predicted a higher detection response for the backward-coated converter detector than for the forward-coated converter RPC setup. This type of detector can be useful for the detection of low energy neutrons.

  5. Software design of missile integrated test system

    NASA Astrophysics Data System (ADS)

    Dai, Jing; Zhang, Ping; Li, Xingshan; Liao, Canxing; Wang, Zongli

    2006-11-01

A software design approach for a missile integrated test system, based on virtual instruments, is proposed in this paper. The integrated test system software was developed under a modular, intelligent, and structured design, which improves the extensibility of the test software and makes further development and maintenance convenient. The test software is highly automated, and its integrated test environment makes full use of the hardware platform of the missile integrated test system. In response to the specific hardware configuration of the test system and special missile test requirements, the use of test resources was optimized in the test procedure to greatly improve test speed and satisfy the power-on time limit for missile testing. At the same time, by applying multithreading and the hardware clock of a data acquisition card, accurate data acquisition, calculation, and injection can be completed within a millisecond to satisfy demanding missile test requirements. The automatic test equipment can automatically test the nose cabin and control cabin of both a missile and a training missile; all missile test items can be accomplished in a short period of time, enhancing the efficiency and reliability of the test.

  6. Modeling the TrueBeam linac using a CAD to Geant4 geometry implementation: Dose and IAEA-compliant phase space calculations

    SciTech Connect

    Constantin, Magdalena; Perl, Joseph; LoSasso, Tom; Salop, Arthur; Whittum, David; Narula, Anisha; Svatos, Michelle; Keall, Paul J.

    2011-07-15

Purpose: To create an accurate 6 MV Monte Carlo simulation phase space for the Varian TrueBeam treatment head geometry imported from CAD (computer-aided design) without adjusting the input electron phase space parameters. Methods: Geant4 v4.9.2.p01 was employed to simulate the 6 MV beam treatment head geometry of the Varian TrueBeam linac. The electron tracks in the linear accelerator were simulated with Parmela, and the obtained electron phase space was used as an input to the Monte Carlo beam transport and dose calculations. The geometry components are tessellated solids included in Geant4 as GDML (Geometry Description Markup Language) files obtained via STEP (Standard for the Exchange of Product model data) export from Pro/Engineering, followed by STEP import in Fastrad, a STEP-GDML converter. The linac has a compact treatment head and the small space between the shielding collimator and the divergent arc of the upper jaws forbids the implementation of a plane for storing the phase space. Instead, an IAEA (International Atomic Energy Agency) compliant phase space writer was implemented on a cylindrical surface. The simulation was run in parallel on a 1200 node Linux cluster. The 6 MV dose calculations were performed for field sizes varying from 4 x 4 to 40 x 40 cm{sup 2}. The voxel size for the 60x60x40 cm{sup 3} water phantom was 4x4x4 mm{sup 3}. For the 10x10 cm{sup 2} field, surface buildup calculations were performed using 4x4x2 mm{sup 3} voxels within 20 mm of the surface. Results: For the depth dose curves, 98% of the calculated data points agree within 2% with the experimental measurements for depths between 2 and 40 cm. For depths between 5 and 30 cm, agreement within 1% is obtained for 99% (4x4), 95% (10x10), 94% (20x20 and 30x30), and 89% (40x40) of the data points, respectively. In the buildup region, the agreement is within 2%, except at 1 mm depth where the deviation is 5% for the 10x10 cm{sup 2} open field.
For the lateral dose profiles, within the field size
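The agreement figures quoted in the Results above (e.g., 98% of points within 2%) are pass-rate metrics over paired calculated and measured dose points; a minimal sketch with made-up sample values:

```python
def pass_rate(calculated, measured, tol=0.02):
    """Fraction of calculated points within a relative tolerance of measurement."""
    within = sum(1 for c, m in zip(calculated, measured)
                 if abs(c - m) / m <= tol)
    return within / len(measured)

print(pass_rate([1.00, 1.01, 1.05], [1.00, 1.00, 1.00]))  # 2 of 3 points pass
```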

  7. Layered mass geometry: a novel technique to overlay seeds and applicators onto patient geometry in Geant4 brachytherapy simulations.

    PubMed

    Enger, Shirin A; Landry, Guillaume; D'Amours, Michel; Verhaegen, Frank; Beaulieu, Luc; Asai, Makoto; Perl, Joseph

    2012-10-01

    A problem faced by all Monte Carlo (MC) particle transport codes is how to handle overlapping geometries. The Geant4 MC toolkit allows the user to create parallel geometries within a single application. In Geant4 the standard mass-containing geometry is defined in a simulation volume called the World Volume. Separate parallel geometries can be defined in parallel worlds, that is, alternate three dimensional simulation volumes that share the same coordinate system with the World Volume for geometrical event biasing, scoring of radiation interactions, and/or the creation of hits in detailed readout structures. Until recently, only one of those worlds could contain mass so these parallel worlds provided no solution to simplify a complex geometric overlay issue in brachytherapy, namely the overlap of radiation sources and applicators with a CT based patient geometry. The standard method to handle seed and applicator overlay in MC requires removing CT voxels whose boundaries would intersect sources, placing the sources into the resulting void and then backfilling the remaining space of the void with a relevant material. The backfilling process may degrade the accuracy of patient representation, and the geometrical complexity of the technique precludes using fast and memory-efficient coding techniques that have been developed for regular voxel geometries. The patient must be represented by the less memory and CPU-efficient Geant4 voxel placement technique, G4PVPlacement, rather than the more efficient G4NestedParameterization (G4NestedParam). We introduce for the first time a Geant4 feature developed to solve this issue: Layered Mass Geometry (LMG) whereby both the standard (CT based patient geometry) and the parallel world (seeds and applicators) may now have mass. For any area where mass is present in the parallel world, the parallel mass is used. Elsewhere, the mass of the standard world is used. 
With LMG the user no longer needs to remove patient CT voxels that would

  8. Layered mass geometry: a novel technique to overlay seeds and applicators onto patient geometry in Geant4 brachytherapy simulations

    NASA Astrophysics Data System (ADS)

    Enger, Shirin A.; Landry, Guillaume; D'Amours, Michel; Verhaegen, Frank; Beaulieu, Luc; Asai, Makoto; Perl, Joseph

    2012-10-01

    A problem faced by all Monte Carlo (MC) particle transport codes is how to handle overlapping geometries. The Geant4 MC toolkit allows the user to create parallel geometries within a single application. In Geant4 the standard mass-containing geometry is defined in a simulation volume called the World Volume. Separate parallel geometries can be defined in parallel worlds, that is, alternate three dimensional simulation volumes that share the same coordinate system with the World Volume for geometrical event biasing, scoring of radiation interactions, and/or the creation of hits in detailed readout structures. Until recently, only one of those worlds could contain mass so these parallel worlds provided no solution to simplify a complex geometric overlay issue in brachytherapy, namely the overlap of radiation sources and applicators with a CT based patient geometry. The standard method to handle seed and applicator overlay in MC requires removing CT voxels whose boundaries would intersect sources, placing the sources into the resulting void and then backfilling the remaining space of the void with a relevant material. The backfilling process may degrade the accuracy of patient representation, and the geometrical complexity of the technique precludes using fast and memory-efficient coding techniques that have been developed for regular voxel geometries. The patient must be represented by the less memory and CPU-efficient Geant4 voxel placement technique, G4PVPlacement, rather than the more efficient G4NestedParameterization (G4NestedParam). We introduce for the first time a Geant4 feature developed to solve this issue: Layered Mass Geometry (LMG) whereby both the standard (CT based patient geometry) and the parallel world (seeds and applicators) may now have mass. For any area where mass is present in the parallel world, the parallel mass is used. Elsewhere, the mass of the standard world is used. 
With LMG the user no longer needs to remove patient CT voxels that would
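The layered-mass lookup rule quoted in both records above (parallel-world mass wins where present, standard-world mass elsewhere) can be sketched as a two-level lookup; this is purely illustrative and not the Geant4 API:

```python
# Toy model of Layered Mass Geometry material resolution: the parallel
# world (seeds/applicators) overrides the standard world (patient CT)
# wherever it defines mass, with no voxel removal or backfilling needed.
def material_at(point, parallel_world, standard_world):
    return parallel_world.get(point) or standard_world.get(point)

parallel = {(1, 1): "seed titanium"}                      # seed overlay
standard = {(0, 0): "water", (1, 1): "bone"}              # CT voxels

print(material_at((1, 1), parallel, standard))  # seed titanium (override)
print(material_at((0, 0), parallel, standard))  # water (CT fallback)
```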

  9. Software archeology: a case study in software quality assurance and design

    SciTech Connect

    Macdonald, John M; Lloyd, Jane A; Turner, Cameron J

    2009-01-01

    Ideally, quality is designed into software, just as quality is designed into hardware. However, when dealing with legacy systems, demonstrating that the software meets required quality standards may be difficult to achieve. As the need to demonstrate the quality of existing software was recognized at Los Alamos National Laboratory (LANL), an effort was initiated to uncover and demonstrate that legacy software met the required quality standards. This effort led to the development of a reverse engineering approach referred to as software archaeology. This paper documents the software archaeology approaches used at LANL to document legacy software systems. A case study for the Robotic Integrated Packaging System (RIPS) software is included.

  10. j5 DNA assembly design automation software.

    PubMed

    Hillson, Nathan J; Rosengarten, Rafael D; Keasling, Jay D

    2012-01-20

    Recent advances in Synthetic Biology have yielded standardized and automatable DNA assembly protocols that enable a broad range of biotechnological research and development. Unfortunately, the experimental design required for modern scar-less multipart DNA assembly methods is frequently laborious, time-consuming, and error-prone. Here, we report the development and deployment of a web-based software tool, j5, which automates the design of scar-less multipart DNA assembly protocols including SLIC, Gibson, CPEC, and Golden Gate. The key innovations of the j5 design process include cost optimization, leveraging DNA synthesis when cost-effective to do so, the enforcement of design specification rules, hierarchical assembly strategies to mitigate likely assembly errors, and the instruction of manual or automated construction of scar-less combinatorial DNA libraries. Using a GFP expression testbed, we demonstrate that j5 designs can be executed with the SLIC, Gibson, or CPEC assembly methods, used to build combinatorial libraries with the Golden Gate assembly method, and applied to the preparation of linear gene deletion cassettes for E. coli. The DNA assembly design algorithms reported here are generally applicable to broad classes of DNA construction methodologies and could be implemented to supplement other DNA assembly design tools. Taken together, these innovations save researchers time and effort, reduce the frequency of user design errors and off-target assembly products, decrease research costs, and enable scar-less multipart and combinatorial DNA construction at scales unfeasible without computer-aided design. PMID:23651006

  11. COG Software Architecture Design Description Document

    SciTech Connect

    Buck, R M; Lent, E M

    2009-09-21

    This COG Software Architecture Design Description Document describes the organization and functionality of the COG Multiparticle Monte Carlo Transport Code for radiation shielding and criticality calculations, at a level of detail suitable for guiding a new code developer in the maintenance and enhancement of COG. The intended audience also includes managers and scientists and engineers who wish to have a general knowledge of how the code works. This Document is not intended for end-users. This document covers the software implemented in the standard COG Version 10, as released through RSICC and IAEA. Software resources provided by other institutions will not be covered. This document presents the routines grouped by modules and in the order of the three processing phases. Some routines are used in multiple phases. The routine description is presented once - the first time the routine is referenced. Since this is presented at the level of detail for guiding a new code developer, only the routines invoked by another routine that are significant for the processing phase that is being detailed are presented. An index to all routines detailed is included. Tables for the primary data structures are also presented.

  12. Probing Planetary Bodies for Subsurface Volatiles: GEANT4 Models of Gamma Ray, Fast, Epithermal, and Thermal Neutron Response to Active Neutron Illumination

    NASA Astrophysics Data System (ADS)

    Chin, G.; Sagdeev, R.; Su, J. J.; Murray, J.

    2014-12-01

Using an active source of neutrons as an in situ probe of a planetary body has proven to be a powerful tool to extract information about the presence, abundance, and location of subsurface volatiles without the need for drilling. The Dynamic Albedo of Neutrons (DAN) instrument on Curiosity is an example of such an instrument and is designed to detect the location and abundance of hydrogen within the top 50 cm of the Martian surface. DAN works by sending a pulse of neutrons towards the ground beneath the rover and detecting the reflected neutrons. The intensity and time of arrival of the reflection depend on the proportion of water, while the time the pulse takes to reach the detector is a function of the depth at which the water is located. Similar instruments can also be effective probes at the polar regions of the Moon or on asteroids as a way of detecting sequestered volatiles. We present the results of GEANT4 particle simulation models of gamma ray, fast, epithermal, and thermal neutron responses to active neutron illumination. The results are parameterized by hydrogen abundance, stratification and depth of volatile layers, versus the distribution of neutron and gamma ray energy reflections. Models will be presented to approximate Martian, lunar, and asteroid environments and would be useful tools to assess utility for future NASA exploration missions to these types of planetary bodies.

  13. Ray tracing simulations for the wide-field x-ray telescope of the Einstein Probe mission based on Geant4 and XRTG4

    NASA Astrophysics Data System (ADS)

    Zhao, Donghua; Zhang, Chen; Yuan, Weimin; Willingale, Richard; Ling, Zhixing; Feng, Hua; Li, Hong; Ji, Jianfeng; Wang, Wenxin; Zhang, Shuangnan

    2014-07-01

Einstein Probe (EP) is a proposed small scientific satellite dedicated to time-domain astrophysics working in the soft X-ray band. It will discover transients and monitor variable objects in 0.5-4 keV, for which it will employ a very large instantaneous field-of-view (60° × 60°), along with moderate spatial resolution (FWHM ~ 5 arcmin). Its wide-field imaging capability will be achieved by using established technology in novel lobster-eye optics. In this paper, we present Monte-Carlo simulations for the focusing capabilities of EP's Wide-field X-ray Telescope (WXT). The simulations are performed using Geant4 with an X-ray tracer which was developed by cosine (http://cosine.nl/) to trace X-rays. Our work is the first step toward building a comprehensive model with which the design of the X-ray optics and the ultimate sensitivity of the instrument can be optimized by simulating the X-ray tracing and radiation environment of the system, including the focal plane detector and the shielding at the same time.

  14. Technical Note: Improvements in GEANT4 energy-loss model and the effect on low-energy electron transport in liquid water

    SciTech Connect

    Kyriakou, I.; Incerti, S.

    2015-07-15

    Purpose: The GEANT4-DNA physics models are upgraded by a more accurate set of electron cross sections for ionization and excitation in liquid water. The impact of the new developments on low-energy electron transport simulations by the GEANT4 Monte Carlo toolkit is examined for improving its performance in dosimetry applications at the subcellular and nanometer level. Methods: The authors provide an algorithm for an improved implementation of the Emfietzoglou model dielectric response function of liquid water used in the GEANT4-DNA existing model. The algorithm redistributes the imaginary part of the dielectric function to ensure a physically motivated behavior at the binding energies, while retaining all the advantages of the original formulation, e.g., the analytic properties and the fulfillment of the f-sum-rule. In addition, refinements in the exchange and perturbation corrections to the Born approximation used in the GEANT4-DNA existing model are also made. Results: The new ionization and excitation cross sections are significantly different from those of the GEANT4-DNA existing model. In particular, excitations are strongly enhanced relative to ionizations, resulting in higher W-values and less diffusive dose-point-kernels at sub-keV electron energies. Conclusions: An improved energy-loss model for the excitation and ionization of liquid water by low-energy electrons has been implemented in GEANT4-DNA. The suspiciously low W-values and the unphysical long tail in the dose-point-kernel have been corrected owing to a different partitioning of the dielectric function.
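The constraint described above, redistributing the imaginary part of the dielectric function while fulfilling the f-sum rule, can be illustrated on a toy energy grid: shift weight between bins, then rescale so an E-weighted integral (a stand-in for the sum rule) is unchanged. The grid, weights, and shift below are illustrative, not the Emfietzoglou model:

```python
def weighted_integral(energies, values):
    """Trapezoidal integral of E * f(E), a stand-in for the f-sum rule."""
    total = 0.0
    for i in range(len(energies) - 1):
        a = energies[i] * values[i]
        b = energies[i + 1] * values[i + 1]
        total += 0.5 * (a + b) * (energies[i + 1] - energies[i])
    return total

E = [1.0, 2.0, 3.0, 4.0]
f = [0.0, 1.0, 1.0, 0.0]
before = weighted_integral(E, f)
g = [0.0, 1.5, 0.5, 0.0]                   # redistributed weights
scale = before / weighted_integral(E, g)
g = [v * scale for v in g]                 # enforce the sum rule
after = weighted_integral(E, g)            # equals `before` by construction
```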

  15. GEANT4 Simulation of Hadronic Interactions at 8 GeV/c to 10 GeV/c: Response to the HARP-CDP Group

    SciTech Connect

    Uzhinsky, V.; Apostolakis, J.; Folger, G.; Ivanchenko, V.N.; Kossov, M.V.; Wright, D.H.; /SLAC

    2011-11-21

    The results of the HARP-CDP group on the comparison of GEANT4 Monte Carlo predictions with experimental data are discussed. It is shown that the problems observed by the group are caused by an incorrect implementation of old features at the programming level, and by the lack of nucleon Fermi motion in the simulation of quasielastic scattering. These drawbacks are not due to the physical models used, and they do not manifest themselves in the most important applications of the GEANT4 toolkit.

  16. Designing Law-Compliant Software Requirements

    NASA Astrophysics Data System (ADS)

    Siena, Alberto; Mylopoulos, John; Perini, Anna; Susi, Angelo

    New laws, such as HIPAA and SOX, are increasingly impacting the design of software systems, as business organisations strive to comply. This paper studies the problem of generating a set of requirements for a new system that comply with a given law. Specifically, the paper proposes a systematic process for generating law-compliant requirements by using a taxonomy of legal concepts and a set of primitives to describe stakeholders and their strategic goals. Given a model of law and a model of stakeholders' goals, legal alternatives are identified and explored. Strategic goals that can realise legal prescriptions are systematically analysed, and alternative ways of fulfilling a law are evaluated. The approach is demonstrated by means of a case study. This work is part of the Nomos framework, intended to support the design of law-compliant requirements models.

  17. First Results of Saturation Curve Measurements of Heat-Resistant Steel Using GEANT4 and MCNP5 Codes

    NASA Astrophysics Data System (ADS)

    Hoang, Duc-Tam; Tran, Thien-Thanh; Le, Bao-Tran; Tran, Kim-Tuyet; Huynh, Dinh-Chuong; Vo, Hoang-Nguyen; Chau, Van-Tao

    A gamma backscattering technique is applied to determine the saturation curve and the effective mass attenuation coefficient of a material. A NaI(Tl) detector fitted with a large-diameter collimator is modeled by the Monte Carlo technique using both the MCNP5 and GEANT4 codes. The results show good agreement between the response functions of the scattering spectra for the two codes. Based on these spectra, the saturation curve of heat-resistant steel is determined. The results strongly confirm that a large-diameter detector collimator is appropriate for obtaining the scattering spectra, and this work also provides the basis of an experimental set-up for determining the thickness of a material.
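The saturation curve in such backscattering measurements is commonly modelled as I(t) = I_sat·(1 − exp(−μ_eff·t)), where t is the target thickness and μ_eff the effective attenuation coefficient. A minimal sketch of extracting μ_eff from such a curve, using synthetic counts rather than the paper's heat-resistant-steel data, could look like this:

```python
import numpy as np

# Hedged sketch: extracting an effective attenuation coefficient mu_eff from a
# saturation curve I(t) = I_sat * (1 - exp(-mu_eff * t)), the functional form
# commonly used for backscattered intensity versus target thickness. All
# numbers below are synthetic, not the paper's measured or simulated data.

def saturation(t, I_sat, mu):
    return I_sat * (1.0 - np.exp(-mu * t))

rng = np.random.default_rng(0)
mu_true, I_sat = 0.55, 1000.0            # mu in 1/cm, I_sat in counts (arbitrary)
t = np.linspace(0.2, 6.0, 12)            # plate thicknesses in cm
counts = saturation(t, I_sat, mu_true) * rng.normal(1.0, 0.005, t.size)

# Linearize: ln(1 - I/I_sat) = -mu * t, then a least-squares line gives mu_eff.
y = np.log(1.0 - counts / I_sat)
mu_fit = -np.polyfit(t, y, 1)[0]
print(f"mu_eff = {mu_fit:.3f} 1/cm (true {mu_true})")
```

In practice I_sat itself is estimated from the thickest samples; here it is assumed known to keep the sketch to a one-parameter fit.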

  18. Simulation response of B4C-coated PPAC for thermal neutrons using GEANT4 Monte Carlo approach

    NASA Astrophysics Data System (ADS)

    Jamil, M.; Rhee, J. T.; Jeon, Y. J.

    2013-08-01

    In this work we report a technique employed for the detection of thermal neutrons using the parallel plate avalanche counter (PPAC). In order to make the detector sensitive to thermal neutrons, a thin layer of B4C has been coated on the forward electrode of the PPAC configuration. When neutrons strike the converter coating, charged particles are generated via the 10B(n, α)7Li reaction. In this simulation study, thermal neutrons have been simulated using the GEANT4 MC code, and the response of the detector has been evaluated as a function of neutron energy. For a better understanding of the simulation response, the performance of the detector has been evaluated using two different physics lists, QGSP_BIC_HP and QGSP_BERT_HP. The obtained results predict that such a boron-carbide-based PPAC can potentially be utilized for thermal neutron detection. A complete description of the detector configuration and the simulation results is also presented.
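The intrinsic efficiency of such a thin boron converter can be estimated on the back of an envelope from the 10B(n,α) cross section (~3840 b at 0.0253 eV, scaling as 1/v) and the 10B number density of the film. The sketch below assumes a fully 10B-enriched B4C layer at normal incidence and ignores self-absorption of the reaction products, so it is an upper-bound illustration, not the paper's simulated geometry:

```python
import math

# Back-of-envelope capture probability for thermal neutrons in a thin B4C
# converter layer, assuming a fully 10B-enriched film at normal incidence.
# Illustrative only; escape of the alpha/7Li into the gas is not modelled.

SIGMA_TH = 3837e-24      # 10B(n,alpha) cross section at 0.0253 eV, cm^2
E_TH = 0.0253            # reference thermal energy, eV

def capture_probability(thickness_um, energy_ev=E_TH,
                        density=2.52, molar_mass=55.25, b_per_formula=4):
    """P(capture) = 1 - exp(-n * sigma(E) * d), with 1/v cross-section scaling."""
    n_b10 = density / molar_mass * 6.022e23 * b_per_formula  # atoms/cm^3
    sigma = SIGMA_TH * math.sqrt(E_TH / energy_ev)           # 1/v law
    d_cm = thickness_um * 1e-4
    return 1.0 - math.exp(-n_b10 * sigma * d_cm)

print(f"1 um film at thermal: {capture_probability(1.0):.3f}")
```

A 1 μm enriched film captures roughly 4% of incident thermal neutrons, which is why such converter-based counters trade efficiency for simplicity.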

  19. In-beam quality assurance using induced β+ activity in hadrontherapy: a preliminary physical requirements study using Geant4

    NASA Astrophysics Data System (ADS)

    Lestand, L.; Montarou, G.; Force, P.; Pauna, N.

    2012-10-01

    Light- and heavy-ion particle therapy, mainly by means of protons and carbon ions, represents an advantageous treatment modality for deep-seated and/or radioresistant tumours. One in-beam quality assurance principle is based on the detection of secondary particles induced by nuclear fragmentation between projectile and target nuclei. Three different strategies are currently under investigation: prompt γ-ray imaging, proton interaction vertex imaging and in-beam positron emission tomography. Geant4 simulations have been performed first in order to assess the accuracy of some hadronic models in reproducing experimental data. Two different kinds of data have been considered: β+-emitting isotope and prompt γ-ray production rates. On the one hand, simulations reproduce experimental β+-emitting isotope production rates to an accuracy of 24%; moreover, the simulated β+-emitting nuclei production rate as a function of depth reproduces well the peak-to-plateau ratio of the experimental data. On the other hand, by tuning the tolerance factor of the photon evaporation model available in Geant4, we significantly reduce prompt γ-ray production rates until very good agreement is reached with experimental data. We have then estimated the total amount of induced annihilation photons and prompt γ rays for a simple treatment plan of ∼1 physical Gy in a homogeneous equivalent soft tissue tumour (6 cm depth, 4 cm radius and 2 cm wide). During a 45 s irradiation, ∼2 × 10^6 annihilation photon pairs and ∼10^8 single prompt γ rays, with energies ranging from a few keV to 10 MeV, are emitted into a 4π solid angle.

  20. Methodology for Designing Fault-Protection Software

    NASA Technical Reports Server (NTRS)

    Barltrop, Kevin; Levison, Jeffrey; Kan, Edwin

    2006-01-01

    A document describes a methodology for designing fault-protection (FP) software for autonomous spacecraft. The methodology embodies and extends established engineering practices in the technical discipline of Fault Detection, Diagnosis, Mitigation, and Recovery, and has been successfully implemented in the Deep Impact spacecraft, a NASA Discovery mission. Based on established concepts of Fault Monitors and Responses, this FP methodology extends the notions of Opinion, Symptom, Alarm (aka Fault), and Response with numerous new notions, sub-notions, software constructs, and logic and timing gates. For example, a Monitor generates a RawOpinion, which graduates into an Opinion, categorized as no-opinion, acceptable, or unacceptable. RaiseSymptom, ForceSymptom, and ClearSymptom govern the establishment of a Symptom and its mapping to an Alarm (aka Fault). Local Response is distinguished from FP System Response. A 1-to-n and n-to-1 mapping is established among Monitors, Symptoms, and Responses. Responses are categorized by device versus by function. Responses operate in tiers, where the early tiers attempt to resolve the Fault in a localized, step-by-step fashion, relegating more system-level responses to later tiers. Recovery actions are gated by epoch recovery timing, enabling strategy, urgency, a MaxRetry gate, hardware availability, hazardous versus ordinary faults, and many other priority gates. This methodology is systematic and logical, and uses multiple linked tables, parameter files, and recovery command sequences. The credibility of the FP design is proven via a "top-down" fault-tree analysis and a "bottom-up" functional failure-modes-and-effects analysis. Via this process, the mitigation and recovery strategies per Fault Containment Region scope (in width versus depth) the FP architecture.
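The Monitor → Opinion → Symptom/Alarm → tiered-Response chain described above can be sketched as a tiny state machine. The class and category names below follow the abstract's vocabulary, but the persistence threshold and the tier contents are invented purely for illustration:

```python
from enum import Enum

# Minimal sketch of the Monitor -> Opinion -> Alarm -> tiered-Response chain.
# Names follow the abstract's vocabulary (RawOpinion graduating into Opinion,
# responses in tiers); thresholds and tier logic are hypothetical.

class Opinion(Enum):
    NO_OPINION = 0
    ACCEPTABLE = 1
    UNACCEPTABLE = 2

class Monitor:
    def __init__(self, limit, persistence=3):
        self.limit = limit
        self.persistence = persistence   # samples needed to graduate RawOpinion
        self._bad_count = 0

    def sample(self, value):
        """A RawOpinion graduates into an Opinion only after persistent violations."""
        if value is None:
            return Opinion.NO_OPINION
        self._bad_count = self._bad_count + 1 if value > self.limit else 0
        return (Opinion.UNACCEPTABLE if self._bad_count >= self.persistence
                else Opinion.ACCEPTABLE)

def respond(alarm_raised, tier_responses):
    """Walk response tiers: localized fixes first, system-level response last."""
    if not alarm_raised:
        return []
    return [tier() for tier in tier_responses]

mon = Monitor(limit=100.0)
readings = [90, 120, 130, 140]           # sensor trend drifting out of limits
opinions = [mon.sample(r) for r in readings]
alarm = opinions[-1] is Opinion.UNACCEPTABLE
actions = respond(alarm, [lambda: "reset device", lambda: "swap to backup"])
print(opinions[-1], actions)
```

The persistence gate plays the role of the timing gates in the methodology: a single out-of-limits reading does not raise an Alarm, a sustained trend does.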

  1. ClassCompass: A Software Design Mentoring System

    ERIC Educational Resources Information Center

    Coelho, Wesley; Murphy, Gail

    2007-01-01

    Becoming a quality software developer requires practice under the guidance of an expert mentor. Unfortunately, in most academic environments, there are not enough experts to provide any significant design mentoring for software engineering students. To address this problem, we present a collaborative software design tool intended to maximize an…

  2. Designing and Using Software for the Learning Disabled.

    ERIC Educational Resources Information Center

    Weisgerber, Robert A.; Rubin, David P.

    1985-01-01

    Basic principles of effective software implementation with learning disabled students are discussed. A prototype software package is described that is specifically designed to help develop discriminatory skills in recognizing letter shapes and letter combinations. (JW)

  3. PD5: A General Purpose Library for Primer Design Software

    PubMed Central

    Riley, Michael C.; Aubrey, Wayne; Young, Michael; Clare, Amanda

    2013-01-01

    Background Complex PCR applications for large genome-scale projects require fast, reliable and often highly sophisticated primer design software applications. Presently, such applications use pipelining methods to utilise many third party applications and this involves file parsing, interfacing and data conversion, which is slow and prone to error. A fully integrated suite of software tools for primer design would considerably improve the development time, the processing speed, and the reliability of bespoke primer design software applications. Results The PD5 software library is an open-source collection of classes and utilities, providing a complete collection of software building blocks for primer design and analysis. It is written in object-oriented C++ with an emphasis on classes suitable for efficient and rapid development of bespoke primer design programs. The modular design of the software library simplifies the development of specific applications and also integration with existing third party software where necessary. We demonstrate several applications created using this software library that have already proved to be effective, but we view the project as a dynamic environment for building primer design software and it is open for future development by the bioinformatics community. Therefore, the PD5 software library is published under the terms of the GNU General Public License, which guarantee access to source-code and allow redistribution and modification. Conclusions The PD5 software library is downloadable from Google Code and the accompanying Wiki includes instructions and examples: http://code.google.com/p/primer-design PMID:24278254

  4. SU-E-T-531: Performance Evaluation of Multithreaded Geant4 for Proton Therapy Dose Calculations in a High Performance Computing Facility

    SciTech Connect

    Shin, J; Coss, D; McMurry, J; Farr, J; Faddegon, B

    2014-06-01

    Purpose: To evaluate the efficiency of multithreaded Geant4 (Geant4-MT, version 10.0) for proton Monte Carlo dose calculations using a high performance computing facility. Methods: Geant4-MT was used to calculate 3D dose distributions in 1×1×1 mm3 voxels in a water phantom and in a patient's head, with a 150 MeV proton beam covering approximately 5×5 cm2 in the water phantom. Three timestamps were measured on the fly to separately analyze the time required for initialization (which cannot be parallelized), the processing time of individual threads, and the completion time. Scalability of the averaged processing time per thread was calculated as a function of thread number (1, 100, 150, and 200) for both 1 M and 50 M histories. The total memory usage was recorded. Results: Simulations with 50 M histories were fastest with 100 threads, taking approximately 1.3 hours and 6 hours for the water phantom and the CT data, respectively, with better than 1.0% statistical uncertainty. The calculations show 1/N scalability in the event loops for both cases. The gains from parallel calculation started to decrease with 150 threads. Memory usage increases linearly with the number of threads. No critical failures were observed during the simulations. Conclusion: Multithreading in Geant4-MT decreased simulation time for proton dose distribution calculations by factors of 64 and 54 at a near-optimal 100 threads for the water phantom and the patient data, respectively. Further simulations will be done to determine the efficiency at the optimal thread number. Considering the trend of computer architecture development, utilizing Geant4-MT for radiotherapy simulations is an excellent cost-effective alternative to a distributed batch queuing system. However, because the scalability depends strongly on simulation details, i.e., the ratio of the processing time of one event to the waiting time to access the shared event queue, a performance evaluation as described here is recommended.
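The shape of the reported scaling (1/N in the event loop, diminishing returns past ~100 threads) is what a simple Amdahl-type model predicts once a serial initialization term is included. The serial/parallel split below is illustrative, not the study's measured values:

```python
# Simple Amdahl-type timing model matching the abstract's observation that
# initialization cannot be parallelized while the event loop scales as 1/N.
# t_init and t_events are hypothetical numbers, not the measured values.

def wall_time(n_threads, t_init=60.0, t_events=360000.0):
    """Total wall time in seconds: serial init + event loop shared by N threads."""
    return t_init + t_events / n_threads

def speedup(n_threads, **kw):
    return wall_time(1, **kw) / wall_time(n_threads, **kw)

for n in (1, 50, 100, 200):
    print(f"{n:4d} threads: {wall_time(n)/3600.0:6.2f} h, speedup {speedup(n):5.1f}")
```

Even with only 60 s of serial work against 100 h of event processing, the speedup at 100 threads falls measurably below 100, and doubling to 200 threads yields well under twice the gain; real runs add queue-contention effects on top of this.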

  5. Behaviors of the percentage depth dose curves along the beam axis of a phantom filled with different clinical PTO objects, a Monte Carlo Geant4 study

    NASA Astrophysics Data System (ADS)

    EL Bakkali, Jaafar; EL Bardouni, Tarek; Safavi, Seyedmostafa; Mohammed, Maged; Saeed, Mroan

    2016-08-01

    The aim of this work is to assess the capability of the Monte Carlo Geant4 code to reproduce the real percentage depth dose (PDD) curves generated in phantoms that mimic three important clinical treatment situations: lung-slab, bone-slab, and bone-lung-slab geometries. It is hoped that this work will lead to a better understanding of dose distributions in an inhomogeneous medium, and will identify any limitations of the dose calculation algorithm implemented in the Geant4 code. For this purpose, the PDD dosimetric functions associated with the three clinical situations described above were compared to one produced in a homogeneous water phantom. Our results show, firstly, that the Geant4 simulation exhibits errors in the shape of the calculated PDD curve of the first physical test object (PTO), and that it is not able to successfully predict dose values in regions near the boundaries between two different materials. This is due to the electron transport algorithm and is well known as the interface-artifact phenomenon. To deal with this issue, we added and optimized the StepMax parameter in the dose calculation program; consequently, the artifacts due to electron transport almost disappeared. However, the Geant4 simulation becomes painfully slow when we attempt to resolve the electron artifact problems completely by choosing a smaller value of the electron StepMax parameter. After electron transport optimization, our results demonstrate the medium-level capability of the Geant4 code to model dose distributions in clinical PTO objects.

  6. Engineering Software Suite Validates System Design

    NASA Technical Reports Server (NTRS)

    2007-01-01

    EDAptive Computing Inc.'s (ECI) EDAstar engineering software tool suite, created to capture and validate system design requirements, was significantly funded by NASA's Ames Research Center through five Small Business Innovation Research (SBIR) contracts. These programs specifically developed Syscape, used to capture executable specifications of multi-disciplinary systems, and VectorGen, used to automatically generate tests to ensure system implementations meet specifications. According to the company, the VectorGen tests considerably reduce the time and effort required to validate implementation of components, thereby ensuring their safe and reliable operation. EDASHIELD, an additional product offering from ECI, can be used to diagnose, predict, and correct errors after a system has been deployed using EDASTAR-created models. Initial commercialization for EDASTAR included application by a large prime contractor in a military setting, and customers include various branches within the U.S. Department of Defense, industry giants like the Lockheed Martin Corporation, Science Applications International Corporation, and Ball Aerospace and Technologies Corporation, as well as NASA's Langley and Glenn Research Centers.

  7. The Software Design Document: More than a User's Manual.

    ERIC Educational Resources Information Center

    Bowers, Dennis

    1989-01-01

    Discusses the value of creating design documentation for computer software so that it may serve as a model for similar design efforts. Components of the software design document are described, including program flowcharts, graphic representation of screen displays, storyboards, and evaluation procedures. An example is given using HyperCard. (three…

  8. Semiotics as a Basis for Educational Software Design.

    ERIC Educational Resources Information Center

    Oliveira, Osvaldo Luiz de; Baranauskas, Maria Cecilia Calani

    2000-01-01

    Presents a group of semiotic principles for software design and uses them to show how they explain different educational possibilities. Discusses interface design and describes Theater in the Computer, a software environment for children, to illustrate semiotic-based principles of interface design. (Author/LRW)

  9. Measuring the development process: A tool for software design evaluation

    NASA Technical Reports Server (NTRS)

    Moy, S. S.

    1980-01-01

    The design metrics evaluator (DME), a component of an automated software design analysis system, is described. The DME quantitatively evaluates software design attributes. Its use directs attention to areas of a procedure, module, or complete program having a high potential for error.

  10. Design Your Own Instructional Software: It's Easy.

    ERIC Educational Resources Information Center

    Pauline, Ronald F.

    Computer Assisted Instruction (CAI) is, quite simply, an instance in which instructional content activities are delivered via a computer. Many commercially available software programs, although excellent programs, may not be acceptable for each individual teacher's classroom. One way to ensure that software is not only acceptable but also targets…

  11. JPL Facilities and Software for Collaborative Design: 1994 - Present

    NASA Technical Reports Server (NTRS)

    DeFlorio, Paul A.

    2004-01-01

    The viewgraph presentation provides an overview of the history of the JPL Project Design Center (PDC) and, since 2000, the Center for Space Mission Architecture and Design (CSMAD). The discussion includes PDC objectives and scope; mission design metrics; distributed design; a software architecture timeline; facility design principles; optimized design for group work; CSMAD plan view, facility design, and infrastructure; and distributed collaboration tools.

  12. NASA software specification and evaluation system design, part 2

    NASA Technical Reports Server (NTRS)

    1976-01-01

    A survey and analysis of the existing methods, tools and techniques employed in the development of software are presented along with recommendations for the construction of reliable software. Functional designs for software specification language, and the data base verifier are presented.

  13. SEPAC flight software detailed design specifications, volume 1

    NASA Technical Reports Server (NTRS)

    1982-01-01

    The detailed design specifications (as built) for the SEPAC flight software are defined. The design includes a description of the total software system and of each individual module within the system. The design specifications describe the decomposition of the software system into its major components. The system structure is expressed in the following forms: the control-flow hierarchy of the system, the data-flow structure of the system, the task hierarchy, the memory structure, and the software-to-hardware configuration mapping. The component design description includes details on the following elements: register conventions, module (subroutine) invocation, module functions, interrupt servicing, data definitions, and database structure.

  14. Triple GEM detector sensitivity simulations with Geant4 for the CMS Forward Muon Upgrade at CERN LHC

    NASA Astrophysics Data System (ADS)

    Zenoni, Florian; CMS GEM Collaboration

    2015-04-01

    Triple Gas Electron Multiplier (GEM) detectors are being developed for the forward muon upgrade of the CMS experiment in Phase 2 of the CERN LHC. After the second long LHC shutdown, they will be installed as the GE1/1 system in the 1.5 < |η| < 2.2 region of the muon endcap. This upgrade aims at controlling muon level-1 trigger rates, thanks to the detectors' high performance at extreme particle rates (~MHz/cm2). Moreover, the GEM technology can improve the muon track reconstruction and identification capabilities of the forward detector. The Triple GEMs will operate in a hostile radiation background (several hundred Hz/cm2) mostly made of photons, neutrons, electrons and positrons. To understand how this background could affect the detectors' functionality, it is important to know their sensitivity to these kinds of radiation. The goal of this work is to estimate the sensitivity of Triple GEMs to background particles in the CMS cavern environment, using the latest updates of GEANT4, a toolkit for the simulation of the passage of particles through matter.
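In such studies, "sensitivity" is typically the fraction of incident background particles of a given species that produce a detectable hit, with a binomial uncertainty from the finite simulated sample. A minimal sketch, with made-up counts rather than the study's results:

```python
import math

# Hedged sketch of extracting a detector sensitivity from a Geant4-style run:
# the fraction of simulated primaries that produce a hit, with a one-sigma
# binomial uncertainty. The counts below are invented for illustration.

def sensitivity(n_hits, n_primaries):
    """Return (efficiency, one-sigma binomial uncertainty)."""
    eff = n_hits / n_primaries
    err = math.sqrt(eff * (1.0 - eff) / n_primaries)
    return eff, err

eff, err = sensitivity(n_hits=970, n_primaries=1_000_000)  # e.g. a photon sample
print(f"sensitivity = {eff:.2e} +/- {err:.1e}")
```

The binomial error term makes explicit how many primaries must be simulated per species and energy bin to reach a given relative precision on a sub-percent sensitivity.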

  15. GEANT4 simulations for in trap decay spectroscopy for electron capture branching ratio measurements using the TITAN facility

    NASA Astrophysics Data System (ADS)

    Seeraji, Shakil; Andreoiu, C.; Jang, F.; Ma, T.; Chaudhuri, A.; Grossheim, A.; Kwiatkowski, A. A.; Schultz, B. E.; Mane, E.; Gwinner, G.; Dilling, J.; Lennarz, A.; Frekers, D.; Chowdhury, U.; Simon, V. V.; Brunner, T.; Delheij, P.; Simon, M. C.

    2012-10-01

    The TITAN-EC project has developed a unique technique to measure electron capture branching ratios (ECBRs) of short-lived intermediate nuclides involved in double beta decay. The ECBR information is important for the determination of the nuclear matrix elements of both two-neutrino (2νββ) and neutrinoless (0νββ) double beta decay processes. An important feature of this technique is the use of an open-access Penning trap. Radioactive ions are stored in the trap and their decays are observed. Electrons produced from β decay are guided out of the trap by the Penning trap's strong magnetic field, and the x rays from electron capture are detected by seven Si(Li) detectors placed radially around the trap behind thin Be windows. This set-up provides a lower background for the x-ray detection compared to earlier ECBR measurements, in which the beam was implanted in mylar tape. Detailed GEANT4 simulations have been performed to characterize the efficiency of the detectors and understand their response. In addition, the impact of different sizes and shapes of the ion cloud inside the trap has been investigated to optimize the experimental set-up.

  16. Distributions of positron-emitting nuclei in proton and carbon-ion therapy studied with GEANT4.

    PubMed

    Pshenichnov, Igor; Mishustin, Igor; Greiner, Walter

    2006-12-01

    Depth distributions of positron-emitting nuclei in PMMA phantoms are calculated within a Monte Carlo model for heavy-ion therapy (MCHIT) based on the GEANT4 toolkit (version 8.0). The calculated total production rates of (11)C, (10)C and (15)O nuclei are compared with experimental data and with corresponding results of the FLUKA and POSGEN codes. The distributions of e(+) annihilation points are obtained by simulating radioactive decay of unstable nuclei and transporting positrons in the surrounding medium. A finite spatial resolution of the positron emission tomography (PET) is taken into account in a simplified way. Depth distributions of beta(+)-activity as seen by a PET scanner are calculated and compared to available data for PMMA phantoms. The obtained beta(+)-activity profiles are in good agreement with PET data for proton and (12)C beams at energies suitable for particle therapy. The MCHIT capability to predict the beta(+)-activity and dose distributions in tissue-like materials of different chemical composition is demonstrated. PMID:17110773
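The "simplified" treatment of PET resolution mentioned above typically amounts to convolving the simulated depth profile of annihilation points with a Gaussian point-spread function. The sketch below does exactly that on a toy activity profile; the profile and the 8 mm FWHM are illustrative choices, not MCHIT output or the scanner's actual resolution:

```python
import numpy as np

# Sketch of a simplified PET-resolution treatment: a depth profile of
# annihilation points convolved with a Gaussian point-spread function (PSF).
# The activity profile and FWHM below are toy values for illustration.

def smear(depth_mm, profile, fwhm_mm):
    """Convolve an activity-depth profile with a Gaussian PSF of given FWHM."""
    sigma = fwhm_mm / 2.355
    dz = depth_mm[1] - depth_mm[0]
    kernel_z = np.arange(-5 * sigma, 5 * sigma + dz, dz)
    kernel = np.exp(-0.5 * (kernel_z / sigma) ** 2)
    kernel /= kernel.sum()
    return np.convolve(profile, kernel, mode="same")

z = np.arange(0.0, 100.0, 0.5)                           # depth in mm
activity = np.where(np.abs(z - 60.0) < 2.0, 1.0, 0.05)   # toy activity peak
seen = smear(z, activity, fwhm_mm=8.0)                   # assumed resolution
print(f"peak lowered from {activity.max():.2f} to {seen.max():.2f}")
```

The smearing lowers and broadens the activity fall-off near the distal edge, which is why the finite scanner resolution must be folded in before comparing simulated β+-activity profiles to measured PET data.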

  17. Simulation of Cherenkov photons emitted in photomultiplier windows induced by Compton diffusion using the Monte Carlo code GEANT4.

    PubMed

    Thiam, C; Bobin, C; Bouchard, J

    2010-01-01

    The implementation of the TDCR method (Triple to Double Coincidence Ratio) is based on a liquid scintillation system which comprises three photomultipliers; at LNHB, this counter can also be used in the beta-channel of a 4pi(LS)beta-gamma coincidence counting equipment. It is generally considered that the gamma-sensitivity of the liquid scintillation detector comes from the interaction of the gamma-photons in the scintillation cocktail, but when solid gamma-ray emitting sources are introduced instead of the scintillation vial, light emitted by the surroundings of the counter is observed. The explanation proposed in this article is that this effect comes from the emission of Cherenkov photons induced by Compton scattering in the photomultiplier windows. In order to support this assertion, the creation and propagation of Cherenkov photons inside the TDCR counter is simulated using the Monte Carlo code GEANT4. Stochastic calculations of double coincidences confirm the hypothesis of Cherenkov light produced in the photomultiplier windows. PMID:20031429
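The physics behind this hypothesis is easy to check: an electron emits Cherenkov light when β > 1/n, and in glass that threshold is only a few hundred keV of kinetic energy, well within reach of Compton electrons from common gamma emitters. The refractive index below is a typical value for borosilicate PMT window glass, not a measured property of the actual counter:

```python
import math

# Why Compton electrons in a PMT window can emit Cherenkov light: the
# threshold kinetic energy for an electron in glass is only ~175 keV.
# n = 1.5 is a typical refractive index assumed for the window glass.

M_E = 0.511  # electron rest energy, MeV

def cherenkov_threshold(n):
    """Kinetic energy (MeV) at which beta = 1/n, i.e. Cherenkov emission starts."""
    gamma = 1.0 / math.sqrt(1.0 - 1.0 / n**2)
    return M_E * (gamma - 1.0)

print(f"threshold in n=1.5 glass: {cherenkov_threshold(1.5)*1000:.0f} keV")
```

For comparison, a 662 keV gamma ray can transfer up to about 478 keV to a Compton electron, comfortably above this threshold, so gamma interactions in the windows can plausibly produce the observed light.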

  18. Bias and design in software specifications

    NASA Technical Reports Server (NTRS)

    Straub, Pablo A.; Zelkowitz, Marvin V.

    1990-01-01

    Implementation bias in a specification is an arbitrary constraint in the solution space. Presented here is a model of bias in software specifications. Bias is defined in terms of the specification process and a classification of the attributes of the software product. Our definition of bias provides insight into both the origin and the consequences of bias. It also shows that bias is relative and essentially unavoidable. Finally, we describe current work on defining a measure of bias, formalizing our model, and relating bias to software defects.

  19. Analysis of the track- and dose-averaged LET and LET spectra in proton therapy using the geant4 Monte Carlo code

    PubMed Central

    Guan, Fada; Peeler, Christopher; Bronk, Lawrence; Geng, Changran; Taleei, Reza; Randeniya, Sharmalee; Ge, Shuaiping; Mirkovic, Dragan; Grosshans, David; Mohan, Radhe; Titt, Uwe

    2015-01-01

    Purpose: The motivation of this study was to find and eliminate the cause of errors in dose-averaged linear energy transfer (LET) calculations from therapeutic protons in small targets, such as biological cell layers, calculated using the Geant4 Monte Carlo code. Furthermore, the purpose was also to provide a recommendation to select an appropriate LET quantity from Geant4 simulations to correlate with biological effectiveness of therapeutic protons. Methods: The authors developed a particle-tracking-step-based strategy to calculate the average LET quantities (track-averaged LET, LETt, and dose-averaged LET, LETd) using Geant4 for different tracking step size limits. A step size limit refers to the maximally allowable tracking step length. The authors investigated how the tracking step size limit influenced the calculated LETt and LETd of protons with six different step limits ranging from 1 to 500 μm in a water phantom irradiated by a 79.7-MeV clinical proton beam. In addition, the authors analyzed the detailed stochastic energy deposition information, including fluence spectra and dose spectra of the energy-deposition-per-step of protons. As a reference, the authors also calculated the averaged LET and analyzed the LET spectra combining the Monte Carlo method and the deterministic method. Relative biological effectiveness (RBE) calculations were performed to illustrate the impact of different LET calculation methods on the RBE-weighted dose. Results: Simulation results showed that the step limit effect was small for LETt but significant for LETd. This resulted from differences in the energy-deposition-per-step between the fluence spectra and dose spectra at different depths in the phantom. Using the Monte Carlo particle tracking method in Geant4 can result in incorrect LETd calculation results in the dose plateau region for small step limits. The erroneous LETd results can be attributed to the algorithm to determine fluctuations in energy deposition along the
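The two averages at issue are defined per tracking step: with energy deposit ε and step length l, the track average weights each step equally, LETt = ⟨ε/l⟩, while the dose average weights each step by its deposit, LETd = Σ(ε·ε/l)/Σε. A small sketch with synthetic step samples (in the paper these come from Geant4 tracking steps) shows why LETd is the larger and the more fluctuation-sensitive of the two:

```python
import numpy as np

# Sketch of track- and dose-averaged LET computed from per-step
# (energy deposit, step length) samples. The sampled values are synthetic;
# in the paper they come from Geant4 particle-tracking steps.

def let_track_and_dose(eps_kev, steps_um):
    """LET_t = <eps/l>; LET_d = sum(eps * eps/l) / sum(eps), in keV/um."""
    eps = np.asarray(eps_kev, float)
    l = np.asarray(steps_um, float)
    let = eps / l
    let_t = let.mean()                       # fluence (track) average
    let_d = np.sum(eps * let) / np.sum(eps)  # dose average, weights = deposit
    return let_t, let_d

rng = np.random.default_rng(1)
steps = np.full(10000, 1.0)                        # fixed 1 um tracking steps
eps = rng.gamma(shape=2.0, scale=0.4, size=10000)  # toy energy deposits, keV
let_t, let_d = let_track_and_dose(eps, steps)
print(f"LET_t = {let_t:.2f} keV/um, LET_d = {let_d:.2f} keV/um")
```

Because LETd is effectively E[ε²]/E[ε] per unit length, it is driven by the width of the energy-deposition-per-step spectrum, which itself depends on the step size limit; this is the mechanism behind the step-limit sensitivity reported above.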

  20. WinTRAX: A raytracing software package for the design of multipole focusing systems

    NASA Astrophysics Data System (ADS)

    Grime, G. W.

    2013-07-01

    The software package TRAX was a simulation tool for modelling the path of charged particles through linear cylindrical multipole fields described by analytical expressions and was a development of the earlier OXRAY program (Grime and Watt, 1983; Grime et al., 1982) [1,2]. In a 2005 comparison of raytracing software packages (Incerti et al., 2005) [3], TRAX/OXRAY was compared with Geant4 and Zgoubi and was found to give close agreement with the more modern codes. TRAX was a text-based program which was only available for operation in a now rare VMS workstation environment, so a new program, WinTRAX, has been developed for the Windows operating system. This implements the same basic computing strategy as TRAX, and key sections of the code are direct translations from FORTRAN to C++, but the Windows environment is exploited to make an intuitive graphical user interface which simplifies and enhances many operations including system definition and storage, optimisation, beam simulation (including with misaligned elements) and aberration coefficient determination. This paper describes the program and presents comparisons with other software and real installations.
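First-order raytracing through cylindrical multipoles of the kind WinTRAX models reduces, for a quadrupole's focusing plane, to the standard transfer matrix with elements cos(√k·L) and sin(√k·L)/√k. The sketch below traces one ray through a single quadrupole and a drift; it is a toy stand-in with invented strength and lengths, not WinTRAX's algorithm, which also handles aberrations and misaligned elements:

```python
import math

# Minimal first-order raytracing through one focusing quadrupole plus a drift,
# using standard transfer matrices; k and L values are illustrative only.

def quad_focusing(k, L):
    """2x2 matrix for the focusing plane of a quadrupole, strength k in 1/m^2."""
    w = math.sqrt(k)
    return [[math.cos(w * L), math.sin(w * L) / w],
            [-w * math.sin(w * L), math.cos(w * L)]]

def drift(L):
    return [[1.0, L], [0.0, 1.0]]

def apply(m, ray):
    """Propagate a (position x, angle x') ray through one element."""
    x, xp = ray
    return (m[0][0] * x + m[0][1] * xp, m[1][0] * x + m[1][1] * xp)

ray = (0.001, 0.0)                 # 1 mm off-axis, initially parallel to axis
ray = apply(quad_focusing(k=4.0, L=0.3), ray)
ray = apply(drift(1.0), ray)
print(f"x = {ray[0]*1000:.3f} mm, x' = {ray[1]*1000:.3f} mrad")
```

The matrices are unimodular (det = 1), reflecting phase-space conservation; codes like WinTRAX go beyond this linear picture by integrating rays through realistic field profiles.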

  1. [Research progress of probe design software of oligonucleotide microarrays].

    PubMed

    Chen, Xi; Wu, Zaoquan; Liu, Zhengchun

    2014-02-01

    DNA microarrays have become an essential medical genetic diagnostic tool owing to their high throughput, miniaturization and automation. The design and selection of oligonucleotide probes are critical for preparing gene chips of high quality. Several probe design software packages have been developed and are now available to perform this work. Each package targets different kinds of sequences and shows different advantages and limitations. In this article, the research and development of these packages are reviewed against three main criteria: specificity, sensitivity and melting temperature (Tm). In addition, based on experimental results from the literature, the packages are classified according to their applications. This review should help users to choose an appropriate probe-design package. It may also reduce the costs of microarrays, improve their application efficiency, and promote both the research and development (R&D) and the commercialization of high-performance probe design software. PMID:24804514
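Of the three criteria, melting temperature is the easiest to illustrate: two classic rule-of-thumb estimates are the Wallace rule (for short oligos) and a GC-content formula (for longer ones). Both are rough first-pass filters, not the nearest-neighbor thermodynamic models a production probe-design package would use; the example sequence is arbitrary:

```python
# Two quick melting-temperature (Tm) estimates often used as first-pass
# filters in probe/primer design; rough rules of thumb, not the
# nearest-neighbor thermodynamics used by full design packages.

def tm_wallace(seq):
    """Wallace rule, reasonable for short (<14 nt) oligos: 2(A+T) + 4(G+C)."""
    s = seq.upper()
    return 2 * (s.count("A") + s.count("T")) + 4 * (s.count("G") + s.count("C"))

def tm_gc(seq):
    """GC-content formula for longer oligos: 64.9 + 41*(GC - 16.4)/N."""
    s = seq.upper()
    gc = s.count("G") + s.count("C")
    return 64.9 + 41.0 * (gc - 16.4) / len(s)

primer = "AGCGGATAACAATTTCACACAGGA"   # an example 24-nt sequence
print(f"Wallace: {tm_wallace(primer)} C, GC formula: {tm_gc(primer):.1f} C")
```

Design software applies such estimates (and stricter thermodynamic ones) to keep all probes on a chip within a narrow Tm window, so that one hybridization temperature works for the whole array.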

  2. Ultrarelativistic electrons in the Van Allen belts: RPS observations and Geant4 simulations

    NASA Astrophysics Data System (ADS)

    Looper, M. D.; Mazur, J. E.; O'Brien, T. P., III; Blake, J. B.; George, J. S.

    2015-12-01

    The Relativistic Proton Spectrometer (RPS) aboard the Van Allen Probes spacecraft is designed to measure protons from about 60 MeV to multiple GeV, but it is also sensitive to electrons above several MeV. Its Cherenkov subsystem provides energy resolution for protons above a few hundred MeV, and electrons at extremely high energies, around 50 MeV and above, can also produce high levels of Cherenkov light. While mapping protons in the inner Van Allen Belt with RPS, Mazur et al. (Fall 2014 AGU meeting, paper SM22A-02) observed a concentration of particle events around L = 2 with Cherenkov light corresponding to protons at energies well above the limit for stable trapping there. We present a preliminary analysis that shows that the patterns of the Cherenkov light distribution are consistent with these particle events instead being caused by electrons at energies of at least several tens of MeV. This energy range is well above that expected from magnetospheric energization, even by a violent event like the March 1991 shock, which injected electrons peaked around 15 MeV (Looper et al., GRL 1994, doi:10.1029/94GL01586). We discuss the possibility that these electrons are instead due to the decay of pions and muons produced by cosmic-ray interactions with the atmosphere, with a characteristic energy set by the pion rest mass of 140 MeV.

  3. Integrating Model-Based Verification into Software Design Education

    ERIC Educational Resources Information Center

    Yilmaz, Levent; Wang, Shuo

    2005-01-01

    Proper design analysis is indispensable to assure quality and reduce emergent costs due to faulty software. Teaching proper design verification skills early during pedagogical development is crucial, as such analysis is the only tractable way of resolving software problems early when they are easy to fix. The premise of the presented strategy is…

  4. Coupling of Geant4-DNA physics models into the GATE Monte Carlo platform: Evaluation of radiation-induced damage for clinical and preclinical radiation therapy beams

    NASA Astrophysics Data System (ADS)

    Pham, Q. T.; Anne, A.; Bony, M.; Delage, E.; Donnarieix, D.; Dufaure, A.; Gautier, M.; Lee, S. B.; Micheau, P.; Montarou, G.; Perrot, Y.; Shin, J. I.; Incerti, S.; Maigne, L.

    2015-06-01

    The GATE Monte Carlo simulation platform, based on the Geant4 toolkit, is in constant improvement for dosimetric calculations. In this paper, we present the integration of Geant4-DNA processes into the GATE 7.0 platform with the objective of performing multi-scale simulations (from the macroscopic scale down to the nanometer scale). We simulated three types of clinical and preclinical beams: a 6 MeV clinical electron beam, an X-ray irradiator beam and a clinical proton beam, for which we validated depth-dose distributions against measurements in water. Frequencies of energy depositions and DNA damage were evaluated using a specific algorithm in charge of allocating energy depositions to the atoms constituting DNA molecules represented by their PDB (Protein Data Bank) description.

  5. Dose distribution in water for monoenergetic photon point sources in the energy range of interest in brachytherapy: Monte Carlo simulations with PENELOPE and GEANT4

    NASA Astrophysics Data System (ADS)

    Almansa, Julio F.; Guerrero, Rafael; Al-Dweri, Feras M. O.; Anguiano, Marta; Lallena, Antonio M.

    2007-05-01

    Monte Carlo calculations using the codes PENELOPE and GEANT4 have been performed to characterize the dosimetric properties of monoenergetic photon point sources in water. The dose rate in water has been calculated for energies of interest in brachytherapy, ranging between 10 keV and 2 MeV. A comparison of the results obtained using the two codes with the available data calculated with other Monte Carlo codes is carried out. A χ2-like statistical test is proposed for these comparisons. PENELOPE and GEANT4 show a reasonable agreement for all energies analyzed and distances to the source larger than 1 cm. Significant differences are found at distances from the source up to 1 cm. A similar situation occurs between PENELOPE and EGS4.
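    One plausible form of such a χ2-like statistic is the squared difference over the summed variances, averaged over the paired radial dose points; a minimal sketch with invented numbers (the authors' exact definition may differ):

```python
def chi2_per_point(d1, s1, d2, s2):
    """Mean of (d1 - d2)^2 / (s1^2 + s2^2) over paired dose points."""
    terms = [(a - b) ** 2 / (sa ** 2 + sb ** 2)
             for a, sa, b, sb in zip(d1, s1, d2, s2)]
    return sum(terms) / len(terms)

# Illustrative dose rates (arbitrary units) with 1-sigma MC uncertainties:
dose_a = [10.0, 5.2, 2.7, 1.4]
sig_a  = [0.1, 0.05, 0.03, 0.02]
dose_b = [10.1, 5.15, 2.72, 1.38]
sig_b  = [0.1, 0.05, 0.03, 0.02]
print(round(chi2_per_point(dose_a, sig_a, dose_b, sig_b), 3))  # 0.431
```

    Values near 1 indicate that the two codes are statistically compatible at the compared points, which is the kind of conclusion the abstract draws for distances beyond 1 cm.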

  6. Determination of age specific ¹³¹I S-factor values for thyroid using anthropomorphic phantom in Geant4 simulations.

    PubMed

    Rahman, Ziaur; Ahmad, Syed Bilal; Mirza, Sikander M; Arshed, Waheed; Mirza, Nasir M; Ahmed, Waheed

    2014-08-01

    Using an anthropomorphic phantom in Geant4, the β- and γ-absorbed fractions and the energy absorbed per event due to (131)I activity in the thyroid have been determined for individuals of various age groups and for several geometrical models. For (131)I β-particles, the absorbed fraction increased from 0.88 to 0.97 with fetal age. The maximum difference in absorbed energy per decay between soft tissue and water is 7.2% for γ-rays and 0.4% for β-particles. The new mathematical MIRD model embedded in Geant4 (MEG) and the two-lobe ellipsoidal model developed in this work yield S-factor values 4.3% and 2.9% lower, respectively, than the ORNL data. PMID:24681428

  7. Comparison of nanodosimetric parameters of track structure calculated by the Monte Carlo codes Geant4-DNA and PTra

    NASA Astrophysics Data System (ADS)

    Lazarakis, P.; Bug, M. U.; Gargioni, E.; Guatelli, S.; Rabus, H.; Rosenfeld, A. B.

    2012-03-01

    The concept of nanodosimetry is based on the assumption that initial damage to cells is related to the number of ionizations (the ionization cluster size) directly produced by single particles within, or in the close vicinity of, short segments of DNA. The ionization cluster-size distribution and other nanodosimetric quantities, however, are not directly measurable in biological targets and our current knowledge is mostly based on numerical simulations of particle tracks in water, calculating track structure parameters for nanometric target volumes. The assessment of nanodosimetric quantities derived from particle-track calculations using different Monte Carlo codes plays, therefore, an important role for a more accurate evaluation of the initial damage to cells and, as a consequence, of the biological effectiveness of ionizing radiation. The aim of this work is to assess the differences in the calculated nanodosimetric quantities obtained with Geant4-DNA as compared to those of the ad hoc particle-track Monte Carlo code ‘PTra’ developed at Physikalisch-Technische Bundesanstalt (PTB), Germany. The comparison of the two codes was made for incident electrons of energy in the range between 50 eV and 10 keV, for protons of energy between 300 keV and 10 MeV, and for alpha particles of energy between 1 and 10 MeV as these were the energy ranges available in both codes at the time this investigation was carried out. Good agreement was found for nanodosimetric characteristics of track structure calculated in the high-energy range of each particle type. For lower energies, significant differences were observed, most notably in the estimates of the biological effectiveness. The largest relative differences obtained were over 50%; however, generally the order of magnitude was between 10% and 20%.

  9. GATE as a GEANT4-based Monte Carlo platform for the evaluation of proton pencil beam scanning treatment plans

    NASA Astrophysics Data System (ADS)

    Grevillot, L.; Bertrand, D.; Dessy, F.; Freud, N.; Sarrut, D.

    2012-07-01

    Active scanning delivery systems take full advantage of ion beams to best conform to the tumor and to spare surrounding healthy tissues; however, it is also a challenging technique for quality assurance. In this perspective, we upgraded the GATE/GEANT4 Monte Carlo platform in order to recalculate the treatment planning system (TPS) dose distributions for active scanning systems. A method that allows evaluating the TPS dose distributions with the GATE Monte Carlo platform has been developed and applied to the XiO TPS (Elekta), for the IBA proton pencil beam scanning (PBS) system. First, we evaluated the specificities of each dose engine. A dose-conversion scheme that allows one to convert dose to medium into dose to water was implemented within GATE. Specific test cases in homogeneous and heterogeneous configurations allowed for the estimation of the differences between the beam models implemented in XiO and GATE. Finally, dose distributions of a prostate treatment plan were compared. In homogeneous media, a satisfactory agreement was generally obtained between XiO and GATE. The maximum stopping power difference of 3% occurred in a human tissue of 0.9 g cm-3 density and led to a significant range shift. Comparisons in heterogeneous configurations pointed out the limits of the TPS dose calculation accuracy and the superiority of Monte Carlo simulations. The necessity of computing dose to water in our Monte Carlo code for comparisons with TPSs is also presented. Finally, the new capabilities of the platform are applied to a prostate treatment plan and dose differences between both dose engines are analyzed in detail. This work presents a generic method to compare TPS dose distributions with the GATE Monte Carlo platform. It is noteworthy that GATE is also a convenient tool for imaging applications, therefore opening new research possibilities for the PBS modality.
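    The dose-conversion scheme can be sketched as a per-voxel rescaling of dose to medium by the water-to-medium mass stopping-power ratio; the doses and ratios below are illustrative placeholders, not GATE output or tabulated data:

```python
def dose_to_water(dose_medium, sp_ratio_w_over_m):
    """Apply D_w = D_m * (S/rho)_w / (S/rho)_m voxel by voxel."""
    return [d * r for d, r in zip(dose_medium, sp_ratio_w_over_m)]

voxel_dose = [2.0, 1.5, 0.8]     # Gy, dose to medium (illustrative)
ratios     = [1.00, 1.03, 0.97]  # water/medium stopping-power ratios (illustrative)
print(dose_to_water(voxel_dose, ratios))
```

    For comparisons with a TPS that reports dose to water, every voxel of the Monte Carlo dose-to-medium map is rescaled this way before the two distributions are differenced.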

  10. Use of the GEANT4 Monte Carlo to determine three-dimensional dose factors for radionuclide dosimetry

    NASA Astrophysics Data System (ADS)

    Amato, Ernesto; Italiano, Antonio; Minutoli, Fabio; Baldari, Sergio

    2013-04-01

    Voxel-level dosimetry is the simplest and most common approach to the internal dosimetry of nonuniform activity distributions within the human body. The aim of this work was to obtain the voxel-level dose "S" factors (mGy/(MBq s)) for eight beta and beta-gamma emitting radionuclides commonly used in diagnostic and therapeutic nuclear medicine procedures. We developed a GEANT4 Monte Carlo simulation of a region of ICRP-defined soft tissue, divided into 11×11×11 cubic voxels of 3 mm side. The simulation used the parameterizations of the electromagnetic interaction optimized for low energy (EEDL, EPDL). The decay of each radionuclide (32P, 90Y, 99mTc, 177Lu, 131I, 153Sm, 186Re, 188Re) was simulated homogeneously distributed within the central voxel (0,0,0), and the energy deposited in the surrounding voxels was averaged over the eight octants of three-dimensional space, for reasons of symmetry. The results were compared with those available in the literature. While the iodine deviations remain within 16%, for phosphorus, a pure beta emitter, the agreement is very good for the self-dose voxel (0,0,0) and good for the dose to first neighbors, while differences ranging from -60% to +100% are observed for voxels far from the source. The existence of significant percentage differences in the calculated voxel S factors, especially for pure beta emitters such as 32P or 90Y, has already been highlighted by other authors. These data can usefully extend the voxel-based dosimetric approach to radionuclides not covered in the available literature.
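    The octant averaging described above can be sketched as follows: with the source voxel at (0,0,0), deposits at (±i, ±j, ±k) are averaged, reducing Monte Carlo noise by exploiting the mirror symmetry of the kernel. A minimal sketch with invented deposits; voxels absent from the input are counted as zero.

```python
from itertools import product

def octant_average(edep):
    """Average energy deposits over the 8 sign combinations of (i, j, k)."""
    out, seen = {}, set()
    for (i, j, k) in edep:
        key = (abs(i), abs(j), abs(k))
        if key in seen:
            continue
        seen.add(key)
        vals = [edep.get((si * key[0], sj * key[1], sk * key[2]), 0.0)
                for si, sj, sk in product((1, -1), repeat=3)]
        # On-axis voxels repeat among the 8 sign triples, but both mirrors
        # repeat equally often, so the mean stays unbiased.
        out[key] = sum(vals) / len(vals)
    return out

edep = {(1, 0, 0): 4.0, (-1, 0, 0): 2.0, (0, 0, 0): 10.0}
avg = octant_average(edep)
print(avg[(1, 0, 0)], avg[(0, 0, 0)])  # 3.0 10.0
```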

  11. Sensitivity analysis for liver iron measurement through neutron stimulated emission computed tomography: a Monte Carlo study in GEANT4

    NASA Astrophysics Data System (ADS)

    Agasthya, G. A.; Harrawood, B. C.; Shah, J. P.; Kapadia, A. J.

    2012-01-01

    Neutron stimulated emission computed tomography (NSECT) is being developed as a non-invasive imaging modality to detect and quantify iron overload in the human liver. NSECT uses gamma photons emitted by the inelastic interaction between monochromatic fast neutrons and iron nuclei in the body to detect and quantify the disease. Previous simulated and physical experiments with phantoms have shown that NSECT has the potential to accurately diagnose iron overload with reasonable levels of radiation dose. In this work, we describe the results of a simulation study conducted to determine the sensitivity of the NSECT system for hepatic iron quantification in patients of different sizes. A GEANT4 simulation of the NSECT system was developed with a human liver and two torso sizes corresponding to small and large patients. The iron concentration in the liver ranged between 0.5 and 20 mg g-1 (all iron concentrations with units mg g-1 refer to wet-weight concentrations), corresponding to clinically reported iron levels in iron-overloaded patients. High-purity germanium gamma detectors were simulated to detect the emitted gamma spectra, which were background corrected using suitable water phantoms and analyzed to determine the minimum detectable level (MDL) of iron and the sensitivity of the NSECT system. These analyses indicate that for a small patient (torso major axis = 30 cm) the MDL is 0.5 mg g-1 and the sensitivity is ˜13 ± 2 Fe counts/mg/mSv, while for a large patient (torso major axis = 40 cm) the values are 1 mg g-1 and ˜5 ± 1 Fe counts/mg/mSv, respectively. The results demonstrate that the MDL for both patient sizes lies within the clinically significant range for human iron overload.
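    One common way to turn a background count into a minimum detectable level is Currie's detection limit, L_D = 2.71 + 4.65·sqrt(N_B); the abstract does not state which convention the authors used, and the background counts and dose below are assumptions for illustration only.

```python
import math

def currie_detection_limit(n_background: float) -> float:
    """Currie detection limit in counts: L_D = 2.71 + 4.65 * sqrt(N_B)."""
    return 2.71 + 4.65 * math.sqrt(n_background)

def min_detectable_iron(n_background, fe_counts_per_mg_per_msv, dose_msv):
    """Convert the count limit into a minimum detectable iron mass (mg)."""
    return currie_detection_limit(n_background) / (fe_counts_per_mg_per_msv * dose_msv)

# Small-patient sensitivity from the abstract (~13 Fe counts/mg/mSv), with an
# assumed background of 1000 counts and an assumed 20 mSv dose:
print(round(min_detectable_iron(1000.0, 13.0, 20.0), 2))  # ~0.58
```

    Dividing the count threshold by the calibrated sensitivity times dose is what links the reported counts/mg/mSv figures to a concentration-level MDL.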

  12. Benchmarking and validation of a Geant4-SHADOW Monte Carlo simulation for dose calculations in microbeam radiation therapy.

    PubMed

    Cornelius, Iwan; Guatelli, Susanna; Fournier, Pauline; Crosbie, Jeffrey C; Sanchez Del Rio, Manuel; Bräuer-Krisch, Elke; Rosenfeld, Anatoly; Lerch, Michael

    2014-05-01

    Microbeam radiation therapy (MRT) is a synchrotron-based radiotherapy modality that uses high-intensity beams of spatially fractionated radiation to treat tumours. The rapid evolution of MRT towards clinical trials demands accurate treatment planning systems (TPS), as well as independent tools for the verification of TPS calculated dose distributions in order to ensure patient safety and treatment efficacy. Monte Carlo computer simulation represents the most accurate method of dose calculation in patient geometries and is best suited for the purpose of TPS verification. A Monte Carlo model of the ID17 biomedical beamline at the European Synchrotron Radiation Facility has been developed, including recent modifications, using the Geant4 Monte Carlo toolkit interfaced with the SHADOW X-ray optics and ray-tracing libraries. The code was benchmarked by simulating dose profiles in water-equivalent phantoms subject to irradiation by broad-beam (without spatial fractionation) and microbeam (with spatial fractionation) fields, and comparing against those calculated with a previous model of the beamline developed using the PENELOPE code. Validation against additional experimental dose profiles in water-equivalent phantoms subject to broad-beam irradiation was also performed. Good agreement between codes was observed, with the exception of out-of-field doses and toward the field edge for larger field sizes. Microbeam results showed good agreement between both codes and experimental results within uncertainties. Results of the experimental validation showed agreement for different beamline configurations. The asymmetry in the out-of-field dose profiles due to polarization effects was also investigated, yielding important information for the treatment planning process in MRT. This work represents an important step in the development of a Monte Carlo-based independent verification tool for treatment planning in MRT. PMID:24763641

  13. Designing the Undesignable: Social Software and Control

    ERIC Educational Resources Information Center

    Dron, Jon

    2007-01-01

    Social software, such as blogs, wikis, tagging systems and collaborative filters, treats the group as a first-class object within the system. Drawing from theories of transactional distance and control, this paper proposes a model of e-learning that extends traditional concepts of learner-teacher-content interactions to include these emergent…

  14. The waveform correlation event detection system global prototype software design

    SciTech Connect

    Beiriger, J.I.; Moore, S.G.; Trujillo, J.R.; Young, C.J.

    1997-12-01

    The WCEDS prototype software system was developed to investigate the usefulness of waveform correlation methods for CTBT monitoring. The WCEDS prototype performs global seismic event detection and has been used in numerous experiments. This report documents the software system design, presenting an overview of the system operation, describing the system functions, tracing the information flow through the system, discussing the software structures, and describing the subsystem services and interactions. The effectiveness of the software design in meeting project objectives is considered, as well as opportunities for code reuse and lessons learned from the development process. The report concludes with recommendations for modifications and additions envisioned for a regional waveform-correlation-based detector.

  15. Comparison of PHITS, GEANT4, and HIBRAC simulations of depth-dependent yields of β+-emitting nuclei during therapeutic particle irradiation to measured data

    NASA Astrophysics Data System (ADS)

    Rohling, Heide; Sihver, Lembit; Priegnitz, Marlen; Enghardt, Wolfgang; Fiedler, Fine

    2013-09-01

    For quality assurance in particle therapy, a non-invasive, in vivo range verification is highly desired. Particle therapy positron-emission-tomography (PT-PET) is the only clinically proven method up to now for this purpose. It makes use of the β+-activity produced during the irradiation by the nuclear fragmentation processes between the therapeutic beam and the irradiated tissue. Since a direct comparison of β+-activity and dose is not feasible, a simulation of the expected β+-activity distribution is required. For this reason it is essential to have a quantitatively reliable code for the simulation of the yields of the β+-emitting nuclei at every position of the beam path. In this paper results of the three-dimensional Monte-Carlo simulation codes PHITS, GEANT4, and the one-dimensional deterministic simulation code HIBRAC are compared to measurements of the yields of the most abundant β+-emitting nuclei for carbon, lithium, helium, and proton beams. In general, PHITS underestimates the yields of positron-emitters. With GEANT4 the overall most accurate results are obtained. HIBRAC and GEANT4 provide comparable results for carbon and proton beams. HIBRAC is considered as a good candidate for the implementation to clinical routine PT-PET.
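    Once the per-nuclide positron-emitter yields have been simulated, the β+-activity at a point along the beam path follows from simple radioactive decay, A(t) = Σ λ_i N_i e^(−λ_i t); a minimal sketch with illustrative yields (the half-lives are the standard values for 11C and 15O; the yields are invented, not output of PHITS, GEANT4 or HIBRAC):

```python
import math

HALF_LIFE_S = {"C11": 1221.8, "O15": 122.24}  # seconds

def activity_bq(yields, t_s):
    """Total beta+ activity A(t) = sum_i lambda_i * N_i * exp(-lambda_i * t)."""
    total = 0.0
    for nuc, n0 in yields.items():
        lam = math.log(2.0) / HALF_LIFE_S[nuc]
        total += lam * n0 * math.exp(-lam * t_s)
    return total

yields = {"C11": 1.0e6, "O15": 5.0e5}  # illustrative emitter counts
print(activity_bq(yields, 0.0))
print(activity_bq(yields, 600.0))  # 10 min later: the 15O share has largely decayed
```

    This is why quantitatively reliable per-nuclide yields matter: the measurable PT-PET signal is the decay-weighted sum over nuclides, each fading on its own timescale.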

  16. Designing and implementation of STB SOPCA's software modules

    NASA Astrophysics Data System (ADS)

    Qi, Yanjun; Zhong, Yuzhuo; Yang, Shi-Qiang

    2000-12-01

    Hardware resource limits and real-time requirements make it difficult to design software for embedded multimedia terminals. We have designed and implemented the software system for such a device: a set-top box called SOPCA that receives and plays digital TV programs. In designing SOPCA's software modules, we propose a task-based scheduling method: each self-governed function is implemented as one task, and all of SOPCA's functions are achieved through the coordination of these task cells. This method improves the system's modularity, scalability and portability, and is also relevant to the design of other embedded systems.

  17. Team Software Development for Aerothermodynamic and Aerodynamic Analysis and Design

    NASA Technical Reports Server (NTRS)

    Alexandrov, N.; Atkins, H. L.; Bibb, K. L.; Biedron, R. T.; Carpenter, M. H.; Gnoffo, P. A.; Hammond, D. P.; Jones, W. T.; Kleb, W. L.; Lee-Rausch, E. M.

    2003-01-01

    A collaborative approach to software development is described. The approach employs the agile development techniques: project retrospectives, Scrum status meetings, and elements of Extreme Programming to efficiently develop a cohesive and extensible software suite. The software product under development is a fluid dynamics simulator for performing aerodynamic and aerothermodynamic analysis and design. The functionality of the software product is achieved both through the merging, with substantial rewrite, of separate legacy codes and the authorship of new routines. Examples of rapid implementation of new functionality demonstrate the benefits obtained with this agile software development process. The appendix contains a discussion of coding issues encountered while porting legacy Fortran 77 code to Fortran 95, software design principles, and a Fortran 95 coding standard.

  18. The application of image processing software: Photoshop in environmental design

    NASA Astrophysics Data System (ADS)

    Dong, Baohua; Zhang, Chunmi; Zhuo, Chen

    2011-02-01

    In the process of environmental design and creation, the design sketch holds a very important position: it not only illuminates the design's idea and concept but also shows the design's visual effects to the client. In the field of environmental design, computer-aided design has brought significant improvement, and many types of specialized software for rendering environmental drawings and for artistic post-processing have been implemented. With this software, working efficiency has greatly increased and drawings have become more specific and more specialized. By analyzing the application of the Photoshop image-processing software in environmental design, and by comparing traditional hand drawing with drawing using modern technology, this essay explores how computer technology can play a bigger role in environmental design.

  19. Some Interactive Aspects of a Software Design Schema Acquisition Tool

    NASA Technical Reports Server (NTRS)

    Lee, Hing-Yan; Harandi, Mehdi T.

    1991-01-01

    This paper describes a design schema acquisition tool which forms an important component of a hybrid software design system for reuse. The hybrid system incorporates both schema-based approaches in supporting software design reuse activities and is realized by extensions to the IDeA system. The paper also examines some of the interactive aspects that the tool requires with the domain analyst to accomplish its acquisition task.

  20. A proposed approach for safety management in medical software design.

    PubMed

    Rafeh, Reza

    2013-02-01

    Safe behavior is the most important issue for modern medical systems, and their software has to follow safety requirements to keep the system out of error situations. This paper proposes a new approach to safety management that can be used in the different phases of software development preceding the implementation and disposal phases. In the proposed approach, safety begins with the requirements, as the infrastructure of design, and continues through the other phases of software production. PMID:23321965

  1. A software engineering approach to expert system design and verification

    NASA Technical Reports Server (NTRS)

    Bochsler, Daniel C.; Goodwin, Mary Ann

    1988-01-01

    Software engineering design and verification methods for developing expert systems are not yet well defined. Integration of expert system technology into software production environments will require effective software engineering methodologies to support the entire life cycle of expert systems. The software engineering methods used to design and verify an expert system, RENEX, are discussed. RENEX demonstrates autonomous rendezvous and proximity operations, including replanning trajectory events and subsystem fault detection, onboard a space vehicle during flight. The RENEX designers utilized a number of software engineering methodologies to deal with the complex problems inherent in this system. An overview is presented of the methods utilized. Details of the verification process receive special emphasis. The benefits and weaknesses of the methods for supporting the development life cycle of expert systems are evaluated, and recommendations are made based on the overall experiences with the methods.

  2. From Concept to Software: Developing a Framework for Understanding the Process of Software Design.

    ERIC Educational Resources Information Center

    Mishra, Punyashloke; Zhao, Yong; Tan, Sophia

    1999-01-01

    Discussion of technological innovation and the process of design focuses on the design of computer software. Offers a framework for understanding the design process by examining two computer programs: FliPS, a multimedia program for learning complex problems in chemistry; and Tiger, a Web-based program for managing and publishing electronic…

  3. Designing Distributed Learning Environments with Intelligent Software Agents

    ERIC Educational Resources Information Center

    Lin, Fuhua, Ed.

    2005-01-01

    "Designing Distributed Learning Environments with Intelligent Software Agents" reports on the most recent advances in agent technologies for distributed learning. Chapters are devoted to the various aspects of intelligent software agents in distributed learning, including the methodological and technical issues on where and how intelligent agents…

  4. Designing Computer Software for Problem-Solving Instruction.

    ERIC Educational Resources Information Center

    Duffield, Judith A.

    1991-01-01

    Discusses factors that might influence the effectiveness of computer software designed to teach problem solving. Topics discussed include the structure of knowledge; transfer of training; computers and problem solving instruction; human-computer interactions; and types of software, including drill and practice programs, tutorials, instructional…

  5. Training Software Developers and Designers to Conduct Usability Evaluations

    ERIC Educational Resources Information Center

    Skov, Mikael Brasholt; Stage, Jan

    2012-01-01

    Many efforts to improve the interplay between usability evaluation and software development rely either on better methods for conducting usability evaluations or on better formats for presenting evaluation results in ways that are useful for software designers and developers. Both of these approaches depend on a complete division of work between…

  6. Design and implementation of embedded GPS navigation software

    NASA Astrophysics Data System (ADS)

    Zeng, Zhe; Li, Zhong-Hua; Zhu, Cai-Lian

    2004-06-01

    A navigation software package that provides positioning and navigation services based on GPS techniques is introduced. The software is designed and implemented in the environment of the embedded operating system Windows CE and can be widely applied in systems such as ITS (intelligent transportation systems) and LBS (location-based services).

  7. SWEPP Gamma-Ray Spectrometer System software design description

    SciTech Connect

    Femec, D.A.; Killian, E.W.

    1994-08-01

    To assist in the characterization of the radiological contents of contact-handled waste containers at the Stored Waste Examination Pilot Plant (SWEPP), the SWEPP Gamma-Ray Spectrometer (SGRS) System has been developed by the Radiation Measurements and Development Unit of the Idaho National Engineering Laboratory. The SGRS system software controls turntable and detector system activities. In addition to determining the concentrations of gamma-ray-emitting radionuclides, this software also calculates attenuation-corrected isotopic mass ratios of specific interest. This document describes the software design for the data acquisition and analysis software associated with the SGRS system.

  8. Dedicated software for diffractive optics design and simulation

    NASA Astrophysics Data System (ADS)

    Firsov, A.; Brzhezinskaya, M.; Firsov, A.; Svintsov, A.; Erko, A.

    2013-03-01

    An efficient software package for the structure design and simulation of the imaging properties of diffractive optical elements has been developed. It operates with a point source and consists of: the ZON software, to calculate the structure of an optical element in transmission and reflection; the KRGF software, to simulate the diffraction properties of an ideal optical element with a point source; and the DS software, to calculate the diffraction properties taking material and shadowing effects into consideration. Optional software allows simulation with a real, non-point source. The zone plate thickness profile, the source shape and the substrate curvature are considered in this calculation. This is especially important for diffractive focusing elements and gratings at total external reflection, given that the lateral size of the structure can be up to 1 m. The program package can be used in combination with the Nanomaker software to prepare data for ion- and e-beam surface modifications and corrections.

  9. Simulation of the radiation exposure in space during a large solar energetic particle event with GEANT4

    NASA Astrophysics Data System (ADS)

    Matthiä, Daniel; Berger, Thomas; Puchalska, Monika; Reitz, Guenther

    in August 1972 in the energy range from 45 MeV to 1 GeV. The transport calculations of the energetic particles through the shielding and the phantom model were performed using the Monte-Carlo code GEANT4.

  10. Compton polarimeter as a focal plane detector for hard X-ray telescope: sensitivity estimation with Geant4 simulations

    NASA Astrophysics Data System (ADS)

    Chattopadhyay, T.; Vadawale, S. V.; Pendharkar, J.

    2013-04-01

    X-ray polarimetry can be an important tool for investigating various physical processes, as well as their geometries, at celestial X-ray sources. However, X-ray polarimetry has not progressed much compared to spectroscopy, timing and imaging, mainly because of its extremely photon-hungry nature, which severely limits the sensitivity of X-ray polarimeters. The great improvement in sensitivity in spectroscopy and imaging was made possible by focusing X-ray optics, which is effective only in the soft X-ray energy range. A similar improvement in the sensitivity of polarisation measurements in the soft X-ray range is expected in the near future with the advent of GEM-based photoelectric polarimeters. However, at energies >10 keV, even the spectroscopic and imaging sensitivities of X-ray detectors are limited by the lack of focusing optics; thus hard X-ray polarimetry has so far been a largely unexplored area. On the other hand, the polarisation degree is typically expected to increase at higher energies, as radiation from non-thermal processes forms the dominant fraction, so polarisation measurements in hard X-rays can yield significant insights into such processes. With the recent availability of hard X-ray optics (e.g. with the upcoming NuSTAR and Astro-H missions), which can focus X-rays from 5 keV to 80 keV, the sensitivity of X-ray detectors in the hard X-ray range is expected to improve significantly. In this context we explore the feasibility of a focal-plane hard X-ray polarimeter based on Compton scattering, having a thin plastic scatterer surrounded by a cylindrical array of scintillator detectors. We have carried out detailed Geant4 simulations to estimate the modulation factor for a 100% polarized beam as well as the polarimetric efficiency of this configuration, and we have validated these results with a semi-analytical approach. Here we present the initial results of the polarisation sensitivities of such a focal-plane Compton polarimeter coupled with the reflection efficiency of present era hard X
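    The modulation-factor extraction can be sketched as follows: for a 100% polarized beam the azimuthal scattering histogram follows N(φ) ∝ 1 + μ·cos 2(φ−φ0), and a simple estimate of μ is (Nmax−Nmin)/(Nmax+Nmin). The histogram below is synthetic, generated from that very model; it is not simulation output from the paper.

```python
import math

def modulation_factor(counts):
    """Estimate mu as (Nmax - Nmin) / (Nmax + Nmin) from an azimuthal histogram."""
    n_max, n_min = max(counts), min(counts)
    return (n_max - n_min) / (n_max + n_min)

# Synthetic 12-bin histogram from N(phi) = 1000 * (1 + 0.4 * cos(2*phi)),
# sampled at bin centres 15, 45, ..., 345 degrees:
bins = [1000.0 * (1.0 + 0.4 * math.cos(2.0 * math.radians(15.0 + 30.0 * b)))
        for b in range(12)]
mu = modulation_factor(bins)
print(round(mu, 3))  # 0.346, slightly below the true 0.4
```

    Note that the max/min estimator is biased low by finite bin width (here by a factor cos 30° ≈ 0.87, since the peak falls between bin centres); fitting the cosine model to all bins avoids this bias.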

  11. PET monitoring of cancer therapy with 3He and 12C beams: a study with the GEANT4 toolkit.

    PubMed

    Pshenichnov, Igor; Larionov, Alexei; Mishustin, Igor; Greiner, Walter

    2007-12-21

    We study the spatial distributions of beta(+)-activity produced by therapeutic beams of (3)He and (12)C ions in various tissue-like materials. The calculations were performed within a Monte Carlo model for heavy-ion therapy (MCHIT) based on the GEANT4 toolkit. The contributions from positron-emitting nuclei with T(1/2) > 10 s, namely (10,11)C, (13)N, (14,15)O, (17,18)F and (30)P, were calculated and compared with experimental data obtained during and after irradiation, where available. Positron-emitting nuclei are created by a (12)C beam in fragmentation reactions of projectile and target nuclei. This leads to a beta(+)-activity profile characterized by a noticeable peak located close to the Bragg peak in the corresponding depth-dose distribution, which can be used for dose monitoring in carbon-ion therapy of cancer. In contrast, as most of the positron-emitting nuclei are produced by a (3)He beam in target fragmentation reactions, the calculated total beta(+)-activity during or soon after the irradiation period is evenly distributed within the projectile range. However, we also predict the presence of (13)N, (14)O and (17,18)F created in charge-transfer reactions by low-energy (3)He ions close to the end of their range in several tissue-like media. The time evolution of beta(+)-activity profiles was investigated for both kinds of beams. We found that, due to the production of (18)F nuclides, the beta(+)-activity profile measured 2 or 3 h after irradiation with (3)He ions will have a distinct peak correlated with the maximum of the depth-dose distribution. We also found certain advantages of low-energy (3)He beams over low-energy proton beams for reliable PET monitoring during particle therapy of shallow-located tumours. In this case the distal edge of the beta(+)-activity distribution from (17)F nuclei clearly marks the range of (3)He in tissues. PMID:18065840
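The delayed-imaging argument above rests on simple exponential decay of the positron emitters: a few hours after irradiation the short-lived species are gone while a sizeable fraction of (18)F remains. A sketch of the surviving activity fraction, using approximate textbook half-lives rather than values from the paper:

```python
import math

# Approximate half-lives in minutes (textbook values, not from the paper).
HALF_LIVES_MIN = {"C11": 20.4, "O15": 2.0, "F18": 109.8}

def surviving_fraction(nuclide, minutes):
    """Fraction of the initial activity remaining after the given time,
    A(t)/A(0) = exp(-ln(2) * t / T_half)."""
    lam = math.log(2) / HALF_LIVES_MIN[nuclide]
    return math.exp(-lam * minutes)

# Two hours after irradiation the short-lived emitters have essentially
# decayed away, while a large fraction of 18F activity remains -- the basis
# for the delayed 18F peak discussed in the abstract.
for nuclide in HALF_LIVES_MIN:
    print(nuclide, round(surviving_fraction(nuclide, 120), 4))
```

The real β(+)-activity profile is a spatial superposition of such decay curves weighted by the depth-dependent production yields from the MCHIT simulation; this sketch only illustrates the time dependence.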

  12. Executive system software design and expert system implementation

    NASA Technical Reports Server (NTRS)

    Allen, Cheryl L.

    1992-01-01

    The topics are presented in viewgraph form and include: software requirements; design layout of the automated assembly system; menu display for automated composite command; expert system features; complete robot arm state diagram and logic; and expert system benefits.

  13. Evaluation of commercially available lighting design software

    SciTech Connect

    McConnell, D.G.

    1990-09-01

    This report addresses the need for commercially available lighting design computer programs and evaluates several of these programs. Sandia National Laboratories uses these programs to provide lighting designs for exterior closed-circuit television camera intrusion detection assessment for high-security perimeters.

  14. Software design for automated assembly of truss structures

    NASA Technical Reports Server (NTRS)

    Herstrom, Catherine L.; Grantham, Carolyn; Allen, Cheryl L.; Doggett, William R.; Will, Ralph W.

    1992-01-01

    Concern over the limited intravehicular activity time has increased the interest in performing in-space assembly and construction operations with automated robotic systems. A technique being considered at LaRC is a supervised-autonomy approach, which can be monitored by an Earth-based supervisor that intervenes only when the automated system encounters a problem. A test-bed to support evaluation of the hardware and software requirements for supervised-autonomy assembly methods was developed. This report describes the design of the software system necessary to support the assembly process. The software is hierarchical and supports both automated assembly operations and supervisor error-recovery procedures, including the capability to pause and reverse any operation. The software design serves as a model for the development of software for more sophisticated automated systems and as a test-bed for evaluation of new concepts and hardware components.

  15. Designing application software in wide area network settings

    NASA Technical Reports Server (NTRS)

    Makpangou, Mesaac; Birman, Ken

    1990-01-01

    Progress in methodologies for developing robust local area network software has not been matched by similar results for wide area settings. The design of application software spanning multiple local area environments is examined. For important classes of applications, simple design techniques are presented that yield fault tolerant wide area programs. An implementation of these techniques as a set of tools for use within the ISIS system is described.

  16. Ray tracing software application in VIP lamp design

    NASA Astrophysics Data System (ADS)

    Rehn, Henning

    2002-08-01

    In our contribution we demonstrate a wide variety of ray tracing software applications for the design of VIP short-arc discharge video projection lamps. On the basis of simulations we derive design rules for the lamp itself and for its optical environment. Light Tools software acts as a means to understand the collection efficiency of a VIP lamp with an elliptical reflector and as an instrument to prove the conclusions.

  17. Acquiring Software Design Schemas: A Machine Learning Perspective

    NASA Technical Reports Server (NTRS)

    Harandi, Mehdi T.; Lee, Hing-Yan

    1991-01-01

    In this paper, we describe an approach based on machine learning that acquires software design schemas from design cases of existing applications. An overview of the technique, design representation, and acquisition system is presented. The paper also addresses issues associated with generalizing common features, such as biases. The generalization process is illustrated using an example.

  18. An overview of very high level software design methods

    NASA Technical Reports Server (NTRS)

    Asdjodi, Maryam; Hooper, James W.

    1988-01-01

    Very high level design methods emphasize the automatic transfer of requirements to formal design specifications, and/or concentrate on the automatic transformation of formal design specifications, which include some semantic information about the system, into machine-executable form. They range from general, domain-independent methods to approaches implementable only for specific applications or domains. Different approaches to higher-level software design are being developed by applying AI techniques, abstract programming methods, domain heuristics, software engineering tools, library-based programming and other methods. Although a given approach does not always fall exactly into any one class, this paper provides a classification of very high level design methods, including examples for each class. The methods are analyzed and compared on the basis of their underlying approaches, their strengths, and their feasibility for future expansion toward automatic development of software systems.

  19. A patterns catalog for RTSJ software designs

    NASA Technical Reports Server (NTRS)

    Benowitz, E. G.; Niessner, A. F.

    2003-01-01

    In this survey paper, we bring together current progress to date in identifying design patterns for use with the real-time specification for Java in a format consistent with contemporary patterns descriptions.

  20. On Designing Lightweight Threads for Substrate Software

    NASA Technical Reports Server (NTRS)

    Haines, Matthew

    1997-01-01

    Existing user-level thread packages employ a 'black box' design approach, where the implementation of the threads is hidden from the user. While this approach is often sufficient for application-level programmers, it hides critical design decisions that system-level programmers must be able to change in order to provide efficient service for high-level systems. By applying the principles of Open Implementation Analysis and Design, we construct a new user-level threads package that supports common thread abstractions and a well-defined meta-interface for altering the behavior of these abstractions. As a result, system-level programmers will have the advantages of using high-level thread abstractions without having to sacrifice performance, flexibility or portability.

  1. Making software get along: integrating optical and mechanical design programs

    NASA Astrophysics Data System (ADS)

    Shackelford, Christie J.; Chinnock, Randal B.

    2001-03-01

    As modern optomechanical engineers, we have the good fortune of having very sophisticated software programs available to us. The current optical design, mechanical design, industrial design, and CAM programs are very powerful tools with some very desirable features. However, no one program can do everything necessary to complete an entire optomechanical system design. Each program has a unique set of features and benefits, and typically two or more will be used during the product development process. At a minimum, an optical design program and a mechanical CAD package will be employed. As we strive for efficient, cost-effective, and rapid progress in our development projects, we must use these programs to their full advantage, while keeping redundant tasks to a minimum. Together, these programs offer the promise of a 'seamless' flow of data from concept all the way to the download of part designs directly to the machine shop for fabrication. In reality, transferring data from one software package to the next is often frustrating. Overcoming these problems takes some know-how, a bit of creativity, and a lot of persistence. This paper describes a complex optomechanical development effort in which a variety of software tools were used from the concept stage to prototyping. It will describe what software was used for each major design task, how we learned to use them together to best advantage, and how we overcame the frustrations of software that didn't get along.

  2. Analysis of the track- and dose-averaged LET and LET spectra in proton therapy using the GEANT4 Monte Carlo code

    SciTech Connect

    Guan, Fada; Peeler, Christopher; Taleei, Reza; Randeniya, Sharmalee; Ge, Shuaiping; Mirkovic, Dragan; Mohan, Radhe; Titt, Uwe; Bronk, Lawrence; Geng, Changran; Grosshans, David

    2015-11-15

    Purpose: The motivation of this study was to find and eliminate the cause of errors in dose-averaged linear energy transfer (LET) calculations from therapeutic protons in small targets, such as biological cell layers, calculated using the GEANT4 Monte Carlo code. Furthermore, the purpose was also to provide a recommendation for selecting an appropriate LET quantity from GEANT4 simulations to correlate with the biological effectiveness of therapeutic protons. Methods: The authors developed a particle-tracking-step-based strategy to calculate the average LET quantities (track-averaged LET, LET{sub t} and dose-averaged LET, LET{sub d}) using GEANT4 for different tracking step size limits. A step size limit refers to the maximally allowable tracking step length. The authors investigated how the tracking step size limit influenced the calculated LET{sub t} and LET{sub d} of protons with six different step limits ranging from 1 to 500 μm in a water phantom irradiated by a 79.7-MeV clinical proton beam. In addition, the authors analyzed the detailed stochastic energy deposition information, including fluence spectra and dose spectra of the energy-deposition-per-step of protons. As a reference, the authors also calculated the averaged LET and analyzed the LET spectra by combining the Monte Carlo method and the deterministic method. Relative biological effectiveness (RBE) calculations were performed to illustrate the impact of different LET calculation methods on the RBE-weighted dose. Results: Simulation results showed that the step limit effect was small for LET{sub t} but significant for LET{sub d}. This resulted from differences in the energy-deposition-per-step between the fluence spectra and dose spectra at different depths in the phantom. Using the Monte Carlo particle tracking method in GEANT4 can result in incorrect LET{sub d} calculation results in the dose plateau region for small step limits. The erroneous LET{sub d} results can be attributed to the algorithm to
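The two LET quantities compared in this study are commonly computed from per-step energy deposits: the track-averaged LET weights each step's stopping power by step length, the dose-averaged LET weights it by deposited energy, which makes LET{sub d} far more sensitive to rare large deposits. A sketch with hypothetical per-step data (not the paper's simulation output):

```python
def let_averages(steps):
    """steps: list of (energy_deposit_keV, step_length_um) tuples.
    Returns (track-averaged LET, dose-averaged LET) in keV/um.
    Track averaging weights each step's stopping power dE/dl by step length;
    dose averaging weights it by the energy deposited in the step."""
    sp = [(de / dl, de, dl) for de, dl in steps]
    let_t = sum(s * dl for s, _, dl in sp) / sum(dl for _, _, dl in sp)
    let_d = sum(s * de for s, de, _ in sp) / sum(de for _, de, _ in sp)
    return let_t, let_d

# Hypothetical per-step data: many low-LET steps plus one high-LET step,
# illustrating why LET_d reacts strongly to rare large deposits while
# LET_t barely moves.
steps = [(0.5, 1.0)] * 9 + [(10.0, 1.0)]
let_t, let_d = let_averages(steps)
print(round(let_t, 3), round(let_d, 3))  # track-averaged 1.45, dose-averaged ~7.05
```

This sensitivity of the energy-weighted mean to the step-size-dependent deposit spectrum is exactly why the abstract reports a strong step-limit effect for LET{sub d} but not LET{sub t}.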

  3. An NAFP Project: Use of Object Oriented Methodologies and Design Patterns to Refactor Software Design

    NASA Technical Reports Server (NTRS)

    Shaykhian, Gholam Ali; Baggs, Rhoda

    2007-01-01

    In the early problem-solution era of software programming, functional decompositions were mainly used to design and implement software solutions. In functional decompositions, functions and data are introduced as two separate entities during the design phase, and are followed as such in the implementation phase. Functional decompositions make use of refactoring through optimizing the algorithms, grouping similar functionalities into common reusable functions, and using abstract representations of data where possible; all these are done during the implementation phase. This paper advocates the usage of object-oriented methodologies and design patterns as the centerpieces of refactoring software solutions. Refactoring software is a method of changing software design while explicitly preserving its external functionalities. The combined usage of object-oriented methodologies and design patterns to refactor should also benefit the overall software life cycle cost with improved software.

  4. A Software Tool for Integrated Optical Design Analysis

    NASA Technical Reports Server (NTRS)

    Moore, Jim; Troy, Ed; DePlachett, Charles; Montgomery, Edward (Technical Monitor)

    2001-01-01

    Design of large precision optical systems requires multi-disciplinary analysis, modeling, and design. Thermal, structural and optical characteristics of the hardware must be accurately understood in order to design a system capable of accomplishing the performance requirements. The interactions between each of the disciplines become stronger as systems are designed lighter weight for space applications. This coupling dictates a concurrent engineering design approach. In the past, integrated modeling tools have been developed that attempt to integrate all of the complex analysis within the framework of a single model. This often results in modeling simplifications and it requires engineering specialist to learn new applications. The software described in this presentation addresses the concurrent engineering task using a different approach. The software tool, Integrated Optical Design Analysis (IODA), uses data fusion technology to enable a cross discipline team of engineering experts to concurrently design an optical system using their standard validated engineering design tools.

  5. Math software for calculating design constraints

    SciTech Connect

    Deitz, D.

    1995-02-01

    This article describes how, to determine the specifications for a combustor in a high-speed civil transport plane, mechanical engineers are using a commercial math program to solve hundreds of complex equations. Since the demise of the slide rule, many engineers have created math routines from scratch or adapted spreadsheet programs when they needed to calculate design parameters. Commercial math programs, however, are more flexible than homemade routines or jury-rigged spreadsheet calculators, and therefore make it easier to solve complex equations. They can also cut the time it takes to determine key parameters early in the design process, when time savings can translate into big cost savings. Engineers working on Pratt and Whitney's portion of the National Aeronautics and Space Administration's high-speed civil transport (HSCT) project estimated that they were able to cut in half the time needed to solve equations in the preliminary design process by using a commercial math program, TK Solver, from Universal Technical Systems Inc. in Rockford, Ill. The time savings resulted from the program's ability to solve any equation for any variable, relieving engineers of the need to rewrite or reenter equations.
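The key feature described, solving any equation for any variable without rewriting it, can be illustrated with a generic root finder: treat the relation as f(unknown) = 0 and search numerically. This is a simple bisection sketch, not TK Solver's actual algorithm, and the ideal-gas relation is just an illustrative equation:

```python
def solve_for(f, lo, hi, tol=1e-10):
    """Bisection root finder: returns x in [lo, hi] with f(x) = 0,
    assuming f changes sign on the interval."""
    flo = f(lo)
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if (f(mid) < 0) == (flo < 0):  # same sign as lo: root is above mid
            lo, flo = mid, f(mid)
        else:
            hi = mid
    return (lo + hi) / 2

# The same relation, e.g. the ideal-gas law p*V = n*R*T, can be solved for
# any one variable by treating the others as known -- here for T.
R = 8.314            # J/(mol K)
p, V, n = 101325.0, 0.0224, 1.0
T = solve_for(lambda T: p * V - n * R * T, 1.0, 1000.0)
print(round(T, 1))   # about 273 K for 1 mol at ~1 atm in 22.4 L
```

Declarative solvers generalize this idea: the user states the relation once, marks which variables are known, and the tool picks the unknown to solve for.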

  6. Design of single object model of software reuse framework

    NASA Astrophysics Data System (ADS)

    Yan, Liu

    2011-12-01

    In order to fully realize the reuse value of the software reuse framework, this paper analyzes in detail the single object model introduced in the article "The overall design of software reuse framework" and organizes it into three modes: an add/delete/modify mode, a check mode, and an integrated search/scroll/display mode. Each mode corresponds to its own interface design template, class design and database design concept. This modelling approach helps developers organize their thinking and speeds up development; even newcomers can complete the development task easily.

  7. Certification trails and software design for testability

    NASA Technical Reports Server (NTRS)

    Sullivan, Gregory F.; Wilson, Dwight S.; Masson, Gerald M.

    1993-01-01

    Design techniques which may be applied to make program testing easier were investigated. Methods for modifying a program to generate additional data, which we refer to as a certification trail, are presented. This additional data is designed to allow the program output to be checked more quickly and effectively. Certification trails were described primarily from a theoretical perspective. A comprehensive attempt to assess experimentally the performance and overall value of the certification trail method is reported. The method was applied to nine fundamental, well-known algorithms for the following problems: convex hull, sorting, Huffman tree, shortest path, closest pair, line segment intersection, longest increasing subsequence, skyline, and Voronoi diagram. Run-time performance data for each of these problems is given, and selected problems are described in more detail. Our results indicate that there are many cases in which certification trails allow for significantly faster overall program execution time than a 2-version programming approach, and also give further evidence of the breadth of applicability of this method.
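Sorting, one of the nine problems listed, gives the classic certification-trail example: alongside the sorted output, the program emits the permutation it applied, which lets an independent checker validate the result in linear time instead of re-sorting. A sketch of the general idea (not the authors' implementation):

```python
def sort_with_trail(xs):
    """Sort xs and also emit a certification trail: the permutation that
    maps each output position back to the input position it came from."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    return [xs[i] for i in order], order

def check_with_trail(xs, ys, trail):
    """Validate the sort using the trail in O(n): the trail must be a
    permutation, ys must be xs rearranged by it, and ys must be sorted."""
    n = len(xs)
    if len(ys) != n or len(trail) != n:
        return False
    seen = [False] * n
    for i in trail:                      # permutation check
        if not 0 <= i < n or seen[i]:
            return False
        seen[i] = True
    if any(ys[k] != xs[trail[k]] for k in range(n)):
        return False                     # ys really is xs rearranged
    return all(ys[k] <= ys[k + 1] for k in range(n - 1))

ys, trail = sort_with_trail([3, 1, 2])
print(ys, check_with_trail([3, 1, 2], ys, trail))  # [1, 2, 3] True
```

The checker is both simpler and asymptotically faster than the sorter, which is what makes the approach cheaper than full 2-version programming.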

  8. Software design with fuzzy requirements (A case study)

    NASA Technical Reports Server (NTRS)

    Werntz, David G.

    1989-01-01

    The author describes the resource allocation and planning helper (RALPH) scheduling system developed at NASA's Jet Propulsion Laboratory. RALPH addresses the concerns of designing software systems to minimize the need to recode for changes and upgrades; this concern is acute when requirements are uncertain or changing. Determining requirements, understanding the problem, designing for change, and tradeoffs are also discussed.

  9. Designing Prediction Tasks in a Mathematics Software Environment

    ERIC Educational Resources Information Center

    Brunström, Mats; Fahlgren, Maria

    2015-01-01

    There is a recognised need in mathematics teaching for new kinds of tasks which exploit the affordances provided by new technology. This paper focuses on the design of prediction tasks to foster student reasoning about exponential functions in a mathematics software environment. It draws on the first iteration of a design based research study…

  10. Calico: An Early-Phase Software Design Tool

    ERIC Educational Resources Information Center

    Mangano, Nicolas Francisco

    2013-01-01

    When developers are faced with a design challenge, they often turn to the whiteboard. This is typical during the conceptual stages of software design, when no code is in existence yet. It may also happen when a significant code base has already been developed, for instance, to plan new functionality or discuss optimizing a key component. While…

  11. Software For Computer-Aided Design Of Control Systems

    NASA Technical Reports Server (NTRS)

    Wette, Matthew

    1994-01-01

    Computer Aided Engineering System (CAESY) software developed to provide means to evaluate methods for dealing with users' needs in computer-aided design of control systems. Interpreter program for performing engineering calculations. Incorporates features of both Ada and MATLAB. Designed to be flexible and powerful. Includes internally defined functions, procedures and provides for definition of functions and procedures by user. Written in C language.

  12. Methodology for system description using the software design & documentation language

    NASA Technical Reports Server (NTRS)

    Kleine, H.

    1985-01-01

    The Software Design and Documentation Language (SDDL) can be loosely characterized as a text processor with built-in knowledge of, and methods for handling the concepts of structure and abstraction which are essential for developing software and other information intensive systems. Several aspects of system descriptions to which SDDL has been applied are presented and specific SDDL methodologies developed for these applications are discussed.

  13. Design software for ion-exchanged glass waveguide devices

    NASA Astrophysics Data System (ADS)

    Tervonen, Ari; Honkanen, Seppo; Poyhonen, Pekka; Tahkokorpi, Markku T.

    1993-04-01

    Software tools for the design of passive integrated optical components based on ion-exchanged glass waveguides have been developed. All design programs have been implemented on personal computers. A general simulation program for ion exchange processes is used for optimization of waveguide fabrication. Optical propagation in the calculated channel waveguide profiles is modelled with various methods. A user-friendly interface has been included in this modelling software. On the basis of the calculated propagation properties, the performance of channel waveguide circuits can be modelled and thus devices for different applications may be designed. From the design parameters, the lithography mask pattern to be used is generated for a commercial CAD program for final mask design. Examples of designed and manufactured guided-wave devices are described. These include 1-to-n splitters and asymmetric Mach-Zehnder interferometers for wavelength division multiplexing.

  14. Comparison of MCNPX and GEANT4 to Predict the Contribution of Non-elastic Nuclear Interactions to Absorbed Dose in Water, PMMA and A150

    NASA Astrophysics Data System (ADS)

    Shtejer, K.; Arruda-Neto, J. D. T.; Schulte, R.; Wroe, A.; Rodrigues, T. E.; de Menezes, M. O.; Moralles, M.; Guzmán, F.; Manso, M. V.

    2008-08-01

    Proton induced non-elastic nuclear reactions play an important role in the dose distribution of clinically used proton beams, as they deposit dose of high biological effectiveness both within the primary beam path and outside the beam to untargeted tissues. Non-elastic nuclear reactions can be evaluated using transport codes based on the Monte Carlo method. In this work, we have utilized the Los Alamos code MCNPX and the CERN GEANT4 toolkit, which are currently the most widely used Monte Carlo programs for proton radiation transport simulations in medical physics, to study the contribution of non-elastic nuclear interactions to the absorbed dose of proton beams in the therapeutic energy range. The impact of the different available theoretical models for the nuclear reaction process was investigated. The contribution of secondary particles from non-elastic nuclear reactions was calculated in three materials relevant to radiotherapy applications: water, PMMA and A150. The results show that there are differences in the calculated contribution of the secondary particles heavier than protons to the absorbed dose for the different approaches to modelling the nuclear reactions. The MCNPX calculations give rise to a larger contribution of d, t, α and 3He to the total dose compared to the GEANT4 physical models chosen in this work.

  15. Comparison of MCNPX and GEANT4 to Predict the Contribution of Non-elastic Nuclear Interactions to Absorbed Dose in Water, PMMA and A150

    SciTech Connect

    Shtejer, K.; Arruda-Neto, J. D. T.; Rodrigues, T. E.; Schulte, R.; Wroe, A.; Menezes, M. O. de; Moralles, M.

    2008-08-11

    Proton induced non-elastic nuclear reactions play an important role in the dose distribution of clinically used proton beams, as they deposit dose of high biological effectiveness both within the primary beam path and outside the beam to untargeted tissues. Non-elastic nuclear reactions can be evaluated using transport codes based on the Monte Carlo method. In this work, we have utilized the Los Alamos code MCNPX and the CERN GEANT4 toolkit, which are currently the most widely used Monte Carlo programs for proton radiation transport simulations in medical physics, to study the contribution of non-elastic nuclear interactions to the absorbed dose of proton beams in the therapeutic energy range. The impact of the different available theoretical models for the nuclear reaction process was investigated. The contribution of secondary particles from non-elastic nuclear reactions was calculated in three materials relevant to radiotherapy applications: water, PMMA and A150. The results show that there are differences in the calculated contribution of the secondary particles heavier than protons to the absorbed dose for the different approaches to modelling the nuclear reactions. The MCNPX calculations give rise to a larger contribution of d, t, α and 3He to the total dose compared to the GEANT4 physical models chosen in this work.

  16. Mission design applications of QUICK. [software for interactive trajectory calculation

    NASA Technical Reports Server (NTRS)

    Skinner, David L.; Bass, Laura E.; Byrnes, Dennis V.; Cheng, Jeannie T.; Fordyce, Jess E.; Knocke, Philip C.; Lyons, Daniel T.; Pojman, Joan L.; Stetson, Douglas S.; Wolf, Aron A.

    1990-01-01

    An overview of an interactive software environment for space mission design termed QUICK is presented. This stand-alone program provides a programmable FORTRAN-like calculator interface to a wide range of both built-in and user-defined functions. QUICK has evolved into a general-purpose software environment that can be intrinsically and dynamically customized for a wide range of mission design applications. Specific applications are described for some space programs, e.g., the Earth-Venus-Mars mission, the Cassini mission to Saturn, the Mars Observer, the Galileo Project, and the Magellan spacecraft.

  17. Software Design Methodology Migration for a Distributed Ground System

    NASA Technical Reports Server (NTRS)

    Ritter, George; McNair, Ann R. (Technical Monitor)

    2002-01-01

    The Marshall Space Flight Center's (MSFC) Payload Operations Center (POC) ground system has been developed and has evolved over a period of about 10 years. During this time the software processes have migrated from more traditional to more contemporary development processes. The new software processes still emphasize requirements capture, software configuration management, design documentation, and making sure the products that have been developed are accountable to the initial requirements. This paper gives an overview of how the software processes have evolved, highlighting the positives as well as the negatives. In addition, we mention the COTS tools that have been integrated into the processes and how they have provided value to the project.

  18. A design methodology for portable software on parallel computers

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Miller, Keith W.; Chrisman, Dan A.

    1993-01-01

    This final report for research that was supported by grant number NAG-1-995 documents our progress in addressing two difficulties in parallel programming. The first difficulty is developing software that will execute quickly on a parallel computer. The second difficulty is transporting software between dissimilar parallel computers. In general, we expect that more hardware-specific information will be included in software designs for parallel computers than in designs for sequential computers. This inclusion is an instance of portability being sacrificed for high performance. New parallel computers are being introduced frequently. Trying to keep one's software on the current high performance hardware, a software developer almost continually faces yet another expensive software transportation. The problem of the proposed research is to create a design methodology that helps designers to more precisely control both portability and hardware-specific programming details. The proposed research emphasizes programming for scientific applications. We completed our study of the parallelizability of a subsystem of the NASA Earth Radiation Budget Experiment (ERBE) data processing system. This work is summarized in section two. A more detailed description is provided in Appendix A ('Programming Practices to Support Eventual Parallelism'). Mr. Chrisman, a graduate student, wrote and successfully defended a Ph.D. dissertation proposal which describes our research associated with the issues of software portability and high performance. The list of research tasks are specified in the proposal. The proposal 'A Design Methodology for Portable Software on Parallel Computers' is summarized in section three and is provided in its entirety in Appendix B. We are currently studying a proposed subsystem of the NASA Clouds and the Earth's Radiant Energy System (CERES) data processing system. This software is the proof-of-concept for the Ph.D. dissertation. 
We have implemented and measured

  19. Issues in Software Engineering of Relevance to Instructional Design

    ERIC Educational Resources Information Center

    Douglas, Ian

    2006-01-01

    Software engineering is popularly misconceived as being an upmarket term for programming. In a way, this is akin to characterizing instructional design as the process of creating PowerPoint slides. In both these areas, the construction of systems, whether they are learning or computer systems, is only one part of a systematic process. The most…

  20. Categorizing Student Software Designs: Methods, Results, and Implications

    ERIC Educational Resources Information Center

    Eckerdal, Anna; McCartney, Robert; Mostrom, Jan Erik; Ratcliffe, Mark; Zander, Carol

    2006-01-01

    This paper examines the problem of studying and comparing student software designs. We propose semantic categorization as a way to organize widely varying data items. We describe how this was used to organize a particular multi-national, multi-institutional dataset, and present the results of this analysis: most students are unable to effectively…

  1. Peeling the Onion: Okapi System Architecture and Software Design Issues.

    ERIC Educational Resources Information Center

    Jones, S.; And Others

    1997-01-01

    Discusses software design issues for Okapi, an information retrieval system that incorporates both search engine and user interface and supports weighted searching, relevance feedback, and query expansion. The basic search system, adjacency searching, and moving toward a distributed system are discussed. (Author/LRW)

  2. Designing for User Cognition and Affect in Software Instructions

    ERIC Educational Resources Information Center

    van der Meij, Hans

    2008-01-01

    In this paper we examine how to design software instructions for user cognition and affect. A basic manual and a co-user manual are compared. The basic manual provides fundamental support for both; the co-user manual includes a buddy to further optimize support for user affect. The basic manual was faster and judged easier to process than the co-user manual. In…

  3. QUICK - An interactive software environment for engineering design

    NASA Technical Reports Server (NTRS)

    Skinner, David L.

    1989-01-01

    QUICK, an interactive software environment for engineering design, provides a programmable FORTRAN-like calculator interface to a wide range of data structures as well as both built-in and user created functions. QUICK also provides direct access to the operating systems of eight different machine architectures. The evolution of QUICK and a brief overview of the current version are presented.

  4. Expertise in professional software design: a process study.

    PubMed

    Sonnentag, S

    1998-10-01

    Forty professional software designers participated in a study in which they worked on a software design task and reported strategies for accomplishing that task. High performers were identified by a peer-nomination method and by performance on the design task. Verbal protocol analysis based on a comparison of 12 high and 12 moderate performers indicated that high performers structured their design process by local planning and showed more feedback processing, whereas moderate performers were more engaged in analyzing requirements and verbalizing task-irrelevant cognitions. High performers more often described problem comprehension and cooperation with colleagues as useful strategies. High and moderate performers did not differ with respect to length of experience, and none of the differences between the two performance groups could be explained by it. PMID:9806013

  5. Design and implementation of the mobility assessment tool: software description

    PubMed Central

    2013-01-01

    Background In previous work, we described the development of an 81-item video-animated tool for assessing mobility. In response to criticism levied during a pilot study of this tool, we sought to develop a new version built upon a flexible framework for designing and administering the instrument. Results Rather than constructing a self-contained software application with a hard-coded instrument, we designed an XML schema capable of describing a variety of psychometric instruments. The new version of our video-animated assessment tool was then defined fully within the context of a compliant XML document. Two software applications—one built in Java, the other in Objective-C for the Apple iPad—were then built that could present the instrument described in the XML document and collect participants’ responses. Separating the instrument’s definition from the software application implementing it allowed for rapid iteration and easy, reliable definition of variations. Conclusions Defining instruments in a software-independent XML document simplifies the process of defining instruments and variations and allows a single instrument to be deployed on as many platforms as there are software applications capable of interpreting the instrument, thereby broadening the potential target audience for the instrument. Continued work will be done to further specify and refine this type of instrument specification with a focus on spurring adoption by researchers in gerontology and geriatric medicine. PMID:23879716
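    The separation the authors describe — instrument definition in an XML document, presentation by any compliant application — can be sketched minimally. The element and attribute names and the item text below are illustrative assumptions, not the authors' actual schema (their applications were written in Java and Objective-C; Python is used here only for brevity).

```python
# Hypothetical sketch: a psychometric instrument defined entirely in XML,
# interpreted by any application that understands the (assumed) schema.
import xml.etree.ElementTree as ET

INSTRUMENT_XML = """
<instrument name="mobility-assessment" version="2">
  <item id="1" type="video">
    <prompt>How difficult is it to walk up a flight of stairs?</prompt>
    <response kind="likert" min="1" max="5"/>
  </item>
  <item id="2" type="video">
    <prompt>How difficult is it to rise from a chair?</prompt>
    <response kind="likert" min="1" max="5"/>
  </item>
</instrument>
"""

def load_items(xml_text):
    """Return a list of (id, prompt, response-kind) tuples from the document."""
    root = ET.fromstring(xml_text)
    items = []
    for item in root.findall("item"):
        prompt = item.findtext("prompt")
        kind = item.find("response").get("kind")
        items.append((item.get("id"), prompt, kind))
    return items

for item_id, prompt, kind in load_items(INSTRUMENT_XML):
    print(item_id, kind, prompt)
```

Because the instrument lives in the document rather than in code, a variant is just another XML file, and each platform needs only one interpreter.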

  6. Cluster computing software for GATE simulations

    SciTech Connect

    Beenhouwer, Jan de; Staelens, Steven; Kruecker, Dirk; Ferrer, Ludovic; D'Asseler, Yves; Lemahieu, Ignace; Rannou, Fernando R.

    2007-06-15

    Geometry and tracking (GEANT4) is a Monte Carlo package designed for high energy physics experiments. It is used as the basis layer for Monte Carlo simulations of nuclear medicine acquisition systems in GEANT4 Application for Tomographic Emission (GATE). GATE allows the user to realistically model experiments using accurate physics models and time synchronization for detector movement through a script language contained in a macro file. The downside of this high accuracy is long computation time. This paper describes a platform independent computing approach for running GATE simulations on a cluster of computers in order to reduce the overall simulation time. Our software automatically creates fully resolved, nonparametrized macros accompanied with an on-the-fly generated cluster specific submit file used to launch the simulations. The scalability of GATE simulations on a cluster is investigated for two imaging modalities, positron emission tomography (PET) and single photon emission computed tomography (SPECT). Due to a higher sensitivity, PET simulations are characterized by relatively high data output rates that create rather large output files. SPECT simulations, on the other hand, have lower data output rates but require a long collimator setup time. Both of these characteristics hamper scalability as a function of the number of CPUs. The scalability of PET simulations is improved here by the development of a fast output merger. The scalability of SPECT simulations is improved by greatly reducing the collimator setup time. Accordingly, these two new developments result in higher scalability for both PET and SPECT simulations and reduce the computation time to more practical values.
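    The macro-generation step described above can be sketched as follows: one long run is split into independent jobs, each given a fully resolved macro with its own random seed, output name, and share of the primaries. The GATE-style macro commands are standard, but this helper function and its naming conventions are illustrative assumptions, not the authors' actual cluster software.

```python
# A sketch of the macro-splitting idea: each cluster job gets a fully
# resolved macro with a distinct random seed, its own output file name,
# and an equal share of the primaries.
def split_macros(total_events, n_jobs, prefix="gate_job"):
    """Return (macro filename, macro text) pairs, one per cluster job."""
    per_job = total_events // n_jobs
    jobs = []
    for i in range(n_jobs):
        macro = "\n".join([
            f"/gate/random/setEngineSeed {1000 + i}",            # distinct seed
            f"/gate/output/root/setFileName {prefix}_{i}",       # distinct output
            f"/gate/application/setTotalNumberOfPrimaries {per_job}",
            "/gate/application/start",
        ])
        jobs.append((f"{prefix}_{i}.mac", macro))
    return jobs

for name, text in split_macros(1_000_000, 4):
    print("#", name)
    print(text)
```

Each macro is self-contained, so the jobs can run on any node without shared state; the per-job outputs would then be merged, as the paper's fast output merger does for PET.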

  7. Cluster computing software for GATE simulations.

    PubMed

    De Beenhouwer, Jan; Staelens, Steven; Kruecker, Dirk; Ferrer, Ludovic; D'Asseler, Yves; Lemahieu, Ignace; Rannou, Fernando R

    2007-06-01

    Geometry and tracking (GEANT4) is a Monte Carlo package designed for high energy physics experiments. It is used as the basis layer for Monte Carlo simulations of nuclear medicine acquisition systems in GEANT4 Application for Tomographic Emission (GATE). GATE allows the user to realistically model experiments using accurate physics models and time synchronization for detector movement through a script language contained in a macro file. The downside of this high accuracy is long computation time. This paper describes a platform independent computing approach for running GATE simulations on a cluster of computers in order to reduce the overall simulation time. Our software automatically creates fully resolved, nonparametrized macros accompanied with an on-the-fly generated cluster specific submit file used to launch the simulations. The scalability of GATE simulations on a cluster is investigated for two imaging modalities, positron emission tomography (PET) and single photon emission computed tomography (SPECT). Due to a higher sensitivity, PET simulations are characterized by relatively high data output rates that create rather large output files. SPECT simulations, on the other hand, have lower data output rates but require a long collimator setup time. Both of these characteristics hamper scalability as a function of the number of CPUs. The scalability of PET simulations is improved here by the development of a fast output merger. The scalability of SPECT simulations is improved by greatly reducing the collimator setup time. Accordingly, these two new developments result in higher scalability for both PET and SPECT simulations and reduce the computation time to more practical values. PMID:17654895

  8. Feasibility of using Geant4 Monte Carlo simulation for IMRT dose calculations for the Novalis Tx with a HD-120 multi-leaf collimator

    NASA Astrophysics Data System (ADS)

    Jung, Hyunuk; Shin, Jungsuk; Chung, Kwangzoo; Han, Youngyih; Kim, Jinsung; Choi, Doo Ho

    2015-05-01

    The aim of this study was to develop an independent dose verification system using a Monte Carlo (MC) calculation method for intensity-modulated radiation therapy (IMRT) conducted with a Varian Novalis Tx (Varian Medical Systems, Palo Alto, CA, USA) equipped with a high-definition multi-leaf collimator (HD-120 MLC). The Geant4 framework was used to implement a dose calculation system that accurately predicted the delivered dose. For this purpose, the Novalis Tx Linac head was modeled according to specifications acquired from the manufacturer. Subsequently, MC simulations were performed by varying the mean energy, energy spread, and electron spot radius to determine the optimum values for irradiation with 6-MV X-ray beams from the Novalis Tx system. Computed percentage depth dose curves (PDDs) and lateral profiles were compared to measurements obtained with an ionization chamber (CC13). To validate the IMRT simulation with the MC model we developed, we calculated a simple IMRT field and compared the result with EBT3 film measurements in a water-equivalent solid phantom. Clinical cases, such as prostate cancer treatment plans, were then selected, and MC simulations were performed. The accuracy of the simulation was assessed against the EBT3 film measurements by using a gamma-index criterion. The optimal MC model parameters to specify the beam characteristics were a 6.8-MeV mean energy, a 0.5-MeV energy spread, and a 3-mm electron radius. The accuracy of these parameters was determined by comparison of MC simulations with measurements. The PDDs and the lateral profiles of the MC simulation deviated from the measurements by 1% and 2%, respectively, on average. The computed simple MLC fields agreed with the EBT3 measurements with a 95% passing rate under a 3%/3-mm gamma-index criterion. Additionally, in applying our model to clinical IMRT plans, we found that the MC calculations and the EBT3 measurements agreed well, with a passing rate of greater…
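    The 3%/3-mm gamma-index criterion used in this comparison can be illustrated with a minimal one-dimensional sketch (global dose normalization, no interpolation between points); clinical tools operate on 2D/3D dose grids, so this is only a toy version of the test.

```python
# A toy 1D version of the gamma-index test: a measured point passes if
# some calculated point lies inside the combined dose-difference /
# distance-to-agreement ellipse. Global normalization to the measured
# maximum; no interpolation. Illustrative sketch only.
import math

def gamma_pass_rate(measured, calculated, positions, dose_tol=0.03, dist_tol=3.0):
    """Fraction of measured points with gamma <= 1 (3%/3 mm by default)."""
    d_max = max(measured)  # global dose normalization
    passed = 0
    for dm, xm in zip(measured, positions):
        gamma = min(
            math.sqrt(((dc - dm) / (dose_tol * d_max)) ** 2
                      + ((xc - xm) / dist_tol) ** 2)
            for dc, xc in zip(calculated, positions)
        )
        passed += gamma <= 1.0
    return passed / len(measured)

pos = [0.0, 1.0, 2.0, 3.0]            # detector positions, mm
meas = [1.00, 0.95, 0.60, 0.20]       # film dose, relative
calc = [1.01, 0.94, 0.62, 0.21]       # MC dose, relative
print(gamma_pass_rate(meas, calc, pos))   # all points agree closely -> 1.0
```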

  9. Software Design for Interactive Graphic Radiation Treatment Simulation Systems*

    PubMed Central

    Kalet, Ira J.; Sweeney, Christine; Jacky, Jonathan

    1990-01-01

    We examine issues in the design of interactive computer graphic simulation programs for radiation treatment planning (RTP), as well as expert system programs that automate parts of the RTP process, in light of ten years of experience at designing, building and using such programs. An experiment in object-oriented design using standard Pascal shows that while some advantage is gained from the design, it is still difficult to achieve modularity and to integrate expert system components. A new design based on the Common LISP Object System (CLOS) is described. This series of designs for RTP software shows that this application benefits in specific ways from object-oriented design methods and appropriate languages and tools.

  10. [Development of a software for 3D virtual phantom design].

    PubMed

    Zou, Lian; Xie, Zhao; Wu, Qi

    2014-02-01

    In this paper, we present 3D virtual phantom design software, developed with object-oriented programming methodology and dedicated to medical physics research. The software, named Magical Phantom (MPhantom), is composed of a 3D visual builder module and a virtual CT scanner. Users can conveniently construct any complex 3D phantom and then export it as DICOM 3.0 CT images. MPhantom is a user-friendly and powerful tool for 3D phantom configuration and has passed application tests on real scenes. MPhantom will accelerate Monte Carlo simulation for dose calculation in radiation therapy and research on X-ray imaging reconstruction algorithms. PMID:24804488

  11. Design of Mariner 9 Science Sequences using Interactive Graphics Software

    NASA Technical Reports Server (NTRS)

    Freeman, J. E.; Sturms, F. M., Jr.; Webb, W. A.

    1973-01-01

    This paper discusses the analyst/computer system used to design the daily science sequences required to carry out the desired Mariner 9 science plan. The Mariner 9 computer environment, the development and capabilities of the science sequence design software, and the techniques followed in the daily mission operations are discussed. Included is a discussion of the overall mission operations organization and the individual components which played an essential role in the sequence design process. A summary of actual sequences processed, a discussion of problems encountered, and recommendations for future applications are given.

  12. Geant4 Monte Carlo simulation of absorbed dose and radiolysis yields enhancement from a gold nanoparticle under MeV proton irradiation

    NASA Astrophysics Data System (ADS)

    Tran, H. N.; Karamitros, M.; Ivanchenko, V. N.; Guatelli, S.; McKinnon, S.; Murakami, K.; Sasaki, T.; Okada, S.; Bordage, M. C.; Francis, Z.; El Bitar, Z.; Bernal, M. A.; Shin, J. I.; Lee, S. B.; Barberet, Ph.; Tran, T. T.; Brown, J. M. C.; Nhan Hao, T. V.; Incerti, S.

    2016-04-01

    Gold nanoparticles have been reported as a possible radio-sensitizer agent in radiation therapy due to their ability to increase energy deposition and subsequent direct damage to cells and DNA within their local vicinity. Moreover, this increase in energy deposition also results in an increase of the radiochemical yields. In this work we present, for the first time, an in silico investigation, based on the general purpose Monte Carlo simulation toolkit Geant4, into energy deposition and radical species production around a spherical gold nanoparticle 50 nm in diameter via proton irradiation. Simulations were performed for incident proton energies ranging from 2 to 170 MeV, which are of interest for clinical proton therapy.

  13. Simulation, optimization and testing of a novel high spatial resolution X-ray imager based on Zinc Oxide nanowires in Anodic Aluminium Oxide membrane using Geant4

    NASA Astrophysics Data System (ADS)

    Esfandi, F.; Saramad, S.

    2015-07-01

    In this work, a new generation of scintillator-based X-ray imagers, built from ZnO nanowires grown in an anodized aluminum oxide (AAO) nanoporous template, is characterized. The optical response of ordered ZnO nanowire arrays in a porous AAO template under low-energy X-ray illumination is simulated with the Geant4 Monte Carlo code and compared with experimental results. The results show that for 10-keV X-ray photons, by taking into account the light-guiding properties of zinc oxide inside the AAO template and choosing a suitable detector thickness and pore diameter, a spatial resolution of less than one micrometer and a detection efficiency of 66% are achievable. This novel nanoscintillator detector can offer many advantages for medical applications in the future.

  14. Geant4 simulation for a study of a possible use of carbon ion pencil beams for the treatment of ocular melanomas with the active scanning system at CNAO

    NASA Astrophysics Data System (ADS)

    Farina, E.; Piersimoni, P.; Riccardi, C.; Rimoldi, A.; Tamborini, A.; Ciocca, M.

    2015-12-01

    The aim of this work was to study a possible use of carbon ion pencil beams (delivered with the active scanning modality) for the treatment of ocular melanomas at the Centro Nazionale di Adroterapia Oncologica (CNAO). The promise of carbon ion radiotherapy for the treatment of this disease lies in its superior relative radio-biological effectiveness (RBE). The Monte Carlo (MC) Geant4 10.00 toolkit was used to simulate the complete CNAO extraction beamline, with the active and passive components along it. A detector modeling the human eye, including a realistic tumor target volume, was used as the target. Cross-checks with previous proton studies at CNAO allowed comparison of the possible benefits of this technique with respect to proton beams. Experimental data on the transverse distributions of proton and carbon ion beams were used to validate the simulation.

  15. Radiation Effects Investigations Based on Atmospheric Radiation Model (ATMORAD) Considering GEANT4 Simulations of Extensive Air Showers and Solar Modulation Potential.

    PubMed

    Hubert, Guillaume; Cheminet, Adrien

    2015-07-01

    The natural radiative atmospheric environment is composed of secondary cosmic rays produced when primary cosmic rays hit the atmosphere. Understanding atmospheric radiation and its dynamics is essential for evaluating single event effects, so that radiation risks in aviation and the space environment (space weather) can be assessed. In this article, we present an atmospheric radiation model, named ATMORAD (Atmospheric Radiation), which is based on GEANT4 simulations of extensive air showers according to primary spectra that depend only on the solar modulation potential (force-field approximation). The solar modulation potential can in turn be deduced from neutron spectrometer measurements combined with ATMORAD. Some comparisons between our methodology and standard approaches or measurements are also discussed. This work demonstrates the potential of using simulations of extensive air showers and neutron spectroscopy to monitor solar activity. PMID:26151172
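    The force-field approximation mentioned above collapses solar modulation into a single potential phi: the near-Earth differential flux is a shifted and attenuated local interstellar spectrum (LIS). The power-law LIS below is purely illustrative, not ATMORAD's parameterization.

```python
# A sketch of the force-field approximation: the flux at Earth is the
# local interstellar spectrum (LIS) shifted down in energy by phi and
# attenuated by a kinematic factor. The LIS form here is a toy power law.
M_P = 938.272  # proton rest energy, MeV

def lis(E_kin):
    """Toy local interstellar proton spectrum (arbitrary units)."""
    return 1.0e4 * (E_kin + 500.0) ** -2.7

def modulated_flux(E_kin, phi):
    """Force-field modulated flux for protons; phi in MV (~MeV for Z=1)."""
    E_is = E_kin + phi  # energy the proton had outside the heliosphere
    attenuation = (E_kin * (E_kin + 2 * M_P)) / (E_is * (E_is + 2 * M_P))
    return lis(E_is) * attenuation

for phi in (400.0, 1000.0):  # roughly solar minimum vs. maximum, MV
    print(phi, modulated_flux(1000.0, phi))
```

Higher phi (more solar activity) suppresses the low-energy flux, which is why a neutron monitor on the ground can serve as a proxy for solar modulation.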

  16. BIOESTIM: software for automatic design of estimators in bioprocess engineering.

    PubMed

    Farza, M; Chéruy, A

    1994-09-01

    This paper describes BIOESTIM, a software package devoted to on-line estimation in bioprocess engineering. BIOESTIM enables bioengineers to automatically design state and parameter estimators from minimal knowledge of the process kinetics. Such estimators allow the development of software sensors capable of coping with the lack of reliable instrumentation suited to real-time monitoring. The estimator-building procedure in BIOESTIM starts from a dynamical material-balance model of the bioprocess. This model, supplied by the user, is next completed by other information with no requirement for numerical values: the user has only to specify the available measurements, coupled reactions, and the known yield coefficients. On the basis of this knowledge, BIOESTIM performs symbolic algebraic manipulations on the model in order to study estimation possibilities and check identifiability of yield coefficients. When the design of an estimator is possible, the corresponding equations are automatically generated. Moreover, these estimators are stored in a user-specified file which is automatically interfaced with specialized simulation software including data-treatment and numerical-integration packages. Thus, the user can simulate the estimator's performance under various operational conditions using available experimental measurements. A typical example dealing with microbial growth and biosynthesis reactions is given in order to illustrate the main functional capabilities of BIOESTIM. BIOESTIM has been designed and written in a modular fashion. The module dealing with estimator design makes use of symbolic computation; it is written in Mathematica and runs on every computer on which this language is available. PMID:7828062

  17. ArrayD: A general purpose software for Microarray design

    PubMed Central

    Sharma, Anu; Srivastava, Gyan Prakash; Sharma, Vineet K; Ramachandran, Srinivasan

    2004-01-01

    Background Microarray is a high-throughput technology to study the expression of thousands of genes in parallel. A critical aspect of microarray production is a design aimed at space optimization while maximizing the number of gene probes and their replicates to be spotted. Results We have developed software called 'ArrayD' that offers various alternative design solutions for an array given a set of user requirements. The user supplies the following inputs: type of source plates to be used, number of gene probes to be printed, number of replicates, and number of pins to be used for printing. The solutions are stored in a text file. The choice of a design solution will be governed by the spotting chemistry to be used and the accuracy of the robot. Conclusions ArrayD is software for standard Cartesian robots. The software aids users in preparing a judicious and elegant design. ArrayD is universally applicable and is available at . PMID:15461789

  18. Open source software and web services for designing therapeutic molecules.

    PubMed

    Singla, Deepak; Dhanda, Sandeep Kumar; Chauhan, Jagat Singh; Bhardwaj, Anshu; Brahmachari, Samir K; Raghava, Gajendra P S

    2013-01-01

    Despite the tremendous progress in the field of drug design, discovering a new drug molecule is still a challenging task. Drug discovery and development is a costly, time-consuming, and complex process that requires millions of dollars and 10-15 years to bring a new drug molecule to the market. This huge investment and long-term process are attributed to the high failure rate, the complexity of the problem, and strict regulatory rules, among other factors. Given the availability of 'big' data and ever-improving computing power, it is now possible to model systems in ways that are expected to bring time and cost savings to the drug discovery process. Computer-Aided Drug Design (CADD) has emerged as a fast alternative for bringing down the cost involved in discovering a new drug. In the past, numerous computer programs have been developed across the globe to assist researchers working in the field of drug discovery. Broadly, these programs can be classified into three categories: freeware, shareware, and commercial software. In this review, we describe freeware and open-source software that are commonly used for designing therapeutic molecules. Major emphasis is on software and web services in the field of chemo- or pharmaco-informatics, including in silico tools used for computing molecular descriptors, designing inhibitors against drug targets, building QSAR models, and predicting ADMET properties. PMID:23647540

  19. A polygon-surface reference Korean male phantom (PSRK-Man) and its direct implementation in Geant4 Monte Carlo simulation.

    PubMed

    Kim, Chan Hyeong; Jeong, Jong Hwi; Bolch, Wesley E; Cho, Kun-Woo; Hwang, Sung Bae

    2011-05-21

    Even though the hybrid phantom embodies both the anatomic reality of voxel phantoms and the deformability of stylized phantoms, it must be voxelized to be used in a Monte Carlo code for dose calculation or some imaging simulation, which incurs the inherent limitations of voxel phantoms. In the present study, a voxel phantom named VKH-Man (Visible Korean Human-Man), was converted to a polygon-surface phantom (PSRK-Man, Polygon-Surface Reference Korean-Man), which was then adjusted to the Reference Korean data. Subsequently, the PSRK-Man polygon phantom was directly, without any voxelization process, implemented in the Geant4 Monte Carlo code for dose calculations. The calculated dose values and computation time were then compared with those of HDRK-Man (High Definition Reference Korean-Man), a corresponding voxel phantom adjusted to the same Reference Korean data from the same VKH-Man voxel phantom. Our results showed that the calculated dose values of the PSRK-Man surface phantom agreed well with those of the HDRK-Man voxel phantom. The calculation speed for the PSRK-Man polygon phantom though was 70-150 times slower than that of the HDRK-Man voxel phantom; that speed, however, could be acceptable in some applications, in that direct use of the surface phantom PSRK-Man in Geant4 does not require a separate voxelization process. Computing speed can be enhanced, in future, either by optimizing the Monte Carlo transport kernel for the polygon surfaces or by using modern computing technologies such as grid computing and general-purpose computing on graphics processing units programming. PMID:21521906

  20. Computer software design description for the integrated control and data acquisition system LDUA system

    SciTech Connect

    Aftanas, B.L.

    1998-08-12

    This Computer Software Design Description (CSDD) document provides the overview of the software design for all the software that is part of the integrated control and data acquisition system of the Light Duty Utility Arm System (LDUA). It describes the major software components and how they interface. It also references the documents that contain the detailed design description of the components.

  1. [Software Design for a Portable Ultrasound Bone Densitometer].

    PubMed

    Deng, Jiangjun; Ding, Jie; Xu, Shijie; Geng, Ruihua; He, Aijun

    2015-10-01

    In order to meet the requirements of ultrasound bone density measurement, we designed software based on Visual C++ 2008. The software includes interface design, acquisition and control, data processing and parameter extraction, and data storage and printing. A well-designed human-computer interface (HCI) gives users a convenient experience. Automatic gain control (AGC) and digital filtering effectively improve measurement precision. In addition, waveforms can be observed clearly in real time. Using USB communication, the software sends control commands to the acquisition hardware and retrieves data efficiently, which shortens the measuring time. The speed of sound (SOS) and broadband ultrasound attenuation (BUA) are then calculated. Patients' information can be accessed via an XML document. Finally, the software offers a printing function. PMID:26964306
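    The two reported parameters, SOS and BUA, can be sketched with toy formulas: SOS as propagation path length over time of flight, and BUA as the least-squares slope of attenuation versus frequency over the broadband range. All numbers below are illustrative assumptions, not values from the paper.

```python
# Toy calculations of the two bone-density parameters named above:
# speed of sound (SOS) and broadband ultrasound attenuation (BUA).
def speed_of_sound(width_m, time_s):
    """SOS (m/s) from propagation path length and time of flight."""
    return width_m / time_s

def bua(freqs_mhz, atten_db):
    """BUA (dB/MHz): least-squares slope of attenuation vs. frequency."""
    n = len(freqs_mhz)
    fm = sum(freqs_mhz) / n
    am = sum(atten_db) / n
    num = sum((f - fm) * (a - am) for f, a in zip(freqs_mhz, atten_db))
    den = sum((f - fm) ** 2 for f in freqs_mhz)
    return num / den

print(speed_of_sound(0.04, 2.5e-5))  # 4 cm heel, 25 us transit -> ~1600 m/s
print(bua([0.2, 0.3, 0.4, 0.5, 0.6], [10.0, 15.0, 20.0, 25.0, 30.0]))  # ~50 dB/MHz
```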

  2. Automated Theorem Proving in High-Quality Software Design

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Swanson, Keith (Technical Monitor)

    2001-01-01

    The amount and complexity of software developed during the last few years has increased tremendously. In particular, programs are being used more and more in embedded systems (from car brakes to plant control). Many of these applications are safety-relevant, i.e., a malfunction of hardware or software can cause severe damage or loss. Tremendous risks are typically present in the areas of aviation, (nuclear) power plants, and (chemical) plant control, where even small problems can lead to thousands of casualties and huge financial losses. Large financial risks also exist when computer systems are used in the areas of telecommunication (telephone, electronic commerce) or space exploration. Computer applications in these areas are subject not only to safety considerations but also to security issues. All these systems must be designed and developed to guarantee high quality with respect to safety and security. Even in an industrial setting that is (or at least should be) aware of the high requirements in software engineering, many incidents occur. For example, the Warsaw Airbus crash was caused by an incomplete requirements specification, and uncontrolled reuse of an Ariane 4 software module was the reason for the Ariane 5 disaster. Some recent incidents in the telecommunication area, like illegal "cloning" of smart cards for D2 GSM handsets or the extraction of (secret) passwords from German T-Online users, show that serious flaws can also happen in this area. Due to the inherent complexity of computer systems, most authors claim that only a rigorous application of formal methods in all stages of the software life cycle can ensure high software quality and lead to truly safe and secure systems. In this paper, we examine to what extent automated theorem proving can contribute to a more widespread application of formal methods and their tools, and what automated theorem provers (ATPs) must provide in order to be useful.

  3. Critical Design Decisions of The Planck LFI Level 1 Software

    NASA Astrophysics Data System (ADS)

    Morisset, N.; Rohlfs, R.; Türler, M.; Meharga, M.; Binko, P.; Beck, M.; Frailis, M.; Zacchei, A.

    2010-12-01

    The PLANCK satellite, with two on-board instruments, a Low Frequency Instrument (LFI) and a High Frequency Instrument (HFI), was launched on May 14th, 2009, on an Ariane 5. The ISDC Data Centre for Astrophysics in Versoix, Switzerland has developed and maintains the Planck LFI Level 1 software for the Data Processing Centre (DPC) in Trieste, Italy. The main tasks of the Level 1 processing are to retrieve the daily available scientific and housekeeping (HK) data of the LFI instrument, the Sorption Cooler, and the 4K Cooler from the Mission Operation Centre (MOC) in Darmstadt; to sort them by time and by type (detector, observing mode, etc.); to extract the spacecraft attitude information from auxiliary files; to flag the data according to several criteria; and to archive the resulting Time Ordered Information (TOI), which is then used to produce maps of the sky in different spectral bands. The output of the Level 1 software is the set of TOI files in FITS format, later ingested into the Data Management Component (DMC) database. This software has been used during different phases of the LFI instrument development. We started by reusing some ISDC components for the LFI Qualification Model (QM) and then completely reworked the software for the Flight Model (FM). This was motivated by critical design decisions taken jointly with the DPC. The main questions were: a) the choice of the data format: FITS or DMC? b) the design of the pipelines: use of the Planck Process Coordinator (ProC) or a simple Perl script? c) do we adapt the existing QM software or restart from scratch? The timeline and available manpower were also important issues to be taken into account. We present here the orientation of our choices and discuss their pertinence based on the experience of the final pre-launch tests and the start of real Planck LFI operations.

  4. A requirements specification for a software design support system

    NASA Technical Reports Server (NTRS)

    Noonan, Robert E.

    1988-01-01

    Most existing software design systems (SDSS) support the use of only a single design methodology. A good SDSS should support a wide variety of design methods and languages, including structured design, object-oriented design, and finite state machines. It might seem that a multiparadigm SDSS would be expensive in both time and money to construct. However, it is proposed that instead an extensible SDSS be constructed that directly implements only minimal database and graphical facilities; in particular, it should not directly implement tools to facilitate language definition and analysis. It is believed that such a system could be rapidly developed and put into limited production use, with the experience gained used to refine and evolve the system over time.

  5. Application of existing design software to problems in neuronal modeling.

    PubMed

    Vranić-Sowers, S; Fleshman, J W

    1994-03-01

    In this communication, we describe the application of the Valid/Analog Design Tools circuit simulation package called PC Workbench to the problem of modeling the electrical behavior of neural tissue. A nerve cell representation as an equivalent electrical circuit using compartmental models is presented. Several types of nonexcitable and excitable membranes are designed, and simulation results for different types of electrical stimuli are compared to the corresponding analytical data. It is shown that the hardware/software platform and the models developed constitute an accurate, flexible, and powerful way to study neural tissue. PMID:8045583

  6. Requirements Management System Browser (RMSB) software design description

    SciTech Connect

    Frank, D.D.

    1996-09-30

    The purpose of this document is to provide an "as-built" design description for the Requirements Management System Browser (RMSB) application. The Graphical User Interface (GUI) and database structure design are described for the RMSB application, referred to as the "Browser." The RMSB application provides an easy-to-use PC-based interface for browsing systems engineering data stored and managed in a UNIX software application. The systems engineering data include functions, requirements, and architectures that make up the Tank Waste Remediation System (TWRS) technical baseline.

  7. Verifying Architectural Design Rules of the Flight Software Product Line

    NASA Technical Reports Server (NTRS)

    Ganesan, Dharmalingam; Lindvall, Mikael; Ackermann, Chris; McComas, David; Bartholomew, Maureen

    2009-01-01

    This paper presents experiences of verifying architectural design rules of the NASA Core Flight Software (CFS) product line implementation. The goal of the verification is to check whether the implementation is consistent with the CFS architectural rules derived from the developer's guide. The results indicate that consistency checking helps a) identifying architecturally significant deviations that were eluded during code reviews, b) clarifying the design rules to the team, and c) assessing the overall implementation quality. Furthermore, it helps connecting business goals to architectural principles, and to the implementation. This paper is the first step in the definition of a method for analyzing and evaluating product line implementations from an architecture-centric perspective.

  8. Design and implementation of an embedded software system for ATR

    NASA Astrophysics Data System (ADS)

    Wang, Yuehuan; Li, Shiyong

    2011-11-01

This paper presents the design and implementation of a coarse-grained, unbalanced, modularized parallel embedded software system for ATR. Based on the characteristics of ATR algorithms, the system comprises control modules such as system monitoring and task assignment, together with hierarchical algorithm modules, each built to different design principles. The task assignment module combines modules into clusters, based on which modules are mutually exclusive, and assigns the clusters to different processors; clusters are formed so as to minimize the variance of the load across processors. This task-assignment strategy allows the system to satisfy its real-time performance requirement while significantly improving flexibility and scalability.

  9. Autonomous robot vision software design using Matlab toolboxes

    NASA Astrophysics Data System (ADS)

    Tedder, Maurice; Chung, Chan-Jin

    2004-10-01

The purpose of this paper is to introduce a cost-effective way to design robot vision and control software using Matlab for an autonomous robot designed to compete in the 2004 Intelligent Ground Vehicle Competition (IGVC). The goal of the autonomous challenge event is for the robot to autonomously navigate an outdoor obstacle course bounded by solid and dashed lines on the ground. Visual input data is provided by a DV camcorder at 160 x 120 pixel resolution. The design of this system involved writing an image-processing algorithm using hue, saturation, and brightness (HSB) color filtering and Matlab image processing functions to extract the centroid, area, and orientation of the connected regions from the scene. These feature vectors are then mapped to linguistic variables that describe the objects in the world environment model. The linguistic variables act as inputs to a fuzzy logic controller designed using the Matlab fuzzy logic toolbox, which provides the knowledge and intelligence component necessary to achieve the desired goal. Java provides the central interface to the robot motion control and image acquisition components. Field test results indicate that the Matlab-based solution allows for rapid software design, development and modification of our robot system.
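The color-filtering and centroid-extraction step can be sketched in Python (the original system used Matlab's image processing toolbox; the thresholds and toy image below are invented for illustration):

```python
import colorsys

# Illustrative sketch of HSB (HSV) color filtering followed by centroid and
# area extraction. Saturation/brightness thresholds and the toy 3x3 image
# are invented, not taken from the original system.

def hsb_mask_centroid(image, hue_lo, hue_hi):
    """image: 2D list of (r, g, b) tuples in 0..1. Returns (area, cx, cy)."""
    xs, ys = [], []
    for y, row in enumerate(image):
        for x, (r, g, b) in enumerate(row):
            h, s, v = colorsys.rgb_to_hsv(r, g, b)
            if hue_lo <= h <= hue_hi and s > 0.3 and v > 0.3:
                xs.append(x)
                ys.append(y)
    if not xs:
        return 0, None, None
    return len(xs), sum(xs) / len(xs), sum(ys) / len(ys)

# A 3x3 toy frame, all white except a green pixel (hue ~ 1/3) at (1, 1):
img = [[(1, 1, 1)] * 3 for _ in range(3)]
img[1][1] = (0.0, 1.0, 0.0)
print(hsb_mask_centroid(img, 0.25, 0.45))  # -> (1, 1.0, 1.0)
```

The resulting (area, centroid) tuple is the kind of feature vector that the abstract describes mapping onto linguistic variables.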

  10. Efficacy of a Newly Designed Cephalometric Analysis Software for McNamara Analysis in Comparison with Dolphin Software

    PubMed Central

    Nouri, Mahtab; Hamidiaval, Shadi; Akbarzadeh Baghban, Alireza; Basafa, Mohammad; Fahim, Mohammad

    2015-01-01

Objectives: Cephalometric norms of McNamara analysis have been studied in various populations due to their optimal efficiency. Dolphin cephalometric software greatly facilitates this analysis for orthodontic measurements. However, Dolphin is very expensive and cannot be afforded by many clinicians in developing countries. A suitable alternative software program in Farsi/English would greatly help Farsi-speaking clinicians. The present study aimed to develop an affordable Iranian cephalometric analysis software program and compare it with Dolphin, the standard software available on the market for cephalometric analysis. Materials and Methods: In this diagnostic, descriptive study, 150 lateral cephalograms of normal occlusion individuals were selected in Mashhad and Qazvin, two major cities of Iran mainly populated with Fars ethnicity, the main Iranian ethnic group. After tracing the cephalograms, the McNamara analysis standards were measured both with Dolphin and the new software. The cephalometric software was designed using Microsoft Visual C++ under Windows XP. Measurements made with the new software were compared with those of Dolphin software on both series of cephalograms. The validity and reliability were tested using the intra-class correlation coefficient. Results: Calculations showed a very high correlation between the results of the Iranian cephalometric analysis software and Dolphin. This confirms the validity and optimal efficacy of the newly designed software (ICC 0.570–1.0). Conclusion: According to our results, the newly designed software has acceptable validity and reliability and can be used for orthodontic diagnosis, treatment planning and assessment of treatment outcome. PMID:26005455

  11. CARES/Life Software for Designing More Reliable Ceramic Parts

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Powers, Lynn M.; Baker, Eric H.

    1997-01-01

Products made from advanced ceramics show great promise for revolutionizing aerospace and terrestrial propulsion and power generation. However, ceramic components are difficult to design because brittle materials in general have widely varying strength values. The CARES/Life software eases this task by providing a tool to optimize the design and manufacture of brittle material components using probabilistic reliability analysis techniques. Probabilistic component design involves predicting the probability of failure for a thermomechanically loaded component from specimen rupture data. Typically, these experiments are performed using many simple geometry flexural or tensile test specimens. A static, dynamic, or cyclic load is applied to each specimen until fracture. Statistical strength and SCG (fatigue) parameters are then determined from these data. Using these parameters and the results obtained from a finite element analysis, the time-dependent reliability for a complex component geometry and loading is then predicted. Appropriate design changes are made until an acceptable probability of failure has been reached.

  12. Computational protein design: the Proteus software and selected applications.

    PubMed

    Simonson, Thomas; Gaillard, Thomas; Mignon, David; Schmidt am Busch, Marcel; Lopes, Anne; Amara, Najette; Polydorides, Savvas; Sedano, Audrey; Druart, Karen; Archontis, Georgios

    2013-10-30

We describe an automated procedure for protein design, implemented in a flexible software package, called Proteus. System setup and calculation of an energy matrix are done with the XPLOR modeling program and its sophisticated command language, supporting several force fields and solvent models. A second program provides algorithms to search sequence space. It allows a decomposition of the system into groups, which can be combined in different ways in the energy function, for both positive and negative design. The whole procedure can be controlled by editing 2-4 scripts. Two applications consider the tyrosyl-tRNA synthetase enzyme and its successful redesign to bind both O-methyl-tyrosine and D-tyrosine. For the latter, we present Monte Carlo simulations where the D-tyrosine concentration is gradually increased, displacing L-tyrosine from the binding pocket and yielding the binding free energy difference, in good agreement with experiment. Complete redesign of the Crk SH3 domain is presented. The top 10000 sequences are all assigned to the correct fold by the SUPERFAMILY library of Hidden Markov Models. Finally, we report the acid/base behavior of the SNase protein. Sidechain protonation is treated as a form of mutation; it is then straightforward to perform constant-pH Monte Carlo simulations, which yield good agreement with experiment. Overall, the software can be used for a wide range of applications, producing not only native-like sequences but also thermodynamic properties with errors that appear comparable to other current software packages. PMID:24037756
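The sequence-space search can be illustrated with a minimal Metropolis Monte Carlo sketch. The "energy" here (distance to an arbitrary target string) is a toy stand-in for the XPLOR-derived energy matrix described above:

```python
import math
import random

# Minimal Metropolis Monte Carlo over sequence space, in the spirit of the
# search program described above. The energy function is a toy: number of
# positions differing from an arbitrary target sequence.

random.seed(0)
AA = "ACDEFGHIKLMNPQRSTVWY"   # the 20 standard amino acids
TARGET = "MKT"                # toy ground-state sequence

def energy(seq):
    return sum(0.0 if a == b else 1.0 for a, b in zip(seq, TARGET))

def metropolis(seq, steps=2000, kT=0.1):
    """Propose single-residue mutations; accept with Metropolis criterion."""
    E = energy(seq)
    for _ in range(steps):
        pos = random.randrange(len(seq))
        trial = seq[:pos] + random.choice(AA) + seq[pos + 1:]
        dE = energy(trial) - E
        if dE <= 0 or random.random() < math.exp(-dE / kT):
            seq, E = trial, E + dE
    return seq, E

final_seq, final_E = metropolis("AAA")
print(final_seq, final_E)
```

At low kT the chain settles into the toy ground state; real runs would replace `energy` with lookups into the precomputed pairwise energy matrix.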

  13. Software design and operational model for the WCEDS prototype

    SciTech Connect

    Beiriger, J.I.; Moore, S.G.; Young, C.J.; Trujillo, J.R.

    1997-08-01

    To explore the potential of waveform correlation for CTBT, the Waveform Correlation Event Detection System (WCEDS) prototype was developed. The WCEDS software design followed the Object Modeling Technique process of analysis, system design, and detailed design and implementation. Several related executable programs are managed through a Graphical User Interface (GUI). The WCEDS prototype operates in an IDC/NDC-compatible environment. It employs a CSS 3.0 database as its primary input/output interface, reading in raw waveforms at the start, and storing origins, events, arrivals, and associations at the finish. Additional output includes correlation results and data for specified testcase origins, and correlation timelines for specified locations. During the software design process, the more general seismic monitoring functionality was extracted from WCEDS-specific requirements and developed into C++ object-oriented libraries. These include the master image, grid, basic seismic, and extended seismic libraries. Existing NDC and commercial libraries were incorporated into the prototype where appropriate, to focus development activities on new capability. The WCEDS-specific application code was built in a separate layer on top of the general seismic libraries. The general seismic libraries developed for the WCEDS prototype can provide a base for other algorithm development projects.

  14. Concepts and software for a rational design of polynucleotide probes.

    PubMed

    Moraru, Cristina; Moraru, Gabriel; Fuchs, Bernhard M; Amann, Rudolf

    2011-02-01

Fluorescence in situ hybridization (FISH) of genes and mRNA is most often based on polynucleotide probes. However, until now there has been no published framework for the rational design of polynucleotide probes. The well-established concepts for oligonucleotide probe design cannot be transferred to polynucleotides. Due to the high allele diversity of genes, a single probe is not sufficient to detect all alleles of a gene. Therefore, the main objective of this study was to develop a concept and software (PolyPro) for the rational design of polynucleotide probe mixes to target particular genes. PolyPro consists of three modules: a GenBank Taxonomy Extractor (GTE), a Polynucleotide Probe Designer (PPD) and a Hybridization Parameters Calculator (HPC). The new concept proposes the construction of defined polynucleotide mixes to target the habitat-specific sequence diversity of a particular gene. The concept and the software are intended as a first step towards a more frequent application of polynucleotides for in situ identification of mRNA and genes in environmental microbiology. PMID:23761233

  15. Investigating a computerized scaffolding software for student designed science investigations

    NASA Astrophysics Data System (ADS)

    Deters, Kelly M.

Science standards call for students to develop skills in designing their own investigations. However, this is a complex task that is likely to overload the working memory capacities of students, therefore requiring scaffolding. This study investigated the effects of a computerized scaffold for student-designed experiments. Students (N = 102) used the computer program to individually design an experiment during the third week of their high school general chemistry course. Students were randomly assigned to one of four software versions to determine the effects and interaction effects of backwards-design scaffolding and reflective prompts on laboratory report scores. Scaffolding the students in a backwards-design process led to significantly higher student performance scores for all students when they were not provided with reflective prompts (p = 0.01). For students labeled as academically advanced by their eighth grade science teacher, backwards design increased student performance scores with or without reflective prompts (p = 0.002). Using reflective prompts had no effect on advanced students. The use of multiple reflective prompts caused the effect of the backwards-design scaffolding to disappear for lower-level students.

  16. Systems biology driven software design for the research enterprise

    PubMed Central

    Boyle, John; Cavnor, Christopher; Killcoyne, Sarah; Shmulevich, Ilya

    2008-01-01

Background In systems biology, and many other areas of research, there is a need for the interoperability of tools and data sources that were not originally designed to be integrated. Due to the interdisciplinary nature of systems biology, and its association with high throughput experimental platforms, there is an additional need to continually integrate new technologies. As scientists work in isolated groups, integration with other groups is rarely a consideration when building the required software tools. Results We illustrate an approach, through the discussion of a purpose built software architecture, which allows disparate groups to reuse tools and access data sources in a common manner. The architecture allows for: the rapid development of distributed applications; interoperability, so it can be used by a wide variety of developers and computational biologists; development using standard tools, so that it is easy to maintain and does not require a large development effort; extensibility, so that new technologies and data types can be incorporated; and non-intrusive development, insofar as researchers need not adhere to a pre-existing object model. Conclusion By using a relatively simple integration strategy, based upon a common identity system and dynamically discovered interoperable services, a light-weight software architecture can become the focal point through which scientists can both get access to and analyse the plethora of experimentally derived data. PMID:18578887

  17. Software and Physics Simulation at Belle II

    NASA Astrophysics Data System (ADS)

Fulsom, Bryan; Belle II Collaboration

    2016-03-01

The Belle II experiment at the SuperKEKB collider in Tsukuba, Japan, will start taking physics data in 2018 and will accumulate 50 ab-1 of e+e- collision data, about 50 times larger than the data set of the earlier Belle experiment. The new detector will use GEANT4 for Monte Carlo simulation and an entirely new software and reconstruction system based on modern computing tools. Examples of physics simulation including beam background overlays will be described.

  18. Some Didactical and Epistemological Considerations in the Design of Educational Software: The Cabri-Euclide Example

    ERIC Educational Resources Information Center

    Luengo, Vanda

    2005-01-01

    We propose to use didactical theory for the design of educational software. Here we present a set of didactical conditions, and explain how they shape the software design of Cabri-Euclide, a microworld used to learn "mathematical proof" in a geometry setting. The aim is to design software that does not include a predefined knowledge of problem…

  19. Software Package Completed for Alloy Design at the Atomic Level

    NASA Technical Reports Server (NTRS)

    Bozzolo, Guillermo H.; Noebe, Ronald D.; Abel, Phillip B.; Good, Brian S.

    2001-01-01

    As a result of a multidisciplinary effort involving solid-state physics, quantum mechanics, and materials and surface science, the first version of a software package dedicated to the atomistic analysis of multicomponent systems was recently completed. Based on the BFS (Bozzolo, Ferrante, and Smith) method for the calculation of alloy and surface energetics, this package includes modules devoted to the analysis of many essential features that characterize any given alloy or surface system, including (1) surface structure analysis, (2) surface segregation, (3) surface alloying, (4) bulk crystalline material properties and atomic defect structures, and (5) thermal processes that allow us to perform phase diagram calculations. All the modules of this Alloy Design Workbench 1.0 (ADW 1.0) are designed to run in PC and workstation environments, and their operation and performance are substantially linked to the needs of the user and the specific application.

  20. De novo gene synthesis design using TmPrime software.

    PubMed

    Li, Mo-Huang; Bode, Marcus; Huang, Mo Chao; Cheong, Wai Chye; Lim, Li Shi

    2012-01-01

This chapter presents TmPrime, a computer program to design oligonucleotides for both ligase chain reaction (LCR)- and polymerase chain reaction (PCR)-based de novo gene synthesis. The program divides a long input DNA sequence based on user-specified melting temperatures and assembly conditions, and dynamically optimizes the length of oligonucleotides to achieve homologous melting temperatures. The output reports the melting temperatures, oligonucleotide sequences, and potential formation of secondary structures in a PDF file, which is sent to the user via e-mail. The program also provides functions for sequence pooling, to separate long genes into smaller pieces for multipool assembly, and codon optimization for expression based on the highest organism-specific codon frequency. This software has been successfully used in the design and synthesis of various genes with a total length >20 kbp. This program is freely available at http://prime.ibn.a-star.edu.sg. PMID:22328437
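The melting-temperature-driven division of a gene can be sketched with the simple Wallace rule (TmPrime itself uses a more sophisticated thermodynamic model; the target Tm and input sequence below are arbitrary):

```python
# Simplified sketch of the oligonucleotide-division idea: grow each oligo
# until its Wallace-rule melting temperature reaches a user target.
# TmPrime's actual thermodynamic model is more involved; the 2/4 rule
# below is only for illustration.

def wallace_tm(seq):
    """Wallace rule: Tm = 2*(A+T) + 4*(G+C), a rough rule for short oligos."""
    at = sum(seq.count(b) for b in "AT")
    gc = sum(seq.count(b) for b in "GC")
    return 2 * at + 4 * gc

def divide(seq, target_tm):
    """Split seq into consecutive oligos whose Tm just reaches target_tm."""
    oligos, start = [], 0
    for end in range(1, len(seq) + 1):
        if wallace_tm(seq[start:end]) >= target_tm or end == len(seq):
            oligos.append(seq[start:end])
            start = end
    return oligos

gene = "ATGGCGTTAACCGGATCCAAGTTTGGC"  # invented input sequence
print(divide(gene, 40))
```

A real design would additionally check overlap Tm homogeneity and secondary structure, as the abstract describes.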

  1. Physical models, cross sections, and numerical approximations used in MCNP and GEANT4 Monte Carlo codes for photon and electron absorbed fraction calculation

    SciTech Connect

    Yoriyaz, Helio; Moralles, Mauricio; Tarso Dalledone Siqueira, Paulo de; Costa Guimaraes, Carla da; Belonsi Cintra, Felipe; Santos, Adimir dos

    2009-11-15

Purpose: Radiopharmaceutical applications in nuclear medicine require a detailed dosimetry estimate of the radiation energy delivered to the human tissues. Over the past years, several publications addressed the problem of internal dose estimation in volumes of several sizes considering photon and electron sources. Most of them used Monte Carlo radiation transport codes. Despite the widespread use of these codes due to the variety of resources and potentials they offer to carry out dose calculations, several aspects like physical models, cross sections, and numerical approximations used in the simulations still remain an object of study. Accurate dose estimation depends on the correct selection of a set of simulation options that should be carefully chosen. This article presents an analysis of several simulation options provided by two of the most used codes worldwide: MCNP and GEANT4. Methods: For this purpose, comparisons of absorbed fraction estimates obtained with different physical models, cross sections, and numerical approximations are presented for spheres of several sizes composed of five different biological tissues. Results: Considerable discrepancies have been found in some cases not only between the different codes but also between different cross sections and algorithms in the same code. Maximum differences found between the two codes are 5.0% and 10% for photons and electrons, respectively. Conclusion: Even for problems as simple as spheres and uniform radiation sources, the set of parameters chosen by any Monte Carlo code significantly affects the final results of a simulation, demonstrating the importance of the correct choice of parameters in the simulation.
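The absorbed-fraction concept can be illustrated with a deliberately simplified Monte Carlo sketch: monoenergetic photons emitted at the centre of a sphere, exponential free paths, and full local absorption at the first interaction. Real codes such as MCNP and GEANT4 track scattering, secondaries, and full cross-section libraries; the attenuation coefficient and radius below are arbitrary illustrative values:

```python
import math
import random

# Toy Monte Carlo estimate of the photon absorbed fraction in a sphere:
# photons start at the centre, travel an exponentially distributed free
# path, and are fully absorbed at their first interaction point.

random.seed(1)

def absorbed_fraction(mu, radius, n=100_000):
    """Fraction of emitted photons interacting inside the sphere."""
    hits = sum(1 for _ in range(n)
               if -math.log(random.random()) / mu < radius)
    return hits / n

mu, R = 0.1, 10.0                 # cm^-1 and cm, invented values
est = absorbed_fraction(mu, R)
exact = 1 - math.exp(-mu * R)     # analytic result for this toy model
print(round(est, 3), round(exact, 3))
```

Even this toy model makes the article's point concrete: the "physics" chosen (here, absorb-on-first-interaction with a single mu) directly determines the absorbed fraction obtained.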

  2. Organ doses from hepatic radioembolization with 90Y, 153Sm, 166Ho and 177Lu: A Monte Carlo simulation study using Geant4

    NASA Astrophysics Data System (ADS)

    Hashikin, N. A. A.; Yeong, C. H.; Guatelli, S.; Abdullah, B. J. J.; Ng, K. H.; Malaroda, A.; Rosenfeld, A. B.; Perkins, A. C.

    2016-03-01

90Y-radioembolization is a palliative treatment for liver cancer. 90Y decays via beta emission, making imaging difficult due to the absence of gamma radiation. Since post-procedure imaging is crucial, several theranostic radionuclides have been explored as alternatives. However, exposure to gamma radiation throughout the treatment raises concern for the organs near the liver. A Geant4 Monte Carlo simulation using the MIRD Pamphlet 5 reference phantom was carried out. A spherical tumour with a 4.3 cm radius was modelled within the liver. 1.82 GBq of 90Y sources were isotropically distributed within the tumour, with no extrahepatic shunting. The simulation was repeated with 153Sm, 166Ho and 177Lu. The estimated tumour doses for all radionuclides were 262.9 Gy. A tumour dose equivalent to that of 1.82 GBq of 90Y can be achieved with 8.32, 5.83, and 4.44 GBq for 153Sm, 166Ho and 177Lu, respectively. Normal liver doses from the other radionuclides were lower than from 90Y, hence beneficial for normal tissue sparing. The organ doses from 153Sm and 177Lu were relatively higher due to higher gamma energy, but were still well below 1 Gy. 166Ho, 177Lu and 153Sm offer useful gamma emission for post-procedure imaging. They show potential as 90Y substitutes, delivering comparable tumour doses, lower normal liver doses, and doses to other organs far below the tolerance limit.
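The activity scaling reported above follows from a simple proportionality: if a radionuclide delivers a given tumour dose per unit administered activity, the activity needed to match a target dose is the ratio of the two. The dose-per-activity values below are back-calculated from the figures in the abstract, not independent data:

```python
# Back-of-envelope check of the reported activity scaling: activity needed
# to reach a target tumour dose is target_dose / (dose per GBq). The
# Gy/GBq values are implied by the abstract's own numbers.

target_dose = 262.9  # Gy, tumour dose delivered by 1.82 GBq of 90Y

dose_per_gbq = {           # implied tumour dose per unit activity (Gy/GBq)
    "90Y": 262.9 / 1.82,
    "153Sm": 262.9 / 8.32,
    "166Ho": 262.9 / 5.83,
    "177Lu": 262.9 / 4.44,
}

for nuc, d in dose_per_gbq.items():
    print(f"{nuc}: {target_dose / d:.2f} GBq to reach {target_dose} Gy")
```

This circularity is intentional: it simply makes explicit the dose-activity proportionality underlying the equivalent-activity figures quoted in the abstract.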

  3. Reducing the complexity of the software design process with object-oriented design

    NASA Technical Reports Server (NTRS)

    Schuler, M. P.

    1991-01-01

Designing software is a complex process. How object-oriented design (OOD), coupled with formalized documentation and tailored object diagraming techniques, can reduce the complexity of the software design process is described and illustrated. The described OOD methodology uses a hierarchical decomposition approach in which parent objects are decomposed into layers of lower level child objects. A method of tracking the assignment of requirements to design components is also included. Increases in the reusability, portability, and maintainability of the resulting products are also discussed. This method was built on a combination of existing technology, teaching experience, consulting experience, and feedback from design method users. The discussed concepts are applicable to hierarchical OOD processes in general. Emphasis is placed on improving the design process by documenting the details of the procedures involved and incorporating improvements into those procedures as they are developed.

  4. Learning & Personality Types: A Case Study of a Software Design Course

    ERIC Educational Resources Information Center

    Ahmed, Faheem; Campbell, Piers; Jaffar, Ahmad; Alkobaisi, Shayma; Campbell, Julie

    2010-01-01

The software industry has continued to grow over the past decade and there is now a need to provide education and hands-on training to students in various phases of the software life cycle. Software design is one of the vital phases of the software development cycle. Psychological theories assert that not everybody is fit for all kinds of tasks as…

  5. Pedagogy Embedded in Educational Software Design: Report of a Case Study.

    ERIC Educational Resources Information Center

    Hinostroza, J. Enrique; Mellar, Harvey

    2001-01-01

    Discussion of educational software focuses on a model of educational software that was derived from a case study of two elementary school teachers participating in a software design process. Considers human-computer interface, interaction, software browsing strategies, and implications for teacher training. (Author/LRW)

  6. The Educational Software Design and Evaluation for K-8: Oral and Dental Health Software

    ERIC Educational Resources Information Center

    Kabakci, Isil; Birinci, Gurkay; Izmirli, Serkan

    2007-01-01

The aim of this study is to report on the development of the software "Oral and Dental Health," which supplements the course of Science and Technology for K-8 students in the primary school curriculum, and to carry out an evaluation study of the software. This software has been prepared for educational purposes. In relation to the evaluation of…

  7. A software tool to design thermal barrier coatings

    NASA Technical Reports Server (NTRS)

    Petrus, Gregory; Ferguson, B. Lynn

    1995-01-01

This paper summarizes work completed for a NASA Phase 1 SBIR program which demonstrated the feasibility of developing a software tool to aid in the design of thermal barrier coating (TBC) systems. Toward this goal, three tasks were undertaken and completed. Task 1 involved the development of a database containing the pertinent thermal and mechanical property data for the top coat, bond coat and substrate materials that comprise a TBC system. Task 2 involved the development of an automated set-up program for generating two dimensional (2D) finite element models of TBC systems. Most importantly, Task 3 involved the generation of a rule base to aid in the design of a TBC system. These rules were based on a factorial design of experiments involving FEM results and were generated using a Yates analysis. A previous study had indicated the suitability and benefit of applying finite element analysis to perform computer-based experiments to decrease, but not eliminate, physical experiments on TBCs. This program proved feasibility by expanding on these findings, developing a larger knowledge base and developing a procedure to extract rules to aid in TBC design.

  8. A software tool to design thermal barrier coatings

    NASA Technical Reports Server (NTRS)

    Petrus, G.; Ferguson, B. L.

    1995-01-01

This paper summarizes work completed for a NASA Phase 1 SBIR program which demonstrated the feasibility of developing a software tool to aid in the design of thermal barrier coating (TBC) systems. Toward this goal, three tasks were undertaken and completed. Task 1 involved the development of a database containing the pertinent thermal and mechanical property data for the top coat, bond coat and substrate materials that comprise a TBC system. Task 2 involved the development of an automated set-up program for generating two dimensional (2D) finite element models of TBC systems. Most importantly, Task 3 involved the generation of a rule base to aid in the design of a TBC system. These rules were based on a factorial design of experiments involving FEM results, and were generated using a Yates analysis. A previous study had indicated the suitability and benefit of applying finite element analysis to perform computer-based experiments to decrease, but not eliminate, physical experiments on TBCs. This program proved feasibility by expanding on these findings, developing a larger knowledge base and developing a procedure to extract rules to aid in TBC design.
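The Yates analysis mentioned above processes the responses of a 2^k factorial experiment by repeated pairwise sums and differences over the runs listed in standard order; a minimal sketch with invented observations:

```python
import math

# Yates's algorithm for a 2^k factorial design. Responses must be listed
# in standard (Yates) order, e.g. (1), a, b, ab for k = 2. The observed
# values below are invented for illustration.

def yates(responses):
    """Return the contrast column: [grand total, A, B, AB, ...]."""
    k = int(math.log2(len(responses)))
    col = list(responses)
    for _ in range(k):
        sums = [col[i] + col[i + 1] for i in range(0, len(col), 2)]
        diffs = [col[i + 1] - col[i] for i in range(0, len(col), 2)]
        col = sums + diffs
    return col

obs = [10.0, 14.0, 9.0, 17.0]        # (1), a, b, ab
contrasts = yates(obs)
effects = [c / 2 for c in contrasts[1:]]  # divide by 2^(k-1) for effects
print(contrasts, effects)  # contrasts [50, 12, 2, 4]; effects A=6, B=1, AB=2
```

For these observations the main effect of A is 6, of B is 1, and the AB interaction is 2, which matches direct computation from the treatment means.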

  9. Process of videotape making: presentation design, software, and hardware

    NASA Astrophysics Data System (ADS)

    Dickinson, Robert R.; Brady, Dan R.; Bennison, Tim; Burns, Thomas; Pines, Sheldon

    1991-06-01

The use of technical video tape presentations for communicating abstractions of complex data is now becoming commonplace. While the use of video tapes in the day-to-day work of scientists and engineers is still in its infancy, their use at applications-oriented conferences is now growing rapidly. Despite these advancements, there is still very little written down about the process of making technical videotapes. For printed media, distinct presentation styles are well known for categories such as results reports, executive summary reports, and technical papers and articles. In this paper, the authors present ideas on technical videotape presentation design in a format worth referring to. They have started to document the ways in which the experience of media specialists, teaching professionals, and character animators can be applied to scientific animation. Software and hardware considerations are also discussed. For this portion, distinctions are drawn between the software and hardware required for computer animation (frame-at-a-time) productions and live recorded interaction with a computer graphics display.

  10. Designing multistatic ultrasound imaging systems using software analysis

    NASA Astrophysics Data System (ADS)

    Lee, Michael; Singh, Rahul S.; Culjat, Martin O.; Stubbs, Scott; Natarajan, Shyam; Brown, Elliott R.; Grundfest, Warren S.; Lee, Hua

    2010-03-01

This paper describes a method of using the finite-element analysis software PZFlex to direct the design of a novel ultrasound imaging system which uses conformal transducer arrays. Current challenges in ultrasound array technology, including 2D array processing, have motivated exploration into new data acquisition and reconstruction techniques. Ultimately, these efforts encourage a broader examination of the processes used to effectively validate new array configurations and image formation procedures. Commercial software available today is capable of efficiently and accurately modeling detailed operational aspects of customized arrays. Combining quality simulated data with prototyped reconstruction techniques presents a valuable tool for testing novel schemes before committing more costly resources. To investigate this practice, we modeled three 1D ultrasound arrays operating multistatically instead of in the conventional phased-array mode: a simple linear array, a half-circle array with 180-degree coverage, and a full circular array for inward imaging. We present the process used to create unique array models in PZFlex, simulate operation and obtain data, and subsequently generate images by feeding the data into a reconstruction algorithm in MATLAB. Further discussion describes the tested reconstruction algorithm and includes resulting images.
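The multistatic reconstruction stage can be sketched as delay-and-sum imaging: every transmit/receive element pair contributes the sample recorded at the round-trip travel time for each candidate pixel. The geometry, sound speed, and idealized impulse "recordings" below are contrived for illustration and do not reproduce the paper's MATLAB algorithm:

```python
import math

# Toy delay-and-sum sketch of multistatic image formation: each tx/rx pair
# records one ideal echo time from a point target; a pixel is "bright" when
# its round-trip times match the recorded echoes.

C = 1500.0                                       # m/s, speed of sound
elements = [(x * 0.01, 0.0) for x in range(5)]   # 5-element linear array
target = (0.02, 0.03)                            # invented point target (m)

def tof(a, b):
    """One-way time of flight between two points."""
    return math.dist(a, b) / C

# Simulated multistatic data: one ideal echo time per (tx, rx) pair.
echoes = {(i, j): tof(elements[i], target) + tof(target, elements[j])
          for i in range(5) for j in range(5)}

def brightness(pixel, tol=1e-7):
    """Sum matched contributions over all tx/rx pairs for one pixel."""
    return sum(1.0 for (i, j), t in echoes.items()
               if abs(tof(elements[i], pixel) + tof(pixel, elements[j]) - t) < tol)

print(brightness(target), brightness((0.04, 0.05)))  # 25 pairs focus on target
```

With 5 elements there are 25 transmit/receive pairs, all of which reinforce at the true target location and none elsewhere in this idealized, noise-free setting.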

  11. 6-MV photon beam modeling for the Varian Clinac iX by using the Geant4 virtual jaw

    NASA Astrophysics Data System (ADS)

    Kim, Byung Yong; Kim, Hyung Dong; Kim, Dong Ho; Baek, Jong Geun; Moon, Su Ho; Rho, Gwang Won; Kang, Jeong Ku; Kim, Sung Kyu

    2015-07-01

Most virtual source models (VSMs) use beam modeling for all treatment-head components except the patient-dependent secondary collimator (jaw). Unlike other components of the treatment head, the jaw absorbs many photons generated by bremsstrahlung, which decreases the efficiency of the simulation. In the present study, a new method of beam modeling using a virtual jaw was applied to improve the calculation efficiency of the VSM. This new beam model was designed so that no interactions are generated in the jaw. The results for the percentage depth dose and the profile of the virtual-jaw VSM calculated in a homogeneous water phantom agreed with the measurement results for the CC13 cylinder-type ion chamber to within an error of 2%, and the 80-20% penumbra width agreed with the measurement results to within an error of 0.6 mm. Compared with the existing VSM, in which a great number of photons are absorbed, the calculation efficiency of the VSM using the virtual jaw is expected to increase by approximately 67%.

  12. Designing a Software Tool for Fuzzy Logic Programming

    NASA Astrophysics Data System (ADS)

    Abietar, José M.; Morcillo, Pedro J.; Moreno, Ginés

    2007-12-01

Fuzzy Logic Programming is an interesting and still growing research area that brings together efforts to introduce fuzzy logic into logic programming (LP), in order to incorporate more expressive resources into such languages for dealing with uncertainty and approximate reasoning. The multi-adjoint logic programming approach is a recent and extremely flexible fuzzy logic paradigm for which, unfortunately, we have not found practical tools implemented so far. In this work, we describe a prototype system which is able to directly translate fuzzy logic programs into Prolog code in order to safely execute these residual programs inside any standard Prolog interpreter in a completely transparent way for the final user. We think that the development of such fuzzy languages and programming tools might play an important role in the design of advanced software applications for computational physics, chemistry, mathematics, medicine, industrial control and so on.
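The flavour of the approach can be hinted at with a tiny sketch: fuzzy facts carry truth degrees, and a weighted rule is evaluated with the product t-norm. The program, degrees, and weight below are invented, and this is far simpler than the multi-adjoint translation scheme the paper describes:

```python
# Toy fuzzy-logic-programming sketch: facts annotated with truth degrees,
# one weighted rule evaluated with the product t-norm. All names and
# degrees are invented for illustration.

facts = {"fast(car1)": 0.8, "cheap(car1)": 0.6}

def rule_good(x, weight=0.9):
    """good(X) <- fast(X) AND cheap(X), with implication weight 0.9."""
    body = facts[f"fast({x})"] * facts[f"cheap({x})"]  # product t-norm
    return weight * body

print(rule_good("car1"))  # 0.9 * 0.8 * 0.6
```

A multi-adjoint system generalizes this by letting each rule pair its own conjunction and implication operators; the translation to Prolog threads the truth degree through as an extra argument.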

  13. Design of Timing Synchronization Software on EAST-NBI

    NASA Astrophysics Data System (ADS)

    Zhao, Yuanzhe; Hu, Chundong; Sheng, Peng; Zhang, Xiaodan

    2013-12-01

To ensure the uniqueness and recognizability of data, and to make it easy to analyze and process the data of all subsystems of the neutral beam injector (NBI), all subsystems must share a unified system time. In this paper, the timing synchronization software is presented; it draws on several technologies, including shared memory, multithreading, and the TCP protocol. Shared memory lets the server store client information and the system time, while multithreading serves different clients in different threads. The server runs under Linux; the clients run under either Linux or Windows. With this design, all subsystems can be synchronized to within one second, an accuracy sufficient for the NBI system, and the reliability of the data is thus ensured.
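A client in such a design could estimate its offset from the server with an NTP-style four-timestamp exchange over TCP; a sketch with invented timestamps (the paper does not specify this particular algorithm):

```python
# NTP-style clock offset estimation from a request/response exchange.
# The timestamps below are invented: server clock 0.4 s ahead of the
# client, 0.1 s network delay each way.

def clock_offset(t1, t2, t3, t4):
    """t1: client send, t2: server receive, t3: server send, t4: client
    receive. Returns (offset of server vs client, round-trip delay)."""
    offset = ((t2 - t1) + (t3 - t4)) / 2
    delay = (t4 - t1) - (t3 - t2)
    return offset, delay

off, d = clock_offset(t1=10.0, t2=10.5, t3=10.6, t4=10.3)
print(off, d)  # approximately 0.4 s offset, 0.2 s round trip
```

The offset formula cancels the network delay as long as the two directions are symmetric, which is ample for the sub-second accuracy the design targets.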

  14. Design of Timing System Software on EAST-NBI

    NASA Astrophysics Data System (ADS)

    Zhao, Yuan-Zhe; Hu, Chun-Dong; Sheng, Peng; Zhang, Xiao-Dan; Wu, De-Yun; Cui, Qing-Long

    2013-10-01

    The Neutral Beam Injector (NBI) is one of the main plasma heating and plasma current drive methods for the Experimental Advanced Superconducting Tokamak (EAST). The control system is designed to monitor the NBI experiment, control all the power supplies, and realize data acquisition and networking. As an important part of the NBI control system, the timing system (TS) provides a unified clock for all subsystems of NBI and controls the input/output services of digital and analog signals. It sends feedback messages to the control server, providing the alarm and interlock protection functions. The TS software runs on a Windows system, is written in LabVIEW, and uses a client/server mode, multithreading and cyclic redundancy check technology. The experimental results have proved that the TS provides a stable and reliable clock to the subsystems of NBI and contributes to the safety of the whole NBI system.

  15. Software considerations in the design of an image archive

    NASA Astrophysics Data System (ADS)

    Seshadri, Sridhar B.; Kishore, Sheel; Khalsa, Satjeet S.; Stevens, John F.; Arenson, Ronald L.

    1990-08-01

    The Radiology Department at the Hospital of the University of Pennsylvania is currently expanding its prototype Picture Archiving and Communications System (PACS) into a fully functional clinical system. The first phase of this expansion involves three major efforts: the upgrade of the 10-Mbit token-ring to an 80-Mbit backbone with associated sub-nets, the implementation of a large-scale image archive, and an interface between the PACS and the Department's Radiology Information System. Upon the completion of this phase, the PACS will serve the storage and display needs of four MRI scanners and four of the Hospital's Intensive Care Units. This paper addresses the implementation of a software suite designed to duplicate and enhance conventional Film Library functions on a PACS. The structure of an electronic 'folder' based upon the ACR/NEMA Digital Imaging and Communication Standard is also introduced.

  16. Software Would Largely Automate Design of Kalman Filter

    NASA Technical Reports Server (NTRS)

    Chuang, Jason C. H.; Negast, William J.

    2005-01-01

    Embedded Navigation Filter Automatic Designer (ENFAD) is a computer program being developed to automate the most difficult tasks in designing embedded software to implement a Kalman filter in a navigation system. The most difficult tasks are selection of the error states of the filter and tuning of filter parameters, which are time-consuming trial-and-error tasks that require expertise and rarely yield optimum results. An optimum selection of error states and filter parameters depends on navigation-sensor and vehicle characteristics, and on filter processing time. ENFAD would include a simulation module that would incorporate all possible error states with respect to a given set of vehicle and sensor characteristics. The first of two iterative optimization loops would vary the selection of error states until the best filter performance was achieved in Monte Carlo simulations. For a fixed selection of error states, the second loop would vary the filter parameter values until an optimal performance value was obtained. Design constraints would be satisfied in the optimization loops. Users would supply vehicle and sensor test data that would be used to refine digital models in ENFAD. Filter processing time and filter accuracy would be computed by ENFAD.
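    The described two-loop structure can be sketched as nested searches: an outer loop over candidate error-state selections and an inner loop over filter tuning parameters. Everything below is hypothetical (state names, the single tuning parameter `q`, and the stand-in cost function); ENFAD's real metric would come from Monte Carlo filter runs:

```python
import itertools

CANDIDATE_STATES = ["gyro_bias", "accel_bias", "clock_drift"]

def monte_carlo_cost(states, q):
    # Stand-in for a Monte Carlo filter-performance metric: pretend each
    # modeled error state helps, with a tuning-dependent penalty.
    return (3 - len(states)) + (q - 0.5) ** 2

best = None
for r in range(1, len(CANDIDATE_STATES) + 1):
    for states in itertools.combinations(CANDIDATE_STATES, r):  # outer loop: error states
        for q in [0.1, 0.3, 0.5, 0.7, 0.9]:                     # inner loop: tuning
            cost = monte_carlo_cost(states, q)
            if best is None or cost < best[0]:
                best = (cost, states, q)

print(best[1], best[2])  # all three states modeled, q = 0.5
```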

  17. Orion Relative Navigation Flight Software Analysis and Design

    NASA Technical Reports Server (NTRS)

    D'Souza, Chris; Christian, John; Zanetti, Renato

    2011-01-01

    The Orion Relative Navigation System has sought to take advantage of the latest developments in sensor and algorithm technology while living under the constraints of mass, power, volume, and throughput. In particular, the only sensor specifically designed for relative navigation is the Vision Navigation System (VNS), a lidar-based sensor. But the system also uses the Star Trackers, GPS (when available) and IMUs, which are part of the overall Orion navigation sensor suite, to produce a relative state accurate enough to dock with the ISS. The Orion Relative Navigation System has significantly matured as the program has evolved from the design phase to the flight software implementation phase. With the development of the VNS and the STORRM flight test of the Orion Relative Navigation hardware, much of the performance of the system will be characterized before the first flight. However, challenges abound, not the least of which is the elimination of the RF range and range-rate system, along with the development of the FSW in the Matlab/Simulink/Stateflow environment. This paper will address the features of and rationale for the Orion Relative Navigation design, the performance of the FSW in a 6-DOF environment, and the initial results of the hardware performance from the STORRM flight.

  18. NASA software specification and evaluation system design, part 1

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The research to develop methods for reducing the effort expended in software development and verification is reported. The development of a formal software requirements methodology, a formal specifications language, a programming language, a language preprocessor, and code analysis tools are discussed.

  19. Technical Data Exchange Software Tools Adapted to Distributed Microsatellite Design

    NASA Astrophysics Data System (ADS)

    Pache, Charly

    2002-01-01

    One critical issue concerning the distributed design of satellites is the collaborative work it requires. In particular, the exchange of data between the groups responsible for each subsystem can be complex and very time-consuming. The goal of this paper is to present a collaborative design tool, the SSETI Design Model (SDM), specifically developed for enabling satellite distributed design. SDM is actually used in the ongoing Student Space Exploration & Technology Initiative (SSETI) (www.sseti.net). SSETI is led by the European Space Agency (ESA) outreach office (http://www.estec.esa.nl/outreach), involving student groups from all over Europe in the design, construction and launch of a microsatellite. The first part of this paper presents the current version of the SDM tool, a collection of linked Microsoft Excel worksheets, one for each subsystem. An overview of the project framework/structure is given, explaining the different actors, the flows between them, as well as the different types of data and the links - formulas - between data sets. Unified Modeling Language (UML) diagrams give an overview of the different parts. Then the SDM's functionalities, developed in VBA (Visual Basic for Applications) scripts, are introduced, as well as the interactive features, user interfaces and administration tools. The second part discusses the capabilities and limitations of SDM's current version. Taking these capabilities and limitations into account, the third part outlines the next version of SDM, a web-oriented, database-driven evolution of the current version. This new approach will enable real-time data exchange and processing between the different actors of the mission. Comprehensive UML diagrams will guide the audience through the entire modeling process of such a system. Trade-off simulation capabilities, security, reliability, hardware and software issues will also be thoroughly discussed.

  20. Digital Modeling in Design Foundation Coursework: An Exploratory Study of the Effectiveness of Conceptual Design Software

    ERIC Educational Resources Information Center

    Guidera, Stan; MacPherson, D. Scot

    2008-01-01

    This paper presents the results of a study that was conducted to identify and document student perceptions of the effectiveness of computer modeling software introduced in a design foundations course that had previously utilized only conventional manually-produced representation techniques. Rather than attempt to utilize a production-oriented CAD…

  1. Validation of GEANT4 simulations for percentage depth dose calculations in heterogeneous media by using small photon beams from the 6-MV Cyberknife: Comparison with photon beam dosimetry with EBT2 film

    NASA Astrophysics Data System (ADS)

    Lee, Chung Il; Yoon, Sei-Chul; Shin, Jae Won; Hong, Seung-Woo; Suh, Tae Suk; Min, Kyung Joo; Lee, Sang Deok; Chung, Su Mi; Jung, Jae-Yong

    2015-04-01

    Percentage depth dose (PDD) distributions in heterogeneous phantoms with lung and soft-bone equivalent media are studied by using the GEANT4 Monte Carlo code. For the lung equivalent medium, Balsa wood is used, and for the soft-bone equivalent medium, a compound material of epoxy resin, hardener and calcium carbonate is used. Polystyrene slabs put together with these materials are used as a heterogeneous phantom. Dose measurements are performed with Gafchromic EBT2 film by using photon beams from the 6-MV CyberKnife at the Seoul Uridul Hospital, with cone sizes of 5, 10, and 30 mm. When the Balsa wood is inserted in the phantom, the dose measured with EBT2 film is found to be significantly different from the dose without the EBT2 film, both in and beyond the Balsa wood region, particularly for small field sizes. On the other hand, when the soft-bone equivalent material is inserted in the phantom, the discrepancy between the dose measured with EBT2 film and the dose without EBT2 film is seen only in the region of the soft-bone equivalent material. GEANT4 simulations are done with and without the EBT2 film to compare the simulation results with the measurements. The GEANT4 simulations including the EBT2 film are found to agree well with the measurements in all cases, within an error of 2.2%. The results of the present study show that GEANT4 gives reasonable results for PDD calculations in heterogeneous media when using photon beams produced by the 6-MV CyberKnife.
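    For readers outside dosimetry: PDD is the dose at a given depth on the beam axis, normalized to the maximum axial dose and expressed as a percentage. A minimal computation from a sampled depth-dose curve (the values below are made up for illustration, not from the paper):

```python
# Dose versus depth along the beam axis, arbitrary units.
doses = [0.60, 0.95, 1.00, 0.92, 0.80, 0.65]

d_max = max(doses)                          # dose at the depth of maximum dose
pdd = [100.0 * d / d_max for d in doses]    # percentage depth dose at each sample
print(pdd[4])  # 80.0
```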

  2. On the design of multimedia software and future system architectures

    NASA Astrophysics Data System (ADS)

    de With, Peter H. N.; Jaspers, Egbert G.

    2004-04-01

    A principal challenge for reducing the cost of designing complex systems-on-chip is to pursue more generic systems for a broad range of products. For this purpose, we explore three new architectural concepts for state-of-the-art video applications. First, we discuss a reusable scalable hardware architecture employing a hierarchical communication network fitting the natural hierarchy of the application. In a case study, we show that MPEG streaming in DTV occurs at a high level, while subsystems communicate at lower levels. The second concept is a software design that scales over a number of processors to enable reuse over a range of VLSI process technologies. We explore this via an H.264 decoder implementation scaling nearly linearly over up to eight processors by applying data partitioning. The third topic is resource scalability, which is required to satisfy real-time constraints in a system with a high amount of shared resources. An example complexity-scalable MPEG-2 coder scales the required cycle budget by a factor of three, in parallel with a smooth degradation of quality.

  3. Integrated Software for Analyzing Designs of Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Philips, Alan D.

    2003-01-01

    Launch Vehicle Analysis Tool (LVA) is a computer program for preliminary design structural analysis of launch vehicles. Before LVA was developed, in order to analyze the structure of a launch vehicle, it was necessary to estimate its weight, feed this estimate into a program to obtain pre-launch and flight loads, then feed these loads into structural and thermal analysis programs to obtain a second weight estimate. If the first and second weight estimates differed, it was necessary to reiterate these analyses until the solution converged. This process generally took six to twelve person-months of effort. LVA incorporates a text-to-structural-layout converter and subprograms for configuration drawing, mass properties generation, pre-launch and flight loads analysis, loads output plotting, direct-solution structural analysis, and thermal analysis. These subprograms are integrated in LVA so that solutions can be iterated automatically. LVA incorporates expert-system software that makes fundamental design decisions without intervention by the user. It also includes unique algorithms based on extensive research. The total integration of analysis modules drastically reduces the need for interaction with the user. A typical solution can be obtained in 30 to 60 minutes. Subsequent runs can be done in less than two minutes.

  4. Efficiency corrections in determining the (137)Cs inventory of environmental soil samples by using relative measurement method and GEANT4 simulations.

    PubMed

    Li, Gang; Liang, Yongfei; Xu, Jiayun; Bai, Lixin

    2015-08-01

    The determination of (137)Cs inventory is widely used to estimate soil erosion or deposition rates. The method generally used to determine the activity of volumetric samples is the relative measurement method, which employs a calibration standard sample with accurately known activity. This method has great advantages in accuracy and operation only when there is a small difference in elemental composition, sample density and geometry between the measured samples and the calibration standard; otherwise it needs additional efficiency corrections in the calculation process. Monte Carlo simulations can handle these correction problems easily, with lower financial cost and higher accuracy. This work presents a detailed description of the simulation and calibration procedure for a conventionally used commercial P-type coaxial HPGe detector with cylindrical sample geometry. The effects of sample elemental composition, density and geometry were discussed in detail and calculated in terms of efficiency correction factors. The effect of sample placement was also analyzed; the results indicate that the radioactive nuclides and sample density are not absolutely uniformly distributed along the axial direction. Finally, a unified binary quadratic functional relationship for the efficiency correction factors as a function of sample density and height was obtained by the least-squares fitting method. This function covers the sample density and height ranges of 0.8-1.8 g/cm(3) and 3.0-7.25 cm, respectively. The efficiency correction factors calculated by the fitted function are in good agreement with those obtained by the GEANT4 simulations, with a determination coefficient greater than 0.9999. The results obtained in this paper make the above-mentioned relative measurements more accurate and efficient in the routine radioactive analysis of environmental cylindrical soil samples. PMID:25973538
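    A binary quadratic fit of this kind is a standard linear least-squares problem. The sketch below is generic and hedged: the data are synthetic and the coefficients illustrative (not the paper's), but the density and height ranges match those quoted above:

```python
import numpy as np

rng = np.random.default_rng(0)
rho = rng.uniform(0.8, 1.8, 50)      # sample density, g/cm^3 (the paper's range)
h = rng.uniform(3.0, 7.25, 50)       # sample height, cm (the paper's range)
# Synthetic "true" correction factors plus a little measurement noise:
true = 1.0 + 0.3*rho - 0.1*h + 0.05*rho**2 + 0.02*h**2 + 0.01*rho*h
f = true + rng.normal(0, 1e-4, 50)

# Design matrix for f = c0 + c1*rho + c2*h + c3*rho^2 + c4*h^2 + c5*rho*h
A = np.column_stack([np.ones_like(rho), rho, h, rho**2, h**2, rho*h])
coef, *_ = np.linalg.lstsq(A, f, rcond=None)

# Coefficient of determination, as the paper reports (R^2 > 0.9999):
r2 = 1 - np.sum((A @ coef - f)**2) / np.sum((f - f.mean())**2)
print(r2 > 0.9999)
```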

  5. Benchmarking the Geant4 full system simulation of an associated alpha-particle detector for use in a D-T neutron generator.

    PubMed

    Zhang, Xiaodong; Hayward, Jason P; Cates, Joshua W; Hausladen, Paul A; Laubach, Mitchell A; Sparger, Johnathan E; Donnald, Samuel B

    2012-08-01

    The position-sensitive alpha-particle detector used to provide the starting time and initial direction of D-T neutrons in a fast-neutron imaging system was simulated with a Geant4-based Monte Carlo program. The whole detector system, which consists of a YAP:Ce scintillator, a fiber-optic faceplate, a light guide, and a position-sensitive photomultiplier tube (PSPMT), was modeled, starting with incident D-T alphas. The scintillation photons, whose starting times follow the distribution of a scintillation decay curve, were produced and emitted uniformly into a solid angle of 4π along the track segments of the alpha and its secondaries. Through tracking all photons and taking into account the quantum efficiency of the photocathode, the number of photoelectrons and their time and position distributions were obtained. Using a four-corner data reconstruction formula, the flood images of the alpha detector with and without optical grease between the YAP scintillator and the fiber-optic faceplate were obtained, which show agreement with the experimental results. The reconstructed position uncertainties of incident alpha particles for the two cases are 1.198 mm and 0.998 mm, respectively, across the sensitive area of the detector. Simulation results also show that, compared with faceplates composed of 500 μm, 300 μm, and 100 μm fibers, the 10-μm-fiber faceplate is the best choice for building a detector with better position performance. In addition, the study of the background originating inside the D-T generator suggests that for 500-μm-thick YAP:Ce coated with 1-μm-thick aluminum, a very good signal-to-noise ratio can be expected through application of a simple threshold. PMID:22728838
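    A hedged sketch of a four-corner (Anger-logic) position reconstruction, the kind of formula applied to the four PSPMT corner signals; the exact corner assignment and normalization conventions vary between devices, so the one below is illustrative rather than the paper's:

```python
def four_corner(a, b, c, d):
    """Reconstruct (x, y) from charges a, b, c, d on the four corner anodes.
    Convention assumed here: a = top-left, b = top-right,
    c = bottom-left, d = bottom-right."""
    s = a + b + c + d
    x = ((b + d) - (a + c)) / s   # right minus left, charge-weighted
    y = ((a + b) - (c + d)) / s   # top minus bottom, charge-weighted
    return x, y

# An event depositing more charge on the top-right anode reconstructs
# toward positive x and positive y:
x, y = four_corner(1.0, 3.0, 1.0, 1.0)
print(x, y)
```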

  6. SU-E-T-519: Emission of Secondary Particles From a PMMA Phantom During Proton Irradiation: A Simulation Study with the Geant4 Monte Carlo Toolkit

    SciTech Connect

    Lau, A; Chen, Y; Ahmad, S

    2014-06-01

    Purpose: Proton therapy exhibits several advantages over photon therapy due to the depth-dose distributions of proton interactions within the target material. However, uncertainties associated with the proton beam range in the patient limit the advantage of proton therapy applications. To quantify beam range, positron-emitting nuclei (PEN) and prompt gamma (PG) techniques have been developed. These techniques use de-excitation photons to describe the location of the beam in the patient. To develop a detector system implementing the PG technique for range verification in proton therapy, we studied the yields, energy and angular distributions of the secondary particles emitted from a PMMA phantom. Methods: Proton pencil beams of various energies incident onto a PMMA phantom with dimensions of 5 x 5 x 50 cm3 were simulated with the Geant4 toolkit using the standard electromagnetic packages as well as the packages based on the binary-cascade nuclear model, and the emitted secondary particles were analyzed. Results: For 160 MeV incident protons, the yields of secondary neutrons and photons per 100 incident protons were ~6 and ~15, respectively. The secondary photon energy spectrum showed several energy peaks in the range between 0 and 10 MeV. The energy peaks located between 4 and 6 MeV were attributed to direct proton interactions with 12C (~4.4 MeV) and 16O (~6 MeV), respectively. Most of the escaping secondary neutrons were found to have energies between 10 and 100 MeV. Isotropic emission was found for lower-energy neutrons (<10 MeV) and for photons of all energies, while higher-energy neutrons were emitted predominantly in the forward direction. The yields of emitted photons and neutrons increased with increasing incident proton energy. Conclusions: A detector system is currently being developed incorporating the yields, energy and angular distributions of secondary particles from proton interactions obtained from this study.

  7. SU-E-T-290: Secondary Dose Monitoring Using Scintillating Fibers in Proton Therapy of Prostate Cancer: A Geant4 Monte Carlo Simulation

    SciTech Connect

    Tesfamicael, B; Gueye, P; Lyons, D; Avery, S; Mahesh, M

    2014-06-01

    Purpose: To monitor the secondary dose distribution originating from a water phantom during proton therapy of prostate cancer using scintillating fibers. Methods: The Geant4 Monte Carlo toolkit version 9.6.p02 was used to simulate prostate cancer proton therapy based treatments. Two cases were studied. In the first case, 8 × 8 = 64 equally spaced fibers inside three 4 × 4 × 2.54 cm{sup 3} DuPont™ Delrin blocks were used to monitor the emission of secondary particles in the transverse (left and right) and distal regions relative to the beam direction. In the second case, a scintillating block with a thickness of 2.54 cm and equal vertical and longitudinal dimensions as the water phantom was used. Geometrical cuts were used to extract the energy deposited in each fiber and the scintillating block. Results: The transverse dose distributions from secondary particles in both cases agree within <5% and with a very good symmetry. The energy deposited not only gradually increases as one moves from the peripheral row fibers towards the center of the block (aligned with the center of the prostate) but also decreases as one goes from the frontal to distal region of the block. The ratio of the doses from the prostate to the ones in the middle two rows of fibers showed a linear relationship with a slope of (−3.55±2.26) × 10{sup −5} MeV per treatment Gy. The distal detectors recorded a very small energy deposited due to water attenuation. Conclusion: With a good calibration and the ability to define a good correlation between the dose to the external fibers and the prostate, such fibers can be used for real time dose verification to the target.

  8. Comparative study of dose distributions and cell survival fractions for 1H, 4He, 12C and 16O beams using Geant4 and Microdosimetric Kinetic model

    NASA Astrophysics Data System (ADS)

    Burigo, Lucas; Pshenichnov, Igor; Mishustin, Igor; Bleicher, Marcus

    2015-04-01

    Depth and radial dose profiles for therapeutic 1H, 4He, 12C and 16O beams are calculated using the Geant4-based Monte Carlo model for Heavy-Ion Therapy (MCHIT). 4He and 16O ions are presented as alternative options to 1H and 12C broadly used for ion-beam cancer therapy. Biological dose profiles and survival fractions of cells are estimated using the modified Microdosimetric Kinetic model. Depth distributions of cell survival of healthy tissues, assuming 10% and 50% survival of tumor cells, are calculated for 6 cm SOBPs at two tumor depths and for different tissues radiosensitivities. It is found that the optimal ion choice depends on (i) depth of the tumor, (ii) dose levels and (iii) the contrast of radiosensitivities of tumor and surrounding healthy tissues. Our results indicate that 12C and 16O ions are more appropriate to spare healthy tissues in the case of a more radioresistant tumor at moderate depths. On the other hand, a sensitive tumor surrounded by more resistant tissues can be better treated with 1H and 4He ions. In general, 4He beam is found to be a good candidate for therapy. It better spares healthy tissues in all considered cases compared to 1H. Besides, the dose conformation is improved for deep-seated tumors compared to 1H, and the damage to surrounding healthy tissues is reduced compared to heavier ions due to the lower impact of nuclear fragmentation. No definite advantages of 16O with respect to 12C ions are found in this study.
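    For a given ion, dose and tissue, the modified Microdosimetric Kinetic model used above ultimately yields linear-quadratic (LQ) survival, S = exp(-(αD + βD²)). A minimal LQ sketch, with α and β values that are illustrative only (not the paper's), showing how the 10% and 50% survival doses quoted in such studies can be solved for:

```python
import math

def survival(dose_gy, alpha=0.15, beta=0.05):
    """Linear-quadratic cell survival fraction (alpha/beta are hypothetical)."""
    return math.exp(-(alpha * dose_gy + beta * dose_gy**2))

def dose_for_survival(target, lo=0.0, hi=50.0):
    """Invert the monotone survival curve by bisection."""
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if survival(mid) > target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

print(round(dose_for_survival(0.10), 2), "Gy for 10% survival")
print(round(dose_for_survival(0.50), 2), "Gy for 50% survival")
```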

  9. Technical Note: Implementation of biological washout processes within GATE/GEANT4—A Monte Carlo study in the case of carbon therapy treatments

    SciTech Connect

    Martínez-Rovira, I. Jouvie, C.; Jan, S.

    2015-04-15

    Purpose: The imaging of positron emitting isotopes produced during patient irradiation is the only in vivo method used for hadrontherapy dose monitoring in clinics nowadays. However, the accuracy of this method is limited by the loss of signal due to the metabolic decay processes (biological washout). In this work, a generic modeling of washout was incorporated into the GATE simulation platform. Additionally, the influence of the washout on the β{sup +} activity distributions in terms of absolute quantification and spatial distribution was studied. Methods: First, the irradiation of a human head phantom with a {sup 12}C beam, so that a homogeneous dose distribution was achieved in the tumor, was simulated. The generated {sup 11}C and {sup 15}O distribution maps were used as β{sup +} sources in a second simulation, where the PET scanner was modeled following a detailed Monte Carlo approach. The activity distributions obtained in the presence and absence of washout processes for several clinical situations were compared. Results: Results show that activity values are highly reduced (by a factor of 2) in the presence of washout. These processes have a significant influence on the shape of the PET distributions. Differences in the distal activity falloff position of 4 mm are observed for a tumor dose deposition of 1 Gy (T{sub ini} = 0 min). However, in the case of high doses (3 Gy), the washout processes do not have a large effect on the position of the distal activity falloff (differences lower than 1 mm). The important role of the tumor washout parameters on the activity quantification was also evaluated. Conclusions: With this implementation, GATE/GEANT4 is the only open-source code able to simulate the full chain from the hadrontherapy irradiation to the PET dose monitoring including biological effects. Results show the strong impact of the washout processes, indicating that the development of better models and measurement of biological washout data are needed.

  10. Dose distribution changes with shielding disc misalignments and wrong orientations in breast IOERT: a Monte Carlo - GEANT4 and experimental study.

    PubMed

    Russo, Giorgio; Casarino, Carlo; Arnetta, Gaetano; Candiano, Giuliana; Stefano, Alessandro; Alongi, Filippo; Borasi, Giovanni; Messa, Cristina; Gilardi, Maria C

    2012-01-01

    One of the most relevant risks in breast intraoperative electron radiotherapy (IOERT) is incorrect positioning of the shielding disc. If such a setup error occurs, the treatment zone could receive a nonuniform dose delivery, and a considerable part of the electron beam could hit - and irradiate - the patient's healthy tissue. Although the misalignment and tilt angle of the shielding disc can be evaluated, it is not possible to measure the corresponding in vivo dose distribution. This led us to develop a simulation using the Geant4 Monte Carlo toolkit to study the effects of disc configuration on dose distribution. Several parameters were investigated: the shielding factor (SF), the radiation back-scattering factor (BSF), the volume-dose histogram in the treatment zone, and the maximum leakage dose (MLD) in normal tissue. A lateral shift of the disc (in the plane perpendicular to the beam axis) causes a decrease in SF (from 4% for a misalignment of 5 mm to 40% for a misalignment of 40 mm), but no relevant dose variations were found for tilt angles up to 10°. In the same uncorrected disc positions, the BSF shows no significant change. The MLD rises to 3.45 Gy for a 14 mm misalignment and to 4.60 Gy for a 30° tilt angle when the prescribed dose is 21 Gy. The simulation results are compared with the experimental ones, and allow an a posteriori estimation of the dose distribution in the breast target and underlying healthy tissue. This information could help the surgical team choose a more correct clinical setup, and assist in quantifying the degree of success or failure of an IOERT breast treatment. PMID:22955646

  11. GEANT4 simulation of a scintillating-fibre tracker for the cosmic-ray muon tomography of legacy nuclear waste containers

    NASA Astrophysics Data System (ADS)

    Clarkson, A.; Hamilton, D. J.; Hoek, M.; Ireland, D. G.; Johnstone, J. R.; Kaiser, R.; Keri, T.; Lumsden, S.; Mahon, D. F.; McKinnon, B.; Murray, M.; Nutbeam-Tuffs, S.; Shearer, C.; Staines, C.; Yang, G.; Zimmerman, C.

    2014-05-01

    Cosmic-ray muons are highly penetrating charged particles that are observed at sea level with a flux of approximately one per square centimetre per minute. They interact with matter primarily through Coulomb scattering, which is exploited in the field of muon tomography to image shielded objects in a wide range of applications. In this paper, simulation studies are presented that assess the feasibility of a scintillating-fibre tracker system for use in the identification and characterisation of nuclear materials stored within industrial legacy waste containers. A system consisting of a pair of tracking modules above and a pair below the volume to be assayed is simulated within the GEANT4 framework using a range of potential fibre pitches and module separations. Each module comprises two orthogonal planes of fibres that allow the reconstruction of the initial and Coulomb-scattered muon trajectories. A likelihood-based image reconstruction algorithm has been developed that allows the container content to be determined with respect to the scattering density λ, a parameter which is related to the atomic number Z of the scattering material. Images reconstructed from this simulation are presented for a range of anticipated scenarios that highlight the expected image resolution and the potential of this system for the identification of high-Z materials within a shielded, concrete-filled container. First results from a constructed prototype system are presented in comparison with those from a detailed simulation. Excellent agreement between experimental data and simulation is observed, showing clear discrimination between the different materials assayed throughout.
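    A hedged sketch of the scattering-density idea behind such imaging algorithms: for muons of momentum p traversing thickness L, the RMS Coulomb-scattering angle grows with the material's scattering density λ, so λ can be estimated from measured angles, roughly λ ≈ p²⟨θ²⟩/L. The unit conventions, the Gaussian scattering model, and all numbers below are illustrative, not the paper's likelihood-based algorithm:

```python
import math
import random

def simulate_scatter(lam, length, p, n=20000, rng=random.Random(1)):
    """Draw n projected scattering angles for a material of density lam."""
    sigma = math.sqrt(lam * length) / p   # RMS angle under the toy model
    return [rng.gauss(0.0, sigma) for _ in range(n)]

def estimate_lambda(thetas, length, p):
    """Recover lambda from the measured mean-square scattering angle."""
    mean_sq = sum(t * t for t in thetas) / len(thetas)
    return p * p * mean_sq / length

thetas = simulate_scatter(lam=2.5, length=10.0, p=3.0)
est = estimate_lambda(thetas, 10.0, 3.0)
print(round(est, 1))  # close to the true value of 2.5
```

    A real reconstruction bins such per-muon estimates into voxels and, as in the paper, maximizes a likelihood over the container volume.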

  12. Comparative study of dose distributions and cell survival fractions for 1H, 4He, 12C and 16O beams using Geant4 and Microdosimetric Kinetic model.

    PubMed

    Burigo, Lucas; Pshenichnov, Igor; Mishustin, Igor; Bleicher, Marcus

    2015-04-21

    Depth and radial dose profiles for therapeutic (1)H, (4)He, (12)C and (16)O beams are calculated using the Geant4-based Monte Carlo model for Heavy-Ion Therapy (MCHIT). (4)He and (16)O ions are presented as alternative options to (1)H and (12)C broadly used for ion-beam cancer therapy. Biological dose profiles and survival fractions of cells are estimated using the modified Microdosimetric Kinetic model. Depth distributions of cell survival of healthy tissues, assuming 10% and 50% survival of tumor cells, are calculated for 6 cm SOBPs at two tumor depths and for different tissues radiosensitivities. It is found that the optimal ion choice depends on (i) depth of the tumor, (ii) dose levels and (iii) the contrast of radiosensitivities of tumor and surrounding healthy tissues. Our results indicate that (12)C and (16)O ions are more appropriate to spare healthy tissues in the case of a more radioresistant tumor at moderate depths. On the other hand, a sensitive tumor surrounded by more resistant tissues can be better treated with (1)H and (4)He ions. In general, (4)He beam is found to be a good candidate for therapy. It better spares healthy tissues in all considered cases compared to (1)H. Besides, the dose conformation is improved for deep-seated tumors compared to (1)H, and the damage to surrounding healthy tissues is reduced compared to heavier ions due to the lower impact of nuclear fragmentation. No definite advantages of (16)O with respect to (12)C ions are found in this study. PMID:25825827

  13. MHTool User's Guide - Software for Manufactured Housing Structural Design

    SciTech Connect

    W. D. Richins

    2005-07-01

    Since the late 1990s, the Department of Energy's Idaho National Laboratory (INL) has worked with the US Department of Housing and Urban Development (HUD), the Manufactured Housing Institute (MHI), the National Institute of Standards and Technology (NIST), the National Science Foundation (NSF), and an industry committee to measure the response of manufactured housing to both artificial and natural wind loads and to develop a computational desktop tool to optimize the structural performance of manufactured housing to HUD Code loads. MHTool is the result of an 8-year intensive testing and verification effort using single and double section homes. MHTool is the first fully integrated structural analysis software package specifically designed for manufactured housing. To use MHTool, industry design engineers will enter information (geometries, materials, connection types, etc.) describing the structure of a manufactured home, creating a base model. Windows, doors, and interior walls can be added to the initial design. Engineers will input the loads required by the HUD Code (wind, snow loads, interior live loads, etc.) and run an embedded finite element solver to find walls or connections where stresses are either excessive or very low. The designer could, for example, substitute a less expensive and easier to install connection in areas with very low stress, then re-run the analysis for verification. If forces and stresses are still within HUD Code requirements, construction costs would be saved without sacrificing quality. Manufacturers can easily change geometries or component properties to optimize designs of various floor plans then submit MHTool input and output in place of calculations for DAPIA review. No change in the regulatory process is anticipated. MHTool, while not yet complete, is now ready for demonstration. The pre-BETA version (Build-16) was displayed at the 2005 National Congress & Expo for Manufactured & Modular Housing. 
Additional base models and an
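The stress-screening step the abstract describes (run the embedded finite-element solver, then flag connections where stresses are excessive or very low) can be sketched roughly as follows. This is a hypothetical illustration; the function names, connection identifiers, and utilization thresholds are invented, not MHTool's actual API.

```python
# Hypothetical sketch of MHTool's post-analysis screening step: classify each
# connection by its utilization ratio (computed stress / allowable stress).
# Thresholds and all names are illustrative assumptions.

def screen_connections(results, allowable, high=1.0, low=0.2):
    """Classify each connection by stress utilization.

    results   -- dict mapping connection id -> computed stress (psi)
    allowable -- dict mapping connection id -> allowable stress (psi)
    """
    report = {}
    for cid, stress in results.items():
        ratio = stress / allowable[cid]
        if ratio > high:
            report[cid] = ("over-stressed", ratio)   # redesign required
        elif ratio < low:
            report[cid] = ("under-utilized", ratio)  # cheaper connection may suffice
        else:
            report[cid] = ("ok", ratio)
    return report

report = screen_connections(
    results={"roof-strap-3": 1250.0, "wall-clip-7": 90.0},
    allowable={"roof-strap-3": 1000.0, "wall-clip-7": 900.0},
)
print(report["roof-strap-3"][0])  # over-stressed
```

A designer iterating as described would substitute a cheaper connection wherever the report shows "under-utilized", then re-run the solver to confirm the design still meets HUD Code loads.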

  14. Design consideration for design a flat and ring plastics part using Solidworks software

    NASA Astrophysics Data System (ADS)

    Amran, M. A. M.; Faizal, K. M.; Salleh, M. S.; Sulaiman, M. A.; Mohamad, E.

    2015-12-01

    Various design considerations for plastic injection-moulded parts are applied at the initial stage to prevent defects in the end product. The objective of this project was therefore to design plastic injection-moulded parts while taking several factors into consideration, such as draft angle, corner radius, and gate location. A flat plastic part, a ring plastic part, and core inserts for both parts were designed using SolidWorks software. Each plastic part was drawn in sketching mode, and the 3D solid model was then generated using various commands. Considerations such as draft angle and corner radius, together with gate location, were addressed at the design stage. The two plastic parts, each with its respective core insert, were successfully designed in SolidWorks. The flat and ring plastic parts were designed for future research on weld lines, meld lines, trapped air, and the geometrical size of the product. With each part having its own core insert, a complete two-plate mould design can be considered, since injection-moulded parts must be designed properly to avoid defects once the mould is made.
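The design rules the abstract mentions lend themselves to a simple automated check. The sketch below screens a part against two common injection-moulding rules of thumb; the threshold values and function names are assumptions for illustration, not figures from the paper.

```python
# Illustrative pre-moulding design check for two of the considerations above:
# draft angle and inside corner radius. Rule-of-thumb limits are assumptions.

MIN_DRAFT_DEG = 1.0      # common minimum draft per side for clean ejection
MIN_RADIUS_RATIO = 0.5   # inside corner radius >= 0.5 x wall thickness

def check_part(draft_deg, corner_radius_mm, wall_mm):
    """Return a list of rule violations (empty list means the part passes)."""
    issues = []
    if draft_deg < MIN_DRAFT_DEG:
        issues.append(f"draft {draft_deg} deg is below the {MIN_DRAFT_DEG} deg minimum")
    if corner_radius_mm < MIN_RADIUS_RATIO * wall_mm:
        issues.append("inside corner radius too sharp for the wall thickness")
    return issues

print(check_part(draft_deg=0.5, corner_radius_mm=0.3, wall_mm=2.0))
```

A part with adequate draft and generous corner radii would return an empty list, signalling that these two defect risks have been designed out before the mould is cut.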

  15. Design and performance test of spacecraft test and operation software

    NASA Astrophysics Data System (ADS)

    Wang, Guohua; Cui, Yan; Wang, Shuo; Meng, Xiaofeng

    2011-06-01

    Main test processor (MTP) software is the key element of the Electrical Ground Support Equipment (EGSE) for spacecraft test and operation, used at the Chinese Academy of Space Technology (CAST) for years without major revision. With increasing demand for more efficient and agile MTP software, a new version was developed. It adopts a layered, plug-in-based software architecture, in which a core runtime server provides message-queue management, shared-memory management, and process-management services, forming the framework of a configurable, open-architecture system. To evaluate the MTP software's performance, test cases for network response time, test-sequence management capability, and data-processing capability are described in detail. Test results show that the new MTP software is general-purpose and outperforms the legacy version.
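The layered, plug-in-based architecture described above can be sketched in miniature: a core runtime server owns a message queue and dispatches messages to whatever plug-ins have registered for a topic. Every class, method, and topic name below is invented for illustration and is not the CAST MTP API.

```python
import queue

# Minimal sketch of a plug-in runtime server: plug-ins register handlers for
# topics, clients post messages, and the server dispatches from its queue.
# All names are illustrative assumptions, not the real MTP software.

class RuntimeServer:
    def __init__(self):
        self._queue = queue.Queue()
        self._plugins = {}              # topic -> handler callable

    def register(self, topic, handler):
        """A plug-in installs itself for a message topic."""
        self._plugins[topic] = handler

    def post(self, topic, payload):
        """A client enqueues a message for later dispatch."""
        self._queue.put((topic, payload))

    def run_once(self):
        """Dispatch one queued message to its registered plug-in."""
        topic, payload = self._queue.get_nowait()
        return self._plugins[topic](payload)

server = RuntimeServer()
server.register("telemetry", lambda frame: f"decoded {frame}")
server.post("telemetry", "frame-001")
print(server.run_once())  # decoded frame-001
```

The open-architecture property follows from the registry: adding a capability means registering another handler, with no change to the core server.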

  16. A Formal Approach to Domain-Oriented Software Design Environments

    NASA Technical Reports Server (NTRS)

    Lowry, Michael; Philpot, Andrew; Pressburger, Thomas; Underwood, Ian; Lum, Henry, Jr. (Technical Monitor)

    1994-01-01

    This paper describes a formal approach to domain-oriented software design environments, based on declarative domain theories, formal specifications, and deductive program synthesis. A declarative domain theory defines the semantics of a domain-oriented specification language and its relationship to implementation-level subroutines. Formal specification development and reuse is made accessible to end-users through an intuitive graphical interface that guides them in creating diagrams denoting formal specifications. The diagrams also serve to document the specifications. Deductive program synthesis ensures that end-user specifications are correctly implemented. AMPHION has been applied to the domain of solar system kinematics through the development of a declarative domain theory, which includes an axiomatization of JPL's SPICELIB subroutine library. Testing over six months with planetary scientists indicates that AMPHION's interactive specification acquisition paradigm enables users to develop, modify, and reuse specifications at least an order of magnitude more rapidly than manual program development. Furthermore, AMPHION synthesizes one to two page programs consisting of calls to SPICELIB subroutines from these specifications in just a few minutes. Test results obtained by metering AMPHION's deductive program synthesis component are examined. AMPHION has been installed at JPL and is currently undergoing further refinement in preparation for distribution to hundreds of SPICELIB users worldwide. Current work to support end-user customization of AMPHION's specification acquisition subsystem is briefly discussed, as well as future work to enable domain-expert creation of new AMPHION applications through development of suitable domain theories.
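The core synthesis idea above (derive a program of subroutine calls from a declarative specification) can be caricatured as a search over a library of typed subroutines. The toy below finds a chain of calls taking a specification's input type to its requested output type; SPICELIB is far richer than this, and every routine and type name here is hypothetical.

```python
# Toy flavor of deductive synthesis: given subroutines with declared
# input/output types, search for a call chain from the specification's input
# type to its goal type. All routine and type names are invented.

LIBRARY = {
    "ephemeris_state":   ("time", "state_vector"),
    "state_to_position": ("state_vector", "position"),
    "position_to_range": ("position", "range"),
}

def synthesize(src, dst):
    """Breadth-first search for a subroutine chain from type src to type dst."""
    frontier = [(src, [])]
    seen = {src}
    while frontier:
        typ, plan = frontier.pop(0)
        if typ == dst:
            return plan                      # ordered list of subroutine calls
        for name, (arg, result) in LIBRARY.items():
            if arg == typ and result not in seen:
                seen.add(result)
                frontier.append((result, plan + [name]))
    return None                              # no implementation derivable

print(synthesize("time", "range"))
# ['ephemeris_state', 'state_to_position', 'position_to_range']
```

Real deductive synthesis proves the specification as a theorem and extracts the program from the proof, which also yields the correctness guarantee the abstract mentions; this sketch captures only the composition step.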

  17. LISP as an Environment for Software Design: Powerful and Perspicuous

    PubMed Central

    Blum, Robert L.; Walker, Michael G.

    1986-01-01

    The LISP language provides a useful set of features for prototyping knowledge-intensive, clinical applications software that is not found in most other programming environments. Medical computer programs that need large medical knowledge bases, such as programs for diagnosis, therapeutic consultation, education, simulation, and peer review, are hard to design, evolve continually, and often require major revisions. They necessitate an efficient and flexible program development environment. The LISP language and the programming environments built around it are well suited for program prototyping. The lingua franca of artificial intelligence researchers, LISP facilitates building complex systems because it is simple yet powerful. Because of its simplicity, LISP programs can read, execute, modify, and even compose other LISP programs at run time. Hence, it has been easy for system developers to create programming tools that greatly speed the program development process and that may be easily extended by users. This has resulted in the creation of many useful graphical interfaces, editors, and debuggers, which facilitate the development of knowledge-intensive medical applications.
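The property the abstract highlights, programs that read and rewrite other programs at run time, has no exact analogue outside LISP, but Python's ast module gives a rough flavor of it. The sketch below (all names illustrative) parses a source string, doubles every integer literal, and executes the rewritten program.

```python
import ast

# Rough, non-LISP flavor of "programs modifying programs at run time":
# parse a source string, rewrite its integer literals, then execute it.

source = "result = 2 + 3"
tree = ast.parse(source)

class DoubleInts(ast.NodeTransformer):
    """Rewrite every integer constant n in the tree to 2*n."""
    def visit_Constant(self, node):
        if isinstance(node.value, int):
            return ast.copy_location(ast.Constant(node.value * 2), node)
        return node

tree = ast.fix_missing_locations(DoubleInts().visit(tree))
ns = {}
exec(compile(tree, "<rewritten>", "exec"), ns)
print(ns["result"])  # 10
```

In LISP the same idea needs no separate parser or compiler step, since code already is the data structure being manipulated, which is exactly why the abstract credits it with enabling extensible development tools.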

  18. Exploratory research for the development of a computer aided software design environment with the software technology program

    NASA Technical Reports Server (NTRS)

    Hardwick, Charles

    1991-01-01

    Field studies were conducted by MCC to determine areas of research of mutual interest to MCC and JSC. NASA personnel from the Information Systems Directorate and research faculty from UHCL/RICIS visited MCC in Austin, Texas to examine tools and applications under development in the MCC Software Technology Program. MCC personnel presented workshops in hypermedia, design knowledge capture, and design recovery on site at JSC for ISD personnel. The following programs were installed on workstations in the Software Technology Lab, NASA/JSC: (1) GERM (Graphic Entity Relations Modeler); (2) gIBIS (Graphic Issues Based Information System); and (3) DESIRE (Design Recovery tool). These applications were made available to NASA for inspection and evaluation. Programs developed in the MCC Software Technology Program run on the SUN workstation. The programs do not require special configuration, but they will require larger than usual amounts of disk space and RAM to operate properly.

  19. Design study of Software-Implemented Fault-Tolerance (SIFT) computer

    NASA Technical Reports Server (NTRS)

    Wensley, J. H.; Goldberg, J.; Green, M. W.; Kutz, W. H.; Levitt, K. N.; Mills, M. E.; Shostak, R. E.; Whiting-Okeefe, P. M.; Zeidler, H. M.

    1982-01-01

    Software-implemented fault tolerant (SIFT) computer design for commercial aviation is reported. A SIFT design concept is addressed. Alternate strategies for physical implementation are considered. Hardware and software design correctness is addressed. System modeling and effectiveness evaluation are considered from a fault-tolerant point of view.
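The essence of software-implemented fault tolerance is running the same computation on several processors and voting on the results in software, so a single faulty unit is outvoted. A minimal sketch, with all names invented rather than taken from the SIFT design:

```python
from collections import Counter

# Sketch of software-implemented fault tolerance: replicate a computation
# across (simulated) processors and majority-vote the outputs, masking a
# single faulty replica. Names and values are illustrative.

def majority_vote(replica_outputs):
    """Return the majority value, or raise if no strict majority exists."""
    value, count = Counter(replica_outputs).most_common(1)[0]
    if count <= len(replica_outputs) // 2:
        raise RuntimeError("no majority: too many faulty replicas")
    return value

# Three replicas compute an airspeed; the third has a fault.
print(majority_vote([412.0, 412.0, 9999.0]))  # 412.0
```

Because the voting is done in software on ordinary processors, fault tolerance becomes a scheduling and voting discipline rather than custom hardware, which is the design trade the SIFT study evaluates.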

  20. Research and Design Issues Concerning the Development of Educational Software for Children. Technical Report No. 14.

    ERIC Educational Resources Information Center

    Char, Cynthia

    Several research and design issues to be considered when creating educational software were identified by a field test evaluation of three types of innovative software created at Bank Street College: (1) Probe, software for measuring and graphing temperature data; (2) Rescue Mission, a navigation game that illustrates the computer's use for…