Science.gov

Sample records for geant4 software design

  1. First statistical analysis of Geant4 quality software metrics

    NASA Astrophysics Data System (ADS)

    Ronchieri, Elisabetta; Grazia Pia, Maria; Giacomini, Francesco

    2015-12-01

    Geant4 is a simulation system of particle transport through matter, widely used in several experimental areas from high energy physics and nuclear experiments to medical studies. Some of its applications may involve critical use cases; therefore they would benefit from an objective assessment of the software quality of Geant4. In this paper, we provide a first statistical evaluation of software metrics data related to a set of Geant4 physics packages. The analysis aims at identifying risks for Geant4 maintainability, which would benefit from being addressed at an early stage. The findings of this pilot study set the grounds for further extensions of the analysis to the whole of Geant4 and to other high energy physics software systems.

  2. A Roadmap For Geant4

    NASA Astrophysics Data System (ADS)

    Asai, Makoto

    2012-12-01

    The Geant4 simulation toolkit is now in the 14th year of its production phase. Geant4 is the choice of most current and near future high energy physics experiments as their simulation engine, and it is also widely used in astrophysics, space engineering, medicine and industrial application domains. Geant4 is a “living” code under continuous development; improvement of physics quality and computational speed is still a priority for Geant4. It is evolving and being enriched with new functionalities. On the other hand, the simulation paradigm that prevailed during the foundation of Geant4 is now being rethought because of new technologies in both computer hardware and software. The Geant4 Collaboration has identified many options and possibilities. Geant4 has accommodated some of these by providing a multi-threading prototype based on event-level parallelism. In this article we discuss the past, present and future of the Geant4 toolkit.

  3. The GEANT4 Visualisation System

    SciTech Connect

    Allison, J.; Asai, M.; Barrand, G.; Donszelmann, M.; Minamimoto, K.; Tanaka, S.; Tcherniaev, E.; Tinslay, J.; /SLAC

    2007-11-02

    The Geant4 Visualization System is a multi-driver graphics system designed to serve the Geant4 Simulation Toolkit. It is aimed at the visualization of Geant4 data, primarily detector descriptions and simulated particle trajectories and hits. It can handle a variety of graphical technologies simultaneously and interchangeably, allowing the user to choose the visual representation most appropriate to requirements. It conforms to the low-level Geant4 abstract graphical user interfaces and introduces new abstract classes from which the various drivers are derived and that can be straightforwardly extended, for example, by the addition of a new driver. It makes use of an extendable class library of models and filters for data representation and selection. The Geant4 Visualization System supports a rich set of interactive commands based on the Geant4 command system. It is included in the Geant4 code distribution and maintained and documented like other components of Geant4.

  4. MCNP5 and GEANT4 comparisons for preliminary Fast Neutron Pencil Beam design at the University of Utah TRIGA system

    NASA Astrophysics Data System (ADS)

    Adjei, Christian Amevi

    The main objective of this thesis is twofold. The first objective was to develop a model for meaningful benchmarking of different versions of GEANT4 against an experimental set-up and MCNP5 pertaining to photon transport and interactions. The second objective was to develop a preliminary design of a Fast Neutron Pencil Beam (FNPB) Facility applicable to the University of Utah research reactor (UUTR) using MCNP5 and GEANT4. Three GEANT4 code versions, GEANT4.9.4, GEANT4.9.3, and GEANT4.9.2, were compared to MCNP5 and to experimental measurements of gamma attenuation in air. The average gamma dose rate was measured in the laboratory experiment at various distances from a shielded cesium source using a Ludlum model 19 portable NaI detector. As expected, the gamma dose rate decreased with distance. All three GEANT4 code versions agreed well with both the experimental data and the MCNP5 simulation. Additionally, a simple GEANT4 and MCNP5 model was developed to compare the codes' agreement for neutron interactions in various materials. The preliminary FNPB design was developed using MCNP5; a semi-accurate model was developed using GEANT4 (because GEANT4 does not support reactor physics modeling, the reactor was represented as a surface neutron source, hence a semi-accurate model). Based on the MCNP5 model, the fast neutron flux in a sample holder of the FNPB was calculated to be 6.52×10⁷ n/cm²s, which is one order of magnitude lower than at large fast neutron pencil beam facilities existing elsewhere. The MCNP5 model-based neutron spectrum indicates that the maximum expected fast neutron flux is at a neutron energy of ~1 MeV. In addition, the MCNP5 model provided information on the gamma flux to be expected in this preliminary FNPB design; specifically, in the sample holder, the expected gamma flux is around 10⁸ γ/cm²s, delivering a gamma dose of 4.54×10³ rem/hr. This value is one to two orders of magnitude below the gamma

  5. Investigation of OPET Performance Using GATE, a Geant4-Based Simulation Software.

    PubMed

    Rannou, Fernando R; Kohli, Vandana; Prout, David L; Chatziioannou, Arion F

    2004-10-01

    A combined optical positron emission tomography (OPET) system is capable of both optical and PET imaging in the same setting, and it can provide information/interpretation not possible in single-mode imaging. The scintillator array here serves the dual function of coupling the optical signal from bioluminescence/fluorescence to the photodetector and also of channeling optical scintillations from the gamma rays. We report simulation results of the PET part of OPET using GATE, a Geant4 simulation package. The purpose of this investigation is the definition of the geometric parameters of the OPET tomograph. OPET is composed of six detector blocks arranged in a hexagonal ring-shaped pattern with an inner radius of 15.6 mm. Each detector consists of a two-dimensional array of 8 × 8 scintillator crystals each measuring 2 × 2 × 10 mm³. Monte Carlo simulations were performed using the GATE software to measure absolute sensitivity, depth of interaction, and spatial resolution for two ring configurations, with and without gantry rotations, two crystal materials, and several crystal lengths. Images were reconstructed with filtered backprojection after angular interleaving and transverse one-dimensional interpolation of the sinogram. We report absolute sensitivities nearly seven times that of the prototype microPET at the center of field of view and 2.0 mm tangential and 2.3 mm radial resolutions with gantry rotations up to an 8.0 mm radial offset. These performance parameters indicate that the imaging spatial resolution and sensitivity of the OPET system will be suitable for high-resolution and high-sensitivity small-animal PET imaging.

  6. Geant4 Applications in Space

    SciTech Connect

    Asai, M.; /SLAC

    2007-11-07

    Use of Geant4 is rapidly expanding in the space application domain. I overview three major application areas of Geant4 in space: apparatus simulation for pre-launch design and post-launch analysis, planetary-scale simulation for radiation spectra and surface and sub-surface explorations, and micro-dosimetry simulation for single event studies and radiation-hardening of semiconductor devices. Recently, not only mission-dependent applications but also various multi-purpose or common tools built on top of Geant4 have become widely available. I overview some of these tools as well. The Geant4 Collaboration identifies space applications as one of the major driving forces for further development and refinement of the Geant4 toolkit. Highlights of such developments are introduced.

  7. GEANT4 used for neutron beam design of a neutron imaging facility at TRIGA reactor in Morocco

    NASA Astrophysics Data System (ADS)

    Ouardi, A.; Machmach, A.; Alami, R.; Bensitel, A.; Hommada, A.

    2011-09-01

    Neutron imaging has a broad scope of applications and has played a pivotal role in visualizing and quantifying hydrogenous masses in metallic matrices. The field continues to expand into new applications with the installation of new neutron imaging facilities. In this scope, a neutron imaging facility for computed tomography and real-time neutron radiography is currently being developed around the 2.0 MW TRIGA MARK-II reactor at the Maamora Nuclear Research Center in Morocco (Reuscher et al., 1990 [1]; de Menezes et al., 2003 [2]; Deinert et al., 2005 [3]). The neutron imaging facility consists of a neutron collimator, a real-time neutron imaging system and image processing systems. In order to reduce the gamma-ray content in the neutron beam, the tangential channel was selected. For a power of 250 kW, the corresponding thermal neutron flux measured at the inlet of the tangential channel is around 3×10¹¹ n/cm²s. The facility will be based on a conical neutron collimator with two circular diaphragms with diameters of 4 and 2 cm, corresponding to L/D-ratios of 165 and 325, respectively. These diaphragm sizes provide a compromise between high flux and a high L/D-ratio. A convergent-divergent collimator geometry has been adopted. The beam line consists of a gamma filter, a fast neutron filter, a neutron moderator, neutron and gamma shutters, biological shielding around the collimator and several stages of neutron collimation. Monte Carlo calculations with the fully 3D code GEANT4 were used to design the neutron beam line ( http://www.info.cern.ch/asd/geant4/geant4.html [4]). To enhance the quality of the thermal neutron beam, several materials, mainly bismuth (Bi) and sapphire (Al₂O₃), were examined as gamma and neutron filters, respectively. The GEANT4 simulations showed that gamma rays and epithermal and fast neutrons can be filtered using bismuth (Bi) and sapphire (Al₂O₃) filters, respectively. To get a good cadmium ratio, GEANT4 simulations were used to

  8. Recent Developments in the Geant4 Hadronic Framework

    NASA Astrophysics Data System (ADS)

    Pokorski, Witold; Ribon, Alberto

    2014-06-01

    In this paper we present recent developments in the Geant4 hadronic framework. Geant4 is the main simulation toolkit used by the LHC experiments, and therefore a lot of effort is put into improving the physics models in order to give them more predictive power. As a consequence, the code complexity increases, which requires constant improvement and optimization on the programming side. At the same time, we would like to review and eventually reduce the complexity of the hadronic software framework. As an example, a factory design pattern has been applied in Geant4 to avoid duplication of objects, such as cross sections, which can be used by several processes or physics models. This approach has also been applied to physics lists, to provide a flexible configuration mechanism at run-time based on macro files. Moreover, these developments open the future possibility of building Geant4 with only a specified sub-set of physics models. Another technical development focused on the reproducibility of the simulation, i.e. the possibility to repeat an event once the random generator status at the beginning of the event is known. This is crucial for debugging rare situations that may occur after long simulations. Moreover, reproducibility in normal, sequential Geant4 simulation is an important prerequisite for verifying equivalence with multithreaded Geant4 simulations.

  9. A Virtual Geant4 Environment

    NASA Astrophysics Data System (ADS)

    Iwai, Go

    2015-12-01

    We describe the development of an environment for Geant4 consisting of an application and data that provide users with a more efficient way to access Geant4 applications without having to download and build the software locally. The environment is platform neutral and offers the users near-real time performance. In addition, the environment consists of data and Geant4 libraries built using low-level virtual machine (LLVM) tools which can produce bitcode that can be embedded in HTML and accessed via a browser. The bitcode is downloaded to the local machine via the browser and can then be configured by the user. This approach provides a way of minimising the risk of leaking potentially sensitive data used to construct the Geant4 model and application in the medical domain for treatment planning. We describe several applications that have used this approach and compare their performance with that of native applications. We also describe potential user communities that could benefit from this approach.

  10. Design of Cherenkov bars for the optical part of the time-of-flight detector in Geant4.

    PubMed

    Nozka, L; Brandt, A; Rijssenbeek, M; Sykora, T; Hoffman, T; Griffiths, J; Steffens, J; Hamal, P; Chytka, L; Hrabovsky, M

    2014-11-17

    We present the results of studies devoted to the development and optimization of the optical part of a high precision time-of-flight (TOF) detector for the Large Hadron Collider (LHC). This work was motivated by a proposal to use such a detector in conjunction with a silicon detector to tag and measure protons from interactions of the type p + p → p + X + p, where the two outgoing protons are scattered in the very forward directions. The fast timing detector uses fused silica (quartz) bars that emit Cherenkov radiation as a relativistic particle passes through; the emitted Cherenkov photons are detected by, for instance, a micro-channel plate multi-anode photomultiplier tube (MCP-PMT). Several possible designs are implemented in Geant4 and studied for timing optimization, in terms of the arrival time and the number of Cherenkov photons reaching the photo-sensor.

  11. Recent developments in GEANT4

    NASA Astrophysics Data System (ADS)

    Allison, J.; Amako, K.; Apostolakis, J.; Arce, P.; Asai, M.; Aso, T.; Bagli, E.; Bagulya, A.; Banerjee, S.; Barrand, G.; Beck, B. R.; Bogdanov, A. G.; Brandt, D.; Brown, J. M. C.; Burkhardt, H.; Canal, Ph.; Cano-Ott, D.; Chauvie, S.; Cho, K.; Cirrone, G. A. P.; Cooperman, G.; Cortés-Giraldo, M. A.; Cosmo, G.; Cuttone, G.; Depaola, G.; Desorgher, L.; Dong, X.; Dotti, A.; Elvira, V. D.; Folger, G.; Francis, Z.; Galoyan, A.; Garnier, L.; Gayer, M.; Genser, K. L.; Grichine, V. M.; Guatelli, S.; Guèye, P.; Gumplinger, P.; Howard, A. S.; Hřivnáčová, I.; Hwang, S.; Incerti, S.; Ivanchenko, A.; Ivanchenko, V. N.; Jones, F. W.; Jun, S. Y.; Kaitaniemi, P.; Karakatsanis, N.; Karamitrosi, M.; Kelsey, M.; Kimura, A.; Koi, T.; Kurashige, H.; Lechner, A.; Lee, S. B.; Longo, F.; Maire, M.; Mancusi, D.; Mantero, A.; Mendoza, E.; Morgan, B.; Murakami, K.; Nikitina, T.; Pandola, L.; Paprocki, P.; Perl, J.; Petrović, I.; Pia, M. G.; Pokorski, W.; Quesada, J. M.; Raine, M.; Reis, M. A.; Ribon, A.; Ristić Fira, A.; Romano, F.; Russo, G.; Santin, G.; Sasaki, T.; Sawkey, D.; Shin, J. I.; Strakovsky, I. I.; Taborda, A.; Tanaka, S.; Tomé, B.; Toshito, T.; Tran, H. N.; Truscott, P. R.; Urban, L.; Uzhinsky, V.; Verbeke, J. M.; Verderi, M.; Wendt, B. L.; Wenzel, H.; Wright, D. H.; Wright, D. M.; Yamashita, T.; Yarba, J.; Yoshida, H.

    2016-11-01

    GEANT4 is a software toolkit for the simulation of the passage of particles through matter. It is used by a large number of experiments and projects in a variety of application domains, including high energy physics, astrophysics and space science, medical physics and radiation protection. Over the past several years, major changes have been made to the toolkit in order to accommodate the needs of these user communities, and to efficiently exploit the growth of computing power made available by advances in technology. The adaptation of GEANT4 to multithreading, advances in physics, detector modeling and visualization, extensions to the toolkit, including biasing and reverse Monte Carlo, and tools for physics and release validation are discussed here.

  12. Recent developments in Geant4

    DOE PAGES

    Allison, J.; Amako, K.; Apostolakis, J.; Arce, P.; Asai, M.; Aso, T.; Bagli, E.; Bagulya, A.; Banerjee, S.; Barrand, G.; et al

    2016-07-01

    Geant4 is a software toolkit for the simulation of the passage of particles through matter. It is used by a large number of experiments and projects in a variety of application domains, including high energy physics, astrophysics and space science, medical physics and radiation protection. Over the past several years, major changes have been made to the toolkit in order to accommodate the needs of these user communities, and to efficiently exploit the growth of computing power made available by advances in technology. The adaptation of Geant4 to multithreading, advances in physics, detector modeling and visualization, extensions to the toolkit, including biasing and reverse Monte Carlo, and tools for physics and release validation are discussed here.

  13. Validation of Hadronic Models in Geant4

    SciTech Connect

    Koi, Tatsumi; Wright, Dennis H.; Folger, Gunter; Ivantchenko, Vladimir; Kossov, Mikhail; Starkov, Nikolai; Heikkinen, Aatos; Truscott, Pete; LeiFan; Wellisch, Hans-Peter

    2007-03-19

    Geant4 is a software toolkit for the simulation of the passage of particles through matter. It has abundant hadronic models from thermal neutron interactions to ultra relativistic hadrons. An overview of validations in Geant4 hadronic physics is presented based on thin-target measurements. In most cases, good agreement is available between Monte Carlo prediction and experimental data; however, several problems have been detected which require some improvement in the models.

  14. Validation of Hadronic Models in GEANT4

    SciTech Connect

    Koi, Tatsumi; Wright, Dennis H.; Folger, Gunter; Ivanchenko, Vladimir; Kossov, Mikhail; Starkov, Nikolai; Heikkinen, Aatos; Truscott, Peter; Lei, Fan; Wellisch, Hans-Peter

    2007-09-26

    Geant4 is a software toolkit for the simulation of the passage of particles through matter. It has abundant hadronic models from thermal neutron interactions to ultra relativistic hadrons. An overview of validations in Geant4 hadronic physics is presented based on thin target measurements. In most cases, good agreement is available between Monte Carlo prediction and experimental data; however, several problems have been detected which require some improvement in the models.

  15. Simulations of nuclear resonance fluorescence in GEANT4

    NASA Astrophysics Data System (ADS)

    Lakshmanan, Manu N.; Harrawood, Brian P.; Rusev, Gencho; Agasthya, Greeshma A.; Kapadia, Anuj J.

    2014-11-01

    The nuclear resonance fluorescence (NRF) technique has been used effectively to identify isotopes based on their nuclear energy levels. Specific examples of its modern-day applications include detecting spent nuclear waste and cargo scanning for homeland security. The experimental designs for these NRF applications can be more efficiently optimized using Monte Carlo simulations before the experiment is implemented. One of the most widely used Monte Carlo physics simulations is the open-source toolkit GEANT4. However, NRF physics has not been incorporated into the GEANT4 simulation toolkit in publicly available software. Here we describe the development and testing of an NRF simulation in GEANT4. We describe in depth the development and architecture of this software for the simulation of NRF in any isotope in GEANT4, as well as verification and validation testing of the simulation for NRF in boron. In the verification testing, the simulation agreed with the analytical model to within 0.6% for boron and iron. In the validation testing, the simulation agreed with the experimental measurements for boron to within 20.5%, with the difference likely due to small uncertainties in beam polarization, energy distribution, and detector composition.

  16. Geant4 - Towards major release 10

    NASA Astrophysics Data System (ADS)

    Cosmo, G.; Geant4 Collaboration

    2014-06-01

    The Geant4 simulation toolkit reached maturity in the middle of the previous decade, providing a wide variety of established features coherently aggregated in a software product that has become the standard for detector simulation in HEP and is used in a variety of other application domains. We review the most recent capabilities introduced in the kernel, highlighting those being prepared for the next major release (version 10.0), scheduled for the end of 2013. A significant new feature of this release will be the integration of multi-threaded processing, aimed at efficient use of modern many-core system architectures and at minimizing the memory footprint by exploiting event-level parallelism. We discuss its design features and its impact on the existing API and user interface of Geant4. Revisions have been made to balance the need to preserve backwards compatibility against consolidating and improving the interfaces, taking into account requirements from the multithreaded extension and from the evolution of the data processing models of the LHC experiments.

  17. Geant4 VMC 3.0

    NASA Astrophysics Data System (ADS)

    Hřivnáčová, I.; Gheata, A.

    2015-12-01

    Virtual Monte Carlo (VMC) [1] provides an abstract interface to Monte Carlo transport codes. A user's VMC-based application, independent of the specific Monte Carlo code, can then be run with any of the supported simulation programs. Developed by the ALICE Offline Project and later included in ROOT [2], the interface and implementations have reached stability during the last decade and have become a foundation for other detector simulation frameworks, the FAIR facility experiments framework being among the first and largest. Geant4 VMC [3], which provides the implementation of the VMC interface for Geant4 [4], is in continuous maintenance and development, driven by the evolution of Geant4 on one side and requirements from users on the other. Besides the implementation of the VMC interface, Geant4 VMC also provides a set of examples that demonstrate the use of VMC to new users and also serve for testing purposes. Since major release 2.0, it includes the G4Root navigator package, which implements an interface that allows one to run a Geant4 simulation using a ROOT geometry. The release of Geant4 version 10.00, with its integration of multithreading, triggered the development of the next major version of Geant4 VMC (version 3.0), which was released in November 2014. A beta version, available for user testing since March, helped its consolidation and improvement. We review the new capabilities introduced in this major version, in particular the integration of multithreading into the VMC design, its impact on the Geant4 VMC and G4Root packages, and the introduction of a new package, MTRoot, which provides utility functions for parallel ROOT output in independent files with the additions necessary for thread safety. Migration of user applications to multithreading that preserves the ease of use of VMC will also be discussed.
    We will also report on the introduction of a new CMake [5] based build system, the migration to ROOT major release 6 and the

  18. Geant4 Computing Performance Benchmarking and Monitoring

    DOE PAGES

    Dotti, Andrea; Elvira, V. Daniel; Folger, Gunter; Genser, Krzysztof; Jun, Soon Yung; Kowalkowski, James B.; Paterno, Marc

    2015-12-23

    Performance evaluation and analysis of large scale computing applications is essential for optimal use of resources. As detector simulation is one of the most compute intensive tasks and Geant4 is the simulation toolkit most widely used in contemporary high energy physics (HEP) experiments, it is important to monitor Geant4 through its development cycle for changes in computing performance and to identify problems and opportunities for code improvements. All Geant4 development and public releases are being profiled with a set of applications that utilize different input event samples, physics parameters, and detector configurations. Results from multiple benchmarking runs are compared to previous public and development reference releases to monitor CPU and memory usage. Observed changes are evaluated and correlated with code modifications. Besides the full summary of call stack and memory footprint, a detailed call graph analysis is available to Geant4 developers for further analysis. The set of software tools used in the performance evaluation procedure, both in sequential and multi-threaded modes, include FAST, IgProf and Open|Speedshop. In conclusion, the scalability of the CPU time and memory performance in multi-threaded application is evaluated by measuring event throughput and memory gain as a function of the number of threads for selected event samples.

  19. Geant4 Computing Performance Benchmarking and Monitoring

    SciTech Connect

    Dotti, Andrea; Elvira, V. Daniel; Folger, Gunter; Genser, Krzysztof; Jun, Soon Yung; Kowalkowski, James B.; Paterno, Marc

    2015-12-23

    Performance evaluation and analysis of large scale computing applications is essential for optimal use of resources. As detector simulation is one of the most compute intensive tasks and Geant4 is the simulation toolkit most widely used in contemporary high energy physics (HEP) experiments, it is important to monitor Geant4 through its development cycle for changes in computing performance and to identify problems and opportunities for code improvements. All Geant4 development and public releases are being profiled with a set of applications that utilize different input event samples, physics parameters, and detector configurations. Results from multiple benchmarking runs are compared to previous public and development reference releases to monitor CPU and memory usage. Observed changes are evaluated and correlated with code modifications. Besides the full summary of call stack and memory footprint, a detailed call graph analysis is available to Geant4 developers for further analysis. The set of software tools used in the performance evaluation procedure, both in sequential and multi-threaded modes, include FAST, IgProf and Open|Speedshop. In conclusion, the scalability of the CPU time and memory performance in multi-threaded application is evaluated by measuring event throughput and memory gain as a function of the number of threads for selected event samples.

  20. Geant4 Computing Performance Benchmarking and Monitoring

    NASA Astrophysics Data System (ADS)

    Dotti, Andrea; Elvira, V. Daniel; Folger, Gunter; Genser, Krzysztof; Jun, Soon Yung; Kowalkowski, James B.; Paterno, Marc

    2015-12-01

    Performance evaluation and analysis of large scale computing applications is essential for optimal use of resources. As detector simulation is one of the most compute intensive tasks and Geant4 is the simulation toolkit most widely used in contemporary high energy physics (HEP) experiments, it is important to monitor Geant4 through its development cycle for changes in computing performance and to identify problems and opportunities for code improvements. All Geant4 development and public releases are being profiled with a set of applications that utilize different input event samples, physics parameters, and detector configurations. Results from multiple benchmarking runs are compared to previous public and development reference releases to monitor CPU and memory usage. Observed changes are evaluated and correlated with code modifications. Besides the full summary of call stack and memory footprint, a detailed call graph analysis is available to Geant4 developers for further analysis. The set of software tools used in the performance evaluation procedure, both in sequential and multi-threaded modes, include FAST, IgProf and Open|Speedshop. The scalability of the CPU time and memory performance in multi-threaded application is evaluated by measuring event throughput and memory gain as a function of the number of threads for selected event samples.

  1. A generic x-ray tracing toolbox in Geant4

    NASA Astrophysics Data System (ADS)

    Vacanti, Giuseppe; Buis, Ernst-Jan; Collon, Maximilien; Beijersbergen, Marco; Kelly, Chris

    2009-05-01

    We have developed a generic X-ray tracing toolbox based on Geant4, a generic simulation toolkit. By leveraging the facilities available in Geant4, we are able to design and analyze complex X-ray optical systems. In this article we describe our toolbox and how it is being applied to support the development of silicon pore optics for IXO.

  2. Geant4-DNA: overview and recent developments

    NASA Astrophysics Data System (ADS)

    Štěpán, Václav

    software already available for download, as well as future perspectives, will be presented, on behalf of the Geant4-DNA Collaboration.

  3. The Cryogenic AntiCoincidence Detector for the ATHENA X-IFU: Design Aspects by Geant4 Simulation and Preliminary Characterization of the New Single Pixel

    NASA Astrophysics Data System (ADS)

    Macculi, C.; Argan, A.; D'Andrea, M.; Lotti, S.; Piro, L.; Biasotti, M.; Corsini, D.; Gatti, F.; Orlando, A.; Torrioli, G.

    2016-08-01

    The ATHENA observatory is the second large-class ESA mission, in the context of Cosmic Vision 2015-2025, scheduled for launch in 2028 to an L2 orbit. One of the two planned focal plane instruments is the X-ray Integral Field Unit (X-IFU), which will be able to perform simultaneous high-grade energy spectroscopy and imaging over a 5 arcmin FoV by means of a kilo-pixel array of transition-edge sensor (TES) microcalorimeters, coupled to high-quality X-ray optics. The X-IFU sensitivity is degraded by the particle background, induced by primary protons of both solar and cosmic-ray origin and by secondary electrons. A Cryogenic AntiCoincidence (CryoAC) TES-based detector, located <1 mm below the TES array, will allow the mission to reach the background level that enables its scientific goals. The CryoAC is a 4-pixel detector made of silicon absorbers sensed by iridium TESs. We have currently achieved TRL = 3-4 at the single-pixel level and have designed and developed two further prototypes in order to reach TRL = 4. The design of the CryoAC has also been optimized using the Geant4 simulation tool. Here we describe some results from the Geant4 simulations performed to optimize the design, and preliminary test results from the first of the two detectors, with a 1 cm² area, made of 65 Ir TESs.

  4. GEANT4 distributed computing for compact clusters

    NASA Astrophysics Data System (ADS)

    Harrawood, Brian P.; Agasthya, Greeshma A.; Lakshmanan, Manu N.; Raterman, Gretchen; Kapadia, Anuj J.

    2014-11-01

    A new technique for distribution of GEANT4 processes is introduced to simplify running a simulation in a parallel environment such as a tightly coupled computer cluster. Using a new C++ class derived from the GEANT4 toolkit, multiple runs forming a single simulation are managed across a local network of computers with a simple inter-node communication protocol. The class is integrated with the GEANT4 toolkit and is designed to scale from a single symmetric multiprocessing (SMP) machine to compact clusters ranging in size from tens to thousands of nodes. User-designed 'work tickets' are distributed to clients using a client-server work flow model to specify the parameters for each individual run of the simulation. The new g4DistributedRunManager class was developed and thoroughly tested in the course of our Neutron Stimulated Emission Computed Tomography (NSECT) experiments. It will be useful for anyone running GEANT4 on large discrete data sets, such as covering a range of angles in computed tomography, calculating dose delivery with multiple fractions, or simply increasing the throughput of a single model.

  5. The Geant4 Bertini Cascade

    SciTech Connect

    Wright, D. H.; Kelsey, M. H.

    2015-12-01

    One of the medium energy hadron–nucleus interaction models in the Geant4 simulation toolkit is based partly on the Bertini intranuclear cascade model. Since its initial appearance in the toolkit, this model has been largely re-written in order to extend its physics capabilities and to reduce its memory footprint. Physics improvements include extensions in applicable energy range and incident particle types, and improved hadron–nucleon cross-sections and angular distributions. Interfaces have also been developed which allow the model to be coupled with other Geant4 models at lower and higher energies. The inevitable speed reductions due to enhanced physics have been mitigated by memory and CPU efficiency improvements. Details of these improvements, along with selected comparisons of the model to data, are discussed.

  6. The Geant4 Bertini Cascade

    NASA Astrophysics Data System (ADS)

    Wright, D. H.; Kelsey, M. H.

    2015-12-01

    One of the medium energy hadron-nucleus interaction models in the GEANT4 simulation toolkit is based partly on the Bertini intranuclear cascade model. Since its initial appearance in the toolkit, this model has been largely re-written in order to extend its physics capabilities and to reduce its memory footprint. Physics improvements include extensions in applicable energy range and incident particle types, and improved hadron-nucleon cross-sections and angular distributions. Interfaces have also been developed which allow the model to be coupled with other GEANT4 models at lower and higher energies. The inevitable speed reductions due to enhanced physics have been mitigated by memory and CPU efficiency improvements. Details of these improvements, along with selected comparisons of the model to data, are discussed.

  7. Introduction to the Geant4 Simulation toolkit

    SciTech Connect

    Guatelli, S.; Cutajar, D.; Rosenfeld, A. B.; Oborn, B.

    2011-05-05

Geant4 is a Monte Carlo simulation toolkit describing the interactions of particles with matter. Geant4 is widely used in radiation physics research, from high energy physics to medical physics and space science, thanks to its sophisticated physics component coupled with advanced functionality in geometry description. Geant4 is widely used at the Centre for Medical Radiation Physics (CMRP) at the University of Wollongong to characterise and optimise novel detector concepts, radiotherapy treatments, and imaging solutions. This lecture consists of an introduction to the Monte Carlo method and to Geant4. Particular attention will be devoted to the Geant4 physics component and to the physics models describing electromagnetic and hadronic interactions. The second part of the lecture focuses on the methodology to adopt when developing a Geant4 simulation application.

  8. Introduction to the Geant4 Simulation toolkit

    NASA Astrophysics Data System (ADS)

    Guatelli, S.; Cutajar, D.; Oborn, B.; Rosenfeld, A. B.

    2011-05-01

Geant4 is a Monte Carlo simulation toolkit describing the interactions of particles with matter. Geant4 is widely used in radiation physics research, from high energy physics to medical physics and space science, thanks to its sophisticated physics component coupled with advanced functionality in geometry description. Geant4 is widely used at the Centre for Medical Radiation Physics (CMRP) at the University of Wollongong to characterise and optimise novel detector concepts, radiotherapy treatments, and imaging solutions. This lecture consists of an introduction to the Monte Carlo method and to Geant4. Particular attention will be devoted to the Geant4 physics component and to the physics models describing electromagnetic and hadronic interactions. The second part of the lecture focuses on the methodology to adopt when developing a Geant4 simulation application.

  9. The Geant4 physics validation repository

    SciTech Connect

    Wenzel, H.; Yarba, J.; Dotti, A.

    2015-01-01

The Geant4 collaboration regularly performs validation and regression tests. The results are stored in a central repository and can be easily accessed via a web application. In this article we describe the Geant4 physics validation repository, which consists of a relational database storing experimental data and Geant4 test results, a Java API and a web application. Finally, the functionality of these components and the technology choices we made are described.

  10. The Geant4 physics validation repository

    NASA Astrophysics Data System (ADS)

    Wenzel, H.; Yarba, J.; Dotti, A.

    2015-12-01

The Geant4 collaboration regularly performs validation and regression tests. The results are stored in a central repository and can be easily accessed via a web application. In this article we describe the Geant4 physics validation repository, which consists of a relational database storing experimental data and Geant4 test results, a Java API and a web application. The functionality of these components and the technology choices we made are also described.
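The schema of the repository is not given in the abstract; a hypothetical sketch of how experimental reference data and Geant4 test results might be stored side by side in a relational database (all table and column names invented, using SQLite for illustration) is:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE reference_data (
    id         INTEGER PRIMARY KEY,
    observable TEXT,     -- what was measured, e.g. 'dose'
    beam       TEXT,
    target     TEXT,
    energy_gev REAL,
    value      REAL,
    error      REAL
);
CREATE TABLE test_result (
    id           INTEGER PRIMARY KEY,
    g4_version   TEXT,   -- Geant4 release the test ran against
    physics_list TEXT,
    reference_id INTEGER REFERENCES reference_data(id),
    value        REAL
);
""")
conn.execute("INSERT INTO reference_data VALUES (1,'dose','proton','water',0.2,1.05,0.02)")
conn.execute("INSERT INTO test_result VALUES (1,'10.1','FTFP_BERT',1,1.03)")

# A regression query: simulated value next to the measured one.
row = conn.execute("""
    SELECT t.g4_version, r.value, t.value
    FROM test_result t JOIN reference_data r ON t.reference_id = r.id
""").fetchone()
print(row)
```

Keyed this way, a web front end can plot each new Geant4 release against the same fixed experimental rows, which is what makes regression testing across versions cheap.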

  11. Visualization drivers for Geant4

    SciTech Connect

    Beretvas, Andy; /Fermilab

    2005-10-01

This document is on Geant4 visualization tools (drivers), evaluating the pros and cons of each option, including recommendations on which tools to support at Fermilab for different applications. Four visualization drivers are evaluated: OpenGL, HepRep, DAWN and VRML. Each has good features: OpenGL provides graphic output without an intermediate file; HepRep provides menus to assist the user; DAWN provides high-quality plots and produces output quickly even for large files; VRML uses the smallest disk space for intermediate files. Large experiments at Fermilab will want to write their own display, and should make that display graphics-independent. Medium-scale experiments will probably want to use HepRep because of its menu support. Smaller-scale experiments will want to use OpenGL in the spirit of having immediate response, good-quality output and keeping things simple.

  12. GEANT4 and Secondary Particle Production

    NASA Technical Reports Server (NTRS)

    Patterson, Jeff

    2004-01-01

GEANT4 is a Monte Carlo tool set developed by the high energy physics community (CERN, SLAC, etc.) to perform simulations of complex particle detectors. GEANT4 is an ideal tool to study radiation transport and should be applied to space environments and the complex geometries of modern-day spacecraft.

  13. Implementing NRF Physics in Geant4

    SciTech Connect

    Jordan, David V.; Warren, Glen A.

    2006-07-01

    The Geant4 radiation transport Monte Carlo code toolkit currently does not support nuclear resonance fluorescence (NRF). After a brief review of NRF physics, plans for implementing this physics process in Geant4, and validating the output of the code, are described. The plans will be executed as Task 3 of project 50799, "Nuclear Resonance Fluorescence Signatures (NuRFS)".

  14. Geant4 application in a Web browser

    NASA Astrophysics Data System (ADS)

    Garnier, Laurent; Geant4 Collaboration

    2014-06-01

Geant4 is a toolkit for the simulation of the passage of particles through matter. The Geant4 visualization system supports many drivers, including OpenGL[1], OpenInventor, HepRep[2], DAWN[3], VRML, RayTracer, gMocren[4] and ASCIITree, with diverse and complementary functionalities. Web applications have an increasing role in our work, and thanks to emerging frameworks such as Wt [5], a web application can be built on top of a C++ application without rewriting all the code. Because the Geant4 toolkit's visualization and user interface modules are well decoupled from the rest of Geant4, it is straightforward to adapt these modules to render in a web application instead of a computer's native window manager. Moreover, because the API of the Wt framework closely matches that of Qt [6], our experience in building the Qt driver benefits the Wt driver. Porting a Geant4 application to a web application is easy, and with minimal effort, Geant4 users can replicate this process to share their own Geant4 applications in a web browser.

  15. Alpha Coincidence Spectroscopy studied with GEANT4

    SciTech Connect

    Dion, Michael P.; Miller, Brian W.; Tatishvili, Gocha; Warren, Glen A.

    2013-11-02

The high-energy side of peaks in alpha spectra (e.g. 241Am), as measured with a silicon detector, has structure caused mainly by alpha-conversion-electron coincidences and, to some extent, alpha-gamma coincidences. We compare GEANT4 simulation results to 241Am alpha spectroscopy measurements with a passivated implanted planar silicon detector. A large discrepancy between the measurements and simulations suggests that the GEANT4 photon evaporation database for 237Np (the daughter of 241Am decay) does not accurately describe the conversion electron spectrum. We describe how to improve the agreement between GEANT4 and alpha spectroscopy for actinides of interest by including experimental measurements of conversion electron spectroscopy in the photon evaporation database.

  16. Comparison of GEANT4 very low energy cross section models with experimental data in water

    SciTech Connect

    Incerti, S.; Ivanchenko, A.; Karamitros, M.; Mantero, A.; Moretto, P.; Tran, H. N.; Mascialino, B.; Champion, C.; Ivanchenko, V. N.; Bernal, M. A.; Francis, Z.; Villagrasa, C.; Baldacchino, G.; Gueye, P.; Capra, R.; Nieminen, P.; Zacharatou, C.

    2010-09-15

Purpose: The GEANT4 general-purpose Monte Carlo simulation toolkit is able to simulate physical interaction processes of electrons, and of hydrogen and helium atoms with charge states (H⁰, H⁺) and (He⁰, He⁺, He²⁺), respectively, in liquid water, the main component of biological systems, down to the electron volt regime and the submicrometer scale, providing GEANT4 users with the so-called "GEANT4-DNA" physics models suitable for microdosimetry simulation applications. The corresponding software has recently been re-engineered in order to provide GEANT4 users with a coherent and unique approach to the simulation of electromagnetic interactions within the GEANT4 toolkit framework (since GEANT4 version 9.3 beta). This work presents a quantitative comparison of these physics models with a collection of experimental data in water collected from the literature. Methods: An evaluation of the closeness between the total and differential cross section models available in the GEANT4 toolkit for microdosimetry and experimental reference data is performed using a dedicated statistical toolkit that includes the Kolmogorov-Smirnov statistical test. The authors used experimental data acquired in water vapor, as direct measurements in the liquid phase are not yet available in the literature. Comparisons with several recommendations are also presented. Results: The authors have assessed the compatibility of experimental data with GEANT4 microdosimetry models by means of quantitative methods. The results show that microdosimetric measurements in liquid water are necessary to assess quantitatively the validity of the software implementation for the liquid water phase. Nevertheless, a comparison with existing experimental data in water vapor provides a qualitative appreciation of the plausibility of the simulation models. The existing reference data themselves should undergo a critical interpretation and selection, as some of the series exhibit significant
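The statistical toolkit used by the authors is not shown; for illustration, the two-sample Kolmogorov-Smirnov statistic behind the test mentioned above (the maximum distance between two empirical CDFs) can be sketched self-contained as:

```python
import bisect

def ks_statistic(sample_a, sample_b):
    # Maximum absolute distance between the two empirical CDFs,
    # evaluated at every observed value.
    a, b = sorted(sample_a), sorted(sample_b)

    def ecdf(s, x):
        # Fraction of values in the sorted sample s that are <= x.
        return bisect.bisect_right(s, x) / len(s)

    points = sorted(set(a) | set(b))
    return max(abs(ecdf(a, x) - ecdf(b, x)) for x in points)

print(ks_statistic([1, 2, 3], [1, 2, 3]))   # identical samples -> 0.0
print(ks_statistic([1, 2, 3], [10, 20]))    # disjoint samples  -> 1.0
```

In a cross-section comparison the two samples would be a simulated distribution and the measured reference data; the statistic (with its associated p-value, omitted here) quantifies how compatible they are.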

  17. BoGEMMS: the Bologna Geant4 multi-mission simulator

    NASA Astrophysics Data System (ADS)

    Bulgarelli, A.; Fioretti, V.; Malaguti, P.; Trifoglio, M.; Gianotti, F.

    2012-07-01

BoGEMMS (Bologna Geant4 Multi-Mission Simulator) is a software project, developed at INAF/IASF Bologna, for fast simulation of payloads on board scientific satellites for prompt background evaluation. By exploiting the Geant4 set of libraries, BoGEMMS allows the user to interactively set the geometrical and physical parameters (e.g. physics list, materials and thicknesses), to record the interactions (e.g. energy deposit, position, interacting particle) in NASA FITS and CERN ROOT format output files, and to filter the output as a real observation in space, finally producing the detected background count rate and spectra. Four different types of output can be produced by BoGEMMS, capturing different aspects of the interactions. The simulator can also run in parallel jobs and store the results in a centralized server via the xrootd protocol. BoGEMMS is a multi-mission tool, generally designed to be applied to any high-energy mission for which analysis of shielding and instrument performance is required.

  18. Validation of Geant4 Hadronic Generators versus Thin Target Data

    SciTech Connect

    Banerjee, S.; Folger, G.; Ivanchenko, A.; Ivanchenko, V.N.; Kossov, M.; Quesada, J.M.; Schalicke, A.; Uzhinsky, V.; Wenzel, H.; Wright, D.H.; Yarba, J.; /Fermilab

    2012-04-19

The GEANT4 toolkit is widely used for simulation of high energy physics (HEP) experiments, in particular those at the Large Hadron Collider (LHC). The requirements of robustness, stability and quality of simulation for the LHC are demanding. They call for an accurate description of hadronic interactions for a wide range of targets over a large energy range, from stopped particle reactions and low energy nuclear interactions to interactions at the TeV energy scale. This is achieved within the Geant4 toolkit by combining a number of models, each of which is valid within a certain energy domain. Comparison of these models to thin target data over a large energy range indicates the strengths and weaknesses of the model descriptions and the energy range over which each model is valid. Software has been developed to handle the large number of validation tests required to provide the feedback needed to improve the models. An automated process for carrying out the validation and storing/displaying the results is being developed and will be discussed.

  19. GEANT4 Simulation of the NPDGamma Experiment

    NASA Astrophysics Data System (ADS)

    Frlez, Emil

    2014-03-01

The n⃗ + p → d + γ experiment, currently taking data at the Oak Ridge SNS facility, is a high-precision measurement of the weak nuclear force at low energies. By detecting the correlation between the cold neutron spin and the photon direction in the capture of neutrons on a liquid hydrogen (LH) target, the experiment is sensitive to the properties of the neutral weak current. We have written a GEANT4 Monte Carlo simulation of the NPDGamma detector that, in addition to the active CsI detectors, also includes different targets and passive materials as well as the beam line elements. The neutron beam energy spectrum, its profiles, divergences, and time-of-flight are simulated in detail. We have used the code to cross-calibrate the positions of the (i) polarized LH target, (ii) aluminum target, and (iii) CCl4 target. The responses of the 48 CsI detectors in the simulation were fixed using data taken on the LH target. Both neutron absorption and scattering and thermal processes were turned on in the GEANT4 physics lists. We use the results to simulate in detail the data obtained with the different targets used in the experiment within a comprehensive analysis. This work is supported by NSF grant PHY-1307328.

  20. An XML description of detector geometries for GEANT4

    NASA Astrophysics Data System (ADS)

    Figgins, J.; Walker, B.; Comfort, J. R.

    2006-12-01

    A code has been developed that enables the geometry of detectors to be specified easily and flexibly in the XML language, for use in the Monte Carlo program GEANT4. The user can provide clear documentation of the geometry without being proficient in the C++ language of GEANT4. The features and some applications are discussed.
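The XML schema itself is not reproduced in the abstract; a hypothetical sketch of how such an XML geometry description might be parsed into volume parameters before being handed to GEANT4 (element and attribute names invented for illustration) is:

```python
import xml.etree.ElementTree as ET

# Hypothetical geometry description in the spirit of the abstract:
# each <box> becomes a placed volume with half-lengths in mm.
GEOMETRY = """
<detector name="demo">
  <box name="target" material="G4_WATER" x="10" y="10" z="2"/>
  <box name="shield" material="G4_Pb"    x="50" y="50" z="5"/>
</detector>
"""

def parse_geometry(xml_text):
    root = ET.fromstring(xml_text)
    volumes = []
    for box in root.findall("box"):
        volumes.append({
            "name": box.get("name"),
            "material": box.get("material"),
            "half_lengths_mm": tuple(float(box.get(k)) for k in "xyz"),
        })
    return root.get("name"), volumes

name, vols = parse_geometry(GEOMETRY)
print(name, [v["name"] for v in vols])
```

The point of such a layer is exactly what the abstract claims: the detector description lives in a declarative, self-documenting file, and only the thin parsing layer needs to know the C++ construction calls.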

  1. Space and Medical Applications of the Geant4 Simulation Toolkit

    NASA Astrophysics Data System (ADS)

    Perl, Joseph

    2008-10-01

Geant4 is a toolkit to simulate the passage of particles through matter. While Geant4 was developed for High Energy Physics (HEP), applications now include nuclear, medical and space physics. Medical applications have been increasing rapidly due to the overall growth of Monte Carlo in medical physics and the unique qualities of Geant4 as an all-particle code able to handle complex geometry, motion and fields with the flexibility of modern programming and an open, free source code. Work has included characterizing beams and sources, treatment planning and imaging. The all-particle nature of Geant4 has made it popular for the newest modes of radiation treatment: proton and particle therapy. Geant4 has been used by ESA, NASA and JAXA to study radiation effects on spacecraft and personnel. The flexibility of Geant4 has enabled teams to incorporate it into their own applications (the SPENVIS MULASSIS space environment tool from QinetiQ and ESA, the RADSAFE simulation from Vanderbilt University and NASA). We provide an overview of applications and discuss how Geant4 has responded to specific challenges of moving from HEP to medical and space physics, including recent work to extend Geant4's energy range to low-dose radiobiology.

  2. Application of GEANT4 in the Development of New Radiation Therapy Treatment Methods

    NASA Astrophysics Data System (ADS)

    Brahme, Anders; Gudowska, Irena; Larsson, Susanne; Andreassen, Björn; Holmberg, Rickard; Svensson, Roger; Ivanchenko, Vladimir; Bagulya, Alexander; Grichine, Vladimir; Starkov, Nikolay

    2006-04-01

New radiation treatment methods are developing very rapidly today, from the advanced use of intensity-modulated photon and electron beams to light-ion therapy with treatment units based on narrow scanned beams. Accurate radiation transport calculations are a key requisite for these developments, and Geant4 is a very useful Monte Carlo code for the accurate design of new treatment units. Today we can not only image the tumor by PET-CT before the treatment, but also determine the tumor's sensitivity to radiation and even measure in vivo the delivered absorbed dose in three dimensions in the patient. With such methods, accurate Monte Carlo calculations will make radiation therapy an almost exact science in which curative doses can be calculated based on patient-individual response data. In the present study, results from the application of Geant4 are discussed and comparisons between Geant4 and experimental and other Monte Carlo data are presented.

  3. GEANT4 simulations of the DANCE array

    NASA Astrophysics Data System (ADS)

    Jandel, M.; Bredeweg, T. A.; Couture, A.; Fowler, M. M.; Bond, E. M.; Chadwick, M. B.; Clement, R. R. C.; Esch, E.-I.; O'Donnell, J. M.; Reifarth, R.; Rundberg, R. S.; Ullmann, J. L.; Vieira, D. J.; Wilhelmy, J. B.; Wouters, J. M.; Macri, R. A.; Wu, C. Y.; Becker, J. A.

    2007-08-01

The Detector for Advanced Neutron Capture Experiments (DANCE) at Los Alamos National Laboratory (LANL) is used for neutron capture cross-section measurements. Its high granularity of 160 BaF2 detectors in a 4π geometry allows for highly efficient detection of the prompt γ-rays following a neutron capture. The performance of the detector was simulated using the GEANT4 Monte Carlo code. The model includes all 160 BaF2 crystals with realistic dimensions and geometry. The 6LiH shell, beam pipe, crystal wrapping material, aluminum holders, photomultiplier material and materials of the calibration sources were included in the simulation. Simulated γ-ray energy and total γ-ray energy spectra gated on cluster and crystal multiplicities were compared to data measured with 88Y, 60Co and 22Na calibration sources, and good agreement was achieved. The total efficiency and peak-to-total ratio as a function of γ-ray energy were established for mono-energetic γ-rays.

  4. An Overview of the Geant4 Toolkit

    NASA Astrophysics Data System (ADS)

    Apostolakis, John; Wright, Dennis H.

    2007-03-01

Geant4 is a toolkit for the simulation of the transport of radiation through matter. With a flexible kernel and a choice of physics modeling options, it has been tailored to the requirements of a wide range of applications. With the toolkit a user can describe a setup's or detector's geometry and materials, navigate inside it, simulate the physical interactions using a choice of physics engines, underlying physics cross-sections and models, and visualise and store results. Physics models describing electromagnetic and hadronic interactions are provided, as are decays and processes for optical photons. Several models, with different precision and performance, are available for many processes. The toolkit includes coherent physics model configurations, which are called physics lists. Users can choose an existing physics list or create their own, depending on their requirements and the application area. A clear structure and readable code enable the user to investigate the origin of physics results. Application areas include detector simulation and background simulation in High Energy Physics experiments, simulation of accelerator setups, studies in medical imaging and treatment, and the study of the effects of solar radiation on spacecraft instruments.

  5. Simulation of Cold Neutron Experiments using GEANT4

    NASA Astrophysics Data System (ADS)

    Frlez, Emil; Hall, Joshua; Root, Melinda; Baessler, Stefan; Pocanic, Dinko

    2013-10-01

We review the available GEANT4 physics processes for cold neutrons in the energy range 1-100 meV. We consider the cases of the neutron beam interacting with (i) para- and ortho-polarized liquid hydrogen, (ii) aluminum, and (iii) carbon tetrachloride (CCl4) targets. The scattering, thermal and absorption cross sections used by the GEANT4 and MCNP6 libraries are compared with the National Nuclear Data Center (NNDC) compilation. The NPDGamma detector simulation is presented as an example of the implementation of the resulting GEANT4 code. This work is supported by NSF grant PHY-0970013.

  6. SU-E-J-72: Geant4 Simulations of Spot-Scanned Proton Beam Treatment Plans

    SciTech Connect

    Kanehira, T; Sutherland, K; Matsuura, T; Umegaki, K; Shirato, H

    2014-06-01

Purpose: To evaluate density inhomogeneities which can affect dose distributions for real-time image gated spot-scanning proton therapy (RGPT), a Geant4-based dose calculation system was developed using spot position data from the VQA treatment planning system (Hitachi Ltd., Tokyo). Methods: A Geant4 application was developed to simulate spot-scanned proton beams at Hokkaido University Hospital. A CT scan (0.98 × 0.98 × 1.25 mm) was performed for prostate cancer treatment with three or four gold markers (diameter 1.5 mm, volume 1.77 mm3) inserted in or near the target tumor. The CT data was read into VQA, and a spot scanning plan was generated and exported to text files specifying the beam energy and position of each spot. The text files were converted and read into our Geant4-based software. Each spot position was converted into a steering magnet field strength (in Tesla) for our beam nozzle. Individual protons were tracked from the vacuum chamber, through the helium chamber, steering magnets, dose monitors, etc., in a straight, horizontal line. The patient CT data was converted into materials with variable density and placed in a parametrized volume at the isocenter. Gold fiducial markers were represented in the CT data by two adjacent voxels (volume 2.38 mm3). 600,000 proton histories were tracked for each target spot; as one beam contained about 1,000 spots, approximately 600 million histories were recorded for each beam on a blade server. Two plans were considered: two horizontally opposed beams (90 and 270 degrees) and three beams (0, 90 and 270 degrees). Results: We are able to convert spot scanning plans from VQA and simulate them with our Geant4-based code. Our system can be used to evaluate the dose reduction caused by the gold markers used for RGPT. Conclusion: Our Geant4 application is able to calculate dose distributions for spot-scanned proton therapy.
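The conversion from planned spot position to steering magnet field strength is not specified in the abstract; a deliberately simplified sketch of such a conversion (the magnet-to-isocenter distance and the Tesla-per-radian factor below are invented for illustration, not taken from the paper) might look like:

```python
import math

# Hypothetical linear optics: a spot displaced x_mm at the isocenter
# requires a small deflection angle, produced by a proportional field.
def spot_to_field(x_mm, magnet_to_iso_mm=2000.0, tesla_per_rad=1.5):
    angle = math.atan2(x_mm, magnet_to_iso_mm)   # deflection angle needed
    return tesla_per_rad * angle                 # field producing that bend

for x in (0.0, 50.0):
    print(f"spot {x:6.1f} mm -> {spot_to_field(x):.4f} T")
```

A real nozzle would use measured magnet calibration curves and energy-dependent rigidity rather than a single constant, but the lookup has this shape: plan coordinates in, field setting out.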

  7. Geant4 electromagnetic physics updates for space radiation effects simulation

    NASA Astrophysics Data System (ADS)

    Ivantchenko, Anton; Nieminen, Petteri; Incerti, Sebastien; Santin, Giovanni; Ivantchenko, Vladimir; Grichine, Vladimir; Allison, John; Karamitos, Mathiew

The Geant4 toolkit is used in many applications, including space science studies. The new Geant4 version 10.0, released in December 2013, includes a major revision of the toolkit and offers a multi-threaded mode for event-level parallelism. At the same time, the Geant4 electromagnetic and hadronic physics sub-libraries have been significantly updated. In order to validate the new and updated models, the Geant4 verification tests and benchmarks were extended. Part of these developments was sponsored by the European Space Agency in the context of research aimed at modelling radiation biological end effects. In this work, we present an overview of results of several benchmarks for electromagnetic physics models relevant to space science. For electromagnetic physics, the Compton scattering, photoelectric effect, and Rayleigh scattering models have recently been improved and extended down to lower energies. Models of ionization and fluctuations have also been improved; special micro-dosimetry models for silicon and liquid water were introduced; the main multiple scattering model was consolidated; and the atomic de-excitation module has been made available to all models. As a result, Geant4 predictions for space radiation effects obtained with different physics lists are in better agreement with the benchmark data than those of previous Geant4 versions. Here we present results of electromagnetic tests and model comparisons in the energy interval 10 eV-10 MeV.

  8. artG4: A Generic Framework for Geant4 Simulations

    SciTech Connect

    Arvanitis, Tasha; Lyon, Adam

    2014-01-01

    A small experiment must devote its limited computing expertise to writing physics code directly applicable to the experiment. A software 'framework' is essential for providing an infrastructure that makes writing the physics-relevant code easy. In this paper, we describe a highly modular and easy to use framework for writing Geant4 based simulations called 'artg4'. This framework is a layer on top of the art framework.

  9. GEANT4 Simulation of Neutron Detector for DAMPE

    NASA Astrophysics Data System (ADS)

    He, M.; Ma, T.; Chang, J.; Zhang, Y.; Huang, Y. Y.; Zang, J. J.; Wu, J.; Dong, T. K.

    2016-01-01

During recent decades dark matter has gradually become a hot topic in astronomical research, and related theoretical studies and experimental projects are evolving rapidly. China's Dark Matter Particle Explorer (DAMPE) was proposed in this context. As the probing object involves high energy electrons, appropriate methods must be adopted to distinguish them from protons, in order to reduce the probability of other charged particles (e.g. protons) being mistaken for electrons. Experiments show that the hadronic shower of a high energy proton in the BGO electromagnetic calorimeter, which is usually accompanied by the emission of a large number of secondary neutrons, is significantly different from the electromagnetic shower of a high energy electron. Through the detection of the secondary neutron signal emerging from the bottom of the BGO electromagnetic calorimeter and the shower shape of incident particles in the calorimeter, we can effectively distinguish whether the incident particles are high energy protons or electrons. This paper introduces the structure and detection principle of the DAMPE neutron detector. We use the Monte Carlo method with the GEANT4 software to simulate the signals produced by protons and electrons at characteristic energies in the neutron detector, and finally summarize the neutron detector's ability to distinguish protons from electrons under different electron acceptance efficiencies.

  10. Monte Carlo simulation of a photodisintegration of 3 H experiment in Geant4

    NASA Astrophysics Data System (ADS)

    Gray, Isaiah

    2013-10-01

    An upcoming experiment involving photodisintegration of 3 H at the High Intensity Gamma-Ray Source facility at Duke University has been simulated in the software package Geant4. CAD models of silicon detectors and wire chambers were imported from Autodesk Inventor using the program FastRad and the Geant4 GDML importer. Sensitive detectors were associated with the appropriate logical volumes in the exported GDML file so that changes in detector geometry will be easily manifested in the simulation. Probability distribution functions for the energy and direction of outgoing protons were generated using numerical tables from previous theory, and energies and directions were sampled from these distributions using a rejection sampling algorithm. The simulation will be a useful tool to optimize detector geometry, estimate background rates, and test data analysis algorithms. This work was supported by the Triangle Universities Nuclear Laboratory REU program at Duke University.
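The rejection sampling step described above can be sketched as follows (the sin² angular distribution is a toy stand-in for the tabulated theoretical distributions of the experiment, not its actual physics):

```python
import math
import random

def rejection_sample(pdf, x_min, x_max, pdf_max, rng=random.Random(42)):
    # Draw x uniformly over [x_min, x_max]; accept it with
    # probability pdf(x) / pdf_max, otherwise try again.
    while True:
        x = rng.uniform(x_min, x_max)
        if rng.uniform(0.0, pdf_max) < pdf(x):
            return x

# Toy angular distribution proportional to sin^2(theta) on [0, pi].
samples = [rejection_sample(lambda t: math.sin(t) ** 2, 0.0, math.pi, 1.0)
           for _ in range(5000)]
mean = sum(samples) / len(samples)
print(round(mean, 2))  # symmetric pdf, so the mean lands near pi/2
```

The method only needs an upper bound on the pdf, which is why it works directly on numerical tables; its cost is the rejected draws (here about half, since the average of sin² over the interval is 0.5).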

  11. Progress in Geant4 Electromagnetic Physics Modelling and Validation

    NASA Astrophysics Data System (ADS)

    Apostolakis, J.; Asai, M.; Bagulya, A.; Brown, J. M. C.; Burkhardt, H.; Chikuma, N.; Cortes-Giraldo, M. A.; Elles, S.; Grichine, V.; Guatelli, S.; Incerti, S.; Ivanchenko, V. N.; Jacquemier, J.; Kadri, O.; Maire, M.; Pandola, L.; Sawkey, D.; Toshito, T.; Urban, L.; Yamashita, T.

    2015-12-01

In this work we report on recent improvements in the electromagnetic (EM) physics models of Geant4 and new validations of EM physics. Improvements have been made in models of the photoelectric effect, Compton scattering, gamma conversion to electron and muon pairs, fluctuations of energy loss, multiple scattering, synchrotron radiation, and high energy positron annihilation. The results of these developments are included in the new Geant4 version 10.1 and in patches to the previous versions 9.6 and 10.0 that are planned to be used for production in Run 2 at the LHC. The Geant4 validation suite for EM physics has been extended, and new validation results are shown in this work. In particular, the effect of gamma-nuclear interactions on EM shower shape at LHC energies is discussed.

  12. GEANT4 simulation of APEX background radiation and shielding

    NASA Astrophysics Data System (ADS)

    Kaluarachchi, Maduka M.; Cates, Gordon D.; Wojtsekhowski, B.

    2015-04-01

The A' Experiment (APEX), which is approved to run in Hall A at the Thomas Jefferson National Accelerator Facility (JLab), will search for a new vector boson that is hypothesized to be a possible force carrier that couples to dark matter. APEX results should be sensitive to the mass range of 65 MeV to 550 MeV, and high sensitivity will be achieved by means of a high-intensity 100 μA beam on a 0.5 g/cm2 tungsten target, resulting in very high luminosity. The experiment should be able to observe the A' with a coupling constant α' ~ 1 × 10⁷ times smaller than the electromagnetic coupling constant α. To deal safely with such enormous intensity and luminosity, a full radiation analysis must be used to help with the design of proper radiation shielding. The purpose of this talk is to present preliminary results obtained by simulating the radiation background of the APEX experiment using the 3D Monte Carlo transport code Geant4. Included in the simulation is a detailed Hall A setup: the hall, spectrometers and shield house, beam dump, beam line, septum magnet with its field, as well as the production target. The results were compared to the APEX test run data and used in the development of the radiation shielding for sensitive electronics.

  13. Preliminary Investigation of Microdosimetric Track Structure Physics Models in Geant4-DNA and RITRACKS

    PubMed Central

    Bezak, Eva

    2015-01-01

    The major differences between the physics models in Geant4-DNA and RITRACKS Monte Carlo packages are investigated. Proton and electron ionisation interactions and electron excitation interactions in water are investigated in the current work. While these packages use similar semiempirical physics models for inelastic cross-sections, the implementation of these models is demonstrated to be significantly different. This is demonstrated in a simple Monte Carlo simulation designed to identify differences in interaction cross-sections. PMID:26124856

  14. The GEANT4 toolkit for microdosimetry calculations: application to microbeam radiation therapy (MRT).

    PubMed

    Spiga, J; Siegbahn, E A; Bräuer-Krisch, E; Randaccio, P; Bravin, A

    2007-11-01

    Theoretical dose distributions for microbeam radiation therapy (MRT) are computed in this paper using the GEANT4 Monte Carlo (MC) simulation toolkit. MRT is an innovative experimental radiotherapy technique carried out using an array of parallel microbeams of synchrotron-wiggler-generated x rays. Although the biological mechanisms underlying the effects of microbeams are still largely unknown, the effectiveness of MRT can be traced back to the natural ability of normal tissues to rapidly repair small damage to the vasculature, and to the lack of a similar healing process in tumoral tissues. Contrary to conventional therapy, in which each beam is at least several millimeters wide, the narrowness of the microbeams allows a rapid regeneration of the blood vessels along the beams' trajectories. For this reason the calculation of the "valley" dose is of crucial importance and the correct use of MC codes for such purposes must be understood. GEANT4 offers, in addition to the standard libraries, a specialized package specifically designed to deal with electromagnetic interactions of particles with matter for energies down to 250 eV. This package implements two different approaches for electron and photon transport, one based on evaluated data libraries, the other adopting analytical models. These features are exploited to cross-check theoretical computations for MRT. The lateral and depth dose profiles are studied for the irradiation of a 20 cm diameter, 20 cm long cylindrical phantom, with cylindrical sources of different size and energy. Microbeam arrays are simulated with the aid of superposition algorithms, and the ratios of peak-to-valley doses are computed for typical cases used in preclinical assays. Dose profiles obtained using the GEANT4 evaluated data libraries and analytical models are compared with simulation results previously obtained using the PENELOPE code. The results show that dose profiles computed with GEANT4's analytical model are almost
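
    The superposition step described above can be sketched as follows. The single-beam profile here is a purely illustrative Gaussian core plus a small scatter tail, not a GEANT4 or PENELOPE result; only the superpose-and-take-the-ratio logic mirrors the abstract:

```python
import math

def single_beam_dose(x_um, fwhm_um=50.0, tail_frac=0.01, tail_scale_um=200.0):
    """Illustrative lateral dose profile of one microbeam: a Gaussian core
    plus a small exponential scatter tail (stand-in for a simulated profile)."""
    sigma = fwhm_um / 2.3548                       # FWHM -> Gaussian sigma
    core = math.exp(-0.5 * (x_um / sigma) ** 2)
    tail = tail_frac * math.exp(-abs(x_um) / tail_scale_um)
    return core + tail

def pvdr(spacing_um=400.0, n_beams=25):
    """Peak-to-valley dose ratio of a microbeam array built by superposing
    shifted copies of the single-beam profile."""
    centers = [(i - n_beams // 2) * spacing_um for i in range(n_beams)]
    def dose(x):
        return sum(single_beam_dose(x - c) for c in centers)
    peak = dose(0.0)                    # on the central beam axis
    valley = dose(spacing_um / 2.0)     # midway between two beams
    return peak / valley

print(f"PVDR ~ {pvdr():.0f}")
```

With these toy parameters the valley dose is dominated by the overlapping scatter tails, which is exactly why the valley dose is the critical quantity in MRT planning.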

  15. Geant4 models for simulation of hadron/ion nuclear interactions at moderate and low energies.

    NASA Astrophysics Data System (ADS)

    Ivantchenko, Anton; Ivanchenko, Vladimir; Quesada, Jose-Manuel; Wright, Dennis

    The Geant4 toolkit is intended for Monte Carlo simulation of particle transport in media. It was initially designed for high energy physics purposes such as experiments at the Large Hadron Collider (LHC) at CERN. The toolkit offers a set of models allowing effective simulation of cosmic ray interactions with different materials. For moderate and low energy hadron/ion interactions with nuclei there are a number of competitive models: the Binary and Bertini intra-nuclear cascade models, the quantum molecular dynamics model (QMD), the INCL/ABLA cascade model, and the Chiral Invariant Phase Space decay model (CHIPS). We report the status of these models for the recent version of Geant4 (release 9.3, December 2009). The Bertini cascade internal cross sections were upgraded. The native Geant4 precompound and de-excitation models were used in the Binary cascade and QMD. They were significantly improved, including emission of light fragments, the Fermi break-up model, the General Evaporation Model (GEM), the multi-fragmentation model, and the fission model. Comparisons between model predictions and data from thin target experiments for neutron, proton, light ion, and isotope production are presented and discussed. These validations focus on target materials important for space missions.

  16. Simulation and modeling for the stand-off radiation detection system (SORDS) using GEANT4

    SciTech Connect

    Hoover, Andrew S; Wallace, Mark; Galassi, Mark; Mocko, Michal; Palmer, David; Schultz, Larry; Tornga, Shawn

    2009-01-01

    A Stand-Off Radiation Detection System (SORDS) is being developed through a joint effort by Raytheon, Los Alamos National Laboratory, Bubble Technology Industries, Radiation Monitoring Devices, and the Massachusetts Institute of Technology, for the Domestic Nuclear Detection Office (DNDO). The system is a mobile truck-based platform performing detection, imaging, and spectroscopic identification of gamma-ray sources. A Tri-Modal Imaging (TMI) approach combines active-mask coded aperture imaging, Compton imaging, and shadow imaging techniques. Monte Carlo simulation and modeling using the GEANT4 toolkit was used to generate realistic data for the development of imaging algorithms and associated software code.
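
    The Compton imaging mode of the TMI approach mentioned above rests on standard Compton kinematics: the first-scatter energy deposit fixes the opening angle of a cone of possible source directions. A minimal sketch of that cone-angle reconstruction (the energies used are illustrative, not SORDS data):

```python
import math

ME_C2 = 0.511  # electron rest energy, MeV

def compton_cone_angle(e_dep_mev, e_total_mev):
    """Opening angle (degrees) of the Compton cone from the first-scatter
    energy deposit, using cos(theta) = 1 - me*c^2 * (1/E' - 1/E), E' = E - Edep."""
    e_scat = e_total_mev - e_dep_mev
    cos_t = 1.0 - ME_C2 * (1.0 / e_scat - 1.0 / e_total_mev)
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_t))))

# e.g. a Cs-137 gamma (662 keV) depositing 300 keV in the first interaction
print(f"cone half-angle: {compton_cone_angle(0.3, 0.662):.1f} deg")
```

Backprojecting many such cones onto the sky and looking for their intersection is the essence of the Compton imaging component; the coded-aperture and shadow modes use independent geometry.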

  17. Application of Geant4 in routine close geometry gamma spectroscopy for environmental samples.

    PubMed

    Dababneh, Saed; Al-Nemri, Ektimal; Sharaf, Jamal

    2014-08-01

    This work examines the utilization of Geant4 to practically achieve crucial corrections, in close geometry, for self-absorption and true coincidence summing in gamma-ray spectrometry of environmental samples, namely soil and water. After validation, different simulation options have been explored and compared. The simulation was used to correct for self-absorption effects, and to establish a summing-free efficiency curve, thus overcoming limitations and uncertainties imposed by conventional calibration standards. To be applicable in busy laboratories, simulation results were introduced into the conventional software Genie 2000 in order to be reliably used in everyday routine measurements.
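
    For context, the self-absorption correction that the Geant4 simulation computes can be approximated analytically in a slab geometry: the efficiency scales with the average photon transmission (1 − e^(−μt))/(μt) through the sample. The sketch below uses illustrative attenuation coefficients, not the paper's results:

```python
import math

def transmission_factor(mu_cm, thickness_cm):
    """Average photon transmission through a slab sample of thickness t with
    linear attenuation coefficient mu: (1 - exp(-mu*t)) / (mu*t)."""
    x = mu_cm * thickness_cm
    return (1.0 - math.exp(-x)) / x if x > 0 else 1.0

def self_absorption_correction(mu_sample, mu_standard, thickness_cm):
    """Multiplicative correction for a sample whose matrix attenuates more
    than the calibration standard (first-order slab approximation only)."""
    return (transmission_factor(mu_standard, thickness_cm)
            / transmission_factor(mu_sample, thickness_cm))

# Illustrative values: a dense soil (mu ~ 0.20 /cm) counted against a
# water-based standard (mu ~ 0.086 /cm at 662 keV), 4 cm fill height.
print(f"correction ~ {self_absorption_correction(0.20, 0.086, 4.0):.3f}")
```

The correction exceeds 1 because the denser sample attenuates its own gammas more than the standard did; a full Monte Carlo treatment additionally captures the close-geometry and summing effects the abstract describes.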

  18. GEANT4 simulations of Cherenkov reaction history diagnostics.

    PubMed

    Rubery, M S; Horsfield, C J; Herrmann, H W; Kim, Y; Mack, J M; Young, C S; Caldwell, S E; Evans, S C; Sedilleo, T J; McEvoy, A; Miller, E K; Stoeffl, W; Ali, Z; Toebbe, J

    2010-10-01

    This paper compares the results from a GEANT4 simulation of the gas Cherenkov detector 1 (GCD1) with previous simulations and experimental data from the Omega laser facility. The GCD1 collects gammas emitted during a deuterium-tritium capsule implosion and converts them, through several processes, to Cherenkov light. Photon signals are recorded using subnanosecond photomultiplier tubes, producing burn reaction histories. The GEANT4 GCD1 simulation is first benchmarked against ACCEPT, an integrated tiger series code, with good agreement. The simulation is subsequently compared with data from the Omega laser facility, where experiments have been performed to measure the effects of Hohlraum materials on reaction history signals, in preparation for experiments at the National Ignition Facility. PMID:21033850

  19. GEANT4 simulations of Cherenkov reaction history diagnostics

    SciTech Connect

    Rubery, M. S.; Horsfield, C. J.; Herrmann, H. W.; Kim, Y.; Mack, J. M.; Young, C. S.; Caldwell, S. E.; Evans, S. C.; Sedilleo, T. J.; McEvoy, A.; Miller, E. K.; Stoeffl, W.; Ali, Z.

    2010-10-15

    This paper compares the results from a GEANT4 simulation of the gas Cherenkov detector 1 (GCD1) with previous simulations and experimental data from the Omega laser facility. The GCD1 collects gammas emitted during a deuterium-tritium capsule implosion and converts them, through several processes, to Cherenkov light. Photon signals are recorded using subnanosecond photomultiplier tubes, producing burn reaction histories. The GEANT4 GCD1 simulation is first benchmarked against ACCEPT, an integrated tiger series code, with good agreement. The simulation is subsequently compared with data from the Omega laser facility, where experiments have been performed to measure the effects of Hohlraum materials on reaction history signals, in preparation for experiments at the National Ignition Facility.

  20. GEANT4 simulations of Cherenkov reaction history diagnostics

    NASA Astrophysics Data System (ADS)

    Rubery, M. S.; Horsfield, C. J.; Herrmann, H. W.; Kim, Y.; Mack, J. M.; Young, C. S.; Caldwell, S. E.; Evans, S. C.; Sedilleo, T. J.; McEvoy, A.; Miller, E. K.; Stoeffl, W.; Ali, Z.; Toebbe, J.

    2010-10-01

    This paper compares the results from a GEANT4 simulation of the gas Cherenkov detector 1 (GCD1) with previous simulations and experimental data from the Omega laser facility. The GCD1 collects gammas emitted during a deuterium-tritium capsule implosion and converts them, through several processes, to Cherenkov light. Photon signals are recorded using subnanosecond photomultiplier tubes, producing burn reaction histories. The GEANT4 GCD1 simulation is first benchmarked against ACCEPT, an integrated tiger series code, with good agreement. The simulation is subsequently compared with data from the Omega laser facility, where experiments have been performed to measure the effects of Hohlraum materials on reaction history signals, in preparation for experiments at the National Ignition Facility.

  1. Transmission Efficiency of the Sage Spectrometer Using GEANT4

    NASA Astrophysics Data System (ADS)

    Cox, D. M.; Herzberg, R.-D.; Papadakis, P.; Ali, F.; Butler, P. A.; Cresswell, J. R.; Mistry, A.; Sampson, J.; Seddon, D. A.; Thornhill, J.; Wells, D.; Konki, J.; Greenlees, P. T.; Rahkila, P.; Pakarinen, J.; Sandzelius, M.; Sorri, J.; Julin, R.; Coleman-Smith, P. J.; Lazarus, I. H.; Letts, S. C.; Simpson, J.; Pucknell, V. F. E.

    2014-09-01

    The new SAGE spectrometer allows simultaneous electron and γ-ray in-beam studies of heavy nuclei. A comprehensive GEANT4 simulation suite has been created for the SAGE spectrometer. This includes both the silicon detectors for electron detection and the germanium detectors for γ-ray detection. The simulation can be used for a wide variety of tests with the aim of better understanding the behaviour of SAGE. A number of aspects of electron transmission are presented here.

  2. Design software for reuse

    NASA Technical Reports Server (NTRS)

    Tracz, Will

    1990-01-01

    Viewgraphs are presented on the designing of software for reuse. Topics include terminology, software reuse maxims, the science of programming, an interface design example, a modularization example, and reuse and implementation guidelines.

  3. Validation of Geant4 hadronic physics models at intermediate energies

    NASA Astrophysics Data System (ADS)

    Banerjee, Sunanda; Geant4 Hadronic Group

    2010-04-01

    GEANT4 provides a number of physics models at intermediate energies (corresponding to incident momenta in the range 1-20 GeV/c). Recently, these models have been validated with existing data from a number of experiments: (1) inclusive proton and neutron production with a variety of beams (π-, π+, p) at different energies between 1 and 9 GeV/c on a number of nuclear targets (from beryllium to uranium); (2) inclusive pion/kaon/proton production from 14.6 GeV/c proton beams on nuclear targets (from beryllium to gold); (3) inclusive pion production from pion beams between 3 and 13 GeV/c on a number of nuclear targets (from beryllium to lead). The results of simulation/data comparisons for different GEANT4 models are discussed in the context of validating the models and determining their usage in physics lists for high energy applications. Due to the increasing number of validations becoming available, and the requirement that they be done at regular intervals corresponding to the GEANT4 release schedule, automated methods of validation are being developed.

  4. Electro and gamma nuclear physics in Geant4

    SciTech Connect

    J.P. Wellisch; M. Kossov; P. Degtyarenko

    2003-03-01

    Adequate description of electro- and gamma-nuclear physics is of utmost importance in studies of electron beam-dumps and intense electron beam accelerators. It is also mandatory for describing neutron backgrounds and activation in linear colliders. This physics was elaborated in Geant4 over the last year and has now entered the stage of practical application. In the Geant4 photo-nuclear database there are at present about 50 nuclei for which the photo-nuclear absorption cross sections have been measured. Of these, data on 14 nuclei are used to parametrize the gamma-nuclear reaction cross section. The resulting cross section is a complex, factorized function of A and ε = log(Eγ), where Eγ is the energy of the incident photon. Electro-nuclear reactions are so closely connected with photo-nuclear reactions that they are often called "photo-nuclear". The one-photon exchange mechanism dominates in electro-nuclear reactions, and the electron can be substituted by a flux of photons. Folding this flux with the gamma-nuclear cross section, we arrive at an acceptable description of electro-nuclear physics. Final states in gamma- and electro-nuclear physics are described using chiral invariant phase-space decay at low gamma (or equivalent photon) energies, and the quark-gluon string model at high energies. We present the modeling of this physics in Geant4 and show results from practical applications.
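
    The flux-folding step described above can be sketched numerically. Both ingredients below are illustrative stand-ins, not Geant4's parameterizations: the equivalent-photon spectrum is a crude Weizsäcker-Williams form, and the photo-nuclear cross section is a giant-dipole-resonance-like Lorentzian with made-up parameters.

```python
import math

ALPHA = 1.0 / 137.036  # fine-structure constant

def photon_flux(e_gamma, e_beam):
    """Crude equivalent-photon spectrum dN/dE (per MeV), a Weizsacker-Williams
    stand-in for the virtual-photon flux mentioned above."""
    return (2.0 * ALPHA / math.pi) * math.log(e_beam / e_gamma) / e_gamma

def sigma_gamma_n(e_gamma):
    """Hypothetical photo-nuclear cross section (mb): a GDR-like Lorentzian
    peaked near 20 MeV, purely illustrative."""
    e0, width, peak = 20.0, 6.0, 60.0
    return peak * (e_gamma * width) ** 2 / ((e_gamma**2 - e0**2) ** 2 + (e_gamma * width) ** 2)

def sigma_electro_n(e_beam, e_min=5.0, n=2000):
    """Fold the photon flux with the photo-nuclear cross section (trapezoid rule)."""
    de = (e_beam - e_min) / n
    total = 0.0
    for k in range(n + 1):
        e = e_min + k * de
        weight = 0.5 if k in (0, n) else 1.0
        total += weight * photon_flux(e, e_beam) * sigma_gamma_n(e) * de
    return total  # mb

print(f"sigma_eN(200 MeV beam) ~ {sigma_electro_n(200.0):.3f} mb")
```

The electro-nuclear cross section comes out roughly α/π times the photo-nuclear one, which is why the one-photon-exchange substitution is such an effective approximation.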

  5. GEANT4 for breast dosimetry: parameters optimization study

    NASA Astrophysics Data System (ADS)

    Fedon, C.; Longo, F.; Mettivier, G.; Longo, R.

    2015-08-01

    Mean glandular dose (MGD) is the main dosimetric quantity in mammography. MGD evaluation is obtained by multiplying the entrance skin air kerma (ESAK) by normalized glandular dose (DgN) coefficients. While ESAK is an empirical quantity, DgN coefficients can only be estimated with Monte Carlo (MC) methods. Thus, a MC parameters benchmark is needed for effectively evaluating DgN coefficients. GEANT4 is a MC toolkit suitable for medical purposes that offers to the users several computational choices. In this work we investigate the GEANT4 performances testing the main PhysicsLists for medical applications. Four electromagnetic PhysicsLists were implemented: the linear attenuation coefficients were calculated for breast glandularity 0%, 50%, 100% in the energy range 8-50 keV and DgN coefficients were evaluated. The results were compared with published data. Fit equations for the estimation of the G-factor parameter, introduced by the literature for converting the dose delivered in the heterogeneous medium to that in the glandular tissue, are proposed and the application of this parameter interaction-by-interaction or retrospectively is discussed. G4EmLivermorePhysicsList shows the best agreement for the linear attenuation coefficients both with theoretical values and published data. Moreover, an excellent correlation factor (r² > 0.99) is found for the DgN coefficients with the literature. The final goal of this study is to identify, for the first time, a benchmark of parameters that could be useful for future breast dosimetry studies with GEANT4.
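
    The dosimetric relation stated above, MGD = ESAK × DgN, is simple to apply once DgN is tabulated against beam quality. A minimal sketch; the DgN table below is entirely hypothetical (real coefficients depend on spectrum, breast thickness and glandularity, and would come from the Monte Carlo work described here):

```python
def interp(x, xs, ys):
    """Piecewise-linear interpolation within a tabulated range."""
    for (x0, y0), (x1, y1) in zip(zip(xs, ys), zip(xs[1:], ys[1:])):
        if x0 <= x <= x1:
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
    raise ValueError("x outside table")

# Hypothetical DgN values (mGy/mGy) vs half-value layer (mm Al) -- not the paper's data.
hvl_mm_al = [0.30, 0.40, 0.50, 0.60]
dgn = [0.150, 0.183, 0.208, 0.232]

def mean_glandular_dose(esak_mgy, hvl):
    """MGD = ESAK x DgN, with DgN interpolated at the measured beam quality."""
    return esak_mgy * interp(hvl, hvl_mm_al, dgn)

print(f"MGD = {mean_glandular_dose(7.0, 0.45):.3f} mGy")
```

ESAK is measured on the unit; everything Monte-Carlo-dependent is concentrated in the DgN table, which is why benchmarking the simulation parameters matters.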

  6. GEANT4 for breast dosimetry: parameters optimization study.

    PubMed

    Fedon, C; Longo, F; Mettivier, G; Longo, R

    2015-08-21

    Mean glandular dose (MGD) is the main dosimetric quantity in mammography. MGD evaluation is obtained by multiplying the entrance skin air kerma (ESAK) by normalized glandular dose (DgN) coefficients. While ESAK is an empirical quantity, DgN coefficients can only be estimated with Monte Carlo (MC) methods. Thus, a MC parameters benchmark is needed for effectively evaluating DgN coefficients. GEANT4 is a MC toolkit suitable for medical purposes that offers to the users several computational choices. In this work we investigate the GEANT4 performances testing the main PhysicsLists for medical applications. Four electromagnetic PhysicsLists were implemented: the linear attenuation coefficients were calculated for breast glandularity 0%, 50%, 100% in the energy range 8-50 keV and DgN coefficients were evaluated. The results were compared with published data. Fit equations for the estimation of the G-factor parameter, introduced by the literature for converting the dose delivered in the heterogeneous medium to that in the glandular tissue, are proposed and the application of this parameter interaction-by-interaction or retrospectively is discussed. G4EmLivermorePhysicsList shows the best agreement for the linear attenuation coefficients both with theoretical values and published data. Moreover, an excellent correlation factor (r² > 0.99) is found for the DgN coefficients with the literature. The final goal of this study is to identify, for the first time, a benchmark of parameters that could be useful for future breast dosimetry studies with GEANT4. PMID:26267405

  7. Software Design Analyzer System

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1985-01-01

    The CRISP80 Software Design Analyzer System is a set of programs that supports top-down, hierarchic, modular, structured design and programming methodologies. CRISP80 allows design to be expressed as a picture of the program.

  8. Progress in Hadronic Physics Modelling in Geant4

    SciTech Connect

    Apostolakis, John; Folger, Gunter; Grichine, Vladimir; Heikkinen, Aatos; Howard, Alexander; Ivanchenko, Vladimir; Kaitaniemi, Pekka; Koi, Tatsumi; Kosov, Mikhail; Quesada, Jose Manuel; Ribon, Alberto; Uzhinsky, Vladimir; Wright, Dennis; /SLAC

    2011-11-28

    Geant4 offers a set of models to simulate hadronic showers in calorimeters. Recent improvements to several models relevant to the modelling of hadronic showers are discussed. These include improved cross sections, a revision of the FTF model, the addition of quasi-elastic scattering to the QGS model, and enhancements in the nuclear precompound and de-excitation models. The validation of physics models against thin target experiments has been extended especially in the energy region 10 GeV and below. Examples of new validation results are shown.

  9. Antinucleus-Nucleus Cross Sections Implemented in Geant4

    SciTech Connect

    Uzhinsky, V.; Apostolakis, J.; Galoyan, A.; Folger, G.; Grichine, V.M.; Ivanchenko, V.N.; Wright, D.H.; /SLAC

    2012-04-26

    Cross sections of antinucleus (p̄, d̄, t̄, anti-³He, anti-⁴He) interactions with nuclei in the energy range 100 MeV/c to 1000 GeV/c per antinucleon are calculated in the Glauber approximation, which provides a good description of all known p̄A cross sections. The results were obtained using a new parameterization of the total and elastic p̄p cross sections. Simple parameterizations of the antinucleus-nucleus cross sections are proposed for use in estimating the efficiency of antinucleus detection and tracking in cosmic ray and accelerator experiments. These parameterizations are implemented in the Geant4 toolkit.

  10. Measuring software design

    NASA Technical Reports Server (NTRS)

    1986-01-01

    An extensive series of studies of software design measures conducted by the Software Engineering Laboratory is described. Included are the objectives and results of the studies, the method used to perform the studies, and the problems encountered. The document should be useful to researchers planning similar studies as well as to managers and designers concerned with applying quantitative design measures.

  11. Assessment of a new multileaf collimator concept using GEANT4 Monte Carlo simulations.

    PubMed

    Tacke, Martin B; Szymanowski, Hanitra; Oelfke, Uwe; Schulze, Carsten; Nuss, Susanne; Wehrwein, Eugen; Leidenberger, Stefan

    2006-04-01

    The aim of the work was to investigate in advance the dosimetric properties of a new multileaf collimator (MLC) concept with the help of Monte Carlo (MC) simulations prior to the production of a prototype. The geometrical design of the MLC was implemented in the MC code GEANT4. For the simulation of a 6 MV treatment beam, an experimentally validated phase space and a virtual spatial Gaussian-shaped model placed in the origin were used. For the simulation of the geometry in GEANT4, the jaws and the two leaf packages were implemented with the help of computer-aided design data. First, transmission values for different tungsten alloys were extracted using the simulation codes GEANT4 and BEAMnrc and compared to experimental measurements. In a second step, high-resolution simulations were performed to detect the leakage at depth of maximum dose. The 20%-80% penumbra along the travel direction of the leaves was determined using 10 × 10 cm² fields shifted along the x- and y-axis. The simulated results were compared with measured data. The simulation of the transmission values for different tungsten alloys showed a good agreement with the experimental measurements (within 2.0%). This enabled an accurate estimation of the attenuation coefficient for the various leaf materials. Simulations with varying width of the spatial Gaussian distribution showed that the leakage and the penumbra depend very much on this parameter: for instance, for widths of 2 and 4 mm, the interleaf leakage is below 0.3% and 0.75%, respectively. The results for the leakage and the penumbra (4.7 ± 0.5 mm) are in good agreement with the measurements. This study showed that GEANT4 is appropriate for the investigation of the dosimetric properties of a multileaf collimator. In particular, a quantification of the leakage, the penumbra, and the tongue-and-groove effect and an evaluation of the influence of the beam parameters such as the width of the Gaussian distribution was possible.
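
    The 20%-80% penumbra quoted above is extracted from a lateral dose profile by locating the two threshold crossings on the field edge. A minimal sketch with an illustrative edge profile (not the paper's measured MLC data):

```python
def penumbra_20_80(xs_mm, dose):
    """Width between the 20% and 80% crossings of a monotonically falling
    field edge, found by linear interpolation between sample points."""
    dmax = max(dose)
    def crossing(level):
        target = level * dmax
        pairs = zip(zip(xs_mm, dose), zip(xs_mm[1:], dose[1:]))
        for (x0, d0), (x1, d1) in pairs:
            if (d0 - target) * (d1 - target) <= 0:   # sign change: bracketed
                return x0 + (x1 - x0) * (target - d0) / (d1 - d0)
        raise ValueError("level not crossed")
    return abs(crossing(0.2) - crossing(0.8))

# Illustrative edge profile (position in mm, relative dose):
xs = [0, 1, 2, 3, 4, 5, 6, 7, 8]
d  = [1.00, 0.99, 0.95, 0.80, 0.50, 0.20, 0.05, 0.01, 0.00]
print(f"20%-80% penumbra: {penumbra_20_80(xs, d):.2f} mm")
```

The same crossing logic applies whether the profile comes from film, an ion chamber scan, or a scored Monte Carlo dose grid.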

  12. Simulation of a Helical Channel using GEANT4

    SciTech Connect

    Elvira, V. D.; Lebrun, P.; Spentzouris, P.

    2001-02-01

    We present a simulation of a 72 m long cooling channel proposed by V. Balbekov based on the helical cooling concept developed by Ya. Derbenev. LiH wedge absorbers provide the energy loss mechanism and 201 MHz cavities are used for re-acceleration. They are placed inside a main solenoidal field to focus the beam. A helical field with an amplitude of 0.3 T and a period of 1.8 m provides momentum dispersion for emittance exchange. The simulation is performed using GEANT4. The total fractional transmission is 0.85, and the transverse, longitudinal, and 3-D cooling factors are 3.75, 2.27, and 14.61, respectively. Some version of this helical channel could eventually be used to replace the first section of the double flip channel to keep the longitudinal emittance under control and increase transmission. Although this is an interesting option, the technical challenges are still significant.

  13. Positron Production at JLab Simulated Using Geant4

    SciTech Connect

    Kossler, W. J.; Long, S. S.

    2009-09-02

    The results of a Geant4 Monte-Carlo study of the production of slow positrons using a 140 MeV electron beam which might be available at Jefferson Lab are presented. Positrons are produced by pair production from the gamma-rays produced by bremsstrahlung on the target, which is also the stopping medium for the positrons. Positrons which diffuse to the surface of the stopping medium are assumed to be ejected due to a negative work function. Here the target and moderator are combined into one piece. For an osmium target/moderator 3 cm long with transverse dimensions of 1 cm by 1 mm, we obtain a slow positron yield of about 8.5·10¹⁰/(s·mA). If these positrons were remoderated and re-emitted with a 23% probability, we would obtain 2·10¹⁰/(s·mA) in a micro-beam.
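
    The remoderation figure above is just the slow-positron yield scaled by the assumed 23% re-emission probability:

```python
# Remoderation arithmetic from the abstract above.
slow_yield = 8.5e10    # slow positrons / (s*mA) from the osmium target/moderator
remod_eff = 0.23       # assumed remoderation (re-emission) probability
microbeam_yield = slow_yield * remod_eff
print(f"micro-beam yield ~ {microbeam_yield:.1e} / (s*mA)")
```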

  14. Geant4-based radiation hazard assessment for human exploration missions

    NASA Astrophysics Data System (ADS)

    Bernabeu, J.; Casanova, I.

    Human exploration missions in the Solar System will spend most of the time in deep space, without the shielding from the Earth's magnetic field and hence, directly exposed to Galactic Cosmic Rays (GCR) and Solar Particle Events (SPE). Both GCR and SPE fluences have been used to calculate the dose deposited on a water slab phantom using MULASSIS, a program based on the Geant4 Monte Carlo particle transport code. Doses from several extreme SPE events and from a GCR model are calculated for different shielding materials and thicknesses using a planar slab geometry and compared to current dose limits for space operations. Cross-comparison of MULASSIS with HZETRN (a deterministic code) has also been performed for SPE and GCR environments showing an overall reasonable agreement between both codes.

  15. Nuclear spectroscopy with Geant4: Proton and neutron emission & radioactivity

    NASA Astrophysics Data System (ADS)

    Sarmiento, L. G.; Rudolph, D.

    2016-07-01

    With the aid of a novel combination of existing equipment - JYFLTRAP and the TASISpec decay station - it is possible to perform very clean quantum-state selective, high-resolution particle-γ decay spectroscopy. We intend to study the determination of the branching ratio of the ℓ = 9 proton emission from the Iπ = 19/2-, 3174-keV isomer in the N = Z - 1 nucleus 53Co. The study aims to initiate a series of similar experiments along the proton dripline, thereby providing unique insights into "open quantum systems". The technique has been pioneered in case studies using SHIPTRAP and TASISpec at GSI. Newly available radioactive decay modes in Geant4 simulations are going to corroborate the anticipated experimental results.

  16. Simulation loop between cad systems, GEANT-4 and GeoModel: Implementation and results

    NASA Astrophysics Data System (ADS)

    Sharmazanashvili, A.; Tsutskiridze, Niko

    2016-09-01

    Comparative analysis of simulated and as-built detector geometry descriptions is an important field of study for understanding data-vs-Monte-Carlo discrepancies. Consistency and detail of shapes are less important, while the adequacy of the volumes and weights of detector components is essential for tracking. There are two main reasons for faults in simulation geometry descriptions: (1) differences between the simulated and as-built geometry descriptions; (2) internal inaccuracies of geometry transformations introduced by the simulation software infrastructure itself. The Georgian engineering team developed a hub based on the CATIA platform and several tools enabling CATIA to read the different descriptions used by simulation packages: XML->CATIA; VP1->CATIA; GeoModel->CATIA; Geant4->CATIA. As a result it becomes possible to compare the different descriptions with each other using the full power of CATIA and to investigate both classes of geometry-description faults. The paper presents results of case studies of the ATLAS coils and end-cap toroid structures.

  17. Calibration and GEANT4 Simulations of the Phase II Proton Compute Tomography (pCT) Range Stack Detector

    SciTech Connect

    Uzunyan, S. A.; Blazey, G.; Boi, S.; Coutrakon, G.; Dyshkant, A.; Francis, K.; Hedin, D.; Johnson, E.; Kalnins, J.; Zutshi, V.; Ford, R.; Rauch, J. E.; Rubinov, P.; Sellberg, G.; Wilson, P.; Naimuddin, M.

    2015-12-29

    Northern Illinois University in collaboration with Fermi National Accelerator Laboratory (FNAL) and Delhi University has been designing and building a proton CT scanner for applications in proton treatment planning. The Phase II proton CT scanner consists of eight planes of tracking detectors with two X and two Y coordinate measurements both before and after the patient. In addition, a range stack detector consisting of a stack of thin scintillator tiles, arranged in twelve eight-tile frames, is used to determine the water equivalent path length (WEPL) of each track through the patient. The X-Y coordinates and WEPL are required input for image reconstruction software to find the relative (proton) stopping power (RSP) value of each voxel in the patient and generate a corresponding 3D image. In this Note we describe tests conducted in 2015 at the proton beam at the Central DuPage Hospital in Warrenville, IL, focusing on the range stack calibration procedure and comparisons with the GEANT4 range stack simulation.

  18. CASE: Software design technologies

    SciTech Connect

    Kalyanov, G.N.

    1994-05-01

    CASE (Computer-Aided Software Engineering) is a set of methodologies for software design, development, and maintenance supported by a complex of interconnected automation tools. CASE is a set of tools for the programmer, analyst, and developer for the automation of software design and development. Today, CASE has become an independent discipline in software engineering that has given rise to a powerful CASE industry made up of hundreds of firms and companies of various kinds. These include companies that develop tools for software analysis and design and have a wide network of distributors and dealers; firms that develop specialized tools for narrow subject areas or for individual stages of the software life cycle; firms that organize seminars and courses for specialists; consulting firms that demonstrate the practical power of CASE toolkits for specific applications; and companies specializing in the publication of periodicals and bulletins on CASE. The principal purchasers of CASE toolkits abroad are military organizations, data-processing centers, and commercial software developers.

  19. Diffusion-controlled reactions modeling in Geant4-DNA

    SciTech Connect

    Karamitros, M.; Luan, S.; Bernal, M.A.; Allison, J.; Baldacchino, G.; Davidkova, M.; Francis, Z.; Friedland, W.; Ivantchenko, V.; Ivantchenko, A.; Mantero, A.; Nieminem, P.; Santin, G.; Tran, H.N.; Stepan, V.; Incerti, S.

    2014-10-01

    Context: Under irradiation, a biological system undergoes a cascade of chemical reactions that can lead to an alteration of its normal operation. There are different types of radiation and many competing reactions. As a result the kinetics of chemical species is extremely complex. Simulation then becomes a powerful tool which, by describing the basic principles of chemical reactions, can reveal the dynamics of the macroscopic system. To understand the dynamics of biological systems under radiation, there have been ongoing efforts since the 1980s by several research groups to establish a mechanistic model describing all the physical, chemical and biological phenomena following the irradiation of single cells. This approach is generally divided into a succession of stages that follow each other in time: (1) the physical stage, where the ionizing particles interact directly with the biological material; (2) the physico-chemical stage, where the targeted molecules release their energy by dissociating, creating new chemical species; (3) the chemical stage, where the new chemical species interact with each other or with the biomolecules; (4) the biological stage, where the repairing mechanisms of the cell come into play. This article focuses on the modeling of the chemical stage. Method: This article presents a general method of speeding up chemical reaction simulations in fluids based on the Smoluchowski equation and Monte-Carlo methods, where all molecules are explicitly simulated and the solvent is treated as a continuum. The model describes diffusion-controlled reactions. This method has been implemented in Geant4-DNA. The keys to the new algorithm include: (1) the combination of a method to compute time steps dynamically with a Brownian bridge process to account for chemical reactions, which avoids costly fixed time step simulations; (2) a k-d tree data structure for quickly locating, for a given molecule, its closest reactants. The
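
    The algorithm outlined above (dynamic time steps plus nearest-reactant queries) can be sketched in a few lines. This toy version uses brute-force neighbor search in place of the k-d tree, omits the Brownian-bridge correction for in-flight reactions, and uses illustrative constants throughout, so it is a conceptual sketch rather than the Geant4-DNA implementation:

```python
import math, random

random.seed(2)

D = 2.8e-9        # diffusion coefficient, m^2/s (illustrative)
R_REACT = 1.0e-9  # reaction radius, m (illustrative)

def nearest(i, pts):
    """Closest reactant to pts[i], brute force; Geant4-DNA uses a k-d tree here."""
    return min((math.dist(pts[i], q), j) for j, q in enumerate(pts) if j != i)

def bd_step(pts):
    """One Brownian-dynamics step with a dynamically chosen dt: the step
    shrinks when two reactants are close, then each molecule diffuses and
    any pair ending up within the reaction radius reacts (is removed)."""
    dmin = min(nearest(i, pts)[0] for i in range(len(pts)))
    dt = max((dmin / 4.0) ** 2 / (6.0 * D), 1e-13)   # heuristic time-step rule
    sigma = math.sqrt(2.0 * D * dt)                  # 1-D rms displacement
    moved = [tuple(c + random.gauss(0.0, sigma) for c in p) for p in pts]
    dead = set()
    for i in range(len(moved)):
        d, j = nearest(i, moved)
        if d < R_REACT and i not in dead and j not in dead:
            dead |= {i, j}                           # diffusion-controlled reaction
    return [p for i, p in enumerate(moved) if i not in dead], dt

molecules = [tuple(random.uniform(0.0, 3e-8) for _ in range(3)) for _ in range(20)]
t = 0.0
for _ in range(200):
    if len(molecules) < 2:
        break
    molecules, dt = bd_step(molecules)
    t += dt
print(f"{len(molecules)} molecules left after {t:.2e} s")
```

The pay-off of the dynamic time step is visible even in this sketch: dt grows automatically once the surviving molecules are far apart, avoiding the cost of a fixed fine-grained step.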

  20. Diffusion-controlled reactions modeling in Geant4-DNA

    NASA Astrophysics Data System (ADS)

    Karamitros, M.; Luan, S.; Bernal, M. A.; Allison, J.; Baldacchino, G.; Davidkova, M.; Francis, Z.; Friedland, W.; Ivantchenko, V.; Ivantchenko, A.; Mantero, A.; Nieminen, P.; Santin, G.; Tran, H. N.; Stepan, V.; Incerti, S.

    2014-10-01

    Context Under irradiation, a biological system undergoes a cascade of chemical reactions that can alter its normal operation. There are different types of radiation and many competing reactions, so the kinetics of the chemical species is extremely complex. Simulation then becomes a powerful tool which, by describing the basic principles of chemical reactions, can reveal the dynamics of the macroscopic system. To understand the dynamics of biological systems under radiation, several research groups have, since the 1980s, pursued a mechanistic model that describes all the physical, chemical and biological phenomena following the irradiation of single cells. This approach is generally divided into a succession of stages that follow each other in time: (1) the physical stage, where the ionizing particles interact directly with the biological material; (2) the physico-chemical stage, where the targeted molecules release their energy by dissociating, creating new chemical species; (3) the chemical stage, where the new chemical species interact with each other or with the biomolecules; (4) the biological stage, where the repair mechanisms of the cell come into play. This article focuses on the modeling of the chemical stage. Method This article presents a general method for speeding up chemical reaction simulations in fluids, based on the Smoluchowski equation and Monte Carlo methods, in which all molecules are explicitly simulated and the solvent is treated as a continuum. The model describes diffusion-controlled reactions. This method has been implemented in Geant4-DNA. The keys to the new algorithm include: (1) the combination of a method to compute time steps dynamically with a Brownian bridge process to account for chemical reactions, which avoids costly fixed-time-step simulations; (2) a k-d tree data structure for quickly locating, for a given molecule, its closest reactants. The
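
    The Brownian-bridge test mentioned in point (1) can be sketched as follows. This is an illustrative Python reconstruction of the standard Smoluchowski-model result, not the actual Geant4-DNA (C++) implementation; the function names and the 3-sigma safety factor in the step-size bound are assumptions.

```python
import math

def bridge_reaction_probability(d_initial, d_final, D_rel, dt, R):
    """Probability that two diffusing molecules, at separations d_initial
    (start of step) and d_final (end of step), crossed the reaction radius R
    during a time step dt, given their relative diffusion coefficient D_rel
    (Brownian-bridge result for the Smoluchowski diffusion model)."""
    if d_initial <= R or d_final <= R:
        return 1.0  # already inside the encounter radius
    return math.exp(-(d_initial - R) * (d_final - R) / (D_rel * dt))

def max_time_step(d_nearest, R, D_rel, k=3.0):
    """Dynamic step bound: largest dt for which the k-sigma relative
    displacement sqrt(6*D_rel*dt) stays below the gap to the nearest
    reactant, so most steps need no reaction check at all."""
    gap = max(d_nearest - R, 0.0)
    return gap ** 2 / (6.0 * D_rel * k ** 2)
```

    In a full simulation, the nearest-reactant distance fed to `max_time_step` would come from a k-d tree query, as the abstract describes.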

  1. Aviation Design Software

    NASA Technical Reports Server (NTRS)

    1997-01-01

    DARcorporation developed a General Aviation CAD package through a Small Business Innovation Research contract from Langley Research Center. This affordable, user-friendly preliminary design system for General Aviation aircraft runs on the popular 486 IBM-compatible personal computers. Individuals taking the home-built approach, small manufacturers of General Aviation airplanes, as well as students and others interested in the analysis and design of aircraft are possible users of the package. The software can cut design and development time in half.

  2. GATE - Geant4 Application for Tomographic Emission: a simulation toolkit for PET and SPECT

    PubMed Central

    Jan, S.; Santin, G.; Strul, D.; Staelens, S.; Assié, K.; Autret, D.; Avner, S.; Barbier, R.; Bardiès, M.; Bloomfield, P. M.; Brasse, D.; Breton, V.; Bruyndonckx, P.; Buvat, I.; Chatziioannou, A. F.; Choi, Y.; Chung, Y. H.; Comtat, C.; Donnarieix, D.; Ferrer, L.; Glick, S. J.; Groiselle, C. J.; Guez, D.; Honore, P.-F.; Kerhoas-Cavata, S.; Kirov, A. S.; Kohli, V.; Koole, M.; Krieguer, M.; van der Laan, D. J.; Lamare, F.; Largeron, G.; Lartizien, C.; Lazaro, D.; Maas, M. C.; Maigne, L.; Mayet, F.; Melot, F.; Merheb, C.; Pennacchio, E.; Perez, J.; Pietrzyk, U.; Rannou, F. R.; Rey, M.; Schaart, D. R.; Schmidtlein, C. R.; Simon, L.; Song, T. Y.; Vieira, J.-M.; Visvikis, D.; Van de Walle, R.; Wieërs, E.; Morel, C.

    2012-01-01

    Monte Carlo simulation is an essential tool in emission tomography that can assist in the design of new medical imaging devices, the optimization of acquisition protocols, and the development or assessment of image reconstruction algorithms and correction techniques. GATE, the Geant4 Application for Tomographic Emission, encapsulates the Geant4 libraries to achieve a modular, versatile, scripted simulation toolkit adapted to the field of nuclear medicine. In particular, GATE allows the description of time-dependent phenomena such as source or detector movement and source decay kinetics. This feature makes it possible to simulate time curves under realistic acquisition conditions and to test dynamic reconstruction algorithms. This paper gives a detailed description of the design and development of GATE by the OpenGATE collaboration, whose continuing objective is to improve, document, and validate GATE by simulating commercially available imaging systems for PET and SPECT. Considerable effort is also invested in the ability and flexibility to model novel detection systems or systems still under design. A public release of GATE, licensed under the GNU Lesser General Public License, can be downloaded at http://www-lphe.ep.ch/GATE/. Two benchmarks, developed for PET and SPECT to test the installation of GATE and to serve as a tutorial for users, are presented. Extensive validation of the GATE simulation platform has been started, comparing simulations and measurements on commercially available acquisition systems; references to those results are listed. The future prospects toward the gridification of GATE and its extension to other domains such as dosimetry are also discussed. PMID:15552416

  3. Thermal neutron response of a boron-coated GEM detector via GEANT4 Monte Carlo code.

    PubMed

    Jamil, M; Rhee, J T; Kim, H G; Ahmad, Farzana; Jeon, Y J

    2014-10-22

    In this work, we report the design configuration and performance of a hybrid Gas Electron Multiplier (GEM) detector. To make the detector sensitive to thermal neutrons, the forward electrode of the GEM has been coated with enriched boron-10, which acts as a neutron converter. A 5×5 cm² GEM configuration has been used for the thermal neutron studies. The response of the detector has been estimated using the GEANT4 MC code with two different physics lists. With the QGSP_BIC_HP physics list, the neutron detection efficiency was determined to be about 3%, while with the QGSP_BERT_HP physics list the efficiency was around 2.5%, at an incident thermal neutron energy of 25 meV. These responses demonstrate that coating the GEM with a boron converter improves its efficiency for thermal neutron detection.
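
    As a rough cross-check of the few-percent efficiencies reported, the thermal neutron capture probability in a thin boron-10 layer follows from simple exponential attenuation. The sketch below is not the GEANT4 model: the coating density and thickness are assumed values, and the actual detection efficiency is lower because the alpha/7Li reaction products must also escape the coating into the gas.

```python
import math

N_A = 6.022e23                  # Avogadro's number, 1/mol
SIGMA_B10_THERMAL = 3840e-24    # 10B(n,alpha) cross-section at 25 meV, cm^2
RHO_B10 = 2.3                   # assumed coating density, g/cm^3
A_B10 = 10.0                    # molar mass of 10B, g/mol

def capture_probability(thickness_um):
    """Fraction of normally incident thermal neutrons captured in a
    boron-10 layer of the given thickness (exponential attenuation,
    ignoring reaction-product escape)."""
    n = RHO_B10 * N_A / A_B10    # atom number density, 1/cm^3
    t_cm = thickness_um * 1e-4
    return 1.0 - math.exp(-n * SIGMA_B10_THERMAL * t_cm)
```

    For a coating on the order of a micrometre, this gives a capture probability of a few percent, consistent in magnitude with the simulated efficiencies quoted above.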

  4. Track structure modeling in liquid water: A review of the Geant4-DNA very low energy extension of the Geant4 Monte Carlo simulation toolkit.

    PubMed

    Bernal, M A; Bordage, M C; Brown, J M C; Davídková, M; Delage, E; El Bitar, Z; Enger, S A; Francis, Z; Guatelli, S; Ivanchenko, V N; Karamitros, M; Kyriakou, I; Maigne, L; Meylan, S; Murakami, K; Okada, S; Payno, H; Perrot, Y; Petrovic, I; Pham, Q T; Ristic-Fira, A; Sasaki, T; Štěpán, V; Tran, H N; Villagrasa, C; Incerti, S

    2015-12-01

    Understanding the fundamental mechanisms involved in the induction of biological damage by ionizing radiation remains a major challenge of today's radiobiology research. The Monte Carlo simulation of the physical, physicochemical and chemical processes involved may provide a powerful tool for the simulation of early damage induction. The Geant4-DNA extension of the general-purpose Monte Carlo Geant4 simulation toolkit aims to provide the scientific community with an open-source platform for the mechanistic simulation of such early damage. This paper presents the most recent review of the Geant4-DNA extension, as available to Geant4 users since June 2015 (release 10.2 Beta). In particular, the review includes new physical models describing electron elastic and inelastic interactions in liquid water, as well as new examples dedicated to the simulation of the physicochemical and chemical stages of water radiolysis. Several implementations of geometrical models of biological targets are presented as well, and the list of Geant4-DNA examples is described.

  5. GEANT4 physics evaluation with testbeam data of the ATLAS hadronic end-cap calorimeter

    NASA Astrophysics Data System (ADS)

    Kiryunin, A. E.; Oberlack, H.; Salihagić, D.; Schacht, P.; Strizenec, P.

    2006-05-01

    We evaluate the validity of the GEANT4 electromagnetic and hadronic physics models by comparing experimental data from beam tests of modules of the ATLAS hadronic end-cap calorimeter with GEANT4-based simulations. Two physics lists (LHEP and QGSP) for the simulation of hadronic showers are evaluated. Calorimeter performance parameters like the energy resolution and shapes of showers are studied both for electrons and charged pions. Furthermore, throughout the paper we compare GEANT4 and the corresponding predictions of GEANT3 used with the G-CALOR code for hadronic shower development.

  6. Recent improvements on the description of hadronic interactions in Geant4

    NASA Astrophysics Data System (ADS)

    Dotti, A.; Apostolakis, J.; Folger, G.; Grichine, V.; Ivanchenko, V.; Kosov, M.; Ribon, A.; Uzhinsky, V.; Wright, D. H.

    2011-04-01

    We present an overview of recent improvements of hadronic models in Geant4 for the physics configurations (Physics Lists) relevant to applications in high energy experiments. During the last year, the improvements have concentrated on the study of unphysical discontinuities in calorimeter observables in the transition regions between the models used in Physics Lists. The microscopic origins of these have been investigated, and possible improvements of the Geant4 code are currently under validation. In this paper we discuss the status of the latest version of Geant4, with emphasis on the most promising new developments, namely the Fritiof-based and CHIPS Physics Lists.

  7. Recent Improvements on the Description of Hadronic Interactions in Geant4

    SciTech Connect

    Dotti, A.; Apostolakis, J.; Folger, G.; Grichine, V.; Ivanchenko, V.; Kosov, M.; Ribon, A.; Uzhinsky, V.; Wright, D.H.; /SLAC

    2012-06-07

    We present an overview of recent improvements of hadronic models in Geant4 for the physics configurations (Physics Lists) relevant to applications in high energy experiments. During the last year, the improvements have concentrated on the study of unphysical discontinuities in calorimeter observables in the transition regions between the models used in Physics Lists. The microscopic origins of these have been investigated, and possible improvements of the Geant4 code are currently under validation. In this paper we discuss the status of the latest version of Geant4, with emphasis on the most promising new developments, namely the Fritiof-based and CHIPS Physics Lists.

  8. Monte Carlo simulation by GEANT 4 and GESPECOR of in situ gamma-ray spectrometry measurements.

    PubMed

    Chirosca, Alecsandru; Suvaila, Rares; Sima, Octavian

    2013-11-01

    The application of the GEANT 4 and GESPECOR Monte Carlo simulation codes for efficiency calibration of in situ gamma-ray spectrometry was studied. The long computing time required by GEANT 4 prevents its use in routine simulations. Due to the application of variance reduction techniques, GESPECOR is much faster. In this code, specific procedures for incorporating the depth profile of the activity were implemented. In addition, procedures for evaluating the effect of non-homogeneity of the source were developed. The code was validated by comparison with test simulations carried out with GEANT 4 and by comparison with published results. PMID:23566809

  9. Experimental quantification of Geant4 PhysicsList recommendations: methods and results

    NASA Astrophysics Data System (ADS)

    Basaglia, Tullio; Han, Min Cheol; Hoff, Gabriela; Kim, Chan Hyeong; Kim, Sung Hun; Grazia Pia, Maria; Saracco, Paolo

    2015-12-01

    The Geant4 physics_lists package encompasses predefined selections of physics processes and models to be used in simulation applications. Limited documentation is available in the literature about Geant4 pre-packaged PhysicsLists and their validation; the reports in the literature mainly concern specific use cases. This paper documents the epistemological grounds for the validation of Geant4 pre-packaged PhysicsLists (and their accessory classes, Builders and PhysicsConstructors) and some examples of the authors' scientific activity on this subject.

  10. Balloon Design Software

    NASA Technical Reports Server (NTRS)

    Farley, Rodger

    2007-01-01

    PlanetaryBalloon Version 5.0 is a software package for the design of meridionally lobed planetary balloons. It operates in a Windows environment, and programming was done in Visual Basic 6. By including the effects of circular lobes with load tapes, skin mass, hoop and meridional stress, and elasticity in the structural elements, a more accurate balloon shape of practical construction can be determined, as well as the room-temperature cut pattern for the gore shapes. The computer algorithm is formulated for sizing meridionally lobed balloons for any generalized atmosphere or planet. This also covers zero-pressure, over-pressure, and super-pressure balloons. Low circumferential loads with meridionally reinforced load tapes will produce shapes close to what is known as the "natural shape." The software allows for the design of constant-angle, constant-radius, or constant-hoop-stress balloons. It uses the desired payload capacity for given atmospheric conditions and determines the required volume, allowing users to design exactly to their requirements. The PlanetaryBalloon software has a comprehensive user manual that covers features including buoyancy and super-pressure, convenient design equations, shape formulation, and orthotropic stress/strain.
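
    The core sizing relation, required volume from payload mass and the density difference between the atmosphere and the lift gas, can be illustrated with a minimal sketch. This is not PlanetaryBalloon code: skin mass and lobe geometry are neglected (a zero-pressure approximation), and the default molar masses are assumptions for Earth air and helium.

```python
R_UNIV = 8.314  # universal gas constant, J/(mol K)

def gas_density(pressure_pa, temp_k, molar_mass_kg):
    """Ideal-gas density: rho = P * M / (R * T)."""
    return pressure_pa * molar_mass_kg / (R_UNIV * temp_k)

def required_volume(payload_kg, pressure_pa, temp_k,
                    m_atm=0.02896, m_gas=0.004):
    """Balloon volume (m^3) needed to float payload_kg at the given
    ambient conditions, assuming the lift gas is at ambient pressure
    and temperature and neglecting the balloon skin mass."""
    rho_atm = gas_density(pressure_pa, temp_k, m_atm)
    rho_gas = gas_density(pressure_pa, temp_k, m_gas)
    return payload_kg / (rho_atm - rho_gas)
```

    Generalizing to "any atmosphere or planet", as the abstract puts it, amounts to supplying the local pressure, temperature, molar masses, and gravity.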

  11. Geant4 hadronic physics validation with ATLAS Tile Calorimeter test-beam data

    NASA Astrophysics Data System (ADS)

    Alexa, C.; Constantinescu, S.; Diţă, S.

    2006-10-01

    We present comparison studies between Geant4 shower packages and ATLAS Tile Calorimeter test-beam data collected at CERN in the H8 beam line at the SPS. Emphasis is put on hadronic physics lists and on data concerning differences between the Tilecal response to pions and protons of the same energy. The ratio of the pure hadronic fraction of pions to that of protons, Fhπ/Fhp, was estimated with Tilecal test-beam data and compared with Geant4 simulations.

  12. GEANT4 physics evaluation with testbeam data of the ATLAS hadronic end-cap calorimeter

    NASA Astrophysics Data System (ADS)

    Kiryunin, A. E.; Oberlack, H.; Salihagić, D.; Schacht, P.; Strizenec, P.

    2009-04-01

    The validation of GEANT4 physics models is done by comparing experimental data from beam tests of modules of the ATLAS hadronic end-cap calorimeter with GEANT4-based simulations. Various physics lists for the simulation of hadronic showers are evaluated. We present results of studies of the calorimeter performance parameters (such as energy resolution and shower shapes), as well as results of investigations of the influence of Birks' law and of cuts on the development time of hadronic showers.

  13. Physical models implemented in the GEANT4-DNA extension of the GEANT-4 toolkit for calculating initial radiation damage at the molecular level.

    PubMed

    Villagrasa, C; Francis, Z; Incerti, S

    2011-02-01

    The ROSIRIS project aims to study the radiobiology of integrated systems for medical treatment optimisation using ionising radiations and evaluate the associated risk. In the framework of this project, one research focus is the interpretation of the initial radio-induced damage in DNA created by ionising radiation (and detected by γ-H2AX foci analysis) from the track structure of the incident particles. In order to calculate the track structure of ionising particles at a nanometric level, the Geant4 Monte Carlo toolkit was used. Geant4 (Object Oriented Programming Architecture in C++) offers a common platform, available free to all users and relatively easy to use. Nevertheless, the current low-energy threshold for electromagnetic processes in GEANT4 is set to 1 keV (250 eV using the Livermore processes), which is an unsuitable value for nanometric applications. To lower this energy threshold, the necessary interaction processes and models were identified, and the corresponding available cross sections collected from the literature. They are mostly based on the plane-wave Born approximation (first Born approximation, or FBA) for inelastic interactions and on semi-empirical models for energies where the FBA fails (at low energies). In this paper, the extensions that have been introduced into the 9.3 release of the Geant4 toolkit are described, the so-called Geant4-DNA extension, including a set of processes and models adapted in this study and permitting the simulation of electron (8 eV-1 MeV), proton (100 eV-100 MeV) and alpha particle (1 keV-10 MeV) interactions in liquid water. PMID:21186212

  14. Software-Design-Analyzer System

    NASA Technical Reports Server (NTRS)

    Tausworthe, Robert C.

    1991-01-01

    CRISP-90 software-design-analyzer system, update of CRISP-80, is set of computer programs constituting software tool for design and documentation of other software and supporting top-down, hierarchical, modular, structured methodologies for design and programming. Written in Microsoft QuickBasic.

  15. Validation of a small-animal PET simulation using GAMOS: a GEANT4-based framework.

    PubMed

    Cañadas, M; Arce, P; Rato Mendes, P

    2011-01-01

    Monte Carlo-based modelling is a powerful tool to help in the design and optimization of positron emission tomography (PET) systems. The performance of these systems depends on several parameters, such as detector physical characteristics, shielding or electronics, whose effects can be studied on the basis of realistic simulated data. The aim of this paper is to validate a comprehensive study of the Raytest ClearPET small-animal PET scanner using a new Monte Carlo simulation platform developed at CIEMAT (Madrid, Spain), called GAMOS (GEANT4-based Architecture for Medicine-Oriented Simulations). This toolkit, based on the GEANT4 code, was originally designed to cover multiple applications in the field of medical physics, from radiotherapy to nuclear medicine, but has since been applied by some of its users in other fields of physics, such as neutron shielding, space physics, high energy physics, etc. Our simulation model includes the relevant characteristics of the ClearPET system, namely the double layer of scintillator crystals in phoswich configuration, the rotating gantry, the presence of intrinsic radioactivity in the crystals, and the storage of single events for off-line coincidence sorting. Simulated results are contrasted with experimental acquisitions, including studies of spatial resolution, sensitivity, scatter fraction and count rates in accordance with the National Electrical Manufacturers Association (NEMA) NU 4-2008 protocol. Spatial resolution results showed a discrepancy between simulated and measured values of 8.4% (with a maximum FWHM difference over all measurement directions of 0.5 mm). Sensitivity results differ by less than 1% for a 250-750 keV energy window. Simulated and measured count rates agree well within a wide range of activities, including under electronic saturation of the system (the measured peak of total coincidences, for the mouse-sized phantom, was 250.8 kcps reached at 0.95 MBq mL(-1) and the simulated peak

  16. Apply Design Patterns to Refactor Software Design

    NASA Technical Reports Server (NTRS)

    Baggs, Rhoda; Shaykhian, Gholam Ali

    2007-01-01

    Refactoring software design is a method of changing a software design while explicitly preserving its functionality. The presented approach utilizes design patterns as the basis for refactoring software design. A comparison of design solutions is made through C++ programming language examples to illustrate this approach. The development of reusable components is also discussed; the paper shows that constructing such components can diminish the added burden of both refactoring and the use of design patterns.

  17. Use of GEANT4 vs. MCNPX for the characterization of a boron-lined neutron detector

    NASA Astrophysics Data System (ADS)

    van der Ende, B. M.; Atanackovic, J.; Erlandson, A.; Bentoumi, G.

    2016-06-01

    This work compares GEANT4 with MCNPX in the characterization of a boron-lined neutron detector. The neutron energy range simulated in this work (0.025 eV to 20 MeV) is the traditional domain of MCNP simulations. This paper addresses the question: how well can GEANT4 and MCNPX be employed for detailed thermal neutron detector characterization? To answer this, GEANT4 and MCNPX have been employed to simulate the detector response to a 252Cf energy spectrum point source, as well as to simulate mono-energetic parallel-beam source geometries. The 252Cf energy spectrum simulation results demonstrate agreement in detector count rate within 3% between the two packages, with the MCNPX results being generally closer to experiment than those from GEANT4. The mono-energetic source simulations demonstrate agreement in detector response within 5% between the two packages for all neutron energies, and within 1% for neutron energies between 100 eV and 5 MeV. Cross-checks between the two types of simulations using ISO-8529 252Cf energy bins demonstrate that MCNPX results are more self-consistent than GEANT4 results, by 3-4%.

  18. Geant4 simulation of the CERN-EU high-energy reference field (CERF) facility.

    PubMed

    Prokopovich, D A; Reinhard, M I; Cornelius, I M; Rosenfeld, A B

    2010-09-01

    The CERN-EU high-energy reference field facility is used for testing and calibrating both active and passive radiation dosemeters for radiation protection applications in space and aviation. Through a combination of a primary particle beam, a target and a suitably designed shielding configuration, the facility is able to reproduce the neutron component of the high-altitude radiation field relevant to the jet aviation industry. Simulations of the facility using the GEANT4 (GEometry ANd Tracking) toolkit provide an improved understanding of the neutron particle fluence, as well as the particle fluence of the other radiation components present. The secondary particle fluence as a function of the primary particle fluence incident on the target, and the associated dose equivalent rates, were determined at the 20 designated irradiation positions available at the facility. Comparisons of the simulated results with previously published simulations obtained using the FLUKA Monte Carlo code, as well as with experimental results of the neutron fluence obtained with a Bonner sphere spectrometer, are made.

  19. Validation of Geant4 physics models for 56Fe ion beam in various media

    NASA Astrophysics Data System (ADS)

    Jalota, Summit; Kumar, Ashavani

    2012-11-01

    The depth-dose distribution of a 56Fe ion beam has been studied in water, polyethylene, nextel, kevlar and aluminum media. The dose reduction versus areal depth is also calculated for 56Fe ions in carbon, polyethylene and aluminum using the Monte Carlo simulation toolkit Geant4. This study presents the validation of physics models available in Geant4 by comparing the simulated results with the experimental data available in the literature. Simulations are performed using the binary cascade (BIC), abrasion-ablation (AA) and quantum molecular dynamics (QMD) models integrated into Geant4. Deviations from experimental results may be due to the simple geometry chosen. This paper also addresses the differences in the simulated results from the various models.

  20. Photon energy absorption coefficients for nuclear track detectors using Geant4 Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Singh, Vishwanath P.; Medhat, M. E.; Badiger, N. M.

    2015-01-01

    Geant4 Monte Carlo code simulations were used to resolve experimental and theoretical complications in the calculation of mass energy-absorption coefficients of elements, air, and compounds. The mass energy-absorption coefficients for nuclear track detectors were computed for the first time using the Geant4 Monte Carlo code for energies of 1 keV-20 MeV. Very good agreement of the simulated mass energy-absorption coefficients for carbon, nitrogen, silicon, sodium iodide and nuclear track detectors was observed on comparison with the values reported in the literature. Kerma relative to air for energies of 1 keV-20 MeV, and energy-absorption buildup factors for energies of 50 keV-10 MeV up to 10 mfp penetration depths, were also calculated for the selected nuclear track detectors to evaluate the absorption of gamma photons. Geant4 simulation can be utilized for the estimation of mass energy-absorption coefficients in elements and composite materials.
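
    Compound coefficients like those reported here are conventionally obtained from elemental values by the Bragg additivity (mixture) rule: the compound coefficient is the weight-fraction-weighted sum of the elemental ones. A minimal sketch, using illustrative placeholder coefficients rather than tabulated data:

```python
# Hypothetical elemental (mu_en/rho) values at a single photon energy,
# in cm^2/g; real values would come from tabulations such as NIST XCOM.
MU_EN_RHO = {"H": 0.0555, "C": 0.0267, "O": 0.0297}

def compound_mu_en_rho(weight_fractions):
    """Bragg additivity rule: sum of elemental coefficients weighted by
    the element's mass fraction in the compound."""
    return sum(w * MU_EN_RHO[el] for el, w in weight_fractions.items())

# Approximate mass fractions for CR-39 (C12H18O7), a common nuclear
# track detector material.
cr39 = {"C": 0.5256, "H": 0.0663, "O": 0.4081}
```

    The same weighting applies at each photon energy, so a full coefficient curve is just this sum evaluated over the tabulated energy grid.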

  1. Dose conversion coefficients for ICRP110 voxel phantom in the Geant4 Monte Carlo code

    NASA Astrophysics Data System (ADS)

    Martins, M. C.; Cordeiro, T. P. V.; Silva, A. X.; Souza-Santos, D.; Queiroz-Filho, P. P.; Hunt, J. G.

    2014-02-01

    The reference adult male voxel phantom recommended by International Commission on Radiological Protection (ICRP) Publication 110 was implemented in the Geant4 Monte Carlo code. Geant4 was used to calculate Dose Conversion Coefficients (DCCs), expressed as dose deposited in organs per air kerma, for the photon, electron and neutron fields considered in the Annals of the ICRP. In this work the AP and PA irradiation geometries of the ICRP male phantom were simulated for the purpose of benchmarking the Geant4 code. Monoenergetic photons were simulated between 15 keV and 10 MeV, and the results were compared with ICRP 110, the VMC Monte Carlo code and the available literature data, showing good agreement.

  2. Modeling the tagged-neutron UXO identification technique using the Geant4 toolkit

    SciTech Connect

    Zhou Y.; Mitra S.; Zhu X.; Wang Y.

    2011-10-16

    It is proposed to use 14 MeV neutrons, tagged by the associated particle neutron time-of-flight technique (APnTOF), to identify the fillers of unexploded ordnance (UXO) by characterizing their carbon, nitrogen and oxygen contents. To facilitate the design and construction of a prototype system, a preliminary simulation model was developed using the Geant4 toolkit. This work established the toolkit environment for (a) generating tagged neutrons, (b) their transport and interactions within a sample to induce emission and detection of characteristic gamma-rays, and (c) 2D and 3D image reconstruction of the interrogated object using the neutron and gamma-ray time-of-flight information. Using this model, the article demonstrates the novelty of the tagged-neutron approach for extracting useful signals, with high signal-to-background discrimination of an object of interest from its environment. Simulations indicated that a UXO filled with the RDX explosive hexogen (C3H6O6N6) can be identified to a depth of 20 cm when buried in soil.
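
    The depth sensitivity quoted above rests on time-of-flight: the tagged alpha starts the clock, the neutron travels out at its (relativistic) speed, and the characteristic gamma returns at c. A minimal sketch under a straight-line assumption; the function names are illustrative, not from the article's model.

```python
import math

M_N = 939.565   # neutron rest mass, MeV/c^2
C = 29.9792     # speed of light, cm/ns

def neutron_speed(kinetic_mev):
    """Relativistic neutron speed in cm/ns for the given kinetic energy."""
    gamma = 1.0 + kinetic_mev / M_N
    beta = math.sqrt(1.0 - 1.0 / gamma ** 2)
    return beta * C

def interaction_depth(total_tof_ns, kinetic_mev=14.0):
    """Depth (cm) at which a tagged neutron interacted, from the time
    between the tagged alpha and the detected gamma: outbound neutron
    at v_n, returning gamma at c (straight-line approximation)."""
    v_n = neutron_speed(kinetic_mev)
    return total_tof_ns / (1.0 / v_n + 1.0 / C)
```

    A 14 MeV neutron moves at roughly 5 cm/ns, so a 20 cm burial depth corresponds to a round-trip time of only a few nanoseconds, which is why nanosecond-scale timing resolution drives the achievable depth discrimination.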

  3. Modeling the relativistic runaway electron avalanche and the feedback mechanism with GEANT4

    PubMed Central

    Skeltved, Alexander Broberg; Østgaard, Nikolai; Carlson, Brant; Gjesteland, Thomas; Celestin, Sebastien

    2014-01-01

    This paper presents the first study that uses the GEometry ANd Tracking 4 (GEANT4) toolkit to make quantitative comparisons with other modeling results related to the production of terrestrial gamma-ray flashes and high-energy particle emission from thunderstorms. We study the relativistic runaway electron avalanche (RREA) and the relativistic feedback process, as well as the production of bremsstrahlung photons from runaway electrons. The Monte Carlo simulations take into account the effects of electron ionization, electron-electron (Møller) and electron-positron (Bhabha) scattering, as well as the bremsstrahlung process and pair production, in the 250 eV to 100 GeV energy range. Our results indicate that the multiplication of electrons during the development of RREAs, and under the influence of feedback, is consistent with previous estimates. This is important to validate GEANT4 as a tool to model RREAs and feedback in homogeneous electric fields. We also determine the ratio of bremsstrahlung photons to energetic electrons Nγ/Ne. We then show that the ratio has a dependence on the electric field, which can be expressed by the avalanche time τ(E) and the bremsstrahlung coefficient α(ε). In addition, we present comparisons of GEANT4 simulations performed with a “standard” and a “low-energy” physics list, both validated in the 1 keV to 100 GeV energy range. This comparison shows that the choice of physics list used in GEANT4 simulations has a significant effect on the results. Key Points: (1) testing the feedback mechanism with GEANT4; (2) validating the GEANT4 programming toolkit; (3) studying the ratio of bremsstrahlung photons to electrons at TGF source altitude. PMID:26167437

  4. Development of a Geant4 based Monte Carlo Algorithm to evaluate the MONACO VMAT treatment accuracy.

    PubMed

    Fleckenstein, Jens; Jahnke, Lennart; Lohr, Frank; Wenz, Frederik; Hesser, Jürgen

    2013-02-01

    A method to evaluate the dosimetric accuracy of volumetric modulated arc therapy (VMAT) treatment plans, generated with the MONACO™ (version 3.0) treatment planning system, in realistic CT data with an independent Geant4-based dose calculation algorithm is presented. To this end, a model of an Elekta Synergy linear accelerator treatment head with an MLCi2 multileaf collimator was implemented in Geant4. The time-dependent linear accelerator components were modeled by importing either log files of an actual plan delivery or a DICOM-RT plan sequence. Absolute dose calibration, depending on a reference measurement, was applied. The MONACO as well as the Geant4 treatment head model was commissioned with lateral profiles and depth-dose curves of square fields in water, and with film measurements in inhomogeneous phantoms. A VMAT treatment plan for a patient with a thoracic tumor and a VMAT treatment plan of a patient who received treatment in the thoracic spine region, including metallic implants, were used for evaluation. For both MONACO and Geant4, depth-dose curves and lateral profiles of square fields had a mean local gamma (2%, 2 mm) agreement of more than 95% for all fields. Film measurements in inhomogeneous phantoms with a global gamma of (3%, 3 mm) showed a pass rate above 95% in all voxels receiving more than 25% of the maximum dose. A dose-volume histogram comparison of the VMAT patient treatment plans showed mean deviations between Geant4 and MONACO of -0.2% (first patient) and 2.0% (second patient) for the PTVs, and (0.5±1.0)% and (1.4±1.1)% for the organs at risk, in relation to the prescription dose. The presented method can be used to validate VMAT dose distributions generated by a large number of small segments in regions with high electron density gradients. The MONACO dose distributions showed good agreement with Geant4 and film measurements within the simulation and measurement errors.
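
    The gamma (dose-difference, distance-to-agreement) comparison used throughout can be sketched in one dimension. This is a simplified global-normalization version for illustration, not the evaluation code used in the study.

```python
import math

def gamma_pass_rate(ref, eval_, spacing_mm, dose_tol=0.02, dta_mm=2.0):
    """1D global gamma analysis: for each reference point, find the
    minimum gamma over all evaluated points, where gamma combines the
    dose difference (normalized to dose_tol * max reference dose) and
    the spatial offset (normalized to dta_mm).  Returns the fraction of
    reference points with gamma <= 1."""
    d_max = max(ref)
    passed = 0
    for i, d_ref in enumerate(ref):
        best = float("inf")
        for j, d_ev in enumerate(eval_):
            dd = (d_ev - d_ref) / (dose_tol * d_max)
            dx = (j - i) * spacing_mm / dta_mm
            best = min(best, math.hypot(dd, dx))
        if best <= 1.0:
            passed += 1
    return passed / len(ref)
```

    Real implementations work on 2D films or 3D dose grids, interpolate between grid points, and often restrict the analysis to voxels above a low-dose threshold (25% of maximum in the study above), but the per-point minimization is the same idea.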

  5. Applying Software Design Methodology to Instructional Design

    ERIC Educational Resources Information Center

    East, J. Philip

    2004-01-01

    The premise of this paper is that computer science has much to offer the endeavor of instructional improvement. Software design processes employed in computer science for developing software can be used for planning instruction and should improve instruction in much the same manner that design processes appear to have improved software. Techniques…

  6. DagSolid: a new Geant4 solid class for fast simulation in polygon-mesh geometry.

    PubMed

    Han, Min Cheol; Kim, Chan Hyeong; Jeong, Jong Hwi; Yeom, Yeon Soo; Kim, SungHoon; Wilson, Paul P H; Apostolakis, John

    2013-07-01

    Even though a computer-aided design (CAD)-based geometry can be directly implemented in Geant4 as polygon-mesh using the G4TessellatedSolid class, the computation speed becomes very slow, especially when the geometry is composed of a large number of facets. To address this problem, in the present study, a new Geant4 solid class, named DagSolid, was developed based on the direct accelerated geometry for Monte Carlo (DAGMC) library, which provides ray-tracing acceleration functions. To develop the DagSolid class, the new solid class was derived from the G4VSolid class, and its ray-tracing functions were linked to the corresponding functions of the DAGMC library. The results of this study show that the use of the DagSolid class drastically improves the computation speed. The improvement was more significant when there were more facets, meaning that the DagSolid class can be used more effectively for complicated geometries with many facets than for simple geometries. The maximum speed improvement was a factor of 1562 for Geantino and 680 for ChargedGeantino. For real particles (gammas, electrons, neutrons, and protons), the improvement was less pronounced but still ranged from 53 to 685 times, depending on the type of beam particles simulated. PMID:23771063
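
    The ray-tracing functions that DagSolid delegates to DAGMC ultimately rest on fast ray-facet intersection tests. As an illustration of that primitive only (not DAGMC's actual implementation, which organizes facets in a hierarchical acceleration structure), here is the standard Möller-Trumbore ray-triangle test:

```python
import numpy as np

def ray_triangle_intersect(orig, direc, v0, v1, v2, eps=1e-9):
    """Moller-Trumbore ray/triangle intersection.
    Returns the distance t along the ray to the hit point,
    or None if the ray misses the triangle."""
    e1, e2 = v1 - v0, v2 - v0
    p = np.cross(direc, e2)
    det = np.dot(e1, p)
    if abs(det) < eps:            # ray parallel to the triangle plane
        return None
    inv = 1.0 / det
    s = orig - v0
    u = np.dot(s, p) * inv        # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    q = np.cross(s, e1)
    v = np.dot(direc, q) * inv    # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = np.dot(e2, q) * inv
    return t if t > eps else None

# a ray along +z from below hits a facet lying in the z = 1 plane at t = 1.0
o = np.array([0.1, 0.1, 0.0])
d = np.array([0.0, 0.0, 1.0])
tri = (np.array([0.0, 0.0, 1.0]), np.array([1.0, 0.0, 1.0]), np.array([0.0, 1.0, 1.0]))
t_hit = ray_triangle_intersect(o, d, *tri)
```

    The speed-up reported above comes from running only a handful of such tests per ray, selected by a bounding-volume hierarchy, instead of a number that grows with the facet count.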

  7. A Student Project to use Geant4 Simulations for a TMS-PET combination

    NASA Astrophysics Data System (ADS)

    Altamirano, A.; Chamorro, A.; Hurtado, K.; Romero, C.; Rueda, A.; Solano Salinas, C. J.; Wahl, D.; Zamudio, A.

    2007-10-01

    Geant4 is one of the most powerful tools for MC simulation of detectors and their applications. We present a student project to simulate a combined Transcranial Magnetic Stimulation-Positron Emission Tomography (TMS-PET) system using Geant4. This project aims to study PET-TMS systems by implementing a model for the brain response to the TMS pulse and studying the simulated PET response. In order to increase the speed of the simulations we parallelise our programs and investigate the possibility of using GRID computing.

  8. A Student Project to use Geant4 Simulations for a TMS-PET combination

    SciTech Connect

    Altamirano, A.; Chamorro, A.; Hurtado, K.; Romero, C.; Wahl, D.; Zamudio, A.; Rueda, A.; Solano Salinas, C. J.

    2007-10-26

    Geant4 is one of the most powerful tools for MC simulation of detectors and their applications. We present a student project to simulate a combined Transcranial Magnetic Stimulation-Positron Emission Tomography (TMS-PET) system using Geant4. This project aims to study PET-TMS systems by implementing a model for the brain response to the TMS pulse and studying the simulated PET response. In order to increase the speed of the simulations we parallelise our programs and investigate the possibility of using GRID computing.

  9. SU-E-T-565: RAdiation Resistance of Cancer CElls Using GEANT4 DNA: RACE

    SciTech Connect

    Perrot, Y; Payno, H; Delage, E; Maigne, L

    2014-06-01

    Purpose: The objective of the RACE project is to develop a comparison between Monte Carlo simulation using the Geant4-DNA toolkit and measurements of radiation damage on 3D melanoma and chondrosarcoma culture cells coupled with gadolinium nanoparticles. We present here the status of the simulation developments. Methods: Monte Carlo studies are driven using the Geant4 toolkit and the Geant4-DNA extension. In order to model the geometry of a cell population, the open-source CPOP++ program is being developed for the geometrical representation of 3D cell populations, including a specific cell mesh coupled with a multi-agent system. Each cell includes cytoplasm and nucleus. The correct modeling of the cell population has been validated with confocal microscopy images of spheroids. The Geant4 Livermore physics models are used to simulate the interactions of a 250 keV X-ray beam and the production of secondaries from gadolinium nanoparticles assumed to be fixed on the cell membranes. Geant4-DNA processes are used to simulate the interactions of charged particles with the cells. An atomistic description of the DNA molecule, from PDB (Protein Data Bank) files, is provided by the PDB4DNA Geant4 user application we developed to score energy depositions in DNA base pairs and sugar-phosphate groups. Results: At the microscopic level, our simulations enable assessing the microscopic energy distribution in each cell compartment of a realistic 3D cell population. Dose enhancement factors due to the presence of gadolinium nanoparticles can be estimated. At the nanometer scale, direct damages on nuclear DNA are also estimated. Conclusion: We successfully simulated the impact of direct radiations on a realistic 3D cell population model compatible with microdosimetry calculations using the Geant4-DNA toolkit. Upcoming validation and the future integration of the radiochemistry module of Geant4-DNA will propose to correlate clusters of ionizations with in vitro

  10. Simulating cosmic radiation absorption and secondary particle production of solar panel layers of Low Earth Orbit (LEO) satellite with GEANT4

    NASA Astrophysics Data System (ADS)

    Yiğitoğlu, Merve; Veske, Doğa; Nilüfer Öztürk, Zeynep; Bilge Demirköz, Melahat

    2016-07-01

    All devices which operate in space are exposed to cosmic rays during their operation. The resulting radiation may cause fatal damage in the solid structure of devices, and the amount of absorbed radiation dose and secondary particle production for each component should be calculated carefully before production. Solar panels are semiconductor solid-state devices and are very sensitive to radiation. Even a short-term power cut-off may yield a total failure of the satellite. Even small doses of radiation can change the characteristics of solar cells. This deviation can be caused by rarer high-energy particles as well as by the total ionizing dose from the abundant low-energy particles. In this study, solar panels planned for a specific LEO satellite, IMECE, are analyzed layer by layer. The Space Environment Information System (SPENVIS) database and GEANT4 simulation software are used to simulate the layers of the panels. The results obtained from the simulation will be taken into account to determine the amount of radiation protection and resistance needed for the panels or to revise the design of the panels.

  11. CMS validation Experience: Test-beam 2004 data vs Geant4

    NASA Astrophysics Data System (ADS)

    Piperov, Stefan

    2007-03-01

    A comparison between the Geant4 Monte-Carlo simulation of CMS Detector's Calorimetric System and data from the 2004 Test-Beam at CERN's SPS H2 beam-line is presented. The overall simulated response agrees quite well with the measured response. Slight differences in the longitudinal shower profiles between the MC predictions made with different Physics Lists are observed.

  12. GEANT4 Hadronic Physics Validation with Lhc Test-Beam Data

    NASA Astrophysics Data System (ADS)

    Alexa, Călin

    2005-02-01

    In the framework of the LHC Computing Grid (LCG) Simulation Physics Validation Project, we present first conclusions about the validation of the Geant4 hadronic physics lists based on comparisons with test-beam data collected with three LHC calorimeters: the ATLAS Tilecal, the ATLAS HEC and the CMS HCAL.

  13. Identifying key surface parameters for optical photon transport in GEANT4/GATE simulations.

    PubMed

    Nilsson, Jenny; Cuplov, Vesna; Isaksson, Mats

    2015-09-01

    For a scintillator used for spectrometry, the generation, transport and detection of optical photons have a great impact on the energy spectrum resolution. A complete Monte Carlo model of a scintillator includes a coupled ionizing particle and optical photon transport, which can be simulated with the GEANT4 code. The GEANT4 surface parameters control the physics processes an optical photon undergoes when reaching the surface of a volume. In this work the impact of each surface parameter on the optical transport was studied by looking at the optical spectrum: the number of detected optical photons per ionizing source particle from a large plastic scintillator, i.e. the output signal. All simulations were performed using GATE v6.2 (GEANT4 Application for Tomographic Emission). The surface parameter finish (polished, ground, front-painted or back-painted) showed the greatest impact on the optical spectrum, whereas the surface parameter σ(α), which controls the surface roughness, had a relatively small impact. It was also shown how the surface parameters reflectivity and reflectivity types (specular spike, specular lobe, Lambertian and backscatter) changed the optical spectrum depending on the probability for reflection and the combination of reflectivity types. A change in the optical spectrum will ultimately have an impact on a simulated energy spectrum. By studying the optical spectra presented in this work, a GEANT4 user can predict the shift in an optical spectrum caused by the alteration of a specific surface parameter.
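
    The reflectivity types listed above correspond to different ways of sampling the reflected photon direction at a boundary. The following is a minimal sketch of two of them, with invented function names and no claim to match the internals of GEANT4's optical boundary process:

```python
import numpy as np

rng = np.random.default_rng(0)

def specular_reflect(d, n):
    """Specular (mirror) reflection of direction d about unit normal n."""
    return d - 2.0 * np.dot(d, n) * n

def lambertian_reflect(n):
    """Cosine-weighted (Lambertian) reflection about unit normal n,
    sampled with the standard sqrt(u) trick in a local frame."""
    u1, u2 = rng.random(2)
    ct = np.sqrt(u1)                  # cos(theta) ~ sqrt(uniform)
    st = np.sqrt(1.0 - u1)
    phi = 2.0 * np.pi * u2
    local = np.array([st * np.cos(phi), st * np.sin(phi), ct])
    # build an orthonormal frame (t, b, n) around the normal
    a = np.array([1.0, 0.0, 0.0]) if abs(n[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    t = np.cross(n, a); t /= np.linalg.norm(t)
    b = np.cross(n, t)
    return local[0] * t + local[1] * b + local[2] * n
```

    A specular spike always returns the mirror direction, while a Lambertian surface spreads reflected photons over the hemisphere with mean cosine 2/3; mixing such components with given probabilities is what the reflectivity-type parameters above control.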

  14. CMS validation experience: Test-beam 2004 data vs GEANT4

    SciTech Connect

    Piperov, Stefan; /Fermilab /Sofiya, Inst. Nucl. Res.

    2007-01-01

    A comparison between the Geant4 Monte-Carlo simulation of CMS Detector's Calorimetric System and data from the 2004 Test-Beam at CERN's SPS H2 beam-line is presented. The overall simulated response agrees quite well with the measured response. Slight differences in the longitudinal shower profiles between the MC predictions made with different Physics Lists are observed.

  15. Refactoring, reengineering and evolution: paths to Geant4 uncertainty quantification and performance improvement

    NASA Astrophysics Data System (ADS)

    Batič, M.; Begalli, M.; Han, M.; Hauf, S.; Hoff, G.; Kim, C. H.; Kuster, M.; Pia, M. G.; Saracco, P.; Seo, H.; Weidenspointner, G.; Zoglauer, A.

    2012-12-01

    Ongoing investigations for the improvement of Geant4 accuracy and computational performance resulting from refactoring and reengineering parts of the code are discussed. Issues in refactoring that are specific to the domain of physics simulation are identified and their impact is elucidated. Preliminary quantitative results are reported.

  16. A CUDA Monte Carlo simulator for radiation therapy dosimetry based on Geant4

    NASA Astrophysics Data System (ADS)

    Henderson, N.; Murakami, K.; Amako, K.; Asai, M.; Aso, T.; Dotti, A.; Kimura, A.; Gerritsen, M.; Kurashige, H.; Perl, J.; Sasaki, T.

    2014-06-01

    Geant4 is a large-scale particle physics package that facilitates every aspect of particle transport simulation. This includes, but is not limited to, geometry description, material definition, tracking of particles passing through and interacting with matter, storage of event data, and visualization. As more detailed and complex simulations are required in different application domains, there is much interest in adapting the code for parallel and multi-core architectures. Parallelism can be achieved by tracking many particles at the same time. The complexity in the context of a GPU/CUDA adaptation is the highly serialized nature of the Geant4 package and the presence of large lookup tables that guide the simulation. This work presents G4CU, a CUDA implementation of the core Geant4 algorithm adapted for dose calculations in radiation therapy. For these applications the geometry is a block of voxels and the physics is limited to low-energy electromagnetic physics. These features allow efficient tracking of many particles in parallel on the GPU. Experiments with radiotherapy simulations in G4CU demonstrate speedups of about 40 times over Geant4.

  17. Reflight certification software design specifications

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The PDSS/IMC Software Design Specification for the Payload Development Support System (PDSS)/Image Motion Compensator (IMC) is contained. The PDSS/IMC is to be used for checkout and verification of the IMC flight hardware and software by NASA/MSFC.

  18. The GEANT4 toolkit capability in the hadron therapy field: simulation of a transport beam line

    NASA Astrophysics Data System (ADS)

    Cirrone, G. A. P.; Cuttone, G.; Di Rosa, F.; Raffaele, L.; Russo, G.; Guatelli, S.; Pia, M. G.

    2006-01-01

    At the Laboratori Nazionali del Sud of the Istituto Nazionale di Fisica Nucleare in Catania (Sicily, Italy), the first Italian hadron therapy facility, named CATANA (Centro di AdroTerapia ed Applicazioni Nucleari Avanzate), has been realized. Inside CATANA, 62 MeV proton beams, accelerated by a superconducting cyclotron, are used for the radiotherapeutic treatment of some types of ocular tumours. Therapy with hadron beams still represents a pioneering technique, and only a few centers worldwide can provide this advanced specialized cancer treatment. On the basis of the experience so far gained, and considering the future hadron-therapy facilities to be developed (Rinecker, Munich, Germany; Heidelberg/GSI, Darmstadt, Germany; PSI, Villigen, Switzerland; CNAO, Pavia, Italy; Centro di Adroterapia, Catania, Italy), we decided to develop a Monte Carlo application based on the GEANT4 toolkit for the design, realization and optimization of a proton-therapy beam line. Another goal of our project is to provide a general tool able to study the interactions of hadrons with human tissue and to test the analytical treatment planning systems currently used in routine practice. All the typical elements of a hadron-therapy line, such as diffusers, range shifters, collimators and detectors, were modelled. In particular, we simulated a Markus-type ionization chamber and a GafChromic film as dosimeters to reconstruct the depth (Bragg peak and Spread Out Bragg Peak) and lateral dose distributions, respectively. We validated our simulated detectors by comparing the results with the experimental data available in our facility.

  19. The simulation of the LANFOS-H food radiation contamination detector using Geant4 package

    NASA Astrophysics Data System (ADS)

    Piotrowski, Lech Wiktor; Casolino, Marco; Ebisuzaki, Toshikazu; Higashide, Kazuhiro

    2015-02-01

    The recent incident at the Fukushima power plant caused growing concern about radiation contamination and resulted in a lowering of the Japanese limit for the permitted amount of 137Cs in food to 100 Bq/kg. To increase safety and ease this concern we are developing LANFOS (Large Food Non-destructive Area Sampler), a compact, easy-to-use detector for the assessment of radiation in food. The LANFOS-H described in this paper has 4π coverage to assess the amount of 137Cs present, separating it from possible 40K food contamination. Food samples therefore do not have to be pre-processed prior to a test and can be consumed after measurement. It is designed for use by non-professionals in homes and small institutions such as schools, indicating the safety of the samples, but it can also be utilized by specialists, providing a radiation spectrum. Proper assessment of radiation in food in the apparatus requires estimation of the γ conversion factor of the detectors, that is, how many γ photons will produce a signal. In this paper we show results of the Monte Carlo estimation of this factor for various approximated shapes of fish, vegetables and amounts of rice, performed with the Geant4 package. We find that the conversion factor combined from all the detectors is similar for all food types, around 37%, varying by at most 5% with sample length, much less than for individual detectors. The different inclinations and positions of samples in the detector introduce an uncertainty of 1.4%. This small uncertainty validates the concept of a 4π non-destructive apparatus.
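
    The γ conversion factor above is, at its core, a detection efficiency estimated by Monte Carlo. A toy version of such an estimate, for a point source below a bare disk detector with all interactions ignored (the geometry and names are ours, not LANFOS's), compared against the analytic solid-angle result:

```python
import numpy as np

rng = np.random.default_rng(42)

def geometric_efficiency_mc(h, R, n=200_000):
    """Monte Carlo estimate of the fraction of isotropically emitted
    photons from a point source that reach a disk detector of radius R
    a distance h above the source (no absorption or scattering)."""
    cos_t = rng.uniform(-1.0, 1.0, n)       # isotropic: cos(theta) uniform
    up = cos_t > 0.0                        # only upward photons can hit
    sin_t = np.sqrt(1.0 - cos_t[up] ** 2)
    r_at_plane = h * sin_t / cos_t[up]      # radius where the ray crosses z = h
    return np.count_nonzero(r_at_plane <= R) / n

def geometric_efficiency_exact(h, R):
    """Analytic solid-angle fraction for an on-axis disk:
    (1 - h / sqrt(h^2 + R^2)) / 2."""
    return 0.5 * (1.0 - h / np.hypot(h, R))
```

    A real conversion-factor calculation, as in the paper, replaces the disk with the actual detector arrangement and transports photons through the sample material, but the efficiency-as-counting structure is the same.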

  20. Software design and documentation language

    NASA Technical Reports Server (NTRS)

    Kleine, H.

    1977-01-01

    A communications medium to support the design and documentation of complex software applications is studied. The medium also provides the following: (1) a processor which can convert design specifications into an intelligible, informative machine reproducible document; (2) a design and documentation language with forms and syntax that are simple, unrestrictive, and communicative; and (3) methodology for effective use of the language and processor.

  1. Geant4 simulations of the neutron production and transport in the n_TOF spallation target

    NASA Astrophysics Data System (ADS)

    Lerendegui-Marco, J.; Cortés-Giraldo, M. A.; Guerrero, C.; Quesada, J. M.

    2016-11-01

    The neutron production and transport in the spallation target of the n_TOF facility at CERN have been simulated with Geant4. The results obtained with the different hadronic Physics Lists provided by Geant4 have been compared with the experimental neutron flux in n_TOF-EAR1. The best overall agreement in both the absolute value and the energy dependence of the flux, from thermal to 1 GeV, is obtained with the INCL++ model coupled with the Fritiof model (FTFP). This Physics List has thus been used to simulate and study the main features of the new n_TOF-EAR2 beam line, currently in its commissioning phase.

  2. Calculation of Coincidence Summing Correction Factors for an HPGe detector using GEANT4.

    PubMed

    Giubrone, G; Ortiz, J; Gallardo, S; Martorell, S; Bas, M C

    2016-07-01

    The aim of this paper was to calculate the True Coincidence Summing Correction Factors (TSCFs) for an HPGe coaxial detector in order to correct the summing effect caused by the presence of (88)Y and (60)Co in a multigamma source used to obtain a calibration efficiency curve. Results were obtained for three volumetric sources using the Monte Carlo toolkit GEANT4. The first part of this paper deals with modeling the detector in order to obtain a simulated full energy peak efficiency curve. A quantitative comparison between the measured and simulated values was made across the entire energy range under study. The True Summing Correction Factors were calculated for (88)Y and (60)Co using the full peak efficiencies obtained with GEANT4. This methodology was subsequently applied to (134)Cs, which presents a complex decay scheme. PMID:27085040
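
    For the simplest possible case (two gammas emitted in prompt cascade, no angular correlation), the summing correction has a closed form: counts are lost from a full-energy peak whenever the coincident photon deposits any energy at all, so the loss scales with that photon's total efficiency. A hedged sketch, far simpler than the full decay-scheme calculation in the paper:

```python
def tscf_two_gamma(eps_total_coinc):
    """True-coincidence summing correction factor for a full-energy
    peak whose gamma is always accompanied by one coincident cascade
    gamma. The measured peak area is depressed by the probability
    (eps_total_coinc) that the coincident photon also deposits energy,
    so the correction multiplies the peak by 1 / (1 - eps_total_coinc).
    Assumes no angular correlation and a single two-step cascade."""
    return 1.0 / (1.0 - eps_total_coinc)
```

    For close source-detector geometries the total efficiency of the coincident gamma can reach several percent, which is why TSCFs matter most for volumetric sources measured on the endcap; nuclides like (134)Cs require summing over many cascade branches instead of this single term.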

  3. Application of automated weight windows to spallation neutron source shielding calculations using Geant4

    NASA Astrophysics Data System (ADS)

    Stenander, John; DiJulio, Douglas D.

    2015-10-01

    We present an implementation of a general weight-window generator for global variance reduction in Geant4 based applications. The implementation is flexible and can be easily adjusted to a user-defined model. In this work, the weight-window generator was applied to calculations based on an instrument shielding model of the European Spallation Source, which is currently under construction in Lund, Sweden. The results and performance of the implemented methods were evaluated through the definition of two figures of merit. It was found that the biased simulations showed an overall improvement in performance compared to the unbiased simulations. The present work demonstrates both the suitability of the generator method and Geant4 for these types of calculations.
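
    The core weight-window operation, splitting above the window and Russian roulette below it, can be sketched as follows. This is the textbook mechanic, not the generator implementation described in the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

def apply_weight_window(weight, w_low, w_high):
    """Split or roulette one particle so surviving weights land inside
    [w_low, w_high]. Returns a list of weights (possibly empty)."""
    if weight > w_high:                      # split heavy particles
        n = int(np.ceil(weight / w_high))
        return [weight / n] * n
    if weight < w_low:                       # roulette light particles
        survival = weight / w_low
        return [w_low] if rng.random() < survival else []
    return [weight]                          # already inside the window
```

    Both branches preserve the expected weight, which is what keeps the biased simulation unbiased in the mean; the generator's job is choosing w_low and w_high per region so that particles migrating toward the tally are split and those migrating away are rouletted.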

  4. Estimation of photoneutron yield in linear accelerator with different collimation systems by Geant4 and MCNPX simulation codes

    NASA Astrophysics Data System (ADS)

    Kim, Yoon Sang; Khazaei, Zeinab; Ko, Junho; Afarideh, Hossein; Ghergherehchi, Mitra

    2016-04-01

    At present, the bremsstrahlung photon beams produced by linear accelerators are the most commonly employed method of radiotherapy for tumor treatments. A photoneutron source based on three different energies (6, 10 and 15 MeV) of a linac electron beam was designed by means of Geant4 and Monte Carlo N-Particle eXtended (MCNPX) simulation codes. To obtain maximum neutron yield, two arrangements for the photoneutron convertor were studied: (a) without a collimator, and (b) placement of the convertor after the collimator. The maximum photon intensities in tungsten were 0.73, 1.24 and 2.07 photon/e at 6, 10 and 15 MeV, respectively. There was no considerable increase in the photon fluence spectra from 6 to 15 MeV at the optimum thickness between 0.8 mm and 2 mm of tungsten. The optimum dimensions of the collimator were determined to be a length of 140 mm with an aperture of 5 mm × 70 mm for iron in a slit shape. According to the neutron yield, the best thickness obtained for the studied materials was 30 mm. The number of neutrons generated in BeO achieved the maximum value at 6 MeV, unlike that in Be, where the highest number of neutrons was observed at 15 MeV. Statistical uncertainty in all simulations was less than 0.3% and 0.05% for MCNPX and the standard electromagnetic (EM) physics packages of Geant4, respectively. Differences among spectra in various regions are due to various cross-section and stopping power data and different simulations of the physics processes.

  5. Estimation of photoneutron yield in linear accelerator with different collimation systems by Geant4 and MCNPX simulation codes.

    PubMed

    Kim, Yoon Sang; Khazaei, Zeinab; Ko, Junho; Afarideh, Hossein; Ghergherehchi, Mitra

    2016-04-01

    At present, the bremsstrahlung photon beams produced by linear accelerators are the most commonly employed method of radiotherapy for tumor treatments. A photoneutron source based on three different energies (6, 10 and 15 MeV) of a linac electron beam was designed by means of Geant4 and Monte Carlo N-Particle eXtended (MCNPX) simulation codes. To obtain maximum neutron yield, two arrangements for the photoneutron convertor were studied: (a) without a collimator, and (b) placement of the convertor after the collimator. The maximum photon intensities in tungsten were 0.73, 1.24 and 2.07 photon/e at 6, 10 and 15 MeV, respectively. There was no considerable increase in the photon fluence spectra from 6 to 15 MeV at the optimum thickness between 0.8 mm and 2 mm of tungsten. The optimum dimensions of the collimator were determined to be a length of 140 mm with an aperture of 5 mm × 70 mm for iron in a slit shape. According to the neutron yield, the best thickness obtained for the studied materials was 30 mm. The number of neutrons generated in BeO achieved the maximum value at 6 MeV, unlike that in Be, where the highest number of neutrons was observed at 15 MeV. Statistical uncertainty in all simulations was less than 0.3% and 0.05% for MCNPX and the standard electromagnetic (EM) physics packages of Geant4, respectively. Differences among spectra in various regions are due to various cross-section and stopping power data and different simulations of the physics processes.

  6. Estimation of photoneutron yield in linear accelerator with different collimation systems by Geant4 and MCNPX simulation codes.

    PubMed

    Kim, Yoon Sang; Khazaei, Zeinab; Ko, Junho; Afarideh, Hossein; Ghergherehchi, Mitra

    2016-04-01

    At present, the bremsstrahlung photon beams produced by linear accelerators are the most commonly employed method of radiotherapy for tumor treatments. A photoneutron source based on three different energies (6, 10 and 15 MeV) of a linac electron beam was designed by means of Geant4 and Monte Carlo N-Particle eXtended (MCNPX) simulation codes. To obtain maximum neutron yield, two arrangements for the photoneutron convertor were studied: (a) without a collimator, and (b) placement of the convertor after the collimator. The maximum photon intensities in tungsten were 0.73, 1.24 and 2.07 photon/e at 6, 10 and 15 MeV, respectively. There was no considerable increase in the photon fluence spectra from 6 to 15 MeV at the optimum thickness between 0.8 mm and 2 mm of tungsten. The optimum dimensions of the collimator were determined to be a length of 140 mm with an aperture of 5 mm × 70 mm for iron in a slit shape. According to the neutron yield, the best thickness obtained for the studied materials was 30 mm. The number of neutrons generated in BeO achieved the maximum value at 6 MeV, unlike that in Be, where the highest number of neutrons was observed at 15 MeV. Statistical uncertainty in all simulations was less than 0.3% and 0.05% for MCNPX and the standard electromagnetic (EM) physics packages of Geant4, respectively. Differences among spectra in various regions are due to various cross-section and stopping power data and different simulations of the physics processes. PMID:26975304

  7. Software design by reusing architectures

    NASA Technical Reports Server (NTRS)

    Bhansali, Sanjay; Nii, H. Penny

    1992-01-01

    Abstraction fosters reuse by providing a class of artifacts that can be instantiated or customized to produce a set of artifacts meeting different specific requirements. It is proposed that significant leverage can be obtained by abstracting software system designs and the design process. The result of such an abstraction is a generic architecture and a set of knowledge-based customization tools that can be used to instantiate the generic architecture. An approach for designing software systems based on the above idea is described. The approach is illustrated through an implemented example, and the advantages and limitations of the approach are discussed.

  8. The local skin dose conversion coefficients of electrons, protons and alpha particles calculated using the Geant4 code.

    PubMed

    Zhang, Bintuan; Dang, Bingrong; Wang, Zhuanzi; Wei, Wei; Li, Wenjian

    2013-10-01

    The skin tissue-equivalent slab reported in the International Commission on Radiological Protection (ICRP) Publication 116 for calculating the localised skin dose conversion coefficients (LSDCCs) was adopted into the Monte Carlo transport code Geant4. The Geant4 code was then utilised for the computation of LSDCCs due to a circular parallel beam of monoenergetic electrons, protons and alpha particles below 10 MeV. The computed LSDCCs for both electrons and alpha particles are found to be in good agreement with the ICRP 116 results, which were obtained with the MCNPX code. The present work thus validates the LSDCC values for both electrons and alpha particles using the Geant4 code.

  9. Monte Carlo modeling and validation of a proton treatment nozzle by using the Geant4 toolkit

    NASA Astrophysics Data System (ADS)

    Kim, Dae-Hyun; Kang, Young Nam; Suh, Tae-Suk; Shin, Jungwook; Kim, Jong Won; Yoo, Seung Hoon; Park, Seyjoon; Lee, Sang Hoon; Cho, Sungkoo; Shin, Dongho; Kim, Dae Yong; Lee, Se Byeong

    2012-10-01

    Modern commercial treatment planning systems for proton therapy use the pencil beam algorithm for calculating the absorbed dose. Although it is acceptable for clinical radiation treatment, the accuracy of this method is limited. Alternatively, the Monte Carlo method, which is relatively accurate in dose calculations, has been applied recently to proton therapy. To reduce the remaining uncertainty in proton therapy dose calculations, in the present study, we employed Monte Carlo simulations and the Geant4 simulation toolkit to develop a model of a proton treatment nozzle. The results from a Geant4-based medical application of the proton treatment nozzle were compared to the measured data. Simulations of the percentage depth dose profiles showed very good agreement, within 1 mm in distal range and 3 mm in modulated width. Moreover, the lateral dose profiles showed good agreement, within 3% in the central region of the field and within 10% in the penumbra regions. In this work, we proved that the Geant4 Monte Carlo model of a proton treatment nozzle can be used to calculate proton dose distributions accurately.

  10. Assessment of Geant4 Prompt-Gamma Emission Yields in the Context of Proton Therapy Monitoring.

    PubMed

    Pinto, Marco; Dauvergne, Denis; Freud, Nicolas; Krimmer, Jochen; Létang, Jean M; Testa, Etienne

    2016-01-01

    Monte Carlo tools have long been used to assist the research and development of solutions for proton therapy monitoring. The present work focuses on the prompt-gamma emission yields by comparing experimental data with the outcomes of the current version of Geant4 using all applicable proton inelastic models. For the case under study and using the binary cascade model, it was found that Geant4 overestimates the prompt-gamma emission yields by 40.2 ± 0.3%, even though it predicts the prompt-gamma profile length of the experimental profile accurately. In addition, the default implementations of all proton inelastic models overestimate the number of prompt gammas emitted. Finally, a set of built-in options and physically sound Geant4 source code changes were tested in order to reduce the observed discrepancy. A satisfactory agreement was found when using the QMD model with a wave packet width equal to 1.3 fm². PMID:26858937

  11. A study of the runaway relativistic electron avalanche and the feedback theory using GEANT4

    NASA Astrophysics Data System (ADS)

    Broberg Skeltved, Alexander; Østgaard, Nikolai; Carlson, Brant; Gjesteland, Thomas

    2014-05-01

    This study investigates the Runaway Relativistic Electron Avalanche (RREA) and the feedback process, as well as the production of bremsstrahlung photons from Runaway Electrons (REs). These processes are important for understanding the production of the intense bursts of gamma rays known as Terrestrial Gamma-Ray Flashes (TGFs). Results are obtained from Monte Carlo (MC) simulations using the GEometry ANd Tracking 4 (GEANT4) programming toolkit. The simulations take into account the effects of electron ionisation, electron-by-electron scattering (Møller scattering), as well as positron and photon interactions, in the 250 eV-100 GeV energy range. Several physics libraries, or 'physics lists', are provided with GEANT4 to implement these physics processes in the simulations. We give a detailed analysis of the electron and the feedback multiplication, in particular the avalanche lengths, Λ, the energy distribution and the feedback factor, γ. We also find that our results vary significantly depending on which physics list we implement. In order to verify our results and the GEANT4 programming toolkit, we compare them to previous results from existing models. In addition we present the ratio of the production of bremsstrahlung photons to runaway electrons. From this ratio we obtain the parameter α, which describes the electron-to-photon relation.
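
    The avalanche length Λ and feedback factor γ quoted above enter through simple exponential and geometric growth laws. A toy numerical sketch of those two relations, with placeholder parameter values (not the ones derived in the study):

```python
import math

def avalanche_multiplication(n0, path_length, avalanche_length):
    """Mean number of runaway electrons after traversing path_length
    in a field region with e-folding (avalanche) length lambda:
    N = N0 * exp(L / lambda)."""
    return n0 * math.exp(path_length / avalanche_length)

def total_multiplication_with_feedback(single_avalanche_gain, gamma, n_generations):
    """Sum the geometric series over feedback generations: each
    generation re-seeds the next with factor gamma, so the total
    diverges as gamma -> 1 (a self-sustaining discharge)."""
    return single_avalanche_gain * sum(gamma ** k for k in range(n_generations))
```

    For gamma below 1 the series converges to gain / (1 - gamma), which is why the feedback factor, rather than the single-avalanche gain alone, controls whether TGF-scale multiplication is reached.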

  12. Assessment of Geant4 Prompt-Gamma Emission Yields in the Context of Proton Therapy Monitoring

    PubMed Central

    Pinto, Marco; Dauvergne, Denis; Freud, Nicolas; Krimmer, Jochen; Létang, Jean M.; Testa, Etienne

    2016-01-01

    Monte Carlo tools have been long used to assist the research and development of solutions for proton therapy monitoring. The present work focuses on the prompt-gamma emission yields by comparing experimental data with the outcomes of the current version of Geant4 using all applicable proton inelastic models. For the case in study and using the binary cascade model, it was found that Geant4 overestimates the prompt-gamma emission yields by 40.2 ± 0.3%, even though it predicts the prompt-gamma profile length of the experimental profile accurately. In addition, the default implementations of all proton inelastic models show an overestimation in the number of prompt gammas emitted. Finally, a set of built-in options and physically sound Geant4 source code changes have been tested in order to try to improve the discrepancy observed. A satisfactory agreement was found when using the QMD model with a wave packet width equal to 1.3 fm2. PMID:26858937

  13. Application of TDCR-Geant4 modeling to standardization of 63Ni.

    PubMed

    Thiam, C; Bobin, C; Chauvenet, B; Bouchard, J

    2012-09-01

    As an alternative to the classical TDCR model applied to liquid scintillation (LS) counting, a stochastic approach based on the Geant4 toolkit is presented for the simulation of light emission inside the dedicated three-photomultiplier detection system. To this end, the Geant4 modeling includes a comprehensive description of the optical properties associated with each material constituting the optical chamber. The objective is to simulate the propagation of optical photons from their creation in the LS cocktail to the production of photoelectrons in the photomultipliers. The modeling was first validated for the case of radionuclide standardization based on Cerenkov emission; the scintillation process was then added to the TDCR-Geant4 modeling using the Birks expression in order to account for the light-emission nonlinearity owing to ionization quenching. The scintillation yield of the commercial Ultima Gold LS cocktail has been determined from double-coincidence detection efficiencies obtained for (60)Co and (54)Mn with the 4π(LS)β-γ coincidence method. In this paper, the stochastic TDCR modeling is applied to the standardization of (63)Ni (pure β(-) emitter; E(max)=66.98 keV) and the activity concentration is compared with the result given by the classical model.
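    The Birks expression used here to model ionization quenching can be written as a one-line function. The kB value below is an illustrative quenching parameter, not a measured value for the Ultima Gold cocktail:

    ```python
    def birks_light_yield(de_dx, S=1.0, kB=0.0126):
        """Birks expression: dL/dx = S * (dE/dx) / (1 + kB * (dE/dx)).

        S is the scintillation yield, kB the Birks quenching parameter;
        light emission becomes sub-linear as dE/dx (ionization density) grows.
        """
        return S * de_dx / (1.0 + kB * de_dx)
    ```

    With kB = 0 the response is perfectly linear; increasing kB suppresses the yield for densely ionizing tracks, which is the nonlinearity the stochastic TDCR modeling accounts for.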

  14. Geant4 Model Validation of Compton Suppressed System for Process monitoring of Spent Fuel

    SciTech Connect

    Bender, Sarah; Unlu, Kenan; Orton, Christopher R.; Schwantes, Jon M.

    2013-05-01

    Nuclear material accountancy is of continuing concern for the regulatory, safeguards, and verification communities. In particular, spent nuclear fuel reprocessing facilities pose one of the most difficult accountancy challenges: monitoring highly radioactive, fluid sample streams in near real time. The Multi-Isotope Process Monitor will allow near-real-time indication of process alterations using passive gamma-ray detection coupled with multivariate analysis techniques to guard against potential material diversion or to enhance domestic process monitoring. The Compton continuum from the dominant 661.7 keV 137Cs fission-product peak obscures lower-energy lines which could be used for spectral and multivariate analysis. Compton suppression may be able to mitigate the challenges posed by the high continuum caused by scattering. A Monte Carlo simulation using the Geant4 toolkit is being developed to predict the expected suppressed spectrum from spent fuel samples and to estimate the reduction in the Compton continuum. Despite the lack of timing information between decay events in the particle management of Geant4, encouraging results were recorded utilizing only the information within individual decays, without accounting for accidental coincidences. The model has been validated with single and cascade decay emitters in two steps: as an unsuppressed system and with suppression activated. Results of the Geant4 model validation will be presented.
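    The suppression logic itself is an anticoincidence veto: a primary-detector hit is discarded whenever the surrounding guard detector records a simultaneous energy deposit (a photon that Compton-scattered out of the primary crystal). A minimal sketch, with a hypothetical per-event record structure:

    ```python
    def suppressed_spectrum(events):
        """Anticoincidence veto: keep only primary-detector energies for
        events with no coincident energy deposit in the guard detector.

        Each event is a dict with illustrative keys 'energy_keV' (primary
        crystal) and 'veto_energy_keV' (guard/suppression shield).
        """
        return [e["energy_keV"] for e in events if e["veto_energy_keV"] == 0.0]
    ```

    Full-energy photopeak events (e.g. the 661.7 keV 137Cs line) deposit nothing in the guard and survive, while partial-deposit continuum events are vetoed, lowering the Compton continuum.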

  15. Shuttle mission simulator software conceptual design

    NASA Technical Reports Server (NTRS)

    Burke, J. F.

    1973-01-01

    Software conceptual designs (SCD) are presented for meeting the simulator requirements for the shuttle missions. The major areas of the SCD discussed include: malfunction insertion, flight software, applications software, systems software, and computer complex.

  16. Structural Analysis and Design Software

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Collier Research and Development Corporation received a one-of-a-kind computer code for designing exotic hypersonic aircraft, called ST-SIZE, in the first-ever Langley Research Center software copyright license agreement. Collier transformed the NASA computer code into a commercial software package called HyperSizer, which integrates with other private-sector Finite Element Modeling and Finite Element Analysis structural analysis programs. ST-SIZE was chiefly conceived as a means to improve and speed the structural design of a future aerospace plane for the Langley Hypersonic Vehicles Office. Incorporating the NASA computer code into HyperSizer has enabled the company to also apply the software to applications other than aerospace, including improved design and construction for offices, marine structures, cargo containers, commercial and military aircraft, rail cars, and a host of everyday consumer products.

  17. Software Design for Smile Analysis

    PubMed Central

    Sodagar, A.; Rafatjoo, R.; Gholami Borujeni, D.; Noroozi, H.; Sarkhosh, A.

    2010-01-01

    Introduction: Esthetics and attractiveness of the smile is one of the major demands in contemporary orthodontic treatment. In order to improve a smile design, it is necessary to record the “posed smile” as an intentional, non-pressured, static, natural and reproducible smile. The record should then be analyzed to determine its characteristics. In this study, we intended to design and introduce software to analyze the smile rapidly and precisely in order to produce an attractive smile for the patients. Materials and Methods: For this purpose, a practical study was performed to design the multimedia software “Smile Analysis”, which can receive patients’ photographs and videographs. After loading records into the software, the operator marks the points and lines displayed in the system’s guide and defines the correct scale for each image. Thirty-three variables are measured by the software and displayed on the report page. Reliability of measurements in both image and video was significantly high (α=0.7–1). Results: In order to evaluate intra-operator and inter-operator reliability, five cases were selected randomly. Statistical analysis showed that calculations performed in the Smile Analysis software were both valid and highly reliable (for both video and photo). Conclusion: The results obtained from smile analysis could be used in diagnosis, treatment planning and evaluation of treatment progress. PMID:21998792

  18. FIB Microfabrication Software Design Considerations

    NASA Astrophysics Data System (ADS)

    Thompson, W.; Bowe, T.; Morlock, S.; Moskowitz, A.; Plourde, G.; Spaulding, G.; Scialdone, C.; Tsiang, E.

    1986-06-01

    Profit margins on high-volume ICs, such as the 256-K DRAM, are now inadequate. U.S. and foreign manufacturers cannot fully recover the ICs' engineering costs before a new round of product competition begins. Consequently, some semiconductor manufacturers are seeking less competitive designs with healthier, longer-lasting profitability. These designs must be converted quickly from CAD to functional circuits in order for profits to be realized. For ultrahigh-performance devices, customized circuits, and rapid verification of design, FIB (focused ion beam) systems provide a viable alternative to the lengthy process of producing a large mask set. Early models of FIB equipment did not require sophisticated software. However, as FIB technology approaches adolescence, it must be supported by software that gives the user a friendly system, the flexibility to design a wide variety of circuits, and good growth potential for tomorrow's ICs. Presented here is an overview of IBT's MicroFocus 150 hardware, followed by descriptions of several MicroFocus software modules. Data preparation techniques from IBCAD formats to chip layout are compared to the more conventional lithographies. The MicroFocus 150 schemes for user interfacing, error logging, calibration, and subsystem control are given. The MicroFocus's pattern generator and bit-slice software are explained. IBT's FIB patterning algorithms, which allow the fabrication of unique device types, are reviewed.

  19. Extension of PENELOPE to protons: Simulation of nuclear reactions and benchmark with Geant4

    SciTech Connect

    Sterpin, E.; Sorriaux, J.; Vynckier, S.

    2013-11-15

    Purpose: To describe the implementation of nuclear reactions in the extension of the Monte Carlo (MC) code PENELOPE to protons (PENH) and to benchmark it against Geant4. Methods: PENH is based on mixed-simulation mechanics for both elastic and inelastic electromagnetic (EM) collisions. The adopted differential cross sections for EM elastic collisions are calculated using the eikonal approximation with the Dirac–Hartree–Fock–Slater atomic potential. Cross sections for EM inelastic collisions are computed within the relativistic Born approximation, using the Sternheimer–Liljequist model of the generalized oscillator strength. Nuclear elastic and inelastic collisions were simulated using explicitly the scattering analysis interactive dial-in database for {sup 1}H and ICRU 63 data for {sup 12}C, {sup 14}N, {sup 16}O, {sup 31}P, and {sup 40}Ca. Secondary protons, alphas, and deuterons were all simulated as protons, with the energy adapted to ensure a consistent range. Prompt gamma emission can also be simulated upon user request. Simulations were performed in a water phantom with nuclear interactions switched off or on, and integral depth–dose distributions were compared. Binary-cascade and precompound models were used for Geant4. Initial energies of 100 and 250 MeV were considered. For cases with no nuclear interactions simulated, additional simulations in a water phantom with tight resolution (1 mm in all directions) were performed with FLUKA. Finally, integral depth–dose distributions for 250 MeV were computed with Geant4 and PENH in a homogeneous phantom with, first, ICRU striated muscle and, second, ICRU compact bone. Results: For simulations with EM collisions only, integral depth–dose distributions were within 1%/1 mm for doses higher than 10% of the Bragg-peak dose. For central-axis depth–dose and lateral profiles in a phantom with tight resolution, there are significant deviations between Geant4 and PENH (up to 60%/1 cm for depth

  20. Comparison of electromagnetic and hadronic models generated using Geant 4 with antiproton dose measured in CERN.

    PubMed

    Tavakoli, Mohammad Bagher; Reiazi, Reza; Mohammadi, Mohammad Mehdi; Jabbari, Keyvan

    2015-01-01

    After the idea of antiproton cancer treatment was proposed in 1984, many experiments were launched to investigate different aspects of the physical and radiobiological properties of antiprotons, which arise from their annihilation reactions. One of these experiments has been performed at the European Organization for Nuclear Research (CERN) using the Antiproton Decelerator. The ultimate goal of this experiment was to assess the dosimetric and radiobiological properties of antiproton beams in order to estimate the suitability of antiprotons for radiotherapy. One difficulty was the long-term unavailability of the antiproton beam at CERN, so the verification of Monte Carlo codes for simulating the antiproton depth dose could be useful. Among available simulation codes, Geant4 provides acceptable flexibility and extensibility, which have progressively led to the development of novel Geant4 applications in research domains, especially modeling the biological effects of ionizing radiation at the sub-cellular scale. In this study, the depth dose corresponding to the CERN antiproton beam energy was calculated with Geant4 using all the standard physics lists currently available and benchmarked for other use cases. Overall, none of the standard physics lists was able to reproduce the antiproton percentage depth dose. Although with some models our results were promising, the Bragg peak level remained the point of concern for our study. It is concluded that the Bertini model with high-precision neutron tracking (QGSP_BERT_HP) best matches the experimental data, though it is also the slowest model for simulating events among the physics lists.

  1. Comparison of electromagnetic and hadronic models generated using Geant 4 with antiproton dose measured in CERN

    PubMed Central

    Tavakoli, Mohammad Bagher; Reiazi, Reza; Mohammadi, Mohammad Mehdi; Jabbari, Keyvan

    2015-01-01

    After the idea of antiproton cancer treatment was proposed in 1984, many experiments were launched to investigate different aspects of the physical and radiobiological properties of antiprotons, which arise from their annihilation reactions. One of these experiments has been performed at the European Organization for Nuclear Research (CERN) using the Antiproton Decelerator. The ultimate goal of this experiment was to assess the dosimetric and radiobiological properties of antiproton beams in order to estimate the suitability of antiprotons for radiotherapy. One difficulty was the long-term unavailability of the antiproton beam at CERN, so the verification of Monte Carlo codes for simulating the antiproton depth dose could be useful. Among available simulation codes, Geant4 provides acceptable flexibility and extensibility, which have progressively led to the development of novel Geant4 applications in research domains, especially modeling the biological effects of ionizing radiation at the sub-cellular scale. In this study, the depth dose corresponding to the CERN antiproton beam energy was calculated with Geant4 using all the standard physics lists currently available and benchmarked for other use cases. Overall, none of the standard physics lists was able to reproduce the antiproton percentage depth dose. Although with some models our results were promising, the Bragg peak level remained the point of concern for our study. It is concluded that the Bertini model with high-precision neutron tracking (QGSP_BERT_HP) best matches the experimental data, though it is also the slowest model for simulating events among the physics lists. PMID:26170558

  2. Validation of the GEANT4 simulation of bremsstrahlung from thick targets below 3 MeV

    NASA Astrophysics Data System (ADS)

    Pandola, L.; Andenna, C.; Caccia, B.

    2015-05-01

    The bremsstrahlung spectra produced by electrons impinging on thick targets are simulated using the GEANT4 Monte Carlo toolkit. Simulations are validated against experimental data available in the literature for energies between 0.5 and 2.8 MeV for Al and Fe targets, and at 70 keV for Al, Ag, W and Pb targets. The energy spectra for the different configurations of emission angles, energies and targets are considered. Simulations are performed using the three alternative sets of electromagnetic models that are available in GEANT4 to describe bremsstrahlung. At the higher energies (0.5-2.8 MeV) of the electrons impinging on Al and Fe targets, GEANT4 is able to reproduce the spectral shapes and the integral photon emission in the forward direction. The agreement is within 10-30%, depending on energy, emission angle and target material. The physics model based on the Penelope Monte Carlo code is in slightly better agreement with the measured data than the other two. However, all models overestimate the photon emission in the backward hemisphere. For the lower-energy study (70 keV), which includes higher-Z targets, all models systematically underestimate the total photon yield, with agreement between 10% and 50%. The results of this work are of potential interest for medical physics applications, where knowledge of the energy spectra and angular distributions of photons is needed for accurate dose calculations with Monte Carlo and other fluence-based methods.

  4. GEANT4 simulation of cyclotron radioisotope production in a solid target.

    PubMed

    Poignant, F; Penfold, S; Asp, J; Takhar, P; Jackson, P

    2016-05-01

    The use of radioisotopes in nuclear medicine is essential for diagnosing and treating cancer. The optimization of their production is a key factor in maximizing the production yield and minimizing the associated costs. An efficient approach to this problem is the use of Monte Carlo simulations prior to experimentation. By predicting isotope yields, one can study the expected activity of the isotope of interest for different energy ranges. One can also study the contamination of the target with other radioisotopes, especially undesired radioisotopes of the wanted chemical element, which are difficult to separate from the irradiated target and might increase the dose when delivering the radiopharmaceutical product to the patient. The aim of this work is to build and validate a Monte Carlo simulation platform using the GEANT4 toolkit to model the solid target system of the South Australian Health and Medical Research Institute (SAHMRI) GE Healthcare PETtrace cyclotron. It includes a GEANT4 Graphical User Interface (GUI) where the user can modify simulation parameters such as the energy, shape and current of the proton beam, the target geometry and material, the foil geometry and material, and the time of irradiation. The paper describes the simulation and presents a comparison of simulated and experimental/theoretical yields for various nuclear reactions on an enriched nickel-64 target using the GEANT4 physics model QGSP_BIC_AllHP, a model recently developed to evaluate with high precision the interaction of protons with energies below 200 MeV, available in Geant4 version 10.1. The simulated yield of the (64)Ni(p,n)(64)Cu reaction was found to be 7.67±0.074 mCi·μA(-1) for a target energy range of 9-12 MeV. Szelecsenyi et al. (1993) give a theoretical yield of 6.71 mCi·μA(-1) and an experimental yield of 6.38 mCi·μA(-1). The (64)Ni(p,n)(64)Cu cross section obtained with the simulation was also verified against the yield predicted from the nuclear database TENDL and
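    The yield prediction that such simulations and the TENDL database support reduces, for a thick target, to integrating the reaction cross section over the proton's slowing-down path: n = (N_A/A) ∫ σ(E)/S(E) dE. A minimal sketch with trapezoidal integration; the constant σ and S values in the usage note are illustrative, not nickel-64 data:

    ```python
    N_A = 6.022e23  # Avogadro's number, mol^-1

    def reactions_per_proton(E_MeV, sigma_mb, S_MeV_cm2_g, A):
        """Thick-target yield per incident proton:
        n = (N_A / A) * integral sigma(E) / S(E) dE,
        with sigma tabulated in mb (1 mb = 1e-27 cm^2) and the mass
        stopping power S in MeV cm^2/g. Trapezoidal rule on the grid.
        """
        total = 0.0
        for i in range(len(E_MeV) - 1):
            f0 = sigma_mb[i] * 1e-27 / S_MeV_cm2_g[i]
            f1 = sigma_mb[i + 1] * 1e-27 / S_MeV_cm2_g[i + 1]
            total += 0.5 * (f0 + f1) * (E_MeV[i + 1] - E_MeV[i])
        return (N_A / A) * total
    ```

    Multiplying the result by the beam current (protons per second) and the decay-corrected irradiation time converts it into an activity yield comparable to the mCi·μA⁻¹ figures quoted above.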

  5. Background simulation of the X-ray detectors using Geant4 toolkit

    NASA Astrophysics Data System (ADS)

    Sarkar, R.; Mandal, S.; Nandi, A.; Debnath, D.; Chakrabarti, S. K.; Rao, A. R.

    We have studied the background noise of X-ray detectors using the Geant4 simulation toolkit. The main source of background noise for X-ray detectors in low Earth orbit is diffuse cosmic background photons. We have calculated the background spectrum for the CZT detector of ASTROSAT as well as for the phoswich detector of RT-2. We have also studied the importance of the veto detector in reducing Compton-induced background photons. In this simulation we have also optimized the passive shielding to minimize the detector weight within the allowed limit of background counts.

  6. Wurtzite Gallium Nitride as a scintillator detector for alpha particles (a Geant4 simulation)

    NASA Astrophysics Data System (ADS)

    Taheri, A.; Sheidaiy, M.

    2015-05-01

    Gallium nitride has become a very popular material in electronics and optoelectronics. Because of its interesting properties, it is suitable for a wide range of applications. This material also shows very good scintillation properties, which make it a possible candidate for use as a charged-particle scintillation detector. In this work, we simulated the scintillation and optical properties of gallium nitride in the presence of alpha particles using Geant4. The results show that gallium nitride can be an appropriate choice for this purpose.

  7. Geant4 simulations on Compton scattering of laser photons on relativistic electrons

    SciTech Connect

    Filipescu, D.; Utsunomiya, H.; Gheorghe, I.; Glodariu, T.; Tesileanu, O.; Shima, T.; Takahisa, K.; Miyamoto, S.

    2015-02-24

    Using Geant4, a complex simulation code of the interaction between laser photons and relativistic electrons was developed. We implemented physically constrained electron-beam emittance and spatial distribution parameters, and we also considered a Gaussian laser beam. The code was tested against experimental data produced at the γ-ray beam line GACKO (Gamma Collaboration Hutch of Konan University) of the synchrotron radiation facility NewSUBARU. Here we discuss the implications of transverse misalignments of the collimation system relative to the electron beam axis.
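    The kinematics underlying such laser Compton scattering fix the maximum (head-on, 180°-backscattered) γ-ray energy at E_max = 4γ²E_L / (1 + 4γE_L/m_ec²). A minimal sketch; the 1 GeV electron energy and 1.164 eV (1064 nm) laser photon in the usage note are illustrative values, not necessarily the GACKO/NewSUBARU beam parameters:

    ```python
    M_E = 0.511  # electron rest energy, MeV

    def backscatter_edge_MeV(E_e_MeV, E_laser_eV):
        """Maximum energy of a laser photon Compton-backscattered head-on
        off a relativistic electron:
        E_max = 4 g^2 E_L / (1 + 4 g E_L / m_e c^2), with g = E_e / m_e c^2.
        """
        g = E_e_MeV / M_E
        EL = E_laser_eV * 1e-6  # eV -> MeV
        return 4.0 * g * g * EL / (1.0 + 4.0 * g * EL / M_E)
    ```

    For a 1000 MeV electron and a 1.164 eV photon this gives a Compton edge near 17.5 MeV; the collimation system then selects the small forward cone of near-maximum-energy photons, which is why transverse misalignments distort the delivered spectrum.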

  8. Integration of the low-energy particle track simulation code in Geant4

    NASA Astrophysics Data System (ADS)

    Arce, Pedro; Muñoz, Antonio; Moraleda, Montserrat; Gomez Ros, José María; Blanco, Fernando; Perez, José Manuel; García, Gustavo

    2015-08-01

    The Low-Energy Particle Track Simulation code (LEPTS) is a Monte Carlo code developed to simulate the damage caused by radiation at molecular level. The code is based on experimental data of scattering cross sections, both differential and integral, and energy loss data, complemented with theoretical calculations. It covers the interactions of electrons and positrons from energies of 10 keV down to 0.1 eV in different biologically relevant materials. In this article we briefly mention the main characteristics of this code and we present its integration within the Geant4 Monte Carlo toolkit.

  9. Simulation of the 6 MV Elekta Synergy Platform linac photon beam using Geant4 Application for Tomographic Emission

    PubMed Central

    Didi, Samir; Moussa, Abdelilah; Yahya, Tayalati; Mustafa, Zerfaoui

    2015-01-01

    The present work validates the Geant4 Application for Tomographic Emission Monte Carlo software for the simulation of a 6 MV photon beam given by Elekta Synergy Platform medical linear accelerator treatment head. The simulation includes the major components of the linear accelerator (LINAC) with multi-leaf collimator and a homogeneous water phantom. Calculations were performed for the photon beam with several treatment field sizes ranging from 5 cm × 5 cm to 30 cm × 30 cm at 100 cm distance from the source. The simulation was successfully validated by comparison with experimental distributions. Good agreement between simulations and measurements was observed, with dose differences of about 0.02% and 2.5% for depth doses and lateral dose profiles, respectively. This agreement was also emphasized by the Kolmogorov–Smirnov goodness-of-fit test and by the gamma-index comparisons where more than 99% of the points for all simulations fulfill the quality assurance criteria of 2 mm/2%. PMID:26500399
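    The 2 mm/2% gamma-index criterion used above combines a distance-to-agreement (DTA) and a dose-difference tolerance into a single pass/fail number per point. A minimal 1-D global-normalization sketch (the record does not specify the exact implementation used):

    ```python
    import math

    def gamma_index(x_ref, d_ref, x_eval, d_eval, dta_mm=2.0, dd_frac=0.02):
        """Global 1-D gamma index: for each reference point, minimise
        sqrt((dx / DTA)^2 + (dD / (dd * Dmax))^2) over all evaluated points.
        A point passes the 2 mm / 2% criterion when gamma <= 1."""
        dmax = max(d_ref)
        out = []
        for xr, dr in zip(x_ref, d_ref):
            g2 = min(((xe - xr) / dta_mm) ** 2
                     + ((de - dr) / (dd_frac * dmax)) ** 2
                     for xe, de in zip(x_eval, d_eval))
            out.append(math.sqrt(g2))
        return out
    ```

    The quoted "more than 99% of the points fulfill the criteria" corresponds to the fraction of points with gamma ≤ 1.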

  10. Space radiation analysis: Radiation effects and particle interaction outside the Earth's magnetosphere using GRAS and GEANT4

    NASA Astrophysics Data System (ADS)

    Martinez, Lisandro M.; Kingston, Jennifer

    2012-03-01

    In order to explore the Moon and Mars it is necessary to investigate the hazards due to the space environment, especially ionizing radiation. Previous papers have presented much information on radiation analysis inside the Earth's magnetosphere, but much of this work is not directly relevant to the interplanetary medium. This work explores the effect of radiation on humans inside structures such as the ISS and provides a detailed analysis of galactic cosmic rays (GCRs) and solar proton events (SPEs) using SPENVIS (Space Environment Effects and Information System) and CREME96 data files for particle flux outside the Earth's magnetosphere. The simulation was conducted using GRAS, a European Space Agency (ESA) software package based on GEANT4. Dose and equivalent dose have been calculated, as well as secondary-particle effects and the GCR energy spectrum. The calculated total dose and equivalent dose indicate the risk that space radiation poses to the crew; these values are calculated for two different types of structures, the ISS and TransHab modules. The final results indicate the amount of radiation expected to be absorbed by the astronauts during long-duration interplanetary flights; this underlines the importance of radiation shielding and the use of proper materials to reduce its effects.
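    The step from absorbed dose to equivalent dose mentioned above is a weighted sum over particle types, H_T = Σ_R w_R · D_T,R. A minimal sketch using ICRP 103 radiation weighting factors (neutrons are omitted because their w_R is energy-dependent; the dose values in the usage note are illustrative):

    ```python
    # ICRP 103 radiation weighting factors (neutrons omitted: energy-dependent)
    W_R = {"photon": 1.0, "electron": 1.0, "muon": 1.0,
           "proton": 2.0, "alpha": 20.0}

    def equivalent_dose_Sv(absorbed_dose_Gy):
        """H_T = sum over particle types R of w_R * D_T,R.
        absorbed_dose_Gy maps particle type -> absorbed dose in Gy."""
        return sum(W_R[p] * d for p, d in absorbed_dose_Gy.items())
    ```

    For example, 0.1 Gy from protons plus 0.05 Gy from photons gives 0.25 Sv, illustrating why GCR heavy-ion and proton components dominate the crew risk even when their absorbed dose is modest.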

  11. Software Performs Complex Design Analysis

    NASA Technical Reports Server (NTRS)

    2008-01-01

    Designers use computational fluid dynamics (CFD) to gain greater understanding of the fluid flow phenomena involved in components being designed. They also use finite element analysis (FEA) as a tool to help gain greater understanding of the structural response of components to loads, stresses and strains, and the prediction of failure modes. Automated CFD and FEA engineering design has centered on shape optimization, which has been hindered by two major problems: 1) inadequate shape parameterization algorithms, and 2) inadequate algorithms for CFD and FEA grid modification. Working with software engineers at Stennis Space Center, a NASA commercial partner, Optimal Solutions Software LLC, was able to utilize its revolutionary, one-of-a-kind arbitrary shape deformation (ASD) capability, a major advancement in solving these two aforementioned problems, to optimize the shapes of complex pipe components that transport highly sensitive fluids. The ASD technology solves the problem of inadequate shape parameterization algorithms by allowing the CFD designers to freely create their own shape parameters, therefore eliminating the restriction of only being able to use the computer-aided design (CAD) parameters. The problem of inadequate algorithms for CFD grid modification is solved by the fact that the new software performs a smooth volumetric deformation. This eliminates the extremely costly process of having to remesh the grid for every shape change desired. The program can perform a design change in a markedly reduced amount of time, a process that would traditionally involve the designer returning to the CAD model to reshape and then remesh the shapes, something that has been known to take hours, days, even weeks or months, depending upon the size of the model.

  12. Model-based software design

    NASA Technical Reports Server (NTRS)

    Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui; Yenne, Britt; Vansickle, Larry; Ballantyne, Michael

    1992-01-01

    Domain-specific knowledge is required to create specifications, generate code, and understand existing systems. Our approach to automating software design is based on instantiating an application domain model with industry-specific knowledge and then using that model to achieve the operational goals of specification elicitation and verification, reverse engineering, and code generation. Although many different specification models can be created from any particular domain model, each specification model is consistent and correct with respect to the domain model.

  13. Geant4 Monte Carlo simulation of energy loss and transmission and ranges for electrons, protons and ions

    NASA Astrophysics Data System (ADS)

    Ivantchenko, Vladimir

    Geant4 is a toolkit for Monte Carlo simulation of particle transport, originally developed for applications in high-energy physics with a focus on the experiments at the Large Hadron Collider (CERN, Geneva). The transparency and flexibility of the code have spread its use to other fields of research, e.g. radiotherapy and space science. The toolkit provides the possibility to simulate complex geometries, transportation in electric and magnetic fields, and a variety of physics models of the interaction of particles with media. Geant4 has been used for the simulation of radiation effects for a number of space missions. Recent upgrades of the toolkit, released in December 2009, include a new model for ion electronic stopping power based on the revised version of the ICRU 73 Report, increasing the accuracy of simulation of ion transport. In the current work we present the status of the Geant4 electromagnetic package for the simulation of particle energy loss, ranges and transmission. This has a direct implication for the simulation of ground-testing setups at existing European facilities and for the simulation of radiation effects in space. A number of improvements were introduced for electron and proton transport, followed by a thorough validation. It was the aim of the present study to validate the ranges against reference data from the United States National Institute of Standards and Technology (NIST) ESTAR, PSTAR and ASTAR databases. We compared Geant4 and NIST ranges of electrons using different Geant4 models. The best agreement was found for Penelope, except at very low energies in heavy materials, where the Standard package gave better results. Geant4 proton ranges in water agreed with NIST within 1%. The validation of the new ion model is performed against recent data on the Bragg peak position in water. The data on transmission of carbon ions through various absorbers, following the Bragg peak in water, demonstrate that the new Geant4 model significantly improves the precision of ion ranges. The absolute accuracy of ion range
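    The CSDA ranges tabulated in the NIST ESTAR/PSTAR/ASTAR databases, against which the abstract validates Geant4, are defined as the integral of the inverse stopping power, R = ∫ dE/S(E). A minimal trapezoidal sketch over a tabulated grid (the grid values in the test are synthetic, not NIST data):

    ```python
    def csda_range_g_cm2(E_MeV, S_MeV_cm2_g):
        """CSDA range R = integral over E of dE / S(E) on a tabulated grid
        (trapezoidal rule); S is the mass stopping power in MeV cm^2/g,
        so R comes out in g/cm^2. Dividing by the material density
        gives the range in cm."""
        r = 0.0
        for i in range(len(E_MeV) - 1):
            r += 0.5 * (1.0 / S_MeV_cm2_g[i] + 1.0 / S_MeV_cm2_g[i + 1]) \
                 * (E_MeV[i + 1] - E_MeV[i])
        return r
    ```

    Comparing such integrated ranges against a Monte Carlo code's simulated track lengths is exactly the kind of check the abstract performs for electrons, protons and ions.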

  14. Automating software design system DESTA

    NASA Technical Reports Server (NTRS)

    Lovitsky, Vladimir A.; Pearce, Patricia D.

    1992-01-01

    'DESTA' is the acronym for the Dialogue Evolutionary Synthesizer of Turnkey Algorithms by means of a natural language (Russian or English) functional specification of algorithms or software being developed. DESTA represents the computer-aided and/or automatic artificial intelligence 'forgiving' system which provides users with software tools support for algorithm and/or structured program development. The DESTA system is intended to provide support for the higher levels and earlier stages of engineering design of software in contrast to conventional Computer Aided Design (CAD) systems which provide low level tools for use at a stage when the major planning and structuring decisions have already been taken. DESTA is a knowledge-intensive system. The main features of the knowledge are procedures, functions, modules, operating system commands, batch files, their natural language specifications, and their interlinks. The specific domain for the DESTA system is a high level programming language like Turbo Pascal 6.0. The DESTA system is operational and runs on an IBM PC computer.

  15. Evaluation of proton inelastic reaction models in Geant4 for prompt gamma production during proton radiotherapy

    NASA Astrophysics Data System (ADS)

    Jeyasugiththan, Jeyasingam; Peterson, Stephen W.

    2015-10-01

    During proton beam radiotherapy, discrete secondary prompt gamma rays are induced by inelastic nuclear reactions between protons and nuclei in the human body. In recent years, the Geant4 Monte Carlo toolkit has played an important role in the development of devices for real-time dose and range verification using prompt gamma radiation. Unfortunately the default physics models in Geant4 do not reliably replicate the measured prompt gamma emission. Determining a suitable physics model for low-energy proton inelastic interactions will improve the accuracy of prompt gamma simulations. Among the built-in physics models, we found that the precompound model with a modified initial exciton state of 2 (1 particle, 1 hole) produced more accurate discrete gamma lines from the most important elements found within the body, such as 16O, 12C and 14N, when comparing them with the available gamma production cross section data. Using the modified physics model, we investigated the prompt gamma spectra produced in a water phantom by a 200 MeV pencil beam of protons. The spectra were attained using a LaBr3 detector with a time-of-flight (TOF) window and a BGO active shield to reduce the secondary neutron and gamma background. The simulations show that a 2 ns TOF window could reduce the secondary neutron flux hitting the detector by 99%. The results show that using both timing and active shielding can remove up to 85% of the background radiation, which includes a 33% reduction by BGO subtraction.
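    The effectiveness of the 2 ns TOF window rests on simple kinematics: prompt gammas travel at c, while MeV-scale neutrons are far slower over the same flight path. A minimal sketch (the 0.3 m flight path and 10 MeV neutron energy in the test are illustrative, not the experimental geometry):

    ```python
    import math

    C_M_PER_NS = 0.299792458  # speed of light, m/ns

    def arrival_time_ns(path_m, beta):
        """Flight time over path_m for a particle moving at beta * c."""
        return path_m / (beta * C_M_PER_NS)

    def neutron_beta(T_MeV, m_MeV=939.565):
        """Relativistic beta from kinetic energy:
        beta = sqrt(1 - (m / (m + T))^2)."""
        return math.sqrt(1.0 - (m_MeV / (m_MeV + T_MeV)) ** 2)
    ```

    Over a 0.3 m flight path a prompt gamma arrives in about 1 ns, while a 10 MeV neutron needs several nanoseconds more, so a 2 ns acceptance window after the gamma arrival rejects most of the neutron-induced background.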

  16. Analysis of GEANT4 Physics List Properties in the 12 GeV MOLLER Simulation Framework

    NASA Astrophysics Data System (ADS)

    Haufe, Christopher; Moller Collaboration

    2013-10-01

    To determine the validity of new physics beyond the scope of the electroweak theory, nuclear physicists across the globe have been collaborating on future endeavors that will provide the precision needed to confirm these speculations. One of these is the MOLLER experiment - a low-energy particle experiment that will utilize the 12 GeV upgrade of Jefferson Lab's CEBAF accelerator. The motivation of this experiment is to measure the parity-violating asymmetry of polarized electrons scattered off unpolarized electrons in a liquid hydrogen target. This measurement would allow a more precise determination of the electron's weak charge and weak mixing angle. While still in its planning stages, the MOLLER experiment requires a detailed simulation framework in order to determine how the project should be run in the future. The simulation framework for MOLLER, called ``remoll'', is written with GEANT4. As a result, the simulation can utilize a number of GEANT4 physics lists that constrain particle interactions based on different particle physics models. By comparing these lists with one another using the data-analysis application ROOT, the optimal physics list for the MOLLER simulation can be determined and implemented. This material is based upon work supported by the National Science Foundation under Grant No. 714001.

  17. Modeling the production and acceleration of runaway electrons in strong inhomogeneous electric fields with GEANT4

    NASA Astrophysics Data System (ADS)

    Broberg Skeltved, Alexander; Østgaard, Nikolai

    2015-04-01

    The mechanism responsible for the production of Terrestrial Gamma-ray Flashes (TGFs) is not yet fully understood. However, from satellite measurements we know that approximately 10^17 relativistic electrons must be produced at a source altitude of 15 km in order to explain the measured photon intensity. It is also well established that TGFs and lightning are interlinked. One suggested mechanism is the production and multiplication of runaway electrons in the streamer-leader electric fields. We report on a new study that uses the Geometry and Tracking (GEANT4) programming toolkit to model the acceleration and multiplication of electrons in strong inhomogeneous electric fields such as those occurring in lightning leaders. In this model we implement a physics list of cross sections developed by the GEANT4 collaboration to model low-energy particle interactions, the Low-energy Background Experiments (LBE) list. It has been shown that the choice of physics is crucial to obtain correct results. This physics list includes elastic scattering of electrons according to the Møller scattering method and bremsstrahlung according to the Seltzer-Berger method. In the model we simulate particle interactions explicitly for energies above 250 eV (10 eV for photons). Below 250 eV a continuous energy loss function is used.

  18. Simulation of positron backscattering and implantation profiles using Geant4 code

    NASA Astrophysics Data System (ADS)

    Huang, Shi-Juan; Pan, Zi-Wen; Liu, Jian-Dang; Han, Rong-Dian; Ye, Bang-Jiao

    2015-10-01

    For the proper interpretation of the experimental data produced by the slow positron beam technique, the positron implantation properties are studied carefully using the latest Geant4 code. The simulated backscattering coefficients, implantation profiles, and median implantation depths for mono-energetic positrons with energies from 1 keV to 50 keV normally incident on different crystals are reported. Compared with previous experimental results, our simulated backscattering coefficients are in reasonable agreement, and we think that the accuracy may be related to the structures of the host materials in the Geant4 code. Based on the reasonable simulated backscattering coefficients, the adjustable parameters of the implantation profiles, which depend on the material and implantation energy, are obtained. Most importantly, we calculate the positron backscattering coefficients and median implantation depths in amorphous polymers for the first time, and our simulations are in fairly good agreement with previous experimental results. Project supported by the National Natural Science Foundation of China (Grant Nos. 11175171 and 11105139).
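The "adjustable parameters of the implantation profiles" referred to above are conventionally those of a Makhovian stopping profile. A sketch of that standard parameterization (the constants A = 4.0 ug/cm^2/keV^n, n = 1.6 and shape parameter m = 2 are commonly quoted literature defaults, not the fitted values of this record):

```python
import math

def mean_depth_nm(energy_kev, density_g_cm3, A=4.0, n=1.6):
    """Empirical mean implantation depth zbar = (A/rho) * E^n,
    with A in ug/cm^2/keV^n (A=4.0, n=1.6 are commonly quoted values)."""
    zbar_ug_cm2 = A * energy_kev**n               # areal density [ug/cm^2]
    zbar_cm = zbar_ug_cm2 * 1e-6 / density_g_cm3  # linear depth [cm]
    return zbar_cm * 1e7                          # convert to nm

def makhov_profile(z_nm, energy_kev, density_g_cm3, m=2.0):
    """Makhovian profile P(z) = m z^(m-1) / z0^m * exp(-(z/z0)^m),
    where z0 = zbar / Gamma(1/m + 1)."""
    z0 = mean_depth_nm(energy_kev, density_g_cm3) / math.gamma(1.0 / m + 1.0)
    return (m * z_nm**(m - 1) / z0**m) * math.exp(-((z_nm / z0) ** m))

# e.g. mean depth of 5 keV positrons in a unit-density material:
print(f"{mean_depth_nm(5.0, 1.0):.0f} nm")
```

The profile is normalized, so integrating it over depth recovers unity; fitting m and z0 to simulated profiles is what yields material- and energy-dependent parameters.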

  19. Enhancement and validation of Geant4 Brachytherapy application on clinical HDR 192Ir source

    NASA Astrophysics Data System (ADS)

    Ababneh, Eshraq; Dababneh, Saed; Qatarneh, Sharif; Wadi-Ramahi, Shada

    2014-10-01

    The Geant4 Monte Carlo (MC) Brachytherapy example was adapted and enhanced, and several analysis techniques were developed. The simulation studies the isodose distribution of the total, primary and scattered doses around a Nucletron microSelectron 192Ir source. Different phantom materials were used (water, tissue and bone) and the calculation was conducted at various depths and planes. The work provides an early estimate of the number of primary events required to ultimately achieve a given uncertainty at a given distance, in the otherwise CPU- and time-consuming clinical MC calculation. The adaptation of the Geant4 toolkit and the enhancements introduced to the code are all validated, including the comprehensive decay of the 192Ir source, the materials used to build the geometry, the geometry itself and the calculated scatter-to-primary dose ratio. The simulation quantitatively illustrates that the scattered dose in the bone medium is larger than its value in water and tissue. As the distance from the source increases, the scatter contribution to dose becomes more significant as the primary dose decreases. The developed code could be viewed as a platform that contains a detailed dose calculation model for clinical application of HDR 192Ir in Brachytherapy.
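The "early estimate of the required number of primary events" rests on the 1/sqrt(N) scaling of Monte Carlo statistical uncertainty. A minimal sketch of that scaling (the pilot-run numbers are invented for illustration):

```python
def primaries_needed(n_pilot: int, sigma_pilot: float, sigma_target: float) -> int:
    """Monte Carlo statistical uncertainty scales as 1/sqrt(N), so the
    number of primaries needed for a target relative uncertainty is
    N = N_pilot * (sigma_pilot / sigma_target)**2."""
    return int(n_pilot * (sigma_pilot / sigma_target) ** 2)

# e.g. a pilot run of 1e6 primaries yielding 5% uncertainty at some point:
n = primaries_needed(1_000_000, 0.05, 0.01)  # histories needed for 1%
print(n)
```

Halving the uncertainty costs four times the histories, which is exactly why such estimates matter before launching a clinical-scale MC calculation.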

  20. Application of dynamic Monte Carlo technique in proton beam radiotherapy using Geant4 simulation toolkit

    NASA Astrophysics Data System (ADS)

    Guan, Fada

    The Monte Carlo method has been successfully applied to particle transport problems. Most Monte Carlo simulation tools are static: they can only perform simulations for problems with fixed physics and geometry settings. Proton therapy, however, is a dynamic treatment technique in clinical application. In this research, we developed a method to perform dynamic Monte Carlo simulation of proton therapy using the Geant4 simulation toolkit. A passive-scattering treatment nozzle equipped with a rotating range modulation wheel was modeled. One important application of Monte Carlo simulation is to predict the spatial dose distribution in the target geometry. For simplification, a mathematical model of a human body is usually used as the target, but then only the average dose over a whole organ or tissue can be obtained, rather than an accurate spatial dose distribution. In this research, we developed a method using MATLAB to convert the medical images of a patient from CT scanning into a patient voxel geometry. Hence, if the patient voxel geometry is used as the target in the Monte Carlo simulation, the accurate spatial dose distribution in the target can be obtained. The data analysis tool ROOT was used to score the simulation results during a Geant4 simulation and to analyze the data and plot results after simulation. Finally, we successfully obtained the accurate spatial dose distribution in part of a human body after treating a patient with prostate cancer using proton therapy.
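Converting CT images into a voxel geometry hinges on mapping each voxel's Hounsfield Unit (HU) to a mass density (and material). A sketch of the usual piecewise-linear lookup; the calibration points below are illustrative placeholders, not a validated clinical calibration:

```python
import bisect

# Illustrative HU -> density calibration points (HU, g/cm^3); a real
# conversion would use a scanner-specific (e.g. Schneider-type) table.
CALIB = [(-1000, 0.00121), (-500, 0.5), (0, 1.0), (1000, 1.6), (3000, 2.9)]

def hu_to_density(hu: float) -> float:
    """Piecewise-linear interpolation between calibration points."""
    hus = [h for h, _ in CALIB]
    hu = max(min(hu, hus[-1]), hus[0])      # clamp to the table range
    i = max(1, bisect.bisect_left(hus, hu))
    (h0, d0), (h1, d1) = CALIB[i - 1], CALIB[i]
    return d0 + (d1 - d0) * (hu - h0) / (h1 - h0)

print(hu_to_density(0))    # water
print(hu_to_density(500))  # between water and dense bone
```

Each CT voxel's HU is pushed through such a lookup to assign the density (and a binned material) used by the transport code.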

  1. Optimisation of a dual head semiconductor Compton camera using Geant4

    NASA Astrophysics Data System (ADS)

    Harkness, L. J.; Boston, A. J.; Boston, H. C.; Cooper, R. J.; Cresswell, J. R.; Grint, A. N.; Nolan, P. J.; Oxley, D. C.; Scraggs, D. P.; Beveridge, T.; Gillam, J.; Lazarus, I.

    2009-06-01

    Conventional medical gamma-ray camera systems utilise mechanical collimation to provide information on the position of an incident gamma-ray photon. Systems that use electronic collimation utilising Compton image reconstruction techniques have the potential to offer huge improvements in sensitivity. Position sensitive high purity germanium (HPGe) detector systems are being evaluated as part of a single photon emission computed tomography (SPECT) Compton camera system. Data have been acquired from the orthogonally segmented planar SmartPET detectors, operated in Compton camera mode. The minimum gamma-ray energy which can be imaged by the current system in Compton camera configuration is 244 keV, due to the large gamma-ray absorption caused by the 20 mm thickness of the first scatter detector. A simulation package for the optimisation of a new semiconductor Compton camera has been developed using the Geant4 toolkit. This paper shows results of a preliminary analysis of the validated Geant4 simulation at the SPECT gamma-ray energy of 141 keV.
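Why a 20 mm scatter detector absorbs too many low-energy gammas follows from Beer-Lambert attenuation. A sketch; the attenuation coefficients below are rough, assumed order-of-magnitude values for germanium, not tabulated data (consult NIST XCOM for real numbers):

```python
import math

def transmitted_fraction(mu_per_cm: float, thickness_cm: float) -> float:
    """Beer-Lambert transmission exp(-mu * x) through an absorber slab."""
    return math.exp(-mu_per_cm * thickness_cm)

# Assumed, illustrative linear attenuation coefficients [1/cm] for Ge:
for label, mu in [("lower energy", 0.75), ("higher energy", 0.45)]:
    t = transmitted_fraction(mu, 2.0)  # 20 mm scatter detector
    print(f"{label}: {t:.0%} transmitted")
```

Because the attenuation coefficient rises steeply as the photon energy drops, a thick first detector absorbs most low-energy photons before they can scatter into the second detector, setting the energy floor of the camera.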

  2. A Compton camera application for the GAMOS GEANT4-based framework

    NASA Astrophysics Data System (ADS)

    Harkness, L. J.; Arce, P.; Judson, D. S.; Boston, A. J.; Boston, H. C.; Cresswell, J. R.; Dormand, J.; Jones, M.; Nolan, P. J.; Sampson, J. A.; Scraggs, D. P.; Sweeney, A.; Lazarus, I.; Simpson, J.

    2012-04-01

    Compton camera systems can be used to image sources of gamma radiation in a variety of applications such as nuclear medicine, homeland security and nuclear decommissioning. To locate gamma-ray sources, a Compton camera employs electronic collimation, utilising Compton kinematics to reconstruct the paths of gamma rays which interact within the detectors. The main benefit of this technique is the ability to accurately identify and locate sources of gamma radiation within a wide field of view, vastly improving the efficiency and specificity over existing devices. Potential advantages of this imaging technique, along with advances in detector technology, have brought about a rapidly expanding area of research into the optimisation of Compton camera systems, which relies on significant input from Monte Carlo simulations. In this paper, the functionality of a Compton camera application that has been integrated into GAMOS, the GEANT4-based Architecture for Medicine-Oriented Simulations, is described. The application simplifies the use of GEANT4 for Monte Carlo investigations by employing a script-based language and plug-in technology. To demonstrate the use of the Compton camera application, simulated data have been generated using the GAMOS application and acquired through experiment for a preliminary validation, using a Compton camera configured with double sided high purity germanium strip detectors. Energy spectra and reconstructed images for the data sets are presented.
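The Compton kinematics used for reconstruction reduce, for each event, to a cone whose opening angle follows from the energies measured in the scatter and absorber detectors. A minimal sketch of that relation (energies in keV; the 662 keV example source is illustrative):

```python
import math

ME_C2 = 510.999  # electron rest energy [keV]

def cone_angle_deg(e_initial_kev: float, e_deposited_kev: float) -> float:
    """Compton scattering angle from the measured energies:
       cos(theta) = 1 - me*c^2 * (1/E' - 1/E0), with E' = E0 - E_dep."""
    e_scattered = e_initial_kev - e_deposited_kev
    cos_t = 1.0 - ME_C2 * (1.0 / e_scattered - 1.0 / e_initial_kev)
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_t))))

# A 662 keV photon depositing 200 keV in the scatter detector:
print(f"{cone_angle_deg(662.0, 200.0):.1f} deg")  # roughly 48 deg
```

Intersecting many such cones in image space is what produces the reconstructed source position.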

  3. Software engineering and Ada in design

    NASA Technical Reports Server (NTRS)

    Oneill, Don

    1986-01-01

    Modern software engineering promises significant reductions in software costs and improvements in software quality. The Ada language is the focus for these software methodology and tool improvements. The IBM FSD approach, including the software engineering practices that guide the systematic design and development of software products and the management of the software process are examined. The revised Ada design language adaptation is revealed. This four level design methodology is detailed including the purpose of each level, the management strategy that integrates the software design activity with the program milestones, and the technical strategy that maps the Ada constructs to each level of design. A complete description of each design level is provided along with specific design language recording guidelines for each level. Finally, some testimony is offered on education, tools, architecture, and metrics resulting from project use of the four level Ada design language adaptation.

  4. Calculation of extrapolation curves in the 4π(LS)β-γ coincidence technique with the Monte Carlo code Geant4.

    PubMed

    Bobin, C; Thiam, C; Bouchard, J

    2016-03-01

    At LNE-LNHB, a liquid scintillation (LS) detection setup designed for Triple to Double Coincidence Ratio (TDCR) measurements is also used in the β-channel of a 4π(LS)β-γ coincidence system. This LS counter based on 3 photomultipliers was first modeled using the Monte Carlo code Geant4 to enable the simulation of optical photons produced by scintillation and Cerenkov effects. This stochastic modeling was especially designed for the calculation of double and triple coincidences between photomultipliers in TDCR measurements. In the present paper, this TDCR-Geant4 model is extended to 4π(LS)β-γ coincidence counting to enable the simulation of the efficiency-extrapolation technique by the addition of a γ-channel. This simulation tool aims at the prediction of systematic biases in activity determination due to possible non-linearity of efficiency-extrapolation curves. First results are described in the case of the standardization of (59)Fe. The variation of the γ-efficiency in the β-channel due to the Cerenkov emission is investigated in the case of the activity measurements of (54)Mn. The problem of the non-linearity between β-efficiencies is featured in the case of the efficiency tracing technique for the activity measurements of (14)C using (60)Co as a tracer.
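In the efficiency-extrapolation technique, the measured quantity Nβ·Nγ/Nc plotted against (1-εβ)/εβ is fitted and extrapolated to εβ → 1, where the intercept gives the activity; any non-linearity of the curve biases that intercept. A toy sketch with synthetic, invented data (a linear response is assumed):

```python
import numpy as np

N0_true = 5000.0                      # "true" activity [Bq], invented
eff = np.linspace(0.70, 0.95, 8)      # beta efficiencies of the runs
x = (1.0 - eff) / eff                 # extrapolation variable

# Synthetic measurements: linear response plus small counting noise.
rng = np.random.default_rng(42)
y = N0_true * (1.0 + 0.8 * x) + rng.normal(0.0, 5.0, x.size)

# Linear fit; the intercept at x = 0 (i.e. eff = 1) estimates the activity.
slope, intercept = np.polyfit(x, y, 1)
print(f"extrapolated activity: {intercept:.1f} Bq")
```

A Monte Carlo model like the one described above can generate such curves for realistic decay schemes and check whether the linear extrapolation is actually unbiased.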

  5. Efficient voxel navigation for proton therapy dose calculation in TOPAS and Geant4

    PubMed Central

    Schümann, J.; Paganetti, H.; Shin, J.; Faddegon, B.; Perl, J.

    2012-01-01

    A key task within all Monte Carlo particle transport codes is navigation, the calculation to determine at each particle step what volume the particle may be leaving and what volume the particle may be entering. Navigation should be optimized to the specific geometry at hand. For patient dose calculation, this geometry generally involves voxelized computed tomography (CT) data. We investigated the efficiency of navigation algorithms on currently available voxel geometry parameterizations in the Monte Carlo simulation package Geant4: G4VPVParameterisation, G4VNestedParameterisation and G4PhantomParameterisation, the latter with and without boundary skipping, a method where neighboring voxels with the same Hounsfield Unit are combined into one larger voxel. A fourth parameterization approach (MGHParameterization), developed in-house before the latter two parameterizations became available in Geant4, was also included in this study. All simulations were performed using TOPAS, a TOol for PArticle Simulations layered on top of Geant4. Runtime comparisons were performed on three distinct patient CT data sets: a head and neck, a liver, and a prostate patient. We included an additional version of these three patients in which all voxels, including the air voxels outside of the patient, were uniformly set to water in the runtime study. The G4VPVParameterisation offers two optimization options: one yields a 60-150 times slower simulation speed; the other is comparable in speed but requires 15-19 times more memory than the other parameterizations. We found the average CPU time used for the simulation relative to G4VNestedParameterisation to be 1.014 for G4PhantomParameterisation without boundary skipping and 1.015 for MGHParameterization. The average run time ratio for G4PhantomParameterisation with and without boundary skipping for our heterogeneous data was 0.97:1.
The calculated dose distributions agreed with the reference distribution for all but the G4Phantom
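Boundary skipping pays off because merging equal-HU neighbors removes voxel boundaries the navigator would otherwise have to cross step by step. A sketch of the idea on a single row of voxels (run-length merging; the HU values are illustrative):

```python
def merge_voxels(row):
    """Collapse runs of neighboring voxels with the same value (e.g. HU)
    into (value, count) pairs. Fewer boundaries means fewer navigation
    steps, which is the idea behind boundary skipping."""
    merged = []
    for v in row:
        if merged and merged[-1][0] == v:
            merged[-1] = (v, merged[-1][1] + 1)  # extend the current run
        else:
            merged.append((v, 1))                # start a new run
    return merged

row = [0, 0, 0, 40, 40, 1000, 40, 40, 40]
print(merge_voxels(row))  # [(0, 3), (40, 2), (1000, 1), (40, 3)]
```

On a uniform water phantom every row collapses to a single run, which is why homogeneous geometries benefit most from this optimization.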

  6. Galactic Cosmic Rays and Lunar Secondary Particles from Solar Minimum to Maximum: CRaTER Observations and Geant4 Modeling

    NASA Astrophysics Data System (ADS)

    Looper, M. D.; Mazur, J. E.; Blake, J. B.; Spence, H. E.; Schwadron, N.; Golightly, M. J.; Case, A. W.; Kasper, J. C.; Townsend, L. W.; Wilson, J. K.

    2014-12-01

    The Lunar Reconnaissance Orbiter mission was launched in 2009 during the recent deep and extended solar minimum, with the highest galactic cosmic ray (GCR) fluxes observed since the beginning of the space era. Its Cosmic Ray Telescope for the Effects of Radiation (CRaTER) instrument was designed to measure the spectra of energy deposits in silicon detectors shielded behind pieces of tissue equivalent plastic, simulating the self-shielding provided by an astronaut's body around radiation-sensitive organs. The CRaTER data set now covers the evolution of the GCR environment near the moon during the first five years of development of the present solar cycle. We will present these observations, along with Geant4 modeling to illustrate the varying particle contributions to the energy-deposit spectra. CRaTER has also measured protons traveling up from the lunar surface after their creation during GCR interactions with surface material, and we will report observations and modeling of the energy and angular distributions of these "albedo" protons.

  7. 3D polymer gel dosimetry and Geant4 Monte Carlo characterization of novel needle based X-ray source

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Sozontov, E.; Safronov, V.; Gutman, G.; Strumban, E.; Jiang, Q.; Li, S.

    2010-11-01

    In recent years, there have been a few attempts to develop low-energy x-ray radiation sources alternative to the conventional radioisotopes used in brachytherapy. So far, all efforts have been centered around the intent to design an interstitial miniaturized x-ray tube. Though direct irradiation of tumors looks very promising, the known insertable miniature x-ray tubes have many limitations: (a) difficulties with focusing and steering the electron beam to the target; (b) the necessity to cool the target to increase x-ray production efficiency; (c) the impracticability of reducing the diameter of the miniaturized x-ray tube below 4 mm (the requirement to decrease the diameter of the x-ray tube and the need to have a cooling system for the target are mutually exclusive); and (d) significant limitations in changing the shape and energy of the emitted radiation. The specific aim of this study is to demonstrate the feasibility of a new concept for an insertable low-energy needle x-ray device, based on simulation with the Geant4 Monte Carlo code, and to measure the dose rate distribution for low-energy (17.5 keV) x-ray radiation with 3D polymer gel dosimetry.

  8. Software Design Improvements. Part 1; Software Benefits and Limitations

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R.; Packard, Michael H.; Ziemianski, Tom

    1997-01-01

    Computer hardware and associated software have been used for many years to process accounting information, to analyze test data and to perform engineering analysis. Now computers and software also control everything from automobiles to washing machines, and the number and type of applications are growing at an exponential rate. The size of individual programs has shown similar growth. Furthermore, software and hardware are used to monitor and/or control potentially dangerous products and safety-critical systems. These uses include everything from airplanes and braking systems to medical devices and nuclear plants. The question is: how can this hardware and software be made more reliable? Also, how can software quality be improved? What methodology needs to be provided for large and small software products to improve the design, and how can software be verified?

  9. Simulation of Auger electron emission from nanometer-size gold targets using the Geant4 Monte Carlo simulation toolkit

    NASA Astrophysics Data System (ADS)

    Incerti, S.; Suerfu, B.; Xu, J.; Ivantchenko, V.; Mantero, A.; Brown, J. M. C.; Bernal, M. A.; Francis, Z.; Karamitros, M.; Tran, H. N.

    2016-04-01

    A revised atomic deexcitation framework for the Geant4 general purpose Monte Carlo toolkit, capable of simulating full Auger deexcitation cascades, was implemented in the June 2015 release (version 10.2 Beta). An overview of this refined framework and testing of its capabilities is presented for the irradiation of gold nanoparticles (NP) with keV photon and MeV proton beams. The resultant energy spectra of secondary particles created within and escaping the NP are analyzed and discussed. It is anticipated that this new functionality will improve and increase the use of Geant4 in medical physics, radiobiology, nanomedicine research and other low-energy physics fields.

  10. FLOWTRAN-TF software design

    SciTech Connect

    Aleman, S.E.; Flach, G.P.; Hamm, L.L.; Lee, S.Y.; Smith, F.G. III

    1993-02-01

    FLOWTRAN-TF was created to analyze an individual Mk22 fuel assembly during a large break Loss Of Coolant Accident (LOCA) scenario involving the Savannah River Site K-reactor after the initial few seconds of the transient. During the initial few seconds reactor cooling is limited by the static or Ledinegg flow instability phenomenon. The predecessor FLOWTRAN code was developed to analyze this portion of a LOCA. In the several seconds following the break, a significant fraction of the reactor coolant inventory leaks out the break, Emergency Cooling System (ECS) flow is initiated, and air enters the primary coolant circulation loops. Reactor fuel assemblies are cooled by a low flowrate air-water downflow. Existing commercial nuclear industry thermal-hydraulic codes were judged inadequate for detailed modeling of a Mk22 fuel assembly because the application involves a ribbed annular geometry, low pressure, downflow and an air-water mixture. FLOWTRAN-TF is a two-phase thermal-hydraulics code of similar technology to existing commercial codes such as RELAP and TRAC but customized for Savannah River Site applications. The main features and capabilities of FLOWTRAN-TF are: detailed Mk22 fuel assembly ribbed annular geometry; conjugate heat transfer; detailed neutronic power distribution; three-dimensional heat conduction in Mk22 fuel and target tubes; two-dimensional coolant flow in channels (axial, azimuthal); single-phase and/or two-phase fluid (gas, liquid and/or gas-liquid); two-component (air, water); constitutive models applicable to low pressure air-water downflow in ribbed annular channels. The design of FLOWTRAN-TF is described in detail in this report, which serves as the Software Design Report in accordance with Quality Assurance Procedure IV-4, Rev. 0, "Software Design and Implementation," in the 1Q34 manual.

  12. Signal pulse emulation for scintillation detectors using Geant4 Monte Carlo with light tracking simulation

    NASA Astrophysics Data System (ADS)

    Ogawara, R.; Ishikawa, M.

    2016-07-01

    The anode pulse of a photomultiplier tube (PMT) coupled with a scintillator is used for pulse shape discrimination (PSD) analysis. We have developed a novel emulation technique for the PMT anode pulse based on optical photon transport and a PMT response function. The photon transport was calculated using the Geant4 Monte Carlo code, and the response function was obtained with a BC408 organic scintillator. The obtained percentage RMS values of the difference between the measured and simulated pulses, using suitable scintillation properties, were 2.41%, 2.58% and 2.16% for GSO:Ce (0.4, 1.0 and 1.5 mol%), 2.01% for LaBr3:Ce and 3.32% for BGO scintillators, respectively. The proposed technique demonstrates high reproducibility of the measured pulse and can be applied to simulation studies of various radiation measurements.
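The emulation idea (transport the optical photons, then fold their arrival times with a single-photoelectron response) can be sketched as a convolution. The exponential scintillation decay and the two-exponential response shape below are illustrative assumptions, not the measured response function of this record:

```python
import numpy as np

def emulate_anode_pulse(arrival_times_ns, total_ns=200.0, dt=0.1,
                        tau_rise=2.0, tau_fall=10.0):
    """Histogram photon arrival times and convolve with an assumed
    two-exponential single-photoelectron response (tau values are
    illustrative, not a measured PMT response)."""
    t = np.arange(0.0, total_ns, dt)
    hist, _ = np.histogram(arrival_times_ns,
                           bins=np.arange(0.0, total_ns + dt, dt))
    spe = np.exp(-t / tau_fall) - np.exp(-t / tau_rise)  # response shape
    spe /= spe.max()
    pulse = np.convolve(hist, spe)[: t.size]             # emulated anode pulse
    return t, pulse

# Scintillation photons with an assumed 30 ns exponential decay:
rng = np.random.default_rng(0)
arrivals = rng.exponential(scale=30.0, size=5000)
t, pulse = emulate_anode_pulse(arrivals)
```

In the full method the arrival times come from Geant4 optical-photon tracking rather than a sampled exponential, which is what lets the emulated pulse reflect geometry and light-collection effects.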

  13. Geant4 Application for Simulating the Propagation of Cosmic Rays through the Earth's Magnetosphere

    NASA Astrophysics Data System (ADS)

    Desorgher, L.; Flueckiger, E.O.; Buetikofer, R.; Moser, M.R.

    2003-07-01

    We have developed a Geant4 application to simulate the propagation of cosmic rays through the Earth's magnetosphere. The application computes the motion of charged particles through advanced magnetospheric magnetic field models such as the Tsyganenko 2001 model. It allows the user to determine cosmic ray cutoff rigidities and asymptotic directions of incidence for user-defined observing positions, directions, and times. By using the new generation of Tsyganenko models, we can analyse the variation of cutoff rigidities and asymptotic directions during magnetic storms as a function of the Dst index and of the solar wind dynamic pressure. The paper describes the application, in particular its visualisation potential, and simulation results. Acknowledgments. This work was supported by the Swiss National Science Foundation, grant 20-67092.01 and by the QINETIQ contract CU009-0000028872 in the frame of the ESA/ESTEC SEPTIMESS project.
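For orientation, the cutoff rigidities that such particle tracing computes numerically have a classic closed-form approximation for a pure dipole field, the Störmer formula. A sketch (the 14.9 GV constant depends on the adopted dipole moment and is only indicative; full tracing through Tsyganenko-type models, as above, supersedes this):

```python
import math

def stormer_vertical_cutoff_gv(geomag_lat_deg: float,
                               r_earth_radii: float = 1.0) -> float:
    """Classic Stormer approximation for the vertical geomagnetic cutoff
    rigidity, Rc ~ 14.9 * cos^4(lambda) / r^2 [GV], dipole field only.
    The 14.9 GV constant is tied to the assumed dipole moment."""
    lam = math.radians(geomag_lat_deg)
    return 14.9 * math.cos(lam) ** 4 / r_earth_radii ** 2

print(f"equator: {stormer_vertical_cutoff_gv(0):.1f} GV, "
      f"60 deg: {stormer_vertical_cutoff_gv(60):.2f} GV")
```

The steep cos^4 dependence is why polar stations see far lower cutoffs than equatorial ones, and why storm-time distortions of the field (captured only by the numerical tracing) matter most at mid latitudes.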

  14. Signal pulse emulation for scintillation detectors using Geant4 Monte Carlo with light tracking simulation.

    PubMed

    Ogawara, R; Ishikawa, M

    2016-07-01

    The anode pulse of a photomultiplier tube (PMT) coupled with a scintillator is used for pulse shape discrimination (PSD) analysis. We have developed a novel emulation technique for the PMT anode pulse based on optical photon transport and a PMT response function. The photon transport was calculated using Geant4 Monte Carlo code and the response function with a BC408 organic scintillator. The obtained percentage RMS value of the difference between the measured and simulated pulse with suitable scintillation properties using GSO:Ce (0.4, 1.0, 1.5 mol%), LaBr3:Ce and BGO scintillators were 2.41%, 2.58%, 2.16%, 2.01%, and 3.32%, respectively. The proposed technique demonstrates high reproducibility of the measured pulse and can be applied to simulation studies of various radiation measurements. PMID:27475602

  15. Modeling spallation reactions in tungsten and uranium targets with the Geant4 toolkit

    NASA Astrophysics Data System (ADS)

    Malyshkin, Yury; Pshenichnov, Igor; Mishustin, Igor; Greiner, Walter

    2012-02-01

    We study primary and secondary reactions induced by 600 MeV proton beams in monolithic cylindrical targets made of natural tungsten and uranium by using Monte Carlo simulations with the Geant4 toolkit [1-3]. The Bertini intranuclear cascade model, the Binary cascade model and the IntraNuclear Cascade Liège (INCL) with the ABLA model [4] were used as options to describe nuclear reactions. Fission cross sections, neutron multiplicities and mass distributions of fragments for 238U fission induced by 25.6 and 62.9 MeV protons are calculated and compared to recent experimental data [5]. Time distributions of neutron leakage from the targets and heat depositions are calculated. This project is supported by Siemens Corporate Technology.

  16. Validation of GEANT4 Monte Carlo models with a highly granular scintillator-steel hadron calorimeter

    NASA Astrophysics Data System (ADS)

    Adloff, C.; Blaha, J.; Blaising, J.-J.; Drancourt, C.; Espargilière, A.; Gaglione, R.; Geffroy, N.; Karyotakis, Y.; Prast, J.; Vouters, G.; Francis, K.; Repond, J.; Schlereth, J.; Smith, J.; Xia, L.; Baldolemar, E.; Li, J.; Park, S. T.; Sosebee, M.; White, A. P.; Yu, J.; Buanes, T.; Eigen, G.; Mikami, Y.; Watson, N. K.; Mavromanolakis, G.; Thomson, M. A.; Ward, D. R.; Yan, W.; Benchekroun, D.; Hoummada, A.; Khoulaki, Y.; Apostolakis, J.; Dotti, A.; Folger, G.; Ivantchenko, V.; Uzhinskiy, V.; Benyamna, M.; Cârloganu, C.; Fehr, F.; Gay, P.; Manen, S.; Royer, L.; Blazey, G. C.; Dyshkant, A.; Lima, J. G. R.; Zutshi, V.; Hostachy, J.-Y.; Morin, L.; Cornett, U.; David, D.; Falley, G.; Gadow, K.; Göttlicher, P.; Günter, C.; Hermberg, B.; Karstensen, S.; Krivan, F.; Lucaci-Timoce, A.-I.; Lu, S.; Lutz, B.; Morozov, S.; Morgunov, V.; Reinecke, M.; Sefkow, F.; Smirnov, P.; Terwort, M.; Vargas-Trevino, A.; Feege, N.; Garutti, E.; Marchesini, I.; Ramilli, M.; Eckert, P.; Harion, T.; Kaplan, A.; Schultz-Coulon, H.-Ch; Shen, W.; Stamen, R.; Bilki, B.; Norbeck, E.; Onel, Y.; Wilson, G. W.; Kawagoe, K.; Dauncey, P. D.; Magnan, A.-M.; Bartsch, V.; Wing, M.; Salvatore, F.; Calvo Alamillo, E.; Fouz, M.-C.; Puerta-Pelayo, J.; Bobchenko, B.; Chadeeva, M.; Danilov, M.; Epifantsev, A.; Markin, O.; Mizuk, R.; Novikov, E.; Popov, V.; Rusinov, V.; Tarkovsky, E.; Kirikova, N.; Kozlov, V.; Smirnov, P.; Soloviev, Y.; Buzhan, P.; Ilyin, A.; Kantserov, V.; Kaplin, V.; Karakash, A.; Popova, E.; Tikhomirov, V.; Kiesling, C.; Seidel, K.; Simon, F.; Soldner, C.; Szalay, M.; Tesar, M.; Weuste, L.;
    Amjad, M. S.; Bonis, J.; Callier, S.; Conforti di Lorenzo, S.; Cornebise, P.; Doublet, Ph; Dulucq, F.; Fleury, J.; Frisson, T.; van der Kolk, N.; Li, H.; Martin-Chassard, G.; Richard, F.; de la Taille, Ch; Pöschl, R.; Raux, L.; Rouëné, J.; Seguin-Moreau, N.; Anduze, M.; Boudry, V.; Brient, J.-C.; Jeans, D.; Mora de Freitas, P.; Musat, G.; Reinhard, M.; Ruan, M.; Videau, H.; Bulanek, B.; Zacek, J.; Cvach, J.; Gallus, P.; Havranek, M.; Janata, M.; Kvasnicka, J.; Lednicky, D.; Marcisovsky, M.; Polak, I.; Popule, J.; Tomasek, L.; Tomasek, M.; Ruzicka, P.; Sicho, P.; Smolik, J.; Vrba, V.; Zalesak, J.; Belhorma, B.; Ghazlane, H.; Takeshita, T.; Uozumi, S.; Götze, M.; Hartbrich, O.; Sauer, J.; Weber, S.; Zeitnitz, C.

    2013-07-01

    Calorimeters with a high granularity are a fundamental requirement of the Particle Flow paradigm. This paper focuses on the prototype of a hadron calorimeter with analog readout, consisting of thirty-eight scintillator layers alternating with steel absorber planes. The scintillator plates are finely segmented into tiles individually read out via Silicon Photomultipliers. The presented results are based on data collected with pion beams in the energy range from 8 GeV to 100 GeV. The fine segmentation of the sensitive layers and the high sampling frequency allow for an excellent reconstruction of the spatial development of hadronic showers. A comparison between data and Monte Carlo simulations is presented, concerning both the longitudinal and lateral development of hadronic showers and the global response of the calorimeter. The performance of several GEANT4 physics lists with respect to these observables is evaluated.

  17. Comparison of MCNPX and Geant4 proton energy deposition predictions for clinical use

    PubMed Central

    Titt, U.; Bednarz, B.; Paganetti, H.

    2012-01-01

    Several different Monte Carlo codes are currently used at proton therapy centers to improve dose predictions over standard methods based on analytical or semi-empirical dose algorithms. There is a need to better ascertain the differences between proton dose predictions from the available Monte Carlo codes. In this investigation Geant4 and MCNPX, the two most widely used Monte Carlo codes for proton therapy applications, were used to predict energy deposition distributions in a variety of geometries, comprising simple water phantoms, water phantoms with complex inserts, and a voxelized geometry based on clinical CT data. Gamma analysis was used to evaluate the differences between the predictions of the two codes. The results show that in all cases the agreement was better than clinical acceptance criteria. PMID:22996039
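
    The gamma analysis used to compare the two codes combines a dose-difference criterion with a distance-to-agreement criterion. A minimal 1D sketch of the idea (the 3%/3 mm criteria and the toy depth-dose curve are illustrative assumptions, not the paper's settings):

    ```python
    import numpy as np

    def gamma_index_1d(x, dose_ref, dose_eval, dd=0.03, dta=3.0):
        """1D gamma index: for each reference point, minimise the combined
        dose-difference / distance-to-agreement metric over evaluated points.
        dd: dose criterion as a fraction of the maximum reference dose;
        dta: distance-to-agreement criterion in mm."""
        d_norm = dd * dose_ref.max()
        gamma = np.empty_like(dose_ref)
        for i, (xi, di) in enumerate(zip(x, dose_ref)):
            dist_term = ((x - xi) / dta) ** 2
            dose_term = ((dose_eval - di) / d_norm) ** 2
            gamma[i] = np.sqrt(np.min(dist_term + dose_term))
        return gamma

    # identical distributions agree perfectly (gamma == 0 everywhere);
    # a 1% global scaling still passes the 3%/3 mm criterion (gamma <= 1)
    x = np.linspace(0.0, 100.0, 201)           # positions in mm
    d = np.exp(-((x - 50.0) / 20.0) ** 2)      # toy depth-dose curve
    pass_rate = np.mean(gamma_index_1d(x, d, 1.01 * d) <= 1.0)
    ```

    A point passes the test where γ ≤ 1; clinical acceptance is usually stated as a minimum pass rate over the whole distribution.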

  18. GEANT4 Application for the Simulation of the Head of a Siemens Primus Linac

    SciTech Connect

    Cortes-Giraldo, M. A.; Quesada, J. M.; Gallardo, M. I.

    2010-04-26

    The Monte Carlo simulation of the head of a Siemens Primus linac used at the Virgen Macarena Hospital (Sevilla, Spain) has been performed using the code GEANT4, version 9.2. In this work, the main features of the application built by our group are presented. They are mainly focused on optimizing the performance of the simulation. The geometry, including the water phantom, has been entirely wrapped by a shielding volume which discards all particles escaping through its walls; this saves a factor of four in simulation time. An interface to read and write phase-space files in the IAEA format has also been developed to save CPU time in our simulations. Finally, some calculations of the dose absorbed in the water phantom have been performed and compared with the results given by EGSnrc and with experimental data obtained for the calibration of the machine.

  19. Nuclear reaction measurements on tissue-equivalent materials and GEANT4 Monte Carlo simulations for hadrontherapy

    NASA Astrophysics Data System (ADS)

    De Napoli, M.; Romano, F.; D'Urso, D.; Licciardello, T.; Agodi, C.; Candiano, G.; Cappuzzello, F.; Cirrone, G. A. P.; Cuttone, G.; Musumarra, A.; Pandola, L.; Scuderi, V.

    2014-12-01

    When a carbon beam interacts with human tissues, many secondary fragments are produced in the tumor region and the surrounding healthy tissues. Precise dose calculations in hadrontherapy therefore require Monte Carlo tools equipped with complex nuclear reaction models. To obtain realistic predictions, however, simulation codes must be validated against experimental results; the wider the dataset, the more finely the models can be tuned. Since no fragmentation data for tissue-equivalent materials at Fermi energies are available in the literature, we measured the secondary fragments produced by the interaction of a 55.6 MeV u-1 12C beam with thick muscle and cortical bone targets. Three reaction models used by the Geant4 Monte Carlo code, the Binary Light Ion Cascade, the Quantum Molecular Dynamics and the Liège Intranuclear Cascade, have been benchmarked against the collected data. In this work we present the experimental results and discuss the predictive power of the above-mentioned models.

  20. Geant4 Predictions of Energy Spectra in Typical Space Radiation Environment

    NASA Technical Reports Server (NTRS)

    Sabra, M. S.; Barghouty, A. F.

    2014-01-01

    Accurate knowledge of energy spectra inside spacecraft is important for protecting astronauts as well as sensitive electronics from the harmful effects of space radiation. Such knowledge allows one to confidently map the radiation environment inside the vehicle. The purpose of this talk is to present preliminary calculations of energy spectra inside a spherical shell shield and behind a slab in a typical space radiation environment using the 3D Monte Carlo transport code Geant4. We have simulated proton and iron isotropic sources and beams impinging on aluminum and gallium arsenide (GaAs) targets at energies of 0.2, 0.6, 1, and 10 GeV/u. If time permits, other radiation sources and beams (_, C, O) and targets (C, Si, Ge, water) will be presented. The results are compared to ground-based measurements where available.

  1. Development and validation of a GEANT4 radiation transport code for CT dosimetry.

    PubMed

    Carver, D E; Kost, S D; Fernald, M J; Lewis, K G; Fraser, N D; Pickens, D R; Price, R R; Stabin, M G

    2015-04-01

    The authors have created a radiation transport code using the GEANT4 Monte Carlo toolkit to simulate pediatric patients undergoing CT examinations. The focus of this paper is to validate their simulation with real-world physical dosimetry measurements using two independent techniques. Exposure measurements were made with a standard 100-mm CT pencil ionization chamber, and absorbed doses were also measured using optically stimulated luminescent (OSL) dosimeters. Measurements were made in air with a standard 16-cm acrylic head phantom and with a standard 32-cm acrylic body phantom. Physical dose measurements determined from the ionization chamber in air for 100 and 120 kVp beam energies were used to derive photon-fluence calibration factors. Both ion chamber and OSL measurement results provide useful comparisons in the validation of the Monte Carlo simulations. It was found that simulated and measured CTDI values were within an overall average of 6% of each other. PMID:25706135
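
    The CTDI values being compared follow from standard definitions: CTDI100 normalises the integral of the dose profile over the 100-mm chamber to the nominal beam width N·T, and the weighted CTDIw combines the centre and peripheral phantom measurements. A sketch of these textbook relations (the numbers in the example are illustrative, not the paper's data):

    ```python
    def ctdi_100(profile_integral_mGy_mm, n_slices, slice_thickness_mm):
        """CTDI100 = (1 / (N*T)) * integral of D(z) over the 100 mm chamber."""
        return profile_integral_mGy_mm / (n_slices * slice_thickness_mm)

    def ctdi_w(ctdi_center, ctdi_periphery):
        """Weighted CTDI: 1/3 centre + 2/3 periphery (mGy)."""
        return ctdi_center / 3.0 + 2.0 * ctdi_periphery / 3.0

    def percent_difference(simulated, measured):
        """Relative deviation used to judge agreement (e.g. the 6% average)."""
        return 100.0 * abs(simulated - measured) / measured
    ```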

  2. Development and validation of a GEANT4 radiation transport code for CT dosimetry

    PubMed Central

    Carver, DE; Kost, SD; Fernald, MJ; Lewis, KG; Fraser, ND; Pickens, DR; Price, RR; Stabin, MG

    2014-01-01

    We have created a radiation transport code using the GEANT4 Monte Carlo toolkit to simulate pediatric patients undergoing CT examinations. The focus of this paper is to validate our simulation with real-world physical dosimetry measurements using two independent techniques. Exposure measurements were made with a standard 100-mm CT pencil ionization chamber, and absorbed doses were also measured using optically stimulated luminescent (OSL) dosimeters. Measurements were made in air, a standard 16-cm acrylic head phantom, and a standard 32-cm acrylic body phantom. Physical dose measurements determined from the ionization chamber in air for 100 and 120 kVp beam energies were used to derive photon-fluence calibration factors. Both ion chamber and OSL measurement results provide useful comparisons in the validation of our Monte Carlo simulations. We found that simulated and measured CTDI values were within an overall average of 6% of each other. PMID:25706135

  3. Radiation quality of cosmic ray nuclei studied with Geant4-based simulations

    NASA Astrophysics Data System (ADS)

    Burigo, Lucas N.; Pshenichnov, Igor A.; Mishustin, Igor N.; Bleicher, Marcus

    2014-04-01

    In future deep-space missions a spacecraft will be exposed to a non-negligible flux of high charge and energy (HZE) particles present in the galactic cosmic rays (GCR). One of the major concerns of manned missions is the impact on humans of the complex radiation fields which result from the interactions of HZE particles with the spacecraft materials. The radiation quality of several ions representing GCR is investigated by calculating microdosimetry spectra. A Geant4-based Monte Carlo model for Heavy Ion Therapy (MCHIT) is used to simulate microdosimetry data for HZE particles in extended media where fragmentation reactions play a certain role. Our model is able to reproduce measured microdosimetry spectra for H, He, Li, C and Si in the energy range of 150-490 MeV/u. The effect of nuclear fragmentation on the relative biological effectiveness (RBE) of He, Li and C is estimated and found to be below 10%.
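
    Microdosimetry spectra are commonly summarised by the frequency-mean and dose-mean lineal energy, y = ε / l̄, for single-event energy depositions ε in a site with mean chord length l̄. A minimal sketch of those two moments (the site size and event values are illustrative, not MCHIT output):

    ```python
    import numpy as np

    def lineal_energy_means(eps_keV, mean_chord_um):
        """Frequency-mean (yF) and dose-mean (yD) lineal energy from
        single-event energy depositions eps (keV) in a site whose mean
        chord length is mean_chord_um (micrometres)."""
        y = np.asarray(eps_keV, dtype=float) / mean_chord_um   # keV/um
        y_f = y.mean()                    # frequency-mean
        y_d = (y ** 2).sum() / y.sum()    # dose-mean (weights events by y)
        return y_f, y_d

    y_f, y_d = lineal_energy_means([1.0, 2.0, 3.0], mean_chord_um=1.0)
    ```

    The dose-mean yD weights each event by its own lineal energy, which is why it is the moment most often linked to radiation quality and RBE.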

  4. Geant4 calculations for space radiation shielding material Al2O3

    NASA Astrophysics Data System (ADS)

    Capali, Veli; Acar Yesil, Tolga; Kaya, Gokhan; Kaplan, Abdullah; Yavuz, Mustafa; Tilki, Tahir

    2015-07-01

    Aluminium oxide, Al2O3, is one of the most widely used materials in engineering applications. It is a significant aluminium compound because of its hardness, and serves as a refractory material owing to its high melting point. It has engineering applications in diverse fields such as ballistic armour systems, wear components, electrical and electronic substrates, automotive parts, components for the electrical industry and aero-engines. It is also used as a dosimeter for radiation protection and therapy applications owing to its optically stimulated luminescence properties. In this study, stopping powers and penetration distances have been calculated for alpha, proton, electron and gamma particles in the space radiation shielding material Al2O3 for incident energies of 1 keV - 1 GeV using the GEANT4 calculation code.
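
    Penetration distance follows from the stopping power via the continuous-slowing-down approximation, R(E0) = ∫0^E0 dE / S(E). A numerical sketch with a toy power-law stopping power (the Bragg-Kleeman-like exponent and constant are assumptions for illustration, not Al2O3 data):

    ```python
    import numpy as np

    def csda_range(E0, stopping_power, n=20000):
        """CSDA range R = integral of dE / S(E) from ~0 to E0 (trapezoid rule).
        stopping_power(E) must return MeV/cm for E in MeV; R is in cm."""
        E = np.linspace(1e-6, E0, n)
        f = 1.0 / stopping_power(E)
        return float(np.sum((f[1:] + f[:-1]) * np.diff(E)) / 2.0)

    # toy stopping power S(E) = 100 * E**-0.8 MeV/cm, which gives the
    # analytic range R(E0) = E0**1.8 / (1.8 * 100)
    S = lambda E: 100.0 * E ** -0.8
    r_num = csda_range(100.0, S)
    r_ana = 100.0 ** 1.8 / 180.0
    ```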

  5. ALGEBRA: ALgorithm for the heterogeneous dosimetry based on GEANT4 for BRAchytherapy.

    PubMed

    Afsharpour, H; Landry, G; D'Amours, M; Enger, S; Reniers, B; Poon, E; Carrier, J-F; Verhaegen, F; Beaulieu, L

    2012-06-01

    Task group 43 (TG43)-based dosimetry algorithms are efficient for brachytherapy dose calculation in water. However, human tissues have chemical compositions and densities different from those of water. Moreover, the mutual shielding effect of seeds on each other (interseed attenuation) is neglected in TG43-based dosimetry platforms. The scientific community has expressed the need for an accurate dosimetry platform in brachytherapy. The purpose of this paper is to present ALGEBRA, a Monte Carlo platform for dosimetry in brachytherapy which is sufficiently fast and accurate for clinical and research purposes. ALGEBRA is based on the GEANT4 Monte Carlo code and is capable of handling the DICOM RT standard to recreate a virtual model of the treated site. Here, the performance of ALGEBRA is presented for the special case of LDR brachytherapy in permanent prostate and breast seed implants. However, the platform is also capable of handling other treatments such as HDR brachytherapy.

  6. Comparing Geant4 hadronic models for the WENDI-II rem meter response function.

    PubMed

    Vanaudenhove, T; Dubus, A; Pauly, N

    2013-01-01

    The WENDI-II rem meter is one of the most popular neutron dosemeters used to assess a useful radiation protection quantity, namely the ambient dose equivalent. This is due to its high sensitivity and to an energy response that approximately follows the fluence-to-ambient-dose-equivalent conversion function from thermal energies up to 5 GeV. Simulating the WENDI-II response function with the Geant4 toolkit is thus well suited to comparing the low- and high-energy hadronic models provided by this Monte Carlo code. The results showed that the thermal treatment of hydrogen in polyethylene for neutrons below 4 eV has a great influence over the whole detector range. Above 19 MeV, both the Bertini Cascade and Binary Cascade models show good agreement with results found in the literature, while the low-energy parameterised models are not suitable for this application.
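
    The quantity the WENDI-II approximates is obtained by folding the neutron fluence spectrum with the fluence-to-ambient-dose-equivalent conversion coefficients, H*(10) = ∫ Φ(E) h(E) dE. A minimal folding sketch (the flat spectrum and constant coefficient are placeholders, not tabulated ICRP values):

    ```python
    import numpy as np

    def ambient_dose_equivalent(E, fluence, h):
        """H*(10) = integral of fluence(E) * h(E) dE (trapezoid rule).
        E in MeV, fluence per cm^2 per MeV, h in pSv*cm^2 -> result in pSv."""
        y = fluence * h
        return float(np.sum((y[1:] + y[:-1]) * np.diff(E)) / 2.0)

    # placeholder: unit fluence and a constant 2 pSv*cm^2 coefficient
    E = np.linspace(0.0, 10.0, 11)
    H = ambient_dose_equivalent(E, np.ones_like(E), 2.0 * np.ones_like(E))
    ```

    The detector's reading is the same fold with its simulated response function R(E) in place of h(E), which is why a response that tracks h(E) makes a good dosemeter.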

  7. Geant4.10 simulation of geometric model for metaphase chromosome

    NASA Astrophysics Data System (ADS)

    Rafat-Motavalli, L.; Miri-Hakimabad, H.; Bakhtiyari, E.

    2016-04-01

    In this paper, a geometric model of the metaphase chromosome is presented. The model is constructed according to the packing ratio and the dimensions of the structures from the nucleosome up to the chromosome. A B-DNA base pair is used to construct nucleosomes of 200 base pairs. Each chromatin fiber loop, which is the unit of repeat, contains 49,200 bp. This geometry is implemented in the Geant4.10 Monte Carlo simulation toolkit and can be extended to the whole set of metaphase chromosomes and to any application in which a geometrical DNA model is needed. The chromosome base pairs, chromosome lengths, and relative lengths of the chromosomes are calculated, and the calculated relative lengths are compared to those of human chromosomes.
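
    The reported quantities follow from simple bookkeeping on the repeat unit: total base pairs scale with the number of 49,200 bp fibre loops, and relative lengths are each chromosome's share of the set total. A sketch of that bookkeeping (the three-chromosome example is illustrative, not the paper's data):

    ```python
    def chromosome_base_pairs(n_loops, bp_per_loop=49_200):
        """Total base pairs of a chromosome built from n_loops repeat units."""
        return n_loops * bp_per_loop

    def relative_lengths(bp_counts):
        """Each chromosome's base-pair count as a fraction of the set total."""
        total = sum(bp_counts)
        return [bp / total for bp in bp_counts]

    rel = relative_lengths([chromosome_base_pairs(n) for n in (2, 3, 5)])
    ```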

  8. Efficiency transfer using the GEANT4 code of CERN for HPGe gamma spectrometry.

    PubMed

    Chagren, S; Ben Tekaya, M; Reguigui, N; Gharbi, F

    2016-01-01

    In this work we apply the GEANT4 code of CERN to calculate the peak efficiency in high-purity germanium (HPGe) gamma spectrometry using three different procedures. The first is a direct calculation. The second corresponds to the usual case of efficiency transfer between two different configurations at constant emission energy, assuming a reference point-source detection configuration. The third, a new procedure, consists in transferring the peak efficiency between two detection configurations with gamma rays emitted at different energies, assuming a "virtual" reference point-source detection configuration. No pre-optimization of the detector's geometrical characteristics was performed before the transfer, in order to test the ability of the efficiency transfer to reduce the effect of uncertainty in their real magnitude on the quality of the transferred efficiency. The calculated and measured efficiencies were found to be in good agreement for the two investigated methods of efficiency transfer. This agreement shows that the Monte Carlo method, and the GEANT4 code in particular, constitutes an efficient tool for obtaining accurate detection efficiency values. The second investigated efficiency transfer procedure is useful for calibrating an HPGe gamma detector at any emission energy for a voluminous source, using as reference the detection efficiency of a point source emitting at a different energy. The calculations performed in this work were applied to the measurement exercise of the EUROMET428 project, in which the full-energy-peak efficiencies in the energy range 60-2000 keV were evaluated for a typical coaxial p-type HPGe detector and several source configurations: point sources located at various distances from the detector and a cylindrical box containing three matrices. PMID:26623928
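
    Both transfer procedures rest on one ratio: the measured reference efficiency is rescaled by the ratio of the Monte Carlo efficiencies of the target and reference configurations (in the third procedure the two configurations may even differ in emission energy). A sketch of that relation (the numbers are illustrative):

    ```python
    def transfer_efficiency(eff_ref_measured, eff_ref_mc, eff_target_mc):
        """Peak efficiency transferred from a reference configuration to a
        target configuration: the measured reference value is scaled by the
        ratio of the simulated (GEANT4) efficiencies of the two set-ups."""
        return eff_ref_measured * (eff_target_mc / eff_ref_mc)

    # illustrative values: point-source reference vs voluminous target
    eff = transfer_efficiency(eff_ref_measured=0.020,
                              eff_ref_mc=0.025,
                              eff_target_mc=0.005)
    ```

    Because only the ratio of simulated efficiencies enters, systematic errors in the modelled detector geometry partly cancel, which is what makes the transfer robust without pre-optimizing the detector model.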

  10. Geant4-DNA simulations using complex DNA geometries generated by the DnaFabric tool

    NASA Astrophysics Data System (ADS)

    Meylan, S.; Vimont, U.; Incerti, S.; Clairand, I.; Villagrasa, C.

    2016-07-01

    Several DNA representations are used to study radio-induced complex DNA damage, depending on the approach and the required level of granularity. Among all approaches, the mechanistic one requires the most resolved DNA models, which can go down to atomistic DNA descriptions. The complexity of such DNA models makes them hard to modify and adapt in order to take different biological conditions into account. The DnaFabric project was started to provide a tool to generate, visualise and modify such complex DNA models. In the current version of DnaFabric, the models can be exported to the Geant4 code to be used as targets in Monte Carlo simulations. In this work, the project was used to generate two DNA fibre models corresponding to two DNA compaction levels, representing heterochromatin and euchromatin. The fibres were imported into a Geant4 application where computations were performed to estimate the influence of DNA compaction on the amount of calculated DNA damage. The relative difference in the DNA damage computed in the two fibres for the same number of projectiles was found to be constant and equal to 1.3 for the considered primary particles (protons from 300 keV to 50 MeV). However, if only the tracks hitting the DNA target are taken into account, the relative difference is larger at low energies and decreases to reach zero around 10 MeV. The computations were performed with models containing up to 18,000 DNA nucleotide pairs. DnaFabric will nevertheless be extended to handle multi-scale models that go from the molecular to the cellular level.

  11. GGEMS-Brachy: GPU GEant4-based Monte Carlo simulation for brachytherapy applications

    NASA Astrophysics Data System (ADS)

    Lemaréchal, Yannick; Bert, Julien; Falconnet, Claire; Després, Philippe; Valeri, Antoine; Schick, Ulrike; Pradier, Olivier; Garcia, Marie-Paule; Boussion, Nicolas; Visvikis, Dimitris

    2015-07-01

    In brachytherapy, plans are routinely calculated using the AAPM TG43 formalism, which considers the patient as a simple water object. Accurate modeling of the physical processes accounting for patient heterogeneity using Monte Carlo simulation (MCS) methods is currently too time-consuming and computationally demanding to be used routinely. In this work we implemented and evaluated an accurate and fast MCS on Graphics Processing Units (GPU) for brachytherapy low dose rate (LDR) applications. A previously proposed Geant4-based MCS framework implemented on GPU (GGEMS) was extended to include a hybrid GPU navigator, allowing navigation within voxelized patient-specific images and analytically modeled 125I seeds used in LDR brachytherapy. In addition, dose scoring based on a track length estimator, including uncertainty calculations, was incorporated. The implemented GGEMS-brachy platform was validated by comparison with Geant4 simulations and reference datasets. Finally, a comparative dosimetry study based on the current clinical standard (TG43) and the proposed platform was performed on twelve prostate cancer patients undergoing LDR brachytherapy. Considering patient 3D CT volumes of 400 × 250 × 65 voxels and an average of 58 implanted seeds, the mean patient dosimetry run time for a 2% dose uncertainty was 9.35 s (≈500 ms per 10^6 simulated particles) and 2.5 s when using one and four GPUs, respectively. The performance of the proposed GGEMS-brachy platform makes it possible to envisage Monte Carlo simulation based dosimetry studies in brachytherapy compatible with clinical practice. Although the proposed platform was evaluated for prostate cancer, it is equally applicable to other LDR brachytherapy clinical applications. Future extensions will allow its application to high dose rate brachytherapy.
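
    The quoted run time for a 2% dose uncertainty reflects the 1/√N scaling of Monte Carlo statistical error: halving the uncertainty costs four times the histories. A sketch of that scaling (the numbers are illustrative, not GGEMS figures):

    ```python
    def histories_for_uncertainty(n_ref, sigma_ref, sigma_target):
        """Histories needed to reach relative uncertainty sigma_target, given
        that n_ref histories yielded sigma_ref (sigma scales as 1/sqrt(N))."""
        return int(round(n_ref * (sigma_ref / sigma_target) ** 2))
    ```

    This is why run time, not accuracy of the physics, is usually the barrier to routine clinical MCS, and why GPU speed-ups matter.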

  12. Working Notes from the 1992 AAAI Workshop on Automating Software Design. Theme: Domain Specific Software Design

    NASA Technical Reports Server (NTRS)

    Keller, Richard M. (Editor); Barstow, David; Lowry, Michael R.; Tong, Christopher H.

    1992-01-01

    The goal of this workshop is to identify different architectural approaches to building domain-specific software design systems and to explore issues unique to domain-specific (vs. general-purpose) software design. Some general issues that cut across the particular software design domain include: (1) knowledge representation, acquisition, and maintenance; (2) specialized software design techniques; and (3) user interaction and user interface.

  13. Software Prototyping: Designing Systems for Users.

    ERIC Educational Resources Information Center

    Spies, Phyllis Bova

    1983-01-01

    Reports on major change in computer software development process--the prototype model, i.e., implementation of skeletal system that is enhanced during interaction with users. Expensive and unreliable software, software design errors, traditional development approach, resources required for prototyping, success stories, and systems designer's role…

  14. A Geant4 simulation of the depth dose percentage in brain tumors treatments using protons and carbon ions

    NASA Astrophysics Data System (ADS)

    José A. Diaz, M.; Torres, D. A.

    2016-07-01

    The deposited energy and dose distributions of proton and carbon-ion beams in a head are simulated using the free toolkit Geant4 and the data analysis package ROOT-C++. The present work shows a methodology for understanding the microscopic processes occurring in a session of hadron therapy using advanced simulation tools.

  15. Introducing Third-Year Undergraduates to GEANT4 Simulations of Light Transport and Collection in Scintillation Materials

    ERIC Educational Resources Information Center

    Riggi, Simone; La Rocca, Paola; Riggi, Francesco

    2011-01-01

    GEANT4 simulations of the processes affecting the transport and collection of optical photons generated inside a scintillation detector were carried out, with the aim to complement the educational material offered by textbooks to third-year physics undergraduates. Two typical situations were considered: a long scintillator strip with and without a…

  16. A macroscopic and microscopic study of radon exposure using Geant4 and MCNPX to estimate dose rates and DNA damage

    NASA Astrophysics Data System (ADS)

    van den Akker, Mary Evelyn

    Radon is considered the second-leading cause of lung cancer after smoking. Epidemiological studies have been conducted in miner cohorts as well as in general populations to estimate the risks associated with high- and low-dose exposures. There are problems with extrapolating risk estimates to low-dose exposures, mainly because the dose-response curve at low doses is not well understood. Calculated dosimetric quantities give average energy depositions in an organ or a whole body, but the morphological features of an individual can affect these values. As opposed to human phantom models, computed tomography (CT) scans provide unique, patient-specific geometries that are valuable in modeling the radiological effects of the short-lived radon progeny sources. The Monte Carlo particle transport code Geant4 was used with the CT scan data to model radon inhalation at the main bronchial bifurcation. The equivalent dose rates are near the lower bounds of estimates found in the literature, depending on the source volume. To complement the macroscopic study, simulations were run in a small tissue volume with the Geant4-DNA toolkit. As an extension of Geant4 meant to simulate direct physical interactions at the cellular level, it allows the particle track structure of the radon progeny alphas to be analyzed to estimate the damage that can occur in sensitive cellular structures like the DNA molecule. These estimates of DNA double-strand breaks are lower than those found in Geant4-DNA studies. Further refinements of the microscopic model are at the cutting edge of nanodosimetry research.

  17. Evaluation on Geant4 Hadronic Models for Pion Minus, Pion Plus and Neutron Particles as Major Antiproton Annihilation Products.

    PubMed

    Tavakoli, Mohammad Bagher; Mohammadi, Mohammad Mehdi; Reiazi, Reza; Jabbari, Keyvan

    2015-01-01

    Geant4 is an open-source simulation toolkit based on C++ whose advantages have progressively led to applications in several research domains, especially modeling the biological effects of ionizing radiation at the sub-cellular scale. However, it has been shown that Geant4 does not give reasonable results in the prediction of antiproton dose, especially at the Bragg peak. One of the reasons could be the lack of a reliable physics model to predict the final states of annihilation products such as pions. Considering that most of the antiproton deposited dose results from high-LET nuclear fragments following pion interactions with surrounding nucleons, we reproduced depth dose curves for the most probable energy range of pions and for neutrons using Geant4. We consider this work one of the steps toward understanding the origin of the error and finally verifying Geant4 for antiproton tracking. Geant4 toolkit version 9.4.6.p01 and Fluka version 2006.3 were used to reproduce the depth dose curves of 220 MeV pions (both negative and positive) and 70 MeV neutrons. The geometry applied in the simulations consists of a 20 × 20 × 20 cm(3) water tank, similar to that used at CERN for antiproton relative dose measurements. Different physics lists, including Quark-Gluon String Precompound (QGSP)_Binary Cascade (BIC)_HP, the recommended setting for hadron therapy, were used. In the case of pions, Geant4 resulted in at least 5% dose discrepancy between different physics lists at depths close to the entrance point; up to 15% discrepancy was found in some cases, such as QBBC compared to QGSP_BIC_HP. A significant difference was observed in the dose profiles of different Geant4 physics lists at small depths for a beam of pions. In the case of neutrons, a large dose discrepancy was observed when the LHEP or LHEP_EMV lists were applied; the magnitude of this discrepancy could be as much as 50% greater than the dose calculated by LHEP (or LHEP_EMV) at larger depths. We found that the effect of different Geant4 physics lists in

  20. Using Software Design Methods in CALL

    ERIC Educational Resources Information Center

    Ward, Monica

    2006-01-01

    The phrase "software design" is not one that arouses the interest of many CALL practitioners, particularly those from a humanities background. However, software design essentials are simply logical ways of going about designing a system. The fundamentals include modularity, anticipation of change, generality and an incremental approach. While CALL…

  1. Language and Program for Documenting Software Design

    NASA Technical Reports Server (NTRS)

    Kleine, H.; Zepko, T. M.

    1986-01-01

    Software Design and Documentation Language (SDDL) provides effective communication medium to support design and documentation of complex software applications. SDDL supports communication among all members of software design team and provides for production of informative documentation on design effort. Use of SDDL-generated document to analyze design makes it possible to eliminate many errors not detected until coding and testing attempted. SDDL processor program translates designer's creative thinking into effective document for communication. Processor performs as many automatic functions as possible, freeing designer's energy for creative effort. SDDL processor program written in PASCAL.

  2. Application of Design Patterns in Refactoring Software Design

    NASA Technical Reports Server (NTRS)

    Baggs, Rhoda; Shaykhian, Gholam Ali

    2007-01-01

    Refactoring software design is a method of changing a software design while explicitly preserving its design functionalities. The presented approach is to utilize design patterns as the basis for refactoring software design. Design solutions are compared through C++ programming language examples to illustrate this approach. The development of reusable components is also discussed; the paper shows that the construction of such components can diminish the added burden of both refactoring and the use of design patterns.
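
    The refactoring idea above can be sketched in miniature. The following is our own illustrative example (in Python rather than the paper's C++): a conditional-based design is refactored toward the Strategy pattern while its original behavior is explicitly preserved.

```python
from abc import ABC, abstractmethod

# Before refactoring: behavior selection via conditionals (harder to extend).
def format_report_v1(data: list, style: str) -> str:
    if style == "csv":
        return ",".join(map(str, data))
    elif style == "lines":
        return "\n".join(map(str, data))
    raise ValueError(style)

# After refactoring: the Strategy pattern isolates each behavior in its own
# class, so new formats can be added without modifying existing code.
class Formatter(ABC):
    @abstractmethod
    def render(self, data: list) -> str: ...

class CsvFormatter(Formatter):
    def render(self, data: list) -> str:
        return ",".join(map(str, data))

class LineFormatter(Formatter):
    def render(self, data: list) -> str:
        return "\n".join(map(str, data))

def format_report_v2(data: list, formatter: Formatter) -> str:
    return formatter.render(data)

# The refactoring preserves the original functionality.
assert format_report_v1([1, 2, 3], "csv") == format_report_v2([1, 2, 3], CsvFormatter())
```

    Preserving observable behavior across the rewrite, as checked by the final assertion, is exactly what distinguishes refactoring from redesign.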

  3. The Simulation of AN Imaging Gamma-Ray Compton Backscattering Device Using GEANT4

    NASA Astrophysics Data System (ADS)

    Flechas, D.; Sarmiento, L. G.; Cristancho, F.; Fajardo, E.

    2014-02-01

    A gamma-backscattering imaging device dubbed Compton Camera, developed at GSI (Darmstadt, Germany) and modified and studied at the Nuclear Physics Group of the National University of Colombia in Bogotá, uses the back-to-back emission of two gamma rays in positron annihilation to construct a bidimensional image that represents the distribution of matter in the field-of-view of the camera. This imaging capability can be used in a host of different situations, for example, to identify and study deposition and structural defects, and to help locate concealed objects, to name just two cases. In order to increase the understanding of the response of the Compton Camera and, in particular, its image formation process, and to assist in the data analysis, a simulation of the camera was developed using the GEANT4 simulation toolkit. In this work, the images resulting from different experimental conditions are shown. The simulated images and their comparison with the experimental ones already suggest methods to improve the present experimental device.

  4. Consistency evaluation between EGSnrc and Geant4 charged particle transport in an equilibrium magnetic field

    NASA Astrophysics Data System (ADS)

    Yang, Y. M.; Bednarz, B.

    2013-02-01

    Following the proposal by several groups to integrate magnetic resonance imaging (MRI) with radiation therapy, much attention has been afforded to examining the impact of strong (on the order of a Tesla) transverse magnetic fields on photon dose distributions. The effect of the magnetic field on dose distributions must be considered in order to take full advantage of the benefits of real-time intra-fraction imaging. In this investigation, we compared the handling of particle transport in magnetic fields between two Monte Carlo codes, EGSnrc and Geant4, to analyze various aspects of their electromagnetic transport algorithms; both codes are well-benchmarked for medical physics applications in the absence of magnetic fields. A water-air-water slab phantom and a water-lung-water slab phantom were used to highlight dose perturbations near high- and low-density interfaces. We have implemented a method of calculating the Lorentz force in EGSnrc based on theoretical models in literature, and show very good consistency between the two Monte Carlo codes. This investigation further demonstrates the importance of accurate dosimetry for MRI-guided radiation therapy (MRIgRT), and facilitates the integration of a ViewRay MRIgRT system in the University of Wisconsin-Madison's Radiation Oncology Department.
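
    The Lorentz-force handling compared above can be illustrated with a toy integrator. The sketch below is our own, not the EGSnrc or Geant4 implementation: it uses the standard Boris rotation scheme, which rotates the velocity about B each step and conserves the particle speed in a pure magnetic field.

```python
import numpy as np

# Toy charged-particle stepping in a uniform magnetic field (our own sketch,
# not the EGSnrc/Geant4 algorithm): the Boris scheme rotates the velocity
# about B and conserves |v| exactly when no electric field is present.
def boris_step(x, v, q_over_m, B, dt):
    t = 0.5 * dt * q_over_m * B              # half-step rotation vector
    s = 2.0 * t / (1.0 + np.dot(t, t))
    v_prime = v + np.cross(v, t)             # no electric field in this sketch
    v_new = v + np.cross(v_prime, s)
    return x + v_new * dt, v_new

# Electron-like particle in a 1 T field along z: the orbit is a circle of
# gyroradius r = v / (|q/m| * B), about 5.7 micrometres here.
q_over_m = -1.75882e11                        # electron q/m, C/kg
B = np.array([0.0, 0.0, 1.0])                 # tesla
x = np.array([0.0, 0.0, 0.0])
v = np.array([1.0e6, 0.0, 0.0])               # m/s
speed0 = np.linalg.norm(v)
for _ in range(1000):
    x, v = boris_step(x, v, q_over_m, B, 1e-13)
assert abs(np.linalg.norm(v) - speed0) / speed0 < 1e-10   # speed conserved
assert np.linalg.norm(x) < 1.2e-5    # never farther than the orbit diameter
```

    Energy conservation of the stepping scheme is one of the properties such code-to-code comparisons probe, since spurious energy gain or loss in a strong field perturbs dose near interfaces.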

  5. GEANT4 SIMULATIONS OF GAMMA-RAY EMISSION FROM ACCELERATED PARTICLES IN SOLAR FLARES

    SciTech Connect

    Tang Shichao; Smith, David M.

    2010-10-01

    Gamma-ray spectroscopy provides diagnostics of particle acceleration in solar flares, but care must be taken when interpreting the spectra due to effects of the angular distribution of the accelerated particles (such as relativistic beaming) and Compton reprocessing of the radiation in the solar atmosphere. In this paper, we use the GEANT4 Monte Carlo package to simulate the interactions of accelerated electrons and protons and study the effects of these interactions on the gamma rays resulting from electron bremsstrahlung and pion decay. We consider the ratio of the 511 keV annihilation-line flux to the continuum at 200 keV and in the energy band just above the nuclear de-excitation lines (8-15 MeV) as a diagnostic of the accelerated particles and a point of comparison with data from the X17 flare of 2003 October 28. We also find that pion secondaries from accelerated protons produce a positron annihilation line component at a depth of ≈10 g cm⁻² and that the subsequent Compton scattering of the 511 keV photons produces a continuum that can mimic the spectrum expected from the 3γ decay of orthopositronium.

  6. Physical Modelling of Proton and Heavy Ion Radiation using Geant4

    NASA Astrophysics Data System (ADS)

    Douglass, M.; Bezak, E.

    2012-10-01

    Protons and heavy ion particles are considered ideal for use in external beam radiotherapy due to the superior properties of the dose distribution that results when these particles are incident externally, and due to their relative biological effectiveness. While significant research has been performed into the properties and physical dose characteristics of heavy ions, the nuclear reactions (direct and fragmentation) undergone by He4, C12 and Ne20 nuclei used in radiotherapy in materials other than water are still largely unexplored. In the current project, input code was developed for the Monte Carlo toolkit Geant4 version 9.3 to simulate the transport of several mono-energetic heavy ions through water. The relative dose contributions from secondary particles and nuclear fragments originating from the primary particles were investigated for each ion in both water and dense bone (ICRU) media. The results indicated that the relative contribution to the total physical dose from nuclear fragments increased with both increasing particle mass and increasing medium density. In the case of 150 MeV protons, secondary particles were shown to contribute less than 0.5% of the peak dose, rising as high as 25% for 10570 MeV neon ions in bone. When the water medium was replaced with bone, the contributions from fragments increased by more than 6% for C12 and Ne20.

  7. Interaction of Fast Nucleons with Actinide Nuclei Studied with GEANT4

    NASA Astrophysics Data System (ADS)

    Malyshkin, Yu.; Pshenichnov, I.; Mishustin, I.; Greiner, W.

    2014-04-01

    We model interactions of protons and neutrons with energies from 1 to 1000 MeV with 241Am and 243Am nuclei. The calculations are performed with the Monte Carlo model for Accelerator Driven Systems (MCADS), which we developed based on version 9.4 of the GEANT4 toolkit. This toolkit is widely used to simulate the propagation of particles in various materials which contain nuclei up to uranium. After several extensions we apply this toolkit also to proton- and neutron-induced reactions on Am. The fission and radiative neutron capture cross sections, neutron multiplicities and distributions of fission fragments were calculated for 241Am and 243Am and compared with experimental data. As demonstrated, the fission of americium by energetic protons with energies above 20 MeV can be well described by the Intra-Nuclear Cascade Liège (INCL) model combined with the fission-evaporation model ABLA. The calculated average numbers of fission neutrons and mass distributions of fission products agree well with the corresponding data. However, the proton-induced fission below 20 MeV is described less accurately. This is attributed to the limitations of the Intra-Nuclear Cascade model at low projectile energies.

  8. Geant4 simulation of the solar neutron telescope at Sierra Negra, Mexico

    NASA Astrophysics Data System (ADS)

    González, L. X.; Sánchez, F.; Valdés-Galicia, J. F.

    2010-02-01

    The solar neutron telescope (SNT) at Sierra Negra (19.0°N, 97.3°W, 4580 m a.s.l.) is part of a worldwide network of similar detectors (Valdés-Galicia et al. (2004) [1]). This SNT has an area of 4 m²; it is composed of four 1 m×1 m×30 cm plastic scintillators (Sci). The telescope is completely surrounded by anti-coincidence proportional counters (PRCs) to separate charged particles from the neutron flux. In order to discard photon background it is shielded on its sides by 10 mm thick iron plates and on its top by 5 mm lead plates. It is capable of registering four different channels corresponding to four energy deposition thresholds: E>30, >60, >90 and >120 MeV. The arrival direction of neutrons is determined by gondolas of PRCs in electronic coincidence: four layers of these gondolas are orthogonally located underneath the SNT, two in the NS direction and two in the EW direction. We present here simulations of the detector response to neutrons, protons, electrons and gammas in the energy range from 100 to 1000 MeV. We report on the detector efficiency and on its angular resolution for particles impinging on the device at different zenith angles. The simulation code was written using the Geant4 package (Agostinelli et al. (2003) [2]), taking into account all relevant physical processes.

  9. Study on gamma response function of EJ301 organic liquid scintillator with GEANT4 and FLUKA

    NASA Astrophysics Data System (ADS)

    Zhang, Su-Ya-La-Tu; Chen, Zhi-Qiang; Han, Rui; Liu, Xing-Quan; Wada, R.; Lin, Wei-Ping; Jin, Zeng-Xue; Xi, Yin-Yin; Liu, Jian-Li; Shi, Fu-Dong

    2013-12-01

    The gamma response function is required for the energy calibration of an EJ301 (5 cm in diameter and 20 cm in height) organic liquid scintillator detector by means of gamma sources. The GEANT4 and FLUKA Monte Carlo simulation packages were used to simulate the response function of the detector for standard 22Na, 60Co and 137Cs gamma sources. The simulated results showed good agreement with experimental data once the energy resolution function was incorporated into the simulation codes. The energy resolution and the position of the maximum Compton electron energy were obtained by comparing the measured light output distribution with the simulated one. The energy resolution of the detector varied from 21.2% to 12.4% for electrons in the energy region from 0.341 MeV to 1.12 MeV. The accurate position of the maximum Compton electron energy was determined to lie at 81% of the maximum height of the Compton edge distribution. In addition, the relation between the electron energy calibration and the effective neutron detection thresholds was described in detail. The present results indicated that both packages are suited for studying the gamma response function of the EJ301 detector.
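
    The electron energy range quoted above (0.341 MeV to 1.12 MeV) follows from two-body Compton kinematics: the maximum Compton electron energy for a photon of energy E is T_max = 2E²/(m_ec² + 2E). A minimal check, with the gamma lines of the sources named in the abstract:

```python
ME_C2 = 0.511  # electron rest energy, MeV

def compton_edge(e_gamma_mev: float) -> float:
    """Maximum Compton electron energy (the Compton edge), in MeV."""
    return 2.0 * e_gamma_mev**2 / (ME_C2 + 2.0 * e_gamma_mev)

# Edges for the gamma lines of the calibration sources named in the abstract.
for label, e in [("22Na 511 keV", 0.511), ("137Cs 662 keV", 0.662),
                 ("22Na 1275 keV", 1.275), ("60Co 1332 keV", 1.332)]:
    print(f"{label}: edge at {compton_edge(e):.3f} MeV")
```

    The 511 keV and 1332 keV lines give edges at 0.341 MeV and 1.118 MeV, consistent with the electron energy range reported in the abstract.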

  10. Ion therapy for uveal melanoma in new human eye phantom based on GEANT4 toolkit.

    PubMed

    Mahdipour, Seyed Ali; Mowlavi, Ali Asghar

    2016-01-01

    Radiotherapy with ion beams like proton and carbon has been used for treatment of eye uveal melanoma for many years. In this research, we have developed a new phantom of human eye for Monte Carlo simulation of tumors treatment to use in GEANT4 toolkit. Total depth-dose profiles for the proton, alpha, and carbon incident beams with the same ranges have been calculated in the phantom. Moreover, the deposited energy of the secondary particles for each of the primary beams is calculated. The dose curves are compared for 47.8 MeV proton, 190.1 MeV alpha, and 1060 MeV carbon ions that have the same range in the target region reaching to the center of tumor. The passively scattered spread-out Bragg peak (SOBP) for each incident beam as well as the flux curves of the secondary particles including neutron, gamma, and positron has been calculated and compared for the primary beams. The high sharpness of the carbon beam's Bragg peak with low lateral broadening is the benefit of this beam in hadrontherapy, but it has the disadvantages of dose leakage in the tail after its Bragg peak and high intensity of neutron production. However, proton beam, which has a good conformation with tumor shape owing to the beam broadening caused by scattering, can be a good choice for the large-size tumors. PMID:26831752

  11. Distributions of deposited energy and ionization clusters around ion tracks studied with Geant4 toolkit

    NASA Astrophysics Data System (ADS)

    Burigo, Lucas; Pshenichnov, Igor; Mishustin, Igor; Hilgers, Gerhard; Bleicher, Marcus

    2016-05-01

    The Geant4-based Monte Carlo model for Heavy-Ion Therapy (MCHIT) was extended to study the patterns of energy deposition at sub-micrometer distance from individual ion tracks. Dose distributions for low-energy 1H, 4He, 12C and 16O ions measured in several experiments are well described by the model in a broad range of radial distances, from 0.5 to 3000 nm. Despite the fact that such distributions are characterized by long tails, a dominant fraction of deposited energy (∼80%) is confined within a radius of about 10 nm. The probability distributions of clustered ionization events in nanoscale volumes of water traversed by 1H, 2H, 4He, 6Li, 7Li, and 12C ions are also calculated. A good agreement of calculated ionization cluster-size distributions with the corresponding experimental data suggests that the extended MCHIT can be used to characterize stochastic processes of energy deposition to sensitive cellular structures.

  12. Modeling of x-ray fluorescence using MCNPX and Geant4

    SciTech Connect

    Rajasingam, Akshayan; Hoover, Andrew S; Fensin, Michael L; Tobin, Stephen J

    2009-01-01

    X-Ray Fluorescence (XRF) is one of thirteen non-destructive assay techniques being researched for the purpose of quantifying the Pu mass in used fuel assemblies. The modeling portion of this research will be conducted with the MCNPX transport code. The research presented here was undertaken to test the capability of MCNPX so that it can be used to benchmark measurements made at ORNL and to give confidence in the application of MCNPX as a predictive tool of the expected capability of XRF in the context of used fuel assemblies. The main focus of this paper is a code-to-code comparison between the MCNPX and Geant4 codes. Since XRF in used fuel is driven by photon emission and beta decay of fission fragments, both terms were independently researched. Simple cases and used-fuel cases were modeled for both source terms. In order to prepare for benchmarking to experiments, it was necessary to determine the relative significance of the various fission fragments for producing X-rays.

  13. VIDA: a voxel-based dosimetry method for targeted radionuclide therapy using Geant4.

    PubMed

    Kost, Susan D; Dewaraja, Yuni K; Abramson, Richard G; Stabin, Michael G

    2015-02-01

    We have developed the Voxel-Based Internal Dosimetry Application (VIDA) to provide patient-specific dosimetry in targeted radionuclide therapy performing Monte Carlo simulations of radiation transport with the Geant4 toolkit. The code generates voxel-level dose rate maps using anatomical and physiological data taken from individual patients. Voxel level dose rate curves are then fit and integrated to yield a spatial map of radiation absorbed dose. In this article, we present validation studies using established dosimetry results, including self-dose factors (DFs) from the OLINDA/EXM program for uniform activity in unit density spheres and organ self- and cross-organ DFs in the Radiation Dose Assessment Resource (RADAR) reference adult phantom. The comparison with reference data demonstrated agreement within 5% for self-DFs to spheres and reference phantom source organs for four common radionuclides used in targeted therapy ((131)I, (90)Y, (111)In, (177)Lu). Agreement within 9% was achieved for cross-organ DFs. We also present dose estimates to normal tissues and tumors from studies of two non-Hodgkin Lymphoma patients treated by (131)I radioimmunotherapy, with comparison to results generated independently with another dosimetry code. A relative difference of 12% or less was found between methods for mean absorbed tumor doses accounting for tumor regression.
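
    The fit-and-integrate step described above can be sketched as follows. This is an illustrative mono-exponential model of our own, not VIDA's actual fitting code: a voxel whose dose rate decays as D(t) = D0·exp(-λt) accumulates a total absorbed dose of D0/λ.

```python
import numpy as np

# Illustrative sketch of the fit-and-integrate step (a mono-exponential model
# of our own, not VIDA's code): fit voxel dose-rate samples to
# D(t) = D0 * exp(-lam * t), then integrate to infinity, giving D0 / lam.
def absorbed_dose(times_h, dose_rates_gy_per_h):
    slope, intercept = np.polyfit(times_h, np.log(dose_rates_gy_per_h), 1)
    lam = -slope                     # effective decay constant, 1/h
    d0 = np.exp(intercept)           # initial dose rate, Gy/h
    return d0 / lam                  # time-integrated absorbed dose, Gy

# Synthetic voxel with a 48 h effective half-life and 0.5 Gy/h initial rate.
lam_true = np.log(2.0) / 48.0
t = np.array([1.0, 24.0, 72.0, 144.0])
dose = absorbed_dose(t, 0.5 * np.exp(-lam_true * t))
assert abs(dose - 0.5 / lam_true) < 1e-6     # recovers D0/lam on clean data
```

    Real time-activity data would of course carry noise and may need multi-exponential fits; the sketch only shows why a fitted rate curve can be integrated analytically.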

  14. Consistency evaluation between EGSnrc and Geant4 charged particle transport in an equilibrium magnetic field.

    PubMed

    Yang, Y M; Bednarz, B

    2013-02-21

    Following the proposal by several groups to integrate magnetic resonance imaging (MRI) with radiation therapy, much attention has been afforded to examining the impact of strong (on the order of a Tesla) transverse magnetic fields on photon dose distributions. The effect of the magnetic field on dose distributions must be considered in order to take full advantage of the benefits of real-time intra-fraction imaging. In this investigation, we compared the handling of particle transport in magnetic fields between two Monte Carlo codes, EGSnrc and Geant4, to analyze various aspects of their electromagnetic transport algorithms; both codes are well-benchmarked for medical physics applications in the absence of magnetic fields. A water-air-water slab phantom and a water-lung-water slab phantom were used to highlight dose perturbations near high- and low-density interfaces. We have implemented a method of calculating the Lorentz force in EGSnrc based on theoretical models in literature, and show very good consistency between the two Monte Carlo codes. This investigation further demonstrates the importance of accurate dosimetry for MRI-guided radiation therapy (MRIgRT), and facilitates the integration of a ViewRay MRIgRT system in the University of Wisconsin-Madison's Radiation Oncology Department.

  15. Validation of a dose deposited by low-energy photons using GATE/GEANT4.

    PubMed

    Thiam, C O; Breton, V; Donnarieix, D; Habib, B; Maigne, L

    2008-06-01

    The GATE Monte Carlo simulation platform, based on the Geant4 toolkit, has now become a widespread tool for simulating PET and SPECT imaging devices. In this paper, we explore its relevance for the dosimetry of low-energy 125I photon brachytherapy sources used to treat prostate cancers. To that end, three 125I sources widely used in prostate cancer brachytherapy treatment have been modelled. GATE simulations reproducing dosimetric reference observables such as the radial dose function g(r), the anisotropy function F(r, θ) and the dose-rate constant (Λ) were performed in liquid water. The calculations were split on the EGEE grid infrastructure to reduce the computing time of the simulations. The results were compared to other relevant Monte Carlo results and to measurements published and fixed as recommended values by the AAPM Task Group 43. GATE results agree with consensus values published by AAPM Task Group 43 with an accuracy better than 2%, demonstrating that GATE is a relevant tool for the study of the dose induced by low-energy photons.
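
    The reference observables named above are the parameters of the AAPM TG-43 dose-rate formalism. A minimal sketch in the point-source approximation, with purely illustrative g(r) and F(r, θ) (not those of any real 125I seed):

```python
# Hedged sketch of the AAPM TG-43 dose-rate equation that the observables
# above parameterize (point-source approximation; all numbers illustrative,
# not taken from the paper or from any real 125I source).
def dose_rate(r_cm, theta_deg, s_k, big_lambda, g, F, r0_cm=1.0):
    """TG-43: dose rate = S_K * Lambda * (r0/r)^2 * g(r) * F(r, theta)."""
    geometry_ratio = (r0_cm / r_cm) ** 2  # point-source geometry-function ratio
    return s_k * big_lambda * geometry_ratio * g(r_cm) * F(r_cm, theta_deg)

# Illustrative radial dose and anisotropy functions, normalized to 1 at the
# TG-43 reference point (r = 1 cm, theta = 90 degrees):
g = lambda r: 1.0 - 0.05 * (r - 1.0)
F = lambda r, theta: 1.0

# At the reference point the dose rate reduces to S_K * Lambda by construction.
assert dose_rate(1.0, 90.0, s_k=0.5, big_lambda=0.965, g=g, F=F) == 0.5 * 0.965
```

    A Monte Carlo validation such as the one above tabulates g(r) and F(r, θ) from simulated energy deposition and compares them against the TG-43 consensus values.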

  16. VIDA: A Voxel-Based Dosimetry Method for Targeted Radionuclide Therapy Using Geant4

    PubMed Central

    Dewaraja, Yuni K.; Abramson, Richard G.; Stabin, Michael G.

    2015-01-01

    Abstract We have developed the Voxel-Based Internal Dosimetry Application (VIDA) to provide patient-specific dosimetry in targeted radionuclide therapy performing Monte Carlo simulations of radiation transport with the Geant4 toolkit. The code generates voxel-level dose rate maps using anatomical and physiological data taken from individual patients. Voxel level dose rate curves are then fit and integrated to yield a spatial map of radiation absorbed dose. In this article, we present validation studies using established dosimetry results, including self-dose factors (DFs) from the OLINDA/EXM program for uniform activity in unit density spheres and organ self- and cross-organ DFs in the Radiation Dose Assessment Resource (RADAR) reference adult phantom. The comparison with reference data demonstrated agreement within 5% for self-DFs to spheres and reference phantom source organs for four common radionuclides used in targeted therapy (131I, 90Y, 111In, 177Lu). Agreement within 9% was achieved for cross-organ DFs. We also present dose estimates to normal tissues and tumors from studies of two non-Hodgkin Lymphoma patients treated by 131I radioimmunotherapy, with comparison to results generated independently with another dosimetry code. A relative difference of 12% or less was found between methods for mean absorbed tumor doses accounting for tumor regression. PMID:25594357

  17. Monte Carlo study of a 3D Compton imaging device with GEANT4

    NASA Astrophysics Data System (ADS)

    Lenti, M.; Veltri, M.

    2011-10-01

    In this paper we investigate, with a detailed Monte Carlo simulation based on Geant4, the novel approach of Lenti (2008) [1] to 3D imaging with photon scattering. A monochromatic and well-collimated gamma beam is used to illuminate the object to be imaged, and the Compton-scattered photons are detected by means of a surrounding germanium strip detector. The impact position and the energy of the photons are measured with high precision, and the scattering position along the beam axis is calculated. As an application of this technique we study the case of brain imaging, but the results apply equally to situations where a lighter object, with localized variations of density, is embedded in a denser container. We report here the attainable sensitivity in the detection of density variations as a function of the beam energy, the depth inside the object, and the size and density of the inclusions. Using a 600 keV gamma beam, for an inclusion with a density increase of 30% with respect to the surrounding tissue and a thickness along the beam of 5 mm, we obtain at midbrain position a resolution of about 2 mm and a contrast of 12%. In addition, the simulation indicates that for the same gamma beam energy a complete brain scan would result in an effective dose of about 1 mSv.
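
    The reconstruction of the scattering position from the measured energy and impact point can be sketched from Compton kinematics alone. The geometry and function names below are our own illustration, not the paper's code: the scattered energy fixes the Compton angle, and the hit position then locates the scatter point on the beam (z) axis.

```python
import math

ME_C2 = 511.0  # electron rest energy, keV

def scatter_angle(e0_keV, e_scat_keV):
    """Compton angle from incident and scattered photon energies."""
    cos_t = 1.0 - ME_C2 * (1.0 / e_scat_keV - 1.0 / e0_keV)
    return math.acos(cos_t)

def scatter_z(e0_keV, e_scat_keV, rho_mm, z_hit_mm):
    """Scatter position along the beam (z) axis from a hit at radius rho."""
    theta = scatter_angle(e0_keV, e_scat_keV)
    return z_hit_mm - rho_mm / math.tan(theta)

# Round trip with a 600 keV beam: place a 60-degree scatter at z = 40 mm and
# recover it from the simulated "measurement".
e0 = 600.0
theta_true = math.radians(60.0)
e_scat = e0 / (1.0 + (e0 / ME_C2) * (1.0 - math.cos(theta_true)))
z_hit = 40.0 + 30.0 / math.tan(theta_true)   # detector hit at rho = 30 mm
assert abs(scatter_z(e0, e_scat, 30.0, z_hit) - 40.0) < 1e-9
```

    In practice the energy resolution of the germanium detector smears the inferred angle, which is what ultimately limits the ~2 mm axial resolution quoted above.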

  18. Software design studies emphasizing Project LOGOS

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The results of a research project on the development of computer software are presented. Research funds of $200,000 were expended over a three year period for software design and projects in connection with Project LOGOS (computer-aided design and certification of computing systems). Abstracts of theses prepared during the project are provided.

  19. An empirical study of software design practices

    NASA Technical Reports Server (NTRS)

    Card, David N.; Church, Victor E.; Agresti, William W.

    1986-01-01

    Software engineers have developed a large body of software design theory and folklore, much of which was never validated. The results of an empirical study of software design practices in one specific environment are presented. The practices examined affect module size, module strength, data coupling, descendant span, unreferenced variables, and software reuse. Measures characteristic of these practices were extracted from 887 FORTRAN modules developed for five flight dynamics software projects monitored by the Software Engineering Laboratory (SEL). The relationship of these measures to cost and fault rate was analyzed using a contingency table procedure. The results show that some recommended design practices, despite their intuitive appeal, are ineffective in this environment, whereas others are very effective.

  20. Void fraction system computer software design description

    SciTech Connect

    Gimera, M.

    1995-02-15

    This document describes the software that controls the void fraction instrument. The format of the document may differ from typical Software Design Reports because it was created with a graphical programming language. Hardware is described in Section 2. The purpose of this document is to describe the software, so the hardware description is brief. Software is described in Section 3. LabVIEW was used to develop the viscometer software, so Section 3 begins with an introduction to LabVIEW. This is followed by a description of the main program. Finally, each Westinghouse-developed subVI (sub-program) is discussed.

  1. Software Updates: Web Design--Software that Makes It Easy!

    ERIC Educational Resources Information Center

    Pattridge, Gregory C.

    2002-01-01

    This article discusses Web design software that provides an easy-to-use interface. The "Netscape Communicator" is highlighted for beginning Web page construction and step-by-step instructions are provided for starting out, page colors and properties, indents, bulleted lists, tables, adding links, navigating long documents, creating e-mail links,…

  2. Domain specific software design for decision aiding

    NASA Technical Reports Server (NTRS)

    Keller, Kirby; Stanley, Kevin

    1992-01-01

    McDonnell Aircraft Company (MCAIR) is involved in many large multi-discipline design and development efforts of tactical aircraft. These involve a number of design disciplines that must be coordinated to produce an integrated design and a successful product. Our interpretation of a domain specific software design (DSSD) is that of a representation or framework that is specialized to support a limited problem domain. A DSSD is an abstract software design that is shaped by the problem characteristics. This parallels the theme of object-oriented analysis and design of letting the problem model directly drive the design. The DSSD concept extends the notion of software reusability to include representations or frameworks. It supports the entire software life cycle and specifically leads to improved prototyping capability, supports system integration, and promotes reuse of software designs and supporting frameworks. The example presented in this paper is the task network architecture or design which was developed for the MCAIR Pilot's Associate program. The task network concept supported both module development and system integration within the domain of operator decision aiding. It is presented as an instance where a software design exhibited many of the attributes associated with DSSD concept.

  3. Geant4 simulations on medical Linac operation at 18 MV: Experimental validation based on activation foils

    NASA Astrophysics Data System (ADS)

    Vagena, E.; Stoulos, S.; Manolopoulou, M.

    2016-03-01

    The operation of a medical linear accelerator was simulated using the Geant4 code in order to study the characteristics of an 18 MeV photon beam. Simulations showed that (a) the photon spectrum at the isocenter is not influenced by changes of the primary electron beam's energy distribution and spatial spread; (b) 98% of the photon energy fluence scored at the isocenter is primary photons that have only interacted with the target; (c) the number of contaminant electrons is not negligible, fluctuating around 5×10⁻⁵ per primary electron, or 2.40×10⁻³ per photon, at the isocenter; (d) the number of neutrons created by (γ, n) reactions is 3.13×10⁻⁶ per primary electron, or 1.50×10⁻³ per photon, at the isocenter; (e) a flattening-filter-free beam needs fewer primary electrons to deliver the same photon fluence at the isocenter than normal flattening-filter operation; (f) there is no significant increase of the surface dose due to the contaminant electrons upon removing the flattening filter; (g) comparing the neutron fluences per incident electron for the flattened and unflattened beams, the neutron fluence is 7% higher for the unflattened beam. To validate the simulation results, the total neutron and photon fluences at the isocenter were measured using nickel, indium and natural uranium activation foils. The percentage difference between simulations and measurements for the photon fluence was 1.26% for the uranium foil and 2.45% for the indium foil, while for neutrons the discrepancy was higher, up to 8.0%. The photon and neutron fluences of the simulated experiments fall within ±1 and ±2 sigma, respectively, of the ones obtained experimentally.

  4. BC404 scintillators as gamma locators studied via Geant4 simulations

    NASA Astrophysics Data System (ADS)

    Cortés, M. L.; Hoischen, R.; Eisenhauer, K.; Gerl, J.; Pietralla, N.

    2014-05-01

    In many applications in industry and academia, an accurate determination of the direction from where gamma rays are emitted is either needed or desirable. Ion-beam therapy treatments, the search for orphan sources, and homeland security applications are examples of fields that can benefit from directional sensitivity to gamma radiation. Scintillation detectors are a good option for these types of applications as they have relatively low cost, are easy to handle and can be produced in a large range of different sizes. In this work a Geant4 simulation was developed to study the directional sensitivity of different BC404 scintillator geometries and arrangements. The simulation includes all the physical processes relevant for gamma detection in a scintillator. In particular, the creation and propagation of optical photons inside the scintillator was included. A simplified photomultiplier tube model was also simulated. The physical principle exploited is the angular dependence of the shape of the energy spectrum obtained from thin scintillator layers when irradiated from different angles. After an experimental confirmation of the working principle of the device and a check of the simulation, the possibilities and limitations of directional sensitivity to gamma radiation using scintillator layers were tested. For this purpose, point-like sources of typical energies expected in ion-beam therapy were used. Optimal scintillator thicknesses for different energies were determined and the setup efficiencies calculated. The use of arrays of scintillators to reconstruct the direction of incoming gamma rays was also studied. For this case, a spherical source emitting Bremsstrahlung radiation was used together with a setup consisting of scintillator layers. The capability of this setup to identify the center of the extended source was studied together with its angular resolution.

  5. Automating the design of scientific computing software

    NASA Technical Reports Server (NTRS)

    Kant, Elaine

    1992-01-01

    SINAPSE is a domain-specific software design system that generates code from specifications of equations and algorithm methods. This paper describes the system's design techniques (planning in a space of knowledge-based refinement and optimization rules), user interaction style (user has option to control decision making), and representation of knowledge (rules and objects). It also summarizes how the system knowledge has evolved over time and suggests some issues in building software design systems to facilitate reuse.

  6. Validating Geant4 Versions 7.1 and 8.3 Against 6.1 for BaBar

    SciTech Connect

    Banerjee, Swagato; Brown, David N.; Chen, Chunhui; Cote, David; Dubois-Felsmann, Gregory P.; Gaponenko, Igor; Kim, Peter C.; Lockman, William S.; Neal, Homer A.; Simi, Gabriele; Telnov, Alexandre V.; Wright, Dennis H.; /SLAC

    2011-11-08

    Since 2005 and 2006, respectively, Geant4 versions 7.1 and 8.3 have been available, providing: improvements in the modeling of multiple scattering; corrections to muon ionization and an improved MIP signature; widening of the core of electromagnetic shower shape profiles; a newer implementation of elastic scattering for hadronic processes; a detailed implementation of the Bertini cascade model for kaons and lambdas; and updated hadronic cross-sections from calorimeter beam tests. The effects of these changes in simulation are studied in terms of the closer agreement of simulation using Geant4 versions 7.1 and 8.3, as compared to Geant4 version 6.1, with respect to data distributions of: the hit residuals of tracks in the BABAR silicon vertex tracker; the photon and K_L^0 shower shapes in the electromagnetic calorimeter; the ratio of energy deposited in the electromagnetic calorimeter and the flux return of the magnet instrumented with a muon detection system composed of resistive plate chambers and limited-streamer tubes; and the muon identification efficiency in the muon detector system of the BABAR detector.

  7. Optimization of a photoneutron source based on 10 MeV electron beam using Geant4 Monte Carlo code

    NASA Astrophysics Data System (ADS)

    Askri, Boubaker

    2015-10-01

Geant4 Monte Carlo code has been used to conceive and optimize a simple and compact neutron source based on a 10 MeV electron beam impinging on a tungsten target adjoined to a beryllium target. For this purpose, a precise photonuclear reaction cross-section model derived from the International Atomic Energy Agency (IAEA) database was linked to Geant4 to accurately simulate the interaction of low energy bremsstrahlung photons with beryllium material. A benchmark test showed that good agreement was achieved when comparing the emitted neutron flux spectra predicted by the Geant4 and Fluka codes for a beryllium cylinder bombarded with a 5 MeV photon beam. The source optimization was achieved through a two-stage Monte Carlo simulation. In the first stage, the distributions of the seven phase space coordinates of the bremsstrahlung photons at the boundaries of the tungsten target were determined. In the second stage, events corresponding to photons emitted according to these distributions were tracked. A neutron yield of 4.8 × 10^10 neutrons/mA/s was obtained at 20 cm from the beryllium target. A thermal neutron yield of 1.5 × 10^9 neutrons/mA/s was obtained after introducing a spherical shell of polyethylene as a neutron moderator.
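
The two-stage scheme described above can be sketched in simplified form: distributions recorded at the target boundary in stage one become sampling distributions for stage two. The snippet below is a hypothetical, one-coordinate (energy only) illustration with invented numbers, not the actual Geant4 implementation:

```python
import random
from bisect import bisect_right
from itertools import accumulate

rng = random.Random(42)

# Stage 1 (hypothetical stand-in): histogram the bremsstrahlung photon
# energies recorded at the tungsten/beryllium boundary. Real stage-1 data
# would come from Geant4 tracking; these values are invented.
recorded = [rng.triangular(0.0, 10.0, 1.7) for _ in range(100_000)]  # MeV
n_bins, e_max = 50, 10.0
width = e_max / n_bins
counts = [0] * n_bins
for e in recorded:
    counts[min(int(e / width), n_bins - 1)] += 1

def sample_from_histogram(counts, width, n, rng):
    """Stage 2: draw new photon energies from the stage-1 distribution."""
    cumulative = list(accumulate(counts))
    total = cumulative[-1]
    samples = []
    for _ in range(n):
        b = bisect_right(cumulative, rng.random() * total)
        samples.append((b + rng.random()) * width)  # uniform within the bin
    return samples

resampled = sample_from_histogram(counts, width, 10_000, rng)
```

In the full simulation all seven phase-space coordinates would be sampled jointly; this one-dimensional version only shows the resampling mechanics.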

  8. Intelligent Detector Design

    SciTech Connect

    Graf, N.A.; /SLAC

    2012-06-11

    As the complexity and resolution of imaging detectors increases, the need for detailed simulation of the experimental setup also becomes more important. Designing the detectors requires efficient tools to simulate the detector response and reconstruct the events. We have developed efficient and flexible tools for detailed physics and detector response simulation as well as event reconstruction and analysis. The primary goal has been to develop a software toolkit and computing infrastructure to allow physicists from universities and labs to quickly and easily conduct physics analyses and contribute to detector research and development. The application harnesses the full power of the Geant4 toolkit without requiring the end user to have any experience with either Geant4 or C++, thereby allowing the user to concentrate on the physics of the detector system.

  9. Designing Control System Application Software for Change

    NASA Technical Reports Server (NTRS)

    Boulanger, Richard

    2001-01-01

The Unified Modeling Language (UML) was used to design the Environmental Systems Test Stand (ESTS) control system software. The UML was chosen for its ability to facilitate a clear dialog between software designer and customer, from which requirements are discovered and documented in a manner which transposes directly to program objects. Applying the UML to control system software design has resulted in a baseline set of documents from which change and the effort of that change can be accurately measured. As the Environmental Systems Test Stand evolves, accurate estimates of the time and effort required to change the control system software will be made. Accurate quantification of the cost of software change can be made before implementation, improving schedule and budget accuracy.

  10. Empirical studies of design software: Implications for software engineering environments

    NASA Technical Reports Server (NTRS)

    Krasner, Herb

    1988-01-01

The empirical studies team of MCC's Design Process Group conducted three studies in 1986-87 in order to gather data on professionals designing software systems in a range of situations. The first study (the Lift Experiment) used thinking-aloud protocols in a controlled laboratory setting to study the cognitive processes of individual designers. The second study (the Object Server Project) involved the observation, videotaping, and data collection of a design team on a medium-sized development project over several months in order to study team dynamics. The third study (the Field Study) involved interviews with personnel from 19 large development projects in the MCC shareholder companies in order to study how the process of design is affected by organizational and project behavior. The focus of this report is on key observations of the design process (at several levels) and their implications for the design of environments.

  11. Does software design complexity affect maintenance effort?

    NASA Technical Reports Server (NTRS)

    Epping, Andreas; Lott, Christopher M.

    1994-01-01

    The design complexity of a software system may be characterized within a refinement level (e.g., data flow among modules), or between refinement levels (e.g., traceability between the specification and the design). We analyzed an existing set of data from NASA's Software Engineering Laboratory to test whether changing software modules with high design complexity requires more personnel effort than changing modules with low design complexity. By analyzing variables singly, we identified strong correlations between software design complexity and change effort for error corrections performed during the maintenance phase. By analyzing variables in combination, we found patterns which identify modules in which error corrections were costly to perform during the acceptance test phase.

  12. Reflecting Indigenous Culture in Educational Software Design.

    ERIC Educational Resources Information Center

    Fleer, Marilyn

    1989-01-01

    Discusses research on Australian Aboriginal cognition which relates to the development of appropriate educational software. Describes "Tinja," a software program using familiar content and experiences, Aboriginal characters and cultural values, extensive graphics and animation, peer and group work, and open-ended design to help young children read…

  13. CMD-3 detector offline software development

    NASA Astrophysics Data System (ADS)

    Anisenkov, A.; Ignatov, F.; Pirogov, S.; Sibidanov, A.; Viduk, S.; Zaytsev, A.

    2010-04-01

CMD-3 is the general purpose cryogenic magnetic detector for the VEPP-2000 electron-positron collider, which is being commissioned at Budker Institute of Nuclear Physics (BINP, Novosibirsk, Russia). The main aspects of the physics program of the experiment are precision measurements of hadronic cross sections, study of known and search for new vector mesons, study of the n̄n and p̄p production cross sections in the vicinity of the threshold, and search for exotic hadrons in the center-of-mass energy region below 2 GeV. This contribution gives a general design overview and the implementation status of the CMD-3 offline software for reconstruction, simulation, visualization and storage management. Software design standards for this project are object oriented programming techniques, C++ as the main language, Geant4 as the sole simulation tool, Geant4 based detector geometry description, CLHEP library based primary generators, the ROOT toolbox as a persistency manager and Scientific Linux as the main platform. The dedicated software development framework (Cmd3Fwk) was implemented to serve as the basic software integration solution and a high level persistency manager. The key features of the framework are modularity, dynamic data processing chain handling according to the XML configuration of reconstruction modules, and on-demand data provisioning mechanisms.
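
The idea of a processing chain assembled from an XML configuration can be illustrated with a minimal sketch. The module names and XML layout below are invented for illustration and do not reflect the real Cmd3Fwk configuration schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical module registry; each module exposes a process(event) step.
class Calibrate:
    def process(self, event):
        event["calibrated"] = True
        return event

class Reconstruct:
    def process(self, event):
        event["tracks"] = 2  # invented placeholder result
        return event

REGISTRY = {"Calibrate": Calibrate, "Reconstruct": Reconstruct}

CONFIG = """
<chain>
  <module name="Calibrate"/>
  <module name="Reconstruct"/>
</chain>
"""

def build_chain(xml_text):
    """Instantiate modules in the order the XML configuration lists them."""
    root = ET.fromstring(xml_text)
    return [REGISTRY[m.get("name")]() for m in root.findall("module")]

event = {}
for module in build_chain(CONFIG):
    event = module.process(event)
print(event)  # {'calibrated': True, 'tracks': 2}
```

Reordering or dropping `<module>` elements changes the chain without touching the module code, which is the point of a configuration-driven framework.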

  14. Flight software requirements and design support system

    NASA Technical Reports Server (NTRS)

    Riddle, W. E.; Edwards, B.

    1980-01-01

The desirability and feasibility of computer-augmented support for the pre-implementation activities occurring during the development of flight control software was investigated. The specific topics investigated were the capabilities to be included in a pre-implementation support system for flight control software development, and the specification of a preliminary design for such a system. Further, the pre-implementation support system was to be characterized and specified under the constraints that it: (1) support both description and assessment of flight control software requirements definitions and design specifications; (2) account for known software description and assessment techniques; (3) be compatible with existing and planned NASA flight control software development support systems; and (4) not impose, but may encourage, specific development technologies. An overview of the results is given.

  15. GENII Version 2 Software Design Document

    SciTech Connect

    Napier, Bruce A.; Strenge, Dennis L.; Ramsdell, James V.; Eslinger, Paul W.; Fosmire, Christian J.

    2004-03-08

This document describes the architectural design for the GENII-V2 software package. It defines details of the overall structure of the software, the major software components, their data file interfaces, and the specific mathematical models to be used. The design represents a translation of the requirements into a description of the software structure, software components, interfaces, and necessary data. The design focuses on the major components and data communication links that are key to the implementation of the software within the operating framework. The purpose of the GENII-V2 software package is to provide the capability to perform dose and risk assessments of environmental releases of radionuclides. The software also has the capability of calculating environmental accumulation and radiation doses from surface water, groundwater, and soil (buried waste) media when an input concentration of radionuclides in these media is provided. This report represents a detailed description of the capabilities of the software product, with exact specifications of the mathematical models that form the basis for the software implementation and testing efforts. This report also presents a detailed description of the overall structure of the software package, details of the main components (implemented in the current phase of work), details of data communication files, and the content of basic output reports. The GENII system includes the capabilities for calculating radiation doses following chronic and acute releases. Radionuclide transport via air, water, or biological activity may be considered. Air transport options include both puff and plume models, each allowing use of an effective stack height or calculation of plume rise from buoyant or momentum effects (or both). Building wake effects can be included in acute atmospheric release scenarios. The code provides risk estimates for health effects to individuals or populations; these can be obtained using the code by applying
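
As an illustration of the kind of plume model such a package implements, the sketch below evaluates the standard ground-reflected Gaussian plume formula; GENII's actual atmospheric models differ in detail, and all numbers here are invented:

```python
import math

def gaussian_plume(q, u, sigma_y, sigma_z, y, z, h):
    """Ground-reflected Gaussian plume concentration.

    q: source term (e.g. Bq/s), u: wind speed (m/s), h: effective stack
    height (m), sigma_y/sigma_z: dispersion parameters (m) at the
    downwind distance of interest; y, z: crosswind and vertical position (m).
    """
    lateral = math.exp(-y**2 / (2 * sigma_y**2))
    vertical = (math.exp(-(z - h)**2 / (2 * sigma_z**2)) +
                math.exp(-(z + h)**2 / (2 * sigma_z**2)))  # ground reflection
    return q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Centerline ground-level concentration for an illustrative release
c0 = gaussian_plume(q=1.0e6, u=3.0, sigma_y=80.0, sigma_z=40.0,
                    y=0.0, z=0.0, h=50.0)
```

The sigma values would normally come from stability-class correlations; here they are fixed inputs to keep the sketch self-contained.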

  16. Space Software for Automotive Design

    NASA Technical Reports Server (NTRS)

    1988-01-01

John Thousand of Wolverine Western Corp. put his aerospace group to work on an unfamiliar job: designing a brake drum using computer design techniques. Computer design involves creating a mathematical model of a product and analyzing its effectiveness in simulated operation. The technique enables study of the performance and structural behavior of a number of different designs before settling on a final configuration. Wolverine employees attacked a traditional brake drum problem, the sudden buildup of heat during fast and repeated braking. The part of the brake drum that is not confined tends to change its shape under a combination of heat, physical pressure and rotational forces, a condition known as bellmouthing. Since bellmouthing is a major factor in braking effectiveness, a solution to the problem would be a major advance in automotive engineering. A former NASA employee, now at Wolverine, knew of a series of NASA computer programs ideally suited to confronting bellmouthing. Originally developed as aids to rocket engine nozzle design, the programs are capable of analyzing problems generated in a rocket engine or automotive brake drum by heat, expansion, pressure and rotational forces. Use of these computer programs led to a new brake drum concept featuring a more durable axle and heat transfer ribs, or fins, on the hub of the drum.

  17. Software design and documentation language, revision 1

    NASA Technical Reports Server (NTRS)

    Kleine, H.

    1979-01-01

    The Software Design and Documentation Language (SDDL) developed to provide an effective communications medium to support the design and documentation of complex software applications is described. Features of the system include: (1) a processor which can convert design specifications into an intelligible, informative machine-reproducible document; (2) a design and documentation language with forms and syntax that are simple, unrestrictive, and communicative; and (3) methodology for effective use of the language and processor. The SDDL processor is written in the SIMSCRIPT II programming language and is implemented on the UNIVAC 1108, the IBM 360/370, and Control Data machines.

  18. Program Helps Design Tests Of Developmental Software

    NASA Technical Reports Server (NTRS)

    Hops, Jonathan

    1994-01-01

Computer program called "A Formal Test Representation Language and Tool for Functional Test Designs" (TRL) provides automatic software tool and formal language used to implement category-partition method and produce specification of test cases in testing phase of development of software. Category-partition method useful in defining inputs, outputs, and purpose of test-design phase of development and combines benefits of choosing normal cases with those having error-exposing properties. Traceability maintained quite easily by creating test design for each objective in test plan. Effort to transform test cases into procedures simplified by use of automatic software tool to create cases based on test design. Method enables rapid elimination of undesired test cases from consideration and facilitates review of test designs by peer groups. Written in C language.

  19. Computer-aided software development process design

    NASA Technical Reports Server (NTRS)

    Lin, Chi Y.; Levary, Reuven R.

    1989-01-01

The authors describe an intelligent tool designed to aid managers of software development projects in planning, managing, and controlling the development process of medium- to large-scale software projects. Its purpose is to reduce uncertainties in the budget, personnel, and schedule planning of software development projects. It is based on a dynamic model of the software development and maintenance life-cycle process. This dynamic process is composed of a number of time-varying, interacting developmental phases, each characterized by its intended functions and requirements. System dynamics is used as a modeling methodology. The resulting Software LIfe-Cycle Simulator (SLICS) and the hybrid expert simulation system of which it is a subsystem are described.

  20. SDDL- SOFTWARE DESIGN AND DOCUMENTATION LANGUAGE

    NASA Technical Reports Server (NTRS)

    Kleine, H.

    1994-01-01

Effective, efficient communication is an essential element of the software development process. The Software Design and Documentation Language (SDDL) provides an effective communication medium to support the design and documentation of complex software applications. SDDL supports communication between all the members of a software design team and provides for the production of informative documentation on the design effort. Even when an entire development task is performed by a single individual, it is important to explicitly express and document communication between the various aspects of the design effort, including concept development, program specification, program development, and program maintenance. SDDL ensures that accurate documentation will be available throughout the entire software life cycle. SDDL offers an extremely valuable capability for the design and documentation of complex programming efforts ranging from scientific and engineering applications to data management and business systems. Throughout the development of a software design, the SDDL generated Software Design Document always represents the definitive word on the current status of the ongoing, dynamic design development process. The document is easily updated and readily accessible in a familiar, informative form to all members of the development team. This makes the Software Design Document an effective instrument for reconciling misunderstandings and disagreements in the development of design specifications, engineering support concepts, and the software design itself. Using the SDDL generated document to analyze the design makes it possible to eliminate many errors that might not be detected until coding and testing is attempted. As a project management aid, the Software Design Document is useful for monitoring progress and for recording task responsibilities. SDDL is a combination of language, processor, and methodology. The SDDL syntax consists of keywords to invoke design structures

  1. Design of software engineering teaching website

    NASA Astrophysics Data System (ADS)

    Li, Yuxiang; Liu, Xin; Zhang, Guangbin; Liu, Xingshun; Gao, Zhenbo

"Software engineering" differs from general professional courses: it was born to overcome the software crisis and to adapt to the development of the software industry, and it is both a theoretical and, especially, a practical course. However, due to the characteristics of the software engineering curriculum, in daily teaching students may find the theoretical material boring and show low interest in learning, poor test results, and other problems. ASP.NET design techniques and an Access 2007 database are used to design and implement a "Software Engineering" teaching website. System features mainly include theoretical teaching, case teaching, practical teaching, teaching interaction, a database, a test item bank, announcements, etc., which can enhance the vitality, interest and dynamic role of learning.

  2. Early-Stage Software Design for Usability

    ERIC Educational Resources Information Center

    Golden, Elspeth

    2010-01-01

    In spite of the goodwill and best efforts of software engineers and usability professionals, systems continue to be built and released with glaring usability flaws that are costly and difficult to fix after the system has been built. Although user interface (UI) designers, be they usability or design experts, communicate usability requirements to…

  3. Modeling User Interactions with Instructional Design Software.

    ERIC Educational Resources Information Center

    Spector, J. Michael; And Others

    As one of a series of studies being conducted to develop a useful (predictive) model of the instructional design process that is appropriate to military technical training settings, this study performed initial evaluations on two pieces of instructional design software developed by M. David Merrill and colleagues at Utah State University i.e.,…

  4. Software Design Improvements. Part 2; Software Quality and the Design and Inspection Process

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R.; Packard, Michael H.; Ziemianski, Tom

    1997-01-01

The application of assurance engineering techniques improves the duration of failure-free performance of software. The totality of features and characteristics of a software product are what determine its ability to satisfy customer needs. Software in safety-critical systems is very important to NASA. We follow the System Safety Working Group's definition of system safety software: 'The optimization of system safety in the design, development, use and maintenance of software and its integration with safety-critical systems in an operational environment.' 'If it is not safe, say so' has become our motto. This paper goes over methods that have been used by NASA to make software design improvements by focusing on software quality and the design and inspection process.

  5. Computed Pion Yields from a Tantalum Rod Target: Comparing MARS15 and GEANT4 Across Proton Energies

    NASA Astrophysics Data System (ADS)

    Brooks, S. J.; Walaron, K. A.

    2006-05-01

    The choice of proton driver energy is an important variable in maximising the pion flux available in later stages of the neutrino factory. Simulations of pion production using a range of energies are presented and cross-checked for reliability between the codes MARS15 and GEANT4. The distributions are combined with postulated apertures for the pion decay channel and muon front-end to estimate the usable muon flux after capture losses. Resolution of discrepancies between the codes awaits experimental data in the required energy range.

  6. SU-E-T-347: Validation of the Condensed History Algorithm of Geant4 Using the Fano Test

    SciTech Connect

    Lee, H; Mathis, M; Sawakuchi, G

    2014-06-01

    Purpose: To validate the condensed history algorithm and physics of the Geant4 Monte Carlo toolkit for simulations of ionization chambers (ICs). This study is the first step to validate Geant4 for calculations of photon beam quality correction factors under the presence of a strong magnetic field for magnetic resonance guided linac system applications. Methods: The electron transport and boundary crossing algorithms of Geant4 version 9.6.p02 were tested under Fano conditions using the Geant4 example/application FanoCavity. User-defined parameters of the condensed history and multiple scattering algorithms were investigated under Fano test conditions for three scattering models (physics lists): G4UrbanMscModel95 (PhysListEmStandard-option3), G4GoudsmitSaundersonMsc (PhysListEmStandard-GS), and G4WentzelVIModel/G4CoulombScattering (PhysListEmStandard-WVI). Simulations were conducted using monoenergetic photon beams, ranging from 0.5 to 7 MeV and emphasizing energies from 0.8 to 3 MeV. Results: The GS and WVI physics lists provided consistent Fano test results (within ±0.5%) for maximum step sizes under 0.01 mm at 1.25 MeV, with improved performance at 3 MeV (within ±0.25%). The option3 physics list provided consistent Fano test results (within ±0.5%) for maximum step sizes above 1 mm. Optimal parameters for the option3 physics list were 10 km maximum step size with default values for other user-defined parameters: 0.2 dRoverRange, 0.01 mm final range, 0.04 range factor, 2.5 geometrical factor, and 1 skin. Simulations using the option3 physics list were ∼70 – 100 times faster compared to GS and WVI under optimal parameters. Conclusion: This work indicated that the option3 physics list passes the Fano test within ±0.5% when using a maximum step size of 10 km for energies suitable for IC calculations in a 6 MV spectrum without extensive computational times. Optimal user-defined parameters using the option3 physics list will be used in future IC simulations to
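
A minimal sketch of how such user-defined parameters might be collected into a Geant4 UI macro follows. The command names are based on Geant4's electromagnetic messengers (/process/eLoss, /process/msc) and should be verified against the messenger list of the Geant4 version in use; the maximum step size is typically enforced separately (e.g., via G4UserLimits) in user code:

```python
# Optimal option3 parameters reported above, gathered for macro generation.
PARAMS = {
    "dRoverRange": 0.2,
    "finalRange_mm": 0.01,
    "rangeFactor": 0.04,
    "geomFactor": 2.5,
    "skin": 1,
}

def write_macro(p):
    """Render the parameters as Geant4-style UI commands (one per line)."""
    return "\n".join([
        f"/process/eLoss/StepFunction {p['dRoverRange']} {p['finalRange_mm']} mm",
        f"/process/msc/RangeFactor {p['rangeFactor']}",
        f"/process/msc/GeomFactor {p['geomFactor']}",
        f"/process/msc/Skin {p['skin']}",
    ])

macro = write_macro(PARAMS)
print(macro)
```

Keeping the parameters in one place makes it easy to rerun the Fano test while varying a single knob at a time, as the study above does.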

  7. A GAMOS plug-in for GEANT4 based Monte Carlo simulation of radiation-induced light transport in biological media.

    PubMed

    Glaser, Adam K; Kanick, Stephen C; Zhang, Rongxiao; Arce, Pedro; Pogue, Brian W

    2013-05-01

We describe a tissue optics plug-in that interfaces with the GEANT4/GAMOS Monte Carlo (MC) architecture, providing a means of simulating radiation-induced light transport in biological media for the first time. Specifically, we focus on the simulation of light transport due to the Čerenkov effect (light emission from charged particles traveling faster than the local speed of light in a given medium), a phenomenon which requires accurate modeling of both the high energy particle and the subsequent optical photon transport, a dynamic coupled process that is not well-described by any current MC framework. The results of validation simulations show excellent agreement with currently employed biomedical optics MC codes [i.e., Monte Carlo for Multi-Layered media (MCML), Mesh-based Monte Carlo (MMC), and diffusion theory], and examples relevant to recent studies into detection of Čerenkov light from an external radiation beam or radionuclide are presented. While the work presented within this paper focuses on radiation-induced light transport, the core features and robust flexibility of the modified package make it also extensible to more conventional biomedical optics simulations. The plug-in, user guide, example files, as well as the necessary files to reproduce the validation simulations described within this paper are available online at http://www.dartmouth.edu/optmed/research-projects/monte-carlo-software.
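
The Čerenkov emission condition mentioned above (speed greater than c/n, i.e. β > 1/n) implies a threshold kinetic energy that is easy to compute; a minimal sketch for electrons in water (refractive index n ≈ 1.33) follows:

```python
import math

M_E = 0.511  # electron rest energy, MeV

def cherenkov_threshold(n, rest_energy=M_E):
    """Kinetic energy above which a charged particle emits Cherenkov light.

    Emission requires beta > 1/n; evaluating gamma = 1/sqrt(1 - beta^2)
    at beta = 1/n gives the threshold kinetic energy (gamma - 1) * m c^2.
    """
    beta = 1.0 / n
    gamma = 1.0 / math.sqrt(1.0 - beta**2)
    return (gamma - 1.0) * rest_energy

print(round(cherenkov_threshold(1.33), 3))  # prints 0.264 (MeV)
```

This ~0.26 MeV electron threshold in water is why only the fast end of the secondary-electron spectrum contributes Čerenkov light in such simulations.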

  8. General purpose optimization software for engineering design

    NASA Technical Reports Server (NTRS)

    Vanderplaats, G. N.

    1990-01-01

The author has developed several general purpose optimization programs over the past twenty years. The earlier programs were developed as research codes and served that purpose reasonably well. However, in taking the formal step from research to industrial application programs, several important lessons have been learned. Among these are the importance of clear documentation, immediate user support, and consistent maintenance. Most important has been the issue of providing software that gives a good, or at least acceptable, design at minimum computational cost. Here, the basic issues in developing optimization software for industrial applications are outlined, and issues of convergence rate, reliability, and relative minima are discussed. Considerable feedback has been received from users, and new software is being developed to respond to identified needs. The basic capabilities of this software are outlined. A major motivation for the development of commercial grade software is ease of use and flexibility, and these issues are discussed with reference to general multidisciplinary applications. It is concluded that design productivity can be significantly enhanced by the more widespread use of optimization as an everyday design tool.

  9. A comparison of the measured responses of a tissue-equivalent proportional counter to high energy heavy (HZE) particles and those simulated using the Geant4 Monte Carlo code

    PubMed Central

    Taddei, Phillip J.; Zhao, Zhongxiang; Borak, Thomas B.

    2010-01-01

Monte Carlo simulations of heavy ion interactions using the Geant4 toolkit were compared with measurements of energy deposition in a spherical tissue-equivalent proportional counter (TEPC). A spherical cavity with a physical diameter of 12.7 mm was filled with propane-based tissue-equivalent gas surrounded by a wall of A-150 tissue-equivalent plastic that was 2.54 mm thick. Measurements and Monte Carlo simulations were used to record the energy deposition and the trajectory of the incident particle on an event-by-event basis for ions ranging in atomic number from 2 (4He) to 26 (56Fe) and in energy from 200 MeV/nucleon to 1000 MeV/nucleon. In the simulations, tracking of secondary electrons was terminated when the range of an electron was below a specified threshold. The effects of range cuts for electrons at 0.5 μm, 1 μm, 10 μm, and 100 μm were evaluated. To simulate an energy deposition influenced by large numbers of low energy electrons with large transverse momentum, it was necessary to track electrons down to range cuts of 10 μm or less. The Geant4 simulated data closely matched the measured data acquired using a TEPC for incident particles traversing the center of the detector as well as near the gas-wall interface. Values of frequency mean lineal energy and dose mean lineal energy were within 8% of the measured data. The production of secondary particles in the aluminum vacuum chamber had no effect on the response of the TEPC for 56Fe at 1000 MeV/nucleon. The results of this study confirm that Geant4 can simulate patterns of energy deposition for existing microdosimeters and is valuable for improving the design of a new generation of detectors used for space dosimetry and for characterizing particle beams used in hadron radiotherapy. PMID:20862212
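
The frequency-mean and dose-mean lineal energies compared above follow from the event-by-event deposits in a standard way; a minimal sketch with invented deposit values, assuming isotropic irradiation of a spherical site:

```python
def lineal_energy_means(deposits_keV, diameter_um):
    """Frequency- and dose-mean lineal energy (keV/um) from event-by-event
    deposits in a spherical site. The mean chord length of a convex body
    under isotropic irradiation is 4V/S, which for a sphere is 2/3 of the
    diameter (Cauchy's theorem)."""
    mean_chord = (2.0 / 3.0) * diameter_um
    total = sum(deposits_keV)
    y_f = total / (len(deposits_keV) * mean_chord)            # frequency mean
    y_d = sum(e * e for e in deposits_keV) / (mean_chord * total)  # dose mean
    return y_f, y_d

# Invented deposits (keV) in a site simulating a 2 um tissue sphere
events = [5.0, 8.0, 12.0, 40.0]
y_f, y_d = lineal_energy_means(events, diameter_um=2.0)
```

Note that y_D weights each event by its deposit, so a few large events (like the 40 keV one here) dominate it, which is why the two means differ.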

  10. Validation of nuclear models in Geant4 using the dose distribution of a 177 MeV proton pencil beam.

    PubMed

    Hall, David C; Makarova, Anastasia; Paganetti, Harald; Gottschalk, Bernard

    2016-01-01

    A proton pencil beam is associated with a surrounding low-dose envelope, originating from nuclear interactions. It is important for treatment planning systems to accurately model this envelope when performing dose calculations for pencil beam scanning treatments, and Monte Carlo (MC) codes are commonly used for this purpose. This work aims to validate the nuclear models employed by the Geant4 MC code, by comparing the simulated absolute dose distribution to a recent experiment of a 177 MeV proton pencil beam stopping in water. Striking agreement is observed over five orders of magnitude, with both the shape and normalisation well modelled. The normalisations of two depth dose curves are lower than experiment, though this could be explained by an experimental positioning error. The Geant4 neutron production model is also verified in the distal region. The entrance dose is poorly modelled, suggesting an unaccounted upstream source of low-energy protons. Recommendations are given for a follow-up experiment which could resolve these issues.

  11. Validation of nuclear models in Geant4 using the dose distribution of a 177 MeV proton pencil beam

    NASA Astrophysics Data System (ADS)

    Hall, David C.; Makarova, Anastasia; Paganetti, Harald; Gottschalk, Bernard

    2016-01-01

    A proton pencil beam is associated with a surrounding low-dose envelope, originating from nuclear interactions. It is important for treatment planning systems to accurately model this envelope when performing dose calculations for pencil beam scanning treatments, and Monte Carlo (MC) codes are commonly used for this purpose. This work aims to validate the nuclear models employed by the Geant4 MC code, by comparing the simulated absolute dose distribution to a recent experiment of a 177 MeV proton pencil beam stopping in water. Striking agreement is observed over five orders of magnitude, with both the shape and normalisation well modelled. The normalisations of two depth dose curves are lower than experiment, though this could be explained by an experimental positioning error. The Geant4 neutron production model is also verified in the distal region. The entrance dose is poorly modelled, suggesting an unaccounted upstream source of low-energy protons. Recommendations are given for a follow-up experiment which could resolve these issues.

  12. Inhomogeneity effect in Varian Trilogy Clinac iX 10 MV photon beam using EGSnrc and Geant4 code system

    NASA Astrophysics Data System (ADS)

    Yani, S.; Rhani, M. F.; Haryanto, F.; Arif, I.

    2016-08-01

Treatment fields consist of tissue other than water-equivalent tissue (soft tissue, bones, lungs, etc.). The inhomogeneity effect can be investigated by Monte Carlo (MC) simulation. MC simulation of the radiation transport in an absorbing medium is the most accurate method for dose calculation in radiotherapy. The aim of this work is to evaluate the effect of an inhomogeneous phantom on dose calculations in photon beam radiotherapy obtained by different MC codes. The MC code systems EGSnrc and Geant4 were used in this study. The inhomogeneous phantom measures 39.5 × 30.5 × 30 cm3 and is made of 4 material slices (12.5 cm water, 10 cm aluminium, 5 cm lung and 12.5 cm water). Simulations were performed for a field size of 4 × 4 cm2 at SSD 100 cm. The Varian Trilogy Clinac iX 10 MV spectrum distribution was used. Percent depth dose (PDD) and dose profiles were investigated in this research. The effects of inhomogeneities on radiation dose distributions depend on the amount, density and atomic number of the inhomogeneity, as well as on the quality of the photon beam. Good agreement between the dose distributions from the EGSnrc and Geant4 code systems in the inhomogeneous phantom was observed, with dose differences around 5% and 7% for depth doses and dose profiles.
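
The two compared quantities can be computed from simulated dose curves in a few lines; a minimal sketch with invented dose values (not the paper's data):

```python
def percent_depth_dose(doses):
    """Normalise a depth-dose curve to its maximum (PDD in %)."""
    d_max = max(doses)
    return [100.0 * d / d_max for d in doses]

def local_percent_difference(a, b):
    """Point-by-point difference of curve b relative to curve a (%)."""
    return [100.0 * (bi - ai) / ai for ai, bi in zip(a, b)]

# Invented dose samples along depth for two codes (arbitrary units)
egs = [2.1, 3.0, 2.8, 2.4]
g4 = [2.0, 3.0, 2.9, 2.3]
pdd_egs = percent_depth_dose(egs)
diffs = local_percent_difference(pdd_egs, percent_depth_dose(g4))
```

Quoted dose differences like the 5% and 7% above are typically summaries of such point-by-point comparisons over the region of interest.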

  13. Validation of nuclear models in Geant4 using the dose distribution of a 177 MeV proton pencil beam.

    PubMed

    Hall, David C; Makarova, Anastasia; Paganetti, Harald; Gottschalk, Bernard

    2016-01-01

    A proton pencil beam is associated with a surrounding low-dose envelope, originating from nuclear interactions. It is important for treatment planning systems to accurately model this envelope when performing dose calculations for pencil beam scanning treatments, and Monte Carlo (MC) codes are commonly used for this purpose. This work aims to validate the nuclear models employed by the Geant4 MC code, by comparing the simulated absolute dose distribution to a recent experiment of a 177 MeV proton pencil beam stopping in water. Striking agreement is observed over five orders of magnitude, with both the shape and normalisation well modelled. The normalisations of two depth dose curves are lower than experiment, though this could be explained by an experimental positioning error. The Geant4 neutron production model is also verified in the distal region. The entrance dose is poorly modelled, suggesting an unaccounted upstream source of low-energy protons. Recommendations are given for a follow-up experiment which could resolve these issues. PMID:26611861

  14. Preliminary design of the redundant software experiment

    NASA Technical Reports Server (NTRS)

    Campbell, Roy; Deimel, Lionel; Eckhardt, Dave, Jr.; Kelly, John; Knight, John; Lauterbach, Linda; Lee, Larry; Mcallister, Dave; Mchugh, John

    1985-01-01

    The goal of the present experiment is to characterize the fault distributions of highly reliable software replicates, constructed using techniques and environments similar to those used in contemporary industrial software facilities. The fault distributions and their effect on the reliability of fault-tolerant configurations of the software will be determined through extensive life testing of the replicates against carefully constructed, randomly generated test data. Each detected error will be carefully analyzed to provide insight into its nature and cause. A direct objective is to develop techniques for reducing the intensity of coincident errors, thus increasing the reliability gain which can be achieved with fault tolerance. Data on the reliability gains realized, and on the cost of the fault-tolerant configurations, can be used to design a companion experiment to determine the cost effectiveness of the fault-tolerant strategy. Finally, the data and analysis produced by this experiment will be valuable to the software engineering community as a whole because they will provide useful insight into the nature and cause of hard-to-find, subtle faults which escape standard software engineering validation techniques and thus persist far into the software life cycle.

  15. User Interface Design for Dynamic Geometry Software

    ERIC Educational Resources Information Center

    Kortenkamp, Ulrich; Dohrmann, Christian

    2010-01-01

    In this article we describe long-standing user interface issues with Dynamic Geometry Software and common approaches to address them. We describe first prototypes of multi-touch-capable DGS. We also give some hints on the educational benefits of proper user interface design.

  16. Teacher-Driven Design of Educational Software.

    ERIC Educational Resources Information Center

    Carlson, Patricia A.

    This paper reflects on the author's participation in two government-sponsored educational software development projects that used a holistic design paradigm in which classroom formative assessment and teacher input played a critical role in the development process. The two projects were: R-WISE (Reading and Writing in a Supportive Environment)--a…

  17. Photonic IC design software and process design kits

    NASA Astrophysics Data System (ADS)

    Korthorst, Twan; Stoffer, Remco; Bakker, Arjen

    2015-04-01

    This review discusses photonic IC design software tools, examines existing design flows for photonics design and how these fit different design styles and describes the activities in collaboration and standardization within the silicon photonics group from Si2 and by members of the PDAFlow Foundation to improve design flows. Moreover, it will address the lowering of access barriers to the technology by providing qualified process design kits (PDKs) and improved integration of photonic integrated circuit simulations, physical simulations, mask layout, and verification.

  18. CRISP90 - SOFTWARE DESIGN ANALYZER SYSTEM

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1994-01-01

    The CRISP90 Software Design Analyzer System, an update of CRISP-80, is a set of programs forming a software design and documentation tool which supports top-down, hierarchic, modular, structured design and programming methodologies. The quality of a computer program can often be significantly influenced by the design medium in which the program is developed. The medium must foster the easy and quick expression of the programmer's ideas, and it must permit flexible and facile alterations, additions, and deletions to these ideas as the design evolves. The CRISP90 software design analyzer system was developed to provide the PDL (Programmer Design Language) programmer with such a design medium. A program design using CRISP90 consists of short, English-like textual descriptions of data, interfaces, and procedures that are embedded in a simple, structured, modular syntax. The display is formatted into two-dimensional, flowchart-like segments for a graphic presentation of the design. Together with a good interactive full-screen editor or word processor, the CRISP90 design analyzer becomes a powerful tool for the programmer. In addition to being a text formatter, the CRISP90 system prepares material that would be tedious and error-prone to extract manually, such as a table of contents, module directory, structure (tier) chart, cross-references, and a statistics report on the characteristics of the design. Referenced modules are marked by schematic logic symbols to show conditional, iterative, and/or concurrent invocation in the program. A keyword usage profile can be generated automatically and glossary definitions inserted into the output documentation. Another feature is the capability to detect changes that were made between versions. Thus, "change-bars" can be placed in the output document along with a list of changed pages and a version history report. Also, items may be marked as "to be determined" and each will appear on a special table until the item is

  19. Automating risk analysis of software design models.

    PubMed

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688

  20. Automating Risk Analysis of Software Design Models

    PubMed Central

    Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P.

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688

  1. Probing Planetary Bodies for Subsurface Volatiles: GEANT4 Models of Gamma Ray, Fast, Epithermal, and Thermal Neutron Response to Active Neutron Illumination

    NASA Astrophysics Data System (ADS)

    Chin, G.; Sagdeev, R.; Su, J. J.; Murray, J.

    2014-12-01

    Using an active source of neutrons as an in situ probe of a planetary body has proven to be a powerful tool to extract information about the presence, abundance, and location of subsurface volatiles without the need for drilling. The Dynamic Albedo of Neutrons (DAN) instrument on Curiosity is an example of such an instrument and is designed to detect the location and abundance of hydrogen within the top 50 cm of the Martian surface. DAN works by sending a pulse of neutrons towards the ground beneath the rover and detecting the reflected neutrons. The intensity of the reflection depends on the proportion of water, while the time the pulse takes to reach the detector is a function of the depth at which the water is located. Similar instruments can also be effective probes at the polar regions of the Moon or on asteroids as a way of detecting sequestered volatiles. We present the results of GEANT4 particle simulation models of gamma ray, fast, epithermal, and thermal neutron responses to active neutron illumination. The results are parameterized by hydrogen abundance and by the stratification and depth of volatile layers, versus the distribution of neutron and gamma ray energy reflections. Models will be presented to approximate Martian, lunar, and asteroid environments and would be useful tools for assessing utility for future NASA exploration missions to these types of planetary bodies.

  2. Ray tracing simulations for the wide-field x-ray telescope of the Einstein Probe mission based on Geant4 and XRTG4

    NASA Astrophysics Data System (ADS)

    Zhao, Donghua; Zhang, Chen; Yuan, Weimin; Willingale, Richard; Ling, Zhixing; Feng, Hua; Li, Hong; Ji, Jianfeng; Wang, Wenxin; Zhang, Shuangnan

    2014-07-01

    Einstein Probe (EP) is a proposed small scientific satellite dedicated to time-domain astrophysics working in the soft X-ray band. It will discover transients and monitor variable objects in 0.5-4 keV, for which it will employ a very large instantaneous field-of-view (60° × 60°), along with moderate spatial resolution (FWHM ˜ 5 arcmin). Its wide-field imaging capability will be achieved by using established technology in novel lobster-eye optics. In this paper, we present Monte-Carlo simulations for the focusing capabilities of EP's Wide-field X-ray Telescope (WXT). The simulations are performed using Geant4 with an X-ray tracer which was developed by cosine (http://cosine.nl/) to trace X-rays. Our work is the first step toward building a comprehensive model with which the design of the X-ray optics and the ultimate sensitivity of the instrument can be optimized by simulating the X-ray tracing and radiation environment of the system, including the focal plane detector and the shielding at the same time.

  3. Intelligent Software for System Design and Documentation

    NASA Technical Reports Server (NTRS)

    2002-01-01

    In an effort to develop a real-time, on-line database system that tracks documentation changes in NASA's propulsion test facilities, engineers at Stennis Space Center teamed with ECT International of Brookfield, WI, through the NASA Dual-Use Development Program to create the External Data Program and Hyperlink Add-on Modules for the promis*e software. Promis*e is ECT's top-of-the-line intelligent software for control system design and documentation. With promis*e the user can make use of the automated design process to quickly generate control system schematics, panel layouts, bills of material, wire lists, terminal plans and more. NASA and its testing contractors currently use promis*e to create the drawings and schematics at the E2 Cell 2 test stand located at Stennis Space Center.

  4. Technical Note: Improvements in GEANT4 energy-loss model and the effect on low-energy electron transport in liquid water

    SciTech Connect

    Kyriakou, I.; Incerti, S.

    2015-07-15

    Purpose: The GEANT4-DNA physics models are upgraded by a more accurate set of electron cross sections for ionization and excitation in liquid water. The impact of the new developments on low-energy electron transport simulations by the GEANT4 Monte Carlo toolkit is examined for improving its performance in dosimetry applications at the subcellular and nanometer level. Methods: The authors provide an algorithm for an improved implementation of the Emfietzoglou model dielectric response function of liquid water used in the GEANT4-DNA existing model. The algorithm redistributes the imaginary part of the dielectric function to ensure a physically motivated behavior at the binding energies, while retaining all the advantages of the original formulation, e.g., the analytic properties and the fulfillment of the f-sum-rule. In addition, refinements in the exchange and perturbation corrections to the Born approximation used in the GEANT4-DNA existing model are also made. Results: The new ionization and excitation cross sections are significantly different from those of the GEANT4-DNA existing model. In particular, excitations are strongly enhanced relative to ionizations, resulting in higher W-values and less diffusive dose-point-kernels at sub-keV electron energies. Conclusions: An improved energy-loss model for the excitation and ionization of liquid water by low-energy electrons has been implemented in GEANT4-DNA. The suspiciously low W-values and the unphysical long tail in the dose-point-kernel have been corrected owing to a different partitioning of the dielectric function.
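
    The f-sum rule mentioned in the abstract constrains the energy-loss function Im[-1/ε(ω)]. As a toy illustration only (GEANT4-DNA uses the multi-oscillator Emfietzoglou dielectric model fitted to liquid-water data, not this single oscillator), a Drude dielectric function can be checked numerically against the sum rule ∫₀^∞ ω·Im[-1/ε(ω)] dω = (π/2)·ωp²:

    ```python
    import numpy as np

    # Single Drude oscillator: eps(w) = 1 - wp^2 / (w^2 + i*gamma*w).
    # Sum rule: integral_0^inf  w * Im[-1/eps(w)] dw = (pi/2) * wp^2.
    wp, gamma = 1.0, 0.2                      # plasmon energy, damping (arb. units)

    def elf(w):
        """Energy-loss function Im[-1/eps] of the Drude model."""
        eps = 1.0 - wp**2 / (w**2 + 1j * gamma * w)
        return (-1.0 / eps).imag

    # Dense trapezoidal grid; the tail beyond w=2000 contributes only ~1e-4.
    w = np.linspace(1e-6, 2000.0, 2_000_000)
    y = w * elf(w)
    integral = float(np.sum(y[1:] + y[:-1]) * 0.5 * (w[1] - w[0]))

    print(integral, np.pi / 2 * wp**2)        # both close to 1.5708
    ```

    The same kind of numerical check applies to any redistribution of the imaginary part of the dielectric function: as long as the f-sum rule holds, the total oscillator strength is preserved.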

  5. Advanced Extravehicular Mobility Unit Informatics Software Design

    NASA Technical Reports Server (NTRS)

    Wright, Theodore

    2014-01-01

    This is a description of the software design for the 2013 edition of the Advanced Extravehicular Mobility Unit (AEMU) Informatics computer assembly. The Informatics system is an optional part of the space suit assembly. It adds a graphical interface for displaying suit status, timelines, procedures, and caution and warning information. In the future it will display maps with GPS position data, and video and still images captured by the astronaut.

  6. Empirical studies of software design: Implications for SSEs

    NASA Technical Reports Server (NTRS)

    Krasner, Herb

    1988-01-01

    Implications for Software Engineering Environments (SEEs) are presented in viewgraph format for characteristics of projects studied; significant problems and crucial problem areas in software design for large systems; layered behavioral model of software processes; implications of field study results; software project as an ecological system; results of the LIFT study; information model of design exploration; software design strategies; results of the team design study; and a list of publications.

  7. GEANT4 Simulation of Hadronic Interactions at 8-GeV/C to 10-GeV/C: Response to the HARP-CDP Group

    SciTech Connect

    Uzhinsky, V.; Apostolakis, J.; Folger, G.; Ivanchenko, V.N.; Kossov, M.V.; Wright, D.H.; /SLAC

    2011-11-21

    The results of the HARP-CDP group on the comparison of GEANT4 Monte Carlo predictions versus experimental data are discussed. It is shown that the problems observed by the group are caused by an incorrect implementation of old features at the programming level, and by a lack of the nucleon Fermi motion in the simulation of quasielastic scattering. These drawbacks are not due to the physical models used. They do not manifest themselves in the most important applications of the GEANT4 toolkit.

  8. Performance evaluation of multi sampling ionization chamber for heavy ion beams by comparison with GEANT4 simulation

    NASA Astrophysics Data System (ADS)

    Kanke, Yuki; Himac H093 Collaboration

    2014-09-01

    In high-energy heavy-ion accelerator facilities, multi-sampling ionization chambers are often used for identifying the atomic number Z by detecting the energy deposited in them. In a study at GSI, the picture of the escape of secondary electrons, δ rays, from the ionization chamber explains the experimental data on pulse-height resolution. If this picture is correct, the pulse-height resolution should depend on the effective area of the ionization chamber. The experiment was performed at NIRS-HIMAC. The pulse-height resolutions of two ionization chambers with different effective areas were compared using a 400-MeV/u Ni beam and its fragments. A difference in the pulse-height resolutions was observed. By comparison with a GEANT4 simulation including δ-ray emission, the performance of the ionization chambers has been evaluated.

  9. Yields of positron and positron emitting nuclei for proton and carbon ion radiation therapy: a simulation study with GEANT4.

    PubMed

    Lau, Andy; Chen, Yong; Ahmad, Salahuddin

    2012-01-01

    A Monte Carlo application is developed to investigate the yields of positron-emitting nuclei (PEN) used for proton and carbon ion range verification techniques using the GEANT4 Toolkit. A base physics list was constructed and used to simulate protons and carbon ions incident on a PMMA or water phantom using pencil-like beams. In each simulation the total yields of PEN are counted, and both the PEN and their associated positron depth distributions were recorded and compared to the incident radiation's Bragg peak. Alterations to the physics list were then performed to investigate the dependence of the PEN yields on the choice of physics list. In our study, we conclude that the yields of PEN can be estimated using the physics list presented here for range verification of incident protons and carbon ions.

  10. Spallation Source Modelling for an ADS Using the MCNPX and GEANT4 Packages for Sensitivity Analysis of Reactivity

    NASA Astrophysics Data System (ADS)

    Antolin, M. Q.; Marinho, F.; Palma, D. A. P.; Martinez, A. S.

    2014-04-01

    A simulation of the time evolution of the MYRRHA conceptual reactor was developed. The SERPENT code was used to simulate the nuclear fuel depletion, and the spallation source which drives the system was simulated using both the MCNPX and GEANT4 packages. The results obtained for the neutron energy spectrum from spallation are consistent with each other and were used as input for the SERPENT code, which simulated a constant-power operation regime. The results show that the criticality of the system is not sensitive to the spallation models employed, and only relatively small deviations with respect to the inverse kinetic model derived from the point kinetic equations proposed by Gandini were observed.

  11. Software archeology: a case study in software quality assurance and design

    SciTech Connect

    Macdonald, John M; Lloyd, Jane A; Turner, Cameron J

    2009-01-01

    Ideally, quality is designed into software, just as quality is designed into hardware. However, when dealing with legacy systems, demonstrating that the software meets required quality standards may be difficult to achieve. As the need to demonstrate the quality of existing software was recognized at Los Alamos National Laboratory (LANL), an effort was initiated to uncover and demonstrate that legacy software met the required quality standards. This effort led to the development of a reverse engineering approach referred to as software archaeology. This paper documents the software archaeology approaches used at LANL to document legacy software systems. A case study for the Robotic Integrated Packaging System (RIPS) software is included.

  12. j5 DNA assembly design automation software.

    PubMed

    Hillson, Nathan J; Rosengarten, Rafael D; Keasling, Jay D

    2012-01-20

    Recent advances in Synthetic Biology have yielded standardized and automatable DNA assembly protocols that enable a broad range of biotechnological research and development. Unfortunately, the experimental design required for modern scar-less multipart DNA assembly methods is frequently laborious, time-consuming, and error-prone. Here, we report the development and deployment of a web-based software tool, j5, which automates the design of scar-less multipart DNA assembly protocols including SLIC, Gibson, CPEC, and Golden Gate. The key innovations of the j5 design process include cost optimization, leveraging DNA synthesis when cost-effective to do so, the enforcement of design specification rules, hierarchical assembly strategies to mitigate likely assembly errors, and the instruction of manual or automated construction of scar-less combinatorial DNA libraries. Using a GFP expression testbed, we demonstrate that j5 designs can be executed with the SLIC, Gibson, or CPEC assembly methods, used to build combinatorial libraries with the Golden Gate assembly method, and applied to the preparation of linear gene deletion cassettes for E. coli. The DNA assembly design algorithms reported here are generally applicable to broad classes of DNA construction methodologies and could be implemented to supplement other DNA assembly design tools. Taken together, these innovations save researchers time and effort, reduce the frequency of user design errors and off-target assembly products, decrease research costs, and enable scar-less multipart and combinatorial DNA construction at scales unfeasible without computer-aided design.

  13. Participatory Design Activities and Agile Software Development

    NASA Astrophysics Data System (ADS)

    Kautz, Karlheinz

    This paper contributes to the study of design activities in information systems development. It provides a case study of a large agile development project and focuses on how customers and users participated in agile development and design activities in practice. The investigated project utilized the agile method eXtreme Programming. Planning games, user stories and story cards, working software, and acceptance tests structured the customer and user involvement. We found genuine customer and user involvement in the design activities in the form of both direct and indirect participation in the agile development project. The involved customer representatives played informative, consultative, and participative roles in the project. This led to their functional empowerment: the users were enabled to carry out their work to their own satisfaction and in an effective, efficient, and economical manner.

  14. COG Software Architecture Design Description Document

    SciTech Connect

    Buck, R M; Lent, E M

    2009-09-21

    This COG Software Architecture Design Description Document describes the organization and functionality of the COG Multiparticle Monte Carlo Transport Code for radiation shielding and criticality calculations, at a level of detail suitable for guiding a new code developer in the maintenance and enhancement of COG. The intended audience also includes managers and scientists and engineers who wish to have a general knowledge of how the code works. This Document is not intended for end-users. This document covers the software implemented in the standard COG Version 10, as released through RSICC and IAEA. Software resources provided by other institutions will not be covered. This document presents the routines grouped by modules and in the order of the three processing phases. Some routines are used in multiple phases. The routine description is presented once - the first time the routine is referenced. Since this is presented at the level of detail for guiding a new code developer, only the routines invoked by another routine that are significant for the processing phase that is being detailed are presented. An index to all routines detailed is included. Tables for the primary data structures are also presented.

  15. SU-E-T-531: Performance Evaluation of Multithreaded Geant4 for Proton Therapy Dose Calculations in a High Performance Computing Facility

    SciTech Connect

    Shin, J; Coss, D; McMurry, J; Farr, J; Faddegon, B

    2014-06-01

    Purpose: To evaluate the efficiency of multithreaded Geant4 (Geant4-MT, version 10.0) for proton Monte Carlo dose calculations using a high performance computing facility. Methods: Geant4-MT was used to calculate 3D dose distributions in 1×1×1 mm3 voxels in a water phantom and a patient's head with a 150 MeV proton beam covering approximately 5×5 cm2 in the water phantom. Three timestamps were measured on the fly to separately analyze the time required for initialization (which cannot be parallelized), the processing time of individual threads, and the completion time. Scalability of the averaged processing time per thread was calculated as a function of thread number (1, 100, 150, and 200) for both 1 M and 50 M histories. The total memory usage was recorded. Results: Simulations with 50 M histories were fastest with 100 threads, taking approximately 1.3 hours and 6 hours for the water phantom and the CT data, respectively, with better than 1.0% statistical uncertainty. The calculations show 1/N scalability in the event loops for both cases. The gains from parallel calculation started to decrease with 150 threads. The memory usage increases linearly with the number of threads. No critical failures were observed during the simulations. Conclusion: Multithreading in Geant4-MT decreased simulation time in proton dose distribution calculations by factors of 64 and 54 at a near-optimal 100 threads for the water phantom and the patient data, respectively. Further simulations will be done to determine the efficiency at the optimal thread number. Considering the trend of computer architecture development, utilizing Geant4-MT for radiotherapy simulations is an excellent cost-effective alternative to a distributed batch queuing system. However, because the scalability depends highly on simulation details, i.e., the ratio of the processing time of one event versus the waiting time to access the shared event queue, a performance evaluation as described is recommended.
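
    The timing decomposition in the abstract (a serial initialization phase plus an event loop that scales roughly as 1/N with thread count) can be sketched with a minimal Amdahl-style model. The timing constants below are hypothetical, not the measured values:

    ```python
    # Amdahl-style wall-time model: serial init + event loop split across threads.
    # Illustrative only; real Geant4-MT runs also pay contention and I/O costs.

    def wall_time(n_threads, t_init, t_loop_serial):
        """Total wall time = serial initialization + 1/N-scaled event loop."""
        return t_init + t_loop_serial / n_threads

    T_INIT, T_LOOP = 60.0, 120_000.0   # seconds (hypothetical values)

    for n in (1, 100, 150, 200):
        t = wall_time(n, T_INIT, T_LOOP)
        speedup = wall_time(1, T_INIT, T_LOOP) / t
        print(f"{n:4d} threads: {t / 3600.0:6.2f} h, speedup {speedup:6.1f}x")
    ```

    In this pure 1/N model the speedup saturates only because of the serial term; the diminishing returns above 100 threads reported in the abstract point to additional contended costs (e.g. the shared event queue) that the model deliberately omits.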

  16. Behaviors of the percentage depth dose curves along the beam axis of a phantom filled with different clinical PTO objects, a Monte Carlo Geant4 study

    NASA Astrophysics Data System (ADS)

    EL Bakkali, Jaafar; EL Bardouni, Tarek; Safavi, Seyedmostafa; Mohammed, Maged; Saeed, Mroan

    2016-08-01

    The aim of this work is to assess the ability of the Monte Carlo Geant4 code to reproduce the real percentage depth dose (PDD) curves generated in phantoms which mimic three important clinical treatment situations: lung slab, bone slab, and bone-lung slab geometries. It is hoped that this work will lead to a better understanding of dose distributions in inhomogeneous media and identify any limitations of the dose calculation algorithm implemented in the Geant4 code. For this purpose, the PDD dosimetric functions associated with the three clinical situations described above were compared to one produced in a homogeneous water phantom. Our results show, firstly, that the Geant4 simulation exhibits errors in the shape of the calculated PDD curve of the first physical test object (PTO), and that it is not able to successfully predict dose values in regions near the boundaries between two different materials. This is likely due to the electron transport algorithm and is well known as the interface-artifact phenomenon. To deal with this issue, we added and optimized the StepMax parameter in the dose calculation program; consequently, the artifacts due to electron transport largely disappeared. However, the Geant4 simulation becomes painfully slow when we attempt to completely resolve the electron artifact problem by choosing a smaller value of the electron StepMax parameter. After electron transport optimization, our results demonstrate the medium-level capability of the Geant4 code to model dose distributions in clinical PTO objects.

  17. GEANT4 Simulation for the Zenith Angle Dependence of Cosmic Muon Intensities at Two Different Geomagnetic Locations

    NASA Astrophysics Data System (ADS)

    Arslan, Halil; Bektasoglu, Mehmet

    2013-06-01

    The zenith angle dependence of the cosmic muon flux at sea level in the western, eastern, southern and northern azimuths has been investigated separately for Calcutta, India and Melbourne, Australia for muon momenta up to 500 GeV/c using the Geant4 simulation package. These two locations were selected because they differ significantly in geomagnetic cutoff rigidity. The exponent n, defined by the relation I(θ) = I(0°)cos^n θ, was obtained for each azimuth in Calcutta and Melbourne. After establishing agreement between the simulation results and the experimental ones, the simulation study was extended to different azimuth angles and higher muon momenta. It was shown that the angular dependence of the cosmic muon intensity decreases with increasing muon momentum at both locations. Moreover, the exponent becomes independent of both geomagnetic location and azimuth angle for muons with momentum above 10 GeV/c, and it is nearly zero above 50 GeV/c. Therefore, it can be concluded that cosmic muons with momenta between 50 GeV/c and 500 GeV/c reach sea level almost isotropically.
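
    The exponent n in I(θ) = I(0°)cos^n θ is typically extracted by a linear fit of log I against log cos θ. A minimal sketch with synthetic data (n = 2 standing in for simulated muon intensities; this is not the authors' code):

    ```python
    import numpy as np

    # Synthetic intensity samples following I(theta) = I0 * cos^n(theta).
    theta = np.radians([0.0, 15.0, 30.0, 45.0, 60.0, 75.0])
    n_true, I0 = 2.0, 100.0
    I = I0 * np.cos(theta) ** n_true

    # log I = log I0 + n * log cos(theta)  ->  slope = n, intercept = log I0.
    n_fit, log_I0 = np.polyfit(np.log(np.cos(theta)), np.log(I), 1)

    print(n_fit, np.exp(log_I0))   # recovers ~2.0 and ~100.0
    ```

    With real (noisy) intensities the same fit applies; the θ = 0° point contributes only to the intercept, since log cos 0° = 0.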

  18. DETECTORS AND EXPERIMENTAL METHODS: Study of neutron response for two hybrid RPC setups using the GEANT4 MC simulation approach

    NASA Astrophysics Data System (ADS)

    M., Jamil; Rhee T., J.; Jeon J., Y.

    2009-10-01

    The present article describes a detailed neutron simulation study in the energy range 10⁻¹⁰ MeV to 1.0 GeV for two different RPC configurations. The simulation studies were carried out using the GEANT4 MC code. Aluminum was used for the ground (GND) and readout strips of (a) Bakelite-based and (b) glass-based RPCs. For the former RPC setup the neutron sensitivity for an isotropic source was Sn = 2.702 × 10⁻² at En = 1.0 GeV, while for the latter the neutron sensitivity for the same source was evaluated as Sn = 4.049 × 10⁻² at En = 1.0 GeV. These results were further compared with a previous RPC configuration in which copper was used for the ground and pickup pads. Additionally, Al was employed for the GND and strips of the phosphate-glass RPC setup and compared with the copper-based phosphate-glass RPC. Good agreement between the sensitivity values from the current and previous simulation results was obtained.

  19. Analysis of the physical interactions of therapeutic proton beams in water with the use of Geant4 Monte Carlo calculations.

    PubMed

    Morávek, Zdenek; Bogner, Ludwig

    2009-01-01

    The processes that occur when protons traverse a medium are investigated theoretically for the full therapeutic range of energies [20 MeV, 220 MeV]. The investigation is undertaken using the Geant4 toolkit for a water medium. The beam is simulated only inside the phantom; beamline effects are included in the overall beam properties, such as lateral width and momentum bandwidth. Every energy deposition is catalogued according to the particle and the process that caused it, and the catalogued depositions are analysed statistically. Only a few important processes, such as proton ionisation and nuclear scattering (elastic/inelastic), constitute the main features of the energy distribution. At the same time, processes concerning electrons are invoked very often without an obvious effect on the result; such processes can therefore be approximated in simulation codes in order to improve performance. Neutron depositions are most important before the Bragg peak, yet they are an order of magnitude smaller than those of protons. In the region behind the Bragg peak, only a small number of neutrons is created in the simulation, and their energy contribution through secondary protons is orders of magnitude smaller than the effect of proton-produced secondary protons within the Bragg peak. Hence, the effects of neutrons created in the calculation volume can be neglected. PMID:19761094

  20. ClassCompass: A Software Design Mentoring System

    ERIC Educational Resources Information Center

    Coelho, Wesley; Murphy, Gail

    2007-01-01

    Becoming a quality software developer requires practice under the guidance of an expert mentor. Unfortunately, in most academic environments, there are not enough experts to provide any significant design mentoring for software engineering students. To address this problem, we present a collaborative software design tool intended to maximize an…

  1. Assessing the Design of Instructional Software.

    ERIC Educational Resources Information Center

    Bangert-Drowns, Robert L.; Kozma, Robert B.

    1989-01-01

    Describes assessment procedures used to select winners of the EDUCOM/NCRIPTAL (National Center for Research to Improve Postsecondary Teaching and Learning) Higher Education Software Awards program; presents the evaluative criteria used for software assessment; and lists the award-winning software for 1987. (32 references) (LRW)

  2. Analysis of the track- and dose-averaged LET and LET spectra in proton therapy using the geant4 Monte Carlo code

    PubMed Central

    Guan, Fada; Peeler, Christopher; Bronk, Lawrence; Geng, Changran; Taleei, Reza; Randeniya, Sharmalee; Ge, Shuaiping; Mirkovic, Dragan; Grosshans, David; Mohan, Radhe; Titt, Uwe

    2015-01-01

    Purpose: The motivation of this study was to find and eliminate the cause of errors in dose-averaged linear energy transfer (LET) calculations from therapeutic protons in small targets, such as biological cell layers, calculated using the Geant4 Monte Carlo code. Furthermore, the purpose was also to provide a recommendation to select an appropriate LET quantity from Geant4 simulations to correlate with biological effectiveness of therapeutic protons. Methods: The authors developed a particle-tracking-step-based strategy to calculate the average LET quantities (track-averaged LET, LETt and dose-averaged LET, LETd) using Geant4 for different tracking step size limits. A step size limit refers to the maximally allowable tracking step length. The authors investigated how the tracking step size limit influenced the calculated LETt and LETd of protons with six different step limits ranging from 1 to 500 μm in a water phantom irradiated by a 79.7-MeV clinical proton beam. In addition, the authors analyzed the detailed stochastic energy deposition information including fluence spectra and dose spectra of the energy-deposition-per-step of protons. As a reference, the authors also calculated the averaged LET and analyzed the LET spectra combining the Monte Carlo method and the deterministic method. Relative biological effectiveness (RBE) calculations were performed to illustrate the impact of different LET calculation methods on the RBE-weighted dose. Results: Simulation results showed that the step limit effect was small for LETt but significant for LETd. This resulted from differences in the energy-deposition-per-step between the fluence spectra and dose spectra at different depths in the phantom. Using the Monte Carlo particle tracking method in Geant4 can result in incorrect LETd calculation results in the dose plateau region for small step limits. The erroneous LETd results can be attributed to the algorithm to determine fluctuations in energy deposition along the
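    The distinction between the two averages that drives the step-limit effect can be sketched numerically. The step data below are invented for illustration, not taken from the simulations in the abstract:

```python
# Step-level data: (energy deposited in the step [keV], step length [um]).
steps = [(0.5, 1.0), (1.2, 1.0), (8.0, 1.0), (0.3, 1.0)]

def let_track_averaged(steps):
    """Fluence (track) average: plain mean of dE/dx over all steps."""
    lets = [de / dx for de, dx in steps]
    return sum(lets) / len(lets)

def let_dose_averaged(steps):
    """Dose average: each step's dE/dx weighted by the energy it deposits,
    so rare high-deposition steps dominate the result."""
    num = sum((de / dx) * de for de, dx in steps)
    den = sum(de for de, _ in steps)
    return num / den

print(let_track_averaged(steps))  # 2.5
print(let_dose_averaged(steps))   # larger: dominated by the 8 keV step
```

    Because LETd weights by deposited energy, a single fluctuation-driven large deposition in a short step skews it strongly, which is why the abstract finds LETd, but not LETt, sensitive to the step size limit.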

  3. Designing and Using Software for the Learning Disabled.

    ERIC Educational Resources Information Center

    Weisgerber, Robert A.; Rubin, David P.

    1985-01-01

    Basic principles of effective software implementation with learning disabled students are discussed. A prototype software package is described that is specifically designed to help develop discriminatory skills in recognizing letter shapes and letter combinations. (JW)

  4. PD5: A General Purpose Library for Primer Design Software

    PubMed Central

    Riley, Michael C.; Aubrey, Wayne; Young, Michael; Clare, Amanda

    2013-01-01

    Background Complex PCR applications for large genome-scale projects require fast, reliable and often highly sophisticated primer design software applications. Presently, such applications use pipelining methods to harness many third-party applications, which involves file parsing, interfacing and data conversion that are slow and prone to error. A fully integrated suite of software tools for primer design would considerably improve the development time, the processing speed, and the reliability of bespoke primer design software applications. Results The PD5 software library is an open-source collection of classes and utilities providing a complete set of software building blocks for primer design and analysis. It is written in object-oriented C++ with an emphasis on classes suitable for efficient and rapid development of bespoke primer design programs. The modular design of the software library simplifies the development of specific applications and also integration with existing third-party software where necessary. We demonstrate several applications created using this software library that have already proved to be effective, but we view the project as a dynamic environment for building primer design software, and it is open for future development by the bioinformatics community. Therefore, the PD5 software library is published under the terms of the GNU General Public License, which guarantees access to the source code and allows redistribution and modification. Conclusions The PD5 software library is downloadable from Google Code and the accompanying Wiki includes instructions and examples: http://code.google.com/p/primer-design PMID:24278254
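    As a rough illustration of the kind of primitive such a primer-design library must supply (PD5's actual API is not shown here), the classic Wallace rule for the melting temperature of short oligos, together with GC content, takes only a few lines:

```python
def gc_content(seq):
    """Fraction of G and C bases in the sequence."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

def wallace_tm(seq):
    """Wallace rule Tm (degC) for short oligos: 2*(A+T) + 4*(G+C).
    A crude estimate; real libraries use nearest-neighbour models."""
    seq = seq.upper()
    at = seq.count("A") + seq.count("T")
    gc = seq.count("G") + seq.count("C")
    return 2 * at + 4 * gc

primer = "ATGCGTACGTTAGC"  # hypothetical 14-mer for illustration
print(gc_content(primer))  # -> 0.5
print(wallace_tm(primer))  # -> 42
```

    An integrated library bundles many such checks (specificity, hairpins, dimers) behind one API, which is the pipelining pain point the abstract describes.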

  5. Determination of age specific ¹³¹I S-factor values for thyroid using anthropomorphic phantom in Geant4 simulations.

    PubMed

    Rahman, Ziaur; Ahmad, Syed Bilal; Mirza, Sikander M; Arshed, Waheed; Mirza, Nasir M; Ahmed, Waheed

    2014-08-01

    Using an anthropomorphic phantom in Geant4, the β- and γ-absorbed fractions and the energy absorbed per event due to ¹³¹I activity in the thyroid have been determined for individuals of various age groups and for several geometrical models. In the case of ¹³¹I β-particles, the values of the absorbed fraction increased from 0.88 to 0.97 with fetal age. The maximum difference in absorbed energy per decay between soft tissue and water is 7.2% for γ-rays and 0.4% for β-particles. The new mathematical MIRD embedded in Geant4 (MEG) model and the two-lobe ellipsoidal model developed in this work have 4.3% and 2.9% lower S-factor values, respectively, compared with the ORNL data.
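    The MIRD-style S-factor the study computes can be illustrated schematically. The mean β energy, absorbed fraction and thyroid mass below are rough textbook-level numbers chosen for illustration, not the paper's results:

```python
MEV_TO_J = 1.602e-13  # joules per MeV

def s_factor(energy_per_decay_mev, absorbed_fraction, target_mass_kg):
    """MIRD-style S-factor in Gy per decay: S = E * phi / m."""
    return energy_per_decay_mev * MEV_TO_J * absorbed_fraction / target_mass_kg

# I-131 beta component, illustrative inputs: ~0.192 MeV mean beta energy
# per decay, absorbed fraction ~0.97, adult thyroid mass ~20 g.
s_beta = s_factor(0.192, 0.97, 0.020)
print(f"{s_beta:.3e} Gy per decay")
```

    The Geant4 simulation in the paper effectively evaluates E·φ event by event in a realistic phantom geometry rather than from tabulated fractions.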

  6. Engineering Software Suite Validates System Design

    NASA Technical Reports Server (NTRS)

    2007-01-01

    EDAptive Computing Inc.'s (ECI) EDAstar engineering software tool suite, created to capture and validate system design requirements, was significantly funded by NASA's Ames Research Center through five Small Business Innovation Research (SBIR) contracts. These programs specifically developed Syscape, used to capture executable specifications of multi-disciplinary systems, and VectorGen, used to automatically generate tests to ensure system implementations meet specifications. According to the company, the VectorGen tests considerably reduce the time and effort required to validate the implementation of components, thereby ensuring their safe and reliable operation. EDASHIELD, an additional product offering from ECI, can be used to diagnose, predict, and correct errors after a system has been deployed using EDAstar-created models. Initial commercialization for EDAstar included application by a large prime contractor in a military setting, and customers include various branches within the U.S. Department of Defense, industry giants like the Lockheed Martin Corporation, Science Applications International Corporation, and Ball Aerospace and Technologies Corporation, as well as NASA's Langley and Glenn Research Centers.

  7. GATE as a GEANT4-based Monte Carlo platform for the evaluation of proton pencil beam scanning treatment plans.

    PubMed

    Grevillot, L; Bertrand, D; Dessy, F; Freud, N; Sarrut, D

    2012-07-01

    Active scanning delivery systems take full advantage of ion beams to best conform to the tumor and to spare surrounding healthy tissues; however, it is also a challenging technique for quality assurance. In this perspective, we upgraded the GATE/GEANT4 Monte Carlo platform in order to recalculate the treatment planning system (TPS) dose distributions for active scanning systems. A method that allows evaluating the TPS dose distributions with the GATE Monte Carlo platform has been developed and applied to the XiO TPS (Elekta), for the IBA proton pencil beam scanning (PBS) system. First, we evaluated the specificities of each dose engine. A dose-conversion scheme that allows one to convert dose to medium into dose to water was implemented within GATE. Specific test cases in homogeneous and heterogeneous configurations allowed for the estimation of the differences between the beam models implemented in XiO and GATE. Finally, dose distributions of a prostate treatment plan were compared. In homogeneous media, a satisfactory agreement was generally obtained between XiO and GATE. The maximum stopping power difference of 3% occurred in a human tissue of 0.9 g cm(-3) density and led to a significant range shift. Comparisons in heterogeneous configurations pointed out the limits of the TPS dose calculation accuracy and the superiority of Monte Carlo simulations. The necessity of computing dose to water in our Monte Carlo code for comparisons with TPSs is also presented. Finally, the new capabilities of the platform are applied to a prostate treatment plan and dose differences between both dose engines are analyzed in detail. This work presents a generic method to compare TPS dose distributions with the GATE Monte Carlo platform. It is noteworthy that GATE is also a convenient tool for imaging applications, therefore opening new research possibilities for the PBS modality.
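    The dose-to-medium to dose-to-water conversion implemented in GATE reduces, in a Bragg-Gray picture, to multiplication by a water-to-medium mass stopping power ratio. A minimal sketch with an assumed ratio (the value below is illustrative; the abstract reports stopping-power differences of up to 3% in some tissues):

```python
def dose_to_water(dose_to_medium, sp_ratio_water_to_medium):
    """Convert D_med to D_w using the water-to-medium unrestricted mass
    stopping power ratio: D_w = D_med * (S/rho)_w / (S/rho)_med."""
    return dose_to_medium * sp_ratio_water_to_medium

# Illustrative voxel: 2.00 Gy scored to medium in a tissue where the
# proton stopping-power ratio water/medium is assumed to be 1.03.
print(dose_to_water(2.00, 1.03))  # Gy to water
```

    In practice the ratio is energy- and material-dependent, which is why the conversion must be applied per voxel and per particle spectrum rather than as a single global factor.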

  8. Benchmarking and validation of a Geant4-SHADOW Monte Carlo simulation for dose calculations in microbeam radiation therapy.

    PubMed

    Cornelius, Iwan; Guatelli, Susanna; Fournier, Pauline; Crosbie, Jeffrey C; Sanchez Del Rio, Manuel; Bräuer-Krisch, Elke; Rosenfeld, Anatoly; Lerch, Michael

    2014-05-01

    Microbeam radiation therapy (MRT) is a synchrotron-based radiotherapy modality that uses high-intensity beams of spatially fractionated radiation to treat tumours. The rapid evolution of MRT towards clinical trials demands accurate treatment planning systems (TPS), as well as independent tools for the verification of TPS calculated dose distributions in order to ensure patient safety and treatment efficacy. Monte Carlo computer simulation represents the most accurate method of dose calculation in patient geometries and is best suited for the purpose of TPS verification. A Monte Carlo model of the ID17 biomedical beamline at the European Synchrotron Radiation Facility has been developed, including recent modifications, using the Geant4 Monte Carlo toolkit interfaced with the SHADOW X-ray optics and ray-tracing libraries. The code was benchmarked by simulating dose profiles in water-equivalent phantoms subject to irradiation by broad-beam (without spatial fractionation) and microbeam (with spatial fractionation) fields, and comparing against those calculated with a previous model of the beamline developed using the PENELOPE code. Validation against additional experimental dose profiles in water-equivalent phantoms subject to broad-beam irradiation was also performed. Good agreement between codes was observed, with the exception of out-of-field doses and toward the field edge for larger field sizes. Microbeam results showed good agreement between both codes and experimental results within uncertainties. Results of the experimental validation showed agreement for different beamline configurations. The asymmetry in the out-of-field dose profiles due to polarization effects was also investigated, yielding important information for the treatment planning process in MRT. This work represents an important step in the development of a Monte Carlo-based independent verification tool for treatment planning in MRT.

  9. Sensitivity analysis for liver iron measurement through neutron stimulated emission computed tomography: a Monte Carlo study in GEANT4.

    PubMed

    Agasthya, G A; Harrawood, B C; Shah, J P; Kapadia, A J

    2012-01-01

    Neutron stimulated emission computed tomography (NSECT) is being developed as a non-invasive imaging modality to detect and quantify iron overload in the human liver. NSECT uses gamma photons emitted by the inelastic interaction between monochromatic fast neutrons and iron nuclei in the body to detect and quantify the disease. Previous simulated and physical experiments with phantoms have shown that NSECT has the potential to accurately diagnose iron overload with reasonable levels of radiation dose. In this work, we describe the results of a simulation study conducted to determine the sensitivity of the NSECT system for hepatic iron quantification in patients of different sizes. A GEANT4 simulation of the NSECT system was developed with a human liver and two torso sizes corresponding to small and large patients. The iron concentration in the liver ranged between 0.5 and 20 mg g(-1), corresponding to clinically reported iron levels in iron-overloaded patients. High-purity germanium gamma detectors were simulated to detect the emitted gamma spectra, which were background corrected using suitable water phantoms and analyzed to determine the minimum detectable level (MDL) of iron and the sensitivity of the NSECT system. These analyses indicate that for a small patient (torso major axis = 30 cm) the MDL is 0.5 mg g(-1) and sensitivity is ∼13 ± 2 Fe counts/mg/mSv and for a large patient (torso major axis = 40 cm) the values are 1 mg g(-1) and ∼5 ± 1 Fe counts/mg/mSv, respectively. The results demonstrate that the MDL for both patient sizes lies within the clinically significant range for human iron overload.
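    The sensitivity figure of merit used here (Fe counts per mg of iron per mSv of dose) is a simple normalisation; the counts, iron mass and dose below are invented for illustration, not the simulated values:

```python
def sensitivity(fe_counts, iron_mass_mg, dose_msv):
    """Background-corrected iron counts normalised per mg of iron
    and per mSv of delivered dose."""
    return fe_counts / (iron_mass_mg * dose_msv)

# Illustrative scan: 1300 background-corrected Fe counts from a scan
# delivering 2 mSv to a liver containing 50 mg of iron.
print(sensitivity(1300, 50, 2.0))  # -> 13.0 counts/mg/mSv
```

    A larger torso attenuates both the incident neutrons and the emitted gammas, which is why the same normalisation yields a lower value (~5 counts/mg/mSv) for the large-patient geometry.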

  10. Sensitivity analysis for liver iron measurement through neutron stimulated emission computed tomography: a Monte Carlo study in GEANT4

    NASA Astrophysics Data System (ADS)

    Agasthya, G. A.; Harrawood, B. C.; Shah, J. P.; Kapadia, A. J.

    2012-01-01

    Neutron stimulated emission computed tomography (NSECT) is being developed as a non-invasive imaging modality to detect and quantify iron overload in the human liver. NSECT uses gamma photons emitted by the inelastic interaction between monochromatic fast neutrons and iron nuclei in the body to detect and quantify the disease. Previous simulated and physical experiments with phantoms have shown that NSECT has the potential to accurately diagnose iron overload with reasonable levels of radiation dose. In this work, we describe the results of a simulation study conducted to determine the sensitivity of the NSECT system for hepatic iron quantification in patients of different sizes. A GEANT4 simulation of the NSECT system was developed with a human liver and two torso sizes corresponding to small and large patients. The iron concentration in the liver ranged between 0.5 and 20 mg g-1 (in this paper all iron concentrations with units mg g-1 refer to wet-weight concentrations), corresponding to clinically reported iron levels in iron-overloaded patients. High-purity germanium gamma detectors were simulated to detect the emitted gamma spectra, which were background corrected using suitable water phantoms and analyzed to determine the minimum detectable level (MDL) of iron and the sensitivity of the NSECT system. These analyses indicate that for a small patient (torso major axis = 30 cm) the MDL is 0.5 mg g-1 and sensitivity is ~13 ± 2 Fe counts/mg/mSv and for a large patient (torso major axis = 40 cm) the values are 1 mg g-1 and ~5 ± 1 Fe counts/mg/mSv, respectively. The results demonstrate that the MDL for both patient sizes lies within the clinically significant range for human iron overload.

  11. Design Your Own Instructional Software: It's Easy.

    ERIC Educational Resources Information Center

    Pauline, Ronald F.

    Computer Assisted Instruction (CAI) is, quite simply, the delivery of instructional content and activities via a computer. Many commercially available software programs, although excellent, may not be acceptable for each individual teacher's classroom. One way to ensure that software is not only acceptable but also targets…

  12. Comparison of PHITS, GEANT4, and HIBRAC simulations of depth-dependent yields of β(+)-emitting nuclei during therapeutic particle irradiation to measured data.

    PubMed

    Rohling, Heide; Sihver, Lembit; Priegnitz, Marlen; Enghardt, Wolfgang; Fiedler, Fine

    2013-09-21

    For quality assurance in particle therapy, a non-invasive, in vivo range verification is highly desired. Particle therapy positron-emission-tomography (PT-PET) is up to now the only clinically proven method for this purpose. It makes use of the β(+)-activity produced during the irradiation by nuclear fragmentation processes between the therapeutic beam and the irradiated tissue. Since a direct comparison of β(+)-activity and dose is not feasible, a simulation of the expected β(+)-activity distribution is required. For this reason it is essential to have a quantitatively reliable code for simulating the yields of the β(+)-emitting nuclei at every position of the beam path. In this paper, results of the three-dimensional Monte Carlo simulation codes PHITS and GEANT4 and the one-dimensional deterministic simulation code HIBRAC are compared to measurements of the yields of the most abundant β(+)-emitting nuclei for carbon, lithium, helium, and proton beams. In general, PHITS underestimates the yields of positron emitters. The most accurate overall results are obtained with GEANT4. HIBRAC and GEANT4 provide comparable results for carbon and proton beams. HIBRAC is considered a good candidate for implementation in clinical routine PT-PET.

  13. The Software Design Document: More than a User's Manual.

    ERIC Educational Resources Information Center

    Bowers, Dennis

    1989-01-01

    Discusses the value of creating design documentation for computer software so that it may serve as a model for similar design efforts. Components of the software design document are described, including program flowcharts, graphic representation of screen displays, storyboards, and evaluation procedures. An example is given using HyperCard. (three…

  14. Measuring the development process: A tool for software design evaluation

    NASA Technical Reports Server (NTRS)

    Moy, S. S.

    1980-01-01

    The design metrics evaluator (DME), a component of an automated software design analysis system, is described. The DME quantitatively evaluates software design attributes. Its use directs attention to areas of a procedure, module, or complete program having a high potential for error.

  15. JPL Facilities and Software for Collaborative Design: 1994 - Present

    NASA Technical Reports Server (NTRS)

    DeFlorio, Paul A.

    2004-01-01

    The viewgraph presentation provides an overview of the history of the JPL Project Design Center (PDC) and, since 2000, the Center for Space Mission Architecture and Design (CSMAD). The discussion includes PDC objectives and scope; mission design metrics; distributed design; a software architecture timeline; facility design principles; optimized design for group work; CSMAD plan view, facility design, and infrastructure; and distributed collaboration tools.

  16. NASA software specification and evaluation system design, part 2

    NASA Technical Reports Server (NTRS)

    1976-01-01

    A survey and analysis of the existing methods, tools and techniques employed in the development of software are presented, along with recommendations for the construction of reliable software. Functional designs for the software specification language and the database verifier are presented.

  17. SEPAC flight software detailed design specifications, volume 1

    NASA Technical Reports Server (NTRS)

    1982-01-01

    The detailed design specifications (as built) for the SEPAC Flight Software are defined. The design includes a description of the total software system and of each individual module within the system. The design specifications describe the decomposition of the software system into its major components. The system structure is expressed in the following forms: the control-flow hierarchy of the system, the data-flow structure of the system, the task hierarchy, the memory structure, and the software-to-hardware configuration mapping. The component design description includes details on the following elements: register conventions, module (subroutine) invocation, module functions, interrupt servicing, data definitions, and database structure.

  18. WinTRAX: A raytracing software package for the design of multipole focusing systems

    NASA Astrophysics Data System (ADS)

    Grime, G. W.

    2013-07-01

    The software package TRAX was a simulation tool for modelling the path of charged particles through linear cylindrical multipole fields described by analytical expressions, and was a development of the earlier OXRAY program (Grime and Watt, 1983; Grime et al., 1982) [1,2]. In a 2005 comparison of raytracing software packages (Incerti et al., 2005) [3], TRAX/OXRAY was compared with Geant4 and Zgoubi and was found to give close agreement with the more modern codes. TRAX was a text-based program available only in a now-rare VMS workstation environment, so a new program, WinTRAX, has been developed for the Windows operating system. This implements the same basic computing strategy as TRAX, and key sections of the code are direct translations from FORTRAN to C++, but the Windows environment is exploited to provide an intuitive graphical user interface which simplifies and enhances many operations, including system definition and storage, optimisation, beam simulation (including with misaligned elements) and aberration coefficient determination. This paper describes the program and presents comparisons with other software and real installations.
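    As a toy version of the raytracing such packages perform (WinTRAX itself integrates rays through analytic multipole field expressions; the thin-lens quadrupole treatment below is a deliberate simplification), one can propagate a ray as a (position, angle) pair through drifts and focusing kicks:

```python
def drift(x, xp, length):
    """Field-free drift: position advances by angle * length."""
    return x + xp * length, xp

def thin_quad(x, xp, focal_length):
    """Thin-lens quadrupole kick in its focusing plane."""
    return x, xp - x / focal_length

# Trace one ray (drift -> quad -> drift) and check that a ray entering
# the lens parallel to the axis crosses the axis at the focal plane.
x, xp = 5e-3, 0.0               # 5 mm off-axis, parallel to axis (rad)
x, xp = drift(x, xp, 0.5)       # 0.5 m drift to the lens
x, xp = thin_quad(x, xp, 0.25)  # focal length f = 0.25 m
x, xp = drift(x, xp, 0.25)      # drift to the focal plane
print(abs(x) < 1e-12)  # -> True
```

    A full raytracer replaces the thin-lens kick with numerical integration of the equations of motion through the actual multipole field profile, which is where aberrations and misalignment effects enter.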

  19. Bias and design in software specifications

    NASA Technical Reports Server (NTRS)

    Straub, Pablo A.; Zelkowitz, Marvin V.

    1990-01-01

    Implementation bias in a specification is an arbitrary constraint in the solution space. Presented here is a model of bias in software specifications. Bias is defined in terms of the specification process and a classification of the attributes of the software product. Our definition of bias provides insight into both the origin and the consequences of bias. It also shows that bias is relative and essentially unavoidable. Finally, we describe current work on defining a measure of bias, formalizing our model, and relating bias to software defects.

  20. [Research progress of probe design software of oligonucleotide microarrays].

    PubMed

    Chen, Xi; Wu, Zaoquan; Liu, Zhengchun

    2014-02-01

    DNA microarray has become an essential medical genetic diagnostic tool thanks to its high throughput, miniaturization and automation. The design and selection of oligonucleotide probes are critical for preparing gene chips of high quality. Several probe design software packages have been developed and are now available for this task; each targets different kinds of sequences and has different advantages and limitations. In this article, the research and development of these packages are reviewed against three main criteria: specificity, sensitivity and melting temperature (Tm). In addition, based on experimental results from the literature, the packages are classified according to their applications. This review should help users choose an appropriate probe design software, reduce the costs of microarrays, improve the application efficiency of microarrays, and promote both the research and development (R&D) and the commercialization of high-performance probe design software.

  1. Design Features of Pedagogically-Sound Software in Mathematics.

    ERIC Educational Resources Information Center

    Haase, Howard; And Others

    Weaknesses in educational software currently available in the domain of mathematics are discussed. A technique that was used for the design and production of mathematics software aimed at improving problem-solving skills which combines sound pedagogy and innovative programming is presented. To illustrate the design portion of this technique, a…

  2. Simulation of the radiation exposure in space during a large solar energetic particle event with GEANT4

    NASA Astrophysics Data System (ADS)

    Matthiä, Daniel; Berger, Thomas; Puchalska, Monika; Reitz, Guenther

    in August 1972 in the energy range from 45 MeV to 1 GeV. The transport calculations of the energetic particles through the shielding and the phantom model were performed using the Monte-Carlo code GEANT4.

  3. Compton polarimeter as a focal plane detector for hard X-ray telescope: sensitivity estimation with Geant4 simulations

    NASA Astrophysics Data System (ADS)

    Chattopadhyay, T.; Vadawale, S. V.; Pendharkar, J.

    2013-04-01

    X-ray polarimetry can be an important tool for investigating various physical processes, as well as their geometries, at celestial X-ray sources. However, X-ray polarimetry has not progressed much compared to spectroscopy, timing and imaging, mainly because of its extremely photon-hungry nature, which severely limits the sensitivity of X-ray polarimeters. The great improvement in sensitivity in spectroscopy and imaging was made possible by focusing X-ray optics, which is effective only in the soft X-ray energy range. A similar improvement in the sensitivity of polarisation measurement in the soft X-ray range is expected in the near future with the advent of GEM-based photoelectric polarimeters. However, at energies >10 keV, even the spectroscopic and imaging sensitivities of X-ray detectors are limited by the lack of focusing optics, and hard X-ray polarimetry has thus so far remained a largely unexplored area. On the other hand, the polarisation degree is typically expected to increase at higher energies, as radiation from non-thermal processes constitutes the dominant fraction, so polarisation measurements in hard X-rays can yield significant insights into such processes. With the recent availability of hard X-ray optics (e.g. with the upcoming NuSTAR and Astro-H missions), which can focus X-rays from 5 keV to 80 keV, the sensitivity of X-ray detectors in the hard X-ray range is expected to improve significantly. In this context we explore the feasibility of a focal plane hard X-ray polarimeter based on Compton scattering, having a thin plastic scatterer surrounded by a cylindrical array of scintillator detectors. We have carried out a detailed Geant4 simulation to estimate the modulation factor for a 100% polarised beam as well as the polarimetric efficiency of this configuration, and we have validated these results with a semi-analytical approach. Here we present the initial results on the polarisation sensitivities of such a focal plane Compton polarimeter coupled with the reflection efficiency of present-era hard X
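    The modulation factor estimated in such a simulation, and the minimum detectable polarisation (MDP) that follows from it, can be sketched with the standard formulas; the count rates and exposure below are invented for illustration:

```python
import math

def modulation_factor(n_max, n_min):
    """mu = (Nmax - Nmin) / (Nmax + Nmin), from the azimuthal
    scattering-angle distribution for a 100% polarised beam."""
    return (n_max - n_min) / (n_max + n_min)

def mdp(mu100, source_rate, background_rate, exposure_s):
    """Minimum detectable polarisation at 99% confidence
    (standard MDP formula for photon-counting polarimeters)."""
    return 4.29 / (mu100 * source_rate) * math.sqrt(
        (source_rate + background_rate) / exposure_s)

mu = modulation_factor(600, 400)   # illustrative azimuthal-bin counts
print(mu)  # -> 0.2
print(mdp(mu, 1.0, 0.1, 1e6))      # 1 ct/s source, 0.1 ct/s bkg, 1 Ms
```

    The photon-hungry nature of polarimetry is visible in the formula: MDP scales as 1/sqrt(exposure), so tenfold better sensitivity costs a hundredfold more observing time unless μ or the collecting area improves.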

  4. Software design for professional risk evaluation

    NASA Astrophysics Data System (ADS)

    Ionescu, V.; Calea, G.; Amza, G.; Iacobescu, G.; Nitoi, D.; Dimitrescu, A.

    2016-08-01

    Professional risk evaluation is a complex activity involving every economic operator, with important repercussions for occupational health and safety. This article presents an innovative study method for professional risk analysis in which cumulative working posts are evaluated. The work presents new software that helps to bring together all the working positions of a complex organizational system and to analyze them in order to evaluate possible risks. Using this software, multiple analyses can be performed: risk estimation, risk evaluation, estimation of residual risks and, finally, a search for risk-reduction measures.

  5. Designing the Undesignable: Social Software and Control

    ERIC Educational Resources Information Center

    Dron, Jon

    2007-01-01

    Social software, such as blogs, wikis, tagging systems and collaborative filters, treats the group as a first-class object within the system. Drawing from theories of transactional distance and control, this paper proposes a model of e-learning that extends traditional concepts of learner-teacher-content interactions to include these emergent…

  6. Evaluating Educational Software Authoring Environments Using a Model Based on Software Engineering and Instructional Design Principles.

    ERIC Educational Resources Information Center

    Collis, Betty A.; Gore, Marilyn

    1987-01-01

    This study suggests a new model for the evaluation of educational software authoring systems and applies this model to a particular authoring system, CSR Trainer 4000. The model used is based on an integrated set of software engineering and instructional design principles. (Author/LRW)

  7. The waveform correlation event detection system global prototype software design

    SciTech Connect

    Beiriger, J.I.; Moore, S.G.; Trujillo, J.R.; Young, C.J.

    1997-12-01

    The WCEDS prototype software system was developed to investigate the usefulness of waveform correlation methods for CTBT monitoring. The WCEDS prototype performs global seismic event detection and has been used in numerous experiments. This report documents the software system design, presenting an overview of the system operation, describing the system functions, tracing the information flow through the system, discussing the software structures, and describing the subsystem services and interactions. The effectiveness of the software design in meeting project objectives is considered, as well as opportunities for code reuse and lessons learned from the development process. The report concludes with recommendations for modifications and additions envisioned for a regional waveform-correlation-based detector.

  8. Team Software Development for Aerothermodynamic and Aerodynamic Analysis and Design

    NASA Technical Reports Server (NTRS)

    Alexandrov, N.; Atkins, H. L.; Bibb, K. L.; Biedron, R. T.; Carpenter, M. H.; Gnoffo, P. A.; Hammond, D. P.; Jones, W. T.; Kleb, W. L.; Lee-Rausch, E. M.

    2003-01-01

    A collaborative approach to software development is described. The approach employs the agile development techniques: project retrospectives, Scrum status meetings, and elements of Extreme Programming to efficiently develop a cohesive and extensible software suite. The software product under development is a fluid dynamics simulator for performing aerodynamic and aerothermodynamic analysis and design. The functionality of the software product is achieved both through the merging, with substantial rewrite, of separate legacy codes and the authorship of new routines. Examples of rapid implementation of new functionality demonstrate the benefits obtained with this agile software development process. The appendix contains a discussion of coding issues encountered while porting legacy Fortran 77 code to Fortran 95, software design principles, and a Fortran 95 coding standard.

  9. The application of image processing software: Photoshop in environmental design

    NASA Astrophysics Data System (ADS)

    Dong, Baohua; Zhang, Chunmi; Zhuo, Chen

    2011-02-01

In the process of environmental design and creation, the design sketch holds a very important position in that it not only illuminates the design's idea and concept but also shows the design's visual effects to the client. In the field of environmental design, computer-aided design has made significant improvement. Many types of specialized design software for environmental performance of the drawings and post artistic processing have been implemented. Additionally, with the use of this software, working efficiency has greatly increased and drawings have become more specific and more specialized. By analyzing the application of the Photoshop image processing software in environmental design, and comparing and contrasting traditional hand drawing with drawing using modern technology, this essay will further explore ways for computer technology to play a bigger role in environmental design.

  10. Some Interactive Aspects of a Software Design Schema Acquisition Tool

    NASA Technical Reports Server (NTRS)

    Lee, Hing-Yan; Harandi, Mehdi T.

    1991-01-01

    This paper describes a design schema acquisition tool which forms an important component of a hybrid software design system for reuse. The hybrid system incorporates both schema-based approaches in supporting software design reuse activities and is realized by extensions to the IDeA system. The paper also examines some of the interactive aspects that the tool requires with the domain analyst to accomplish its acquisition task.

  11. Analysis of the track- and dose-averaged LET and LET spectra in proton therapy using the GEANT4 Monte Carlo code

    SciTech Connect

    Guan, Fada; Peeler, Christopher; Taleei, Reza; Randeniya, Sharmalee; Ge, Shuaiping; Mirkovic, Dragan; Mohan, Radhe; Titt, Uwe; Bronk, Lawrence; Geng, Changran; Grosshans, David

    2015-11-15

Purpose: The motivation of this study was to find and eliminate the cause of errors in dose-averaged linear energy transfer (LET) calculations from therapeutic protons in small targets, such as biological cell layers, calculated using the Geant4 Monte Carlo code. A further purpose was to provide a recommendation for selecting an appropriate LET quantity from Geant4 simulations to correlate with the biological effectiveness of therapeutic protons. Methods: The authors developed a particle-tracking-step-based strategy to calculate the average LET quantities (track-averaged LET, LET_t, and dose-averaged LET, LET_d) using Geant4 for different tracking step size limits. A step size limit refers to the maximum allowable tracking step length. The authors investigated how the tracking step size limit influenced the calculated LET_t and LET_d of protons with six different step limits ranging from 1 to 500 μm in a water phantom irradiated by a 79.7-MeV clinical proton beam. In addition, the authors analyzed the detailed stochastic energy deposition information, including fluence spectra and dose spectra of the energy deposition per step of protons. As a reference, the authors also calculated the averaged LET and analyzed the LET spectra by combining the Monte Carlo method and the deterministic method. Relative biological effectiveness (RBE) calculations were performed to illustrate the impact of different LET calculation methods on the RBE-weighted dose. Results: Simulation results showed that the step limit effect was small for LET_t but significant for LET_d. This resulted from differences in the energy deposition per step between the fluence spectra and dose spectra at different depths in the phantom. Using the Monte Carlo particle tracking method in Geant4 can result in incorrect LET_d calculation results in the dose plateau region for small step limits. The erroneous LET_d results can be attributed to the algorithm to
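The distinction this abstract draws between track-averaged and dose-averaged LET can be sketched in a few lines (our illustrative Python, not the authors' code; the per-step data are hypothetical): LET_t weights every tracking step equally, while LET_d weights each step by the energy it deposits, so rare high-LET steps dominate LET_d.

```python
# Illustrative sketch of track- vs dose-averaged LET from simulated
# tracking steps. Each step is (energy deposited [MeV], step length [um]).

def track_averaged_let(steps):
    """Fluence-weighted average: every step contributes equally."""
    lets = [de / dx for de, dx in steps if dx > 0]
    return sum(lets) / len(lets)

def dose_averaged_let(steps):
    """Dose-weighted average: each step weighted by its energy deposit."""
    pairs = [(de / dx, de) for de, dx in steps if dx > 0]
    total = sum(w for _, w in pairs)
    return sum(l * w for l, w in pairs) / total

# Hypothetical step data: one high-LET step skews LET_d upward.
steps = [(0.01, 1.0), (0.05, 1.0), (0.5, 1.0)]
print(track_averaged_let(steps))  # ~0.187 MeV/um
print(dose_averaged_let(steps))   # ~0.451 MeV/um
```

Because the dose weighting amplifies the tail of the energy-deposition-per-step spectrum, LET_d is the quantity most sensitive to the step size limit, consistent with the abstract's finding.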

  12. A software design for servo system of siderostats.

    NASA Astrophysics Data System (ADS)

    Zhang, Zhuheng; Sun, Shuqin

    1997-06-01

A software design for the servo system of two siderostats in the prototype of a stellar interferometer is described. The software is written in the EPROM of an 8098 chip and includes the communication routine between the 8098 and the main computer. The routines are for positioning and tracking.

  13. Classroom Teachers Working with Software Designers: The Wazzu Widgets Project.

    ERIC Educational Resources Information Center

    Brown, Abbie; Miller, Darcy

    This paper presents results of a year-long project involving K-12 teachers working with student software designers to create "learning objects"--small, computer-based tools (known as "widgets") for concepts identified by the teachers as "difficult to learn." This educational software development project was facilitated by members of Washington…

  14. Designing Distributed Learning Environments with Intelligent Software Agents

    ERIC Educational Resources Information Center

    Lin, Fuhua, Ed.

    2005-01-01

    "Designing Distributed Learning Environments with Intelligent Software Agents" reports on the most recent advances in agent technologies for distributed learning. Chapters are devoted to the various aspects of intelligent software agents in distributed learning, including the methodological and technical issues on where and how intelligent agents…

  15. Training Software Developers and Designers to Conduct Usability Evaluations

    ERIC Educational Resources Information Center

    Skov, Mikael Brasholt; Stage, Jan

    2012-01-01

    Many efforts to improve the interplay between usability evaluation and software development rely either on better methods for conducting usability evaluations or on better formats for presenting evaluation results in ways that are useful for software designers and developers. Both of these approaches depend on a complete division of work between…

  16. Designing an upgrade of the Medley setup for light-ion production and fission cross-section measurements

    NASA Astrophysics Data System (ADS)

    Jansson, K.; Gustavsson, C.; Al-Adili, A.; Hjalmarsson, A.; Andersson-Sundén, E.; Prokofiev, A. V.; Tarrío, D.; Pomp, S.

    2015-09-01

Measurements of neutron-induced fission cross-sections and light-ion production are planned in the energy range 1-40 MeV at the upcoming Neutrons For Science (NFS) facility. In order to prepare our detector setup for the neutron beam with its continuous energy spectrum, simulation software was written using the Geant4 toolkit for both measurement situations. The neutron energy range around 20 MeV is troublesome with respect to the cross-sections used by Geant4, since data-driven cross-sections are only available below 20 MeV; above that energy they are based on semi-empirical models. Several customisations were made to the standard classes in Geant4 in order to produce consistent results over the whole simulated energy range. Expected uncertainties are reported for both types of measurements. The simulations have shown that a simultaneous precision measurement of the three standard cross-sections H(n,n), 235U(n,f) and 238U(n,f) relative to each other is feasible using a triple-layered target. As high-resolution timing detectors for fission fragments we plan to use Parallel Plate Avalanche Counters (PPACs). The simulation results have put some restrictions on the design of these detectors as well as on the target design. This study suggests a fissile target no thicker than 2 μm (1.7 mg/cm2) and a PPAC foil thickness preferably less than 1 μm. We also comment on the usability of Geant4 for simulation studies of neutron reactions in this energy range.

  17. SWEPP Gamma-Ray Spectrometer System software design description

    SciTech Connect

    Femec, D.A.; Killian, E.W.

    1994-08-01

To assist in the characterization of the radiological contents of contact-handled waste containers at the Stored Waste Examination Pilot Plant (SWEPP), the SWEPP Gamma-Ray Spectrometer (SGRS) System has been developed by the Radiation Measurements and Development Unit of the Idaho National Engineering Laboratory. The SGRS system software controls turntable and detector system activities. In addition to determining the concentrations of gamma-ray-emitting radionuclides, this software also calculates attenuation-corrected isotopic mass ratios of specific interest. This document describes the software design for the data acquisition and analysis software associated with the SGRS system.

  18. A software system for laser design and analysis

    NASA Technical Reports Server (NTRS)

    Cross, P. L.; Barnes, N. P.; Filer, E. D.

    1990-01-01

    A laser-material database and laser-modeling software system for designing lasers for laser-based Light Detection And Ranging (LIDAR) systems are presented. The software system consists of three basic sections: the database, laser models, and interface software. The database contains the physical parameters of laser, optical, and nonlinear materials required by laser models. The models include efficiency calculations, electrooptical component models, resonator, amplifier, and oscillator models, and miscellaneous models. The interface software provides a user-friendly interface between the user and his personal data files, the database, and models. The structure of the software system is essentially in place, while future plans call for upgrading the computer hardware and software in order to support a multiuser multitask environment.

  19. Evaluation of commercially available lighting design software

    SciTech Connect

    McConnell, D.G.

    1990-09-01

    This report addresses the need for commercially available lighting design computer programs and evaluates several of these programs. Sandia National Laboratories uses these programs to provide lighting designs for exterior closed-circuit television camera intrusion detection assessment for high-security perimeters.

  20. Software design for automated assembly of truss structures

    NASA Technical Reports Server (NTRS)

    Herstrom, Catherine L.; Grantham, Carolyn; Allen, Cheryl L.; Doggett, William R.; Will, Ralph W.

    1992-01-01

    Concern over the limited intravehicular activity time has increased the interest in performing in-space assembly and construction operations with automated robotic systems. A technique being considered at LaRC is a supervised-autonomy approach, which can be monitored by an Earth-based supervisor that intervenes only when the automated system encounters a problem. A test-bed to support evaluation of the hardware and software requirements for supervised-autonomy assembly methods was developed. This report describes the design of the software system necessary to support the assembly process. The software is hierarchical and supports both automated assembly operations and supervisor error-recovery procedures, including the capability to pause and reverse any operation. The software design serves as a model for the development of software for more sophisticated automated systems and as a test-bed for evaluation of new concepts and hardware components.

  1. Designing application software in wide area network settings

    NASA Technical Reports Server (NTRS)

    Makpangou, Mesaac; Birman, Ken

    1990-01-01

    Progress in methodologies for developing robust local area network software has not been matched by similar results for wide area settings. The design of application software spanning multiple local area environments is examined. For important classes of applications, simple design techniques are presented that yield fault tolerant wide area programs. An implementation of these techniques as a set of tools for use within the ISIS system is described.

  2. Acquiring Software Design Schemas: A Machine Learning Perspective

    NASA Technical Reports Server (NTRS)

    Harandi, Mehdi T.; Lee, Hing-Yan

    1991-01-01

    In this paper, we describe an approach based on machine learning that acquires software design schemas from design cases of existing applications. An overview of the technique, design representation, and acquisition system are presented. the paper also addresses issues associated with generalizing common features such as biases. The generalization process is illustrated using an example.

  3. Improving Software Development Process through Economic Mechanism Design

    NASA Astrophysics Data System (ADS)

    Yilmaz, Murat; O'Connor, Rory V.; Collins, John

We introduce the novel concept of applying economic mechanism design to the software development process, aiming to adjust the incentives and disincentives of the software organization to align them with the motivations of the participants, in order to maximize the delivered value of a software project. We envision a set of principles for designing processes that allow people to be self-motivated while constantly working toward project goals. The resulting economic mechanism will rely on game-theoretic principles (i.e., Stackelberg games) for leveraging the incentives, goals and motivation of the participants in the service of project and organizational goals.

  4. A patterns catalog for RTSJ software designs

    NASA Technical Reports Server (NTRS)

    Benowitz, E. G.; Niessner, A. F.

    2003-01-01

    In this survey paper, we bring together current progress to date in identifying design patterns for use with the real-time specification for Java in a format consistent with contemporary patterns descriptions.

  5. Comparison of MCNPX and GEANT4 to Predict the Contribution of Non-elastic Nuclear Interactions to Absorbed Dose in Water, PMMA and A150

    SciTech Connect

    Shtejer, K.; Arruda-Neto, J. D. T.; Rodrigues, T. E.; Schulte, R.; Wroe, A.; Menezes, M. O. de; Moralles, M.

    2008-08-11

Proton-induced non-elastic nuclear reactions play an important role in the dose distribution of clinically used proton beams, as they deposit dose of high biological effectiveness both within the primary beam path and outside the beam to untargeted tissues. Non-elastic nuclear reactions can be evaluated using transport codes based on the Monte Carlo method. In this work, we have utilized the Los Alamos code MCNPX and the CERN GEANT4 toolkit, which are currently the most widely used Monte Carlo programs for proton radiation transport simulations in medical physics, to study the contribution of non-elastic nuclear interactions to the absorbed dose of proton beams in the therapeutic energy range. The impact of the different available theoretical models for the nuclear reaction process was investigated. The contribution of secondary particles from non-elastic nuclear reactions was calculated in three materials relevant in radiotherapy applications: water, PMMA and A150. The results show that the calculated contribution of secondary particles heavier than protons to the absorbed dose differs between the approaches used to model the nuclear reactions. The MCNPX calculations give rise to a larger contribution of d, t, α and ³He to the total dose compared to the GEANT4 physical models chosen in this work.

  6. Design of software for distributed/multiprocessor systems

    SciTech Connect

    Mckelvey, T.R.; Agrawal, D.P.

    1982-01-01

    Software design methodologies for distributed/multiprocessor systems are investigated. Parallelism and multitasking are considered as key issues in the design process. Petri-Nets and precedence graphs are presented as techniques for the modeling of a problem for implementation on a computer system. Techniques using the Petri-Net and precedence graph to decompose the problem model into subsets that may be executed on a distributed/multiprocessor system are presented. These techniques offer a systematic design methodology for the design of distributed/multiprocessor system software. 8 references.

  7. On Designing Lightweight Threads for Substrate Software

    NASA Technical Reports Server (NTRS)

    Haines, Matthew

    1997-01-01

    Existing user-level thread packages employ a 'black box' design approach, where the implementation of the threads is hidden from the user. While this approach is often sufficient for application-level programmers, it hides critical design decisions that system-level programmers must be able to change in order to provide efficient service for high-level systems. By applying the principles of Open Implementation Analysis and Design, we construct a new user-level threads package that supports common thread abstractions and a well-defined meta-interface for altering the behavior of these abstractions. As a result, system-level programmers will have the advantages of using high-level thread abstractions without having to sacrifice performance, flexibility or portability.

  8. An NAFP Project: Use of Object Oriented Methodologies and Design Patterns to Refactor Software Design

    NASA Technical Reports Server (NTRS)

    Shaykhian, Gholam Ali; Baggs, Rhoda

    2007-01-01

    In the early problem-solution era of software programming, functional decompositions were mainly used to design and implement software solutions. In functional decompositions, functions and data are introduced as two separate entities during the design phase, and are followed as such in the implementation phase. Functional decompositions make use of refactoring through optimizing the algorithms, grouping similar functionalities into common reusable functions, and using abstract representations of data where possible; all these are done during the implementation phase. This paper advocates the usage of object-oriented methodologies and design patterns as the centerpieces of refactoring software solutions. Refactoring software is a method of changing software design while explicitly preserving its external functionalities. The combined usage of object-oriented methodologies and design patterns to refactor should also benefit the overall software life cycle cost with improved software.

  9. Certification trails and software design for testability

    NASA Technical Reports Server (NTRS)

    Sullivan, Gregory F.; Wilson, Dwight S.; Masson, Gerald M.

    1993-01-01

Design techniques which may be applied to make program testing easier were investigated. Methods are presented for modifying a program to generate additional data, which we refer to as a certification trail. This additional data is designed to allow the program output to be checked more quickly and effectively. Certification trails were described primarily from a theoretical perspective. A comprehensive attempt to assess experimentally the performance and overall value of the certification trail method is reported. The method was applied to nine fundamental, well-known algorithms for the following problems: convex hull, sorting, Huffman tree, shortest path, closest pair, line segment intersection, longest increasing subsequence, skyline, and Voronoi diagram. Run-time performance data for each of these problems is given, and selected problems are described in more detail. Our results indicate that there are many cases in which certification trails allow for significantly faster overall program execution time than a 2-version programming approach, and also give further evidence of the breadth of applicability of this method.
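The certification-trail idea described above can be sketched for the sorting case (our minimal illustration, not the paper's code): the sorter emits a trail, here the permutation it applied, and an independent checker uses the trail to verify the output in linear time instead of re-running the computation.

```python
# Minimal certification-trail sketch for sorting: the trail is the
# permutation of input indices; the checker verifies in O(n log n) for
# the permutation check and O(n) for the order check, without re-sorting.

def sort_with_trail(xs):
    trail = sorted(range(len(xs)), key=lambda i: xs[i])  # permutation
    return [xs[i] for i in trail], trail

def check_with_trail(xs, ys, trail):
    # Trail must be a permutation of the indices of xs...
    if sorted(trail) != list(range(len(xs))):
        return False
    # ...applying it to xs must reproduce ys exactly...
    if [xs[i] for i in trail] != ys:
        return False
    # ...and ys must be in non-decreasing order.
    return all(ys[i] <= ys[i + 1] for i in range(len(ys) - 1))

ys, trail = sort_with_trail([3, 1, 2])
assert check_with_trail([3, 1, 2], ys, trail)
```

The checker is both simpler and faster than the original program, which is the source of the speed advantage over 2-version programming reported in the abstract.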

  10. A Scaffolding Design Framework for Software to Support Science Inquiry

    ERIC Educational Resources Information Center

    Quintana, Chris; Reiser, Brian J.; Davis, Elizabeth A.; Krajcik, Joseph; Fretz, Eric; Duncan, Ravit Golan; Kyza, Eleni; Edelson, Daniel; Soloway, Elliot

    2004-01-01

    The notion of scaffolding learners to help them succeed in solving problems otherwise too difficult for them is an important idea that has extended into the design of scaffolded software tools for learners. However, although there is a growing body of work on scaffolded tools, scaffold design, and the impact of scaffolding, the field has not yet…

  11. Designing Prediction Tasks in a Mathematics Software Environment

    ERIC Educational Resources Information Center

    Brunström, Mats; Fahlgren, Maria

    2015-01-01

    There is a recognised need in mathematics teaching for new kinds of tasks which exploit the affordances provided by new technology. This paper focuses on the design of prediction tasks to foster student reasoning about exponential functions in a mathematics software environment. It draws on the first iteration of a design based research study…

  12. Software For Computer-Aided Design Of Control Systems

    NASA Technical Reports Server (NTRS)

    Wette, Matthew

    1994-01-01

    Computer Aided Engineering System (CAESY) software developed to provide means to evaluate methods for dealing with users' needs in computer-aided design of control systems. Interpreter program for performing engineering calculations. Incorporates features of both Ada and MATLAB. Designed to be flexible and powerful. Includes internally defined functions, procedures and provides for definition of functions and procedures by user. Written in C language.

  13. Calico: An Early-Phase Software Design Tool

    ERIC Educational Resources Information Center

    Mangano, Nicolas Francisco

    2013-01-01

    When developers are faced with a design challenge, they often turn to the whiteboard. This is typical during the conceptual stages of software design, when no code is in existence yet. It may also happen when a significant code base has already been developed, for instance, to plan new functionality or discuss optimizing a key component. While…

  14. Feasibility of using Geant4 Monte Carlo simulation for IMRT dose calculations for the Novalis Tx with a HD-120 multi-leaf collimator

    NASA Astrophysics Data System (ADS)

    Jung, Hyunuk; Shin, Jungsuk; Chung, Kwangzoo; Han, Youngyih; Kim, Jinsung; Choi, Doo Ho

    2015-05-01

The aim of this study was to develop an independent dose verification system by using a Monte Carlo (MC) calculation method for intensity modulated radiation therapy (IMRT) conducted by using a Varian Novalis Tx (Varian Medical Systems, Palo Alto, CA, USA) equipped with a high-definition multi-leaf collimator (HD-120 MLC). The Geant4 framework was used to implement a dose calculation system that accurately predicted the delivered dose. For this purpose, the Novalis Tx Linac head was modeled according to the specifications acquired from the manufacturer. Subsequently, MC simulations were performed by varying the mean energy, energy spread, and electron spot radius to determine optimum values for irradiation with 6-MV X-ray beams by using the Novalis Tx system. Computed percentage depth dose curves (PDDs) and lateral profiles were compared to the measurements obtained by using an ionization chamber (CC13). To validate the IMRT simulation by using the MC model we developed, we calculated a simple IMRT field and compared the result with the EBT3 film measurements in a water-equivalent solid phantom. Clinical cases, such as prostate cancer treatment plans, were then selected, and MC simulations were performed. The accuracy of the simulation was assessed against the EBT3 film measurements by using a gamma-index criterion. The optimal MC model parameters to specify the beam characteristics were a 6.8-MeV mean energy, a 0.5-MeV energy spread, and a 3-mm electron spot radius. The accuracy of these parameters was determined by comparison of MC simulations with measurements. The PDDs and the lateral profiles of the MC simulation deviated from the measurements by 1% and 2%, respectively, on average. The computed simple MLC fields agreed with the EBT3 measurements with a 95% passing rate under a 3%/3-mm gamma-index criterion. Additionally, in applying our model to clinical IMRT plans, we found that the MC calculations and the EBT3 measurements agreed well with a passing rate of greater
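The 3%/3-mm gamma-index criterion used in this abstract can be illustrated with a simplified 1D sketch (our own, not the authors' implementation; dose values are assumed normalized to the maximum dose): for each reference point, gamma is the minimum over evaluated points of the combined distance-to-agreement and dose-difference metric, and the point passes if gamma is at most 1.

```python
import math

def gamma_index_1d(ref, evaluated, spacing_mm, dose_tol=0.03, dta_mm=3.0):
    """1D gamma index: for each reference point, minimize
    sqrt((dist/DTA)^2 + (dose_diff/tolerance)^2) over evaluated points."""
    gammas = []
    for i, d_ref in enumerate(ref):
        best = float("inf")
        for j, d_eval in enumerate(evaluated):
            dist = abs(i - j) * spacing_mm          # spatial separation [mm]
            ddose = d_eval - d_ref                   # dose difference
            g = math.sqrt((dist / dta_mm) ** 2 + (ddose / dose_tol) ** 2)
            best = min(best, g)
        gammas.append(best)
    return gammas

# Hypothetical normalized dose profiles on a 1 mm grid.
ref = [0.2, 0.5, 1.0, 0.5, 0.2]
meas = [0.21, 0.52, 0.99, 0.49, 0.20]
gammas = gamma_index_1d(ref, meas, spacing_mm=1.0)
passing_rate = sum(g <= 1.0 for g in gammas) / len(gammas)
print(passing_rate)  # 1.0
```

A "95% passing rate with 3%/3-mm criterion", as reported above, means 95% of reference points have gamma ≤ 1 under this metric.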

  15. Cluster computing software for GATE simulations

    SciTech Connect

    Beenhouwer, Jan de; Staelens, Steven; Kruecker, Dirk; Ferrer, Ludovic; D'Asseler, Yves; Lemahieu, Ignace; Rannou, Fernando R.

    2007-06-15

    Geometry and tracking (GEANT4) is a Monte Carlo package designed for high energy physics experiments. It is used as the basis layer for Monte Carlo simulations of nuclear medicine acquisition systems in GEANT4 Application for Tomographic Emission (GATE). GATE allows the user to realistically model experiments using accurate physics models and time synchronization for detector movement through a script language contained in a macro file. The downside of this high accuracy is long computation time. This paper describes a platform independent computing approach for running GATE simulations on a cluster of computers in order to reduce the overall simulation time. Our software automatically creates fully resolved, nonparametrized macros accompanied with an on-the-fly generated cluster specific submit file used to launch the simulations. The scalability of GATE simulations on a cluster is investigated for two imaging modalities, positron emission tomography (PET) and single photon emission computed tomography (SPECT). Due to a higher sensitivity, PET simulations are characterized by relatively high data output rates that create rather large output files. SPECT simulations, on the other hand, have lower data output rates but require a long collimator setup time. Both of these characteristics hamper scalability as a function of the number of CPUs. The scalability of PET simulations is improved here by the development of a fast output merger. The scalability of SPECT simulations is improved by greatly reducing the collimator setup time. Accordingly, these two new developments result in higher scalability for both PET and SPECT simulations and reduce the computation time to more practical values.
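The cluster approach described above rests on a simple idea that can be sketched generically (illustrative Python; the function names are ours and are not part of GATE or its job splitter): a long Monte Carlo run is divided into independent jobs with distinct random seeds, and their outputs are merged afterwards.

```python
# Illustrative event-level parallelism sketch: split a Monte Carlo run
# into n_jobs independent jobs with distinct seeds, then merge outputs.
import random

def run_job(seed, n_events):
    """Stand-in for one cluster job: an independent random stream
    producing n_events simulated 'events'."""
    rng = random.Random(seed)
    return [rng.random() for _ in range(n_events)]

def split_and_merge(total_events, n_jobs):
    per_job = total_events // n_jobs
    # Distinct seeds keep the jobs statistically independent.
    outputs = [run_job(seed=1000 + k, n_events=per_job) for k in range(n_jobs)]
    # The output-merger step: concatenate the per-job results.
    return [event for out in outputs for event in out]

merged = split_and_merge(total_events=1000, n_jobs=4)
print(len(merged))  # 1000
```

In a real deployment the merge step dominates for high-output modalities such as PET, which is why the abstract's fast output merger improves scalability there, while SPECT benefits instead from amortizing the per-job setup cost.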

  16. Design software for ion-exchanged glass waveguide devices

    NASA Astrophysics Data System (ADS)

    Tervonen, Ari; Honkanen, Seppo; Poyhonen, Pekka; Tahkokorpi, Markku T.

    1993-04-01

Software tools for the design of passive integrated optical components based on ion-exchanged glass waveguides have been developed. All design programs have been implemented on personal computers. A general simulation program for ion exchange processes is used for optimization of waveguide fabrication. The optical propagation in the calculated channel waveguide profiles is modelled with various methods. A user-friendly interface has been included in this modelling software. On the basis of the calculated propagation properties, the performance of channel waveguide circuits can be modelled and thus devices for different applications may be designed. From the design parameters, the lithography mask pattern to be used is generated for a commercial CAD program for final mask design. Examples of designed and manufactured guided-wave devices are described. These include 1-to-n splitters and asymmetric Mach-Zehnder interferometers for wavelength division multiplexing.

  17. Software Design Methodology Migration for a Distributed Ground System

    NASA Technical Reports Server (NTRS)

    Ritter, George; McNair, Ann R. (Technical Monitor)

    2002-01-01

The Marshall Space Flight Center's (MSFC) Payload Operations Center (POC) ground system has been developed and has evolved over a period of about 10 years. During this time the software processes have migrated from more traditional to more contemporary development processes. The new software processes still emphasize requirements capture, software configuration management, design documentation, and making sure the products that have been developed are accountable to the initial requirements. This paper will give an overview of how the software processes have evolved, highlighting the positives as well as the negatives. In addition, we will mention the COTS tools that have been integrated into the processes and how the COTS tools have provided value to the project.

  18. Radiation Effects Investigations Based on Atmospheric Radiation Model (ATMORAD) Considering GEANT4 Simulations of Extensive Air Showers and Solar Modulation Potential.

    PubMed

    Hubert, Guillaume; Cheminet, Adrien

    2015-07-01

    The natural radiative atmospheric environment is composed of secondary cosmic rays produced when primary cosmic rays hit the atmosphere. Understanding atmospheric radiations and their dynamics is essential for evaluating single event effects, so that radiation risks in aviation and the space environment (space weather) can be assessed. In this article, we present an atmospheric radiation model, named ATMORAD (Atmospheric Radiation), which is based on GEANT4 simulations of extensive air showers according to primary spectra that depend only on the solar modulation potential (force-field approximation). Based on neutron spectrometry, solar modulation potential can be deduced using neutron spectrometer measurements and ATMORAD. Some comparisons between our methodology and standard approaches or measurements are also discussed. This work demonstrates the potential for using simulations of extensive air showers and neutron spectroscopy to monitor solar activity.

  19. Geant4 simulation for a study of a possible use of carbon ion pencil beams for the treatment of ocular melanomas with the active scanning system at CNAO

    NASA Astrophysics Data System (ADS)

    Farina, E.; Piersimoni, P.; Riccardi, C.; Rimoldi, A.; Tamborini, A.; Ciocca, M.

    2015-12-01

The aim of this work was to study a possible use of carbon ion pencil beams (delivered with active scanning modality) for the treatment of ocular melanomas at the Centro Nazionale di Adroterapia Oncologica (CNAO). The promising aspect of carbon ion radiotherapy for the treatment of this disease lies in its superior relative biological effectiveness (RBE). The Monte Carlo (MC) Geant4 10.00 toolkit was used to simulate the complete CNAO extraction beamline, with the active and passive components along it. A human eye modeled detector, including a realistic target tumor volume, was used as target. Cross-checks with previous proton studies at CNAO allowed comparison of the possible benefits of this technique with respect to proton beams. Experimental data on the transverse distributions of proton and carbon ion beams were used to validate the simulation.

  20. Geant4 simulation of zinc oxide nanowires in anodized aluminum oxide template as a low energy X-ray scintillator detector

    NASA Astrophysics Data System (ADS)

    Taheri, Ali; Saramad, Shahyar; Setayeshi, Saeed

    2013-02-01

    In this work, ZnO nanowires in an anodized aluminum oxide nanoporous template are proposed as an architecture for developing a new generation of scintillator-based X-ray imagers. The optical response of crystalline ordered ZnO nanowire arrays in a porous anodized aluminum oxide template under 20 keV X-ray illumination is simulated using the Geant4 Monte Carlo code. The results show that the anodized aluminum oxide template acts as a light guide, conducting the X-ray-induced optical photons through the detector thickness and reducing light scattering in the detector volume. This inexpensive and effective method can significantly improve the spatial resolution of scintillator-based X-ray imagers, especially in medical applications.

  1. Simulation, optimization and testing of a novel high spatial resolution X-ray imager based on Zinc Oxide nanowires in Anodic Aluminium Oxide membrane using Geant4

    NASA Astrophysics Data System (ADS)

    Esfandi, F.; Saramad, S.

    2015-07-01

    In this work, a new generation of scintillator-based X-ray imagers using ZnO nanowires in an Anodized Aluminum Oxide (AAO) nanoporous template is characterized. The optical response of ordered ZnO nanowire arrays in a porous AAO template under low energy X-ray illumination is simulated with the Geant4 Monte Carlo code and compared with experimental results. The results show that for 10 keV X-ray photons, by considering the light guiding properties of zinc oxide inside the AAO template and suitable selection of detector thickness and pore diameter, a spatial resolution of less than one micrometer and a detection efficiency of 66% are achievable. This novel nanoscintillator detector can have many advantages for medical applications in the future.

  2. A polygon-surface reference Korean male phantom (PSRK-Man) and its direct implementation in Geant4 Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Kim, Chan Hyeong; Jeong, Jong Hwi; Bolch, Wesley E.; Cho, Kun-Woo; Hwang, Sung Bae

    2011-05-01

    Even though the hybrid phantom embodies both the anatomic reality of voxel phantoms and the deformability of stylized phantoms, it must be voxelized to be used in a Monte Carlo code for dose calculation or imaging simulation, which incurs the inherent limitations of voxel phantoms. In the present study, a voxel phantom named VKH-Man (Visible Korean Human-Man) was converted to a polygon-surface phantom (PSRK-Man, Polygon-Surface Reference Korean-Man), which was then adjusted to the Reference Korean data. Subsequently, the PSRK-Man polygon phantom was implemented directly, without any voxelization process, in the Geant4 Monte Carlo code for dose calculations. The calculated dose values and computation time were then compared with those of HDRK-Man (High Definition Reference Korean-Man), a corresponding voxel phantom adjusted to the same Reference Korean data from the same VKH-Man voxel phantom. Our results showed that the calculated dose values of the PSRK-Man surface phantom agreed well with those of the HDRK-Man voxel phantom. Calculations with the PSRK-Man polygon phantom, however, were 70-150 times slower than with the HDRK-Man voxel phantom; that speed penalty could be acceptable in some applications, given that direct use of the surface phantom in Geant4 requires no separate voxelization process. Computing speed could be enhanced in the future either by optimizing the Monte Carlo transport kernel for polygon surfaces or by using modern computing technologies such as grid computing and general-purpose GPU programming.

  3. A design methodology for portable software on parallel computers

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Miller, Keith W.; Chrisman, Dan A.

    1993-01-01

    This final report, for research supported by grant number NAG-1-995, documents our progress in addressing two difficulties in parallel programming. The first difficulty is developing software that will execute quickly on a parallel computer. The second difficulty is transporting software between dissimilar parallel computers. In general, we expect that more hardware-specific information will be included in software designs for parallel computers than in designs for sequential computers. This inclusion is an instance of portability being sacrificed for high performance. New parallel computers are being introduced frequently, so a software developer trying to keep software on the current high-performance hardware faces almost continual, expensive software transportation. The goal of the proposed research is to create a design methodology that helps designers more precisely control both portability and hardware-specific programming details. The proposed research emphasizes programming for scientific applications. We completed our study of the parallelizability of a subsystem of the NASA Earth Radiation Budget Experiment (ERBE) data processing system. This work is summarized in section two; a more detailed description is provided in Appendix A ('Programming Practices to Support Eventual Parallelism'). Mr. Chrisman, a graduate student, wrote and successfully defended a Ph.D. dissertation proposal which describes our research associated with the issues of software portability and high performance. The list of research tasks is specified in the proposal. The proposal 'A Design Methodology for Portable Software on Parallel Computers' is summarized in section three and is provided in its entirety in Appendix B. We are currently studying a proposed subsystem of the NASA Clouds and the Earth's Radiant Energy System (CERES) data processing system. This software is the proof-of-concept for the Ph.D. dissertation. We have implemented and measured

  4. Using software interoperability to achieve a virtual design environment

    NASA Astrophysics Data System (ADS)

    Gregory, G. Groot; Koshel, R. John

    2005-09-01

    A variety of simulation tools, including optical design and analysis, have benefited by many years of evolution in software functionality and computing power, thus making the notion of virtual design environments a reality. To simulate the optical characteristics of a system, one needs to include optical performance, mechanical design and manufacturing aspects simultaneously. To date, no single software program offers a universal solution. One approach to achieve an integrated environment is to select tools that offer a high degree of interoperability. This allows the selection of the best tools for each aspect of the design working in concert to solve the problem. This paper discusses the issues of how to assemble a design environment and provides an example of a combination of tools for illumination design. We begin by offering a broad definition of interoperability from an optical analysis perspective. This definition includes aspects of file interchange formats, software communications protocols and customized applications. One example solution is proposed by combining SolidWorks for computer-aided design (CAD), TracePro for optical analysis and MATLAB as the mathematical engine for tolerance analysis. The resulting virtual tool will be applied to a lightpipe design task to illustrate how such a system can be used.

  5. Standards for Instructional Computing Software Design and Development.

    ERIC Educational Resources Information Center

    Schaefermeyer, Shanna

    1990-01-01

    Identifies desirable features that should be included in software for effective instructional computing use. Highlights include design of learning activities; curriculum role; modes of instruction, including drill and practice, tutorials, games, simulation, and problem solving; branching; menu-driven programs; screen displays; graphics; teacher…

  6. QUICK - An interactive software environment for engineering design

    NASA Technical Reports Server (NTRS)

    Skinner, David L.

    1989-01-01

    QUICK, an interactive software environment for engineering design, provides a programmable FORTRAN-like calculator interface to a wide range of data structures as well as both built-in and user created functions. QUICK also provides direct access to the operating systems of eight different machine architectures. The evolution of QUICK and a brief overview of the current version are presented.

  7. Peeling the Onion: Okapi System Architecture and Software Design Issues.

    ERIC Educational Resources Information Center

    Jones, S.; And Others

    1997-01-01

    Discusses software design issues for Okapi, an information retrieval system that incorporates both search engine and user interface and supports weighted searching, relevance feedback, and query expansion. The basic search system, adjacency searching, and moving toward a distributed system are discussed. (Author/LRW)

  8. Issues in Software Engineering of Relevance to Instructional Design

    ERIC Educational Resources Information Center

    Douglas, Ian

    2006-01-01

    Software engineering is popularly misconceived as being an upmarket term for programming. In a way, this is akin to characterizing instructional design as the process of creating PowerPoint slides. In both these areas, the construction of systems, whether they are learning or computer systems, is only one part of a systematic process. The most…

  9. Designing for User Cognition and Affect in Software Instructions

    ERIC Educational Resources Information Center

    van der Meij, Hans

    2008-01-01

    In this paper we examine how to design software instructions for user cognition and affect. A basic manual and a co-user manual are compared. The former provides fundamental support for both; the latter includes a buddy to further optimize support for user affect. The basic manual was faster and judged easier to process than the co-user manual. In…

  10. Designing Computer Software To Minimize the Need for Employee Training.

    ERIC Educational Resources Information Center

    Winiecki, Donald J.

    2000-01-01

    Discusses problems that arise when computer software users have to learn a new system while maintaining productivity. Highlights include active learning; a constructivist view; Vygotsky's zone of proximal development; and a model called Design for Learnability (DesiL) that focuses the performance technologist on an ethnomethodological study of…

  11. code_swarm: a design study in organic software visualization.

    PubMed

    Ogawa, Michael; Ma, Kwan-Liu

    2009-01-01

    In May of 2008, we published online a series of software visualization videos using a method called code_swarm. Shortly thereafter, we made the code open source and its popularity took off. This paper is a study of our code_swarm application, comprising its design, results and public response. We share our design methodology, including why we chose the organic information visualization technique, how we designed for both developers and a casual audience, and what lessons we learned from our experiment. We validate the results produced by code_swarm through a qualitative analysis and by gathering online user comments. Furthermore, we successfully released the code as open source, and the software community used it to visualize their own projects and shared their results as well. In the end, we believe code_swarm has positive implications for the future of organic information design and open source information visualization practice.

  12. Design and implementation of the mobility assessment tool: software description

    PubMed Central

    2013-01-01

    Background In previous work, we described the development of an 81-item video-animated tool for assessing mobility. In response to criticism levied during a pilot study of this tool, we sought to develop a new version built upon a flexible framework for designing and administering the instrument. Results Rather than constructing a self-contained software application with a hard-coded instrument, we designed an XML schema capable of describing a variety of psychometric instruments. The new version of our video-animated assessment tool was then defined fully within the context of a compliant XML document. Two software applications—one built in Java, the other in Objective-C for the Apple iPad—were then built that could present the instrument described in the XML document and collect participants’ responses. Separating the instrument’s definition from the software application implementing it allowed for rapid iteration and easy, reliable definition of variations. Conclusions Defining instruments in a software-independent XML document simplifies the process of defining instruments and variations and allows a single instrument to be deployed on as many platforms as there are software applications capable of interpreting the instrument, thereby broadening the potential target audience for the instrument. Continued work will be done to further specify and refine this type of instrument specification with a focus on spurring adoption by researchers in gerontology and geriatric medicine. PMID:23879716
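
    The key design move in the record above is separating the instrument definition (an XML document) from the applications that render it. A minimal sketch of that separation follows; the element names and content here are hypothetical illustrations, not the actual MAT schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical instrument definition; the actual MAT XML schema is not shown
# in the abstract, so this document structure is invented for illustration.
INSTRUMENT_XML = """
<instrument name="mobility-demo">
  <item id="q1" video="walk_10m.mp4">
    <prompt>Could you walk 10 meters at a normal pace?</prompt>
    <response value="1">Without help</response>
    <response value="2">With a cane</response>
    <response value="3">Not at all</response>
  </item>
</instrument>
"""

def load_items(xml_text):
    """Parse an instrument document into plain dicts that any front end
    (Java desktop, iPad, web) could render independently."""
    root = ET.fromstring(xml_text)
    items = []
    for item in root.findall("item"):
        items.append({
            "id": item.get("id"),
            "video": item.get("video"),
            "prompt": item.findtext("prompt"),
            "responses": {r.get("value"): r.text for r in item.findall("response")},
        })
    return items
```

    Because every platform consumes the same document, an instrument variation is just a new XML file rather than a new software release, which is the iteration benefit the authors describe.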

  13. Advanced Spacesuit Informatics Software Design for Power, Avionics and Software Version 2.0

    NASA Technical Reports Server (NTRS)

    Wright, Theodore W.

    2016-01-01

    A description of the software design for the 2016 edition of the Informatics computer assembly of NASA's Advanced Extravehicular Mobility Unit (AEMU), also called the Advanced Spacesuit. The Informatics system is an optional part of the spacesuit assembly. It adds a graphical interface for displaying suit status, timelines, procedures, and warning information. It also provides an interface to the suit-mounted camera for recording still images, video, and audio field notes.

  14. [Development of a software for 3D virtual phantom design].

    PubMed

    Zou, Lian; Xie, Zhao; Wu, Qi

    2014-02-01

    In this paper, we present a 3D virtual phantom design software package, developed using object-oriented programming methodology and dedicated to medical physics research. The software, named Magical Phantom (MPhantom), is composed of a 3D visual builder module and a virtual CT scanner. Users can conveniently construct any complex 3D phantom and then export it as DICOM 3.0 CT images. MPhantom is a user-friendly and powerful tool for 3D phantom configuration, and has passed application tests on real scenes. MPhantom will accelerate Monte Carlo simulation for dose calculation in radiation therapy and research on X-ray imaging reconstruction algorithms. PMID:24804488

  15. Design and implementation of embedded Bluetooth software system

    NASA Astrophysics Data System (ADS)

    Zhou, Zhijian; Zhou, Shujie; Xu, Huimin

    2001-10-01

    This thesis introduces the background and characteristics of Bluetooth technology, then summarizes the architecture and working principles of Bluetooth software. After studying the characteristics of embedded operating systems and Bluetooth software, the thesis describes two Bluetooth software modules and, corresponding to their characteristics, presents the design and implementation of LAN Access and a Bluetooth headset. The headset part introduces a development method suited to the particularities of Bluetooth control software: although the control components are application entities, the control signaling exchanged between them follows previously defined rules, and the components function through the interaction of data and control information. These data and control information form protocol data units (PDUs), and the prior definitions effectively constitute a protocol. Drawing on modern practice in communication protocol development, the thesis applies a formal method, SDL (Specification and Description Language), for description and validation, with manual translation to C. This approach retains the efficiency of hand-written code while ensuring code quality. The discussion also draws on finite state machine theory in presenting this practical SDL-aided approach to protocol development.
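
    The SDL descriptions discussed above ultimately compile down to protocol state machines driven by signals. A minimal sketch of such a finite state machine (the states and signal names are illustrative, not the thesis's actual Bluetooth PDUs):

```python
# Transition table: (current state, incoming signal) -> next state.
# This is the flat form an SDL process description reduces to.
TRANSITIONS = {
    ("IDLE", "connect_req"): "CONNECTING",
    ("CONNECTING", "connect_cfm"): "CONNECTED",
    ("CONNECTED", "disconnect_req"): "IDLE",
}

class ProtocolFSM:
    """A protocol entity that consumes signals one at a time."""

    def __init__(self):
        self.state = "IDLE"

    def handle(self, signal):
        """Apply one signal; signals not defined for the current state
        are ignored, leaving the state unchanged."""
        self.state = TRANSITIONS.get((self.state, signal), self.state)
        return self.state
```

    Keeping the transitions in a data table, as SDL tooling does, makes the control logic inspectable and validatable separately from the code that executes it.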

  16. Design of Mariner 9 Science Sequences using Interactive Graphics Software

    NASA Technical Reports Server (NTRS)

    Freeman, J. E.; Sturms, F. M., Jr.; Webb, W. A.

    1973-01-01

    This paper discusses the analyst/computer system used to design the daily science sequences required to carry out the desired Mariner 9 science plan. The Mariner 9 computer environment, the development and capabilities of the science sequence design software, and the techniques followed in the daily mission operations are discussed. Included is a discussion of the overall mission operations organization and the individual components which played an essential role in the sequence design process. A summary of actual sequences processed, a discussion of problems encountered, and recommendations for future applications are given.

  17. Computer software design description for the integrated control and data acquisition system LDUA system

    SciTech Connect

    Aftanas, B.L.

    1998-08-12

    This Computer Software Design Description (CSDD) document provides the overview of the software design for all the software that is part of the integrated control and data acquisition system of the Light Duty Utility Arm System (LDUA). It describes the major software components and how they interface. It also references the documents that contain the detailed design description of the components.

  18. Automated Theorem Proving in High-Quality Software Design

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Swanson, Keith (Technical Monitor)

    2001-01-01

    The amount and complexity of software developed during the last few years has increased tremendously. In particular, programs are being used more and more in embedded systems (from car brakes to plant control). Many of these applications are safety-relevant, i.e., a malfunction of hardware or software can cause severe damage or loss. Tremendous risks are typically present in the areas of aviation, (nuclear) power plants and (chemical) plant control, where even small problems can lead to thousands of casualties and huge financial losses. Large financial risks also exist when computer systems are used in telecommunication (telephone, electronic commerce) or space exploration. Computer applications in these areas are subject not only to safety considerations but also to security issues. All these systems must be designed and developed to guarantee high quality with respect to safety and security. Even in an industrial setting which is (or at least should be) aware of the high requirements in software engineering, many incidents occur. For example, the Warsaw Airbus crash was caused by an incomplete requirements specification, and uncontrolled reuse of an Ariane 4 software module was the reason for the Ariane 5 disaster. Recent incidents in the telecommunication area, like illegal "cloning" of smart cards for D2 GSM mobile phones, or the extraction of (secret) passwords from German T-Online users, show that serious flaws can happen in this area as well. Due to the inherent complexity of computer systems, most authors claim that only a rigorous application of formal methods in all stages of the software life cycle can ensure high software quality and lead to truly safe and secure systems. In this paper, we examine to what extent automated theorem proving can contribute to a more widespread application of formal methods and their tools, and what automated theorem provers (ATPs) must provide in order to be useful.

  19. Software Engineering in Practice: Design and Architectures of FLOSS Systems

    NASA Astrophysics Data System (ADS)

    Capiluppi, Andrea; Knowles, Thomas

    Free/Libre/Open Source Software (FLOSS) practitioners and developers are typically also users of their own systems: as a result, traditional software engineering (SE) processes (e.g., the requirements and design phases) take less time to articulate and negotiate among FLOSS developers. Design and requirements are kept more as informal knowledge than formally described and assessed. This paper attempts to recover the SE concepts of software design and architectures from three FLOSS case studies sharing the same application domain (i.e., Instant Messaging). Its first objective is to determine whether a common architecture emerges from the three systems, which can be used as shared knowledge for future applications. The second objective is to determine whether these architectures evolve or decay as the systems evolve. The results of this study are encouraging: although no explicit effort was made by FLOSS developers to define a high-level view of the architecture, a common shared architecture could be distilled for the Instant Messaging application domain. It was also found that, for two of the three systems, the architecture becomes better organised, and the components better specified, as the system evolves over time.

  20. A requirements specification for a software design support system

    NASA Technical Reports Server (NTRS)

    Noonan, Robert E.

    1988-01-01

    Most existing software design systems (SDSS) support the use of only a single design methodology. A good SDSS should support a wide variety of design methods and languages, including structured design, object-oriented design, and finite state machines. It might seem that a multiparadigm SDSS would be expensive in both time and money to construct. However, it is proposed that instead an extensible SDSS that directly implements only minimal database and graphical facilities be constructed. In particular, it should not directly implement tools to facilitate language definition and analysis. It is believed that such a system could be rapidly developed and put into limited production use, with the experience gained used to refine and evolve the system over time.

  1. Requirements Management System Browser (RMSB) software design description

    SciTech Connect

    Frank, D.D.

    1996-09-30

    The purpose of this document is to provide an "as-built" design description for the Requirements Management System Browser (RMSB) application. The Graphical User Interface (GUI) and database structure design are described for the RMSB application, referred to as the "Browser." The RMSB application provides an easy-to-use PC-based interface for browsing systems engineering data stored and managed in a UNIX software application. The systems engineering data include functions, requirements, and architectures that make up the Tank Waste Remediation System (TWRS) technical baseline.

  2. Verifying Architectural Design Rules of the Flight Software Product Line

    NASA Technical Reports Server (NTRS)

    Ganesan, Dharmalingam; Lindvall, Mikael; Ackermann, Chris; McComas, David; Bartholomew, Maureen

    2009-01-01

    This paper presents experiences of verifying architectural design rules of the NASA Core Flight Software (CFS) product line implementation. The goal of the verification is to check whether the implementation is consistent with the CFS architectural rules derived from the developer's guide. The results indicate that consistency checking helps (a) identify architecturally significant deviations that eluded code reviews, (b) clarify the design rules to the team, and (c) assess the overall implementation quality. Furthermore, it helps connect business goals to architectural principles, and these to the implementation. This paper is the first step in the definition of a method for analyzing and evaluating product line implementations from an architecture-centric perspective.

  3. CARES/Life Software for Designing More Reliable Ceramic Parts

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Powers, Lynn M.; Baker, Eric H.

    1997-01-01

    Products made from advanced ceramics show great promise for revolutionizing aerospace and terrestrial propulsion and power generation. However, ceramic components are difficult to design because brittle materials in general have widely varying strength values. The CARES/Life software eases this task by providing a tool to optimize the design and manufacture of brittle material components using probabilistic reliability analysis techniques. Probabilistic component design involves predicting the probability of failure for a thermomechanically loaded component from specimen rupture data. Typically, these experiments are performed using many simple-geometry flexural or tensile test specimens. A static, dynamic, or cyclic load is applied to each specimen until fracture. Statistical strength and slow crack growth (SCG, or fatigue) parameters are then determined from these data. Using these parameters and the results obtained from a finite element analysis, the time-dependent reliability for a complex component geometry and loading is then predicted. Appropriate design changes are made until an acceptable probability of failure has been reached.
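
    The probabilistic design approach described above rests on Weibull statistics for brittle fracture. A minimal sketch of the two-parameter Weibull failure probability at uniform stress (CARES/Life actually integrates this over the stressed volume computed by finite element analysis; the parameter values below are illustrative, not material data):

```python
import math

def weibull_failure_probability(sigma, sigma0, m):
    """Two-parameter Weibull probability of failure at uniform stress sigma.
    sigma0 is the characteristic strength (Pf = 1 - 1/e at sigma = sigma0)
    and m is the Weibull modulus; low m means widely scattered strengths,
    which is what makes ceramic design difficult."""
    return 1.0 - math.exp(-((sigma / sigma0) ** m))
```

    In a design loop, stresses from the finite element model are fed through this relation and the geometry is adjusted until the predicted failure probability is acceptably low.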

  4. Calculation of direct effects of 60Co gamma rays on the different DNA structural levels: A simulation study using the Geant4-DNA toolkit

    NASA Astrophysics Data System (ADS)

    Tajik, Marjan; Rozatian, Amir S. H.; Semsarha, Farid

    2015-03-01

    In this study, simple single strand breaks (SSB) and double strand breaks (DSB) due to direct effects of the secondary electron spectrum of 60Co gamma rays on different organizational levels of a volume model of the B-DNA conformation have been calculated using the Geant4-DNA toolkit. The result of this study for the direct DSB yield shows good agreement with other theoretical and experimental results obtained with both photons and their secondary electrons; in the case of SSB, however, a noticeable difference can be observed. Moreover, given the almost constant yields of direct strand breaks across the different structural levels of the DNA calculated in this work and compared with some theoretical studies, it can be deduced that direct strand break yields depend mainly on the primary double helix structure of the DNA, and that higher-order structures cannot have a noticeable effect on direct DNA damage induction by 60Co gamma rays. In contrast, a direct dependency between the direct SSB and DSB yields and the volume of the DNA structure has been found. A further study of the histone proteins showed that they can play an important role in trapping low energy electrons without any significant effect on direct DNA strand break induction, at least in the range of energies used in the current study.

  5. Organ doses from hepatic radioembolization with 90Y, 153Sm, 166Ho and 177Lu: A Monte Carlo simulation study using Geant4

    NASA Astrophysics Data System (ADS)

    Hashikin, N. A. A.; Yeong, C. H.; Guatelli, S.; Abdullah, B. J. J.; Ng, K. H.; Malaroda, A.; Rosenfeld, A. B.; Perkins, A. C.

    2016-03-01

    90Y-radioembolization is a palliative treatment for liver cancer. 90Y decays via beta emission, making imaging difficult due to the absence of gamma radiation. Since post-procedure imaging is crucial, several theranostic radionuclides have been explored as alternatives. However, exposure to gamma radiation throughout the treatment raises concern for the organs near the liver. A Geant4 Monte Carlo simulation using the MIRD Pamphlet 5 reference phantom was carried out. A spherical tumour with a 4.3 cm radius was modelled within the liver. 1.82 GBq of 90Y sources were isotropically distributed within the tumour, with no extrahepatic shunting. The simulation was repeated with 153Sm, 166Ho and 177Lu. The estimated tumour dose was 262.9 Gy: the tumour dose delivered by 1.82 GBq of 90Y can be matched with 8.32, 5.83 and 4.44 GBq of 153Sm, 166Ho and 177Lu, respectively. Normal liver doses from the other radionuclides were lower than from 90Y, which is beneficial for normal tissue sparing. The organ doses from 153Sm and 177Lu were relatively higher due to higher gamma energy, but were still well below 1 Gy. 166Ho, 177Lu and 153Sm offer useful gamma emission for post-procedure imaging. They show potential as 90Y substitutes, delivering comparable tumour doses, lower normal liver doses, and doses to other organs far below the tolerance limit.
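
    Assuming absorbed dose scales linearly with administered activity for each radionuclide, the equivalent activities reported above follow from a simple scaling. This back-of-envelope consistency check derives the dose-per-activity coefficients from the abstract's own numbers (not from independent dosimetry data):

```python
TARGET_DOSE_GY = 262.9   # tumour dose delivered by 1.82 GBq of 90Y
# Dose per unit activity (Gy/GBq) implied by the reported equivalent
# activities; illustrative coefficients derived from the abstract.
DOSE_PER_GBQ = {
    "Y90":   TARGET_DOSE_GY / 1.82,
    "Sm153": TARGET_DOSE_GY / 8.32,
    "Ho166": TARGET_DOSE_GY / 5.83,
    "Lu177": TARGET_DOSE_GY / 4.44,
}

def required_activity(nuclide, target_dose_gy=TARGET_DOSE_GY):
    """Activity [GBq] needed to deliver the target tumour dose,
    under the linear dose-activity assumption."""
    return target_dose_gy / DOSE_PER_GBQ[nuclide]
```

    The lower dose per unit activity of the substitute radionuclides is why several times more activity is required to match the 90Y tumour dose.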

  6. System software design for the CDF Silicon Vertex Detector

    SciTech Connect

    Tkaczyk, S.; Bailey, M.

    1991-11-01

    An automated system for testing and performance evaluation of the CDF Silicon Vertex Detector (SVX) data acquisition electronics is described. The SVX data acquisition chain includes the Fastbus Sequencer and the Rabbit Crate Controller and Digitizers. The Sequencer is a programmable device for which we developed a high-level assembly language. Diagnostic, calibration and data acquisition programs have been developed. A distributed software package was developed in order to operate the modules. The package includes programs written in assembly and Fortran that execute concurrently on the SVX Sequencer modules and on either a MicroVAX or an SSP. Test software was included to assist technical personnel during the production and maintenance of the modules. Details of the design of the different components of the package are reported.

  7. Systems biology driven software design for the research enterprise

    PubMed Central

    Boyle, John; Cavnor, Christopher; Killcoyne, Sarah; Shmulevich, Ilya

    2008-01-01

    Background In systems biology, and many other areas of research, there is a need for the interoperability of tools and data sources that were not originally designed to be integrated. Due to the interdisciplinary nature of systems biology, and its association with high-throughput experimental platforms, there is an additional need to continually integrate new technologies. As scientists work in isolated groups, integration with other groups is rarely a consideration when building the required software tools. Results We illustrate an approach, through the discussion of a purpose-built software architecture, which allows disparate groups to reuse tools and access data sources in a common manner. The architecture allows for: the rapid development of distributed applications; interoperability, so it can be used by a wide variety of developers and computational biologists; development using standard tools, so that it is easy to maintain and does not require a large development effort; extensibility, so that new technologies and data types can be incorporated; and non-intrusive development, insofar as researchers need not adhere to a pre-existing object model. Conclusion By using a relatively simple integration strategy, based upon a common identity system and dynamically discovered interoperable services, a light-weight software architecture can become the focal point through which scientists can both access and analyse the plethora of experimentally derived data. PMID:18578887

  8. Software Package Completed for Alloy Design at the Atomic Level

    NASA Technical Reports Server (NTRS)

    Bozzolo, Guillermo H.; Noebe, Ronald D.; Abel, Phillip B.; Good, Brian S.

    2001-01-01

    As a result of a multidisciplinary effort involving solid-state physics, quantum mechanics, and materials and surface science, the first version of a software package dedicated to the atomistic analysis of multicomponent systems was recently completed. Based on the BFS (Bozzolo, Ferrante, and Smith) method for the calculation of alloy and surface energetics, this package includes modules devoted to the analysis of many essential features that characterize any given alloy or surface system, including (1) surface structure analysis, (2) surface segregation, (3) surface alloying, (4) bulk crystalline material properties and atomic defect structures, and (5) thermal processes that allow us to perform phase diagram calculations. All the modules of this Alloy Design Workbench 1.0 (ADW 1.0) are designed to run in PC and workstation environments, and their operation and performance are substantially linked to the needs of the user and the specific application.

  9. Some Didactical and Epistemological Considerations in the Design of Educational Software: The Cabri-Euclide Example

    ERIC Educational Resources Information Center

    Luengo, Vanda

    2005-01-01

    We propose to use didactical theory for the design of educational software. Here we present a set of didactical conditions, and explain how they shape the software design of Cabri-Euclide, a microworld used to learn "mathematical proof" in a geometry setting. The aim is to design software that does not include a predefined knowledge of problem…

  10. Surface moisture measurement system computer software design description

    SciTech Connect

    Vargo, G.F. Jr. Westinghouse Hanford

    1996-08-12

    This document describes the software that performs the data acquisition for the SMMS instrument. The software was created with a graphical programming language and was written to comply with the Software Requirements Specification. Hardware is described in Section 2; software is described in Section 3.

  11. Comparative study of dose distributions and cell survival fractions for 1H, 4He, 12C and 16O beams using Geant4 and Microdosimetric Kinetic model

    NASA Astrophysics Data System (ADS)

    Burigo, Lucas; Pshenichnov, Igor; Mishustin, Igor; Bleicher, Marcus

    2015-04-01

    Depth and radial dose profiles for therapeutic 1H, 4He, 12C and 16O beams are calculated using the Geant4-based Monte Carlo model for Heavy-Ion Therapy (MCHIT). 4He and 16O ions are presented as alternative options to the 1H and 12C broadly used for ion-beam cancer therapy. Biological dose profiles and survival fractions of cells are estimated using the modified Microdosimetric Kinetic model. Depth distributions of cell survival of healthy tissues, assuming 10% and 50% survival of tumor cells, are calculated for 6 cm SOBPs at two tumor depths and for different tissue radiosensitivities. It is found that the optimal ion choice depends on (i) the depth of the tumor, (ii) dose levels and (iii) the contrast in radiosensitivity between the tumor and surrounding healthy tissues. Our results indicate that 12C and 16O ions are more appropriate for sparing healthy tissues in the case of a more radioresistant tumor at moderate depths. On the other hand, a sensitive tumor surrounded by more resistant tissues can be better treated with 1H and 4He ions. In general, the 4He beam is found to be a good candidate for therapy. It spares healthy tissues better than 1H in all considered cases. In addition, dose conformation is improved for deep-seated tumors compared to 1H, and the damage to surrounding healthy tissues is reduced compared to heavier ions due to the lower impact of nuclear fragmentation. No definite advantages of 16O with respect to 12C ions are found in this study.
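Cell survival estimates of this kind are ultimately expressed through the linear-quadratic form S = exp(−(αD + βD²)). The sketch below uses the plain LQ model with illustrative coefficients, not the authors' full modified Microdosimetric Kinetic model:

```python
import math

def survival_fraction(dose_gy, alpha, beta):
    """Linear-quadratic survival: S = exp(-(alpha*D + beta*D^2)).
    alpha [1/Gy] and beta [1/Gy^2] depend on tissue and radiation quality."""
    return math.exp(-(alpha * dose_gy + beta * dose_gy ** 2))

def dose_for_survival(target_s, alpha, beta):
    """Invert the LQ model: dose giving a target survival level.
    Solves beta*D^2 + alpha*D - ln(1/S) = 0 for the positive root."""
    ln_s = -math.log(target_s)
    return (-alpha + math.sqrt(alpha ** 2 + 4.0 * beta * ln_s)) / (2.0 * beta)
```

With tissue-specific α and β, the second function yields the tumor dose corresponding to the 10% and 50% survival levels used in the study.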

  12. The investigation of prostatic calcifications using μ-PIXE analysis and their dosimetric effect in low dose rate brachytherapy treatments using Geant4.

    PubMed

    Pope, D J; Cutajar, D L; George, S P; Guatelli, S; Bucci, J A; Enari, K E; Miller, S; Siegele, R; Rosenfeld, A B

    2015-06-01

    Low dose rate brachytherapy is a widely used modality for the treatment of prostate cancer. Most clinical treatment planning systems currently in use approximate all tissue to water, neglecting the existence of inhomogeneities, such as calcifications. The presence of prostatic calcifications may perturb the dose due to the higher photoelectric effect cross section in comparison to water. This study quantitatively evaluates the effect of prostatic calcifications on the dosimetric outcome of brachytherapy treatments by means of Monte Carlo simulations and its potential clinical consequences. Four pathological calcification samples were characterised with micro-particle induced x-ray emission (μ-PIXE) to determine their heavy elemental composition. Calcium, phosphorus and zinc were found to be the predominant heavy elements in the calcification composition. Four clinical patient brachytherapy treatments were modelled using Geant4 based Monte Carlo simulations, in terms of the distribution of brachytherapy seeds and calcifications in the prostate. Local dose reductions of up to 30% were observed at the calcification boundary, depending on calcification size. Single large calcifications and closely spaced calculi caused local dose reductions of 30-60%. Individual calculi smaller than 0.5 mm in diameter showed minimal dosimetric impact; however, the effects of small or diffuse calcifications within the prostatic tissue could not be determined using the methods employed in the study. The simulation study showed a varying reduction in common dosimetric parameters. D90 showed a reduction of 2-5%, regardless of calcification surface area and volume. The parameters V100, V150 and V200 were also reduced by as much as 3% and on average by 1%. These reductions were also found to relate to the surface area and volume of calcifications, which may have a significant dosimetric impact on brachytherapy treatment; however, such impacts depend strongly on specific factors
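The dosimetric parameters cited (D90, V100, V150, V200) are standard dose-volume-histogram metrics. A minimal sketch of their computation from per-voxel doses, assuming the usual definitions (D90 is the minimum dose received by the hottest 90% of the volume; V100 is the volume fraction receiving at least the prescription dose):

```python
def d_x(doses, x_percent):
    """Dose received by at least x% of the volume (e.g. D90 for x_percent=90)."""
    ranked = sorted(doses, reverse=True)          # hottest voxels first
    k = max(1, int(round(len(ranked) * x_percent / 100.0)))
    return ranked[k - 1]

def v_x(doses, prescription, x_percent):
    """Fraction of volume receiving at least x% of the prescription dose
    (e.g. V150 for x_percent=150)."""
    threshold = prescription * x_percent / 100.0
    return sum(1 for d in doses if d >= threshold) / len(doses)
```

Running these on dose grids computed with and without calcifications present would quantify the parameter reductions the abstract reports.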

  13. Benchmarking the Geant4 full system simulation of an associated alpha-particle detector for use in a D-T neutron generator.

    PubMed

    Zhang, Xiaodong; Hayward, Jason P; Cates, Joshua W; Hausladen, Paul A; Laubach, Mitchell A; Sparger, Johnathan E; Donnald, Samuel B

    2012-08-01

    The position-sensitive alpha-particle detector used to provide the starting time and initial direction of D-T neutrons in a fast-neutron imaging system was simulated with a Geant4-based Monte Carlo program. The whole detector system, which consists of a YAP:Ce scintillator, a fiber-optic faceplate, a light guide, and a position-sensitive photomultiplier tube (PSPMT), was modeled, starting with incident D-T alphas. The scintillation photons, whose starting times follow the distribution of a scintillation decay curve, were produced and emitted uniformly into a solid angle of 4π along the track segments of the alpha and its secondaries. By tracking all photons and taking into account the quantum efficiency of the photocathode, the number of photoelectrons and their time and position distributions were obtained. Using a four-corner data reconstruction formula, flood images of the alpha detector with and without optical grease between the YAP scintillator and the fiber-optic faceplate were obtained, which show agreement with the experimental results. The reconstructed position uncertainties of incident alpha particles for the two cases are 1.198 mm and 0.998 mm, respectively, across the sensitive area of the detector. Simulation results also show that, compared with faceplates composed of 500 μm, 300 μm, and 100 μm fibers, the 10-μm-fiber faceplate is the best choice for position performance. In addition, the study of the background originating inside the D-T generator suggests that for 500-μm-thick YAP:Ce coated with 1-μm-thick aluminum, a very good signal-to-noise ratio can be expected through application of a simple threshold.
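Four-corner reconstruction for a PSPMT is commonly the Anger-logic charge centroid. A minimal sketch under that assumption (the paper's exact formula and corner convention may differ):

```python
def four_corner_position(q_a, q_b, q_c, q_d):
    """Reconstruct the (x, y) light centroid from the four corner signals of a
    position-sensitive PMT. Assumed corner layout: A = top-left, B = top-right,
    C = bottom-left, D = bottom-right. Output normalised to [-1, 1] per axis."""
    total = q_a + q_b + q_c + q_d
    if total <= 0:
        raise ValueError("no signal")
    x = ((q_b + q_d) - (q_a + q_c)) / total   # right minus left
    y = ((q_a + q_b) - (q_c + q_d)) / total   # top minus bottom
    return x, y
```

Applying this to every simulated event, with the corner charges smeared by photoelectron statistics, produces the flood images described above.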

  14. SU-E-T-519: Emission of Secondary Particles From a PMMA Phantom During Proton Irradiation: A Simulation Study with the Geant4 Monte Carlo Toolkit

    SciTech Connect

    Lau, A; Chen, Y; Ahmad, S

    2014-06-01

    Purpose: Proton therapy exhibits several advantages over photon therapy due to the depth-dose distributions of proton interactions within the target material. However, uncertainties associated with the proton beam range in the patient limit the advantage of proton therapy applications. To quantify beam range, positron-emitting nuclei (PEN) and prompt gamma (PG) techniques have been developed. These techniques use de-excitation photons to describe the location of the beam in the patient. To develop a detector system implementing the PG technique for range verification in proton therapy, we studied the yields, energy and angular distributions of the secondary particles emitted from a PMMA phantom. Methods: Proton pencil beams of various energies incident onto a PMMA phantom with dimensions of 5 x 5 x 50 cm3 were simulated with the Geant4 toolkit using the standard electromagnetic packages as well as packages based on the binary-cascade nuclear model. The emitted secondary particles were then analyzed. Results: For 160 MeV incident protons, the yields of secondary neutrons and photons per 100 incident protons were ~6 and ~15, respectively. The secondary photon energy spectrum showed several peaks in the range between 0 and 10 MeV. The peaks located between 4 and 6 MeV were attributed to direct proton interactions with 12C (~4.4 MeV) and 16O (~6 MeV), respectively. Most of the escaping secondary neutrons were found to have energies between 10 and 100 MeV. Isotropic emission was found for lower energy neutrons (<10 MeV) and for photons of all energies, while higher energy neutrons were emitted predominantly in the forward direction. The yields of emitted photons and neutrons increased with increasing incident proton energy. Conclusions: A detector system is currently being developed incorporating the yields, energy and angular distributions of secondary particles from proton interactions obtained from this study.

  15. SU-E-T-290: Secondary Dose Monitoring Using Scintillating Fibers in Proton Therapy of Prostate Cancer: A Geant4 Monte Carlo Simulation

    SciTech Connect

    Tesfamicael, B; Gueye, P; Lyons, D; Avery, S; Mahesh, M

    2014-06-01

    Purpose: To monitor the secondary dose distribution originating from a water phantom during proton therapy of prostate cancer using scintillating fibers. Methods: The Geant4 Monte Carlo toolkit version 9.6.p02 was used to simulate prostate cancer proton therapy based treatments. Two cases were studied. In the first case, 8 × 8 = 64 equally spaced fibers inside three 4 × 4 × 2.54 cm{sup 3} DuPont™ Delrin blocks were used to monitor the emission of secondary particles in the transverse (left and right) and distal regions relative to the beam direction. In the second case, a scintillating block with a thickness of 2.54 cm and the same vertical and longitudinal dimensions as the water phantom was used. Geometrical cuts were used to extract the energy deposited in each fiber and in the scintillating block. Results: The transverse dose distributions from secondary particles in the two cases agree to within 5% and show very good symmetry. The energy deposited gradually increases as one moves from the peripheral rows of fibers towards the center of the block (aligned with the center of the prostate) and decreases as one goes from the frontal to the distal region of the block. The ratio of the dose to the prostate to that in the middle two rows of fibers showed a linear relationship, with a slope of (−3.55 ± 2.26) × 10{sup −5} MeV per treatment Gy. The distal detectors recorded very little deposited energy due to attenuation in water. Conclusion: With good calibration and the ability to define a good correlation between the dose to the external fibers and the prostate, such fibers can be used for real-time dose verification to the target.
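The linear dose relationship reported above is the kind of calibration that an ordinary least-squares fit extracts. A minimal sketch with illustrative data, not the study's measurements:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit y = m*x + b.
    Here x could be the fiber-measured energy and y the target (prostate) dose."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    m = sxy / sxx          # slope: dose-to-target per unit fiber signal
    b = my - m * mx        # intercept
    return m, b

# Hypothetical calibration points (fiber signal, delivered dose).
slope, intercept = fit_line([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0])
```

Once the slope is calibrated against known treatment doses, an external fiber reading maps directly to an estimate of the dose delivered to the target.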

  16. Technical Note: Implementation of biological washout processes within GATE/GEANT4—A Monte Carlo study in the case of carbon therapy treatments

    SciTech Connect

    Martínez-Rovira, I. Jouvie, C.; Jan, S.

    2015-04-15

    Purpose: The imaging of positron-emitting isotopes produced during patient irradiation is at present the only in vivo method used in clinics for hadrontherapy dose monitoring. However, the accuracy of this method is limited by the loss of signal due to metabolic decay processes (biological washout). In this work, a generic modeling of washout was incorporated into the GATE simulation platform. Additionally, the influence of the washout on the β{sup +} activity distributions in terms of absolute quantification and spatial distribution was studied. Methods: First, the irradiation of a human head phantom with a {sup 12}C beam, such that a homogeneous dose distribution was achieved in the tumor, was simulated. The generated {sup 11}C and {sup 15}O distribution maps were used as β{sup +} sources in a second simulation, where the PET scanner was modeled following a detailed Monte Carlo approach. The activity distributions obtained in the presence and absence of washout processes were compared for several clinical situations. Results: Results show that activity values are highly reduced (by a factor of 2) in the presence of washout. These processes have a significant influence on the shape of the PET distributions. Differences in the distal activity falloff position of 4 mm are observed for a tumor dose deposition of 1 Gy (T{sub ini} = 0 min). However, in the case of high doses (3 Gy), the washout processes do not have a large effect on the position of the distal activity falloff (differences lower than 1 mm). The important role of the tumor washout parameters in the activity quantification was also evaluated. Conclusions: With this implementation, GATE/GEANT4 is the only open-source code able to simulate the full chain from the hadrontherapy irradiation to the PET dose monitoring including biological effects. Results show the strong impact of the washout processes, indicating that the development of better models and measurement of biological washout data are
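Biological washout is commonly modelled (for example in Mizuno-type parameterisations) as a sum of exponential clearance components multiplying the physical decay. A sketch under that assumption, with hypothetical component values rather than the paper's fitted parameters:

```python
import math

def activity(t_s, a0, phys_half_life_s, components):
    """Measured beta+ activity at time t: physical decay times a biological
    washout factor modelled as a sum of exponential clearance components.
    components: list of (fraction, biological_half_life_s); fractions sum to 1."""
    phys = math.exp(-math.log(2.0) * t_s / phys_half_life_s)
    washout = sum(f * math.exp(-math.log(2.0) * t_s / hl)
                  for f, hl in components)
    return a0 * phys * washout

# Example: 11C (physical half-life ~1223 s) with hypothetical
# fast / medium / slow washout components.
components = [(0.35, 10.0), (0.30, 140.0), (0.35, 10000.0)]
```

Multiplying each voxel's isotope activity by such a factor before the PET simulation reproduces the signal loss the abstract quantifies.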

  17. GEANT4 simulation of a scintillating-fibre tracker for the cosmic-ray muon tomography of legacy nuclear waste containers

    NASA Astrophysics Data System (ADS)

    Clarkson, A.; Hamilton, D. J.; Hoek, M.; Ireland, D. G.; Johnstone, J. R.; Kaiser, R.; Keri, T.; Lumsden, S.; Mahon, D. F.; McKinnon, B.; Murray, M.; Nutbeam-Tuffs, S.; Shearer, C.; Staines, C.; Yang, G.; Zimmerman, C.

    2014-05-01

    Cosmic-ray muons are highly penetrative charged particles that are observed at sea level with a flux of approximately one per square centimetre per minute. They interact with matter primarily through Coulomb scattering, which is exploited in the field of muon tomography to image shielded objects in a wide range of applications. In this paper, simulation studies are presented that assess the feasibility of a scintillating-fibre tracker system for use in the identification and characterisation of nuclear materials stored within industrial legacy waste containers. A system consisting of a pair of tracking modules above and a pair below the volume to be assayed is simulated within the GEANT4 framework using a range of potential fibre pitches and module separations. Each module comprises two orthogonal planes of fibres that allow the reconstruction of the initial and Coulomb-scattered muon trajectories. A likelihood-based image reconstruction algorithm has been developed that allows the container content to be determined with respect to the scattering density λ, a parameter which is related to the atomic number Z of the scattering material. Images reconstructed from this simulation are presented for a range of anticipated scenarios that highlight the expected image resolution and the potential of this system for the identification of high-Z materials within a shielded, concrete-filled container. First results from a constructed prototype system are presented in comparison with those from a detailed simulation. Excellent agreement between experimental data and simulation is observed, showing clear discrimination between the different materials assayed throughout.
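The scattering density λ relates the variance of the multiple-Coulomb-scattering angle per unit depth to the material traversed. A sketch based on the standard Highland approximation (logarithmic correction omitted); this is an assumption about, not a statement of, the paper's exact definition:

```python
def scattering_theta0_rad(p_mev, depth_cm, rad_length_cm, beta=1.0):
    """RMS projected multiple-scattering angle for a muon of momentum p [MeV/c]
    crossing depth_cm of material with radiation length rad_length_cm
    (Highland formula, log term omitted)."""
    return 13.6 / (beta * p_mev) * (depth_cm / rad_length_cm) ** 0.5

def scattering_density(p_mev, rad_length_cm, beta=1.0):
    """lambda = <theta^2> per unit depth [rad^2/cm] at momentum p: the quantity
    that grows with Z and lets high-Z materials be discriminated."""
    return (13.6 / (beta * p_mev)) ** 2 / rad_length_cm
```

Because the radiation length shrinks rapidly with atomic number (roughly X0 ≈ 1.76 cm for iron versus ≈ 0.32 cm for uranium), λ is far larger for high-Z materials, which is what the likelihood-based reconstruction exploits.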

  18. The Educational Software Design and Evaluation for K-8: Oral and Dental Health Software

    ERIC Educational Resources Information Center

    Kabakci, Isil; Birinci, Gurkay; Izmirli, Serkan

    2007-01-01

    The aim of this study is to inform about the development of the software "Oral and Dental Health" that will supplement the course of Science and Technology for K8 students in the primary school curriculum and to carry out an evaluation study of the software. This software has been prepared for educational purposes. In relation to the evaluation of…

  19. Learning & Personality Types: A Case Study of a Software Design Course

    ERIC Educational Resources Information Center

    Ahmed, Faheem; Campbell, Piers; Jaffar, Ahmad; Alkobaisi, Shayma; Campbell, Julie

    2010-01-01

    The software industry has continued to grow over the past decade, and there is now a need to provide education and hands-on training to students in the various phases of the software life cycle. Software design is one of the vital phases of the software development cycle. Psychological theories assert that not everybody is fit for all kinds of tasks as…

  20. Pedagogy Embedded in Educational Software Design: Report of a Case Study.

    ERIC Educational Resources Information Center

    Hinostroza, J. Enrique; Mellar, Harvey

    2001-01-01

    Discussion of educational software focuses on a model of educational software that was derived from a case study of two elementary school teachers participating in a software design process. Considers human-computer interface, interaction, software browsing strategies, and implications for teacher training. (Author/LRW)

  1. Reducing the complexity of the software design process with object-oriented design

    NASA Technical Reports Server (NTRS)

    Schuler, M. P.

    1991-01-01

    Designing software is a complex process. How object-oriented design (OOD), coupled with formalized documentation and tailored object diagraming techniques, can reduce the complexity of the software design process is described and illustrated. The described OOD methodology uses a hierarchical decomposition approach in which parent objects are decomposed into layers of lower level child objects. A method of tracking the assignment of requirements to design components is also included. Increases in the reusability, portability, and maintainability of the resulting products are also discussed. This method was built on a combination of existing technology, teaching experience, consulting experience, and feedback from design method users. The discussed concepts are applicable to hierarchical OOD processes in general. Emphasis is placed on improving the design process by documenting the details of the procedures involved and incorporating improvements into those procedures as they are developed.

  2. Design of Timing Synchronization Software on EAST-NBI

    NASA Astrophysics Data System (ADS)

    Zhao, Yuanzhe; Hu, Chundong; Sheng, Peng; Zhang, Xiaodan

    2013-12-01

    To ensure the uniqueness and recognition of data, and to make it easy to analyze and process the data of all subsystems of the neutral beam injector (NBI), all subsystems require a unified system time. In this paper, the timing synchronization software is presented; it builds on several technologies, such as shared memory, multithreading, and the TCP protocol. Shared memory lets the server store client information and the system time, while multithreading allows different clients to be handled by different threads. The server runs under Linux; clients run under either Linux or Windows. With this design, all subsystems can be synchronized to within one second, an accuracy that is sufficient for the NBI system and ensures the reliability of the data.
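The client/server time-distribution idea can be sketched with standard sockets and threads. This is a minimal illustration of the pattern, not the EAST-NBI implementation:

```python
import socket
import threading
import time

def serve_time(server_sock):
    """Accept one client and send the server's clock as an ASCII timestamp."""
    conn, _ = server_sock.accept()
    with conn:
        conn.sendall(str(time.time()).encode())

def fetch_time(host, port):
    """Client side: read the unified system time from the server over TCP."""
    with socket.create_connection((host, port), timeout=5) as sock:
        return float(sock.recv(64).decode())

# Start a throwaway server on an ephemeral localhost port.
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=serve_time, args=(server,), daemon=True).start()

remote = fetch_time("127.0.0.1", port)
offset = abs(time.time() - remote)   # sub-second agreement expected locally
```

A real deployment would repeat this exchange periodically and correct each client's clock by the measured offset; shared memory on the server would hold the per-client state.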

  3. Design of Timing System Software on EAST-NBI

    NASA Astrophysics Data System (ADS)

    Zhao, Yuan-Zhe; Hu, Chun-Dong; Sheng, Peng; Zhang, Xiao-Dan; Wu, De-Yun; Cui, Qing-Long

    2013-10-01

    The Neutral Beam Injector (NBI) is one of the main plasma heating and current drive methods for the Experimental Advanced Superconducting Tokamak (EAST). In order to monitor the NBI experiment, control all the power supplies, and realize data acquisition and networking, a control system was designed. As an important part of the NBI control system, the timing system (TS) provides a unified clock for all subsystems of NBI. The TS controls the input/output services of digital and analog signals, and it sends feedback messages to the control server to support alarm and interlock protection. The TS software runs on Windows and is written in LabVIEW, using a client/server mode, multithreading, and cyclic redundancy checks. Experimental results have shown that the TS provides a stable and reliable clock to the subsystems of NBI and contributes to the safety of the whole NBI system.
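The cyclic-redundancy-check technique mentioned can be illustrated with Python's standard zlib.crc32; the message framing here is hypothetical, not the TS wire format:

```python
import struct
import zlib

def frame(payload: bytes) -> bytes:
    """Append a big-endian CRC-32 so the receiver can detect corrupted
    clock messages."""
    return payload + struct.pack(">I", zlib.crc32(payload))

def unframe(message: bytes) -> bytes:
    """Verify and strip the trailing CRC-32; raise on corruption."""
    payload, crc = message[:-4], struct.unpack(">I", message[-4:])[0]
    if zlib.crc32(payload) != crc:
        raise ValueError("CRC mismatch: message corrupted")
    return payload
```

Any bit flip in transit changes the computed CRC, so the receiver rejects the message instead of acting on a wrong clock value.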

  4. Software Would Largely Automate Design of Kalman Filter

    NASA Technical Reports Server (NTRS)

    Chuang, Jason C. H.; Negast, William J.

    2005-01-01

    Embedded Navigation Filter Automatic Designer (ENFAD) is a computer program being developed to automate the most difficult tasks in designing embedded software to implement a Kalman filter in a navigation system. The most difficult tasks are selection of error states of the filter and tuning of filter parameters, which are time-consuming trial-and-error tasks that require expertise and rarely yield optimum results. An optimum selection of error states and filter parameters depends on navigation-sensor and vehicle characteristics, and on filter processing time. ENFAD would include a simulation module that would incorporate all possible error states with respect to a given set of vehicle and sensor characteristics. The first of two iterative optimization loops would vary the selection of error states until the best filter performance was achieved in Monte Carlo simulations. For a fixed selection of error states, the second loop would vary the filter parameter values until an optimal performance value was obtained. Design constraints would be satisfied in the optimization loops. Users would supply vehicle and sensor test data that would be used to refine digital models in ENFAD. Filter processing time and filter accuracy would be computed by ENFAD.
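The parameters such a tool would tune (process and measurement noise variances) appear in the standard Kalman predict/update cycle. A minimal scalar sketch of that cycle, purely illustrative and not ENFAD itself:

```python
def kalman_step(x, p, z, q, r):
    """One predict/update cycle of a scalar Kalman filter.
    x, p: state estimate and its variance; z: new measurement;
    q, r: process and measurement noise variances (the tuning parameters)."""
    # Predict (identity dynamics for simplicity).
    p_pred = p + q
    # Update.
    k = p_pred / (p_pred + r)      # Kalman gain
    x_new = x + k * (z - x)
    p_new = (1.0 - k) * p_pred
    return x_new, p_new

# Feed a constant measurement: the estimate converges toward it
# and the estimate variance shrinks.
x, p = 0.0, 1.0
for z in [1.0, 1.0, 1.0, 1.0]:
    x, p = kalman_step(x, p, z, q=1e-4, r=0.5)
```

Varying q and r changes how aggressively the filter trusts measurements over its model, which is exactly the trade-off the described optimization loops would search over.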

  5. NASA software specification and evaluation system design, part 1

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The research to develop methods for reducing the effort expended in software development and verification is reported. The development of a formal software requirements methodology, a formal specifications language, a programming language, a language preprocessor, and code analysis tools are discussed.

  6. The investigation of prostatic calcifications using μ-PIXE analysis and their dosimetric effect in low dose rate brachytherapy treatments using Geant4

    NASA Astrophysics Data System (ADS)

    Pope, D. J.; Cutajar, D. L.; George, S. P.; Guatelli, S.; Bucci, J. A.; Enari, K. E.; Miller, S.; Siegele, R.; Rosenfeld, A. B.

    2015-06-01

    Low dose rate brachytherapy is a widely used modality for the treatment of prostate cancer. Most clinical treatment planning systems currently in use approximate all tissue to water, neglecting the existence of inhomogeneities, such as calcifications. The presence of prostatic calcifications may perturb the dose due to the higher photoelectric effect cross section in comparison to water. This study quantitatively evaluates the effect of prostatic calcifications on the dosimetric outcome of brachytherapy treatments by means of Monte Carlo simulations and its potential clinical consequences. Four pathological calcification samples were characterised with micro-particle induced x-ray emission (μ-PIXE) to determine their heavy elemental composition. Calcium, phosphorus and zinc were found to be the predominant heavy elements in the calcification composition. Four clinical patient brachytherapy treatments were modelled using Geant4 based Monte Carlo simulations, in terms of the distribution of brachytherapy seeds and calcifications in the prostate. Local dose reductions of up to 30% were observed at the calcification boundary, depending on calcification size. Single large calcifications and closely spaced calculi caused local dose reductions of 30-60%. Individual calculi smaller than 0.5 mm in diameter showed minimal dosimetric impact; however, the effects of small or diffuse calcifications within the prostatic tissue could not be determined using the methods employed in the study. The simulation study showed a varying reduction in common dosimetric parameters. D90 showed a reduction of 2-5%, regardless of calcification surface area and volume. The parameters V100, V150 and V200 were also reduced by as much as 3% and on average by 1%. These reductions were also found to relate to the surface area and volume of calcifications, which may have a significant dosimetric impact on brachytherapy treatment; however, such impacts depend strongly on specific factors

  7. The investigation of prostatic calcifications using μ-PIXE analysis and their dosimetric effect in low dose rate brachytherapy treatments using Geant4.

    PubMed

    Pope, D J; Cutajar, D L; George, S P; Guatelli, S; Bucci, J A; Enari, K E; Miller, S; Siegele, R; Rosenfeld, A B

    2015-06-01

    Low dose rate brachytherapy is a widely used modality for the treatment of prostate cancer. Most clinical treatment planning systems currently in use approximate all tissue to water, neglecting the existence of inhomogeneities, such as calcifications. The presence of prostatic calcifications may perturb the dose due to the higher photoelectric effect cross section in comparison to water. This study quantitatively evaluates the effect of prostatic calcifications on the dosimetric outcome of brachytherapy treatments by means of Monte Carlo simulations and its potential clinical consequences. Four pathological calcification samples were characterised with micro-particle induced x-ray emission (μ-PIXE) to determine their heavy elemental composition. Calcium, phosphorus and zinc were found to be the predominant heavy elements in the calcification composition. Four clinical patient brachytherapy treatments were modelled using Geant4 based Monte Carlo simulations, in terms of the distribution of brachytherapy seeds and calcifications in the prostate. Local dose reductions of up to 30% were observed at the calcification boundary, depending on calcification size. Single large calcifications and closely spaced calculi caused local dose reductions of 30-60%. Individual calculi smaller than 0.5 mm in diameter showed minimal dosimetric impact; however, the effects of small or diffuse calcifications within the prostatic tissue could not be determined using the methods employed in the study. The simulation study showed a varying reduction in common dosimetric parameters. D90 showed a reduction of 2-5%, regardless of calcification surface area and volume. The parameters V100, V150 and V200 were also reduced by as much as 3% and on average by 1%. These reductions were also found to relate to the surface area and volume of calcifications, which may have a significant dosimetric impact on brachytherapy treatment; however, such impacts depend strongly on specific factors

  8. Digital Modeling in Design Foundation Coursework: An Exploratory Study of the Effectiveness of Conceptual Design Software

    ERIC Educational Resources Information Center

    Guidera, Stan; MacPherson, D. Scot

    2008-01-01

    This paper presents the results of a study that was conducted to identify and document student perceptions of the effectiveness of computer modeling software introduced in a design foundations course that had previously utilized only conventional manually-produced representation techniques. Rather than attempt to utilize a production-oriented CAD…

  9. Software design of the ATLAS Muon Cathode Strip Chamber ROD

    NASA Astrophysics Data System (ADS)

    Murillo, R.; Huffer, M.; Claus, R.; Herbst, R.; Lankford, A.; Schernau, M.; Panetta, J.; Sapozhnikov, L.; Eschrich, I.; Deng, J.

    2012-12-01

    The ATLAS Cathode Strip Chamber system consists of two end-caps with 16 chambers each. The CSC Readout Drivers (RODs) are purpose-built boards encapsulating 13 DSPs and around 40 FPGAs. The principal responsibility of each ROD is the extraction of data from two chambers at a maximum trigger rate of 75 kHz. In addition, each ROD is in charge of the setup, control and monitoring of the on-detector electronics. This paper introduces the design of the CSC ROD software. The main features of this design include an event-flow schema that decentralizes the different dataflow streams, which can thus operate asynchronously at their own natural rates; an event-building mechanism that associates data transferred by the asynchronous streams belonging to the same event; and a sparsification algorithm that discards uninteresting events and thus reduces the data volume. The time constraints imposed by the trigger rate have made paramount the use of optimization techniques such as the curiously recurring template pattern and the programming of critical code in assembly language. The behaviour of the CSC RODs has been characterized in order to validate their performance.
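The event-building mechanism described, associating data transferred by asynchronous streams belonging to the same event, can be sketched as follows; the class and stream names are illustrative, not from the ROD software:

```python
from collections import defaultdict

class EventBuilder:
    """Associate fragments arriving asynchronously from several streams.
    A complete event is emitted only once every expected stream has
    contributed a fragment for that event ID."""

    def __init__(self, streams):
        self.expected = set(streams)
        self.pending = defaultdict(dict)   # event_id -> {stream: fragment}

    def add(self, event_id, stream, fragment):
        self.pending[event_id][stream] = fragment
        if set(self.pending[event_id]) == self.expected:
            return self.pending.pop(event_id)   # complete event
        return None                             # still waiting for fragments
```

Because completion is checked per event ID, each stream can run at its own rate; fragments simply accumulate until their partners arrive.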

  10. On the design of multimedia software and future system architectures

    NASA Astrophysics Data System (ADS)

    de With, Peter H. N.; Jaspers, Egbert G.

    2004-04-01

    A principal challenge in reducing the cost of designing complex systems-on-chip is to pursue more generic systems for a broad range of products. For this purpose, we explore three new architectural concepts for state-of-the-art video applications. First, we discuss a reusable scalable hardware architecture employing a hierarchical communication network that fits the natural hierarchy of the application. In a case study, we show that MPEG streaming in DTV occurs at a high level, while subsystems communicate at lower levels. The second concept is a software design that scales over a number of processors to enable reuse across a range of VLSI process technologies. We explore this via an H.264 decoder implementation that scales nearly linearly over up to eight processors by applying data partitioning. The third topic is resource scalability, which is required to satisfy real-time constraints in a system with a large amount of shared resources. An example complexity-scalable MPEG-2 coder scales the required cycle budget by a factor of three, in parallel with a smooth degradation of quality.

  11. MHTool User's Guide - Software for Manufactured Housing Structural Design

    SciTech Connect

    W. D. Richins

    2005-07-01

    Since the late 1990s, the Department of Energy's Idaho National Laboratory (INL) has worked with the US Department of Housing and Urban Development (HUD), the Manufactured Housing Institute (MHI), the National Institute of Standards and Technology (NIST), the National Science Foundation (NSF), and an industry committee to measure the response of manufactured housing to both artificial and natural wind loads and to develop a computational desktop tool to optimize the structural performance of manufactured housing to HUD Code loads. MHTool is the result of an 8-year intensive testing and verification effort using single and double section homes. MHTool is the first fully integrated structural analysis software package specifically designed for manufactured housing. To use MHTool, industry design engineers will enter information (geometries, materials, connection types, etc.) describing the structure of a manufactured home, creating a base model. Windows, doors, and interior walls can be added to the initial design. Engineers will input the loads required by the HUD Code (wind, snow loads, interior live loads, etc.) and run an embedded finite element solver to find walls or connections where stresses are either excessive or very low. The designer could, for example, substitute a less expensive and easier to install connection in areas with very low stress, then re-run the analysis for verification. If forces and stresses are still within HUD Code requirements, construction costs would be saved without sacrificing quality. Manufacturers can easily change geometries or component properties to optimize designs of various floor plans then submit MHTool input and output in place of calculations for DAPIA review. No change in the regulatory process is anticipated. MHTool, while not yet complete, is now ready for demonstration. The pre-BETA version (Build-16) was displayed at the 2005 National Congress & Expo for Manufactured & Modular Housing. Additional base models and an

  12. Design and performance test of spacecraft test and operation software

    NASA Astrophysics Data System (ADS)

    Wang, Guohua; Cui, Yan; Wang, Shuo; Meng, Xiaofeng

    2011-06-01

    Main test processor (MTP) software is the key element of the Electrical Ground Support Equipment (EGSE) for spacecraft test and operation, used at the Chinese Academy of Space Technology (CAST) for years without major revision. To meet the increasing demand for more efficient and agile MTP software, a new MTP software was developed. It adopts a layered, plug-in-based software architecture whose core runtime server provides message-queue management, shared-memory management, and process-management services, forming the framework of a configurable, open-architecture system. To evaluate the MTP software's performance, test cases for network response time, test-sequence management capability, and data-processing capability are described in detail. Test results show that the new MTP software is more general and performs better than the legacy one.
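The layered, plug-in architecture with a message-queue core described in this abstract might be sketched as follows; this is a minimal illustration, and the names (CoreServer, the "telemetry" topic) are hypothetical, not taken from the paper:

```python
import queue

class CoreServer:
    """Minimal runtime core: owns a message queue and a plug-in registry,
    so new capabilities are added by registering handlers, not by
    modifying the core (the open-architecture idea in the abstract)."""
    def __init__(self):
        self._queue = queue.Queue()
        self._plugins = {}  # topic -> handler callable

    def register(self, topic, handler):
        """A plug-in subscribes a handler for one message topic."""
        self._plugins[topic] = handler

    def post(self, topic, payload):
        """Producers enqueue messages; they never call plug-ins directly."""
        self._queue.put((topic, payload))

    def run_once(self):
        """Dispatch a single queued message to the matching plug-in."""
        topic, payload = self._queue.get()
        return self._plugins[topic](payload)

server = CoreServer()
server.register("telemetry", lambda frame: "decoded:" + frame)
server.post("telemetry", "0xCAFE")
print(server.run_once())  # -> decoded:0xCAFE
```

The design choice illustrated is decoupling: the core only routes messages, so plug-ins can be added or replaced without touching the runtime server.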

  13. Designing Computerized Provider Order Entry Software in Iran: The Nurses' and Physicians' Viewpoints.

    PubMed

    Khammarnia, Mohammad; Sharifian, Roxana; Zand, Farid; Keshtkaran, Ali; Barati, Omid

    2016-09-01

    This study aimed to identify the functional requirements of computerized provider order entry software and to design this software in Iran. The study was conducted using document review, interviews, and focus group discussions at Shiraz University of Medical Sciences, a major medical hub in Iran, in 2013-2015. The study sample consisted of physicians (n = 12) and nurses (n = 2) in the largest hospital in the southern part of Iran and information technology experts (n = 5) at Shiraz University of Medical Sciences. Functional requirements of the computerized provider order entry system were examined in three phases. Finally, the functional requirements were distributed across four levels, and the computerized provider order entry software was designed accordingly. The software had seven main dimensions: (1) data entry, (2) drug interaction management system, (3) warning system, (4) treatment services, (5) ability to write in software, (6) reporting from all sections of the software, and (7) technical capabilities of the software. The nurses and physicians emphasized quick access to the computerized provider order entry software, the order prescription section, and the applicability of the software. The software had some items that had not been mentioned in other studies. Ultimately, the software was designed by a company specializing in hospital information systems in Iran. This study was the first specific investigation of computerized provider order entry software design in Iran. Based on the results, it is suggested that this software be implemented in hospitals. PMID:27270630

  14. Design consideration for design a flat and ring plastics part using Solidworks software

    NASA Astrophysics Data System (ADS)

    Amran, M. A. M.; Faizal, K. M.; Salleh, M. S.; Sulaiman, M. A.; Mohamad, E.

    2015-12-01

    Various design considerations for plastic injection-moulded parts are applied at an early stage to prevent defects in the end product. The objective of this project was therefore to design plastic injection-moulded parts taking into consideration several factors, such as draft angle, corner radius, and gate location. In this project, a flat plastic part, a ring plastic part, and core inserts for both parts were designed using SolidWorks software. Each plastic part was drawn in sketching mode, and the 3D solid model was then generated using various commands. Design considerations such as draft angle and corner radius, together with gate location, were taken into account at the design stage. Both plastic parts, each with its respective insert, were successfully designed in SolidWorks. The flat and ring plastic parts were designed for future research studying weld lines, meld lines, trapped air, and the geometrical size of the product. By designing the flat and ring plastic parts with a core insert in each, a complete two-plate mould design can be considered, since plastic injection parts must be designed properly to avoid defects when the mould is made.

  15. Risk management in the design of medical device software systems.

    PubMed

    Jones, Paul L; Jorgens, Joseph; Taylor, Alford R; Weber, Markus

    2002-01-01

    The safety of any medical device system is dependent on the application of a disciplined, well-defined, risk management process throughout the product life cycle. Hardware, software, human, and environmental interactions must be assessed in terms of intended use, risk, and cost/benefit criteria. This article addresses these issues in the context of medical devices that incorporate software. The article explains the principles of risk management, using terminology and examples from the domain of software engineering. It may serve as a guide to those new to the concepts of risk management and as an aide-memoire for medical device system/software engineers who are more familiar with the topic.

  16. Exploratory research for the development of a computer aided software design environment with the software technology program

    NASA Technical Reports Server (NTRS)

    Hardwick, Charles

    1991-01-01

    Field studies were conducted by MCC to determine areas of research of mutual interest to MCC and JSC. NASA personnel from the Information Systems Directorate and research faculty from UHCL/RICIS visited MCC in Austin, Texas to examine tools and applications under development in the MCC Software Technology Program. MCC personnel presented workshops in hypermedia, design knowledge capture, and design recovery on site at JSC for ISD personnel. The following programs were installed on workstations in the Software Technology Lab, NASA/JSC: (1) GERM (Graphic Entity Relations Modeler); (2) gIBIS (Graphic Issues Based Information System); and (3) DESIRE (Design Recovery tool). These applications were made available to NASA for inspection and evaluation. Programs developed in the MCC Software Technology Program run on the SUN workstation. The programs do not require special configuration, but they will require larger than usual amounts of disk space and RAM to operate properly.

  17. A Formal Approach to Domain-Oriented Software Design Environments

    NASA Technical Reports Server (NTRS)

    Lowry, Michael; Philpot, Andrew; Pressburger, Thomas; Underwood, Ian; Lum, Henry, Jr. (Technical Monitor)

    1994-01-01

    This paper describes a formal approach to domain-oriented software design environments, based on declarative domain theories, formal specifications, and deductive program synthesis. A declarative domain theory defines the semantics of a domain-oriented specification language and its relationship to implementation-level subroutines. Formal specification development and reuse is made accessible to end-users through an intuitive graphical interface that guides them in creating diagrams denoting formal specifications. The diagrams also serve to document the specifications. Deductive program synthesis ensures that end-user specifications are correctly implemented. AMPHION has been applied to the domain of solar system kinematics through the development of a declarative domain theory, which includes an axiomatization of JPL's SPICELIB subroutine library. Testing over six months with planetary scientists indicates that AMPHION's interactive specification acquisition paradigm enables users to develop, modify, and reuse specifications at least an order of magnitude more rapidly than manual program development. Furthermore, AMPHION synthesizes one to two page programs consisting of calls to SPICELIB subroutines from these specifications in just a few minutes. Test results obtained by metering AMPHION's deductive program synthesis component are examined. AMPHION has been installed at JPL and is currently undergoing further refinement in preparation for distribution to hundreds of SPICELIB users worldwide. Current work to support end-user customization of AMPHION's specification acquisition subsystem is briefly discussed, as well as future work to enable domain-expert creation of new AMPHION applications through development of suitable domain theories.

  18. A cross-disciplinary technology transfer for search-based evolutionary computing: from engineering design to software engineering design

    NASA Astrophysics Data System (ADS)

    Simons, C. L.; Parmee, I. C.

    2007-07-01

    Although object-oriented conceptual software design is difficult to learn and perform, computational tool support for the conceptual software designer is limited. In conceptual engineering design, however, computational tools exploiting interactive evolutionary computation (EC) have shown significant utility. This article investigates the cross-disciplinary technology transfer of search-based EC from engineering design to software engineering design in an attempt to provide support for the conceptual software designer. Firstly, genetic operators inspired by genetic algorithms (GAs) and evolutionary programming are evaluated for their effectiveness against a conceptual software design representation using structural cohesion as an objective fitness function. Building on this evaluation, a multi-objective GA inspired by a non-dominated Pareto sorting approach is investigated for an industrial-scale conceptual design problem. Results obtained reveal a mass of interesting and useful conceptual software design solution variants of equivalent optimality—a typical characteristic of successful multi-objective evolutionary search techniques employed in conceptual engineering design. The mass of software design solution variants produced suggests that transferring search-based technology across disciplines has significant potential to provide computationally intelligent tool support for the conceptual software designer.

  19. Design study of Software-Implemented Fault-Tolerance (SIFT) computer

    NASA Technical Reports Server (NTRS)

    Wensley, J. H.; Goldberg, J.; Green, M. W.; Kutz, W. H.; Levitt, K. N.; Mills, M. E.; Shostak, R. E.; Whiting-Okeefe, P. M.; Zeidler, H. M.

    1982-01-01

    Software-implemented fault tolerant (SIFT) computer design for commercial aviation is reported. A SIFT design concept is addressed. Alternate strategies for physical implementation are considered. Hardware and software design correctness is addressed. System modeling and effectiveness evaluation are considered from a fault-tolerant point of view.

  20. Research and Design Issues Concerning the Development of Educational Software for Children. Technical Report No. 14.

    ERIC Educational Resources Information Center

    Char, Cynthia

    Several research and design issues to be considered when creating educational software were identified by a field test evaluation of three types of innovative software created at Bank Street College: (1) Probe, software for measuring and graphing temperature data; (2) Rescue Mission, a navigation game that illustrates the computer's use for…

  1. Teacher-Designed Software for Interactive Linear Equations: Concepts, Interpretive Skills, Applications & Word-Problem Solving.

    ERIC Educational Resources Information Center

    Lawrence, Virginia

    No longer just a user of commercial software, the 21st century teacher is a designer of interactive software based on theories of learning. This software, a comprehensive study of straight-line equations, enhances conceptual understanding, sketching, graphic interpretation, and word-problem solving skills, as well as making connections to real-life and…

  2. TH-E-BRE-01: A 3D Solver of Linear Boltzmann Transport Equation Based On a New Angular Discretization Method with Positivity for Photon Dose Calculation Benchmarked with Geant4

    SciTech Connect

    Hong, X; Gao, H

    2014-06-15

    Purpose: The Linear Boltzmann Transport Equation (LBTE), solved through the statistical Monte Carlo (MC) method, provides accurate dose calculation in radiotherapy. This work investigates an alternative: solving the LBTE accurately with a deterministic numerical method, which may offer a computational speed advantage over MC. Methods: Instead of using traditional spherical harmonics to approximate the angular scattering kernel, our deterministic numerical method directly computes angular scattering weights, based on a new angular discretization that applies a linear finite element method on a local triangulation of the unit angular sphere. As a result, our angular discretization method has the unique advantage of positivity, i.e., it keeps all scattering weights nonnegative at all times, which is physically correct. Moreover, our method is local in angular space and therefore handles anisotropic scattering well, such as forward-peaked scattering. To be compatible with image-guided radiotherapy, the spatial variables are discretized on a structured grid with the standard diamond scheme. After discretization, an improved source-iteration method is used to solve the linear system without storing it in memory. The accuracy of our 3D solver is validated using analytic solutions and benchmarked with Geant4, a popular MC solver. Results: The differences between Geant4 solutions and our solutions were less than 1.5% for various test cases that mimic practical cases. More details are available in the supporting document. Conclusion: We have developed a 3D LBTE solver based on a new angular discretization method that guarantees the positivity of scattering weights for physical correctness, and it has been benchmarked with Geant4 for photon dose calculation.
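The source-iteration scheme mentioned in the abstract solves a fixed-point equation of the form psi = T⁻¹(S·psi + q) without ever assembling the full linear system. A minimal toy sketch on a two-angle discretization is shown below; the matrices are illustrative only, not the paper's actual operators:

```python
def source_iteration(T_inv, S, q, tol=1e-10, max_iter=1000):
    """Solve psi = T_inv @ (S @ psi + q) by fixed-point (source) iteration.

    T_inv and S are small dense matrices given as lists of rows. In a real
    transport solver, T_inv would be applied matrix-free (a transport sweep),
    which is how the linear system avoids being stored in memory.
    """
    n = len(q)
    matvec = lambda M, v: [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]
    psi = [0.0] * n
    for _ in range(max_iter):
        rhs = [s + qi for s, qi in zip(matvec(S, psi), q)]    # scatter + source
        new = matvec(T_inv, rhs)                              # "transport sweep"
        if max(abs(a - b) for a, b in zip(new, psi)) < tol:   # converged?
            return new
        psi = new
    return psi

# 2-angle toy problem: inverse removal operator and nonnegative scattering weights
T_inv = [[0.5, 0.0], [0.0, 0.5]]
S = [[0.2, 0.1], [0.1, 0.2]]   # positivity: all weights nonnegative
q = [1.0, 1.0]                 # external source
print(source_iteration(T_inv, S, q))
```

For this symmetric toy case the exact fixed point is psi = [0.5/0.85, 0.5/0.85] ≈ [0.588, 0.588]; the iteration converges because the iteration matrix T⁻¹S has spectral radius well below 1.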

  3. Object-oriented software design for the Mt. Wilson 100-inch Hooker telescope adaptive optics system

    NASA Astrophysics Data System (ADS)

    Schneider, Thomas G.

    2000-06-01

    The object-oriented software design paradigm was instrumental in the development of the Adoptics software used in the Hooker telescope's ADOPT adaptive optics system. The software runs on a Pentium-class PC host and eight DSP processors connected to the host's motherboard bus. C++ classes implement most of the host software's functionality, with the object-oriented features of inheritance, encapsulation, and abstraction being the most useful. Careful class design at the inception of the project allowed the rapid addition of features without compromising the integrity of the software. Base class implementations include the DSP system, real-time graphical displays, and opto-mechanical actuator control.

  4. Wake Turbulence Mitigation for Departures (WTMD) Prototype System - Software Design Document

    NASA Technical Reports Server (NTRS)

    Sturdy, James L.

    2008-01-01

    This document describes the software design of a prototype Wake Turbulence Mitigation for Departures (WTMD) system that was evaluated in shadow mode operation at the Saint Louis (KSTL) and Houston (KIAH) airports. This document describes the software that provides the system framework, communications, user displays, and hosts the Wind Forecasting Algorithm (WFA) software developed by the M.I.T. Lincoln Laboratory (MIT-LL). The WFA algorithms and software are described in a separate document produced by MIT-LL.

  5. IDEAS and App Development Internship in Hardware and Software Design

    NASA Technical Reports Server (NTRS)

    Alrayes, Rabab D.

    2016-01-01

    In this report, I will discuss the tasks and projects I have completed while working as an electrical engineering intern during the spring semester of 2016 at NASA Kennedy Space Center. In the field of software development, I completed tasks for the G-O Caching Mobile App and the Asbestos Management Information System (AMIS) Web App. The G-O Caching Mobile App was written in HTML, CSS, and JavaScript on the Cordova framework, while the AMIS Web App is written in HTML, CSS, JavaScript, and C# on the AngularJS framework. My goals and objectives on these two projects were to produce an app with an eye-catching and intuitive User Interface (UI), which will attract more employees to participate; to produce a fully-tested, fully functional app which supports workforce engagement and exploration; to produce a fully-tested, fully functional web app that assists technicians working in asbestos management. I also worked in hardware development on the Integrated Display and Environmental Awareness System (IDEAS) wearable technology project. My tasks on this project were focused in PCB design and camera integration. My goals and objectives for this project were to successfully integrate fully functioning custom hardware extenders on the wearable technology headset to minimize the size of hardware on the smart glasses headset for maximum user comfort; to successfully integrate fully functioning camera onto the headset. By the end of this semester, I was able to successfully develop four extender boards to minimize hardware on the headset, and assisted in integrating a fully-functioning camera into the system.

  6. SWEPP Assay System Version 2.0 software design description

    SciTech Connect

    East, L.V.; Marwil, E.S.

    1996-08-01

    The Idaho National Engineering Laboratory (INEL) Stored Waste Examination Pilot Plant (SWEPP) operations staff use nondestructive analysis methods to characterize the radiological contents of contact-handled radioactive waste containers. Containers of waste from Rocky Flats Environmental Technology Site and other Department of Energy (DOE) sites are currently stored at SWEPP. Before these containers can be shipped to the Waste Isolation Pilot Plant (WIPP), SWEPP must verify compliance with storage, shipping, and disposal requirements. This program has been in operation since 1985 at the INEL Radioactive Waste Management Complex (RWMC). One part of the SWEPP program measures neutron emissions from the containers and estimates the mass of plutonium and other transuranic (TRU) isotopes present. A Passive/Active Neutron (PAN) assay system developed at the Los Alamos National Laboratory is used to perform these measurements. A computer program named NEUT2 was originally used to perform the data acquisition and reduction functions for the neutron measurements. This program was originally developed at Los Alamos and extensively modified by a commercial vendor of PAN systems and by personnel at the INEL. NEUT2 uses the analysis methodology outlined, but no formal documentation exists on the program itself. The SWEPP Assay System (SAS) computer program replaced the NEUT2 program in early 1994. The SAS software was developed using an "object model" approach and is documented in accordance with American National Standards Institute (ANSI) and Institute of Electrical and Electronic Engineers (IEEE) standards. The new program incorporates the basic analysis algorithms found in NEUT2. Additional functionality and improvements include a graphical user interface, the ability to change analysis parameters without program code modification, an "object model" design approach, and other features for improved flexibility and maintainability.

  7. An overview of software design languages. [for Galileo spacecraft Command and Data Subsystems

    NASA Technical Reports Server (NTRS)

    Callender, E. D.

    1980-01-01

    The nature and use of design languages and associated processors that are used in software development are reviewed with reference to development work on the Galileo spacecraft project, a Jupiter orbiter scheduled for launch in 1984. The major design steps are identified (functional design, architectural design, detailed design, coding, and testing), and the purpose, functions and the range of applications of design languages are examined. Then the general character of any design language is analyzed in terms of syntax and semantics. Finally, the differences and similarities between design languages are illustrated by examining two specific design languages: Software Design and Documentation language and Problem Statement Language/Problem Statement Analyzer.

  8. The UNIX Operating System: A Model for Software Design.

    ERIC Educational Resources Information Center

    Kernighan, Brian W.; Morgan, Samuel P.

    1982-01-01

    Describes UNIX time-sharing operating system, including the program environment, software development tools, flexibility and ease of change, portability and other advantages, and five applications and three nonapplications of the system. (JN)

  9. Fault tree synthesis for software design analysis of PLC based safety-critical systems

    SciTech Connect

    Koo, S. R.; Cho, C. H.; Seong, P. H.

    2006-07-01

    As software verification and validation must be performed during the development of PLC-based safety-critical systems, software safety analysis is also considered throughout the entire software life cycle. In this paper, we propose a technique for software safety analysis in the design phase. Among the various software hazard analysis techniques, fault tree analysis is the most widely used for the safety analysis of nuclear power plant systems; it also has the most intuitive notation and supports both qualitative and quantitative analyses. To analyze the design phase more effectively, we propose a fault tree synthesis technique, along with a universal fault tree template for the architecture modules of nuclear software. Consequently, the safety of software can be analyzed on the basis of fault tree synthesis. (authors)
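The qualitative-plus-quantitative evaluation that fault tree analysis supports can be sketched with a tiny tree of AND/OR gates over basic events; this is a generic illustration of fault tree evaluation under an independence assumption, not the paper's synthesis technique, and the example events and probabilities are invented:

```python
class Gate:
    """A fault tree node: a basic event (leaf) or an AND/OR gate."""
    def __init__(self, kind, children=None, prob=0.0):
        self.kind = kind              # "event", "and", or "or"
        self.children = children or []
        self.prob = prob              # failure probability (basic events only)

    def probability(self):
        """Quantitative evaluation, assuming independent basic events."""
        if self.kind == "event":
            return self.prob
        ps = [c.probability() for c in self.children]
        if self.kind == "and":        # all children must fail
            out = 1.0
            for p in ps:
                out *= p
            return out
        out = 1.0                     # OR gate: 1 - prod(1 - p_i)
        for p in ps:
            out *= (1.0 - p)
        return 1.0 - out

# Hypothetical top event: module fails if (input fault AND missing check) OR logic bug
tree = Gate("or", [
    Gate("and", [Gate("event", prob=0.01), Gate("event", prob=0.5)]),
    Gate("event", prob=0.001),
])
print(tree.probability())  # -> 0.005995 (0.01*0.5 = 0.005; 1 - 0.995*0.999)
```

The same tree structure supports qualitative analysis (e.g., enumerating minimal cut sets) by walking the gates symbolically instead of numerically.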

  10. Software Support for Online Mentoring Programs: A Research-Inspired Design

    ERIC Educational Resources Information Center

    O'Neill, Kevin D.; Weiler, Mark; Sha, Li

    2005-01-01

    This article provides an overview of Telementoring Orchestrator[TM] (TMO), a new web-based software tool designed to aid small or large organizations in supporting telementoring programs (also called online mentoring or e-mentoring programs). In this report, we review the research that inspired the design of the software, and survey the major…

  11. User-Centered Design Guidelines for Collaborative Software for Intelligence Analysis

    SciTech Connect

    Scholtz, Jean; Endert, Alexander N.

    2014-08-01

    In this position paper we discuss the necessity of using User-Centered Design (UCD) methods in order to design collaborative software for the intelligence community. We present some standing issues in collaborative software based on existing work within the intelligence community. Based on this information we present opportunities to address some of these challenges.

  12. User-Centered Design Guidelines for Collaborative Software for Intelligence Analysis

    SciTech Connect

    Scholtz, Jean; Endert, Alexander

    2014-07-01

    In this position paper we discuss the necessity of using User-Centered Design (UCD) methods in order to design collaborative software for the intelligence community. We discuss a number of studies of collaboration in the intelligence community and use this information to provide some guidelines for collaboration software.

  13. Windows Calorimeter Control (WinCal) program computer software design description

    SciTech Connect

    Pertzborn, N.F.

    1997-03-26

    The Windows Calorimeter Control (WinCal) Program System Design Description contains a discussion of the design details for the WinCal product. Information in this document will assist a developer in maintaining the WinCal system. The content of this document follows the guidance in WHC-CM-3-10, Software Engineering Standards, Standard for Software User Documentation.

  14. Validation of mission critical software design and implementation using model checking

    NASA Technical Reports Server (NTRS)

    Pingree, P. J.; Mikk, E.; Holzmann, G.; Smith, M.; Dams, D.

    2002-01-01

    Model Checking conducts an exhaustive exploration of all possible behaviors of a software system design and as such can be used to detect defects in designs that are typically difficult to discover with conventional testing approaches.
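The exhaustive exploration that model checking performs can be illustrated with a toy explicit-state reachability check: enumerate every reachable state of a design and report a concrete path to any state that violates a safety property. This is a generic sketch of the idea, not the tool or models used in the paper:

```python
from collections import deque

def reachable_bad_state(init, next_states, is_bad):
    """Breadth-first exploration of all reachable states. Returns a
    counterexample path to a 'bad' state, or None if the design is safe."""
    frontier = deque([(init, [init])])
    seen = {init}
    while frontier:
        state, path = frontier.popleft()
        if is_bad(state):
            return path                      # concrete defect trace
        for nxt in next_states(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, path + [nxt]))
    return None

# Toy design: a counter stepping by +1 or +2 within 0..5 must never reach 4
path = reachable_bad_state(
    0,
    lambda s: {s + 1, s + 2} & set(range(6)),
    lambda s: s == 4,
)
print(path)  # a shortest trace from 0 ending in the forbidden state 4
```

Unlike conventional testing, which samples some executions, this search visits every reachable state exactly once, which is why it can expose defects testing tends to miss (at the cost of state-space size).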

  15. Application of software technology to a future spacecraft computer design

    NASA Technical Reports Server (NTRS)

    Labaugh, R. J.

    1980-01-01

    A study was conducted to determine how major improvements in spacecraft computer systems can be obtained from recent advances in hardware and software technology. Investigations into integrated circuit technology indicated that the CMOS/SOS chip set being developed for the Air Force Avionics Laboratory at Wright Patterson had the best potential for improving the performance of spaceborne computer systems. An integral part of the chip set is the bit slice arithmetic and logic unit. The flexibility allowed by microprogramming, combined with the software investigations, led to the specification of a baseline architecture and instruction set.

  16. A software design approach for heterogeneous systems of unattended sensors, unmanned vehicles, and monitoring stations

    NASA Astrophysics Data System (ADS)

    Smuda, William J.; Gerhart, Grant; Shing, Man-Tak; Auguston, Mikhail

    2006-09-01

    The design and implementation of software for network systems of diverse physical assets is a continuing challenge to sensor network developers. The problems are often multiplied when adding new elements, and when reconfiguring existing systems. For software systems, like physical systems, explicit architectural descriptions increase system level comprehension. Coupled with well defined object oriented design practices, system extensibility is defined and software reuse and code composition are enabled. Our research is based on model driven design architecture. High level system models are defined in the Unified Modeling Language (UML), the language of the software engineer. However, since most experimental work is done by non-software specialists, (electronics Engineers, Mechanical Engineers and technicians) the model is translated into a graphical, domain specific model. Components are presented as domain specific icons, and constraints from the UML model are propagated into the domain model. Domain specialists manipulate the domain model, which then composes software elements needed at each node to create an aggregate system.

  17. Open Source Software for Experiment Design and Control. (tutorial)

    ERIC Educational Resources Information Center

    Hillenbrand, James M.; Gayvert, Robert T.

    2005-01-01

    The purpose of this paper is to describe a software package that can be used for performing such routine tasks as controlling listening experiments (e.g., simple labeling, discrimination, sentence intelligibility, and magnitude estimation), recording responses and response latencies, analyzing and plotting the results of those experiments,…

  18. Girls' Preferences in Software Design: Insights from a Focus Group.

    ERIC Educational Resources Information Center

    Miller, Leslie; And Others

    1996-01-01

    A lack of gender-sensitive computer games exacerbates female disinterest in technology. Girls-only focus groups revealed phenomena that may help software developers awaken girls' enthusiasm for computing. For instance, girls placed a premium on richly textured video and audio, on collaborating rather than competing, on interacting with male…

  19. The Impact of Social Software in Product Design Higher Education

    ERIC Educational Resources Information Center

    Hurn, Karl

    2012-01-01

    It is difficult to ignore the impact that Web 2.0 and the subsequent social software revolution has had on society in general, and young people in particular. Information is exchanged and interpreted extremely quickly and in ways that were not imagined 10 years ago. Universities are struggling to keep up with this new technology, with outdated…

  20. Autochthonous Change: Self-Renewal through Open Software Design

    ERIC Educational Resources Information Center

    Hedbring, Charles

    2005-01-01

    In all likelihood, currently employed therapists and teachers grew up with computer technology. Part of their computer culture included programming computers for entertainment using popular consumer software like Microsoft Basic. Within this social-educational milieu, the "FACTS+" curriculum represents one long-term project covering the…

  1. Designing Educational Software with Students through Collaborative Design Games: The We!Design&Play Framework

    ERIC Educational Resources Information Center

    Triantafyllakos, George; Palaigeorgiou, George; Tsoukalas, Ioannis A.

    2011-01-01

    In this paper, we present a framework for the development of collaborative design games that can be employed in participatory design sessions with students for the design of educational applications. The framework is inspired by idea generation theory and the design games literature, and guides the development of board games which, through the use…

  2. QUICK - AN INTERACTIVE SOFTWARE ENVIRONMENT FOR ENGINEERING DESIGN

    NASA Technical Reports Server (NTRS)

    Schlaifer, R. S.

    1994-01-01

    QUICK provides the computer user with the facilities of a sophisticated desk calculator which can perform scalar, vector, and matrix arithmetic, propagate conic orbits, determine planetary and satellite coordinates, and perform other related astrodynamic calculations within a Fortran-like environment. QUICK is an interpreter, therefore eliminating the need to use a compiler or a linker to run QUICK code. QUICK capabilities include options for automated printing of results, the ability to submit operating system commands on some systems, and access to a plotting package (MASL) and a text editor without leaving QUICK. Mathematical and programming features of QUICK include the ability to handle arbitrary algebraic expressions, the capability to define user functions in terms of other functions, built-in constants such as pi, direct access to useful COMMON areas, matrix capabilities, extensive use of double precision calculations, and the ability to automatically load user functions from a standard library. The MASL (Multi-mission Analysis Software Library) plotting package, included in the QUICK package, is a set of FORTRAN 77 compatible subroutines designed to facilitate the plotting of engineering data by allowing programmers to write plotting-device-independent applications. Its universality lies in the number of plotting devices it puts at the user's disposal. The MASL package of routines has proved very useful and easy to work with, yielding good plots for most new users on the first or second try. The functions provided include routines for creating histograms, "wire mesh" surface plots, and contour plots, as well as normal graphs with a large variety of axis types. The library has routines for plotting on cartesian, polar, log, mercator, cyclic, calendar, and stereographic axes, and for performing automatic or explicit scaling. The lengths of the axes of a plot are completely under the control of the program using the library. Programs written to use the MASL

  3. Digital hardware and software design for infrared sensor image processing

    NASA Astrophysics Data System (ADS)

    Bekhtin, Yuri; Barantsev, Alexander; Solyakov, Vladimir; Medvedev, Alexander

    2005-06-01

    An example of a digital hardware-and-software complex consisting of a multi-element matrix sensor and a personal computer with the installed special card AMBPCI is described. The problem of eliminating so-called fixed pattern noise (FPN) is considered. To improve current imaging, the residual FPN is represented as multiplicative noise. A wavelet-based de-noising algorithm using sets of noisy and non-noisy image data is applied.
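Since the abstract models residual FPN as multiplicative noise, the basic correction idea can be sketched with a per-pixel gain (flat-field) correction. Note this is the standard flat-field technique, not the paper's wavelet-based algorithm, and the pixel values are invented:

```python
def estimate_gain(flat_frame):
    """Estimate per-pixel gain from a frame of a uniform (flat-field) scene:
    gain[i] = pixel / mean, so a perfectly uniform sensor gives gain 1.0."""
    mean = sum(flat_frame) / len(flat_frame)
    return [px / mean for px in flat_frame]

def correct_fpn(frame, gain):
    """Remove multiplicative fixed pattern noise: each pixel's response is
    modeled as observed = gain[i] * true[i], so divide by the gain."""
    return [px / g for px, g in zip(frame, gain)]

flat = [90.0, 100.0, 110.0]      # uniform scene seen through the FPN
gain = estimate_gain(flat)       # -> roughly [0.9, 1.0, 1.1]
noisy = [45.0, 50.0, 55.0]       # another scene carrying the same pattern
print(correct_fpn(noisy, gain))  # -> approximately [50.0, 50.0, 50.0]
```

The wavelet approach in the paper refines this idea for the *residual* pattern left after calibration, where the gain estimate itself is imperfect.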

  4. NSTX-U Digital Coil Protection System Software Detailed Design

    SciTech Connect

    2014-06-01

    The National Spherical Torus Experiment (NSTX) currently uses a collection of analog signal processing solutions for coil protection. Part of the NSTX Upgrade (NSTX-U) entails replacing these analog systems with a software solution running on a conventional computing platform. The new Digital Coil Protection System (DCPS) will replace the old systems entirely, while also providing an extensible framework that allows adding new functionality as desired.

  5. As-built design specification for proportion estimate software subsystem

    NASA Technical Reports Server (NTRS)

    Obrien, S. (Principal Investigator)

    1980-01-01

    The Proportion Estimate Processor evaluates four estimation techniques in order to get an improved estimate of the proportion of a scene that is planted in a selected crop. The four techniques to be evaluated were provided by the techniques development section and are: (1) random sampling; (2) proportional allocation, relative count estimate; (3) proportional allocation, Bayesian estimate; and (4) sequential Bayesian allocation. The user is given two options for computation of the estimated mean square error. These are referred to as the cluster calculation option and the segment calculation option. The software for the Proportion Estimate Processor is operational on the IBM 3031 computer.
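    Two of the listed techniques can be compared on an invented scene; the segment sizes and within-segment crop proportions below are hypothetical, and this is not the processor's actual implementation, only a sketch of why stratified (proportional) allocation tends to beat simple random sampling in mean square error.

```python
import numpy as np

rng = np.random.default_rng(42)

# hypothetical scene: four segments with sizes and true within-segment
# crop proportions (numbers invented for illustration)
sizes = np.array([400, 300, 200, 100])
p_seg = np.array([0.8, 0.5, 0.3, 0.1])
true_p = np.average(p_seg, weights=sizes)   # scene-level crop proportion

def est_random(n):
    # (1) simple random sampling of n pixels from the whole scene
    seg = rng.choice(len(sizes), size=n, p=sizes / sizes.sum())
    labels = rng.random(n) < p_seg[seg]     # True if the pixel is the crop
    return labels.mean()

def est_proportional(n):
    # (2) proportional allocation: sample each segment according to its size
    alloc = np.maximum(1, (n * sizes / sizes.sum()).astype(int))
    means = [(rng.random(k) < p).mean() for k, p in zip(alloc, p_seg)]
    return np.average(means, weights=sizes)

def empirical_mse(estimator, n, reps=2000):
    # Monte Carlo estimate of the mean square error
    errs = np.array([estimator(n) - true_p for _ in range(reps)])
    return np.mean(errs ** 2)
```

Because the segments differ strongly in crop proportion, the between-segment variance is removed by stratification, so `empirical_mse(est_proportional, 100)` comes out below `empirical_mse(est_random, 100)`.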

  6. A Buyer Behaviour Framework for the Development and Design of Software Agents in E-Commerce.

    ERIC Educational Resources Information Center

    Sproule, Susan; Archer, Norm

    2000-01-01

    Software agents are computer programs that run in the background and perform tasks autonomously as delegated by the user. This paper blends models from marketing research and findings from the field of decision support systems to build a framework for the design of software agents that support e-commerce buying applications. (Contains 35…

  7. An Overview of U.S. Trends in Educational Software Design.

    ERIC Educational Resources Information Center

    Colvin, Linda B.

    1989-01-01

    Describes trends in educational software design in the United States for elementary and secondary education. Highlights include user-friendly software; learner control; interfacing the computer with other media, including television, telecommunications networks, and optical disk technology; microworlds; graphics; word processing; database…

  8. Improving the quality of numerical software through user-centered design

    SciTech Connect

    Pancake, C. M., Oregon State University

    1998-06-01

    The software interface - whether graphical, command-oriented, menu-driven, or in the form of subroutine calls - shapes the user's perception of what software can do. It also establishes upper bounds on software usability. Numerical software interfaces typically are based on the designer's understanding of how the software should be used. That is a poor foundation for usability, since the features that are "instinctively right" from the developer's perspective are often the very ones that technical programmers find most objectionable or most difficult to learn. This paper discusses how numerical software interfaces can be improved by involving users more actively in design, a process known as user-centered design (UCD). While UCD requires extra organization and effort, it results in much higher levels of usability and can actually reduce software costs. This is true not just for graphical user interfaces, but for all software interfaces. Examples show how UCD improved the usability of a subroutine library, a command language, and an invocation interface.

  9. Framework Programmable Platform for the advanced software development workstation: Framework processor design document

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Ackley, Keith A.; Crump, Wes; Sanders, Les

    1991-01-01

    The design of the Framework Processor (FP) component of the Framework Programmable Software Development Platform (FPP) is described. The FPP is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software development environment. Guided by the model, this Framework Processor will take advantage of an integrated operating environment to provide automated support for the management and control of the software development process so that costly mistakes during the development phase can be eliminated.

  10. Learner centered software design to empower physiology education.

    PubMed

    Michea, Yanko; Phelps, Cynthia; Johnson, Craig

    2003-01-01

    Misconceptions in physiology undermine students' knowledge. New uses of technology in education offer interesting alternatives to correct these problems. This poster presents a design strategy based on user-centered design and the result of that process: an interactive program to support learning of respiratory physiology. This is an ongoing project, and future efforts will measure the effectiveness of this design tool in medical education.

  11. An application of the IMC software to controller design for the JPL LSCL Experiment Facility

    NASA Technical Reports Server (NTRS)

    Zhu, Guoming; Skelton, Robert E.

    1993-01-01

    A software package which Integrates Model reduction and Controller design (the IMC software) is applied to design controllers for the JPL Large Spacecraft Control Laboratory Experiment Facility. Modal Cost Analysis is used for the model reduction, and various Output Covariance Constraints are guaranteed by the controller design. The main motivation is to find the controller with the 'best' performance with respect to output variances. Indeed, it is shown that by iterating on the reduced-order design model, the resulting controller has better performance than that obtained with the first model reduction.
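    For intuition, Modal Cost Analysis can be sketched for the special case of decoupled, stable first-order modes driven by unit-intensity white noise, where each mode's steady-state variance P follows from the scalar Lyapunov equation 2aP + b^2 = 0 and its output-variance contribution is c^2 P. The IMC software itself handles general state-space models; this toy version only shows the ranking-and-truncation step.

```python
def modal_costs(modes):
    # modes: list of (a, b, c) scalars for x' = a*x + b*w, y = c*x, with a < 0.
    # Steady-state variance under unit white noise: P = b^2 / (2|a|),
    # so the mode's contribution to the output variance is c^2 * P.
    return [c ** 2 * b ** 2 / (2 * abs(a)) for a, b, c in modes]

def reduce_model(modes, keep):
    # keep the modes that contribute most to the output variance
    costs = modal_costs(modes)
    order = sorted(range(len(modes)), key=lambda i: -costs[i])
    return [modes[i] for i in order[:keep]]
```

Iterating, as the abstract describes, would mean redesigning the controller on the reduced model and re-evaluating the covariance constraints.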

  12. 'Ten Golden Rules' for Designing Software in Medical Education: Results from a Formative Evaluation of DIALOG.

    ERIC Educational Resources Information Center

    Jha, Vikram; Duffy, Sean

    2002-01-01

    Reports the results of an evaluation of Distance Interactive Learning in Obstetrics and Gynecology (DIALOG) which is an electronic program for continuing education. Presents 10 golden rules for designing software for medical practitioners. (Contains 26 references.) (Author/YDS)

  13. CMS Simulation Software

    NASA Astrophysics Data System (ADS)

    Banerjee, S.

    2012-12-01

    The CMS simulation, based on the Geant4 toolkit, has been operational within the new CMS software framework for more than four years. The description of the detector, including the forward regions, has been completed, and detailed investigation of detector positioning and material budget has been carried out using collision data. Detailed modeling of detector noise has been performed and validated with the collision data. In view of the high luminosity runs of the Large Hadron Collider, simulation of pile-up events has become a key issue. Challenges have arisen in providing a realistic luminosity profile and modeling out-of-time pileup events, as well as in computing issues regarding memory footprint and I/O access. These will be especially severe in the simulation of collision events for the LHC upgrades; a new pileup simulation architecture has been introduced to cope with these issues. The CMS detector has observed anomalous energy deposits in the calorimeters, and there has been a substantial effort to understand these anomalous signal events present in the collision data. Emphasis has also been given to validation of the simulation code, including the physics of the underlying models of Geant4. Test beam as well as collision data are used for this purpose. Measurements of mean response, resolution, energy sharing between the electromagnetic and hadron calorimeters, and shower shapes for single hadrons are directly compared with predictions from Monte Carlo. A suite of performance analysis tools has been put in place and has been used to drive several optimizations to allow the code to fit the constraints posed by the CMS computing model.

  14. User-Centered Design of Health Care Software Development: Towards a Cultural Change.

    PubMed

    Stanziola, Enrique; Uznayo, María Quispe; Ortiz, Juan Marcos; Simón, Mariana; Otero, Carlos; Campos, Fernando; Luna, Daniel

    2015-01-01

    Health care software achieves better user efficiency, efficacy, and satisfaction when it is designed with its users' needs taken into account. However, it is not trivial to change the practice of software development to adopt user-centered design. In order to produce this change in the Health Informatics Department of the Hospital Italiano de Buenos Aires, a plan was devised and implemented. The article presents the steps of the plan, shows how the steps were carried out, and reflects on the lessons learned through the process. PMID:26262073

  15. Drug Guru: a computer software program for drug design using medicinal chemistry rules.

    PubMed

    Stewart, Kent D; Shiroda, Melisa; James, Craig A

    2006-10-15

    Drug Guru (drug generation using rules) is a new web-based computer software program for medicinal chemists that applies a set of transformations, that is, rules, to an input structure. The transformations correspond to medicinal chemistry design rules-of-thumb taken from the historical lore of drug discovery programs. The output of the program is a list of target analogs that can be evaluated for possible future synthesis. A discussion of the features of the program is followed by an example of the software applied to sildenafil (Viagra) in generating ideas for target analogs for phosphodiesterase inhibition. Comparison with other computer-assisted drug design software is given.
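    The actual Drug Guru rule set encodes proprietary medicinal-chemistry lore; the toy engine below only illustrates the pattern-replacement idea on a SMILES string. The rules are invented and chemically naive, and the string matching stands in for the real program's structure-aware transformations.

```python
# toy rule engine: each rule is a (pattern, replacement) pair applied
# everywhere it matches in an input SMILES string (illustrative rules only)
RULES = [
    ("C(=O)OH", "C(=O)NH2"),   # acid -> amide (illustrative)
    ("OH", "F"),               # hydroxyl -> fluoro (illustrative)
    ("NH2", "N(C)C"),          # primary amine -> dimethylamine (illustrative)
]

def apply_rules(smiles, rules=RULES):
    # return one analog per rule match, like the program's list of
    # target analogs for later evaluation
    analogs = []
    for pat, rep in rules:
        idx = smiles.find(pat)
        while idx != -1:
            analogs.append(smiles[:idx] + rep + smiles[idx + len(pat):])
            idx = smiles.find(pat, idx + 1)
    return analogs
```

A real implementation would match substructures on the molecular graph (e.g. via SMARTS) rather than on raw text.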

  16. Optical System Critical Design Review (CDR) Flight Software Summary

    NASA Technical Reports Server (NTRS)

    Khorrami, Mori

    2006-01-01

    The Mid Infrared Instrument (MIRI) FSW presentation covers: (1) Optical System FSW only; the Cooling System FSW is covered at its own CDR; (2) Requirements & Interfaces; (3) Relationship with the ISIM FSW; (4) FSW Design Drivers & Solutions.

  17. Design-to-fabricate: maker hardware requires maker software.

    PubMed

    Schmidt, Ryan; Ratto, Matt

    2013-01-01

    As a result of consumer-level 3D printers' increasing availability and affordability, the audience for 3D-design tools has grown considerably. However, current tools are ill-suited for these users. They have steep learning curves and don't take into account that the end goal is a physical object, not a digital model. A new class of "maker"-level design tools is needed to accompany this new commodity hardware. However, recent examples of such tools achieve accessibility primarily by constraining functionality. In contrast, the meshmixer project is building tools that provide accessibility and expressive power by leveraging recent computer graphics research in geometry processing. The project members have had positive experiences with several 3D-design-to-print workshops and are exploring several design-to-fabricate problems. This article is part of a special issue on 3D printing.

  18. Psychosocial Risks Generated By Assets Specific Design Software

    NASA Astrophysics Data System (ADS)

    Remus, Furtună; Angela, Domnariu; Petru, Lazăr

    2015-07-01

    The human activity concerning an occupation results from the interaction between psycho-biological, socio-cultural and organizational-occupational factors. Technological development, automation and computerization, which are to be found in all branches of activity, the speed at which things develop, as well as their complexity, require fewer and fewer physical aptitudes and more cognitive qualifications. The person included in the work process is bound in most cases to come in line with the organizational-occupational situations that are specific to the demands of the job. The role of the programmer is essential in the process of producing commissioned software, and truly brilliant ideas can only come from well-rested minds concentrated on their tasks. The actual requirements of these jobs, besides a high number of benefits and opportunities, also create a series of psycho-social risks, which can increase the level of stress during work activity, especially for those who work under pressure.

  19. Integrated testing and verification system for research flight software design document

    NASA Technical Reports Server (NTRS)

    Taylor, R. N.; Merilatt, R. L.; Osterweil, L. J.

    1979-01-01

    The NASA Langley Research Center is developing the MUST (Multipurpose User-oriented Software Technology) program to cut the cost of producing research flight software through a system of software support tools. The HAL/S language is the primary subject of the design. Boeing Computer Services Company (BCS) has designed an integrated verification and testing capability as part of MUST. Documentation, verification and test options are provided, with special attention to real-time, multiprocessing issues. The needs of the entire software production cycle have been considered, with effective management and reduced life-cycle costs as foremost goals. Capabilities have been included in the design for static detection of data-flow anomalies involving communicating concurrent processes. Some types of ill-formed process synchronization and deadlock are also detected statically.
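    Static deadlock detection of the kind mentioned can be illustrated with the classic wait-for-graph cycle check; this is a simplified stand-in for the MUST design, not its actual algorithm, and the graph representation is invented for the example.

```python
def has_deadlock(wait_for):
    # wait_for maps each process to the processes it waits on;
    # a cycle in this graph indicates a potential deadlock
    nodes = set(wait_for)
    for deps in wait_for.values():
        nodes.update(deps)
    WHITE, GREY, BLACK = 0, 1, 2
    color = {p: WHITE for p in nodes}

    def dfs(p):
        color[p] = GREY
        for q in wait_for.get(p, ()):
            # a GREY neighbour is an ancestor on the DFS stack: a back edge
            if color[q] == GREY or (color[q] == WHITE and dfs(q)):
                return True
        color[p] = BLACK
        return False

    return any(color[p] == WHITE and dfs(p) for p in nodes)
```

A real static analyzer builds this graph conservatively from the program text, so a reported cycle is a *possible* deadlock that the developer must then inspect.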

  20. Scaffolding Design Guidelines for Learner-Centered Software Environments.

    ERIC Educational Resources Information Center

    Quintana, Chris; Krajcik, Joseph; Soloway, Elliot

    If learners are to engage in science inquiry, they need significant support, or scaffolding, to help them mindfully do the cognitive science tasks that are just out of their reach. One approach for supporting learners is to design computational tools that incorporate scaffolding features to make new practices accessible and visible so learners can…

  1. A Dialogue and Social Software Perspective on Deep Learning Design

    ERIC Educational Resources Information Center

    Ravenscroft, Andrew; Boyle, Tom

    2010-01-01

    This article considers projects in Technology Enhanced Learning (TEL) that have focussed on designing digital tools that stimulate and support dialogue rich learning. These have emphasised collaborative thinking and meaning making in a rich and varied range of educational contexts. Technically, they have exploited AI, CSCL and HCI techniques, and…

  2. Software Manuals: Where Instructional Design and Technical Writing Join Forces.

    ERIC Educational Resources Information Center

    Thurston, Walter, Ed.

    1986-01-01

    Presents highlights from a panel discussion by well known San Francisco Bay area documentation writers, instructional designers, and human performance technologists. Three issues on user performance and documentation are addressed: whether people avoid reading user manuals and why; major human factors influencing documentation use; and…

  3. Methods and software tools for design evaluation in population pharmacokinetics-pharmacodynamics studies.

    PubMed

    Nyberg, Joakim; Bazzoli, Caroline; Ogungbenro, Kay; Aliev, Alexander; Leonov, Sergei; Duffull, Stephen; Hooker, Andrew C; Mentré, France

    2015-01-01

    Population pharmacokinetic (PK)-pharmacodynamic (PKPD) models are increasingly used in drug development and in academic research; hence, designing efficient studies is an important task. Following the first theoretical work on optimal design for nonlinear mixed-effects models, this research theme has grown rapidly. There are now several different software tools that implement an evaluation of the Fisher information matrix for population PKPD. We compared and evaluated the following five software tools: PFIM, PkStaMp, PopDes, PopED and POPT. The comparisons were performed using two models, a simple one-compartment warfarin PK model and a more complex PKPD model for pegylated interferon, with data on both concentration and response of viral load of hepatitis C virus. The results of the software were compared in terms of the standard error (SE) values of the parameters predicted from the software and the empirical SE values obtained via replicated clinical trial simulation and estimation. For the warfarin PK model and the pegylated interferon PKPD model, all software gave similar results. Interestingly, it was seen, for all software, that the simpler approximation to the Fisher information matrix, using the block diagonal matrix, provided predicted SE values that were closer to the empirical SE values than when the more complicated approximation was used (the full matrix). For most PKPD models, using any of the available software tools will provide meaningful results, avoiding cumbersome simulation and allowing design optimization.
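    The evaluation idea can be sketched for the simplest case: a fixed-effects model with additive Gaussian error, where the expected Fisher information matrix is J^T J / sigma^2 (J the Jacobian of the model in the parameters) and the predicted SEs are the square roots of the diagonal of its inverse. The mono-exponential model and all numbers below are illustrative; they are not the warfarin or interferon models used in the paper, and real population tools also handle random effects.

```python
import numpy as np

def model(t, theta):
    # illustrative mono-exponential "PK" model: y = A * exp(-k * t)
    A, k = theta
    return A * np.exp(-k * np.asarray(t, float))

def predicted_se(times, theta, sigma, f=model, eps=1e-6):
    # expected Fisher information for additive Gaussian error:
    # FIM = J^T J / sigma^2, with J the (central-difference) Jacobian
    theta = np.asarray(theta, float)
    J = np.empty((len(times), len(theta)))
    for j in range(len(theta)):
        step = np.zeros_like(theta)
        step[j] = eps
        J[:, j] = (f(times, theta + step) - f(times, theta - step)) / (2 * eps)
    fim = J.T @ J / sigma ** 2
    return np.sqrt(np.diag(np.linalg.inv(fim)))   # predicted parameter SEs
```

Adding sampling times can only add information, so a design containing another design's time points predicts smaller SEs for every parameter.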

  4. Investigation into the development of computer aided design software for space based sensors

    NASA Technical Reports Server (NTRS)

    Pender, C. W.; Clark, W. L.

    1987-01-01

    The described effort is phase one of the development of Computer Aided Design (CAD) software to be used to perform radiometric sensor design. The software package will be referred to as SCAD and is directed toward the preliminary phase of the design of space-based sensor systems. The approach being followed is to develop a modern, graphics-intensive, user-friendly software package using existing software as building blocks. The emphasis will be directed toward the development of a shell containing menus, smart defaults, and interfaces, which can accommodate a wide variety of existing application software packages. The shell will offer expected utilities such as graphics, tailored menus, and a variety of drivers for I/O devices. Following the development of the shell, the development of SCAD is planned as chiefly the selection and integration of appropriate building blocks. The phase one development activities have included: the selection of hardware which will be used with SCAD; the determination of the scope of SCAD; the preliminary evaluation of a number of software packages for applicability to SCAD; determination of a method for achieving required capabilities where voids exist; and the establishment of a strategy for binding the software modules into an easy-to-use tool kit.

  5. The Implementation of Satellite Attitude Control System Software Using Object Oriented Design

    NASA Technical Reports Server (NTRS)

    Reid, W. Mark; Hansell, William; Phillips, Tom; Anderson, Mark O.; Drury, Derek

    1998-01-01

    NASA established the Small Explorer (SMEX) program in 1988 to provide frequent opportunities for highly focused and relatively inexpensive space science missions. The SMEX program has produced five satellites, three of which have been successfully launched. The remaining two spacecraft are scheduled for launch within the coming year. NASA has recently developed a prototype for the next generation Small Explorer spacecraft (SMEX-Lite). This paper describes the object-oriented design (OOD) of the SMEX-Lite Attitude Control System (ACS) software. The SMEX-Lite ACS is three-axis controlled and is capable of performing sub-arc-minute pointing. This paper first describes high level requirements governing the SMEX-Lite ACS software architecture. Next, the context in which the software resides is explained. The paper describes the principles of encapsulation, inheritance, and polymorphism with respect to the implementation of an ACS software system. This paper will also discuss the design of several ACS software components. Specifically, object-oriented designs are presented for sensor data processing, attitude determination, attitude control, and failure detection. Finally, this paper will address the establishment of the ACS Foundation Class (AFC) Library. The AFC is a large software repository, requiring a minimal amount of code modifications to produce ACS software for future projects.

  6. An automated methodology development. [software design for combat simulation]

    NASA Technical Reports Server (NTRS)

    Hawley, L. R.

    1985-01-01

    The design methodology employed in testing the applicability of Ada in large-scale combat simulations is described. Ada was considered as a substitute for FORTRAN to lower life cycle costs and ease the program development efforts. An object-oriented approach was taken, which featured definitions of military targets, the capability of manipulating their condition in real-time, and one-to-one correlation between the object states and real world states. The simulation design process was automated by the problem statement language (PSL)/problem statement analyzer (PSA). The PSL/PSA system accessed the problem data base directly to enhance the code efficiency by, e.g., eliminating non-used subroutines, and provided for automated report generation, besides allowing for functional and interface descriptions. The ways in which the methodology satisfied the responsiveness, reliability, transportability, modifiability, timeliness and efficiency goals are discussed.

  7. The ATLAS integrated structural analysis and design software system

    NASA Technical Reports Server (NTRS)

    Dreisbach, R. L.; Giles, G. L.

    1978-01-01

    The ATLAS system provides an extensive set of integrated technical computer-program modules for the analysis and design of general structural configurations, as well as capabilities that are particularly suited for the aeroelastic design of flight vehicles. The system is based on the stiffness formulation of the finite element structural analysis method and can be executed in batch and interactive computing environments on CDC 6600/CYBER computers. Problem-definition input data are written in an engineering-oriented language using a free field format. Input-data default values, generation options, and data quality checks provided by the preprocessors minimize the amount of data and flowtime for problem definition/verification. Postprocessors allow selected input and calculated data to be extracted, manipulated, and displayed via on-line and off-line prints or plots for monitoring and verifying problem solutions. The sequence and mode of execution of selected program modules are controlled by a common user-oriented language.

  8. Framework Programmable Platform for the Advanced Software Development Workstation: Preliminary system design document

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Ackley, Keith A.; Crump, John W., IV; Henderson, Richard; Futrell, Michael T.

    1991-01-01

    The Framework Programmable Software Development Platform (FPP) is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software environment. Guided by the model, this system development framework will take advantage of an integrated operating environment to automate effectively the management of the software development process so that costly mistakes during the development phase can be eliminated. The focus here is on the design of components that make up the FPP. These components serve as supporting systems for the Integration Mechanism and the Framework Processor and provide the 'glue' that ties the FPP together. Also discussed are the components that allow the platform to operate in a distributed, heterogeneous environment and to manage the development and evolution of software system artifacts.

  9. Independent Verification and Validation Of SAPHIRE 8 Software Design and Interface Design Project Number: N6423 U.S. Nuclear Regulatory Commission

    SciTech Connect

    Kent Norris

    2010-03-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE software design and interface design is to assess the activities that result in the development, documentation, and review of a software design that meets the requirements defined in the software requirements documentation. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE had already been in production. IV&V reviewed the requirements specified in the NRC Form 189s to verify these requirements were included in SAPHIRE’s Software Verification and Validation Plan (SVVP) design specification.

  10. Independent Verification and Validation Of SAPHIRE 8 Software Design and Interface Design Project Number: N6423 U.S. Nuclear Regulatory Commission

    SciTech Connect

    Kent Norris

    2009-10-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE software design and interface design is to assess the activities that result in the development, documentation, and review of a software design that meets the requirements defined in the software requirements documentation. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE had already been in production. IV&V reviewed the requirements specified in the NRC Form 189s to verify these requirements were included in SAPHIRE’s Software Verification and Validation Plan (SVVP) design specification.

  11. Software for the Design of Swimming Pool Dehumidifier Units

    NASA Astrophysics Data System (ADS)

    Rubina, Aleš; Blasinski, Petr; Tesař, Zdeněk

    2013-06-01

    The article deals with the description and solution of the physical phenomena taking place during the evaporation of water. The topicality of the theme is given by the number of indoor swimming pools and wellness centers built at present. In addressing the HVAC systems serving these areas, it is necessary to know the various design parameters of the interior, including the pool water temperature and the air temperature and humidity. A description follows of the calculation module for air handling units, including optimization of the settings of the physical changes of state, in order to ensure the lowest energy consumption for air treatment while maintaining the required internal microclimate parameters.
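    One standard step in such a calculation module is sizing the supply airflow from a simple moisture balance: the dry-air mass flow times the humidity-ratio rise across the space must equal the evaporation load. The helper below sketches only that step (the evaporation model itself, which the article focuses on, is not reproduced here), and the sample numbers are illustrative.

```python
def required_airflow(moisture_load, x_room, x_supply, rho_air=1.2):
    # moisture_load: evaporation rate from the pool surface, kg/h
    # x_room, x_supply: humidity ratios (kg water / kg dry air)
    # moisture balance: m_air * (x_room - x_supply) = moisture_load
    m_air = moisture_load / (x_room - x_supply)   # kg dry air per hour
    return m_air / rho_air                        # approx. volume flow, m^3/h
```

For example, a 10 kg/h evaporation load with room air at x = 0.014 and supply air dried to x = 0.009 needs roughly 1670 m^3/h of supply air.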

  12. Designing of a Computer Software for Detection of Approximal Caries in Posterior Teeth

    PubMed Central

    Valizadeh, Solmaz; Goodini, Mostafa; Ehsani, Sara; Mohseni, Hadis; Azimi, Fateme; Bakhshandeh, Hooman

    2015-01-01

    Background: Radiographs, as an adjunct to clinical examination, are always valuable complementary methods for dental caries detection. Recently, progress in digital imaging systems has made it possible to design software for automatic dental caries detection. Objectives: The aim of this study was to develop and assess the function of diagnostic computer software designed for the evaluation of approximal caries in posterior teeth. This software should be able to indicate the depth and location of caries on digital radiographic images. Materials and Methods: Digital radiographs were obtained of 93 teeth including 183 proximal surfaces. These images were used as a database for designing the software and training the software designer. In the design phase, considering the summed density of pixels in rows and columns of the images, the teeth were separated from each other and the unnecessary regions (for example, the root area in the alveolar bone) were eliminated. Therefore, based on summed intensities, each image was segmented such that each segment contained only one tooth. Subsequently, based on fuzzy logic, a well-known data-clustering algorithm named fuzzy c-means (FCM) was applied to the images to cluster or segment each tooth. This algorithm is referred to as a soft clustering method, which assigns data elements to one or more clusters with a specific membership function. Using the extracted clusters, the tooth border was determined and assessed for cavities. The results of histological analysis were used as the gold standard for comparison with the results obtained from the software. The depth of caries was measured, and finally the Intraclass Correlation Coefficient (ICC) and Bland-Altman plot were used to show the agreement between the methods. Results: The software diagnosed 60% of enamel caries. The ICC (for detection of enamel caries) between the computer software and histological analysis results was determined as 0.609 (95% confidence interval [CI] = 0
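    Fuzzy c-means as described can be sketched in a few lines. This is a generic textbook implementation, alternating the center and membership updates, with an illustrative fuzzifier m = 2 and a small epsilon guard against zero distances; it is not the study's actual code.

```python
import numpy as np

def fcm(X, c, m=2.0, iters=100, seed=0):
    # fuzzy c-means on X (n samples x features): returns soft memberships
    # U (n x c, rows sum to 1) and the c cluster centers
    rng = np.random.default_rng(seed)
    n = len(X)
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(iters):
        W = U ** m                                      # fuzzified weights
        centers = (W.T @ X) / W.sum(axis=0)[:, None]    # weighted means
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-9
        U = 1.0 / d ** (2.0 / (m - 1.0))                # standard FCM update
        U /= U.sum(axis=1, keepdims=True)
    return U, centers
```

In the study's pipeline, X would be pixel features from one tooth segment, and the memberships would delineate structures such as enamel, dentin, and lesion candidates.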

  13. Verification and translation of distributed computing system software design

    SciTech Connect

    Chen, J.N.

    1987-01-01

    A methodology is presented for generating a distributed computing system application program from a design specification based on modified Petri nets. There are four major stages in this methodology: (1) to build a structured graphics specification model, (2) to verify abstract data types and detect deadlock in the model, (3) to define communication among individual processes within the model, and (4) to translate the symbolic representation into a program in a specified high-level target language. In this dissertation, Ada is used as the specified high-level target language. The structured graphics promote intelligibility because hierarchical decomposition of functional modules is encouraged and the behavior of each process can be easily extracted from the net as a separate view of the system. The formal method described in this dissertation uses a symbolic representation of the design specification of distributed computing systems. This symbolic representation is then translated into an equivalent Ada program structure, with particular attention to the features of concurrency and synchronization. Artificial intelligence techniques are employed to verify properties and to detect deadlock in a distributed computing system environment. For verification, the axioms of abstract data types are translated into PROLOG clauses and some inquiries are tested to prove the correctness of the abstract data types.
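    Deadlock detection on a small, bounded Petri net can be illustrated by enumerating the reachability graph and flagging markings in which no transition is enabled. This is a simplified stand-in for the dissertation's method (the data layout is invented for the example, and real nets need bounds or reductions to keep the search finite).

```python
def fire(marking, pre, post, t):
    # fire transition t if every input place holds enough tokens;
    # return the successor marking, or None if t is not enabled
    if all(marking.get(p, 0) >= n for p, n in pre[t].items()):
        m = dict(marking)
        for p, n in pre[t].items():
            m[p] = m.get(p, 0) - n
        for p, n in post[t].items():
            m[p] = m.get(p, 0) + n
        return m
    return None

def find_deadlocks(m0, pre, post, limit=10000):
    # explore the reachability graph from marking m0; a deadlock is a
    # reachable marking in which no transition is enabled
    key = lambda m: tuple(sorted(m.items()))
    seen, stack, deadlocks = {key(m0)}, [m0], []
    while stack and len(seen) < limit:
        m = stack.pop()
        succ = [s for s in (fire(m, pre, post, t) for t in pre) if s is not None]
        if not succ:
            deadlocks.append(m)
        for s in succ:
            if key(s) not in seen:
                seen.add(key(s))
                stack.append(s)
    return deadlocks
```

A net whose only transition consumes the lone token deadlocks immediately after firing, while a two-transition cycle that passes the token back and forth never does.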

  14. Software Design Document for the AMP Nuclear Fuel Performance Code

    SciTech Connect

    Philip, Bobby; Clarno, Kevin T; Cochran, Bill

    2010-03-01

    The purpose of this document is to describe the design of the AMP nuclear fuel performance code. It provides an overview of the decomposition into separable components, an overview of what those components will do, and the strategic basis for the design. The primary components of a computational physics code include a user interface, physics packages, material properties, mathematics solvers, and computational infrastructure. Some capability from established off-the-shelf (OTS) packages will be leveraged in the development of AMP, but the primary physics components will be entirely new. The material properties required by these physics operators include many highly non-linear properties, which will be replicated from FRAPCON and LIFE where applicable, as well as some computationally-intensive operations, such as gap conductance, which depends upon the plenum pressure. Because there is extensive capability in off-the-shelf leadership class computational solvers, AMP will leverage the Trilinos, PETSc, and SUNDIALS packages. The computational infrastructure includes a build system, mesh database, and other building blocks of a computational physics package. The user interface will be developed through a collaborative effort with the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Capability Transfer program element as much as possible and will be discussed in detail in a future document.

  15. Conceptual design of the control software for the European Solar Telescope

    NASA Astrophysics Data System (ADS)

    Di Marcantonio, P.; Cirami, R.; Romano, P.; Cosentino, R.; Ermolli, I.; Giorgi, F.

    2012-09-01

    The aim of this paper is to present an overview of the conceptual design of the Control Software for the European Solar Telescope (EST), as it emerged after the successful Conceptual Design Review held in June 2011, which formally concluded the EST Preliminary Design Study. After a general end-to-end description of the ECS (EST Control Software) architecture, from operation concepts and observation preparation to the control of the planned focal-plane instruments, the paper focuses on the arrangement of ECS devised to date to cope with the foreseen scientific requirements. The major EST subsystems, together with the functions to be controlled, are finally detailed and discussed.

  16. Tank Monitoring and Document control System (TMACS) As Built Software Design Document

    SciTech Connect

    GLASSCOCK, J.A.

    2000-01-27

    This document describes the software design for the Tank Monitor and Control System (TMACS). This document captures the existing as-built design of TMACS as of November 1999. It will be used as a reference document to the system maintainers who will be maintaining and modifying the TMACS functions as necessary. The heart of the TMACS system is the ''point-processing'' functionality where a sample value is received from the field sensors and the value is analyzed, logged, or alarmed as required. This Software Design Document focuses on the point-processing functions.
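
    The point-processing loop described above (receive a sample, analyze it, log or alarm as required) can be sketched as follows. This is an illustrative sketch only; the point names, limits, and alarm scheme are hypothetical, not taken from the TMACS design.

```python
# Minimal sketch of TMACS-style "point processing": each incoming field
# sample is range-checked against configured limits, logged, and alarmed
# as required. All point IDs and limits below are invented.
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("tmacs")

POINTS = {
    # point_id: (low_alarm_limit, high_alarm_limit)
    "TANK-101/TEMP": (10.0, 80.0),
    "TANK-101/LEVEL": (0.5, 9.5),
}

def process_point(point_id, value, alarms):
    """Analyze one sample value: log it, then alarm if out of limits."""
    low, high = POINTS[point_id]
    log.info("sample %s = %.2f", point_id, value)
    if value < low:
        alarms.append((point_id, "LOW", value))
    elif value > high:
        alarms.append((point_id, "HIGH", value))

alarms = []
for pid, val in [("TANK-101/TEMP", 25.0), ("TANK-101/TEMP", 95.2),
                 ("TANK-101/LEVEL", 0.2)]:
    process_point(pid, val, alarms)
```

    Here the second temperature sample trips a HIGH alarm and the level sample trips a LOW alarm; in-range samples are only logged.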

  17. Model-It: A Case Study of Learner-Centered Software Design for Supporting Model Building.

    ERIC Educational Resources Information Center

    Jackson, Shari L.; Stratford, Steven J.; Krajcik, Joseph S.; Soloway, Elliot

    Learner-centered software design (LCSD) guides the design of tasks, tools, and interfaces in order to support the unique needs of learners: growth, diversity and motivation. This paper presents a framework for LCSD and describes a case study of its application to the ScienceWare Model-It, a learner-centered tool to support scientific modeling and…

  18. Spaces for Change: Gender and Technology Access in Collaborative Software Design.

    ERIC Educational Resources Information Center

    Ching, Cynthia Carter; Kafai, Yasmin B.; Marshall, Sue K.

    2000-01-01

    Examines a three-month software design activity in which mixed teams of girls and boys designed and implemented multimedia astronomy resources for younger students. Finds that the configuration of social, physical, and cognitive spaces in the project environment contributed to a positive change in girls' level of access. Discusses implications for…

  19. Toward Understanding the Cognitive Processes of Software Design in Novice Programmers

    ERIC Educational Resources Information Center

    Yeh, Kuo-Chuan

    2009-01-01

    This study provides insights with regard to the types of cognitive processes that are involved in the formation of mental models and the way those models change over the course of a semester in novice programmers doing a design task. Eight novice programmers participated in this study for three distinct software design sessions, using the same…

  20. A software radio approach to global navigation satellite system receiver design

    NASA Astrophysics Data System (ADS)

    Akos, Dennis Matthew

    1997-12-01

    The software radio has been described as the most significant evolution in receiver design since the development of the superheterodyne concept in 1918. The software radio design philosophy is to position an analog-to-digital converter (ADC) as close to the antenna as possible and then process the samples using a combination of software and a programmable microprocessor. There are a number of important advantages to be gained through full exploitation of the software radio concept. The most notable include: (1) The removal of analog signal processing components and their associated nonlinear, temperature-based, and age-based performance characteristics. (2) A single antenna/front-end configuration can be used to receive and demodulate a variety of radio frequency (RF) transmissions. (3) The software radio provides the ultimate simulation/testing environment. Global Navigation Satellite Systems (GNSSs) are the latest and most complex radionavigation systems in widespread use. The United States' Global Positioning System (GPS) and, to a lesser extent, the Russian Global Orbiting Navigation Satellite System (GLONASS) are being targeted for use as next generation aviation navigation systems. As a result, it is critical that a GNSS achieve the reliability and integrity necessary for use within the aerospace system. The receiver design is a key element in achieving the high standards required. This work presents the complete development of a GNSS software radio. A GNSS receiver front end has been constructed, based on the software radio design goals, and has been evaluated against the traditional design. Trade-offs associated with each implementation are presented along with experimental results. Novel bandpass sampling front end designs have been proposed, implemented and tested for the processing of multiple GNSS transmissions. 
Finally, every aspect of GNSS signal processing has been implemented in software from the necessary spread spectrum acquisition algorithms to
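
    The bandpass-sampling front ends mentioned above rely on choosing a sample rate that aliases the RF band to baseband without spectral overlap. A hedged sketch of that rate selection follows; the standard constraint for a band [f_lo, f_hi] is 2*f_hi/n <= fs <= 2*f_lo/(n-1) for integer n, and the GPS L1 band figures used here are assumptions for illustration, not the receiver's actual design values.

```python
# Sketch: enumerate the valid uniform bandpass-sampling rate windows for
# a band [f_lo, f_hi]. For each integer n up to f_hi/bandwidth, rates in
# [2*f_hi/n, 2*f_lo/(n-1)] alias the band to baseband without overlap.
def valid_bandpass_rates(f_lo, f_hi):
    """Return a list of (fs_min, fs_max) windows of acceptable rates."""
    bw = f_hi - f_lo
    windows = []
    n = 1
    while n * bw <= f_hi:          # n can be at most floor(f_hi / bw)
        fs_min = 2.0 * f_hi / n
        fs_max = float("inf") if n == 1 else 2.0 * f_lo / (n - 1)
        windows.append((fs_min, fs_max))
        n += 1
    return windows

# Assumed example: ~2 MHz GPS L1 C/A band centered on 1575.42 MHz.
windows = valid_bandpass_rates(1574.42e6, 1576.42e6)
slowest = windows[-1]   # the last window permits the slowest (cheapest) ADC
```

    The first window (n = 1) is ordinary Nyquist sampling of the carrier; the last window shows why bandpass sampling is attractive: a ~4 MHz ADC suffices for a signal centered near 1.575 GHz.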

  1. Software design implementation document for TRAC-M data structures

    SciTech Connect

    Jolly-Woodruff, S.; Mahaffy, J.; Giguere, P.; Dearing, J.; Boyack, B.

    1997-07-01

    The Transient Reactor Analysis Code (TRAC)-M system-wide and component data structures are to be reimplemented by using the new features of Fortran 90 (F90). There will be no changes to the conceptual design, data flow, or computational flow with respect to the current TRAC-P, except that readability, maintainability, and extensibility will be improved. However, the task described here is a basic step that does not meet all future needs of the code, especially regarding extensibility. TRAC-M will be fully functional and will produce null computational changes with respect to TRAC-P, Version 5.4.25; computational efficiency will not be degraded significantly. The existing component and functional modularity and possibilities for coarse-grained parallelism will be retained.

  2. Internet-based hardware/software co-design framework for embedded 3D graphics applications

    NASA Astrophysics Data System (ADS)

    Yeh, Chi-Tsai; Wang, Chun-Hao; Huang, Ing-Jer; Wong, Weng-Fai

    2011-12-01

    Advances in technology are making it possible to run three-dimensional (3D) graphics applications on embedded and handheld devices. In this article, we propose a hardware/software co-design environment for 3D graphics application development that includes the 3D graphics software, OpenGL ES application programming interface (API), device driver, and 3D graphics hardware simulators. We developed a 3D graphics system-on-a-chip (SoC) accelerator using transaction-level modeling (TLM). This gives software designers early access to the hardware even before it is ready. On the other hand, hardware designers also stand to gain from the more complex test benches made available in the software for verification. A unique aspect of our framework is that it allows hardware and software designers from geographically dispersed areas to cooperate and work on the same framework. Designs can be entered and executed from anywhere in the world without full access to the entire framework, which may include proprietary components. This results in controlled and secure transparency and reproducibility, granting leveled access to users of various roles.

  3. Designing of robotic production lines using CAx software

    NASA Astrophysics Data System (ADS)

    Wróbel, A.; Langer, P.

    2015-11-01

    Present market conditions demand that modern control systems for robotized manufacturing cells be characterized by a much greater degree of flexibility, self-organization and, above all, adaptability to external disturbances. The distribution of information is one of the most important features of modern control systems. The paper presents an approach, based on the application of multi-agent systems, for supporting the operation of robotized manufacturing cells. The aim of this approach is to respond flexibly to external disturbances and to prevent situations that might delay the production process. The paper includes a description of the concept of an information system designed to control the work of production systems, including work cells. Such a system can operate independently if it is equipped with a self-organization mechanism, which the proposed multi-agent system provides. Implementation of the presented concept will follow the analysis described here. The advantage of the proposed concept is its hierarchical structure, which allows the different information tools in use to be integrated into one complex system and the final computer program to be prepared.

  4. Design and validation of Segment - freely available software for cardiovascular image analysis

    PubMed Central

    2010-01-01

    Background Commercially available software for cardiovascular image analysis often has limited functionality and frequently lacks the careful validation that is required for clinical studies. We have already implemented a cardiovascular image analysis software package and released it as freeware for the research community. However, it was distributed as a stand-alone application and other researchers could not extend it by writing their own custom image analysis algorithms. We believe that the work required to make a clinically applicable prototype can be reduced by making the software extensible, so that researchers can develop their own modules or improvements. Such an initiative might then serve as a bridge between image analysis research and cardiovascular research. The aim of this article is therefore to present the design and validation of a cardiovascular image analysis software package (Segment) and to announce its release in a source code format. Results Segment can be used for image analysis in magnetic resonance imaging (MRI), computed tomography (CT), single photon emission computed tomography (SPECT) and positron emission tomography (PET). Some of its main features include loading of DICOM images from all major scanner vendors, simultaneous display of multiple image stacks and plane intersections, automated segmentation of the left ventricle, quantification of MRI flow, tools for manual and general object segmentation, quantitative regional wall motion analysis, myocardial viability analysis and image fusion tools. Here we present an overview of the validation results and validation procedures for the functionality of the software. We describe a technique to ensure continued accuracy and validity of the software by implementing and using a test script that tests the functionality of the software and validates the output. The software has been made freely available for research purposes in a source code format on the project home page http
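
    The continued-validity technique described above (a test script that exercises the software and validates its output against reference values) can be sketched as a small regression harness. The analysis function, reference numbers, and tolerance below are hypothetical, not taken from the Segment source.

```python
# Sketch of a regression-style validation script: run an analysis
# function over stored inputs and compare against reference outputs
# captured from a previously validated run. All numbers are invented.
def ejection_fraction(edv_ml, esv_ml):
    """Left-ventricular ejection fraction (%) from end-diastolic and
    end-systolic volumes in millilitres."""
    return 100.0 * (edv_ml - esv_ml) / edv_ml

# (inputs, expected output) pairs from a validated run (illustrative).
REFERENCE = [((120.0, 50.0), 58.333), ((150.0, 60.0), 60.0)]

def run_regression(tol=1e-2):
    """Return the cases whose current output drifts beyond tolerance."""
    return [(args, want, ejection_fraction(*args))
            for args, want in REFERENCE
            if abs(ejection_fraction(*args) - want) > tol]

failures = run_regression()
```

    Re-running such a script after every change gives a concrete check that accuracy has not regressed, which is the property the authors aim to preserve.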

  5. GridOPTICS(TM): A Design for Plug-and-Play Smart Grid Software Architecture

    SciTech Connect

    Gorton, Ian; Liu, Yan; Yin, Jian

    2012-06-03

    As the smart grid becomes reality, software architectures for integrating legacy systems with new innovative approaches for grid management are needed. These architectures must exhibit flexibility, extensibility, interoperability and scalability. In this position paper, we describe our preliminary work to design such an architecture, known as GridOPTICS, that will enable the deployment and integration of new software tools in smart grid operations. Our preliminary design is based upon use cases from PNNL’s Future Power Grid Initiative, which is developing a collection of advanced software technologies for smart grid management and control. We describe the motivations for GridOPTICS, and the preliminary design that we are currently prototyping for several distinct use cases.

  6. The design of real time infrared image generation software based on Creator and Vega

    NASA Astrophysics Data System (ADS)

    Wang, Rui-feng; Wu, Wei-dong; Huo, Jun-xiu

    2013-09-01

    To meet the requirement for highly realistic, real-time dynamic infrared imagery in infrared image simulation, a method for designing a real-time infrared image simulation application on the VC++ platform is proposed, based on the visual simulation software Creator and Vega. The functions of Creator are briefly introduced, and the main features of the Vega development environment are analyzed. Methods for modeling infrared targets and backgrounds are offered; the flow chart of the development process for the real-time IR image generation software is given; and the functions of the TMM Tool, the MAT Tool and the sensor module are explained. The real-time aspects of the software design are addressed as well.

  7. Modular Software Performance Monitoring

    NASA Astrophysics Data System (ADS)

    Kruse, Daniele Francesco; Kruzelecki, Karol

    2011-12-01

    CPU clock frequencies are not likely to increase significantly in the coming years, and data analysis speed can be improved by using more processors or buying new machines only if one is willing to change the programming paradigm to a parallel one. Therefore, performance monitoring procedures and tools are needed to help programmers optimize existing software running on current and future hardware. Low-level information from hardware performance counters is vital for spotting the specific performance problems slowing program execution. HEP software is often huge and complex, and existing tools are unable to give results with the required granularity. We report on the approach we have chosen to solve this problem, which involves decomposing the application into parts and monitoring each of them separately. Both counting and sampling methods are used to allow an analysis with the required custom granularity: from the global level down to the function level. A set of tools (based on perfmon2, a software interface to hardware counters) for CMSSW, Gaudi and Geant4 has been developed and deployed. We show how this type of analysis has proven useful in spotting specific performance problems and effective in helping with code optimization.
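
    The decomposition idea above — attribute measurements to named parts of the application rather than to the program as a whole — can be illustrated with a toy counting monitor. This is not the perfmon2-based tool itself: it accumulates wall time per part instead of hardware-counter events, purely to show the per-part bookkeeping.

```python
# Illustrative sketch: decompose a program into named parts and
# accumulate a measurement per part, so a report can be produced at
# custom granularity. Wall time stands in for hardware counter values.
import time
from collections import defaultdict
from contextlib import contextmanager

counters = defaultdict(float)

@contextmanager
def monitored(part):
    """Attribute the elapsed time of the enclosed block to `part`."""
    t0 = time.perf_counter()
    try:
        yield
    finally:
        counters[part] += time.perf_counter() - t0

with monitored("simulation"):
    sum(i * i for i in range(200_000))
with monitored("analysis"):
    sorted(range(50_000), reverse=True)

# Hotspot report: parts ordered by accumulated cost.
report = sorted(counters.items(), key=lambda kv: -kv[1])
```

    A real implementation would read hardware counters inside the same enter/exit hooks; the per-part accumulation logic is unchanged.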

  8. Software structure and its performance on FOCAS instrument control, a MOS design, and an analyzing package

    NASA Astrophysics Data System (ADS)

    Yoshida, Michitoshi; Shimizu, Yasuhiro; Sasaki, Toshiyuki; Kosugi, George; Takata, Tadafumi; Sekiguchi, Kazuhiro; Kashikawa, Nobunari; Aoki, Kentaro; Asai, Ryo; Ohyama, Youichi; Kawabata, Koji; Inata, Motoko; Saito, Yoshihiko; Taguchi, Hiroko; Ebizuka, Noboru; Yadoumaru, Yasushi; Ozawa, Tomohiko; Iye, Masanori

    2000-06-01

    Faint Object Camera And Spectrograph (FOCAS) is completed and now awaiting a commissioning run on the Subaru Telescope atop Mauna Kea. We have developed a software system that includes the control of the FOCAS instruments, Multiple Object Slits (MOS) design, and an analyzing package especially for evaluating the performance of FOCAS. The control software consists of several processes: a network interface process, a user interface process, a central control engine process, a command dispatcher process, local control units, and a data acquisition system. These processes coordinate with one another by exchanging command and status messages. The control system is also connected to the Subaru Observation Software System to achieve high efficiency and reliability of observations. We have two off-line systems: a MOS design program, MDP, and an analyzing package. MDP is a utility with a GUI for easily selecting spectroscopy targets in the FOCAS field of view and efficiently designing MOS plates. The designed MOS parameters are sent to a laser cutter to make the desired MOS plate. A special package enables prompt performance checks and evaluation of FOCAS itself during the commissioning period. We describe the overall structure of the FOCAS software with some GUI samples.

  9. The Implementation of Satellite Control System Software Using Object Oriented Design

    NASA Technical Reports Server (NTRS)

    Anderson, Mark O.; Reid, Mark; Drury, Derek; Hansell, William; Phillips, Tom

    1998-01-01

    NASA established the Small Explorer (SMEX) program in 1988 to provide frequent opportunities for highly focused and relatively inexpensive space science missions that can be launched into low earth orbit by small expendable vehicles. The development schedule for each SMEX spacecraft was three years from start to launch. The SMEX program has produced five satellites: Solar Anomalous and Magnetospheric Particle Explorer (SAMPEX), Fast Auroral Snapshot Explorer (FAST), Submillimeter Wave Astronomy Satellite (SWAS), Transition Region and Coronal Explorer (TRACE) and Wide-Field Infrared Explorer (WIRE). SAMPEX and FAST are on-orbit, TRACE is scheduled to be launched in April of 1998, WIRE is scheduled to be launched in September of 1998, and SWAS is scheduled to be launched in January of 1999. In each of these missions, the Attitude Control System (ACS) software was written using a modular procedural design. Current program goals require complete spacecraft development within 18 months. This requirement has increased pressure to write reusable flight software. Object-Oriented Design (OOD) offers the constructs for developing an application that only needs modification for mission-unique requirements. This paper describes the OOD that was used to develop the SMEX-Lite ACS software. The SMEX-Lite ACS is three-axis controlled, momentum stabilized, and is capable of performing sub-arc-minute pointing. The paper first describes the high-level requirements which governed the architecture of the SMEX-Lite ACS software. Next, the context in which the software resides is explained. The paper describes the benefits of encapsulation, inheritance and polymorphism with respect to the implementation of an ACS software system. This paper will discuss the design of several software components that comprise the ACS software. Specifically, Object-Oriented designs are presented for sensor data processing, attitude control, attitude determination and failure detection.
The paper addresses
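
    The encapsulation/inheritance/polymorphism benefits mentioned above can be sketched for the sensor-data-processing component. This is an invented illustration of the design style, not the SMEX-Lite code: the class names, sensor types, and scale factors are all hypothetical.

```python
# Hedged sketch of an OOD sensor-processing hierarchy for an ACS: the
# control loop treats every sensor uniformly through a common interface
# (polymorphism), while each subclass encapsulates its own conversion.
from abc import ABC, abstractmethod

class Sensor(ABC):
    """Base class: every sensor converts raw counts to engineering units."""
    @abstractmethod
    def process(self, raw):
        ...

class SunSensor(Sensor):
    def process(self, raw):
        return {"sun_angle_deg": raw * 0.05}      # hypothetical scale factor

class Magnetometer(Sensor):
    def process(self, raw):
        return {"b_field_nT": raw * 1.25}         # hypothetical scale factor

def attitude_inputs(sensors, raw_frames):
    """Gather processed telemetry from all sensors, type-agnostically."""
    out = {}
    for sensor, raw in zip(sensors, raw_frames):
        out.update(sensor.process(raw))
    return out

inputs = attitude_inputs([SunSensor(), Magnetometer()], [100, 40])
```

    Adding a mission-unique sensor then means adding one subclass; the attitude-determination code that consumes `inputs` is untouched, which is the reuse argument the paper makes.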

  10. NanoDesign: Concepts and Software for a Nanotechnology Based on Functionalized Fullerenes

    NASA Technical Reports Server (NTRS)

    Globus, Al; Jaffe, Richard; Chancellor, Marisa K. (Technical Monitor)

    1996-01-01

    Eric Drexler has proposed a hypothetical nanotechnology based on diamondoid structures and investigated the properties of such molecular systems. While attractive, diamondoid nanotechnology is not physically accessible with straightforward extensions of current laboratory techniques. We propose a nanotechnology based on functionalized fullerenes and investigate carbon-nanotube-based gears with teeth added via a benzyne reaction known to occur with C60. The gears are single-walled carbon nanotubes with appended benzyne groups for teeth. Fullerenes are in widespread laboratory use and can be functionalized in many ways. Companion papers computationally demonstrate the properties of these gears (they appear to work) and the accessibility of the benzyne/nanotube reaction. This paper describes the molecular design techniques and rationale, as well as the software that implements them. The software is a set of persistent C++ objects controlled by Tcl command scripts. The C++/Tcl interface is automatically generated by a software system called tcl_c++, developed by the author and described here. The objects keep track of different portions of the molecular machinery so that different simulation techniques and boundary conditions can be applied as appropriate. This capability has been required to demonstrate (computationally) the feasibility of our gears. A new distributed software architecture featuring a WWW universal client, CORBA distributed objects, and agent software is under consideration. This architecture is intended to eventually enable a widely dispersed group to develop complex simulated molecular machines.

  11. Use of software engineering techniques in the design of the ALEPH data acquisition system

    NASA Astrophysics Data System (ADS)

    Charity, T.; McClatchey, R.; Harvey, J.

    1987-08-01

    The SASD methodology is being used to provide a rigorous design framework for various components of the ALEPH data acquisition system. The Entity-Relationship data model is used to describe the layout and configuration of the control and acquisition systems and detector components. State Transition Diagrams are used to specify control applications such as run control and resource management and Data Flow Diagrams assist in decomposing software tasks and defining interfaces between processes. These techniques encourage rigorous software design leading to enhanced functionality and reliability. Improved documentation and communication ensures continuity over the system life-cycle and simplifies project management.

  12. Cerec Smile Design--a software tool for the enhancement of restorations in the esthetic zone.

    PubMed

    Kurbad, Andreas; Kurbad, Susanne

    2013-01-01

    Restorations in the esthetic zone can now be enhanced using software tools. In addition to the design of the restoration itself, a part or all of the patient's face can be displayed on the monitor to increase the predictability of treatment results. Using the Smile Design components of the Cerec and inLab software, a digital photograph of the patient can be projected onto a three-dimensional dummy head. In addition to its use for the enhancement of the CAD process, this technology can also be utilized for marketing purposes. PMID:24364196

  14. Use of checkpoint-restart for complex HEP software on traditional architectures and Intel MIC

    NASA Astrophysics Data System (ADS)

    Arya, Kapil; Cooperman, Gene; Dotti, Andrea; Elmer, Peter

    2014-06-01

    Process checkpoint-restart is a technology with great potential for use in HEP workflows. Use cases include debugging, reducing the startup time of applications both in offline batch jobs and the High Level Trigger, permitting job preemption in environments where spare CPU cycles are being used opportunistically and efficient scheduling of a mix of multicore and single-threaded jobs. We report on tests of checkpoint-restart technology using CMS software, Geant4-MT (multi-threaded Geant4), and the DMTCP (Distributed Multithreaded Checkpointing) package. We analyze both single- and multi-threaded applications and test on both standard Intel x86 architectures and on Intel MIC. The tests with multi-threaded applications on Intel MIC are used to consider scalability and performance. These are considered an indicator of what the future may hold for many-core computing.

  15. Protein evolution analysis of S-hydroxynitrile lyase by complete sequence design utilizing the INTMSAlign software.

    PubMed

    Nakano, Shogo; Asano, Yasuhisa

    2015-02-03

    Development of software and methods for design of complete sequences of functional proteins could contribute to studies of protein engineering and protein evolution. To this end, we developed the INTMSAlign software, and used it to design functional proteins and evaluate their usefulness. The software could assign both consensus and correlation residues of target proteins. We generated three protein sequences with S-selective hydroxynitrile lyase (S-HNL) activity, which we call designed S-HNLs; these proteins folded as efficiently as the native S-HNL. Sequence and biochemical analysis of the designed S-HNLs suggested that accumulation of neutral mutations occurs during the process of S-HNLs evolution from a low-activity form to a high-activity (native) form. Taken together, our results demonstrate that our software and the associated methods could be applied not only to design of complete sequences, but also to predictions of protein evolution, especially within families such as esterases and S-HNLs.
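
    One step the approach above depends on — assigning consensus residues across a multiple sequence alignment — can be sketched as follows. This is a generic illustration, not the INTMSAlign algorithm: the toy alignment and the majority threshold are invented, and the real tool also scores correlated residue pairs, which is omitted here.

```python
# Rough sketch: assign the consensus residue at each column of a
# multiple sequence alignment (MSA). Columns whose majority residue
# falls below `min_frac` are left unassigned ('-').
from collections import Counter

def consensus(alignment, min_frac=0.5):
    """Per column, return the majority residue if it clears min_frac."""
    out = []
    for col in zip(*alignment):
        residue, n = Counter(c for c in col if c != "-").most_common(1)[0]
        out.append(residue if n / len(alignment) >= min_frac else "-")
    return "".join(out)

# Toy three-sequence alignment (invented for illustration).
msa = ["MKLV", "MKIV", "MRLV"]
cons = consensus(msa)
```

    A complete-sequence design method would then combine such consensus assignments with correlation information to propose full-length sequences.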

  17. Design and evaluation of a THz time domain imaging system using standard optical design software.

    PubMed

    Brückner, Claudia; Pradarutti, Boris; Müller, Ralf; Riehemann, Stefan; Notni, Gunther; Tünnermann, Andreas

    2008-09-20

    A terahertz (THz) time domain imaging system is analyzed and optimized with standard optical design software (ZEMAX). Special requirements to the illumination optics and imaging optics are presented. In the optimized system, off-axis parabolic mirrors and lenses are combined. The system has a numerical aperture of 0.4 and is diffraction limited for field points up to 4 mm and wavelengths down to 750 µm. ZEONEX is used as the lens material. Higher aspherical coefficients are used for correction of spherical aberration and reduction of lens thickness. The lenses were manufactured by ultraprecision machining. For optimization of the system, ray tracing and wave-optical methods were combined. We show how the ZEMAX Gaussian beam analysis tool can be used to evaluate illumination optics. The resolution of the THz system was tested with a wire and a slit target, line gratings of different period, and a Siemens star. The behavior of the temporal line spread function can be modeled with the polychromatic coherent line spread function feature in ZEMAX. The spectral and temporal resolutions of the line gratings are compared with the respective modulation transfer function of ZEMAX. For maximum resolution, the system has to be diffraction limited down to the smallest wavelength of the spectrum of the THz pulse. Then, the resolution on time domain analysis of the pulse maximum can be estimated with the spectral resolution of the center of gravity wavelength. The system resolution near the optical axis on time domain analysis of the pulse maximum is 1 line pair/mm with an intensity contrast of 0.22. The Siemens star is used for estimation of the resolution of the whole system. An eight channel electro-optic sampling system was used for detection. The resolution on time domain analysis of the pulse maximum of all eight channels could be determined with the Siemens star to be 0.7 line pairs/mm. PMID:18806862

  18. Methodology for object-oriented real-time systems analysis and design: Software engineering

    NASA Technical Reports Server (NTRS)

    Schoeffler, James D.

    1991-01-01

    Successful application of software engineering methodologies requires an integrated analysis and design life-cycle in which the various phases flow smoothly ('seamlessly') from analysis through design to implementation. Furthermore, different analysis methodologies often lead to different structurings of the system, so that the transition from analysis to design may be awkward depending on the design methodology to be used. This is especially important when object-oriented programming is to be used for implementation and the original specification, and perhaps the high-level design, is not object oriented. Two approaches to real-time systems analysis that can lead to an object-oriented design are contrasted: (1) modeling the system using structured analysis with real-time extensions, which emphasizes data and control flows, followed by the abstraction of objects whose operations or methods correspond to processes in the data flow diagrams, and then designing in terms of these objects; and (2) modeling the system from the beginning as a set of naturally occurring concurrent entities (objects), each having its own time-behavior defined by a set of states and state-transition rules, and seamlessly transforming the analysis models into high-level design models. A new concept of a 'real-time systems-analysis object' is introduced and becomes the basic building block of a series of seamlessly connected models, which progress from the object-oriented systems-analysis logical models through the physical architectural models to the high-level design stages. The methodology is appropriate to the overall specification, including hardware and software modules. In software modules, the systems-analysis objects are transformed into software objects.
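
    The "real-time systems-analysis object" idea — an entity defined by a set of states and state-transition rules — can be sketched minimally. The valve example and its events below are invented for illustration, not drawn from the paper.

```python
# Hedged sketch: an analysis object whose time-behavior is a set of
# states plus state-transition rules, the basic building block the
# paper proposes for object-oriented real-time analysis.
class AnalysisObject:
    def __init__(self, initial, transitions):
        self.state = initial
        self.transitions = transitions   # {(state, event): next_state}

    def handle(self, event):
        """Apply one transition rule; events with no rule are ignored."""
        self.state = self.transitions.get((self.state, event), self.state)
        return self.state

# A concurrent entity from a hypothetical plant model: a valve.
valve = AnalysisObject("closed", {
    ("closed", "open_cmd"): "opening",
    ("opening", "limit_hit"): "open",
    ("open", "close_cmd"): "closed",
})
for ev in ["open_cmd", "limit_hit"]:
    valve.handle(ev)
```

    Because the analysis model is already an object with states and rules, the "seamless" transition the paper argues for amounts to refining such objects into design-level software objects rather than restructuring the system.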

  19. Safety Software Guide Perspectives for the Design of New Nuclear Facilities (U)

    SciTech Connect

    VINCENT, Andrew

    2005-07-14

    software. The discussion provided herein illustrates benefits of applying the Safety Software Guide to work activities dependent on software applications and directed toward the design of new nuclear facilities. In particular, the Guide-based systematic approach with software enables design processes to effectively proceed and reduce the likelihood of rework activities. Several application examples are provided for the new facility.

  20. geant4 hadronic cascade models analysis of proton and charged pion transverse momentum spectra from p + Cu and Pb collisions at 3, 8, and 15 GeV/c

    SciTech Connect

    Abdel-Waged, Khaled; Felemban, Nuha; Uzhinskii, V. V.

    2011-07-15

    We describe how various hadronic cascade models, which are implemented in the geant4 toolkit, describe proton and charged pion transverse momentum spectra from p + Cu and Pb collisions at 3, 8, and 15 GeV/c, recently measured in the hadron production (HARP) experiment at CERN. The Binary, ultrarelativistic quantum molecular dynamics (UrQMD) and modified FRITIOF (FTF) hadronic cascade models are chosen for investigation. The first two models are based on limited (Binary) and branched (UrQMD) binary scattering between cascade particles which can be either a baryon or meson, in the three-dimensional space of the nucleus, while the latter (FTF) considers collective interactions between nucleons only, on the plane of impact parameter. It is found that the slow (p_T ≤ 0.3 GeV/c) proton spectra are quite sensitive to the different treatments of cascade pictures, while the fast (p_T > 0.3 GeV/c) proton spectra are not strongly affected by the differences between the FTF and UrQMD models. It is also shown that the UrQMD and FTF combined with Binary (FTFB) models could reproduce both proton and charged pion spectra from p + Cu and Pb collisions at 3, 8, and 15 GeV/c with the same accuracy.

  1. Determination and Fabrication of New Shield Super Alloys Materials for Nuclear Reactor Safety by Experiments and Cern-Fluka Monte Carlo Simulation Code, Geant4 and WinXCom

    NASA Astrophysics Data System (ADS)

    Aygun, Bünyamin; Korkut, Turgay; Karabulut, Abdulhalik

    2016-05-01

Despite the possibility of fossil fuel depletion and increasing energy needs, the use of radiation tends to increase, and the security-focused debate about planned nuclear power plants continues. The objective of this thesis is to prevent radiation from nuclear reactors spreading into the environment. To this end, we produced new, higher-performance shielding materials that retain a high level of radiation during reactor operation. The additives used in the new shielding materials include iron (Fe), rhenium (Re), nickel (Ni), chromium (Cr), boron (B), copper (Cu), tungsten (W), tantalum (Ta), and boron carbide (B4C). The experimental results indicated that these materials are good shields against gamma rays and neutrons. The powder metallurgy technique was used to produce the new shielding materials. The CERN-FLUKA and Geant4 Monte Carlo simulation codes and WinXCom were used to determine the percentages of the components in the high-temperature-resistant, fast-neutron and gamma shielding materials. Super alloys were produced, and experimental fast neutron dose equivalent and gamma radiation absorption measurements of the new shielding materials were then carried out. The products can be used safely not only in reactors but also in nuclear medicine treatment rooms, for the storage of nuclear waste, in nuclear research laboratories, and against cosmic radiation in space vehicles.
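Tools such as WinXCom tabulate mass attenuation coefficients, from which gamma transmission through a shield follows the standard exponential (Beer-Lambert) attenuation law. A minimal sketch of that law; the coefficient, density, and thickness below are illustrative placeholders, not measured values for the alloys described above:

```python
import math

def transmitted_fraction(mu_rho, density, thickness_cm):
    """I/I0 = exp(-(mu/rho) * rho * x), the exponential attenuation law.

    mu_rho       -- mass attenuation coefficient, cm^2/g
    density      -- material density, g/cm^3
    thickness_cm -- shield thickness, cm
    """
    return math.exp(-mu_rho * density * thickness_cm)

# Hypothetical example: mu/rho = 0.06 cm^2/g, density 12 g/cm^3, 5 cm slab
frac = transmitted_fraction(0.06, 12.0, 5.0)
print(f"{frac:.4f}")  # prints "0.0273"
```

Comparing measured transmission against this law at several photon energies is the usual way simulation (FLUKA, Geant4) and tabulated data (WinXCom) are cross-checked.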

  2. Design Genetic Algorithm Optimization Education Software Based Fuzzy Controller for a Tricopter Fly Path Planning

    ERIC Educational Resources Information Center

Tran, Huu-Khoa; Chiou, Juing-Shian; Peng, Shou-Tao

    2016-01-01

In this paper, Genetic Algorithm Optimization (GAO) education software based on a Fuzzy Logic Controller (GAO-FLC) for simulating the flight motion control of Unmanned Aerial Vehicles (UAVs) is designed. The generated flight trajectories integrate the Scaling Factor (SF) fuzzy controller gains optimized by the GAO algorithm. The…

  3. Design and Empirical Evaluation of Search Software for Legal Professionals on the WWW.

    ERIC Educational Resources Information Center

    Dempsey, Bert J.; Vreeland, Robert C.; Sumner, Robert G., Jr.; Yang, Kiduk

    2000-01-01

    Discussion of effective search aids for legal researchers on the World Wide Web focuses on the design and evaluation of two software systems developed to explore models for browsing and searching across a user-selected set of Web sites. Describes crawler-enhanced search engines, filters, distributed full-text searching, and natural language…

  4. Similarities and Differences in the Academic Education of Software Engineering and Architectural Design Professionals

    ERIC Educational Resources Information Center

    Hazzan, Orit; Karni, Eyal

    2006-01-01

    This article focuses on the similarities and differences in the academic education of software engineers and architects. The rationale for this work stems from our observation, each from the perspective of her or his own discipline, that these two professional design and development processes share some similarities. A pilot study was performed,…

  5. The Design and Evaluation of a Cryptography Teaching Strategy for Software Engineering Students

    ERIC Educational Resources Information Center

    Dowling, T.

    2006-01-01

    The present paper describes the design, implementation and evaluation of a cryptography module for final-year software engineering students. The emphasis is on implementation architectures and practical cryptanalysis rather than a standard mathematical approach. The competitive continuous assessment process reflects this approach and rewards…

  6. Constraint-Driven Software Design: An Escape from the Waterfall Model.

    ERIC Educational Resources Information Center

    de Hoog, Robert; And Others

    1994-01-01

    Presents the principles of a development methodology for software design based on a nonlinear, product-driven approach that integrates quality aspects. Two examples are given to show that the flexibility needed for building high quality systems leads to integrated development environments in which methodology, product, and tools are closely…

  7. Managing Courseware Production: An Instructional Design Model with a Software Engineering Approach.

    ERIC Educational Resources Information Center

    Yang, Chia-Shing; And Others

    1995-01-01

    Proposes an instructional design model for the production of courseware and combines this model with software engineering principles to specify project management responsibilities and procedures. Describes the roles of team members involved in the project and offers a project template to manage production activities to ensure that end products…

  8. Software design as a problem in learning theory (a research overview)

    NASA Technical Reports Server (NTRS)

    Fass, Leona F.

    1992-01-01

Our interest in automating software design has come out of our research in automated reasoning, inductive inference, learnability, and algebraic machine theory. We have investigated these areas extensively, in connection with specific problems of language representation, acquisition, processing, and design. In the case of formal context-free (CF) languages we established the existence of finite learnable models ('behavioral realizations') and procedures for constructing them effectively. We also determined techniques for automatic construction of the models, inductively inferring them from finite examples of how they should 'behave'. These results were obtainable due to appropriate representation of domain knowledge, and constraints on the domain that the representation defined. It was when we sought to generalize our results, and adapt or apply them, that we began investigating the possibility of determining similar procedures for constructing correct software. Discussions with other researchers led us to examine testing and verification processes, as they relate to inference and because of their considerable importance in correct software design. Motivating papers by other researchers led us to examine these processes in some depth. Here we present our approach to those software design issues raised by other researchers, within our own theoretical context. We describe our results relative to those of the other researchers, and conclude that they do not compare unfavorably.

  9. Computer Game Theories for Designing Motivating Educational Software: A Survey Study

    ERIC Educational Resources Information Center

    Ang, Chee Siang; Rao, G. S. V. Radha Krishna

    2008-01-01

    The purpose of this study is to evaluate computer game theories for educational software. We propose a framework for designing engaging educational games based on contemporary game studies which includes ludology and narratology. Ludology focuses on the study of computer games as play and game activities, while narratology revolves around the…

  10. Getting Down to Business: Software Design Company, Module 20. [Student Guide]. Entrepreneurship Training Components.

    ERIC Educational Resources Information Center

    Shapiro, Norma

    This module on owning and operating a software design company is one of 36 in a series on entrepreneurship. The introduction tells the student what topics will be covered and suggests other modules to read in related occupations. Each unit includes student goals, a case study, and a discussion of the unit subject matter. Learning activities are…

  11. Software design and documentation language: User's guide for SDDL release 4

    NASA Technical Reports Server (NTRS)

    Zepko, T. M.

    1981-01-01

    The changes introduced in the PASCAL implementation of the software design and documentation language are described. These changes include a number of new capabilities, plus some changes to make the language more consistent and easier to use. Incompatibilities with earlier versions are limited to certain of the directive statements.

  12. An Assessment of a Beowulf System for a Wide Class of Analysis and Design Software

    NASA Technical Reports Server (NTRS)

    Katz, D. S.; Cwik, T.; Kwan, B. H.; Lou, J. Z.; Springer, P. L.; Sterling, T. L.; Wang, P.

    1997-01-01

    This paper discusses Beowulf systems, focusing on Hyglac, the Beowulf system installed at the Jet Propulsion Laboratory. The purpose of the paper is to assess how a system of this type will perform while running a variety of scientific and engineering analysis and design software.

  13. Effects of Reflective Thinking in the Process of Designing Software on Students' Learning Performances

    ERIC Educational Resources Information Center

    Hsieh, Pei-Hsuan; Chen, Nian-Shing

    2012-01-01

The purpose of this study is to examine the effects of reflective thinking in the process of designing software on students' learning performances. The study contends that reflective thinking is a useful teaching strategy to improve learning performance among lower achieving students. Participants were students from two groups: Higher…

  14. Learning Embedded Software Design in an Open 3A Multiuser Laboratory

    ERIC Educational Resources Information Center

    Shih, Chien-Chou; Hwang, Lain-Jinn

    2011-01-01

    The need for professional programmers in embedded applications has become critical for industry growth. This need has increased the popularity of embedded software design courses, which are resource-intensive and space-limited in traditional real lab-based instruction. To overcome geographic and time barriers in enhancing practical skills that…

  15. The Design of Lessons Using Mathematics Analysis Software to Support Multiple Representations in Secondary School Mathematics

    ERIC Educational Resources Information Center

    Pierce, Robyn; Stacey, Kaye; Wander, Roger; Ball, Lynda

    2011-01-01

    Current technologies incorporating sophisticated mathematical analysis software (calculation, graphing, dynamic geometry, tables, and more) provide easy access to multiple representations of mathematical problems. Realising the affordances of such technology for students' learning requires carefully designed lessons. This paper reports on design…

  16. A Test of the Design of a Video Tutorial for Software Training

    ERIC Educational Resources Information Center

    van der Meij, J.; van der Meij, H.

    2015-01-01

    The effectiveness of a video tutorial versus a paper-based tutorial for software training has yet to be established. Mixed outcomes from the empirical studies to date suggest that for a video tutorial to outperform its paper-based counterpart, the former should be crafted so that it addresses the strengths of both designs. This was attempted in…

  17. Using Tablet PCs and Interactive Software in IC Design Courses to Improve Learning

    ERIC Educational Resources Information Center

    Simoni, M.

    2011-01-01

    This paper describes an initial study of using tablet PCs and interactive course software in integrated circuit (IC) design courses. A rapidly growing community is demonstrating how this technology can improve learning and retention of material by facilitating interaction between faculty and students via cognitive exercises during lectures. While…

  18. Web-based software tool for constraint-based design specification of synthetic biological systems.

    PubMed

    Oberortner, Ernst; Densmore, Douglas

    2015-06-19

miniEugene provides computational support for solving combinatorial design problems, enabling users to specify and enumerate designs for novel biological systems based on sets of biological constraints. This technical note presents a brief tutorial for biologists and software engineers in the field of synthetic biology on how to use miniEugene. After reading this technical note, users should know which biological constraints are available in miniEugene, understand the syntax and semantics of these constraints, and be able to follow a step-by-step guide to specify the design of a classical synthetic biological system: the genetic toggle switch [1]. We also provide links and references to more information on the miniEugene web application and the integration of the miniEugene software library into sophisticated Computer-Aided Design (CAD) tools for synthetic biology (www.eugenecad.org).
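Constraint-based enumeration of the kind miniEugene performs can be illustrated with a toy sketch: generate every ordering of a set of parts and keep those satisfying all constraints. This is NOT miniEugene's actual API or constraint syntax; the part names and predicates are invented for the example.

```python
from itertools import permutations

# Hypothetical part names for a simple expression cassette
PARTS = ["promoter", "rbs", "cds", "terminator"]

def starts_with(design, part):
    return design[0] == part

def before(design, a, b):
    return design.index(a) < design.index(b)

def enumerate_designs(parts, constraints):
    """Yield every ordering of `parts` that satisfies all constraint predicates."""
    for design in permutations(parts):
        if all(c(design) for c in constraints):
            yield list(design)

constraints = [
    lambda d: starts_with(d, "promoter"),  # promoter comes first
    lambda d: before(d, "rbs", "cds"),     # rbs precedes the coding sequence
    lambda d: d[-1] == "terminator",       # terminator comes last
]

for design in enumerate_designs(PARTS, constraints):
    print(design)  # prints ['promoter', 'rbs', 'cds', 'terminator']
```

The real tool expresses such ordering and adjacency rules in a dedicated constraint language and prunes the combinatorial space far more efficiently than brute-force permutation.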

  19. WRAP Module 1 data management system (DMS) software design description (SDD)

    SciTech Connect

    Talmage, P.A.

    1995-03-17

    The Waste Receiving and Processing (WRAP) Module 1 Data Management System (DMS) System Design Description (SDD) describes the logical and physical architecture of the system. The WRAP 1 DMS SDD formally partitions the elements of the system described in the WRAP 1 DMS Software requirements specification into design objects and describes the key properties and relationships among the design objects and interfaces with external systems such as the WRAP Plant Control System (PCS). The WRAP 1 DMS SDD can be thought of as a detailed blueprint for implementation activities. The design descriptions contained within this document will describe, in detail, the software products that will be developed to assist the Project W-026, Waste Receiving and Processing Module 1, in their management functions. The WRAP 1 DMS is required to collect, store, and report data related to certification, tracking, packaging, repackaging, processing, and shipment of waste processed or stored at the WRAP 1 facility.
