Sample records for Geant4 software design

  1. A Software Toolkit to Study Systematic Uncertainties of the Physics Models of the Geant4 Simulation Package

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Genser, Krzysztof; Hatcher, Robert; Kelsey, Michael

The Geant4 simulation toolkit is used to model interactions between particles and matter. Geant4 employs a set of validated physics models that span a wide range of interaction energies. These models rely on measured cross-sections and phenomenological models with physically motivated parameters that are tuned to cover many application domains. To study the uncertainties associated with the Geant4 physics models, we have designed and implemented a comprehensive, modular, user-friendly software toolkit that allows the variation of one or more parameters of one or more Geant4 physics models involved in simulation studies. It also enables analysis of multiple variants of the resulting physics observables of interest in order to estimate the uncertainties associated with the simulation model choices. Based on modern event-processing infrastructure software, the toolkit offers a variety of attractive features, e.g. a flexible run-time configurable workflow, comprehensive bookkeeping, and an easy-to-expand collection of analytical components. The design, implementation technology, and key functionalities of the toolkit are presented in this paper and illustrated with selected results.

  2. A Software Toolkit to Study Systematic Uncertainties of the Physics Models of the Geant4 Simulation Package

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Genser, Krzysztof; Hatcher, Robert; Perdue, Gabriel

    2016-11-10

The Geant4 toolkit is used to model interactions between particles and matter. Geant4 employs a set of validated physics models that span a wide range of interaction energies. These models are tuned to cover a large variety of possible applications. This raises the critical question of what uncertainties are associated with the Geant4 physics model, or group of models, involved in a simulation project. To address the challenge, we have designed and implemented a comprehensive, modular, user-friendly software toolkit that allows the variation of one or more parameters of one or more Geant4 physics models involved in simulation studies. It also enables analysis of multiple variants of the resulting physics observables of interest in order to estimate the uncertainties associated with the simulation model choices. Key functionalities of the toolkit are presented in this paper and are illustrated with selected results.

  3. A Virtual Geant4 Environment

    NASA Astrophysics Data System (ADS)

    Iwai, Go

    2015-12-01

We describe the development of an environment for Geant4 consisting of an application and data that provide users with a more efficient way to access Geant4 applications without having to download and build the software locally. The environment is platform neutral and offers the users near-real-time performance. In addition, the environment consists of data and Geant4 libraries built using low-level virtual machine (LLVM) tools which can produce bitcode that can be embedded in HTML and accessed via a browser. The bitcode is downloaded to the local machine via the browser and can then be configured by the user. This approach provides a way of minimising the risk of leaking potentially sensitive data used to construct the Geant4 model and application in the medical domain for treatment planning. We describe several applications that have used this approach and compare their performance with that of native applications. We also describe potential user communities that could benefit from this approach.

  4. Software aspects of the Geant4 validation repository

    NASA Astrophysics Data System (ADS)

    Dotti, Andrea; Wenzel, Hans; Elvira, Daniel; Genser, Krzysztof; Yarba, Julia; Carminati, Federico; Folger, Gunter; Konstantinov, Dmitri; Pokorski, Witold; Ribon, Alberto

    2017-10-01

    The Geant4, GeantV and GENIE collaborations regularly perform validation and regression tests for simulation results. DoSSiER (Database of Scientific Simulation and Experimental Results) is being developed as a central repository to store the simulation results as well as the experimental data used for validation. DoSSiER is easily accessible via a web application. In addition, a web service allows for programmatic access to the repository to extract records in JSON or XML exchange formats. In this article, we describe the functionality and the current status of various components of DoSSiER as well as the technology choices we made.

  5. Software Aspects of the Geant4 Validation Repository

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dotti, Andrea; Wenzel, Hans; Elvira, Daniel

    2016-01-01

The Geant4, GeantV and GENIE collaborations regularly perform validation and regression tests for simulation results. DoSSiER (Database of Scientific Simulation and Experimental Results) is being developed as a central repository to store the simulation results as well as the experimental data used for validation. DoSSiER is easily accessible via a web application. In addition, a web service allows for programmatic access to the repository to extract records in JSON or XML exchange formats. In this article, we describe the functionality and the current status of various components of DoSSiER as well as the technology choices we made.

  6. Geomega: MEGAlib's Uniform Geometry and Detector Description Tool for Geant3, MGGPOD, and Geant4

    NASA Astrophysics Data System (ADS)

    Zoglauer, Andreas C.; Andritschke, R.; Schopper, F.; Wunderer, C. B.

    2006-09-01

The Medium Energy Gamma-ray Astronomy library MEGAlib is a set of software tools for the analysis of low to medium energy gamma-ray telescopes, especially Compton telescopes. It comprises all necessary data analysis steps from simulation/measurements via event reconstruction to image reconstruction and enables detailed performance assessments. In the energy range of Compton telescopes (with energy deposits from a few keV up to hundreds of MeV), the Geant Monte-Carlo software packages (Geant3 with its MGGPOD extension as well as Geant4) are widely used. Since each tool has its unique advantages, MEGAlib contains a geometry and detector description library, called Geomega, which allows those tools to be used in a uniform way. It incorporates the versatile 3D display facilities available within the ROOT libraries. The same geometry, material, trigger, and detector description can be used for all simulation tools as well as for the later event analysis in the MEGAlib framework. This is done by converting the MEGAlib geometry into the Geant3 or MGGPOD format or directly linking the Geomega library into Geant4. The geometry description can handle most (and can be extended to handle all) volumes common to Geant3, Geant4 and ROOT. Geomega implements a list of features that are especially useful for optimizing detector geometries: it allows the definition of constants, can handle mathematical operations, enables volume scaling, checks for overlaps of detector volumes, performs mass calculations, etc. Used in combination with MEGAlib, Geomega enables discretization, application of detector noise, thresholds, various trigger conditions, defective pixels, etc. The highly modular and completely object-oriented library is written in C++ and based on ROOT. It was originally developed for the Compton-scattering and pair-creation tracking telescope MEGA and has since been successfully applied to a wide variety of telescopes, such as ACT, NuSTAR, and GRI.

  7. Recent developments in Geant4

    DOE PAGES

    Allison, J.; Amako, K.; Apostolakis, J.; ...

    2016-07-01

Geant4 is a software toolkit for the simulation of the passage of particles through matter. It is used by a large number of experiments and projects in a variety of application domains, including high energy physics, astrophysics and space science, medical physics and radiation protection. Over the past several years, major changes have been made to the toolkit in order to accommodate the needs of these user communities, and to efficiently exploit the growth of computing power made available by advances in technology. Here, the adaptation of Geant4 to multithreading; advances in physics, detector modeling and visualization; extensions to the toolkit, including biasing and reverse Monte Carlo; and tools for physics and release validation are discussed.

  8. A Toolkit to Study Sensitivity of the Geant4 Predictions to the Variations of the Physics Model Parameters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fields, Laura; Genser, Krzysztof; Hatcher, Robert

Geant4 is the leading detector simulation toolkit used in high energy physics to design detectors and to optimize calibration and reconstruction software. It employs a set of carefully validated physics models to simulate interactions of particles with matter across a wide range of interaction energies. These models, especially the hadronic ones, rely largely on directly measured cross-sections and phenomenological predictions with physically motivated parameters estimated by theoretical calculation or measurement. Because these models are tuned to cover a very wide range of possible simulation tasks, they may not always be optimized for a given process or a given material. This raises several critical questions, e.g. how sensitive Geant4 predictions are to variations of the model parameters, what uncertainties are associated with a particular tune of a Geant4 physics model or group of models, or how to consistently derive guidance for Geant4 model development and improvement from the wide range of available experimental data. We have designed and implemented a comprehensive, modular, user-friendly software toolkit to study and address such questions. It allows one to easily modify parameters of one or several Geant4 physics models involved in the simulation, and to perform collective analysis of multiple variants of the resulting physics observables of interest and comparison against a variety of corresponding experimental data. Based on modern event-processing infrastructure software, the toolkit offers a variety of attractive features, e.g. a flexible run-time configurable workflow, comprehensive bookkeeping, and an easy-to-expand collection of analytical components. Design, implementation technology, and key functionalities of the toolkit are presented and illustrated with results obtained with key Geant4 hadronic models.
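The vary-then-analyze workflow this abstract describes can be caricatured in a few lines. The function and parameter names below are invented stand-ins, with a mock observable in place of a real Geant4 run; it shows only the shape of a parameter scan over model variants:

```python
import random
import statistics

def simulate_observable(parameter, n_events=2000, seed=0):
    """Stand-in for one Geant4 run: returns a mock physics observable
    whose mean depends linearly on the varied model parameter."""
    rng = random.Random(seed)
    samples = [rng.gauss(parameter * 2.0, 0.5) for _ in range(n_events)]
    return statistics.mean(samples)

def parameter_scan(values):
    """Run one simulation variant per parameter value and collect results."""
    return {v: simulate_observable(v) for v in values}

# Vary a (hypothetical) model parameter around its nominal value of 1.0.
results = parameter_scan([0.9, 1.0, 1.1])
spread = max(results.values()) - min(results.values())
print(f"observable spread across variants: {spread:.3f}")
```

The spread of the observable across variants is then one crude estimate of the model-parameter uncertainty; the real toolkit adds bookkeeping and comparison against experimental data on top of this pattern.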

  9. GEANT4 distributed computing for compact clusters

    NASA Astrophysics Data System (ADS)

    Harrawood, Brian P.; Agasthya, Greeshma A.; Lakshmanan, Manu N.; Raterman, Gretchen; Kapadia, Anuj J.

    2014-11-01

A new technique for distribution of GEANT4 processes is introduced to simplify running a simulation in a parallel environment such as a tightly coupled computer cluster. Using a new C++ class derived from the GEANT4 toolkit, multiple runs forming a single simulation are managed across a local network of computers with a simple inter-node communication protocol. The class is integrated with the GEANT4 toolkit and is designed to scale from a single symmetric multiprocessing (SMP) machine to compact clusters ranging in size from tens to thousands of nodes. User-designed 'work tickets' are distributed to clients using a client-server workflow model to specify the parameters for each individual run of the simulation. The new g4DistributedRunManager class was developed and well tested in the course of our Neutron Stimulated Emission Computed Tomography (NSECT) experiments. It will be useful for anyone running GEANT4 for large discrete data sets, such as covering a range of angles in computed tomography, calculating dose delivery with multiple fractions, or simply speeding up the throughput of a single model.
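The 'work ticket' pattern described above can be sketched with threads standing in for cluster nodes. g4DistributedRunManager itself is a C++ class, so this is only an illustrative model of the client-server flow; the ticket fields and the mock result are invented:

```python
import queue
import threading

def run_simulation(ticket):
    """Stand-in for one Geant4 run configured by a work ticket."""
    return ticket["run_id"], ticket["angle_deg"] * 2  # mock per-run result

def worker(tickets, results):
    """A client node: pull tickets until the server's queue is empty."""
    while True:
        try:
            ticket = tickets.get_nowait()
        except queue.Empty:
            return
        results.put(run_simulation(ticket))

# Server side: one ticket per tomography angle.
tickets = queue.Queue()
for i, angle in enumerate(range(0, 180, 45)):
    tickets.put({"run_id": i, "angle_deg": angle})

results = queue.Queue()
threads = [threading.Thread(target=worker, args=(tickets, results))
           for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

collected = dict(results.queue)
print(sorted(collected.items()))
```

In the real system the queue would live on a server process and the workers on separate machines, but the ticket-per-run decomposition is the same.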

  10. Geant4-DNA: overview and recent developments

    NASA Astrophysics Data System (ADS)

    Štěpán, Václav

    software already available for download, as well as future perspectives, will be presented, on behalf of the Geant4-DNA Collaboration.

  11. Geant4 Computing Performance Benchmarking and Monitoring

    DOE PAGES

    Dotti, Andrea; Elvira, V. Daniel; Folger, Gunter; ...

    2015-12-23

Performance evaluation and analysis of large scale computing applications is essential for optimal use of resources. As detector simulation is one of the most compute intensive tasks and Geant4 is the simulation toolkit most widely used in contemporary high energy physics (HEP) experiments, it is important to monitor Geant4 through its development cycle for changes in computing performance and to identify problems and opportunities for code improvements. All Geant4 development and public releases are being profiled with a set of applications that utilize different input event samples, physics parameters, and detector configurations. Results from multiple benchmarking runs are compared to previous public and development reference releases to monitor CPU and memory usage. Observed changes are evaluated and correlated with code modifications. Besides the full summary of call stack and memory footprint, a detailed call graph analysis is available to Geant4 developers for further analysis. The set of software tools used in the performance evaluation procedure, both in sequential and multi-threaded modes, includes FAST, IgProf and Open|Speedshop. Finally, the scalability of the CPU time and memory performance in multi-threaded applications is evaluated by measuring event throughput and memory gain as a function of the number of threads for selected event samples.

  12. Integration of g4tools in Geant4

    NASA Astrophysics Data System (ADS)

    Hřivnáčová, Ivana

    2014-06-01

g4tools, originally part of the inlib and exlib packages, provides a very light and easy-to-install set of C++ classes that can be used to perform analysis in a Geant4 batch program. It makes it possible to create and manipulate histograms and ntuples, and write them in supported file formats (ROOT, AIDA XML, CSV and HBOOK). It is integrated in Geant4 through analysis manager classes, thus providing a uniform interface to the g4tools objects and also hiding the differences between the classes for the different supported output formats. Moreover, additional features, such as histogram activation or support for Geant4 units, are implemented in the analysis classes following user requests. A set of Geant4 user interface commands allows the user to create histograms and set their properties interactively or in Geant4 macros. g4tools was first introduced in the Geant4 9.5 release, where its use was demonstrated in one basic example, and it is already used in a majority of the Geant4 examples within the Geant4 9.6 release. In this paper, we will give an overview and the present status of the integration of g4tools in Geant4 and report on upcoming new features.
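The user interface commands mentioned above can create and configure a histogram from a Geant4 macro, in the style of the Geant4 example macros (file and histogram names here are illustrative):

```
# Output file name (extension is added per the selected format)
/analysis/setFileName run1
# Create 1D histogram: name, title, nbins, min, max, unit
/analysis/h1/create eDep "Energy deposit in target" 100 0. 100. MeV
# Re-bin histogram id 1 without recompiling the application
/analysis/h1/set 1 200 0. 50. MeV
```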

  13. Nuclear spectroscopy with Geant4. The superheavy challenge

    NASA Astrophysics Data System (ADS)

    Sarmiento, Luis G.

    2016-12-01

The simulation toolkit Geant4 was originally developed at CERN for high-energy physics. Over the years it has established itself as a Swiss Army knife not only in particle physics; it has also seen an accelerated expansion into nuclear physics and, more recently, medical imaging and γ- and ion-therapy, to mention but a handful of new applications. Geant4's validated coverage is vast, spanning many particles, ions, materials, and physical processes, typically with several different models to choose from. Unfortunately, atomic nuclei with atomic number Z > 100 are not properly supported. This is likely due to the relative novelty of the field, its comparably small user base, and scarce evaluated experimental data. To circumvent this situation, different workarounds have been used over the years. In this work the simulation toolkit Geant4 is introduced with its different components, and the effort to bring the software to the heavy and superheavy region is described.

  14. A tool to convert CAD models for importation into Geant4

    NASA Astrophysics Data System (ADS)

    Vuosalo, C.; Carlsmith, D.; Dasu, S.; Palladino, K.; LUX-ZEPLIN Collaboration

    2017-10-01

    The engineering design of a particle detector is usually performed in a Computer Aided Design (CAD) program, and simulation of the detector’s performance can be done with a Geant4-based program. However, transferring the detector design from the CAD program to Geant4 can be laborious and error-prone. SW2GDML is a tool that reads a design in the popular SOLIDWORKS CAD program and outputs Geometry Description Markup Language (GDML), used by Geant4 for importing and exporting detector geometries. Other methods for outputting CAD designs are available, such as the STEP format, and tools exist to convert these formats into GDML. However, these conversion methods produce very large and unwieldy designs composed of tessellated solids that can reduce Geant4 performance. In contrast, SW2GDML produces compact, human-readable GDML that employs standard geometric shapes rather than tessellated solids. This paper will describe the development and current capabilities of SW2GDML and plans for its enhancement. The aim of this tool is to automate importation of detector engineering models into Geant4-based simulation programs to support rapid, iterative cycles of detector design, simulation, and optimization.
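To illustrate the compactness argument above: a solid emitted as a standard parametric GDML shape takes a few lines, versus thousands of facets for a tessellated equivalent. This abridged fragment uses invented names and dimensions, in the shape of the GDML schema:

```xml
<solids>
  <!-- One parametric box instead of thousands of tessellated facets -->
  <box name="VesselSolid" x="300." y="300." z="600." lunit="mm"/>
</solids>
<structure>
  <volume name="VesselLogical">
    <materialref ref="G4_STAINLESS-STEEL"/>
    <solidref ref="VesselSolid"/>
  </volume>
</structure>
```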

  15. Comparison of GEANT4 very low energy cross section models with experimental data in water.

    PubMed

    Incerti, S; Ivanchenko, A; Karamitros, M; Mantero, A; Moretto, P; Tran, H N; Mascialino, B; Champion, C; Ivanchenko, V N; Bernal, M A; Francis, Z; Villagrasa, C; Baldacchin, G; Guèye, P; Capra, R; Nieminen, P; Zacharatou, C

    2010-09-01

    The GEANT4 general-purpose Monte Carlo simulation toolkit is able to simulate physical interaction processes of electrons, hydrogen and helium atoms with charge states (H0, H+) and (He0, He+, He2+), respectively, in liquid water, the main component of biological systems, down to the electron volt regime and the submicrometer scale, providing GEANT4 users with the so-called "GEANT4-DNA" physics models suitable for microdosimetry simulation applications. The corresponding software has been recently re-engineered in order to provide GEANT4 users with a coherent and unique approach to the simulation of electromagnetic interactions within the GEANT4 toolkit framework (since GEANT4 version 9.3 beta). This work presents a quantitative comparison of these physics models with a collection of experimental data in water collected from the literature. An evaluation of the closeness between the total and differential cross section models available in the GEANT4 toolkit for microdosimetry and experimental reference data is performed using a dedicated statistical toolkit that includes the Kolmogorov-Smirnov statistical test. The authors used experimental data acquired in water vapor as direct measurements in the liquid phase are not yet available in the literature. Comparisons with several recommendations are also presented. The authors have assessed the compatibility of experimental data with GEANT4 microdosimetry models by means of quantitative methods. The results show that microdosimetric measurements in liquid water are necessary to assess quantitatively the validity of the software implementation for the liquid water phase. Nevertheless, a comparison with existing experimental data in water vapor provides a qualitative appreciation of the plausibility of the simulation models. The existing reference data themselves should undergo a critical interpretation and selection, as some of the series exhibit significant deviations from each other. The GEANT4-DNA physics models
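The comparison machinery this abstract mentions rests on the two-sample Kolmogorov-Smirnov statistic: the maximum distance between two empirical cumulative distribution functions. As an illustrative sketch (not the dedicated statistical toolkit the authors used), it can be computed in pure Python:

```python
import bisect

def ks_statistic(sample_a, sample_b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum distance
    between the two empirical cumulative distribution functions."""
    a, b = sorted(sample_a), sorted(sample_b)

    def ecdf(sorted_sample, x):
        # Fraction of points <= x, found by binary search.
        return bisect.bisect_right(sorted_sample, x) / len(sorted_sample)

    # The ECDF difference can only change at observed points.
    return max(abs(ecdf(a, v) - ecdf(b, v)) for v in sorted(set(a) | set(b)))

identical = ks_statistic([1, 2, 3, 4], [1, 2, 3, 4])
disjoint = ks_statistic([0, 1, 2], [10, 11, 12])
print(identical, disjoint)
```

A small statistic indicates that the simulated and measured distributions are compatible; in practice one would convert it to a p-value before drawing conclusions.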

  16. The Geant4 physics validation repository

    NASA Astrophysics Data System (ADS)

    Wenzel, H.; Yarba, J.; Dotti, A.

    2015-12-01

The Geant4 collaboration regularly performs validation and regression tests. The results are stored in a central repository and can be easily accessed via a web application. In this article we describe the Geant4 physics validation repository, which consists of a relational database storing experimental data and Geant4 test results, a Java API, and a web application. The functionality of these components and the technology choices we made are also described.

  17. The Geant4 physics validation repository

    DOE PAGES

    Wenzel, H.; Yarba, J.; Dotti, A.

    2015-12-23

The Geant4 collaboration regularly performs validation and regression tests. The results are stored in a central repository and can be easily accessed via a web application. In this article we describe the Geant4 physics validation repository, which consists of a relational database storing experimental data and Geant4 test results, a Java API, and a web application. Lastly, the functionality of these components and the technology choices we made are also described.

  18. Geant4 simulations of a wide-angle x-ray focusing telescope

    NASA Astrophysics Data System (ADS)

    Zhao, Donghua; Zhang, Chen; Yuan, Weimin; Zhang, Shuangnan; Willingale, Richard; Ling, Zhixing

    2017-06-01

    The rapid development of X-ray astronomy has been made possible by widely deploying X-ray focusing telescopes on board many X-ray satellites. Geant4 is a very powerful toolkit for Monte Carlo simulations and has remarkable abilities to model complex geometrical configurations. However, the library of physical processes available in Geant4 lacks a description of the reflection of X-ray photons at a grazing incident angle which is the core physical process in the simulation of X-ray focusing telescopes. The scattering of low-energy charged particles from the mirror surfaces is another noteworthy process which is not yet incorporated into Geant4. Here we describe a Monte Carlo model of a simplified wide-angle X-ray focusing telescope adopting lobster-eye optics and a silicon detector using the Geant4 toolkit. With this model, we simulate the X-ray tracing, proton scattering and background detection. We find that: (1) the effective area obtained using Geant4 is in agreement with that obtained using Q software with an average difference of less than 3%; (2) X-rays are the dominant background source below 10 keV; (3) the sensitivity of the telescope is better by at least one order of magnitude than that of a coded mask telescope with the same physical dimensions; (4) the number of protons passing through the optics and reaching the detector by Firsov scattering is about 2.5 times that of multiple scattering for the lobster-eye telescope.

  19. Geant4 hadronic physics for space radiation environment.

    PubMed

    Ivantchenko, Anton V; Ivanchenko, Vladimir N; Molina, Jose-Manuel Quesada; Incerti, Sebastien L

    2012-01-01

To test and to develop Geant4 (Geometry And Tracking version 4) Monte Carlo hadronic models with a focus on applications in a space radiation environment. The Monte Carlo simulations have been performed using the Geant4 toolkit. Binary (BIC), its extension for incident light ions (BIC-ion) and Bertini (BERT) cascades were used as the main Monte Carlo generators. For comparison purposes, some other models were tested too. The hadronic testing suite has been used as a primary tool for model development and validation against experimental data. The Geant4 pre-compound (PRECO) and de-excitation (DEE) models were revised and improved. Proton, neutron, pion, and ion nuclear interactions were simulated with the recent version Geant4 9.4 and were compared with experimental data from thin and thick target experiments. The Geant4 toolkit offers a large set of models allowing effective simulation of interactions of particles with matter. We have tested different Monte Carlo generators with our hadronic testing suite and accordingly we can propose an optimal configuration of Geant4 models for the simulation of the space radiation environment.

  20. Geant4-DNA example applications for track structure simulations in liquid water: a report from the Geant4-DNA Project.

    PubMed

    Incerti, S; Kyriakou, I; Bernal, M A; Bordage, M C; Francis, Z; Guatelli, S; Ivanchenko, V; Karamitros, M; Lampe, N; Lee, S B; Meylan, S; Min, C H; Shin, W G; Nieminen, P; Sakata, D; Tang, N; Villagrasa, C; Tran, H; Brown, J M C

    2018-06-14

This Special Report presents a description of Geant4-DNA user applications dedicated to the simulation of track structures (TS) in liquid water and associated physical quantities (e.g. range, stopping power, mean free path). These example applications are included in the Geant4 Monte Carlo toolkit and are available in open access. Each application is described and comparisons to recent international recommendations (e.g. ICRU, MIRD) are shown, when available. The influence of the physics models available in Geant4-DNA for the simulation of electron interactions in liquid water is discussed. Thanks to these applications, the authors show that the most recent sets of physics models available in Geant4-DNA (the so-called "option 4" and "option 6" sets) enable more accurate simulation of stopping powers, dose point kernels and W-values in liquid water than the default set of models ("option 2") initially provided in Geant4-DNA. They also serve as reference applications for Geant4-DNA users interested in TS simulations. This article is protected by copyright. All rights reserved.

  1. GEANT4 and Secondary Particle Production

    NASA Technical Reports Server (NTRS)

    Patterson, Jeff

    2004-01-01

GEANT4 is a Monte Carlo toolkit developed by the high energy physics community (CERN, SLAC, etc.) to perform simulations of complex particle detectors. GEANT4 is an ideal tool to study radiation transport and should be applied to space environments and the complex geometries of modern-day spacecraft.

  2. Allowing for crystalline structure effects in Geant4

    DOE PAGES

    Bagli, Enrico; Asai, Makoto; Dotti, Andrea; ...

    2017-03-24

In recent years, the Geant4 toolkit for the Monte Carlo simulation of the interaction of radiation with matter has seen large growth in its diverse user community. A fundamental aspect of a successful physics experiment is the availability of a reliable and precise simulation code. Geant4 currently does not allow for the simulation of particle interactions with anything other than amorphous matter. To overcome this limitation, the GECO (GEant4 Crystal Objects) project developed a general framework for managing solid-state structures in the Geant4 kernel and validated it against experimental data. As a result, accounting for detailed geometrical structures allows, for example, simulation of diffraction from crystal planes or the channeling of charged particles.

  3. Analysis Tools in Geant4 10.2 and 10.3

    NASA Astrophysics Data System (ADS)

    Hřivnáčová, I.; Barrand, G.

    2017-10-01

    A new analysis category based on g4tools was added in Geant4 release 9.5 (2011). The aim was to provide users with a lightweight analysis tool available as part of the Geant4 installation without the need to link to an external analysis package. It has progressively been included in all Geant4 examples. Frequent questions in the Geant4 users forum show its increasing popularity in the Geant4 users community. In this presentation, we will give a brief overview of g4tools and the analysis category. We report on new developments since our CHEP 2013 contribution as well as mention upcoming new features.

  4. Geant4 Modifications for Accurate Fission Simulations

    NASA Astrophysics Data System (ADS)

    Tan, Jiawei; Bendahan, Joseph

Monte Carlo is one of the methods to simulate the generation and transport of radiation through matter. The most widely used radiation simulation codes are MCNP and Geant4. The simulation of fission production and transport by MCNP has been thoroughly benchmarked. A growing number of users prefer Geant4 due to the flexibility of adding features. However, it has been found that Geant4 does not have the proper fission-production cross sections and does not produce the correct fission products. To achieve accurate results for studies in fissionable material applications, Geant4 was modified to correct these inaccuracies and to add new capabilities. The fission model developed by the Lawrence Livermore National Laboratory was integrated into the neutron-fission modeling package. The photofission simulation capability was enabled using the same neutron-fission library under the assumption that nuclei fission in the same way, independent of the excitation source. The modified fission code provides the correct multiplicity of prompt neutrons and gamma rays, and produces delayed gamma rays and neutrons with time and energy dependencies that are consistent with ENDF/B-VII. The delayed neutrons are now directly produced by a custom package that bypasses the fragment cascade model. The modifications were made for the U-235, U-238 and Pu-239 isotopes; however, the new framework allows new isotopes to be added easily. The SLAC nuclear data library is used for simulation of isotopes with an atomic number above 92 because it is not available in Geant4. Results of the modified Geant4.10.1 package for neutron-fission and photofission prompt and delayed radiation are compared with ENDF/B-VII and with results produced with the original package.

  5. Electron backscattering simulation in Geant4

    NASA Astrophysics Data System (ADS)

    Dondero, Paolo; Mantero, Alfonso; Ivanchencko, Vladimir; Lotti, Simone; Mineo, Teresa; Fioretti, Valentina

    2018-06-01

The backscattering of electrons is a key phenomenon in several physics applications which range from medical therapy to space, including AREMBES, the new ESA simulation framework for radiation background effects. The importance of properly reproducing this complex interaction has grown considerably in recent years, and the Geant4 Monte Carlo simulation toolkit, recently upgraded to version 10.3, is able to comply with the AREMBES requirements in a wide energy range. In this study a validation of the Geant4 electron backscattering models is performed against several sets of experimental data. In addition, a selection of the most recent validation results on the electron scattering processes is also presented. The results of our analysis show a good agreement between simulations and data from several experiments, confirming the Geant4 electron backscattering models to be robust and reliable down to a few tens of electronvolts.

  6. Track structure modeling in liquid water: A review of the Geant4-DNA very low energy extension of the Geant4 Monte Carlo simulation toolkit.

    PubMed

    Bernal, M A; Bordage, M C; Brown, J M C; Davídková, M; Delage, E; El Bitar, Z; Enger, S A; Francis, Z; Guatelli, S; Ivanchenko, V N; Karamitros, M; Kyriakou, I; Maigne, L; Meylan, S; Murakami, K; Okada, S; Payno, H; Perrot, Y; Petrovic, I; Pham, Q T; Ristic-Fira, A; Sasaki, T; Štěpán, V; Tran, H N; Villagrasa, C; Incerti, S

    2015-12-01

    Understanding the fundamental mechanisms involved in the induction of biological damage by ionizing radiation remains a major challenge of today's radiobiology research. The Monte Carlo simulation of physical, physicochemical and chemical processes involved may provide a powerful tool for the simulation of early damage induction. The Geant4-DNA extension of the general purpose Monte Carlo Geant4 simulation toolkit aims to provide the scientific community with an open source access platform for the mechanistic simulation of such early damage. This paper presents the most recent review of the Geant4-DNA extension, as available to Geant4 users since June 2015 (release 10.2 Beta). In particular, the review includes the description of new physical models for the description of electron elastic and inelastic interactions in liquid water, as well as new examples dedicated to the simulation of physicochemical and chemical stages of water radiolysis. Several implementations of geometrical models of biological targets are presented as well, and the list of Geant4-DNA examples is described. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  7. CAD-based Automatic Modeling Method for Geant4 geometry model Through MCAM

    NASA Astrophysics Data System (ADS)

    Wang, Dong; Nie, Fanzhi; Wang, Guozhong; Long, Pengcheng; LV, Zhongliang

    2014-06-01

    Geant4 is a widely used Monte Carlo particle transport simulation package. Before a Geant4 calculation can be run, the geometry model must be established, described either in the Geometry Description Markup Language (GDML) or in C++. However, describing models manually in GDML is time-consuming and error-prone. Automatic modeling methods have been developed recently, but most existing modeling programs have shortcomings; in particular, some are not accurate or are tied to a specific CAD format. To convert complex CAD models into GDML accurately, a Computer Aided Design (CAD) based modeling method for Geant4 was developed. The essence of this method is the translation between the boundary representation (B-REP) used by CAD models and the constructive solid geometry (CSG) representation used by GDML. First, the CAD model is decomposed into simple solids, each having only one closed shell. Each simple solid is then decomposed into a set of convex shells, and the corresponding GDML convex basic solids are generated from the boundary surfaces obtained from the topological characteristics of each convex shell. Finally, the GDML model is assembled from these solids through a series of Boolean operations. This method was adopted in the CAD/Image-based Automatic Modeling Program for Neutronics & Radiation Transport (MCAM) and tested with several models, including the examples in the Geant4 installation package. The results showed that the method converts standard CAD models accurately and can be used for automatic Geant4 modeling.
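
    The CSG side of such a conversion is written out as GDML Boolean solids. As a hand-written illustration of the target format (solid names and dimensions here are invented, not taken from MCAM), a plate with a cylindrical hole can be expressed as a subtraction of two primitive solids:

```xml
<solids>
  <!-- primitive solids, e.g. produced from decomposed CAD shells -->
  <box name="plate" x="100" y="100" z="10" lunit="mm"/>
  <tube name="hole" rmin="0" rmax="20" z="10" deltaphi="360" aunit="deg" lunit="mm"/>
  <!-- Boolean operation assembling the final CSG solid -->
  <subtraction name="plateWithHole">
    <first ref="plate"/>
    <second ref="hole"/>
  </subtraction>
</solids>
```

    A converter of the kind described emits one primitive per convex shell and chains union/subtraction nodes like this to rebuild the original B-REP volume.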

  8. Investigation of OPET Performance Using GATE, a Geant4-Based Simulation Software.

    PubMed

    Rannou, Fernando R; Kohli, Vandana; Prout, David L; Chatziioannou, Arion F

    2004-10-01

    A combined optical positron emission tomography (OPET) system is capable of both optical and PET imaging in the same setting, and it can provide information/interpretation not possible in single-mode imaging. The scintillator array here serves the dual function of coupling the optical signal from bioluminescence/fluorescence to the photodetector and also of channeling optical scintillations from the gamma rays. We report simulation results of the PET part of OPET using GATE, a Geant4 simulation package. The purpose of this investigation is the definition of the geometric parameters of the OPET tomograph. OPET is composed of six detector blocks arranged in a hexagonal ring-shaped pattern with an inner radius of 15.6 mm. Each detector consists of a two-dimensional array of 8 × 8 scintillator crystals each measuring 2 × 2 × 10 mm³. Monte Carlo simulations were performed using the GATE software to measure absolute sensitivity, depth of interaction, and spatial resolution for two ring configurations, with and without gantry rotations, two crystal materials, and several crystal lengths. Images were reconstructed with filtered backprojection after angular interleaving and transverse one-dimensional interpolation of the sinogram. We report absolute sensitivities nearly seven times that of the prototype microPET at the center of field of view and 2.0 mm tangential and 2.3 mm radial resolutions with gantry rotations up to an 8.0 mm radial offset. These performance parameters indicate that the imaging spatial resolution and sensitivity of the OPET system will be suitable for high-resolution and high-sensitivity small-animal PET imaging.
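
    The ring geometry described above (six blocks on a hexagon with a 15.6 mm inner radius) reduces to placing each block at equal angular steps around the axis. A minimal sketch of that placement computation; the function and constant names are invented, not taken from GATE or the OPET code:

```python
import math

INNER_RADIUS_MM = 15.6  # inner ring radius quoted in the abstract
N_BLOCKS = 6            # hexagonal ring

def block_placements(radius=INNER_RADIUS_MM, n=N_BLOCKS):
    """Center (x, y) in mm and in-plane rotation (deg) for each detector
    block face on a regular n-sided ring around the scanner axis."""
    placements = []
    for i in range(n):
        phi = 2.0 * math.pi * i / n
        x = radius * math.cos(phi)
        y = radius * math.sin(phi)
        placements.append((x, y, math.degrees(phi)))
    return placements
```

    Each tuple gives one block's face center and the rotation that keeps the 8 × 8 crystal array pointing at the ring center.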

  9. Monte Carlo simulation of a photodisintegration of 3H experiment in Geant4

    NASA Astrophysics Data System (ADS)

    Gray, Isaiah

    2013-10-01

    An upcoming experiment involving photodisintegration of 3H at the High Intensity Gamma-Ray Source facility at Duke University has been simulated in the software package Geant4. CAD models of silicon detectors and wire chambers were imported from Autodesk Inventor using the program FastRad and the Geant4 GDML importer. Sensitive detectors were associated with the appropriate logical volumes in the exported GDML file so that changes in detector geometry will be easily manifested in the simulation. Probability distribution functions for the energy and direction of outgoing protons were generated using numerical tables from previous theory, and energies and directions were sampled from these distributions using a rejection sampling algorithm. The simulation will be a useful tool to optimize detector geometry, estimate background rates, and test data analysis algorithms. This work was supported by the Triangle Universities Nuclear Laboratory REU program at Duke University.
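
    Rejection sampling of the kind described, drawing energies or angles from an arbitrary tabulated or analytic distribution, can be sketched as follows; the function name and interface are illustrative, not taken from the experiment's code:

```python
import random

def rejection_sample(pdf, x_min, x_max, pdf_max, rng=random.random):
    """Draw one sample from an (unnormalized) pdf on [x_min, x_max].

    Propose x uniformly, then accept it with probability pdf(x)/pdf_max,
    where pdf_max bounds the pdf on the interval. Accepted samples follow
    the target distribution."""
    while True:
        x = x_min + (x_max - x_min) * rng()
        if pdf_max * rng() <= pdf(x):
            return x
```

    For the experiment above, `pdf` would be an interpolation of the theoretical proton energy/angle tables; here any bounded function works.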

  10. Simulation of orientational coherent effects via Geant4

    NASA Astrophysics Data System (ADS)

    Bagli, E.; Asai, M.; Brandt, D.; Dotti, A.; Guidi, V.; Verderi, M.; Wright, D.

    2017-10-01

    Beam manipulation of high- and very-high-energy particle beams is a hot topic in accelerator physics. Coherent effects of ultra-relativistic particles in bent crystals allow the steering of particle trajectories thanks to the strong electric field generated between atomic planes. Recently, a collimation experiment with bent crystals was carried out at the CERN LHC, paving the way to the usage of such technology in current and future accelerators. Geant4 is a widely used object-oriented toolkit for the Monte Carlo simulation of the interaction of particles with matter in high-energy physics; its areas of application also include nuclear and accelerator physics, as well as studies in medical and space science. We present the first Geant4 extension for the simulation of orientational effects in straight and bent crystals for high-energy charged particles. The model allows the manipulation of particle trajectories by means of straight and bent crystals and the scaling of the cross sections of hadronic and electromagnetic processes for channeled particles. Based on this model, an extension of the Geant4 toolkit has been developed. The code and the model have been validated by comparison with published experimental data on deflection efficiency via channeling and on the variation of the rate of inelastic nuclear interactions.

  11. Modeling of microporous silicon betaelectric converter with 63Ni plating in GEANT4 toolkit*

    NASA Astrophysics Data System (ADS)

    Zelenkov, P. V.; Sidorov, V. G.; Lelekov, E. T.; Khoroshko, A. Y.; Bogdanov, S. V.; Lelekov, A. T.

    2016-04-01

    A model of the electron-hole pair generation-rate distribution in the semiconductor is needed to optimize the parameters of a microporous silicon betaelectric converter that uses 63Ni isotope radiation. Using the Monte Carlo methods of the GEANT4 software with ultra-low-energy electron physics models, this distribution in silicon was calculated and approximated with an exponential function, and the optimal pore configuration was estimated.
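
    The exponential approximation mentioned above can be obtained from the simulated depth profile with an ordinary log-linear least-squares fit. A minimal sketch (names invented; the actual fitting procedure used by the authors is not specified):

```python
import math

def fit_exponential(depths, rates):
    """Fit rate ≈ A * exp(-b * depth) by linear least squares on log(rate).

    Taking logs gives log(rate) = log(A) - b * depth, a straight line;
    the slope and intercept then yield b and A."""
    n = len(depths)
    ys = [math.log(r) for r in rates]
    sx = sum(depths)
    sy = sum(ys)
    sxx = sum(d * d for d in depths)
    sxy = sum(d * y for d, y in zip(depths, ys))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = -slope
    a = math.exp((sy + b * sx) / n)
    return a, b
```

    A weighted fit would be more appropriate for noisy Monte Carlo tallies; the unweighted version keeps the idea visible.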

  12. GEANT4 Tuning For pCT Development

    NASA Astrophysics Data System (ADS)

    Yevseyeva, Olga; de Assis, Joaquim T.; Evseev, Ivan; Schelin, Hugo R.; Paschuk, Sergei A.; Milhoretto, Edney; Setti, João A. P.; Díaz, Katherin S.; Hormaza, Joel M.; Lopes, Ricardo T.

    2011-08-01

    Proton beams in medical applications deal with relatively thick targets like the human head or trunk. Thus, the fidelity of proton computed tomography (pCT) simulations as a tool for proton therapy planning depends, in the general case, on the accuracy of results obtained for proton interactions with thick absorbers. As shown previously, GEANT4 simulations of proton energy spectra after passing through thick absorbers do not agree well with existing experimental data. Moreover, the spectra simulated for the Bethe-Bloch domain showed an unexpected sensitivity to the choice of low-energy electromagnetic models during code execution. These observations were made with GEANT4 version 8.2 during our simulations for pCT. This work describes in more detail the simulations of proton passage through aluminum absorbers of varied thickness. The simulations were done by modifying only the geometry in the Hadrontherapy example, for all available choices of electromagnetic physics models. Since the most probable reasons for these effects are some specific feature of the code or some implicit parameters in the GEANT4 manual, we continued our study with version 9.2 of the code, obtaining some improvements over our previous results. The simulations were performed with further applications to pCT development in mind.
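
    Outside of a full simulation, the mean proton energy behind a thick absorber is often estimated with the Bragg-Kleeman range-energy power law R(E) = αE^p. The sketch below uses illustrative water coefficients (α ≈ 0.0022 cm·MeV⁻ᵖ, p ≈ 1.77), not aluminum values, and is only a continuous-slowing-down estimate with no straggling, so it complements rather than replaces the spectral comparisons above:

```python
# Bragg-Kleeman rule: R(E) = ALPHA * E**P, with R in cm and E in MeV.
ALPHA = 0.0022  # cm/MeV^p, assumed illustrative value for water
P = 1.77

def residual_energy(e_in_mev, thickness_cm, alpha=ALPHA, p=P):
    """Mean proton energy after an absorber of the given thickness.

    Subtract the absorber thickness from the incident range, then invert
    the range-energy relation; returns 0.0 if the proton stops."""
    residual_range = alpha * e_in_mev ** p - thickness_cm
    if residual_range <= 0.0:
        return 0.0
    return (residual_range / alpha) ** (1.0 / p)
```

    With these coefficients a 100 MeV proton has a range of roughly 7.6 cm, so a 10 cm absorber stops it entirely.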

  13. Study of low energy neutron beam formation based on GEANT4 simulations

    NASA Astrophysics Data System (ADS)

    Avagyan, R.; Avetisyan, R.; Ivanyan, V.; Kerobyan, I.

    2017-07-01

    The possibility of obtaining thermal/epithermal-energy neutron beams using external protons from the C18/18 cyclotron is studied with GEANT4 simulations. This study will be the basis of Beam Shaping Assembly (BSA) development for future Boron Neutron Capture Therapy (BNCT). Proton-induced reactions on a 9Be target are considered as the neutron source, and the dependence of neutron yield on target thickness is investigated. The problem of reducing the ratio of gamma to neutron yields by inserting a lead sheet after the beryllium target is studied as well. Through GEANT4 modeling, the optimal thicknesses of the 9Be target and the lead absorber are determined, and the design characteristics of the beam shaping assembly, including the materials and thicknesses of the reflector and moderator, are considered.
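
    The effect of the lead sheet on the gamma component can be estimated to first order with narrow-beam exponential attenuation, I/I0 = exp(−μt); fast neutrons are attenuated far less by a thin lead layer, which is why the sheet improves the gamma-to-neutron ratio. The coefficient below is an assumed round number for ~1 MeV photons in lead, not a value from the study:

```python
import math

MU_PB_PER_CM = 0.8  # assumed effective attenuation coefficient (1/cm), ~1 MeV photons in lead

def gamma_transmission(thickness_cm, mu=MU_PB_PER_CM):
    """Fraction of a narrow gamma beam transmitted through a lead sheet,
    I/I0 = exp(-mu * t). Ignores build-up from scattered photons."""
    return math.exp(-mu * thickness_cm)
```

    A full GEANT4 model is still needed to pick the optimal thickness, since the sheet also scatters and slightly attenuates the neutron beam.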

  14. Low-energy electron dose-point kernel simulations using new physics models implemented in Geant4-DNA

    NASA Astrophysics Data System (ADS)

    Bordes, Julien; Incerti, Sébastien; Lampe, Nathanael; Bardiès, Manuel; Bordage, Marie-Claude

    2017-05-01

    When low-energy electrons, such as Auger electrons, interact with liquid water, they induce highly localized ionizing energy depositions over ranges comparable to cell diameters. Monte Carlo track structure (MCTS) codes are suitable tools for performing dosimetry at this level. One of the main MCTS codes, Geant4-DNA, is equipped with only two sets of cross section models for low-energy electron interactions in liquid water ("option 2" and its improved version, "option 4"). To provide Geant4-DNA users with new alternative physics models, a set of cross sections extracted from the CPA100 MCTS code has been added to Geant4-DNA. This new version is hereafter referred to as "Geant4-DNA-CPA100". In this study, "Geant4-DNA-CPA100" was used to calculate low-energy electron dose-point kernels (DPKs) between 1 keV and 200 keV. Such kernels represent the radial energy deposited by an isotropic point source, a parameter that is useful for dosimetry calculations in nuclear medicine. In order to assess the influence of different physics models on DPK calculations, DPKs were calculated using the existing Geant4-DNA models ("option 2" and "option 4"), the newly integrated CPA100 models, and the PENELOPE Monte Carlo code used in step-by-step mode for monoenergetic electrons. Additionally, two sets of DPKs simulated with "Geant4-DNA-CPA100" were compared: the first using Geant4's default settings, and the second using the default settings of the original CPA100 code. A maximum difference of 9.4% was found between the Geant4-DNA-CPA100 and PENELOPE DPKs. Between the two existing Geant4-DNA models, slight differences between 1 keV and 10 keV were observed. Notably, the DPKs simulated with the two existing Geant4-DNA models were always broader than those generated with "Geant4-DNA-CPA100". The discrepancies observed between the DPKs generated using Geant4-DNA's existing models and "Geant4-DNA-CPA100" were caused solely by their different cross
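
    A dose-point kernel of this kind is scored by binning energy depositions around the point source into concentric spherical shells and normalizing to the total deposited energy. A minimal sketch of that scoring step (the interface is invented, not Geant4-DNA's API):

```python
import math

def dose_point_kernel(hits, n_bins, r_max):
    """Score energy depositions (x, y, z, e_dep), from a point source at
    the origin, into n_bins spherical shells of equal width out to r_max.

    Returns the fraction of the total deposited energy in each shell;
    depositions beyond r_max count toward the total but not the shells."""
    shells = [0.0] * n_bins
    total = 0.0
    for x, y, z, e in hits:
        r = math.sqrt(x * x + y * y + z * z)
        total += e
        if r < r_max:
            shells[int(n_bins * r / r_max)] += e
    return [e / total for e in shells]
```

    In practice each shell fraction is further divided by the shell mass to obtain dose, and r is often expressed in units of the CSDA range so kernels at different energies can be overlaid.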

  15. Refined lateral energy correction functions for the KASCADE-Grande experiment based on Geant4 simulations

    NASA Astrophysics Data System (ADS)

    Gherghel-Lascu, A.; Apel, W. D.; Arteaga-Velázquez, J. C.; Bekk, K.; Bertaina, M.; Blümer, J.; Bozdog, H.; Brancus, I. M.; Cantoni, E.; Chiavassa, A.; Cossavella, F.; Daumiller, K.; de Souza, V.; Di Pierro, F.; Doll, P.; Engel, R.; Engler, J.; Fuchs, B.; Fuhrmann, D.; Gils, H. J.; Glasstetter, R.; Grupen, C.; Haungs, A.; Heck, D.; Hörandel, J. R.; Huber, D.; Huege, T.; Kampert, K.-H.; Kang, D.; Klages, H. O.; Link, K.; Łuczak, P.; Mathes, H. J.; Mayer, H. J.; Milke, J.; Mitrica, B.; Morello, C.; Oehlschläger, J.; Ostapchenko, S.; Palmieri, N.; Petcu, M.; Pierog, T.; Rebel, H.; Roth, M.; Schieler, H.; Schoo, S.; Schröder, F. G.; Sima, O.; Toma, G.; Trinchero, G. C.; Ulrich, H.; Weindl, A.; Wochele, J.; Zabierowski, J.

    2015-02-01

    In previous studies of KASCADE-Grande data, a Monte Carlo simulation code based on the GEANT3 program has been developed to describe the energy deposited by EAS particles in the detector stations. In an attempt to decrease the simulation time and ensure compatibility with the geometry description in standard KASCADE-Grande analysis software, several structural elements have been neglected in the implementation of the Grande station geometry. To improve the agreement between experimental and simulated data, a more accurate simulation of the response of the KASCADE-Grande detector is necessary. A new simulation code has been developed based on the GEANT4 program, including a realistic geometry of the detector station with structural elements that have not been considered in previous studies. The new code is used to study the influence of a realistic detector geometry on the energy deposited in the Grande detector stations by particles from EAS events simulated by CORSIKA. Lateral Energy Correction Functions are determined and compared with previous results based on GEANT3.

  16. Benchmarking Geant4 for simulating galactic cosmic ray interactions within planetary bodies

    DOE PAGES

    Mesick, K. E.; Feldman, W. C.; Coupland, D. D. S.; ...

    2018-06-20

    Galactic cosmic rays undergo complex nuclear interactions with nuclei within planetary bodies that have little to no atmosphere. Radiation transport simulations are a key tool for understanding the neutron and gamma-ray albedo coming from these interactions and for tracing these signals back to the geochemical composition of the target. In this paper, we study the validity of the code Geant4 for simulating such interactions by comparing simulation results to data from the Apollo 17 Lunar Neutron Probe Experiment. Different assumptions regarding the physics are explored to demonstrate how these impact the Geant4 simulation results. In general, all of the Geant4 results over-predict the data; however, certain physics lists perform better than others. Finally, we show that results from the radiation transport code MCNP6 are similar to those obtained using Geant4.

  18. N values estimation based on photon flux simulation with Geant4 toolkit.

    PubMed

    Sun, Z J; Danjaji, M; Kim, Y

    2018-06-01

    N values are routinely introduced in photon activation analysis (PAA) as ratios of the specific activities of product nuclides, used to compare the relative intensities of different reaction channels. They determine the individual activities of each radioisotope and the total activity of the sample, which are the primary concerns of radiation safety. Traditionally, N values are calculated from gamma spectroscopy of real measurements by normalizing the activities of individual nuclides to the reference reaction [58Ni(γ,n)57Ni] of a nickel monitor irradiated simultaneously during photon activation. Is it possible to use a photon flux simulated by Monte Carlo software to calculate N values even before the actual irradiation starts? This study applied the Geant4 toolkit, a popular platform for simulating the passage of particles through matter, to generate the photon flux in the samples. Combined with photonuclear cross sections from the IAEA database, it is feasible to predict N values in different experimental setups for a simulated target material. We have validated this method and its consistency with Geant4. Results also show that N values are highly correlated with the beam parameters of the incoming electrons and with the setup of the electron-photon converter. Copyright © 2018 Elsevier Ltd. All rights reserved.
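
    The prediction scheme reduces to folding the simulated photon flux with photonuclear cross sections on a common energy grid and normalizing to the monitor reaction. A deliberately simplified sketch (it ignores decay during irradiation, target masses and counting times, which all enter a real N-value calculation; the names are invented):

```python
def reaction_rate(flux, xs, de):
    """Rate per target atom: sum of flux(E) * sigma(E) * dE over a common,
    uniformly spaced energy grid (a simple rectangle-rule integral)."""
    return sum(f * s for f, s in zip(flux, xs)) * de

def n_value(flux, xs_product, xs_monitor, de):
    """Ratio of a product channel's rate to the monitor reaction's rate,
    e.g. normalizing to 58Ni(gamma,n)57Ni in the same simulated flux."""
    return reaction_rate(flux, xs_product, de) / reaction_rate(flux, xs_monitor, de)
```

    With a Geant4-simulated `flux` and IAEA cross-section tables interpolated onto the same grid, this ratio is the quantity the study predicts before irradiation.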

  19. Comparison of GATE/GEANT4 with EGSnrc and MCNP for electron dose calculations at energies between 15 keV and 20 MeV.

    PubMed

    Maigne, L; Perrot, Y; Schaart, D R; Donnarieix, D; Breton, V

    2011-02-07

    The GATE Monte Carlo simulation platform based on the GEANT4 toolkit has come into widespread use for simulating positron emission tomography (PET) and single photon emission computed tomography (SPECT) imaging devices. Here, we explore its use for calculating electron dose distributions in water. Mono-energetic electron dose point kernels and pencil beam kernels in water are calculated for different energies between 15 keV and 20 MeV by means of GATE 6.0, which makes use of the GEANT4 version 9.2 Standard Electromagnetic Physics Package. The results are compared to the well-validated codes EGSnrc and MCNP4C. It is shown that recent improvements made to the GEANT4/GATE software result in significantly better agreement with the other codes. We furthermore illustrate several issues of general interest to GATE and GEANT4 users who wish to perform accurate simulations involving electrons. Provided that the electron step size is sufficiently restricted, GATE 6.0 and EGSnrc dose point kernels are shown to agree to within less than 3% of the maximum dose between 50 keV and 4 MeV, while pencil beam kernels are found to agree to within less than 4% of the maximum dose between 15 keV and 20 MeV.

  20. Modeling the relativistic runaway electron avalanche and the feedback mechanism with GEANT4

    PubMed Central

    Skeltved, Alexander Broberg; Østgaard, Nikolai; Carlson, Brant; Gjesteland, Thomas; Celestin, Sebastien

    2014-01-01

    This paper presents the first study that uses the GEometry ANd Tracking 4 (GEANT4) toolkit to do quantitative comparisons with other modeling results related to the production of terrestrial gamma ray flashes and high-energy particle emission from thunderstorms. We study the relativistic runaway electron avalanche (RREA) and the relativistic feedback process, as well as the production of bremsstrahlung photons from runaway electrons. The Monte Carlo simulations take into account the effects of electron ionization, electron-electron (Møller) and electron-positron (Bhabha) scattering, as well as the bremsstrahlung process and pair production, in the 250 eV to 100 GeV energy range. Our results indicate that the multiplication of electrons during the development of RREAs and under the influence of feedback is consistent with previous estimates. This is important to validate GEANT4 as a tool to model RREAs and feedback in homogeneous electric fields. We also determine the ratio of bremsstrahlung photons to energetic electrons, Nγ/Ne. We then show that this ratio depends on the electric field, which can be expressed through the avalanche time τ(E) and the bremsstrahlung coefficient α(ε). In addition, we present comparisons of GEANT4 simulations performed with a "standard" and a "low-energy" physics list, both validated in the 1 keV to 100 GeV energy range. This comparison shows that the choice of physics list used in GEANT4 simulations has a significant effect on the results. Key Points: testing the feedback mechanism with GEANT4; validating the GEANT4 programming toolkit; studying the ratio of bremsstrahlung photons to electrons at TGF source altitude. PMID: 26167437
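
    The two growth laws discussed, exponential RREA multiplication over an avalanche length λ and the geometric growth of successive feedback generations, can be written down directly. A toy sketch with invented names; λ and the feedback factor depend on the electric field and are treated as inputs here, not computed:

```python
import math

def rrea_multiplication(path_m, avalanche_length_m):
    """Runaway-electron number relative to the seed population after
    traversing path_m in a homogeneous field: N/N0 = exp(L / lambda)."""
    return math.exp(path_m / avalanche_length_m)

def total_avalanches(feedback_factor, n_generations):
    """Relative number of avalanches summed over feedback generations,
    sum over k of gamma**k; as gamma -> 1 the sum grows without bound,
    i.e. the discharge becomes self-sustained."""
    return sum(feedback_factor ** k for k in range(n_generations + 1))
```

    For a sub-critical feedback factor the series converges to 1/(1 − γ), which is why weak feedback only rescales the photon output rather than changing its character.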

  1. SU-E-T-565: RAdiation Resistance of Cancer CElls Using GEANT4 DNA: RACE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perrot, Y; Payno, H; Delage, E

    2014-06-01

    Purpose: The objective of the RACE project is to develop a comparison between Monte Carlo simulations using the Geant4-DNA toolkit and measurements of radiation damage on 3D melanoma and chondrosarcoma culture cells coupled with gadolinium nanoparticles. We present here the current status of the simulation developments. Methods: Monte Carlo studies are driven using the Geant4 toolkit and the Geant4-DNA extension. In order to model the geometry of a cell population, the open-source CPOP++ program is being developed for the geometrical representation of 3D cell populations, including a specific cell mesh coupled with a multi-agent system. Each cell includes a cytoplasm and a nucleus. The correct modeling of the cell population has been validated against confocal microscopy images of spheroids. The Geant4 Livermore physics models are used to simulate the interactions of a 250 keV X-ray beam and the production of secondaries from gadolinium nanoparticles assumed to be fixed on the cell membranes. Geant4-DNA processes are used to simulate the interactions of charged particles with the cells. An atomistic description of the DNA molecule, from PDB (Protein Data Bank) files, is provided by the PDB4DNA Geant4 user application we developed to score energy depositions in DNA base pairs and sugar-phosphate groups. Results: At the microscopic level, our simulations enable assessing the microscopic energy distribution in each cell compartment of a realistic 3D cell population. Dose enhancement factors due to the presence of gadolinium nanoparticles can be estimated. At the nanometer scale, direct damage to nuclear DNA is also estimated. Conclusion: We successfully simulated the impact of direct radiation on a realistic 3D cell population model compatible with microdosimetry calculations using the Geant4-DNA toolkit. Upcoming validation and the future integration of the radiochemistry module of Geant4-DNA will propose to correlate clusters of ionizations with in

  2. GeantV: from CPU to accelerators

    NASA Astrophysics Data System (ADS)

    Amadio, G.; Ananya, A.; Apostolakis, J.; Arora, A.; Bandieramonte, M.; Bhattacharyya, A.; Bianchini, C.; Brun, R.; Canal, P.; Carminati, F.; Duhem, L.; Elvira, D.; Gheata, A.; Gheata, M.; Goulas, I.; Iope, R.; Jun, S.; Lima, G.; Mohanty, A.; Nikitina, T.; Novak, M.; Pokorski, W.; Ribon, A.; Sehgal, R.; Shadura, O.; Vallecorsa, S.; Wenzel, S.; Zhang, Y.

    2016-10-01

    The GeantV project aims to research and develop the next-generation simulation software describing the passage of particles through matter. While modern CPU architectures are being targeted first, resources such as GPGPUs, Intel Xeon Phi, Atom or ARM can no longer be ignored by HEP CPU-bound applications. The proof-of-concept GeantV prototype has been engineered mainly for CPUs with vector units, but from the early stages we have foreseen a bridge to arbitrary accelerators. A software layer consisting of architecture/technology-specific backends currently supports this concept. This approach makes it possible to abstract out basic types such as scalar/vector, and also to formalize generic computation kernels that transparently use library- or device-specific constructs based on Vc, CUDA, Cilk+ or Intel intrinsics. While the main goal of this approach is portable performance, it comes with the bonus of insulating the core application and algorithms from the technology layer. This keeps our application long-term maintainable and adaptable to changes on the backend side. The paper presents the first results of basket-based GeantV geometry navigation on the Intel Xeon Phi KNC architecture. We present the scalability and vectorization study, conducted using Intel performance tools, as well as our preliminary conclusions on the use of accelerators for GeantV transport. We also describe the current work and preliminary results for using the GeantV transport kernel on GPUs.
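
    The scalar/vector abstraction described can be mimicked in a few lines: a kernel written against generic arithmetic runs unchanged on a scalar backend (plain floats) or on a vector backend (NumPy arrays standing in for Vc-style SIMD types). This is only an analogy to the GeantV design, not its actual C++ backend API:

```python
import numpy as np

def propagate(x, v, dt, n_steps):
    """Backend-agnostic kernel: advance positions x with velocities v.

    The arithmetic is written once; it executes on a 'scalar backend'
    (Python floats) or per-element on a 'vector backend' (NumPy arrays),
    mirroring how GeantV kernels are generic over backend types."""
    for _ in range(n_steps):
        x = x + v * dt
    return x

# scalar backend: one track at a time
scalar_result = propagate(0.0, 1.0, 0.1, 10)
# vector backend: one call advances a whole basket of tracks
vector_result = propagate(np.zeros(4), np.full(4, 2.0), 0.5, 3)
```

    In GeantV the backend choice is made at compile time via templates, so the same kernel source can target Vc, CUDA or plain scalars without branching in the hot loop.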

  3. GeantV: From CPU to accelerators

    DOE PAGES

    Amadio, G.; Ananya, A.; Apostolakis, J.; ...

    2016-01-01

    The GeantV project aims to research and develop the next-generation simulation software describing the passage of particles through matter. While modern CPU architectures are being targeted first, resources such as GPGPUs, Intel Xeon Phi, Atom or ARM can no longer be ignored by HEP CPU-bound applications. The proof-of-concept GeantV prototype has been engineered mainly for CPUs with vector units, but from the early stages we have foreseen a bridge to arbitrary accelerators. A software layer consisting of architecture/technology-specific backends currently supports this concept. This approach makes it possible to abstract out basic types such as scalar/vector, and also to formalize generic computation kernels that transparently use library- or device-specific constructs based on Vc, CUDA, Cilk+ or Intel intrinsics. While the main goal of this approach is portable performance, it comes with the bonus of insulating the core application and algorithms from the technology layer. This keeps our application long-term maintainable and adaptable to changes on the backend side. The paper presents the first results of basket-based GeantV geometry navigation on the Intel Xeon Phi KNC architecture. We present the scalability and vectorization study, conducted using Intel performance tools, as well as our preliminary conclusions on the use of accelerators for GeantV transport. Lastly, we also describe the current work and preliminary results for using the GeantV transport kernel on GPUs.

  4. SU-F-T-149: Development of the Monte Carlo Simulation Platform Using Geant4 for Designing Heavy Ion Therapy Beam Nozzle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shin, Jae-ik; Yoo, SeungHoon; Cho, Sungho

    Purpose: A significant issue in particle therapy with protons and carbon ions is accurate dose delivery from the beam line to the patient. When designing the complex delivery system, Monte Carlo simulation can be used to model the various physical interactions in scatterers and filters. In this report, we present the development of a Monte Carlo simulation platform, based on Geant4, to help design a prototype particle therapy nozzle. We also show the prototype design of the particle therapy beam nozzle for the Korea Heavy Ion Medical Accelerator (KHIMA) project at the Korea Institute of Radiological and Medical Sciences (KIRAMS), Republic of Korea. Methods: We developed a simulation platform for the particle therapy beam nozzle using Geant4. On this platform, a prototype nozzle design of the scanning system for carbon was implemented. For comparison with theoretical beam optics, the lateral beam profile at the isocenter is compared with the Monte Carlo simulation result. From this analysis, we can estimate the beam spot properties of the KHIMA system and implement spot-size optimization for our spot scanning system. Results: To study the characteristics of the scanning system, various combinations of the spot size from the accelerator with the ridge filter and beam monitor were tested as a simple design for the KHIMA dose delivery system. Conclusion: In this report, we presented part of the simulation platform and the characteristics study. This study is ongoing, with the aim of developing a simulation platform that includes the beam nozzle and a dose verification tool with the treatment planning system. Results will be presented as they become available.

  5. Multiple scattering of 13 and 20 MeV electrons by thin foils: a Monte Carlo study with GEANT, Geant4, and PENELOPE.

    PubMed

    Vilches, M; García-Pareja, S; Guerrero, R; Anguiano, M; Lallena, A M

    2009-09-01

    In this work, recent results from experiments and simulations (with EGSnrc) performed by Ross et al. [Med. Phys. 35, 4121-4131 (2008)] on electron scattering by foils of different materials and thicknesses are compared to those obtained using several Monte Carlo codes. Three codes have been used: GEANT (version 3.21), Geant4 (version 9.1, patch 03), and PENELOPE (version 2006). In the case of PENELOPE, both mixed and fully detailed simulations have been carried out. Transverse dose distributions in air have been obtained for comparison with the measurements. The detailed PENELOPE simulations show excellent agreement with experiment. The calculations performed with GEANT and PENELOPE (mixed) agree with experiment within 3%, except for the Be foil. In the case of Geant4, the distributions are 5% narrower than the experimental ones, though the agreement is very good for the Be foil. The transverse dose distribution in water obtained with PENELOPE (mixed) is 4% wider than that calculated by Ross et al. using EGSnrc, and 1% narrower than the transverse dose distribution in air considered in the experiment. All the codes give reasonable agreement (within 5%) with the experimental results for all the materials and thicknesses studied.

  6. Design and optimization of an energy degrader with a multi-wedge scheme based on Geant4

    NASA Astrophysics Data System (ADS)

    Liang, Zhikai; Liu, Kaifeng; Qin, Bin; Chen, Wei; Liu, Xu; Li, Dong; Xiong, Yongqian

    2018-05-01

    A proton therapy facility based on an isochronous superconducting cyclotron is under construction at Huazhong University of Science and Technology (HUST). To meet clinical requirements, an energy degrader is essential in the beamline to modulate the fixed beam energy extracted from the cyclotron. Because of multiple Coulomb scattering in the degrader, the beam emittance and the energy spread increase considerably during the energy degradation process. Therefore, a set of collimators is designed to restrict the emittance growth after degradation; the energy spread is reduced in the downstream beam line, which is not discussed here. In this paper, the design considerations for the energy degrader and collimators are introduced, and the degrader material properties, the degrader structure and the initial beam parameters are studied using the Geant4 Monte Carlo toolkit, with the main purpose of improving the overall performance of the degrader through multi-parameter optimization.
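
    The emittance growth driving this design comes from multiple Coulomb scattering, commonly estimated with the Highland/PDG formula θ0 = (13.6 MeV / βcp) · z · √(x/X0) · [1 + 0.038 ln(x/X0)]. A sketch of that single formula, using its simpler logarithmic term (the degrader study itself relies on full Geant4 tracking, not this approximation):

```python
import math

def highland_theta0(p_mev_c, beta, thickness, rad_length, charge=1.0):
    """Highland estimate of the RMS projected multiple-scattering angle
    (radians) for a particle of momentum p (MeV/c), speed beta, and charge
    number z crossing thickness t of a material with radiation length X0
    (same length units for t and X0)."""
    ratio = thickness / rad_length
    return (13.6 / (beta * p_mev_c)) * charge * math.sqrt(ratio) \
        * (1.0 + 0.038 * math.log(ratio))
```

    The √(x/X0) dependence is why low-Z degrader materials (large X0, e.g. graphite) scatter less per unit energy loss than high-Z ones, a key driver of degrader material choice.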

  7. Comparison of CdZnTe neutron detector models using MCNP6 and Geant4

    NASA Astrophysics Data System (ADS)

    Wilson, Emma; Anderson, Mike; Prendergasty, David; Cheneler, David

    2018-01-01

    The production of accurate detector models is of high importance in the development and use of detectors. Initially, MCNP and Geant were developed to specialise in neutral particle models and accelerator models, respectively; there is now a greater overlap of the capabilities of both, and it is therefore useful to produce comparative models to evaluate detector characteristics. In a collaboration between Lancaster University, UK, and Innovative Physics Ltd., UK, models have been developed in both MCNP6 and Geant4 of Cadmium Zinc Telluride (CdZnTe) detectors developed by Innovative Physics Ltd. Herein, a comparison is made of the relative strengths of MCNP6 and Geant4 for modelling neutron flux and secondary γ-ray emission. Given the increasing overlap of the modelling capabilities of MCNP6 and Geant4, it is worthwhile to comment on differences in results for simulations which have similarities in terms of geometries and source configurations.

  8. Geant4 models for simulation of hadron/ion nuclear interactions at moderate and low energies.

    NASA Astrophysics Data System (ADS)

    Ivantchenko, Anton; Ivanchenko, Vladimir; Quesada, Jose-Manuel; Wright, Dennis

    The Geant4 toolkit is intended for Monte Carlo simulation of particle transport in media. It was initially designed for high-energy physics purposes such as experiments at the Large Hadron Collider (LHC) at CERN. The toolkit offers a set of models allowing effective simulation of cosmic ray interactions with different materials. For moderate- and low-energy hadron/ion interactions with nuclei there are a number of competing models: the Binary and Bertini intra-nuclear cascade models, the quantum molecular dynamics model (QMD), the INCL/ABLA cascade model, and the Chiral Invariant Phase Space decay model (CHIPS). We report the status of these models for the recent version of Geant4 (release 9.3, December 2009). The Bertini cascade internal cross sections were upgraded. The native Geant4 precompound and de-excitation models were used in the Binary cascade and QMD. They were significantly improved, including emission of light fragments, the Fermi break-up model, the General Evaporation Model (GEM), the multi-fragmentation model, and the fission model. Comparisons between model predictions and thin-target data for neutron, proton, light-ion, and isotope production are presented and discussed. These validations focus on target materials important for space missions.

  9. Geant4 hadronic physics validation with ATLAS Tile Calorimeter test-beam data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alexa, C.; Constantinescu, S.; Dita, S.

    We present comparison studies between Geant4 shower packages and ATLAS Tile Calorimeter test-beam data collected at CERN in the H8 beam line at the SPS. Emphasis is placed on hadronic physics lists and on differences between the Tilecal response to pions and protons of the same energy. The ratio of the pure hadronic fraction of pions to that of protons, F_h^π/F_h^p, was estimated with Tilecal test-beam data and compared with Geant4 simulations.

  10. A Geant4 model of backscatter security imaging systems

    NASA Astrophysics Data System (ADS)

    Leboffe, Eric Matthew

    The operating characteristics of x-ray security scanner systems that use the backscatter signal to detect person-borne threats have never been made fully available to the general public. By designing a model in Geant4, studies can be performed that shed light on such security scanners and allow analysis of the performance and safety of the system without access to any system data. Although these systems are no longer in use at airports in the United States, the ability to design and validate detector models and phenomena is an important capability that applies to many current real-world applications. The model presented provides estimates of absorbed dose, effective dose, and dose-depth distribution that are comparable to previously published work, and explores the imaging capabilities of the system embodiment modeled.

  11. Application of TDCR-Geant4 modeling to standardization of 63Ni.

    PubMed

    Thiam, C; Bobin, C; Chauvenet, B; Bouchard, J

    2012-09-01

    As an alternative to the classical TDCR model applied to liquid scintillation (LS) counting, a stochastic approach based on the Geant4 toolkit is presented for the simulation of light emission inside the dedicated three-photomultiplier detection system. To this end, the Geant4 modeling includes a comprehensive description of the optical properties of each material constituting the optical chamber. The objective is to simulate the propagation of optical photons from their creation in the LS cocktail to the production of photoelectrons in the photomultipliers. First validated for the case of radionuclide standardization based on Cerenkov emission, the scintillation process has been added to the TDCR-Geant4 modeling using the Birks expression in order to account for the light-emission nonlinearity owing to ionization quenching. The scintillation yield of the commercial Ultima Gold LS cocktail has been determined from double-coincidence detection efficiencies obtained for 60Co and 54Mn with the 4π(LS)β-γ coincidence method. In this paper, the stochastic TDCR modeling is applied to the standardization of 63Ni (a pure β- emitter; E_max = 66.98 keV), and the activity concentration is compared with the result given by the classical model. Copyright © 2012 Elsevier Ltd. All rights reserved.
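
    The Birks expression referenced above models ionization quenching as a saturation of light output with ionization density, dL/dE = S / (1 + kB·dE/dx). A minimal sketch; the kB value is a typical organic-scintillator figure assumed here, not the Ultima Gold parameter from the paper:

```python
def birks_light_per_mev(dedx_mev_per_cm, s=1.0, kb=0.0126):
    """Birks' law: light yield per unit deposited energy,
    L/E = S / (1 + kB * dE/dx). kB in cm/MeV (assumed typical value)."""
    return s / (1.0 + kb * dedx_mev_per_cm)

# low-LET (electron-like) vs high-LET (alpha-like) energy deposits
q_electron = birks_light_per_mev(2.0)     # ~2 MeV/cm, near minimum ionizing
q_alpha = birks_light_per_mev(1000.0)     # very high ionization density
```

    The same deposited energy therefore yields far less light for densely ionizing particles, which is the nonlinearity the TDCR-Geant4 model corrects for.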

  12. Energy deposition and neutron flux study in a gravity-driven dense granular target (DGT) with GEANT4 toolkit

    NASA Astrophysics Data System (ADS)

    Zhao, Qiang; Cui, Wenjuan; He, Zhiyong; Zhang, Xueying; Ma, Wenjing

    2018-07-01

    The China initiative Accelerator Driven System (CiADS) has been approved as a strategic plan to build an ADS demonstration facility in the next few years. It proposes a new concept for a high-power spallation target: the gravity-driven dense granular target (DGT). As with a monolithic target (MT), whether solid or liquid, energy deposition and neutron flux are two critical issues. In this paper, we focus on these two issues and aim to provide valuable results for the project. Unlike a solid target, the internal geometry of a DGT is very complicated. To approximate reality as closely as possible, we designed an algorithm and first packed the grains randomly in a cylindrical container in the GEANT4 software. The packing result was in good agreement with experimentally measured results, showing that the algorithm is practicable. Next, all the simulations of energy deposition and neutron flux in the DGT were performed with GEANT4, and the results were compared with data for a MT. Compared to a MT, a DGT has clear advantages in terms of both energy deposition and neutron flux. In addition, simulations with different grain radii were performed. A careful analysis of the results shows that both the energy deposition and the neutron flux are nearly independent of the grain radius in the range 0.5 mm-5 mm when the packing density is the same.
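
    The overlap-rejection core of such a packing algorithm can be sketched with random sequential addition: propose a uniform random center inside the cylinder and accept it only if the new grain overlaps no previously placed grain. This is an illustration of the geometric test only, not the paper's algorithm; note that RSA saturates well below the roughly 60% packing fraction of real gravity-settled beds:

```python
import math, random

def pack_spheres(cyl_radius, cyl_height, grain_radius, attempts=5000, seed=1):
    """Random sequential addition of non-overlapping spheres in a cylinder.
    Returns the accepted centers and the packing fraction of the cylinder."""
    rng = random.Random(seed)
    centers = []
    r_max = cyl_radius - grain_radius
    for _ in range(attempts):
        # uniform random center fully inside the cylinder
        rho = r_max * math.sqrt(rng.random())
        phi = rng.uniform(0.0, 2.0 * math.pi)
        z = rng.uniform(grain_radius, cyl_height - grain_radius)
        c = (rho * math.cos(phi), rho * math.sin(phi), z)
        # reject any proposal that overlaps an already-placed grain
        if all((c[0]-o[0])**2 + (c[1]-o[1])**2 + (c[2]-o[2])**2
               >= (2.0 * grain_radius)**2 for o in centers):
            centers.append(c)
    grain_vol = 4.0 / 3.0 * math.pi * grain_radius**3
    cyl_vol = math.pi * cyl_radius**2 * cyl_height
    return centers, len(centers) * grain_vol / cyl_vol

centers, fraction = pack_spheres(5.0, 10.0, 1.0)
```

    A denser, gravity-driven bed would need a settling step (e.g. dropping and relaxing grains), but the pairwise overlap check stays the same.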

  13. GEANT4 and PHITS simulations of the shielding of neutrons from the 252Cf source

    NASA Astrophysics Data System (ADS)

    Shin, Jae Won; Hong, Seung-Woo; Bak, Sang-In; Kim, Do Yoon; Kim, Chong Yeal

    2014-09-01

    Monte Carlo simulations are performed using GEANT4 and PHITS to study the neutron-shielding abilities of several materials: graphite, iron, polyethylene, NS-4-FR and KRAFTON-HB. A 252Cf neutron source is considered. For the GEANT4 simulations, high-precision (G4HP) models with G4NDL 4.2, based on ENDF/B-VII data, are used. For the PHITS simulations, the JENDL-4.0 library is used. The neutron-dose-equivalent rates with and without the five shielding materials are estimated and compared with experimental values. The differences between the shielding abilities calculated with GEANT4 (G4NDL 4.2) and PHITS (JENDL-4.0) are found not to be significant for any of the cases considered in this work. The neutron-dose-equivalent rates obtained with GEANT4 and PHITS are also compared with experimental data and other simulation results. Our neutron-dose-equivalent rates agree with the experimental values to within 20%, except for polyethylene, for which the discrepancies between our calculations and the experiments are less than 40%, as also observed in other simulation results.
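
    As a rough intuition for such comparisons, neutron dose rates behind a slab are often estimated with a removal cross section, and code-to-experiment agreement is then judged by a relative tolerance like the 20%/40% figures above. A toy sketch; the Sigma_R value is an assumed, illustrative number, not one from the paper:

```python
import math

def shielded_dose_rate(unshielded_rate, removal_xs_per_cm, thickness_cm):
    """Removal-cross-section estimate: D(t) = D0 * exp(-Sigma_R * t)."""
    return unshielded_rate * math.exp(-removal_xs_per_cm * thickness_cm)

def agrees_within(simulated, measured, tolerance):
    """Relative agreement criterion, e.g. tolerance=0.2 for 'within 20%'."""
    return abs(simulated - measured) <= tolerance * measured

# assumed illustrative values: Sigma_R ~ 0.12 /cm (polyethylene-like), 10 cm slab
d10 = shielded_dose_rate(100.0, 0.12, 10.0)
```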

  14. Modeling the Martian neutron and gamma-ray leakage fluxes using Geant4

    NASA Astrophysics Data System (ADS)

    Pirard, Benoit; Desorgher, Laurent; Diez, Benedicte; Gasnault, Olivier

    A new evaluation of the Martian neutron and gamma-ray (continuum and line) leakage fluxes has been performed using the Geant4 code. Although numerous studies have recently been carried out with Monte Carlo methods to characterize planetary radiation environments, only a few have been able to reproduce in detail the neutron and gamma-ray spectra observed in orbit. We report on the efforts performed to adapt and validate the Geant4-based PLANETOCOSMICS code for use in planetary neutron and gamma-ray spectroscopy data analysis. Besides the advantage of the high transparency and modularity common to Geant4 applications, the new code uses reviewed nuclear cross-section data, realistic atmospheric profiles and soil layering, as well as specific effects such as gravitational acceleration for low-energy neutrons. Results from first simulations are presented for some Martian reference compositions and show high consistency with the corresponding neutron and gamma-ray spectra measured on board Mars Odyssey. Finally, we discuss the advantages and perspectives of the improved code for precise simulation of planetary radiation environments.

  15. SU-E-I-77: X-Ray Coherent Scatter Diffraction Pattern Modeling in GEANT4.

    PubMed

    Kapadia, A; Samei, E; Harrawood, B; Sahbaee, P; Chawla, A; Tan, Z; Brady, D

    2012-06-01

    To model X-ray coherent scatter diffraction patterns in GEANT4 for simulating experiments involving material detection through diffraction pattern measurement. Although coherent scatter cross sections are modeled accurately in GEANT4, diffraction patterns for crystalline materials are not yet included. Here we describe our modeling of crystalline diffraction patterns in GEANT4 for specific materials and the validation of the results against experimentally measured data. Coherent scatter in GEANT4 is currently based on Hubbell's non-relativistic form factor tabulations from EPDL97. We modified the form factors by introducing an interference function that accounts for the dependence of the Rayleigh scattering angle on the photon wavelength. The modified form factors were used to replace the inherent form factors in GEANT4. The simulation was tested using monochromatic and polychromatic x-ray beams (separately) incident on objects containing one or more elements with modified form factors. For validation, the simulation results were compared against experimentally measured diffraction images of corresponding objects, acquired with an in-house x-ray diffraction imager. The comparison used the following metrics: number of diffraction rings, radial distance, absolute intensity, and relative intensity. Sharp diffraction rings were observed in the monochromatic simulations at locations consistent with the angular dependence on the photon wavelength. In the polychromatic simulations, the diffraction patterns exhibited a radial blur consistent with the energy spread of the polychromatic spectrum. The simulated and experimentally measured patterns showed identical numbers of rings, with close agreement in radial distance and in absolute and relative intensities (barring statistical fluctuations). No significant change was observed in the execution time of the simulations. This work demonstrates the ability to model coherent scatter diffraction in GEANT4 in
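
    The ring locations in such diffraction patterns follow Bragg's law; for a given lattice spacing, the ring's scattering angle shrinks as the photon energy grows, which also explains the radial blur seen with a polychromatic spectrum. A small sketch; the 60 keV energy and the d-spacing are assumed example values, not the paper's materials:

```python
import math

HC_KEV_ANGSTROM = 12.398  # hc in keV*Angstrom

def ring_angle_deg(energy_kev, d_spacing_angstrom):
    """Scattering angle of a coherent-scatter (Bragg) ring:
    lambda = 2*d*sin(theta/2). Returns None if kinematically forbidden."""
    lam = HC_KEV_ANGSTROM / energy_kev
    s = lam / (2.0 * d_spacing_angstrom)
    if s > 1.0:
        return None
    return math.degrees(2.0 * math.asin(s))

# Example (assumed): 60 keV photons on a d = 3.035 Angstrom lattice spacing
theta = ring_angle_deg(60.0, 3.035)
```

    A polychromatic beam maps each energy in the spectrum to a slightly different theta, smearing each sharp ring into the radial blur described above.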

  16. Calculation of electron Dose Point Kernel in water with GEANT4 for medical application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guimaraes, C. C.; Sene, F. F.; Martinelli, J. R.

    2009-06-03

    The rapid introduction of new technologies into medical physics in recent years, especially in nuclear medicine, has been accompanied by great progress in faster Monte Carlo algorithms. GEANT4 is a Monte Carlo toolkit that contains the tools to simulate particle transport through matter. In this work, GEANT4 was used to calculate the dose point kernel (DPK) for monoenergetic electrons in water, which is an important reference medium for nuclear medicine. The three different physical models of electromagnetic interactions provided by GEANT4 (Low Energy, Penelope and Standard) were employed. To verify the adequacy of these models, the results were compared with references from the literature. For all energies and physical models, the agreement between the calculated DPKs and reported values is satisfactory.
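
    A dose point kernel is essentially a histogram of deposited energy over concentric spherical shells centered on the source point. A schematic scorer over fabricated toy deposits (not simulation output):

```python
def dose_point_kernel(deposits, r_max, n_bins):
    """Bin (radius, energy) deposits into spherical shells around a point
    source; returns the fraction of total deposited energy in each shell."""
    energy = [0.0] * n_bins
    total = sum(e for _, e in deposits)
    for r, e in deposits:
        if 0.0 <= r < r_max:
            energy[int(n_bins * r / r_max)] += e
    return [en / total for en in energy]

# fabricated deposits: (radius in units of the electron range, energy in keV)
toy = [(0.05, 10.0), (0.15, 8.0), (0.18, 2.0), (0.45, 5.0), (1.2, 1.0)]
dpk = dose_point_kernel(toy, r_max=1.0, n_bins=10)
```

    A real DPK calculation would additionally divide each shell's energy by the shell volume (or scale radii by the CSDA range), but the shell binning is the core of the scoring.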

  17. Geant4 Developments for the Radon Electric Dipole Moment Search at TRIUMF

    NASA Astrophysics Data System (ADS)

    Rand, E. T.; Bangay, J. C.; Bianco, L.; Dunlop, R.; Finlay, P.; Garrett, P. E.; Leach, K. G.; Phillips, A. A.; Sumithrarachchi, C. S.; Svensson, C. E.; Wong, J.

    2011-09-01

    An experiment is being developed at TRIUMF to search for a time-reversal violating electric dipole moment (EDM) in odd-A isotopes of Rn. Extensive simulations of the experiment are being performed with GEANT4 to study the backgrounds and sensitivity of the proposed measurement technique involving the detection of γ rays emitted following the β decay of polarized Rn nuclei. GEANT4 developments for the RnEDM experiment include both realistic modelling of the detector geometry and full tracking of the radioactive β, γ, internal conversion, and x-ray processes, including the γ-ray angular distributions essential for measuring an atomic EDM.

  18. Software Design Document SAF Workstation. Volume 1, Sections 1.0 - 2.4. 3.4.86

    DTIC Science & Technology

    1991-06-01

    Software Design Document, SAF Workstation CSCI (CSCI 6), Volume 1 of 2, Sections 1.0 - 2.4.3.4.86. SIMNET. June 1991. Approved for public release; distribution unlimited.

  19. Microdosimetry calculations for monoenergetic electrons using Geant4-DNA combined with a weighted track sampling algorithm.

    PubMed

    Famulari, Gabriel; Pater, Piotr; Enger, Shirin A

    2017-07-07

    The aim of this study was to calculate microdosimetric distributions for low-energy electrons simulated using the Monte Carlo track structure code Geant4-DNA. Tracks for monoenergetic electrons with kinetic energies ranging from 100 eV to 1 MeV were simulated in an infinite spherical water phantom using the Geant4-DNA extension included in Geant4 toolkit version 10.2 (patch 02). The microdosimetric distributions were obtained through random sampling of transfer points and overlaying scoring volumes within the associated volume of the tracks. Relative frequency distributions of energy deposition f(>E)/f(>0) and dose-mean lineal energy (y_D) values were calculated in nanometer-sized spherical and cylindrical targets. The effects of scoring volume and scoring technique were examined. The results were compared with published data generated using MOCA8B and KURBUC. Geant4-DNA produces a lower frequency of higher energy deposits than MOCA8B. The y_D values calculated with Geant4-DNA are smaller than those calculated using MOCA8B and KURBUC. The differences are mainly due to the lower ionization and excitation cross sections of Geant4-DNA for low-energy electrons. To a lesser extent, discrepancies can also be attributed to the implementation in this study of a new, fast scoring technique that differs from that used in previous studies. For the same mean chord length, the y_D values calculated in cylindrical volumes are larger than those calculated in spherical volumes. The discrepancies due to cross sections and scoring geometries increase with decreasing scoring-site dimensions. A new set of y_D values has been presented for monoenergetic electrons using a fast track sampling algorithm and the most recent physics models implemented in Geant4-DNA. This dataset can be combined with primary electron spectra to predict the radiation quality of photon and electron beams.
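
    The microdosimetric quantities above can be computed from single-event energy deposits: the lineal energy is y = eps / (mean chord length), and the dose-mean value weights each event by its own y. A minimal sketch with fabricated deposits:

```python
def mean_chord_sphere(diameter):
    """Cauchy's formula l = 4V/S reduces to 2d/3 for a sphere."""
    return 2.0 * diameter / 3.0

def lineal_energy_means(event_deposits_kev, mean_chord_um):
    """Frequency-mean and dose-mean lineal energy [keV/um] from single-event
    deposits: y_i = eps_i / l;  y_F = mean(y);  y_D = sum(y^2) / sum(y)."""
    y = [e / mean_chord_um for e in event_deposits_kev]
    y_f = sum(y) / len(y)
    y_d = sum(v * v for v in y) / sum(y)
    return y_f, y_d

# fabricated single-event deposits [keV] in a 1 um diameter sphere
y_f, y_d = lineal_energy_means([0.2, 0.5, 1.0, 3.0], mean_chord_sphere(1.0))
```

    Because y_D is second-moment weighted, any spread in event sizes makes y_D exceed y_F, which is why the two means respond differently to cross-section changes.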

  20. Monte-Carlo Geant4 numerical simulation of experiments at 247-MeV proton microscope

    NASA Astrophysics Data System (ADS)

    Kantsyrev, A. V.; Skoblyakov, A. V.; Bogdanov, A. V.; Golubev, A. A.; Shilkin, N. S.; Yuriev, D. S.; Mintsev, V. B.

    2018-01-01

    A radiographic facility for the investigation of fast dynamic processes in targets with areal densities up to 5 g/cm² is under development at the high-current proton linear accelerator of the Institute for Nuclear Research (Troitsk, Russia). A virtual model of the proton microscope, developed with the Geant4 software toolkit, is presented in this article. A full-scale Monte Carlo simulation of static radiographic experiments at a proton beam energy of 247 MeV was performed. The results of simulated proton radiography experiments with a static model of shock-compressed xenon are presented. The visualization of copper and polymethyl methacrylate step-wedge static targets is also described.

  1. A Geant4 evaluation of the Hornyak button and two candidate detectors for the TREAT hodoscope

    NASA Astrophysics Data System (ADS)

    Fu, Wenkai; Ghosh, Priyarshini; Harrison, Mark J.; McGregor, Douglas S.; Roberts, Jeremy A.

    2018-05-01

    The performance of traditional Hornyak buttons and two proposed variants for fast-neutron hodoscope applications was evaluated using Geant4. The Hornyak button is a ZnS(Ag)-based device previously deployed at the Idaho National Laboratory's TRansient REActor Test Facility (better known as TREAT) for monitoring fast neutrons emitted during pulsing of fissile fuel samples. Past use of these devices relied on pulse-shape discrimination to reduce the significant levels of background Cherenkov radiation. Proposed are two simple designs that reduce the overall light guide mass (here, polymethyl methacrylate or PMMA), employ silicon photomultipliers (SiPMs), and can be operated using pulse-height discrimination alone to eliminate background noise to acceptable levels. Geant4 was first used to model a traditional Hornyak button, and for assumed, hodoscope-like conditions, an intrinsic efficiency of 0.35% for mono-directional fission neutrons was predicted. The predicted efficiency is in reasonably good agreement with experimental data from the literature and, hence, served to validate the physics models and approximations employed. Geant4 models were then developed to optimize the materials and geometries of two alternatives to the Hornyak button, one based on a homogeneous mixture of ZnS(Ag) and PMMA, and one based on alternating layers of ZnS(Ag) and PMMA oriented perpendicular to the incident neutron beam. For the same radiation environment, optimized, 5-cm long (along the beam path) devices of the homogeneous and layered designs were predicted to have efficiencies of approximately 1.3% and 3.3%, respectively. For longer devices, i.e., lengths larger than 25 cm, these efficiencies were shown to peak at approximately 2.2% and 5.9%, respectively. Moreover, both designs were shown to discriminate Cherenkov noise intrinsically by using an appropriate pulse-height discriminator level, i.e., pulse-shape discrimination is not needed for these devices.
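
    The reported efficiency-versus-length behavior (about 1.3% at 5 cm, saturating near 2.2% for the homogeneous design) is consistent with a simple exponential-saturation toy model, eps(L) = eps_sat * (1 - exp(-Sigma_eff * L)). This is an illustration only; Sigma_eff is fitted here to the two quoted numbers and is not a parameter from the paper:

```python
import math

def intrinsic_efficiency(length_cm, eps_sat, sigma_eff):
    """Toy saturation model: eps(L) = eps_sat * (1 - exp(-sigma_eff * L))."""
    return eps_sat * (1.0 - math.exp(-sigma_eff * length_cm))

# effective macroscopic cross section chosen (assumed) so the homogeneous
# design reproduces ~1.3% at 5 cm with a ~2.2% saturation efficiency
sigma = -math.log(1.0 - 1.3 / 2.2) / 5.0
eff_5cm = intrinsic_efficiency(5.0, 0.022, sigma)
eff_25cm = intrinsic_efficiency(25.0, 0.022, sigma)
```

    The model captures why lengthening the device beyond about 25 cm yields diminishing returns: nearly all neutrons that will interact have already done so.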

  2. A Geant4 evaluation of the Hornyak button and two candidate detectors for the TREAT hodoscope

    DOE PAGES

    Fu, Wenkai; Ghosh, Priyarshini; Harrison, Mark; ...

    2018-02-05

    The performance of traditional Hornyak buttons and two proposed variants for fast-neutron hodoscope applications was evaluated using Geant4. The Hornyak button is a ZnS(Ag)-based device previously deployed at the Idaho National Laboratory's TRansient REActor Test Facility (better known as TREAT) for monitoring fast neutrons emitted during pulsing of fissile fuel samples. Past use of these devices relied on pulse-shape discrimination to reduce the significant levels of background Cherenkov radiation. Proposed are two simple designs that reduce the overall light guide mass (here, polymethyl methacrylate or PMMA), employ silicon photomultipliers (SiPMs), and can be operated using pulse-height discrimination alone to eliminate background noise to acceptable levels. Geant4 was first used to model a traditional Hornyak button, and for assumed, hodoscope-like conditions, an intrinsic efficiency of 0.35% for mono-directional fission neutrons was predicted. The predicted efficiency is in reasonably good agreement with experimental data from the literature and, hence, served to validate the physics models and approximations employed. Geant4 models were then developed to optimize the materials and geometries of two alternatives to the Hornyak button, one based on a homogeneous mixture of ZnS(Ag) and PMMA, and one based on alternating layers of ZnS(Ag) and PMMA oriented perpendicular to the incident neutron beam. For the same radiation environment, optimized, 5-cm long (along the beam path) devices of the homogeneous and layered designs were predicted to have efficiencies of approximately 1.3% and 3.3%, respectively. For longer devices, i.e., lengths larger than 25 cm, these efficiencies were shown to peak at approximately 2.2% and 5.9%, respectively. Furthermore, both designs were shown to discriminate Cherenkov noise intrinsically by using an appropriate pulse-height discriminator level, i.e., pulse-shape discrimination is not needed for these devices.

  3. Monte Carlo calculations of thermal neutron capture in gadolinium: a comparison of GEANT4 and MCNP with measurements.

    PubMed

    Enger, Shirin A; Munck af Rosenschöld, Per; Rezaei, Arash; Lundqvist, Hans

    2006-02-01

    GEANT4 is a Monte Carlo code originally implemented for high-energy physics applications and is well known for particle transport at high energies. The capacity of GEANT4 to simulate neutron transport in the thermal energy region is not equally well known. The aim of this article is to compare MCNP, a code commonly used in low-energy neutron transport calculations, and GEANT4 with experimental results, and to select the suitable code for gadolinium neutron capture applications. To account for thermal neutron scattering from chemically bound atoms, S(α,β), in biological materials, the thermal neutron fluence in a tissue-like poly(methylmethacrylate) phantom is compared between MCNP4B, GEANT4 6.0 patch1, and measurements from the neutron capture therapy (NCT) facility at Studsvik, Sweden. The fluence measurements agreed with the MCNP results calculated with S(α,β). The thermal neutron peak calculated with MCNP without S(α,β), and with GEANT4, is shifted by about 0.5 cm towards a shallower depth and is 25%-30% lower in amplitude. The dose distribution from the gadolinium neutron capture reaction is then simulated with MCNP and compared with measured data; the simulations agree well with the experimental results. As long as thermal neutron scattering from chemically bound atoms is not included in GEANT4, it is not suitable for NCT applications.

  4. Implementation of new physics models for low energy electrons in liquid water in Geant4-DNA.

    PubMed

    Bordage, M C; Bordes, J; Edel, S; Terrissol, M; Franceries, X; Bardiès, M; Lampe, N; Incerti, S

    2016-12-01

    A new alternative set of elastic and inelastic cross sections has been added to the very low energy extension of the Geant4 Monte Carlo simulation toolkit, Geant4-DNA, for the simulation of electron interactions in liquid water. These cross sections have been obtained from the CPA100 Monte Carlo track structure code, which has been a reference in the microdosimetry community for many years. They are compared to the default Geant4-DNA cross sections and show better agreement with published data. In order to verify the correct implementation of the CPA100 cross section models in Geant4-DNA, simulations of the number of interactions and ranges were performed using Geant4-DNA with this new set of models, and the results were compared with corresponding results from the original CPA100 code. Good agreement is observed between the implementations, with relative differences lower than 1% regardless of the incident electron energy. Useful quantities related to the deposited energy at the scale of the cell or the organ of interest for internal dosimetry, like dose point kernels, are also calculated using these new physics models. They are compared with results obtained using the well-known Penelope Monte Carlo code. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  5. Comparison of the thermal neutron scattering treatment in MCNP6 and GEANT4 codes

    NASA Astrophysics Data System (ADS)

    Tran, H. N.; Marchix, A.; Letourneau, A.; Darpentigny, J.; Menelle, A.; Ott, F.; Schwindling, J.; Chauvin, N.

    2018-06-01

    To ensure the reliability of simulation tools, verification and comparison should be made regularly. This paper describes work performed to compare the neutron transport treatment of MCNP6.1 and GEANT4-10.3 in the thermal energy range. The work focuses on the thermal neutron scattering processes for several materials that could be involved in the neutron source designs of Compact Accelerator-based Neutron Sources (CANS): beryllium metal, beryllium oxide, polyethylene, graphite, para-hydrogen, light water, heavy water, aluminium and iron. Both the thermal scattering law and the free gas model, taken from the evaluated data library ENDF/B-VII, were considered. It was observed that the GEANT4.10.03-patch2 version did not properly account for the coherent elastic process occurring in crystal lattices. This bug was fixed in the present work, and the fix should be included in the next release of the code. Cross-section sampling and integral tests were performed for both simulation codes, showing fair agreement between the two codes for most materials, except for iron and aluminium.

  6. GEANT4 Simulation of Neutron Detector for DAMPE

    NASA Astrophysics Data System (ADS)

    He, M.; Ma, T.; Chang, J.; Zhang, Y.; Huang, Y. Y.; Zang, J. J.; Wu, J.; Dong, T. K.

    2016-01-01

    Over the past few decades, dark matter has gradually become a hot topic in astronomical research, and related theoretical and experimental work is advancing rapidly. China's Dark Matter Particle Explorer (DAMPE) was proposed in this context. As the probed objects include high-energy electrons, appropriate methods must be used to distinguish them from protons, in order to reduce the probability of other charged particles (e.g. protons) being mistaken for electrons. Experiments show that the hadronic shower of a high-energy proton in the BGO electromagnetic calorimeter, which is usually accompanied by the emission of a large number of secondary neutrons, is significantly different from the electromagnetic shower of a high-energy electron. By detecting the secondary neutron signal emerging from the bottom of the BGO electromagnetic calorimeter, together with the shower shape of the incident particle in the calorimeter, we can effectively determine whether the incident particle is a high-energy proton or an electron. This paper introduces the structure and detection principle of the DAMPE neutron detector. We use the Monte Carlo method with the GEANT4 software to simulate the signals produced by protons and electrons at characteristic energies in the neutron detector, and finally summarize the neutron detector's ability to distinguish protons from electrons at different electron acceptance efficiencies.

  7. Modeling proton and alpha elastic scattering in liquid water in Geant4-DNA

    NASA Astrophysics Data System (ADS)

    Tran, H. N.; El Bitar, Z.; Champion, C.; Karamitros, M.; Bernal, M. A.; Francis, Z.; Ivantchenko, V.; Lee, S. B.; Shin, J. I.; Incerti, S.

    2015-01-01

    Elastic scattering of protons and alpha (α) particles by water molecules cannot be neglected at low incident energies. However, this physical process is currently not available in the "Geant4-DNA" extension of the Geant4 Monte Carlo simulation toolkit. In this work, we report theoretical differential and integral cross sections of the elastic scattering process for 100 eV-1 MeV incident protons and for 100 eV-10 MeV incident α particles in liquid water. The calculations are performed within the classical framework described by Everhart et al., Ziegler et al., and the ICRU 49 Report. We then propose an implementation of the corresponding classes in the Geant4-DNA toolkit for modeling the elastic scattering of protons and α particles. Stopping powers and ranges are also reported. Accounting for the elastic scattering process in the slowing-down of charged particles clearly improves the agreement with existing data, in particular with the ICRU recommendations.
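
    Elastic scattering models of this kind are typically sampled from a screened-Rutherford-like angular distribution. A sketch of inverse-CDF sampling for dσ/dΩ proportional to (1 - cosθ + 2η)⁻², with an assumed fixed screening parameter η (in a real model, η depends on the projectile energy and the target):

```python
import random

def sample_cos_theta(eta, rng):
    """Inverse-CDF sample of cos(theta) for a screened-Rutherford angular
    distribution pdf(mu) ~ (1 - mu + 2*eta)^-2 on mu in [-1, 1]."""
    u = rng.random()
    # closed-form inversion of the normalized CDF (u=0 -> mu=-1, u=1 -> mu=+1)
    return 1.0 - 2.0 * eta * (1.0 - u) / (u + eta)

rng = random.Random(42)
eta = 0.01  # assumed screening parameter for illustration
samples = [sample_cos_theta(eta, rng) for _ in range(20000)]
forward_fraction = sum(1 for c in samples if c > 0.9) / len(samples)
```

    The small screening parameter keeps the distribution strongly forward-peaked while regularizing the Rutherford divergence at theta = 0.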

  8. Geant4 Monte Carlo simulation of energy loss and transmission and ranges for electrons, protons and ions

    NASA Astrophysics Data System (ADS)

    Ivantchenko, Vladimir

    Geant4 is a toolkit for Monte Carlo simulation of particle transport originally developed for applications in high-energy physics, with a focus on experiments at the Large Hadron Collider (CERN, Geneva). The transparency and flexibility of the code have spread its use to other fields of research, e.g. radiotherapy and space science. The tool provides the possibility to simulate complex geometries, transport in electric and magnetic fields, and a variety of physics models for the interaction of particles with media. Geant4 has been used to simulate radiation effects for a number of space missions. Recent upgrades of the toolkit released in December 2009 include a new model for ion electronic stopping power based on the revised version of the ICRU 73 Report, increasing the accuracy of ion transport simulation. In the current work we present the status of the Geant4 electromagnetic package for the simulation of particle energy loss, ranges and transmission. This has direct implications for the simulation of ground-testing setups at existing European facilities and for the simulation of radiation effects in space. A number of improvements were introduced for electron and proton transport, followed by a thorough validation. The aim of the present study was to validate the computed ranges against reference data from the United States National Institute of Standards and Technology (NIST) ESTAR, PSTAR and ASTAR databases. We compared Geant4 and NIST ranges of electrons using different Geant4 models. The best agreement was found for Penelope, except at very low energies in heavy materials, where the Standard package gave better results. Geant4 proton ranges in water agreed with NIST within 1%. The validation of the new ion model was performed against recent data on Bragg peak position in water. Data on the transmission of carbon ions through various absorbers, following the Bragg peak in water, demonstrate that the new Geant4 model significantly improves the precision of ion ranges. The absolute accuracy of ion range
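
    Range validations of this kind compare against the CSDA range, R(E) = ∫ dE'/S(E') from 0 to E. A sketch using a Bragg-Kleeman-style toy stopping power for protons in water; the fit constants are assumed, approximate values and not NIST data:

```python
def csda_range(energy_mev, stopping_power, n_steps=2000):
    """CSDA range R = integral of dE / S(E), simple midpoint rule."""
    de = energy_mev / n_steps
    return sum(de / stopping_power((i + 0.5) * de) for i in range(n_steps))

# Bragg-Kleeman toy model: R ~ ALPHA * E**P for protons in water,
# so the implied stopping power is S(E) = E**(1-P) / (ALPHA * P)
ALPHA, P = 0.0022, 1.77  # assumed fit constants [cm, MeV]

def s_water(e_mev):
    return e_mev ** (1.0 - P) / (ALPHA * P)

r_150 = csda_range(150.0, s_water)  # range of a 150 MeV proton [cm]
```

    Because the toy S(E) is derived from the power-law range, the numerical integral should reproduce ALPHA * E**P almost exactly; a real validation would integrate tabulated stopping powers instead.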

  9. Design of Cherenkov bars for the optical part of the time-of-flight detector in Geant4.

    PubMed

    Nozka, L; Brandt, A; Rijssenbeek, M; Sykora, T; Hoffman, T; Griffiths, J; Steffens, J; Hamal, P; Chytka, L; Hrabovsky, M

    2014-11-17

    We present the results of studies devoted to the development and optimization of the optical part of a high-precision time-of-flight (TOF) detector for the Large Hadron Collider (LHC). This work was motivated by a proposal to use such a detector in conjunction with a silicon detector to tag and measure protons from interactions of the type p + p → p + X + p, where the two outgoing protons are scattered in the very forward directions. The fast timing detector uses fused silica (quartz) bars that emit Cherenkov radiation as a relativistic particle passes through; the emitted Cherenkov photons are detected by, for instance, a micro-channel plate multi-anode photomultiplier tube (MCP-PMT). Several possible designs are implemented in Geant4 and studied for timing optimization as a function of the photon arrival time and the number of Cherenkov photons reaching the photo-sensor.
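
    The Cherenkov emission geometry that drives the photon arrival times follows from cos(theta_c) = 1/(n*beta). A small sketch; the fused-silica refractive index is an assumed, wavelength-averaged value:

```python
import math

def cherenkov_angle_deg(n, beta):
    """Cherenkov emission angle from cos(theta_c) = 1/(n*beta);
    returns None when the particle is below the Cherenkov threshold."""
    c = 1.0 / (n * beta)
    if c > 1.0:
        return None
    return math.degrees(math.acos(c))

# fused silica, n ~ 1.46 (assumed, wavelength dependent in reality)
theta = cherenkov_angle_deg(1.46, 0.9999)  # ultra-relativistic proton
slow = cherenkov_angle_deg(1.46, 0.6)      # below threshold (beta < 1/n)
```

    The fixed emission angle is what makes bar geometry optimization in Geant4 largely a problem of steering a known light cone onto the photo-sensor with minimal time spread.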

  10. G4DARI: Geant4/GATE based Monte Carlo simulation interface for dosimetry calculation in radiotherapy.

    PubMed

    Slimani, Faiçal A A; Hamdi, Mahdjoub; Bentourkia, M'hamed

    2018-05-01

    Monte Carlo (MC) simulation is widely recognized as an important technique to study the physics of particle interactions in nuclear medicine and radiation therapy. There are different codes dedicated to dosimetry applications and widely used today in research or in clinical applications, such as MCNP, EGSnrc and Geant4. However, while such codes make the physics easier, the programming remains a tedious task even for physicists familiar with computer programming. In this paper we report the development of a new interface, GEANT4 Dose And Radiation Interactions (G4DARI), based on GEANT4, for absorbed dose calculation and particle tracking in humans, small animals and complex phantoms. The calculation of the absorbed dose is performed based on 3D CT human or animal images in DICOM format, on images of phantoms, or on solid volumes that can be made of any pure or composite material specified by its molecular formula. G4DARI offers menus to the user and tabs to be filled with values or chemical formulas. The interface is described and, as an application, we show results obtained for a lung tumor in a digital mouse irradiated with seven energy beams, and for a patient with glioblastoma irradiated with five photon beams. In conclusion, G4DARI can be easily used by any researcher without the need to be familiar with computer programming, and it will be freely available as an application package. Copyright © 2018 Elsevier Ltd. All rights reserved.

  11. GEANT4 simulation of cyclotron radioisotope production in a solid target.

    PubMed

    Poignant, F; Penfold, S; Asp, J; Takhar, P; Jackson, P

    2016-05-01

    The use of radioisotopes in nuclear medicine is essential for diagnosing and treating cancer. The optimization of their production is a key factor in maximizing the production yield and minimizing the associated costs. An efficient approach to this problem is the use of Monte Carlo simulations prior to experimentation. By predicting isotope yields, one can study the expected activity of the isotope of interest for different energy ranges. One can also study the contamination of the target with other radioisotopes, especially undesired radioisotopes of the desired chemical element, which are difficult to separate from the irradiated target and might increase the dose when the radiopharmaceutical product is delivered to the patient. The aim of this work is to build and validate a Monte Carlo simulation platform using the GEANT4 toolkit to model the solid target system of the South Australian Health and Medical Research Institute (SAHMRI) GE Healthcare PETtrace cyclotron. It includes a GEANT4 Graphical User Interface (GUI) where the user can modify simulation parameters such as the energy, shape and current of the proton beam, the target geometry and material, the foil geometry and material, and the time of irradiation. The paper describes the simulation and presents a comparison of simulated and experimental/theoretical yields for various nuclear reactions on an enriched nickel-64 target using the GEANT4 physics model QGSP_BIC_AllHP, a model recently developed to evaluate with high precision the interactions of protons with energies below 200 MeV, available in Geant4 version 10.1. The simulated yield of the (64)Ni(p,n)(64)Cu reaction was found to be 7.67±0.074 mCi·μA(-1) for a target energy range of 9-12 MeV. Szelecsenyi et al. (1993) give a theoretical yield of 6.71 mCi·μA(-1) and an experimental yield of 6.38 mCi·μA(-1). The (64)Ni(p,n)(64)Cu cross section obtained with the simulation was also verified against the yield predicted from the nuclear database TENDL and
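    A saturation yield like the one quoted above converts to an end-of-bombardment activity through the usual growth law A = Y_sat · I · (1 − e^(−λt)). A small sketch (the 30 μA current and 2 h irradiation are hypothetical example values; the 12.7 h half-life of 64Cu is the standard literature value):

```python
import math

def eob_activity_mci(sat_yield_mci_per_ua, current_ua, t_irr_h, half_life_h):
    """End-of-bombardment activity from a saturation yield:
    A = Y_sat * I * (1 - exp(-lambda * t_irr))."""
    lam = math.log(2.0) / half_life_h          # decay constant [1/h]
    return sat_yield_mci_per_ua * current_ua * (1.0 - math.exp(-lam * t_irr_h))

# Simulated saturation yield from the abstract (7.67 mCi/uA); beam current and
# irradiation time below are hypothetical; 64Cu half-life is 12.7 h
print(eob_activity_mci(7.67, 30.0, 2.0, 12.7))
```

    The (1 − e^(−λt)) factor is why short irradiations of long-lived products recover only a small fraction of the saturation activity.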

  12. Designing a new type of neutron detector for neutron and gamma-ray discrimination via GEANT4.

    PubMed

    Shan, Qing; Chu, Shengnan; Ling, Yongsheng; Cai, Pingkun; Jia, Wenbao

    2016-04-01

    This study reports the design of a new type of neutron detector, consisting of a fast-neutron converter, a plastic scintillator, and a Cherenkov detector, to discriminate 14-MeV fast neutrons from gamma rays in a pulsed n-γ mixed field and to monitor their fluxes. Both neutrons and gamma rays produce fluorescence in the scintillator when they are incident on the detector; however, only the secondary charged particles of the gamma rays produce Cherenkov light in the Cherenkov detector. The neutron and gamma-ray fluxes can therefore be calculated by measuring the fluorescence and Cherenkov light. The GEANT4 Monte Carlo simulation toolkit is used to simulate the whole process occurring in the detector and to determine its optimum parameters. Analysis of the simulation results leads to a calculation method for the neutron flux. This method is verified by calculating the neutron fluxes for pulsed n-γ mixed fields with different n/γ ratios, and the results show that the relative errors of all calculations are <5%.
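    The two-signal measurement described above amounts to a 2×2 linear unfolding: the scintillator light carries both neutron and gamma contributions, while the Cherenkov light is (to first order) gamma-only. A toy sketch with hypothetical response coefficients (in a real analysis these would come from the GEANT4 simulation, not from this example):

```python
def unfold_fluxes(s_signal, c_signal, a_n, a_g, c_g):
    """Solve the linear system
         S = a_n * phi_n + a_g * phi_g   (scintillator fluorescence)
         C = c_g * phi_g                 (Cherenkov light, gamma-only)
    for the neutron flux phi_n and the gamma flux phi_g."""
    phi_g = c_signal / c_g
    phi_n = (s_signal - a_g * phi_g) / a_n
    return phi_n, phi_g

# Hypothetical response coefficients and signals, purely illustrative
phi_n, phi_g = unfold_fluxes(s_signal=2.7, c_signal=4.0, a_n=0.6, a_g=0.3, c_g=0.8)
print(phi_n, phi_g)
```

    The scheme works only while the neutron contribution to the Cherenkov channel is negligible, which is precisely the design property the converter/Cherenkov combination aims for.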

  13. Modeling of a cyclotron target for the production of 11C with Geant4.

    PubMed

    Chiappiniello, Andrea; Zagni, Federico; Infantino, Angelo; Vichi, Sara; Cicoria, Gianfranco; Morigi, Maria Pia; Marengo, Mario

    2018-04-12

    In medical cyclotron facilities, 11C is produced via the 14N(p,α)11C reaction and is widely employed in Positron Emission Tomography studies of prostate and brain cancers. It is known from the literature [1] that the 11C-target assembly shows a reduction in efficiency over time, i.e. a decrease in the activity produced at the end of bombardment. The causes of this effect are still not completely understood. Possible causes of the loss of performance of the 11C-target assembly were addressed by Monte Carlo simulations. Geant4 was used to model the 11C-target assembly of a GE PETtrace cyclotron. The physical and transport parameters to be used in the energy range of medical applications were extracted from literature data and from routine 11C productions. The Monte Carlo assessment of the 11C saturation yield was performed by varying several parameters, such as the proton energy and the angle of the target assembly with respect to the proton beam. The estimated 11C saturation yield is in agreement with IAEA data at the energy of interest, while it is about 35% greater than the experimental value. A more comprehensive model of the target system, including thermodynamic effects, is required. The energy absorbed in the inner layer of the target chamber was up to 46.5 J/mm2 under typical irradiation conditions. This study shows that Geant4 is potentially a useful tool to design and optimize targetry for PET radionuclide production. Tests to choose the Geant4 physics libraries should be performed before using this tool with different energies and materials.
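    A saturation yield such as the one assessed above is, at its core, a thick-target integral of cross section over stopping power, Y ∝ ∫ σ(E)/S(E) dE, taken between the beam's entrance and exit energies in the target. An illustrative numerical sketch with placeholder σ(E) and S(E) (not the real 14N(p,α)11C data):

```python
import math

def thick_target_yield(sigma, stopping, e_in, e_out, n_steps=1000):
    """Trapezoidal evaluation of Y ∝ ∫ sigma(E)/S(E) dE from e_out up to e_in."""
    de = (e_in - e_out) / n_steps
    total = 0.0
    for i in range(n_steps + 1):
        e = e_out + i * de
        w = 0.5 if i in (0, n_steps) else 1.0   # trapezoid endpoint weights
        total += w * sigma(e) / stopping(e)
    return total * de

# Placeholder excitation function peaked near 10.5 MeV and a flat stopping power,
# both in arbitrary units; real data would come from nuclear databases
toy_sigma = lambda e: math.exp(-((e - 10.5) ** 2) / 2.0)
toy_stopping = lambda e: 2.0
print(thick_target_yield(toy_sigma, toy_stopping, 12.0, 9.0))
```

    Varying the proton energy in the simulation effectively moves the integration window across the excitation function, which is why the yield is so sensitive to the entrance energy.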

  14. GEANT4 Simulation of Neutron Detector for DAMPE

    NASA Astrophysics Data System (ADS)

    Ming, He; Tao, Ma; Jin, Chang; Yan, Zhang; Yong-yi, Huang; Jing-jing, Zang; Jian, Wu; Tie-kuang, Dong

    2016-10-01

    In recent decades, dark matter has gradually become a hot topic in astronomical research, and the related theoretical and experimental work is advancing rapidly. China's Dark Matter Particle Explorer (DAMPE) was proposed against this background. As the detection targets include high-energy electrons, appropriate methods must be adopted to distinguish them from protons, in order to reduce the probability of other charged particles (for example, protons) being mistaken for electrons. Experiments show that the hadronic shower of a high-energy proton in a BGO (Bismuth Germanium Oxide) calorimeter, which is usually accompanied by the emission of a large number of secondary neutrons, is significantly different from the electromagnetic shower of a high-energy electron. Through the detection of secondary neutron signals emerging from the bottom of the BGO calorimeter, together with the shower shape of the incident particles in the calorimeter, we can effectively determine whether the incident particles are high-energy protons or electrons. This paper introduces the structure and detection principle of the DAMPE neutron detector. We use the Monte Carlo method and the GEANT4 software to simulate the signals produced by protons and electrons at characteristic energies in the neutron detector, and finally summarize the neutron detector's ability to distinguish protons from electrons under different electron acceptance efficiencies.

  15. Calibration of the radiation monitor onboard Akebono using Geant4

    NASA Astrophysics Data System (ADS)

    Asai, Keiko; Takashima, Takeshi; Koi, Tatsumi; Nagai, Tsugunobu

    Natural high-energy electrons and protons (keV-MeV) in space contaminate the data reciprocally. In order to calibrate the energy ranges and to remove data contamination on the radiation monitor (RDM) onboard the Japanese satellite Akebono (EXOS-D), the detector is investigated using the Geant4 simulation toolkit for computational particle tracing. The semi-polar orbiting Akebono, launched in February 1989, is still active. The satellite has been observing the space environment at altitudes of several thousand km. The RDM instrument onboard Akebono monitors energetic particles in the Earth's radiation belts and provides important data accumulated over about two solar cycles. The data from the RDM cover electrons in three energy channels of 0.3 MeV and above, protons in three energy channels of 30 MeV and above, and alpha particles in one energy channel of 15-45 MeV. The energy ranges are, however, based on information from about 20 years ago, so the data appear to include some errors. In addition, these data include reciprocal contamination of electrons and protons. In particular, it is known that the electron data are contaminated by solar protons, but the amount of contamination has not been quantified. Therefore we need data calibration in order to correct the energy ranges and to remove the contamination. The Geant4 simulation gives information on the trajectories of incident and secondary particles as they interact with materials. We examine the RDM using the Geant4 simulation. We find from the results that relativistic electrons of MeV energies behave quite complicatedly because of particle-material interactions in the instrument. The results indicate that the detection and contamination efficiencies are energy dependent. This study compares the electron data from the Akebono RDM with simultaneous observations from CRRES and attempts to derive correction values for each of the energy channels.

  16. Evaluation on Geant4 Hadronic Models for Pion Minus, Pion Plus and Neutron Particles as Major Antiproton Annihilation Products

    PubMed Central

    Tavakoli, Mohammad Bagher; Mohammadi, Mohammad Mehdi; Reiazi, Reza; Jabbari, Keyvan

    2015-01-01

    Geant4 is an open-source simulation toolkit based on C++, whose advantages have progressively led to applications in research domains such as modeling the biological effects of ionizing radiation at the sub-cellular scale. However, it has been shown that Geant4 does not give reasonable results in the prediction of antiproton dose, especially in the Bragg peak. One of the reasons could be the lack of a reliable physics model to predict the final states of annihilation products such as pions. Considering the fact that most of the antiproton deposited dose results from high-LET nuclear fragments following pion interactions with surrounding nucleons, we reproduced depth-dose curves for the most probable energy range of pions and for neutrons using Geant4. We consider this work one of the steps toward understanding the origin of the error and ultimately verifying Geant4 for antiproton tracking. Geant4 toolkit version 9.4.6.p01 and Fluka version 2006.3 were used to reproduce the depth-dose curves of 220 MeV pions (both negative and positive) and 70 MeV neutrons. The geometry applied in the simulations consisted of a 20 × 20 × 20 cm3 water tank, similar to that used at CERN for antiproton relative dose measurements. Different physics lists, including Quark-Gluon String Precompound (QGSP)_Binary Cascade (BIC)_HP, the recommended setting for hadron therapy, were used. In the case of pions, Geant4 showed at least a 5% dose discrepancy between different physics lists at depths close to the entrance point; discrepancies of up to 15% were found in some cases, such as QBBC compared to QGSP_BIC_HP. A significant difference was observed in the dose profiles of different Geant4 physics lists at small depths for a beam of pions. In the case of neutrons, a large dose discrepancy was observed when the LHEP or LHEP_EMV lists were applied; the magnitude of this discrepancy could be even 50% greater than the dose calculated by LHEP (or LHEP_EMV) at larger depths. We found that the effect of different Geant4 physics lists in

  17. Extension of PENELOPE to protons: simulation of nuclear reactions and benchmark with Geant4.

    PubMed

    Sterpin, E; Sorriaux, J; Vynckier, S

    2013-11-01

    Describing the implementation of nuclear reactions in the extension of the Monte Carlo code (MC) PENELOPE to protons (PENH) and benchmarking with Geant4. PENH is based on mixed-simulation mechanics for both elastic and inelastic electromagnetic collisions (EM). The adopted differential cross sections for EM elastic collisions are calculated using the eikonal approximation with the Dirac-Hartree-Fock-Slater atomic potential. Cross sections for EM inelastic collisions are computed within the relativistic Born approximation, using the Sternheimer-Liljequist model of the generalized oscillator strength. Nuclear elastic and inelastic collisions were simulated explicitly using the Scattering Analysis Interactive Dial-in (SAID) database for (1)H and ICRU 63 data for (12)C, (14)N, (16)O, (31)P, and (40)Ca. Secondary protons, alphas, and deuterons were all simulated as protons, with the energy adapted to ensure consistent range. Prompt gamma emission can also be simulated upon user request. Simulations were performed in a water phantom with nuclear interactions switched off or on, and integral depth-dose distributions were compared. Binary-cascade and precompound models were used for Geant4. Initial energies of 100 and 250 MeV were considered. For cases with no nuclear interactions simulated, additional simulations in a water phantom with tight resolution (1 mm in all directions) were performed with FLUKA. Finally, integral depth-dose distributions for a 250 MeV energy were computed with Geant4 and PENH in a homogeneous phantom with, first, ICRU striated muscle and, second, ICRU compact bone. For simulations with EM collisions only, integral depth-dose distributions were within 1%/1 mm for doses higher than 10% of the Bragg-peak dose. For central-axis depth-dose and lateral profiles in a phantom with tight resolution, there are significant deviations between Geant4 and PENH (up to 60%/1 cm for depth-dose distributions). 
The agreement is much better with FLUKA, with deviations within
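    The 1%/1 mm figures quoted in these benchmarks are instances of the gamma-index test commonly used to compare dose distributions. A brute-force 1D sketch with global dose normalization (the function and parameter names are illustrative, not from PENH or Geant4):

```python
import math

def gamma_index_1d(ref_pos_mm, ref_dose, eval_pos_mm, eval_dose,
                   dose_tol=0.01, dist_tol_mm=1.0):
    """Brute-force 1D gamma index: for each reference point,
    gamma = min over evaluated points of
    sqrt((dose_diff / (dose_tol * D_max))^2 + (distance / dist_tol)^2).
    gamma <= 1 means the point passes the dose_tol/dist_tol criterion."""
    d_max = max(ref_dose)                      # global normalization dose
    gammas = []
    for rp, rd in zip(ref_pos_mm, ref_dose):
        best = float("inf")
        for ep, ed in zip(eval_pos_mm, eval_dose):
            dd = (ed - rd) / (dose_tol * d_max)
            dr = (ep - rp) / dist_tol_mm
            best = min(best, math.hypot(dd, dr))
        gammas.append(best)
    return gammas

# Identical distributions pass trivially (gamma = 0 at every point)
pos = [0.0, 1.0, 2.0, 3.0]
dose = [10.0, 40.0, 100.0, 30.0]
print(gamma_index_1d(pos, dose, pos, dose))
```

    Production implementations interpolate the evaluated distribution rather than minimizing over sample points, but the criterion being tested is the same.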

  18. GATE - Geant4 Application for Tomographic Emission: a simulation toolkit for PET and SPECT

    PubMed Central

    Jan, S.; Santin, G.; Strul, D.; Staelens, S.; Assié, K.; Autret, D.; Avner, S.; Barbier, R.; Bardiès, M.; Bloomfield, P. M.; Brasse, D.; Breton, V.; Bruyndonckx, P.; Buvat, I.; Chatziioannou, A. F.; Choi, Y.; Chung, Y. H.; Comtat, C.; Donnarieix, D.; Ferrer, L.; Glick, S. J.; Groiselle, C. J.; Guez, D.; Honore, P.-F.; Kerhoas-Cavata, S.; Kirov, A. S.; Kohli, V.; Koole, M.; Krieguer, M.; van der Laan, D. J.; Lamare, F.; Largeron, G.; Lartizien, C.; Lazaro, D.; Maas, M. C.; Maigne, L.; Mayet, F.; Melot, F.; Merheb, C.; Pennacchio, E.; Perez, J.; Pietrzyk, U.; Rannou, F. R.; Rey, M.; Schaart, D. R.; Schmidtlein, C. R.; Simon, L.; Song, T. Y.; Vieira, J.-M.; Visvikis, D.; Van de Walle, R.; Wieërs, E.; Morel, C.

    2012-01-01

    Monte Carlo simulation is an essential tool in emission tomography that can assist in the design of new medical imaging devices, the optimization of acquisition protocols, and the development or assessment of image reconstruction algorithms and correction techniques. GATE, the Geant4 Application for Tomographic Emission, encapsulates the Geant4 libraries to achieve a modular, versatile, scripted simulation toolkit adapted to the field of nuclear medicine. In particular, GATE allows the description of time-dependent phenomena such as source or detector movement, and source decay kinetics. This feature makes it possible to simulate time curves under realistic acquisition conditions and to test dynamic reconstruction algorithms. This paper gives a detailed description of the design and development of GATE by the OpenGATE collaboration, whose continuing objective is to improve, document, and validate GATE by simulating commercially available imaging systems for PET and SPECT. Large effort is also invested in the ability and the flexibility to model novel detection systems or systems still under design. A public release of GATE licensed under the GNU Lesser General Public License can be downloaded at the address http://www-lphe.ep.ch/GATE/. Two benchmarks developed for PET and SPECT to test the installation of GATE and to serve as a tutorial for the users are presented. Extensive validation of the GATE simulation platform has been started, comparing simulations and measurements on commercially available acquisition systems. References to those results are listed. The future prospects toward the gridification of GATE and its extension to other domains such as dosimetry are also discussed. PMID:15552416

  19. The impact of new Geant4-DNA cross section models on electron track structure simulations in liquid water

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kyriakou, I.; Šefl, M.

    The most recent release of the open source and general purpose Geant4 Monte Carlo simulation toolkit (Geant4 10.2 release) contains a new set of physics models in the Geant4-DNA extension for improving the modelling of low-energy electron transport in liquid water (<10 keV). This includes updated electron cross sections for excitation, ionization, and elastic scattering. In the present work, the impact of these developments on track-structure calculations is examined, providing the first comprehensive comparison against the default physics models of Geant4-DNA. Significant differences with the default models are found for the average path length and penetration distance, as well as for dose-point kernels for electron energies below a few hundred eV. On the other hand, self-irradiation absorbed fractions for tissue-like volumes and low-energy electron sources (including some Auger emitters) reveal rather small differences (up to 15%) between these new and default Geant4-DNA models. The above findings indicate that the impact of the new developments will mainly affect those applications where the spatial pattern of interactions and energy deposition of very-low-energy electrons plays an important role, such as, for example, the modelling of the chemical and biophysical stage of radiation damage to cells.

  20. Calibration and GEANT4 Simulations of the Phase II Proton Compute Tomography (pCT) Range Stack Detector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Uzunyan, S. A.; Blazey, G.; Boi, S.

    Northern Illinois University, in collaboration with Fermi National Accelerator Laboratory (FNAL) and Delhi University, has been designing and building a proton CT scanner for applications in proton treatment planning. The Phase II proton CT scanner consists of eight planes of tracking detectors with two X and two Y coordinate measurements both before and after the patient. In addition, a range stack detector consisting of a stack of thin scintillator tiles, arranged in twelve eight-tile frames, is used to determine the water equivalent path length (WEPL) of each track through the patient. The X-Y coordinates and WEPL are required input for image reconstruction software to find the relative (proton) stopping power (RSP) value of each voxel in the patient and generate a corresponding 3D image. In this Note we describe tests conducted in 2015 at the proton beam at the Central DuPage Hospital in Warrenville, IL, focusing on the range stack calibration procedure and comparisons with the GEANT4 range stack simulation.
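    As a rough illustration of how a tile stack yields a WEPL value: walk down the stack summing each traversed tile's water-equivalent thickness until the proton ranges out. The per-tile thickness, signals, and threshold below are hypothetical placeholders, not the Phase II scanner's calibrated values:

```python
def wepl_from_stack(tile_signals, tile_we_thickness_mm, threshold):
    """Estimate the water equivalent path length (WEPL) by summing the
    water-equivalent thickness of consecutive tiles whose signal exceeds
    the threshold, stopping at the first quiet tile (proton ranged out)."""
    wepl = 0.0
    for s in tile_signals:
        if s < threshold:
            break                       # proton stopped before this tile
        wepl += tile_we_thickness_mm
    return wepl

# Hypothetical per-tile water-equivalent thickness and tile signals (arbitrary units)
print(wepl_from_stack([5.1, 4.8, 5.3, 0.2, 0.0], tile_we_thickness_mm=3.3, threshold=1.0))
```

    A calibrated detector refines this coarse estimate using the energy deposited in the stopping tile, which is what the calibration procedure against the GEANT4 simulation is for.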

  1. Simulation of Radiation Damage to Neural Cells with the Geant4-DNA Toolkit

    NASA Astrophysics Data System (ADS)

    Bayarchimeg, Lkhagvaa; Batmunkh, Munkhbaatar; Belov, Oleg; Lkhagva, Oidov

    2018-02-01

    To help in understanding the physical and biological mechanisms underlying the effects of cosmic and therapeutic radiation on the central nervous system (CNS), we have developed an original neuron application based on the Geant4 Monte Carlo simulation toolkit, in particular on its biophysical extension Geant4-DNA. The applied simulation technique provides a tool for the simulation of physical, physico-chemical and chemical processes (e.g. the production of water radiolysis species in the vicinity of neurons) in a realistic geometrical model of neural cells exposed to ionizing radiation. The present study evaluates the microscopic energy depositions and the yields of water radiolysis species within the detailed structure of a selected neuron, taking into account its soma, dendrites, axon and spines, following irradiation with carbon and iron ions.

  2. GePEToS: A Geant4 Monte Carlo Simulation Package for Positron Emission Tomography

    NASA Astrophysics Data System (ADS)

    Jan, S.; Collot, J.; Gallin-Martel, M.-L.; Martin, P.; Mayet, F.; Tournefier, E.

    2005-02-01

    GePEToS is a simulation framework developed over the last few years for assessing the instrumental performance of future positron emission tomography (PET) scanners. It is based on Geant4, written in object-oriented C++, and runs on Linux platforms. The validity of GePEToS has been tested on the well-known Siemens ECAT EXACT HR+ camera. The results of two application examples are presented: the design optimization of a liquid-Xe μPET camera dedicated to small-animal imaging, as well as the evaluation of the effect of a strong axial magnetic field on the image resolution of a Concorde P4 μPET camera.

  3. Geant4-DNA track-structure simulations for gold nanoparticles: The importance of electron discrete models in nanometer volumes.

    PubMed

    Sakata, Dousatsu; Kyriakou, Ioanna; Okada, Shogo; Tran, Hoang N; Lampe, Nathanael; Guatelli, Susanna; Bordage, Marie-Claude; Ivanchenko, Vladimir; Murakami, Koichi; Sasaki, Takashi; Emfietzoglou, Dimitris; Incerti, Sebastien

    2018-05-01

    Gold nanoparticles (GNPs) are known to enhance the absorbed dose in their vicinity following photon-based irradiation. To investigate the therapeutic effectiveness of GNPs, previous Monte Carlo simulation studies have explored GNP dose enhancement using mostly condensed-history models. However, in general, such models are suitable for macroscopic volumes and for electron energies above a few hundred electron volts. We have recently developed, for the Geant4-DNA extension of the Geant4 Monte Carlo simulation toolkit, discrete physics models for electron transport in gold which include the description of the full atomic de-excitation cascade. These models allow event-by-event simulation of electron tracks in gold down to 10 eV. The present work describes how such specialized physics models impact simulation-based studies on GNP-radioenhancement in a context of x-ray radiotherapy. The new discrete physics models are compared to the Geant4 Penelope and Livermore condensed-history models, which are being widely used for simulation-based NP radioenhancement studies. An ad hoc Geant4 simulation application has been developed to calculate the absorbed dose in liquid water around a GNP and its radioenhancement, caused by secondary particles emitted from the GNP itself, when irradiated with a monoenergetic electron beam. The effect of the new physics models is also quantified in the calculation of secondary particle spectra, when originating in the GNP and when exiting from it. The new physics models show similar backscattering coefficients with the existing Geant4 Livermore and Penelope models in large volumes for 100 keV incident electrons. However, in submicron sized volumes, only the discrete models describe the high backscattering that should still be present around GNPs at these length scales. Sizeable differences (mostly above a factor of 2) are also found in the radial distribution of absorbed dose and secondary particles between the new and the existing Geant4

  4. An implementation of discrete electron transport models for gold in the Geant4 simulation toolkit

    NASA Astrophysics Data System (ADS)

    Sakata, D.; Incerti, S.; Bordage, M. C.; Lampe, N.; Okada, S.; Emfietzoglou, D.; Kyriakou, I.; Murakami, K.; Sasaki, T.; Tran, H.; Guatelli, S.; Ivantchenko, V. N.

    2016-12-01

    Gold nanoparticle (GNP) boosted radiation therapy can enhance the biological effectiveness of radiation treatments by increasing the quantity of direct and indirect radiation-induced cellular damage. As the physical effects of GNP-boosted radiotherapy occur across energy scales that extend down to 10 eV, Monte Carlo simulations require discrete physics models down to these very low energies in order to avoid underestimating the absorbed dose and secondary particle generation. Discrete physics models for electron transport down to 10 eV have been implemented within the Geant4-DNA low-energy extension of Geant4. Such models allow the investigation of GNP effects at the nanoscale. At low energies, the new models agree better with experimental data on the backscattering coefficient, and they show performance similar to the Livermore and Penelope models already implemented in Geant4 for transmission coefficient data. These new models are applicable in simulations aimed at estimating the relative biological effectiveness of radiation in GNP-boosted radiotherapy applications with photon and electron radiation sources.

  5. Extension of PENELOPE to protons: Simulation of nuclear reactions and benchmark with Geant4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sterpin, E.; Sorriaux, J.; Vynckier, S.

    2013-11-15

    Purpose: Describing the implementation of nuclear reactions in the extension of the Monte Carlo code (MC) PENELOPE to protons (PENH) and benchmarking with Geant4. Methods: PENH is based on mixed-simulation mechanics for both elastic and inelastic electromagnetic collisions (EM). The adopted differential cross sections for EM elastic collisions are calculated using the eikonal approximation with the Dirac–Hartree–Fock–Slater atomic potential. Cross sections for EM inelastic collisions are computed within the relativistic Born approximation, using the Sternheimer–Liljequist model of the generalized oscillator strength. Nuclear elastic and inelastic collisions were simulated explicitly using the Scattering Analysis Interactive Dial-in (SAID) database for (1)H and ICRU 63 data for (12)C, (14)N, (16)O, (31)P, and (40)Ca. Secondary protons, alphas, and deuterons were all simulated as protons, with the energy adapted to ensure consistent range. Prompt gamma emission can also be simulated upon user request. Simulations were performed in a water phantom with nuclear interactions switched off or on, and integral depth–dose distributions were compared. Binary-cascade and precompound models were used for Geant4. Initial energies of 100 and 250 MeV were considered. For cases with no nuclear interactions simulated, additional simulations in a water phantom with tight resolution (1 mm in all directions) were performed with FLUKA. Finally, integral depth–dose distributions for a 250 MeV energy were computed with Geant4 and PENH in a homogeneous phantom with, first, ICRU striated muscle and, second, ICRU compact bone. Results: For simulations with EM collisions only, integral depth–dose distributions were within 1%/1 mm for doses higher than 10% of the Bragg-peak dose. For central-axis depth–dose and lateral profiles in a phantom with tight resolution, there are significant deviations between Geant4 and PENH (up to 60%/1 cm for depth

  6. Simulation loop between cad systems, GEANT-4 and GeoModel: Implementation and results

    NASA Astrophysics Data System (ADS)

    Sharmazanashvili, A.; Tsutskiridze, Niko

    2016-09-01

    Comparative analysis of simulated and as-built geometry descriptions of a detector is an important field of study for data-vs-Monte-Carlo discrepancies. Shape consistency and level of detail are less important, while the adequacy of the volumes and weights of detector components is essential for tracking. There are two main sources of faults in geometry descriptions in simulation: (1) differences between the simulated and as-built geometry descriptions; (2) internal inaccuracies of geometry transformations introduced by the simulation software infrastructure itself. The Georgian engineering team developed a hub based on the CATIA platform, together with several tools enabling different descriptions used by simulation packages to be read into CATIA, such as XML->CATIA, VP1->CATIA, GeoModel->CATIA, and Geant4->CATIA. As a result, it becomes possible to compare the different descriptions with each other using the full power of CATIA and to investigate both classes of faults in geometry descriptions. The paper presents the results of case studies of the ATLAS Coils and End-Cap toroid structures.

  7. A fast and complete GEANT4 and ROOT Object-Oriented Toolkit: GROOT

    NASA Astrophysics Data System (ADS)

    Lattuada, D.; Balabanski, D. L.; Chesnevskaya, S.; Costa, M.; Crucillà, V.; Guardo, G. L.; La Cognata, M.; Matei, C.; Pizzone, R. G.; Romano, S.; Spitaleri, C.; Tumino, A.; Xu, Y.

    2018-01-01

    Present and future gamma-beam facilities represent a great opportunity to validate and evaluate the cross sections of many photonuclear reactions at near-threshold energies. Monte Carlo (MC) simulations are very important to evaluate reaction rates and to maximize detection efficiency but, unfortunately, they can be very CPU-time-consuming and in some cases very hard to reproduce, especially when exploring near-threshold cross sections. We developed software that makes use of the validated GEANT4 tracking libraries and the n-body event generator of ROOT in order to provide a fast, reliable and complete MC tool for nuclear physics experiments. This tool is intended to be used for photonuclear reactions at γ-beam facilities with ELISSA (ELI Silicon Strip Array), a new detector array under development at the Extreme Light Infrastructure - Nuclear Physics (ELI-NP). We discuss the results of MC simulations performed to evaluate the effects of the electromagnetically induced background, of the straggling due to the target thickness, and of the resolution of the silicon detectors.

  8. Comparison of Geant4-DNA simulation of S-values with other Monte Carlo codes

    NASA Astrophysics Data System (ADS)

    André, T.; Morini, F.; Karamitros, M.; Delorme, R.; Le Loirec, C.; Campos, L.; Champion, C.; Groetz, J.-E.; Fromm, M.; Bordage, M.-C.; Perrot, Y.; Barberet, Ph.; Bernal, M. A.; Brown, J. M. C.; Deleuze, M. S.; Francis, Z.; Ivanchenko, V.; Mascialino, B.; Zacharatou, C.; Bardiès, M.; Incerti, S.

    2014-01-01

    Monte Carlo simulations of S-values have been carried out with the Geant4-DNA extension of the Geant4 toolkit. The S-values have been simulated for monoenergetic electrons with energies ranging from 0.1 keV up to 20 keV, in liquid water spheres (for four radii, chosen between 10 nm and 1 μm), and for electrons emitted by five isotopes of iodine (131, 132, 133, 134 and 135), in liquid water spheres of varying radius (from 15 μm up to 250 μm). The results have been compared to those obtained from other Monte Carlo codes and from other published data. The use of the Kolmogorov-Smirnov test confirmed the statistical compatibility of all simulation results.
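    The Kolmogorov-Smirnov comparison used above can be reproduced in a few lines. Here is a minimal two-sample KS statistic (the critical-value lookup is omitted, and the sample values are illustrative, not the paper's S-values):

```python
import bisect

def ks_statistic(sample_a, sample_b):
    """Two-sample Kolmogorov-Smirnov statistic D = sup_x |F_a(x) - F_b(x)|,
    where F_a and F_b are the empirical CDFs of the two samples."""
    a, b = sorted(sample_a), sorted(sample_b)

    def ecdf(sorted_sample, x):
        return bisect.bisect_right(sorted_sample, x) / len(sorted_sample)

    # The supremum is attained at one of the observed values
    return max(abs(ecdf(a, x) - ecdf(b, x)) for x in sorted(set(a) | set(b)))

# Two hypothetical sets of simulated S-values (arbitrary units)
print(ks_statistic([1.02, 0.98, 1.01, 0.99], [1.00, 1.03, 0.97, 1.02]))
```

    Comparing D against the tabulated critical value for the two sample sizes then gives the compatibility verdict reported in such studies.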

  9. The simulation library of the Belle II software system

    NASA Astrophysics Data System (ADS)

    Kim, D. Y.; Ritter, M.; Bilka, T.; Bobrov, A.; Casarosa, G.; Chilikin, K.; Ferber, T.; Godang, R.; Jaegle, I.; Kandra, J.; Kodys, P.; Kuhr, T.; Kvasnicka, P.; Nakayama, H.; Piilonen, L.; Pulvermacher, C.; Santelj, L.; Schwenker, B.; Sibidanov, A.; Soloviev, Y.; Starič, M.; Uglov, T.

    2017-10-01

SuperKEKB, the next-generation B factory, has been constructed in Japan as an upgrade of KEKB. This brand new e+ e- collider is expected to deliver a very large data set for the Belle II experiment, 50 times larger than the previous Belle sample. Both the triggered physics event rate and the background event rate will be at least 10 times higher than before, creating a challenging data-taking environment for the Belle II detector. The software system of the Belle II experiment is designed to meet the demands of this ambitious plan. A full detector simulation library, which is part of the Belle II software system, was created based on Geant4 and has been tested thoroughly. Recently the library was upgraded to Geant4 version 10.1. The library behaves as expected and is actively used to produce Monte Carlo data sets for various studies. In this paper, we explain the structure of the simulation library and the various interfaces to other packages, including geometry and beam background simulation.

  10. Calculation of Coincidence Summing Correction Factors for an HPGe detector using GEANT4.

    PubMed

    Giubrone, G; Ortiz, J; Gallardo, S; Martorell, S; Bas, M C

    2016-07-01

The aim of this paper was to calculate the True Coincidence Summing Correction Factors (TSCFs) for an HPGe coaxial detector in order to correct for the summing effect caused by the presence of (88)Y and (60)Co in a multigamma source used to obtain an efficiency calibration curve. Results were obtained for three volumetric sources using the Monte Carlo toolkit GEANT4. The first part of this paper deals with modeling the detector in order to obtain a simulated full-energy peak efficiency curve. A quantitative comparison between the measured and simulated values was made across the entire energy range under study. The True Summing Correction Factors were calculated for (88)Y and (60)Co using the full-energy peak efficiencies obtained with GEANT4. This methodology was subsequently applied to (134)Cs, which presents a complex decay scheme.
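The summing-out part of a TSCF can be sketched at first order: the full-energy peak of one gamma is suppressed by the probability that its coincident partner deposits any energy in the detector, so the peak area is multiplied by 1/(1 - ε_total). This is a simplification that ignores angular correlations, and the 10% total efficiency used below is a hypothetical value, not a result from the paper:

```python
def summing_out_tscf(total_eff_coincident):
    """First-order summing-out correction factor: the observed full-energy
    peak is suppressed by the probability (total efficiency) that the
    coincident gamma deposits any energy, so the peak area is multiplied
    by 1 / (1 - eps_total)."""
    return 1.0 / (1.0 - total_eff_coincident)

# hypothetical 10% total efficiency for the coincident 1332 keV line of 60Co
c = summing_out_tscf(0.10)
```

For cascades with more than two gammas, as in (134)Cs, the correction is built from sums over all coincident paths of the decay scheme, which is why a Monte Carlo evaluation of the efficiencies is attractive.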

  11. Accelerating a MPEG-4 video decoder through custom software/hardware co-design

    NASA Astrophysics Data System (ADS)

    Díaz, Jorge L.; Barreto, Dacil; García, Luz; Marrero, Gustavo; Carballo, Pedro P.; Núñez, Antonio

    2007-05-01

In this paper we present a novel methodology to accelerate an MPEG-4 video decoder using software/hardware co-design for wireless DAB/DMB networks. Software support includes the services provided by the embedded kernel μC/OS-II and the application tasks mapped to software. Hardware support includes several custom co-processors and a communication architecture with bridges to the main system bus and to a dual-port SRAM. Synchronization among tasks is achieved at two levels, by a hardware protocol and by kernel-level scheduling services. Our reference application is an MPEG-4 video decoder composed of several software functions and written using a special C++ library named CASSE. Profiling and design-space exploration techniques were previously applied to the Advanced Simple Profile (ASP) MPEG-4 decoder to determine the best HW/SW partition, which is developed here. This research is part of the ARTEMI project, whose main goals are the establishment of methodologies for the design of real-time complex digital systems using Programmable Logic Devices with embedded microprocessors as the target technology, and the design of multimedia systems for broadcasting networks as the reference application.

  12. Modelling PET radionuclide production in tissue and external targets using Geant4

    NASA Astrophysics Data System (ADS)

    Amin, T.; Infantino, A.; Lindsay, C.; Barlow, R.; Hoehr, C.

    2017-07-01

The Proton Therapy Facility at TRIUMF provides 74 MeV protons, extracted from a 500 MeV H- cyclotron, for ocular melanoma treatments. During treatment, positron-emitting radionuclides such as 11C, 15O and 13N are produced in patient tissue. Using PET scanners, the isotopic activity distribution can be measured for in-vivo range verification. A second cyclotron, the TR13, delivers 13 MeV protons onto liquid targets for the production of PET radionuclides such as 18F, 13N or 68Ga for medical applications. The aim of this work was to validate Geant4 against FLUKA and experimental measurements for the production of the above-mentioned isotopes at the two cyclotrons. The results show variable degrees of agreement. For proton therapy, the proton-range agreement was within 2 mm for the 11C activity, whereas the 13N activity disagreed. For liquid targets at the TR13, the average absolute deviation ratio between FLUKA and experiment was 1.9±2.7, whereas the average absolute deviation ratio between Geant4 and experiment was 0.6±0.4. This is attributed to the uncertainties in the experimentally determined reaction cross sections.
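The figure of merit quoted above can be read as the mean and spread of |simulated - measured| / measured over the set of yield measurements; a sketch under that assumption, with made-up numbers:

```python
import statistics

def avg_abs_deviation_ratio(sim, exp):
    """Mean and population standard deviation of |sim - exp| / exp,
    element-wise over paired simulated and measured yields."""
    ratios = [abs(s - e) / e for s, e in zip(sim, exp)]
    return statistics.mean(ratios), statistics.pstdev(ratios)

# illustrative values only, not data from the paper
mean_r, sd_r = avg_abs_deviation_ratio([2.0, 1.0], [1.0, 1.0])
```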

  13. Abstract ID: 176 Geant4 implementation of inter-atomic interference effect in small-angle coherent X-ray scattering for materials of medical interest.

    PubMed

    Paternò, Gianfranco; Cardarelli, Paolo; Contillo, Adriano; Gambaccini, Mauro; Taibi, Angelo

    2018-01-01

Advanced applications of digital mammography such as dual-energy and tomosynthesis require multiple exposures and thus deliver a higher dose compared to standard mammograms. A straightforward manner to reduce patient dose without affecting image quality would be the removal of the anti-scatter grid, provided that the involved reconstruction algorithms are able to take the scatter distribution into account [1]. Monte Carlo simulations are very well suited for the calculation of X-ray scatter distributions and can be used to integrate such information within the reconstruction software. Geant4 is an open source C++ particle tracking code widely used in several physical fields, including medical physics [2,3]. However, the coherent scattering cross section used by the standard Geant4 code does not take into account the influence of molecular interference. According to the independent atomic scattering approximation (the so-called free-atom model), coherent radiation is indistinguishable from primary radiation because its angular distribution is peaked in the forward direction. Since interference effects occur between X-rays scattered by neighbouring atoms in matter, it was shown experimentally that the scatter distribution is affected by the molecular structure of the target, even in amorphous materials. The most important consequence is that the coherent scatter distribution is not peaked in the forward direction, and the position of the maximum is strongly material-dependent [4]. In this contribution, we present the implementation of a method to take into account inter-atomic interference in small-angle coherent scattering in Geant4, including a dedicated data set of suitable molecular form factor values for several materials of clinical interest. Furthermore, we present scatter images of simple geometric phantoms in which the Rayleigh contribution is rigorously evaluated.

  14. TOPAS/Geant4 configuration for ionization chamber calculations in proton beams

    NASA Astrophysics Data System (ADS)

    Wulff, Jörg; Baumann, Kilian-Simon; Verbeek, Nico; Bäumer, Christian; Timmermann, Beate; Zink, Klemens

    2018-06-01

Monte Carlo (MC) calculations are a fundamental tool for the investigation of ionization chambers (ICs) in radiation fields, and for calculations in the scope of IC reference dosimetry. Geant4, as used for the toolkit TOPAS, is a major general purpose code, generally suitable for investigating ICs in primary proton beams. To provide reliable results, the impact of parameter settings and the limitations of the underlying condensed history (CH) algorithm need to be known. A Fano cavity test was implemented in Geant4 (10.03.p1) for protons, based on the existing version for electrons distributed with the Geant4 release. This self-consistent test allows the calculation to be compared with the expected result for the typical IC-like geometry of an air-filled cavity surrounded by a higher density material. Various user-selectable parameters of the CH implementation in the EMStandardOpt4 physics-list were tested for incident proton energies between 30 and 250 MeV. Using TOPAS (3.1.p1) the influence of production cuts was investigated for bare air-cavities in water, irradiated by primary protons. Detailed IC geometries for an NACP-02 plane-parallel chamber and an NE2571 Farmer-chamber were created. The overall factor f_Q as a ratio between the dose-to-water and the dose to the sensitive air-volume was calculated for incident proton energies between 70 and 250 MeV. The Fano test demonstrated the EMStandardOpt4 physics-list with the WentzelIV multiple scattering model as appropriate for IC calculations. If protons start perpendicular to the air cavity, no further step-size limitations are required to pass the test within 0.1%. For an isotropic source, limitations of the maximum step length within the air cavity and its surrounding as well as a limitation of the maximum fractional energy loss per step were required to pass within 0.2%. A production cut of ⩽5 μm or ∼15 keV for all particles yielded a constant result for f_Q of bare air-filled cavities. The overall

  16. GEANT4-based full simulation of the PADME experiment at the DAΦNE BTF

    NASA Astrophysics Data System (ADS)

    Leonardi, E.; Kozhuharov, V.; Raggi, M.; Valente, P.

    2017-10-01

A possible solution to the dark matter problem postulates that dark particles can interact with Standard Model particles only through a new force mediated by a “portal”. If the new force has a U(1) gauge structure, the “portal” is a massive photon-like vector particle, called dark photon or A′. The PADME experiment at the DAΦNE Beam-Test Facility (BTF) in Frascati is designed to detect dark photons produced in positron-on-fixed-target annihilations and decaying to dark matter (e+e-→γA′) by measuring the final state missing mass. The experiment will be composed of a thin active diamond target where a 550 MeV positron beam will impinge to produce e+e- annihilation events. The surviving beam will be deflected with a magnet while the photons produced in the annihilation will be measured by a calorimeter composed of BGO crystals. To reject the background from Bremsstrahlung gamma production, a set of segmented plastic scintillator vetoes will be used to detect positrons exiting the target with an energy lower than that of the beam, while a fast small angle calorimeter will be used to reject the e+e-→γγ(γ) background. To optimize the experimental layout in terms of signal acceptance and background rejection, the full layout of the experiment was modelled with the GEANT4 simulation package. In this paper we describe the details of the simulation and report on the results obtained with the software.

  17. Comparison of experimental proton-induced fluorescence spectra for a selection of thin high-Z samples with Geant4 Monte Carlo simulations

    NASA Astrophysics Data System (ADS)

    Incerti, S.; Barberet, Ph.; Dévès, G.; Michelet, C.; Francis, Z.; Ivantchenko, V.; Mantero, A.; El Bitar, Z.; Bernal, M. A.; Tran, H. N.; Karamitros, M.; Seznec, H.

    2015-09-01

    The general purpose Geant4 Monte Carlo simulation toolkit is able to simulate radiative and non-radiative atomic de-excitation processes such as fluorescence and Auger electron emission, occurring after interaction of incident ionising radiation with target atomic electrons. In this paper, we evaluate the Geant4 modelling capability for the simulation of fluorescence spectra induced by 1.5 MeV proton irradiation of thin high-Z foils (Fe, GdF3, Pt, Au) with potential interest for nanotechnologies and life sciences. Simulation results are compared to measurements performed at the Centre d'Etudes Nucléaires de Bordeaux-Gradignan AIFIRA nanobeam line irradiation facility in France. Simulation and experimental conditions are described and the influence of Geant4 electromagnetic physics models is discussed.

  18. Multi-threading performance of Geant4, MCNP6, and PHITS Monte Carlo codes for tetrahedral-mesh geometry.

    PubMed

    Han, Min Cheol; Yeom, Yeon Soo; Lee, Hyun Su; Shin, Bangho; Kim, Chan Hyeong; Furuta, Takuya

    2018-05-04

In this study, the multi-threading performance of the Geant4, MCNP6, and PHITS codes was evaluated as a function of the number of threads (N) and the complexity of the tetrahedral-mesh phantom. For this, three tetrahedral-mesh phantoms of varying complexity (simple, moderately complex, and highly complex) were prepared and implemented in the three different Monte Carlo codes, in photon and neutron transport simulations. Subsequently, for each case, the initialization time, calculation time, and memory usage were measured as a function of the number of threads used in the simulation. It was found that for all codes, the initialization time significantly increased with the complexity of the phantom, but not with the number of threads. Geant4 exhibited much longer initialization time than the other codes, especially for the complex phantom (MRCP). The improvement of computation speed due to the use of a multi-threaded code was calculated as the speed-up factor, the ratio of the computation speed on a multi-threaded code to the computation speed on a single-threaded code. Geant4 showed the best multi-threading performance among the codes considered in this study, with the speed-up factor almost linearly increasing with the number of threads, reaching ~30 when N = 40. PHITS and MCNP6 showed a much smaller increase of the speed-up factor with the number of threads. For PHITS, the speed-up factors were low when N = 40. For MCNP6, the increase of the speed-up factors was better, but they were still less than ~10 when N = 40. As for memory usage, Geant4 was found to use more memory than the other codes. In addition, compared to that of the other codes, the memory usage of Geant4 more rapidly increased with the number of threads, reaching as high as ~74 GB when N = 40 for the complex phantom (MRCP). It is notable that compared to that of the other codes, the memory usage of PHITS was much lower, regardless of both the complexity of the
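The speed-up factor defined in the abstract is simply the ratio of computation speeds, or equivalently of run times for a fixed workload; a small helper, together with the derived parallel efficiency (the variable names and the example timings are illustrative, not taken from the study):

```python
def speed_up(t_single, t_multi):
    """Speed-up factor: run time on one thread divided by run time on N
    threads, for the same number of simulated histories."""
    return t_single / t_multi

def parallel_efficiency(t_single, t_multi, n_threads):
    """Fraction of ideal linear scaling achieved with n_threads."""
    return speed_up(t_single, t_multi) / n_threads

# e.g. a job taking 3600 s single-threaded and 120 s on 40 threads
s = speed_up(3600.0, 120.0)                    # 30x, comparable to the Geant4 figure
eff = parallel_efficiency(3600.0, 120.0, 40)   # 0.75 of ideal scaling
```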

  19. Muon Telescope (MuTe): A first study using Geant4

    NASA Astrophysics Data System (ADS)

    Asorey, H.; Balaguera-Rojas, A.; Calderon-Ardila, R.; Núñez, L. A.; Sanabria-Gómez, J. D.; Súarez-Durán, M.; Tapia, A.

    2017-07-01

Muon tomography is based on recording the difference in absorption of muons by matter, as ordinary radiography does with X-rays. The interaction of cosmic rays with the atmosphere produces extensive air showers, which provide an abundant source of atmospheric muons, benefiting various applications of muon tomography, particularly the study of the inner structure of volcanoes. MuTe (for Muon Telescope) is a hybrid detector composed of scintillation bars and a water Cherenkov detector, designed to measure the flux of cosmic muons crossing volcanic edifices. This detector consists of two scintillator plates (1.44 m2 with 30 × 30 pixels), with a maximum separation of 2.0 m. In this work we report the first simulation of MuTe using GEANT4, a C++-based set of simulation tools that models the interaction between radiation and matter. This computational tool allows us to determine the energy deposited by the muons and to model the response of the scintillators and the water Cherenkov detector to the passage of radiation, which is crucial for comparison with our data analysis.
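The deposited energy that drives the scintillator response can be estimated with the usual minimum-ionizing rule of thumb, dE/dx ≈ 2 MeV cm²/g times the areal density crossed. Both the dE/dx value and the polystyrene-like density below are assumed round numbers, not MuTe specifications:

```python
def mip_energy_deposit(thickness_cm, density_g_cm3=1.06, dedx_mev_cm2_g=2.0):
    """Energy (MeV) deposited by a minimum-ionizing muon crossing a slab,
    estimated as the ~2 MeV cm^2/g rule of thumb times the areal density.
    Density and dE/dx defaults are assumed illustrative values."""
    return dedx_mev_cm2_g * density_g_cm3 * thickness_cm

# a 1 cm thick scintillator bar deposits roughly 2 MeV
e_dep = mip_energy_deposit(1.0)
```

A full simulation such as the one described replaces this constant with the energy- and material-dependent stopping power and adds fluctuations, which is precisely what GEANT4 provides.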

  20. The effects of nuclear data library processing on Geant4 and MCNP simulations of the thermal neutron scattering law

    NASA Astrophysics Data System (ADS)

    Hartling, K.; Ciungu, B.; Li, G.; Bentoumi, G.; Sur, B.

    2018-05-01

Monte Carlo codes such as MCNP and Geant4 rely on a combination of physics models and evaluated nuclear data files (ENDF) to simulate the transport of neutrons through various materials and geometries. The grid representation used to represent the final-state scattering energies and angles associated with neutron scattering interactions can significantly affect the predictions of these codes. In particular, the default thermal scattering libraries used by MCNP6.1 and Geant4.10.3 do not accurately reproduce the ENDF/B-VII.1 model in simulations of the double-differential cross section for thermal neutrons interacting with hydrogen nuclei in a thin layer of water. However, agreement between model and simulation can be achieved within the statistical error by re-processing the ENDF/B-VII.1 thermal scattering libraries with the NJOY code. The structure of the thermal scattering libraries and sampling algorithms in MCNP and Geant4 are also reviewed.

  1. Geant4 simulation of the CERN-EU high-energy reference field (CERF) facility.

    PubMed

    Prokopovich, D A; Reinhard, M I; Cornelius, I M; Rosenfeld, A B

    2010-09-01

The CERN-EU high-energy reference field facility is used for testing and calibrating both active and passive radiation dosemeters for radiation protection applications in space and aviation. Through a combination of a primary particle beam, a target and a suitably designed shielding configuration, the facility is able to reproduce the neutron component of the high-altitude radiation field relevant to the jet aviation industry. Simulations of the facility using the GEANT4 (GEometry ANd Tracking) toolkit provide an improved understanding of the neutron particle fluence as well as the particle fluence of the other radiation components present. The secondary particle fluence as a function of the primary particle fluence incident on the target, and the associated dose equivalent rates, were determined at the 20 designated irradiation positions available at the facility. Comparisons of the simulated results with previously published simulations obtained using the FLUKA Monte Carlo code, as well as with experimental results of the neutron fluence obtained with a Bonner sphere spectrometer, are made.
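The dose-equivalent rates mentioned above are conventionally obtained by folding the energy-binned fluence with fluence-to-dose-equivalent conversion coefficients h(E); a sketch with placeholder numbers (real coefficients come from ICRP tabulations, not from this code):

```python
def dose_equivalent_rate(fluence_rates, conversion_coeffs):
    """Fold an energy-binned fluence rate (cm^-2 s^-1 per bin) with
    fluence-to-dose-equivalent conversion coefficients h(E) (pSv cm^2):
    the rate is the sum over bins of phi(E) * h(E), in pSv/s."""
    return sum(phi * h for phi, h in zip(fluence_rates, conversion_coeffs))

# three illustrative energy bins with placeholder fluences and coefficients
h_rate = dose_equivalent_rate([1.0, 2.0, 0.5], [10.0, 5.0, 40.0])
```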

  2. Simulating cosmic radiation absorption and secondary particle production of solar panel layers of Low Earth Orbit (LEO) satellite with GEANT4

    NASA Astrophysics Data System (ADS)

Yiğitoğlu, Merve; Veske, Doğa; Nilüfer Öztürk, Zeynep; Bilge Demirköz, Melahat

    2016-07-01

All devices that operate in space are exposed to cosmic rays during their operation. The resulting radiation may cause fatal damage in the solid-state structure of devices, so the absorbed radiation dose and the secondary particle production for each component should be calculated carefully before production. Solar panels are semiconductor solid-state devices and are very sensitive to radiation. Even a short-term power cut-off may yield a total failure of the satellite, and even small doses of radiation can change the characteristics of solar cells. This deviation can be caused by the rarer highly energetic particles as well as by the total ionizing dose from the abundant low-energy particles. In this study, the solar panels planned for a specific LEO satellite, IMECE, are analyzed layer by layer. The Space Environment Information System (SPENVIS) database and the GEANT4 simulation software are used to simulate the layers of the panels. The results obtained from the simulation will be taken into account to determine the amount of radiation protection and resistance needed for the panels, or to revise the design of the panels.

  3. Performance of Geant4 in simulating semiconductor particle detector response in the energy range below 1 MeV

    NASA Astrophysics Data System (ADS)

    Soti, G.; Wauters, F.; Breitenfeldt, M.; Finlay, P.; Kraev, I. S.; Knecht, A.; Porobić, T.; Zákoucký, D.; Severijns, N.

    2013-11-01

    Geant4 simulations play a crucial role in the analysis and interpretation of experiments providing low energy precision tests of the Standard Model. This paper focuses on the accuracy of the description of the electron processes in the energy range between 100 and 1000 keV. The effect of the different simulation parameters and multiple scattering models on the backscattering coefficients is investigated. Simulations of the response of HPGe and passivated implanted planar Si detectors to β particles are compared to experimental results. An overall good agreement is found between Geant4 simulations and experimental data.

  4. Efficient voxel navigation for proton therapy dose calculation in TOPAS and Geant4

    NASA Astrophysics Data System (ADS)

    Schümann, J.; Paganetti, H.; Shin, J.; Faddegon, B.; Perl, J.

    2012-06-01

A key task within all Monte Carlo particle transport codes is ‘navigation’, the calculation to determine at each particle step what volume the particle may be leaving and what volume the particle may be entering. Navigation should be optimized to the specific geometry at hand. For patient dose calculation, this geometry generally involves voxelized computed tomography (CT) data. We investigated the efficiency of navigation algorithms on currently available voxel geometry parameterizations in the Monte Carlo simulation package Geant4: G4VPVParameterisation, G4VNestedParameterisation and G4PhantomParameterisation, the last with and without boundary skipping, a method where neighboring voxels with the same Hounsfield unit are combined into one larger voxel. A fourth parameterization approach (MGHParameterization), developed in-house before the latter two parameterizations became available in Geant4, was also included in this study. All simulations were performed using TOPAS, a tool for particle simulations layered on top of Geant4. Runtime comparisons were made on three distinct patient CT data sets: a head and neck, a liver and a prostate patient. We included in the runtime study an additional version of these three patients in which all voxels, including the air voxels outside of the patient, were uniformly set to water. The G4VPVParameterisation offers two optimization options: one results in a 60-150 times slower simulation speed; the other is comparable in speed but requires 15-19 times more memory than the other parameterizations. We found the average CPU time used for the simulation relative to G4VNestedParameterisation to be 1.014 for G4PhantomParameterisation without boundary skipping and 1.015 for MGHParameterization. The average runtime ratio for G4PhantomParameterisation with and without boundary skipping for our heterogeneous data was equal to 0.97:1. The calculated dose distributions agreed with the reference distribution for all but the G4
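Boundary skipping as described above amounts to run-length merging of neighboring voxels with equal Hounsfield units, so the navigator crosses one geometric boundary per homogeneous span instead of one per voxel. A one-dimensional sketch of the idea (the real Geant4 implementation works on the 3D voxel structure):

```python
def merge_equal_voxels(row):
    """Collapse runs of neighboring voxels with the same Hounsfield unit
    into (value, length) spans, so a navigator would cross one boundary
    per span instead of one per voxel."""
    spans = []
    for hu in row:
        if spans and spans[-1][0] == hu:
            # extend the current run of identical values
            spans[-1] = (hu, spans[-1][1] + 1)
        else:
            spans.append((hu, 1))
    return spans

# e.g. three air voxels, two soft-tissue voxels, one air voxel
spans = merge_equal_voxels([0, 0, 0, 40, 40, 0])
```

The uniform-water variants of the patients in the study are the extreme case: the whole geometry collapses to very few spans, which is why they isolate the navigation overhead.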

  5. CPU time optimization and precise adjustment of the Geant4 physics parameters for a VARIAN 2100 C/D gamma radiotherapy linear accelerator simulation using GAMOS.

    PubMed

    Arce, Pedro; Lagares, Juan Ignacio

    2018-01-25

We have verified the GAMOS/Geant4 simulation model of a 6 MV VARIAN Clinac 2100 C/D linear accelerator by the procedure of adjusting the initial beam parameters to fit the percentage depth dose and cross-profile dose experimental data at different depths in a water phantom. Thanks to the use of a wide range of field sizes, from 2 × 2 cm² to 40 × 40 cm², a small phantom voxel size and high statistics, fine precision in the determination of the beam parameters has been achieved. This precision has allowed us to make a thorough study of the different physics models and parameters that Geant4 offers. The three Geant4 electromagnetic physics sets of models, i.e. Standard, Livermore and Penelope, have been compared to the experiment, testing the four different models of angular bremsstrahlung distributions as well as the three available multiple-scattering models, and optimizing the most relevant Geant4 electromagnetic physics parameters. Before the fitting, a comprehensive CPU time optimization has been done, using several of the Geant4 efficiency improvement techniques plus a few more developed in GAMOS.

  6. Verification of Electromagnetic Physics Models for Parallel Computing Architectures in the GeantV Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amadio, G.; et al.

An intensive R&D and programming effort is required to accomplish new challenges posed by future experimental high-energy particle physics (HEP) programs. The GeantV project aims to narrow the gap between the performance of the existing HEP detector simulation software and the ideal performance achievable, exploiting latest advances in computing technology. The project has developed a particle detector simulation prototype capable of transporting in parallel particles in complex geometries exploiting instruction level microparallelism (SIMD and SIMT), task-level parallelism (multithreading) and high-level parallelism (MPI), leveraging both the multi-core and the many-core opportunities. We present preliminary verification results concerning the electromagnetic (EM) physics models developed for parallel computing architectures within the GeantV project. In order to exploit the potential of vectorization and accelerators and to make the physics model effectively parallelizable, advanced sampling techniques have been implemented and tested. In this paper we introduce a set of automated statistical tests in order to verify the vectorized models by checking their consistency with the corresponding Geant4 models and to validate them against experimental data.
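One simple automated consistency check of the kind mentioned is a bin-by-bin Pearson chi-square statistic between the histogram produced by the vectorized model and the one produced by the reference Geant4 model. This is a generic sketch, not the GeantV test suite itself:

```python
def chi2_histogram_test(h1, h2):
    """Bin-by-bin Pearson chi-square between two histograms of counts with
    comparable totals; chi2 of the order of ndf indicates statistical
    compatibility of the two samplings."""
    chi2 = 0.0
    ndf = 0
    for a, b in zip(h1, h2):
        if a + b > 0:          # skip empty bin pairs
            chi2 += (a - b) ** 2 / (a + b)
            ndf += 1
    return chi2, ndf

# illustrative counts: two samplings of the same underlying distribution
chi2, ndf = chi2_histogram_test([100, 200, 50], [110, 190, 55])
```

In an automated suite one would convert (chi2, ndf) into a p-value and fail the test below a chosen threshold.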

  7. Validation of a small-animal PET simulation using GAMOS: a GEANT4-based framework

    NASA Astrophysics Data System (ADS)

    Cañadas, M.; Arce, P.; Rato Mendes, P.

    2011-01-01

Monte Carlo-based modelling is a powerful tool to help in the design and optimization of positron emission tomography (PET) systems. The performance of these systems depends on several parameters, such as detector physical characteristics, shielding or electronics, whose effects can be studied on the basis of realistic simulated data. The aim of this paper is to validate a comprehensive study of the Raytest ClearPET small-animal PET scanner using a new Monte Carlo simulation platform which has been developed at CIEMAT (Madrid, Spain), called GAMOS (GEANT4-based Architecture for Medicine-Oriented Simulations). This toolkit, based on the GEANT4 code, was originally designed to cover multiple applications in the field of medical physics from radiotherapy to nuclear medicine, but has since been applied by some of its users in other fields of physics, such as neutron shielding, space physics, high energy physics, etc. Our simulation model includes the relevant characteristics of the ClearPET system, namely, the double layer of scintillator crystals in phoswich configuration, the rotating gantry, the presence of intrinsic radioactivity in the crystals or the storage of single events for an off-line coincidence sorting. Simulated results are contrasted with experimental acquisitions including studies of spatial resolution, sensitivity, scatter fraction and count rates in accordance with the National Electrical Manufacturers Association (NEMA) NU 4-2008 protocol. Spatial resolution results showed a discrepancy between simulated and measured values equal to 8.4% (with a maximum FWHM difference over all measurement directions of 0.5 mm). Sensitivity results differ by less than 1% for a 250-750 keV energy window.
Simulated and measured count rates agree well within a wide range of activities, including under electronic saturation of the system (the measured peak of total coincidences, for the mouse-sized phantom, was 250.8 kcps reached at 0.95 MBq mL-1 and the simulated peak was

  8. Simulation of Auger electron emission from nanometer-size gold targets using the Geant4 Monte Carlo simulation toolkit

    NASA Astrophysics Data System (ADS)

    Incerti, S.; Suerfu, B.; Xu, J.; Ivantchenko, V.; Mantero, A.; Brown, J. M. C.; Bernal, M. A.; Francis, Z.; Karamitros, M.; Tran, H. N.

    2016-04-01

A revised atomic deexcitation framework for the Geant4 general-purpose Monte Carlo toolkit, capable of simulating full Auger deexcitation cascades, was implemented in the June 2015 release (version 10.2 Beta). An overview of this refined framework and testing of its capabilities are presented for the irradiation of gold nanoparticles (NP) with keV photon and MeV proton beams. The resultant energy spectra of secondary particles created within, and escaping from, the NP are analyzed and discussed. It is anticipated that this new functionality will improve and increase the use of Geant4 in medical physics, radiobiology, nanomedicine research and other low-energy physics fields.

  9. The GeantV project: Preparing the future of simulation

    DOE PAGES

Amadio, G.; Apostolakis, J.; Bandieramonte, M.; ...

    2015-12-23

Detector simulation is consuming at least half of the HEP computing cycles, and even so, experiments have to take hard decisions on what to simulate, as their needs greatly surpass the availability of computing resources. New experiments still in the design phase such as FCC, CLIC and ILC as well as upgraded versions of the existing LHC detectors will push further the simulation requirements. Since the increase in computing resources is not likely to keep pace with our needs, it is therefore necessary to explore innovative ways of speeding up simulation in order to sustain the progress of High Energy Physics. The GeantV project aims at developing a high performance detector simulation system integrating fast and full simulation that can be ported on different computing architectures, including CPU accelerators. After more than two years of R&D the project has produced a prototype capable of transporting particles in complex geometries exploiting micro-parallelism, SIMD and multithreading. Portability is obtained via C++ template techniques that allow the development of machine-independent computational kernels. Furthermore, a set of tables derived from Geant4 for cross sections and final states provides a realistic shower development and, having been ported into a Geant4 physics list, can be used as a basis for a direct performance comparison.

  10. Monte carlo simulations of the n_TOF lead spallation target with the Geant4 toolkit: A benchmark study

    NASA Astrophysics Data System (ADS)

    Lerendegui-Marco, J.; Cortés-Giraldo, M. A.; Guerrero, C.; Quesada, J. M.; Meo, S. Lo; Massimi, C.; Barbagallo, M.; Colonna, N.; Mancussi, D.; Mingrone, F.; Sabaté-Gilarte, M.; Vannini, G.; Vlachoudis, V.; Aberle, O.; Andrzejewski, J.; Audouin, L.; Bacak, M.; Balibrea, J.; Bečvář, F.; Berthoumieux, E.; Billowes, J.; Bosnar, D.; Brown, A.; Caamaño, M.; Calviño, F.; Calviani, M.; Cano-Ott, D.; Cardella, R.; Casanovas, A.; Cerutti, F.; Chen, Y. H.; Chiaveri, E.; Cortés, G.; Cosentino, L.; Damone, L. A.; Diakaki, M.; Domingo-Pardo, C.; Dressler, R.; Dupont, E.; Durán, I.; Fernández-Domínguez, B.; Ferrari, A.; Ferreira, P.; Finocchiaro, P.; Göbel, K.; Gómez-Hornillos, M. B.; García, A. R.; Gawlik, A.; Gilardoni, S.; Glodariu, T.; Gonçalves, I. F.; González, E.; Griesmayer, E.; Gunsing, F.; Harada, H.; Heinitz, S.; Heyse, J.; Jenkins, D. G.; Jericha, E.; Käppeler, F.; Kadi, Y.; Kalamara, A.; Kavrigin, P.; Kimura, A.; Kivel, N.; Kokkoris, M.; Krtička, M.; Kurtulgil, D.; Leal-Cidoncha, E.; Lederer, C.; Leeb, H.; Lonsdale, S. J.; Macina, D.; Marganiec, J.; Martínez, T.; Masi, A.; Mastinu, P.; Mastromarco, M.; Maugeri, E. A.; Mazzone, A.; Mendoza, E.; Mengoni, A.; Milazzo, P. M.; Musumarra, A.; Negret, A.; Nolte, R.; Oprea, A.; Patronis, N.; Pavlik, A.; Perkowski, J.; Porras, I.; Praena, J.; Radeck, D.; Rauscher, T.; Reifarth, R.; Rout, P. C.; Rubbia, C.; Ryan, J. A.; Saxena, A.; Schillebeeckx, P.; Schumann, D.; Smith, A. G.; Sosnin, N. V.; Stamatopoulos, A.; Tagliente, G.; Tain, J. L.; Tarifeño-Saldivia, A.; Tassan-Got, L.; Valenta, S.; Variale, V.; Vaz, P.; Ventura, A.; Vlastou, R.; Wallner, A.; Warren, S.; Woods, P. J.; Wright, T.; Žugec, P.

    2017-09-01

    Monte Carlo (MC) simulations are an essential tool to determine fundamental features of a neutron beam, such as the neutron flux or the γ-ray background, that sometimes cannot be measured, or at least not in every position or energy range. Until recently, the most widely used MC codes in this field had been MCNPX and FLUKA. However, the Geant4 toolkit has also become a competitive code for the transport of neutrons after the development of the native Geant4 format for neutron data libraries, G4NDL. In this context, we present the Geant4 simulations of the neutron spallation target of the n_TOF facility at CERN, done with version 10.1.1 of the toolkit. The first goal was the validation of the intra-nuclear cascade models implemented in the code using, as benchmark, the characteristics of the neutron beam measured at the first experimental area (EAR1), especially the neutron flux and energy distribution, and the time distribution of neutrons of equal kinetic energy, the so-called Resolution Function. The second goal was the development of a Monte Carlo tool aimed at providing useful calculations for both the analysis and planning of the upcoming measurements at the new experimental area (EAR2) of the facility.
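    At a time-of-flight facility, the neutron kinetic energy is reconstructed from the flight path L and the arrival time t. A minimal relativistic sketch in Python (the 185 m flight path is the approximate EAR1 scale; the timing value is purely illustrative):

```python
import math

C = 299_792_458.0        # speed of light, m/s
M_N = 939.565_420e6      # neutron rest energy, eV

def kinetic_energy_ev(flight_path_m, tof_s):
    """Relativistic kinetic energy of a neutron from its time of flight."""
    beta = flight_path_m / (tof_s * C)
    if beta >= 1.0:
        raise ValueError("superluminal: check flight path and time")
    gamma = 1.0 / math.sqrt(1.0 - beta * beta)
    return M_N * (gamma - 1.0)

# Example: a neutron covering 185 m in 4 ms is epithermal, roughly 11 eV.
e = kinetic_energy_ev(185.0, 4e-3)
```

    At such low velocities the relativistic and classical results coincide to many digits; the relativistic form matters at the MeV end of the spectrum.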

  11. Use of SRIM and Garfield with Geant4 for the characterization of a hybrid 10B/3He neutron detector

    NASA Astrophysics Data System (ADS)

    van der Ende, B. M.; Rand, E. T.; Erlandson, A.; Li, L.

    2018-06-01

    This paper describes a method for more complete neutron detector characterization, using Geant4 Monte Carlo methods to characterize the overall detector response rate and Garfield interfaced with SRIM to simulate the detector's raw pulses, as applied to a hybrid 10B/3He detector. The Geant4 models characterizing the detector's interaction with a 252Cf point source and with parallel beams of mono-energetic neutrons (assuming ISO 8529 reference energy values) agree with calibrated 252Cf measurements to within 6.4%. Validated Geant4 model outputs serve as input to Garfield+SRIM calculations to provide meaningful pulse height spectra. Modifications to Garfield were necessary to account for the simultaneous tracking of electrons produced by the proton and triton reaction products of a single 3He neutron capture event, and it was further necessary to interface Garfield with the energy loss, range, and straggling calculations provided by SRIM. Individual raw pulses generated by Garfield+SRIM are also observed to agree well with experimentally measured raw pulses from the detector.
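    The proton and triton in question come from the 3He(n,p)3H capture reaction, whose Q-value (≈764 keV, from standard nuclear data tables) is shared between the two back-to-back products in inverse proportion to their masses. A sketch of the kinematics:

```python
Q_KEV = 764.0        # 3He(n,p)3H reaction Q-value, keV (standard tables)
M_P, M_T = 1.0, 3.0  # proton and triton masses in atomic mass units (approx.)

def product_energies(q_kev=Q_KEV):
    """Non-relativistic two-body split for thermal capture: the products
    carry equal and opposite momenta, so each takes a share of Q inversely
    proportional to its mass."""
    e_p = q_kev * M_T / (M_P + M_T)   # proton carries the larger share
    e_t = q_kev * M_P / (M_P + M_T)
    return e_p, e_t

e_p, e_t = product_energies()   # 573 keV proton, 191 keV triton
```

    These two energies set the lengths of the proton and triton tracks that the Garfield+SRIM stage must follow simultaneously.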

  12. BRDF profile of Tyvek and its implementation in the Geant4 simulation toolkit.

    PubMed

    Nozka, Libor; Pech, Miroslav; Hiklova, Helena; Mandat, Dusan; Hrabovsky, Miroslav; Schovanek, Petr; Palatka, Miroslav

    2011-02-28

    Diffuse and specular characteristics of the Tyvek 1025-BL material are reported with respect to their implementation in the Geant4 Monte Carlo simulation toolkit. This toolkit incorporates the UNIFIED model. Coefficients defined by the UNIFIED model were calculated from the bidirectional reflectance distribution function (BRDF) profiles measured with a scatterometer for several angles of incidence. Results were amended with profile measurements made by a profilometer.

  13. Simulation of Auger electron emission from nanometer-size gold targets using the Geant4 Monte Carlo simulation toolkit

    DOE PAGES

    Incerti, S.; Suerfu, B.; Xu, J.; ...

    2016-02-16

    We report that a revised atomic deexcitation framework for the Geant4 general purpose Monte Carlo toolkit, capable of simulating full Auger deexcitation cascades, was implemented in the June 2015 release (version 10.2 Beta). An overview of this refined framework and testing of its capabilities is presented for the irradiation of gold nanoparticles (NP) with keV photon and MeV proton beams. The resultant energy spectra of secondary particles created within the NP, and of those that escape it, are analyzed and discussed. It is anticipated that this new functionality will improve and broaden the use of Geant4 in medical physics, radiobiology, nanomedicine research and other low-energy physics fields.

  14. A macroscopic and microscopic study of radon exposure using Geant4 and MCNPX to estimate dose rates and DNA damage

    NASA Astrophysics Data System (ADS)

    van den Akker, Mary Evelyn

    Radon is considered the second-leading cause of lung cancer after smoking. Epidemiological studies have been conducted in miner cohorts as well as general populations to estimate the risks associated with high and low dose exposures. There are problems with extrapolating risk estimates to low dose exposures, mainly that the dose-response curve at low doses is not well understood. Calculated dosimetric quantities give average energy depositions in an organ or a whole body, but morphological features of an individual can affect these values. As opposed to human phantom models, Computed Tomography (CT) scans provide unique, patient-specific geometries that are valuable in modeling the radiological effects of the short-lived radon progeny sources. The Monte Carlo particle transport code Geant4 was used with the CT scan data to model radon inhalation in the main bronchial bifurcation. The equivalent dose rates are near the lower bounds of estimates found in the literature, depending on source volume. To complement the macroscopic study, simulations were run in a small tissue volume using the Geant4-DNA toolkit. Geant4-DNA, an expansion of Geant4 meant to simulate direct physical interactions at the cellular level, allows the particle track structure of the radon progeny alphas to be analyzed to estimate the damage that can occur in sensitive cellular structures like the DNA molecule. These estimates of DNA double strand breaks are lower than those found in other Geant4-DNA studies. Further refinements of the microscopic model are at the cutting edge of nanodosimetry research.

  15. Comparison of hadron shower data in the PAMELA experiment with Geant 4 simulations

    NASA Astrophysics Data System (ADS)

    Alekseev, V. V.; Dunaeva, O. A.; Bogomolov, Yu V.; Lukyanov, A. D.; Malakhov, V. V.; Mayorov, A. G.; Rodenko, S. A.

    2017-01-01

    The sampling imaging electromagnetic calorimeter of ≈ 16.3 radiation lengths and ≈ 0.6 nuclear interaction lengths was designed and constructed by the PAMELA collaboration as a part of the large magnetic spectrometer PAMELA. The calorimeter consists of 44 single-sided silicon sensor planes interleaved with 22 plates of tungsten absorber (each tungsten layer 0.26 cm thick). The silicon planes are composed of a 3 × 3 matrix of silicon detectors, each segmented into 32 read-out strips with a pitch of 2.4 mm. The strips of two consecutive layers are mutually orthogonal and therefore provide two-dimensional spatial information. Due to the high granularity, the development of hadronic showers can be studied with good precision. In this work, Monte Carlo simulations (based on Geant4) performed using different available models, and including detector and physical effects, are compared with experimental data obtained in near-Earth orbit. The response of the PAMELA calorimeter to hadronic showers is investigated, including the total energy release in the calorimeter and transverse shower profile characteristics.

  16. Comparison of electromagnetic and hadronic models generated using Geant 4 with antiproton dose measured in CERN.

    PubMed

    Tavakoli, Mohammad Bagher; Reiazi, Reza; Mohammadi, Mohammad Mehdi; Jabbari, Keyvan

    2015-01-01

    After the idea of antiproton cancer treatment was proposed in 1984, many experiments were launched to investigate different aspects of the physical and radiobiological properties of antiprotons, which derive from their annihilation reactions. One of these experiments has been carried out at the European Organization for Nuclear Research (CERN) using the Antiproton Decelerator. The ultimate goal of this experiment was to assess the dosimetric and radiobiological properties of beams of antiprotons in order to estimate the suitability of antiprotons for radiotherapy. One difficulty was the long-term unavailability of the antiproton beam at CERN, so verification of Monte Carlo codes that can simulate the antiproton depth dose would be useful. Among available simulation codes, Geant4 provides acceptable flexibility and extensibility, which has progressively led to the development of novel Geant4 applications in research domains, especially modeling the biological effects of ionizing radiation at the sub-cellular scale. In this study, the depth dose corresponding to the CERN antiproton beam energy was calculated with Geant4 using all the standard physics lists currently available and benchmarked for other use cases. Overall, none of the standard physics lists was able to reproduce the antiproton percentage depth dose. Although with some models our results were promising, the Bragg peak level remained a point of concern for our study. It is concluded that the Bertini model with high precision neutron tracking (QGSP_BERT_HP) matches the experimental data best, though it is also the slowest model among the physics lists.

  17. GEANT4 benchmark with MCNPX and PHITS for activation of concrete

    NASA Astrophysics Data System (ADS)

    Tesse, Robin; Stichelbaut, Frédéric; Pauly, Nicolas; Dubus, Alain; Derrien, Jonathan

    2018-02-01

    The activation of concrete is a real problem from the point of view of waste management. Because of the complexity of the issue, Monte Carlo (MC) codes have become an essential tool for its study. However, various MC codes, and various nuclear models within them, are available. MCNPX and PHITS have already been validated for shielding studies, but GEANT4 is also a suitable solution. In these codes, different models can be considered for a concrete activation study. The Bertini model is not the best model for spallation, while the BIC and INCL models agree well with previous results in the literature.

  18. Nuclear spectroscopy with Geant4: Proton and neutron emission & radioactivity

    NASA Astrophysics Data System (ADS)

    Sarmiento, L. G.; Rudolph, D.

    2016-07-01

    With the aid of a novel combination of existing equipment - JYFLTRAP and the TASISpec decay station - it is possible to perform very clean quantum-state selective, high-resolution particle-γ decay spectroscopy. We intend to study the determination of the branching ratio of the ℓ = 9 proton emission from the Iπ = 19/2-, 3174-keV isomer in the N = Z - 1 nucleus 53Co. The study aims to initiate a series of similar experiments along the proton dripline, thereby providing unique insights into "open quantum systems". The technique has been pioneered in case studies using SHIPTRAP and TASISpec at GSI. Newly available radioactive decay modes in Geant4 simulations are going to corroborate the anticipated experimental results.

  19. SU-E-T-05: Comparing DNA Strand Break Yields for Photons under Different Irradiation Conditions with Geant4-DNA.

    PubMed

    Pater, P; Bernal, M; El Naqa, I; Seuntjens, J

    2012-06-01

    To validate and scrutinize published DNA strand break data with Geant4-DNA and a probabilistic model. To study the impact of source size, electronic equilibrium and secondary electron tracking cutoff on direct relative biological effectiveness (DRBE). Geant4 (v4.9.5) was used to simulate a cylindrical region of interest (ROI) with r = 15 nm and length = 1.05 mm, in a slab of liquid water of 1.06 g/cm3 density. The ROI was irradiated with mono-energetic photons, with a uniformly distributed volumetric isotropic source (0.28, 1.5 keV) or a plane beam (0.662, 1.25 MeV), of variable size. Electrons were tracked down to 50 or 10 eV, with G4-DNA processes, and energy transfer greater than 10.79 eV was scored. Based on volume ratios, each scored event had a 0.0388 probability of happening on either DNA helix (break). Clusters of at least one break on each DNA helix within 3.4 nm were found using a DBSCAN algorithm and categorized as double strand breaks (DSB). All other events were categorized as single strand breaks (SSB). Geant4-DNA is able to reproduce strand break yields previously published. Homogeneous irradiation conditions should be present throughout the ROI for DRBE comparisons. SSB yields seem slightly dependent on the primary photon energy. DRBEs show a significant increasing trend for lower energy incident photons. A lower electron cutoff produces higher SSB yields, but decreases the SSB/DSB yields ratio. The probabilistic and geometrical DNA models can predict equivalent results. Using Geant4, we were able to reproduce previously published results on the direct strand break yields of photons and to study the importance of irradiation conditions. We also show an ascending trend for DRBE with lower incident photon energies. A probabilistic model coupled with track structure analysis can be used to simulate strand break yields. NSERC, CIHR. © 2012 American Association of Physicists in Medicine.
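    The SSB/DSB classification can be sketched as a one-dimensional pass over scored break positions: breaks on opposite helices within 3.4 nm pair into a DSB, the rest count as SSBs. This greedy pairing is a simplified stand-in for the DBSCAN clustering used in the study; the break positions below are illustrative:

```python
def classify_breaks(breaks, dsb_distance=3.4):
    """breaks: list of (position_nm, strand) with strand 0 or 1.
    Pair each break with the first unpaired break on the opposite strand
    within dsb_distance; pairs count as DSBs, leftovers as SSBs."""
    breaks = sorted(breaks)
    used = [False] * len(breaks)
    dsb = ssb = 0
    for i, (pos, strand) in enumerate(breaks):
        if used[i]:
            continue
        for j in range(i + 1, len(breaks)):
            pos_j, strand_j = breaks[j]
            if pos_j - pos > dsb_distance:
                break          # sorted order: no later break can be close enough
            if not used[j] and strand_j != strand:
                used[i] = used[j] = True
                dsb += 1
                break
        if not used[i]:
            used[i] = True
            ssb += 1
    return ssb, dsb

# Breaks at 0 and 2 nm on opposite strands pair into one DSB;
# the isolated break at 20 nm stays a SSB.
counts = classify_breaks([(0.0, 0), (2.0, 1), (20.0, 0)])   # (1, 1)
```

    A full DBSCAN would also merge larger clusters of nearby breaks; for well-separated damage sites the two approaches coincide.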

  20. SU-E-T-531: Performance Evaluation of Multithreaded Geant4 for Proton Therapy Dose Calculations in a High Performance Computing Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shin, J; Coss, D; McMurry, J

    Purpose: To evaluate the efficiency of multithreaded Geant4 (Geant4-MT, version 10.0) for proton Monte Carlo dose calculations using a high performance computing facility. Methods: Geant4-MT was used to calculate 3D dose distributions in 1×1×1 mm3 voxels in a water phantom and a patient's head with a 150 MeV proton beam covering approximately 5×5 cm2 in the water phantom. Three timestamps were measured on the fly to separately analyze the required time for initialization (which cannot be parallelized), the processing time of individual threads, and the completion time. Scalability of the averaged processing time per thread was calculated as a function of thread number (1, 100, 150, and 200) for both 1 M and 50 M histories. The total memory usage was recorded. Results: Simulations with 50 M histories were fastest with 100 threads, taking approximately 1.3 hours and 6 hours for the water phantom and the CT data, respectively, with better than 1.0% statistical uncertainty. The calculations show 1/N scalability in the event loops for both cases. The gains from parallel calculations started to decrease with 150 threads. The memory usage increases linearly with the number of threads. No critical failures were observed during the simulations. Conclusion: Multithreading in Geant4-MT decreased simulation time in proton dose distribution calculations by a factor of 64 and 54 at a near-optimal 100 threads for the water phantom and patient's data, respectively. Further simulations will be done to determine the efficiency at the optimal thread number. Considering the trend of computer architecture development, utilizing Geant4-MT for radiotherapy simulations is an excellent cost-effective alternative to a distributed batch queuing system. However, because the scalability depends highly on simulation details, i.e., the ratio of the processing time of one event versus the waiting time to access the shared event queue, a performance evaluation as described is recommended.
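    The saturation of the gains above 100-150 threads is the classic Amdahl picture: the serial initialization phase bounds the achievable speedup regardless of thread count. A sketch (the 1% serial fraction is illustrative, not a value measured in this work):

```python
def amdahl_speedup(serial_fraction, n_threads):
    """Ideal speedup when a fixed fraction of the run cannot be parallelized
    and the rest scales as 1/N (Amdahl's law)."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_threads)

# With even 1% serial work, the speedup saturates well below the thread count:
s100 = amdahl_speedup(0.01, 100)   # about 50
s200 = amdahl_speedup(0.01, 200)   # about 67, not 2x the 100-thread figure
```

    The same curve explains why doubling threads past the knee of the curve buys comparatively little, while memory cost keeps growing linearly.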

  1. Software Design Description for the Navy Coastal Ocean Model (NCOM) Version 4.0

    DTIC Science & Technology

    2008-12-31

    Naval Research Laboratory, Stennis Space Center, MS 39529-5004. NRL/MR/7320--08-9149. Approved for public release; distribution is unlimited. Software Design Description for the Navy Coastal Ocean Model (NCOM) Version 4.0. Paul Martin, Charlie N. Barron, Lucy F

  2. GEANT4 Simulation of Hadronic Interactions at 8-GeV/C to 10-GeV/C: Response to the HARP-CDP Group

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Uzhinsky, V.; /Dubna, JINR /CERN; Apostolakis, J.

    2011-11-21

    The results of the HARP-CDP group on the comparison of GEANT4 Monte Carlo predictions versus experimental data are discussed. It is shown that the problems observed by the group are caused by an incorrect implementation of old features at the programming level, and by a lack of the nucleon Fermi motion in the simulation of quasielastic scattering. These drawbacks are not due to the physical models used. They do not manifest themselves in the most important applications of the GEANT4 toolkit.

  3. Diffusion-controlled reactions modeling in Geant4-DNA

    NASA Astrophysics Data System (ADS)

    Karamitros, M.; Luan, S.; Bernal, M. A.; Allison, J.; Baldacchino, G.; Davidkova, M.; Francis, Z.; Friedland, W.; Ivantchenko, V.; Ivantchenko, A.; Mantero, A.; Nieminen, P.; Santin, G.; Tran, H. N.; Stepan, V.; Incerti, S.

    2014-10-01

    Context: Under irradiation, a biological system undergoes a cascade of chemical reactions that can lead to an alteration of its normal operation. There are different types of radiation and many competing reactions. As a result the kinetics of chemical species is extremely complex. Simulation then becomes a powerful tool which, by describing the basic principles of chemical reactions, can reveal the dynamics of the macroscopic system. To understand the dynamics of biological systems under radiation, since the 80s there have been on-going efforts carried out by several research groups to establish a mechanistic model that consists of describing all the physical, chemical and biological phenomena following the irradiation of single cells. This approach is generally divided into a succession of stages that follow each other in time: (1) the physical stage, where the ionizing particles interact directly with the biological material; (2) the physico-chemical stage, where the targeted molecules release their energy by dissociating, creating new chemical species; (3) the chemical stage, where the new chemical species interact with each other or with the biomolecules; (4) the biological stage, where the repairing mechanisms of the cell come into play. This article focuses on the modeling of the chemical stage. Method: This article presents a general method of speeding-up chemical reaction simulations in fluids based on the Smoluchowski equation and Monte-Carlo methods, where all molecules are explicitly simulated and the solvent is treated as a continuum. The model describes diffusion-controlled reactions. This method has been implemented in Geant4-DNA. The keys to the new algorithm include: (1) the combination of a method to compute time steps dynamically with a Brownian bridge process to account for chemical reactions, which avoids costly fixed time step simulations; (2) a k-d tree data structure for quickly locating, for a given molecule, its closest reactants.

  4. Diffusion-controlled reactions modeling in Geant4-DNA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karamitros, M., E-mail: matkara@gmail.com; CNRS, INCIA, UMR 5287, F-33400 Talence; Luan, S.

    2014-10-01

    Context: Under irradiation, a biological system undergoes a cascade of chemical reactions that can lead to an alteration of its normal operation. There are different types of radiation and many competing reactions. As a result the kinetics of chemical species is extremely complex. Simulation then becomes a powerful tool which, by describing the basic principles of chemical reactions, can reveal the dynamics of the macroscopic system. To understand the dynamics of biological systems under radiation, since the 80s there have been on-going efforts carried out by several research groups to establish a mechanistic model that consists of describing all the physical, chemical and biological phenomena following the irradiation of single cells. This approach is generally divided into a succession of stages that follow each other in time: (1) the physical stage, where the ionizing particles interact directly with the biological material; (2) the physico-chemical stage, where the targeted molecules release their energy by dissociating, creating new chemical species; (3) the chemical stage, where the new chemical species interact with each other or with the biomolecules; (4) the biological stage, where the repairing mechanisms of the cell come into play. This article focuses on the modeling of the chemical stage. Method: This article presents a general method of speeding-up chemical reaction simulations in fluids based on the Smoluchowski equation and Monte-Carlo methods, where all molecules are explicitly simulated and the solvent is treated as a continuum. The model describes diffusion-controlled reactions. This method has been implemented in Geant4-DNA. The keys to the new algorithm include: (1) the combination of a method to compute time steps dynamically with a Brownian bridge process to account for chemical reactions, which avoids costly fixed time step simulations; (2) a k-d tree data structure for quickly locating, for a given molecule, its closest
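    A k-d tree of the kind described keeps the closest-reactant query sublinear, which matters when every molecule must find its nearest reaction partner at every time step. A minimal three-dimensional version (an illustrative sketch, not the Geant4-DNA implementation):

```python
import math

def build_kdtree(points, depth=0):
    """Recursively build a 3-D k-d tree; each node is (point, left, right)."""
    if not points:
        return None
    axis = depth % 3
    points = sorted(points, key=lambda p: p[axis])
    mid = len(points) // 2
    return (points[mid],
            build_kdtree(points[:mid], depth + 1),
            build_kdtree(points[mid + 1:], depth + 1))

def nearest(node, target, depth=0, best=None):
    """Return the stored point closest to target (Euclidean distance)."""
    if node is None:
        return best
    point, left, right = node
    if best is None or math.dist(point, target) < math.dist(best, target):
        best = point
    axis = depth % 3
    diff = target[axis] - point[axis]
    near, far = (left, right) if diff < 0 else (right, left)
    best = nearest(near, target, depth + 1, best)
    # Descend the far side only if the splitting plane is closer than best.
    if abs(diff) < math.dist(best, target):
        best = nearest(far, target, depth + 1, best)
    return best
```

    Brute force costs O(n) per lookup; the tree brings the expected cost to O(log n) per query after an O(n log n) build.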

  5. Analysis of the track- and dose-averaged LET and LET spectra in proton therapy using the Geant4 Monte Carlo code

    PubMed Central

    Guan, Fada; Peeler, Christopher; Bronk, Lawrence; Geng, Changran; Taleei, Reza; Randeniya, Sharmalee; Ge, Shuaiping; Mirkovic, Dragan; Grosshans, David; Mohan, Radhe; Titt, Uwe

    2015-01-01

    Purpose: The motivation of this study was to find and eliminate the cause of errors in dose-averaged linear energy transfer (LET) calculations from therapeutic protons in small targets, such as biological cell layers, calculated using the Geant4 Monte Carlo code. Furthermore, the purpose was also to provide a recommendation to select an appropriate LET quantity from Geant4 simulations to correlate with biological effectiveness of therapeutic protons. Methods: The authors developed a particle tracking step based strategy to calculate the average LET quantities (track-averaged LET, LETt, and dose-averaged LET, LETd) using Geant4 for different tracking step size limits. A step size limit refers to the maximally allowable tracking step length. The authors investigated how the tracking step size limit influenced the calculated LETt and LETd of protons with six different step limits ranging from 1 to 500 μm in a water phantom irradiated by a 79.7-MeV clinical proton beam. In addition, the authors analyzed the detailed stochastic energy deposition information including fluence spectra and dose spectra of the energy-deposition-per-step of protons. As a reference, the authors also calculated the averaged LET and analyzed the LET spectra combining the Monte Carlo method and the deterministic method. Relative biological effectiveness (RBE) calculations were performed to illustrate the impact of different LET calculation methods on the RBE-weighted dose. Results: Simulation results showed that the step limit effect was small for LETt but significant for LETd. This resulted from differences in the energy-deposition-per-step between the fluence spectra and dose spectra at different depths in the phantom. Using the Monte Carlo particle tracking method in Geant4 can result in incorrect LETd calculation results in the dose plateau region for small step limits. The erroneous LETd results can be attributed to the algorithm to determine fluctuations in energy deposition along the
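    The difference between the two averages can be made concrete: the track average weights each step's LET = ΔE/Δx by the step length, while the dose average weights it by the energy deposited. A sketch on two synthetic steps (units are illustrative):

```python
def track_averaged_let(steps):
    """steps: list of (energy_deposit, step_length) pairs.
    Weighting each step's LET by length reduces to total energy
    over total path length."""
    return sum(de for de, dl in steps) / sum(dl for de, dl in steps)

def dose_averaged_let(steps):
    """Weight each step's LET = dE/dx by its energy deposit dE."""
    return (sum(de * (de / dl) for de, dl in steps)
            / sum(de for de, dl in steps))

# Two 1-unit-long steps, one quiet and one dense (say keV and um):
steps = [(1.0, 1.0), (10.0, 1.0)]
lt = track_averaged_let(steps)   # (1 + 10) / 2 = 5.5
ld = dose_averaged_let(steps)    # (1*1 + 10*10) / 11, roughly 9.2
```

    LETd is never below LETt, and the gap grows with the spread of the energy-deposition-per-step distribution, which is exactly why LETd is the quantity sensitive to the tracking step limit.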

  6. Introducing Third-Year Undergraduates to GEANT4 Simulations of Light Transport and Collection in Scintillation Materials

    ERIC Educational Resources Information Center

    Riggi, Simone; La Rocca, Paola; Riggi, Francesco

    2011-01-01

    GEANT4 simulations of the processes affecting the transport and collection of optical photons generated inside a scintillation detector were carried out, with the aim to complement the educational material offered by textbooks to third-year physics undergraduates. Two typical situations were considered: a long scintillator strip with and without a…

  7. Performance of GeantV EM Physics Models

    NASA Astrophysics Data System (ADS)

    Amadio, G.; Ananya, A.; Apostolakis, J.; Aurora, A.; Bandieramonte, M.; Bhattacharyya, A.; Bianchini, C.; Brun, R.; Canal, P.; Carminati, F.; Cosmo, G.; Duhem, L.; Elvira, D.; Folger, G.; Gheata, A.; Gheata, M.; Goulas, I.; Iope, R.; Jun, S. Y.; Lima, G.; Mohanty, A.; Nikitina, T.; Novak, M.; Pokorski, W.; Ribon, A.; Seghal, R.; Shadura, O.; Vallecorsa, S.; Wenzel, S.; Zhang, Y.

    2017-10-01

    The recent progress in parallel hardware architectures, with deeper vector pipelines and many-core technologies, brings opportunities for HEP experiments to take advantage of SIMD and SIMT computing models. Launched in 2013, the GeantV project studies performance gains from propagating multiple particles in parallel, improving instruction throughput and data locality in HEP event simulation on modern parallel hardware architectures. Due to the complexity of the geometry description and physics algorithms of a typical HEP application, performance analysis is indispensable in identifying factors limiting parallel execution. In this report, we present design considerations and the preliminary computing performance of GeantV physics models on coprocessors (Intel Xeon Phi and NVidia GPUs) as well as on mainstream CPUs.

  8. WinTRAX: A raytracing software package for the design of multipole focusing systems

    NASA Astrophysics Data System (ADS)

    Grime, G. W.

    2013-07-01

    The software package TRAX was a simulation tool for modelling the path of charged particles through linear cylindrical multipole fields described by analytical expressions and was a development of the earlier OXRAY program (Grime and Watt, 1983; Grime et al., 1982) [1,2]. In a 2005 comparison of raytracing software packages (Incerti et al., 2005) [3], TRAX/OXRAY was compared with Geant4 and Zgoubi and was found to give close agreement with the more modern codes. TRAX was a text-based program which was only available for operation in a now rare VMS workstation environment, so a new program, WinTRAX, has been developed for the Windows operating system. This implements the same basic computing strategy as TRAX, and key sections of the code are direct translations from FORTRAN to C++, but the Windows environment is exploited to make an intuitive graphical user interface which simplifies and enhances many operations including system definition and storage, optimisation, beam simulation (including with misaligned elements) and aberration coefficient determination. This paper describes the program and presents comparisons with other software and real installations.
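    At first order, raytracing through focusing systems of the kind WinTRAX models reduces to 2×2 transfer matrices acting on (position, angle) in each transverse plane. A minimal thin-lens sketch (not WinTRAX code; the focal length and drift length are illustrative):

```python
def drift(length):
    """Field-free drift of the given length (metres)."""
    return [[1.0, length], [0.0, 1.0]]

def thin_quad(f):
    """Thin-lens quadrupole in one transverse plane: focusing for f > 0,
    defocusing for f < 0 (the other plane sees the opposite sign)."""
    return [[1.0, 0.0], [-1.0 / f, 1.0]]

def trace(ray, elements):
    """Propagate ray = (position, angle) through elements in beam order."""
    x, xp = ray
    for m in elements:
        x, xp = m[0][0] * x + m[0][1] * xp, m[1][0] * x + m[1][1] * xp
    return x, xp

# A ray parallel to the axis at x = 1 cm, through a lens with f = 0.5 m,
# crosses the axis after a 0.5 m drift:
x, xp = trace((0.01, 0.0), [thin_quad(0.5), drift(0.5)])   # x = 0
```

    Full codes such as TRAX/WinTRAX, Geant4 and Zgoubi go beyond this linear picture by integrating through realistic fringe fields and higher-order aberrations, which is what the cited comparisons test.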

  9. Analysis of the track- and dose-averaged LET and LET spectra in proton therapy using the GEANT4 Monte Carlo code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guan, Fada; Peeler, Christopher; Taleei, Reza

    Purpose: The motivation of this study was to find and eliminate the cause of errors in dose-averaged linear energy transfer (LET) calculations from therapeutic protons in small targets, such as biological cell layers, calculated using the Geant4 Monte Carlo code. Furthermore, the purpose was also to provide a recommendation to select an appropriate LET quantity from Geant4 simulations to correlate with biological effectiveness of therapeutic protons. Methods: The authors developed a particle tracking step based strategy to calculate the average LET quantities (track-averaged LET, LETt, and dose-averaged LET, LETd) using Geant4 for different tracking step size limits. A step size limit refers to the maximally allowable tracking step length. The authors investigated how the tracking step size limit influenced the calculated LETt and LETd of protons with six different step limits ranging from 1 to 500 μm in a water phantom irradiated by a 79.7-MeV clinical proton beam. In addition, the authors analyzed the detailed stochastic energy deposition information including fluence spectra and dose spectra of the energy-deposition-per-step of protons. As a reference, the authors also calculated the averaged LET and analyzed the LET spectra combining the Monte Carlo method and the deterministic method. Relative biological effectiveness (RBE) calculations were performed to illustrate the impact of different LET calculation methods on the RBE-weighted dose. Results: Simulation results showed that the step limit effect was small for LETt but significant for LETd. This resulted from differences in the energy-deposition-per-step between the fluence spectra and dose spectra at different depths in the phantom. Using the Monte Carlo particle tracking method in Geant4 can result in incorrect LETd calculation results in the dose plateau region for small step limits. The erroneous LETd results can be attributed to the algorithm

  10. Modeling the tagged-neutron UXO identification technique using the Geant4 toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Y.; Mitra, S.; Zhu, X.

    2011-10-16

    It is proposed to use 14 MeV neutrons tagged by the associated particle neutron time-of-flight technique (APnTOF) to identify the fillers of unexploded ordnances (UXO) by characterizing their carbon, nitrogen and oxygen contents. To facilitate the design and construction of a prototype system, a preliminary simulation model was developed using the Geant4 toolkit. This work established the toolkit environment for (a) generating tagged neutrons, (b) their transport and interactions within a sample to induce emission and detection of characteristic gamma-rays, and (c) 2D and 3D image reconstruction of the interrogated object using the neutron and gamma-ray time-of-flight information. Using the model, this article demonstrates the ability of the tagged-neutron approach to extract useful signals with high signal-to-background discrimination of an object-of-interest from its environment. Simulations indicated that a UXO filled with the RDX explosive, hexogen (C3H6O6N6), can be identified to a depth of 20 cm when buried in soil.

  11. Methodical Design of Software Architecture Using an Architecture Design Assistant (ArchE)

    DTIC Science & Technology

    2005-04-01

    Methodical Design of Software Architecture Using an Architecture Design Assistant (ArchE), Felix Bachmann and Mark Klein, Software Engineering Institute, Pittsburgh, PA 15213-3890. Quality requirements and constraints are most important for architecture design. Here's some evidence: If the only concern is

  12. GEANT 4 simulation of (99)Mo photonuclear production in nanoparticles.

    PubMed

    Dikiy, N P; Dovbnya, A N; Fedorchenko, D V; Khazhmuradov, M A

    2016-08-01

    The GEANT 4 Monte Carlo simulation toolkit is used to study the kinematic recoil method of (99)Mo photonuclear production. Simulation for a bremsstrahlung photon spectrum with a maximum photon energy of 30 MeV showed that for MoO3 nanoparticles the escape fraction decreases from 0.24 to 0.08 when the nanoparticle size increases from 20 nm to 80 nm. For natural molybdenum and pure (100)Mo we obtained lower values: from 0.17 to 0.05. The generation of accompanying molybdenum nuclei is significantly lower for pure (100)Mo, at about 3.6 nuclei per single (99)Mo nucleus, while natural molybdenum nanoparticles produce about 48 accompanying nuclei. We have also shown that for high-energy photons the escape fraction of (99)Mo decreases, while production of unwanted molybdenum isotopes is significantly higher. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. HSCT4.0 Application: Software Requirements Specification

    NASA Technical Reports Server (NTRS)

    Salas, A. O.; Walsh, J. L.; Mason, B. H.; Weston, R. P.; Townsend, J. C.; Samareh, J. A.; Green, L. L.

    2001-01-01

    The software requirements for the High Performance Computing and Communication Program High Speed Civil Transport application project, referred to as HSCT4.0, are described. The objective of the HSCT4.0 application project is to demonstrate the application of high-performance computing techniques to the problem of multidisciplinary design optimization of a supersonic transport configuration, using high-fidelity analysis simulations. Descriptions of the various functions (and the relationships among them) that make up the multidisciplinary application, as well as the constraints on the software design, are provided. This document serves to establish an agreement between the suppliers and the customer as to what the HSCT4.0 application should do and provides to the software developers the information necessary to design and implement the system.

  14. Calculation of self-shielding factor for neutron activation experiments using GEANT4 and MCNP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Romero–Barrientos, Jaime, E-mail: jaromero@ing.uchile.cl; Universidad de Chile, DFI, Facultad de Ciencias Físicas Y Matemáticas, Avenida Blanco Encalada 2008, Santiago; Molina, F.

    2016-07-07

    The neutron self-shielding factor G as a function of the neutron energy was obtained for 14 pure metallic samples in 1000 isolethargic energy bins from 1·10{sup −5} eV to 2·10{sup 7} eV using Monte Carlo simulations in GEANT4 and MCNP6. The comparison of these two Monte Carlo codes shows small differences in the final self-shielding factor, mostly due to the different cross section databases that each program uses.

  15. Modeling spallation reactions in tungsten and uranium targets with the Geant4 toolkit

    NASA Astrophysics Data System (ADS)

    Malyshkin, Yury; Pshenichnov, Igor; Mishustin, Igor; Greiner, Walter

    2012-02-01

    We study primary and secondary reactions induced by 600 MeV proton beams in monolithic cylindrical targets made of natural tungsten and uranium by using Monte Carlo simulations with the Geant4 toolkit [1-3]. The Bertini intranuclear cascade model, the Binary cascade model, and the IntraNuclear Cascade Liège (INCL) with the ABLA model [4] were used as calculational options to describe nuclear reactions. Fission cross sections, neutron multiplicity and mass distributions of fragments for 238U fission induced by 25.6 and 62.9 MeV protons are calculated and compared to recent experimental data [5]. Time distributions of neutron leakage from the targets and heat depositions are calculated. This project is supported by Siemens Corporate Technology.

  16. Guiding the Design of Radiation Imagers with Experimentally Benchmarked Geant4 Simulations for Electron-Tracking Compton Imaging

    NASA Astrophysics Data System (ADS)

    Coffer, Amy Beth

    -scattered electron-trajectories is with high-resolution Charge-Coupled Devices (CCDs). The proof-of-principle CCD-based ETCI experiment demonstrated the CCDs' ability to measure the Compton-scattered electron-tracks as a 2-dimensional image. Electron-track-imaging algorithms using the electron-track-image are able to determine the 3-dimensional electron-track trajectory within +/- 20 degrees. The work presented here covers the physics simulations developed alongside the proof-of-principle experiment. The development of accurate physics modeling for multiple-layer CCD-based ETCI systems allows for the accurate prediction of future ETCI system performance. The simulations also enable quick development insights for system design, and they guide the development of electron-track reconstruction methods. The physics simulation efforts for this project looked closely at the accuracy of the Geant4 Monte Carlo methods for medium-energy electron transport. In older versions of Geant4 there were some discrepancies between the electron-tracking experimental measurements and the simulation results. It was determined that when comparing electron dynamics at very high resolution, Geant4 simulations must be fine-tuned with careful choices for physics production cuts and electron physics step sizes. One result of this work is a CCD Monte Carlo model that has been benchmarked against experimental findings and fully characterized for both photon and electron transport. The CCD physics model now matches experimental results to within 1 percent error for scattered-electron energies below 500 keV. Following the improvements of the CCD simulations, the performance of a realistic two-layer CCD-stack system was characterized, including the effect of thin passive layers on the CCDs' front face and back contact.
The photon interaction efficiency was calculated for the two-layer CCD-stack, and we found that there is a 90 percent probability of

  17. Validation of a virtual source model of medical linac for Monte Carlo dose calculation using multi-threaded Geant4

    NASA Astrophysics Data System (ADS)

    Aboulbanine, Zakaria; El Khayati, Naïma

    2018-04-01

    The use of phase space in medical linear accelerator Monte Carlo (MC) simulations significantly improves the execution time and leads to results comparable to those obtained from full calculations. The classical representation of phase space stores directly the information of millions of particles, producing bulky files. This paper presents a virtual source model (VSM) based on a reconstruction algorithm, taking as input a compressed file of roughly 800 kb derived from phase space data freely available in the International Atomic Energy Agency (IAEA) database. This VSM includes two main components, primary and scattered particle sources, with a specific reconstruction method developed for each. Energy spectra and other relevant variables were extracted from the IAEA phase space and stored in the input description data file for both sources. The VSM was validated for three photon beams: Elekta Precise 6 MV/10 MV and a Varian TrueBeam 6 MV. Extensive calculations in water and comparisons between dose distributions of the VSM and IAEA phase space were performed to estimate the VSM precision. The Geant4 MC toolkit in multi-threaded mode (Geant4-[mt]) was used for fast dose calculations and optimized memory use. Four field configurations were chosen for dose calculation validation to test field size and symmetry effects: three square fields and one asymmetric rectangular field. Good agreement in terms of the gamma-index formalism, for 3%/3 mm and 2%/3 mm criteria, for each evaluated radiation field and photon beam was obtained within a computation time of 60 h on a single workstation for a 3 mm voxel matrix. Analyzing the VSM's precision in high dose gradient regions, using the distance to agreement concept (DTA), showed also satisfactory results. In all investigated cases, the mean DTA was less than 1 mm in build-up and penumbra regions. In regards to calculation efficiency, the event processing speed is six times faster using Geant4-[mt] compared to sequential
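
    The gamma formalism cited above combines a dose-difference criterion (e.g. 3%) with a distance-to-agreement criterion (e.g. 3 mm) into a single pass/fail index. A brute-force 1-D sketch of the standard gamma-index definition, not the paper's implementation:

```python
import math

def gamma_index_1d(ref, eval_, dx_mm, dose_tol=0.03, dta_mm=3.0):
    """Global 1-D gamma index: for each reference point, search the
    evaluated profile for the point minimizing the combined dose-difference
    and distance penalty. Brute force; dose_tol is a fraction of the
    reference maximum."""
    d_max = max(ref)
    gammas = []
    for i, dr in enumerate(ref):
        best_sq = float("inf")
        for j, de in enumerate(eval_):
            dist_mm = (i - j) * dx_mm
            ddose = de - dr
            g_sq = (dist_mm / dta_mm) ** 2 + (ddose / (dose_tol * d_max)) ** 2
            best_sq = min(best_sq, g_sq)
        gammas.append(math.sqrt(best_sq))
    return gammas  # a point passes the criterion when gamma <= 1
```

    Identical profiles give gamma 0 everywhere; a small dose or position error keeps gamma below 1, which is the agreement criterion the abstract reports.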

  18. Electron-hole pairs generation rate estimation irradiated by isotope Nickel-63 in silicone using GEANT4

    NASA Astrophysics Data System (ADS)

    Kovalev, I. V.; Sidorov, V. G.; Zelenkov, P. V.; Khoroshko, A. Y.; Lelekov, A. T.

    2015-10-01

    To optimize the parameters of a beta-electrical converter of isotope Nickel-63 radiation, a model of the distribution of the EHP generation rate in the semiconductor must be derived. Using Monte Carlo methods in the GEANT4 system with ultra-low-energy electron physics models, this distribution in silicon was calculated and approximated with a Gaussian function. The maximal efficient isotope layer thickness and the maximal energy efficiency of EHP generation were estimated.

  19. Simulations of GCR interactions within planetary bodies using GEANT4

    NASA Astrophysics Data System (ADS)

    Mesick, K.; Feldman, W. C.; Stonehill, L. C.; Coupland, D. D. S.

    2017-12-01

    On planetary bodies with little to no atmosphere, Galactic Cosmic Rays (GCRs) can hit the body and produce neutrons, primarily through nuclear spallation, within the top few meters of the surface. These neutrons undergo further nuclear interactions with elements near the planetary surface, and some will escape the surface and can be detected by landed or orbiting neutron radiation detector instruments. The neutron leakage signal at fast neutron energies provides a measure of the average atomic mass of the near-surface material, and in the epithermal and thermal energy ranges it is highly sensitive to the presence of hydrogen. Gamma-rays can also escape the surface, produced at characteristic energies depending on surface composition, and can be detected by gamma-ray instruments. The intra-nuclear cascade (INC) that occurs when high-energy GCRs interact with elements within a planetary surface to produce the leakage neutron and gamma-ray signals is highly complex, and therefore Monte Carlo based radiation transport simulations are commonly used for predicting and interpreting measurements from planetary neutron and gamma-ray spectroscopy instruments. In the past, the simulation code that has been widely used for this type of analysis is MCNPX [1], which was benchmarked against data from the Lunar Neutron Probe Experiment (LNPE) on Apollo 17 [2]. In this work, we consider the validity of the radiation transport code GEANT4 [3], another widely used, open-source code, by benchmarking simulated predictions of the LNPE experiment against the Apollo 17 data. We consider the impact of different physics model options on the results, and show which models best describe the INC based on agreement with the Apollo 17 data. The success of this validation gives us confidence in using GEANT4 to simulate GCR-induced neutron leakage signals on Mars, relevant to a re-analysis of Mars Odyssey Neutron Spectrometer data. References [1] D.B.
Pelowitz, Los Alamos National Laboratory, LA-CP-05

  20. Development of a software package for solid-angle calculations using the Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Zhang, Jie; Chen, Xiulian; Zhang, Changsheng; Li, Gang; Xu, Jiayun; Sun, Guangai

    2014-02-01

    Solid-angle calculations, which are often complicated, play an important role in the absolute calibration of radioactivity measurement systems and in the determination of the activity of radioactive sources. In the present paper, a software package is developed to provide a convenient tool for solid-angle calculations in nuclear physics. The proposed software calculates solid angles using the Monte Carlo method, in which a new type of variance reduction technique was integrated. The package, developed under the environment of Microsoft Foundation Classes (MFC) in Microsoft Visual C++, has a graphical user interface, in which the visualization function is integrated in conjunction with OpenGL. One advantage of the proposed software package is that it can calculate the solid angle subtended by a detector with different geometric shapes (e.g., cylinder, square prism, regular triangular prism or regular hexagonal prism) to a point, circular or cylindrical source without any difficulty. The results obtained from the proposed software package were compared with those obtained from previous studies and calculated using Geant4. The comparison shows that the proposed software package can produce accurate solid-angle values with a greater computation speed than Geant4.
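
    The core of such a Monte Carlo solid-angle calculation can be sketched in a few lines. The snippet below handles only the simplest case in the paper's list, a coaxial circular disk viewed from an on-axis point source, and is an illustration under that assumption, not the package's algorithm or its variance-reduction technique:

```python
import math
import random

def solid_angle_mc(radius, distance, n=200_000, seed=1):
    """Monte Carlo estimate of the solid angle (sr) subtended by a coaxial
    circular disk of the given radius, seen from an on-axis point source
    `distance` away. Directions are sampled uniformly over the forward
    hemisphere (cos(theta) ~ U(0, 1) is uniform in solid angle)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        cos_t = rng.random()
        sin_t = math.sqrt(1.0 - cos_t * cos_t)
        # the ray crosses the disk plane inside the disk iff tan(theta) <= r/d
        if distance * sin_t <= radius * cos_t:
            hits += 1
    return 2.0 * math.pi * hits / n  # forward hemisphere covers 2*pi sr

# closed-form check for the coaxial-disk case
omega_mc = solid_angle_mc(1.0, 2.0)
omega_exact = 2.0 * math.pi * (1.0 - 2.0 / math.sqrt(2.0**2 + 1.0**2))
```

    For non-axial or prism-shaped detectors the hit test becomes a general ray-shape intersection, which is where the package's generality (and its variance reduction) comes in.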

  1. GGEMS-Brachy: GPU GEant4-based Monte Carlo simulation for brachytherapy applications

    NASA Astrophysics Data System (ADS)

    Lemaréchal, Yannick; Bert, Julien; Falconnet, Claire; Després, Philippe; Valeri, Antoine; Schick, Ulrike; Pradier, Olivier; Garcia, Marie-Paule; Boussion, Nicolas; Visvikis, Dimitris

    2015-07-01

    In brachytherapy, plans are routinely calculated using the AAPM TG43 formalism, which considers the patient as a simple water object. An accurate modeling of the physical processes considering patient heterogeneity using Monte Carlo simulation (MCS) methods is currently too time-consuming and computationally demanding to be routinely used. In this work we implemented and evaluated an accurate and fast MCS on Graphics Processing Units (GPU) for brachytherapy low dose rate (LDR) applications. A previously proposed Geant4-based MCS framework implemented on GPU (GGEMS) was extended to include a hybrid GPU navigator, allowing navigation within voxelized patient-specific images and analytically modeled 125I seeds used in LDR brachytherapy. In addition, dose scoring based on a track length estimator including uncertainty calculations was incorporated. The implemented GGEMS-brachy platform was validated using a comparison with Geant4 simulations and reference datasets. Finally, a comparative dosimetry study based on the current clinical standard (TG43) and the proposed platform was performed on twelve prostate cancer patients undergoing LDR brachytherapy. Considering patient 3D CT volumes of 400 × 250 × 65 voxels and an average of 58 implanted seeds, the mean patient dosimetry study run time for a 2% dose uncertainty was 9.35 s (≈500 ms per 10^6 simulated particles) and 2.5 s when using one and four GPUs, respectively. The performance of the proposed GGEMS-brachy platform allows envisaging the use of Monte Carlo simulation based dosimetry studies in brachytherapy compatible with clinical practice. Although the proposed platform was evaluated for prostate cancer, it is equally applicable to other LDR brachytherapy clinical applications. Future extensions will allow its application in high dose rate brachytherapy applications.

  2. Geant4 Predictions of Energy Spectra in Typical Space Radiation Environment

    NASA Technical Reports Server (NTRS)

    Sabra, M. S.; Barghouty, A. F.

    2014-01-01

    Accurate knowledge of energy spectra inside spacecraft is important for protecting astronauts as well as sensitive electronics from the harmful effects of space radiation. Such knowledge allows one to confidently map the radiation environment inside the vehicle. The purpose of this talk is to present preliminary calculations of energy spectra inside a spherical shell shielding and behind a slab in a typical space radiation environment using the 3D Monte Carlo transport code Geant4. We have simulated proton and iron isotropic sources and beams impinging on aluminum and gallium arsenide (GaAs) targets at energies of 0.2, 0.6, 1, and 10 GeV/u. If time permits, other radiation sources and beams (α, C, O) and targets (C, Si, Ge, water) will be presented. The results are compared to ground-based measurements where available.

  3. A Geant4 simulation of the depth dose percentage in brain tumors treatments using protons and carbon ions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diaz, José A. M., E-mail: joadiazme@unal.edu.co; Torres, D. A., E-mail: datorresg@unal.edu.co

    2016-07-07

    The deposited energy and dose distribution of beams of protons and carbon over a head are simulated using the free tool package Geant4 and the data analysis package ROOT-C++. The present work shows a methodology to understand the microscopical process occurring in a session of hadron-therapy using advance simulation tools.

  4. The 4MOST facility control software

    NASA Astrophysics Data System (ADS)

    Pramskiy, Alexander; Mandel, Holger; Rothmaier, Florian; Stilz, Ingo; Winkler, Roland; Hahn, Thomas

    2016-07-01

    The 4-m Multi-Object Spectroscopic Telescope (4MOST) comprises one high-resolution (R ≈ 18000) and two low-resolution (R ≈ 5000) spectrographs covering the wavelength range between 390 and 950 nm. The spectrographs will be installed on the ESO VISTA telescope and will be fed by approximately 2400 fibres. The instrument is capable of simultaneously obtaining spectra of about 2400 objects distributed over a hexagonal field-of-view of four square degrees. This paper aims at giving an overview of the control software design, which is based on the standard ESO VLT software architecture and customised to fit the needs of the 4MOST instrument. In particular, the facility control software is intended to arrange the precise positioning of the fibres, to schedule and observe many surveys in parallel, and to combine the output from the three spectrographs. Moreover, 4MOST's software will include user-friendly graphical user interfaces that enable users to interact with the facility control system and to monitor all data-taking and calibration tasks of the instrument. A secondary guiding system will be implemented to correct for any fibre flexure and thus to improve 4MOST's guiding performance. The large number of fibres requires a custom design of data exchange to avoid performance issues. The observation sequences are designed to use the spectrographs in parallel, with synchronous points for data exchange between subsystems. In order to control hardware devices, Programmable Logic Controller (PLC) components will be used, the new standard for future instruments at ESO.

  5. Analysis of GEANT4 Physics List Properties in the 12 GeV MOLLER Simulation Framework

    NASA Astrophysics Data System (ADS)

    Haufe, Christopher; Moller Collaboration

    2013-10-01

    To determine the validity of new physics beyond the scope of the electroweak theory, nuclear physicists across the globe have been collaborating on future endeavors that will provide the precision needed to confirm these speculations. One of these is the MOLLER experiment - a low-energy particle experiment that will utilize the 12 GeV upgrade of Jefferson Lab's CEBAF accelerator. The motivation of this experiment is to measure the parity-violating asymmetry of polarized electrons scattered off unpolarized electrons in a liquid hydrogen target. This measurement would allow for a more precise determination of the electron's weak charge and weak mixing angle. While still in its planning stages, the MOLLER experiment requires a detailed simulation framework in order to determine how the project should be run in the future. The simulation framework for MOLLER, called ``remoll'', is written in GEANT4 code. As a result, the simulation can utilize a number of GEANT4 physics lists that constrain particle interactions based on different particle physics models. By comparing these lists with one another using the data-analysis application ROOT, the most optimal physics list for the MOLLER simulation can be determined and implemented. This material is based upon work supported by the National Science Foundation under Grant No. 714001.

  6. Geant4 simulations of the absorption of photons in CsI and NaI produced by electrons with energies up to 4 MeV and their application to precision measurements of the β-energy spectrum with a calorimetric technique

    NASA Astrophysics Data System (ADS)

    Huyan, X.; Naviliat-Cuncic, O.; Voytas, P.; Chandavar, S.; Hughes, M.; Minamisono, K.; Paulauskas, S. V.

    2018-01-01

    The yield of photons produced by electrons slowing down in CsI and NaI was studied with four electromagnetic physics constructors included in the Geant4 toolkit. The subsequent absorption of photons in detector geometries used for measurements of the β spectrum shape was also studied, with a focus on the determination of the absorption fraction. For electrons with energies in the range 0.5-4 MeV, the relative photon yields determined with the four Geant4 constructors differ at the level of 10^-2 in amplitude and the relative absorption fractions differ at the level of 10^-4 in amplitude. The differences among constructors enabled the estimation of the sensitivity to Geant4 simulations for the measurement of the β energy spectrum shape in 6He decay using a calorimetric technique with ions implanted in the active volume of detectors. The size of the effect associated with photons escaping the detectors was quantified in terms of a slope which, on average, is respectively −5.4%/MeV and −4.8%/MeV for the CsI and NaI geometries. The corresponding relative uncertainties as determined from the spread of results obtained with the four Geant4 constructors are 0.0067 and 0.0058.

  7. Geant4 calculations for space radiation shielding material Al2O3

    NASA Astrophysics Data System (ADS)

    Capali, Veli; Acar Yesil, Tolga; Kaya, Gokhan; Kaplan, Abdullah; Yavuz, Mustafa; Tilki, Tahir

    2015-07-01

    Aluminium oxide, Al2O3, is one of the most widely used materials in engineering applications. It is significant because of its hardness and serves as a refractory material owing to its high melting point. This material has several engineering applications in diverse fields such as ballistic armour systems, wear components, electrical and electronic substrates, automotive parts, components for the electric industry and aero-engines. It is also used as a dosimeter for radiation protection and therapy applications owing to its optically stimulated luminescence properties. In this study, stopping powers and penetrating distances have been calculated for alpha, proton, electron and gamma particles in the space radiation shielding material Al2O3 for incident energies of 1 keV - 1 GeV using the GEANT4 calculation code.

  8. GEANT4 simulations of a novel 3He-free thermalization neutron detector

    NASA Astrophysics Data System (ADS)

    Mazzone, A.; Finocchiaro, P.; Lo Meo, S.; Colonna, N.

    2018-05-01

    A novel concept for a 3He-free thermalization detector is investigated here by means of GEANT4 simulations. The detector is based on strips of solid-state detectors with a 6Li deposit for neutron conversion. Various geometrical configurations have been investigated in order to find the optimal solution, in terms of value and energy dependence of the efficiency, for neutron energies up to 10 MeV. The expected performance of the new detector is compared with that of an optimized thermalization detector based on standard 3He tubes. Although a 3He-based detector is superior in terms of performance and simplicity, the proposed solution may become more appealing in terms of cost in the event of a shortage of the 3He supply.

  9. Validation of a virtual source model of medical linac for Monte Carlo dose calculation using multi-threaded Geant4.

    PubMed

    Aboulbanine, Zakaria; El Khayati, Naïma

    2018-04-13

    The use of phase space in medical linear accelerator Monte Carlo (MC) simulations significantly improves the execution time and leads to results comparable to those obtained from full calculations. The classical representation of phase space stores directly the information of millions of particles, producing bulky files. This paper presents a virtual source model (VSM) based on a reconstruction algorithm, taking as input a compressed file of roughly 800 kb derived from phase space data freely available in the International Atomic Energy Agency (IAEA) database. This VSM includes two main components; primary and scattered particle sources, with a specific reconstruction method developed for each. Energy spectra and other relevant variables were extracted from IAEA phase space and stored in the input description data file for both sources. The VSM was validated for three photon beams: Elekta Precise 6 MV/10 MV and a Varian TrueBeam 6 MV. Extensive calculations in water and comparisons between dose distributions of the VSM and IAEA phase space were performed to estimate the VSM precision. The Geant4 MC toolkit in multi-threaded mode (Geant4-[mt]) was used for fast dose calculations and optimized memory use. Four field configurations were chosen for dose calculation validation to test field size and symmetry effects, [Formula: see text] [Formula: see text], [Formula: see text] [Formula: see text], and [Formula: see text] [Formula: see text] for squared fields, and [Formula: see text] [Formula: see text] for an asymmetric rectangular field. Good agreement in terms of [Formula: see text] formalism, for 3%/3 mm and 2%/3 mm criteria, for each evaluated radiation field and photon beam was obtained within a computation time of 60 h on a single WorkStation for a 3 mm voxel matrix. Analyzing the VSM's precision in high dose gradient regions, using the distance to agreement concept (DTA), showed also satisfactory results. In all investigated cases, the mean DTA was

  10. Optimization of {sup 6}LiF:ZnS(Ag) Scintillator Light Yield Using Geant4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yehuda-Zada, Y.; Ben-Gurion University; Pritchard, K.

    2015-07-01

    Neutrons provide an effective tool to probe materials structure. Neutron diffraction is a method to determine the atomic and magnetic structure of a material based on neutron scattering. In this method a collimated incident beam of thermal neutrons hits the examined sample, and the obtained diffraction pattern provides information on the structure of the material. Research for developing a novel cold neutron detector for the Chromatic Analysis Neutron Diffractometer Or Reflectometer (CANDOR) is underway at the NIST Center for Neutron Research. The system's unique design is aimed at providing more than tenfold faster analysis of materials than conventional systems. In order to achieve this fast analysis a large number of neutron detectors is required. A key design constraint for this detector is the thickness of the neutron sensitive element. This is met using {sup 6}LiF:ZnS(Ag) scintillation material with embedded wavelength shifting (WLS) fibers conducting scintillation light to silicon photomultiplier photo-sensors. The detector sensitivity is determined by both the neutron capture probability ({sup 6}Li density) and the detectable light output produced by the ZnS(Ag) ionization, the latter of which is hindered by the fluorescence absorption of the scintillation light by the ZnS. Tradeoffs between the neutron capture probability, stimulated light production and light attenuation were studied to determine the optimal stoichiometry of the {sup 6}LiF and ZnS(Ag) as well as the volume ratio of scintillator and fiber. Simulations using the GEANT4 Monte Carlo package were performed in order to optimize the detector design. GEANT4 enables the investigation of the neutron interaction with the detector, the ionization process and the light transfer process following the nuclear process. The series of conversions required for this detector were modelled: - A cold neutron enters the sensor and is captured by {sup 6}Li in the scintillator mixture ({sup 6}Li (n,α) {sup
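
    The capture-versus-light tradeoff described above can be illustrated with a toy one-dimensional model: the capture probability grows as 1 − exp(−Σt) with scintillator thickness t, while the fraction of scintillation light that escapes falls roughly exponentially. Both coefficients below are hypothetical, chosen only to show that the product has an interior optimum; the paper's GEANT4 study resolves this tradeoff with full transport physics:

```python
import math

def detection_figure_of_merit(t_mm, sigma_capture=0.8, mu_light=0.5):
    """Toy tradeoff model: neutron capture probability rises with
    scintillator thickness t_mm while the escaping light fraction falls.
    Both per-mm coefficients are hypothetical illustration values."""
    p_capture = 1.0 - math.exp(-sigma_capture * t_mm)
    light_escape = math.exp(-mu_light * t_mm)
    return p_capture * light_escape

# scan thicknesses: the product peaks at an interior optimum
best_fom, best_t = max(
    (detection_figure_of_merit(t / 10.0), t / 10.0) for t in range(1, 60)
)
```

    Too thin and neutrons pass through uncaptured; too thick and the ZnS absorbs its own scintillation light, which is exactly the design constraint the abstract describes.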

  11. Comparison of PHITS, GEANT4, and HIBRAC simulations of depth-dependent yields of β(+)-emitting nuclei during therapeutic particle irradiation to measured data.

    PubMed

    Rohling, Heide; Sihver, Lembit; Priegnitz, Marlen; Enghardt, Wolfgang; Fiedler, Fine

    2013-09-21

    For quality assurance in particle therapy, a non-invasive, in vivo range verification is highly desired. Particle therapy positron emission tomography (PT-PET) is to date the only clinically proven method for this purpose. It makes use of the β(+)-activity produced during the irradiation by the nuclear fragmentation processes between the therapeutic beam and the irradiated tissue. Since a direct comparison of β(+)-activity and dose is not feasible, a simulation of the expected β(+)-activity distribution is required. For this reason it is essential to have a quantitatively reliable code for the simulation of the yields of the β(+)-emitting nuclei at every position of the beam path. In this paper, results of the three-dimensional Monte Carlo simulation codes PHITS and GEANT4 and the one-dimensional deterministic simulation code HIBRAC are compared to measurements of the yields of the most abundant β(+)-emitting nuclei for carbon, lithium, helium, and proton beams. In general, PHITS underestimates the yields of positron emitters. With GEANT4 the overall most accurate results are obtained. HIBRAC and GEANT4 provide comparable results for carbon and proton beams. HIBRAC is considered a good candidate for implementation in clinical routine PT-PET.

  12. Comparison of PHITS, GEANT4, and HIBRAC simulations of depth-dependent yields of β+-emitting nuclei during therapeutic particle irradiation to measured data

    NASA Astrophysics Data System (ADS)

    Rohling, Heide; Sihver, Lembit; Priegnitz, Marlen; Enghardt, Wolfgang; Fiedler, Fine

    2013-09-01

    For quality assurance in particle therapy, a non-invasive, in vivo range verification is highly desired. Particle therapy positron emission tomography (PT-PET) is to date the only clinically proven method for this purpose. It makes use of the β+-activity produced during the irradiation by the nuclear fragmentation processes between the therapeutic beam and the irradiated tissue. Since a direct comparison of β+-activity and dose is not feasible, a simulation of the expected β+-activity distribution is required. For this reason it is essential to have a quantitatively reliable code for the simulation of the yields of the β+-emitting nuclei at every position of the beam path. In this paper, results of the three-dimensional Monte Carlo simulation codes PHITS and GEANT4 and the one-dimensional deterministic simulation code HIBRAC are compared to measurements of the yields of the most abundant β+-emitting nuclei for carbon, lithium, helium, and proton beams. In general, PHITS underestimates the yields of positron emitters. With GEANT4 the overall most accurate results are obtained. HIBRAC and GEANT4 provide comparable results for carbon and proton beams. HIBRAC is considered a good candidate for implementation in clinical routine PT-PET.

  13. Full Geant4 and FLUKA simulations of an e-LINAC for its use in particle detectors performance tests

    NASA Astrophysics Data System (ADS)

    Alpat, B.; Pilicer, E.; Servoli, L.; Menichelli, M.; Tucceri, P.; Italiani, M.; Buono, E.; Di Capua, F.

    2012-03-01

    In this work we present the results of full Geant4 and FLUKA simulations and a comparison with dosimetry data for an electron LINAC at the St. Maria Hospital in Terni, Italy. The facility is used primarily for radiotherapy, and the goal of the present study is a detailed investigation of the electron beam parameters to evaluate the possibility of using the e-LINAC (during time slots when it is not used for radiotherapy) to test the performance of detector systems, in particular those designed to operate in space. The critical beam parameters are the electron energy, profile and flux available at the surface of the device to be tested. The present work aims to extract these parameters from the dosimetry calibration data available at the e-LINAC. The electron energy ranges from 4 MeV to 20 MeV. The dose measurements have been performed using an Advanced Markus Chamber, which has a small sensitive volume.

  14. Apply Design Patterns to Refactor Software Design

    NASA Technical Reports Server (NTRS)

    Baggs, Rhoda; Shaykhian, Gholam Ali

    2007-01-01

    Refactoring a software design is a method of changing the design while explicitly preserving its functionality. The approach presented here uses design patterns as the basis for such refactoring. Design solutions are compared through C++ examples to illustrate the approach. The development of reusable components is also discussed; the paper shows that constructing such components can diminish the added burden of both refactoring and the use of design patterns.

  15. Software Design Improvements. Part 2; Software Quality and the Design and Inspection Process

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R.; Packard, Michael H.; Ziemianski, Tom

    1997-01-01

    The application of assurance engineering techniques improves the duration of failure-free performance of software. The totality of features and characteristics of a software product determines its ability to satisfy customer needs. Software in safety-critical systems is very important to NASA. We follow the System Safety Working Group's definition of system safety software: 'The optimization of system safety in the design, development, use, and maintenance of software and its integration with safety-critical systems in an operational environment.' 'If it is not safe, say so' has become our motto. This paper reviews methods NASA has used to improve software design by focusing on software quality and on the design and inspection process.

  16. Efficiency transfer using the GEANT4 code of CERN for HPGe gamma spectrometry.

    PubMed

    Chagren, S; Tekaya, M Ben; Reguigui, N; Gharbi, F

    2016-01-01

    In this work we apply the GEANT4 code of CERN to calculate the peak efficiency in High-Purity Germanium (HPGe) gamma spectrometry using three different procedures. The first is a direct calculation. The second corresponds to the usual case of efficiency transfer between two different configurations at a constant emission energy, assuming a reference point-detection configuration. The third, a new procedure, consists of transferring the peak efficiency between two detection configurations emitting the gamma ray at different energies, assuming a "virtual" reference point-detection configuration. No pre-optimization of the detector's geometrical characteristics was performed before the transfer, in order to test the ability of the efficiency transfer to reduce the effect of uncertainty in their true values on the quality of the transferred efficiency. The calculated and measured efficiencies were found to be in good agreement for the two investigated methods of efficiency transfer. This agreement shows that the Monte Carlo method, and in particular the GEANT4 code, constitutes an efficient tool for obtaining accurate detection efficiencies. The second investigated transfer procedure is useful for calibrating an HPGe gamma detector at any emission energy for a voluminous source, using as reference the detection efficiency of one point source emitting at a different energy. The calculations performed in this work were applied to the measurement exercise of the EUROMET428 project, in which full-energy-peak efficiencies were evaluated in the energy range 60-2000 keV for a typical coaxial p-type HPGe detector and several source configurations: point sources located at various distances from the detector and a cylindrical box containing three matrices. Copyright © 2015 Elsevier Ltd. All rights reserved.
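
The efficiency-transfer idea above reduces, in its simplest form, to scaling a measured reference efficiency by the ratio of Monte Carlo efficiencies simulated for the target and reference configurations. A minimal sketch, with an invented function name and illustrative numbers (nothing here is from the paper's code):

```python
# Hypothetical sketch of the efficiency-transfer principle used in HPGe
# calibration: a measured reference efficiency is scaled by the ratio of
# Monte Carlo efficiencies for the target and reference geometries.
# All numeric values below are illustrative, not from the paper.

def transfer_efficiency(eps_ref_measured, eps_ref_mc, eps_target_mc):
    """Transfer a full-energy-peak efficiency from a reference detection
    configuration to a target one using simulated efficiencies."""
    if eps_ref_mc <= 0.0:
        raise ValueError("reference MC efficiency must be positive")
    return eps_ref_measured * (eps_target_mc / eps_ref_mc)

# Example: point source measured at one energy (reference), voluminous
# source simulated for the target configuration.
eps = transfer_efficiency(eps_ref_measured=0.0120,
                          eps_ref_mc=0.0118,
                          eps_target_mc=0.0059)
print(round(eps, 4))  # → 0.006
```

The same ratio form covers the paper's third procedure as well: the reference and target simulations may use different emission energies, as long as both are computed with the same "virtual" reference geometry.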

  17. Modeling the TrueBeam linac using a CAD to Geant4 geometry implementation: dose and IAEA-compliant phase space calculations.

    PubMed

    Constantin, Magdalena; Perl, Joseph; LoSasso, Tom; Salop, Arthur; Whittum, David; Narula, Anisha; Svatos, Michelle; Keall, Paul J

    2011-07-01

    To create an accurate 6 MV Monte Carlo simulation phase space for the Varian TrueBeam treatment head geometry imported from CAD (computer-aided design) without adjusting the input electron phase space parameters. GEANT4 v4.9.2.p01 was employed to simulate the 6 MV beam treatment head geometry of the Varian TrueBeam linac. The electron tracks in the linear accelerator were simulated with Parmela, and the obtained electron phase space was used as an input to the Monte Carlo beam transport and dose calculations. The geometry components are tessellated solids included in GEANT4 as GDML (Geometry Description Markup Language) files obtained via STEP (standard for the exchange of product model data) export from Pro/Engineering, followed by STEP import in Fastrad, a STEP-GDML converter. The linac has a compact treatment head, and the small space between the shielding collimator and the divergent arc of the upper jaws forbids the implementation of a plane for storing the phase space. Instead, an IAEA (International Atomic Energy Agency) compliant phase space writer was implemented on a cylindrical surface. The simulation was run in parallel on a 1200 node Linux cluster. The 6 MV dose calculations were performed for field sizes varying from 4 x 4 to 40 x 40 cm2. The voxel size for the 60 x 60 x 40 cm3 water phantom was 4 x 4 x 4 mm3. For the 10 x 10 cm2 field, surface buildup calculations were performed using 4 x 4 x 2 mm3 voxels within 20 mm of the surface. For the depth dose curves, 98% of the calculated data points agree within 2% with the experimental measurements for depths between 2 and 40 cm. For depths between 5 and 30 cm, agreement within 1% is obtained for 99% (4 x 4), 95% (10 x 10), 94% (20 x 20 and 30 x 30), and 89% (40 x 40) of the data points, respectively. In the buildup region, the agreement is within 2%, except at 1 mm depth where the deviation is 5% for the 10 x 10 cm2 open field. For the lateral dose profiles, within the field size for fields up to 30 x 30 cm2, the

  18. Simulation and Digitization of a Gas Electron Multiplier Detector Using Geant4 and an Object-Oriented Digitization Program

    NASA Astrophysics Data System (ADS)

    McMullen, Timothy; Liyanage, Nilanga; Xiong, Weizhi; Zhao, Zhiwen

    2017-01-01

    Our research has focused on simulating the response of a Gas Electron Multiplier (GEM) detector using computational methods. GEM detectors provide a cost-effective solution for radiation detection in high-rate environments. A detailed simulation of GEM detector response to radiation is essential for the successful adaptation of these detectors to different applications. Using Geant4 Monte Carlo (GEMC), a wrapper around Geant4 which has been successfully used to simulate the Solenoidal Large Intensity Device (SoLID) at Jefferson Lab, we are developing a simulation of a GEM chamber similar to the detectors currently used in our lab. We are also refining an object-oriented digitization program, which translates energy deposition information from GEMC into an electronic readout resembling that of our physical detectors. We have run the simulation with beta particles produced by the simulated decay of a 90Sr source, as well as with a simulated bremsstrahlung spectrum. Comparison of the simulation data with real GEM data taken under similar conditions is used to refine the simulation parameters. Comparisons between results from the simulations and results from detector tests will be presented.
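
The digitization step described above can be illustrated with a toy model: each simulated energy deposition is shared among neighbouring readout strips according to a Gaussian charge-diffusion profile. The strip pitch, diffusion width, and function name below are assumptions for illustration, not the group's actual digitizer:

```python
# Toy sketch (not the group's code) of the core step an object-oriented
# GEM digitizer performs: spreading the charge of a simulated energy
# deposition over neighbouring readout strips via a Gaussian model.
import math

def digitize_hit(x_mm, energy_kev, pitch_mm=0.4, sigma_mm=0.3, n_strips=11):
    """Share a deposit's charge among strips around the hit position.
    Returns {strip_index: charge in keV-equivalent units}."""
    center = int(round(x_mm / pitch_mm))
    weights = {}
    for strip in range(center - n_strips // 2, center + n_strips // 2 + 1):
        strip_x = strip * pitch_mm
        weights[strip] = math.exp(-0.5 * ((strip_x - x_mm) / sigma_mm) ** 2)
    total = sum(weights.values())
    # Normalise so the summed strip signal conserves the deposited energy.
    return {s: energy_kev * w / total for s, w in weights.items()}

signal = digitize_hit(x_mm=2.0, energy_kev=10.0)
print(round(sum(signal.values()), 6))  # → 10.0 (charge conserved)
```

A real digitizer would add gas gain fluctuations, electronics noise, and ADC quantisation on top of this geometric sharing; those effects are omitted here.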

  19. Designing an upgrade of the Medley setup for light-ion production and fission cross-section measurements

    NASA Astrophysics Data System (ADS)

    Jansson, K.; Gustavsson, C.; Al-Adili, A.; Hjalmarsson, A.; Andersson-Sundén, E.; Prokofiev, A. V.; Tarrío, D.; Pomp, S.

    2015-09-01

    Measurements of neutron-induced fission cross-sections and light-ion production are planned in the energy range 1-40 MeV at the upcoming Neutrons For Science (NFS) facility. To prepare our detector setup for a neutron beam with a continuous energy spectrum, simulation software was written using the Geant4 toolkit for both measurement situations. The neutron energy range around 20 MeV is troublesome with respect to the cross-sections used by Geant4, since data-driven cross-sections are available only below 20 MeV; above that energy they are based on semi-empirical models. Several customisations were made to the standard classes in Geant4 in order to produce consistent results over the whole simulated energy range. Expected uncertainties are reported for both types of measurements. The simulations have shown that a simultaneous precision measurement of the three standard cross-sections H(n,n), 235U(n,f) and 238U(n,f) relative to each other is feasible using a triple-layered target. As high-resolution timing detectors for fission fragments we plan to use Parallel Plate Avalanche Counters (PPACs). The simulation results have put some restrictions on the design of these detectors as well as on the target design. This study suggests a fissile target no thicker than 2 μm (1.7 mg/cm2) and a PPAC foil thickness preferably less than 1 μm. We also comment on the usability of Geant4 for simulation studies of neutron reactions in this energy range.

  20. Aircraft Design Software

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Successful commercialization of the AirCraft SYNThesis (ACSYNT) tool has resulted in the creation of Phoenix Integration, Inc. ACSYNT has been exclusively licensed to the company, an outcome of a seven-year, $3 million effort to provide unique software technology to a focused design engineering market. Ames Research Center formulated ACSYNT and, working with the Virginia Polytechnic Institute CAD Laboratory, began to design and code a computer-aided design capability for ACSYNT. Using a Joint Sponsored Research Agreement, Ames formed an industry-government-university alliance to improve and foster research and development for the software. As a result of the ACSYNT Institute, the software is becoming a predominant tool for aircraft conceptual design. ACSYNT has been successfully applied to high-speed civil transport configurations, subsonic transports, and supersonic fighters.

  1. Flight software requirements and design support system

    NASA Technical Reports Server (NTRS)

    Riddle, W. E.; Edwards, B.

    1980-01-01

    The desirability and feasibility of computer-augmented support for the pre-implementation activities occurring during the development of flight control software were investigated. The specific topics investigated were the capabilities to be included in a pre-implementation support system for flight control software development, and the specification of a preliminary design for such a system. The pre-implementation support system was to be characterized and specified under the constraints that it: (1) support both description and assessment of flight control software requirements definitions and design specifications; (2) account for known software description and assessment techniques; (3) be compatible with existing and planned NASA flight control software development support systems; and (4) not impose, though it may encourage, specific development technologies. An overview of the results is given.

  2. Application of dynamic Monte Carlo technique in proton beam radiotherapy using Geant4 simulation toolkit

    NASA Astrophysics Data System (ADS)

    Guan, Fada

    The Monte Carlo method has been successfully applied to simulating particle transport problems. Most Monte Carlo simulation tools are static: they can only perform simulations for problems with fixed physics and geometry settings. Proton therapy, however, is a dynamic treatment technique in clinical application. In this research, we developed a method to perform dynamic Monte Carlo simulation of proton therapy using the Geant4 simulation toolkit. A passive-scattering treatment nozzle equipped with a rotating range modulation wheel was modeled. One important application of Monte Carlo simulation is to predict the spatial dose distribution in the target geometry. For simplification, a mathematical model of the human body is usually used as the target, but then only the average dose over a whole organ or tissue can be obtained, rather than an accurate spatial dose distribution. In this research, we developed a method using MATLAB to convert the medical images of a patient from a CT scan into a patient voxel geometry. If this voxel geometry is used as the target in the Monte Carlo simulation, the accurate spatial dose distribution in the target can be obtained. The data analysis tool ROOT was used to score the simulation results during a Geant4 run and to analyze the data and plot results afterwards. Finally, we successfully obtained the accurate spatial dose distribution in part of a human body for a prostate cancer patient treated with proton therapy.
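
The CT-to-voxel conversion described above amounts to binning each CT number (Hounsfield unit) into a material label that the Monte Carlo geometry can reference voxel by voxel. A minimal sketch in Python (the original work used MATLAB); the HU thresholds and material labels are illustrative assumptions, not the paper's calibration:

```python
# Hedged sketch of the CT-to-voxel-geometry idea: map each Hounsfield
# unit (HU) to a coarse material index usable by a voxelised Monte
# Carlo target. Threshold values here are illustrative only.

def hu_to_material(hu):
    """Map a Hounsfield unit to a coarse material label."""
    if hu < -900:
        return "air"
    if hu < -100:
        return "lung"
    if hu < 150:
        return "soft_tissue"
    return "bone"

def build_voxel_phantom(ct_slice):
    """Convert a 2-D grid of HU values into a material-label grid."""
    return [[hu_to_material(hu) for hu in row] for row in ct_slice]

phantom = build_voxel_phantom([[-1000, -500], [40, 800]])
print(phantom)  # → [['air', 'lung'], ['soft_tissue', 'bone']]
```

A production pipeline would also carry a density per voxel (via an HU-to-density calibration curve) rather than only a material label, since dose scoring depends on both.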

  3. The simulation of the LANFOS-H food radiation contamination detector using Geant4 package

    NASA Astrophysics Data System (ADS)

    Piotrowski, Lech Wiktor; Casolino, Marco; Ebisuzaki, Toshikazu; Higashide, Kazuhiro

    2015-02-01

    The recent incident at the Fukushima power plant caused growing concern about radiation contamination and resulted in the Japanese limit for the permitted amount of 137Cs in food being lowered to 100 Bq/kg. To increase safety and ease this concern we are developing LANFOS (Large Food Non-destructive Area Sampler), a compact, easy-to-use detector for the assessment of radiation in food. The LANFOS-H described in this paper has 4π coverage to assess the amount of 137Cs present, separating it from possible 40K contamination of the food. Food samples therefore do not have to be pre-processed prior to a test and can be consumed after measurement. It is designed for use by non-professionals in homes and small institutions such as schools, indicating the safety of the samples, but it can also be utilized by specialists, for whom it provides a radiation spectrum. Proper assessment of radiation in food with the apparatus requires estimation of the γ conversion factor of the detectors, i.e. how many γ photons will produce a signal. In this paper we show results of a Monte Carlo estimation of this factor for various approximated shapes of fish, vegetables, and amounts of rice, performed with the Geant4 package. We find that the conversion factor combined from all the detectors is similar for all food types, around 37%, varying by at most 5% with sample length, much less than for individual detectors. The different inclinations and positions of samples in the detector introduce an uncertainty of 1.4%. This small uncertainty validates the concept of a 4π non-destructive apparatus.
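
The γ conversion factor discussed above is, in essence, the fraction of emitted photons that produce a signal in any detector. A toy Monte Carlo sketch of such an estimate (the flat per-photon interaction probability and all numbers are invented; the real LANFOS-H estimate requires full Geant4 photon transport through the sample and detector geometry):

```python
# Toy Monte Carlo sketch of a "gamma conversion factor" estimate: the
# fraction of emitted photons that produce a signal, assuming full
# 4-pi geometric coverage and an invented per-photon interaction
# probability. Illustrative only; not the paper's simulation.
import random

def conversion_factor(n_photons=100_000, p_interact=0.37, seed=1):
    """Estimate the signal fraction by sampling photon fates."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    hits = sum(1 for _ in range(n_photons) if rng.random() < p_interact)
    return hits / n_photons

f = conversion_factor()
print(round(f, 3))
```

With 10^5 samples the statistical spread on the estimate is about 0.15 percentage points, which is why the paper can quote sample-shape variations at the few-percent level meaningfully.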

  4. Comparing Geant4 hadronic models for the WENDI-II rem meter response function.

    PubMed

    Vanaudenhove, T; Dubus, A; Pauly, N

    2013-01-01

    The WENDI-II rem meter is one of the most popular neutron dosemeters used to assess a key quantity of radiation protection, namely the ambient dose equivalent. This is due to its high sensitivity and to an energy response that approximately follows the conversion function between neutron fluence and ambient dose equivalent in the range from thermal energies to 5 GeV. Simulation of the WENDI-II response function with the Geant4 toolkit is therefore well suited to comparing the low- and high-energy hadronic models provided by this Monte Carlo code. The results showed that the thermal treatment of hydrogen in polyethylene for neutrons below 4 eV has a great influence over the whole detector range. Above 19 MeV, both the Bertini Cascade and Binary Cascade models show good agreement with the results found in the literature, while the low-energy parameterised models are not suitable for this application.

  5. Design software for reuse

    NASA Technical Reports Server (NTRS)

    Tracz, Will

    1990-01-01

    Viewgraphs are presented on the designing of software for reuse. Topics include terminology, software reuse maxims, the science of programming, an interface design example, a modularization example, and reuse and implementation guidelines.

  6. Modeling the TrueBeam linac using a CAD to Geant4 geometry implementation: Dose and IAEA-compliant phase space calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Constantin, Magdalena; Perl, Joseph; LoSasso, Tom

    2011-07-15

    Purpose: To create an accurate 6 MV Monte Carlo simulation phase space for the Varian TrueBeam treatment head geometry imported from CAD (computer-aided design) without adjusting the input electron phase space parameters. Methods: GEANT4 v4.9.2.p01 was employed to simulate the 6 MV beam treatment head geometry of the Varian TrueBeam linac. The electron tracks in the linear accelerator were simulated with Parmela, and the obtained electron phase space was used as an input to the Monte Carlo beam transport and dose calculations. The geometry components are tessellated solids included in GEANT4 as GDML (Geometry Description Markup Language) files obtained via STEP (standard for the exchange of product model data) export from Pro/Engineering, followed by STEP import in Fastrad, a STEP-GDML converter. The linac has a compact treatment head, and the small space between the shielding collimator and the divergent arc of the upper jaws forbids the implementation of a plane for storing the phase space. Instead, an IAEA (International Atomic Energy Agency) compliant phase space writer was implemented on a cylindrical surface. The simulation was run in parallel on a 1200 node Linux cluster. The 6 MV dose calculations were performed for field sizes varying from 4 x 4 to 40 x 40 cm2. The voxel size for the 60 x 60 x 40 cm3 water phantom was 4 x 4 x 4 mm3. For the 10 x 10 cm2 field, surface buildup calculations were performed using 4 x 4 x 2 mm3 voxels within 20 mm of the surface. Results: For the depth dose curves, 98% of the calculated data points agree within 2% with the experimental measurements for depths between 2 and 40 cm. For depths between 5 and 30 cm, agreement within 1% is obtained for 99% (4 x 4), 95% (10 x 10), 94% (20 x 20 and 30 x 30), and 89% (40 x 40) of the data points, respectively. In the buildup region, the agreement is within 2%, except at 1 mm depth where the deviation is 5% for the 10 x 10 cm2 open field. For the lateral dose profiles, within the

  7. Application of Design Patterns in Refactoring Software Design

    NASA Technical Reports Server (NTRS)

    Baggs, Rhoda; Shaykhian, Gholam Ali

    2007-01-01

    Refactoring a software design is a method of changing the design while explicitly preserving its functionality. The approach presented here uses design patterns as the basis for such refactoring. Design solutions are compared through C++ examples to illustrate the approach. The development of reusable components is also discussed; the paper shows that constructing such components can diminish the added burden of both refactoring and the use of design patterns.

  8. Dose calculations at high altitudes and in deep space with GEANT4 using BIC and JQMD models for nucleus nucleus reactions

    NASA Astrophysics Data System (ADS)

    Sihver, L.; Matthiä, D.; Koi, T.; Mancusi, D.

    2008-10-01

    Radiation exposure of aircrew is increasingly recognized as an occupational hazard. The ionizing environment at standard commercial flight altitudes consists mainly of secondary particles, of which neutrons contribute a major part of the dose equivalent. Accurate estimates of neutron spectra in the atmosphere are therefore essential for correct calculations of aircrew doses. Energetic solar particle events (SPE) can also lead to significantly increased dose rates, especially on routes close to the North Pole, e.g. flights between Europe and the USA. It is also well known that the radiation environment encountered by personnel aboard low Earth orbit (LEO) spacecraft, or aboard a spacecraft traveling outside the Earth's protective magnetosphere, is much harsher than that within the atmosphere, since the personnel are exposed to radiation from both galactic cosmic rays (GCR) and SPE. The relative contribution to the dose from GCR when traveling outside the Earth's magnetosphere, e.g. to the Moon or Mars, is even greater, and reliable, accurate particle and heavy-ion transport codes are essential to calculate the radiation risks for both aircrew and spacecraft personnel. We have therefore performed calculations of neutron distributions in the atmosphere, and of total dose equivalents and quality factors at different depths in a water sphere in an imaginary spacecraft in a geosynchronous orbit during solar minimum. The calculations were performed with the GEANT4 Monte Carlo (MC) code using both the binary cascade (BIC) model, which is part of the standard GEANT4 package, and the JQMD model, which is used in the particle and heavy-ion transport code PHITS.

  9. Software engineering and Ada in design

    NASA Technical Reports Server (NTRS)

    Oneill, Don

    1986-01-01

    Modern software engineering promises significant reductions in software costs and improvements in software quality. The Ada language is the focus for these software methodology and tool improvements. The IBM FSD approach, including the software engineering practices that guide the systematic design and development of software products and the management of the software process are examined. The revised Ada design language adaptation is revealed. This four level design methodology is detailed including the purpose of each level, the management strategy that integrates the software design activity with the program milestones, and the technical strategy that maps the Ada constructs to each level of design. A complete description of each design level is provided along with specific design language recording guidelines for each level. Finally, some testimony is offered on education, tools, architecture, and metrics resulting from project use of the four level Ada design language adaptation.

  10. Radiation Shielding Study of Advanced Data and Power Management Systems (ADPMS) Housing Using Geant4

    NASA Astrophysics Data System (ADS)

    Garcia, F.; Kurvinen, K.; Brander, T.; Orava, R.; Heino, J.; Virtanen, A.; Kettunen, H.; Tenhunen, M.

    2008-02-01

    A design goal for current space systems is to reduce the mass used to enclose components of the spacecraft. One potential target is the mass of electronics and its housings. The use of composite materials, especially CFRP (Carbon Fiber Reinforced Plastic), is a well-known and widely used approach to mass reduction. Another design goal, cost reduction, has increased the use of commercial (non-space-qualified) electronics. These commercial circuits and other components cannot tolerate radiation levels as high as space-qualified components can. The use of standard electronics components therefore poses a challenge for the radiation protection capability of the ADPMS housings. The main goal of this study is to provide insight into the radiation shielding provided by different configurations of CFRP-tungsten laminates based on epoxies and cyanate esters, and to compare them to the protection given by the commonly used aluminum. For a spacecraft operating in LEO and MEO orbits the main components of the space radiation environment are energetic electrons and protons; our study therefore compares experimental and simulated radiation attenuation of different types of laminates for those particles. The experimental data have also been used to validate the Geant4 model of the laminates, which can be used for future optimization of the laminate structures.

  11. Working Notes from the 1992 AAAI Workshop on Automating Software Design. Theme: Domain Specific Software Design

    NASA Technical Reports Server (NTRS)

    Keller, Richard M. (Editor); Barstow, David; Lowry, Michael R.; Tong, Christopher H.

    1992-01-01

    The goal of this workshop is to identify different architectural approaches to building domain-specific software design systems and to explore issues unique to domain-specific (vs. general-purpose) software design. General issues that cut across particular software design domains include: (1) knowledge representation, acquisition, and maintenance; (2) specialized software design techniques; and (3) user interaction and user interfaces.

  12. Designing Educational Software for Tomorrow.

    ERIC Educational Resources Information Center

    Harvey, Wayne

    Designed to address the management and use of computer software in education and training, this paper explores both good and poor software design, calling for improvements in the quality of educational software by attending to design considerations that are based on general principles of learning rather than specific educational objectives. This…

  13. gemcWeb: A Cloud Based Nuclear Physics Simulation Software

    NASA Astrophysics Data System (ADS)

    Markelon, Sam

    2017-09-01

    gemcWeb allows users to run nuclear physics simulations from the web. Being completely device agnostic, it lets scientists run simulations from anywhere with an Internet connection. With a full user system, gemcWeb allows users to revisit and revise their projects and to share configurations and results with collaborators. gemcWeb is based on the simulation software gemc, which is in turn based on standard Geant4, and it requires no C++, gemc, or Geant4 knowledge. A simple but powerful GUI allows users to configure their project from geometries and configurations stored on the deployment server. Simulations are then run on the server, with results posted to the user and then securely stored. Python-based and open-source, the main instance of gemcWeb is hosted internally at Jefferson National Laboratory and used by the CLAS12 and Electron-Ion Collider Project groups. However, as the software is open-source and hosted as a GitHub repository, an instance can be deployed on the open web or on any institution's intranet. An instance can be configured to host experiments specific to an institution, and the code base can be modified by any individual or group. Special thanks to: Maurizio Ungaro, PhD, creator of gemc; Markus Diefenthaler, PhD, advisor; and Kyungseon Joo, PhD, advisor.

  14. Signal pulse emulation for scintillation detectors using Geant4 Monte Carlo with light tracking simulation.

    PubMed

    Ogawara, R; Ishikawa, M

    2016-07-01

    The anode pulse of a photomultiplier tube (PMT) coupled with a scintillator is used for pulse shape discrimination (PSD) analysis. We have developed a novel emulation technique for the PMT anode pulse based on optical photon transport and a PMT response function. The photon transport was calculated using the Geant4 Monte Carlo code, and the response function was obtained with a BC408 organic scintillator. The obtained percentage RMS values of the difference between measured and simulated pulses, using suitable scintillation properties, were 2.41%, 2.58% and 2.16% for GSO:Ce (0.4, 1.0 and 1.5 mol%), 2.01% for LaBr3:Ce, and 3.32% for BGO. The proposed technique demonstrates high reproducibility of the measured pulse and can be applied to simulation studies of various radiation measurements.
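
The comparison metric quoted above, a percentage RMS difference between measured and emulated pulses, together with the basic convolution idea behind the emulation, can be sketched as follows. The time constants, normalisation, and function names are assumptions for illustration, not the authors' values or code:

```python
# Sketch (assumed form, not the authors' code) of (a) a percentage RMS
# difference metric between two pulses and (b) a minimal anode-pulse
# model: an exponential scintillation decay convolved with an
# exponential single-electron response. Time constants are invented.
import math

def percent_rms_difference(measured, simulated):
    """Percentage RMS of the sample-wise difference, normalised to the
    peak amplitude of the measured pulse."""
    n = len(measured)
    peak = max(abs(v) for v in measured)
    rms = math.sqrt(sum((m - s) ** 2 for m, s in zip(measured, simulated)) / n)
    return 100.0 * rms / peak

def emulate_pulse(n_samples=200, dt_ns=1.0, tau_decay_ns=56.0, tau_ser_ns=3.0):
    """Discrete convolution of scintillation emission with a
    single-electron response; returned pulse is peak-normalised."""
    scint = [math.exp(-i * dt_ns / tau_decay_ns) for i in range(n_samples)]
    ser = [math.exp(-i * dt_ns / tau_ser_ns) for i in range(n_samples)]
    pulse = [sum(scint[j] * ser[i - j] for j in range(i + 1))
             for i in range(n_samples)]
    peak = max(pulse)
    return [v / peak for v in pulse]

p = emulate_pulse()
print(round(percent_rms_difference(p, p), 2))  # identical pulses → 0.0
```

In the actual technique, the photon arrival times come from Geant4 optical transport rather than a single exponential, and the single-electron response is taken from the measured PMT response function.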

  15. ALGEBRA: ALgorithm for the heterogeneous dosimetry based on GEANT4 for BRAchytherapy.

    PubMed

    Afsharpour, H; Landry, G; D'Amours, M; Enger, S; Reniers, B; Poon, E; Carrier, J-F; Verhaegen, F; Beaulieu, L

    2012-06-07

    Task Group 43 (TG43)-based dosimetry algorithms are efficient for brachytherapy dose calculation in water. However, human tissues have chemical compositions and densities different from water. Moreover, the mutual shielding effect of seeds on one another (interseed attenuation) is neglected in TG43-based dosimetry platforms. The scientific community has expressed the need for an accurate dosimetry platform in brachytherapy. The purpose of this paper is to present ALGEBRA, a Monte Carlo platform for dosimetry in brachytherapy which is sufficiently fast and accurate for clinical and research purposes. ALGEBRA is based on the GEANT4 Monte Carlo code and is capable of handling the DICOM RT standard to recreate a virtual model of the treated site. Here, the performance of ALGEBRA is presented for the special case of LDR brachytherapy in permanent prostate and breast seed implants. However, the platform is also capable of handling other treatments such as HDR brachytherapy.

  16. Simulations of neutron transport at low energy: a comparison between GEANT and MCNP.

    PubMed

    Colonna, N; Altieri, S

    2002-06-01

    The use of the simulation tool GEANT for neutron transport at energies below 20 MeV is discussed, in particular with regard to shielding and dose calculations. The reliability of the GEANT/MICAP package for neutron transport has been verified by comparing the results of simulations performed with this package over a wide energy range with the predictions of MCNP-4B, a code commonly used for neutron transport at low energy. Reasonable agreement between the two codes is found for the neutron flux through a slab of material (iron and ordinary concrete), as well as for the dose released in soft tissue by neutrons. These results justify the use of the GEANT/MICAP code for neutron transport in a wide range of applications, including health physics problems.

  17. A Vision on the Status and Evolution of HEP Physics Software Tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Canal, P.; Elvira, D.; Hatcher, R.

    2013-07-28

    This paper represents the vision of the members of the Fermilab Scientific Computing Division's Computational Physics Department (SCD-CPD) on the status and the evolution of various HEP software tools such as the Geant4 detector simulation toolkit, the Pythia and GENIE physics generators, and the ROOT data analysis framework. The goal of this paper is to contribute ideas to the Snowmass 2013 process toward the composition of a unified document on the current status and potential evolution of the physics software tools which are essential to HEP.

  18. Recent developments in software for the Belle II aerogel RICH

    NASA Astrophysics Data System (ADS)

    Šantelj, L.; Adachi, I.; Dolenec, R.; Hataya, K.; Iori, S.; Iwata, S.; Kakuno, H.; Kataura, R.; Kawai, H.; Kindo, H.; Kobayashi, T.; Korpar, S.; Križan, P.; Kumita, T.; Mrvar, M.; Nishida, S.; Ogawa, K.; Ogawa, S.; Pestotnik, R.; Sumiyoshi, T.; Tabata, M.; Yonenaga, M.; Yusa, Y.

    2017-12-01

    For the Belle II spectrometer a proximity focusing RICH counter with an aerogel radiator (ARICH) will be employed as a PID system in the forward end-cap region of the spectrometer. The detector will provide about 4σ separation of pions and kaons up to momenta of 3.5 GeV/c, at the kinematic limits of the experiment. We present the up-to-date status of the ARICH simulation and reconstruction software, focusing on the recent improvements of the reconstruction algorithms and detector description in the Geant4 simulation. In addition, as a demonstration of detector readout software functionality we show the first cosmic ray Cherenkov rings observed in the ARICH.

  19. Development of a Geant4 application to characterise a prototype neutron detector based on three orthogonal 3He tubes inside an HDPE sphere.

    PubMed

    Gracanin, V; Guatelli, S; Prokopovich, D; Rosenfeld, A B; Berry, A

    2017-01-01

    The Bonner Sphere Spectrometer (BSS) system is a well-established technique for neutron dosimetry that involves detecting thermal neutrons within a range of hydrogenous moderators. BSS detectors are often used to perform neutron field surveys in order to determine the ambient dose equivalent H*(10) and estimate the health risk to personnel. A potential limitation of existing neutron survey techniques is that some detectors do not consider the direction of the neutron field, which can result in overly conservative dose estimates. This paper presents the development of a Geant4 simulation application to characterise a prototype neutron detector based on three orthogonal 3He tubes inside a single HDPE sphere built at the Australian Nuclear Science and Technology Organisation (ANSTO). The Geant4 simulation has been validated against experimental measurements performed with an Am-Be source. Crown Copyright © 2016. Published by Elsevier Ltd. All rights reserved.

  20. Signal pulse emulation for scintillation detectors using Geant4 Monte Carlo with light tracking simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ogawara, R.; Ishikawa, M., E-mail: masayori@med.hokudai.ac.jp

    The anode pulse of a photomultiplier tube (PMT) coupled to a scintillator is used for pulse shape discrimination (PSD) analysis. We have developed a novel emulation technique for the PMT anode pulse based on optical photon transport and a PMT response function. The photon transport was calculated using the Geant4 Monte Carlo code, and the response function was obtained with a BC408 organic scintillator. The percentage RMS differences between the measured and simulated pulses, obtained with suitable scintillation properties for GSO:Ce (0.4, 1.0, and 1.5 mol%), LaBr3:Ce, and BGO scintillators, were 2.41%, 2.58%, 2.16%, 2.01%, and 3.32%, respectively. The proposed technique demonstrates high reproducibility of the measured pulse and can be applied to simulation studies of various radiation measurements.
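
    The emulation idea can be sketched as a superposition of single-photoelectron responses placed at the simulated photon arrival times. The exponential decay constant, the Gaussian response shape, and all numbers below are assumptions for illustration, not the authors' measured BC408 response function.

```python
import math
import random

random.seed(1)

# Simulated photon arrival times (ns): exponential scintillation decay,
# standing in for the Geant4 optical-transport output.
DECAY_NS = 2.1            # assumed decay constant
N_PHOTONS = 500
arrivals = [random.expovariate(1.0 / DECAY_NS) for _ in range(N_PHOTONS)]

# Hypothetical single-photoelectron (SPE) response: Gaussian of width SIGMA_NS.
SIGMA_NS = 1.5

def spe(t):
    return math.exp(-0.5 * (t / SIGMA_NS) ** 2)

# Emulated anode pulse = superposition of one SPE response per photon.
DT = 0.2                                  # ns per sample
samples = [sum(spe(i * DT - t0) for t0 in arrivals) for i in range(200)]

peak = max(samples)
```

    PSD observables such as tail-to-total charge ratios can then be computed from `samples` exactly as they would be from a digitized measured pulse.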

  1. Including Delbrück scattering in GEANT4

    NASA Astrophysics Data System (ADS)

    Omer, Mohamed; Hajima, Ryoichi

    2017-08-01

    Elastic scattering is a significant component of γ-ray interactions with matter. Therefore, the planning of experiments involving measurements of γ-rays using Monte Carlo simulations usually includes elastic scattering. However, current simulation tools do not provide a complete picture of elastic scattering. The majority of these tools assume that Rayleigh scattering is the primary contributor and neglect other elastic processes, such as nuclear Thomson and Delbrück scattering. Here, we develop a tabulation-based method to simulate elastic scattering in one of the most common open-source Monte Carlo simulation toolkits, GEANT4. We collectively include three processes: Rayleigh scattering, nuclear Thomson scattering, and Delbrück scattering. Our simulation more appropriately uses differential cross sections based on the second-order scattering matrix instead of the current data, which are based on the form factor approximation. Moreover, the superposition of these processes is carefully taken into account, reflecting the complex nature of the scattering amplitudes. The simulation covers an energy range of 0.01 MeV ≤ E ≤ 3 MeV and all elements with atomic numbers 1 ≤ Z ≤ 99. In addition, we validated our simulation by comparing differential cross sections measured in earlier experiments with those extracted from the simulations. We find that the simulations are in good agreement with the experimental measurements. Differences between experiment and simulation at 2.754 MeV are 21% for uranium, 24% for lead, 3% for tantalum, and 8% for cerium. Coulomb corrections to the Delbrück amplitudes may account for the relatively large differences that appear at higher Z values.
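
    The coherent superposition the abstract emphasises can be made concrete: elastic scattering amplitudes are complex numbers that must be summed before squaring, so the combined differential cross section differs from an incoherent sum of the individual processes. A minimal numerical sketch with made-up amplitude values:

```python
# Illustrative complex amplitudes (arbitrary units) at one angle and energy.
# The values are invented for demonstration, not tabulated physics data.
A_rayleigh = complex(-3.0, 0.4)
A_thomson  = complex(-0.2, 0.0)
A_delbruck = complex( 0.5, 0.3)

# Coherent sum: the interference (cross) terms matter, so |A_R+A_T+A_D|^2
# differs from the incoherent sum of the individual |A|^2 values.
coherent   = abs(A_rayleigh + A_thomson + A_delbruck) ** 2
incoherent = sum(abs(a) ** 2 for a in (A_rayleigh, A_thomson, A_delbruck))
```

    A tabulation-based implementation stores the summed-amplitude cross sections on an energy-angle grid per element and samples scattering angles from those tables at run time.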

  2. Development and validation of a GEANT4 radiation transport code for CT dosimetry

    PubMed Central

    Carver, DE; Kost, SD; Fernald, MJ; Lewis, KG; Fraser, ND; Pickens, DR; Price, RR; Stabin, MG

    2014-01-01

    We have created a radiation transport code using the GEANT4 Monte Carlo toolkit to simulate pediatric patients undergoing CT examinations. The focus of this paper is to validate our simulation with real-world physical dosimetry measurements using two independent techniques. Exposure measurements were made with a standard 100-mm CT pencil ionization chamber, and absorbed doses were also measured using optically stimulated luminescent (OSL) dosimeters. Measurements were made in air, a standard 16-cm acrylic head phantom, and a standard 32-cm acrylic body phantom. Physical dose measurements determined from the ionization chamber in air for 100 and 120 kVp beam energies were used to derive photon-fluence calibration factors. Both ion chamber and OSL measurement results provide useful comparisons in the validation of our Monte Carlo simulations. We found that simulated and measured CTDI values were within an overall average of 6% of each other. PMID:25706135
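
    For context, CTDI comparisons in such a validation conventionally use the weighted combination of centre and periphery pencil-chamber measurements. The sketch below uses hypothetical dose values, not the paper's data:

```python
# Hypothetical CTDI100 values (mGy) for a 120 kVp scan of the 32-cm body
# phantom: one pair from the simulation, one from the pencil ion chamber.
def ctdi_w(center, periphery):
    # Standard weighted CTDI: one third centre plus two thirds periphery.
    return center / 3.0 + 2.0 * periphery / 3.0

sim  = ctdi_w(center=7.6, periphery=14.9)
meas = ctdi_w(center=7.9, periphery=15.3)

percent_diff = 100.0 * abs(sim - meas) / meas
```

    Repeating this comparison across beam energies and phantoms gives the kind of overall average agreement figure quoted in the abstract.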

  3. Automating Software Design Metrics.

    DTIC Science & Technology

    1984-02-01

    HISTORICAL PERSPECTIVE: High quality software is of interest to both the software engineering community and its users. As...contributions of many other software engineering efforts, most notably [MCC 77] and [Boe 83b], which have defined and refined a framework for quantifying...AUTOMATION OF DESIGN METRICS: Software metrics can be useful within the context of an integrated software engineering environment. The purpose of this

  4. Studying the response of a plastic scintillator to gamma rays using the Geant4 Monte Carlo code.

    PubMed

    Ghadiri, Rasoul; Khorsandi, Jamshid

    2015-05-01

    To determine the gamma-ray response function of an NE-102 scintillator and to investigate the gamma spectra resulting from the transport of optical photons, we simulated an NE-102 scintillator using the Geant4 code. The results of the simulation were compared with experimental data, and good consistency between the two was observed. In addition, the time and spatial distributions, along with the energy distribution and surface treatments of scintillation detectors, were calculated. This simulation enables optimization of the photomultiplier tube (or photodiode) position to yield the best coupling to the detector. Copyright © 2015 Elsevier Ltd. All rights reserved.

  5. Creation of a Geant4 Muon Tomography Package for Imaging of Nuclear Fuel in Dry Cask Storage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsoukalas, Lefteri H.

    2016-03-01

    This is the final report of the NEUP project “Creation of a Geant4 Muon Tomography Package for Imaging of Nuclear Fuel in Dry Cask Storage”, DE-NE0000695. The project started on December 1, 2013 and this report covers the period December 1, 2013 through November 30, 2015. The project was successfully completed and this report provides an overview of the main achievements, results and findings throughout the duration of the project. Additional details can be found in the main body of this report and on the individual Quarterly Reports and associated Deliverables of the project, uploaded in PICS-NE.

  6. Optimization of 6LiF:ZnS(Ag) scintillator light yield using GEANT4

    NASA Astrophysics Data System (ADS)

    Yehuda-Zada, Y.; Pritchard, K.; Ziegler, J. B.; Cooksey, C.; Siebein, K.; Jackson, M.; Hurlbut, C.; Kadmon, Y.; Cohen, Y.; Ibberson, R. M.; Majkrzak, C. F.; Maliszewskyj, N. C.; Orion, I.; Osovizky, A.

    2018-06-01

    A new cold neutron detector has been developed at the NIST Center for Neutron Research (NCNR) for the CANDoR (Chromatic Analysis Neutron Diffractometer or Reflectometer) project. Geometric and performance constraints dictate that this detector be exceptionally thin (∼2 mm). For this reason, the design consists of a 6LiF:ZnS(Ag) scintillator with embedded wavelength-shifting (WLS) fibers. We used the GEANT4 package to simulate neutron capture and light transport in the detector, optimizing the composition and arrangement of materials to satisfy the competing requirements of high neutron capture probability and efficient light production and transport. In the process, we developed a method for predicting light collection and total neutron detection efficiency for different detector configurations. The simulation was performed by adjusting crucial parameters such as the scintillator stoichiometry, light yield, component grain size, WLS fiber geometry, and reflectors at the outside edges of the scintillator volume. Three different detector configurations were fabricated, and their test results were correlated with the simulations. Through this correlation we found a common photon threshold for the different detector configurations, which was then used to simulate and predict the efficiencies of many other configurations. New detectors fabricated based on the simulation results yield the desired sensitivity of 90% for 3.27 meV (5 Å) cold neutrons. The simulation has proven to be a useful tool, dramatically reducing the development period and the required number of detector prototypes, and it can be used to test new designs with different thicknesses and different target neutron energies.
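
    The competing requirements can be captured in a factorised toy model: total efficiency is the capture probability times the probability that the collected photons exceed the common threshold. Poisson photon statistics and all numbers are assumptions for illustration; the paper derives these quantities from the full GEANT4 light-transport simulation.

```python
import math

# Factorised efficiency model (a simplification of the full Geant4 chain):
# efficiency = P(neutron capture in 6LiF) * P(collected photons >= threshold).
def detection_efficiency(capture_prob, mean_photons, threshold):
    # Photon statistics approximated as Poisson; P(N >= threshold) is
    # 1 - CDF(threshold - 1).
    cdf = sum(math.exp(-mean_photons) * mean_photons ** k / math.factorial(k)
              for k in range(threshold))
    return capture_prob * (1.0 - cdf)

# Hypothetical numbers for a ~2 mm screen with WLS-fibre readout:
eff = detection_efficiency(capture_prob=0.95, mean_photons=12.0, threshold=4)
```

    Thickening the screen raises the capture probability but lowers the mean collected photons, which is exactly the trade-off the simulation scans over.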

  7. Software design and documentation language: User's guide for SDDL release 4

    NASA Technical Reports Server (NTRS)

    Zepko, T. M.

    1981-01-01

    The changes introduced in the PASCAL implementation of the software design and documentation language are described. These changes include a number of new capabilities, plus some changes to make the language more consistent and easier to use. Incompatibilities with earlier versions are limited to certain of the directive statements.

  8. Geant4 simulation of ion chambers response to 60Co spectrum of LNMRI/IRD Shepherd 81-14D Radiator

    NASA Astrophysics Data System (ADS)

    Queiroz Filho, P. P.; Da Silva, C. N. M.

    2018-03-01

    The National Ionizing Radiation Metrology Laboratory of the Radioprotection and Dosimetry Institute (LNMRI/IRD) has recently acquired a Shepherd 81-14D Radiator. In this work we use Geant4 to simulate the inverse-square-law behaviour of the radiation field for three models of PTW spherical chambers used in radioprotection, information relevant to planning measurements. Corrections for attenuation and scattering in air were applied at each distance, using a previously simulated 60Co spectrum.
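
    The air correction can be sketched with a single effective attenuation coefficient: after multiplying each measured rate by the inverse of the air transmission, the corrected rates recover the ideal 1/d² behaviour. The coefficient and the first-order model below are assumptions for illustration; the paper derives its corrections from the simulated 60Co spectrum.

```python
import math

MU_AIR = 7.5e-3   # 1/m, assumed effective attenuation coefficient in air

def relative_rate(d_m, k=1.0):
    # Point-source rate: pure inverse square times exponential air attenuation.
    return k / d_m**2 * math.exp(-MU_AIR * d_m)

def air_correction(d_m):
    # Correction factor that undoes the attenuation at distance d.
    return math.exp(MU_AIR * d_m)

r1, r2 = relative_rate(1.0), relative_rate(2.0)
ratio = (r1 * air_correction(1.0)) / (r2 * air_correction(2.0))
```

    Doubling the distance should quarter the corrected rate, so `ratio` comes out as 4; residual deviations in real data then reflect scattered radiation rather than geometry.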

  9. Software archeology: a case study in software quality assurance and design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Macdonald, John M; Lloyd, Jane A; Turner, Cameron J

    2009-01-01

    Ideally, quality is designed into software, just as quality is designed into hardware. However, when dealing with legacy systems, demonstrating that the software meets required quality standards may be difficult to achieve. As the need to demonstrate the quality of existing software was recognized at Los Alamos National Laboratory (LANL), an effort was initiated to uncover and demonstrate that legacy software met the required quality standards. This effort led to the development of a reverse engineering approach referred to as software archaeology. This paper documents the software archaeology approaches used at LANL to document legacy software systems. A case study for the Robotic Integrated Packaging System (RIPS) software is included.

  10. Evaluation of double photon coincidence Compton imaging method with GEANT4 simulation

    NASA Astrophysics Data System (ADS)

    Yoshihara, Yuri; Shimazoe, Kenji; Mizumachi, Yuki; Takahashi, Hiroyuki

    2017-11-01

    Compton imaging has been used for various applications including astronomical observations, radioactive waste management, and biomedical imaging. The positions of radioisotopes are determined from the intersections of multiple cone traces over a large number of events, which reduces the signal-to-noise ratio (SNR) of the images. We have developed an advanced Compton imaging method, the double photon coincidence Compton imaging method, which localizes radioisotopes with high SNR by using the Compton-scattering interactions of two gamma rays detected at the same time. The targeted radioisotopes are nuclides that emit several gamma rays simultaneously, such as 60Co, 134Cs, and 111In. Since their locations are determined at the intersections of two Compton cones, most of the cone traces disappear in three-dimensional space, which enhances both the SNR and the angular resolution. In this paper, the double photon coincidence Compton imaging method and the single photon Compton imaging method are compared using GEANT4 Monte Carlo simulation.
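
    The geometric idea can be sketched directly: a candidate voxel contributes to the coincidence image only if it lies on both Compton cones of a coincident pair. The cone parameters and test points below are invented for illustration.

```python
import math

def on_cone(point, apex, axis, theta, tol=0.05):
    # True if `point` lies (within tol radians) on the Compton cone with
    # the given apex, unit axis direction, and opening angle theta.
    v = [p - a for p, a in zip(point, apex)]
    norm = math.sqrt(sum(c * c for c in v))
    cosang = sum(c * ax for c, ax in zip(v, axis)) / norm
    return abs(math.acos(max(-1.0, min(1.0, cosang))) - theta) < tol

# Two hypothetical coincident cones from one decay emitting two gammas:
cone1 = dict(apex=(0.0, 0.0, 0.0), axis=(0.0, 0.0, 1.0), theta=math.radians(45))
cone2 = dict(apex=(2.0, 0.0, 0.0), axis=(0.0, 0.0, 1.0), theta=math.radians(45))

# A point on cone 1 only, and a point on both (a true intersection point):
p_single = (0.0, 1.0, 1.0)
p_both   = (1.0, 0.0, 1.0)

single_ok = on_cone(p_single, **cone1) and not on_cone(p_single, **cone2)
both_ok   = on_cone(p_both, **cone1) and on_cone(p_both, **cone2)
```

    Requiring both cone conditions discards most of each single cone's surface, which is why the coincidence method suppresses the cone-trace background of conventional Compton imaging.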

  11. Software Prototyping: Designing Systems for Users.

    ERIC Educational Resources Information Center

    Spies, Phyllis Bova

    1983-01-01

    Reports on major change in computer software development process--the prototype model, i.e., implementation of skeletal system that is enhanced during interaction with users. Expensive and unreliable software, software design errors, traditional development approach, resources required for prototyping, success stories, and systems designer's role…

  12. Assessment and improvements of Geant4 hadronic models in the context of prompt-gamma hadrontherapy monitoring

    NASA Astrophysics Data System (ADS)

    Dedes, G.; Pinto, M.; Dauvergne, D.; Freud, N.; Krimmer, J.; Létang, J. M.; Ray, C.; Testa, E.

    2014-04-01

    Monte Carlo simulations are nowadays essential tools for a wide range of research topics in the field of radiotherapy. They also play an important role in the effort to develop a real-time monitoring system for quality assurance in proton and carbon ion therapy, by means of prompt-gamma detection. The internal theoretical nuclear models of Monte Carlo simulation toolkits are of decisive importance for the accurate description of neutral or charged particle emission produced by nuclear interactions between beam particles and target nuclei. We assess the performance of Geant4 nuclear models in the context of prompt-gamma emission, comparing them with experimental data from proton and carbon ion beams. As has been shown in the past and further indicated in our study, the prompt-gamma yields are consistently overestimated by Geant4 by about 100% to 200% over an energy range from 80 to 310 MeV/u in the case of 12C, and to a lesser extent for 160 MeV protons. Furthermore, we focus on the quantum molecular dynamics (QMD) modeling of ion-ion collisions, in order to optimize its description of light nuclei, which are abundant in the human body and therefore central to hadrontherapy applications. The optimization was performed by benchmarking the QMD free parameters against well-established nuclear properties. In addition, we study the effect of this optimization on charged particle emission. With the proposed parameter values, discrepancies are reduced to less than 70%, with the highest values attributed to nucleon-ion induced prompt-gammas. This conclusion, also confirmed by the disagreement we observe for proton beams, indicates the need for further investigation of nuclear models that describe proton- and neutron-induced nuclear reactions.
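
    Benchmarking free parameters against established nuclear properties amounts to minimising a goodness-of-fit objective. The sketch below uses the measured total binding energies of 12C and 16O as reference observables; the model values and tolerances are hypothetical, and the real tuning compares many more properties.

```python
# Chi-square-style objective for one candidate QMD parameter set.
reference = {"12C": 92.16, "16O": 127.62}      # measured binding energies (MeV)
model     = {"12C": 89.90, "16O": 130.10}      # hypothetical model output
sigma     = {"12C": 2.0,   "16O": 2.0}         # assumed tolerances (MeV)

chi2 = sum(((model[n] - reference[n]) / sigma[n]) ** 2 for n in reference)
```

    Scanning the free parameters and keeping the set with the smallest `chi2` is the essence of the optimization described above; the tuned set is then re-validated against the prompt-gamma and charged-particle data.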

  13. Development and validation of a GEANT4 radiation transport code for CT dosimetry.

    PubMed

    Carver, D E; Kost, S D; Fernald, M J; Lewis, K G; Fraser, N D; Pickens, D R; Price, R R; Stabin, M G

    2015-04-01

    The authors have created a radiation transport code using the GEANT4 Monte Carlo toolkit to simulate pediatric patients undergoing CT examinations. The focus of this paper is to validate their simulation with real-world physical dosimetry measurements using two independent techniques. Exposure measurements were made with a standard 100-mm CT pencil ionization chamber, and absorbed doses were also measured using optically stimulated luminescent (OSL) dosimeters. Measurements were made in air with a standard 16-cm acrylic head phantom and with a standard 32-cm acrylic body phantom. Physical dose measurements determined from the ionization chamber in air for 100 and 120 kVp beam energies were used to derive photon-fluence calibration factors. Both ion chamber and OSL measurement results provide useful comparisons in the validation of the Monte Carlo simulations. It was found that simulated and measured CTDI values were within an overall average of 6% of each other.

  14. Balloon Design Software

    NASA Technical Reports Server (NTRS)

    Farley, Rodger

    2007-01-01

    PlanetaryBalloon Version 5.0 is a software package for the design of meridionally lobed planetary balloons. It operates in a Windows environment, and programming was done in Visual Basic 6. By including the effects of circular lobes with load tapes, skin mass, hoop and meridional stress, and elasticity in the structural elements, a more accurate balloon shape of practical construction can be determined, as well as the room-temperature cut pattern for the gore shapes. The computer algorithm is formulated for sizing meridionally lobed balloons for any generalized atmosphere or planet, and covers zero-pressure, over-pressure, and super-pressure balloons. Low circumferential loads with meridionally reinforced load tapes will produce shapes close to what is known as the "natural shape." The software allows for the design of constant-angle, constant-radius, or constant-hoop-stress balloons. It uses the desired payload capacity for given atmospheric conditions and determines the required volume, allowing users to design exactly to their requirements. The formulations are generalized to use any lift gas (or mixture of gases), any atmosphere, and any planet as described by the local acceleration of gravity. The PlanetaryBalloon software has a comprehensive user manual that covers features including buoyancy and super-pressure, convenient design equations, shape formulation, and orthotropic stress/strain.
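
    The core sizing step, determining the required volume from the desired payload, reduces to a buoyancy balance. This is a minimal sketch with assumed sea-level Earth densities; PlanetaryBalloon additionally handles lobed geometry, stresses, and cut patterns, and generalizes the densities to any atmosphere and lift gas.

```python
# Minimal buoyancy sizing: displaced-air lift must carry the total mass.
RHO_AIR = 1.225    # kg/m^3, sea-level Earth air (assumed conditions)
RHO_HE  = 0.1786   # kg/m^3, helium at the same conditions

def required_volume(payload_kg, balloon_mass_kg):
    # Net lift per cubic metre is the density difference of air and lift gas.
    return (payload_kg + balloon_mass_kg) / (RHO_AIR - RHO_HE)

v = required_volume(payload_kg=50.0, balloon_mass_kg=12.0)
```

    At altitude or on another planet the same balance applies with the local gas densities, which is how the formulation generalizes across atmospheres.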

  15. SDDL- SOFTWARE DESIGN AND DOCUMENTATION LANGUAGE

    NASA Technical Reports Server (NTRS)

    Kleine, H.

    1994-01-01

    Effective, efficient communication is an essential element of the software development process. The Software Design and Documentation Language (SDDL) provides an effective communication medium to support the design and documentation of complex software applications. SDDL supports communication between all the members of a software design team and provides for the production of informative documentation on the design effort. Even when an entire development task is performed by a single individual, it is important to explicitly express and document communication between the various aspects of the design effort including concept development, program specification, program development, and program maintenance. SDDL ensures that accurate documentation will be available throughout the entire software life cycle. SDDL offers an extremely valuable capability for the design and documentation of complex programming efforts ranging from scientific and engineering applications to data management and business systems. Throughout the development of a software design, the SDDL generated Software Design Document always represents the definitive word on the current status of the ongoing, dynamic design development process. The document is easily updated and readily accessible in a familiar, informative form to all members of the development team. This makes the Software Design Document an effective instrument for reconciling misunderstandings and disagreements in the development of design specifications, engineering support concepts, and the software design itself. Using the SDDL generated document to analyze the design makes it possible to eliminate many errors that might not be detected until coding and testing are attempted. As a project management aid, the Software Design Document is useful for monitoring progress and for recording task responsibilities. SDDL is a combination of language, processor, and methodology.
The SDDL syntax consists of keywords to invoke design structures

  16. Validation of Geant4 fragmentation for Heavy Ion Therapy

    NASA Astrophysics Data System (ADS)

    Bolst, David; Cirrone, Giuseppe A. P.; Cuttone, Giacomo; Folger, Gunter; Incerti, Sebastien; Ivanchenko, Vladimir; Koi, Tatsumi; Mancusi, Davide; Pandola, Luciano; Romano, Francesco; Rosenfeld, Anatoly B.; Guatelli, Susanna

    2017-10-01

    12C ion therapy has attracted growing interest in recent years for its excellent dose conformity. However, at therapeutic energies, which can be as high as 400 MeV/u, carbon ions produce secondary fragments: for an incident 400 MeV/u 12C ion beam, ∼70% of the beam will undergo fragmentation before the Bragg peak. The dosimetric and radiobiological impact of these fragments must be accurately characterised, as they can increase the risk of secondary cancer for the patient as well as alter the relative biological effectiveness. This work investigates the accuracy of three different nuclear fragmentation models available in the Monte Carlo toolkit Geant4: the Binary Intranuclear Cascade (BIC), the Quantum Molecular Dynamics (QMD), and the Liège Intranuclear Cascade (INCL++). The models were benchmarked against experimental data for a pristine 400 MeV/u 12C beam incident upon a water phantom, including fragment yield, angular distribution, and energy distribution. For fragment yields, the three models agreed with experimental measurements to between ∼5 and ∼35%; QMD with the "Frag" option gave the best agreement for lighter fragments but reduced agreement for larger fragments. For angular distributions, INCL++ provided the best agreement among the models for all elements except hydrogen, while BIC and QMD produced broader distributions than experiment. BIC and QMD performed similarly to one another for kinetic energy distributions, while INCL++ produced lower energy distributions than the other models and experiment.

  17. Validation of Geant4 fragmentation for Heavy Ion Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bolst, David; Cirrone, Giuseppe A. P.; Cuttone, Giacomo

    12C ion therapy has attracted growing interest in recent years for its excellent dose conformity. However, at therapeutic energies, which can be as high as 400 MeV/u, carbon ions produce secondary fragments: for an incident 400 MeV/u 12C ion beam, ~70% of the beam will undergo fragmentation before the Bragg peak. The dosimetric and radiobiological impact of these fragments must be accurately characterised, as they can increase the risk of secondary cancer for the patient as well as alter the relative biological effectiveness. This work investigates the accuracy of three different nuclear fragmentation models available in the Monte Carlo toolkit Geant4: the Binary Intranuclear Cascade (BIC), the Quantum Molecular Dynamics (QMD), and the Liège Intranuclear Cascade (INCL++). The models were benchmarked against experimental data for a pristine 400 MeV/u 12C beam incident upon a water phantom, including fragment yield, angular distribution, and energy distribution. For fragment yields, the three models agreed with experimental measurements to between ~5 and ~35%; QMD with the "Frag" option gave the best agreement for lighter fragments but reduced agreement for larger fragments. For angular distributions, INCL++ provided the best agreement among the models for all elements except hydrogen, while BIC and QMD produced broader distributions than experiment. BIC and QMD performed similarly to one another for kinetic energy distributions, while INCL++ produced lower energy distributions than the other models and experiment.

  18. Validation of Geant4 fragmentation for Heavy Ion Therapy

    DOE PAGES

    Bolst, David; Cirrone, Giuseppe A. P.; Cuttone, Giacomo; ...

    2017-07-12

    12C ion therapy has attracted growing interest in recent years for its excellent dose conformity. However, at therapeutic energies, which can be as high as 400 MeV/u, carbon ions produce secondary fragments: for an incident 400 MeV/u 12C ion beam, ~70% of the beam will undergo fragmentation before the Bragg peak. The dosimetric and radiobiological impact of these fragments must be accurately characterised, as they can increase the risk of secondary cancer for the patient as well as alter the relative biological effectiveness. This work investigates the accuracy of three different nuclear fragmentation models available in the Monte Carlo toolkit Geant4: the Binary Intranuclear Cascade (BIC), the Quantum Molecular Dynamics (QMD), and the Liège Intranuclear Cascade (INCL++). The models were benchmarked against experimental data for a pristine 400 MeV/u 12C beam incident upon a water phantom, including fragment yield, angular distribution, and energy distribution. For fragment yields, the three models agreed with experimental measurements to between ~5 and ~35%; QMD with the "Frag" option gave the best agreement for lighter fragments but reduced agreement for larger fragments. For angular distributions, INCL++ provided the best agreement among the models for all elements except hydrogen, while BIC and QMD produced broader distributions than experiment. BIC and QMD performed similarly to one another for kinetic energy distributions, while INCL++ produced lower energy distributions than the other models and experiment.

  19. Design and Construction of a Positron Emission Tomography (PET) Unit and Medical Applications with GEANT Detector Simulation Package

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karagoz, Muge

    1998-01-01

    In order to investigate the possibility of constructing a sample PET coincidence unit in our HEP laboratory, a setup with two face-to-face PMTs and two 2×8 CsI(Tl) scintillator matrices has been constructed. In this setup, 1-D projections of a point-like 22Na positron source at different angles have been measured, and from these projections a 2-D image has been formed. Monte Carlo studies of this setup have been implemented using the detector simulation tool in the CERN program library, GEANT. A sample human body was also created with GEANT to study the effects of proton therapy. Utilization of the simulation as a pre-therapy tool is also investigated.

  20. Shuttle mission simulator software conceptual design

    NASA Technical Reports Server (NTRS)

    Burke, J. F.

    1973-01-01

    Software conceptual designs (SCD) are presented for meeting the simulator requirements for the shuttle missions. The major areas of the SCD discussed include: malfunction insertion, flight software, applications software, systems software, and computer complex.

  1. Software Design Methods for Real-Time Systems

    DTIC Science & Technology

    1989-12-01

    This module describes the concepts and methods used in the software design of real-time systems. It outlines the characteristics of real-time systems, describes...the role of software design in real-time system development, surveys and compares some software design methods for real-time systems, and

  2. Empirical studies of software design: Implications for SSEs

    NASA Technical Reports Server (NTRS)

    Krasner, Herb

    1988-01-01

    Implications for Software Engineering Environments (SEEs) are presented in viewgraph format for characteristics of projects studied; significant problems and crucial problem areas in software design for large systems; layered behavioral model of software processes; implications of field study results; software project as an ecological system; results of the LIFT study; information model of design exploration; software design strategies; results of the team design study; and a list of publications.

  3. Using Software Design Methods in CALL

    ERIC Educational Resources Information Center

    Ward, Monica

    2006-01-01

    The phrase "software design" is not one that arouses the interest of many CALL practitioners, particularly those from a humanities background. However, software design essentials are simply logical ways of going about designing a system. The fundamentals include modularity, anticipation of change, generality and an incremental approach. While CALL…

  4. Geant4 Simulations for the Radon Electric Dipole Moment Search at TRIUMF

    NASA Astrophysics Data System (ADS)

    Rand, Evan; Bangay, Jack; Bianco, Laura; Dunlop, Ryan; Finlay, Paul; Garrett, Paul; Leach, Kyle; Phillips, Andrew; Svensson, Carl; Sumithrarachchi, Chandana; Wong, James

    2010-11-01

    The existence of a permanent electric dipole moment (EDM) requires the violation of time-reversal symmetry (T) or, equivalently, the violation of charge conjugation C and parity P (CP). Although no particle EDM has yet been found, current theories beyond the Standard Model, e.g. multiple-Higgs theories, left-right symmetry, and supersymmetry, predict EDMs within current experimental reach. In fact, present limits on the EDMs of the neutron, electron and 199Hg atom have significantly reduced the parameter spaces of these models. The measurement of a non-zero EDM would be a direct measurement of the violation of time-reversal symmetry, and would represent a clear signal of new physics beyond the Standard Model. Recent theoretical calculations predict large enhancements in the atomic EDMs for atoms with octupole-deformed nuclei, making odd-A Rn isotopes prime candidates for the EDM search. The Geant4 simulations presented here are essential for the development towards an EDM measurement. They provide an accurate description of γ-ray scattering and backgrounds in the experimental apparatus, and are being used to study the overall sensitivity of the RnEDM experiment at TRIUMF in Vancouver, B.C.

  5. Language and Program for Documenting Software Design

    NASA Technical Reports Server (NTRS)

    Kleine, H.; Zepko, T. M.

    1986-01-01

    Software Design and Documentation Language (SDDL) provides effective communication medium to support design and documentation of complex software applications. SDDL supports communication among all members of software design team and provides for production of informative documentation on design effort. Use of SDDL-generated document to analyze design makes it possible to eliminate many errors not detected until coding and testing attempted. SDDL processor program translates designer's creative thinking into effective document for communication. Processor performs as many automatic functions as possible, freeing designer's energy for creative effort. SDDL processor program written in PASCAL.

  6. An empirical study of software design practices

    NASA Technical Reports Server (NTRS)

    Card, David N.; Church, Victor E.; Agresti, William W.

    1986-01-01

    Software engineers have developed a large body of software design theory and folklore, much of which was never validated. The results of an empirical study of software design practices in one specific environment are presented. The practices examined affect module size, module strength, data coupling, descendant span, unreferenced variables, and software reuse. Measures characteristic of these practices were extracted from 887 FORTRAN modules developed for five flight dynamics software projects monitored by the Software Engineering Laboratory (SEL). The relationship of these measures to cost and fault rate was analyzed using a contingency table procedure. The results show that some recommended design practices, despite their intuitive appeal, are ineffective in this environment, whereas others are very effective.
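
    The contingency table procedure mentioned above can be sketched with a chi-square statistic over a 2×2 table relating a design practice to fault rate. The counts below are hypothetical, not the SEL study's data:

```python
# 2x2 contingency table: rows = practice followed / not followed,
# columns = low / high fault rate (hypothetical module counts).
table = [[60, 20],
         [30, 40]]

row = [sum(r) for r in table]            # row totals
col = [sum(c) for c in zip(*table)]      # column totals
n = sum(row)

# Pearson chi-square: sum of (observed - expected)^2 / expected.
chi2 = sum((table[i][j] - row[i] * col[j] / n) ** 2 / (row[i] * col[j] / n)
           for i in range(2) for j in range(2))

# With 1 degree of freedom, chi2 > 3.84 indicates association at the 5% level.
significant = chi2 > 3.84
```

    Repeating this test for each practice against cost and fault-rate outcomes is how a study of this kind separates the practices that genuinely matter from intuitively appealing but ineffective ones.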

  7. Software Design Improvements. Part 1; Software Benefits and Limitations

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R.; Packard, Michael H.; Ziemianski, Tom

    1997-01-01

    Computer hardware and associated software have been used for many years to process accounting information, to analyze test data, and to perform engineering analysis. Now computers and software also control everything from automobiles to washing machines, and the number and type of applications are growing at an exponential rate. The size of individual programs has shown similar growth. Furthermore, software and hardware are used to monitor and/or control potentially dangerous products and safety-critical systems, from airplanes and braking systems to medical devices and nuclear plants. The question is: how can this hardware and software be made more reliable? Also, how can software quality be improved? What methodology needs to be applied to large and small software products to improve the design, and how can software be verified?

  8. Validation of GEANT4 Monte Carlo models with a highly granular scintillator-steel hadron calorimeter

    NASA Astrophysics Data System (ADS)

    Adloff, C.; Blaha, J.; Blaising, J.-J.; Drancourt, C.; Espargilière, A.; Gaglione, R.; Geffroy, N.; Karyotakis, Y.; Prast, J.; Vouters, G.; Francis, K.; Repond, J.; Schlereth, J.; Smith, J.; Xia, L.; Baldolemar, E.; Li, J.; Park, S. T.; Sosebee, M.; White, A. P.; Yu, J.; Buanes, T.; Eigen, G.; Mikami, Y.; Watson, N. K.; Mavromanolakis, G.; Thomson, M. A.; Ward, D. R.; Yan, W.; Benchekroun, D.; Hoummada, A.; Khoulaki, Y.; Apostolakis, J.; Dotti, A.; Folger, G.; Ivantchenko, V.; Uzhinskiy, V.; Benyamna, M.; Cârloganu, C.; Fehr, F.; Gay, P.; Manen, S.; Royer, L.; Blazey, G. C.; Dyshkant, A.; Lima, J. G. R.; Zutshi, V.; Hostachy, J.-Y.; Morin, L.; Cornett, U.; David, D.; Falley, G.; Gadow, K.; Göttlicher, P.; Günter, C.; Hermberg, B.; Karstensen, S.; Krivan, F.; Lucaci-Timoce, A.-I.; Lu, S.; Lutz, B.; Morozov, S.; Morgunov, V.; Reinecke, M.; Sefkow, F.; Smirnov, P.; Terwort, M.; Vargas-Trevino, A.; Feege, N.; Garutti, E.; Marchesini, I.; Ramilli, M.; Eckert, P.; Harion, T.; Kaplan, A.; Schultz-Coulon, H.-Ch; Shen, W.; Stamen, R.; Bilki, B.; Norbeck, E.; Onel, Y.; Wilson, G. W.; Kawagoe, K.; Dauncey, P. D.; Magnan, A.-M.; Bartsch, V.; Wing, M.; Salvatore, F.; Calvo Alamillo, E.; Fouz, M.-C.; Puerta-Pelayo, J.; Bobchenko, B.; Chadeeva, M.; Danilov, M.; Epifantsev, A.; Markin, O.; Mizuk, R.; Novikov, E.; Popov, V.; Rusinov, V.; Tarkovsky, E.; Kirikova, N.; Kozlov, V.; Smirnov, P.; Soloviev, Y.; Buzhan, P.; Ilyin, A.; Kantserov, V.; Kaplin, V.; Karakash, A.; Popova, E.; Tikhomirov, V.; Kiesling, C.; Seidel, K.; Simon, F.; Soldner, C.; Szalay, M.; Tesar, M.; Weuste, L.; Amjad, M. 
S.; Bonis, J.; Callier, S.; Conforti di Lorenzo, S.; Cornebise, P.; Doublet, Ph; Dulucq, F.; Fleury, J.; Frisson, T.; van der Kolk, N.; Li, H.; Martin-Chassard, G.; Richard, F.; de la Taille, Ch; Pöschl, R.; Raux, L.; Rouëné, J.; Seguin-Moreau, N.; Anduze, M.; Boudry, V.; Brient, J.-C.; Jeans, D.; Mora de Freitas, P.; Musat, G.; Reinhard, M.; Ruan, M.; Videau, H.; Bulanek, B.; Zacek, J.; Cvach, J.; Gallus, P.; Havranek, M.; Janata, M.; Kvasnicka, J.; Lednicky, D.; Marcisovsky, M.; Polak, I.; Popule, J.; Tomasek, L.; Tomasek, M.; Ruzicka, P.; Sicho, P.; Smolik, J.; Vrba, V.; Zalesak, J.; Belhorma, B.; Ghazlane, H.; Takeshita, T.; Uozumi, S.; Götze, M.; Hartbrich, O.; Sauer, J.; Weber, S.; Zeitnitz, C.

    2013-07-01

    Calorimeters with a high granularity are a fundamental requirement of the Particle Flow paradigm. This paper focuses on the prototype of a hadron calorimeter with analog readout, consisting of thirty-eight scintillator layers alternating with steel absorber planes. The scintillator plates are finely segmented into tiles individually read out via Silicon Photomultipliers. The presented results are based on data collected with pion beams in the energy range from 8 GeV to 100 GeV. The fine segmentation of the sensitive layers and the high sampling frequency allow for an excellent reconstruction of the spatial development of hadronic showers. A comparison between data and Monte Carlo simulations is presented, concerning both the longitudinal and lateral development of hadronic showers and the global response of the calorimeter. The performance of several GEANT4 physics lists with respect to these observables is evaluated.

  9. Comparison of Geant4 multiple Coulomb scattering models with theory for radiotherapy protons

    NASA Astrophysics Data System (ADS)

    Makarova, Anastasia; Gottschalk, Bernard; Sauerwein, Wolfgang

    2017-08-01

    Usually, Monte Carlo models are validated against experimental data. However, models of multiple Coulomb scattering (MCS) in the Gaussian approximation are exceptional in that we have theories which are probably more accurate than the experiments which have, so far, been done to test them. In problems directly sensitive to the distribution of angles leaving the target, the relevant theory is the Molière/Fano/Hanson variant of Molière theory (Gottschalk et al 1993 Nucl. Instrum. Methods Phys. Res. B 74 467-90). For transverse spreading of the beam in the target itself, the theory of Preston and Koehler (Gottschalk (2012 arXiv:1204.4470)) holds. Therefore, in this paper we compare Geant4 simulations, using the Urban and Wentzel models of MCS, with theory rather than experiment, revealing trends which would otherwise be obscured by experimental scatter. For medium-energy (radiotherapy) protons, and low-Z (water-like) target materials, Wentzel appears to be better than Urban in simulating the distribution of outgoing angles. For beam spreading in the target itself, the two models are essentially equal.
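For context, the Gaussian MCS theories compared here are often summarized by Highland-type parameterizations of the characteristic scattering angle. The sketch below implements Highland's formula (an approximation related to, but not identical to, the Molière/Fano/Hanson calculation the paper uses); the beam and target numbers are illustrative:

```python
import math

def highland_theta0(pv_mev, thickness_over_rad_length):
    """Characteristic multiple-scattering angle (radians), Highland's formula.

    pv_mev: product of momentum and velocity (p*v) in MeV.
    thickness_over_rad_length: target thickness in units of radiation length.
    """
    t = thickness_over_rad_length
    return (14.1 / pv_mev) * math.sqrt(t) * (1.0 + (1.0 / 9.0) * math.log10(t))

# Rough numbers for a ~160 MeV proton (pv ~ 270 MeV) traversing 1 cm of
# water (radiation length ~ 36 cm); purely illustrative, a few mrad:
theta0 = highland_theta0(270.0, 1.0 / 36.0)
print(f"theta0 ~ {1000 * theta0:.1f} mrad")
```

Because such formulas predict angles to the percent level, comparing Geant4's Urban and Wentzel models against theory exposes systematic trends that scatter in experimental data would hide.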

  10. Comparison of Geant4 multiple Coulomb scattering models with theory for radiotherapy protons.

    PubMed

    Makarova, Anastasia; Gottschalk, Bernard; Sauerwein, Wolfgang

    2017-07-06

    Usually, Monte Carlo models are validated against experimental data. However, models of multiple Coulomb scattering (MCS) in the Gaussian approximation are exceptional in that we have theories which are probably more accurate than the experiments which have, so far, been done to test them. In problems directly sensitive to the distribution of angles leaving the target, the relevant theory is the Molière/Fano/Hanson variant of Molière theory (Gottschalk et al 1993 Nucl. Instrum. Methods Phys. Res. B 74 467-90). For transverse spreading of the beam in the target itself, the theory of Preston and Koehler (Gottschalk (2012 arXiv:1204.4470)) holds. Therefore, in this paper we compare Geant4 simulations, using the Urban and Wentzel models of MCS, with theory rather than experiment, revealing trends which would otherwise be obscured by experimental scatter. For medium-energy (radiotherapy) protons, and low-Z (water-like) target materials, Wentzel appears to be better than Urban in simulating the distribution of outgoing angles. For beam spreading in the target itself, the two models are essentially equal.

  11. Structural Analysis and Design Software

    NASA Technical Reports Server (NTRS)

    1997-01-01

Collier Research and Development Corporation received a one-of-a-kind computer code for designing exotic hypersonic aircraft, called ST-SIZE, under the first-ever Langley Research Center software copyright license agreement. Collier transformed the NASA computer code into a commercial software package called HyperSizer, which integrates with other private-sector Finite Element Modeling and Finite Element Analysis structural analysis programs. ST-SIZE was chiefly conceived as a means to improve and speed the structural design of a future aerospace plane for the Langley Hypersonic Vehicles Office. Incorporating the NASA computer code into HyperSizer has enabled the company to also apply the software to applications other than aerospace, including improved design and construction for offices, marine structures, cargo containers, commercial and military aircraft, rail cars, and a host of everyday consumer products.

  12. Design and Effects of Scenario Educational Software.

    ERIC Educational Resources Information Center

    Keegan, Mark

    1993-01-01

    Describes the development of educational computer software called scenario software that was designed to incorporate advances in cognitive, affective, and physiological research. Instructional methods are outlined; the need to change from didactic methods to discovery learning is explained; and scenario software design features are discussed. (24…

  13. Aviation Design Software

    NASA Technical Reports Server (NTRS)

    1997-01-01

DARcorporation developed a General Aviation CAD package through a Small Business Innovation Research contract from Langley Research Center. This affordable, user-friendly preliminary design system for General Aviation aircraft runs on popular 486 IBM-compatible personal computers. Individuals taking the home-built approach, small manufacturers of General Aviation airplanes, as well as students and others interested in the analysis and design of aircraft are possible users of the package. The software can cut design and development time in half.

  14. Rapid Development of Custom Software Architecture Design Environments

    DTIC Science & Technology

    1999-08-01

This dissertation describes a new approach to capturing and using architectural design expertise in software architecture design environments. A language and tools are presented for capturing and encapsulating software architecture design expertise within a conceptual framework of architectural styles and design rules. The design expertise thus captured is supported with an incrementally configurable software architecture design environment.

  15. ClassCompass: A Software Design Mentoring System

    ERIC Educational Resources Information Center

    Coelho, Wesley; Murphy, Gail

    2007-01-01

    Becoming a quality software developer requires practice under the guidance of an expert mentor. Unfortunately, in most academic environments, there are not enough experts to provide any significant design mentoring for software engineering students. To address this problem, we present a collaborative software design tool intended to maximize an…

  16. Reflight certification software design specifications

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The PDSS/IMC Software Design Specification for the Payload Development Support System (PDSS)/Image Motion Compensator (IMC) is contained. The PDSS/IMC is to be used for checkout and verification of the IMC flight hardware and software by NASA/MSFC.

  17. Software design by reusing architectures

    NASA Technical Reports Server (NTRS)

    Bhansali, Sanjay; Nii, H. Penny

    1992-01-01

Abstraction fosters reuse by providing a class of artifacts that can be instantiated or customized to produce a set of artifacts meeting different specific requirements. It is proposed that significant leverage can be obtained by abstracting software system designs and the design process. The result of such an abstraction is a generic architecture and a set of knowledge-based customization tools that can be used to instantiate the generic architecture. An approach for designing software systems based on the above idea is described. The approach is illustrated through an implemented example, and the advantages and limitations of the approach are discussed.

  18. Designing Control System Application Software for Change

    NASA Technical Reports Server (NTRS)

    Boulanger, Richard

    2001-01-01

The Unified Modeling Language (UML) was used to design the Environmental Systems Test Stand (ESTS) control system software. The UML was chosen for its ability to facilitate a clear dialog between software designer and customer, from which requirements are discovered and documented in a manner that transposes directly to program objects. Applying the UML to control system software design has resulted in a baseline set of documents from which change, and the effort of that change, can be accurately measured. As the Environmental Systems Test Stand evolves, accurate estimates of the time and effort required to change the control system software will be made. Accurate quantification of the cost of software change can be made before implementation, improving schedule and budget accuracy.

  19. Software design studies emphasizing Project LOGOS

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The results of a research project on the development of computer software are presented. Research funds of $200,000 were expended over a three year period for software design and projects in connection with Project LOGOS (computer-aided design and certification of computing systems). Abstracts of theses prepared during the project are provided.

  20. Nuclear reaction measurements on tissue-equivalent materials and GEANT4 Monte Carlo simulations for hadrontherapy

    NASA Astrophysics Data System (ADS)

    De Napoli, M.; Romano, F.; D'Urso, D.; Licciardello, T.; Agodi, C.; Candiano, G.; Cappuzzello, F.; Cirrone, G. A. P.; Cuttone, G.; Musumarra, A.; Pandola, L.; Scuderi, V.

    2014-12-01

When a carbon beam interacts with human tissues, many secondary fragments are produced in the tumor region and the surrounding healthy tissues. Therefore, precise dose calculations in hadrontherapy require Monte Carlo tools equipped with complex nuclear reaction models. To give realistic predictions, however, simulation codes must be validated against experimental results; the wider the dataset, the more finely the models can be tuned. Since no fragmentation data for tissue-equivalent materials at Fermi energies are available in the literature, we measured secondary fragments produced by the interaction of a 55.6 MeV u-1 12C beam with thick muscle and cortical bone targets. Three reaction models used by the Geant4 Monte Carlo code, the Binary Light Ion Cascade, the Quantum Molecular Dynamics and the Liège Intranuclear Cascade, have been benchmarked against the collected data. In this work we present the experimental results and discuss the predictive power of the above-mentioned models.

  1. Current And Future Directions Of Lens Design Software

    NASA Astrophysics Data System (ADS)

    Gustafson, Darryl E.

    1983-10-01

The most effective environment for doing lens design continues to evolve as new computer hardware and software tools become available. Important recent hardware developments include: Low-cost but powerful interactive multi-user 32-bit computers with virtual memory that are totally software-compatible with prior larger and more expensive members of the family. A rapidly growing variety of graphics devices for both hard-copy and screen graphics, including many with color capability. In addition, with optical design software readily accessible in many forms, optical design has become a part-time activity for a large number of engineers instead of being restricted to a small number of full-time specialists. A designer interface that is friendly for the part-time user while remaining efficient for the full-time designer is thus becoming more important as well as more practical. Along with these developments, software tools in other scientific and engineering disciplines are proliferating. Thus, the optical designer is less and less unique in his use of computer-aided techniques and faces the challenge and opportunity of efficiently communicating his designs to other computer-aided-design (CAD), computer-aided-manufacturing (CAM), structural, thermal, and mechanical software tools. This paper will address the impact of these developments on the current and future directions of the CODE V™ optical design software package, its implementation, and the resulting lens design environment.

  2. Army-NASA aircrew/aircraft integration program: Phase 4 A(3)I Man-Machine Integration Design and Analysis System (MIDAS) software detailed design document

    NASA Technical Reports Server (NTRS)

    Banda, Carolyn; Bushnell, David; Chen, Scott; Chiu, Alex; Constantine, Betsy; Murray, Jerry; Neukom, Christian; Prevost, Michael; Shankar, Renuka; Staveland, Lowell

    1991-01-01

    The Man-Machine Integration Design and Analysis System (MIDAS) is an integrated suite of software components that constitutes a prototype workstation to aid designers in applying human factors principles to the design of complex human-machine systems. MIDAS is intended to be used at the very early stages of conceptual design to provide an environment wherein designers can use computational representations of the crew station and operator, instead of hardware simulators and man-in-the-loop studies, to discover problems and ask 'what if' questions regarding the projected mission, equipment, and environment. This document is the Software Product Specification for MIDAS. Introductory descriptions of the processing requirements, hardware/software environment, structure, I/O, and control are given in the main body of the document for the overall MIDAS system, with detailed discussion of the individual modules included in Annexes A-J.

  3. SEPAC flight software detailed design specifications, volume 1

    NASA Technical Reports Server (NTRS)

    1982-01-01

The detailed design specifications (as built) for the SEPAC Flight Software are defined. The design includes a description of the total software system and of each individual module within the system. The design specifications describe the decomposition of the software system into its major components. The system structure is expressed in the following forms: the control-flow hierarchy of the system, the data-flow structure of the system, the task hierarchy, the memory structure, and the software-to-hardware configuration mapping. The component design description includes details on the following elements: register conventions, module (subroutine) invocation, module functions, interrupt servicing, data definitions, and database structure.

  4. Software for simulation of a computed tomography imaging spectrometer using optical design software

    NASA Astrophysics Data System (ADS)

    Spuhler, Peter T.; Willer, Mark R.; Volin, Curtis E.; Descour, Michael R.; Dereniak, Eustace L.

    2000-11-01

Our imaging spectrometer simulation software, known as Eikon, should improve and speed up the design of a Computed Tomography Imaging Spectrometer (CTIS). Eikon uses existing raytracing software to simulate a virtual instrument. Eikon enables designers to virtually run through the design, calibration and data acquisition, saving significant cost and time when designing an instrument. We anticipate that Eikon simulations will improve future designs of CTIS by allowing engineers to explore more instrument options.

  5. Designing Computerized Provider Order Entry Software in Iran: The Nurses' and Physicians' Viewpoints.

    PubMed

    Khammarnia, Mohammad; Sharifian, Roxana; Zand, Farid; Keshtkaran, Ali; Barati, Omid

    2016-09-01

This study aimed to identify the functional requirements of computerized provider order entry software and to design this software in Iran. The study was conducted using document review, interviews, and focus group discussions at Shiraz University of Medical Sciences, a leading medical center in Iran, in 2013-2015. The study sample consisted of physicians (n = 12) and nurses (n = 2) in the largest hospital in southern Iran and information technology experts (n = 5) at Shiraz University of Medical Sciences. Functional requirements of the computerized provider order entry system were examined in three phases. Finally, the functional requirements were distributed across four levels, and accordingly, the computerized provider order entry software was designed. The software had seven main dimensions: (1) data entry, (2) drug interaction management system, (3) warning system, (4) treatment services, (5) ability to write in the software, (6) reporting from all sections of the software, and (7) technical capabilities of the software. The nurses and physicians emphasized quick access to the computerized provider order entry software, the order prescription section, and the applicability of the software. The software had some features that had not been mentioned in other studies. Ultimately, the software was designed by a company specializing in hospital information systems in Iran. This study was the first specific investigation of computerized provider order entry software design in Iran. Based on the results, it is suggested that this software be implemented in hospitals.

  6. General purpose optimization software for engineering design

    NASA Technical Reports Server (NTRS)

    Vanderplaats, G. N.

    1990-01-01

The author has developed several general purpose optimization programs over the past twenty years. The earlier programs were developed as research codes and served that purpose reasonably well. However, in taking the formal step from research to industrial application programs, several important lessons have been learned. Among these are the importance of clear documentation, immediate user support, and consistent maintenance. Most important has been the issue of providing software that gives a good, or at least acceptable, design at minimum computational cost. Here, the basic issues in developing optimization software for industrial applications are outlined, and issues of convergence rate, reliability, and relative minima are discussed. Considerable feedback has been received from users, and new software is being developed to respond to identified needs. The basic capabilities of this software are outlined. A major motivation for the development of commercial grade software is ease of use and flexibility, and these issues are discussed with reference to general multidisciplinary applications. It is concluded that design productivity can be significantly enhanced by the more widespread use of optimization as an everyday design tool.
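The relative-minima problem noted above is easy to demonstrate: a local optimizer started from different points can settle into different designs, which is one reason industrial codes emphasize reliability. A toy sketch (the objective function and step size are invented for illustration):

```python
def minimize_1d(f, x0, step=0.01, iters=2000):
    """Naive gradient descent with a finite-difference gradient."""
    x, h = x0, 1e-6
    for _ in range(iters):
        grad = (f(x + h) - f(x - h)) / (2 * h)
        x -= step * grad
    return x

# A toy objective with two minima near x ~ -1.04 and x ~ +0.96;
# the left minimum is the deeper (better) one.
f = lambda x: x**4 - 2 * x**2 + 0.3 * x

for x0 in (-2.0, 2.0):
    x = minimize_1d(f, x0)
    print(f"start {x0:+.1f} -> minimum at x = {x:+.3f}, f(x) = {f(x):.3f}")
```

Starting on the right, the optimizer converges to the shallower relative minimum and never sees the better design; multistart or global strategies are the usual remedies.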

  7. Software Performs Complex Design Analysis

    NASA Technical Reports Server (NTRS)

    2008-01-01

Designers use computational fluid dynamics (CFD) to gain greater understanding of the fluid flow phenomena involved in components being designed. They also use finite element analysis (FEA) as a tool to help gain greater understanding of the structural response of components to loads, stresses and strains, and the prediction of failure modes. Automated CFD and FEA engineering design has centered on shape optimization, which has been hindered by two major problems: 1) inadequate shape parameterization algorithms, and 2) inadequate algorithms for CFD and FEA grid modification. Working with software engineers at Stennis Space Center, a NASA commercial partner, Optimal Solutions Software LLC, was able to utilize its revolutionary, one-of-a-kind arbitrary shape deformation (ASD) capability, a major advancement in solving these two problems, to optimize the shapes of complex pipe components that transport highly sensitive fluids. The ASD technology solves the problem of inadequate shape parameterization algorithms by allowing the CFD designers to freely create their own shape parameters, therefore eliminating the restriction of only being able to use the computer-aided design (CAD) parameters. The problem of inadequate algorithms for CFD grid modification is solved by the fact that the new software performs a smooth volumetric deformation. This eliminates the extremely costly process of having to remesh the grid for every shape change desired. The program can perform a design change in a markedly reduced amount of time, a process that would traditionally involve the designer returning to the CAD model to reshape and then remesh the shapes, something that has been known to take hours, days, even weeks or months, depending upon the size of the model.

  8. Conceptualization and application of an approach for designing healthcare software interfaces.

    PubMed

    Kumar, Ajit; Maskara, Reena; Maskara, Sanjeev; Chiang, I-Jen

    2014-06-01

The aim of this study is to conceptualize a novel approach that facilitates the design of prototype interfaces for healthcare software. Concepts and techniques from various disciplines were used to conceptualize an interface design approach named MORTARS (Map Original Rhetorical To Adapted Rhetorical Situation). The concepts and techniques included in this approach are: (1) rhetorical situation, a concept from philosophy provided by Bitzer (1968); (2) move analysis, an applied-linguistic technique provided by Swales (1990) and Bhatia (1993); (3) interface design guidelines, a cognitive and computer science concept provided by Johnson (2010); (4) a usability evaluation instrument, an interface evaluation questionnaire provided by Lund (2001); (5) user modeling via stereotyping, a cognitive and computer science concept provided by Rich (1979). A prototype interface for outpatient clinic software was designed to introduce the underlying concepts of MORTARS. The prototype interface was evaluated by thirty-two medical informaticians, who found it to be useful (73.3%), easy to use (71.9%), easy to learn (93.1%), and satisfactory (53.2%). The MORTARS approach was found to be effective in designing the prototype user interface for the outpatient clinic software. This approach might be further used to design interfaces for various software pertaining to healthcare and other domains. Copyright © 2014 Elsevier Inc. All rights reserved.

  9. Automating software design system DESTA

    NASA Technical Reports Server (NTRS)

    Lovitsky, Vladimir A.; Pearce, Patricia D.

    1992-01-01

'DESTA' is the acronym for the Dialogue Evolutionary Synthesizer of Turnkey Algorithms by means of a natural language (Russian or English) functional specification of algorithms or software being developed. DESTA represents a computer-aided and/or automatic artificial intelligence 'forgiving' system which provides users with software tool support for algorithm and/or structured program development. The DESTA system is intended to provide support for the higher levels and earlier stages of engineering design of software, in contrast to conventional Computer Aided Design (CAD) systems, which provide low-level tools for use at a stage when the major planning and structuring decisions have already been taken. DESTA is a knowledge-intensive system. The main features of the knowledge are procedures, functions, modules, operating system commands, batch files, their natural language specifications, and their interlinks. The specific domain for the DESTA system is a high-level programming language like Turbo Pascal 6.0. The DESTA system is operational and runs on an IBM PC computer.

  10. The GEANT4 toolkit capability in the hadron therapy field: simulation of a transport beam line

    NASA Astrophysics Data System (ADS)

    Cirrone, G. A. P.; Cuttone, G.; Di Rosa, F.; Raffaele, L.; Russo, G.; Guatelli, S.; Pia, M. G.

    2006-01-01

At the Laboratori Nazionali del Sud of the Istituto Nazionale di Fisica Nucleare in Catania (Sicily, Italy), the first Italian hadron therapy facility, named CATANA (Centro di AdroTerapia ed Applicazioni Nucleari Avanzate), has been realized. Inside CATANA, 62 MeV proton beams, accelerated by a superconducting cyclotron, are used for the radiotherapeutic treatment of some types of ocular tumours. Therapy with hadron beams still represents a pioneering technique, and only a few centers worldwide can provide this advanced specialized cancer treatment. On the basis of the experience gained so far, and considering the future hadron-therapy facilities to be developed (Rinecker, Munich, Germany; Heidelberg/GSI, Darmstadt, Germany; PSI, Villigen, Switzerland; CNAO, Pavia, Italy; Centro di Adroterapia, Catania, Italy), we decided to develop a Monte Carlo application based on the GEANT4 toolkit for the design, realization and optimization of a proton-therapy beam line. Another aim of our project is to provide a general tool able to study the interactions of hadrons with human tissue and to test the analytical treatment planning systems currently used in routine practice. All the typical elements of a hadron-therapy line, such as diffusers, range shifters, collimators and detectors, were modelled. In particular, we simulated a Markus-type ionization chamber and a GafChromic film as dosimeters to reconstruct the depth (Bragg peak and spread-out Bragg peak) and lateral dose distributions, respectively. We validated our simulated detectors by comparing the results with the experimental data available at our facility.

  11. PD5: a general purpose library for primer design software.

    PubMed

    Riley, Michael C; Aubrey, Wayne; Young, Michael; Clare, Amanda

    2013-01-01

Complex PCR applications for large genome-scale projects require fast, reliable and often highly sophisticated primer design software applications. Presently, such applications use pipelining methods to utilise many third party applications and this involves file parsing, interfacing and data conversion, which is slow and prone to error. A fully integrated suite of software tools for primer design would considerably improve the development time, the processing speed, and the reliability of bespoke primer design software applications. The PD5 software library is an open-source collection of classes and utilities, providing a complete collection of software building blocks for primer design and analysis. It is written in object-oriented C++ with an emphasis on classes suitable for efficient and rapid development of bespoke primer design programs. The modular design of the software library simplifies the development of specific applications and also integration with existing third party software where necessary. We demonstrate several applications created using this software library that have already proved to be effective, but we view the project as a dynamic environment for building primer design software and it is open for future development by the bioinformatics community. Therefore, the PD5 software library is published under the terms of the GNU General Public License, which guarantee access to source-code and allow redistribution and modification. The PD5 software library is downloadable from Google Code and the accompanying Wiki includes instructions and examples: http://code.google.com/p/primer-design.
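Libraries like PD5 package exactly this kind of sequence-level calculation. As a hedged illustration, the snippet below computes two textbook primer metrics, the Wallace-rule melting temperature and GC content; these are standard approximations, not necessarily the algorithms PD5 itself implements:

```python
def wallace_tm(primer):
    """Approximate melting temperature via the Wallace rule:
    Tm = 2*(A+T) + 4*(G+C), a rule of thumb for short oligos (~14-20 nt)."""
    primer = primer.upper()
    at = primer.count("A") + primer.count("T")
    gc = primer.count("G") + primer.count("C")
    return 2 * at + 4 * gc

def gc_content(primer):
    """GC fraction; primer design typically targets roughly 0.4-0.6."""
    primer = primer.upper()
    return (primer.count("G") + primer.count("C")) / len(primer)

primer = "ATGCGTACGTTAGCAC"  # hypothetical 16-nt primer for illustration
print(wallace_tm(primer), round(gc_content(primer), 2))
```

An integrated library keeps such calculations behind one API instead of shelling out to separate tools and parsing their output, which is the pipelining problem the abstract describes.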

  12. Shaping electromagnetic waves using software-automatically-designed metasurfaces.

    PubMed

    Zhang, Qian; Wan, Xiang; Liu, Shuo; Yuan Yin, Jia; Zhang, Lei; Jun Cui, Tie

    2017-06-15

We present a fully digital procedure of designing reflective coding metasurfaces to shape reflected electromagnetic waves. The design procedure is completely automatic, controlled by a personal computer. In detail, the macro coding units of the metasurface are automatically divided into several types (e.g. two types for 1-bit coding, four types for 2-bit coding, etc.), and each type of the macro coding units is formed by discretely random arrangement of micro coding units. By combining an optimization algorithm and commercial electromagnetic software, the digital patterns of the macro coding units are optimized to possess constant phase difference for the reflected waves. The apertures of the designed reflective metasurfaces are formed by arranging the macro coding units with a certain coding sequence. To experimentally verify the performance, a coding metasurface is fabricated by automatically designing two digital 1-bit unit cells, which are arranged in array to constitute a periodic coding metasurface to generate the required four-beam radiations with specific directions. Two complicated functional metasurfaces with circularly- and elliptically-shaped radiation beams are realized by automatically designing 4-bit macro coding units, showing excellent performance of the automatic designs by software. The proposed method provides a smart tool to realize various functional devices and systems automatically.

  13. Guidance and Navigation Software Architecture Design for the Autonomous Multi-Agent Physically Interacting Spacecraft (AMPHIS) Test Bed

    DTIC Science & Technology

    2006-12-01

This thesis by Blake D. Eikenberry presents the guidance and navigation software architecture design for the Autonomous Multi-Agent Physically Interacting Spacecraft (AMPHIS) test bed. Approved for public release; distribution is unlimited.

  14. Software design and documentation language

    NASA Technical Reports Server (NTRS)

    Kleine, H.

    1977-01-01

    A communications medium to support the design and documentation of complex software applications is studied. The medium provides the following: (1) a processor which can convert design specifications into an intelligible, informative, machine-reproducible document; (2) a design and documentation language with forms and syntax that are simple, unrestrictive, and communicative; and (3) a methodology for effective use of the language and processor.

  15. Feasibility of using Geant4 Monte Carlo simulation for IMRT dose calculations for the Novalis Tx with a HD-120 multi-leaf collimator

    NASA Astrophysics Data System (ADS)

    Jung, Hyunuk; Shin, Jungsuk; Chung, Kwangzoo; Han, Youngyih; Kim, Jinsung; Choi, Doo Ho

    2015-05-01

    The aim of this study was to develop an independent dose-verification system using a Monte Carlo (MC) calculation method for intensity-modulated radiation therapy (IMRT) delivered with a Varian Novalis Tx (Varian Medical Systems, Palo Alto, CA, USA) equipped with a high-definition multi-leaf collimator (HD-120 MLC). The Geant4 framework was used to implement a dose calculation system that accurately predicted the delivered dose. For this purpose, the Novalis Tx Linac head was modeled according to specifications acquired from the manufacturer. MC simulations were then performed by varying the mean energy, energy spread, and electron spot radius to determine optimum values for irradiation with 6-MV X-ray beams from the Novalis Tx system. Computed percentage depth dose curves (PDDs) and lateral profiles were compared to measurements obtained with an ionization chamber (CC13). To validate IMRT simulation with the MC model we developed, we calculated a simple IMRT field and compared the result with EBT3 film measurements in a water-equivalent solid phantom. Clinical cases, such as prostate cancer treatment plans, were then selected, and MC simulations were performed. The accuracy of the simulation was assessed against the EBT3 film measurements by using a gamma-index criterion. The optimal MC model parameters to specify the beam characteristics were a 6.8-MeV mean energy, a 0.5-MeV energy spread, and a 3-mm electron radius. The accuracy of these parameters was determined by comparison of the MC simulations with measurements. The PDDs and the lateral profiles of the MC simulation deviated from the measurements by 1% and 2%, respectively, on average. The computed simple MLC fields agreed with the EBT3 measurements with a 95% passing rate under a 3%/3-mm gamma-index criterion. Additionally, in applying our model to clinical IMRT plans, we found that the MC calculations and the EBT3 measurements agreed well with a passing rate of greater
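
    The 3%/3-mm gamma-index comparison used in this abstract can be sketched in one dimension. This is an illustrative simplification (a global-normalization, profile-only calculation; function name and tolerances are our own), not the 2-D film analysis the study performed:

    ```python
    import numpy as np

    def gamma_pass_rate(ref, evl, dx_mm, dose_tol=0.03, dta_mm=3.0):
        """1-D global gamma index: for each reference point, take the minimum
        combined dose-difference / distance-to-agreement metric over the
        evaluated profile; a point passes when gamma <= 1."""
        norm = ref.max()
        pos = np.arange(len(ref)) * dx_mm
        gammas = np.empty(len(ref))
        for i, d in enumerate(ref):
            dd = (evl - d) / (dose_tol * norm)   # dose-difference term
            dta = (pos - pos[i]) / dta_mm        # distance-to-agreement term
            gammas[i] = np.sqrt(dd**2 + dta**2).min()
        return 100.0 * (gammas <= 1.0).mean()
    ```

    Identical profiles give a 100% pass rate; scaling the evaluated profile by half fails at the profile peak.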

  16. Computer-aided software development process design

    NASA Technical Reports Server (NTRS)

    Lin, Chi Y.; Levary, Reuven R.

    1989-01-01

    The authors describe an intelligent tool designed to aid managers of software development projects in planning, managing, and controlling the development process of medium- to large-scale software projects. Its purpose is to reduce uncertainties in the budget, personnel, and schedule planning of software development projects. It is based on a dynamic model of the software development and maintenance life-cycle process. This dynamic process is composed of a number of time-varying, interacting developmental phases, each characterized by its intended functions and requirements. System dynamics is used as the modeling methodology. The resulting Software LIfe-Cycle Simulator (SLICS) and the hybrid expert simulation system of which it is a subsystem are described.
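
    The system-dynamics idea behind such simulators (phases modeled as stocks drained by time-varying rates) can be sketched with a toy Euler-integration loop. The names and rates here are hypothetical illustrations, not SLICS internals:

    ```python
    def completion_time(tasks, staff, productivity, dt=1.0, horizon=1000.0):
        """Toy system-dynamics loop: a task backlog (a 'stock') is drained at
        a flow rate of staff * productivity tasks per day until the phase
        completes or the simulation horizon is reached."""
        backlog, t = float(tasks), 0.0
        while backlog > 0.0 and t < horizon:
            backlog -= staff * productivity * dt
            t += dt
        return t
    ```

    Real models of this kind add feedback loops (e.g., schedule pressure changing productivity), which is what makes the phases time-varying and interacting.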

  17. Modeling VLF signal modulation during solar flares with GEANT4 Monte Carlo simulation, a simple chemical model and LWPC

    NASA Astrophysics Data System (ADS)

    Palit, Sourav; Chakrabarti, Sandip Kumar; Pal, Sujay; Basak, Tamal

    Extra ionization by X-rays during solar flares affects VLF signal propagation through the D-region ionosphere. The ionization produced in the lower ionosphere by the X-ray spectra of solar flares is simulated with an efficient detector simulation program, GEANT4. The balance between ionization and loss processes, which causes the lower ionosphere to settle back to its undisturbed state, is handled with a simple chemical model consisting of four broad species of ion densities. Using the resulting electron densities, the modified VLF signal amplitude is then computed with the LWPC code. The VLF signal along the NWC (Australia) to IERC/ICSP (India) propagation path is examined during an M-class and an X-class solar flare, and the observed deviations are compared with the simulated results. The agreement is found to be excellent.
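
    The balance between ionization and loss can be illustrated with the textbook one-species D-region approximation, where production q is balanced against effective recombination. This is a deliberate simplification of the four-species scheme the paper uses:

    ```python
    import math

    def equilibrium_electron_density(q, alpha_eff):
        """Quasi-equilibrium electron density N_e from the balance
        dN_e/dt = q - alpha_eff * N_e**2 = 0, where q is the ion-pair
        production rate (per unit volume and time) and alpha_eff is an
        effective recombination coefficient. A one-species simplification
        of multi-species D-region chemistry."""
        return math.sqrt(q / alpha_eff)
    ```

    During a flare, the X-ray-driven increase in q raises the equilibrium electron density, which is what modifies the VLF amplitude along the path.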

  18. Automating the design of scientific computing software

    NASA Technical Reports Server (NTRS)

    Kant, Elaine

    1992-01-01

    SINAPSE is a domain-specific software design system that generates code from specifications of equations and algorithm methods. This paper describes the system's design techniques (planning in a space of knowledge-based refinement and optimization rules), user interaction style (user has option to control decision making), and representation of knowledge (rules and objects). It also summarizes how the system knowledge has evolved over time and suggests some issues in building software design systems to facilitate reuse.

  19. Software requirements flow-down and preliminary software design for the G-CLEF spectrograph

    NASA Astrophysics Data System (ADS)

    Evans, Ian N.; Budynkiewicz, Jamie A.; DePonte Evans, Janet; Miller, Joseph B.; Onyuksel, Cem; Paxson, Charles; Plummer, David A.

    2016-08-01

    The Giant Magellan Telescope (GMT)-Consortium Large Earth Finder (G-CLEF) is a fiber-fed, precision radial velocity (PRV) optical echelle spectrograph that will be the first light instrument on the GMT. The G-CLEF instrument device control subsystem (IDCS) provides software control of the instrument hardware, including the active feedback loops that are required to meet the G-CLEF PRV stability requirements. The IDCS is also tasked with providing operational support packages that include data reduction pipelines and proposal preparation tools. A formal but ultimately pragmatic approach is being used to establish a complete and correct set of requirements for both the G-CLEF device control and operational support packages. The device control packages must integrate tightly with the state-machine driven software and controls reference architecture designed by the GMT Organization. A model-based systems engineering methodology is being used to develop a preliminary design that meets these requirements. Through this process we have identified some lessons that have general applicability to the development of software for ground-based instrumentation. For example, tasking an individual with overall responsibility for science/software/hardware integration is a key step to ensuring effective integration between these elements. An operational concept document that includes detailed routine and non-routine operational sequences should be prepared in parallel with the hardware design process to tie together these elements and identify any gaps. Appropriate time-phasing of the hardware and software design phases is important, but revisions to driving requirements that impact software requirements and preliminary design are inevitable. Such revisions must be carefully managed to ensure efficient use of resources.

  20. Ethics in computer software design and development

    Treesearch

    Alan J. Thomson; Daniel L. Schmoldt

    2001-01-01

    Over the past 20 years, computer software has become integral and commonplace for operational and management tasks throughout agricultural and natural resource disciplines. During this software infusion, however, little thought has been afforded human impacts, both good and bad. This paper examines current ethical issues of software system design and development in...

  1. General software design for multisensor data fusion

    NASA Astrophysics Data System (ADS)

    Zhang, Junliang; Zhao, Yuming

    1999-03-01

    In this paper a general method of software design for multisensor data fusion is discussed in detail, adopting object-oriented technology under the UNIX operating system. The software for multisensor data fusion is divided into six functional modules: data collection, database management, GIS, target display and alarming, data simulation, etc. Furthermore, the primary function, the components, and some realization methods of each module are given, and the interfaces among these functional modules are discussed. Data exchange among the functional modules is performed by interprocess communication (IPC), including message queues, semaphores, and shared memory. Thus, each functional module executes independently, which reduces the dependence among functional modules and helps software programming and testing. The software for multisensor data fusion is designed as a hierarchical structure using the inheritance property of classes. Each functional module is abstracted and encapsulated in a class structure, which avoids software redundancy and enhances readability.
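
    The queue-based decoupling of modules described above can be sketched with Python's multiprocessing queues standing in for the UNIX System V message queues, semaphores, and shared memory named in the abstract. The module and target names are hypothetical:

    ```python
    from multiprocessing import Process, Queue

    def collector(q, readings):
        # "data collection" module: publish (target_id, measurement) pairs
        for r in readings:
            q.put(r)
        q.put(None)  # sentinel marks end of stream

    def fusion(q, out):
        # "data fusion" module: average all measurements per target
        acc = {}
        while (item := q.get()) is not None:
            target, value = item
            acc.setdefault(target, []).append(value)
        out.put({t: sum(v) / len(v) for t, v in acc.items()})

    if __name__ == "__main__":
        q, out = Queue(), Queue()
        readings = [("t1", 10.0), ("t1", 12.0), ("t2", 5.0)]
        procs = [Process(target=collector, args=(q, readings)),
                 Process(target=fusion, args=(q, out))]
        for p in procs:
            p.start()
        print(out.get())
        for p in procs:
            p.join()
    ```

    Because the two modules share only the queue, either can be replaced or tested in isolation, which is the dependence-reduction benefit the paper claims for IPC.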

  2. An NAFP Project: Use of Object Oriented Methodologies and Design Patterns to Refactor Software Design

    NASA Technical Reports Server (NTRS)

    Shaykhian, Gholam Ali; Baggs, Rhoda

    2007-01-01

    In the early problem-solution era of software programming, functional decompositions were mainly used to design and implement software solutions. In functional decompositions, functions and data are introduced as two separate entities during the design phase, and are followed as such in the implementation phase. Functional decompositions make use of refactoring through optimizing the algorithms, grouping similar functionalities into common reusable functions, and using abstract representations of data where possible; all these are done during the implementation phase. This paper advocates the usage of object-oriented methodologies and design patterns as the centerpieces of refactoring software solutions. Refactoring software is a method of changing software design while explicitly preserving its external functionalities. The combined usage of object-oriented methodologies and design patterns to refactor should also benefit the overall software life cycle cost with improved software.

  3. Software Tools for Battery Design | Transportation Research | NREL

    Science.gov Websites

    These software tools help battery designers, developers, and manufacturers create affordable, high-performance lithium-ion (Li-ion) batteries for next-generation electric-drive vehicles (EDVs). The page includes an image of a simulation of a battery pack.

  4. [Research progress of probe design software of oligonucleotide microarrays].

    PubMed

    Chen, Xi; Wu, Zaoquan; Liu, Zhengchun

    2014-02-01

    DNA microarrays have become an essential medical genetic diagnostic tool owing to their high throughput, miniaturization, and automation. The design and selection of oligonucleotide probes are critical for preparing gene chips of high quality. Several probe design software packages have been developed and are now available to perform this work. Each package targets different sequences and has different advantages and limitations. In this article, the research and development of these packages are reviewed against three main criteria: specificity, sensitivity, and melting temperature (Tm). In addition, based on experimental results from the literature, the packages are classified according to their applications. This review will help users choose appropriate probe-design software, reduce the costs of microarrays, improve the application efficiency of microarrays, and promote both the research and development (R&D) and the commercialization of high-performance probe design software.
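
    Of the three review criteria, melting temperature is the easiest to make concrete. The classical Wallace rule is one simple Tm estimate; it is shown here only as an illustration, since real probe-design software typically uses nearest-neighbor thermodynamic models instead:

    ```python
    def wallace_tm(seq):
        """Wallace rule: Tm (deg C) ~= 2*(A+T) + 4*(G+C).
        A rough estimate valid only for short oligos (roughly 14 nt or
        fewer); longer probes need nearest-neighbor models."""
        s = seq.upper()
        return 2 * (s.count("A") + s.count("T")) + 4 * (s.count("G") + s.count("C"))
    ```

    For example, the 4-mer ATGC scores 2*2 + 4*2 = 12 deg C, which also shows why GC-rich probes melt higher than AT-rich ones of the same length.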

  5. Software design and documentation language, revision 1

    NASA Technical Reports Server (NTRS)

    Kleine, H.

    1979-01-01

    The Software Design and Documentation Language (SDDL) developed to provide an effective communications medium to support the design and documentation of complex software applications is described. Features of the system include: (1) a processor which can convert design specifications into an intelligible, informative machine-reproducible document; (2) a design and documentation language with forms and syntax that are simple, unrestrictive, and communicative; and (3) methodology for effective use of the language and processor. The SDDL processor is written in the SIMSCRIPT II programming language and is implemented on the UNIVAC 1108, the IBM 360/370, and Control Data machines.

  6. Software support environment design knowledge capture

    NASA Technical Reports Server (NTRS)

    Dollman, Tom

    1990-01-01

    The objective of this task is to assess the potential for using the software support environment (SSE) workstations and associated software for design knowledge capture (DKC) tasks. This assessment will include the identification of required capabilities for DKC and hardware/software modifications needed to support DKC. Several approaches to achieving this objective are discussed and interim results are provided: (1) research into the problem of knowledge engineering in a traditional computer-aided software engineering (CASE) environment, like the SSE; (2) research into the problem of applying SSE CASE tools to develop knowledge based systems; and (3) direct utilization of SSE workstations to support a DKC activity.

  7. Automatic extraction and visualization of object-oriented software design metrics

    NASA Astrophysics Data System (ADS)

    Lakshminarayana, Anuradha; Newman, Timothy S.; Li, Wei; Talburt, John

    2000-02-01

    Software visualization is a graphical representation of software characteristics and behavior. Certain modes of software visualization can be useful in isolating problems and identifying unanticipated behavior. In this paper we present a new approach to aid understanding of object-oriented software through 3D visualization of metrics that can be extracted from the design phase of software development. The focus of the paper is a metric extraction method and a new collection of glyphs for multi-dimensional metric visualization. Our approach utilizes the extensibility interface of a popular CASE tool to access and automatically extract the metrics from Unified Modeling Language class diagrams. Following extraction of the design metrics, a 3D visualization of these metrics is generated for each class in the design, utilizing intuitively meaningful 3D glyphs that represent the ensemble of metrics. Extraction and visualization of design metrics can aid software developers in the early study and understanding of design complexity.

  8. Software For Design Of Life-Support Systems

    NASA Technical Reports Server (NTRS)

    Rudokas, Mary R.; Cantwell, Elizabeth R.; Robinson, Peter I.; Shenk, Timothy W.

    1991-01-01

    Design Assistant Workstation (DAWN) computer program is prototype of expert software system for analysis and design of regenerative, physical/chemical life-support systems that revitalize air, reclaim water, produce food, and treat waste. Incorporates both conventional software for quantitative mathematical modeling of physical, chemical, and biological processes and expert system offering user stored knowledge about materials and processes. Constructs task tree as it leads user through simulated process, offers alternatives, and indicates where alternative not feasible. Also enables user to jump from one design level to another.

  9. CRISP90 - SOFTWARE DESIGN ANALYZER SYSTEM

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1994-01-01

    The CRISP90 Software Design Analyzer System, an update of CRISP-80, is a set of programs forming a software design and documentation tool which supports top-down, hierarchic, modular, structured design and programming methodologies. The quality of a computer program can often be significantly influenced by the design medium in which the program is developed. The medium must foster the expression of the programmer's ideas easily and quickly, and it must permit flexible and facile alterations, additions, and deletions to these ideas as the design evolves. The CRISP90 software design analyzer system was developed to provide the PDL (Programmer Design Language) programmer with such a design medium. A program design using CRISP90 consists of short, English-like textual descriptions of data, interfaces, and procedures that are imbedded in a simple, structured, modular syntax. The display is formatted into two-dimensional, flowchart-like segments for a graphic presentation of the design. Together with a good interactive full-screen editor or word processor, the CRISP90 design analyzer becomes a powerful tool for the programmer. In addition to being a text formatter, the CRISP90 system prepares material that would be tedious and error prone to extract manually, such as a table of contents, module directory, structure (tier) chart, cross-references, and a statistics report on the characteristics of the design. Referenced modules are marked by schematic logic symbols to show conditional, iterative, and/or concurrent invocation in the program. A keyword usage profile can be generated automatically and glossary definitions inserted into the output documentation. Another feature is the capability to detect changes that were made between versions. Thus, "change-bars" can be placed in the output document along with a list of changed pages and a version history report. Also, items may be marked as "to be determined" and each will appear on a special table until the item is

  10. An XML-based method for astronomy software designing

    NASA Astrophysics Data System (ADS)

    Liao, Mingxue; Aili, Yusupu; Zhang, Jin

    An XML-based method for the standardization of software design is introduced, analyzed, and successfully applied to renovating the hardware and software of the digital clock at Urumqi Astronomical Station. The basic strategy for eliciting time information from the new digital clock, FT206, in the antenna control program is introduced. With FT206, there is no need to compute with sophisticated formulas how many centuries have passed since a certain day, and it is no longer necessary to set the correct UT time on the computer controlling the antenna, because the year, month, and day are all deduced from the Julian day maintained in FT206 rather than from the computer time. With an XML-based method and standard for software design, various existing design methods are unified and communication and collaboration between developers are facilitated, making an Internet-based mode of software development possible. The trend of development of the XML-based design method is predicted.
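
    Deriving year, month, and day from a stored Julian day, as the FT206 approach does, can be illustrated with the classic Fliegel and Van Flandern (1968) integer algorithm. This is the standard published algorithm, not FT206's actual firmware:

    ```python
    def jd_to_gregorian(jd):
        """Convert a Julian Day Number (integer, at noon) to a Gregorian
        (year, month, day) tuple using the Fliegel-Van Flandern algorithm.
        Uses only integer arithmetic, so no century bookkeeping is needed."""
        l = jd + 68569
        n = 4 * l // 146097
        l = l - (146097 * n + 3) // 4
        i = 4000 * (l + 1) // 1461001
        l = l - 1461 * i // 4 + 31
        j = 80 * l // 2447
        day = l - 2447 * j // 80
        l = j // 11
        month = j + 2 - 12 * l
        year = 100 * (n - 49) + i + l
        return year, month, day
    ```

    For instance, JDN 2451545 maps to 2000-01-01, so a device holding only the Julian day can always report the calendar date directly.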

  11. Calculation of DNA strand breaks due to direct and indirect effects of Auger electrons from incorporated 123I and 125I radionuclides using the Geant4 computer code.

    PubMed

    Raisali, Gholamreza; Mirzakhanian, Lalageh; Masoudi, Seyed Farhad; Semsarha, Farid

    2013-01-01

    In this work the numbers of DNA single-strand breaks (SSBs) and double-strand breaks (DSBs) due to direct and indirect effects of Auger electrons from incorporated (123)I and (125)I have been calculated using the Geant4-DNA toolkit. We performed and compared calculations for several cases ((125)I versus (123)I, different source positions, and direct versus indirect breaks) to study the capability of Geant4-DNA in calculating DNA damage yields. Two different simple geometries of a 41-base-pair segment of B-DNA were simulated. The (123)I was located in (123)IdUrd, and three different locations were considered for (125)I. The results showed that the simpler geometry is sufficient for direct-break calculations, while the indirect damage yield is more sensitive to the helical shape of DNA. For (123)I Auger electrons, the average number of DSBs due to direct hits is almost twice that due to indirect hits. Furthermore, a comparison of the average numbers of SSBs and DSBs caused by the Auger electrons of (125)I and (123)I in (125)IdUrd and (123)IdUrd shows that (125)I is 1.5 times more effective than (123)I per decay. The results are in reasonable agreement with previous experimental and theoretical results, which demonstrates the applicability of the Geant4-DNA toolkit to nanodosimetry calculations; the toolkit benefits from open-source accessibility, and the DNA models used in this work save computational time.
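
    Counting SSBs versus DSBs from simulated energy depositions usually follows a proximity convention: two breaks on opposite strands within about 10 base pairs count as one DSB. The sketch below is a greedy illustration of that convention, not the scoring code used in the paper:

    ```python
    def classify_breaks(breaks, dsb_window=10):
        """breaks: list of (strand, base_pair_index) with strand in {1, 2}.
        Pair each strand-1 break with the first strand-2 break within
        dsb_window bp to form a DSB; unpaired breaks remain SSBs.
        Returns (n_ssb, n_dsb)."""
        s1 = sorted(bp for s, bp in breaks if s == 1)
        s2 = sorted(bp for s, bp in breaks if s == 2)
        dsb = 0
        for bp in list(s1):
            match = next((b for b in s2 if abs(b - bp) <= dsb_window), None)
            if match is not None:
                dsb += 1
                s1.remove(bp)
                s2.remove(match)
        return len(s1) + len(s2), dsb
    ```

    With breaks at strand 1, bp 5 and strand 2, bp 8, the pair lies within the window and scores as one DSB; an isolated break at bp 50 stays an SSB.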

  12. Using mathematical software to design power electronic converters

    NASA Astrophysics Data System (ADS)

    Hinov, Nikolay; Hranov, Tsveti

    2017-12-01

    This paper presents mathematical software that was used for the design of power electronic devices. Two different examples of designing electronic converters are examined. In this way, it is possible to try different combinations of the circuit elements by simple means, optimizing according to certain criteria and constraints. Free software with a simple and intuitive interface was selected; no special user training is required to work with it. The use of mathematical software greatly facilitates the design process and makes power electronics training attractive and accessible to a wider range of students and specialists.

  13. Design Features of Pedagogically-Sound Software in Mathematics.

    ERIC Educational Resources Information Center

    Haase, Howard; And Others

    Weaknesses in educational software currently available in the domain of mathematics are discussed. A technique that was used for the design and production of mathematics software aimed at improving problem-solving skills which combines sound pedagogy and innovative programming is presented. To illustrate the design portion of this technique, a…

  14. BC404 scintillators as gamma locators studied via Geant4 simulations

    NASA Astrophysics Data System (ADS)

    Cortés, M. L.; Hoischen, R.; Eisenhauer, K.; Gerl, J.; Pietralla, N.

    2014-05-01

    In many applications in industry and academia, an accurate determination of the direction from where gamma rays are emitted is either needed or desirable. Ion-beam therapy treatments, the search for orphan sources, and homeland security applications are examples of fields that can benefit from directional sensitivity to gamma-radiation. Scintillation detectors are a good option for these types of applications as they have relatively low cost, are easy to handle and can be produced in a large range of different sizes. In this work a Geant4 simulation was developed to study the directional sensitivity of different BC404 scintillator geometries and arrangements. The simulation includes all the physical processes relevant for gamma detection in a scintillator. In particular, the creation and propagation of optical photons inside the scintillator was included. A simplified photomultiplier tube model was also simulated. The physical principle exploited is the angular dependence of the shape of the energy spectrum obtained from thin scintillator layers when irradiated from different angles. After an experimental confirmation of the working principle of the device and a check of the simulation, the possibilities and limitations of directional sensitivity to gamma radiation using scintillator layers were tested. For this purpose, point-like sources of typical energies expected in ion-beam therapy were used. Optimal scintillator thicknesses for different energies were determined and the setup efficiencies calculated. The use of arrays of scintillators to reconstruct the direction of incoming gamma rays was also studied. For this case, a spherical source emitting Bremsstrahlung radiation was used together with a setup consisting of scintillator layers. The capability of this setup to identify the center of the extended source was studied together with its angular resolution.

  15. Design study of Software-Implemented Fault-Tolerance (SIFT) computer

    NASA Technical Reports Server (NTRS)

    Wensley, J. H.; Goldberg, J.; Green, M. W.; Kutz, W. H.; Levitt, K. N.; Mills, M. E.; Shostak, R. E.; Whiting-Okeefe, P. M.; Zeidler, H. M.

    1982-01-01

    Software-implemented fault tolerant (SIFT) computer design for commercial aviation is reported. A SIFT design concept is addressed. Alternate strategies for physical implementation are considered. Hardware and software design correctness is addressed. System modeling and effectiveness evaluation are considered from a fault-tolerant point of view.

  16. Integrating Model-Based Verification into Software Design Education

    ERIC Educational Resources Information Center

    Yilmaz, Levent; Wang, Shuo

    2005-01-01

    Proper design analysis is indispensable to assure quality and reduce emergent costs due to faulty software. Teaching proper design verification skills early during pedagogical development is crucial, as such analysis is the only tractable way of resolving software problems early when they are easy to fix. The premise of the presented strategy is…

  17. ModSAF Software Architecture Design and Overview Document

    DTIC Science & Technology

    1993-12-20

    ModSAF Software Architecture Design and Overview Document, Ver 1.0, 20 December 1993 (AD-A282 740). Advanced Distributed Simulation Technology program. Contract N61339-91-D-O00, Delivery Order (0021), ModSAF (CDRL A004).

  18. Reflecting Indigenous Culture in Educational Software Design.

    ERIC Educational Resources Information Center

    Fleer, Marilyn

    1989-01-01

    Discusses research on Australian Aboriginal cognition which relates to the development of appropriate educational software. Describes "Tinja," a software program using familiar content and experiences, Aboriginal characters and cultural values, extensive graphics and animation, peer and group work, and open-ended design to help young…

  19. The Software Design Document: More than a User's Manual.

    ERIC Educational Resources Information Center

    Bowers, Dennis

    1989-01-01

    Discusses the value of creating design documentation for computer software so that it may serve as a model for similar design efforts. Components of the software design document are described, including program flowcharts, graphic representation of screen displays, storyboards, and evaluation procedures. An example is given using HyperCard. (three…

  20. Software design for automated assembly of truss structures

    NASA Technical Reports Server (NTRS)

    Herstrom, Catherine L.; Grantham, Carolyn; Allen, Cheryl L.; Doggett, William R.; Will, Ralph W.

    1992-01-01

    Concern over the limited intravehicular activity time has increased the interest in performing in-space assembly and construction operations with automated robotic systems. A technique being considered at LaRC is a supervised-autonomy approach, which can be monitored by an Earth-based supervisor that intervenes only when the automated system encounters a problem. A test-bed to support evaluation of the hardware and software requirements for supervised-autonomy assembly methods was developed. This report describes the design of the software system necessary to support the assembly process. The software is hierarchical and supports both automated assembly operations and supervisor error-recovery procedures, including the capability to pause and reverse any operation. The software design serves as a model for the development of software for more sophisticated automated systems and as a test-bed for evaluation of new concepts and hardware components.

  1. Analytical Design of Evolvable Software for High-Assurance Computing

    DTIC Science & Technology

    2001-02-14

    Mathematical expression for the Total Sum of Squares, which measures the variability that results when all values are treated as a combined sample coming from... readers primarily interested in background on software design and high-assurance computing, or in research in software architecture generation or evaluation... respectively. Those readers solely interested in the validation of a software design approach should at the minimum read Chapter 6 followed by Chapter

  2. Building quality into medical product software design.

    PubMed

    Mallory, S R

    1993-01-01

    The software engineering and quality assurance disciplines are a requisite to the design of safe and effective software-based medical devices. It is in the areas of software methodology and process that the most beneficial application of these disciplines to software development can be made. Software is a product of complex operations and methodologies and is not amenable to the traditional electromechanical quality assurance processes. Software quality must be built in by the developers, with the software verification and validation engineers acting as the independent instruments for ensuring compliance with performance objectives and with development and maintenance standards. The implementation of a software quality assurance program is a complex process involving management support, organizational changes, and new skill sets, but the benefits are profound. Its rewards provide safe, reliable, cost-effective, maintainable, and manageable software, which may significantly speed the regulatory review process and therefore potentially shorten the overall time to market. The use of a trial project can greatly facilitate the learning process associated with the first-time application of a software quality assurance program.

  3. Software For Design And Analysis Of Tanks And Cylindrical Shells

    NASA Technical Reports Server (NTRS)

    Luz, Paul L.; Graham, Jerry B.

    1995-01-01

    Skin-stringer Tank Analysis Spreadsheet System (STASS) computer program developed for use as preliminary design software tool that enables quick-turnaround design and analysis of structural domes and cylindrical barrel sections in propellant tanks or other cylindrical shells. Determines minimum required skin thicknesses for domes and cylindrical shells to withstand material failure due to applied pressures (ullage and/or hydrostatic) and runs buckling analyses on cylindrical shells and skin-stringers. Implemented as workbook program, using Microsoft Excel v4.0 on the Macintosh II. Also implemented using Microsoft Excel v4.0 for Microsoft Windows v3.1 on IBM PC compatibles.
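
    The minimum-skin-thickness calculation for a pressurized cylinder rests on the thin-wall hoop-stress relation sigma = p*r/t. The sketch below illustrates that relation only; the function name, safety factor, and units are our own assumptions, not STASS's actual worksheet logic:

    ```python
    def min_skin_thickness(pressure_pa, radius_m, allow_stress_pa, safety_factor=1.4):
        """Minimum wall thickness (m) of a thin-walled cylinder under
        internal pressure, from hoop stress sigma = p * r / t rearranged
        to t = SF * p * r / sigma_allow. Valid only while t << r."""
        return safety_factor * pressure_pa * radius_m / allow_stress_pa
    ```

    A 1-m-radius shell at 1 MPa with a 100 MPa allowable stress needs 10 mm of skin before any safety factor, which is the kind of quick-turnaround sizing the spreadsheet automates.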

  4. Preliminary design of the HARMONI science software

    NASA Astrophysics Data System (ADS)

    Piqueras, Laure; Jarno, Aurelien; Pécontal-Rousset, Arlette; Loupias, Magali; Richard, Johan; Schwartz, Noah; Fusco, Thierry; Sauvage, Jean-François; Neichel, Benoît; Correia, Carlos M.

    2016-08-01

    This paper introduces the science software of HARMONI. The Instrument Numerical Model simulates the instrument from the optical point of view and provides synthetic exposures simulating detector readouts from data-cubes containing astrophysical scenes. The Data Reduction Software converts raw-data frames into a fully calibrated, scientifically usable data cube. We present the functionalities and the preliminary design of this software, describe some of the methods and algorithms used and highlight the challenges that we will have to face.

  5. New Cerec software version 4.3 for Omnicam and Bluecam.

    PubMed

    Fritzsche, G; Schenk, O

    2014-01-01

    The introduction of the Cerec Omnicam acquisition unit in September 2012 presented Sirona with a challenge: configuring the existing software version 4 for both the existing Bluecam, which uses still images, and the video-based Omnicam. Sirona has succeeded in making all the features introduced in version 4.2 (such as the virtual articulator or implant-supported single-tooth restorations, both monolithic and two-part designs) work with both camera types, without compromising the uniform, homogeneous look and feel of the software. The virtual articulator (Figs 1a to 1c) now has even more individual configuration options and allows the setting of almost all angles derived from the individual transfer bow based on precalculated average values. The new software version 4.3, presented in July 2014, fixes some minor bugs, such as the time-consuming "empty grinding" after necessary water changes during the grinding process, but also includes many features that noticeably ease the workflow. For example, the important scanning precision in the region of the anterior incisal edges has been improved, which makes the scanning process more reliable, faster, and far more comfortable.

  6. The radiation environment on the surface of Mars - Numerical calculations of the galactic component with GEANT4/PLANETOCOSMICS.

    PubMed

    Matthiä, Daniel; Berger, Thomas

    2017-08-01

    Galactic cosmic radiation and the secondary particles produced in its interaction with the atmosphere lead to a complex radiation field on the Martian surface. A workshop ("1st Mars Space Radiation Modeling Workshop") organized by the MSL-RAD science team was held in June 2016 in Boulder with the goal of comparing models capable of predicting this radiation field with each other and with measurements from the RAD instrument onboard the Curiosity rover taken between November 15, 2015 and January 15, 2016. In this work, the PLANETOCOSMICS/GEANT4 results contributed to the workshop are presented. Calculated secondary particle spectra on the Martian surface are investigated, and the directionality of the radiation field for the different particle species is discussed as a function of energy. Omnidirectional particle fluxes are used in combination with fluence-to-dose conversion factors to calculate absorbed dose rates and dose equivalent rates in a slab of tissue. Copyright © 2017. Published by Elsevier Ltd.
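
    The last step described above, folding omnidirectional fluxes with fluence-to-dose conversion factors, can be sketched as a simple energy integral. The spectrum and coefficients below are illustrative numbers, not RAD data or ICRP coefficients.

```python
# Fold an omnidirectional particle flux with fluence-to-dose conversion
# coefficients and integrate over energy (trapezoidal rule). All numbers
# are illustrative, not RAD measurements or ICRP coefficients.
import numpy as np

energy = np.array([1.0, 10.0, 100.0, 1000.0])       # MeV
flux = np.array([2.0e-1, 1.0e-1, 2.0e-2, 1.0e-3])   # particles / (cm^2 s MeV)
h_conv = np.array([5.0, 40.0, 400.0, 1500.0])       # pSv cm^2, dose per fluence

integrand = flux * h_conv                            # pSv / (s MeV)
dose_rate_psv_per_s = float(np.sum(0.5 * (integrand[1:] + integrand[:-1])
                                   * np.diff(energy)))
print(f"dose equivalent rate: {dose_rate_psv_per_s:.1f} pSv/s")
```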

  7. Design and implementation of Skype USB user gateway software

    NASA Astrophysics Data System (ADS)

    Qi, Yang

    2017-08-01

    With the widespread application of VoIP, clients with private protocols are becoming more and more popular; Skype is one representative. How to connect Skype with the PSTN through the Skype client alone has gradually become a hot topic. This paper describes the design and implementation of software based on a USB user gateway. With this software, a Skype user can communicate freely with a PSTN phone. A finite state machine (FSM) is designed as the core of the software, and Skype control is separated from the USB gateway control. In this way, the communication becomes more flexible and efficient. In actual user testing, the software obtained good results.
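
    The FSM-centered design described above can be sketched as a small transition table. The states and events below are hypothetical, since the paper does not list its actual FSM.

```python
# Minimal finite-state-machine sketch of the kind used as the core of a
# call-handling gateway. States, events, and transitions are hypothetical;
# the paper's actual Skype/PSTN FSM is not specified in the abstract.
class CallFSM:
    TRANSITIONS = {
        ("idle", "offhook"): "dialing",
        ("dialing", "digits_done"): "connecting",
        ("connecting", "answered"): "talking",
        ("talking", "hangup"): "idle",
        ("dialing", "hangup"): "idle",
        ("connecting", "hangup"): "idle",
    }

    def __init__(self):
        self.state = "idle"

    def handle(self, event):
        """Apply an event; unknown events leave the state unchanged."""
        self.state = self.TRANSITIONS.get((self.state, event), self.state)
        return self.state

fsm = CallFSM()
for ev in ("offhook", "digits_done", "answered", "hangup"):
    print(ev, "->", fsm.handle(ev))
```

    Keeping the transitions in a single table is what makes the gateway control easy to separate from the Skype control: each side only posts events to the machine.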

  8. Swarming Robot Design, Construction and Software Implementation

    NASA Technical Reports Server (NTRS)

    Stolleis, Karl A.

    2014-01-01

    This paper presents an overview of the hardware design, construction, software design, and software implementation of a small, low-cost robot to be used for swarming-robot development. In addition to the work done on the robot, a full simulation of the robotic system was developed using the Robot Operating System (ROS) and its associated simulation tools. The eventual use of the robots will be the exploration of evolving behaviors via genetic algorithms, building on the work done at the University of New Mexico Biological Computation Lab.

  9. An overview of very high level software design methods

    NASA Technical Reports Server (NTRS)

    Asdjodi, Maryam; Hooper, James W.

    1988-01-01

    Very High Level design methods emphasize automatic transfer of requirements to formal design specifications, and/or may concentrate on automatic transformation of formal design specifications that include some semantic information of the system into machine executable form. Very high level design methods range from general domain-independent methods to approaches implementable for specific applications or domains. By applying AI techniques, abstract programming methods, domain heuristics, software engineering tools, library-based programming, and other methods, different approaches for higher-level software design are being developed. Though a given approach does not always fall exactly into any specific class, this paper provides a classification for very high level design methods, including examples for each class. These methods are analyzed and compared based on their basic approaches, strengths, and feasibility for future expansion toward automatic development of software systems.

  10. The Liquid Argon Software Toolkit (LArSoft): Goals, Status and Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pordes, Rush; Snider, Erica

    LArSoft is a toolkit that provides a software infrastructure and algorithms for the simulation, reconstruction and analysis of events in Liquid Argon Time Projection Chambers (LArTPCs). It is used by the ArgoNeuT, LArIAT, MicroBooNE, DUNE (including 35ton prototype and ProtoDUNE) and SBND experiments. The LArSoft collaboration provides an environment for the development, use, and sharing of code across experiments. The ultimate goal is to develop fully automatic processes for reconstruction and analysis of LArTPC events. The toolkit is based on the art framework and has a well-defined architecture to interface to other packages, including the GEANT4 and GENIE simulation software and the Pandora software development kit for pattern recognition. It is designed to facilitate and support the evolution of algorithms including their transition to new computing platforms. The development of the toolkit is driven by the scientific stakeholders involved. The core infrastructure includes standard definitions of types and constants, means to input experiment geometries as well as metadata and event data in several formats, and relevant general utilities. Examples of algorithms experiments have contributed to date are: photon propagation; particle identification; hit finding; track finding and fitting; and electromagnetic shower identification and reconstruction. We report on the status of the toolkit and plans for future work.

  11. Consistency evaluation between EGSnrc and Geant4 charged particle transport in an equilibrium magnetic field.

    PubMed

    Yang, Y M; Bednarz, B

    2013-02-21

    Following the proposal by several groups to integrate magnetic resonance imaging (MRI) with radiation therapy, much attention has been afforded to examining the impact of strong (on the order of a Tesla) transverse magnetic fields on photon dose distributions. The effect of the magnetic field on dose distributions must be considered in order to take full advantage of the benefits of real-time intra-fraction imaging. In this investigation, we compared the handling of particle transport in magnetic fields between two Monte Carlo codes, EGSnrc and Geant4, to analyze various aspects of their electromagnetic transport algorithms; both codes are well-benchmarked for medical physics applications in the absence of magnetic fields. A water-air-water slab phantom and a water-lung-water slab phantom were used to highlight dose perturbations near high- and low-density interfaces. We have implemented a method of calculating the Lorentz force in EGSnrc based on theoretical models in the literature, and show very good consistency between the two Monte Carlo codes. This investigation further demonstrates the importance of accurate dosimetry for MRI-guided radiation therapy (MRIgRT), and facilitates the integration of a ViewRay MRIgRT system in the University of Wisconsin-Madison's Radiation Oncology Department.

  12. Consistency evaluation between EGSnrc and Geant4 charged particle transport in an equilibrium magnetic field

    NASA Astrophysics Data System (ADS)

    Yang, Y. M.; Bednarz, B.

    2013-02-01

    Following the proposal by several groups to integrate magnetic resonance imaging (MRI) with radiation therapy, much attention has been afforded to examining the impact of strong (on the order of a Tesla) transverse magnetic fields on photon dose distributions. The effect of the magnetic field on dose distributions must be considered in order to take full advantage of the benefits of real-time intra-fraction imaging. In this investigation, we compared the handling of particle transport in magnetic fields between two Monte Carlo codes, EGSnrc and Geant4, to analyze various aspects of their electromagnetic transport algorithms; both codes are well-benchmarked for medical physics applications in the absence of magnetic fields. A water-air-water slab phantom and a water-lung-water slab phantom were used to highlight dose perturbations near high- and low-density interfaces. We have implemented a method of calculating the Lorentz force in EGSnrc based on theoretical models in the literature, and show very good consistency between the two Monte Carlo codes. This investigation further demonstrates the importance of accurate dosimetry for MRI-guided radiation therapy (MRIgRT), and facilitates the integration of a ViewRay MRIgRT system in the University of Wisconsin-Madison's Radiation Oncology Department.
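
    The magnetic-field handling both codes must implement can be illustrated with a toy transport step: in a uniform field, the Lorentz force rotates the velocity component perpendicular to B by the gyro angle each step while leaving the speed unchanged. The charge-to-mass value below is the approximate electron e/m; real transport codes couple this rotation to energy loss and scattering.

```python
# Toy transport step in a uniform z-directed magnetic field: rotate the
# transverse velocity by the gyro angle omega*dt, omega = (q/(gamma*m))*B.
# 1.76e11 C/kg is the approximate electron charge-to-mass ratio; the
# velocity, field, and step size are illustrative.
import math

def rotate_step(vx, vy, q_over_gm, bz, dt):
    """One field step: rotate (vx, vy) by theta = (q/(gamma m)) * Bz * dt."""
    theta = q_over_gm * bz * dt
    c, s = math.cos(theta), math.sin(theta)
    return c * vx + s * vy, -s * vx + c * vy

vx, vy = 1.0e7, 0.0                 # m/s, illustrative transverse velocity
for _ in range(1000):
    vx, vy = rotate_step(vx, vy, 1.76e11, 1.5, 1.0e-13)
print(f"speed after 1000 steps: {math.hypot(vx, vy):.6e} m/s")
```

    Because the step is a pure rotation, the particle's speed (and hence kinetic energy) is conserved exactly, which is the property a field-stepping algorithm must preserve between energy-loss updates.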

  13. Early-Stage Software Design for Usability

    ERIC Educational Resources Information Center

    Golden, Elspeth

    2010-01-01

    In spite of the goodwill and best efforts of software engineers and usability professionals, systems continue to be built and released with glaring usability flaws that are costly and difficult to fix after the system has been built. Although user interface (UI) designers, be they usability or design experts, communicate usability requirements to…

  14. Thick-foils activation technique for neutron spectrum unfolding with the MINUIT routine-Comparison with GEANT4 simulations

    NASA Astrophysics Data System (ADS)

    Vagena, E.; Theodorou, K.; Stoulos, S.

    2018-04-01

    The neutron activation technique has been applied using a proposed set of twelve thick metal foils (Au, As, Cd, In, Ir, Er, Mn, Ni, Se, Sm, W, Zn) for off-site measurements to obtain the neutron spectrum over a wide energy range (from thermal up to a few MeV) in intense mixed neutron-gamma fields such as those around medical linacs. The unfolding procedure takes into account the activation rates measured using thirteen (n,γ) and two (n,p) reactions without imposing a guess solution spectrum. The MINUIT minimization routine unfolds a neutron spectrum that is dominated by fast neutrons (70%) peaking at 0.3 MeV, while the thermal peak corresponds to 15% of the total neutron fluence, equal to that of the epithermal-resonance area. Comparison of the unfolded neutron spectrum against one simulated with the GEANT4 Monte Carlo code shows reasonable agreement within the measurement uncertainties. Therefore, the proposed set of thick activation foils could be a useful tool for determining low-flux neutron spectra in intense mixed fields.
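
    The unfolding problem described above amounts to inverting a folded system: each measured activation rate is the response of one reaction folded with the group fluxes. A minimal sketch with a synthetic response matrix and an ordinary least-squares solve (standing in for the MINUIT minimization) is:

```python
# Synthetic unfolding sketch: activation rates A = R @ phi, where R_ij is
# the response of foil reaction i to flux group j. An ordinary
# least-squares solve stands in for the MINUIT minimization used in the
# paper; all numbers are made up for illustration.
import numpy as np

R = np.array([[0.80, 0.15, 0.02],    # thermal-sensitive reaction
              [0.10, 0.70, 0.10],    # epithermal-sensitive reaction
              [0.05, 0.10, 0.90]])   # fast-sensitive reaction
phi_true = np.array([3.0, 1.0, 6.0]) # group fluxes (arbitrary units)
A = R @ phi_true                     # "measured" activation rates

phi_fit, *_ = np.linalg.lstsq(R, A, rcond=None)
print("unfolded group fluxes:", np.round(phi_fit, 6))
```

    With real, noisy activation rates and many more energy groups than foils, the system is ill-posed, which is why a constrained minimizer such as MINUIT (with physical bounds on the fluxes) is used rather than a plain matrix inversion.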

  15. Efficiency of depleted UO2 based semiconductor neutron detectors in direct and indirect configuration—A GEANT4 simulation study

    NASA Astrophysics Data System (ADS)

    Parida, M. K.; Prabakar, K.; Sundari, S. T.

    2018-03-01

    In the present work, Monte Carlo simulations using GEANT4 are carried out to estimate the efficiency of semiconductor neutron detectors with depleted UO2 (DUO2) as the converter material, in both planar (direct and indirect) and 3D-geometry (cylindrical perforation and trench structure) configurations. The simulations were conducted for neutrons of variable energy, viz., thermal (25 meV) and fast (1 to 10 MeV), incident on varying thicknesses (0.25 μm to 1000 μm), diameters (1 μm to 9 μm), and widths (1 μm to 9 μm) along with depths (50 μm to 275 μm) of DUO2 for the planar, cylindrically perforated, and trench structures, respectively. In the case of direct planar detectors, efficiency was found to increase with the thickness of DUO2, and the rate at which efficiency increased was found to follow the macroscopic fission cross section at the corresponding neutron energy. In the case of the indirect planar detector, efficiency was lower than in the direct configuration and was found to saturate beyond a thickness of ~3 μm. This saturation is explained on the basis of the mean free path of neutrons in the DUO2 material. For the 3D perforated silicon detectors of cylindrical (trench) geometry, backfilled with DUO2, the efficiency for detection of thermal neutrons (~25 meV) and fast neutrons (typical energy ~10 MeV) was found to be ~0.0159% (~0.0177%) and ~0.0088% (0.0098%), respectively. These efficiency values were two (one) orders of magnitude higher than those of the planar indirect detector for thermal (fast) neutrons. Histogram plots were also obtained from the GEANT4 simulations to monitor the energy distribution of fission products in the planar (direct and indirect) and 3D-geometry (cylindrical and trench) configurations. These plots revealed that, for all the detector configurations, the energy deposited by the fission products is higher than the typical gamma-ray background. Thus, for detectors with DUO2 as converter material, higher values of low level discriminator
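
    The direct-detector thickness trend reported above can be illustrated with a toy interaction-probability model whose initial slope is set by the macroscopic cross section; the Σ value below is illustrative, not the DUO2 fission cross section.

```python
# Toy model for the direct-detector trend: the probability that a neutron
# interacts within a converter layer of thickness t is
# P(t) = 1 - exp(-Sigma * t), so efficiency grows with thickness at a rate
# set by the macroscopic cross section Sigma and eventually saturates.
# Sigma below is an illustrative value, not the DUO2 fission cross section.
import math

def interaction_probability(sigma_per_um, thickness_um):
    return 1.0 - math.exp(-sigma_per_um * thickness_um)

sigma = 2.0e-3   # per micrometer, illustrative
for t in (0.25, 10.0, 100.0, 1000.0):
    print(f"t = {t:7.2f} um -> P = {interaction_probability(sigma, t):.4f}")
```

    For thin layers P(t) ≈ Σt, which is why the simulated efficiency growth tracks the macroscopic fission cross section; the real indirect-detector saturation additionally involves the finite range of the fission fragments.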

  16. Software For Drawing Design Details Concurrently

    NASA Technical Reports Server (NTRS)

    Crosby, Dewey C., III

    1990-01-01

    Software system containing five computer-aided-design programs enables more than one designer to work on same part or assembly at same time. Reduces time necessary to produce design by implementing concept of parallel or concurrent detailing, in which all detail drawings documenting three-dimensional model of part or assembly produced simultaneously, rather than sequentially. Keeps various detail drawings consistent with each other and with overall design by distributing changes in each detail to all other affected details.

  17. Designing application software in wide area network settings

    NASA Technical Reports Server (NTRS)

    Makpangou, Mesaac; Birman, Ken

    1990-01-01

    Progress in methodologies for developing robust local area network software has not been matched by similar results for wide area settings. The design of application software spanning multiple local area environments is examined. For important classes of applications, simple design techniques are presented that yield fault tolerant wide area programs. An implementation of these techniques as a set of tools for use within the ISIS system is described.

  18. NASA software specification and evaluation system design, part 2

    NASA Technical Reports Server (NTRS)

    1976-01-01

    A survey and analysis of the existing methods, tools, and techniques employed in the development of software are presented, along with recommendations for the construction of reliable software. Functional designs for a software specification language and a database verifier are presented.

  19. Efficacy of a Newly Designed Cephalometric Analysis Software for McNamara Analysis in Comparison with Dolphin Software.

    PubMed

    Nouri, Mahtab; Hamidiaval, Shadi; Akbarzadeh Baghban, Alireza; Basafa, Mohammad; Fahim, Mohammad

    2015-01-01

    Cephalometric norms of McNamara analysis have been studied in various populations due to their optimal efficiency. Dolphin cephalometric software greatly enhances the conduction of this analysis for orthodontic measurements. However, Dolphin is very expensive and cannot be afforded by many clinicians in developing countries. A suitable alternative software program in Farsi/English will greatly help Farsi speaking clinicians. The present study aimed to develop an affordable Iranian cephalometric analysis software program and compare it with Dolphin, the standard software available on the market for cephalometric analysis. In this diagnostic, descriptive study, 150 lateral cephalograms of normal occlusion individuals were selected in Mashhad and Qazvin, two major cities of Iran mainly populated with Fars ethnicity, the main Iranian ethnic group. After tracing the cephalograms, the McNamara analysis standards were measured both with Dolphin and the new software. The cephalometric software was designed using Microsoft Visual C++ program in Windows XP. Measurements made with the new software were compared with those of Dolphin software on both series of cephalograms. The validity and reliability were tested using intra-class correlation coefficient. Calculations showed a very high correlation between the results of the Iranian cephalometric analysis software and Dolphin. This confirms the validity and optimal efficacy of the newly designed software (ICC 0.570-1.0). According to our results, the newly designed software has acceptable validity and reliability and can be used for orthodontic diagnosis, treatment planning and assessment of treatment outcome.
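
    The validation step described above can be sketched with a small intra-class correlation computation; the ICC(3,1)-style consistency formula and the synthetic measurement pairs below stand in for the actual Dolphin / new-software data.

```python
# Sketch of the agreement check: a two-way consistency intra-class
# correlation (ICC(3,1)-style) between paired measurements from two
# software packages. The data are synthetic stand-ins, not the study's
# cephalometric measurements.
import numpy as np

def icc_consistency(x):
    """x has shape (n_subjects, k_raters); ICC = (MSR-MSE)/(MSR+(k-1)*MSE)."""
    n, k = x.shape
    grand = x.mean()
    ss_rows = k * ((x.mean(axis=1) - grand) ** 2).sum()
    ss_cols = n * ((x.mean(axis=0) - grand) ** 2).sum()
    ss_err = ((x - grand) ** 2).sum() - ss_rows - ss_cols
    msr = ss_rows / (n - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse)

rng = np.random.default_rng(0)
true_vals = rng.normal(80.0, 5.0, size=30)              # angle-like measurements
software_a = true_vals + rng.normal(0.0, 0.3, size=30)  # small measurement noise
software_b = true_vals + rng.normal(0.0, 0.3, size=30)
print(f"ICC = {icc_consistency(np.column_stack([software_a, software_b])):.3f}")
```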

  20. Object-oriented design of medical imaging software.

    PubMed

    Ligier, Y; Ratib, O; Logean, M; Girard, C; Perrier, R; Scherrer, J R

    1994-01-01

    A special software package for interactive display and manipulation of medical images was developed at the University Hospital of Geneva, as part of a hospital wide Picture Archiving and Communication System (PACS). This software package, called Osiris, was especially designed to be easily usable and adaptable to the needs of noncomputer-oriented physicians. The Osiris software has been developed to allow the visualization of medical images obtained from any imaging modality. It provides generic manipulation tools, processing tools, and analysis tools more specific to clinical applications. This software, based on an object-oriented paradigm, is portable and extensible. Osiris is available on two different operating systems: the Unix X-11/OSF-Motif based workstations, and the Macintosh family.

  1. Geant4 simulations of soft proton scattering in X-ray optics. A tentative validation using laboratory measurements

    NASA Astrophysics Data System (ADS)

    Fioretti, Valentina; Mineo, Teresa; Bulgarelli, Andrea; Dondero, Paolo; Ivanchenko, Vladimir; Lei, Fan; Lotti, Simone; Macculi, Claudio; Mantero, Alfonso

    2017-12-01

    Low energy protons (< 300 keV) can enter the field of view of X-ray telescopes, scatter on their mirror surfaces at small incident angles, and deposit energy on the detector. This phenomenon can cause intense background flares at the focal plane, decreasing the mission observing time (e.g. the XMM-Newton mission) or, in the most extreme cases, damaging the X-ray detector. A correct modeling of the physics processes responsible for grazing-angle scattering is mandatory to evaluate the impact of such events on the performance (e.g. observation time, sensitivity) of future X-ray telescopes such as the ESA ATHENA mission. The Remizovich model describes particles reflected by solids at glancing angles in terms of the Boltzmann transport equation, using the diffuse approximation and the model of continuous slowing down in energy. For the first time this solution, in the approximation of no energy losses, is implemented, verified, and qualitatively validated on top of the Geant4 release 10.2, with the possibility to add a constant energy loss to each interaction. The implementation is verified by comparing the simulated proton distribution to both the theoretical probability distribution and independent ray-tracing simulations. Both the new scattering physics and the Coulomb scattering already built into the official Geant4 distribution are used to reproduce the latest experimental results on grazing-angle proton scattering. At 250 keV, multiple scattering delivers large proton angles and is not consistent with the observations. Among the tested models, single scattering seems to best reproduce the scattering efficiency at the three energies, but the energy loss obtained at small scattering angles is significantly lower than the experimental values. In general, the energy losses obtained in the experiment are higher than those obtained by the simulation. The experimental data are not completely representative of the soft proton scattering experienced by

  2. A design methodology for portable software on parallel computers

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Miller, Keith W.; Chrisman, Dan A.

    1993-01-01

    This final report for research that was supported by grant number NAG-1-995 documents our progress in addressing two difficulties in parallel programming. The first difficulty is developing software that will execute quickly on a parallel computer. The second difficulty is transporting software between dissimilar parallel computers. In general, we expect that more hardware-specific information will be included in software designs for parallel computers than in designs for sequential computers. This inclusion is an instance of portability being sacrificed for high performance. New parallel computers are being introduced frequently. Trying to keep one's software on the current high performance hardware, a software developer almost continually faces yet another expensive software transportation. The problem of the proposed research is to create a design methodology that helps designers to more precisely control both portability and hardware-specific programming details. The proposed research emphasizes programming for scientific applications. We completed our study of the parallelizability of a subsystem of the NASA Earth Radiation Budget Experiment (ERBE) data processing system. This work is summarized in section two. A more detailed description is provided in Appendix A ('Programming Practices to Support Eventual Parallelism'). Mr. Chrisman, a graduate student, wrote and successfully defended a Ph.D. dissertation proposal which describes our research associated with the issues of software portability and high performance. The list of research tasks are specified in the proposal. The proposal 'A Design Methodology for Portable Software on Parallel Computers' is summarized in section three and is provided in its entirety in Appendix B. We are currently studying a proposed subsystem of the NASA Clouds and the Earth's Radiant Energy System (CERES) data processing system. This software is the proof-of-concept for the Ph.D. dissertation. We have implemented and measured

  3. A software engineering approach to expert system design and verification

    NASA Technical Reports Server (NTRS)

    Bochsler, Daniel C.; Goodwin, Mary Ann

    1988-01-01

    Software engineering design and verification methods for developing expert systems are not yet well defined. Integration of expert system technology into software production environments will require effective software engineering methodologies to support the entire life cycle of expert systems. The software engineering methods used to design and verify an expert system, RENEX, are discussed. RENEX demonstrates autonomous rendezvous and proximity operations, including replanning trajectory events and subsystem fault detection, onboard a space vehicle during flight. The RENEX designers utilized a number of software engineering methodologies to deal with the complex problems inherent in this system. An overview is presented of the methods utilized. Details of the verification process receive special emphasis. The benefits and weaknesses of the methods for supporting the development life cycle of expert systems are evaluated, and recommendations are made based on the overall experiences with the methods.

  4. Team Software Development for Aerothermodynamic and Aerodynamic Analysis and Design

    NASA Technical Reports Server (NTRS)

    Alexandrov, N.; Atkins, H. L.; Bibb, K. L.; Biedron, R. T.; Carpenter, M. H.; Gnoffo, P. A.; Hammond, D. P.; Jones, W. T.; Kleb, W. L.; Lee-Rausch, E. M.

    2003-01-01

    A collaborative approach to software development is described. The approach employs the agile development techniques: project retrospectives, Scrum status meetings, and elements of Extreme Programming to efficiently develop a cohesive and extensible software suite. The software product under development is a fluid dynamics simulator for performing aerodynamic and aerothermodynamic analysis and design. The functionality of the software product is achieved both through the merging, with substantial rewrite, of separate legacy codes and the authorship of new routines. Examples of rapid implementation of new functionality demonstrate the benefits obtained with this agile software development process. The appendix contains a discussion of coding issues encountered while porting legacy Fortran 77 code to Fortran 95, software design principles, and a Fortran 95 coding standard.

  5. Toward a Formal Model of the Design and Evolution of Software

    DTIC Science & Technology

    1988-12-20

    It should have the flexibility to support a variety of design methodologies, be comprehensive enough to encompass the gamut of software lifecycle activities, and be precise enough to provide the

  6. Reducing the complexity of the software design process with object-oriented design

    NASA Technical Reports Server (NTRS)

    Schuler, M. P.

    1991-01-01

    Designing software is a complex process. How object-oriented design (OOD), coupled with formalized documentation and tailored object diagraming techniques, can reduce the complexity of the software design process is described and illustrated. The described OOD methodology uses a hierarchical decomposition approach in which parent objects are decomposed into layers of lower level child objects. A method of tracking the assignment of requirements to design components is also included. Increases in the reusability, portability, and maintainability of the resulting products are also discussed. This method was built on a combination of existing technology, teaching experience, consulting experience, and feedback from design method users. The discussed concepts are applicable to hierarchal OOD processes in general. Emphasis is placed on improving the design process by documenting the details of the procedures involved and incorporating improvements into those procedures as they are developed.
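
    The hierarchical decomposition and requirement tracking described above can be sketched with a small object tree in which each design component records its assigned requirement IDs; the component names and requirement IDs below are illustrative.

```python
# Sketch of hierarchical object decomposition with requirement tracking:
# parent objects own layers of child objects, each component records the
# requirement IDs assigned to it, and the requirement-to-component mapping
# is collected by traversal. Names and IDs are illustrative.
from dataclasses import dataclass, field

@dataclass
class DesignObject:
    name: str
    requirements: list = field(default_factory=list)
    children: list = field(default_factory=list)

    def requirement_map(self, mapping=None):
        """Collect {requirement_id: [component names]} over the hierarchy."""
        mapping = {} if mapping is None else mapping
        for req in self.requirements:
            mapping.setdefault(req, []).append(self.name)
        for child in self.children:
            child.requirement_map(mapping)
        return mapping

system = DesignObject("FlightSoftware", ["R1"], [
    DesignObject("Navigation", ["R2", "R3"]),
    DesignObject("Telemetry", ["R3"], [DesignObject("Downlink", ["R4"])]),
])
print(system.requirement_map())
```

    A traversal like this is what lets a design team verify that every requirement is assigned to at least one component, and see which components a requirement change will touch.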

  7. Model-based software design

    NASA Technical Reports Server (NTRS)

    Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui; Yenne, Britt; Vansickle, Larry; Ballantyne, Michael

    1992-01-01

    Domain-specific knowledge is required to create specifications, generate code, and understand existing systems. Our approach to automating software design is based on instantiating an application domain model with industry-specific knowledge and then using that model to achieve the operational goals of specification elicitation and verification, reverse engineering, and code generation. Although many different specification models can be created from any particular domain model, each specification model is consistent and correct with respect to the domain model.

  8. Designing Distributed Learning Environments with Intelligent Software Agents

    ERIC Educational Resources Information Center

    Lin, Fuhua, Ed.

    2005-01-01

    "Designing Distributed Learning Environments with Intelligent Software Agents" reports on the most recent advances in agent technologies for distributed learning. Chapters are devoted to the various aspects of intelligent software agents in distributed learning, including the methodological and technical issues on where and how intelligent agents…

  9. Agile development approach for the observatory control software of the DAG 4m telescope

    NASA Astrophysics Data System (ADS)

    Güçsav, B. Bülent; Çoker, Deniz; Yeşilyaprak, Cahit; Keskin, Onur; Zago, Lorenzo; Yerli, Sinan K.

    2016-08-01

    Observatory Control Software for the upcoming 4m infrared telescope of DAG (Eastern Anatolian Observatory in Turkish) is at the beginning of its lifecycle. After the process of elicitation and validation of the initial requirements, we have focused on preparing a rapid conceptual design, not only to see the big picture of the system but also to clarify the further development methodology. The existing preliminary designs for both software (including the TCS and the active optics control system) and hardware are presented here in brief to highlight the challenges the DAG software team has been facing. The potential benefits of an agile approach for the development will be discussed, depending on the published experience of the community and on the resources available to us.

  10. A Software Tool for Integrated Optical Design Analysis

    NASA Technical Reports Server (NTRS)

    Moore, Jim; Troy, Ed; DePlachett, Charles; Montgomery, Edward (Technical Monitor)

    2001-01-01

    Design of large precision optical systems requires multi-disciplinary analysis, modeling, and design. The thermal, structural, and optical characteristics of the hardware must be accurately understood in order to design a system capable of meeting the performance requirements. The interactions between the disciplines become stronger as systems are designed to be lighter weight for space applications. This coupling dictates a concurrent engineering design approach. In the past, integrated modeling tools have been developed that attempt to integrate all of the complex analysis within the framework of a single model. This often results in modeling simplifications, and it requires engineering specialists to learn new applications. The software described in this presentation addresses the concurrent engineering task using a different approach. The software tool, Integrated Optical Design Analysis (IODA), uses data fusion technology to enable a cross-discipline team of engineering experts to concurrently design an optical system using their standard validated engineering design tools.

  11. Ion therapy for uveal melanoma in new human eye phantom based on GEANT4 toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mahdipour, Seyed Ali; Mowlavi, Ali Asghar, E-mail: amowlavi@hsu.ac.ir; ICTP, Associate Federation Scheme, Medical Physics Field, Trieste

    Radiotherapy with ion beams such as protons and carbon ions has been used for the treatment of ocular uveal melanoma for many years. In this research, we have developed a new phantom of the human eye for Monte Carlo simulation of tumor treatment with the GEANT4 toolkit. Total depth-dose profiles for proton, alpha, and carbon incident beams with the same range have been calculated in the phantom. Moreover, the energy deposited by the secondary particles for each of the primary beams is calculated. The dose curves are compared for 47.8 MeV proton, 190.1 MeV alpha, and 1060 MeV carbon ions that have the same range in the target region, reaching the center of the tumor. The passively scattered spread-out Bragg peak (SOBP) for each incident beam, as well as the flux curves of the secondary particles including neutrons, gammas, and positrons, has been calculated and compared for the primary beams. The high sharpness of the carbon beam's Bragg peak with low lateral broadening is the benefit of this beam in hadrontherapy, but it has the disadvantages of dose leakage in the tail after its Bragg peak and a high intensity of neutron production. However, the proton beam, which conforms well to the tumor shape owing to the beam broadening caused by scattering, can be a good choice for large tumors.
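
    The Bragg-peak behaviour discussed above can be illustrated with a toy depth-dose calculation based on the empirical Bragg-Kleeman range-energy relation R = αE^p; the α and p values below are approximate textbook values for protons in water, not the paper's GEANT4 results.

```python
# Toy depth-dose sketch of a pristine Bragg peak using the empirical
# Bragg-Kleeman relation R = alpha * E**p (approximate water values).
# The residual energy at depth x is E(x) = ((R0 - x)/alpha)**(1/p), so
# the energy lost per depth step (the local dose deposit) peaks near
# the end of range. Illustrative only; ignores straggling and nuclear
# interactions.
import numpy as np

alpha, p = 0.0022, 1.77          # approximate values for protons in water
E0 = 100.0                       # MeV initial proton energy
R0 = alpha * E0 ** p             # range in cm

depths = np.linspace(0.0, R0, 400)
residual_E = ((R0 - depths) / alpha) ** (1.0 / p)
deposit = -np.diff(residual_E)   # energy lost in each depth step
peak_depth = depths[np.argmax(deposit)]
print(f"range = {R0:.2f} cm, dose peaks near {peak_depth:.2f} cm")
```

    The concentration of the deposit in the final steps is the Bragg peak; a spread-out Bragg peak (SOBP) is built by superposing such curves for a set of degraded beam energies with suitable weights.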

  12. Making software get along: integrating optical and mechanical design programs

    NASA Astrophysics Data System (ADS)

    Shackelford, Christie J.; Chinnock, Randal B.

    2001-03-01

    As modern optomechanical engineers, we have the good fortune of having very sophisticated software programs available to us. The current optical design, mechanical design, industrial design, and CAM programs are very powerful tools with some very desirable features. However, no one program can do everything necessary to complete an entire optomechanical system design. Each program has a unique set of features and benefits, and typically two or more will be used during the product development process. At a minimum, an optical design program and a mechanical CAD package will be employed. As we strive for efficient, cost-effective, and rapid progress in our development projects, we must use these programs to their full advantage while keeping redundant tasks to a minimum. Together, these programs offer the promise of a 'seamless' flow of data from concept all the way to the download of part designs directly to the machine shop for fabrication. In reality, transferring data from one software package to the next is often frustrating. Overcoming these problems takes some know-how, a bit of creativity, and a lot of persistence. This paper describes a complex optomechanical development effort in which a variety of software tools were used from the concept stage to prototyping. It describes what software was used for each major design task, how we learned to use the tools together to best advantage, and how we overcame the frustrations of software that didn't get along.

  13. Detector Simulations with DD4hep

    NASA Astrophysics Data System (ADS)

    Petrič, M.; Frank, M.; Gaede, F.; Lu, S.; Nikiforou, N.; Sailer, A.

    2017-10-01

    Detector description is a key component of detector design studies, test beam analyses, and most particle physics experiments, which require the simulation of ever more detector geometries and event types. This paper describes DD4hep, an easy-to-use yet flexible and powerful detector description framework that can be used for detector simulation and extended to the specific needs of a particular working environment. The linear collider detector concepts ILD, SiD, and CLICdp, as well as the detector development collaborations CALICE and FCal, have adopted the DD4hep geometry framework and its DDG4 pathway to Geant4 as core simulation and reconstruction tools. The DDG4 plugin suite supports a wide variety of input formats, provides access to the Geant4 particle gun or general particle source, and handles Monte Carlo truth information, e.g. by linking hits to the primary particle that caused them, which is indispensable for performance and efficiency studies. An extendable array of segmentations and sensitive detectors allows the simulation of a wide variety of detector technologies. This paper shows how DD4hep makes it possible to perform complex Geant4 detector simulations without compiling a single line of additional code, by providing a palette of sub-detector components that can be combined and configured via compact XML files. Simulation is controlled either completely from the command line or via simple Python steering files interpreted by a Python executable. The paper also discusses how additional plugins and extensions can be created to increase the functionality.

  14. Inter-comparison of Dose Distributions Calculated by FLUKA, GEANT4, MCNP, and PHITS for Proton Therapy

    NASA Astrophysics Data System (ADS)

    Yang, Zi-Yi; Tsai, Pi-En; Lee, Shao-Chun; Liu, Yen-Chiang; Chen, Chin-Cheng; Sato, Tatsuhiko; Sheu, Rong-Jiun

    2017-09-01

    The dose distributions from proton pencil beam scanning were calculated with FLUKA, GEANT4, MCNP, and PHITS in order to investigate their applicability to proton radiotherapy. The first case studied the integrated depth dose curves (IDDCs) from 100 and 226 MeV proton pencil beams impinging on a water phantom. The calculated IDDCs agree with each other as long as each code employs 75 eV for the ionization potential of water. The second case considered the same conditions but with proton energies in a Gaussian distribution. Comparison with measurement indicates that the inter-code differences might be due not only to different stopping powers but also to the nuclear physics models. How the physics parameter settings affect the computation time was also discussed. In the third case, the applicability of each code to pencil beam scanning was confirmed by delivering a uniform volumetric dose distribution based on the treatment plan; the results showed general agreement among the codes, the treatment plan, and the measurement, except for some deviations in the penumbra region. This study has demonstrated that the selected codes are all capable of performing dose calculations for therapeutic scanning proton beams with proper physics settings.

  15. Empirical studies of design software: Implications for software engineering environments

    NASA Technical Reports Server (NTRS)

    Krasner, Herb

    1988-01-01

    The empirical studies team of MCC's Design Process Group conducted three studies in 1986-87 in order to gather data on professionals designing software systems in a range of situations. The first study (the Lift Experiment) used thinking-aloud protocols in a controlled laboratory setting to study the cognitive processes of individual designers. The second study (the Object Server Project) involved the observation, videotaping, and data collection of a design team on a medium-sized development project over several months in order to study team dynamics. The third study (the Field Study) involved interviews with personnel from 19 large development projects at the MCC shareholder companies in order to study how the process of design is affected by organizational and project behavior. The focus of this report is on key observations of the design process (at several levels) and their implications for the design of environments.

  16. User-Centered Design Guidelines for Collaborative Software for Intelligence Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scholtz, Jean; Endert, Alexander

    In this position paper we discuss the necessity of using User-Centered Design (UCD) methods in order to design collaborative software for the intelligence community. We discuss a number of studies of collaboration in the intelligence community and use this information to provide some guidelines for collaboration software.

  17. GeMS: an advanced software package for designing synthetic genes.

    PubMed

    Jayaraj, Sebastian; Reid, Ralph; Santi, Daniel V

    2005-01-01

    A user-friendly, advanced software package for gene design is described. The software comprises an integrated suite of programs (also provided as stand-alone tools) that automatically performs the following tasks in gene design: restriction site prediction, codon optimization for any expression host, restriction site inclusion and exclusion, separation of long sequences into synthesizable fragments, Tm and stem-loop determinations, optimal oligonucleotide component design, and design verification/error-checking. The output is a complete design report and a list of optimized oligonucleotides to be prepared for subsequent gene synthesis. The user interface accommodates both inexperienced and experienced users: for inexperienced users, explanatory notes are provided so that detailed instructions are not necessary; for experienced users, a streamlined interface is provided without such notes. The software has been extensively tested in the design and successful synthesis of over 400 kb of genes, many of which exceeded 5 kb in length.

  18. Geant4 beam model for boron neutron capture therapy: investigation of neutron dose components.

    PubMed

    Moghaddasi, Leyla; Bezak, Eva

    2018-03-01

    Boron neutron capture therapy (BNCT) is a biochemically targeted type of radiotherapy, selectively delivering a localized dose to tumor cells diffused in normal tissue while minimizing normal tissue toxicity. BNCT is based on thermal neutron capture by stable ¹⁰B nuclei, resulting in the emission of short-ranged alpha particles and recoil ⁷Li nuclei. The purpose of the current work was to develop and validate a Monte Carlo BNCT beam model and to investigate the contribution of individual dose components resulting from neutron interactions. A neutron beam model was developed in Geant4 and validated against published data. The neutron beam spectrum, obtained from the literature for a cyclotron-produced beam, was used to irradiate a water phantom with a boron concentration of 100 μg/g. The calculated percentage depth dose curves (PDDs) in the phantom were compared with published data to validate the beam model in terms of total and boron depth dose deposition. Subsequently, two sensitivity studies were conducted to quantify the impact of (1) the neutron beam spectrum and (2) various boron concentrations on the boron dose component. Good agreement (within 1%) was achieved between the calculated and measured neutron beam PDDs. The resulting boron depth dose deposition was also in agreement with measured data. The sensitivity study over several boron concentrations showed that the calculated boron dose gradually converged beyond a boron concentration of 100 μg/g. These results suggest that a tumor boron concentration of 100 μg/g may be optimal, and that above this value only a limited increase in boron dose is expected for a given neutron flux.

  19. PopED lite: An optimal design software for preclinical pharmacokinetic and pharmacodynamic studies.

    PubMed

    Aoki, Yasunori; Sundqvist, Monika; Hooker, Andrew C; Gennemark, Peter

    2016-04-01

    Optimal experimental design approaches are seldom used in preclinical drug discovery. The objective was to develop an optimal design software tool specifically intended for preclinical applications, in order to increase the efficiency of drug discovery in vivo studies. Several realistic experimental design case studies were collected, and many preclinical experimental teams were consulted, to determine the design goals of the software tool. The tool obtains an optimized experimental design by solving a constrained optimization problem, where each candidate design is evaluated using some function of the Fisher Information Matrix. The software was implemented in C++ using the Qt framework to assure responsive user-software interaction through a rich graphical user interface while achieving the desired computational speed. In addition, a discrete global optimization algorithm was developed and implemented. The software design goals were simplicity, speed, and intuition. Based on these goals, we have developed the publicly available software PopED lite (http://www.bluetree.me/PopED_lite). Optimization computation was, on average over 14 test problems, 30 times faster in PopED lite than in an existing optimal design software tool. PopED lite is now used in real drug discovery projects, and a few of these case studies are presented in this paper. PopED lite is designed to be simple, fast, and intuitive: simple, to give many users access to basic optimal design calculations; fast, to fit a short design-execution cycle and allow interactive experimental design (test one design, discuss the proposed design, test another design, etc.); and intuitive, so that the input to and output from the software tool can easily be understood by users without knowledge of optimal design theory. In this way, PopED lite is highly useful in practice and complements existing tools. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
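    The core idea of evaluating a design "using some function of the Fisher Information Matrix" can be sketched for the simplest possible case: a one-parameter exponential elimination model y(t) = exp(-k·t) with additive noise. The model, the rate constant k = 0.5 per hour, and the candidate time grid are illustrative assumptions; this is not PopED lite's API.

    ```python
    import math

    # For a scalar parameter k, the Fisher information of a set of
    # sampling times is the sum of squared sensitivities dy/dk divided
    # by the noise variance; maximizing it is D-optimal design.
    def fim(times, k=0.5, sigma=0.1):
        # dy/dk = -t * exp(-k*t) for y(t) = exp(-k*t)
        return sum((t * math.exp(-k * t)) ** 2 for t in times) / sigma ** 2

    candidates = [0.5, 1.0, 2.0, 4.0, 8.0]          # candidate times (hours)
    best = max(candidates, key=lambda t: fim([t]))   # best single sample
    print(best)  # 2.0: the sensitivity t*exp(-k*t) peaks at t = 1/k
    ```

    With more parameters the FIM becomes a matrix and the criterion is typically its log-determinant, but the pattern (score each candidate design by an FIM functional, then optimize) is the same.
    
    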

  20. Preliminary design of the redundant software experiment

    NASA Technical Reports Server (NTRS)

    Campbell, Roy; Deimel, Lionel; Eckhardt, Dave, Jr.; Kelly, John; Knight, John; Lauterbach, Linda; Lee, Larry; Mcallister, Dave; Mchugh, John

    1985-01-01

    The goal of the present experiment is to characterize the fault distributions of highly reliable software replicates, constructed using techniques and environments similar to those used in contemporary industrial software facilities. The fault distributions and their effect on the reliability of fault tolerant configurations of the software will be determined through extensive life testing of the replicates against carefully constructed, randomly generated test data. Each detected error will be carefully analyzed to provide insight into its nature and cause. A direct objective is to develop techniques for reducing the intensity of coincident errors, thus increasing the reliability gain which can be achieved with fault tolerance. Data on the reliability gains realized, and on the cost of the fault tolerant configurations, can be used to design a companion experiment to determine the cost effectiveness of the fault tolerant strategy. Finally, the data and analysis produced by this experiment will be valuable to the software engineering community as a whole because they will provide useful insight into the nature and cause of hard-to-find, subtle faults which escape standard software engineering validation techniques and thus persist far into the software life cycle.

  1. User-Centered Design Guidelines for Collaborative Software for Intelligence Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scholtz, Jean; Endert, Alexander N.

    In this position paper we discuss the necessity of using User-Centered Design (UCD) methods in order to design collaborative software for the intelligence community. We present some standing issues in collaborative software based on existing work within the intelligence community. Based on this information we present opportunities to address some of these challenges.

  2. A four-alternative forced choice (4AFC) software for observer performance evaluation in radiology

    NASA Astrophysics Data System (ADS)

    Zhang, Guozhi; Cockmartin, Lesley; Bosmans, Hilde

    2016-03-01

    The four-alternative forced choice (4AFC) test is a psychophysical method that can be adopted for observer performance evaluation in radiological studies. While the concept of the method is well established, difficulties in handling large image data, performing unbiased sampling, and keeping track of the choices made by the observer have restricted its application in practice. In this work, we propose an easy-to-use software tool that supports 4AFC tests with DICOM images. The software suits any experimental design that follows the 4AFC approach. It has a powerful image viewing system that closely simulates the clinical reading environment, and its graphical interface allows the observer to adjust various viewing parameters and make selections with very simple operations. The sampling process involved in 4AFC, as well as the speed and accuracy of the choices made by the observer, is precisely monitored in the background and can be easily exported for test analysis. The software also has a defensive mechanism for data management and operation control that minimizes the possibility of user mistakes during the test. This software can greatly facilitate the use of the 4AFC approach in radiological observer studies and is expected to have widespread applicability.
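    The scoring behind a 4AFC test is simple to state: each trial presents the signal in one of four positions, the observer is forced to pick one, and a trial is correct when the choice matches the signal position, so chance performance is 1/4. A minimal sketch with made-up trial data:

    ```python
    # Each trial is (signal_position, observer_choice), positions 0-3.
    # The trial data below are illustrative, not from the paper.
    trials = [(0, 0), (2, 2), (1, 3), (3, 3), (2, 0), (1, 1), (0, 0), (3, 2)]

    correct = sum(1 for signal_pos, choice in trials if signal_pos == choice)
    proportion_correct = correct / len(trials)
    print(proportion_correct)  # 0.625, well above the 0.25 chance level
    ```
    
    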

  3. Accuracy of computerized automatic identification of cephalometric landmarks by a designed software.

    PubMed

    Shahidi, Sh; Shahidi, S; Oshagh, M; Gozin, F; Salehi, P; Danaei, S M

    2013-01-01

    The purpose of this study was to design software for the localization of cephalometric landmarks and to evaluate its accuracy in finding landmarks. Forty digital cephalometric radiographs were randomly selected, and 16 landmarks important in most cephalometric analyses were chosen for identification. Three expert orthodontists manually identified the landmarks twice; the mean of the two measurements of each landmark was defined as the baseline landmark. The computer's automatic estimate of each landmark could then be compared with the baseline landmark. The software was designed using the Delphi and Matlab programming languages; the techniques were template matching, edge enhancement, and some accessory techniques. The total mean error between manually and automatically identified landmarks was 2.59 mm: 12.5% of landmarks had mean errors less than 1 mm, 43.75% had mean errors less than 2 mm, and the mean errors of all landmarks except the anterior nasal spine were less than 4 mm. This software showed significant accuracy for the localization of cephalometric landmarks and could be used in future applications. The accuracy obtained with the software developed in this study appears better than that of previous automated systems based on model-based and knowledge-based approaches.
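    The template-matching technique the abstract names can be sketched as follows: slide a landmark template over the image, score each position with normalized cross-correlation (NCC), and take the best-scoring position as the landmark estimate. The synthetic image, the `ncc_match` helper, and the planted template location are all illustrative assumptions, not the paper's implementation.

    ```python
    import numpy as np

    # Brute-force NCC template matching: returns the (row, col) of the
    # best match of `template` inside `image`.
    def ncc_match(image, template):
        th, tw = template.shape
        t = template - template.mean()
        best, best_pos = -2.0, (0, 0)
        for y in range(image.shape[0] - th + 1):
            for x in range(image.shape[1] - tw + 1):
                patch = image[y:y+th, x:x+tw]
                p = patch - patch.mean()
                denom = np.sqrt((p * p).sum() * (t * t).sum())
                score = (p * t).sum() / denom if denom > 0 else 0.0
                if score > best:
                    best, best_pos = score, (y, x)
        return best_pos

    rng = np.random.default_rng(0)
    image = rng.random((40, 40))
    template = image[12:18, 20:26].copy()   # plant the template at (12, 20)
    print(ncc_match(image, template))       # (12, 20)
    ```

    A production system would use an FFT-based correlation (or `scipy.signal` / OpenCV equivalents) rather than this O(N²M²) loop, but the scoring logic is the same.
    
    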

  4. Object-oriented software design in semiautomatic building extraction

    NASA Astrophysics Data System (ADS)

    Guelch, Eberhard; Mueller, Hardo

    1997-08-01

    Developing a system for semiautomatic building acquisition is a complex process that requires constant integration and updating of software modules and user interfaces. To facilitate these processes, we apply an object-oriented design not only to the data but also to the software involved. We use the Unified Modeling Language (UML) to describe the object-oriented modeling of the system at different levels of detail. We can distinguish between use cases from the user's point of view, which represent a sequence of actions yielding an observable result, and use cases for programmers, who can use the system as a class library to integrate the acquisition modules into their own software. The structure of the system is based on the model-view-controller (MVC) design pattern. An example from the integration of automated texture extraction for the visualization of results demonstrates the feasibility of this approach.

  5. Integrated testing and verification system for research flight software design document

    NASA Technical Reports Server (NTRS)

    Taylor, R. N.; Merilatt, R. L.; Osterweil, L. J.

    1979-01-01

    The NASA Langley Research Center is developing the MUST (Multipurpose User-oriented Software Technology) program to cut the cost of producing research flight software through a system of software support tools. The HAL/S language is the primary subject of the design. Boeing Computer Services Company (BCS) has designed an integrated verification and testing capability as part of MUST. Documentation, verification, and test options are provided, with special attention to real-time, multiprocessing issues. The needs of the entire software production cycle have been considered, with effective management and reduced life-cycle costs as the foremost goals. Capabilities have been included in the design for static detection of data flow anomalies involving communicating concurrent processes. Some types of ill-formed process synchronization and deadlock are also detected statically.

  6. Guidance and Control Software Project Data - Volume 4: Configuration Management and Quality Assurance Documents

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes configuration management and quality assurance documents from the GCS project. Volume 4 contains six appendices: A. Software Accomplishment Summary for the Guidance and Control Software Project; B. Software Configuration Index for the Guidance and Control Software Project; C. Configuration Management Records for the Guidance and Control Software Project; D. Software Quality Assurance Records for the Guidance and Control Software Project; E. Problem Report for the Pluto Implementation of the Guidance and Control Software Project; and F. Support Documentation Change Reports for the Guidance and Control Software Project.

  7. Automated Theorem Proving in High-Quality Software Design

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Swanson, Keith (Technical Monitor)

    2001-01-01

    The amount and complexity of software developed during the last few years has increased tremendously. In particular, programs are being used more and more in embedded systems (from car brakes to plant control). Many of these applications are safety-relevant, i.e. a malfunction of hardware or software can cause severe damage or loss. Tremendous risks are typically present in the areas of aviation, (nuclear) power plants, and (chemical) plant control, where even small problems can lead to thousands of casualties and huge financial losses. Large financial risks also exist when computer systems are used in telecommunication (telephony, electronic commerce) or space exploration. Computer applications in these areas are not only subject to safety considerations; security issues are important as well. All these systems must be designed and developed to guarantee high quality with respect to safety and security. Even in an industrial setting which is (or at least should be) aware of the high requirements of software engineering, many incidents occur. For example, the Warsaw Airbus accident was caused by an incomplete requirements specification, and uncontrolled reuse of an Ariane 4 software module was the reason for the Ariane 5 disaster. Recent incidents in the telecommunication area, such as the illegal "cloning" of the smart cards in D2 GSM mobile phones or the extraction of (secret) passwords from German T-Online users, show that serious flaws can happen in this area as well. Due to the inherent complexity of computer systems, most authors claim that only a rigorous application of formal methods in all stages of the software life cycle can ensure high software quality and lead to truly safe and secure systems. In this paper, we examine to what extent automated theorem proving can contribute to a more widespread application of formal methods and their tools, and what automated theorem provers (ATPs) must provide in order to be useful.

  8. Intelligent Software for System Design and Documentation

    NASA Technical Reports Server (NTRS)

    2002-01-01

    In an effort to develop a real-time, on-line database system that tracks documentation changes in NASA's propulsion test facilities, engineers at Stennis Space Center teamed with ECT International of Brookfield, WI, through the NASA Dual-Use Development Program to create the External Data Program and Hyperlink Add-on Modules for the promis*e software. Promis*e is ECT's top-of-the-line intelligent software for control system design and documentation. With promis*e the user can make use of the automated design process to quickly generate control system schematics, panel layouts, bills of material, wire lists, terminal plans and more. NASA and its testing contractors currently use promis*e to create the drawings and schematics at the E2 Cell 2 test stand located at Stennis Space Center.

  9. Acquiring Software Design Schemas: A Machine Learning Perspective

    NASA Technical Reports Server (NTRS)

    Harandi, Mehdi T.; Lee, Hing-Yan

    1991-01-01

    In this paper, we describe an approach based on machine learning that acquires software design schemas from design cases of existing applications. An overview of the technique, design representation, and acquisition system is presented. The paper also addresses issues associated with generalizing common features, such as biases. The generalization process is illustrated using an example.

  10. Monte Carlo simulation of MOSFET dosimeter for electron backscatter using the GEANT4 code.

    PubMed

    Chow, James C L; Leung, Michael K K

    2008-06-01

    The aim of this study is to investigate the influence of the body of the metal-oxide-semiconductor field effect transistor (MOSFET) dosimeter in measuring the electron backscatter from lead. The electron backscatter factor (EBF), defined as the ratio of the dose at the tissue-lead interface to the dose at the same point without the presence of backscatter, was calculated by Monte Carlo simulation using the GEANT4 code. Electron beams with energies of 4, 6, 9, and 12 MeV were used in the simulation. It was found that in the presence of the MOSFET body, the EBFs were underestimated by about 2%-0.9% for electron beam energies of 4-12 MeV, respectively. The decrease of the EBF deviation with increasing electron energy can be explained by the fact that the small MOSFET dosimeter, made mainly of epoxy and silicon, attenuated not only the electron fluence of the beam from upstream but also the electron backscatter generated by the lead underneath the dosimeter. However, this EBF underestimation is of the same order as the statistical uncertainties of the Monte Carlo simulations, which ranged from 1.3% to 0.8% for electron energies of 4-12 MeV, owing to the small dosimetric volume. Such a small EBF deviation is therefore insignificant when the uncertainty of the Monte Carlo simulation is taken into account. Corresponding measurements were carried out, and deviations from the Monte Carlo results were within +/- 2%. Spectra of the energy deposited by the backscattered electrons in dosimetric volumes with and without the lead and MOSFET were determined by Monte Carlo simulations. It was found that in both cases, with the MOSFET body either present or absent in the simulation, the deviations of the electron energy spectra with and without the lead decrease with increasing electron beam energy. Moreover, the softer spectrum of the backscattered electrons when lead is present can result in a reduction of the MOSFET response due to stronger
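    The EBF definition in this record is a straightforward dose ratio, which makes the reported ~2% underestimation easy to state concretely. A minimal sketch with made-up dose values; the `ebf` helper and the numbers are illustrative, not data from the paper:

    ```python
    # EBF = dose at the tissue-lead interface / dose at the same point
    # without backscatter (illustrative values, arbitrary units).
    def ebf(dose_with_lead, dose_without_lead):
        return dose_with_lead / dose_without_lead

    d_interface, d_homogeneous = 1.45, 1.00
    true_ebf = ebf(d_interface, d_homogeneous)

    # An EBF read out about 2% low, as reported for the 4 MeV beam
    # when the MOSFET body is present in the simulation:
    measured_ebf = true_ebf * 0.98
    underestimation_pct = 100 * (1 - measured_ebf / true_ebf)
    print(round(underestimation_pct, 1))  # 2.0
    ```
    
    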

  11. Advanced Software Techniques for Data Management Systems. Volume 2: Space Shuttle Flight Executive System: Functional Design

    NASA Technical Reports Server (NTRS)

    Pepe, J. T.

    1972-01-01

    A functional design of the software executive system for the space shuttle avionics computer is presented. Three primary functions of the executive are emphasized in the design: task management, I/O management, and configuration management. The executive system organization is based on the applications software and configuration requirements established during the Phase B definition of the Space Shuttle program. Although the primary features of the executive system architecture were derived from Phase B requirements, it was specified for implementation on the IBM 4 Pi EP aerospace computer and is expected to be incorporated into a breadboard data management computer system at the NASA Manned Spacecraft Center's Information Systems Division. The executive system was structured for internal operation on the IBM 4 Pi EP system, with its external configuration and applications software assumed to be characteristic of the centralized quad-redundant avionics system defined in Phase B.

  12. SU-G-TeP3-04: Evaluation of the Dose Enhancement with Gold Nanoparticle in Microdosimetry Level Using the Geant4-DNA Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    He, C; Chow, J

    Purpose: This study investigated the dose enhancement effect of gold nanoparticles (GNPs) used as radiation sensitizers and irradiated by photon beams of different energies. The microdosimetry of the photon-irradiated GNP was determined at the DNA scale using the Geant4-DNA processes. Methods: Monte Carlo simulation was conducted using the Geant4 toolkit (ver. 10.2). A GNP of varying size (30, 50, and 100 nm diameter spheres) and a DNA segment were placed in a water cube (1 µm³). The GNP was irradiated by photon beams of different energies (50, 100, and 150 keV) and produced secondary electrons that increase the dose to the DNA. Energy depositions were calculated both with and without the GNP to investigate the dose enhancement effect at the DNA. The distance between the GNP and the DNA was varied to find the optimal GNP position relative to the DNA. The photon beam source was placed 200 nm from the GNP in each simulation. Results: The GNP was found to have a dose enhancement effect for kV photon radiation. Across the Monte Carlo results for different GNP sizes, GNP-DNA distances, and photon beam energies, the dose enhancement ratio increased with GNP size. The dose enhancement ratio decreased as the distance between the GNP and the DNA increased, although the effect of distance was not as significant as that of GNP size. In addition, increasing the photon beam energy also increased the dose enhancement ratio. The largest dose enhancement ratio, 3.5, was found when a GNP (100 nm diameter) irradiated by the 150 keV photon beam was placed 80 nm from the DNA. Conclusion: Dose enhancement at the DNA with a GNP was determined at the microdosimetry scale. It is concluded that the dose enhancement varies with the photon beam energy, the GNP size, and the distance between the GNP and the DNA.

  13. Research on Visualization Design Method in the Field of New Media Software Engineering

    NASA Astrophysics Data System (ADS)

    Deqiang, Hu

    2018-03-01

    In the current period of rapidly developing science and technology, with increasingly fierce market competition and growing user demand, a new design and application method has emerged in the field of new media software engineering: the visualization design method. Applying the visualization design method to new media software engineering can not only improve the operational efficiency of new media software engineering but, more importantly, enhance the quality of software development through suitable media of communication and transformation; on this basis, it also continuously promotes the progress and development of new media software engineering in China. This article therefore concretely analyses the application of the visualization design method in the field of new media software engineering, starting from an overview of visualization design methods and building on a systematic analysis of the underlying technology.

  14. Journal of Open Source Software (JOSS): design and first-year review

    NASA Astrophysics Data System (ADS)

    Smith, Arfon M.

    2018-01-01

    JOSS is a free and open-access journal that publishes articles describing research software across all disciplines. It has the dual goals of improving the quality of the software submitted and providing a mechanism for research software developers to receive credit. While designed to work within the current merit system of science, JOSS addresses the dearth of rewards for key contributions to science made in the form of software. JOSS publishes articles that encapsulate the scholarship contained in the software itself, and its rigorous peer review targets the software components: functionality, documentation, tests, continuous integration, and the license. A JOSS article contains an abstract describing the purpose and functionality of the software, references, and a link to the software archive. JOSS published more than 100 articles in its first year, many from the scientific Python ecosystem (including a number of articles related to astronomy and astrophysics). JOSS is a sponsored project of the nonprofit organization NumFOCUS and is an affiliate of the Open Source Initiative. In this presentation, I describe the motivation, design, and progress of the Journal of Open Source Software (JOSS) and how it compares to other avenues for publishing research software in astronomy.

  15. Automating risk analysis of software design models.

    PubMed

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.

  16. Automating Risk Analysis of Software Design Models

    PubMed Central

    Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P.

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688

  17. JPL Facilities and Software for Collaborative Design: 1994 - Present

    NASA Technical Reports Server (NTRS)

    DeFlorio, Paul A.

    2004-01-01

    The viewgraph presentation provides an overview of the history of the JPL Project Design Center (PDC) and, since 2000, the Center for Space Mission Architecture and Design (CSMAD). The discussion includes PDC objectives and scope; mission design metrics; distributed design; a software architecture timeline; facility design principles; optimized design for group work; CSMAD plan view, facility design, and infrastructure; and distributed collaboration tools.

  18. Training Software Developers and Designers to Conduct Usability Evaluations

    ERIC Educational Resources Information Center

    Skov, Mikael Brasholt; Stage, Jan

    2012-01-01

    Many efforts to improve the interplay between usability evaluation and software development rely either on better methods for conducting usability evaluations or on better formats for presenting evaluation results in ways that are useful for software designers and developers. Both of these approaches depend on a complete division of work between…

  19. Use of Software Tools in Teaching Relational Database Design.

    ERIC Educational Resources Information Center

    McIntyre, D. R.; And Others

    1995-01-01

    Discusses the use of state-of-the-art software tools in teaching a graduate, advanced, relational database design course. Results indicated a positive student response to the prototype of expert systems software and a willingness to utilize this new technology both in their studies and in future work applications. (JKP)

  20. Second International Workshop on Software Engineering and Code Design in Parallel Meteorological and Oceanographic Applications

    NASA Technical Reports Server (NTRS)

    OKeefe, Matthew (Editor); Kerr, Christopher L. (Editor)

    1998-01-01

    This report contains the abstracts and technical papers from the Second International Workshop on Software Engineering and Code Design in Parallel Meteorological and Oceanographic Applications, held June 15-18, 1998, in Scottsdale, Arizona. The purpose of the workshop is to bring together software developers in meteorology and oceanography to discuss software engineering and code design issues for parallel architectures, including Massively Parallel Processors (MPP's), Parallel Vector Processors (PVP's), Symmetric Multi-Processors (SMP's), Distributed Shared Memory (DSM) multi-processors, and clusters. Issues to be discussed include: (1) code architectures for current parallel models, including basic data structures, storage allocation, variable naming conventions, coding rules and styles, i/o and pre/post-processing of data; (2) designing modular code; (3) load balancing and domain decomposition; (4) techniques that exploit parallelism efficiently yet hide the machine-related details from the programmer; (5) tools for making the programmer more productive; and (6) the proliferation of programming models (F--, OpenMP, MPI, and HPF).
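    One of the topics listed above, domain decomposition, can be illustrated with a minimal 1-D block split; the function below is a generic sketch, not code from the workshop.

```python
# Illustrative 1-D block domain decomposition: split a grid of n points
# across p processes as evenly as possible, as used when distributing a
# model grid over MPP or cluster nodes.
def decompose(n, p):
    base, extra = divmod(n, p)
    # The first `extra` ranks get one additional point each.
    counts = [base + (1 if r < extra else 0) for r in range(p)]
    starts = [sum(counts[:r]) for r in range(p)]
    return list(zip(starts, counts))  # (start index, local size) per rank

print(decompose(10, 3))  # [(0, 4), (4, 3), (7, 3)]
```

    Balanced block sizes like these keep per-rank work even, which is the load-balancing concern the workshop topics raise.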

  1. The Implementation of Satellite Control System Software Using Object Oriented Design

    NASA Technical Reports Server (NTRS)

    Anderson, Mark O.; Reid, Mark; Drury, Derek; Hansell, William; Phillips, Tom

    1998-01-01

    NASA established the Small Explorer (SMEX) program in 1988 to provide frequent opportunities for highly focused and relatively inexpensive space science missions that can be launched into low earth orbit by small expendable vehicles. The development schedule for each SMEX spacecraft was three years from start to launch. The SMEX program has produced five satellites: Solar Anomalous and Magnetospheric Particle Explorer (SAMPEX), Fast Auroral Snapshot Explorer (FAST), Submillimeter Wave Astronomy Satellite (SWAS), Transition Region and Coronal Explorer (TRACE) and Wide-Field Infrared Explorer (WIRE). SAMPEX and FAST are on-orbit, TRACE is scheduled to be launched in April of 1998, WIRE is scheduled to be launched in September of 1998, and SWAS is scheduled to be launched in January of 1999. In each of these missions, the Attitude Control System (ACS) software was written using a modular procedural design. Current program goals require complete spacecraft development within 18 months. This requirement has increased pressure to write reusable flight software. Object-Oriented Design (OOD) offers the constructs for developing an application that only needs modification for mission-unique requirements. This paper describes the OOD that was used to develop the SMEX-Lite ACS software. The SMEX-Lite ACS is three-axis controlled, momentum stabilized, and is capable of performing sub-arc-minute pointing. The paper first describes the high-level requirements which governed the architecture of the SMEX-Lite ACS software. Next, the context in which the software resides is explained. The paper describes the benefits of encapsulation, inheritance and polymorphism with respect to the implementation of an ACS software system. This paper will discuss the design of several software components that comprise the ACS software. Specifically, Object-Oriented designs are presented for sensor data processing, attitude control, attitude determination and failure detection.
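    The benefit of polymorphism claimed above can be shown with a minimal sketch; the class and sensor names are hypothetical, not the SMEX-Lite flight code.

```python
# Illustrative sketch (not the flight software) of how polymorphism lets
# an ACS treat different attitude sensors uniformly. Class names and
# measurement formats are assumptions.
from abc import ABC, abstractmethod

class Sensor(ABC):
    @abstractmethod
    def read(self) -> dict:
        """Return a calibrated measurement in a common format."""

class SunSensor(Sensor):
    def read(self):
        return {"type": "sun_vector", "value": (1.0, 0.0, 0.0)}

class Magnetometer(Sensor):
    def read(self):
        return {"type": "b_field", "value": (0.2, -0.1, 0.4)}

def process(sensors):
    # Attitude determination consumes measurements without knowing the
    # concrete sensor class -- adding a sensor needs no changes here.
    return [s.read() for s in sensors]

measurements = process([SunSensor(), Magnetometer()])
print([m["type"] for m in measurements])  # ['sun_vector', 'b_field']
```

    Encapsulating each sensor behind a common interface is what makes the code reusable across missions with different hardware complements.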

  2. Using CASE Software to Teach Undergraduates Systems Analysis and Design.

    ERIC Educational Resources Information Center

    Wilcox, Russell E.

    1988-01-01

    Describes the design and delivery of a college course for information system students utilizing a Computer-Aided Software Engineering program. Discusses class assignments, cooperative learning, student attitudes, and the advantages of using this software in the course. (CW)

  3. Learning & Personality Types: A Case Study of a Software Design Course

    ERIC Educational Resources Information Center

    Ahmed, Faheem; Campbell, Piers; Jaffar, Ahmad; Alkobaisi, Shayma; Campbell, Julie

    2010-01-01

    The software industry has continued to grow over the past decade and there is now a need to provide education and hands-on training to students in various phases of software life cycle. Software design is one of the vital phases of the software development cycle. Psychological theories assert that not everybody is fit for all kind of tasks as…

  4. User Interface Design for Dynamic Geometry Software

    ERIC Educational Resources Information Center

    Kortenkamp, Ulrich; Dohrmann, Christian

    2010-01-01

    In this article we describe long-standing user interface issues with Dynamic Geometry Software and common approaches to address them. We describe first prototypes of multi-touch-capable DGS. We also give some hints on the educational benefits of proper user interface design.

  5. Exploratory research for the development of a computer aided software design environment with the software technology program

    NASA Technical Reports Server (NTRS)

    Hardwick, Charles

    1991-01-01

    Field studies were conducted by MCC to determine areas of research of mutual interest to MCC and JSC. NASA personnel from the Information Systems Directorate and research faculty from UHCL/RICIS visited MCC in Austin, Texas to examine tools and applications under development in the MCC Software Technology Program. MCC personnel presented workshops in hypermedia, design knowledge capture, and design recovery on site at JSC for ISD personnel. The following programs were installed on workstations in the Software Technology Lab, NASA/JSC: (1) GERM (Graphic Entity Relations Modeler); (2) gIBIS (Graphic Issues Based Information System); and (3) DESIRE (Design Recovery tool). These applications were made available to NASA for inspection and evaluation. Programs developed in the MCC Software Technology Program run on the SUN workstation. The programs do not require special configuration, but they will require larger than usual amounts of disk space and RAM to operate properly.

  6. Cognitive task analysis-based design and authoring software for simulation training.

    PubMed

    Munro, Allen; Clark, Richard E

    2013-10-01

    The development of more effective medical simulators requires a collaborative team effort where three kinds of expertise are carefully coordinated: (1) exceptional medical expertise focused on providing complete and accurate information about the medical challenges (i.e., critical skills and knowledge) to be simulated; (2) instructional expertise focused on the design of simulation-based training and assessment methods that produce maximum learning and transfer to patient care; and (3) software development expertise that permits the efficient design and development of the software required to capture expertise, present it in an engaging way, and assess student interactions with the simulator. In this discussion, we describe a method of capturing more complete and accurate medical information for simulators and combine it with new instructional design strategies that emphasize the learning of complex knowledge. Finally, we describe three different types of software support (Development/Authoring, Run Time, and Post Run Time) required at different stages in the development of medical simulations and the instructional design elements of the software required at each stage. We describe the contributions expected of each kind of software and the different instructional control authoring support required. Reprint & Copyright © 2013 Association of Military Surgeons of the U.S.

  7. Critical Design Decisions of The Planck LFI Level 1 Software

    NASA Astrophysics Data System (ADS)

    Morisset, N.; Rohlfs, R.; Türler, M.; Meharga, M.; Binko, P.; Beck, M.; Frailis, M.; Zacchei, A.

    2010-12-01

    The PLANCK satellite, with two on-board instruments, a Low Frequency Instrument (LFI) and a High Frequency Instrument (HFI), was launched on May 14th with Ariane 5. The ISDC Data Centre for Astrophysics in Versoix, Switzerland has developed and maintains the Planck LFI Level 1 software for the Data Processing Centre (DPC) in Trieste, Italy. The main tasks of the Level 1 processing are to retrieve the daily available scientific and housekeeping (HK) data of the LFI instrument, the Sorption Cooler and the 4K Cooler data from the Mission Operation Centre (MOC) in Darmstadt; to sort them by time and by type (detector, observing mode, etc.); to extract the spacecraft attitude information from auxiliary files; to flag the data according to several criteria; and to archive the resulting Time Ordered Information (TOI), which will then be used to produce maps of the sky in different spectral bands. The outputs of the Level 1 software are the TOI files in FITS format, later ingested into the Data Management Component (DMC) database. This software has been used during different phases of the LFI instrument development. We started by reusing some ISDC components for the LFI Qualification Model (QM) and we completely reworked the software for the Flight Model (FM). This was motivated by critical design decisions taken jointly with the DPC. The main questions were: a) the choice of the data format: FITS or DMC? b) the design of the pipelines: use of the Planck Process Coordinator (ProC) or a simple Perl script? c) do we adapt the existing QM software or do we restart from scratch? The timeline and available manpower are also important issues to be taken into account. We present here the orientation of our choices and discuss their pertinence based on the experience of the final pre-launch tests and the start of real Planck LFI operations.
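    The sort-by-time-and-type and flagging steps described above might look like the following sketch; the packet fields and bad-data criterion are illustrative assumptions, not the actual Level 1 implementation.

```python
# Toy sketch of a Level-1-style sort-and-flag step. Packet fields and
# the flagging criterion are assumptions for illustration only.
from collections import defaultdict

packets = [
    {"t": 102.0, "kind": "science", "det": "LFI27", "value": 3.1},
    {"t": 100.0, "kind": "hk", "det": None, "value": 22.5},
    {"t": 101.0, "kind": "science", "det": "LFI27", "value": -999.0},
]

def sort_and_flag(packets, bad_value=-999.0):
    """Group packets by type, time-order each stream, and flag
    samples that match a simple bad-data criterion."""
    streams = defaultdict(list)
    for p in packets:
        streams[p["kind"]].append(p)
    for stream in streams.values():
        stream.sort(key=lambda p: p["t"])
        for p in stream:
            p["flag"] = "bad" if p["value"] == bad_value else "ok"
    return dict(streams)

toi = sort_and_flag(packets)
print([p["t"] for p in toi["science"]])  # [101.0, 102.0]
print(toi["science"][0]["flag"])         # bad
```

    In the real pipeline the time-ordered, flagged streams would then be written out as TOI files in FITS format.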

  8. An Overview of U.S. Trends in Educational Software Design.

    ERIC Educational Resources Information Center

    Colvin, Linda B.

    1989-01-01

    Describes trends in educational software design in the United States for elementary and secondary education. Highlights include user-friendly software; learner control; interfacing the computer with other media, including television, telecommunications networks, and optical disk technology; microworlds; graphics; word processing; database…

  9. AFOSR BRI: Co-Design of Hardware/Software for Predicting MAV Aerodynamics

    DTIC Science & Technology

    2016-09-27

    While Moore's Law theoretically doubles processor performance every 24 months, much of the realizable performance remains… Past efforts to develop such CFD codes on accelerated processors showed limited success; our hardware/software co-design approach created malleable…

  10. Software for quantitative analysis of radiotherapy: overview, requirement analysis and design solutions.

    PubMed

    Zhang, Lanlan; Hub, Martina; Mang, Sarah; Thieke, Christian; Nix, Oliver; Karger, Christian P; Floca, Ralf O

    2013-06-01

    Radiotherapy is a fast-developing discipline which plays a major role in cancer care. Quantitative analysis of radiotherapy data can improve the success of the treatment and support the prediction of outcome. In this paper, we first identify functional, conceptual and general requirements for a software system for quantitative analysis of radiotherapy. Further, we present an overview of existing radiotherapy analysis software tools and check them against the stated requirements. As none of them could meet all of the demands presented herein, we analyzed possible conceptual problems and present software design solutions and recommendations to meet the stated requirements (e.g. algorithmic decoupling via the dose iterator pattern; analysis database design). As a proof of concept we developed a software library, "RTToolbox", following the presented design principles. The RTToolbox is available as an open source library and has already been tested in a larger-scale software system for different use cases. These examples demonstrate the benefit of the presented design principles. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
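    The "dose iterator pattern" mentioned above can be sketched as follows; the interface is an illustrative assumption, not RTToolbox's actual API.

```python
# Hedged sketch of algorithmic decoupling via a dose iterator: analysis
# algorithms consume dose values one voxel at a time without knowing how
# the dose grid is stored. The interface below is an assumption.
class DoseIterator:
    """Yields dose values (Gy) voxel by voxel from any backing store."""
    def __init__(self, values):
        self._values = values
    def __iter__(self):
        return iter(self._values)

def mean_dose(dose_iter):
    # Works for any iterable source: in-memory grid, file, database.
    total, n = 0.0, 0
    for d in dose_iter:
        total += d
        n += 1
    return total / n

print(mean_dose(DoseIterator([1.0, 2.0, 3.0, 6.0])))  # 3.0
```

    Decoupling the statistic from the storage is what lets the same analysis code serve different dose-grid formats, which is the design benefit the paper argues for.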

  11. Software Design Methodology Migration for a Distributed Ground System

    NASA Technical Reports Server (NTRS)

    Ritter, George; McNair, Ann R. (Technical Monitor)

    2002-01-01

    The Marshall Space Flight Center's (MSFC) Payload Operations Center (POC) ground system has been developed and has evolved over a period of about 10 years. During this time the software processes have migrated from more traditional to more contemporary development processes. The new Software processes still emphasize requirements capture, software configuration management, design documenting, and making sure the products that have been developed are accountable to initial requirements. This paper will give an overview of how the Software Process have evolved highlighting the positives as well as the negatives. In addition, we will mention the COTS tools that have been integrated into the processes and how the COTS have provided value to the project .

  12. Energy resolution experiments of conical organic scintillators and a comparison with Geant4 simulations

    NASA Astrophysics Data System (ADS)

    Sosa, C. S.; Thompson, S. J.; Chichester, D. L.; Clarke, S. D.; Di Fulvio, A.; Pozzi, S. A.

    2018-08-01

    An increase in light-collection efficiency (LCE) improves the energy resolution of scintillator-based detection systems. An improvement in energy resolution can benefit detector performance, for example by lowering the measurement threshold and achieving greater accuracy in light-output calibration. This work shows that LCE can be increased by modifying the scintillator shape to reduce optical-photon reflections, thereby decreasing transmission and absorption likelihood at the reflector boundary. The energy resolution of four organic scintillators (EJ200) was compared: two cones and two right-circular cylinders, all with equal base diameter and height (50 mm). The sides of each shape had two surface conditions: one was polished and the other was ground. Each scintillator was coupled to the center of four photomultiplier tube (PMT) configurations of different diameters. The photocathode response of all PMTs was assessed as a function of position using a small cube (5 mm height) of EJ200. The worst configuration, a highly polished conical scintillator mated to a PMT of equal base diameter, produced a smeared energy spectrum. The cause of spectrum smearing is explored in detail. Results demonstrate that a ground cone had the greatest improvement in energy resolution over a ground cylinder, by approximately 16.2% at 478 keVee, when using the largest diameter (127 mm) PMT. This result is attributed to the greater LCE of the cone, its ground surface, and the uniform photocathode response near the center of the largest PMT. Optical-photon transport simulations in Geant4 of the cone and cylinder, assuming a diffuse reflector and a uniform photocathode, were compared to the best experimental configuration and agreed well. If a detector application requires excellent energy resolution above all other considerations, a ground cone on a large PMT is recommended over a cylinder.

  13. Design Recovery for Software Library Population

    DTIC Science & Technology

    1992-12-01

    …increase understandability, efficiency, and maintainability of the software and the design. A good representation choice will also aid in… required for a reengineering project. It details the analysis and planning phase and gives good criteria for determining the need for a reengineering… because it deals with all of these issues. With his complete description of the analysis and planning phase, Byrne has a good foundation for…

  14. Digital casts in orthodontics: a comparison of 4 software systems.

    PubMed

    Westerlund, Anna; Tancredi, Weronika; Ransjö, Maria; Bresin, Andrea; Psonis, Spyros; Torgersson, Olof

    2015-04-01

    The introduction of digital cast models is inevitable in the otherwise digitized everyday life of orthodontics. The introduction of this new technology, however, is not straightforward, and selecting an appropriate system can be difficult. The aim of the study was to compare 4 orthodontic digital software systems regarding service, features, and usability. Information regarding service offered by the companies was obtained from questionnaires and Web sites. The features of each software system were collected by exploring the user manuals and the software programs. Replicas of pretreatment casts were sent to Cadent (OrthoCAD; Cadent, Carlstadt, NJ), OrthoLab (O3DM; OrthoLab, Poznan, Poland), OrthoProof (DigiModel; OrthoProof, Nieuwegein, The Netherlands), and 3Shape (OrthoAnalyzer; 3Shape, Copenhagen, Denmark). The usability of the programs was assessed by experts in interaction design and usability using the "enhanced cognitive walkthrough" method: 4 tasks were defined and performed by a group of domain experts while they were observed by usability experts. The services provided by the companies were similar. Regarding the features, all 4 systems were able to perform basic measurements; however, not all provided the peer assessment rating index or the American Board of Orthodontics analysis, simulation of the treatment with braces, or digital articulation of the casts. All systems demonstrated weaknesses in usability. However, OrthoCAD and O3DM were considered to be easier to learn for first-time users. In general, the usability of these programs was poor and needs to be further developed. Hands-on training supervised by the program experts is recommended for beginners. Copyright © 2015 American Association of Orthodontists. Published by Elsevier Inc. All rights reserved.

  15. Evaluating a digital ship design tool prototype: Designers' perceptions of novel ergonomics software.

    PubMed

    Mallam, Steven C; Lundh, Monica; MacKinnon, Scott N

    2017-03-01

    Computer-aided solutions are essential for naval architects to manage and optimize technical complexities when developing a ship's design. Although there are an array of software solutions aimed to optimize the human element in design, practical ergonomics methodologies and technological solutions have struggled to gain widespread application in ship design processes. This paper explores how a new ergonomics technology is perceived by naval architecture students using a mixed-methods framework. Thirteen Naval Architecture and Ocean Engineering Masters students participated in the study. Overall, results found participants perceived the software and its embedded ergonomics tools to benefit their design work, increasing their empathy and ability to understand the work environment and work demands end-users face. However, participants questioned whether ergonomics could be practically and efficiently implemented under real-world project constraints. This revealed underlying social biases and a fundamental lack of understanding in engineering postgraduate students regarding applied ergonomics in naval architecture. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Optomechanical design software for segmented mirrors

    NASA Astrophysics Data System (ADS)

    Marrero, Juan

    2016-08-01

    The software package presented in this paper, still under development, was created to help analyze the influence of the many parameters involved in the design of a large segmented mirror telescope. In summary, it is a set of tools which were added to a common framework as they were needed. Great emphasis has been placed on the graphical presentation, as scientific visualization nowadays cannot be conceived without the use of a helpful 3d environment, showing the analyzed system as close to reality as possible. Use of third-party software packages is limited to ANSYS, which needs to be available in the system only if FEM results are required. Among the various functionalities of the software, the following are worth mentioning here: automatic 3d model construction of a segmented mirror from a set of parameters, geometric ray tracing, automatic 3d model construction of a telescope structure around the defined mirrors from a set of parameters, segmented mirror human access assessment, analysis of integration tolerances, assessment of segment collision, structural deformation under gravity and thermal variation, mirror support system analysis including warping harness mechanisms, etc.

  17. Geant4 simulation for a study of a possible use of carbon ions pencil beam for the treatment of ocular melanomas with the active scanning system at CNAO Centre

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farina, E.; Piersimoni, P.; Riccardi, C.

    The aim of this work is to validate the Geant4 application reproducing the CNAO (National Centre for Oncological Hadrontherapy) beamline and to study a possible use of carbon ion pencil beams for the treatment of ocular melanomas at the CNAO Centre. The promising aspect of carbon ion radiotherapy for the treatment of this disease lies in its superior relative radiobiological effectiveness (RBE). The Monte Carlo Geant4 toolkit is used to simulate the complete CNAO extraction beamline, with the active and passive components along it. A human-eye-modeled detector, including a realistic target tumor volume, is used as the target. Cross checks with previous studies at CNAO using protons allow comparisons of the possible benefits of using such a technique with respect to proton beams. Before the eye-detector irradiation, a validation of the Geant4 simulation against CNAO experimental data is carried out with both carbon ions and protons. Important beam parameters such as the transverse FWHM and the scanned radiation field's uniformity are tested within the simulation and compared with experimental measurements at the CNAO Centre. The physical processes involved in secondary particle generation by carbon ions and protons in the eye-detector are reproduced to take into account the additional dose to the primary beam given to the irradiated eye's tissues. A study of beam shaping is carried out to produce a uniform 3D dose distribution (shaped on the tumor) by the use of a spread-out Bragg peak. The eye-detector is then irradiated through a two-dimensional transverse beam scan at different depths. In the use case the eye-detector is rotated by an angle of 40 deg. in the vertical direction, in order to mis-align the tumor from healthy tissues in front of it. The treatment uniformity on the tumor in the eye-detector is tested. For a more quantitative description of the deposited dose in the eye-detector and for the evaluation of the ratio between the dose deposited in the tumor and the
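    A spread-out Bragg peak of the kind described above is formed as a weighted superposition of pristine Bragg peaks at staggered depths; the toy sketch below illustrates the idea with an entirely artificial peak shape and weights (not CNAO beam data).

```python
# Toy illustration of forming a spread-out Bragg peak (SOBP) as a
# weighted superposition of shifted pristine peaks. Peak shape, depths
# and weights are illustrative assumptions only.
def pristine_peak(depth, peak_depth, width=0.4):
    # crude stand-in for a pristine Bragg peak: a narrow parabola
    return max(0.0, 1.0 - ((depth - peak_depth) / width) ** 2)

def sobp(depth, peak_depths, weights):
    return sum(w * pristine_peak(depth, d) for d, w in zip(peak_depths, weights))

depths = [10.0, 10.4, 10.8]   # staggered pristine-peak positions (cm)
weights = [0.6, 0.8, 1.0]     # deeper peaks weighted more, as in a real SOBP
print(sobp(10.8, depths, weights))  # 1.0
print(sobp(12.0, depths, weights))  # 0.0 (beyond the distal edge)
```

    Weighting deeper peaks more heavily flattens the summed dose over the target while the dose still falls off sharply past the distal edge.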

  18. The Implementation of Satellite Attitude Control System Software Using Object Oriented Design

    NASA Technical Reports Server (NTRS)

    Reid, W. Mark; Hansell, William; Phillips, Tom; Anderson, Mark O.; Drury, Derek

    1998-01-01

    NASA established the Small Explorer (SMEX) program in 1988 to provide frequent opportunities for highly focused and relatively inexpensive space science missions. The SMEX program has produced five satellites, three of which have been successfully launched. The remaining two spacecraft are scheduled for launch within the coming year. NASA has recently developed a prototype for the next generation Small Explorer spacecraft (SMEX-Lite). This paper describes the object-oriented design (OOD) of the SMEX-Lite Attitude Control System (ACS) software. The SMEX-Lite ACS is three-axis controlled and is capable of performing sub-arc-minute pointing. This paper first describes high level requirements governing the SMEX-Lite ACS software architecture. Next, the context in which the software resides is explained. The paper describes the principles of encapsulation, inheritance, and polymorphism with respect to the implementation of an ACS software system. This paper will also discuss the design of several ACS software components. Specifically, object-oriented designs are presented for sensor data processing, attitude determination, attitude control, and failure detection. Finally, this paper will address the establishment of the ACS Foundation Class (AFC) Library. The AFC is a large software repository, requiring a minimal amount of code modifications to produce ACS software for future projects.

  19. Power Analysis Tutorial for Experimental Design Software

    DTIC Science & Technology

    2014-11-01

    IDA Document D-5205, November 2014: Power Analysis Tutorial for Experimental Design Software. The Test and Evaluation (T&E) community is increasing its employment of Design of Experiments (DOE), a rigorous methodology for planning and evaluating…

  20. Software For Computer-Aided Design Of Control Systems

    NASA Technical Reports Server (NTRS)

    Wette, Matthew

    1994-01-01

    Computer Aided Engineering System (CAESY) software developed to provide means to evaluate methods for dealing with users' needs in computer-aided design of control systems. Interpreter program for performing engineering calculations. Incorporates features of both Ada and MATLAB. Designed to be flexible and powerful. Includes internally defined functions, procedures and provides for definition of functions and procedures by user. Written in C language.

  1. The TI-99/4A Software.

    ERIC Educational Resources Information Center

    Wrege, Rachael; And Others

    1982-01-01

    Describes the software modules produced by Texas Instruments for use with the TI-99/4A home computer. Among the modules described are: Personal Real Estate, Programing Aids, Home Financial Decisions, Music Maker, Weight Control and Nutrition, Early Learning Fun, and Tax/Investment Record Keeping. (JL)

  2. Wake Turbulence Mitigation for Departures (WTMD) Prototype System - Software Design Document

    NASA Technical Reports Server (NTRS)

    Sturdy, James L.

    2008-01-01

    This document describes the software design of a prototype Wake Turbulence Mitigation for Departures (WTMD) system that was evaluated in shadow mode operation at the Saint Louis (KSTL) and Houston (KIAH) airports. This document describes the software that provides the system framework, communications, user displays, and hosts the Wind Forecasting Algorithm (WFA) software developed by the M.I.T. Lincoln Laboratory (MIT-LL). The WFA algorithms and software are described in a separate document produced by MIT-LL.

  3. Simulation, optimization and testing of a novel high spatial resolution X-ray imager based on Zinc Oxide nanowires in Anodic Aluminium Oxide membrane using Geant4

    NASA Astrophysics Data System (ADS)

    Esfandi, F.; Saramad, S.

    2015-07-01

    In this work, a new generation of scintillator-based X-ray imagers based on ZnO nanowires in an Anodized Aluminum Oxide (AAO) nanoporous template is characterized. The optical response of ordered ZnO nanowire arrays in a porous AAO template under low-energy X-ray illumination is simulated with the Geant4 Monte Carlo code and compared with experimental results. The results show that for 10 keV X-ray photons, by considering the light-guiding properties of zinc oxide inside the AAO template and a suitable selection of detector thickness and pore diameter, a spatial resolution of less than one micrometer and a detection efficiency of 66% are achievable. This novel nano-scintillator detector can have many advantages for medical applications in the future.

  4. Executive system software design and expert system implementation

    NASA Technical Reports Server (NTRS)

    Allen, Cheryl L.

    1992-01-01

    The topics are presented in viewgraph form and include: software requirements; design layout of the automated assembly system; menu display for automated composite command; expert system features; complete robot arm state diagram and logic; and expert system benefits.

  5. SOFTWARE DESIGN FOR REAL-TIME SYSTEMS.

    DTIC Science & Technology

    Real-time computer systems and real-time computations are defined for the purposes of this report. The design of software for real-time systems is...discussed, employing the concept that all real-time systems belong to one of two types. The types are classified according to the type of control...program used; namely: Pre-assigned Iterative Cycle and Real-time Queueing. The two types of real-time systems are described in general, with supplemental

  6. Software design as a problem in learning theory (a research overview)

    NASA Technical Reports Server (NTRS)

    Fass, Leona F.

    1992-01-01

    Our interest in automating software design has come out of our research in automated reasoning, inductive inference, learnability, and algebraic machine theory. We have investigated these areas extensively, in connection with specific problems of language representation, acquisition, processing, and design. In the case of formal context-free (CF) languages we established the existence of finite learnable models ('behavioral realizations') and procedures for constructing them effectively. We also determined techniques for automatic construction of the models, inductively inferring them from finite examples of how they should 'behave'. These results were obtainable due to appropriate representation of domain knowledge and the constraints on the domain that the representation defined. It was when we sought to generalize our results, and adapt or apply them, that we began investigating the possibility of determining similar procedures for constructing correct software. Discussions with other researchers led us to examine testing and verification processes, as they relate to inference and because of their considerable importance in correct software design; motivating papers by those researchers led us to examine these processes in some depth. Here we present our approach to the software design issues raised by other researchers, within our own theoretical context. We describe our results relative to those of the other researchers, and conclude that they do not compare unfavorably.

  7. ATM Technology Demonstration-1 Phase II Boeing Configurable Graphical Display (CGD) Software Design Description

    NASA Technical Reports Server (NTRS)

    Wilber, George F.

    2017-01-01

    This Software Design Description (SDD) captures the design for developing the Flight Interval Management (FIM) system Configurable Graphics Display (CGD) software. Specifically, this SDD describes aspects of the Boeing CGD software and the surrounding context and interfaces. It does not describe the Honeywell components of the CGD system. The SDD provides the system overview, architectural design, and detailed design with all the necessary information to implement the Boeing components of the CGD software and integrate them into the CGD subsystem within the larger FIM system. Overall system and CGD system-level requirements are derived from the CGD SRS (in turn derived from the Boeing System Requirements Design Document (SRDD)). Display and look-and-feel requirements are derived from Human Machine Interface (HMI) design documents and working group recommendations. This Boeing CGD SDD is required to support the upcoming Critical Design Review (CDR).

  8. Software Engineering Design Principles Applied to Instructional Design: What Can We Learn from Our Sister Discipline?

    ERIC Educational Resources Information Center

    Adnan, Nor Hafizah; Ritzhaupt, Albert D.

    2018-01-01

    The failure of many instructional design initiatives is often attributed to poor instructional design. Current instructional design models do not provide much insight into design processes for creating e-learning instructional solutions. Given the similarities between the fields of instructional design and software engineering, instructional…

  9. Robust Software Architecture for Robots

    NASA Technical Reports Server (NTRS)

    Aghazanian, Hrand; Baumgartner, Eric; Garrett, Michael

    2009-01-01

    Robust Real-Time Reconfigurable Robotics Software Architecture (R4SA) is the name of both a software architecture and the software that embodies that architecture. The architecture was conceived in the spirit of current practice in designing modular, hard real-time aerospace systems. The architecture facilitates the integration of new sensory, motor, and control software modules into the software of a given robotic system. R4SA was developed for initial application aboard exploratory mobile robots on Mars, but is adaptable to terrestrial robotic systems, real-time embedded computing systems in general, and robotic toys.

  10. Software Prototyping

    PubMed Central

    Del Fiol, Guilherme; Hanseler, Haley; Crouch, Barbara Insley; Cummins, Mollie R.

    2016-01-01

    Summary Background Health information exchange (HIE) between Poison Control Centers (PCCs) and Emergency Departments (EDs) could improve care of poisoned patients. However, PCC information systems are not designed to facilitate HIE with EDs; therefore, we are developing specialized software to support HIE within the normal workflow of the PCC using user-centered design and rapid prototyping. Objective To describe the design of an HIE dashboard and the refinement of user requirements through rapid prototyping. Methods Using previously elicited user requirements, we designed low-fidelity sketches of designs on paper with iterative refinement. Next, we designed an interactive high-fidelity prototype and conducted scenario-based usability tests with end users. Users were asked to think aloud while accomplishing tasks related to a case vignette. After testing, the users provided feedback and evaluated the prototype using the System Usability Scale (SUS). Results Survey results from three users provided useful feedback that was then incorporated into the design. After achieving a stable design, we used the prototype itself as the specification for development of the actual software. Benefits of prototyping included having 1) subject-matter experts heavily involved with the design; 2) flexibility to make rapid changes; 3) the ability to minimize software development efforts early in the design stage; 4) rapid finalization of requirements; 5) early visualization of designs; and 6) a powerful vehicle for communication of the design to the programmers. Challenges included 1) time and effort to develop the prototypes and case scenarios; 2) no simulation of system performance; 3) not having all proposed functionality available in the final product; and 4) missing needed data elements in the PCC information system. PMID:27081404

  11. Analysis software can put surgical precision into medical device design.

    PubMed

    Jain, S

    2005-11-01

    Use of finite element analysis software can give design engineers greater freedom to experiment with new designs and materials and allow companies to get products through clinical trials and onto the market faster. This article suggests how.

  12. Ada Software Design Methods Formulation.

    DTIC Science & Technology

    1982-10-01

    [Abstract not available: the record excerpt is a garbled extraction of a personnel-classification cluster table, listing job codes and titles (e.g., Principal Scientific Programmer, Senior Scientific Programmer, Senior Software Engineer, Junior Programmer) together with cluster correlation values.]

  13. A Student Experiment Method for Learning the Basics of Embedded Software Technologies Including Hardware/Software Co-design

    NASA Astrophysics Data System (ADS)

    Kambe, Hidetoshi; Mitsui, Hiroyasu; Endo, Satoshi; Koizumi, Hisao

    The applications of embedded system technologies have spread widely in various products, such as home appliances, cellular phones, automobiles, and industrial machines. Due to intensified competition, embedded software has expanded its role in realizing sophisticated functions, and new development methods like hardware/software (HW/SW) co-design, which unites HW and SW development, have been researched. In Japan, the shortfall of embedded SW engineers was estimated at approximately 99,000 in 2006. Embedded SW engineers should understand HW technologies and system architecture design as well as SW technologies. However, few universities offer this kind of education systematically. We propose a student experiment method for learning the basics of embedded system development, which includes a set of experiments for developing embedded SW, developing embedded HW, and experiencing HW/SW co-design. The co-design experiment helps students learn the basics of embedded system architecture design and the flow of designing actual HW and SW modules. We developed these experiments and evaluated them.

  14. FLOWER IPv4/IPv6 Network Flow Summarization software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nickless, Bill; Curtis, Darren; Christy, Jason

    FLOWER was written as a refactoring/reimplementation of the existing Flo software used by the Cooperative Protection Program (CPP) to provide network flow summaries for analysis by the Operational Analysis Center (OAC) and other US Department of Energy cyber security elements. FLOWER is designed and tested to operate at 10 gigabits/second, nearly 10 times faster than competing solutions. FLOWER output is optimized for importation into SQL databases for categorization and analysis. FLOWER is written in C++ using current best software engineering practices.

  15. Investigation into the development of computer aided design software for space based sensors

    NASA Technical Reports Server (NTRS)

    Pender, C. W.; Clark, W. L.

    1987-01-01

    The described effort is phase one of the development of Computer Aided Design (CAD) software to be used to perform radiometric sensor design. The software package will be referred to as SCAD and is directed toward the preliminary phase of the design of space-based sensor systems. The approach being followed is to develop a modern, graphics-intensive, user-friendly software package using existing software as building blocks. The emphasis will be directed toward the development of a shell containing menus, smart defaults, and interfaces, which can accommodate a wide variety of existing application software packages. The shell will offer expected utilities such as graphics, tailored menus, and a variety of drivers for I/O devices. Following the development of the shell, the development of SCAD is planned chiefly as the selection and integration of appropriate building blocks. The phase one development activities have included: the selection of hardware which will be used with SCAD; the determination of the scope of SCAD; the preliminary evaluation of a number of software packages for applicability to SCAD; the determination of a method for achieving required capabilities where voids exist; and the establishment of a strategy for binding the software modules into an easy-to-use tool kit.

  16. Methodology for object-oriented real-time systems analysis and design: Software engineering

    NASA Technical Reports Server (NTRS)

    Schoeffler, James D.

    1991-01-01

    Successful application of software engineering methodologies requires an integrated analysis and design life-cycle in which the various phases flow smoothly ('seamlessly') from analysis through design to implementation. Furthermore, different analysis methodologies often lead to different structurings of the system, so that the transition from analysis to design may be awkward depending on the design methodology to be used. This is especially important when object-oriented programming is to be used for implementation but the original specification, and perhaps the high-level design, is non-object-oriented. Two approaches to real-time systems analysis which can lead to an object-oriented design are contrasted: (1) modeling the system using structured analysis with real-time extensions, which emphasizes data and control flows, followed by the abstraction of objects whose operations or methods correspond to processes in the data flow diagrams, and then designing in terms of these objects; and (2) modeling the system from the beginning as a set of naturally occurring concurrent entities (objects), each having its own time-behavior defined by a set of states and state-transition rules, and seamlessly transforming the analysis models into high-level design models. A new concept of a 'real-time systems-analysis object' is introduced and becomes the basic building block of a series of seamlessly connected models which progress from the object-oriented real-time systems analysis logical models through the physical architectural models and the high-level design stages. The methodology is appropriate to the overall specification, including hardware and software modules. In software modules, the systems analysis objects are transformed into software objects.

  17. Experience with case tools in the design of process-oriented software

    NASA Astrophysics Data System (ADS)

    Novakov, Ognian; Sicard, Claude-Henri

    1994-12-01

    In accelerator systems such as the CERN PS complex, process equipment has a lifetime which may exceed the typical life cycle of its related software. Taking into account the variety of such equipment, it is important to keep the analysis and design of the software in a system-independent form. This paper discusses the experience gathered in using commercial CASE tools for analysis, design, and reverse engineering of different process-oriented software modules, with a principal emphasis on maintaining the initial analysis in a standardized form. Such tools have been in existence for several years, but this paper shows that they are not fully adapted to our needs. In particular, the paper stresses the problems of integrating such a tool into an existing database-dependent development chain, the lack of real-time simulation tools, and the lack of object-oriented concepts in existing commercial packages. Finally, the paper gives a broader view of software engineering needs in our particular context.

  18. An overview of software design languages. [for Galileo spacecraft Command and Data Subsystems

    NASA Technical Reports Server (NTRS)

    Callender, E. D.

    1980-01-01

    The nature and use of design languages and associated processors that are used in software development are reviewed with reference to development work on the Galileo spacecraft project, a Jupiter orbiter scheduled for launch in 1984. The major design steps are identified (functional design, architectural design, detailed design, coding, and testing), and the purpose, functions, and range of applications of design languages are examined. Then the general character of any design language is analyzed in terms of syntax and semantics. Finally, the differences and similarities between design languages are illustrated by examining two specific design languages: Software Design and Documentation Language and Problem Statement Language/Problem Statement Analyzer.

  19. Design and implementation of the mobility assessment tool: software description.

    PubMed

    Barnard, Ryan T; Marsh, Anthony P; Rejeski, Walter Jack; Pecorella, Anthony; Ip, Edward H

    2013-07-23

    In previous work, we described the development of an 81-item video-animated tool for assessing mobility. In response to criticism levied during a pilot study of this tool, we sought to develop a new version built upon a flexible framework for designing and administering the instrument. Rather than constructing a self-contained software application with a hard-coded instrument, we designed an XML schema capable of describing a variety of psychometric instruments. The new version of our video-animated assessment tool was then defined fully within the context of a compliant XML document. Two software applications--one built in Java, the other in Objective-C for the Apple iPad--were then built that could present the instrument described in the XML document and collect participants' responses. Separating the instrument's definition from the software application implementing it allowed for rapid iteration and easy, reliable definition of variations. Defining instruments in a software-independent XML document simplifies the process of defining instruments and variations and allows a single instrument to be deployed on as many platforms as there are software applications capable of interpreting the instrument, thereby broadening the potential target audience for the instrument. Continued work will be done to further specify and refine this type of instrument specification with a focus on spurring adoption by researchers in gerontology and geriatric medicine.
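    The central design choice in the record above is separating the instrument definition (an XML document) from the applications that render it. A minimal sketch of that idea is below; the element and attribute names are invented for illustration and are not the schema described in the paper.

    ```python
    import xml.etree.ElementTree as ET

    # Hypothetical instrument document (schema invented for this sketch).
    INSTRUMENT_XML = """
    <instrument name="mobility-assessment" version="2">
      <item id="1" type="video-animated">
        <prompt>How difficult is it to walk up one flight of stairs?</prompt>
        <response value="1">Not difficult</response>
        <response value="2">Somewhat difficult</response>
        <response value="3">Very difficult</response>
      </item>
      <item id="2" type="video-animated">
        <prompt>How difficult is it to rise from a low chair?</prompt>
        <response value="1">Not difficult</response>
        <response value="2">Very difficult</response>
      </item>
    </instrument>
    """

    def load_items(xml_text):
        """Parse an instrument definition into plain dicts that any front end
        (Java, Objective-C, web, ...) could render."""
        root = ET.fromstring(xml_text)
        items = []
        for item in root.findall("item"):
            items.append({
                "id": item.get("id"),
                "prompt": item.findtext("prompt"),
                "responses": [(r.get("value"), r.text) for r in item.findall("response")],
            })
        return items

    items = load_items(INSTRUMENT_XML)
    print(len(items))  # 2
    ```

    Because the parser emits neutral data structures, adding a new platform only requires a new renderer, not a new instrument definition, which is the portability benefit the paper reports.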

  20. Design and implementation of the mobility assessment tool: software description

    PubMed Central

    2013-01-01

    Background In previous work, we described the development of an 81-item video-animated tool for assessing mobility. In response to criticism levied during a pilot study of this tool, we sought to develop a new version built upon a flexible framework for designing and administering the instrument. Results Rather than constructing a self-contained software application with a hard-coded instrument, we designed an XML schema capable of describing a variety of psychometric instruments. The new version of our video-animated assessment tool was then defined fully within the context of a compliant XML document. Two software applications—one built in Java, the other in Objective-C for the Apple iPad—were then built that could present the instrument described in the XML document and collect participants’ responses. Separating the instrument’s definition from the software application implementing it allowed for rapid iteration and easy, reliable definition of variations. Conclusions Defining instruments in a software-independent XML document simplifies the process of defining instruments and variations and allows a single instrument to be deployed on as many platforms as there are software applications capable of interpreting the instrument, thereby broadening the potential target audience for the instrument. Continued work will be done to further specify and refine this type of instrument specification with a focus on spurring adoption by researchers in gerontology and geriatric medicine. PMID:23879716

  1. Designing Tracking Software for Image-Guided Surgery Applications: IGSTK Experience

    PubMed Central

    Enquobahrie, Andinet; Gobbi, David; Turek, Matt; Cheng, Patrick; Yaniv, Ziv; Lindseth, Frank; Cleary, Kevin

    2009-01-01

    Objective Many image-guided surgery applications require tracking devices as part of their core functionality. The Image-Guided Surgery Toolkit (IGSTK) was designed and developed to interface tracking devices with software applications incorporating medical images. Methods IGSTK was designed as an open source C++ library that provides the basic components needed for fast prototyping and development of image-guided surgery applications. This library follows a component-based architecture with several components designed for specific sets of image-guided surgery functions. At the core of the toolkit is the tracker component that handles communication between a control computer and navigation device to gather pose measurements of surgical instruments present in the surgical scene. The representations of the tracked instruments are superimposed on anatomical images to provide visual feedback to the clinician during surgical procedures. Results The initial version of the IGSTK toolkit has been released in the public domain and several trackers are supported. The toolkit and related information are available at www.igstk.org. Conclusion With the increased popularity of minimally invasive procedures in health care, several tracking devices have been developed for medical applications. Designing and implementing high-quality and safe software to handle these different types of trackers in a common framework is a challenging task. It requires establishing key software design principles that emphasize abstraction, extensibility, reusability, fault-tolerance, and portability. IGSTK is an open source library that satisfies these needs for the image-guided surgery community. PMID:20037671

  2. Use of the GEANT4 Monte Carlo to determine three-dimensional dose factors for radionuclide dosimetry

    NASA Astrophysics Data System (ADS)

    Amato, Ernesto; Italiano, Antonio; Minutoli, Fabio; Baldari, Sergio

    2013-04-01

    Voxel-level dosimetry is the simplest and most common approach to the internal dosimetry of nonuniform distributions of activity within the human body. The aim of this work was to obtain the dose "S" factors (mGy/(MBq·s)) at the voxel level for eight beta and beta-gamma emitting radionuclides commonly used in nuclear medicine diagnostic and therapeutic procedures. We developed a Monte Carlo simulation in GEANT4 of a region of soft tissue as defined by the ICRP, divided into 11×11×11 cubic voxels, 3 mm on a side. The simulation used the parameterizations of the electromagnetic interaction optimized for low energy (EEDL, EPDL). The decay of each radionuclide (32P, 90Y, 99mTc, 177Lu, 131I, 153Sm, 186Re, 188Re) was simulated as homogeneously distributed within the central voxel (0,0,0), and the energy deposited in the surrounding voxels was averaged over the 8 octants of three-dimensional space, for reasons of symmetry. The results obtained were compared with those available in the literature. While the iodine deviations remain within 16%, for phosphorus, a pure beta emitter, the agreement is very good for the self-dose (0,0,0) and good for the dose to first neighbors, while differences ranging from -60% to +100% are observed for voxels far distant from the source. The existence of significant differences in the calculated voxel S factors, especially for pure beta emitters such as 32P or 90Y, has already been highlighted by other authors. These data can usefully extend the voxel-based dosimetric approach to other radionuclides not covered in the available literature.
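    A voxel S factor is just the energy deposited in a target voxel per decay in the source voxel, divided by the target voxel mass, with unit conversions. The sketch below shows only that bookkeeping; the energy-per-decay value passed in is an illustrative assumption, not a result from the record.

    ```python
    MEV_TO_J = 1.602176634e-13  # conversion: MeV -> joule

    def voxel_s_factor(e_dep_mev_per_decay, voxel_side_mm, density_g_cm3=1.0):
        """S factor in mGy/(MBq*s): absorbed dose in a target voxel per unit
        time-integrated activity in the source voxel."""
        volume_cm3 = (voxel_side_mm / 10.0) ** 3
        mass_kg = density_g_cm3 * volume_cm3 * 1e-3     # g -> kg
        gy_per_decay = e_dep_mev_per_decay * MEV_TO_J / mass_kg
        return gy_per_decay * 1e3 * 1e6                  # Gy -> mGy, decays -> MBq*s

    # Assumed deposition of 0.20 MeV per decay in a 3 mm soft-tissue voxel
    # (illustrative input, not a value from the paper):
    print(voxel_s_factor(0.20, 3.0))
    ```

    In a real calculation the energy per decay would come from the Monte Carlo tally for each (source, target) voxel pair, and symmetry averaging over octants reduces the statistical noise, as described above.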

  3. Wearable Inset-Fed FR4 Microstrip Patch Antenna Design

    NASA Astrophysics Data System (ADS)

    Zaini, S. R. Mohd; Rani, K. N. Abdul

    2018-03-01

    This project proposes the design of a wireless body area network (WBAN) microstrip patch antenna, covered by a jeans fabric outer layer, operating at a center frequency fc of 2.40 GHz. Specifically, a microstrip patch antenna with an inset-fed edge is designed and simulated systematically using the Keysight Advanced Design System (ADS) software, with an FR4 board of dielectric constant εr = 4.70, dissipation factor (loss tangent) tan δ = 0.02, and height h = 1.60 mm as the chosen dielectric substrate. The wearable microstrip patch antenna design is then fabricated on FR4 printed circuit board (PCB) material, hidden inside the jeans fabric, and attached to clothing such as a jacket. Simulation and fabrication measurement results show that the designed microstrip patch antenna can be applied within the industrial, scientific, and medical (ISM) radio band at fc = 2.40 GHz.
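    For the substrate parameters quoted in the record (fc = 2.40 GHz, εr = 4.70, h = 1.60 mm), a first-cut patch size can be estimated with the standard transmission-line model; the dimensions below are textbook estimates, not the paper's fabricated dimensions.

    ```python
    import math

    C = 299_792_458.0  # speed of light, m/s

    def patch_dimensions(fc_hz, er, h_m):
        """Transmission-line-model estimate of a rectangular microstrip patch."""
        w = C / (2 * fc_hz) * math.sqrt(2 / (er + 1))             # patch width
        eps_eff = (er + 1) / 2 + (er - 1) / 2 * (1 + 12 * h_m / w) ** -0.5
        dl = (0.412 * h_m * (eps_eff + 0.3) * (w / h_m + 0.264)
              / ((eps_eff - 0.258) * (w / h_m + 0.8)))            # fringing extension
        l = C / (2 * fc_hz * math.sqrt(eps_eff)) - 2 * dl         # patch length
        return w, l

    w, l = patch_dimensions(2.40e9, 4.7, 1.6e-3)
    print(f"W = {w*1e3:.1f} mm, L = {l*1e3:.1f} mm")
    ```

    These closed-form values are typically used as the starting point that a full-wave simulator like ADS then refines, together with the inset-feed position for impedance matching.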

  4. Inertial Upper Stage (IUS) software analysis

    NASA Technical Reports Server (NTRS)

    Grayson, W. L.; Nickel, C. E.; Rose, P. L.; Singh, R. P.

    1979-01-01

    The Inertial Upper Stage (IUS) System, an extension of the Space Transportation System (STS) operating regime to include higher orbits, orbital plane changes, geosynchronous orbits, and interplanetary trajectories is presented. The IUS software design, the IUS software interfaces with other systems, and the cost effectiveness in software verification are described. Tasks of the IUS discussed include: (1) design analysis; (2) validation requirements analysis; (3) interface analysis; and (4) requirements analysis.

  5. Tank Monitoring and Document control System (TMACS) As Built Software Design Document

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    GLASSCOCK, J.A.

    This document describes the software design for the Tank Monitor and Control System (TMACS). This document captures the existing as-built design of TMACS as of November 1999. It will be used as a reference document to the system maintainers who will be maintaining and modifying the TMACS functions as necessary. The heart of the TMACS system is the ''point-processing'' functionality where a sample value is received from the field sensors and the value is analyzed, logged, or alarmed as required. This Software Design Document focuses on the point-processing functions.
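    The "point-processing" idea described above (receive a sampled value, then analyze, log, or alarm it) can be sketched in a few lines; the function and limit names here are invented for illustration and are not TMACS identifiers.

    ```python
    def process_point(value, low_alarm, high_alarm):
        """Classify one sampled sensor value the way a point processor might:
        raise an alarm when outside limits, otherwise just log it."""
        if value < low_alarm or value > high_alarm:
            return "ALARM"
        return "LOG"

    readings = [12.0, 98.7, 55.3]
    for v in readings:
        print(v, process_point(v, low_alarm=20.0, high_alarm=90.0))
    ```

    A production point processor like the one the document describes would add per-point configuration, deadbands, and persistence of the analyzed values, but the analyze/log/alarm decision is the core of each cycle.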

  6. Spectrum analysis on quality requirements consideration in software design documents.

    PubMed

    Kaiya, Haruhiko; Umemura, Masahiro; Ogata, Shinpei; Kaijiri, Kenji

    2013-12-01

    Software quality requirements defined in the requirements analysis stage should be implemented in the final products, such as source codes and system deployment. To guarantee this meta-requirement, quality requirements should be considered in the intermediate stages, such as the design stage or the architectural definition stage. We propose a novel method for checking whether quality requirements are considered in the design stage. In this method, a technique called "spectrum analysis for quality requirements" is applied not only to requirements specifications but also to design documents. The technique enables us to derive the spectrum of a document, and quality requirements considerations in the document are numerically represented in the spectrum. We can thus objectively identify whether the considerations of quality requirements in a requirements document are adapted to its design document. To validate the method, we applied it to commercial software systems with the help of a supporting tool, and we confirmed that the method worked well.
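    The record above represents quality-requirement consideration numerically as a "spectrum" of a document and compares the requirements spectrum with the design spectrum. A minimal sketch of one way such a comparison could work, assuming keyword-frequency spectra and cosine similarity (the keyword sets and the similarity measure are assumptions, not the paper's technique):

    ```python
    import math
    import re

    # Illustrative keyword lists per quality characteristic (assumed).
    KEYWORDS = {
        "performance": {"fast", "latency", "throughput", "response"},
        "security":    {"encrypt", "authenticate", "password", "access"},
        "usability":   {"usable", "intuitive", "learn", "interface"},
    }

    def spectrum(text):
        """Normalized count of quality-related keywords per characteristic."""
        words = re.findall(r"[a-z]+", text.lower())
        counts = {q: sum(w in kws for w in words) for q, kws in KEYWORDS.items()}
        total = sum(counts.values()) or 1
        return {q: c / total for q, c in counts.items()}

    def cosine(a, b):
        """Cosine similarity between two spectra over the same characteristics."""
        dot = sum(a[q] * b[q] for q in KEYWORDS)
        na = math.sqrt(sum(v * v for v in a.values()))
        nb = math.sqrt(sum(v * v for v in b.values()))
        return dot / (na * nb) if na and nb else 0.0

    req = "The system shall encrypt data and authenticate users with low latency."
    design = "The login screen asks for a password before granting access."
    print(cosine(spectrum(req), spectrum(design)))
    ```

    A low similarity between the two spectra would flag a quality requirement (here, performance) that the design document does not yet consider, which is the kind of objective check the paper proposes.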

  7. Advanced Extravehicular Mobility Unit Informatics Software Design

    NASA Technical Reports Server (NTRS)

    Wright, Theodore

    2014-01-01

    This is a description of the software design for the 2013 edition of the Advanced Extravehicular Mobility Unit (AEMU) Informatics computer assembly. The Informatics system is an optional part of the space suit assembly. It adds a graphical interface for displaying suit status, timelines, procedures, and caution and warning information. In the future it will display maps with GPS position data, and video and still images captured by the astronaut.

  8. Software Package Completed for Alloy Design at the Atomic Level

    NASA Technical Reports Server (NTRS)

    Bozzolo, Guillermo H.; Noebe, Ronald D.; Abel, Phillip B.; Good, Brian S.

    2001-01-01

    As a result of a multidisciplinary effort involving solid-state physics, quantum mechanics, and materials and surface science, the first version of a software package dedicated to the atomistic analysis of multicomponent systems was recently completed. Based on the BFS (Bozzolo, Ferrante, and Smith) method for the calculation of alloy and surface energetics, this package includes modules devoted to the analysis of many essential features that characterize any given alloy or surface system, including (1) surface structure analysis, (2) surface segregation, (3) surface alloying, (4) bulk crystalline material properties and atomic defect structures, and (5) thermal processes that allow us to perform phase diagram calculations. All the modules of this Alloy Design Workbench 1.0 (ADW 1.0) are designed to run in PC and workstation environments, and their operation and performance are substantially linked to the needs of the user and the specific application.

  9. Software platform for simulation of a prototype proton CT scanner.

    PubMed

    Giacometti, Valentina; Bashkirov, Vladimir A; Piersimoni, Pierluigi; Guatelli, Susanna; Plautz, Tia E; Sadrozinski, Hartmut F-W; Johnson, Robert P; Zatserklyaniy, Andriy; Tessonnier, Thomas; Parodi, Katia; Rosenfeld, Anatoly B; Schulte, Reinhard W

    2017-03-01

    Proton computed tomography (pCT) is a promising imaging technique to substitute for, or at least complement, x-ray CT for more accurate proton therapy treatment planning, as it allows calculating proton relative stopping power directly from proton energy loss measurements. A proton CT scanner with a silicon-based particle tracking system and a five-stage scintillating energy detector has been completed. In parallel, a modular software platform was developed to characterize the performance of the proposed pCT scanner. The modular pCT software platform consists of (1) a Geant4-based simulation modeling the Loma Linda proton therapy beam line and the prototype proton CT scanner, (2) water equivalent path length (WEPL) calibration of the scintillating energy detector, and (3) an image reconstruction algorithm for the reconstruction of the relative stopping power (RSP) of the scanned object. In this work, each component of the modular pCT software platform is described and validated with respect to experimental data and benchmarked against theoretical predictions. In particular, the RSP reconstruction was validated with experimental scans, water column measurements, and theoretical calculations. The results show that the pCT software platform accurately reproduces the performance of the existing prototype pCT scanner, with agreement between experimental and simulated RSP values to better than 1.5%. The validated platform is a versatile tool for clinical proton CT performance and application studies in a virtual setting. The platform is flexible and can be modified to simulate not-yet-existing versions of pCT scanners and higher proton energies than those currently clinically available. © 2017 American Association of Physicists in Medicine.
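    The RSP that pCT reconstructs is the ratio of a material's linear stopping power to that of water. As a rough illustration only (not the platform's reconstruction code), a simplified Bethe formula gives the flavor of the calculation; the material densities, Z/A ratios, and mean excitation energies below are assumed illustrative values.

    ```python
    import math

    ME_C2 = 0.511        # electron rest energy, MeV
    MP_C2 = 938.272      # proton rest energy, MeV

    def mass_stopping_power(beta2, z_over_a, i_ev):
        """Bethe mass stopping power for protons, up to a constant prefactor
        that is common to all materials and cancels in the RSP ratio."""
        i_mev = i_ev * 1e-6
        gamma2 = 1.0 / (1.0 - beta2)
        return (z_over_a / beta2) * (
            math.log(2.0 * ME_C2 * beta2 * gamma2 / i_mev) - beta2)

    def rsp(density, z_over_a, i_ev, kinetic_mev=150.0):
        """Linear stopping power relative to water at a given proton energy."""
        gamma = 1.0 + kinetic_mev / MP_C2
        beta2 = 1.0 - 1.0 / gamma ** 2
        water = mass_stopping_power(beta2, 0.5551, 75.0)  # Z/A and I of water
        return density * mass_stopping_power(beta2, z_over_a, i_ev) / water

    print(round(rsp(1.0, 0.5551, 75.0), 3))    # water vs. water -> 1.0
    print(round(rsp(1.85, 0.5148, 106.4), 2))  # bone-like material (values assumed)
    ```

    RSP varies only weakly with proton energy, which is what makes a single reconstructed RSP map useful for treatment planning.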

  10. [Development of a software for 3D virtual phantom design].

    PubMed

    Zou, Lian; Xie, Zhao; Wu, Qi

    2014-02-01

    In this paper, we present 3D virtual phantom design software developed using object-oriented programming methodology and dedicated to medical physics research. This software, named Magical Phantom (MPhantom), is composed of a 3D visual builder module and a virtual CT scanner. Users can conveniently construct any complex 3D phantom and then export the phantom as DICOM 3.0 CT images. MPhantom is a user-friendly and powerful software package for 3D phantom configuration, and has passed a real-scene application test. MPhantom will accelerate Monte Carlo simulation for dose calculation in radiation therapy and X-ray imaging reconstruction algorithm research.

  11. Comparison of GEANT4 Physics Models with Measured Beta Particle Data in Aluminum using a Strontium-90 Source

    NASA Astrophysics Data System (ADS)

    Everett, Samantha

    2010-10-01

    A transmission curve experiment was carried out to measure the range of beta particles in aluminum in the health physics laboratory located on the campus of Texas Southern University. The transmission count rate through aluminum for varying radiation lengths was measured using beta particles emitted from a low activity (˜1 μCi) Sr-90 source. The count rate intensity was recorded using a Geiger Mueller tube (SGC N210/BNC) with an active volume of 61 cm^3 within a systematic detection accuracy of a few percent. We compared these data with a realistic simulation of the experimental setup using the Geant4 Monte Carlo toolkit (version 9.3). The purpose of this study was to benchmark our Monte Carlo for future experiments as part of a more comprehensive research program. Transmission curves were simulated based on the standard and low-energy electromagnetic physics models, and using the radioactive decay module for the electrons primary energy distribution. To ensure the validity of our measurements, linear extrapolation techniques were employed to determine the in-medium beta particle range from the measured data and was found to be 1.87 g/cm^2 (˜0.693 cm), in agreement with literature values. We found that the general shape of the measured data and simulated curves were comparable; however, a discrepancy in the relative count rates was observed. The origin of this disagreement is still under investigation.
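    The record above determines the beta range by linear extrapolation of the transmission curve on a semilog scale. The sketch below illustrates that step on synthetic data; the slope, intercept, and background level are invented for illustration (chosen so the extrapolated range lands near the 1.87 g/cm² quoted above) and are not the measured data.

    ```python
    import numpy as np

    # Synthetic semilog transmission data: log10(net count rate) falls roughly
    # linearly with absorber thickness in the tail of the transmission curve.
    thickness = np.array([0.4, 0.8, 1.2, 1.6])   # aluminum thickness, g/cm^2
    log_rate = 3.0 - 1.6 * thickness             # log10(counts/s), synthetic

    # Fit a straight line to the semilog data.
    slope, intercept = np.polyfit(thickness, log_rate, 1)

    # Extrapolate the fit down to the background level (here log10 = 0,
    # i.e. 1 count/s) to estimate the beta particle range.
    log_background = 0.0
    beta_range = (log_background - intercept) / slope
    print(f"estimated range: {beta_range:.2f} g/cm^2")
    ```

    Dividing the mass range (g/cm²) by the density of aluminum (about 2.70 g/cm³) converts it to a physical depth, which is how a value like 1.87 g/cm² corresponds to roughly 0.69 cm.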

  12. The Design of Software for Three-Phase Induction Motor Test System

    NASA Astrophysics Data System (ADS)

    Haixiang, Xu; Fengqi, Wu; Jiai, Xue

    2017-11-01

    The design and development of control system software is important for three-phase induction motor test equipment and requires thorough familiarity with the test process and the control procedures of the test equipment. In this paper, the software is developed in the VB language according to the national standard (GB/T1032-2005) for three-phase induction motor test methods. The control system, the data analysis software, and the implementation of the motor test system are each described; the system has the advantages of high automation and high accuracy.

  13. Geant4-DNA simulation of DNA damage caused by direct and indirect radiation effects and comparison with biological data.

    NASA Astrophysics Data System (ADS)

    Villagrasa, Carmen; Meylan, Sylvain; Gonon, Geraldine; Gruel, Gaëtan; Giesen, Ulrich; Bueno, Marta; Rabus, Hans

    2017-09-01

    In this work we present results obtained within the framework of the BioQuaRT project. The objective of the study was to correlate the number of radiation-induced double strand breaks (DSB) of the DNA molecule with the probability of detecting nuclear foci after targeted microbeam irradiation of cells with protons and alpha particles of different LET. The former were obtained by simulation with new methods integrated into Geant4-DNA that permit calculating the number of DSB in a DNA target model induced by direct and indirect radiation effects. Particular attention was paid in this work to evaluating the influence of the different criteria applied to the simulated results for predicting the formation of a direct SSB. Indeed, these criteria have an important impact on the predicted number of DSB per particle track and on its dependence on LET. Among the criteria tested in this work, the assumption that a direct radiation interaction leads to a strand break if the cumulative energy deposited in the backbone part of one nucleotide exceeds a threshold of 17.5 eV gives the best agreement with the relative LET dependence of the number of radiation-induced foci. Further calculations and experimental data are nevertheless needed in order to constrain the simulation parameters and to help interpret, in terms of DSB complexity, the biological experimental data observed by immunofluorescence.
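
The strand-break criterion described in the record can be sketched in a few lines. The 17.5 eV backbone-energy threshold is taken from the text; the 10 bp proximity rule used below to pair opposite-strand breaks into a DSB is a commonly used convention assumed here for illustration, not a parameter quoted from the study.

```python
# Sketch of a strand-break scoring criterion: a nucleotide's backbone
# suffers an SSB when its cumulative energy deposit exceeds a threshold
# (17.5 eV per the text); two SSBs on opposite strands within a window
# (10 bp, an assumed convention) count as one DSB. Deposits are invented.

THRESHOLD_EV = 17.5
DSB_WINDOW_BP = 10

def strand_breaks(energy_per_nucleotide, threshold=THRESHOLD_EV):
    """Indices of nucleotides whose cumulative backbone deposit (eV)
    meets or exceeds the threshold."""
    return [i for i, e in enumerate(energy_per_nucleotide) if e >= threshold]

def count_dsb(strand1, strand2, window=DSB_WINDOW_BP):
    """Pair opposite-strand breaks within `window` bp into DSBs."""
    b1 = strand_breaks(strand1)
    b2 = strand_breaks(strand2)
    used = [False] * len(b2)
    dsb = 0
    for i in b1:
        for k, j in enumerate(b2):
            if not used[k] and abs(i - j) <= window:
                used[k] = True
                dsb += 1
                break
    return dsb

# Invented deposits (eV per nucleotide backbone) on the two strands
s1 = [0.0, 20.1, 3.2, 0.0, 18.0]
s2 = [17.6, 0.0, 0.0, 5.0, 0.0]
print(count_dsb(s1, s2))
```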

  14. The design of 1-wire net meteorological observatory for 2.4 m telescope

    NASA Astrophysics Data System (ADS)

    Zhu, Gao-Feng; Wei, Ka-Ning; Fan, Yu-Feng; Xu, Jun; Qin, Wei

    2005-03-01

    The weather is an important factor affecting astronomical observations. The 2.4 m telescope cannot work in robotic mode without weather data input; it is therefore necessary to build a meteorological observatory near the 2.4 m telescope. In this article, the design of the 1-wire net meteorological observatory, including its hardware and software systems, is introduced. The hardware system is made up of several kinds of sensors and an ADC, and a suitable power supply system is also designed. The software system is based on the Windows XP operating system and the MySQL database management system, and a prototype browser/server system was developed in Java and JSP. In tests, the meteorological observatory registered real-time weather data such as rain, snow, and wind speed; the data are stored for future use. The product and the design work well for the 2.4 m telescope.

  15. Software Design for Interactive Graphic Radiation Treatment Simulation Systems*

    PubMed Central

    Kalet, Ira J.; Sweeney, Christine; Jacky, Jonathan

    1990-01-01

    We examine issues in the design of interactive computer graphic simulation programs for radiation treatment planning (RTP), as well as expert system programs that automate parts of the RTP process, in light of ten years of experience at designing, building and using such programs. An experiment in object-oriented design using standard Pascal shows that while some advantage is gained from the design, it is still difficult to achieve modularity and to integrate expert system components. A new design based on the Common LISP Object System (CLOS) is described. This series of designs for RTP software shows that this application benefits in specific ways from object-oriented design methods and appropriate languages and tools.

  16. Open Source and Design Thinking at NASA: A Vision for Future Software

    NASA Technical Reports Server (NTRS)

    Trimble, Jay

    2017-01-01

    NASA Mission Control Software for the Visualization of data has historically been closed, accessible only to small groups of flight controllers, often bound to a specific mission discipline such as flight dynamics, health and status or mission planning. Open Mission Control Technologies (MCT) provides new capability for NASA mission controllers and, by being fully open source, opens up NASA software for the visualization of mission data to broader communities inside and outside of NASA. Open MCT is the product of a design thinking process within NASA, using participatory design and design sprints to build a product that serves users.

  17. Co-design of software and hardware to implement remote sensing algorithms

    NASA Astrophysics Data System (ADS)

    Theiler, James P.; Frigo, Janette R.; Gokhale, Maya; Szymanski, John J.

    2002-01-01

    Both for offline searches through large data archives and for onboard computation at the sensor head, there is a growing need for ever-more rapid processing of remote sensing data. For many algorithms of use in remote sensing, the bulk of the processing takes place in an ``inner loop'' with a large number of simple operations. For these algorithms, dramatic speedups can often be obtained with specialized hardware. The difficulty and expense of digital design continues to limit applicability of this approach, but the development of new design tools is making this approach more feasible, and some notable successes have been reported. On the other hand, it is often the case that processing can also be accelerated by adopting a more sophisticated algorithm design. Unfortunately, a more sophisticated algorithm is much harder to implement in hardware, so these approaches are often at odds with each other. With careful planning, however, it is sometimes possible to combine software and hardware design in such a way that each complements the other, and the final implementation achieves speedup that would not have been possible with a hardware-only or a software-only solution. We will in particular discuss the co-design of software and hardware to achieve substantial speedup of algorithms for multispectral image segmentation and for endmember identification.

  18. Measuring the complexity of design in real-time imaging software

    NASA Astrophysics Data System (ADS)

    Sangwan, Raghvinder S.; Vercellone-Smith, Pamela; Laplante, Phillip A.

    2007-02-01

    Due to the intricacies of the algorithms involved, the design of imaging software is considered to be more complex than that of non-image processing software (Sangwan et al, 2005). A recent investigation (Larsson and Laplante, 2006) examined the complexity of several image processing and non-image processing software packages along a wide variety of metrics, including those postulated by McCabe (1976), Chidamber and Kemerer (1994), and Martin (2003). This work found that it was not always possible to quantitatively compare the complexity between imaging applications and non-image processing systems. Newer research and an accompanying tool (Structure 101, 2006), however, provide a greatly simplified approach to measuring software complexity. Therefore it may be possible to definitively quantify the complexity differences between imaging and non-imaging software, between imaging and real-time imaging software, and between software programs of the same application type. In this paper, we review prior results and describe the methodology for measuring complexity in imaging systems. We then apply a new complexity measurement methodology to several sets of imaging and non-imaging code in order to compare the complexity differences between the two types of applications. The benefit of such quantification is far-reaching, for example, leading to more easily measured performance improvement and quality in real-time imaging code.
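
One of the metrics cited in the record, McCabe's cyclomatic complexity, can be approximated as 1 plus the number of decision points in a routine. The sketch below applies that approximation to Python source via the standard `ast` module; it is an illustrative simplification, not the tooling evaluated in the paper.

```python
# Rough cyclomatic-complexity estimate for Python source: walk the
# syntax tree and count branching constructs, then add 1 for the
# single linear path. Illustrative only; real metric tools refine this.

import ast

DECISION_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler,
                  ast.BoolOp, ast.IfExp)

def cyclomatic_complexity(source):
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, DECISION_NODES)
                   for node in ast.walk(tree))

code = """
def classify(x):
    if x < 0:
        return "neg"
    for i in range(x):
        if i % 2 == 0 and i > 2:
            return "found"
    return "none"
"""
# Two ifs, one for-loop, one boolean operator -> complexity 5
print(cyclomatic_complexity(code))
```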

  19. Application of Experimental and Quasi-Experimental Research Designs to Educational Software Evaluation.

    ERIC Educational Resources Information Center

    Muller, Eugene W.

    1985-01-01

    Develops generalizations for empirical evaluation of software based upon suitability of several research designs--pretest posttest control group, single-group pretest posttest, nonequivalent control group, time series, and regression discontinuity--to type of software being evaluated, and on circumstances under which evaluation is conducted. (MBR)

  20. Ground control station software design for micro aerial vehicles

    NASA Astrophysics Data System (ADS)

    Walendziuk, Wojciech; Oldziej, Daniel; Binczyk, Dawid Przemyslaw; Slowik, Maciej

    2017-08-01

    This article describes the process of designing the hardware and software of a ground control station used for configuring and operating micro unmanned aerial vehicles (UAV). All the work was conducted on a quadrocopter model, a commonly accessible commercial construction. The article characterizes the research object, covers the basics of operating micro aerial vehicles (MAV), and presents the components of the ground control station model. It also describes the communication standards used in building the station model. A further part of the work concerns the software of the product, the GIMSO application (Generally Interactive Station for Mobile Objects), which enables the user to manage the actions, communication, and control processes of the UAV. The process of creating the software and the field tests of the station model are also presented.

  1. IsoDesign: a software for optimizing the design of 13C-metabolic flux analysis experiments.

    PubMed

    Millard, Pierre; Sokol, Serguei; Letisse, Fabien; Portais, Jean-Charles

    2014-01-01

    The growing demand for 13C-metabolic flux analysis (13C-MFA) in the field of metabolic engineering and systems biology is driving the need to rationalize expensive and time-consuming 13C-labeling experiments. Experimental design is a key step in improving both the number of fluxes that can be calculated from a set of isotopic data and the precision of flux values. We present IsoDesign, a software that enables these parameters to be maximized by optimizing the isotopic composition of the label input. It can be applied to 13C-MFA investigations using a broad panel of analytical tools (MS, MS/MS, 1H NMR, 13C NMR, etc.) individually or in combination. It includes a visualization module to intuitively select the optimal label input depending on the biological question to be addressed. Applications of IsoDesign are described, with an example of the entire 13C-MFA workflow from the experimental design to the flux map, including important practical considerations. IsoDesign makes the experimental design of 13C-MFA experiments more accessible to a wider biological community. IsoDesign is distributed under an open source license at http://metasys.insa-toulouse.fr/software/isodes/ © 2013 Wiley Periodicals, Inc.

  2. Supporting metabolomics with adaptable software: design architectures for the end-user.

    PubMed

    Sarpe, Vladimir; Schriemer, David C

    2017-02-01

    Large and disparate sets of LC-MS data are generated by modern metabolomics profiling initiatives, and while useful software tools are available to annotate and quantify compounds, the field requires continued software development in order to sustain methodological innovation. Advances in software development practices allow for a new paradigm in tool development for metabolomics, where increasingly the end-user can develop or redeploy utilities ranging from simple algorithms to complex workflows. Resources that provide an organized framework for development are described and illustrated with LC-MS processing packages that have leveraged their design tools. Full access to these resources depends in part on coding experience, but the emergence of workflow builders and pluggable frameworks strongly reduces the skill level required. Developers in the metabolomics community are encouraged to use these resources and design content for uptake and reuse. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. The design of real time infrared image generation software based on Creator and Vega

    NASA Astrophysics Data System (ADS)

    Wang, Rui-feng; Wu, Wei-dong; Huo, Jun-xiu

    2013-09-01

    Considering an infrared image simulation's requirements for highly realistic, real-time dynamic infrared imagery, a method for designing a real-time infrared image simulation application on the VC++ platform is proposed, based on the visual simulation software Creator and Vega. The functions of Creator are briefly introduced, and the main features of the Vega development environment are analyzed. Methods for infrared modeling of targets and backgrounds are presented; the design flow chart of the IR image real-time generation software and the functions of the TMM Tool, the MAT Tool, and the sensor module are explained; and the real-time performance of the software is addressed.

  4. Advanced Spacesuit Informatics Software Design for Power, Avionics and Software Version 2.0

    NASA Technical Reports Server (NTRS)

    Wright, Theodore W.

    2016-01-01

    A description of the software design for the 2016 edition of the Informatics computer assembly of NASA's Advanced Extravehicular Mobility Unit (AEMU), also called the Advanced Spacesuit. The Informatics system is an optional part of the spacesuit assembly. It adds a graphical interface for displaying suit status, timelines, procedures, and warning information. It also provides an interface to the suit-mounted camera for recording still images, video, and audio field notes.

  5. Software analysis handbook: Software complexity analysis and software reliability estimation and prediction

    NASA Technical Reports Server (NTRS)

    Lee, Alice T.; Gunn, Todd; Pham, Tuan; Ricaldi, Ron

    1994-01-01

    This handbook documents the three software analysis processes the Space Station Software Analysis team uses to assess space station software, including their backgrounds, theories, tools, and analysis procedures. Potential applications of these analysis results are also presented. The first section describes how software complexity analysis provides quantitative information on code, such as code structure and risk areas, throughout the software life cycle. Software complexity analysis allows an analyst to understand the software structure, identify critical software components, assess risk areas within a software system, identify testing deficiencies, and recommend program improvements. Performing this type of analysis during the early design phases of software development can positively affect the process, and may prevent later, much larger, difficulties. The second section describes how software reliability estimation and prediction analysis, or software reliability, provides a quantitative means to measure the probability of failure-free operation of a computer program, and describes the two tools used by JSC to determine failure rates and design tradeoffs between reliability, costs, performance, and schedule.
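
The reliability figure described in the record, the probability of failure-free operation, can be sketched under the simplest common assumption of a constant failure rate, where R(t) = exp(-λt). This is a minimal illustration of that one idea, not the JSC tooling the handbook documents; the failure counts below are invented.

```python
# Constant-failure-rate reliability sketch: estimate lambda from observed
# failures per unit execution time, then compute the probability of
# failure-free operation R(t) = exp(-lambda * t). Data are invented.

import math

def failure_rate(n_failures, total_exec_hours):
    """Maximum-likelihood estimate of a constant failure rate (1/hour)."""
    return n_failures / total_exec_hours

def reliability(lam, hours):
    """Probability of failure-free operation over `hours` of execution."""
    return math.exp(-lam * hours)

lam = failure_rate(4, 1000.0)   # 4 failures in 1000 h -> 0.004 per hour
print(round(reliability(lam, 100.0), 4))
```

A design tradeoff of the kind the handbook mentions then becomes quantitative: halving λ (at some cost in schedule or performance) raises R(100 h) from exp(-0.4) to exp(-0.2).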

  6. Protein evolution analysis of S-hydroxynitrile lyase by complete sequence design utilizing the INTMSAlign software.

    PubMed

    Nakano, Shogo; Asano, Yasuhisa

    2015-02-03

    Development of software and methods for design of complete sequences of functional proteins could contribute to studies of protein engineering and protein evolution. To this end, we developed the INTMSAlign software, and used it to design functional proteins and evaluate their usefulness. The software could assign both consensus and correlation residues of target proteins. We generated three protein sequences with S-selective hydroxynitrile lyase (S-HNL) activity, which we call designed S-HNLs; these proteins folded as efficiently as the native S-HNL. Sequence and biochemical analysis of the designed S-HNLs suggested that accumulation of neutral mutations occurs during the process of S-HNLs evolution from a low-activity form to a high-activity (native) form. Taken together, our results demonstrate that our software and the associated methods could be applied not only to design of complete sequences, but also to predictions of protein evolution, especially within families such as esterases and S-HNLs.

  7. Protein evolution analysis of S-hydroxynitrile lyase by complete sequence design utilizing the INTMSAlign software

    NASA Astrophysics Data System (ADS)

    Nakano, Shogo; Asano, Yasuhisa

    2015-02-01

    Development of software and methods for design of complete sequences of functional proteins could contribute to studies of protein engineering and protein evolution. To this end, we developed the INTMSAlign software, and used it to design functional proteins and evaluate their usefulness. The software could assign both consensus and correlation residues of target proteins. We generated three protein sequences with S-selective hydroxynitrile lyase (S-HNL) activity, which we call designed S-HNLs; these proteins folded as efficiently as the native S-HNL. Sequence and biochemical analysis of the designed S-HNLs suggested that accumulation of neutral mutations occurs during the process of S-HNLs evolution from a low-activity form to a high-activity (native) form. Taken together, our results demonstrate that our software and the associated methods could be applied not only to design of complete sequences, but also to predictions of protein evolution, especially within families such as esterases and S-HNLs.

  8. OPTICON: Pro-Matlab software for large order controlled structure design

    NASA Technical Reports Server (NTRS)

    Peterson, Lee D.

    1989-01-01

    A software package for large order controlled structure design is described and demonstrated. The primary program, called OPTICON, uses both Pro-Matlab M-file routines and selected compiled FORTRAN routines linked into the Pro-Matlab structure. The program accepts structural model information in the form of state-space matrices and performs three basic design functions on the model: (1) open loop analyses; (2) closed loop reduced order controller synthesis; and (3) closed loop stability and performance assessment. The controller synthesis methods currently implemented in this software are based on the Generalized Linear Quadratic Gaussian theory of Bernstein. In particular, a reduced order Optimal Projection synthesis algorithm based on a homotopy solution method was successfully applied to an experimental truss structure using a 58-state dynamic model. These results are presented and discussed. Current plans to expand the practical size of the design model to several hundred states and the intention to interface Pro-Matlab to a supercomputing environment are discussed.

  9. QUICK - An interactive software environment for engineering design

    NASA Technical Reports Server (NTRS)

    Skinner, David L.

    1989-01-01

    QUICK, an interactive software environment for engineering design, provides a programmable FORTRAN-like calculator interface to a wide range of data structures as well as both built-in and user created functions. QUICK also provides direct access to the operating systems of eight different machine architectures. The evolution of QUICK and a brief overview of the current version are presented.

  10. Real-time solar magnetograph operation system software design and user's guide

    NASA Technical Reports Server (NTRS)

    Wang, C.

    1984-01-01

    The Real Time Solar Magnetograph (RTSM) Operation system software design on the PDP11/23+ is presented along with the User's Guide. The RTSM operation software performs real-time instrumentation control, data collection, and data management. The data are used for vector analysis, plotting, or graphics display. The processed data can then easily be compared with solar data from other sources, such as the Solar Maximum Mission (SMM).

  11. Software Design Description for the Tidal Open-boundary Prediction System (TOPS)

    DTIC Science & Technology

    2010-05-04

    Naval Research Laboratory, Stennis Space Center, MS 39529-5004. Report NRL/MR/7320--10-9209. Approved for public release; distribution is unlimited.

  12. Inspection design using 2D phased array, TFM and cueMAP software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McGilp, Ailidh; Dziewierz, Jerzy; Lardner, Tim

    2014-02-18

    A simulation suite, cueMAP, has been developed to facilitate the design of inspection processes and sparse 2D array configurations. At the core of cueMAP is a Total Focusing Method (TFM) imaging algorithm that enables computer-assisted design of ultrasonic inspection scenarios, including the design of bespoke array configurations to match the inspection criteria. This in-house developed TFM code allows for interactive evaluation of image quality indicators of ultrasonic imaging performance when utilizing a 2D phased array working in FMC/TFM mode. The cueMAP software uses a series of TFM images to build a map of the resolution, contrast, and sensitivity of the imaging performance for a simulated reflector swept across the inspection volume. The software takes into account probe properties, wedge or water standoff, and the effects of specimen curvature. In the validation process of this new software package, two 2D arrays were evaluated on 304n stainless steel samples, typical of the primary circuit in nuclear plants. Thick-section samples were inspected using a 1 MHz 2D matrix array. Owing to the processing efficiency of the software, the data collected from these array configurations have been used to investigate the influence of sub-aperture operation on inspection performance.
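
The TFM imaging at the core of cueMAP is a delay-and-sum over Full Matrix Capture (FMC) data: every transmitter/receiver trace is sampled at the round-trip time of flight to a pixel and summed. The sketch below illustrates that principle with an invented 8-element array, wave speed, sampling rate, and a single synthetic point scatterer; it is not the cueMAP implementation.

```python
# Delay-and-sum TFM sketch on synthetic FMC data. Each FMC trace holds a
# unit pulse at the tx->scatterer->rx delay; focusing every trace onto a
# pixel sums coherently only at the true scatterer position. Geometry,
# wave speed and sampling values are invented for illustration.

import math

C = 5900.0          # longitudinal wave speed in steel, m/s (assumed)
FS = 50e6           # sampling frequency, Hz (assumed)
ELEMENTS = [(i * 0.6e-3, 0.0) for i in range(8)]   # 8-element linear array
SCATTERER = (2.0e-3, 10.0e-3)                      # point reflector (x, z), m

def tof(a, b):
    """One-way time of flight between two points at speed C."""
    return math.hypot(a[0] - b[0], a[1] - b[1]) / C

def synth_fmc():
    """One trace per tx/rx pair: a unit pulse at the scatterer delay."""
    n = 2048
    fmc = {}
    for ti, tx in enumerate(ELEMENTS):
        for ri, rx in enumerate(ELEMENTS):
            trace = [0.0] * n
            k = int(round((tof(tx, SCATTERER) + tof(SCATTERER, rx)) * FS))
            if 0 <= k < n:
                trace[k] = 1.0
            fmc[ti, ri] = trace
    return fmc

def tfm_pixel(fmc, x, z):
    """Delay-and-sum focus of every tx/rx trace onto pixel (x, z)."""
    total = 0.0
    for (ti, ri), trace in fmc.items():
        k = int(round((tof(ELEMENTS[ti], (x, z))
                       + tof((x, z), ELEMENTS[ri])) * FS))
        if 0 <= k < len(trace):
            total += trace[k]
    return total

fmc = synth_fmc()
# All 64 tx/rx pairs align at the true scatterer; elsewhere they do not
print(tfm_pixel(fmc, *SCATTERER), tfm_pixel(fmc, 6.0e-3, 4.0e-3))
```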

  13. A requirements specification for a software design support system

    NASA Technical Reports Server (NTRS)

    Noonan, Robert E.

    1988-01-01

    Most existing software design systems (SDSS) support the use of only a single design methodology. A good SDSS should support a wide variety of design methods and languages including structured design, object-oriented design, and finite state machines. It might seem that a multiparadigm SDSS would be expensive in both time and money to construct. However, it is proposed that instead an extensible SDSS that directly implements only minimal database and graphical facilities be constructed. In particular, it should not directly implement tools to facilitate language definition and analysis. It is believed that such a system could be rapidly developed and put into limited production use, with the experience gained used to refine and evolve the system over time.

  14. An evaluation of software tools for the design and development of cockpit displays

    NASA Technical Reports Server (NTRS)

    Ellis, Thomas D., Jr.

    1993-01-01

    The use of all-glass cockpits at the NASA Langley Research Center (LaRC) simulation facility has changed the means of design, development, and maintenance of instrument displays. The human-machine interface has evolved from a physical hardware device to a software-generated electronic display system. This has subsequently caused an increased workload at the facility. As computer processing power increases and the glass cockpit becomes predominant in facilities, software tools used in the design and development of cockpit displays are becoming both feasible and necessary for a more productive simulation environment. This paper defines LaRC requirements of a display software development tool and compares two available applications against these requirements. As a part of the software engineering process, these tools reduce development time, provide a common platform for display development, and produce exceptional real-time results.

  15. Stochastic optimization of GeantV code by use of genetic algorithms

    NASA Astrophysics Data System (ADS)

    Amadio, G.; Apostolakis, J.; Bandieramonte, M.; Behera, S. P.; Brun, R.; Canal, P.; Carminati, F.; Cosmo, G.; Duhem, L.; Elvira, D.; Folger, G.; Gheata, A.; Gheata, M.; Goulas, I.; Hariri, F.; Jun, S. Y.; Konstantinov, D.; Kumawat, H.; Ivantchenko, V.; Lima, G.; Nikitina, T.; Novak, M.; Pokorski, W.; Ribon, A.; Seghal, R.; Shadura, O.; Vallecorsa, S.; Wenzel, S.

    2017-10-01

    GeantV is a complex system based on the interaction of different modules needed for detector simulation, which include transport of particles in fields, physics models simulating their interactions with matter and a geometrical modeler library for describing the detector and locating the particles and computing the path length to the current volume boundary. The GeantV project is recasting the classical simulation approach to get maximum benefit from SIMD/MIMD computational architectures and highly massive parallel systems. This involves finding the appropriate balance between several aspects influencing computational performance (floating-point performance, usage of off-chip memory bandwidth, specification of cache hierarchy, etc.) and handling a large number of program parameters that have to be optimized to achieve the best simulation throughput. This optimization task can be treated as a black-box optimization problem, which requires searching the optimum set of parameters using only point-wise function evaluations. The goal of this study is to provide a mechanism for optimizing complex systems (high energy physics particle transport simulations) with the help of genetic algorithms and evolution strategies as tuning procedures for massive parallel simulations. One of the described approaches is based on introducing a specific multivariate analysis operator that could be used in case of resource expensive or time consuming evaluations of fitness functions, in order to speed-up the convergence of the black-box optimization problem.
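
The black-box tuning idea described in the record can be illustrated with a toy genetic algorithm: candidate parameter sets are scored only through point-wise fitness evaluations, and selection, crossover, and mutation drive the search. The parameter space and fitness function below are invented stand-ins for a costly simulation-throughput measurement, not the GeantV tuning procedure itself.

```python
# Toy genetic algorithm for black-box parameter tuning. The "fitness"
# here is a synthetic stand-in for an expensive throughput measurement;
# the three integer parameters and their optimum are invented.

import random

random.seed(42)

TARGET = (16, 4, 256)   # pretend optimum (e.g. threads, lanes, cache KB)

def fitness(p):
    """Synthetic score: higher is better, 0 at the pretend optimum."""
    return -sum((a - b) ** 2 for a, b in zip(p, TARGET))

def mutate(p):
    """Perturb exactly one gene by a nonzero random step."""
    i = random.randrange(len(p))
    q = list(p)
    q[i] += random.choice((-1, 1)) * random.randint(1, 8)
    return tuple(q)

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def evolve(pop_size=20, generations=60):
    pop = [tuple(random.randint(1, 512) for _ in range(3))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]            # elitist selection
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```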

  16. Model-Driven Development for PDS4 Software and Services

    NASA Astrophysics Data System (ADS)

    Hughes, J. S.; Crichton, D. J.; Algermissen, S. S.; Cayanan, M. D.; Joyner, R. S.; Hardman, S. H.; Padams, J. H.

    2018-04-01

    PDS4 data product labels provide the information necessary for processing the referenced digital object. However, significantly more information is available in the PDS4 Information Model. This additional information is made available for use, by both software and services, to configure, promote resiliency, and improve interoperability.

  17. Methodology for Designing Fault-Protection Software

    NASA Technical Reports Server (NTRS)

    Barltrop, Kevin; Levison, Jeffrey; Kan, Edwin

    2006-01-01

    A document describes a methodology for designing fault-protection (FP) software for autonomous spacecraft. The methodology embodies and extends established engineering practices in the technical discipline of Fault Detection, Diagnosis, Mitigation, and Recovery; and has been successfully implemented in the Deep Impact Spacecraft, a NASA Discovery mission. Based on established concepts of Fault Monitors and Responses, this FP methodology extends the notion of Opinion, Symptom, Alarm (aka Fault), and Response with numerous new notions, sub-notions, software constructs, and logic and timing gates. For example, Monitor generates a RawOpinion, which graduates into Opinion, categorized into no-opinion, acceptable, or unacceptable opinion. RaiseSymptom, ForceSymptom, and ClearSymptom govern the establishment and then mapping to an Alarm (aka Fault). Local Response is distinguished from FP System Response. A 1-to-n and n-to-1 mapping is established among Monitors, Symptoms, and Responses. Responses are categorized by device versus by function. Responses operate in tiers, where the early tiers attempt to resolve the Fault in a localized step-by-step fashion, relegating more system-level response to later tier(s). Recovery actions are gated by epoch recovery timing, enabling strategy, urgency, MaxRetry gate, hardware availability, hazardous versus ordinary fault, and many other priority gates. This methodology is systematic, logical, and uses multiple linked tables, parameter files, and recovery command sequences. The credibility of the FP design is proven via a "top-down" fault-tree analysis and a "bottom-up" functional fault-mode-and-effects analysis. Via this process, the mitigation and recovery strategies per Fault Containment Region scope the FP architecture in width versus depth.

  18. Overview and Software Architecture of the Copernicus Trajectory Design and Optimization System

    NASA Technical Reports Server (NTRS)

    Williams, Jacob; Senent, Juan S.; Ocampo, Cesar; Mathur, Ravi; Davis, Elizabeth C.

    2010-01-01

    The Copernicus Trajectory Design and Optimization System represents an innovative and comprehensive approach to on-orbit mission design, trajectory analysis and optimization. Copernicus integrates state of the art algorithms in optimization, interactive visualization, spacecraft state propagation, and data input-output interfaces, allowing the analyst to design spacecraft missions to all possible Solar System destinations. All of these features are incorporated within a single architecture that can be used interactively via a comprehensive GUI interface, or passively via external interfaces that execute batch processes. This paper describes the Copernicus software architecture together with the challenges associated with its implementation. Additionally, future development and planned new capabilities are discussed. Key words: Copernicus, Spacecraft Trajectory Optimization Software.

  19. Design of the software development and verification system (SWDVS) for shuttle NASA study task 35

    NASA Technical Reports Server (NTRS)

    Drane, L. W.; Mccoy, B. J.; Silver, L. W.

    1973-01-01

    An overview of the Software Development and Verification System (SWDVS) for the space shuttle is presented. The design considerations, goals, assumptions, and major features of the design are examined. A scenario that shows three persons involved in flight software development using the SWDVS in response to a program change request is developed. The SWDVS is described from the standpoint of different groups of people with different responsibilities in the shuttle program to show the functional requirements that influenced the SWDVS design. The software elements of the SWDVS that satisfy the requirements of the different groups are identified.

  20. Framework Programmable Platform for the advanced software development workstation: Framework processor design document

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Ackley, Keith A.; Crump, Wes; Sanders, Les

    1991-01-01

    The design of the Framework Processor (FP) component of the Framework Programmable Software Development Platform (FFP) is described. The FFP is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software development environment. Guided by the model, this Framework Processor will take advantage of an integrated operating environment to provide automated support for the management and control of the software development process so that costly mistakes during the development phase can be eliminated.

  1. Real-time PCR (qPCR) primer design using free online software.

    PubMed

    Thornton, Brenda; Basu, Chhandak

    2011-01-01

    Real-time PCR (quantitative PCR or qPCR) has become the preferred method for validating results obtained from assays which measure gene expression profiles. The process uses reverse transcription polymerase chain reaction (RT-PCR), coupled with fluorescent chemistry, to measure variations in transcriptome levels between samples. The four most commonly used fluorescent chemistries are SYBR® Green dyes and TaqMan®, Molecular Beacon or Scorpion probes. SYBR® Green is very simple to use and cost efficient. As SYBR® Green dye binds to any double-stranded DNA product, its success depends greatly on proper primer design. Many types of online primer design software are available, which can be used free of charge to design desirable SYBR® Green-based qPCR primers. This laboratory exercise is intended for those who have a fundamental background in PCR. It addresses the basic fluorescent chemistries of real-time PCR, the basic rules and pitfalls of primer design, and provides a step-by-step protocol for designing SYBR® Green-based primers with free, online software. Copyright © 2010 Wiley Periodicals, Inc.
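
    As an illustration of the "basic rules and pitfalls of primer design" mentioned above, the sketch below screens a candidate SYBR® Green primer against a few common heuristics: GC content of 40-60%, a Wallace-rule melting temperature, and no G/C run at the 3' end. The thresholds and the Wallace approximation Tm = 2(A+T) + 4(G+C) are generic textbook rules, not taken from the exercise itself.

```python
def gc_content(seq):
    """Fraction of G/C bases in a primer sequence."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

def wallace_tm(seq):
    """Approximate melting temperature via the Wallace rule,
    Tm = 2*(A+T) + 4*(G+C); valid only for short oligos (~14-20 nt)."""
    seq = seq.upper()
    at = seq.count("A") + seq.count("T")
    gc = seq.count("G") + seq.count("C")
    return 2 * at + 4 * gc

def check_primer(seq, gc_range=(0.40, 0.60), tm_range=(50, 65)):
    """Return a list of rule violations for a candidate qPCR primer.
    The ranges are common rules of thumb, not authoritative limits."""
    problems = []
    if not gc_range[0] <= gc_content(seq) <= gc_range[1]:
        problems.append("GC content outside 40-60%")
    if not tm_range[0] <= wallace_tm(seq) <= tm_range[1]:
        problems.append("Tm outside 50-65 C")
    if seq.upper().endswith(("GGG", "CCC")):
        problems.append("G/C run at 3' end")
    return problems

print(check_primer("ATGCGTACGTTAGCCTAG"))  # passes all three checks: []
```

    Real primer-design software applies many more constraints (self-dimers, hairpins, amplicon length), which is why the exercise relies on dedicated online tools rather than hand checks like these.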

  2. Real World Software Engineering

    DTIC Science & Technology

    1994-07-15

    OCR fragment of a DTIC report documentation page. Legible fields: title, Real World Software Engineering; report period, 28 Sep 92-31 May 94; authors, Donald Gotterbarn and Robert Riser; performing organization, East Tennessee State University.

  3. Teacher-Designed Software for Interactive Linear Equations: Concepts, Interpretive Skills, Applications & Word-Problem Solving.

    ERIC Educational Resources Information Center

    Lawrence, Virginia

    No longer just a user of commercial software, the 21st-century teacher is a designer of interactive software based on theories of learning. This software, a comprehensive study of straight-line equations, enhances conceptual understanding, sketching, graphic-interpretive, and word-problem-solving skills, as well as making connections to real-life and…

  4. Molecular Cloning Designer Simulator (MCDS): All-in-one molecular cloning and genetic engineering design, simulation and management software for complex synthetic biology and metabolic engineering projects.

    PubMed

    Shi, Zhenyu; Vickers, Claudia E

    2016-12-01

    Molecular Cloning Designer Simulator (MCDS) is a powerful new all-in-one cloning and genetic engineering design, simulation and management software platform developed for complex synthetic biology and metabolic engineering projects. In addition to standard functions, it has a number of features that are either unique, or are not found in combination in any one software package: (1) it has a novel interactive flow-chart user interface for complex multi-step processes, allowing an integrated overview of the whole project; (2) it can perform a user-defined workflow of cloning steps in a single execution of the software; (3) it can handle multiple types of genetic recombineering, a technique that is rapidly replacing classical cloning for many applications; (4) it includes experimental information to conveniently guide wet lab work; and (5) it can store results and comments to allow the tracking and management of the whole project in one platform. MCDS is freely available from https://mcds.codeplex.com.

  5. Multidisciplinary Modeling Software for Analysis, Design, and Optimization of HRRLS Vehicles

    NASA Technical Reports Server (NTRS)

    Spradley, Lawrence W.; Lohner, Rainald; Hunt, James L.

    2011-01-01

    The concept for Highly Reliable Reusable Launch Systems (HRRLS) under the NASA Hypersonics project is a two-stage-to-orbit, horizontal-take-off / horizontal-landing (HTHL) architecture with an air-breathing first stage. The first-stage vehicle is a slender body with an air-breathing propulsion system that is highly integrated with the airframe. The lightweight slender body will deflect significantly during flight. This global deflection affects the flow over the vehicle and into the engine, and thus the loads and moments on the vehicle. High-fidelity multi-disciplinary analyses that account for these fluid-structure-thermal interactions are required to accurately predict the vehicle loads and resultant response. These predictions of vehicle response to multiphysics loads, calculated with fluid-structure-thermal interaction, are required in order to optimize the vehicle design over its full operating range. This contract with ResearchSouth addresses one of the primary objectives of the Vehicle Technology Integration (VTI) discipline: the development of high-fidelity multi-disciplinary analysis and optimization methods and tools for HRRLS vehicles. The primary goal of this effort is the development of an integrated software system that can be used for full-vehicle optimization.
This goal was accomplished by: 1) integrating the master code, FEMAP, into the multidiscipline software network to direct the coupling and assure accurate fluid-structure-thermal interaction solutions; 2) loosely coupling the Euler flow solver FEFLO to the available and proven aeroelasticity and large-deformation (FEAP) code; 3) providing a coupled Euler-boundary-layer capability for rapid viscous flow simulation; 4) developing and implementing improved Euler/RANS algorithms in the FEFLO CFD code to provide accurate shock capturing, skin friction, and heat-transfer predictions for HRRLS vehicles in hypersonic flow; and 5) performing a Reynolds-averaged Navier-Stokes computation on an HRRLS
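
    The loose fluid-structure coupling in item 2 can be sketched as a partitioned fixed-point iteration: the flow and structural solvers run in alternation, exchanging loads and deflections until the interface state converges. The one-degree-of-freedom "solvers" below are invented surrogates standing in for FEFLO and FEAP; nothing here reflects the actual codes.

```python
def fluid_load(deflection):
    """Toy surrogate for the flow solver: aerodynamic load on the
    interface drops linearly as the body deflects."""
    return 100.0 - 20.0 * deflection

def structural_deflection(load):
    """Toy surrogate for the FE solver: a linear spring."""
    return load / 50.0

def couple(tol=1e-8, max_iters=100):
    """Partitioned (loosely coupled) iteration: alternate the two
    solvers until the interface deflection stops changing."""
    d = 0.0
    for i in range(max_iters):
        d_new = structural_deflection(fluid_load(d))
        if abs(d_new - d) < tol:
            return d_new, i + 1
        d = d_new
    return d, max_iters

d, iters = couple()
print(round(d, 4), iters)  # converges to 100/70, about 1.4286
```

    The iteration converges here because the toy coupling is a contraction; production FSI couplers add under-relaxation or stronger coupling schemes when it is not.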

  6. Hadronic energy resolution of a highly granular scintillator-steel hadron calorimeter using software compensation techniques

    NASA Astrophysics Data System (ADS)

    Adloff, C.; Blaha, J.; Blaising, J.-J.; Drancourt, C.; Espargilière, A.; Gaglione, R.; Geffroy, N.; Karyotakis, Y.; Prast, J.; Vouters, G.; Francis, K.; Repond, J.; Smith, J.; Xia, L.; Baldolemar, E.; Li, J.; Park, S. T.; Sosebee, M.; White, A. P.; Yu, J.; Buanes, T.; Eigen, G.; Mikami, Y.; Watson, N. K.; Goto, T.; Mavromanolakis, G.; Thomson, M. A.; Ward, D. R.; Yan, W.; Benchekroun, D.; Hoummada, A.; Khoulaki, Y.; Benyamna, M.; Cârloganu, C.; Fehr, F.; Gay, P.; Manen, S.; Royer, L.; Blazey, G. C.; Dyshkant, A.; Lima, J. G. R.; Zutshi, V.; Hostachy, J.-Y.; Morin, L.; Cornett, U.; David, D.; Falley, G.; Gadow, K.; Göttlicher, P.; Günter, C.; Hermberg, B.; Karstensen, S.; Krivan, F.; Lucaci-Timoce, A.-I.; Lu, S.; Lutz, B.; Morozov, S.; Morgunov, V.; Reinecke, M.; Sefkow, F.; Smirnov, P.; Terwort, M.; Vargas-Trevino, A.; Feege, N.; Garutti, E.; Marchesini, I.; Ramilli, M.; Eckert, P.; Harion, T.; Kaplan, A.; Schultz-Coulon, H.-Ch; Shen, W.; Stamen, R.; Tadday, A.; Bilki, B.; Norbeck, E.; Onel, Y.; Wilson, G. W.; Kawagoe, K.; Dauncey, P. 
D.; Magnan, A.-M.; Wing, M.; Salvatore, F.; Calvo Alamillo, E.; Fouz, M.-C.; Puerta-Pelayo, J.; Balagura, V.; Bobchenko, B.; Chadeeva, M.; Danilov, M.; Epifantsev, A.; Markin, O.; Mizuk, R.; Novikov, E.; Rusinov, V.; Tarkovsky, E.; Kirikova, N.; Kozlov, V.; Smirnov, P.; Soloviev, Y.; Buzhan, P.; Dolgoshein, B.; Ilyin, A.; Kantserov, V.; Kaplin, V.; Karakash, A.; Popova, E.; Smirnov, S.; Kiesling, C.; Pfau, S.; Seidel, K.; Simon, F.; Soldner, C.; Szalay, M.; Tesar, M.; Weuste, L.; Bonis, J.; Bouquet, B.; Callier, S.; Cornebise, P.; Doublet, Ph; Dulucq, F.; Faucci Giannelli, M.; Fleury, J.; Li, H.; Martin-Chassard, G.; Richard, F.; de la Taille, Ch; Pöschl, R.; Raux, L.; Seguin-Moreau, N.; Wicek, F.; Anduze, M.; Boudry, V.; Brient, J.-C.; Jeans, D.; Mora de Freitas, P.; Musat, G.; Reinhard, M.; Ruan, M.; Videau, H.; Bulanek, B.; Zacek, J.; Cvach, J.; Gallus, P.; Havranek, M.; Janata, M.; Kvasnicka, J.; Lednicky, D.; Marcisovsky, M.; Polak, I.; Popule, J.; Tomasek, L.; Tomasek, M.; Ruzicka, P.; Sicho, P.; Smolik, J.; Vrba, V.; Zalesak, J.; Belhorma, B.; Ghazlane, H.; Takeshita, T.; Uozumi, S.; Sauer, J.; Weber, S.; Zeitnitz, C.

    2012-09-01

    The energy resolution of a highly granular 1 m3 analogue scintillator-steel hadronic calorimeter is studied using charged pions with energies from 10 GeV to 80 GeV at the CERN SPS. The energy resolution for single hadrons is determined to be approximately 58%/√E/GeV. This resolution is improved to approximately 45%/√E/GeV with software compensation techniques. These techniques take advantage of the event-by-event information about the substructure of hadronic showers which is provided by the imaging capabilities of the calorimeter. The energy reconstruction is improved either with corrections based on the local energy density or by applying a single correction factor to the event energy sum derived from a global measure of the shower energy density. The application of the compensation algorithms to Geant4 simulations yields resolution improvements comparable to those observed for real data.
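
    The local compensation scheme described above can be caricatured in a few lines: each hit is weighted by a function of its local energy density, so dense (electromagnetic-like) deposits are weighted down and sparse (hadronic-like) deposits up. The weight function and its coefficients below are invented for illustration; they are not the published CALICE parametrisation.

```python
import math

def weight(density_mev_per_cm3, a=1.3, b=0.12):
    """Hypothetical density-dependent hit weight: decreases with the
    local energy density, floored at 0.5. Coefficients are illustrative."""
    return max(0.5, a - b * math.log1p(density_mev_per_cm3))

def compensated_energy(hits):
    """Sum of hit energies after density-dependent reweighting.
    hits: list of (energy_MeV, local_density_MeV_per_cm3) tuples."""
    return sum(e * weight(rho) for e, rho in hits)

# One dense EM-like hit, one moderate hit, one sparse hadronic-like hit.
hits = [(5.0, 40.0), (2.0, 3.0), (1.0, 0.5)]
total = compensated_energy(hits)
print(total)
```

    In the real analysis the weights are derived from calibration data so that the reweighted response is the same for electromagnetic and hadronic shower components, which is what narrows the resolution.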

  7. Autonomous robot software development using simple software components

    NASA Astrophysics Data System (ADS)

    Burke, Thomas M.; Chung, Chan-Jin

    2004-10-01

    Developing software to control a sophisticated lane-following, obstacle-avoiding, autonomous robot can be demanding and beyond the capabilities of novice programmers - but it doesn't have to be. A creative software design utilizing only basic image processing and a little algebra has been employed to control the LTU-AISSIG autonomous robot - a contestant in the 2004 Intelligent Ground Vehicle Competition (IGVC). This paper presents a software design equivalent to that used during the IGVC, but with much of the complexity removed. The result is an autonomous robot software design that is robust, reliable, and can be implemented by programmers with a limited understanding of image processing. This design provides a solid basis for further work in autonomous robot software, as well as an interesting and achievable robotics project for students.

  8. NanoDesign: Concepts and Software for a Nanotechnology Based on Functionalized Fullerenes

    NASA Technical Reports Server (NTRS)

    Globus, Al; Jaffe, Richard; Chancellor, Marisa K. (Technical Monitor)

    1996-01-01

    Eric Drexler has proposed a hypothetical nanotechnology based on diamond and investigated the properties of such molecular systems. While attractive, diamondoid nanotechnology is not physically accessible with straightforward extensions of current laboratory techniques. We propose a nanotechnology based on functionalized fullerenes and investigate carbon nanotube based gears with teeth added via a benzyne reaction known to occur with C60. The gears are single-walled carbon nanotubes with appended coenzyme groups for teeth. Fullerenes are in widespread laboratory use and can be functionalized in many ways. Companion papers computationally demonstrate the properties of these gears (they appear to work) and the accessibility of the benzyne/nanotube reaction. This paper describes the molecular design techniques and rationale as well as the software that implements these design techniques. The software is a set of persistent C++ objects controlled by Tcl command scripts. The C++/Tcl interface is automatically generated by a software system called tcl_c++ developed by the author and described here. The objects keep track of different portions of the molecular machinery to allow different simulation techniques and boundary conditions to be applied as appropriate. This capability has been required to demonstrate (computationally) our gear's feasibility. A new distributed software architecture featuring a WWW universal client, CORBA distributed objects, and agent software is under consideration. The software architecture is intended to eventually enable a widely dispersed group to develop complex simulated molecular machines.

  9. The Software Engineering Prototype.

    DTIC Science & Technology

    1983-06-01

    OCR fragment of the report abstract. The legible portion reads: "... the traditional method of software development often has poor results. Recently, a new approach to software development, the prototype approach ..."

  10. VIMOS Instrument Control Software Design: an Object Oriented Approach

    NASA Astrophysics Data System (ADS)

    Brau-Nogué, Sylvie; Lucuix, Christian

    2002-12-01

    The Franco-Italian VIMOS instrument is a VIsible imaging Multi-Object Spectrograph with outstanding multiplex capabilities, allowing spectra of more than 800 objects to be taken simultaneously, or integral-field spectroscopy in a 54x54 arcsec area. VIMOS is being installed at the Nasmyth focus of the third Unit Telescope of the European Southern Observatory Very Large Telescope (VLT) at Mount Paranal in Chile. This paper will describe the analysis, the design, and the implementation of the VIMOS Instrument Control System, using UML notation. Our control group followed an object-oriented software process while keeping in mind the ESO VLT standard control concepts, for which a complete software library is available. Rather than applying a waterfall lifecycle, the ICS project used iterative development, a lifecycle consisting of several iterations. Each iteration consisted of capturing and evaluating the requirements, visual modeling for analysis and design, implementation, test, and deployment. Depending on the project phase, iterations focused more or less on specific activities. The result is an object model (the design model), including use-case realizations. An implementation view and a deployment view complement this product. An extract of the VIMOS ICS UML model will be presented and some implementation, integration, and test issues will be discussed.

  11. Issues in Software Engineering of Relevance to Instructional Design

    ERIC Educational Resources Information Center

    Douglas, Ian

    2006-01-01

    Software engineering is popularly misconceived as being an upmarket term for programming. In a way, this is akin to characterizing instructional design as the process of creating PowerPoint slides. In both these areas, the construction of systems, whether they are learning or computer systems, is only one part of a systematic process. The most…

  12. Inclusion of LCCA in Alaska flexible pavement design software manual.

    DOT National Transportation Integrated Search

    2012-10-01

    Life cycle cost analysis is a key part of selecting materials and techniques that optimize the service life of a pavement in terms of cost and performance. While the Alaska Flexible Pavement Design software has been in use since 2004, there is no ...

  13. Peeling the Onion: Okapi System Architecture and Software Design Issues.

    ERIC Educational Resources Information Center

    Jones, S.; And Others

    1997-01-01

    Discusses software design issues for Okapi, an information retrieval system that incorporates both search engine and user interface and supports weighted searching, relevance feedback, and query expansion. The basic search system, adjacency searching, and moving toward a distributed system are discussed. (Author/LRW)

  14. Designing for User Cognition and Affect in Software Instructions

    ERIC Educational Resources Information Center

    van der Meij, Hans

    2008-01-01

    In this paper we examine how to design software instructions for user cognition and affect. A basic manual and a co-user manual are compared. The former provides fundamental support for both; the latter includes a buddy to further optimize support for user affect. The basic manual was faster and judged easier to process than the co-user manual. In…

  15. A Software Designed For STP Data Plot and Analysis Based on Object-oriented Methodology

    NASA Astrophysics Data System (ADS)

    Lina, L.; Murata, K.

    2006-12-01

    In the present study, we design a system named "STARS (Solar-Terrestrial data Analysis and Reference System)". STARS provides a research environment in which researchers can refer to and analyse a variety of data with a single piece of software. The software design is based on the OMT (Object Modeling Technique), one of the object-oriented techniques, which has advantages in maintainability, reuse, and long-term development of a system. At the Center for Information Technology, Ehime University, after designing STARS, we have already started implementing it. The latest version, STARS5, was released in 2006. Any user can download the system from our WWW site (http://www.infonet.cite.ehime-u.ac.jp/STARS). The present paper is mainly devoted to the design of a data analysis software system. Throughout our design we took care that it remains flexible and applicable when other developers design software for similar purposes; if our model were particular only to our own purpose, it would be useless to other developers. In designing the domain object model, we carefully removed the parts that depend on system resources, e.g. hardware and software, and put those dependent parts into the application object model. In the present design, therefore, the domain object model and the utility object model are independent of computer resources. This helps another developer construct his/her own system based on the present design: they simply modify their own application object models according to their system resources. This division of the design into resource-dependent and resource-independent parts across three object models is one of the advantages of the OMT. If the design of software is completed along with the OMT, implementation is rather simple and almost automatic: developers simply map their designs onto programs. If one creates "another STARS" with another programming language such as Java, the programmer
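
    The dependent/independent split described above can be sketched schematically: the domain object knows nothing about files, GUIs, or other system resources, while a thin application-layer object adapts it to one concrete resource, so porting the system means rewriting only the adapter. The class names below are invented for illustration and do not come from STARS.

```python
class TimeSeries:
    """Domain model: pure data and behaviour, no resource dependence."""
    def __init__(self, times, values):
        self.times, self.values = times, values

    def mean(self):
        return sum(self.values) / len(self.values)

class CsvTimeSeriesLoader:
    """Application model: adapts one concrete resource (CSV text)
    to the resource-independent domain object."""
    def load(self, text):
        rows = [line.split(",") for line in text.strip().splitlines()]
        return TimeSeries([float(t) for t, _ in rows],
                          [float(v) for _, v in rows])

ts = CsvTimeSeriesLoader().load("0,1.0\n1,3.0\n2,5.0")
print(ts.mean())  # 3.0
```

    Replacing CSV files with a database or network feed would require only a new loader class; `TimeSeries` and everything built on it stay untouched, which is the portability benefit the abstract claims for the OMT split.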

  16. Stochastic optimization of GeantV code by use of genetic algorithms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amadio, G.; Apostolakis, J.; Bandieramonte, M.

    GeantV is a complex system based on the interaction of different modules needed for detector simulation, which include transport of particles in fields, physics models simulating their interactions with matter and a geometrical modeler library for describing the detector and locating the particles and computing the path length to the current volume boundary. The GeantV project is recasting the classical simulation approach to get maximum benefit from SIMD/MIMD computational architectures and highly massive parallel systems. This involves finding the appropriate balance between several aspects influencing computational performance (floating-point performance, usage of off-chip memory bandwidth, specification of cache hierarchy, etc.) and handling a large number of program parameters that have to be optimized to achieve the best simulation throughput. This optimization task can be treated as a black-box optimization problem, which requires searching the optimum set of parameters using only point-wise function evaluations. Here, the goal of this study is to provide a mechanism for optimizing complex systems (high energy physics particle transport simulations) with the help of genetic algorithms and evolution strategies as tuning procedures for massive parallel simulations. One of the described approaches is based on introducing a specific multivariate analysis operator that could be used in case of resource expensive or time consuming evaluations of fitness functions, in order to speed-up the convergence of the black-box optimization problem.

  17. Stochastic optimization of GeantV code by use of genetic algorithms

    DOE PAGES

    Amadio, G.; Apostolakis, J.; Bandieramonte, M.; ...

    2017-10-01

    GeantV is a complex system based on the interaction of different modules needed for detector simulation, which include transport of particles in fields, physics models simulating their interactions with matter and a geometrical modeler library for describing the detector and locating the particles and computing the path length to the current volume boundary. The GeantV project is recasting the classical simulation approach to get maximum benefit from SIMD/MIMD computational architectures and highly massive parallel systems. This involves finding the appropriate balance between several aspects influencing computational performance (floating-point performance, usage of off-chip memory bandwidth, specification of cache hierarchy, etc.) and handling a large number of program parameters that have to be optimized to achieve the best simulation throughput. This optimization task can be treated as a black-box optimization problem, which requires searching the optimum set of parameters using only point-wise function evaluations. Here, the goal of this study is to provide a mechanism for optimizing complex systems (high energy physics particle transport simulations) with the help of genetic algorithms and evolution strategies as tuning procedures for massive parallel simulations. One of the described approaches is based on introducing a specific multivariate analysis operator that could be used in case of resource expensive or time consuming evaluations of fitness functions, in order to speed-up the convergence of the black-box optimization problem.
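
    A minimal sketch of the kind of black-box tuning described above, assuming nothing about the fitness function beyond point-wise evaluation. The (1+λ) evolution strategy, the toy quadratic surrogate, and all parameter values are illustrative stand-ins for the GeantV throughput optimization, not the paper's actual procedure.

```python
import random

def evolution_strategy(fitness, x0, sigma=0.5, offspring=8,
                       generations=50, seed=1):
    """Minimal (1+lambda) evolution strategy: mutate the incumbent,
    keep the best point seen. Only point-wise fitness evaluations are
    required, matching the black-box setting."""
    rng = random.Random(seed)
    parent, best = list(x0), fitness(x0)
    for _ in range(generations):
        for _ in range(offspring):
            child = [p + rng.gauss(0.0, sigma) for p in parent]
            f = fitness(child)
            if f < best:
                parent, best = child, f
    return parent, best

# Toy surrogate: minimise a quadratic bowl standing in for
# "negative simulation throughput" as a function of two parameters.
f = lambda x: (x[0] - 3.0) ** 2 + (x[1] + 1.0) ** 2
x, fx = evolution_strategy(f, [0.0, 0.0])
print(x, fx)  # near the optimum (3, -1)
```

    Production tuners add step-size adaptation and population-based recombination; the fixed-sigma loop above only conveys that the optimizer never inspects the simulator's internals.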

  18. Reuse Metrics for Object Oriented Software

    NASA Technical Reports Server (NTRS)

    Bieman, James M.

    1998-01-01

    One way to increase the quality of software products and the productivity of software development is to reuse existing software components when building new software systems. In order to monitor improvements in reuse, the level of reuse must be measured. In this NASA supported project we (1) derived a suite of metrics which quantify reuse attributes for object oriented, object based, and procedural software, (2) designed prototype tools to take these measurements in Ada, C++, Java, and C software, (3) evaluated the reuse in available software, (4) analyzed the relationship between coupling, cohesion, inheritance, and reuse, (5) collected object oriented software systems for our empirical analyses, and (6) developed quantitative criteria and methods for restructuring software to improve reusability.

  19. Hardware and Software Design of FPGA-based PCIe Gen3 interface for APEnet+ network interconnect system

    NASA Astrophysics Data System (ADS)

    Ammendola, R.; Biagioni, A.; Frezza, O.; Lo Cicero, F.; Lonardo, A.; Martinelli, M.; Paolucci, P. S.; Pastorelli, E.; Rossetti, D.; Simula, F.; Tosoratto, L.; Vicini, P.

    2015-12-01

    In the attempt to develop an interconnection architecture optimized for hybrid HPC systems dedicated to scientific computing, we designed APEnet+, a point-to-point, low-latency and high-performance network controller supporting 6 fully bidirectional off-board links over a 3D torus topology. The first release of APEnet+ (named V4) was a board based on a 40 nm Altera FPGA, integrating 6 channels at 34 Gbps of raw bandwidth per direction and a PCIe Gen2 x8 host interface. It has been the first-of-its-kind device to implement an RDMA protocol to directly read/write data from/to Fermi and Kepler NVIDIA GPUs using NVIDIA peer-to-peer and GPUDirect RDMA protocols, obtaining real zero-copy GPU-to-GPU transfers over the network. The latest generation of APEnet+ systems (now named V5) implements a PCIe Gen3 x8 host interface on a 28 nm Altera Stratix V FPGA, with multi-standard fast transceivers (up to 14.4 Gbps) and an increased amount of configurable internal resources and hardware IP cores to support main interconnection standard protocols. Herein we present the APEnet+ V5 architecture, the status of its hardware and its system software design. Both its Linux Device Driver and the low-level libraries have been redeveloped to support the PCIe Gen3 protocol, introducing optimizations and solutions based on hardware/software co-design.

  20. Software architecture of INO340 telescope control system

    NASA Astrophysics Data System (ADS)

    Ravanmehr, Reza; Khosroshahi, Habib

    2016-08-01

    The software architecture plays an important role in the distributed control system of astronomical projects because many subsystems and components must work together in a consistent and reliable way. We have utilized a customized architecture design approach based on the "4+1 view model" in order to design the INOCS software architecture. In this paper, after reviewing the top-level INOCS architecture, we present the software architecture model of INOCS inspired by the "4+1 model"; for this purpose we provide logical, process, development, physical, and scenario views of our architecture using different UML diagrams and other illustrative visual charts. Each view presents the INOCS software architecture from a different perspective. We finish the paper with the science data operation of INO340 and concluding remarks.

  1. Toward Understanding the Cognitive Processes of Software Design in Novice Programmers

    ERIC Educational Resources Information Center

    Yeh, Kuo-Chuan

    2009-01-01

    This study provides insights with regard to the types of cognitive processes that are involved in the formation of mental models and the way those models change over the course of a semester in novice programmers doing a design task. Eight novice programmers participated in this study for three distinct software design sessions, using the same…

  2. Feasibility study of an Integrated Program for Aerospace vehicle Design (IPAD). Volume 4: IPAD system design

    NASA Technical Reports Server (NTRS)

    Goldfarb, W.; Carpenter, L. C.; Redhed, D. D.; Hansen, S. D.; Anderson, L. O.; Kawaguchi, A. S.

    1973-01-01

    The computing system design of IPAD is described and the requirements which form the basis for the system design are discussed. The system is presented in terms of a functional design description and technical design specifications. The functional design specifications give the detailed description of the system design using top-down structured programming methodology. Human behavioral characteristics, which specify the system design at the user interface, security considerations, and standards for system design, implementation, and maintenance are also part of the technical design specifications. Detailed specifications of the two most common computing system types in use by the major aerospace companies which could support the IPAD system design are presented. The report of a study to investigate migration of IPAD software between the two candidate 3rd generation host computing systems and from these systems to a 4th generation system is included.

  3. SU-F-T-125: Radial Dose Distributions From Carbon Ions of Therapeutic Energies Calculated with Geant4-DNA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vassiliev, O

    Purpose: Radial dose distribution D(r) is the dose as a function of lateral distance from the path of a heavy charged particle. Its main application is in modelling of biological effects of heavy ions, including applications to hadron therapy. It is the main physical parameter of a broad group of radiobiological models known as the amorphous track models. Our purpose was to calculate D(r) with Monte Carlo for carbon ions of therapeutic energies, find a simple formula for D(r) and fit it to the Monte Carlo data. Methods: All calculations were performed with the Geant4-DNA code, for carbon ion energies from 10 to 400 MeV/u (ranges in water: ∼ 0.4 mm to 27 cm). The spatial resolution of the dose distribution in the lateral direction was 1 nm. The electron tracking cut-off energy was 11 eV (ionization threshold). The maximum lateral distance considered was 10 µm. Over this distance, D(r) decreases with distance by eight orders of magnitude. Results: All calculated radial dose distributions had a similar shape dominated by the well-known inverse square dependence on the distance. Deviations from the inverse square law were observed close to the beam path (r<10 nm) and at large distances (r >1 µm). At small and large distances D(r) decreased, respectively, slower and faster than the inverse square of distance. A formula for D(r) consistent with this behavior was found and fitted to the Monte Carlo data. The accuracy of the fit was better than 10% for all distances considered. Conclusion: We have generated a set of radial dose distributions for carbon ions that covers the entire range of therapeutic energies, for distances from the ion path of up to 10 µm. The latter distance is sufficient for most applications because dose beyond 10 µm is extremely low.
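
    One simple functional form consistent with the qualitative behaviour reported above (flat inside the track core, inverse-square at intermediate radii, faster fall-off at large distances) is sketched below. The form and its parameter values are hypothetical illustrations; the abstract does not disclose the fitted formula.

```python
import math

def radial_dose(r_nm, amplitude=1.0, core_nm=10.0, cutoff_nm=2000.0):
    """Hypothetical radial dose profile: ~1/r^2 at intermediate r,
    flattening below the core radius and dropping faster than 1/r^2
    beyond the cutoff. All parameters are illustrative only."""
    return amplitude / (r_nm ** 2 + core_nm ** 2) * math.exp(-r_nm / cutoff_nm)

# In the intermediate region the profile scales approximately as 1/r^2:
ratio = radial_dose(100.0) / radial_dose(200.0)
print(ratio)  # slightly above the pure inverse-square value of 4
```

    The core term keeps the dose finite on the ion path, and the exponential reproduces the faster-than-inverse-square decay beyond ~1 µm set by the maximum delta-electron range.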

  4. Design of Mariner 9 Science Sequences using Interactive Graphics Software

    NASA Technical Reports Server (NTRS)

    Freeman, J. E.; Sturms, F. M., Jr.; Webb, W. A.

    1973-01-01

    This paper discusses the analyst/computer system used to design the daily science sequences required to carry out the desired Mariner 9 science plan. The Mariner 9 computer environment, the development and capabilities of the science sequence design software, and the techniques followed in the daily mission operations are discussed. Included is a discussion of the overall mission operations organization and the individual components which played an essential role in the sequence design process. A summary of actual sequences processed, a discussion of problems encountered, and recommendations for future applications are given.

  5. GEANT4 simulation of a scintillating-fibre tracker for the cosmic-ray muon tomography of legacy nuclear waste containers

    NASA Astrophysics Data System (ADS)

    Clarkson, A.; Hamilton, D. J.; Hoek, M.; Ireland, D. G.; Johnstone, J. R.; Kaiser, R.; Keri, T.; Lumsden, S.; Mahon, D. F.; McKinnon, B.; Murray, M.; Nutbeam-Tuffs, S.; Shearer, C.; Staines, C.; Yang, G.; Zimmerman, C.

    2014-05-01

    Cosmic-ray muons are highly penetrative charged particles that are observed at sea level with a flux of approximately one per square centimetre per minute. They interact with matter primarily through Coulomb scattering, which is exploited in the field of muon tomography to image shielded objects in a wide range of applications. In this paper, simulation studies are presented that assess the feasibility of a scintillating-fibre tracker system for use in the identification and characterisation of nuclear materials stored within industrial legacy waste containers. A system consisting of a pair of tracking modules above and a pair below the volume to be assayed is simulated within the GEANT4 framework using a range of potential fibre pitches and module separations. Each module comprises two orthogonal planes of fibres that allow the reconstruction of the initial and Coulomb-scattered muon trajectories. A likelihood-based image reconstruction algorithm has been developed that allows the container content to be determined with respect to the scattering density λ, a parameter which is related to the atomic number Z of the scattering material. Images reconstructed from this simulation are presented for a range of anticipated scenarios that highlight the expected image resolution and the potential of this system for the identification of high-Z materials within a shielded, concrete-filled container. First results from a constructed prototype system are presented in comparison with those from a detailed simulation. Excellent agreement between experimental data and simulation is observed, showing clear discrimination between the different materials assayed.
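
    The relation between the measured scattering angles and the scattering density λ can be illustrated with a simple moment-based estimator: λ is the variance of the scattering angle per unit path length. This is a deliberately simplified stand-in for the likelihood-based reconstruction described above, and the Gaussian toy data and path length are invented.

```python
import math
import random

def scattering_angle(v_in, v_out):
    """Angle between incoming and outgoing muon direction vectors."""
    dot = sum(a * b for a, b in zip(v_in, v_out))
    norm = (math.sqrt(sum(a * a for a in v_in))
            * math.sqrt(sum(b * b for b in v_out)))
    return math.acos(max(-1.0, min(1.0, dot / norm)))

def scattering_density(angles_rad, path_length_cm):
    """Moment-based estimate of lambda: mean squared scattering angle
    divided by the traversed path length."""
    var = sum(t * t for t in angles_rad) / len(angles_rad)
    return var / path_length_cm

# Synthetic example: small Gaussian scattering angles over a 10 cm path.
rng = random.Random(0)
true_sigma = 0.01  # rad
angles = [abs(rng.gauss(0.0, true_sigma)) for _ in range(10000)]
est = scattering_density(angles, 10.0)
print(est)  # close to true_sigma**2 / 10 = 1e-5
```

    Because the multiple-scattering width grows with the atomic number of the traversed material, a larger λ estimate in a voxel flags higher-Z content, which is the discrimination the paper demonstrates with its likelihood method.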

  6. Modeling of very low frequency (VLF) radio wave signal profile due to solar flares using the GEANT4 Monte Carlo simulation coupled with ionospheric chemistry

    NASA Astrophysics Data System (ADS)

    Palit, S.; Basak, T.; Mondal, S. K.; Pal, S.; Chakrabarti, S. K.

    2013-09-01

X-ray photons emitted during solar flares cause ionization in the lower ionosphere (~60 to 100 km) in excess of what is expected to occur due to a quiet sun. Very low frequency (VLF) radio wave signals reflected from the D-region of the ionosphere are affected by this excess ionization. In this paper, we reproduce the deviation in VLF signal strength during solar flares by numerical modeling. We use the GEANT4 Monte Carlo simulation code to compute the rate of ionization due to an M-class flare and an X-class flare. The output of the simulation is then used in a simplified ionospheric chemistry model to calculate the time variation of electron density at different altitudes in the D-region of the ionosphere. The resulting electron density variation profile is then self-consistently used in the LWPC code to obtain the time variation of the change in VLF signal. We modeled the VLF signal along the NWC (Australia) to IERC/ICSP (India) propagation path and compared the results with observations. The agreement is found to be very satisfactory.
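The simplified chemistry step described here can be captured by a one-species continuity equation, dNe/dt = q(t) − αNe² (ion-pair production minus effective recombination), integrated numerically. The sketch below is schematic: the Gaussian flare burst, the rate constant α, and the quiet-time background are invented for illustration and are not the paper's parameters.

```python
import math

def electron_density(q_of_t, alpha, ne0, t_end, dt=1.0):
    """Forward-Euler integration of dNe/dt = q(t) - alpha * Ne^2
    (ion-pair production minus effective recombination)."""
    ne, t, profile = ne0, 0.0, []
    while t <= t_end:
        profile.append((t, ne))
        ne += (q_of_t(t) - alpha * ne * ne) * dt
        t += dt
    return profile

# Gaussian flare burst on top of a quiet-time production rate (arb. units)
q = lambda t: 1.0 + 50.0 * math.exp(-((t - 300.0) / 60.0) ** 2)
profile = electron_density(q, alpha=1e-2, ne0=10.0, t_end=900.0)
t_peak, ne_peak = max(profile, key=lambda p: p[1])
print(f"Ne peaks at ~{ne_peak:.0f} (arb. units) near t = {t_peak:.0f} s")
```

The electron density rises and decays slightly behind the X-ray flux, which is the lag that ultimately shapes the VLF amplitude deviation fed into a propagation code such as LWPC.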

  7. The Design and Realization of Radio Telescope Control Software in Windows XP System with VC++

    NASA Astrophysics Data System (ADS)

    Zhao, Rong-Bing; Aili, Yu; Zhang, Jin; Yu, Yun

    2007-03-01

The main function of the radio telescope control software is to drive the radio telescope to track the target accurately. The design of the radio telescope control software is based on the Windows XP system with VC++. The functions of the software, the communication mode, and the user interface are introduced in this article.

  8. RARtool: A MATLAB Software Package for Designing Response-Adaptive Randomized Clinical Trials with Time-to-Event Outcomes.

    PubMed

    Ryeznik, Yevgen; Sverdlov, Oleksandr; Wong, Weng Kee

    2015-08-01

Response-adaptive randomization designs are becoming increasingly popular in clinical trial practice. In this paper, we present RARtool, user-interface software developed in MATLAB for designing response-adaptive randomized comparative clinical trials with censored time-to-event outcomes. The RARtool software can compute different types of optimal treatment allocation designs, and it can simulate response-adaptive randomization procedures targeting selected optimal allocations. Through simulations, an investigator can assess design characteristics under a variety of experimental scenarios and select the best procedure for practical implementation. We illustrate the utility of our RARtool software by redesigning a survival trial from the literature.
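The core idea of response-adaptive randomization can be illustrated with a toy rule: skew each patient's allocation probability toward the arm with the better current estimated success rate. This is a deliberately simplified stand-in, not RARtool's optimal-allocation targets, its time-to-event machinery, or its MATLAB interface; all numbers are invented.

```python
import random

def rar_trial(p_success, n_patients, seed=1):
    """Toy response-adaptive randomization: allocate each patient to an
    arm with probability proportional to that arm's current estimated
    success rate (with +1 smoothing to avoid zero probabilities)."""
    rng = random.Random(seed)
    succ = [1] * len(p_success)   # smoothed success counts
    tot = [2] * len(p_success)    # smoothed totals
    alloc = [0] * len(p_success)
    for _ in range(n_patients):
        rates = [s / t for s, t in zip(succ, tot)]
        r = rng.random() * sum(rates)
        arm = 0
        while r > rates[arm]:     # weighted draw over arms
            r -= rates[arm]
            arm += 1
        alloc[arm] += 1
        tot[arm] += 1
        if rng.random() < p_success[arm]:
            succ[arm] += 1
    return alloc

print(rar_trial([0.3, 0.6], 500))  # the better arm receives more patients
```

Real designs of the kind RARtool simulates replace this ad-hoc rule with allocation targets derived from formal optimality criteria, but the simulate-then-compare workflow is the same.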

  9. Man-machine Integration Design and Analysis System (MIDAS) Task Loading Model (TLM) experimental and software detailed design report

    NASA Technical Reports Server (NTRS)

    Staveland, Lowell

    1994-01-01

This is the experimental and software detailed design report for the prototype task loading model (TLM) developed as part of the man-machine integration design and analysis system (MIDAS), as implemented and tested in phase 6 of the Army-NASA Aircrew/Aircraft Integration (A3I) Program. The A3I program is an exploratory development effort to advance the capabilities and use of computational representations of human performance and behavior in the design, synthesis, and analysis of manned systems. The MIDAS TLM computationally models the demands designs impose on operators to aid engineers in the conceptual design of aircraft crewstations. This report describes the TLM and the results of a series of experiments which were run during this phase to test its capabilities as a predictive task demand modeling tool. Specifically, it includes discussions of: the inputs and outputs of the TLM, the theories underlying it, the results of the test experiments, the use of the TLM as both a stand-alone tool and as part of a complete human operator simulation, and a brief introduction to the TLM software design.

  10. Performance profiling for brachytherapy applications

    NASA Astrophysics Data System (ADS)

    Choi, Wonqook; Cho, Kihyeon; Yeo, Insung

    2018-05-01

In many physics applications, a significant amount of software (e.g. R, ROOT and Geant4) is developed on novel computing architectures, and much effort is expended to ensure the software is efficient in terms of central processing unit (CPU) time and memory usage. Profiling tools are used to evaluate this efficiency; however, few such tools are able to accommodate low-energy physics regions. To address this limitation, we developed a low-energy physics profiling system in Geant4 to profile the CPU time and memory of software in brachytherapy applications. This paper describes and evaluates specific physics lists that are applied to brachytherapy applications in Geant4, such as QGSP_BIC_LIV, QGSP_BIC_EMZ, and QGSP_BIC_EMY. The physics range in this tool allows it to be used to generate low-energy profiles in brachytherapy applications. This was a limitation of previous studies, which led us to develop a new profiling tool that supports profiling in the MeV range, in contrast to the TeV range supported by existing high-energy profiling tools. In order to easily compare profiling results between low-energy and high-energy modes, we employed the same software architecture as the SimpliCarlo tool developed at the Fermi National Accelerator Laboratory (FNAL) for the Large Hadron Collider (LHC). The results show that the newly developed profiling system for low-energy physics (below the MeV scale) complements the current profiling system used for high-energy physics (at the TeV scale) applications.
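The basic CPU-time and peak-memory bookkeeping such a profiling system performs can be sketched with standard-library tools. This is a minimal illustration of the measurement idea only; the names and the stand-in workload are invented and have nothing to do with the SimpliCarlo architecture or Geant4 internals.

```python
import time
import tracemalloc
from functools import wraps

def profile(fn):
    """Record wall-clock time and peak allocated memory of each call,
    storing the result on the wrapped function as `.stats`."""
    @wraps(fn)
    def wrapper(*args, **kwargs):
        tracemalloc.start()
        t0 = time.perf_counter()
        try:
            return fn(*args, **kwargs)
        finally:
            elapsed = time.perf_counter() - t0
            _, peak = tracemalloc.get_traced_memory()
            tracemalloc.stop()
            wrapper.stats = {"seconds": elapsed, "peak_bytes": peak}
    return wrapper

@profile
def transport_step(n):
    # stand-in workload for a physics stepping loop
    return sum(i * i for i in range(n))

transport_step(100_000)
print(transport_step.stats)
```

A per-application profiler wraps the simulation entry points in exactly this fashion and then aggregates the per-call records for comparison across physics lists.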

  11. Computer software.

    PubMed

    Rosenthal, L E

    1986-10-01

Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of type, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  12. Constraint-Driven Software Design: An Escape from the Waterfall Model.

    ERIC Educational Resources Information Center

    de Hoog, Robert; And Others

    1994-01-01

    Presents the principles of a development methodology for software design based on a nonlinear, product-driven approach that integrates quality aspects. Two examples are given to show that the flexibility needed for building high quality systems leads to integrated development environments in which methodology, product, and tools are closely…

  13. Designing for Change: Minimizing the Impact of Changing Requirements in the Later Stages of a Spaceflight Software Project

    NASA Technical Reports Server (NTRS)

    Allen, B. Danette

    1998-01-01

    In the traditional 'waterfall' model of the software project life cycle, the Requirements Phase ends and flows into the Design Phase, which ends and flows into the Development Phase. Unfortunately, the process rarely, if ever, works so smoothly in practice. Instead, software developers often receive new requirements, or modifications to the original requirements, well after the earlier project phases have been completed. In particular, projects with shorter than ideal schedules are highly susceptible to frequent requirements changes, as the software requirements analysis phase is often forced to begin before the overall system requirements and top-level design are complete. This results in later modifications to the software requirements, even though the software design and development phases may be complete. Requirements changes received in the later stages of a software project inevitably lead to modification of existing developed software. Presented here is a series of software design techniques that can greatly reduce the impact of last-minute requirements changes. These techniques were successfully used to add built-in flexibility to two complex software systems in which the requirements were expected to (and did) change frequently. These large, real-time systems were developed at NASA Langley Research Center (LaRC) to test and control the Lidar In-Space Technology Experiment (LITE) instrument which flew aboard the space shuttle Discovery as the primary payload on the STS-64 mission.

  14. Framework Programmable Platform for the Advanced Software Development Workstation: Preliminary system design document

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Ackley, Keith A.; Crump, John W., IV; Henderson, Richard; Futrell, Michael T.

    1991-01-01

    The Framework Programmable Software Development Platform (FPP) is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software environment. Guided by the model, this system development framework will take advantage of an integrated operating environment to automate effectively the management of the software development process so that costly mistakes during the development phase can be eliminated. The focus here is on the design of components that make up the FPP. These components serve as supporting systems for the Integration Mechanism and the Framework Processor and provide the 'glue' that ties the FPP together. Also discussed are the components that allow the platform to operate in a distributed, heterogeneous environment and to manage the development and evolution of software system artifacts.

  15. Software design for a compact interferometer

    NASA Astrophysics Data System (ADS)

    Vogel, Andreas

    1993-01-01

Experience shows that opticians often have to test many similar elements, with only a small number of input parameters changed in a well-defined manner. It is therefore useful to develop simplified software for special applications. The software is used in a compact phase-shifting interferometer. Up to five interferometers can be controlled by a single PC-AT computer. Modular programming simplifies software modification for new applications.

  16. Software design of a remote real-time ECG monitoring system

    NASA Astrophysics Data System (ADS)

    Yu, Chengbo; Tao, Hongyan

    2005-12-01

Heart disease is one of the main diseases that threaten the health and lives of human beings. At present, typical remote ECG monitoring systems suffer from a short testing distance and a limited number of monitoring lines. Because cardiac events are often sudden and paroxysmal, ECG monitoring has extended from the hospital to the family, and remote ECG monitoring through the Internet therefore has practical value and significance. The principle and design method of the software of a remote dynamic ECG monitor are presented and discussed. The monitoring software is programmed with Delphi and based on a client-server interactive mode. The application program, which makes use of multithreading technology, is shown to perform in an excellent manner. The program handles remote user connections and ECG processing, i.e. receiving ECG data, real-time displaying, recording and replaying. The system can connect many clients simultaneously and perform real-time monitoring of patients.
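The one-thread-per-client pattern described here can be sketched with standard sockets. This is an illustrative Python stand-in only (the paper's system is written in Delphi), and the frame format and acknowledgement are invented.

```python
import socket
import threading

def start_ecg_server(host="127.0.0.1", port=0):
    """Minimal threaded server: each connected monitor client is
    handled on its own thread, so many clients can stream at once."""
    srv = socket.socket()
    srv.bind((host, port))          # port=0: let the OS pick a free port
    srv.listen()

    def handle(conn):
        with conn:
            data = conn.recv(1024)  # one small ECG sample frame
            conn.sendall(b"ACK " + data)

    def serve():
        while True:
            try:
                conn, _ = srv.accept()
            except OSError:
                break               # server socket was closed
            threading.Thread(target=handle, args=(conn,), daemon=True).start()

    threading.Thread(target=serve, daemon=True).start()
    return srv, srv.getsockname()[1]

srv, port = start_ecg_server()
cli = socket.socket()
cli.connect(("127.0.0.1", port))
cli.sendall(b"lead-II:512")
print(cli.recv(1024))               # b'ACK lead-II:512'
cli.close()
srv.close()
```

A real monitor would replace the single `recv` with a framing loop and add the display, recording, and replay paths on the receiving side.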

  17. Detection And Mapping (DAM) package. Volume 4B: Software System Manual, part 2

    NASA Technical Reports Server (NTRS)

    Schlosser, E. H.

    1980-01-01

    Computer programs, graphic devices, and an integrated set of manual procedures designed for efficient production of precisely registered and formatted maps from digital data are presented. The software can be used on any Univac 1100 series computer. The software includes pre-defined spectral limits for use in classifying and mapping surface water for LANDSAT-1, LANDSAT-2, and LANDSAT-3.

  18. Ada Software Design Methods Formulation.

    DTIC Science & Technology

    1982-10-01

cycle organization is also appropriate for another reason. The source material for the case studies is the work of the two contractors who participated in... working version of the system exist. The integration phase takes the pieces developed and combines them into a single working system. Interfaces... hardware, developed separately from the software, is united with the software, and further testing is performed until the system is a working whole.

  19. Web-based software tool for constraint-based design specification of synthetic biological systems.

    PubMed

    Oberortner, Ernst; Densmore, Douglas

    2015-06-19

miniEugene provides computational support for solving combinatorial design problems, enabling users to specify and enumerate designs for novel biological systems based on sets of biological constraints. This technical note presents a brief tutorial for biologists and software engineers in the field of synthetic biology on how to use miniEugene. After reading this technical note, users should know which biological constraints are available in miniEugene, understand the syntax and semantics of these constraints, and be able to follow a step-by-step guide to specify the design of a classical synthetic biological system, the genetic toggle switch [1]. We also provide links and references to more information on the miniEugene web application and the integration of the miniEugene software library into sophisticated Computer-Aided Design (CAD) tools for synthetic biology (www.eugenecad.org).
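Constraint-based enumeration of part orderings, the kind of combinatorial problem miniEugene solves, can be sketched by filtering permutations against predicates. The part names and the `before`/first-position constraints below are illustrative Python predicates, not miniEugene's rule syntax.

```python
from itertools import permutations

def enumerate_designs(parts, constraints):
    """Return every ordering of `parts` that satisfies all constraints,
    each constraint being a predicate over a candidate ordering."""
    return [order for order in permutations(parts)
            if all(c(order) for c in constraints)]

def before(a, b):
    """Constraint factory: part `a` must appear before part `b`."""
    return lambda order: order.index(a) < order.index(b)

designs = enumerate_designs(
    ["pTet", "rbs", "gfp", "term"],
    [before("pTet", "gfp"),              # promoter precedes gene
     before("gfp", "term"),              # gene precedes terminator
     lambda o: o[0] == "pTet"],          # device starts with the promoter
)
print(len(designs), designs[0])
```

Brute-force permutation filtering only works for small devices; dedicated tools use constraint solvers to scale, but the specify-then-enumerate workflow is the same.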

  20. Parallels in Computer-Aided Design Framework and Software Development Environment Efforts.

    DTIC Science & Technology

    1992-05-01

design kits, and tool and design management frameworks. Also, books about software engineering environments [Long 91] and electronic design... tool integration [Zarrella 90], and agreement upon a universal design automation framework, such as the CAD Framework Initiative (CFI) [Malasky 91]... ments: identification, control, status accounting, and audit and review. The paper by Dart extracts 15 CM concepts from existing SDEs and tools.

  1. Benchmarking and validation of a Geant4-SHADOW Monte Carlo simulation for dose calculations in microbeam radiation therapy.

    PubMed

    Cornelius, Iwan; Guatelli, Susanna; Fournier, Pauline; Crosbie, Jeffrey C; Sanchez Del Rio, Manuel; Bräuer-Krisch, Elke; Rosenfeld, Anatoly; Lerch, Michael

    2014-05-01

    Microbeam radiation therapy (MRT) is a synchrotron-based radiotherapy modality that uses high-intensity beams of spatially fractionated radiation to treat tumours. The rapid evolution of MRT towards clinical trials demands accurate treatment planning systems (TPS), as well as independent tools for the verification of TPS calculated dose distributions in order to ensure patient safety and treatment efficacy. Monte Carlo computer simulation represents the most accurate method of dose calculation in patient geometries and is best suited for the purpose of TPS verification. A Monte Carlo model of the ID17 biomedical beamline at the European Synchrotron Radiation Facility has been developed, including recent modifications, using the Geant4 Monte Carlo toolkit interfaced with the SHADOW X-ray optics and ray-tracing libraries. The code was benchmarked by simulating dose profiles in water-equivalent phantoms subject to irradiation by broad-beam (without spatial fractionation) and microbeam (with spatial fractionation) fields, and comparing against those calculated with a previous model of the beamline developed using the PENELOPE code. Validation against additional experimental dose profiles in water-equivalent phantoms subject to broad-beam irradiation was also performed. Good agreement between codes was observed, with the exception of out-of-field doses and toward the field edge for larger field sizes. Microbeam results showed good agreement between both codes and experimental results within uncertainties. Results of the experimental validation showed agreement for different beamline configurations. The asymmetry in the out-of-field dose profiles due to polarization effects was also investigated, yielding important information for the treatment planning process in MRT. This work represents an important step in the development of a Monte Carlo-based independent verification tool for treatment planning in MRT.

  2. The Application of SNiPER to the JUNO Simulation

    NASA Astrophysics Data System (ADS)

    Lin, Tao; Zou, Jiaheng; Li, Weidong; Deng, Ziyan; Fang, Xiao; Cao, Guofu; Huang, Xingtao; You, Zhengyun; JUNO Collaboration

    2017-10-01

The JUNO (Jiangmen Underground Neutrino Observatory) is a multipurpose neutrino experiment which is designed to determine neutrino mass hierarchy and precisely measure oscillation parameters. As one of the important systems, the JUNO offline software is being developed using the SNiPER software. In this proceeding, we focus on the requirements of JUNO simulation and present the working solution based on the SNiPER. The JUNO simulation framework is in charge of managing event data, detector geometries and materials, physics processes, simulation truth information etc. It glues physics generator, detector simulation and electronics simulation modules together to achieve a full simulation chain. In the implementation of the framework, many attractive characteristics of the SNiPER have been used, such as dynamic loading, flexible flow control, multiple event management and Python binding. Furthermore, additional efforts have been made to make both detector and electronics simulation flexible enough to accommodate and optimize different detector designs. For the Geant4-based detector simulation, each sub-detector component is implemented as a SNiPER tool which is a dynamically loadable and configurable plugin. So it is possible to select the detector configuration at runtime. The framework provides the event loop to drive the detector simulation and interacts with the Geant4 which is implemented as a passive service. All levels of user actions are wrapped into different customizable tools, so that user functions can be easily extended by just adding new tools. The electronics simulation has been implemented by following an event driven scheme. The SNiPER task component is used to simulate data processing steps in the electronics modules. The electronics and trigger are synchronized by triggered events containing possible physics signals. The JUNO simulation software has been released and is being used by the JUNO collaboration to do detector design optimization, event

  3. A CBI Model for the Design of CAI Software by Teachers/Nonprogrammers.

    ERIC Educational Resources Information Center

    Tessmer, Martin; Jonassen, David H.

    This paper describes a design model presented in workbook form which is intended to facilitate computer-assisted instruction (CAI) software design by teachers who do not have programming experience. Presentation of the model is preceded by a number of assumptions that underlie the instructional content and methods of the textbook. It is argued…

  4. The Role and Design of Screen Images in Software Documentation.

    ERIC Educational Resources Information Center

    van der Meij, Hans

    2000-01-01

    Discussion of learning a new computer software program focuses on how to support the joint handling of a manual, input devices, and screen display. Describes a study that examined three design styles for manuals that included screen images to reduce split-attention problems and discusses theory versus practice and cognitive load theory.…

  5. Design of Control Software for a High-Speed Coherent Doppler Lidar System for CO2 Measurement

    NASA Technical Reports Server (NTRS)

    Vanvalkenburg, Randal L.; Beyon, Jeffrey Y.; Koch, Grady J.; Yu, Jirong; Singh, Upendra N.; Kavaya, Michael J.

    2010-01-01

The design of the software for a 2-micron coherent high-speed Doppler lidar system for CO2 measurement at NASA Langley Research Center is discussed in this paper. The specific strategy and design topology to meet the requirements of the system are reviewed. In order to attain the high-speed digitization of the different types of signals to be sampled on multiple channels, a carefully planned design of the control software is imperative. Samples of digitized data from each channel and their roles in post-processing data analysis are also presented. Several challenges of extremely fast, high-volume data acquisition are discussed. The software must check the validity of each lidar return as well as other monitoring channel data in real time. For such high-speed data acquisition systems, the software is a key component that enables the entire scope of CO2 measurement studies using commercially available system components.

  6. A methodology for efficiency optimization of betavoltaic cell design using an isotropic planar source having an energy dependent beta particle distribution.

    PubMed

    Theirrattanakul, Sirichai; Prelas, Mark

    2017-09-01

    Nuclear batteries based on silicon carbide betavoltaic cells have been studied extensively in the literature. This paper describes an analysis of design parameters, which can be applied to a variety of materials, but is specific to silicon carbide. In order to optimize the interface between a beta source and silicon carbide p-n junction, it is important to account for the specific isotope, angular distribution of the beta particles from the source, the energy distribution of the source as well as the geometrical aspects of the interface between the source and the transducer. In this work, both the angular distribution and energy distribution of the beta particles are modeled using a thin planar beta source (e.g., H-3, Ni-63, S-35, Pm-147, Sr-90, and Y-90) with GEANT4. Previous studies of betavoltaics with various source isotopes have shown that Monte Carlo based codes such as MCNPX, GEANT4 and Penelope generate similar results. GEANT4 is chosen because it has important strengths for the treatment of electron energies below one keV and it is widely available. The model demonstrates the effects of angular distribution, the maximum energy of the beta particle and energy distribution of the beta source on the betavoltaic and it is useful in determining the spatial profile of the power deposition in the cell. Copyright © 2017. Published by Elsevier Ltd.
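The source model described here (energy distribution plus emission angles from a planar source) can be sketched with rejection sampling. The allowed-shape approximation N(E) ∝ √E·(Emax − E)² used below is a schematic stand-in for a real beta spectrum, not the GEANT4 source model from the paper; the Ni-63 endpoint energy is the only physical input.

```python
import math
import random

def sample_beta(emax_kev, n, seed=0):
    """Sample (energy, polar angle) pairs for betas from a thin planar
    source: energies from an allowed-shape approximation
    N(E) ~ sqrt(E) * (Emax - E)^2 by rejection sampling, angles
    isotropic over the forward hemisphere (uniform in cos(theta))."""
    rng = random.Random(seed)

    def shape(e):
        return math.sqrt(e) * (emax_kev - e) ** 2

    # bound for rejection sampling; the shape peaks at E = Emax / 5
    peak = shape(emax_kev / 5.0)
    out = []
    while len(out) < n:
        e = rng.random() * emax_kev
        if rng.random() * peak <= shape(e):
            cos_theta = rng.random()          # isotropic over hemisphere
            out.append((e, math.acos(cos_theta)))
    return out

samples = sample_beta(66.9, 5000)             # Ni-63 endpoint ~66.9 keV
mean_e = sum(e for e, _ in samples) / len(samples)
print(f"mean beta energy ~ {mean_e:.1f} keV")  # roughly Emax / 3 for this shape
```

Transporting each sampled (energy, angle) pair into the semiconductor and tallying the deposited energy versus depth is what yields the power-deposition profile the paper optimizes against.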

  7. Final Scientific/Technical Report for "Enabling Exascale Hardware and Software Design through Scalable System Virtualization"

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dinda, Peter August

    2015-03-17

This report describes the activities, findings, and products of the Northwestern University component of the "Enabling Exascale Hardware and Software Design through Scalable System Virtualization" project. The purpose of this project has been to extend the state of the art of systems software for high-end computing (HEC) platforms, and to use systems software to better enable the evaluation of potential future HEC platforms, for example exascale platforms. Such platforms, and their systems software, have the goal of providing scientific computation at new scales, thus enabling new research in the physical sciences and engineering. Over time, the innovations in systems software for such platforms also become applicable to more widely used computing clusters, data centers, and clouds. This was a five-institution project, centered on the Palacios virtual machine monitor (VMM) systems software, a project begun at Northwestern, and originally developed in a previous collaboration between Northwestern University and the University of New Mexico. In this project, Northwestern (including via our subcontract to the University of Pittsburgh) contributed to the continued development of Palacios, along with other team members. We took the leadership role in (1) continued extension of support for emerging Intel and AMD hardware, (2) integration and performance enhancement of overlay networking, (3) connectivity with architectural simulation, (4) binary translation, and (5) support for modern Non-Uniform Memory Access (NUMA) hosts and guests. We also took a supporting role in support for specialized hardware for I/O virtualization, profiling, configurability, and integration with configuration tools. The efforts we led (1-5) were largely successful and executed as expected, with code and papers resulting from them. The project demonstrated the feasibility of a virtualization layer for HEC computing, similar to such layers for cloud or datacenter computing. For effort (3

  8. Modeling of the Very Low Frequency (VLF) radio wave signal profile due to solar flares using the GEANT4 Monte Carlo simulation coupled with ionospheric chemistry

    NASA Astrophysics Data System (ADS)

    Palit, S.; Basak, T.; Mondal, S. K.; Pal, S.; Chakrabarti, S. K.

    2013-03-01

X-ray photons emitted during solar flares cause ionization in the lower ionosphere (~ 60 to 100 km) in excess of what is expected from a quiet sun. Very Low Frequency (VLF) radio wave signals reflected from the D region are affected by this excess ionization. In this paper, we reproduce the deviation in VLF signal strength during solar flares by numerical modeling. We use the GEANT4 Monte Carlo simulation code to compute the rate of ionization due to an M-class and an X-class flare. The output of the simulation is then used in a simplified ionospheric chemistry model to calculate the time variation of electron density at different altitudes in the lower ionosphere. The resulting electron density variation profile is then self-consistently used in the LWPC code to obtain the time variation of the VLF signal change. We modeled the VLF signal along the NWC (Australia) to IERC/ICSP (India) propagation path and compared the results with observations. The agreement is found to be very satisfactory.

  9. Bias and design in software specifications

    NASA Technical Reports Server (NTRS)

    Straub, Pablo A.; Zelkowitz, Marvin V.

    1990-01-01

    Implementation bias in a specification is an arbitrary constraint in the solution space. Presented here is a model of bias in software specifications. Bias is defined in terms of the specification process and a classification of the attributes of the software product. Our definition of bias provides insight into both the origin and the consequences of bias. It also shows that bias is relative and essentially unavoidable. Finally, we describe current work on defining a measure of bias, formalizing our model, and relating bias to software defects.

  10. Usability analysis of 2D graphics software for designing technical clothing.

    PubMed

    Teodoroski, Rita de Cassia Clark; Espíndola, Edilene Zilma; Silva, Enéias; Moro, Antônio Renato Pereira; Pereira, Vera Lucia D V

    2012-01-01

With the advent of technology, the computer became a working tool increasingly present in companies. Its purpose is to increase production and reduce the errors inherent in manual production. The aim of this study was to analyze the usability of 2D graphics software used by a professional to create clothing designs during his work. The movements of the mouse, keyboard and graphical tools were monitored in real time by the software Camtasia 7® installed on the user's computer. To register the use of the mouse and keyboard we used auxiliary software called MouseMeter®, which quantifies the number of times the right, middle and left mouse buttons and the keyboard were pressed, as well as the distance in metres traveled by the cursor on the screen. Data were collected in periods of 15 minutes, 1 hour and 8 hours, consecutively. The results showed that the job is repetitive and physically demanding, which can lead to the appearance of repetitive strain injuries. Thus, to minimize operator effort and thereby enhance the usability of the examined tool, it becomes imperative to replace the mouse with a tablet, a device that offers an electronic pen and a drawing platform for design development.

  11. Prototype software model for designing intruder detection systems with simulation

    NASA Astrophysics Data System (ADS)

    Smith, Jeffrey S.; Peters, Brett A.; Curry, James C.; Gupta, Dinesh

    1998-08-01

This article explores using discrete-event simulation for the design and control of defence-oriented, fixed-sensor-based detection systems in a facility housing items of significant interest to enemy forces. The key issues discussed include software development, simulation-based optimization within a modeling framework, and the expansion of the framework to create real-time control tools and training simulations. The software discussed in this article is a flexible simulation environment where the data for the simulation are stored in an external database and the simulation logic is implemented using a commercial simulation package. The simulation assesses the overall security level of a building against various intruder scenarios. A series of simulation runs with different inputs can determine the change in security level with changes in the sensor configuration, building layout, and intruder/guard strategies. In addition, the simulation model developed for the design stage of the project can be modified to produce a control tool for the testing, training, and real-time control of systems with humans and sensor hardware in the loop.

  12. Software Design Description for the Polar Ice Prediction System (PIPS) Version 3.0

    DTIC Science & Technology

    2008-11-05

Naval Research Laboratory, Stennis Space Center, MS 39529-5004. Report NRL/MR/7320--08-9150. Approved for public release; distribution is unlimited. Software Design Description for the Polar Ice Prediction System (PIPS) Version 3.0. Pamela G

  13. Finite element prediction on the chassis design of UniART4 racing car

    NASA Astrophysics Data System (ADS)

    Zaman, Z. I.; Basaruddin, K. S.; Basha, M. H.; Rahman, M. T. Abd; Daud, R.

    2017-09-01

This paper presents the analysis and evaluation of the chassis design for the University Automotive Racing Team No. 4 (UniART4) car based on finite element analysis. The existing UniART4 car chassis was measured and modelled geometrically using SolidWorks before being analysed in FEA software (ANSYS). Four types of static structural analysis were used to predict the chassis design capability under four different loading conditions: vertical bending, lateral bending, lateral torsion and horizontal lozenging. The results showed that the chassis was subjected to the highest stress and strain under horizontal lozenging, whereas the minimum stress and strain response was obtained under lateral bending. The present analysis could provide valuable information in predicting the sustainability of the current UniART car chassis design.

  14. Research and Design Issues Concerning the Development of Educational Software for Children. Technical Report No. 14.

    ERIC Educational Resources Information Center

    Char, Cynthia

    Several research and design issues to be considered when creating educational software were identified by a field test evaluation of three types of innovative software created at Bank Street College: (1) Probe, software for measuring and graphing temperature data; (2) Rescue Mission, a navigation game that illustrates the computer's use for…

  15. Design of a secure remote management module for a software-operated medical device.

    PubMed

    Burnik, Urban; Dobravec, Štefan; Meža, Marko

    2017-12-09

    Software-based medical devices need to be maintained throughout their entire life cycle. The efficiency of after-sales maintenance can be improved by managing medical systems remotely. This paper presents how to design the remote access function extensions in order to prevent risks imposed by uncontrolled remote access. A thorough analysis of standards and legislation requirements regarding safe operation and risk management of medical devices is presented. Based on the formal requirements, a multi-layer machine design solution is proposed that eliminates remote connectivity risks by strict separation of regular device functionalities from remote management service, deploys encrypted communication links and uses digital signatures to prevent mishandling of software images. The proposed system may also be used as an efficient version update of the existing medical device designs.
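
    One building block the design relies on is rejecting software images whose signature does not match. A stdlib-only sketch of that integrity gate (the key, image bytes and HMAC construction are illustrative; a real device would verify an asymmetric RSA/ECDSA signature against a vendor public key):

```python
import hashlib
import hmac

def verify_image(image_bytes, expected_digest_hex, key):
    """Refuse any software image whose keyed digest does not match the
    value shipped in the signed update manifest."""
    digest = hmac.new(key, image_bytes, hashlib.sha256).hexdigest()
    # constant-time comparison to avoid leaking how many bytes matched
    return hmac.compare_digest(digest, expected_digest_hex)
```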

  16. Measurements and Monte-Carlo simulations of the particle self-shielding effect of B4C grains in neutron shielding concrete

    NASA Astrophysics Data System (ADS)

    DiJulio, D. D.; Cooper-Jensen, C. P.; Llamas-Jansa, I.; Kazi, S.; Bentley, P. M.

    2018-06-01

    A combined measurement and Monte-Carlo simulation study was carried out in order to characterize the particle self-shielding effect of B4C grains in neutron shielding concrete. Several batches of a specialized neutron shielding concrete, with varying B4C grain sizes, were exposed to a 2 Å neutron beam at the R2D2 test beamline at the Institute for Energy Technology located in Kjeller, Norway. The direct and scattered neutrons were detected with a neutron detector placed behind the concrete blocks and the results were compared to Geant4 simulations. The particle self-shielding effect was included in the Geant4 simulations by calculating effective neutron cross-sections during the Monte-Carlo simulation process. It is shown that this method reproduces the measured results well. Our results show that shielding calculations for low-energy neutrons using such materials would lead to an underestimate of the shielding required for a certain design scenario if the particle self-shielding effect is not included in the calculations.
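
    The effective cross-section idea can be illustrated with the standard slab-style self-shielding factor f = (1 - exp(-Σt))/(Σt), which scales the homogeneous macroscopic cross-section Σ for a grain of chord length t. A sketch with illustrative numbers (the Σ value below is a rough order of magnitude for B4C at cold-neutron energies, not a figure from the article):

```python
import math

def self_shielding_factor(sigma_macro, grain_size):
    """Slab-approximation self-shielding factor f = (1 - exp(-Sigma*t)) / (Sigma*t).
    f -> 1 for small grains (no self-shielding); f -> 0 for large,
    strongly absorbing grains whose interior never sees the beam."""
    x = sigma_macro * grain_size
    if x == 0.0:
        return 1.0
    return (1.0 - math.exp(-x)) / x

# Illustrative: Sigma ~ 84 cm^-1, grain chords of 1 um vs 1 mm
fine = self_shielding_factor(84.0, 1e-4)
coarse = self_shielding_factor(84.0, 0.1)
```

    Folding a factor like f into the material cross-sections during tracking is, in spirit, what the authors do inside Geant4; ignoring it (f = 1) overestimates absorption and hence underestimates the shielding needed.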

  17. Art & Design Software Development Using IBM Handy (A Personal Experience).

    ERIC Educational Resources Information Center

    McWhinnie, Harold J.

    This paper presents some of the results from a course in art and design. The course involved the use of simple computer programs for the arts. Attention was geared to the development of graphic components for educational software. The purpose of the course was to provide, through lectures and extensive hands on experience, a basic introduction to…

  18. Primer3_masker: integrating masking of template sequence with primer design software.

    PubMed

    Kõressaar, Triinu; Lepamets, Maarja; Kaplinski, Lauris; Raime, Kairi; Andreson, Reidar; Remm, Maido

    2018-06-01

    Designing PCR primers for amplifying regions of eukaryotic genomes is a complicated task because the genomes contain a large number of repeat sequences and other regions unsuitable for amplification by PCR. We have developed a novel k-mer based masking method that uses a statistical model to detect and mask failure-prone regions on the DNA template prior to primer design. We implemented the method as a standalone program, primer3_masker, and integrated it into the primer design program Primer3. The standalone version of primer3_masker is implemented in C. The source code is freely available at https://github.com/bioinfo-ut/primer3_masker/ (standalone version for Linux and macOS) and at https://github.com/primer3-org/primer3/ (integrated version). A Primer3 web application that allows masking sequences of 196 animal and plant genomes is available at http://primer3.ut.ee/. Contact: maido.remm@ut.ee. Supplementary data are available at Bioinformatics online.
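
    A toy version of the masking step (the real primer3_masker scores k-mer frequencies with a statistical model; the fixed count cutoff, k = 11, and the in-memory count table below are simplifications for illustration):

```python
def mask_template(seq, kmer_counts, k=11, cutoff=10, mask_char="N"):
    """Mask every position covered by a k-mer that is too frequent in the
    genome; frequent k-mers flag repeats and other failure-prone regions."""
    masked = list(seq)
    for i in range(len(seq) - k + 1):
        if kmer_counts.get(seq[i:i + k], 0) > cutoff:
            for j in range(i, i + k):
                masked[j] = mask_char
    return "".join(masked)

counts = {"A" * 11: 5000}   # pretend this k-mer occurs 5000x genome-wide
masked = mask_template("GGG" + "A" * 11 + "GGG", counts)
```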

  19. Software system design for the non-null digital Moiré interferometer

    NASA Astrophysics Data System (ADS)

    Chen, Meng; Hao, Qun; Hu, Yao; Wang, Shaopu; Li, Tengfei; Li, Lin

    2016-11-01

    Aspheric optical components are an indispensable part of modern optical systems. With the development of aspheric optical element fabrication techniques, high-precision testing of the figure error of aspheric surfaces has become an urgent issue. We proposed a digital Moiré interferometer technique (DMIT) based on the partial-compensation principle for aspheric and freeform surface measurement. Unlike a traditional interferometer, DMIT consists of a real and a virtual interferometer. The virtual interferometer is simulated with Zemax software to perform phase-shifting and alignment. The results are obtained by a series of calculations with the real interferogram and the virtual interferograms generated by computer. DMIT requires a specific, reliable software system to ensure its normal operation. Image acquisition and data processing are two important parts of this system, and realizing the connection between the real and virtual interferometers is also a challenge. In this paper, we present a software system design for DMIT with a friendly user interface and robust data processing features, enabling us to acquire the figure error of the measured asphere. We chose Visual C++ as the software development platform and control the ideal interferometer by hybrid programming with Zemax. After image acquisition and data transmission, the system calls image processing algorithms written in Matlab to calculate the figure error of the measured asphere. We tested the software system experimentally: we measured an aspheric surface and thereby demonstrated the feasibility of the software system.
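
    The phase-shifting step can be illustrated with the classic four-step formula phi = atan2(I4 - I2, I1 - I3) for shifts of 0°, 90°, 180° and 270° (a generic sketch; the article does not state which phase-shifting algorithm DMIT actually uses):

```python
import math

def four_step_phase(i1, i2, i3, i4):
    """Recover the wrapped phase from four interferograms shifted by
    0, 90, 180 and 270 degrees: phi = atan2(I4 - I2, I1 - I3)."""
    return math.atan2(i4 - i2, i1 - i3)

# Synthetic fringe intensities I_k = A + B*cos(phi + k*pi/2), phi = 0.7 rad
A, B, phi = 1.0, 0.5, 0.7
frames = [A + B * math.cos(phi + k * math.pi / 2) for k in range(4)]
recovered = four_step_phase(*frames)
```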

  20. AirSTAR Hardware and Software Design for Beyond Visual Range Flight Research

    NASA Technical Reports Server (NTRS)

    Laughter, Sean; Cox, David

    2016-01-01

    The National Aeronautics and Space Administration (NASA) Airborne Subscale Transport Aircraft Research (AirSTAR) Unmanned Aerial System (UAS) is a facility developed to study the flight dynamics of vehicles in emergency conditions, in support of aviation safety research. The system was upgraded to have its operational range significantly expanded, going beyond the line of sight of a ground-based pilot. A redesign of the airborne flight hardware was undertaken, as well as significant changes to the software base, in order to provide appropriate autonomous behavior in response to a number of potential failures and hazards. Ground hardware and system monitors were also upgraded to include redundant communication links, including ADS-B based position displays and an independent flight termination system. The design included both custom and commercially available avionics, combined to allow flexibility in flight experiment design while still benefiting from tested configurations in reversionary flight modes. A similar hierarchy was employed in the software architecture, to allow research codes to be tested, with a fallback to more thoroughly validated flight controls. As a remotely piloted facility, ground systems were also developed to ensure the flight modes and system state were communicated to ground operations personnel in real-time. Presented in this paper is a general overview of the concept of operations for beyond visual range flight, and a detailed review of the airborne hardware and software design. This discussion is held in the context of the safety and procedural requirements that drove many of the design decisions for the AirSTAR UAS Beyond Visual Range capability.

  1. A microkernel design for component-based parallel numerical software systems.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balay, S.

    1999-01-13

    What is the minimal software infrastructure and what type of conventions are needed to simplify development of sophisticated parallel numerical application codes using a variety of software components that are not necessarily available as source code? We propose an opaque object-based model where the objects are dynamically loadable from the file system or network. The microkernel required to manage such a system needs to include, at most: (1) a few basic services, namely--a mechanism for loading objects at run time via dynamic link libraries, and consistent schemes for error handling and memory management; and (2) selected methods that all objects share, to deal with object life (destruction, reference counting, relationships), and object observation (viewing, profiling, tracing). We are experimenting with these ideas in the context of extensible numerical software within the ALICE (Advanced Large-scale Integrated Computational Environment) project, where we are building the microkernel to manage the interoperability among various tools for large-scale scientific simulations. This paper presents some preliminary observations and conclusions from our work with microkernel design.
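
    The object-lifetime service can be sketched as a registry of opaque handles with reference counting (a toy Python sketch; the actual microkernel manages compiled components loaded at run time from dynamic link libraries):

```python
class ObjectRegistry:
    """Opaque handles plus shared lifetime/observation methods of the
    kind the microkernel gives every object: retain, release, view."""
    def __init__(self):
        self._objects = {}      # handle -> [object, refcount]
        self._next_handle = 1

    def create(self, obj):
        handle = self._next_handle
        self._next_handle += 1
        self._objects[handle] = [obj, 1]
        return handle

    def retain(self, handle):
        self._objects[handle][1] += 1

    def release(self, handle):
        entry = self._objects[handle]
        entry[1] -= 1
        if entry[1] == 0:
            del self._objects[handle]   # last reference gone: destroy

    def alive(self, handle):
        return handle in self._objects

    def view(self, handle):
        return repr(self._objects[handle][0])   # simplest "viewing" method

reg = ObjectRegistry()
h = reg.create({"solver": "gmres"})
reg.retain(h)     # a second component now holds the object
reg.release(h)    # first owner done; object survives
```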

  2. Geant4 simulation of clinical proton and carbon ion beams for the treatment of ocular melanomas with the full 3-D pencil beam scanning system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farina, Edoardo; Riccardi, Cristina; Rimoldi, Adele

    This work investigates the possibility of using carbon ion beams, delivered with active scanning modality, for the treatment of ocular melanomas at the Centro Nazionale di Adroterapia Oncologica (CNAO) in Pavia. Radiotherapy with carbon ions offers many advantages with respect to radiotherapy with protons or photons, such as a higher relative radio-biological effectiveness (RBE) and a dose release better localized to the tumor. The Monte Carlo (MC) Geant4 10.00 patch-03 toolkit is used to reproduce the complete CNAO extraction beam line, including all the active and passive components characterizing it. The simulation of proton and carbon ion beams and of the scanned radiation field is validated against CNAO experimental data. For the irradiation study of the ocular melanoma, an eye-detector representing a model of a human eye is implemented in the simulation. Each element of the eye is reproduced with its chemical and physical properties. Inside the eye-detector a realistic tumor volume is placed and used as the irradiation target. A comparison between proton and carbon ion eye irradiations allows the study of possible treatment benefits if carbon ions are used instead of protons. (authors)

  3. Tools for Embedded Computing Systems Software

    NASA Technical Reports Server (NTRS)

    1978-01-01

    A workshop was held to assess the state of tools for embedded systems software and to determine directions for tool development. A synopsis of the talk and the key figures of each workshop presentation, together with chairmen summaries, are presented. The presentations covered four major areas: (1) tools and the software environment (development and testing); (2) tools and software requirements, design, and specification; (3) tools and language processors; and (4) tools and verification and validation (analysis and testing). The utility and contribution of existing tools and research results for the development and testing of embedded computing systems software are described and assessed.

  4. Digital Modeling in Design Foundation Coursework: An Exploratory Study of the Effectiveness of Conceptual Design Software

    ERIC Educational Resources Information Center

    Guidera, Stan; MacPherson, D. Scot

    2008-01-01

    This paper presents the results of a study that was conducted to identify and document student perceptions of the effectiveness of computer modeling software introduced in a design foundations course that had previously utilized only conventional manually-produced representation techniques. Rather than attempt to utilize a production-oriented CAD…

  5. Design and Pedagogical Issues in the Development of the InSight Series of Instructional Software.

    ERIC Educational Resources Information Center

    Baro, John A.; Lehmkulke, Stephen

    1993-01-01

    Design issues in development of InSight software for optometric education include choice of hardware, identification of audience, definition of scope and limitations of content, selection of user interface and programing environment, obtaining user feedback, and software distribution. Pedagogical issues include practicality and improvement on…

  6. 78 FR 54365 - Uniform Fine Assessment Version 4.0 Software; Calculating Amounts of Civil Penalties for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-03

    ... Version 4.0 Software; Calculating Amounts of Civil Penalties for Violations of Regulations AGENCY: Federal... Agency has begun using the Uniform Fine Assessment (UFA) Version 4.0 software to calculate the amounts of... penalties for violations of the FMCSRs and HMRs and since the mid- 1990's FMCSA has used its UFA software to...

  7. Software cost/resource modeling: Software quality tradeoff measurement

    NASA Technical Reports Server (NTRS)

    Lawler, R. W.

    1980-01-01

    A conceptual framework for treating software quality from a total system perspective is developed. Examples are given to show how system quality objectives may be allocated to hardware and software; to illustrate trades among quality factors, both hardware and software, to achieve system performance objectives; and to illustrate the impact of certain design choices on software functionality.

  8. CARES/Life Software for Designing More Reliable Ceramic Parts

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Powers, Lynn M.; Baker, Eric H.

    1997-01-01

    Products made from advanced ceramics show great promise for revolutionizing aerospace and terrestrial propulsion and power generation. However, ceramic components are difficult to design because brittle materials in general have widely varying strength values. The CARES/Life software eases this task by providing a tool to optimize the design and manufacture of brittle material components using probabilistic reliability analysis techniques. Probabilistic component design involves predicting the probability of failure for a thermomechanically loaded component from specimen rupture data. Typically, these experiments are performed using many simple-geometry flexural or tensile test specimens. A static, dynamic, or cyclic load is applied to each specimen until fracture. Statistical strength and slow crack growth (SCG, fatigue) parameters are then determined from these data. Using these parameters and the results obtained from a finite element analysis, the time-dependent reliability for a complex component geometry and loading is then predicted. Appropriate design changes are made until an acceptable probability of failure has been reached.
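
    The statistical-strength step rests on the two-parameter Weibull distribution, Pf = 1 - exp(-(σ/σ0)^m), for a uniformly stressed specimen (a sketch; the parameter values below are illustrative, and CARES/Life itself integrates risk over the finite element stress field rather than using a single stress):

```python
import math

def weibull_failure_probability(sigma, sigma0, m):
    """Two-parameter Weibull failure probability for a uniformly stressed
    brittle specimen.  m (Weibull modulus) measures strength scatter;
    sigma0 is the characteristic strength; both are fit from rupture data."""
    return 1.0 - math.exp(-((sigma / sigma0) ** m))

# Illustrative numbers only (MPa): service stress well below sigma0
pf_low = weibull_failure_probability(sigma=200.0, sigma0=400.0, m=10.0)
pf_char = weibull_failure_probability(sigma=400.0, sigma0=400.0, m=10.0)
```

    By construction Pf at σ = σ0 is 1 - 1/e ≈ 63.2%, while halving the stress drives Pf down by roughly a factor of 2^m, which is why high-modulus ceramics are much easier to design reliably.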

  9. Parallel design patterns for a low-power, software-defined compressed video encoder

    NASA Astrophysics Data System (ADS)

    Bruns, Michael W.; Hunt, Martin A.; Prasad, Durga; Gunupudi, Nageswara R.; Sonachalam, Sekar

    2011-06-01

    Video compression algorithms such as H.264 offer much potential for parallel processing that is not always exploited by the technology of a particular implementation. Consumer mobile encoding devices often achieve real-time performance and low power consumption through parallel processing in Application Specific Integrated Circuit (ASIC) technology, but many other applications require a software-defined encoder. High quality compression features needed for some applications such as 10-bit sample depth or 4:2:2 chroma format often go beyond the capability of a typical consumer electronics device. An application may also need to efficiently combine compression with other functions such as noise reduction, image stabilization, real time clocks, GPS data, mission/ESD/user data or software-defined radio in a low power, field upgradable implementation. Low power, software-defined encoders may be implemented using a massively parallel memory-network processor array with 100 or more cores and distributed memory. The large number of processor elements allow the silicon device to operate more efficiently than conventional DSP or CPU technology. A dataflow programming methodology may be used to express all of the encoding processes including motion compensation, transform and quantization, and entropy coding. This is a declarative programming model in which the parallelism of the compression algorithm is expressed as a hierarchical graph of tasks with message communication. Data parallel and task parallel design patterns are supported without the need for explicit global synchronization control. An example is described of an H.264 encoder developed for a commercially available, massively parallel memorynetwork processor device.

  10. LADES: a software for constructing and analyzing longitudinal designs in biomedical research.

    PubMed

    Vázquez-Alcocer, Alan; Garzón-Cortes, Daniel Ladislao; Sánchez-Casas, Rosa María

    2014-01-01

    One of the most important steps in biomedical longitudinal studies is choosing a good experimental design that can provide high accuracy in the analysis of results with a minimum sample size. Several methods for constructing efficient longitudinal designs have been developed based on power analysis and the statistical model used for analyzing the final results. However, development of this technology is not available to practitioners through user-friendly software. In this paper we introduce LADES (Longitudinal Analysis and Design of Experiments Software) as an alternative and easy-to-use tool for conducting longitudinal analysis and constructing efficient longitudinal designs. LADES incorporates methods for creating cost-efficient longitudinal designs, unequal longitudinal designs, and simple longitudinal designs. In addition, LADES includes different methods for analyzing longitudinal data such as linear mixed models, generalized estimating equations, among others. A study of European eels is reanalyzed in order to show LADES capabilities. Three treatments contained in three aquariums with five eels each were analyzed. Data were collected from 0 up to the 12th week post treatment for all the eels (complete design). The response under evaluation is sperm volume. A linear mixed model was fitted to the results using LADES. The complete design had a power of 88.7% using 15 eels. With LADES we propose the use of an unequal design with only 14 eels and 89.5% efficiency. LADES was developed as a powerful and simple tool to promote the use of statistical methods for analyzing and creating longitudinal experiments in biomedical research.

  11. A Geant4-based Simulation to Evaluate the Feasibility of Using Nuclear Resonance Fluorescence (NRF) in Determining Atomic Compositions of Body Tissue in Cancer Diagnostics and Irradiation

    NASA Astrophysics Data System (ADS)

    Gilbo, Yekaterina; Wijesooriya, Krishni; Liyanage, Nilanga

    2017-01-01

    Customarily applied in homeland security for identifying concealed explosives and chemical weapons, NRF (Nuclear Resonance Fluorescence) may have high potential in determining atomic compositions of body tissue. High energy photons incident on a target excite the target nuclei causing characteristic re-emission of resonance photons. As the nuclei of each isotope have well-defined excitation energies, NRF uniquely indicates the isotopic content of the target. NRF radiation corresponding to nuclear isotopes present in the human body is emitted during radiotherapy based on Bremsstrahlung photons generated in a linear electron accelerator. We have developed a Geant4 simulation in order to help assess NRF capabilities in detecting, mapping, and characterizing tumors. We have imported a digital phantom into the simulation using anatomical data linked to known chemical compositions of various tissues. Work is ongoing to implement the University of Virginia's cancer center treatment setup and patient geometry, and to collect and analyze the simulation's physics quantities to evaluate the potential of NRF for medical imaging applications. Preliminary results will be presented.

  12. The effect of 111In radionuclide distance and auger electron energy on direct induction of DNA double-strand breaks: a Monte Carlo study using Geant4 toolkit.

    PubMed

    Piroozfar, Behnaz; Raisali, Gholamreza; Alirezapour, Behrouz; Mirzaii, Mohammad

    2018-04-01

    In this study, the effect of 111In position and Auger electron energy on the direct induction of DSBs was investigated. The Geant4-DNA simulation toolkit was applied using a simple B-DNA form extracted from the PDBlib library. First, the simulation was performed for electrons with the energies of 111In and equal emission probabilities to find the most effective electron energies. Then, the actual spectrum of 111In Auger electrons was considered and their contribution to DSB induction analysed. The results showed that the most effective electron energy is 183 eV, but due to the higher emission probability of 350 eV electrons, most of the DSBs were induced by the latter. It was also observed that most of the DSBs are induced by electrons emitted within 4 nm of the central axis of the DNA, and were mainly due to breaks separated by <4 base pairs in opposing strands. When 111In atoms are very close to the DNA, 1.3 DSBs per decay of 111In were obtained. The results show that the most effective Auger electrons are the 350 eV electrons from 111In atoms at <4 nm from the central axis of the DNA, which induce ∼1.3 DSBs per decay when bound to the DNA. This value seems reasonable when compared with the reported experimental data.

  13. Learning Embedded Software Design in an Open 3A Multiuser Laboratory

    ERIC Educational Resources Information Center

    Shih, Chien-Chou; Hwang, Lain-Jinn

    2011-01-01

    The need for professional programmers in embedded applications has become critical for industry growth. This need has increased the popularity of embedded software design courses, which are resource-intensive and space-limited in traditional real lab-based instruction. To overcome geographic and time barriers in enhancing practical skills that…

  14. Designing Spatial Visualisation Tasks for Middle School Students with a 3D Modelling Software: An Instrumental Approach

    ERIC Educational Resources Information Center

    Turgut, Melih; Uygan, Candas

    2015-01-01

    In this work, certain task designs to enhance middle school students' spatial visualisation ability, in the context of an instrumental approach, have been developed. 3D modelling software, SketchUp®, was used. In the design process, software tools were focused on and, thereafter, the aim was to interpret the instrumental genesis and spatial…

  15. Use of software engineering techniques in the design of the ALEPH data acquisition system

    NASA Astrophysics Data System (ADS)

    Charity, T.; McClatchey, R.; Harvey, J.

    1987-08-01

    The SASD methodology is being used to provide a rigorous design framework for various components of the ALEPH data acquisition system. The Entity-Relationship data model is used to describe the layout and configuration of the control and acquisition systems and detector components. State Transition Diagrams are used to specify control applications such as run control and resource management and Data Flow Diagrams assist in decomposing software tasks and defining interfaces between processes. These techniques encourage rigorous software design leading to enhanced functionality and reliability. Improved documentation and communication ensures continuity over the system life-cycle and simplifies project management.

  16. SEI Software Engineering Education Directory.

    DTIC Science & Technology

    1987-02-01

    Software Design and Development, Gilbert, Philip. Systems: CDC Cyber 170/750, CDC Cyber 170/760, DEC PDP 11/44, PRIME, AT&T 3B5, IBM PC, IBM XT, IBM RT ... Macintosh, VAX 8300. Software System Development and Laboratory, CS 480/480L, U P X T. Textbooks: Software Design and Development, Gilbert, Philip. Systems: CDC ... Acting Chair (618) 692-2386. Courses: Software Design and Development, CS 424, U P E Y. Textbooks: Software Design and Development, Gilbert, Philip. Topics

  17. FMT (Flight Software Memory Tracker) For Cassini Spacecraft-Software Engineering Using JAVA

    NASA Technical Reports Server (NTRS)

    Kan, Edwin P.; Uffelman, Hal; Wax, Allan H.

    1997-01-01

    The software engineering design of the Flight Software Memory Tracker (FMT) Tool is discussed in this paper. FMT is a ground analysis software set, consisting of utilities and procedures, designed to track the flight software, i.e., images of memory load and updatable parameters of the computers on-board Cassini spacecraft. FMT is implemented in Java.

  18. Parallel software for lattice N = 4 supersymmetric Yang-Mills theory

    NASA Astrophysics Data System (ADS)

    Schaich, David; DeGrand, Thomas

    2015-05-01

    We present new parallel software, SUSY LATTICE, for lattice studies of four-dimensional N = 4 supersymmetric Yang-Mills theory with gauge group SU(N). The lattice action is constructed to exactly preserve a single supersymmetry charge at non-zero lattice spacing, up to additional potential terms included to stabilize numerical simulations. The software evolved from the MILC code for lattice QCD, and retains a similar large-scale framework despite the different target theory. Many routines are adapted from an existing serial code (Catterall and Joseph, 2012), which SUSY LATTICE supersedes. This paper provides an overview of the new parallel software, summarizing the lattice system, describing the applications that are currently provided and explaining their basic workflow for non-experts in lattice gauge theory. We discuss the parallel performance of the code, and highlight some notable aspects of the documentation for those interested in contributing to its future development.

  19. An investigation of modelling and design for software service applications.

    PubMed

    Anjum, Maria; Budgen, David

    2017-01-01

    Software services offer the opportunity to use a component-based approach for the design of applications. However, this needs a deeper understanding of how to develop service-based applications in a systematic manner, and of the set of properties that need to be included in the 'design model'. We have used a realistic application to explore systematically how service-based designs can be created and described. We first identified the key properties of an SOA (service oriented architecture) and then undertook a single-case case study to explore its use in the development of a design for a large-scale application in energy engineering, modelling this with existing notations wherever possible. We evaluated the resulting design model using two walkthroughs with both domain and application experts. We were able to successfully develop a design model around the ten properties identified, and to describe it by adapting existing design notations. A component-based approach to designing such systems does appear to be feasible. However, it needs the assistance of a more integrated set of notations for describing the resulting design model.

  20. An investigation of modelling and design for software service applications

    PubMed Central

    2017-01-01

    Software services offer the opportunity to use a component-based approach for the design of applications. However, this needs a deeper understanding of how to develop service-based applications in a systematic manner, and of the set of properties that need to be included in the ‘design model’. We have used a realistic application to explore systematically how service-based designs can be created and described. We first identified the key properties of an SOA (service oriented architecture) and then undertook a single-case case study to explore its use in the development of a design for a large-scale application in energy engineering, modelling this with existing notations wherever possible. We evaluated the resulting design model using two walkthroughs with both domain and application experts. We were able to successfully develop a design model around the ten properties identified, and to describe it by adapting existing design notations. A component-based approach to designing such systems does appear to be feasible. However, it needs the assistance of a more integrated set of notations for describing the resulting design model. PMID:28489905

  1. Lateral variations of radiobiological properties of therapeutic fields of 1H, 4He, 12C and 16O ions studied with Geant4 and microdosimetric kinetic model

    NASA Astrophysics Data System (ADS)

    Dewey, Sophie; Burigo, Lucas; Pshenichnov, Igor; Mishustin, Igor; Bleicher, Marcus

    2017-07-01

    As is known, in cancer therapy with ion beams the relative biological effectiveness (RBE) of the ions changes in the course of their propagation in tissue. Such changes are caused not only by the increase of the linear energy transfer (LET) of the beam particles with penetration depth towards the Bragg peak, but also by nuclear reactions induced by the beam nuclei, which produce various secondary particles. Although the changes of RBE along the beam axis have been studied quite well, much less attention has been paid to the evolution of RBE in the transverse direction, perpendicular to the beam axis. In order to fill this gap, we simulated radiation fields of 1H, 4He, 12C and 16O nuclei of 20 mm in diameter by means of a Geant4-based Monte Carlo model for heavy-ion therapy coupled to the modified microdosimetric kinetic model to describe the response of normal ((α/β)x-rays = 3.8 Gy) and early-responding ((α/β)x-rays = 10 Gy) tissues. Depth and radial distributions of the saturation-corrected dose-mean lineal energy, RBE and RBE-weighted dose are investigated for passive beam shaping and active beam scanning. The field of 4He has a small lateral spread compared with the 1H field, and it is characterised by a modest lateral variation of RBE, suggesting the use of fixed RBE values across the transverse cross section of the field at each depth. Reduced uncertainties of RBE on the boundary of a 4He treatment field can be advantageous in the specific case of an organ at risk located in lateral proximity to the target volume. It is found that the lateral distributions of RBE calculated for 12C and 16O fields demonstrate fast variations in the radial direction due to changes of dose and composition of secondary fragments in the field penumbra. Nevertheless, the radiation fields of all four projectiles at radii larger than 20 mm can be characterised by a common RBE value defined by tissue radiosensitivity. These findings can help, in particular, in assessing the transverse…
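
    The RBE bookkeeping in such studies usually runs through the linear-quadratic model: solve αD + βD² = -ln S for the dose at a fixed survival level, and take the photon-to-ion dose ratio. A sketch (the α, β values are illustrative; in the modified microdosimetric kinetic model the ion α is derived from the saturation-corrected dose-mean lineal energy y*, which is what makes RBE vary across the field):

```python
import math

def lq_dose(alpha, beta, survival):
    """Dose giving target survival S under the linear-quadratic model:
    the positive root of alpha*D + beta*D^2 = -ln(S)."""
    ln_s = -math.log(survival)
    return (-alpha + math.sqrt(alpha * alpha + 4.0 * beta * ln_s)) / (2.0 * beta)

def rbe(alpha_ion, beta_ion, alpha_x, beta_x, survival=0.1):
    """Iso-survival RBE: photon dose / ion dose at the same survival."""
    return lq_dose(alpha_x, beta_x, survival) / lq_dose(alpha_ion, beta_ion, survival)

# Illustrative: photon alpha/beta = 0.19/0.05 = 3.8 Gy, matching the
# "normal tissue" ratio above; ion alpha raised by the higher lineal energy
r = rbe(alpha_ion=0.6, beta_ion=0.05, alpha_x=0.19, beta_x=0.05)
```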

  2. Building the Scientific Modeling Assistant: An interactive environment for specialized software design

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.

    1991-01-01

    The construction of scientific software models is an integral part of doing science, both within NASA and within the scientific community at large. Typically, model-building is a time-intensive and painstaking process, involving the design of very large, complex computer programs. Despite the considerable expenditure of resources involved, completed scientific models cannot easily be distributed and shared with the larger scientific community due to the low-level, idiosyncratic nature of the implemented code. To address this problem, we have initiated a research project aimed at constructing a software tool called the Scientific Modeling Assistant. This tool provides automated assistance to the scientist in developing, using, and sharing software models. We describe the Scientific Modeling Assistant, and also touch on some human-machine interaction issues relevant to building a successful tool of this type.

  3. MONTE: the next generation of mission design and navigation software

    NASA Astrophysics Data System (ADS)

    Evans, Scott; Taber, William; Drain, Theodore; Smith, Jonathon; Wu, Hsi-Cheng; Guevara, Michelle; Sunseri, Richard; Evans, James

    2018-03-01

    The Mission analysis, Operations and Navigation Toolkit Environment (MONTE) (Sunseri et al. in NASA Tech Briefs 36(9), 2012) is an astrodynamic toolkit produced by the Mission Design and Navigation Software Group at the Jet Propulsion Laboratory. It provides a single integrated environment for all phases of deep space and Earth orbiting missions. Capabilities include: trajectory optimization and analysis, operational orbit determination, flight path control, and 2D/3D visualization. MONTE is presented to the user as an importable Python language module. This allows a simple but powerful user interface via CLUI or script. In addition, the Python interface allows MONTE to be used seamlessly with other canonical scientific programming tools such as SciPy, NumPy, and Matplotlib. MONTE is the prime operational orbit determination software for all JPL navigated missions.

  4. Computer Game Theories for Designing Motivating Educational Software: A Survey Study

    ERIC Educational Resources Information Center

    Ang, Chee Siang; Rao, G. S. V. Radha Krishna

    2008-01-01

    The purpose of this study is to evaluate computer game theories for educational software. We propose a framework for designing engaging educational games based on contemporary game studies which includes ludology and narratology. Ludology focuses on the study of computer games as play and game activities, while narratology revolves around the…

  5. Fast Photon Monte Carlo for Water Cherenkov Detectors

    NASA Astrophysics Data System (ADS)

    Latorre, Anthony; Seibert, Stanley

    2012-03-01

    We present Chroma, a high-performance optical photon simulation for large particle physics detectors, such as the water Cherenkov far detector option for LBNE. This software takes advantage of the CUDA parallel computing platform to propagate photons using modern graphics processing units. In a computer model of a 200 kiloton water Cherenkov detector with 29,000 photomultiplier tubes, Chroma can propagate 2.5 million photons per second, around 200 times faster than the same simulation with Geant4. Chroma uses a surface-based approach to modeling geometry, which offers many benefits over the solid-based approach used in other simulations such as Geant4.
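    The surface-based approach mentioned in this record represents detector geometry as a triangle mesh and propagates each photon by finding its nearest ray/triangle intersection. A minimal sketch of the classic Möller-Trumbore intersection test that such trackers rely on is below; this is an illustration of the general technique, not Chroma's actual code.

    ```python
    def ray_triangle(origin, direction, v0, v1, v2, eps=1e-9):
        """Moller-Trumbore ray/triangle intersection.

        Returns the distance t along the ray to the hit point, or None on a
        miss. A surface-based tracker propagates a photon by testing it
        against the triangles of a detector mesh with a routine like this.
        """
        def sub(a, b):
            return [a[i] - b[i] for i in range(3)]

        def cross(a, b):
            return [a[1] * b[2] - a[2] * b[1],
                    a[2] * b[0] - a[0] * b[2],
                    a[0] * b[1] - a[1] * b[0]]

        def dot(a, b):
            return sum(a[i] * b[i] for i in range(3))

        e1, e2 = sub(v1, v0), sub(v2, v0)
        p = cross(direction, e2)
        det = dot(e1, p)
        if abs(det) < eps:          # ray parallel to the triangle plane
            return None
        inv = 1.0 / det
        tvec = sub(origin, v0)
        u = dot(tvec, p) * inv      # first barycentric coordinate
        if u < 0.0 or u > 1.0:
            return None
        q = cross(tvec, e1)
        v = dot(direction, q) * inv  # second barycentric coordinate
        if v < 0.0 or u + v > 1.0:
            return None
        t = dot(e2, q) * inv
        return t if t > eps else None

    # A photon travelling along +z from below the triangle hits it at t = 1.
    t = ray_triangle([0.2, 0.2, -1.0], [0.0, 0.0, 1.0],
                     [0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0])
    print(t)  # → 1.0
    ```

    On a GPU, thousands of such tests run in parallel, one photon per thread, which is where the reported speedup over CPU-based solid modeling comes from.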

  6. The effect of electronic health record software design on resident documentation and compliance with evidence-based medicine.

    PubMed

    Rodriguez Torres, Yasaira; Huang, Jordan; Mihlstin, Melanie; Juzych, Mark S; Kromrei, Heidi; Hwang, Frank S

    2017-01-01

    This study aimed to determine the role of electronic health record software in resident education by evaluating documentation of 30 elements extracted from the American Academy of Ophthalmology Dry Eye Syndrome Preferred Practice Pattern. The Kresge Eye Institute transitioned to using electronic health record software in June 2013. We evaluated the charts of 331 patients examined in the resident ophthalmology clinic between September 1, 2011, and March 31, 2014, for an initial evaluation for dry eye syndrome. We compared documentation rates for the 30 evidence-based elements across electronic health record chart note templates used by the ophthalmology residents. Overall, significant changes in documentation occurred when transitioning to a new version of the electronic health record software, with average compliance ranging from 67.4% to 73.6% (p < 0.0005). Electronic Health Record A had high compliance (>90%) in 13 elements, while Electronic Health Record B had high compliance (>90%) in 11 elements. The presence of dialog boxes was responsible for significant changes in documentation of adnexa, puncta, proptosis, skin examination, contact lens wear, and smoking exposure. Significant differences in documentation were correlated with electronic health record template design rather than with the individual resident or the residents' year in training. Our results show that electronic health record template design influences documentation across all resident years. Decreased documentation likely results from "mouse click fatigue," as residents had to access multiple dialog boxes to complete documentation. These findings highlight the importance of electronic health record template design for improving resident documentation and the integration of evidence-based medicine into their clinical notes.

  7. Design of single phase inverter using microcontroller assisted by data processing applications software

    NASA Astrophysics Data System (ADS)

    Ismail, K.; Muharam, A.; Amin; Widodo Budi, S.

    2015-12-01

    Inverters are widely used for industrial, office, and residential purposes. Inverters support the development of alternative energy sources such as solar cells, wind turbines and fuel cells by converting DC voltage to AC voltage. Inverters have been built with a variety of hardware and software combinations, such as pure analog circuits and various types of microcontrollers as controllers. When a pure analog circuit is used, modification is difficult because it requires changing the entire set of hardware components. In a microcontroller-based (software) design, the calculations that generate the AC modulation are performed in the microcontroller. This increases programming complexity and the amount of code downloaded to the microcontroller chip, whose flash memory capacity is limited. This paper discusses the design of a single-phase inverter using unipolar modulation of a sine wave and a triangular wave, computed outside the microcontroller using a data processing application (Microsoft Excel). Results show that programming complexity was reduced and that the sampling resolution strongly influences the THD; a sampling resolution of half a degree is required to obtain the best THD (15.8%).
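    The offline computation described in this record can be sketched in a few lines: a sine reference is compared against a triangular carrier to build a unipolar switching table, and the THD of the resulting waveform is estimated from its Fourier coefficients. The sketch below uses Python rather than Excel, and the carrier ratio and harmonic count are illustrative assumptions, not values from the paper; only the half-degree sampling step comes from the abstract.

    ```python
    import math

    STEP_DEG = 0.5       # sampling resolution (the paper reports 1/2 degree is best)
    CARRIER_RATIO = 20   # triangle periods per sine period -- hypothetical choice

    def triangle(phase):
        """Unit triangle wave in [-1, 1], phase in fractions of a period."""
        p = phase % 1.0
        return 4.0 * p - 1.0 if p < 0.5 else 3.0 - 4.0 * p

    # Build the unipolar PWM lookup table: +1/0/-1 switching states obtained
    # by comparing |sin| against the carrier, sign taken from the sine.
    n = int(360.0 / STEP_DEG)
    samples = []
    for i in range(n):
        ref = math.sin(math.radians(i * STEP_DEG))
        car = (triangle(i * CARRIER_RATIO / n) + 1.0) / 2.0  # carrier in [0, 1]
        samples.append(math.copysign(1.0, ref) if abs(ref) > car else 0.0)

    # Estimate THD from Fourier coefficients of the switched waveform.
    def harmonic_amplitude(sig, k):
        m = len(sig)
        a = sum(s * math.cos(2 * math.pi * k * i / m) for i, s in enumerate(sig)) * 2 / m
        b = sum(s * math.sin(2 * math.pi * k * i / m) for i, s in enumerate(sig)) * 2 / m
        return math.hypot(a, b)

    fund = harmonic_amplitude(samples, 1)
    harm = math.sqrt(sum(harmonic_amplitude(samples, k) ** 2 for k in range(2, 50)))
    print(f"THD = {100 * harm / fund:.1f}%")
    ```

    The table of +1/0/-1 states is what would be precomputed offline and stored in the microcontroller's flash, so that the device only has to step through the lookup table instead of computing the modulation in real time.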

  8. An Interactive and Automated Software Development Environment.

    DTIC Science & Technology

    1982-12-01

    four levels. Each DFD has an accompanying textual description to aid the reader in understanding the diagram. Both the data flows and the operations...students using the SDW for major software developments. Students in the software engineering courses use the SDW as a pedagogical tool for learning the...the SDWE. For this reason, the modified SDWE Algorithmic Design is included as Appendix F. 4.4 Design of the Project Data Base The Project

  9. Review of Software Tools for Design and Analysis of Large scale MRM Proteomic Datasets

    PubMed Central

    Colangelo, Christopher M.; Chung, Lisa; Bruce, Can; Cheung, Kei-Hoi

    2013-01-01

    Selective or Multiple Reaction Monitoring (SRM/MRM) is a liquid-chromatography (LC)/tandem-mass spectrometry (MS/MS) method that enables the quantitation of specific proteins in a sample by analyzing precursor ions and the fragment ions of their selected tryptic peptides. Instrumentation software has advanced to the point that thousands of transitions (pairs of primary and secondary m/z values) can be measured in a triple quadrupole instrument coupled to an LC, by a well-designed scheduling and selection of m/z windows. The design of a good MRM assay relies on the availability of peptide spectra from previous discovery-phase LC-MS/MS studies. The tedious aspect of manually developing and processing MRM assays involving thousands of transitions has spurred the development of software tools to automate this process. Software packages have been developed for project management, assay development, assay validation, data export, peak integration, quality assessment, and biostatistical analysis. No single tool provides a complete end-to-end solution; thus, this article reviews the current state and discusses future directions of these software tools in order to enable researchers to combine them into a comprehensive targeted proteomics workflow. PMID:23702368

  10. Systems biology driven software design for the research enterprise.

    PubMed

    Boyle, John; Cavnor, Christopher; Killcoyne, Sarah; Shmulevich, Ilya

    2008-06-25

    In systems biology, and many other areas of research, there is a need for the interoperability of tools and data sources that were not originally designed to be integrated. Due to the interdisciplinary nature of systems biology, and its association with high throughput experimental platforms, there is an additional need to continually integrate new technologies. As scientists work in isolated groups, integration with other groups is rarely a consideration when building the required software tools. We illustrate an approach, through the discussion of a purpose-built software architecture, which allows disparate groups to reuse tools and access data sources in a common manner. The architecture allows for: the rapid development of distributed applications; interoperability, so it can be used by a wide variety of developers and computational biologists; development using standard tools, so that it is easy to maintain and does not require a large development effort; extensibility, so that new technologies and data types can be incorporated; and non-intrusive development, insofar as researchers need not adhere to a pre-existing object model. By using a relatively simple integration strategy, based upon a common identity system and dynamically discovered interoperable services, a lightweight software architecture can become the focal point through which scientists can both get access to and analyse the plethora of experimentally derived data.

  11. Systems biology driven software design for the research enterprise

    PubMed Central

    Boyle, John; Cavnor, Christopher; Killcoyne, Sarah; Shmulevich, Ilya

    2008-01-01

    Background In systems biology, and many other areas of research, there is a need for the interoperability of tools and data sources that were not originally designed to be integrated. Due to the interdisciplinary nature of systems biology, and its association with high throughput experimental platforms, there is an additional need to continually integrate new technologies. As scientists work in isolated groups, integration with other groups is rarely a consideration when building the required software tools. Results We illustrate an approach, through the discussion of a purpose-built software architecture, which allows disparate groups to reuse tools and access data sources in a common manner. The architecture allows for: the rapid development of distributed applications; interoperability, so it can be used by a wide variety of developers and computational biologists; development using standard tools, so that it is easy to maintain and does not require a large development effort; extensibility, so that new technologies and data types can be incorporated; and non-intrusive development, insofar as researchers need not adhere to a pre-existing object model. Conclusion By using a relatively simple integration strategy, based upon a common identity system and dynamically discovered interoperable services, a lightweight software architecture can become the focal point through which scientists can both get access to and analyse the plethora of experimentally derived data. PMID:18578887

  12. Detailed design and first tests of the application software for the instrument control unit of Euclid-NISP

    NASA Astrophysics Data System (ADS)

    Ligori, S.; Corcione, L.; Capobianco, V.; Bonino, D.; Sirri, G.; Fornari, F.; Giacomini, F.; Patrizii, L.; Valenziano, L.; Travaglini, R.; Colodro, C.; Bortoletto, F.; Bonoli, C.; Chiarusi, T.; Margiotta, A.; Mauri, N.; Pasqualini, L.; Spurio, M.; Tenti, M.; Dal Corso, F.; Dusini, S.; Laudisio, F.; Sirignano, C.; Stanco, L.; Ventura, S.; Auricchio, N.; Balestra, A.; Franceschi, E.; Morgante, G.; Trifoglio, M.; Medinaceli, E.; Guizzo, G. P.; Debei, S.; Stephen, J. B.

    2016-07-01

    In this paper we describe the detailed design of the application software (ASW) of the instrument control unit (ICU) of NISP, the Near-Infrared Spectro-Photometer of the Euclid mission. This software is based on a real-time operating system (RTEMS) and will interface with all the subunits of NISP, as well as the command and data management unit (CDMU) of the spacecraft for telecommand and housekeeping management. We briefly review the main requirements driving the design and the architecture of the software that is approaching the Critical Design Review level. The interaction with the data processing unit (DPU), which is the intelligent subunit controlling the detector system, is described in detail, as well as the concept for the implementation of the failure detection, isolation and recovery (FDIR) algorithms. The first version of the software is under development on a Breadboard model produced by AIRBUS/CRISA. We describe the results of the tests and the main performances and budgets.

  13. Design and Applications of Rapid Image Tile Producing Software Based on Mosaic Dataset

    NASA Astrophysics Data System (ADS)

    Zha, Z.; Huang, W.; Wang, C.; Tang, D.; Zhu, L.

    2018-04-01

    Map tile technology is widely used in web geographic information services, and producing map tiles efficiently is a key technology for the rapid serving of images on the web. In this paper, rapid tile-producing software for image data based on a mosaic dataset is designed, and the tile-producing workflow is given. Key technologies such as cluster processing, map representation, tile checking, tile conversion and in-memory compression are discussed. Implemented in software and tested with actual image data, the results show that this software has a high degree of automation, effectively reduces the number of I/O operations, and improves tile-producing efficiency. Moreover, manual operations are reduced significantly.

  14. Detection And Mapping (DAM) package. Volume 4A: Software System Manual, part 1

    NASA Technical Reports Server (NTRS)

    Schlosser, E. H.

    1980-01-01

    The package is an integrated set of manual procedures, computer programs, and graphic devices designed for efficient production of precisely registered and formatted maps from digital LANDSAT multispectral scanner (MSS) data. The software can be readily implemented on any Univac 1100 series computer with standard peripheral equipment. This version of the software includes predefined spectral limits for use in classifying and mapping surface water for LANDSAT-1, LANDSAT-2, and LANDSAT-3. Tape formats supported include X, AM, and PM.

  15. Evaluation of the dose perturbation around gold and steel fiducial markers in a medical linac through Geant4 Monte Carlo simulation.

    PubMed

    Pontoriero, Antonio; Amato, Ernesto; Iatí, Giuseppe; De Renzis, Costantino; Pergolizzi, Stefano

    2015-01-01

    The purpose of this work was to study the dose perturbation within the target volume of an external MV radiation therapy treatment when metal fiducials are used. We developed a Geant4 Monte Carlo simulation of a cylindrical fiducial made either of gold or of steel and simulated the photon irradiation beam originating from a medical linac operating at 6, 10 or 15 MV. For each energy, two different irradiation schemes were simulated: a single 5 × 5-cm square field in the -x direction, and five 5 × 5-cm fields at 0°, 80°, 165°, 195° and 280°. In the single-beam irradiation scheme, we observed a dose reduction behind the fiducial varying from -20% for gold at 6 MV to -5% for steel at 15 MV, and a dose increment in front of the fiducial ranging from +33% for gold at 15 MV to +10% for steel at 6 MV. When five beams were employed, a dose increment ranging from +28% to +46% was found around gold. Around a steel fiducial, an average increment of +17% was found, irrespective of the photon energy. When a single beam is used, the decrement of dose behind both steel and gold markers increases with the photon energy. This effect vanishes when a multifield treatment is delivered; in this instance there is a dose increment around the fiducial, depending on both the fiducial material and the photon energy, with lower values for steel and 6 MV. This energy represents the best choice when fiducial markers are present inside the irradiated volume.

  16. The Design and Development of a Web-Interface for the Software Engineering Automation System

    DTIC Science & Technology

    2001-09-01

    application on the Internet. 14. SUBJECT TERMS Computer Aided Prototyping, Real Time Systems, Java 15. NUMBER OF...difficult. Developing the entire system only to find it does not meet the customer's needs is a tremendous waste of time. Real-time systems need a...software prototyping is an iterative software development methodology utilized to improve the analysis and design of real-time systems [2]. One

  17. Influence of the geometrical detail in the description of DNA and the scoring method of ionization clustering on nanodosimetric parameters of track structure: a Monte Carlo study using Geant4-DNA

    NASA Astrophysics Data System (ADS)

    Bueno, M.; Schulte, R.; Meylan, S.; Villagrasa, C.

    2015-11-01

    The aim of this study was to evaluate the influence of the geometrical detail of the DNA on nanodosimetric parameters of track structure induced by protons and alpha particles of different energies (LET values ranging from 1 to 162.5 keV μm^-1) as calculated by Geant4-DNA Monte Carlo simulations. The first geometry considered (GeomHist) consisted of a well-structured placement of a realistic description of the DNA double helix wrapped around cylindrical histones, forming an 18 kbp-long chromatin fiber. In the second geometry (GeomCyl), the DNA was modeled as a total of 1800 ten-bp-long homogeneous cylinders (2.3 nm diameter and 3.4 nm height) placed at random positions and orientations; as for GeomHist, GeomCyl contained DNA material equivalent to 18 kbp. Geant4-DNA track structure simulations were performed and ionizations were counted in the scoring volumes. For GeomCyl, clusters were defined as the number of ionizations (ν) scored in each 10 bp-long cylinder. For GeomHist, clusters of ionizations scored in the sugar-phosphate groups of the double helix were identified by the DBSCAN clustering algorithm according to a proximity criterion of less than 10 bp between ionizations. The topologies of the ionization clusters formed in the GeomHist and GeomCyl geometries were compared in terms of biologically relevant nanodosimetric quantities. The discontinuous modeling of the DNA in GeomCyl led to smaller cluster sizes than in GeomHist. The continuous modeling of the DNA molecule in GeomHist allowed the merging of ionization points by the DBSCAN algorithm, giving rise to larger clusters that were not detectable within the GeomCyl geometry. The mean cluster size (m1) was found to be of the order of 10% higher for GeomHist than for GeomCyl for LET < 15 keV μm^-1. For higher LETs, the difference increased with LET similarly for protons and alpha particles. Both geometries showed the same relationship
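    The GeomHist scoring step can be illustrated with a simplified one-dimensional stand-in for the DBSCAN criterion described in this record: ionizations less than 10 bp apart along the DNA are chained into the same cluster, applied transitively. The positions below are hypothetical, and this single-linkage sweep is an illustration of the proximity rule, not the study's actual multi-dimensional DBSCAN implementation.

    ```python
    def cluster_ionizations(positions_bp, eps_bp=10):
        """Group ionization positions (in base pairs along the DNA) into clusters.

        Two ionizations belong to the same cluster if they are separated by
        less than eps_bp, applied transitively (single-linkage chaining, the
        1-D analogue of the DBSCAN proximity criterion).
        """
        clusters = []
        for pos in sorted(positions_bp):
            if clusters and pos - clusters[-1][-1] < eps_bp:
                clusters[-1].append(pos)   # extend the current cluster
            else:
                clusters.append([pos])     # start a new cluster
        return clusters

    # Hypothetical ionization positions from a single track segment.
    hits = [3, 7, 12, 40, 44, 120]
    clusters = cluster_ionizations(hits)
    sizes = [len(c) for c in clusters]
    print(sizes)                     # → [3, 2, 1]   cluster sizes (nu)
    print(sum(sizes) / len(sizes))   # → 2.0         mean cluster size (m1)
    ```

    The merging behaviour the study attributes to GeomHist is visible here: positions 3, 7 and 12 chain into one cluster even though 3 and 12 alone would be too far apart, whereas independent fixed-length cylinders (GeomCyl) could split them.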

  18. Developing Engineering and Science Process Skills Using Design Software in an Elementary Education

    NASA Astrophysics Data System (ADS)

    Fusco, Christopher

    This paper examines the development of process skills through an engineering design approach to instruction in an elementary lesson that combines Science, Technology, Engineering, and Math (STEM). The study took place with 25 fifth graders in a public, suburban school district. Students worked in groups of five to design and construct model bridges based on research involving bridge building design software. The assessment was framed around individual student success as well as overall group processing skills. These skills were assessed through an engineering design packet rubric (student work), student surveys of learning gains, observation field notes, and pre- and post-assessment data. The results indicate that students can successfully utilize design software to inform constructions of model bridges, develop science process skills through problem-based learning, and understand academic concepts through a design project. The final result of this study shows that design engineering is effective for developing cooperative learning skills. The study suggests that an engineering program offered as an elective or as part of the mandatory curriculum could be beneficial for developing students' critical thinking and inter- and intra-personal skills, along with increased understanding and awareness of scientific phenomena. In conclusion, combining a design approach to instruction with STEM can increase efficiency in these areas, generate meaningful learning, and influence student attitudes throughout their education.

  19. Software system safety

    NASA Technical Reports Server (NTRS)

    Uber, James G.

    1988-01-01

    Software itself is not hazardous, but since software and hardware share common interfaces there is an opportunity for software to create hazards. Further, these software systems are complex, and proven methods for the design, analysis, and measurement of software safety are not yet available. Some past software failures, future NASA software trends, software engineering methods, and tools and techniques for various software safety analyses are reviewed. Recommendations to NASA are made based on this review.

  20. A federated design for a neurobiological simulation engine: the CBI federated software architecture.

    PubMed

    Cornelis, Hugo; Coop, Allan D; Bower, James M

    2012-01-01

    Simulator interoperability and extensibility has become a growing requirement in computational biology. To address this, we have developed a federated software architecture. It is federated by its union of independent disparate systems under a single cohesive view, provides interoperability through its capability to communicate, execute programs, or transfer data among different independent applications, and supports extensibility by enabling simulator expansion or enhancement without the need for major changes to system infrastructure. Historically, simulator interoperability has relied on development of declarative markup languages such as the neuron modeling language NeuroML, while simulator extension typically occurred through modification of existing functionality. The software architecture we describe here allows for both these approaches. However, it is designed to support alternative paradigms of interoperability and extensibility through the provision of logical relationships and defined application programming interfaces. They allow any appropriately configured component or software application to be incorporated into a simulator. The architecture defines independent functional modules that run stand-alone. They are arranged in logical layers that naturally correspond to the occurrence of high-level data (biological concepts) versus low-level data (numerical values) and distinguish data from control functions. The modular nature of the architecture and its independence from a given technology facilitates communication about similar concepts and functions for both users and developers. It provides several advantages for multiple independent contributions to software development. Importantly, these include: (1) Reduction in complexity of individual simulator components when compared to the complexity of a complete simulator, (2) Documentation of individual components in terms of their inputs and outputs, (3) Easy removal or replacement of unnecessary or