Sample records for Monte Carlo simulation software

  1. Three-dimensional electron microscopy simulation with the CASINO Monte Carlo software.

    PubMed

    Demers, Hendrix; Poirier-Demers, Nicolas; Couture, Alexandre Réal; Joly, Dany; Guilmain, Marc; de Jonge, Niels; Drouin, Dominique

    2011-01-01

    Monte Carlo software packages are widely used to understand the capabilities of electron microscopes. To study more realistic applications with complex samples, 3D Monte Carlo software is needed. In this article, the development of the 3D version of CASINO is presented. The software features a graphical user interface, an efficient (in relation to simulation time and memory use) 3D simulation model, and accurate physics models for electron microscopy applications, and it is freely available to the scientific community at this website: www.gel.usherbrooke.ca/casino/index.html. It can be used to model backscattered, secondary, and transmitted electron signals as well as absorbed energy. Software features such as scan points and shot noise allow the simulation and study of realistic experimental conditions. This software has an improved energy range for scanning electron microscopy and scanning transmission electron microscopy applications. Copyright © 2011 Wiley Periodicals, Inc.

  2. Three-Dimensional Electron Microscopy Simulation with the CASINO Monte Carlo Software

    PubMed Central

    Demers, Hendrix; Poirier-Demers, Nicolas; Couture, Alexandre Réal; Joly, Dany; Guilmain, Marc; de Jonge, Niels; Drouin, Dominique

    2011-01-01

    Monte Carlo software packages are widely used to understand the capabilities of electron microscopes. To study more realistic applications with complex samples, 3D Monte Carlo software is needed. In this paper, the development of the 3D version of CASINO is presented. The software features a graphical user interface, an efficient (in relation to simulation time and memory use) 3D simulation model, and accurate physics models for electron microscopy applications, and it is freely available to the scientific community at this website: www.gel.usherbrooke.ca/casino/index.html. It can be used to model backscattered, secondary, and transmitted electron signals as well as absorbed energy. Software features such as scan points and shot noise allow the simulation and study of realistic experimental conditions. This software has an improved energy range for scanning electron microscopy and scanning transmission electron microscopy applications. PMID:21769885

  3. Massively parallelized Monte Carlo software to calculate the light propagation in arbitrarily shaped 3D turbid media

    NASA Astrophysics Data System (ADS)

    Zoller, Christian; Hohmann, Ansgar; Ertl, Thomas; Kienle, Alwin

    2017-07-01

    The Monte Carlo method is often referred to as the gold standard for calculating light propagation in turbid media [1]. Especially for complex-shaped geometries, where no analytical solutions are available, the Monte Carlo method becomes very important [1, 2]. In this work, Monte Carlo software is presented to simulate light propagation in complex-shaped geometries. To reduce simulation time, the code is based on OpenCL, so that graphics cards as well as other computing devices can be used. Within the software, an illumination concept is presented that makes it easy to realize all kinds of light sources, such as spatial frequency domain (SFD) illumination, optical fibers, or Gaussian beam profiles. Moreover, different objects that are not connected to each other can be considered simultaneously, without any additional preprocessing. This Monte Carlo software can be used for many applications. In this work, the transmission spectrum of a tooth and the color reconstruction of a virtual object are shown, using results from the Monte Carlo software.
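
    Illustrative sketch (not from the record): the core of any such code is a photon random walk. The minimal Python below walks photons through a homogeneous slab; the coefficients MU_A and MU_S, the slab thickness, and the isotropic scattering model are made-up assumptions, and nothing of the paper's OpenCL/GPU implementation, mesh geometry, or source models is reproduced.

        # Minimal Monte Carlo photon random walk through a homogeneous slab.
        # Illustrative only; real codes track full 3D geometry and anisotropy.
        import math
        import random

        MU_A, MU_S = 0.1, 10.0      # absorption / scattering coefficients [1/mm] (assumed)
        MU_T = MU_A + MU_S          # total interaction coefficient
        THICKNESS = 1.0             # slab thickness [mm] (assumed)
        N_PHOTONS = 100_000

        transmitted = 0
        for _ in range(N_PHOTONS):
            z, uz = 0.0, 1.0                        # start on the surface, heading inward
            while True:
                # Free path length sampled from the exponential distribution.
                step = -math.log(1.0 - random.random()) / MU_T
                z += uz * step
                if z < 0.0:                         # escaped backwards (diffuse reflectance)
                    break
                if z > THICKNESS:                   # escaped forwards (transmittance)
                    transmitted += 1
                    break
                if random.random() < MU_A / MU_T:   # absorbed at the interaction site
                    break
                uz = 2.0 * random.random() - 1.0    # isotropic scatter: new z-direction cosine

        print(f"transmittance ~ {transmitted / N_PHOTONS:.4f}")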

  4. Monte Carlo Methodology Serves Up a Software Success

    NASA Technical Reports Server (NTRS)

    2003-01-01

    Widely used for the modeling of gas flows through the computation of the motion and collisions of representative molecules, the Direct Simulation Monte Carlo method has become the gold standard for producing research and engineering predictions in the field of rarefied gas dynamics. Direct Simulation Monte Carlo was first introduced in the early 1960s by Dr. Graeme Bird, a professor at the University of Sydney, Australia. It has since proved to be a valuable tool to the aerospace and defense industries in providing design and operational support data, as well as flight data analysis. In 2002, NASA brought to the forefront a software product that maintains the same basic physics formulation of Dr. Bird's method, but provides effective modeling of complex, three-dimensional, real vehicle simulations and parallel processing capabilities to handle additional computational requirements, especially in areas where computational fluid dynamics (CFD) is not applicable. NASA's Direct Simulation Monte Carlo Analysis Code (DAC) software package is now considered the Agency's premier high-fidelity simulation tool for predicting vehicle aerodynamics and aerothermodynamic environments in rarefied, or low-density, gas flows.

  5. Gray: a ray tracing-based Monte Carlo simulator for PET

    NASA Astrophysics Data System (ADS)

    Freese, David L.; Olcott, Peter D.; Buss, Samuel R.; Levin, Craig S.

    2018-05-01

    Monte Carlo simulation software plays a critical role in PET system design. Performing complex, repeated Monte Carlo simulations can be computationally prohibitive, as even a single simulation can require a large amount of time and a computing cluster to complete. Here we introduce Gray, a Monte Carlo simulation software for PET systems. Gray exploits ray tracing methods used in the computer graphics community to greatly accelerate simulations of PET systems with complex geometries. We demonstrate the implementation of models for positron range, annihilation acolinearity, photoelectric absorption, Compton scatter, and Rayleigh scatter. For validation, we simulate the GATE PET benchmark, and compare energy, distribution of hits, coincidences, and run time. We show a speedup using Gray, compared to GATE for the same simulation, while demonstrating nearly identical results. We additionally simulate the Siemens Biograph mCT system with both the NEMA NU-2 scatter phantom and sensitivity phantom. We estimate the total sensitivity within % when accounting for differences in peak NECR. We also estimate the peak NECR to be kcps, or within % of published experimental data. The activity concentration of the peak is also estimated within 1.3%.
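
    Illustrative sketch (not from the record): one of the physics models named above, annihilation acolinearity, can be made concrete in a few lines. The two 511 keV photons are emitted nearly back-to-back, with the deviation from 180° drawn from a Gaussian; the 0.25° FWHM below is a commonly quoted value for annihilation in water, not necessarily Gray's parameter, and the 2D geometry is a simplification.

        # Toy sampling of PET annihilation acolinearity (2D, hypothetical FWHM).
        import math
        import random

        FWHM_DEG = 0.25                                 # assumed, commonly quoted for water
        SIGMA = math.radians(FWHM_DEG) / 2.355          # convert FWHM to standard deviation

        def annihilation_pair():
            """Return the 2D emission angles of the two photons of one event."""
            theta = random.uniform(0.0, 2.0 * math.pi)  # first photon direction
            acolin = random.gauss(0.0, SIGMA)           # acolinearity deviation [rad]
            return theta, theta + math.pi + acolin      # second photon nearly opposite

        theta1, theta2 = annihilation_pair()
        print(math.degrees(theta2 - theta1) - 180.0)    # small deviation, in degrees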

  6. Direct Simulation Monte Carlo Calculations in Support of the Columbia Shuttle Orbiter Accident Investigation

    NASA Technical Reports Server (NTRS)

    Gallis, Michael A.; LeBeau, Gerald J.; Boyles, Katie A.

    2003-01-01

    The Direct Simulation Monte Carlo method was used to provide 3-D simulations of the early entry phase of the Shuttle Orbiter. Undamaged and damaged scenarios were modeled to provide calibration points for engineering "bridging function" types of analysis. Currently the simulation technology (software and hardware) is mature enough to allow realistic simulations of three-dimensional vehicles.

  7. Gray: a ray tracing-based Monte Carlo simulator for PET.

    PubMed

    Freese, David L; Olcott, Peter D; Buss, Samuel R; Levin, Craig S

    2018-05-21

    Monte Carlo simulation software plays a critical role in PET system design. Performing complex, repeated Monte Carlo simulations can be computationally prohibitive, as even a single simulation can require a large amount of time and a computing cluster to complete. Here we introduce Gray, a Monte Carlo simulation software for PET systems. Gray exploits ray tracing methods used in the computer graphics community to greatly accelerate simulations of PET systems with complex geometries. We demonstrate the implementation of models for positron range, annihilation acolinearity, photoelectric absorption, Compton scatter, and Rayleigh scatter. For validation, we simulate the GATE PET benchmark, and compare energy, distribution of hits, coincidences, and run time. We show a [Formula: see text] speedup using Gray, compared to GATE for the same simulation, while demonstrating nearly identical results. We additionally simulate the Siemens Biograph mCT system with both the NEMA NU-2 scatter phantom and sensitivity phantom. We estimate the total sensitivity within [Formula: see text]% when accounting for differences in peak NECR. We also estimate the peak NECR to be [Formula: see text] kcps, or within [Formula: see text]% of published experimental data. The activity concentration of the peak is also estimated within 1.3%.

  8. Monte Carlo reference data sets for imaging research: Executive summary of the report of AAPM Research Committee Task Group 195.

    PubMed

    Sechopoulos, Ioannis; Ali, Elsayed S M; Badal, Andreu; Badano, Aldo; Boone, John M; Kyprianou, Iacovos S; Mainegra-Hing, Ernesto; McMillan, Kyle L; McNitt-Gray, Michael F; Rogers, D W O; Samei, Ehsan; Turner, Adam C

    2015-10-01

    The use of Monte Carlo simulations in diagnostic medical imaging research is widespread due to its flexibility and ability to estimate quantities that are challenging to measure empirically. However, any new Monte Carlo simulation code needs to be validated before it can be used reliably. The type and degree of validation required depends on the goals of the research project, but, typically, such validation involves either comparison of simulation results to physical measurements or to previously published results obtained with established Monte Carlo codes. The former is complicated due to nuances of experimental conditions and uncertainty, while the latter is challenging due to typical graphical presentation and lack of simulation details in previous publications. In addition, entering the field of Monte Carlo simulations in general involves a steep learning curve. It is not a simple task to learn how to program and interpret a Monte Carlo simulation, even when using one of the publicly available code packages. This Task Group report provides a common reference for benchmarking Monte Carlo simulations across a range of Monte Carlo codes and simulation scenarios. In the report, all simulation conditions are provided for six different Monte Carlo simulation cases that involve common x-ray based imaging research areas. The results obtained for the six cases using four publicly available Monte Carlo software packages are included in tabular form. In addition to a full description of all simulation conditions and results, a discussion and comparison of results among the Monte Carlo packages and the lessons learned during the compilation of these results are included. This abridged version of the report includes only an introductory description of the six cases and a brief example of the results of one of the cases. This work provides an investigator the necessary information to benchmark his/her Monte Carlo simulation software against the reference cases included here before performing his/her own novel research. In addition, an investigator entering the field of Monte Carlo simulations can use these descriptions and results as a self-teaching tool to ensure that he/she is able to perform a specific simulation correctly. Finally, educators can assign these cases as learning projects as part of course objectives or training programs.

  9. Play It Again: Teaching Statistics with Monte Carlo Simulation

    ERIC Educational Resources Information Center

    Sigal, Matthew J.; Chalmers, R. Philip

    2016-01-01

    Monte Carlo simulations (MCSs) provide important information about statistical phenomena that would be impossible to assess otherwise. This article introduces MCS methods and their applications to research and statistical pedagogy using a novel software package for the R Project for Statistical Computing constructed to lessen the often steep…
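
    Illustrative sketch (not from the record): a flavor of the kind of pedagogical Monte Carlo study meant here, written in plain Python rather than the article's R package, which is not reproduced. It estimates the empirical coverage of a nominal 95% t-interval when data come from a skewed population; all numbers are illustrative.

        # Monte Carlo estimate of confidence-interval coverage under skewed data.
        import math
        import random
        import statistics

        TRUE_MEAN, N, REPS = 1.0, 20, 10_000
        T_CRIT = 2.093                      # t(0.975, df = 19)

        covered = 0
        for _ in range(REPS):
            sample = [random.expovariate(1.0) for _ in range(N)]   # population mean = 1.0
            m = statistics.mean(sample)
            half = T_CRIT * statistics.stdev(sample) / math.sqrt(N)
            if m - half <= TRUE_MEAN <= m + half:
                covered += 1

        print(f"empirical coverage: {covered / REPS:.3f} (nominal 0.95)")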

  10. [Evaluation of Organ Dose Estimation from Indices of CT Dose Using Dose Index Registry].

    PubMed

    Iriuchijima, Akiko; Fukushima, Yasuhiro; Ogura, Akio

    Direct measurement of each patient organ dose from computed tomography (CT) is not possible. Most methods for estimating patient organ dose use Monte Carlo simulation with dedicated software; however, dedicated software is too expensive for small-scale hospitals, so not every hospital can estimate organ dose this way. The purpose of this study was to evaluate a simple method of organ dose estimation using some common indices of CT dose. The Monte Carlo simulation software Radimetrics (Bayer) was used for calculating organ dose and analyzing the relationship between CT dose indices and organ dose. Multidetector CT scanners from two manufacturers were compared (LightSpeed VCT, GE Healthcare; SOMATOM Definition Flash, Siemens Healthcare). Using stored patient data from Radimetrics, the relationships between CT dose indices and organ dose were expressed as formulas for estimating organ dose. The accuracy of the estimation method was compared with the results of Monte Carlo simulation using Bland-Altman plots. In the results, SSDE was a feasible index for estimating organ dose in most organs because it reflects each patient's size. The differences in organ dose between estimation and simulation were within 23%. In conclusion, our method of estimating organ dose from CT dose indices is convenient for clinical use and acceptably accurate.
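
    Illustrative sketch (not from the record): the scheme described, fitting a linear formula from a dose index to organ dose and then summarizing agreement Bland-Altman style, looks roughly like the following. The SSDE and organ-dose values below are synthetic placeholders, not data from the study.

        # Fit organ_dose ~ a * SSDE + b, then Bland-Altman-style agreement summary.
        import statistics

        ssde  = [8.0, 10.5, 12.0, 15.5, 18.0, 21.0]        # mGy (hypothetical)
        organ = [7.1,  9.8, 11.5, 14.9, 17.2, 20.6]        # mGy (hypothetical, "simulated")

        mx, my = statistics.mean(ssde), statistics.mean(organ)
        a = sum((x - mx) * (y - my) for x, y in zip(ssde, organ)) / \
            sum((x - mx) ** 2 for x in ssde)               # least-squares slope
        b = my - a * mx                                    # least-squares intercept

        est = [a * x + b for x in ssde]                    # doses from the fitted formula
        diffs = [e - y for e, y in zip(est, organ)]        # estimate minus simulation
        bias = statistics.mean(diffs)
        loa = 1.96 * statistics.stdev(diffs)               # limits of agreement
        print(f"dose ~ {a:.3f}*SSDE + {b:.3f}; bias {bias:.2f} mGy, LoA +/- {loa:.2f} mGy")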

  11. Simulation modeling for the health care manager.

    PubMed

    Kennedy, Michael H

    2009-01-01

    This article addresses the use of simulation software to solve administrative problems faced by health care managers. Spreadsheet add-ins, process simulation software, and discrete event simulation software are available at a range of costs and complexity. All use the Monte Carlo method to realistically integrate probability distributions into models of the health care environment. Problems typically addressed by health care simulation modeling are facility planning, resource allocation, staffing, patient flow and wait time, routing and transportation, supply chain management, and process improvement.
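
    Illustrative sketch (not from the record): one problem class listed above, patient wait time, reduces to a very small Monte Carlo model. The single-server clinic below uses the Lindley recursion with exponential interarrival and service times; the rates are hypothetical, and none of the commercial packages discussed in the article are involved.

        # Monte Carlo wait times for a single-server clinic (Lindley recursion).
        import random
        import statistics

        ARRIVAL_RATE = 1.0 / 12.0     # patients per minute (hypothetical)
        SERVICE_RATE = 1.0 / 10.0     # patients per minute (hypothetical)
        N_PATIENTS = 50_000

        wait, waits = 0.0, []
        for _ in range(N_PATIENTS):
            interarrival = random.expovariate(ARRIVAL_RATE)
            service = random.expovariate(SERVICE_RATE)
            # Lindley: next wait = max(0, previous wait + service - interarrival)
            wait = max(0.0, wait + service - interarrival)
            waits.append(wait)

        print(f"mean wait ~ {statistics.mean(waits):.1f} min")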

  12. Proton Upset Monte Carlo Simulation

    NASA Technical Reports Server (NTRS)

    O'Neill, Patrick M.; Kouba, Coy K.; Foster, Charles C.

    2009-01-01

    The Proton Upset Monte Carlo Simulation (PROPSET) program calculates the frequency of on-orbit upsets in computer chips (for given orbits such as Low Earth Orbit, Lunar Orbit, and the like) from proton bombardment based on the results of heavy ion testing alone. The software simulates the bombardment of modern microelectronic components (computer chips) with high-energy (>200 MeV) protons. The nuclear interaction of the proton with the silicon of the chip is modeled and nuclear fragments from this interaction are tracked using Monte Carlo techniques to produce statistically accurate predictions.

  13. Monte Carlo Simulations for VLBI2010

    NASA Astrophysics Data System (ADS)

    Wresnik, J.; Böhm, J.; Schuh, H.

    2007-07-01

    Monte Carlo simulations are carried out at the Institute of Geodesy and Geophysics (IGG), Vienna, and at Goddard Space Flight Center (GSFC), Greenbelt (USA), with the goal of designing a new geodetic Very Long Baseline Interferometry (VLBI) system. Influences of the schedule, the network geometry and the main stochastic processes on the geodetic results are investigated. Schedules are prepared with the software package SKED (Vandenberg 1999), and different strategies are applied to produce temporally very dense schedules, which are compared in terms of baseline length repeatabilities. For the simulation of VLBI observations, a Monte Carlo simulator was set up which creates artificial observations by randomly simulating wet zenith delay and clock values as well as additive white noise representing the antenna errors. For the simulation at IGG, the VLBI analysis software OCCAM (Titov et al. 2004) was adapted. Random walk processes with power spectral densities of 0.7 and 0.1 psec²/sec are used for the simulation of wet zenith delays. The clocks are simulated with Allan standard deviations of 1*10^-14 @ 50 min and 2*10^-15 @ 15 min, and three levels of white noise, 4 psec, 8 psec and 16 psec, are added to the artificial observations. The variations of the power spectral densities of the clocks and wet zenith delays, and the application of different white noise levels, show clearly that the wet delay is the critical factor for the improvement of the geodetic VLBI system. At GSFC the software CalcSolve is used for the VLBI analysis, so a comparison between the software packages OCCAM and CalcSolve was done with simulated data. For further simulations the wet zenith delay was modeled by a turbulence model; these data were provided by T. Nilsson and added to the simulation work. Different schedules have been run.
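
    Illustrative sketch (not from the record): the observation generator described amounts to a random walk plus white noise. Below, a wet-zenith-delay random walk whose variance grows as PSD × t is sampled per observation; the clock model is omitted for brevity (the study specifies clocks by Allan standard deviation), and the 60 s observation interval is an assumption.

        # Artificial VLBI-style observations: random-walk wet delay + white noise.
        import math
        import random

        PSD = 0.7            # psec^2/sec, one of the two values quoted above
        DT = 60.0            # seconds between observations (assumed)
        WHITE_NOISE = 8.0    # psec, the middle noise level quoted above
        N_OBS = 500

        zwd, observations = 0.0, []
        for _ in range(N_OBS):
            zwd += random.gauss(0.0, math.sqrt(PSD * DT))     # random-walk increment
            observations.append(zwd + random.gauss(0.0, WHITE_NOISE))

        print(f"final wet delay {zwd:.1f} psec; last observation {observations[-1]:.1f} psec")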

  14. Improvement of Simulation Method in Validation of Software of the Coordinate Measuring Systems

    NASA Astrophysics Data System (ADS)

    Nieciąg, Halina

    2015-10-01

    Software is used to accomplish various tasks at each stage of the functioning of modern measuring systems. Before metrological confirmation of measuring equipment, the system has to be validated. This paper discusses a method for conducting validation studies of a software fragment that calculates the values of measurands. Due to the number and nature of the variables affecting coordinate measurement results, and the complex, multi-dimensional character of measurands, the study used the Monte Carlo method of numerical simulation. The article presents an attempt at improving the results obtained with classic Monte Carlo tools: the LHS (Latin Hypercube Sampling) algorithm was implemented as an alternative to the simple sampling scheme of the classic algorithm.
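
    Illustrative sketch (not from the record): the textbook LHS construction referenced above, not the paper's implementation. Each of the N strata per dimension is sampled exactly once, in a random permutation, which spreads points more evenly than simple random sampling.

        # Minimal Latin Hypercube Sampling in the unit hypercube.
        import random

        def latin_hypercube(n_samples, n_dims):
            """Return n_samples points in [0,1)^n_dims, one per stratum per axis."""
            samples = [[0.0] * n_dims for _ in range(n_samples)]
            for d in range(n_dims):
                strata = list(range(n_samples))
                random.shuffle(strata)                  # pair strata randomly across dims
                for i, s in enumerate(strata):
                    samples[i][d] = (s + random.random()) / n_samples
            return samples

        pts = latin_hypercube(10, 2)
        print(sorted(round(p[0], 2) for p in pts))      # exactly one point per 0.1 bin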

  15. An Educational MONTE CARLO Simulation/Animation Program for the Cosmic Rays Muons and a Prototype Computer-Driven Hardware Display.

    ERIC Educational Resources Information Center

    Kalkanis, G.; Sarris, M. M.

    1999-01-01

    Describes an educational software program for the study of, and detection methods for, cosmic-ray muons passing through several light transparent materials (i.e., water, air, etc.). Simulates the paths and interactions of muons and Cherenkov photons and visualizes/animates them on the computer screen using Monte Carlo methods/techniques which employ…

  16. [Comparison of Organ Dose Calculation Using Monte Carlo Simulation and In-phantom Dosimetry in CT Examination].

    PubMed

    Iriuchijima, Akiko; Fukushima, Yasuhiro; Ogura, Akio

    Direct measurement of each patient organ dose from computed tomography (CT) is not possible. Most methods for estimating patient organ dose use Monte Carlo simulation with dedicated software; however, the relative differences between simulated and measured organ doses are unclear. The purpose of this study was to compare organ doses evaluated by Monte Carlo simulation with doses evaluated by in-phantom dosimetry. The simulation software Radimetrics (Bayer) was used for the calculation of organ dose. Measurement was performed with radio-photoluminescence glass dosimeters (RPLD) set at various organ positions within a RANDO phantom. To evaluate differences between scanners, two different CT scanners were used in this study. Angular dependence of the RPLD and measurement of effective energy were performed for each scanner. The comparison of simulation and measurement was evaluated in terms of relative differences. In the results, the angular-dependence measurement of RPLD at the two scanners gave 31.6±0.45 mGy for SOMATOM Definition Flash and 29.2±0.18 mGy for LightSpeed VCT. The organ dose was 42.2 mGy (range, 29.9-52.7 mGy) by measurement and 37.7 mGy (range, 27.9-48.1 mGy) by simulation. The relative differences of organ dose between measurement and simulation were 13%, excluding the breast (42%). We found that organ doses from simulation were lower than those from measurement. In conclusion, these relative differences will be useful for evaluating organ doses for individual patients with the simulation software Radimetrics.

  17. Monte Carlo Simulation Tool Installation and Operation Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aguayo Navarrete, Estanislao; Ankney, Austin S.; Berguson, Timothy J.

    2013-09-02

    This document provides information on software and procedures for Monte Carlo simulations based on the Geant4 toolkit, the ROOT data analysis software and the CRY cosmic ray library. These tools have been chosen for their application to shield design and activation studies as part of the simulation task for the Majorana Collaboration. This document includes instructions for installation, operation and modification of the simulation code in a high cyber-security computing environment, such as the Pacific Northwest National Laboratory network. It is intended as a living document, and will be periodically updated. It is a starting point for information collection by an experimenter, and is not the definitive source. Users should consult with one of the authors for guidance on how to find the most current information for their needs.

  18. Propagation of uncertainty in nasal spray in vitro performance models using Monte Carlo simulation: Part II. Error propagation during product performance modeling.

    PubMed

    Guo, Changning; Doub, William H; Kauffman, John F

    2010-08-01

    Monte Carlo simulations were applied to investigate the propagation of uncertainty in both input variables and response measurements on model prediction for nasal spray product performance design of experiment (DOE) models in the first part of this study, with an initial assumption that the models perfectly represent the relationship between input variables and the measured responses. In this article, we discard the initial assumption and extend the Monte Carlo simulation study to examine the influence of both input variable variation and product performance measurement variation on the uncertainty in DOE model coefficients. The Monte Carlo simulations presented in this article illustrate the importance of careful error propagation during product performance modeling. Our results show that the error estimates based on Monte Carlo simulation result in smaller model coefficient standard deviations than those from regression methods. This suggests that the estimated standard deviations from regression may overestimate the uncertainties in the model coefficients. Monte Carlo simulations provide a simple software solution to understand the propagation of uncertainty in complex DOE models so that design space can be specified with statistically meaningful confidence levels. (c) 2010 Wiley-Liss, Inc. and the American Pharmacists Association
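
    Illustrative sketch (not from the record): the error-propagation idea in generic form. Both inputs and responses are perturbed within assumed uncertainties, the model is refit each time, and the spread of the fitted coefficients is inspected. A one-factor linear model stands in for the nasal spray DOE models; all settings and sigmas are invented for illustration.

        # Monte Carlo propagation of input/response uncertainty into coefficients.
        import random
        import statistics

        x_nom = [1.0, 2.0, 3.0, 4.0, 5.0]          # nominal factor settings (assumed)
        b0_true, b1_true = 0.5, 2.0                # "true" model (assumed)
        y_nom = [b0_true + b1_true * x for x in x_nom]
        SX, SY = 0.05, 0.10                        # assumed input / response sigmas

        def fit_line(xs, ys):
            """Ordinary least-squares fit of y = b0 + b1*x; returns (b0, b1)."""
            mx, my = statistics.mean(xs), statistics.mean(ys)
            b1 = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
                 sum((x - mx) ** 2 for x in xs)
            return my - b1 * mx, b1

        slopes = []
        for _ in range(5000):
            xs = [x + random.gauss(0.0, SX) for x in x_nom]   # perturbed inputs
            ys = [y + random.gauss(0.0, SY) for y in y_nom]   # perturbed responses
            slopes.append(fit_line(xs, ys)[1])

        print(f"slope: mean {statistics.mean(slopes):.3f}, sd {statistics.stdev(slopes):.3f}")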

  19. SIMCA T 1.0: A SAS Computer Program for Simulating Computer Adaptive Testing

    ERIC Educational Resources Information Center

    Raiche, Gilles; Blais, Jean-Guy

    2006-01-01

    Monte Carlo methodologies are frequently applied to study the sampling distribution of the estimated proficiency level in adaptive testing. These methods eliminate real situational constraints. However, these Monte Carlo methodologies are not currently supported by the available software programs, and when these programs are available, their…

  20. Estimation of whole-body radiation exposure from brachytherapy for oral cancer using a Monte Carlo simulation

    PubMed Central

    Ozaki, Y.; Kaida, A.; Miura, M.; Nakagawa, K.; Toda, K.; Yoshimura, R.; Sumi, Y.; Kurabayashi, T.

    2017-01-01

    Early stage oral cancer can be cured with oral brachytherapy, but whole-body radiation exposure status has not been previously studied. Recently, the International Commission on Radiological Protection Committee (ICRP) recommended the use of ICRP phantoms to estimate radiation exposure from external and internal radiation sources. In this study, we used a Monte Carlo simulation with ICRP phantoms to estimate whole-body exposure from oral brachytherapy. We used a Particle and Heavy Ion Transport code System (PHITS) to model oral brachytherapy with 192Ir hairpins and 198Au grains and to perform a Monte Carlo simulation on the ICRP adult reference computational phantoms. To confirm the simulations, we also computed local dose distributions from these small sources, and compared them with the results from Oncentra manual Low Dose Rate Treatment Planning (mLDR) software which is used in day-to-day clinical practice. We successfully obtained data on absorbed dose for each organ in males and females. Sex-averaged equivalent doses were 0.547 and 0.710 Sv with 192Ir hairpins and 198Au grains, respectively. Simulation with PHITS was reliable when compared with an alternative computational technique using mLDR software. We concluded that the absorbed dose for each organ and whole-body exposure from oral brachytherapy can be estimated with Monte Carlo simulation using PHITS on ICRP reference phantoms. Effective doses for patients with oral cancer were obtained. PMID:28339846

  1. Preliminary Dynamic Feasibility and Analysis of a Spherical, Wind-Driven (Tumbleweed), Martian Rover

    NASA Technical Reports Server (NTRS)

    Flick, John J.; Toniolo, Matthew D.

    2005-01-01

    The process and findings are presented from a preliminary feasibility study examining the dynamics characteristics of a spherical wind-driven (or Tumbleweed) rover, which is intended for exploration of the Martian surface. The results of an initial feasibility study involving several worst-case mobility situations that a Tumbleweed rover might encounter on the surface of Mars are discussed. Additional topics include the evaluation of several commercially available analysis software packages that were examined as possible platforms for the development of a Monte Carlo Tumbleweed mission simulation tool. This evaluation led to the development of the Mars Tumbleweed Monte Carlo Simulator (or Tumbleweed Simulator) using the Vortex physics software package from CM-Labs, Inc. Discussions regarding the development and evaluation of the Tumbleweed Simulator, as well as the results of a preliminary analysis using the tool, are also presented. Finally, a brief conclusions section is presented.

  2. Organization and use of a Software/Hardware Avionics Research Program (SHARP)

    NASA Technical Reports Server (NTRS)

    Karmarkar, J. S.; Kareemi, M. N.

    1975-01-01

    The organization and use of the Software/Hardware Avionics Research Program (SHARP), developed to duplicate the automatic portion of the STOLAND simulator system on a general-purpose computer system (i.e., an IBM 360), are described. The program's uses are: (1) to conduct comparative evaluation studies of current and proposed airborne and ground system concepts via single-run or Monte Carlo simulation techniques, and (2) to provide a software tool for efficient algorithm evaluation and development for the STOLAND avionics computer.

  3. SUPREM-DSMC: A New Scalable, Parallel, Reacting, Multidimensional Direct Simulation Monte Carlo Flow Code

    NASA Technical Reports Server (NTRS)

    Campbell, David; Wysong, Ingrid; Kaplan, Carolyn; Mott, David; Wadsworth, Dean; VanGilder, Douglas

    2000-01-01

    An AFRL/NRL team has recently been selected to develop a scalable, parallel, reacting, multidimensional (SUPREM) Direct Simulation Monte Carlo (DSMC) code for the DoD user community under the High Performance Computing Modernization Office (HPCMO) Common High Performance Computing Software Support Initiative (CHSSI). This paper will introduce the JANNAF Exhaust Plume community to this three-year development effort and present the overall goals, schedule, and current status of this new code.

  4. A Monte Carlo software for the 1-dimensional simulation of IBIC experiments

    NASA Astrophysics Data System (ADS)

    Forneris, J.; Jakšić, M.; Pastuović, Ž.; Vittone, E.

    2014-08-01

    Ion beam induced charge (IBIC) microscopy is a valuable tool for the analysis of the electronic properties of semiconductors. In this work, a recently developed Monte Carlo approach for the simulation of IBIC experiments is presented, along with a self-standing software package equipped with a graphical user interface. The method is based on the probabilistic interpretation of the excess charge carrier continuity equations, and it offers the end user full control not only of the physical properties ruling the induced charge formation mechanism (i.e., mobility, lifetime, electrostatics, device geometry), but also of the relevant experimental conditions (ionization profiles, beam dispersion, electronic noise) affecting the measurement of the IBIC pulses. Moreover, the software implements a novel model for the quantitative evaluation of radiation damage effects on the charge collection efficiency degradation of ion-beam-irradiated devices. The implementation of the model is then validated against a benchmark IBIC experiment.

  5. Estimation of whole-body radiation exposure from brachytherapy for oral cancer using a Monte Carlo simulation.

    PubMed

    Ozaki, Y; Watanabe, H; Kaida, A; Miura, M; Nakagawa, K; Toda, K; Yoshimura, R; Sumi, Y; Kurabayashi, T

    2017-07-01

    Early stage oral cancer can be cured with oral brachytherapy, but whole-body radiation exposure status has not been previously studied. Recently, the International Commission on Radiological Protection Committee (ICRP) recommended the use of ICRP phantoms to estimate radiation exposure from external and internal radiation sources. In this study, we used a Monte Carlo simulation with ICRP phantoms to estimate whole-body exposure from oral brachytherapy. We used a Particle and Heavy Ion Transport code System (PHITS) to model oral brachytherapy with 192Ir hairpins and 198Au grains and to perform a Monte Carlo simulation on the ICRP adult reference computational phantoms. To confirm the simulations, we also computed local dose distributions from these small sources, and compared them with the results from Oncentra manual Low Dose Rate Treatment Planning (mLDR) software which is used in day-to-day clinical practice. We successfully obtained data on absorbed dose for each organ in males and females. Sex-averaged equivalent doses were 0.547 and 0.710 Sv with 192Ir hairpins and 198Au grains, respectively. Simulation with PHITS was reliable when compared with an alternative computational technique using mLDR software. We concluded that the absorbed dose for each organ and whole-body exposure from oral brachytherapy can be estimated with Monte Carlo simulation using PHITS on ICRP reference phantoms. Effective doses for patients with oral cancer were obtained. © The Author 2017. Published by Oxford University Press on behalf of The Japan Radiation Research Society and Japanese Society for Radiation Oncology.

  6. Accelerating Monte Carlo simulations of photon transport in a voxelized geometry using a massively parallel graphics processing unit.

    PubMed

    Badal, Andreu; Badano, Aldo

    2009-11-01

    It is a known fact that Monte Carlo simulations of radiation transport are computationally intensive and may require long computing times. The authors introduce a new paradigm for the acceleration of Monte Carlo simulations: The use of a graphics processing unit (GPU) as the main computing device instead of a central processing unit (CPU). A GPU-based Monte Carlo code that simulates photon transport in a voxelized geometry with the accurate physics models from PENELOPE has been developed using the CUDA™ programming model (NVIDIA Corporation, Santa Clara, CA). An outline of the new code and a sample x-ray imaging simulation with an anthropomorphic phantom are presented. A remarkable 27-fold speed up factor was obtained using a GPU compared to a single core CPU. The reported results show that GPUs are currently a good alternative to CPUs for the simulation of radiation transport. Since the performance of GPUs is currently increasing at a faster pace than that of CPUs, the advantages of GPU-based software are likely to be more pronounced in the future.

  7. The effect of carrier gas flow rate and source cell temperature on low pressure organic vapor phase deposition simulation by direct simulation Monte Carlo method

    PubMed Central

    Wada, Takao; Ueda, Noriaki

    2013-01-01

    The process of low pressure organic vapor phase deposition (LP-OVPD) controls the growth of amorphous organic thin films, where the source gases (Alq3 molecules, etc.) are introduced into a hot wall reactor via an injection barrel using an inert carrier gas (N2 molecules). Substrate properties such as dopant concentration, deposition rate, and thickness uniformity of the thin film can be controlled well. In this paper, we present LP-OVPD simulation results using direct simulation Monte Carlo-Neutrals (Particle-PLUS neutral module), which is commercial software adopting the direct simulation Monte Carlo method. By properly estimating the evaporation rate from experimental vaporization enthalpies, the calculated deposition rates on the substrate agree well with the experimental results, which depend on carrier gas flow rate and source cell temperature. PMID:23674843

  8. Simulation Insights Using "R"

    ERIC Educational Resources Information Center

    Kostadinov, Boyan

    2013-01-01

    This article attempts to introduce the reader to computational thinking and solving problems involving randomness. The main technique being employed is the Monte Carlo method, using the freely available software "R for Statistical Computing." The author illustrates the computer simulation approach by focusing on several problems of…

  9. ME(SSY)**2: Monte Carlo Code for Star Cluster Simulations

    NASA Astrophysics Data System (ADS)

    Freitag, Marc Dewi

    2013-02-01

    ME(SSY)**2 stands for “Monte-carlo Experiments with Spherically SYmmetric Stellar SYstems." This code simulates the long-term evolution of spherical clusters of stars; it was devised specifically to treat dense galactic nuclei. It is based on the pioneering Monte Carlo scheme proposed by Hénon in the 1970s and includes all relevant physical ingredients (two-body relaxation, stellar mass spectrum, collisions, tidal disruption, …). It is basically a Monte Carlo resolution of the Fokker-Planck equation. It can cope with any stellar mass spectrum or velocity distribution. Being a particle-based method, it also allows one to take stellar collisions into account in a very realistic way. This unique code, featuring most important physical processes, allows million-particle simulations spanning a Hubble time in a few CPU days on standard personal computers, and provides a wealth of data rivaled only by N-body simulations. The current version of the software requires the use of routines from the "Numerical Recipes in Fortran 77" (http://www.nrbook.com/a/bookfpdf.php).

  10. Monte Carlo Simulations and Generation of the SPI Response

    NASA Technical Reports Server (NTRS)

    Sturner, S. J.; Shrader, C. R.; Weidenspointner, G.; Teegarden, B. J.; Attie, D.; Diehl, R.; Ferguson, C.; Jean, P.; vonKienlin, A.

    2003-01-01

    In this paper we discuss the methods developed for the production of the INTEGRAL/SPI instrument response. The response files were produced using a suite of Monte Carlo simulation software developed at NASA/GSFC based on the GEANT-3 package available from CERN. The production of the INTEGRAL/SPI instrument response also required the development of a detailed computer mass model for SPI. We discuss our extensive investigations into methods to reduce both the computation time and storage requirements for the SPI response. We also discuss corrections to the simulated response based on our comparison of ground and inflight calibration data with MGEANT simulation.

  11. Monte Carlo Simulations and Generation of the SPI Response

    NASA Technical Reports Server (NTRS)

    Sturner, S. J.; Shrader, C. R.; Weidenspointner, G.; Teegarden, B. J.; Attie, D.; Cordier, B.; Diehl, R.; Ferguson, C.; Jean, P.; vonKienlin, A.

    2003-01-01

    In this paper we discuss the methods developed for the production of the INTEGRAL/SPI instrument response. The response files were produced using a suite of Monte Carlo simulation software developed at NASA/GSFC based on the GEANT-3 package available from CERN. The production of the INTEGRAL/SPI instrument response also required the development of a detailed computer mass model for SPI. We discuss our extensive investigations into methods to reduce both the computation time and storage requirements for the SPI response. We also discuss corrections to the simulated response based on our comparison of ground and inflight calibration data with MGEANT simulations.

  12. Software for Simulation of Hyperspectral Images

    NASA Technical Reports Server (NTRS)

    Richtsmeier, Steven C.; Singer-Berk, Alexander; Bernstein, Lawrence S.

    2002-01-01

    A package of software generates simulated hyperspectral images for use in validating algorithms that generate estimates of Earth-surface spectral reflectance from hyperspectral images acquired by airborne and spaceborne instruments. This software is based on a direct simulation Monte Carlo approach for modeling three-dimensional atmospheric radiative transport as well as surfaces characterized by spatially inhomogeneous bidirectional reflectance distribution functions. In this approach, 'ground truth' is accurately known through input specification of surface and atmospheric properties, and it is practical to consider wide variations of these properties. The software can treat both land and ocean surfaces and the effects of finite clouds with surface shadowing. The spectral/spatial data cubes computed by use of this software can serve both as a substitute for and a supplement to field validation data.

  13. TGeoCad: an Interface between ROOT and CAD Systems

    NASA Astrophysics Data System (ADS)

    Luzzi, C.; Carminati, F.

    2014-06-01

    In the simulation of high energy physics experiments, very high precision in the description of the detector geometry is essential to achieve the required performance. The physicists in charge of Monte Carlo simulation of the detector need to collaborate efficiently with the engineers working on the mechanical design of the detector. Often, this collaboration is made difficult by the use of different and incompatible software. ROOT is an object-oriented C++ framework used by physicists for storing, analyzing and simulating data produced by high-energy physics experiments, while CAD (Computer-Aided Design) software is used for mechanical design in the engineering field. The necessity to improve the level of communication between physicists and engineers led to the implementation of an interface between the ROOT geometrical modeler used by the virtual Monte Carlo simulation software and the CAD systems. In this paper we describe the design and implementation of the TGeoCad interface, which has been developed to enable the use of ROOT geometrical models in several CAD systems. To achieve this goal, the ROOT geometry description is converted into the STEP file format (ISO 10303), which can be imported and used by many CAD systems.

  14. Monte Carlo modeling of the Siemens Optifocus multileaf collimator.

    PubMed

    Laliena, Victor; García-Romero, Alejandro

    2015-05-01

    We have developed a new component module for the BEAMnrc software package, called SMLC, which models the tongue-and-groove structure of the Siemens Optifocus multileaf collimator. The ultimate goal is to perform accurate Monte Carlo simulations of the IMRT treatments carried out with Optifocus. SMLC has been validated by direct geometry checks and by comparing quantitatively the results of simulations performed with it and with the component module VARMLC. Measurements and Monte Carlo simulations of absorbed dose distributions of radiation fields sensitive to the tongue-and-groove effect have been performed to tune the free parameters of SMLC. The measurements cannot be accurately reproduced with VARMLC. Finally, simulations of a typical IMRT field showed that SMLC improves the agreement with experimental measurements with respect to VARMLC in clinically relevant cases. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  15. The Development of Design Tools for Fault Tolerant Quantum Dot Cellular Automata Based Logic

    NASA Technical Reports Server (NTRS)

    Armstrong, Curtis D.; Humphreys, William M.

    2003-01-01

    We are developing software to explore the fault tolerance of quantum dot cellular automata gate architectures in the presence of manufacturing variations and device defects. The Topology Optimization Methodology using Applied Statistics (TOMAS) framework extends the capabilities of AQUINAS (A Quantum Interconnected Network Array Simulator) by adding front-end and back-end software and creating an environment that integrates all of these components. The front-end tools establish all simulation parameters, configure the simulation system, automate the Monte Carlo generation of simulation files, and execute the simulation of these files. The back-end tools perform automated data parsing, statistical analysis and report generation.

  16. Accelerating Monte Carlo simulations of photon transport in a voxelized geometry using a massively parallel graphics processing unit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Badal, Andreu; Badano, Aldo

    Purpose: It is a known fact that Monte Carlo simulations of radiation transport are computationally intensive and may require long computing times. The authors introduce a new paradigm for the acceleration of Monte Carlo simulations: The use of a graphics processing unit (GPU) as the main computing device instead of a central processing unit (CPU). Methods: A GPU-based Monte Carlo code that simulates photon transport in a voxelized geometry with the accurate physics models from PENELOPE has been developed using the CUDA programming model (NVIDIA Corporation, Santa Clara, CA). Results: An outline of the new code and a sample x-ray imaging simulation with an anthropomorphic phantom are presented. A remarkable 27-fold speed up factor was obtained using a GPU compared to a single core CPU. Conclusions: The reported results show that GPUs are currently a good alternative to CPUs for the simulation of radiation transport. Since the performance of GPUs is currently increasing at a faster pace than that of CPUs, the advantages of GPU-based software are likely to be more pronounced in the future.

  17. SU-F-T-111: Investigation of the Attila Deterministic Solver as a Supplement to Monte Carlo for Calculating Out-Of-Field Radiotherapy Dose

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mille, M; Lee, C; Failla, G

    Purpose: To use the Attila deterministic solver as a supplement to Monte Carlo for calculating out-of-field organ dose in support of epidemiological studies looking at the risks of second cancers. Supplemental dosimetry tools are needed to speed up dose calculations for studies involving large-scale patient cohorts. Methods: Attila is a multi-group discrete ordinates code which can solve the 3D photon-electron coupled linear Boltzmann radiation transport equation on a finite-element mesh. Dose is computed by multiplying the calculated particle flux in each mesh element by a medium-specific energy deposition cross-section. The out-of-field dosimetry capability of Attila is investigated by comparing average organ dose to that which is calculated by Monte Carlo simulation. The test scenario consists of a 6 MV external beam treatment of a female patient with a tumor in the left breast. The patient is simulated by a whole-body adult reference female computational phantom. Monte Carlo simulations were performed using MCNP6 and XVMC. Attila can export a tetrahedral mesh for MCNP6, allowing for a direct comparison between the two codes. The Attila and Monte Carlo methods were also compared in terms of calculation speed and complexity of simulation setup. A key prerequisite for this work was the modeling of a Varian Clinac 2100 linear accelerator. Results: The solid mesh of the torso part of the adult female phantom for the Attila calculation was prepared using the CAD software SpaceClaim. Preliminary calculations suggest that Attila is a user-friendly software package which shows great promise for our intended application. Computational performance is related to the number of tetrahedral elements included in the Attila calculation. Conclusion: Attila is being explored as a supplement to the conventional Monte Carlo radiation transport approach for performing retrospective patient dosimetry. The goal is for the dosimetry to be sufficiently accurate for use in retrospective epidemiological investigations.

  18. Scalable and massively parallel Monte Carlo photon transport simulations for heterogeneous computing platforms

    NASA Astrophysics Data System (ADS)

    Yu, Leiming; Nina-Paravecino, Fanny; Kaeli, David; Fang, Qianqian

    2018-01-01

    We present a highly scalable Monte Carlo (MC) three-dimensional photon transport simulation platform designed for heterogeneous computing systems. Through the development of a massively parallel MC algorithm using the Open Computing Language framework, this research extends our existing graphics processing unit (GPU)-accelerated MC technique to a highly scalable vendor-independent heterogeneous computing environment, achieving significantly improved performance and software portability. A number of parallel computing techniques are investigated to achieve portable performance over a wide range of computing hardware. Furthermore, multiple thread-level and device-level load-balancing strategies are developed to obtain efficient simulations using multiple central processing units and GPUs.

  19. Parallelized Monte Carlo software to efficiently simulate the light propagation in arbitrarily shaped objects and aligned scattering media.

    PubMed

    Zoller, Christian Johannes; Hohmann, Ansgar; Foschum, Florian; Geiger, Simeon; Geiger, Martin; Ertl, Thomas Peter; Kienle, Alwin

    2018-06-01

    A GPU-based Monte Carlo software package (MCtet) was developed to calculate the light propagation in arbitrarily shaped objects, like a human tooth, represented by a tetrahedral mesh. A unique feature of MCtet is a concept to realize different kinds of light sources illuminating the complex-shaped surface of an object, for which no preprocessing step is needed. With this concept, it is also possible to consider photons leaving a turbid medium and reentering it in the case of a concave object. The correct implementation was shown by comparison with five other Monte Carlo software packages. A hundredfold acceleration compared with CPU-based programs was found. MCtet can simulate anisotropic light propagation, e.g., by accounting for scattering at cylindrical structures. The important influence of anisotropic light propagation, caused, e.g., by the tubules in human dentin, is shown for the transmission spectrum through a tooth. It was found that the sensitivity to a change in the oxygen saturation inside the pulp for transmission spectra is much larger if the tubules are considered. Another "light guiding" effect, based on a combination of low scattering and a high refractive index in enamel, is described. (2018) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).

  20. MCViNE- An object oriented Monte Carlo neutron ray tracing simulation package

    DOE PAGES

    Lin, J. Y. Y.; Smith, Hillary L.; Granroth, Garrett E.; ...

    2015-11-28

    MCViNE (Monte-Carlo VIrtual Neutron Experiment) is an open-source Monte Carlo (MC) neutron ray-tracing software for performing computer modeling and simulations that mirror real neutron scattering experiments. We exploited the close similarity between how instrument components are designed and operated and how such components can be modeled in software. For example we used object oriented programming concepts for representing neutron scatterers and detector systems, and recursive algorithms for implementing multiple scattering. Combining these features together in MCViNE allows one to handle sophisticated neutron scattering problems in modern instruments, including, for example, neutron detection by complex detector systems, and single and multiple scattering events in a variety of samples and sample environments. In addition, MCViNE can use simulation components from linear-chain-based MC ray tracing packages which facilitates porting instrument models from those codes. Furthermore it allows for components written solely in Python, which expedites prototyping of new components. These developments have enabled detailed simulations of neutron scattering experiments, with non-trivial samples, for time-of-flight inelastic instruments at the Spallation Neutron Source. Examples of such simulations for powder and single-crystal samples with various scattering kernels, including kernels for phonon and magnon scattering, are presented. As a result, with simulations that closely reproduce experimental results, scattering mechanisms can be turned on and off to determine how they contribute to the measured scattering intensities, improving our understanding of the underlying physics.

  1. SIMREL: Software for Coefficient Alpha and Its Confidence Intervals with Monte Carlo Studies

    ERIC Educational Resources Information Center

    Yurdugul, Halil

    2009-01-01

    This article describes SIMREL, a software program designed for the simulation of alpha coefficients and the estimation of its confidence intervals. SIMREL runs on two alternatives. In the first one, if SIMREL is run for a single data file, it performs descriptive statistics, principal components analysis, and variance analysis of the item scores…

  2. Simulation of Hyperspectral Images

    NASA Technical Reports Server (NTRS)

    Richsmeier, Steven C.; Singer-Berk, Alexander; Bernstein, Lawrence S.

    2004-01-01

    A software package generates simulated hyperspectral imagery for use in validating algorithms that generate estimates of Earth-surface spectral reflectance from hyperspectral images acquired by airborne and spaceborne instruments. This software is based on a direct simulation Monte Carlo approach for modeling three-dimensional atmospheric radiative transport, as well as reflections from surfaces characterized by spatially inhomogeneous bidirectional reflectance distribution functions. In this approach, "ground truth" is accurately known through input specification of surface and atmospheric properties, and it is practical to consider wide variations of these properties. The software can treat both land and ocean surfaces, as well as the effects of finite clouds with surface shadowing. The spectral/spatial data cubes computed by use of this software can serve both as a substitute for, and a supplement to, field validation data.

  3. Probabilistic approach of resource assessment in Kerinci geothermal field using numerical simulation coupling with monte carlo simulation

    NASA Astrophysics Data System (ADS)

    Hidayat, Iki; Sutopo; Pratama, Heru Berian

    2017-12-01

    The Kerinci geothermal field is a single-phase liquid reservoir system in the Kerinci District, in the western part of Jambi Province. In this field, there are geothermal prospects identified by heat-source upflow inside a national park area. Pertamina Geothermal Energy planned to develop 1×55 MWe in the Kerinci field. To characterize the reservoir, a numerical simulation of the Kerinci field was developed using TOUGH2 software with information from the conceptual model. The pressure and temperature profile data from well KRC-B1 were validated against simulation data to reach a natural-state condition, with a suitable match as the result. Based on the natural-state simulation, the resource of the Kerinci geothermal field was estimated using Monte Carlo simulation, yielding P10, P50 and P90 values of 49.4 MW, 64.3 MW and 82.4 MW, respectively. This paper presents the first resource assessment for the Kerinci geothermal field estimated using numerical simulation coupled with Monte Carlo simulation.
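
    Illustrative sketch (not from the record): the Monte Carlo step of such an assessment, with a drastically simplified volumetric power estimate. The distributions and constants below are placeholder assumptions, not the study's calibrated TOUGH2 model; percentiles follow the paper's convention of P10 as the low estimate.

        # Monte Carlo resource assessment: propagate uncertain inputs, report P10/P50/P90.
        import random

        N = 20_000
        powers = []
        for _ in range(N):
            area = random.triangular(5.0, 15.0, 10.0)          # km^2 (assumed)
            thickness = random.triangular(1.0, 2.5, 1.5)       # km (assumed)
            recovery = random.uniform(0.05, 0.15)              # recovery factor (assumed)
            power_density = random.triangular(3.0, 8.0, 5.0)   # MWe per km^3 (assumed)
            powers.append(area * thickness * recovery * power_density)

        powers.sort()
        p10, p50, p90 = (powers[int(N * q)] for q in (0.10, 0.50, 0.90))
        print(f"P10 {p10:.1f} MW, P50 {p50:.1f} MW, P90 {p90:.1f} MW")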

  4. Software for Engineering Simulations of a Spacecraft

    NASA Technical Reports Server (NTRS)

    Shireman, Kirk; McSwain, Gene; McCormick, Bernell; Fardelos, Panayiotis

    2005-01-01

    Spacecraft Engineering Simulation II (SES II) is a C-language computer program for simulating diverse aspects of operation of a spacecraft characterized by either three or six degrees of freedom. A functional model in SES can include a trajectory flight plan; a submodel of a flight computer running navigational and flight-control software; and submodels of the environment, the dynamics of the spacecraft, and sensor inputs and outputs. SES II features a modular, object-oriented programming style. SES II supports event-based simulations, which, in turn, create an easily adaptable simulation environment in which many different types of trajectories can be simulated by use of the same software. The simulation output consists largely of flight data. SES II can be used to perform optimization and Monte Carlo dispersion simulations. It can also be used to perform simulations for multiple spacecraft. In addition to its generic simulation capabilities, SES offers special capabilities for space-shuttle simulations: for this purpose, it incorporates submodels of the space-shuttle dynamics and a C-language version of the guidance, navigation, and control components of the space-shuttle flight software.

  5. Parameter Uncertainty Analysis Using Monte Carlo Simulations for a Regional-Scale Groundwater Model

    NASA Astrophysics Data System (ADS)

    Zhang, Y.; Pohlmann, K.

    2016-12-01

    Regional-scale grid-based groundwater models for flow and transport often contain multiple types of parameters that can intensify the challenge of parameter uncertainty analysis. We propose a Monte Carlo approach to systematically quantify the influence of various types of model parameters on groundwater flux and contaminant travel times. The Monte Carlo simulations were conducted based on the steady-state conversion of the original transient model, which was then combined with the PEST sensitivity analysis tool SENSAN and particle tracking software MODPATH. Results identified hydrogeologic units whose hydraulic conductivity can significantly affect groundwater flux, and thirteen out of 173 model parameters that can cause large variation in travel times for contaminant particles originating from given source zones.

  6. Status of the Space Radiation Monte Carlo Simulation Based on FLUKA and ROOT

    NASA Technical Reports Server (NTRS)

    Andersen, Victor; Carminati, Federico; Empl, Anton; Ferrari, Alfredo; Pinsky, Lawrence; Sala, Paola; Wilson, Thomas L.

    2002-01-01

    The NASA-funded project reported on at the first IWSSRR in Arona to develop a Monte-Carlo simulation program for use in simulating the space radiation environment based on the FLUKA and ROOT codes is well into its second year of development, and considerable progress has been made. The general tasks required to achieve the final goals include the addition of heavy-ion interactions into the FLUKA code and the provision of a ROOT-based interface to FLUKA. The most significant progress to date includes the incorporation of the DPMJET event generator code within FLUKA to handle heavy-ion interactions for incident projectile energies greater than 3 GeV/A. The ongoing effort intends to extend the treatment of these interactions down to 10 MeV, and at present two alternative approaches are being explored. The ROOT interface is being pursued in conjunction with the CERN LHC ALICE software team through an adaptation of their existing AliROOT software. As a check on the validity of the code, a simulation of the recent data taken by the ATIC experiment is underway.

  7. Monte Carlo simulation of energy-dispersive x-ray fluorescence and applications

    NASA Astrophysics Data System (ADS)

    Li, Fusheng

    Four key components of the Monte Carlo Library Least-Squares (MCLLS) approach have been developed by the author. These include: a comprehensive and accurate Monte Carlo simulation code - CEARXRF5 with Differential Operators (DO) and coincidence sampling, a Detector Response Function (DRF), an integrated Monte Carlo - Library Least-Squares (MCLLS) Graphical User Interface (GUI) visualization system (MCLLSPro), and a new reproducible and flexible benchmark experiment setup. All these developments and upgrades enable the MCLLS approach to be a useful and powerful tool for a tremendous variety of elemental analysis applications. CEARXRF, a comprehensive and accurate Monte Carlo code for simulating the total and individual library spectral responses of all elements, has recently been upgraded to version 5 by the author. The new version has several key improvements: an input file format fully compatible with MCNP5, a new efficient general geometry tracking code, versatile source definitions, various variance reduction techniques (e.g. weight window mesh and splitting, stratified sampling, etc.), a new cross section data storage and accessing method which improves the simulation speed by a factor of four, new cross section data, upgraded differential operators (DO) calculation capability, and an updated coincidence sampling scheme which includes K-L and L-L coincidence X-rays, while keeping all the capabilities of the previous version. The new Differential Operators method is powerful for measurement sensitivity studies and system optimization. For our Monte Carlo EDXRF elemental analysis system, it becomes an important technique for quantifying the matrix effect in near real time when combined with the MCLLS approach. An integrated visualization GUI system has been developed by the author to perform elemental analysis using the iterated Library Least-Squares method for various samples when an initial guess is provided. This software was built on the Borland C++ Builder platform and has a user-friendly interface to accomplish all qualitative and quantitative tasks easily. That is to say, the software enables users to run the forward Monte Carlo simulation (if necessary) or use previously calculated Monte Carlo library spectra to obtain the sample elemental composition estimate within a minute. The GUI software is easy to use, with user-friendly features, and has the capability to accomplish all related tasks in a visualization environment; it can be a powerful tool for EDXRF analysts. A reproducible experiment setup has been built and experiments have been performed to benchmark the system. Two types of Standard Reference Materials (SRM), stainless steel samples from the National Institute of Standards and Technology (NIST) and aluminum alloy samples from Alcoa Inc., with certified elemental compositions, were tested with this reproducible prototype system using a 109Cd radioisotope source (20 mCi) and a liquid-nitrogen-cooled Si(Li) detector. The results show excellent agreement between the calculated sample compositions and their reference values, and the approach is very fast.
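
    Illustrative sketch (not from the record): the library least-squares idea at the heart of MCLLS can be shown in a few lines: express a measured spectrum as a linear combination of simulated single-element library spectra and solve for the weights. The spectra below are tiny made-up vectors, and a single ordinary least-squares solve stands in for the full iterated MCLLS procedure.

        # Library least-squares unmixing of a spectrum into elemental components.
        import numpy as np

        # Hypothetical 6-channel library spectra for three elements (one per column).
        library = np.array([
            [5.0, 0.2, 0.1],
            [9.0, 0.5, 0.2],
            [2.0, 7.0, 0.3],
            [0.5, 9.5, 1.0],
            [0.2, 1.0, 6.0],
            [0.1, 0.4, 9.0],
        ])
        true_w = np.array([0.6, 0.3, 0.1])                   # "true" composition (assumed)
        measured = library @ true_w + np.random.normal(0.0, 0.05, size=6)

        # Ordinary least squares: w = argmin || library @ w - measured ||^2
        w, *_ = np.linalg.lstsq(library, measured, rcond=None)
        print("recovered weights:", np.round(w, 3))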

  8. Genetic Algorithms and Their Application to the Protein Folding Problem

    DTIC Science & Technology

    1993-12-01

    and symbolic methods, random methods such as Monte Carlo simulation and simulated annealing, distance geometry, and molecular dynamics. Many of these...calculated energies with those obtained using the molecular simulation software package called CHARMm. 9) Test both the simple and parallel simple genetic...homology-based, and simplification techniques. 3.21 Molecular Dynamics. Perhaps the most natural approach is to actually simulate the folding process. This
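
    The snippet lists simulated annealing among the random methods applied to folding. A generic Metropolis-style annealing loop over a toy bit-string "conformation" (entirely illustrative; the energy function and encoding are not taken from the report) might look like this:

```python
import math
import random

def simulated_annealing(energy, state, steps=20000, t0=1.0, t_min=1e-3):
    """Generic annealing loop: flip one bit of the "conformation",
    accept with the Metropolis probability exp(-dE/T), cool T
    geometrically, and remember the best state seen."""
    best, best_e = state[:], energy(state)
    cur_e = best_e
    for k in range(steps):
        t = max(t_min, t0 * 0.999 ** k)      # cooling schedule
        i = random.randrange(len(state))
        state[i] ^= 1                        # propose a local change
        new_e = energy(state)
        if new_e <= cur_e or random.random() < math.exp((cur_e - new_e) / t):
            cur_e = new_e
            if new_e < best_e:
                best, best_e = state[:], new_e
        else:
            state[i] ^= 1                    # reject: undo the flip
    return best, best_e

# Toy "folding score": penalize equal adjacent bits.
toy_energy = lambda s: sum(s[i] == s[i + 1] for i in range(len(s) - 1))
print(simulated_annealing(toy_energy, [0] * 20))
```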

  9. RNA folding kinetics using Monte Carlo and Gillespie algorithms.

    PubMed

    Clote, Peter; Bayegan, Amir H

    2018-04-01

    RNA secondary structure folding kinetics is known to be important for the biological function of certain processes, such as the hok/sok system in E. coli. Although linear algebra provides an exact computational solution of secondary structure folding kinetics with respect to the Turner energy model for tiny ([Formula: see text]20 nt) RNA sequences, the folding kinetics for larger sequences can only be approximated by binning structures into macrostates in a coarse-grained model, or by repeatedly simulating secondary structure folding with either the Monte Carlo algorithm or the Gillespie algorithm. Here we investigate the relation between the Monte Carlo algorithm and the Gillespie algorithm. We prove that asymptotically, the expected time for a K-step trajectory of the Monte Carlo algorithm is equal to [Formula: see text] times that of the Gillespie algorithm, where [Formula: see text] denotes the Boltzmann expected network degree. If the network is regular (i.e., every node has the same degree), then the mean first passage time (MFPT) computed by the Monte Carlo algorithm is equal to the MFPT computed by the Gillespie algorithm multiplied by [Formula: see text]; however, this is not true for non-regular networks. In particular, RNA secondary structure folding kinetics, as computed by the Monte Carlo algorithm, is not equal to the folding kinetics, as computed by the Gillespie algorithm, although the mean first passage times are roughly correlated. Simulation software for RNA secondary structure folding according to the Monte Carlo and Gillespie algorithms is publicly available, as is our software to compute the expected degree of the network of secondary structures of a given RNA sequence; see http://bioinformatics.bc.edu/clote/RNAexpNumNbors.
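
    A minimal sketch of the Gillespie algorithm on a toy state network may help make the comparison concrete; the Metropolis-style rates and the four-state chain below are illustrative assumptions, not the Turner-energy-model kinetics used in the paper.

```python
import math
import random

def gillespie_mfpt(neighbors, energy, start, target, kT=1.0, trials=500):
    """Estimate the mean first-passage time on a toy state network.
    Each move x -> y is given the rate min(1, exp(-(E_y - E_x)/kT));
    the waiting time in x is exponential in the total outgoing rate,
    which is the defining feature of the Gillespie algorithm."""
    total = 0.0
    for _ in range(trials):
        x, t = start, 0.0
        while x != target:
            rates = [min(1.0, math.exp(-(energy[y] - energy[x]) / kT))
                     for y in neighbors[x]]
            r_tot = sum(rates)
            t += random.expovariate(r_tot)          # Gillespie time step
            u, acc = random.uniform(0.0, r_tot), 0.0
            for y, r in zip(neighbors[x], rates):   # pick move ~ its rate
                acc += r
                if u <= acc:
                    x = y
                    break
        total += t
    return total / trials

# A four-state chain standing in for a network of secondary structures.
nbrs = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
E = {0: 0.0, 1: 1.0, 2: 0.5, 3: -1.0}
print(gillespie_mfpt(nbrs, E, start=0, target=3))
```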

  10. SimDoseCT: dose reporting software based on Monte Carlo simulation for a 320 detector-row cone-beam CT scanner and ICRP computational adult phantoms

    NASA Astrophysics Data System (ADS)

    Cros, Maria; Joemai, Raoul M. S.; Geleijns, Jacob; Molina, Diego; Salvadó, Marçal

    2017-08-01

    This study aims to develop and test software for assessing and reporting doses for standard patients undergoing computed tomography (CT) examinations in a 320 detector-row cone-beam scanner. The software, called SimDoseCT, is based on a Monte Carlo (MC) simulation code developed to calculate organ doses and effective doses in ICRP anthropomorphic adult reference computational phantoms for acquisitions with the Aquilion ONE CT scanner (Toshiba). The MC simulation was validated by comparing CTDI measurements within standard CT dose phantoms with results from simulation under the same conditions. SimDoseCT consists of a graphical user interface connected to a MySQL database, which contains the look-up tables that were generated with MC simulations for volumetric acquisitions at different scan positions along the phantom using any tube voltage, bow tie filter, focal spot and nine different beam widths. Two different methods were developed to estimate organ doses and effective doses for acquisitions using the other beam widths available in the scanner. A correction factor was used to estimate doses in helical acquisitions. Hence, the user can select any available protocol in the Aquilion ONE scanner for a standard adult male or female and obtain the dose results through the software interface. Agreement within 9% between CTDI measurements and simulations allowed the validation of the MC program. Additionally, the algorithm for dose reporting in SimDoseCT was validated by comparing dose results from this tool with those obtained from MC simulations for three volumetric acquisitions (head, thorax and abdomen). The comparison was repeated using eight different collimations and also for another collimation in a helical abdomen examination. The results showed differences of 0.1 mSv or less for absolute dose in most organs and also in the effective dose calculation. The software provides a suitable tool for dose assessment in standard adult patients undergoing CT examinations in a 320 detector-row cone-beam scanner.
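
    As a rough illustration of how a look-up-table tool of this kind might work (all numbers and names below are hypothetical, and the interpolation/correction forms are assumptions; the abstract does not specify the two methods):

```python
import numpy as np

# Hypothetical organ-dose table: dose per 100 mAs tabulated at the nine
# simulated beam widths for one organ and one scan position.
beam_widths_mm = np.array([2, 4, 8, 16, 32, 40, 60, 80, 160])
lung_dose_mgy = np.array([0.8, 1.5, 3.0, 6.1, 12.0, 15.2, 22.8, 30.1, 61.0])

def organ_dose(width_mm, widths=beam_widths_mm, doses=lung_dose_mgy):
    """Estimate the organ dose for a collimation that was not simulated
    explicitly by interpolating the Monte Carlo look-up table."""
    return float(np.interp(width_mm, widths, doses))

def helical_dose(volumetric_dose, pitch, correction=1.0):
    """Apply a correction factor for helical acquisitions; a plausible
    form scales the volumetric result by correction/pitch."""
    return volumetric_dose * correction / pitch

print(organ_dose(50))                          # between 40 mm and 60 mm
print(helical_dose(organ_dose(50), pitch=0.8))
```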

  11. SimDoseCT: dose reporting software based on Monte Carlo simulation for a 320 detector-row cone-beam CT scanner and ICRP computational adult phantoms.

    PubMed

    Cros, Maria; Joemai, Raoul M S; Geleijns, Jacob; Molina, Diego; Salvadó, Marçal

    2017-07-17

    This study aims to develop and test software for assessing and reporting doses for standard patients undergoing computed tomography (CT) examinations in a 320 detector-row cone-beam scanner. The software, called SimDoseCT, is based on a Monte Carlo (MC) simulation code developed to calculate organ doses and effective doses in ICRP anthropomorphic adult reference computational phantoms for acquisitions with the Aquilion ONE CT scanner (Toshiba). The MC simulation was validated by comparing CTDI measurements within standard CT dose phantoms with results from simulation under the same conditions. SimDoseCT consists of a graphical user interface connected to a MySQL database, which contains the look-up tables that were generated with MC simulations for volumetric acquisitions at different scan positions along the phantom using any tube voltage, bow tie filter, focal spot and nine different beam widths. Two different methods were developed to estimate organ doses and effective doses for acquisitions using the other beam widths available in the scanner. A correction factor was used to estimate doses in helical acquisitions. Hence, the user can select any available protocol in the Aquilion ONE scanner for a standard adult male or female and obtain the dose results through the software interface. Agreement within 9% between CTDI measurements and simulations allowed the validation of the MC program. Additionally, the algorithm for dose reporting in SimDoseCT was validated by comparing dose results from this tool with those obtained from MC simulations for three volumetric acquisitions (head, thorax and abdomen). The comparison was repeated using eight different collimations and also for another collimation in a helical abdomen examination. The results showed differences of 0.1 mSv or less for absolute dose in most organs and also in the effective dose calculation. The software provides a suitable tool for dose assessment in standard adult patients undergoing CT examinations in a 320 detector-row cone-beam scanner.

  12. Monte-Carlo Geant4 numerical simulation of experiments at 247-MeV proton microscope

    NASA Astrophysics Data System (ADS)

    Kantsyrev, A. V.; Skoblyakov, A. V.; Bogdanov, A. V.; Golubev, A. A.; Shilkin, N. S.; Yuriev, D. S.; Mintsev, V. B.

    2018-01-01

    A radiographic facility for the investigation of fast dynamic processes in targets with areal densities up to 5 g/cm2 is under development on the basis of the high-current proton linear accelerator at the Institute for Nuclear Research (Troitsk, Russia). A virtual model of the proton microscope, developed with the Geant4 software toolkit, is presented in this article. A full-scale Monte Carlo numerical simulation of static radiographic experiments at a proton-beam energy of 247 MeV was performed. The results of simulated proton-radiography experiments with a static model of shock-compressed xenon are presented, and visualizations of static copper and polymethyl methacrylate step-wedge targets are also described.

  13. Reducing statistical uncertainties in simulated organ doses of phantoms immersed in water

    DOE PAGES

    Hiller, Mauritius M.; Veinot, Kenneth G.; Easterly, Clay E.; ...

    2016-08-13

    In this study, methods are addressed to reduce the computational time to compute organ-dose rate coefficients using Monte Carlo techniques. Several variance reduction techniques are compared, including the reciprocity method, importance sampling, weight windows and the use of the ADVANTG software package. For low-energy photons, the runtime was reduced by a factor of 10^5 when using the reciprocity method for kerma computation for immersion of a phantom in contaminated water. This is particularly significant since impractically long simulation times are required to achieve reasonable statistical uncertainties in organ dose for low-energy photons in this source medium and geometry. Although the MCNP Monte Carlo code is used in this paper, the reciprocity technique can be used equally well with other Monte Carlo codes.
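
    The general variance-reduction payoff, though not the reciprocity method itself, is easy to demonstrate: a deep-penetration tally that analog sampling almost never scores can be estimated efficiently by sampling from a biased distribution and carrying statistical weights. The toy slab problem below is an illustrative sketch, unrelated to the MCNP/ADVANTG setups in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
mu, depth = 1.0, 10.0          # attenuation coefficient, slab depth

def naive(n):
    """Analog sampling: score the rare event directly."""
    x = rng.exponential(1.0 / mu, n)
    return np.mean(x > depth)                 # ~exp(-10): almost no hits

def importance(n, mu_biased=0.2):
    """Sample from a stretched path-length pdf and reweight."""
    x = rng.exponential(1.0 / mu_biased, n)
    w = (mu / mu_biased) * np.exp(-(mu - mu_biased) * x)
    return np.mean(w * (x > depth))

exact = np.exp(-mu * depth)
print(exact, naive(10**6), importance(10**6))  # importance: far less noisy
```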

  14. Monte Carlo treatment planning with modulated electron radiotherapy: framework development and application

    NASA Astrophysics Data System (ADS)

    Alexander, Andrew William

    Within the field of medical physics, Monte Carlo radiation transport simulations are considered to be the most accurate method for the determination of dose distributions in patients. The McGill Monte Carlo treatment planning system (MMCTP) provides a flexible software environment for integrating Monte Carlo simulations with current and new treatment modalities. Energy- and intensity-modulated electron radiotherapy (MERT) is a developing treatment modality with the fundamental capability to enhance the dosimetry of superficial targets. An objective of this work is to advance the research and development of MERT with the end goal of clinical use. To this end, we present the MMCTP system with an integrated toolkit for MERT planning and for the delivery of MERT fields. Delivery is achieved using an automated "few leaf electron collimator" (FLEC) and a controller. Aside from the MERT planning toolkit, the MMCTP system required numerous add-ons to perform the complex task of large-scale autonomous Monte Carlo simulations: first a DICOM import filter, followed by the implementation of DOSXYZnrc as a dose calculation engine and by logic methods for submitting and updating the status of Monte Carlo simulations. Within this work we validated the MMCTP system with a head-and-neck Monte Carlo recalculation study performed by a medical dosimetrist. The impact of MMCTP lies in the fact that it allows systematic and platform-independent large-scale Monte Carlo dose calculations for different treatment sites and treatment modalities. In addition to the MERT planning tools, various optimization algorithms were created external to MMCTP. The algorithms produce MERT treatment plans based on dose-volume constraints and employ Monte Carlo pre-generated patient-specific kernels, which are generated from patient-specific Monte Carlo dose distributions within MMCTP. The structure of the MERT planning toolkit software and of the optimization algorithms is demonstrated. We investigated the clinical significance of MERT for spinal irradiation, breast boost irradiation, and a head-and-neck sarcoma site, using several parameters to analyze the treatment plans. Finally, we investigated the idea of mixed-beam photon and electron treatment planning; photon optimization tools were included within the MERT planning toolkit for the purpose of mixed-beam optimization. In conclusion, this thesis work has resulted in the development of an advanced framework for photon and electron Monte Carlo treatment planning studies and of an inverse planning system for photon, electron or mixed-beam radiotherapy (MBRT). The justification and validation of this work are found in the results of the planning studies, which demonstrated dosimetric advantages of MERT and MBRT in comparison to clinical treatment alternatives.
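
    The kernel-based optimization idea can be sketched compactly: the total dose is a nonnegative combination of pre-generated per-beamlet Monte Carlo dose kernels, and the weights are fit to a prescription. The sketch below substitutes a plain least-squares objective with projected gradient descent for the thesis's dose-volume-constrained algorithms, and all names and dimensions are assumptions.

```python
import numpy as np

def optimize_beam_weights(kernels, prescription, iters=500, lr=0.1):
    """Fit nonnegative beamlet weights so that kernels @ w matches the
    prescription, by projected gradient descent on a least-squares
    objective (dose-volume constraints are simplified away here)."""
    n_vox, n_beams = kernels.shape
    w = np.full(n_beams, 1.0 / n_beams)
    for _ in range(iters):
        grad = kernels.T @ (kernels @ w - prescription) / n_vox
        w = np.maximum(0.0, w - lr * grad)    # projection keeps w physical
    return w

rng = np.random.default_rng(2)
K = np.abs(rng.normal(size=(1000, 12)))       # stand-in MC dose kernels
target = K @ np.abs(rng.normal(size=12))      # an achievable prescription
w = optimize_beam_weights(K, target)
print(np.linalg.norm(K @ w - target) / np.linalg.norm(target))
```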

  15. SU-F-T-619: Dose Evaluation of Specific Patient Plans Based On Monte Carlo Algorithm for a CyberKnife Stereotactic Radiosurgery System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piao, J; PLA 302 Hospital, Beijing; Xu, S

    2016-06-15

    Purpose: This study uses Monte Carlo methods to simulate the CyberKnife system and intends to develop a third-party tool to evaluate the dose verification of specific patient plans in the TPS. Methods: By simulating the treatment head using the BEAMnrc and DOSXYZnrc software, calculated and measured data were compared to determine the beam parameters. The dose distributions calculated with the Ray-tracing and Monte Carlo algorithms of the TPS (MultiPlan Ver4.0.2) and with the in-house Monte Carlo simulation method were analyzed for 30 patient plans (10 head, 10 lung, and 10 liver cases). A γ analysis with the combined 3 mm/3% criteria was introduced to quantitatively evaluate the differences in accuracy between the three algorithms. Results: More than 90% of the global error points were less than 2% in the comparison of the PDD and OAR curves after determining the mean energy and FWHM, establishing a relatively ideal Monte Carlo beam model. Based on the quantitative evaluation of dose accuracy for the three algorithms, the γ analysis shows that the passing rates of the PTV in the 30 plans between the Monte Carlo simulation and the TPS Monte Carlo algorithm were good (84.88±9.67% for head, 98.83±1.05% for liver, 98.26±1.87% for lung). The passing rates of the PTV in head and liver plans between the Monte Carlo simulation and the TPS Ray-tracing algorithm were also good (95.93±3.12% and 99.84±0.33%, respectively). However, the difference in DVHs for lung plans between the Monte Carlo simulation and the Ray-tracing algorithm was obvious, and the γ passing rate (51.263±38.964%) was not good. It is feasible to use Monte Carlo simulation for verifying the dose distribution of patient plans. Conclusion: The Monte Carlo simulation algorithm developed for the CyberKnife system in this study can be used as a reference for a third-party tool, which plays an important role in dose verification of patient plans. This work was supported in part by a grant from the Chinese Natural Science Foundation (Grant No. 11275105). Thanks for the support from Accuray Corp.
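
    The γ test combines a distance-to-agreement (DTA) criterion with a dose-difference (DD) criterion. A minimal 1D global-γ sketch (clinical tools operate on 2D/3D dose grids; the toy profile and spacing below are made up) is:

```python
import numpy as np

def gamma_pass_rate(ref, evl, spacing_mm, dta_mm=3.0, dd_frac=0.03):
    """1D global gamma analysis: for each reference point, gamma is the
    minimum over evaluated points of sqrt((dr/DTA)^2 + (dD/DD)^2);
    a point passes when gamma <= 1."""
    x = np.arange(len(ref)) * spacing_mm
    dd = dd_frac * ref.max()                  # global dose criterion
    gammas = np.empty(len(ref))
    for i, (xi, di) in enumerate(zip(x, ref)):
        dist2 = ((x - xi) / dta_mm) ** 2
        dose2 = ((evl - di) / dd) ** 2
        gammas[i] = np.sqrt(np.min(dist2 + dose2))
    return 100.0 * np.mean(gammas <= 1.0)

ref = np.exp(-((np.arange(100) - 50) / 15.0) ** 2)    # toy dose profile
shifted = np.roll(ref, 1)                             # ~1 mm misalignment
print(gamma_pass_rate(ref, shifted, spacing_mm=1.0))  # high passing rate
```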

  16. Excavator Design Validation

    NASA Technical Reports Server (NTRS)

    Pholsiri, Chalongrath; English, James; Seberino, Charles; Lim, Yi-Je

    2010-01-01

    The Excavator Design Validation tool verifies excavator designs by automatically generating control systems and modeling their performance in an accurate simulation of their expected environment. Part of this software design includes interfacing with human operators, who can be included in simulation-based studies and validation. This is essential for assessing productivity, versatility, and reliability. This software combines automatic control system generation from CAD (computer-aided design) models, rapid validation of complex mechanism designs, and detailed models of the environment, including soil, dust, temperature, remote supervision, and communication latency, to create a system of high value. Unique algorithms have been created for controlling and simulating complex robotic mechanisms automatically from just a CAD description. These algorithms are implemented as a commercial cross-platform C++ software toolkit that is configurable using the Extensible Markup Language (XML). The algorithms work with virtually any mobile robotic mechanism using module descriptions that adhere to the XML standard. In addition, high-fidelity, real-time, physics-based simulation algorithms have been developed that include models of internal forces and of the forces produced when a mechanism interacts with the outside world. This capability is combined with an innovative organization of simulation algorithms, new regolith simulation methods, and a unique control and study architecture to make powerful tools with the potential to transform the way NASA verifies and compares excavator designs. Energid's Actin software has been leveraged for this design validation. The architecture includes parametric and Monte Carlo studies tailored for validation of excavator designs and their control by remote human operators. It also includes the ability to interface with third-party software and human-input devices. Two types of simulation models have been adapted: high-fidelity discrete-element models and fast analytical models. By using the first to establish parameters for the second, a system has been created that can be executed in real time, or faster than real time, on a desktop PC. This allows Monte Carlo simulations to be performed on a computer platform available to all researchers, and it allows human interaction to be included in a real-time simulation process. Metrics on excavator performance are established that work with the simulation architecture. Both static and dynamic metrics are included.

  17. Probabilistic analysis algorithm for UA slope software program.

    DOT National Transportation Integrated Search

    2013-12-01

    A reliability-based computational algorithm for using a single row of equally spaced drilled shafts to stabilize an unstable slope has been developed in this research. The Monte-Carlo simulation (MCS) technique was used in the previously develop...

  18. Cross-platform validation and analysis environment for particle physics

    NASA Astrophysics Data System (ADS)

    Chekanov, S. V.; Pogrebnyak, I.; Wilbern, D.

    2017-11-01

    A multi-platform validation and analysis framework for public Monte Carlo simulation for high-energy particle collisions is discussed. The front-end of this framework uses the Python programming language, while the back-end is written in Java, which provides a multi-platform environment that can be run from a web browser and can easily be deployed at the grid sites. The analysis package includes all major software tools used in high-energy physics, such as Lorentz vectors, jet algorithms, histogram packages, graphic canvases, and tools for providing data access. This multi-platform software suite, designed to minimize OS-specific maintenance and deployment time, is used for online validation of Monte Carlo event samples through a web interface.

  19. Parallel Grand Canonical Monte Carlo (ParaGrandMC) Simulation Code

    NASA Technical Reports Server (NTRS)

    Yamakov, Vesselin I.

    2016-01-01

    This report provides an overview of the Parallel Grand Canonical Monte Carlo (ParaGrandMC) simulation code. This is a highly scalable parallel FORTRAN code for simulating the thermodynamic evolution of metal alloy systems at the atomic level, and predicting the thermodynamic state, phase diagram, chemical composition and mechanical properties. The code is designed to simulate multi-component alloy systems, predict solid-state phase transformations such as austenite-martensite transformations, precipitate formation, recrystallization, capillary effects at interfaces, surface adsorption, etc., which can aid the design of novel metallic alloys. While the software is mainly tailored for modeling metal alloys, it can also be used for other types of solid-state systems, and to some degree for liquid or gaseous systems, including multiphase systems forming solid-liquid-gas interfaces.
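
    The core accept/reject step of grand canonical Monte Carlo is compact enough to sketch. The 1D lattice gas below is an illustrative stand-in, far simpler than the multi-component alloy Hamiltonians ParaGrandMC handles: particles are exchanged with a reservoir at chemical potential mu, and moves are accepted with the Metropolis rule for the grand canonical ensemble.

```python
import math
import random

def gcmc_lattice_gas(n_sites=400, mu=-1.0, eps=-0.5, kT=1.0, steps=200000):
    """1D lattice gas in the grand canonical ensemble: attempt particle
    insertions/deletions and accept with probability
    min(1, exp(-(dE - mu*dN)/kT)); eps is the nearest-neighbour bond."""
    occ = [0] * n_sites
    for _ in range(steps):
        i = random.randrange(n_sites)
        nn = occ[(i - 1) % n_sites] + occ[(i + 1) % n_sites]
        if occ[i] == 0:                      # insertion: dN = +1
            d = eps * nn - mu                # dE - mu*dN
            if random.random() < min(1.0, math.exp(-d / kT)):
                occ[i] = 1
        else:                                # deletion: dN = -1
            d = -eps * nn + mu
            if random.random() < min(1.0, math.exp(-d / kT)):
                occ[i] = 0
    return sum(occ) / n_sites                # mean occupancy

print(gcmc_lattice_gas())
```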

  20. A Monte Carlo model for 3D grain evolution during welding

    NASA Astrophysics Data System (ADS)

    Rodgers, Theron M.; Mitchell, John A.; Tikare, Veena

    2017-09-01

    Welding is one of the most wide-spread processes used in metal joining. However, there are currently no open-source software implementations for the simulation of microstructural evolution during a weld pass. Here we describe a Potts Monte Carlo based model implemented in the SPPARKS kinetic Monte Carlo computational framework. The model simulates melting, solidification and solid-state microstructural evolution of material in the fusion and heat-affected zones of a weld. The model does not simulate thermal behavior, but rather utilizes user input parameters to specify weld pool and heat-affected zone properties. Weld pool shapes are specified by Bézier curves, which allow for the specification of a wide range of pool shapes. Pool shapes can range from narrow and deep to wide and shallow, representing different fluid flow conditions within the pool. Surrounding temperature gradients are calculated with the aid of a closest point projection algorithm. The model also allows simulation of pulsed power welding through time-dependent variation of the weld pool size. Example simulation results and comparisons with laboratory weld observations demonstrate microstructural variation with weld speed, pool shape, and pulsed power.
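
    The underlying Potts Monte Carlo step is simple to sketch. In the toy 2D version below (illustrative, not the SPPARKS implementation), a site attempts to adopt the grain ID of a random neighbour, and the change is accepted with the Metropolis probability, which drives curvature-driven grain coarsening:

```python
import math
import random

NEIGHBOURS = ((1, 0), (-1, 0), (0, 1), (0, -1))

def boundary_energy(spins, w, h, x, y, s):
    # One unit of energy per unlike nearest neighbour (a grain boundary).
    return sum(s != spins[(y + dy) % h][(x + dx) % w] for dx, dy in NEIGHBOURS)

def potts_sweep(spins, w, h, kT=0.1):
    for _ in range(w * h):
        x, y = random.randrange(w), random.randrange(h)
        dx, dy = random.choice(NEIGHBOURS)
        new = spins[(y + dy) % h][(x + dx) % w]   # copy a neighbour's grain
        dE = (boundary_energy(spins, w, h, x, y, new)
              - boundary_energy(spins, w, h, x, y, spins[y][x]))
        if dE <= 0 or random.random() < math.exp(-dE / kT):
            spins[y][x] = new

W = H = 64
grains = [[random.randrange(50) for _ in range(W)] for _ in range(H)]
for _ in range(20):
    potts_sweep(grains, W, H)
print(len({g for row in grains for g in row}), "grain IDs remain")
```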

  1. How Monte Carlo heuristics aid to identify the physical processes of drug release kinetics.

    PubMed

    Lecca, Paola

    2018-01-01

    We implement a Monte Carlo heuristic algorithm to model drug release from a solid dosage form. We show that with Monte Carlo simulations it is possible to identify and explain the causes of the unsatisfactory predictive power of current drug release models. It is well known that the power-law and exponential models, as well as those derived from or inspired by them, accurately reproduce only the first 60% of the release curve of a drug from a dosage form. In this study, using Monte Carlo simulation approaches, we show that these models fit quite accurately almost the entire release profile when the release kinetics is not governed by the coexistence of different physico-chemical mechanisms. We show that the accuracy of the traditional models is comparable with that of Monte Carlo heuristics when these heuristics approximate and oversimplify the phenomenology of drug release. This observation suggests developing and using novel Monte Carlo simulation heuristics able to describe the complexity of the release kinetics, and consequently to generate data more similar to those observed in real experiments. Implementing Monte Carlo simulation heuristics of the drug release phenomenology may be much more straightforward and efficient than hypothesizing and implementing complex mathematical models of the physical processes involved in drug release from scratch. Identifying and understanding through simulation heuristics which processes of this phenomenology reproduce the observed data, and then formalizing them mathematically, may allow avoiding time-consuming, trial-and-error-based regression procedures. Three bullet points highlight the customization of the procedure. •An efficient heuristic based on Monte Carlo methods for simulating drug release from a solid dosage form is presented. It specifies the model of the physical process in a simple but accurate way in the formula of the Monte Carlo Micro Step (MCS) time interval.•Given the experimentally observed curve of drug release, we point out how Monte Carlo heuristics can be integrated in an evolutionary algorithmic approach to infer the MCS model best fitting the observed data, and thus the observed release kinetics.•The software implementing the method is written in R, the free language most used in the bioinformatics community.
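
    The flavour of such a heuristic can be conveyed with a toy sketch. The paper's method is implemented in R and driven by the MCS time-interval formula; the 2D random walk below is only an illustration of the released-fraction bookkeeping, with every parameter invented:

```python
import random

def release_curve(n_molecules=1000, radius=15.0, n_steps=2000):
    """Molecules random-walk inside a disc-shaped matrix and count as
    released once they cross the boundary; the released fraction per
    step is the simulated release curve."""
    drugs = [(random.uniform(-radius, radius), random.uniform(-radius, radius))
             for _ in range(n_molecules)]
    drugs = [(x, y) for x, y in drugs if x * x + y * y < radius * radius]
    n0, released, curve = len(drugs), 0, []
    for _ in range(n_steps):
        inside = []
        for x, y in drugs:
            x += random.choice((-0.5, 0.5))
            y += random.choice((-0.5, 0.5))
            if x * x + y * y >= radius * radius:
                released += 1                 # crossed the surface
            else:
                inside.append((x, y))
        drugs = inside
        curve.append(released / n0)
    return curve

c = release_curve()
print(c[99], c[999], c[-1])                   # early, mid, late release
```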

  2. An interface for simulating radiative transfer in and around volcanic plumes with the Monte Carlo radiative transfer model McArtim

    USGS Publications Warehouse

    Kern, Christoph

    2016-03-23

    This report describes two software tools that, when used as front ends for the three-dimensional backward Monte Carlo atmospheric-radiative-transfer model (RTM) McArtim, facilitate the generation of lookup tables of volcanic-plume optical-transmittance characteristics in the ultraviolet/visible-spectral region. In particular, the differential optical depth and derivatives thereof (that is, weighting functions), with regard to a change in SO2 column density or aerosol optical thickness, can be simulated for a specific measurement geometry and a representative range of plume conditions. These tables are required for the retrieval of SO2 column density in volcanic plumes, using the simulated radiative-transfer/differential optical-absorption spectroscopic (SRT-DOAS) approach outlined by Kern and others (2012). This report, together with the software tools published online, is intended to make this sophisticated SRT-DOAS technique available to volcanologists and gas geochemists in an operational environment, without the need for an in-depth treatment of the underlying principles or the low-level interface of the RTM McArtim.

  3. Monte Carlo Methods in Materials Science Based on FLUKA and ROOT

    NASA Technical Reports Server (NTRS)

    Pinsky, Lawrence; Wilson, Thomas; Empl, Anton; Andersen, Victor

    2003-01-01

    A comprehensive understanding of mitigation measures for space radiation protection necessarily involves the relevant fields of nuclear physics and particle transport modeling. One method of modeling the interaction of radiation traversing matter is Monte Carlo analysis, a subject that has been evolving since the very advent of nuclear reactors and particle accelerators in experimental physics. Countermeasures for radiation protection from neutrons near nuclear reactors, for example, were an early application, and Monte Carlo methods were quickly adapted to this general field of investigation. The project discussed here is concerned with taking the latest tools and technology in Monte Carlo analysis and adapting them to space applications such as radiation shielding design for spacecraft, as well as investigating how next-generation Monte Carlos can complement the existing analytical methods currently used by NASA. We have chosen to employ the Monte Carlo program known as FLUKA (a legacy acronym based on the German for FLUctuating KAscade) to simulate all of the particle transport, together with the CERN-developed graphical-interface object-oriented analysis software called ROOT. One aspect of space radiation analysis for which Monte Carlos are particularly suited is the study of secondary radiation produced as albedos in the vicinity of the structural geometry involved. The broad goal of simulating space radiation transport through the relevant materials with the FLUKA code necessarily requires the addition of the capability to simulate all heavy-ion interactions from 10 MeV/A up to the highest conceivable energies. For all energies above 3 GeV/A the Dual Parton Model (DPM) is currently used, although a possible improvement of the DPMJET event generator for energies of 3-30 GeV/A is being considered. One of the major tasks still facing us is the provision for heavy-ion interactions below 3 GeV/A. The ROOT interface is being developed in conjunction with the CERN ALICE (A Large Ion Collider Experiment) software team through an adaptation of their existing AliROOT (ALICE Using ROOT) architecture. In order to check our progress against actual data, we have chosen to simulate the ATIC (Advanced Thin Ionization Calorimeter) cosmic-ray astrophysics balloon payload as well as neutron fluences in the Mir spacecraft. This paper contains a summary of the status of this project and a roadmap to its successful completion.

  4. Efficient Simulation of Secondary Fluorescence Via NIST DTSA-II Monte Carlo.

    PubMed

    Ritchie, Nicholas W M

    2017-06-01

    Secondary fluorescence, the final term in the familiar matrix correction triumvirate Z·A·F, is the most challenging for Monte Carlo models to simulate. In fact, only two implementations of Monte Carlo models commonly used to simulate electron probe X-ray spectra can calculate secondary fluorescence: PENEPMA and NIST DTSA-II (DTSA-II is discussed herein). These two models share many physical models, but there are some important differences in the way each implements X-ray emission, including secondary fluorescence. PENEPMA is based on PENELOPE, a general-purpose software package for simulation of both relativistic and subrelativistic electron/positron interactions with matter. On the other hand, NIST DTSA-II was designed exclusively for simulation of X-ray spectra generated by subrelativistic electrons. NIST DTSA-II uses variance reduction techniques unsuited to general-purpose code. These optimizations help NIST DTSA-II to be orders of magnitude more computationally efficient while retaining detector position sensitivity. Simulations execute in minutes rather than hours and can model differences that result from detector position. Both PENEPMA and NIST DTSA-II are capable of handling complex sample geometries, and we will demonstrate that both are of similar accuracy when modeling experimental secondary fluorescence data from the literature.

  5. Cross-platform validation and analysis environment for particle physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chekanov, S. V.; Pogrebnyak, I.; Wilbern, D.

    A multi-platform validation and analysis framework for public Monte Carlo simulation for high-energy particle collisions is discussed. The front-end of this framework uses the Python programming language, while the back-end is written in Java, which provides a multi-platform environment that can be run from a web browser and can easily be deployed at the grid sites. The analysis package includes all major software tools used in high-energy physics, such as Lorentz vectors, jet algorithms, histogram packages, graphic canvases, and tools for providing data access. This multi-platform software suite, designed to minimize OS-specific maintenance and deployment time, is used for online validation of Monte Carlo event samples through a web interface.

  6. Fast Monte Carlo simulation of a dispersive sample on the SEQUOIA spectrometer at the SNS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Granroth, Garrett E; Chen, Meili; Kohl, James Arthur

    2007-01-01

    Simulation of an inelastic scattering experiment, with a sample and a large pixelated detector, usually requires days of time because of finite processor speeds. We report simulations on an SNS (Spallation Neutron Source) instrument, SEQUOIA, that reduce the time to less than 2 hours by using parallelization and the resources of the TeraGrid. SEQUOIA is a fine-resolution (∆E/Ei ~ 1%) chopper spectrometer under construction at the SNS. It utilizes incident energies from Ei = 20 meV to 2 eV and will have ~144,000 detector pixels covering 1.6 sr of solid angle. The full spectrometer, including a 1-D dispersive sample, has been simulated using the Monte Carlo package McStas. This paper summarizes the parallelization method and the results of these simulations. In addition, limitations of and proposed improvements to the current analysis software are discussed.

  7. MASTOS: Mammography Simulation Tool for design Optimization Studies.

    PubMed

    Spyrou, G; Panayiotakis, G; Tzanakos, G

    2000-01-01

    Mammography is a high-quality imaging technique for the detection of breast lesions, which requires dedicated equipment and optimum operation. The design parameters of a mammography unit have to be decided and evaluated before the construction of such a high-cost apparatus. The optimum operational parameters must also be defined well before the real breast examination. MASTOS is a software package, based on Monte Carlo methods, that is designed to be used as a simulation tool in mammography. The input consists of the parameters that have to be specified when using a mammography unit, and also the parameters specifying the shape and composition of the breast phantom. In addition, the input may specify parameters needed in the design of a new mammographic apparatus. The main output of the simulation is a mammographic image and calculations of various factors that describe the image quality. The Monte Carlo simulation code is PC-based and is driven by an outer-shell graphical user interface. The entire software package is a simulation tool for mammography and can be applied in basic research and/or in training in the fields of medical physics and biomedical engineering, as well as in the performance evaluation of new designs of mammography units and in the determination of optimum standards for the operational parameters of a mammography unit.

  8. Computing Radiative Transfer in a 3D Medium

    NASA Technical Reports Server (NTRS)

    Von Allmen, Paul; Lee, Seungwon

    2012-01-01

    A package of software computes the time-dependent propagation of a narrow laser beam in an arbitrary three-dimensional (3D) medium with absorption and scattering, using the transient-discrete-ordinates method and a direct integration method. Unlike prior software that utilizes a Monte Carlo method, this software enables simulation at very small signal-to-noise ratios. The ability to simulate propagation of a narrow laser beam in a 3D medium is an improvement over other discrete-ordinate software. Unlike other direct-integration software, this software is not limited to simulation of propagation of thermal radiation with broad angular spread in three dimensions or of a laser pulse with narrow angular spread in two dimensions. Uses for this software include (1) computing scattering of a pulsed laser beam on a material having given elastic scattering and absorption profiles, and (2) evaluating concepts for laser-based instruments for sensing oceanic turbulence and related measurements of oceanic mixed-layer depths. With suitable augmentation, this software could be used to compute radiative transfer in ultrasound imaging in biological tissues, radiative transfer in the upper Earth crust for oil exploration, and propagation of laser pulses in telecommunication applications.

  9. THE DiskMass SURVEY. III. STELLAR KINEMATICS VIA CROSS-CORRELATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Westfall, Kyle B.; Bershady, Matthew A.; Verheijen, Marc A. W., E-mail: westfall@astro.rug.nl, E-mail: mab@astro.wisc.edu, E-mail: verheyen@astro.rug.nl

    2011-03-15

    We describe a new cross-correlation (CC) approach used by our survey to derive stellar kinematics from galaxy-continuum spectroscopy. This approach adopts the formal error analysis derived by Statler, but properly handles spectral masks. Thus, we address the primary concerns regarding application of the CC method to censored data, while maintaining its primary advantage by consolidating kinematic and template-mismatch information toward different regions of the CC function. We identify a systematic error in the nominal CC method of approximately 10% in velocity dispersion incurred by a mistreatment of detector-censored data, which is eliminated by our new method. We derive our approach from first principles, and we use Monte Carlo simulations to demonstrate its efficacy. An identical set of Monte Carlo simulations performed using the well-established penalized-pixel-fitting code of Cappellari and Emsellem compares favorably with the results from our newly implemented software. Finally, we provide a practical demonstration of this software by extracting stellar kinematics from SparsePak spectra of UGC 6918.

  10. VoxelMages: a general-purpose graphical interface for designing geometries and processing DICOM images for PENELOPE.

    PubMed

    Giménez-Alventosa, V; Ballester, F; Vijande, J

    2016-12-01

    The design and construction of geometries for Monte Carlo calculations is an error-prone, time-consuming, and complex step in simulations describing particle interactions and transport in the field of medical physics. The software VoxelMages has been developed to help the user in this task. It allows the user to design complex geometries and to process DICOM image files for simulations with the general-purpose Monte Carlo code PENELOPE in an easy and straightforward way. VoxelMages also allows the user to import DICOM-RT structure contour information as delivered by a treatment planning system. Its main characteristics, usage and performance benchmarking are described in detail. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. DSN Array Simulator

    NASA Technical Reports Server (NTRS)

    Tikidjian, Raffi; Mackey, Ryan

    2008-01-01

    The DSN Array Simulator (wherein 'DSN' signifies NASA's Deep Space Network) is an updated version of software previously denoted the DSN Receive Array Technology Assessment Simulation. This software is used for computational modeling of a proposed DSN facility comprising user-defined arrays of antennas and transmitting and receiving equipment for microwave communication with spacecraft on interplanetary missions. The simulation includes variations in the spacecraft tracked and changes in communication demand for up to several decades of future operation. Such modeling is performed to estimate facility performance, evaluate requirements that govern facility design, and evaluate proposed improvements in hardware and/or software. The updated version of this software affords enhanced capability for characterizing facility performance against user-defined mission sets. The software includes a Monte Carlo simulation component that enables rapid generation of key mission-set metrics (e.g., numbers of links, data rates, and data volumes), and statistical distributions thereof as functions of time. The updated version also offers expanded capability for mixed-asset network modeling--for example, for running scenarios that involve user-definable mixtures of antennas having different diameters (in contradistinction to a fixed number of antennas having the same fixed diameter). The improved version also affords greater simulation fidelity, sufficient for validation by comparison with actual DSN operations and analytically predictable performance metrics.

  12. A Monte Carlo model for 3D grain evolution during welding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rodgers, Theron M.; Mitchell, John A.; Tikare, Veena

    Welding is one of the most wide-spread processes used in metal joining. However, there are currently no open-source software implementations for the simulation of microstructural evolution during a weld pass. Here we describe a Potts Monte Carlo based model implemented in the SPPARKS kinetic Monte Carlo computational framework. The model simulates melting, solidification and solid-state microstructural evolution of material in the fusion and heat-affected zones of a weld. The model does not simulate thermal behavior, but rather utilizes user input parameters to specify weld pool and heat-affected zone properties. Weld pool shapes are specified by Bézier curves, which allow for the specification of a wide range of pool shapes. Pool shapes can range from narrow and deep to wide and shallow, representing different fluid flow conditions within the pool. Surrounding temperature gradients are calculated with the aid of a closest point projection algorithm. Furthermore, the model also allows simulation of pulsed power welding through time-dependent variation of the weld pool size. Example simulation results and comparisons with laboratory weld observations demonstrate microstructural variation with weld speed, pool shape, and pulsed power.

  13. A Monte Carlo model for 3D grain evolution during welding

    DOE PAGES

    Rodgers, Theron M.; Mitchell, John A.; Tikare, Veena

    2017-08-04

    Welding is one of the most wide-spread processes used in metal joining. However, there are currently no open-source software implementations for the simulation of microstructural evolution during a weld pass. Here we describe a Potts Monte Carlo based model implemented in the SPPARKS kinetic Monte Carlo computational framework. The model simulates melting, solidification and solid-state microstructural evolution of material in the fusion and heat-affected zones of a weld. The model does not simulate thermal behavior, but rather utilizes user input parameters to specify weld pool and heat-affected zone properties. Weld pool shapes are specified by Bézier curves, which allow for the specification of a wide range of pool shapes. Pool shapes can range from narrow and deep to wide and shallow, representing different fluid flow conditions within the pool. Surrounding temperature gradients are calculated with the aid of a closest point projection algorithm. Furthermore, the model also allows simulation of pulsed power welding through time-dependent variation of the weld pool size. Example simulation results and comparisons with laboratory weld observations demonstrate microstructural variation with weld speed, pool shape, and pulsed power.

  14. Towards real-time photon Monte Carlo dose calculation in the cloud

    NASA Astrophysics Data System (ADS)

    Ziegenhein, Peter; Kozin, Igor N.; Kamerling, Cornelis Ph; Oelfke, Uwe

    2017-06-01

    Near real-time application of Monte Carlo (MC) dose calculation in clinic and research is hindered by the long computational runtimes of established software. Currently, fast MC software solutions are available utilising accelerators such as graphical processing units (GPUs) or clusters based on central processing units (CPUs). Both platforms are expensive in terms of purchase costs and maintenance and, in the case of the GPU, provide only limited scalability. In this work we propose a cloud-based MC solution, which offers high scalability of accurate photon dose calculations. The MC simulations run on a private virtual supercomputer that is formed in the cloud. Computational resources can be provisioned dynamically at low cost without upfront investment in expensive hardware. A client-server software solution has been developed which controls the simulations and transports data to and from the cloud efficiently and securely. The client application integrates seamlessly into a treatment planning system. It runs the MC simulation workflow automatically and securely exchanges simulation data with the server-side application that controls the virtual supercomputer. Advanced encryption standards were used to add an additional security layer, which encrypts and decrypts patient data on-the-fly at the processor register level. We show that our cloud-based MC framework enables near real-time dose computation. It delivers excellent linear scaling for high-resolution datasets, with absolute runtimes of 1.1 seconds to 10.9 seconds for simulating clinical prostate and liver cases up to 1% statistical uncertainty. The computation runtimes include the transportation of data to and from the cloud as well as process scheduling and synchronisation overhead. Cloud-based MC simulations offer a fast, affordable and easily accessible alternative to currently used GPU or cluster solutions for near real-time accurate dose calculations.

  15. Towards real-time photon Monte Carlo dose calculation in the cloud.

    PubMed

    Ziegenhein, Peter; Kozin, Igor N; Kamerling, Cornelis Ph; Oelfke, Uwe

    2017-06-07

    Near real-time application of Monte Carlo (MC) dose calculation in clinic and research is hindered by the long computational runtimes of established software. Currently, fast MC software solutions are available utilising accelerators such as graphical processing units (GPUs) or clusters based on central processing units (CPUs). Both platforms are expensive in terms of purchase costs and maintenance and, in the case of the GPU, provide only limited scalability. In this work we propose a cloud-based MC solution, which offers high scalability of accurate photon dose calculations. The MC simulations run on a private virtual supercomputer that is formed in the cloud. Computational resources can be provisioned dynamically at low cost without upfront investment in expensive hardware. A client-server software solution has been developed which controls the simulations and transports data to and from the cloud efficiently and securely. The client application integrates seamlessly into a treatment planning system. It runs the MC simulation workflow automatically and securely exchanges simulation data with the server-side application that controls the virtual supercomputer. Advanced encryption standards were used to add an additional security layer, which encrypts and decrypts patient data on-the-fly at the processor register level. We show that our cloud-based MC framework enables near real-time dose computation. It delivers excellent linear scaling for high-resolution datasets, with absolute runtimes of 1.1 seconds to 10.9 seconds for simulating clinical prostate and liver cases up to 1% statistical uncertainty. The computation runtimes include the transportation of data to and from the cloud as well as process scheduling and synchronisation overhead. Cloud-based MC simulations offer a fast, affordable and easily accessible alternative to currently used GPU or cluster solutions for near real-time accurate dose calculations.

  16. Comparison of Three Methods of Calculation, Experimental and Monte Carlo Simulation in Investigation of Organ Doses (Thyroid, Sternum, Cervical Vertebra) in Radioiodine Therapy

    PubMed Central

    Shahbazi-Gahrouei, Daryoush; Ayat, Saba

    2012-01-01

    Radioiodine therapy is an effective method for treating thyroid carcinoma, but it has some effects on normal tissues, hence dosimetry of vital organs is important to weigh the risks and benefits of this method. The aim of this study is to measure the absorbed doses of important organs by Monte Carlo N-Particle (MCNP) simulation and to compare the results of different methods of dosimetry by performing a paired t-test. To calculate the absorbed dose of the thyroid, sternum, and cervical vertebra using the MCNP code, the *F8 tally was used. Organs were simulated using a neck phantom and the Medical Internal Radiation Dose (MIRD) method. Finally, the results of the MCNP, MIRD, and thermoluminescent dosimeter (TLD) measurements were compared with SPSS software. The absorbed dose obtained by Monte Carlo simulation for 100, 150, and 175 mCi of administered 131I was found to be 388.0, 427.9, and 444.8 cGy for the thyroid, 208.7, 230.1, and 239.3 cGy for the sternum, and 272.1, 299.9, and 312.1 cGy for the cervical vertebra. The results of the paired t-tests were 0.24 for the comparison of TLD dosimetry and MIRD calculation, 0.80 for MCNP simulation and MIRD, and 0.19 for TLD and MCNP. The results showed no significant differences among the three methods of Monte Carlo simulation, MIRD calculation and direct experimental dosimetry using TLD. PMID:23717806

  17. Multibody Simulation Software Testbed for Small-Body Exploration and Sampling

    NASA Technical Reports Server (NTRS)

    Acikmese, Behcet; Blackmore, James C.; Mandic, Milan

    2011-01-01

    G-TAG is a software tool for the multibody simulation of a spacecraft with a robotic arm and a sampling mechanism, which performs a touch-and-go (TAG) maneuver for sampling from the surface of a small celestial body. G-TAG utilizes G-DYN, a multi-body simulation engine described in the previous article, and interfaces to controllers, estimators, and environmental forces that affect the spacecraft. G-TAG can easily be adapted for the analysis of the mission stress cases to support the design of a TAG system, as well as for comprehensive Monte Carlo simulations to analyze and evaluate a particular TAG system design. Any future small-body mission will benefit from using G-TAG, which has already been extensively used in Comet Odyssey and Galahad Asteroid New Frontiers proposals.

  18. Three-Dimensional (3D) Nanometrology Based on Scanning Electron Microscope (SEM) Stereophotogrammetry.

    PubMed

    Tondare, Vipin N; Villarrubia, John S; Vladár, András E

    2017-10-01

    Three-dimensional (3D) reconstruction of a sample surface from scanning electron microscope (SEM) images taken at two perspectives has been known for decades. Nowadays, there exist several commercially available stereophotogrammetry software packages. For testing these software packages, in this study we used Monte Carlo simulated SEM images of virtual samples. A virtual sample is a model in a computer, and its true dimensions are known exactly, which is impossible for real SEM samples due to measurement uncertainty. The simulated SEM images can be used for algorithm testing, development, and validation. We tested two stereophotogrammetry software packages and compared their reconstructed 3D models with the known geometry of the virtual samples used to create the simulated SEM images. Both packages performed relatively well with simulated SEM images of a sample with a rough surface. However, in a sample containing nearly uniform and therefore low-contrast zones, the height reconstruction error was ≈46%. The present stereophotogrammetry software packages need further improvement before they can be used reliably with SEM images with uniform zones.

  19. Front panel engineering with CAD simulation tool

    NASA Astrophysics Data System (ADS)

    Delacour, Jacques; Ungar, Serge; Mathieu, Gilles; Hasna, Guenther; Martinez, Pascal; Roche, Jean-Christophe

    1999-04-01

    The progress made recently in display technology covers many fields of application. The specification of radiance, colorimetry and lighting efficiency creates new challenges for designers. Photometric design is limited by the capability of correctly predicting the result of a lighting system, to save on the costs and time taken to build multiple prototypes or breadboard benches. The second step of the research carried out by the company OPTIS is to propose an optimization method to be applied to the lighting system, developed in the software SPEOS. The main features required of the tool include a CAD interface, to enable fast and efficient transfer between mechanical and light design software, source modeling, a light transfer model and an optimization tool. The CAD interface is mainly a transfer prototype, which is not the subject here. Photometric simulation is efficiently achieved by using measured source encoding and a simulation by the Monte Carlo method. Today, the advantages and the limitations of the Monte Carlo method are well known. Noise reduction requires a long calculation time, which increases with the complexity of the display panel. A successful optimization is difficult to achieve, due to the long calculation time required for each optimization pass, which includes a Monte Carlo simulation. The problem was initially defined as an engineering method of study. Experience shows that good understanding and mastery of the phenomenon of light transfer is limited by the complexity of non-sequential propagation, so the engineer must call on a simulation and optimization tool. The main requirement for an efficient optimization is a quick method for simulating light transfer. Much work has been done in this area and some interesting results can be observed. It must be said that the Monte Carlo method wastes time calculating results and information which are not required for the needs of the simulation, and low-efficiency transfer systems cost a great deal of time. More generally, light transfer simulation can be treated efficiently when the integrated result is composed of elementary sub-results that involve quickly calculated analytical intersections. Two axes of research thus appear: quick integration, and quick calculation of geometric intersections. The first axis brings some general solutions that are also valid for multi-reflection systems. The second axis requires deep thinking about the intersection calculation. An interesting approach is the subdivision of space into voxels, a method of 3D division of space adapted to the objects and their locations. Experimental software has been developed to provide a validation of the method. The gain is particularly high in complex systems. An important reduction in the calculation time has been achieved.
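
    The voxel-subdivision idea can be sketched briefly: objects are bucketed into the cells of a uniform grid, so that an intersection query only tests the objects registered in the cells it touches. The bounding-box registration below is an illustrative assumption, not the SPEOS implementation.

```python
import math
from collections import defaultdict

def build_voxel_grid(objects, cell_size):
    """Register each object (given as an axis-aligned bounding box
    (min_xyz, max_xyz)) in every grid cell it overlaps, so that an
    intersection query only tests objects in the cells it touches."""
    grid = defaultdict(list)
    for obj_id, (lo, hi) in enumerate(objects):
        c_lo = [math.floor(c / cell_size) for c in lo]
        c_hi = [math.floor(c / cell_size) for c in hi]
        for ix in range(c_lo[0], c_hi[0] + 1):
            for iy in range(c_lo[1], c_hi[1] + 1):
                for iz in range(c_lo[2], c_hi[2] + 1):
                    grid[(ix, iy, iz)].append(obj_id)
    return grid

def candidates_at(grid, point, cell_size):
    # Objects registered in the cell containing the query point.
    key = tuple(math.floor(c / cell_size) for c in point)
    return grid.get(key, [])

boxes = [((0, 0, 0), (1, 1, 1)), ((5, 5, 5), (6, 6, 6))]
g = build_voxel_grid(boxes, cell_size=2.0)
print(candidates_at(g, (0.5, 0.5, 0.5), 2.0))   # -> [0]
```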

  20. Dynamic Modeling Using MCSim and R (SOT 2016 Biological Modeling Webinar Series)

    EPA Science Inventory

    MCSim is a stand-alone software package for simulating and analyzing dynamic models, with a focus on Bayesian analysis using Markov Chain Monte Carlo. While it is an extremely powerful package, it is somewhat inflexible, and offers only a limited range of analysis options, with n...

  1. Simple Sensitivity Analysis for Orion GNC

    NASA Technical Reports Server (NTRS)

    Pressburger, Tom; Hoelscher, Brian; Martin, Rodney; Sricharan, Kumar

    2013-01-01

    The performance of Orion flight software, especially its GNC software, is being analyzed by running Monte Carlo simulations of Orion spacecraft flights. The simulated performance is analyzed for conformance with flight requirements, expressed as performance constraints. Flight requirements include guidance (e.g., touchdown distance from target) and control (e.g., control saturation) as well as performance (e.g., heat load constraints). The Monte Carlo simulations disperse hundreds of simulation input variables, for everything from mass properties to date of launch. We describe in this paper a sensitivity analysis tool (Critical Factors Tool or CFT) developed to find the input variables, or pairs of variables, which by themselves significantly influence satisfaction of requirements or significantly affect key performance metrics (e.g., touchdown distance from target). Knowing these factors can inform robustness analysis, can inform where engineering resources are most needed, and could even affect operations. The contributions of this paper include the introduction of novel sensitivity measures, such as estimating success probability, and a technique for determining whether pairs of factors are interacting dependently or independently. The tool found that input variables such as moments, mass, thrust dispersions, and date of launch were significant factors for the success of various requirements. Examples are shown in this paper, as well as a summary and physics discussion of the EFT-1 driving factors that the tool found.
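
    One of the sensitivity measures described, estimating success probability as a function of a single dispersed input, can be sketched as a simple binning of Monte Carlo outcomes; the toy touchdown-miss model and all thresholds below are invented for illustration and are not CFT internals.

```python
import numpy as np

rng = np.random.default_rng(3)

def binned_success_probability(x, success, n_bins=10):
    """Estimate P(success) as a function of one dispersed input by
    binning Monte Carlo runs on that input's quantiles; a strong trend
    across bins flags the input as a driving factor."""
    edges = np.quantile(x, np.linspace(0.0, 1.0, n_bins + 1))
    idx = np.clip(np.digitize(x, edges[1:-1]), 0, n_bins - 1)
    return np.array([success[idx == b].mean() for b in range(n_bins)])

# Toy campaign: touchdown miss distance depends mostly on one input.
n = 20000
wind = rng.normal(size=n)                     # strongly coupled input
mass = rng.normal(size=n)                     # nearly irrelevant input
miss = 200 + 80 * wind + 5 * mass + rng.normal(scale=20, size=n)
ok = miss < 300                               # requirement: miss < 300 m

print(binned_success_probability(wind, ok))   # clear monotone trend
print(binned_success_probability(mass, ok))   # nearly flat
```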

  2. Fast scattering simulation tool for multi-energy x-ray imaging

    NASA Astrophysics Data System (ADS)

    Sossin, A.; Tabary, J.; Rebuffel, V.; Létang, J. M.; Freud, N.; Verger, L.

    2015-12-01

    A combination of Monte Carlo (MC) and deterministic approaches was employed to create a simulation tool capable of providing energy-resolved x-ray primary and scatter images within a reasonable time interval. Libraries of Sindbad, a previously developed x-ray simulation software package, were used in the development. The scatter simulation capabilities of the tool were validated through simulation with the aid of GATE and through experiments using a spectrometric CdTe detector. A simple cylindrical phantom with cavities and an aluminum insert was used. Cross-validation with GATE showed good agreement, with a global spatial error of 1.5% and a maximum scatter-spectrum error of around 6%. Experimental validation also supported the accuracy of the simulations obtained from the developed software, with a global spatial error of 1.8% and a maximum error of around 8.5% in the scatter spectra.

  3. A laboratory investigation of the variability of cloud reflected radiance fields

    NASA Technical Reports Server (NTRS)

    Mckee, T. B.; Cox, S. K.

    1986-01-01

    A method to determine the radiative properties of complex cloud fields was developed. A cloud field optical simulator (CFOS) was constructed to simulate the interaction of cloud fields with visible radiation. The CFOS was verified by comparing experimental results from it with calculations performed with a Monte Carlo radiative transfer model. A software library was developed to process, reduce, and display CFOS data. The CFOS was utilized to study the reflected radiance patterns from simulated cloud fields.

  4. Monte Carlo simulation of portal dosimetry on a rectilinear voxel geometry: a variable gantry angle solution.

    PubMed

    Chin, P W; Spezi, E; Lewis, D G

    2003-08-21

    A software solution has been developed to carry out Monte Carlo simulations of portal dosimetry using the BEAMnrc/DOSXYZnrc code at oblique gantry angles. The solution is based on an integrated phantom, whereby the effect of incident beam obliquity is included using geometric transformations. The geometric transformations are accurate to within +/- 1 mm and +/- 1 degree with respect to exact values calculated using trigonometry. An application to portal image prediction for an inhomogeneous phantom demonstrated good agreement with measured data, where the root-mean-square of the difference was under 2% within the field. Thus, we achieved a dose model framework capable of handling arbitrary gantry angles, voxel-by-voxel phantom description and realistic particle transport throughout the geometry.
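
    The geometric-transformation idea can be illustrated in a few lines: rather than tilting the beam, the phantom coordinates are rotated by the gantry angle so the transport code can keep a fixed beam axis. The axis choice and conventions below are illustrative, not those of BEAMnrc/DOSXYZnrc.

```python
import numpy as np

def rotate_about_x(points, gantry_deg):
    """Rotate phantom voxel coordinates by the gantry angle so the
    simulation can keep the beam along a fixed axis; equivalent, up to
    conventions, to tilting the beam around the phantom."""
    a = np.radians(gantry_deg)
    R = np.array([[1.0, 0.0, 0.0],
                  [0.0, np.cos(a), -np.sin(a)],
                  [0.0, np.sin(a),  np.cos(a)]])
    return points @ R.T

voxel_centres = np.array([[0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
print(rotate_about_x(voxel_centres, 30.0))
```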

  5. A method for the complete analysis of NORM building materials by γ-ray spectrometry using HPGe detectors.

    PubMed

    Quintana, B; Pedrosa, M C; Vázquez-Canelas, L; Santamaría, R; Sanjuán, M A; Puertas, F

    2018-04-01

    A methodology including software tools for analysing NORM building materials and residues by low-level gamma-ray spectrometry has been developed. It comprises deconvolution of gamma-ray spectra using the software GALEA with focus on the natural radionuclides and Monte Carlo simulations for efficiency and true coincidence summing corrections. The methodology has been tested on a range of building materials and validated against reference materials. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Monte Carlo Simulation to Estimate Likelihood of Direct Lightning Strikes

    NASA Technical Reports Server (NTRS)

    Mata, Carlos; Medelius, Pedro

    2008-01-01

    A software tool has been designed to quantify the lightning exposure of the stack at the launch pads under different configurations. In order to predict lightning strikes to generic structures, this model uses leaders whose origins (in the x-y plane) are obtained from a 2D random, normal distribution.
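
    A minimal Python sketch of the sampling idea described above, assuming invented values for the leader-origin spread and a simple attractive radius around the structure; it is not the actual model used by the tool:

        import numpy as np

        rng = np.random.default_rng(1)
        n = 100_000                  # simulated downward leaders
        sigma = 200.0                # std dev of leader origins in x-y (m), assumed
        attract_radius = 50.0        # strike collection radius of the stack (m), assumed

        xy = rng.normal(0.0, sigma, size=(n, 2))       # 2D normal leader origins
        hits = np.hypot(xy[:, 0], xy[:, 1]) < attract_radius
        print(f"estimated strike probability: {hits.mean():.4f}")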

  7. Hazard Detection Software for Lunar Landing

    NASA Technical Reports Server (NTRS)

    Huertas, Andres; Johnson, Andrew E.; Werner, Robert A.; Montgomery, James F.

    2011-01-01

    The Autonomous Landing and Hazard Avoidance Technology (ALHAT) Project is developing a system for safe and precise manned lunar landing that involves novel sensors, but also specific algorithms. ALHAT has selected imaging LIDAR (light detection and ranging) as the sensing modality for onboard hazard detection because imaging LIDARs can rapidly generate direct measurements of the lunar surface elevation from high altitude. Then, starting with the LIDAR-based Hazard Detection and Avoidance (HDA) algorithm developed for Mars landing, JPL has developed a mature set of HDA software for the manned lunar landing problem. Landing hazards exist everywhere on the Moon, and many of the more desirable landing sites are near the most hazardous terrain, so HDA is needed to autonomously and safely land payloads over much of the lunar surface. The HDA requirements used in the ALHAT project are to detect hazards that are 0.3 m tall or higher and slopes of 5° or greater. Steep slopes, rocks, cliffs, and gullies are all hazards for landing and, by computing the local slope and roughness in an elevation map, all of these hazards can be detected. The algorithm in this innovation is used to measure slope and roughness hazards. In addition to detecting these hazards, the HDA capability also is able to find a safe landing site free of these hazards for a lunar lander with a diameter of 15 m over most of the lunar surface. This software includes an implementation of the HDA algorithm, software for generating simulated lunar terrain maps for testing, hazard detection performance analysis tools, and associated documentation. The HDA software has been deployed to Langley Research Center and integrated into the POST II Monte Carlo simulation environment. The high-fidelity Monte Carlo simulations determine the required ground spacing between LIDAR samples (ground sample distances) and the noise on the LIDAR range measurement. This simulation has also been used to determine the effect of viewing angle on hazard detection performance. The software has also been deployed to Johnson Space Center and integrated into the ALHAT real-time Hardware-in-the-Loop testbed.
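
    The slope-and-roughness test described above lends itself to a compact sketch: fit a plane to each local window of the elevation map, take the plane's tilt as slope and the residual height as roughness. The 0.3 m and 5° thresholds are the requirements quoted above; the window size, grid spacing, and synthetic terrain are assumptions, and this is not the ALHAT implementation.

        import numpy as np

        def hazard_map(z, dx=0.5, win=5, max_slope_deg=5.0, max_rough_m=0.3):
            """Flag cells whose local slope or roughness exceeds the thresholds."""
            h, w = z.shape
            hazards = np.zeros((h, w), dtype=bool)
            half = win // 2
            yy, xx = np.mgrid[:win, :win] * dx
            A = np.column_stack([xx.ravel(), yy.ravel(), np.ones(win * win)])
            for i in range(half, h - half):
                for j in range(half, w - half):
                    patch = z[i-half:i+half+1, j-half:j+half+1].ravel()
                    coef, *_ = np.linalg.lstsq(A, patch, rcond=None)  # local plane fit
                    slope = np.degrees(np.arctan(np.hypot(coef[0], coef[1])))
                    rough = np.max(np.abs(patch - A @ coef))          # residual height
                    hazards[i, j] = slope > max_slope_deg or rough > max_rough_m
            return hazards

        z = np.random.default_rng(0).normal(0.0, 0.05, (40, 40))  # synthetic terrain (m)
        print(hazard_map(z).sum(), "hazardous cells")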

  8. A 3DHZETRN Code in a Spherical Uniform Sphere with Monte Carlo Verification

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Slaba, Tony C.; Badavi, Francis F.; Reddell, Brandon D.; Bahadori, Amir A.

    2014-01-01

    The computationally efficient HZETRN code has been used in recent trade studies for lunar and Martian exploration and is currently being used in the engineering development of the next generation of space vehicles, habitats, and extra vehicular activity equipment. A new version (3DHZETRN) capable of transporting High charge (Z) and Energy (HZE) and light ions (including neutrons) under space-like boundary conditions with enhanced neutron and light ion propagation is under development. In the present report, new algorithms for light ion and neutron propagation with well-defined convergence criteria in 3D objects are developed and tested against Monte Carlo simulations to verify the solution methodology. The code will be available through the software system, OLTARIS, for shield design and validation and provides a basis for personal computer software capable of space shield analysis and optimization.

  9. [Development of a software for 3D virtual phantom design].

    PubMed

    Zou, Lian; Xie, Zhao; Wu, Qi

    2014-02-01

    In this paper, we present a 3D virtual phantom design software package, developed based on object-oriented programming methodology and dedicated to medical physics research. The software, named Magical Phantom (MPhantom), is composed of a 3D visual builder module and a virtual CT scanner. Users can conveniently construct any complex 3D phantom and then export the phantom as DICOM 3.0 CT images. MPhantom is a user-friendly and powerful software package for 3D phantom configuration and has passed real-scene application testing. MPhantom will accelerate Monte Carlo simulation for dose calculation in radiation therapy and X-ray imaging reconstruction algorithm research.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hiller, Mauritius M.; Veinot, Kenneth G.; Easterly, Clay E.

    In this study, methods are addressed to reduce the computational time to compute organ-dose rate coefficients using Monte Carlo techniques. Several variance reduction techniques are compared, including the reciprocity method, importance sampling, weight windows, and the use of the ADVANTG software package. For low-energy photons, the runtime was reduced by a factor of 10⁵ when using the reciprocity method for kerma computation for immersion of a phantom in contaminated water. This is particularly significant since impractically long simulation times are required to achieve reasonable statistical uncertainties in organ dose for low-energy photons in this source medium and geometry. Although the MCNP Monte Carlo code is used in this paper, the reciprocity technique can be used equally well with other Monte Carlo codes.
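
    As an aside on the variance reduction techniques being compared, the core of importance sampling is to draw from a biased distribution that favors the rare, important region and then reweight by the ratio of densities. A generic Python sketch (a toy rare-event estimate, not a transport calculation):

        import numpy as np

        rng = np.random.default_rng(2)
        n = 100_000
        f = lambda x: (x > 4.0).astype(float)        # rare event under p = N(0, 1)

        x = rng.normal(4.0, 1.0, n)                  # sample biased density q = N(4, 1)
        p = np.exp(-0.5 * x**2) / np.sqrt(2 * np.pi)
        q = np.exp(-0.5 * (x - 4.0)**2) / np.sqrt(2 * np.pi)
        print("importance-sampled estimate:", np.mean(f(x) * p / q))   # ~3.2e-5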

  11. Atomdroid: a computational chemistry tool for mobile platforms.

    PubMed

    Feldt, Jonas; Mata, Ricardo A; Dieterich, Johannes M

    2012-04-23

    We present the implementation of a new molecular mechanics program designed for use on mobile platforms, the first specifically built for these devices. The software is designed to run on the Android operating system and is compatible with several modern tablet PCs and smartphones available on the market. It includes molecular viewer/builder capabilities with integrated routines for geometry optimizations and Monte Carlo simulations. These functionalities allow it to work as a stand-alone tool. We discuss some particular development aspects, as well as the overall feasibility of using computational chemistry software packages on mobile platforms. Benchmark calculations show that, through efficient implementation techniques, even hand-held devices can be used to simulate midsized systems using force fields.
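
    The Monte Carlo routines in such molecular mechanics codes are typically of the Metropolis type; a minimal single-coordinate Python sketch with a toy harmonic potential (all parameters assumed) illustrates the acceptance rule:

        import math, random

        random.seed(3)
        energy = lambda x: 0.5 * x * x           # toy 1D harmonic potential
        beta, step, x = 1.0, 0.5, 0.0
        accepted = 0
        for _ in range(100_000):
            trial = x + random.uniform(-step, step)
            d_e = energy(trial) - energy(x)
            if d_e <= 0.0 or random.random() < math.exp(-beta * d_e):  # Metropolis rule
                x = trial
                accepted += 1
        print("acceptance ratio:", accepted / 100_000)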

  12. Simple Sensitivity Analysis for Orion Guidance Navigation and Control

    NASA Technical Reports Server (NTRS)

    Pressburger, Tom; Hoelscher, Brian; Martin, Rodney; Sricharan, Kumar

    2013-01-01

    The performance of Orion flight software, especially its GNC software, is being analyzed by running Monte Carlo simulations of Orion spacecraft flights. The simulated performance is analyzed for conformance with flight requirements, expressed as performance constraints. Flight requirements include guidance (e.g., touchdown distance from target) and control (e.g., control saturation) as well as performance (e.g., heat load constraints). The Monte Carlo simulations disperse hundreds of simulation input variables, for everything from mass properties to date of launch. We describe in this paper a sensitivity analysis tool ("Critical Factors Tool" or CFT) developed to find the input variables or pairs of variables which by themselves significantly influence satisfaction of requirements or significantly affect key performance metrics (e.g., touchdown distance from target). Knowing these factors can inform robustness analysis, can inform where engineering resources are most needed, and could even affect operations. The contributions of this paper include the introduction of novel sensitivity measures, such as estimating success probability, and a technique for determining whether pairs of factors are interacting dependently or independently. The tool found that input variables such as moments, mass, thrust dispersions, and date of launch were significant factors for success of various requirements. Examples are shown in this paper as well as a summary and physics discussion of EFT-1 driving factors that the tool found.
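
    A minimal Python sketch of the kind of screening this abstract describes: bin each dispersed input, estimate the success probability per bin, and rank inputs by the spread across bins. The variable names, toy requirement, and thresholds are invented for illustration; this is not the CFT algorithm itself.

        import numpy as np

        rng = np.random.default_rng(4)
        n = 20_000
        inputs = {"mass": rng.normal(0, 1, n),
                  "thrust": rng.normal(0, 1, n),
                  "launch_day": rng.uniform(0, 365, n)}
        # toy requirement: touchdown miss distance must stay under a limit
        miss = 2.0 * inputs["thrust"] + 0.3 * inputs["mass"] + rng.normal(0, 1, n)
        success = np.abs(miss) < 2.0

        for name, x in inputs.items():
            edges = np.quantile(x, np.linspace(0, 1, 11))
            idx = np.digitize(x, edges[1:-1])            # decile index 0..9
            p = np.array([success[idx == b].mean() for b in range(10)])
            print(f"{name:10s} success-probability spread: {p.max() - p.min():.3f}")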

  13. Deterministic absorbed dose estimation in computed tomography using a discrete ordinates method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Norris, Edward T.; Liu, Xin, E-mail: xinliu@mst.edu; Hsieh, Jiang

    Purpose: Organ dose estimation for a patient undergoing computed tomography (CT) scanning is very important. Although Monte Carlo methods are considered gold-standard in patient dose estimation, the computation time required is formidable for routine clinical calculations. Here, the authors investigate a deterministic method for estimating an absorbed dose more efficiently. Methods: Compared with current Monte Carlo methods, a more efficient approach to estimating the absorbed dose is to solve the linear Boltzmann equation numerically. In this study, an axial CT scan was modeled with a software package, Denovo, which solved the linear Boltzmann equation using the discrete ordinates method. The CT scanning configuration included 16 x-ray source positions, beam collimators, flat filters, and bowtie filters. The phantom was the standard 32 cm CT dose index (CTDI) phantom. Four different Denovo simulations were performed with different simulation parameters, including the number of quadrature sets and the order of Legendre polynomial expansions. A Monte Carlo simulation was also performed for benchmarking the Denovo simulations. A quantitative comparison was made of the simulation results obtained by the Denovo and the Monte Carlo methods. Results: The difference in the simulation results of the discrete ordinates method and those of the Monte Carlo methods was found to be small, with a root-mean-square difference of around 2.4%. It was found that the discrete ordinates method, with a higher order of Legendre polynomial expansions, underestimated the absorbed dose near the center of the phantom (i.e., low dose region). Simulations of the quadrature set 8 and the first order of the Legendre polynomial expansions proved to be the most efficient computation method in the authors’ study. The single-thread computation time of the deterministic simulation of the quadrature set 8 and the first order of the Legendre polynomial expansions was 21 min on a personal computer. Conclusions: The simulation results showed that the deterministic method can be effectively used to estimate the absorbed dose in a CTDI phantom. The accuracy of the discrete ordinates method was close to that of a Monte Carlo simulation, and the primary benefit of the discrete ordinates method lies in its rapid computation speed. It is expected that further optimization of this method in routine clinical CT dose estimation will improve its accuracy and speed.

  14. A Grand Canonical Monte Carlo simulation program for computing ion distributions around biomolecules in hard sphere solvents

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    The GIBS software program is a Grand Canonical Monte Carlo (GCMC) simulation code (written in C++) that can be used for (1) computing the excess chemical potential of ions and the mean activity coefficients of salts in homogeneous electrolyte solutions and (2) computing the distribution of ions around fixed macromolecules such as nucleic acids and proteins. The solvent can be represented as neutral hard spheres or as a dielectric continuum. The ions are represented as charged hard spheres that can interact via Coulomb, hard-sphere, or Lennard-Jones potentials. In addition to hard-sphere repulsions, the ions can also be made to interact with the solvent hard spheres via short-ranged attractive square-well potentials.
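
    For context, the elementary GCMC moves are particle insertion and deletion accepted with the standard grand-canonical ratios. A schematic Python sketch for an ideal-gas-like system (interactions omitted so the energy change is zero; volume and activity are assumed values, and this is not the GIBS code):

        import random

        random.seed(5)
        V, z = 1000.0, 0.05   # volume and activity z = exp(beta*mu)/lambda^3, assumed
        N = 10
        for _ in range(100_000):
            if random.random() < 0.5:                   # attempt insertion
                if random.random() < z * V / (N + 1):   # acc. prob. min(1, zV/(N+1))
                    N += 1
            elif N > 0:                                 # attempt deletion
                if random.random() < N / (z * V):       # acc. prob. min(1, N/(zV))
                    N -= 1
        print("final particle count:", N, "(ideal-gas mean is z*V =", z * V, ")")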

  15. dsmcFoam+: An OpenFOAM based direct simulation Monte Carlo solver

    NASA Astrophysics Data System (ADS)

    White, C.; Borg, M. K.; Scanlon, T. J.; Longshaw, S. M.; John, B.; Emerson, D. R.; Reese, J. M.

    2018-03-01

    dsmcFoam+ is a direct simulation Monte Carlo (DSMC) solver for rarefied gas dynamics, implemented within the OpenFOAM software framework, and parallelised with MPI. It is open-source and released under the GNU General Public License in a publicly available software repository that includes detailed documentation and tutorial DSMC gas flow cases. This release of the code includes many features not found in standard dsmcFoam, such as molecular vibrational and electronic energy modes, chemical reactions, and subsonic pressure boundary conditions. Since dsmcFoam+ is designed entirely within OpenFOAM's C++ object-oriented framework, it benefits from a number of key features: the code emphasises extensibility and flexibility so it is aimed first and foremost as a research tool for DSMC, allowing new models and test cases to be developed and tested rapidly. All DSMC cases are as straightforward as setting up any standard OpenFOAM case, as dsmcFoam+ relies upon the standard OpenFOAM dictionary based directory structure. This ensures that useful pre- and post-processing capabilities provided by OpenFOAM remain available even though the fully Lagrangian nature of a DSMC simulation is not typical of most OpenFOAM applications. We show that dsmcFoam+ compares well to other well-known DSMC codes and to analytical solutions in terms of benchmark results.

  16. Modeling Latent Interactions at Level 2 in Multilevel Structural Equation Models: An Evaluation of Mean-Centered and Residual-Centered Unconstrained Approaches

    ERIC Educational Resources Information Center

    Leite, Walter L.; Zuo, Youzhen

    2011-01-01

    Among the many methods currently available for estimating latent variable interactions, the unconstrained approach is attractive to applied researchers because of its relatively easy implementation with any structural equation modeling (SEM) software. Using a Monte Carlo simulation study, we extended and evaluated the unconstrained approach to…

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Mark A.; Bigelow, Matthew; Gilkey, Jeff C.

    The Super Strypi SWIL is a six degree-of-freedom (6DOF) simulation for the Super Strypi Launch Vehicle that includes a subset of the Super Strypi NGC software (guidance, ACS and sequencer). Aerodynamic and propulsive forces, mass properties, ACS (attitude control system) parameters, guidance parameters and Monte-Carlo parameters are defined in input files. Output parameters are saved to a Matlab mat file.

  18. MOLSIM: A modular molecular simulation software

    PubMed Central

    Reščič, Jurij

    2015-01-01

    The modular software MOLSIM for all‐atom molecular and coarse‐grained simulations is presented with focus on the underlying concepts used. The software possesses four unique features: (1) it is an integrated software for molecular dynamic, Monte Carlo, and Brownian dynamics simulations; (2) simulated objects are constructed in a hierarchical fashion representing atoms, rigid molecules and colloids, flexible chains, hierarchical polymers, and cross‐linked networks; (3) long‐range interactions involving charges, dipoles and/or anisotropic dipole polarizabilities are handled either with the standard Ewald sum, the smooth particle mesh Ewald sum, or the reaction‐field technique; (4) statistical uncertainties are provided for all calculated observables. In addition, MOLSIM supports various statistical ensembles, and several types of simulation cells and boundary conditions are available. Intermolecular interactions comprise tabulated pairwise potentials for speed and uniformity and many‐body interactions involve anisotropic polarizabilities. Intramolecular interactions include bond, angle, and crosslink potentials. A very large set of analyses of static and dynamic properties is provided. The capability of MOLSIM can be extended by user‐providing routines controlling, for example, start conditions, intermolecular potentials, and analyses. An extensive set of case studies in the field of soft matter is presented covering colloids, polymers, and crosslinked networks. © 2015 The Authors. Journal of Computational Chemistry Published by Wiley Periodicals, Inc. PMID:25994597

  19. Monte Carlo Shower Counter Studies

    NASA Technical Reports Server (NTRS)

    Snyder, H. David

    1991-01-01

    Activities and accomplishments related to the Monte Carlo shower counter studies are summarized. A tape of the VMS version of the GEANT software was obtained and installed on the central computer at Gallaudet University. Due to difficulties encountered in updating this VMS version, a decision was made to switch to the UNIX version of the package. This version was installed and used to generate the set of data files currently accessed by various analysis programs. The GEANT software was used to write files of data for positron and proton showers. Showers were simulated for a detector consisting of 50 alternating layers of lead and scintillator. Each file consisted of 1000 events at each of the following energies: 0.1, 0.5, 2.0, 10, 44, and 200 GeV. Data analysis activities related to clustering, chi square, and likelihood analyses are summarized. Source code for the GEANT user subprograms and data analysis programs are provided along with example data plots.

  20. Monte Carlo simulation of X-ray imaging and spectroscopy experiments using quadric geometry and variance reduction techniques

    NASA Astrophysics Data System (ADS)

    Golosio, Bruno; Schoonjans, Tom; Brunetti, Antonio; Oliva, Piernicola; Masala, Giovanni Luca

    2014-03-01

    The simulation of X-ray imaging experiments is often performed using deterministic codes, which can be relatively fast and easy to use. However, such codes are generally not suitable for the simulation of even slightly more complex experimental conditions, involving, for instance, first-order or higher-order scattering, X-ray fluorescence emissions, or more complex geometries, particularly for experiments that combine spatial resolution with spectral information. In such cases, simulations are often performed using codes based on the Monte Carlo method. In a simple Monte Carlo approach, the interaction position of an X-ray photon and the state of the photon after an interaction are obtained simply according to the theoretical probability distributions. This approach may be quite inefficient because the final channels of interest may include only a limited region of space or photons produced by a rare interaction, e.g., fluorescent emission from elements with very low concentrations. In the field of X-ray fluorescence spectroscopy, this problem has been solved by combining the Monte Carlo method with variance reduction techniques, which can reduce the computation time by several orders of magnitude. In this work, we present a C++ code for the general simulation of X-ray imaging and spectroscopy experiments, based on the application of the Monte Carlo method in combination with variance reduction techniques, with a description of sample geometry based on quadric surfaces. We describe the benefits of the object-oriented approach in terms of code maintenance, the flexibility of the program for the simulation of different experimental conditions and the possibility of easily adding new modules. Sample applications in the fields of X-ray imaging and X-ray spectroscopy are discussed.
    Catalogue identifier: AERO_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AERO_v1_0.html
    Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland
    Licensing provisions: GNU General Public License version 3
    No. of lines in distributed program, including test data, etc.: 83617
    No. of bytes in distributed program, including test data, etc.: 1038160
    Distribution format: tar.gz
    Programming language: C++
    Computer: Tested on several PCs and on Mac.
    Operating system: Linux, Mac OS X, Windows (native and cygwin).
    RAM: Dependent on the input data, but usually between 1 and 10 MB.
    Classification: 2.5, 21.1.
    External routines: XrayLib (https://github.com/tschoonj/xraylib/wiki)
    Nature of problem: Simulation of a wide range of X-ray imaging and spectroscopy experiments using different types of sources and detectors.
    Solution method: XRMC is a versatile program that is useful for the simulation of a wide range of X-ray imaging and spectroscopy experiments. It enables the simulation of monochromatic and polychromatic X-ray sources, with unpolarised or partially/completely polarised radiation. Single-element detectors as well as two-dimensional pixel detectors can be used in the simulations, with several acquisition options. In the current version of the program, the sample is modelled by combining convex three-dimensional objects demarcated by quadric surfaces, such as planes, ellipsoids and cylinders. The Monte Carlo approach makes XRMC able to accurately simulate X-ray photon transport and interactions with matter up to any order of interaction. The differential cross-sections and all other quantities related to the interaction processes (photoelectric absorption, fluorescence emission, elastic and inelastic scattering) are computed using the xraylib software library, which is currently the most complete and up-to-date software library for X-ray parameters. The use of variance reduction techniques makes XRMC able to reduce the simulation time by several orders of magnitude compared to other general-purpose Monte Carlo simulation programs.
    Running time: Dependent on the complexity of the simulation. For the examples distributed with the code, it ranges from less than 1 s to a few minutes.
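
    With quadric surfaces, ray tracing reduces to solving one quadratic equation per surface. A generic Python sketch for the simplest quadric, a sphere |o + t·d|² = R² (generic intersection logic, not XRMC's implementation):

        import numpy as np

        def ray_sphere(origin, direction, radius):
            """Smallest positive t with |origin + t*direction| = radius, else None."""
            o = np.asarray(origin, dtype=float)
            d = np.asarray(direction, dtype=float)
            a, b, c = d @ d, 2.0 * (o @ d), o @ o - radius**2
            disc = b * b - 4.0 * a * c                   # discriminant of a*t^2+b*t+c
            if disc < 0.0:
                return None                              # ray misses the quadric
            roots = ((-b - disc**0.5) / (2*a), (-b + disc**0.5) / (2*a))
            hits = [t for t in roots if t > 0.0]
            return min(hits) if hits else None

        print(ray_sphere([0, 0, -5], [0, 0, 1], 1.0))    # -> 4.0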

  1. The simulation library of the Belle II software system

    NASA Astrophysics Data System (ADS)

    Kim, D. Y.; Ritter, M.; Bilka, T.; Bobrov, A.; Casarosa, G.; Chilikin, K.; Ferber, T.; Godang, R.; Jaegle, I.; Kandra, J.; Kodys, P.; Kuhr, T.; Kvasnicka, P.; Nakayama, H.; Piilonen, L.; Pulvermacher, C.; Santelj, L.; Schwenker, B.; Sibidanov, A.; Soloviev, Y.; Starič, M.; Uglov, T.

    2017-10-01

    SuperKEKB, the next generation B factory, has been constructed in Japan as an upgrade of KEKB. This brand new e+ e- collider is expected to deliver a data set for the Belle II experiment that is 50 times larger than the previous Belle sample. Both the triggered physics event rate and the background event rate will increase by at least a factor of 10 relative to the previous experiment, creating a challenging data-taking environment for the Belle II detector. The software system of the Belle II experiment is designed to execute this ambitious plan. A full detector simulation library, which is a part of the Belle II software system, is created based on Geant4 and has been tested thoroughly. Recently the library has been upgraded with Geant4 version 10.1. The library behaves as expected and is actively used in producing Monte Carlo data sets for various studies. In this paper, we explain the structure of the simulation library and the various interfaces to other packages, including geometry and beam background simulation.

  2. Accelerated rescaling of single Monte Carlo simulation runs with the Graphics Processing Unit (GPU).

    PubMed

    Yang, Owen; Choi, Bernard

    2013-01-01

    To interpret fiber-based and camera-based measurements of remitted light from biological tissues, researchers typically use analytical models, such as the diffusion approximation to light transport theory, or stochastic models, such as Monte Carlo modeling. To achieve rapid (ideally real-time) measurement of tissue optical properties, especially in clinical situations, there is a critical need to accelerate Monte Carlo simulation runs. In this manuscript, we report on our approach using the Graphics Processing Unit (GPU) to accelerate rescaling of single Monte Carlo runs to rapidly calculate diffuse reflectance values for different sets of tissue optical properties. We selected MATLAB to enable non-specialists in C and CUDA-based programming to use the generated open-source code. We developed a software package with four abstraction layers. To calculate a set of diffuse reflectance values from a simulated tissue with homogeneous optical properties, our rescaling GPU-based approach achieves a reduction in computation time of several orders of magnitude as compared to other GPU-based approaches. Specifically, our GPU-based approach generated a diffuse reflectance value in 0.08 ms. The transfer time from CPU to GPU memory currently is a limiting factor with GPU-based calculations. However, for calculation of multiple diffuse reflectance values, our GPU-based approach still can lead to processing that is ~3400 times faster than other GPU-based approaches.
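
    The rescaling idea can be sketched compactly: keep the pathlengths of detected photons from a single baseline run, then reweight each photon for any new absorption coefficient with a Beer-Lambert factor instead of re-running transport. A CPU-side Python sketch with synthetic pathlengths (the paper's GPU layers and MATLAB wrapper are not reproduced):

        import numpy as np

        rng = np.random.default_rng(6)
        # stand-in for detected-photon pathlengths from one absorption-free baseline run
        pathlengths_cm = rng.exponential(2.0, size=500_000)

        def diffuse_reflectance(mu_a_per_cm):
            # reweight each photon by its survival probability exp(-mu_a * pathlength)
            return np.mean(np.exp(-mu_a_per_cm * pathlengths_cm))

        for mu_a in (0.01, 0.1, 1.0):
            print(f"mu_a = {mu_a:4.2f} /cm -> Rd ~ {diffuse_reflectance(mu_a):.4f}")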

  3. Performance of student software development teams: the influence of personality and identifying as team members

    NASA Astrophysics Data System (ADS)

    Monaghan, Conal; Bizumic, Boris; Reynolds, Katherine; Smithson, Michael; Johns-Boast, Lynette; van Rooy, Dirk

    2015-01-01

    One prominent approach in the exploration of the variations in project team performance has been to study two components of the aggregate personalities of the team members: conscientiousness and agreeableness. A second line of research, known as self-categorisation theory, argues that identifying as team members and the team's performance norms should substantially influence the team's performance. This paper explores the influence of both these perspectives in university software engineering project teams. Eighty students worked to complete a piece of software in small project teams during 2007 or 2008. To reduce limitations in statistical analysis, Monte Carlo simulation techniques were employed to extrapolate from the results of the original sample to a larger simulated sample (2043 cases, within 319 teams). The results emphasise the importance of taking into account personality (particularly conscientiousness), and both team identification and the team's norm of performance, in order to cultivate higher levels of performance in student software engineering project teams.

  4. The optical design and simulation of the collimated solar simulator

    NASA Astrophysics Data System (ADS)

    Zhang, Jun; Ma, Tao

    2018-01-01

    The solar simulator is a lighting device that simulates solar radiation. It is widely used for testing solar cells, simulating the satellite space environment in ground experiments, and testing and calibrating the precision of solar sensors. The solar simulator mainly consists of a short-arc xenon lamp, ellipsoidal reflectors, a group of optical integrators, a field stop, an aspheric folding mirror, and a collimating reflector. In this paper, the basic dimensions of the solar simulator's optical system are obtained by calculation. The system is then optically modeled with the LightTools software, and a simulation analysis of the solar simulator using the Monte Carlo ray-tracing technique is conducted. Finally, the simulation results are given quantitatively in diagrammatic form. The rationality of the design is verified on the basis of theory.

  5. Propulsive Reaction Control System Model

    NASA Technical Reports Server (NTRS)

    Brugarolas, Paul; Phan, Linh H.; Serricchio, Frederick; San Martin, Alejandro M.

    2011-01-01

    This software models a propulsive reaction control system (RCS) for guidance, navigation, and control simulation purposes. The model includes the drive electronics, the electromechanical valve dynamics, the combustion dynamics, and thrust. This innovation follows the Mars Science Laboratory entry reaction control system design, and has been created to meet the Mars Science Laboratory (MSL) entry, descent, and landing simulation needs. It has been built to be plug-and-play on multiple MSL testbeds [analysis, Monte Carlo, flight software development, hardware-in-the-loop, and ATLO (assembly, test and launch operations) testbeds]. This RCS model is a C language program. It contains two main functions: the RCS electronics model function that models the RCS FPGA (field-programmable-gate-array) processing and commanding of the RCS valve, and the RCS dynamic model function that models the valve and combustion dynamics. In addition, this software provides support functions to initialize the model states, set parameters, access model telemetry, and access calculated thruster forces.

  6. MOlecular MAterials Property Prediction Package (MOMAP) 1.0: a software package for predicting the luminescent properties and mobility of organic functional materials

    NASA Astrophysics Data System (ADS)

    Niu, Yingli; Li, Wenqiang; Peng, Qian; Geng, Hua; Yi, Yuanping; Wang, Linjun; Nan, Guangjun; Wang, Dong; Shuai, Zhigang

    2018-04-01

    MOlecular MAterials Property Prediction Package (MOMAP) is a software toolkit for molecular materials property prediction. It focuses on luminescent properties and charge mobility properties. This article contains a brief descriptive introduction of key features, theoretical models and algorithms of the software, together with examples that illustrate the performance. First, we present the theoretical models and algorithms for molecular luminescent properties calculation, which includes the excited-state radiative/non-radiative decay rate constant and the optical spectra. Then, a multi-scale simulation approach and its algorithm for molecular charge mobility are described. This approach is based on a hopping model combined with kinetic Monte Carlo and molecular dynamics simulations, and it is especially applicable to a large category of organic semiconductors whose inter-molecular electronic coupling is much smaller than the intra-molecular charge reorganisation energy.

  7. Statistical methods for launch vehicle guidance, navigation, and control (GN&C) system design and analysis

    NASA Astrophysics Data System (ADS)

    Rose, Michael Benjamin

    A novel trajectory and attitude control and navigation analysis tool for powered ascent is developed. The tool is capable of rapid trade-space analysis and is designed to ultimately reduce turnaround time for launch vehicle design, mission planning, and redesign work. It is streamlined to quickly determine trajectory and attitude control dispersions, propellant dispersions, orbit insertion dispersions, and navigation errors and their sensitivities to sensor errors, actuator execution uncertainties, and random disturbances. The tool is developed by applying both Monte Carlo and linear covariance analysis techniques to a closed-loop, launch vehicle guidance, navigation, and control (GN&C) system. The nonlinear dynamics and flight GN&C software models of a closed-loop, six-degree-of-freedom (6-DOF), Monte Carlo simulation are formulated and developed. The nominal reference trajectory (NRT) for the proposed lunar ascent trajectory is defined and generated. The Monte Carlo truth models and GN&C algorithms are linearized about the NRT, the linear covariance equations are formulated, and the linear covariance simulation is developed. The performance of the launch vehicle GN&C system is evaluated using both Monte Carlo and linear covariance techniques and their trajectory and attitude control dispersion, propellant dispersion, orbit insertion dispersion, and navigation error results are validated and compared. Statistical results from linear covariance analysis are generally within 10% of Monte Carlo results, and in most cases the differences are less than 5%. This is an excellent result given the many complex nonlinearities that are embedded in the ascent GN&C problem. Moreover, the real value of this tool lies in its speed, where the linear covariance simulation is 1036.62 times faster than the Monte Carlo simulation. Although the application and results presented are for a lunar, single-stage-to-orbit (SSTO), ascent vehicle, the tools, techniques, and mathematical formulations that are discussed are applicable to ascent on Earth or other planets as well as other rocket-powered systems such as sounding rockets and ballistic missiles.
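
    The two techniques being compared can be illustrated on a toy linear system: propagate a Monte Carlo dispersion cloud sample-by-sample, and propagate the covariance analytically as P <- A P Aᵀ + Q, then compare the resulting standard deviations. The matrices below are invented; the point is only the agreement (and relative cost) of the two methods.

        import numpy as np

        rng = np.random.default_rng(7)
        A = np.array([[1.0, 0.1],
                      [0.0, 1.0]])                # toy state-transition matrix
        Q = np.diag([1e-4, 1e-4])                 # process-noise covariance
        P = np.diag([1.0, 0.25])                  # initial state covariance

        X = rng.multivariate_normal([0, 0], P, size=100_000)   # Monte Carlo cloud
        for _ in range(50):
            X = X @ A.T + rng.multivariate_normal([0, 0], Q, size=len(X))
            P = A @ P @ A.T + Q                   # linear covariance propagation
        print("Monte Carlo std devs:", X.std(axis=0))
        print("LinCov     std devs:", np.sqrt(np.diag(P)))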

  8. Application for internal dosimetry using biokinetic distribution of photons based on nuclear medicine images.

    PubMed

    Leal Neto, Viriato; Vieira, José Wilson; Lima, Fernando Roberto de Andrade

    2014-01-01

    This article presents a way to obtain dose estimates for patients undergoing radiotherapy, based on the analysis of regions of interest in nuclear medicine images. A software called DoRadIo (Dosimetria das Radiações Ionizantes [Ionizing Radiation Dosimetry]) was developed to receive information about source organs and target organs, generating graphical and numerical results. The nuclear medicine images utilized in the present study were obtained from catalogs provided by medical physicists. The simulations were performed with computational exposure models consisting of voxel phantoms coupled with the Monte Carlo EGSnrc code. The software was developed with the Microsoft Visual Studio 2010 Service Pack and the project template Windows Presentation Foundation for the C# programming language. With the mentioned tools, the authors obtained the file for optimization of Monte Carlo simulations using the EGSnrc; organization and compaction of dosimetry results with all radioactive sources; selection of regions of interest; evaluation of grayscale intensity in regions of interest; the file of weighted sources; and, finally, all the charts and numerical results. The user interface may be adapted for use in clinical nuclear medicine as a computer-aided tool to estimate the administered activity.

  9. Noise tolerant illumination optimization applied to display devices

    NASA Astrophysics Data System (ADS)

    Cassarly, William J.; Irving, Bruce

    2005-02-01

    Display devices have historically been designed through an iterative process using numerous hardware prototypes. This process is effective but the number of iterations is limited by the time and cost to make the prototypes. In recent years, virtual prototyping using illumination software modeling tools has replaced many of the hardware prototypes. Typically, the designer specifies the design parameters, builds the software model, predicts the performance using a Monte Carlo simulation, and uses the performance results to repeat this process until an acceptable design is obtained. What is highly desired, and now possible, is to use illumination optimization to automate the design process. Illumination optimization provides the ability to explore a wider range of design options while also providing improved performance. Since Monte Carlo simulations are often used to calculate the system performance but those predictions have statistical uncertainty, the use of noise tolerant optimization algorithms is important. The use of noise tolerant illumination optimization is demonstrated by considering display device designs that extract light using 2D paint patterns as well as 3D textured surfaces. A hybrid optimization approach that combines a mesh feedback optimization with a classical optimizer is demonstrated. Displays with LED sources and cold cathode fluorescent lamps are considered.

  10. On the possibility of 'real-time' Monte Carlo calculations for the estimation of absorbed dose in radioimmunotherapy.

    PubMed

    Johnson, T K; Vessella, R L

    1989-07-01

    Dosimetry calculations of monoclonal antibodies (MABs) are made difficult because the focus of radioactivity is targeted for a nonstandard volume in a nonstandard geometry, precluding straightforward application of the MIRD formalism. The MABDOS software addresses this shortcoming by interactive placement of a spherical perturbation into the Standard Man geometry for each tumor focus. S tables are calculated by a Monte Carlo simulation of photon transport for each organ system (including tumor) that localizes activity. Performance benchmarks are reported that measure the time required to simulate 60,000 photons for each penetrating radiation in the spectra of 99mTc and 131I using the kidney as source organ. Results indicate that calculation times are probably prohibitive on current microcomputer platforms. Minicomputers and supercomputers offer a realistic platform for MABDOS patient dosimetry estimates.

  11. EVALUATING THE SENSITIVITY OF RADIONUCLIDE DETECTORS FOR CONDUCTING A MARITIME ON-BOARD SEARCH USING MONTE CARLO SIMULATION IMPLEMENTED IN AVERT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, S; Dave Dunn, D

    The sensitivity of two specific types of radionuclide detectors for conducting an on-board search in the maritime environment was evaluated using Monte Carlo simulation implemented in AVERT®. AVERT®, short for the Automated Vulnerability Evaluation for Risk of Terrorism, is personal-computer-based vulnerability assessment software developed by the ARES Corporation. The detectors, a RadPack and a Personal Radiation Detector (PRD), were chosen from the class of Human Portable Radiation Detection Systems (HPRDS). HPRDS serve multiple purposes. In the maritime environment, there is a need to detect, localize, characterize, and identify radiological/nuclear (RN) material or weapons. The RadPack is a commercially available broad-area search device used for gamma and neutron detection. The PRD is chiefly used as a personal radiation protection device. It is also used to detect contraband radionuclides and to localize radionuclide sources. Neither device has the capacity to characterize or identify radionuclides. The principal aim of this study was to investigate the sensitivity of both the RadPack and the PRD while being used under controlled conditions in a simulated maritime environment for detecting hidden RN contraband. The detection distance varies by the source strength and the shielding present. The characterization parameters of the source are not indicated in this report, so the results summarized are relative. The Monte Carlo simulation results indicate the probability of detection of the RN source at certain distances from the detector, which is a function of transverse speed and instrument sensitivity for the specified RN source.

  12. Dosimetric evaluation of the clinical implementation of the first commercial IMRT Monte Carlo treatment planning system at 6 MV

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heath, Emily; Seuntjens, Jan; Sheikh-Bagheri, Daryoush

    2004-10-01

    In this work we dosimetrically evaluated the clinical implementation of a commercial Monte Carlo treatment planning software (PEREGRINE, North American Scientific, Cranberry Township, PA) intended for quality assurance (QA) of intensity modulated radiation therapy treatment plans. Dose profiles calculated in homogeneous and heterogeneous phantoms using this system were compared to both measurements and simulations using the EGSnrc Monte Carlo code for the 6 MV beam of a Varian CL21EX linear accelerator. For simple jaw-defined fields, calculations agree within 2% of the dose at d_max with measurements in homogeneous phantoms, with the exception of the buildup region where the calculations overestimate the dose by up to 8%. In heterogeneous lung and bone phantoms the agreement is within 3% on average, up to 5% for a 1x1 cm² field. We tested two consecutive implementations of the MLC model. After matching the calculated and measured MLC leakage, simulations of static and dynamic MLC-defined fields using the most recent MLC model agreed to within 2% with measurements.

  13. Spectrum simulation in DTSA-II.

    PubMed

    Ritchie, Nicholas W M

    2009-10-01

    Spectrum simulation is a useful practical and pedagogical tool. Particularly with complex samples or trace constituents, a simulation can help to understand the limits of the technique and the instrument parameters for the optimal measurement. DTSA-II, software for electron probe microanalysis, provides both easy-to-use and flexible tools for simulating common and less common sample geometries and materials. Analytical models based on φ(ρz) curves provide quick simulations of simple samples. Monte Carlo models based on electron and X-ray transport provide more sophisticated models of arbitrarily complex samples. DTSA-II provides a broad range of simulation tools in a framework with many different interchangeable physical models. In addition, DTSA-II provides tools for visualizing, comparing, manipulating, and quantifying simulated and measured spectra.

  14. hybridMANTIS: a CPU-GPU Monte Carlo method for modeling indirect x-ray detectors with columnar scintillators

    NASA Astrophysics Data System (ADS)

    Sharma, Diksha; Badal, Andreu; Badano, Aldo

    2012-04-01

    The computational modeling of medical imaging systems often requires obtaining a large number of simulated images with low statistical uncertainty, which translates into prohibitive computing times. We describe a novel hybrid approach for Monte Carlo simulations that maximizes utilization of CPUs and GPUs in modern workstations. We apply the method to the modeling of indirect x-ray detectors using a new and improved version of the code MANTIS, an open source software tool used for the Monte Carlo simulations of indirect x-ray imagers. We first describe a GPU implementation of the physics and geometry models in fastDETECT2 (the optical transport model) and a serial CPU version of the same code. We discuss its new features like on-the-fly column geometry and columnar crosstalk in relation to the MANTIS code, and point out areas where our model provides more flexibility for the modeling of realistic columnar structures in large area detectors. Second, we modify PENELOPE (the open source software package that handles the x-ray and electron transport in MANTIS) to allow direct output of location and energy deposited during x-ray and electron interactions occurring within the scintillator. This information is then handled by optical transport routines in fastDETECT2. A load balancer dynamically allocates optical transport showers to the GPU and CPU computing cores. Our hybridMANTIS approach achieves a significant speed-up factor of 627 when compared to MANTIS and of 35 when compared to the same code running only in a CPU instead of a GPU. Using hybridMANTIS, we successfully hide hours of optical transport time by running it in parallel with the x-ray and electron transport, thus shifting the computational bottleneck from optical to x-ray transport. The new code requires much less memory than MANTIS and, as a result, allows us to efficiently simulate large area detectors.

  15. Simulation of Watts Bar Unit 1 Initial Startup Tests with Continuous Energy Monte Carlo Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Godfrey, Andrew T; Gehin, Jess C; Bekar, Kursat B

    2014-01-01

    The Consortium for Advanced Simulation of Light Water Reactors is developing a collection of methods and software products known as VERA, the Virtual Environment for Reactor Applications. One component of the testing and validation plan for VERA is comparison of neutronics results to a set of continuous energy Monte Carlo solutions for a range of pressurized water reactor geometries using the SCALE component KENO-VI developed by Oak Ridge National Laboratory. Recent improvements in data, methods, and parallelism have enabled KENO, previously utilized predominately as a criticality safety code, to demonstrate excellent capability and performance for reactor physics applications. The highly detailed and rigorous KENO solutions provide a reliable numeric reference for VERA neutronics and also demonstrate the most accurate predictions achievable by modeling and simulation tools for comparison to operating plant data. This paper demonstrates the performance of KENO-VI for the Watts Bar Unit 1 Cycle 1 zero power physics tests, including reactor criticality, control rod worths, and isothermal temperature coefficients.

  16. Optimizing Muscle Parameters in Musculoskeletal Modeling Using Monte Carlo Simulations

    NASA Technical Reports Server (NTRS)

    Hanson, Andrea; Reed, Erik; Cavanagh, Peter

    2011-01-01

    Astronauts assigned to long-duration missions experience bone and muscle atrophy in the lower limbs. The use of musculoskeletal simulation software has become a useful tool for modeling joint and muscle forces during human activity in reduced gravity as access to direct experimentation is limited. Knowledge of muscle and joint loads can better inform the design of exercise protocols and exercise countermeasure equipment. In this study, the LifeModeler™ (San Clemente, CA) biomechanics simulation software was used to model a squat exercise. The initial model using default parameters yielded physiologically reasonable hip-joint forces. However, no activation was predicted in some large muscles such as rectus femoris, which have been shown to be active in 1-g performance of the activity. Parametric testing was conducted using Monte Carlo methods and combinatorial reduction to find a muscle parameter set that more closely matched physiologically observed activation patterns during the squat exercise. Peak hip joint force using the default parameters was 2.96 times body weight (BW) and increased to 3.21 BW in an optimized, feature-selected test case. The rectus femoris was predicted to peak at 60.1% activation following muscle recruitment optimization, compared to 19.2% activation with default parameters. These results indicate the critical role that muscle parameters play in joint force estimation and the need for exploration of the solution space to achieve physiologically realistic muscle activation.
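
    The parameter exploration described above can be sketched as plain Monte Carlo sampling over a bounded parameter box, scoring each candidate by how closely the predicted activations match the observed pattern. Everything below (parameter box, target activations, and the stand-in for the musculoskeletal model) is invented for illustration:

        import numpy as np

        rng = np.random.default_rng(8)
        target = {"rectus_femoris": 0.60, "gluteus_max": 0.45}   # assumed activations

        def simulate(params):
            # stand-in for the biomechanics model: parameters -> muscle activations
            return {"rectus_femoris": 0.8 * params[0] + 0.1 * params[1],
                    "gluteus_max": 0.7 * params[1]}

        best, best_err = None, float("inf")
        for _ in range(10_000):
            p = rng.uniform([0.0, 0.0], [1.0, 1.0])              # sample parameter box
            act = simulate(p)
            err = sum((act[m] - target[m]) ** 2 for m in target)
            if err < best_err:
                best, best_err = p, err
        print("best parameters:", best, "squared error:", best_err)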

  17. Error and Uncertainty Analysis for Ecological Modeling and Simulation

    DTIC Science & Technology

    2001-12-01

    management (LRAM) accounting for environmental, training, and economic factors. In the ELVS methodology, soil erosion status is used as a quantitative...Monte-Carlo approach. The optimization is realized through economic functions or on decision constraints, such as, unit sample cost, number of samples... nitrate flux to the Gulf of Mexico. Nature (Brief Communication) 414: 166-167. (Uncertainty analysis done with SERDP software) Gertner, G., G

  18. The High Cost of Complexity in Experimental Design and Data Analysis: Type I and Type II Error Rates in Multiway ANOVA.

    ERIC Educational Resources Information Center

    Smith, Rachel A.; Levine, Timothy R.; Lachlan, Kenneth A.; Fediuk, Thomas A.

    2002-01-01

    Notes that the availability of statistical software packages has led to a sharp increase in use of complex research designs and complex statistical analyses in communication research. Reports a series of Monte Carlo simulations which demonstrate that this complexity may come at a heavier cost than many communication researchers realize. Warns…

  19. Implementing the Second-Order Fermi Process in a Kinetic Monte-Carlo Simulation

    NASA Technical Reports Server (NTRS)

    Summerlin, Errol J.

    2010-01-01

    Radio JOVE is an education and outreach project intended to give students and other interested individuals hands-on experience in learning radio astronomy. They can do this through building a radio telescope from a relatively inexpensive kit that includes the parts for a receiver and an antenna as well as software for a computer chart recorder emulator (Radio Skypipe) and other reference materials

  20. TestDose: A nuclear medicine software based on Monte Carlo modeling for generating gamma camera acquisitions and dosimetry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garcia, Marie-Paule, E-mail: marie-paule.garcia@univ-brest.fr; Villoing, Daphnée; McKay, Erin

    Purpose: The TestDose platform was developed to generate scintigraphic imaging protocols and associated dosimetry by Monte Carlo modeling. TestDose is part of a broader project (www.dositest.com) whose aim is to identify the biases induced by different clinical dosimetry protocols. Methods: The TestDose software allows handling the whole pipeline from virtual patient generation to resulting planar and SPECT images and dosimetry calculations. The originality of their approach relies on the implementation of functional segmentation for the anthropomorphic model representing a virtual patient. Two anthropomorphic models are currently available: 4D XCAT and ICRP 110. A pharmacokinetic model describes the biodistribution of a given radiopharmaceutical in each defined compartment at various time-points. The Monte Carlo simulation toolkit GATE offers the possibility to accurately simulate scintigraphic images and absorbed doses in volumes of interest. The TestDose platform relies on GATE to reproduce precisely any imaging protocol and to provide reference dosimetry. For image generation, TestDose stores user’s imaging requirements and generates automatically command files used as input for GATE. Each compartment is simulated only once and the resulting output is weighted using pharmacokinetic data. Resulting compartment projections are aggregated to obtain the final image. For dosimetry computation, emission data are stored in the platform database and relevant GATE input files are generated for the virtual patient model and associated pharmacokinetics. Results: Two samples of software runs are given to demonstrate the potential of TestDose. A clinical imaging protocol for the Octreoscan™ therapeutical treatment was implemented using the 4D XCAT model. Whole-body “step and shoot” acquisitions at different times postinjection and one SPECT acquisition were generated within reasonable computation times. Based on the same Octreoscan™ kinetics, a dosimetry computation performed on the ICRP 110 model is also presented. Conclusions: The proposed platform offers a generic framework to implement any scintigraphic imaging protocols and voxel/organ-based dosimetry computation. Thanks to the modular nature of TestDose, other imaging modalities could be supported in the future such as positron emission tomography.

  1. TestDose: A nuclear medicine software based on Monte Carlo modeling for generating gamma camera acquisitions and dosimetry.

    PubMed

    Garcia, Marie-Paule; Villoing, Daphnée; McKay, Erin; Ferrer, Ludovic; Cremonesi, Marta; Botta, Francesca; Ferrari, Mahila; Bardiès, Manuel

    2015-12-01

    The TestDose platform was developed to generate scintigraphic imaging protocols and associated dosimetry by Monte Carlo modeling. TestDose is part of a broader project (www.dositest.com) whose aim is to identify the biases induced by different clinical dosimetry protocols. The TestDose software allows handling the whole pipeline from virtual patient generation to resulting planar and SPECT images and dosimetry calculations. The originality of their approach relies on the implementation of functional segmentation for the anthropomorphic model representing a virtual patient. Two anthropomorphic models are currently available: 4D XCAT and ICRP 110. A pharmacokinetic model describes the biodistribution of a given radiopharmaceutical in each defined compartment at various time-points. The Monte Carlo simulation toolkit gate offers the possibility to accurately simulate scintigraphic images and absorbed doses in volumes of interest. The TestDose platform relies on gate to reproduce precisely any imaging protocol and to provide reference dosimetry. For image generation, TestDose stores user's imaging requirements and generates automatically command files used as input for gate. Each compartment is simulated only once and the resulting output is weighted using pharmacokinetic data. Resulting compartment projections are aggregated to obtain the final image. For dosimetry computation, emission data are stored in the platform database and relevant gate input files are generated for the virtual patient model and associated pharmacokinetics. Two samples of software runs are given to demonstrate the potential of TestDose. A clinical imaging protocol for the Octreoscan™ therapeutical treatment was implemented using the 4D XCAT model. Whole-body "step and shoot" acquisitions at different times postinjection and one SPECT acquisition were generated within reasonable computation times. Based on the same Octreoscan™ kinetics, a dosimetry computation performed on the ICRP 110 model is also presented. The proposed platform offers a generic framework to implement any scintigraphic imaging protocols and voxel/organ-based dosimetry computation. Thanks to the modular nature of TestDose, other imaging modalities could be supported in the future such as positron emission tomography.
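
    The "simulate each compartment once, weight by kinetics" strategy amounts to a weighted sum of unit-activity compartment projections, with weights given by each compartment's activity at the acquisition time. A schematic Python sketch (placeholder images and mono-exponential time-activity curves, not the TestDose pipeline):

        import numpy as np

        rng = np.random.default_rng(9)
        # unit-activity projection images, one per source compartment (simulated once)
        projections = {"liver": rng.poisson(5.0, (64, 64)).astype(float),
                       "kidneys": rng.poisson(3.0, (64, 64)).astype(float)}

        def activity_mbq(compartment, t_hours):
            # placeholder mono-exponential time-activity curves per compartment
            a0, t_half = {"liver": (100.0, 8.0), "kidneys": (40.0, 4.0)}[compartment]
            return a0 * 0.5 ** (t_hours / t_half)

        def image_at(t_hours):
            # aggregate weighted compartment projections into the final image
            return sum(activity_mbq(c, t_hours) * img for c, img in projections.items())

        print("mean counts at t = 2 h:", image_at(2.0).mean())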

  2. Encapsulating model complexity and landscape-scale analyses of state-and-transition simulation models: an application of ecoinformatics and juniper encroachment in sagebrush steppe ecosystems

    USGS Publications Warehouse

    O'Donnell, Michael

    2015-01-01

    State-and-transition simulation modeling relies on knowledge of vegetation composition and structure (states) that describe community conditions, mechanistic feedbacks such as fire that can affect vegetation establishment, and ecological processes that drive community conditions as well as the transitions between these states. However, as the need for modeling larger and more complex landscapes increase, a more advanced awareness of computing resources becomes essential. The objectives of this study include identifying challenges of executing state-and-transition simulation models, identifying common bottlenecks of computing resources, developing a workflow and software that enable parallel processing of Monte Carlo simulations, and identifying the advantages and disadvantages of different computing resources. To address these objectives, this study used the ApexRMS® SyncroSim software and embarrassingly parallel tasks of Monte Carlo simulations on a single multicore computer and on distributed computing systems. The results demonstrated that state-and-transition simulation models scale best in distributed computing environments, such as high-throughput and high-performance computing, because these environments disseminate the workloads across many compute nodes, thereby supporting analysis of larger landscapes, higher spatial resolution vegetation products, and more complex models. Using a case study and five different computing environments, the top result (high-throughput computing versus serial computations) indicated an approximate 96.6% decrease of computing time. With a single, multicore compute node (bottom result), the computing time indicated an 81.8% decrease relative to using serial computations. These results provide insight into the tradeoffs of using different computing resources when research necessitates advanced integration of ecoinformatics incorporating large and complicated data inputs and models.
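
    Monte Carlo replicates of this embarrassingly parallel kind map directly onto a process pool. A minimal Python sketch, with a toy pi estimator standing in for one state-and-transition replicate (not the SyncroSim workflow itself):

        import random
        from multiprocessing import Pool

        def replicate(seed, n=1_000_000):
            """One independent Monte Carlo replicate (toy: estimate pi)."""
            rng = random.Random(seed)
            hits = sum(rng.random() ** 2 + rng.random() ** 2 < 1.0 for _ in range(n))
            return 4.0 * hits / n

        if __name__ == "__main__":
            with Pool() as pool:                      # one worker process per core
                estimates = pool.map(replicate, range(16))
            print("mean estimate:", sum(estimates) / len(estimates))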

  3. The discounting model selector: Statistical software for delay discounting applications.

    PubMed

    Gilroy, Shawn P; Franck, Christopher T; Hantula, Donald A

    2017-05-01

    Original, open-source computer software was developed and validated against established delay discounting methods in the literature. The software executes approximate Bayesian model selection methods on user-supplied temporal discounting data and computes the effective delay 50 (ED50) from the best-performing model. The software was custom-designed to enable behavior analysts to conveniently apply recent statistical methods to temporal discounting data with the aid of a graphical user interface (GUI). Independent validation of the approximate Bayesian model selection methods indicated that the program provided results identical to those of the original source paper and its methods. Monte Carlo simulation (n = 50,000) confirmed that the true model was selected most often in each setting. Simulation code and data for this study were posted to an online repository for use by other researchers. The model selection approach was applied to three existing delay discounting data sets from the literature in addition to the data from the source paper. Comparisons of model-selected ED50 were consistent with traditional indices of discounting. Conceptual issues related to the development and use of computer software by behavior analysts and the opportunities afforded by free and open-source software are discussed, and a review of possible expansions of this software is provided. © 2017 Society for the Experimental Analysis of Behavior.
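
    The selection step can be sketched as follows, using BIC-based approximate Bayes factors over two illustrative candidate models: hyperbolic, V = 1/(1 + kD) with ED50 = 1/k, and exponential, V = exp(-kD) with ED50 = ln(2)/k. The actual program compares a larger model set, and the data below are invented.

```python
import numpy as np
from scipy.optimize import curve_fit

delays = np.array([1, 7, 30, 90, 180, 365], dtype=float)   # days
values = np.array([0.95, 0.80, 0.55, 0.35, 0.25, 0.15])    # relative value

def hyperbolic(d, k):   # Mazur: V = 1 / (1 + k d); ED50 = 1/k
    return 1.0 / (1.0 + k * d)

def exponential(d, k):  # V = exp(-k d); ED50 = ln(2)/k
    return np.exp(-k * d)

def bic(model, d, v):
    """Fit one-parameter model, return (BIC, fitted k)."""
    (k,), _ = curve_fit(model, d, v, p0=[0.01], bounds=(1e-8, 10))
    rss = np.sum((v - model(d, k)) ** 2)
    n = len(v)
    return n * np.log(rss / n) + 1 * np.log(n), k

candidates = {"hyperbolic": hyperbolic, "exponential": exponential}
scores = {name: bic(m, delays, values) for name, m in candidates.items()}

# Approximate posterior model probabilities from BIC differences.
bics = np.array([s[0] for s in scores.values()])
weights = np.exp(-0.5 * (bics - bics.min()))
probs = weights / weights.sum()

best = min(scores, key=lambda name: scores[name][0])
k_best = scores[best][1]
ed50 = 1.0 / k_best if best == "hyperbolic" else np.log(2.0) / k_best
print(best, dict(zip(scores, probs)), f"ED50 = {ed50:.1f} days")
```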

  4. 3D Space Radiation Transport in a Shielded ICRU Tissue Sphere

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Slaba, Tony C.; Badavi, Francis F.; Reddell, Brandon D.; Bahadori, Amir A.

    2014-01-01

    A computationally efficient 3DHZETRN code capable of simulating High Charge (Z) and Energy (HZE) and light ions (including neutrons) under space-like boundary conditions with enhanced neutron and light ion propagation was recently developed for a simple homogeneous shield object. Monte Carlo benchmarks were used to verify the methodology in slab and spherical geometry, and the 3D corrections were shown to provide significant improvement over the straight-ahead approximation in some cases. In the present report, the new algorithms with well-defined convergence criteria are extended to inhomogeneous media within a shielded tissue slab and a shielded tissue sphere and tested against Monte Carlo simulation to verify the solution methods. The 3D corrections are again found to more accurately describe the neutron and light ion fluence spectra as compared to the straight-ahead approximation. These computationally efficient methods provide a basis for software capable of space shield analysis and optimization.

  5. Monte Carlo modelling of Schottky diode for rectenna simulation

    NASA Astrophysics Data System (ADS)

    Bernuchon, E.; Aniel, F.; Zerounian, N.; Grimault-Jacquin, A. S.

    2017-09-01

    Before designing a detector circuit, the extraction of the Schottky diode's electrical parameters is a critical step. This article is based on a Monte Carlo (MC) solver of the Boltzmann Transport Equation (BTE) that includes different transport mechanisms at the metal-semiconductor contact, such as the image force effect and tunneling. The weights of the tunneling and thermionic currents are quantified for different degrees of tunneling modelling. The I-V characteristic highlights the dependence of the ideality factor and the saturation current on bias. Harmonic Balance (HB) simulation of a rectifier circuit within the Advanced Design System (ADS) software shows that considering a non-linear ideality factor and saturation current in the electrical model of the Schottky diode does not seem essential. Indeed, bias-independent values extracted in the forward regime of the I-V curve are sufficient. However, the non-linear series resistance extracted from a small-signal analysis (SSA) strongly influences the conversion efficiency at low input powers.

  6. LCG MCDB—a knowledgebase of Monte-Carlo simulated events

    NASA Astrophysics Data System (ADS)

    Belov, S.; Dudko, L.; Galkin, E.; Gusev, A.; Pokorski, W.; Sherstnev, A.

    2008-02-01

    In this paper we report on the LCG Monte-Carlo Data Base (MCDB) and the software which has been developed to operate it. The main purpose of the LCG MCDB project is to provide a storage and documentation system for sophisticated event samples simulated for the LHC Collaborations by experts. In many cases, modern Monte-Carlo simulation of physical processes requires expert knowledge of Monte-Carlo generators or a significant amount of CPU time to produce the events. MCDB is a knowledgebase mainly dedicated to accumulating simulated events of this type. The main motivation behind LCG MCDB is to make sophisticated MC event samples available to various physics groups. All the data in MCDB are accessible in several convenient ways. LCG MCDB is being developed within the CERN LCG Application Area Simulation project.
    Program summary
    Program title: LCG Monte-Carlo Data Base
    Catalogue identifier: ADZX_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADZX_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: GNU General Public Licence
    No. of lines in distributed program, including test data, etc.: 30 129
    No. of bytes in distributed program, including test data, etc.: 216 943
    Distribution format: tar.gz
    Programming language: Perl
    Computer: CPU: Intel Pentium 4, RAM: 1 GB, HDD: 100 GB
    Operating system: Scientific Linux CERN 3/4
    RAM: 1 073 741 824 bytes (1 GB)
    Classification: 9
    External routines: perl >= 5.8.5; Perl modules DBD-mysql >= 2.9004, File::Basename, GD::SecurityImage, GD::SecurityImage::AC, Linux::Statistics, XML::LibXML > 1.6, XML::SAX, XML::NamespaceSupport; Apache HTTP Server >= 2.0.59; mod_auth_external >= 2.2.9; edg-utils-system RPM package; gd >= 2.0.28; rpm package CASTOR-client >= 2.1.2-4; arc-server (optional)
    Nature of problem: Often, different groups of experimentalists prepare similar samples of particle collision events, or turn to the same group of authors of Monte-Carlo (MC) generators to prepare the events. For example, the same MC samples of Standard Model (SM) processes can be employed either in SM analyses (as a signal) or in searches for new phenomena in Beyond the Standard Model analyses (as a background). If the samples are made publicly available and equipped with comprehensive documentation, cross checks of both the samples themselves and the physical models applied can be sped up. Some event samples require substantial computing resources to prepare, so central storage of the samples prevents the waste of researcher time and computing resources that would otherwise be spent preparing the same events many times.
    Solution method: Creation of a special knowledgebase (MCDB) designed to keep event samples for the LHC experimental and phenomenological community. The knowledgebase is realized as a separate web server (http://mcdb.cern.ch). All event samples are kept on tapes at CERN. Documentation describing the events is the main content of MCDB. Users can browse the knowledgebase, read and comment on articles (documentation), and download event samples. Authors can upload new event samples, create new articles, and edit their own articles.
    Restrictions: The software is tailored to the problems described in the article; there are no additional restrictions.
    Unusual features: The software provides a framework to store and document large files with a flexible authentication and authorization system. Different external storage systems with large capacity can be used to keep the files. The web content management system provides all of the necessary interfaces for the authors of the files, end users and administrators.
    Running time: Real-time operation.
    References: [1] The main LCG MCDB server, http://mcdb.cern.ch/. [2] P. Bartalini, L. Dudko, A. Kryukov, I.V. Selyuzhenkov, A. Sherstnev, A. Vologdin, LCG Monte-Carlo data base, hep-ph/0404241. [3] J.P. Baud, B. Couturier, C. Curran, J.D. Durand, E. Knezo, S. Occhetti, O. Barring, CASTOR: status and evolution, cs.oh/0305047.

  7. Monte Carlo calculations for reporting patient organ doses from interventional radiology

    NASA Astrophysics Data System (ADS)

    Huo, Wanli; Feng, Mang; Pi, Yifei; Chen, Zhi; Gao, Yiming; Xu, X. George

    2017-09-01

    This paper describes a project to generate organ dose data for the purpose of extending the VirtualDose software from CT imaging to interventional radiology (IR) applications. A library of 23 mesh-based anthropometric patient phantoms was used in Monte Carlo simulations to build the database. Organ doses and effective doses were obtained for IR procedures with specific beam projections, fields of view (FOV) and beam qualities for all parts of the body. Comparing organ doses generated by VirtualDose-IR for different beam qualities, beam projections, patient ages and patient body mass indexes (BMIs), significant discrepancies were observed. For the relatively long exposures typical of IR, doses depend on beam quality, beam direction and patient size. Therefore, VirtualDose-IR, which is based on the latest anatomically realistic patient phantoms, can generate accurate doses for IR procedures. This software is suitable for clinical IR dose management as an effective tool to estimate patient doses and optimize IR treatment plans.

  8. Exposure to 137Cs deposited in soil – A Monte Carlo study

    NASA Astrophysics Data System (ADS)

    da Silveira, Lucas M.; Pereira, Marco A. M.; Neves, Lucio P.; Perini, Ana P.; Belinato, Walmir; Caldas, Linda V. E.; Santos, William S.

    2018-03-01

    In the event of environmental contamination with radioactive materials, one of the most dangerous materials is 137Cs. In order to evaluate the radiation doses involved in an environmental contamination of soil with 137Cs, we carried out a computational dosimetric study. We determined the radiation conversion coefficients (CC) for effective (E) and equivalent (HT) doses, using male and female anthropomorphic phantoms. These phantoms were coupled with the MCNPX (2.7.0) Monte Carlo simulation software for three different types of soil. The highest CC[HT] values were for the gonads and skin (male) and bone marrow and skin (female). We found no difference between the different types of soil.

  9. Evaluated teletherapy source library

    DOEpatents

    Cox, Lawrence J.; Schach Von Wittenau, Alexis E.

    2000-01-01

    The Evaluated Teletherapy Source Library (ETSL) is a system of hardware and software that provides for maintenance of a library of useful phase space descriptions (PSDs) of teletherapy sources used in radiation therapy for cancer treatment. The PSDs are designed to be used by PEREGRINE, the all-particle Monte Carlo dose calculation system. ETSL also stores other relevant information such as monitor unit factors (MUFs) for use with the PSDs, results of PEREGRINE calculations using the PSDs, clinical calibration measurements, and geometry descriptions sufficient for calculational purposes. Not all of this information is directly needed by PEREGRINE. ETSL is also capable of acting as a repository for the Monte Carlo simulation history files from which the generic PSDs are derived.

  10. Monte Carlo based statistical power analysis for mediation models: methods and software.

    PubMed

    Zhang, Zhiyong

    2014-12-01

    The existing literature on statistical power analysis for mediation models often assumes data normality and is based on a less powerful Sobel test instead of the more powerful bootstrap test. This study proposes to estimate statistical power to detect mediation effects on the basis of the bootstrap method through Monte Carlo simulation. Nonnormal data with excessive skewness and kurtosis are allowed in the proposed method. A free R package called bmem is developed to conduct the power analysis discussed in this study. Four examples, including a simple mediation model, a multiple-mediator model with a latent mediator, a multiple-group mediation model, and a longitudinal mediation model, are provided to illustrate the proposed method.
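
    The general recipe (simulate data under the mediation model, bootstrap the indirect effect, count how often the confidence interval excludes zero) can be sketched as below. The bmem package itself works on SEM specifications in R, so the paths, sample size and replication counts here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
a, b, cp = 0.3, 0.3, 0.1        # population paths: X->M, M->Y, direct X->Y
n, n_rep, n_boot = 100, 200, 500

def simulate_dataset():
    """Generate one sample from the simple mediation model X -> M -> Y."""
    x = rng.normal(size=n)
    m = a * x + rng.normal(size=n)
    y = b * m + cp * x + rng.normal(size=n)
    return x, m, y

def indirect_effect(x, m, y):
    """Estimate a*b: slope of M on X times slope of Y on M given X."""
    a_hat = np.polyfit(x, m, 1)[0]
    design = np.column_stack([m, x, np.ones(n)])
    b_hat = np.linalg.lstsq(design, y, rcond=None)[0][0]
    return a_hat * b_hat

significant = 0
for _ in range(n_rep):
    x, m, y = simulate_dataset()
    boots = []
    for _ in range(n_boot):                       # percentile bootstrap
        idx = rng.integers(0, n, n)
        boots.append(indirect_effect(x[idx], m[idx], y[idx]))
    lo, hi = np.percentile(boots, [2.5, 97.5])
    significant += (lo > 0) or (hi < 0)           # CI excludes zero

print(f"estimated power ~ {significant / n_rep:.2f}")
```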

  11. Use of the FLUKA Monte Carlo code for 3D patient-specific dosimetry on PET-CT and SPECT-CT images*

    PubMed Central

    Botta, F; Mairani, A; Hobbs, R F; Vergara Gil, A; Pacilio, M; Parodi, K; Cremonesi, M; Coca Pérez, M A; Di Dia, A; Ferrari, M; Guerriero, F; Battistoni, G; Pedroli, G; Paganelli, G; Torres Aroche, L A; Sgouros, G

    2014-01-01

    Patient-specific absorbed dose calculation for nuclear medicine therapy is a topic of increasing interest. 3D dosimetry at the voxel level is one of the major improvements for the development of more accurate calculation techniques, as compared to standard dosimetry at the organ level. This study aims to use the FLUKA Monte Carlo code to perform patient-specific 3D dosimetry through direct Monte Carlo simulation on PET-CT and SPECT-CT images. To this aim, dedicated routines were developed in the FLUKA environment. Two sets of simulations were performed on model and phantom images. First, the correct handling of PET and SPECT images was tested under the assumption of a homogeneous water medium by comparing FLUKA results with those obtained with the voxel kernel convolution method and with other Monte Carlo-based tools developed for the same purpose (the EGS-based 3D-RD software and the MCNP5-based MCID). Afterwards, the correct integration of the PET/SPECT and CT information was tested by performing direct simulations on PET/CT images for both homogeneous (water) and non-homogeneous (water with air, lung and bone inserts) phantoms. Comparison was performed with the other Monte Carlo tools performing direct simulation as well. The absorbed dose maps were compared at the voxel level. In the case of homogeneous water, simulating 10^8 primary particles yielded a 2% average difference with respect to the kernel convolution method; this difference was lower than the statistical uncertainty affecting the FLUKA results. The agreement with the other tools was within 3-4%, partially ascribable to the differences among the simulation algorithms. Including the CT-based density map, the average difference was always within 4% irrespective of the medium (water, air, bone), except for a maximum 6% value when comparing FLUKA and 3D-RD in air. The results confirmed that the routines were properly developed, opening the way for the use of FLUKA for patient-specific, image-based dosimetry in nuclear medicine. PMID:24200697

  12. TU-D-209-02: A Backscatter Point Spread Function for Entrance Skin Dose Determination

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vijayan, S; Xiong, Z; Shankar, A

    Purpose: To determine the distribution of backscattered radiation to the skin resulting from a non-uniform distribution of primary radiation, through convolution with a backscatter point spread function (PSF). Methods: A backscatter PSF is determined using Monte Carlo simulation of a 1 mm primary beam incident on a 30 × 30 cm, 20 cm thick PMMA phantom using EGSnrc software. A primary profile is similarly obtained without the phantom, and the difference from the total provides the backscatter profile. This scatter PSF characterizes the backscatter spread for a "point" primary interaction and can be convolved with the entrance primary dose distribution to obtain the total entrance skin dose. The backscatter PSF was integrated into the skin dose tracking system (DTS), a graphical utility for displaying the color-coded skin dose distribution on a 3D graphic of the patient during interventional fluoroscopic procedures. The backscatter convolution method was validated for the non-uniform beam resulting from the use of an ROI attenuator. The ROI attenuator is a copper sheet (0.7 mm thick, about 20% primary transmission) containing a circular aperture; this attenuator is placed in the beam to reduce dose in the periphery while maintaining full dose in the region of interest. The DTS-calculated primary plus backscatter distribution is compared to that measured with GafChromic film and that calculated using EGSnrc Monte Carlo software. Results: The PSF convolution method used in the DTS software was able to account for the spread of backscatter from the ROI region to the region under the attenuator. The skin dose distribution determined using DTS with the ROI attenuator was in good agreement with the distributions measured with GafChromic film and determined by Monte Carlo simulation. Conclusion: The PSF convolution technique provides an accurate alternative for entrance skin dose determination with non-uniform primary x-ray beams. Partial support from NIH Grant R01-EB002873 and Toshiba Medical Systems Corp.
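
    The convolution step itself is compact. A sketch under assumed values, with a Gaussian stand-in for the Monte Carlo-derived PSF and a circular-aperture primary map mimicking the ROI attenuator:

```python
import numpy as np
from scipy.signal import fftconvolve

# total entrance dose = primary + (primary convolved with backscatter PSF)
grid = np.arange(-64, 64)                    # 1 mm pixels
xx, yy = np.meshgrid(grid, grid)

# Non-uniform primary: full dose in a circular ROI, ~20% elsewhere.
primary = np.where(xx**2 + yy**2 < 20**2, 1.0, 0.2)

# Toy Gaussian PSF, normalized so total backscatter ~ 30% of primary.
sigma = 10.0                                 # mm, assumed spread
psf = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
psf *= 0.30 / psf.sum()

total = primary + fftconvolve(primary, psf, mode="same")

# Backscatter from the ROI raises dose just outside it (x = 30 mm, y = 0).
print(f"dose just outside the ROI: {total[64, 94]:.2f} (primary alone: 0.20)")
```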

  13. Determination of tailored filter sets to create rayfiles including spatial and angular resolved spectral information.

    PubMed

    Rotscholl, Ingo; Trampert, Klaus; Krüger, Udo; Perner, Martin; Schmidt, Franz; Neumann, Cornelius

    2015-11-16

    To simulate and optimize optical designs with respect to perceived color and homogeneity in commercial ray tracing software, realistic light source models are needed. Spectral rayfiles provide angularly and spatially varying spectral information. We propose a spectral reconstruction method that requires a minimum of time-consuming goniophotometric near-field measurements with optical filters for the purpose of creating spectral rayfiles. Our discussion focuses on the selection of the ideal optical filter combination for any arbitrary spectrum out of a given filter set, considering measurement uncertainties with Monte Carlo simulations. We minimize the simulation time by a preselection of all filter combinations, which is based on factorial design.
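
    The selection idea can be sketched as a brute-force search: enumerate candidate filter combinations, propagate simulated measurement noise through a linear spectral reconstruction via Monte Carlo, and keep the combination with the smallest mean error. Filter shapes, basis functions and the noise level below are illustrative assumptions.

```python
import itertools
import numpy as np

rng = np.random.default_rng(2)
wl = np.linspace(400, 700, 61)                     # wavelength grid, nm

def gaussian(center, width):
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

filters = [gaussian(c, 40) for c in range(410, 700, 30)]  # 10 candidates
target = gaussian(550, 60) + 0.4 * gaussian(450, 30)      # spectrum to recover
basis = np.column_stack([gaussian(c, 50) for c in (450, 550, 650)])

def mc_error(combo, n_mc=200, noise=0.01):
    """Mean reconstruction error over Monte Carlo noise realizations."""
    T = np.array([filters[i] for i in combo])      # measurement matrix
    y_true = T @ target
    A = T @ basis                                  # basis coeffs -> signals
    errs = []
    for _ in range(n_mc):
        y = y_true * (1 + noise * rng.normal(size=y_true.size))
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        errs.append(np.linalg.norm(basis @ coef - target))
    return np.mean(errs)

best = min(itertools.combinations(range(len(filters)), 4), key=mc_error)
print("best 4-filter combination:", best)
```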

  14. Fast Photon Monte Carlo for Water Cherenkov Detectors

    NASA Astrophysics Data System (ADS)

    Latorre, Anthony; Seibert, Stanley

    2012-03-01

    We present Chroma, a high-performance optical photon simulation for large particle physics detectors, such as the water Cherenkov far detector option for LBNE. This software takes advantage of the CUDA parallel computing platform to propagate photons using modern graphics processing units. In a computer model of a 200 kiloton water Cherenkov detector with 29,000 photomultiplier tubes, Chroma can propagate 2.5 million photons per second, around 200 times faster than the same simulation with Geant4. Chroma uses a surface-based approach to modeling geometry, which offers many benefits over the solid-based approach used in other simulations such as Geant4.
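
    A minimal CPU sketch of the per-photon loop such a simulation runs (Chroma executes it on the GPU against a triangle-mesh surface geometry): sample an exponential free path, move the photon, then choose between absorption and isotropic scattering. The optical properties and spherical detector boundary are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
ABS_LEN, SCAT_LEN = 50.0, 30.0   # absorption/scattering lengths (m), assumed
DETECTOR_R = 20.0                # detector radius (m), assumed

def propagate_photon():
    pos = np.zeros(3)
    direction = np.array([0.0, 0.0, 1.0])
    # Combined interaction length: 1/L = 1/L_abs + 1/L_scat.
    interaction_len = 1.0 / (1.0 / ABS_LEN + 1.0 / SCAT_LEN)
    while True:
        step = rng.exponential(interaction_len)
        pos = pos + step * direction
        if np.linalg.norm(pos) >= DETECTOR_R:
            return "detected"                     # crossed the boundary
        if rng.random() < interaction_len / ABS_LEN:
            return "absorbed"                     # absorption branch
        v = rng.normal(size=3)                    # isotropic re-scatter
        direction = v / np.linalg.norm(v)

results = [propagate_photon() for _ in range(10_000)]
print(results.count("detected") / len(results), "fraction detected")
```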

  15. PENGEOM-A general-purpose geometry package for Monte Carlo simulation of radiation transport in material systems defined by quadric surfaces

    NASA Astrophysics Data System (ADS)

    Almansa, Julio; Salvat-Pujol, Francesc; Díaz-Londoño, Gloria; Carnicer, Artur; Lallena, Antonio M.; Salvat, Francesc

    2016-02-01

    The Fortran subroutine package PENGEOM provides a complete set of tools to handle quadric geometries in Monte Carlo simulations of radiation transport. The material structure where radiation propagates is assumed to consist of homogeneous bodies limited by quadric surfaces. The PENGEOM subroutines (a subset of the PENELOPE code) track particles through the material structure, independently of the details of the physics models adopted to describe the interactions. Although these subroutines are designed for detailed simulations of photon and electron transport, where all individual interactions are simulated sequentially, they can also be used in mixed (class II) schemes for simulating the transport of high-energy charged particles, where the effect of soft interactions is described by the random-hinge method. The definition of the geometry and the details of the tracking algorithm are tailored to optimize simulation speed. The use of fuzzy quadric surfaces minimizes the impact of round-off errors. The provided software includes a Java graphical user interface for editing and debugging the geometry definition file and for visualizing the material structure. Images of the structure are generated by using the tracking subroutines and, hence, they describe the geometry actually passed to the simulation code.
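
    The core geometric operation behind a quadric tracker of this kind is the ray-surface distance: for an implicit quadric F(r) = r·Ar + b·r + c and a ray r(t) = o + t·d, the crossing distances are the roots of a quadratic in t. A sketch follows; PENGEOM's reduced-quadric and fuzzy-surface handling is not reproduced here.

```python
import numpy as np

def ray_quadric_distance(origin, direction, A, b, c):
    """Distance along the ray to the quadric x.Ax + b.x + c = 0, or None."""
    o, d = np.asarray(origin, float), np.asarray(direction, float)
    qa = d @ A @ d
    qb = 2.0 * (o @ A @ d) + b @ d
    qc = o @ A @ o + b @ o + c
    if abs(qa) < 1e-12:                 # degenerate: linear in t
        t = -qc / qb if qb != 0.0 else -1.0
        return t if t > 1e-9 else None
    disc = qb * qb - 4.0 * qa * qc
    if disc < 0:
        return None                     # ray misses the surface
    t1, t2 = sorted([(-qb - np.sqrt(disc)) / (2 * qa),
                     (-qb + np.sqrt(disc)) / (2 * qa)])
    for t in (t1, t2):                  # nearest crossing ahead of the ray
        if t > 1e-9:
            return t
    return None

# Unit sphere: x^2 + y^2 + z^2 - 1 = 0  ->  A = I, b = 0, c = -1.
A, b, c = np.eye(3), np.zeros(3), -1.0
print(ray_quadric_distance([0, 0, -5], [0, 0, 1], A, b, c))  # expect 4.0
```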

  16. SU-F-I-53: Coded Aperture Coherent Scatter Spectral Imaging of the Breast: A Monte Carlo Evaluation of Absorbed Dose

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morris, R; Lakshmanan, M; Fong, G

    Purpose: Coherent scatter based imaging has shown improved contrast and molecular specificity over conventional digital mammography; however, the biological risks have not been quantified due to a lack of accurate information on absorbed dose. This study intends to characterize the dose distribution and average glandular dose from coded aperture coherent scatter spectral imaging of the breast. The dose deposited in the breast from this new diagnostic imaging modality has not yet been quantitatively evaluated. Here, various digitized anthropomorphic phantoms are tested in a Monte Carlo simulation to evaluate the absorbed dose distribution and average glandular dose using clinically feasible scan protocols. Methods: The Geant4 Monte Carlo radiation transport simulation software is used to replicate the coded aperture coherent scatter spectral imaging system. Energy-sensitive, photon-counting detectors are used to characterize the x-ray beam spectra for various imaging protocols. These input spectra are cross-validated with the results from XSPECT, a commercially available application that yields x-ray tube specific spectra for the operating parameters employed. XSPECT is also used to determine the appropriate number of photons emitted per mAs of tube current at a given kVp tube potential. With the implementation of the XCAT digital anthropomorphic breast phantom library, a variety of breast sizes with differing anatomical structure are evaluated. Simulations were performed with and without compression of the breast for dose comparison. Results: Through the Monte Carlo evaluation of a diverse population of breast types imaged under real-world scan conditions, a clinically relevant average glandular dose for this new imaging modality is extrapolated. Conclusion: With access to the physical coherent scatter imaging system used in the simulation, the results of this Monte Carlo study may be used to directly influence the future development of the modality to keep breast dose to a minimum while still maintaining clinically viable image quality.

  17. A Monte Carlo model for the internal dosimetry of choroid plexuses in nuclear medicine procedures.

    PubMed

    Amato, Ernesto; Cicone, Francesco; Auditore, Lucrezia; Baldari, Sergio; Prior, John O; Gnesin, Silvano

    2018-05-01

    Choroid plexuses are vascular structures located in the brain ventricles, showing specific uptake of some diagnostic and therapeutic radiopharmaceuticals currently under clinical investigation, such as integrin-binding arginine-glycine-aspartic acid (RGD) peptides. No specific geometry for choroid plexuses has been implemented in commercially available software for internal dosimetry. The aims of the present study were to assess the dependence of absorbed dose to the choroid plexuses on the organ geometry implemented in Monte Carlo simulations, and to propose an analytical model for the internal dosimetry of these structures for the 18F, 64Cu, 67Cu, 68Ga, 90Y, 131I and 177Lu nuclides. A GAMOS Monte Carlo simulation based on direct organ segmentation was taken as the gold standard to validate a second simulation based on a simplified geometrical model of the choroid plexuses. Both simulations were compared with the OLINDA/EXM sphere model. The gold standard and the simplified geometrical model gave similar dosimetry results (dose difference < 3.5%), indicating that the latter can be considered a satisfactory approximation of the real geometry. In contrast, the sphere model systematically overestimated the absorbed dose compared to both Monte Carlo models (range: 4-50% dose difference), depending on the isotope energy and organ mass. Therefore, the simplified geometric model was adopted to introduce an analytical approach for choroid plexus dosimetry in the mass range 2-16 g. The proposed model enables the estimation of the choroid plexus dose by a simple bi-parametric function, once the organ mass and the residence time of the radiopharmaceutical under investigation are provided. Copyright © 2018 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
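
    The article's bi-parametric function is not reproduced here; the sketch below only illustrates the MIRD-style recipe such a model builds on: D = A0 · τ · S(m), with the mass dependence of the S value captured by a power-law fit. All numbers are invented placeholders.

```python
import numpy as np

masses = np.array([2.0, 4.0, 8.0, 16.0])               # organ mass grid (g)
s_values = np.array([8.0e-5, 4.4e-5, 2.4e-5, 1.3e-5])  # mGy/(MBq s), assumed

# Fit S(m) = alpha * m**beta in log-log space (polyfit: slope, intercept).
beta, log_alpha = np.polyfit(np.log(masses), np.log(s_values), 1)
alpha = np.exp(log_alpha)

def dose(a0_mbq, residence_time_s, mass_g):
    """Absorbed dose (mGy) from administered activity and residence time."""
    return a0_mbq * residence_time_s * alpha * mass_g**beta

print(f"S(m) ~ {alpha:.2e} * m^{beta:.2f}")
print(dose(a0_mbq=200.0, residence_time_s=3600.0, mass_g=6.0), "mGy")
```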

  18. Application for internal dosimetry using biokinetic distribution of photons based on nuclear medicine images*

    PubMed Central

    Leal Neto, Viriato; Vieira, José Wilson; Lima, Fernando Roberto de Andrade

    2014-01-01

    Objective: This article presents a way to obtain dose estimates for patients undergoing radiotherapy, based on the analysis of regions of interest in nuclear medicine images. Materials and Methods: A software package called DoRadIo (Dosimetria das Radiações Ionizantes [Ionizing Radiation Dosimetry]) was developed to receive information about source organs and target organs, generating graphical and numerical results. The nuclear medicine images utilized in the present study were obtained from catalogs provided by medical physicists. The simulations were performed with computational exposure models consisting of voxel phantoms coupled with the EGSnrc Monte Carlo code. The software was developed with Microsoft Visual Studio 2010 Service Pack, using the Windows Presentation Foundation project template for the C# programming language. Results: With these tools, the authors obtained the file for optimization of Monte Carlo simulations using EGSnrc; organization and compaction of dosimetry results for all radioactive sources; selection of regions of interest; evaluation of grayscale intensity in regions of interest; the weighted-sources file; and, finally, all the charts and numerical results. Conclusion: The user interface may be adapted for use in clinical nuclear medicine as a computer-aided tool to estimate the administered activity. PMID:25741101

  19. Experimental analysis of computer system dependability

    NASA Technical Reports Server (NTRS)

    Iyer, Ravishankar K.; Tang, Dong

    1993-01-01

    This paper reviews an area which has evolved over the past 15 years: experimental analysis of computer system dependability. Methodologies and advances are discussed for three basic approaches used in the area: simulated fault injection, physical fault injection, and measurement-based analysis. The three approaches are suited, respectively, to dependability evaluation in the three phases of a system's life: the design phase, the prototype phase, and the operational phase. Before the discussion of these phases, several statistical techniques used in the area are introduced. For each phase, a classification of research methods or study topics is outlined, followed by a discussion of these methods or topics as well as representative studies. The statistical techniques introduced include the estimation of parameters and confidence intervals, probability distribution characterization, and several multivariate analysis methods. Importance sampling, a statistical technique used to accelerate Monte Carlo simulation, is also introduced. The discussion of simulated fault injection covers electrical-level, logic-level, and function-level fault injection methods as well as representative simulation environments such as FOCUS and DEPEND. The discussion of physical fault injection covers hardware, software, and radiation fault injection methods as well as several software and hybrid tools including FIAT, FERRARI, HYBRID, and FINE. The discussion of measurement-based analysis covers measurement and data processing techniques, basic error characterization, dependency analysis, Markov reward modeling, software dependability, and fault diagnosis. The discussion involves several important issues studied in the area, including fault models, fast simulation techniques, workload/failure dependency, correlated failures, and software fault tolerance.
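
    Importance sampling, mentioned above as an accelerator for Monte Carlo dependability simulation, can be illustrated with the textbook rare-event example: estimating P(X > 4) for X ~ N(0, 1) by sampling from a shifted proposal and re-weighting by the likelihood ratio. All numbers below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100_000
threshold = 4.0

# Naive Monte Carlo: almost no samples land in the failure region.
x = rng.normal(size=n)
naive = np.mean(x > threshold)

# Importance sampling: draw from N(threshold, 1), weight by f(y)/g(y).
y = rng.normal(loc=threshold, size=n)
log_w = -0.5 * y**2 + 0.5 * (y - threshold) ** 2   # log N(0,1) - log N(mu,1)
is_est = np.mean((y > threshold) * np.exp(log_w))

print(f"naive: {naive:.2e}, importance sampling: {is_est:.2e} "
      f"(exact ~ 3.17e-5)")
```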

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    A. Alfonsi; C. Rabiti; D. Mandelli

    The Reactor Analysis and Virtual control ENvironment (RAVEN) code is a software tool that acts as the control logic driver and post-processing engine for the newly developed thermal-hydraulic code RELAP-7. RAVEN is now a multi-purpose Probabilistic Risk Assessment (PRA) software framework that dispatches several functionalities: it derives and actuates the control logic required to simulate the plant control system and operator actions (guided procedures), allowing on-line monitoring and control in the phase space; it performs both Monte Carlo sampling of randomly distributed events and Dynamic Event Tree based analysis; and it facilitates input/output handling through a Graphical User Interface (GUI) and a post-processing data mining module.

  1. Black Box Testing: Experiments with Runway Incursion Advisory Alerting System

    NASA Technical Reports Server (NTRS)

    Mukkamala, Ravi

    2005-01-01

    This report summarizes our research findings on black-box testing of the Runway Incursion Advisory Alerting System (RIAAS) and the Runway Safety Monitor (RSM) system. Developing automated testing software for such systems has been a problem because of the extensive information that has to be processed. Customized software solutions have been proposed; however, they are time-consuming to develop. Here, we present a less expensive and more general test platform that is capable of performing complete black-box testing. The technique is based on the classification of the anomalies that arise during Monte Carlo simulations. In addition, we also discuss a generalized testing tool (prototype) that we have developed.

  2. Studies on heavy charged particle interaction, water equivalence and Monte Carlo simulation in some gel dosimeters, water, human tissues and water phantoms

    NASA Astrophysics Data System (ADS)

    Kurudirek, Murat

    2015-09-01

    Some gel dosimeters, water, human tissues and water phantoms were investigated with respect to their radiological properties in the energy region 10 keV-10 MeV. The effective atomic numbers (Zeff) and electron densities (Ne) for some heavy charged particles, such as protons, He ions, B ions and C ions, have been calculated for the first time for Fricke, MAGIC, MAGAT, PAGAT, PRESAGE, water, adipose tissue, muscle skeletal (ICRP), muscle striated (ICRU), plastic water, WT1 and RW3, using mass stopping powers from the SRIM Monte Carlo software. The ranges and straggling were also calculated for the given materials. Two different sets of mass stopping powers were used to calculate Zeff for comparison. The water equivalence of the given materials was also determined based on the results obtained. Monte Carlo simulation of the charged particle transport was also performed using the SRIM code. The heavy-ion distributions, along with their parameters, were shown for the given materials for different heavy ions. The energy loss and damage events in water irradiated with 100 keV heavy ions were also studied in detail.

  3. STEFFY: software to calculate nuclide-specific total counting efficiency in well-type γ-ray detectors.

    PubMed

    Pommé, S

    2012-09-01

    A software package is presented to calculate the total counting efficiency for the decay of radionuclides in a well-type γ-ray detector. It is specifically applied to primary standardisation of activity by means of 4πγ-counting with a NaI(Tl) well-type scintillation detector. As an alternative to Monte Carlo simulations, the software combines good accuracy with superior speed and ease-of-use. It is also well suited to investigate uncertainties associated with the 4πγ-counting method for a variety of radionuclides and detector dimensions. In this paper, the underlying analytical models for the radioactive decay and subsequent counting efficiency of the emitted radiation in the detector are summarised. Copyright © 2012 Elsevier Ltd. All rights reserved.

  4. Apparatus and method for tracking a molecule or particle in three dimensions

    DOEpatents

    Werner, James H [Los Alamos, NM; Goodwin, Peter M [Los Alamos, NM; Lessard, Guillaume [Santa Fe, NM

    2009-03-03

    An apparatus and method were used to track the movement of fluorescent particles in three dimensions. Control software was used with the apparatus to implement a tracking algorithm for tracking the motion of the individual particles in glycerol/water mixtures. Monte Carlo simulations suggest that the tracking algorithms in combination with the apparatus may be used for tracking the motion of single fluorescent or fluorescently labeled biomolecules in three dimensions.

  5. Development of the Code RITRACKS

    NASA Technical Reports Server (NTRS)

    Plante, Ianik; Cucinotta, Francis A.

    2013-01-01

    A document discusses the code RITRACKS (Relativistic Ion Tracks), which was developed to simulate heavy ion track structure at the microscopic and nanoscopic scales. It is a Monte-Carlo code that simulates the production of radiolytic species in water, event-by-event, and which may be used to simulate tracks and also to calculate dose in targets and voxels of different sizes. The dose deposited by the radiation can be calculated in nanovolumes (voxels). RITRACKS allows simulation of radiation tracks without the need of extensive knowledge of computer programming or Monte-Carlo simulations. It is installed as a regular application on Windows systems. The main input parameters entered by the user are the type and energy of the ion, the length and size of the irradiated volume, the number of ions impacting the volume, and the number of histories. The simulation can be started after the input parameters are entered in the GUI. The number of each kind of interactions for each track is shown in the result details window. The tracks can be visualized in 3D after the simulation is complete. It is also possible to see the time evolution of the tracks and zoom on specific parts of the tracks. The software RITRACKS can be very useful for radiation scientists to investigate various problems in the fields of radiation physics, radiation chemistry, and radiation biology. For example, it can be used to simulate electron ejection experiments (radiation physics).

  6. Comparison of Monte Carlo simulated and measured performance parameters of miniPET scanner

    NASA Astrophysics Data System (ADS)

    Kis, S. A.; Emri, M.; Opposits, G.; Bükki, T.; Valastyán, I.; Hegyesi, Gy.; Imrek, J.; Kalinka, G.; Molnár, J.; Novák, D.; Végh, J.; Kerek, A.; Trón, L.; Balkay, L.

    2007-02-01

    In vivo imaging of small laboratory animals is a valuable tool in the development of new drugs. For this purpose, miniPET, an easy-to-scale, modular small-animal PET camera, has been developed at our institutes. The system has four modules, which makes it possible to rotate the whole detector system around the axis of the field of view. Data collection and image reconstruction are performed using a data acquisition (DAQ) module with an Ethernet communication facility and a computer cluster of commercial PCs. Performance tests were carried out to determine system parameters such as energy resolution, sensitivity and noise equivalent count rate. A modified GEANT4-based GATE Monte Carlo software package was used to simulate PET data analogous to those of the performance measurements. GATE was run on a Linux cluster of 10 processors (64-bit Xeon, 3.0 GHz) controlled by a Sun Grid Engine. The use of this computer cluster reduced the time necessary for the simulations by an order of magnitude. The simulated energy spectra, maximum rate of true coincidences and sensitivity of the camera were in good agreement with the measured parameters.

  7. [Series: Medical Applications of the PHITS Code (2): Acceleration by Parallel Computing].

    PubMed

    Furuta, Takuya; Sato, Tatsuhiko

    2015-01-01

    Time-consuming Monte Carlo dose calculations have become feasible owing to the development of computer technology. However, recent gains are due to the emergence of multi-core high-performance computers, so parallel computing has become a key to achieving good software performance. The Monte Carlo simulation code PHITS contains two parallel computing functions: distributed-memory parallelization using the message passing interface (MPI) protocol and shared-memory parallelization using open multi-processing (OpenMP) directives. Users can choose between the two functions according to their needs. This paper explains the two functions, with their advantages and disadvantages. Some test applications are also provided to show their performance on a typical multi-core high-performance workstation.

  8. Tool Support for Parametric Analysis of Large Software Simulation Systems

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony

    2008-01-01

    The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.

  9. Hybrid Microgrid Configuration Optimization with Evolutionary Algorithms

    NASA Astrophysics Data System (ADS)

    Lopez, Nicolas

    This dissertation explores the Renewable Energy Integration Problem and proposes a Genetic Algorithm embedded with a Monte Carlo simulation to solve large instances of the problem that are impractical to solve via full enumeration. The Renewable Energy Integration Problem is defined as finding the optimum set of components to supply the electric demand of a hybrid microgrid. The components considered are solar panels, wind turbines, diesel generators, electric batteries, connections to the power grid, and converters, which can be inverters and/or rectifiers. The methodology developed is explained, as well as the combinatorial formulation. In addition, two case studies of a single-objective optimization version of the problem are presented, minimizing cost and minimizing global warming potential (GWP), followed by a multi-objective implementation of the proposed methodology utilizing a non-dominated sorting Genetic Algorithm embedded with a Monte Carlo simulation. The method is validated by solving a small instance of the problem with a known solution via a full enumeration algorithm developed by NREL in their software HOMER. The dissertation concludes that evolutionary algorithms embedded with Monte Carlo simulation, namely modified Genetic Algorithms, are an efficient way of solving the problem, finding approximate solutions in the case of single-objective optimization and approximating the true Pareto front in the case of multi-objective optimization of the Renewable Energy Integration Problem.
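
    The "Monte Carlo inside a Genetic Algorithm" pattern can be sketched as below: each candidate configuration is scored by averaging its cost over random renewable-output scenarios, and a small GA evolves the population. Component costs, the demand model and the GA operators are simplified illustrative assumptions.

```python
import random

rng = random.Random(5)
DEMAND = 100.0                      # kW, assumed constant load
COST = {"solar": 1.0, "wind": 1.5, "diesel_kwh": 0.3, "shortfall": 10.0}

def fitness(cfg, n_mc=200):
    """Expected cost of a (solar kW, wind kW) configuration."""
    solar_kw, wind_kw = cfg
    capital = COST["solar"] * solar_kw + COST["wind"] * wind_kw
    total = 0.0
    for _ in range(n_mc):           # Monte Carlo over resource scenarios
        supply = solar_kw * rng.uniform(0, 1) + wind_kw * rng.uniform(0, 1)
        deficit = max(0.0, DEMAND - supply)
        total += COST["diesel_kwh"] * min(deficit, 50) \
                 + COST["shortfall"] * max(0.0, deficit - 50)
    return capital + total / n_mc

def mutate(cfg):
    return tuple(max(0, g + rng.choice((-10, 0, 10))) for g in cfg)

# Tiny elitist GA loop over 40 generations.
pop = [(rng.randrange(0, 300, 10), rng.randrange(0, 300, 10)) for _ in range(20)]
for _ in range(40):
    pop.sort(key=fitness)
    pop = pop[:10] + [mutate(p) for p in pop[:10]]   # keep elites, mutate them
print("best configuration (solar kW, wind kW):", min(pop, key=fitness))
```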

  10. Monte Carlo parametric studies of neutron interrogation with the Associated Particle Technique for cargo container inspections

    NASA Astrophysics Data System (ADS)

    Deyglun, Clément; Carasco, Cédric; Pérot, Bertrand

    2014-06-01

    The detection of Special Nuclear Materials (SNM) by neutron interrogation is extensively studied by Monte Carlo simulation at the Nuclear Measurement Laboratory of CEA Cadarache (French Alternative Energies and Atomic Energy Commission). The active inspection system is based on the Associated Particle Technique (APT). Fissions induced in SNM by tagged neutrons (i.e., neutrons correlated to an alpha particle in the DT neutron generator) produce high-multiplicity coincidences, which are detected with fast plastic scintillators. At least three particles are detected in a short time window following the alpha detection, whereas non-nuclear materials mainly produce single events, or pairs due to (n,2n) and (n,n'γ) reactions. To study the performance of an industrial cargo container inspection system, Monte Carlo simulations are performed with the MCNP-PoliMi transport code, which records the relevant information for each neutron history: reaction types, position and time of interactions, energy deposits, secondary particles, etc. The output files are post-processed with a specific tool developed with the ROOT data analysis software. Particles not correlated with an alpha particle (random background), counting statistics, and the time and energy resolutions of the data acquisition system are taken into account in the numerical model. Various matrix compositions, suspicious items, and SNM shieldings and positions inside the container are simulated to assess the performance and limitations of an industrial system.
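
    The tagged-neutron coincidence logic can be sketched as follows: for each alpha "tag" time, count detector hits falling in a short window after the tag and classify the tag by hit multiplicity (three or more suggesting induced fission). Rates, window length and the uniform background are illustrative assumptions, not the MCNP-PoliMi post-processing tool.

```python
import numpy as np

rng = np.random.default_rng(9)
WINDOW_NS = 100.0

alpha_tags = np.sort(rng.uniform(0, 1e9, 10_000))     # alpha times, ns
background = np.sort(rng.uniform(0, 1e9, 200_000))    # uncorrelated hits

# Add correlated bursts after 100 tags to mimic induced fission.
fission_tags = rng.choice(alpha_tags, size=100, replace=False)
bursts = np.concatenate([t + rng.uniform(0, WINDOW_NS, 4) for t in fission_tags])
hits = np.sort(np.concatenate([background, bursts]))

# Multiplicity per tag: number of hits in (tag, tag + WINDOW_NS].
lo = np.searchsorted(hits, alpha_tags, side="right")
hi = np.searchsorted(hits, alpha_tags + WINDOW_NS, side="right")
multiplicity = hi - lo

print("tags with multiplicity >= 3:", int(np.sum(multiplicity >= 3)))
```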

  11. Quantitative microbial risk assessment for Escherichia coli O157:H7, salmonella, and Listeria monocytogenes in leafy green vegetables consumed at salad bars.

    PubMed

    Franz, E; Tromp, S O; Rijgersberg, H; van der Fels-Klerx, H J

    2010-02-01

    Fresh vegetables are increasingly recognized as a source of foodborne outbreaks in many parts of the world. The purpose of this study was to conduct a quantitative microbial risk assessment for Escherichia coli O157:H7, Salmonella, and Listeria monocytogenes infection from consumption of leafy green vegetables in salads from salad bars in The Netherlands. Pathogen growth was modeled in Aladin (Agro Logistics Analysis and Design Instrument) using time-temperature profiles in the chilled supply chain and one particular restaurant with a salad bar. A second-order Monte Carlo risk assessment model was constructed (using @Risk) to estimate the public health effects. The temperature in the studied cold chain was well controlled, below 5 °C. Growth of E. coli O157:H7 and Salmonella was minimal (17 and 15%, respectively), while growth of L. monocytogenes was considerably greater (194%). Based on first-order Monte Carlo simulations, the average number of cases per year in The Netherlands associated with the consumption of leafy greens in salads from salad bars was 166, 187, and 0.3 for E. coli O157:H7, Salmonella, and L. monocytogenes, respectively. The ranges of the average number of annual cases as estimated by second-order Monte Carlo simulation (with prevalence and number of visitors as uncertain variables) were 42 to 551 for E. coli O157:H7, 81 to 281 for Salmonella, and 0.1 to 0.9 for L. monocytogenes. This study integrated modeling of pathogen growth in the supply chain of fresh leafy vegetables destined for restaurant salad bars, using software designed to model and design logistics, with modeling of the public health effects using probabilistic risk assessment software.
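
    The two-dimensional ("second-order") structure can be sketched as nested loops: an outer loop samples uncertain quantities (here prevalence and servings per year) and an inner loop samples variable quantities (dose per serving), yielding an uncertainty interval on the annual case estimate rather than a single number. Distributions and the dose-response slope are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)
R = 1e-3                                  # assumed infection prob. per cell

outer_estimates = []
for _ in range(100):                      # uncertainty loop
    prevalence = rng.uniform(0.001, 0.01)           # uncertain
    servings_per_year = rng.uniform(1e6, 5e6)       # uncertain
    cases = 0.0
    for _ in range(1_000):                # variability loop
        if rng.random() < prevalence:               # contaminated serving?
            dose_cells = rng.lognormal(mean=2.0, sigma=1.5)
            cases += 1.0 - np.exp(-R * dose_cells)  # exponential dose-response
    outer_estimates.append(cases / 1_000 * servings_per_year)

lo, hi = np.percentile(outer_estimates, [2.5, 97.5])
print(f"annual cases: mean {np.mean(outer_estimates):.0f}, "
      f"95% uncertainty interval ({lo:.0f}, {hi:.0f})")
```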

  12. Optical ensemble analysis of intraocular lens performance through a simulated clinical trial with ZEMAX.

    PubMed

    Zhao, Huawei

    2009-01-01

    A ZEMAX model was constructed to simulate a clinical trial of intraocular lenses (IOLs) based on a clinically oriented Monte Carlo ensemble analysis using postoperative ocular parameters. The purpose of this model is to test the feasibility of streamlining and optimizing both the design process and the clinical testing of IOLs. This optical ensemble analysis (OEA) is also validated. Simulated pseudophakic eyes were generated by using the tolerancing and programming features of ZEMAX optical design software. OEA methodology was verified by demonstrating that the results of clinical performance simulations were consistent with previously published clinical performance data using the same types of IOLs. From these results we conclude that the OEA method can objectively simulate the potential clinical trial performance of IOLs.

  13. Backscatter factors and mass energy-absorption coefficient ratios for diagnostic radiology dosimetry

    NASA Astrophysics Data System (ADS)

    Benmakhlouf, Hamza; Bouchard, Hugo; Fransson, Annette; Andreo, Pedro

    2011-11-01

    Backscatter factors, B, and mass energy-absorption coefficient ratios, (μen/ρ)w,air, for the determination of the surface dose in diagnostic radiology were calculated using Monte Carlo simulations. The main purpose was to extend the range of available data to the qualities used in modern x-ray techniques, particularly in interventional radiology. A comprehensive database for mono-energetic photons between 4 and 150 keV and different field sizes was created for a 15 cm thick water phantom. Backscattered spectra were calculated with the PENELOPE Monte Carlo system, scoring track-length fluence differential in energy with negligible statistical uncertainty; using the Monte Carlo computed spectra, B factors and (μen/ρ)w,air were then calculated numerically for each energy. Weighted averaging procedures were subsequently used to convolve incident clinical spectra with the mono-energetic data. The method was benchmarked against full Monte Carlo calculations of incident clinical spectra, obtaining differences within 0.3-0.6%. The technique used enables the calculation of B and (μen/ρ)w,air for any incident spectrum without further time-consuming Monte Carlo simulations. The adequacy of the extended dosimetry data for a broader range of clinical qualities than those currently available, while keeping consistency with existing data, was confirmed through detailed comparisons. Mono-energetic and spectrum-averaged values were compared with published data, including those in ICRU Report 74 and IAEA TRS-457, finding average differences of 0.6%. Results are provided in comprehensive tables appropriate for clinical use. Additional qualities can easily be calculated using a dedicated GUI in conjunction with software to generate incident photon spectra.
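
    The weighted-averaging step can be sketched as below, assuming air-kerma weighting (fluence × energy × μen/ρ for air) as the weight; the spectrum, B(E) curve and coefficient values are toy placeholders, not the published data.

```python
import numpy as np

energy = np.linspace(10, 150, 141)                  # keV grid
fluence = np.exp(-0.5 * ((energy - 60) / 20) ** 2)  # toy incident spectrum
mu_en_rho_air = 1e3 * energy**-2.5 + 0.02           # toy mu_en/rho trend
b_mono = 1.0 + 0.5 * np.exp(-0.5 * ((energy - 50) / 30) ** 2)  # toy B(E)

# Spectrum-averaged B: kerma-weighted mean of the mono-energetic values.
weights = fluence * energy * mu_en_rho_air
b_spectrum = np.trapz(weights * b_mono, energy) / np.trapz(weights, energy)
print(f"spectrum-averaged backscatter factor B = {b_spectrum:.3f}")
```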

  14. Neutrino oscillation parameter sampling with MonteCUBES

    NASA Astrophysics Data System (ADS)

    Blennow, Mattias; Fernandez-Martinez, Enrique

    2010-01-01

    We present MonteCUBES ("Monte Carlo Utility Based Experiment Simulator"), a software package designed to sample the neutrino oscillation parameter space through Markov Chain Monte Carlo algorithms. MonteCUBES makes use of the GLoBES software so that the existing experiment definitions for GLoBES, describing long baseline and reactor experiments, can be used with MonteCUBES. MonteCUBES consists of two main parts: The first is a C library, written as a plug-in for GLoBES, implementing the Markov Chain Monte Carlo algorithm to sample the parameter space. The second part is a user-friendly graphical Matlab interface to easily read, analyze, plot and export the results of the parameter space sampling. Program summaryProgram title: MonteCUBES (Monte Carlo Utility Based Experiment Simulator) Catalogue identifier: AEFJ_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEFJ_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public Licence No. of lines in distributed program, including test data, etc.: 69 634 No. of bytes in distributed program, including test data, etc.: 3 980 776 Distribution format: tar.gz Programming language: C Computer: MonteCUBES builds and installs on 32 bit and 64 bit Linux systems where GLoBES is installed Operating system: 32 bit and 64 bit Linux RAM: Typically a few MBs Classification: 11.1 External routines: GLoBES [1,2] and routines/libraries used by GLoBES Subprograms used:Cat Id ADZI_v1_0, Title GLoBES, Reference CPC 177 (2007) 439 Nature of problem: Since neutrino masses do not appear in the standard model of particle physics, many models of neutrino masses also induce other types of new physics, which could affect the outcome of neutrino oscillation experiments. In general, these new physics imply high-dimensional parameter spaces that are difficult to explore using classical methods such as multi-dimensional projections and minimizations, such as those used in GLoBES [1,2]. Solution method: MonteCUBES is written as a plug-in to the GLoBES software [1,2] and provides the necessary methods to perform Markov Chain Monte Carlo sampling of the parameter space. This allows an efficient sampling of the parameter space and has a complexity which does not grow exponentially with the parameter space dimension. The integration of the MonteCUBES package with the GLoBES software makes sure that the experimental definitions already in use by the community can also be used with MonteCUBES, while also lowering the learning threshold for users who already know GLoBES. Additional comments: A Matlab GUI for interpretation of results is included in the distribution. Running time: The typical running time varies depending on the dimensionality of the parameter space, the complexity of the experiment, and how well the parameter space should be sampled. The running time for our simulations [3] with 15 free parameters at a Neutrino Factory with O(10) samples varied from a few hours to tens of hours. References:P. Huber, M. Lindner, W. Winter, Comput. Phys. Comm. 167 (2005) 195, hep-ph/0407333. P. Huber, J. Kopp, M. Lindner, M. Rolinec, W. Winter, Comput. Phys. Comm. 177 (2007) 432, hep-ph/0701187. S. Antusch, M. Blennow, E. Fernandez-Martinez, J. Lopez-Pavon, arXiv:0903.3986 [hep-ph].

  15. SU-E-T-36: A GPU-Accelerated Monte-Carlo Dose Calculation Platform and Its Application Toward Validating a ViewRay Beam Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Y; Mazur, T; Green, O

    Purpose: To build a fast, accurate and easily deployable research platform for Monte Carlo dose calculations. We port the dose calculation engine PENELOPE to C++ and accelerate calculations on a GPU. Simulations of a Co-60 beam model provided by ViewRay demonstrate the capabilities of the platform. Methods: We built software that incorporates a beam model interface, a CT-phantom model, a GPU-accelerated PENELOPE engine, and a GUI front-end. We rewrote the PENELOPE kernel in C++ (from Fortran) and accelerated the code on a GPU. We seamlessly integrated a Co-60 beam model (obtained from ViewRay) into our platform. Simulations of various field sizes and SSDs using a homogeneous water phantom generated PDDs, dose profiles, and output factors that were compared to experimental data. Results: With GPU acceleration using a dated graphics card (Nvidia Tesla C2050), a highly accurate simulation (100 × 100 × 100 grid, 3 × 3 × 3 mm3 voxels, <1% uncertainty, 4.2 × 4.2 cm2 field size) runs 24 times faster (20 minutes versus 8 hours) than when parallelizing on 8 threads across a newer CPU (Intel i7-4770). Simulated PDDs, profiles and output ratios for the commercial system agree well with experimental data measured using radiographic film or an ionization chamber. Based on our analysis, this beam model is precise enough for general applications. Conclusions: Using a beam model for a Co-60 system provided by ViewRay, we evaluate the dose calculation platform that we developed. Comparison to measurements demonstrates the promise of our software as a research platform for dose calculations, with applications including quality assurance and treatment plan verification.

  16. A probabilistic sizing tool and Monte Carlo analysis for entry vehicle ablative thermal protection systems

    NASA Astrophysics Data System (ADS)

    Mazzaracchio, Antonio; Marchetti, Mario

    2010-03-01

    Implicit ablation and thermal response software was developed to analyse and size charring ablative thermal protection systems for entry vehicles. A statistical monitor integrated into the tool, based on the Monte Carlo technique, allows simulations to be run over stochastic series of inputs. This provides an uncertainty and sensitivity analysis that estimates the probability of maintaining the temperature of the underlying material within specified requirements. This approach and the associated software are primarily helpful during the preliminary design phases of spacecraft thermal protection systems. They are proposed as an alternative to traditional approaches such as the Root-Sum-Square method. The developed tool was verified by comparing its results with those from previous work on thermal protection system probabilistic sizing methodologies, which are based on an industry-standard high-fidelity ablation and thermal response program. New case studies were analysed to establish thickness margins for sizing the heat shields currently proposed for vehicles using rigid aeroshells in future aerocapture missions at Neptune, and to identify the major sources of uncertainty in the material response.
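
    The sizing loop can be sketched as below: sample the uncertain inputs, evaluate a (here drastically simplified) thermal response surrogate, and estimate the probability that the bondline temperature meets its limit for each candidate thickness. The one-line conduction surrogate and all parameter ranges are invented stand-ins for the paper's ablation solver.

```python
import numpy as np

rng = np.random.default_rng(8)

def prob_within_limit(thickness_m, t_limit=600.0, n_mc=20_000):
    """P(bondline temperature < limit) for one candidate thickness."""
    k = rng.normal(0.4, 0.05, n_mc)           # uncertain conductivity, W/m/K
    q = rng.normal(50.0, 10.0, n_mc)          # uncertain heat flux, W/cm^2
    recession = 1e-7 * q * 300.0              # toy ablation recession, m
    remaining = np.maximum(thickness_m - recession, 1e-4)
    t_bond = 300.0 + 0.07 * q / (k * remaining)   # toy conduction surrogate
    return np.mean(t_bond < t_limit)

for thickness in (0.02, 0.03, 0.04, 0.05):    # candidate thicknesses, m
    print(f"{thickness * 100:.0f} cm: P(T_bond < 600 K) = "
          f"{prob_within_limit(thickness):.3f}")
```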

  17. SU-C-204-06: Monte Carlo Dose Calculation for Kilovoltage X-Ray-Psoralen Activated Cancer Therapy (X-PACT): Preliminary Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mein, S; Gunasingha, R; Nolan, M

    Purpose: X-PACT is an experimental cancer therapy in which kV x-rays are used to photo-activate anti-cancer therapeutics through phosphor intermediaries (phosphors that absorb x-rays and re-radiate as UV light). Clinical trials in pet dogs are currently underway (NC State College of Veterinary Medicine), and an essential component is the ability to model the kV dose in these dogs. Here we report the commissioning and characterization of a Monte Carlo (MC) treatment planning simulation tool to calculate X-PACT radiation doses in canine trials. Methods: The FLUKA multi-particle MC simulation package was used to simulate a standard X-PACT radiation treatment beam of 80 kVp with the Varian OBI x-ray source geometry. The beam quality was verified by comparing measured and simulated attenuation of the beam by various thicknesses of aluminum (2-4.6 mm) under narrow beam conditions (HVL). The beam parameters at commissioning were then corroborated using MC, characterized and verified with empirically collected commissioning data, including percent depth dose curves (PDD), back-scatter factors (BSF), collimator scatter factors, heel effect, etc. All simulations were conducted for N = 30M histories at M = 100 iterations. Results: HVL and PDD simulation data agreed with measurements with average percent errors of 2.42% ± 0.33 and 6.03% ± 1.58, respectively. The mean square error (MSE) values for HVL and PDD (0.07% and 0.50%) were low, as expected; however, longer simulations are required to validate convergence to the expected values. Qualitatively, pre- and post-filtration source spectra matched well with 80 kVp references generated via the SPEKTR software. Further validation of commissioning data simulation is underway in preparation for first-time 3D dose calculations with canine CBCT data. Conclusion: We have prepared a Monte Carlo simulation capable of accurate dose calculation for use in ongoing X-PACT canine clinical trials. Preliminary results show good agreement with measured data and hold promise for accurate quantification of dose for this novel psoralen x-ray therapy. Funding Support, Disclosures, & Conflict of Interest: The Monte Carlo simulation work was not funded; Drs. Adamson & Oldham have received funding from Immunolight LLC for X-PACT research.

  18. Dosimetric investigation of LDR brachytherapy ¹⁹²Ir wires by Monte Carlo and TPS calculations.

    PubMed

    Bozkurt, Ahmet; Acun, Hediye; Kemikler, Gonul

    2013-01-01

    The aim of this study was to investigate the dose rate distribution around (192)Ir wires used as radioactive sources in low-dose-rate brachytherapy applications. Monte Carlo modeling of a 0.3-mm diameter source and its surrounding water medium was performed for five different wire lengths (1-5 cm) using the MCNP software package. The computed dose rates per unit of air kerma at distances from 0.1 up to 10 cm away from the source were first verified with literature data sets. Then, the simulation results were compared with the calculations from the XiO CMS commercial treatment planning system. The study results were found to be in concordance with the treatment planning system calculations except for the shorter wires at close distances.

  19. Bayesian Treatment of Uncertainty in Environmental Modeling: Optimization, Sampling and Data Assimilation Using the DREAM Software Package

    NASA Astrophysics Data System (ADS)

    Vrugt, J. A.

    2012-12-01

    In the past decade much progress has been made in the treatment of uncertainty in earth systems modeling. Whereas initial approaches focused mostly on quantification of parameter and predictive uncertainty, recent methods attempt to disentangle the effects of parameter, forcing (input) data, model structural and calibration data errors. In this talk I will highlight some of our recent work involving theory, concepts and applications of Bayesian parameter and/or state estimation. In particular, new methods for sequential Monte Carlo (SMC) and Markov Chain Monte Carlo (MCMC) simulation will be presented with emphasis on massively parallel distributed computing and quantification of model structural errors. The theoretical and numerical developments will be illustrated using model-data synthesis problems in hydrology, hydrogeology and geophysics.
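
    A minimal random-walk Metropolis sketch of the kind of MCMC parameter estimation described above; note that DREAM itself uses multi-chain differential-evolution proposals, so this toy sampler illustrates the idea, not the DREAM algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic calibration data: y = a*x + b plus noise (truth: a=2, b=1).
x = np.linspace(0, 10, 50)
y_obs = 2.0 * x + 1.0 + rng.normal(0, 0.5, x.size)

def log_posterior(theta, sigma=0.5):
    a, b = theta
    # Flat prior on a box, Gaussian likelihood.
    if not (-10 < a < 10 and -10 < b < 10):
        return -np.inf
    resid = y_obs - (a * x + b)
    return -0.5 * np.sum((resid / sigma) ** 2)

# Random-walk Metropolis sampler.
theta = np.array([0.0, 0.0])
lp = log_posterior(theta)
samples = []
for _ in range(20_000):
    prop = theta + rng.normal(0, 0.05, 2)
    lp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject step
        theta, lp = prop, lp_prop
    samples.append(theta.copy())

burned = np.array(samples[5_000:])             # discard burn-in
print("posterior mean a,b:", burned.mean(axis=0))
print("posterior std  a,b:", burned.std(axis=0))
```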

  20. Vega roll and attitude control system algorithms trade-off study

    NASA Astrophysics Data System (ADS)

    Paulino, N.; Cuciniello, G.; Cruciani, I.; Corraro, F.; Spallotta, D.; Nebula, F.

    2013-12-01

    This paper describes the trade-off study for the selection of the most suitable algorithms for the Roll and Attitude Control System (RACS) within the FPS-A program, aimed at developing the new Flight Program Software of the VEGA launcher. Two algorithms were analyzed: Switching Lines (SL) and Quaternion Feedback Regulation. Using a development simulation tool that models two critical flight phases, the Long Coasting Phase (LCP) and the Payload Release (PLR) phase, both algorithms were assessed with Monte Carlo batch simulations of both phases. The statistical outcomes demonstrate a 100 percent success rate for Quaternion Feedback Regulation and support the choice of this method.
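
    For readers unfamiliar with the second algorithm, a generic quaternion-feedback attitude law has this shape (a PD law on the error quaternion; the gains, conventions, and example numbers are illustrative assumptions, not VEGA flight software):

```python
import numpy as np

def quat_mul(q, r):
    # Hamilton product, scalar-first convention [w, x, y, z].
    w0, x0, y0, z0 = q
    w1, x1, y1, z1 = r
    return np.array([
        w0*w1 - x0*x1 - y0*y1 - z0*z1,
        w0*x1 + x0*w1 + y0*z1 - z0*y1,
        w0*y1 - x0*z1 + y0*w1 + z0*x1,
        w0*z1 + x0*y1 - y0*x1 + z0*w1,
    ])

def quat_conj(q):
    return q * np.array([1.0, -1.0, -1.0, -1.0])

def control_torque(q_cur, q_tgt, omega, kp=2.0, kd=4.0):
    """PD quaternion feedback: torque from attitude error plus body rate."""
    q_err = quat_mul(quat_conj(q_tgt), q_cur)
    if q_err[0] < 0:                 # enforce the shortest rotation
        q_err = -q_err
    return -kp * q_err[1:] - kd * omega

# Example: 30 deg attitude error about z, small residual body rate.
q_cur = np.array([np.cos(np.pi/12), 0.0, 0.0, np.sin(np.pi/12)])
q_tgt = np.array([1.0, 0.0, 0.0, 0.0])
print(control_torque(q_cur, q_tgt, omega=np.array([0.0, 0.0, 0.01])))
```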

  1. LANL Summer 2016 Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mendoza, Paul Michael

    The Monte Carlo N-Particle (MCNP) transport code developed at Los Alamos National Laboratory (LANL) utilizes nuclear cross-section data in a compact ENDF (ACE) format. The accuracy of MCNP calculations depends on the accuracy of nuclear ACE data tables, which depends on the accuracy of the original ENDF files. There are some noticeable differences in ENDF files from one generation to the next, even among the more common fissile materials. As the next generation of ENDF files is being prepared, several software tools were developed to simulate a large number of benchmarks in MCNP (over 1000), collect data from these simulations, and visually represent the results.

  2. Molecular Optical Simulation Environment (MOSE): A Platform for the Simulation of Light Propagation in Turbid Media

    PubMed Central

    Ren, Shenghan; Chen, Xueli; Wang, Hailong; Qu, Xiaochao; Wang, Ge; Liang, Jimin; Tian, Jie

    2013-01-01

    The study of light propagation in turbid media has attracted extensive attention in the field of biomedical optical molecular imaging. In this paper, we present a software platform for the simulation of light propagation in turbid media named the “Molecular Optical Simulation Environment (MOSE)”. Based on the gold standard of the Monte Carlo method, MOSE simulates light propagation both in tissues with complicated structures and through free-space. In particular, MOSE synthesizes realistic data for bioluminescence tomography (BLT), fluorescence molecular tomography (FMT), and diffuse optical tomography (DOT). The user-friendly interface and powerful visualization tools facilitate data analysis and system evaluation. As a major measure for resource sharing and reproducible research, MOSE aims to provide freeware for research and educational institutions, which can be downloaded at http://www.mosetm.net. PMID:23577215
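
    The core of any such Monte Carlo photon-transport code is a weighted random walk with exponentially sampled free paths. A deliberately minimal sketch (homogeneous slab, isotropic scattering in place of the usual Henyey-Greenstein phase function, no refractive-index mismatch, illustrative coefficients):

```python
import numpy as np

rng = np.random.default_rng(1)

mu_a, mu_s = 0.1, 10.0        # absorption/scattering coefficients, 1/mm
mu_t = mu_a + mu_s
thickness = 1.0               # slab thickness, mm

def run_photon():
    pos = np.zeros(3)
    direction = np.array([0.0, 0.0, 1.0])     # launched normal to the slab
    weight = 1.0
    while weight > 1e-4:
        step = -np.log(rng.uniform()) / mu_t  # free path from Beer-Lambert
        pos = pos + step * direction
        if pos[2] >= thickness:
            return "transmitted"
        if pos[2] < 0:
            return "reflected"
        weight *= mu_s / mu_t                 # implicit absorption weighting
        # Isotropic scattering (real codes use Henyey-Greenstein here).
        cos_t = 2 * rng.uniform() - 1
        phi = 2 * np.pi * rng.uniform()
        sin_t = np.sqrt(1 - cos_t**2)
        direction = np.array([sin_t*np.cos(phi), sin_t*np.sin(phi), cos_t])
    return "absorbed"

results = [run_photon() for _ in range(5_000)]
for outcome in ("transmitted", "reflected", "absorbed"):
    print(outcome, results.count(outcome) / len(results))
```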

  3. System Advisor Model, SAM 2014.1.14: General Description

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blair, Nate; Dobos, Aron P.; Freeman, Janine

    2014-02-01

    This document describes the capabilities of the U.S. Department of Energy and National Renewable Energy Laboratory's System Advisor Model (SAM), Version 2013.9.20, released on September 9, 2013. SAM is a computer model that calculates performance and financial metrics of renewable energy systems. Project developers, policy makers, equipment manufacturers, and researchers use graphs and tables of SAM results in the process of evaluating financial, technology, and incentive options for renewable energy projects. SAM simulates the performance of photovoltaic, concentrating solar power, solar water heating, wind, geothermal, biomass, and conventional power systems. The financial model can represent financial structures for projects that either buy and sell electricity at retail rates (residential and commercial) or sell electricity at a price determined in a power purchase agreement (utility). SAM's advanced simulation options facilitate parametric and sensitivity analyses, and statistical analysis capabilities are available for Monte Carlo simulation and weather variability (P50/P90) studies. SAM can also read input variables from Microsoft Excel worksheets. For software developers, the SAM software development kit (SDK) makes it possible to use SAM simulation modules in their applications written in C/C++, C#, Java, Python, and MATLAB. NREL provides both SAM and the SDK as free downloads at http://sam.nrel.gov. Technical support and more information about the software are available on the website.
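
    The P50/P90 analysis mentioned above reduces to percentiles of a Monte Carlo distribution of annual energy. A minimal sketch with a synthetic resource distribution and a toy linear plant response (neither comes from SAM):

```python
import numpy as np

rng = np.random.default_rng(7)

# Monte Carlo over inter-annual weather variability (synthetic assumption:
# annual solar resource varies ~4% around its long-term mean).
n_years = 10_000
resource = rng.normal(1.0, 0.04, n_years)   # normalized annual insolation
energy_mwh = 52_000 * resource              # toy linear plant response

p50 = np.percentile(energy_mwh, 50)         # exceeded in 50% of years
p90 = np.percentile(energy_mwh, 10)         # exceeded in 90% of years
print(f"P50 = {p50:,.0f} MWh, P90 = {p90:,.0f} MWh")
```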

  4. Atmospheric release model for the E-area low-level waste facility: Updates and modifications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None, None

    The atmospheric release model (ARM) utilizes GoldSim® Monte Carlo simulation software (GTG, 2017) to evaluate the flux of gaseous radionuclides as they volatilize from E-Area disposal facility waste zones, diffuse into the air-filled soil pores surrounding the waste, and emanate at the land surface. This report documents the updates and modifications to the ARM for the next planned E-Area PA considering recommendations from the 2015 PA strategic planning team outlined by Butcher and Phifer.

  5. Arcus end-to-end simulations

    NASA Astrophysics Data System (ADS)

    Wilms, Joern; Guenther, H. Moritz; Dauser, Thomas; Huenemoerder, David P.; Ptak, Andrew; Smith, Randall; Arcus Team

    2018-01-01

    We present an overview of the end-to-end simulation environment that we are implementing as part of the Arcus phase A study. With the Arcus simulator, we aim to model the imaging, detection, and event reconstruction properties of the spectrometer. The simulator uses a Monte Carlo ray-trace approach, projecting photons onto the Arcus focal plane from the silicon pore optic mirrors and critical-angle transmission gratings. We simulate the detection and read-out of the photons in the focal plane CCDs with software originally written for the eROSITA and Athena-WFI detectors; we include all relevant detector physics, such as charge splitting, and effects of the detector read-out, such as out-of-time events. The output of the simulation chain is an event list that closely resembles the data expected during flight. This event list is processed using a prototype event reconstruction chain for the order separation, wavelength calibration, and effective area calibration. The output is compatible with standard X-ray astronomical analysis software. During phase A, the end-to-end simulation approach is used to demonstrate the overall performance of the mission, including a full simulation of the calibration effort. Continued development during later phases of the mission will ensure that the simulator remains a faithful representation of the true mission capabilities, and it will ultimately be used as the Arcus calibration model.

  6. Simulation of Top Quark Pair Production as a Background for Higgs Events at the Compact Muon Solenoid

    NASA Astrophysics Data System (ADS)

    Justus, Christopher

    2005-04-01

    In this study, we simulated top-antitop (tt-bar) quark events at the Compact Muon Solenoid (CMS), an experiment presently being constructed at the Large Hadron Collider in Geneva, Switzerland. The tt-bar process is an important background for Higgs events. We used a chain of software to simulate and reconstruct processes that will occur inside the detector. CMKIN was used to generate and store Monte Carlo Events. OSCAR, a GEANT4 based CMS detector simulator, was used to simulate the CMS detector and how particles would interact with the detector. Next, we used ORCA to simulate the response of the readout electronics at CMS. Last, we used the Jet/MET Root maker to create root files of jets and missing energy. We are now using this software analysis chain to complete a systematic study of initial state radiation at hadron colliders. This study is essential because tt-bar is the main background for the Higgs boson and these processes are extremely sensitive to initial state radiation. Results of our initial state radiation study will be presented. We started this study at the new LHC Physics Center (LPC) located at Fermi National Accelerator Laboratory, and we are now completing the study at the University of Rochester.

  7. ImaSim, a software tool for basic education of medical x-ray imaging in radiotherapy and radiology

    NASA Astrophysics Data System (ADS)

    Landry, Guillaume; deBlois, François; Verhaegen, Frank

    2013-11-01

    Introduction: X-ray imaging is an important part of medicine and plays a crucial role in radiotherapy. Education in this field is mostly limited to textbook teaching due to equipment restrictions. A novel ray-tracing-based simulation tool, ImaSim, for teaching the fundamentals of the x-ray imaging process is presented in this work. ImaSim is used interactively via a graphical user interface (GUI). Materials and methods: The software package covers the main x-ray based medical modalities: planar kilovoltage (kV), planar (portal) megavoltage (MV), fan beam computed tomography (CT) and cone beam CT (CBCT) imaging. The user can modify the photon source, object to be imaged and imaging setup with three-dimensional editors. Objects are currently obtained by combining blocks with variable shapes. The imaging of three-dimensional voxelized geometries is currently not implemented, but can be added in a later release. The program follows a ray-tracing approach, ignoring photon scatter in its current implementation. Simulations of a phantom CT scan were generated in ImaSim and were compared to measured data in terms of CT number accuracy. Spatial variations in the photon fluence and mean energy from an x-ray tube caused by the heel effect were estimated from ImaSim and Monte Carlo simulations and compared. Results: In this paper we describe ImaSim and provide two examples of its capabilities. CT numbers were found to agree within 36 Hounsfield Units (HU) for bone, which corresponds to a 2% attenuation coefficient difference. ImaSim reproduced the heel effect reasonably well when compared to Monte Carlo simulations. Discussion: An x-ray imaging simulation tool is made available for teaching and research purposes. ImaSim provides a means to facilitate the teaching of medical x-ray imaging.
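
    The scatter-free ray-tracing approach amounts to exponential attenuation of primaries along each source-to-pixel ray. A minimal sketch, with made-up attenuation coefficients and a hypothetical two-material ray:

```python
import numpy as np

# Linear attenuation coefficients at some effective energy, 1/cm
# (values are illustrative, not from ImaSim).
MU = {"air": 0.0, "water": 0.20, "bone": 0.48}

def ray_transmission(segments):
    """segments: list of (material, path_length_cm) along one ray.
    Primary-only model: I/I0 = exp(-sum(mu_i * L_i)); scatter is
    ignored, as in the abstract's current implementation."""
    tau = sum(MU[m] * length for m, length in segments)
    return np.exp(-tau)

# One detector pixel's ray: 8 cm of water with 2 cm of bone inside.
print(ray_transmission([("water", 8.0), ("bone", 2.0)]))

# CT number from a reconstructed attenuation coefficient.
def hounsfield(mu, mu_water=0.20):
    return 1000.0 * (mu - mu_water) / mu_water

print(hounsfield(0.21))   # ~50 HU, i.e. soft-tissue-like
```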

  8. Understanding Emergency Care Delivery Through Computer Simulation Modeling.

    PubMed

    Laker, Lauren F; Torabi, Elham; France, Daniel J; Froehle, Craig M; Goldlust, Eric J; Hoot, Nathan R; Kasaie, Parastu; Lyons, Michael S; Barg-Walkow, Laura H; Ward, Michael J; Wears, Robert L

    2018-02-01

    In 2017, Academic Emergency Medicine convened a consensus conference entitled, "Catalyzing System Change through Health Care Simulation: Systems, Competency, and Outcomes." This article, a product of the breakout session on "understanding complex interactions through systems modeling," explores the role that computer simulation modeling can and should play in research and development of emergency care delivery systems. This article discusses areas central to the use of computer simulation modeling in emergency care research. The four central approaches to computer simulation modeling are described (Monte Carlo simulation, system dynamics modeling, discrete-event simulation, and agent-based simulation), along with problems amenable to their use and relevant examples to emergency care. Also discussed is an introduction to available software modeling platforms and how to explore their use for research, along with a research agenda for computer simulation modeling. Through this article, our goal is to enhance adoption of computer simulation, a set of methods that hold great promise in addressing emergency care organization and design challenges. © 2017 by the Society for Academic Emergency Medicine.
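
    As a concrete taste of one of the four approaches, a minimal discrete-event sketch of a single-provider queue (the exponential arrival and service times are illustrative assumptions, not calibrated ED data):

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_ed(n_patients=10_000, arrival_rate=4.0, service_rate=5.0):
    """Single-provider ED modeled as an M/M/1 queue (toy model).
    Rates are patients per hour."""
    t = 0.0
    arrivals = []
    for _ in range(n_patients):
        t += rng.exponential(1.0 / arrival_rate)   # inter-arrival time
        arrivals.append(t)
    server_free_at = 0.0
    waits = []
    for t_arr in arrivals:
        start = max(t_arr, server_free_at)         # wait if server is busy
        waits.append(start - t_arr)
        server_free_at = start + rng.exponential(1.0 / service_rate)
    return np.mean(waits)

mean_wait = simulate_ed()
# M/M/1 theory: Wq = lambda / (mu * (mu - lambda)) = 4/(5*1) = 0.8 h
print(f"simulated mean wait {mean_wait:.2f} h (theory 0.80 h)")
```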

  9. The "neutron channel design"—A method for gaining the desired neutrons

    NASA Astrophysics Data System (ADS)

    Hu, G.; Hu, H. S.; Wang, S.; Pan, Z. H.; Jia, Q. G.; Yan, M. F.

    2016-12-01

    Neutrons with the desired parameters can be obtained after the initial neutrons penetrate the various structures and components of the material. A novel method, the "neutron channel design", is proposed in this investigation for gaining the desired neutrons. It is established by employing a genetic algorithm (GA) combined with Monte Carlo software. The method is verified by obtaining 0.01 eV to 1.0 eV neutrons from the Compact Accelerator-driven Neutron Source (CANS). A one-layer polyethylene (PE) moderator was designed and installed behind the beryllium target in CANS. Simulations and an experiment to detect the neutrons were carried out. The neutron spectrum at 500 cm from the PE moderator was simulated with the MCNP and PHITS software. The counts of 0.01 eV to 1.0 eV neutrons were simulated by MCNP and detected by the thermal neutron detector in the experiment. These data were compared and analyzed. The method was then applied to designing a complex PE structure and a composite material consisting of PE, lead, and zirconium dioxide.

  10. Compton scatter imaging: A promising modality for image guidance in lung stereotactic body radiation therapy

    PubMed Central

    Redler, Gage; Jones, Kevin C.; Templeton, Alistair; Bernard, Damian; Turian, Julius; Chu, James C. H.

    2018-01-01

    Purpose Lung stereotactic body radiation therapy (SBRT) requires delivering large radiation doses with millimeter accuracy, making image guidance essential. An approach to forming images of patient anatomy from Compton-scattered photons during lung SBRT is presented. Methods To investigate the potential of scatter imaging, a pinhole collimator and flat-panel detector are used for spatial localization and detection of photons scattered during external beam therapy using lung SBRT treatment conditions (6 MV FFF beam). MCNP Monte Carlo software is used to develop a model to simulate scatter images. This model is validated by comparing experimental and simulated phantom images. Patient scatter images are then simulated from 4DCT data. Results Experimental lung tumor phantom images have sufficient contrast-to-noise to visualize the tumor with as few as 10 MU (0.5 s temporal resolution). The relative signal intensity from objects of different composition as well as lung tumor contrast for simulated phantom images agree quantitatively with experimental images, thus validating the Monte Carlo model. Scatter images are shown to display high contrast between different materials (lung, water, bone). Simulated patient images show superior (~double) tumor contrast compared to MV transmission images. Conclusions Compton scatter imaging is a promising modality for directly imaging patient anatomy during treatment without additional radiation, and it has the potential to complement existing technologies and aid tumor tracking and lung SBRT image guidance. PMID:29360151

  11. Compton scatter imaging: A promising modality for image guidance in lung stereotactic body radiation therapy.

    PubMed

    Redler, Gage; Jones, Kevin C; Templeton, Alistair; Bernard, Damian; Turian, Julius; Chu, James C H

    2018-03-01

    Lung stereotactic body radiation therapy (SBRT) requires delivering large radiation doses with millimeter accuracy, making image guidance essential. An approach to forming images of patient anatomy from Compton-scattered photons during lung SBRT is presented. To investigate the potential of scatter imaging, a pinhole collimator and flat-panel detector are used for spatial localization and detection of photons scattered during external beam therapy using lung SBRT treatment conditions (6 MV FFF beam). MCNP Monte Carlo software is used to develop a model to simulate scatter images. This model is validated by comparing experimental and simulated phantom images. Patient scatter images are then simulated from 4DCT data. Experimental lung tumor phantom images have sufficient contrast-to-noise to visualize the tumor with as few as 10 MU (0.5 s temporal resolution). The relative signal intensity from objects of different composition as well as lung tumor contrast for simulated phantom images agree quantitatively with experimental images, thus validating the Monte Carlo model. Scatter images are shown to display high contrast between different materials (lung, water, bone). Simulated patient images show superior (~double) tumor contrast compared to MV transmission images. Compton scatter imaging is a promising modality for directly imaging patient anatomy during treatment without additional radiation, and it has the potential to complement existing technologies and aid tumor tracking and lung SBRT image guidance. © 2018 American Association of Physicists in Medicine.

  12. TH-E-18A-01: Developments in Monte Carlo Methods for Medical Imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Badal, A; Zbijewski, W; Bolch, W

    Monte Carlo simulation methods are widely used in medical physics research and are starting to be implemented in clinical applications such as radiation therapy planning systems. Monte Carlo simulations offer the capability to accurately estimate quantities of interest that are challenging to measure experimentally while taking into account the realistic anatomy of an individual patient. Traditionally, practical application of Monte Carlo simulation codes in diagnostic imaging was limited by the need for large computational resources or long execution times. However, recent advancements in high-performance computing hardware, combined with a new generation of Monte Carlo simulation algorithms and novel postprocessing methods, are allowing for the computation of relevant imaging parameters of interest, such as patient organ doses and scatter-to-primary ratios in radiographic projections, in just a few seconds using affordable computational resources. Programmable Graphics Processing Units (GPUs), for example, provide a convenient, affordable platform for parallelized Monte Carlo executions that yield simulation rates on the order of 10⁷ x rays/s. Even with GPU acceleration, however, Monte Carlo simulation times can be prohibitive for routine clinical practice. To reduce simulation times further, variance reduction techniques can be used to alter the probabilistic models underlying the x-ray tracking process, resulting in lower variance in the results without biasing the estimates. Other complementary strategies for further reductions in computation time are denoising of the Monte Carlo estimates and estimating (scoring) the quantity of interest at a sparse set of sampling locations (e.g. at a small number of detector pixels in a scatter simulation) followed by interpolation. Beyond reduction of the computational resources required for performing Monte Carlo simulations in medical imaging, the use of accurate representations of patient anatomy is crucial to the virtual generation of medical images and accurate estimation of radiation dose and other imaging parameters. For this, detailed computational phantoms of the patient anatomy must be utilized and implemented within the radiation transport code. Computational phantoms presently come in one of three format types, and in one of four morphometric categories. Format types include stylized (mathematical equation-based), voxel (segmented CT/MR images), and hybrid (NURBS and polygon mesh surfaces). Morphometric categories include reference (small library of phantoms by age at 50th height/weight percentile), patient-dependent (larger library of phantoms at various combinations of height/weight percentiles), patient-sculpted (phantoms altered to match the patient's unique outer body contour), and finally, patient-specific (an exact representation of the patient with respect to both body contour and internal anatomy). The existence and availability of these phantoms represents a very important advance for the simulation of realistic medical imaging applications using Monte Carlo methods. New Monte Carlo simulation codes need to be thoroughly validated before they can be used to perform novel research. Ideally, the validation process would involve comparison of results with those of an experimental measurement, but accurate replication of experimental conditions can be very challenging. It is very common to validate new Monte Carlo simulations by replicating previously published simulation results of similar experiments. This process, however, is commonly problematic due to the lack of sufficient information in the published reports of previous work so as to be able to replicate the simulation in detail. To aid in this process, the AAPM Task Group 195 prepared a report in which six different imaging research experiments commonly performed using Monte Carlo simulations are described and their results provided. The simulation conditions of all six cases are provided in full detail, with all necessary data on material composition, source, geometry, scoring and other parameters provided. The results of these simulations when performed with the four most common publicly available Monte Carlo packages are also provided in tabular form. The Task Group 195 Report will be useful for researchers needing to validate their Monte Carlo work, and for trainees needing to learn Monte Carlo simulation methods. In this symposium we will review the recent advancements in high-performance computing hardware enabling the reduction in computational resources needed for Monte Carlo simulations in medical imaging. We will review variance reduction techniques commonly applied in Monte Carlo simulations of medical imaging systems and present implementation strategies for efficient combination of these techniques with GPU acceleration. Trade-offs involved in Monte Carlo acceleration by means of denoising and "sparse sampling" will be discussed. A method for rapid scatter correction in cone-beam CT (<5 min/scan) will be presented as an illustration of the simulation speeds achievable with optimized Monte Carlo simulations. We will also discuss the development, availability, and capability of the various combinations of computational phantoms for Monte Carlo simulation of medical imaging systems. Finally, we will review some examples of experimental validation of Monte Carlo simulations and will present the AAPM Task Group 195 Report. Learning Objectives: Describe the advances in hardware available for performing Monte Carlo simulations in high performance computing environments. Explain variance reduction, denoising and sparse sampling techniques available for reduction of computational time needed for Monte Carlo simulations of medical imaging. List and compare the computational anthropomorphic phantoms currently available for more accurate assessment of medical imaging parameters in Monte Carlo simulations. Describe experimental methods used for validation of Monte Carlo simulations in medical imaging. Describe the AAPM Task Group 195 Report and its use for validation and teaching of Monte Carlo simulations in medical imaging.

  13. The GENIE Neutrino Monte Carlo Generator: Physics and User Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andreopoulos, Costas; Barry, Christopher; Dytman, Steve

    2015-10-20

    GENIE is a suite of products for the experimental neutrino physics community. This suite includes i) a modern software framework for implementing neutrino event generators, a state-of-the-art comprehensive physics model and tools to support neutrino interaction simulation for realistic experimental setups (the Generator product), ii) extensive archives of neutrino, charged-lepton and hadron scattering data and software to produce a comprehensive set of data/MC comparisons (the Comparisons product), and iii) a generator tuning framework and fitting applications (the Tuning product). This book provides the definitive guide for the GENIE Generator: it presents the software architecture and a detailed description of its physics model and official tunes. In addition, it provides a rich set of data/MC comparisons that characterise the physics performance of GENIE. Detailed step-by-step instructions on how to install and configure the Generator, run its applications and analyze its outputs are also included.

  14. Hyper-X Stage Separation Trajectory Validation Studies

    NASA Technical Reports Server (NTRS)

    Tartabini, Paul V.; Bose, David M.; McMinn, John D.; Martin, John G.; Strovers, Brian K.

    2003-01-01

    An independent twelve degree-of-freedom simulation of the X-43A separation trajectory was created with the Program to Optimize Simulated Trajectories (POST II). This simulation modeled the multi-body dynamics of the X-43A and its booster and included the effect of two pyrotechnically actuated pistons used to push the vehicles apart as well as aerodynamic interaction forces and moments between the two vehicles. The simulation was developed to validate trajectory studies conducted with a 14 degree-of-freedom simulation created early in the program using the Automatic Dynamic Analysis of Mechanical Systems (ADAMS) simulation software. The POST simulation was less detailed than the official ADAMS-based simulation used by the Project, but was simpler, more concise and ran faster, while providing similar results. The increase in speed provided by the POST simulation gave the Project an alternate analysis tool, ideal for performing separation control logic trade studies that required the running of numerous Monte Carlo trajectories.

  15. Benchmarking the ATLAS software through the Kit Validation engine

    NASA Astrophysics Data System (ADS)

    De Salvo, Alessandro; Brasolin, Franco

    2010-04-01

    Measuring experiment software performance is very important for choosing the most effective resources and for discovering bottlenecks in the code implementation. In this work we present the benchmark techniques used to measure the ATLAS software performance through the ATLAS offline testing engine Kit Validation and the online portal Global Kit Validation. The performance measurements, the data collection, and the online analysis and display of the results will be presented. The results of the measurement on different platforms and architectures will be shown, giving a full report on the CPU power and memory consumption of the Monte Carlo generation, simulation, digitization and reconstruction of the most CPU-intensive channels. The impact of multi-core computing on the ATLAS software performance will also be presented, comparing the behavior of different architectures when increasing the number of concurrent processes. The benchmark techniques described in this paper have been used in the HEPiX group since the beginning of 2008 to help define the performance metrics for High Energy Physics applications, based on the real experiment software.

  16. A software tool to assess uncertainty in transient-storage model parameters using Monte Carlo simulations

    USGS Publications Warehouse

    Ward, Adam S.; Kelleher, Christa A.; Mason, Seth J. K.; Wagener, Thorsten; McIntyre, Neil; McGlynn, Brian L.; Runkel, Robert L.; Payn, Robert A.

    2017-01-01

    Researchers and practitioners alike often need to understand and characterize how water and solutes move through a stream in terms of the relative importance of in-stream and near-stream storage and transport processes. In-channel and subsurface storage processes are highly variable in space and time and difficult to measure. Storage estimates are commonly obtained using transient-storage models (TSMs) of the experimentally obtained solute-tracer test data. The TSM equations represent key transport and storage processes with a suite of numerical parameters. Parameter values are estimated via inverse modeling, in which parameter values are iteratively changed until model simulations closely match observed solute-tracer data. Several investigators have shown that TSM parameter estimates can be highly uncertain. When this is the case, parameter values cannot be used reliably to interpret stream-reach functioning. However, authors of most TSM studies do not evaluate or report parameter certainty. Here, we present a software tool linked to the One-dimensional Transport with Inflow and Storage (OTIS) model that enables researchers to conduct uncertainty analyses via Monte-Carlo parameter sampling and to visualize uncertainty and sensitivity results. We demonstrate application of our tool to 2 case studies and compare our results to output obtained from more traditional implementation of the OTIS model. We conclude by suggesting best practices for transient-storage modeling and recommend that future applications of TSMs include assessments of parameter certainty to support comparisons and more reliable interpretations of transport processes.
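
    Conceptually, the tool's Monte Carlo loop is: sample TSM parameters from prior ranges, run the forward model, and score each parameter set against the tracer data. A sketch with a stand-in forward model (OTIS itself integrates the transient-storage equations; the curve below is only a placeholder):

```python
import numpy as np

rng = np.random.default_rng(11)

t = np.linspace(0, 10, 200)                     # hours since tracer release

def forward_model(A, alpha):
    """Stand-in for an OTIS run: a skewed breakthrough curve whose tail
    stretches with the storage-exchange coefficient alpha. Illustrative."""
    peak = 5.0 / A
    return peak * (t / 2.0) * np.exp(-t / (2.0 + 5.0 * alpha))

obs = forward_model(1.2, 0.3) + rng.normal(0, 0.02, t.size)  # synthetic data

# Monte Carlo sampling over uniform prior ranges.
n = 20_000
A_s = rng.uniform(0.5, 3.0, n)                  # channel area, m^2
alpha_s = rng.uniform(0.0, 1.0, n)              # exchange coefficient, 1/h
rmse = np.array([np.sqrt(np.mean((forward_model(A, al) - obs) ** 2))
                 for A, al in zip(A_s, alpha_s)])

# "Behavioral" parameter sets: best 1% of fits; their spread is the
# parameter uncertainty the abstract says should be reported.
best = rmse < np.percentile(rmse, 1)
print("A     5-95%:", np.percentile(A_s[best], [5, 95]))
print("alpha 5-95%:", np.percentile(alpha_s[best], [5, 95]))
```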

  17. Predictive Modeling of Terrestrial Radiation Exposure from Geologic Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malchow, Russell L.; Haber, Daniel University of Nevada, Las Vegas; Burnley, Pamela

    2015-01-01

    Aerial gamma ray surveys are important for those working in nuclear security and industry for determining locations of both anthropogenic radiological sources and natural occurrences of radionuclides. During an aerial gamma ray survey, a low flying aircraft, such as a helicopter, flies in a linear pattern across the survey area while measuring the gamma emissions with a sodium iodide (NaI) detector. Currently, if a gamma ray survey is being flown in an area, the only way to correct for geologic sources of gamma rays is to have flown the area previously. This is prohibitively expensive and would require complete national coverage. This project’s goal is to model the geologic contribution to radiological backgrounds using published geochemical data, GIS software, remote sensing, calculations, and modeling software. K, U and Th are the three major gamma emitters in geologic material. U and Th are assumed to be in secular equilibrium with their daughter isotopes. If K, U, and Th abundance values are known for a given geologic unit the expected gamma ray exposure rate can be calculated using the Grasty equation or by modeling software. Monte Carlo N-Particle Transport software (MCNP), developed by Los Alamos National Laboratory, is modeling software designed to simulate particles and their interactions with matter. Using this software, models have been created that represent various lithologies. These simulations randomly generate gamma ray photons at energy levels expected from natural radiologic sources. The photons take a random path through the simulated geologic media and deposit their energy at the end of their track. A series of nested spheres have been created and filled with simulated atmosphere to record energy deposition. Energies deposited are binned in the same manner as the NaI detectors used during an aerial survey. These models are used in place of the simplistic Grasty equation as they take into account absorption properties of the lithology which the simplistic equation ignores.
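
    A conversion of the kind attributed to the Grasty equation is linear in the K, U, and Th abundances. The sketch below uses coefficients of the form tabulated in the airborne gamma-ray literature, but the numerical values are quoted from memory and should be verified against the primary reference before use:

```python
# Dose-rate estimate from K, U, Th abundances, in the spirit of the
# Grasty-type conversion the abstract cites. The coefficient values
# (nGy/h per unit abundance) are ASSUMED, quoted from memory; check
# them against the primary reference before any real use.
COEFF_K = 13.08    # nGy/h per % K        (assumed value)
COEFF_U = 5.67     # nGy/h per ppm eU     (assumed value)
COEFF_TH = 2.49    # nGy/h per ppm eTh    (assumed value)

def air_dose_rate(k_pct, u_ppm, th_ppm):
    """Terrestrial air-absorbed dose rate 1 m above an infinite half-space,
    assuming secular equilibrium in the U and Th decay chains."""
    return COEFF_K * k_pct + COEFF_U * u_ppm + COEFF_TH * th_ppm

# A granite-like unit: ~4% K, 4 ppm eU, 15 ppm eTh (illustrative values).
print(f"{air_dose_rate(4.0, 4.0, 15.0):.1f} nGy/h")
```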

  18. Dynamic provisioning of local and remote compute resources with OpenStack

    NASA Astrophysics Data System (ADS)

    Giffels, M.; Hauth, T.; Polgart, F.; Quast, G.

    2015-12-01

    Modern high-energy physics experiments rely on the extensive usage of computing resources, both for the reconstruction of measured events as well as for Monte Carlo simulation. The Institut für Experimentelle Kernphysik (EKP) at KIT is participating in both the CMS and Belle experiments with computing and storage resources. In the upcoming years, these requirements are expected to increase due to the growing amount of recorded data and the rise in complexity of the simulated events. It is therefore essential to increase the available computing capabilities by tapping into all resource pools. At the EKP institute, powerful desktop machines are available to users. Due to the multi-core nature of modern CPUs, vast amounts of CPU time are not utilized by common desktop usage patterns. Other important providers of compute capabilities are classical HPC data centers at universities or national research centers. Due to the shared nature of these installations, the standardized software stack required by HEP applications cannot be installed. A viable way to overcome this constraint and offer a standardized software environment in a transparent manner is the usage of virtualization technologies. The OpenStack project has become a widely adopted solution to virtualize hardware and offer additional services like storage and virtual machine management. This contribution will report on the incorporation of the institute's desktop machines into a private OpenStack cloud. The additional compute resources provisioned via the virtual machines have been used for Monte Carlo simulation and data analysis. Furthermore, a concept to integrate shared, remote HPC centers into regular HEP job workflows will be presented. In this approach, local and remote resources are merged to form a uniform, virtual compute cluster with a single point-of-entry for the user. Evaluations of the performance and stability of this setup and operational experiences will be discussed.

  19. Design considerations for a C-shaped PET system, dedicated to small animal brain imaging, using GATE Monte Carlo simulations

    NASA Astrophysics Data System (ADS)

    Efthimiou, N.; Papadimitroulas, P.; Kostou, T.; Loudos, G.

    2015-09-01

    Commercial clinical and preclinical PET scanners rely on a full cylindrical geometry for whole-body scans as well as for dedicated organs. In this study we propose the construction of a low-cost dual-head C-shaped PET system dedicated to small animal brain imaging. Monte Carlo simulation studies were performed using the GATE toolkit to evaluate the optimum design in terms of sensitivity, distortions in the FOV and spatial resolution. The PET model is based on SiPMs and BGO pixelated arrays. Four different configurations with C-angles of 0°, 15°, 30° and 45° within the modules were considered. Geometrical phantoms were used for the evaluation process. The STIR software, extended by an efficient multi-threaded ray tracing technique, was used for the image reconstruction. The algorithm automatically adjusts the size of the FOV according to the shape of the detector's geometry. The results showed an improvement in sensitivity of ∼15% for the 45° C-angle compared to the 0° case. The spatial resolution was found to be 2 mm for the 45° C-angle.

  20. SU-E-T-235: Monte Carlo Analysis of the Dose Enhancement in the Scalp of Patients Due to Titanium Plate Backscatter During Post-Operative Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hardin, M; Elson, H; Lamba, M

    2014-06-01

    Purpose: To quantify the clinically observed dose enhancement adjacent to cranial titanium fixation plates during post-operative radiotherapy. Methods: Irradiation of a titanium burr hole cover was simulated using the Monte Carlo code MCNPX for a 6 MV photon spectrum to investigate backscatter dose enhancement due to increased production of secondary electrons within the titanium plate. The simulated plate was placed 3 mm deep in a water phantom, and dose deposition was tallied for 0.2 mm thick cells adjacent to the entrance and exit sides of the plate. These results were compared to a simulation excluding the presence of the titanium to calculate relative dose enhancement on the entrance and exit sides of the plate. To verify simulated results, two titanium burr hole covers (Synthes, Inc. and Biomet, Inc.) were irradiated with 6 MV photons in a solid water phantom containing GafChromic MD-55 film. The phantom was irradiated on a Varian 21EX linear accelerator at multiple gantry angles (0–180 degrees) to analyze the angular dependence of the backscattered radiation. Relative dose enhancement was quantified using computer software. Results: Monte Carlo simulations indicate a relative difference of 26.4% and 7.1% on the entrance and exit sides of the plate, respectively. Film dosimetry results using a similar geometry indicate a relative difference of 13% and -10% on the entrance and exit sides of the plate, respectively. Relative dose enhancement on the entrance side of the plate decreased with increasing gantry angle from 0 to 180 degrees. Conclusion: Film and simulation results demonstrate an increase in dose to structures immediately adjacent to cranial titanium fixation plates. Increased beam obliquity was shown to alleviate dose enhancement to some extent. These results are consistent with clinically observed effects.

  1. Consistent multiphysics simulation of a central tower CSP plant as applied to ISTORE

    NASA Astrophysics Data System (ADS)

    Votyakov, Evgeny V.; Papanicolas, Costas N.

    2017-06-01

    We present a unified consistent multiphysics approach to model a central tower CSP plant. The framework for the model includes Monte Carlo ray tracing (RT) and computational fluid dynamics (CFD) components utilizing the OpenFOAM C++ software library. The RT part works effectively with complex surfaces of engineering design given in CAD formats. The CFD simulation, which is based on 3D Navier-Stokes equations, takes into account all possible heat transfer mechanisms: radiation, conduction, and convection. Utilizing this package, the solar field of the experimental Platform for Research, Observation, and TEchnological Applications in Solar Energy (PROTEAS) and the Integrated STOrage and Receiver (ISTORE), developed at the Cyprus Institute, are being examined.

  2. Monte Carlo simulations of quantum dot solar concentrators: ray tracing based on fluorescence mapping

    NASA Astrophysics Data System (ADS)

    Schuler, A.; Kostro, A.; Huriet, B.; Galande, C.; Scartezzini, J.-L.

    2008-08-01

    One promising application of semiconductor nanostructures in the field of photovoltaics might be quantum dot solar concentrators. Quantum-dot-containing nanocomposite thin films are synthesized at EPFL-LESO by a low-cost sol-gel process. In order to study the potential of the novel planar photoluminescent concentrators, reliable computer simulations are needed. A computer code for ray tracing simulations of quantum dot solar concentrators has been developed at EPFL-LESO on the basis of Monte Carlo methods that are applied to polarization-dependent reflection/transmission at interfaces, photon absorption by the semiconductor nanocrystals and photoluminescent reemission. The software allows importing measured or theoretical absorption/reemission spectra describing the photoluminescent properties of the quantum dots. The properties of photoluminescent reemission are described by a set of emission spectra depending on the energy of the incoming photon, allowing the photoluminescent emission to be simulated using the inverse function method. Our simulations reveal the importance of two main factors: an emission spectrum matched to the spectral efficiency curve of the photovoltaic cell, and a large Stokes shift, which is advantageous for the lateral energy transport. No significant energy losses are implied when the quantum dots are contained within a nanocomposite coating instead of being dispersed in the entire volume of the pane. Together with knowledge of the optoelectronic properties of suitable photovoltaic cells, the simulations allow prediction of the total efficiency of the envisaged concentrating PV systems, and optimization of photoluminescent emission frequencies, optical densities, and pane dimensions.
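
    The "inverse function method" named above is inverse-transform sampling of the tabulated emission spectrum. A minimal sketch for a single spectrum (a Gaussian placeholder, not a measured quantum-dot spectrum; real use would select the spectrum by incident photon energy):

```python
import numpy as np

rng = np.random.default_rng(5)

# Tabulated reemission spectrum (wavelength in nm vs relative intensity).
# The Gaussian shape is illustrative only.
wl = np.linspace(550, 750, 201)
intensity = np.exp(-0.5 * ((wl - 640) / 25) ** 2)

# Build the normalized CDF once, then sample by inverting it.
cdf = np.cumsum(intensity)
cdf /= cdf[-1]

def sample_emission(n):
    """Inverse-transform sampling: map uniform deviates through CDF^-1."""
    u = rng.uniform(size=n)
    return np.interp(u, cdf, wl)

photons = sample_emission(100_000)
print(f"mean {photons.mean():.1f} nm, std {photons.std():.1f} nm")  # ~640, ~25
```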

  3. NOTE: Development of modified voxel phantoms for the numerical dosimetric reconstruction of radiological accidents involving external sources: implementation in SESAME tool

    NASA Astrophysics Data System (ADS)

    Courageot, Estelle; Sayah, Rima; Huet, Christelle

    2010-05-01

    Estimating the dose distribution in a victim's body is a relevant indicator in assessing biological damage from exposure in the event of a radiological accident caused by an external source. When the dose distribution is evaluated with a numerical anthropomorphic model, the posture and morphology of the victim have to be reproduced as realistically as possible. Several years ago, IRSN developed a specific software application, called the simulation of external source accident with medical images (SESAME), for the dosimetric reconstruction of radiological accidents by numerical simulation. This tool combines voxel geometry and the MCNP(X) Monte Carlo computer code for radiation-material interaction. This note presents a new functionality in this software that enables the modelling of a victim's posture and morphology based on non-uniform rational B-spline (NURBS) surfaces. The procedure for constructing the modified voxel phantoms is described, along with a numerical validation of this new functionality using a voxel phantom of the RANDO tissue-equivalent physical model.

  4. Development of modified voxel phantoms for the numerical dosimetric reconstruction of radiological accidents involving external sources: implementation in SESAME tool.

    PubMed

    Courageot, Estelle; Sayah, Rima; Huet, Christelle

    2010-05-07

    Estimating the dose distribution in a victim's body is a relevant indicator in assessing biological damage from exposure in the event of a radiological accident caused by an external source. When the dose distribution is evaluated with a numerical anthropomorphic model, the posture and morphology of the victim have to be reproduced as realistically as possible. Several years ago, IRSN developed a specific software application, called the simulation of external source accident with medical images (SESAME), for the dosimetric reconstruction of radiological accidents by numerical simulation. This tool combines voxel geometry and the MCNP(X) Monte Carlo computer code for radiation-material interaction. This note presents a new functionality in this software that enables the modelling of a victim's posture and morphology based on non-uniform rational B-spline (NURBS) surfaces. The procedure for constructing the modified voxel phantoms is described, along with a numerical validation of this new functionality using a voxel phantom of the RANDO tissue-equivalent physical model.

  5. Comparison of internal dose estimates obtained using organ-level, voxel S value, and Monte Carlo techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grimes, Joshua, E-mail: grimes.joshua@mayo.edu; Celler, Anna

    2014-09-15

    Purpose: The authors’ objective was to compare internal dose estimates obtained using the Organ Level Dose Assessment with Exponential Modeling (OLINDA/EXM) software, the voxel S value technique, and Monte Carlo simulation. Monte Carlo dose estimates were used as the reference standard to assess the impact of patient-specific anatomy on the final dose estimate. Methods: Six patients injected with ⁹⁹ᵐTc-hydrazinonicotinamide-Tyr³-octreotide were included in this study. A hybrid planar/SPECT imaging protocol was used to estimate ⁹⁹ᵐTc time-integrated activity coefficients (TIACs) for kidneys, liver, spleen, and tumors. Additionally, TIACs were predicted for ¹³¹I, ¹⁷⁷Lu, and ⁹⁰Y assuming the same biological half-lives as the ⁹⁹ᵐTc labeled tracer. The TIACs were used as input for OLINDA/EXM for organ-level dose calculation, and voxel-level dosimetry was performed using the voxel S value method and Monte Carlo simulation. Dose estimates for ⁹⁹ᵐTc, ¹³¹I, ¹⁷⁷Lu, and ⁹⁰Y distributions were evaluated by comparing (i) organ-level S values corresponding to each method, (ii) total tumor and organ doses, (iii) differences in right and left kidney doses, and (iv) voxelized dose distributions calculated by Monte Carlo and the voxel S value technique. Results: The S values for all investigated radionuclides used by OLINDA/EXM and the corresponding patient-specific S values calculated by Monte Carlo agreed within 2.3% on average for self-irradiation, and differed by as much as 105% for cross-organ irradiation. Total organ doses calculated by OLINDA/EXM and the voxel S value technique agreed with Monte Carlo results within approximately ±7%. Differences between right and left kidney doses determined by Monte Carlo were as high as 73%. Comparison of the Monte Carlo and voxel S value dose distributions showed that each method produced similar dose volume histograms, with a minimum dose covering 90% of the volume (D90) agreeing within ±3% on average. Conclusions: Several aspects of OLINDA/EXM dose calculation were compared with patient-specific dose estimates obtained using Monte Carlo. Differences in patient anatomy led to large differences in cross-organ doses. However, total organ doses were still in good agreement since most of the deposited dose is due to self-irradiation. Comparison of voxelized doses calculated by Monte Carlo and the voxel S value technique showed that the 3D dose distributions produced by the respective methods are nearly identical.
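
    The voxel S value technique referenced above is a discrete convolution of the time-integrated activity map with a radionuclide- and voxel-size-specific dose kernel. A minimal sketch with a synthetic activity map and a purely illustrative kernel (real kernels are tabulated, e.g. per MIRD Pamphlet 17):

```python
import numpy as np
from scipy.signal import fftconvolve

# Time-integrated activity map (MBq*s per voxel): a synthetic spherical
# "tumor" in an otherwise empty volume.
shape = (64, 64, 64)
tia = np.zeros(shape)
zz, yy, xx = np.indices(shape)
tia[(xx - 32)**2 + (yy - 32)**2 + (zz - 32)**2 < 8**2] = 5.0

# Voxel S kernel: dose to a voxel per unit TIA in a voxel at offset (i,j,k).
# This radially decaying kernel is a made-up placeholder, not a tabulated
# radionuclide kernel.
offsets = np.indices((9, 9, 9)) - 4
r2 = (offsets**2).sum(axis=0).astype(float)
kernel = 1.0 / (1.0 + r2)             # arbitrary units of Gy per MBq*s
kernel /= kernel.sum() * 10.0

# Dose = TIA convolved with S, evaluated efficiently via FFT.
dose = fftconvolve(tia, kernel, mode="same")
print("max voxel dose (arb. units):", dose.max())
```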

  6. A SPECT system simulator built on the SolidWorks™ 3D-Design package.

    PubMed

    Li, Xin; Furenlid, Lars R

    2014-08-17

    We have developed a GPU-accelerated SPECT system simulator that integrates into the instrument-design workflow [1]. This simulator includes a gamma-ray tracing module that can rapidly propagate gamma-ray photons through arbitrary apertures modeled by SolidWorks™-created stereolithography (.STL) representations with a full complement of physics cross sections [2, 3]. This software also contains a scintillation detector simulation module that can model a scintillation detector with arbitrary scintillation crystal shape and light-sensor arrangement. The gamma-ray tracing module enables us to efficiently model apertures and detector crystals in SolidWorks™ and save them in the STL file format, then load the STL-format model into this module to generate list-mode results of interacted gamma-ray photon information (interaction positions and energies) inside the detector crystals. The Monte-Carlo scintillation detector simulation module enables us to simulate how scintillation photons are reflected, refracted and absorbed inside a scintillation detector, which contributes to a more accurate simulation of a SPECT system.

  7. Parachute Models Used in the Mars Science Laboratory Entry, Descent, and Landing Simulation

    NASA Technical Reports Server (NTRS)

    Cruz, Juan R.; Way, David W.; Shidner, Jeremy D.; Davis, Jody L.; Powell, Richard W.; Kipp, Devin M.; Adams, Douglas S.; Witkowski, Al; Kandis, Mike

    2013-01-01

    An end-to-end simulation of the Mars Science Laboratory (MSL) entry, descent, and landing (EDL) sequence was created at the NASA Langley Research Center using the Program to Optimize Simulated Trajectories II (POST2). This simulation is capable of providing numerous MSL system and flight software responses, including Monte Carlo-derived statistics of these responses. The MSL POST2 simulation includes models of EDL system elements, including those related to the parachute system. Among these there are models for the parachute geometry, mass properties, deployment, inflation, opening force, area oscillations, aerodynamic coefficients, apparent mass, interaction with the main landing engines, and off-loading. These models were kept as simple as possible, considering the overall objectives of the simulation. The main purpose of this paper is to describe these parachute system models to the extent necessary to understand how they work and some of their limitations. A list of lessons learned during the development of the models and simulation is provided. Future improvements to the parachute system models are proposed.

  8. A SPECT system simulator built on the SolidWorks™ 3D design package

    NASA Astrophysics Data System (ADS)

    Li, Xin; Furenlid, Lars R.

    2014-09-01

    We have developed a GPU-accelerated SPECT system simulator that integrates into the instrument-design workflow [1]. This simulator includes a gamma-ray tracing module that can rapidly propagate gamma-ray photons through arbitrary apertures modeled by SolidWorks™-created stereolithography (.STL) representations with a full complement of physics cross sections [2, 3]. This software also contains a scintillation detector simulation module that can model a scintillation detector with arbitrary scintillation crystal shape and light-sensor arrangement. The gamma-ray tracing module enables us to efficiently model apertures and detector crystals in SolidWorks™ and save them as STL file format, then load the STL-format model into this module to generate list-mode results of interacted gamma-ray photon information (interaction positions and energies) inside the detector crystals. The Monte-Carlo scintillation detector simulation module enables us to simulate how scintillation photons get reflected, refracted and absorbed inside a scintillation detector, which contributes to more accurate simulation of a SPECT system.

  9. Educational aspects of molecular simulation

    NASA Astrophysics Data System (ADS)

    Allen, Michael P.

    This article addresses some aspects of teaching simulation methods to undergraduates and graduate students. Simulation is increasingly a cross-disciplinary activity, which means that the students who need to learn about simulation methods may have widely differing backgrounds. Also, they may have a wide range of views on what constitutes an interesting application of simulation methods. Almost always, a successful simulation course includes an element of practical, hands-on activity: a balance always needs to be struck between treating the simulation software as a 'black box', and becoming bogged down in programming issues. With notebook computers becoming widely available, students often wish to take away the programs to run themselves, and access to raw computer power is not the limiting factor that it once was; on the other hand, the software should be portable and, if possible, free. Examples will be drawn from the author's experience in three different contexts. (1) An annual simulation summer school for graduate students, run by the UK CCP5 organization, in which practical sessions are combined with an intensive programme of lectures describing the methodology. (2) A molecular modelling module, given as part of a doctoral training centre in the Life Sciences at Warwick, for students who might not have a first degree in the physical sciences. (3) An undergraduate module in Physics at Warwick, also taken by students from other disciplines, teaching high performance computing, visualization, and scripting in the context of a physical application such as Monte Carlo simulation.

  10. A statistical model investigating the prevalence of tuberculosis in New York City using counting processes with two change-points

    PubMed Central

    ACHCAR, J. A.; MARTINEZ, E. Z.; RUFFINO-NETTO, A.; PAULINO, C. D.; SOARES, P.

    2008-01-01

    SUMMARY We considered a Bayesian analysis of the prevalence of tuberculosis cases in New York City from 1970 to 2000. This counting dataset presented two change-points during this period. We modelled the counting dataset considering non-homogeneous Poisson processes in the presence of the two change-points. A Bayesian analysis of the data was carried out using Markov chain Monte Carlo methods. Simulated Gibbs samples for the parameters of interest were obtained using the WinBUGS software. PMID:18346287

  11. Automated Re-Entry System using FNPEG

    NASA Technical Reports Server (NTRS)

    Johnson, Wyatt R.; Lu, Ping; Stachowiak, Susan J.

    2017-01-01

    This paper discusses the implementation and simulated performance of the FNPEG (Fully Numerical Predictor-corrector Entry Guidance) algorithm into GNC FSW (Guidance, Navigation, and Control Flight Software) for use in an autonomous re-entry vehicle. A few modifications to FNPEG are discussed that result in computational savings -- a change to the state propagator, and a modification to cross-range lateral logic. Finally, some Monte Carlo results are presented using a representative vehicle in both a high-fidelity 6-DOF (degree-of-freedom) sim as well as in a 3-DOF sim for independent validation.

  12. System Advisor Model (SAM) General Description (Version 2017.9.5)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freeman, Janine M; DiOrio, Nicholas A; Blair, Nathan J

    This document describes the capabilities of the System Advisor Model (SAM) developed and distributed by the U.S. Department of Energy's National Renewable Energy Laboratory. The document is for potential users and others wanting to learn about the model's capabilities. SAM is a techno-economic computer model that calculates performance and financial metrics of renewable energy projects. Project developers, policy makers, equipment manufacturers, and researchers use graphs and tables of SAM results in the process of evaluating financial, technology, and incentive options for renewable energy projects. SAM simulates the performance of photovoltaic, concentrating solar power, solar water heating, wind, geothermal, biomass, and conventional power systems. The financial models are for projects that either buy and sell electricity at retail rates (residential and commercial) or sell electricity at a price determined in a power purchase agreement (PPA). SAM's simulation tools facilitate parametric and sensitivity analyses, Monte Carlo simulation and weather variability (P50/P90) studies. SAM can also read input variables from Microsoft Excel worksheets. For software developers, the SAM software development kit (SDK) makes it possible to use SAM simulation modules in their applications written in C/C++, C#, Java, Python, MATLAB, and other languages. NREL provides both SAM and the SDK as free downloads at https://sam.nrel.gov. SAM is an open source project, so its source code is available to the public. Researchers can study the code to understand the model algorithms, and software programmers can contribute their own models and enhancements to the project. Technical support and more information about the software are available on the website.

  13. TH-AB-207A-07: Radiation Dose Simulation for a Newly Proposed Dynamic Bowtie Filters for CT Using Fast Monte Carlo Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, T; Lin, H; Gao, Y

    Purpose: The dynamic bowtie filter is an innovative design capable of modulating the X-ray beam and balancing the flux in the detectors, and it introduces a new way of performing patient-specific CT scan optimizations. This study demonstrates the feasibility of performing fast Monte Carlo dose calculation for a type of dynamic bowtie filter for cone-beam CT (Liu et al. 2014, PLoS ONE 9(7)) using MIC coprocessors. Methods: The dynamic bowtie filter in question consists of a highly attenuating bowtie component (HB) and a weakly attenuating bowtie (WB). The HB is filled with CeCl3 solution and its surface is defined by a transcendental equation. The WB is an elliptical cylinder filled with air and immersed in the HB. As the scanner rotates, the orientation of the WB remains fixed with respect to the static patient. In our Monte Carlo simulation, the HB was approximated by 576 boxes. The phantom was a voxelized elliptical cylinder composed of PMMA and surrounded by air (44cm×44cm×40cm, 1000×1000×1 voxels). The dose to the PMMA phantom was tallied with 0.15% statistical uncertainty under a 100 kVp source. Two Monte Carlo codes, ARCHER and MCNP-6.1, were compared. Both used double precision, and compiler flags that may trade accuracy for speed were avoided. Results: The wall time of the simulation was 25.4 seconds with ARCHER on a 5110P MIC, 40 seconds on a X5650 CPU, and 523 seconds with the multithreaded MCNP on the same CPU. The high performance of ARCHER is attributed to the parameterized geometry and vectorization of the program hotspots. Conclusion: The dynamic bowtie filter modeled in this study is able to effectively reduce the dynamic range of the detected signals for photon-counting detectors. With appropriate software optimization methods, accelerator-based (MIC and GPU) Monte Carlo dose engines show good performance and can contribute to patient-specific CT scan optimizations.
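
    The flux-balancing principle behind a bowtie filter can be illustrated with a toy calculation: choose a filter thickness profile t(theta) so that the combined filter-plus-phantom attenuation is constant across the fan. All coefficients and geometry below are illustrative stand-ins, not values from the study.

        import numpy as np

        # Toy flux balancing: require mu_f * t(theta) + mu_p * chord(theta)
        # to equal the central-ray value, so the detector flux is flat.
        mu_filter = 0.25          # 1/cm, filter attenuation (illustrative)
        mu_pmma = 0.18            # 1/cm, PMMA phantom
        radius = 22.0             # cm, circular phantom stand-in
        src_iso = 60.0            # cm, source-to-isocenter distance

        theta = np.linspace(-0.4, 0.4, 81)            # fan angles, rad
        impact = src_iso * np.tan(theta)              # ray offset, cm
        chord = 2.0 * np.sqrt(np.clip(radius**2 - impact**2, 0.0, None))

        t = mu_pmma * (chord.max() - chord) / mu_filter
        flux = np.exp(-(mu_filter * t + mu_pmma * chord))
        print("detector flux spread:", flux.max() - flux.min())  # ~0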

  14. SU-F-T-25: Design and Implementation of a Multi-Purpose Applicator for Pelvic Brachytherapy Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bogue, J; Parsai, E

    Purpose: The current generation of inflatable multichannel brachytherapy applicators, such as the Varian Capri, is limited to vaginal and rectal cancers. While there are similar designs utilizing rigid, non-inflatable applicators, these alternatives could cause increased dose to surrounding tissue due to air gaps. Modification of the Capri could allow for easier treatment planning by reducing the number of channels, and increased versatility by modifying the applicator to include an attachable single tandem for cervical or multiple tandems for endometrial applications. Methods: A Varian Capri applicator was simulated in water to replicate a patient. Multiple plans were optimized to deliver a prescribed dose of 100 cGy at 5 mm away from the exterior of the applicator using six to thirteen existing channels. The current model was expanded upon to include a detachable tandem or multiple tandems to extend its functionality to both cervical and endometrial cancers. Models were constructed in both three-dimensional rendering software and Monte Carlo to allow prototyping and simulations. Results: Treatment plans utilizing six to thirteen channels produced limited dosimetric differences between channel arrangements, with a seven-channel plan very closely approximating the thirteen channels. It was concluded that only seven channels would be necessary in future simulations to give an accurate representation of the applicator. Tandem attachments were prototyped for the applicator to demonstrate the ease with which they could be included. Future simulations in treatment planning software and Monte Carlo results will be presented to further define the ideal applicator geometry. Conclusion: The current Capri applicator design could be easily modified to increase applicability to cervical and endometrial treatments in addition to vaginal and rectal cancers. This new design yields a more versatile single-use applicator that can be easily inserted and further reduces dose to critical structures during brachytherapy treatments.

  15. Methodology for Evaluation of Technology Impacts in Space Electric Power Systems

    NASA Technical Reports Server (NTRS)

    Holda, Julie

    2004-01-01

    The Analysis and Management branch of the Power and Propulsion Office at NASA Glenn Research Center is responsible for performing complex analyses of the space power and in-space propulsion products developed by GRC. This work quantifies the benefits of the advanced technologies to support on-going advocacy efforts. The Power and Propulsion Office is committed to understanding how advancements in space technologies could benefit future NASA missions, and it supports many diverse projects and missions throughout NASA as well as industry and academia. The area of work that we are concentrating on is space technology investment strategies. Our goal is to develop a Monte Carlo-based tool to investigate technology impacts in space electric power systems. The framework being developed at this stage will be used to set up a computer simulation of a space electric power system (EPS). The outcome is expected to be a probabilistic assessment of critical technologies and potential development issues. We are developing methods for integrating existing spreadsheet-based tools into the simulation tool, and work is being done on defining interface protocols to enable rapid integration of future tools. The first task was to evaluate Monte Carlo-based simulation programs for statistical modeling of the EPS model: I learned and evaluated Palisade's @Risk and Risk Optimizer software and utilized their capabilities for the Electric Power System (EPS) model, and I also evaluated similar software packages (JMP, SPSS, Crystal Ball, VenSim, Analytica) available from other suppliers. The second task was to develop the framework for the tool, in which we had to define technology characteristics using weighting factors and probability distributions, define the simulation space, and add hard and soft constraints to the model. The third task is to incorporate (preliminary) cost factors into the model. A final task is developing a cross-platform solution of this framework.

  16. Comparison of GEANT4 very low energy cross section models with experimental data in water.

    PubMed

    Incerti, S; Ivanchenko, A; Karamitros, M; Mantero, A; Moretto, P; Tran, H N; Mascialino, B; Champion, C; Ivanchenko, V N; Bernal, M A; Francis, Z; Villagrasa, C; Baldacchin, G; Guèye, P; Capra, R; Nieminen, P; Zacharatou, C

    2010-09-01

    The GEANT4 general-purpose Monte Carlo simulation toolkit is able to simulate physical interaction processes of electrons, hydrogen and helium atoms with charge states (H0, H+) and (He0, He+, He2+), respectively, in liquid water, the main component of biological systems, down to the electron volt regime and the submicrometer scale, providing GEANT4 users with the so-called "GEANT4-DNA" physics models suitable for microdosimetry simulation applications. The corresponding software has been recently re-engineered in order to provide GEANT4 users with a coherent and unique approach to the simulation of electromagnetic interactions within the GEANT4 toolkit framework (since GEANT4 version 9.3 beta). This work presents a quantitative comparison of these physics models with a collection of experimental data in water collected from the literature. An evaluation of the closeness between the total and differential cross section models available in the GEANT4 toolkit for microdosimetry and experimental reference data is performed using a dedicated statistical toolkit that includes the Kolmogorov-Smirnov statistical test. The authors used experimental data acquired in water vapor as direct measurements in the liquid phase are not yet available in the literature. Comparisons with several recommendations are also presented. The authors have assessed the compatibility of experimental data with GEANT4 microdosimetry models by means of quantitative methods. The results show that microdosimetric measurements in liquid water are necessary to assess quantitatively the validity of the software implementation for the liquid water phase. Nevertheless, a comparison with existing experimental data in water vapor provides a qualitative appreciation of the plausibility of the simulation models. The existing reference data themselves should undergo a critical interpretation and selection, as some of the series exhibit significant deviations from each other. The GEANT4-DNA physics models available in the GEANT4 toolkit have been compared in this article to available experimental data in the water vapor phase as well as to several published recommendations on the mass stopping power. These models represent a first step in the extension of the GEANT4 Monte Carlo toolkit to the simulation of biological effects of ionizing radiation.
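
    For readers unfamiliar with the statistical machinery, the two-sample Kolmogorov-Smirnov test used for such model-versus-data comparisons is available in SciPy. The sketch below applies it to hypothetical samples standing in for model predictions and experimental measurements; it illustrates the test itself, not the paper's dedicated statistical toolkit.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(8)

        # Hypothetical stand-ins: simulated values vs. noisy measurements.
        model_values = rng.normal(1.00, 0.20, size=400)
        measurements = rng.normal(1.05, 0.22, size=60)

        # Two-sample KS test: statistic is the maximum distance between
        # the two empirical cumulative distribution functions.
        stat, p = stats.ks_2samp(model_values, measurements)
        print(f"KS statistic = {stat:.3f}, p-value = {p:.3f}")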

  17. On the modeling of the 2010 Gulf of Mexico Oil Spill

    NASA Astrophysics Data System (ADS)

    Mariano, A. J.; Kourafalou, V. H.; Srinivasan, A.; Kang, H.; Halliwell, G. R.; Ryan, E. H.; Roffer, M.

    2011-09-01

    Two oil particle trajectory forecasting systems were developed and applied to the 2010 Deepwater Horizon Oil Spill in the Gulf of Mexico. Both systems use ocean current fields from high-resolution numerical ocean circulation model simulations, Lagrangian stochastic models to represent unresolved sub-grid scale variability to advect oil particles, and Monte Carlo-based schemes for representing uncertain biochemical and physical processes. The first system assumes two-dimensional particle motion at the ocean surface, the oil is in one state, and the particle removal is modeled as a Monte Carlo process parameterized by a single removal rate. Oil particles are seeded using both initial conditions based on observations and particles released at the location of the Macondo well. The initial conditions (ICs) of oil particle location for the two-dimensional surface oil trajectory forecasts are based on a fusing of all available information including satellite-based analyses. The resulting oil map is digitized into a shape file within which polygon-filling software generates longitudes and latitudes with variable particle density depending on the amount of oil present in the observations for the IC. The more complex system assumes three states (light, medium, heavy) for the oil, each state having a different removal rate in the Monte Carlo process, three-dimensional particle motion, and a particle-size-dependent oil mixing model. Simulations from the two-dimensional forecast system produced results that qualitatively agreed with the uncertain "truth" fields. These simulations validated the use of our Monte Carlo scheme for representing oil removal by evaporation and other weathering processes. Eulerian velocity fields for predicting particle motion from data-assimilative models produced better particle trajectory distributions than a free-running model with no data assimilation. Monte Carlo simulations of the three-dimensional oil particle trajectories were also performed; their ensembles were generated by perturbing the size of the oil particles and the fraction in a given size range released at depth, the two largest unknowns in this problem. Thirty-six realizations of the model were run with only subsurface oil releases. Averaging these results indicates that after three months, about 25% of the oil remains in the water column and that most of the oil is below 800 m.
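
    The Monte Carlo removal process described for the simpler system can be sketched compactly: in each time step a particle survives with probability exp(-r*dt) for removal rate r. The rate and particle counts below are hypothetical, not the values used in the study.

        import numpy as np

        rng = np.random.default_rng(17)

        # Monte Carlo weathering/removal: each particle survives a step of
        # length dt with probability exp(-r * dt) for removal rate r.
        n, r, dt = 50_000, 0.03, 1.0       # particles, 1/day, days
        alive = np.ones(n, dtype=bool)
        for _ in range(90):                # 90 days of weathering
            alive &= rng.random(n) < np.exp(-r * dt)

        # Expected fraction after 90 days is exp(-0.03 * 90) ~ 0.067.
        print("fraction remaining:", alive.mean())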

  18. Analysis of Naval Ammunition Stock Positioning

    DTIC Science & Technology

    2015-12-01

    A Monte Carlo simulation model was developed to simulate the expected cost and delivery performance of positioning ammunition stockpiles at coastal Navy facilities; the simulation assigns probabilities for site-to-site movements. Subject terms: supply chain management, Monte Carlo simulation, risk, delivery performance, stock positioning.

  19. Monte Carlo simulation of a photodisintegration of 3H experiment in Geant4

    NASA Astrophysics Data System (ADS)

    Gray, Isaiah

    2013-10-01

    An upcoming experiment involving photodisintegration of 3H at the High Intensity Gamma-Ray Source facility at Duke University has been simulated in the software package Geant4. CAD models of silicon detectors and wire chambers were imported from Autodesk Inventor using the program FastRad and the Geant4 GDML importer. Sensitive detectors were associated with the appropriate logical volumes in the exported GDML file so that changes in detector geometry will be easily manifested in the simulation. Probability distribution functions for the energy and direction of outgoing protons were generated using numerical tables from previous theory, and energies and directions were sampled from these distributions using a rejection sampling algorithm. The simulation will be a useful tool to optimize detector geometry, estimate background rates, and test data analysis algorithms. This work was supported by the Triangle Universities Nuclear Laboratory REU program at Duke University.
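
    Rejection sampling, the algorithm mentioned above for drawing energies and directions from tabulated distributions, can be sketched in a few lines; the Gaussian-shaped energy distribution below is a hypothetical stand-in for the tabulated theory.

        import numpy as np

        rng = np.random.default_rng(7)

        def rejection_sample(pdf, x_min, x_max, pdf_max, n, rng):
            """Draw n samples from a bounded pdf on [x_min, x_max] by
            accepting points that fall under the pdf curve."""
            out = []
            while len(out) < n:
                x = rng.uniform(x_min, x_max)
                if rng.uniform(0.0, pdf_max) < pdf(x):
                    out.append(x)
            return np.array(out)

        # Hypothetical outgoing-proton energy distribution (arb. units).
        pdf = lambda e: np.exp(-(e - 4.0) ** 2 / 0.5)
        energies = rejection_sample(pdf, 2.0, 6.0, 1.0, 10_000, rng)
        print(energies.mean(), energies.std())

    Rejection sampling is simple and exact but wastes draws when the proposal envelope is much larger than the pdf, which is why tabulated inverse-CDF sampling is sometimes preferred.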

  20. MOLNs: A CLOUD PLATFORM FOR INTERACTIVE, REPRODUCIBLE, AND SCALABLE SPATIAL STOCHASTIC COMPUTATIONAL EXPERIMENTS IN SYSTEMS BIOLOGY USING PyURDME.

    PubMed

    Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas

    2016-01-01

    Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments.
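
    PyURDME's own API is not reproduced here, but the kind of stochastic kinetics that such spatial solvers generalize can be conveyed with the non-spatial Gillespie direct method; the sketch below simulates a toy reversible reaction exactly, one reaction event at a time.

        import numpy as np

        rng = np.random.default_rng(3)

        # Gillespie direct method for A -> B (rate k1) and B -> A (k2).
        k1, k2 = 1.0, 0.5
        a_count, b_count, t = 100, 0, 0.0
        times, trace = [0.0], [a_count]
        while t < 10.0:
            rates = np.array([k1 * a_count, k2 * b_count])
            total = rates.sum()
            if total == 0.0:
                break
            t += rng.exponential(1.0 / total)        # time to next event
            if rng.uniform() < rates[0] / total:     # which reaction fires
                a_count, b_count = a_count - 1, b_count + 1
            else:
                a_count, b_count = a_count + 1, b_count - 1
            times.append(t)
            trace.append(a_count)

        print("final A count:", a_count)

    The large ensembles of such trajectories needed for statistics are exactly the embarrassingly parallel Monte Carlo workload that MOLNs distributes across cloud workers.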

  1. Monte Carlo simulation to calculate the rate of 137Cs gamma rays dispersion in gallium arsenide compound

    NASA Astrophysics Data System (ADS)

    Haider, F. A.; Chee, F. P.; Abu Hassan, H.; Saafie, S.

    2017-01-01

    Radiation effects on Gallium Arsenide (GaAs) have been tested by exposing samples to Cesium-137 (137Cs) gamma rays. Gallium Arsenide is a basic photonic material for most of the space technology communication, and, therefore, lends itself for applications where this is of concern. Monte Carlo simulations of interaction between direct ionizing radiation and GaAs structure have been performed in TRIM software, being part of SRIM 2011 programming package. An adverse results shows that energy dose does not govern the displacement of atoms and is dependent on the changes of incident angles and thickness of the GaAs target element. At certain thickness of GaAs and incident angle of 137Cs ion, the displacement damage is at its highest value. From the simulation result, it is found that if the thickness of the GaAs semiconductor material is small compared to the projected range at that particular incident energy, the energy loss in the target GaAs will be small. Hence, when the depth of semiconductor material is reduced, the range of damage in the target also decreased. However, the other factors such as quantum size effect, the energy gap between the conduction and valence band must also be taken into consideration when the dimension of the device is diminished.

  2. Fast CPU-based Monte Carlo simulation for radiotherapy dose calculation.

    PubMed

    Ziegenhein, Peter; Pirner, Sven; Ph Kamerling, Cornelis; Oelfke, Uwe

    2015-08-07

    Monte Carlo (MC) simulations are considered to be the most accurate method for calculating dose distributions in radiotherapy. Their clinical application, however, is still limited by the long runtimes conventional implementations of MC algorithms require to deliver sufficiently accurate results on high-resolution imaging data. In order to overcome this obstacle we developed the software package PhiMC, which is capable of computing precise dose distributions in a sub-minute time frame by leveraging the potential of modern many- and multi-core CPU-based computers. PhiMC is based on the well-verified dose planning method (DPM). We could demonstrate that PhiMC delivers dose distributions which are in excellent agreement with DPM. The multi-core implementation of PhiMC scales well between different computer architectures and achieves a speed-up of up to 37× compared to the original DPM code executed on a modern system. Furthermore, we could show that our CPU-based implementation on a modern workstation is between 1.25× and 1.95× faster than a well-known GPU implementation of the same simulation method on an NVIDIA Tesla C2050. Since CPUs can work with several hundred gigabytes of RAM, the typical GPU memory limitation does not apply to our implementation, and high-resolution clinical plans can be calculated.
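
    The general multi-core pattern used by such CPU-based MC dose engines (independent particle batches per worker with separate random streams, tallies merged at the end) can be sketched with Python's multiprocessing module. The slab geometry and physics below are toys, not PhiMC or DPM.

        import numpy as np
        from multiprocessing import Pool

        def run_batch(args):
            """Track a batch of toy particles through a 1D slab and tally
            deposited energy per voxel; each worker has its own RNG seed."""
            n_particles, seed = args
            rng = np.random.default_rng(seed)
            mu = 0.2                                 # 1/cm, toy attenuation
            edges = np.linspace(0.0, 30.0, 31)       # 1 cm voxels
            depth = rng.exponential(1.0 / mu, size=n_particles)
            energy = rng.uniform(0.5, 1.5, size=n_particles)  # toy MeV
            tally, _ = np.histogram(depth, bins=edges, weights=energy)
            return tally

        if __name__ == "__main__":
            jobs = [(250_000, seed) for seed in range(8)]  # one per core
            with Pool(processes=8) as pool:
                dose = sum(pool.map(run_batch, jobs))      # merge tallies
            print(dose[:5])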

  3. Optical dosimetry probes to validate Monte Carlo and empirical-method-based NIR dose planning in the brain.

    PubMed

    Verleker, Akshay Prabhu; Shaffer, Michael; Fang, Qianqian; Choi, Mi-Ran; Clare, Susan; Stantz, Keith M

    2016-12-01

    Three-dimensional photon dosimetry in tissues is critical in designing optical therapeutic protocols to trigger light-activated drug release. The objective of this study is to investigate the feasibility of a Monte Carlo-based optical therapy planning software by developing dosimetry tools to characterize and cross-validate the local photon fluence in brain tissue, as part of a long-term strategy to quantify the effects of photoactivated drug release in brain tumors. An existing GPU-based 3D Monte Carlo (MC) code was modified to simulate near-infrared photon transport with differing laser beam profiles within phantoms of skull bone (B), white matter (WM), and gray matter (GM). A novel titanium-based optical dosimetry probe with isotropic acceptance was used to validate the local photon fluence, and an empirical model of photon transport was developed to significantly decrease execution time for clinical application. Differences between the MC and the dosimetry probe measurements averaged 11.27%, 13.25%, and 11.81% along the illumination beam axis, and 9.4%, 12.06%, and 8.91% perpendicular to the beam axis for the WM, GM, and B phantoms, respectively. For a heterogeneous head phantom, the measured errors were 17.71% and 18.04% along and perpendicular to the beam axis. The empirical algorithm was validated by probe measurements and matched the MC results (R² ≈ 0.99), with average errors of 10.1%, 45.2%, and 22.1% relative to probe measurements, and 22.6%, 35.8%, and 21.9% relative to the MC, for the WM, GM, and B phantoms, respectively. The simulation time for the empirical model was 6 s versus 8 h for the GPU-based Monte Carlo for a head phantom simulation. These tools provide the capability to develop and optimize treatment plans for optimal release of pharmaceuticals in the treatment of cancer. Future work will test and validate these novel delivery and release mechanisms in vivo.
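
    The paper's empirical model itself is not given here, but the usual starting point for such fast surrogates is the diffusion-approximation fluence of an isotropic point source, phi(r) = exp(-mu_eff * r) / (4 * pi * D * r), with D = 1/(3(mu_a + mu_s')) and mu_eff = sqrt(mu_a / D). The optical properties below are illustrative values, not those of the study's phantoms.

        import numpy as np

        def diffusion_fluence(r_cm, mu_a, mu_s_prime):
            """Fluence around an isotropic point source in an infinite
            medium, standard diffusion approximation (mu_s' >> mu_a)."""
            d_coeff = 1.0 / (3.0 * (mu_a + mu_s_prime))   # diffusion coeff.
            mu_eff = np.sqrt(mu_a / d_coeff)              # effective atten.
            return np.exp(-mu_eff * r_cm) / (4.0 * np.pi * d_coeff * r_cm)

        # Illustrative NIR optical properties for a brain-like phantom.
        r = np.linspace(0.2, 3.0, 15)                     # cm
        print(diffusion_fluence(r, mu_a=0.08, mu_s_prime=10.0))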

  4. MCNP modelling of vaginal and uterine applicators used in intracavitary brachytherapy and comparison with radiochromic film measurements

    NASA Astrophysics Data System (ADS)

    Ceccolini, E.; Gerardy, I.; Ródenas, J.; van Dycke, M.; Gallardo, S.; Mostacci, D.

    Brachytherapy is an advanced cancer treatment that is minimally invasive, minimising radiation exposure to the surrounding healthy tissues. Microselectron© Nucletron devices with 192Ir source can be used for gynaecological brachytherapy, in patients with vaginal or uterine cancer. Measurements of isodose curves have been performed in a PMMA phantom and compared with Monte Carlo calculations and TPS (Plato software of Nucletron BPS 14.2) evaluation. The isodose measurements have been performed with radiochromic films (Gafchromic EBT©). The dose matrix has been obtained after digitalisation and use of a dose calibration curve obtained with a 6 MV photon beam provided by a medical linear accelerator. A comparison between the calculated and the measured matrix has been performed. The calculated dose matrix is obtained with a simulation using the MCNP5 Monte Carlo code (F4MESH tally).

  5. Epithelial cancers and photon migration: Monte Carlo simulations and diffuse reflectance measurements

    NASA Astrophysics Data System (ADS)

    Tubiana, Jerome; Kass, Alex J.; Newman, Maya Y.; Levitz, David

    2015-07-01

    Detecting pre-cancer in epithelial tissues such as the cervix is a challenging task in low-resource settings. In an effort to provide a low-cost cervical cancer screening and diagnostic method for use in such settings, mobile colposcopes that use a smartphone as their engine have been developed. Designing image analysis software suited for this task requires proper modeling of light propagation from the abnormalities inside tissues to the camera of the smartphone. Different simulation methods have been developed in the past, by solving light diffusion equations or running Monte Carlo simulations. Several algorithms exist for the latter, including MCML and the recently developed MCX. For imaging purposes, the observable parameter of interest is the reflectance profile of a tissue under a specific pattern of illumination and optical setup. Extensions of the MCX algorithm to simulate this observable under these conditions were developed. These extensions were validated against MCML and diffusion theory for the simple case of contact measurements, and reflectance profiles under colposcopy imaging geometry were also simulated. To validate this model, the diffuse reflectance profiles of tissue phantoms were measured with a spectrometer under several illumination and optical settings for various homogeneous tissue phantoms. The measured reflectance profiles showed a non-trivial deviation across the spectrum. Measurements from an added-absorber experiment on a series of phantoms showed that the absorption of the dye scales linearly when fit to both the MCX and diffusion models. More work is needed to integrate a pupil into the experiment.

  6. GATE Monte Carlo simulations for variations of an integrated PET/MR hybrid imaging system based on the Biograph mMR model

    NASA Astrophysics Data System (ADS)

    Aklan, B.; Jakoby, B. W.; Watson, C. C.; Braun, H.; Ritt, P.; Quick, H. H.

    2015-06-01

    A simulation toolkit, GATE (Geant4 Application for Tomographic Emission), was used to develop an accurate Monte Carlo (MC) simulation of a fully integrated 3T PET/MR hybrid imaging system (Siemens Biograph mMR). The PET/MR components of the Biograph mMR were simulated in order to allow a detailed study of the effect of variations of the system design on the PET performance, which are not easy to access and measure on a real PET/MR system. The 3T static magnetic field of the MR system was taken into account in all Monte Carlo simulations. The validation of the MC model was carried out against actual measurements performed on the PET/MR system by following the NEMA (National Electrical Manufacturers Association) NU 2-2007 standard. The comparison of simulated and experimental performance measurements included spatial resolution, sensitivity, scatter fraction, and count rate capability. The validated system model was then used for two different applications. The first application focused on investigating the effect of an extension of the PET field-of-view on the PET performance of the PET/MR system. The second application deals with simulating a modified system timing resolution and coincidence time window of the PET detector electronics in order to simulate time-of-flight (TOF) PET detection. A dedicated phantom was modeled to investigate the impact of TOF on overall PET image quality. Simulation results showed that the overall divergence between simulated and measured data was less than 10%. Varying the detector geometry showed that the system sensitivity and noise equivalent count rate of the PET/MR system increased progressively with an increasing number of axial detector block rings, as expected. TOF-based PET reconstructions of the modeled phantom showed an improvement in signal-to-noise ratio and image contrast compared to the conventional non-TOF PET reconstructions. In conclusion, the validated MC simulation model of an integrated PET/MR system with an overall accuracy error of less than 10% can now be used for further MC simulation applications, such as the development of hardware components, as well as for testing new PET/MR software algorithms, such as the assessment of point-spread-function-based reconstruction algorithms.

  7. Simulation loop between cad systems, GEANT-4 and GeoModel: Implementation and results

    NASA Astrophysics Data System (ADS)

    Sharmazanashvili, A.; Tsutskiridze, Niko

    2016-09-01

    Comparative analysis of simulated and as-built geometry descriptions of a detector is an important field of study for understanding data-versus-Monte-Carlo discrepancies. Consistency and detail of shapes are less important, while the adequacy of the volumes and weights of detector components is essential for tracking. There are two main sources of faults in the geometry descriptions used in simulation: (1) differences between the simulated and as-built geometry descriptions; (2) internal inaccuracies in the geometry transformations added by the simulation software infrastructure itself. The Georgian engineering team developed a hub on the basis of the CATIA platform and several tools enabling different descriptions used by simulation packages to be read into CATIA: XML->CATIA, VP1->CATIA, GeoModel->CATIA, and Geant4->CATIA. As a result it becomes possible to compare the different descriptions with each other using the full power of CATIA and to investigate both classes of geometry-description faults. This paper presents the results of case studies of the ATLAS coils and end-cap toroid structures.

  8. An Object-Oriented Serial DSMC Simulation Package

    NASA Astrophysics Data System (ADS)

    Liu, Hongli; Cai, Chunpei

    2011-05-01

    A newly developed three-dimensional direct simulation Monte Carlo (DSMC) simulation package, named GRASP ("Generalized Rarefied gAs Simulation Package"), is reported in this paper. This package utilizes the concept of a simulation engine, many C++ features, and software design patterns. The package has an open architecture which can benefit further development and maintenance of the code. In order to reduce the engineering time for three-dimensional models, a hybrid grid scheme, combined with a flexible data structure implemented in C++, is used in this package. This scheme utilizes a local data structure based on the computational cell to achieve high performance on workstation processors. This data structure allows the DSMC algorithm to be very efficiently parallelized with domain decomposition, and it provides much flexibility in terms of grid types. This package can utilize traditional structured, unstructured, or hybrid grids within the framework of a single code to model arbitrarily complex geometries and to simulate rarefied gas flows. Benchmark test cases indicate that this package has satisfactory accuracy for complex rarefied gas flows.

  9. A Hardware-Accelerated Quantum Monte Carlo framework (HAQMC) for N-body systems

    NASA Astrophysics Data System (ADS)

    Gothandaraman, Akila; Peterson, Gregory D.; Warren, G. Lee; Hinde, Robert J.; Harrison, Robert J.

    2009-12-01

    Interest in the study of structural and energetic properties of highly quantum clusters, such as inert gas clusters has motivated the development of a hardware-accelerated framework for Quantum Monte Carlo simulations. In the Quantum Monte Carlo method, the properties of a system of atoms, such as the ground-state energies, are averaged over a number of iterations. Our framework is aimed at accelerating the computations in each iteration of the QMC application by offloading the calculation of properties, namely energy and trial wave function, onto reconfigurable hardware. This gives a user the capability to run simulations for a large number of iterations, thereby reducing the statistical uncertainty in the properties, and for larger clusters. This framework is designed to run on the Cray XD1 high performance reconfigurable computing platform, which exploits the coarse-grained parallelism of the processor along with the fine-grained parallelism of the reconfigurable computing devices available in the form of field-programmable gate arrays. In this paper, we illustrate the functioning of the framework, which can be used to calculate the energies for a model cluster of helium atoms. In addition, we present the capabilities of the framework that allow the user to vary the chemical identities of the simulated atoms. Program summaryProgram title: Hardware Accelerated Quantum Monte Carlo (HAQMC) Catalogue identifier: AEEP_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEEP_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 691 537 No. of bytes in distributed program, including test data, etc.: 5 031 226 Distribution format: tar.gz Programming language: C/C++ for the QMC application, VHDL and Xilinx 8.1 ISE/EDK tools for FPGA design and development Computer: Cray XD1 consisting of a dual-core, dualprocessor AMD Opteron 2.2 GHz with a Xilinx Virtex-4 (V4LX160) or Xilinx Virtex-II Pro (XC2VP50) FPGA per node. We use the compute node with the Xilinx Virtex-4 FPGA Operating system: Red Hat Enterprise Linux OS Has the code been vectorised or parallelized?: Yes Classification: 6.1 Nature of problem: Quantum Monte Carlo is a practical method to solve the Schrödinger equation for large many-body systems and obtain the ground-state properties of such systems. This method involves the sampling of a number of configurations of atoms and averaging the properties of the configurations over a number of iterations. We are interested in applying the QMC method to obtain the energy and other properties of highly quantum clusters, such as inert gas clusters. Solution method: The proposed framework provides a combined hardware-software approach, in which the QMC simulation is performed on the host processor, with the computationally intensive functions such as energy and trial wave function computations mapped onto the field-programmable gate array (FPGA) logic device attached as a co-processor to the host processor. We perform the QMC simulation for a number of iterations as in the case of our original software QMC approach, to reduce the statistical uncertainty of the results. 
However, our proposed HAQMC framework accelerates each iteration of the simulation by significantly reducing the time taken to calculate the ground-state properties of the configurations of atoms, thereby accelerating the overall QMC simulation. We provide a generic interpolation framework that can be extended to study a variety of pure and doped atomic clusters, irrespective of the chemical identities of the atoms. For the FPGA implementation of the properties, we use a two-region approach for accurately computing the properties over the entire domain, employ deep pipelines, and use fixed-point arithmetic for all our calculations, guaranteeing the accuracy required for our simulation.

  10. Collision of Physics and Software in the Monte Carlo Application Toolkit (MCATK)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sweezy, Jeremy Ed

    2016-01-21

    The topic is presented in a series of slides organized as follows: MCATK overview, development strategy, available algorithms, problem modeling (sources, geometry, data, tallies), parallelism, miscellaneous tools/features, example MCATK application, recent areas of research, and summary and future work. MCATK is a C++ component-based Monte Carlo neutron-gamma transport software library with continuous energy neutron and photon transport. Designed to build specialized applications and to provide new functionality in existing general-purpose Monte Carlo codes like MCNP, it reads ACE formatted nuclear data generated by NJOY. The motivation behind MCATK was to reduce costs. MCATK physics involves continuous energy neutron and gamma transport with multi-temperature treatment, static eigenvalue (keff and α) algorithms, a time-dependent algorithm, and fission chain algorithms. MCATK geometry includes mesh geometries and solid body geometries. MCATK provides verified, unit-tested Monte Carlo components, flexibility in Monte Carlo application development, and numerous tools such as geometry and cross section plotters.

  11. Numerical integration of detector response functions via Monte Carlo simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelly, Keegan John; O'Donnell, John M.; Gomez, Jaime A.

    Calculations of detector response functions are complicated because they include the intricacies of signal creation from the detector itself as well as a complex interplay between the detector, the particle-emitting target, and the entire experimental environment. As such, these functions are typically only accessible through time-consuming Monte Carlo simulations. Furthermore, the output of thousands of Monte Carlo simulations can be necessary in order to extract a physics result from a single experiment. Here we describe a method to obtain a full description of the detector response function using Monte Carlo simulations. We also show that a response function calculated in this way can be used to create Monte Carlo simulation output spectra a factor of ~1000× faster than running a new Monte Carlo simulation. A detailed discussion of the proper treatment of uncertainties when using this and other similar methods is provided as well. Here, this method is demonstrated and tested using simulated data from the Chi-Nu experiment, which measures prompt fission neutron spectra at the Los Alamos Neutron Science Center.

  12. Numerical integration of detector response functions via Monte Carlo simulations

    NASA Astrophysics Data System (ADS)

    Kelly, K. J.; O'Donnell, J. M.; Gomez, J. A.; Taddeucci, T. N.; Devlin, M.; Haight, R. C.; White, M. C.; Mosby, S. M.; Neudecker, D.; Buckner, M. Q.; Wu, C. Y.; Lee, H. Y.

    2017-09-01

    Calculations of detector response functions are complicated because they include the intricacies of signal creation from the detector itself as well as a complex interplay between the detector, the particle-emitting target, and the entire experimental environment. As such, these functions are typically only accessible through time-consuming Monte Carlo simulations. Furthermore, the output of thousands of Monte Carlo simulations can be necessary in order to extract a physics result from a single experiment. Here we describe a method to obtain a full description of the detector response function using Monte Carlo simulations. We also show that a response function calculated in this way can be used to create Monte Carlo simulation output spectra a factor of ~1000× faster than running a new Monte Carlo simulation. A detailed discussion of the proper treatment of uncertainties when using this and other similar methods is provided as well. This method is demonstrated and tested using simulated data from the Chi-Nu experiment, which measures prompt fission neutron spectra at the Los Alamos Neutron Science Center.
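
    The speed-up described above comes from replacing a fresh Monte Carlo run with a fold of the source spectrum through a precomputed response matrix: expected counts = R @ spectrum, followed by Poisson sampling if simulated statistics are wanted. The Gaussian-broadening response matrix below is a synthetic stand-in, not the Chi-Nu response.

        import numpy as np

        rng = np.random.default_rng(11)
        n_src, n_det = 200, 150

        # Synthetic response matrix R[j, i]: probability that a particle
        # emitted in source bin i is recorded in detector channel j.
        src_e = np.linspace(0.5, 10.0, n_src)
        det_e = np.linspace(0.0, 12.0, n_det)
        sigma = 0.4 + 0.05 * src_e                    # energy-dependent width
        resp = np.exp(-0.5 * ((det_e[:, None] - src_e[None, :]) / sigma) ** 2)
        resp /= resp.sum(axis=0, keepdims=True)       # normalize each column

        # Folding a trial spectrum through R replaces a full MC rerun.
        spectrum = np.sqrt(src_e) * np.exp(-src_e / 1.4)  # Maxwellian-like
        expected = resp @ spectrum
        counts = rng.poisson(expected * 1e4 / expected.sum())
        print(counts[:10])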

  13. Numerical integration of detector response functions via Monte Carlo simulations

    DOE PAGES

    Kelly, Keegan John; O'Donnell, John M.; Gomez, Jaime A.; ...

    2017-06-13

    Calculations of detector response functions are complicated because they include the intricacies of signal creation from the detector itself as well as a complex interplay between the detector, the particle-emitting target, and the entire experimental environment. As such, these functions are typically only accessible through time-consuming Monte Carlo simulations. Furthermore, the output of thousands of Monte Carlo simulations can be necessary in order to extract a physics result from a single experiment. Here we describe a method to obtain a full description of the detector response function using Monte Carlo simulations. We also show that a response function calculated in this way can be used to create Monte Carlo simulation output spectra a factor of ~1000× faster than running a new Monte Carlo simulation. A detailed discussion of the proper treatment of uncertainties when using this and other similar methods is provided as well. Here, this method is demonstrated and tested using simulated data from the Chi-Nu experiment, which measures prompt fission neutron spectra at the Los Alamos Neutron Science Center.

  14. DnaSAM: Software to perform neutrality testing for large datasets with complex null models.

    PubMed

    Eckert, Andrew J; Liechty, John D; Tearse, Brandon R; Pande, Barnaly; Neale, David B

    2010-05-01

    Patterns of DNA sequence polymorphisms can be used to understand the processes of demography and adaptation within natural populations. High-throughput generation of DNA sequence data has historically been the bottleneck with respect to data processing and experimental inference. Advances in marker technologies have largely solved this problem. Currently, the limiting step is computational, with most molecular population genetic software allowing a gene-by-gene analysis through a graphical user interface. An easy-to-use analysis program that allows both high-throughput processing of multiple sequence alignments and the flexibility to simulate data under complex demographic scenarios has been lacking. We introduce a new program, named DnaSAM, which allows high-throughput estimation of DNA sequence diversity and neutrality statistics from experimental data, along with the ability to test those statistics via Monte Carlo coalescent simulations. These simulations are conducted using the ms program, which is able to incorporate several genetic parameters (e.g. recombination) and demographic scenarios (e.g. population bottlenecks). The output is a set of diversity and neutrality statistics with associated probability values under a user-specified null model, stored in an easy-to-manipulate text file. © 2009 Blackwell Publishing Ltd.
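
    The hypothesis-testing step reduces to an empirical p-value: compare the observed statistic against the distribution of the same statistic over many coalescent simulations under the null model. The sketch below uses a hypothetical null distribution in place of actual ms output.

        import numpy as np

        rng = np.random.default_rng(5)

        # Stand-ins: an observed Tajima's D for one locus, plus a null
        # distribution that would come from ms coalescent simulations.
        observed_d = -1.8
        null_d = rng.normal(0.0, 0.9, size=10_000)   # hypothetical nulls

        # Two-tailed empirical p-value with the +1 correction that avoids
        # reporting p = 0 from a finite number of simulations.
        more_extreme = np.sum(np.abs(null_d) >= abs(observed_d))
        p_value = (more_extreme + 1) / (null_d.size + 1)
        print(f"empirical p = {p_value:.4f}")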

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, C; Nguyen, G; Chung, Y

    Purpose: Ureteroscopy involves fluoroscopy, which can deliver a considerable radiation dose to the patient. The purpose of this study was two-fold: (a) to develop an effective dose computational model for obese and non-obese patients undergoing left and right ureteroscopy, and (b) to evaluate the utility of a commercial Monte Carlo software package for dose assessment in ureteroscopy. Methods: Organ dose measurements were performed on an adult male anthropomorphic phantom, representing the non-obese patients, with 20 high-sensitivity MOSFET detectors and two 0.18 cc ionization chambers placed in selected organs. Fat-equivalent paddings were placed around the abdominal region to simulate obese patients. Effective dose (ED) was calculated using ICRP 103 tissue weighting factors and normalized to an effective dose rate in millisievert per second (mSv/s). In addition, a commercial Monte Carlo (MC) dose estimation program was used to estimate ED for the non-obese model, with a table attenuation correction applied to simulate the clinical procedure. Results: For the equipment and protocols involved in this study, the MOSFET-derived ED rates for the obese patient model ('Left': 0.0092±0.0004 mSv/s; 'Right': 0.0086±0.0004 mSv/s) were found to be more than twice those for the non-obese patient model ('Left': 0.0041±0.0003 mSv/s; 'Right': 0.0036±0.0007 mSv/s). The MC-derived ED rates for the non-obese patient model ('Left': 0.0041 mSv/s; 'Right': 0.0036 mSv/s; with statistical uncertainty of 1%) showed good agreement with the MOSFET method. Conclusion: The significant difference in ED rate between the obese and non-obese patient models shows the limitation of directly applying commercial software to obese patients, which can lead to considerable underestimation of ED. Although commercial software offers a convenient means of dose estimation, its utility may be limited to standard-man geometry, as the software does not account for table attenuation, obese patient geometry, or differences between the anthropomorphic phantom and the MC mathematical phantom.
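
    The effective-dose bookkeeping referred to above is the ICRP 103 weighted sum E = sum over tissues T of w_T * H_T. A minimal sketch with a subset of the ICRP 103 tissue weighting factors and hypothetical organ dose rates (a full calculation includes all tissues, whose weights sum to 1):

        # Effective dose rate as the ICRP 103 weighted sum of equivalent
        # dose rates. Weights shown are a subset of ICRP 103; the organ
        # dose-rate values are hypothetical, for illustration only.
        w_t = {"lung": 0.12, "stomach": 0.12, "colon": 0.12,
               "red_marrow": 0.12, "bladder": 0.04, "liver": 0.04,
               "thyroid": 0.04, "gonads": 0.08}
        h_t = {"lung": 0.011, "stomach": 0.014, "colon": 0.016,  # mSv/s
               "red_marrow": 0.009, "bladder": 0.018, "liver": 0.013,
               "thyroid": 0.002, "gonads": 0.015}

        ed_rate = sum(w_t[t] * h_t[t] for t in w_t)
        print(f"partial effective dose rate: {ed_rate:.4f} mSv/s")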

  16. Reflective Optics Design for an LED High Beam Headlamp of Motorbikes

    PubMed Central

    Ge, Peng; Wang, Xiang; Li, Yang; Wang, Hong

    2015-01-01

    We propose a reflective optics design for an LED motorbike high beam lamp. We set the measuring screen as an elliptical zone, divide it into many small lattices, divide the spatial angle of the LED source into many parts, and establish relationships between them. According to the law of conservation of energy and Snell's law, the reflector is generated by a freeform optics design method. The optical system is then simulated by the Monte Carlo method using the ASAP software. The simulated light pattern meets the standard. The high beam headlamp was finally fabricated and assembled into a physical object. Experimental results fully comply with United Nations Economic Commission for Europe (ECE) vehicle regulations R113 revision 2 (Class C). PMID:25961073

  17. Reflective optics design for an LED high beam headlamp of motorbikes.

    PubMed

    Ge, Peng; Wang, Xiang; Li, Yang; Wang, Hong

    2015-01-01

    We propose a reflective optics design for an LED motorbike high beam lamp. We set the measuring screen as an elliptical zone, divide it into many small lattices, divide the spatial angle of the LED source into many parts, and establish relationships between them. According to the law of conservation of energy and Snell's law, the reflector is generated by a freeform optics design method. The optical system is then simulated by the Monte Carlo method using the ASAP software. The simulated light pattern meets the standard. The high beam headlamp was finally fabricated and assembled into a physical object. Experimental results fully comply with United Nations Economic Commission for Europe (ECE) vehicle regulations R113 revision 2 (Class C).

  18. Physics Computing '92: Proceedings of the 4th International Conference

    NASA Astrophysics Data System (ADS)

    de Groot, Robert A.; Nadrchal, Jaroslav

    1993-04-01

    The Table of Contents for the book is as follows: * Preface * INVITED PAPERS * Ab Initio Theoretical Approaches to the Structural, Electronic and Vibrational Properties of Small Clusters and Fullerenes: The State of the Art * Neural Multigrid Methods for Gauge Theories and Other Disordered Systems * Multicanonical Monte Carlo Simulations * On the Use of the Symbolic Language Maple in Physics and Chemistry: Several Examples * Nonequilibrium Phase Transitions in Catalysis and Population Models * Computer Algebra, Symmetry Analysis and Integrability of Nonlinear Evolution Equations * The Path-Integral Quantum Simulation of Hydrogen in Metals * Digital Optical Computing: A New Approach of Systolic Arrays Based on Coherence Modulation of Light and Integrated Optics Technology * Molecular Dynamics Simulations of Granular Materials * Numerical Implementation of a K.A.M. Algorithm * Quasi-Monte Carlo, Quasi-Random Numbers and Quasi-Error Estimates * What Can We Learn from QMC Simulations * Physics of Fluctuating Membranes * Plato, Apollonius, and Klein: Playing with Spheres * Steady States in Nonequilibrium Lattice Systems * CONVODE: A REDUCE Package for Differential Equations * Chaos in Coupled Rotators * Symplectic Numerical Methods for Hamiltonian Problems * Computer Simulations of Surfactant Self Assembly * High-dimensional and Very Large Cellular Automata for Immunological Shape Space * A Review of the Lattice Boltzmann Method * Electronic Structure of Solids in the Self-interaction Corrected Local-spin-density Approximation * Dedicated Computers for Lattice Gauge Theory Simulations * Physics Education: A Survey of Problems and Possible Solutions * Parallel Computing and Electronic-Structure Theory * High Precision Simulation Techniques for Lattice Field Theory * CONTRIBUTED PAPERS * Case Study of Microscale Hydrodynamics Using Molecular Dynamics and Lattice Gas Methods * Computer Modelling of the Structural and Electronic Properties of the Supported Metal Catalysis * Ordered Particle Simulations for Serial and MIMD Parallel Computers * "NOLP" -- Program Package for Laser Plasma Nonlinear Optics * Algorithms to Solve Nonlinear Least Square Problems * Distribution of Hydrogen Atoms in Pd-H Computed by Molecular Dynamics * A Ray Tracing of Optical System for Protein Crystallography Beamline at Storage Ring-SIBERIA-2 * Vibrational Properties of a Pseudobinary Linear Chain with Correlated Substitutional Disorder * Application of the Software Package Mathematica in Generalized Master Equation Method * Linelist: An Interactive Program for Analysing Beam-foil Spectra * GROMACS: A Parallel Computer for Molecular Dynamics Simulations * GROMACS Method of Virial Calculation Using a Single Sum * The Interactive Program for the Solution of the Laplace Equation with the Elimination of Singularities for Boundary Functions * Random-Number Generators: Testing Procedures and Comparison of RNG Algorithms * Micro-TOPIC: A Tokamak Plasma Impurities Code * Rotational Molecular Scattering Calculations * Orthonormal Polynomial Method for Calibrating of Cryogenic Temperature Sensors * Frame-based System Representing Basis of Physics * The Role of Massively Data-parallel Computers in Large Scale Molecular Dynamics Simulations * Short-range Molecular Dynamics on a Network of Processors and Workstations * An Algorithm for Higher-order Perturbation Theory in Radiative Transfer Computations * Hydrostochastics: The Master Equation Formulation of Fluid Dynamics * HPP Lattice Gas on Transputers and Networked Workstations 
* Study on the Hysteresis Cycle Simulation Using Modeling with Different Functions on Intervals * Refined Pruning Techniques for Feed-forward Neural Networks * Random Walk Simulation of the Motion of Transient Charges in Photoconductors * The Optical Hysteresis in Hydrogenated Amorphous Silicon * Diffusion Monte Carlo Analysis of Modern Interatomic Potentials for He * A Parallel Strategy for Molecular Dynamics Simulations of Polar Liquids on Transputer Arrays * Distribution of Ions Reflected on Rough Surfaces * The Study of Step Density Distribution During Molecular Beam Epitaxy Growth: Monte Carlo Computer Simulation * Towards a Formal Approach to the Construction of Large-scale Scientific Applications Software * Correlated Random Walk and Discrete Modelling of Propagation through Inhomogeneous Media * Teaching Plasma Physics Simulation * A Theoretical Determination of the Au-Ni Phase Diagram * Boson and Fermion Kinetics in One-dimensional Lattices * Computational Physics Course on the Technical University * Symbolic Computations in Simulation Code Development and Femtosecond-pulse Laser-plasma Interaction Studies * Computer Algebra and Integrated Computing Systems in Education of Physical Sciences * Coordinated System of Programs for Undergraduate Physics Instruction * Program Package MIRIAM and Atomic Physics of Extreme Systems * High Energy Physics Simulation on the T_Node * The Chapman-Kolmogorov Equation as Representation of Huygens' Principle and the Monolithic Self-consistent Numerical Modelling of Lasers * Authoring System for Simulation Developments * Molecular Dynamics Study of Ion Charge Effects in the Structure of Ionic Crystals * A Computational Physics Introductory Course * Computer Calculation of Substrate Temperature Field in MBE System * Multimagnetical Simulation of the Ising Model in Two and Three Dimensions * Failure of the CTRW Treatment of the Quasicoherent Excitation Transfer * Implementation of a Parallel Conjugate Gradient Method for Simulation of Elastic Light Scattering * Algorithms for Study of Thin Film Growth * Algorithms and Programs for Physics Teaching in Romanian Technical Universities * Multicanonical Simulation of 1st order Transitions: Interface Tension of the 2D 7-State Potts Model * Two Numerical Methods for the Calculation of Periodic Orbits in Hamiltonian Systems * Chaotic Behavior in a Probabilistic Cellular Automata? 
* Wave Optics Computing by a Networked-based Vector Wave Automaton * Tensor Manipulation Package in REDUCE * Propagation of Electromagnetic Pulses in Stratified Media * The Simple Molecular Dynamics Model for the Study of Thermalization of the Hot Nucleon Gas * Electron Spin Polarization in PdCo Alloys Calculated by KKR-CPA-LSD Method * Simulation Studies of Microscopic Droplet Spreading * A Vectorizable Algorithm for the Multicolor Successive Overrelaxation Method * Tetragonality of the CuAu I Lattice and Its Relation to Electronic Specific Heat and Spin Susceptibility * Computer Simulation of the Formation of Metallic Aggregates Produced by Chemical Reactions in Aqueous Solution * Scaling in Growth Models with Diffusion: A Monte Carlo Study * The Nucleus as the Mesoscopic System * Neural Network Computation as Dynamic System Simulation * First-principles Theory of Surface Segregation in Binary Alloys * Data Smooth Approximation Algorithm for Estimating the Temperature Dependence of the Ice Nucleation Rate * Genetic Algorithms in Optical Design * Application of 2D-FFT in the Study of Molecular Exchange Processes by NMR * Advanced Mobility Model for Electron Transport in P-Si Inversion Layers * Computer Simulation for Film Surfaces and its Fractal Dimension * Parallel Computation Techniques and the Structure of Catalyst Surfaces * Educational SW to Teach Digital Electronics and the Corresponding Text Book * Primitive Trinomials (Mod 2) Whose Degree is a Mersenne Exponent * Stochastic Modelisation and Parallel Computing * Remarks on the Hybrid Monte Carlo Algorithm for the φ4 Model * An Experimental Computer Assisted Workbench for Physics Teaching * A Fully Implicit Code to Model Tokamak Plasma Edge Transport * EXPFIT: An Interactive Program for Automatic Beam-foil Decay Curve Analysis * Mapping Technique for Solving General, 1-D Hamiltonian Systems * Freeway Traffic, Cellular Automata, and Some (Self-Organizing) Criticality * Photonuclear Yield Analysis by Dynamic Programming * Incremental Representation of the Simply Connected Planar Curves * Self-convergence in Monte Carlo Methods * Adaptive Mesh Technique for Shock Wave Propagation * Simulation of Supersonic Coronal Streams and Their Interaction with the Solar Wind * The Nature of Chaos in Two Systems of Ordinary Nonlinear Differential Equations * Considerations of a Window-shopper * Interpretation of Data Obtained by RTP 4-Channel Pulsed Radar Reflectometer Using a Multi Layer Perceptron * Statistics of Lattice Bosons for Finite Systems * Fractal Based Image Compression with Affine Transformations * Algorithmic Studies on Simulation Codes for Heavy-ion Reactions * An Energy-Wise Computer Simulation of DNA-Ion-Water Interactions Explains the Abnormal Structure of Poly[d(A)]:Poly[d(T)] * Computer Simulation Study of Kosterlitz-Thouless-Like Transitions * Problem-oriented Software Package GUN-EBT for Computer Simulation of Beam Formation and Transport in Technological Electron-Optical Systems * Parallelization of a Boundary Value Solver and its Application in Nonlinear Dynamics * The Symbolic Classification of Real Four-dimensional Lie Algebras * Short, Singular Pulses Generation by a Dye Laser at Two Wavelengths Simultaneously * Quantum Monte Carlo Simulations of the Apex-Oxygen-Model * Approximation Procedures for the Axial Symmetric Static Einstein-Maxwell-Higgs Theory * Crystallization on a Sphere: Parallel Simulation on a Transputer Network * FAMULUS: A Software Product (also) for Physics Education * MathCAD vs. 
FAMULUS -- A Brief Comparison * First-principles Dynamics Used to Study Dissociative Chemisorption * A Computer Controlled System for Crystal Growth from Melt * A Time Resolved Spectroscopic Method for Short Pulsed Particle Emission * Green's Function Computation in Radiative Transfer Theory * Random Search Optimization Technique for One-criteria and Multi-criteria Problems * Hartley Transform Applications to Thermal Drift Elimination in Scanning Tunneling Microscopy * Algorithms of Measuring, Processing and Interpretation of Experimental Data Obtained with Scanning Tunneling Microscope * Time-dependent Atom-surface Interactions * Local and Global Minima on Molecular Potential Energy Surfaces: An Example of N3 Radical * Computation of Bifurcation Surfaces * Symbolic Computations in Quantum Mechanics: Energies in Next-to-solvable Systems * A Tool for RTP Reactor and Lamp Field Design * Modelling of Particle Spectra for the Analysis of Solid State Surface * List of Participants

  19. Accuracy of Monte Carlo simulations compared to in-vivo MDCT dosimetry.

    PubMed

    Bostani, Maryam; Mueller, Jonathon W; McMillan, Kyle; Cody, Dianna D; Cagnon, Chris H; DeMarco, John J; McNitt-Gray, Michael F

    2015-02-01

    The purpose of this study was to assess the accuracy of a Monte Carlo simulation-based method for estimating radiation dose from multidetector computed tomography (MDCT) by comparing simulated doses in ten patients to in-vivo dose measurements. MD Anderson Cancer Center Institutional Review Board approved the acquisition of in-vivo rectal dose measurements in a pilot study of ten patients undergoing virtual colonoscopy. The dose measurements were obtained by affixing TLD capsules to the inner lumen of rectal catheters. Voxelized patient models were generated from the MDCT images of the ten patients, and the dose to the TLD for all exposures was estimated using Monte Carlo based simulations. The Monte Carlo simulation results were compared to the in-vivo dose measurements to determine accuracy. The calculated mean percent difference between TLD measurements and Monte Carlo simulations was -4.9% with standard deviation of 8.7% and a range of -22.7% to 5.7%. The results of this study demonstrate very good agreement between simulated and measured doses in-vivo. Taken together with previous validation efforts, this work demonstrates that the Monte Carlo simulation methods can provide accurate estimates of radiation dose in patients undergoing CT examinations.
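
    The agreement metric used here, the mean percent difference between paired measured and simulated doses, is simple to reproduce; the values below are made up for illustration.

        import numpy as np

        # Hypothetical paired doses (mGy): TLD measurements vs. MC values.
        tld = np.array([12.1, 9.8, 15.3, 11.0, 13.7])
        mc = np.array([11.4, 10.1, 14.2, 10.3, 13.9])

        pct_diff = 100.0 * (mc - tld) / tld
        print(f"mean {pct_diff.mean():+.1f}%, "
              f"sd {pct_diff.std(ddof=1):.1f}%, "
              f"range [{pct_diff.min():+.1f}%, {pct_diff.max():+.1f}%]")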

  20. Monte Carlo Simulation for Perusal and Practice.

    ERIC Educational Resources Information Center

    Brooks, Gordon P.; Barcikowski, Robert S.; Robey, Randall R.

    Many problems in statistics can be meaningfully investigated through Monte Carlo methods. Monte Carlo studies can help solve problems that are mathematically intractable through the analysis of random samples from populations whose characteristics are known to the researcher. Using Monte Carlo simulation, the values of a statistic are…
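
    The classic exercise alluded to above, approximating the sampling distribution of a statistic by repeated sampling from a population with known characteristics, takes only a few lines:

        import numpy as np

        rng = np.random.default_rng(2024)

        # Sampling distribution of the median for n = 15 draws from a
        # known (exponential) population, approximated by replication.
        reps = np.array([np.median(rng.exponential(1.0, size=15))
                         for _ in range(20_000)])
        print(f"mean of medians = {reps.mean():.3f}, "
              f"Monte Carlo standard error = {reps.std(ddof=1):.3f}")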

  1. [Accuracy Check of Monte Carlo Simulation in Particle Therapy Using Gel Dosimeters].

    PubMed

    Furuta, Takuya

    2017-01-01

    Gel dosimeters are a three-dimensional imaging tool for dose distributions induced by radiation. They can be used to check the accuracy of Monte Carlo simulations in particle therapy. An application is reviewed in this article. An inhomogeneous biological sample with a gel dosimeter placed behind it was irradiated by a carbon beam. The recorded dose distribution in the gel dosimeter reflected the inhomogeneity of the biological sample. A Monte Carlo simulation was conducted by reconstructing the biological sample from its CT image. The accuracy of the particle transport in the Monte Carlo simulation was checked by comparing the dose distribution in the gel dosimeter between simulation and experiment.

  2. A fast Monte Carlo toolkit on GPU for treatment plan dose recalculation in proton therapy

    NASA Astrophysics Data System (ADS)

    Senzacqua, M.; Schiavi, A.; Patera, V.; Pioli, S.; Battistoni, G.; Ciocca, M.; Mairani, A.; Magro, G.; Molinelli, S.

    2017-10-01

    In the context of particle therapy, a crucial role is played by Treatment Planning Systems (TPSs), tools aimed at computing and optimizing the treatment plan. Nowadays one of the major issues related to TPSs in particle therapy is the large CPU time needed. We developed a software toolkit (FRED) for reducing dose recalculation time by exploiting Graphics Processing Unit (GPU) hardware. Thanks to their high parallelization capability, GPUs significantly reduce the computation time, by up to a factor of 100 with respect to standard software running on a CPU. The transport of proton beams in the patient is accurately described through Monte Carlo methods. The physical processes reproduced are multiple Coulomb scattering, energy straggling and nuclear interactions of protons with the main nuclei composing biological tissues. The FRED toolkit does not rely on a water-equivalent translation of tissues, but exploits the Computed Tomography anatomical information by reconstructing and simulating the atomic composition of each crossed tissue. FRED can be used as an efficient tool for dose recalculation on the day of the treatment: in about one minute on standard hardware, it can provide the dose map obtained by combining the treatment plan, earlier computed by the TPS, with the current patient anatomic arrangement.
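
    The HU-to-composition step that distinguishes this approach from water-equivalent scaling can be sketched as a lookup from CT number to a material definition. The thresholds, densities and mass fractions below are illustrative placeholders in the spirit of a Schneider-type calibration, not FRED's actual tables.

        import bisect

        # Interval boundaries in Hounsfield units (illustrative).
        HU_EDGES = [-1000, -950, -120, -20, 150, 2000]
        TISSUES = [  # (name, density g/cm3, elemental mass fractions)
            ("air",     0.0012, {"N": 0.76, "O": 0.23}),
            ("lung",    0.26,   {"H": 0.10, "C": 0.10, "O": 0.75}),
            ("adipose", 0.95,   {"H": 0.11, "C": 0.60, "O": 0.28}),
            ("soft",    1.05,   {"H": 0.10, "C": 0.14, "O": 0.71}),
            ("bone",    1.60,   {"H": 0.06, "C": 0.28, "O": 0.41, "Ca": 0.15}),
        ]

        def voxel_material(hu):
            """Map a CT number to (name, density, composition)."""
            i = min(bisect.bisect_right(HU_EDGES, hu) - 1, len(TISSUES) - 1)
            return TISSUES[max(i, 0)]

        print(voxel_material(-800))  # -> lung-like tissue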

  3. Development of a Space Radiation Monte-Carlo Computer Simulation Based on the FLUKA and ROOT Codes

    NASA Technical Reports Server (NTRS)

    Pinsky, L. S.; Wilson, T. L.; Ferrari, A.; Sala, Paola; Carminati, F.; Brun, R.

    2001-01-01

    The radiation environment in space is a complex problem to model. Trying to extrapolate the projections of that environment into all areas of the internal spacecraft geometry is even more daunting. With the support of our CERN colleagues, our research group in Houston is embarking on a project to develop a radiation transport tool that is tailored to the problem of taking the external radiation flux incident on any particular spacecraft and simulating the evolution of that flux through a geometrically accurate model of the spacecraft material. The output will be a prediction of the detailed nature of the resulting internal radiation environment within the spacecraft as well as its secondary albedo. Beyond doing the physics transport of the incident flux, the software tool we are developing will provide a self-contained stand-alone object-oriented analysis and visualization infrastructure. It will also include a graphical user interface and a set of input tools to facilitate the simulation of space missions in terms of nominal radiation models and mission trajectory profiles. The goal of this project is to produce a code that is considerably more accurate and user-friendly than existing Monte-Carlo-based tools for the evaluation of the space radiation environment. Furthermore, the code will be an essential complement to the currently existing analytic codes in the BRYNTRN/HZETRN family for the evaluation of radiation shielding. The code will be directly applicable to the simulation of environments in low earth orbit, on the lunar surface, on planetary surfaces (including the Earth) and in the interplanetary medium such as on a transit to Mars (and even in the interstellar medium). The software will include modules whose underlying physics base can continue to be enhanced and updated for physics content, as future data become available beyond the timeframe of the initial development now foreseen. This future maintenance will be available from the authors of FLUKA as part of their continuing efforts to support the users of the FLUKA code within the particle physics community. In keeping with the spirit of developing an evolving physics code, we are planning as part of this project, to participate in the efforts to validate the core FLUKA physics in ground-based accelerator test runs. The emphasis of these test runs will be the physics of greatest interest in the simulation of the space radiation environment. Such a tool will be of great value to planners, designers and operators of future space missions, as well as for the design of the vehicles and habitats to be used on such missions. It will also be of aid to future experiments of various kinds that may be affected at some level by the ambient radiation environment, or in the analysis of hybrid experiment designs that have been discussed for space-based astronomy and astrophysics. The tool will be of value to the Life Sciences personnel involved in the prediction and measurement of radiation doses experienced by the crewmembers on such missions. In addition, the tool will be of great use to the planners of experiments to measure and evaluate the space radiation environment itself. It can likewise be useful in the analysis of safe havens, hazard migration plans, and NASA's call for new research in composites and to NASA engineers modeling the radiation exposure of electronic circuits. 
This code will provide an important complementary check on the predictions of analytic codes such as BRYNTRN/HZETRN that are presently used for many similar applications, and which have shortcomings that are more easily overcome with Monte Carlo type simulations. Finally, it is acknowledged that there are similar efforts based around the use of the GEANT4 Monte-Carlo transport code currently under development at CERN. It is our intention to make our software modular and sufficiently flexible to allow the parallel use of either FLUKA or GEANT4 as the physics transport engine.

  4. GATE Monte Carlo simulation of GE Discovery 600 and a uniformity phantom

    NASA Astrophysics Data System (ADS)

    Sheen, Heesoon; Im, Ki Chun; Choi, Yong; Shin, Hanback; Han, Youngyih; Chung, Kwangzoo; Cho, Junsang; Ahn, Sang Hee

    2014-12-01

    GATE (Geant4 Application for Tomographic Emission) Monte Carlo simulations have been successful in emission tomography applications for precise modeling of various physical processes. Most previous studies on Monte Carlo simulations have only involved performance assessments using virtual phantoms. Although that allows the performance of a simulated positron emission tomography (PET) system to be evaluated, it does not reflect practical conditions, which limits GATE simulations of real situations. To overcome this limitation and to enable simulation research relevant to clinically important issues, we conducted a GATE simulation using real data from a scanner rather than a virtual phantom and evaluated the scanner's performance. For that purpose, the system and geometry of a commercial GE PET/CT (computed tomography) scanner, the BGO-based Discovery 600 (D600), were modeled for the first time. The performance of the modeled PET system was evaluated using the National Electrical Manufacturers Association (NEMA) NU 2-2007 protocols, and the results were compared with reference data. The sensitivity, scatter fraction, noise-equivalent count rate (NECR), and resolution were estimated using the NEMA NU 2-2007 protocol. Sensitivities were 9.01 cps/kBq at 0 cm and 9.43 cps/kBq at 10 cm. The scatter fraction was 39.5%. The NECR peak was 89.7 kcps at 14.7 kBq/cc. Resolutions were 4.8 mm in the transaxial plane and 5.9 mm in the axial plane at 1 cm, and 6.2 mm in the transaxial plane and 6.4 mm in the axial plane at 10 cm. The resolutions exceeded the limiting values provided by the manufacturer. The uniformity phantom was simulated using the CT and PET data. The output data in ROOT format were converted and then reconstructed using a C program and STIR (Software for Tomographic Image Reconstruction). The reconstructed images of the simulated uniformity phantom data had comparable quality, even though further improvement is still required. In conclusion, we have demonstrated a successful simulation of a PET system using scanned data. In future studies, parameters that alter the imaging conditions, such as patient movement and physiological change, need to be studied.
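
    For reference, the two count-rate figures of merit quoted above follow directly from the trues (T), scatters (S) and randoms (R) rates: the scatter fraction is S/(T+S) and the NECR is T²/(T+S+R). A minimal sketch with made-up input rates, not the Discovery 600 values:

        # NEMA NU 2 count-rate quantities from (illustrative) event rates.
        def nema_metrics(trues, scatters, randoms):
            scatter_fraction = scatters / (trues + scatters)
            necr = trues**2 / (trues + scatters + randoms)
            return scatter_fraction, necr

        sf, necr = nema_metrics(trues=120e3, scatters=78e3, randoms=40e3)
        print(f"SF = {100 * sf:.1f}%, NECR = {necr / 1e3:.1f} kcps")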

  5. Development of the ARISTOTLE webware for cloud-based rarefied gas flow modeling

    NASA Astrophysics Data System (ADS)

    Deschenes, Timothy R.; Grot, Jonathan; Cline, Jason A.

    2016-11-01

    Rarefied gas dynamics are important for a wide variety of applications. Improving the ability of general users to predict these gas flows will enable optimization of current processes and discovery of future ones. Despite this potential, most rarefied simulation software is designed by and for experts in the community. This has resulted in low adoption of the methods outside of the immediate RGD community. This paper outlines an ongoing effort to create a rarefied gas dynamics simulation tool that can be used by a general audience. The tool leverages a direct simulation Monte Carlo (DSMC) library that is available to the entire community and a web-based simulation process that will enable all users to take advantage of high performance computing capabilities. First, the DSMC library and simulation architecture are described. Then the DSMC library is used to predict a number of representative transient gas flows that are applicable to the rarefied gas dynamics community. The paper closes with a summary and future directions.

  6. Monte Carlo simulations in X-ray imaging

    NASA Astrophysics Data System (ADS)

    Giersch, Jürgen; Durst, Jürgen

    2008-06-01

    Monte Carlo simulations have become crucial tools in many fields of X-ray imaging. They help to understand the influence of physical effects such as absorption, scattering and fluorescence of photons in different detector materials on image quality parameters. They allow studying new imaging concepts like photon counting, energy weighting or material reconstruction. Additionally, they can be applied to the field of nuclear medicine to define virtual setups studying new geometries or image reconstruction algorithms. Furthermore, an implementation of the propagation physics of electrons and photons allows studying the behavior of (novel) X-ray generation concepts. This versatility of Monte Carlo simulations is illustrated with some examples produced with the Monte Carlo simulation tool ROSI. An overview of the structure of ROSI is given as an example of a modern, well-proven, object-oriented, parallel-computing Monte Carlo simulation for X-ray imaging.

  7. Quantum Monte Carlo: Faster, More Reliable, And More Accurate

    NASA Astrophysics Data System (ADS)

    Anderson, Amos Gerald

    2010-06-01

    The Schrodinger Equation has been available for about 83 years, but today we still strain to apply it accurately to molecules of interest. The difficulty is not theoretical in nature, but practical, since we're held back by a lack of sufficient computing power. Consequently, effort is applied to find acceptable approximations to facilitate real-time solutions. In the meantime, computer technology has begun rapidly advancing and changing the way we think about efficient algorithms. For those who can reorganize their formulas to take advantage of these changes and thereby lift some approximations, incredible new opportunities await. Over the last decade, we've seen the emergence of a new kind of computer processor, the graphics card. Designed to accelerate computer games by optimizing for processor quantity instead of quality, graphics cards have become of sufficient quality to be useful to some scientists. In this thesis, we explore the first known application of graphics cards to computational chemistry by rewriting our Quantum Monte Carlo software into the requisite "data parallel" formalism. We find that, notwithstanding precision considerations, we are able to speed up our software by about a factor of 6. The success of a Quantum Monte Carlo calculation depends on more than just processing power. It also requires the scientist to carefully design the trial wavefunction used to guide simulated electrons. We have studied the use of Generalized Valence Bond wavefunctions to simply, yet effectively, capture the essential static correlation in atoms and molecules. Furthermore, we have developed significantly improved two-particle correlation functions, designed with both flexibility and simplicity in mind, representing an effective and reliable way to add the necessary dynamic correlation. Lastly, we present our method for stabilizing the statistical nature of the calculation by manipulating configuration weights, thus facilitating efficient and robust calculations. Our combination of Generalized Valence Bond wavefunctions, improved correlation functions, and stabilized weighting techniques for calculations run on graphics cards represents a new way of using Quantum Monte Carlo to study arbitrarily sized molecules.

  8. Rapid Monte Carlo Simulation of Gravitational Wave Galaxies

    NASA Astrophysics Data System (ADS)

    Breivik, Katelyn; Larson, Shane L.

    2015-01-01

    With the detection of gravitational waves on the horizon, astrophysical catalogs produced by gravitational wave observatories can be used to characterize the populations of sources and validate different galactic population models. Efforts to simulate gravitational wave catalogs and source populations generally focus on population synthesis models that require extensive time and computational power to produce a single simulated galaxy. Monte Carlo simulations of gravitational wave source populations can also be used to generate observation catalogs from the gravitational wave source population. Monte Carlo simulations have the advantages of flexibility and speed, enabling rapid galactic realizations as a function of galactic binary parameters with less time and fewer computational resources required. We present a Monte Carlo method for rapid galactic simulations of gravitational wave binary populations.
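
    A toy version of such a Monte Carlo galactic realization: draw binary parameters from assumed distributions and count the systems that land in a hypothetical detector band. Every distribution and cut below is an illustrative placeholder, not the authors' model.

        import numpy as np

        rng = np.random.default_rng(7)
        n = 100_000

        m1 = rng.uniform(0.2, 1.2, n)                   # primary mass [Msun]
        m2 = m1 * rng.uniform(0.2, 1.0, n)              # secondary mass [Msun]
        log_p = rng.normal(loc=3.0, scale=1.0, size=n)  # log10 orbital period [s]
        f_gw = 2.0 / 10**log_p                          # GW frequency of a circular binary [Hz]

        in_band = (f_gw > 1e-4) & (f_gw < 1e-1)         # hypothetical band
        m_chirp = (m1 * m2)**0.6 / (m1 + m2)**0.2       # chirp mass [Msun]
        print(f"{in_band.sum()} of {n} binaries in band, "
              f"median chirp mass {np.median(m_chirp[in_band]):.2f} Msun")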

  9. Implementation of Monte Carlo Dose calculation for CyberKnife treatment planning

    NASA Astrophysics Data System (ADS)

    Ma, C.-M.; Li, J. S.; Deng, J.; Fan, J.

    2008-02-01

    Accurate dose calculation is essential to advanced stereotactic radiosurgery (SRS) and stereotactic radiotherapy (SRT), especially for treatment planning involving heterogeneous patient anatomy. This paper describes the implementation of a fast Monte Carlo dose calculation algorithm in SRS/SRT treatment planning for the CyberKnife® SRS/SRT system. A superposition Monte Carlo algorithm is developed for this application. Photon mean free paths and interaction types for different materials and energies, as well as the tracks of secondary electrons, are pre-simulated using the MCSIM system. Photon interaction forcing and splitting are applied to the source photons in the patient calculation, and the pre-simulated electron tracks are repeated with proper corrections based on the tissue density and electron stopping powers. Electron energy is deposited along the tracks and accumulated in the simulation geometry. Scattered and bremsstrahlung photons are transported, after applying the Russian roulette technique, in the same way as the primary photons. Dose calculations are compared with full Monte Carlo simulations performed using EGS4/MCSIM and with the CyberKnife treatment planning system (TPS) for lung, head and neck, and liver treatments. Comparisons with full Monte Carlo simulations show excellent agreement (within 0.5%). Differences of more than 10% in the target dose are found between Monte Carlo simulations and the CyberKnife TPS for SRS/SRT lung treatment, while negligible differences are shown in head and neck and liver for the cases investigated. The calculation time using our superposition Monte Carlo algorithm is reduced by a factor of up to 62 (46 on average for 10 typical clinical cases) compared to full Monte Carlo simulations. SRS/SRT dose distributions calculated by simple dose algorithms may be significantly overestimated for small lung target volumes, which can be improved by accurate Monte Carlo dose calculations.
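
    Of the variance-reduction techniques mentioned, Russian roulette is the simplest to sketch: a low-weight photon is killed with some probability, and survivors have their weight boosted so the expected value is unchanged. The threshold and survival probability below are illustrative, not the values used in MCSIM.

        import random

        def russian_roulette(weight, w_min=0.01, survival=0.1):
            """Return the photon's new weight, or None if it is killed."""
            if weight >= w_min:
                return weight             # above threshold: leave it alone
            if random.random() < survival:
                return weight / survival  # survivor absorbs the killed weight (unbiased)
            return None                   # photon terminated

        random.seed(0)
        print([russian_roulette(0.004) for _ in range(5)])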

  10. Particle behavior simulation in thermophoresis phenomena by direct simulation Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Wada, Takao

    2014-07-01

    Particle motion including the thermophoretic force is simulated using the direct simulation Monte Carlo (DSMC) method. Thermophoresis phenomena, which occur for particle sizes around 1 μm, are treated in this paper. The problem in thermophoresis simulation is the computation time, which is proportional to the collision frequency; the time step interval becomes very small when the motion of a large particle is considered. Thermophoretic forces calculated by the DSMC method have been reported, but the particle motion was not computed because of the small time step interval. In this paper, a molecule-particle collision model, which computes the collision between a particle and multiple molecules in a single collision event, is considered. The momentum transfer to the particle is computed with a collision weight factor, where the collision weight factor is the number of molecules colliding with the particle in one collision event. A large time step interval is made possible by the collision weight factor: it is about a million times longer than the conventional DSMC time step interval when the particle size is 1 μm, so the computation time becomes about one-millionth. We simulate graphite particle motion under the thermophoretic force using DSMC-Neutrals (Particle-PLUS neutral module), commercial software adopting the DSMC method, with the collision weight factor described above. The particle is a sphere 1 μm in size, and particle-particle collisions are ignored. We compute the thermophoretic forces in Ar and H2 gases over a pressure range from 0.1 to 100 mTorr. The results agree well with Gallis' analytical results; note that Gallis' analytical result in the continuum limit is the same as Waldmann's result.
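
    The collision-weight-factor idea reduces to scaling the momentum kick of one simulated collision event by the number of real molecules it represents. A one-dimensional sketch with illustrative numbers (not the paper's parameters):

        # One weighted collision event stands in for W real molecule hits.
        m_gas = 6.6e-26   # mass of an Ar molecule [kg]
        m_p = 1.1e-15     # mass of a 1 um graphite sphere [kg] (approximate)
        W = 1.0e6         # collision weight factor [molecules per event]

        def apply_collision(v_particle, v_rel):
            """Update the particle velocity for one weighted event (1D)."""
            dp = W * m_gas * v_rel   # momentum delivered by W molecules [kg m/s]
            return v_particle + dp / m_p

        print(apply_collision(0.0, 400.0))  # recoil after one event [m/s]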

  11. The MGDO software library for data analysis in Ge neutrinoless double-beta decay experiments

    NASA Astrophysics Data System (ADS)

    Agostini, M.; Detwiler, J. A.; Finnerty, P.; Kröninger, K.; Lenz, D.; Liu, J.; Marino, M. G.; Martin, R.; Nguyen, K. D.; Pandola, L.; Schubert, A. G.; Volynets, O.; Zavarise, P.

    2012-07-01

    The Gerda and Majorana experiments will search for neutrinoless double-beta decay of 76Ge using isotopically enriched high-purity germanium detectors. Although the experiments differ in conceptual design, they share many aspects in common, and in particular will employ similar data analysis techniques. The collaborations are jointly developing a C++ software library, MGDO, which contains a set of data objects and interfaces to encapsulate, store and manage physical quantities of interest, such as waveforms and high-purity germanium detector geometries. These data objects define a common format for persistent data, whether it is generated by Monte Carlo simulations or an experimental apparatus, to reduce code duplication and to ease the exchange of information between detector systems. MGDO also includes general-purpose analysis tools that can be used for the processing of measured or simulated digital signals. The MGDO design is based on the Object-Oriented programming paradigm and is very flexible, allowing for easy extension and customization of the components. The tools provided by the MGDO libraries are used by both Gerda and Majorana.

  12. Efficient simulation of voxelized phantom in GATE with embedded SimSET multiple photon history generator.

    PubMed

    Lin, Hsin-Hon; Chuang, Keh-Shih; Lin, Yi-Hsing; Ni, Yu-Ching; Wu, Jay; Jan, Meei-Ling

    2014-10-21

    GEANT4 Application for Tomographic Emission (GATE) is a powerful Monte Carlo simulator that combines the advantages of the general-purpose GEANT4 simulation code and the specific software tool implementations dedicated to emission tomography. However, the detailed physical modelling of GEANT4 is highly computationally demanding, especially when tracking particles through voxelized phantoms. To circumvent the relatively slow simulation of voxelized phantoms in GATE, another efficient Monte Carlo code can be used to simulate photon interactions and transport inside a voxelized phantom. The simulation system for emission tomography (SimSET), a dedicated Monte Carlo code for PET/SPECT systems, is well-known for its efficiency in simulation of voxel-based objects. An efficient Monte Carlo workflow integrating GATE and SimSET for simulating pinhole SPECT has been proposed to improve voxelized phantom simulation. Although the workflow achieves a desirable increase in speed, it sacrifices the ability to simulate decaying radioactive sources such as non-pure positron emitters or multiple emission isotopes with complex decay schemes and lacks the modelling of time-dependent processes due to the inherent limitations of the SimSET photon history generator (PHG). Moreover, a large volume of disk storage is needed to store the huge temporal photon history file produced by SimSET that must be transported to GATE. In this work, we developed a multiple photon emission history generator (MPHG) based on SimSET/PHG to support a majority of the medically important positron emitters. We incorporated the new generator codes inside GATE to improve the simulation efficiency of voxelized phantoms in GATE, while eliminating the need for the temporal photon history file. The validation of this new code based on a MicroPET R4 system was conducted for (124)I and (18)F with mouse-like and rat-like phantoms. Comparison of GATE/MPHG with GATE/GEANT4 indicated there is a slight difference in energy spectra for energy below 50 keV due to the lack of x-ray simulation from (124)I decay in the new code. The spatial resolution, scatter fraction and count rate performance are in good agreement between the two codes. For the case studies of (18)F-NaF ((124)I-IAZG) using MOBY phantom with 1  ×  1 × 1 mm(3) voxel sizes, the results show that GATE/MPHG can achieve acceleration factors of approximately 3.1 × (4.5 ×), 6.5 × (10.7 ×) and 9.5 × (31.0 ×) compared with GATE using the regular navigation method, the compressed voxel method and the parameterized tracking technique, respectively. In conclusion, the implementation of MPHG in GATE allows for improved efficiency of voxelized phantom simulations and is suitable for studying clinical and preclinical imaging.

  13. Accuracy of Monte Carlo simulations compared to in-vivo MDCT dosimetry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bostani, Maryam, E-mail: mbostani@mednet.ucla.edu; McMillan, Kyle; Cagnon, Chris H.

    Purpose: The purpose of this study was to assess the accuracy of a Monte Carlo simulation-based method for estimating radiation dose from multidetector computed tomography (MDCT) by comparing simulated doses in ten patients to in-vivo dose measurements. Methods: MD Anderson Cancer Center Institutional Review Board approved the acquisition of in-vivo rectal dose measurements in a pilot study of ten patients undergoing virtual colonoscopy. The dose measurements were obtained by affixing TLD capsules to the inner lumen of rectal catheters. Voxelized patient models were generated from the MDCT images of the ten patients, and the dose to the TLD for all exposures was estimated using Monte Carlo based simulations. The Monte Carlo simulation results were compared to the in-vivo dose measurements to determine accuracy. Results: The calculated mean percent difference between TLD measurements and Monte Carlo simulations was −4.9% with standard deviation of 8.7% and a range of −22.7% to 5.7%. Conclusions: The results of this study demonstrate very good agreement between simulated and measured doses in-vivo. Taken together with previous validation efforts, this work demonstrates that the Monte Carlo simulation methods can provide accurate estimates of radiation dose in patients undergoing CT examinations.

  14. Validation of the Monte Carlo simulator GATE for indium-111 imaging.

    PubMed

    Assié, K; Gardin, I; Véra, P; Buvat, I

    2005-07-07

    Monte Carlo simulations are useful for optimizing and assessing single photon emission computed tomography (SPECT) protocols, especially when aiming at measuring quantitative parameters from SPECT images. Before Monte Carlo simulated data can be trusted, the simulation model must be validated. The purpose of this work was to validate the use of GATE, a new Monte Carlo simulation platform based on GEANT4, for modelling indium-111 SPECT data, the quantification of which is of foremost importance for dosimetric studies. To that end, acquisitions of (111)In line sources in air and in water and of a cylindrical phantom were performed, together with the corresponding simulations. The simulation model included Monte Carlo modelling of the camera collimator and of a back-compartment accounting for photomultiplier tubes and associated electronics. Energy spectra, spatial resolution, sensitivity values, images and count profiles obtained for experimental and simulated data were compared. An excellent agreement was found between experimental and simulated energy spectra. For source-to-collimator distances varying from 0 to 20 cm, simulated and experimental spatial resolution differed by less than 2% in air, while the simulated sensitivity values were within 4% of the experimental values. The simulation of the cylindrical phantom closely reproduced the experimental data. These results suggest that GATE enables accurate simulation of (111)In SPECT acquisitions.

  15. Geant4 simulations of a wide-angle x-ray focusing telescope

    NASA Astrophysics Data System (ADS)

    Zhao, Donghua; Zhang, Chen; Yuan, Weimin; Zhang, Shuangnan; Willingale, Richard; Ling, Zhixing

    2017-06-01

    The rapid development of X-ray astronomy has been made possible by widely deploying X-ray focusing telescopes on board many X-ray satellites. Geant4 is a very powerful toolkit for Monte Carlo simulations and has remarkable abilities to model complex geometrical configurations. However, the library of physical processes available in Geant4 lacks a description of the reflection of X-ray photons at a grazing incident angle which is the core physical process in the simulation of X-ray focusing telescopes. The scattering of low-energy charged particles from the mirror surfaces is another noteworthy process which is not yet incorporated into Geant4. Here we describe a Monte Carlo model of a simplified wide-angle X-ray focusing telescope adopting lobster-eye optics and a silicon detector using the Geant4 toolkit. With this model, we simulate the X-ray tracing, proton scattering and background detection. We find that: (1) the effective area obtained using Geant4 is in agreement with that obtained using Q software with an average difference of less than 3%; (2) X-rays are the dominant background source below 10 keV; (3) the sensitivity of the telescope is better by at least one order of magnitude than that of a coded mask telescope with the same physical dimensions; (4) the number of protons passing through the optics and reaching the detector by Firsov scattering is about 2.5 times that of multiple scattering for the lobster-eye telescope.

  16. Calculation of dose distribution in compressible breast tissues using finite element modeling, Monte Carlo simulation and thermoluminescence dosimeters

    NASA Astrophysics Data System (ADS)

    Mohammadyari, Parvin; Faghihi, Reza; Mosleh-Shirazi, Mohammad Amin; Lotfi, Mehrzad; Rahim Hematiyan, Mohammad; Koontz, Craig; Meigooni, Ali S.

    2015-12-01

    Compression is a technique to immobilize the target or improve the dose distribution within the treatment volume during different irradiation techniques such as AccuBoost® brachytherapy. However, there is no systematic method for determination of dose distribution for uncompressed tissue after irradiation under compression. In this study, the mechanical behavior of breast tissue between compressed and uncompressed states was investigated. With that, a novel method was developed to determine the dose distribution in uncompressed tissue after irradiation of compressed breast tissue. Dosimetry was performed using two different methods, namely, Monte Carlo simulations using the MCNP5 code and measurements using thermoluminescent dosimeters (TLD). The displacement of the breast elements was simulated using a finite element model and calculated using ABAQUS software. From these results, the 3D dose distribution in uncompressed tissue was determined. The geometry of the model was constructed from magnetic resonance images of six different women volunteers. The mechanical properties were modeled by using the Mooney-Rivlin hyperelastic material model. Experimental dosimetry was performed by placing the TLD chips into the polyvinyl alcohol breast equivalent phantom. The nodal displacements due to the gravitational force and the 60 Newton compression force (43% contraction in the loading direction and 37% expansion in the orthogonal direction) were determined. Finally, a comparison of the experimental data and the simulated data showed agreement within 11.5%  ±  5.9%.

  17. Calculation of dose distribution in compressible breast tissues using finite element modeling, Monte Carlo simulation and thermoluminescence dosimeters.

    PubMed

    Mohammadyari, Parvin; Faghihi, Reza; Mosleh-Shirazi, Mohammad Amin; Lotfi, Mehrzad; Hematiyan, Mohammad Rahim; Koontz, Craig; Meigooni, Ali S

    2015-12-07

    Compression is a technique to immobilize the target or improve the dose distribution within the treatment volume during different irradiation techniques such as AccuBoost(®) brachytherapy. However, there is no systematic method for determination of dose distribution for uncompressed tissue after irradiation under compression. In this study, the mechanical behavior of breast tissue between compressed and uncompressed states was investigated. With that, a novel method was developed to determine the dose distribution in uncompressed tissue after irradiation of compressed breast tissue. Dosimetry was performed using two different methods, namely, Monte Carlo simulations using the MCNP5 code and measurements using thermoluminescent dosimeters (TLD). The displacement of the breast elements was simulated using a finite element model and calculated using ABAQUS software. From these results, the 3D dose distribution in uncompressed tissue was determined. The geometry of the model was constructed from magnetic resonance images of six different women volunteers. The mechanical properties were modeled by using the Mooney-Rivlin hyperelastic material model. Experimental dosimetry was performed by placing the TLD chips into the polyvinyl alcohol breast equivalent phantom. The nodal displacements due to the gravitational force and the 60 Newton compression force (43% contraction in the loading direction and 37% expansion in the orthogonal direction) were determined. Finally, a comparison of the experimental data and the simulated data showed agreement within 11.5%  ±  5.9%.

  18. Application of stochastic approach based on Monte Carlo (MC) simulation for life cycle inventory (LCI) to the steel process chain: case study.

    PubMed

    Bieda, Bogusław

    2014-05-15

    The purpose of the paper is to present the results of applying a stochastic approach based on Monte Carlo (MC) simulation to life cycle inventory (LCI) data of the Mittal Steel Poland (MSP) complex in Kraków, Poland. In order to assess the uncertainty, the software Crystal Ball® (CB), which is associated with a Microsoft® Excel spreadsheet model, is used. The framework of the study was originally carried out for 2005. The total production of steel, coke, pig iron, sinter, slabs from continuous steel casting (CSC), sheets from the hot rolling mill (HRM) and blast furnace gas, collected in 2005 from MSP, was analyzed and used for MC simulation of the LCI model. In order to describe the random nature of all main products used in this study, a normal distribution has been applied. The results of the simulation (10,000 trials) performed with CB consist of frequency charts and statistical reports. The results of this study can be used as the first step in performing a full LCA analysis in the steel industry. Further, it is concluded that the stochastic approach is a powerful method for quantifying parameter uncertainty in LCA/LCI studies and can be applied throughout the steel industry. The results obtained from this study can help practitioners and decision-makers in steel production management. Copyright © 2013 Elsevier B.V. All rights reserved.
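
    The core of such a stochastic LCI study is straightforward to sketch: draw each inventory input from its assumed normal distribution, propagate the draws through the model, and read off summary statistics. The inputs and the derived quantity below are placeholders, not MSP's production data.

        import numpy as np

        rng = np.random.default_rng(42)
        trials = 10_000

        # Illustrative annual production inputs as (mean, sd) in tonnes.
        inputs = {
            "coke":  (1_500_000, 75_000),
            "steel": (5_000_000, 250_000),
        }
        draws = {k: rng.normal(mu, sd, trials) for k, (mu, sd) in inputs.items()}

        # Example derived LCI quantity: coke consumed per tonne of steel.
        rate = draws["coke"] / draws["steel"]
        lo, hi = np.percentile(rate, [2.5, 97.5])
        print(f"coke rate: mean {rate.mean():.3f} t/t, 95% interval [{lo:.3f}, {hi:.3f}]")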

  19. Quasi-Monte Carlo Methods Applied to Tau-Leaping in Stochastic Biological Systems.

    PubMed

    Beentjes, Casper H L; Baker, Ruth E

    2018-05-25

    Quasi-Monte Carlo methods have proven to be effective extensions of traditional Monte Carlo methods in, amongst others, problems of quadrature and the sample path simulation of stochastic differential equations. By replacing the random number input stream in a simulation procedure by a low-discrepancy number input stream, variance reductions of several orders have been observed in financial applications. Analysis of stochastic effects in well-mixed chemical reaction networks often relies on sample path simulation using Monte Carlo methods, even though these methods suffer from typical slow O(1/√N) convergence rates as a function of the number of sample paths N. This paper investigates the combination of (randomised) quasi-Monte Carlo methods with an efficient sample path simulation procedure, namely τ-leaping. We show that this combination is often more effective than traditional Monte Carlo simulation in terms of the decay of statistical errors. The observed convergence rate behaviour is, however, non-trivial due to the discrete nature of the models of chemical reactions. We explain how this affects the performance of quasi-Monte Carlo methods by looking at a test problem in standard quadrature.
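
    The MC-versus-QMC contrast in the quadrature setting can be reproduced in a few lines by swapping a pseudorandom input stream for a scrambled Sobol stream; the integrand here is an arbitrary smooth example, not one of the paper's test problems.

        import numpy as np
        from scipy.stats import qmc

        rng = np.random.default_rng(0)
        f = lambda x: np.exp(x)   # integrand on [0, 1]; exact integral is e - 1
        exact = np.e - 1.0

        for n in (2**8, 2**12):
            mc_err = abs(f(rng.random(n)).mean() - exact)
            sobol = qmc.Sobol(d=1, scramble=True, seed=0).random(n).ravel()
            qmc_err = abs(f(sobol).mean() - exact)
            print(f"n={n:5d}  MC error={mc_err:.2e}  QMC error={qmc_err:.2e}")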

  20. Refined lateral energy correction functions for the KASCADE-Grande experiment based on Geant4 simulations

    NASA Astrophysics Data System (ADS)

    Gherghel-Lascu, A.; Apel, W. D.; Arteaga-Velázquez, J. C.; Bekk, K.; Bertaina, M.; Blümer, J.; Bozdog, H.; Brancus, I. M.; Cantoni, E.; Chiavassa, A.; Cossavella, F.; Daumiller, K.; de Souza, V.; Di Pierro, F.; Doll, P.; Engel, R.; Engler, J.; Fuchs, B.; Fuhrmann, D.; Gils, H. J.; Glasstetter, R.; Grupen, C.; Haungs, A.; Heck, D.; Hörandel, J. R.; Huber, D.; Huege, T.; Kampert, K.-H.; Kang, D.; Klages, H. O.; Link, K.; Łuczak, P.; Mathes, H. J.; Mayer, H. J.; Milke, J.; Mitrica, B.; Morello, C.; Oehlschläger, J.; Ostapchenko, S.; Palmieri, N.; Petcu, M.; Pierog, T.; Rebel, H.; Roth, M.; Schieler, H.; Schoo, S.; Schröder, F. G.; Sima, O.; Toma, G.; Trinchero, G. C.; Ulrich, H.; Weindl, A.; Wochele, J.; Zabierowski, J.

    2015-02-01

    In previous studies of KASCADE-Grande data, a Monte Carlo simulation code based on the GEANT3 program has been developed to describe the energy deposited by EAS particles in the detector stations. In an attempt to decrease the simulation time and ensure compatibility with the geometry description in standard KASCADE-Grande analysis software, several structural elements have been neglected in the implementation of the Grande station geometry. To improve the agreement between experimental and simulated data, a more accurate simulation of the response of the KASCADE-Grande detector is necessary. A new simulation code has been developed based on the GEANT4 program, including a realistic geometry of the detector station with structural elements that have not been considered in previous studies. The new code is used to study the influence of a realistic detector geometry on the energy deposited in the Grande detector stations by particles from EAS events simulated by CORSIKA. Lateral Energy Correction Functions are determined and compared with previous results based on GEANT3.

  1. WE-H-BRA-04: Biological Geometries for the Monte Carlo Simulation Toolkit TOPAS-nBio

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McNamara, A; Held, K; Paganetti, H

    2016-06-15

    Purpose: New advances in radiation therapy are most likely to come from the complex interface of physics, chemistry and biology. Computational simulations offer a powerful tool for quantitatively investigating radiation interactions with biological tissue and can thus help bridge the gap between physics and biology. The aim of TOPAS-nBio is to provide a comprehensive tool to generate advanced radiobiology simulations. Methods: TOPAS wraps and extends the Geant4 Monte Carlo (MC) simulation toolkit. TOPAS-nBio is an extension to TOPAS which utilizes the physics processes in Geant4-DNA to model biological damage from very low energy secondary electrons. Specialized cell, organelle and molecular geometries were designed for the toolkit. Results: TOPAS-nBio gives the user the capability of simulating biological geometries, ranging from the micron scale (e.g. cells and organelles) to complex nano-scale geometries (e.g. DNA and proteins). The user interacts with TOPAS-nBio through easy-to-use input parameter files. For example, in a simple cell simulation the user can specify the cell type and size as well as the type, number and size of included organelles. For more detailed nuclear simulations, the user can specify chromosome territories containing chromatin fiber loops, the latter comprising nucleosomes on a double helix. The chromatin fibers can be arranged in simple rigid geometries or within fractal globules, mimicking realistic chromosome territories. TOPAS-nBio also provides users with the capability of reading protein data bank 3D structural files to simulate radiation damage to proteins or nucleic acids, e.g. histones or RNA. TOPAS-nBio has been validated by comparing results to other track structure simulation software and published experimental measurements. Conclusion: TOPAS-nBio provides users with a comprehensive MC simulation tool for radiobiological simulations, giving users without advanced programming skills the ability to design and run complex simulations.

  2. BIM-Sim: Interactive Simulation of Broadband Imaging Using Mie Theory

    PubMed Central

    Berisha, Sebastian; van Dijk, Thomas; Bhargava, Rohit; Carney, P. Scott; Mayerich, David

    2017-01-01

    Understanding the structure of a scattered electromagnetic (EM) field is critical to improving the imaging process. Mechanisms such as diffraction, scattering, and interference affect an image, limiting the resolution, and potentially introducing artifacts. Simulation and visualization of scattered fields thus plays an important role in imaging science. However, EM fields are high-dimensional, making them time-consuming to simulate, and difficult to visualize. In this paper, we present a framework for interactively computing and visualizing EM fields scattered by micro and nano-particles. Our software uses graphics hardware for evaluating the field both inside and outside of these particles. We then use Monte-Carlo sampling to reconstruct and visualize the three-dimensional structure of the field, spectral profiles at individual points, the structure of the field at the surface of the particle, and the resulting image produced by an optical system. PMID:29170738

  3. A parallel computational model for GATE simulations.

    PubMed

    Rannou, F R; Vega-Acevedo, N; El Bitar, Z

    2013-12-01

    GATE/Geant4 Monte Carlo simulations are computationally demanding applications, requiring thousands of processor hours to produce realistic results. The classical strategy of distributing the simulation of individual events does not apply efficiently for Positron Emission Tomography (PET) experiments, because it requires a centralized coincidence processing and large communication overheads. We propose a parallel computational model for GATE that handles event generation and coincidence processing in a simple and efficient way by decentralizing event generation and processing but maintaining a centralized event and time coordinator. The model is implemented with the inclusion of a new set of factory classes that can run the same executable in sequential or parallel mode. A Mann-Whitney test shows that the output produced by this parallel model in terms of number of tallies is equivalent (but not equal) to its sequential counterpart. Computational performance evaluation shows that the software is scalable and well balanced. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
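
    The decentralized-generation/centralized-coordination idea can be sketched independently of GATE: each worker produces time-stamped singles on its own, and a coordinator merges the streams in time order before applying a coincidence window. The stream contents and window below are made-up stand-ins.

        import heapq

        worker_streams = [                 # (timestamp_ns, detector_id) per worker
            [(10, 0), (305, 2), (990, 4)],
            [(12, 7), (700, 1), (992, 3)],
        ]
        WINDOW = 6                         # coincidence window [ns]

        merged = list(heapq.merge(*worker_streams))   # centralized time ordering
        coincidences = [
            (a, b) for a, b in zip(merged, merged[1:])
            if b[0] - a[0] <= WINDOW and a[1] != b[1]
        ]
        print(coincidences)  # -> [((10, 0), (12, 7)), ((990, 4), (992, 3))]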

  4. A model for simulating random atmospheres as a function of latitude, season, and time

    NASA Technical Reports Server (NTRS)

    Campbell, J. W.

    1977-01-01

    An empirical stochastic computer model was developed with the capability of generating random thermodynamic profiles of the atmosphere below an altitude of 99 km which are characteristic of any given season, latitude, and time of day. Samples of temperature, density, and pressure profiles generated by the model are statistically similar to measured profiles in a data base of over 6000 rocket and high-altitude atmospheric soundings; that is, means and standard deviations of modeled profiles and their vertical gradients are in close agreement with data. Model-generated samples can be used for Monte Carlo simulations of aircraft or spacecraft trajectories to predict or account for the effects on a vehicle's performance of atmospheric variability. Other potential uses for the model are in simulating pollutant dispersion patterns, variations in sound propagation, and other phenomena which are dependent on atmospheric properties, and in developing data-reduction software for satellite monitoring systems.

  5. Theoretical substantiation of biological efficacy enhancement for β-delayed particle decay 9C beam: A Monte Carlo study in combination with analysis with the local effect model approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tian, Liheng; Yan, Yuanlin; Ma, Yuanyuan

    Purpose: To improve the efficacy of heavy ion therapy, a β-delayed particle decay 9C beam has been proposed as a double irradiation source for cancer therapy. The authors' previous experiment showed that relative biological effectiveness (RBE) values at depths around the Bragg peak of a 9C beam were enhanced compared to its stable counterpart, the 12C beam. The purpose of this study was to explore the nature of the biological efficacy enhancement theoretically. Methods: A Monte Carlo simulation study was conducted. First, a simplified cell model was established so as to form a tumor tissue. Subsequently, the tumor tissue was imported into the Monte Carlo simulation software package GATE and the tumor cells were virtually irradiated with comparable 9C and 12C beams, respectively, in the simulations. The transportation and particle deposition data of the 9C and 12C beams, derived from the GATE simulations, were analyzed with the authors' local effect model implementation so as to deduce cell survival fractions. Results: The particles emitted from the decay of 9C particles deposited around a cell nucleus increased the dose delivered to the nucleus and elicited clustered damages around the secondary particles' trajectories. Therefore, compared to the 12C beam, the RBE value of the 9C beam increased at depths around the Bragg peak. Conclusions: Collectively, the increased local doses and clustered damages due to the decay particles emitted from deposited 9C particles led to the RBE enhancement in contrast with the 12C beam. Thus, the enhanced RBE effect of a 9C beam for a simplified tumor model was shown theoretically in this study.
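
    Although the authors' local effect model implementation is not reproduced here, the final step, turning survival curves into an RBE, can be sketched with the linear-quadratic model: the RBE at a given survival level is the ratio of the reference dose to the test dose producing that survival. The alpha and beta values below are illustrative only.

        import numpy as np

        def dose_for_survival(s, alpha, beta):
            """Solve alpha*D + beta*D**2 = -ln(s) for the positive dose D [Gy]."""
            c = -np.log(s)
            return (-alpha + np.sqrt(alpha**2 + 4.0 * beta * c)) / (2.0 * beta)

        # Reference photon curve vs a hypothetical ion curve with a larger alpha.
        d_ref = dose_for_survival(0.1, alpha=0.2, beta=0.05)
        d_ion = dose_for_survival(0.1, alpha=0.6, beta=0.05)
        print(f"RBE at 10% survival = {d_ref / d_ion:.2f}")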

  6. Evaluation of General and Tailor Made Force Fields via X-ray Thermal Diffuse Scattering Using Molecular Dynamics and Monte Carlo Simulations of Crystalline Aspirin.

    PubMed

    Chan, Eric J; Neumann, Marcus A

    2018-04-10

    We have performed a comparison of the experimental thermal diffuse scattering (TDS) from crystalline Aspirin (form I) to that calculated from molecular dynamics (MD) simulations based on a variety of general force fields and a tailor-made force field (TMFF). A comparison is also made with Monte Carlo (MC) simulations which use a "harmonic network" approach to describe the intermolecular interactions. These comparisons were based on the hypothesis that TDS could be useful experimental data for validating such simulation parameter sets, especially where calculations of dynamical properties (e.g., thermodynamic free energies) of molecular crystals are concerned. Currently, validation of force field parameters against experimental data is often limited to the calculation of specific physical properties, e.g., absolute lattice energies, usually at 0 K, or heat capacity measurements. TDS harvested from in-house or synchrotron experiments comprises highly detailed structural information representative of the dynamical motions of the crystal lattice. Thus, TDS is a well-suited, experimentally driven means of cross-validating theoretical approaches targeted at understanding dynamical properties of crystals. We found from the results of our investigation that the TMFF and COMPASS (from the commercial software "Materials Studio") parameter sets gave the best agreement with experiment. From our homologous MC simulation analysis, we are able to show that force constants associated with the molecular torsion angles are likely to be a strong contributing factor in why these force fields performed better.

  7. SU-E-T-507: Internal Dosimetry in Nuclear Medicine Using GATE and XCAT Phantom: A Simulation Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fallahpoor, M; Abbasi, M; Sen, A

    Purpose: Monte Carlo simulations are routinely used for internal dosimetry studies. These studies are conducted with humanoid phantoms such as the XCAT phantom. In this abstract we present the absorbed doses for various pairs of source and target organs using three common radiotracers in nuclear medicine. Methods: The GATE software package is used for the Monte Carlo simulations. A typical female XCAT phantom is used as the input. Three radiotracers, 153Sm, 131I and 99mTc, are studied. The Specific Absorbed Fraction (SAF) for gamma rays (99mTc, 153Sm and 131I) and the Specific Fraction (SF) for beta particles (153Sm and 131I) are calculated for all 100 pairs of source and target organs, including brain, liver, lung, pancreas, kidney, adrenal, spleen, rib bone, bladder and ovaries. Results: The source organs themselves receive the highest absorbed dose compared to other organs. The dose is found to be inversely proportional to distance from the source organ. In the SAF results for 153Sm, when the source organ is the lung, the rib bone gains 0.0730 kg⁻¹, which is more than the lung itself. Conclusion: The absorbed dose for various organs was studied in terms of SAF and SF. Such studies hold importance for future therapeutic procedures and optimization of the administered radiotracer.

  8. Towards a standardized method to assess straylight in earth observing optical instruments

    NASA Astrophysics Data System (ADS)

    Caron, J.; Taccola, M.; Bézy, J.-L.

    2017-09-01

    Straylight is a spurious effect that can seriously degrade the radiometric accuracy achieved by Earth observing optical instruments, as a result of the high contrast in the observed Earth radiance scenes and spectra. It is considered critical for several ESA missions such as Sentinel-5, FLEX and potential successors to CarbonSat. Although it is traditionally evaluated by Monte-Carlo simulations performed with commercial software packages (e.g. ASAP, Zemax, LightTools), semi-analytical approximate methods [1,2] have drawn some interest in recent years due to their faster computing time and the greater insight they provide into straylight mechanisms. They cannot replace numerical simulations, but may be more advantageous in contexts where many iterations are needed, for instance during the early phases of an instrument design.

  9. ActiWiz 3 – an overview of the latest developments and their application

    NASA Astrophysics Data System (ADS)

    Vincke, H.; Theis, C.

    2018-06-01

    In 2011 the ActiWiz code was developed at CERN in order to optimize the choice of materials for accelerator equipment from a radiological point of view. Since then the code has been extended to allow for calculating complete nuclide inventories and providing evaluations with respect to radiotoxicity, inhalation doses, etc. Until now the software included only pre-defined radiation environments for CERN's high-energy proton accelerators, which were based on FLUKA Monte Carlo calculations. Eventually the decision was taken to invest in a major revamping of the code. Starting with version 3, the software is no longer limited to pre-defined radiation fields but within a few seconds it can also treat arbitrary environments for which fluence spectra are available. This has become possible due to the use of ~100 CPU years' worth of FLUKA Monte Carlo simulations as well as the JEFF cross-section library for neutrons < 20 MeV. The latest code version also allowed for the efficient inclusion of 42 additional radiation environments of the LHC experiments as well as considerably more flexibility in characterizing waste from CERN's Large Electron Positron collider (LEP). New fully integrated analysis functionalities, like automatic evaluation of difficult-to-measure nuclides and rapid assessment of the temporal evolution of quantities like radiotoxicity or dose rates, make the software a powerful characterization tool complementary to general purpose MC codes like FLUKA. In this paper an overview of the capabilities is given using recent examples from the domain of waste characterization as well as operational radiation protection.

  10. Accurately modeling Gaussian beam propagation in the context of Monte Carlo techniques

    NASA Astrophysics Data System (ADS)

    Hokr, Brett H.; Winblad, Aidan; Bixler, Joel N.; Elpers, Gabriel; Zollars, Byron; Scully, Marlan O.; Yakovlev, Vladislav V.; Thomas, Robert J.

    2016-03-01

    Monte Carlo simulations are widely considered to be the gold standard for studying the propagation of light in turbid media. However, traditional Monte Carlo methods fail to account for diffraction because they treat light as a particle. This results in converging beams focusing to a point instead of a diffraction-limited spot, greatly affecting the accuracy of Monte Carlo simulations near the focal plane. Here, we present a technique capable of simulating a focusing beam in accordance with the rules of Gaussian optics, resulting in a diffraction-limited focal spot. This technique can be easily implemented into any traditional Monte Carlo simulation, allowing existing models to be converted to include accurate focusing geometries with minimal effort. We will present results for a focusing beam in a layered tissue model, demonstrating that for different scenarios the region of highest intensity, and thus the greatest heating, can change from the surface to the focus. The ability to simulate accurate focusing geometries will greatly enhance the usefulness of Monte Carlo methods for countless applications, including studying laser-tissue interactions in medical applications and light propagation through turbid media.
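
    One common way to realize such a beam in a ray-based Monte Carlo code is to sample photon positions from the Gaussian waist and directions from the beam's far-field divergence, then back-propagate along straight rays to the launch plane. This sketch uses that construction with illustrative beam parameters; it is not necessarily the authors' exact method.

        import numpy as np

        rng = np.random.default_rng(3)

        wavelength = 0.8e-6    # [m]
        w0 = 2e-6              # waist radius [m]
        z_launch = -1e-3       # launch plane 1 mm before the focus [m]
        n = 5

        # Intensity ~ exp(-2 r^2 / w0^2) at the waist => position std = w0/2.
        x0 = rng.normal(0.0, w0 / 2.0, n)
        # Far-field divergence half-angle of a Gaussian beam: theta0 = lambda / (pi * w0).
        theta0 = wavelength / (np.pi * w0)
        angle = rng.normal(0.0, theta0 / 2.0, n)

        # Back-propagate to the launch plane along straight rays.
        x_launch = x0 + z_launch * np.tan(angle)
        print(np.c_[x_launch, angle])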

  11. An approach to design a 90Sr radioisotope thermoelectric generator using analytical and Monte Carlo methods with ANSYS, COMSOL, and MCNP.

    PubMed

    Khajepour, Abolhasan; Rahmani, Faezeh

    2017-01-01

    In this study, a 90Sr radioisotope thermoelectric generator (RTG) with milliwatt-level power was designed to operate in a specified temperature range (300-312 K). For this purpose, a combination of analytical and Monte Carlo methods with the ANSYS and COMSOL software packages as well as the MCNP code was used. The designed RTG contains 90Sr as a radioisotope heat source (RHS) and 127 coupled thermoelectric modules (TEMs) based on bismuth telluride. Kapton (2.45 mm in thickness) and Cryotherm sheets (0.78 mm in thickness) were selected as the thermal insulators of the RHS, and a stainless steel container was used as the generator chamber. The initial design of the RHS geometry was performed according to the amount of radioactive material (strontium titanate) as well as heat transfer calculations and mechanical strength considerations. According to the Monte Carlo simulation performed with the MCNP code, approximately 0.35 kCi of 90Sr is sufficient to generate the required heat power in the RHS. To determine the optimal design of the RTG, the temperature distribution as well as the dissipated heat and input power to the modules were calculated in different parts of the generator using the ANSYS software. The output voltage according to the temperature distribution on the TEMs was calculated using COMSOL. The dimensions of the RHS and heat insulator were optimized to match the average temperature of the TEM hot plate to the specified hot-side temperature. The designed RTG generates 8 mW of power with an efficiency of 1%. This combined-method approach can be used for the precise design of various types of RTGs. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. TU-H-CAMPUS-IeP1-03: Comparison of Monte Carlo Simulation and Conversion Factor Based Method On Estimation of Effective Dose in Pediatric Patients Undergoing Interventional Cardiac Procedures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leung, K; Wong, M; Ng, Y

    Purpose: Interventional cardiac procedures utilize frequent fluoroscopy and cineangiography, which impose considerable radiation risk to patients, especially pediatric patients. Accurate calculation of effective dose is important in order to estimate cancer risk over the rest of their lifetime. This study evaluates the difference between effective doses calculated by Monte Carlo simulation and those estimated by locally derived conversion factors (CF-local) and by commonly quoted conversion factors from Karambatsakidou et al (CF-K). Methods: Effective doses (E) of 12 pediatric patients, aged between 2.5 and 19 years old, who had undergone interventional cardiac procedures, were calculated using the PCXMC-2.0 software. The tube spectrum, irradiation geometry, exposure parameters and dose-area product (DAP) of each projection were included in the software calculation. Effective doses for each patient were also estimated by two methods: 1) CF-local: a conversion factor derived locally by generalizing the results of the 12 patients, multiplied by the DAP of each patient, gives E-local. 2) CF-K: a selected factor from the above-mentioned literature, multiplied by the DAP of each patient, gives E-K. Results: The means of E, E-local and E-K were 16.01 mSv, 16.80 mSv and 22.25 mSv respectively. A deviation of −29.35% to +34.85% between E and E-local, and a greater deviation of −28.96% to +60.86% between E and E-K, were observed. E-K overestimated the effective dose for patients at ages 7.5-19. Conclusion: Effective dose obtained by conversion factors is a simple and quick way to estimate the radiation risk of pediatric patients. This study showed that estimation by CF-local may bear an error of 35% when compared with the Monte Carlo calculation. Using conversion factors derived by other studies may result in an even greater error, of up to 60%, due to factors that are not catered for in the estimation, including patient size, projection angles, exposure parameters, tube filtration, etc. Users must be aware of these potential inaccuracies when a simple conversion method is employed.
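
    The conversion-factor method itself is a single multiplication, E = CF × DAP, which makes its deviation from a Monte Carlo reference easy to tabulate; all numbers below are made-up placeholders, not the study's data.

        dap = 25.0        # dose-area product [Gy*cm^2] (hypothetical)
        cf_local = 0.64   # local conversion factor [mSv per Gy*cm^2] (hypothetical)
        e_mc = 14.2       # Monte Carlo effective dose [mSv] (hypothetical)

        e_cf = cf_local * dap   # conversion-factor estimate [mSv]
        deviation = 100.0 * (e_cf - e_mc) / e_mc
        print(f"E_CF = {e_cf:.1f} mSv, deviation from MC = {deviation:+.1f}%")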

  13. egs_brachy: a versatile and fast Monte Carlo code for brachytherapy

    NASA Astrophysics Data System (ADS)

    Chamberland, Marc J. P.; Taylor, Randle E. P.; Rogers, D. W. O.; Thomson, Rowan M.

    2016-12-01

    egs_brachy is a versatile and fast Monte Carlo (MC) code for brachytherapy applications. It is based on the EGSnrc code system, enabling simulation of photons and electrons. Complex geometries are modelled using the EGSnrc C++ class library and egs_brachy includes a library of geometry models for many brachytherapy sources, in addition to eye plaques and applicators. Several simulation efficiency enhancing features are implemented in the code. egs_brachy is benchmarked by comparing TG-43 source parameters of three source models to previously published values. 3D dose distributions calculated with egs_brachy are also compared to ones obtained with the BrachyDose code. Well-defined simulations are used to characterize the effectiveness of many efficiency improving techniques, both as an indication of the usefulness of each technique and to find optimal strategies. Efficiencies and calculation times are characterized through single source simulations and simulations of idealized and typical treatments using various efficiency improving techniques. In general, egs_brachy shows agreement within uncertainties with previously published TG-43 source parameter values. 3D dose distributions from egs_brachy and BrachyDose agree at the sub-percent level. Efficiencies vary with radionuclide and source type, number of sources, phantom media, and voxel size. The combined effects of efficiency-improving techniques in egs_brachy lead to short calculation times: simulations approximating prostate and breast permanent implant (both with (2 mm)³ voxels) and eye plaque (with (1 mm)³ voxels) treatments take between 13 and 39 s, on a single 2.5 GHz Intel Xeon E5-2680 v3 processor core, to achieve 2% average statistical uncertainty on doses within the PTV. egs_brachy will be released as free and open source software to the research community.

  14. egs_brachy: a versatile and fast Monte Carlo code for brachytherapy.

    PubMed

    Chamberland, Marc J P; Taylor, Randle E P; Rogers, D W O; Thomson, Rowan M

    2016-12-07

    egs_brachy is a versatile and fast Monte Carlo (MC) code for brachytherapy applications. It is based on the EGSnrc code system, enabling simulation of photons and electrons. Complex geometries are modelled using the EGSnrc C++ class library and egs_brachy includes a library of geometry models for many brachytherapy sources, in addition to eye plaques and applicators. Several simulation efficiency enhancing features are implemented in the code. egs_brachy is benchmarked by comparing TG-43 source parameters of three source models to previously published values. 3D dose distributions calculated with egs_brachy are also compared to ones obtained with the BrachyDose code. Well-defined simulations are used to characterize the effectiveness of many efficiency improving techniques, both as an indication of the usefulness of each technique and to find optimal strategies. Efficiencies and calculation times are characterized through single source simulations and simulations of idealized and typical treatments using various efficiency improving techniques. In general, egs_brachy shows agreement within uncertainties with previously published TG-43 source parameter values. 3D dose distributions from egs_brachy and BrachyDose agree at the sub-percent level. Efficiencies vary with radionuclide and source type, number of sources, phantom media, and voxel size. The combined effects of efficiency-improving techniques in egs_brachy lead to short calculation times: simulations approximating prostate and breast permanent implant (both with (2 mm)³ voxels) and eye plaque (with (1 mm)³ voxels) treatments take between 13 and 39 s, on a single 2.5 GHz Intel Xeon E5-2680 v3 processor core, to achieve 2% average statistical uncertainty on doses within the PTV. egs_brachy will be released as free and open source software to the research community.
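
    Efficiency-improving techniques like those benchmarked here are conventionally compared with the Monte Carlo efficiency metric ε = 1/(s²T), where s² is the variance of the scored quantity and T the CPU time. A small sketch of how such a comparison might be tabulated; the technique names, uncertainties and timings are invented for illustration, not egs_brachy results.

```python
# Hedged sketch: comparing variance-reduction techniques with the usual
# Monte Carlo efficiency metric eps = 1 / (s^2 * T).
# The (uncertainty, time) pairs below are invented illustrative numbers.

def efficiency(rel_uncertainty: float, cpu_seconds: float) -> float:
    return 1.0 / (rel_uncertainty**2 * cpu_seconds)

runs = {
    "analog":               (0.020, 600.0),
    "tracklength scoring":  (0.020, 150.0),
    "+ particle recycling": (0.020, 35.0),
}
base = efficiency(*runs["analog"])
for name, (s, t) in runs.items():
    print(f"{name:22s} gain = {efficiency(s, t)/base:5.1f}x")
```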

  15. Poster — Thur Eve — 46: Monte Carlo model of the Novalis Classic 6MV stereotactic linear accelerator using the GATE simulation platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wiebe, J; Department of Physics and Astronomy, University of Calgary, Calgary, AB; Ploquin, N

    2014-08-15

    Monte Carlo (MC) simulation is accepted as the most accurate method of predicting dose deposition in radiation treatment planning. Current dose calculation algorithms used for treatment planning can become inaccurate when small radiation fields and tissue inhomogeneities are present. At our centre the Novalis Classic linear accelerator (linac) is used for stereotactic radiosurgery (SRS). The first MC model to date of the Novalis Classic linac was developed at our centre using the Geant4 Application for Tomographic Emission (GATE) simulation platform. GATE is relatively new, open-source MC software built on CERN's Geometry and Tracking 4 (Geant4) toolkit. The linac geometry was modeled using manufacturer specifications, as well as in-house measurements of the micro-MLCs. Among multiple model parameters, the initial electron beam was adjusted so that calculated depth-dose curves agreed with measured values. Simulations were run on the European Grid Infrastructure through GateLab. A complete head-model simulation to acquire a phase space file takes approximately 8 hours on GateLab. Current results have a majority of points within 3% of the measured dose values for square field sizes ranging from 6×6 mm² to 98×98 mm² (the maximum field size on the Novalis Classic linac) at 100 cm SSD. The x-ray spectrum was also determined from the MC data. The model provides an investigation into GATE's capabilities and has the potential to be used as a research tool and an independent dose calculation engine for clinical treatment plans.

  16. Application of dynamic Monte Carlo technique in proton beam radiotherapy using Geant4 simulation toolkit

    NASA Astrophysics Data System (ADS)

    Guan, Fada

    The Monte Carlo method has been successfully applied to particle transport problems. Most Monte Carlo simulation tools are static: they can only perform simulations of problems with fixed physics and geometry settings. Proton therapy, however, is a dynamic treatment technique in clinical application. In this research, we developed a method to perform dynamic Monte Carlo simulation of proton therapy using the Geant4 simulation toolkit. A passive-scattering treatment nozzle equipped with a rotating range modulation wheel was modeled. One important application of Monte Carlo simulation is to predict the spatial dose distribution in the target geometry. For simplicity, a mathematical model of a human body is usually used as the target, but then only the average dose over a whole organ or tissue can be obtained, rather than an accurate spatial dose distribution. In this research, we developed a method using MATLAB to convert the medical images of a patient from CT scanning into a patient voxel geometry. When this voxel geometry is used as the target in the Monte Carlo simulation, the accurate spatial dose distribution in the target can be obtained. The data analysis toolkit ROOT was used to score the simulation results during a Geant4 simulation and to analyze and plot the results afterwards. Finally, we successfully obtained the accurate spatial dose distribution in part of a human body for a patient with prostate cancer treated with proton therapy.
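
    The CT-to-voxel conversion the authors describe in MATLAB follows a standard pattern: map each voxel's Hounsfield unit (HU) to a material and a mass density via a piecewise calibration. A Python sketch of the same idea; the HU breakpoints, material bins and density ramp are generic assumptions, not the authors' calibration.

```python
import numpy as np

# Hedged sketch: convert a CT volume (Hounsfield units) into material IDs
# and mass densities for a voxelized Monte Carlo geometry. The breakpoints
# and density ramp are generic illustrative values, not the authors' data.

HU_EDGES  = [-1000, -950, -120, 100, 1500]           # assumed bin edges
MATERIALS = ["air", "lung", "soft tissue", "bone"]   # one per bin

def hu_to_voxels(ct_hu: np.ndarray):
    mat_id = np.minimum(np.digitize(ct_hu, HU_EDGES[1:]),
                        len(MATERIALS) - 1)
    # simple linear HU -> density ramp (illustrative approximation)
    density = np.clip(1.0 + ct_hu / 1000.0, 0.001, 3.0)  # g/cm^3
    return mat_id, density

ct = np.random.randint(-1000, 1500, size=(64, 64, 64)).astype(float)
mats, rho = hu_to_voxels(ct)
print(mats.shape, rho.min(), rho.max())
```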

  17. Monte Carlo Modeling of the Initial Radiation Emitted by a Nuclear Device in the National Capital Region

    DTIC Science & Technology

    2013-07-01

    Data was derived from calculations using the three-dimensional Monte Carlo radiation transport code MCNP (Monte Carlo N-Particle). An input deck was prepared for MCNP, a general-purpose code designed to simulate neutron and photon transport.

  18. MOLNs: A CLOUD PLATFORM FOR INTERACTIVE, REPRODUCIBLE, AND SCALABLE SPATIAL STOCHASTIC COMPUTATIONAL EXPERIMENTS IN SYSTEMS BIOLOGY USING PyURDME

    PubMed Central

    Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas

    2017-01-01

    Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools, a complex software stack, and large, scalable compute and data-analysis resources, owing to the high computational cost of Monte Carlo workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments. PMID:28190948

  19. A Simplified Model of Local Structure in Aqueous Proline Amino Acid Revealed by First-Principles Molecular Dynamics Simulations

    PubMed Central

    Troitzsch, Raphael Z.; Tulip, Paul R.; Crain, Jason; Martyna, Glenn J.

    2008-01-01

    Aqueous proline solutions are deceptively simple as they can take on complex roles such as protein chaperones, cryoprotectants, and hydrotropic agents in biological processes. Here, a molecular level picture of proline/water mixtures is developed. Car-Parrinello ab initio molecular dynamics (CPAIMD) simulations of aqueous proline amino acid at the B-LYP level of theory, performed using IBM's Blue Gene/L supercomputer and massively parallel software, reveal hydrogen-bonding propensities that are at odds with the predictions of the CHARMM22 empirical force field but are in better agreement with results of recent neutron diffraction experiments. In general, the CPAIMD (B-LYP) simulations predict a simplified structural model of proline/water mixtures consisting of fewer distinct local motifs. Comparisons of simulation results to experiment are made by direct evaluation of the neutron static structure factor S(Q) from CPAIMD (B-LYP) trajectories as well as to the results of the empirical potential structure refinement reverse Monte Carlo procedure applied to the neutron data. PMID:18790850

  20. Analysis of Plane-Parallel Electron Beam Propagation in Different Media by Numerical Simulation Methods

    NASA Astrophysics Data System (ADS)

    Miloichikova, I. A.; Bespalov, V. I.; Krasnykh, A. A.; Stuchebrov, S. G.; Cherepennikov, Yu. M.; Dusaev, R. R.

    2018-04-01

    Simulation by the Monte Carlo method is widely used to calculate the character of ionizing radiation interaction with matter. A wide variety of programs based on this method allows users to choose the package best suited to a given computational problem. In turn, it is important to know the exact restrictions of each numerical system in order to avoid gross errors. Results are presented of an assessment of the applicability of the program PCLab (Computer Laboratory, version 9.9) to numerical simulation of the electron energy distribution absorbed in beryllium, aluminum, gold, and water for industrial, research, and clinical beams. Data obtained using PCLab are presented graphically alongside data from ITS and Geant4, the most popular software packages for such problems. A comparison and analysis of the results demonstrate the applicability of PCLab to simulation of the absorbed energy distribution and dose of electrons in various materials for energies in the range 1-20 MeV.

  1. A simplified model of local structure in aqueous proline amino acid revealed by first-principles molecular dynamics simulations.

    PubMed

    Troitzsch, Raphael Z; Tulip, Paul R; Crain, Jason; Martyna, Glenn J

    2008-12-01

    Aqueous proline solutions are deceptively simple as they can take on complex roles such as protein chaperones, cryoprotectants, and hydrotropic agents in biological processes. Here, a molecular level picture of proline/water mixtures is developed. Car-Parrinello ab initio molecular dynamics (CPAIMD) simulations of aqueous proline amino acid at the B-LYP level of theory, performed using IBM's Blue Gene/L supercomputer and massively parallel software, reveal hydrogen-bonding propensities that are at odds with the predictions of the CHARMM22 empirical force field but are in better agreement with results of recent neutron diffraction experiments. In general, the CPAIMD (B-LYP) simulations predict a simplified structural model of proline/water mixtures consisting of fewer distinct local motifs. Comparisons of simulation results to experiment are made by direct evaluation of the neutron static structure factor S(Q) from CPAIMD (B-LYP) trajectories as well as to the results of the empirical potential structure refinement reverse Monte Carlo procedure applied to the neutron data.

  2. Software for Demonstration of Features of Chain Polymerization Processes

    ERIC Educational Resources Information Center

    Sosnowski, Stanislaw

    2013-01-01

    Free software for the demonstration of the features of homo- and copolymerization processes (free radical, controlled radical, and living) is described. The software is based on the Monte Carlo algorithms and offers insight into the kinetics, molecular weight distribution, and microstructure of the macromolecules formed in those processes. It also…
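
    As a flavor of what such Monte Carlo demonstrations compute, the sketch below grows free-radical chains one event at a time: at each step a living chain either propagates or terminates with fixed probabilities, which yields the most probable (Flory) chain-length distribution. The propagation probability is an arbitrary teaching assumption, not a value from the described software.

```python
import random

# Hedged sketch: Monte Carlo growth of free-radical polymer chains.
# Each step a living chain propagates with probability p, else terminates.
# p is an arbitrary illustrative value, not from the described software.

def grow_chain(p_propagate: float) -> int:
    length = 1
    while random.random() < p_propagate:
        length += 1
    return length

def simulate(n_chains=100_000, p=0.995):
    lengths = [grow_chain(p) for _ in range(n_chains)]
    mn = sum(lengths) / len(lengths)                 # number-average length
    mw = sum(L * L for L in lengths) / sum(lengths)  # weight-average length
    return mn, mw, mw / mn                           # dispersity -> ~2

mn, mw, pdi = simulate()
print(f"Mn={mn:.0f}  Mw={mw:.0f}  PDI={pdi:.2f}")
```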

  3. Simulating Pressure Profiles for the Free-Electron Laser Photoemission Gun Using Molflow+

    NASA Astrophysics Data System (ADS)

    Song, Diego; Hernandez-Garcia, Carlos

    2012-10-01

    The Jefferson Lab Free Electron Laser (FEL) generates tunable laser light by passing a relativistic electron beam, generated in a high-voltage DC electron gun with a semiconducting photocathode, through a magnetic undulator. The electron gun requires stringent vacuum conditions to guarantee photocathode longevity. In view of an upgrade of the electron gun, this project simulates pressure profiles to determine whether the new design meets the electron gun vacuum requirements. The simulations employ the software Molflow+, developed by R. Kersevan at the Organisation Européenne pour la Recherche Nucléaire (CERN), which uses the test-particle Monte Carlo method to simulate molecular flow in 3D structures. Pressure is obtained along specified chamber axes. Results are then compared to measured pressure values from the existing gun for validation. Outgassing rates, surface area, and pressure were found to be proportionally related. The simulations indicate that the upgraded gun vacuum chamber requires more pumping than its predecessor, while holding similar vacuum conditions. The ability to simulate pressure profiles with tools like Molflow+ allows researchers to optimize vacuum systems during the engineering process.
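
    The test-particle Monte Carlo method underlying Molflow+-style codes traces individual molecules through the geometry with diffuse (cosine-law) wall re-emission. A minimal generic sketch follows, computing the transmission probability of a cylindrical tube in free molecular flow; the geometry and sampling are textbook choices, not Molflow+'s implementation.

```python
import math, random

# Hedged sketch: test-particle Monte Carlo for the transmission probability
# of a unit-radius cylindrical tube in free molecular flow, with diffuse
# (cosine-law) wall re-emission. Generic scheme, not Molflow+'s internals.

def cosine_dir(normal):
    """Sample a cosine-distributed direction about the unit 'normal'."""
    u, v = random.random(), random.random()
    ct, st = math.sqrt(u), math.sqrt(1.0 - u)   # cos/sin of polar angle
    phi = 2.0 * math.pi * v
    nx, ny, nz = normal
    t1 = (-ny, nx, 0.0) if abs(nz) < 0.9 else (0.0, -nz, ny)
    norm = math.sqrt(sum(c * c for c in t1))
    t1 = tuple(c / norm for c in t1)
    t2 = (ny * t1[2] - nz * t1[1],              # t2 = normal x t1
          nz * t1[0] - nx * t1[2],
          nx * t1[1] - ny * t1[0])
    return tuple(st * math.cos(phi) * a + st * math.sin(phi) * b + ct * n
                 for a, b, n in zip(t1, t2, normal))

def transmission(length, n=20_000):
    hits = 0
    for _ in range(n):
        r = math.sqrt(random.random())          # uniform over inlet disk
        x, y, z = r, 0.0, 0.0
        d = cosine_dir((0.0, 0.0, 1.0))
        while True:
            dx, dy, dz = d
            a, b = dx * dx + dy * dy, 2.0 * (x * dx + y * dy)
            c = x * x + y * y - 1.0
            t_wall = math.inf
            if a > 1e-12:
                t_wall = (-b + math.sqrt(max(b * b - 4 * a * c, 0.0))) / (2 * a)
            t_end = (length - z) / dz if dz > 1e-12 else math.inf
            t_back = -z / dz if dz < -1e-12 else math.inf
            t = min(t_wall, t_end, t_back)
            if t == t_end:
                hits += 1
                break
            if t == t_back:
                break
            x, y, z = x + t * dx, y + t * dy, z + t * dz
            d = cosine_dir((-x, -y, 0.0))       # inward wall normal
    return hits / n

print(transmission(1.0))  # ~0.51: the Clausing factor for L/R = 1
```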

  4. Preliminary results of 3D dose calculations with MCNP-4B code from a SPECT image.

    PubMed

    Rodríguez Gual, M; Lima, F F; Sospedra Alfonso, R; González González, J; Calderón Marín, C

    2004-01-01

    Interface software was developed to generate the input file for the Monte Carlo MCNP-4B code from medical images in Interfile format version 3.3. The software was tested using a spherical phantom of tomography slices with a known cumulated activity distribution, generated in Interfile format with the IMAGAMMA medical image processing system. The 3D dose calculation obtained with the Monte Carlo MCNP-4B code was compared with the voxel S factor method. The results show a relative error between the two methods of less than 1%.
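
    The voxel S factor method used for comparison is itself a simple computation: the dose in a target voxel is the cumulated activity in each source voxel multiplied by a precomputed S value for their displacement, summed over sources, i.e., a 3D convolution. A sketch under assumed kernel values; the S kernel below is an invented, normalized placeholder, not a published kernel.

```python
import numpy as np
from scipy.signal import fftconvolve

# Hedged sketch of the voxel S factor method: dose = convolution of the
# cumulated-activity map with an S-value kernel. The kernel here is an
# invented placeholder, not published S values.

def voxel_s_dose(cumulated_activity: np.ndarray, s_kernel: np.ndarray):
    """D(r_t) = sum_s A~(r_s) * S(r_t - r_s), via FFT convolution."""
    return fftconvolve(cumulated_activity, s_kernel, mode="same")

# toy 3x3x3 kernel decaying with distance from the source voxel
g = np.indices((3, 3, 3)) - 1
r2 = (g ** 2).sum(axis=0)
kernel = 1.0 / (1.0 + r2)        # arbitrary fall-off (placeholder)
kernel /= kernel.sum()

activity = np.zeros((32, 32, 32))
activity[16, 16, 16] = 1.0       # single hot voxel
dose = voxel_s_dose(activity, kernel)
print(dose[16, 16, 16], dose[16, 16, 17])
```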

  5. Three-dimensional modeling of light rays on the surface of a slanted lenticular array for autostereoscopic displays.

    PubMed

    Jung, Sung-Min; Kang, In-Byeong

    2013-08-10

    In this paper, we developed a three-dimensional optical model describing the behavior of light at the surface of a slanted lenticular array for autostereoscopic displays, and simulated the optical characteristics of autostereoscopic displays using the Monte Carlo method under actual design conditions. The behavior of light was analyzed ray by ray for selected inclination and azimuthal angles; numerical aberrations and the conditions for total internal reflection in the lenticular array were found. The intensity and three-dimensional crosstalk distributions calculated from our model coincide very well with those from conventional design software, and our model achieves a calculation speed 67 times faster than that of the conventional software. From these results, we conclude that the optical model is very useful for predicting the optical characteristics of autostereoscopic displays with enhanced calculation speed.
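
    The per-ray work in such a simulation is the refraction-or-total-internal-reflection decision at the lens surface. A minimal sketch of that decision for randomly sampled incidence angles; the refractive indices are typical assumed values, not the paper's design parameters.

```python
import math, random

# Hedged sketch: Monte Carlo sampling of rays hitting a lens surface,
# applying Snell's law or total internal reflection (TIR). n1/n2 are
# typical assumed values (lenticular resin to air), not from the paper.

N1, N2 = 1.49, 1.00  # assumed indices: lens material -> air

def refract_or_tir(theta_i: float):
    s = (N1 / N2) * math.sin(theta_i)
    if s > 1.0:
        return "TIR", theta_i            # reflected back into the lens
    return "refracted", math.asin(s)     # Snell's law

random.seed(1)
tir = 0
for _ in range(100_000):
    theta = random.uniform(0.0, math.pi / 2)   # sampled incidence angle
    kind, _ = refract_or_tir(theta)
    tir += kind == "TIR"
print(f"TIR fraction ~ {tir/1e5:.2f} (critical angle "
      f"{math.degrees(math.asin(N2/N1)):.1f} deg)")
```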

  6. Intrinsic noise analyzer: a software package for the exploration of stochastic biochemical kinetics using the system size expansion.

    PubMed

    Thomas, Philipp; Matuschek, Hannes; Grima, Ramon

    2012-01-01

    The accepted stochastic descriptions of biochemical dynamics under well-mixed conditions are given by the Chemical Master Equation and the Stochastic Simulation Algorithm, which are equivalent. The latter is a Monte-Carlo method, which, despite enjoying broad availability in a large number of existing software packages, is computationally expensive due to the huge amounts of ensemble averaging required for obtaining accurate statistical information. The former is a set of coupled differential-difference equations for the probability of the system being in any one of the possible mesoscopic states; these equations are typically computationally intractable because of the inherently large state space. Here we introduce the software package intrinsic Noise Analyzer (iNA), which allows for systematic analysis of stochastic biochemical kinetics by means of van Kampen's system size expansion of the Chemical Master Equation. iNA is platform independent and supports the popular SBML format natively. The present implementation is the first to adopt a complementary approach that combines state-of-the-art analysis tools using the computer algebra system Ginac with traditional methods of stochastic simulation. iNA integrates two approximation methods based on the system size expansion, the Linear Noise Approximation and effective mesoscopic rate equations, which to date have not been available to non-expert users, into an easy-to-use graphical user interface. In particular, the present methods allow for quick approximate analysis of time-dependent mean concentrations, variances, covariances and correlation coefficients, which typically outperforms stochastic simulations. These analytical tools are complemented by automated multi-core stochastic simulations with direct statistical evaluation and visualization. We showcase iNA's performance by using it to explore the stochastic properties of cooperative and non-cooperative enzyme kinetics and a gene network associated with circadian rhythms. The software iNA is freely available as executable binaries for Linux, MacOSX and Microsoft Windows, as well as the full source code under an open source license.

  7. Intrinsic Noise Analyzer: A Software Package for the Exploration of Stochastic Biochemical Kinetics Using the System Size Expansion

    PubMed Central

    Grima, Ramon

    2012-01-01

    The accepted stochastic descriptions of biochemical dynamics under well-mixed conditions are given by the Chemical Master Equation and the Stochastic Simulation Algorithm, which are equivalent. The latter is a Monte-Carlo method, which, despite enjoying broad availability in a large number of existing software packages, is computationally expensive due to the huge amounts of ensemble averaging required for obtaining accurate statistical information. The former is a set of coupled differential-difference equations for the probability of the system being in any one of the possible mesoscopic states; these equations are typically computationally intractable because of the inherently large state space. Here we introduce the software package intrinsic Noise Analyzer (iNA), which allows for systematic analysis of stochastic biochemical kinetics by means of van Kampen's system size expansion of the Chemical Master Equation. iNA is platform independent and supports the popular SBML format natively. The present implementation is the first to adopt a complementary approach that combines state-of-the-art analysis tools using the computer algebra system Ginac with traditional methods of stochastic simulation. iNA integrates two approximation methods based on the system size expansion, the Linear Noise Approximation and effective mesoscopic rate equations, which to date have not been available to non-expert users, into an easy-to-use graphical user interface. In particular, the present methods allow for quick approximate analysis of time-dependent mean concentrations, variances, covariances and correlation coefficients, which typically outperforms stochastic simulations. These analytical tools are complemented by automated multi-core stochastic simulations with direct statistical evaluation and visualization. We showcase iNA's performance by using it to explore the stochastic properties of cooperative and non-cooperative enzyme kinetics and a gene network associated with circadian rhythms. The software iNA is freely available as executable binaries for Linux, MacOSX and Microsoft Windows, as well as the full source code under an open source license. PMID:22723865
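
    For reference, the Stochastic Simulation Algorithm that iNA's analytical methods complement is itself only a few lines. A minimal Gillespie direct-method sketch for a birth-death process; the rate constants are arbitrary illustration values.

```python
import random

# Hedged sketch: Gillespie's direct-method SSA for a birth-death process
#   0 --k1--> X,   X --k2--> 0
# Rate constants are arbitrary illustrative values.

def ssa_birth_death(k1=10.0, k2=0.1, x0=0, t_end=100.0):
    t, x = 0.0, x0
    while t < t_end:
        a1, a2 = k1, k2 * x          # reaction propensities
        a0 = a1 + a2
        t += random.expovariate(a0)  # time to the next reaction
        if random.random() * a0 < a1:
            x += 1                   # birth
        else:
            x -= 1                   # death
    return x

random.seed(0)
samples = [ssa_birth_death() for _ in range(500)]
mean = sum(samples) / len(samples)
print(f"mean ~ {mean:.1f} (deterministic steady state k1/k2 = 100)")
```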

  8. Using Stan for Item Response Theory Models

    ERIC Educational Resources Information Center

    Ames, Allison J.; Au, Chi Hang

    2018-01-01

    Stan is a flexible probabilistic programming language providing full Bayesian inference through Hamiltonian Monte Carlo algorithms. The benefits of Hamiltonian Monte Carlo include improved efficiency and faster inference, when compared to other MCMC software implementations. Users can interface with Stan through a variety of computing…

  9. Open Source Software Openfoam as a New Aerodynamical Simulation Tool for Rocket-Borne Measurements

    NASA Astrophysics Data System (ADS)

    Staszak, T.; Brede, M.; Strelnikov, B.

    2015-09-01

    Sounding rockets are the only way to perform in-situ measurements, which are very important experimental studies for atmospheric science, in the mesosphere/lower thermosphere (MLT). The drawback of using rockets is the shock wave that appears because of the very high speed of the rocket (typically about 1000 m/s). This shock wave disturbs the density, temperature and velocity fields in the vicinity of the rocket relative to the undisturbed values in the atmosphere. This effect, however, can be quantified, and the measured data must be corrected, not just to make it more precise but to make it usable at all. The commonly accepted and widely used tool for these calculations is the Direct Simulation Monte Carlo (DSMC) technique developed by G. A. Bird, which is available as a stand-alone program limited to a single processor. Apart from the complications of simulating flows around bodies in the different flow regimes of the MLT altitude range, which arise because the density changes exponentially by several orders of magnitude, a particular hardware configuration introduces significant difficulty for aerodynamic calculations: the choice of grid sizes depends both on the demands of an adequate DSMC and on good resolution of geometries with large differences in scale. This makes the calculation time unreasonably long or even prevents the calculation algorithm from converging. In this paper we apply the free open source software OpenFOAM (licensed under the GNU GPL) to a three-dimensional CFD simulation of the flow around a sounding rocket instrument. An advantage of this software package, among other things, is that it can run on high performance clusters, which are easily scalable. We present the first results and discuss the potential of the new tool in applications for sounding rockets.

  10. Dosimetry for 131Cs and 125I seeds in solid water phantom using radiochromic EBT film.

    PubMed

    Chiu-Tsao, Sou-Tung; Napoli, John J; Davis, Stephen D; Hanley, Joseph; Rivard, Mark J

    2014-09-01

    To measure the 2D dose distributions with submillimeter resolution for (131)Cs (model CS-1 Rev2) and (125)I (model 6711) seeds in a Solid Water phantom using radiochromic EBT film for radial distances from 0.06 cm to 5 cm. To determine the TG-43 dosimetry parameters in water by applying Solid Water to liquid water correction factors generated from Monte Carlo simulations. Each film piece was positioned horizontally above and in close contact with a (131)Cs or (125)I seed oriented horizontally in a machined groove at the center of a Solid Water phantom, one film at a time. A total of 74 and 50 films were exposed to the (131)Cs and (125)I seeds, respectively. Different film sizes were utilized to gather data in different distance ranges. The exposure time varied according to the seed air-kerma strength and film size in order to deliver doses in the range covered by the film calibration curve. Small films were exposed for shorter times to assess the near field, while larger films were exposed for longer times in order to assess the far field. For calibration, films were exposed to either 40 kV (M40) or 50 kV (M50) x-rays in air at 100.0 cm SSD with doses ranging from 0.2 Gy to 40 Gy. All experimental, calibration and background films were scanned at a 0.02 cm pixel resolution using a CCD camera-based microdensitometer with a green light source. Data acquisition and scanner uniformity correction were achieved with Microd3 software. Data analysis was performed using ImageJ, FV, IDL and Excel software packages. 2D dose distributions were based on the calibration curve established for 50 kV x-rays. The Solid Water to liquid water medium correction was calculated using the MCNP5 Monte Carlo code. Subsequently, the TG-43 dosimetry parameters in liquid water medium were determined. Values for the dose-rate constants using EBT film were 1.069±0.036 and 0.923±0.031 cGy U⁻¹ h⁻¹ for the (131)Cs and (125)I seed, respectively. The corresponding values determined using the Monte Carlo method were 1.053±0.014 and 0.924±0.016 cGy U⁻¹ h⁻¹ for the (131)Cs and (125)I seed, respectively. The radial dose functions obtained with EBT film measurements and Monte Carlo simulations were plotted for radial distances up to 5 cm, and agreed within the uncertainty of the two methods. The 2D anisotropy functions obtained with both methods also agreed within their uncertainties. EBT film dosimetry in a Solid Water phantom is a viable method for measuring (131)Cs (model CS-1 Rev2) and (125)I (model 6711) brachytherapy seed dose distributions with submillimeter resolution. With the Solid Water to liquid water correction factors generated from Monte Carlo simulations, the measured TG-43 dosimetry parameters in liquid water for these two seed models were found to be in good agreement with those in the literature. Copyright © 2014 Elsevier Ltd. All rights reserved.
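
    The TG-43 parameters extracted from such measurements combine into the standard dose-rate equation; on the transverse axis of a point source it reduces to Ddot(r) = S_K · Λ · (r₀/r)² · g(r) · φ_an(r). A sketch follows; the dose-rate constant is the (131)Cs film value quoted above, while the radial dose table and anisotropy factor are invented placeholders.

```python
import numpy as np

# Hedged sketch of the TG-43 point-source dose-rate equation on the
# transverse axis:  Ddot(r) = S_K * Lambda * (r0/r)^2 * g(r) * phi_an(r).
# Lambda is the film-measured 131Cs value from the abstract; the g(r)
# table and anisotropy factor are invented placeholders.

LAMBDA = 1.069          # cGy / (U h), from the EBT film measurement
R0 = 1.0                # cm, TG-43 reference distance
G_R = {0.5: 1.04, 1.0: 1.00, 2.0: 0.87, 3.0: 0.74, 5.0: 0.48}  # placeholder

def dose_rate(r_cm: float, sk_u: float, phi_an: float = 0.98) -> float:
    radii = sorted(G_R)
    g = np.interp(r_cm, radii, [G_R[k] for k in radii])
    return sk_u * LAMBDA * (R0 / r_cm) ** 2 * g * phi_an

for r in (0.5, 1.0, 2.0, 5.0):
    print(f"r={r:3.1f} cm  Ddot={dose_rate(r, sk_u=2.0):7.3f} cGy/h")
```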

  11. A Monte Carlo method for the simulation of coagulation and nucleation based on weighted particles and the concepts of stochastic resolution and merging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kotalczyk, G., E-mail: Gregor.Kotalczyk@uni-due.de; Kruis, F.E.

    Monte Carlo simulations based on weighted simulation particles can solve a variety of population balance problems and thus allow a solution framework to be formulated for many chemical engineering processes. This study presents a novel concept for the calculation of coagulation rates of weighted Monte Carlo particles by introducing a family of transformations to non-weighted Monte Carlo particles. Tuning the accuracy (named 'stochastic resolution' in this paper) of those transformations allows the construction of a constant-number coagulation scheme. Furthermore, a parallel algorithm for the inclusion of newly formed Monte Carlo particles due to nucleation is presented in the scope of a constant-number scheme: low-weight merging. This technique is found to create significantly less statistical simulation noise than the conventional technique (named 'random removal' in this paper). Both concepts are combined into a single GPU-based simulation method which is validated by comparison with the discrete-sectional simulation technique. Two test models describing constant-rate nucleation coupled to simultaneous coagulation in 1) the free-molecular regime or 2) the continuum regime are simulated for this purpose.
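
    To make the weighted-particle idea concrete, the sketch below performs a single coagulation event among weighted simulation particles: each simulation particle i stands for w_i real particles of volume v_i, and pairing two of them merges the lighter weight into a combined particle while conserving total volume. This is a generic textbook-style step, not the authors' stochastic-resolution transformation.

```python
import random

# Hedged sketch: one coagulation event among weighted Monte Carlo
# particles. Each entry (v, w) is a simulation particle of volume v
# representing w real particles. Generic scheme for illustration only,
# not the authors' stochastic-resolution method.

def coagulation_step(particles):
    i, j = random.sample(range(len(particles)), 2)
    (vi, wi), (vj, wj) = particles[i], particles[j]
    if wi > wj:                          # ensure particle i is the lighter
        i, j = j, i
        (vi, wi), (vj, wj) = (vj, wj), (vi, wi)
    particles[i] = (vi + vj, wi)         # wi real coagulation events
    particles[j] = (vj, wj - wi)         # leftover unpaired partners
    if particles[j][1] <= 0:
        particles.pop(j)

random.seed(3)
pop = [(1.0, 100.0) for _ in range(50)]
for _ in range(200):
    coagulation_step(pop)
print(len(pop), sum(v * w for v, w in pop))  # total volume is conserved
```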

  12. Study the sensitivity of dose calculation in prism treatment planning system using Monte Carlo simulation of 6 MeV electron beam

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hardiansyah, D.; Haryanto, F.; Male, S.

    2014-09-30

    Prism is a non-commercial radiotherapy treatment planning system (RTPS) developed by Ira J. Kalet at the University of Washington. An inhomogeneity factor is included in the Prism TPS dose calculation. The aim of this study is to investigate the sensitivity of the dose calculation in Prism using Monte Carlo simulation. A phase space source from the head of the linear accelerator (LINAC) is implemented for the Monte Carlo simulation. To achieve this aim, the Prism dose calculation is compared with EGSnrc Monte Carlo simulation. Percentage depth dose (PDD) and R50 from both calculations are observed. BEAMnrc simulated the electron transport in the LINAC head and produced a phase space file. This file is used as DOSXYZnrc input to simulate electron transport in the phantom. The study starts with a commissioning process in a water phantom, in which the Monte Carlo simulation is adjusted to match the Prism RTPS. The commissioning result is then used in the study of inhomogeneous phantoms. The physical parameters of the inhomogeneous phantom varied in this study are the density, location and thickness of the tissue. Commissioning shows that the optimum energy of the Monte Carlo simulation for the 6 MeV electron beam is 6.8 MeV, using R50 and PDD with the practical range (R_p) as references. In the inhomogeneity study, the average deviation for all cases in the region of interest is below 5%. Based on ICRU recommendations, Prism has a good ability to calculate the radiation dose in inhomogeneous tissue.

  13. McStas-model of the delft SESANS

    NASA Astrophysics Data System (ADS)

    Knudsen, E.; Udby, L.; Willendrup, P. K.; Lefmann, K.; Bouwman, W. G.

    2011-06-01

    We present the first virtual data from a model of the Spin-Echo Small Angle Scattering (SESANS) instrument situated in Delft, built in the framework of the McStas Monte Carlo software package. The main focus has been on making a model of the Delft SESANS instrument, and we can now present the first virtual data from it, using a refracting prism-like sample model. As a consequence of this work, polarisation instrumentation is now included natively in the McStas kernel, including options for magnetic fields and a number of utility components. This development has brought us to a point where realistic models of polarisation-enabled instrumentation can be built.

  14. Modularized Parallel Neutron Instrument Simulation on the TeraGrid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Meili; Cobb, John W; Hagen, Mark E

    2007-01-01

    In order to build a bridge between the TeraGrid (TG), a national-scale cyberinfrastructure resource, and neutron science, the Neutron Science TeraGrid Gateway (NSTG) is focused on introducing productive HPC usage to the neutron science community, primarily the Spallation Neutron Source (SNS) at Oak Ridge National Laboratory (ORNL). Monte Carlo simulations are used as a powerful tool for instrument design and optimization at SNS. One of the successful efforts of a collaboration team composed of NSTG HPC experts and SNS instrument scientists is the development of a software facility named PSoNI, Parallelizing Simulations of Neutron Instruments. By parallelizing the traditional serial instrument simulation on TeraGrid resources, PSoNI quickly computes full instrument simulations at statistical levels sufficient for instrument design. Following the successful commissioning of SNS, three of the five commissioned instruments in the SNS target station will be available to initial users by the end of 2007. Advanced instrument study, proposal feasibility evaluation, and experiment planning are on the immediate schedule of SNS, which poses further requirements on fast instrument simulation, such as flexibility and high runtime efficiency. PSoNI has been redesigned to meet these new challenges and a preliminary version has been developed on TeraGrid. This paper explores the motivation and goals of the new design, and the improved software structure. Further, it describes the new features realized in MPI-parallelized McStas running high-resolution design simulations of the SEQUOIA and BSS instruments at SNS. A discussion of future work, targeted at fast simulation for automated experiment adjustment and comparison of models to data in analysis, is also presented.

  15. Engineering Risk Assessment of Space Thruster Challenge Problem

    NASA Technical Reports Server (NTRS)

    Mathias, Donovan L.; Mattenberger, Christopher J.; Go, Susie

    2014-01-01

    The Engineering Risk Assessment (ERA) team at NASA Ames Research Center utilizes dynamic models with linked physics-of-failure analyses to produce quantitative risk assessments of space exploration missions. This paper applies the ERA approach to the baseline and extended versions of the PSAM Space Thruster Challenge Problem, which investigates mission risk for a deep space ion propulsion system with time-varying thruster requirements and operations schedules. The dynamic mission is modeled using a combination of discrete and continuous-time reliability elements within the commercially available GoldSim software. Loss-of-mission (LOM) probability results are generated via Monte Carlo sampling performed by the integrated model. Model convergence studies are presented to illustrate the sensitivity of integrated LOM results to the number of Monte Carlo trials. A deterministic risk model was also built for the three baseline and extended missions using the Ames Reliability Tool (ART), and results are compared to the simulation results to evaluate the relative importance of mission dynamics. The ART model did a reasonable job of matching the simulation models for the baseline case, while a hybrid approach using offline dynamic models was required for the extended missions. This study highlighted that state-of-the-art techniques can adequately adapt to a range of dynamic problems.
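
    The convergence behaviour examined in such studies follows directly from the binomial statistics of Monte Carlo sampling: the standard error of a loss-of-mission probability estimate shrinks as 1/sqrt(N). A generic sketch; the true failure probability is an arbitrary stand-in, not a result from the paper.

```python
import random

# Hedged sketch: convergence of a Monte Carlo loss-of-mission (LOM)
# estimate. p_true is an arbitrary stand-in, not a value from the paper.

def lom_estimate(p_true: float, n_trials: int) -> float:
    failures = sum(random.random() < p_true for _ in range(n_trials))
    return failures / n_trials

random.seed(42)
p_true = 0.02
for n in (100, 1_000, 10_000, 100_000):
    p_hat = lom_estimate(p_true, n)
    se = (p_hat * (1 - p_hat) / n) ** 0.5   # binomial standard error
    print(f"N={n:6d}  LOM={p_hat:.4f}  std.err~{se:.4f}")
```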

  16. Analysis of the Space Propulsion System Problem Using RAVEN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mandelli, Diego; Smith, Curtis; Rabiti, Cristian

    This paper presents the solution of the space propulsion problem using a PRA code currently under development at Idaho National Laboratory (INL). RAVEN (Reactor Analysis and Virtual control ENviroment) is a multi-purpose Probabilistic Risk Assessment (PRA) software framework that can dispatch different functionalities. It is designed to derive and actuate the control logic required to simulate the plant control system and operator actions (guided procedures) and to perform both Monte Carlo sampling of randomly distributed events and Event Tree based analysis. To facilitate input/output handling, a Graphical User Interface (GUI) and a post-processing data-mining module are available. RAVEN can also interface with several numerical codes such as RELAP5 and RELAP-7 and with ad hoc system simulators. For the space propulsion system problem, an ad hoc simulator was developed in the Python language and interfaced to RAVEN. This simulator fully models both deterministic behaviors (e.g., system dynamics and interactions between system components) and stochastic behaviors (i.e., failures of components/systems such as distribution lines and thrusters). Stochastic analysis is performed using random-sampling-based methodologies (i.e., Monte Carlo). This analysis is carried out to determine the reliability of the space propulsion system and to propagate the uncertainties associated with a specific set of parameters. As indicated in the scope of the benchmark problem, the results generated by the stochastic analysis are used to produce risk-informed insights, such as the conditions under which different strategies can be followed.

  17. Predicting Flows of Rarefied Gases

    NASA Technical Reports Server (NTRS)

    LeBeau, Gerald J.; Wilmoth, Richard G.

    2005-01-01

    DSMC Analysis Code (DAC) is a flexible, highly automated, easy-to-use computer program for predicting flows of rarefied gases -- especially flows of upper-atmospheric, propulsion, and vented gases impinging on spacecraft surfaces. DAC implements the direct simulation Monte Carlo (DSMC) method, which is widely recognized as standard for simulating flows at densities so low that the continuum-based equations of computational fluid dynamics are invalid. DAC enables users to model complex surface shapes and boundary conditions quickly and easily. The discretization of a flow field into computational grids is automated, thereby relieving the user of a traditionally time-consuming task while ensuring (1) appropriate refinement of grids throughout the computational domain, (2) determination of optimal settings for temporal discretization and other simulation parameters, and (3) satisfaction of the fundamental constraints of the method. In so doing, DAC ensures an accurate and efficient simulation. In addition, DAC can utilize parallel processing to reduce computation time. The domain decomposition needed for parallel processing is completely automated, and the software employs a dynamic load-balancing mechanism to ensure optimal parallel efficiency throughout the simulation.

  18. Multivariate stochastic simulation with subjective multivariate normal distributions

    Treesearch

    P. J. Ince; J. Buongiorno

    1991-01-01

    In many applications of Monte Carlo simulation in forestry or forest products, it may be known that some variables are correlated. However, for simplicity, in most simulations it has been assumed that random variables are independently distributed. This report describes an alternative Monte Carlo simulation technique for subjectively assessed multivariate normal...
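
    The standard way to honor known correlations in such simulations is to draw independent standard normals and transform them with the Cholesky factor of the covariance matrix. The sketch below shows this; the mean vector and covariance are arbitrary illustrative values.

```python
import numpy as np

# Hedged sketch: sampling correlated variables for Monte Carlo simulation
# via the Cholesky factorization, x = mu + L z with Sigma = L L^T.
# The mean vector and covariance below are arbitrary illustrative values.

rng = np.random.default_rng(7)
mu = np.array([10.0, 5.0])
sigma = np.array([[4.0, 2.4],     # correlation 0.8 between the variables
                  [2.4, 2.25]])

L = np.linalg.cholesky(sigma)
z = rng.standard_normal((100_000, 2))   # independent standard normals
samples = mu + z @ L.T

print(np.corrcoef(samples.T)[0, 1])     # ~0.8, the requested correlation
```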

  19. SESAME: a software tool for the numerical dosimetric reconstruction of radiological accidents involving external sources and its application to the accident in Chile in December 2005.

    PubMed

    Huet, C; Lemosquet, A; Clairand, I; Rioual, J B; Franck, D; de Carlan, L; Aubineau-Lanièce, I; Bottollier-Depois, J F

    2009-01-01

    Estimating the dose distribution in a victim's body is a relevant indicator in assessing biological damage from exposure in the event of a radiological accident caused by an external source. This dose distribution can be assessed by physical dosimetric reconstruction methods, using experimental or numerical techniques. This article presents the laboratory-developed SESAME (Simulation of External Source Accident with MEdical images) tool, specific to the dosimetric reconstruction of radiological accidents through numerical simulations which combine voxel geometry with the MCNP(X) Monte Carlo radiation-matter interaction code. The experimental validation of the tool using a photon field and its application to a radiological accident in Chile in December 2005 are also described.

  20. The Development and Comparison of Molecular Dynamics Simulation and Monte Carlo Simulation

    NASA Astrophysics Data System (ADS)

    Chen, Jundong

    2018-03-01

    Molecular dynamics is an integrated technology that combines physics, mathematics and chemistry. The molecular dynamics method is a computer simulation technique and a powerful tool for studying condensed matter systems. It not only yields the trajectories of the atoms, but also reveals the microscopic details of atomic motion. By studying the numerical integration algorithms used in molecular dynamics simulation, we can analyze the microstructure, the motion of the particles and their relationship to the macroscopic properties of the material, and study more conveniently how the interactions give rise to those macroscopic properties. The Monte Carlo Simulation, similar to molecular dynamics, is a tool for studying the nature of molecules and particles at the microscopic level. In this paper, the theoretical background of computer numerical simulation is introduced, and the specific methods of numerical integration are summarized, including the Verlet method, the leap-frog method and the velocity Verlet method. At the same time, the method and principle of the Monte Carlo Simulation are introduced. Finally, the similarities and differences between the Monte Carlo Simulation and the molecular dynamics simulation are discussed.
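
    Of the integrators named above, velocity Verlet is the most commonly used: one step updates the positions with the current forces and the velocities with the average of old and new forces. A minimal sketch for a 1D harmonic oscillator; the spring constant and step size are arbitrary demonstration values.

```python
# Hedged sketch: the velocity Verlet integrator for a 1D harmonic
# oscillator, F(x) = -k x. Parameter values are arbitrary demo choices.

K, M, DT = 1.0, 1.0, 0.01   # spring constant, mass, time step

def force(x: float) -> float:
    return -K * x

def velocity_verlet(x: float, v: float, n_steps: int):
    f = force(x)
    for _ in range(n_steps):
        x += v * DT + 0.5 * (f / M) * DT * DT    # position update
        f_new = force(x)
        v += 0.5 * (f + f_new) / M * DT          # velocity update
        f = f_new
    return x, v

x, v = velocity_verlet(1.0, 0.0, 10_000)         # ~16 oscillation periods
energy = 0.5 * M * v * v + 0.5 * K * x * x
print(f"x={x:+.4f} v={v:+.4f} E={energy:.6f}")   # E stays ~0.5
```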

  1. Web-based, GPU-accelerated, Monte Carlo simulation and visualization of indirect radiation imaging detector performance.

    PubMed

    Dong, Han; Sharma, Diksha; Badano, Aldo

    2014-12-01

    Monte Carlo simulations play a vital role in the understanding of the fundamental limitations, design, and optimization of existing and emerging medical imaging systems. Efforts in this area have resulted in the development of a wide variety of open-source software packages. One such package, hybridmantis, uses a novel hybrid concept to model indirect scintillator detectors by balancing the computational load using dual CPU and graphics processing unit (GPU) processors, obtaining computational efficiency with reasonable accuracy. In this work, the authors describe two open-source visualization interfaces, webmantis and visualmantis, to facilitate the setup of computational experiments via hybridmantis. The visualization tools visualmantis and webmantis enable the user to control simulation properties through a user interface. In the case of webmantis, control via a web browser allows access through mobile devices such as smartphones or tablets. webmantis acts as a server back-end and communicates with an NVIDIA GPU computing cluster that can support multiuser environments where users can execute different experiments in parallel. The output consists of the point response, the pulse-height spectrum, and the optical transport statistics generated by hybridmantis. Users can download the output images and statistics as a zip file for future reference. In addition, webmantis provides a visualization window that displays a few selected optical photon paths as they are transported through the detector columns and allows the user to trace the history of the optical photons. The visualization tools visualmantis and webmantis provide features such as on-the-fly generation of pulse-height spectra and response functions for microcolumnar x-ray imagers while allowing users to save simulation parameters and results from prior experiments. The graphical interfaces simplify the simulation setup and allow the user to go directly from specifying input parameters to receiving visual feedback for the model predictions.

  2. Feature aided Monte Carlo probabilistic data association filter for ballistic missile tracking

    NASA Astrophysics Data System (ADS)

    Ozdemir, Onur; Niu, Ruixin; Varshney, Pramod K.; Drozd, Andrew L.; Loe, Richard

    2011-05-01

    The problem of ballistic missile tracking in the presence of clutter is investigated. The probabilistic data association filter (PDAF) is utilized as the basic filtering algorithm. We propose to use sequential Monte Carlo methods, i.e., particle filters, aided with amplitude information (AI) in order to improve the tracking performance of a single target in clutter when severe nonlinearities exist in the system. We call this approach the "Monte Carlo probabilistic data association filter with amplitude information (MCPDAF-AI)." Furthermore, we formulate a realistic problem in the sense that we use radar cross section (RCS) data for a missile warhead and a cylinder chaff, simulated with Lucernhammer, a state-of-the-art electromagnetic signature prediction software package, to model target and clutter amplitude returns as additional amplitude features that help improve data association and tracking performance. A performance comparison is carried out between the extended Kalman filter (EKF) and the particle filter under various scenarios using single and multiple sensors. The results show that, when only one sensor is used, the MCPDAF performs significantly better than the EKF in terms of tracking accuracy under severe nonlinear conditions for ballistic missile tracking applications. However, when the number of sensors is increased, even under severe nonlinear conditions, the EKF performs as well as the MCPDAF.
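
    The sequential Monte Carlo machinery underneath such a tracker is the bootstrap particle filter: propagate particles through the motion model, weight them by the measurement likelihood, and resample. A generic 1D sketch follows; the constant-velocity model and noise levels are illustrative assumptions, not the paper's radar setup.

```python
import numpy as np

# Hedged sketch: a generic bootstrap particle filter for a 1D
# constant-velocity target. Models and noise levels are illustrative
# assumptions, not the paper's radar configuration.

rng = np.random.default_rng(0)
N, DT, Q, R = 1000, 1.0, 0.1, 1.0   # particles, step, process/meas. noise

def step(particles, z):
    # 1) propagate through the motion model
    particles[:, 0] += particles[:, 1] * DT
    particles[:, 1] += rng.normal(0.0, Q, N)
    # 2) weight by the measurement likelihood p(z | x)
    w = np.exp(-0.5 * (z - particles[:, 0]) ** 2 / R)
    w /= w.sum()
    # 3) resample (multinomial) to avoid weight degeneracy
    idx = rng.choice(N, size=N, p=w)
    return particles[idx]

particles = rng.normal(0.0, 1.0, size=(N, 2))
true_pos, true_vel = 0.0, 1.0
for _ in range(20):
    true_pos += true_vel * DT
    z = true_pos + rng.normal(0.0, np.sqrt(R))   # noisy range measurement
    particles = step(particles, z)
print(f"true={true_pos:.2f}  estimate={particles[:, 0].mean():.2f}")
```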

  3. Path Toward a Unified Geometry for Radiation Transport

    NASA Astrophysics Data System (ADS)

    Lee, Kerry

    The Direct Accelerated Geometry for Radiation Analysis and Design (DAGRAD) element of the RadWorks Project under Advanced Exploration Systems (AES) within the Space Technology Mission Directorate (STMD) of NASA will enable new designs and concepts of operation for radiation risk assessment, mitigation and protection. This element is designed to produce a solution that will allow NASA to calculate the transport of space radiation through complex CAD models using state-of-the-art analytic and Monte Carlo radiation transport codes. Due to the inherent hazard of astronaut and spacecraft exposure to ionizing radiation in low-Earth orbit (LEO) or in deep space, risk analyses must be performed for all crew vehicles and habitats. Incorporating these analyses into the design process can minimize the mass needed solely for radiation protection. Transport of the radiation fields as they pass through shielding and body materials can be simulated using Monte Carlo techniques or described by the Boltzmann equation, which is obtained by balancing changes in particle fluxes as they traverse a small volume of material with the gains and losses caused by atomic and nuclear collisions. Deterministic codes that solve the Boltzmann transport equation, such as HZETRN (the high charge and energy transport code developed by NASA LaRC), are generally computationally faster than Monte Carlo codes such as FLUKA, GEANT4, MCNP(X) or PHITS; however, they are currently limited to transport in one dimension, which poorly represents the secondary light-ion and neutron radiation fields. NASA currently uses the HZETRN space radiation transport software, both because it is computationally efficient and because proven methods have been developed for using this software to analyze complex geometries. Although Monte Carlo codes describe the relevant physics in a fully three-dimensional manner, their computational costs have thus far prevented their widespread use for analysis of complex CAD models, leading to the creation and maintenance of toolkit-specific simplified geometry models. The work presented here builds on the Direct Accelerated Geometry Monte Carlo (DAGMC) toolkit developed for use with the Monte Carlo N-Particle (MCNP) transport code. The workflow for doing radiation transport on CAD models using MCNP and FLUKA has been demonstrated, and the results of analyses on realistic spacecraft/habitats will be presented. Future work is planned that will further automate this process and enable the use of multiple radiation transport codes on identical geometry models imported from CAD. This effort will enhance the modeling tools used by NASA to accurately evaluate the astronaut space radiation risk and accurately determine the protection provided by as-designed exploration mission vehicles and habitats.

  4. Instantons in Quantum Annealing: Thermally Assisted Tunneling Vs Quantum Monte Carlo Simulations

    NASA Technical Reports Server (NTRS)

    Jiang, Zhang; Smelyanskiy, Vadim N.; Boixo, Sergio; Isakov, Sergei V.; Neven, Hartmut; Mazzola, Guglielmo; Troyer, Matthias

    2015-01-01

    A recent numerical result (arXiv:1512.02206) from Google suggested that the D-Wave quantum annealer may have an asymptotic speed-up over simulated annealing; however, the asymptotic advantage disappears when it is compared to quantum Monte Carlo (a classical algorithm despite its name). We show analytically that the asymptotic scaling of quantum tunneling is exactly the same as the escape rate in quantum Monte Carlo for a class of problems. Thus, the Google result might be explained within our framework. We also found that the transition state in quantum Monte Carlo corresponds to the instanton solution in quantum tunneling problems, which is observed in numerical simulations.

  5. Feasibility assessment of the interactive use of a Monte Carlo algorithm in treatment planning for intraoperative electron radiation therapy

    NASA Astrophysics Data System (ADS)

    Guerra, Pedro; Udías, José M.; Herranz, Elena; Santos-Miranda, Juan Antonio; Herraiz, Joaquín L.; Valdivieso, Manlio F.; Rodríguez, Raúl; Calama, Juan A.; Pascau, Javier; Calvo, Felipe A.; Illana, Carlos; Ledesma-Carbayo, María J.; Santos, Andrés

    2014-12-01

    This work analysed the feasibility of using a fast, customized Monte Carlo (MC) method to perform accurate computation of dose distributions during pre- and intraplanning of intraoperative electron radiation therapy (IOERT) procedures. The MC method that was implemented, which has been integrated into a specific innovative simulation and planning tool, is able to simulate the fate of thousands of particles per second, and it was the aim of this work to determine the level of interactivity that could be achieved. The planning workflow enabled calibration of the imaging and treatment equipment, as well as manipulation of the surgical frame and insertion of the protection shields around the organs at risk and other beam modifiers. In this way, the multidisciplinary team involved in IOERT has all the tools necessary to perform complex MC dose simulations adapted to their equipment in an efficient and transparent way. To assess the accuracy and reliability of this MC technique, dose distributions for a monoenergetic source were compared with those obtained using a general-purpose software package used widely in medical physics applications. Once the accuracy of the underlying simulator was confirmed, a clinical accelerator was modelled and experimental measurements in water were conducted. A comparison was made with the output from the simulator to identify the conditions under which accurate dose estimations could be obtained in less than 3 min, which is the threshold imposed to allow for interactive use of the tool in treatment planning. Finally, a clinically relevant scenario, namely early-stage breast cancer treatment, was simulated with pre- and intraoperative volumes to verify that it was feasible to use the MC tool intraoperatively and to adjust dose delivery based on the simulation output, without compromising accuracy. The workflow provided a satisfactory model of the treatment head and the imaging system, enabling proper configuration of the treatment planning system and providing good accuracy in the dose simulation.

  6. The use of Monte Carlo simulations for accurate dose determination with thermoluminescence dosemeters in radiation therapy beams.

    PubMed

    Mobit, P

    2002-01-01

    The energy responses of LiF-TLDs irradiated in megavoltage electron and photon beams have been determined experimentally by many investigators over the past 35 years, but the results vary considerably. General cavity theory has been used to model some of the experimental findings, but the predictions of these cavity theories differ from each other and from measurements by more than 13%. Recently, two groups of investigators using Monte Carlo simulations and careful experimental techniques showed that the energy response of 1 mm or 2 mm thick LiF-TLDs irradiated by megavoltage photon and electron beams is not more than 5% less than unity for low-Z phantom materials like water or Perspex. However, when the depth of irradiation is significantly different from dmax and the TLD size is more than 5 mm, the energy response is up to 12% less than unity for incident electron beams. Monte Carlo simulations of some of the experiments reported in the literature showed that some of the contradictory experimental results are reproducible with Monte Carlo simulations. Monte Carlo simulations show that the energy response of LiF-TLDs in electron beams depends on the size of the detector used, the depth of irradiation and the incident electron energy. Other differences can be attributed to absolute dose determination and the precision of the TL technique. Monte Carlo simulations have also been used to evaluate some of the published general cavity theories. The results show that some of the parameters used to evaluate Burlin's general cavity theory are wrong by a factor of 3. Despite this, the energy response estimated for most clinical situations using Burlin's cavity equation agrees with Monte Carlo simulations within 1%.

  7. Characterization of scatter in digital mammography from use of Monte Carlo simulations and comparison to physical measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leon, Stephanie M., E-mail: Stephanie.Leon@uth.tmc.edu; Wagner, Louis K.; Brateman, Libby F.

    2014-11-01

    Purpose: Monte Carlo simulations were performed with the goal of verifying previously published physical measurements characterizing scatter as a function of apparent thickness. A secondary goal was to provide a way of determining what effect tissue glandularity might have on the scatter characteristics of breast tissue. The overall reason for characterizing mammography scatter in this research is the application of these data to an image-processing-based scatter-correction program. Methods: MCNPX was used to simulate scatter from an infinitesimal pencil beam using typical mammography geometries and techniques. The spreading of the pencil beam was characterized by two parameters: mean radial extent (MRE) and scatter fraction (SF). The SF and MRE were found as functions of target, filter, tube potential, phantom thickness, and the presence or absence of a grid. The SF was determined by separating scatter and primary by the angle of incidence on the detector, then finding the ratio of the measured scatter to the total number of detected events. The accuracy of the MRE was determined by placing ring-shaped tallies around the impulse and fitting those data to the point-spread function (PSF) equation using the value for MRE derived from the physical measurements. The goodness-of-fit was determined for each data set as a means of assessing the accuracy of the physical MRE data. The effect of breast glandularity on the SF, MRE, and apparent tissue thickness was also considered for a limited number of techniques. Results: The agreement between the physical measurements and the results of the Monte Carlo simulations was assessed. With a grid, the SFs ranged from 0.065 to 0.089, with absolute differences between the measured and simulated SFs averaging 0.02. Without a grid, the range was 0.28–0.51, with absolute differences averaging −0.01. The goodness-of-fit values comparing the Monte Carlo data to the PSF from the physical measurements ranged from 0.96 to 1.00 with a grid and 0.65 to 0.86 without a grid. Analysis of the data suggested that the nongrid data could be better described by a biexponential function than the single exponential used here. The simulations assessing the effect of breast composition on SF and MRE showed only a slight impact on these quantities. When compared to a mix of 50% glandular/50% adipose tissue, the impact of substituting purely adipose or purely glandular breast compositions on the apparent thickness of the tissue was about 5%. Conclusions: The findings show agreement between the physical measurements published previously and the Monte Carlo simulations presented here; the resulting data can therefore be used more confidently for an application such as image-processing-based scatter correction. The findings also suggest that breast composition does not have a major impact on the scatter characteristics of breast tissue. Application of the scatter data to the development of scatter-correction software can be simplified by ignoring the variations in density among breast tissues.
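
    The two parameters reported here define a simple scatter point-spread function that an image-based scatter correction can apply, for example an exponential falloff scaled so the scatter part integrates to the scatter fraction. A sketch of building such a kernel; the exponential form and parameter values are illustrative choices, while the paper's contribution is the measured SF/MRE data themselves.

```python
import numpy as np

# Hedged sketch: build a normalized scatter PSF from a scatter fraction
# (SF) and a radial falloff length, assuming an exponential form.
# Parameter values and the exponential shape are illustrative choices.

def scatter_kernel(sf=0.45, falloff_mm=80.0, px_mm=0.5, half=256):
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    r = np.hypot(x, y) * px_mm
    psf = np.exp(-r / falloff_mm)
    psf *= sf / psf.sum()        # scatter part integrates to SF
    psf[half, half] += 1.0 - sf  # primary: delta at the center
    return psf

k = scatter_kernel()
print(k.sum())  # 1.0: primary + scatter together
```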

  8. Software Would Largely Automate Design of Kalman Filter

    NASA Technical Reports Server (NTRS)

    Chuang, Jason C. H.; Negast, William J.

    2005-01-01

    Embedded Navigation Filter Automatic Designer (ENFAD) is a computer program being developed to automate the most difficult tasks in designing embedded software to implement a Kalman filter in a navigation system. The most difficult tasks are selection of error states of the filter and tuning of filter parameters, which are time-consuming trial-and-error tasks that require expertise and rarely yield optimum results. An optimum selection of error states and filter parameters depends on navigation-sensor and vehicle characteristics, and on filter processing time. ENFAD would include a simulation module that would incorporate all possible error states with respect to a given set of vehicle and sensor characteristics. The first of two iterative optimization loops would vary the selection of error states until the best filter performance was achieved in Monte Carlo simulations. For a fixed selection of error states, the second loop would vary the filter parameter values until an optimal performance value was obtained. Design constraints would be satisfied in the optimization loops. Users would supply vehicle and sensor test data that would be used to refine digital models in ENFAD. Filter processing time and filter accuracy would be computed by ENFAD.
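    The two nested optimization loops described above can be sketched as follows; everything here (the candidate error states, the cost function, and the Monte Carlo evaluator) is a hypothetical stand-in, since the abstract does not specify ENFAD's internals:

      import itertools, random

      def monte_carlo_performance(error_states, params, n_runs=100):
          # Hypothetical evaluator: run n_runs Monte Carlo navigation
          # simulations and return a scalar error metric (smaller is better).
          return random.random()  # placeholder for the real simulation

      candidate_states = ["gyro_bias", "accel_bias", "clock_drift", "scale_factor"]
      best_score, best_states, best_q = float("inf"), None, None

      # Outer loop: vary the selection of filter error states.
      for k in range(1, len(candidate_states) + 1):
          for states in itertools.combinations(candidate_states, k):
              # Inner loop: tune filter parameters for this fixed selection
              # (a crude grid search standing in for the real tuner).
              for q in (0.01, 0.1, 1.0):
                  score = monte_carlo_performance(states, {"process_noise": q})
                  if score < best_score:
                      best_score, best_states, best_q = score, states, q

      print("best error states:", best_states, "| process noise:", best_q)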

  9. Monte Carlo simulation of aorta autofluorescence

    NASA Astrophysics Data System (ADS)

    Kuznetsova, A. A.; Pushkareva, A. E.

    2016-08-01

    Results of numerical Monte Carlo simulation of aorta autofluorescence are reported. Two states of the aorta, normal and with atherosclerotic lesions, are studied. A model of the studied tissue is developed on the basis of information about its optical, morphological, and physico-chemical properties. It is shown that the data obtained by numerical Monte Carlo simulation are in good agreement with experimental results, indicating the adequacy of the developed model of aorta autofluorescence.

  10. Physical Principle for Generation of Randomness

    NASA Technical Reports Server (NTRS)

    Zak, Michail

    2009-01-01

    A physical principle (more precisely, a principle that incorporates mathematical models used in physics) has been conceived as the basis of a method of generating randomness in Monte Carlo simulations. The principle eliminates the need for conventional random-number generators. The Monte Carlo simulation method is among the most powerful computational methods for solving high-dimensional problems in physics, chemistry, economics, and information processing. The Monte Carlo simulation method is especially effective for solving problems in which computational complexity increases exponentially with dimensionality. The main advantage of the Monte Carlo simulation method over other methods is that the demand on computational resources becomes independent of dimensionality. As augmented by the present principle, the Monte Carlo simulation method becomes an even more powerful computational method that is especially useful for solving problems associated with dynamics of fluids, planning, scheduling, and combinatorial optimization. The present principle is based on coupling of dynamical equations with the corresponding Liouville equation. The randomness is generated by non-Lipschitz instability of dynamics triggered and controlled by feedback from the Liouville equation. (In non-Lipschitz dynamics, the derivatives of solutions of the dynamical equations are not required to be bounded.)

  11. kmos: A lattice kinetic Monte Carlo framework

    NASA Astrophysics Data System (ADS)

    Hoffmann, Max J.; Matera, Sebastian; Reuter, Karsten

    2014-07-01

    Kinetic Monte Carlo (kMC) simulations have emerged as a key tool for microkinetic modeling in heterogeneous catalysis and other materials applications. Systems where the site-specificity of all elementary reactions allows a mapping onto a lattice of discrete active sites can be addressed within the particularly efficient lattice kMC approach. To this end we describe the versatile kmos software package, which offers a most user-friendly implementation, execution, and evaluation of lattice kMC models of arbitrary complexity in one- to three-dimensional lattice systems, involving multiple active sites in periodic or aperiodic arrangements, as well as site-resolved pairwise and higher-order lateral interactions. Conceptually, kmos achieves a maximum runtime performance which is essentially independent of lattice size by generating code for the efficiency-determining local update of available events that is optimized for a defined kMC model. For this model definition and the control of all runtime and evaluation aspects, kmos offers a high-level application programming interface. Usage proceeds interactively, via scripts, or via a graphical user interface, which visualizes the model geometry, the lattice occupations and the rates of selected elementary reactions, while allowing on-the-fly changes of simulation parameters. We demonstrate the performance and scaling of kmos with an application to kMC models for surface catalytic processes, where for given operation conditions (temperature and partial pressures of all reactants) the central simulation outcomes are catalytic activity and selectivities, surface composition, and mechanistic insight into the occurrence of individual elementary processes in the reaction network.
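    A rejection-free lattice kMC step of the kind kmos implements can be illustrated with a minimal one-dimensional adsorption/desorption model (a generic sketch, not the kmos API; rates and lattice size are arbitrary):

      import numpy as np

      rng = np.random.default_rng(0)
      L = 20                                   # 1D lattice of L sites
      occ = np.zeros(L, dtype=int)             # 0 = empty, 1 = adsorbed
      k_ads, k_des = 1.0, 0.5                  # event rates
      t = 0.0

      for _ in range(10_000):
          # Rate of the single enabled event at each site.
          rates = np.where(occ == 0, k_ads, k_des)
          r_tot = rates.sum()
          # Choose an event with probability proportional to its rate...
          site = rng.choice(L, p=rates / r_tot)
          occ[site] ^= 1                       # ...and execute it.
          # Advance physical time by an exponentially distributed increment.
          t += rng.exponential(1.0 / r_tot)

      print(f"coverage = {occ.mean():.2f} at t = {t:.1f}")  # here -> ~2/3

    kmos's contribution is to make the efficiency-determining step, updating the list of enabled events after each execution, run in constant time per step by generating model-specific code.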

  12. 3D Monte Carlo simulation of light propagation for laser acupuncture and optimization of illumination parameters

    NASA Astrophysics Data System (ADS)

    Zhong, Fulin; Li, Ting; Pan, Boan; Wang, Pengbo

    2017-02-01

    Laser acupuncture is an effective photochemical and nonthermal stimulation of traditional acupuncture points with low-intensity laser irradiation; compared to traditional acupuncture it has the advantages of being painless, sterile, and safe. A laser diode (LD) provides single-wavelength and relatively high-power light for phototherapy. The quantitative effect of the LD illumination parameters is crucial for the practical operation of laser acupuncture. However, this issue has not been fully investigated, especially since experimental methodologies with animals or humans can hardly address it. For example, in order to protect the viability of cells and tissue and obtain a better therapeutic effect, it is necessary to keep the output power within the 5 mW-10 mW range, yet the optimal power is still not clear. This study aimed to quantitatively optimize the laser output power, wavelength, and irradiation direction with highly realistic modeling of light transport in acupunctured tissue. A Monte Carlo simulation software for 3D voxelized media and the highest-precision human anatomical model, the Visible Chinese Human (VCH), were employed. Our 3D simulation results showed that the longer the wavelength and the higher the illumination power, the larger the absorption in laser acupuncture; vertical emission of the acupuncture laser results in a higher amount of light absorption in both the acupunctured voxel of tissue and the muscle layer. Our 3D light distribution of laser acupuncture within the VCH tissue model can potentially be used for optimization and real-time guidance in the clinical manipulation of laser acupuncture.

  13. Structural Reliability and Monte Carlo Simulation.

    ERIC Educational Resources Information Center

    Laumakis, P. J.; Harlow, G.

    2002-01-01

    Analyzes a simple boom structure and assesses its reliability using elementary engineering mechanics. Demonstrates the power and utility of Monte-Carlo simulation by showing that such a simulation can be implemented more readily with results that compare favorably to the theoretical calculations. (Author/MM)
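    The abstract does not reproduce the boom geometry, so the sketch below illustrates the general approach on a generic member whose random load may exceed its random strength (distributions and numbers are made up):

      import numpy as np

      rng = np.random.default_rng(42)
      n = 1_000_000
      load = rng.normal(50.0, 10.0, n)       # applied stress, arbitrary units
      strength = rng.normal(80.0, 12.0, n)   # member capacity

      p_fail = np.mean(load > strength)
      # Check: load - strength ~ N(-30, sqrt(10^2 + 12^2)), so
      # P(failure) = 1 - Phi(30 / 15.62) ~= 0.027.
      print(f"Monte Carlo P(failure) = {p_fail:.4f}")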

  14. The Monte Carlo Method. Popular Lectures in Mathematics.

    ERIC Educational Resources Information Center

    Sobol', I. M.

    The Monte Carlo Method is a method of approximately solving mathematical and physical problems by the simulation of random quantities. The principal goal of this booklet is to suggest to specialists in all areas that they will encounter problems which can be solved by the Monte Carlo Method. Part I of the booklet discusses the simulation of random…

  15. Fixed forced detection for fast SPECT Monte-Carlo simulation

    NASA Astrophysics Data System (ADS)

    Cajgfinger, T.; Rit, S.; Létang, J. M.; Halty, A.; Sarrut, D.

    2018-03-01

    Monte-Carlo simulations of SPECT images are notoriously slow to converge due to the large ratio between the number of photons emitted and the number detected in the collimator. This work proposes a method to accelerate the simulations based on fixed forced detection (FFD) combined with an analytical response of the detector. FFD is based on a Monte-Carlo simulation but forces the detection of a photon in each detector pixel, weighted by the probability of emission (or scattering) and transmission to this pixel. The method was evaluated with numerical phantoms and on patient images. The differences with respect to analog Monte Carlo were lower than the statistical uncertainty. The overall computing time gain can reach up to five orders of magnitude. Source code and examples are available in the Gate V8.0 release.
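    In forced-detection schemes of this kind, each emission or scattering vertex deposits in every detector pixel a deterministic weight of the generic form

      \[ w \;=\; p(\Omega_{\mathrm{pixel}})\,\Delta\Omega_{\mathrm{pixel}}\; e^{-\int_{\mathrm{path}} \mu(l)\, dl}, \]

    i.e., the probability density of emission (or scattering) toward the pixel, times the pixel's solid angle, times the attenuation along the line to it (schematic notation of ours, not the paper's), with the analytical detector response applied afterwards.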

  16. Fixed forced detection for fast SPECT Monte-Carlo simulation.

    PubMed

    Cajgfinger, T; Rit, S; Létang, J M; Halty, A; Sarrut, D

    2018-03-02

    Monte-Carlo simulations of SPECT images are notoriously slow to converge due to the large ratio between the number of photons emitted and the number detected in the collimator. This work proposes a method to accelerate the simulations based on fixed forced detection (FFD) combined with an analytical response of the detector. FFD is based on a Monte-Carlo simulation but forces the detection of a photon in each detector pixel, weighted by the probability of emission (or scattering) and transmission to this pixel. The method was evaluated with numerical phantoms and on patient images. The differences with respect to analog Monte Carlo were lower than the statistical uncertainty. The overall computing time gain can reach up to five orders of magnitude. Source code and examples are available in the Gate V8.0 release.

  17. Monte Carlo simulation: Its status and future

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murtha, J.A.

    1997-04-01

    Monte Carlo simulation is a statistics-based analysis tool that yields probability-vs.-value relationships for key parameters, including oil and gas reserves, capital exposure, and various economic yardsticks, such as net present value (NPV) and return on investment (ROI). Monte Carlo simulation is a part of risk analysis and is sometimes performed in conjunction with or as an alternative to decision [tree] analysis. The objectives are (1) to define Monte Carlo simulation in a more general context of risk and decision analysis; (2) to provide some specific applications, which can be interrelated; (3) to respond to some of the criticisms; (4) to offer some cautions about abuses of the method and recommend how to avoid the pitfalls; and (5) to predict what the future has in store.

  18. Physical time scale in kinetic Monte Carlo simulations of continuous-time Markov chains.

    PubMed

    Serebrinsky, Santiago A

    2011-03-01

    We rigorously establish a physical time scale for a general class of kinetic Monte Carlo algorithms for the simulation of continuous-time Markov chains. This class of algorithms encompasses rejection-free (or BKL) and rejection (or "standard") algorithms. For rejection algorithms, it was formerly considered that the availability of a physical time scale (instead of Monte Carlo steps) was empirical, at best. Use of Monte Carlo steps as a time unit now becomes completely unnecessary.
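    For the rejection-free case, the physical time scale in question is the standard exponential clock of a continuous-time Markov chain: for a configuration whose enabled events have total rate R = sum of the individual rates r_i, each step advances time by

      \[ \Delta t = -\frac{\ln u}{R}, \qquad u \sim U(0,1), \quad R = \sum_i r_i, \]

    with expectation 1/R. The paper's contribution is to establish rigorously that an analogous physical time scale also holds for rejection ("standard") algorithms, so Monte Carlo steps never need to serve as the time unit.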

  19. Monte Carlo simulation for kinetic chemotaxis model: An application to the traveling population wave

    NASA Astrophysics Data System (ADS)

    Yasuda, Shugo

    2017-02-01

    A Monte Carlo simulation of chemotactic bacteria is developed on the basis of the kinetic model and is applied to a one-dimensional traveling population wave in a microchannel. In this simulation, the Monte Carlo method, which calculates the run-and-tumble motions of bacteria, is coupled with a finite volume method that calculates the macroscopic transport of the chemical cues in the environment. The simulation method successfully reproduces the traveling population wave of bacteria observed experimentally and reveals the microscopic dynamics of individual bacteria coupled with the macroscopic transport of the chemical cues and the bacterial population density. The results obtained by the Monte Carlo method are also compared with the asymptotic solution derived from the kinetic chemotaxis equation in the continuum limit, where the Knudsen number, defined as the ratio of the mean free path of a bacterium to the characteristic length of the system, vanishes. The validity of the Monte Carlo method in the asymptotic behavior for small Knudsen numbers is numerically verified.
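    A minimal run-and-tumble update of the kind such a simulation performs on each bacterium can be sketched as follows (a one-dimensional toy with a fixed attractant gradient standing in for the finite-volume chemical field; all parameters are illustrative):

      import numpy as np

      rng = np.random.default_rng(1)
      n, v, dt = 1000, 1.0, 0.01            # bacteria, run speed, time step
      lam0, beta = 1.0, 0.5                 # base tumble rate, chemotactic bias
      x = np.zeros(n)                       # positions
      d = rng.choice([-1.0, 1.0], n)        # run directions

      def grad_s(x):
          # Illustrative attractant gradient pointing toward x = 10.
          return np.where(x < 10.0, 1.0, -1.0)

      for _ in range(20_000):
          x += v * d * dt                   # run
          # Tumble less often when moving up the gradient (biased walk).
          lam = np.clip(lam0 * (1.0 - beta * d * grad_s(x)), 0.0, None)
          tumbling = rng.random(n) < lam * dt
          d[tumbling] = rng.choice([-1.0, 1.0], tumbling.sum())

      print(f"mean position = {x.mean():.2f}")  # drifts toward the attractant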

  20. TU-AB-BRC-10: Modeling of Radiotherapy Linac Source Terms Using ARCHER Monte Carlo Code: Performance Comparison of GPU and MIC Computing Accelerators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, T; Lin, H; Xu, X

    Purpose: (1) To perform phase space (PS) based source modeling for Tomotherapy and Varian TrueBeam 6 MV Linacs, (2) to examine the accuracy and performance of the ARCHER Monte Carlo code on a heterogeneous computing platform with Many Integrated Core coprocessors (MIC, aka Xeon Phi) and GPUs, and (3) to explore software micro-optimization methods. Methods: The patient-specific source of the Tomotherapy and Varian TrueBeam Linacs was modeled using the PS approach. For the helical Tomotherapy case, the PS data were calculated in our previous study (Su et al., Medical Physics 41(7), 2014). For the single-view Varian TrueBeam case, we analytically derived them from the raw patient-independent PS data in IAEA's database, partial geometry information of the jaw and MLC, as well as the fluence map. The phantom was generated from DICOM images. The Monte Carlo simulation was performed by the ARCHER-MIC and GPU codes, which were benchmarked against a modified parallel DPM code. Software micro-optimization was systematically conducted and focused on SIMD vectorization of tight for-loops and data prefetch, with the ultimate goal of increasing 512-bit register utilization and reducing memory access latency. Results: Dose calculation was performed for two clinical cases, a Tomotherapy-based prostate cancer treatment and a TrueBeam-based left breast treatment. ARCHER was verified against the DPM code. The statistical uncertainty of the dose to the PTV was less than 1%. Using double precision, the total wall time of the multithreaded CPU code on a X5650 CPU was 339 seconds for the Tomotherapy case and 131 seconds for the TrueBeam case, while on three 5110P MICs it was reduced to 79 and 59 seconds, respectively. The single-precision GPU code on a K40 GPU took 45 seconds for the Tomotherapy dose calculation. Conclusion: We have extended ARCHER, the MIC- and GPU-based Monte Carlo dose engine, to Tomotherapy and TrueBeam dose calculations.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dana L. Kelly

    Typical engineering systems in applications with high failure consequences such as nuclear reactor plants often employ redundancy and diversity of equipment in an effort to lower the probability of failure and therefore risk. However, it has long been recognized that dependencies exist in these redundant and diverse systems. Some dependencies, such as common sources of electrical power, are typically captured in the logic structure of the risk model. Others, usually referred to as intercomponent dependencies, are treated implicitly by introducing one or more statistical parameters into the model. Such common-cause failure models have limitations in a simulation environment. In addition, substantial subjectivity is associated with parameter estimation for these models. This paper describes an approach in which system performance is simulated by drawing samples from the joint distributions of dependent variables. The approach relies on the notion of a copula distribution, a notion which has been employed by the actuarial community for ten years or more, but which has seen only limited application in technological risk assessment. The paper also illustrates how equipment failure data can be used in a Bayesian framework to estimate the parameter values in the copula model. This approach avoids much of the subjectivity required to estimate parameters in traditional common-cause failure models. Simulation examples are presented for failures in time. The open-source software package R is used to perform the simulations. The open-source software package WinBUGS is used to perform the Bayesian inference via Markov chain Monte Carlo sampling.
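    The core sampling step, drawing dependent failure times through a copula, can be sketched generically (a Gaussian copula with exponential margins is used here purely for illustration; the abstract does not state which copula family the paper uses):

      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(7)
      n, rho = 100_000, 0.8                      # samples, dependence strength
      cov = [[1.0, rho], [rho, 1.0]]

      # 1) Correlated standard normals -> 2) uniforms via the normal CDF.
      z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
      u = norm.cdf(z)

      # 3) Uniforms -> exponential failure times via inverse CDFs.
      lam = np.array([1e-3, 2e-3])               # per-hour failure rates
      t = -np.log(1.0 - u) / lam

      p_both = np.mean((t[:, 0] < 1000.0) & (t[:, 1] < 1000.0))
      print(f"P(both components fail before 1000 h) = {p_both:.3f}")

    The dependence enters only through step 1; with rho = 0 the two failure times would be independent.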

  2. Simulation of Rate-Related (Dead-Time) Losses In Passive Neutron Multiplicity Counting Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, L.G.; Norman, P.I.; Leadbeater, T.W.

    Passive Neutron Multiplicity Counting (PNMC) based on Multiplicity Shift Register (MSR) electronics (a form of time correlation analysis) is a widely used non-destructive assay technique for quantifying spontaneously fissile materials such as Pu. At high event rates, dead-time losses perturb the count rates, with the Singles, Doubles and Triples being increasingly affected. Without correction these perturbations are a major source of inaccuracy in the measured count rates and assay values derived from them. This paper presents the simulation of dead-time losses and investigates the effect of applying different dead-time models on the observed MSR data. Monte Carlo methods have been used to simulate neutron pulse trains for a variety of source intensities and with ideal detection geometry, providing an event by event record of the time distribution of neutron captures within the detection system. The action of the MSR electronics was modelled in software to analyse these pulse trains. Stored pulse trains were perturbed in software to apply the effects of dead-time according to the chosen physical process; for example, the ideal paralysable (extending) and non-paralysable models with an arbitrary dead-time parameter. Results of the simulations demonstrate the change in the observed MSR data when the system dead-time parameter is varied. In addition, the paralysable and non-paralysable models of dead-time are compared. These results form part of a larger study to evaluate existing dead-time corrections and to extend their application to correlated sources. (authors)
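    The two dead-time perturbations compared above can be applied to a stored pulse train as follows (a generic sketch with a Poisson train and an arbitrary dead-time parameter, not the authors' code):

      import numpy as np

      rng = np.random.default_rng(3)
      rate, t_end, tau = 1e5, 1.0, 2e-6               # cps, s, dead-time (s)
      times = np.sort(rng.uniform(0.0, t_end, rng.poisson(rate * t_end)))

      def non_paralysable(times, tau):
          kept, t_dead = [], -np.inf
          for t in times:
              if t >= t_dead:               # only recorded events start dead-time
                  kept.append(t)
                  t_dead = t + tau
          return np.array(kept)

      def paralysable(times, tau):
          # Every incident event (recorded or not) extends the dead period,
          # so an event survives only if the preceding gap is at least tau.
          return times[np.diff(times, prepend=-np.inf) >= tau]

      print("incident events:        ", times.size)
      print("non-paralysable counted:", non_paralysable(times, tau).size)
      print("paralysable counted:    ", paralysable(times, tau).size)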

  3. Stochastic simulation and analysis of biomolecular reaction networks

    PubMed Central

    Frazier, John M; Chushak, Yaroslav; Foy, Brent

    2009-01-01

    Background In recent years, several stochastic simulation algorithms have been developed to generate Monte Carlo trajectories that describe the time evolution of the behavior of biomolecular reaction networks. However, the effects of various stochastic simulation and data analysis conditions on the observed dynamics of complex biomolecular reaction networks have not received much attention. In order to investigate these issues, we employed a software package developed in our group, called Biomolecular Network Simulator (BNS), to simulate and analyze the behavior of such systems. The behavior of a hypothetical two-gene in vitro transcription-translation reaction network is investigated using the Gillespie exact stochastic algorithm to illustrate some of the factors that influence the analysis and interpretation of these data. Results Specific issues affecting the analysis and interpretation of simulation data are investigated, including: (1) the effect of time interval on data presentation and time-weighted averaging of molecule numbers, (2) the effect of the time averaging interval on reaction rate analysis, (3) the effect of the number of simulations on the precision of model predictions, and (4) the implications of stochastic simulations for optimization procedures. Conclusion The two main factors affecting the analysis of stochastic simulations are: (1) the selection of time intervals to compute or average state variables and (2) the number of simulations generated to evaluate the system behavior. PMID:19534796

  4. VirtualDose: a software for reporting organ doses from CT for adult and pediatric patients.

    PubMed

    Ding, Aiping; Gao, Yiming; Liu, Haikuan; Caracappa, Peter F; Long, Daniel J; Bolch, Wesley E; Liu, Bob; Xu, X George

    2015-07-21

    This paper describes the development and testing of VirtualDose, a software tool for reporting organ doses for adult and pediatric patients who undergo x-ray computed tomography (CT) examinations. The software is based on a comprehensive database of organ doses derived from Monte Carlo (MC) simulations involving a library of 25 anatomically realistic phantoms that represent patients of different ages, body sizes, body masses, and pregnancy stages. Models of the GE Lightspeed Pro 16 and Siemens SOMATOM Sensation 16 scanners were carefully validated for use in MC dose calculations. The software framework is designed with the 'software as a service (SaaS)' delivery concept, under which multiple clients can access the web-based interface simultaneously from any computer without having to install software locally. The RESTful web service API also allows a third-party picture archiving and communication system software package to seamlessly integrate with VirtualDose's functions. Software testing showed that VirtualDose was compatible with numerous operating systems including Windows, Linux, Apple OS X, and mobile and portable devices. The organ doses from VirtualDose were compared against those reported by CT-Expo and ImPACT, two dosimetry tools based on stylized pediatric and adult patient models known to be anatomically simple. The organ doses reported by VirtualDose differed from those reported by CT-Expo and ImPACT by as much as 300% in some of the patient models. These results confirm the conclusion from past studies that differences in anatomical realism offered by stylized and voxel phantoms have caused significant discrepancies in CT dose estimations.

  5. SU-E-T-314: The Application of Cloud Computing in Pencil Beam Scanning Proton Therapy Monte Carlo Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Z; Gao, M

    Purpose: Monte Carlo simulation plays an important role for the proton Pencil Beam Scanning (PBS) technique. However, MC simulation demands high computing power and is limited to the few large proton centers that can afford a computer cluster. We study the feasibility of utilizing cloud computing in the MC simulation of PBS beams. Methods: A GATE/GEANT4 based MC simulation software was installed on a commercial cloud computing virtual machine (Linux 64-bit, Amazon EC2). Single-spot Integral Depth Dose (IDD) curves and in-air transverse profiles were used to tune the source parameters to simulate an IBA machine. With the use of the StarCluster software developed at MIT, a Linux cluster with 2–100 nodes can be conveniently launched in the cloud. A proton PBS plan was then exported to the cloud, where the MC simulation was run. Results: The simulated PBS plan has a field size of 10×10 cm², 20 cm range, 10 cm modulation, and contains over 10,000 beam spots. EC2 instance type m1.medium was selected considering the CPU/memory requirement, and 40 instances were used to form a Linux cluster. To minimize cost, the master node was created as an on-demand instance and the worker nodes as spot instances. The hourly cost for the 40-node cluster was $0.63 and the projected cost for a 100-node cluster was $1.41. Ten million events were simulated to plot the PDD and profile, with each job containing 500k events. The simulation completed within 1 hour and an overall statistical uncertainty of < 2% was achieved. Good agreement between MC simulation and measurement was observed. Conclusion: Cloud computing is a cost-effective and easy-to-maintain platform to run proton PBS MC simulation. When proton MC packages such as GATE and TOPAS are combined with cloud computing, it will greatly facilitate the pursuit of PBS MC studies, especially for newly established proton centers or individual researchers.

  6. SU-E-T-481: Dosimetric Effects of Tissue Heterogeneity in Proton Therapy: Monte Carlo Simulation and Experimental Study Using Animal Tissue Phantoms.

    PubMed

    Liu, Y; Zheng, Y

    2012-06-01

    Accurate determination of the proton dosimetric effect of tissue heterogeneity is critical in proton therapy. Proton beams have a finite range, and consequently tissue heterogeneity plays a more critical role in proton therapy. The purpose of this study is to investigate the tissue heterogeneity effect in proton dosimetry based on anatomy-based Monte Carlo simulation using animal tissues. Animal tissues including a pig head and beef bulk were used in this study. Both the pig head and the beef were scanned using a GE CT scanner with 1.25 mm slice thickness. A treatment plan was created using the CMS XiO treatment planning system (TPS) with a single proton spread-out Bragg peak (SOBP) beam. Radiochromic films were placed at the distal falloff region. Image guidance was used to align the phantom before proton beams were delivered according to the treatment plan. The same two CT sets were converted into a Monte Carlo simulation model. The Monte Carlo dose calculations with and without tissue composition were compared to TPS calculations and measurements. Based on the preliminary comparison, at the center of the SOBP plane, the Monte Carlo simulation dose without tissue composition agreed generally well with the TPS calculation. In the distal falloff region, the dose difference was large, and an isodose line shift of about 2 mm was observed when tissue composition was considered. The detailed comparison of dose distributions between Monte Carlo simulation, TPS calculations and measurements is underway. Accurate proton dose calculations are challenging in proton treatment planning for heterogeneous tissues. Tissue heterogeneity and tissue composition may lead to isodose line shifts of up to a few millimeters in the distal falloff region. By simulating detailed particle transport and energy deposition, Monte Carlo simulations provide a verification method for proton dose calculation where inhomogeneous tissues are present. © 2012 American Association of Physicists in Medicine.

  7. Self-learning Monte Carlo method

    DOE PAGES

    Liu, Junwei; Qi, Yang; Meng, Zi Yang; ...

    2017-01-04

    Monte Carlo simulation is an unbiased numerical tool for studying classical and quantum many-body systems. One of its bottlenecks is the lack of a general and efficient update algorithm for large size systems close to the phase transition, for which local updates perform badly. In this Rapid Communication, we propose a general-purpose Monte Carlo method, dubbed self-learning Monte Carlo (SLMC), in which an efficient update algorithm is first learned from the training data generated in trial simulations and then used to speed up the actual simulation. Lastly, we demonstrate the efficiency of SLMC in a spin model at the phase transition point, achieving a 10–20 times speedup.
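    The key to such schemes is that the learned model only proposes moves; exactness is restored by a detailed-balance acceptance step. For proposals drawn from the equilibrium distribution of an effective (learned) energy H_eff, the generic acceptance probability is

      \[ A(x \to x') = \min\!\left(1,\; e^{-\beta\left[H(x') - H(x)\right]}\, e^{+\beta\left[H_{\mathrm{eff}}(x') - H_{\mathrm{eff}}(x)\right]}\right), \]

    so any error in the learned model cancels on average and affects only the efficiency, not the correctness, of the simulation. (This is the standard form for such auxiliary-model proposals; the paper's own notation may differ.)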

  8. Brownian dynamics and dynamic Monte Carlo simulations of isotropic and liquid crystal phases of anisotropic colloidal particles: a comparative study.

    PubMed

    Patti, Alessandro; Cuetos, Alejandro

    2012-07-01

    We report on the diffusion of purely repulsive and freely rotating colloidal rods in the isotropic, nematic, and smectic liquid crystal phases to probe the agreement between Brownian and Monte Carlo dynamics under the most general conditions. By properly rescaling the Monte Carlo time step, being related to any elementary move via the corresponding self-diffusion coefficient, with the acceptance rate of simultaneous trial displacements and rotations, we demonstrate the existence of a unique Monte Carlo time scale that allows for a direct comparison between Monte Carlo and Brownian dynamics simulations. To estimate the validity of our theoretical approach, we compare the mean square displacement of rods, their orientational autocorrelation function, and the self-intermediate scattering function, as obtained from Brownian dynamics and Monte Carlo simulations. The agreement between the results of these two approaches, even under the condition of heterogeneous dynamics generally observed in liquid crystalline phases, is excellent.

  9. Monte Carlo Techniques for Nuclear Systems - Theory Lectures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B.

    These are lecture notes for a Monte Carlo class given at the University of New Mexico. The following topics are covered: course information; nuclear eng. review & MC; random numbers and sampling; computational geometry; collision physics; tallies and statistics; eigenvalue calculations I; eigenvalue calculations II; eigenvalue calculations III; variance reduction; parallel Monte Carlo; parameter studies; fission matrix and higher eigenmodes; doppler broadening; Monte Carlo depletion; HTGR modeling; coupled MC and T/H calculations; fission energy deposition. Solving particle transport problems with the Monte Carlo method is simple - just simulate the particle behavior. The devil is in the details, however. These lectures provide a balanced approach to the theory and practice of Monte Carlo simulation codes. The first lectures provide an overview of Monte Carlo simulation methods, covering the transport equation, random sampling, computational geometry, collision physics, and statistics. The next lectures focus on the state-of-the-art in Monte Carlo criticality simulations, covering the theory of eigenvalue calculations, convergence analysis, dominance ratio calculations, bias in Keff and tallies, bias in uncertainties, a case study of a realistic calculation, and Wielandt acceleration techniques. The remaining lectures cover advanced topics, including HTGR modeling and stochastic geometry, temperature dependence, fission energy deposition, depletion calculations, parallel calculations, and parameter studies. This portion of the class focuses on using MCNP to perform criticality calculations for reactor physics and criticality safety applications. It is an intermediate level class, intended for those with at least some familiarity with MCNP. Class examples provide hands-on experience at running the code, plotting both geometry and results, and understanding the code output. The class includes lectures & hands-on computer use for a variety of Monte Carlo calculations. Beginning MCNP users are encouraged to review LA-UR-09-00380, "Criticality Calculations with MCNP: A Primer (3rd Edition)" (available at http://mcnp.lanl.gov under "Reference Collection") prior to the class. No Monte Carlo class can be complete without having students write their own simple Monte Carlo routines for basic random sampling, use of the random number generator, and simplified particle transport simulation.

  10. Web-Based Model Visualization Tools to Aid in Model Optimization and Uncertainty Analysis

    NASA Astrophysics Data System (ADS)

    Alder, J.; van Griensven, A.; Meixner, T.

    2003-12-01

    Individuals applying hydrologic models need quick, easy-to-use visualization tools to assess and understand model performance. We present here the Interactive Hydrologic Modeling (IHM) visualization toolbox. IHM utilizes high-speed Internet access, the portability of the web, and the increasing power of modern computers to provide an online toolbox for quick and easy visualization of model results. This visualization interface allows for the interpretation and analysis of Monte-Carlo and batch model simulation results. Often a given project will generate several thousand or even hundreds of thousands of simulations, which creates a challenge for post-simulation analysis. IHM's goal is to solve this problem by loading all of the data into a database with a web interface that can dynamically generate graphs for the user according to their needs. IHM currently supports: a global sample statistics table (e.g., sum of squared errors, sum of absolute differences), a top-ten simulations table and graphs, graphs of an individual simulation using time step data, objective-based dotty plots, threshold-based parameter cumulative density function graphs (as used in the regional sensitivity analysis of Spear and Hornberger), and 2D error surface graphs of the parameter space. IHM is suitable for everything from the simplest bucket model to the largest set of Monte-Carlo model simulations with a multi-dimensional parameter and model output space. By using a web interface, IHM offers the user complete flexibility: they can be anywhere in the world using any operating system. IHM can be a time-saving and money-saving alternative to producing graphs or conducting analysis that may not be informative, or to purchasing expensive proprietary software. IHM is a simple, free method of interpreting and analyzing batch model results, and is suitable for novice to expert hydrologic modelers.

  11. Monte Carlo Simulation of Microscopic Stock Market Models

    NASA Astrophysics Data System (ADS)

    Stauffer, Dietrich

    Computer simulations with random numbers, that is, Monte Carlo methods, have been applied extensively in recent years to model the fluctuations of stock market or currency exchange rates. Here we concentrate on the percolation model of Cont and Bouchaud, to simulate, not to predict, the market behavior.

  12. Radiotherapy Monte Carlo simulation using cloud computing technology.

    PubMed

    Poole, C M; Cornelius, I; Trapp, J V; Langton, C M

    2012-12-01

    Cloud computing allows for vast computational resources to be leveraged quickly and easily in bursts as and when required. Here we describe a technique that allows for Monte Carlo radiotherapy dose calculations to be performed using GEANT4 and executed in the cloud, with relative simulation cost and completion time evaluated as a function of machine count. As expected, simulation completion time decreases as 1/n for n parallel machines, and relative simulation cost is found to be optimal where n is a factor of the total simulation time in hours. Using the technique, we demonstrate, as a proof of principle, the potential usefulness of cloud computing as a solution for rapid Monte Carlo simulation for radiotherapy dose calculation without the need for dedicated local computer hardware.

  13. Analytical Applications of Monte Carlo Techniques.

    ERIC Educational Resources Information Center

    Guell, Oscar A.; Holcombe, James A.

    1990-01-01

    Described are analytical applications of the theory of random processes, in particular solutions obtained by using statistical procedures known as Monte Carlo techniques. Supercomputer simulations, sampling, integration, ensemble, annealing, and explicit simulation are discussed. (CW)

  14. Modeling Nitrogen Dynamics in a Waste Stabilization Pond System Using Flexible Modeling Environment with MCMC.

    PubMed

    Mukhtar, Hussnain; Lin, Yu-Pin; Shipin, Oleg V; Petway, Joy R

    2017-07-12

    This study presents an approach for obtaining realization sets of parameters for nitrogen removal in a pilot-scale waste stabilization pond (WSP) system. The proposed approach was designed for optimal parameterization, local sensitivity analysis, and global uncertainty analysis of a dynamic simulation model of the WSP by using the R software package Flexible Modeling Environment (R-FME) with the Markov chain Monte Carlo (MCMC) method. Additionally, generalized likelihood uncertainty estimation (GLUE) was integrated into the FME to evaluate the major parameters that affect the simulation outputs in the study WSP. Comprehensive modeling analysis was used to simulate and assess nine parameters and the concentrations of ON-N, NH₃-N and NO₃-N. Results indicate that the integrated FME-GLUE-based model, with good Nash-Sutcliffe coefficients (0.53-0.69) and correlation coefficients (0.76-0.83), successfully simulates the concentrations of ON-N, NH₃-N and NO₃-N. Moreover, the Arrhenius constant was the only parameter sensitive to the model performance of the ON-N and NH₃-N simulations. However, the Nitrosomonas growth rate, the denitrification constant, and the maximum growth rate at 20 °C were sensitive to the ON-N and NO₃-N simulations, as measured using global sensitivity analysis.

  15. The Probe Profile and Lateral Resolution of Scanning Transmission Electron Microscopy of Thick Specimens

    PubMed Central

    Demers, Hendrix; Ramachandra, Ranjan; Drouin, Dominique; de Jonge, Niels

    2012-01-01

    Lateral profiles of the electron probe of scanning transmission electron microscopy (STEM) were simulated at different vertical positions in a micrometers-thick carbon sample. The simulations were carried out using the Monte Carlo method in the CASINO software. A model was developed to fit the probe profiles. The model consisted of the sum of a Gaussian function describing the central peak of the profile and two exponential decay functions describing the tail of the profile. Calculations were performed to investigate the fraction of unscattered electrons as a function of the vertical position of the probe in the sample. Line scans were also simulated over gold nanoparticles at the bottom of a carbon film to calculate the achievable resolution as a function of the sample thickness and the number of electrons. The resolution was shown to be noise limited for film thicknesses less than 1 μm. Probe broadening limited the resolution for thicker films. The validity of the simulation method was verified by comparing simulated data with experimental data. The simulation method can be used as a quantitative method to predict STEM performance or to interpret STEM images of thick specimens. PMID:22564444
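    The fitted probe-profile model described above can be written explicitly (the symbols are ours; the abstract does not give its notation):

      \[ I(r) \;=\; A\,e^{-r^{2}/2\sigma^{2}} \;+\; B_{1}\,e^{-r/\lambda_{1}} \;+\; B_{2}\,e^{-r/\lambda_{2}}, \]

    where the Gaussian term with width σ describes the central peak and the two exponential terms with decay lengths λ₁ and λ₂ describe the beam-broadened tail.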

  16. Optimization of the Monte Carlo code for modeling of photon migration in tissue.

    PubMed

    Zołek, Norbert S; Liebert, Adam; Maniewski, Roman

    2006-10-01

    The Monte Carlo method is frequently used to simulate light transport in turbid media because of its simplicity and flexibility, which allow complicated geometrical structures to be analyzed. Monte Carlo simulations are, however, time-consuming because of the necessity to track the paths of individual photons. The computational cost is mainly associated with the calculation of the logarithmic and trigonometric functions as well as the generation of pseudo-random numbers. In this paper, the Monte Carlo algorithm was developed and optimized by approximating the logarithmic and trigonometric functions. The approximations were based on polynomial and rational functions, and the errors of these approximations are less than 1% of the values of the original functions. The proposed algorithm was verified by simulations of the time-resolved reflectance at several source-detector separations. The results of the calculation using the approximated algorithm were compared with those of the Monte Carlo simulations obtained with exact computation of the logarithmic and trigonometric functions as well as with the solution of the diffusion equation. The errors of the moments of the simulated distributions of times of flight of photons (total number of photons, mean time of flight and variance) are less than 2% for a range of optical properties typical of living tissues. The proposed approximated algorithm speeds up the Monte Carlo simulations by a factor of 4. The developed code can be used on parallel machines, allowing for further acceleration.
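    The idea can be illustrated by fitting a polynomial replacement for the logarithm used when sampling photon step lengths, s = -ln(ξ)/μt, and checking its error at runtime (a sketch only; the paper's actual polynomial and rational approximations and their coefficients are not given in the abstract):

      import numpy as np

      # Fit a degree-9 polynomial to ln(x) on a subinterval of (0, 1).
      x = np.linspace(0.05, 1.0, 10_000)
      coeffs = np.polynomial.polynomial.polyfit(x, np.log(x), deg=9)

      def fast_log(x):
          # Stand-in for the optimized approximation (coefficients are ours).
          return np.polynomial.polynomial.polyval(x, coeffs)

      err = np.max(np.abs(fast_log(x) - np.log(x)))
      print(f"max absolute error on [0.05, 1]: {err:.2e}")

    In an actual transport code the payoff comes from evaluating such a polynomial (a handful of multiply-adds) instead of calling the exact logarithmic and trigonometric routines in the inner photon loop.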

  17. Fast multipurpose Monte Carlo simulation for proton therapy using multi- and many-core CPU architectures.

    PubMed

    Souris, Kevin; Lee, John Aldo; Sterpin, Edmond

    2016-04-01

    Accuracy in proton therapy treatment planning can be improved using Monte Carlo (MC) simulations. However, the long computation time of such methods hinders their use in clinical routine. This work aims to develop a fast multipurpose Monte Carlo simulation tool for proton therapy using massively parallel central processing unit (CPU) architectures. A new Monte Carlo code, called MCsquare (many-core Monte Carlo), has been designed and optimized for the last generation of Intel Xeon processors and Intel Xeon Phi coprocessors. These massively parallel architectures offer the flexibility and the computational power suitable to MC methods. The class-II condensed history algorithm of MCsquare provides a fast and yet accurate method of simulating heavy charged particles such as protons, deuterons, and alphas inside voxelized geometries. Hard ionizations, with energy losses above a user-specified threshold, are simulated individually, while soft events are regrouped in a multiple scattering theory. Elastic and inelastic nuclear interactions are sampled from ICRU 63 differential cross sections, thereby allowing for the computation of prompt gamma emission profiles. MCsquare has been benchmarked against the gate/geant4 Monte Carlo application for homogeneous and heterogeneous geometries. Comparisons with gate/geant4 for various geometries show deviations within 2%-1 mm. In spite of the limited memory bandwidth of the coprocessor, simulation time is below 25 s for 10^7 primary 200 MeV protons in average soft tissues using all Xeon Phi and CPU resources embedded in a single desktop unit. MCsquare exploits the flexibility of CPU architectures to provide a multipurpose MC simulation tool. The optimized code enables the use of accurate MC calculation within a reasonable computation time, adequate for clinical practice. MCsquare also simulates prompt gamma emission and can thus also be used for in vivo range verification.

  18. Development of a software package for solid-angle calculations using the Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Zhang, Jie; Chen, Xiulian; Zhang, Changsheng; Li, Gang; Xu, Jiayun; Sun, Guangai

    2014-02-01

    Solid-angle calculations play an important role in the absolute calibration of radioactivity measurement systems and in the determination of the activity of radioactive sources, and they are often complicated. In the present paper, a software package is developed to provide a convenient tool for solid-angle calculations in nuclear physics. The proposed software calculates solid angles using the Monte Carlo method, in which a new type of variance reduction technique was integrated. The package, developed under the environment of Microsoft Foundation Classes (MFC) in Microsoft Visual C++, has a graphical user interface, in which the visualization function is integrated in conjunction with OpenGL. One advantage of the proposed software package is that it can calculate, without any difficulty, the solid angle subtended by a detector of various geometric shapes (e.g., cylinder, square prism, regular triangular prism or regular hexagonal prism) at a point, circular or cylindrical source. The results obtained from the proposed software package were compared with those obtained from previous studies and those calculated using Geant4. The comparison shows that the proposed software package can produce accurate solid-angle values with a greater computation speed than Geant4.
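    The basic estimator underlying such a tool is easy to state (this sketch uses naive sampling without the package's variance-reduction technique): sample isotropic directions from a point source and count hits on the detector, here a coaxial disk so the result can be checked analytically.

      import numpy as np

      rng = np.random.default_rng(5)
      n = 5_000_000
      R, z0 = 1.0, 2.0                  # disk radius and on-axis distance

      # Isotropic directions: cos(theta) uniform on [-1, 1]; the azimuth
      # is irrelevant here because the disk is coaxial with the source.
      u = rng.uniform(-1.0, 1.0, n)

      # A ray hits the plane z = z0 only if u > 0; its radial distance
      # there is z0 * sin(theta) / cos(theta).
      up = u > 0
      r_hit = z0 * np.sqrt(1.0 - u[up] ** 2) / u[up]
      hits = np.count_nonzero(r_hit <= R)

      omega_mc = 4.0 * np.pi * hits / n
      omega_exact = 2.0 * np.pi * (1.0 - z0 / np.hypot(R, z0))
      print(f"MC: {omega_mc:.5f} sr,  exact: {omega_exact:.5f} sr")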

  19. Coupling Monte Carlo simulations with thermal analysis for correcting microdosimetric spectra from a novel micro-calorimeter

    NASA Astrophysics Data System (ADS)

    Fathi, K.; Galer, S.; Kirkby, K. J.; Palmans, H.; Nisbet, A.

    2017-11-01

    The high uncertainty in the Relative Biological Effectiveness (RBE) values of particle therapy beams, which are used in combination with the quantity absorbed dose in radiotherapy, together with the increase in the number of particle therapy centres worldwide, necessitates a better understanding of the biological effect of such modalities. The present novel study is part of the performance testing and development of a micro-calorimeter based on Superconducting QUantum Interference Devices (SQUIDs). Unlike other microdosimetric detectors that are used for investigating the energy distribution, this detector provides a direct measurement of energy deposition at the micrometre scale, which can be used to improve our understanding of biological effects in particle therapy applications, radiation protection and environmental dosimetry. Temperature rises of less than 1 μK are detectable and, when combined with the low specific heat capacity of the absorber at cryogenic temperature, an extremely high energy deposition sensitivity of approximately 0.4 eV can be achieved. The detector consists of three layers: a tissue equivalent (TE) absorber, a superconducting (SC) absorber and a silicon substrate. Ideally all energy would be absorbed in the TE absorber and the heat rise in the superconducting layer would arise from heat conduction from the TE layer. However, in practice direct particle absorption occurs in all three layers and must be corrected for. To investigate the thermal behaviour within the detector, and to quantify any possible correction, particle tracks were simulated employing Geant4 (v9.6) Monte Carlo simulations. The track information was then passed to the COMSOL Multiphysics (Finite Element Method) software. The 3D heat transfer within each layer was then evaluated in a time-dependent model. For a statistically reliable outcome, the simulations had to be repeated for a large number of particles. An automated system has been developed that couples the Geant4 Monte Carlo output to COMSOL for determining the expected distribution of proton tracks and their thermal contribution within the detector. The correction factor for a 3.8 MeV proton pencil beam was determined and applied to the expected spectra. The corrected microdosimetric spectrum was shown to be in good agreement with the ideal spectrum.

  20. Feasibility of Small Animal Anatomical and Functional Imaging with Neutrons: A Monte Carlo Simulation Study

    NASA Astrophysics Data System (ADS)

    Medich, David C.; Currier, Blake H.; Karellas, Andrew

    2014-10-01

    A novel technique is presented for obtaining a single in-vivo image containing both functional and anatomical information in a small animal model such as a mouse. This technique, which incorporates appropriate image neutron-scatter rejection and uses a neutron opaque contrast agent, is based on neutron radiographic technology and was demonstrated through a series of Monte Carlo simulations. With respect to functional imaging, this technique can be useful in biomedical and biological research because it could achieve a spatial resolution orders of magnitude better than what presently can be achieved with current functional imaging technologies such as nuclear medicine (PET, SPECT) and fMRI. For these studies, Monte Carlo simulations were performed with thermal (0.025 eV) neutrons in a 3 cm thick phantom using the MCNP5 simulation software. The goals of these studies were to determine: 1) the extent to which scattered neutrons degrade image contrast; 2) the contrasts of various normal and diseased tissues under conditions of complete scatter rejection; 3) the concentrations of Boron-10 and Gadolinium-157 required for contrast differentiation in functional imaging; and 4) the efficacy of collimation for neutron scatter image rejection. Results demonstrate that with proper neutron-scatter rejection, a neutron fluence of 2 × 10⁷ n/cm² will provide a signal-to-noise ratio of at least one (S/N ≥ 1) when attempting to image various 300 μm thick tissues placed in a 3 cm thick phantom. Similarly, a neutron fluence of only 1 × 10⁷ n/cm² is required to differentiate a 300 μm thick diseased tissue relative to its normal tissue counterpart. The utility of a B-10 contrast agent was demonstrated at a concentration of 50 μg/g to achieve S/N ≥ 1 in 0.3 mm thick tissues, while Gd-157 requires only slightly more than 10 μg/g to achieve the same level of differentiation. Lastly, neutron collimators with L/D ratios from 50 to 200 were calculated to provide appropriate scatter rejection for thick tissue biological imaging with neutrons.

  1. Building Process Improvement Business Cases Using Bayesian Belief Networks and Monte Carlo Simulation

    DTIC Science & Technology

    2009-07-01

    simulation. The pilot described in this paper used this two-step approach within a Define, Measure, Analyze, Improve, and Control (DMAIC) framework to... Keywords: Bayesian belief networks, BBN, Monte Carlo simulation, DMAIC, Six Sigma, business case.

  2. Discrete Fractional Component Monte Carlo Simulation Study of Dilute Nonionic Surfactants at the Air-Water Interface.

    PubMed

    Yoo, Brian; Marin-Rimoldi, Eliseo; Mullen, Ryan Gotchy; Jusufi, Arben; Maginn, Edward J

    2017-09-26

    We present a newly developed Monte Carlo scheme to predict bulk surfactant concentrations and surface tensions at the air-water interface for various surfactant interfacial coverages. Since the concentration regimes of these systems of interest are typically very dilute (≪10⁻⁵ mol. frac.), Monte Carlo simulations with the use of insertion/deletion moves can provide the ability to overcome finite system size limitations that often prohibit the use of modern molecular simulation techniques. In performing these simulations, we use the discrete fractional component Monte Carlo (DFCMC) method in the Gibbs ensemble framework, which allows us to separate the bulk and air-water interface into two separate boxes and efficiently swap tetraethylene glycol surfactants C10E4 between boxes. Combining this move with preferential translations, volume biased insertions, and Wang-Landau biasing vastly enhances sampling and helps overcome the classical "insertion problem", often encountered in non-lattice Monte Carlo simulations. We demonstrate that this methodology is both consistent with the original molecular thermodynamic theory (MTT) of Blankschtein and co-workers, as well as their recently modified theory (MD/MTT), which incorporates the results of surfactant infinite dilution transfer free energies and surface tension calculations obtained from molecular dynamics simulations.

  3. Free energy and phase equilibria for the restricted primitive model of ionic fluids from Monte Carlo simulations

    NASA Astrophysics Data System (ADS)

    Orkoulas, Gerassimos; Panagiotopoulos, Athanassios Z.

    1994-07-01

    In this work, we investigate the liquid-vapor phase transition of the restricted primitive model of ionic fluids. We show that at the low temperatures where the phase transition occurs, the system cannot be studied by conventional molecular simulation methods because convergence to equilibrium is slow. To accelerate convergence, we propose cluster Monte Carlo moves capable of moving more than one particle at a time. We then address the issue of charged particle transfers in grand canonical and Gibbs ensemble Monte Carlo simulations, for which we propose a biased particle insertion/destruction scheme capable of sampling short interparticle distances. We compute the chemical potential for the restricted primitive model as a function of temperature and density from grand canonical Monte Carlo simulations and the phase envelope from Gibbs Monte Carlo simulations. Our calculated phase coexistence curve is in agreement with recent results of Caillol obtained on the four-dimensional hypersphere and our own earlier Gibbs ensemble simulations with single-ion transfers, with the exception of the critical temperature, which is lower in the current calculations. Our best estimates for the critical parameters are T*c=0.053, ρ*c=0.025. We conclude with possible future applications of the biased techniques developed here for phase equilibrium calculations for ionic fluids.

  4. Monte Carlo simulation of photon migration in a cloud computing environment with MapReduce

    PubMed Central

    Pratx, Guillem; Xing, Lei

    2011-01-01

    Monte Carlo simulation is considered the most reliable method for modeling photon migration in heterogeneous media. However, its widespread use is hindered by the high computational cost. The purpose of this work is to report on our implementation of a simple MapReduce method for performing fault-tolerant Monte Carlo computations in a massively-parallel cloud computing environment. We ported the MC321 Monte Carlo package to Hadoop, an open-source MapReduce framework. In this implementation, Map tasks compute photon histories in parallel while a Reduce task scores photon absorption. The distributed implementation was evaluated on a commercial compute cloud. The simulation time was found to be linearly dependent on the number of photons and inversely proportional to the number of nodes. For a cluster size of 240 nodes, the simulation of 100 billion photon histories took 22 min, a 1258× speed-up compared to the single-threaded Monte Carlo program. The overall computational throughput was 85,178 photon histories per node per second, with a latency of 100 s. The distributed simulation produced the same output as the original implementation and was resilient to hardware failure: the correctness of the simulation was unaffected by the shutdown of 50% of the nodes. PMID:22191916
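    The map/reduce split described above can be mimicked on a single machine with a process pool (a toy sketch, not the Hadoop/MC321 implementation; the "photon history" here is deliberately trivial):

      import numpy as np
      from multiprocessing import Pool

      def map_task(args):
          # One map task: simulate a batch of photon histories with its own
          # seed and return this worker's partial absorption tally.
          seed, n_photons = args
          rng = np.random.default_rng(seed)
          absorbed_weight = rng.exponential(0.01, n_photons)  # toy "histories"
          return absorbed_weight.sum()

      def reduce_task(partials):
          # The reduce step just scores (sums) the partial tallies.
          return sum(partials)

      if __name__ == "__main__":
          jobs = [(seed, 250_000) for seed in range(8)]
          with Pool(4) as pool:
              partials = pool.map(map_task, jobs)
          print("total absorbed weight:", reduce_task(partials))

    Independent per-task seeds are what make the scheme tolerant of failures: a lost task can simply be re-run without affecting any other partial result.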

  5. A framework for stochastic simulations and visualization of biological electron-transfer dynamics

    NASA Astrophysics Data System (ADS)

    Nakano, C. Masato; Byun, Hye Suk; Ma, Heng; Wei, Tao; El-Naggar, Mohamed Y.

    2015-08-01

    Electron transfer (ET) dictates a wide variety of energy-conversion processes in biological systems. Visualizing ET dynamics could provide key insight into understanding and possibly controlling these processes. We present a computational framework named VizBET to visualize biological ET dynamics, using an outer-membrane Mtr-Omc cytochrome complex in Shewanella oneidensis MR-1 as an example. Starting from X-ray crystal structures of the constituent cytochromes, molecular dynamics simulations are combined with homology modeling, protein docking, and binding free energy computations to sample the configuration of the complex as well as the change of the free energy associated with ET. This information, along with quantum-mechanical calculations of the electronic coupling, provides inputs to kinetic Monte Carlo (KMC) simulations of ET dynamics in a network of heme groups within the complex. Visualization of the KMC simulation results has been implemented as a plugin to the Visual Molecular Dynamics (VMD) software. VizBET has been used to reveal the nature of ET dynamics associated with novel nonequilibrium phase transitions in a candidate configuration of the Mtr-Omc complex due to electron-electron interactions.

  6. On the utility of graphics cards to perform massively parallel simulation of advanced Monte Carlo methods

    PubMed Central

    Lee, Anthony; Yau, Christopher; Giles, Michael B.; Doucet, Arnaud; Holmes, Christopher C.

    2011-01-01

    We present a case-study on the utility of graphics cards to perform massively parallel simulation of advanced Monte Carlo methods. Graphics cards, containing multiple Graphics Processing Units (GPUs), are self-contained parallel computational devices that can be housed in conventional desktop and laptop computers and can be thought of as prototypes of the next generation of many-core processors. For certain classes of population-based Monte Carlo algorithms they offer massively parallel simulation, with the added advantage over conventional distributed multi-core processors that they are cheap, easily accessible, easy to maintain, easy to code, dedicated local devices with low power consumption. On a canonical set of stochastic simulation examples including population-based Markov chain Monte Carlo methods and Sequential Monte Carlo methods, we find speedups from 35- to 500-fold over conventional single-threaded computer code. Our findings suggest that GPUs have the potential to facilitate the growth of statistical modelling into complex data rich domains through the availability of cheap and accessible many-core computation. We believe the speedup we observe should motivate wider use of parallelizable simulation methods and greater methodological attention to their design. PMID:22003276

  7. A dental public health approach based on computational mathematics: Monte Carlo simulation of childhood dental decay.

    PubMed

    Tennant, Marc; Kruger, Estie

    2013-02-01

    This study developed a Monte Carlo simulation approach to examining the prevalence and incidence of dental decay, using Australian children as a test environment. Monte Carlo simulation has been used for half a century in particle physics (and elsewhere); put simply, randomly seeded population-level outcome probabilities drive the production of individual-level data. A total of five runs of the simulation model for all 275,000 12-year-olds in Australia were completed based on 2005-2006 data. Measured on average decayed/missing/filled teeth (DMFT) and the DMFT of the highest 10% of the sample (Sic10), the runs did not differ from each other by more than 2%, and the outcome was within 5% of the reported sampled population data. The simulations rested on the population probabilities that are known to be strongly linked to dental decay, namely socio-economic status and Indigenous heritage. Testing the simulated population found that the DMFT of all cases where DMFT ≠ 0 was 2.3 (n = 128,609) and the DMFT for Indigenous cases only was 1.9 (n = 13,749). In the simulated population the Sic25 was 3.3 (n = 68,750). Monte Carlo simulations were created in particle physics as a computational mathematical approach to unknown individual-level effects, resting a simulation on known population-level probabilities. In this study a Monte Carlo simulation approach to childhood dental decay was built, tested and validated. © 2013 FDI World Dental Federation.
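
    The core mechanism, seeding individual-level outcomes from known population-level probabilities, can be sketched in a few lines of Python. The risk probabilities and severity draw below are invented placeholders, not the published Australian figures.

        import random

        # Placeholder P(decay) conditioned on population-level risk factors (invented)
        P_DECAY = {("low_ses", True): 0.55, ("low_ses", False): 0.35,
                   ("high_ses", True): 0.40, ("high_ses", False): 0.20}

        def simulate_child(ses, indigenous, rng=random):
            """Seed one child's DMFT outcome from population-level probabilities."""
            if rng.random() < P_DECAY[(ses, indigenous)]:
                return rng.randint(1, 8)      # crude severity draw for an affected child
            return 0

        population = [simulate_child(random.choice(["low_ses", "high_ses"]),
                                     random.random() < 0.05)
                      for _ in range(100_000)]
        affected = [d for d in population if d > 0]
        print(f"mean DMFT where DMFT != 0: {sum(affected) / len(affected):.2f}")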

  8. CloudMC: a cloud computing application for Monte Carlo simulation.

    PubMed

    Miras, H; Jiménez, R; Miras, C; Gomà, C

    2013-04-21

    This work presents CloudMC, a cloud computing application, developed in Windows Azure® (the platform of the Microsoft® cloud), for the parallelization of Monte Carlo simulations in a dynamic virtual cluster. CloudMC is a web application designed to be independent of the Monte Carlo code on which the simulations are based; the simulations just need to be of the form: input files → executable → output files. To study the performance of CloudMC in Windows Azure®, Monte Carlo simulations with penelope were performed for different instance (virtual machine) sizes and different numbers of instances. The instance size was found to have no effect on the simulation runtime. It was also found that the decrease in time with the number of instances followed Amdahl's law, with a slight deviation due to the increase in the fraction of non-parallelizable time with an increasing number of instances. A simulation that would have required 30 h of CPU time on a single instance was completed in 48.6 min when executed on 64 instances in parallel (a speedup of 37×). Furthermore, the use of cloud computing for parallel computing offers some advantages over conventional clusters: high accessibility, scalability and pay-per-usage. Therefore, it is strongly believed that cloud computing will play an important role in making Monte Carlo dose calculation a reality in future clinical practice.
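
    The reported scaling can be checked against Amdahl's law directly: the speedup on n instances is S(n) = 1 / (f + (1 - f)/n), where f is the non-parallelizable fraction. A serial fraction of roughly 1.2% (an inferred value, not one stated in the abstract) reproduces the observed 37× speedup on 64 instances:

        def amdahl_speedup(n, serial_fraction):
            """Amdahl's law: speedup on n instances for a given serial fraction."""
            return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n)

        for n in (1, 8, 16, 32, 64):
            print(n, round(amdahl_speedup(n, 0.0116), 1))   # 64 -> ~37.0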

  9. Bayesian Inference on Proportional Elections

    PubMed Central

    Brunello, Gabriel Hideki Vatanabe; Nakano, Eduardo Yoshio

    2015-01-01

    Polls for majoritarian voting systems usually show estimates of the percentage of votes for each candidate. However, proportional voting systems do not necessarily guarantee that the candidate with the highest percentage of votes will be elected. Thus, traditional methods used in majoritarian elections cannot be applied to proportional elections. In this context, the purpose of this paper was to perform Bayesian inference on proportional elections considering the Brazilian system of seat distribution. More specifically, a methodology was developed to estimate the probability that a given party will have representation in the Chamber of Deputies. Inferences were made in a Bayesian framework using the Monte Carlo simulation technique, and the developed methodology was applied to data from the Brazilian elections for Members of the Legislative Assembly and Federal Chamber of Deputies in 2010. A performance rate was also presented to evaluate the efficiency of the methodology. Calculations and simulations were carried out using the free R statistical software. PMID:25786259
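
    The abstract does not spell out the algorithm, but Brazilian Chamber seats are distributed by a proportional largest-averages (D'Hondt-style) rule, so the calculation can be sketched as: draw vote shares from a posterior (a Dirichlet placeholder here), allocate seats, and count how often the party of interest is seated. All numbers below are illustrative assumptions, not the authors' data or code.

        import random

        def dhondt(votes, seats):
            """Allocate seats by the D'Hondt largest-averages rule."""
            alloc = [0] * len(votes)
            for _ in range(seats):
                quotients = [v / (a + 1) for v, a in zip(votes, alloc)]
                alloc[quotients.index(max(quotients))] += 1
            return alloc

        def prob_representation(alpha, seats, party, n_sim=10_000, rng=random):
            """Monte Carlo estimate of P(party wins >= 1 seat) under a Dirichlet
            posterior on vote shares (alpha plays the role of posterior counts)."""
            hits = 0
            for _ in range(n_sim):
                # Dirichlet draw via gammas (unnormalized; D'Hondt is scale-invariant)
                shares = [rng.gammavariate(a, 1.0) for a in alpha]
                if dhondt(shares, seats)[party] > 0:
                    hits += 1
            return hits / n_sim

        print(prob_representation([520, 260, 130, 90], seats=10, party=3))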

  10. Complex Geometric Models of Diffusion and Relaxation in Healthy and Damaged White Matter

    PubMed Central

    Farrell, Jonathan A.D.; Smith, Seth A.; Reich, Daniel S.; Calabresi, Peter A.; van Zijl, Peter C.M.

    2010-01-01

    Which aspects of tissue microstructure affect diffusion weighted MRI signals? Prior models, many of which use Monte-Carlo simulations, have focused on relatively simple models of the cellular microenvironment and have not considered important anatomic details. With the advent of higher-order analysis models for diffusion imaging, such as high-angular-resolution diffusion imaging (HARDI), more realistic models are necessary. This paper presents and evaluates the reproducibility of simulations of diffusion in complex geometries. Our framework is quantitative, does not require specialized hardware, is easily implemented with little programming experience, and is freely available as open-source software. Models may include compartments with different diffusivities, permeabilities, and T2 time constants using both parametric (e.g., spheres and cylinders) and arbitrary (e.g., mesh-based) geometries. Three-dimensional diffusion displacement-probability functions are mapped with high reproducibility, and thus can be readily used to assess reproducibility of diffusion-derived contrasts. PMID:19739233
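
    The Monte Carlo core of such simulations is a random walk with boundary rules; a hedged, minimal version (a single impermeable sphere and invented parameter values, far simpler than the compartmental, permeable geometries described above) is:

        import math, random

        D, DT, R = 2.0e-3, 1.0e-5, 5.0e-3   # diffusivity (mm^2/s), step (s), radius (mm)
        SIGMA = math.sqrt(2 * D * DT)       # per-axis Brownian step size

        def walk(n_steps, rng=random):
            """One spin's random walk inside an impermeable sphere (rejection at wall)."""
            x = y = z = 0.0
            for _ in range(n_steps):
                nx, ny, nz = (x + rng.gauss(0, SIGMA),
                              y + rng.gauss(0, SIGMA),
                              z + rng.gauss(0, SIGMA))
                if nx*nx + ny*ny + nz*nz <= R*R:   # reject steps crossing the boundary
                    x, y, z = nx, ny, nz
            return math.sqrt(x*x + y*y + z*z)

        displacements = [walk(1000) for _ in range(1000)]
        print(f"mean displacement: {sum(displacements)/len(displacements):.2e} mm")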

  11. Bayesian inference on proportional elections.

    PubMed

    Brunello, Gabriel Hideki Vatanabe; Nakano, Eduardo Yoshio

    2015-01-01

    Polls for majoritarian voting systems usually show estimates of the percentage of votes for each candidate. However, proportional voting systems do not necessarily guarantee that the candidate with the highest percentage of votes will be elected. Thus, traditional methods used in majoritarian elections cannot be applied to proportional elections. In this context, the purpose of this paper was to perform Bayesian inference on proportional elections considering the Brazilian system of seat distribution. More specifically, a methodology was developed to estimate the probability that a given party will have representation in the Chamber of Deputies. Inferences were made in a Bayesian framework using the Monte Carlo simulation technique, and the developed methodology was applied to data from the Brazilian elections for Members of the Legislative Assembly and Federal Chamber of Deputies in 2010. A performance rate was also presented to evaluate the efficiency of the methodology. Calculations and simulations were carried out using the free R statistical software.

  12. High fold computer disk storage DATABASE for fast extended analysis of γ-rays events

    NASA Astrophysics Data System (ADS)

    Stézowski, O.; Finck, Ch.; Prévost, D.

    1999-03-01

    Recently, spectacular technical developments have been achieved to increase the resolving power of large γ-ray spectrometers. With these new eyes, physicists are able to study the intricate nature of atomic nuclei. Concurrently, more and more complex multidimensional analyses are needed to investigate very weak phenomena. In this article, we first present software (DATABASE) that allows high-fold coincidence γ-ray events to be stored on hard disk. Then a non-conventional method of analysis, the anti-gating procedure, is described. Two physical examples are given to explain how it can be used, and Monte Carlo simulations have been performed to test the validity of this method.

  13. Teaching Ionic Solvation Structure with a Monte Carlo Liquid Simulation Program

    ERIC Educational Resources Information Center

    Serrano, Agostinho; Santos, Flavia M. T.; Greca, Ileana M.

    2004-01-01

    The use of molecular dynamics and Monte Carlo methods has provided efficient means to simulate the behavior of molecular liquids and solutions. A Monte Carlo simulation program is used to compute the structure of liquid water and of water as a solvent to Na+, Cl-, and Ar on a personal computer to show that it is easily feasible to…
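
    The engine behind such a program is the Metropolis acceptance step. A hedged, minimal sketch in Python, with a toy one-dimensional double-well energy standing in for a real water/ion potential:

        import math, random

        def energy(x):
            """Toy double-well potential standing in for a molecular energy function."""
            return (x * x - 1.0) ** 2

        def metropolis(x, beta, step=0.5, rng=random):
            """One Metropolis move: propose, accept with min(1, exp(-beta * dE))."""
            x_new = x + rng.uniform(-step, step)
            dE = energy(x_new) - energy(x)
            if dE <= 0 or rng.random() < math.exp(-beta * dE):
                return x_new
            return x

        x, samples = 0.0, []
        for _ in range(50_000):
            x = metropolis(x, beta=3.0)
            samples.append(x)
        print(f"<x^2> = {sum(s * s for s in samples) / len(samples):.3f}")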

  14. T-Opt: A 3D Monte Carlo simulation for light delivery design in photodynamic therapy (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Honda, Norihiro; Hazama, Hisanao; Awazu, Kunio

    2017-02-01

    The interstitial photodynamic therapy (iPDT) with 5-aminolevulinic acid (5-ALA) is a safe and feasible treatment modality for malignant glioblastoma. In order to cover the tumour volume, the exact positions of the light diffusers within the lesion must be determined precisely. The aim of this study is the development of an evaluation method for the treatment volume with 3D Monte Carlo simulation for iPDT using 5-ALA. Monte Carlo simulations of the fluence rate were performed using the optical properties of brain tissue infiltrated by tumor cells and of normal tissue. The 3D Monte Carlo simulation was used to calculate light transport and the positions of the light diffusers within the lesion. The fluence rate near the diffuser was maximal and decreased exponentially with distance. The simulation can calculate the amount of singlet oxygen generated by PDT. In order to increase the accuracy of the simulation results, the simulation parameters include the quantum yield of singlet oxygen generation, the accumulated concentration of photosensitizer within the tissue, the fluence rate, and the molar extinction coefficient at the wavelength of the excitation light. The simulation is useful for evaluating the treatment region of iPDT with 5-ALA.

  15. Monte Carlo Simulation of a Segmented Detector for Low-Energy Electron Antineutrinos

    NASA Astrophysics Data System (ADS)

    Qomi, H. Akhtari; Safari, M. J.; Davani, F. Abbasi

    2017-11-01

    Detection of low-energy electron antineutrinos is important for several purposes, such as ex-vessel reactor monitoring, neutrino oscillation studies, etc. The inverse beta decay (IBD) is the interaction responsible for the detection mechanism in (organic) plastic scintillation detectors. Here, a detailed study is presented of the radiation and optical transport simulation of a typical segmented antineutrino detector with the Monte Carlo method, using the MCNPX and FLUKA codes. This study shows different aspects of the detector, benefiting from the inherent capabilities of the Monte Carlo simulation codes.

  16. A Dasymetric-Based Monte Carlo Simulation Approach to the Probabilistic Analysis of Spatial Variables

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morton, April M; Piburn, Jesse O; McManamay, Ryan A

    2017-01-01

    Monte Carlo simulation is a popular numerical experimentation technique used in a range of scientific fields to obtain the statistics of unknown random output variables. Despite its widespread applicability, it can be difficult to infer required input probability distributions when they are related to population counts unknown at desired spatial resolutions. To overcome this challenge, we propose a framework that uses a dasymetric model to infer the probability distributions needed for a specific class of Monte Carlo simulations which depend on population counts.

  17. Sampling theory and automated simulations for vertical sections, applied to human brain.

    PubMed

    Cruz-Orive, L M; Gelšvartas, J; Roberts, N

    2014-02-01

    In recent years, there have been substantial developments in both magnetic resonance imaging techniques and automatic image analysis software. The purpose of this paper is to develop stereological image sampling theory (i.e. unbiased sampling rules) that can be used by image analysts for estimating geometric quantities such as surface area and volume, and to illustrate its implementation. The methods will ideally be applied automatically on segmented, properly sampled 2D images - although convenient manual application is always an option - and they are of wide applicability in many disciplines. In particular, the vertical sections design to estimate surface area is described in detail and applied to estimate the area of the pial surface and of the boundary between cortex and underlying white matter (i.e. subcortical surface area). For completeness, cortical volume and mean cortical thickness are also estimated. The aforementioned surfaces were triangulated in 3D with the aid of FreeSurfer software, which provided accurate surface area measures that served as gold standards. Furthermore, software was developed to produce digitized trace curves of the triangulated target surfaces automatically from virtual sections. From such traces, a new method (called the 'lambda method') is presented to estimate surface area automatically. In addition, with the new software, intersections could be counted automatically between the relevant surface traces and a cycloid test grid for the classical design. This capability, together with the aforementioned gold standard, enabled us to thoroughly check the performance and the variability of the different estimators by Monte Carlo simulations for studying the human brain. In particular, new methods are offered to split the total error variance into its orientations, sectioning and cycloid components. A prediction for the latter component was hitherto unavailable; one is proposed here and checked by way of simulations on a given set of digitized vertical sections with automatically superimposed cycloid grids of three different sizes. Concrete and detailed recommendations are given to implement the methods. © 2013 The Authors Journal of Microscopy © 2013 Royal Microscopical Society.

  18. Obtaining identical results with double precision global accuracy on different numbers of processors in parallel particle Monte Carlo simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cleveland, Mathew A., E-mail: cleveland7@llnl.gov; Brunner, Thomas A.; Gentile, Nicholas A.

    2013-10-15

    We describe and compare different approaches for achieving numerical reproducibility in photon Monte Carlo simulations. Reproducibility is desirable for code verification, testing, and debugging. Parallelism creates a unique problem for achieving reproducibility in Monte Carlo simulations because it changes the order in which values are summed. This is a numerical problem because double precision arithmetic is not associative. Parallel Monte Carlo simulations, whether domain-replicated or domain-decomposed, will run their particles in a different order during different runs of the same simulation because of the non-reproducibility of communication between processors. In addition, runs of the same simulation using different domain decompositions will also result in particles being simulated in a different order. In [1], a way of eliminating non-associative accumulations using integer tallies was described. This approach successfully achieves reproducibility at the cost of lost accuracy by rounding double precision numbers to fewer significant digits. This integer approach, and other extended and reduced precision reproducibility techniques, are described and compared in this work. Increased precision alone is not enough to ensure reproducibility of photon Monte Carlo simulations. Non-arbitrary precision approaches require a varying degree of rounding to achieve reproducibility. For the problems investigated in this work, double precision global accuracy was achievable by using 100 bits of precision or greater on all unordered sums, which were subsequently rounded to double precision at the end of every time-step.
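
    The integer-tally idea cited from [1] can be sketched in a few lines: convert each contribution to fixed-point before summing, so that the (associative) integer addition gives a bit-identical result in any order, at the price of the rounding performed in the conversion. The scale factor below is an arbitrary choice for illustration.

        import random

        SCALE = 2 ** 30   # fixed-point scale; larger retains more significant digits

        def reproducible_sum(values):
            """Order-independent sum: integer addition is associative, float addition is not."""
            return sum(round(v * SCALE) for v in values) / SCALE

        vals = [random.uniform(-1.0, 1.0) for _ in range(100_000)]
        shuffled = vals[:]
        random.shuffle(shuffled)
        assert reproducible_sum(vals) == reproducible_sum(shuffled)  # bit-identical
        # A plain float sum of `vals` and `shuffled` may differ in the last digits.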

  19. Concepts and Plans towards fast large scale Monte Carlo production for the ATLAS Experiment

    NASA Astrophysics Data System (ADS)

    Ritsch, E.; Atlas Collaboration

    2014-06-01

    The huge success of the physics program of the ATLAS experiment at the Large Hadron Collider (LHC) during Run 1 relies upon a great number of simulated Monte Carlo events. This Monte Carlo production takes the biggest part of the computing resources being in use by ATLAS as of now. In this document we describe the plans to overcome the computing resource limitations for large scale Monte Carlo production in the ATLAS Experiment for Run 2, and beyond. A number of fast detector simulation, digitization and reconstruction techniques are being discussed, based upon a new flexible detector simulation framework. To optimally benefit from these developments, a redesigned ATLAS MC production chain is presented at the end of this document.

  20. Atomistic modelling of scattering data in the Collaborative Computational Project for Small Angle Scattering (CCP-SAS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perkins, Stephen J.; Wright, David W.; Zhang, Hailiang

    2016-10-14

    The capabilities of current computer simulations provide a unique opportunity to model small-angle scattering (SAS) data at the atomistic level, and to include other structural constraints ranging from molecular and atomistic energetics to crystallography, electron microscopy and NMR. This extends the capabilities of solution scattering and provides deeper insights into the physics and chemistry of the systems studied. Realizing this potential, however, requires integrating the experimental data with a new generation of modelling software. To achieve this, the CCP-SAS collaboration (http://www.ccpsas.org/) is developing open-source, high-throughput and user-friendly software for the atomistic and coarse-grained molecular modelling of scattering data. Robust state-of-the-art molecular simulation engines and molecular dynamics and Monte Carlo force fields provide constraints to the solution structure inferred from the small-angle scattering data, which incorporates the known physical chemistry of the system. The implementation of this software suite involves a tiered approach in which GenApp provides the deployment infrastructure for running applications on both standard and high-performance computing hardware, and SASSIE provides a workflow framework into which modules can be plugged to prepare structures, carry out simulations, calculate theoretical scattering data and compare results with experimental data. GenApp produces the accessible web-based front end termed SASSIE-web, and GenApp and SASSIE also make community SAS codes available. Applications are illustrated by case studies: (i) inter-domain flexibility in two- to six-domain proteins as exemplified by HIV-1 Gag, MASP and ubiquitin; (ii) the hinge conformation in human IgG2 and IgA1 antibodies; (iii) the complex formed between a hexameric protein Hfq and mRNA; and (iv) synthetic `bottlebrush' polymers.

  1. EON: software for long time simulations of atomic scale systems

    NASA Astrophysics Data System (ADS)

    Chill, Samuel T.; Welborn, Matthew; Terrell, Rye; Zhang, Liang; Berthet, Jean-Claude; Pedersen, Andreas; Jónsson, Hannes; Henkelman, Graeme

    2014-07-01

    The EON software is designed for simulations of the state-to-state evolution of atomic scale systems over timescales greatly exceeding that of direct classical dynamics. States are defined as collections of atomic configurations from which a minimization of the potential energy gives the same inherent structure. The time evolution is assumed to be governed by rare events, where transitions between states are uncorrelated and infrequent compared with the timescale of atomic vibrations. Several methods for calculating the state-to-state evolution have been implemented in EON, including parallel replica dynamics, hyperdynamics and adaptive kinetic Monte Carlo. Global optimization methods, including simulated annealing, basin hopping and minima hopping are also implemented. The software has a client/server architecture where the computationally intensive evaluations of the interatomic interactions are calculated on the client-side and the state-to-state evolution is managed by the server. The client supports optimization for different computer architectures to maximize computational efficiency. The server is written in Python so that developers have access to the high-level functionality without delving into the computationally intensive components. Communication between the server and clients is abstracted so that calculations can be deployed on a single machine, clusters using a queuing system, large parallel computers using a message passing interface, or within a distributed computing environment. A generic interface to the evaluation of the interatomic interactions is defined so that empirical potentials, such as in LAMMPS, and density functional theory as implemented in VASP and GPAW can be used interchangeably. Examples are given to demonstrate the range of systems that can be modeled, including surface diffusion and island ripening of adsorbed atoms on metal surfaces, molecular diffusion on the surface of ice and global structural optimization of nanoparticles.

  2. Study the effect of reservoir spatial heterogeneity on CO2 sequestration under an uncertainty quantification (UQ) software framework

    NASA Astrophysics Data System (ADS)

    Fang, Y.; Hou, J.; Engel, D.; Lin, G.; Yin, J.; Han, B.; Fang, Z.; Fountoulakis, V.

    2011-12-01

    In this study, we introduce an uncertainty quantification (UQ) software framework for carbon sequestration, with a focus on the effect of spatial heterogeneity of reservoir properties on CO2 migration. We use a sequential Gaussian simulation method (SGSIM) to generate realizations of permeability fields with various spatial statistical attributes. To deal with the computational difficulties, we integrate the following ideas/approaches: (1) we use three different sampling approaches (probabilistic collocation, quasi-Monte Carlo, and adaptive sampling) to reduce the required forward calculations while exploring the parameter space and quantifying the input uncertainty; (2) we use eSTOMP as the forward modeling simulator. eSTOMP is implemented using the Global Arrays toolkit (GA), which is based on one-sided inter-processor communication, supports a shared-memory programming style on distributed-memory platforms, and provides highly scalable performance. It uses a data model to partition most of the large-scale data structures into a relatively small number of distinct classes. The lower-level simulator infrastructure (e.g. meshing support, associated data structures, and data mapping to processors) is separated from the higher-level physics and chemistry algorithmic routines using a grid component interface; and (3) besides a faster model and more efficient algorithms to speed up the forward calculation, we built an adaptive system infrastructure to select the best possible data transfer mechanisms, to optimally allocate system resources to improve performance, and to integrate software packages and data for composing carbon sequestration simulation, computation, analysis, estimation and visualization. We will demonstrate the framework with a given CO2 injection scenario in a heterogeneous sandstone reservoir.

  3. Atomistic modelling of scattering data in the Collaborative Computational Project for Small Angle Scattering (CCP-SAS).

    PubMed

    Perkins, Stephen J; Wright, David W; Zhang, Hailiang; Brookes, Emre H; Chen, Jianhan; Irving, Thomas C; Krueger, Susan; Barlow, David J; Edler, Karen J; Scott, David J; Terrill, Nicholas J; King, Stephen M; Butler, Paul D; Curtis, Joseph E

    2016-12-01

    The capabilities of current computer simulations provide a unique opportunity to model small-angle scattering (SAS) data at the atomistic level, and to include other structural constraints ranging from molecular and atomistic energetics to crystallography, electron microscopy and NMR. This extends the capabilities of solution scattering and provides deeper insights into the physics and chemistry of the systems studied. Realizing this potential, however, requires integrating the experimental data with a new generation of modelling software. To achieve this, the CCP-SAS collaboration (http://www.ccpsas.org/) is developing open-source, high-throughput and user-friendly software for the atomistic and coarse-grained molecular modelling of scattering data. Robust state-of-the-art molecular simulation engines and molecular dynamics and Monte Carlo force fields provide constraints to the solution structure inferred from the small-angle scattering data, which incorporates the known physical chemistry of the system. The implementation of this software suite involves a tiered approach in which GenApp provides the deployment infrastructure for running applications on both standard and high-performance computing hardware, and SASSIE provides a workflow framework into which modules can be plugged to prepare structures, carry out simulations, calculate theoretical scattering data and compare results with experimental data. GenApp produces the accessible web-based front end termed SASSIE-web, and GenApp and SASSIE also make community SAS codes available. Applications are illustrated by case studies: (i) inter-domain flexibility in two- to six-domain proteins as exemplified by HIV-1 Gag, MASP and ubiquitin; (ii) the hinge conformation in human IgG2 and IgA1 antibodies; (iii) the complex formed between a hexameric protein Hfq and mRNA; and (iv) synthetic 'bottlebrush' polymers.

  4. Determining the nuclear data uncertainty on MONK10 and WIMS10 criticality calculations

    NASA Astrophysics Data System (ADS)

    Ware, Tim; Dobson, Geoff; Hanlon, David; Hiles, Richard; Mason, Robert; Perry, Ray

    2017-09-01

    The ANSWERS Software Service is developing a number of techniques to better understand and quantify uncertainty on calculations of the neutron multiplication factor, k-effective, in nuclear fuel and other systems containing fissile material. The uncertainty on the calculated k-effective arises from a number of sources, including nuclear data uncertainties, manufacturing tolerances, modelling approximations and, for Monte Carlo simulation, stochastic uncertainty. For determining the uncertainties due to nuclear data, a set of application libraries have been generated for use with the MONK10 Monte Carlo and the WIMS10 deterministic criticality and reactor physics codes. This paper overviews the generation of these nuclear data libraries by Latin hypercube sampling of JEFF-3.1.2 evaluated data based upon a library of covariance data taken from JEFF, ENDF/B, JENDL and TENDL evaluations. Criticality calculations have been performed with MONK10 and WIMS10 using these sampled libraries for a number of benchmark models of fissile systems. Results are presented which show the uncertainty on k-effective for these systems arising from the uncertainty on the input nuclear data.
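
    A minimal Latin hypercube sampler (a generic sketch, not the ANSWERS implementation): each parameter's range is split into N strata and every stratum is used exactly once, which covers the space far more evenly than N independent random draws.

        import random

        def latin_hypercube(n_samples, n_dims, rng=random):
            """LHS design on [0,1)^d: each dimension stratified into n_samples bins."""
            columns = []
            for _ in range(n_dims):
                strata = [(i + rng.random()) / n_samples for i in range(n_samples)]
                rng.shuffle(strata)
                columns.append(strata)
            return list(zip(*columns))   # n_samples points with n_dims coordinates

        # e.g. map unit samples to +/-10% nuclear-data perturbation factors (illustrative):
        for point in latin_hypercube(5, 3):
            print([round(1.0 + 0.1 * (2 * u - 1), 3) for u in point])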

  5. Event Reconstruction in the PandaRoot framework

    NASA Astrophysics Data System (ADS)

    Spataro, Stefano

    2012-12-01

    The PANDA experiment will study collisions of antiproton beams, with momenta ranging from 2 to 15 GeV/c, with fixed proton and nuclear targets in the charm energy range, and will be built at the FAIR facility. In preparation for the experiment, the PandaRoot software framework is under development for detector simulation, reconstruction and data analysis, running on an Alien2-based grid. The basic features are handled by the FairRoot framework, based on ROOT and Virtual Monte Carlo, while the PANDA detector specifics and reconstruction code are implemented inside PandaRoot. The realization of Technical Design Reports for the tracking detectors has pushed the finalization of the tracking reconstruction code, which is complete for the Target Spectrometer, and of the analysis tools. Particle identification algorithms are currently implemented using a Bayesian approach and compared to multivariate analysis methods. Moreover, the PANDA data acquisition foresees triggerless operation, in which events are not defined by a first-level hardware trigger decision; instead, all signals are stored with time stamps, requiring deconvolution by the software. This has led to a redesign of the software from an event basis to a time-ordered structure. In this contribution, the reconstruction capabilities of the PANDA spectrometer will be reported, focusing on the performance of the tracking system and the results of the analysis of physics benchmark channels, as well as the new (and challenging) concept of time-based simulation and its implementation.

  6. Diffusion Monte Carlo approach versus adiabatic computation for local Hamiltonians

    NASA Astrophysics Data System (ADS)

    Bringewatt, Jacob; Dorland, William; Jordan, Stephen P.; Mink, Alan

    2018-02-01

    Most research regarding quantum adiabatic optimization has focused on stoquastic Hamiltonians, whose ground states can be expressed with only real non-negative amplitudes and for which destructive interference is thus not manifest. This raises the question of whether classical Monte Carlo algorithms can efficiently simulate quantum adiabatic optimization with stoquastic Hamiltonians. Recent results have given counterexamples in which path-integral and diffusion Monte Carlo fail to do so. However, most adiabatic optimization algorithms, such as for solving MAX-k-SAT problems, use k-local Hamiltonians, whereas our previous counterexample for diffusion Monte Carlo involved n-body interactions. Here we present a 6-local counterexample which demonstrates that even for these local Hamiltonians there are cases where diffusion Monte Carlo cannot efficiently simulate quantum adiabatic optimization. Furthermore, we perform empirical testing of diffusion Monte Carlo on a standard well-studied class of permutation-symmetric tunneling problems and similarly find large advantages for quantum optimization over diffusion Monte Carlo.

  7. Tool for Rapid Analysis of Monte Carlo Simulations

    NASA Technical Reports Server (NTRS)

    Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.

    2011-01-01

    Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time-consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The Tool for Rapid Analysis of Monte Carlo simulations (TRAM) has been used in recent design and analysis work for the Orion vehicle, greatly decreasing the time it takes to evaluate performance requirements. A previous version of this tool was developed to automatically identify driving design variables in Monte Carlo data sets. This paper describes a new, parallel version of TRAM implemented on a graphical processing unit, and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.

  8. An Overview of Importance Splitting for Rare Event Simulation

    ERIC Educational Resources Information Center

    Morio, Jerome; Pastel, Rudy; Le Gland, Francois

    2010-01-01

    Monte Carlo simulations are a classical tool to analyse physical systems. When unlikely events are to be simulated, the importance sampling technique is often used instead of Monte Carlo. Importance sampling has some drawbacks when the problem dimensionality is high or when the optimal importance sampling density is complex to obtain. In this…
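
    For a concrete sense of the technique the abstract contrasts with splitting: shift the sampling density toward the rare region and reweight by the likelihood ratio. A minimal sketch for the Gaussian tail probability P(X > 4) (a textbook example, not one from the article):

        import math, random

        def p_tail_is(t, n=100_000, rng=random):
            """Importance sampling for P(X > t), X ~ N(0,1): sample from N(t,1)
            and reweight each hit by the likelihood ratio phi(y)/phi(y - t)."""
            total = 0.0
            for _ in range(n):
                y = rng.gauss(t, 1.0)            # proposal centred on the rare region
                if y > t:
                    total += math.exp(-t * y + t * t / 2.0)
            return total / n

        print(p_tail_is(4.0))   # ~3.2e-5; plain Monte Carlo would need ~10^7 samples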

  9. Multi-model approach to petroleum resource appraisal using analytic methodologies for probabilistic systems

    USGS Publications Warehouse

    Crovelli, R.A.

    1988-01-01

    The geologic appraisal model that is selected for a petroleum resource assessment depends upon purpose of the assessment, basic geologic assumptions of the area, type of available data, time available before deadlines, available human and financial resources, available computer facilities, and, most importantly, the available quantitative methodology with corresponding computer software and any new quantitative methodology that would have to be developed. Therefore, different resource assessment projects usually require different geologic models. Also, more than one geologic model might be needed in a single project for assessing different regions of the study or for cross-checking resource estimates of the area. Some geologic analyses used in the past for petroleum resource appraisal involved play analysis. The corresponding quantitative methodologies of these analyses usually consisted of Monte Carlo simulation techniques. A probabilistic system of petroleum resource appraisal for play analysis has been designed to meet the following requirements: (1) includes a variety of geologic models, (2) uses an analytic methodology instead of Monte Carlo simulation, (3) possesses the capacity to aggregate estimates from many areas that have been assessed by different geologic models, and (4) runs quickly on a microcomputer. Geologic models consist of four basic types: reservoir engineering, volumetric yield, field size, and direct assessment. Several case histories and present studies by the U.S. Geological Survey are discussed. ?? 1988 International Association for Mathematical Geology.

  10. Developing a cosmic ray muon sampling capability for muon tomography and monitoring applications

    NASA Astrophysics Data System (ADS)

    Chatzidakis, S.; Chrysikopoulou, S.; Tsoukalas, L. H.

    2015-12-01

    In this study, a cosmic ray muon sampling capability using a phenomenological model that captures the main characteristics of the experimentally measured spectrum coupled with a set of statistical algorithms is developed. The "muon generator" produces muons with zenith angles in the range 0-90° and energies in the range 1-100 GeV and is suitable for Monte Carlo simulations with emphasis on muon tomographic and monitoring applications. The muon energy distribution is described by the Smith and Duller (1959) [35] phenomenological model. Statistical algorithms are then employed for generating random samples. The inverse transform provides a means to generate samples from the muon angular distribution, whereas the Acceptance-Rejection and Metropolis-Hastings algorithms are employed to provide the energy component. The predictions for muon energies 1-60 GeV and zenith angles 0-90° are validated with a series of actual spectrum measurements and with estimates from the software library CRY. The results confirm the validity of the phenomenological model and the applicability of the statistical algorithms to generate polyenergetic-polydirectional muons. The response of the algorithms and the impact of critical parameters on computation time and computed results were investigated. Final output from the proposed "muon generator" is a look-up table that contains the sampled muon angles and energies and can be easily integrated into Monte Carlo particle simulation codes such as Geant4 and MCNP.
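
    Of the samplers named above, acceptance-rejection is the simplest to sketch: propose from an easy envelope and accept with probability target/envelope. The falling power law below is a toy stand-in for the Smith and Duller spectrum, and the flat envelope is deliberately naive (a tighter envelope would be far more efficient).

        import random

        def target(e):
            """Toy stand-in for a muon energy density on 1-100 GeV (unnormalized)."""
            return e ** -2.7

        F_MAX = target(1.0)   # the density peaks at the lower energy bound

        def sample_energy(rng=random):
            """Acceptance-rejection with a uniform envelope over [1, 100] GeV."""
            while True:
                e = rng.uniform(1.0, 100.0)
                if rng.random() * F_MAX < target(e):
                    return e

        energies = [sample_energy() for _ in range(5_000)]
        print(f"fraction above 10 GeV: {sum(e > 10 for e in energies) / len(energies):.3f}")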

  11. Dynamic 99mTc-MAG3 renography: images for quality control obtained by combining pharmacokinetic modelling, an anthropomorphic computer phantom and Monte Carlo simulated scintillation camera imaging

    NASA Astrophysics Data System (ADS)

    Brolin, Gustav; Sjögreen Gleisner, Katarina; Ljungberg, Michael

    2013-05-01

    In dynamic renal scintigraphy, the main interest is the radiopharmaceutical redistribution as a function of time. Quality control (QC) of renal procedures often relies on phantom experiments to compare image-based results with the measurement setup. A phantom with a realistic anatomy and time-varying activity distribution is therefore desirable. This work describes a pharmacokinetic (PK) compartment model for 99mTc-MAG3, used for defining a dynamic whole-body activity distribution within a digital phantom (XCAT) for accurate Monte Carlo (MC)-based images for QC. Each phantom structure is assigned a time-activity curve provided by the PK model, employing parameter values consistent with MAG3 pharmacokinetics. This approach ensures that the total amount of tracer in the phantom is preserved between time points, and it allows for modifications of the pharmacokinetics in a controlled fashion. By adjusting parameter values in the PK model, different clinically realistic scenarios can be mimicked, regarding, e.g., the relative renal uptake and renal transit time. Using the MC code SIMIND, a complete set of renography images including effects of photon attenuation, scattering, limited spatial resolution and noise, are simulated. The obtained image data can be used to evaluate quantitative techniques and computer software in clinical renography.

  12. Autocalibration of a one-dimensional hydrodynamic-ecological model (DYRESM 4.0-CAEDYM 3.1) using a Monte Carlo approach: simulations of hypoxic events in a polymictic lake

    NASA Astrophysics Data System (ADS)

    Luo, Liancong; Hamilton, David; Lan, Jia; McBride, Chris; Trolle, Dennis

    2018-03-01

    Automated calibration of complex deterministic water quality models with a large number of biogeochemical parameters can reduce time-consuming iterative simulations involving empirical judgements of model fit. We undertook autocalibration of the one-dimensional hydrodynamic-ecological lake model DYRESM-CAEDYM, using a Monte Carlo sampling (MCS) method, in order to test the applicability of this procedure for shallow, polymictic Lake Rotorua (New Zealand). The calibration procedure involved independently minimizing the root-mean-square error (RMSE), maximizing the Pearson correlation coefficient (r) and Nash-Sutcliffe efficient coefficient (Nr) for comparisons of model state variables against measured data. An assigned number of parameter permutations was used for 10 000 simulation iterations. The "optimal" temperature calibration produced a RMSE of 0.54 °C, Nr value of 0.99, and r value of 0.98 through the whole water column based on comparisons with 540 observed water temperatures collected between 13 July 2007 and 13 January 2009. The modeled bottom dissolved oxygen concentration (20.5 m below surface) was compared with 467 available observations. The calculated RMSE of the simulations compared with the measurements was 1.78 mg L-1, the Nr value was 0.75, and the r value was 0.87. The autocalibrated model was further tested for an independent data set by simulating bottom-water hypoxia events from 15 January 2009 to 8 June 2011 (875 days). This verification produced an accurate simulation of five hypoxic events corresponding to DO < 2 mg L-1 during summer of 2009-2011. The RMSE was 2.07 mg L-1, Nr value 0.62, and r value of 0.81, based on the available data set of 738 days. The autocalibration software of DYRESM-CAEDYM developed here is substantially less time-consuming and more efficient in parameter optimization than traditional manual calibration which has been the standard tool practiced for similar complex water quality models.
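
    The three goodness-of-fit criteria used above are standard and easy to state exactly; a hedged sketch with generic formulas (not the authors' code):

        import math

        def fit_metrics(obs, sim):
            """Return RMSE, Pearson r, and Nash-Sutcliffe efficiency (Nr)."""
            n = len(obs)
            mo, ms = sum(obs) / n, sum(sim) / n
            sq_err = sum((s - o) ** 2 for o, s in zip(obs, sim))
            rmse = math.sqrt(sq_err / n)
            cov = sum((o - mo) * (s - ms) for o, s in zip(obs, sim))
            var_o = sum((o - mo) ** 2 for o in obs)
            var_s = sum((s - ms) ** 2 for s in sim)
            r = cov / math.sqrt(var_o * var_s)
            nse = 1.0 - sq_err / var_o
            return rmse, r, nse

        print(fit_metrics([14.2, 15.1, 16.3, 17.0], [14.0, 15.4, 16.1, 17.3]))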

  13. Auto-calibration of a one-dimensional hydrodynamic-ecological model using a Monte Carlo approach: simulation of hypoxic events in a polymictic lake

    NASA Astrophysics Data System (ADS)

    Luo, L.

    2011-12-01

    Automated calibration of complex deterministic water quality models with a large number of biogeochemical parameters can reduce time-consuming iterative simulations involving empirical judgements of model fit. We undertook auto-calibration of the one-dimensional hydrodynamic-ecological lake model DYRESM-CAEDYM, using a Monte Carlo sampling (MCS) method, in order to test the applicability of this procedure for shallow, polymictic Lake Rotorua (New Zealand). The calibration procedure involved independently minimising the root-mean-square-error (RMSE), maximizing the Pearson correlation coefficient (r) and Nash-Sutcliffe efficient coefficient (Nr) for comparisons of model state variables against measured data. An assigned number of parameter permutations was used for 10,000 simulation iterations. The 'optimal' temperature calibration produced a RMSE of 0.54 °C, Nr-value of 0.99 and r-value of 0.98 through the whole water column based on comparisons with 540 observed water temperatures collected between 13 July 2007 - 13 January 2009. The modeled bottom dissolved oxygen concentration (20.5 m below surface) was compared with 467 available observations. The calculated RMSE of the simulations compared with the measurements was 1.78 mg L-1, the Nr-value was 0.75 and the r-value was 0.87. The autocalibrated model was further tested for an independent data set by simulating bottom-water hypoxia events for the period 15 January 2009 to 8 June 2011 (875 days). This verification produced an accurate simulation of five hypoxic events corresponding to DO < 2 mg L-1 during summer of 2009-2011. The RMSE was 2.07 mg L-1, Nr-value 0.62 and r-value of 0.81, based on the available data set of 738 days. The auto-calibration software of DYRESM-CAEDYM developed here is substantially less time-consuming and more efficient in parameter optimisation than traditional manual calibration which has been the standard tool practiced for similar complex water quality models.

  14. Latin hypercube sampling and geostatistical modeling of spatial uncertainty in a spatially explicit forest landscape model simulation

    Treesearch

    Chonggang Xu; Hong S. He; Yuanman Hu; Yu Chang; Xiuzhen Li; Rencang Bu

    2005-01-01

    Geostatistical stochastic simulation is typically combined with the Monte Carlo method to quantify the uncertainty in spatial model simulations. However, due to the relatively long running time of spatially explicit forest models, a consequence of their complexity, it is typically infeasible to generate hundreds or thousands of Monte Carlo simulations. Thus, it is of great...

  15. Bayesian inference of metal oxide ultrathin film structure based on crystal truncation rod measurements

    PubMed Central

    Anada, Masato; Nakanishi-Ohno, Yoshinori; Okada, Masato; Kimura, Tsuyoshi; Wakabayashi, Yusuke

    2017-01-01

    Monte Carlo (MC)-based refinement software to analyze the atomic arrangements of perovskite oxide ultrathin films from the crystal truncation rod intensity is developed on the basis of Bayesian inference. The advantages of the MC approach are (i) it is applicable to multi-domain structures, (ii) it provides the posterior probability of structures through Bayes’ theorem, which allows one to evaluate the uncertainty of estimated structural parameters, and (iii) one can incorporate any information provided by other experiments and theories. The simulated annealing procedure efficiently searches for the optimum model owing to its stochastic updates, regardless of the initial values, without being trapped by local optima. The performance of the software is examined with a five-unit-cell-thick LaAlO3 film fabricated on top of SrTiO3. The software successfully found the global optima from an initial model prepared by a small grid search calculation. The standard deviations of the atomic positions derived from a dataset taken at a second-generation synchrotron are ±0.02 Å for metal sites and ±0.03 Å for oxygen sites. PMID:29217989
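
    The simulated-annealing refinement loop reduces to Metropolis updates under a slowly decreasing temperature. A hedged skeleton follows; the quadratic misfit below is a placeholder for the actual crystal-truncation-rod likelihood, and all parameter values are illustrative.

        import math, random

        def chi2(params):
            """Placeholder misfit standing in for the CTR likelihood."""
            return sum((p - 0.3) ** 2 for p in params)

        def anneal(params, t0=1.0, cooling=0.999, n_steps=20_000, rng=random):
            """Simulated annealing: stochastic updates with a slowly cooled temperature."""
            cur, cost, t = list(params), chi2(params), t0
            best, best_cost = list(cur), cost
            for _ in range(n_steps):
                trial = [p + rng.gauss(0.0, 0.05) for p in cur]
                d = chi2(trial) - cost
                if d <= 0 or rng.random() < math.exp(-d / t):
                    cur, cost = trial, cost + d
                    if cost < best_cost:
                        best, best_cost = list(cur), cost
                t *= cooling
            return best, best_cost

        print(anneal([0.0, 1.0, -0.5]))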

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sudhyadhom, A; McGuinness, C; Descovich, M

    Purpose: To develop a methodology for validation of a Monte-Carlo dose calculation model for robotic small-field SRS/SBRT deliveries. Methods: In a robotic treatment planning system, a Monte-Carlo model was iteratively optimized to match beam data. A two-part analysis was developed to verify this model. (1) The Monte-Carlo model was validated in a simulated water phantom against a Ray-Tracing calculation in a single-beam, collimator-by-collimator calculation. (2) The Monte-Carlo model was validated to be accurate in the most challenging situation, lung, by acquiring in-phantom measurements. A plan was created and delivered in a CIRS lung phantom with a film insert. Separately, plans were delivered in an in-house created lung phantom with a PinPoint chamber insert within a lung-simulating material. For medium to large collimator sizes, a single beam was delivered to the phantom. For small collimators (10, 12.5, and 15 mm), a robotically delivered plan was created to generate a uniform dose field of irradiation over a 2 × 2 cm² area. Results: Dose differences in simulated water between Ray-Tracing and Monte-Carlo were all within 1% at dmax and deeper. Maximum dose differences occurred prior to dmax but were all within 3%. Film measurements in a lung phantom show high correspondence of over 95% gamma at the 2%/2 mm level for Monte-Carlo. Ion chamber measurements for collimator sizes of 12.5 mm and above were within 3% of Monte-Carlo calculated values. Uniform irradiation involving the 10 mm collimator resulted in a dose difference of ∼8% for both Monte-Carlo and Ray-Tracing, indicating that there may be limitations with the dose calculation. Conclusion: We have developed a methodology to validate a Monte-Carlo model by verifying that it matches in water and, separately, that it corresponds well in lung-simulating materials. The Monte-Carlo model and algorithm tested may have more limited accuracy for 10 mm fields and smaller.

  17. Monte Carlo simulation of the neutron monitor yield function

    NASA Astrophysics Data System (ADS)

    Mangeard, P.-S.; Ruffolo, D.; Sáiz, A.; Madlee, S.; Nutaro, T.

    2016-08-01

    Neutron monitors (NMs) are ground-based detectors that measure variations of the Galactic cosmic ray flux at rigidities in the GV range. Differences in configuration, electronics, surroundings, and location induce systematic effects on the calculation of the yield functions of NMs worldwide. Different estimates of NM yield functions can differ by a factor of 2 or more. In this work, we present new Monte Carlo simulations to calculate NM yield functions and perform an absolute (not relative) comparison with the count rate of the Princess Sirindhorn Neutron Monitor (PSNM) at Doi Inthanon, Thailand, both for the entire monitor and for individual counter tubes. We model the atmosphere using profiles from the Global Data Assimilation System database and the Naval Research Laboratory Mass Spectrometer, Incoherent Scatter Radar Extended model. Using FLUKA software and the detailed geometry of PSNM, we calculated the PSNM yield functions for protons and alpha particles. Agreement better than 9% was achieved between the PSNM observations and the simulated count rate during the solar minimum of December 2009. The systematic effect of the electronic dead time was studied as a function of primary cosmic ray rigidity at the top of the atmosphere, up to 1 TV. We show that the effect is not negligible and can reach 35% at high rigidity for a dead time >1 ms. We analyzed the response function of each counter tube at PSNM using its actual dead time, and we provide normalization coefficients between count rates for various tube configurations in the standard NM64 design that are valid to within ~1% for such stations worldwide.
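
    The dead-time effect quoted above is consistent in magnitude with the standard non-paralyzable correction, n = m / (1 - m·τ), where m is the measured rate and τ the dead time. A sketch with illustrative rates (not PSNM data):

        def true_rate(measured, dead_time):
            """Non-paralyzable dead-time model: n = m / (1 - m * tau)."""
            return measured / (1.0 - measured * dead_time)

        for m in (50.0, 200.0, 350.0):                 # counts/s, illustrative
            print(m, round(true_rate(m, 1.0e-3), 1))   # tau = 1 ms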

  18. Exact and Monte Carlo resampling procedures for the Wilcoxon-Mann-Whitney and Kruskal-Wallis tests.

    PubMed

    Berry, K J; Mielke, P W

    2000-12-01

    Exact and Monte Carlo resampling FORTRAN programs are described for the Wilcoxon-Mann-Whitney rank sum test and the Kruskal-Wallis one-way analysis of variance for ranks test. The program algorithms compensate for tied values and do not depend on asymptotic approximations for probability values, unlike most algorithms contained in PC-based statistical software packages.
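
    The Monte Carlo resampling idea translates directly from FORTRAN to a few lines of Python (a generic sketch, not the programs described): compute midranks to compensate for ties, then compare the observed rank sum with its distribution over random relabelings.

        import random

        def midranks(values):
            """Ranks with tied values replaced by their group average (midranks)."""
            order = sorted(range(len(values)), key=lambda i: values[i])
            r, i = [0.0] * len(values), 0
            while i < len(order):
                j = i
                while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
                    j += 1
                for k in range(i, j + 1):
                    r[order[k]] = (i + j) / 2.0 + 1.0
                i = j + 1
            return r

        def rank_sum_pvalue(x, y, n_resamples=50_000, rng=random):
            """Two-sided Monte Carlo p-value for the Wilcoxon-Mann-Whitney statistic."""
            r = midranks(x + y)
            observed = sum(r[:len(x)])
            mean = len(x) * (len(r) + 1) / 2.0          # E[rank sum] under H0
            idx, hits = list(range(len(r))), 0
            for _ in range(n_resamples):
                rng.shuffle(idx)                        # random relabeling of groups
                stat = sum(r[i] for i in idx[:len(x)])
                if abs(stat - mean) >= abs(observed - mean):
                    hits += 1
            return hits / n_resamples

        print(rank_sum_pvalue([1.1, 2.0, 2.0, 3.5], [2.8, 3.9, 4.1, 5.0]))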

  19. Computing at H1 - Experience and Future

    NASA Astrophysics Data System (ADS)

    Eckerlin, G.; Gerhards, R.; Kleinwort, C.; Krüner-Marquis, U.; Egli, S.; Niebergall, F.

    The H1 experiment has now been successfully operating at the electron-proton collider HERA at DESY for three years. During this time the computing environment has gradually shifted from a mainframe-oriented environment to the distributed server/client Unix world. This transition is now almost complete. Computing needs are largely determined by the present amount of 1.5 TB of reconstructed data per year (1994), corresponding to 1.2 × 10^7 accepted events. All data are centrally available at DESY. In addition to data analysis, which is done in all collaborating institutes, most of the centrally organized Monte Carlo production is performed outside of DESY. New software tools to cope with offline computing needs include CENTIPEDE, a tool for the use of distributed batch and interactive resources for Monte Carlo production, and H1 UNIX, a software package for automatic updates of H1 software on all UNIX platforms.

  20. Monte Carlo simulation of biomolecular systems with BIOMCSIM

    NASA Astrophysics Data System (ADS)

    Kamberaj, H.; Helms, V.

    2001-12-01

    A new Monte Carlo simulation program, BIOMCSIM, is presented that has been developed in particular to simulate the behaviour of biomolecular systems, leading to insights into and understanding of their functions. The computational complexity in Monte Carlo simulations of high-density systems, with large molecules like proteins immersed in a solvent medium, or when simulating the dynamics of water molecules in a protein cavity, is enormous. The program presented in this paper seeks to meet these demands, putting special emphasis on simulations in grand canonical ensembles. It uses different biasing techniques to increase the convergence of simulations, and periodic load balancing in its parallel version, to maximally utilize the available computer power. In periodic systems, the long-ranged electrostatic interactions can be treated by Ewald summation. The program is modularly organized, and implemented in an ANSI C dialect, so as to enhance its modifiability. Its performance is demonstrated in benchmark applications for the proteins BPTI and Cytochrome c Oxidase.

  1. Self-Learning Monte Carlo Method

    NASA Astrophysics Data System (ADS)

    Liu, Junwei; Qi, Yang; Meng, Zi Yang; Fu, Liang

    Monte Carlo simulation is an unbiased numerical tool for studying classical and quantum many-body systems. One of its bottlenecks is the lack of a general and efficient update algorithm for large systems close to a phase transition or with strong frustration, for which local updates perform badly. In this work, we propose a new general-purpose Monte Carlo method, dubbed self-learning Monte Carlo (SLMC), in which an efficient update algorithm is first learned from training data generated in trial simulations and then used to speed up the actual simulation. We demonstrate the efficiency of SLMC in a spin model at the phase transition point, achieving a 10-20 times speedup. This work is supported by the DOE Office of Basic Energy Sciences, Division of Materials Sciences and Engineering under Award DE-SC0010526.

  2. [Benchmark experiment to verify radiation transport calculations for dosimetry in radiation therapy].

    PubMed

    Renner, Franziska

    2016-09-01

    Monte Carlo simulations are regarded as the most accurate method of solving complex problems in the field of dosimetry and radiation transport. In (external) radiation therapy they are increasingly used for the calculation of dose distributions during treatment planning. In comparison to other algorithms for the calculation of dose distributions, Monte Carlo methods have the capability of improving the accuracy of dose calculations - especially under complex circumstances (e.g. consideration of inhomogeneities). However, there is a lack of knowledge of how accurate the results of Monte Carlo calculations are on an absolute basis. A practical verification of the calculations can be performed by direct comparison with the results of a benchmark experiment. This work presents such a benchmark experiment and compares its results (with detailed consideration of measurement uncertainty) with the results of Monte Carlo calculations using the well-established Monte Carlo code EGSnrc. The experiment was designed to have parallels to external beam radiation therapy with respect to the type and energy of the radiation, the materials used and the kind of dose measurement. Because the properties of the beam have to be well known in order to compare the results of the experiment and the simulation on an absolute basis, the benchmark experiment was performed using the research electron accelerator of the Physikalisch-Technische Bundesanstalt (PTB), whose beam was accurately characterized in advance. The benchmark experiment and the corresponding Monte Carlo simulations were carried out for two different types of ionization chambers and the results were compared. Considering the uncertainty, which is about 0.7 % for the experimental values and about 1.0 % for the Monte Carlo simulation, the results of the simulation and the experiment coincide. Copyright © 2015. Published by Elsevier GmbH.

  3. VirtualDose: a software for reporting organ doses from CT for adult and pediatric patients

    NASA Astrophysics Data System (ADS)

    Ding, Aiping; Gao, Yiming; Liu, Haikuan; Caracappa, Peter F.; Long, Daniel J.; Bolch, Wesley E.; Liu, Bob; Xu, X. George

    2015-07-01

    This paper describes the development and testing of VirtualDose, a software package for reporting organ doses for adult and pediatric patients who undergo x-ray computed tomography (CT) examinations. The software is based on a comprehensive database of organ doses derived from Monte Carlo (MC) simulations involving a library of 25 anatomically realistic phantoms that represent patients of different ages, body sizes, body masses, and pregnant stages. Models of GE Lightspeed Pro 16 and Siemens SOMATOM Sensation 16 scanners were carefully validated for use in MC dose calculations. The software framework is designed with the 'software as a service (SaaS)' delivery concept, under which multiple clients can access the web-based interface simultaneously from any computer without having to install software locally. The RESTful web service API also allows a third-party picture archiving and communication system software package to seamlessly integrate with VirtualDose's functions. Software testing showed that VirtualDose was compatible with numerous operating systems including Windows, Linux, Apple OS X, and mobile and portable devices. The organ doses from VirtualDose were compared against those reported by CT-Expo and ImPACT, two dosimetry tools based on stylized pediatric and adult patient models that are known to be anatomically simple. The organ doses reported by VirtualDose differed from those reported by CT-Expo and ImPACT by as much as 300% in some of the patient models. These results confirm the conclusion from past studies that differences in anatomical realism offered by stylized and voxel phantoms have caused significant discrepancies in CT dose estimations.

  4. Monte Carlo Simulation Using HyperCard and Lotus 1-2-3.

    ERIC Educational Resources Information Center

    Oulman, Charles S.; Lee, Motoko Y.

    Monte Carlo simulation is a computer modeling procedure for mimicking observations on a random variable. A random number generator is used in generating the outcome for the events that are being modeled. The simulation can be used to obtain results that otherwise require extensive testing or complicated computations. This paper describes how Monte…

  5. Monte Carlo Particle Lists: MCPL

    NASA Astrophysics Data System (ADS)

    Kittelmann, T.; Klinkby, E.; Knudsen, E. B.; Willendrup, P.; Cai, X. X.; Kanaki, K.

    2017-09-01

    A binary format with lists of particle state information, for interchanging particles between various Monte Carlo simulation applications, is presented. Portable C code for file manipulation is made available to the scientific community, along with converters and plugins for several popular simulation packages.

  6. Baseball Monte Carlo Style.

    ERIC Educational Resources Information Center

    Houser, Larry L.

    1981-01-01

    Monte Carlo methods are used to simulate activities in baseball such as a team's "hot streak" and a hitter's "batting slump." Student participation in such simulations is viewed as a useful method of giving pupils a better understanding of the probability concepts involved. (MP)
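
    In the spirit of that activity, a few lines of Python show how apparent "batting slumps" arise by chance for a hitter of constant ability (the at-bat count and batting average below are illustrative, not from the article):

        import random

        def longest_slump(at_bats=550, avg=0.300, rng=random):
            """Longest run of consecutive hitless at-bats in one simulated season."""
            longest = current = 0
            for _ in range(at_bats):
                if rng.random() < avg:
                    current = 0
                else:
                    current += 1
                    longest = max(longest, current)
            return longest

        seasons = [longest_slump() for _ in range(10_000)]
        print(f"mean longest slump: {sum(seasons) / len(seasons):.1f} at-bats")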

  7. Path Toward a Unified Geometry for Radiation Transport

    NASA Technical Reports Server (NTRS)

    Lee, Kerry; Barzilla, Janet; Davis, Andrew; Zachmann

    2014-01-01

    The Direct Accelerated Geometry for Radiation Analysis and Design (DAGRAD) element of the RadWorks Project under Advanced Exploration Systems (AES) within the Space Technology Mission Directorate (STMD) of NASA will enable new designs and concepts of operation for radiation risk assessment, mitigation and protection. This element is designed to produce a solution that will allow NASA to calculate the transport of space radiation through complex computer-aided design (CAD) models using state-of-the-art analytic and Monte Carlo radiation transport codes. Due to the inherent hazard of astronaut and spacecraft exposure to ionizing radiation in low-Earth orbit (LEO) or in deep space, risk analyses must be performed for all crew vehicles and habitats. Incorporating these analyses into the design process can minimize the mass needed solely for radiation protection. Transport of the radiation fields as they pass through shielding and body materials can be simulated using Monte Carlo techniques or described by the Boltzmann equation, which is obtained by balancing changes in particle fluxes as they traverse a small volume of material with the gains and losses caused by atomic and nuclear collisions. Deterministic codes that solve the Boltzmann transport equation, such as HZETRN [the high charge and energy transport code developed by NASA Langley Research Center (LaRC)], are generally computationally faster than Monte Carlo codes such as FLUKA, GEANT4, MCNP(X) or PHITS; however, they are currently limited to transport in one dimension, which poorly represents the secondary light ion and neutron radiation fields. NASA currently uses the HZETRN space radiation transport software, both because it is computationally efficient and because proven methods have been developed for using this software to analyze complex geometries. Although Monte Carlo codes describe the relevant physics in a fully three-dimensional manner, their computational costs have thus far prevented their widespread use for analysis of complex CAD models, leading to the creation and maintenance of toolkit-specific simplistic geometry models. The work presented here builds on the Direct Accelerated Geometry Monte Carlo (DAGMC) toolkit developed for use with the Monte Carlo N-Particle (MCNP) transport code. The workflow for achieving radiation transport on CAD models using MCNP and FLUKA has been demonstrated, and the results of analyses on realistic spacecraft/habitats will be presented. Future work is planned that will further automate this process and enable the use of multiple radiation transport codes on identical geometry models imported from CAD. This effort will enhance the modeling tools used by NASA to accurately evaluate astronaut space radiation risk and to determine the protection provided by as-designed exploration mission vehicles and habitats.

  8. Trick Simulation Environment 07

    NASA Technical Reports Server (NTRS)

    Lin, Alexander S.; Penn, John M.

    2012-01-01

    The Trick Simulation Environment is a generic simulation toolkit used for constructing and running simulations. This release includes a Monte Carlo analysis simulation framework and a data analysis package. It produces all auto documentation in XML. Also, the software is capable of inserting a malfunction at any point during the simulation. Trick 07 adds variable server output options and error messaging and is capable of using and manipulating wide characters for international support. Wide character strings are available as a fundamental type for variables processed by Trick. A Trick Monte Carlo simulation uses a statistically generated, or predetermined, set of inputs to iteratively drive the simulation. Also, there is a framework in place for optimization and solution finding where developers may iteratively modify the inputs per run based on some analysis of the outputs. The data analysis package is capable of reading data from external simulation packages such as MATLAB and Octave, as well as the common comma-separated values (CSV) format used by Excel, without the use of external converters. The file formats for MATLAB and Octave were obtained from their documentation sets, and Trick maintains generic file readers for each format. XML tags store the fields in the Trick header comments. For header files, XML tags for structures and enumerations, and the members within, are stored in the auto documentation. For source code files, XML tags for each function and the calling arguments are stored in the auto documentation. When a simulation is built, a top-level XML file, which includes all of the header and source code XML auto documentation files, is created in the simulation directory. Trick 07 provides an XML to TeX converter. The converter reads in header and source code XML documentation files and converts the data to TeX labels and tables suitable for inclusion in TeX documents. A malfunction insertion capability allows users to override the value of any simulation variable, or call a malfunction job, at any time during the simulation. Users may specify conditions, use the return value of a malfunction trigger job, or manually activate a malfunction. The malfunction action may consist of executing a block of input file statements in an action block, setting simulation variable values, calling a malfunction job, or turning simulation jobs on or off.

  9. The Analysis of the Patterns of Radiation-Induced DNA Damage Foci by a Stochastic Monte Carlo Model of DNA Double Strand Breaks Induction by Heavy Ions and Image Segmentation Software

    NASA Technical Reports Server (NTRS)

    Ponomarev, Artem; Cucinotta, F.

    2011-01-01

    Purpose: To create a generalized mechanistic model of DNA damage in human cells that will generate analytical and image data corresponding to experimentally observed DNA damage foci and will help to improve the experimental foci yields by simulating spatial foci patterns and resolving problems with quantitative image analysis. Material and Methods: The analysis of patterns of RIFs (radiation-induced foci) produced by low- and high-LET (linear energy transfer) radiation was conducted by using a Monte Carlo model that combines the heavy ion track structure with characteristics of the human genome on the level of chromosomes. The foci patterns were also simulated in the maximum projection plane for flat nuclei. Some data analysis was done with the help of image segmentation software that identifies individual classes of RIFs and colocalized RIFs, which is of importance to some experimental assays that assign DNA damage a dual phosphorescent signal. Results: The model predicts the spatial and genomic distributions of DNA DSBs (double strand breaks) and associated RIFs in a human cell nucleus for a particular dose of either low- or high-LET radiation. We used the model to do analyses for different irradiation scenarios. In the beam-parallel-to-the-disk-of-a-flattened-nucleus scenario we found that the foci appeared to be merged due to their high density, while, in the perpendicular-beam scenario, the foci appeared as one bright spot per hit. The statistics and spatial distribution of regions of densely arranged foci, termed DNA foci chains, were predicted numerically using this model. Another analysis was done to evaluate the number of ion hits per nucleus, which were visible from streaks of closely located foci. In another analysis, our image segmentation software determined foci yields directly from images with single-class or colocalized foci. Conclusions: We showed that DSB clustering needs to be taken into account to determine the true DNA damage foci yield, which helps to determine the DSB yield. Using the model analysis, a researcher can refine the DSB yield per nucleus per particle. We showed that purely geometric artifacts, present in the experimental images, can be analytically resolved with the model, and that the quantification of track hits and DSB yields can be provided to experimentalists who use enumeration of radiation-induced foci in immunofluorescence experiments using proteins that detect DNA damage. Automated image segmentation software can prove useful for faster and more precise object counting in colocalized foci images.

  10. C++QEDv2: The multi-array concept and compile-time algorithms in the definition of composite quantum systems

    NASA Astrophysics Data System (ADS)

    Vukics, András

    2012-06-01

    C++QED is a versatile framework for simulating open quantum dynamics. It allows one to build arbitrarily complex quantum systems from elementary free subsystems and interactions, and to simulate their time evolution with the available time-evolution drivers. Through this framework, we introduce a design which should be generic for high-level representations of composite quantum systems. It relies heavily on the object-oriented and generic programming paradigms on one hand, and on the other hand on compile-time algorithms, in particular C++ template-metaprogramming techniques. The core of the design is the data structure which represents the state vectors of composite quantum systems. This data structure models the multi-array concept. The use of template metaprogramming is not only crucial to the design, but with its use all computations pertaining to the layout of the simulated system can be shifted to compile time, hence reducing runtime.
    Program summary. Program title: C++QED. Catalogue identifier: AELU_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AELU_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: http://cpc.cs.qub.ac.uk/licence/aelu_v1_0.html. The C++QED package contains other software packages, Blitz, Boost and FLENS, all of which may be distributed freely but have individual license requirements; please see individual packages for license conditions. No. of lines in distributed program, including test data, etc.: 597 974. No. of bytes in distributed program, including test data, etc.: 4 874 839. Distribution format: tar.gz. Programming language: C++. Computer: i386-i686, x86_64. Operating system: in principle cross-platform, as yet tested only on UNIX-like systems (including Mac OS X). RAM: the framework itself takes about 60 MB, which is fully shared; the additional memory taken by the program which defines the actual physical system (script) is typically less than 1 MB. The memory storing the actual data scales with the system dimension for state-vector manipulations, and with the square of the dimension for density-operator manipulations. This might easily be GBs, and often the memory of the machine limits the size of the simulated system. Classification: 4.3, 4.13, 6.2, 20. External routines: Boost C++ libraries (http://www.boost.org/), GNU Scientific Library (http://www.gnu.org/software/gsl/), Blitz++ (http://www.oonumerics.org/blitz/), Linear Algebra Package - Flexible Library for Efficient Numerical Solutions (http://flens.sourceforge.net/). Nature of problem: definition of (open) composite quantum systems out of elementary building blocks [1]; manipulation of such systems, with emphasis on dynamical simulations such as Master-equation evolution [2] and Monte Carlo wave-function simulation [3]. Solution method: Master equation, Monte Carlo wave-function method. Restrictions: total dimensionality of the system; Master equation - a few thousand; Monte Carlo wave-function trajectory - several millions. Unusual features: because of the heavy use of compile-time algorithms, compilation of programs written in the framework may take a long time and much memory (up to several GBs). Additional comments: the framework is not a program, but provides and implements an application-programming interface for developing simulations in the indicated problem domain. Supplementary information: http://cppqed.sourceforge.net/. Running time: depending on the magnitude of the problem, can vary from a few seconds to weeks.

  11. Instrumental resolution of the chopper spectrometer 4SEASONS evaluated by Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Kajimoto, Ryoichi; Sato, Kentaro; Inamura, Yasuhiro; Fujita, Masaki

    2018-05-01

    We performed simulations of the resolution function of the 4SEASONS spectrometer at J-PARC by using the Monte Carlo simulation package McStas. The simulations showed reasonably good agreement with analytical calculations of energy and momentum resolutions by using a simplified description. We implemented new functionalities in Utsusemi, the standard data analysis tool used in 4SEASONS, to enable visualization of the simulated resolution function and predict its shape for specific experimental configurations.

  12. OBJECT KINETIC MONTE CARLO SIMULATIONS OF MICROSTRUCTURE EVOLUTION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nandipati, Giridhar; Setyawan, Wahyu; Heinisch, Howard L.

    2013-09-30

    The objective is to report the development of the flexible object kinetic Monte Carlo (OKMC) simulation code KSOME (kinetic simulation of microstructure evolution), which can be used to simulate microstructure evolution of complex systems under irradiation. In this report we briefly describe the capabilities of KSOME and present preliminary results for short-term annealing of single cascades in tungsten at various primary-knock-on atom (PKA) energies and temperatures.
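
    The report gives no algorithmic detail, but OKMC codes of this kind are typically built around a residence-time (BKL/Gillespie) step: pick one event with probability proportional to its rate, then advance the clock by an exponentially distributed waiting time. A generic sketch, not KSOME's actual implementation:

      import math, random

      def kmc_step(rates, rng=random):
          # Select event i with probability rates[i]/total, then draw the
          # exponential waiting time dt = -ln(u)/total for this step.
          total = sum(rates)
          r = rng.random() * total
          acc, chosen = 0.0, len(rates) - 1
          for i, rate in enumerate(rates):
              acc += rate
              if r < acc:
                  chosen = i
                  break
          dt = -math.log(rng.random()) / total
          return chosen, dt

      # e.g. three defect-jump events with Arrhenius-like rates (s^-1)
      print(kmc_step([1e6, 5e5, 1e3]))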

  13. High-resolution Monte Carlo simulation of flow and conservative transport in heterogeneous porous media: 2. Transport results

    USGS Publications Warehouse

    Naff, R.L.; Haley, D.F.; Sudicky, E.A.

    1998-01-01

    In this, the second of two papers concerned with the use of numerical simulation to examine flow and transport parameters in heterogeneous porous media via Monte Carlo methods, results from the transport aspect of these simulations are reported. Transport simulations contained herein assume a finite pulse input of conservative tracer, and the numerical technique endeavors to realistically simulate tracer spreading as the cloud moves through a heterogeneous medium. Medium heterogeneity is limited to the hydraulic conductivity field, and generation of this field assumes that the hydraulic-conductivity process is second-order stationary. Methods of estimating cloud moments, and the interpretation of these moments, are discussed. Techniques for estimation of large-time macrodispersivities from cloud second-moment data, and for the approximation of the standard errors associated with these macrodispersivities, are also presented. These moment and macrodispersivity estimation techniques were applied to tracer clouds resulting from transport scenarios generated by specific Monte Carlo simulations. Where feasible, moments and macrodispersivities resulting from the Monte Carlo simulations are compared with first- and second-order perturbation analyses. Some limited results concerning the possible ergodic nature of these simulations, and the presence of non-Gaussian behavior of the mean cloud, are reported as well.
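
    The moment estimates described here reduce, in one dimension, to mass-weighted first and second central moments of the tracer cloud, with the large-time macrodispersivity obtained from the growth rate of the second moment. A sketch under those standard definitions; the variable names are illustrative, not the paper's:

      import numpy as np

      def cloud_moments(x, mass):
          # Mass-weighted centroid and second central moment of the cloud.
          m0 = mass.sum()
          mean = (mass * x).sum() / m0
          var = (mass * (x - mean) ** 2).sum() / m0
          return mean, var

      def macrodispersivity(var1, var2, t1, t2, velocity):
          # A_L ~ d(var)/dt / (2 v), estimated from two observation times.
          return (var2 - var1) / (2.0 * velocity * (t2 - t1))

      x = np.array([1.0, 1.2, 1.5, 2.1]); m = np.ones(4)
      print(cloud_moments(x, m))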

  14. Investigation of OPET Performance Using GATE, a Geant4-Based Simulation Software.

    PubMed

    Rannou, Fernando R; Kohli, Vandana; Prout, David L; Chatziioannou, Arion F

    2004-10-01

    A combined optical positron emission tomography (OPET) system is capable of both optical and PET imaging in the same setting, and it can provide information/interpretation not possible in single-mode imaging. The scintillator array here serves the dual function of coupling the optical signal from bioluminescence/fluorescence to the photodetector and also of channeling optical scintillations from the gamma rays. We report simulation results of the PET part of OPET using GATE, a Geant4 simulation package. The purpose of this investigation is the definition of the geometric parameters of the OPET tomograph. OPET is composed of six detector blocks arranged in a hexagonal ring-shaped pattern with an inner radius of 15.6 mm. Each detector consists of a two-dimensional array of 8 × 8 scintillator crystals each measuring 2 × 2 × 10 mm³. Monte Carlo simulations were performed using the GATE software to measure absolute sensitivity, depth of interaction, and spatial resolution for two ring configurations, with and without gantry rotations, two crystal materials, and several crystal lengths. Images were reconstructed with filtered backprojection after angular interleaving and transverse one-dimensional interpolation of the sinogram. We report absolute sensitivities nearly seven times that of the prototype microPET at the center of the field of view and 2.0 mm tangential and 2.3 mm radial resolutions with gantry rotations up to an 8.0 mm radial offset. These performance parameters indicate that the imaging spatial resolution and sensitivity of the OPET system will be suitable for high-resolution and high-sensitivity small-animal PET imaging.

  15. Monte Carlo charged-particle tracking and energy deposition on a Lagrangian mesh.

    PubMed

    Yuan, J; Moses, G A; McKenty, P W

    2005-10-01

    A Monte Carlo algorithm for alpha particle tracking and energy deposition on a cylindrical computational mesh in a Lagrangian hydrodynamics code used for inertial confinement fusion (ICF) simulations is presented. The straight line approximation is used to follow propagation of "Monte Carlo particles" which represent collections of alpha particles generated from thermonuclear deuterium-tritium (DT) reactions. Energy deposition in the plasma is modeled by the continuous slowing down approximation. The scheme addresses various aspects arising in the coupling of Monte Carlo tracking with Lagrangian hydrodynamics, such as non-orthogonal severely distorted mesh cells, particle relocation on the moving mesh and particle relocation after rezoning. A comparison with the flux-limited multi-group diffusion transport method is presented for a polar direct drive target design for the National Ignition Facility. Simulations show that the Monte Carlo transport method predicts earlier ignition than the diffusion method and generates a higher hot spot temperature. Nearly linear speed-up is achieved for multi-processor parallel simulations.
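
    The straight-line/continuous-slowing-down combination amounts to stepping each Monte Carlo particle along its ray and depositing dE = S(E) ds per step until its energy is exhausted. A 1D toy sketch with an invented stopping-power function; the paper's handling of distorted, moving Lagrangian cells is omitted:

      def track_alpha(energy, n_cells, step, stopping_power):
          # Deposit energy cell by cell along a straight line until the
          # particle stops (continuous slowing down approximation).
          deposits = [0.0] * n_cells
          for i in range(n_cells):
              dE = min(energy, stopping_power(energy) * step)
              deposits[i] += dE
              energy -= dE
              if energy <= 0.0:
                  break
          return deposits

      # toy stopping power rising as the particle slows (Bragg-like)
      dose = track_alpha(3.5, 50, 0.01, lambda E: 20.0 / max(E, 0.1))
      print(sum(dose), dose[:3])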

  16. A Monte Carlo method using octree structure in photon and electron transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ogawa, K.; Maeda, S.

    Most of the early Monte Carlo calculations in medical physics were used to calculate absorbed dose distributions, and detector responses and efficiencies. Recently, data acquisition in Single Photon Emission CT (SPECT) has been simulated by a Monte Carlo method to evaluate scatter photons generated in a human body and a collimator. Monte Carlo simulations in SPECT data acquisition are generally based on the transport of photons only because the photons being simulated are low energy, and therefore the bremsstrahlung productions by the electrons generated are negligible. Since the transport calculation of photons without electrons is much simpler than that with electrons, it is possible to accomplish the high-speed simulation in a simple object with one medium. Here, object description is important in performing the photon and/or electron transport using a Monte Carlo method efficiently. The authors propose a new description method using an octree representation of an object. Thus even if the boundaries of each medium are represented accurately, high-speed calculation of photon transport can be accomplished because the number of voxels is much fewer than that of the voxel-based approach which represents an object by a union of the voxels of the same size. This Monte Carlo code using the octree representation of an object first establishes the simulation geometry by reading an octree string, which is produced by forming an octree structure from a set of serial sections for the object before the simulation; then it transports photons in the geometry. Using the code, if the user just prepares a set of serial sections for the object in which he or she wants to simulate photon trajectories, he or she can perform the simulation automatically using the suboptimal geometry simplified by the octree representation without forming the optimal geometry by hand.
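
    The octree gain comes from answering "which medium is at point (x, y, z)?" by descending at most a logarithmic number of levels instead of indexing a huge uniform voxel grid. A minimal lookup sketch over the unit cube; tree construction from serial sections, as the authors describe, is omitted:

      class Octree:
          # A node is either a leaf holding a medium id or eight children
          # covering the octants of its cube (bit0: x, bit1: y, bit2: z).
          def __init__(self, medium=None, children=None):
              self.medium, self.children = medium, children

          def medium_at(self, x, y, z, cx=0.5, cy=0.5, cz=0.5, half=0.5):
              if self.children is None:
                  return self.medium
              octant = (x >= cx) | ((y >= cy) << 1) | ((z >= cz) << 2)
              q = half / 2.0
              nc = (cx + (q if x >= cx else -q),
                    cy + (q if y >= cy else -q),
                    cz + (q if z >= cz else -q))
              return self.children[octant].medium_at(x, y, z, *nc, q)

      # two media split along x: even children (x < 0.5) hold medium 1
      root = Octree(children=[Octree(medium=1 + (i & 1)) for i in range(8)])
      print(root.medium_at(0.25, 0.9, 0.1), root.medium_at(0.75, 0.1, 0.9))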

  17. Fast multipurpose Monte Carlo simulation for proton therapy using multi- and many-core CPU architectures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Souris, Kevin, E-mail: kevin.souris@uclouvain.be; Lee, John Aldo; Sterpin, Edmond

    2016-04-15

    Purpose: Accuracy in proton therapy treatment planning can be improved using Monte Carlo (MC) simulations. However, the long computation time of such methods hinders their use in clinical routine. This work aims to develop a fast multipurpose Monte Carlo simulation tool for proton therapy using massively parallel central processing unit (CPU) architectures. Methods: A new Monte Carlo code, called MCsquare (many-core Monte Carlo), has been designed and optimized for the last generation of Intel Xeon processors and Intel Xeon Phi coprocessors. These massively parallel architectures offer the flexibility and the computational power suitable to MC methods. The class-II condensed history algorithm of MCsquare provides a fast and yet accurate method of simulating heavy charged particles such as protons, deuterons, and alphas inside voxelized geometries. Hard ionizations, with energy losses above a user-specified threshold, are simulated individually, while soft events are regrouped in a multiple scattering theory. Elastic and inelastic nuclear interactions are sampled from ICRU 63 differential cross sections, thereby allowing for the computation of prompt gamma emission profiles. MCsquare has been benchmarked against the GATE/GEANT4 Monte Carlo application for homogeneous and heterogeneous geometries. Results: Comparisons with GATE/GEANT4 for various geometries show deviations within 2%/1 mm. In spite of the limited memory bandwidth of the coprocessor, simulation time is below 25 s for 10^7 primary 200 MeV protons in average soft tissues using all Xeon Phi and CPU resources embedded in a single desktop unit. Conclusions: MCsquare exploits the flexibility of CPU architectures to provide a multipurpose MC simulation tool. Optimized code enables the use of accurate MC calculation within a reasonable computation time, adequate for clinical practice. MCsquare also simulates prompt gamma emission and can thus be used for in vivo range verification.

  18. SU-E-T-222: Computational Optimization of Monte Carlo Simulation On 4D Treatment Planning Using the Cloud Computing Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chow, J

    Purpose: This study evaluated the efficiency of 4D lung radiation treatment planning using Monte Carlo simulation on the cloud. The EGSnrc Monte Carlo code was used in dose calculation on the 4D-CT image set. Methods: The 4D lung radiation treatment plan was created by the DOSCTP linked to the cloud, based on the Amazon elastic compute cloud platform. Dose calculation was carried out by Monte Carlo simulation on the 4D-CT image set on the cloud, and results were sent to the FFD4D image deformation program for dose reconstruction. The dependence of the computing time for the treatment plan on the number of compute nodes was optimized with variations of the number of CT image sets in the breathing cycle and the dose reconstruction time of the FFD4D. Results: It is found that the dependence of computing time on the number of compute nodes was affected by the diminishing return of the number of nodes used in the Monte Carlo simulation. Moreover, the performance of the 4D treatment planning could be optimized by using fewer than 10 compute nodes on the cloud. The effects of the number of image sets and dose reconstruction time on the dependence of computing time on the number of nodes were not significant when more than 15 compute nodes were used in the Monte Carlo simulations. Conclusion: The issue of long computing time in the 4D treatment plan, requiring Monte Carlo dose calculations in all CT image sets in the breathing cycle, can be solved using cloud computing technology. It is concluded that the optimal number of compute nodes selected for simulation should be between 5 and 15, as the dependence of computing time on the number of nodes is significant.

  19. SU-F-T-149: Development of the Monte Carlo Simulation Platform Using Geant4 for Designing Heavy Ion Therapy Beam Nozzle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shin, Jae-ik; Yoo, SeungHoon; Cho, Sungho

    Purpose: A significant issue in particle therapy with protons and carbon ions is accurate dose delivery from the beam line to the patient. For designing the complex delivery system, Monte Carlo simulation can be used to model the various physical interactions in scatterers and filters. In this report, we present the development of a Monte Carlo simulation platform, based on Geant4, to help design a prototype particle therapy nozzle. We also show the prototype design of the particle therapy beam nozzle for the Korea Heavy Ion Medical Accelerator (KHIMA) project at the Korea Institute of Radiological and Medical Sciences (KIRAMS), Republic of Korea. Methods: We developed a simulation platform for the particle therapy beam nozzle using Geant4. In this platform, a prototype nozzle design of the scanning system for carbon was simply modeled. For comparison with theoretical beam optics, the lateral beam profile at the isocenter is compared with the Monte Carlo simulation result. From this analysis, we can predict the beam spot properties of the KHIMA system and implement spot size optimization for our spot scanning system. Results: To study the characteristics of the scanning system, various combinations of the spot size from the accelerator with the ridge filter and beam monitor were tested as a simple design for the KHIMA dose delivery system. Conclusion: In this report, we presented part of the simulation platform and the characteristics study. This study is ongoing, with the aim of developing a simulation platform that includes the beam nozzle and a dose verification tool for the treatment planning system; results will be presented as soon as they become available.

  20. MO-FG-BRA-01: 4D Monte Carlo Simulations for Verification of Dose Delivered to a Moving Anatomy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gholampourkashi, S; Cygler, J E.; The Ottawa Hospital Cancer Centre, Ottawa, ON

    Purpose: To validate 4D Monte Carlo (MC) simulations of dose delivery by an Elekta Agility linear accelerator to a moving phantom. Methods: Monte Carlo simulations were performed using the 4DdefDOSXYZnrc/EGSnrc user code, which samples a new geometry for each incident particle and calculates the dose in a continuously moving anatomy. A Quasar respiratory motion phantom with a lung insert containing a 3 cm diameter tumor was used for dose measurements on an Elekta Agility linac with the phantom in stationary and moving states. Dose to the center of the tumor was measured using calibrated EBT3 film and the RADPOS 4D dosimetry system. A VMAT plan covering the tumor was created on the static CT scan of the phantom using Monaco V.5.10.02. A validated BEAMnrc model of our Elekta Agility linac was used for Monte Carlo simulations on stationary and moving anatomies. To compare the planned and delivered doses, linac log files recorded during measurements were used for the simulations. For 4D simulations, deformation vectors that modeled the rigid translation of the lung insert were generated as input to the 4DdefDOSXYZnrc code, as well as the phantom motion trace recorded with RADPOS during the measurements. Results: Monte Carlo simulations and film measurements were found to agree within 2mm/2% for 97.7% of points in the film in the static phantom and 95.5% in the moving phantom. Dose values based on film and RADPOS measurements are within 2% of each other and within 2σ of experimental uncertainties with respect to simulations. Conclusion: Our 4D Monte Carlo simulation using the defDOSXYZnrc code accurately calculates dose delivered to a moving anatomy. Future work will focus on further investigation of VMAT delivery on a moving phantom to improve the agreement between simulation and measurements, as well as establishing the accuracy of our method in a deforming anatomy. This work was supported by the Ontario Consortium of Adaptive Interventions in Radiation Oncology (OCAIRO), funded by the Ontario Research Fund Research Excellence program.

  1. Effect of the multiple scattering of electrons in Monte Carlo simulation of LINACS.

    PubMed

    Vilches, Manuel; García-Pareja, Salvador; Guerrero, Rafael; Anguiano, Marta; Lallena, Antonio M

    2008-01-01

    Results obtained from Monte Carlo simulations of the transport of electrons in thin slabs of dense material media and air slabs with different widths are analyzed. Various general purpose Monte Carlo codes have been used: PENELOPE, GEANT3, GEANT4, EGSNRC, MCNPX. Non-negligible differences between the angular and radial distributions after the slabs have been found. The effects of these differences on the depth doses measured in water are also discussed.

  2. PRELIMINARY COUPLING OF THE MONTE CARLO CODE OPENMC AND THE MULTIPHYSICS OBJECT-ORIENTED SIMULATION ENVIRONMENT (MOOSE) FOR ANALYZING DOPPLER FEEDBACK IN MONTE CARLO SIMULATIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matthew Ellis; Derek Gaston; Benoit Forget

    In recent years the use of Monte Carlo methods for modeling reactors has become feasible due to the increasing availability of massively parallel computer systems. One of the primary challenges yet to be fully resolved, however, is the efficient and accurate inclusion of multiphysics feedback in Monte Carlo simulations. The research in this paper presents a preliminary coupling of the open source Monte Carlo code OpenMC with the open source Multiphysics Object-Oriented Simulation Environment (MOOSE). The coupling of OpenMC and MOOSE will be used to investigate efficient and accurate numerical methods needed to include multiphysics feedback in Monte Carlo codes. An investigation into the sensitivity of Doppler feedback to fuel temperature approximations using a two-dimensional 17x17 PWR fuel assembly is presented in this paper. The results show a functioning multiphysics coupling between OpenMC and MOOSE. The coupling utilizes Functional Expansion Tallies to accurately and efficiently transfer pin power distributions tallied in OpenMC to the unstructured finite element meshes used in MOOSE. The two-dimensional PWR fuel assembly case also demonstrates that, for a simplified model, the pin-by-pin Doppler feedback can be adequately replicated by scaling a representative pin based on pin relative powers.

  3. Multi-Conformation Monte Carlo: A Method for Introducing Flexibility in Efficient Simulations of Many-Protein Systems.

    PubMed

    Prytkova, Vera; Heyden, Matthias; Khago, Domarin; Freites, J Alfredo; Butts, Carter T; Martin, Rachel W; Tobias, Douglas J

    2016-08-25

    We present a novel multi-conformation Monte Carlo simulation method that enables the modeling of protein-protein interactions and aggregation in crowded protein solutions. This approach is relevant to a molecular-scale description of realistic biological environments, including the cytoplasm and the extracellular matrix, which are characterized by high concentrations of biomolecular solutes (e.g., 300-400 mg/mL for proteins and nucleic acids in the cytoplasm of Escherichia coli). Simulation of such environments necessitates the inclusion of a large number of protein molecules. Therefore, computationally inexpensive methods, such as rigid-body Brownian dynamics (BD) or Monte Carlo simulations, can be particularly useful. However, as we demonstrate herein, the rigid-body representation typically employed in simulations of many-protein systems gives rise to certain artifacts in protein-protein interactions. Our approach allows us to incorporate molecular flexibility in Monte Carlo simulations at low computational cost, thereby eliminating ambiguities arising from structure selection in rigid-body simulations. We benchmark and validate the methodology using simulations of hen egg white lysozyme in solution, a well-studied system for which extensive experimental data, including osmotic second virial coefficients, small-angle scattering structure factors, and multiple structures determined by X-ray and neutron crystallography and solution NMR, as well as rigid-body BD simulation results, are available for comparison.

  4. A Survey of Software Reliability Modeling and Estimation

    DTIC Science & Technology

    1983-09-01

    considered include: the Jelinski-Moranda Model, the Geometric Model, and Musa's Model. A Monte Carlo study of the behavior of the least squares... Proceedings Number 261, 1979, pp. 34-1, 34-11. Sukert, Alan and Goel, Amrit, "A Guidebook for Software Reliability Assessment," 1980

  5. Comparison of effects of copropagated and precomputed atmosphere profiles on Monte Carlo trajectory simulation

    NASA Technical Reports Server (NTRS)

    Queen, Eric M.; Omara, Thomas M.

    1990-01-01

    A realization of a stochastic atmosphere model for use in simulations is presented. The model provides pressure, density, temperature, and wind velocity as a function of latitude, longitude, and altitude, and is implemented in a three-degree-of-freedom simulation package. This implementation is used in the Monte Carlo simulation of an aeroassisted orbital transfer maneuver, and the results are compared to those of a more traditional approach.

  6. Acceleration of Monte Carlo simulation of photon migration in complex heterogeneous media using Intel many-integrated core architecture.

    PubMed

    Gorshkov, Anton V; Kirillin, Mikhail Yu

    2015-08-01

    Over two decades, the Monte Carlo technique has become a gold standard in simulation of light propagation in turbid media, including biotissues. Technological solutions provide further advances of this technique. The Intel Xeon Phi coprocessor is a new type of accelerator for highly parallel general purpose computing, which allows execution of a wide range of applications without substantial code modification. We present a technical approach to porting our previously developed Monte Carlo (MC) code for simulation of light transport in tissues to the Intel Xeon Phi coprocessor. We show that employing the accelerator allows the computational time of the MC simulation to be reduced, with a speed-up comparable to that of a GPU. We demonstrate the performance of the developed code for simulation of light transport in the human head and determination of the measurement volume in near-infrared spectroscopy brain sensing.

  7. TU-H-CAMPUS-IeP1-04: Combined Organ Dose for Digital Subtraction Angiography and Computed Tomography Using Monte Carlo Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sakabe, D; Ohno, T; Araki, F

    Purpose: The purpose of this study was to evaluate the combined organ dose of digital subtraction angiography (DSA) and computed tomography (CT) using a Monte Carlo (MC) simulation on the abdominal intervention. Methods: The organ doses for DSA and CT were obtained with MC simulation and actual measurements using fluorescent-glass dosimeters at 7 abdominal portions in an Alderson-Rando phantom. DSA was performed from three directions: posterior anterior (PA), right anterior oblique (RAO), and left anterior oblique (LAO). The organ dose with MC simulation was compared with actual radiation dose measurements. Calculations for the MC simulation were carried out with the GMctdospp (IMPS, Germany) software based on the EGSnrc MC code. Finally, the combined organ dose for DSA and CT was calculated from the MC simulation using the X-ray conditions of a patient with a diagnosis of hepatocellular carcinoma. Results: For DSA from the PA direction, the organ doses for the actual measurements and MC simulation were 2.2 and 2.4 mGy/100 mAs at the liver, respectively, and 3.0 and 3.1 mGy/100 mAs at the spinal cord, while for CT, the organ doses were 15.2 and 15.1 mGy/100 mAs at the liver, and 14.6 and 13.5 mGy/100 mAs at the spinal cord. The maximum difference in organ dose between the actual measurements and the MC simulation was 11.0% of the spleen at PA, 8.2% of the spinal cord at RAO, and 6.1% of the left kidney at LAO with DSA, and 9.3% of the stomach with CT. The combined organ dose (4 DSAs and 6 CT scans) with the use of actual patient conditions was found to be 197.4 mGy for the liver and 205.1 mGy for the spinal cord. Conclusion: Our method makes it possible to accurately assess the organ dose to patients for abdominal intervention with combined DSA and CT.

  8. Data decomposition of Monte Carlo particle transport simulations via tally servers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Romano, Paul K.; Siegel, Andrew R.; Forget, Benoit

    An algorithm for decomposing large tally data in Monte Carlo particle transport simulations is developed, analyzed, and implemented in a continuous-energy Monte Carlo code, OpenMC. The algorithm is based on a non-overlapping decomposition of compute nodes into tracking processors and tally servers. The former are used to simulate the movement of particles through the domain while the latter continuously receive and update tally data. A performance model for this approach is developed, suggesting that, for a range of parameters relevant to LWR analysis, the tally server algorithm should perform with minimal overhead on contemporary supercomputers. An implementation of the algorithm in OpenMC is then tested on the Intrepid and Titan supercomputers, supporting the key predictions of the model over a wide range of parameters. We thus conclude that the tally server algorithm is a successful approach to circumventing classical on-node memory constraints en route to unprecedentedly detailed Monte Carlo reactor simulations.

  9. Hypothesis testing of scientific Monte Carlo calculations.

    PubMed

    Wallerberger, Markus; Gull, Emanuel

    2017-11-01

    The steadily increasing size of scientific Monte Carlo simulations and the desire for robust, correct, and reproducible results necessitates rigorous testing procedures for scientific simulations in order to detect numerical problems and programming bugs. However, the testing paradigms developed for deterministic algorithms have proven to be ill suited for stochastic algorithms. In this paper we demonstrate explicitly how the technique of statistical hypothesis testing, which is in wide use in other fields of science, can be used to devise automatic and reliable tests for Monte Carlo methods, and we show that these tests are able to detect some of the common problems encountered in stochastic scientific simulations. We argue that hypothesis testing should become part of the standard testing toolkit for scientific simulations.
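
    In its simplest form the idea is a standard two-sided z-test of a Monte Carlo estimator against a known expectation; the paper develops this into a systematic testing framework, which the sketch below does not reproduce:

      import math, random

      def z_test_mc(sampler, expected, n=100_000, z_crit=2.576, seed=7):
          # Estimate the mean and its standard error from n i.i.d. draws,
          # then flag disagreement if |z| exceeds the critical value.
          rng = random.Random(seed)
          xs = [sampler(rng) for _ in range(n)]
          mean = sum(xs) / n
          var = sum((x - mean) ** 2 for x in xs) / (n - 1)
          z = (mean - expected) / math.sqrt(var / n)
          return abs(z) < z_crit

      # E[U^2] = 1/3 for U ~ Uniform(0, 1); the test should pass
      print(z_test_mc(lambda rng: rng.random() ** 2, 1.0 / 3.0))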

  10. Random number generators for large-scale parallel Monte Carlo simulations on FPGA

    NASA Astrophysics Data System (ADS)

    Lin, Y.; Wang, F.; Liu, B.

    2018-05-01

    Through parallelization, a field programmable gate array (FPGA) can achieve unprecedented speeds in large-scale parallel Monte Carlo (LPMC) simulations. FPGAs present both new constraints and new opportunities for the implementation of random number generators (RNGs), which are key elements of any Monte Carlo (MC) simulation system. Using empirical and application-based tests, this study evaluates all four of the RNGs used in previous FPGA-based MC studies, together with newly proposed FPGA implementations of two well-known high-quality RNGs that are suitable for LPMC studies on FPGA. One of the newly proposed FPGA implementations, a parallel version of the additive lagged Fibonacci generator (Parallel ALFG), is found to be the best among the evaluated RNGs in fulfilling the needs of LPMC simulations on FPGA.
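
    For reference, the recurrence behind an additive lagged Fibonacci generator is x_n = (x_{n-s} + x_{n-r}) mod 2^m; parallel streams are commonly obtained by giving each instance an independently seeded lag table. A plain software sketch with common lags (55, 24); the paper's contribution is the FPGA implementation, which this does not reproduce:

      import random

      class ALFG:
          # x_n = (x_{n-55} + x_{n-24}) mod 2^32, with the state kept as a
          # sliding window of the last r values (at least one seed odd).
          def __init__(self, seeds, r=55, s=24, m=32):
              assert len(seeds) == r
              self.state, self.r, self.s = list(seeds), r, s
              self.mask = (1 << m) - 1

          def next(self):
              x = (self.state[-self.r] + self.state[-self.s]) & self.mask
              self.state.append(x)
              self.state.pop(0)
              return x

      rng = ALFG([random.getrandbits(32) | 1 for _ in range(55)])
      print([rng.next() for _ in range(3)])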

  11. Hypothesis testing of scientific Monte Carlo calculations

    NASA Astrophysics Data System (ADS)

    Wallerberger, Markus; Gull, Emanuel

    2017-11-01

    The steadily increasing size of scientific Monte Carlo simulations and the desire for robust, correct, and reproducible results necessitates rigorous testing procedures for scientific simulations in order to detect numerical problems and programming bugs. However, the testing paradigms developed for deterministic algorithms have proven to be ill suited for stochastic algorithms. In this paper we demonstrate explicitly how the technique of statistical hypothesis testing, which is in wide use in other fields of science, can be used to devise automatic and reliable tests for Monte Carlo methods, and we show that these tests are able to detect some of the common problems encountered in stochastic scientific simulations. We argue that hypothesis testing should become part of the standard testing toolkit for scientific simulations.

  12. Split Orthogonal Group: A Guiding Principle for Sign-Problem-Free Fermionic Simulations

    NASA Astrophysics Data System (ADS)

    Wang, Lei; Liu, Ye-Hua; Iazzi, Mauro; Troyer, Matthias; Harcos, Gergely

    2015-12-01

    We present a guiding principle for designing fermionic Hamiltonians and quantum Monte Carlo (QMC) methods that are free from the infamous sign problem by exploiting the Lie groups and Lie algebras that appear naturally in the Monte Carlo weight of fermionic QMC simulations. Specifically, rigorous mathematical constraints on the determinants involving matrices that lie in the split orthogonal group provide a guideline for sign-free simulations of fermionic models on bipartite lattices. This guiding principle not only unifies the recent solutions of the sign problem based on the continuous-time quantum Monte Carlo methods and the Majorana representation, but also suggests new efficient algorithms to simulate physical systems that were previously prohibitive because of the sign problem.

  13. Study of photo-oxidative reactivity of sunscreening agents based on photo-oxidation of uric acid by kinetic Monte Carlo simulation.

    PubMed

    Moradmand Jalali, Hamed; Bashiri, Hadis; Rasa, Hossein

    2015-05-01

    In the present study, the mechanism of free radical production by light-reflective agents in sunscreens (TiO2, ZnO and ZrO2) was obtained by applying kinetic Monte Carlo simulation. The values of the rate constants for each step of the suggested mechanism have been obtained by simulation. The effect of the initial concentration of mineral oxides and uric acid on the rate of uric acid photo-oxidation by irradiation of some sun care agents has been studied. The kinetic Monte Carlo simulation results agree qualitatively with the existing experimental data for the production of free radicals by sun care agents. Copyright © 2015 Elsevier B.V. All rights reserved.

  14. ‘Survival’: a simulation toolkit introducing a modular approach for radiobiological evaluations in ion beam therapy

    NASA Astrophysics Data System (ADS)

    Manganaro, L.; Russo, G.; Bourhaleb, F.; Fausti, F.; Giordanengo, S.; Monaco, V.; Sacchi, R.; Vignati, A.; Cirio, R.; Attili, A.

    2018-04-01

    One major rationale for the application of heavy ion beams in tumour therapy is their increased relative biological effectiveness (RBE). The complex dependencies of the RBE on dose, biological endpoint, position in the field, etc., require the use of biophysical models in treatment planning and clinical analysis. This study aims to introduce a new software tool, named ‘Survival’, to facilitate the radiobiological computations needed in ion therapy. The simulation toolkit was written in C++ and was developed with a modular architecture in order to easily incorporate different radiobiological models. The following models were successfully implemented: the local effect model (LEM, versions I, II and III) and variants of the microdosimetric-kinetic model (MKM). Different numerical evaluation approaches were also implemented: Monte Carlo (MC) numerical methods and a set of faster analytical approximations. Among the possible applications, the toolkit was used to reproduce the RBE versus LET for different ions (proton, He, C, O, Ne) and different cell lines (CHO, HSG). Intercomparisons between different models (LEM and MKM) and computational approaches (MC and fast approximations) were performed. The developed software could represent an important tool for the evaluation of the biological effectiveness of charged particles in ion beam therapy, in particular when coupled with treatment simulations. Its modular architecture facilitates benchmarking and inter-comparison between different models and evaluation approaches. The code is open source (GPL2 license) and available at https://github.com/batuff/Survival.

  15. Developing a treatment planning process and software for improved translation of photodynamic therapy

    NASA Astrophysics Data System (ADS)

    Cassidy, J.; Zheng, Z.; Xu, Y.; Betz, V.; Lilge, L.

    2017-04-01

    Background: The majority of de novo cancers are diagnosed in low- and middle-income countries, which often lack the resources to provide adequate therapeutic options. Non- or minimally invasive therapies such as photodynamic therapy (PDT) or photothermal therapies could become part of the overall treatment options in these countries. However, widespread acceptance is hindered by the current empirical training of surgeons in these optical techniques and a lack of easily usable treatment-optimizing tools. Methods: Based on the image processing program ITK-SNAP and the publicly available FullMonte light propagation software, a work plan is proposed that allows for personalized PDT treatment planning: starting from contoured clinical CT or MRI images, 3D tetrahedral models are generated in silico, the Monte Carlo simulation is executed, and the resulting 3D fluence rate Φ [mW cm-2] distribution is presented, from which a treatment plan optimizing photon source placement is developed. Results: Allowing 1-2 days for the installation of the required programs, novices can generate their first fluence H [J cm-2] or Φ distribution in a matter of hours; with some training this is reduced to tens of minutes. Executing the photon simulation calculations is rapid and not the performance-limiting process. The largest sources of error are uncertainties in the contouring and unknown tissue optical properties. Conclusions: The presented FullMonte simulation is the fastest tetrahedral-based photon propagation program and provides the basis for PDT treatment planning processes, enabling a faster proliferation of low-cost, minimally invasive personalized cancer therapies.

  16. Deterministic theory of Monte Carlo variance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ueki, T.; Larsen, E.W.

    1996-12-31

    The theoretical estimation of variance in Monte Carlo transport simulations, particularly those using variance reduction techniques, is a substantially unsolved problem. In this paper, the authors describe a theory that predicts the variance in a variance reduction method proposed by Dwivedi. Dwivedi's method combines the exponential transform with angular biasing. The key element of this theory is a new modified transport problem, containing the Monte Carlo weight w as an extra independent variable, which simulates Dwivedi's Monte Carlo scheme. The (deterministic) solution of this modified transport problem yields an expression for the variance. The authors give computational results that validate this theory.
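
    For context, the exponential transform stretches sampled path lengths toward a preferred direction and compensates in the particle weight w; Dwivedi's variant couples this with angular biasing, and the paper's theory then treats w as an extra independent variable of a modified transport problem. A textbook sketch of the transform alone, not the paper's variance theory:

      import math, random

      def biased_flight(sigma_t, p, mu, rng=random):
          # Sample the free flight from the biased cross section
          # sigma* = sigma_t (1 - p mu) and return the weight factor
          # w = true pdf / biased pdf that keeps the estimate unbiased.
          sigma_star = sigma_t * (1.0 - p * mu)
          s = -math.log(rng.random()) / sigma_star
          w = (sigma_t / sigma_star) * math.exp(-(sigma_t - sigma_star) * s)
          return s, w

      s, w = biased_flight(sigma_t=1.0, p=0.7, mu=1.0)
      print(s, w)   # deep penetrations sampled more often, w compensates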

  17. Recommender engine for continuous-time quantum Monte Carlo methods

    NASA Astrophysics Data System (ADS)

    Huang, Li; Yang, Yi-feng; Wang, Lei

    2017-03-01

    Recommender systems play an essential role in the modern business world. They recommend favorable items such as books, movies, and search queries to users based on their past preferences. Applying similar ideas and techniques to Monte Carlo simulations of physical systems boosts their efficiency without sacrificing accuracy. Exploiting the quantum to classical mapping inherent in the continuous-time quantum Monte Carlo methods, we construct a classical molecular gas model to reproduce the quantum distributions. We then utilize powerful molecular simulation techniques to propose efficient quantum Monte Carlo updates. The recommender engine approach provides a general way to speed up the quantum impurity solvers.

  18. KSC Tech Transfer News, Volume 2, No. 2

    NASA Technical Reports Server (NTRS)

    Makufka, David (Editor); Dunn, Carol (Editor)

    2009-01-01

    This issue contains articles about: (1) the Innovative Partnerships Program (IPP) and the manager of the program, Alexis Hongamen, (2) New Technology Report (NTR) on a Monte Carlo Simulation to Estimate the Likelihood of Direct Lightning Strikes, (3) Kennedy Space Center's Applied Physics Lab, (4) a virtual ruler that is used for many applications, (5) a portable device that finds low-level leaks, (6) a sun-shield, that supports in-space cryogenic propellant storage, (7) lunar dust modeling software, (8) space based monitoring of radiation damage to DNA, (9) the use of light-emitting diode (LED) arrays vegetable production system, (10) Dust Tolerant Intelligent Electrical Connection Systems, (11) Ice Detection Camera System Upgrade, (12) Repair Techniques for Composite Structures, (13) Cryogenic Orbital Testbed, and (14) copyright protection.

  19. Characterization of Space Shuttle Ascent Debris Aerodynamics Using CFD Methods

    NASA Technical Reports Server (NTRS)

    Murman, Scott M.; Aftosmis, Michael J.; Rogers, Stuart E.

    2005-01-01

    An automated Computational Fluid Dynamics process for determining the aerodynamic characteristics of debris shedding from the Space Shuttle Launch Vehicle during ascent is presented. This process uses fully-coupled, six-degree-of-freedom Cartesian simulations of isolated debris pieces in a Monte Carlo fashion to produce models for the drag and crossrange behavior over a range of debris shapes and shedding scenarios. A validation of the Cartesian methods against ballistic range data for insulating foam debris shapes at flight conditions, as well as a validation of the resulting models, are both included. These models are integrated with the existing shuttle debris transport analysis software to provide an accurate and efficient engineering tool for analyzing debris sources and their potential for damage.

  20. Recent developments in Geant4

    DOE PAGES

    Allison, J.; Amako, K.; Apostolakis, J.; ...

    2016-07-01

    Geant4 is a software toolkit for the simulation of the passage of particles through matter. It is used by a large number of experiments and projects in a variety of application domains, including high energy physics, astrophysics and space science, medical physics and radiation protection. Over the past several years, major changes have been made to the toolkit in order to accommodate the needs of these user communities, and to efficiently exploit the growth of computing power made available by advances in technology. The adaptation of Geant4 to multithreading, advances in physics, detector modeling and visualization, extensions to the toolkit, including biasing and reverse Monte Carlo, and tools for physics and release validation are discussed here.

  1. Non-proportional odds multivariate logistic regression of ordinal family data.

    PubMed

    Zaloumis, Sophie G; Scurrah, Katrina J; Harrap, Stephen B; Ellis, Justine A; Gurrin, Lyle C

    2015-03-01

    Methods to examine whether genetic and/or environmental sources can account for the residual variation in ordinal family data usually assume proportional odds. However, standard software to fit the non-proportional odds model to ordinal family data is limited because the correlation structure of family data is more complex than for other types of clustered data. To perform these analyses we propose the non-proportional odds multivariate logistic regression model and take a simulation-based approach to model fitting using Markov chain Monte Carlo methods, such as partially collapsed Gibbs sampling and the Metropolis algorithm. We applied the proposed methodology to male pattern baldness data from the Victorian Family Heart Study. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Efficient Monte Carlo Methods for Biomolecular Simulations.

    NASA Astrophysics Data System (ADS)

    Bouzida, Djamal

    A new approach to efficient Monte Carlo simulations of biological molecules is presented. By relaxing the usual restriction to Markov processes, we are able to optimize performance while dealing directly with the inhomogeneity and anisotropy inherent in these systems. The advantage of this approach is that we can introduce a wide variety of Monte Carlo moves to deal with complicated motions of the molecule, while maintaining full optimization at every step. This enables the use of a variety of collective rotational moves that relax long-wavelength modes. We were able to show by explicit simulations that the resulting algorithms substantially increase the speed of the simulation while reproducing the correct equilibrium behavior. This approach is particularly intended for simulations of macromolecules, although we expect it to be useful in other situations. The dynamic optimization of the new Monte Carlo methods makes them very suitable for simulated annealing experiments on all systems whose state space is continuous in general, and for the protein folding problem in particular. We introduce an efficient annealing schedule using preferential bias moves. Our simulated annealing experiments yield structures whose free energies were lower than that of the equilibrated X-ray structure, which leads us to believe that the empirical energy function used does not fully represent the interatomic interactions. Furthermore, we believe that the largest discrepancies involve the solvent effects in particular.

  3. Experimental benchmarking of a Monte Carlo dose simulation code for pediatric CT

    NASA Astrophysics Data System (ADS)

    Li, Xiang; Samei, Ehsan; Yoshizumi, Terry; Colsher, James G.; Jones, Robert P.; Frush, Donald P.

    2007-03-01

    In recent years, there has been a desire to reduce CT radiation dose to children because of their susceptibility and prolonged risk for cancer induction. Concerns arise, however, as to the impact of dose reduction on image quality and thus potentially on diagnostic accuracy. To study the dose and image quality relationship, we are developing a simulation code to calculate organ dose in pediatric CT patients. To benchmark this code, a cylindrical phantom was built to represent a pediatric torso, which allows measurements of dose distributions from its center to its periphery. Dose distributions for axial CT scans were measured on a 64-slice multidetector CT (MDCT) scanner (GE Healthcare, Chalfont St. Giles, UK). The same measurements were simulated using a Monte Carlo code (PENELOPE, Universitat de Barcelona) with the applicable CT geometry including bowtie filter. The deviations between simulated and measured dose values were generally within 5%. To our knowledge, this work is one of the first attempts to compare measured radial dose distributions on a cylindrical phantom with Monte Carlo simulated results. It provides a simple and effective method for benchmarking organ dose simulation codes and demonstrates the potential of Monte Carlo simulation for investigating the relationship between dose and image quality for pediatric CT patients.

  4. RECORDS: improved Reporting of montE CarlO RaDiation transport Studies: Report of the AAPM Research Committee Task Group 268.

    PubMed

    Sechopoulos, Ioannis; Rogers, D W O; Bazalova-Carter, Magdalena; Bolch, Wesley E; Heath, Emily C; McNitt-Gray, Michael F; Sempau, Josep; Williamson, Jeffrey F

    2018-01-01

    Studies involving Monte Carlo simulations are common in both diagnostic and therapy medical physics research, as well as other fields of basic and applied science. As with all experimental studies, the conditions and parameters used for Monte Carlo simulations impact their scope, validity, limitations, and generalizability. Unfortunately, many published peer-reviewed articles involving Monte Carlo simulations do not provide the level of detail needed for the reader to be able to properly assess the quality of the simulations. The American Association of Physicists in Medicine Task Group #268 developed guidelines to improve reporting of Monte Carlo studies in medical physics research. By following these guidelines, manuscripts submitted for peer-review will include a level of relevant detail that will increase the transparency, the ability to reproduce results, and the overall scientific value of these studies. The guidelines include a checklist of the items that should be included in the Methods, Results, and Discussion sections of manuscripts submitted for peer-review. These guidelines do not attempt to replace the journal reviewer, but rather to be a tool during the writing and review process. Given the varied nature of Monte Carlo studies, it is up to the authors and the reviewers to use this checklist appropriately, being conscious of how the different items apply to each particular scenario. It is envisioned that this list will be useful both for authors and for reviewers, to help ensure the adequate description of Monte Carlo studies in the medical physics literature. © 2017 American Association of Physicists in Medicine.

  5. Modeling Nitrogen Dynamics in a Waste Stabilization Pond System Using Flexible Modeling Environment with MCMC

    PubMed Central

    Mukhtar, Hussnain; Lin, Yu-Pin; Shipin, Oleg V.; Petway, Joy R.

    2017-01-01

    This study presents an approach for obtaining realization sets of parameters for nitrogen removal in a pilot-scale waste stabilization pond (WSP) system. The proposed approach was designed for optimal parameterization, local sensitivity analysis, and global uncertainty analysis of a dynamic simulation model for the WSP by using the R software package Flexible Modeling Environment (R-FME) with the Markov chain Monte Carlo (MCMC) method. Additionally, generalized likelihood uncertainty estimation (GLUE) was integrated into the FME to evaluate the major parameters that affect the simulation outputs in the study WSP. Comprehensive modeling analysis was used to simulate and assess nine parameters and concentrations of ON-N, NH3-N and NO3-N. Results indicate that the integrated FME-GLUE-based model, with good Nash–Sutcliffe coefficients (0.53–0.69) and correlation coefficients (0.76–0.83), successfully simulates the concentrations of ON-N, NH3-N and NO3-N. Moreover, global sensitivity analysis showed that the Arrhenius constant was the only parameter to which the performance of the ON-N and NH3-N simulations was sensitive, whereas the Nitrosomonas growth rate, the denitrification constant, and the maximum growth rate at 20 °C were the sensitive parameters for the ON-N and NO3-N simulations. PMID:28704958

  6. Neoclassical toroidal viscosity calculations in tokamaks using a δf Monte Carlo simulation and their verifications.

    PubMed

    Satake, S; Park, J-K; Sugama, H; Kanno, R

    2011-07-29

    Neoclassical toroidal viscosities (NTVs) in tokamaks are investigated using a δf Monte Carlo simulation, and are successfully verified against a combined analytic theory over a wide range of collisionality. A Monte Carlo simulation has been required in the study of NTV because the complexities in the guiding-center orbits of particles and their collisions cannot be fully captured by analytic theories alone. The results reveal the details of the complex dependence of the NTV on particle precessions and collisions, which the combined analytic theory had predicted only roughly. Both numerical and analytic methods can be utilized and extended based on these successful verifications.

  7. Improved radial dose function estimation using current version MCNP Monte-Carlo simulation: Model 6711 and ISC3500 125I brachytherapy sources.

    PubMed

    Duggan, Dennis M

    2004-12-01

    Improved cross-sections in a new version of the Monte-Carlo N-particle (MCNP) code may eliminate discrepancies between radial dose functions (as defined by the American Association of Physicists in Medicine Task Group 43) derived from Monte-Carlo simulations of low-energy photon-emitting brachytherapy sources and those from measurements on the same sources with thermoluminescent dosimeters. This is demonstrated for two 125I brachytherapy seed models, the Implant Sciences Model ISC3500 (I-Plant) and the Amersham Health Model 6711, by simulating their radial dose functions with two versions of MCNP, 4c2 and 5.
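
    The quantity being compared here is the TG-43 radial dose function. Under the point-source approximation it is g(r) = [D(r) r^2] / [D(r0) r0^2] with the reference distance r0 = 1 cm; the sketch below computes it from tallied dose rates. The dose values are hypothetical, not taken from the paper.

    ```python
    import numpy as np

    def radial_dose_function(r, dose_rate, r0=1.0):
        """TG-43 radial dose function under the point-source approximation,
        g(r) = [D(r) * r**2] / [D(r0) * r0**2], normalised at r0 = 1 cm."""
        dose_r0 = np.interp(r0, r, dose_rate)
        return (dose_rate * r**2) / (dose_r0 * r0**2)

    # Hypothetical dose rates (arbitrary units) tallied at radial distances (cm).
    r = np.array([0.5, 1.0, 2.0, 3.0, 5.0])
    dose = np.array([4.3, 1.0, 0.21, 0.078, 0.018])
    print(radial_dose_function(r, dose))
    ```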

  8. Web-based, GPU-accelerated, Monte Carlo simulation and visualization of indirect radiation imaging detector performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dong, Han; Sharma, Diksha; Badano, Aldo, E-mail: aldo.badano@fda.hhs.gov

    2014-12-15

    Purpose: Monte Carlo simulations play a vital role in the understanding of the fundamental limitations, design, and optimization of existing and emerging medical imaging systems. Efforts in this area have resulted in the development of a wide variety of open-source software packages. One such package, hybridMANTIS, uses a novel hybrid concept to model indirect scintillator detectors by balancing the computational load using dual CPU and graphics processing unit (GPU) processors, obtaining computational efficiency with reasonable accuracy. In this work, the authors describe two open-source visualization interfaces, webMANTIS and visualMANTIS, to facilitate the setup of computational experiments via hybridMANTIS. Methods: The visualization tools visualMANTIS and webMANTIS enable the user to control simulation properties through a user interface. In the case of webMANTIS, control via a web browser allows access through mobile devices such as smartphones or tablets. webMANTIS acts as a server back-end and communicates with an NVIDIA GPU computing cluster that can support multiuser environments where users can execute different experiments in parallel. Results: The output consists of the point response, the pulse-height spectrum, and the optical transport statistics generated by hybridMANTIS. Users can download the output images and statistics as a zip file for future reference. In addition, webMANTIS provides a visualization window that displays a few selected optical photon paths as they are transported through the detector columns and allows the user to trace the history of the optical photons. Conclusions: The visualization tools visualMANTIS and webMANTIS provide features such as on-the-fly generation of pulse-height spectra and response functions for microcolumnar x-ray imagers while allowing users to save simulation parameters and results from prior experiments. The graphical interfaces simplify the simulation setup and allow the user to go directly from specifying input parameters to receiving visual feedback for the model predictions.

  9. Simulation of image detectors in radiology for determination of scatter-to-primary ratios using Monte Carlo radiation transport code MCNP/MCNPX.

    PubMed

    Smans, Kristien; Zoetelief, Johannes; Verbrugge, Beatrijs; Haeck, Wim; Struelens, Lara; Vanhavere, Filip; Bosmans, Hilde

    2010-05-01

    The purpose of this study was to compare and validate three methods to simulate radiographic image detectors with the Monte Carlo software MCNP/MCNPX in a time efficient way. The first detector model was the standard semideterministic radiography tally, which has been used in previous image simulation studies. Next to the radiography tally two alternative stochastic detector models were developed: A perfect energy integrating detector and a detector based on the energy absorbed in the detector material. Validation of three image detector models was performed by comparing calculated scatter-to-primary ratios (SPRs) with the published and experimentally acquired SPR values. For mammographic applications, SPRs computed with the radiography tally were up to 44% larger than the published results, while the SPRs computed with the perfect energy integrating detectors and the blur-free absorbed energy detector model were, on the average, 0.3% (ranging from -3% to 3%) and 0.4% (ranging from -5% to 5%) lower, respectively. For general radiography applications, the radiography tally overestimated the measured SPR by as much as 46%. The SPRs calculated with the perfect energy integrating detectors were, on the average, 4.7% (ranging from -5.3% to -4%) lower than the measured SPRs, whereas for the blur-free absorbed energy detector model, the calculated SPRs were, on the average, 1.3% (ranging from -0.1% to 2.4%) larger than the measured SPRs. For mammographic applications, both the perfect energy integrating detector model and the blur-free energy absorbing detector model can be used to simulate image detectors, whereas for conventional x-ray imaging using higher energies, the blur-free energy absorbing detector model is the most appropriate image detector model. The radiography tally overestimates the scattered part and should therefore not be used to simulate radiographic image detectors.
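
    The figure of merit throughout is the scatter-to-primary ratio. Assuming the scattered and primary energy contributions are tallied separately per pixel (which a Monte Carlo code permits by labelling photon histories), the SPR computation itself is a one-liner; the arrays below are randomly generated placeholders.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # Hypothetical per-pixel energy tallies from two labelled contributions:
    # photons that reached the detector unscattered (primary) and the rest (scatter).
    primary = rng.uniform(0.8, 1.0, size=(4, 4))
    scatter = rng.uniform(0.3, 0.5, size=(4, 4))

    spr = scatter / primary  # scatter-to-primary ratio per pixel
    print("mean SPR =", spr.mean().round(3))
    ```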

  10. A Machine Learning Method for the Prediction of Receptor Activation in the Simulation of Synapses

    PubMed Central

    Montes, Jesus; Gomez, Elena; Merchán-Pérez, Angel; DeFelipe, Javier; Peña, Jose-Maria

    2013-01-01

    Chemical synaptic transmission involves the release of a neurotransmitter that diffuses in the extracellular space and interacts with specific receptors located on the postsynaptic membrane. Computer simulation approaches provide fundamental tools for exploring various aspects of the synaptic transmission under different conditions. In particular, Monte Carlo methods can track the stochastic movements of neurotransmitter molecules and their interactions with other discrete molecules, the receptors. However, these methods are computationally expensive, even when used with simplified models, preventing their use in large-scale and multi-scale simulations of complex neuronal systems that may involve large numbers of synaptic connections. We have developed a machine-learning based method that can accurately predict relevant aspects of the behavior of synapses, such as the percentage of open synaptic receptors as a function of time since the release of the neurotransmitter, with considerably lower computational cost compared with the conventional Monte Carlo alternative. The method is designed to learn patterns and general principles from a corpus of previously generated Monte Carlo simulations of synapses covering a wide range of structural and functional characteristics. These patterns are later used as a predictive model of the behavior of synapses under different conditions without the need for additional computationally expensive Monte Carlo simulations. This is performed in five stages: data sampling, fold creation, machine learning, validation and curve fitting. The resulting procedure is accurate, automatic, and it is general enough to predict synapse behavior under experimental conditions that are different to the ones it has been trained on. Since our method efficiently reproduces the results that can be obtained with Monte Carlo simulations at a considerably lower computational cost, it is suitable for the simulation of high numbers of synapses and it is therefore an excellent tool for multi-scale simulations. PMID:23894367
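
    A minimal sketch of the surrogate idea, using scikit-learn in place of the authors' five-stage pipeline: train a regressor on a corpus of (synapse parameters, open-receptor fraction) pairs that a Monte Carlo simulator would have produced, then predict without further simulation. The synthetic corpus and the choice of a random forest are illustrative assumptions.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import cross_val_score

    # Hypothetical corpus: each row holds structural/functional synapse parameters
    # (e.g. cleft width, receptor count, diffusion coefficient); the target is the
    # fraction of open receptors at a fixed time after neurotransmitter release,
    # as produced by previously run Monte Carlo simulations.
    rng = np.random.default_rng(1)
    X = rng.uniform(0.0, 1.0, size=(500, 3))
    y = 0.6 * X[:, 0] - 0.3 * X[:, 1] ** 2 + 0.1 * rng.normal(size=500)

    model = RandomForestRegressor(n_estimators=200, random_state=1)
    print("CV R^2:", cross_val_score(model, X, y, cv=5).mean())
    model.fit(X, y)
    print("prediction:", model.predict([[0.5, 0.2, 0.8]]))  # no new MC run needed
    ```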

  11. Simulation of Satellite, Airborne and Terrestrial LiDAR with DART (I):Waveform Simulation with Quasi-Monte Carlo Ray Tracing

    NASA Technical Reports Server (NTRS)

    Gastellu-Etchegorry, Jean-Philippe; Yin, Tiangang; Lauret, Nicolas; Grau, Eloi; Rubio, Jeremy; Cook, Bruce D.; Morton, Douglas C.; Sun, Guoqing

    2016-01-01

    Light Detection And Ranging (LiDAR) provides unique data on the 3-D structure of atmosphere constituents and the Earth's surface. Simulating LiDAR returns for different laser technologies and Earth scenes is fundamental for evaluating and interpreting signal and noise in LiDAR data. Different types of models are capable of simulating LiDAR waveforms of Earth surfaces. Semi-empirical and geometric models can be imprecise because they rely on simplified simulations of Earth surfaces and light interaction mechanisms. On the other hand, Monte Carlo ray tracing (MCRT) models are potentially accurate but require long computational time. Here, we present a new LiDAR waveform simulation tool that is based on the introduction of a quasi-Monte Carlo ray tracing approach in the Discrete Anisotropic Radiative Transfer (DART) model. Two new approaches, the so-called "box method" and "Ray Carlo method", are implemented to provide robust and accurate simulations of LiDAR waveforms for any landscape, atmosphere and LiDAR sensor configuration (view direction, footprint size, pulse characteristics, etc.). The box method accelerates the selection of the scattering direction of a photon in the presence of scatterers with non-invertible phase function. The Ray Carlo method brings traditional ray-tracking into MCRT simulation, which makes computational time independent of LiDAR field of view (FOV) and reception solid angle. Both methods are fast enough for simulating multi-pulse acquisition. Sensitivity studies with various landscapes and atmosphere constituents are presented, and the simulated LiDAR signals compare favorably with their associated reflectance images and Laser Vegetation Imaging Sensor (LVIS) waveforms. The LiDAR module is fully integrated into DART, enabling more detailed simulations of LiDAR sensitivity to specific scene elements (e.g., atmospheric aerosols, leaf area, branches, or topography) and sensor configuration for airborne or satellite LiDAR sensors.
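
    The abstract does not spell out the box method, but the problem it addresses, drawing a scattering direction from a phase function with no invertible CDF, is classically handled by rejection sampling, sketched below. The two-lobe phase function and the bound p_max are invented for illustration.

    ```python
    import numpy as np

    def sample_scatter_angle(phase, p_max, rng):
        """Draw a polar scattering angle theta from an arbitrary (non-invertible)
        phase function by rejection sampling. `phase(theta)` need not be
        normalised; `p_max` must bound phase(theta)*sin(theta) from above."""
        while True:
            theta = rng.uniform(0.0, np.pi)
            if rng.uniform(0.0, p_max) <= phase(theta) * np.sin(theta):
                return theta

    # Hypothetical two-lobe phase function with no closed-form inverse CDF.
    phase = lambda t: 1.0 + 0.8 * np.cos(t) ** 2 + 0.5 * np.exp(-((t - 2.0) ** 2) / 0.1)
    rng = np.random.default_rng(2)
    angles = [sample_scatter_angle(phase, p_max=2.5, rng=rng) for _ in range(5)]
    print(np.round(angles, 3))
    ```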

  12. HepSim: A repository with predictions for high-energy physics experiments

    DOE PAGES

    Chekanov, S. V.

    2015-02-03

    A file repository for calculations of cross sections and kinematic distributions using Monte Carlo generators for high-energy collisions is discussed. The repository is used to facilitate effective preservation and archiving of data from theoretical calculations and for comparisons with experimental data. The HepSim data library is publicly accessible and includes a number of Monte Carlo event samples with Standard Model predictions for current and future experiments. The HepSim project includes a software package to automate the process of downloading and viewing online Monte Carlo event samples. Data streaming over a network for end-user analysis is discussed.

  13. An Energy-Dispersive X-Ray Fluorescence Spectrometry and Monte Carlo simulation study of Iron-Age Nuragic small bronzes ("Navicelle") from Sardinia, Italy

    NASA Astrophysics Data System (ADS)

    Schiavon, Nick; de Palmas, Anna; Bulla, Claudio; Piga, Giampaolo; Brunetti, Antonio

    2016-09-01

    A spectrometric protocol combining Energy Dispersive X-Ray Fluorescence Spectrometry with Monte Carlo simulations of experimental spectra using the XRMC code package has been applied for the first time to characterize the elemental composition of a series of famous Iron Age small-scale archaeological bronze replicas of ships (known as the "Navicelle") from the Nuragic civilization in Sardinia, Italy. The proposed protocol is a useful, nondestructive and fast analytical tool for Cultural Heritage samples. In the Monte Carlo simulations, each sample was modeled as a multilayered object composed of two or three layers depending on the sample: when all are present, the three layers are the original bronze substrate, the surface corrosion patina and an outermost protective layer (Paraloid) applied during past restorations. The Monte Carlo simulations were able to account for the presence of the patina/corrosion layer as well as the Paraloid protective layer, and also for the roughness effect commonly found at the surface of corroded metal archaeological artifacts. In this respect, the Monte Carlo simulation approach adopted here was, to the best of our knowledge, unique and enabled the determination of the bronze alloy composition together with the thickness of the surface layers without the need to first remove the surface patinas, a process potentially threatening the preservation of precious archaeological/artistic artifacts for future generations.

  14. Quantitative risk management in gas injection project: a case study from Oman oil and gas industry

    NASA Astrophysics Data System (ADS)

    Khadem, Mohammad Miftaur Rahman Khan; Piya, Sujan; Shamsuzzoha, Ahm

    2017-09-01

    The purpose of this research was to study the recognition, application and quantification of the risks associated with managing projects. In this research, the management of risks in an oil and gas project is studied and implemented within a case company in Oman. First, qualitative data related to risks in the project were identified through field visits and extensive interviews. These data were then translated into numerical values based on expert opinion. The numerical data were then used as input to a Monte Carlo simulation. RiskyProject Professional™ software was used to simulate the system based on the identified risks. The simulation predicted a worst-case delay of about 2 years, with no chance of meeting the project's on-stream date, and an 8% chance of exceeding the total estimated budget. The result of the numerical analysis from the proposed model is validated by comparing it with the result of the qualitative analysis, which was obtained through discussions with various project managers of the company.
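
    The core of such a quantitative risk analysis is straightforward to reproduce: sample uncertain activity durations many times and read off exceedance probabilities. The sketch below uses three hypothetical sequential activities with triangular distributions; it is not the RiskyProject model.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 100_000

    # Hypothetical schedule risk model: three sequential activities with
    # triangular duration distributions (optimistic, most likely, pessimistic), months.
    durations = (
        rng.triangular(6, 8, 14, n)
        + rng.triangular(10, 12, 24, n)
        + rng.triangular(4, 6, 12, n)
    )

    planned = 30.0  # planned on-stream date, months after project start
    print("P(delay) =", (durations > planned).mean())
    print("90th percentile duration =", np.percentile(durations, 90).round(1), "months")
    ```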

  15. A Monte Carlo simulation study of associated liquid crystals

    NASA Astrophysics Data System (ADS)

    Berardi, R.; Fehervari, M.; Zannoni, C.

    We have performed a Monte Carlo simulation study of a system of ellipsoidal particles with donor-acceptor sites modelling complementary hydrogen-bonding groups in real molecules. We have considered elongated Gay-Berne particles with terminal interaction sites allowing particles to associate and form dimers. The changes in the phase transitions and in the molecular organization and the interplay between orientational ordering and dimer formation are discussed. Particle flip and dimer moves have been used to increase the convergence rate of the Monte Carlo (MC) Markov chain.

  16. Accurate Monte Carlo simulations for nozzle design, commissioning and quality assurance for a proton radiation therapy facility.

    PubMed

    Paganetti, H; Jiang, H; Lee, S Y; Kooy, H M

    2004-07-01

    Monte Carlo dosimetry calculations are essential methods in radiation therapy. To take full advantage of this tool, the beam delivery system has to be simulated in detail and the initial beam parameters have to be known accurately. The modeling of the beam delivery system itself opens various areas where Monte Carlo calculations prove extremely helpful, such as for the design and commissioning of a therapy facility as well as for quality assurance verification. The gantry treatment nozzles at the Northeast Proton Therapy Center (NPTC) at Massachusetts General Hospital (MGH) were modeled in detail using the GEANT4.5.2 Monte Carlo code. For this purpose, various novel solutions for simulating irregularly shaped objects in the beam path, like contoured scatterers, patient apertures or patient compensators, were found. The four-dimensional (time and space) simulation of moving parts, such as the modulator wheel, was implemented. Further, the appropriate physics models and cross sections for proton therapy applications were defined. We present comparisons between measured data and simulations. These show that by modeling the treatment nozzle with millimeter accuracy, it is possible to reproduce measured dose distributions with an accuracy in range and modulation width, in the case of a spread-out Bragg peak (SOBP), of better than 1 mm. The excellent agreement demonstrates that the simulations can even be used to generate beam data for commissioning treatment planning systems. The Monte Carlo nozzle model was used to study mechanical optimization in terms of scattered radiation and secondary radiation in the design of the nozzles, and we present simulations of the neutron background. Further, the Monte Carlo calculations supported commissioning efforts in understanding the sensitivity of beam characteristics and how these influence the dose delivered. We present the sensitivity of dose distributions in water with respect to various beam parameters and geometrical misalignments. This allows the definition of tolerances for quality assurance and the design of quality assurance procedures.

  17. Raman Monte Carlo simulation for light propagation for tissue with embedded objects

    NASA Astrophysics Data System (ADS)

    Periyasamy, Vijitha; Jaafar, Humaira Bte; Pramanik, Manojit

    2018-02-01

    Monte Carlo (MC) simulation is one of the most prominent simulation techniques and is rapidly becoming the model of choice to study light-tissue interaction. Monte Carlo simulation for light transport in multi-layered tissue (MCML) is adapted and extended to different geometries by integrating embedded objects of various shapes (i.e., sphere, cylinder, cuboid and ellipsoid) into the multi-layered structure. These geometries are useful for providing realistic tissue structures, such as models of lymph nodes, tumors, blood vessels, the head and other simulation media. MC simulations were performed on various geometric media. The simulation of MCML with embedded objects (MCML-EO) was extended to propagate photons in the defined medium with Raman scattering, and the location of each Raman photon's generation is recorded. Simulations were performed on a modelled breast tissue with tumors (spherical and ellipsoidal) and blood vessels (cylindrical). Results are presented as both A-line and B-line scans for the embedded objects to determine the spatial locations where Raman photons were generated. Studies were done for different Raman probabilities.
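
    A heavily simplified sketch of the Raman-recording idea: photons take exponentially distributed steps, and at each scattering event a small probability converts the photon to a Raman photon, whose birth location is recorded. Direction changes and the embedded-object geometry are omitted, and all coefficients are hypothetical.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    mu_s, mu_a = 10.0, 0.1  # hypothetical scattering/absorption coefficients (1/mm)
    mu_t = mu_s + mu_a
    p_raman = 1e-4          # hypothetical Raman conversion probability per scattering event

    def raman_origins(n_photons):
        """1D walk: exponential steps; each event either absorbs the photon,
        converts it to a Raman photon (origin recorded), or scatters it onward."""
        origins = []
        for _ in range(n_photons):
            z = 0.0
            while True:
                z += -np.log(1.0 - rng.random()) / mu_t  # Beer-Lambert free path
                if rng.random() < mu_a / mu_t:           # absorption ends the history
                    break
                if rng.random() < p_raman:               # Raman conversion
                    origins.append(z)
                    break
        return np.array(origins)

    origins = raman_origins(10_000)
    print(len(origins), "Raman photons, mean origin depth:", origins.mean().round(1), "mm")
    ```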

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ortiz-Ramírez, Pablo, E-mail: rapeitor@ug.uchile.cl; Ruiz, Andrés

    The Monte Carlo simulation of gamma spectroscopy systems is common practice these days, the most popular codes for this purpose being MCNP and Geant4. The intrinsic spatial efficiency method is a general and absolute method to determine the absolute efficiency of a spectroscopy system for any extended source, but it had been demonstrated experimentally only for cylindrical sources. Given the difficulty of preparing sources of arbitrary shape, the simplest way to do this is by simulating the spectroscopy system and the source. In this work we present the validation of the intrinsic spatial efficiency method for sources with different geometries and for photons with an energy of 661.65 keV. In the simulation the matrix effects (the self-attenuation effect) are not considered, therefore these results are only preliminary. The MC simulation is carried out using the FLUKA code, and the absolute efficiency of the detector is determined using two methods: the statistical count of the Full Energy Peak (FEP) area (the traditional method) and the intrinsic spatial efficiency method. The obtained results show total agreement between the absolute efficiencies determined by the traditional method and the intrinsic spatial efficiency method. The relative bias is less than 1% in all cases.

  19. On the Validity of Continuum Computational Fluid Dynamics Approach Under Very Low-Pressure Plasma Spray Conditions

    NASA Astrophysics Data System (ADS)

    Ivchenko, Dmitrii; Zhang, Tao; Mariaux, Gilles; Vardelle, Armelle; Goutier, Simon; Itina, Tatiana E.

    2018-01-01

    Plasma spray physical vapor deposition aims to substantially evaporate powders in order to produce coatings with various microstructures. This is achieved by powder vapor condensation onto the substrate and/or by deposition of fine melted powder particles and nanoclusters. The deposition process typically operates at pressures ranging between 10 and 200 Pa. In addition to experimental work, numerical simulations are performed to better understand the process and optimize the experimental conditions. However, the combination of high temperatures and low pressure, with shock waves initiated by the supersonic expansion of the hot gas into the low-pressure medium, casts doubt on the applicability of the continuum approach for the simulation of such a process. This work investigates (1) the effects of the pressure dependence of thermodynamic and transport properties on computational fluid dynamics (CFD) predictions and (2) the validity of the continuum approach for thermal plasma flow simulation under very low-pressure conditions. The study compares the flow fields predicted with a continuum approach using CFD software with those obtained by a kinetic approach using the direct simulation Monte Carlo method (DSMC). It also shows how the presence of high gradients can contribute to prediction errors for typical PS-PVD conditions.
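
    Whether a continuum (CFD) description is defensible is conventionally judged by the Knudsen number Kn = lambda / L, with the hard-sphere mean free path lambda = k_B*T / (sqrt(2) * pi * d^2 * p). The sketch below evaluates Kn for PS-PVD-like numbers; the temperature, pressure, and length scale are assumptions, not values from the paper.

    ```python
    import numpy as np

    def knudsen(T, p, d, L):
        """Kn = lambda / L with the hard-sphere mean free path
        lambda = k_B * T / (sqrt(2) * pi * d**2 * p)."""
        k_B = 1.380649e-23  # Boltzmann constant, J/K
        lam = k_B * T / (np.sqrt(2) * np.pi * d**2 * p)
        return lam / L

    # Assumed PS-PVD-like conditions: hot argon at 100 Pa, 0.1 m characteristic length.
    Kn = knudsen(T=8000.0, p=100.0, d=3.4e-10, L=0.1)
    print(f"Kn = {Kn:.3f}")  # continuum CFD is usually trusted only for Kn << 0.1
    ```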

  20. Influence of Iterative Reconstruction Algorithms on PET Image Resolution

    NASA Astrophysics Data System (ADS)

    Karpetas, G. E.; Michail, C. M.; Fountos, G. P.; Valais, I. G.; Nikolopoulos, D.; Kandarakis, I. S.; Panayiotakis, G. S.

    2015-09-01

    The aim of the present study was to assess the image quality of PET scanners using a thin layer chromatography (TLC) plane source. The source was simulated using a previously validated Monte Carlo model. The model was developed with the GATE MC package, and reconstructed images were obtained with the STIR software for tomographic image reconstruction. The simulated PET scanner was the GE DiscoveryST. A plane source consisting of a TLC plate was simulated as a layer of silica gel on an aluminum (Al) foil substrate, immersed in an 18F-FDG bath solution (1 MBq). Image quality was assessed in terms of the modulation transfer function (MTF). MTF curves were estimated from transverse reconstructed images of the plane source. Images were reconstructed with the maximum likelihood estimation (MLE)-OSMAPOSL, ordered subsets separable paraboloidal surrogate (OSSPS), median root prior (MRP), and OSMAPOSL-with-quadratic-prior algorithms. OSMAPOSL reconstruction was assessed using fixed subsets and various iterations, as well as various beta (hyper)parameter values. MTF values were found to increase with increasing iterations, and the MTF also improved with lower beta values. The simulated PET evaluation method, based on the TLC plane source, can be useful in the resolution assessment of PET scanners.
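
    The MTF itself is obtained by Fourier-transforming a line spread function extracted from the reconstructed image of the plane source and normalizing to zero frequency. A minimal sketch, with a hypothetical Gaussian LSF standing in for the measured profile:

    ```python
    import numpy as np

    def mtf_from_lsf(lsf, pixel_mm):
        """Modulation transfer function as the normalised magnitude of the
        Fourier transform of a line spread function sampled at `pixel_mm`."""
        spectrum = np.abs(np.fft.rfft(lsf))
        mtf = spectrum / spectrum[0]
        freqs = np.fft.rfftfreq(len(lsf), d=pixel_mm)  # spatial frequency, cycles/mm
        return freqs, mtf

    # Hypothetical Gaussian LSF extracted from a reconstructed image of the plane source.
    x = np.arange(-32, 32) * 2.0  # 2 mm pixels
    lsf = np.exp(-x**2 / (2 * 4.0**2))
    freqs, mtf = mtf_from_lsf(lsf, pixel_mm=2.0)
    print(np.round(mtf[:5], 3))
    ```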

  1. N values estimation based on photon flux simulation with Geant4 toolkit.

    PubMed

    Sun, Z J; Danjaji, M; Kim, Y

    2018-06-01

    N values are routinely introduced in photon activation analysis (PAA) as the ratios of specific activities of product nuclides, used to compare the relative intensities of different reaction channels. They determine the individual activities of each radioisotope and the total activity of the sample, which are the primary concerns of radiation safety. Traditionally, N values are calculated from gamma spectroscopy in real measurements by normalizing the activities of individual nuclides to the reference reaction [58Ni(γ,n)57Ni] of the nickel monitor simultaneously irradiated in photon activation. Is it possible to use a photon flux simulated by Monte Carlo software to calculate N values even before the actual irradiation starts? This study applied the Geant4 toolkit, a popular platform for simulating the passage of particles through matter, to generate the photon flux in the samples. Combined with photonuclear cross sections from the IAEA database, it is feasible to predict N values in different experimental setups for a simulated target material. We have validated this method and its consistency with Geant4. Results also show that N values are highly correlated with the beam parameters of the incoming electrons and the setup of the electron-photon converter. Copyright © 2018 Elsevier Ltd. All rights reserved.
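
    Once the activities are in hand, the N value is just a normalization to the nickel monitor. A toy computation, with invented specific activities:

    ```python
    # Hypothetical specific activities (Bq/g): product nuclides in the sample and
    # 57Ni from the 58Ni(gamma,n)57Ni monitor reaction irradiated alongside it.
    specific_activity = {"54Mn": 120.0, "22Na": 35.0, "7Be": 310.0}
    a_ni57 = 480.0

    for nuclide, a in specific_activity.items():
        print(f"N({nuclide}) = {a / a_ni57:.3f}")  # N value: normalised to the Ni monitor
    ```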

  2. Applying Monte-Carlo simulations to optimize an inelastic neutron scattering system for soil carbon analysis

    USDA-ARS?s Scientific Manuscript database

    Computer Monte-Carlo (MC) simulations (Geant4) of neutron propagation and the acquisition of the gamma response from soil samples were applied to evaluate INS system performance characteristics [sensitivity, minimal detectable level (MDL)] for soil carbon measurement. The INS system model with best performanc...

  3. Monte Carlo simulation of MOSFET dosimeter for electron backscatter using the GEANT4 code.

    PubMed

    Chow, James C L; Leung, Michael K K

    2008-06-01

    The aim of this study is to investigate the influence of the body of the metal-oxide-semiconductor field effect transistor (MOSFET) dosimeter in measuring the electron backscatter from lead. The electron backscatter factor (EBF), which is defined as the ratio of the dose at the tissue-lead interface to the dose at the same point without the presence of backscatter, was calculated by Monte Carlo simulation using the GEANT4 code. Electron beams with energies of 4, 6, 9, and 12 MeV were used in the simulation. It was found that in the presence of the MOSFET body, the EBFs were underestimated by about 2%-0.9% for electron beam energies of 4-12 MeV, respectively. The trend of decreasing EBF with increasing electron energy can be explained by the fact that the small MOSFET dosimeter, made mainly of epoxy and silicon, attenuates not only the electron fluence of the beam from upstream but also the electron backscatter generated by the lead underneath the dosimeter. However, this EBF underestimation is of the same order as the statistical uncertainties of the Monte Carlo simulations, which ranged from 1.3% to 0.8% for electron energies of 4-12 MeV, due to the small dosimetric volume. Such a small EBF deviation is therefore insignificant when the uncertainty of the Monte Carlo simulation is taken into account. Corresponding measurements were carried out and agreed with the Monte Carlo results to within +/-2%. Spectra of the energy deposited by backscattered electrons in dosimetric volumes with and without the lead and MOSFET were determined by Monte Carlo simulations. It was found that in both cases, whether the MOSFET body is present or absent in the simulation, deviations of the electron energy spectra with and without the lead decrease with increasing electron beam energy. Moreover, the softer spectrum of the backscattered electrons when lead is present can result in a reduction of the MOSFET response due to stronger recombination in the SiO2 gate. It is concluded that the MOSFET dosimeter performed well in measuring the electron backscatter from lead using electron beams. The uncertainty of the EBF determined by comparing the results of Monte Carlo simulations and measurements is well within the accuracy of the MOSFET dosimeter (< +/- 4.2%) provided by the manufacturer.

  4. Off-diagonal expansion quantum Monte Carlo

    NASA Astrophysics Data System (ADS)

    Albash, Tameem; Wagenbreth, Gene; Hen, Itay

    2017-12-01

    We propose a Monte Carlo algorithm designed to simulate quantum as well as classical systems at equilibrium, bridging the algorithmic gap between quantum and classical thermal simulation algorithms. The method is based on a decomposition of the quantum partition function that can be viewed as a series expansion about its classical part. We argue that the algorithm not only provides a theoretical advancement in the field of quantum Monte Carlo simulations, but is optimally suited to tackle quantum many-body systems that exhibit a range of behaviors from "fully quantum" to "fully classical," in contrast to many existing methods. We demonstrate the advantages, sometimes by orders of magnitude, of the technique by comparing it against existing state-of-the-art schemes such as path integral quantum Monte Carlo and stochastic series expansion. We also illustrate how our method allows for the unification of quantum and classical thermal parallel tempering techniques into a single algorithm and discuss its practical significance.

  5. Off-diagonal expansion quantum Monte Carlo.

    PubMed

    Albash, Tameem; Wagenbreth, Gene; Hen, Itay

    2017-12-01

    We propose a Monte Carlo algorithm designed to simulate quantum as well as classical systems at equilibrium, bridging the algorithmic gap between quantum and classical thermal simulation algorithms. The method is based on a decomposition of the quantum partition function that can be viewed as a series expansion about its classical part. We argue that the algorithm not only provides a theoretical advancement in the field of quantum Monte Carlo simulations, but is optimally suited to tackle quantum many-body systems that exhibit a range of behaviors from "fully quantum" to "fully classical," in contrast to many existing methods. We demonstrate the advantages, sometimes by orders of magnitude, of the technique by comparing it against existing state-of-the-art schemes such as path integral quantum Monte Carlo and stochastic series expansion. We also illustrate how our method allows for the unification of quantum and classical thermal parallel tempering techniques into a single algorithm and discuss its practical significance.

  6. An example of complex modelling in dentistry using Markov chain Monte Carlo (MCMC) simulation.

    PubMed

    Helfenstein, Ulrich; Menghini, Giorgio; Steiner, Marcel; Murati, Francesca

    2002-09-01

    In the usual regression setting one regression line is computed for a whole data set. In a more complex situation, each person may be observed, for example, at several points in time, and thus a regression line might be calculated for each person. Additional complexities, such as various forms of errors in covariables, may make a straightforward statistical evaluation difficult or even impossible. During recent years methods have been developed allowing convenient analysis of problems where the data and the corresponding models show these and many other forms of complexity. The methodology makes use of a Bayesian approach and Markov chain Monte Carlo (MCMC) simulations. The methods allow the construction of increasingly elaborate models by building them up from local sub-models. The essential structure of the models can be represented visually by directed acyclic graphs (DAG). This attractive property allows communication and discussion of the essential structure and the substantial meaning of a complex model without needing algebra. After presenting the statistical methods, an example from dentistry is given in order to demonstrate their application and use. The dataset of the example had a complex structure: each of a set of children was followed up over several years, and the number of new fillings in permanent teeth was recorded at several ages. The dependent variables were markedly different from the normal distribution and could not be transformed to normality. In addition, explanatory variables were assumed to be measured with different forms of error. We illustrate how the corresponding models can be estimated conveniently via MCMC simulation, in particular Gibbs sampling, using the freely available software BUGS, and explore how measurement error may influence the estimates of the corresponding coefficients. It is demonstrated that the effect of the independent variable on the dependent variable may be markedly underestimated if the measurement error is not taken into account ('regression dilution bias'). Markov chain Monte Carlo methods may be of great value to dentists in allowing the analysis of data sets which exhibit a wide range of different forms of complexity.
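
    The regression dilution bias mentioned at the end is easy to demonstrate numerically, even without MCMC: fit a straight line once against the true covariate and once against a noisy version of it. The sketch below (plain least squares, hypothetical variances) shows the slope attenuating by the classical factor var(x) / (var(x) + var(error)).

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n = 10_000

    x_true = rng.normal(0.0, 1.0, n)          # true covariate
    y = 2.0 * x_true + rng.normal(0.0, 0.5, n)
    x_obs = x_true + rng.normal(0.0, 1.0, n)  # covariate measured with error

    slope_true = np.polyfit(x_true, y, 1)[0]
    slope_obs = np.polyfit(x_obs, y, 1)[0]
    print(f"slope with true covariate:  {slope_true:.2f}")   # ~2.0
    print(f"slope with noisy covariate: {slope_obs:.2f}")    # attenuated toward 0
    # Classical attenuation factor: var(x) / (var(x) + var(error)) = 1/2 here.
    ```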

  7. Comparison of Geant4-DNA simulation of S-values with other Monte Carlo codes

    NASA Astrophysics Data System (ADS)

    André, T.; Morini, F.; Karamitros, M.; Delorme, R.; Le Loirec, C.; Campos, L.; Champion, C.; Groetz, J.-E.; Fromm, M.; Bordage, M.-C.; Perrot, Y.; Barberet, Ph.; Bernal, M. A.; Brown, J. M. C.; Deleuze, M. S.; Francis, Z.; Ivanchenko, V.; Mascialino, B.; Zacharatou, C.; Bardiès, M.; Incerti, S.

    2014-01-01

    Monte Carlo simulations of S-values have been carried out with the Geant4-DNA extension of the Geant4 toolkit. The S-values have been simulated for monoenergetic electrons with energies ranging from 0.1 keV up to 20 keV, in liquid water spheres (for four radii, chosen between 10 nm and 1 μm), and for electrons emitted by five isotopes of iodine (131, 132, 133, 134 and 135), in liquid water spheres of varying radius (from 15 μm up to 250 μm). The results have been compared to those obtained from other Monte Carlo codes and from other published data. A Kolmogorov-Smirnov test confirmed the statistical compatibility of all the simulation results.

  8. Monte Carlo method for photon heating using temperature-dependent optical properties.

    PubMed

    Slade, Adam Broadbent; Aguilar, Guillermo

    2015-02-01

    The Monte Carlo method for photon transport is often used to predict the volumetric heating that an optical source will induce inside a tissue or material. This method relies on constant (with respect to temperature) optical properties, specifically the coefficients of scattering and absorption. In reality, optical coefficients are typically temperature-dependent, leading to error in simulation results. The purpose of this study is to develop a method that can incorporate variable properties and accurately simulate systems where the temperature varies greatly, such as in the case of laser-thawing of frozen tissues. A numerical simulation was developed that utilizes the Monte Carlo method for photon transport to simulate the thermal response of a system with temperature-dependent optical and thermal properties. This was done by combining traditional Monte Carlo photon transport with a heat transfer simulation to provide a feedback loop that selects local properties based on current temperatures, for each moment in time. Additionally, photon steps are segmented to accurately obtain path lengths within a homogeneous (but not isothermal) material. Validation of the simulation was done by comparison to established Monte Carlo simulations with constant properties, and by comparison to the Beer-Lambert law for temperature-variable properties. The simulation is able to accurately predict the thermal response of a system whose properties vary with temperature. The difference in results between the variable-property and constant-property methods for the representative system of laser-heated silicon can become larger than 100 K. This simulation will return more accurate results for optical irradiation absorption in a material which undergoes a large change in temperature, and this increased accuracy leads to better thermal predictions in living tissues and can provide enhanced planning and improved experimental and procedural outcomes. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
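
    The structure of the described feedback loop can be sketched compactly: transport photons with absorption coefficients looked up from the current temperature field, deposit heat, update temperatures, repeat. The 1D slab, the linear mu_a(T), and the crude temperature update (no heat diffusion) are simplifications for illustration only.

    ```python
    import numpy as np

    dz, n_vox = 1.0, 50  # 1 mm voxels, 50 mm slab

    def mu_a(T):
        # hypothetical linear temperature dependence of absorption (1/mm)
        return 0.02 + 1e-4 * (T - 300.0)

    def mc_heating(T, n_photons, rng):
        """March photons voxel by voxel; each segment uses the local absorption
        coefficient evaluated at the *current* temperature (scattering omitted)."""
        heat = np.zeros(n_vox)
        for _ in range(n_photons):
            for i in range(n_vox):
                if rng.random() < 1.0 - np.exp(-mu_a(T[i]) * dz):
                    heat[i] += 1.0  # photon absorbed in voxel i
                    break
        return heat

    rng = np.random.default_rng(6)
    T = np.full(n_vox, 300.0)
    for step in range(5):  # feedback loop: transport <-> temperature update
        heat = mc_heating(T, n_photons=2000, rng=rng)
        T += 0.05 * heat   # crude heating term; a real model would solve heat diffusion
    print(np.round(T[:10], 1))
    ```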

  9. Validation of GATE Monte Carlo simulations of the GE Advance/Discovery LS PET scanners.

    PubMed

    Schmidtlein, C Ross; Kirov, Assen S; Nehmeh, Sadek A; Erdi, Yusuf E; Humm, John L; Amols, Howard I; Bidaut, Luc M; Ganin, Alex; Stearns, Charles W; McDaniel, David L; Hamacher, Klaus A

    2006-01-01

    The recently developed GATE (GEANT4 application for tomographic emission) Monte Carlo package, designed to simulate positron emission tomography (PET) and single photon emission computed tomography (SPECT) scanners, provides the ability to model and account for the effects of photon noncollinearity, off-axis detector penetration, detector size and response, positron range, photon scatter, and patient motion on the resolution and quality of PET images. The objective of this study is to validate a model within GATE of the General Electric (GE) Advance/Discovery Light Speed (LS) PET scanner. Our three-dimensional PET simulation model of the scanner consists of 12 096 detectors grouped into blocks, which are grouped into modules as per the vendor's specifications. The GATE results are compared to experimental data obtained in accordance with the National Electrical Manufactures Association/Society of Nuclear Medicine (NEMA/SNM), NEMA NU 2-1994, and NEMA NU 2-2001 protocols. The respective phantoms are also accurately modeled thus allowing us to simulate the sensitivity, scatter fraction, count rate performance, and spatial resolution. In-house software was developed to produce and analyze sinograms from the simulated data. With our model of the GE Advance/Discovery LS PET scanner, the ratio of the sensitivities with sources radially offset 0 and 10 cm from the scanner's main axis are reproduced to within 1% of measurements. Similarly, the simulated scatter fraction for the NEMA NU 2-2001 phantom agrees to within less than 3% of measured values (the measured scatter fractions are 44.8% and 40.9 +/- 1.4% and the simulated scatter fraction is 43.5 +/- 0.3%). The simulated count rate curves were made to match the experimental curves by using deadtimes as fit parameters. This resulted in deadtime values of 625 and 332 ns at the Block and Coincidence levels, respectively. The experimental peak true count rate of 139.0 kcps and the peak activity concentration of 21.5 kBq/cc were matched by the simulated results to within 0.5% and 0.1% respectively. The simulated count rate curves also resulted in a peak NECR of 35.2 kcps at 10.8 kBq/cc compared to 37.6 kcps at 10.0 kBq/cc from averaged experimental values. The spatial resolution of the simulated scanner matched the experimental results to within 0.2 mm.

  10. A comparison of Monte-Carlo simulations using RESTRAX and McSTAS with experiment on IN14

    NASA Astrophysics Data System (ADS)

    Wildes, A. R.; S̆aroun, J.; Farhi, E.; Anderson, I.; Høghøj, P.; Brochier, A.

    2000-03-01

    Monte-Carlo simulations of a focusing supermirror guide after the monochromator on the IN14 cold neutron three-axis spectrometer at the I.L.L. were carried out using the instrument simulation programs RESTRAX and McSTAS. The simulations were compared to experiment to check their accuracy. The flux ratios over both a 100 and a 1600 mm² area at the sample position compare well, and there is very close agreement between simulation and experiment for the energy spread of the incident beam.

  11. Applying Monte Carlo Simulation to Launch Vehicle Design and Requirements Analysis

    NASA Technical Reports Server (NTRS)

    Hanson, J. M.; Beard, B. B.

    2010-01-01

    This Technical Publication (TP) is meant to address a number of topics related to the application of Monte Carlo simulation to launch vehicle design and requirements analysis. Although the focus is on a launch vehicle application, the methods may be applied to other complex systems as well. The TP is organized so that all the important topics are covered in the main text, and detailed derivations are in the appendices. The TP first introduces Monte Carlo simulation and the major topics to be discussed, including discussion of the input distributions for Monte Carlo runs, testing the simulation, how many runs are necessary for verification of requirements, what to do if results are desired for events that happen only rarely, and postprocessing, including analyzing any failed runs, examples of useful output products, and statistical information for generating desired results from the output data. Topics in the appendices include some tables for requirements verification, derivation of the number of runs required and generation of output probabilistic data with consumer risk included, derivation of launch vehicle models to include possible variations of assembled vehicles, minimization of a consumable to achieve a two-dimensional statistical result, recontact probability during staging, ensuring duplicated Monte Carlo random variations, and importance sampling.
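
    One of the listed topics, how many runs are necessary for verification of requirements, has a well-known closed form in the zero-failure case: to demonstrate success probability >= p at confidence C one needs n >= ln(1-C)/ln(p) consecutive successful runs. The sketch below evaluates this rule; the TP's own derivations, which include consumer risk, are more elaborate.

    ```python
    import math

    def runs_required(p_success, confidence):
        """Smallest n such that observing n successes and zero failures
        demonstrates success probability >= p_success at the given confidence:
        p_success**n <= 1 - confidence  =>  n >= ln(1-C) / ln(p)."""
        return math.ceil(math.log(1.0 - confidence) / math.log(p_success))

    print(runs_required(0.997, 0.90))  # -> 767 runs with no failures
    ```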

  12. Markov Chain Monte Carlo Methods for Bayesian Data Analysis in Astronomy

    NASA Astrophysics Data System (ADS)

    Sharma, Sanjib

    2017-08-01

    Markov Chain Monte Carlo based Bayesian data analysis has now become the method of choice for analyzing and interpreting data in almost all disciplines of science. In astronomy, over the last decade, we have also seen a steady increase in the number of papers that employ Monte Carlo based Bayesian analysis. New, efficient Monte Carlo based methods are continuously being developed and explored. In this review, we first explain the basics of Bayesian theory and discuss how to set up data analysis problems within this framework. Next, we provide an overview of various Monte Carlo based methods for performing Bayesian data analysis. Finally, we discuss advanced ideas that enable us to tackle complex problems and thus hold great promise for the future. We also distribute downloadable computer software (available at https://github.com/sanjibs/bmcmc/ ) that implements some of the algorithms and examples discussed here.
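
    As a flavor of the methods reviewed, the random-walk Metropolis algorithm fits in a dozen lines: propose a Gaussian perturbation and accept with probability min(1, posterior ratio). The toy standard-normal target below is illustrative and unrelated to the bmcmc package distributed with the paper.

    ```python
    import numpy as np

    def metropolis(log_post, x0, n_steps, step, rng):
        """Random-walk Metropolis: propose x' ~ N(x, step^2), accept with
        probability min(1, post(x') / post(x))."""
        chain = [x0]
        for _ in range(n_steps):
            x = chain[-1]
            x_new = x + step * rng.normal()
            if np.log(rng.uniform()) < log_post(x_new) - log_post(x):
                chain.append(x_new)
            else:
                chain.append(x)
        return np.array(chain)

    # Toy target: standard normal posterior (log density up to a constant).
    rng = np.random.default_rng(7)
    chain = metropolis(lambda x: -0.5 * x**2, x0=0.0, n_steps=20_000, step=1.0, rng=rng)
    print(chain[1000:].mean().round(3), chain[1000:].std().round(3))  # ~0, ~1
    ```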

  13. Monte Carlo Simulations of Radiative and Neutrino Transport under Astrophysical Conditions

    NASA Astrophysics Data System (ADS)

    Krivosheyev, Yu. M.; Bisnovatyi-Kogan, G. S.

    2018-05-01

    Monte Carlo simulations are utilized to model radiative and neutrino transfer in astrophysics. An algorithm that can be used to study radiative transport in astrophysical plasma based on simulations of photon trajectories in a medium is described. Formation of the hard X-ray spectrum of the Galactic microquasar SS 433 is considered in detail as an example. Specific requirements for applying such simulations to neutrino transport in a dense medium and algorithmic differences compared to its application to photon transport are discussed.

  14. A review: Functional near infrared spectroscopy evaluation in muscle tissues using Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Halim, A. A. A.; Laili, M. H.; Salikin, M. S.; Rusop, M.

    2018-05-01

    Monte Carlo simulation quantifies light propagation inside tissue by counting photons, yielding the absorption and scattering coefficients, and it serves as a preliminary study for functional near infrared applications. The goal of this paper is to identify the optical properties using Monte Carlo simulation for the non-invasive functional near infrared spectroscopy (fNIRS) evaluation of penetration depth in human muscle. The paper describes the NIRS principle and the basis for its proposed use in Monte Carlo simulation, focusing on several important parameters, including ATP and ADP, and relating them to blood flow and oxygen content at a given exercise intensity. The advantages and limitations of such an application of the simulation are covered. The results may help establish that human muscle is transparent to the near infrared region and that such measurements can deliver substantial information on muscle oxygenation levels. A non-invasive technique for detecting the oxygen status of muscle in living people, whether athletes or workers, would thus enable many future investigations of muscle physiology.

  15. Variability in the high energy gamma ray emission from Cyg X-3 over a two-year period (1983 - 1984) at E 4 x 10(11) eV

    NASA Technical Reports Server (NTRS)

    Cawley, M. F.; Fegan, D. J.; Gibbs, K.; Gorham, P. W.; Lamb, R. C.; Liebing, D. F.; Porter, N. A.; Stenger, V. J.; Weekes, T. C.; Williams, R. J.

    1985-01-01

    Cygnus X-3 is observed to emit gamma rays with energies in excess of 4 × 10^11 eV during two out of nine observing periods over an 18-month time span. The emissions are observed at the 0.6 phase of the characteristic 4.8 hr light curve for this binary system. We estimate a peak flux at phase 0.6 of 5 × 10^-10 photons cm^-2 s^-1 at a software threshold of 8 × 10^11 eV for Oct/Nov 1983. A flux for the June 84 effect cannot be reliably calculated at present due to the lack of Monte Carlo simulations for that energy range and spectral region. For the other seven observing periods the observations are consistent with zero source emission. The light curve would thus appear to be variable on a time scale of a couple of weeks. Selection of compact images in accordance with Monte Carlo simulations, combined with empirical optimization techniques, has led to an enriched gamma-ray light curve for the Oct/Nov 1983 data. Selection on the basis of shower orientation, however, has not led to any notable enhancement of the gamma-ray content. Individual Cherenkov images can be reliably sorted on an event-by-event basis into either proton-induced or photon-induced showers.

  16. Combining ray tracing and CFD in the thermal analysis of a parabolic dish tubular cavity receiver

    NASA Astrophysics Data System (ADS)

    Craig, Ken J.; Marsberg, Justin; Meyer, Josua P.

    2016-05-01

    This paper describes the numerical evaluation of a tubular receiver used in a dish Brayton cycle. In previous work considering the use of Computational Fluid Dynamics (CFD) to calculate the radiation absorbed from the parabolic dish into the cavity as well as the resulting conjugate heat transfer, it was shown that an axi-symmetric model of the dish and receiver absorbing surfaces was useful in reducing the computational cost required for a full 3-D discrete ordinates solution, but concerns remained about its accuracy. To increase the accuracy, the Monte Carlo ray tracer SolTrace is used to calculate the absorbed radiation profile to be used in the conjugate heat transfer CFD simulation. The paper describes an approach for incorporating a complex geometry like a tubular receiver, generated using CFD software, into SolTrace. The results illustrate how the CFD mesh density, which translates into the number of elements in SolTrace, and the number of rays used in the Monte Carlo approach affect the attainment of a resolution-independent solution. The conjugate heat transfer CFD simulation illustrates the effect of applying the SolTrace surface heat flux profile as a volumetric heat source to heat the air inside the tube. Heat losses due to convection and thermal re-radiation are also determined as a function of different tube absorptivities.

  17. [BeO-OSL detectors for dose measurements in cell cultures].

    PubMed

    Andreeff, M; Sommer, D; Freudenberg, R; Reichelt, U; Henniger, J; Kotzerke, J

    2009-01-01

    The absorbed dose is an important parameter in experiments involving irradiation of cells in vitro with unsealed radionuclides. Typically, it is estimated with a model calculation, although the results thus obtained cannot be verified, and generally used real-time measurement methods are not applicable in this setting. A new detector material with in vitro suitability is the subject of this work. Optically-stimulated luminescence (OSL) dosimeters based on beryllium oxide (BeO) were used for dose measurement in cell cultures exposed to unsealed radionuclides. Their qualitative properties (e.g. energy-dependent count rate sensitivity, fading, contamination by radioactive liquids) were determined and compared to the results of a Monte Carlo simulation (using the AMOS software). OSL dosimeters were tested in common cell culture setups with a known geometry. Dose reproducibility of the OSL dosimeters was +/-1.5%. Fading at room temperature was 0.07% per day. Dose loss (optically-stimulated deletion) under ambient lighting conditions was 0.5% per minute. The Monte Carlo simulation of the relative sensitivity at different beta energies provided results consistent with those obtained with the OSL dosimeters. Dose profile measurements using a 6-well plate and a 14 ml PP tube showed that the geometry of the cell culture vessel has a marked influence on the dose distribution with 188Re. A new dosimeter system was calibrated with beta emitters of different energies. It proved suitable for measuring dose in liquids, and the dose profile measurements obtained are sufficiently precise to be used as a check against theoretical dose calculations.

  18. Result of Monte-Carlo simulation of electron-photon cascades in lead and layers of lead-scintillator

    NASA Technical Reports Server (NTRS)

    Wasilewski, A.; Krys, E.

    1985-01-01

    Results of Monte-Carlo simulations of electromagnetic cascade development in lead and lead-scintillator sandwiches are analyzed. It is demonstrated that the core-approximation structure function is not applicable when the primary energy exceeds 100 GeV. The simulation data show that introducing an inhomogeneous chamber structure results in a reduction of the number of secondary particles.

  19. Assessment of radiation exposure in dental cone-beam computerized tomography with the use of metal-oxide semiconductor field-effect transistor (MOSFET) dosimeters and Monte Carlo simulations.

    PubMed

    Koivisto, J; Kiljunen, T; Tapiovaara, M; Wolff, J; Kortesniemi, M

    2012-09-01

    The aims of this study were to assess the organ and effective dose (International Commission on Radiological Protection (ICRP) 103) resulting from dental cone-beam computerized tomography (CBCT) imaging using a novel metal-oxide semiconductor field-effect transistor (MOSFET) dosimeter device, and to assess the reliability of the MOSFET measurements by comparing the results with Monte Carlo PCXMC simulations. Organ dose measurements were performed using 20 MOSFET dosimeters that were embedded in the 8 most radiosensitive organs in the maxillofacial and neck area. The dose-area product (DAP) values attained from CBCT scans were used for the PCXMC simulations. The acquired MOSFET doses were then compared with the Monte Carlo simulations. The effective dose measurements using MOSFET dosimeters yielded, using 0.5-cm steps, a value of 153 μSv, and the PCXMC simulations resulted in a value of 136 μSv. The MOSFET dosimeters placed in a head phantom gave results similar to the Monte Carlo simulations. Minor vertical changes in the positioning of the phantom had a substantial effect on the overall effective dose. Therefore, the MOSFET dosimeters constitute a feasible method for dose assessment of CBCT units in the maxillofacial region. Copyright © 2012 Elsevier Inc. All rights reserved.

  20. Multi-fidelity methods for uncertainty quantification in transport problems

    NASA Astrophysics Data System (ADS)

    Tartakovsky, G.; Yang, X.; Tartakovsky, A. M.; Barajas-Solano, D. A.; Scheibe, T. D.; Dai, H.; Chen, X.

    2016-12-01

    We compare several multi-fidelity approaches for uncertainty quantification in flow and transport simulations that have a lower computational cost than the standard Monte Carlo method. The cost reduction is achieved by combining a small number of high-resolution (high-fidelity) simulations with a large number of low-resolution (low-fidelity) simulations. We propose a new method, the re-scaled Multi Level Monte Carlo (rMLMC) method, based on the idea that the statistics of quantities of interest depend on scale/resolution. We compare rMLMC with existing multi-fidelity methods such as Multi Level Monte Carlo (MLMC) and reduced basis methods and discuss the advantages of each approach.
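
    The multilevel idea can be shown in its two-level form: estimate E[Q_high] as a cheap low-fidelity mean plus a correction E[Q_high - Q_low] estimated from a few coupled samples. The models below are invented toys, and this sketches plain MLMC, not the paper's rescaled variant.

    ```python
    import numpy as np

    rng = np.random.default_rng(8)

    def q_low(x):   # cheap low-fidelity model (hypothetical)
        return np.sin(x)

    def q_high(x):  # expensive high-fidelity model (hypothetical)
        return np.sin(x) + 0.05 * x**2

    # Two-level estimator: many cheap samples for E[q_low], few coupled samples
    # (same inputs drive both fidelities) for the correction E[q_high - q_low].
    x_many = rng.normal(size=100_000)
    x_few = rng.normal(size=500)
    estimate = q_low(x_many).mean() + (q_high(x_few) - q_low(x_few)).mean()

    x_ref = rng.normal(size=100_000)
    print("two-level:", round(estimate, 4),
          " plain high-fidelity MC:", round(q_high(x_ref).mean(), 4))
    ```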

  1. Heavy particle transport in sputtering systems

    NASA Astrophysics Data System (ADS)

    Trieschmann, Jan

    2015-09-01

    This contribution aims to discuss the theoretical background of heavy particle transport in plasma sputtering systems such as direct current magnetron sputtering (dcMS), high power impulse magnetron sputtering (HiPIMS), or multi frequency capacitively coupled plasmas (MFCCP). Due to inherently low process pressures below one Pa, only kinetic simulation models are suitable. In this work a model appropriate for the description of the transport of film-forming particles sputtered off a target material has been devised within the frame of the OpenFOAM software (specifically dsmcFoam). The three-dimensional model comprises the ejection of sputtered particles into the reactor chamber, their collisional transport through the volume, and their deposition onto the surrounding surfaces (i.e. substrates, walls). An angular-dependent Thompson energy distribution fitted to results from Monte-Carlo simulations is assumed initially. Binary collisions are treated via the M1 collision model, a modified variable hard sphere (VHS) model. The dynamics of sputtered and background gas species can be resolved self-consistently following the direct simulation Monte-Carlo (DSMC) approach or, whenever possible, simplified based on the test particle method (TPM) with the assumption of a constant, non-stationary background at a given temperature. The transport of sputtered aluminum is discussed specifically for the example of an MFCCP research reactor. For this particular configuration, and under typical process conditions with argon as the process gas, the transport of aluminum sputtered off a circular target is shown to be governed by a one-dimensional interaction of the imposed and backscattered particle fluxes. The results are analyzed and discussed on the basis of the obtained velocity distribution functions (VDF). This work is supported by the German Research Foundation (DFG) in the frame of the Collaborative Research Centre TRR 87.
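
    The Thompson distribution mentioned above, in its textbook isotropic form f(E) proportional to E/(E+U_b)^3, has the closed-form CDF F(E) = (E/(E+U_b))^2 and can therefore be sampled by inverse transform. The sketch below does this with an energy cutoff; the binding-energy value and the cutoff are assumptions, and the paper's actual distribution is angular-dependent and fitted to simulations.

    ```python
    import numpy as np

    def sample_thompson(u_b, n, rng, e_max=50.0):
        """Inverse-transform sampling of the Thompson distribution
        f(E) ~ E / (E + u_b)**3, whose CDF is F(E) = (E / (E + u_b))**2,
        so E = u_b * sqrt(xi) / (1 - sqrt(xi)). Samples above e_max (eV)
        are rejected to mimic a finite cutoff."""
        out = np.empty(0)
        while out.size < n:
            s = np.sqrt(rng.uniform(size=n))
            e = u_b * s / (1.0 - s)
            out = np.concatenate([out, e[e < e_max]])
        return out[:n]

    rng = np.random.default_rng(9)
    energies = sample_thompson(u_b=3.36, n=100_000, rng=rng)  # assumed Al binding energy, eV
    print("mean =", energies.mean().round(2), "eV; mode ~ u_b/2 =", 3.36 / 2, "eV")
    ```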

  2. Review of Monte Carlo simulations for backgrounds from radioactivity

    NASA Astrophysics Data System (ADS)

    Selvi, Marco

    2013-08-01

    For all experiments dealing with rare event searches (neutrino, dark matter, neutrino-less double-beta decay), the reduction of the radioactive background is one of the most important and difficult tasks. There are basically two types of background, electron recoils and nuclear recoils. The electron recoil background comes mostly from gamma rays emitted in radioactive decay. The nuclear recoil background comes from neutrons produced by spontaneous fission, (α, n) reactions and muon-induced interactions (spallation, photo-nuclear and hadronic interactions). The external gammas and neutrons from muons and the laboratory environment can be reduced by operating the detector in a deep underground laboratory and by placing active or passive shield materials around the detector. The radioactivity of the detector materials also contributes to the background; to reduce it, a careful screening campaign is mandatory to select highly radio-pure materials. In this review I present the status of current Monte Carlo simulations aimed at estimating and reproducing the background induced by gamma and neutron radioactivity of the materials and the shield of rare event search experiments. For the electromagnetic background, a good level of agreement between the data and the MC simulation has been reached by the XENON100 and EDELWEISS experiments, using the GEANT4 toolkit. For the neutron background, a comparison between the yields of neutrons from spontaneous fission and (α, n) reactions obtained with two dedicated codes, SOURCES-4A and the one developed by Mei-Zhang-Hime, shows good overall agreement, with total yields within a factor of 2 of each other. The energy spectra from SOURCES-4A are in general smoother, while those from MZH present sharp peaks. The neutron propagation through various materials has been studied with two MC codes, GEANT4 and MCNPX, showing reasonably good agreement, within a 50% discrepancy.

  3. Contact radiotherapy using a 50 kV X-ray system: Evaluation of relative dose distribution with the Monte Carlo code PENELOPE and comparison with measurements

    NASA Astrophysics Data System (ADS)

    Croce, Olivier; Hachem, Sabet; Franchisseur, Eric; Marcié, Serge; Gérard, Jean-Pierre; Bordy, Jean-Marc

    2012-06-01

    This paper presents a dosimetric study concerning the system named "Papillon 50" used in the department of radiotherapy of the Centre Antoine-Lacassagne, Nice, France. The machine provides a 50 kVp X-ray beam, currently used to treat rectal cancers. The system can be mounted with various applicators of different diameters or shapes. These applicators can be fixed over the main rod tube of the unit in order to deliver the prescribed absorbed dose into the tumor with an optimal distribution. We have analyzed depth dose curves and dose profiles for the naked tube and for a set of three applicators. Dose measurements were made with an ionization chamber (PTW type 23342) and Gafchromic films (EBT2). We have also compared the measurements with simulations performed using the Monte Carlo code PENELOPE. Simulations were performed with a detailed geometrical description of the experimental setup and with sufficient statistics. The simulation results are in accordance with the experimental measurements and provide an accurate evaluation of the dose delivered. The depths of the 50% isodose in water for the various applicators are 4.0, 6.0, 6.6 and 7.1 mm. The Monte Carlo PENELOPE simulations are in accordance with the measurements for a 50 kV X-ray system. Simulations are able to confirm the measurements provided by Gafchromic films or ionization chambers. Results also demonstrate that Monte Carlo simulations could be helpful to validate the future applicators designed for other localizations such as breast or skin cancers. Furthermore, Monte Carlo simulations could be a reliable alternative for a rapid evaluation of the dose delivered by such a system that uses multiple designs of applicators.

  4. Monte Carlo simulation of parameter confidence intervals for non-linear regression analysis of biological data using Microsoft Excel.

    PubMed

    Lambert, Ronald J W; Mytilinaios, Ioannis; Maitland, Luke; Brown, Angus M

    2012-08-01

    This study describes a method to obtain parameter confidence intervals from the fitting of non-linear functions to experimental data, using the SOLVER and Analysis ToolPak add-ins of the Microsoft Excel spreadsheet. Previously we have shown that Excel can fit complex multiple functions to biological data, obtaining values equivalent to those returned by more specialized statistical or mathematical software. However, a disadvantage of the Excel method was the inability to return confidence intervals for the computed parameters or the correlations between them. Using a simple Monte Carlo procedure within the Excel spreadsheet (without recourse to programming), SOLVER can provide parameter estimates (up to 200 at a time) for multiple 'virtual' data sets, from which the required confidence intervals and correlation coefficients can be obtained. The general utility of the method is exemplified by applying it to the analysis of the growth of Listeria monocytogenes, the growth inhibition of Pseudomonas aeruginosa by chlorhexidine, and the further analysis of electrophysiological data from the compound action potential of the rodent optic nerve. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
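    The same Monte Carlo idea, sketched in Python rather than Excel for readers outside the spreadsheet: fit once, then refit many synthetic 'virtual' data sets generated from the fitted curve plus residual-scale noise, and read the confidence intervals off the percentiles. The Gompertz model and all numerical values are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def gompertz(t, a, mu, lam):
    """Modified Gompertz growth curve (Zwietering form)."""
    return a * np.exp(-np.exp(mu * np.e / a * (lam - t) + 1.0))

rng = np.random.default_rng(1)
t = np.linspace(0.0, 48.0, 25)
y_obs = gompertz(t, 9.0, 0.45, 6.0) + rng.normal(0.0, 0.15, t.size)

p_fit, _ = curve_fit(gompertz, t, y_obs, p0=(8.0, 0.5, 5.0))
resid_sd = np.std(y_obs - gompertz(t, *p_fit), ddof=3)

# Monte Carlo: refit 'virtual' data sets built from the fitted curve
# plus residual-scale noise; percentiles give the confidence intervals.
fits = []
for _ in range(1000):
    y_virt = gompertz(t, *p_fit) + rng.normal(0.0, resid_sd, t.size)
    try:
        fits.append(curve_fit(gompertz, t, y_virt, p0=p_fit)[0])
    except RuntimeError:        # skip the occasional non-converging refit
        pass
lo, hi = np.percentile(np.array(fits), [2.5, 97.5], axis=0)
for name, l, h in zip(("a", "mu", "lambda"), lo, hi):
    print(f"{name}: 95% CI [{l:.3f}, {h:.3f}]")
```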

  5. A Fast Monte Carlo Simulation for the International Linear Collider Detector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Furse, D.; /Georgia Tech

    2005-12-15

    The following paper contains details concerning the motivation for, implementation and performance of a Java-based fast Monte Carlo simulation for a detector designed to be used in the International Linear Collider. This simulation, presently included in the SLAC ILC group's org.lcsim package, reads in standard model or SUSY events in STDHEP file format, stochastically simulates the blurring in physics measurements caused by intrinsic detector error, and writes out an LCIO format file containing a set of final particles statistically similar to those that would have been found by a full Monte Carlo simulation. In addition to the reconstructed particles themselves, descriptions of the calorimeter hit clusters and tracks that these particles would have produced are also included in the LCIO output. These output files can then be put through various analysis codes in order to characterize the effectiveness of a hypothetical detector at extracting relevant physical information about an event. Such a tool is extremely useful in preliminary detector research and development, as full simulations are extremely cumbersome and taxing on processor resources; a fast, efficient Monte Carlo can facilitate, and even make possible, detector physics studies that would be very impractical with the full simulation, by sacrificing what is in many cases inappropriate attention to detail for valuable gains in the time required for results.
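    The core trick of such a fast simulation is parametric smearing: replacing full tracking with Gaussian blurring of the true quantities. A toy sketch follows, with a made-up resolution model that is not the actual org.lcsim parameterization.

```python
import numpy as np

rng = np.random.default_rng(7)

def smear_momenta(p_true, a=0.002, b=0.01):
    """Replace true track momenta (GeV) with Gaussian draws whose width
    follows a toy resolution model sigma_p / p = a*p (+) b, where (+)
    denotes addition in quadrature. Coefficients are illustrative."""
    rel_sigma = np.hypot(a * p_true, b)
    return rng.normal(p_true, rel_sigma * p_true)

print(smear_momenta(np.array([1.0, 10.0, 50.0, 100.0])))
```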

  6. Estimation of organ and effective doses from newborn radiography of the chest and abdomen.

    PubMed

    Ma, Hillgan; Elbakri, Idris A; Reed, Martin

    2013-09-01

    Neonatal intensive care patients undergo frequent chest and abdomen radiographic imaging. In this study, the organ doses and the effective dose resulting from combined chest-abdomen radiography of the newborn child are determined. These values are calculated using the Monte Carlo simulation software PCXMC 2.0 and compared with direct dose measurements obtained from thermoluminescent detectors (TLDs) in a physical phantom. The effective dose obtained from PCXMC is 21.2 ± 0.7 μSv and that obtained from TLD measurements is 22.0 ± 0.5 μSv. While the two methods are in close agreement with regard to the effective dose, there is a wide range of variation in organ doses, ranging from an 85% difference for the testes to 1.4% for the lungs. Large organ dose variations are attributed to organs at the edge of the field of view, or organs with large experimental error or simulation uncertainty. This study suggests that PCXMC can be used to estimate organ and effective doses for newborn patients.

  7. BROMOC suite: Monte Carlo/Brownian dynamics suite for studies of ion permeation and DNA transport in biological and artificial pores with effective potentials.

    PubMed

    De Biase, Pablo M; Markosyan, Suren; Noskov, Sergei

    2015-02-05

    The transport of ions and solutes by biological pores is central to cellular processes and has a variety of applications in modern biotechnology. The time scale involved in polymer transport across a nanopore is beyond the reach of conventional MD simulations. Moreover, experimental studies lack sufficient resolution to provide details on the molecular underpinnings of the transport mechanisms. BROMOC, the code presented herein, performs Brownian dynamics simulations, both serial and parallel, up to several milliseconds long. BROMOC can be used to model large biological systems. The IMC-MACRO software allows for the development of effective potentials for solute-ion interactions based on the radial distribution function from all-atom MD. The BROMOC Suite also provides a versatile set of tools for a wide variety of preprocessing and post-simulation analyses. We illustrate a potential application with ion and ssDNA transport in the MspA nanopore. © 2014 Wiley Periodicals, Inc.

  8. Development of a web-based CT dose calculator: WAZA-ARI.

    PubMed

    Ban, N; Takahashi, F; Sato, K; Endo, A; Ono, K; Hasegawa, T; Yoshitake, T; Katsunuma, Y; Kai, M

    2011-09-01

    A web-based computed tomography (CT) dose calculation system (WAZA-ARI) is being developed based on modern techniques for radiation transport simulation and software implementation. Dose coefficients were calculated in a voxel-type Japanese adult male phantom (JM phantom), using the Particle and Heavy Ion Transport code System. In the Monte Carlo simulation, the phantom was irradiated with a 5-mm-thick, fan-shaped photon beam rotating in a plane normal to the body axis. The dose coefficients were integrated into the system, which runs as Java servlets within Apache Tomcat. Output of WAZA-ARI for the GE LightSpeed 16 was compared with dose values calculated similarly using the MIRD and ICRP Adult Male phantoms. Some differences were found due to the phantom configurations, demonstrating the significance of dose calculation with appropriate phantoms. While the dose coefficients are currently available only for limited CT scanner models and scanning options, WAZA-ARI will be a useful tool in clinical practice when development is finalised.

  9. BIM-Sim: Interactive Simulation of Broadband Imaging Using Mie Theory

    NASA Astrophysics Data System (ADS)

    Berisha, Sebastian; van Dijk, Thomas; Bhargava, Rohit; Carney, P. Scott; Mayerich, David

    2017-02-01

    Understanding the structure of a scattered electromagnetic (EM) field is critical to improving the imaging process. Mechanisms such as diffraction, scattering, and interference affect an image, limiting the resolution and potentially introducing artifacts. Simulation and visualization of scattered fields thus play an important role in imaging science. However, the calculation of scattered fields is extremely time-consuming on desktop systems and computationally challenging on task-parallel systems such as supercomputers and cluster systems. In addition, EM fields are high-dimensional, making them difficult to visualize. In this paper, we present a framework for interactively computing and visualizing EM fields scattered by micro- and nanoparticles. Our software uses graphics hardware to evaluate the field both inside and outside of these particles. We then use Monte Carlo sampling to reconstruct and visualize the three-dimensional structure of the field, spectral profiles at individual points, the structure of the field at the surface of the particle, and the resulting image produced by an optical system.

  10. Theoretical Grounds for the Propagation of Uncertainties in Monte Carlo Particle Transport

    NASA Astrophysics Data System (ADS)

    Saracco, Paolo; Pia, Maria Grazia; Batic, Matej

    2014-04-01

    We introduce a theoretical framework for the calculation of uncertainties affecting observables produced by Monte Carlo particle transport, which derive from uncertainties in the physical parameters input to the simulation. The theoretical developments are complemented by a heuristic application, which illustrates the method of calculation in a streamlined simulation environment.

  11. Quantum Monte Carlo Methods for First Principles Simulation of Liquid Water

    ERIC Educational Resources Information Center

    Gergely, John Robert

    2009-01-01

    Obtaining an accurate microscopic description of water structure and dynamics is of great interest to molecular biology researchers and in the physics and quantum chemistry simulation communities. This dissertation describes efforts to apply quantum Monte Carlo methods to this problem with the goal of making progress toward a fully "ab initio"…

  12. Estimating Uncertainty in N2O Emissions from US Cropland Soils

    USDA-ARS?s Scientific Manuscript database

    A Monte Carlo analysis was combined with an empirically-based approach to quantify uncertainties in soil N2O emissions from US croplands estimated with the DAYCENT simulation model. Only a subset of croplands was simulated in the Monte Carlo analysis which was used to infer uncertainties across the ...

  13. Testing the Intervention Effect in Single-Case Experiments: A Monte Carlo Simulation Study

    ERIC Educational Resources Information Center

    Heyvaert, Mieke; Moeyaert, Mariola; Verkempynck, Paul; Van den Noortgate, Wim; Vervloet, Marlies; Ugille, Maaike; Onghena, Patrick

    2017-01-01

    This article reports on a Monte Carlo simulation study, evaluating two approaches for testing the intervention effect in replicated randomized AB designs: two-level hierarchical linear modeling (HLM) and using the additive method to combine randomization test "p" values (RTcombiP). Four factors were manipulated: mean intervention effect,…

  14. Teaching Markov Chain Monte Carlo: Revealing the Basic Ideas behind the Algorithm

    ERIC Educational Resources Information Center

    Stewart, Wayne; Stewart, Sepideh

    2014-01-01

    For many scientists, researchers and students Markov chain Monte Carlo (MCMC) simulation is an important and necessary tool to perform Bayesian analyses. The simulation is often presented as a mathematical algorithm and then translated into an appropriate computer program. However, this can result in overlooking the fundamental and deeper…
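    For reference, the basic idea the article aims to reveal can be stated in a few lines. This is a generic random-walk Metropolis sampler, not code from the article; the standard-normal target is an illustrative choice.

```python
import numpy as np

def metropolis(log_target, x0, n_steps, step=1.0, rng=None):
    """Minimal random-walk Metropolis sampler: propose x' = x + N(0, step),
    accept with probability min(1, target(x') / target(x))."""
    rng = rng or np.random.default_rng()
    x, logp = x0, log_target(x0)
    chain = np.empty(n_steps)
    for i in range(n_steps):
        x_new = x + rng.normal(0.0, step)
        logp_new = log_target(x_new)
        if np.log(rng.uniform()) < logp_new - logp:   # MH acceptance rule
            x, logp = x_new, logp_new
        chain[i] = x
    return chain

# Target: standard normal (log density up to an additive constant).
chain = metropolis(lambda x: -0.5 * x * x, x0=0.0, n_steps=50_000)
print(chain[1000:].mean(), chain[1000:].std())   # ~0 and ~1 after burn-in
```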

  15. Monte Carlo simulation models of breeding-population advancement.

    Treesearch

    J.N. King; G.R. Johnson

    1993-01-01

    Five generations of population improvement were modeled using Monte Carlo simulations. The model was designed to address questions that are important to the development of an advanced generation breeding population. Specifically we addressed the effects on both gain and effective population size of different mating schemes when creating a recombinant population for...

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lessard, Francois; Archambault, Louis; Plamondon, Mathieu

    Purpose: Photon dosimetry in the kilovolt (kV) energy range represents a major challenge for diagnostic and interventional radiology and superficial therapy. Plastic scintillation detectors (PSDs) are potentially good candidates for this task. This study proposes a simple way to obtain accurate correction factors to compensate for the response of PSDs to photon energies between 80 and 150 kVp. The performance of PSDs is also investigated to determine their potential usefulness in the diagnostic energy range. Methods: A 1-mm-diameter, 10-mm-long PSD was irradiated by a Therapax SXT 150 unit using five different beam qualities, with tube potentials ranging from 80 to 150 kVp and filtration ranging from 0.8 mmAl to 0.2 mmAl + 1.0 mmCu. The light emitted by the detector was collected using an 8-m-long optical fiber and a polychromatic photodiode, which converted the scintillation photons to an electrical current. The PSD response was compared with the reference free air dose rate measured with a calibrated Farmer NE2571 ionization chamber. PSD measurements were corrected using spectra-weighted corrections, accounting for mass energy-absorption coefficient differences between the sensitive volumes of the ionization chamber and the PSD, as suggested by large cavity theory (LCT). Beam spectra were obtained from x-ray simulation software and validated experimentally using a CdTe spectrometer. Correction factors were also obtained using Monte Carlo (MC) simulations. Percent depth dose (PDD) measurements were compensated for beam hardening using the LCT correction method. These PDD measurements were compared with uncorrected PSD data, PDD measurements obtained using Gafchromic films, Monte Carlo simulations, and previous data. Results: For each beam quality used, the authors observed an increase of the energy response with effective energy when no correction was applied to the PSD response. Using the LCT correction, the PSD response was almost energy independent, with a residual 2.1% coefficient of variation (COV) over the 80-150-kVp energy range. Monte Carlo corrections reduced the COV to 1.4% over this energy range. All PDD measurements were in good agreement with one another except for the uncorrected PSD data, in which an over-response was observed with depth (13% at 10 cm with a 100 kVp beam), showing that beam hardening had a non-negligible effect on the PSD response. A correction based on LCT compensated very well for this effect, reducing the over-response to 3%. Conclusion: In the diagnostic energy range, PSDs show high energy dependence, which can be corrected using spectra-weighted mass energy-absorption coefficients, showing no considerable sign of quenching between these energies. Correction factors obtained by Monte Carlo simulations confirm that the approximations made by LCT corrections are valid. Thus, PSDs could be useful for real-time dosimetry in radiology applications.

  17. Levofloxacin Penetration into Epithelial Lining Fluid as Determined by Population Pharmacokinetic Modeling and Monte Carlo Simulation

    PubMed Central

    Drusano, G. L.; Preston, S. L.; Gotfried, M. H.; Danziger, L. H.; Rodvold, K. A.

    2002-01-01

    Levofloxacin was administered orally to volunteers, randomly assigned to doses of 500 or 750 mg, until steady state was reached. Plasma and epithelial lining fluid (ELF) samples were obtained at 4, 12, and 24 h after the final dose. All data were comodeled in a population pharmacokinetic analysis employing BigNPEM. Penetration was evaluated from the population mean parameter vector values and from the results of a 1,000-subject Monte Carlo simulation. Evaluation from the population mean values demonstrated a penetration ratio (ELF/plasma) of 1.16. The Monte Carlo simulation provided a measure of dispersion, demonstrating a mean ratio of 3.18, with a median of 1.43 and a 95% confidence interval of 0.14 to 19.1. Population analysis with Monte Carlo simulation provides the best and least-biased estimate of penetration. It also demonstrates clearly that we can expect differences in penetration between patients. This analysis did not deal with inflammation, as it was performed in volunteers. The influence of lung pathology on penetration needs to be examined. PMID:11796385
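    A sketch of how such a Monte Carlo step can produce a skewed ratio distribution whose mean sits well above its median. The lognormal parameters here are invented for illustration and do not reproduce the BigNPEM population model.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1000  # subjects, matching the scale of the paper's simulation

# Hypothetical lognormal spread around invented population-mean AUCs;
# the real analysis draws from the fitted population joint density.
auc_plasma = rng.lognormal(mean=np.log(72.0), sigma=0.30, size=n)
auc_elf = rng.lognormal(mean=np.log(83.0), sigma=0.80, size=n)

ratio = auc_elf / auc_plasma  # per-subject ELF/plasma penetration ratio
print(f"mean {ratio.mean():.2f}, median {np.median(ratio):.2f}, "
      f"95% interval [{np.percentile(ratio, 2.5):.2f}, "
      f"{np.percentile(ratio, 97.5):.2f}]")
```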

  18. Geant4 hadronic physics for space radiation environment.

    PubMed

    Ivantchenko, Anton V; Ivanchenko, Vladimir N; Molina, Jose-Manuel Quesada; Incerti, Sebastien L

    2012-01-01

    The aim of this work was to test and to develop Geant4 (Geometry And Tracking version 4) Monte Carlo hadronic models, with a focus on applications in a space radiation environment. The Monte Carlo simulations have been performed using the Geant4 toolkit. The Binary cascade (BIC), its extension for incident light ions (BIC-ion), and the Bertini cascade (BERT) were used as the main Monte Carlo generators. For comparison purposes, some other models were tested too. The hadronic testing suite has been used as a primary tool for model development and validation against experimental data. The Geant4 pre-compound (PRECO) and de-excitation (DEE) models were revised and improved. Proton, neutron, pion, and ion nuclear interactions were simulated with the recent version Geant4 9.4 and compared with experimental data from thin and thick target experiments. The Geant4 toolkit offers a large set of models allowing effective simulation of the interactions of particles with matter. We have tested different Monte Carlo generators with our hadronic testing suite, and accordingly we can propose an optimal configuration of Geant4 models for the simulation of the space radiation environment.

  19. TASK ALLOCATION IN GEO-DISTRIBUTED CYBER-PHYSICAL SYSTEMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aggarwal, Rachit; Smidts, Carol

    This paper studies the task allocation algorithm for a distributed test facility (DTF), which aims to assemble geo-distributed cyber (software) and physical (hardware-in-the-loop) components into a prototype cyber-physical system (CPS). This allows low-cost testing on an early conceptual prototype (ECP) of the ultimate CPS (UCPS) to be developed. The DTF provides an instrumentation interface for carrying out reliability experiments remotely, such as fault propagation analysis and in-situ testing of hardware and software components in a simulated environment. Unfortunately, the geo-distribution introduces an overhead that is not inherent to the UCPS, i.e. a significant time delay in communication that threatens the stability of the ECP and is not an appropriate representation of the behavior of the UCPS. This can be mitigated by implementing a task allocation algorithm to find a suitable configuration and assign the software components to appropriate computational locations dynamically. This would allow the ECP to operate more efficiently with less probability of being unstable due to the delays introduced by geo-distribution. The task allocation algorithm proposed in this work uses a Monte Carlo approach along with dynamic programming to identify the optimal network configuration to keep the time delays to a minimum.
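    A toy sketch of the Monte Carlo portion of such a search: sample random component-to-location assignments against a hypothetical delay model and keep the best configuration. The dynamic programming stage described in the paper is omitted, and the delay and traffic matrices are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
n_comp, n_loc = 6, 3

# Hypothetical inputs: link[l1, l2] = mean delay (ms) between locations,
# traffic[i, j] = message volume between software components i and j.
link = rng.uniform(5, 50, size=(n_loc, n_loc))
link = (link + link.T) / 2
np.fill_diagonal(link, 1.0)                  # co-located is cheap
traffic = np.triu(rng.integers(0, 5, size=(n_comp, n_comp)), 1)

def total_delay(assign):
    """Objective: summed communication delay of one mapping."""
    return (traffic * link[np.ix_(assign, assign)]).sum()

# Monte Carlo search over random configurations, keeping the best seen.
best, best_cost = None, np.inf
for _ in range(20_000):
    a = rng.integers(0, n_loc, size=n_comp)
    c = total_delay(a)
    if c < best_cost:
        best, best_cost = a, c
print("best assignment:", best, "total delay:", round(best_cost, 1), "ms")
```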

  20. Window for Optimal Frequency Operation and Reliability of 3DEG and 2DEG Channels for Oxide Microwave MESFETs and HFETs

    DTIC Science & Technology

    2016-04-01

    Only fragmentary figure captions survive for this record: Monte Carlo simulations of noise and energy relaxation in doped zinc-oxide and structured ZnO transistor materials with a 2-D electron gas (2DEG) channel, including Monte Carlo data with hot-phonon effects at electron gas densities of 1×10^17 cm^-3 and 1×10^18 cm^-3, and a Monte Carlo simulation of density-dependent hot-electron energy.

  1. The many-body Wigner Monte Carlo method for time-dependent ab-initio quantum simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sellier, J.M., E-mail: jeanmichel.sellier@parallel.bas.bg; Dimov, I.

    2014-09-15

    The aim of ab-initio approaches is the simulation of many-body quantum systems from the first principles of quantum mechanics. These methods are traditionally based on the many-body Schrödinger equation, which represents an incredible mathematical challenge. In this paper, we introduce the many-body Wigner Monte Carlo method in the context of distinguishable particles and in the absence of spin-dependent effects. Despite these restrictions, the method has several advantages. First of all, the Wigner formalism is intuitive, as it is based on the concept of a quasi-distribution function. Secondly, the Monte Carlo numerical approach allows scalability on parallel machines that is practically unachievable by means of other techniques based on finite difference or finite element methods. Finally, this method allows time-dependent ab-initio simulations of strongly correlated quantum systems. In order to validate our many-body Wigner Monte Carlo method, as a case study we simulate a relatively simple system consisting of two particles in several different situations. We first start from two non-interacting free Gaussian wave packets. We then proceed with the inclusion of an external potential barrier, and we conclude by simulating two entangled (i.e. correlated) particles. The results show how, in the case of negligible spin-dependent effects, the many-body Wigner Monte Carlo method provides an efficient and reliable tool to study the time-dependent evolution of quantum systems composed of distinguishable particles.

  2. SU-E-T-586: Field Size Dependence of Output Factor for Uniform Scanning Proton Beams: A Comparison of TPS Calculation, Measurement and Monte Carlo Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zheng, Y; Singh, H; Islam, M

    2014-06-01

    Purpose: Output dependence on field size for uniform scanning beams, and the accuracy of treatment planning system (TPS) calculation, are not well studied. The purpose of this work is to investigate the dependence of output on field size for uniform scanning beams and to compare it among TPS calculation, measurements and Monte Carlo simulations. Methods: Field size dependence was studied using various field sizes between 2.5 and 10 cm in diameter. The field size factor was studied for a number of proton range and modulation combinations based on the output at the center of the spread-out Bragg peak, normalized to a 10 cm diameter field. Three methods were used and compared in this study: 1) TPS calculation, 2) ionization chamber measurement, and 3) Monte Carlo simulation. The XiO TPS (Elekta, St. Louis) was used to calculate the output factor using a pencil beam algorithm; a pinpoint ionization chamber was used for measurements; and the Fluka code was used for Monte Carlo simulations. Results: The field size factor varied with proton beam parameters, such as range, modulation, and calibration depth, and could decrease by over 10% from a 10 cm to a 3 cm diameter field for a large-range proton beam. The XiO TPS predicted the field size factor relatively well at large field sizes, but could differ from measurements by 5% or more for small-field, large-range beams. Monte Carlo simulations predicted the field size factor within 1.5% of measurements. Conclusion: The output factor can vary largely with field size and needs to be accounted for in accurate proton beam delivery. This is especially important for small-field beams such as in stereotactic proton therapy, where the field size dependence is large and TPS calculation is inaccurate. Measurements or Monte Carlo simulations are recommended for output determination in such cases.

  3. A measurement-based generalized source model for Monte Carlo dose simulations of CT scans

    PubMed Central

    Ming, Xin; Feng, Yuanming; Liu, Ransheng; Yang, Chengwen; Zhou, Li; Zhai, Hezheng; Deng, Jun

    2018-01-01

    The goal of this study is to develop a generalized source model (GSM) for accurate Monte Carlo dose simulations of CT scans based solely on the measurement data, without a priori knowledge of scanner specifications. The proposed generalized source model consists of an extended circular source located at x-ray target level, with its energy spectrum, source distribution and fluence distribution derived from a set of measurement data conveniently available in the clinic. Specifically, the central axis percent depth dose (PDD) curves measured in water and the cone output factors measured in air were used to derive the energy spectrum and the source distribution, respectively, with a Levenberg-Marquardt algorithm. The in-air film measurement of fan-beam dose profiles at fixed gantry was back-projected to generate the fluence distribution of the source model. A benchmarked Monte Carlo user code was used to simulate the dose distributions in water with the developed source model as beam input. The feasibility and accuracy of the proposed source model were tested on GE LightSpeed and Philips Brilliance Big Bore multi-detector CT (MDCT) scanners available in our clinic. In general, the Monte Carlo simulations of the PDDs in water and dose profiles along lateral and longitudinal directions agreed with the measurements within 4%/1 mm for both CT scanners. The absolute dose comparison using two CTDI phantoms (16 cm and 32 cm in diameters) indicated a better than 5% agreement between the Monte Carlo-simulated and the ion chamber-measured doses at a variety of locations for the two scanners. Overall, this study demonstrated that a generalized source model can be constructed based only on a set of measurement data and used for accurate Monte Carlo dose simulations of patients’ CT scans, which would facilitate patient-specific CT organ dose estimation and cancer risk management in diagnostic and therapeutic radiology. PMID:28079526
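    A schematic illustration of the fitting step described here: recovering spectral weights from a measured depth-dose curve with a Levenberg-Marquardt fit. The exponential basis curves stand in for precomputed mono-energetic PDDs, and all values are invented for illustration; this is not the authors' implementation.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
depth = np.linspace(0.5, 20.0, 40)            # cm
mu = np.array([0.25, 0.15, 0.08, 0.05])       # 1/cm, stand-in "energies"
basis = np.exp(-np.outer(depth, mu))          # (40, 4) basis PDD curves

w_true = np.array([0.1, 0.3, 0.4, 0.2])
pdd_meas = basis @ w_true * (1.0 + 0.01 * rng.normal(size=depth.size))

# Levenberg-Marquardt fit of the spectral weights to the measured PDD.
res = least_squares(lambda w: basis @ w - pdd_meas,
                    x0=np.full(4, 0.25), method="lm")
print("fitted weights:", np.round(res.x, 3))
```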

  4. SU-F-T-657: In-Room Neutron Dose From High Energy Photon Beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christ, D; Ding, G

    Purpose: To estimate the neutron dose inside the treatment room from photodisintegration events in high energy photon beams using Monte Carlo simulations and experimental measurements. Methods: The Monte Carlo code MCNP6 was used for the simulations. An Eberline ESP-1 Smart Portable Neutron Detector was used to measure neutron dose. A water phantom was centered at isocenter on the treatment couch, and the detector was placed near the phantom. A Varian 2100EX linear accelerator delivered an 18 MV open-field photon beam to the phantom at 400 MU/min, and a camera captured the detector readings. The experimental setup was modeled in the Monte Carlo simulation. The source was modeled for two extreme cases: a) a hemispherical photon source emitting from the target and b) a cone source with the angle of the primary collimator cone. The model includes the target, primary collimator, flattening filter, secondary collimators, water phantom, detector and concrete walls. Energy deposition tallies were measured for neutrons in the detector and for photons at the center of the phantom. Results: For an 18 MV beam with an open 10 cm by 10 cm field and the gantry at 180°, the Monte Carlo simulations predict the neutron dose in the detector to be 0.11% of the photon dose in the water phantom for case a) and 0.01% for case b). The measured neutron dose is 0.04% of the photon dose. Considering the range of neutron dose predicted by the Monte Carlo simulations, the calculated results are in good agreement with measurements. Conclusion: We calculated the in-room neutron dose by using Monte Carlo techniques, and the predicted neutron dose is confirmed by experimental measurements. If we remodel the source as an electron beam hitting the target for a more accurate representation of the bremsstrahlung fluence, it is feasible that the Monte Carlo simulations can be used to help in shielding designs.

  5. A measurement-based generalized source model for Monte Carlo dose simulations of CT scans

    NASA Astrophysics Data System (ADS)

    Ming, Xin; Feng, Yuanming; Liu, Ransheng; Yang, Chengwen; Zhou, Li; Zhai, Hezheng; Deng, Jun

    2017-03-01

    The goal of this study is to develop a generalized source model for accurate Monte Carlo dose simulations of CT scans based solely on the measurement data, without a priori knowledge of scanner specifications. The proposed generalized source model consists of an extended circular source located at x-ray target level, with its energy spectrum, source distribution and fluence distribution derived from a set of measurement data conveniently available in the clinic. Specifically, the central axis percent depth dose (PDD) curves measured in water and the cone output factors measured in air were used to derive the energy spectrum and the source distribution, respectively, with a Levenberg-Marquardt algorithm. The in-air film measurement of fan-beam dose profiles at fixed gantry was back-projected to generate the fluence distribution of the source model. A benchmarked Monte Carlo user code was used to simulate the dose distributions in water with the developed source model as beam input. The feasibility and accuracy of the proposed source model were tested on GE LightSpeed and Philips Brilliance Big Bore multi-detector CT (MDCT) scanners available in our clinic. In general, the Monte Carlo simulations of the PDDs in water and dose profiles along lateral and longitudinal directions agreed with the measurements within 4%/1 mm for both CT scanners. The absolute dose comparison using two CTDI phantoms (16 cm and 32 cm in diameters) indicated a better than 5% agreement between the Monte Carlo-simulated and the ion chamber-measured doses at a variety of locations for the two scanners. Overall, this study demonstrated that a generalized source model can be constructed based only on a set of measurement data and used for accurate Monte Carlo dose simulations of patients’ CT scans, which would facilitate patient-specific CT organ dose estimation and cancer risk management in diagnostic and therapeutic radiology.

  6. LLNL Mercury Project Trinity Open Science Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brantley, Patrick; Dawson, Shawn; McKinley, Scott

    2016-04-20

    The Mercury Monte Carlo particle transport code developed at Lawrence Livermore National Laboratory (LLNL) is used to simulate the transport of radiation through urban environments. These challenging calculations include complicated geometries and require significant computational resources to complete. As a result, a question arises as to the level of convergence of the calculations with Monte Carlo simulation particle count. In the Trinity Open Science calculations, one main focus was to investigate the convergence of the relevant simulation quantities with Monte Carlo particle count in order to assess the current simulation methodology. Both for this application space and for more general applicability, we also investigated the impact of code algorithms on parallel scaling on the Trinity machine, as well as the utilization of the Trinity DataWarp burst buffer technology in Mercury via the LLNL Scalable Checkpoint/Restart (SCR) library.

  7. Determining the Statistical Power of the Kolmogorov-Smirnov and Anderson-Darling Goodness-of-Fit Tests via Monte Carlo Simulation

    DTIC Science & Technology

    2016-12-01

    KS and AD Statistical Power via Monte Carlo Simulation: Statistical power is the probability of correctly rejecting the null hypothesis when the … real-world data to test the accuracy of the simulation. Statistical comparison of these metrics can be necessary when making such a determination.
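    The Monte Carlo power estimate itself is straightforward to sketch: draw many data sets from an alternative distribution and count how often the test rejects at the chosen significance level. The alternative used below (Student's t with 3 degrees of freedom against a standard-normal null) is an illustrative assumption.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

def ks_power(alt_sampler, n=50, alpha=0.05, n_trials=2000):
    """Estimate the power of the one-sample KS test against N(0, 1):
    the fraction of simulated data sets, drawn from the alternative,
    for which the test correctly rejects the null."""
    rejections = 0
    for _ in range(n_trials):
        x = alt_sampler(n)
        if stats.kstest(x, "norm").pvalue < alpha:
            rejections += 1
    return rejections / n_trials

# Alternative: Student's t with 3 dof (heavier tails than the null).
print(f"estimated power: {ks_power(lambda n: rng.standard_t(3, size=n)):.3f}")
```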

  8. Computer simulation of stochastic processes through model-sampling (Monte Carlo) techniques.

    PubMed

    Sheppard, C W.

    1969-03-01

    A simple Monte Carlo simulation program is outlined which can be used for the investigation of random-walk problems, for example in diffusion, or the movement of tracers in the blood circulation. The results given by the simulation are compared with those predicted by well-established theory, and it is shown how the model can be expanded to deal with drift, and with reflexion from or adsorption at a boundary.
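    A present-day rendering of the kind of random-walk simulation described, checking the simulated RMS displacement against the sqrt(n) prediction of diffusion theory; the walker count and step length are illustrative.

```python
import numpy as np

rng = np.random.default_rng(11)

def random_walk_rms(n_walkers=10_000, n_steps=500, step=1.0):
    """Unbiased 1-D random walk: each tracer moves +/- step per time unit.
    Theory predicts an RMS displacement of sqrt(n_steps) * step, the
    discrete analogue of diffusive spreading <x^2> = 2 D t."""
    steps = rng.choice([-step, step], size=(n_walkers, n_steps))
    final_x = steps.sum(axis=1)
    return np.sqrt((final_x ** 2).mean())

n = 500
print(f"simulated RMS: {random_walk_rms(n_steps=n):.2f}, "
      f"theory: {np.sqrt(n):.2f}")
```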

  9. Diagnosing Undersampling Biases in Monte Carlo Eigenvalue and Flux Tally Estimates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perfetti, Christopher M.; Rearden, Bradley T.; Marshall, William J.

    2017-02-08

    This study focuses on understanding the phenomenon in Monte Carlo simulations known as undersampling, in which Monte Carlo tally estimates may not encounter a sufficient number of particles during each generation to obtain unbiased tally estimates. Steady-state Monte Carlo simulations were performed using the KENO Monte Carlo tools within the SCALE code system for models of several burnup credit applications with varying degrees of spatial and isotopic complexity, and the incidence and impact of undersampling on eigenvalue and flux estimates were examined. Using an inadequate number of particle histories in each generation was found to produce a maximum bias of ~100 pcm in eigenvalue estimates and biases that exceeded 10% in fuel pin flux tally estimates. Having quantified the potential magnitude of undersampling biases in eigenvalue and flux tally estimates in these systems, this study then investigated whether Markov chain Monte Carlo convergence metrics could be integrated into Monte Carlo simulations to predict the onset and magnitude of undersampling biases. Five potential metrics for identifying undersampling biases were implemented in the SCALE code system and evaluated for their ability to predict undersampling biases by comparing the test metric scores with the observed undersampling biases. Of the five convergence metrics that were investigated, three (the Heidelberger-Welch relative half-width, the Gelman-Rubin $\hat{R}_c$ diagnostic, and tally entropy) showed the potential to accurately predict the behavior of undersampling biases in the responses examined.
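    Of the metrics listed, the Gelman-Rubin diagnostic is easy to state compactly. A sketch of the basic potential scale reduction factor follows; the rank-corrected variant and the SCALE implementation details are not reproduced here.

```python
import numpy as np

def gelman_rubin(chains):
    """Potential scale reduction factor R-hat for an (m, n) array of m
    independent chains of length n; values near 1 suggest convergence."""
    m, n = chains.shape
    b = n * chains.mean(axis=1).var(ddof=1)   # between-chain variance
    w = chains.var(axis=1, ddof=1).mean()     # within-chain variance
    var_hat = (n - 1) / n * w + b / n         # pooled variance estimate
    return np.sqrt(var_hat / w)

rng = np.random.default_rng(5)
mixed = rng.normal(0, 1, size=(4, 2000))          # well-mixed chains
stuck = mixed + np.arange(4)[:, None]             # offset (unmixed) chains
print(gelman_rubin(mixed), gelman_rubin(stuck))   # ~1.0 versus >> 1
```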

  10. COMPARISON OF MONTE CARLO METHODS FOR NONLINEAR RADIATION TRANSPORT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    W. R. MARTIN; F. B. BROWN

    2001-03-01

    Five Monte Carlo methods for solving the nonlinear thermal radiation transport equations are compared. The methods include the well-known Implicit Monte Carlo (IMC) method developed by Fleck and Cummings, an alternative to IMC developed by Carter and Forest, an "exact" method recently developed by Ahrens and Larsen, and two methods recently proposed by Martin and Brown. The five Monte Carlo methods are developed and applied to the radiation transport equation in a medium assuming local thermodynamic equilibrium. Conservation of energy is derived and used to define appropriate material energy update equations for each of the methods. Details of the Monte Carlo implementation are presented, both for the random walk simulation and the material energy update. Simulation results for all five methods are obtained for two infinite medium test problems and a 1-D test problem, all of which have analytical solutions. Conclusions regarding the relative merits of the various schemes are presented.

  11. Use of Fluka to Create Dose Calculations

    NASA Technical Reports Server (NTRS)

    Lee, Kerry T.; Barzilla, Janet; Townsend, Lawrence; Brittingham, John

    2012-01-01

    Monte Carlo codes provide an effective means of modeling three-dimensional radiation transport; however, their use is both time- and resource-intensive. The creation of a lookup table or parameterization from Monte Carlo simulation allows users to perform calculations with Monte Carlo results without replicating lengthy calculations. The FLUKA Monte Carlo transport code was used to develop lookup tables and parameterizations for data resulting from the penetration of layers of aluminum, polyethylene, and water with areal densities ranging from 0 to 100 g/cm^2. Heavy charged ions from Z=1 to Z=26, with energies from 0.1 to 10 GeV/nucleon, were simulated. Dose, dose equivalent, and fluence as a function of particle identity, energy, and scattering angle were examined at various depths. Calculations were compared against well-known results and against the results of other deterministic and Monte Carlo codes. Results will be presented.

  12. Pushing the limits of Monte Carlo simulations for the three-dimensional Ising model

    NASA Astrophysics Data System (ADS)

    Ferrenberg, Alan M.; Xu, Jiahao; Landau, David P.

    2018-04-01

    While the three-dimensional Ising model has defied analytic solution, various numerical methods like Monte Carlo, Monte Carlo renormalization group, and series expansion have provided precise information about the phase transition. Using Monte Carlo simulation that employs the Wolff cluster flipping algorithm with both 32-bit and 53-bit random number generators and data analysis with histogram reweighting and quadruple precision arithmetic, we have investigated the critical behavior of the simple cubic Ising model, with lattice sizes ranging from 16³ to 1024³. By analyzing data with cross correlations between various thermodynamic quantities obtained from the same data pool, e.g., logarithmic derivatives of magnetization and derivatives of magnetization cumulants, we have obtained the critical inverse temperature Kc = 0.221 654 626(5) and the critical exponent of the correlation length ν = 0.629 912(86), with precision that exceeds all previous Monte Carlo estimates.
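    For orientation, a compact 2-D version of the Wolff cluster flipping algorithm used in this work; the study itself treats the 3-D model at far larger scale, and the coupling J = 1, lattice size, and temperature below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

def wolff_step(spins, beta):
    """One Wolff cluster flip for the 2-D Ising model (J = 1, periodic).
    Bonds between aligned neighbors are added with p = 1 - exp(-2*beta*J),
    and the grown cluster is flipped as a whole -- rejection-free."""
    L = spins.shape[0]
    p_add = 1.0 - np.exp(-2.0 * beta)
    i, j = rng.integers(L, size=2)
    seed = spins[i, j]
    stack, cluster = [(i, j)], {(i, j)}
    while stack:
        x, y = stack.pop()
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            nx, ny = nx % L, ny % L
            if (nx, ny) not in cluster and spins[nx, ny] == seed \
               and rng.random() < p_add:
                cluster.add((nx, ny))
                stack.append((nx, ny))
    for x, y in cluster:
        spins[x, y] *= -1

L, beta = 32, 0.44          # near the 2-D critical point beta_c ~ 0.4407
spins = rng.choice([-1, 1], size=(L, L))
for _ in range(2000):
    wolff_step(spins, beta)
print("magnetization per site:", abs(spins.mean()))
```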

  13. Optical roughness BRDF model for reverse Monte Carlo simulation of real material thermal radiation transfer.

    PubMed

    Su, Peiran; Eri, Qitai; Wang, Qiang

    2014-04-10

    Optical roughness was introduced into the bidirectional reflectance distribution function (BRDF) model to simulate the reflectance characteristics of thermal radiation. The optical roughness BRDF model stems from the influence of surface roughness and wavelength on the ray reflectance calculation. This model was adopted to simulate real metal emissivity. The reverse Monte Carlo method was used to display the distribution of reflected rays. The numerical simulations showed that the optical roughness BRDF model can capture the effect of wavelength on emissivity and simulate the variation of real metal emissivity with incidence angle.

  14. PHAST: Protein-like heteropolymer analysis by statistical thermodynamics

    NASA Astrophysics Data System (ADS)

    Frigori, Rafael B.

    2017-06-01

    PHAST is a software package written in standard Fortran, with MPI and CUDA extensions, able to efficiently perform parallel multicanonical Monte Carlo simulations of single or multiple heteropolymeric chains, as coarse-grained models for proteins. The outcome data can be straightforwardly analyzed within its microcanonical Statistical Thermodynamics module, which allows for computing the entropy, caloric curve, specific heat and free energies. As a case study, we investigate the aggregation of heteropolymers bioinspired by Aβ25-33 fragments and their cross-seeding with IAPP20-29 isoforms. Excellent parallel scaling is observed, even under numerically difficult first-order-like phase transitions, which are properly described by the built-in, fully reconfigurable force fields. The package is free and open source, which should motivate users to readily adapt it to specific purposes.

  15. Shape matters: The case for Ellipsoids and Ellipsoidal Water

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tillack, Andreas F.; Robinson, Bruce H.

    We describe the shape potentials used for the van der Waals interactions between the soft ellipsoids used to coarse-grain molecular moieties in our Metropolis Monte Carlo simulation software. The morphologies resulting from different expressions for these van der Waals interaction potentials are discussed for the case of a prolate spheroid system with a strong dipole at the ellipsoid center. We also show that the calculation of ellipsoids is, at worst, only about fivefold more expensive computationally when compared to a simple Lennard-Jones sphere. Finally, as an application of the ellipsoidal shape, we parametrize water from the original SPC water model and observe, just through the difference in shape alone, a significant improvement of the O-O radial distribution function when compared to experimental data.

  16. Low Frequency Flats for Imaging Cameras on the Hubble Space Telescope

    NASA Astrophysics Data System (ADS)

    Kossakowski, Diana; Avila, Roberto J.; Borncamp, David; Grogin, Norman A.

    2017-01-01

    We created a revamped Low Frequency Flat (L-Flat) algorithm for the Hubble Space Telescope (HST) and all of its imaging cameras. The current program that makes these calibration files does not compile on modern computer systems and requires translation to Python. We took the opportunity to explore various methods that reduce the scatter of photometric observations, using chi-squared optimizers along with Markov Chain Monte Carlo (MCMC). We created simulations to validate the algorithms and then worked with the UV photometry of the globular cluster NGC6681 to update the calibration files for the Advanced Camera for Surveys (ACS) and Solar Blind Channel (SBC). The new software was made for general usage and can therefore be applied to any of the current imaging cameras on HST.

  17. Data Reprocessing on Worldwide Distributed Systems

    NASA Astrophysics Data System (ADS)

    Wicke, Daniel

    The DØ experiment faces many challenges in terms of enabling access to large datasets for physicists on four continents. The strategy for solving these problems on worldwide distributed computing clusters is presented. Since the beginning of Run II of the Tevatron (March 2001), all Monte Carlo simulations for the experiment have been produced at remote systems. For data analysis, a system of regional analysis centers (RACs) was established which supplies the associated institutes with the data. This structure, which is similar to the tiered structure foreseen for the LHC, was used in Fall 2003 to reprocess all DØ data with a much improved version of the reconstruction software. This makes DØ the first running experiment that has implemented and operated all important computing tasks of a high energy physics experiment on systems distributed worldwide.

  18. MC ray-tracing optimization of lobster-eye focusing devices with RESTRAX

    NASA Astrophysics Data System (ADS)

    Šaroun, Jan; Kulda, Jiří

    2006-11-01

    The enhanced functionalities of the latest version of the RESTRAX software, which provides a high-speed Monte Carlo (MC) ray-tracing code representing a virtual three-axis neutron spectrometer, include the representation of parabolic and elliptic guide profiles and facilities for numerical optimization of the parameter values characterizing the instrument components. As examples, we present simulations of a doubly focusing monochromator in combination with cold neutron guides and lobster-eye supermirror devices, concentrating a monochromatic beam onto small sample volumes. A Levenberg-Marquardt minimization algorithm is used to optimize simultaneously several parameters of the monochromator and lobster-eye guides. We compare the performance of optimized configurations in terms of monochromatic neutron flux and energy spread and demonstrate the effect of lobster-eye optics on beam transformations in real and momentum subspaces.

  19. 14- by 22-Foot Subsonic Tunnel Laser Velocimeter Upgrade

    NASA Technical Reports Server (NTRS)

    Meyers, James F.; Lee, Joseph W.; Cavone, Angelo A.; Fletcher, Mark T.

    2012-01-01

    A long-focal-length laser velocimeter constructed in the early 1980's was upgraded using current technology to improve usability, reliability and future serviceability. The original free-space optics were replaced with a state-of-the-art fiber-optic subsystem, which allowed most of the optics, including the laser, to be remote from the harsh tunnel environment. General-purpose high-speed digitizers incorporated into a standard modular data acquisition system, along with custom signal-processing software executed on a desktop computer, served as the replacement for the signal processors. The resulting system increased optical sensitivity with real-time signal/data processing that produced measurement precisions exceeding those of the original system. Monte Carlo simulations, along with laboratory and wind tunnel investigations, were used to determine system characteristics and measurement precision.

  20. The Modernization of a Long-Focal Length Fringe-Type Laser Velocimeter

    NASA Technical Reports Server (NTRS)

    Meyers, James F.; Lee, Joseph W.; Cavone, Angelo A.; Fletcher, Mark T.

    2012-01-01

    A long-focal-length laser velocimeter constructed in the early 1980's was upgraded using current technology to improve usability, reliability and future serviceability. The original free-space optics were replaced with a state-of-the-art fiber-optic subsystem, which allowed most of the optics, including the laser, to be remote from the harsh tunnel environment. General-purpose high-speed digitizers incorporated into a standard modular data acquisition system, along with custom signal-processing software executed on a desktop computer, served as the replacement for the signal processors. The resulting system increased optical sensitivity with real-time signal/data processing that produced measurement precisions exceeding those of the original system. Monte Carlo simulations, along with laboratory and wind tunnel investigations, were used to determine system characteristics and measurement precision.

  1. Efficient calculation of luminance variation of a luminaire that uses LED light sources

    NASA Astrophysics Data System (ADS)

    Goldstein, Peter

    2007-09-01

    Many luminaires have an array of LEDs that illuminate a lenslet-array diffuser in order to create the appearance of a single, extended source with a smooth luminance distribution. Designing such a system is challenging because luminance calculations for a lenslet array generally involve tracing millions of rays per LED, which is computationally intensive and time-consuming. This paper presents a technique for calculating an on-axis luminance distribution by tracing only one ray per LED per lenslet. A multiple-LED system is simulated with this method, and with Monte Carlo ray-tracing software for comparison. Accuracy improves, and computation time decreases by at least five orders of magnitude with this technique, which has applications in LED-based signage, displays, and general illumination.

  2. A new tool for radiation exposure calculations in aircraft flights during disturbed solar activity periods

    NASA Astrophysics Data System (ADS)

    Paschalis, Pavlos; Tezari, Anastasia; Gerontidou, Maria; Mavromichalaki, Helen

    2016-04-01

    Galactic cosmic rays and solar energetic particles can penetrate the Earth's atmosphere and interact with its molecules, causing atmospheric showers of secondary particles that are detected by ground-based neutron monitors. These cascades are of great importance for the study of the radiation exposure of aircraft crews. A new Geant4 software application is presented, based on DYASTIMA (Dynamic Atmospheric Shower Tracking Interactive Model Application), which calculates the effective dose that aviators may receive in different flight scenarios characterized by different altitudes and flight routes, during quiet and disturbed solar and cosmic ray activity. The concept is based on Monte Carlo simulations using phantoms for the aircraft and the aviator and on experimenting with different shielding materials.

  3. A Monte Carlo Simulation of Brownian Motion in the Freshman Laboratory

    ERIC Educational Resources Information Center

    Anger, C. D.; Prescott, J. R.

    1970-01-01

    Describes a "dry-lab" experiment for the college freshman laboratory, in which the essential features of Brownian motion are demonstrated from given principles, using the Monte Carlo technique. Calculations are carried out by a computation scheme based on a computer language. Bibliography. (LC)

  4. Grand canonical ensemble Monte Carlo simulation of the dCpG/proflavine crystal hydrate.

    PubMed

    Resat, H; Mezei, M

    1996-09-01

    The grand canonical ensemble Monte Carlo molecular simulation method is used to investigate hydration patterns in the crystal hydrate structure of the dCpG/proflavine intercalated complex. The objective of this study is to show by example that the recently advocated grand canonical ensemble simulation is a computationally efficient method for determining the positions of the hydrating water molecules in protein and nucleic acid structures. A detailed molecular simulation convergence analysis and an analogous comparison of the theoretical results with experiments clearly show that the grand ensemble simulations can be far more advantageous than the comparable canonical ensemble simulations.
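    The distinguishing feature of grand canonical ensemble Monte Carlo is the particle insertion/deletion move. A sketch of the standard acceptance probabilities in the Frenkel-Smit textbook form follows; the reduced-unit numbers are illustrative, and this is not the code used in the study.

```python
import numpy as np

def gcmc_acceptance(move, n, v, beta, mu, delta_u, lam=1.0):
    """Acceptance probabilities for particle insertion/deletion moves in
    grand canonical Monte Carlo (Frenkel & Smit form). delta_u is the
    potential energy change of the trial move; lam is the thermal
    de Broglie wavelength."""
    if move == "insert":        # N -> N + 1
        arg = v / (lam**3 * (n + 1)) * np.exp(beta * (mu - delta_u))
    else:                       # N -> N - 1
        arg = lam**3 * n / v * np.exp(-beta * (mu + delta_u))
    return min(1.0, arg)

# Illustrative reduced-unit numbers only.
print(gcmc_acceptance("insert", n=100, v=1000.0, beta=1.2,
                      mu=-3.0, delta_u=-1.5))
```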

  5. TU-H-207A-02: Relative Importance of the Various Factors Influencing the Accuracy of Monte Carlo Simulated CT Dose Index

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marous, L; Muryn, J; Liptak, C

    2016-06-15

    Purpose: Monte Carlo simulation is a frequently used technique for assessing patient dose in CT. The accuracy of a Monte Carlo program is often validated using the standard CT dose index (CTDI) phantoms by comparing simulated and measured CTDI100. To achieve good agreement, many input parameters in the simulation (e.g., energy spectrum and effective beam width) need to be determined. However, not all the parameters have equal importance. Our aim was to assess the relative importance of the various factors that influence the accuracy of simulated CTDI100. Methods: A Monte Carlo program previously validated for a clinical CT system was used to simulate CTDI100. For the standard CTDI phantoms (32 and 16 cm in diameter), CTDI100 values from the central and four peripheral locations at 70 and 120 kVp were first simulated using a set of reference input parameter values (treated as the truth). To emulate the situation in which the input parameter values used by the researcher may deviate from the truth, additional simulations were performed in which intentional errors were introduced into the input parameters, the effects of which on simulated CTDI100 were analyzed. Results: At 38.4-mm collimation, errors in effective beam width up to 5.0 mm showed negligible effects on simulated CTDI100 (<1.0%). Likewise, errors in acrylic density of up to 0.01 g/cm^3 resulted in small CTDI100 errors (<2.5%). In contrast, errors in spectral HVL produced more significant effects: slight deviations (±0.2 mm Al) produced errors up to 4.4%, whereas more extreme deviations (±1.4 mm Al) produced errors as high as 25.9%. Lastly, ignoring the CT table introduced errors up to 13.9%. Conclusion: Monte Carlo simulated CTDI100 values are insensitive to errors in effective beam width and acrylic density. However, they are sensitive to errors in spectral HVL. To obtain accurate results, the CT table should not be ignored. This work was supported by a Faculty Research and Development Award from Cleveland State University.

  6. APS undulator and wiggler sources: Monte-Carlo simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, S.L.; Lai, B.; Viccaro, P.J.

    1992-02-01

    Standard insertion devices will be provided to each sector by the Advanced Photon Source. It is important to define the radiation characteristics of these general purpose devices. In this document, results of Monte Carlo simulations are presented. These results, based on the SHADOW program, include the APS Undulator A (UA), Wiggler A (WA), and Wiggler B (WB).

  7. Markov Chain Monte Carlo Estimation of Item Parameters for the Generalized Graded Unfolding Model

    ERIC Educational Resources Information Center

    de la Torre, Jimmy; Stark, Stephen; Chernyshenko, Oleksandr S.

    2006-01-01

    The authors present a Markov Chain Monte Carlo (MCMC) parameter estimation procedure for the generalized graded unfolding model (GGUM) and compare it to the marginal maximum likelihood (MML) approach implemented in the GGUM2000 computer program, using simulated and real personality data. In the simulation study, test length, number of response…

  8. A systematic framework for Monte Carlo simulation of remote sensing errors map in carbon assessments

    Treesearch

    S. Healey; P. Patterson; S. Urbanski

    2014-01-01

    Remotely sensed observations can provide unique perspective on how management and natural disturbance affect carbon stocks in forests. However, integration of these observations into formal decision support will rely upon improved uncertainty accounting. Monte Carlo (MC) simulations offer a practical, empirical method of accounting for potential remote sensing errors...

  9. Monte Carlo simulations of coherent backscatter for identification of the optical coefficients of biological tissues in vivo

    NASA Astrophysics Data System (ADS)

    Eddowes, M. H.; Mills, T. N.; Delpy, D. T.

    1995-05-01

    A Monte Carlo model of light backscattered from turbid media has been used to simulate the effects of weak localization in biological tissues. A validation technique is used that implies that for the scattering and absorption coefficients and for refractive index mismatches found in tissues, the Monte Carlo method is likely to provide more accurate results than the methods previously used. The model also has the ability to simulate the effects of various illumination profiles and other laboratory-imposed conditions. A curve-fitting routine has been developed that might be used to extract the optical coefficients from the angular intensity profiles seen in experiments on turbid biological tissues, data that could be obtained in vivo.

  10. Radial-based tail methods for Monte Carlo simulations of cylindrical interfaces

    NASA Astrophysics Data System (ADS)

    Goujon, Florent; Bêche, Bruno; Malfreyt, Patrice; Ghoufi, Aziz

    2018-03-01

    In this work, we implement for the first time radial-based tail methods for Monte Carlo simulations of cylindrical interfaces. The efficiency of this method is then evaluated through the calculation of the surface tension and coexisting properties. We show that the inclusion of tail corrections during the course of the Monte Carlo simulation impacts the coexisting and the interfacial properties. We establish that the long-range corrections to the surface tension are of the same order of magnitude as those obtained for a planar interface. We show that the slab-based tail method does not alter the location of the Gibbs equimolar dividing surface. Additionally, a non-monotonic behavior of the surface tension is exhibited as a function of the radius of the equimolar dividing surface.
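    For contrast with the radial-based scheme studied here, the standard homogeneous tail correction that such methods generalize can be written in a few lines; the density and cutoff values are illustrative, in reduced Lennard-Jones units.

```python
import numpy as np

def lj_energy_tail(rho, r_c, epsilon=1.0, sigma=1.0):
    """Standard homogeneous tail correction to the Lennard-Jones energy
    per particle for a spherical cutoff r_c, assuming g(r) = 1 beyond r_c:
      u_tail = (8/3) pi rho eps sigma^3 [ (1/3)(sigma/r_c)^9 - (sigma/r_c)^3 ]
    The paper's radial-based scheme generalizes such corrections to
    cylindrical geometry; this sketch shows only the bulk baseline."""
    sr3 = (sigma / r_c) ** 3
    return (8.0 / 3.0) * np.pi * rho * epsilon * sigma**3 * (sr3**3 / 3.0 - sr3)

print(lj_energy_tail(rho=0.70, r_c=2.5))   # reduced LJ units
```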

  11. Monte Carlo source simulation technique for solution of interference reactions in INAA experiments: a preliminary report

    NASA Astrophysics Data System (ADS)

    Allaf, M. Athari; Shahriari, M.; Sohrabpour, M.

    2004-04-01

    A new method using Monte Carlo source simulation to resolve interference reactions in neutron activation analysis experiments has been developed. The neutron spectrum at the sample location was simulated using the Monte Carlo code MCNP, and the contributions of different elements to a specified gamma line were determined. The resulting response matrix, together with the measured peak areas, was used to determine the masses of the elements of interest. A number of benchmark experiments were performed and the calculated results verified against known values. The good agreement obtained suggests that this technique may be useful for eliminating interference reactions in neutron activation analysis.
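
    In essence, the approach turns interference correction into a small linear inverse problem. A sketch with hypothetical numbers, where the response matrix columns (counts per unit mass in each gamma line) would come from the MCNP source simulations:

        import numpy as np

        # Hypothetical 3-element example: rows = gamma lines, columns = elements.
        # R[i, j] = simulated counts in line i per unit mass of element j.
        R = np.array([[120.0,  15.0,   0.5],
                      [  8.0, 240.0,  12.0],
                      [  0.2,   9.0,  75.0]])
        peak_areas = np.array([3.1e4, 5.6e4, 1.2e4])   # measured net peak areas

        masses, *_ = np.linalg.lstsq(R, peak_areas, rcond=None)
        print(masses)   # masses consistent with all interfering lines at once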

  12. Force field development with GOMC, a fast new Monte Carlo molecular simulation code

    NASA Astrophysics Data System (ADS)

    Mick, Jason Richard

    In this work GOMC (GPU Optimized Monte Carlo), a new fast, flexible, and free molecular Monte Carlo code for the simulation of atomistic chemical systems, is presented. The results of a large Lennard-Jonesium simulation in the Gibbs ensemble are presented. Force fields developed using the code are also presented. To fit the models, a quantitative fitting process is outlined using a scoring function and heat maps. The presented n-6 force fields include force fields for noble gases and branched alkanes. These force fields are shown to be the most accurate LJ or n-6 force fields to date for these compounds, capable of reproducing pure-fluid behavior and binary-mixture behavior to a high degree of accuracy.
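
    A minimal sketch of such a scoring function, with illustrative methane-like numbers; in the actual workflow each candidate (epsilon, sigma) pair would be evaluated from a GOMC Gibbs-ensemble run:

        import numpy as np

        def score(sim, expt, weights):
            # Weighted sum of relative deviations between simulated and
            # experimental observables -- lower scores indicate better fits.
            sim, expt, weights = map(np.asarray, (sim, expt, weights))
            return float(np.sum(weights * np.abs(sim - expt) / np.abs(expt)))

        expt = [422.0, 45.99]   # e.g., liquid density (kg/m^3), critical pressure (bar)
        candidates = {          # hypothetical (eps/K, sigma/A) -> simulated values
            (148.0, 3.73): [424.5, 46.3],
            (152.0, 3.70): [418.9, 45.1],
        }
        best = min(candidates, key=lambda p: score(candidates[p], expt, [1.0, 1.0]))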

  13. SU-F-T-682: In-Vivo Simulation of the Relative Biological Effectiveness in Proton Therapy Using a Monte Carlo Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oesten, H; Massachusetts General Hospital, Boston, MA; Loeck, S

    2016-06-15

    Purpose: In proton therapy, the relative biological effectiveness (RBE), compared with conventional photon therapy, is routinely set to 1.1. However, experimental in vitro studies indicate evidence for the variability of the RBE. To clarify the impact on patient treatment, an investigation of the RBE in a preclinical case study was performed. Methods: The Monte Carlo software TOPAS was used to simulate the radiation field of an irradiation setup at the experimental beamline of the proton therapy facility (OncoRay) in Dresden, Germany. Simulations were performed on cone beam CT data (CBCT) of a mouse bearing an orthotopic lung carcinoma xenograft, obtained by an in-house developed small-animal image-guided radiotherapy device. A homogeneous physical fraction dose of 1.8 Gy was prescribed for the contoured tumor volume. Simulated dose and linear energy transfer distributions were used to estimate RBE values in the mouse based on an RBE model by Wedenberg et al. To characterize the radiation sensitivity of normal and tumor tissue, α/β ratios were taken from the literature for NB1RGB (10.1 Gy) and human squamous lung cancer (6.2 Gy) cell lines, respectively. Results: Good dose coverage of the target volume was achieved with a spread-out Bragg peak (SOBP). The contralateral lung was completely spared from radiation. An increase in RBE towards the distal end of the SOBP from 1.07 to 1.35 and from 1.05 to 1.3 was observed for normal tissue and tumor, respectively, with the highest RBE values located distal to the target volume. Conclusion: Modeled RBE values simulated on CBCT for experimental preclinical proton therapy varied with tissue type and depth in a mouse and therefore differed from a constant value of 1.1. Further translational work will include, first, conducting preclinical experiments and, second, analogous RBE studies in patients using experimentally verified simulation settings for our clinically used patient-specific beam conforming technique.
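
    A sketch of how a linear-quadratic, LET-dependent RBE of the Wedenberg type can be evaluated. The photon radiosensitivity alpha_x is an assumed value and q is the model's fitted slope (reported near 0.43 Gy um/keV), so treat the numbers as illustrative:

        import math

        def rbe_lq(dose, let, alpha_beta_x, alpha_x=0.1, q=0.434):
            # Wedenberg-type parameterization: alpha/alpha_x grows linearly
            # with LET, alpha/alpha_x = 1 + q * LET / (alpha/beta)_x; beta = beta_x.
            beta_x = alpha_x / alpha_beta_x
            alpha = alpha_x * (1.0 + q * let / alpha_beta_x)
            effect = alpha * dose + beta_x * dose**2    # proton LQ effect
            # iso-effective photon dose from the photon LQ model
            d_x = (-alpha_x + math.sqrt(alpha_x**2 + 4.0 * beta_x * effect)) / (2.0 * beta_x)
            return d_x / dose

        # e.g., 1.8 Gy at LET = 2 and 8 keV/um for the tumor (alpha/beta = 6.2 Gy)
        print(rbe_lq(1.8, 2.0, 6.2), rbe_lq(1.8, 8.0, 6.2))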

  14. Molecular Monte Carlo Simulations Using Graphics Processing Units: To Waste Recycle or Not?

    PubMed

    Kim, Jihan; Rodgers, Jocelyn M; Athènes, Manuel; Smit, Berend

    2011-10-11

    In the waste recycling Monte Carlo (WRMC) algorithm, (1) multiple trial states may be simultaneously generated and utilized during Monte Carlo moves to improve the statistical accuracy of the simulations, suggesting that such an algorithm may be well suited to parallel implementation on graphics processing units (GPUs). In this paper, we implement two waste recycling Monte Carlo algorithms in CUDA (Compute Unified Device Architecture) using uniformly distributed random trial states and trial states based on displacement random-walk steps, and we test the methods on a methane-zeolite MFI framework system to evaluate their utility. We discuss the specific implementation details of the waste recycling GPU algorithm and compare the methods to other parallel algorithms optimized for the framework system. We analyze the relationship between the statistical accuracy of our simulations and the CUDA block size to determine the efficient allocation of GPU hardware resources. We make comparisons between the GPU and the serial CPU Monte Carlo implementations to assess the speedup over conventional microprocessors. Finally, we apply our optimized GPU algorithms to the important problem of determining free energy landscapes, in this case for molecular motion through the zeolite LTA.
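
    The core waste-recycling idea, that rejected trial states still carry statistical information, can be shown in a few lines. A serial sketch on a 1D harmonic potential with Barker acceptance (the paper's GPU multi-proposal machinery builds on the same kind of estimator):

        import math
        import random

        BETA = 1.0
        def energy(x): return 0.5 * x * x      # 1D harmonic test potential

        def wrmc_mean_x2(n_steps=100_000, step=1.5):
            # Waste-recycling estimator: the rejected trial state also
            # contributes to the average, weighted by its acceptance probability.
            x = 0.0
            w = math.exp(-BETA * energy(x))
            total = 0.0
            for _ in range(n_steps):
                x_new = x + random.uniform(-step, step)
                w_new = math.exp(-BETA * energy(x_new))
                p = w_new / (w + w_new)                    # Barker acceptance
                total += p * x_new**2 + (1.0 - p) * x**2   # recycle both states
                if random.random() < p:
                    x, w = x_new, w_new
            return total / n_steps   # should approach <x^2> = 1/BETA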

  15. SU-G-JeP1-13: Innovative Tracking Detector for Dose Monitoring in Hadron Therapy: Realization and Monte Carlo Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rucinski, A; Mancini-Terracciano, C; Paramatti, R

    2016-06-15

    Purpose: Development of strategies to monitor range uncertainties is necessary to improve treatment planning in Charged Particle Therapy (CPT) and fully exploit the advantages of ion beams. Our group developed (within the framework of the INSIDE project funded by the Italian research ministry) and is currently building a compact Dose Profiler (DP) detector able to backtrack charged secondary particles produced in the patient during the irradiation. Furthermore, we are studying a monitoring strategy that exploits charged secondary emission profiles to control the range of the ion beam. Methods: This contribution reports on the DP detector design and construction status. The detector consists of a charged secondary tracker composed of scintillating fiber layers and a LYSO calorimeter for particle energy measurement. The detector layout has been optimized using the FLUKA Monte Carlo (MC) simulation software. The simulation of a 220 MeV carbon beam impinging on a PMMA target has been performed to study the detector response, exploiting previous secondary radiation measurements performed by our group. The emission profile of charged secondary particles was reconstructed by backtracking the particles to their generation point to benchmark the DP performance. Results: The DP construction status, including the technological details, will be presented. The feasibility of range monitoring with the DP will be demonstrated by means of MC studies. The correlation of the charged secondary emission shape with the position of the Bragg peak (BP) will be shown, as well as the spatial resolution achievable on the BP position estimate (less than 3 mm) under clinic-like conditions. Conclusion: The simulation studies supported the feasibility of an accurate range monitoring technique exploiting charged secondary fragments emitted during particle therapy treatment. The DP experimental tests are foreseen in 2016 at the CNAO particle therapy center in Pavia.
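
    The backtracking step reduces to a closest-approach computation between the reconstructed track and the beam axis. A sketch, with hypothetical hit coordinates in mm:

        import numpy as np

        def closest_approach_to_beam(p, d):
            # Backtrack a straight track (point p, direction d) to its point
            # of closest approach with the beam axis (taken as the z axis).
            d = d / np.linalg.norm(d)
            z_hat = np.array([0.0, 0.0, 1.0])
            # Minimize |p + t*d - s*z_hat|^2 over (t, s).
            a, b, c = d @ d, d @ z_hat, z_hat @ z_hat
            e, f = d @ p, z_hat @ p
            denom = a * c - b * b        # zero if the track is parallel to the beam
            t = (b * f - c * e) / denom
            return p + t * d             # estimated emission point on the track

        # Two hypothetical fiber-layer hits define the track direction:
        hit1, hit2 = np.array([5.0, 2.0, 30.0]), np.array([6.0, 2.4, 34.0])
        print(closest_approach_to_beam(hit1, hit2 - hit1))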

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    St James, S; Argento, D; DeWitt, D

    Purpose: Fast neutron therapy is offered at the University of Washington Medical Center for treatment of selected cancers. The hardware and control systems of the UW Clinical Neutron Therapy System are undergoing upgrades to enable delivery of intensity-modulated neutron therapy (IMNT). To clinically implement IMNT, dose verification tools need to be developed. We propose a portal imaging system that relies on the creation of positron-emitting isotopes ({sup 11}C and {sup 15}O) through (n, 2n) reactions with a PMMA plate placed below the patient. After field delivery, the plate is retrieved from the vault and imaged using a reader that detects the annihilation photons. The pattern of activity produced in the plate provides information to reconstruct the neutron fluence map, which can be compared to fluence maps from Monte Carlo (MCNP) simulations to verify treatment delivery. We have previously performed Monte Carlo simulations of the portal imaging system (GATE simulations) and the beam line (MCNP simulations). In this work, initial measurements using a prototype system are presented. Methods: Custom electronics were developed for BGO detectors read out with photomultiplier tubes (previous-generation PET detectors from a CTI ECAT 953 scanner). Two detectors were placed in coincidence, with a detector separation of 2 cm. Custom software was developed to create the crystal lookup tables and perform a limited-angle planar reconstruction with a stochastic normalization. To test the initial capabilities of the system, PMMA squares were irradiated with neutrons at a depth of 1.5 cm and read out using the prototype system. Doses ranging from 10 to 200 cGy were delivered. Results: Using the prototype system, dose differences in the therapeutic range could be determined. Conclusion: The prototype portal imaging system is capable of detecting neutron doses as low as 10-50 cGy and shows great promise as a patient QA tool for IMNT.
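
    A sketch of the limited-angle planar reconstruction idea: each coincidence between the two detectors defines a line of response (LOR), and activity in the PMMA plate is approximated by where that line crosses the plate plane (the stochastic normalization used in the actual software is omitted):

        import numpy as np

        def backproject(hits_a, hits_b, nbins=64, extent=50.0):
            # hits_a/hits_b: (x, y, z) coincidence hits in the two detectors,
            # assumed to lie on opposite sides of the plate plane z = 0 (mm).
            image = np.zeros((nbins, nbins))
            for (xa, ya, za), (xb, yb, zb) in zip(hits_a, hits_b):
                t = (0.0 - za) / (zb - za)          # LOR crossing of z = 0
                x, y = xa + t * (xb - xa), ya + t * (yb - ya)
                i = int((x + extent) / (2 * extent) * nbins)
                j = int((y + extent) / (2 * extent) * nbins)
                if 0 <= i < nbins and 0 <= j < nbins:
                    image[i, j] += 1.0
            return image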

  17. Characterizing a proton beam scanning system for Monte Carlo dose calculation in patients

    NASA Astrophysics Data System (ADS)

    Grassberger, C.; Lomax, Anthony; Paganetti, H.

    2015-01-01

    The presented work has two goals. First, to demonstrate the feasibility of accurately characterizing a proton radiation field at treatment head exit for Monte Carlo dose calculation of actively scanned patient treatments. Second, to show that this characterization can be done based on measured depth dose curves and spot size alone, without consideration of the exact treatment head delivery system. This is demonstrated through calibration of a Monte Carlo code to the specific beam lines of two institutions, Massachusetts General Hospital (MGH) and the Paul Scherrer Institute (PSI). Comparison of simulations modeling the full treatment head at MGH to ones employing a parameterized phase space of protons at treatment head exit reveals the adequacy of the method for patient simulations. The secondary particle production in the treatment head is typically below 0.2% of primary fluence, except for low-energy electrons (<0.6 MeV for 230 MeV protons), whose contribution to skin dose is negligible. However, there is a significant difference between the two methods in the low-dose penumbra, making full treatment head simulations necessary to study out-of-field effects such as secondary cancer induction. To calibrate the Monte Carlo code to measurements in a water phantom, we use an analytical Bragg peak model to extract the range-dependent energy spread at the two institutions, as this quantity is usually not available through measurements. Comparison of the measured with the simulated depth dose curves demonstrates agreement within 0.5 mm over the entire energy range. Subsequently, we simulate three patient treatments with varying anatomical complexity (liver, head and neck, and lung) to give an example of how this approach can be employed to investigate site-specific discrepancies between the treatment planning system and Monte Carlo simulations.
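
    In sketch form, a parameterized phase-space source amounts to sampling each proton's energy and lateral position from fitted distributions. The numbers below are illustrative, not the MGH or PSI beam data; in practice the energy spread would be extracted by fitting the analytical Bragg peak model to measured depth dose curves:

        import numpy as np

        rng = np.random.default_rng(1)

        def sample_phase_space(n, e_mean, e_spread, spot_sigma):
            # Gaussian energy spectrum (MeV) and Gaussian lateral spot (mm)
            # at treatment head exit -- a minimal phase-space parameterization.
            energy = rng.normal(e_mean, e_spread, n)
            x, y = rng.normal(0.0, spot_sigma, (2, n))
            return energy, x, y

        energy, x, y = sample_phase_space(10_000, e_mean=160.0, e_spread=1.0,
                                          spot_sigma=4.0)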

  18. Characterizing a Proton Beam Scanning System for Monte Carlo Dose Calculation in Patients

    PubMed Central

    Grassberger, C; Lomax, Tony; Paganetti, H

    2015-01-01

    The presented work has two goals. First, to demonstrate the feasibility of accurately characterizing a proton radiation field at treatment head exit for Monte Carlo dose calculation of actively scanned patient treatments. Second, to show that this characterization can be done based on measured depth dose curves and spot size alone, without consideration of the exact treatment head delivery system. This is demonstrated through calibration of a Monte Carlo code to the specific beam lines of two institutions, Massachusetts General Hospital (MGH) and the Paul Scherrer Institute (PSI). Comparison of simulations modeling the full treatment head at MGH to ones employing a parameterized phase space of protons at treatment head exit reveals the adequacy of the method for patient simulations. The secondary particle production in the treatment head is typically below 0.2% of primary fluence, except for low-energy electrons (<0.6 MeV for 230 MeV protons), whose contribution to skin dose is negligible. However, there is a significant difference between the two methods in the low-dose penumbra, making full treatment head simulations necessary to study out-of-field effects such as secondary cancer induction. To calibrate the Monte Carlo code to measurements in a water phantom, we use an analytical Bragg peak model to extract the range-dependent energy spread at the two institutions, as this quantity is usually not available through measurements. Comparison of the measured with the simulated depth dose curves demonstrates agreement within 0.5 mm over the entire energy range. Subsequently, we simulate three patient treatments with varying anatomical complexity (liver, head and neck, and lung) to give an example of how this approach can be employed to investigate site-specific discrepancies between the treatment planning system and Monte Carlo simulations. PMID:25549079

  19. Using Monte Carlo Simulation to Prioritize Key Maritime Environmental Impacts of Port Infrastructure

    NASA Astrophysics Data System (ADS)

    Perez Lespier, L. M.; Long, S.; Shoberg, T.

    2016-12-01

    This study creates a Monte Carlo simulation model to prioritize key indicators of environmental impacts resulting from maritime port infrastructure. Data inputs are derived from Landsat imagery, government databases, and industry reports. Results are validated by subject matter experts and compared with those returned from time-series regression to determine goodness of fit. The Port of Prince Rupert, Canada, is used as the location for the study.
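
    A minimal sketch of the prioritization step, assuming hypothetical indicators with triangular (low, mode, high) impact distributions; the study's real inputs come from Landsat imagery, government databases, and industry reports:

        import numpy as np

        rng = np.random.default_rng(2)

        # Hypothetical indicators with (low, mode, high) impact scores.
        indicators = {
            "air emissions":   (2.0, 6.0, 9.0),
            "water quality":   (1.0, 4.0, 8.0),
            "dredging volume": (3.0, 5.0, 7.0),
        }

        names = list(indicators)
        samples = np.column_stack(
            [rng.triangular(*indicators[k], size=10_000) for k in names])
        # rank (1 = highest impact) within each MC replicate, then average
        ranks = samples.shape[1] - samples.argsort(axis=1).argsort(axis=1)
        for name, r in zip(names, ranks.mean(axis=0)):
            print(f"{name}: mean rank {r:.2f}")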

  20. Antihydrogen from positronium impact with cold antiprotons: a Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Cassidy, D. B.; Merrison, J. P.; Charlton, M.; Mitroy, J.; Ryzhikh, G.

    1999-04-01

    A Monte Carlo simulation of the reaction to form antihydrogen by positronium impact upon antiprotons has been undertaken. Total and differential cross sections have been utilized as inputs to the simulation, which models the conditions foreseen in planned antihydrogen formation experiments using positrons and antiprotons held in Penning traps. Predictions of antihydrogen production rates, angular distributions, and the variation of the mean antihydrogen temperature as a function of incident positronium kinetic energy have thus been produced.
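
    Using differential cross sections as simulation inputs typically comes down to sampling scattering angles from them. A rejection-sampling sketch with a toy forward-peaked cross section (illustrative only):

        import math
        import random

        def sample_angle(dcs, dcs_max):
            # Rejection-sample a polar angle theta from a differential cross
            # section dcs(theta), proposing isotropically in cos(theta).
            while True:
                theta = math.acos(2.0 * random.random() - 1.0)
                if random.random() * dcs_max < dcs(theta):
                    return theta

        dcs = lambda th: (1.0 + math.cos(th)) ** 2    # toy, peaks at theta = 0
        angles = [sample_angle(dcs, dcs_max=4.0) for _ in range(1000)]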
