Sample records for the FLUKA MC code

  1. Technical Note: Defining cyclotron-based clinical scanning proton machines in a FLUKA Monte Carlo system.

    PubMed

    Fiorini, Francesca; Schreuder, Niek; Van den Heuvel, Frank

    2018-02-01

Cyclotron-based pencil beam scanning (PBS) proton machines nowadays represent the majority and the most affordable choice for proton therapy facilities; however, their representation in Monte Carlo (MC) codes is more complex than that of passively scattered systems or synchrotron-based PBS machines. This is because degraders are used to decrease the energy from the cyclotron maximum to the desired energy, resulting in a spot size, divergence, and energy spread that depend on the amount of degradation. This manuscript outlines a generalized methodology to characterize a cyclotron-based PBS machine in a general-purpose MC code. The code can then be used to generate clinically relevant plans starting from commercial TPS plans. The described beam is produced at the Provision Proton Therapy Center (Knoxville, TN, USA) using cyclotron-based IBA Proteus Plus equipment. We characterized the Provision beam in the FLUKA MC code using the experimental commissioning data. The code was then validated against experimental data in water phantoms for single pencil beams and larger irregular fields. Comparisons with RayStation TPS plans are also presented. Comparisons of experimental, simulated, and planned dose depositions in water show that the same doses are calculated by both programs inside the target areas, while penumbra differences are found at the field edges. These differences are lower for the MC, with a γ(3%-3 mm) index never below 95%. Extensive explanations of how MC codes can be adapted to simulate cyclotron-based scanning proton machines are given, with the aim of using the MC as a TPS verification tool to check and improve clinical plans. For all tested cases, we showed that dose differences with experimental data are lower for the MC than for the TPS, implying that the created FLUKA beam model better describes the experimental beam. © 2017 The Authors. Medical Physics published by Wiley Periodicals, Inc. on behalf of the American Association of Physicists in Medicine.
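The γ(3%-3 mm) figure quoted above combines a dose-difference criterion with a distance-to-agreement (DTA) criterion. A minimal 1D sketch of the standard Low et al. global gamma index (function name, grid handling and global normalization are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def gamma_1d(ref_dose, eval_dose, x_mm, dd=0.03, dta_mm=3.0):
    """Global 1D gamma index: for each reference point, minimize the
    combined dose-difference / distance-to-agreement metric over all
    evaluated points. dd is a fraction of the maximum reference dose."""
    d_norm = dd * ref_dose.max()
    gamma = np.empty_like(ref_dose)
    for i, (xr, dr) in enumerate(zip(x_mm, ref_dose)):
        gamma[i] = np.sqrt(((x_mm - xr) / dta_mm) ** 2
                           + ((eval_dose - dr) / d_norm) ** 2).min()
    return gamma
```

The pass rate quoted in such studies is then the fraction of points with γ ≤ 1.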

  2. A model for the accurate computation of the lateral scattering of protons in water

    NASA Astrophysics Data System (ADS)

    Bellinzona, E. V.; Ciocca, M.; Embriaco, A.; Ferrari, A.; Fontana, A.; Mairani, A.; Parodi, K.; Rotondi, A.; Sala, P.; Tessonnier, T.

    2016-02-01

    A pencil beam model for the calculation of the lateral scattering in water of protons for any therapeutic energy and depth is presented. It is based on the full Molière theory, taking into account the energy loss and the effects of mixtures and compounds. Concerning the electromagnetic part, the model has no free parameters and is in very good agreement with the FLUKA Monte Carlo (MC) code. The effects of the nuclear interactions are parametrized with a two-parameter tail function, adjusted on MC data calculated with FLUKA. The model, after the convolution with the beam and the detector response, is in agreement with recent proton data in water from HIT. The model gives results with the same accuracy of the MC codes based on Molière theory, with a much shorter computing time.
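For orientation, the characteristic multiple-scattering angle that such models describe can be approximated by Highland's parametrization of Molière theory. The sketch below ignores energy loss, which the paper's full model explicitly includes; the constants are standard PDG-style values, not taken from the paper:

```python
import math

M_P = 938.272        # proton rest energy [MeV]
X0_WATER_CM = 36.08  # radiation length of water [cm] (density 1 g/cm^3)

def highland_theta0(t_mev, thickness_cm, x0_cm=X0_WATER_CM):
    """Highland approximation to the Moliere scattering angle [rad] for a
    proton of kinetic energy t_mev crossing thickness_cm of water."""
    e_tot = t_mev + M_P
    pc = math.sqrt(e_tot ** 2 - M_P ** 2)  # momentum * c [MeV]
    beta = pc / e_tot
    t = thickness_cm / x0_cm
    return (14.1 / (beta * pc)) * math.sqrt(t) * (1.0 + math.log10(t) / 9.0)
```

For a 150 MeV proton crossing 1 cm of water this gives a few milliradians, which is the scale of lateral spread a fast analytical model has to reproduce to compete with full MC transport.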

  3. A model for the accurate computation of the lateral scattering of protons in water.

    PubMed

    Bellinzona, E V; Ciocca, M; Embriaco, A; Ferrari, A; Fontana, A; Mairani, A; Parodi, K; Rotondi, A; Sala, P; Tessonnier, T

    2016-02-21

    A pencil beam model for the calculation of the lateral scattering in water of protons for any therapeutic energy and depth is presented. It is based on the full Molière theory, taking into account the energy loss and the effects of mixtures and compounds. Concerning the electromagnetic part, the model has no free parameters and is in very good agreement with the FLUKA Monte Carlo (MC) code. The effects of the nuclear interactions are parametrized with a two-parameter tail function, adjusted on MC data calculated with FLUKA. The model, after the convolution with the beam and the detector response, is in agreement with recent proton data in water from HIT. The model gives results with the same accuracy of the MC codes based on Molière theory, with a much shorter computing time.

  4. Optimization of GATE and PHITS Monte Carlo code parameters for spot scanning proton beam based on simulation with FLUKA general-purpose code

    NASA Astrophysics Data System (ADS)

    Kurosu, Keita; Das, Indra J.; Moskvin, Vadim P.

    2016-01-01

Spot scanning, owing to its superior dose-shaping capability, provides unsurpassed dose conformity, in particular for complex targets. However, the robustness of the delivered dose distribution and prescription has to be verified. Monte Carlo (MC) simulation offers significant advantages for high-precision particle therapy, especially in media containing inhomogeneities. However, the computational parameters previously established for the GATE, PHITS and FLUKA MC codes with uniform scanning proton beams need to be re-evaluated for spot scanning; that is, the relationship between the input parameters and the calculation results should be carefully scrutinized. The objective of this study was, therefore, to determine the optimal parameters for the spot scanning proton beam for both the GATE and PHITS codes, using data from a FLUKA simulation as the reference. The proton beam scanning system of the Indiana University Health Proton Therapy Center was modeled in FLUKA, and the geometry was subsequently transferred identically to GATE and PHITS. Although beam transport is managed by the spot scanning system, the spot location was always set at the center of a 600 × 600 × 300 mm3 water phantom placed after the treatment nozzle. The percentage depth dose (PDD) was computed along the central axis using 0.5 × 0.5 × 0.5 mm3 voxels in the water phantom. The PDDs and proton ranges obtained with several computational parameters were then compared with those of FLUKA, and the optimal parameters were determined from the accuracy of the proton range, the suppression of dose deviations, and the minimization of computational time. Our results indicate that the optimized parameters differ from those for uniform scanning, suggesting that a single gold standard for computational parameters cannot be set for all proton therapy applications, since the impact of the parameter settings depends on the proton irradiation technique. We therefore conclude that parameters must be customized with reference to the optimized values for the corresponding irradiation technique in order to achieve artifact-free MC simulation for computational experiments and clinical treatments.
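The proton range compared between codes in studies like this is commonly defined as the depth of the distal 80% dose level on the PDD. A hypothetical helper for extracting it (the abstract does not state the paper's exact range definition):

```python
import numpy as np

def distal_r80(depth_mm, pdd):
    """Depth of the distal 80% dose level, found by linear interpolation
    on the falling edge past the Bragg peak."""
    i_peak = int(np.argmax(pdd))
    target = 0.8 * pdd[i_peak]
    distal_d = pdd[i_peak:]
    distal_z = depth_mm[i_peak:]
    # first sample below 80% on the falling edge
    j = np.nonzero(distal_d < target)[0][0]
    # interpolate between the bracketing samples
    z1, z2 = distal_z[j - 1], distal_z[j]
    d1, d2 = distal_d[j - 1], distal_d[j]
    return z1 + (target - d1) * (z2 - z1) / (d2 - d1)
```

Comparing this single scalar per parameter set is what makes "accuracy of the proton range" a convenient optimization criterion.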

  5. A Detailed FLUKA-2005 Monte Carlo Simulation for the ATIC Detector

    NASA Technical Reports Server (NTRS)

    Gunasingha, R. M.; Fazely, A. R.; Adams, J. H.; Ahn, H. S.; Bashindzhagyan, G. L.; Batkov, K. E.; Chang, J.; Christl, M.; Ganel, O.; Guzik, T. G.

    2006-01-01

We have performed a detailed Monte Carlo (MC) calculation for the Advanced Thin Ionization Calorimeter (ATIC) detector using the MC code FLUKA-2005, which is capable of simulating particles up to 10 PeV. The ATIC detector has completed two successful balloon flights from McMurdo, Antarctica, lasting a total of more than 35 days. ATIC is designed as a multiple, long-duration balloon-flight investigation of the cosmic ray spectra from below 50 GeV to near 100 TeV total energy, using a fully active Bismuth Germanate (BGO) calorimeter. It is equipped with a large mosaic of silicon detector pixels capable of charge identification, and, as a particle tracking system, three projective layers of x-y scintillator hodoscopes were employed, above, in the middle of, and below a 0.75 nuclear interaction length graphite target. Our calculations are part of an analysis package covering both the A- and energy-dependences of different nuclei interacting with the ATIC detector. The MC simulates the responses of different components of the detector, such as the Si-matrix, the scintillator hodoscopes and the BGO calorimeter, to various nuclei. We also show comparisons of the FLUKA-2005 MC calculations with a GEANT calculation and with data for protons, He and CNO.

  6. Comparison of Fluka-2006 Monte Carlo Simulation and Flight Data for the ATIC Detector

    NASA Technical Reports Server (NTRS)

Gunasingha, R.M.; Fazely, A.R.; Adams, J.H.; Ahn, H.S.; Bashindzhagyan, G.L.; Chang, J.; Christl, M.; Ganel, O.; Guzik, T.G.; Isbert, J.; et al.

    2007-01-01

We have performed a detailed Monte Carlo (MC) simulation for the Advanced Thin Ionization Calorimeter (ATIC) detector using the MC code FLUKA-2006, which is capable of simulating particles up to 10 PeV. The ATIC detector has completed two successful balloon flights from McMurdo, Antarctica, lasting a total of more than 35 days. ATIC is designed as a multiple, long-duration balloon-flight investigation of the cosmic ray spectra from below 50 GeV to near 100 TeV total energy, using a fully active Bismuth Germanate (BGO) calorimeter. It is equipped with a large mosaic of silicon detector pixels capable of charge identification, and, for particle tracking, three projective layers of x-y scintillator hodoscopes, located above, in the middle of, and below a 0.75 nuclear interaction length graphite target. Our simulations are part of an analysis package covering both the nuclear (A) and energy dependences for different nuclei interacting in the ATIC detector. The MC simulates the response of different components of the detector, such as the Si-matrix, the scintillator hodoscopes and the BGO calorimeter, to various nuclei. We present comparisons of the FLUKA-2006 MC calculations with GEANT calculations and with the ATIC CERN data and ATIC flight data.

  7. Optimization of GATE and PHITS Monte Carlo code parameters for uniform scanning proton beam based on simulation with FLUKA general-purpose code

    NASA Astrophysics Data System (ADS)

    Kurosu, Keita; Takashina, Masaaki; Koizumi, Masahiko; Das, Indra J.; Moskvin, Vadim P.

    2014-10-01

Although three general-purpose Monte Carlo (MC) simulation tools, Geant4, FLUKA and PHITS, have been used extensively, differences in their calculation results have been reported. The major causes are the implementation of the physical models, the preset value of the ionization potential, and the definition of the maximum step size. In order to achieve artifact-free MC simulation, an optimized parameter list for each simulation system is required. Several authors have already proposed optimized lists, but those studies were performed with a simple setup, such as a water phantom only. Since particle beams undergo transport, interaction and electromagnetic processes during beam delivery, establishing an optimized parameter list for the whole beam delivery system is therefore of major importance. The purpose of this study was to determine the optimized parameter lists for GATE and PHITS using a computational model of a proton treatment nozzle. The simulation was performed with the broad scanning proton beam. The influences of the customizable parameters on the percentage depth dose (PDD) profile and the proton range were investigated by comparison with the results of FLUKA, and the optimal parameters were then determined. The PDD profile and the proton range obtained with our optimized parameter list showed characteristics different from the results obtained with the simple system. This led to the conclusion that the physical models, particle transport mechanics and geometry-based descriptions need accurate customization when planning computational experiments to achieve artifact-free MC simulation.

  8. The FLUKA Monte Carlo code coupled with the NIRS approach for clinical dose calculations in carbon ion therapy

    NASA Astrophysics Data System (ADS)

    Magro, G.; Dahle, T. J.; Molinelli, S.; Ciocca, M.; Fossati, P.; Ferrari, A.; Inaniwa, T.; Matsufuji, N.; Ytre-Hauge, K. S.; Mairani, A.

    2017-05-01

Particle therapy facilities often require Monte Carlo (MC) simulations to overcome intrinsic limitations of analytical treatment planning systems (TPS) related to the description of the mixed radiation field and of the beam interaction with tissue inhomogeneities. Some of these uncertainties may affect the computation of effective dose distributions; therefore, MC codes dedicated to particle therapy should provide both absorbed and biological doses. Two biophysical models are currently applied clinically in particle therapy: the local effect model (LEM) and the microdosimetric kinetic model (MKM). In this paper, we describe the coupling of the NIRS (National Institute of Radiological Sciences, Japan) clinical dose model to the FLUKA MC code. We moved from the implementation of the model itself to its application in clinical cases, according to the NIRS approach, in which a scaling factor is introduced to rescale the (carbon-equivalent) biological dose to a clinical dose level. A high level of agreement with published data was found when exploring a range of values for the MKM input parameters, while some differences were registered in forward recalculations of NIRS patient plans, mainly attributable to differences from the analytical TPS dose engine (taken as the reference) in describing the mixed radiation field (lateral spread and fragmentation). We present a tool which is being used at the Italian National Center for Oncological Hadrontherapy to support the comparison study between the NIRS clinical dose level and the LEM dose specification.
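Both the LEM and the MKM ultimately feed linear-quadratic (LQ) parameters into an RBE evaluation. A generic sketch of that final step, with hypothetical photon reference parameters; the model-specific computation of the ion α and β, which is where LEM and MKM actually differ, is not reproduced here:

```python
import math

def rbe_from_lq(dose_gy, alpha_ion, beta_ion, alpha_x=0.172, beta_x=0.0615):
    """RBE at a given physical dose: the photon dose producing the same
    LQ survival, divided by the ion dose. alpha_x [1/Gy] and beta_x
    [1/Gy^2] are hypothetical photon reference values."""
    # ion survival: -ln S = alpha_ion * D + beta_ion * D^2
    lethality = alpha_ion * dose_gy + beta_ion * dose_gy ** 2
    # photon dose giving the same -ln S: beta_x*Dx^2 + alpha_x*Dx = lethality
    d_x = (math.sqrt(alpha_x ** 2 + 4.0 * beta_x * lethality) - alpha_x) / (2.0 * beta_x)
    return d_x / dose_gy
```

An RBE-weighted (biological) dose is then the physical dose times this factor, to which the NIRS approach applies its additional clinical scaling.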

  9. The calculation of mass attenuation coefficients of well-known thermoluminescent dosimetric compounds at wide energy range

    NASA Astrophysics Data System (ADS)

    Ermis, Elif Ebru

    2017-02-01

The photon mass attenuation coefficients of the LiF, BaSO4, CaCO3 and CaSO4 thermoluminescent dosimetric compounds were calculated at gamma-ray energies of 100, 300, 500, 600, 800, 1000, 1500, 2000, 3000 and 5000 keV. For this purpose, the FLUKA Monte Carlo (MC) program, one of the well-known general-purpose MC codes, was used, and the obtained results were analyzed by means of the ROOT program. National Institute of Standards and Technology (NIST) values were used to benchmark the calculated values, because measured mass attenuation values for these compounds could not be found in the literature. The calculated mass attenuation coefficients were in close accordance with the NIST values. As a consequence, FLUKA was successful in calculating the mass attenuation coefficients of the most widely used thermoluminescent compounds.
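The quantity tabulated by NIST obeys simple additivity over a compound's constituent elements, which is what makes such comparisons possible. A sketch with placeholder coefficient values (the real elemental μ/ρ values come from tables or, as in the paper, from MC scoring):

```python
import math

def mixture_mass_attenuation(mu_rho_elements, weight_fractions):
    """Mass attenuation coefficient of a compound from the mixture rule
    (mu/rho)_compound = sum_i w_i * (mu/rho)_i  [cm^2/g]."""
    return sum(w * m for m, w in zip(mu_rho_elements, weight_fractions))

def transmitted_fraction(mu_rho, density_g_cm3, thickness_cm):
    """Beer-Lambert attenuation I/I0 = exp(-(mu/rho) * rho * x)."""
    return math.exp(-mu_rho * density_g_cm3 * thickness_cm)
```

In an MC calculation the same coefficient is recovered the other way around, from the simulated transmitted fraction through a known thickness.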

  10. SU-F-T-193: Evaluation of a GPU-Based Fast Monte Carlo Code for Proton Therapy Biological Optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taleei, R; Qin, N; Jiang, S

    2016-06-15

Purpose: Biological treatment plan optimization is of great interest for proton therapy. It requires extensive Monte Carlo (MC) simulations to compute physical dose and biological quantities. Recently, a gPMC package was developed for rapid MC dose calculations on a GPU platform. This work investigated its suitability for proton therapy biological optimization in terms of accuracy and efficiency. Methods: We performed simulations of a proton pencil beam with energies of 75, 150 and 225 MeV in a homogeneous water phantom using gPMC and FLUKA. Physical dose and energy spectra for each ion type on the central beam axis were scored. Relative Biological Effectiveness (RBE) was calculated using the repair-misrepair-fixation model. Microdosimetry calculations were performed using the Monte Carlo Damage Simulation (MCDS) code. Results: Ranges computed by the two codes agreed within 1 mm. The physical dose difference was less than 2.5% at the Bragg peak, and the RBE-weighted dose agreed within 5% at the Bragg peak. Differences in microdosimetric quantities such as dose-averaged lineal energy and specific energy were <10%. The simulation time per source particle with FLUKA was 0.0018 s, while gPMC was ∼600 times faster. Conclusion: The physical doses computed by FLUKA and gPMC were in good agreement. The RBE differences along the central axis were small, and the RBE-weighted dose difference was found to be acceptable. The combined accuracy and efficiency makes gPMC suitable for proton therapy biological optimization.

  11. Development of the 3DHZETRN code for space radiation protection

    NASA Astrophysics Data System (ADS)

    Wilson, John; Badavi, Francis; Slaba, Tony; Reddell, Brandon; Bahadori, Amir; Singleterry, Robert

Space radiation protection requires computationally efficient shield assessment methods that have been verified and validated. The HZETRN code is the engineering design code used for low Earth orbit dosimetric analysis and astronaut record keeping, with end-to-end validation to twenty percent in Space Shuttle and International Space Station operations. HZETRN treated diffusive leakage only at the distal surface, limiting its application to systems with a large radius of curvature. A revision of HZETRN that included forward and backward diffusion allowed neutron leakage to be evaluated at both the near and distal surfaces. That revision provided a deterministic code of high computational efficiency that was in substantial agreement with Monte Carlo (MC) codes in flat plates (at least to the degree that MC codes agree among themselves). In the present paper, the 3DHZETRN formalism, capable of evaluation in general geometry, is described. Benchmarking against MC codes (Geant4, FLUKA, MCNP6, and PHITS) in simple shapes such as spheres within spherical shells and boxes will help quantify the uncertainty. Connection of 3DHZETRN to general geometry will be discussed.

  12. ActiWiz 3 – an overview of the latest developments and their application

    NASA Astrophysics Data System (ADS)

    Vincke, H.; Theis, C.

    2018-06-01

In 2011 the ActiWiz code was developed at CERN in order to optimize the choice of materials for accelerator equipment from a radiological point of view. Since then the code has been extended to calculate complete nuclide inventories and provide evaluations with respect to radiotoxicity, inhalation doses, etc. Until now the software included only pre-defined radiation environments for CERN's high-energy proton accelerators, which were based on FLUKA Monte Carlo calculations. Eventually the decision was taken to invest in a major revamping of the code. Starting with version 3, the software is no longer limited to pre-defined radiation fields: within a few seconds it can also treat arbitrary environments for which fluence spectra are available. This has become possible through the use of roughly 100 CPU-years' worth of FLUKA Monte Carlo simulations as well as the JEFF cross-section library for neutrons below 20 MeV. The latest code version also allowed for the efficient inclusion of 42 additional radiation environments of the LHC experiments, as well as considerably more flexibility in characterizing waste from CERN's Large Electron Positron collider (LEP). New, fully integrated analysis functionalities, such as the automatic evaluation of difficult-to-measure nuclides and the rapid assessment of the temporal evolution of quantities like radiotoxicity or dose rates, make the software a powerful characterization tool complementary to general-purpose MC codes like FLUKA. In this paper an overview of these capabilities is given using recent examples from the domain of waste characterization as well as operational radiation protection.
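The temporal-evolution assessments mentioned above rest on elementary decay bookkeeping. A sketch of the single-nuclide case (ActiWiz itself also handles build-up during irradiation and full decay chains):

```python
import math

def activity_after(a0_bq, half_life_s, t_s):
    """Radioactive decay A(t) = A0 * exp(-ln2 * t / T_half).
    The building block behind dose-rate and radiotoxicity cool-down
    curves; a0_bq is the activity at the start of cooling."""
    return a0_bq * math.exp(-math.log(2.0) * t_s / half_life_s)
```

Summing such terms over every nuclide in the inventory, each weighted by its dose or toxicity coefficient, gives the time evolution of the aggregate quantity.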

  13. Interfacing MCNPX and McStas for simulation of neutron transport

    NASA Astrophysics Data System (ADS)

    Klinkby, Esben; Lauritzen, Bent; Nonbøl, Erik; Kjær Willendrup, Peter; Filges, Uwe; Wohlmuther, Michael; Gallmeier, Franz X.

    2013-02-01

Simulations of the target-moderator-reflector system at spallation sources are conventionally carried out using Monte Carlo codes such as MCNPX (Waters et al., 2007 [1]) or FLUKA (Battistoni et al., 2007; Ferrari et al., 2005 [2,3]), whereas simulations of neutron transport from the moderator and of the instrument response are performed by neutron ray-tracing codes such as McStas (Lefmann and Nielsen, 1999; Willendrup et al., 2004, 2011a,b [4-7]). The coupling between the two simulation suites typically consists of providing analytical fits of MCNPX neutron spectra to McStas. This method is generally successful but has limitations; for example, it does not allow for re-entry of neutrons into the MCNPX regime. Previous work to resolve such shortcomings includes the introduction of McStas-inspired supermirrors in MCNPX. In the present paper, different approaches to interfacing MCNPX and McStas are presented and applied to a simple test case. The direct coupling between MCNPX and McStas allows for more accurate simulations of, e.g., complex moderator geometries, backgrounds, interference between beam-lines, and shielding requirements along the neutron guides.
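A direct coupling of this kind amounts to recording each neutron's phase-space state at a hand-over plane in one code and replaying it in the other. A hypothetical minimal exchange format (real couplings use richer binary formats; all names and the CSV layout here are illustrative):

```python
import csv
from dataclasses import dataclass, astuple

@dataclass
class NeutronState:
    """One neutron at the hand-over plane between the MC code and the
    ray tracer (minimal sketch of the fields such a record needs)."""
    x: float; y: float; z: float      # position [cm]
    ux: float; uy: float; uz: float   # direction cosines
    e_mev: float                      # kinetic energy [MeV]
    t_s: float                        # time of flight [s]
    weight: float                     # statistical weight

def dump_states(states, path):
    # one CSV row per neutron crossing the hand-over plane
    with open(path, "w", newline="") as f:
        csv.writer(f).writerows(astuple(s) for s in states)

def load_states(path):
    with open(path, newline="") as f:
        return [NeutronState(*map(float, row)) for row in csv.reader(f)]
```

Re-entry into the MCNPX regime then simply means writing a second file at the plane in the opposite direction, something a fitted-spectrum source term cannot represent.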

  14. FLUKA simulation studies on in-phantom dosimetric parameters of a LINAC-based BNCT

    NASA Astrophysics Data System (ADS)

    Ghal-Eh, N.; Goudarzi, H.; Rahmani, F.

    2017-12-01

The Monte Carlo simulation code FLUKA, version 2011.2c.5, has been used to estimate the in-phantom dosimetric parameters for use in BNCT studies. The in-phantom parameters of a typical Snyder head phantom, which are necessary information prior to any clinical treatment, have been calculated with both the FLUKA and MCNPX codes, which exhibit promising agreement. The results confirm that FLUKA can be regarded as a good alternative to MCNPX in BNCT dosimetry simulations.
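In-phantom BNCT dosimetry typically reports a photon-equivalent dose built as a weighted sum of the boron, thermal-neutron (nitrogen), fast-neutron and photon dose components. A sketch with illustrative literature-style weighting factors, not values taken from the paper:

```python
def bnct_weighted_dose(d_boron, d_thermal, d_fast, d_gamma,
                       w_boron=3.8, w_thermal=3.2, w_fast=3.2, w_gamma=1.0):
    """Photon-equivalent BNCT dose [Gy-eq] as the weighted sum of its
    components [Gy]. The weights (compound factor for boron, RBE for the
    neutron components) are illustrative placeholders."""
    return (w_boron * d_boron + w_thermal * d_thermal
            + w_fast * d_fast + w_gamma * d_gamma)
```

The MC code's job in such studies is to score the four physical components separately at each depth in the head phantom; the weighting itself is applied afterwards.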

  15. FLUKA: A Multi-Particle Transport Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Ferrari, A.; Sala, P. R. (CERN; INFN, Milan)

    2005-12-14

    This report describes the 2005 version of the Fluka particle transport code. The first part introduces the basic notions, describes the modular structure of the system, and contains an installation and beginner's guide. The second part complements this initial information with details about the various components of Fluka and how to use them. It concludes with a detailed history and bibliography.

  16. The FLUKA Code: An Overview

    NASA Technical Reports Server (NTRS)

Ballarini, F.; Battistoni, G.; Campanella, M.; Carboni, M.; Cerutti, F.; Empl, A.; Fasso, A.; Ferrari, A.; Gadioli, E.; Garzelli, M. V.; et al.

    2006-01-01

    FLUKA is a multipurpose Monte Carlo code which can transport a variety of particles over a wide energy range in complex geometries. The code is a joint project of INFN and CERN: part of its development is also supported by the University of Houston and NASA. FLUKA is successfully applied in several fields, including but not only, particle physics, cosmic ray physics, dosimetry, radioprotection, hadron therapy, space radiation, accelerator design and neutronics. The code is the standard tool used at CERN for dosimetry, radioprotection and beam-machine interaction studies. Here we give a glimpse into the code physics models with a particular emphasis to the hadronic and nuclear sector.

  17. Calculation of water equivalent ratio of several dosimetric materials in proton therapy using FLUKA code and SRIM program.

    PubMed

    Akbari, Mahmoud Reza; Yousefnia, Hassan; Mirrezaei, Ehsan

    2014-08-01

Water equivalent ratios (WER) were calculated for different proton energies in polymethyl methacrylate (PMMA), polystyrene (PS) and aluminum (Al) using the FLUKA and SRIM codes. The results were compared with analytical, experimental and simulated SEICS code data obtained from the literature. The biggest difference between the codes was 3.19%, 1.9% and 0.67% for Al, PMMA and PS, respectively. FLUKA and SEICS showed the greatest agreement with the experimental data (≤0.77% difference for PMMA and ≤1.08% for Al). Copyright © 2014 Elsevier Ltd. All rights reserved.
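One common definition of the WER is the ratio of the proton range in water to the range in the material at the same beam energy; a slab's water-equivalent thickness then follows by scaling. A sketch of that bookkeeping (the abstract does not state the paper's exact convention):

```python
def water_equivalent_ratio(range_water_cm, range_material_cm):
    """WER as the ratio of the proton range in water to that in the
    material at the same beam energy (one common convention)."""
    return range_water_cm / range_material_cm

def water_equivalent_thickness(thickness_cm, wer):
    """Water-equivalent thickness of a slab: physical thickness * WER."""
    return thickness_cm * wer
```

The ranges themselves are what FLUKA and SRIM each compute; the percent differences quoted above are differences in the resulting ratios.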

  18. Status of the Space Radiation Monte Carlo Simulation Based on FLUKA and ROOT

    NASA Technical Reports Server (NTRS)

    Andersen, Victor; Carminati, Federico; Empl, Anton; Ferrari, Alfredo; Pinsky, Lawrence; Sala, Paola; Wilson, Thomas L.

    2002-01-01

The NASA-funded project, reported on at the first IWSSRR in Arona, to develop a Monte Carlo simulation program for the space radiation environment based on the FLUKA and ROOT codes is well into its second year of development, and considerable progress has been made. The general tasks required to achieve the final goals include the addition of heavy-ion interactions into the FLUKA code and the provision of a ROOT-based interface to FLUKA. The most significant progress to date is the incorporation of the DPMJET event generator within FLUKA to handle heavy-ion interactions for incident projectile energies greater than 3 GeV/A. The ongoing effort intends to extend the treatment of these interactions down to 10 MeV, and at present two alternative approaches are being explored. The ROOT interface is being pursued in conjunction with the CERN LHC ALICE software team through an adaptation of their existing AliROOT software. As a check on the validity of the code, a simulation of the recent data taken by the ATIC experiment is underway.

  19. Monte Carlo simulations of a low energy proton beamline for radiobiological experiments.

    PubMed

    Dahle, Tordis J; Rykkelid, Anne Marit; Stokkevåg, Camilla H; Mairani, Andrea; Görgen, Andreas; Edin, Nina J; Rørvik, Eivind; Fjæra, Lars Fredrik; Malinen, Eirik; Ytre-Hauge, Kristian S

    2017-06-01

In order to determine the relative biological effectiveness (RBE) of protons with high accuracy, radiobiological experiments with detailed knowledge of the linear energy transfer (LET) are needed. Cell survival data from high-LET protons are sparse, and experiments with low energy protons to achieve high LET values are therefore required. The aim of this study was to quantify LET distributions from a low energy proton beam by using Monte Carlo (MC) simulations, and to compare them with those of a proton beam representing a typical minimum energy available at clinical facilities. A Markus ionization chamber and Gafchromic films were employed in dose measurements in the proton beam at the Oslo Cyclotron Laboratory. Dose profiles were also calculated using the FLUKA MC code, with the MC beam parameters optimized based on comparisons with the measurements. LET spectra and dose-averaged LET (LETd) were then estimated in FLUKA and compared with the LET calculated for an 80 MeV proton beam. The initial proton energy was determined to be 15.5 MeV, with a Gaussian energy distribution of 0.2% full width at half maximum (FWHM) and a Gaussian lateral spread of 2 mm FWHM. The LETd increased with depth, from approximately 5 keV/μm at the entrance to approximately 40 keV/μm in the distal dose fall-off. The LETd values were considerably higher, and the LET spectra much narrower, than the corresponding spectra from the 80 MeV beam. MC simulations accurately modeled the dose distribution from the proton beam and could be used to estimate the LET at any position in the setup. The setup can be used to study the RBE of protons at high LETd, which is not achievable in clinical proton therapy facilities.
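The dose-averaged LET reported above weights each LET value by the dose it deposits, which for a fluence spectrum is proportional to fluence times LET. A sketch of that average (FLUKA's actual scoring and binning details differ):

```python
import numpy as np

def dose_averaged_let(let_kev_um, fluence):
    """Dose-averaged LET from a binned fluence spectrum:
    LET_d = sum(phi_i * L_i^2) / sum(phi_i * L_i)."""
    let = np.asarray(let_kev_um, dtype=float)
    phi = np.asarray(fluence, dtype=float)
    dose_weight = phi * let          # dose per bin ~ fluence * LET
    return float((dose_weight * let).sum() / dose_weight.sum())
```

Because of the squared LET in the numerator, LETd is pulled toward the high-LET tail of the spectrum, which is why it rises so steeply in the distal fall-off.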

  20. Space Applications of the FLUKA Monte-Carlo Code: Lunar and Planetary Exploration

    NASA Technical Reports Server (NTRS)

Anderson, V.; Ballarini, F.; Battistoni, G.; Campanella, M.; Carboni, M.; Cerutti, F.; Elkhayari, N.; Empl, A.; Fasso, A.; Ferrari, A.; et al.

    2004-01-01

    NASA has recognized the need for making additional heavy-ion collision measurements at the U.S. Brookhaven National Laboratory in order to support further improvement of several particle physics transport-code models for space exploration applications. FLUKA has been identified as one of these codes and we will review the nature and status of this investigation as it relates to high-energy heavy-ion physics.

  21. Proton therapy treatment monitoring with the DoPET system: activity range, positron emitters evaluation and comparison with Monte Carlo predictions

    NASA Astrophysics Data System (ADS)

    Muraro, S.; Battistoni, G.; Belcari, N.; Bisogni, M. G.; Camarlinghi, N.; Cristoforetti, L.; Del Guerra, A.; Ferrari, A.; Fracchiolla, F.; Morrocchi, M.; Righetto, R.; Sala, P.; Schwarz, M.; Sportelli, G.; Topi, A.; Rosso, V.

    2017-12-01

Ion beam irradiations can deliver conformal dose distributions while minimizing damage to healthy tissues, thanks to their characteristic dose profiles. Nevertheless, the location of the Bragg peak can be affected by different sources of range uncertainty, so treatment verification is a critical issue. During treatment delivery, nuclear interactions between the ions and the irradiated tissues generate β+ emitters: the detection of this activity signal can be used for treatment monitoring, provided an expected activity distribution is available for comparison. Monte Carlo (MC) codes are widely used in the particle therapy community to evaluate radiation transport and interaction with matter. In this work, the FLUKA MC code was used to simulate the experimental conditions of irradiations performed at the Proton Therapy Center in Trento (Italy). Several mono-energetic pencil beams were delivered to phantoms mimicking human tissues. The activity signals were acquired with a PET system (DoPET) based on two planar heads, designed to be installed along the beam line so that data can also be acquired during the irradiation. Different acquisitions are analyzed and compared with the MC predictions, with a special focus on validating the PET detectors' response for activity range verification.
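Comparing measured and predicted activity involves, at a minimum, accounting for decay of the β+ emitters over the acquisition window. A sketch of that integral for a single nuclide (geometry and detector efficiency factors, and the mix of 11C, 15O, etc., are omitted):

```python
import math

def decays_during_scan(a0_bq, half_life_s, scan_s):
    """Number of beta+ decays collected in a scan of length scan_s from
    an initial activity a0_bq:
    N = integral of A0*exp(-lambda*t) dt = (A0/lambda)*(1 - exp(-lambda*T))."""
    lam = math.log(2.0) / half_life_s
    return a0_bq / lam * (1.0 - math.exp(-lam * scan_s))
```

Acquiring during the irradiation, as DoPET does, shifts this bookkeeping toward the short-lived emitters that would otherwise be lost before an off-line scan.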

  22. Radiation Protection Studies for Medical Particle Accelerators using Fluka Monte Carlo Code.

    PubMed

    Infantino, Angelo; Cicoria, Gianfranco; Lucconi, Giulia; Pancaldi, Davide; Vichi, Sara; Zagni, Federico; Mostacci, Domiziano; Marengo, Mario

    2017-04-01

Radiation protection (RP) in the use of medical cyclotrons involves many aspects, both in routine use and in the decommissioning of a site. Guidelines for site planning and installation, as well as for RP assessment, are given in international documents; however, the latter typically offer analytical methods for the calculation of shielding and material activation, in approximate or idealised geometry set-ups. The availability of Monte Carlo (MC) codes with accurate, up-to-date libraries for the transport and interaction of neutrons and charged particles at energies below 250 MeV, together with the continuously increasing power of modern computers, makes the systematic use of simulations with realistic geometries possible, yielding equipment- and site-specific evaluations of the source terms, shielding requirements and all quantities relevant to RP at the same time. In this work, the well-known FLUKA MC code was used to simulate different aspects of RP in the use of biomedical accelerators, particularly for the production of medical radioisotopes. In the context of the Young Professionals Award, held at the IRPA 14 conference, only a part of the complete work is presented. In particular, a simulation of the GE PETtrace cyclotron (16.5 MeV) installed at the S. Orsola-Malpighi University Hospital evaluated the effective dose distribution around the equipment; the effective number of neutrons produced per incident proton and their spectral distribution; the activation of the structure of the cyclotron and of the vault walls; and the activation of the ambient air, in particular the production of 41Ar. The simulations were validated, in terms of the physical and transport parameters to be used in the energy range of interest, through an extensive measurement campaign of the neutron ambient dose equivalent using a rem-counter and TLD dosemeters. The validated model was then used in the design and licensing request of a new Positron Emission Tomography facility. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  3. The FLUKA code for space applications: recent developments

    NASA Technical Reports Server (NTRS)

    Andersen, V.; Ballarini, F.; Battistoni, G.; Campanella, M.; Carboni, M.; Cerutti, F.; Empl, A.; Fasso, A.; Ferrari, A.; Gadioli, E.; et al.

    2004-01-01

    The FLUKA Monte Carlo transport code is widely used for fundamental research, radioprotection and dosimetry, hybrid nuclear energy systems and cosmic ray calculations. The validity of its physical models has been benchmarked against a variety of experimental data over a wide range of energies, from accelerator data to cosmic ray showers in the Earth's atmosphere. The code is presently undergoing several developments in order to better fit the needs of space applications. The generation of particle spectra according to up-to-date cosmic ray data, as well as the effects of solar and geomagnetic modulation, have been implemented and already successfully applied to a variety of problems. The implementation of suitable models for heavy ion nuclear interactions has reached an operational stage. At medium/high energies FLUKA uses the DPMJET model. The major task of incorporating heavy ion interactions from a few GeV/n down to the threshold for inelastic collisions is also progressing, and promising results have been obtained using a modified version of the RQMD-2.4 code. This interim solution is now fully operational, while waiting for the development of new models based on the FLUKA hadron-nucleus interaction code, a newly developed QMD code, and the implementation of Boltzmann master equation theory for low energy ion interactions. © 2004 COSPAR. Published by Elsevier Ltd. All rights reserved.

  4. Calculation of electron and isotopes dose point kernels with fluka Monte Carlo code for dosimetry in nuclear medicine therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Botta, F; Di Dia, A; Pedroli, G

    The calculation of patient-specific dose distributions can be achieved by Monte Carlo simulations or by analytical methods. In this study, the FLUKA Monte Carlo code has been considered for use in nuclear medicine dosimetry. Up to now, FLUKA has mainly been dedicated to other fields, namely high energy physics, radiation protection, and hadrontherapy. When first employing a Monte Carlo code for nuclear medicine dosimetry, its results concerning electron transport at energies typical of nuclear medicine applications need to be verified. This is commonly achieved by calculating a representative parameter and comparing it with reference data. The dose point kernel (DPK), quantifying the energy deposition all around a point isotropic source, is often chosen for this purpose. Methods: FLUKA DPKs have been calculated in both water and compact bone for monoenergetic electrons (10 keV-3 MeV) and for beta-emitting isotopes commonly used for therapy (89Sr, 90Y, 131I, 153Sm, 177Lu, 186Re, and 188Re). Point isotropic sources have been simulated at the center of a water (bone) sphere, and the deposited energy has been tallied in concentric shells. FLUKA outcomes have been compared to PENELOPE v.2008 results, calculated in this study as well. Moreover, in the case of monoenergetic electrons in water, comparison with data from the literature (ETRAN, GEANT4, MCNPX) has been made.
    Maximum percentage differences within 0.8·RCSDA and 0.9·RCSDA for monoenergetic electrons (RCSDA being the continuous slowing down approximation range) and within 0.8·X90 and 0.9·X90 for isotopes (X90 being the radius of the sphere in which 90% of the emitted energy is absorbed) have been computed, together with the average percentage difference within 0.9·RCSDA and 0.9·X90 for electrons and isotopes, respectively. Results: Concerning monoenergetic electrons, within 0.8·RCSDA (where 90%-97% of the particle energy is deposited), FLUKA and PENELOPE agree mostly within 7%, except for 10 and 20 keV electrons (12% in water, 8.3% in bone). The discrepancies between FLUKA and the other codes are of the same order of magnitude as those observed when comparing the other codes among themselves, and can be attributed to the different simulation algorithms. When considering the beta spectra, the discrepancies are notably reduced: within 0.9·X90, FLUKA and PENELOPE differ by less than 1% in water and less than 2% in bone for all the isotopes considered here. Complete data on the FLUKA DPKs are given as Supplementary Material as a tool to perform dosimetry by analytical point kernel convolution. Conclusions: FLUKA provides reliable results when transporting electrons in the low energy range, proving to be an adequate tool for nuclear medicine dosimetry.
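    As a rough illustration of the comparison methodology described above, a scaled DPK can be built from the energy tallied in concentric shells and two kernels compared within a fraction of RCSDA. This is only a sketch under assumed conventions (uniform shell tallies, kernel scaled as j(r/RCSDA) = (dE/dr)·RCSDA/E0), not the authors' actual implementation:

```python
def scaled_dpk(shell_energies, shell_edges, e_total, r_csda):
    """Scaled dose point kernel j(r/R_CSDA) = (dE/dr) * R_CSDA / E_total
    from energy tallied in concentric spherical shells around a point source."""
    j = []
    for i, e in enumerate(shell_energies):
        dr = shell_edges[i + 1] - shell_edges[i]
        j.append((e / dr) * r_csda / e_total)
    return j

def max_pct_diff(kernel_a, kernel_b, mid_radii, r_limit):
    """Maximum percentage difference |a-b|/b within r < r_limit (e.g. 0.9*R_CSDA)."""
    diffs = [abs(a - b) / b * 100.0
             for a, b, r in zip(kernel_a, kernel_b, mid_radii)
             if r < r_limit and b > 0.0]
    return max(diffs)
```

    With this scaling, summing j over the shell widths (in units of RCSDA) recovers the fraction of the emitted energy absorbed within the outermost shell.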

  5. Monte Carlo calculations of positron emitter yields in proton radiotherapy.

    PubMed

    Seravalli, E; Robert, C; Bauer, J; Stichelbaut, F; Kurz, C; Smeets, J; Van Ngoc Ty, C; Schaart, D R; Buvat, I; Parodi, K; Verhaegen, F

    2012-03-21

    Positron emission tomography (PET) is a promising tool for monitoring the three-dimensional dose distribution in charged particle radiotherapy. PET imaging during or shortly after proton treatment is based on the detection of annihilation photons following the β+-decay of radionuclides resulting from nuclear reactions in the irradiated tissue. Therapy monitoring is achieved by comparing the measured spatial distribution of irradiation-induced β+-activity with the predicted distribution based on the treatment plan. The accuracy of the calculated distribution depends on the correctness of the computational models, implemented in the employed Monte Carlo (MC) codes, that describe the interactions of the charged particle beam with matter and the production of radionuclides and secondary particles. However, no well-established theoretical models exist for predicting the nuclear interactions, so phenomenological models are typically used, based on parameters derived from experimental data. Unfortunately, the experimental data presently available are insufficient to validate such phenomenological hadronic interaction models. Hence, a comparison among the models used by the different MC packages is desirable. In this work, starting from a common geometry, we compare the performance of the MCNPX, GATE and PHITS MC codes in predicting the amount and spatial distribution of proton-induced activity at therapeutic energies to the already experimentally validated PET modelling based on the FLUKA MC code. In particular, we show how the amount of β+-emitters produced in tissue-like media depends on the physics model and cross-section data used to describe the proton nuclear interactions, thus calling for future experimental campaigns aiming at supporting improvements of MC modelling for clinical application of PET monitoring. © 2012 Institute of Physics and Engineering in Medicine
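    The depth dependence of β+-emitter production that these codes predict amounts to folding a reaction cross section with the proton fluence along depth. A minimal thin-slab sketch of that fold (all inputs hypothetical; real MC codes track energy loss, fluence attenuation and many reaction channels simultaneously):

```python
def emitter_yield(n_protons, atom_density_cm3, sigma_mb, energy_at_depth, depth_grid_cm):
    """Thin-slab estimate of beta+ emitter production along depth:
    N = n_p * n_atom * sum_i sigma(E(z_i)) * dz_i  (one channel, no fluence loss)."""
    total = 0.0
    for i in range(len(depth_grid_cm) - 1):
        dz = depth_grid_cm[i + 1] - depth_grid_cm[i]
        e_mid = energy_at_depth(0.5 * (depth_grid_cm[i] + depth_grid_cm[i + 1]))
        total += n_protons * atom_density_cm3 * sigma_mb(e_mid) * 1e-27 * dz  # mb -> cm^2
    return total
```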

  6. Calculation of electron and isotopes dose point kernels with fluka Monte Carlo code for dosimetry in nuclear medicine therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Botta, F.; Mairani, A.; Battistoni, G.

    Purpose: The calculation of patient-specific dose distributions can be achieved by Monte Carlo simulations or by analytical methods. In this study, the FLUKA Monte Carlo code has been considered for use in nuclear medicine dosimetry. Up to now, FLUKA has mainly been dedicated to other fields, namely high energy physics, radiation protection, and hadrontherapy. When first employing a Monte Carlo code for nuclear medicine dosimetry, its results concerning electron transport at energies typical of nuclear medicine applications need to be verified. This is commonly achieved by calculating a representative parameter and comparing it with reference data. The dose point kernel (DPK), quantifying the energy deposition all around a point isotropic source, is often chosen for this purpose. Methods: FLUKA DPKs have been calculated in both water and compact bone for monoenergetic electrons (10 keV-3 MeV) and for beta-emitting isotopes commonly used for therapy (89Sr, 90Y, 131I, 153Sm, 177Lu, 186Re, and 188Re). Point isotropic sources have been simulated at the center of a water (bone) sphere, and the deposited energy has been tallied in concentric shells. FLUKA outcomes have been compared to PENELOPE v.2008 results, calculated in this study as well. Moreover, in the case of monoenergetic electrons in water, comparison with data from the literature (ETRAN, GEANT4, MCNPX) has been made. Maximum percentage differences within 0.8·RCSDA and 0.9·RCSDA for monoenergetic electrons (RCSDA being the continuous slowing down approximation range) and within 0.8·X90 and 0.9·X90 for isotopes (X90 being the radius of the sphere in which 90% of the emitted energy is absorbed) have been computed, together with the average percentage difference within 0.9·RCSDA and 0.9·X90 for electrons and isotopes, respectively.
    Results: Concerning monoenergetic electrons, within 0.8·RCSDA (where 90%-97% of the particle energy is deposited), FLUKA and PENELOPE agree mostly within 7%, except for 10 and 20 keV electrons (12% in water, 8.3% in bone). The discrepancies between FLUKA and the other codes are of the same order of magnitude as those observed when comparing the other codes among themselves, and can be attributed to the different simulation algorithms. When considering the beta spectra, the discrepancies are notably reduced: within 0.9·X90, FLUKA and PENELOPE differ by less than 1% in water and less than 2% in bone for all the isotopes considered here. Complete data on the FLUKA DPKs are given as Supplementary Material as a tool to perform dosimetry by analytical point kernel convolution. Conclusions: FLUKA provides reliable results when transporting electrons in the low energy range, proving to be an adequate tool for nuclear medicine dosimetry.

  7. Phase Space Generation for Proton and Carbon Ion Beams for External Users' Applications at the Heidelberg Ion Therapy Center.

    PubMed

    Tessonnier, Thomas; Marcelos, Tiago; Mairani, Andrea; Brons, Stephan; Parodi, Katia

    2015-01-01

    In the field of radiation therapy, accurate and robust dose calculation is required. For this purpose, precise modeling of the irradiation system and reliable computational platforms are needed. At the Heidelberg Ion Therapy Center (HIT), the beamline has already been modeled in the FLUKA Monte Carlo (MC) code. However, this model is confidential and was not available to external teams. The main goal of this study was to efficiently create phase space (PS) files for proton and carbon ion beams, for all energies and foci available at HIT. A PS records the characteristics of each particle (charge, mass, energy, coordinates, direction cosines, generation) at a certain position along the beam path. To achieve this goal while keeping a reasonable data size and maintaining the required accuracy of the calculation, we developed a new approach to beam PS generation with the MC code FLUKA. The PSs were generated using an infinitely narrow beam, recording the desired quantities after the last element of the beamline and discriminating between primaries and secondaries. In this way, a unique PS can be used for each energy to accommodate the different foci, by combining the narrow-beam scenario with random sampling of its theoretical Gaussian beam in vacuum. A PS can also reproduce the different patterns from the delivery system when properly combined with the beam scanning information. MC simulations using the PSs have been compared to simulations including the full beamline geometry and found to be in very good agreement for several cases (depth dose distributions, lateral dose profiles), with relative dose differences below 0.5%. This approach has also been compared with measured data of ion beams with different energies and foci, resulting in very satisfactory agreement.
Hence, the proposed approach was able to fulfill the different requirements and has demonstrated its capability for application to clinical treatment fields. It also offers a powerful tool to perform investigations on the contribution of primary and secondary particles produced in the beamline. These PSs are already made available to external teams upon request, to support interpretation of their measurements.
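    The focus-accommodation step described above, broadening a narrow-beam phase-space record by sampling a Gaussian transverse offset in vacuum, might be sketched as follows (the field names and units are hypothetical; the actual HIT PS format certainly differs):

```python
import random

def sample_focus(ps_particle, sigma_x_mm, sigma_y_mm, rng):
    """Broaden one narrow-beam phase-space particle to a requested focus by
    adding a Gaussian transverse offset (direction cosines left untouched)."""
    return {**ps_particle,
            "x": ps_particle["x"] + rng.gauss(0.0, sigma_x_mm),
            "y": ps_particle["y"] + rng.gauss(0.0, sigma_y_mm)}
```

    Because the offset is drawn independently of the recorded particle, one narrow-beam PS per energy suffices for every focus.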

  8. Monte carlo simulations of the n_TOF lead spallation target with the Geant4 toolkit: A benchmark study

    NASA Astrophysics Data System (ADS)

    Lerendegui-Marco, J.; Cortés-Giraldo, M. A.; Guerrero, C.; Quesada, J. M.; Meo, S. Lo; Massimi, C.; Barbagallo, M.; Colonna, N.; Mancussi, D.; Mingrone, F.; Sabaté-Gilarte, M.; Vannini, G.; Vlachoudis, V.; Aberle, O.; Andrzejewski, J.; Audouin, L.; Bacak, M.; Balibrea, J.; Bečvář, F.; Berthoumieux, E.; Billowes, J.; Bosnar, D.; Brown, A.; Caamaño, M.; Calviño, F.; Calviani, M.; Cano-Ott, D.; Cardella, R.; Casanovas, A.; Cerutti, F.; Chen, Y. H.; Chiaveri, E.; Cortés, G.; Cosentino, L.; Damone, L. A.; Diakaki, M.; Domingo-Pardo, C.; Dressler, R.; Dupont, E.; Durán, I.; Fernández-Domínguez, B.; Ferrari, A.; Ferreira, P.; Finocchiaro, P.; Göbel, K.; Gómez-Hornillos, M. B.; García, A. R.; Gawlik, A.; Gilardoni, S.; Glodariu, T.; Gonçalves, I. F.; González, E.; Griesmayer, E.; Gunsing, F.; Harada, H.; Heinitz, S.; Heyse, J.; Jenkins, D. G.; Jericha, E.; Käppeler, F.; Kadi, Y.; Kalamara, A.; Kavrigin, P.; Kimura, A.; Kivel, N.; Kokkoris, M.; Krtička, M.; Kurtulgil, D.; Leal-Cidoncha, E.; Lederer, C.; Leeb, H.; Lonsdale, S. J.; Macina, D.; Marganiec, J.; Martínez, T.; Masi, A.; Mastinu, P.; Mastromarco, M.; Maugeri, E. A.; Mazzone, A.; Mendoza, E.; Mengoni, A.; Milazzo, P. M.; Musumarra, A.; Negret, A.; Nolte, R.; Oprea, A.; Patronis, N.; Pavlik, A.; Perkowski, J.; Porras, I.; Praena, J.; Radeck, D.; Rauscher, T.; Reifarth, R.; Rout, P. C.; Rubbia, C.; Ryan, J. A.; Saxena, A.; Schillebeeckx, P.; Schumann, D.; Smith, A. G.; Sosnin, N. V.; Stamatopoulos, A.; Tagliente, G.; Tain, J. L.; Tarifeño-Saldivia, A.; Tassan-Got, L.; Valenta, S.; Variale, V.; Vaz, P.; Ventura, A.; Vlastou, R.; Wallner, A.; Warren, S.; Woods, P. J.; Wright, T.; Žugec, P.

    2017-09-01

    Monte Carlo (MC) simulations are an essential tool to determine fundamental features of a neutron beam, such as the neutron flux or the γ-ray background, that sometimes cannot be measured, or at least not at every position or energy range. Until recently, the most widely used MC codes in this field had been MCNPX and FLUKA. However, the Geant4 toolkit has also become a competitive code for the transport of neutrons after the development of the native Geant4 format for neutron data libraries, G4NDL. In this context, we present the Geant4 simulations of the neutron spallation target of the n_TOF facility at CERN, done with version 10.1.1 of the toolkit. The first goal was the validation of the intra-nuclear cascade models implemented in the code using, as a benchmark, the characteristics of the neutron beam measured at the first experimental area (EAR1), especially the neutron flux and energy distribution, and the time distribution of neutrons of equal kinetic energy, the so-called Resolution Function. The second goal was the development of a Monte Carlo tool aimed at providing useful calculations for both the analysis and planning of the upcoming measurements at the new experimental area (EAR2) of the facility.
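    At a time-of-flight facility such as n_TOF, the neutron energy scale underlying both the flux and the Resolution Function comes from the relativistic time-of-flight relation. A small sketch (the 185 m used in the examples is only the nominal EAR1 flight-path scale):

```python
import math

C_M_PER_US = 299.792458   # speed of light in m/us
MN_MEV = 939.56542        # neutron rest energy in MeV

def neutron_energy_mev(flight_path_m, tof_us):
    """Relativistic neutron kinetic energy E = (gamma - 1) * m_n c^2
    from the measured time of flight over a known path."""
    beta = flight_path_m / (tof_us * C_M_PER_US)
    gamma = 1.0 / math.sqrt(1.0 - beta * beta)
    return (gamma - 1.0) * MN_MEV
```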

  9. SU-E-T-254: Optimization of GATE and PHITS Monte Carlo Code Parameters for Uniform Scanning Proton Beam Based On Simulation with FLUKA General-Purpose Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurosu, K; Department of Medical Physics & Engineering, Osaka University Graduate School of Medicine, Osaka; Takashina, M

    Purpose: Monte Carlo codes are becoming important tools for proton beam dosimetry. However, the relationships between the customizing parameters and the percentage depth dose (PDD) for the GATE and PHITS codes have not been reported; here they are studied for PDD and proton range in comparison with the FLUKA code and experimental data. Methods: The beam delivery system of the Indiana University Health Proton Therapy Center was modeled for the uniform scanning beam in FLUKA and transferred identically into GATE and PHITS. This computational model was built from the blueprint and validated with the commissioning data. The three parameters evaluated are the maximum step size, the cut-off energy, and the physics and transport models. The dependence of the PDDs on the customizing parameters was compared with the published results of previous studies. Results: The optimal parameters for the simulation of the whole beam delivery system were defined by referring to the calculation results obtained with each parameter. Although the PDDs from FLUKA and the experimental data show good agreement, those of GATE and PHITS obtained with our optimal parameters show a minor discrepancy. The measured proton range R90 was 269.37 mm, compared to the calculated ranges of 269.63 mm, 268.96 mm, and 270.85 mm with FLUKA, GATE and PHITS, respectively. Conclusion: We evaluated the dependence of the results for PDDs obtained with the GATE and PHITS general-purpose Monte Carlo codes on the customizing parameters by using the whole computational model of the treatment nozzle. The optimal parameters for the simulation were then defined by referring to the calculation results. The physics model, particle transport mechanics and the different geometry-based descriptions need accurate customization in the three simulation codes to agree with experimental data for artifact-free Monte Carlo simulation.
    This study was supported by Grants-in-Aid for Cancer Research (H22-3rd Term Cancer Control-General-043) from the Ministry of Health, Labor and Welfare of Japan, Grants-in-Aid for Scientific Research (No. 23791419), and JSPS Core-to-Core program (No. 23003). The authors have no conflict of interest.
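    The R90 figure quoted above is conventionally read off the distal falloff of the depth-dose curve. A generic sketch of that lookup (simple linear interpolation; not the commissioning software used at the facility):

```python
def r90_mm(depths_mm, doses):
    """Distal depth at which the dose falls to 90% of the maximum,
    found by linear interpolation beyond the Bragg peak."""
    d_max = max(doses)
    target = 0.9 * d_max
    for i in range(doses.index(d_max), len(doses) - 1):
        if doses[i] >= target >= doses[i + 1]:
            frac = (doses[i] - target) / (doses[i] - doses[i + 1])
            return depths_mm[i] + frac * (depths_mm[i + 1] - depths_mm[i])
    return None  # distal falloff not covered by the scan
```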

  10. Source terms, shielding calculations and soil activation for a medical cyclotron.

    PubMed

    Konheiser, J; Naumann, B; Ferrari, A; Brachem, C; Müller, S E

    2016-12-01

    Calculations of the shielding and estimates of soil activation for a medical cyclotron are presented in this work. Based on the neutron source term from the 18O(p,n)18F reaction produced by a 28 MeV proton beam, neutron and gamma dose rates outside the building were estimated with the Monte Carlo code MCNP6 (Goorley et al 2012 Nucl. Technol. 180 298-315). The neutron source term was calculated with the MCNP6 code and the FLUKA (Ferrari et al 2005 INFN/TC_05/11, SLAC-R-773) code, as well as with data supplied by the manufacturer. The MCNP and FLUKA calculations yielded comparable results, while the neutron yield obtained using the manufacturer-supplied information is about a factor of 5 smaller. The difference is attributed to missing channels in the manufacturer-supplied neutron source term, which considers only the 18O(p,n)18F reaction, whereas the MCNP and FLUKA calculations include additional neutron reaction channels. Soil activation was calculated using the FLUKA code. The estimated dose rate based on the MCNP6 calculations in the public area is about 0.035 µSv h⁻¹ and thus significantly below the reference value of 0.5 µSv h⁻¹ (2011 Strahlenschutzverordnung, 9. Auflage vom 01.11.2011, Bundesanzeiger Verlag). After 5 years of continuous beam operation and a subsequent decay time of 30 d, the activity concentration of the soil is about 0.34 Bq g⁻¹.
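    The quoted soil activity after 5 years of operation and 30 d of cooling follows the usual buildup-and-decay form; a generic sketch (the production rate and nuclide mix are whatever the transport code predicts, not values assumed here):

```python
import math

def activity_bq(production_rate_per_s, half_life_s, t_irr_s, t_decay_s):
    """Buildup during irradiation, then decay:
    A = P * (1 - exp(-lambda * t_irr)) * exp(-lambda * t_decay)."""
    lam = math.log(2.0) / half_life_s
    return (production_rate_per_s
            * (1.0 - math.exp(-lam * t_irr_s))
            * math.exp(-lam * t_decay_s))
```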

  11. Stopping power and dose calculations with analytical and Monte Carlo methods for protons and prompt gamma range verification

    NASA Astrophysics Data System (ADS)

    Usta, Metin; Tufan, Mustafa Çağatay; Aydın, Güral; Bozkurt, Ahmet

    2018-07-01

    In this study, we have performed stopping power, depth dose, and range verification calculations for proton beams using the dielectric and Bethe-Bloch theories and the FLUKA, Geant4 and MCNPX Monte Carlo codes. For the analytical studies, the Drude model was applied in the dielectric theory, and the effective charge approach with Roothaan-Hartree-Fock charge densities was used in the Bethe theory. In the simulations, different setup parameters were selected to evaluate the performance of the three distinct Monte Carlo codes. The lung and breast tissues investigated are associated with the most common types of cancer throughout the world. The results were compared with each other and with the available data in the literature. In addition, the obtained results were verified against prompt gamma range data. For both stopping power values and depth-dose distributions, it was found that the Monte Carlo values give better results than the analytical ones, while the results that agree best with ICRU data in terms of stopping power are those of the effective charge approach among the analytical methods and of the FLUKA code among the MC packages. In the depth-dose distributions of the examined tissues, although the Bragg curves for the Monte Carlo codes almost overlap, the analytical ones show significant deviations that become more pronounced with increasing energy. Verification against prompt gamma results was attempted for 100-200 MeV protons, which are regarded as important for proton therapy. The analytical results are within 2%-5% and the Monte Carlo values within 0%-2% of those of the prompt gammas.
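    For orientation, the first-order Bethe formula that underlies such stopping power calculations is compact enough to sketch directly (water-like defaults; no shell, density or Barkas corrections, so this is only a rough approximation, not any of the methods used in the paper):

```python
import math

ME_C2_MEV = 0.510998950    # electron rest energy
MP_C2_MEV = 938.272088     # proton rest energy
K_MEV_CM2_MOL = 0.307075   # 4*pi*N_A*r_e^2*m_e*c^2

def bethe_mass_stopping(e_kin_mev, z_over_a=0.5551, i_ev=75.0):
    """First-order Bethe mass stopping power for protons, in MeV cm^2/g:
    S = K * (Z/A) / beta^2 * [ln(2 m_e c^2 beta^2 gamma^2 / I) - beta^2]."""
    gamma = 1.0 + e_kin_mev / MP_C2_MEV
    beta2 = 1.0 - 1.0 / (gamma * gamma)
    arg = 2.0 * ME_C2_MEV * 1e6 * beta2 * gamma * gamma / i_ev  # dimensionless (eV/eV)
    return K_MEV_CM2_MOL * z_over_a / beta2 * (math.log(arg) - beta2)
```

    For 100 MeV protons in water this gives about 7.3 MeV cm²/g, close to tabulated values; at low energies the missing corrections become important.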

  12. Meeting Radiation Protection Requirements and Reducing Spacecraft Mass - A Multifunctional Materials Approach

    NASA Technical Reports Server (NTRS)

    Atwell, William; Koontz, Steve; Reddell, Brandon; Rojdev, Kristina; Franklin, Jennifer

    2010-01-01

    Both crew and radio-sensitive systems, especially electronics, must be protected from the effects of the space radiation environment. One method of mitigating this radiation exposure is to use passive shielding materials. In previous vehicle designs such as the International Space Station (ISS), materials such as aluminum and polyethylene have been used as parasitic shielding to protect crew and electronics from exposure, but these designs add mass and decrease the amount of usable volume inside the vehicle. Thus, it is of interest to understand whether structural materials can also be designed to provide the radiation shielding capability needed for crew and electronics, while still providing weight savings and increased usable volume when compared against previous vehicle shielding designs. In this paper, we present calculations and analysis using the HZETRN (deterministic) and FLUKA (Monte Carlo) codes to investigate the radiation mitigation properties of these structural shielding materials, which include graded-Z and composite materials. This work is also a follow-on to an earlier paper that compared computational results for three radiation transport codes, HZETRN, HETC, and FLUKA, using the February 1956 solar particle event (SPE) spectrum. In the following analysis, we consider the October 1989 Ground Level Enhanced (GLE) SPE as the input source term, based on the Band function fitting method. Using HZETRN and FLUKA, parametric absorbed doses at the center of a hemispherical structure on the lunar surface are calculated for various thicknesses of graded-Z lay-ups and an all-aluminum structure. The HZETRN and FLUKA calculations are compared and are in reasonable (18% to 27%) agreement. Both codes agree with respect to the predicted shielding material performance trends. The results from both HZETRN and FLUKA are analyzed, and the radiation protection properties and potential weight savings of various materials and material lay-ups are compared.

  13. Radiation Protection Considerations

    NASA Astrophysics Data System (ADS)

    Adorisio, C.; Roesler, S.; Urscheler, C.; Vincke, H.

    This chapter summarizes the legal Radiation Protection (RP) framework to be considered in the design of HiLumi LHC. It details design limits and constraints and dose objectives, and explains how the As Low As Reasonably Achievable (ALARA) approach is formalized at CERN. Furthermore, features of the FLUKA Monte Carlo code that are of relevance for RP studies are summarized. Results of FLUKA simulations for residual dose rates during Long Shutdown 1 (LS1) are compared to measurements, demonstrating good agreement and providing evidence for the accuracy of FLUKA predictions for future shutdowns. Finally, an outlook on the residual dose rate evolution until LS3 is given.

  14. Using the FLUKA Monte Carlo Code to Simulate the Interactions of Ionizing Radiation with Matter to Assist and Aid Our Understanding of Ground Based Accelerator Testing, Space Hardware Design, and Secondary Space Radiation Environments

    NASA Technical Reports Server (NTRS)

    Reddell, Brandon

    2015-01-01

    Designing hardware to operate in the space radiation environment is a very difficult and costly activity. Ground-based particle accelerators can be used to test for exposure to the radiation environment, one species at a time; however, the actual space environment cannot be duplicated because of the range of energies and the isotropic nature of space radiation. The FLUKA Monte Carlo code is an integrated physics package based at CERN that has been under development for the last 40+ years and includes the most up-to-date fundamental physics theory and particle physics data. This work presents an overview of FLUKA and how it has been used in conjunction with ground-based radiation testing for NASA to improve our understanding of secondary particle environments resulting from the interaction of space radiation with matter.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghosh, Vinita J.; Schaefer, Charles; Kahnhauser, Henry

    The National Synchrotron Light Source (NSLS) at Brookhaven National Laboratory was shut down in September 2014. Lead bricks used as radiological shadow shielding within the accelerator were exposed to stray radiation fields during normal operations. The FLUKA code, a fully integrated Monte Carlo simulation package for the interaction and transport of particles and nuclei in matter, was used to estimate the induced radioactivity in this shielding and in the stainless steel beam pipe from known beam losses. The FLUKA output was processed using MICROSHIELD® to estimate on-contact exposure rates for individually exposed bricks, to help design and optimize the radiological survey process. This entire process can be modeled using FLUKA, but MICROSHIELD® was chosen as a secondary method because of the project's resource constraints. Due to the compressed schedule and lack of shielding configuration data, simple FLUKA models were developed in this paper. FLUKA activity estimates for stainless steel were compared with sampling data to validate the results, which show that simple FLUKA models and irradiation geometries can be used to predict radioactivity inventories accurately in exposed materials. During decommissioning, 0.1% of the lead bricks were found to have measurable levels of induced radioactivity. Finally, post-processing with MICROSHIELD® provides an acceptable secondary method of estimating residual exposure rates.
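    The standoff exposure-rate estimate that a point-kernel tool such as MICROSHIELD® refines can be roughed out with the bare point-source relation (no shielding, buildup or self-absorption; the gamma constant Γ is an input here, not an asserted value):

```python
def exposure_rate(gamma_const, activity, distance_m):
    """Unshielded point-source estimate: dose rate = Gamma * A / d^2.
    Result carries the dose-rate units implied by gamma_const and activity."""
    return gamma_const * activity / (distance_m * distance_m)
```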

  16. SU-E-T-121: Analyzing the Broadening Effect On the Bragg Peak Due to Heterogeneous Geometries and Implementing User-Routines in the Monte-Carlo Code FLUKA in Order to Reduce Computation Time

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baumann, K; Weber, U; Simeonov, Y

    2015-06-15

    Purpose: The aim of this study was to analyze the modulating, broadening effect on the Bragg peak due to heterogeneous geometries, such as multi-wire chambers, in the beam path of a particle therapy beam line. The effect was described by a mathematical model, which was implemented in the Monte Carlo code FLUKA via user-routines in order to reduce the computation time of the simulations. Methods: The depth-dose curve of 80 MeV/u 12C ions in a water phantom was calculated using the Monte Carlo code FLUKA (reference curve). The modulating effect on this dose distribution behind eleven mesh-like foils (periodicity ∼80 microns), as occur in a typical set of multi-wire and dose chambers, was described mathematically by optimizing a normal distribution such that the reference curve convolved with this distribution equals the modulated dose curve. This distribution describes a displacement in water and was transformed into a probability distribution of the thickness of the eleven foils using the water-equivalent thickness of the foil material. From this, the thickness distribution of a single foil was determined inversely. In FLUKA the heterogeneous foils were replaced by homogeneous foils, and a user-routine was programmed that varies the thickness of the homogeneous foils for each simulated particle according to this distribution. Results: Using the mathematical model and user-routine in FLUKA, the broadening effect could be reproduced exactly when replacing the heterogeneous foils with homogeneous ones. The computation time was reduced by 90 percent. Conclusion: In this study the broadening effect on the Bragg peak due to heterogeneous structures was analyzed, described by a mathematical model and implemented in FLUKA via user-routines. Applying these routines, the computing time was reduced by 90 percent.
    The developed tool can be used for any heterogeneous structure with dimensions from microns to millimeters, in principle even for organic materials like lung tissue.
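    The convolution at the heart of the model above, folding a reference depth-dose curve with a normal displacement distribution, can be sketched discretely (the edge handling and binning are assumptions of this toy version, not the paper's implementation):

```python
import math

def broadened_curve(depth_dose, sigma_bins):
    """Convolve a sampled depth-dose curve with a normalized discrete Gaussian
    kernel, modeling Bragg-peak broadening by upstream thickness variations."""
    half = int(4 * sigma_bins) + 1
    kernel = [math.exp(-0.5 * (k / sigma_bins) ** 2) for k in range(-half, half + 1)]
    norm = sum(kernel)
    kernel = [w / norm for w in kernel]
    out = []
    for i in range(len(depth_dose)):
        acc = 0.0
        for k in range(-half, half + 1):
            j = min(max(i + k, 0), len(depth_dose) - 1)  # clamp at the boundaries
            acc += kernel[k + half] * depth_dose[j]
        out.append(acc)
    return out
```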

  17. TU-EF-304-10: Efficient Multiscale Simulation of the Proton Relative Biological Effectiveness (RBE) for DNA Double Strand Break (DSB) Induction and Bio-Effective Dose in the FLUKA Monte Carlo Radiation Transport Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moskvin, V; Tsiamas, P; Axente, M

    2015-06-15

    Purpose: One of the more critical initiating events for reproductive cell death is the creation of a DNA double strand break (DSB). In this study, we present a computationally efficient way to determine spatial variations in the relative biological effectiveness (RBE) of proton therapy beams within the FLUKA Monte Carlo (MC) code. Methods: We used the independently tested Monte Carlo Damage Simulation (MCDS) developed by Stewart and colleagues (Radiat. Res. 176, 587-602, 2011) to estimate the RBE for DSB induction of monoenergetic protons, tritium, deuterium, helium-3, helium-4 ions and delta-electrons. The dose-weighted RBE coefficients were incorporated into FLUKA to determine the equivalent 60Co γ-ray dose for representative proton beams incident on cells in an aerobic and anoxic environment. Results: We found that the proton beam RBE for DSB induction at the tip of the Bragg peak, including primary and secondary particles, is close to 1.2. Furthermore, the RBE increases laterally from the beam axis in the region of the Bragg peak. At the distal edge, the RBE is in the range 1.3-1.4 for cells irradiated under aerobic conditions and may be as large as 1.5-1.8 for cells irradiated under anoxic conditions. Across the plateau region, the recorded RBE for DSB induction is 1.02 for aerobic cells and 1.05 for cells irradiated under anoxic conditions. The contribution to the total effective dose from secondary heavy ions decreases with depth and is higher at shallow depths (e.g., at the surface of the skin). Conclusion: Multiscale simulation of the RBE for DSB induction provides useful insights into spatial variations in proton RBE within pristine Bragg peaks. This methodology is potentially useful for the biological optimization of proton therapy for the treatment of cancer. The study highlights the need to incorporate spatial variations in proton RBE into proton therapy treatment plans.
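    The dose-weighting step described above reduces to a weighted mean over the particle contributions at a voxel; a minimal sketch (inputs hypothetical, not the MCDS coefficients themselves):

```python
def dose_weighted_rbe(doses, rbes):
    """Dose-weighted mean RBE over particle contributions i:
    RBE = sum(d_i * RBE_i) / sum(d_i)."""
    return sum(d * r for d, r in zip(doses, rbes)) / sum(doses)

def equivalent_gamma_dose(doses, rbes):
    """Equivalent 60Co gamma-ray dose: the RBE-weighted sum of dose terms."""
    return sum(d * r for d, r in zip(doses, rbes))
```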

  18. Calculation of response matrix of CaSO4:Dy based neutron dosimeter using Monte Carlo code FLUKA and measurement of 241Am-Be spectra

    NASA Astrophysics Data System (ADS)

    Chatterjee, S.; Bakshi, A. K.; Tripathy, S. P.

    2010-09-01

    Response matrix for a CaSO4:Dy based neutron dosimeter was generated using the Monte Carlo code FLUKA in the energy range from thermal to 20 MeV for a set of eight Bonner spheres of diameter 3-12″, including the bare one. The response of the neutron dosimeter was measured for the above set of spheres with a 241Am-Be neutron source covered with 2 mm of lead. An analytical expression for the response function was devised as a function of sphere mass. Using the Frascati Unfolding Iteration Tool (FRUIT) unfolding code, the neutron spectrum of 241Am-Be was unfolded and compared with the standard IAEA spectrum.
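A fit of this kind (response as a function of sphere mass) might look like the sketch below; the functional form R(m) = a·m·exp(−b·m^(2/3)) and the data are assumptions for illustration, not the paper's actual parameterization:

```python
import math

# Sketch: least-squares fit of a hypothetical response function
# R(m) = a * m * exp(-b * m**(2/3)) to per-sphere readings.
masses = [1.0, 2.5, 5.0, 9.0, 14.0]          # kg, hypothetical sphere masses
readings = [0.80, 1.45, 1.90, 1.85, 1.40]    # arbitrary units, invented

# Linearize: ln(R/m) = ln(a) - b * m**(2/3), then ordinary least squares.
x = [m ** (2.0 / 3.0) for m in masses]
y = [math.log(r / m) for m, r in zip(masses, readings)]
n = len(x)
sx, sy = sum(x), sum(y)
sxx = sum(v * v for v in x)
sxy = sum(u * v for u, v in zip(x, y))
b = -(n * sxy - sx * sy) / (n * sxx - sx * sx)   # decay constant
a = math.exp((sy + b * sx) / n)                  # amplitude

def response(m):
    """Fitted response at sphere mass m (same arbitrary units as readings)."""
    return a * m * math.exp(-b * m ** (2.0 / 3.0))
```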

  19. Investigation of HZETRN 2010 as a Tool for Single Event Effect Qualification of Avionics Systems

    NASA Technical Reports Server (NTRS)

    Rojdev, Kristina; Atwell, William; Boeder, Paul; Koontz, Steve

    2014-01-01

    NASA's future missions focus on deep-space human exploration, where a simple emergency return to Earth is not available. In addition, the deep space environment contains a constant background of Galactic Cosmic Ray (GCR) radiation exposure, as well as periodic Solar Particle Events (SPEs) that can produce intense amounts of radiation in a short amount of time. Given these conditions, it is important that the avionics systems for deep space human missions are not susceptible to Single Event Effects (SEE) that can occur from radiation interactions with electronic components. The typical approach to minimizing SEE susceptibility relies on heritage hardware and extensive testing programs that are very costly. Previous work by Koontz, et al. [1] utilized an analysis-based method for investigating electronic component susceptibility. In their paper, FLUKA, a Monte Carlo transport code, was used to calculate SEE and single event upset (SEU) rates. This code was then validated against in-flight data. In addition, CREME-96, a deterministic code, was also compared with FLUKA and in-flight data. However, FLUKA has a long run-time (on the order of days), and CREME-96 has not been updated in several years. This paper will investigate the use of HZETRN 2010, a deterministic transport code developed at NASA Langley Research Center, as another tool that can be used to analyze SEE and SEU rates. The benefits of using HZETRN over FLUKA and CREME-96 are that it has a very fast run time (on the order of minutes) and has been shown to be of accuracy similar to other deterministic and Monte Carlo codes when considering dose [2, 3, 4]. The 2010 version of HZETRN has updated its treatment of secondary neutrons and thus has improved its accuracy over previous versions. In this paper, the Linear Energy Transfer (LET) spectra are of interest rather than the total ionizing dose.
Therefore, the LET spectra output from HZETRN 2010 will be compared with the FLUKA and in-flight data to validate HZETRN 2010 as a computational tool for SEE qualification by analysis. Furthermore, extrapolation of these data to interplanetary environments at 1 AU will be investigated to determine whether HZETRN 2010 can be used successfully and confidently for deep space mission analyses.
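A common analysis-based SEE estimate folds a Weibull fit of the device cross-section with the LET flux; the sketch below uses invented device parameters and an invented spectrum, and is not the specific method of HZETRN, FLUKA or CREME-96:

```python
import math

# Sketch: SEU rate = integral of sigma(L) * phi(L) dL over the LET spectrum.
# All device parameters and the spectrum are hypothetical.
L0, W, S, SIG_SAT = 2.0, 20.0, 1.5, 1e-8   # onset LET, width, shape (MeV*cm2/mg), cm2/bit

def sigma(L):
    """Weibull upset cross-section per bit as a function of LET."""
    if L <= L0:
        return 0.0
    return SIG_SAT * (1.0 - math.exp(-(((L - L0) / W) ** S)))

def seu_rate(lets, diff_flux):
    """Trapezoidal integral of sigma(L)*phi(L) dL -> upsets/bit/s.
    lets: ascending LET grid (MeV*cm2/mg);
    diff_flux: differential flux (particles/cm2/s per unit LET)."""
    rate = 0.0
    for (l1, f1), (l2, f2) in zip(zip(lets, diff_flux),
                                  zip(lets[1:], diff_flux[1:])):
        rate += 0.5 * (sigma(l1) * f1 + sigma(l2) * f2) * (l2 - l1)
    return rate

lets = [1.0, 2.0, 5.0, 10.0, 20.0, 40.0]
flux = [1e-3, 5e-4, 1e-4, 1e-5, 1e-6, 1e-7]
```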

  20. Induced Radioactivity in Lead Shielding at the National Synchrotron Light Source

    DOE PAGES

    Ghosh, Vinita J.; Schaefer, Charles; Kahnhauser, Henry

    2017-06-30

    The National Synchrotron Light Source (NSLS) at Brookhaven National Laboratory was shut down in September 2014. Lead bricks used as radiological shadow shielding within the accelerator were exposed to stray radiation fields during normal operations. The FLUKA code, a fully integrated Monte Carlo simulation package for the interaction and transport of particles and nuclei in matter, was used to estimate induced radioactivity in this shielding and stainless steel beam pipe from known beam losses. The FLUKA output was processed using MICROSHIELD® to estimate on-contact exposure rates with individually exposed bricks to help design and optimize the radiological survey process. This entire process can be modeled using FLUKA, but use of MICROSHIELD® as a secondary method was chosen because of the project’s resource constraints. Due to the compressed schedule and lack of shielding configuration data, simple FLUKA models were developed. FLUKA activity estimates for stainless steel were compared with sampling data to validate results, which show that simple FLUKA models and irradiation geometries can be used to predict radioactivity inventories accurately in exposed materials. During decommissioning 0.1% of the lead bricks were found to have measurable levels of induced radioactivity. Finally, post-processing with MICROSHIELD® provides an acceptable secondary method of estimating residual exposure rates.
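A back-of-envelope activation estimate of the kind such surveys rely on combines saturation buildup during irradiation with decay during cooling; all inputs below are illustrative, not NSLS data:

```python
import math

# Sketch: induced activity A = phi*sigma*N * (1 - exp(-lam*t_irr)) * exp(-lam*t_cool).
# Flux, cross-section, atom count and half-life below are invented (60Co-like).
def induced_activity(phi, sigma_cm2, n_atoms, half_life_s, t_irr_s, t_cool_s):
    """phi: flux (1/cm2/s); returns activity in Bq."""
    lam = math.log(2.0) / half_life_s
    saturation = phi * sigma_cm2 * n_atoms          # Bq at full saturation
    return saturation * (1.0 - math.exp(-lam * t_irr_s)) * math.exp(-lam * t_cool_s)

# Hypothetical: 5 years of irradiation, 1 year of cooling.
a_5y = induced_activity(1e5, 1e-24, 1e24, 1.66e8, 1.58e8, 3.15e7)
```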

  1. Induced Radioactivity in Lead Shielding at the National Synchrotron Light Source.

    PubMed

    Ghosh, Vinita J; Schaefer, Charles; Kahnhauser, Henry

    2017-06-01

    The National Synchrotron Light Source (NSLS) at Brookhaven National Laboratory was shut down in September 2014. Lead bricks used as radiological shadow shielding within the accelerator were exposed to stray radiation fields during normal operations. The FLUKA code, a fully integrated Monte Carlo simulation package for the interaction and transport of particles and nuclei in matter, was used to estimate induced radioactivity in this shielding and stainless steel beam pipe from known beam losses. The FLUKA output was processed using MICROSHIELD® to estimate on-contact exposure rates with individually exposed bricks to help design and optimize the radiological survey process. This entire process can be modeled using FLUKA, but use of MICROSHIELD® as a secondary method was chosen because of the project's resource constraints. Due to the compressed schedule and lack of shielding configuration data, simple FLUKA models were developed. FLUKA activity estimates for stainless steel were compared with sampling data to validate results, which show that simple FLUKA models and irradiation geometries can be used to predict radioactivity inventories accurately in exposed materials. During decommissioning 0.1% of the lead bricks were found to have measurable levels of induced radioactivity. Post-processing with MICROSHIELD® provides an acceptable secondary method of estimating residual exposure rates.

  2. Helium ions at the Heidelberg Ion Beam Therapy Center: comparisons between FLUKA Monte Carlo code predictions and dosimetric measurements

    NASA Astrophysics Data System (ADS)

    Tessonnier, T.; Mairani, A.; Brons, S.; Sala, P.; Cerutti, F.; Ferrari, A.; Haberer, T.; Debus, J.; Parodi, K.

    2017-08-01

    In the field of particle therapy helium ion beams could offer an alternative for radiotherapy treatments, owing to their interesting physical and biological properties intermediate between protons and carbon ions. We present in this work the comparisons and validations of the Monte Carlo FLUKA code against in-depth dosimetric measurements acquired at the Heidelberg Ion Beam Therapy Center (HIT). Depth dose distributions in water with and without ripple filter, lateral profiles at different depths in water and a spread-out Bragg peak were investigated. After experimentally-driven tuning of the less known initial beam characteristics in vacuum (beam lateral size and momentum spread) and simulation parameters (water ionization potential), comparisons of depth dose distributions were performed between simulations and measurements, which showed overall good agreement with range differences below 0.1 mm and dose-weighted average dose-differences below 2.3% throughout the entire energy range. Comparisons of lateral dose profiles showed differences in full-width-half-maximum lower than 0.7 mm. Measurements of the spread-out Bragg peak indicated differences with simulations below 1% in the high dose regions and 3% in all other regions, with a range difference less than 0.5 mm. Despite the promising results, some discrepancies between simulations and measurements were observed, particularly at high energies. These differences were attributed to an underestimation of dose contributions from secondary particles at large angles, as seen in a triple Gaussian parametrization of the lateral profiles along the depth. However, the results allowed us to validate FLUKA simulations against measurements, confirming its suitability for 4He ion beam modeling in preparation of clinical establishment at HIT. 
Future activities building on this work will include treatment plan comparisons using validated biological models between proton and helium ions, either within a Monte Carlo treatment planning engine based on the same FLUKA code, or an independent analytical planning system fed with a validated database of inputs calculated with FLUKA.
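The triple-Gaussian lateral parameterization mentioned above can be sketched as a narrow core plus two wider halo terms; the weights and sigmas below are invented, not the fitted HIT values:

```python
import math

# Sketch: triple-Gaussian lateral dose profile, normalized so the three
# weights sum to 1. Weights and sigmas are illustrative placeholders.
def lateral_dose(x_mm, weights=(0.90, 0.08, 0.02), sigmas_mm=(4.0, 10.0, 30.0)):
    """Dose per mm at lateral offset x_mm (arbitrary units, unit integral)."""
    total = 0.0
    for w, s in zip(weights, sigmas_mm):
        total += w / (math.sqrt(2.0 * math.pi) * s) * math.exp(-0.5 * (x_mm / s) ** 2)
    return total
```

The wide third component is what captures the large-angle secondary-particle halo that a single Gaussian misses.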

  3. Helium ions at the Heidelberg Ion Beam Therapy Center: comparisons between FLUKA Monte Carlo code predictions and dosimetric measurements.

    PubMed

    Tessonnier, T; Mairani, A; Brons, S; Sala, P; Cerutti, F; Ferrari, A; Haberer, T; Debus, J; Parodi, K

    2017-08-01

    In the field of particle therapy helium ion beams could offer an alternative for radiotherapy treatments, owing to their interesting physical and biological properties intermediate between protons and carbon ions. We present in this work the comparisons and validations of the Monte Carlo FLUKA code against in-depth dosimetric measurements acquired at the Heidelberg Ion Beam Therapy Center (HIT). Depth dose distributions in water with and without ripple filter, lateral profiles at different depths in water and a spread-out Bragg peak were investigated. After experimentally-driven tuning of the less known initial beam characteristics in vacuum (beam lateral size and momentum spread) and simulation parameters (water ionization potential), comparisons of depth dose distributions were performed between simulations and measurements, which showed overall good agreement with range differences below 0.1 mm and dose-weighted average dose-differences below 2.3% throughout the entire energy range. Comparisons of lateral dose profiles showed differences in full-width-half-maximum lower than 0.7 mm. Measurements of the spread-out Bragg peak indicated differences with simulations below 1% in the high dose regions and 3% in all other regions, with a range difference less than 0.5 mm. Despite the promising results, some discrepancies between simulations and measurements were observed, particularly at high energies. These differences were attributed to an underestimation of dose contributions from secondary particles at large angles, as seen in a triple Gaussian parametrization of the lateral profiles along the depth. However, the results allowed us to validate FLUKA simulations against measurements, confirming its suitability for 4He ion beam modeling in preparation of clinical establishment at HIT. 
Future activities building on this work will include treatment plan comparisons using validated biological models between proton and helium ions, either within a Monte Carlo treatment planning engine based on the same FLUKA code, or an independent analytical planning system fed with a validated database of inputs calculated with FLUKA.

  4. Comparison of Space Radiation Calculations from Deterministic and Monte Carlo Transport Codes

    NASA Technical Reports Server (NTRS)

    Adams, J. H.; Lin, Z. W.; Nasser, A. F.; Randeniya, S.; Tripathi, r. K.; Watts, J. W.; Yepes, P.

    2010-01-01

    The presentation outline includes motivation, radiation transport codes being considered, space radiation cases being considered, results for slab geometry, results for spherical geometry, and a summary. The codes considered are HZETRN, UPROP, FLUKA and GEANT4, applied to slab geometry with SPE and GCR environments.

  5. Use of the FLUKA Monte Carlo code for 3D patient-specific dosimetry on PET-CT and SPECT-CT images*

    PubMed Central

    Botta, F; Mairani, A; Hobbs, R F; Vergara Gil, A; Pacilio, M; Parodi, K; Cremonesi, M; Coca Pérez, M A; Di Dia, A; Ferrari, M; Guerriero, F; Battistoni, G; Pedroli, G; Paganelli, G; Torres Aroche, L A; Sgouros, G

    2014-01-01

    Patient-specific absorbed dose calculation for nuclear medicine therapy is a topic of increasing interest. 3D dosimetry at the voxel level is one of the major improvements for the development of more accurate calculation techniques, as compared to the standard dosimetry at the organ level. This study aims to use the FLUKA Monte Carlo code to perform patient-specific 3D dosimetry through direct Monte Carlo simulation on PET-CT and SPECT-CT images. To this aim, dedicated routines were developed in the FLUKA environment. Two sets of simulations were performed on model and phantom images. Firstly, the correct handling of PET and SPECT images was tested under the assumption of homogeneous water medium by comparing FLUKA results with those obtained with the voxel kernel convolution method and with other Monte Carlo-based tools developed for the same purpose (the EGS-based 3D-RD software and the MCNP5-based MCID). Afterwards, the correct integration of the PET/SPECT and CT information was tested, performing direct simulations on PET/CT images for both homogeneous (water) and non-homogeneous (water with air, lung and bone inserts) phantoms. Comparison was performed with the other Monte Carlo tools performing direct simulation as well. The absorbed dose maps were compared at the voxel level. In the case of homogeneous water, by simulating 10^8 primary particles a 2% average difference with respect to the kernel convolution method was achieved; this difference was lower than the statistical uncertainty affecting the FLUKA results. The agreement with the other tools was within 3–4%, partially ascribable to the differences among the simulation algorithms. Including the CT-based density map, the average difference was always within 4% irrespective of the medium (water, air, bone), except for a maximum 6% value when comparing FLUKA and 3D-RD in air. 
The results confirmed that the routines were properly developed, opening the way for the use of FLUKA for patient-specific, image-based dosimetry in nuclear medicine. PMID:24200697
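The voxel-level dose calculation via kernel convolution, used here as the reference method, can be illustrated in 1D; the kernel values are invented:

```python
# Sketch: voxel dose as the convolution of a cumulated-activity map with a
# dose-point-kernel, shown in 1D for brevity. Kernel weights are illustrative.
def convolve_dose(activity, kernel):
    """activity: per-voxel cumulated activity; kernel: symmetric dose kernel
    centred at index len(kernel)//2. Returns per-voxel absorbed dose (arb. units)."""
    half = len(kernel) // 2
    n = len(activity)
    dose = [0.0] * n
    for i, a in enumerate(activity):
        if a == 0.0:
            continue
        for k, w in enumerate(kernel):
            j = i + k - half
            if 0 <= j < n:
                dose[j] += a * w
    return dose

# A single hot voxel spreads dose to its neighbours according to the kernel.
dose = convolve_dose([0, 0, 10, 0, 0], [0.05, 0.2, 0.5, 0.2, 0.05])
```

Direct MC simulation replaces this convolution when the medium is not homogeneous, which is exactly the case the CT-based density map addresses.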

  6. Dosimetric accuracy of a treatment planning system for actively scanned proton beams and small target volumes: Monte Carlo and experimental validation

    NASA Astrophysics Data System (ADS)

    Magro, G.; Molinelli, S.; Mairani, A.; Mirandola, A.; Panizza, D.; Russo, S.; Ferrari, A.; Valvo, F.; Fossati, P.; Ciocca, M.

    2015-09-01

    This study was performed to evaluate the accuracy of a commercial treatment planning system (TPS) in optimising proton pencil beam dose distributions for small targets of different sizes (5-30 mm side) located at increasing depths in water. The TPS analytical algorithm was benchmarked against experimental data and the FLUKA Monte Carlo (MC) code, previously validated for the selected beam-line. We tested the Siemens syngo® TPS plan optimisation module for water cubes fixing the configurable parameters at clinical standards, with homogeneous target coverage to a 2 Gy (RBE) dose prescription as the unique goal. Plans were delivered and the dose at each volume centre was measured in water with a calibrated PTW Advanced Markus® chamber. An EBT3® film was also positioned at the phantom entrance window for the acquisition of 2D dose maps. Discrepancies between TPS calculated and MC simulated values were mainly due to the different lateral spread modeling and were found to be related to the field-to-spot size ratio. The accuracy of the TPS was proved to be clinically acceptable in all cases except very small and shallow volumes. In this context, the use of MC to validate TPS results proved to be a reliable procedure for pre-treatment plan verification.
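A gamma-index comparison of the kind used to judge TPS-versus-MC agreement can be sketched in 1D (global 3%/3 mm); this is a brute-force search over the evaluated profile, not a clinical implementation:

```python
import math

# Sketch: global 1D gamma index. For each reference point, minimize the
# combined dose-difference / distance-to-agreement metric over the evaluated curve.
def gamma_index(x_ref, d_ref, x_eval, d_eval, dd=0.03, dta_mm=3.0):
    d_max = max(d_ref)                      # global dose normalization
    gammas = []
    for xr, dr in zip(x_ref, d_ref):
        best = float("inf")
        for xe, de in zip(x_eval, d_eval):
            val = math.sqrt(((xe - xr) / dta_mm) ** 2
                            + ((de - dr) / (dd * d_max)) ** 2)
            best = min(best, val)
        gammas.append(best)
    return gammas

def pass_rate(gammas):
    """Fraction of points with gamma <= 1 (the usual pass criterion)."""
    return sum(1 for g in gammas if g <= 1.0) / len(gammas)
```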

  7. Dosimetric accuracy of a treatment planning system for actively scanned proton beams and small target volumes: Monte Carlo and experimental validation.

    PubMed

    Magro, G; Molinelli, S; Mairani, A; Mirandola, A; Panizza, D; Russo, S; Ferrari, A; Valvo, F; Fossati, P; Ciocca, M

    2015-09-07

    This study was performed to evaluate the accuracy of a commercial treatment planning system (TPS) in optimising proton pencil beam dose distributions for small targets of different sizes (5-30 mm side) located at increasing depths in water. The TPS analytical algorithm was benchmarked against experimental data and the FLUKA Monte Carlo (MC) code, previously validated for the selected beam-line. We tested the Siemens syngo® TPS plan optimisation module for water cubes fixing the configurable parameters at clinical standards, with homogeneous target coverage to a 2 Gy (RBE) dose prescription as the unique goal. Plans were delivered and the dose at each volume centre was measured in water with a calibrated PTW Advanced Markus® chamber. An EBT3® film was also positioned at the phantom entrance window for the acquisition of 2D dose maps. Discrepancies between TPS calculated and MC simulated values were mainly due to the different lateral spread modeling and were found to be related to the field-to-spot size ratio. The accuracy of the TPS was proved to be clinically acceptable in all cases except very small and shallow volumes. In this context, the use of MC to validate TPS results proved to be a reliable procedure for pre-treatment plan verification.

  8. Review on Monte-Carlo Tools for Simulating Relativistic Runaway Electron Avalanches and the Propagation of Terrestrial Gamma-Ray Flashes in the Atmosphere

    NASA Astrophysics Data System (ADS)

    Sarria, D.

    2016-12-01

    The field of High Energy Atmospheric Physics (HEAP) includes the study of energetic events related to thunderstorms, such as Terrestrial Gamma-ray Flashes (TGF), associated electron-positron beams (TEB), gamma-ray glows and Thunderstorm Ground Enhancements (TGE). Understanding these phenomena requires accurate models for the interaction of particles with atmospheric air and electromagnetic fields in the <100 MeV energy range. This study is the next step of the work presented in [C. Rutjes et al., 2016], which compared the performances of various codes in the absence of electromagnetic fields. In the first part, we quantify simple but informative test cases of electrons in various electric field profiles. We will compare the avalanche length (of the Relativistic Runaway Electron Avalanche (RREA) process), the photon/electron spectra and the spatial scattering. In particular, we test the effect of the low-energy threshold, which was found to be very important [Skeltved et al., 2014]. Note that even without a field, it was found to be important because of the straggling effect [C. Rutjes et al., 2016]. For this first part, we will be comparing GEANT4 (different flavours), FLUKA and the custom-made code GRRR. In the second part, we test the propagation of these high energy particles in the atmosphere, from production altitude (around 10 km to 18 km) to satellite altitude (600 km). We use a simple and clearly fixed set-up for the atmospheric density, the geomagnetic field, the initial conditions, and the detection conditions of the particles. For this second part, we will be comparing GEANT4 (different flavours), FLUKA/CORSIKA and the custom-made code MC-PEPTITA. References: C. Rutjes et al., 2016. Evaluation of Monte Carlo tools for high energy atmospheric physics. Geosci. Model Dev., under review. Skeltved, A. B. et al., 2014. Modelling the relativistic runaway electron avalanche and the feedback mechanism with GEANT4. JGRA, doi:10.1002/2014JA020504.
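The RREA growth compared in these benchmarks is exponential in depth, N(z) = N0·exp(z/λ). The avalanche-length fit below, λ ≈ 7300/(E − 276) m with E in kV/m, is an empirical approximation from the literature, so treat the constants as indicative rather than exact:

```python
import math

# Sketch: RREA electron multiplication using an empirical avalanche-length fit.
# Constants (7300 kV, 276 kV/m threshold) are approximate literature values.
def avalanche_length_m(E_kV_per_m):
    if E_kV_per_m <= 276.0:
        raise ValueError("field below the approximate RREA threshold")
    return 7300.0 / (E_kV_per_m - 276.0)

def electrons(n0, z_m, E_kV_per_m):
    """Seed population n0 grown over path length z_m in a uniform field."""
    return n0 * math.exp(z_m / avalanche_length_m(E_kV_per_m))
```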

  9. FLUKA simulation of TEPC response to cosmic radiation.

    PubMed

    Beck, P; Ferrari, A; Pelliccioni, M; Rollet, S; Villari, R

    2005-01-01

    The aircrew exposure to cosmic radiation can be assessed by calculation with codes validated by measurements. However, the relationship between doses in the free atmosphere, as calculated by the codes and from results of measurements performed within the aircraft, is still unclear. The response of a tissue-equivalent proportional counter (TEPC) has already been simulated successfully by the Monte Carlo transport code FLUKA. Absorbed dose rate and ambient dose equivalent rate distributions as functions of lineal energy have been simulated for several reference sources and mixed radiation fields. The agreement between simulation and measurements has been well demonstrated. In order to evaluate the influence of aircraft structures on aircrew exposure assessment, the response of TEPC in the free atmosphere and on-board is now simulated. The calculated results are discussed and compared with other calculations and measurements.
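The lineal-energy means underlying TEPC analysis follow directly from the measured distribution: yF = Σy·f(y)/Σf(y) and yD = Σy²·f(y)/Σy·f(y). A minimal sketch with illustrative bins:

```python
# Sketch: frequency-mean (yF) and dose-mean (yD) lineal energy from a binned
# f(y) distribution. Bin values below are illustrative, not TEPC data.
def lineal_means(y_bins, f):
    """y_bins: lineal energy bin centres (keV/um); f: frequency per bin."""
    m0 = sum(f)
    m1 = sum(y * w for y, w in zip(y_bins, f))
    m2 = sum(y * y * w for y, w in zip(y_bins, f))
    return m1 / m0, m2 / m1   # (yF, yD)
```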

  10. Comparison of Radiation Transport Codes, HZETRN, HETC and FLUKA, Using the 1956 Webber SPE Spectrum

    NASA Technical Reports Server (NTRS)

    Heinbockel, John H.; Slaba, Tony C.; Blattnig, Steve R.; Tripathi, Ram K.; Townsend, Lawrence W.; Handler, Thomas; Gabriel, Tony A.; Pinsky, Lawrence S.; Reddell, Brandon; Clowdsley, Martha S.; et al.

    2009-01-01

    Protection of astronauts and instrumentation from galactic cosmic rays (GCR) and solar particle events (SPE) in the harsh environment of space is of prime importance in the design of personal shielding, spacecraft, and mission planning. Early entry of radiation constraints into the design process enables optimal shielding strategies, but demands efficient and accurate tools that can be used by design engineers in every phase of an evolving space project. The radiation transport code HZETRN is an efficient tool for analyzing the shielding effectiveness of materials exposed to space radiation. In this paper, HZETRN is compared to the Monte Carlo codes HETC-HEDS and FLUKA, for a shield/target configuration comprised of a 20 g/cm^2 aluminum slab in front of a 30 g/cm^2 slab of water exposed to the February 1956 SPE, as modeled by the Webber spectrum. Neutron and proton fluence spectra, as well as dose and dose equivalent values, are compared at various depths in the water target. This study shows that there are many regions where HZETRN agrees with both HETC-HEDS and FLUKA for this shield/target configuration and the SPE environment. However, there are also regions where there are appreciable differences between the three computer codes.
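Dose equivalent in such comparisons is the quality-factor-weighted absorbed dose; the sketch below applies the ICRP-60 quality factor Q(L) to an illustrative LET-binned dose distribution:

```python
# Sketch: dose equivalent H = sum(Q(L) * D(L)) over LET bins, using the
# piecewise ICRP-60 quality factor. The dose values below are illustrative.
def q_icrp60(L):
    """Q(L) with L the unrestricted LET in keV/um (ICRP Publication 60)."""
    if L < 10.0:
        return 1.0
    if L <= 100.0:
        return 0.32 * L - 2.2
    return 300.0 / L ** 0.5

def dose_equivalent(lets, doses):
    """lets: bin LET values (keV/um); doses: absorbed dose per bin (Gy) -> Sv."""
    return sum(q_icrp60(L) * D for L, D in zip(lets, doses))
```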

  11. Study on radiation production in the charge stripping section of the RISP linear accelerator

    NASA Astrophysics Data System (ADS)

    Oh, Joo-Hee; Oranj, Leila Mokhtari; Lee, Hee-Seock; Ko, Seung-Kook

    2015-02-01

    The linear accelerator of the Rare Isotope Science Project (RISP) accelerates 200 MeV/nucleon 238U ions in multiple charge states. Many kinds of radiation are generated while the primary beam is transported along the beam line. The stripping process using a thin carbon foil leads to complicated radiation environments at the 90-degree bending section. The charge distribution of 238U ions after the carbon charge stripper was calculated by using the LISE++ program. The estimates of the radiation environments were carried out by using the well-proven Monte Carlo codes PHITS and FLUKA. The tracks of 238U ions in various charge states were identified using the magnetic field subroutine of the PHITS code. The dose distribution caused by U beam losses for those tracks was obtained over the accelerator tunnel. A modified calculation was applied for tracking the multi-charged U beams because the fundamental design of PHITS and FLUKA is to transport fully-ionized ion beams. In this study, the beam loss pattern after the stripping section was observed, and the radiation production by heavy ions was studied. Finally, the performance of the PHITS and FLUKA codes was validated for estimating the radiation production at the stripping section by applying a modified method.
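The charge-state separation after the stripper follows from magnetic rigidity, Bρ = p/q: ions of the same momentum but different charge bend differently in the 90-degree dipole. A minimal sketch for 238U at 200 MeV/u (constants rounded, not the RISP optics):

```python
import math

# Sketch: magnetic rigidity per charge state for a heavy ion beam.
AMU_MEV = 931.494          # rest energy per u, MeV (rounded)

def rigidity_Tm(a_mass_u, q_charge, t_mev_per_u):
    """B*rho in T*m for kinetic energy t_mev_per_u per nucleon."""
    e0 = a_mass_u * AMU_MEV                    # rest energy, MeV
    e_tot = e0 + a_mass_u * t_mev_per_u        # total energy, MeV
    pc = math.sqrt(e_tot ** 2 - e0 ** 2)       # momentum * c, MeV
    return pc / (q_charge * 299.792458)        # MeV per unit charge -> T*m
```

Lower charge states are stiffer, so a field set for one state sends the others onto displaced tracks, which is why the loss pattern downstream of the stripper matters.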

  12. SU-E-T-590: Optimizing Magnetic Field Strengths with Matlab for An Ion-Optic System in Particle Therapy Consisting of Two Quadrupole Magnets for Subsequent Simulations with the Monte-Carlo Code FLUKA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baumann, K; Weber, U; Simeonov, Y

    Purpose: The aim of this study was to optimize the magnetic field strengths of two quadrupole magnets in a particle therapy facility in order to obtain a beam quality suitable for spot beam scanning. Methods: The particle transport through an ion-optic system of a particle therapy facility consisting of the beam tube, two quadrupole magnets and a beam monitor system was calculated with the help of Matlab by using matrices that solve the equation of motion of a charged particle in a magnetic field and a field-free region, respectively. The magnetic field strengths were optimized in order to obtain a circular and thin beam spot at the iso-center of the therapy facility. These optimized field strengths were subsequently transferred to the Monte-Carlo code FLUKA and the transport of 80 MeV/u 12C ions through this ion-optic system was calculated by using a user-routine to implement magnetic fields. The fluence along the beam-axis and at the iso-center was evaluated. Results: The magnetic field strengths could be optimized by using Matlab and transferred to the Monte-Carlo code FLUKA. The implementation via a user-routine was successful. Analyzing the fluence pattern along the beam-axis, the characteristic focusing and de-focusing effects of the quadrupole magnets could be reproduced. Furthermore, the beam spot at the iso-center was circular and significantly thinner compared to an unfocused beam. Conclusion: In this study a Matlab tool was developed to optimize magnetic field strengths for an ion-optic system consisting of two quadrupole magnets as part of a particle therapy facility. These magnetic field strengths could subsequently be transferred to and implemented in the Monte-Carlo code FLUKA to simulate the particle transport through this optimized ion-optic system.
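The matrix transport the abstract describes can be sketched with standard 2×2 drift and thick-lens quadrupole matrices in one transverse plane; the strengths and lengths below are invented, not the optimized facility values:

```python
import math

# Sketch: transfer-matrix transport of (x, x') through drifts and quadrupoles.
def drift(L):
    """Field-free region of length L (m)."""
    return [[1.0, L], [0.0, 1.0]]

def quad(k, L):
    """Thick-lens quadrupole: k > 0 focusing, k < 0 defocusing (k in 1/m^2)."""
    if k > 0:
        s = math.sqrt(k)
        return [[math.cos(s * L), math.sin(s * L) / s],
                [-s * math.sin(s * L), math.cos(s * L)]]
    s = math.sqrt(-k)
    return [[math.cosh(s * L), math.sinh(s * L) / s],
            [s * math.sinh(s * L), math.cosh(s * L)]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Hypothetical beamline: drift - focusing quad - drift - defocusing quad - drift.
m = drift(1.0)
for elem in (quad(2.0, 0.3), drift(0.5), quad(-2.0, 0.3), drift(1.0)):
    m = matmul(elem, m)
```

Each matrix has unit determinant, so the product does too, which is a quick sanity check on any composed beamline.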

  13. The Energy Spectra of Heavy Nuclei Measured by the ATIC Experiment

    NASA Technical Reports Server (NTRS)

    Panov, A. D.; Adams, J. H.; Ahn, H. S.; Bashindzhagyan, G. L.; Batkov, K. E.; Chang, J.; Christl, M.; Fazley, A. R.; Ganel, O.; Gunasingha, R. M.

    2004-01-01

    ATIC (Advanced Thin Ionization Calorimeter) is a balloon-borne experiment to measure the spectra and composition of primary cosmic rays in the region of total energy from 100 GeV to near 100 TeV for Z from 1 to 26. ATIC consists of a pixelated silicon matrix detector to measure charge plus a fully active BGO calorimeter, to measure energy, located below a carbon target interleaved with three layers of scintillator hodoscope. The ATIC instrument had a second (scientific) flight from McMurdo, Antarctica from 12/29/02 to 1/18/03, yielding 20 days of good data. The GEANT 3.21 Monte Carlo code with the QGSM event generator and the FLUKA code with the DPMJET-II event generator were used to convert energy deposition measurements to primary energy. We present the preliminary energy spectra for the abundant elements C, O, Ne, Mg, Si and Fe and compare them with the results of the first (test) flight of ATIC in 2000-01 and with results from the HEAO-3 and CRN experiments.

  14. SU-F-T-156: Monte Carlo Simulation Using TOPAS for Synchrotron Based Proton Discrete Spot Scanning System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moskvin, V; Pirlepesov, F; Tsiamas, P

    Purpose: This study provides an overview of the design and commissioning of the Monte Carlo (MC) model of the spot-scanning proton therapy nozzle and its implementation for patient plan simulation. Methods: The Hitachi PROBEAT V scanning nozzle was simulated based on vendor specifications using the TOPAS extension of the Geant4 code. FLUKA MC simulation was also utilized to provide supporting data for the main simulation. Validation of the MC model was performed using vendor-provided data and measurements collected during acceptance/commissioning of the proton therapy machine. Actual patient plans using CT based treatment geometry were simulated and compared to the dose distributions produced by the treatment planning system (Varian Eclipse 13.6) and patient quality assurance measurements. In-house MATLAB scripts are used for converting DICOM data into TOPAS input files. Results: Comparison analysis of integrated depth doses (IDDs), therapeutic ranges (R90), and spot shapes/sizes at different distances from the isocenter indicates good agreement between MC and measurements. R90 agreement is within 0.15 mm across all energy tunes. IDD and spot shape/size differences are within the statistical error of the simulation (less than 1.5%). The MC simulated data, validated with physical measurements, were used for the commissioning of the treatment planning system. Patient geometry simulations were conducted based on the Eclipse produced DICOM plans. Conclusion: The treatment nozzle and standard option beam model were implemented in the TOPAS framework to simulate a highly conformal discrete spot-scanning proton beam system.

  15. Measurements and parameterization of neutron energy spectra from targets bombarded with 120 GeV protons

    NASA Astrophysics Data System (ADS)

    Kajimoto, T.; Shigyo, N.; Sanami, T.; Iwamoto, Y.; Hagiwara, M.; Lee, H. S.; Soha, A.; Ramberg, E.; Coleman, R.; Jensen, D.; Leveling, A.; Mokhov, N. V.; Boehnlein, D.; Vaziri, K.; Sakamoto, Y.; Ishibashi, K.; Nakashima, H.

    2014-10-01

    The energy spectra of neutrons were measured by a time-of-flight method for 120 GeV protons on thick graphite, aluminum, copper, and tungsten targets with an NE213 scintillator at the Fermilab Test Beam Facility. Neutron energy spectra were obtained between 25 and 3000 MeV at emission angles of 30°, 45°, 120°, and 150°. The spectra were parameterized as neutron emissions from three moving sources and then compared with theoretical spectra calculated by the PHITS and FLUKA codes. The theoretical spectra substantially underestimated the measured yields. The integrated neutron yields from 25 to 3000 MeV calculated with the PHITS code were 16-36% of the experimental yields, and those calculated with the FLUKA code were 26-57% of the experimental yields, for all targets and emission angles.
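The moving-source parameterization can be illustrated in its stationary limit, a sum of Maxwellian-like terms; the amplitudes and temperatures below are invented, and the source-motion energy shift is omitted for brevity, so this is only a simplified form of the fit used in the paper:

```python
import math

# Simplified sketch: stationary limit of a three-source neutron spectrum,
# dN/dE ~ sum_i A_i * E * exp(-E / T_i). Parameters are illustrative only.
def three_source(E, params):
    """E in MeV; params: list of (A_i, T_i) pairs for the three sources."""
    return sum(A * E * math.exp(-E / T) for A, T in params)

# Low-, mid- and high-temperature sources (hypothetical values).
params = [(1.0, 3.0), (0.1, 30.0), (0.01, 200.0)]
spec = [three_source(E, params) for E in (25.0, 100.0, 1000.0)]
```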

  16. Use of Fluka to Create Dose Calculations

    NASA Technical Reports Server (NTRS)

    Lee, Kerry T.; Barzilla, Janet; Townsend, Lawrence; Brittingham, John

    2012-01-01

    Monte Carlo codes provide an effective means of modeling three dimensional radiation transport; however, their use is both time- and resource-intensive. The creation of a lookup table or parameterization from Monte Carlo simulation allows users to perform calculations with Monte Carlo results without replicating lengthy calculations. FLUKA Monte Carlo transport code was used to develop lookup tables and parameterizations for data resulting from the penetration of layers of aluminum, polyethylene, and water with areal densities ranging from 0 to 100 g/cm^2. Heavy charged ion radiation including ions from Z=1 to Z=26 and from 0.1 to 10 GeV/nucleon were simulated. Dose, dose equivalent, and fluence as a function of particle identity, energy, and scattering angle were examined at various depths. Calculations were compared against well-known results and against the results of other deterministic and Monte Carlo codes. Results will be presented.
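A lookup table built from MC output can be queried by interpolation in (areal density, energy); the sketch below uses bilinear interpolation on an invented table, not the FLUKA results:

```python
# Sketch: bilinear interpolation in a 2D lookup table of MC results.
def bilinear(x, y, xs, ys, table):
    """xs, ys: ascending grid axes; table[i][j] = value at (xs[i], ys[j])."""
    def bracket(v, grid):
        for i in range(len(grid) - 1):
            if grid[i] <= v <= grid[i + 1]:
                return i, (v - grid[i]) / (grid[i + 1] - grid[i])
        raise ValueError("query point outside table")
    i, tx = bracket(x, xs)
    j, ty = bracket(y, ys)
    return ((1 - tx) * (1 - ty) * table[i][j] + tx * (1 - ty) * table[i + 1][j]
            + (1 - tx) * ty * table[i][j + 1] + tx * ty * table[i + 1][j + 1])

# Hypothetical table: relative dose vs. areal density and energy per nucleon.
depths = [0.0, 50.0, 100.0]       # g/cm^2
energies = [0.1, 1.0, 10.0]       # GeV/nucleon
dose = [[1.00, 0.90, 0.80],
        [0.60, 0.55, 0.50],
        [0.30, 0.28, 0.25]]
```

A query then costs a few multiplications instead of a full transport run, which is the point of the parameterization.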

  17. Intercomparison of Monte Carlo radiation transport codes to model TEPC response in low-energy neutron and gamma-ray fields.

    PubMed

    Ali, F; Waker, A J; Waller, E J

    2014-10-01

    Tissue-equivalent proportional counters (TEPC) can potentially be used as a portable and personal dosemeter in mixed neutron and gamma-ray fields, but what hinders this use is their typically large physical size. To formulate compact TEPC designs, the use of a Monte Carlo transport code is necessary to predict the performance of compact designs in these fields. To perform this modelling, three candidate codes were assessed: MCNPX 2.7.E, FLUKA 2011.2 and PHITS 2.24. In each code, benchmark simulations were performed involving the irradiation of a 5-in. TEPC with monoenergetic neutron fields and a 4-in. wall-less TEPC with monoenergetic gamma-ray fields. The frequency and dose mean lineal energies and dose distributions calculated from each code were compared with experimentally determined data. For the neutron benchmark simulations, PHITS produces data closest to the experimental values and for the gamma-ray benchmark simulations, FLUKA yields data closest to the experimentally determined quantities. © The Author 2013. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  18. Comparison of Transport Codes, HZETRN, HETC and FLUKA, Using 1977 GCR Solar Minimum Spectra

    NASA Technical Reports Server (NTRS)

    Heinbockel, John H.; Slaba, Tony C.; Tripathi, Ram K.; Blattnig, Steve R.; Norbury, John W.; Badavi, Francis F.; Townsend, Lawrence W.; Handler, Thomas; Gabriel, Tony A.; Pinsky, Lawrence S.; hide

    2009-01-01

    The HZETRN deterministic radiation transport code is one of several tools developed to analyze the effects of harmful galactic cosmic rays (GCR) and solar particle events (SPE) on mission planning, astronaut shielding and instrumentation. This paper is a comparison study involving the two Monte Carlo transport codes, HETC-HEDS and FLUKA, and the deterministic transport code, HZETRN. Each code is used to transport ions from the 1977 solar minimum GCR spectrum impinging upon a 20 g/cm2 aluminum slab followed by a 30 g/cm2 water slab. This research is part of a systematic effort of verification and validation to quantify the accuracy of HZETRN and determine areas where it can be improved. Comparisons of dose and dose equivalent values at various depths in the water slab are presented in this report. This is followed by a comparison of the proton fluxes and the forward, backward and total neutron fluxes at various depths in the water slab. Comparisons of the secondary light ion 2H, 3H, 3He and 4He fluxes are also examined.

  19. MO-FG-CAMPUS-TeP3-02: Benchmarks of a Proton Relative Biological Effectiveness (RBE) Model for DNA Double Strand Break (DSB) Induction in the FLUKA, MCNP, TOPAS, and RayStation™ Treatment Planning System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, R; Streitmatter, S; Traneus, E

    2016-06-15

    Purpose: Validate implementation of a published RBE model for DSB induction (RBEDSB) in several general-purpose Monte Carlo (MC) code systems and the RayStation™ treatment planning system (TPS). For protons and other light ions, DSB induction is a critical initiating molecular event that correlates well with the RBE for cell survival. Methods: An efficient algorithm to incorporate information on proton and light ion RBEDSB from the independently tested Monte Carlo Damage Simulation (MCDS) has now been integrated into MCNP (Stewart et al. PMB 60, 8249–8274, 2015), FLUKA, TOPAS and a research build of the RayStation™ TPS. To cross-validate the RBEDSB model implementation, LET distributions, depth-dose and lateral (dose and RBEDSB) profiles for monodirectional, monoenergetic (100 to 200 MeV) protons incident on a water phantom are compared. The effects of recoil and secondary ion production (2H+, 3H+, 3He2+, 4He2+), spot size (3 and 10 mm), and transport physics on beam profiles and RBEDSB are examined. Results: Depth-dose and RBEDSB profiles among all of the MC models are in excellent agreement using a 1 mm distance criterion (width of a voxel). For a 100 MeV proton beam (10 mm spot), RBEDSB = 1.2 ± 0.03 (∼2–3%) at the tip of the Bragg peak and increases to 1.59 ± 0.3 two mm distal to the Bragg peak. RBEDSB tends to decrease as the kinetic energy of the incident proton increases. Conclusion: The model for proton RBEDSB has been accurately implemented into FLUKA, MCNP, TOPAS and the RayStation™ TPS. The transport of secondary light ions (Z > 1) has a significant impact on RBEDSB, especially distal to the Bragg peak, although light ions have a small effect on (dose × RBEDSB) profiles. The ability to incorporate spatial variations in proton RBE within a TPS creates new opportunities to individualize treatment plans and increase the therapeutic ratio. Dr. Erik Traneus is employed full-time as a Research Scientist at RaySearch Laboratories. The research build of the RayStation used in the study was made available to the University of Washington free of charge. RaySearch Laboratories did not provide any monetary support for the reported studies.

  20. Benchmarking Heavy Ion Transport Codes FLUKA, HETC-HEDS, MARS15, MCNPX, and PHITS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ronningen, Reginald Martin; Remec, Igor; Heilbronn, Lawrence H.

    Powerful accelerators such as spallation neutron sources, muon-collider/neutrino facilities, and rare isotope beam facilities must be designed with the consideration that they handle the beam power reliably and safely, and they must be optimized to yield maximum performance relative to their design requirements. The simulation codes used for design purposes must produce reliable results. If not, component and facility designs can become costly, have limited lifetime and usefulness, and could even be unsafe. The objective of this proposal is to assess the performance of the currently available codes PHITS, FLUKA, MARS15, MCNPX, and HETC-HEDS that could be used for design simulations involving heavy ion transport. We plan to assess their performance by performing simulations and comparing results against experimental data of benchmark quality. Quantitative knowledge of the biases and the uncertainties of the simulations is essential, as this potentially impacts the safe, reliable and cost-effective design of any future radioactive ion beam facility. Further benchmarking of heavy-ion transport codes was one of the actions recommended in the Report of the 2003 RIA R&D Workshop.

  1. Measurements and FLUKA simulations of bismuth and aluminium activation at the CERN Shielding Benchmark Facility (CSBF)

    NASA Astrophysics Data System (ADS)

    Iliopoulou, E.; Bamidis, P.; Brugger, M.; Froeschl, R.; Infantino, A.; Kajimoto, T.; Nakao, N.; Roesler, S.; Sanami, T.; Siountas, A.

    2018-03-01

    The CERN High Energy AcceleRator Mixed field facility (CHARM) is located in the CERN Proton Synchrotron (PS) East Experimental Area. The facility receives a pulsed proton beam from the CERN PS with a beam momentum of 24 GeV/c, 5 × 10^11 protons per pulse, a pulse length of 350 ms and a maximum average beam intensity of 6.7 × 10^10 p/s, which impacts on the CHARM target. The shielding of the CHARM facility also includes the CERN Shielding Benchmark Facility (CSBF) situated laterally above the target. This facility consists of 80 cm of cast iron and 360 cm of concrete, with barite concrete in some places. Activation samples of bismuth and aluminium were placed in the CSBF and in the CHARM access corridor in July 2015. Monte Carlo simulations with the FLUKA code have been performed to estimate the specific production yields for these samples, and the simulated results are compared to the activation measurements. The comparison between the FLUKA simulations and the values measured by γ-spectrometry gives agreement to better than a factor of 2.

  2. An accurate model for the computation of the dose of protons in water.

    PubMed

    Embriaco, A; Bellinzona, V E; Fontana, A; Rotondi, A

    2017-06-01

    The accurate and fast calculation of the dose in proton radiation therapy is an essential ingredient for successful treatments. We propose a novel approach with a minimal number of parameters. The approach is based on the exact calculation of the electromagnetic part of the interaction, namely the Molière theory of multiple Coulomb scattering for the transversal 1D projection and the Bethe-Bloch formula for the longitudinal stopping power profile, including Gaussian energy straggling. To this e.m. contribution the nuclear proton-nucleus interaction is added with a simple two-parameter model. Then, the non-Gaussian lateral profile is used to calculate the radial dose distribution with a method that assumes the cylindrical symmetry of the distribution. The results, obtained with a fast C++ based computational code called MONET (MOdel of ioN dosE for Therapy), are in very good agreement with the FLUKA MC code, within a few percent in the worst case. This study provides a new tool for fast dose calculation or verification, possibly for clinical use. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
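    The longitudinal part of such a model can be illustrated far more crudely than MONET does it: the sketch below replaces the full Bethe-Bloch integration with the Bragg-Kleeman range-energy rule R = αE^p (textbook-level water constants, assumed here purely for illustration) and smears the resulting residual-range dose with a Gaussian to mimic energy straggling:

```python
import numpy as np

# Simplified longitudinal dose sketch. The Bragg-Kleeman rule R = alpha*E**p
# stands in for the Bethe-Bloch integration used in MONET; alpha and p are
# approximate textbook values for protons in water, not fitted constants.
ALPHA, P = 2.2e-3, 1.77          # R in cm for E in MeV

def bragg_curve(energy_mev, z, sigma=0.3):
    """Relative dose vs. depth z (cm) with Gaussian range straggling sigma (cm)."""
    r0 = ALPHA * energy_mev**P                    # CSDA range under the toy rule
    # residual-range form of the stopping power: dE/dz ~ (R - z)**(1/p - 1)
    shifts = np.linspace(-4 * sigma, 4 * sigma, 201)
    w = np.exp(-0.5 * (shifts / sigma) ** 2)
    w /= w.sum()                                  # normalized Gaussian weights
    dose = np.zeros_like(z, dtype=float)
    for s, wi in zip(shifts, w):                  # smear the range over shifts
        res = r0 + s - z
        mask = res > 0
        dose[mask] += wi * res[mask] ** (1.0 / P - 1.0)
    return dose

# 150 MeV protons: the toy rule gives a range of roughly 15.6 cm in water.
z = np.linspace(0.0, 20.0, 400)
d = bragg_curve(150.0, z)
```

    The smearing regularizes the integrable singularity of the residual-range stopping power at the end of range, producing the familiar rounded Bragg peak.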

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ortiz-Ramírez, Pablo, E-mail: rapeitor@ug.uchile.cl; Ruiz, Andrés

    The Monte Carlo simulation of gamma spectroscopy systems is common practice nowadays. The most popular codes for this purpose are MCNP and Geant4. The intrinsic spatial efficiency method is a general and absolute method to determine the absolute efficiency of a spectroscopy system for any extended source, but it had only been demonstrated experimentally for cylindrical sources. Given the difficulty of preparing sources of arbitrary shape, the simplest way to do this is by simulation of the spectroscopy system and the source. In this work we present the validation of the intrinsic spatial efficiency method for sources with different geometries and for photons with an energy of 661.65 keV. In the simulation the matrix effects (the self-attenuation effect) are not considered, therefore these results are only preliminary. The MC simulation is carried out using the FLUKA code and the absolute efficiency of the detector is determined using two methods: the statistical count of the Full Energy Peak (FEP) area (the traditional method) and the intrinsic spatial efficiency method. The obtained results show total agreement between the absolute efficiencies determined by the two methods. The relative bias is less than 1% in all cases.

  4. Spacecraft Solar Particle Event (SPE) Shielding: Shielding Effectiveness as a Function of SPE model as Determined with the FLUKA Radiation Transport Code

    NASA Technical Reports Server (NTRS)

    Koontz, Steve; Atwell, William; Reddell, Brandon; Rojdev, Kristina

    2010-01-01

    Analysis of both satellite and surface neutron monitor data demonstrates that the widely utilized Exponential model of solar particle event (SPE) proton kinetic energy spectra can seriously underestimate SPE proton flux, especially at the highest kinetic energies. The more recently developed Band model produces better agreement with neutron monitor data for ground level events (GLEs) and is believed to be considerably more accurate at high kinetic energies. Here, we report the results of modeling and simulation studies in which the radiation transport code FLUKA (FLUktuierende KAskade) is used to determine the changes in total ionizing dose (TID) and single-event effect (SEE) environments behind aluminum, polyethylene, carbon, and titanium shielding masses when the assumed form (i.e., Band or Exponential) of the solar particle event (SPE) kinetic energy spectra is changed. The FLUKA simulations are fully three-dimensional, with an isotropic particle flux incident on a concentric spherical-shell shielding mass and detector structure. The effects are reported for both energetic primary protons penetrating the shield mass and secondary particle showers caused by energetic primary protons colliding with shielding mass nuclei. Our results, in agreement with previous studies, show that use of the Exponential form of the event

  5. Estimation of dose delivered to accelerator devices from stripping of 18.5 MeV/n 238U ions using the FLUKA code

    NASA Astrophysics Data System (ADS)

    Oranj, Leila Mokhtari; Lee, Hee-Seock; Leitner, Mario Santana

    2017-12-01

    In Korea, a heavy ion accelerator facility (RAON) has been designed for the production of rare isotopes. The 90° bending section of this accelerator includes a 1.3-μm carbon stripper followed by two dipole magnets and other devices. The incident beam consists of 18.5 MeV/n 238U33+,34+ ions passing through the carbon stripper at the beginning of the section. The two dipoles are tuned to transport 238U ions with the specific charge states 77+, 78+, 79+, 80+ and 81+; other charge states are deflected at the bends and cause beam losses. These beam losses are a concern for the devices of the transport/beam line. The absorbed dose in the devices and the prompt dose in the tunnel were calculated using the FLUKA code in order to estimate radiation damage to the devices located at the 90° bending section and for radiation protection purposes. A novel method to transport the multi-charged 238U ion beam was applied in the FLUKA code, using the charge distribution of 238U ions after the stripper obtained from the LISE++ code. The calculated results showed that the absorbed dose in the devices is influenced by the geometrical arrangement. The maximum dose was observed at the coils of the first, second, fourth and fifth quadrupoles placed after the first dipole magnet. The integrated doses for 30 years of operation with 9.5 pμA of 238U ions were about 2 MGy for those quadrupoles. In conclusion, protection of the devices, particularly the quadrupoles, is necessary to reduce the damage. Moreover, the results showed that the prompt radiation penetrated within the first 60-120 cm of concrete.
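    The multi-charge source technique described here amounts to drawing each primary ion's charge state from the post-stripper distribution. A minimal sketch with made-up probabilities (a real run would take them from LISE++, not these toy numbers):

```python
import random

# Hypothetical post-stripper charge-state probabilities for 238U ions;
# illustrative placeholders only, not LISE++ output.
charge_states = [77, 78, 79, 80, 81]
probs = [0.10, 0.22, 0.30, 0.24, 0.14]   # sums to 1.0

def sample_charge(rng=random):
    """Draw one charge state for a source ion (inverse-CDF sampling)."""
    u = rng.random()
    cdf = 0.0
    for q, p in zip(charge_states, probs):
        cdf += p
        if u <= cdf:
            return q
    return charge_states[-1]             # guard against rounding at u ~ 1
```

    Sampling many primaries this way reproduces the tabulated charge fractions, so each dipole then deflects the sampled population exactly as the mixed physical beam would be deflected.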

  6. Bremsstrahlung Dose Yield for High-Intensity Short-Pulse Laser–Solid Experiments

    DOE PAGES

    Liang, Taiee; Bauer, Johannes M.; Liu, James C.; ...

    2016-12-01

    A bremsstrahlung source term has been developed by the Radiation Protection (RP) group at SLAC National Accelerator Laboratory for high-intensity short-pulse laser–solid experiments between 10^17 and 10^22 W cm^-2. This source term couples the particle-in-cell plasma code EPOCH and the radiation transport code FLUKA to estimate the bremsstrahlung dose yield from laser–solid interactions. EPOCH characterizes the energy distribution, angular distribution, and laser-to-electron conversion efficiency of the hot electrons from laser–solid interactions, and FLUKA utilizes this hot electron source term to calculate a bremsstrahlung dose yield (mSv per J of laser energy on target). The goal of this paper is to provide RP guidelines and hazard analysis for high-intensity laser facilities. Finally, a comparison of the calculated bremsstrahlung dose yields to radiation measurement data is also made.

  8. Benchmark studies of induced radioactivity produced in LHC materials, Part I: Specific activities.

    PubMed

    Brugger, M; Khater, H; Mayer, S; Prinz, A; Roesler, S; Ulrici, L; Vincke, H

    2005-01-01

    Samples of materials which will be used in the LHC machine for shielding and construction components were irradiated in the stray radiation field of the CERN-EU high-energy reference field facility. After irradiation, the specific activities induced in the various samples were analysed with a high-precision gamma spectrometer at various cooling times, allowing identification of isotopes with a wide range of half-lives. Furthermore, the irradiation experiment was simulated in detail with the FLUKA Monte Carlo code. A comparison of measured and calculated specific activities shows good agreement, supporting the use of FLUKA for estimating the level of induced activity in the LHC.

  9. Using FLUKA to Calculate Spacecraft: Single Event Environments: A Practical Approach

    NASA Technical Reports Server (NTRS)

    Koontz, Steve; Boeder, Paul; Reddell, Brandon

    2009-01-01

    The FLUKA nuclear transport and reaction code can be developed into a practical tool for calculating spacecraft and planetary surface asset SEE and TID environments. Nuclear reactions and secondary particle shower effects can be estimated with acceptable accuracy, both in flight and in test, and more detailed electronic device and/or spacecraft geometries than are reported here are possible using standard FLUKA geometry utilities. Median shielding masses in a generic slab or concentric-sphere target geometry are at least approximately applicable to more complex spacecraft shapes, provided the spacecraft shielding mass distribution function applicable to the microelectronic system of interest is available. SEE environment effects can be calculated for a wide range of spacecraft and microelectronic materials with complete nuclear physics, so that the benefits of low-Z shielding mass can be evaluated relative to aluminum, as can the effects of high-Z elements as constituents of microelectronic devices, as reported previously. The principal limitation on the accuracy of the FLUKA-based method reported here is the limited accuracy and incomplete character of affordable heavy ion test data: to support accurate rate estimates with any calculation method, the aspect ratio of the sensitive volume(s) and the dependence must be better characterized.

  10. Inter-comparison of Dose Distributions Calculated by FLUKA, GEANT4, MCNP, and PHITS for Proton Therapy

    NASA Astrophysics Data System (ADS)

    Yang, Zi-Yi; Tsai, Pi-En; Lee, Shao-Chun; Liu, Yen-Chiang; Chen, Chin-Cheng; Sato, Tatsuhiko; Sheu, Rong-Jiun

    2017-09-01

    The dose distributions from proton pencil beam scanning were calculated by FLUKA, GEANT4, MCNP, and PHITS in order to investigate their applicability to proton radiotherapy. The first case studied was the integrated depth dose curves (IDDCs) from a 100 and a 226-MeV proton pencil beam impinging on a water phantom. The calculated IDDCs agree with each other as long as each code employs 75 eV for the ionization potential of water. The second case considered a condition similar to the first, but with proton energies in a Gaussian distribution. The comparison to measurement indicates that the inter-code differences might be due not only to different stopping powers but also to the nuclear physics models. How the physics parameter settings affect the computation time was also discussed. In the third case, the applicability of each code to pencil beam scanning was confirmed by delivering a uniform volumetric dose distribution based on the treatment plan; the results showed general agreement among the codes, the treatment plan, and the measurement, except for some deviations in the penumbra region. This study demonstrates that the selected codes are all capable of performing dose calculations for therapeutic scanning proton beams with proper physics settings.
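    Inter-code dose comparisons of this kind (and the γ(3%-3 mm) tests common in proton QA) can be sketched as a 1D global gamma analysis. A minimal implementation, assuming globally normalized dose and toy analytic depth-dose curves:

```python
import numpy as np

def gamma_1d(z_ref, d_ref, z_eval, d_eval, dd=0.03, dta=0.3):
    """Global 1D gamma index: dose criterion dd (fraction of the maximum
    reference dose), distance-to-agreement dta (same units as z, here cm).
    gamma <= 1 at a reference point means the evaluated curve passes there."""
    d_norm = dd * d_ref.max()
    out = np.empty_like(d_ref, dtype=float)
    for k, (zr, dr) in enumerate(zip(z_ref, d_ref)):
        # search all evaluated points for the minimum combined distance
        g2 = ((z_eval - zr) / dta) ** 2 + ((d_eval - dr) / d_norm) ** 2
        out[k] = np.sqrt(g2.min())
    return out

# Toy Gaussian "depth-dose" curve standing in for a simulated IDDC.
z = np.linspace(0.0, 10.0, 201)
d = np.exp(-((z - 5.0) / 1.5) ** 2)
```

    Identical curves give gamma = 0 everywhere, and a 1 mm spatial shift stays well inside a 3%/3 mm criterion, which is the behavior that makes gamma a forgiving but quantitative pass/fail metric in penumbra regions.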

  11. Overview of Recent Radiation Transport Code Comparisons for Space Applications

    NASA Astrophysics Data System (ADS)

    Townsend, Lawrence

    Recent advances in radiation transport code development for space applications have resulted in various comparisons of code predictions for a variety of scenarios and codes. Comparisons among both Monte Carlo and deterministic codes have been made and published by various groups and collaborations, including comparisons involving, but not limited to, HZETRN, HETC-HEDS, FLUKA, GEANT, PHITS, and MCNPX. In this work, an overview of recent code prediction inter-comparisons, including comparisons to available experimental data, is presented and discussed, with emphasis on areas of agreement and disagreement among the various code predictions and published data.

  12. Update On the Status of the FLUKA Monte Carlo Transport Code*

    NASA Technical Reports Server (NTRS)

    Ferrari, A.; Lorenzo-Sentis, M.; Roesler, S.; Smirnov, G.; Sommerer, F.; Theis, C.; Vlachoudis, V.; Carboni, M.; Mostacci, A.; Pelliccioni, M.

    2006-01-01

    The FLUKA Monte Carlo transport code is a well-known simulation tool in High Energy Physics. FLUKA is a dynamic tool in the sense that it is being continually updated and improved by the authors. We review the progress achieved since the last CHEP Conference on the physics models, some technical improvements to the code and some recent applications. From the point of view of the physics, improvements have been made with the extension of PEANUT to higher energies for p, n, pi, pbar/nbar and for nbars down to the lowest energies, the addition of the online capability to evolve radioactive products and obtain subsequent dose rates, and the upgrading of the treatment of EM interactions with the elimination of the need to separately prepare preprocessed files. A new coherent photon scattering model, an updated treatment of the photo-electric effect, an improved pair production model, and new photon cross sections from the LLNL Cullen database have been implemented. In the field of nucleus–nucleus interactions, the electromagnetic dissociation of heavy ions has been added, along with the extension of the interaction models for some nuclide pairs to energies below 100 MeV/A using the BME approach, as well as the development of an improved QMD model for intermediate energies. Both DPMJET 2.53 and 3 remain available, along with rQMD 2.4, for heavy ion interactions above 100 MeV/A. Technical improvements include the ability to use parentheses in setting up the combinatorial geometry, the introduction of pre-processor directives in the input stream, a new random number generator with full 64-bit randomness, and new routines for mathematical special functions (adapted from SLATEC). Finally, work is progressing on the deployment of a user-friendly GUI input interface as well as a CAD-like geometry creation and visualization tool.
On the application front, FLUKA has been used to extensively evaluate the potential space radiation effects on astronauts for future deep space missions, the activation dose for beam target areas, dose calculations for radiation therapy as well as being adapted for use in the simulation of events in the ALICE detector at the LHC.

  13. Simulation of radiation environment for the LHeC detector

    NASA Astrophysics Data System (ADS)

    Nayaz, Abdullah; Piliçer, Ercan; Joya, Musa

    2017-02-01

    The detector response and radiation environment for the Large Hadron electron Collider (LHeC) baseline detector are estimated to predict its performance over the lifetime of the project. In this work, the geometry of the LHeC detector, as reported in the LHeC Conceptual Design Report (CDR), was built in the FLUKA Monte Carlo tool in order to simulate the detector response and radiation environment. For this purpose, electrons and protons with sufficiently high energy were sent isotropically from the interaction point of the detector. As a result, the detector response and radiation background for the LHeC detector, scored with different FLUKA USRBIN quantities (ENERGY, HADGT20M, ALL-CHAR, ALL-PAR), are presented.

  14. Measurements and simulations of the radiation exposure to aircraft crew workplaces due to cosmic radiation in the atmosphere.

    PubMed

    Beck, P; Latocha, M; Dorman, L; Pelliccioni, M; Rollet, S

    2007-01-01

    As required by the European Directive 96/29/Euratom, radiation exposure due to natural ionizing radiation has to be taken into account at workplaces if the effective dose could exceed 1 mSv per year. An example of workers concerned by this directive is aircraft crew, owing to cosmic radiation exposure in the atmosphere. Extensive measurement campaigns on board aircraft have been carried out to assess the ambient dose equivalent. A consortium of European dosimetry institutes within EURADOS WG5 summarized experimental data and results of calculations, together with detailed descriptions of the methods for measurements and calculations. The radiation protection quantity of interest is the effective dose, E (ISO). The comparison of measured and calculated results is done in terms of the operational quantity ambient dose equivalent, H*(10). This paper gives an overview of the EURADOS Aircraft Crew In-Flight Database and presents a new empirical model describing fitting functions for these data. Furthermore, it describes numerical simulations performed with the Monte Carlo code FLUKA-2005 using an updated version of the cosmic radiation primary spectra. The ratio between ambient dose equivalent and effective dose at commercial flight altitudes, calculated with FLUKA-2005, is discussed. Finally, it presents the aviation dosimetry model AVIDOS, based on FLUKA-2005 simulations, for routine dose assessment. The code has been developed by Austrian Research Centers (ARC) for public use (http://avidos.healthphysics.at).

  15. Benchmark of neutron production cross sections with Monte Carlo codes

    NASA Astrophysics Data System (ADS)

    Tsai, Pi-En; Lai, Bo-Lun; Heilbronn, Lawrence H.; Sheu, Rong-Jiun

    2018-02-01

    Aiming to provide critical information in the fields of heavy ion therapy, radiation shielding in space, and facility design for heavy-ion research accelerators, the physics models in three Monte Carlo simulation codes - PHITS, FLUKA, and MCNP6 - were systematically benchmarked with comparisons to fifteen sets of experimental data for neutron production cross sections, which include various combinations of 12C, 20Ne, 40Ar, 84Kr and 132Xe projectiles and natLi, natC, natAl, natCu, and natPb target nuclides at incident energies between 135 MeV/nucleon and 600 MeV/nucleon. For neutron energies above 60% of the specific projectile energy per nucleon, the LAQGSM03.03 in MCNP6, the JQMD/JQMD-2.0 in PHITS, and the RQMD-2.4 in FLUKA all show a better agreement with data in heavy-projectile systems than with light-projectile systems, suggesting that the collective properties of projectile nuclei and nucleon interactions in the nucleus should be considered for light projectiles. For intermediate-energy neutrons whose energies are below 60% of the projectile energy per nucleon and above 20 MeV, FLUKA is likely to overestimate the secondary neutron production, while MCNP6 tends towards underestimation. PHITS with JQMD shows a mild tendency for underestimation, but the JQMD-2.0 model with a modified physics description for central collisions generally improves the agreement between data and calculations. For low-energy neutrons (below 20 MeV), which are dominated by the evaporation mechanism, PHITS (which uses GEM linked with JQMD and JQMD-2.0) and FLUKA both tend to overestimate the production cross section, whereas MCNP6 tends to underestimate more systems than to overestimate. For total neutron production cross sections, the trends of the benchmark results over the entire energy range are similar to the trends seen in the dominant energy region.
Also, the comparison of GEM coupled with either JQMD or JQMD-2.0 in the PHITS code indicates that the model used to describe the first stage of a nucleus-nucleus collision also affects the low-energy neutron production. Thus, in this case, a proper combination of two physics models is desired to reproduce the measured results. In addition, code users should be aware that certain models consistently produce secondary neutrons within a constant fraction of another model in certain energy regions, which might be correlated to different physics treatments in different models.

  16. Investigation of HZETRN 2010 as a Tool for Single Event Effect Qualification of Avionics Systems

    NASA Technical Reports Server (NTRS)

    Rojdev, Kristina; Koontz, Steve; Atwell, William; Boeder, Paul

    2014-01-01

    NASA's future missions are focused on long-duration deep space missions for human exploration, which offer no option for a quick emergency return to Earth. The combination of long mission duration with no quick emergency return option leads to unprecedented spacecraft system safety and reliability requirements. It is important that spacecraft avionics systems for human deep space missions are not susceptible to Single Event Effect (SEE) failures caused by space radiation (primarily the continuous galactic cosmic ray background and the occasional solar particle event) interactions with electronic components and systems. SEE effects are typically managed during the design, development, and test (DD&T) phase of spacecraft development by using heritage hardware (if possible) and through extensive component-level testing, followed by system-level failure analysis tasks that are both time consuming and costly. The ultimate product of the SEE DD&T program is a prediction of spacecraft avionics reliability in the flight environment, produced using various nuclear reaction and transport codes in combination with the component and subsystem level radiation test data. Previous work by Koontz, et al. [1] utilized FLUKA, a Monte Carlo nuclear reaction and transport code, to calculate SEE and single event upset (SEU) rates. This code was then validated against in-flight data for a variety of spacecraft and space flight environments. However, FLUKA has a long run-time (on the order of days). CREME96 [2], an easy-to-use deterministic code offering short run times, was also compared with FLUKA predictions and in-flight data. CREME96, though fast and easy to use, has not been updated in several years and underestimates secondary particle shower effects in spacecraft structural shielding mass.
Thus, this paper will investigate the use of HZETRN 2010 [3], a fast and easy-to-use deterministic transport code, similar to CREME96, that was developed at NASA Langley Research Center primarily for flight crew ionizing radiation dose assessments. HZETRN 2010 includes updates to address secondary particle shower effects more accurately, and might be used as another tool to verify spacecraft avionics system reliability in space flight SEE environments.

  17. Prompt radiation, shielding and induced radioactivity in a high-power 160 MeV proton linac

    NASA Astrophysics Data System (ADS)

    Magistris, Matteo; Silari, Marco

    2006-06-01

    CERN is designing a 160 MeV proton linear accelerator, both for a future intensity upgrade of the LHC and as a possible first stage of a 2.2 GeV superconducting proton linac. A first estimate of the required shielding was obtained by means of a simple analytical model. The source terms and the attenuation lengths used in the present study were calculated with the Monte Carlo cascade code FLUKA. Detailed FLUKA simulations were performed to investigate the contribution of neutron skyshine and backscattering to the expected dose rate in the areas around the linac tunnel. An estimate of the induced radioactivity in the magnets, vacuum chamber, the cooling system and the concrete shield was performed. A preliminary thermal study of the beam dump is also discussed.

  18. Energy deposition studies for the high-luminosity Large Hadron Collider inner triplet magnets

    NASA Astrophysics Data System (ADS)

    Mokhov, N. V.; Rakhno, I. L.; Tropin, I. S.; Cerutti, F.; Esposito, L. S.; Lechner, A.

    2015-05-01

    A detailed model of the high-luminosity LHC inner triplet region with new large-aperture Nb3Sn magnets, field maps, corrector packages, and segmented tungsten inner absorbers was built and implemented into the FLUKA and MARS15 codes. Detailed simulations have been performed coherently with the codes on the impact of particle debris from the 14-TeV center-of-mass pp-collisions on the short- and long-term stability of the inner triplet magnets. After optimizing the absorber configuration, the peak power density averaged over the magnet inner cable width is found to be safely below the quench limit at the luminosity of 5 × 10^34 cm^-2 s^-1. For the anticipated lifetime integrated luminosity of 3000 fb^-1, the peak dose calculated for the innermost magnet insulator ranges from 20 to 35 MGy, a figure close to the commonly accepted limit. Dynamic heat loads to the triplet magnet cold mass are calculated to evaluate the cryogenic capability. FLUKA and MARS15 results on energy deposition are in very good agreement.

  19. Neutron Productions from thin Be target irradiated by 50 MeV/u 238U beam

    NASA Astrophysics Data System (ADS)

    Lee, Hee-Seock; Oh, Joo-Hee; Jung, Nam-Suk; Oranj, Leila Mokhtari; Nakao, Noriaki; Uwamino, Yoshitomo

    2017-09-01

    Neutrons generated from a thin beryllium target by a 50 MeV/u 238U beam were measured using activation analysis at 15, 30, 45, and 90 degrees from the beam direction. A 0.085 mm-thick Be stripper of RIBF was used as the neutron-generating target. Activation detectors of bismuth, cobalt, and aluminum were placed outside the stripper chamber. The threshold reactions 209Bi(n, xn)210-xBi (x = 4-8), 59Co(n, xn)60-xCo (x = 2-5), 59Co(n, 2nα)54Mn, 27Al(n, α)24Na, and 27Al(n, 2nα)22Na were applied to measure the production rates of radionuclides. The neutron spectra were obtained using an unfolding method with the SAND-II code. All of the production rates and neutron spectra were compared with results calculated using the Monte Carlo codes PHITS and FLUKA. The FLUKA results showed better agreement with the measurements than the PHITS results. The discrepancies between the measurements and the calculations are discussed.
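    The SAND-II unfolding mentioned above iteratively rescales a guess spectrum until the folded activation rates reproduce the measurements. A minimal sketch of that multiplicative update, with a purely illustrative response matrix rather than the actual Bi/Co/Al responses, might look like:

```python
import numpy as np

def sand_ii_unfold(R, A_meas, phi0, iterations=500):
    """SAND-II-style iterative unfolding (multiplicative log-space update).

    R[i, j]  : response of activation reaction i to neutrons in energy bin j
    A_meas[i]: measured production rate for reaction i
    phi0[j]  : initial guess spectrum (must be positive)
    """
    phi = np.asarray(phi0, dtype=float).copy()
    for _ in range(iterations):
        A_calc = R @ phi                        # fold guess into predicted rates
        # weight: fraction of reaction i's response carried by energy bin j
        W = R * phi / A_calc[:, None]
        log_corr = (W * np.log(A_meas / A_calc)[:, None]).sum(0) / W.sum(0)
        phi *= np.exp(log_corr)                 # rescale each bin
    return phi
```

    Because the problem is underdetermined (a few reactions, many energy bins), the unfolded spectrum retains a memory of the initial guess; only reproduction of the measured rates is enforced.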

  20. Residual activity evaluation: a benchmark between ANITA, FISPACT, FLUKA and PHITS codes

    NASA Astrophysics Data System (ADS)

    Firpo, Gabriele; Viberti, Carlo Maria; Ferrari, Anna; Frisoni, Manuela

    2017-09-01

    The activity of residual nuclides dictates the radiation fields during periodic inspections/repairs (maintenance periods) and dismantling operations (decommissioning phase) of accelerator facilities (e.g., medical, industrial, research) and nuclear reactors. Therefore, correct prediction of material activation allows for more accurate planning of these activities, in line with the ALARA (As Low As Reasonably Achievable) principles. The scope of the present work is to show the results of a comparison of residual total specific activity over a set of cooling times (from zero up to 10 years after irradiation) as obtained by two analytical (FISPACT and ANITA) and two Monte Carlo (FLUKA and PHITS) codes, making use of their default nuclear data libraries. A set of 40 irradiation scenarios is considered, i.e. neutrons and protons of different energies, ranging from zero to many hundreds of MeV, impinging on pure elements or materials of standard composition typically used in industrial applications (namely, AISI SS316 and Portland concrete). In some cases, experimental results were also available for a more thorough benchmark.
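    As a toy illustration of the quantity being benchmarked, the total residual activity at a given cooling time is just the sum of exponentially decaying nuclide contributions. The inventory below is entirely hypothetical, not taken from the paper's scenarios:

```python
import numpy as np

# Hypothetical residual-nuclide inventory at shutdown: (activity Bq, half-life s)
inventory = {
    "Na-24": (8.0e6, 14.96 * 3600.0),      # short-lived, ~15 h
    "Mn-54": (2.0e6, 312.0 * 86400.0),     # ~312 d
    "Co-60": (5.0e6, 5.27 * 3.156e7),      # ~5.27 y
}

def residual_activity(t_cool):
    """Total residual activity (Bq) after a cooling time t_cool in seconds."""
    return sum(a0 * np.exp(-np.log(2) * t_cool / t_half)
               for a0, t_half in inventory.values())
```

    Once the shutdown inventory is known, both the analytical and the Monte Carlo codes ultimately reduce the cooling-time dependence to sums of this form.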

  1. Energy deposition studies for the high-luminosity Large Hadron Collider inner triplet magnets

    DOE PAGES

    Mokhov, N. V.; Rakhno, I. L.; Tropin, I. S.; ...

    2015-05-06

    A detailed model of the high-luminosity LHC inner triplet region with new large-aperture Nb3Sn magnets, field maps, corrector packages, and segmented tungsten inner absorbers was built and implemented into the FLUKA and MARS15 codes. Detailed simulations have been performed coherently with the codes on the impact of particle debris from the 14-TeV center-of-mass pp-collisions on the short- and long-term stability of the inner triplet magnets. After optimizing the absorber configuration, the peak power density averaged over the magnet inner cable width is found to be safely below the quench limit at the luminosity of 5 × 10^34 cm^-2 s^-1. For the anticipated lifetime integrated luminosity of 3000 fb^-1, the peak dose calculated for the innermost magnet insulator ranges from 20 to 35 MGy, a figure close to the commonly accepted limit. Dynamic heat loads to the triplet magnet cold mass are calculated to evaluate the cryogenic capability. FLUKA and MARS results on energy deposition are in very good agreement.

  2. A new three-tier architecture design for multi-sphere neutron spectrometer with the FLUKA code

    NASA Astrophysics Data System (ADS)

    Huang, Hong; Yang, Jian-Bo; Tuo, Xian-Guo; Liu, Zhi; Wang, Qi-Biao; Wang, Xu

    2016-07-01

    The currently available commercial Bonner sphere neutron spectrometer (BSS) has high sensitivity to neutrons below 20 MeV, which leaves it poorly suited to measuring neutrons from a few MeV up to 100 MeV. In this work, moderator layers and an auxiliary material layer were added around 3He proportional counters and modelled with the FLUKA code with the aim of improving this response. The results showed that the response peaks for neutrons below 20 MeV gradually shift to higher energies and decrease slightly with increasing moderator thickness. By contrast, the response to neutrons above 20 MeV remained very low until auxiliary materials such as copper (Cu), lead (Pb), or tungsten (W) were embedded in the moderator layers. Pb was chosen as the most suitable auxiliary material for the design of a three-tier multi-sphere neutron spectrometer (NBSS). Calculations and comparisons showed that the NBSS is advantageous in terms of response for 5-100 MeV neutrons, with a highest response of 35.2 times that of a polyethylene (PE) sphere with the same PE thickness.

  3. The estimation of background production by cosmic rays in high-energy gamma ray telescopes

    NASA Technical Reports Server (NTRS)

    Edwards, H. L.; Nolan, P. L.; Lin, Y. C.; Koch, D. G.; Bertsch, D. L.; Fichtel, C. E.; Hartman, R. C.; Hunter, S. D.; Kniffen, D. A.; Hughes, E. B.

    1991-01-01

    A calculational method of estimating instrumental background in high-energy gamma-ray telescopes, using the hadronic Monte Carlo code FLUKA87, is presented. The method is applied to the SAS-2 and EGRET telescope designs and is also used to explore the level of background to be expected for alternative configurations of the proposed GRITS telescope, which adapts the external fuel tank of a Space Shuttle as a gamma-ray telescope with a very large collecting area. The background produced in proton-beam tests of EGRET is much less than the predicted level. This discrepancy appears to be due to the inability of FLUKA87 to transport evaporation nucleons. It is predicted that the background in EGRET will be no more than 4-10 percent of the extragalactic diffuse gamma radiation.

  4. Use of borated polyethylene to improve low energy response of a prompt gamma based neutron dosimeter

    NASA Astrophysics Data System (ADS)

    Priyada, P.; Ashwini, U.; Sarkar, P. K.

    2016-05-01

    The feasibility of using a combined sample of borated polyethylene and normal polyethylene to estimate the neutron ambient dose equivalent from measured prompt gamma emissions is investigated theoretically, to demonstrate improved low-energy neutron dose response compared to polyethylene alone. Monte Carlo simulations have been carried out using the FLUKA code to calculate the response of boron, hydrogen, and carbon prompt gamma emissions to monoenergetic neutrons. The weighted least squares method is employed to arrive at the best linear combination of these responses that approximates the ICRP fluence-to-dose conversion coefficients well in the energy range of 10^-8 MeV to 14 MeV. The configuration of the combined system is optimized through FLUKA simulations. The proposed method is validated theoretically with five different workplace neutron spectra, with satisfactory results.
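    The weighted least squares step described above can be sketched as follows; the response matrix, target coefficients, and weights are illustrative placeholders, not the paper's FLUKA-computed values:

```python
import numpy as np

# Rows: neutron energy groups; columns: B, H, C prompt-gamma responses (toy numbers)
R = np.array([[5.0, 0.2, 0.01],
              [2.0, 0.8, 0.05],
              [0.5, 1.5, 0.20],
              [0.1, 2.0, 0.60]])
h = np.array([1.0, 1.2, 2.5, 4.0])   # target fluence-to-dose coefficients
w = np.array([1.0, 1.0, 2.0, 2.0])   # weights emphasising selected groups

# Weighted least squares: minimise || sqrt(W) (R c - h) ||^2 over the mixing
# coefficients c, then use R @ c as the dose response of the combined system.
sw = np.sqrt(w)
c, *_ = np.linalg.lstsq(sw[:, None] * R, sw * h, rcond=None)
dose_estimate = R @ c
```

    The fitted coefficients c play the role of the optimal linear combination of the boron, hydrogen, and carbon responses.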

  5. Radiological Studies for the LCLS Beam Abort System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Santana Leitner, M.; Vollaire, J.; Mao, X.S.

    2008-03-25

    The Linac Coherent Light Source (LCLS), a pioneering hard x-ray free electron laser, is currently under construction at the Stanford Linear Accelerator Center. It is expected that by 2009 LCLS will deliver laser pulses of unprecedented brightness and short length, which will be used in several forefront research applications. This ambitious project poses major radiation protection design challenges, such as the numerous loss sources and the number of objects to be surveyed. To address these, the showers from various loss sources have been tracked through a detailed model covering 1/2 mile of the LCLS accelerator by means of the Monte Carlo intranuclear cascade codes FLUKA and MARS15. This article covers the FLUKA studies of heat load, prompt and residual dose, and environmental impact for the LCLS beam abort system.

  6. SU-E-T-569: Neutron Shielding Calculation Using Analytical and Multi-Monte Carlo Method for Proton Therapy Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cho, S; Shin, E H; Kim, J

    2015-06-15

    Purpose: To evaluate the shielding wall design protecting patients, staff, and members of the general public from secondary neutrons, using a simple analytic solution and the multi-Monte Carlo codes MCNPX, ANISN, and FLUKA. Methods: Analytical and multi-Monte Carlo calculations were performed for the proton facility (Sumitomo Heavy Industry Ltd.) at Samsung Medical Center in Korea. The NCRP-144 analytical evaluation methods, which produce conservative estimates of the dose equivalent values for the shielding, were used for the analytical evaluations. The radiation transport was then simulated with the multi-Monte Carlo codes. The neutron dose at each evaluation point was obtained as the product of the simulated fluence and the neutron dose coefficients introduced in ICRP-74. Results: The evaluation points at the accelerator control room and the control room entrance are mainly influenced by the location of the proton beam loss. The neutron dose equivalent at the accelerator control room evaluation point is 0.651, 1.530, 0.912, and 0.943 mSv/yr, and at the entrance of the cyclotron room 0.465, 0.790, 0.522, and 0.453 mSv/yr, as calculated by the NCRP-144 formalism, ANISN, FLUKA, and MCNPX, respectively. Most of the MCNPX and FLUKA results, which used the complicated geometry, were smaller than the ANISN results. Conclusion: The neutron shielding for a proton therapy facility has been evaluated by the analytic model and multi-Monte Carlo methods. We confirmed that the shielding provides adequate protection for areas accessible to people when the proton facility is operated.
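    The fluence-folding step in the Methods, where the simulated neutron fluence is multiplied by ICRP-74 dose coefficients, amounts to a simple group-wise product; all numbers below are illustrative, not the ICRP-74 values or the facility's spectra:

```python
import numpy as np

# Group-wise neutron fluence at an evaluation point (n/cm^2) and hypothetical
# fluence-to-ambient-dose-equivalent coefficients (uSv per n/cm^2)
fluence = np.array([2.0e3, 5.0e3, 1.0e3, 4.0e2])
h_coeff = np.array([1.1e-5, 8.0e-6, 5.0e-5, 4.0e-4])

# Folded ambient dose equivalent at the evaluation point
dose_uSv = float(fluence @ h_coeff)
```

    In practice the fluence would be tallied per energy group by the transport code, and the coefficients interpolated from the ICRP-74 tables onto the same group structure.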

  7. Development of a Space Radiation Monte-Carlo Computer Simulation Based on the FLUKA and ROOT Codes

    NASA Technical Reports Server (NTRS)

    Pinsky, L. S.; Wilson, T. L.; Ferrari, A.; Sala, Paola; Carminati, F.; Brun, R.

    2001-01-01

    The radiation environment in space is a complex problem to model. Trying to extrapolate the projections of that environment into all areas of the internal spacecraft geometry is even more daunting. With the support of our CERN colleagues, our research group in Houston is embarking on a project to develop a radiation transport tool that is tailored to the problem of taking the external radiation flux incident on any particular spacecraft and simulating the evolution of that flux through a geometrically accurate model of the spacecraft material. The output will be a prediction of the detailed nature of the resulting internal radiation environment within the spacecraft as well as its secondary albedo. Beyond doing the physics transport of the incident flux, the software tool we are developing will provide a self-contained stand-alone object-oriented analysis and visualization infrastructure. It will also include a graphical user interface and a set of input tools to facilitate the simulation of space missions in terms of nominal radiation models and mission trajectory profiles. The goal of this project is to produce a code that is considerably more accurate and user-friendly than existing Monte-Carlo-based tools for the evaluation of the space radiation environment. Furthermore, the code will be an essential complement to the currently existing analytic codes in the BRYNTRN/HZETRN family for the evaluation of radiation shielding. The code will be directly applicable to the simulation of environments in low earth orbit, on the lunar surface, on planetary surfaces (including the Earth) and in the interplanetary medium such as on a transit to Mars (and even in the interstellar medium). The software will include modules whose underlying physics base can continue to be enhanced and updated for physics content, as future data become available beyond the timeframe of the initial development now foreseen. 
This future maintenance will be available from the authors of FLUKA as part of their continuing efforts to support the users of the FLUKA code within the particle physics community. In keeping with the spirit of developing an evolving physics code, we are planning as part of this project, to participate in the efforts to validate the core FLUKA physics in ground-based accelerator test runs. The emphasis of these test runs will be the physics of greatest interest in the simulation of the space radiation environment. Such a tool will be of great value to planners, designers and operators of future space missions, as well as for the design of the vehicles and habitats to be used on such missions. It will also be of aid to future experiments of various kinds that may be affected at some level by the ambient radiation environment, or in the analysis of hybrid experiment designs that have been discussed for space-based astronomy and astrophysics. The tool will be of value to the Life Sciences personnel involved in the prediction and measurement of radiation doses experienced by the crewmembers on such missions. In addition, the tool will be of great use to the planners of experiments to measure and evaluate the space radiation environment itself. It can likewise be useful in the analysis of safe havens, hazard mitigation plans, and NASA's call for new research in composites, and to NASA engineers modeling the radiation exposure of electronic circuits. This code will provide an important complementary check on the predictions of analytic codes such as BRYNTRN/HZETRN that are presently used for many similar applications, and which have shortcomings that are more easily overcome with Monte Carlo type simulations. Finally, it is acknowledged that there are similar efforts based around the use of the GEANT4 Monte-Carlo transport code currently under development at CERN. 
It is our intention to make our software modular and sufficiently flexible to allow the parallel use of either FLUKA or GEANT4 as the physics transport engine.

  8. Benchmarking of Neutron Production of Heavy-Ion Transport Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Remec, Igor; Ronningen, Reginald M.; Heilbronn, Lawrence

    Accurate prediction of radiation fields generated by heavy ion interactions is important in medical applications, space missions, and in design and operation of rare isotope research facilities. In recent years, several well-established computer codes in widespread use for particle and radiation transport calculations have been equipped with the capability to simulate heavy ion transport and interactions. To assess and validate these capabilities, we performed simulations of a series of benchmark-quality heavy ion experiments with the computer codes FLUKA, MARS15, MCNPX, and PHITS. We focus on the comparisons of secondary neutron production. Results are encouraging; however, further improvements in models and codes and additional benchmarking are required.

  9. Benchmarking of Heavy Ion Transport Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Remec, Igor; Ronningen, Reginald M.; Heilbronn, Lawrence

    Accurate prediction of radiation fields generated by heavy ion interactions is important in medical applications, space missions, and in design and operation of rare isotope research facilities. In recent years, several well-established computer codes in widespread use for particle and radiation transport calculations have been equipped with the capability to simulate heavy ion transport and interactions. To assess and validate these capabilities, we performed simulations of a series of benchmark-quality heavy ion experiments with the computer codes FLUKA, MARS15, MCNPX, and PHITS. We focus on the comparisons of secondary neutron production. Results are encouraging; however, further improvements in models and codes and additional benchmarking are required.

  10. The MONET code for the evaluation of the dose in hadrontherapy

    NASA Astrophysics Data System (ADS)

    Embriaco, A.

    2018-01-01

    MONET is a code for the computation of the 3D dose distribution of protons in water. For the lateral profile, MONET is based on the Molière theory of multiple Coulomb scattering. To also take nuclear interactions into account, we add to this theory a Cauchy-Lorentz function, whose two parameters are obtained by a fit to a FLUKA simulation. We have implemented the Papoulis algorithm for the passage from the projected distribution to a 2D lateral distribution. For the longitudinal profile, we have implemented a new calculation of the energy loss that is in good agreement with simulations. The inclusion of energy straggling is based on the convolution of the energy loss with a Gaussian function. To complete the longitudinal profile, the nuclear contributions are also included using a linear parametrization. The total dose profile is calculated on a 3D mesh by evaluating the 2D lateral distribution at each depth and scaling it to the value of the energy deposition. We have compared MONET with FLUKA in two cases: a single Gaussian beam and a lateral scan. In both cases, we obtained good agreement for different proton energies in water.
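    The lateral model described above, a Gaussian core from Molière theory plus a Cauchy-Lorentz nuclear halo, can be sketched as a normalized two-component profile; the parameter values here are arbitrary, whereas in MONET the halo weight and width come from a fit to FLUKA:

```python
import numpy as np

def lateral_profile(x, sigma, w_n, gamma):
    """Two-component lateral dose profile (both components area-normalized).

    x     : lateral positions
    sigma : Gaussian width from multiple Coulomb scattering
    w_n   : weight of the nuclear-halo (Cauchy-Lorentz) component
    gamma : Lorentz half-width at half-maximum
    """
    gauss = np.exp(-x**2 / (2.0 * sigma**2)) / (np.sqrt(2.0 * np.pi) * sigma)
    lorentz = gamma / (np.pi * (x**2 + gamma**2))
    return (1.0 - w_n) * gauss + w_n * lorentz
```

    The heavy Lorentz tails are what capture the dose deposited far off-axis by nuclear secondaries, which a single Gaussian underestimates.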

  11. The Application of FLUKA to Dosimetry and Radiation Therapy

    NASA Technical Reports Server (NTRS)

    Wilson, Thomas L.; Andersen, Victor; Pinsky, Lawrence; Ferrari, Alfredo; Battistoni, Giusenni

    2005-01-01

    Monte Carlo transport codes like FLUKA are useful for many purposes, and one of those is the simulation of the effects of radiation traversing the human body. In particular, radiation has been used in cancer therapy for a long time, and recently this has been extended to include heavy ion particle beams. The advent of this particular type of therapy has led to the need for increased capabilities in the transport codes used to simulate the detailed nature of the treatment doses to the tissues that are encountered. This capability is also of interest to NASA because of the nature of the radiation environment in space [1]. While in space, the crew members' bodies are continually being traversed by virtually all forms of radiation. In assessing the risk that this exposure causes, heavy ions are of primary importance. These arise both from the primary external space radiation itself, as well as from fragments that result from interactions during the traversal of that radiation through any intervening material, including body tissue itself. Thus the capability to accurately characterize the details of the radiation field within a human body subjected to such external "beams" is of critical importance.

  12. Monte Carlo Simulation of a Segmented Detector for Low-Energy Electron Antineutrinos

    NASA Astrophysics Data System (ADS)

    Qomi, H. Akhtari; Safari, M. J.; Davani, F. Abbasi

    2017-11-01

    Detection of low-energy electron antineutrinos is of importance for several purposes, such as ex-vessel reactor monitoring, neutrino oscillation studies, etc. The inverse beta decay (IBD) interaction is responsible for the detection mechanism in (organic) plastic scintillation detectors. Here, a detailed study is presented of the radiation and optical transport simulation of a typical segmented antineutrino detector with the Monte Carlo method using the MCNPX and FLUKA codes. This study shows different aspects of the detector, benefiting from the inherent capabilities of the Monte Carlo simulation codes.

  13. Experimental study and simulation of 63Zn production via proton-induced reaction.

    PubMed

    Rostampour, Malihe; Sadeghi, Mahdi; Aboudzadeh, Mohammadreza; Hamidi, Saeid; Soltani, Naser; Novin, Fatemeh Bolouri; Rahiminejad, Ali; Rajabifar, Saeid

    2018-06-01

    63Zn was produced by 16.8 MeV proton irradiation of natural copper. The thick-target yield for 63Zn in the energy range 16.8 → 12.2 MeV was 2.47 ± 0.12 GBq/μA·h. Reasonable agreement between the achieved experimental data and the theoretical value of the thick-target yield for 63Zn was observed. A simple separation procedure for 63Zn from the copper target was developed using cation exchange chromatography. About 88 ± 5% of the loaded activity was recovered. The performance of FLUKA in reproducing the experimental thick-target yield of 63Zn was validated: the results from this code were compared with the corresponding experimental data, and the comparison demonstrated that FLUKA provides a suitable tool for the simulation of radionuclide production using proton irradiation. Copyright © 2018 Elsevier Ltd. All rights reserved.

  14. Benchmarking of neutron production of heavy-ion transport codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Remec, I.; Ronningen, R. M.; Heilbronn, L.

    Document available in abstract form only, full text of document follows: Accurate prediction of radiation fields generated by heavy ion interactions is important in medical applications, space missions, and in design and operation of rare isotope research facilities. In recent years, several well-established computer codes in widespread use for particle and radiation transport calculations have been equipped with the capability to simulate heavy ion transport and interactions. To assess and validate these capabilities, we performed simulations of a series of benchmark-quality heavy ion experiments with the computer codes FLUKA, MARS15, MCNPX, and PHITS. We focus on the comparisons of secondary neutron production. Results are encouraging; however, further improvements in models and codes and additional benchmarking are required. (authors)

  15. Experimental and Monte Carlo studies of fluence corrections for graphite calorimetry in low- and high-energy clinical proton beams.

    PubMed

    Lourenço, Ana; Thomas, Russell; Bouchard, Hugo; Kacperek, Andrzej; Vondracek, Vladimir; Royle, Gary; Palmans, Hugo

    2016-07-01

    The aim of this study was to determine fluence corrections necessary to convert absorbed dose to graphite, measured by graphite calorimetry, to absorbed dose to water. Fluence corrections were obtained from experiments and Monte Carlo simulations in low- and high-energy proton beams. Fluence corrections were calculated to account for the difference in fluence between water and graphite at equivalent depths. Measurements were performed with narrow proton beams. Plane-parallel-plate ionization chambers with a large collecting area compared to the beam diameter were used to intercept the whole beam. High- and low-energy proton beams were provided by a scanning and double scattering delivery system, respectively. A mathematical formalism was established to relate fluence corrections derived from Monte Carlo simulations, using the FLUKA code [A. Ferrari et al., "FLUKA: A multi-particle transport code," in CERN 2005-10, INFN/TC 05/11, SLAC-R-773 (2005) and T. T. Böhlen et al., "The FLUKA Code: Developments and challenges for high energy and medical applications," Nucl. Data Sheets 120, 211-214 (2014)], to partial fluence corrections measured experimentally. A good agreement was found between the partial fluence corrections derived by Monte Carlo simulations and those determined experimentally. For a high-energy beam of 180 MeV, the fluence corrections from Monte Carlo simulations were found to increase from 0.99 to 1.04 with depth. In the case of a low-energy beam of 60 MeV, the magnitude of fluence corrections was approximately 0.99 at all depths when calculated in the sensitive area of the chamber used in the experiments. Fluence correction calculations were also performed for a larger area and found to increase from 0.99 at the surface to 1.01 at greater depths. Fluence corrections obtained experimentally are partial fluence corrections because they account for differences in the primary and part of the secondary particle fluence. 
A correction factor, F(d), has been established to relate fluence corrections defined theoretically to partial fluence corrections derived experimentally. The findings presented here are also relevant to water and tissue-equivalent-plastic materials given their carbon content.
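    The central quantity of the study, the fluence correction at equivalent depths, is a ratio of the water and graphite fluence distributions. A schematic sketch, with made-up linear fluence curves rather than FLUKA-computed ones:

```python
import numpy as np

# Made-up depth-fluence curves, tabulated against water-equivalent depth (cm)
z = np.linspace(0.0, 20.0, 81)
phi_water = 1.0 - 0.0040 * z        # slow loss of primaries in water
phi_graphite = 1.0 - 0.0055 * z     # slightly faster loss in graphite

def fluence_correction(depth):
    """k_fl = Phi_water / Phi_graphite at the same water-equivalent depth."""
    return np.interp(depth, z, phi_water) / np.interp(depth, z, phi_graphite)
```

    With these assumed curves the correction grows from unity at the surface, qualitatively matching the reported 0.99 to 1.04 trend for the 180 MeV beam.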

  16. Evaluation of the water-equivalence of plastic materials in low- and high-energy clinical proton beams

    NASA Astrophysics Data System (ADS)

    Lourenço, A.; Shipley, D.; Wellock, N.; Thomas, R.; Bouchard, H.; Kacperek, A.; Fracchiolla, F.; Lorentini, S.; Schwarz, M.; MacDougall, N.; Royle, G.; Palmans, H.

    2017-05-01

    The aim of this work was to evaluate the water-equivalence of new trial plastics designed specifically for light-ion beam dosimetry as well as commercially available plastics in clinical proton beams. The water-equivalence of materials was tested by computing a plastic-to-water conversion factor, H_pl,w. Trial materials were characterized experimentally in 60 MeV and 226 MeV un-modulated proton beams and the results were compared with Monte Carlo simulations using the FLUKA code. For the high-energy beam, a comparison between the trial plastics and various commercial plastics was also performed using the FLUKA and Geant4 Monte Carlo codes. Experimental information was obtained from laterally integrated depth-dose ionization chamber measurements in water, with and without plastic slabs of variable thickness in front of the water phantom. Fluence correction factors, k_fl, between water and various materials were also derived using the Monte Carlo method. For the 60 MeV proton beam, H_pl,w and k_fl factors were within 1% of unity for all trial plastics. For the 226 MeV proton beam, experimental H_pl,w values deviated from unity by a maximum of about 1% for the three trial plastics, and the experimental results showed no advantage regarding which of the plastics was the most water-equivalent. Corrections of different magnitudes were found between Geant4 and FLUKA for the various materials, due mainly to the use of different nonelastic nuclear data. Nevertheless, for the 226 MeV proton beam, H_pl,w correction factors were within 2% of unity for all the materials. Considering the results from the two Monte Carlo codes, PMMA and trial plastic #3 had the smallest H_pl,w values, with maximum deviations from unity of 1%; however, the PMMA range differed by 16% from that of water. 
    Overall, k_fl factors deviated more from unity than H_pl,w factors and could amount to a few percent for some materials.

  17. Evaluation of the water-equivalence of plastic materials in low- and high-energy clinical proton beams.

    PubMed

    Lourenço, A; Shipley, D; Wellock, N; Thomas, R; Bouchard, H; Kacperek, A; Fracchiolla, F; Lorentini, S; Schwarz, M; MacDougall, N; Royle, G; Palmans, H

    2017-05-21

    The aim of this work was to evaluate the water-equivalence of new trial plastics designed specifically for light-ion beam dosimetry as well as commercially available plastics in clinical proton beams. The water-equivalence of materials was tested by computing a plastic-to-water conversion factor, H_pl,w. Trial materials were characterized experimentally in 60 MeV and 226 MeV un-modulated proton beams and the results were compared with Monte Carlo simulations using the FLUKA code. For the high-energy beam, a comparison between the trial plastics and various commercial plastics was also performed using the FLUKA and Geant4 Monte Carlo codes. Experimental information was obtained from laterally integrated depth-dose ionization chamber measurements in water, with and without plastic slabs of variable thickness in front of the water phantom. Fluence correction factors, k_fl, between water and various materials were also derived using the Monte Carlo method. For the 60 MeV proton beam, H_pl,w and k_fl factors were within 1% of unity for all trial plastics. For the 226 MeV proton beam, experimental H_pl,w values deviated from unity by a maximum of about 1% for the three trial plastics, and the experimental results showed no advantage regarding which of the plastics was the most water-equivalent. Corrections of different magnitudes were found between Geant4 and FLUKA for the various materials, due mainly to the use of different nonelastic nuclear data. Nevertheless, for the 226 MeV proton beam, H_pl,w correction factors were within 2% of unity for all the materials. Considering the results from the two Monte Carlo codes, PMMA and trial plastic #3 had the smallest H_pl,w values, with maximum deviations from unity of 1%; however, the PMMA range differed by 16% from that of water. 
    Overall, k_fl factors deviated more from unity than H_pl,w factors and could amount to a few percent for some materials.

  18. TU-H-BRC-09: Validation of a Novel Therapeutic X-Ray Array Source and Collimation System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trovati, S; King, GJ; Loo, BW

    2016-06-15

    Purpose: We have experimentally characterized and simulated the dosimetric properties and spatial fidelity of a novel X-ray array source and collimation system called SPHINX, which has the potential to generate complex intensity-modulated X-ray beams by varying the electron beam intensity only, without any moving parts such as multi-leaf collimators. Methods: We investigated the spatial fidelity and X-ray performance of a SPHINX prototype in tungsten, using a CyberKnife and the experimental high-energy electron beam line at XTA at SLAC National Laboratory. Dose distributions were recorded with gafchromic films placed at the distal end of SPHINX and at several depths in a solid water phantom. The geometry of SPHINX and of the experimental set-ups was also modeled in Monte Carlo (MC) simulations with the FLUKA code, used to reproduce the experimental results and, after validation, to predict and optimize the performance and design of SPHINX. Results: The results indicate significant particle leakage through the channels during a single-channel irradiation for high incident energies, followed by a rapid decrease for energies of clinical interest. When the collimator channels are used as the target, the photon production increases, however at the expense of an enlarged beam size. Illuminating all channels simultaneously shows a fairly even transmission of the beam. Conclusion: With the measurements we have verified the MC models and the uniformity of beam transmission through SPHINX, and we have evaluated the importance of particle leakage through adjacent channels. These results can be used to optimize the SPHINX design through the validated MC simulations. Funding: Weston Havens Foundation, Office of the Dean of Medical School and Office of the Provost (Stanford University). Loo, Maxim, Borchard, and Tantawi are co-founders of TibaRay Inc. Loo and Tantawi are TibaRay Inc. board members. 
    Loo and Maxim received grants from Varian Medical Systems and RaySearch Laboratory.

  19. An experimental approach to improve the Monte Carlo modelling of offline PET/CT-imaging of positron emitters induced by scanned proton beams

    NASA Astrophysics Data System (ADS)

    Bauer, J.; Unholtz, D.; Kurz, C.; Parodi, K.

    2013-08-01

    We report on the experimental campaign carried out at the Heidelberg Ion-Beam Therapy Center (HIT) to optimize the Monte Carlo (MC) modelling of proton-induced positron-emitter production. The presented experimental strategy constitutes a pragmatic inverse approach to overcome the known uncertainties in the modelling of positron-emitter production due to the lack of reliable cross-section data for the relevant therapeutic energy range. This work is motivated by the clinical implementation of offline PET/CT-based treatment verification at our facility. Here, the irradiation induced tissue activation in the patient is monitored shortly after the treatment delivery by means of a commercial PET/CT scanner and compared to a MC simulated activity expectation, derived under the assumption of a correct treatment delivery. At HIT, the MC particle transport and interaction code FLUKA is used for the simulation of the expected positron-emitter yield. For this particular application, the code is coupled to externally provided cross-section data of several proton-induced reactions. Studying experimentally the positron-emitting radionuclide yield in homogeneous phantoms provides access to the fundamental production channels. Therefore, five different materials have been irradiated by monoenergetic proton pencil beams at various energies and the induced β+ activity subsequently acquired with a commercial full-ring PET/CT scanner. With the analysis of dynamically reconstructed PET images, we are able to determine separately the spatial distribution of different radionuclide concentrations at the starting time of the PET scan. The laterally integrated radionuclide yields in depth are used to tune the input cross-section data such that the impact of both the physical production and the imaging process on the various positron-emitter yields is reproduced. 
The resulting cross-section data sets allow the absolute level of measured β+ activity induced in the investigated targets to be modelled within a few per cent. Moreover, the simulated distal activity fall-off positions, representing the central quantity for treatment monitoring in terms of beam range verification, are found to agree within 0.6 mm with the measurements at different initial beam energies in both homogeneous and heterogeneous targets. Based on work presented at the Third European Workshop on Monte Carlo Treatment Planning (Seville, 15-18 May 2012).
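    The separation of radionuclide contributions from dynamically reconstructed PET frames can be sketched as a multi-exponential fit. A minimal illustration, assuming only 11C and 15O contribute and using their physical half-lives (the full analysis also handles other species and the imaging response):

```python
import numpy as np

# Decay constants (1/min) from physical half-lives: 11C = 20.36 min, 15O = 2.04 min
LAMBDA = {"C11": np.log(2) / 20.36, "O15": np.log(2) / 2.04}

def decompose_activity(t, total):
    """Least-squares split of a measured decay curve into per-nuclide
    activities at the start of the PET scan (t = 0)."""
    basis = np.column_stack([np.exp(-lam * t) for lam in LAMBDA.values()])
    a0, *_ = np.linalg.lstsq(basis, total, rcond=None)
    return dict(zip(LAMBDA, a0))

# Round trip on a synthetic, noiseless curve
t = np.linspace(0.0, 30.0, 61)           # frame times in minutes
true_a0 = {"C11": 800.0, "O15": 1500.0}  # arbitrary activities
total = sum(a * np.exp(-LAMBDA[k] * t) for k, a in true_a0.items())
est = decompose_activity(t, total)
```

    With noiseless input the fit recovers the assumed activities exactly; real frames would add Poisson noise and scanner sensitivity into the basis functions.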

  20. Monte Carlo calculation of the atmospheric antinucleon flux

    NASA Astrophysics Data System (ADS)

    Djemil, T.; Attallah, R.; Capdevielle, J. N.

    2009-12-01

    The atmospheric antiproton and antineutron energy spectra are calculated at float altitude using the CORSIKA package in a three-dimensional Monte Carlo simulation. The hadronic interaction is treated by the FLUKA code below 80 GeV/nucleon and by NEXUS elsewhere. The solar modulation, which is described by the force field theory, and the geomagnetic effects are taken into account. The numerical results are compared with the BESS-2001 experimental data.
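    The force-field description of solar modulation reduces to a single analytic mapping between the local interstellar spectrum (LIS) and the flux at Earth. A minimal sketch with an illustrative power-law LIS (the paper's actual LIS parametrization is not given in the abstract):

```python
import math

E0 = 0.938272  # proton rest energy, GeV

def lis_proton(E):
    """Toy local interstellar spectrum (power law in total energy);
    illustrative normalization and index, not the paper's values."""
    return 1.9e4 * (E + E0) ** -2.78

def force_field(E, phi):
    """Force-field modulated proton flux at Earth for modulation
    potential phi (GV); E is kinetic energy in GeV."""
    Ei = E + phi  # kinetic energy outside the heliosphere
    return lis_proton(Ei) * E * (E + 2 * E0) / (Ei * (Ei + 2 * E0))

flux_at_1gev = force_field(1.0, 0.5)
```

    Setting phi = 0 recovers the LIS, and any positive potential suppresses the low-energy flux, which is the behaviour the force-field theory encodes.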

  1. Predicting induced radioactivity for the accelerator operations at the Taiwan Photon Source.

    PubMed

    Sheu, R J; Jiang, S H

    2010-12-01

    This study investigates the characteristics of induced radioactivity due to the operations of a 3-GeV electron accelerator at the Taiwan Photon Source (TPS). According to the beam loss analysis, the authors set two representative irradiation conditions for the activation analysis. The FLUKA Monte Carlo code has been used to predict the isotope inventories, residual activities, and remanent dose rates as a function of time. The calculation model itself is simple but conservative for the evaluation of induced radioactivity in a light source facility. This study highlights the importance of beam loss scenarios and demonstrates the great advantage of using FLUKA in comparing the predicted radioactivity with corresponding regulatory limits. The calculated results lead to the conclusion that, due to fairly low electron consumption, the radioactivity induced in the accelerator components and surrounding concrete walls of the TPS is rather moderate and manageable, while the possible activation of air and cooling water in the tunnel and their environmental releases are negligible.
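    The residual activity such a calculation tracks follows the standard activation build-up and decay law for each isotope. A minimal sketch with hypothetical numbers (not TPS beam-loss data):

```python
import math

def residual_activity(prod_rate, half_life_s, t_irr_s, t_cool_s):
    """Activity (Bq) of a single isotope produced at a constant rate
    (atoms/s) during irradiation, after a cooling period."""
    lam = math.log(2) / half_life_s
    return prod_rate * (1.0 - math.exp(-lam * t_irr_s)) * math.exp(-lam * t_cool_s)

# Illustrative: a nuclide with a 15 h half-life, produced at a
# hypothetical 1e6 atoms/s during an 8 h run, after 24 h of cooling
act = residual_activity(1e6, 15 * 3600, 8 * 3600, 24 * 3600)
```

    Summing this expression over the isotope inventory, and over realistic irradiation histories, gives the remanent dose-rate time dependence reported in such studies.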

  2. Measurements and FLUKA Simulations of Bismuth, Aluminium and Indium Activation at the upgraded CERN Shielding Benchmark Facility (CSBF)

    NASA Astrophysics Data System (ADS)

    Iliopoulou, E.; Bamidis, P.; Brugger, M.; Froeschl, R.; Infantino, A.; Kajimoto, T.; Nakao, N.; Roesler, S.; Sanami, T.; Siountas, A.; Yashima, H.

    2018-06-01

    The CERN High energy AcceleRator Mixed field (CHARM) facility is situated in the CERN Proton Synchrotron (PS) East Experimental Area. The facility receives a pulsed proton beam from the CERN PS with a beam momentum of 24 GeV/c, 5·10^11 protons per pulse, a pulse length of 350 ms and a maximum average beam intensity of 6.7·10^10 protons per second. The extracted proton beam impacts on a cylindrical copper target. The shielding of the CHARM facility includes the CERN Shielding Benchmark Facility (CSBF) situated laterally above the target, which allows deep shielding penetration benchmark studies of various shielding materials. This facility was significantly upgraded during the extended technical stop at the beginning of 2016. It now consists of 40 cm of cast iron shielding, a 200 cm long removable sample holder concrete block with 3 inserts for activation samples, and a material test location that is used for the measurement of the attenuation length of different shielding materials as well as for sample activation at different thicknesses of the shielding materials. Activation samples of bismuth, aluminium and indium were placed in the CSBF in September 2016 to characterize the upgraded version of the CSBF. Monte Carlo simulations with the FLUKA code have been performed to estimate the specific production yields of bismuth isotopes (206Bi, 205Bi, 204Bi, 203Bi, 202Bi, 201Bi) from 209Bi, 24Na from 27Al and 115mIn from 115In for these samples. The production yields estimated by FLUKA Monte Carlo simulations are compared to the production yields obtained from γ-spectroscopy measurements of the samples, taking the beam intensity profile into account. The agreement between FLUKA predictions and γ-spectroscopy measurements for the production yields is at a level of a factor of 2.
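    Converting γ-line counts into production yields per proton, taking the pulsed beam intensity profile into account, can be sketched as follows (efficiency, half-life and pulse structure here are hypothetical, not the CHARM values):

```python
import math

def production_yield(net_counts, eff, branching, lam,
                     t_start, t_end, pulse_times, pulse_protons):
    """Atoms produced per incident proton: correct the gamma-line counts
    for detection efficiency and branching, for decay during the counting
    window [t_start, t_end], and for decay between each beam pulse and
    the start of counting (all times in seconds)."""
    decays = net_counts / (eff * branching)
    atoms_at_start = decays / (1.0 - math.exp(-lam * (t_end - t_start)))
    atoms_per_yield = sum(n * math.exp(-lam * (t_start - t))
                          for t, n in zip(pulse_times, pulse_protons))
    return atoms_at_start / atoms_per_yield

# Round trip with hypothetical numbers: assume a known yield, generate
# the counts it would produce, then recover it.
lam = math.log(2) / 900.0            # 15 min half-life
pulses_t = [0.0, 30.0, 60.0]         # pulse arrival times (s)
pulses_n = [5e11, 5e11, 5e11]        # protons per pulse
y_true = 2.5e-4
atoms = y_true * sum(n * math.exp(-lam * (600.0 - t))
                     for t, n in zip(pulses_t, pulses_n))
counts = atoms * (1.0 - math.exp(-lam * 1800.0)) * 0.05 * 0.9
y_est = production_yield(counts, 0.05, 0.9, lam, 600.0, 2400.0, pulses_t, pulses_n)
```

    The same decay-weighting of the pulse train is what makes the measured yields comparable to the FLUKA predictions per primary proton.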

  3. Estimation of Airborne Radioactivity Induced by 8-GeV-Class Electron LINAC Accelerator.

    PubMed

    Asano, Yoshihiro

    2017-10-01

    Airborne radioactivity induced by high-energy electrons from 6 to 10 GeV is estimated using analytical methods and the Monte Carlo codes PHITS and FLUKA. Measurements using a gas monitor with a NaI(Tl) scintillator are carried out on air from the dump room at SACLA, an x-ray free-electron laser facility with 7.8-GeV electrons, and are compared to the simulations.

  4. Developing of a New Atmospheric Ionizing Radiation (AIR) Model

    NASA Technical Reports Server (NTRS)

    Clem, John M.; deAngelis, Giovanni; Goldhagen, Paul; Wilson, John W.

    2003-01-01

    As a result of the research leading to the 1998 AIR workshop and the subsequent analysis, the neutron issues posed by Foelsche et al. and further analyzed by Hajnal have been adequately resolved. We are now engaged in developing a new atmospheric ionizing radiation (AIR) model for use in epidemiological studies and air transportation safety assessment. A team was formed to examine a promising code using the basic FLUKA software but with modifications to allow multiple charged ion breakup effects. A limited dataset of the ER-2 measurements and other cosmic ray data will be used to evaluate the use of this code.

  5. Spacecraft Solar Particle Event (SPE) Shielding: Shielding Effectiveness as a Function of SPE Model as Determined with the FLUKA Radiation Transport Code

    NASA Astrophysics Data System (ADS)

    Koontz, S. L.; Atwell, W. A.; Reddell, B.; Rojdev, K.

    2010-12-01

    In this paper, we report the results of modeling and simulation studies in which the radiation transport code FLUKA (FLUktuierende KAskade) is used to determine the changes in total ionizing dose (TID) and single-event effect (SEE) environments behind aluminum, polyethylene, carbon, and titanium shielding masses when the assumed form (i.e., Band or Exponential) of the solar particle event (SPE) kinetic energy spectra is changed. FLUKA simulations are fully three dimensional with an isotropic particle flux incident on a concentric spherical shell shielding mass and detector structure. FLUKA is a fully integrated and extensively verified Monte Carlo simulation package for the interaction and transport of high-energy particles and nuclei in matter. The effects are reported of both energetic primary protons penetrating the shield mass and secondary particle showers caused by energetic primary protons colliding with shielding mass nuclei. SPE heavy ion spectra are not addressed. Our results, in agreement with previous studies, show that use of the Exponential form of the event spectra can seriously underestimate spacecraft SPE TID and SEE environments in some, but not all, shielding mass cases. The SPE spectra investigated are taken from four specific SPEs that produced ground-level events (GLEs) during solar cycle 23 (1997-2008). GLEs are produced by highly energetic solar particle events (ESP), i.e., those that contain significant fluences of 700 MeV to 10 GeV protons. Highly energetic SPEs are implicated in increased rates of spacecraft anomalies and spacecraft failures. High-energy protons interact with Earth’s atmosphere via nuclear reaction to produce secondary particles, some of which are neutrons that can be detected at the Earth’s surface by the global neutron monitor network. GLEs are one part of the overall SPE resulting from a particular solar flare or coronal mass ejection event on the sun. 
The ESP part of the particle event, detected by spacecraft, is often associated with the arrival of a “shock front” at Earth some hours after the arrival of the GLE. The specific SPEs used in this analysis are those of: 1) November 6, 1997 - GLE only; 2) July 14-15, 2000 - GLE from the 14th plus ESP from the 15th; 3) November 4-6, 2001 - GLE and ESP from the 4th; and 4) October 28-29, 2003 - GLE and ESP from the 28th plus GLE from the 29th. The corresponding Band and Exponential spectra used in this paper are like those previously reported.
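    The difference between the two assumed spectral forms can be made concrete: a sketch of an exponential-in-rigidity fluence spectrum against a Band-type double power law in rigidity (parameters are illustrative, not the fitted spectra of the four events):

```python
import math

E0 = 938.272  # proton rest energy, MeV

def rigidity_mv(E_mev):
    """Proton rigidity (MV) from kinetic energy (MeV)."""
    return math.sqrt(E_mev * (E_mev + 2.0 * E0))

def exp_form(R, j0, r0):
    """Exponential-in-rigidity fluence spectrum."""
    return j0 * math.exp(-R / r0)

def band_form(R, j0, ga, gb, r0):
    """Band double power law in rigidity: index ga with an exponential
    roll-over below the break, a pure power law of index gb above it;
    the branches join smoothly at Rb = (gb - ga) * r0."""
    Rb = (gb - ga) * r0
    if R <= Rb:
        return j0 * R ** -ga * math.exp(-R / r0)
    return j0 * R ** -gb * Rb ** (gb - ga) * math.exp(ga - gb)

r_100mev = rigidity_mv(100.0)  # ~445 MV
```

    At high rigidity the exponential form falls off much faster than the Band power-law tail, which is why assuming it can underestimate the TID and SEE environments behind thicker shields.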

  6. Determination and Fabrication of New Shield Super Alloys Materials for Nuclear Reactor Safety by Experiments and Cern-Fluka Monte Carlo Simulation Code, Geant4 and WinXCom

    NASA Astrophysics Data System (ADS)

    Aygun, Bünyamin; Korkut, Turgay; Karabulut, Abdulhalik

    2016-05-01

    Despite the possibility of fossil fuel depletion and increasing energy needs, the use of radiation tends to increase, and the security-focused debate about planned nuclear power plants still continues. The objective of this thesis is to prevent radiation from nuclear reactors spreading into the environment. To this end, we produced new, higher-performance shielding materials that retain radiation effectively during reactor operation. Additives used in the new shielding materials include iron (Fe), rhenium (Re), nickel (Ni), chromium (Cr), boron (B), copper (Cu), tungsten (W), tantalum (Ta) and boron carbide (B4C). The experimental results indicated that these materials are good shields against gamma rays and neutrons. The powder metallurgy technique was used to produce the new shielding materials. The CERN FLUKA and Geant4 Monte Carlo simulation codes and WinXCom were used to determine the component percentages of the high-temperature-resistant materials for shielding high-level fast neutrons and gamma rays. Super alloys were produced, and experimental fast-neutron dose equivalent and gamma-radiation absorption measurements of the new shielding materials were carried out. The products can be used safely not only in reactors but also in nuclear medicine treatment rooms, for the storage of nuclear waste, in nuclear research laboratories, and against cosmic radiation in space vehicles.
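    WinXCom supplies mass attenuation coefficients, from which narrow-beam photon transmission through a candidate alloy follows directly. A sketch with illustrative (not measured) coefficient and density values:

```python
import math

def transmission(mu_over_rho, density, thickness_cm):
    """Narrow-beam photon transmission I/I0 = exp(-(mu/rho) * rho * t),
    the quantity WinXCom mass attenuation coefficients feed into."""
    return math.exp(-mu_over_rho * density * thickness_cm)

def half_value_layer(mu_over_rho, density):
    """Thickness (cm) that halves the narrow-beam intensity."""
    return math.log(2) / (mu_over_rho * density)

# Illustrative values only (not data from this work): mu/rho ~ 0.06 cm^2/g
# for ~1 MeV photons in an Fe-based alloy of density ~8 g/cm^3
t5 = transmission(0.06, 8.0, 5.0)
```

    Comparing such transmission curves across candidate compositions is one way the component percentages of a shielding alloy are screened before fabrication.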

  7. Extension of PENELOPE to protons: simulation of nuclear reactions and benchmark with Geant4.

    PubMed

    Sterpin, E; Sorriaux, J; Vynckier, S

    2013-11-01

    Describing the implementation of nuclear reactions in the extension of the Monte Carlo code (MC) PENELOPE to protons (PENH) and benchmarking with Geant4. PENH is based on mixed-simulation mechanics for both elastic and inelastic electromagnetic collisions (EM). The adopted differential cross sections for EM elastic collisions are calculated using the eikonal approximation with the Dirac-Hartree-Fock-Slater atomic potential. Cross sections for EM inelastic collisions are computed within the relativistic Born approximation, using the Sternheimer-Liljequist model of the generalized oscillator strength. Nuclear elastic and inelastic collisions were simulated using explicitly the scattering analysis interactive dialin database for (1)H and ICRU 63 data for (12)C, (14)N, (16)O, (31)P, and (40)Ca. Secondary protons, alphas, and deuterons were all simulated as protons, with the energy adapted to ensure consistent range. Prompt gamma emission can also be simulated upon user request. Simulations were performed in a water phantom with nuclear interactions switched off or on and integral depth-dose distributions were compared. Binary-cascade and precompound models were used for Geant4. Initial energies of 100 and 250 MeV were considered. For cases with no nuclear interactions simulated, additional simulations in a water phantom with tight resolution (1 mm in all directions) were performed with FLUKA. Finally, integral depth-dose distributions for a 250 MeV energy were computed with Geant4 and PENH in a homogeneous phantom with, first, ICRU striated muscle and, second, ICRU compact bone. For simulations with EM collisions only, integral depth-dose distributions were within 1%/1 mm for doses higher than 10% of the Bragg-peak dose. For central-axis depth-dose and lateral profiles in a phantom with tight resolution, there are significant deviations between Geant4 and PENH (up to 60%/1 cm for depth-dose distributions). 
The agreement is much better with FLUKA, with deviations within 3%/3 mm. When nuclear interactions were turned on, agreement (within 6% before the Bragg-peak) between PENH and Geant4 was consistent with uncertainties on nuclear models and cross sections, whatever the material simulated (water, muscle, or bone). A detailed and flexible description of nuclear reactions has been implemented in the PENH extension of PENELOPE to protons, which utilizes a mixed-simulation scheme for both elastic and inelastic EM collisions, analogous to the well-established algorithm for electrons/positrons. PENH is compatible with all current main programs that use PENELOPE as the MC engine. The nuclear model of PENH is realistic enough to give dose distributions in fair agreement with those computed by Geant4.
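    Agreement criteria such as the 1%/1 mm and 3%/3 mm figures above are commonly evaluated with a gamma-index test. A minimal 1-D version (a simplification of the full clinical gamma analysis, with global normalization to the reference maximum):

```python
import numpy as np

def gamma_1d(z_ref, d_ref, z_eval, d_eval, dta_mm=3.0, dd=0.03):
    """1-D gamma index of an evaluated depth-dose curve against a
    reference: for each reference point, the minimum combined
    distance-to-agreement / dose-difference metric over the evaluated curve."""
    d_crit = dd * d_ref.max()
    g = np.empty_like(d_ref, dtype=float)
    for i, (zr, dr) in enumerate(zip(z_ref, d_ref)):
        dist = (z_eval - zr) / dta_mm
        dose = (d_eval - dr) / d_crit
        g[i] = np.sqrt(dist ** 2 + dose ** 2).min()
    return g

# Identical curves pass trivially: gamma = 0 at every point
z = np.linspace(0.0, 300.0, 301)        # depth in mm
d = np.exp(-((z - 250.0) / 20.0) ** 2)  # toy Bragg-like peak
g = gamma_1d(z, d, z, d)
pass_rate = np.mean(g <= 1.0)
```

    A point passes when gamma <= 1; pass rates over a region are the figure usually quoted, as in the comparisons above.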

  8. Extension of PENELOPE to protons: Simulation of nuclear reactions and benchmark with Geant4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sterpin, E.; Sorriaux, J.; Vynckier, S.

    2013-11-15

    Purpose: Describing the implementation of nuclear reactions in the extension of the Monte Carlo code (MC) PENELOPE to protons (PENH) and benchmarking with Geant4. Methods: PENH is based on mixed-simulation mechanics for both elastic and inelastic electromagnetic collisions (EM). The adopted differential cross sections for EM elastic collisions are calculated using the eikonal approximation with the Dirac–Hartree–Fock–Slater atomic potential. Cross sections for EM inelastic collisions are computed within the relativistic Born approximation, using the Sternheimer–Liljequist model of the generalized oscillator strength. Nuclear elastic and inelastic collisions were simulated using explicitly the scattering analysis interactive dialin database for 1H and ICRU 63 data for 12C, 14N, 16O, 31P, and 40Ca. Secondary protons, alphas, and deuterons were all simulated as protons, with the energy adapted to ensure consistent range. Prompt gamma emission can also be simulated upon user request. Simulations were performed in a water phantom with nuclear interactions switched off or on and integral depth–dose distributions were compared. Binary-cascade and precompound models were used for Geant4. Initial energies of 100 and 250 MeV were considered. For cases with no nuclear interactions simulated, additional simulations in a water phantom with tight resolution (1 mm in all directions) were performed with FLUKA. Finally, integral depth–dose distributions for a 250 MeV energy were computed with Geant4 and PENH in a homogeneous phantom with, first, ICRU striated muscle and, second, ICRU compact bone. Results: For simulations with EM collisions only, integral depth–dose distributions were within 1%/1 mm for doses higher than 10% of the Bragg-peak dose. For central-axis depth–dose and lateral profiles in a phantom with tight resolution, there are significant deviations between Geant4 and PENH (up to 60%/1 cm for depth–dose distributions). 
The agreement is much better with FLUKA, with deviations within 3%/3 mm. When nuclear interactions were turned on, agreement (within 6% before the Bragg-peak) between PENH and Geant4 was consistent with uncertainties on nuclear models and cross sections, whatever the material simulated (water, muscle, or bone). Conclusions: A detailed and flexible description of nuclear reactions has been implemented in the PENH extension of PENELOPE to protons, which utilizes a mixed-simulation scheme for both elastic and inelastic EM collisions, analogous to the well-established algorithm for electrons/positrons. PENH is compatible with all current main programs that use PENELOPE as the MC engine. The nuclear model of PENH is realistic enough to give dose distributions in fair agreement with those computed by Geant4.

  9. Update on the Code Intercomparison and Benchmark for Muon Fluence and Absorbed Dose Induced by an 18 GeV Electron Beam After Massive Iron Shielding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fasso, A.; Ferrari, A.; Ferrari, A.

    In 1974, Nelson, Kase and Svensson published an experimental investigation on muon shielding around SLAC high-energy electron accelerators [1]. They measured muon fluence and absorbed dose induced by 14 and 18 GeV electron beams hitting a copper/water beam dump and attenuated in a thick steel shielding. In their paper, they compared the results with the theoretical models available at that time. In order to compare their experimental results with present model calculations, we use the modern transport Monte Carlo codes MARS15, FLUKA2011 and GEANT4 to model the experimental setup and run simulations. The results are then compared between the codes and with the SLAC data.

  10. Gas bremsstrahlung shielding calculation for first optic enclosure of ILSF medical beamline

    NASA Astrophysics Data System (ADS)

    Beigzadeh Jalali, H.; Salimi, E.; Rahighi, J.

    2016-10-01

    Gas bremsstrahlung generated in a high-energy electron storage ring accompanies the synchrotron radiation into the beamlines and strikes the various components of the beamline. In this paper, radiation shielding calculations for secondary gas bremsstrahlung are performed for the first optics enclosure (FOE) of the medical beamline of the Iranian Light Source Facility (ILSF). Dose equivalent rate (DER) calculations are accomplished using the FLUKA Monte Carlo code. A comprehensive study of the DER distribution at the back wall, sides and roof is given.

  11. Porous silicon carbide and aluminum oxide with unidirectional open porosity as model target materials for radioisotope beam production

    NASA Astrophysics Data System (ADS)

    Czapski, M.; Stora, T.; Tardivat, C.; Deville, S.; Santos Augusto, R.; Leloup, J.; Bouville, F.; Fernandes Luis, R.

    2013-12-01

    New silicon carbide (SiC) and aluminum oxide (Al2O3) materials with a tailor-made microstructure were produced using the ice-templating technique, which permits controlled pore formation within the material. These prototypes will serve to verify the aging of the new advanced target materials under irradiation with proton beams. Beforehand, their mechanical integrity was evaluated based on energy deposition spectra computed with the FLUKA code.

  12. Neutron Energy and Time-of-flight Spectra Behind the Lateral Shield of a High Energy Electron Accelerator Beam Dump, Part II: Monte Carlo Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roesler, Stefan

    2002-09-19

    Energy spectra of high-energy neutrons and neutron time-of-flight spectra were calculated for the setup of experiment T-454 performed with a NE213 liquid scintillator at the Final Focus Test Beam (FFTB) facility at the Stanford Linear Accelerator Center. The neutrons were created by the interaction of a 28.7 GeV electron beam in the aluminum beam dump of the FFTB, which is housed inside a thick steel and concrete shielding. In order to determine the attenuation length of high-energy neutrons, additional concrete shielding of various thicknesses was placed outside the existing shielding. The calculations were performed using the FLUKA interaction and transport code. The energy and time-of-flight were recorded for the location of the detector, allowing a detailed comparison with the experimental data. A generally good description of the data is achieved, adding confidence to the use of FLUKA for the design of shielding for high-energy electron accelerators.
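    Relating the two calculated quantities, energy spectrum and time-of-flight spectrum, is a one-line relativistic kinematics exercise once the flight path is known:

```python
import math

C_M_PER_NS = 0.299792458  # speed of light, m/ns
MN_MEV = 939.565          # neutron rest energy, MeV

def tof_to_kinetic_energy(path_m, tof_ns):
    """Relativistic neutron kinetic energy (MeV) from flight path and
    time of flight: E = m (gamma - 1) with beta = L / (t c)."""
    beta = path_m / (tof_ns * C_M_PER_NS)
    gamma = 1.0 / math.sqrt(1.0 - beta * beta)
    return MN_MEV * (gamma - 1.0)
```

    For example, a neutron covering a 3 m path at beta = 0.6 carries a quarter of its rest energy as kinetic energy.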

  13. Role of shielding in modulating the effects of solar particle events: Monte Carlo calculation of absorbed dose and DNA complex lesions in different organs

    NASA Technical Reports Server (NTRS)

    Ballarini, F.; Biaggi, M.; De Biaggi, L.; Ferrari, A.; Ottolenghi, A.; Panzarasa, A.; Paretzke, H. G.; Pelliccioni, M.; Sala, P.; Scannicchio, D.; Zankl, M.

    2004-01-01

    Distributions of absorbed dose and DNA clustered damage yields in various organs and tissues following the October 1989 solar particle event (SPE) were calculated by coupling the FLUKA Monte Carlo transport code with two anthropomorphic phantoms (a mathematical model and a voxel model), with the main aim of quantifying the role of the shielding features in modulating organ doses. The phantoms, which were assumed to be in deep space, were inserted into a shielding box of variable thickness and material and were irradiated with the proton spectra of the October 1989 event. Average numbers of DNA lesions per cell in different organs were calculated by adopting a technique already tested in previous works, consisting of integrating into "condensed-history" Monte Carlo transport codes--such as FLUKA--yields of radiobiological damage, either calculated with "event-by-event" track structure simulations, or taken from experimental works available in the literature. More specifically, the yields of "Complex Lesions" (or "CL", defined and calculated as a clustered DNA damage in a previous work) per unit dose and DNA mass (CL Gy-1 Da-1) due to the various beam components, including those derived from nuclear interactions with the shielding and the human body, were integrated in FLUKA. This provided spatial distributions of CL/cell yields in different organs, as well as distributions of absorbed doses. The contributions of primary protons and secondary hadrons were calculated separately, and the simulations were repeated for values of Al shielding thickness ranging between 1 and 20 g/cm2. Slight differences were found between the two phantom types. Skin and eye lenses were found to receive larger doses with respect to internal organs; however, shielding was more effective for skin and lenses. 
Secondary particles arising from nuclear interactions were found to have a minor role, although their relative contribution was found to be larger for the Complex Lesions than for the absorbed dose, due to their higher LET and thus higher biological effectiveness. © 2004 COSPAR. Published by Elsevier Ltd. All rights reserved.
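    The CL integration described above amounts, at a given location, to multiplying the absorbed dose by the CL yield and the DNA mass per cell. A sketch (the ~3.9e12 Da diploid human DNA mass is an assumed round number, and the yield shown is hypothetical, not the paper's fitted value):

```python
def complex_lesions_per_cell(dose_gy, cl_yield_per_gy_da, dna_mass_da=3.9e12):
    """Expected Complex Lesions per cell: absorbed dose (Gy) times the
    yield (CL / Gy / Da) times the DNA mass per cell (Da); ~3.9e12 Da is
    an assumed value for a diploid human cell."""
    return dose_gy * cl_yield_per_gy_da * dna_mass_da

# Hypothetical yield of 1e-12 CL/Gy/Da at 1 Gy
cl_1gy = complex_lesions_per_cell(1.0, 1.0e-12)
```

    In the actual simulations the dose and the yield vary with position and beam component, so this product is scored per voxel and per particle type inside FLUKA rather than applied once.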

  14. Role of shielding in modulating the effects of solar particle events: Monte Carlo calculation of absorbed dose and DNA complex lesions in different organs

    NASA Astrophysics Data System (ADS)

    Ballarini, F.; Biaggi, M.; De Biaggi, L.; Ferrari, A.; Ottolenghi, A.; Panzarasa, A.; Paretzke, H. G.; Pelliccioni, M.; Sala, P.; Scannicchio, D.; Zankl, M.

    2004-01-01

    Distributions of absorbed dose and DNA clustered damage yields in various organs and tissues following the October 1989 solar particle event (SPE) were calculated by coupling the FLUKA Monte Carlo transport code with two anthropomorphic phantoms (a mathematical model and a voxel model), with the main aim of quantifying the role of the shielding features in modulating organ doses. The phantoms, which were assumed to be in deep space, were inserted into a shielding box of variable thickness and material and were irradiated with the proton spectra of the October 1989 event. Average numbers of DNA lesions per cell in different organs were calculated by adopting a technique already tested in previous works, consisting of integrating into "condensed-history" Monte Carlo transport codes - such as FLUKA - yields of radiobiological damage, either calculated with "event-by-event" track structure simulations, or taken from experimental works available in the literature. More specifically, the yields of "Complex Lesions" (or "CL", defined and calculated as a clustered DNA damage in a previous work) per unit dose and DNA mass (CL Gy^-1 Da^-1) due to the various beam components, including those derived from nuclear interactions with the shielding and the human body, were integrated in FLUKA. This provided spatial distributions of CL/cell yields in different organs, as well as distributions of absorbed doses. The contributions of primary protons and secondary hadrons were calculated separately, and the simulations were repeated for values of Al shielding thickness ranging between 1 and 20 g/cm^2. Slight differences were found between the two phantom types. Skin and eye lenses were found to receive larger doses with respect to internal organs; however, shielding was more effective for skin and lenses. 
Secondary particles arising from nuclear interactions were found to have a minor role, although their relative contribution was found to be larger for the Complex Lesions than for the absorbed dose, due to their higher LET and thus higher biological effectiveness.

  15. Dose reduction of scattered photons from concrete walls lined with lead: Implications for improvement in design of megavoltage radiation therapy facility mazes.

    PubMed

    Al-Affan, I A M; Hugtenburg, R P; Bari, D S; Al-Saleh, W M; Piliero, M; Evans, S; Al-Hasan, M; Al-Zughul, B; Al-Kharouf, S; Ghaith, A

    2015-02-01

    This study explores the possibility of using lead to cover part of the radiation therapy facility maze walls in order to absorb low energy photons and reduce the total dose at the maze entrance of radiation therapy rooms. Experiments and Monte Carlo simulations were utilized to establish the possibility of using high-Z materials to cover the concrete walls of the maze in order to reduce the dose of the scattered photons at the maze entrance. The dose of the backscattered photons from a concrete wall was measured for various scattering angles. The dose was also calculated by the FLUKA and EGSnrc Monte Carlo codes. The FLUKA code was also used to simulate an existing radiotherapy room to study the effect of multiple scattering when adding lead to cover the concrete walls of the maze. Monoenergetic photons were used to represent the main components of the x ray spectrum up to 10 MV. It was observed that when the concrete wall was covered with just 2 mm of lead, the measured dose rate at all backscattering angles was reduced by 20% for photons of energy comparable to Co-60 emissions and 70% for Cs-137 emissions. The simulations with FLUKA and EGS showed that the reduction in the dose was potentially even higher when lead was added. One explanation for the reduction is the increased absorption of backscattered photons due to the photoelectric interaction in lead. The results also showed that adding 2 mm lead to the concrete walls and floor of the maze reduced the dose at the maze entrance by up to 90%. This novel proposal of covering part or the entire maze walls with a few millimeters of lead would have a direct implication for the design of radiation therapy facilities and would assist in upgrading the design of some mazes, especially those in facilities with limited space where the maze length cannot be extended to sufficiently reduce the dose. © 2015 American Association of Physicists in Medicine.

  16. Dose reduction of scattered photons from concrete walls lined with lead: Implications for improvement in design of megavoltage radiation therapy facility mazes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Al-Affan, I. A. M., E-mail: info@medphys-environment.co.uk; Hugtenburg, R. P.; Piliero, M.

    Purpose: This study explores the possibility of using lead to cover part of the radiation therapy facility maze walls in order to absorb low energy photons and reduce the total dose at the maze entrance of radiation therapy rooms. Methods: Experiments and Monte Carlo simulations were utilized to establish the possibility of using high-Z materials to cover the concrete walls of the maze in order to reduce the dose of the scattered photons at the maze entrance. The dose of the backscattered photons from a concrete wall was measured for various scattering angles. The dose was also calculated by the FLUKA and EGSnrc Monte Carlo codes. The FLUKA code was also used to simulate an existing radiotherapy room to study the effect of multiple scattering when adding lead to cover the concrete walls of the maze. Monoenergetic photons were used to represent the main components of the x ray spectrum up to 10 MV. Results: It was observed that when the concrete wall was covered with just 2 mm of lead, the measured dose rate at all backscattering angles was reduced by 20% for photons of energy comparable to Co-60 emissions and 70% for Cs-137 emissions. The simulations with FLUKA and EGS showed that the reduction in the dose was potentially even higher when lead was added. One explanation for the reduction is the increased absorption of backscattered photons due to the photoelectric interaction in lead. The results also showed that adding 2 mm lead to the concrete walls and floor of the maze reduced the dose at the maze entrance by up to 90%. Conclusions: This novel proposal of covering part or the entire maze walls with a few millimeters of lead would have a direct implication for the design of radiation therapy facilities and would assist in upgrading the design of some mazes, especially those in facilities with limited space where the maze length cannot be extended to sufficiently reduce the dose.

  17. Poster - 40: Treatment Verification of a 3D-printed Eye Phantom for Proton Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dunning, Chelsea; Lindsay, Clay; Unick, Nick

    Purpose: Ocular melanoma is a form of eye cancer which is often treated using proton therapy. The benefit of the steep proton dose gradient can only be leveraged for accurate patient eye alignment. A treatment-planning program was written to plan on a 3D-printed anatomical eye-phantom, which was then irradiated to demonstrate the feasibility of verifying in vivo dosimetry for proton therapy using PET imaging. Methods: A 3D CAD eye model with critical organs was designed and voxelized into the Monte-Carlo transport code FLUKA. Proton dose and PET isotope production were simulated for a treatment plan of a test tumour, generated by a 2D treatment-planning program developed using NumPy and proton range tables. Next, a plastic eye-phantom was 3D-printed from the CAD model, irradiated at the TRIUMF Proton Therapy facility, and imaged using a PET scanner. Results: The treatment-planning program prediction of the range setting and modulator wheel was verified in FLUKA to treat the tumour with at least 90% dose coverage for both tissue and plastic. An axial isotope distribution of the PET isotopes was simulated in FLUKA and converted to PET scan counts. Meanwhile, the 3D-printed eye-phantom successfully yielded a PET signal. Conclusions: The 2D treatment-planning program can predict required parameters to sufficiently treat an eye tumour, which was experimentally verified using commercial 3D-printing hardware to manufacture eye-phantoms. Comparison between the simulated and measured PET isotope distribution could provide a more realistic test of eye alignment, and a variation of the method using radiographic film is being developed.
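    A proton range table of the kind such a 2D treatment-planning program relies on can be approximated by the Bragg-Kleeman rule. A sketch with commonly quoted water coefficients (textbook fit values, not the TRIUMF beamline calibration):

```python
def range_water_cm(E_mev, alpha=0.0022, p=1.77):
    """Bragg-Kleeman rule R = alpha * E^p: approximate proton CSDA range
    in water; alpha and p are textbook fit values."""
    return alpha * E_mev ** p

def energy_for_range_mev(R_cm, alpha=0.0022, p=1.77):
    """Inverse rule: beam energy needed to reach a given
    water-equivalent depth."""
    return (R_cm / alpha) ** (1.0 / p)
```

    Inverting the rule for the distal edge of the tumour gives the range setting, and the difference between distal and proximal depths sets the modulation width.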

  18. Experimental approach to measure thick target neutron yields induced by heavy ions for shielding

    NASA Astrophysics Data System (ADS)

    Trinh, N. D.; Fadil, M.; Lewitowicz, M.; Brouillard, C.; Clerc, T.; Damoy, S.; Desmezières, V.; Dessay, E.; Dupuis, M.; Grinyer, G. F.; Grinyer, J.; Jacquot, B.; Ledoux, X.; Madeline, A.; Menard, N.; Michel, M.; Morel, V.; Porée, F.; Rannou, B.; Savalle, A.

    2017-09-01

    Double differential (angular and energy) neutron distributions were measured using an activation foil technique. Reactions were induced by impinging two low-energy heavy-ion beams accelerated with the GANIL CSS1 cyclotron, 36S (12 MeV/u) and 208Pb (6.25 MeV/u), onto thick natCu targets. Results have been compared to Monte-Carlo calculations from two codes (PHITS and FLUKA) for the purpose of benchmarking radiation protection and shielding requirements. This comparison suggests a disagreement between calculations and experiment, particularly for high-energy neutrons.
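    The activation foil technique unfolds the neutron fluence rate from the measured foil activity, the number of target atoms and the activation cross section. A sketch with hypothetical values (not the GANIL foil data):

```python
import math

def fluence_rate_from_foil(activity_bq, n_atoms, sigma_barn,
                           half_life_s, t_irr_s):
    """Average neutron fluence rate (n/cm^2/s) at a foil, unfolded from
    its end-of-irradiation activity via the saturation factor
    (1 - exp(-lambda * t_irr))."""
    lam = math.log(2) / half_life_s
    sigma_cm2 = sigma_barn * 1e-24
    return activity_bq / (n_atoms * sigma_cm2 * (1.0 - math.exp(-lam * t_irr_s)))

# Round trip with hypothetical numbers: a known fluence rate of 1e5
# n/cm^2/s produces an activity that the formula inverts exactly.
lam = math.log(2) / 600.0
a_eoi = 1e5 * 1e21 * 0.1e-24 * (1.0 - math.exp(-lam * 1200.0))
phi = fluence_rate_from_foil(a_eoi, 1e21, 0.1, 600.0, 1200.0)
```

    Repeating this for foils with different threshold reactions at different angles is what yields the double differential distributions compared against PHITS and FLUKA.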

  19. Testing FLUKA on neutron activation of Si and Ge at nuclear research reactor using gamma spectroscopy

    NASA Astrophysics Data System (ADS)

    Bazo, J.; Rojas, J. M.; Best, S.; Bruna, R.; Endress, E.; Mendoza, P.; Poma, V.; Gago, A. M.

    2018-03-01

    Samples of two characteristic semiconductor sensor materials, silicon and germanium, have been irradiated with neutrons produced at the RP-10 Nuclear Research Reactor at 4.5 MW. Their radionuclide photon spectra have been measured with high-resolution gamma spectroscopy, quantifying four radioisotopes (28Al and 29Al for Si; 75Ge and 77Ge for Ge). We have compared the radionuclide production and emission spectra with Monte Carlo simulation results from FLUKA, thus testing FLUKA's low-energy neutron library (ENDF/B-VIIR) and decay photon scoring with respect to the activation of these semiconductors. We conclude that FLUKA is capable of predicting the relative photon peak amplitudes, for gamma intensities greater than 1%, of the produced radionuclides with an average uncertainty of 13%. This work allows us to estimate the corresponding systematic error in neutron activation simulation studies of these sensor materials.
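
    The comparison metric used here (relative photon peak amplitudes and their average deviation) can be sketched as follows; the peak labels and count values are hypothetical, not the measured Si/Ge data.

```python
def relative_amplitudes(peaks):
    """Normalise peak areas to the strongest line."""
    top = max(peaks.values())
    return {line: a / top for line, a in peaks.items()}

def average_deviation(measured, simulated):
    """Mean relative deviation of simulated vs measured relative amplitudes."""
    rm = relative_amplitudes(measured)
    rs = relative_amplitudes(simulated)
    devs = [abs(rs[k] - rm[k]) / rm[k] for k in rm]
    return sum(devs) / len(devs)

# Hypothetical peak areas for two of the radioisotopes named above.
measured = {"28Al 1779 keV": 1000.0, "29Al 1273 keV": 240.0}
simulated = {"28Al 1779 keV": 980.0, "29Al 1273 keV": 260.0}
dev = average_deviation(measured, simulated)
```

    Averaging such deviations over all quantified lines gives a single figure like the 13% quoted in the abstract.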

  20. Validation of a Monte Carlo code system for grid evaluation with interference effect on Rayleigh scattering

    NASA Astrophysics Data System (ADS)

    Zhou, Abel; White, Graeme L.; Davidson, Rob

    2018-02-01

    Anti-scatter grids are commonly used in x-ray imaging systems to reduce the scatter radiation reaching the image receptor. Anti-scatter grid performance can be simulated and validated using Monte Carlo (MC) methods. Our recently reported work modified existing MC codes, resulting in improved performance when simulating x-ray imaging. The aim of this work is to validate the transmission of x-ray photons in grids computed by the new MC codes against experimental results and results previously reported in the literature. The results show that the scatter-to-primary ratio (SPR) and the transmissions of primary (Tp), scatter (Ts), and total (Tt) radiation determined using the new MC code system agree strongly with the experimental results and with those reported in the literature; Tp, Ts, Tt, and SPR determined with this code system are therefore valid. These results also show that the interference effect on Rayleigh scattering should not be neglected in the evaluation of both mammographic and general grids. Our new MC simulation code system has been shown to be valid and can be used for analysing and evaluating grid designs.
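
    The figures of merit named above follow directly from photon counts tallied at the receptor with and without the grid; a minimal sketch with made-up tallies:

```python
def grid_metrics(p_no_grid, s_no_grid, p_grid, s_grid):
    """Grid figures of merit from primary/scatter counts at the receptor,
    tallied without (no_grid) and with (grid) the anti-scatter grid."""
    t_p = p_grid / p_no_grid                            # primary transmission
    t_s = s_grid / s_no_grid                            # scatter transmission
    t_t = (p_grid + s_grid) / (p_no_grid + s_no_grid)   # total transmission
    spr = s_grid / p_grid                               # SPR behind the grid
    return t_p, t_s, t_t, spr

# Hypothetical tallies: 1000/800 primary/scatter photons without the grid,
# 700/120 with it.
t_p, t_s, t_t, spr = grid_metrics(1000.0, 800.0, 700.0, 120.0)
```

    A good grid keeps t_p high while driving t_s (and hence the SPR at the receptor) down.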

  1. A Study of Neutron Leakage in Finite Objects

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Slaba, Tony C.; Badavi, Francis F.; Reddell, Brandon D.; Bahadori, Amir A.

    2015-01-01

    A computationally efficient 3DHZETRN code, capable of simulating high charge (Z) and energy (HZE) ions and light ions (including neutrons) under space-like boundary conditions with enhanced neutron and light ion propagation, was recently developed for simple shielded objects. Monte Carlo (MC) benchmarks were used to verify the 3DHZETRN methodology in slab and spherical geometry, and it was shown that 3DHZETRN agrees with MC codes to the degree that the various MC codes agree among themselves. One limitation of the verification process is that all of the codes (3DHZETRN and three MC codes) utilize different nuclear models/databases. In the present report, the new algorithm, with well-defined convergence criteria, is used to quantify the neutron leakage from simple geometries, both to provide a means of verifying 3D effects and to provide guidance for further code development.

  2. Comparison of space radiation calculations for deterministic and Monte Carlo transport codes

    NASA Astrophysics Data System (ADS)

    Lin, Zi-Wei; Adams, James; Barghouty, Abdulnasser; Randeniya, Sharmalee; Tripathi, Ram; Watts, John; Yepes, Pablo

    For the space radiation protection of astronauts or electronic equipment, it is necessary to develop and use accurate radiation transport codes. These include deterministic codes, such as HZETRN from NASA and UPROP from the Naval Research Laboratory, and Monte Carlo codes such as FLUKA, the Geant4 toolkit and HETC-HEDS. The two families complement each other in that deterministic codes are very fast while Monte Carlo codes are more detailed. It is therefore important to investigate how well the results of deterministic codes compare with those of Monte Carlo transport codes and where they differ. In this study we evaluate these codes in space radiation applications by comparing their outputs for the same space radiation environment, shielding geometry and material. Typical environments such as the 1977 solar minimum galactic cosmic ray environment are used as well-defined input, and simple geometries made of aluminum, water and/or polyethylene represent the shielding material. We then compare various outputs of these codes, such as dose-depth curves and the flux spectra of different fragments and other secondary particles. These comparisons enable us to learn more about the main differences between these space radiation transport codes, as well as the qualitative and quantitative features the codes have in common.

  3. EDITORIAL: Special section: Selected papers from the Third European Workshop on Monte Carlo Treatment Planning (MCTP2012) Special section: Selected papers from the Third European Workshop on Monte Carlo Treatment Planning (MCTP2012)

    NASA Astrophysics Data System (ADS)

    Spezi, Emiliano; Leal, Antonio

    2013-04-01

    The Third European Workshop on Monte Carlo Treatment Planning (MCTP2012) was held from 15-18 May, 2012 in Seville, Spain. The event was organized by the Universidad de Sevilla with the support of the European Workgroup on Monte Carlo Treatment Planning (EWG-MCTP). MCTP2012 followed two successful meetings, one held in Ghent (Belgium) in 2006 (Reynaert 2007) and one in Cardiff (UK) in 2009 (Spezi 2010). The recurrence of these workshops, together with successful events held in parallel by McGill University in Montreal (Seuntjens et al 2012), shows consolidated interest from the scientific community in Monte Carlo (MC) treatment planning. The workshop was attended by a total of 90 participants, mainly coming from a medical physics background. A total of 48 oral presentations and 15 posters were delivered in specific scientific sessions including dosimetry, code development, imaging, modelling of photon and electron radiation transport, external beam radiation therapy, nuclear medicine, brachytherapy and hadrontherapy. A copy of the programme is available on the workshop's website (www.mctp2012.com). In this special section of Physics in Medicine and Biology we report six papers that were selected following the journal's rigorous peer review procedure. These papers provide a good cross section of the areas of application of MC in treatment planning that were discussed at MCTP2012. Czarnecki and Zink (2013) and Wagner et al (2013) present the results of their work in small field dosimetry. Czarnecki and Zink (2013) studied field-size- and detector-dependent correction factors for diodes and ion chambers within a clinical 6 MV photon beam generated by a Siemens linear accelerator. Their modelling work, based on the BEAMnrc/EGSnrc codes and experimental measurements, revealed that unshielded diodes were the best choice for small field dosimetry because of their independence from the electron beam spot size and their correction factors close to unity. 
Wagner et al (2013) investigated the recombination effect on liquid ionization chambers for stereotactic radiotherapy, a field of increasing importance in external beam radiotherapy. They modelled both the radiation source (a Cyberknife unit) and the detector with the BEAMnrc/EGSnrc codes and quantified the dependence of the response of this type of detector on factors such as the volume effect and the electrode. They also recommended that these dependences be accounted for in measurements involving small fields. In the field of external beam radiotherapy, Chakarova et al (2013) showed how total body irradiation (TBI) could be improved by simulating patient treatments with MC. In particular, BEAMnrc/EGSnrc-based simulations highlighted the importance of optimizing individual compensators for TBI treatments. In the same area of application, Mairani et al (2013) reported on a new tool for treatment planning in proton therapy based on the FLUKA MC code. The software, used to model both the proton therapy beam and the patient anatomy, supports single-field and multiple-field optimization and can be used to optimize physical and relative biological effectiveness (RBE)-weighted dose distributions, using both constant and variable RBE models. In the field of nuclear medicine, Marcatili et al (2013) presented RAYDOSE, a Geant4-based code specifically developed for applications in molecular radiotherapy (MRT). RAYDOSE has been designed to work in MRT trials using sequential positron emission tomography (PET) or single-photon emission tomography (SPECT) imaging to model patient-specific time-dependent metabolic uptake and to calculate the total 3D dose distribution. The code was validated through experimental measurements in homogeneous and heterogeneous phantoms. Finally, in the field of code development, Miras et al (2013) reported on CloudMC, a Windows Azure-based application for the parallelization of MC calculations in a dynamic cluster environment. 
Although the performance of CloudMC has been tested with the PENELOPE MC code, the authors report that the software has been designed to be independent of the type of MC code, provided that the simulation meets a number of operational criteria. We wish to thank Elekta/CMS Inc., the University of Seville, the Junta of Andalusia and the European Regional Development Fund for their financial support. We would also like to acknowledge the members of EWG-MCTP for their help in peer-reviewing all the abstracts, and all the invited speakers who kindly agreed to deliver keynote presentations in their areas of expertise. A final word of thanks to our colleagues who worked on the reviewing process of the papers selected for this special section and to the IOP Publishing staff who made it possible. MCTP2012 was accredited by the European Federation of Organisations for Medical Physics as a CPD event for medical physicists. Emiliano Spezi and Antonio Leal Guest Editors References Chakarova R, Müntzing K, Krantz M, Hedin E and Hertzman S 2013 Monte Carlo optimization of total body irradiation in a phantom and patient geometry Phys. Med. Biol. 58 2461-69 Czarnecki D and Zink K 2013 Monte Carlo calculated correction factors for diodes and ion chambers in small photon fields Phys. Med. Biol. 58 2431-44 Mairani A, Böhlen T T, Schiavi A, Tessonnier T, Molinelli S, Brons S, Battistoni G, Parodi K and Patera V 2013 A Monte Carlo-based treatment planning tool for proton therapy Phys. Med. Biol. 58 2471-90 Marcatili S, Pettinato C, Daniels S, Lewis G, Edwards P, Fanti S and Spezi E 2013 Development and validation of RAYDOSE: a Geant4 based application for molecular radiotherapy Phys. Med. Biol. 58 2491-508 Miras H, Jiménez R, Miras C and Gomà C 2013 CloudMC: A cloud computing application for Monte Carlo simulation Phys. Med. Biol. 58 N125-33 Reynaert N 2007 First European Workshop on Monte Carlo Treatment Planning J. Phys.: Conf. Ser. 
74 011001 Seuntjens J, Beaulieu L, El Naqa I and Després P 2012 Special section: Selected papers from the Fourth International Workshop on Recent Advances in Monte Carlo Techniques for Radiation Therapy Phys. Med. Biol. 57 (11) E01 Spezi E 2010 Special section: Selected papers from the Second European Workshop on Monte Carlo Treatment Planning (MCTP2009) Phys. Med. Biol. 55 (16) E01 Wagner A, Crop F, Lacornerie T, Vandevelde F and Reynaert N 2013 Use of a liquid ionization chamber for stereotactic radiotherapy dosimetry Phys. Med. Biol. 58 2445-59

  4. Solar Proton Transport within an ICRU Sphere Surrounded by a Complex Shield: Combinatorial Geometry

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Slaba, Tony C.; Badavi, Francis F.; Reddell, Brandon D.; Bahadori, Amir A.

    2015-01-01

    The 3DHZETRN code, with improved neutron and light ion (Z ≤ 2) transport procedures, was recently developed and compared to Monte Carlo (MC) simulations using simplified spherical geometries. It was shown that 3DHZETRN agrees with the MC codes to the extent they agree with each other. In the present report, the 3DHZETRN code is extended to enable analysis in general combinatorial geometry. A more complex shielding structure with internal parts surrounding a tissue sphere is considered and compared against MC simulations. It is shown that even in the more complex geometry, 3DHZETRN agrees well with the MC codes and maintains a high degree of computational efficiency.

  5. Comparison with simulations to experimental data for photo-neutron reactions using SPring-8 Injector

    NASA Astrophysics Data System (ADS)

    Asano, Yoshihiro

    2017-09-01

    Simulations of photo-nuclear reactions using the Monte Carlo codes PHITS and FLUKA have been performed and compared to data measured at the SPring-8 injector with 250 MeV and 961 MeV electrons. Measured production of 206Bi at the beam dumps, due to the photo-nuclear reaction 209Bi(γ,3n)206Bi and the high-energy neutron reaction 209Bi(n,4n)206Bi, has been compared with the simulations. Neutron leakage spectra outside the shield wall are also compared between experiment and simulation.

  6. Preliminary Modelling of Radiation Levels at the Fermilab PIP-II Linac

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lari, L.; Cerutti, F.; Esposito, L. S.

    PIP-II is Fermilab's flagship project for providing powerful, high-intensity proton beams to the laboratory's experiments. The heart of PIP-II is an 800-MeV superconducting linac. It will be located in a new tunnel with new service buildings and connected to the present Booster through a new transfer line. To support the civil engineering design and mechanical integration, this paper provides preliminary estimates of the radiation levels in the gallery at an operational beam loss limit of 0.1 W/m, by means of Monte Carlo calculations with the FLUKA and MARS15 codes.

  7. Path Toward a Unified Geometry for Radiation Transport

    NASA Astrophysics Data System (ADS)

    Lee, Kerry

    The Direct Accelerated Geometry for Radiation Analysis and Design (DAGRAD) element of the RadWorks Project under Advanced Exploration Systems (AES) within the Space Technology Mission Directorate (STMD) of NASA will enable new designs and concepts of operation for radiation risk assessment, mitigation and protection. This element is designed to produce a solution that will allow NASA to calculate the transport of space radiation through complex CAD models using the state-of-the-art analytic and Monte Carlo radiation transport codes. Due to the inherent hazard of astronaut and spacecraft exposure to ionizing radiation in low-Earth orbit (LEO) or in deep space, risk analyses must be performed for all crew vehicles and habitats. Incorporating these analyses into the design process can minimize the mass needed solely for radiation protection. Transport of the radiation fields as they pass through shielding and body materials can be simulated using Monte Carlo techniques or described by the Boltzmann equation, which is obtained by balancing changes in particle fluxes as they traverse a small volume of material with the gains and losses caused by atomic and nuclear collisions. Deterministic codes that solve the Boltzmann transport equation, such as HZETRN (high charge and energy transport code developed by NASA LaRC), are generally computationally faster than Monte Carlo codes such as FLUKA, GEANT4, MCNP(X) or PHITS; however, they are currently limited to transport in one dimension, which poorly represents the secondary light ion and neutron radiation fields. NASA currently uses HZETRN space radiation transport software, both because it is computationally efficient and because proven methods have been developed for using this software to analyze complex geometries. 
Although Monte Carlo codes describe the relevant physics in a fully three-dimensional manner, their computational costs have thus far prevented their widespread use for analysis of complex CAD models, leading to the creation and maintenance of toolkit-specific, simplified geometry models. The work presented here builds on the Direct Accelerated Geometry Monte Carlo (DAGMC) toolkit developed for use with the Monte Carlo N-Particle (MCNP) transport code. The workflow for doing radiation transport on CAD models using MCNP and FLUKA has been demonstrated, and the results of analyses on realistic spacecraft/habitats will be presented. Future work is planned that will further automate this process and enable the use of multiple radiation transport codes on identical geometry models imported from CAD. This effort will enhance the modeling tools used by NASA to accurately evaluate the astronaut space radiation risk and accurately determine the protection provided by as-designed exploration mission vehicles and habitats.

  8. FLUKA Monte Carlo simulations and benchmark measurements for the LHC beam loss monitors

    NASA Astrophysics Data System (ADS)

    Sarchiapone, L.; Brugger, M.; Dehning, B.; Kramer, D.; Stockner, M.; Vlachoudis, V.

    2007-10-01

    One of the crucial elements in terms of machine protection for CERN's Large Hadron Collider (LHC) is its beam loss monitoring (BLM) system. On-line loss measurements must prevent the superconducting magnets from quenching and protect the machine components from damages due to unforeseen critical beam losses. In order to ensure the BLM's design quality, in the final design phase of the LHC detailed FLUKA Monte Carlo simulations were performed for the betatron collimation insertion. In addition, benchmark measurements were carried out with LHC type BLMs installed at the CERN-EU high-energy Reference Field facility (CERF). This paper presents results of FLUKA calculations performed for BLMs installed in the collimation region, compares the results of the CERF measurement with FLUKA simulations and evaluates related uncertainties. This, together with the fact that the CERF source spectra at the respective BLM locations are comparable with those at the LHC, allows assessing the sensitivity of the performed LHC design studies.

  9. Solar proton exposure of an ICRU sphere within a complex structure Part I: Combinatorial geometry.

    PubMed

    Wilson, John W; Slaba, Tony C; Badavi, Francis F; Reddell, Brandon D; Bahadori, Amir A

    2016-06-01

    The 3DHZETRN code, with improved neutron and light ion (Z≤2) transport procedures, was recently developed and compared to Monte Carlo (MC) simulations using simplified spherical geometries. It was shown that 3DHZETRN agrees with the MC codes to the extent they agree with each other. In the present report, the 3DHZETRN code is extended to enable analysis in general combinatorial geometry. A more complex shielding structure with internal parts surrounding a tissue sphere is considered and compared against MC simulations. It is shown that even in the more complex geometry, 3DHZETRN agrees well with the MC codes and maintains a high degree of computational efficiency. Published by Elsevier Ltd.

  10. Design and spectrum calculation of 4H-SiC thermal neutron detectors using FLUKA and TCAD

    NASA Astrophysics Data System (ADS)

    Huang, Haili; Tang, Xiaoyan; Guo, Hui; Zhang, Yimen; Zhang, Yimeng; Zhang, Yuming

    2016-10-01

    SiC is a promising material for neutron detection in harsh environments due to its wide band gap, high displacement threshold energy and high thermal conductivity. To increase the detection efficiency of SiC, a converter layer such as 6LiF or 10B is introduced. In this paper, pulse-height spectra of a PIN diode with a 6LiF conversion layer exposed to thermal neutrons (0.026 eV) are calculated using TCAD and Monte Carlo simulations. First, the thermal-neutron conversion efficiency as a function of 6LiF thickness was calculated using the FLUKA code, and a maximal efficiency of approximately 5% was achieved. Next, the energy distributions of both the 3H and α particles produced by the neutron reaction in 6LiF are analyzed for different ranges of emission angle. Subsequently, the transient pulses generated by the impact of single 3H or α particles are calculated. Finally, pulse-height spectra are obtained, giving a detector efficiency of 4.53%. Comparisons of the simulated results with experimental data are also presented, and the calculated spectrum agrees acceptably with the measurements. This work should be useful for radiation-sensing applications, especially SiC detector design.
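
    The reason the conversion efficiency peaks at a finite 6LiF thickness can be illustrated with a one-dimensional toy model: thicker layers capture more neutrons, but reaction products born far from the exit surface are reabsorbed. The cross-section and range constants below are illustrative assumptions, not the paper's FLUKA values.

```python
import math

SIGMA = 0.06      # illustrative macroscopic 6Li(n,t) cross-section, 1/um
R_PRODUCT = 30.0  # illustrative reaction-product range in 6LiF, um

def efficiency(t_um, steps=1000):
    """Toy detection efficiency of a 6LiF layer of thickness t_um (um):
    capture probability per depth slice times a crude linear escape
    probability for the reaction product."""
    dx = t_um / steps
    eff = 0.0
    for i in range(steps):
        x = (i + 0.5) * dx                               # capture depth
        capture = SIGMA * math.exp(-SIGMA * x) * dx      # capture prob. in slice
        escape = max(0.0, 1.0 - (t_um - x) / R_PRODUCT)  # crude escape prob.
        eff += capture * escape
    return eff

# The competition between capture and reabsorption yields an interior maximum.
best = max(range(1, 101), key=efficiency)   # scan 1-100 um thicknesses
```

    Under these toy constants the optimum falls near 17 um; the real optimum and the ~5% peak efficiency quoted above come from the full FLUKA transport.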

  11. Energy spectrum of 208Pb(n,x) reactions

    NASA Astrophysics Data System (ADS)

    Tel, E.; Kavun, Y.; Özdoǧan, H.; Kaplan, A.

    2018-02-01

    Fission and fusion reactor technologies have been investigated worldwide since the 1950s. In reactor technology, studies of fission and fusion reactions play an important role in developing new-generation designs. In particular, neutron reaction studies have an important place in the development of nuclear materials, so neutron effects on materials should be studied both theoretically and experimentally to improve reactor design. Nuclear reaction codes are very useful tools when experimental data are unavailable; for such circumstances many codes have been created, such as ALICE/ASH, CEM95, PCROSS, TALYS, GEANT and FLUKA. In this study we used the ALICE/ASH, PCROSS and CEM95 codes to calculate the energy spectra of particles emitted when Pb is bombarded by neutrons. The Weisskopf-Ewing model was used for the equilibrium process, while the full exciton, hybrid and geometry-dependent hybrid nuclear reaction models were used for the pre-equilibrium process. The calculated results are discussed and compared with experimental data taken from EXFOR.
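
    The equilibrium (Weisskopf-Ewing) contribution has a characteristic evaporation shape, N(e) ∝ e · σ_inv(e) · exp(-e/T). A sketch assuming a constant inverse cross-section and an illustrative nuclear temperature T (not a fitted value from the paper):

```python
import math

def evaporation_spectrum(e_mev, temperature_mev=1.2):
    """Unnormalised Weisskopf-Ewing neutron evaporation spectrum with a
    constant inverse cross-section: N(e) ~ e * exp(-e / T)."""
    return e_mev * math.exp(-e_mev / temperature_mev)

# With constant sigma_inv the spectrum peaks at e = T:
energies = [0.1 * i for i in range(1, 101)]
peak = max(energies, key=evaporation_spectrum)
```

    The pre-equilibrium (exciton/hybrid) components then add the harder high-energy tail seen in measured spectra.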

  12. Development of Parallel Computing Framework to Enhance Radiation Transport Code Capabilities for Rare Isotope Beam Facility Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kostin, Mikhail; Mokhov, Nikolai; Niita, Koji

    A parallel computing framework has been developed for use with general-purpose radiation transport codes. The framework was implemented as a C++ module that uses MPI for message passing. It is intended to be used with older radiation transport codes implemented in Fortran 77, Fortran 90 or C. The module is largely independent of the radiation transport code it is used with, and is connected to the code by means of a number of interface functions. The framework was developed and tested in conjunction with the MARS15 code. It can be used with other codes such as PHITS, FLUKA and MCNP after certain adjustments. Besides the parallel computing functionality, the framework offers a checkpoint facility that allows calculations to be restarted from a saved checkpoint file. The checkpoint facility can be used in single-process calculations as well as in the parallel regime. The framework corrects some of the known problems with scheduling and load balancing found in the original implementations of the parallel computing functionality in MARS15 and PHITS. The framework can be used efficiently on homogeneous systems and on networks of workstations, where interference from other users is possible.
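
    The checkpoint idea can be illustrated with a language-neutral sketch (here in Python, not the framework's C++/MPI code): periodically saving the RNG state and accumulated tallies lets a run resume where it stopped and produce exactly the same result.

```python
# Hypothetical sketch of a checkpoint facility for a Monte Carlo loop.
import os
import pickle
import random
import tempfile

def run(histories, state_file):
    """Accumulate a toy tally over `histories` samples, checkpointing the
    RNG state and partial results every 100 histories."""
    rng, done, tally = random.Random(42), 0, 0.0
    if os.path.exists(state_file):                     # resume if possible
        with open(state_file, "rb") as f:
            saved = pickle.load(f)
        rng.setstate(saved["rng"])
        done, tally = saved["done"], saved["tally"]
    while done < histories:
        tally += rng.random()                          # stand-in for one history
        done += 1
        if done % 100 == 0:                            # periodic checkpoint
            with open(state_file, "wb") as f:
                pickle.dump({"rng": rng.getstate(), "done": done,
                             "tally": tally}, f)
    return tally

path = os.path.join(tempfile.mkdtemp(), "ckpt.pkl")
full = run(1000, path)      # uninterrupted run, leaves a final checkpoint
resumed = run(1000, path)   # a second call resumes and returns the same tally
```

    In the real framework the analogous state (per-rank RNG streams and tallies) is managed by the C++ module, and MPI handles distributing histories across ranks.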

  13. THE McGill PLANAR HYDROGEN ATMOSPHERE CODE (McPHAC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haakonsen, Christian Bernt; Turner, Monica L.; Tacik, Nick A.

    2012-04-10

    The McGill Planar Hydrogen Atmosphere Code (McPHAC) v1.1 calculates the hydrostatic equilibrium structure and emergent spectrum of an unmagnetized hydrogen atmosphere in the plane-parallel approximation, at surface gravities appropriate for neutron stars. McPHAC incorporates several improvements over previous codes for which tabulated model spectra are available: (1) Thomson scattering is treated anisotropically, which is shown to result in a 0.2%-3% correction in the emergent spectral flux across the 0.1-5 keV passband; (2) the McPHAC source code is made available to the community, allowing it to be scrutinized and modified by other researchers wishing to study or extend its capabilities; and (3) the numerical uncertainty resulting from the discrete and iterative solution is studied as a function of photon energy, indicating that McPHAC is capable of producing spectra with numerical uncertainties <0.01%. The accuracy of the spectra may at present be limited to ~1%, but McPHAC enables researchers to study the impact of uncertain inputs and additional physical effects, thereby supporting future efforts to reduce those inaccuracies. Comparison of McPHAC results with spectra from one of the previous model atmosphere codes (NSA) shows agreement to ≲1% near the peaks of the emergent spectra. However, in the Wien tail a significant deficit of flux in the spectra of the previous model is revealed, determined to be due to the previous work not considering large enough optical depths at the highest photon frequencies. The deficit is most significant for spectra with Teff < 10^5.6 K, though even there it may not be of much practical importance for most observations.

  14. The McGill Planar Hydrogen Atmosphere Code (McPHAC)

    NASA Astrophysics Data System (ADS)

    Haakonsen, Christian Bernt; Turner, Monica L.; Tacik, Nick A.; Rutledge, Robert E.

    2012-04-01

    The McGill Planar Hydrogen Atmosphere Code (McPHAC) v1.1 calculates the hydrostatic equilibrium structure and emergent spectrum of an unmagnetized hydrogen atmosphere in the plane-parallel approximation, at surface gravities appropriate for neutron stars. McPHAC incorporates several improvements over previous codes for which tabulated model spectra are available: (1) Thomson scattering is treated anisotropically, which is shown to result in a 0.2%-3% correction in the emergent spectral flux across the 0.1-5 keV passband; (2) the McPHAC source code is made available to the community, allowing it to be scrutinized and modified by other researchers wishing to study or extend its capabilities; and (3) the numerical uncertainty resulting from the discrete and iterative solution is studied as a function of photon energy, indicating that McPHAC is capable of producing spectra with numerical uncertainties <0.01%. The accuracy of the spectra may at present be limited to ~1%, but McPHAC enables researchers to study the impact of uncertain inputs and additional physical effects, thereby supporting future efforts to reduce those inaccuracies. Comparison of McPHAC results with spectra from one of the previous model atmosphere codes (NSA) shows agreement to ≲1% near the peaks of the emergent spectra. However, in the Wien tail a significant deficit of flux in the spectra of the previous model is revealed, determined to be due to the previous work not considering large enough optical depths at the highest photon frequencies. The deficit is most significant for spectra with Teff < 10^5.6 K, though even there it may not be of much practical importance for most observations.

  15. McPHAC: McGill Planar Hydrogen Atmosphere Code

    NASA Astrophysics Data System (ADS)

    Haakonsen, Christian Bernt; Turner, Monica L.; Tacik, Nick A.; Rutledge, Robert E.

    2012-10-01

    The McGill Planar Hydrogen Atmosphere Code (McPHAC) v1.1 calculates the hydrostatic equilibrium structure and emergent spectrum of an unmagnetized hydrogen atmosphere in the plane-parallel approximation at surface gravities appropriate for neutron stars. McPHAC incorporates several improvements over previous codes for which tabulated model spectra are available: (1) Thomson scattering is treated anisotropically, which is shown to result in a 0.2%-3% correction in the emergent spectral flux across the 0.1-5 keV passband; (2) the McPHAC source code is made available to the community, allowing it to be scrutinized and modified by other researchers wishing to study or extend its capabilities; and (3) the numerical uncertainty resulting from the discrete and iterative solution is studied as a function of photon energy, indicating that McPHAC is capable of producing spectra with numerical uncertainties <0.01%. The accuracy of the spectra may at present be limited to ~1%, but McPHAC enables researchers to study the impact of uncertain inputs and additional physical effects, thereby supporting future efforts to reduce those inaccuracies. Comparison of McPHAC results with spectra from one of the previous model atmosphere codes (NSA) shows agreement to ≲1% near the peaks of the emergent spectra. However, in the Wien tail a significant deficit of flux in the spectra of the previous model is revealed, determined to be due to the previous work not considering large enough optical depths at the highest photon frequencies. The deficit is most significant for spectra with Teff < 10^5.6 K, though even there it may not be of much practical importance for most observations.

  16. Assessment of the neutron dose field around a biomedical cyclotron: FLUKA simulation and experimental measurements.

    PubMed

    Infantino, Angelo; Cicoria, Gianfranco; Lucconi, Giulia; Pancaldi, Davide; Vichi, Sara; Zagni, Federico; Mostacci, Domiziano; Marengo, Mario

    2016-12-01

    In the planning of a new cyclotron facility, accurate knowledge of the radiation field around the accelerator is fundamental for the design of shielding and the protection of workers, the general public and the environment. Monte Carlo simulations can be very useful in this process, and their use is constantly increasing. However, few data have been published so far regarding the proper validation of Monte Carlo simulations against experimental measurements, particularly in the energy range of biomedical cyclotrons. In this work a detailed model of an existing installation of a GE PETtrace 16.5 MeV cyclotron was developed using FLUKA. An extensive measurement campaign of the neutron ambient dose equivalent H*(10) at marked positions around the cyclotron was conducted using a neutron rem-counter probe and CR39 neutron detectors. Data from a previous measurement campaign performed by our group using TLDs were also re-evaluated. The FLUKA model was then validated by comparing the results of high-statistics simulations with the experimental data. In 10 out of 12 measurement locations, FLUKA simulations agreed within uncertainties with all three sets of experimental data; in the remaining 2 positions, the agreement was with 2 of the 3 measurements. Our work quantitatively validates our FLUKA simulation setup and confirms that the Monte Carlo technique can produce accurate results in the energy range of biomedical cyclotrons. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
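
    The "agreement within uncertainties" criterion used in such validations reduces to comparing the simulation-measurement difference against the combined uncertainty. A sketch with hypothetical H*(10) values (not the paper's data):

```python
def agree(sim, u_sim, meas, u_meas, k=1.0):
    """True when |sim - meas| is within k combined standard uncertainties."""
    return abs(sim - meas) <= k * (u_sim ** 2 + u_meas ** 2) ** 0.5

# Hypothetical dose-rate pair (uSv/h) at one measurement position:
ok = agree(sim=12.0, u_sim=0.8, meas=11.5, u_meas=0.6)
```

    Applying this test at each position and for each detector type gives the 10-out-of-12 tally described above.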

  17. Development of a Ne gas target for 22Na production by proton irradiation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mandal, Bidhan Ch., E-mail: mechbidhan@gmail.com; Pal, Gautam; Barua, Luna

    2016-03-15

    The article presents the design and development of a neon gas target for the production of 22Na using a proton beam from the room-temperature cyclotron at the Variable Energy Cyclotron Centre, Kolkata. The target is designed to handle a beam power of 85 W (17 MeV, 5 μA). The design is based on simulation using the computer code FLUKA for the beam dump and CFD-CFX for target cooling. The target has been successfully used for the production of 22Na in a 6-day-long 17 MeV, 5 μA proton irradiation run.
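
    The quoted 85 W follows directly from beam energy times current: for singly charged particles, MeV multiplied by μA gives watts numerically.

```python
def beam_power_w(energy_mev, current_ua):
    """P[W] = E[MeV] * I[uA] for singly charged particles: energy per
    proton (MeV -> J via e) times protons per second (I / e), so the
    elementary charges cancel and MeV * uA = W."""
    return energy_mev * current_ua

power = beam_power_w(17.0, 5.0)   # the 85 W design value quoted above
```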

  18. Sensitivity of atmospheric muon flux calculation to low energy hadronic interaction models

    NASA Astrophysics Data System (ADS)

    Djemil, T.; Attallah, R.; Capdevielle, J. N.

    2007-10-01

    We investigate in this paper the impact of some up-to-date hadronic interaction models on the calculation of the atmospheric muon flux. Calculations are carried out with the air shower simulation code CORSIKA in combination with the hadronic interaction models FLUKA and UrQMD below 80 GeV/nucleon and NEXUS elsewhere. We also examine atmospheric effects using two different parametrizations of the US standard atmosphere. The cosmic-ray spectra of protons and α particles, the only primary particles considered here, are taken according to the force-field model, which properly describes solar modulation. Numerical results are compared with the BESS-2001 experimental data.
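    The force-field model mentioned above reduces solar modulation to a single potential parameter Φ. A hedged sketch of the standard Gleeson–Axford form is given below; the local interstellar spectrum `j_lis` is left as a user-supplied function, and the paper's actual spectra and parameter values are not reproduced:

```python
def modulated_flux(T, phi, j_lis, m=0.938):
    """Force-field approximation for solar modulation (Gleeson-Axford form):

        J(T) = J_LIS(T + phi) * T(T + 2m) / ((T + phi)(T + phi + 2m))

    T   : kinetic energy per nucleon [GeV]
    phi : modulation potential [GeV] (for ions, an effective (Z/A)-scaled
          potential is commonly used; that detail is omitted here)
    m   : rest mass per nucleon [GeV]
    """
    T_lis = T + phi
    return j_lis(T_lis) * (T * (T + 2.0 * m)) / (T_lis * (T_lis + 2.0 * m))
```

With Φ = 0 the modulated flux reduces to the interstellar spectrum, and any Φ > 0 suppresses the low-energy flux, which is the qualitative behavior the abstract relies on.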

  19. A Bonner Sphere Spectrometer with extended response matrix

    NASA Astrophysics Data System (ADS)

    Birattari, C.; Dimovasili, E.; Mitaroff, A.; Silari, M.

    2010-08-01

    This paper describes the design, calibration and applications at high-energy accelerators of an extended-range Bonner Sphere neutron Spectrometer (BSS). The BSS was designed with the FLUKA Monte Carlo code, investigating several combinations of materials and diameters of the moderators for the high-energy channels. The system was calibrated at PTB in Braunschweig, Germany, using monoenergetic neutron beams in the energy range 144 keV–19 MeV. It was subsequently tested with Am-Be source neutrons and in the simulated workplace neutron field at CERF (the CERN-EU high-energy reference field facility). Since 2002, it has been employed for neutron spectral measurements around CERN accelerators.

  20. Activation of accelerator construction materials by heavy ions

    NASA Astrophysics Data System (ADS)

    Katrík, P.; Mustafin, E.; Hoffmann, D. H. H.; Pavlovič, M.; Strašík, I.

    2015-12-01

    Activation data for an aluminum target irradiated by a 200 MeV/u 238U ion beam are presented in the paper. The target was irradiated in the stacked-foil geometry and analyzed using gamma-ray spectroscopy. The purpose of the experiment was to study the role of primary particles, projectile fragments, and target fragments in the activation process using depth profiling of the residual activity. The study identified which particles contribute dominantly to the target activation. The experimental data were compared with Monte Carlo simulations performed with the FLUKA 2011.2c.0 code. This study is part of a research program devoted to the activation of accelerator construction materials by high-energy (⩾200 MeV/u) heavy ions at GSI Darmstadt. The experimental data are needed to validate the computer codes used to simulate the interaction of swift heavy ions with matter.

  1. Performance of two commercial electron beam algorithms over regions close to the lung-mediastinum interface, against Monte Carlo simulation and point dosimetry in virtual and anthropomorphic phantoms.

    PubMed

    Ojala, J; Hyödynmaa, S; Barańczyk, R; Góra, E; Waligórski, M P R

    2014-03-01

    Electron radiotherapy is applied to treat the chest wall close to the mediastinum. The performance of the GGPB and eMC algorithms implemented in the Varian Eclipse treatment planning system (TPS) was studied in this region for 9 and 16 MeV beams, against Monte Carlo (MC) simulations, point dosimetry in a water phantom and dose distributions calculated in virtual phantoms. For the 16 MeV beam, the accuracy of these algorithms was also compared over the lung-mediastinum interface region of an anthropomorphic phantom, against MC calculations and thermoluminescence dosimetry (TLD). In the phantom with a lung-equivalent slab the results were generally congruent, the eMC results for the 9 MeV beam slightly overestimating the lung dose, and the GGPB results for the 16 MeV beam underestimating the lung dose. Over the lung-mediastinum interface, for 9 and 16 MeV beams, the GGPB code underestimated the lung dose and overestimated the dose in water close to the lung, compared to the congruent eMC and MC results. In the anthropomorphic phantom, results of TLD measurements and MC and eMC calculations agreed, while the GGPB code underestimated the lung dose. Good agreement between TLD measurements and MC calculations attests to the accuracy of "full" MC simulations as a reference for benchmarking TPS codes. Application of the GGPB code in chest wall radiotherapy may result in significant underestimation of the lung dose and overestimation of dose to the mediastinum, affecting plan optimization over volumes close to the lung-mediastinum interface, such as the lung or heart. Copyright © 2013 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  2. Path Toward a Unified Geometry for Radiation Transport

    NASA Technical Reports Server (NTRS)

    Lee, Kerry; Barzilla, Janet; Davis, Andrew; Zachmann

    2014-01-01

    The Direct Accelerated Geometry for Radiation Analysis and Design (DAGRAD) element of the RadWorks Project under Advanced Exploration Systems (AES) within the Space Technology Mission Directorate (STMD) of NASA will enable new designs and concepts of operation for radiation risk assessment, mitigation and protection. This element is designed to produce a solution that will allow NASA to calculate the transport of space radiation through complex computer-aided design (CAD) models using the state-of-the-art analytic and Monte Carlo radiation transport codes. Due to the inherent hazard of astronaut and spacecraft exposure to ionizing radiation in low-Earth orbit (LEO) or in deep space, risk analyses must be performed for all crew vehicles and habitats. Incorporating these analyses into the design process can minimize the mass needed solely for radiation protection. Transport of the radiation fields as they pass through shielding and body materials can be simulated using Monte Carlo techniques or described by the Boltzmann equation, which is obtained by balancing changes in particle fluxes as they traverse a small volume of material with the gains and losses caused by atomic and nuclear collisions. Deterministic codes that solve the Boltzmann transport equation, such as HZETRN [high charge and energy transport code developed by NASA Langley Research Center (LaRC)], are generally computationally faster than Monte Carlo codes such as FLUKA, GEANT4, MCNP(X) or PHITS; however, they are currently limited to transport in one dimension, which poorly represents the secondary light ion and neutron radiation fields. NASA currently uses HZETRN space radiation transport software, both because it is computationally efficient and because proven methods have been developed for using this software to analyze complex geometries. 
    Although Monte Carlo codes describe the relevant physics in a fully three-dimensional manner, their computational costs have thus far prevented their widespread use for analysis of complex CAD models, leading to the creation and maintenance of simplistic, toolkit-specific geometry models. The work presented here builds on the Direct Accelerated Geometry Monte Carlo (DAGMC) toolkit developed for use with the Monte Carlo N-Particle (MCNP) transport code. The workflow for achieving radiation transport on CAD models using MCNP and FLUKA has been demonstrated, and the results of analyses on realistic spacecraft/habitats will be presented. Future work is planned that will further automate this process and enable the use of multiple radiation transport codes on identical geometry models imported from CAD. This effort will enhance the modeling tools used by NASA to accurately evaluate the astronaut space radiation risk and accurately determine the protection provided by as-designed exploration mission vehicles and habitats.

  3. Adaptive Transmission and Channel Modeling for Frequency Hopping Communications

    DTIC Science & Technology

    2009-09-21

    proposed adaptive transmission method has much greater system capacity than a conventional non-adaptive MC direct-sequence (DS)-CDMA system. • We...several mobile radio systems. First, a new improved allocation algorithm was proposed for the multicarrier code-division multiple access (MC-CDMA) system...Multicarrier code-division multiple access (MC-CDMA) with adaptive frequency hopping (AFH) has attracted the attention of researchers due to its

  4. SU-F-T-144: Analytical Closed Form Approximation for Carbon Ion Bragg Curves in Water

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tuomanen, S; Moskvin, V; Farr, J

    2016-06-15

    Purpose: Semi-empirical modeling is a powerful computational method in radiation dosimetry. A set of approximations exists for the proton depth dose distribution (DDD) in water. However, the modeling is more complicated for carbon ions due to fragmentation. This study addresses this by providing and evaluating a new methodology for DDD modeling of carbon ions in water. Methods: The FLUKA Monte Carlo (MC) general-purpose transport code was used to simulate carbon DDDs for energies of 100–400 MeV in water as reference data for model benchmarking. Using Thomas Bortfeld’s closed-form equation approximating proton Bragg curves as a basis, we derived the critical constants for a beam of carbon ions by applying the radiation transport models of Lee et al. and Geiger to our simulated carbon curves. We hypothesized that including a new exponential (κ) residual-distance parameter in Bortfeld’s fluence reduction relation would improve DDD modeling for carbon ions. We are introducing an additional term to be added to Bortfeld’s equation to describe the fragmentation tail. This term accounts for the pre-peak dose from nuclear fragments (NF). In the post-peak region, the NF transport will be treated as new beams utilizing the Glauber model for interaction cross sections and the Abrasion-Ablation fragmentation model. Results: The carbon-beam-specific constants in the developed model were determined to be: p=1.75, β=0.008 cm⁻¹, γ=0.6, α=0.0007 cm MeV, σmono=0.08, and the new exponential parameter κ=0.55. This produced a close match for the plateau part of the curve (maximum deviation 6.37%). Conclusion: The derived semi-empirical model provides an accurate approximation of the MC-simulated clinical carbon DDDs. This is the first direct semi-empirical simulation for the dosimetry of therapeutic carbon ions. The accurate modeling of the NF tail in the carbon DDD will provide key insight into distal-edge dose deposition formation.
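    The abstract does not reproduce the closed form itself. As an illustration only, the straggling-free Bortfeld depth-dose shape, evaluated with the carbon constants quoted above, might be sketched as follows; the paper's κ modification and fragmentation-tail term are not modeled here, and the formula is the published proton expression assumed to carry over with the refitted constants:

```python
def bortfeld_dose(z, R, p=1.75, beta=0.008, gamma=0.6, alpha=0.0007, rho=1.0):
    """Straggling-free Bortfeld-style depth dose per unit fluence.

    D(z) = [(R-z)^(1/p - 1) + (beta + gamma*beta*p)*(R-z)^(1/p)]
           / (rho * p * alpha^(1/p) * (1 + beta*R)),   for z < R,
    with z, R in cm and the carbon constants quoted in the abstract.
    Returns 0 beyond the range R (no fragmentation tail is modeled).
    """
    if z >= R:
        return 0.0
    u = R - z  # residual range
    norm = rho * p * alpha ** (1.0 / p) * (1.0 + beta * R)
    return (u ** (1.0 / p - 1.0) + (beta + gamma * beta * p) * u ** (1.0 / p)) / norm
```

Because the leading term diverges as (R−z)^(1/p−1), the dose rises toward the Bragg peak near z = R, reproducing the qualitative plateau-plus-peak shape the constants were fitted to.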

  5. Dark matter search in a Beam-Dump eXperiment (BDX) at Jefferson Lab: an update on PR12-16-001

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Battaglieri, M.

    This document is an update to the proposal PR12-16-001, Dark matter search in a Beam-Dump eXperiment (BDX) at Jefferson Lab, submitted to JLab-PAC44 in 2016, reporting progress in addressing questions raised regarding the beam-on backgrounds. The concerns are addressed by adopting a new simulation tool, FLUKA, and planning measurements of muon fluxes from the dump with its existing shielding. First, we have implemented the detailed BDX experimental geometry into a FLUKA simulation, in consultation with experts from the JLab Radiation Control Group. The FLUKA simulation has been compared directly to our GEANT4 simulations and shown to agree in regions of validity. The FLUKA interaction package, with a tuned set of biasing weights, is naturally able to generate reliable particle distributions with very small probabilities and therefore predict rates at the detector location beyond the planned shielding around the beam dump. Second, we have developed a plan to conduct measurements of the muon flux from the Hall-A dump in its current configuration to validate our simulations.

  6. Comparison of optimized single and multifield irradiation plans of antiproton, proton and carbon ion beams.

    PubMed

    Bassler, Niels; Kantemiris, Ioannis; Karaiskos, Pantelis; Engelke, Julia; Holzscheiter, Michael H; Petersen, Jørgen B

    2010-04-01

    Antiprotons have been suggested as a possibly superior modality for radiotherapy, due to the energy released when antiprotons annihilate, which enhances the Bragg peak and introduces a high-LET component to the dose. However, concerns have been expressed about the inferior lateral dose distribution caused by the annihilation products. We use the Monte Carlo code FLUKA to generate depth-dose kernels for protons, antiprotons, and carbon ions. Using these, we then build virtual treatment plans optimized according to ICRU recommendations for the different beam modalities, which are then recalculated with FLUKA. Dose-volume histograms generated from these plans can be used to compare the different irradiations. The enhancement in physical and possibly biological dose from annihilating antiprotons can significantly lower the dose in the entrance channel, but only at the expense of a diffuse low-dose background from long-range secondary particles. Lateral dose distributions are improved using active beam delivery methods instead of flat fields. Dose-volume histograms for different treatment scenarios show that antiprotons have the potential to reduce the volume of normal tissue receiving medium to high dose; however, in the low-dose region antiprotons are inferior to both protons and carbon ions. This limits their potential usage to situations where dose to normal tissue must be reduced as much as possible. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.
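    The dose-volume histograms used for these comparisons are straightforward to compute from a voxelized dose distribution: for each dose level, record the fraction of the structure's voxels receiving at least that dose. A minimal sketch (function name and binning are illustrative, not from the paper):

```python
import numpy as np

def cumulative_dvh(dose, n_levels=101):
    """Cumulative dose-volume histogram for one structure.

    dose     : array of per-voxel doses inside the structure
    returns  : (dose levels, fraction of voxels receiving >= each level)
    """
    dose = np.asarray(dose, dtype=float).ravel()
    levels = np.linspace(0.0, dose.max(), n_levels)
    volume_frac = np.array([(dose >= lv).mean() for lv in levels])
    return levels, volume_frac
```

By construction the curve starts at 1.0 (every voxel receives at least zero dose) and decreases monotonically, which is the shape compared between modalities in the abstract.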

  7. A method for radiological characterization based on fluence conversion coefficients

    NASA Astrophysics Data System (ADS)

    Froeschl, Robert

    2018-06-01

    Radiological characterization of components in accelerator environments is often required to ensure adequate radiation protection during maintenance, transport and handling, as well as for the selection of the proper disposal pathway. The relevant quantities are typically weighted sums of specific activities with radionuclide-specific weighting coefficients. Traditional Monte Carlo methods either score radionuclide creation events directly, or score the particle fluences in the regions of interest and weight them off-line with radionuclide production cross sections. The presented method bases the radiological characterization on a set of fluence conversion coefficients. For a given irradiation profile and cool-down time, radionuclide production cross sections, material composition and radionuclide-specific weighting coefficients, a set of particle-type- and energy-dependent fluence conversion coefficients is computed. These fluence conversion coefficients can then be used in a Monte Carlo transport code to perform on-line weighting and directly obtain the desired radiological characterization, either by using built-in multiplier features such as in the PHITS code or by writing a dedicated user routine such as for the FLUKA code. The presented method has been validated against the standard event-based methods directly available in Monte Carlo transport codes.
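    The paper's exact coefficient definition is not reproduced in the abstract. One plausible form, for a single constant-rate irradiation step with build-up over t_irr and decay over t_cool, is sketched below; all names and the simplified activation model are assumptions for illustration:

```python
import math

def fluence_conversion_coeff(e, weights, xsec, lam, t_irr, t_cool, n_dens):
    """Coefficient c(E) mapping a scored fluence rate phi(E) [cm^-2 s^-1]
    to a weighted activity sum_r w_r * A_r, assuming for each nuclide r

        A_r = n_dens * sigma_r(E) * phi * (1 - exp(-lam_r*t_irr)) * exp(-lam_r*t_cool)

    e       : energy-bin index
    weights : radionuclide-specific weighting coefficients w_r
    xsec    : xsec[r][e], production cross section [cm^2]
    lam     : decay constants [1/s]
    n_dens  : target atom number density [cm^-3]
    """
    return sum(
        w * n_dens * xsec[r][e]
        * (1.0 - math.exp(-lam[r] * t_irr)) * math.exp(-lam[r] * t_cool)
        for r, w in enumerate(weights)
    )
```

Scoring fluence multiplied on-line by such a coefficient (e.g. via a user weighting routine) then yields the weighted activity directly, which is the essence of the method described above.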

  8. Dosimetric impact of the low-dose envelope of scanned proton beams at a ProBeam facility: comparison of measurements with TPS and MC calculations.

    PubMed

    Würl, M; Englbrecht, F; Parodi, K; Hillbrand, M

    2016-01-21

    Due to the low-dose envelope of scanned proton beams, the dose output depends on the size of the irradiated field or volume. While this field size dependence has already been extensively investigated by measurements and Monte Carlo (MC) simulations for single pencil beams or monoenergetic fields, reports on the relevance of this effect for analytical dose calculation models are limited. Previous studies on this topic only exist for specific beamline designs. However, the amount of large-angle scattered primary and long-range secondary particles, and thus the relevance of the low-dose envelope, can be considerably influenced by the particular design of the treatment nozzle. In this work, we therefore addressed the field size dependence of the dose output at the commercially available ProBeam(®) beamline, which is being built in several facilities worldwide. We compared treatment planning dose calculations with ionization chamber (IC) measurements and MC simulations, using an experimentally validated FLUKA MC model of the scanning beamline. To this aim, monoenergetic square fields of three energies, as well as spherical target volumes were studied, including the investigation on the influence of the lateral spot spacing on the field size dependence. For the spherical target volumes, MC as well as analytical dose calculation were found in excellent agreement with the measurements in the center of the spread-out Bragg peak. In the plateau region, the treatment planning system (TPS) tended to overestimate the dose compared to MC calculations and IC measurements by up to almost 5% for the smallest investigated sphere and for small monoenergetic square fields. Narrower spot spacing slightly enhanced the field size dependence of the dose output. The deviations in the plateau dose were found to go in the clinically safe direction, i.e. the actual deposited dose outside the target was found to be lower than predicted by the TPS. 
Thus, the moderate overestimation of dose to normal tissue by the TPS is likely to result in no severe consequences in clinical cases, even for the most critical cases of small target volumes.

  9. Event Generators for Simulating Heavy Ion Interactions of Interest in Evaluating Risks in Human Spaceflight

    NASA Technical Reports Server (NTRS)

    Wilson, Thomas L.; Pinsky, Lawrence; Andersen, Victor; Empl, Anton; Lee, Kerry; Smirmov, Georgi; Zapp, Neal; Ferrari, Alfredo; Tsoulou, Katerina; Roesler, Stefan

    2005-01-01

    Simulating the Space Radiation environment with Monte Carlo Codes, such as FLUKA, requires the ability to model the interactions of heavy ions as they penetrate spacecraft and crew member's bodies. Monte-Carlo-type transport codes use total interaction cross sections to determine probabilistically when a particular type of interaction has occurred. Then, at that point, a distinct event generator is employed to determine separately the results of that interaction. The space radiation environment contains a full spectrum of radiation types, including relativistic nuclei, which are the most important component for the evaluation of crew doses. Interactions between incident protons with target nuclei in the spacecraft materials and crew member's bodies are well understood. However, the situation is substantially less comfortable for incident heavier nuclei (heavy ions). We have been engaged in developing several related heavy ion interaction models based on a Quantum Molecular Dynamics-type approach for energies up through about 5 GeV per nucleon (GeV/A) as part of a NASA Consortium that includes a parallel program of cross section measurements to guide and verify this code development.
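    The probabilistic use of total cross sections described above is the core of any MC transport step: sample a free-path length from the total macroscopic cross section, then choose the interaction channel in proportion to its partial cross section, at which point the event generator takes over. A generic sketch of that step (not FLUKA's actual implementation; names are illustrative):

```python
import math
import random

def sample_interaction(sigma_per_channel, number_density):
    """Sample the distance to the next interaction and the interaction channel.

    sigma_per_channel : microscopic cross sections [cm^2] per channel
    number_density    : target atom density [cm^-3]
    Returns (distance [cm], channel index).
    """
    Sigma = [number_density * s for s in sigma_per_channel]  # macroscopic
    Sigma_tot = sum(Sigma)
    # Exponential free-path sampling; 1 - random() avoids log(0).
    distance = -math.log(1.0 - random.random()) / Sigma_tot
    # Channel chosen with probability Sigma_i / Sigma_tot.
    u = random.random() * Sigma_tot
    acc = 0.0
    for i, s in enumerate(Sigma):
        acc += s
        if u <= acc:
            return distance, i
    return distance, len(Sigma) - 1
```

The mean of the sampled distances converges to the mean free path 1/Σ_tot, and the channel frequencies converge to the cross-section ratios, which is exactly the behavior the abstract attributes to MC-type transport codes.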

  10. Monte Carlo and analytical calculations for characterization of gas bremsstrahlung in ILSF insertion devices

    NASA Astrophysics Data System (ADS)

    Salimi, E.; Rahighi, J.; Sardari, D.; Mahdavi, S. R.; Lamehi Rachti, M.

    2014-12-01

    Gas bremsstrahlung is generated in high-energy electron storage rings through the interaction of the electron beam with residual gas molecules in the vacuum chamber. In this paper, Monte Carlo calculations have been performed to evaluate the radiation hazard due to gas bremsstrahlung in the Iranian Light Source Facility (ILSF) insertion devices. Shutter/stopper dimensions are determined, and the dose rate from photoneutrons produced via the giant-resonance photonuclear reaction inside the shutter/stopper is also obtained. Other characteristics of gas bremsstrahlung, such as photon fluence, energy spectrum, angular distribution and equivalent dose in a tissue-equivalent phantom, have also been investigated with the FLUKA Monte Carlo code.

  11. Study on induced radioactivity of China Spallation Neutron Source

    NASA Astrophysics Data System (ADS)

    Wu, Qing-Biao; Wang, Qing-Bin; Wu, Jing-Min; Ma, Zhong-Jian

    2011-06-01

    China Spallation Neutron Source (CSNS) is the first High Energy Intense Proton Accelerator planned to be constructed in China during the State Eleventh Five-Year Plan period, whose induced radioactivity is very important for occupational disease hazard assessment and environmental impact assessment. Adopting the FLUKA code, the authors have constructed a cylinder-tunnel geometric model and a line-source sampling physical model, deduced proper formulas to calculate air activation, and analyzed various issues with regard to the activation of different tunnel parts. The results show that the environmental impact resulting from induced activation is negligible, whereas the residual radiation in the tunnels has a great influence on maintenance personnel, so strict measures should be adopted.

  12. Monte Carlo Simulations of Background Spectra in Integral Imager Detectors

    NASA Technical Reports Server (NTRS)

    Armstrong, T. W.; Colborn, B. L.; Dietz, K. L.; Ramsey, B. D.; Weisskopf, M. C.

    1998-01-01

    Predictions of the expected gamma-ray backgrounds in the ISGRI (CdTe) and PiCsIT (Csl) detectors on INTEGRAL due to cosmic-ray interactions and the diffuse gamma-ray background have been made using a coupled set of Monte Carlo radiation transport codes (HETC, FLUKA, EGS4, and MORSE) and a detailed, 3-D mass model of the spacecraft and detector assemblies. The simulations include both the prompt background component from induced hadronic and electromagnetic cascades and the delayed component due to emissions from induced radioactivity. Background spectra have been obtained with and without the use of active (BGO) shielding and charged particle rejection to evaluate the effectiveness of anticoincidence counting on background rejection.

  13. A comparative study of history-based versus vectorized Monte Carlo methods in the GPU/CUDA environment for a simple neutron eigenvalue problem

    NASA Astrophysics Data System (ADS)

    Liu, Tianyu; Du, Xining; Ji, Wei; Xu, X. George; Brown, Forrest B.

    2014-06-01

    For nuclear reactor analysis such as the neutron eigenvalue calculations, the time consuming Monte Carlo (MC) simulations can be accelerated by using graphics processing units (GPUs). However, traditional MC methods are often history-based, and their performance on GPUs is affected significantly by the thread divergence problem. In this paper we describe the development of a newly designed event-based vectorized MC algorithm for solving the neutron eigenvalue problem. The code was implemented using NVIDIA's Compute Unified Device Architecture (CUDA), and tested on a NVIDIA Tesla M2090 GPU card. We found that although the vectorized MC algorithm greatly reduces the occurrence of thread divergence thus enhancing the warp execution efficiency, the overall simulation speed is roughly ten times slower than the history-based MC code on GPUs. Profiling results suggest that the slow speed is probably due to the memory access latency caused by the large amount of global memory transactions. Possible solutions to improve the code efficiency are discussed.
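    The history-based versus event-based distinction discussed above can be made concrete with a toy absorption problem: a history-based kernel follows one particle's whole random walk (walks of different lengths diverge on a GPU warp), whereas an event-based kernel processes one collision for all live particles per pass. A minimal NumPy sketch, not the paper's CUDA code:

```python
import numpy as np

def history_based(n, p_absorb, rng):
    """One full random walk per particle: loop lengths differ per particle,
    which is the source of thread divergence on GPUs."""
    steps = np.empty(n, dtype=int)
    for i in range(n):
        s = 0
        while rng.random() >= p_absorb:  # particle survives this collision
            s += 1
        steps[i] = s
    return steps

def event_based(n, p_absorb, rng):
    """One collision event for ALL live particles per pass: every pass does
    the same work, so it vectorizes (and maps to GPU warps) cleanly."""
    steps = np.zeros(n, dtype=int)
    alive = np.ones(n, dtype=bool)
    while alive.any():
        survive = rng.random(alive.sum()) >= p_absorb
        idx = np.flatnonzero(alive)
        steps[idx[survive]] += 1     # survivors take another step
        alive[idx[~survive]] = False # absorbed particles leave the bank
    return steps
```

Both kernels sample the same geometric distribution of collision counts; only the control flow differs, which is the point of the comparison in the abstract.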

  14. Fast GPU-based Monte Carlo code for SPECT/CT reconstructions generates improved 177Lu images.

    PubMed

    Rydén, T; Heydorn Lagerlöf, J; Hemmingsson, J; Marin, I; Svensson, J; Båth, M; Gjertsson, P; Bernhardt, P

    2018-01-04

    Full Monte Carlo (MC)-based SPECT reconstructions have a strong potential for correcting for image-degrading factors, but the reconstruction times are long. The objective of this study was to develop a highly parallel Monte Carlo code for fast, ordered subset expectation maximization (OSEM) reconstructions of SPECT/CT images. The MC code was written in the Compute Unified Device Architecture language for a computer with four graphics processing units (GPUs) (GeForce GTX Titan X, Nvidia, USA). This enabled simulations of parallel photon emissions from the voxel matrix (128³ or 256³). Each computed tomography (CT) number was converted to attenuation coefficients for photoabsorption, coherent scattering, and incoherent scattering. For photon scattering, the deflection angle was determined by the differential scattering cross sections. An angular response function was developed and used to model the accepted angles for photon interaction with the crystal, and a detector scattering kernel was used to model photon scattering in the detector. Predefined energy and spatial resolution kernels for the crystal were used. The MC code was implemented in the OSEM reconstruction of clinical and phantom ¹⁷⁷Lu SPECT/CT images. The Jaszczak image quality phantom was used to evaluate the performance of the MC reconstruction in comparison with attenuation-corrected (AC) OSEM reconstructions and AC OSEM reconstructions with resolution recovery corrections (RRC). The performance of the MC code was 3200 million photons/s. The required number of photons emitted per voxel to obtain a sufficiently low noise level in the simulated image was 200 for a 128³ voxel matrix. With this number of emitted photons/voxel, the MC-based OSEM reconstruction with ten subsets was performed within 20 s/iteration. The images converged after around six iterations. Therefore, the reconstruction time was around 3 min. 
    The activity recovery for the spheres in the Jaszczak phantom was clearly improved with MC-based OSEM reconstruction, e.g., the activity recovery was 88% for the largest sphere, while it was 66% for AC-OSEM and 79% for RRC-OSEM. The GPU-based MC code generated an MC-based SPECT/CT reconstruction within a few minutes, and reconstructed patient images of ¹⁷⁷Lu-DOTATATE treatments revealed clearly improved resolution and contrast.
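    The OSEM update itself, independent of how the projector is implemented, takes a standard multiplicative form: for each subset s of projections, x ← x · Aₛᵀ(yₛ/(Aₛx)) / Aₛᵀ1. A minimal NumPy sketch with a dense toy system matrix (the paper's GPU MC projector is replaced here by a plain matrix product):

```python
import numpy as np

def osem(A, y, n_subsets, n_iter, x0=None):
    """Ordered-subset expectation maximization.

    A : (n_proj, n_voxels) system matrix (toy stand-in for a projector)
    y : measured projection data
    For each subset s:  x <- x * A_s^T (y_s / (A_s x)) / A_s^T 1
    """
    n_rows, n_cols = A.shape
    x = np.ones(n_cols) if x0 is None else x0.astype(float).copy()
    # Interleaved subsets, a common ordering choice.
    subsets = [np.arange(s, n_rows, n_subsets) for s in range(n_subsets)]
    for _ in range(n_iter):
        for rows in subsets:
            As, ys = A[rows], y[rows]
            proj = As @ x                      # forward projection
            proj[proj == 0] = 1e-12            # guard against division by zero
            sens = np.maximum(As.T @ np.ones(len(rows)), 1e-12)
            x *= (As.T @ (ys / proj)) / sens   # multiplicative EM update
    return x
```

With n_subsets = 1 this reduces to plain MLEM; using more subsets accelerates convergence per full pass through the data, which is why the paper's reconstruction converges in around six iterations with ten subsets.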

  15. The specific purpose Monte Carlo code McENL for simulating the response of epithermal neutron lifetime well logging tools

    NASA Astrophysics Data System (ADS)

    Prettyman, T. H.; Gardner, R. P.; Verghese, K.

    1993-08-01

    A new specific purpose Monte Carlo code called McENL for modeling the time response of epithermal neutron lifetime tools is described. The weight windows technique, employing splitting and Russian roulette, is used with an automated importance function based on the solution of an adjoint diffusion model to improve the code efficiency. Complete composition and density correlated sampling is also included in the code, and can be used to study the effect on tool response of small variations in the formation, borehole, or logging tool composition and density. An illustration of the latter application is given for the density of a thermal neutron filter. McENL was benchmarked against test-pit data for the Mobil pulsed neutron porosity tool and was found to be very accurate. Results of the experimental validation and details of code performance are presented.
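    The weight-windows technique named above combines splitting (weight above the window) with Russian roulette (weight below it) so that the expected total weight is preserved, a "fair game". A generic sketch, not McENL's actual routine (window bounds, survival weight, and the cap on split copies are illustrative):

```python
import random

def apply_weight_window(weight, w_low, w_high, w_survive=None):
    """Weight-window check for one particle; returns the list of surviving
    particle weights (possibly empty, possibly several split copies)."""
    if w_survive is None:
        w_survive = 0.5 * (w_low + w_high)
    if weight > w_high:
        # Split: n copies of equal weight, total weight exactly preserved.
        n = min(int(weight / w_high) + 1, 10)
        return [weight / n] * n
    if weight < w_low:
        # Russian roulette: survive with probability weight / w_survive,
        # so the expected weight (weight/w_survive)*w_survive equals weight.
        if random.random() < weight / w_survive:
            return [w_survive]
        return []
    return [weight]
```

Splitting conserves weight deterministically and roulette conserves it in expectation, which is what makes the variance-reduction scheme unbiased while concentrating computing effort on important particles.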

  16. Space Radiation Transport Codes: A Comparative Study for Galactic Cosmic Rays Environment

    NASA Astrophysics Data System (ADS)

    Tripathi, Ram; Wilson, John W.; Townsend, Lawrence W.; Gabriel, Tony; Pinsky, Lawrence S.; Slaba, Tony

    For long-duration and/or deep-space human missions, protection from severe space radiation exposure is a challenging design constraint and may be a limiting factor. The space radiation environment consists of galactic cosmic rays (GCR), solar particle events (SPE), and trapped radiation, and includes ions of all the known elements over a very broad energy range. These ions penetrate spacecraft materials, producing nuclear fragments and secondary particles that damage biological tissues, microelectronic devices, and materials. In deep-space missions, where the Earth's magnetic field does not provide protection from space radiation, the GCR environment is significantly enhanced due to the absence of geomagnetic cut-off and is a major component of radiation exposure. Accurate risk assessments critically depend on the accuracy of the input information as well as on the radiation transport codes used, so systematic verification of codes is necessary. In this study, comparisons are made between the deterministic code HZETRN2006 and the Monte Carlo codes HETC-HEDS and FLUKA for an aluminum shield followed by a water target exposed to the 1977 solar-minimum GCR spectrum. The interaction and transport of the high-charge ions present in the GCR environment provide a stringent test for the comparison of the codes. Dose, dose equivalent and flux spectra are compared; details of the comparisons will be discussed, and conclusions will be drawn for future directions.
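    Dose equivalent, one of the compared quantities, weights absorbed dose by a LET-dependent quality factor. A sketch using the ICRP 60 Q(L) relationship is given below; the piecewise form is quoted from ICRP Publication 60 to the best of our knowledge, and the helper names are illustrative:

```python
import math

def icrp60_quality_factor(L):
    """ICRP 60 quality factor Q(L); L is unrestricted LET in keV/um."""
    if L < 10.0:
        return 1.0
    if L <= 100.0:
        return 0.32 * L - 2.2
    return 300.0 / math.sqrt(L)

def dose_equivalent(doses, lets):
    """H = sum_i Q(L_i) * D_i over dose contributions binned by LET."""
    return sum(icrp60_quality_factor(L) * D for D, L in zip(doses, lets))
```

Because Q(L) rises steeply between 10 and 100 keV/μm, small differences in the high-LET flux predicted by the codes translate into larger differences in dose equivalent than in dose, which is why both quantities are compared in the study.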

  17. A GPU OpenCL based cross-platform Monte Carlo dose calculation engine (goMC)

    NASA Astrophysics Data System (ADS)

    Tian, Zhen; Shi, Feng; Folkerts, Michael; Qin, Nan; Jiang, Steve B.; Jia, Xun

    2015-09-01

    Monte Carlo (MC) simulation has been recognized as the most accurate dose calculation method for radiotherapy. However, the extremely long computation time impedes its clinical application. Recently, a lot of effort has been made to realize fast MC dose calculation on graphic processing units (GPUs). However, most of the GPU-based MC dose engines have been developed under NVidia’s CUDA environment. This limits the code portability to other platforms, hindering the introduction of GPU-based MC simulations to clinical practice. The objective of this paper is to develop a GPU OpenCL based cross-platform MC dose engine named goMC with coupled photon-electron simulation for external photon and electron radiotherapy in the MeV energy range. Compared to our previously developed GPU-based MC code named gDPM (Jia et al 2012 Phys. Med. Biol. 57 7783-97), goMC has two major differences. First, it was developed under the OpenCL environment for high code portability and hence could be run not only on different GPU cards but also on CPU platforms. Second, we adopted the electron transport model used in EGSnrc MC package and PENELOPE’s random hinge method in our new dose engine, instead of the dose planning method employed in gDPM. Dose distributions were calculated for a 15 MeV electron beam and a 6 MV photon beam in a homogeneous water phantom, a water-bone-lung-water slab phantom and a half-slab phantom. Satisfactory agreement between the two MC dose engines goMC and gDPM was observed in all cases. The average dose differences in the regions that received a dose higher than 10% of the maximum dose were 0.48-0.53% for the electron beam cases and 0.15-0.17% for the photon beam cases. In terms of efficiency, goMC was ~4-16% slower than gDPM when running on the same NVidia TITAN card for all the cases we tested, due to both the different electron transport models and the different development environments. 
The code portability of our new dose engine goMC was validated by successfully running it on a variety of different computing devices including an NVidia GPU card, two AMD GPU cards and an Intel CPU processor. Computational efficiency among these platforms was compared.

  18. A GPU OpenCL based cross-platform Monte Carlo dose calculation engine (goMC).

    PubMed

    Tian, Zhen; Shi, Feng; Folkerts, Michael; Qin, Nan; Jiang, Steve B; Jia, Xun

    2015-10-07

    Monte Carlo (MC) simulation has been recognized as the most accurate dose calculation method for radiotherapy. However, the extremely long computation time impedes its clinical application. Recently, a lot of effort has been made to realize fast MC dose calculation on graphic processing units (GPUs). However, most of the GPU-based MC dose engines have been developed under NVidia's CUDA environment. This limits the code portability to other platforms, hindering the introduction of GPU-based MC simulations to clinical practice. The objective of this paper is to develop a GPU OpenCL based cross-platform MC dose engine named goMC with coupled photon-electron simulation for external photon and electron radiotherapy in the MeV energy range. Compared to our previously developed GPU-based MC code named gDPM (Jia et al 2012 Phys. Med. Biol. 57 7783-97), goMC has two major differences. First, it was developed under the OpenCL environment for high code portability and hence could be run not only on different GPU cards but also on CPU platforms. Second, we adopted the electron transport model used in EGSnrc MC package and PENELOPE's random hinge method in our new dose engine, instead of the dose planning method employed in gDPM. Dose distributions were calculated for a 15 MeV electron beam and a 6 MV photon beam in a homogeneous water phantom, a water-bone-lung-water slab phantom and a half-slab phantom. Satisfactory agreement between the two MC dose engines goMC and gDPM was observed in all cases. The average dose differences in the regions that received a dose higher than 10% of the maximum dose were 0.48-0.53% for the electron beam cases and 0.15-0.17% for the photon beam cases. In terms of efficiency, goMC was ~4-16% slower than gDPM when running on the same NVidia TITAN card for all the cases we tested, due to both the different electron transport models and the different development environments. 
The code portability of our new dose engine goMC was validated by successfully running it on a variety of different computing devices including an NVidia GPU card, two AMD GPU cards and an Intel CPU processor. Computational efficiency among these platforms was compared.
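
The random-hinge method mentioned above can be illustrated with a toy 2D condensed-history step: fly straight for a random fraction of the path length, apply the sampled multiple-scattering deflection at that hinge point, then complete the step in the new direction. This is only a schematic sketch of the idea (the function name and the fixed deflection angle are hypothetical; it is not goMC's actual transport code):

```python
import math
import random

def random_hinge_step(x, y, ux, uy, s, theta):
    """Toy 2D condensed-history step of path length s: fly straight to a
    uniformly sampled hinge point, rotate the direction by the sampled
    multiple-scattering angle theta, then fly the remaining path length."""
    tau = random.random() * s              # hinge position along the step
    x, y = x + ux * tau, y + uy * tau      # straight flight to the hinge
    c, sn = math.cos(theta), math.sin(theta)
    ux, uy = ux * c - uy * sn, ux * sn + uy * c   # deflect at the hinge
    rem = s - tau
    x, y = x + ux * rem, y + uy * rem      # straight flight to the end
    return x, y, ux, uy

random.seed(1)
x, y, ux, uy = random_hinge_step(0.0, 0.0, 1.0, 0.0, s=2.0, theta=0.1)
```

Averaged over many steps, the uniformly sampled hinge position is designed to reproduce the low-order spatial moments of the underlying multiple-scattering model, which is what makes the method attractive for condensed-history transport.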

  19. MO-FG-CAMPUS-TeP3-05: Limitations of the Dose Weighted LET Concept for Intensity Modulated Proton Therapy in the Distal Falloff Region and Beyond

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moskvin, V; Pirlepesov, F; Farr, J

    2016-06-15

    Purpose: Dose-weighted linear energy transfer (dLET) has been shown to be useful for the analysis of late effects in proton therapy. This study presents the results of testing the dLET concept for intensity modulated proton therapy (IMPT) with a discrete spot scanning beam system without use of an aperture or compensator (AC). Methods: IMPT (no AC) and broad beams (BB) with AC were simulated in the TOPAS and FLUKA code systems. Information from the independently tested Monte Carlo Damage Simulation (MCDS) was integrated into the FLUKA code system to account for spatial variations in the RBE for protons and other light ions using an endpoint of DNA double strand break (DSB) induction. Results: The proton spectra for IMPT beams at depths beyond the distal edge contain a tail of high energy protons up to 100 MeV. The integral of the tail is comparable to the number of 5–8 MeV protons at the tip of the Bragg peak (BP). The dose averaged energy (dEav) decreases to 7 MeV at the tip of the BP and then increases to about 15 MeV beyond the distal edge. Neutrons produced in the nozzle are two orders of magnitude higher for BB with AC than for IMPT in the low energy part of the spectra. The dLET values beyond the distal edge of the BP are 5 times larger for IMPT than for BB with AC. Contrarily, negligible differences are seen in the RBE estimates for IMPT and BB with AC beyond the distal edge of the BP. Conclusion: The analysis of late effects in IMPT with spot scanning, and in double scattering or scanning techniques with AC, may require both dLET and RBE as quantitative parameters to characterize effects beyond the distal edge of the BP.

  20. Shielding NSLS-II light source: Importance of geometry for calculating radiation levels from beam losses

    NASA Astrophysics Data System (ADS)

    Kramer, S. L.; Ghosh, V. J.; Breitfeller, M.; Wahl, W.

    2016-11-01

    Third generation high brightness light sources are designed to have low emittance and high current beams, which contribute to higher beam loss rates that will be compensated by Top-Off injection. Shielding for these higher loss rates will be critical to protect users, whose projected occupancy factors are higher. Top-Off injection requires a full energy injector, which will demand greater consideration of the potential abnormal beam mis-steering and localized losses that could occur. The high energy electron injection beam produces a significantly higher neutron component of the dose on the experimental floor than lower energy beam injection and ramped operation. Minimizing this dose will require adequate knowledge of where the mis-steered beam can occur and sufficient EM shielding close to the loss point, in order to attenuate the energy of the particles in the EM shower below the neutron production threshold (<10 MeV); this spreads the incident energy over the bulk shield walls and thereby reduces the dose penetrating them. Designing supplemental shielding near the loss point using the analytic shielding model is shown to be inadequate because of its lack of geometry specification for the EM shower process. Predicting the dose rates outside the tunnel requires a detailed description of the geometry and materials that the beam losses will encounter inside the tunnel. Modern radiation shielding Monte-Carlo codes, like FLUKA, can handle this geometric description of the radiation transport process in sufficient detail, allowing accurate predictions of the dose rates expected and the ability to expose weaknesses in the design before a high radiation incident occurs. The effort required to adequately define the accelerator geometry for these codes has been greatly reduced by the implementation of FLAIR, the graphical interface to FLUKA. This made the shielding process for NSLS-II accurate and reliable. 
The principles used to provide supplemental shielding to the NSLS-II accelerators and the lessons learned from this process are presented.

  1. Next-generation acceleration and code optimization for light transport in turbid media using GPUs

    PubMed Central

    Alerstam, Erik; Lo, William Chun Yip; Han, Tianyi David; Rose, Jonathan; Andersson-Engels, Stefan; Lilge, Lothar

    2010-01-01

    A highly optimized Monte Carlo (MC) code package for simulating light transport is developed on the latest graphics processing unit (GPU) built for general-purpose computing from NVIDIA, the Fermi GPU. In biomedical optics, the MC method is the gold standard approach for simulating light transport in biological tissue, both due to its accuracy and its flexibility in modelling realistic, heterogeneous tissue geometry in 3-D. However, the widespread use of MC simulations in inverse problems, such as treatment planning for PDT, is limited by their long computation time. Despite its parallel nature, optimizing MC code on the GPU has been shown to be a challenge, particularly when the sharing of simulation result matrices among many parallel threads demands the frequent use of atomic instructions to access the slow GPU global memory. This paper proposes an optimization scheme that utilizes the fast shared memory to resolve the performance bottleneck caused by atomic access, and discusses numerous other optimization techniques needed to harness the full potential of the GPU. Using these techniques, a widely accepted MC code package in biophotonics, called MCML, was successfully accelerated on a Fermi GPU by approximately 600x compared to a state-of-the-art Intel Core i7 CPU. A skin model consisting of 7 layers was used as the standard simulation geometry. To demonstrate the possibility of GPU cluster computing, the same GPU code was executed on four GPUs, showing a linear improvement in performance with an increasing number of GPUs. The GPU-based MCML code package, named GPU-MCML, is compatible with a wide range of graphics cards and is released as open-source software in two versions: an optimized version tuned for high performance and a simplified version for beginners (http://code.google.com/p/gpumcml). PMID:21258498
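
The weight-based photon transport that MCML and GPU-MCML implement can be condensed into a few lines. The sketch below (plain Python with hypothetical names; geometry and scattering directions omitted) shows the two ingredients the atomics discussion above revolves around: partial absorption of a packet's weight at each interaction site, and unbiased Russian-roulette termination of low-weight packets:

```python
import random

def simulate_photon(mu_a, mu_s, w_min=1e-4, m=10):
    """Trace one photon packet in an unbounded homogeneous medium with
    absorption coefficient mu_a and scattering coefficient mu_s.
    At each interaction the fraction mu_a/mu_t of the packet weight is
    deposited (implicit capture); once the weight falls below w_min the
    packet plays Russian roulette with survival probability 1/m.
    Returns the total weight absorbed."""
    mu_t = mu_a + mu_s
    w, absorbed = 1.0, 0.0
    while w > 0.0:
        absorbed += w * mu_a / mu_t     # deposit at the interaction site
        w *= mu_s / mu_t                # scattered (surviving) weight
        if w < w_min:                   # unbiased low-weight termination
            if random.random() < 1.0 / m:
                w *= m                  # survivor carries m-fold weight
            else:
                w = 0.0
    return absorbed

random.seed(0)
mean_absorbed = sum(simulate_photon(0.5, 0.5) for _ in range(5000)) / 5000
```

Because roulette preserves the expected weight, the mean absorbed weight converges to the launched weight of 1.0. In the GPU version, many threads deposit these weight fractions into a shared dose grid, which is where the atomic-access bottleneck discussed above arises.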

  2. Fluence correction factors for graphite calorimetry in a low-energy clinical proton beam: I. Analytical and Monte Carlo simulations.

    PubMed

    Palmans, H; Al-Sulaiti, L; Andreo, P; Shipley, D; Lühr, A; Bassler, N; Martinkovič, J; Dobrovodský, J; Rossomme, S; Thomas, R A S; Kacperek, A

    2013-05-21

    The conversion of absorbed dose-to-graphite in a graphite phantom to absorbed dose-to-water in a water phantom is performed using water-to-graphite stopping power ratios. If, however, the charged particle fluence is not equal at equivalent depths in graphite and water, a fluence correction factor, kfl, is required as well. This is particularly relevant to the derivation of absorbed dose-to-water, the quantity of interest in radiotherapy, from a measurement of absorbed dose-to-graphite obtained with a graphite calorimeter. In this work, fluence correction factors for the conversion from dose-to-graphite in a graphite phantom to dose-to-water in a water phantom for 60 MeV mono-energetic protons were calculated using an analytical model and five different Monte Carlo codes (Geant4, FLUKA, MCNPX, SHIELD-HIT and McPTRAN.MEDIA). In general the fluence correction factors are found to be close to unity and the analytical and Monte Carlo codes give consistent values when considering the differences in secondary particle transport. When considering only protons the fluence correction factors are unity at the surface and increase with depth by 0.5% to 1.5% depending on the code. When the fluence of all charged particles is considered, the fluence correction factor is about 0.5% lower than unity at shallow depths predominantly due to the contributions from alpha particles and increases to values above unity near the Bragg peak. Fluence correction factors directly derived from the fluence distributions differential in energy at equivalent depths in water and graphite can be described by kfl = 0.9964 + 0.0024·zw-eq with a relative standard uncertainty of 0.2%. Fluence correction factors derived from a ratio of calculated doses at equivalent depths in water and graphite can be described by kfl = 0.9947 + 0.0024·zw-eq with a relative standard uncertainty of 0.3%. 
These results are of direct relevance to graphite calorimetry in low-energy protons but given that the fluence correction factor is almost solely influenced by non-elastic nuclear interactions the results are also relevant for plastic phantoms that consist of carbon, oxygen and hydrogen atoms as well as for soft tissues.
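
The two fluence correction fits quoted above are straightforward to encode. The helper below assumes zw-eq is the water-equivalent depth in cm (the abstract does not spell out the unit), and the function names are ours:

```python
def kfl_fluence(z_weq):
    """kfl from the fit to fluence distributions differential in energy:
    kfl = 0.9964 + 0.0024 * z_w-eq (relative standard uncertainty 0.2%)."""
    return 0.9964 + 0.0024 * z_weq

def kfl_dose_ratio(z_weq):
    """kfl from the ratio of calculated doses at equivalent depths:
    kfl = 0.9947 + 0.0024 * z_w-eq (relative standard uncertainty 0.3%)."""
    return 0.9947 + 0.0024 * z_weq
```

Both fits start slightly below unity at the surface and rise linearly with depth, crossing unity before the Bragg peak, consistent with the description in the abstract.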

  3. The Effects of Spatial Diversity and Imperfect Channel Estimation on Wideband MC-DS-CDMA and MC-CDMA

    DTIC Science & Technology

    2009-10-01

    In our previous work, we compared the theoretical bit error rates of multi-carrier direct sequence code division multiple access (MC-DS-CDMA) and...consider only those cases where MC-CDMA has higher frequency diversity than MC-DS-CDMA. Since increases in diversity yield diminishing gains, we conclude

  4. Accelerated event-by-event Monte Carlo microdosimetric calculations of electrons and protons tracks on a multi-core CPU and a CUDA-enabled GPU.

    PubMed

    Kalantzis, Georgios; Tachibana, Hidenobu

    2014-01-01

    For microdosimetric calculations, event-by-event Monte Carlo (MC) methods are considered the most accurate. The main shortcoming of these methods is their extensive computation time. In this work we present an event-by-event MC code for low projectile energy electron and proton tracks, enabling accelerated microdosimetric MC simulations on a graphics processing unit (GPU). Additionally, a hybrid implementation scheme was realized by employing OpenMP and CUDA in such a way that both the GPU and the multi-core CPU were utilized simultaneously. The two implementation schemes were tested and compared with the sequential single-threaded MC code on the CPU. Performance comparison was established on the speed-up for a set of benchmarking cases of electron and proton tracks. A maximum speedup of 67.2 was achieved for the GPU-based MC code, and the hybrid approach improved the speedup by up to a further 20%. The results indicate the capability of our CPU-GPU implementation for accelerated MC microdosimetric calculations of both electron and proton tracks without loss of accuracy. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  5. Optimisation of 12 MeV electron beam simulation using variance reduction technique

    NASA Astrophysics Data System (ADS)

    Jayamani, J.; Termizi, N. A. S. Mohd; Kamarulzaman, F. N. Mohd; Aziz, M. Z. Abdul

    2017-05-01

    Monte Carlo (MC) simulation for electron beam radiotherapy consumes long computation times. A variance reduction technique (VRT) was implemented in the MC to speed it up. This work focused on optimisation of the VRT parameters, namely electron range rejection and the number of particle histories. The EGSnrc MC source code was used to simulate (BEAMnrc code) and validate (DOSXYZnrc code) the Siemens Primus linear accelerator model without VRT. The validated MC model simulation was then repeated applying electron range rejection, controlled by a global electron cut-off energy of 1, 2 and 5 MeV, using 20 × 10⁷ particle histories. The 5 MeV range rejection generated the fastest MC simulation, with a 50% reduction in computation time compared to the non-VRT simulation. Thus, 5 MeV electron range rejection was used in the particle-history analysis, which ranged from 7.5 × 10⁷ to 20 × 10⁷ histories. With the 5 MeV electron cut-off and 10 × 10⁷ particle histories, the simulation was four times faster than the non-VRT calculation with only 1% deviation. Proper understanding and use of VRT can significantly reduce the duration of MC electron beam calculations while preserving their accuracy.
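
The range-rejection idea used above reduces to a simple predicate applied at each electron step: if the electron cannot escape its current region before running out of energy, it is terminated and its energy deposited locally. A minimal sketch (hypothetical names; EGSnrc's actual implementation differs in detail):

```python
def range_reject(e_kin_mev, residual_range_cm, dist_to_boundary_cm,
                 e_cutoff_mev):
    """Return True if the electron should be terminated on the spot:
    its kinetic energy is below the global range-rejection cut-off AND
    its residual CSDA range is shorter than the distance to the nearest
    region boundary, so it cannot carry dose anywhere else."""
    return (e_kin_mev < e_cutoff_mev
            and residual_range_cm < dist_to_boundary_cm)
```

A higher cut-off (5 MeV above) rejects more electrons and hence runs faster, at the price of a small, controlled dose error, which is the 1% deviation quoted in the abstract.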

  6. Coupled Neutron Transport for HZETRN

    NASA Technical Reports Server (NTRS)

    Slaba, Tony C.; Blattnig, Steve R.

    2009-01-01

    Exposure estimates inside space vehicles, surface habitats, and high-altitude aircraft exposed to space radiation are highly influenced by secondary neutron production. The deterministic transport code HZETRN has been identified as a reliable and efficient tool for such studies, but improvements to the underlying transport models and numerical methods are still necessary. In this paper, the forward-backward (FB) and directionally coupled forward-backward (DC) neutron transport models are derived, numerical methods for the FB model are reviewed, and a computationally efficient numerical solution is presented for the DC model. Both models are compared to the Monte Carlo codes HETC-HEDS, FLUKA, and MCNPX, and the DC model is shown to agree closely with the Monte Carlo results. Finally, it is found in the development of either model that the decoupling of low energy neutrons from the light particle transport procedure adversely affects low energy light ion fluence spectra and exposure quantities. A first order correction is presented to resolve the problem, and it is shown to be both accurate and efficient.

  7. Orthogonal Multi-Carrier DS-CDMA with Frequency-Domain Equalization

    NASA Astrophysics Data System (ADS)

    Tanaka, Ken; Tomeba, Hiromichi; Adachi, Fumiyuki

    Orthogonal multi-carrier direct sequence code division multiple access (orthogonal MC DS-CDMA) is a combination of orthogonal frequency division multiplexing (OFDM) and time-domain spreading, while multi-carrier code division multiple access (MC-CDMA) is a combination of OFDM and frequency-domain spreading. In MC-CDMA, a good bit error rate (BER) performance can be achieved by using frequency-domain equalization (FDE), since the frequency diversity gain is obtained. On the other hand, the conventional orthogonal MC DS-CDMA fails to achieve any frequency diversity gain. In this paper, we propose a new orthogonal MC DS-CDMA that can obtain the frequency diversity gain by applying FDE. The conditional BER analysis is presented. The theoretical average BER performance in a frequency-selective Rayleigh fading channel is evaluated by the Monte-Carlo numerical computation method using the derived conditional BER and is confirmed by computer simulation of the orthogonal MC DS-CDMA signal transmission.
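
The evaluation strategy described above, averaging a derived conditional BER over Monte-Carlo channel realizations, can be illustrated on a much simpler system. The sketch below applies it to plain BPSK over flat Rayleigh fading (not the orthogonal MC DS-CDMA expression derived in the paper; all names are ours), where a closed form exists to check against:

```python
import math
import random

def q_func(x):
    """Gaussian tail probability Q(x)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def avg_ber_rayleigh(snr, n=200_000, seed=42):
    """Average BPSK BER over flat Rayleigh fading: draw the channel power
    gain g = |h|^2 ~ Exp(1), evaluate the conditional BER Q(sqrt(2*snr*g)),
    and average over n Monte-Carlo realizations."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        g = -math.log(1.0 - rng.random())       # Exp(1) channel power gain
        total += q_func(math.sqrt(2.0 * snr * g))
    return total / n

snr = 10.0                                       # average SNR (10 dB)
ber_mc = avg_ber_rayleigh(snr)
ber_exact = 0.5 * (1.0 - math.sqrt(snr / (1.0 + snr)))   # closed form
```

Averaging the conditional BER rather than counting individual bit errors keeps the estimator variance low, which is why theoretical curves can be confirmed this way with modest sample counts.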

  8. Major Breeding Plumage Color Differences of Male Ruffs (Philomachus pugnax) Are Not Associated With Coding Sequence Variation in the MC1R Gene

    PubMed Central

    Küpper, Clemens; Burke, Terry; Lank, David B.

    2015-01-01

    Sequence variation in the melanocortin-1 receptor (MC1R) gene explains color morph variation in several species of birds and mammals. Ruffs (Philomachus pugnax) exhibit major dark/light color differences in melanin-based male breeding plumage, which are closely associated with alternative reproductive behavior. A previous study identified a microsatellite marker (Ppu020) near the MC1R locus associated with the presence/absence of ornamental plumage. We investigated whether coding sequence variation in the MC1R gene explains major dark/light plumage color variation and/or the presence/absence of ornamental plumage in ruffs. Among 821 bp of the MC1R coding region from 44 male ruffs we found 3 single nucleotide polymorphisms, representing 1 nonsynonymous and 2 synonymous substitutions. None were associated with major dark/light color differences or the presence/absence of ornamental plumage. At all amino acid sites known to be functionally important in other avian species with dark/light plumage color variation, ruffs were either monomorphic or the shared polymorphism did not coincide with color morph. Neither ornamental plumage color differences nor the presence/absence of ornamental plumage in ruffs are likely to be caused entirely by amino acid variation within the coding regions of the MC1R locus. Regulatory elements and structural variation at other loci may be involved in melanin expression and contribute to the extreme plumage polymorphism observed in this species. PMID:25534935

  9. Double differential neutron spectra generated by the interaction of a 12 MeV/nucleon 36S beam on a thick natCu target

    NASA Astrophysics Data System (ADS)

    Trinh, N. D.; Fadil, M.; Lewitowicz, M.; Ledoux, X.; Laurent, B.; Thomas, J.-C.; Clerc, T.; Desmezières, V.; Dupuis, M.; Madeline, A.; Dessay, E.; Grinyer, G. F.; Grinyer, J.; Menard, N.; Porée, F.; Achouri, L.; Delaunay, F.; Parlog, M.

    2018-07-01

    Double differential neutron spectra (energy, angle) originating from a thick natCu target bombarded by a 12 MeV/nucleon 36S16+ beam were measured by the activation method and the Time-of-flight technique at the Grand Accélérateur National d'Ions Lourds (GANIL). A neutron spectrum unfolding algorithm combining the SAND-II iterative method and Monte-Carlo techniques was developed for the analysis of the activation results that cover a wide range of neutron energies. It was implemented into a graphical user interface program, called GanUnfold. The experimental neutron spectra are compared to Monte-Carlo simulations performed using the PHITS and FLUKA codes.
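
The unfolding step pairs an iterative multiplicative update with Monte-Carlo perturbations. The exact SAND-II weighting is not reproduced in the abstract, so the sketch below uses an MLEM-type multiplicative update in the same spirit (rescale each energy-group flux by a response-weighted measured-to-predicted ratio); all names and the toy response matrix are ours:

```python
def unfold(R, activities, phi0, n_iter=2000):
    """Iteratively rescale the group fluxes phi so that the predicted
    reaction rates R @ phi approach the measured activities (an MLEM-type
    multiplicative update, simplified relative to SAND-II)."""
    m, n = len(R), len(R[0])
    phi = list(phi0)
    for _ in range(n_iter):
        pred = [sum(R[i][j] * phi[j] for j in range(n)) for i in range(m)]
        for j in range(n):
            num = sum(R[i][j] * activities[i] / pred[i] for i in range(m))
            den = sum(R[i][j] for i in range(m))
            phi[j] *= num / den
    return phi

# toy problem: 3 activation reactions x 3 energy groups, consistent data
R = [[1.0, 0.2, 0.0],
     [0.1, 1.0, 0.3],
     [0.0, 0.2, 1.0]]
phi_true = [2.0, 1.0, 3.0]
A = [sum(R[i][j] * phi_true[j] for j in range(3)) for i in range(3)]
phi_est = unfold(R, A, [1.0, 1.0, 1.0])
pred = [sum(R[i][j] * phi_est[j] for j in range(3)) for i in range(3)]
```

Convergence is judged, as in real unfolding codes, by how well the re-predicted activities match the measurements; the multiplicative form also guarantees the fluxes stay nonnegative.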

  10. Interactive three-dimensional visualization and creation of geometries for Monte Carlo calculations

    NASA Astrophysics Data System (ADS)

    Theis, C.; Buchegger, K. H.; Brugger, M.; Forkel-Wirth, D.; Roesler, S.; Vincke, H.

    2006-06-01

    The implementation of three-dimensional geometries for the simulation of radiation transport problems is a very time-consuming task. Each particle transport code supplies its own scripting language and syntax for creating the geometries. All of them are based on the Constructive Solid Geometry scheme requiring textual description. This makes the creation a tedious and error-prone task, which is especially hard to master for novice users. The Monte Carlo code FLUKA comes with built-in support for creating two-dimensional cross-sections through the geometry and FLUKACAD, a custom-built converter to the commercial Computer Aided Design package AutoCAD, exists for 3D visualization. For other codes, like MCNPX, a couple of different tools are available, but they are often specifically tailored to the particle transport code and its approach used for implementing geometries. Complex constructive solid modeling usually requires very fast and expensive special purpose hardware, which is not widely available. In this paper SimpleGeo is presented, which is an implementation of a generic versatile interactive geometry modeler using off-the-shelf hardware. It is running on Windows, with a Linux version currently under preparation. This paper describes its functionality, which allows for rapid interactive visualization as well as generation of three-dimensional geometries, and also discusses critical issues regarding common CAD systems.
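
The Constructive Solid Geometry scheme that all these input languages share reduces, at its core, to boolean point-membership tests on primitive bodies. A minimal sketch (our own names; real codes add many more primitives, transformations and ray tracing on top):

```python
class Sphere:
    """Solid sphere primitive defined by centre and radius."""
    def __init__(self, cx, cy, cz, r):
        self.c, self.r = (cx, cy, cz), r
    def contains(self, p):
        return sum((a - b) ** 2 for a, b in zip(p, self.c)) <= self.r ** 2

class Difference:
    """CSG difference A - B: a point is inside if it is in A but not B."""
    def __init__(self, a, b):
        self.a, self.b = a, b
    def contains(self, p):
        return self.a.contains(p) and not self.b.contains(p)

# a spherical shell region built from two primitive bodies
shell = Difference(Sphere(0, 0, 0, 2.0), Sphere(0, 0, 0, 1.0))
```

Union and intersection are the analogous `or`/`and` combinations; an interactive modeler like SimpleGeo evaluates exactly such boolean trees to render and debug the geometry that would otherwise be written textually.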

  11. Use of Existing CAD Models for Radiation Shielding Analysis

    NASA Technical Reports Server (NTRS)

    Lee, K. T.; Barzilla, J. E.; Wilson, P.; Davis, A.; Zachman, J.

    2015-01-01

    The utility of a radiation exposure analysis depends not only on the accuracy of the underlying particle transport code, but also on the accuracy of the geometric representations of both the vehicle used as radiation shielding mass and the phantom representation of the human form. The current NASA/Space Radiation Analysis Group (SRAG) process to determine crew radiation exposure in a vehicle design incorporates both output from an analytic High Z and Energy Particle Transport (HZETRN) code and the properties (i.e., material thicknesses) of a previously processed drawing. This geometry pre-process can be time-consuming, and the results are less accurate than those determined using a Monte Carlo-based particle transport code. The current work aims to improve this process. Although several Monte Carlo programs (FLUKA, Geant4) are readily available, most use an internal geometry engine. The lack of an interface with the standard CAD formats used by the vehicle designers limits the ability of the user to communicate complex geometries. Translation of native CAD drawings into a format readable by these transport programs is time-consuming and prone to error. The Direct Accelerated Geometry-United (DAGU) project is intended to provide an interface between the native vehicle or phantom CAD geometry and multiple particle transport codes to minimize problem setup, computing time and analysis error.

  12. SU-E-T-493: Accelerated Monte Carlo Methods for Photon Dosimetry Using a Dual-GPU System and CUDA.

    PubMed

    Liu, T; Ding, A; Xu, X

    2012-06-01

    To develop a Graphics Processing Unit (GPU) based Monte Carlo (MC) code that accelerates dose calculations on a dual-GPU system. We simulated a clinical case of prostate cancer treatment. A voxelized abdomen phantom derived from 120 CT slices was used, containing 218×126×60 voxels, and a GE LightSpeed 16-MDCT scanner was modeled. A CPU version of the MC code was first developed in C++ and tested on an Intel Xeon X5660 2.8 GHz CPU; it was then translated into a GPU version using CUDA C 4.1 and run on a dual Tesla M2090 GPU system. The code featured automatic assignment of the simulation task to multiple GPUs, as well as accurate calculation of energy- and material-dependent cross-sections. Double-precision floating point format was used for accuracy. Doses to the rectum, prostate, bladder and femoral heads were calculated. When running on a single GPU, the GPU MC code was found to be 19 times faster than the CPU code and 42 times faster than MCNPX. These speedup factors were doubled on the dual-GPU system. The dose results were benchmarked against MCNPX, and a maximum difference of 1% was observed when the relative error was kept below 0.1%. A GPU-based MC code was developed for dose calculations using detailed patient and CT scanner models. Efficiency and accuracy were both guaranteed in this code. Scalability of the code was confirmed on the dual-GPU system. © 2012 American Association of Physicists in Medicine.
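
The automatic assignment of the simulation task to multiple GPUs amounts, in essence, to partitioning the particle histories across devices and summing the per-device tallies. A minimal sketch of such a partition (our own helper, not the paper's code), weighting each device by its relative speed:

```python
def split_histories(n_total, device_speeds):
    """Partition n_total histories across devices proportionally to their
    relative speeds, assigning any integer remainder to the first device
    so that every history is simulated exactly once."""
    total_speed = sum(device_speeds)
    shares = [int(n_total * s / total_speed) for s in device_speeds]
    shares[0] += n_total - sum(shares)
    return shares

# two identical GPUs, as in the dual-GPU system described above
shares = split_histories(1_000_000, [1.0, 1.0])
```

With two identical cards each device receives half the histories, consistent with the doubling of the speedup factors reported above.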

  13. Monte Carlo simulation tool for online treatment monitoring in hadrontherapy with in-beam PET: A patient study.

    PubMed

    Fiorina, E; Ferrero, V; Pennazio, F; Baroni, G; Battistoni, G; Belcari, N; Cerello, P; Camarlinghi, N; Ciocca, M; Del Guerra, A; Donetti, M; Ferrari, A; Giordanengo, S; Giraudo, G; Mairani, A; Morrocchi, M; Peroni, C; Rivetti, A; Da Rocha Rolo, M D; Rossi, S; Rosso, V; Sala, P; Sportelli, G; Tampellini, S; Valvo, F; Wheadon, R; Bisogni, M G

    2018-05-07

    Hadrontherapy is a method for treating cancer with very targeted dose distributions and enhanced radiobiological effects. To fully exploit these advantages, in vivo range monitoring systems are required. These devices measure, preferably during the treatment, the secondary radiation generated by the beam-tissue interactions. However, since correlation of the secondary radiation distribution with the dose is not straightforward, Monte Carlo (MC) simulations are very important for treatment quality assessment. The INSIDE project constructed an in-beam PET scanner to detect signals generated by the positron-emitting isotopes resulting from projectile-target fragmentation. In addition, a FLUKA-based simulation tool was developed to predict the corresponding reference PET images using a detailed scanner model. The INSIDE in-beam PET was used to monitor two consecutive proton treatment sessions on a patient at the Italian Center for Oncological Hadrontherapy (CNAO). The reconstructed PET images were updated every 10 s providing a near real-time quality assessment. By half-way through the treatment, the statistics of the measured PET images were already significant enough to be compared with the simulations with average differences in the activity range less than 2.5 mm along the beam direction. Without taking into account any preferential direction, differences within 1 mm were found. In this paper, the INSIDE MC simulation tool is described and the results of the first in vivo agreement evaluation are reported. These results have justified a clinical trial, in which the MC simulation tool will be used on a daily basis to study the compliance tolerances between the measured and simulated PET images. Copyright © 2018 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  14. McSKY: A hybrid Monte-Carlo line-beam code for shielded gamma skyshine calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shultis, J.K.; Faw, R.E.; Stedry, M.H.

    1994-07-01

    McSKY evaluates skyshine dose from an isotropic, monoenergetic, point photon source collimated into either a vertical cone or a vertical structure with an N-sided polygon cross section. The code assumes an overhead shield of two materials, though the user can specify zero shield thickness for an unshielded calculation. The code uses a Monte-Carlo algorithm to evaluate transport through source shields and the integral line-beam source to describe photon transport through the atmosphere. The source energy must be between 0.02 and 100 MeV. For heavily shielded sources with energies above 20 MeV, McSKY results must be used cautiously, especially at detector locations near the source.

  15. Sci-Sat AM: Radiation Dosimetry and Practical Therapy Solutions - 05: Not all geometries are equivalent for magnetic field Fano cavity tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malkov, Victor N.; Rogers, David W.O.

    The coupling of MRI and radiation treatment systems for magnetic resonance guided radiation therapy necessitates a reliable magnetic-field-capable Monte Carlo (MC) code. In addition to the influence of the magnetic field on dose distributions, the question of proper calibration has arisen due to the several percent variation of ion chamber and solid state detector responses in magnetic fields when compared to the 0 T case (Reynolds et al., Med Phys, 2013). In the absence of a magnetic field, EGSnrc has been shown to pass the Fano cavity test (a rigorous benchmarking tool for MC codes) at the 0.1% level (Kawrakow, Med. Phys., 2000), and similar results should be required of magnetic-field-capable MC algorithms. To properly test such developing MC codes, the Fano cavity theorem has been adapted to function in a magnetic field (Bouchard et al., PMB, 2015). In this work, the Fano cavity test is applied in slab and ion-chamber-like geometries to test the transport options of a magnetic field algorithm implemented in EGSnrc. Results show that the deviation of the MC dose from the expected Fano cavity theory value is highly sensitive to the choice of geometry, and the ion chamber geometry appears to pass the test more easily than larger slab geometries. As magnetic field MC codes begin to be used for dose simulations and correction factor calculations, care must be taken to apply the most rigorous Fano test geometries to ensure the reliability of such algorithms.

  16. Instrument intercomparison in the high-energy mixed field at the CERN-EU reference field (CERF) facility.

    PubMed

    Caresana, Marco; Helmecke, Manuela; Kubancak, Jan; Manessi, Giacomo Paolo; Ott, Klaus; Scherpelz, Robert; Silari, Marco

    2014-10-01

    This paper discusses an intercomparison campaign performed in the mixed radiation field at the CERN-EU reference field (CERF) facility. Various instruments were employed: conventional and extended-range rem counters, including a novel instrument called LUPIN, a bubble detector using an active counting system (ABC 1260) and two tissue-equivalent proportional counters (TEPCs). The results show that the extended-range instruments agree well within their uncertainties and within 1σ with the H*(10) FLUKA value. The conventional rem counters are in good agreement within their uncertainties but underestimate H*(10) as measured by the extended-range instruments and as predicted by FLUKA. The TEPCs slightly overestimate the FLUKA value but remain consistent with it when the comparatively large total uncertainties are taken into account, and indicate that the non-neutron part of the stray field accounts for ∼30% of the total H*(10). © The Author 2013. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  17. MC3, Version 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cawkwell, Marc Jon

    2016-09-09

    The MC3 code is used to perform Monte Carlo simulations in the isothermal-isobaric ensemble (constant number of particles, temperature, and pressure) on molecular crystals. The molecules within the periodic simulation cell are treated as rigid bodies, alleviating the requirement for a complex interatomic potential. Intermolecular interactions are described using generic, atom-centered pair potentials whose parameterization is taken from the literature [D. E. Williams, J. Comput. Chem., 22, 1154 (2001)] and electrostatic interactions arising from atom-centered, fixed, point partial charges. The primary uses of the MC3 code are the computation of i) the temperature and pressure dependence of lattice parameters and thermal expansion coefficients, ii) tensors of elastic constants and compliances via Parrinello and Rahman's fluctuation formula [M. Parrinello and A. Rahman, J. Chem. Phys., 76, 2662 (1982)], and iii) the investigation of polymorphic phase transformations. The MC3 code is written in Fortran90 and requires LAPACK and BLAS linear algebra libraries to be linked during compilation. Computationally expensive loops are accelerated using OpenMP.
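
The heart of an isothermal-isobaric (NPT) Monte Carlo simulation such as MC3's is the Metropolis acceptance rule for trial volume changes. A sketch for a uniform trial change in V (standard textbook form; the function name and the uniform-in-V convention are our assumptions, not necessarily MC3's):

```python
import math
import random

def accept_volume_move(delta_u, pressure, v_old, v_new, n_molecules, kt,
                       rng=random.random):
    """Metropolis NPT acceptance for a volume move V_old -> V_new with a
    uniform trial distribution in V: the Boltzmann argument combines the
    energy change, the P*dV work term, and the N*ln(V_new/V_old) Jacobian
    from rescaling the molecular centre-of-mass positions."""
    arg = (-(delta_u + pressure * (v_new - v_old)) / kt
           + n_molecules * math.log(v_new / v_old))
    return rng() < math.exp(min(arg, 0.0))   # min(...) avoids overflow
```

Because MC3 treats molecules as rigid bodies, N here counts molecules (whose centres are rescaled), not atoms; sampling in ln V instead of V would change the Jacobian term to (N+1) ln(V_new/V_old).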

  18. Assessing 1D Atmospheric Solar Radiative Transfer Models: Interpretation and Handling of Unresolved Clouds.

    NASA Astrophysics Data System (ADS)

    Barker, H. W.; Stephens, G. L.; Partain, P. T.; Bergman, J. W.; Bonnel, B.; Campana, K.; Clothiaux, E. E.; Clough, S.; Cusack, S.; Delamere, J.; Edwards, J.; Evans, K. F.; Fouquart, Y.; Freidenreich, S.; Galin, V.; Hou, Y.; Kato, S.; Li, J.;  Mlawer, E.;  Morcrette, J.-J.;  O'Hirok, W.;  Räisänen, P.;  Ramaswamy, V.;  Ritter, B.;  Rozanov, E.;  Schlesinger, M.;  Shibata, K.;  Sporyshev, P.;  Sun, Z.;  Wendisch, M.;  Wood, N.;  Yang, F.

    2003-08-01

    The primary purpose of this study is to assess the performance of 1D solar radiative transfer codes that are used currently both for research and in weather and climate models. Emphasis is on interpretation and handling of unresolved clouds. Answers are sought to the following questions: (i) How well do 1D solar codes interpret and handle columns of information pertaining to partly cloudy atmospheres? (ii) Regardless of the adequacy of their assumptions about unresolved clouds, do 1D solar codes perform as intended? One clear-sky and two plane-parallel, homogeneous (PPH) overcast cloud cases serve to elucidate 1D model differences due to varying treatments of gaseous transmittances, cloud optical properties, and basic radiative transfer. The remaining four cases involve 3D distributions of cloud water and water vapor as simulated by cloud-resolving models. Results for 25 1D codes, which included two line-by-line (LBL) models (clear and overcast only) and four 3D Monte Carlo (MC) photon transport algorithms, were submitted by 22 groups. Benchmark, domain-averaged irradiance profiles were computed by the MC codes. For the clear and overcast cases, all MC estimates of top-of-atmosphere albedo, atmospheric absorptance, and surface absorptance agree with one of the LBL codes to within ±2%. Most 1D codes underestimate atmospheric absorptance by typically 15-25 W m-2 at overhead sun for the standard tropical atmosphere regardless of clouds. Depending on assumptions about unresolved clouds, the 1D codes were partitioned into four genres: (i) horizontal variability, (ii) exact overlap of PPH clouds, (iii) maximum/random overlap of PPH clouds, and (iv) random overlap of PPH clouds. A single MC code was used to establish conditional benchmarks applicable to each genre, and all MC codes were used to establish the full 3D benchmarks. 
There is a tendency for 1D codes to cluster near their respective conditional benchmarks, though intragenre variances typically exceed those for the clear and overcast cases. The majority of 1D codes fall into the extreme category of maximum/random overlap of PPH clouds and thus generally disagree with the full 3D benchmark values. The fairly limited scope of these tests, together with the inability of any one code to perform well in all cases, suggests that a paradigm shift is due in the modeling of 1D solar fluxes for cloudy atmospheres.
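As an illustrative aside, the kind of domain-averaged MC benchmark described above can be sketched with a minimal 1D photon transport model. The following toy sketch (not any of the benchmarked codes; it assumes a single homogeneous layer with isotropic scattering) tallies albedo, transmittance, and absorptance for photons injected at the top of a cloud layer:

```python
import math
import random

def mc_layer(tau_star, omega0, mu0=1.0, n_photons=20000, seed=1):
    """Monte Carlo estimate of albedo R, transmittance T and absorptance A
    for a homogeneous, isotropically scattering layer of optical depth
    tau_star, illuminated from above at incidence cosine mu0."""
    rng = random.Random(seed)
    n_r = n_t = n_a = 0
    for _ in range(n_photons):
        tau, mu = 0.0, mu0                 # tau grows downward; mu > 0 is downward
        while True:
            # sample optical path length to the next interaction;
            # 1 - random() lies in (0, 1], so the log is always defined
            tau += -math.log(1.0 - rng.random()) * mu
            if tau < 0.0:
                n_r += 1                   # escaped through the top: reflected
                break
            if tau > tau_star:
                n_t += 1                   # escaped through the bottom: transmitted
                break
            if rng.random() < omega0:
                mu = 2.0 * rng.random() - 1.0   # isotropic scatter: new direction
            else:
                n_a += 1                   # absorbed in the layer
                break
    return n_r / n_photons, n_t / n_photons, n_a / n_photons
```

With a conservative layer (single-scattering albedo of 1) no photon is ever absorbed, so the absorptance tally is exactly zero and albedo plus transmittance sum to one, which is the sort of closure check the LBL/MC comparisons above rely on.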

  19. Application of the MCNPX-McStas interface for shielding calculations and guide design at ESS

    NASA Astrophysics Data System (ADS)

    Klinkby, E. B.; Knudsen, E. B.; Willendrup, P. K.; Lauritzen, B.; Nonbøl, E.; Bentley, P.; Filges, U.

    2014-07-01

    Recently, an interface between the Monte Carlo code MCNPX and the neutron ray-tracing code McStas was developed [1, 2]. Based on the expected neutronic performance and guide geometries relevant for the ESS, the combined MCNPX-McStas code is used to calculate dose rates along neutron beam guides. The generation and moderation of neutrons is simulated using a full-scale MCNPX model of the ESS target monolith. Upon entering the neutron beam extraction region, the individual neutron states are handed to McStas via the MCNPX-McStas interface. McStas transports the neutrons through the beam guide and, using a newly developed event-logging capability, records the neutron state parameters corresponding to un-reflected neutrons at each scattering. This information is handed back to MCNPX, where it serves as the neutron source input for a second MCNPX simulation, which enables calculation of dose rates in the vicinity of the guide. In addition, the logging mechanism is employed to record the scatterings along the guides, which is exploited to estimate the supermirror quality (i.e., m-values) required at different positions along the beam guide to transport neutrons in the same guide/source setup.

  20. Stress induced gene expression drives transient DNA methylation changes at adjacent repetitive elements.

    PubMed

    Secco, David; Wang, Chuang; Shou, Huixia; Schultz, Matthew D; Chiarenza, Serge; Nussaume, Laurent; Ecker, Joseph R; Whelan, James; Lister, Ryan

    2015-07-21

    Cytosine DNA methylation (mC) is a genome modification that can regulate the expression of coding and non-coding genetic elements. However, little is known about the involvement of mC in response to environmental cues. Using whole genome bisulfite sequencing to assess the spatio-temporal dynamics of mC in rice grown under phosphate starvation and recovery conditions, we identified widespread phosphate starvation-induced changes in mC, preferentially localized in transposable elements (TEs) close to highly induced genes. These changes in mC occurred after changes in nearby gene transcription, were mostly DCL3a-independent, and could partially be propagated through mitosis, however no evidence of meiotic transmission was observed. Similar analyses performed in Arabidopsis revealed a very limited effect of phosphate starvation on mC, suggesting a species-specific mechanism. Overall, this suggests that TEs in proximity to environmentally induced genes are silenced via hypermethylation, and establishes the temporal hierarchy of transcriptional and epigenomic changes in response to stress.

  1. Validation of a GPU-based Monte Carlo code (gPMC) for proton radiation therapy: clinical cases study.

    PubMed

    Giantsoudi, Drosoula; Schuemann, Jan; Jia, Xun; Dowdell, Stephen; Jiang, Steve; Paganetti, Harald

    2015-03-21

    Monte Carlo (MC) methods are recognized as the gold standard for dose calculation; however, they have not yet replaced analytical methods due to their lengthy calculation times. GPU-based applications allow MC dose calculations to be performed on time scales comparable to conventional analytical algorithms. This study focuses on validating our GPU-based MC code for proton dose calculation (gPMC) using an experimentally validated multi-purpose MC code (TOPAS) and comparing their performance for clinical patient cases. Clinical cases from five treatment sites were selected covering the full range from very homogeneous patient geometries (liver) to patients with high geometrical complexity (air cavities and density heterogeneities in head-and-neck and lung patients) and from short beam range (breast) to large beam range (prostate). Both gPMC and TOPAS were used to calculate 3D dose distributions for all patients. Comparisons were performed based on target coverage indices (mean dose, V95, D98, D50, D02) and gamma index distributions. Dosimetric indices differed less than 2% between TOPAS and gPMC dose distributions for most cases. Gamma index analysis with 1%/1 mm criterion resulted in a passing rate of more than 94% of all patient voxels receiving more than 10% of the mean target dose, for all patients except the prostate cases. Although clinically insignificant, gPMC resulted in a systematic underestimation of target dose for prostate cases by 1-2% compared to TOPAS. Correspondingly, the gamma index analysis with 1%/1 mm criterion failed for most beams for this site, while for the 2%/1 mm criterion passing rates of more than 94.6% of all patient voxels were observed. For the same initial number of simulated particles, calculation time for a single beam for a typical head and neck patient plan decreased from 4 CPU hours per million particles (2.8-2.9 GHz Intel X5600) for TOPAS to 2.4 s per million particles (NVIDIA TESLA C2075) for gPMC. 
Excellent agreement was demonstrated between our fast GPU-based MC code (gPMC) and a previously extensively validated multi-purpose MC code (TOPAS) for a comprehensive set of clinical patient cases. This shows that MC dose calculations in proton therapy can be performed on time scales comparable to analytical algorithms with accuracy comparable to state-of-the-art CPU-based MC codes.
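The gamma index comparison used above combines a dose-difference criterion with a distance-to-agreement criterion. A minimal 1D global gamma pass-rate routine (a hypothetical helper for illustration, not the gPMC/TOPAS implementation, with the dose tolerance taken relative to the reference maximum) might look like:

```python
def gamma_pass_rate(ref, test, spacing_mm, dose_crit=0.01, dist_mm=1.0,
                    low_dose_cut=0.10):
    """Global 1D gamma analysis (e.g. 1%/1 mm) between a reference and a
    test dose profile sampled on the same uniform grid (spacing in mm)."""
    d_max = max(ref)
    dose_tol = dose_crit * d_max          # global dose criterion
    passed = total = 0
    for i, dt in enumerate(test):
        if dt < low_dose_cut * d_max:
            continue                      # exclude the low-dose region
        total += 1
        g2_min = float("inf")
        for j, dr in enumerate(ref):
            dist = (i - j) * spacing_mm
            # squared gamma; gamma <= 1 is equivalent to gamma^2 <= 1
            g2 = (dist / dist_mm) ** 2 + ((dt - dr) / dose_tol) ** 2
            g2_min = min(g2_min, g2)
        if g2_min <= 1.0:
            passed += 1
    return passed / total if total else 1.0
```

Identical profiles pass everywhere, while a uniform 20% overdose fails points that cannot be rescued by the distance search, which is the behaviour the passing-rate figures above summarize.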

  2. Towards a Framework for Generating Tests to Satisfy Complex Code Coverage in Java Pathfinder

    NASA Technical Reports Server (NTRS)

    Staats, Matt

    2009-01-01

    We present work on a prototype tool based on the Java Pathfinder (JPF) model checker for automatically generating tests satisfying the MC/DC code coverage criterion. Using the Eclipse IDE, developers and testers can quickly instrument Java source code with JPF annotations covering all MC/DC coverage obligations, and JPF can then be used to automatically generate tests that satisfy these obligations. The prototype extension to JPF enables various tasks useful in automatic test generation, such as test suite reduction and execution of generated tests.
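MC/DC requires that every condition in a decision be shown to independently affect the decision's outcome: for each condition there must be a pair of tests that differ only in that condition and produce different outcomes. A small illustration (in Python rather than Java, for brevity) of a minimal MC/DC test set for the decision `a and (b or c)`, together with a checker for the independence pairs:

```python
from itertools import combinations

def decision(a, b, c):
    return a and (b or c)

# A minimal MC/DC test set: n + 1 tests for n = 3 conditions.
tests = [
    (True,  True,  False),   # decision -> True
    (False, True,  False),   # flips only 'a'            -> False
    (True,  False, False),   # flips only 'b' (vs row 0) -> False
    (True,  False, True),    # flips only 'c' (vs row 2) -> True
]

def mcdc_covered(test_set, fn, n_conditions):
    """True iff every condition has an independence pair in test_set:
    two tests differing only in that condition, with different outcomes."""
    shown = set()
    for t1, t2 in combinations(test_set, 2):
        diff = [k for k in range(n_conditions) if t1[k] != t2[k]]
        if len(diff) == 1 and fn(*t1) != fn(*t2):
            shown.add(diff[0])
    return shown == set(range(n_conditions))
```

Dropping any one of the four tests leaves at least one condition without an independence pair, which is exactly the kind of coverage obligation the JPF annotations encode.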

  3. Monte Carlo calculation of the radiation field at aircraft altitudes.

    PubMed

    Roesler, S; Heinrich, W; Schraube, H

    2002-01-01

    Energy spectra of secondary cosmic rays are calculated for aircraft altitudes and a discrete set of solar modulation parameters and rigidity cut-off values covering all possible conditions. The calculations are based on the Monte Carlo code FLUKA and on the most recent information on the interstellar cosmic ray flux including a detailed model of solar modulation. Results are compared to a large variety of experimental data obtained on the ground and aboard aircraft and balloons, such as neutron, proton, and muon spectra and yields of charged particles. Furthermore, particle fluence is converted into ambient dose equivalent and effective dose and the dependence of these quantities on height above sea level, solar modulation, and geographical location is studied. Finally, calculated dose equivalent is compared to results of comprehensive measurements performed aboard aircraft.
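The fluence-to-dose conversion mentioned above is, in essence, a weighted sum of the particle fluence spectrum with energy-dependent conversion coefficients over the energy bins. A schematic sketch with placeholder numbers (the grid, fluences, and coefficients below are illustrative, not ICRP values):

```python
# Hypothetical neutron energy grid (MeV), fluence spectrum (cm^-2 MeV^-1)
# and fluence-to-ambient-dose-equivalent coefficients (pSv cm^2).
energies = [1.0, 10.0, 100.0]    # bin centres
widths   = [1.0, 10.0, 100.0]    # bin widths (MeV)
fluence  = [50.0, 5.0, 0.2]      # differential fluence per bin
h_coeff  = [400.0, 450.0, 300.0] # conversion coefficient per bin

# H*(10) ~ sum over bins of fluence * coefficient * bin width
dose_pSv = sum(f * h * w for f, h, w in zip(fluence, h_coeff, widths))
```

The same folding, with effective-dose coefficients in place of ambient-dose-equivalent ones, yields the second quantity studied in the record above.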

  4. Benchmark studies of induced radioactivity produced in LHC materials, Part II: Remanent dose rates.

    PubMed

    Brugger, M; Khater, H; Mayer, S; Prinz, A; Roesler, S; Ulrici, L; Vincke, H

    2005-01-01

    A new method to estimate remanent dose rates, to be used with the Monte Carlo code FLUKA, was benchmarked against measurements from an experiment that was performed at the CERN-EU high-energy reference field facility. An extensive collection of samples of different materials were placed downstream of, and laterally to, a copper target, intercepting a positively charged mixed hadron beam with a momentum of 120 GeV c(-1). Emphasis was put on the reduction of uncertainties by taking measures such as careful monitoring of the irradiation parameters, using different instruments to measure dose rates, adopting detailed elemental analyses of the irradiated materials and making detailed simulations of the irradiation experiment. The measured and calculated dose rates are in good agreement.

  5. Shielding design for the front end of the CERN SPL.

    PubMed

    Magistris, Matteo; Silari, Marco; Vincke, Helmut

    2005-01-01

    CERN is designing a 2.2-GeV Superconducting Proton Linac (SPL) with a beam power of 4 MW, to be used for the production of a neutrino superbeam. The SPL front end will initially accelerate 2 x 10(14) negative hydrogen ions per second up to an energy of 120 MeV. The FLUKA Monte Carlo code was employed for shielding design. The proposed shielding is a combined iron-concrete structure, which also takes into consideration the required RF wave-guide ducts and access labyrinths to the machine. Two beam-loss scenarios were investigated: (1) constant beam loss of 1 Wm(-1) over the whole accelerator length and (2) full beam loss occurring at various locations. A comparison with results based on simplified approaches is also presented.

  6. Monte Carlo Methods in Materials Science Based on FLUKA and ROOT

    NASA Technical Reports Server (NTRS)

    Pinsky, Lawrence; Wilson, Thomas; Empl, Anton; Andersen, Victor

    2003-01-01

    A comprehensive understanding of mitigation measures for space radiation protection necessarily involves the relevant fields of nuclear physics and particle transport modeling. One method of modeling the interaction of radiation traversing matter is Monte Carlo analysis, a subject that has been evolving since the very advent of nuclear reactors and particle accelerators in experimental physics. Countermeasures for radiation protection from neutrons near nuclear reactors, for example, were an early application, and Monte Carlo methods were quickly adapted to this general field of investigation. The project discussed here is concerned with taking the latest tools and technology in Monte Carlo analysis and adapting them to space applications such as radiation shielding design for spacecraft, as well as investigating how next-generation Monte Carlo codes can complement the existing analytical methods currently used by NASA. We have chosen to employ the Monte Carlo program known as FLUKA (a legacy acronym based on the German FLUktuierende KAskade, "fluctuating cascade") to simulate all of the particle transport, and the CERN-developed graphical-interface object-oriented analysis software called ROOT. One aspect of space radiation analysis for which Monte Carlo codes are particularly suited is the study of secondary radiation produced as albedo in the vicinity of the structural geometry involved. This broad goal of simulating space radiation transport through the relevant materials employing the FLUKA code necessarily requires the addition of the capability to simulate all heavy-ion interactions from 10 MeV/A up to the highest conceivable energies. For all energies above 3 GeV/A the Dual Parton Model (DPM) is currently used, although the possible improvement of the DPMJET event generator for energies of 3-30 GeV/A is being considered. One of the major tasks still facing us is the provision for heavy-ion interactions below 3 GeV/A. 
The ROOT interface is being developed in conjunction with the CERN ALICE (A Large Ion Collider Experiment) software team through an adaptation of their existing AliROOT (ALICE Using ROOT) architecture. In order to check our progress against actual data, we have chosen to simulate the ATIC (Advanced Thin Ionization Calorimeter) cosmic-ray astrophysics balloon payload as well as neutron fluences in the Mir spacecraft. This paper contains a summary of the status of this project and a roadmap to its successful completion.

  7. MC2-3 / DIF3D Analysis for the ZPPR-15 Doppler and Sodium Void Worth Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Micheal A.; Lell, Richard M.; Lee, Changho

    This manuscript covers validation efforts for our deterministic codes at Argonne National Laboratory. The experimental results come from the ZPPR-15 work in 1985-1986, which was focused on the accuracy of physics data for the integral fast reactor concept. Results for six loadings are studied in this document, with a focus on Doppler sample worths and sodium void worths. The ZPPR-15 loadings are modeled using the MC2-3/DIF3D codes developed and maintained at ANL and the MCNP code from LANL. The deterministic models are generated by processing the as-built geometry information, i.e. the MCNP input, and generating MC2-3 cross section generation instructions and a drawer-homogenized equivalence problem. The Doppler reactivity worth measurements use small heated samples that insert very small amounts of reactivity into the system (< 2 pcm). The results generated by the MC2-3/DIF3D codes were excellent for ZPPR-15A and ZPPR-15B and good for ZPPR-15D, compared to the MCNP solutions. In all cases, notable improvements were made over the analysis techniques applied to the same problems in 1987. The sodium void worth from MC2-3/DIF3D was quite good at 37.5 pcm, while the MCNP result was 33 pcm and the measured result was 31.5 pcm. Copyright © (2015) by the American Nuclear Society. All rights reserved.

  8. Accelerated GPU based SPECT Monte Carlo simulations.

    PubMed

    Garcia, Marie-Paule; Bert, Julien; Benoit, Didier; Bardiès, Manuel; Visvikis, Dimitris

    2016-06-07

    Monte Carlo (MC) modelling is widely used in the field of single photon emission computed tomography (SPECT) as it is a reliable technique to simulate very high quality scans. This technique provides very accurate modelling of the radiation transport and particle interactions in a heterogeneous medium. Various MC codes exist for nuclear medicine imaging simulations. Recently, new strategies exploiting the computing capabilities of graphical processing units (GPU) have been proposed. This work aims at evaluating the accuracy of such GPU implementation strategies in comparison to standard MC codes in the context of SPECT imaging. GATE was considered the reference MC toolkit and used to evaluate the performance of newly developed GPU Geant4-based Monte Carlo simulation (GGEMS) modules for SPECT imaging. Radioisotopes with different photon energies were used with these various CPU and GPU Geant4-based MC codes in order to assess the best strategy for each configuration. Three different isotopes were considered: (99m)Tc, (111)In and (131)I, using a low energy high resolution (LEHR) collimator, a medium energy general purpose (MEGP) collimator and a high energy general purpose (HEGP) collimator, respectively. Point source, uniform source, cylindrical phantom and anthropomorphic phantom acquisitions were simulated using a model of the GE Infinia II 3/8" gamma camera. Both simulation platforms yielded a similar system sensitivity and image statistical quality for the various combinations. The overall acceleration factor between the GATE and GGEMS platforms derived from the same cylindrical phantom acquisition was between 18 and 27 for the different radioisotopes. In addition, a full MC simulation using an anthropomorphic phantom showed the full potential of the GGEMS platform, with a resulting acceleration factor up to 71. 
The good agreement with reference codes and the acceleration factors obtained support the use of GPU implementation strategies for improving computational efficiency of SPECT imaging simulations.

  9. Fundamental Automated Scheduling System (FASS): A Second Look.

    DTIC Science & Technology

    1986-12-01

    Mr. J.H. Shoemaker and Mr. Ron Flatley of Norfolk Naval Shipyard; Mr. Bob Brunner of Long Beach Naval Shipyard; Mr. Barry Brinson of Charleston...Management, McGraw-Hill, 1978. Pressman, Roger S., Software Engineering: A Practitioner's Approach, McGraw-Hill, 1982. Project Systems Consultants Inc...J.H. Shoemaker, Code 377, Norfolk Naval Shipyard, Portsmouth, Virginia 23709-5000. 13. Mr. Barry Brinson, Code 377, Charleston Naval Shipyard, Charleston

  10. Thermodynamic Analysis of the Combustion of Metallic Materials

    NASA Technical Reports Server (NTRS)

    Wilson, D. Bruce; Stoltzfus, Joel M.

    2000-01-01

    Two types of computer codes are available to assist in the thermodynamic analysis of metallic materials combustion. One type calculates phase equilibrium data and is represented by CALPHAD. The other type calculates chemical reaction equilibria and is represented by the Gordon-McBride code. The first has seen significant application to alloy phase diagrams, but only recently has it been considered for oxidation systems. The Gordon-McBride code has been applied to the combustion of metallic materials. Both codes are limited by their treatment of non-ideal solutions and by the fact that they treat volatile and gaseous species as ideal. This paper examines the significance of these limitations for the combustion of metallic materials. In addition, the applicability of linear free-energy relationships to solid-phase oxidation, and their possible extension to liquid-phase systems, is examined.

  11. Error-correcting pairs for a public-key cryptosystem

    NASA Astrophysics Data System (ADS)

    Pellikaan, Ruud; Márquez-Corbella, Irene

    2017-06-01

    Code-based Cryptography (CBC) is a powerful and promising alternative for quantum-resistant cryptography. Indeed, together with lattice-based, multivariate and hash-based cryptography, it is one of the principal available techniques for post-quantum cryptography. CBC was first introduced by McEliece, who designed one of the most efficient public-key encryption schemes, with exceptionally strong security guarantees and other desirable properties that still resist attacks based on the Quantum Fourier Transform and Amplitude Amplification. The original proposal, which remains unbroken, was based on binary Goppa codes. Later, several families of codes were proposed in order to reduce the key size; some of these alternatives have already been broken. One of the main requirements of a code-based cryptosystem is having high-performance t-bounded decoding algorithms, which is achieved when the code has a t-error-correcting pair (ECP). Indeed, those McEliece schemes that use GRS, BCH, Goppa and algebraic geometry codes are in fact using an error-correcting pair as a secret key. That is, the security of these public-key cryptosystems is based not only on the inherent intractability of bounded distance decoding but also on the assumption that it is difficult to retrieve an error-correcting pair efficiently. In this paper, the class of codes with a t-ECP is proposed for the McEliece cryptosystem. Moreover, we study the hardness of distinguishing arbitrary codes from those having a t-error-correcting pair.
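To make the structure of a McEliece-style scheme concrete, here is a toy sketch that substitutes a Hamming(7,4) code (t = 1) for the binary Goppa codes of the original proposal. The scrambler S, permutation, and all parameters are illustrative choices, and a code this small is of course completely insecure; the point is only the keygen/encrypt/decrypt shape: public key G' = S·G·P, ciphertext c = m·G' + e with wt(e) ≤ t, and decryption by unpermuting, decoding, and unscrambling.

```python
import random

# Hamming(7,4): generator G = [I4 | P] and parity-check H = [P^T | I3] over GF(2)
G = [[1,0,0,0,1,1,0],
     [0,1,0,0,1,0,1],
     [0,0,1,0,0,1,1],
     [0,0,0,1,1,1,1]]
H = [[1,1,0,1,1,0,0],
     [1,0,1,1,0,1,0],
     [0,1,1,1,0,0,1]]

S     = [[1,1,0,0],[0,1,1,0],[0,0,1,1],[0,0,0,1]]  # invertible scrambler (secret)
S_INV = [[1,1,1,1],[0,1,1,1],[0,0,1,1],[0,0,0,1]]  # its inverse over GF(2)
PERM  = [2,4,6,0,1,3,5]                             # column permutation (secret)

def vec_mat(v, M):
    """Row vector times matrix over GF(2)."""
    return [sum(v[i] & M[i][j] for i in range(len(v))) % 2
            for j in range(len(M[0]))]

def mat_mul(A, B):
    return [vec_mat(row, B) for row in A]

def permute(v, p):
    return [v[p[i]] for i in range(len(p))]

PUB = [permute(row, PERM) for row in mat_mul(S, G)]  # public key G' = S*G*P

def encrypt(m, rng):
    c = vec_mat(m, PUB)
    c[rng.randrange(7)] ^= 1          # add one random error (t = 1)
    return c

def decrypt(c):
    inv = [0] * 7
    for i, p in enumerate(PERM):
        inv[p] = i
    y = permute(c, inv)               # undo P: y = (m*S)*G + permuted error
    s = tuple(sum(h[j] & y[j] for j in range(7)) % 2 for h in H)
    if any(s):                        # syndrome equals the column of H at the
        cols = [tuple(h[k] for h in H) for k in range(7)]
        y[cols.index(s)] ^= 1         # error position: flip it
    return vec_mat(y[:4], S_INV)      # first 4 bits are m*S; unscramble
```

The t-error-correcting pair discussed in the record plays the role that the syndrome-decoding step plays here: it is the secret structure that makes efficient bounded-distance decoding possible.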

  12. A novel method for the elaboration of hydroxyapatite with high purity by sol-gel using the albumin and comparison with the classical methods

    NASA Astrophysics Data System (ADS)

    Mohammed, Eddya; Bouazza, Tbib; Khalil, El-Hami

    2018-02-01

    In this paper, we report the first synthesis of hydroxyapatite (Hap) by sol-gel using albumin (egg white), compared with classical elaboration methods such as co-precipitation, solid-state and solid-liquid routes. We use a reference sample of hydroxyapatite bought from the Fluka Chemika company (Lot and Filling code 385330/1 14599). All samples are characterized by X-ray diffraction (XRD), UV-visible spectroscopy (UV-Vis) and Fourier transform infrared spectroscopy (FT-IR). The XRD study showed the existence of a hexagonal phase for all the samples prepared in our laboratory and an orthorhombic phase for the Fluka Chemika sample of Hap. The study by UV-visible spectroscopy was performed to determine and compare the optical gap and the disorder of each sample of Hap. The FT-IR spectroscopy demonstrated that all our Hap samples had a similar mode of vibration of the chemical bonds (OH-) and (PO4)3-.

  13. Neutron Transport Models and Methods for HZETRN and Coupling to Low Energy Light Ion Transport

    NASA Technical Reports Server (NTRS)

    Blattnig, S.R.; Slaba, T.C.; Heinbockel, J.H.

    2008-01-01

    Exposure estimates inside space vehicles, surface habitats, and high altitude aircraft exposed to space radiation are highly influenced by secondary neutron production. The deterministic transport code HZETRN has been identified as a reliable and efficient tool for such studies, but improvements to the underlying transport models and numerical methods are still necessary. In this paper, the forward-backward (FB) and directionally coupled forward-backward (DC) neutron transport models are derived, numerical methods for the FB model are reviewed, and a computationally efficient numerical solution is presented for the DC model. Both models are compared to the Monte Carlo codes HETC-HEDS and FLUKA, and the DC model is shown to agree closely with the Monte Carlo results. Finally, it is found in the development of either model that the decoupling of low energy neutrons from the light ion (A<4) transport procedure adversely affects low energy light ion fluence spectra and exposure quantities. A first order correction is presented to resolve the problem, and it is shown to be both accurate and efficient.

  14. Energy deposition and thermal effects of runaway electrons in ITER-FEAT plasma facing components

    NASA Astrophysics Data System (ADS)

    Maddaluno, G.; Maruccia, G.; Merola, M.; Rollet, S.

    2003-03-01

    The profile of energy deposited by runaway electrons (RAEs) of 10 or 50 MeV in International Thermonuclear Experimental Reactor-Fusion Energy Advanced Tokamak (ITER-FEAT) plasma facing components (PFCs) and the subsequent temperature pattern have been calculated by using the Monte Carlo code FLUKA and the finite element heat conduction code ANSYS. The RAE energy deposition density was assumed to be 50 MJ/m2, and both 10 and 100 ms deposition times were considered. Five different configurations of PFCs were investigated: the primary first wall armoured with Be, with and without protecting CFC poloidal limiters, both port limiter first wall options (Be flat tile and CFC monoblock), and the divertor baffle first wall, armoured with W. The analysis has outlined that for all the configurations but one (port limiter with Be flat tile) the heat sink and the cooling tube beneath the armour are well protected, for both RAE energies and both energy deposition times. On the other hand, large melting (W, Be) or sublimation (C) of the surface layer occurs, eventually affecting the PFC lifetime.

  15. Neutron spectrometry with a monolithic silicon telescope.

    PubMed

    Agosteo, S; D'Angelo, G; Fazzi, A; Para, A Foglio; Pola, A; Zotto, P

    2007-01-01

    A neutron spectrometer was set up by coupling a polyethylene converter with a monolithic silicon telescope, consisting of a DeltaE and an E stage detector (about 2 and 500 microm thick, respectively). The detection system was irradiated with monoenergetic neutrons at INFN-Laboratori Nazionali di Legnaro (Legnaro, Italy). The maximum detectable energy, imposed by the thickness of the E stage, is about 8 MeV for the present detector. The scatter plots of the energy deposited in the two stages were acquired using two independent electronic chains. The distributions of the recoil protons are well discriminated from those due to secondary electrons for energies above 0.350 MeV. The experimental spectra of the recoil protons were compared with the results of Monte Carlo simulations using the FLUKA code. An analytical model that takes into account the geometrical structure of the silicon telescope was developed, validated and implemented in an unfolding code. The capability of reproducing continuous neutron spectra was investigated by irradiating the detector with neutrons from a thick beryllium target bombarded with protons. The measured spectra were compared with data taken from the literature. Satisfactory agreement was found.

  16. Performance Enhancement of MC-CDMA System through Novel Sensitive Bit Algorithm Aided Turbo Multi User Detection

    PubMed Central

    Kumaravel, Rasadurai; Narayanaswamy, Kumaratharan

    2015-01-01

    Multi-carrier code division multiple access (MC-CDMA) is a promising multi-carrier modulation (MCM) technique for high data rate wireless communication over frequency-selective fading channels. MC-CDMA is a combination of code division multiple access (CDMA) and orthogonal frequency division multiplexing (OFDM). The OFDM part reduces multipath fading and inter-symbol interference (ISI), and the CDMA part increases spectrum utilization. Advantages of this technique are its robustness in the case of multipath propagation and improved security with minimized ISI. Nevertheless, due to the loss of orthogonality at the receiver in a mobile environment, multiple access interference (MAI) appears. The MAI is one of the factors that degrade the bit error rate (BER) performance of MC-CDMA systems. Multiuser detection (MUD) and turbo coding are the two dominant techniques for enhancing the BER performance of MC-CDMA systems and overcoming the effects of MAI. In this paper, a low-complexity iterative soft sensitive bits algorithm (SBA) aided logarithmic Maximum a Posteriori (Log-MAP) based turbo MUD is proposed. Simulation results show that the proposed method provides better BER performance with low-complexity decoding, by mitigating the detrimental effects of MAI. PMID:25714917
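The CDMA part of MC-CDMA relies on the orthogonality of the users' spreading codes; in MC-CDMA the chips of each spread bit are carried on separate OFDM subcarriers. A minimal sketch (Walsh-Hadamard codes of length 4, an ideal channel, and the OFDM modulation step omitted, so this shows only the spreading/despreading that orthogonality makes possible):

```python
# Walsh-Hadamard spreading codes of length 4; the rows are mutually orthogonal
CODES = [[ 1,  1,  1,  1],
         [ 1, -1,  1, -1],
         [ 1,  1, -1, -1],
         [ 1, -1, -1,  1]]

def spread(bits, code):
    """Map each data bit (+/-1) onto the chips of the spreading code.
    In MC-CDMA these chips would be carried on separate subcarriers."""
    return [b * c for b in bits for c in code]

def despread(chips, code):
    """Correlate the received chips with one user's code to recover its bits."""
    n = len(code)
    return [1 if sum(chips[i * n + j] * code[j] for j in range(n)) > 0 else -1
            for i in range(len(chips) // n)]

# Two users transmit simultaneously; their chip streams add on the channel.
u0 = [1, -1, 1]
u1 = [-1, -1, 1]
rx = [a + b for a, b in zip(spread(u0, CODES[1]), spread(u1, CODES[2]))]
```

On an ideal channel the correlation separates the users perfectly; MAI, as described in the record above, is exactly what appears when fading destroys this orthogonality at the receiver.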

  17. Shielding NSLS-II light source: Importance of geometry for calculating radiation levels from beam losses

    DOE PAGES

    Kramer, S. L.; Ghosh, V. J.; Breitfeller, M.; ...

    2016-08-10

    Third-generation high-brightness light sources are designed to have low-emittance, high-current beams, which contribute to higher beam loss rates that will be compensated by Top-Off injection. Shielding for these higher loss rates will be critical to protect the projected higher occupancy factors for the users. Top-Off injection requires a full-energy injector, which will demand greater consideration of the potential abnormal beam mis-steering and localized losses that could occur. The high-energy electron injection beam produces a significantly higher neutron component of the dose to the experimental floor than lower-energy beam injection and ramped operations. Minimizing this dose will require adequate knowledge of where the mis-steered beam can occur and sufficient EM shielding close to the loss point, in order to attenuate the energy of the particles in the EM shower below the neutron production threshold (<10 MeV), which will spread the incident energy over the bulk shield walls and thereby reduce the dose penetrating them. Designing supplemental shielding near the loss point using the analytic shielding model is shown to be inadequate because of its lack of geometry specification for the EM shower process. Predicting the dose rates outside the tunnel requires a detailed description of the geometry and materials that the beam losses will encounter inside the tunnel. Modern radiation shielding Monte Carlo codes, like FLUKA, can handle this geometric description of the radiation transport process in sufficient detail, allowing accurate predictions of the expected dose rates and the ability to reveal weaknesses in the design before a high-radiation incident occurs. The effort required to adequately define the accelerator geometry for these codes has been greatly reduced by the FLAIR graphical interface to FLUKA. This made the shielding process for NSLS-II accurate and reliable. 
The principles used to provide supplemental shielding to the NSLS-II accelerators and the lessons learned from this process are presented.

  18. Large Hadron Collider at CERN: Beams generating high-energy-density matter.

    PubMed

    Tahir, N A; Schmidt, R; Shutov, A; Lomonosov, I V; Piriz, A R; Hoffmann, D H H; Deutsch, C; Fortov, V E

    2009-04-01

    This paper presents numerical simulations that have been carried out to study the thermodynamic and hydrodynamic responses of a solid copper cylindrical target that is facially irradiated along the axis by one of the two Large Hadron Collider (LHC) 7 TeV/c proton beams. The energy deposition by protons in solid copper has been calculated using an established particle interaction Monte Carlo code, FLUKA, which is capable of simulating all components of the particle cascades in matter, up to multi-TeV energies. These data have been used as input to a sophisticated two-dimensional hydrodynamic computer code, BIG2, that has been employed to study this problem. The prime purpose of these investigations was to assess the damage caused to the equipment if the entire LHC beam is lost at a single place. The FLUKA calculations show that the energy of the protons will be deposited in solid copper within about 1 m, assuming constant material parameters. Nevertheless, our hydrodynamic simulations have shown that the energy deposition region will extend to a length of about 35 m over the beam duration. This is because the first few tens of bunches deposit sufficient energy to create high pressure that generates an outgoing radial shock wave. Shock propagation leads to a continuous reduction in the density at the target center, which allows the protons delivered in subsequent bunches to penetrate deeper and deeper into the target. This phenomenon has also been seen in the case of heavy-ion-heated targets [N. A. Tahir, A. Kozyreva, P. Spiller, D. H. H. Hoffmann, and A. Shutov, Phys. Rev. E 63, 036407 (2001)]. This effect needs to be considered in the design of a sacrificial beam stopper. These simulations have also shown that the target is severely damaged and is converted into a huge sample of high-energy-density (HED) matter. In fact, the inner part of the target is transformed into a strongly coupled plasma with fairly uniform physical conditions. 
This work, therefore, has suggested an additional very important application of the LHC, namely, studies of HED states in matter.

  19. Stress induced gene expression drives transient DNA methylation changes at adjacent repetitive elements

    PubMed Central

    Secco, David; Wang, Chuang; Shou, Huixia; Schultz, Matthew D; Chiarenza, Serge; Nussaume, Laurent; Ecker, Joseph R; Whelan, James; Lister, Ryan

    2015-01-01

    Cytosine DNA methylation (mC) is a genome modification that can regulate the expression of coding and non-coding genetic elements. However, little is known about the involvement of mC in response to environmental cues. Using whole genome bisulfite sequencing to assess the spatio-temporal dynamics of mC in rice grown under phosphate starvation and recovery conditions, we identified widespread phosphate starvation-induced changes in mC, preferentially localized in transposable elements (TEs) close to highly induced genes. These changes in mC occurred after changes in nearby gene transcription, were mostly DCL3a-independent, and could partially be propagated through mitosis, however no evidence of meiotic transmission was observed. Similar analyses performed in Arabidopsis revealed a very limited effect of phosphate starvation on mC, suggesting a species-specific mechanism. Overall, this suggests that TEs in proximity to environmentally induced genes are silenced via hypermethylation, and establishes the temporal hierarchy of transcriptional and epigenomic changes in response to stress. DOI: http://dx.doi.org/10.7554/eLife.09343.001 PMID:26196146

  20. GEANT4 benchmark with MCNPX and PHITS for activation of concrete

    NASA Astrophysics Data System (ADS)

    Tesse, Robin; Stichelbaut, Frédéric; Pauly, Nicolas; Dubus, Alain; Derrien, Jonathan

    2018-02-01

    The activation of concrete is a real problem from the point of view of waste management. Because of the complexity of the issue, Monte Carlo (MC) codes have become an essential tool for its study, but several codes, and several nuclear models within each code, are available. MCNPX and PHITS have already been validated for shielding studies, and GEANT4 is also a suitable option. In these codes, different models can be considered for a concrete activation study: the Bertini model is not the best choice for spallation, while the BIC and INCL models agree well with previous results in the literature.

  1. Shielding evaluation for solar particle events using MCNPX, PHITS and OLTARIS codes

    NASA Astrophysics Data System (ADS)

    Aghara, S. K.; Sriprisan, S. I.; Singleterry, R. C.; Sato, T.

    2015-01-01

    Detailed analyses of Solar Particle Events (SPE) were performed to calculate primary and secondary particle spectra behind aluminum, at various thicknesses in water. The simulations were based on the Monte Carlo (MC) radiation transport codes MCNPX 2.7.0 and PHITS 2.64, and the space radiation analysis website OLTARIS (On-Line Tool for the Assessment of Radiation in Space) version 3.4, which uses the deterministic code HZETRN for transport. The study investigates the transport of SPE spectra through a 10 or 20 g/cm2 Al shield followed by a 30 g/cm2 water slab. Four historical SPE events were selected and used as input source spectra; the resulting particle differential spectra for protons, neutrons, and photons are presented, together with the total particle fluence as a function of depth. In addition to particle flux, the dose and dose equivalent values are calculated and compared between the codes and with other published results. Overall, the particle fluence spectra from all three codes show good agreement, with the MC codes agreeing more closely with each other than with the OLTARIS results. The neutron fluence from OLTARIS is lower than that from the MC codes at lower energies (E < 100 MeV). Based on mean-square-difference analysis, the results from MCNPX and PHITS agree better with each other for fluence, dose, and dose equivalent than with the OLTARIS results.
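
    The mean-square-difference comparison used in the abstract can be sketched in a few lines. The spectra below are invented placeholder numbers, not data from the paper; only the metric itself, and the reported qualitative trend (OLTARIS lower at low energies), is illustrated.

    ```python
    import numpy as np

    def mean_square_difference(ref, test):
        """Mean square relative difference between two binned spectra."""
        ref, test = np.asarray(ref, float), np.asarray(test, float)
        return float(np.mean(((test - ref) / ref) ** 2))

    # Invented fluence spectra (arbitrary units, 4 energy groups), mimicking the
    # reported trend: OLTARIS lower than the MC codes in the low-energy groups.
    mcnpx   = np.array([1.00, 0.80, 0.50, 0.20])
    phits   = np.array([1.02, 0.79, 0.51, 0.20])
    oltaris = np.array([0.60, 0.70, 0.50, 0.20])

    # The two MC codes agree more closely with each other than with OLTARIS:
    mc_vs_mc      = mean_square_difference(mcnpx, phits)
    mc_vs_oltaris = mean_square_difference(mcnpx, oltaris)
    ```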

  2. PyMC: Bayesian Stochastic Modelling in Python

    PubMed Central

    Patil, Anand; Huard, David; Fonnesbeck, Christopher J.

    2010-01-01

    This user guide describes a Python package, PyMC, that allows users to efficiently code a probabilistic model and draw samples from its posterior distribution using Markov chain Monte Carlo techniques. PMID:21603108
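
    PyMC automates model specification and sampling; the underlying Markov chain Monte Carlo idea can be sketched without the package. This is a minimal random-walk Metropolis sampler, not PyMC's API:

    ```python
    import math, random

    def metropolis(logp, x0, n_samples, step=0.5, seed=1):
        """Minimal random-walk Metropolis sampler: the core MCMC idea that
        PyMC wraps behind its model-specification machinery."""
        rng = random.Random(seed)
        x, lp = x0, logp(x0)
        samples = []
        for _ in range(n_samples):
            xp = x + rng.gauss(0.0, step)          # symmetric proposal
            lpp = logp(xp)
            if lpp >= lp or rng.random() < math.exp(lpp - lp):
                x, lp = xp, lpp                    # accept
            samples.append(x)                      # (a rejection keeps the old x)
        return samples

    # Target: a standard normal "posterior", given as a log-density up to a constant.
    samples = metropolis(lambda x: -0.5 * x * x, x0=0.0, n_samples=20000)
    mean = sum(samples) / len(samples)             # should be near 0
    ```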

  3. Implementation and verification of nuclear interactions in a Monte-Carlo code for the Procom-ProGam proton therapy planning system

    NASA Astrophysics Data System (ADS)

    Kostyuchenko, V. I.; Makarova, A. S.; Ryazantsev, O. B.; Samarin, S. I.; Uglov, A. S.

    2014-06-01

    A great breakthrough in proton therapy has happened in the new century: several tens of dedicated centers are now operated throughout the world and their number increases every year. An important component of proton therapy is the treatment planning system. To make calculations faster, these systems usually use analytical methods whose reliability and accuracy do not allow the advantages of this treatment modality to be exploited to the full extent. Predictions by the Monte Carlo (MC) method are the "gold standard" for the verification of calculations with these systems. At the Institute of Experimental and Theoretical Physics (ITEP), which is one of the oldest proton therapy centers in the world, an MC code is an integral part of the treatment planning system. This code, called IThMC, was developed by scientists from RFNC-VNIITF (Snezhinsk) under ISTC Project 3563.

  4. A versatile multi-objective FLUKA optimization using Genetic Algorithms

    NASA Astrophysics Data System (ADS)

    Vlachoudis, Vasilis; Antoniucci, Guido Arnau; Mathot, Serge; Kozlowska, Wioletta Sandra; Vretenar, Maurizio

    2017-09-01

    Monte Carlo simulation studies quite often require a multi-parameter phase-space optimization, a complicated task that relies heavily on operator experience and judgment. Examples of such calculations are shielding studies with stringent constraints on cost, residual dose, material properties and available space, or, in the medical field, optimization of the dose delivered to a patient under hadron treatment. The present paper describes our implementation inside flair [1], the advanced user interface of FLUKA [2,3], of a multi-objective Genetic Algorithm to facilitate the search for the optimum solution.
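
    The core of any multi-objective Genetic Algorithm is Pareto-dominance selection. A minimal sketch follows; the objective values are invented, standing in for a shielding trade-off such as cost versus residual dose, and this is not the flair implementation:

    ```python
    def dominates(a, b):
        """True if a dominates b: no worse in every objective and strictly
        better in at least one (minimization convention)."""
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    def pareto_front(points):
        """Non-dominated subset of a list of objective vectors."""
        return [p for p in points if not any(dominates(q, p) for q in points)]

    # Invented (cost, residual dose) pairs standing in for shielding candidates.
    candidates = [(3, 9), (5, 4), (7, 2), (6, 5), (4, 10)]
    front = pareto_front(candidates)   # (6, 5) and (4, 10) are dominated
    ```

    A full multi-objective GA would then apply crossover and mutation while ranking the population by dominance fronts; the selection criterion above is what replaces a single fitness value.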

  5. FY17 Status Report on NEAMS Neutronics Activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, C. H.; Jung, Y. S.; Smith, M. A.

    2017-09-30

    Under the U.S. DOE NEAMS program, the high-fidelity neutronics code system has been developed to support the multiphysics modeling and simulation capability named SHARP. The neutronics code system includes the high-fidelity neutronics code PROTEUS, the cross section library and preprocessing tools, the multigroup cross section generation code MC2-3, the in-house mesh generation tool, the perturbation and sensitivity analysis code PERSENT, and post-processing tools. The main objectives of the NEAMS neutronics activities in FY17 are to continue development of an advanced nodal solver in PROTEUS for use in nuclear reactor design and analysis projects, implement a simplified sub-channel based thermal-hydraulic (T/H) capability into PROTEUS to efficiently compute the thermal feedback, improve the performance of PROTEUS-MOCEX using numerical acceleration and code optimization, improve the cross section generation tools including MC2-3, and continue to perform verification and validation tests for PROTEUS.

  6. Adding Big Data Analytics to GCSS-MC

    DTIC Science & Technology

    2014-09-30

    Only report-form and table-of-contents fragments of this document survive. Keywords: Big Data, Hadoop, MapReduce, GCSS-MC. Recoverable contents entries: 2.5 Hadoop; 3 The Experiment Design; 3.1 Why Add a Big Data Element; 3.2 Adding a Big Data Element to GCSS-MC; 3.3 Building a Hadoop Cluster.

  7. Monte Carlo and discrete-ordinate simulations of spectral radiances in a coupled air-tissue system.

    PubMed

    Hestenes, Kjersti; Nielsen, Kristian P; Zhao, Lu; Stamnes, Jakob J; Stamnes, Knut

    2007-04-20

    We perform a detailed comparison study of Monte Carlo (MC) simulations and discrete-ordinate radiative-transfer (DISORT) calculations of spectral radiances in a 1D coupled air-tissue (CAT) system consisting of horizontal plane-parallel layers. The MC and DISORT models have the same physical basis, including coupling between the air and the tissue, and we use the same air and tissue input parameters for both codes. We find excellent agreement between radiances obtained with the two codes, both above and in the tissue. Our tests cover typical optical properties of skin tissue at the 280, 540, and 650 nm wavelengths. The normalized volume scattering function for internal structures in the skin is represented by the one-parameter Henyey-Greenstein function for large particles and the Rayleigh scattering function for small particles. The CAT-DISORT code is found to be approximately 1000 times faster than the CAT-MC code. We also show that the spectral radiance field is strongly dependent on the inherent optical properties of the skin tissue.
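
    The one-parameter Henyey-Greenstein function mentioned above has a closed-form inverse CDF, which is how MC codes commonly sample scattering angles from it. A minimal sketch of that generic technique (not the CAT-MC code itself):

    ```python
    import random

    def sample_hg_costheta(g, rng):
        """Sample cos(theta) from the Henyey-Greenstein phase function via its
        closed-form inverse CDF (valid for asymmetry parameter g != 0)."""
        u = rng.random()
        s = (1.0 - g * g) / (1.0 - g + 2.0 * g * u)
        return (1.0 + g * g - s * s) / (2.0 * g)

    rng = random.Random(0)
    g = 0.9      # strongly forward-peaked, typical of large tissue scatterers
    n = 100000
    mean_mu = sum(sample_hg_costheta(g, rng) for _ in range(n)) / n
    # For Henyey-Greenstein, the mean cosine of the scattering angle equals g.
    ```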

  8. Radiation Environment Inside Spacecraft

    NASA Technical Reports Server (NTRS)

    O'Neill, Patrick

    2015-01-01

    Dr. Patrick O'Neill, NASA Johnson Space Center, will present a detailed description of the radiation environment inside spacecraft. The free-space (outside) solar and galactic cosmic ray and trapped Van Allen belt proton spectra are significantly modified as these ions propagate through various thicknesses of spacecraft structure and shielding material. In addition to energy loss, secondary ions are created as the ions interact with the structure materials. Nuclear interaction codes (FLUKA, GEANT4, HZETRN, MCNPX, CEM03, and PHITS) transport free-space spectra through different thicknesses of various materials. These "inside" energy spectra are then converted to Linear Energy Transfer (LET) spectra and dose rate, which is what electronics systems designers need. Model predictions are compared to radiation measurements made by instruments such as the Intra-Vehicular Charged Particle Directional Spectrometer (IV-CPDS) used inside the Space Station, Orion, and Space Shuttle.

  9. Study of the response of a lithium yttrium borate scintillator based neutron rem counter by Monte Carlo radiation transport simulations

    NASA Astrophysics Data System (ADS)

    Sunil, C.; Tyagi, Mohit; Biju, K.; Shanbhag, A. A.; Bandyopadhyay, T.

    2015-12-01

    The scarcity and high cost of 3He have spurred the use of various detectors for neutron monitoring. A new lithium yttrium borate scintillator developed at BARC has been studied for its use in a neutron rem counter. The scintillator is made of natural lithium and boron, and the yield of reaction products that will generate a signal in a real-time detector has been studied with the FLUKA Monte Carlo radiation transport code. A 2 cm lead layer introduced to enhance gamma rejection shows no appreciable change in the shape of the fluence response or in the yield of reaction products. The fluence response, when normalized at the average energy of an Am-Be neutron source, shows promise for use as a rem counter.

  10. Coupled reactors analysis: New needs and advances using Monte Carlo methodology

    DOE PAGES

    Aufiero, M.; Palmiotti, G.; Salvatores, M.; ...

    2016-08-20

    Coupled reactors and the coupling features of large or heterogeneous core reactors can be investigated with the Avery theory, which allows a physics understanding of the main features of these systems. However, the complex geometries often encountered in association with coupled reactors require a detailed geometry description that can be easily provided by modern Monte Carlo (MC) codes. This implies an MC calculation of the coupling parameters defined by Avery and of the sensitivity coefficients that allow further detailed physics analysis. The results presented in this paper show that the MC code SERPENT has been successfully modified to meet the required capabilities.

  11. Multi-Sensor Detection with Particle Swarm Optimization for Time-Frequency Coded Cooperative WSNs Based on MC-CDMA for Underground Coal Mines

    PubMed Central

    Xu, Jingjing; Yang, Wei; Zhang, Linyuan; Han, Ruisong; Shao, Xiaotao

    2015-01-01

    In this paper, a wireless sensor network (WSN) technology adapted to underground channel conditions is developed, which has important theoretical and practical value for safety monitoring in underground coal mines. Because the space, time, and frequency resources of an underground tunnel are open, it is proposed to build wireless sensor nodes based on multicarrier code division multiple access (MC-CDMA) to make full use of these resources. To improve the wireless transmission performance of source sensor nodes, it is also proposed to use cooperative sensors with good channel conditions to the sink node to assist source sensors with poor channel conditions. Moreover, the total power of each source sensor and its cooperative sensors is allocated on the basis of their channel conditions to increase the energy efficiency of the WSN. To address the multiple access interference (MAI) that arises when several source sensors transmit monitoring information simultaneously, a multi-sensor detection (MSD) algorithm with particle swarm optimization (PSO), named D-PSO, is proposed for the time-frequency coded cooperative MC-CDMA WSN. Simulation results show that the average bit error rate (BER) performance of the proposed WSN in an underground coal mine is improved significantly by using MC-CDMA-based sensor nodes, time-frequency coded cooperative transmission, and the D-PSO algorithm. PMID:26343660
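
    The PSO component of the D-PSO detector can be illustrated generically. Below is a minimal global-best particle swarm optimizer applied to a toy objective; it is not the authors' D-PSO algorithm, only the underlying PSO idea it builds on:

    ```python
    import random

    def pso(f, dim, n_particles=20, iters=100, seed=3):
        """Minimal global-best particle swarm optimizer (minimization)."""
        rng = random.Random(seed)
        w, c1, c2 = 0.7, 1.5, 1.5          # inertia and acceleration coefficients
        xs = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
        vs = [[0.0] * dim for _ in range(n_particles)]
        pbest = [x[:] for x in xs]          # personal bests
        pbest_f = [f(x) for x in xs]
        gi = min(range(n_particles), key=lambda i: pbest_f[i])
        gbest, gbest_f = pbest[gi][:], pbest_f[gi]
        for _ in range(iters):
            for i in range(n_particles):
                for d in range(dim):
                    vs[i][d] = (w * vs[i][d]
                                + c1 * rng.random() * (pbest[i][d] - xs[i][d])
                                + c2 * rng.random() * (gbest[d] - xs[i][d]))
                    xs[i][d] += vs[i][d]
                fx = f(xs[i])
                if fx < pbest_f[i]:
                    pbest[i], pbest_f[i] = xs[i][:], fx
                    if fx < gbest_f:
                        gbest, gbest_f = xs[i][:], fx
        return gbest, gbest_f

    # Sphere function as a stand-in objective: optimum at the origin.
    best, best_f = pso(lambda x: sum(t * t for t in x), dim=3)
    ```

    In the paper's setting, the objective would instead be the multi-sensor detection metric over candidate symbol vectors.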

  12. Multi-Sensor Detection with Particle Swarm Optimization for Time-Frequency Coded Cooperative WSNs Based on MC-CDMA for Underground Coal Mines.

    PubMed

    Xu, Jingjing; Yang, Wei; Zhang, Linyuan; Han, Ruisong; Shao, Xiaotao

    2015-08-27

    In this paper, a wireless sensor network (WSN) technology adapted to underground channel conditions is developed, which has important theoretical and practical value for safety monitoring in underground coal mines. Because the space, time, and frequency resources of an underground tunnel are open, it is proposed to build wireless sensor nodes based on multicarrier code division multiple access (MC-CDMA) to make full use of these resources. To improve the wireless transmission performance of source sensor nodes, it is also proposed to use cooperative sensors with good channel conditions to the sink node to assist source sensors with poor channel conditions. Moreover, the total power of each source sensor and its cooperative sensors is allocated on the basis of their channel conditions to increase the energy efficiency of the WSN. To address the multiple access interference (MAI) that arises when several source sensors transmit monitoring information simultaneously, a multi-sensor detection (MSD) algorithm with particle swarm optimization (PSO), named D-PSO, is proposed for the time-frequency coded cooperative MC-CDMA WSN. Simulation results show that the average bit error rate (BER) performance of the proposed WSN in an underground coal mine is improved significantly by using MC-CDMA-based sensor nodes, time-frequency coded cooperative transmission, and the D-PSO algorithm.

  13. Enabling Microscopic Simulators to Perform System Level Tasks: A System-Identification Based, Closure-on-Demand Toolkit for Multiscale Simulation Stability/Bifurcation Analysis, Optimization and Control

    DTIC Science & Technology

    2006-10-01

    The objective was to construct a bridge between existing and future microscopic simulation codes (kinetic Monte Carlo, kMC; molecular dynamics, MD; equilibrium MC; Brownian dynamics, BD; Lattice-Boltzmann, LB; or general agent-based, AB) and traditional continuum simulators. Only fragmentary text survives beyond this sentence, including a citation of "Coarse Projective kMC Integration: Forward/Reverse Initial and Boundary Value Problems" (R. Rico-Martinez et al., cond-mat/0310460 at arXiv.org).

  14. Impact of high energy high intensity proton beams on targets: Case studies for Super Proton Synchrotron and Large Hadron Collider

    NASA Astrophysics Data System (ADS)

    Tahir, N. A.; Sancho, J. Blanco; Shutov, A.; Schmidt, R.; Piriz, A. R.

    2012-05-01

    The Large Hadron Collider (LHC) is designed to collide two proton beams with an unprecedented particle energy of 7 TeV. Each beam comprises 2808 bunches and the separation between two neighboring bunches is 25 ns. The energy stored in each beam is 362 MJ, sufficient to melt 500 kg of copper. Safety of operation is very important when working with such powerful beams. An accidental release of even a very small fraction of the beam energy can result in severe damage to the equipment. The machine protection system is essential to handle all types of possible accidental hazards; however, it is important to know the possible consequences of failures. One of the critical failure scenarios is when the entire beam is lost at a single point. In this paper we present detailed numerical simulations of the full impact of one LHC beam on a cylindrical solid carbon target. First, the energy deposition by the protons is calculated with the FLUKA code; this energy deposition is then used in the BIG2 code to study the corresponding thermodynamic and hydrodynamic response of the target, which leads to a reduction in the density. The modified density distribution is used in FLUKA to calculate a new energy loss distribution, and the two codes are thus run iteratively. A suitable iteration step is considered to be the time interval during which the target density along the axis decreases by 15%-20%. Our simulations suggest that the full LHC proton beam penetrates up to 25 m into solid carbon, whereas the range of the shower from a single proton in solid carbon is only about 3 m (hydrodynamic tunneling effect). It is planned to perform experiments at the HiRadMat (High Radiation Materials) facility at CERN using the proton beam from the Super Proton Synchrotron (SPS) to compare experimental results with the theoretical predictions. Therefore, simulations of the response of a solid copper cylindrical target hit by the SPS beam were performed. The particle energy in the SPS beam is 440 GeV; it has the same bunch structure as the LHC beam, except that it has only up to 288 bunches. Beam focal spot sizes of σ = 0.1, 0.2, and 0.5 mm have been considered. Significant hydrodynamic tunneling is also expected in these experiments.
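
    The quoted figure that 362 MJ suffices to melt about 500 kg of copper can be sanity-checked with handbook values. The specific heat, melting point, and latent heat of fusion below are assumptions taken from standard tables, with a room-temperature start:

    ```python
    # Back-of-envelope check of the "362 MJ melts ~500 kg of copper" statement.
    c_p    = 385.0     # J/(kg K), specific heat of copper (handbook value)
    T_melt = 1358.0    # K, melting point of copper
    T_0    = 300.0     # K, assumed initial temperature
    L_f    = 2.05e5    # J/kg, latent heat of fusion of copper

    e_per_kg = c_p * (T_melt - T_0) + L_f   # energy to heat and melt 1 kg
    mass_melted = 362e6 / e_per_kg          # kg melted by one 362 MJ beam
    # roughly 600 kg, the same order as the quoted 500 kg
    ```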

  15. Radiation protection considerations along a radioactive ion beam transport line

    NASA Astrophysics Data System (ADS)

    Sarchiapone, Lucia; Zafiropoulos, Demetre

    2016-09-01

    The goal of the SPES project is to produce accelerated radioactive ion beams for physics studies at the “Laboratori Nazionali di Legnaro” (INFN, Italy). This accelerator complex is scheduled to be built by 2016 for effective operation in 2017. Radioactive species are produced in a uranium carbide target by the interaction of 200 μA of protons at 40 MeV. All of the ionized species in the 1+ state come out of the target (ISOL method) and pass through a Wien filter for a first selection and an HMRS (high mass resolution spectrometer). They are then transported by an electrostatic beam line toward a charge state breeder (where the 1+ to n+ multi-ionization takes place) before selection and reacceleration at the already existing superconducting linac. This paper describes the dose evaluations, activation calculations, and radiation protection constraints related to the transport of the radioactive ion beam (RIB) from the target to the mass separator. The FLUKA code has been used as a tool for those calculations needing Monte Carlo simulations, in particular for the evaluation of the dose rate due to the presence of the radioactive beam at the selection/interaction points. The time evolution of a radionuclide inventory can be computed online with FLUKA for arbitrary irradiation profiles and decay times. The activity evolution is analytically evaluated through the implementation of the Bateman equations. Furthermore, the generation and transport of decay radiation (limited to gamma, beta- and beta+ emissions) is possible, referring to a dedicated database of decay emissions using mostly information obtained from NNDC, sometimes supplemented with other data and checked for consistency. When the use of Monte Carlo simulations was not feasible, the Bateman equations, or suitable simplifications, were used directly.
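
    The Bateman equations mentioned above have a simple closed form for a two-member chain, which is the building block of the analytic activity evolution. A sketch with hypothetical half-lives, not SPES inventory data:

    ```python
    import math

    def bateman_two(n1_0, lam1, lam2, t):
        """Decay chain A -> B -> (stable): analytic Bateman solution for the
        populations N1(t), N2(t). Assumes lam1 != lam2 and N2(0) = 0."""
        n1 = n1_0 * math.exp(-lam1 * t)
        n2 = n1_0 * lam1 / (lam2 - lam1) * (math.exp(-lam1 * t) - math.exp(-lam2 * t))
        return n1, n2

    # Hypothetical nuclides: parent half-life 10 h, daughter half-life 1 h.
    lam1 = math.log(2) / 10.0    # 1/h
    lam2 = math.log(2) / 1.0     # 1/h
    n1, n2 = bateman_two(1e6, lam1, lam2, t=5.0)
    ```

    At long times the ratio N2/N1 approaches the transient-equilibrium value lam1/(lam2 - lam1), a standard check on the implementation.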

  16. The response of a bonner sphere spectrometer to charged hadrons.

    PubMed

    Agosteo, S; Dimovasili, E; Fassò, A; Silari, M

    2004-01-01

    Bonner sphere spectrometers (BSSs) have been employed in neutron spectrometry and dosimetry for many years. Recent developments have seen the addition to a conventional BSS of one or more detectors (moderator plus thermal neutron counter) specifically designed to improve the overall response of the spectrometer to neutrons above 10 MeV. These additional detectors employ a shell of high-mass-number material (such as lead) within the polyethylene moderator, in order to slow down high-energy neutrons via (n,xn) reactions. A BSS can be used to measure neutron spectra both outside accelerator shielding and from an unshielded target. Measurements were recently performed at CERN of the neutron yield and spectral fluence at various angles from unshielded, semi-thick copper, silver and lead targets, bombarded by a mixed proton/pion beam with 40 GeV/c momentum. These experiments provided evidence that under certain circumstances the use of lead-enriched moderators may present a problem: these detectors were found to have a significant response to the charged hadron component accompanying the neutrons emitted from the target. Conventional polyethylene moderators show a similar, but less pronounced, behaviour. These secondary hadrons interact with the moderator and generate neutrons, which are in turn detected by the counter. To investigate this effect and determine a correction factor to be applied in the unfolding procedure, a series of Monte Carlo simulations was performed with the FLUKA code. These simulations aimed at determining the response of the BSS to charged hadrons under the specific experimental conditions. Following these results, a complete response matrix of the extended BSS to charged pions and protons was calculated with FLUKA. An experimental verification was carried out with a 120 GeV/c hadron beam at the CERF facility at CERN.
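
    A response matrix links sphere readings to group fluences, and the charged-hadron correction amounts to subtracting the contaminant contribution before unfolding. A toy illustration with invented response values, not the FLUKA-calculated matrix:

    ```python
    import numpy as np

    # Hypothetical 3-sphere, 4-group response matrix R (counts per unit fluence);
    # rows: bare, moderated, lead-loaded sphere; columns: neutron energy groups.
    R = np.array([[0.80, 0.40, 0.10, 0.02],
                  [0.30, 0.60, 0.50, 0.10],
                  [0.05, 0.20, 0.60, 0.40]])
    phi_n = np.array([1.0, 0.5, 0.3, 0.1])        # neutron group fluences (invented)

    # Charged hadrons add counts of their own (the effect studied in the paper);
    # the lead-loaded sphere is given the largest contaminant response.
    resp_hadron = np.array([0.01, 0.05, 0.30])    # per-sphere response (invented)
    phi_hadron = 0.2                              # charged-hadron fluence (invented)

    measured = R @ phi_n + phi_hadron * resp_hadron
    # Applying the correction removes the contamination before unfolding:
    corrected = measured - phi_hadron * resp_hadron
    ```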

  17. A new DWT/MC/DPCM video compression framework based on EBCOT

    NASA Astrophysics Data System (ADS)

    Mei, L. M.; Wu, H. R.; Tan, D. M.

    2005-07-01

    A novel Discrete Wavelet Transform (DWT)/Motion Compensation (MC)/Differential Pulse Code Modulation (DPCM) video compression framework is proposed in this paper. Although Discrete Cosine Transform (DCT)/MC/DPCM is the mainstream framework for video coders in industry and international standards, the idea of DWT/MC/DPCM has existed in the literature for more than a decade and its investigation is still ongoing. The contribution of this work is twofold. Firstly, Embedded Block Coding with Optimal Truncation (EBCOT) is used here as the compression engine for both intra- and inter-frame coding, which provides a good compression ratio and an embedded rate-distortion (R-D) optimization mechanism. This is an extension of the EBCOT application from still images to video. Secondly, this framework offers a good interface for the Perceptual Distortion Measure (PDM) based on the Human Visual System (HVS), where the Mean Squared Error (MSE) can easily be replaced with the PDM in the R-D optimization. Some preliminary results are reported here and compared with benchmarks such as MPEG-2 and MPEG-4 version 2. The results demonstrate that under the specified conditions the proposed coder outperforms the benchmarks in terms of rate vs. distortion.

  18. Shielding evaluation for solar particle events using MCNPX, PHITS and OLTARIS codes.

    PubMed

    Aghara, S K; Sriprisan, S I; Singleterry, R C; Sato, T

    2015-01-01

    Detailed analyses of Solar Particle Events (SPE) were performed to calculate primary and secondary particle spectra behind aluminum, at various thicknesses in water. The simulations were based on the Monte Carlo (MC) radiation transport codes MCNPX 2.7.0 and PHITS 2.64, and the space radiation analysis website OLTARIS (On-Line Tool for the Assessment of Radiation in Space) version 3.4, which uses the deterministic code HZETRN for transport. The study investigates the transport of SPE spectra through a 10 or 20 g/cm2 Al shield followed by a 30 g/cm2 water slab. Four historical SPE events were selected and used as input source spectra; the resulting particle differential spectra for protons, neutrons, and photons are presented, together with the total particle fluence as a function of depth. In addition to particle flux, the dose and dose equivalent values are calculated and compared between the codes and with other published results. Overall, the particle fluence spectra from all three codes show good agreement, with the MC codes agreeing more closely with each other than with the OLTARIS results. The neutron fluence from OLTARIS is lower than that from the MC codes at lower energies (E < 100 MeV). Based on mean-square-difference analysis, the results from MCNPX and PHITS agree better with each other for fluence, dose, and dose equivalent than with the OLTARIS results. Copyright © 2015 The Committee on Space Research (COSPAR). All rights reserved.

  19. Ion recombination and polarity correction factors for a plane-parallel ionization chamber in a proton scanning beam.

    PubMed

    Liszka, Małgorzata; Stolarczyk, Liliana; Kłodowska, Magdalena; Kozera, Anna; Krzempek, Dawid; Mojżeszek, Natalia; Pędracka, Anna; Waligórski, Michael Patrick Russell; Olko, Paweł

    2018-01-01

    To evaluate the effect on charge collection in the ionization chamber (IC) in proton pencil beam scanning (PBS), where the local dose rate may exceed the dose rates encountered in conventional MV therapy by up to three orders of magnitude. We measured values of the ion recombination (k s ) and polarity (k pol ) correction factors in water, for a plane-parallel Markus TM23343 IC, using the cyclotron-based Proteus-235 therapy system with an active proton PBS of energies 30-230 MeV. Values of k s were determined from extrapolation of the saturation curve and the Two-Voltage Method (TVM), for planar fields. We compared our experimental results with those obtained from theoretical calculations. The PBS dose rates were estimated by combining direct IC measurements with results of simulations performed using the FLUKA MC code. Values of k s were also determined by the TVM for uniformly irradiated volumes over different ranges and modulation depths of the proton PBS, with or without range shifter. By measuring charge collection efficiency versus applied IC voltage, we confirmed that, with respect to ion recombination, our proton PBS represents a continuous beam. For a given chamber parameter, e.g., nominal voltage, the value of k s depends on the energy and the dose rate of the proton PBS, reaching c. 0.5% for the TVM, at the dose rate of 13.4 Gy/s. For uniformly irradiated regular volumes, the k s value was significantly smaller, within 0.2% or 0.3% for irradiations with or without range shifter, respectively. Within measurement uncertainty, the average value of k pol , for the Markus TM23343 IC, was close to unity over the whole investigated range of clinical proton beam energies. While no polarity effect was observed for the Markus TM23343 IC in our pencil scanning proton beam system, the effect of volume recombination cannot be ignored. © 2017 American Association of Physicists in Medicine.
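
    The two-voltage method (TVM) for a continuous beam, as standardized in IAEA TRS-398, gives k_s in closed form. A sketch with illustrative readings, not the paper's measured data:

    ```python
    def ks_two_voltage_continuous(m1, m2, v1, v2):
        """Ion recombination correction k_s by the two-voltage method for a
        continuous beam (TRS-398 form): reading M1 at the normal operating
        voltage V1, reading M2 at the reduced voltage V2."""
        r = (v1 / v2) ** 2
        return (r - 1.0) / (r - m1 / m2)

    # Illustrative (not measured) chamber readings at V1 = 300 V and V2 = 100 V:
    ks = ks_two_voltage_continuous(m1=1.000, m2=0.996, v1=300.0, v2=100.0)
    # a sub-percent correction, the same order of magnitude as in the abstract
    ```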

  20. Simulation of the Mg(Ar) ionization chamber currents by different Monte Carlo codes in benchmark gamma fields

    NASA Astrophysics Data System (ADS)

    Lin, Yi-Chun; Liu, Yuan-Hao; Nievaart, Sander; Chen, Yen-Fu; Wu, Shu-Wei; Chou, Wen-Tsae; Jiang, Shiang-Huei

    2011-10-01

    High-energy photon (over 10 MeV) and neutron beams adopted in radiobiology and radiotherapy always produce mixed neutron/gamma-ray fields. Mg(Ar) ionization chambers are commonly applied to determine the gamma-ray dose because of their neutron-insensitive characteristics. Nowadays, many perturbation corrections for accurate dose estimation, and many treatment planning systems, are based on the Monte Carlo technique. The Monte Carlo codes EGSnrc, FLUKA, GEANT4, MCNP5, and MCNPX were used to evaluate energy-dependent response functions of the Exradin M2 Mg(Ar) ionization chamber to a parallel photon beam with mono-energies from 20 keV to 20 MeV. For validation, measurements were carefully performed in well-defined (a) primary M-100 X-ray calibration field, (b) primary 60Co calibration beam, (c) 6-MV, and (d) 10-MV therapeutic beams in hospital. In the energy region below 100 keV, MCNP5 and MCNPX both had lower responses than the other codes. For energies above 1 MeV, the MCNP ITS mode closely resembled the other three codes and the differences were within 5%. Compared with the measured currents, MCNP5 and MCNPX using the ITS mode agreed very well with the 60Co and 10-MV beams, but in the X-ray energy region the deviations reached 17%. This work gives better insight into the performance of different Monte Carlo codes in photon-electron transport calculations. Regarding applications in mixed-field dosimetry such as BNCT, MCNP with ITS mode is recognized by this work as the most suitable tool.

  1. Emulation of the Active Immune Response in a Computer Network

    DTIC Science & Technology

    2009-01-15

    The Code Red worm propagated faster than the Melissa virus in 1999 and much faster than Morris' worm in 1988. Only fragmentary text survives beyond this sentence, including a citation of Skormin, V. A., et al., "BASIS: A Biological Approach to System Information Security."

  2. Cargo Tank Registration Statistics

    DOT National Transportation Integrated Search

    1980-07-01

    This report is a presentation of the data collected by the Bureau of Motor Carrier Safety of the Federal Highway Administration under Title 49, Code of Federal Regulations, Part 177.824 (f), reporting requirements for MC 330 and MC 331 Cargo Tanks. I...

  3. Fred: a GPU-accelerated fast-Monte Carlo code for rapid treatment plan recalculation in ion beam therapy

    NASA Astrophysics Data System (ADS)

    Schiavi, A.; Senzacqua, M.; Pioli, S.; Mairani, A.; Magro, G.; Molinelli, S.; Ciocca, M.; Battistoni, G.; Patera, V.

    2017-09-01

    Ion beam therapy is a rapidly growing technique for tumor radiation therapy. Ions allow for a high dose deposition in the tumor region, while sparing the surrounding healthy tissue. For this reason, the highest possible accuracy in the calculation of dose and its spatial distribution is required in treatment planning. On one hand, commonly used treatment planning software solutions adopt a simplified beam-body interaction model by remapping pre-calculated dose distributions into a 3D water-equivalent representation of the patient morphology. On the other hand, Monte Carlo (MC) simulations, which explicitly take into account all the details in the interaction of particles with human tissues, are considered to be the most reliable tool to address the complexity of mixed field irradiation in a heterogeneous environment. However, full MC calculations are not routinely used in clinical practice because they typically demand substantial computational resources. Therefore MC simulations are usually only used to check treatment plans for a restricted number of difficult cases. The advent of general-purpose GPU programming prompted the development of trimmed-down MC-based dose engines which can significantly reduce the time needed to recalculate a treatment plan with respect to standard MC codes on CPU hardware. In this work, we report on the development of fred, a new MC simulation platform for treatment planning in ion beam therapy. The code can transport particles through a 3D voxel grid using a class II MC algorithm. Both primary and secondary particles are tracked and their energy deposition is scored along the trajectory. Effective models for particle-medium interaction have been implemented, balancing accuracy in dose deposition with computational cost. Currently, the most refined module is the transport of proton beams in water: single pencil beam dose-depth distributions obtained with fred agree with those produced by standard MC codes within 1-2% of the Bragg peak in the therapeutic energy range. A comparison with measurements taken at the CNAO treatment center shows that the lateral dose tails are reproduced within 2% in the field size factor test up to 20 cm. The tracing kernel can run on GPU hardware, achieving 10 million primaries per second on a single card. This performance allows one to recalculate a proton treatment plan with 1% of the total particles in just a few minutes.
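
    The scoring scheme described, tracking particles through a voxel grid and depositing energy along the trajectory, can be caricatured in a few lines. This toy uses a random per-step energy loss with no physics model at all; it only illustrates voxel scoring and energy bookkeeping, not fred's transport algorithms:

    ```python
    import random

    def transport_1d(n_primaries, e0=100.0, n_vox=50, voxel=1.0, seed=7):
        """Toy transport loop: each primary steps one voxel at a time, losing a
        random amount of energy per step, which is scored in the voxel crossed."""
        rng = random.Random(seed)
        dose = [0.0] * n_vox
        for _ in range(n_primaries):
            e, z = e0, 0.0
            while e > 0.0 and z < n_vox * voxel:
                de = min(e, rng.uniform(1.0, 4.0))   # stochastic step energy loss
                dose[int(z / voxel)] += de
                e -= de
                z += voxel
        return dose

    dose = transport_1d(1000)
    total_scored = sum(dose)   # equals the launched energy if no particle escapes
    ```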

  4. Parallel Grand Canonical Monte Carlo (ParaGrandMC) Simulation Code

    NASA Technical Reports Server (NTRS)

    Yamakov, Vesselin I.

    2016-01-01

    This report provides an overview of the Parallel Grand Canonical Monte Carlo (ParaGrandMC) simulation code. This is a highly scalable parallel FORTRAN code for simulating the thermodynamic evolution of metal alloy systems at the atomic level, and predicting the thermodynamic state, phase diagram, chemical composition and mechanical properties. The code is designed to simulate multi-component alloy systems, predict solid-state phase transformations such as austenite-martensite transformations, precipitate formation, recrystallization, capillary effects at interfaces, surface absorption, etc., which can aid the design of novel metallic alloys. While the software is mainly tailored for modeling metal alloys, it can also be used for other types of solid-state systems, and to some degree for liquid or gaseous systems, including multiphase systems forming solid-liquid-gas interfaces.

  5. Galactic and solar radiation exposure to aircrew during a solar cycle.

    PubMed

    Lewis, B J; Bennett, L G I; Green, A R; McCall, M J; Ellaschuk, B; Butler, A; Pierre, M

    2002-01-01

    An ongoing investigation using a tissue-equivalent proportional counter (TEPC) has been carried out to measure the ambient dose equivalent rate of the cosmic radiation exposure of aircrew during a solar cycle. A semi-empirical model has been derived from these data to allow for the interpolation of the dose rate for any global position. The model has been extended to an altitude of up to 32 km with further measurements made on board aircraft and several balloon flights. The effects of changing solar modulation during the solar cycle are characterised by correlating the dose rate data to different solar potential models. Through integration of the dose-rate function over a great circle flight path or between given waypoints, a Predictive Code for Aircrew Radiation Exposure (PCAIRE) has been further developed for estimation of the route dose from galactic cosmic radiation exposure. This estimate is provided in units of ambient dose equivalent as well as effective dose, based on E/H*(10) scaling functions as determined from transport code calculations with LUIN and FLUKA. This experimentally based treatment has also been compared with the CARI-6 and EPCARD codes, which are derived solely from theoretical transport calculations. Using TEPC measurements taken aboard the International Space Station, ground-based neutron monitoring, GOES satellite data and transport code analysis, an empirical model has been further proposed for estimation of aircrew exposure during solar particle events. This model has been compared to results obtained during recent solar flare events.
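The route-dose estimate described above is an integral of a dose-rate function between waypoints; a minimal sketch (the dose-rate function and its constants are illustrative stand-ins for the semi-empirical model, not PCAIRE's actual code):

```python
# Sketch of the route-dose idea: integrate an altitude- and cutoff-dependent
# dose-rate function over the flight profile with the trapezoidal rule.
# `dose_rate_uSv_per_h` is a hypothetical toy model, not the PCAIRE fit.

def dose_rate_uSv_per_h(altitude_km, cutoff_gv):
    # Toy model: rate grows with altitude, falls with geomagnetic cutoff.
    return 0.5 * altitude_km / (1.0 + 0.1 * cutoff_gv)

def route_dose_uSv(waypoints):
    """Trapezoidal integration over (time_h, altitude_km, cutoff_gv) waypoints."""
    dose = 0.0
    for (t0, a0, c0), (t1, a1, c1) in zip(waypoints, waypoints[1:]):
        r0 = dose_rate_uSv_per_h(a0, c0)
        r1 = dose_rate_uSv_per_h(a1, c1)
        dose += 0.5 * (r0 + r1) * (t1 - t0)
    return dose

# Constant cruise at 10 km for 8 h at a 5 GV cutoff:
print(route_dose_uSv([(0, 10, 5), (8, 10, 5)]))
```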

  6. Investigation of HZETRN 2010 as a Tool for Single Event Effect Qualification of Avionics Systems - Part II

    NASA Technical Reports Server (NTRS)

    Rojdev, Kristina; Koontz, Steve; Reddell, Brandon; Atwell, William; Boeder, Paul

    2015-01-01

    An accurate prediction of spacecraft avionics single event effect (SEE) radiation susceptibility is key to ensuring a safe and reliable vehicle. This is particularly important for long-duration deep space missions for human exploration, where there is little or no chance for a quick emergency return to Earth. Monte Carlo nuclear reaction and transport codes such as FLUKA can be used to generate very accurate models of the expected in-flight radiation environment for SEE analyses. A major downside to using a Monte Carlo-based code is that the run times can be very long (on the order of days). A more popular choice for SEE calculations is the CREME96 deterministic code, which offers significantly shorter run times (on the order of seconds). However, CREME96, though fast and easy to use, has not been updated in several years and underestimates secondary particle shower effects in spacecraft structural shielding mass. Another modeling option to consider is the deterministic code HZETRN 2010, which includes updates to address secondary particle shower effects more accurately. This paper builds on previous work by Rojdev et al. to compare the use of HZETRN 2010 against CREME96 as a tool to verify spacecraft avionics system reliability in a space flight SEE environment. This paper will discuss modifications made to HZETRN 2010 to improve its performance for calculating SEE rates and compare results with both in-flight SEE rates and other calculation methods.

  7. Integration of OpenMC methods into MAMMOTH and Serpent

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kerby, Leslie; DeHart, Mark; Tumulak, Aaron

    OpenMC, a Monte Carlo particle transport simulation code focused on neutron criticality calculations, contains several methods we wish to emulate in MAMMOTH and Serpent. First, research coupling OpenMC and the Multiphysics Object-Oriented Simulation Environment (MOOSE) has shown promising results. Second, the utilization of Functional Expansion Tallies (FETs) allows for a more efficient passing of multiphysics data between OpenMC and MOOSE. Both of these capabilities have been preliminarily implemented into Serpent. Results are discussed and future work recommended.
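The Functional Expansion Tallies mentioned above can be sketched in a few lines: Monte Carlo samples score Legendre polynomials, and their averages give the expansion coefficients of the underlying distribution (a generic illustration of the technique, not OpenMC's implementation):

```python
# Sketch of a Functional Expansion Tally (FET): for samples drawn from a
# density f on [-1, 1], the Legendre coefficients are
#   c_n = (2n+1)/2 * E[P_n(x)],
# so each history simply scores P_n(x) and the tally is the running mean.

import random

def legendre(n, x):
    """Legendre polynomial P_n(x) via the Bonnet recurrence."""
    p0, p1 = 1.0, x
    if n == 0:
        return p0
    for k in range(2, n + 1):
        p0, p1 = p1, ((2 * k - 1) * x * p1 - (k - 1) * p0) / k
    return p1

def fet_coefficients(samples, order):
    """Estimate Legendre expansion coefficients of a density from MC samples."""
    n_s = len(samples)
    return [
        (2 * n + 1) / 2.0 * sum(legendre(n, x) for x in samples) / n_s
        for n in range(order + 1)
    ]

random.seed(1)
xs = [random.uniform(-1, 1) for _ in range(20000)]
coeffs = fet_coefficients(xs, 3)
print(coeffs)  # for a uniform density, c_0 is 0.5 and the rest tend to 0
```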

  8. ICF target 2D modeling using Monte Carlo SNB electron thermal transport in DRACO

    NASA Astrophysics Data System (ADS)

    Chenhall, Jeffrey; Cao, Duc; Moses, Gregory

    2016-10-01

    The iSNB (implicit Schurtz-Nicolai-Busquet) multigroup diffusion electron thermal transport method is adapted into a Monte Carlo (MC) transport method to better model angular and long mean free path non-local effects. The MC model was first implemented in the 1D LILAC code to verify consistency with the iSNB model. Implementation of the MC SNB model in the 2D DRACO code enables higher fidelity non-local thermal transport modeling in 2D implosions such as polar drive experiments on NIF. The final step is to optimize the MC model by hybridizing it with a MC version of the iSNB diffusion method. The hybrid method will combine the efficiency of a diffusion method in intermediate mean free path regions with the accuracy of a transport method in long mean free path regions, allowing for improved computational efficiency while maintaining accuracy. Work to date on the method will be presented. This work was supported by Sandia National Laboratories and the Univ. of Rochester Laboratory for Laser Energetics.

  9. 76 FR 59768 - Office of Commercial Space Transportation (AST); Notice of Availability and Request for Comment...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-27

    ... with the National Environmental Policy Act (NEPA) of 1969, 42 United States Code 4321-4347 (as amended... regular business hours at the following location: McGinley Memorial Library, 317 Main Street, McGregor, TX...

  10. Shielding and activation calculations around the reactor core for the MYRRHA ADS design

    NASA Astrophysics Data System (ADS)

    Ferrari, Anna; Mueller, Stefan; Konheiser, J.; Castelliti, D.; Sarotto, M.; Stankovskiy, A.

    2017-09-01

    In the frame of the FP7 European project MAXSIMA, an extensive simulation study has been done to assess the main shielding problems in view of the construction of the MYRRHA accelerator-driven system at SCK·CEN in Mol (Belgium). An innovative method based on the combined use of the two state-of-the-art Monte Carlo codes MCNPX and FLUKA has been used, with the goal of characterizing complex, realistic neutron fields around the core barrel, to be used as source terms in detailed analyses of the radiation fields due to the system in operation, and of the coupled residual radiation. The main results of the shielding analysis are presented, as well as the construction of an activation database of all the key structural materials. The results demonstrated a powerful way to analyse shielding and activation problems, with direct and clear implications for the design solutions.

  11. Investigation on demagnetization of Nd2Fe14B permanent magnets induced by irradiation

    NASA Astrophysics Data System (ADS)

    Li, Zhefu; Jia, Yanyan; Liu, Renduo; Xu, Yuhai; Wang, Guanghong; Xia, Xiaobin

    2017-12-01

    Nd2Fe14B is an important component of insertion devices, which are used in synchrotron radiation sources, and can be demagnetized by irradiation. In the present study, the Monte Carlo code FLUKA was used to analyze the irradiation field of Nd2Fe14B, and it was confirmed that neutrons are the main demagnetizing particles. Nd2Fe14B permanent magnet samples were irradiated by Ar ions at different doses to simulate neutron irradiation damage. The hysteresis loops were measured using a vibrating sample magnetometer, the microstructure evolution was characterized by transmission electron microscopy, and the relationship between the two was discussed. The results indicate that the decrease in saturated magnetization is caused by the changes in microstructure: at the microscopic level, the evolution of single crystals into an amorphous structure is the reason for the demagnetization of Nd2Fe14B permanent magnets.

  12. Secondary neutrons as the main source of neutron-rich fission products in the bombardment of a thick U target by 1 GeV protons

    NASA Astrophysics Data System (ADS)

    Barzakh, A. E.; Lhersonneau, G.; Batist, L. Kh.; Fedorov, D. V.; Ivanov, V. S.; Mezilev, K. A.; Molkanov, P. L.; Moroz, F. V.; Orlov, S. Yu.; Panteleev, V. N.; Volkov, Yu. M.; Alyakrinskiy, O.; Barbui, M.; Stroe, L.; Tecchio, L. B.

    2011-05-01

    The diffusion-effusion model has been used to analyse the release and yields of Fr and Cs isotopes from uranium carbide targets of very different thicknesses (6.3 and 148 g/cm2) bombarded by a 1 GeV proton beam. Release curves of several isotopes of the same element and production efficiency versus decay half-life are well fitted with the same set of parameters. Comparison of efficiencies for neutron-rich and neutron-deficient Cs isotopes enables separation of the contributions from the primary (p + 238U) and secondary (n + 238U) reactions to the production of neutron-rich Cs isotopes. A rather simple calculation of the neutron contribution describes these data fairly well. The FLUKA code describes the primary and secondary-reaction contributions to the Cs isotope production efficiencies for different targets quite well.

  13. MCNPX simulation of proton dose distribution in homogeneous and CT phantoms

    NASA Astrophysics Data System (ADS)

    Lee, C. C.; Lee, Y. J.; Tung, C. J.; Cheng, H. W.; Chao, T. C.

    2014-02-01

    A dose simulation system was constructed based on the MCNPX Monte Carlo package to simulate proton dose distribution in homogeneous and CT phantoms. Conversion from the Hounsfield units of a patient CT image set to the material information necessary for Monte Carlo simulation is based on Schneider's approach. In order to validate this simulation system, an inter-comparison of depth dose distributions obtained from the MCNPX, GEANT4 and FLUKA codes was performed for a 160 MeV monoenergetic proton beam incident normally on the surface of a homogeneous water phantom. For dose validation within the CT phantom, direct comparison with measurement is infeasible. Instead, this study indirectly compared the 50% ranges (R50%) along the central axis obtained by our system to the NIST CSDA ranges for beams with 160 and 115 MeV energies. The comparison within the homogeneous phantom shows good agreement: differences in simulated R50% among the three codes are less than 1 mm. Within the CT phantom, the MCNPX-simulated water-equivalent ranges Req,50% are compatible with the CSDA water-equivalent ranges from the NIST database, with differences of 0.7 and 4.1 mm for 160 and 115 MeV beams, respectively.
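A Schneider-style Hounsfield-unit conversion amounts to binning HU values into material classes, each with a density ramp; a toy sketch (the bin edges and the density ramp below are placeholders, not the published calibration):

```python
# Illustrative sketch of HU-to-material conversion for Monte Carlo input.
# Bin boundaries and the linear density ramp are hypothetical placeholders.

HU_BINS = [
    (-1000, -100, "air/lung"),
    (-100,   100, "soft tissue"),
    ( 100,  1600, "bone"),
]

def hu_to_material(hu):
    """Map a Hounsfield unit to a (placeholder) material class."""
    for lo, hi, name in HU_BINS:
        if lo <= hu < hi:
            return name
    return "metal/other"

def hu_to_density_g_cm3(hu):
    # Simple linear ramp around water (HU = 0 -> 1.0 g/cm3); placeholder slope.
    return max(0.001, 1.0 + hu / 1000.0)

print(hu_to_material(30), round(hu_to_density_g_cm3(30), 3))
```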

  14. Monte Carlo simulations for angular and spatial distributions in therapeutic-energy proton beams

    NASA Astrophysics Data System (ADS)

    Lin, Yi-Chun; Pan, C. Y.; Chiang, K. J.; Yuan, M. C.; Chu, C. H.; Tsai, Y. W.; Teng, P. K.; Lin, C. H.; Chao, T. C.; Lee, C. C.; Tung, C. J.; Chen, A. E.

    2017-11-01

    The purpose of this study is to compare the angular and spatial distributions of therapeutic-energy proton beams obtained from the FLUKA, GEANT4 and MCNP6 Monte Carlo codes. The Monte Carlo simulations of proton beams passing through two thin targets and a water phantom were investigated to compare the primary and secondary proton fluence distributions and dosimetric differences among these codes. The angular fluence distributions, central axis depth-dose profiles, and lateral distributions of the Bragg peak cross-field were calculated to compare the proton angular and spatial distributions and energy deposition. Benchmark verifications from three different Monte Carlo simulations could be used to evaluate the residual proton fluence for the mean range and to estimate the depth and lateral dose distributions and the characteristic depths and lengths along the central axis as the physical indices corresponding to the evaluation of treatment effectiveness. The results showed a general agreement among codes, except that some deviations were found in the penumbra region. These calculated results are also particularly helpful for understanding primary and secondary proton components for stray radiation calculation and reference proton standard determination, as well as for determining lateral dose distribution performance in proton small-field dosimetry. By demonstrating these calculations, this work could serve as a guide to the recent field of Monte Carlo methods for therapeutic-energy protons.

  15. PNPN Latchup in Bipolar LSI Devices.

    DTIC Science & Technology

    1982-01-01


  16. McCarthyism's Rhetorical Norms.

    ERIC Educational Resources Information Center

    Townsend, Rebecca M.

    Rhetorical norms of early McCarthyist discourse reveal a reliance upon images of chaos and the body. Through such metaphors, rhetors crafted a model of discussion that feminized "democracy" and "tolerance" to support anti-Communist measures and de-legitimize their opponents. Political variety was coded as deviant to national…

  17. Near-lossless multichannel EEG compression based on matrix and tensor decompositions.

    PubMed

    Dauwels, Justin; Srinivasan, K; Reddy, M Ramasubba; Cichocki, Andrzej

    2013-05-01

    A novel near-lossless compression algorithm for multichannel electroencephalogram (MC-EEG) is proposed based on matrix/tensor decomposition models. MC-EEG is represented in suitable multiway (multidimensional) forms to efficiently exploit temporal and spatial correlations simultaneously. Several matrix/tensor decomposition models are analyzed in view of efficient decorrelation of the multiway forms of MC-EEG. A compression algorithm is built based on the principle of “lossy plus residual coding,” consisting of a matrix/tensor decomposition-based coder in the lossy layer followed by arithmetic coding in the residual layer. This approach guarantees a specifiable maximum absolute error between original and reconstructed signals. The compression algorithm is applied to three different scalp EEG datasets and an intracranial EEG dataset, each with different sampling rate and resolution. The proposed algorithm achieves attractive compression ratios compared to compressing individual channels separately. For similar compression ratios, the proposed algorithm achieves nearly fivefold lower average error compared to a similar wavelet-based volumetric MC-EEG compression algorithm.
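The "lossy plus residual coding" principle can be illustrated independently of the tensor models: whatever the lossy layer produces, uniformly quantizing the residual bounds the maximum absolute reconstruction error by half the quantization step (a minimal sketch, not the authors' coder):

```python
# Minimal sketch of lossy-plus-residual coding with a guaranteed max error.
# The stand-in lossy layer here is just the signal mean; the real coder uses
# matrix/tensor decompositions, but the error guarantee works the same way.

def lossy_approx(signal):
    m = sum(signal) / len(signal)
    return [m] * len(signal)

def encode(signal, step):
    approx = lossy_approx(signal)
    residual_q = [round((s - a) / step) for s, a in zip(signal, approx)]
    return approx, residual_q

def decode(approx, residual_q, step):
    return [a + q * step for a, q in zip(approx, residual_q)]

x = [1.0, 2.5, -0.3, 4.2]
approx, rq = encode(x, step=0.1)
xhat = decode(approx, rq, step=0.1)
max_err = max(abs(a - b) for a, b in zip(x, xhat))
print(max_err)  # never exceeds step / 2
```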

  18. Parallelization of a Monte Carlo particle transport simulation code

    NASA Astrophysics Data System (ADS)

    Hadjidoukas, P.; Bousis, C.; Emfietzoglou, D.

    2010-05-01

    We have developed a high performance version of the Monte Carlo particle transport simulation code MC4. The original application code, developed in Visual Basic for Applications (VBA) for Microsoft Excel, was first rewritten in the C programming language to improve code portability. Several pseudo-random number generators have also been integrated and studied. The new MC4 version was then parallelized for shared- and distributed-memory multiprocessor systems using the Message Passing Interface. Two parallel pseudo-random number generator libraries (SPRNG and DCMT) have been seamlessly integrated. The performance speedup of parallel MC4 has been studied on a variety of parallel computing architectures, including an Intel Xeon server with 4 dual-core processors, a Sun cluster consisting of 16 nodes of 2 dual-core AMD Opteron processors, and a 200 dual-processor HP cluster. For large problem sizes, limited only by the physical memory of the multiprocessor server, the speedup results are almost linear on all systems. We have validated the parallel implementation against the serial VBA and C implementations using the same random number generator. Our experimental results on the transport and energy loss of electrons in a water medium show that the serial and parallel codes are equivalent in accuracy. The present improvements allow for the study of higher particle energies with more accurate physical models, and improve statistics, since more particle tracks can be simulated in a short response time.
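The parallelization strategy rests on giving each worker an independent random stream and reducing the partial tallies at the end; a conceptual single-process sketch (distinct seeds stand in for the SPRNG/DCMT stream libraries, and a pi estimate stands in for the transport tally):

```python
# Conceptual sketch of trivially parallel MC with per-rank random streams.
# Each "rank" owns its own generator, so tallies are reproducible and
# independent of how many workers run concurrently.

import random

def worker_tally(rank, n_samples):
    rng = random.Random(12345 + rank)   # independent stream per rank
    inside = sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0
                 for _ in range(n_samples))
    return inside

def parallel_pi(n_ranks, n_per_rank):
    # The sum over ranks plays the role of the MPI reduction step.
    total = sum(worker_tally(r, n_per_rank) for r in range(n_ranks))
    return 4.0 * total / (n_ranks * n_per_rank)

print(parallel_pi(4, 50000))  # approaches pi as the sample count grows
```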

  19. Coding for stable transmission of W-band radio-over-fiber system using direct-beating of two independent lasers.

    PubMed

    Yang, L G; Sung, J Y; Chow, C W; Yeh, C H; Cheng, K T; Shi, J W; Pan, C L

    2014-10-20

    We experimentally demonstrate a Manchester (MC) coding based W-band (75-110 GHz) radio-over-fiber (ROF) system that uses spectral shaping to reduce the low-frequency-component (LFC) signal distortion generated by two independent low-cost lasers. Hence, a low-cost, higher-performance W-band ROF system is achieved. In this system, direct beating of two independent low-cost CW lasers without a frequency tracking circuit (FTC) is used to generate the millimeter-wave. Techniques such as delayed self-heterodyne interferometry and heterodyne beating are used to characterize the optical-beating-interference sub-terahertz signal (OBIS). Furthermore, W-band ROF systems using MC coding and NRZ-OOK are compared and discussed.
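Manchester coding itself is easy to sketch: each bit becomes a transition, so every bit period averages to the same level and the low-frequency content of the line signal is suppressed (the IEEE 802.3 polarity convention is assumed here):

```python
# Minimal Manchester encoder (IEEE 802.3 convention: 0 -> high-low chips,
# 1 -> low-high chips). Each bit period carries exactly one 0 and one 1,
# which is why the coding removes DC and low-frequency components.

def manchester_encode(bits):
    out = []
    for b in bits:
        out.extend([0, 1] if b else [1, 0])
    return out

print(manchester_encode([1, 0, 1, 1]))  # [0, 1, 1, 0, 0, 1, 0, 1]
```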

  20. Next-generation sequencing of the monogenic obesity genes LEP, LEPR, MC4R, PCSK1 and POMC in a Norwegian cohort of patients with morbid obesity and normal weight controls.

    PubMed

    Nordang, Gry B N; Busk, Øyvind L; Tveten, Kristian; Hanevik, Hans Ivar; Fell, Anne Kristin M; Hjelmesæth, Jøran; Holla, Øystein L; Hertel, Jens K

    2017-05-01

    Rare sequence variants in at least five genes are known to cause monogenic obesity. In this study we aimed to investigate the prevalence of, and characterize, rare coding and splice site variants in LEP, LEPR, MC4R, PCSK1 and POMC in patients with morbid obesity and normal weight controls. Targeted next-generation sequencing of all exons in LEP, LEPR, MC4R, PCSK1 and POMC was performed in 485 patients with morbid obesity and 327 normal weight population-based controls from Norway. In total 151 variants were detected. Twenty-eight (18.5%) of these were rare, coding or splice variants and five (3.3%) were novel. All individuals, except one control, were heterozygous for the 28 variants, and the distribution of the rare variants showed a significantly higher carrier frequency among cases than controls (9.9% vs. 4.9%, p=0.011). Four variants in MC4R were classified as pathogenic or likely pathogenic. Four cases (0.8%) of monogenic obesity were detected, all due to MC4R variants previously linked to monogenic obesity. Significant differences in carrier frequencies among patients with morbid obesity and normal weight controls suggest an association between heterozygous rare coding variants in these five genes and morbid obesity. However, additional studies in larger cohorts and functional testing of the novel variants identified are required to confirm the findings. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
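The reported carrier-frequency difference (9.9% vs. 4.9%, p = 0.011) corresponds to a two-proportion comparison; a minimal z-test sketch (the counts 48/485 and 16/327 are reconstructed from the quoted percentages, so approximate, and the authors' exact test choice may differ):

```python
# Pooled two-proportion z-test; two-sided p-value via the normal tail
# (erfc). Counts below are inferred from the reported percentages.

import math

def two_proportion_p(x1, n1, x2, n2):
    """Two-sided p-value of a pooled two-proportion z-test."""
    p1, p2 = x1 / n1, x2 / n2
    pool = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pool * (1 - pool) * (1 / n1 + 1 / n2))
    z = abs(p1 - p2) / se
    return math.erfc(z / math.sqrt(2))

print(round(two_proportion_p(48, 485, 16, 327), 4))  # ~0.01, same order as reported
```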

  1. Diagnosing Undersampling in Monte Carlo Eigenvalue and Flux Tally Estimates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perfetti, Christopher M; Rearden, Bradley T

    2015-01-01

    This study explored the impact of undersampling on the accuracy of tally estimates in Monte Carlo (MC) calculations. Steady-state MC simulations were performed for models of several critical systems with varying degrees of spatial and isotopic complexity, and the impact of undersampling on eigenvalue and fuel pin flux/fission estimates was examined. This study observed biases in MC eigenvalue estimates as large as several percent and biases in fuel pin flux/fission tally estimates that exceeded tens, and in some cases hundreds, of percent. This study also investigated five statistical metrics for predicting the occurrence of undersampling biases in MC simulations. Three of the metrics (the Heidelberger-Welch RHW, the Geweke Z-Score, and the Gelman-Rubin diagnostics) are commonly used for diagnosing the convergence of Markov chains, and two of the methods (the Contributing Particles per Generation and Tally Entropy) are new convergence metrics developed in the course of this study. These metrics were implemented in the KENO MC code within the SCALE code system and were evaluated for their reliability at predicting the onset and magnitude of undersampling biases in MC eigenvalue and flux tally estimates in two of the critical models. Of the five methods investigated, the Heidelberger-Welch RHW, the Gelman-Rubin diagnostics, and Tally Entropy produced test metrics that correlated strongly to the size of the observed undersampling biases, indicating their potential to effectively predict the size and prevalence of undersampling biases in MC simulations.
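Of the metrics named above, the Gelman-Rubin diagnostic is the easiest to sketch: it compares between-chain and within-chain variance, with values near 1 suggesting convergence (a generic textbook implementation, not the KENO/SCALE one):

```python
# Gelman-Rubin potential scale reduction factor for m chains of n samples.
# R-hat near 1 indicates the chains (or tally batches) sample the same
# distribution; large R-hat flags non-convergence/undersampling.

import random

def gelman_rubin(chains):
    m = len(chains)              # number of chains
    n = len(chains[0])           # samples per chain
    means = [sum(c) / n for c in chains]
    grand = sum(means) / m
    b = n / (m - 1) * sum((mu - grand) ** 2 for mu in means)   # between-chain
    w = sum(sum((x - mu) ** 2 for x in c) / (n - 1)
            for c, mu in zip(chains, means)) / m               # within-chain
    var_hat = (n - 1) / n * w + b / n
    return (var_hat / w) ** 0.5

random.seed(0)
chains = [[random.gauss(0, 1) for _ in range(1000)] for _ in range(4)]
print(gelman_rubin(chains))  # close to 1 for well-mixed chains
```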

  2. MO-E-18C-02: Hands-On Monte Carlo Project Assignment as a Method to Teach Radiation Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pater, P; Vallieres, M; Seuntjens, J

    2014-06-15

    Purpose: To present a hands-on project on Monte Carlo methods (MC) recently added to the curriculum and to discuss the students' appreciation. Methods: Since 2012, a 1.5 hour lecture dedicated to MC fundamentals follows the detailed presentation of photon and electron interactions. Students also program all sampling steps (interaction length and type, scattering angle, energy deposit) of a MC photon transport code. A handout structured in a step-by-step fashion guides students in conducting consistency checks. For extra points, students can code a fully working MC simulation that simulates a dose distribution for 50 keV photons. A kerma approximation to dose deposition is assumed. A survey was conducted to which 10 out of the 14 attending students responded. It compared MC knowledge prior to and after the project, questioned the usefulness of radiation physics teaching through MC, and surveyed possible project improvements. Results: According to the survey, 76% of students had no or a basic knowledge of MC methods before the class and 65% estimate to have a good to very good understanding of MC methods after attending the class. 80% of students feel that the MC project helped them significantly to understand simulations of dose distributions. On average, students dedicated 12.5 hours to the project and appreciated the balance between hand-holding and questions/implications. Conclusion: A lecture on MC methods with a hands-on MC programming project requiring about 14 hours was added to the graduate study curriculum in 2012. MC methods produce “gold standard” dose distributions and are slowly entering routine clinical work, so a fundamental understanding of MC methods should be a requirement for future students. Overall, the lecture and project helped students relate cross-sections to dose deposition and presented the numerical sampling methods behind the simulation of these dose distributions. Research funding from governments of Canada and Quebec. PP acknowledges partial support by the CREATE Medical Physics Research Training Network grant of the Natural Sciences and Engineering Research Council (Grant number: 432290)
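One of the sampling steps students implement, the free path to the next interaction, inverts the exponential attenuation CDF; a minimal sketch (the attenuation coefficient below is an illustrative value, not course material):

```python
# Sampling the photon free path: for total attenuation coefficient mu, the
# path length follows an exponential law, sampled by CDF inversion:
#   l = -ln(u) / mu, with u uniform in (0, 1).

import math
import random

def sample_free_path(mu_total_cm1, rng):
    """Distance (cm) to the next interaction for attenuation coefficient mu (1/cm)."""
    return -math.log(rng.random()) / mu_total_cm1

rng = random.Random(42)
mu = 0.2  # 1/cm, an illustrative value
paths = [sample_free_path(mu, rng) for _ in range(100000)]
print(sum(paths) / len(paths))  # approaches the mean free path 1/mu = 5 cm
```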

  3. McStas event logger: Definition and applications

    NASA Astrophysics Data System (ADS)

    Bergbäck Knudsen, Erik; Bryndt Klinkby, Esben; Kjær Willendrup, Peter

    2014-02-01

    Functionality is added to the McStas neutron ray-tracing code which allows individual neutron states before and after a scattering to be temporarily stored and analysed. This logging mechanism has multiple uses, including studies of longitudinal intensity loss in neutron guides and guide coating design optimisations. Furthermore, the logging method enables the cold/thermal neutron induced gamma background along the guide to be calculated from the un-reflected neutrons, using a recently developed MCNPX-McStas interface.

  4. PRELIMINARY COUPLING OF THE MONTE CARLO CODE OPENMC AND THE MULTIPHYSICS OBJECT-ORIENTED SIMULATION ENVIRONMENT (MOOSE) FOR ANALYZING DOPPLER FEEDBACK IN MONTE CARLO SIMULATIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matthew Ellis; Derek Gaston; Benoit Forget

    In recent years the use of Monte Carlo methods for modeling reactors has become feasible due to the increasing availability of massively parallel computer systems. One of the primary challenges yet to be fully resolved, however, is the efficient and accurate inclusion of multiphysics feedback in Monte Carlo simulations. The research in this paper presents a preliminary coupling of the open source Monte Carlo code OpenMC with the open source Multiphysics Object-Oriented Simulation Environment (MOOSE). The coupling of OpenMC and MOOSE will be used to investigate efficient and accurate numerical methods needed to include multiphysics feedback in Monte Carlo codes. An investigation into the sensitivity of Doppler feedback to fuel temperature approximations using a two-dimensional 17x17 PWR fuel assembly is presented in this paper. The results show a functioning multiphysics coupling between OpenMC and MOOSE. The coupling utilizes Functional Expansion Tallies to accurately and efficiently transfer pin power distributions tallied in OpenMC to the unstructured finite element meshes used in MOOSE. The two-dimensional PWR fuel assembly case also demonstrates that, for a simplified model, the pin-by-pin Doppler feedback can be adequately replicated by scaling a representative pin based on pin relative powers.

  5. Detection of genetic diversity and selection at the coding region of the melanocortin receptor 1 (MC1R) gene in Tibetan pigs and Landrace pigs.

    PubMed

    Liu, Rui; Jin, Long; Long, Keren; Chai, Jie; Ma, Jideng; Tang, Qianzi; Tian, Shilin; Hu, Yaodong; Lin, Ling; Wang, Xun; Jiang, Anan; Li, Xuewei; Li, Mingzhou

    2016-01-10

    Domestication and subsequent selective pressures have produced a large variety of pig coat colors in different regions and breeds. The melanocortin 1 receptor (MC1R) gene plays a crucial role in determining coat color in mammals. Here, we investigated genetic diversity and selection at the coding region of the porcine MC1R in Tibetan pigs and Landrace pigs. Genetic variability was much lower in Landrace pigs than in Tibetan pigs. Meanwhile, haplotype analysis showed that Tibetan pigs possessed shared haplotypes, suggesting a possible recent introgression event through crossbreeding with neighboring domestic pigs, or shared ancestral polymorphism. Additionally, we detected positive selection at the MC1R locus in both Tibetan pigs and Landrace pigs through dN/dS analysis. These findings suggest that a novel phenotypic change (dark coat color) caused by novel mutations may help Tibetan pigs withstand intense solar ultraviolet (UV) radiation and provide camouflage in the wild, whereas the white coat color of Landrace pigs was intentionally selected by humans after domestication. Furthermore, both the phylogenetic analysis and the network analysis provided clues that MC1R in Asian and European wild boars may have initially experienced different selective pressures, and that MC1R alleles diversified in modern domesticated pigs. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. Accuracy and convergence of coupled finite-volume/Monte Carlo codes for plasma edge simulations of nuclear fusion reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghoos, K., E-mail: kristel.ghoos@kuleuven.be; Dekeyser, W.; Samaey, G.

    2016-10-01

    The plasma and neutral transport in the plasma edge of a nuclear fusion reactor is usually simulated using coupled finite volume (FV)/Monte Carlo (MC) codes. However, under conditions of future reactors like ITER and DEMO, convergence issues become apparent. This paper examines the convergence behaviour and the numerical error contributions with a simplified FV/MC model for three coupling techniques: Correlated Sampling, Random Noise and Robbins Monro. Also, practical procedures to estimate the errors in complex codes are proposed. Moreover, first results with more complex models show that an order of magnitude speedup can be achieved without any loss in accuracy by making use of averaging in the Random Noise coupling technique.
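The Robbins-Monro technique named above can be sketched generically: a root of an expectation is found from noisy evaluations using a decaying step size (a toy stochastic-approximation example, not the coupled FV/MC code):

```python
# Robbins-Monro iteration: solve E[f(x)] = 0 from noisy evaluations of f,
# with step size a_k = a / k. The decaying steps average out the noise.

import random

def robbins_monro(noisy_f, x0, n_iter, a=1.0):
    x = x0
    for k in range(1, n_iter + 1):
        x = x - (a / k) * noisy_f(x)
    return x

rng = random.Random(7)
# Noisy observations of f(x) = x - 3 (root at x = 3), noise std 0.5:
noisy = lambda x: (x - 3.0) + rng.gauss(0.0, 0.5)
print(robbins_monro(noisy, x0=0.0, n_iter=20000))  # converges toward 3
```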

  7. The change of radial power factor distribution due to RCCA insertion at the first cycle core of AP1000

    NASA Astrophysics Data System (ADS)

    Susilo, J.; Suparlina, L.; Deswandri; Sunaryo, G. R.

    2018-02-01

    The use of computer programs for the analysis of PWR-type core neutronic design parameters has been reported in several previous studies, including validation of the codes against neutronic parameter values obtained from measurements and benchmark calculations. In this study, validation and analysis of the AP1000 first cycle core radial power peaking factors were performed using the CITATION module of the SRAC2006 computer code. The code was also validated, with good results, against the criticality values of the VERA benchmark core. The AP1000 core power distribution calculation was done in two-dimensional X-Y geometry through quarter-core modeling. The purpose of this research is to determine the accuracy of the SRAC2006 code and to assess the safety performance of the AP1000 core during its first operating cycle. The core calculations were carried out for several conditions: without any Rod Cluster Control Assembly (RCCA), with a single RCCA inserted (AO, M1, M2, MA, MB, MC, MD), and with multiple RCCAs inserted (MA + MB, MA + MB + MC, MA + MB + MC + MD, and MA + MB + MC + MD + M1). The maximum rod power factor within a fuel assembly was assumed to be approximately 1.406. The analysis of the calculation results showed that the two-dimensional CITATION module of the SRAC2006 code is accurate for AP1000 power distribution calculations without RCCA and with MA + MB RCCA insertion. The power peaking factors for the first operating cycle of the AP1000 core without RCCA, as well as with single and multiple RCCA insertions, remain below the safety limit (less than about 1.798). In terms of the thermal power generated by the fuel assemblies, the AP1000 core can therefore be considered safe during its first operating cycle.
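The radial power peaking factor being compared with the limit is simply the maximum rod (or assembly) power divided by the average power; a toy illustration with hypothetical relative powers:

```python
# Power peaking factor = maximum relative power / core-average relative power.
# The values below are hypothetical, for illustration only.

def peaking_factor(powers):
    return max(powers) / (sum(powers) / len(powers))

powers = [0.92, 1.05, 1.18, 0.85, 1.31, 0.99, 1.10, 1.02]
f = peaking_factor(powers)
print(round(f, 3))
# A value below the quoted limit (about 1.798) indicates thermal margin.
```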

  8. A FLUKA simulation of the KLOE electromagnetic calorimeter

    NASA Astrophysics Data System (ADS)

    Di Micco, B.; Branchini, P.; Ferrari, A.; Loffredo, S.; Passeri, A.; Patera, V.

    2007-10-01

    We present the simulation of the KLOE calorimeter with the FLUKA Monte Carlo program. The response of the detector to electromagnetic showers has been studied and compared with the publicly available KLOE data. The energy and time resolution of the electromagnetic clusters are in good agreement with the data. The simulation has also been used to study a possible improvement of the KLOE calorimeter using multianode photo-multipliers. A HAMAMATSU R7600-M16 photomultiplier has been assembled in order to determine the full cross-talk matrix, which has been included in the simulation. The cross-talk matrix takes into account the effects of realistic photo-multiplier electronics and of its coupling to the active material. The performance of the modified readout has been compared to the usual KLOE configuration.
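
    A cross-talk matrix of the kind measured for the multianode device acts linearly on the true per-channel signals. A toy 3-channel sketch, with invented matrix entries rather than the measured 16-anode matrix:

```python
# Toy 3-channel cross-talk matrix C (rows sum to 1); the measured
# R7600-M16 matrix is 16x16 and its entries are not reproduced here.
crosstalk = [
    [0.90, 0.05, 0.05],
    [0.04, 0.92, 0.04],
    [0.05, 0.05, 0.90],
]
q_true = [100.0, 0.0, 0.0]  # light collected on the first anode only

# Measured signals are the linear mix q_meas = C @ q_true
q_meas = [sum(c * q for c, q in zip(row, q_true)) for row in crosstalk]
print(q_meas)  # part of the charge leaks into the neighbouring channels
```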

  9. New developments in the McStas neutron instrument simulation package

    NASA Astrophysics Data System (ADS)

    Willendrup, P. K.; Knudsen, E. B.; Klinkby, E.; Nielsen, T.; Farhi, E.; Filges, U.; Lefmann, K.

    2014-07-01

    The McStas neutron ray-tracing software package is a versatile tool for building accurate simulators of neutron scattering instruments at reactors and at short- and long-pulsed spallation sources such as the European Spallation Source. McStas is extensively used for the design and optimization of instruments, virtual experiments, data analysis and user training. McStas was founded as a scientific, open-source collaborative code in 1997. This contribution presents the project in its current state and gives an overview of the main new developments in McStas 2.0 (December 2012) and McStas 2.1 (expected fall 2013), including many new components, component parameter uniformisation, partial loss of backward compatibility, updated source brilliance descriptions, developments toward new tools and user interfaces, web interfaces and a new method for estimating beam losses and background from neutron optics.

  10. New estimation method of neutron skyshine for a high-energy particle accelerator

    NASA Astrophysics Data System (ADS)

    Oh, Joo-Hee; Jung, Nam-Suk; Lee, Hee-Seock; Ko, Seung-Kook

    2016-09-01

    Skyshine is the dominant component of the prompt radiation dose at off-site locations. Several experimental studies have been carried out to estimate the neutron skyshine at accelerator facilities. In this work, the neutron transport from the source to off-site locations was simulated using the Monte Carlo codes FLUKA and PHITS. The transport paths were classified as skyshine, direct (transport), groundshine and multiple-shine in order to understand the contribution of each path and to develop a general evaluation method. The effect of each path was estimated in terms of the dose at far locations. The neutron dose was calculated using the neutron energy spectra obtained from detectors placed up to a maximum of 1 km from the accelerator. The highest altitude of the sky region in this simulation was set to 2 km above the floor of the accelerator facility. The initial model of this study was the 10 GeV electron accelerator PAL-XFEL. Different compositions and densities of air, soil and ordinary concrete were applied in the calculation, and their dependences were reviewed. The estimation method used in this study was compared with the well-known methods suggested by Rindi, Stevenson and Stapleton, and also with the simple code SHINE3. The results obtained using this method agreed well with those from Rindi's formula.
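
    Empirical skyshine formulas of the kind compared here typically take an inverse-square form damped by an effective attenuation length. A hedged sketch of that generic Rindi-type shape, with placeholder parameters rather than values fitted to the PAL-XFEL study:

```python
import math

def skyshine_dose(r_m, source_term, a=1.0, attenuation_length_m=300.0):
    """Generic empirical skyshine shape H(r) = a*Q/r^2 * exp(-r/lambda).
    'a' and the attenuation length are placeholders, not values fitted
    to the measurements or simulations of the study."""
    return a * source_term / r_m**2 * math.exp(-r_m / attenuation_length_m)

doses = [skyshine_dose(r, 1.0e6) for r in (100.0, 300.0, 1000.0)]
print(doses)  # dose falls off monotonically with distance
```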

  11. Measurement of the 234U(n,f) cross-section with quasi-monoenergetic beams in the keV and MeV range using a Micromegas detector assembly

    NASA Astrophysics Data System (ADS)

    Stamatopoulos, A.; Kanellakopoulos, A.; Kalamara, A.; Diakaki, M.; Tsinganis, A.; Kokkoris, M.; Michalopoulou, V.; Axiotis, M.; Lagoyiannis, A.; Vlastou, R.

    2018-01-01

    The 234U neutron-induced fission cross-section has been measured at incident neutron energies of 452, 550, 651 keV and 7.5, 8.7, 10 MeV using the 7Li(p,n) and 2H(d,n) reactions, respectively, relative to the 235U(n,f) and 238U(n,f) reference reactions. The measurement was performed at the neutron beam facility of the National Center for Scientific Research "Demokritos", using a set-up based on Micromegas detectors. The active mass of the actinide samples and the corresponding impurities were determined via α-spectroscopy using a surface barrier silicon detector. The neutron spectra intercepted by the actinide samples have been thoroughly studied by coupling the NeuSDesc and MCNP5 codes, taking into account the energy and angular straggling of the primary ion beams in the neutron source targets in addition to contributions from competing reactions (e.g. deuteron break-up) and neutron scattering in the surrounding materials. Auxiliary Monte Carlo simulations were performed making combined use of the FLUKA and GEF codes, focusing particularly on the determination of the fission fragment detection efficiency. The developed methodology and the final results are presented.

  12. Experimental measurement and Monte Carlo assessment of Argon-41 production in a PET cyclotron facility.

    PubMed

    Infantino, Angelo; Valtieri, Lorenzo; Cicoria, Gianfranco; Pancaldi, Davide; Mostacci, Domiziano; Marengo, Mario

    2015-12-01

    In a medical cyclotron facility, (41)Ar (t1/2 = 109.34 min) is produced by the activation of air due to the neutron flux during irradiation, according to the (40)Ar(n,γ)(41)Ar reaction; this is particularly relevant for the widely used high-beam-current cyclotrons for the production of PET radionuclides. While theoretical estimations of (41)Ar production have been published, no data are available on direct experimental measurements for a biomedical cyclotron. In this work, we describe a sampling methodology and report the results of an extensive measurement campaign. Furthermore, the experimental results are compared with Monte Carlo simulations performed with the FLUKA code. To measure the (41)Ar activity, air samples were taken inside the cyclotron bunker in sealed Marinelli beakers during the routine production of (18)F with a 16.5 MeV GE-PETtrace cyclotron; this sampling thus reproduces a situation with no air changes. Sample analysis was performed in a gamma-ray spectrometry system equipped with an HPGe detector. The Monte Carlo assessment of the (41)Ar saturation yield was performed directly, using the standard FLUKA score RESNUCLE, and off-line, by convolution of the neutron fluence with cross-section data. The average (41)Ar saturation yield per litre of air, measured by gamma-ray spectrometry, was found to be 3.0 ± 0.6 Bq/(µA·dm(3)), while simulations gave 6.9 ± 0.3 Bq/(µA·dm(3)) in the direct assessment and 6.92 ± 0.22 Bq/(µA·dm(3)) by convolution of the neutron fluence with the cross section. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
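
    The off-line convolution approach can be sketched as follows: at saturation, the activity equals the production rate, i.e. the argon atom density times the group-wise sum of fluence rate times cross section. All numbers below are illustrative assumptions, not the paper's FLUKA spectra or evaluated nuclear data:

```python
BARN = 1e-24  # cm^2

# Toy group structure: thermal-ish, epithermal and fast groups (invented).
fluence_per_bin = [2.0e5, 1.5e5, 0.8e5]   # n/cm^2 per uA*s, per group
sigma_per_bin_barn = [0.66, 0.05, 1e-3]   # roughly 1/v capture shape

# Approximate 40Ar atoms per dm^3 of air: molecules/dm^3 * Ar volume fraction.
n_ar40_per_dm3 = 2.5e22 * 0.0093

# At saturation, activity equals production rate: A_sat = N * sum(phi_i * sigma_i)
a_sat = n_ar40_per_dm3 * sum(
    phi * sigma * BARN for phi, sigma in zip(fluence_per_bin, sigma_per_bin_barn)
)
print(a_sat)  # Bq per uA and dm^3 of air, for these toy inputs
```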

  13. Computational fluid dynamics applications at McDonnell Douglas

    NASA Technical Reports Server (NTRS)

    Hakkinen, R. J.

    1987-01-01

    Representative examples are presented of applications and development of advanced Computational Fluid Dynamics (CFD) codes for aerodynamic design at the McDonnell Douglas Corporation (MDC). Transonic potential and Euler codes, interactively coupled with boundary-layer computation, and solutions of the slender-layer Navier-Stokes approximation are applied to aircraft wing/body calculations. An optimization procedure using evolution theory is described in the context of transonic wing design. Euler methods are presented for the analysis of hypersonic configurations and of helicopter rotors in hover and forward flight. Several of these projects were accepted for access to the Numerical Aerodynamic Simulation (NAS) facility at the NASA Ames Research Center.

  14. Neutron Angular Scatter Effects in 3DHZETRN: Quasi-Elastic

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Werneth, Charles M.; Slaba, Tony C.; Badavi, Francis F.; Reddell, Brandon D.; Bahadori, Amir A.

    2017-01-01

    The current 3DHZETRN code has a detailed three dimensional (3D) treatment of neutron transport based on a forward/isotropic assumption and has been compared to Monte Carlo (MC) simulation codes in various geometries. In most cases, it has been found that 3DHZETRN agrees with the MC codes to the extent they agree with each other. However, a recent study of neutron leakage from finite geometries revealed that further improvements to the 3DHZETRN formalism are needed. In the present report, angular scattering corrections to the neutron fluence are provided in an attempt to improve fluence estimates from a uniform sphere. It is found that further developments in the nuclear production models are required to fully evaluate the impact of transport model updates. A model for the quasi-elastic neutron production spectra is therefore developed and implemented into 3DHZETRN.

  15. The Lived Intersectional Experiences of Privilege and Oppression of Queer Men of Color in Counselor Education Doctoral Programs: An Interpretative Phenomenological Analysis

    ERIC Educational Resources Information Center

    Chan, Christian D.

    2018-01-01

    The advent of the "Multicultural and Social Justice Counseling Competencies" (Ratts, Singh, Nassar-McMillan, Butler, & McCullough, 2016), the "American Counseling Association (ACA) Code of Ethics" (2014), and a more comprehensive emphasis on multiculturalism and social justice (Haskins & Singh, 2015; Ratts, 2009, 2011;…

  16. 14 CFR 1214.603 - Official Flight Kit (OFK).

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    .../users of any nature, to the Director of Transportation Services, Code MC, NASA Headquarters, Washington... of External Relations, Code X, NASA Headquarters, Washington, DC 20546. Upon receipt of all requests... Center, Houston, TX 77058. (3) All others (aerospace companies, state and local governments, the academic...

  17. Solar Proton Transport Within an ICRU Sphere Surrounded by a Complex Shield: Ray-trace Geometry

    NASA Technical Reports Server (NTRS)

    Slaba, Tony C.; Wilson, John W.; Badavi, Francis F.; Reddell, Brandon D.; Bahadori, Amir A.

    2015-01-01

    A computationally efficient 3DHZETRN code with enhanced neutron and light ion (Z is less than or equal to 2) propagation was recently developed for complex, inhomogeneous shield geometry described by combinatorial objects. Comparisons were made between 3DHZETRN results and Monte Carlo (MC) simulations at locations within the combinatorial geometry, and it was shown that 3DHZETRN agrees with the MC codes to the extent they agree with each other. In the present report, the 3DHZETRN code is extended to enable analysis in ray-trace geometry. This latest extension enables the code to be used within current engineering design practices utilizing fully detailed vehicle and habitat geometries. Through convergence testing, it is shown that fidelity in an actual shield geometry can be maintained in the discrete ray-trace description by systematically increasing the number of discrete rays used. It is also shown that this fidelity is carried into transport procedures and resulting exposure quantities without sacrificing computational efficiency.

  18. Solar proton exposure of an ICRU sphere within a complex structure part II: Ray-trace geometry.

    PubMed

    Slaba, Tony C; Wilson, John W; Badavi, Francis F; Reddell, Brandon D; Bahadori, Amir A

    2016-06-01

    A computationally efficient 3DHZETRN code with enhanced neutron and light ion (Z ≤ 2) propagation was recently developed for complex, inhomogeneous shield geometry described by combinatorial objects. Comparisons were made between 3DHZETRN results and Monte Carlo (MC) simulations at locations within the combinatorial geometry, and it was shown that 3DHZETRN agrees with the MC codes to the extent they agree with each other. In the present report, the 3DHZETRN code is extended to enable analysis in ray-trace geometry. This latest extension enables the code to be used within current engineering design practices utilizing fully detailed vehicle and habitat geometries. Through convergence testing, it is shown that fidelity in an actual shield geometry can be maintained in the discrete ray-trace description by systematically increasing the number of discrete rays used. It is also shown that this fidelity is carried into transport procedures and resulting exposure quantities without sacrificing computational efficiency. Published by Elsevier Ltd.

  19. Bayesian Atmospheric Radiative Transfer (BART): Model, Statistics Driver, and Application to HD 209458b

    NASA Astrophysics Data System (ADS)

    Cubillos, Patricio; Harrington, Joseph; Blecic, Jasmina; Stemm, Madison M.; Lust, Nate B.; Foster, Andrew S.; Rojo, Patricio M.; Loredo, Thomas J.

    2014-11-01

    Multi-wavelength secondary-eclipse and transit depths probe the thermo-chemical properties of exoplanets. In recent years, several research groups have developed retrieval codes to analyze the existing data and study the prospects of future facilities. However, the scientific community has limited access to these packages. Here we premiere the open-source Bayesian Atmospheric Radiative Transfer (BART) code. We discuss the key aspects of the radiative-transfer algorithm and the statistical package. The radiation code includes line databases for all HITRAN molecules and for high-temperature H2O, TiO, and VO, as well as a preprocessor for adding further line databases without recompiling the radiation code. Collision-induced absorption lines are available for H2-H2 and H2-He. The parameterized thermal and molecular abundance profiles can be modified arbitrarily without recompilation. The generated spectra are integrated over arbitrary bandpasses for comparison to data. BART's statistical package, Multi-core Markov-chain Monte Carlo (MC3), is a general-purpose MCMC module. MC3 implements the Differential-Evolution Markov-chain Monte Carlo algorithm (ter Braak 2006, 2009). MC3 converges 20-400 times faster than the usual Metropolis-Hastings MCMC algorithm and, in addition, uses the Message Passing Interface (MPI) to parallelize the MCMC chains. We apply the BART retrieval code to the HD 209458b data set to estimate the planet's temperature profile and molecular abundances. This work was supported by NASA Planetary Atmospheres grant NNX12AI69G and NASA Astrophysics Data Analysis Program grant NNX13AF38G. JB holds a NASA Earth and Space Science Fellowship.
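
    The Differential-Evolution proposal used by such samplers can be sketched in a few lines. This is a generic illustration of the ter Braak (2006) scheme, not MC3's actual implementation; the chain states are invented:

```python
import random

def demc_proposal(chains, i, gamma=None, eps_scale=1e-4, rng=random):
    """Differential-Evolution MCMC proposal (ter Braak 2006): move chain i
    along the difference of two other randomly chosen chains, plus a small
    jitter to keep the proposal distribution non-degenerate."""
    d = len(chains[i])
    if gamma is None:
        gamma = 2.38 / (2 * d) ** 0.5  # commonly recommended scaling
    r1, r2 = rng.sample([k for k in range(len(chains)) if k != i], 2)
    return [
        x + gamma * (a - b) + rng.gauss(0.0, eps_scale)
        for x, a, b in zip(chains[i], chains[r1], chains[r2])
    ]

chains = [[0.0, 0.0], [1.0, 2.0], [2.0, 1.0], [0.5, 0.5]]
prop = demc_proposal(chains, 0)
print(prop)
```

Because the jump direction is learned from the current chain population, the proposal automatically adapts to the scale and correlation of the posterior, which is the source of the quoted speedup over plain Metropolis-Hastings.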

  20. Design of a finger ring extremity dosemeter based on OSL readout of alpha-Al2O3:C.

    PubMed

    Durham, J S; Zhang, X; Payne, F; Akselrod, M S

    2002-01-01

    A finger-ring dosemeter and reader have been designed that use OSL readout of alpha-Al2O3:C (aluminium oxide). The use of aluminium oxide is important because it allows the sensitive element of the dosemeter to be a very thin layer, which reduces the beta and gamma energy dependence to acceptable levels without compromising the required sensitivity for dose measurement. OSL readout allows the ring dosemeter to be interrogated with minimal disassembly. The ring dosemeter consists of three components: aluminium oxide powder for the measurement of dose, an aluminium substrate that gives structure to the ring, and an aluminised Mylar cover that prevents exposure of the aluminium oxide to light. The thicknesses of the three components have been optimised for beta response using the Monte Carlo computer code FLUKA. A reader has also been designed and developed that allows the dosemeter to be read after removing the Mylar. Future efforts are discussed.

  1. Double-layer neutron shield design as neutron shielding application

    NASA Astrophysics Data System (ADS)

    Sariyer, Demet; Küçer, Rahmi

    2018-02-01

    Shield design for particle accelerators and other high-energy facilities is mainly driven by high-energy neutrons, whose deep penetration through massive shields has become a very serious problem. For shielding to be efficient, most of these neutrons should be confined to the shielding volume. When interior space is limited, a multilayer shield of sufficient thickness must be used; concrete and iron are widely used as multilayer shield materials. A two-layer shield was selected to guarantee radiation safety outside the shield against neutrons generated in interactions at different proton energies: one layer was one metre of concrete, the other an iron-containing material (FeB, Fe2B or stainless steel) whose thickness was to be determined. The FLUKA Monte Carlo code was used for the shield design geometry and the required neutron dose distributions. The resulting two-layer shields show better performance than concrete alone, so the design leaves more space in the shielded interior areas.
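
    At the point-kernel level, transmission through a two-layer shield is often approximated by a product of exponentials with material-specific attenuation lengths, a simplification of what FLUKA computes in detail. A sketch with placeholder attenuation lengths, not the study's FLUKA-derived values:

```python
import math

def transmitted_dose(h0, d1_cm, lambda1_cm, d2_cm, lambda2_cm):
    # H = H0 * exp(-d1/L1) * exp(-d2/L2): two-layer exponential attenuation
    return h0 * math.exp(-d1_cm / lambda1_cm) * math.exp(-d2_cm / lambda2_cm)

# Placeholder attenuation lengths in cm; illustrative only.
concrete_only = transmitted_dose(1.0, 100.0, 50.0, 0.0, 1.0)    # 1 m concrete
concrete_iron = transmitted_dose(1.0, 100.0, 50.0, 20.0, 17.0)  # + 20 cm iron-based layer
print(concrete_only, concrete_iron)
```

Adding the denser second layer lowers the transmitted dose for a much smaller added thickness, which is the space-saving effect the abstract describes.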

  2. Shielding design of an underground experimental area at point 5 of the CERN Super Proton Synchrotron (SPS).

    PubMed

    Mueller, Mario J; Stevenson, Graham R

    2005-01-01

    Increasing projected values of the circulating beam intensity in the Super Proton Synchrotron (SPS) and decreasing limits to radiation exposure, taken with the increasing non-acceptance of unjustified and unoptimised radiation exposures, have led to the need to re-assess the shielding between the ECX and ECA5 underground experimental areas of the SPS. Twenty years ago, these experimental areas at SPS-Point 5 housed the UA1 experiment, where Carlo Rubbia and his team verified the existence of W and Z bosons. The study reported here describes such a re-assessment based on simulations using the multi-purpose FLUKA radiation transport code. This study concludes that while the main shield which is made of concrete blocks and is 4.8 m thick satisfactorily meets the current design limits even at the highest intensities presently planned for the SPS, dose rates calculated for liaison areas on both sides of the main shield significantly exceed the design limits. Possible ways of improving the shielding situation are discussed.

  3. Geant4 simulation of the CERN-EU high-energy reference field (CERF) facility.

    PubMed

    Prokopovich, D A; Reinhard, M I; Cornelius, I M; Rosenfeld, A B

    2010-09-01

    The CERN-EU high-energy reference field facility is used for testing and calibrating both active and passive radiation dosemeters for radiation protection applications in space and aviation. Through a combination of a primary particle beam, a target and a suitably designed shielding configuration, the facility is able to reproduce the neutron component of the high-altitude radiation field relevant to the jet aviation industry. Simulations of the facility using the GEANT4 (GEometry ANd Tracking) toolkit provide an improved understanding of the neutron particle fluence, as well as of the particle fluence of the other radiation components present. The secondary particle fluence as a function of the primary particle fluence incident on the target, and the associated dose equivalent rates, were determined at the 20 designated irradiation positions available at the facility. Comparisons of the simulated results are made with previously published simulations obtained using the FLUKA Monte Carlo code, as well as with experimental results of the neutron fluence obtained with a Bonner sphere spectrometer.

  4. The radiation fields around a proton therapy facility: A comparison of Monte Carlo simulations

    NASA Astrophysics Data System (ADS)

    Ottaviano, G.; Picardi, L.; Pillon, M.; Ronsivalle, C.; Sandri, S.

    2014-02-01

    A proton therapy test facility with an average beam current lower than 10 nA and an energy up to 150 MeV is planned to be sited at the Frascati ENEA Research Center, in Italy. The accelerator is composed of a sequence of linear sections. The first is a commercial 7 MeV proton linac, from which the beam is injected into a SCDTL (Side Coupled Drift Tube Linac) structure, reaching an energy of 52 MeV. A conventional CCL (Coupled Cavity Linac) with side coupling cavities then completes the accelerator. The linear structure has the important advantage that the main radiation losses during the acceleration process occur for protons with energy below 20 MeV, with a consequently low production of neutrons and secondary radiation. From the radiation protection point of view, the radiation source for this facility is therefore almost completely located at the final target. Physical and geometrical models of the device have been developed and implemented in radiation transport computer codes based on the Monte Carlo method. The aim is the assessment of the radiation field around the main source in support of the safety analysis. For this assessment, independent researchers used two different Monte Carlo codes, FLUKA (FLUktuierende KAskade) and MCNPX (Monte Carlo N-Particle eXtended). Both are general-purpose tools for calculations of particle transport and interactions with matter, covering an extended range of applications including proton beam analysis. Nevertheless, each utilizes its own nuclear cross-section libraries and specific physics models for particle types and energies. The models implemented in the codes are described and the results are presented. The differences between the two calculations are reported and discussed, pointing out the disadvantages and advantages of each code in this specific application.

  5. Pion and electromagnetic contribution to dose: Comparisons of HZETRN to Monte Carlo results and ISS data

    NASA Astrophysics Data System (ADS)

    Slaba, Tony C.; Blattnig, Steve R.; Reddell, Brandon; Bahadori, Amir; Norman, Ryan B.; Badavi, Francis F.

    2013-07-01

    Recent work has indicated that pion production and the associated electromagnetic (EM) cascade may be an important contribution to the total astronaut exposure in space. Recent extensions to the deterministic space radiation transport code, HZETRN, allow the production and transport of pions, muons, electrons, positrons, and photons. In this paper, the extended code is compared to the Monte Carlo codes, Geant4, PHITS, and FLUKA, in slab geometries exposed to galactic cosmic ray (GCR) boundary conditions. While improvements in the HZETRN transport formalism for the new particles are needed, it is shown that reasonable agreement on dose is found at larger shielding thicknesses commonly found on the International Space Station (ISS). Finally, the extended code is compared to ISS data on a minute-by-minute basis over a seven day period in 2001. The impact of pion/EM production on exposure estimates and validation results is clearly shown. The Badhwar-O'Neill (BO) 2004 and 2010 models are used to generate the GCR boundary condition at each time-step allowing the impact of environmental model improvements on validation results to be quantified as well. It is found that the updated BO2010 model noticeably reduces overall exposure estimates from the BO2004 model, and the additional production mechanisms in HZETRN provide some compensation. It is shown that the overestimates provided by the BO2004 GCR model in previous validation studies led to deflated uncertainty estimates for environmental, physics, and transport models, and allowed an important physical interaction (π/EM) to be overlooked in model development. Despite the additional π/EM production mechanisms in HZETRN, a systematic under-prediction of total dose is observed in comparison to Monte Carlo results and measured data.

  6. Color differences among feral pigeons (Columba livia) are not attributable to sequence variation in the coding region of the melanocortin-1 receptor gene (MC1R)

    PubMed Central

    2013-01-01

    Background Genetic variation at the melanocortin-1 receptor (MC1R) gene is correlated with melanin color variation in many birds. Feral pigeons (Columba livia) show two major melanin-based colorations: a red coloration due to pheomelanic pigment and a black coloration due to eumelanic pigment. Furthermore, within each color type, feral pigeons display continuous variation in the amount of melanin pigment present in the feathers, with individuals varying from pure white to a full dark melanic color. Coloration is highly heritable, and it has been suggested that it is under natural or sexual selection, or both. Our objective was to investigate whether MC1R allelic variants are associated with plumage color in feral pigeons. Findings We sequenced 888 bp of the coding sequence of MC1R among pigeons varying both in the type, eumelanin or pheomelanin, and the amount of melanin in their feathers. We detected 10 non-synonymous substitutions and 2 synonymous substitutions, but none of them was associated with a plumage type. It remains possible that non-synonymous substitutions that influence coloration are present in the short MC1R fragment that we did not sequence, but this seems unlikely because we analyzed the entire functionally important region of the gene. Conclusions Our results show that color differences among feral pigeons are probably not attributable to amino acid variation at the MC1R locus. Therefore, variation in regulatory regions of MC1R, or variation in other genes, may be responsible for the color polymorphism of feral pigeons. PMID:23915680

  7. The DoE method as an efficient tool for modeling the behavior of monocrystalline Si-PV module

    NASA Astrophysics Data System (ADS)

    Kessaissia, Fatma Zohra; Zegaoui, Abdallah; Boutoubat, Mohamed; Allouache, Hadj; Aillerie, Michel; Charles, Jean-Pierre

    2018-05-01

    The objective of this paper is to apply the Design of Experiments (DoE) method to study and obtain a predictive model of any marketed monocrystalline photovoltaic (mc-PV) module. This technique yields a mathematical model that represents the predicted responses as a function of the input factors and the experimental data, so a DoE model for the characterization and modeling of mc-PV module behavior can be obtained by performing just a small set of experimental trials. The DoE model of the mc-PV panel evaluates the predicted maximum power as a function of irradiance and temperature in a bounded domain of study for the inputs. For the mc-PV panel, predictive models for both one-level and two-level designs were developed, taking into account the influences of both the main effects and the interaction effects of the considered factors. The DoE method was then implemented in a code developed under Matlab. The code allows us to simulate, characterize, and validate the predictive model of the mc-PV panel. The calculated results were compared to the experimental data, errors were estimated, and the predictive models were validated against the response surface. Finally, we conclude that the predictive models reproduce the experimental trials with good accuracy.
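
    A two-level full-factorial model with main and interaction effects, as described above, can be fitted by simple contrast averages. The design matrix and response values below are hypothetical, not the paper's measurements:

```python
# 2^2 full-factorial design in coded units (x = -1/+1) and made-up
# maximum-power responses; illustrative only.
design = [(-1, -1), (+1, -1), (-1, +1), (+1, +1)]  # (irradiance, temperature)
response = [20.0, 60.0, 30.0, 74.0]                # e.g. Pmax in W

# Coefficients of y = b0 + b1*x1 + b2*x2 + b12*x1*x2 via contrast averages
n = len(design)
b0 = sum(response) / n
b1 = sum(x1 * y for (x1, _), y in zip(design, response)) / n
b2 = sum(x2 * y for (_, x2), y in zip(design, response)) / n
b12 = sum(x1 * x2 * y for (x1, x2), y in zip(design, response)) / n

def predict(x1, x2):
    return b0 + b1 * x1 + b2 * x2 + b12 * x1 * x2

# A saturated 2^2 model reproduces the four trials exactly.
print([predict(*x) for x in design])
```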

  8. First experimental-based characterization of oxygen ion beam depth dose distributions at the Heidelberg Ion-Beam Therapy Center

    NASA Astrophysics Data System (ADS)

    Kurz, C.; Mairani, A.; Parodi, K.

    2012-08-01

    Over the last decades, the application of proton and heavy-ion beams to external beam radiotherapy has rapidly increased. Due to the favourable lateral and depth dose profile, the superposition of narrow ion pencil beams may enable a highly conformal dose delivery to the tumour, with better sparing of the surrounding healthy tissue in comparison to conventional radiation therapy with photons. To fully exploit the promised clinical advantages of ion beams, an accurate planning of the patient treatments is required. The clinical treatment planning system (TPS) at the Heidelberg Ion-Beam Therapy Center (HIT) is based on a fast performing analytical algorithm for dose calculation, relying, among others, on laterally integrated depth dose distributions (DDDs) simulated with the FLUKA Monte Carlo (MC) code. Important input parameters of these simulations need to be derived from a comparison of the simulated DDDs with measurements. In this work, the first measurements of 16O ion DDDs at HIT are presented with a focus on the determined Bragg peak positions and the understanding of factors influencing the shape of the distributions. The measurements are compared to different simulation approaches aiming to reproduce the acquired data at best. A simplified geometrical model is first used to optimize important input parameters, not known a priori, in the simulations. This method is then compared to a more realistic, but also more time-consuming simulation approach better accounting for the experimental set-up and the measuring process. The results of this work contributed to a pre-clinical oxygen ion beam database, which is currently used by a research TPS for corresponding radio-biological cell experiments. A future extension to a clinical database used by the clinical TPS at HIT is foreseen. 
As a side effect, the performed investigations showed that the typical water-equivalent calibration approach for experimental data acquired with water column systems leads to slight deviations between the experimentally determined and the real Bragg peak positions. For improved accuracy, the energy dependence of the stopping power, and hence of the water-equivalent thickness, of the material downstream of the water tank should be considered in the analysis of measured data.

  9. Quantitative assessment of the physical potential of proton beam range verification with PET/CT.

    PubMed

    Knopf, A; Parodi, K; Paganetti, H; Cascio, E; Bonab, A; Bortfeld, T

    2008-08-07

    A recent clinical pilot study demonstrated the feasibility of offline PET/CT range verification for proton therapy treatments. In vivo PET measurements are challenged by blood perfusion, variations of tissue compositions, patient motion and image co-registration uncertainties. Besides these biological and treatment specific factors, the accuracy of the method is constrained by the underlying physical processes. This phantom study distinguishes physical factors from other factors, assessing the reproducibility, consistency and sensitivity of the PET/CT range verification method. A spread-out Bragg-peak (SOBP) proton field was delivered to a phantom consisting of poly-methyl methacrylate (PMMA), lung and bone equivalent material slabs. PET data were acquired in listmode at a commercial PET/CT scanner available within 10 min walking distance from the proton therapy unit. The measured PET activity distributions were compared to simulations of the PET signal based on Geant4 and FLUKA Monte Carlo (MC) codes. To test the reproducibility of the measured PET signal, data from two independent measurements at the same geometrical position in the phantom were compared. Furthermore, activation depth profiles within identical material arrangements but at different positions within the irradiation field were compared to test the consistency of the measured PET signal. Finally, activation depth profiles through air/lung, air/bone and lung/bone interfaces parallel as well as at 6 degrees to the beam direction were studied to investigate the sensitivity of the PET/CT range verification method. The reproducibility and the consistency of the measured PET signal were found to be of the same order of magnitude. They determine the physical accuracy of the PET measurement to be about 1 mm. 
However, range discrepancies up to 2.6 mm between two measurements and range variations up to 2.6 mm within one measurement were found at the beam edge and at the edge of the field of view (FOV) of the PET scanner. PET/CT range verification was found to be able to detect small range modifications in the presence of complex tissue inhomogeneities. This study indicates the physical potential of the PET/CT verification method to detect the full-range characteristic of the delivered dose in the patient.
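
    Range comparisons like those quoted (shifts of order 1-2.6 mm) are typically made by locating the distal 50% fall-off position of each depth profile. A minimal sketch, using a toy profile rather than measured PET data:

```python
def distal_50_depth(depths, signal):
    """Depth at which the profile falls to 50% of its maximum on the distal
    side, found by linear interpolation between samples."""
    peak = max(signal)
    half = 0.5 * peak
    for i in range(signal.index(peak), len(signal) - 1):
        if signal[i] >= half > signal[i + 1]:
            frac = (signal[i] - half) / (signal[i] - signal[i + 1])
            return depths[i] + frac * (depths[i + 1] - depths[i])
    return None

depths = [0, 10, 20, 30, 40, 50]    # mm, toy grid
profile = [40, 80, 100, 90, 20, 5]  # toy activity profile
d50 = distal_50_depth(depths, profile)
print(d50)
```

Subtracting the d50 positions of two profiles (measured vs. simulated, or two repeated measurements) then gives the range shift that such studies report.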

  10. Quantitative assessment of the physical potential of proton beam range verification with PET/CT

    NASA Astrophysics Data System (ADS)

    Knopf, A.; Parodi, K.; Paganetti, H.; Cascio, E.; Bonab, A.; Bortfeld, T.

    2008-08-01

    A recent clinical pilot study demonstrated the feasibility of offline PET/CT range verification for proton therapy treatments. In vivo PET measurements are challenged by blood perfusion, variations of tissue compositions, patient motion and image co-registration uncertainties. Besides these biological and treatment specific factors, the accuracy of the method is constrained by the underlying physical processes. This phantom study distinguishes physical factors from other factors, assessing the reproducibility, consistency and sensitivity of the PET/CT range verification method. A spread-out Bragg-peak (SOBP) proton field was delivered to a phantom consisting of poly-methyl methacrylate (PMMA), lung and bone equivalent material slabs. PET data were acquired in listmode at a commercial PET/CT scanner available within 10 min walking distance from the proton therapy unit. The measured PET activity distributions were compared to simulations of the PET signal based on Geant4 and FLUKA Monte Carlo (MC) codes. To test the reproducibility of the measured PET signal, data from two independent measurements at the same geometrical position in the phantom were compared. Furthermore, activation depth profiles within identical material arrangements but at different positions within the irradiation field were compared to test the consistency of the measured PET signal. Finally, activation depth profiles through air/lung, air/bone and lung/bone interfaces parallel as well as at 6° to the beam direction were studied to investigate the sensitivity of the PET/CT range verification method. The reproducibility and the consistency of the measured PET signal were found to be of the same order of magnitude. They determine the physical accuracy of the PET measurement to be about 1 mm. 
However, range discrepancies up to 2.6 mm between two measurements and range variations up to 2.6 mm within one measurement were found at the beam edge and at the edge of the field of view (FOV) of the PET scanner. PET/CT range verification was found to be able to detect small range modifications in the presence of complex tissue inhomogeneities. This study indicates the physical potential of the PET/CT verification method to detect the full-range characteristic of the delivered dose in the patient.

  11. Acceleration of Monte Carlo simulation of photon migration in complex heterogeneous media using Intel many-integrated core architecture.

    PubMed

    Gorshkov, Anton V; Kirillin, Mikhail Yu

    2015-08-01

    Over the past two decades, the Monte Carlo technique has become a gold standard in simulation of light propagation in turbid media, including biotissues. Technological solutions provide further advances of this technique. The Intel Xeon Phi coprocessor is a new type of accelerator for highly parallel general purpose computing, which allows execution of a wide range of applications without substantial code modification. We present a technical approach for porting our previously developed Monte Carlo (MC) code for simulation of light transport in tissues to the Intel Xeon Phi coprocessor. We show that employing the accelerator reduces the computational time of MC simulation and yields a speed-up comparable to a GPU. We demonstrate the performance of the developed code for simulation of light transport in the human head and determination of the measurement volume in near-infrared spectroscopy brain sensing.
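
    Each photon history in such a simulation is statistically independent, which is what makes the workload map well onto many-core hardware: batches of histories run on separate cores and only the tallies are merged. A minimal sketch of that pattern, with invented medium parameters and Python's multiprocessing standing in for Xeon Phi offload (not the authors' code):

```python
import numpy as np
from multiprocessing import Pool

# Illustrative optical coefficients (1/mm); not from the paper.
MU_A, MU_S = 1.0, 9.0
MU_T = MU_A + MU_S

def run_batch(args):
    """Trace n independent photon packets; return the mean depth reached."""
    n, seed = args
    rng = np.random.default_rng(seed)
    depths = np.zeros(n)
    for i in range(n):
        z, uz, w = 0.0, 1.0, 1.0          # depth, direction cosine, packet weight
        while w > 1e-4:
            z += uz * -np.log(rng.random()) / MU_T   # free path to next event
            if z < 0.0:                               # escaped through the surface
                z = 0.0
                break
            w *= MU_S / MU_T                          # absorption via weight decay
            uz = 2.0 * rng.random() - 1.0             # isotropic rescattering (toy)
        depths[i] = z
    return depths.mean()

if __name__ == "__main__":
    batches = [(1000, s) for s in range(4)]           # one batch per worker
    with Pool(4) as pool:                             # many-core: scale worker count
        means = pool.map(run_batch, batches)
    print(f"mean depth ~ {np.mean(means):.3f} mm")
```

    On a coprocessor the same structure holds; only the batch scheduling and vectorization layer change.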

  12. Cellular dosimetry calculations for Strontium-90 using Monte Carlo code PENELOPE.

    PubMed

    Hocine, Nora; Farlay, Delphine; Boivin, Georges; Franck, Didier; Agarande, Michelle

    2014-11-01

    To improve risk assessments associated with chronic exposure to Strontium-90 (Sr-90), for both the environment and human health, it is necessary to know the energy distribution in specific cells or tissue. Monte Carlo (MC) simulation codes are extremely useful tools for calculating deposited energy. The present work was focused on the validation of the MC code PENetration and Energy LOss of Positrons and Electrons (PENELOPE) and the assessment of the dose distribution to bone marrow cells from a point Sr-90 source localized within the cortical bone. S-value (absorbed dose per unit cumulated activity) calculations were performed using PENELOPE and Monte Carlo N-Particle eXtended (MCNPX). Cytoplasm, nucleus, cell surface, mouse femur bone and the Sr-90 radiation source were simulated. Cells are assumed to be spherical, with the radii of the cell and cell nucleus ranging from 2 to 10 μm. The Sr-90 source is assumed to be uniformly distributed in the cell nucleus, cytoplasm and cell surface. S-values calculated with PENELOPE agreed very well with MCNPX results and Medical Internal Radiation Dose (MIRD) values, with relative deviations of less than 4.5%. The dose distribution to mouse bone marrow cells showed that the cells localized near the cortical part received the maximum dose. The MC code PENELOPE may prove useful for cellular dosimetry involving radiation transport through materials other than water, or for complex distributions of radionuclides and geometries.
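
    In the MIRD formalism the quantity being compared is the S-value, and the mean absorbed dose follows as D = Ã·S (cumulated activity times S-value); code-to-code agreement is then quoted as a relative deviation. A sketch of that bookkeeping with invented placeholder S-values (not the paper's results):

```python
# MIRD schema bookkeeping: D = A_tilde * S (dose = cumulated activity x S-value).
# The S-values below are invented placeholders, not results from the paper.

def absorbed_dose(cumulated_activity_bq_s, s_value_gy_per_bq_s):
    """Mean absorbed dose to a target region, Gy."""
    return cumulated_activity_bq_s * s_value_gy_per_bq_s

def relative_deviation_pct(s_ref, s_test):
    """Percent deviation of a test code's S-value from a reference code's value."""
    return 100.0 * abs(s_test - s_ref) / s_ref

s_penelope = 3.95e-4   # hypothetical S-value, Gy/(Bq*s), nucleus <- cytoplasm
s_mcnpx = 4.05e-4      # hypothetical reference value from the second code

dev = relative_deviation_pct(s_mcnpx, s_penelope)
dose = absorbed_dose(1.0e4, s_penelope)   # dose for 1e4 Bq*s cumulated activity
print(f"deviation = {dev:.2f}%, dose = {dose:.2f} Gy")
```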

  13. A Detailed Comparison of Multidimensional Boltzmann Neutrino Transport Methods in Core-collapse Supernovae

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Richers, Sherwood; Nagakura, Hiroki; Ott, Christian D.

    The mechanism driving core-collapse supernovae is sensitive to the interplay between matter and neutrino radiation. However, neutrino radiation transport is very difficult to simulate, and several radiation transport methods of varying levels of approximation are available. We carefully compare for the first time in multiple spatial dimensions the discrete ordinates (DO) code of Nagakura, Yamada, and Sumiyoshi and the Monte Carlo (MC) code Sedonu, under the assumptions of a static fluid background, flat spacetime, elastic scattering, and full special relativity. We find remarkably good agreement in all spectral, angular, and fluid interaction quantities, lending confidence to both methods. The DO method excels in determining the heating and cooling rates in the optically thick region. The MC method predicts sharper angular features due to the effectively infinite angular resolution, but struggles to drive down noise in quantities where subtractive cancellation is prevalent, such as the net gain in the protoneutron star and off-diagonal components of the Eddington tensor. We also find that errors in the angular moments of the distribution functions induced by neglecting velocity dependence are subdominant to those from limited momentum-space resolution. We briefly compare directly computed second angular moments to those predicted by popular algebraic two-moment closures, and we find that the errors from the approximate closures are comparable to the difference between the DO and MC methods. Included in this work is an improved Sedonu code, which now implements a fully special relativistic, time-independent version of the grid-agnostic MC random walk approximation.

  14. A Detailed Comparison of Multidimensional Boltzmann Neutrino Transport Methods in Core-collapse Supernovae

    DOE PAGES

    Richers, Sherwood; Nagakura, Hiroki; Ott, Christian D.; ...

    2017-10-03

    The mechanism driving core-collapse supernovae is sensitive to the interplay between matter and neutrino radiation. However, neutrino radiation transport is very difficult to simulate, and several radiation transport methods of varying levels of approximation are available. In this paper, we carefully compare for the first time in multiple spatial dimensions the discrete ordinates (DO) code of Nagakura, Yamada, and Sumiyoshi and the Monte Carlo (MC) code Sedonu, under the assumptions of a static fluid background, flat spacetime, elastic scattering, and full special relativity. We find remarkably good agreement in all spectral, angular, and fluid interaction quantities, lending confidence to both methods. The DO method excels in determining the heating and cooling rates in the optically thick region. The MC method predicts sharper angular features due to the effectively infinite angular resolution, but struggles to drive down noise in quantities where subtractive cancellation is prevalent, such as the net gain in the protoneutron star and off-diagonal components of the Eddington tensor. We also find that errors in the angular moments of the distribution functions induced by neglecting velocity dependence are subdominant to those from limited momentum-space resolution. We briefly compare directly computed second angular moments to those predicted by popular algebraic two-moment closures, and we find that the errors from the approximate closures are comparable to the difference between the DO and MC methods. Finally, included in this work is an improved Sedonu code, which now implements a fully special relativistic, time-independent version of the grid-agnostic MC random walk approximation.

  15. A Detailed Comparison of Multidimensional Boltzmann Neutrino Transport Methods in Core-collapse Supernovae

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Richers, Sherwood; Nagakura, Hiroki; Ott, Christian D.

    The mechanism driving core-collapse supernovae is sensitive to the interplay between matter and neutrino radiation. However, neutrino radiation transport is very difficult to simulate, and several radiation transport methods of varying levels of approximation are available. In this paper, we carefully compare for the first time in multiple spatial dimensions the discrete ordinates (DO) code of Nagakura, Yamada, and Sumiyoshi and the Monte Carlo (MC) code Sedonu, under the assumptions of a static fluid background, flat spacetime, elastic scattering, and full special relativity. We find remarkably good agreement in all spectral, angular, and fluid interaction quantities, lending confidence to both methods. The DO method excels in determining the heating and cooling rates in the optically thick region. The MC method predicts sharper angular features due to the effectively infinite angular resolution, but struggles to drive down noise in quantities where subtractive cancellation is prevalent, such as the net gain in the protoneutron star and off-diagonal components of the Eddington tensor. We also find that errors in the angular moments of the distribution functions induced by neglecting velocity dependence are subdominant to those from limited momentum-space resolution. We briefly compare directly computed second angular moments to those predicted by popular algebraic two-moment closures, and we find that the errors from the approximate closures are comparable to the difference between the DO and MC methods. Finally, included in this work is an improved Sedonu code, which now implements a fully special relativistic, time-independent version of the grid-agnostic MC random walk approximation.

  16. A Detailed Comparison of Multidimensional Boltzmann Neutrino Transport Methods in Core-collapse Supernovae

    NASA Astrophysics Data System (ADS)

    Richers, Sherwood; Nagakura, Hiroki; Ott, Christian D.; Dolence, Joshua; Sumiyoshi, Kohsuke; Yamada, Shoichi

    2017-10-01

    The mechanism driving core-collapse supernovae is sensitive to the interplay between matter and neutrino radiation. However, neutrino radiation transport is very difficult to simulate, and several radiation transport methods of varying levels of approximation are available. We carefully compare for the first time in multiple spatial dimensions the discrete ordinates (DO) code of Nagakura, Yamada, and Sumiyoshi and the Monte Carlo (MC) code Sedonu, under the assumptions of a static fluid background, flat spacetime, elastic scattering, and full special relativity. We find remarkably good agreement in all spectral, angular, and fluid interaction quantities, lending confidence to both methods. The DO method excels in determining the heating and cooling rates in the optically thick region. The MC method predicts sharper angular features due to the effectively infinite angular resolution, but struggles to drive down noise in quantities where subtractive cancellation is prevalent, such as the net gain in the protoneutron star and off-diagonal components of the Eddington tensor. We also find that errors in the angular moments of the distribution functions induced by neglecting velocity dependence are subdominant to those from limited momentum-space resolution. We briefly compare directly computed second angular moments to those predicted by popular algebraic two-moment closures, and we find that the errors from the approximate closures are comparable to the difference between the DO and MC methods. Included in this work is an improved Sedonu code, which now implements a fully special relativistic, time-independent version of the grid-agnostic MC random walk approximation.
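
    The closure comparison described above can be made concrete in a few lines: compute the angular moments of an assumed intensity directly, then compare the resulting Eddington factor with the prediction of an algebraic closure. The intensity profile below is invented for illustration, and the closure shown is the standard Levermore (M1) form rather than any specific choice made in the paper:

```python
import numpy as np

def trap(y, x):
    """Composite trapezoidal integral of y over x."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

mu = np.linspace(-1.0, 1.0, 2001)        # direction cosines
I = 1.0 + 0.6 * mu + 0.3 * mu**2         # toy specific intensity I(mu)

E = trap(I, mu)                           # zeroth angular moment
F = trap(I * mu, mu)                      # first moment (flux)
P = trap(I * mu**2, mu)                   # second moment (pressure)

f = F / E                                 # flux factor
p_direct = P / E                          # directly computed Eddington factor

# Levermore (M1) algebraic closure for the Eddington factor:
p_m1 = (3.0 + 4.0 * f**2) / (5.0 + 2.0 * np.sqrt(4.0 - 3.0 * f**2))

print(f"f = {f:.3f}, p_direct = {p_direct:.3f}, p_M1 = {p_m1:.3f}")
```

    In the near-isotropic regime both values sit close to 1/3; the papers' point is that the closure error away from this regime is comparable to the DO-MC difference.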

  17. Optimization of beam shaping assembly based on D-T neutron generator and dose evaluation for BNCT

    NASA Astrophysics Data System (ADS)

    Naeem, Hamza; Chen, Chaobin; Zheng, Huaqing; Song, Jing

    2017-04-01

    The feasibility of developing an epithermal neutron beam for a boron neutron capture therapy (BNCT) facility based on a high intensity D-T fusion neutron generator (HINEG), using the Monte Carlo code SuperMC (Super Monte Carlo simulation program for nuclear and radiation process), is investigated in this study. The SuperMC code is used to determine and optimize the final configuration of the beam shaping assembly (BSA). The optimized BSA is a cylindrical assembly consisting of a natural uranium sphere (14 cm) as a neutron multiplier, AlF3 and TiF3 as moderators (20 cm each), Cd (1 mm) as a thermal neutron filter, Bi (5 cm) as a gamma shield, and Pb as a reflector and collimator to guide neutrons towards the exit window. The epithermal neutron beam flux of the proposed model is 5.73 × 10⁹ n/cm²s, and the other dosimetric parameters for BNCT reported in IAEA-TECDOC-1223 have been verified. The phantom dose analysis shows that the designed BSA is accurate, efficient and suitable for BNCT applications. Thus, the Monte Carlo code SuperMC is capable of simulating the BSA and of performing the dose calculation for BNCT, and a high epithermal flux can be achieved using the proposed BSA.

  18. Fort Leavenworth Ethics Symposium: The Professional Ethic and the State

    DTIC Science & Technology

    2015-04-23

    Chapter 6: Ethical Paradox, Cultural Incongruence, and the Need for a Code of Ethics in the US Military, by William J. Davis, Jr., PhD ... of Military Ethics, by Thomas J. Gibbons ... PJ McCormack MBE, BD, MTh, PhD (QUB), PhD (Cran), CF ... Chapter 14: Multiple Ethical Loyalties in Guantanamo, by CAPT J. Scott McPherson, USN, and

  19. 75 FR 80765 - Hazardous Materials: Adoption of ASME Code Section XII and the National Board Inspection Code

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-23

    ... requirement for allowable peak secondary stresses for MC 331 cargo tanks. 4. Rational Design of Non-circular... the design, construction, and certification of cargo tank motor vehicles, cryogenic portable tanks and... CTMV: Cargo Tank Motor Vehicle DCE: Design Certifying Engineer FMCSA: Federal Motor Carrier Safety...

  20. MC3: Multi-core Markov-chain Monte Carlo code

    NASA Astrophysics Data System (ADS)

    Cubillos, Patricio; Harrington, Joseph; Lust, Nate; Foster, AJ; Stemm, Madison; Loredo, Tom; Stevenson, Kevin; Campo, Chris; Hardin, Matt; Hardy, Ryan

    2016-10-01

    MC3 (Multi-core Markov-chain Monte Carlo) is a Bayesian statistics tool that can be executed from the shell prompt or interactively through the Python interpreter with single- or multiple-CPU parallel computing. It offers Markov-chain Monte Carlo (MCMC) posterior-distribution sampling for several algorithms, Levenberg-Marquardt least-squares optimization, and uniform non-informative, Jeffreys non-informative, or Gaussian-informative priors. MC3 can share the same value among multiple parameters and fix the value of parameters to constant values, and offers Gelman-Rubin convergence testing and correlated-noise estimation with time-averaging or wavelet-based likelihood estimation methods.
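
    The Gelman-Rubin diagnostic offered by MC3 compares within-chain and between-chain variance; values of the statistic near 1 indicate convergence. A minimal sketch of the statistic itself (not MC3's implementation):

```python
import numpy as np

def gelman_rubin(chains):
    """Potential scale reduction factor R-hat for an (m_chains, n_samples) array."""
    chains = np.asarray(chains, dtype=float)
    m, n = chains.shape
    W = chains.var(axis=1, ddof=1).mean()       # mean within-chain variance
    B = n * chains.mean(axis=1).var(ddof=1)     # between-chain variance
    var_plus = (n - 1) / n * W + B / n          # pooled variance estimate
    return float(np.sqrt(var_plus / W))

rng = np.random.default_rng(0)
good = rng.normal(0.0, 1.0, size=(4, 5000))     # four chains sampling one target
bad = good + np.arange(4)[:, None]              # chains stuck at different means
print(f"R-hat converged: {gelman_rubin(good):.3f}, stuck: {gelman_rubin(bad):.2f}")
```

    Chains that have mixed give R-hat ≈ 1; chains stuck in different regions inflate the between-chain term and push R-hat well above 1.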

  1. MC2-3: Multigroup Cross Section Generation Code for Fast Reactor Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Changho; Yang, Won Sik

    This paper presents the methods and performance of the MC2-3 code, a multigroup cross-section generation code for fast reactor analysis, developed to improve the resonance self-shielding and spectrum calculation methods of MC2-2 and to simplify the current multistep schemes generating region-dependent broad-group cross sections. Using the basic neutron data from ENDF/B data files, MC2-3 solves the consistent P1 multigroup transport equation to determine the fundamental mode spectra for use in generating multigroup neutron cross sections. A homogeneous medium or a heterogeneous slab or cylindrical unit cell problem is solved at ultrafine (2082) or hyperfine (~400 000) group levels. In the resolved resonance range, pointwise cross sections are reconstructed with Doppler broadening at specified temperatures. The pointwise cross sections are directly used in the hyperfine group calculation, whereas for the ultrafine group calculation, self-shielded cross sections are prepared by numerical integration of the pointwise cross sections based upon the narrow resonance approximation. For both the hyperfine and ultrafine group calculations, unresolved resonances are self-shielded using the analytic resonance integral method. The ultrafine group calculation can also be performed for a two-dimensional whole-core problem to generate region-dependent broad-group cross sections. Verification tests have been performed using benchmark problems for various fast critical experiments, including Los Alamos National Laboratory critical assemblies; Zero-Power Reactor, Zero-Power Physics Reactor, and Bundesamt für Strahlenschutz experiments; the Monju start-up core; and the Advanced Burner Test Reactor. Verification and validation results with ENDF/B-VII.0 data indicated that eigenvalues from MC2-3/DIF3D agreed with Monte Carlo N-Particle 5 (MCNP5) or VIM Monte Carlo solutions within 200 pcm, and regionwise one-group fluxes were in good agreement with Monte Carlo solutions.
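
    The broad-group cross sections such a code produces are flux-weighted averages over energy groups, sigma_g = ∫_g sigma(E) phi(E) dE / ∫_g phi(E) dE. A sketch of that collapse step on invented pointwise data (this is the generic recipe, not MC2-3's ENDF/B processing):

```python
import numpy as np

def collapse(energy, sigma, flux, group_bounds):
    """Flux-weighted broad-group cross sections:
    sigma_g = integral_g(sigma * phi dE) / integral_g(phi dE)."""
    dE = np.gradient(energy)
    sig_g = []
    for lo, hi in zip(group_bounds[:-1], group_bounds[1:]):
        g = (energy >= lo) & (energy < hi)
        sig_g.append(np.sum(sigma[g] * flux[g] * dE[g]) / np.sum(flux[g] * dE[g]))
    return np.array(sig_g)

E = np.logspace(0, 7, 10000)          # pointwise grid, 1 eV to 10 MeV (toy)
sigma = 5.0 + 100.0 / np.sqrt(E)      # toy 1/v-like cross section, barns
phi = 1.0 / E                         # toy 1/E slowing-down weighting spectrum
bounds = np.logspace(0, 7, 8)         # collapse onto 7 broad groups
sig_broad = collapse(E, sigma, phi, bounds)
print(np.round(sig_broad, 2))
```

    Self-shielding enters this picture by modifying phi(E) near resonances before the collapse; the sketch uses a smooth spectrum instead.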

  2. Measurement of antiproton annihilation on Cu, Ag and Au with emulsion films

    NASA Astrophysics Data System (ADS)

    Aghion, S.; Amsler, C.; Ariga, A.; Ariga, T.; Bonomi, G.; Bräunig, P.; Brusa, R. S.; Cabaret, L.; Caccia, M.; Caravita, R.; Castelli, F.; Cerchiari, G.; Comparat, D.; Consolati, G.; Demetrio, A.; Di Noto, L.; Doser, M.; Ereditato, A.; Evans, C.; Ferragut, R.; Fesel, J.; Fontana, A.; Gerber, S.; Giammarchi, M.; Gligorova, A.; Guatieri, F.; Haider, S.; Hinterberger, A.; Holmestad, H.; Huse, T.; Kawada, J.; Kellerbauer, A.; Kimura, M.; Krasnický, D.; Lagomarsino, V.; Lansonneur, P.; Lebrun, P.; Malbrunot, C.; Mariazzi, S.; Matveev, V.; Mazzotta, Z.; Müller, S. R.; Nebbia, G.; Nedelec, P.; Oberthaler, M.; Pacifico, N.; Pagano, D.; Penasa, L.; Petracek, V.; Pistillo, C.; Prelz, F.; Prevedelli, M.; Ravelli, L.; Rienaecker, B.; RØhne, O. M.; Rotondi, A.; Sacerdoti, M.; Sandaker, H.; Santoro, R.; Scampoli, P.; Simon, M.; Smestad, L.; Sorrentino, F.; Testera, G.; Tietje, I. C.; Vamosi, S.; Vladymyrov, M.; Widmann, E.; Yzombard, P.; Zimmer, C.; Zmeskal, J.; Zurlo, N.

    2017-04-01

    The characteristics of low energy antiproton annihilations on nuclei (e.g. hadronization and product multiplicities) are not well known, and Monte Carlo simulation packages that use different models provide different descriptions of the annihilation events. In this study, we measured the particle multiplicities resulting from antiproton annihilations on nuclei. The results were compared with predictions obtained using different models in the simulation tools GEANT4 and FLUKA. For this study, we exposed thin targets (Cu, Ag and Au) to a very low energy antiproton beam from CERN's Antiproton Decelerator, exploiting the secondary beamline available in the AEgIS experimental zone. The antiproton annihilation products were detected using emulsion films developed at the Laboratory of High Energy Physics in Bern, where they were analysed at the automatic microscope facility. The fragment multiplicity measured in this study is in good agreement with results obtained with FLUKA simulations for both minimally and heavily ionizing particles.

  3. Preliminary design of CERN Future Circular Collider tunnel: first evaluation of the radiation environment in critical areas for electronics

    NASA Astrophysics Data System (ADS)

    Infantino, Angelo; Alía, Rubén García; Besana, Maria Ilaria; Brugger, Markus; Cerutti, Francesco

    2017-09-01

    As part of its post-LHC high energy physics program, CERN is conducting a study for a new proton-proton collider, called Future Circular Collider (FCC-hh), running at center-of-mass energies of up to 100 TeV in a new 100 km tunnel. The study includes a 90-350 GeV lepton collider (FCC-ee) as well as a lepton-hadron option (FCC-he). In this work, FLUKA Monte Carlo simulation was extensively used to perform a first evaluation of the radiation environment in critical areas for electronics in the FCC-hh tunnel. The model of the tunnel was created based on the original civil engineering studies already performed and was further integrated into the existing FLUKA models of the beam line. The radiation levels in critical areas, such as the racks for electronics and cables, power converters, service areas, and local tunnel extensions, were evaluated.
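
    Radiation levels of this kind are commonly folded with a device cross section to estimate single-event effect (SEE) rates, rate ≈ Φ_HEH × σ_SEE × N_bits. A toy estimate with invented numbers (the flux and cross section below are placeholders, not FCC-hh results):

```python
# Toy radiation-to-electronics estimate: expected single-event upsets from a
# high-energy hadron (HEH) fluence and a per-bit device cross section.
# All numbers are invented for illustration.

HEH_FLUX = 1.0e7          # HEH fluence rate, cm^-2 per year (illustrative)
SIGMA_SEE = 1.0e-14       # device SEE cross section, cm^2 per bit (illustrative)
N_BITS = 8 * 1024**3      # one gigabyte of memory, in bits

see_per_year = HEH_FLUX * SIGMA_SEE * N_BITS
print(f"expected upsets per device-year ~ {see_per_year:.1f}")
```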

  4. Plasmid integration in a wide range of bacteria mediated by the integrase of Lactobacillus delbrueckii bacteriophage mv4.

    PubMed Central

    Auvray, F; Coddeville, M; Ritzenthaler, P; Dupont, L

    1997-01-01

    Bacteriophage mv4 is a temperate phage infecting Lactobacillus delbrueckii subsp. bulgaricus. During lysogenization, the phage integrates its genome into the host chromosome at the 3' end of a tRNA(Ser) gene through a site-specific recombination process (L. Dupont et al., J. Bacteriol., 177:586-595, 1995). A nonreplicative vector (pMC1) based on the mv4 integrative elements (attP site and integrase-coding int gene) is able to integrate into the chromosome of a wide range of bacterial hosts, including Lactobacillus plantarum, Lactobacillus casei (two strains), Lactococcus lactis subsp. cremoris, Enterococcus faecalis, and Streptococcus pneumoniae. Integrative recombination of pMC1 into the chromosomes of all of these species is dependent on the int gene product and occurs specifically at the pMC1 attP site. The isolation and sequencing of pMC1 integration sites from these bacteria showed that in lactobacilli, pMC1 integrated into the conserved tRNA(Ser) gene. In the other bacterial species, where this tRNA gene is less well conserved or not conserved, secondary integration sites either in potential protein-coding regions or in intergenic DNA were used. A consensus sequence was deduced from the analysis of the different integration sites. The comparison of these sequences demonstrated the flexibility of the integrase toward the bacterial integration site and suggested the importance of the trinucleotide CCT at the 5' end of the core in the strand exchange reaction. PMID:9068626

  5. spMC: an R-package for 3D lithological reconstructions based on spatial Markov chains

    NASA Astrophysics Data System (ADS)

    Sartore, Luca; Fabbri, Paolo; Gaetan, Carlo

    2016-09-01

    The paper presents the spatial Markov Chains (spMC) R-package and a case study of subsoil simulation/prediction at a plain site in Northeastern Italy. spMC is a fairly complete collection of advanced methods for data inspection; in addition, it implements Markov chain models to estimate experimental transition probabilities of categorical lithological data. Simulation methods based on well-known prediction methods (such as indicator kriging and co-kriging) are also implemented in the spMC package, together with more advanced simulation methods, e.g. path methods and Bayesian procedures that exploit the maximum entropy. Since the spMC package was developed for intensive geostatistical computations, part of the code is implemented for parallel computation via OpenMP constructs. A final analysis compares the computational efficiency of the simulation/prediction algorithms using different numbers of CPU cores, considering the example data set of the case study included in the package.
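
    The basic estimate underlying such spatial Markov chain models is the one-step transition probability matrix obtained by counting category transitions along a sequence (e.g. lithologies down a borehole). A minimal sketch of that estimator (spMC itself is an R package; the categories below are invented):

```python
import numpy as np

def transition_matrix(sequence, states):
    """Row-normalized counts of one-step transitions in a categorical sequence."""
    idx = {s: i for i, s in enumerate(states)}
    counts = np.zeros((len(states), len(states)))
    for a, b in zip(sequence[:-1], sequence[1:]):
        counts[idx[a], idx[b]] += 1
    rows = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, rows, out=np.zeros_like(counts), where=rows > 0)

litho = ["clay", "clay", "sand", "sand", "sand", "gravel", "sand", "clay", "clay"]
P = transition_matrix(litho, ["clay", "sand", "gravel"])
print(P)  # each row sums to 1 (or stays 0 for a state with no outgoing transitions)
```

    spMC generalizes this to transition probabilities as a function of lag distance and direction, but the counting step is the same idea.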

  6. Detection And Classification Of Web Robots With Honeypots

    DTIC Science & Technology

    2016-03-01

    Detection and Classification of Web Robots with Honeypots, by Sean F. McKenna, March 2016. Master's thesis; Thesis Advisor: Neil Rowe; Second Reader: Justin P. Rohrer. Abstract (excerpt): Web robots are automated programs that systematically browse the Web, collecting information. Although

  7. Space-Time Processing for Tactical Mobile Ad Hoc Networks

    DTIC Science & Technology

    2010-05-01

    Spatial Diversity and Imperfect Channel Estimation on Wideband MC-DS-CDMA and MC-CDMA," IEEE Transactions on Communications, Vol. 57, No. 10, pp. 2988... include direct sequence code division multiple access (DS-CDMA), Frequency Hopped (FH) CDMA and Orthogonal Frequency Division Multiple Access (OFDMA)... capability, LPD/LPI, and operability in non-continuous spectrum. In addition, FH-CDMA is robust to the near-far problem, while DS-CDMA requires

  8. Quantitative analysis of optical properties of flowing blood using a photon-cell interactive Monte Carlo code: effects of red blood cells' orientation on light scattering.

    PubMed

    Sakota, Daisuke; Takatani, Setsuo

    2012-05-01

    Optical properties of flowing blood were analyzed using a photon-cell interactive Monte Carlo (pciMC) model, with physical properties of the flowing red blood cells (RBCs) such as cell size, shape, refractive index, distribution, and orientation as the parameters. The scattering of light by flowing blood at the He-Ne laser wavelength of 632.8 nm was significantly affected by the shear rate. The light was scattered more in the direction of flow as the flow rate increased; therefore, the light intensity transmitted forward in the direction perpendicular to the flow axis decreased. The pciMC model can duplicate the changes in photon propagation due to moving RBCs with various orientations. The RBC orientation that best reproduced the experimental results was with the cells' long axes perpendicular to the direction of blood flow. Moreover, the scattering probability was dependent on the orientation of the RBCs. Finally, the pciMC code was used to predict the hematocrit of flowing blood with an accuracy of approximately 1.0 HCT%. The pciMC model can provide the optical properties of flowing blood and will facilitate the development of non-invasive monitoring of blood in extracorporeal circulatory systems.
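
    Photon-transport MC codes of this kind sample a scattering angle from a phase function at every scattering event; in tissue optics the Henyey-Greenstein function with anisotropy factor g is the common choice. Below is the standard HG sampling step as a sketch (the pciMC model itself resolves individual photon-cell interactions instead, and the g value here is illustrative):

```python
import math, random

def sample_hg_cos_theta(g, rng=random):
    """Sample cos(theta) from the Henyey-Greenstein phase function (anisotropy g)."""
    if abs(g) < 1e-6:
        return 2.0 * rng.random() - 1.0               # isotropic limit
    s = (1.0 - g * g) / (1.0 - g + 2.0 * g * rng.random())
    return (1.0 + g * g - s * s) / (2.0 * g)

random.seed(1)
g = 0.98                                               # strongly forward-peaked
samples = [sample_hg_cos_theta(g) for _ in range(20000)]
mean_cos = sum(samples) / len(samples)
print(f"<cos theta> = {mean_cos:.3f}")                 # should sit close to g
```

    A useful property for checking such a sampler is that the mean cosine of the HG distribution equals g itself.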

  9. Efficient Coupling of Fluid-Plasma and Monte-Carlo-Neutrals Models for Edge Plasma Transport

    NASA Astrophysics Data System (ADS)

    Dimits, A. M.; Cohen, B. I.; Friedman, A.; Joseph, I.; Lodestro, L. L.; Rensink, M. E.; Rognlien, T. D.; Sjogreen, B.; Stotler, D. P.; Umansky, M. V.

    2017-10-01

    UEDGE has been valuable for modeling transport in the tokamak edge and scrape-off layer due in part to its efficient fully implicit solution of coupled fluid-neutrals and plasma models. We are developing an implicit coupling of the kinetic Monte-Carlo (MC) code DEGAS-2, as the neutrals model component, to the UEDGE plasma component, based on an extension of the Jacobian-free Newton-Krylov (JFNK) method to MC residuals. The coupling components build on the methods and coding already present in UEDGE. For the linear Krylov iterations, a procedure has been developed to "extract" a good preconditioner from that of UEDGE. This preconditioner may also be used to greatly accelerate the convergence rate of a relaxed fixed-point iteration, which may provide a useful "intermediate" algorithm. The JFNK method also requires calculation of Jacobian-vector products, for which any finite-difference procedure is inaccurate when a MC component is present. A semi-analytical procedure that retains the standard MC accuracy and fully kinetic neutrals physics is therefore being developed. Prepared for US DOE by LLNL under Contract DE-AC52-07NA27344 and LDRD project 15-ERD-059, by PPPL under Contract DE-AC02-09CH11466, and supported in part by the U.S. DOE, OFES.
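
    The Jacobian-vector products mentioned above are what make JFNK matrix-free: the Krylov solver only ever needs J·v, approximated by a finite difference of the residual, Jv ≈ [F(u + εv) − F(u)]/ε, which is precisely the quantity that becomes unreliable once the residual carries MC noise. A sketch on a smooth toy residual (illustrative; not the UEDGE/DEGAS-2 system):

```python
import numpy as np

def F(u):
    """Toy nonlinear residual F(u) = 0 (stand-in for a coupled implicit system)."""
    return np.array([u[0] ** 2 + u[1] - 3.0,
                     u[0] + u[1] ** 2 - 5.0])

def jacobian_vector(F, u, v, eps=1e-7):
    """Matrix-free Jv via one-sided finite difference.
    This is the step that degrades when F contains MC statistical noise."""
    return (F(u + eps * v) - F(u)) / eps

u = np.array([1.0, 2.0])
v = np.array([1.0, 0.0])
Jv = jacobian_vector(F, u, v)
# Analytic Jacobian at u is [[2*u0, 1], [1, 2*u1]], so J @ v = [2, 1].
print(Jv)
```

    With MC noise of amplitude δ in F, the finite-difference error scales like δ/ε, which is why the authors pursue a semi-analytical alternative.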

  10. Evaluation of surveillance methods for staphylococcal toxic shock syndrome.

    PubMed

    Lesher, Lindsey; Devries, Aaron; Danila, Richard; Lynfield, Ruth

    2009-05-01

    We compared passive surveillance and International Classification of Diseases, 9th Revision, codes for completeness of staphylococcal toxic shock syndrome (TSS) surveillance in the Minneapolis-St. Paul area, Minnesota, USA. TSS-specific codes identified 55% of cases compared with 30% by passive surveillance and were more sensitive (p = 0.0005, McNemar χ² = 12.25).
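
    McNemar's test uses only the discordant pairs b and c (cases detected by one method but not the other): χ² = (b − c)²/(b + c) with one degree of freedom. The sketch below uses hypothetical discordant counts chosen to reproduce a statistic of 12.25; the actual counts are not given in the abstract:

```python
import math

def mcnemar_chi2(b, c):
    """Uncorrected McNemar statistic from the discordant pair counts b and c."""
    return (b - c) ** 2 / (b + c)

def chi2_sf_1dof(x):
    """Survival function of chi-square with 1 dof: P(X > x) = erfc(sqrt(x/2))."""
    return math.erfc(math.sqrt(x / 2.0))

b, c = 15, 1                 # hypothetical discordant counts (not from the paper)
chi2 = mcnemar_chi2(b, c)
p = chi2_sf_1dof(chi2)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")
```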

  11. Evaluation of Surveillance Methods for Staphylococcal Toxic Shock Syndrome

    PubMed Central

    DeVries, Aaron; Danila, Richard; Lynfield, Ruth

    2009-01-01

    We compared passive surveillance and International Classification of Diseases, 9th Revision, codes for completeness of staphylococcal toxic shock syndrome (TSS) surveillance in the Minneapolis–St. Paul area, Minnesota, USA. TSS-specific codes identified 55% of cases compared with 30% by passive surveillance and were more sensitive (p = 0.0005, McNemar χ² = 12.25). PMID:19402965

  12. 76 FR 40849 - Post Office (PO) Box Fee Groups for Merged Locations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-12

    ... POSTAL SERVICE 39 CFR Part 111 Post Office (PO) Box Fee Groups for Merged Locations AGENCY: Postal... Locations.'' Faxed comments are not accepted. FOR FURTHER INFORMATION CONTACT: Nan McKenzie at 202-268-3089... boxes move to a different ZIP Code location because of a merger of two or more ZIP Code locations into a...

  13. A Monte Carlo calculation model of electronic portal imaging device for transit dosimetry through heterogeneous media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoon, Jihyung; Jung, Jae Won, E-mail: jungj@ecu.edu; Kim, Jong Oh

    2016-05-15

    Purpose: To develop and evaluate a fast Monte Carlo (MC) dose calculation model of electronic portal imaging device (EPID) based on its effective atomic number modeling in the XVMC code. Methods: A previously developed EPID model, based on the XVMC code by density scaling of EPID structures, was modified by additionally considering effective atomic number (Z{sub eff}) of each structure and adopting a phase space file from the EGSnrc code. The model was tested under various homogeneous and heterogeneous phantoms and field sizes by comparing the calculations in the model with measurements in EPID. In order to better evaluate themore » model, the performance of the XVMC code was separately tested by comparing calculated dose to water with ion chamber (IC) array measurement in the plane of EPID. Results: In the EPID plane, calculated dose to water by the code showed agreement with IC measurements within 1.8%. The difference was averaged across the in-field regions of the acquired profiles for all field sizes and phantoms. The maximum point difference was 2.8%, affected by proximity of the maximum points to penumbra and MC noise. The EPID model showed agreement with measured EPID images within 1.3%. The maximum point difference was 1.9%. The difference dropped from the higher value of the code by employing the calibration that is dependent on field sizes and thicknesses for the conversion of calculated images to measured images. Thanks to the Z{sub eff} correction, the EPID model showed a linear trend of the calibration factors unlike those of the density-only-scaled model. The phase space file from the EGSnrc code sharpened penumbra profiles significantly, improving agreement of calculated profiles with measured profiles. Conclusions: Demonstrating high accuracy, the EPID model with the associated calibration system may be used for in vivo dosimetry of radiation therapy. 
Through this study, an MC model of the EPID has been developed, and its performance has been rigorously investigated for transit dosimetry.

  14. SNPs of melanocortin 4 receptor (MC4R) associated with body weight in Beagle dogs.

    PubMed

    Zeng, Ruixia; Zhang, Yibo; Du, Peng

    2014-01-01

Melanocortin 4 receptor (MC4R), which is associated with inherited human obesity, is involved in food intake and body weight regulation in mammals. To study the relationships between MC4R gene polymorphism and body weight in Beagle dogs, we detected and compared the nucleotide sequence of the whole coding region and the 3'- and 5'-flanking regions of the dog MC4R gene (1214 bp). In 120 Beagle dogs, two SNPs (A420C, C895T) were identified, and their relation with body weight was analyzed with the RFLP-PCR method. The results showed that the SNP at A420C, which changes amino acid 101 of the MC4R protein from asparagine to threonine, was significantly associated with canine body weight, while body weight variations were significant in female dogs carrying the MC4R nonsense mutation at C895T. This suggests that the two SNPs might affect the function of the MC4R gene related to body weight in Beagle dogs. Therefore, MC4R is a candidate gene for selecting dogs of different sizes, with the MC4R SNPs (A420C, C895T) being potentially valuable as genetic markers.

  15. Effect of vertical sleeve gastrectomy in melanocortin receptor 4-deficient rats.

    PubMed

    Mul, Joram D; Begg, Denovan P; Alsters, Suzanne I M; van Haaften, Gijs; Duran, Karen J; D'Alessio, David A; le Roux, Carel W; Woods, Stephen C; Sandoval, Darleen A; Blakemore, Alexandra I F; Cuppen, Edwin; van Haelst, Mieke M; Seeley, Randy J

    2012-07-01

Bariatric surgery is currently the most effective treatment for obesity. Vertical sleeve gastrectomy (VSG), a commonly applied bariatric procedure, involves surgically removing most of the volume of the stomach. In humans, partial loss of melanocortin receptor-4 (MC4R) activity is the most common monogenic correlate of obesity regardless of lifestyle. At present it is unclear whether genetic alteration of MC4R signaling modulates the beneficial effects of VSG. Following VSG, we analyzed body weight, food intake, glucose sensitivity, and macronutrient preference of wild-type and MC4R-deficient (Mc4r(+/-) and Mc4r(-/-)) rats compared with sham-operated controls. VSG reduced body weight and fat mass, improved glucose metabolism, and also shifted preference toward carbohydrates and away from fat. All of this occurred independently of MC4R activity. In addition, MC4R was resequenced in 46 human subjects who underwent VSG. We observed common genetic variations in the coding sequence of MC4R in five subjects. However, none of those variations appeared to affect the outcome of VSG. Taken together, these data suggest that the beneficial effect of VSG on body weight and glucose metabolism is not mediated by alterations in MC4R activity.

  17. Initial Ada components evaluation

    NASA Technical Reports Server (NTRS)

    Moebes, Travis

    1989-01-01

SAIC has the responsibility for independent test and validation of the SSE and has been using a mathematical functions library package implemented in Ada to test the SSE IV and V process. The library package consists of elementary mathematical functions and is both machine and accuracy independent. The SSE Ada components evaluation includes code complexity metrics based on Halstead's software science metrics and McCabe's measure of cyclomatic complexity. Halstead's metrics are based on the number of operators and operands in a logical unit of code and are compiled from the number of distinct operators, the number of distinct operands, and the total number of occurrences of operators and operands. These metrics give an indication of the physical size of a program in terms of operators and operands and are used diagnostically to point to potential problems. McCabe's Cyclomatic Complexity Metric (CCM) is compiled from flow charts transformed to equivalent directed graphs; it measures the total number of linearly independent paths through the code's control structure. These metrics were computed for the Ada mathematical functions library using Software Automated Verification and Validation (SAVVAS), the SSE IV and V tool. A table with selected results was shown, indicating that most of these routines are of good quality. Thresholds for the Halstead measures indicate poor quality if the length metric exceeds 260 or the difficulty is greater than 190. The McCabe CCM indicated high-quality software products.
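Both metric families described above are simple enough to compute directly from tallied counts. A minimal sketch (not the SAVVAS tool; function names hypothetical; Halstead thresholds from the text, and 10 used as McCabe's conventional CCM limit):

```python
import math

def halstead(n1, n2, N1, N2):
    """n1/n2: distinct operators/operands; N1/N2: total occurrences."""
    length = N1 + N2                        # program length N
    vocabulary = n1 + n2                    # vocabulary n
    volume = length * math.log2(vocabulary) # volume V = N * log2(n)
    difficulty = (n1 / 2) * (N2 / n2)       # difficulty D
    return {"length": length, "volume": volume, "difficulty": difficulty}

def cyclomatic(edges, nodes, components=1):
    """McCabe CCM from a control-flow graph: V(G) = E - N + 2P."""
    return edges - nodes + 2 * components

def flag_quality(metrics, ccm, ccm_limit=10):
    # Halstead thresholds follow the text; 10 is McCabe's conventional limit
    poor = (metrics["length"] > 260 or metrics["difficulty"] > 190
            or ccm > ccm_limit)
    return "poor" if poor else "good"
```

For example, a unit with 10 distinct operators, 15 distinct operands, and 40/60 total occurrences has length 100 and difficulty 20, well under the thresholds.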

  18. MUFFSgenMC: An Open Source MUon Flexible Framework for Spectral GENeration for Monte Carlo Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chatzidakis, Stylianos; Greulich, Christopher

    A cosmic ray Muon Flexible Framework for Spectral GENeration for Monte Carlo Applications (MUFFSgenMC) has been developed to support state-of-the-art cosmic ray muon tomographic applications. The flexible framework allows for easy and fast creation of source terms for popular Monte Carlo applications like GEANT4 and MCNP. This code framework simplifies the process of simulations used for cosmic ray muon tomography.
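MUFFSgenMC's internals are not described here, but any source-term generator of this kind must sample muon directions from a cosmic-ray angular distribution. A minimal illustrative sketch, assuming the textbook ~cos²θ zenith-angle intensity (an assumption for illustration, not MUFFSgenMC's actual model):

```python
import math
import random

def sample_zenith():
    # Zenith p.d.f. p(theta) ∝ cos^2(theta) * sin(theta) on [0, pi/2]
    # (cos^2 intensity weighted by solid angle). With u = cos(theta),
    # p(u) ∝ u^2 on [0, 1], so inverse-transform sampling gives u = U^(1/3).
    u = random.random() ** (1.0 / 3.0)
    return math.acos(u)

def sample_direction():
    # Azimuth is uniform; return (theta, phi) for a downward-going muon
    return sample_zenith(), 2.0 * math.pi * random.random()
```

For this p.d.f. the expected value of cos(theta) is 3/4, which gives a quick sanity check on the sampler.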

  19. NRL Fact Book

    DTIC Science & Technology

    1989-04-01

house research laboratory under the command of the Chief of Naval Research (CNR). As the corporate research laboratory of the Navy, NRL is an important...L.S. Herrin Ms. B.J. McDonald Mr. R.C. Spragg Ms. M.E. Barton Ms. J. Hileman Title Head, Office of Management and Administration Deputy Head...Administrative Officer Head, Management Information Staff Head, Directives Staff Head, GLISIP Program Point of contact: Ms. B.J. McDonald , Code 1005.2, 767-3634

  20. Reilly pulls it together with care

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kiesche, E.S.

    1992-12-09

Reilly Industries (Indianapolis) has changed strategic planning procedures to incorporate Responsible Care into its business plans. Each of the company's business units budgets for Responsible Care and reports quarterly on progress in implementing the codes, says Jacqueline Fernette, corporate communications coordinator and Responsible Care coordinator. The company's goal is to achieve full implementation by the end of 1997. In Reilly's 1993 budget, 20% of capital is directed at Responsible Care, says president Robert McNeeley. "We hold unit managers responsible for planning Responsible Care within their businesses and reporting on them on a quarterly basis," says McNeeley. The firm makes pyridine, coal tar, potash and related chemicals, and specialized esters, and posts annual sales in the $250 million-$300 million range. Reilly has seven plants and 900 employees. Incorporating Responsible Care into the strategic business plan required a fair amount of administrative work to make sure that all business unit managers understood the concepts and were working in comparable terms, says McNeeley. "We needed to bring the managers up to speed in six codes, so there was a training aspect to it."

  1. 76 FR 45570 - Consumer Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-29

    ... Rooker Coleman Institute for Cognitive Disabilities--Clayton Lewis Consumer Action--Ken McEldowney... http://accessibleevent.com . The Web page prompts for an Event Code which is, 005202376. To learn about...

  2. SU-G-JeP1-13: Innovative Tracking Detector for Dose Monitoring in Hadron Therapy: Realization and Monte Carlo Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rucinski, A; Mancini-Terracciano, C; Paramatti, R

    2016-06-15

Purpose: Development of strategies to monitor range uncertainties is necessary to improve treatment planning in Charged Particle Therapy (CPT) and fully exploit the advantages of ion beams. Our group developed (within the framework of the INSIDE project funded by the Italian research ministry) and is currently building a compact detector, the Dose Profiler (DP), able to backtrack charged secondary particles produced in the patient during the irradiation. Furthermore, we are studying a monitoring strategy exploiting charged secondary emission profiles to control the range of the ion beam. Methods: This contribution reports on the DP detector design and construction status. The detector consists of a charged secondary tracker composed of scintillating fiber layers and a LYSO calorimeter for particle energy measurement. The detector layout has been optimized using the FLUKA Monte Carlo (MC) simulation software. The simulation of a 220 MeV carbon beam impinging on a PMMA target has been performed to study the detector response, exploiting previous secondary radiation measurements performed by our group. The emission profile of charged secondary particles was reconstructed by backtracking the particles to their generation point to benchmark the DP performance. Results: The DP construction status, including the technological details, will be presented. The feasibility of range monitoring with the DP will be demonstrated by means of MC studies. The correlation of the charged secondary particle emission shape with the position of the Bragg peak (BP) will be shown, as well as the spatial resolution achievable on the BP position estimation (less than 3 mm) in clinical-like conditions. Conclusion: The simulation studies supported the feasibility of an accurate range monitoring technique exploiting the charged secondary fragments emitted during particle therapy treatment. The DP experimental tests are foreseen in 2016, at the CNAO particle therapy center in Pavia.

  3. SU-F-T-364: Monte Carlo-Dose Verification of Volumetric Modulated Arc Therapy Plans Using AAPM TG-119 Test Patterns

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Onizuka, R; Araki, F; Ohno, T

    2016-06-15

Purpose: To investigate the Monte Carlo (MC)-based dose verification of VMAT plans generated by a treatment planning system (TPS). Methods: The AAPM TG-119 test structure set was used for VMAT plans by the Pinnacle3 TPS (convolution/superposition), using a Synergy radiation head delivering a 6 MV beam with the Agility MLC. The Synergy was simulated with the EGSnrc/BEAMnrc code, and VMAT dose distributions were calculated with the EGSnrc/DOSXYZnrc code under the same irradiation conditions as the TPS. VMAT dose distributions of TPS and MC were compared with those of EBT3 film by 2-D gamma analysis with ±3%/3 mm criteria and a threshold of 30% of the prescribed dose. VMAT dose distributions of TPS and MC were also compared by DVHs and 3-D gamma analysis with ±3%/3 mm criteria and a threshold of 10%, and 3-D passing rates for PTVs and OARs were analyzed. Results: TPS dose distributions differed from those of film, especially for Head & neck. The dose difference between TPS and film results from limited calculation accuracy for complex MLC motion, such as the tongue-and-groove effect. In contrast, MC dose distributions were in good agreement with those of film. This is because MC can fully model the MLC configuration and accurately reproduce the MLC motion between control points in VMAT plans. D95 of the PTV for Prostate, Head & neck, C-shaped, and Multi Target was 97.2%, 98.1%, 101.6%, and 99.7% for TPS and 95.7%, 96.0%, 100.6%, and 99.1% for MC, respectively. Similarly, 3-D gamma passing rates of each PTV for TPS vs. MC were 100%, 89.5%, 99.7%, and 100%, respectively. The 3-D passing rates of the TPS were reduced for complex VMAT fields like Head & neck because the MLCs are not modeled completely in the TPS. Conclusion: MC-calculated VMAT dose distributions are useful for the 3-D dose verification of VMAT plans produced by a TPS.
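The 2-D and 3-D gamma comparisons above reduce, in one dimension, to minimizing a combined dose-difference/distance metric over the evaluated distribution. A simplified 1-D sketch of a global gamma analysis with ±3%/3 mm criteria and a 30% low-dose threshold (function names hypothetical; not the authors' implementation, and a brute-force search rather than an optimized one):

```python
import numpy as np

def gamma_1d(ref_dose, eval_dose, positions, dose_tol=0.03, dist_tol=3.0):
    """Global 1-D gamma index: for each reference point, minimize the
    combined dose/distance metric over all evaluated points."""
    norm = ref_dose.max()  # global dose normalization
    gammas = np.empty(len(ref_dose), dtype=float)
    for i, (r, x) in enumerate(zip(ref_dose, positions)):
        dd = (eval_dose - r) / (dose_tol * norm)  # dose-difference term
        dx = (positions - x) / dist_tol           # distance term (mm)
        gammas[i] = np.sqrt(dd ** 2 + dx ** 2).min()
    return gammas

def passing_rate(gammas, ref_dose, threshold=0.3):
    # Passing rate over points above the low-dose threshold, as above
    mask = ref_dose >= threshold * ref_dose.max()
    return float((gammas[mask] <= 1.0).mean())
```

Identical reference and evaluated distributions give gamma = 0 everywhere and a 100% passing rate, which is a useful self-test.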

  4. Characterization of a multilayer ionization chamber prototype for fast verification of relative depth ionization curves and spread-out-Bragg-peaks in light ion beam therapy.

    PubMed

    Mirandola, Alfredo; Magro, Giuseppe; Lavagno, Marco; Mairani, Andrea; Molinelli, Silvia; Russo, Stefania; Mastella, Edoardo; Vai, Alessandro; Maestri, Davide; La Rosa, Vanessa; Ciocca, Mario

    2018-05-01

To dosimetrically characterize a multilayer ionization chamber (MLIC) prototype for quality assurance (QA) of pristine integral ionization curves (ICs) and spread-out-Bragg-peaks (SOBPs) for scanning light ion beams. QUBE (De.Tec.Tor., Torino, Italy) is a modular detector designed for QA in particle therapy (PT). Its main module is an MLIC detector, able to evaluate particle beam relative depth ionization distributions at different beam energies and modulations. The charge collecting electrodes are made of aluminum, for a nominal water equivalent thickness (WET) of ~75 mm. The detector prototype was calibrated by acquiring the signals in the initial plateau region of a pristine BP and in terms of WET. Subsequently, it was characterized in terms of response repeatability, linearity, short-term stability, and dose rate dependence. Beam-induced measurements of activation in terms of ambient dose equivalent rate were also performed. To improve the detector's coarse native spatial resolution (~2.3 mm), several consecutive acquisitions with a set of certified 0.175-mm-thick PMMA sheets (Goodfellow, Cambridge Limited, UK), placed in front of the QUBE mylar entrance window, were performed. The ICs/SOBPs were obtained by summing the set of measurements, acquired one PMMA layer at a time. The newly obtained detector spatial resolution allowed the experimental measurements to be properly compared against the reference curves acquired in water with the PTW Peakfinder. Furthermore, the QUBE detector was modeled in the FLUKA Monte Carlo (MC) code following the technical design details, and ICs/SOBPs were calculated. Measurements showed a high repeatability: mean relative standard deviation within ±0.5% for all channels and both particle types. Moreover, the detector response was linear with dose (R² > 0.998) and independent of the dose rate. 
The mean deviation over the channel-by-channel readout with respect to the reference beam flux (100%) was equal to 0.7% (1.9%) for the 50% (20%) beam flux level. The short-term stability of the gain calibration was very satisfactory for both particle types: the channel mean relative standard deviation was within ±1% for all the acquisitions performed at different times. The ICs obtained with the MLIC QUBE at improved resolution satisfactorily matched both the MC simulations and the reference curves acquired with the Peakfinder. Deviations from the reference values in terms of BP position, peak width, and distal fall-off were submillimetric for both particle types in the whole investigated energy range. For modulated SOBPs, a submillimetric deviation was found when comparing the experimental MLIC QUBE data against both the reference values and the MC calculations. The relative dose deviations for the experimental MLIC QUBE acquisitions, with respect to Peakfinder data, ranged from ~1% to ~3.5%. A maximum value of 14.1 μSv/h was measured in contact with the QUBE entrance window soon after a long irradiation with carbon ions. The MLIC QUBE appears to be a promising detector for accurately measuring pristine ICs and SOBPs. A simple procedure to improve the intrinsic spatial resolution of the detector is proposed. Being very accurate, precise, fast responding, and easy to handle, the detector is well suited for daily checks in PT. © 2018 American Association of Physicists in Medicine.
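The resolution-improvement procedure described above merges acquisitions taken with successive thin PMMA shifts into one finer depth grid. A schematic sketch of this interleaving step (array-based; function name hypothetical; the real detector geometry and WET conversion are omitted):

```python
import numpy as np

def interleave_acquisitions(acquisitions, pitch, shift):
    """Merge coarse depth scans taken with successive absorber shifts.
    acquisitions: list of 1-D arrays; scan k is shifted by k*shift in depth.
    Returns (depths, values) sorted on the refined depth grid."""
    depths, values = [], []
    for k, scan in enumerate(acquisitions):
        d = np.arange(len(scan)) * pitch + k * shift  # depth of each channel
        depths.append(d)
        values.append(np.asarray(scan, dtype=float))
    depths = np.concatenate(depths)
    values = np.concatenate(values)
    order = np.argsort(depths)
    return depths[order], values[order]
```

With a channel pitch of ~2.3 mm and 0.175 mm shifts, each extra acquisition adds one interleaved sample per channel interval, refining the effective sampling of the curve.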

  5. Envelope of Correlation Used with Deconvolution and Reconvolution to Remove the Direct Arrival in a Multipath Environment

    DTIC Science & Technology

    1990-03-01

London - Amsterdam: Geophysical Press, 1984. 3 Dicus, Ronald L., "Impulse response estimation with underwater explosive charge acoustic signals," Journal of... Ronald N. Bracewell, The Fourier Transform and Its Applications. New York: McGraw-Hill, 1978, pp. 267-71. 6 Julius S. Bendat and Allan G. Pierson...Feuillade Code 212 Ted Kennedy L. Dolly Lee Code 240 Dr. Ron Wagstaff 833 Hancock Sq., Suite G Dr. Robert Farwell Bay St. Louis, MS 39520 Code 242 Roger

  6. Absorbed Dose and Dose Equivalent Calculations for Modeling Effective Dose

    NASA Technical Reports Server (NTRS)

    Welton, Andrew; Lee, Kerry

    2010-01-01

While in orbit, astronauts are exposed to a much higher dose of ionizing radiation than when on the ground. It is important to model pre-flight how shielding designs on spacecraft reduce radiation effective dose, and to determine whether a danger to humans is present. However, in order to calculate effective dose, dose equivalent calculations are needed. Dose equivalent takes into account the absorbed dose of radiation and the biological effectiveness of the ionizing radiation. This is important in preventing long-term, stochastic radiation effects in humans spending time in space. Monte Carlo simulations run with the particle transport code FLUKA give absorbed and equivalent dose data for relevant shielding. The shielding geometry used in the dose calculations is a layered slab design, consisting of aluminum, polyethylene, and water. Water is used to simulate the soft tissues that compose the human body. The results obtained will provide information on how the shielding performs with many thicknesses of each material in the slab, making them directly applicable to modern spacecraft shielding geometries.
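The dose-equivalent step described above weights each absorbed-dose component by a radiation quality factor. A minimal sketch with hypothetical component doses and quality factors (illustrative values only, not results from the FLUKA runs described above):

```python
def dose_equivalent(absorbed_doses_gy, quality_factors):
    """H (Sv) = sum over components of D_i (Gy) * Q_i."""
    return sum(d * q for d, q in zip(absorbed_doses_gy, quality_factors))

# Hypothetical component doses (Gy) and quality factors behind a slab shield:
components = {
    "protons":  (0.010, 2.0),
    "neutrons": (0.002, 10.0),
    "gammas":   (0.005, 1.0),
}
doses, qs = zip(*components.values())
H = dose_equivalent(doses, qs)  # 0.010*2.0 + 0.002*10.0 + 0.005*1.0 = 0.045 Sv
```

In practice the quality factor varies with particle type and energy (LET), so a transport code tallies the weighted sum over the full spectrum rather than over a few fixed components.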

  7. Impact of the LHC beam abort kicker prefire on high luminosity insertion and CMS detector performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    A.I. Drozhdin, N.V. Mokhov and M. Huhtinen

    1999-04-13

The effect of possible accidental beam loss in the LHC on the IP5 insertion elements and the CMS detector is studied via realistic Monte Carlo simulations. Such beam loss could be the consequence of an unsynchronized abort or, in the worst case, an accidental prefire of one of the abort kicker modules. Simulations with the STRUCT code show that these beam losses would take place in the IP5 inner and outer triplets. MARS simulations of the hadronic and electromagnetic cascades induced in such an event indicate severe heating of the inner triplet quadrupoles. In order to protect the IP5 elements, two methods are proposed: a set of shadow collimators in the outer triplet, and compensation of the prefired module using a special module charged with an opposite voltage (antikicker). The remnants of the accidental beam loss entering the experimental hall have been used as input for FLUKA simulations in the CMS detector. It is shown that it is vital to take measures to reliably protect the expensive CMS tracker components.

  8. Depth profile of production yields of natPb(p, xn) 206,205,204,203,202,201Bi nuclear reactions

    NASA Astrophysics Data System (ADS)

    Mokhtari Oranj, Leila; Jung, Nam-Suk; Kim, Dong-Hyun; Lee, Arim; Bae, Oryun; Lee, Hee-Seock

    2016-11-01

Experimental and simulation studies on the depth profiles of production yields of natPb(p, xn) 206,205,204,203,202,201Bi nuclear reactions were carried out. Irradiation experiments were performed at the high-intensity proton linac facility (KOMAC) in Korea. The targets, irradiated by 100-MeV protons, were arranged in a stack consisting of natural Pb, Al, and Au foils and Pb plates. The proton beam intensity was determined by the activation analysis method, using the 27Al(p, 3p1n)24Na, 197Au(p, p1n)196Au, and 197Au(p, p3n)194Au monitor reactions, and also by Gafchromic film dosimetry. The yields of the radionuclides produced in the natPb activation foils and monitor foils were measured with an HPGe spectroscopy system. Monte Carlo simulations were performed with the FLUKA, PHITS/DCHAIN-SP, and MCNPX/FISPACT codes, and the calculated data were compared with the experimental results. A satisfactory agreement was observed between the present experimental data and the simulations.
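The activation analysis method infers the beam intensity from the end-of-bombardment activity of a monitor reaction product. A thin-target sketch of this relation (function name hypothetical; self-absorption, decay during counting, and beam-current fluctuations are ignored):

```python
import math

def beam_fluence_rate(activity_bq, sigma_cm2, n_target_atoms,
                      half_life_s, t_irr_s):
    """Invert the thin-target activation equation
        A = phi * sigma * N_T * (1 - exp(-lambda * t_irr))
    for the fluence rate phi (particles cm^-2 s^-1) at end of bombardment."""
    lam = math.log(2.0) / half_life_s          # decay constant
    saturation = 1.0 - math.exp(-lam * t_irr_s)  # buildup toward saturation
    return activity_bq / (sigma_cm2 * n_target_atoms * saturation)
```

For an irradiation much longer than the product half-life the saturation factor approaches 1, and the fluence rate reduces to A/(sigma * N_T).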

  9. SU-F-T-217: A Comprehensive Monte-Carlo Study of Out-Of-Field Secondary Neutron Spectra in a Scanned-Beam Proton Therapy Treatment Room

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Englbrecht, F; Parodi, K; Trinkl, S

    2016-06-15

Purpose: To simulate secondary neutron radiation fields produced at different positions during phantom irradiation inside a scanning proton therapy gantry treatment room, and to identify their origin, energy distribution, and angular emission as a function of proton beam energy. Methods: The GEANT4 and FLUKA Monte-Carlo codes were used to model the relevant parts of the treatment room in a gantry-equipped pencil beam scanning proton therapy facility, including walls, floor, metallic gantry components, patient table, and the homogeneous PMMA target. The proton beams were modeled based on experimental beam ranges in water and spot shapes in air. Neutron energy spectra were simulated at 0°, 45°, 90°, and 135° relative to the beam axis, at 2 m distance from the isocenter, for 11×11 cm² fields at 75 MeV, 140 MeV, and 200 MeV, and at 118 MeV with a 5 cm PMMA range-shifter. The total neutron energy distribution was recorded for these four positions and proton energies. Additionally, the room components generating secondary neutrons and their contributions to the total spectrum were identified and quantified. Results: FLUKA- and GEANT4-simulated neutron spectra showed good general agreement over the whole energy range of 10⁻⁹ to 10² MeV. Comparison of measured spectra with the simulated contributions of the various room components helped to limit the complexity of the room model by identifying the dominant contributions to the secondary neutron spectrum. The iron of the bending magnet and the counterweight were identified as sources of secondary evaporation neutrons, which were lacking in simplified room models. Conclusion: Thorough Monte-Carlo simulations have been performed to complement Bonner-sphere spectrometry measurements of secondary neutrons in a clinical proton therapy treatment room. 
Such calculations helped disentangle the origin of secondary neutrons and their dominant contributions to measured spectra, besides providing a useful validation of widely used Monte-Carlo packages against experimental data. Supported by the Cluster of Excellence of the German Research Foundation (DFG) “Munich-Centre for Advanced Photonics (MAP)”.

  10. SU-C-204-06: Monte Carlo Dose Calculation for Kilovoltage X-Ray-Psoralen Activated Cancer Therapy (X-PACT): Preliminary Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mein, S; Gunasingha, R; Nolan, M

Purpose: X-PACT is an experimental cancer therapy in which kV x-rays are used to photo-activate anti-cancer therapeutics through phosphor intermediaries (phosphors that absorb x-rays and re-radiate as UV light). Clinical trials in pet dogs are currently underway (NC State College of Veterinary Medicine), and an essential component is the ability to model the kV dose in these dogs. Here we report the commissioning and characterization of a Monte Carlo (MC) treatment planning simulation tool to calculate X-PACT radiation doses in canine trials. Methods: The FLUKA multi-particle MC simulation package was used to simulate a standard X-PACT radiation treatment beam of 80 kVp with the Varian OBI x-ray source geometry. The beam quality was verified by comparing measured and simulated attenuation of the beam by various thicknesses of aluminum (2–4.6 mm) under narrow-beam conditions (HVL). The beam parameters at commissioning were then corroborated using MC, characterized and verified with empirically collected commissioning data, including percent depth dose curves (PDD), back-scatter factors (BSF), collimator scatter factors, and the heel effect. All simulations were conducted for N=30M histories at M=100 iterations. Results: HVL and PDD simulation data agreed with an average percent error of 2.42%±0.33 and 6.03%±1.58, respectively. The mean square error (MSE) values for HVL and PDD (0.07% and 0.50%) were low, as expected; however, longer simulations are required to validate convergence to the expected values. Qualitatively, pre- and post-filtration source spectra matched well with 80 kVp references generated via the SPEKTR software. Further validation of commissioning data simulation is underway in preparation for first-time 3D dose calculations with canine CBCT data. Conclusion: We have prepared a Monte Carlo simulation capable of accurate dose calculation for use with ongoing X-PACT canine clinical trials. 
Preliminary results show good agreement with measured data and hold promise for accurate quantification of dose for this novel psoralen X-ray therapy. Funding Support, Disclosures, & Conflict of Interest: The Monte Carlo simulation work was not funded; Drs. Adamson & Oldham have received funding from Immunolight LLC for X-PACT research.
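The HVL verification described above compares measured and simulated narrow-beam attenuation. A minimal sketch of extracting an HVL from transmission data by a log-linear least-squares fit, assuming a single effective attenuation coefficient (function name hypothetical; real kV beams harden with depth, so this is an approximation):

```python
import math

def hvl_from_attenuation(thicknesses_mm, transmissions):
    """Fit ln(T) = -mu * x by least squares under narrow-beam conditions,
    then return HVL = ln(2) / mu (same length unit as the thicknesses)."""
    xs, ys = thicknesses_mm, [math.log(t) for t in transmissions]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    mu = -slope  # effective linear attenuation coefficient
    return math.log(2.0) / mu
```

Feeding the fit with synthetic single-exponential data recovers the HVL exactly, which is a convenient correctness check before applying it to measured transmissions.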

  11. Monte Carlo modeling of a conventional X-ray computed tomography scanner for gel dosimetry purposes.

    PubMed

    Hayati, Homa; Mesbahi, Asghar; Nazarpoor, Mahmood

    2016-01-01

Our purpose in the current study was to model an X-ray CT scanner with the Monte Carlo (MC) method for gel dosimetry. A conventional CT scanner with a single detector array was modeled using the MCNPX MC code. The MC-calculated photon fluence in the detector arrays was used for image reconstruction of a simple water phantom as well as of a polyacrylamide polymer gel (PAG) used for radiation therapy. Image reconstruction was performed with the filtered back-projection method, using a Hann filter and spline interpolation. Using the MC results, we obtained the dose-response curve for images of irradiated gel at different absorbed doses. A spatial resolution of about 2 mm was found for our simulated MC model. The MC-based CT images of the PAG gel showed a reliable increase in CT number with increasing absorbed dose. Our results also showed that the current MC model of a CT scanner can be used for further studies on the parameters that influence the usability and reliability of results, such as the photon energy spectra and exposure techniques, in X-ray CT gel dosimetry.
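Filtered back-projection with a Hann filter, as used above, multiplies each projection's spectrum by a Hann-windowed ramp filter before back-projecting. A minimal sketch of that filtering step only (function names hypothetical; back-projection and spline interpolation are omitted):

```python
import numpy as np

def hann_ramp_filter(n):
    """Ramp (Ram-Lak) filter in the frequency domain, tapered by a Hann
    window to suppress high-frequency noise amplification."""
    freqs = np.fft.fftfreq(n)                       # cycles/sample in [-0.5, 0.5)
    ramp = np.abs(freqs)
    hann = 0.5 * (1.0 + np.cos(2.0 * np.pi * freqs))  # 1 at DC, 0 at Nyquist
    return ramp * hann

def filter_projection(projection):
    """Apply the Hann-windowed ramp filter to one detector-row projection."""
    n = len(projection)
    spectrum = np.fft.fft(projection)
    return np.real(np.fft.ifft(spectrum * hann_ramp_filter(n)))
```

Because the ramp filter is zero at DC, a constant projection filters to (numerically) zero, which makes for a simple sanity check.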

  12. The alpaca melanocortin 1 receptor: gene mutations, transcripts, and relative levels of expression in ventral skin biopsies.

    PubMed

    Chandramohan, Bathrachalam; Renieri, Carlo; La Manna, Vincenzo; La Terza, Antonietta

    2015-01-01

The objectives of the present study were to characterize the MC1R gene, its transcripts, and the single nucleotide polymorphisms (SNPs) associated with coat color in alpaca. Full-length cDNA amplification revealed the presence of two transcripts, named F1 and F2, differing only in the length of their 5'-terminal untranslated region (UTR) sequences and presenting color-specific expression. Whereas the F1 transcript was common to white and colored (black and brown) alpaca phenotypes, the shorter F2 transcript was specific to white alpaca. Further sequencing of the MC1R gene in white and colored alpaca identified a total of twelve SNPs: nine in the coding region (four silent mutations, c.126C>A, c.354T>C, c.618G>A, and c.933G>A, and five missense mutations, c.82A>G, c.92C>T, c.259A>G, c.376A>G, and c.901C>T) and three in the 3'UTR. A 4 bp deletion (c.224_227del) was also identified in the coding region. Molecular segregation analysis revealed that combined mutations at the MC1R locus could account for eumelanin and pheomelanin synthesis in alpaca. Overall, our data refine what is known about the MC1R gene and provide additional information on its role in alpaca pigmentation.

  13. Monte Carlo dosimetric characterization of the Flexisource Co-60 high-dose-rate brachytherapy source using PENELOPE.

    PubMed

    Almansa, Julio F; Guerrero, Rafael; Torres, Javier; Lallena, Antonio M

60Co sources have been commercialized as an alternative to 192Ir sources for high-dose-rate (HDR) brachytherapy. One of them is the Flexisource Co-60 HDR source manufactured by Elekta. The only available dosimetric characterization of this source is that of Vijande et al. [J Contemp Brachytherapy 2012; 4:34-44], whose results were not included in the AAPM/ESTRO consensus document. In that work, the dosimetric quantities were calculated as averages of the results obtained with the Geant4 and PENELOPE Monte Carlo (MC) codes, though for other sources significant differences have been quoted between the values obtained with these two codes. The aim of this work is to perform the dosimetric characterization of the Flexisource Co-60 HDR source using PENELOPE. The MC simulation code PENELOPE (v. 2014) has been used. Following the recommendations of the AAPM/ESTRO report, the radial dose function, the anisotropy function, the air-kerma strength, the dose rate constant, and the absorbed dose rate in water have been calculated. The results we have obtained exceed those of Vijande et al.; in particular, the absorbed dose rate constant is ∼0.85% larger. A similar difference is also found in the other dosimetric quantities. The effect of the electrons emitted in the decay of 60Co, usually neglected in this kind of simulation, is significant up to distances of 0.25 cm from the source. The systematic and significant differences we have found between the PENELOPE results and the average values of Vijande et al. indicate that the dosimetric characterizations carried out with the various MC codes should be provided independently. Copyright © 2017 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.
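The quantities listed above follow the AAPM TG-43 formalism, whose 1-D dose-rate equation combines them multiplicatively. A sketch with a point-source geometry function (illustrative only; the input values in the test are hypothetical, not the PENELOPE results of this work):

```python
def tg43_dose_rate_1d(S_K, dose_rate_const, r_cm, g_r, phi_an_r):
    """TG-43 1-D formalism with a point-source geometry function:
        D(r) = S_K * Lambda * (r0 / r)^2 * g(r) * phi_an(r),  r0 = 1 cm,
    where S_K is the air-kerma strength (U), Lambda the dose rate constant
    (cGy h^-1 U^-1), g(r) the radial dose function, and phi_an(r) the
    1-D anisotropy function."""
    r0 = 1.0  # TG-43 reference distance (cm)
    return S_K * dose_rate_const * (r0 / r_cm) ** 2 * g_r * phi_an_r
```

At the reference point (r = 1 cm, g = phi_an = 1) the dose rate reduces to S_K * Lambda, which is how the dose rate constant is defined.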

  14. TU-AB-BRC-12: Optimized Parallel Monte Carlo Dose Calculations for Secondary MU Checks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    French, S; Nazareth, D; Bellor, M

Purpose: Secondary MU checks are an important tool used during the physics review of a treatment plan. Commercial software packages offer varying degrees of theoretical dose calculation accuracy, depending on the modality involved. Dose calculations of VMAT plans are especially prone to error due to the large approximations involved. Monte Carlo (MC) methods are not commonly used due to their long run times. We investigated two methods to increase the computational efficiency of MC dose simulations with the BEAMnrc code. Distributed computing resources, along with optimized code compilation, will allow for accurate and efficient VMAT dose calculations. Methods: The BEAMnrc package was installed on a high-performance computing cluster accessible to our clinic. MATLAB and Python scripts were developed to convert a clinical VMAT DICOM plan into BEAMnrc input files. The BEAMnrc installation was optimized by running the VMAT simulations through profiling tools, which indicated the behavior of the constituent routines in the code, e.g., the bremsstrahlung splitting routine and the specified random number generator. This information aided in determining the most efficient parallel compilation configuration for the specific CPUs available on our cluster, resulting in the fastest VMAT simulation times. Our method was evaluated with calculations involving 10⁸–10⁹ particle histories, which are sufficient to verify patient dose using VMAT. Results: Parallelization allowed the calculation of patient dose in 10–15 hours with 100 parallel jobs. Due to the compiler optimization process, further speed increases of 23% were achieved compared with the default open-source BEAMnrc build. Conclusion: Analysis of the BEAMnrc code allowed us to optimize the compiler configuration for VMAT dose calculations. 
In future work, the optimized MC code, in conjunction with the parallel processing capabilities of BEAMnrc, will be applied to provide accurate and efficient secondary MU checks.« less
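
The batch-combination step behind such parallel runs can be sketched as follows: each independent job reports a per-voxel batch-mean dose, and the pooled estimate carries a standard error that shrinks as 1/sqrt(number of jobs). A minimal sketch; the function name and the illustrative numbers are ours, not part of the BEAMnrc toolchain:

```python
import math
import random

def combine_batches(batch_doses):
    """Pool equal-weight batch means from independent MC jobs for one voxel.

    Returns (pooled mean dose, standard error of the mean); the standard
    error shrinks as 1/sqrt(number of jobs).
    """
    n = len(batch_doses)
    mean = sum(batch_doses) / n
    var = sum((d - mean) ** 2 for d in batch_doses) / (n - 1)
    return mean, math.sqrt(var / n)

# Illustrative: 100 parallel jobs each report a batch-mean dose (Gy).
random.seed(1)
doses = [random.gauss(1.0, 0.02) for _ in range(100)]
mean, sem = combine_batches(doses)
```

With 100 jobs the per-batch spread of ~2% is reduced to a pooled statistical uncertainty of roughly 0.2%, which is why splitting 10^8–10^9 histories across ~100 jobs is attractive.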

  15. Neutron-induced fission cross-section measurement of 234U with quasi-monoenergetic beams in the keV and MeV range using micromegas detectors

    NASA Astrophysics Data System (ADS)

    Tsinganis, A.; Kokkoris, M.; Vlastou, R.; Kalamara, A.; Stamatopoulos, A.; Kanellakopoulos, A.; Lagoyannis, A.; Axiotis, M.

    2017-09-01

    Accurate data on neutron-induced fission cross-sections of actinides are essential for the design of advanced nuclear reactors based either on fast neutron spectra or alternative fuel cycles, as well as for the reduction of safety margins of existing and future conventional facilities. The fission cross-section of 234U was measured at incident neutron energies of 560 and 660 keV and 7.5 MeV with a setup based on `microbulk' Micromegas detectors and the same samples previously used for the measurement performed at the CERN n_TOF facility (Karadimos et al., 2014). The 235U fission cross-section was used as reference. The (quasi-)monoenergetic neutron beams were produced via the 7Li(p,n) and the 2H(d,n) reactions at the neutron beam facility of the Institute of Nuclear and Particle Physics at the `Demokritos' National Centre for Scientific Research. A detailed study of the neutron spectra produced in the targets and intercepted by the samples was performed by coupling the NeuSDesc and MCNPX codes, taking into account the energy spread, energy loss and angular straggling of the beam ions in the target assemblies, as well as contributions from competing reactions and neutron scattering in the experimental setup. Auxiliary Monte Carlo simulations were performed with the FLUKA code to study the behaviour of the detectors, focusing particularly on the reproduction of the pulse height spectra of α-particles and fission fragments (using distributions produced with the GEF code) for the evaluation of the detector efficiency. An overview of the developed methodology and preliminary results are presented.
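
The ratio method against the 235U reference described above can be sketched as a few factors: the unknown cross-section scales the reference by the ratio of fission counts, corrected for the numbers of target atoms and detection efficiencies. The helper name and all numbers below are illustrative, not the measured values:

```python
def fission_xs_ratio(counts_sample, counts_ref, atoms_sample, atoms_ref,
                     eff_sample, eff_ref, sigma_ref):
    """Sample cross-section from fission counts relative to a 235U
    reference exposed to the same neutron fluence (ratio method):
    sigma = sigma_ref * (C_s/C_r) * (N_r/N_s) * (eff_r/eff_s)."""
    return (sigma_ref
            * (counts_sample / counts_ref)
            * (atoms_ref / atoms_sample)
            * (eff_ref / eff_sample))

# Illustrative numbers only (not the measured values):
sigma_234 = fission_xs_ratio(counts_sample=8000, counts_ref=10000,
                             atoms_sample=1.0e18, atoms_ref=1.0e18,
                             eff_sample=0.95, eff_ref=0.95,
                             sigma_ref=1.2)  # barns
```

The appeal of the method is that the shared neutron fluence cancels, so only relative quantities (counts, atom numbers, efficiencies) need to be known precisely.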

  16. Latent uncertainties of the precalculated track Monte Carlo method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Renaud, Marc-André; Seuntjens, Jan; Roberge, David

    Purpose: While significant progress has been made in speeding up Monte Carlo (MC) dose calculation methods, they remain too time-consuming for the purpose of inverse planning. To achieve clinically usable calculation speeds, a precalculated Monte Carlo (PMC) algorithm for proton and electron transport was developed to run on graphics processing units (GPUs). The algorithm utilizes pregenerated particle track data from conventional MC codes for different materials such as water, bone, and lung to produce dose distributions in voxelized phantoms. While PMC methods have been described in the past, an explicit quantification of the latent uncertainty arising from the limited number of unique tracks in the pregenerated track bank is missing from the literature. With a proper uncertainty analysis, an optimal number of tracks in the pregenerated track bank can be selected for a desired dose calculation uncertainty. Methods: Particle tracks were pregenerated for electrons and protons using EGSnrc and GEANT4 and saved in a database. The PMC algorithm for track selection, rotation, and transport was implemented on the Compute Unified Device Architecture (CUDA) 4.0 programming framework. PMC dose distributions were calculated in a variety of media and compared to benchmark dose distributions simulated with the corresponding general-purpose MC codes under the same conditions. A latent uncertainty metric was defined, and analysis was performed by varying the pregenerated track bank size and the number of simulated primary particle histories and comparing dose values to a "ground truth" benchmark dose distribution calculated to 0.04% average uncertainty in voxels with dose greater than 20% of Dmax. Efficiency metrics were calculated against benchmark MC codes on a single CPU core with no variance reduction.
    Results: Dose distributions generated using PMC and benchmark MC codes were compared and found to be within 2% of each other in voxels with dose values greater than 20% of the maximum dose. In proton calculations, a small (≤1 mm) distance-to-agreement error was observed at the Bragg peak. Latent uncertainty was characterized for electrons and found to follow a Poisson distribution with the number of unique tracks per energy. A track bank of 12 energies and 60,000 unique tracks per pregenerated energy in water had a size of 2.4 GB and achieved a latent uncertainty of approximately 1% at an optimal efficiency gain over DOSXYZnrc. Larger track banks produced a lower latent uncertainty at the cost of increased memory consumption. Using an NVIDIA GTX 590, efficiency analysis showed an 807× efficiency increase over DOSXYZnrc for 16 MeV electrons in water and 508× for 16 MeV electrons in bone. Conclusions: The PMC method can calculate dose distributions for electrons and protons to a statistical uncertainty of 1% with a large efficiency gain over conventional MC codes. Before performing clinical dose calculations, models to calculate dose contributions from uncharged particles must be implemented. Following the successful implementation of these models, the PMC method will be evaluated as a candidate for inverse planning of modulated electron radiation therapy and scanned proton beams.
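
Since the latent uncertainty is reported to follow Poisson statistics in the number of unique tracks, it scales roughly as 1/sqrt(N), which lets one size the track bank for a target uncertainty. A sketch under that scaling assumption, anchored to the reported point of ~1% at 60,000 tracks per energy:

```python
import math

def tracks_for_latent_uncertainty(target, ref_tracks, ref_uncertainty):
    """Scale a measured latent uncertainty to a target value, assuming
    the 1/sqrt(N) behaviour of a Poisson-limited track bank."""
    return math.ceil(ref_tracks * (ref_uncertainty / target) ** 2)

# Halving the latent uncertainty (1% -> 0.5%) quadruples the bank size,
# and with it the ~2.4 GB memory footprint per 12-energy bank.
n_half_percent = tracks_for_latent_uncertainty(0.005, 60000, 0.01)
```

This quadratic cost in memory is exactly the trade-off the abstract notes between lower latent uncertainty and increased memory consumption.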

  17. Latent uncertainties of the precalculated track Monte Carlo method.

    PubMed

    Renaud, Marc-André; Roberge, David; Seuntjens, Jan

    2015-01-01

    While significant progress has been made in speeding up Monte Carlo (MC) dose calculation methods, they remain too time-consuming for the purpose of inverse planning. To achieve clinically usable calculation speeds, a precalculated Monte Carlo (PMC) algorithm for proton and electron transport was developed to run on graphics processing units (GPUs). The algorithm utilizes pregenerated particle track data from conventional MC codes for different materials such as water, bone, and lung to produce dose distributions in voxelized phantoms. While PMC methods have been described in the past, an explicit quantification of the latent uncertainty arising from the limited number of unique tracks in the pregenerated track bank is missing from the literature. With a proper uncertainty analysis, an optimal number of tracks in the pregenerated track bank can be selected for a desired dose calculation uncertainty. Particle tracks were pregenerated for electrons and protons using EGSnrc and Geant4 and saved in a database. The PMC algorithm for track selection, rotation, and transport was implemented on the Compute Unified Device Architecture (CUDA) 4.0 programming framework. PMC dose distributions were calculated in a variety of media and compared to benchmark dose distributions simulated with the corresponding general-purpose MC codes under the same conditions. A latent uncertainty metric was defined, and analysis was performed by varying the pregenerated track bank size and the number of simulated primary particle histories and comparing dose values to a "ground truth" benchmark dose distribution calculated to 0.04% average uncertainty in voxels with dose greater than 20% of Dmax. Efficiency metrics were calculated against benchmark MC codes on a single CPU core with no variance reduction. Dose distributions generated using PMC and benchmark MC codes were compared and found to be within 2% of each other in voxels with dose values greater than 20% of the maximum dose.
    In proton calculations, a small (≤1 mm) distance-to-agreement error was observed at the Bragg peak. Latent uncertainty was characterized for electrons and found to follow a Poisson distribution with the number of unique tracks per energy. A track bank of 12 energies and 60,000 unique tracks per pregenerated energy in water had a size of 2.4 GB and achieved a latent uncertainty of approximately 1% at an optimal efficiency gain over DOSXYZnrc. Larger track banks produced a lower latent uncertainty at the cost of increased memory consumption. Using an NVIDIA GTX 590, efficiency analysis showed an 807× efficiency increase over DOSXYZnrc for 16 MeV electrons in water and 508× for 16 MeV electrons in bone. The PMC method can calculate dose distributions for electrons and protons to a statistical uncertainty of 1% with a large efficiency gain over conventional MC codes. Before performing clinical dose calculations, models to calculate dose contributions from uncharged particles must be implemented. Following the successful implementation of these models, the PMC method will be evaluated as a candidate for inverse planning of modulated electron radiation therapy and scanned proton beams.

  18. 77 FR 34218 - Clothing Allowance; Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-11

    ..., Radioactive materials, Veterans, Vietnam. Approved: June 6, 2012. Robert C. McFetridge, Director, Regulation... distinct type". [FR Doc. 2012-14108 Filed 6-8-12; 8:45 am] BILLING CODE 8320-01-P ...

  19. Full Geant4 and FLUKA simulations of an e-LINAC for its use in particle detectors performance tests

    NASA Astrophysics Data System (ADS)

    Alpat, B.; Pilicer, E.; Servoli, L.; Menichelli, M.; Tucceri, P.; Italiani, M.; Buono, E.; Di Capua, F.

    2012-03-01

    In this work we present the results of full Geant4 and FLUKA simulations and a comparison with dosimetry data of an electron LINAC at the St. Maria Hospital located in Terni, Italy. The facility is used primarily for radiotherapy, and the goal of the present study is the detailed investigation of electron beam parameters to evaluate the possibility of using the e-LINAC (during time slots when it is not used for radiotherapy) to test the performance of detector systems, in particular those designed to operate in space. The critical beam parameters are the electron energy, profile, and flux available at the surface of the device to be tested. The present work aims to extract these parameters from the dosimetry calibration data available at the e-LINAC. The electron energy ranges from 4 MeV to 20 MeV. The dose measurements were performed using an Advanced Markus Chamber, which has a small sensitive volume.

  20. Radiological Protection and Nuclear Engineering Studies in Multi-MW Target Systems

    NASA Astrophysics Data System (ADS)

    Luis, Raul Fernandes

    Several innovative projects involving nuclear technology have emerged around the world in recent years, for applications such as spallation neutron sources, accelerator-driven systems for the transmutation of nuclear waste and radioactive ion beam (RIB) production. While the available neutron fluxes from nuclear reactors did not increase substantially in intensity over the past three decades, the intensities of neutron sources produced in spallation targets have increased steadily, and should continue to do so during the 21st century. Innovative projects like ESS, MYRRHA and EURISOL lie at the forefront of the ongoing pursuit of increasingly bright neutron sources; driven by proton beams with energies up to 2 GeV and intensities up to several mA, the construction of their proposed facilities involves complex Nuclear Technology and Radiological Protection design studies executed by multidisciplinary teams of scientists and engineers from different branches of science. The intense neutron fluxes foreseen for those facilities can be used in several scientific research fields, such as Nuclear Physics and Astrophysics, Medicine and Materials Science. In this work, the target systems of two facilities for the production of RIBs using the Isotope Separation On-Line (ISOL) method were studied in detail: ISOLDE, operating at CERN since 1967, and EURISOL, the next-generation ISOL facility to be built in Europe. For the EURISOL multi-MW target station, a detailed study of Radiological Protection was carried out using the Monte Carlo code FLUKA. Simulations were done to assess neutron fluences, fission rates, ambient dose equivalent rates during operation and after shutdown, and the production of radioactive nuclei in the targets and surrounding materials. Different materials were discussed for different components of the target system, aiming at improving its neutronics performance while keeping the residual activities resulting from material activation as low as possible.
The second goal of this work was to perform an optimisation study for the ISOLDE neutron converter and fission target system. The target system was simulated using FLUKA and the cross section codes TALYS and ABRABLA, with the objective of maximising the performance of the system for the production of pure beams of neutron-rich isotopes, suppressing the contaminations by undesired neutron-deficient isobars. Two alternative target systems were proposed in the optimisation studies; the simpler of the two, with some modifications, was built as a prototype and tested at ISOLDE. The experimental results clearly show that it is possible, with simple changes in the layouts of the target systems, to produce purer beams of neutron-rich isotopes around the doubly magic nuclei 78Ni and 132Sn. A study of Radiological Protection was also performed, comparing the performances of the prototype target system and the standard ISOLDE target system.

  1. Protein-altering variants associated with body mass index implicate pathways that control energy intake and expenditure underpinning obesity

    PubMed Central

    Turcot, Valérie; Lu, Yingchang; Highland, Heather M; Schurmann, Claudia; Justice, Anne E; Fine, Rebecca S; Bradfield, Jonathan P; Esko, Tõnu; Giri, Ayush; Graff, Mariaelisa; Guo, Xiuqing; Hendricks, Audrey E; Karaderi, Tugce; Lempradl, Adelheid; Locke, Adam E; Mahajan, Anubha; Marouli, Eirini; Sivapalaratnam, Suthesh; Young, Kristin L; Alfred, Tamuno; Feitosa, Mary F; Masca, Nicholas GD; Manning, Alisa K; Medina-Gomez, Carolina; Mudgal, Poorva; Ng, Maggie CY; Reiner, Alex P; Vedantam, Sailaja; Willems, Sara M; Winkler, Thomas W; Abecasis, Goncalo; Aben, Katja K; Alam, Dewan S; Alharthi, Sameer E; Allison, Matthew; Amouyel, Philippe; Asselbergs, Folkert W; Auer, Paul L; Balkau, Beverley; Bang, Lia E; Barroso, Inês; Bastarache, Lisa; Benn, Marianne; Bergmann, Sven; Bielak, Lawrence F; Blüher, Matthias; Boehnke, Michael; Boeing, Heiner; Boerwinkle, Eric; Böger, Carsten A; Bork-Jensen, Jette; Bots, Michiel L; Bottinger, Erwin P; Bowden, Donald W; Brandslund, Ivan; Breen, Gerome; Brilliant, Murray H; Broer, Linda; Brumat, Marco; Burt, Amber A; Butterworth, Adam S; Campbell, Peter T; Cappellani, Stefania; Carey, David J; Catamo, Eulalia; Caulfield, Mark J; Chambers, John C; Chasman, Daniel I; Chen, Yii-Der Ida; Chowdhury, Rajiv; Christensen, Cramer; Chu, Audrey Y; Cocca, Massimiliano; Collins, Francis S; Cook, James P; Corley, Janie; Galbany, Jordi Corominas; Cox, Amanda J; Crosslin, David S; Cuellar-Partida, Gabriel; D'Eustacchio, Angela; Danesh, John; Davies, Gail; de Bakker, Paul IW; de Groot, Mark CH; de Mutsert, Renée; Deary, Ian J; Dedoussis, George; Demerath, Ellen W; den Heijer, Martin; den Hollander, Anneke I; den Ruijter, Hester M; Dennis, Joe G; Denny, Josh C; Di Angelantonio, Emanuele; Drenos, Fotios; Du, Mengmeng; Dubé, Marie-Pierre; Dunning, Alison M; Easton, Douglas F; Edwards, Todd L; Ellinghaus, David; Ellinor, Patrick T; Elliott, Paul; Evangelou, Evangelos; Farmaki, Aliki-Eleni; Farooqi, I. 
Sadaf; Faul, Jessica D; Fauser, Sascha; Feng, Shuang; Ferrannini, Ele; Ferrieres, Jean; Florez, Jose C; Ford, Ian; Fornage, Myriam; Franco, Oscar H; Franke, Andre; Franks, Paul W; Friedrich, Nele; Frikke-Schmidt, Ruth; Galesloot, Tessel E.; Gan, Wei; Gandin, Ilaria; Gasparini, Paolo; Gibson, Jane; Giedraitis, Vilmantas; Gjesing, Anette P; Gordon-Larsen, Penny; Gorski, Mathias; Grabe, Hans-Jörgen; Grant, Struan FA; Grarup, Niels; Griffiths, Helen L; Grove, Megan L; Gudnason, Vilmundur; Gustafsson, Stefan; Haessler, Jeff; Hakonarson, Hakon; Hammerschlag, Anke R; Hansen, Torben; Harris, Kathleen Mullan; Harris, Tamara B; Hattersley, Andrew T; Have, Christian T; Hayward, Caroline; He, Liang; Heard-Costa, Nancy L; Heath, Andrew C; Heid, Iris M; Helgeland, Øyvind; Hernesniemi, Jussi; Hewitt, Alex W; Holmen, Oddgeir L; Hovingh, G Kees; Howson, Joanna MM; Hu, Yao; Huang, Paul L; Huffman, Jennifer E; Ikram, M Arfan; Ingelsson, Erik; Jackson, Anne U; Jansson, Jan-Håkan; Jarvik, Gail P; Jensen, Gorm B; Jia, Yucheng; Johansson, Stefan; Jørgensen, Marit E; Jørgensen, Torben; Jukema, J Wouter; Kahali, Bratati; Kahn, René S; Kähönen, Mika; Kamstrup, Pia R; Kanoni, Stavroula; Kaprio, Jaakko; Karaleftheri, Maria; Kardia, Sharon LR; Karpe, Fredrik; Kathiresan, Sekar; Kee, Frank; Kiemeney, Lambertus A; Kim, Eric; Kitajima, Hidetoshi; Komulainen, Pirjo; Kooner, Jaspal S; Kooperberg, Charles; Korhonen, Tellervo; Kovacs, Peter; Kuivaniemi, Helena; Kutalik, Zoltán; Kuulasmaa, Kari; Kuusisto, Johanna; Laakso, Markku; Lakka, Timo A; Lamparter, David; Lange, Ethan M; Lange, Leslie A; Langenberg, Claudia; Larson, Eric B; Lee, Nanette R; Lehtimäki, Terho; Lewis, Cora E; Li, Huaixing; Li, Jin; Li-Gao, Ruifang; Lin, Honghuang; Lin, Keng-Hung; Lin, Li-An; Lin, Xu; Lind, Lars; Lindström, Jaana; Linneberg, Allan; Liu, Ching-Ti; Liu, Dajiang J; Liu, Yongmei; Lo, Ken Sin; Lophatananon, Artitaya; Lotery, Andrew J; Loukola, Anu; Luan, Jian'an; Lubitz, Steven A; Lyytikäinen, Leo-Pekka; Männistö, Satu; 
Marenne, Gaëlle; Mazul, Angela L; McCarthy, Mark I; McKean-Cowdin, Roberta; Medland, Sarah E; Meidtner, Karina; Milani, Lili; Mistry, Vanisha; Mitchell, Paul; Mohlke, Karen L; Moilanen, Leena; Moitry, Marie; Montgomery, Grant W; Mook-Kanamori, Dennis O; Moore, Carmel; Mori, Trevor A; Morris, Andrew D; Morris, Andrew P; Müller-Nurasyid, Martina; Munroe, Patricia B; Nalls, Mike A; Narisu, Narisu; Nelson, Christopher P; Neville, Matt; Nielsen, Sune F; Nikus, Kjell; Njølstad, Pål R; Nordestgaard, Børge G; Nyholt, Dale R; O'Connel, Jeffrey R; O’Donoghue, Michelle L.; Olde Loohuis, Loes M; Ophoff, Roel A; Owen, Katharine R; Packard, Chris J; Padmanabhan, Sandosh; Palmer, Colin NA; Palmer, Nicholette D; Pasterkamp, Gerard; Patel, Aniruddh P; Pattie, Alison; Pedersen, Oluf; Peissig, Peggy L; Peloso, Gina M; Pennell, Craig E; Perola, Markus; Perry, James A; Perry, John RB; Pers, Tune H; Person, Thomas N; Peters, Annette; Petersen, Eva RB; Peyser, Patricia A; Pirie, Ailith; Polasek, Ozren; Polderman, Tinca J; Puolijoki, Hannu; Raitakari, Olli T; Rasheed, Asif; Rauramaa, Rainer; Reilly, Dermot F; Renström, Frida; Rheinberger, Myriam; Ridker, Paul M; Rioux, John D; Rivas, Manuel A; Roberts, David J; Robertson, Neil R; Robino, Antonietta; Rolandsson, Olov; Rudan, Igor; Ruth, Katherine S; Saleheen, Danish; Salomaa, Veikko; Samani, Nilesh J; Sapkota, Yadav; Sattar, Naveed; Schoen, Robert E; Schreiner, Pamela J; Schulze, Matthias B; Scott, Robert A; Segura-Lepe, Marcelo P; Shah, Svati H; Sheu, Wayne H-H; Sim, Xueling; Slater, Andrew J; Small, Kerrin S; Smith, Albert Vernon; Southam, Lorraine; Spector, Timothy D; Speliotes, Elizabeth K; Starr, John M; Stefansson, Kari; Steinthorsdottir, Valgerdur; Stirrups, Kathleen E; Strauch, Konstantin; Stringham, Heather M; Stumvoll, Michael; Sun, Liang; Surendran, Praveen; Swift, Amy J; Tada, Hayato; Tansey, Katherine E; Tardif, Jean-Claude; Taylor, Kent D; Teumer, Alexander; Thompson, Deborah J; Thorleifsson, Gudmar; Thorsteinsdottir, Unnur; 
Thuesen, Betina H; Tönjes, Anke; Tromp, Gerard; Trompet, Stella; Tsafantakis, Emmanouil; Tuomilehto, Jaakko; Tybjaerg-Hansen, Anne; Tyrer, Jonathan P; Uher, Rudolf; Uitterlinden, André G; Uusitupa, Matti; van der Laan, Sander W; van Duijn, Cornelia M; van Leeuwen, Nienke; van Setten, Jessica; Vanhala, Mauno; Varbo, Anette; Varga, Tibor V; Varma, Rohit; Velez Edwards, Digna R; Vermeulen, Sita H; Veronesi, Giovanni; Vestergaard, Henrik; Vitart, Veronique; Vogt, Thomas F; Völker, Uwe; Vuckovic, Dragana; Wagenknecht, Lynne E; Walker, Mark; Wallentin, Lars; Wang, Feijie; Wang, Carol A; Wang, Shuai; Wang, Yiqin; Ware, Erin B; Wareham, Nicholas J; Warren, Helen R; Waterworth, Dawn M; Wessel, Jennifer; White, Harvey D; Willer, Cristen J; Wilson, James G; Witte, Daniel R; Wood, Andrew R; Wu, Ying; Yaghootkar, Hanieh; Yao, Jie; Yao, Pang; Yerges-Armstrong, Laura M; Young, Robin; Zeggini, Eleftheria; Zhan, Xiaowei; Zhang, Weihua; Zhao, Jing Hua; Zhao, Wei; Zhao, Wei; Zhou, Wei; Zondervan, Krina T; Rotter, Jerome I; Pospisilik, John A; Rivadeneira, Fernando; Borecki, Ingrid B; Deloukas, Panos; Frayling, Timothy M; Lettre, Guillaume; North, Kari E; Lindgren, Cecilia M; Hirschhorn, Joel N; Loos, Ruth JF

    2018-01-01

    Genome-wide association studies (GWAS) have identified >250 loci for body mass index (BMI), implicating pathways related to neuronal biology. Most GWAS loci represent clusters of common, non-coding variants from which pinpointing causal genes remains challenging. Here, we combined data from 718,734 individuals to discover rare and low-frequency (MAF<5%) coding variants associated with BMI. We identified 14 coding variants in 13 genes, of which eight are in genes (ZBTB7B, ACHE, RAPGEF3, RAB21, ZFHX3, ENTPD6, ZFR2, ZNF169) newly implicated in human obesity, two (MC4R, KSR2) were previously observed in extreme obesity, and two are variants in GIPR. Effect sizes of rare variants are ~10 times larger than those of common variants, with the largest effect observed in carriers of an MC4R stop codon (p.Tyr35Ter, MAF=0.01%), who weigh ~7 kg more than non-carriers. Pathway analyses confirmed enrichment of neuronal genes and provide new evidence for adipocyte and energy expenditure biology, widening the potential of genetically supported therapeutic targets to treat obesity. PMID:29273807

  2. SU-F-19A-10: Recalculation and Reporting Clinical HDR 192-Ir Head and Neck Dose Distributions Using Model Based Dose Calculation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carlsson Tedgren, A; Persson, M; Nilsson, J

    Purpose: To retrospectively re-calculate dose distributions for selected head and neck cancer patients, earlier treated with HDR 192Ir brachytherapy, using Monte Carlo (MC) simulations, and to compare the results to distributions from the planning system derived using the TG43 formalism. To study differences between dose to medium (as obtained with the MC code) and dose to water in medium as obtained through (1) ratios of stopping powers and (2) ratios of mass energy absorption coefficients between water and medium. Methods: The MC code Algebra was used to calculate dose distributions according to earlier actual treatment plans using anonymized plan data and CT images in DICOM format. Ratios of stopping powers and mass energy absorption coefficients for water with various media, obtained from 192Ir spectra, were used in toggling between dose to water and dose to media. Results: Differences between the initial planned TG43 dose distributions and the doses to media calculated by MC are insignificant in the target volume. Differences are moderate (within 4–5% at distances of 3–4 cm) but increase with distance and are most notable in bone and at the patient surface. Differences between dose to water and dose to medium are within 1–2% when using mass energy absorption coefficients to toggle between the two quantities, but increase to above 10% for bone when using stopping power ratios. Conclusion: MC predicts target doses for head and neck cancer patients in close agreement with TG43. MC yields improved dose estimations outside the target, where a larger fraction of the dose comes from scattered photons. Awareness and clear reporting of which absorbed dose quantity is used are important when applying model-based algorithms. Differences in bone can exceed 10% depending on how dose to water in medium is defined.
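
The toggling between dose to medium and dose to water described above is, per voxel, a multiplication by a water-to-medium conversion factor. A minimal sketch, with illustrative (not tabulated) ratio values for a 192Ir spectrum:

```python
def dose_to_water(dose_medium, ratio_water_to_medium):
    """Convert MC dose-to-medium to dose-to-water-in-medium by a
    water/medium factor: a mass-energy-absorption coefficient ratio
    (photon-dominated, small shift) or a stopping power ratio
    (can exceed 10% in bone)."""
    return dose_medium * ratio_water_to_medium

# Illustrative ratios for bone under a 192Ir spectrum (assumed values):
d_bone = 2.00                             # Gy, MC dose to bone
via_muen = dose_to_water(d_bone, 1.02)    # ~1-2% shift (mu_en/rho ratio)
via_spr = dose_to_water(d_bone, 1.12)     # >10% shift (stopping power ratio)
```

The gap between the two converted values is precisely why the abstract stresses reporting which dose quantity, and which conversion, a model-based calculation uses.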

  3. Dosimetric quality control of Eclipse treatment planning system using pelvic digital test object

    NASA Astrophysics Data System (ADS)

    Benhdech, Yassine; Beaumont, Stéphane; Guédon, Jeanpierre; Crespin, Sylvain

    2011-03-01

    Last year, we demonstrated the feasibility of a new method to perform dosimetric quality control of Treatment Planning Systems (TPS) in radiotherapy; this method is based on Monte Carlo simulations and uses anatomical Digital Test Objects (DTOs). The pelvic DTO was used in order to assess this new method on an ECLIPSE VARIAN Treatment Planning System. Large dose variations were observed, particularly in air and bone equivalent material. In this current work, we discuss the results of the previous paper and provide an explanation for the observed dose differences; the VARIAN Eclipse Anisotropic Analytical Algorithm was investigated. Monte Carlo (MC) simulations were performed with the PENELOPE code, version 2003. To increase the efficiency of the MC simulations, we used our parallelized version based on the standard MPI (Message Passing Interface). The parallel code was run on a 32-processor SGI cluster. The study was carried out using the pelvic DTO and was performed for low- and high-energy photon beams (6 and 18 MV) on a VARIAN 2100CD linear accelerator. A square field (10 × 10 cm²) was used. Taking the MC data as reference, a χ-index analysis was carried out. For this study, the distance to agreement (DTA) was set to 7 mm while the dose difference was set to 5%, as recommended in TRS-430 and TG-53 (on the beam axis in 3-D inhomogeneities). When using Monte Carlo PENELOPE, the absorbed dose is computed to the medium, whereas the TPS computes dose to water. We used the method described by Siebers et al., based on Bragg-Gray cavity theory, to convert the MC-simulated dose to medium into dose to water. Results show strong consistency between ECLIPSE and MC calculations on the beam axis.
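
The comparison criterion above (5% dose difference, 7 mm DTA) can be illustrated with a gamma-style index, a close unsigned relative of the χ index used in the study. A minimal 1-D sketch with made-up profile values:

```python
import math

def gamma_1d(ref_pos, ref_dose, eval_pos, eval_dose, dta_mm, dd_frac):
    """1-D gamma index of one reference point against an evaluated profile.
    dta_mm: distance-to-agreement criterion (mm); dd_frac: dose-difference
    criterion as a fraction of the reference dose. The point passes when
    the returned value is <= 1."""
    best = float("inf")
    for x, d in zip(eval_pos, eval_dose):
        term = (((x - ref_pos) / dta_mm) ** 2
                + ((d - ref_dose) / (dd_frac * ref_dose)) ** 2)
        best = min(best, term)
    return math.sqrt(best)

# Made-up evaluated profile (positions in mm, normalized doses):
eval_pos = [0.0, 2.0, 4.0, 6.0]
eval_dose = [1.00, 0.98, 0.95, 0.90]
g = gamma_1d(3.0, 0.97, eval_pos, eval_dose, dta_mm=7.0, dd_frac=0.05)
```

In practice the search runs over a 3-D dose grid and the reference dose for normalization is often taken globally, but the per-point minimization shown here is the core of the test.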

  4. Robust prediction of consensus secondary structures using averaged base pairing probability matrices.

    PubMed

    Kiryu, Hisanori; Kin, Taishin; Asai, Kiyoshi

    2007-02-15

    Recent transcriptomic studies have revealed the existence of a considerable number of non-protein-coding RNA transcripts in higher eukaryotic cells. To investigate the functional roles of these transcripts, it is of great interest to find conserved secondary structures from multiple alignments on a genomic scale. Since multiple alignments are often created using alignment programs that neglect the special conservation patterns of RNA secondary structures for computational efficiency, alignment failures can cause potential risks of overlooking conserved stem structures. We investigated the dependence of the accuracy of secondary structure prediction on the quality of alignments. We compared three algorithms that maximize the expected accuracy of secondary structures as well as other frequently used algorithms. We found that one of our algorithms, called McCaskill-MEA, was more robust against alignment failures than others. The McCaskill-MEA method first computes the base pairing probability matrices for all the sequences in the alignment and then obtains the base pairing probability matrix of the alignment by averaging over these matrices. The consensus secondary structure is predicted from this matrix such that the expected accuracy of the prediction is maximized. We show that the McCaskill-MEA method performs better than other methods, particularly when the alignment quality is low and when the alignment consists of many sequences. Our model has a parameter that controls the sensitivity and specificity of predictions. We discussed the uses of that parameter for multi-step screening procedures to search for conserved secondary structures and for assigning confidence values to the predicted base pairs. The C++ source code that implements the McCaskill-MEA algorithm and the test dataset used in this paper are available at http://www.ncrna.org/papers/McCaskillMEA/. Supplementary data are available at Bioinformatics online.
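
The averaging step at the heart of McCaskill-MEA can be sketched as follows: per-sequence base pairing probability matrices (in common alignment coordinates) are averaged element-wise, and base pairs are kept when their averaged probability clears a γ-dependent threshold. This is a simplified stand-in; the 1/(γ+1) cutoff below replaces the full expected-accuracy maximization, and the matrices are toy values:

```python
def average_bpp(matrices):
    """Element-wise average of per-sequence base pairing probability
    matrices projected onto common alignment coordinates."""
    n = len(matrices[0])
    avg = [[0.0] * n for _ in range(n)]
    for m in matrices:
        for i in range(n):
            for j in range(n):
                avg[i][j] += m[i][j] / len(matrices)
    return avg

def confident_pairs(avg, gamma):
    """Keep pair (i, j) when avg[i][j] > 1/(gamma + 1): an MEA-style
    threshold where larger gamma trades specificity for sensitivity."""
    n = len(avg)
    return [(i, j) for i in range(n) for j in range(i + 1, n)
            if avg[i][j] > 1.0 / (gamma + 1.0)]

# Two toy 4-base sequences: both support pair (0, 3); only the first
# strongly supports (1, 2), so averaging damps the disputed pair.
m1 = [[0, 0, 0, 0.9], [0, 0, 0.8, 0], [0, 0.8, 0, 0], [0.9, 0, 0, 0]]
m2 = [[0, 0, 0, 0.7], [0, 0, 0.2, 0], [0, 0.2, 0, 0], [0.7, 0, 0, 0]]
avg = average_bpp([m1, m2])
pairs = confident_pairs(avg, gamma=1.0)
```

Averaging before prediction is what gives the method its robustness: a stem missed by one misaligned sequence is outvoted rather than vetoed.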

  5. McEliece PKC Calculator

    NASA Astrophysics Data System (ADS)

    Repka, Marek

    2015-01-01

    The original McEliece PKC proposal is interesting thanks to its resistance against all known attacks, even those using quantum cryptanalysis, in an IND-CCA2 secure conversion. Here we present a generic implementation of the original McEliece PKC proposal, which provides test vectors (for all important intermediate results), and in which a measurement tool for side-channel analysis is employed. To the best of our knowledge, this is the first such implementation. This Calculator is valuable for implementation optimization, for further investigations of the properties of McEliece/Niederreiter-like PKCs, and also for teaching. Thanks to it, one can, for example, examine the side-channel vulnerability of a certain implementation, or find out and test particular parameters of the cryptosystem in order to make them appropriate for an efficient hardware implementation. This implementation is available [1] in executable binary format and as a static C++ library, as well as in the form of source code, for Linux and Windows operating systems.
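
The encryption side of a McEliece-style PKC is simply c = mG' + e over GF(2), where G' is the scrambled public generator matrix and e a random weight-t error vector. A toy sketch using a Hamming(7,4) generator in place of a hidden Goppa-code matrix; real parameters are vastly larger, and this illustrates only the arithmetic, not a secure instantiation:

```python
import random

# Hamming(7,4) generator matrix standing in for the (much larger)
# scrambled Goppa-code public matrix G' of a real McEliece instance.
G = [[1, 0, 0, 0, 1, 1, 0],
     [0, 1, 0, 0, 1, 0, 1],
     [0, 0, 1, 0, 0, 1, 1],
     [0, 0, 0, 1, 1, 1, 1]]

def encrypt(msg_bits, t=1, rng=random):
    """c = m*G + e (mod 2): encode the message with the public generator
    matrix, then flip t randomly chosen positions (the error vector)."""
    n = len(G[0])
    c = [0] * n
    for i, m in enumerate(msg_bits):
        if m:  # XOR in row i of G for every set message bit
            c = [ci ^ gi for ci, gi in zip(c, G[i])]
    for pos in rng.sample(range(n), t):  # add a weight-t error vector
        c[pos] ^= 1
    return c

random.seed(0)
m = [1, 0, 1, 1]
c = encrypt(m)  # a codeword of m*G with exactly one bit flipped
```

Decryption reverses this with the private decoder: only someone who knows the hidden code structure can strip e efficiently, which is the hardness assumption the system rests on.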

  6. 7 CFR 1599.7 - Transportation of goods.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... OF AGRICULTURE McGOVERN-DOLE INTERNATIONAL FOOD FOR EDUCATION AND CHILD NUTRITION PROGRAM § 1599.7... regulations set forth in chapter 4 of title 48 of the Code of Federal Regulations (the AGAR) and directives...

  7. 7 CFR 1599.7 - Transportation of goods.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... OF AGRICULTURE McGOVERN-DOLE INTERNATIONAL FOOD FOR EDUCATION AND CHILD NUTRITION PROGRAM § 1599.7... regulations set forth in chapter 4 of title 48 of the Code of Federal Regulations (the AGAR) and directives...

  8. The Problem of Modeling the Elastomechanics in Engineering

    DTIC Science & Technology

    1990-02-01

    element method by the code PROBE (McNeil Schwendler-Noetic) and STRIPE (Aeronautical Institute of Sweden). These codes have various error checks so that...Mindlin solutions converge to the Kirchhoff solution as d→0, see e.g. [12], [19]. For a detailed study of the asymptotic behavior of Reissner...of study and research for foreign students in numerical mathematics who are supported by foreign governments or exchange agencies (Fulbright, etc.

  9. Space Radiation Transport Code Development: 3DHZETRN

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Slaba, Tony C.; Badavi, Francis F.; Reddell, Brandon D.; Bahadori, Amir A.

    2015-01-01

    The space radiation transport code, HZETRN, has been used extensively for research, vehicle design optimization, risk analysis, and related applications. One of the simplifying features of the HZETRN transport formalism is the straight-ahead approximation, wherein all particles are assumed to travel along a common axis. This reduces the governing equation to one spatial dimension, allowing enormous simplification and highly efficient computational procedures to be implemented. Despite the physical simplifications, the HZETRN code is widely used for space applications and has been found to agree well with fully 3D Monte Carlo simulations in many circumstances. Recent work has focused on the development of 3D transport corrections for neutrons and light ions (Z ≤ 2), for which the straight-ahead approximation is known to be less accurate. Within the development of 3D corrections, well-defined convergence criteria have been considered, allowing approximation errors at each stage in model development to be quantified. The present level of development assumes the neutron cross sections have an isotropic component treated within N explicit angular directions and a forward component represented by the straight-ahead approximation. The N = 1 solution refers to the straight-ahead treatment, while N = 2 represents the bi-directional model in current use for engineering design. The figure below shows neutrons, protons, and alphas for various values of N at locations in an aluminum sphere exposed to a solar particle event (SPE) spectrum. The neutron fluence converges quickly in simple geometry with N > 14 directions. The improved code, 3DHZETRN, transports neutrons, light ions, and heavy ions under space-like boundary conditions through general geometry while maintaining a high degree of computational efficiency.
A brief overview of the 3D transport formalism for neutrons and light ions is given, and extensive benchmarking results with the Monte Carlo codes Geant4, FLUKA, and PHITS are provided for a variety of boundary conditions and geometries. Improvements provided by the 3D corrections are made clear in the comparisons. Developments needed to connect 3DHZETRN to vehicle design and optimization studies will be discussed. Future theoretical development will relax the forward plus isotropic interaction assumption to more general angular dependence.
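The N-direction treatment of the isotropic component can be illustrated with a toy angular quadrature; the forward-peaked kernel, the midpoint rule, and all numbers below are illustrative assumptions, not the HZETRN formalism itself:

```python
import math

# Toy discrete-ordinates sketch of the N-direction treatment described above:
# an isotropic angular average is approximated with N equally weighted
# directions (midpoint rule in cos(theta)), and the error shrinks as N grows.
# The forward-peaked kernel and all numbers are illustrative assumptions.
def angular_average(f, n):
    """Midpoint-rule average of f(mu) over mu in [-1, 1] using n directions."""
    return sum(f(-1.0 + (i + 0.5) * 2.0 / n) for i in range(n)) / n

kernel = lambda mu: math.exp(-2.0 * (1.0 - mu))   # forward-peaked toy kernel
exact = (1.0 - math.exp(-4.0)) / 4.0              # analytic angular average
err2 = abs(angular_average(kernel, 2) - exact)    # bi-directional (N = 2)
err14 = abs(angular_average(kernel, 14) - exact)  # N = 14 directions
```

The error of the 14-direction average is far below that of the 2-direction (bi-directional) case, mirroring the convergence behavior described above.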

  10. The Alpaca Melanocortin 1 Receptor: Gene Mutations, Transcripts, and Relative Levels of Expression in Ventral Skin Biopsies

    PubMed Central

    Renieri, Carlo; La Terza, Antonietta

    2015-01-01

The objectives of the present study were to characterize the MC1R gene, its transcripts, and the single nucleotide polymorphisms (SNPs) associated with coat color in alpaca. Full-length cDNA amplification revealed the presence of two transcripts, named F1 and F2, differing only in the length of their 5′-terminal untranslated region (UTR) sequences and showing color-specific expression. Whereas the F1 transcript was common to white and colored (black and brown) alpaca phenotypes, the shorter F2 transcript was specific to white alpaca. Further sequencing of the MC1R gene in white and colored alpaca identified a total of twelve SNPs; nine of these were observed in the coding region (four silent mutations: c.126C>A, c.354T>C, c.618G>A, and c.933G>A; five missense mutations: c.82A>G, c.92C>T, c.259A>G, c.376A>G, and c.901C>T) and three in the 3′UTR. A 4 bp deletion (c.224_227del) was also identified in the coding region. Molecular segregation analysis revealed that combinations of mutations at the MC1R locus could account for eumelanin and pheomelanin synthesis in alpaca. Overall, our data refine what is known about the MC1R gene and provide additional information on its role in alpaca pigmentation. PMID:25685836

  11. Nuclear reactor transient analysis via a quasi-static kinetics Monte Carlo method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jo, YuGwon; Cho, Bumhee; Cho, Nam Zin, E-mail: nzcho@kaist.ac.kr

    2015-12-31

The predictor-corrector quasi-static (PCQS) method is applied to the Monte Carlo (MC) calculation for reactor transient analysis. To solve the transient fixed-source problem of the PCQS method, fission source iteration is used, and a linear approximation of the fission source distribution during a macro-time step is introduced to provide the delayed neutron source. The conventional particle-tracking procedure is modified to solve the transient fixed-source problem via MC calculation. The PCQS method with MC calculation is compared with the direct time-dependent method of characteristics (MOC) on a TWIGL two-group problem for verification of the computer code. Then, the results on a continuous-energy problem are presented.
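A minimal sketch of the linear-in-time fission source approximation used to provide the delayed neutron source, with illustrative one-group kinetics parameters (not values from the paper):

```python
import math

# Illustrative one-group delayed-neutron precursor balance over one macro-time
# step, dC/dt = beta*F(t) - lam*C, with the fission source F(t) interpolated
# linearly between its start and end values, as in the PCQS linear-source
# approximation. All parameter values below are assumptions, not the paper's.
beta, lam, dt = 0.0065, 0.08, 0.5   # delayed fraction, decay const (1/s), step (s)
F0, F1 = 1.0e5, 1.2e5               # fission source at the step start/end
C0 = beta * F0 / lam                # start from precursor equilibrium

def precursors_linear_source(C0, F0, F1, dt, n=10000):
    """Integrate the precursor ODE with F(t) linear on [0, dt] (midpoint Euler)."""
    C, h = C0, dt / n
    for i in range(n):
        t = (i + 0.5) * h
        F = F0 + (F1 - F0) * t / dt   # linear fission source
        C += h * (beta * F - lam * C)
    return C

C1 = precursors_linear_source(C0, F0, F1, dt)
```

Over the macro-time step the precursor population moves from the equilibrium of the initial source toward that of the final source, which is the quantity the delayed neutron source term needs.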

  12. Suitability of point kernel dose calculation techniques in brachytherapy treatment planning

    PubMed Central

    Lakshminarayanan, Thilagam; Subbaiah, K. V.; Thayalan, K.; Kannan, S. E.

    2010-01-01

A brachytherapy treatment planning system (TPS) is necessary to estimate the dose to the target volume and organs at risk (OARs). A TPS is expected to account for the effects of the tissue, applicator, and shielding material heterogeneities present in applicators. However, most brachytherapy TPS software packages estimate the absorbed dose at a point from the contributions of the individual sources and the source distribution alone, neglecting the dose perturbations arising from the applicator design and construction. This introduces uncertainties into dose rate estimates under realistic clinical conditions. In this regard, an attempt is made to explore the suitability of point kernels for brachytherapy dose rate calculations and to develop a new interactive brachytherapy package, named BrachyTPS, suited to clinical conditions. BrachyTPS is an interactive point kernel code package developed to perform independent dose rate calculations that take these heterogeneities into account, using the two-region build-up factors proposed by Kalos. The primary aim of this study is to validate the developed point kernel code package, integrated with treatment planning computational systems, against Monte Carlo (MC) results. In the present work, three brachytherapy applicators commonly used in the treatment of uterine cervical carcinoma, namely (i) the Board of Radiation Isotope and Technology (BRIT) low dose rate (LDR) applicator, (ii) the Fletcher Green type LDR applicator, and (iii) the Fletcher-Williamson high dose rate (HDR) applicator, are studied to test the accuracy of the software. Dose rates computed using the developed code are compared with the corresponding results of the MC simulations. Further, attempts are also made to study the dose rate distribution around a commercially available shielded vaginal applicator set (Nucletron). 
The percentage deviations of BrachyTPS-computed dose rate values from the MC results are within ±5.5% for the BRIT LDR applicator, vary from 2.6% to 5.1% for the Fletcher Green type LDR applicator, and reach −4.7% for the Fletcher-Williamson HDR applicator. The isodose distribution plots also show good agreement with previously published results. The isodose distributions around the shielded vaginal cylinder computed using the BrachyTPS code agree better with MC results in the unshielded region (deviations below 2%) than in the shielded region, where deviations of up to 5% are observed. The present study implies that accurate and fast validation of complicated treatment planning calculations is possible with the point kernel code package. PMID:20589118
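A minimal point-kernel sketch of the kind of calculation such a package automates; the attenuation coefficient, source strength, and the simple linear build-up factor are illustrative assumptions, not the two-region Kalos factors used by BrachyTPS:

```python
import math

# Minimal point-kernel sketch of a primary dose-rate estimate: exponential
# attenuation, inverse-square geometry, and a build-up correction B(mu*r).
# All values below are illustrative assumptions.
mu = 0.07   # assumed linear attenuation coefficient in water (1/cm)
S = 1.0e6   # assumed source strength (arbitrary units)

def buildup(mu_r, a=1.0):
    """Assumed linear (Taylor-form) build-up factor B = 1 + a*mu*r."""
    return 1.0 + a * mu_r

def dose_rate(r_cm):
    """Point-kernel estimate S * B(mu r) * exp(-mu r) / (4 pi r^2)."""
    mu_r = mu * r_cm
    return S * buildup(mu_r) * math.exp(-mu_r) / (4.0 * math.pi * r_cm ** 2)

ratio = dose_rate(1.0) / dose_rate(2.0)  # near the source, inverse-square dominates
```

Close to the source the ratio between 1 cm and 2 cm stays near 4, since build-up partly compensates attenuation while the inverse-square term dominates.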

  13. Parallel filtering in global gyrokinetic simulations

    NASA Astrophysics Data System (ADS)

    Jolliet, S.; McMillan, B. F.; Villard, L.; Vernay, T.; Angelino, P.; Tran, T. M.; Brunner, S.; Bottino, A.; Idomura, Y.

    2012-02-01

In this work, a Fourier solver [B.F. McMillan, S. Jolliet, A. Bottino, P. Angelino, T.M. Tran, L. Villard, Comp. Phys. Commun. 181 (2010) 715] is implemented in the global Eulerian gyrokinetic code GT5D [Y. Idomura, H. Urano, N. Aiba, S. Tokuda, Nucl. Fusion 49 (2009) 065029] and in the global Particle-In-Cell code ORB5 [S. Jolliet, A. Bottino, P. Angelino, R. Hatzky, T.M. Tran, B.F. McMillan, O. Sauter, K. Appert, Y. Idomura, L. Villard, Comp. Phys. Commun. 177 (2007) 409] in order to reduce the memory footprint of the matrix associated with the field equation. This scheme is verified with linear and nonlinear simulations of turbulence. It is demonstrated that the straight-field-line angle is the coordinate that optimizes the Fourier solver, that both linear and nonlinear turbulent states are unaffected by the parallel filtering, and that the k∥ spectrum is independent of plasma size at fixed normalized poloidal wave number.
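The parallel filtering itself can be sketched as a Fourier transform along the (straight-field-line) parallel coordinate that retains only low mode numbers; the grid size and cutoff below are illustrative, not ORB5/GT5D values:

```python
import numpy as np

# Sketch of a parallel Fourier filter of the kind described above: transform
# the field along the parallel coordinate, keep only the low mode numbers
# |k| <= k_max, and transform back. Grid size and cutoff are illustrative.
n = 64
theta = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
field = np.cos(theta) + 0.05 * np.cos(20.0 * theta)  # physical mode + noise

def parallel_filter(f, k_max):
    """Zero out all Fourier modes with |mode number| > k_max."""
    fk = np.fft.fft(f)
    k = np.fft.fftfreq(f.size, d=1.0 / f.size)  # integer mode numbers
    fk[np.abs(k) > k_max] = 0.0
    return np.fft.ifft(fk).real

filtered = parallel_filter(field, k_max=4)  # only the cos(theta) mode survives
```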

  14. Two MC1R loss-of-function alleles in cream-coloured Australian Cattle Dogs and white Huskies.

    PubMed

    Dürig, N; Letko, A; Lepori, V; Hadji Rasouliha, S; Loechel, R; Kehl, A; Hytönen, M K; Lohi, H; Mauri, N; Dietrich, J; Wiedmer, M; Drögemüller, M; Jagannathan, V; Schmutz, S M; Leeb, T

    2018-06-22

Loss-of-function variants in the MC1R gene cause recessive red or yellow coat-colour phenotypes in many species. The canine MC1R:c.916C>T (p.Arg306Ter) variant is widespread and found in a homozygous state in many uniformly yellow- or red-coloured dogs. We investigated cream-coloured Australian Cattle Dogs whose coat colour could not be explained by this variant. A genome-wide association study with 10 cream and 123 red Australian Cattle Dogs confirmed that the cream locus indeed maps to MC1R. Whole-genome sequencing of cream dogs revealed a single nucleotide variant within the MITF binding site of the canine MC1R promoter. We propose to designate the mutant allele at MC1R:c.916C>T as e1 and the new promoter variant allele as e2. Both alleles segregate in the Australian Cattle Dog breed. When we considered both alleles in combination, we observed perfect association between the MC1R genotypes and the cream coat-colour phenotype in a cohort of 10 cases and 324 control dogs. Analysis of the MC1R transcript levels in an e1/e2 compound heterozygous dog confirmed that the transcript levels of the e2 allele were markedly reduced with respect to the e1 allele. We further report another MC1R loss-of-function allele in Alaskan and Siberian Huskies, caused by a 2-bp deletion in the coding sequence, MC1R:c.816_817delCT. We propose to term this allele e3. Huskies that carry two copies of MC1R loss-of-function alleles have a white coat colour. © 2018 Stichting International Foundation for Animal Genetics.

  15. Space-radiation-induced Photon Luminescence of the Moon

    NASA Technical Reports Server (NTRS)

    Wilson, Thomas; Lee, Kerry

    2008-01-01

We report on the results of a study of the photon luminescence of the Moon induced by Galactic Cosmic Rays (GCRs) and space radiation from the Sun, using the Monte Carlo program FLUKA. The model of the lunar surface is taken to be the chemical composition of soils found at various landing sites during the Apollo and Luna programs, averaged over all such sites to define a generic regolith for the present analysis. This then becomes the target that is bombarded by GCRs and Solar Energetic Particles (SEPs) above 1 keV in FLUKA to determine the photon fluence albedo produced by the Moon's surface in the absence of sunlight and Earthshine. This is to be distinguished from the gamma-ray spectrum produced by the radioactive decay of radiogenic constituents in the surface and interior of the Moon. From the photon fluence we derive the spectrum, which can be utilized to examine existing lunar spectral data and to design orbiting instrumentation for measuring the various components of space-radiation-induced photon luminescence on the Moon.

  16. GPU-BASED MONTE CARLO DUST RADIATIVE TRANSFER SCHEME APPLIED TO ACTIVE GALACTIC NUCLEI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heymann, Frank; Siebenmorgen, Ralf, E-mail: fheymann@pa.uky.edu

    2012-05-20

A three-dimensional parallel Monte Carlo (MC) dust radiative transfer code is presented. To overcome the huge computing-time requirements of MC treatments, the computational power of vectorized hardware is used, utilizing either multi-core computer power or graphics processing units. The approach is a self-consistent way to solve the radiative transfer equation in arbitrary dust configurations. The code calculates the equilibrium temperatures of two populations of large grains and stochastically heated polycyclic aromatic hydrocarbons. Anisotropic scattering is treated by applying the Henyey-Greenstein phase function. The spectral energy distribution (SED) of the object is derived at low spatial resolution by a photon-counting procedure and at high spatial resolution by a vectorized ray tracer. The latter allows computation of high signal-to-noise images of the objects at arbitrary frequencies and viewing angles. We test the robustness of our approach against other radiative transfer codes. The SED and dust temperatures of one- and two-dimensional benchmarks are reproduced at high precision. The parallelization capability of various MC algorithms is analyzed and included in our treatment. We utilize the Lucy algorithm for the optically thin case where the Poisson noise is high, the iteration-free Bjorkman and Wood method to reduce the calculation time, and the Fleck and Canfield diffusion approximation for extremely optically thick cells. The code is applied to model the appearance of active galactic nuclei (AGNs) at optical and infrared wavelengths. The AGN torus is clumpy and includes fluffy composite grains of various sizes made up of silicates and carbon. The dependence of the SED on the number of clumps in the torus and the viewing angle is studied. The appearance of the 10 μm silicate feature in absorption or emission is discussed. 
The SED of the radio-loud quasar 3C 249.1 is fit by the AGN model and a cirrus component to account for the far-infrared emission.
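Anisotropic scattering with the Henyey-Greenstein phase function is typically sampled with the standard inversion formula; the asymmetry parameter below is an illustrative value:

```python
import math, random

# Standard inversion sampling of the Henyey-Greenstein scattering angle used
# for anisotropic scattering in dust MC codes; g is the asymmetry parameter
# (the value below is illustrative, not tied to the paper's dust model).
def sample_hg_costheta(g, rng=random.random):
    xi = rng()
    if abs(g) < 1e-6:  # isotropic limit
        return 2.0 * xi - 1.0
    frac = (1.0 - g * g) / (1.0 - g + 2.0 * g * xi)
    return (1.0 + g * g - frac * frac) / (2.0 * g)

random.seed(0)
g = 0.6
mean_mu = sum(sample_hg_costheta(g) for _ in range(200000)) / 200000
# for Henyey-Greenstein, the sample mean of cos(theta) converges to g
```

The sample mean of cos θ converging to g is the defining property of the Henyey-Greenstein distribution and a convenient unit check for MC scattering kernels.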

  17. A new dynamical atmospheric ionizing radiation (AIR) model for epidemiological studies

    NASA Technical Reports Server (NTRS)

    De Angelis, G.; Clem, J. M.; Goldhagen, P. E.; Wilson, J. W.

    2003-01-01

A new Atmospheric Ionizing Radiation (AIR) model is currently being developed for use in radiation dose evaluation in epidemiological studies targeted at atmospheric flight personnel such as civilian airline crew members. The model will allow computing values of biologically relevant parameters, e.g. dose equivalent and effective dose, for individual flights from 1945 onward. Each flight is described by its actual three-dimensional flight profile, i.e. geographic coordinates and altitudes varying with time. Solar-modulated primary particles are filtered with a new analytical, fully angle-dependent geomagnetic cutoff rigidity model, as a function of latitude, longitude, arrival direction, altitude, and time. The particle transport results have been obtained with a technique based on the three-dimensional Monte Carlo transport code FLUKA, with a special procedure to deal with HZE particles. Particle fluxes are transformed into dose-related quantities and then integrated along the flight path to obtain the overall flight dose. Preliminary validations of the particle transport technique using data from the AIR Project ER-2 flight campaign of measurements are encouraging. Future efforts will deal with modeling of the effects of the aircraft structure as well as inclusion of solar particle events. Published by Elsevier Ltd on behalf of COSPAR.
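A common simplification of such a cutoff model is the Störmer vertical cutoff rigidity in a dipole field, sketched below; the 14.9 GV constant and the dipole assumption are illustrative and omit the full angular dependence described above:

```python
import math

# Illustrative Störmer-type vertical cutoff rigidity (in GV) versus
# geomagnetic latitude, a common dipole-field simplification that drops the
# full angular dependence; the 14.9 GV constant assumes Earth's dipole
# moment and a location at one Earth radius.
def vertical_cutoff_gv(lat_deg, r_earth_radii=1.0):
    lam = math.radians(lat_deg)
    return 14.9 * math.cos(lam) ** 4 / r_earth_radii ** 2

eq = vertical_cutoff_gv(0.0)      # ~14.9 GV at the geomagnetic equator
polar = vertical_cutoff_gv(60.0)  # ~0.93 GV at 60 degrees latitude
```

The steep cos⁴ dependence is why low-rigidity solar and galactic particles reach the atmosphere mainly at high latitudes, driving the latitude dependence of aircrew doses.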

  18. Induced activation studies for the LHC upgrade to High Luminosity LHC

    NASA Astrophysics Data System (ADS)

    Adorisio, C.; Roesler, S.

    2018-06-01

The Large Hadron Collider (LHC) will be upgraded in 2019/2020 to increase its luminosity (rate of collisions) by a factor of five beyond its design value, and the integrated luminosity by a factor of ten, in order to maintain scientific progress and exploit its full capacity. The novel machine configuration, called High Luminosity LHC (HL-LHC), will consequently increase the level of activation of its components. The evaluation of the radiological impact of HL-LHC operation in the Long Straight Sections of Insertion Region 1 (ATLAS) and Insertion Region 5 (CMS) is presented. Using the Monte Carlo code FLUKA, ambient dose equivalent rate estimations have been performed on the basis of two announced operating scenarios and using the latest available machine layout. The HL-LHC project requires new technical infrastructure, with caverns and 300 m long tunnels along Insertion Regions 1 and 5. The new underground service galleries will be accessible during operation of the accelerator. The radiological risk assessment for the civil engineering work foreseen to start excavating the new galleries in the next LHC Long Shutdown, and the radiological impact of the machine operation, will be discussed.

  19. A single-shot nanosecond neutron pulsed technique for the detection of fissile materials

    NASA Astrophysics Data System (ADS)

    Gribkov, V.; Miklaszewski, R. A.; Chernyshova, M.; Scholz, M.; Prokopovicz, R.; Tomaszewski, K.; Drozdowicz, K.; Wiacek, U.; Gabanska, B.; Dworak, D.; Pytel, K.; Zawadka, A.

    2012-07-01

A novel technique with the potential to detect hidden fissile materials is presented, utilizing the interaction of a single powerful, nanosecond-wide neutron pulse with matter. The experimental system is based on a Dense Plasma Focus (DPF) device as a neutron source generating pulses of almost mono-energetic 2.45 MeV and/or 14.0 MeV neutrons, a few nanoseconds in width. Fissile materials, consisting of heavy nuclei, are detected using two signatures: first, by measuring those secondary fission neutrons which are faster than the elastically scattered 2.45 MeV neutrons of the D-D reaction in the DPF; second, by measuring the pulses of the slower secondary fission neutrons following the pulse of fast 14 MeV neutrons from the D-T reaction. In both cases it is important to compare the measured spectrum of the fission neutrons induced by the 2.45 MeV or 14 MeV neutron pulse of the DPF with theoretical spectra obtained by mathematical simulation. Therefore, results of numerical modelling of the proposed system, using the MCNP5 and FLUKA codes, are presented and compared with experimental data.
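The time-of-flight discrimination underlying the first signature can be sketched with simple relativistic kinematics; the 5 m flight path is an assumed value:

```python
import math

# Time-of-flight sketch of the first discrimination signature: fission
# neutrons above 2.45 MeV outrun the elastically scattered D-D source
# neutrons over a detector flight path. The 5 m path is an assumed value.
M_N = 939.565           # neutron rest mass (MeV)
C_M_S = 2.99792458e8    # speed of light (m/s)

def tof_ns(e_kin_mev, path_m=5.0):
    """Relativistic neutron time of flight in nanoseconds."""
    gamma = 1.0 + e_kin_mev / M_N
    beta = math.sqrt(1.0 - 1.0 / (gamma * gamma))
    return path_m / (beta * C_M_S) * 1e9

t_source = tof_ns(2.45)   # elastically scattered D-D neutrons
t_fission = tof_ns(4.0)   # a faster fission neutron arrives earlier
```

Because the source pulse is only a few nanoseconds wide, the tens-of-nanoseconds lead of the faster fission neutrons is resolvable at the detector.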

  20. Monte Carlo simulations for the shielding of the future high-intensity accelerator facility FAIR at GSI.

    PubMed

    Radon, T; Gutermuth, F; Fehrenbacher, G

    2005-01-01

The Gesellschaft für Schwerionenforschung (GSI) is planning a significant expansion of its accelerator facilities. Compared to the present GSI facility, a factor of 100 in primary beam intensities and up to a factor of 10,000 in secondary radioactive beam intensities are key technical goals of the proposal. The second branch of the so-called Facility for Antiproton and Ion Research (FAIR) is the production of antiprotons and their storage in rings and traps. The facility will provide beam energies a factor of approximately 15 higher than presently available at the GSI for all ions, from protons to uranium. The shielding design of the synchrotron SIS 100/300 is illustrated using Monte Carlo calculations with the FLUKA code. The experimental area serving the investigation of compressed baryonic matter is analysed in the same way. In addition, a dose comparison is made for an experimental area operated with medium-energy heavy-ion beams. Here, Monte Carlo calculations are performed using either heavy-ion primary particles or proton beams with intensities scaled by the mass number of the corresponding heavy-ion beam.

  1. Radiation protection design for the Super-FRS and SIS100 at the international FAIR facility

    NASA Astrophysics Data System (ADS)

    Kozlova, Ekaterina; Sokolov, Alexey; Radon, Torsten; Lang, Rupert; Conrad, Inna; Fehrenbacher, Georg; Weick, Helmut; Winkler, Martin

    2017-09-01

The new accelerator SIS100 and the Super-FRS will be built at the international Facility for Antiproton and Ion Research (FAIR). The synchrotron SIS100 is a core part of the FAIR facility, serving to accelerate ions such as uranium up to 2.7 GeV/u with intensities of 3×10¹¹ particles per second, or protons up to 30 GeV with intensities of 5×10¹² particles per second. The Super-FRS is a superconducting fragment separator; it will be able to separate all kinds of nuclear projectile fragments of primary heavy-ion beams, including uranium, with energies up to 1.5 GeV/u and intensities up to 3×10¹¹ particles per second. During operation, several components, especially the production target and the beam catchers, will become activated. For the handling of highly activated components, a hot cell with a connected storage area is foreseen. All calculations for the optimisation of the shielding design of the SIS100, the Super-FRS, and the hot cell were performed using the Monte Carlo code FLUKA; results are presented.

  2. Conception and realization of a parallel-plate free-air ionization chamber for the absolute dosimetry of an ultrasoft X-ray beam

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Groetz, J.-E., E-mail: jegroetz@univ-fcomte.fr; Mavon, C.; Fromm, M.

    2014-08-15

We report the design of a millimeter-sized parallel-plate free-air ionization chamber (IC) aimed at determining the absolute air kerma rate of an ultra-soft X-ray beam (E = 1.5 keV). The size of the IC was determined so that the measurement volume satisfies the condition of charged-particle equilibrium. The correction factors necessary to properly measure the absolute kerma using the IC have been established. Particular attention was given to the determination of the effective mean energy for the 1.5 keV photons using the PENELOPE code. Other correction factors were determined by means of computer simulation (COMSOL™ and FLUKA). Measurements of air kerma rates under specific operating parameters of the lab-bench X-ray source have been performed at various distances from that source and compared to Monte Carlo calculations. We show that the developed ionization chamber makes it possible to determine accurate photon fluence rates in routine work and will constitute substantial time savings for future radiobiological experiments based on the use of ultra-soft X-rays.

  3. On the origin of the visible light responsible for proton dose measurement using plastic optical fibers

    NASA Astrophysics Data System (ADS)

    Darafsheh, Arash; Taleei, Reza; Kassaee, Alireza; Finlay, Jarod C.

    2017-03-01

We investigated, experimentally and by means of Monte Carlo simulations, the origin of the visible signal responsible for proton therapy dose measurement using bare plastic optical fibers. Experimentally, the fiber optic probe, embedded in tissue-mimicking plastics, was irradiated with a proton beam produced by a proton therapy cyclotron, and luminescence spectroscopy was performed with a CCD-coupled spectrograph to analyze the emission spectrum of the fiber tip. Monte Carlo simulations were performed using the FLUKA Monte Carlo code to stochastically simulate radiation transport, ionizing radiation dose deposition, and optical emission of Čerenkov radiation. The spectroscopic study of proton-irradiated plastic fibers showed a continuous spectrum with a shape different from that of Čerenkov radiation. The Monte Carlo simulations confirmed that the amount of generated Čerenkov light does not follow the radiation absorbed dose in a medium. Our results show that the origin of the optical signal responsible for the proton dose measurement using bare optical fibers is not Čerenkov radiation. Our results point toward a connection between the scintillation of the plastic material of the fiber and the origin of the signal responsible for dose measurement.
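A back-of-envelope threshold calculation is consistent with this conclusion: Čerenkov emission requires β > 1/n, and for an assumed PMMA fiber core (n ≈ 1.49) the proton threshold lies well above clinical energies:

```python
import math

# Back-of-envelope check consistent with the conclusion above: Cherenkov
# emission requires beta > 1/n, and for an assumed PMMA fiber core
# (n = 1.49) the proton threshold energy lies far above clinical beam
# energies (~70-250 MeV), so primary protons cannot emit Cherenkov light.
M_P = 938.272  # proton rest mass (MeV)

def cherenkov_threshold_mev(n):
    """Kinetic energy at which a proton reaches beta = 1/n."""
    beta = 1.0 / n
    gamma = 1.0 / math.sqrt(1.0 - beta * beta)
    return M_P * (gamma - 1.0)

e_th = cherenkov_threshold_mev(1.49)  # roughly 327 MeV
```

Any Čerenkov contribution must therefore come from lighter secondaries (e.g. delta electrons), which is why the measured light does not track the proton dose.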

  4. Dose response of alanine detectors irradiated with carbon ion beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herrmann, Rochus; Jaekel, Oliver; Palmans, Hugo

Purpose: The dose response of the alanine detector shows a dependence on particle energy and type when irradiated with ion beams. The purpose of this study is to investigate the response behavior of the alanine detector in clinical carbon ion beams and compare the results to model predictions. Methods: Alanine detectors have been irradiated with carbon ions with an energy range of 89-400 MeV/u. The relative effectiveness of alanine has been measured in this regime. Pristine and spread-out Bragg peak depth-dose curves have been measured with alanine dosimeters. The track-structure based alanine response model developed by Hansen and Olsen has been implemented in the Monte Carlo code FLUKA, and calculations were compared to experimental results. Results: Calculations of the relative effectiveness deviate less than 5% from the measured values for monoenergetic beams. Measured depth-dose curves deviate from predictions in the peak region, most pronounced at the distal edge of the peak. Conclusions: The used model and its implementation show good overall agreement for quasi-monoenergetic measurements. Deviations in depth-dose measurements are mainly attributed to uncertainties in the detector geometry implemented in the Monte Carlo simulations.

  5. Single event effects in high-energy accelerators

    NASA Astrophysics Data System (ADS)

García Alía, Rubén; Brugger, Markus; Danzeca, Salvatore; Cerutti, Francesco; de Carvalho Saraiva, Joao Pedro; Denz, Reiner; Ferrari, Alfredo; Foro, Lionel L.; Peronnard, Paul; Røed, Ketil; Secondo, Raffaello; Steckert, Jens; Thurel, Yves; Toccafondo, Iacopo; Uznanski, Slawosz

    2017-03-01

The radiation environment encountered at high-energy hadron accelerators strongly differs from the environment relevant for space applications. The mixed field expected at modern accelerators is composed of charged and neutral hadrons (protons, pions, kaons and neutrons), photons, electrons, positrons and muons, ranging from very low (thermal) energies up to the TeV range. This complex field, which is extensively simulated by Monte Carlo codes (e.g. FLUKA), is due to beam losses in the experimental areas, distributed along the machine (e.g. collimation points) and deriving from the interaction with the residual gas inside the beam pipe. The resulting intensity, energy distribution and proportion of the different particles largely depend on the distance and angle with respect to the interaction point, as well as the amount of installed shielding material. Electronics operating in the vicinity of the accelerator will therefore be subject both to cumulative radiation damage (total ionizing dose, displacement damage) and to single event effects which can seriously compromise the operation of the machine. This, combined with the extensive use of commercial off-the-shelf components for budget, performance and availability reasons, results in the need to carefully characterize the response of the devices and systems to representative radiation conditions.

  6. Modeling of transitional flows

    NASA Technical Reports Server (NTRS)

    Lund, Thomas S.

    1988-01-01

    An effort directed at developing improved transitional models was initiated. The focus of this work was concentrated on the critical assessment of a popular existing transitional model developed by McDonald and Fish in 1972. The objective of this effort was to identify the shortcomings of the McDonald-Fish model and to use the insights gained to suggest modifications or alterations of the basic model. In order to evaluate the transitional model, a compressible boundary layer code was required. Accordingly, a two-dimensional compressible boundary layer code was developed. The program was based on a three-point fully implicit finite difference algorithm where the equations were solved in an uncoupled manner with second order extrapolation used to evaluate the non-linear coefficients. Iteration was offered as an option if the extrapolation error could not be tolerated. The differencing scheme was arranged to be second order in both spatial directions on an arbitrarily stretched mesh. A variety of boundary condition options were implemented including specification of an external pressure gradient, specification of a wall temperature distribution, and specification of an external temperature distribution. Overall the results of the initial phase of this work indicate that the McDonald-Fish model does a poor job at predicting the details of the turbulent flow structure during the transition region.

  7. 78 FR 47695 - Sam Rayburn Dam Power Rate

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-06

    ... replacements in the hydroelectric generating facilities and small increases to annual operations and... K. McDonald, Vice President for Corporate Operations/Chief Operating Officer, Southwestern Power... Code of Federal Regulations (18 CFR 300). Southwestern markets power from 24 multi-purpose reservoir...

  8. Determination of SPEAR-1 Rocket Body Potential during High-Voltage Experiments

    DTIC Science & Technology

    1990-06-01

Dr. C. E. McIlwain, Center for Astrophysics and Space Science, University of California at San Diego, La Jolla, CA 92093. Naval Postgraduate School, Monterey, CA 93943-5000.

  9. Code TESLA for Modeling and Design of High-Power High-Efficiency Klystrons

    DTIC Science & Technology

    2011-03-01

CODE TESLA FOR MODELING AND DESIGN OF HIGH-POWER HIGH-EFFICIENCY KLYSTRONS * I.A. Chernyavskiy, SAIC, McLean, VA 22102, U.S.A. S.J. Cooke, B...and multiple-beam klystrons as high-power RF sources. These sources are widely used or proposed to be used in accelerators in the future. Comparison...of TESLA modelling results with experimental data for a few multiple-beam klystrons are shown. INTRODUCTION High-power and high-efficiency

  10. IGG Subclass and Isotype Specific Immunoglobulin Responses to Lassa Fever and Venezuelan Equine Encephalomyelitis: Natural Infection and Immunization

    DTIC Science & Technology

    1990-09-30

EQUINE ENCEPHALOMYELITIS: NATURAL INFECTION AND IMMUNIZATION. PRINCIPAL INVESTIGATOR: Renata J. Engler, LTC, MC. CONTRACTING ORGANIZATION: Uniformed Services University of the Health Sciences, Department of Medicine, Bethesda, MD 20814-4799. REPORT DATE: September 30, 1990.

  11. Expression of the histone chaperone SET/TAF-Iβ during the strobilation process of Mesocestoides corti (Platyhelminthes, Cestoda).

    PubMed

    Costa, Caroline B; Monteiro, Karina M; Teichmann, Aline; da Silva, Edileuza D; Lorenzatto, Karina R; Cancela, Martín; Paes, Jéssica A; Benitz, André de N D; Castillo, Estela; Margis, Rogério; Zaha, Arnaldo; Ferreira, Henrique B

    2015-08-01

    The histone chaperone SET/TAF-Iβ is implicated in processes of chromatin remodelling and gene expression regulation. It has been associated with the control of developmental processes, but little is known about its function in helminth parasites. In Mesocestoides corti, a partial cDNA sequence related to SET/TAF-Iβ was isolated in a screening for genes differentially expressed in larvae (tetrathyridia) and adult worms. Here, the full-length coding sequence of the M. corti SET/TAF-Iβ gene was analysed and the encoded protein (McSET/TAF) was compared with orthologous sequences, showing that McSET/TAF can be regarded as a SET/TAF-Iβ family member, with a typical nucleosome-assembly protein (NAP) domain and an acidic tail. The expression patterns of the McSET/TAF gene and protein were investigated during the strobilation process by RT-qPCR, using a set of five reference genes, and by immunoblot and immunofluorescence, using monospecific polyclonal antibodies. A gradual increase in McSET/TAF transcripts and McSET/TAF protein was observed upon development induction by trypsin, demonstrating McSET/TAF differential expression during strobilation. These results provided the first evidence for the involvement of a protein from the NAP family of epigenetic effectors in the regulation of cestode development.

  12. Fast CPU-based Monte Carlo simulation for radiotherapy dose calculation.

    PubMed

    Ziegenhein, Peter; Pirner, Sven; Ph Kamerling, Cornelis; Oelfke, Uwe

    2015-08-07

Monte Carlo (MC) simulations are considered to be the most accurate method for calculating dose distributions in radiotherapy. Their clinical application, however, is still limited by the long runtimes that conventional implementations of MC algorithms require to deliver sufficiently accurate results on high-resolution imaging data. In order to overcome this obstacle we developed the software package PhiMC, which is capable of computing precise dose distributions in a sub-minute time frame by leveraging the potential of modern many- and multi-core CPU-based computers. PhiMC is based on the well-verified dose planning method (DPM). We could demonstrate that PhiMC delivers dose distributions which are in excellent agreement with DPM. The multi-core implementation of PhiMC scales well between different computer architectures and achieves a speed-up of up to 37× compared to the original DPM code executed on a modern system. Furthermore, we could show that our CPU-based implementation on a modern workstation is between 1.25× and 1.95× faster than a well-known GPU implementation of the same simulation method on an NVIDIA Tesla C2050. Since CPUs can work with several hundreds of GB of RAM, the typical GPU memory limitation does not apply to our implementation, and high-resolution clinical plans can be calculated.

  13. Cyclotron resonant scattering feature simulations. II. Description of the CRSF simulation process

    NASA Astrophysics Data System (ADS)

    Schwarm, F.-W.; Ballhausen, R.; Falkner, S.; Schönherr, G.; Pottschmidt, K.; Wolff, M. T.; Becker, P. A.; Fürst, F.; Marcu-Cheatham, D. M.; Hemphill, P. B.; Sokolova-Lapa, E.; Dauser, T.; Klochkov, D.; Ferrigno, C.; Wilms, J.

    2017-05-01

    Context. Cyclotron resonant scattering features (CRSFs) are formed by scattering of X-ray photons off quantized plasma electrons in the strong magnetic field (of the order of 10^12 G) close to the surface of an accreting X-ray pulsar. Due to the complex scattering cross-sections, the line profiles of CRSFs cannot be described by an analytic expression. Numerical methods, such as Monte Carlo (MC) simulations of the scattering processes, are required in order to predict precise line shapes for a given physical setup, which can be compared to observations to gain information about the underlying physics in these systems. Aims: A versatile simulation code is needed for the generation of synthetic cyclotron lines, one that makes sophisticated geometries investigatable for the first time. Methods: The simulation utilizes the mean free path tables described in the first paper of this series for the fast interpolation of propagation lengths. The code is parallelized to make the very time-consuming simulations possible on convenient time scales. Furthermore, it can generate responses to monoenergetic photon injections, producing Green's functions, which can be used later to generate spectra for arbitrary continua. Results: We develop a new simulation code to generate synthetic cyclotron lines for complex scenarios, allowing for unprecedented physical interpretation of the observed data. An associated XSPEC model implementation is used to fit synthetic line profiles to NuSTAR data of Cep X-4. The code has been developed with the main goal of overcoming previous geometrical constraints in MC simulations of CRSFs. By applying this code also to more simple, classic geometries used in previous works, we furthermore address issues of code verification and cross-comparison of various models. The XSPEC model and the Green's function tables are available online (see link in footnote, page 1).
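    The Green's-function idea described above (store the response to each monoenergetic injection once, then fold any continuum through it) can be sketched with a toy response table; the matrix below is an illustrative assumption, not simulation output:

```python
import numpy as np

# Toy Green's-function table: column i holds the simulated output spectrum for
# a monoenergetic injection into energy bin i; each column sums to 1 so that
# photons are conserved.
n_bins = 5
G = 0.8 * np.eye(n_bins) + (0.2 / n_bins) * np.ones((n_bins, n_bins))

def spectrum_for(continuum):
    """Fold an arbitrary input continuum through the precomputed responses;
    changing the continuum model needs no re-simulation."""
    return G @ continuum
```

    This is why tabulated Green's functions are worth distributing: fitting a different continuum model only costs a matrix product, not a new MC run.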

  14. Beam Induced Hydrodynamic Tunneling in the Future Circular Collider Components

    NASA Astrophysics Data System (ADS)

    Tahir, N. A.; Burkart, F.; Schmidt, R.; Shutov, A.; Wollmann, D.; Piriz, A. R.

    2016-08-01

    A future circular collider (FCC) has been proposed as a post-Large Hadron Collider accelerator, to explore particle physics in unprecedented energy ranges. The FCC is a circular collider in a tunnel with a circumference of 80-100 km. The FCC study puts an emphasis on proton-proton high-energy and electron-positron high-intensity frontier machines; a proton-electron interaction scenario is also examined. According to the nominal FCC parameters, each of the 50 TeV proton beams will carry 8.5 GJ of energy, equivalent to the kinetic energy of an Airbus A380 (560 t) at a typical speed of 850 km/h. Safety of operation with such extremely energetic beams is an important issue, as off-nominal beam loss can cause serious damage to the accelerator and detector components, with a severe impact on the accelerator environment. In order to estimate the consequences of an accident with the full beam accidentally deflected into equipment, we have carried out numerical simulations of the interaction of an FCC beam with a solid copper target, using an energy-deposition code (fluka) and a 2D hydrodynamic code (big2) iteratively. These simulations show that, although the penetration length of a single FCC proton and its shower in solid copper is about 1.5 m, the full FCC beam will penetrate up to about 350 m into the target because of "hydrodynamic tunneling." The simulations also show that a significant part of the target is converted into high-energy-density matter, and we discuss this interesting aspect of the study.

  15. The Effectiveness of an Interactive Map Display in Tutoring Geography

    DTIC Science & Technology

    1976-08-01


  16. Computational Modeling and Validation for Hypersonic Inlets

    NASA Technical Reports Server (NTRS)

    Povinelli, Louis A.

    1996-01-01

    Hypersonic inlet research activity at NASA is reviewed. The basis for the paper is the experimental tests performed with three inlets: the NASA Lewis Research Center Mach 5, the McDonnell Douglas Mach 12, and the NASA Langley Mach 18. Both three-dimensional PNS and NS codes have been used to compute the flow within the three inlets. Modeling assumptions in the codes involve the turbulence model, the nature of the boundary layer, shock wave-boundary layer interaction, and the flow spilled to the outside of the inlet. Use of the codes and the experimental data is helping to develop a clearer understanding of the inlet flow physics and to focus on the modeling improvements required in order to arrive at validated codes.

  17. Newtonian CAFE: a new ideal MHD code to study the solar atmosphere

    NASA Astrophysics Data System (ADS)

    González, J. J.; Guzmán, F.

    2015-12-01

    In this work we present a new independent code designed to solve the equations of classical ideal magnetohydrodynamics (MHD) in three dimensions, subject to a constant gravitational field. The purpose of the code centers on the analysis of solar phenomena within the photosphere-corona region. In particular, the code can simulate the propagation of impulsively generated linear and non-linear MHD waves in the non-isothermal solar atmosphere. We present 1D and 2D standard tests to demonstrate the quality of the numerical results obtained with our code. As 3D tests we present the propagation of MHD-gravity waves and vortices in the solar atmosphere. The code is based on high-resolution shock-capturing methods and uses the HLLE flux formula combined with the minmod, MC and WENO5 reconstructors. The divergence-free magnetic field constraint is controlled using the flux-constrained transport method.
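    The slope-limited reconstruction used by such shock-capturing schemes can be illustrated with the minmod limiter; the indexing and data below are illustrative, not taken from the CAFE code:

```python
def minmod(a, b):
    """Minmod slope limiter: zero when the left and right differences disagree
    in sign (a local extremum), otherwise the smaller-magnitude slope (TVD)."""
    if a * b <= 0.0:
        return 0.0
    return a if abs(a) < abs(b) else b

def face_values(u, i):
    """Piecewise-linear reconstruction of cell i: limited slope from the
    neighbouring differences, evaluated at the left and right cell faces."""
    slope = minmod(u[i] - u[i - 1], u[i + 1] - u[i])
    return u[i] - 0.5 * slope, u[i] + 0.5 * slope
```

    The reconstructed face values feed an approximate Riemann solver such as HLLE; sharper reconstructors (MC, WENO5) trade extra stencil width for less numerical diffusion.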

  18. DPM, a fast, accurate Monte Carlo code optimized for photon and electron radiotherapy treatment planning dose calculations

    NASA Astrophysics Data System (ADS)

    Sempau, Josep; Wilderman, Scott J.; Bielajew, Alex F.

    2000-08-01

    A new Monte Carlo (MC) algorithm, the `dose planning method' (DPM), and its associated computer program for simulating the transport of electrons and photons in radiotherapy class problems employing primary electron beams, is presented. DPM is intended to be a high-accuracy MC alternative to the current generation of treatment planning codes which rely on analytical algorithms based on an approximate solution of the photon/electron Boltzmann transport equation. For primary electron beams, DPM is capable of computing 3D dose distributions (in 1 mm^3 voxels) which agree to within 1% in dose maximum with widely used and exhaustively benchmarked general-purpose public-domain MC codes in only a fraction of the CPU time. A representative problem, the simulation of 1 million 10 MeV electrons impinging upon a water phantom of 128^3 voxels of 1 mm on a side, can be performed by DPM in roughly 3 min on a modern desktop workstation. DPM achieves this performance by employing transport mechanics and electron multiple scattering distribution functions which have been derived to permit long transport steps (of the order of 5 mm) which can cross heterogeneity boundaries. The underlying algorithm is a `mixed' class simulation scheme, with differential cross sections for hard inelastic collisions and bremsstrahlung events described in an approximate manner to simplify their sampling. The continuous energy loss approximation is employed for energy losses below some predefined thresholds, and photon transport (including Compton, photoelectric absorption and pair production) is simulated in an analogue manner. The δ-scattering method (Woodcock tracking) is adopted to minimize the computational costs of transporting photons across voxels.
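    The δ-scattering (Woodcock tracking) technique named above can be sketched in a few lines: free paths are sampled from a constant majorant cross-section, and a candidate collision is accepted as real with probability mu(x)/mu_max, so the photon never has to stop at voxel boundaries. The 1D geometry here is an illustration, not DPM code:

```python
import numpy as np

def woodcock_distance(x0, mu_of_x, mu_max, rng):
    """Distance to the next *real* collision by Woodcock (delta-scattering)
    tracking. mu_max must bound mu_of_x everywhere; rejected candidates are
    fictitious 'delta' scatters and the flight simply continues."""
    x = x0
    while True:
        x += -np.log(rng.random()) / mu_max       # free path from the majorant
        if rng.random() * mu_max < mu_of_x(x):    # real collision at x?
            return x
```

    With a spatially constant mu the accepted distances are exponential with mean 1/mu, which makes the method easy to sanity-check.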

  19. A novel Monte Carlo algorithm for simulating crystals with McStas

    NASA Astrophysics Data System (ADS)

    Alianelli, L.; Sánchez del Río, M.; Felici, R.; Andersen, K. H.; Farhi, E.

    2004-07-01

    We developed an original Monte Carlo algorithm for the simulation of Bragg diffraction by mosaic, bent and gradient crystals. It has practical applications, as it can be used for simulating imperfect crystals (monochromators, analyzers and perhaps samples) in neutron ray-tracing packages like McStas. The code we describe here provides a detailed description of the particle interaction with the microscopic homogeneous regions composing the crystal; it can therefore also be used for the calculation of quantities of conceptual interest, such as multiple scattering, or for the interpretation of experiments aimed at characterizing crystals, such as diffraction topographs.

  20. Peer-to-peer Monte Carlo simulation of photon migration in topical applications of biomedical optics

    NASA Astrophysics Data System (ADS)

    Doronin, Alexander; Meglinski, Igor

    2012-09-01

    In the framework of further development of the unified approach to photon migration in complex turbid media, such as biological tissues, we present a peer-to-peer (P2P) Monte Carlo (MC) code. Object-oriented programming is used to generalize the MC model for multipurpose use in various applications of biomedical optics. The online user interface providing multiuser access is developed using modern web technologies, such as Microsoft Silverlight and ASP.NET. The emerging P2P network, utilizing computers with different types of compute unified device architecture (CUDA)-capable graphics processing units (GPUs), is applied for acceleration and to overcome the limitations imposed by multiuser access in the online MC computational tool. The developed P2P MC was validated by comparing the results of simulation of diffuse reflectance and fluence rate distribution for a semi-infinite scattering medium with known analytical results, with results of the adding-doubling method, and with other GPU-based MC techniques developed in the past. The best speedup of processing multiuser requests, in a range of 4 to 35 s, was achieved using single-precision computing, while double-precision computing for floating-point arithmetic operations provides higher accuracy.
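    A purely illustrative sketch of the weighted photon-packet random walk that photon-migration MC codes are built on (the 1D-cosine scattering, weight cut-off and optical parameters below are simplifying assumptions, not the authors' model):

```python
import numpy as np

def diffuse_reflectance(mu_a, mu_s, n_packets=5000, seed=1):
    """Weighted photon packets in a semi-infinite scattering medium: at each
    interaction a packet keeps the single-scattering-albedo fraction of its
    weight; weight escaping back through the surface (z < 0) is tallied as
    diffuse reflectance."""
    rng = np.random.default_rng(seed)
    mu_t = mu_a + mu_s
    albedo = mu_s / mu_t
    reflected = 0.0
    for _ in range(n_packets):
        z, uz, w = 0.0, 1.0, 1.0               # launch straight down
        while w > 1e-4:                        # crude cut-off (no roulette)
            z += uz * (-np.log(rng.random()) / mu_t)
            if z < 0.0:
                reflected += w                 # escaped: score and stop
                break
            w *= albedo                        # absorb (1 - albedo) of weight
            uz = 2.0 * rng.random() - 1.0      # isotropic 1D-cosine rescatter
    return reflected / n_packets
```

    As expected, reflectance rises with the scattering-to-absorption ratio; validating such estimates against analytical and adding-doubling results, as done in the paper, is the standard check.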

  1. Peer-to-peer Monte Carlo simulation of photon migration in topical applications of biomedical optics.

    PubMed

    Doronin, Alexander; Meglinski, Igor

    2012-09-01

    In the framework of further development of the unified approach to photon migration in complex turbid media, such as biological tissues, we present a peer-to-peer (P2P) Monte Carlo (MC) code. Object-oriented programming is used to generalize the MC model for multipurpose use in various applications of biomedical optics. The online user interface providing multiuser access is developed using modern web technologies, such as Microsoft Silverlight and ASP.NET. The emerging P2P network, utilizing computers with different types of compute unified device architecture (CUDA)-capable graphics processing units (GPUs), is applied for acceleration and to overcome the limitations imposed by multiuser access in the online MC computational tool. The developed P2P MC was validated by comparing the results of simulation of diffuse reflectance and fluence rate distribution for a semi-infinite scattering medium with known analytical results, with results of the adding-doubling method, and with other GPU-based MC techniques developed in the past. The best speedup of processing multiuser requests, in a range of 4 to 35 s, was achieved using single-precision computing, while double-precision computing for floating-point arithmetic operations provides higher accuracy.

  2. McMillan Lens in a System with Space Charge

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lobach, I.; Nagaitsev, S.; Stern, E.

    Space charge (SC) in a circulating beam in a ring produces both betatron tune shift and betatron tune spread. These effects make some particles move onto a machine resonance and become unstable. Linear beam-optics elements cannot reduce the tune spread induced by SC because of its intrinsic nonlinear nature. We investigate the possibility of mitigating it with a thin McMillan lens providing a nonlinear, axially symmetric kick that is qualitatively opposite to the accumulated SC kick. Experimentally, the proposed concept can be tested in Fermilab's IOTA ring. A thin McMillan lens can be implemented by a short (70 cm) insertion of an electron beam with a specifically chosen transverse density distribution. In this article, to see if McMillan lenses reduce the tune spread induced by SC, we make several simulations with the particle-tracking code Synergia. We choose beam and lattice parameters such that the tune spread is roughly 0.5 and a beam instability due to the half-integer resonance 0.5 is observed. Then, we try to reduce emittance growth by shifting betatron tunes by adjusting quadrupoles and reducing the tune spread by McMillan lenses.

  3. Thermospray Liquid Chromatography/Mass Spectrometry of Mustard and Its Metabolites

    DTIC Science & Technology

    1989-05-01


  4. Universal Frequency Domain Baseband Receiver Structure for Future Military Software Defined Radios

    DTIC Science & Technology

    2010-09-01

    selective channels, i.e., it may have a poor performance at good conditions [4]. Military systems may require a direct sequence ( DS ) component for...frequency bins using a spreading code. This is called the MC- CDMA signal. Note that spreading does not need to cover all the subcarriers but just a few, like...preambles with appropriate frequency domain properties. A DS component can be added as usually. The FDP block then includes this code as a reference

  5. Results of SEI Independent Research and Development Projects and Report on Emerging Technologies and Technology Trends

    DTIC Science & Technology

    2004-10-01

    Top-Level Process for Identification and Analysis of Safety-Related Re- quirements 4.4 Collaborators The primary SEI team members were Don Firesmith...Graff, M. & van Wyk, K. Secure Coding Principles & Practices. O’Reilly, 2003. • Hoglund, G. & McGraw, G. Exploiting Software: How to Break Code. Addison...Eisenecker, U.; Glück, R.; Vandevoorde, D.; & Veldhuizen , T. “Generative Programming and Active Libraries (Extended Abstract)” <osl.iu.edu/~tveldhui/papers

  6. Dynamic Detection of Malicious Code in COTS Software

    DTIC Science & Technology

    2000-04-01

    run the following documented hostile applets or ActiveX controls; some of these tools work only on mobile code (Java, ActiveX). Test results excerpt: eSafe Protect Desktop blocked 9/9 hostile applets and 13/17 ActiveX controls; Surfinshield Online blocked 9/9 and 13/17. Exploder is an ActiveX control that performs a clean shutdown of your computer. The interface is attractive, although rather complex, as McLain's

  7. General Monte Carlo reliability simulation code including common mode failures and HARP fault/error-handling

    NASA Technical Reports Server (NTRS)

    Platt, M. E.; Lewis, E. E.; Boehm, F.

    1991-01-01

    A Monte Carlo Fortran computer program was developed that uses two variance reduction techniques for computing system reliability, applicable to solving very large, highly reliable fault-tolerant systems. The program is consistent with the hybrid automated reliability predictor (HARP) code, which employs behavioral decomposition and complex fault/error-handling models. This new capability, called MC-HARP, efficiently solves reliability models with non-constant failure rates (Weibull); common-mode failure modeling is also included.
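    Direct (unaccelerated) MC sampling of such a reliability model is easy to sketch; the 2-out-of-3 system and Weibull parameters below are illustrative assumptions, and variance-reduction techniques like those in MC-HARP exist precisely because this naive estimator becomes expensive for highly reliable systems:

```python
import numpy as np

def mission_unreliability(shape, scale, n_comp=3, k_fail=2,
                          t_mission=1000.0, n_trials=50_000, seed=7):
    """Probability that at least k_fail of n_comp components fail before the
    mission time, with i.i.d. Weibull component lifetimes (shape=1 reduces to
    the constant-failure-rate exponential case)."""
    rng = np.random.default_rng(seed)
    lifetimes = scale * rng.weibull(shape, size=(n_trials, n_comp))
    n_failed = (lifetimes < t_mission).sum(axis=1)
    return (n_failed >= k_fail).mean()
```

    For shape=1 the estimate can be checked against the closed-form binomial answer, which is how such a simulation is typically verified.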

  8. Analysis of thermo-chemical nonequilibrium models for carbon dioxide flows

    NASA Technical Reports Server (NTRS)

    Rock, Stacey G.; Candler, Graham V.; Hornung, Hans G.

    1992-01-01

    The aerothermodynamics of thermochemical nonequilibrium carbon dioxide flows is studied. The chemical kinetics models of McKenzie and Park are implemented in separate three-dimensional computational fluid dynamics codes. The codes incorporate a five-species gas model characterized by a translational-rotational and a vibrational temperature. Solutions are obtained for flow over finite length elliptical and circular cylinders. The computed flowfields are then employed to calculate Mach-Zehnder interferograms for comparison with experimental data. The accuracy of the chemical kinetics models is determined through this comparison. Also, the methodology of the three-dimensional thermochemical nonequilibrium code is verified by the reproduction of the experiments.

  9. Evidence that multiple genetic variants of MC4R play a functional role in the regulation of energy expenditure and appetite in Hispanic children

    PubMed Central

    Cole, Shelley A; Voruganti, V Saroja; Cai, Guowen; Haack, Karin; Kent, Jack W; Blangero, John; Comuzzie, Anthony G; McPherson, John D; Gibbs, Richard A

    2010-01-01

    Background: Melanocortin-4-receptor (MC4R) haploinsufficiency is the most common form of monogenic obesity; however, the frequency of MC4R variants and their functional effects in general populations remain uncertain. Objective: The aim was to identify and characterize the effects of MC4R variants in Hispanic children. Design: MC4R was resequenced in 376 parents, and the identified single nucleotide polymorphisms (SNPs) were genotyped in 613 parents and 1016 children from the Viva la Familia cohort. Measured genotype analysis (MGA) tested associations between SNPs and phenotypes. Bayesian quantitative trait nucleotide (BQTN) analysis was used to infer the most likely functional polymorphisms influencing obesity-related traits. Results: Seven rare SNPs in coding and 18 SNPs in flanking regions of MC4R were identified. MGA showed suggestive associations between MC4R variants and body size, adiposity, glucose, insulin, leptin, ghrelin, energy expenditure, physical activity, and food intake. BQTN analysis identified SNP 1704 in a predicted micro-RNA target sequence in the downstream flanking region of MC4R as a strong, probable functional variant influencing total, sedentary, and moderate activities with posterior probabilities of 1.0. SNP 2132 was identified as a variant with a high probability (1.0) of exerting a functional effect on total energy expenditure and sleeping metabolic rate. SNP rs34114122 was selected as having likely functional effects on the appetite hormone ghrelin, with a posterior probability of 0.81. Conclusion: This comprehensive investigation provides strong evidence that MC4R genetic variants are likely to play a functional role in the regulation of weight, not only through energy intake but through energy expenditure. PMID:19889825

  10. Pilot-Assisted Channel Estimation for Orthogonal Multi-Carrier DS-CDMA with Frequency-Domain Equalization

    NASA Astrophysics Data System (ADS)

    Shima, Tomoyuki; Tomeba, Hiromichi; Adachi, Fumiyuki

    Orthogonal multi-carrier direct sequence code division multiple access (orthogonal MC DS-CDMA) is a combination of time-domain spreading and orthogonal frequency division multiplexing (OFDM). In orthogonal MC DS-CDMA, a frequency diversity gain can be obtained by applying frequency-domain equalization (FDE) based on the minimum mean square error (MMSE) criterion to a block of OFDM symbols, improving the bit error rate (BER) performance in a severely frequency-selective fading channel. FDE requires an accurate estimate of the channel gain, which can be obtained by removing the pilot modulation in the frequency domain. In this paper, we propose a pilot-assisted channel estimation suitable for orthogonal MC DS-CDMA with FDE and evaluate, by computer simulation, the BER performance in a frequency-selective Rayleigh fading channel.
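    The two ingredients described above, pilot-assisted channel estimation and one-tap MMSE FDE, can be sketched per subcarrier as follows; the weight w_k = conj(H_k) / (|H_k|^2 + 1/SNR) is the standard single-user form, used here as a simplifying assumption for the spread MC DS-CDMA signal:

```python
import numpy as np

def estimate_channel(r_pilot, pilot):
    """Pilot-assisted estimate: removing the known pilot modulation in the
    frequency domain leaves the per-subcarrier channel gain."""
    return r_pilot / pilot

def mmse_fde(r, h_est, snr):
    """One-tap MMSE frequency-domain equalizer applied to a received
    frequency-domain block r, given the channel estimate h_est."""
    w = np.conj(h_est) / (np.abs(h_est) ** 2 + 1.0 / snr)
    return w * r
```

    Unlike zero-forcing (w_k = 1/H_k), the MMSE weight backs off on deeply faded subcarriers instead of amplifying noise, which is where the BER gain in frequency-selective channels comes from.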

  11. Analysis of Parent, Teacher, and Consultant Speech Exchanges and Educational Outcomes of Students With Autism During COMPASS Consultation.

    PubMed

    Ruble, Lisa; Birdwhistell, Jessie; Toland, Michael D; McGrew, John H

    2011-01-01

    The significant increase in the numbers of students with autism, combined with the need for better trained teachers (National Research Council, 2001), calls for research on the effectiveness of alternative methods, such as consultation, that have the potential to improve service delivery. Data from 2 randomized controlled single-blind trials indicate that an autism-specific consultation planning framework known as the collaborative model for promoting competence and success (COMPASS) is effective in increasing child Individual Education Program (IEP) outcomes (Ruble, Dalrymple, & McGrew, 2010; Ruble, McGrew, & Toland, 2011). In this study, we describe the verbal interactions, defined as speech acts and speech act exchanges, that take place during COMPASS consultation and examine the associations between speech exchanges and child outcomes. We applied the Psychosocial Processes Coding Scheme (Leaper, 1991) to code speech acts. Speech act exchanges were overwhelmingly affiliative and failed to show statistically significant relationships with child IEP outcomes and teacher adherence, but did correlate positively with IEP quality.

  12. Analysis of Parent, Teacher, and Consultant Speech Exchanges and Educational Outcomes of Students With Autism During COMPASS Consultation

    PubMed Central

    RUBLE, LISA; BIRDWHISTELL, JESSIE; TOLAND, MICHAEL D.; MCGREW, JOHN H.

    2011-01-01

    The significant increase in the numbers of students with autism, combined with the need for better trained teachers (National Research Council, 2001), calls for research on the effectiveness of alternative methods, such as consultation, that have the potential to improve service delivery. Data from 2 randomized controlled single-blind trials indicate that an autism-specific consultation planning framework known as the collaborative model for promoting competence and success (COMPASS) is effective in increasing child Individual Education Program (IEP) outcomes (Ruble, Dalrymple, & McGrew, 2010; Ruble, McGrew, & Toland, 2011). In this study, we describe the verbal interactions, defined as speech acts and speech act exchanges, that take place during COMPASS consultation and examine the associations between speech exchanges and child outcomes. We applied the Psychosocial Processes Coding Scheme (Leaper, 1991) to code speech acts. Speech act exchanges were overwhelmingly affiliative and failed to show statistically significant relationships with child IEP outcomes and teacher adherence, but did correlate positively with IEP quality. PMID:22639523

  13. Report of the AAPM Task Group No. 105: Issues associated with clinical implementation of Monte Carlo-based photon and electron external beam treatment planning.

    PubMed

    Chetty, Indrin J; Curran, Bruce; Cygler, Joanna E; DeMarco, John J; Ezzell, Gary; Faddegon, Bruce A; Kawrakow, Iwan; Keall, Paul J; Liu, Helen; Ma, C M Charlie; Rogers, D W O; Seuntjens, Jan; Sheikh-Bagheri, Daryoush; Siebers, Jeffrey V

    2007-12-01

    The Monte Carlo (MC) method has been shown through many research studies to calculate accurate dose distributions for clinical radiotherapy, particularly in heterogeneous patient tissues where the effects of electron transport cannot be accurately handled with conventional, deterministic dose algorithms. Despite its proven accuracy and the potential for improved dose distributions to influence treatment outcomes, the long calculation times previously associated with MC simulation rendered this method impractical for routine clinical treatment planning. However, the development of faster codes optimized for radiotherapy calculations and improvements in computer processor technology have substantially reduced calculation times to, in some instances, within minutes on a single processor. These advances have motivated several major treatment planning system vendors to embark upon the path of MC techniques. Several commercial vendors have already released or are currently in the process of releasing MC algorithms for photon and/or electron beam treatment planning. Consequently, the accessibility and use of MC treatment planning algorithms may well become widespread in the radiotherapy community. With MC simulation, dose is computed stochastically using first principles; this method is therefore quite different from conventional dose algorithms. Issues such as statistical uncertainties, the use of variance reduction techniques, the ability to account for geometric details in the accelerator treatment head simulation, and other features, are all unique components of a MC treatment planning algorithm. Successful implementation by the clinical physicist of such a system will require an understanding of the basic principles of MC techniques. 
The purpose of this report, while providing education and review on the use of MC simulation in radiotherapy planning, is to set out, for both users and developers, the salient issues associated with clinical implementation and experimental verification of MC dose algorithms. As the MC method is an emerging technology, this report is not meant to be prescriptive. Rather, it is intended as a preliminary report to review the tenets of the MC method and to provide the framework upon which to build a comprehensive program for commissioning and routine quality assurance of MC-based treatment planning systems.
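    One of the MC-specific commissioning issues the report raises, statistical uncertainty, can be illustrated with the simple batch estimator (modern codes favour history-by-history estimators; the data shapes here are illustrative):

```python
import numpy as np

def batch_uncertainty(dose_batches):
    """Batch method: run several statistically independent batches, report the
    per-voxel mean dose and the standard error of that mean across batches."""
    d = np.asarray(dose_batches)               # shape (n_batches, n_voxels)
    n_batches = d.shape[0]
    mean_dose = d.mean(axis=0)
    std_err = d.std(axis=0, ddof=1) / np.sqrt(n_batches)
    return mean_dose, std_err
```

    Since the standard error falls as 1/sqrt(N) in the number of histories, halving the statistical uncertainty costs four times the histories, which is why variance reduction matters clinically.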

  14. Study on efficiency of time computation in x-ray imaging simulation base on Monte Carlo algorithm using graphics processing unit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Setiani, Tia Dwi, E-mail: tiadwisetiani@gmail.com; Suprijadi; Nuclear Physics and Biophysics Research Division, Faculty of Mathematics and Natural Sciences, Institut Teknologi Bandung, Jalan Ganesha 10 Bandung, 40132

    Monte Carlo (MC) is one of the powerful techniques for simulation in x-ray imaging. The MC method can simulate radiation transport within matter with high accuracy and provides a natural way to simulate radiation transport in complex systems. One of the codes based on the MC algorithm that is widely used for radiographic image simulation is MC-GPU, a code developed by Andreu Badal. This study was aimed at investigating the computation time of x-ray imaging simulation on a GPU (Graphics Processing Unit) compared to a standard CPU (Central Processing Unit). Furthermore, the effect of physical parameters on the quality of radiographic images and the comparison of image quality resulting from simulation on the GPU and CPU are evaluated in this paper. The simulations were run on a CPU in serial condition, and on two GPUs with 384 cores and 2304 cores. In the GPU simulations, each core calculates one photon, so a large number of photons are calculated simultaneously. Results show that the simulation times on the GPU were significantly accelerated compared to the CPU: the simulations on the 2304-core GPU were performed about 64 to 114 times faster than on the CPU, while the simulations on the 384-core GPU were performed about 20 to 31 times faster than on a single core of the CPU. Another result shows that optimum image quality was obtained with 10^8 or more histories and energies from 60 keV to 90 keV. Analyzed by a statistical approach, the quality of the GPU and CPU images is relatively the same.

  15. Problematic Behavior: What Do CACREP Accredited Program Policies and Procedures Reflect

    ERIC Educational Resources Information Center

    Brown, Maranda

    2011-01-01

    Counselor Education programs are ethically obligated by accreditation standards and professional codes of ethics to identify counselors-in-training whose academic, clinical, and personal performance indicate problematic behavior that would potentially prevent them from entering the profession (McAdams, Foster, & Ward, 2007). Despite these…

  16. 76 FR 38604 - Southern Montana Resource Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-01

    ... Resource Advisory Committee will meet in Big Timber, Montana. The committee is meeting as authorized under... be held at the Carnegie Public Library, 34 McLeod Street, Big Timber, MT. Written comments should be...-16557 Filed 6-30-11; 8:45 am] BILLING CODE 3410-11-P ...

  17. A Content Analysis of Problematic Behavior in Counselor Education Programs

    ERIC Educational Resources Information Center

    Brown, Maranda

    2013-01-01

    Counselor education programs are obligated by accreditation standards and professional codes of ethics to identify counselors-in-training whose academic, clinical, and personal performance indicate problematic behavior that would potentially prevent them from entering the profession (McAdams, Foster, & Ward, 2007; Rust, Raskin, & Hill,…

  18. Calculated X-ray Intensities Using Monte Carlo Algorithms: A Comparison to Experimental EPMA Data

    NASA Technical Reports Server (NTRS)

    Carpenter, P. K.

    2005-01-01

    Monte Carlo (MC) modeling has been used extensively to simulate electron scattering and x-ray emission from complex geometries. Here we present comparisons between MC results, experimental electron-probe microanalysis (EPMA) measurements, and phi(rhoz) correction algorithms. Experimental EPMA measurements made on NIST SRM 481 (AgAu) and 482 (CuAu) alloys, at a range of accelerating potentials and instrument take-off angles, represent a formal microanalysis data set that has been widely used to develop phi(rhoz) correction algorithms. X-ray intensity data produced by MC simulations represent an independent test of both experimental and phi(rhoz) correction algorithms. The alpha-factor method has previously been used to evaluate systematic errors in the analysis of semiconductor and silicate minerals, and is used here to compare the accuracy of experimental and MC-calculated x-ray data. X-ray intensities calculated by MC are used to generate alpha-factors using the certified compositions of the CuAu binary relative to pure Cu and Au standards. MC simulations are obtained using the NIST, WinCasino, and WinXray algorithms; derived x-ray intensities have a built-in atomic number correction, and are further corrected for absorption and characteristic fluorescence using the PAP phi(rhoz) correction algorithm. The Penelope code additionally simulates both characteristic and continuum x-ray fluorescence and thus requires no further correction for use in calculating alpha-factors.
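    One common hyperbolic form of the binary alpha-factor relation (Ziebold-Ogilvie), (1 - k)/k = alpha * (1 - C)/C with k the intensity ratio to the pure-element standard and C the mass fraction, is assumed in this sketch; the numbers are illustrative:

```python
def alpha_from_k(k, c):
    """Alpha-factor from a measured k-ratio and a known mass fraction c,
    using the hyperbolic relation (1 - k)/k = alpha * (1 - c)/c."""
    return ((1.0 - k) / k) * (c / (1.0 - c))

def concentration_from_k(k, alpha):
    """Invert the same relation: composition predicted from a k-ratio."""
    return alpha * k / (1.0 - k + alpha * k)
```

    Comparing alpha-factors derived from measured intensities with those derived from MC-calculated intensities at the certified compositions is exactly the kind of systematic-error check the abstract describes.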

  19. GCR-induced Photon Luminescence of the Moon: The Moon as a CR Detector

    NASA Technical Reports Server (NTRS)

    Wilson, Thomas L.; Lee, Kerry; Andersen, Vic

    2007-01-01

    We report on the results of a preliminary study of the GCR-induced photon luminescence of the Moon using the Monte Carlo program FLUKA. The model of the lunar surface is taken to be the chemical composition of soils found at various landing sites during the Apollo and Luna programs, averaged over all such sites to define a generic regolith for the present analysis. This then becomes the target that is bombarded by Galactic Cosmic Rays (GCRs) in FLUKA to determine the photon fluence when there is no sunshine or Earthshine. From the photon fluence we derive the energy spectrum which can be utilized to design an orbiting optical instrument for measuring the GCR-induced luminescence. This is to be distinguished from the gamma-ray spectrum produced by the radioactive decay of its radiogenic constituents lying in the surface and interior. Also, we investigate transient optical flashes from high-energy CRs impacting the lunar surface (boulders and regolith). The goal is to determine to what extent the Moon could be used as a rudimentary CR detector. Meteor impacts on the Moon have been observed for centuries to generate such flashes, so why not CRs?

  20. Modelling PET radionuclide production in tissue and external targets using Geant4

    NASA Astrophysics Data System (ADS)

    Amin, T.; Infantino, A.; Lindsay, C.; Barlow, R.; Hoehr, C.

    2017-07-01

    The Proton Therapy Facility at TRIUMF provides 74 MeV protons extracted from a 500 MeV H- cyclotron for ocular melanoma treatments. During treatment, positron-emitting radionuclides such as 11C, 15O and 13N are produced in patient tissue. Using PET scanners, the isotopic activity distribution can be measured for in-vivo range verification. A second cyclotron, the TR13, provides 13 MeV protons onto liquid targets for the production of PET radionuclides such as 18F, 13N or 68Ga for medical applications. The aim of this work was to validate Geant4 against FLUKA and experimental measurements for the production of the above-mentioned isotopes using the two cyclotrons. The results show variable degrees of agreement. For proton therapy, the proton-range agreement was within 2 mm for 11C activity, whereas 13N disagreed. For liquid targets at the TR13, the average absolute deviation ratio between FLUKA and experiment was 1.9±2.7, whereas the average absolute deviation ratio between Geant4 and experiment was 0.6±0.4. This is attributed to the uncertainties present in experimentally determined reaction cross sections.
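
    The "average absolute deviation ratio" quoted above can be sketched as follows; the definition (mean of |simulated − measured| / measured over the targets) and the yield values are illustrative assumptions, not the paper's data:

```python
# Sketch of the figure of merit quoted above: the absolute deviation
# ratio |simulated - measured| / measured, averaged over several targets.
# The yield numbers are invented; the paper's exact definition may differ.
import statistics

def deviation_ratios(simulated, measured):
    """Per-target absolute deviation ratios."""
    return [abs(s - m) / m for s, m in zip(simulated, measured)]

sim = [4.2, 0.9, 13.0]   # hypothetical simulated saturation yields
exp = [3.5, 1.1, 10.0]   # hypothetical measured saturation yields

r = deviation_ratios(sim, exp)
print(round(statistics.mean(r), 3), round(statistics.stdev(r), 3))
```

    A mean near 0 with a small spread, as reported for Geant4, indicates closer overall agreement than a larger mean with a wide spread.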

  1. Monte Carlo simulation of secondary neutron dose for scanning proton therapy using FLUKA

    PubMed Central

    Lee, Chaeyeong; Lee, Sangmin; Lee, Seung-Jae; Song, Hankyeol; Kim, Dae-Hyun; Cho, Sungkoo; Jo, Kwanghyun; Han, Youngyih; Chung, Yong Hyun

    2017-01-01

    Proton therapy is a rapidly progressing field for cancer treatment. Globally, many proton therapy facilities are being commissioned or are under construction. Secondary neutrons are an important issue during the commissioning process of a proton therapy facility. The purpose of this study is to model and validate the scanning nozzles of the proton therapy system at Samsung Medical Center (SMC) by Monte Carlo simulation for beam commissioning. After the commissioning, the secondary neutron ambient dose from a proton scanning nozzle (Gantry 1) was simulated and measured. The simulation was performed to evaluate beam properties such as the percent depth dose curve, Bragg peak, and distal fall-off, so that they could be verified against measured data. Using the validated beam nozzle, the secondary neutron ambient dose was simulated and then compared with the ambient dose measured from Gantry 1. We calculated the secondary neutron dose at several different points. We demonstrated the validity of modeling a proton scanning nozzle system to evaluate various parameters using FLUKA. The measured secondary neutron ambient dose showed a tendency similar to the simulation result. This work will add to the knowledge necessary for the development of radiation safety technology in medical particle accelerators. PMID:29045491

  2. WIND Flow Solver Released

    NASA Technical Reports Server (NTRS)

    Towne, Charles E.

    1999-01-01

    The WIND code is a general-purpose, structured, multizone, compressible flow solver that can be used to analyze steady or unsteady flow for a wide range of geometric configurations and over a wide range of flow conditions. WIND is the latest product of the NPARC Alliance, a formal partnership between the NASA Lewis Research Center and the Air Force Arnold Engineering Development Center (AEDC). WIND Version 1.0 was released in February 1998, and Version 2.0 will be released in February 1999. The WIND code represents a merger of the capabilities of three existing computational fluid dynamics codes--NPARC (the original NPARC Alliance flow solver), NXAIR (an Air Force code used primarily for unsteady store separation problems), and NASTD (the primary flow solver at McDonnell Douglas, now part of Boeing).

  3. Community and Healthcare Providers' Perspectives on Male Circumcision: A Multi-Centric Qualitative Study in India

    PubMed Central

    Sahay, Seema; Nagarajan, Karikalan; Mehendale, Sanjay; Deb, Sibnath; Gupta, Abhilasha; Bharat, Shalini; Bhatt, Shripad; Kumar, Athokpam Bijesh; Kanthe, Vidisha; Sinha, Anju; Chandhiok, Nomita

    2014-01-01

    Background Although male circumcision (MC) is recommended as an HIV prevention option, the religious, cultural and biomedical dimensions of its feasibility, acceptability and practice in India have not been explored to date. This study explores the beliefs, experiences and understanding of the community and healthcare providers (HCPs) about adult MC as an HIV prevention option in India. Methods This qualitative study covered 134 in-depth interviews from the Belgaum, Kolkata, Meerut and Mumbai cities of India. Of these, 62 respondents were members of circumcising (CC)/non-circumcising communities (NCC), including medically and traditionally circumcised men, parents of circumcised children, spouses of circumcised men, and religious clerics. Additionally, 58 registered healthcare providers (RHCPs) such as general and pediatric surgeons, pediatricians, skin and venereal disease specialists, general practitioners, and operation theatre nurses were interviewed. Fourteen traditional circumcisers were also interviewed. The data were coded and analyzed in QSR NUD*IST ver. 6.0. The study did not explore the participants' views about neonatal versus adult circumcision. Results Members of CC/NCC, traditional circumcisers and RHCPs expressed sharp religious sensitivities around the issue of MC. Six themes emerged: Male circumcision as a religious rite; Multiple meanings of MC: MC for ‘religious identity/privilege/sacrifice’ or ‘hygiene’; MC inflicts pain and cost; Medical indications outweigh faith; Hesitation exists in accepting ‘foreign’ evidence supporting MC; and Communication is the key for acceptance of MC. Medical indications could make members of NCC accept MC following appropriate counseling. The majority of the RHCPs demanded local in-country evidence. Conclusion HCPs must educate high-risk groups regarding the preventive and therapeutic role of MC. Communities need to discuss and create new social norms about male circumcision for better societal acceptance, especially among the NCC. Feasibility studies on MC as an individual-specific option for high-risk groups in healthcare settings need to be explored. PMID:24614575

  4. aMCfast: automation of fast NLO computations for PDF fits

    NASA Astrophysics Data System (ADS)

    Bertone, Valerio; Frederix, Rikkert; Frixione, Stefano; Rojo, Juan; Sutton, Mark

    2014-08-01

    We present the interface between MadGraph5_aMC@NLO, a self-contained program that calculates cross sections up to next-to-leading-order accuracy in an automated manner, and APPLgrid, a code that parametrises such cross sections in the form of look-up tables which can be used for the fast computations needed in the context of PDF fits. The main characteristic of this interface, which we dub aMCfast, is that it is fully automated as well, which removes the need to extract manually the process-specific information, as is the case with other matrix-element calculators, and renders it straightforward to include any new process in PDF fits. We demonstrate this by studying several cases which are easily measured at the LHC, have good constraining power on PDFs, and some of which were previously unavailable in the form of a fast interface.

  5. Newtonian CAFE: a new ideal MHD code to study the solar atmosphere

    NASA Astrophysics Data System (ADS)

    González-Avilés, J. J.; Cruz-Osorio, A.; Lora-Clavijo, F. D.; Guzmán, F. S.

    2015-12-01

    We present a new code designed to solve the equations of classical ideal magnetohydrodynamics (MHD) in three dimensions, subject to a constant gravitational field. The purpose of the code centres on the analysis of solar phenomena within the photosphere-corona region. We present 1D and 2D standard tests to demonstrate the quality of the numerical results obtained with our code. As solar tests we present the transverse oscillations of Alfvénic pulses in coronal loops using a 2.5D model, and as 3D tests we present the propagation of impulsively generated MHD-gravity waves and vortices in the solar atmosphere. The code is based on high-resolution shock-capturing methods and uses the Harten-Lax-van Leer-Einfeldt (HLLE) flux formula combined with the minmod, MC, and WENO5 reconstructors. The divergence-free magnetic field constraint is enforced using the Flux Constrained Transport method.
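
    The minmod and MC (monotonized-central) reconstructors named above can be sketched in a few lines; this is a generic textbook form, not code from Newtonian CAFE:

```python
# Generic sketch of the minmod and MC slope limiters named above, as used
# in high-resolution shock-capturing reconstruction; a textbook stand-in.

def minmod(a, b):
    """Return 0 when the one-sided slopes disagree in sign, otherwise
    the slope of smaller magnitude (the most dissipative choice)."""
    if a * b <= 0.0:
        return 0.0
    return a if abs(a) < abs(b) else b

def mc_limiter(a, b):
    """Monotonized-central limiter: the smallest-magnitude member of
    {2a, 2b, (a + b)/2} when a and b share a sign, else 0."""
    if a * b <= 0.0:
        return 0.0
    return min((2.0 * a, 2.0 * b, 0.5 * (a + b)), key=abs)

# Reconstruct cell-face values from cell averages with a limited slope.
u = [0.0, 1.0, 3.0, 3.5]                   # cell averages
i = 2
dl, dr = u[i] - u[i - 1], u[i + 1] - u[i]  # one-sided slopes
s = mc_limiter(dl, dr)                     # limited slope in cell i
left_face, right_face = u[i] - 0.5 * s, u[i] + 0.5 * s
print(s, left_face, right_face)
```

    The limited face values then feed the HLLE flux formula; limiting keeps the reconstruction total-variation-diminishing near shocks while MC is less dissipative than minmod in smooth regions.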

  6. Comparing Turbulence Simulation with Experiment in DIII-D

    NASA Astrophysics Data System (ADS)

    Ross, D. W.; Bravenec, R. V.; Dorland, W.; Beer, M. A.; Hammett, G. W.; McKee, G. R.; Murakami, M.; Jackson, G. L.

    2000-10-01

    Gyrofluid simulations of DIII-D discharges with the GRYFFIN code [D. W. Ross et al., Transport Task Force Workshop, Burlington, VT (2000)] are compared with transport and fluctuation measurements. The evolution of confinement-improved discharges [G. R. McKee et al., Phys. Plasmas 7, 1870 (2000)] is studied at early times following impurity injection, when ExB rotational shear plays a small role. The ion thermal transport predicted by the code is consistent with the experimental values. Experimentally, changes in density profiles resulting from the injection of neon lead to reductions in fluctuation levels and transport following the injection. This triggers subsequent changes in the shearing rate that further reduce the turbulence [M. Murakami et al., European Physical Society, Budapest (2000); M. Murakami et al., this meeting]. Estimated uncertainties in the plasma profiles, however, make it difficult to simulate these reductions with the code. These cases will also be studied with the GS2 gyrokinetic code.

  7. ASR4: A computer code for fitting and processing 4-gage anelastic strain recovery data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Warpinski, N.R.

    A computer code for analyzing four-gage Anelastic Strain Recovery (ASR) data has been modified for use on a personal computer. This code fits the viscoelastic model of Warpinski and Teufel to measured ASR data, calculates the stress orientation directly, and computes stress magnitudes if sufficient input data are available. The code also calculates the stress orientation using strain-rosette equations, and it calculates stress magnitudes using Blanton's approach, assuming sufficient input data are available. The program is written in FORTRAN, compiled with Ryan-McFarland Version 2.4. Graphics use PLOT88 software by Plotworks, Inc., but the graphics software must be obtained by the user because of licensing restrictions. A version without graphics can also be run. This code is available through the National Energy Software Center (NESC), operated by Argonne National Laboratory. 5 refs., 3 figs.

  8. Manufacturing Methods and Technology Engineering for Tape Chip Carrier.

    DTIC Science & Technology

    1981-08-01

    equipment and fixtures were used in the manufacture of the Sync Counter hybrid microcircuit. o Continuous Tape Plater - Model No. STP, Microplate ...Headquarters 001 Commander ATTN: Ray L. Gilbert Naval Ocean Systems Center 608 Independence Ave., SW ATTN: Dr. W. D. McKee, Jr. Washington, DC 20546 Code

  9. Mediating Third-Wave Feminism: Appropriation as Postmodern Media Practice.

    ERIC Educational Resources Information Center

    Shugart, Helene A.; Waggoner, Catherine Egley; Hallstein, D. Lynn O'Brien

    2001-01-01

    Analyzes gendered representations of Alanis Morissette, Kate Moss, and Ally McBeal. Argues that, in each case, the appropriation of third-wave feminist tenets is accomplished via a postmodern aesthetic code of juxtaposition that serves to recontextualize and reinscribe those sensibilities in a way that ultimately functions to reify dominant…

  10. 75 FR 82170 - Hours of Service of Drivers

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-29

    ....J., Hickman, J., Fumero, M.C., Olson, R.L. & Dingus, T.A., ``The Sleep of Commercial Vehicle Drivers... Significant Alternatives to the Proposed Rule which Minimize any Significant Impact on Small Entities C... 49 of the United States Code (49 U.S.C.)). The HOS regulations proposed today concern the ``maximum...

  11. Three-Dimensional Plasma-Based Stall Control Simulations with Coupled First-Principles Approaches

    DTIC Science & Technology

    2006-07-01

    flow code, developed at the Computational Plasma Dynamics Laboratory at Kettering University. The method is based on a versatile finite-element ( FE ...McLaughlin, T., and Baughn, J., 2005. “Acoustic testing of the dielectric barrier dis- charge ( dbd ) plasma actuator”. AIAA Paper 2005-0565, Jan

  12. 78 FR 780 - Investigations Regarding Certifications of Eligibility To Apply for Worker Adjustment Assistance

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-04

    ... McCann's--a Division of Los Angeles, CA........ 11/29/12 11/28/12 Manitowoc Foodservice (Company). 82191... (Workers)... Sea Tac, WA 11/30/12 11/28/12 [FR Doc. 2012-31663 Filed 1-3-13; 8:45 am] BILLING CODE 4510-FN...

  13. 77 FR 75670 - Importer of Controlled Substances, Notice of Application, Hospira

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-21

    ... DEPARTMENT OF JUSTICE Drug Enforcement Administration Importer of Controlled Substances, Notice of Application, Hospira Pursuant to Title 21 Code of Federal Regulations 1301.34(a), this is notice that on September 20, 2012, Hospira, 1776 North Centennial Drive, McPherson, Kansas 67460-1247, made application by...

  14. 76 FR 52697 - Meetings of Humanities Panel

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-23

    ...., Washington, DC 20506. FOR FURTHER INFORMATION CONTACT: Michael P. McDonald, Advisory Committee Management... subsections (c)(4), and (6) of section 552b of Title 5, United States Code. 1. Date: September 7, 2011. Time... Initiatives at Historically Black Colleges and Universities, High Hispanic Enrollment, and Tribal Colleges and...

  15. Validation of GPU-accelerated superposition-convolution dose computations for the Small Animal Radiation Research Platform.

    PubMed

    Cho, Nathan; Tsiamas, Panagiotis; Velarde, Esteban; Tryggestad, Erik; Jacques, Robert; Berbeco, Ross; McNutt, Todd; Kazanzides, Peter; Wong, John

    2018-05-01

    The Small Animal Radiation Research Platform (SARRP) has been developed for conformal microirradiation with on-board cone beam CT (CBCT) guidance. The graphics processing unit (GPU)-accelerated Superposition-Convolution (SC) method for dose computation has been integrated into the treatment planning system (TPS) for SARRP. This paper describes the validation of the SC method at kilovoltage energies by comparison with EBT2 film measurements and Monte Carlo (MC) simulations. MC data were simulated with the EGSnrc code using 3 × 10^8 to 1.5 × 10^9 histories, while 21 photon energy bins were used to model the 220 kVp x-rays in the SC method. Various types of phantoms including plastic water, cork, graphite, and aluminum were used to encompass the range of densities of mouse organs. For the comparison, percentage depth dose (PDD) curves from SC, MC, and film measurements were analyzed. Cross-beam (x, y) dosimetric profiles of SC and film measurements are also presented. Correction factors (CFz) to convert SC dose to MC dose-to-medium are derived from the SC and MC simulations in homogeneous phantoms of aluminum and graphite to improve the estimation. The SC method produces dose values that are within 5% of film measurements and MC simulations in the flat regions of the profile. The dose is less accurate at the edges, due to factors such as geometric uncertainties of film placement and differences in dose calculation grids. The GPU-accelerated Superposition-Convolution dose computation method was successfully validated against EBT2 film measurements and MC calculations. The SC method offers much faster computation than MC and provides calculations of both dose-to-water in medium and dose-to-medium in medium. © 2018 American Association of Physicists in Medicine.
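
    The correction-factor step described above amounts to a simple ratio-and-rescale; a hedged sketch assuming CF(z) is the ratio of MC to SC dose at matched depths, with invented dose values:

```python
# Sketch of depth-dependent correction factors CF(z) = D_MC(z) / D_SC(z),
# derived in a homogeneous phantom and then applied to shift an SC
# depth-dose curve toward MC dose-to-medium. Dose values are invented.

def correction_factors(d_mc, d_sc):
    """CF(z) at each tabulated depth."""
    return [m / s for m, s in zip(d_mc, d_sc)]

def apply_cf(d_sc, cf):
    """Rescale an SC depth-dose curve with previously derived factors."""
    return [s * c for s, c in zip(d_sc, cf)]

d_mc = [1.00, 0.82, 0.67]   # MC dose-to-medium at depths z0, z1, z2
d_sc = [1.00, 0.80, 0.70]   # SC dose at the same depths

cf = correction_factors(d_mc, d_sc)
print([round(c, 3) for c in cf])
```

    In practice the factors are derived once per material (here aluminum and graphite) and reused, so the fast SC calculation keeps its speed advantage over a full MC run.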

  16. Evaluation of the accuracy of mono-energetic electron and beta-emitting isotope dose-point kernels using particle and heavy ion transport code system: PHITS.

    PubMed

    Shiiba, Takuro; Kuga, Naoya; Kuroiwa, Yasuyoshi; Sato, Tatsuhiko

    2017-10-01

    We assessed the accuracy of mono-energetic electron and beta-emitting isotope dose-point kernels (DPKs) calculated using the particle and heavy ion transport code system (PHITS) for patient-specific dosimetry in targeted radionuclide therapy (TRT), and compared our data with published data. All mono-energetic and beta-emitting isotope DPKs calculated using PHITS, in both water and compact bone, were in good agreement with those in the literature calculated using other MC codes. PHITS provided reliable mono-energetic electron and beta-emitting isotope scaled DPKs for patient-specific dosimetry. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Subgroup Benchmark Calculations for the Intra-Pellet Nonuniform Temperature Cases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Kang Seog; Jung, Yeon Sang; Liu, Yuxuan

    A benchmark suite has been developed by Seoul National University (SNU) for intra-pellet nonuniform temperature distribution cases based on practical temperature profiles according to the thermal power levels. Though a new subgroup capability for nonuniform temperature distributions was implemented in MPACT, no validation calculation had been performed for the new capability. This study focuses on benchmarking the new capability through a code-to-code comparison. Two continuous-energy Monte Carlo codes, McCARD and CE-KENO, are engaged in obtaining reference solutions, and the MPACT results are compared to those of the SNU nTRACER code, which uses a similar cross-section library and subgroup method to obtain self-shielded cross sections.

  18. Jet and electromagnetic tomography (JET) of extreme phases of matter in heavy-ion collisions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heinz, Ulrich

    2015-08-31

    The Ohio State University (OSU) group contributed three major products to the deliverables of the JET Collaboration: 1. The code package iEBE-VISHNU for modeling the dynamical evolution of the soft medium created in relativistic heavy-ion collisions, from its creation all the way to final freeze-out, using a hybrid approach that interfaces a free-streaming partonic pre-equilibrium stage with a (2+1)-dimensional viscous relativistic fluid dynamical stage for the quark-gluon plasma (QGP) phase and the microscopic hadron cascade UrQMD for the hadronic rescattering and freeze-out stage. Except for UrQMD, all dynamical evolution components and interfaces were developed at OSU and tested and implemented in collaboration with the Duke University group. 2. An electromagnetic radiation module for the calculation of thermal photon emission from the QGP and hadron resonance gas stages of a heavy-ion collision, with emission rates that have been corrected for viscous effects in the expanding medium consistent with the bulk evolution. The electromagnetic radiation module was developed under OSU leadership in collaboration with the McGill group and has been integrated into the iEBE-VISHNU code package. 3. An interface between the Monte Carlo jet shower evolution and hadronization codes developed by the Wayne State University (WSU), McGill and Texas A&M groups and the iEBE-VISHNU bulk evolution code, for performing jet quenching and jet shape modification studies in a realistically modeled evolving medium tuned to measured soft hadron data. Building on work performed at OSU on the theoretical framework used to describe the interaction of jets with the medium, initial work on the jet shower Monte Carlo was started at OSU and moved to WSU when OSU Visiting Assistant Professor Abhijit Majumder accepted a tenure-track faculty position at WSU in September 2011. The jet-hydro interface was developed at OSU and WSU and tested and implemented in collaboration with the McGill, Texas A&M, and LBNL groups.

  19. A backward Monte Carlo method for efficient computation of runaway probabilities in runaway electron simulation

    NASA Astrophysics Data System (ADS)

    Zhang, Guannan; Del-Castillo-Negrete, Diego

    2017-10-01

    Kinetic descriptions of runaway electrons (RE) are usually based on the bounce-averaged Fokker-Planck model that determines the probability density functions (PDFs) of RE. Despite the simplifications involved, the Fokker-Planck equation can rarely be solved analytically, and direct numerical approaches (e.g., continuum and particle-based Monte Carlo (MC)) can be time consuming, especially in the computation of asymptotic-type observables including the runaway probability, the slowing-down and runaway mean times, and the energy limit probability. Here we present a novel backward MC approach to these problems based on backward stochastic differential equations (BSDEs). The BSDE model can simultaneously describe the PDF of RE and the runaway probabilities by means of the well-known Feynman-Kac theory. The key ingredient of the backward MC algorithm is to place all the particles in a runaway state and simulate them backward from the terminal time to the initial time. As such, our approach can provide much faster convergence than brute-force MC methods, which can significantly reduce the number of particles required to achieve a prescribed accuracy. Moreover, our algorithm can be parallelized as easily as a direct MC code, which paves the way for conducting large-scale RE simulations. This work is supported by DOE FES and ASCR under Contract Numbers ERKJ320 and ERAT377.
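
    The observable targeted by the backward scheme can be made concrete with a toy forward estimator; by Feynman-Kac, the runaway probability is u(t, x) = E[1_{X_T >= x_crit} | X_t = x] for a stochastic model dX = b(X) dt + sigma dW. The drift, noise level, and threshold below are invented, not the bounce-averaged Fokker-Planck coefficients of the paper:

```python
# Brute-force *forward* Euler-Maruyama estimator of the runaway
# probability u(0, x0) = P(X_T >= x_crit | X_0 = x0) -- the quantity the
# backward (BSDE) scheme computes with far fewer particles. The toy
# bistable drift and all numbers are invented for illustration.
import math
import random

def runaway_probability(x0, x_crit, T=1.0, n_steps=200, n_paths=5000, seed=1):
    """Monte Carlo estimate of P(X_T >= x_crit | X_0 = x0)."""
    rng = random.Random(seed)
    dt = T / n_steps
    sqdt = math.sqrt(dt)
    hits = 0
    for _ in range(n_paths):
        x = x0
        for _ in range(n_steps):
            drift = 0.5 * x * (x - 1.0)  # toy drift; x = 1 separates
            x += drift * dt + 0.3 * sqdt * rng.gauss(0.0, 1.0)
        if x >= x_crit:
            hits += 1
    return hits / n_paths

# Particles starting above the separatrix are far more likely to run away.
print(runaway_probability(1.2, 1.8), runaway_probability(0.5, 1.8))
```

    A backward scheme would instead start every sample in the runaway state and integrate from the terminal time back to the initial time, as the abstract describes, reusing each trajectory for the whole family of initial conditions.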

  20. Chronic pain patients' perspectives of medical cannabis.

    PubMed

    Piper, Brian J; Beals, Monica L; Abess, Alexander T; Nichols, Stephanie D; Martin, Maurice W; Cobb, Catherine M; DeKeuster, Rebecca M

    2017-07-01

    Medical cannabis (MC) is used for a variety of conditions including chronic pain. The goal of this report was to provide an in-depth qualitative exploration of patient perspectives on the strengths and limitations of MC. Members of MC dispensaries (N = 984) in New England including two-thirds with a history of chronic pain completed an online survey. In response to "How effective is medical cannabis in treating your symptoms or conditions?," with options of 0% "no relief" to 100% "complete relief," the average was 74.6% ± 0.6. The average amount spent on MC each year was $3064.47 ± 117.60, median = $2320.23, range = $52.14 to $52,140.00. Open-ended responses were coded into themes and subthemes. Analysis of answers to "What is it that you like most about MC?" (N = 2592 responses) identified 10 themes, including health benefits (36.0% of responses, eg, "Changes perception and experience of my chronic pain."), the product (14.2%, eg, "Knowing exactly what strain you are getting"), nonhealth benefits (14.1%), general considerations (10.3%), and medications (7.1%). Responses (N = 1678) to "What is it that you like least about MC?" identified 12 themes, including money (28.4%, eg, "The cost is expensive for someone on a fixed income"), effects (21.7%, eg, "The effects on my lungs"), the view of others (11.4%), access (8.2%), and method of administration (7.1%). These findings provide a patient-centered view on the advantages (eg, efficacy in pain treatment, reduced use of other medications) and disadvantages (eg, economic and stigma) of MC.

  1. SU-F-T-184: 3D Range-Modulator for Scanned Particle Therapy: Development, Monte Carlo Simulations and Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simeonov, Y; Penchev, P; Ringbaek, T Printz

    2016-06-15

    Purpose: Active raster scanning in particle therapy results in highly conformal dose distributions. Treatment time, however, is relatively long due to the large number of different iso-energy layers used. By using only one energy and the so-called 3D range-modulator, irradiation times of only a few seconds can be achieved, thus making delivery of a homogeneous dose to moving targets (e.g. lung cancer) more reliable. Methods: A 3D range-modulator consisting of many pins with a base area of 2.25 mm2 and different lengths was developed and manufactured with a rapid prototyping technique. The form of the 3D range-modulator was optimised for a spherical target volume with 5 cm diameter placed at 25 cm depth in a water phantom. Monte Carlo simulations using the FLUKA package were carried out to evaluate the modulating effect of the 3D range-modulator and simulate the resulting dose distribution. The fine and complicated contour form of the 3D range-modulator was taken into account by a specially programmed user routine. Additionally, FLUKA was extended with the capability of intensity-modulated scanning. To verify the simulation results, dose measurements were carried out at the Heidelberg Ion Therapy Center (HIT) with a 400.41 MeV/u 12C beam. Results: The high-resolution measurements show that the 3D range-modulator is capable of producing homogeneous 3D conformal dose distributions while significantly reducing irradiation time. Measured dose is in very good agreement with the previously conducted FLUKA simulations, where slight differences were traced back to minor manufacturing deviations from the perfect optimised form. Conclusion: Combined with the advantage of very short treatment times, the 3D range-modulator could be an alternative for treating small to medium-sized tumours (e.g. lung metastases) with the same conformity as full raster-scanning treatment. Further simulations and measurements of more complex cases will be conducted to investigate the full potential of the 3D range-modulator.

  2. Development of a Space Radiation Monte Carlo Computer Simulation

    NASA Technical Reports Server (NTRS)

    Pinsky, Lawrence S.

    1997-01-01

    The ultimate purpose of this effort is to undertake the development of a computer simulation of the radiation environment encountered in spacecraft which is based upon the Monte Carlo technique. The current plan is to adapt and modify a Monte Carlo calculation code known as FLUKA, which is presently used in high energy and heavy ion physics, to simulate the radiation environment present in spacecraft during missions. The initial effort would be directed towards modeling the MIR and Space Shuttle environments, but the long range goal is to develop a program for the accurate prediction of the radiation environment likely to be encountered on future planned endeavors such as the Space Station, a Lunar Return Mission, or a Mars Mission. The longer the mission, especially those which will not have the shielding protection of the earth's magnetic field, the more critical the radiation threat will be. The ultimate goal of this research is to produce a code that will be useful to mission planners and engineers who need to have detailed projections of radiation exposures at specified locations within the spacecraft and for either specific times during the mission or integrated over the entire mission. In concert with the development of the simulation, it is desired to integrate it with a state-of-the-art interactive 3-D graphics-capable analysis package known as ROOT, to allow easy investigation and visualization of the results. The efforts reported on here include the initial development of the program and the demonstration of the efficacy of the technique through a model simulation of the MIR environment. This information was used to write a proposal to obtain follow-on permanent funding for this project.

  3. Measurement And Calculation of High-Energy Neutron Spectra Behind Shielding at the CERF 120-GeV/C Hadron Beam Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakao, N.; /SLAC; Taniguchi, S.

    Neutron energy spectra were measured behind the lateral shield of the CERF (CERN-EU High Energy Reference Field) facility at CERN with a 120 GeV/c positive hadron beam (a mixture of mainly protons and pions) on a cylindrical copper target (7-cm diameter by 50-cm long). An NE213 organic liquid scintillator (12.7-cm diameter by 12.7-cm long) was located at various longitudinal positions behind shields of 80- and 160-cm thick concrete and 40-cm thick iron. The measurement locations cover an angular range with respect to the beam axis between 13° and 133°. Neutron energy spectra in the energy range between 32 MeV and 380 MeV were obtained by unfolding the measured pulse-height spectra with the detector response functions, which have been verified in the neutron energy range up to 380 MeV in separate experiments. Since the source term and experimental geometry in this experiment are well characterized and simple, and the results are given in the form of energy spectra, these experimental results are very useful as benchmark data to check the accuracy of simulation codes and nuclear data. Monte Carlo simulations of the experimental setup were performed with the FLUKA, MARS and PHITS codes. Simulated spectra for the 80-cm thick concrete often agree within the experimental uncertainties. On the other hand, for the 160-cm thick concrete and the iron shield, differences are generally larger than the experimental uncertainties, yet within a factor of 2. Based on source-term simulations, the observed discrepancies among simulations of spectra outside the shield can be partially explained by differences in high-energy hadron production in the copper target.

  4. Simulations of beam-matter interaction experiments at the CERN HiRadMat facility and prospects of high-energy-density physics research.

    PubMed

    Tahir, N A; Burkart, F; Shutov, A; Schmidt, R; Wollmann, D; Piriz, A R

    2014-12-01

    In a recent publication [Schmidt et al., Phys. Plasmas 21, 080701 (2014)], we reported results on beam-target interaction experiments that have been carried out at the CERN HiRadMat (High Radiation to Materials) facility using extended solid copper cylindrical targets that were irradiated with a 440-GeV proton beam delivered by the Super Proton Synchrotron (SPS). On the one hand, these experiments confirmed the existence of hydrodynamic tunneling of the protons, which leads to a substantial increase in the range of the protons and the corresponding hadron shower in the target, a phenomenon predicted by our previous theoretical investigations [Tahir et al., Phys. Rev. ST Accel. Beams 25, 051003 (2012)]. On the other hand, these experiments demonstrated that the beam-heated part of the target is severely damaged and is converted into different phases of high energy density (HED) matter, as suggested by our previous theoretical studies [Tahir et al., Phys. Rev. E 79, 046410 (2009)]. The latter confirms that the HiRadMat facility can be used to study HED physics. In the present paper, we give details of the numerical simulations carried out to understand the experimental measurements. These include the evolution of the physical parameters, for example, density, temperature, pressure, and internal energy in the target, during and after the irradiation. This information is important in order to determine the region of the HED phase diagram that can be accessed in such experiments. These simulations have been done iteratively, using the energy deposition code FLUKA and the two-dimensional hydrodynamic code BIG2.

  5. Radiological Environmental Protection for LCLS-II High Power Operation

    NASA Astrophysics Data System (ADS)

    Liu, James; Blaha, Jan; Cimeno, Maranda; Mao, Stan; Nicolas, Ludovic; Rokni, Sayed; Santana, Mario; Tran, Henry

    2017-09-01

    The LCLS-II superconducting electron accelerator at SLAC plans to operate at up to 4 GeV and 240 kW average power, which would create higher radiological impacts particularly near the beam loss points such as beam dumps and halo collimators. The main hazards to the public and environment include direct or skyshine radiation, effluent of radioactive air such as 13N, 15O and 41Ar, and activation of groundwater creating tritium. These hazards were evaluated using analytic methods and FLUKA Monte Carlo code. The controls (mainly extensive bulk shielding and local shielding around high loss points) and monitoring (neutron/photon detectors with detection capabilities below natural background at site boundary, site-wide radioactive air monitors, and groundwater wells) were designed to meet the U.S. DOE and EPA, as well as SLAC requirements. The radiological design and controls for the LCW systems [including concrete housing shielding for 15O and 11C circulating in LCW, 7Be and erosion/corrosion products (22Na, 54Mn, 60Co, 65Zn, etc.) captured in resin and filters, leak detection and containment of LCW with 3H and its waste water discharge; explosion from H2 build-up in surge tank and release of radionuclides] associated with the high power beam dumps are also presented.

  6. High and low energy gamma beam dump designs for the gamma beam delivery system at ELI-NP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yasin, Zafar, E-mail: zafar.yasin@eli-np.ro; Matei, Catalin; Ur, Calin A.

    The Extreme Light Infrastructure - Nuclear Physics (ELI-NP) is under construction in Magurele, Bucharest, Romania. The facility will use two 10 PW lasers and a high-intensity, narrow-bandwidth gamma beam for stand-alone and combined laser-gamma experiments. The accurate estimation of particle doses and their restriction within the limits for both personnel and the general public is very important in the design phase of any nuclear facility. In the present work, Monte Carlo simulations are performed using FLUKA and MCNPX to design the 19.4 and 4 MeV gamma beam dumps along with the shielding of the experimental areas. Dose rate contour plots from both FLUKA and MCNPX, along with numerical values of doses in experimental area E8 of the facility, are presented. The calculated doses are within the permissible limits. Furthermore, a reasonable agreement between the two codes enhances our confidence in using one or both of them for future calculations in beam dump design, radiation shielding, radioactive inventory, and other calculations related to radiation protection. Residual dose rates and residual activity calculations are also performed for the high-energy beam dump, and their effect is negligible in comparison to contributions from prompt radiation.

  7. Neutron yield and induced radioactivity: a study of 235-MeV proton and 3-GeV electron accelerators.

    PubMed

    Hsu, Yung-Cheng; Lai, Bo-Lun; Sheu, Rong-Jiun

    2016-01-01

    This study evaluated the magnitude of potential neutron yield and induced radioactivity of two new accelerators in Taiwan: a 235-MeV proton cyclotron for radiation therapy and a 3-GeV electron synchrotron serving as the injector for the Taiwan Photon Source. From a nuclear interaction point of view, neutron production from targets bombarded with high-energy particles is intrinsically related to the resulting target activation. Two multi-particle interaction and transport codes, FLUKA and MCNPX, were used in this study. To ensure prediction quality, much effort was devoted to the associated benchmark calculations. Comparisons of the accelerators' results for three target materials (copper, stainless steel and tissue) are presented. Although the proton-induced neutron yields were higher than those induced by electrons, the maximal neutron production rates of both accelerators were comparable according to their respective beam outputs during typical operation. Activation products in the targets of the two accelerators were unexpectedly similar because the primary reaction channels for proton- and electron-induced activation are (p,pn) and (γ,n), respectively. The resulting residual activities and remnant dose rates as a function of time were examined and discussed. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  8. Release of Iron from Hemoglobin

    DTIC Science & Technology

    1993-02-17


  9. 77 FR 47110 - Manufacturer of Controlled Substances; Notice of Application; R & D Systems, Inc.

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-07

    ... DEPARTMENT OF JUSTICE Drug Enforcement Administration Manufacturer of Controlled Substances; Notice of Application; R & D Systems, Inc. Pursuant to Title 21 Code of Federal Regulations 1301.34(a), this is notice that on May 4, 2012, R & D Systems, Inc., 614 McKinley Place NE., Minneapolis, Minnesota...

  10. Computerized Support of the Pretrial Confinement Decision-Making Process in the Marine Corps.

    DTIC Science & Technology

    1988-03-01

    Books Inc., Blue Ridge Summit, Pennsylvania, 1986. 7. Pressman, R.S., Software Engineering: A Practitioner's Approach, Second Edition, McGraw-Hill Book... Lieutenant Commander Barry Frew, Code 54Fw, Naval Postgraduate School, Monterey, California 93943-500

  11. The Overlap Model: A Model of Letter Position Coding

    ERIC Educational Resources Information Center

    Gomez, Pablo; Ratcliff, Roger; Perea, Manuel

    2008-01-01

    Recent research has shown that letter identity and letter position are not integral perceptual dimensions (e.g., jugde primes judge in word-recognition experiments). Most comprehensive computational models of visual word recognition (e.g., the interactive activation model, J. L. McClelland & D. E. Rumelhart, 1981, and its successors) assume that…

  12. 77 FR 2225 - Allocation and Apportionment of Interest Expense

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-17

    ... for Services and Enforcement. Approved: December 6, 2011. Emily S. McMahon, Acting Assistant Secretary... 44 U.S.C. 1510. #0; #0;The Code of Federal Regulations is sold by the Superintendent of Documents. #0... partner's distributive share of interest expense incurred by the partnership. Section 1.861-9T(e)(1...

  13. 76 FR 14434 - Meetings of Humanities Panel

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-16

    ... FURTHER INFORMATION CONTACT: Michael P. McDonald, Advisory Committee Management Officer, National... subsections (c)(4), and (6) of section 552b of Title 5, United States Code. 1. Date: April 1, 2011. Time: 9 a... January 12, 2011 deadline. 2. Date: April 4, 2011. Time: 9 a.m. to 5 p.m. Location: Room 421. Program...

  14. 75 FR 50007 - Meetings of Humanities Panel

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-16

    ...., Washington, DC 20506. FOR FURTHER INFORMATION CONTACT: Michael P. McDonald, Advisory Committee Management... subsections (c)(4), and (6) of section 552b of Title 5, United States Code. 1. Date: September 1, 2010. Time..., 2010 deadline. 2. Date: September 20, 2010. Time: 9 a.m. to 5 p.m. Room: 315. Program: This meeting...

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barrett, J C; Karmanos Cancer Institute McLaren-Macomb, Clinton Township, MI; Knill, C

    Purpose: To determine small field correction factors for PTW’s microDiamond detector in Elekta’s Gamma Knife Model-C unit. These factors allow the microDiamond to be used in QA measurements of output factors in the Gamma Knife Model-C; additionally, the results also contribute to the discussion on the water equivalence of the relatively-new microDiamond detector and its overall effectiveness in small field applications. Methods: The small field correction factors were calculated as k correction factors according to the Alfonso formalism. An MC model of the Gamma Knife and microDiamond was built with the EGSnrc code system, using BEAMnrc and DOSRZnrc user codes. Validation of the model was accomplished by simulating field output factors and measurement ratios for an available ABS plastic phantom and then comparing simulated results to film measurements, detector measurements, and treatment planning system (TPS) data. Once validated, the final k factors were determined by applying the model to a more waterlike solid water phantom. Results: During validation, all MC methods agreed with experiment within the stated uncertainties: MC determined field output factors agreed within 0.6% of the TPS and 1.4% of film; and MC simulated measurement ratios matched physically measured ratios within 1%. The final k correction factors for the PTW microDiamond in the solid water phantom approached unity to within 0.4%±1.7% for all the helmet sizes except the 4 mm; the 4 mm helmet size over-responded by 3.2%±1.7%, resulting in a k factor of 0.969. Conclusion: Similar to what has been found in the Gamma Knife Perfexion, the PTW microDiamond requires little to no corrections except for the smallest 4 mm field. The over-response can be corrected via the Alfonso formalism using the correction factors determined in this work. Using the MC calculated correction factors, the PTW microDiamond detector is an effective dosimeter in all available helmet sizes.
    The authors would like to thank PTW (Friedberg, Germany) for providing the PTW microDiamond detector for this research.

  16. Multi-scale modeling to relate Be surface temperatures, concentrations and molecular sputtering yields

    NASA Astrophysics Data System (ADS)

    Lasa, Ane; Safi, Elnaz; Nordlund, Kai

    2015-11-01

    Recent experiments and Molecular Dynamics (MD) simulations show erosion rates of Be exposed to deuterium (D) plasma varying with surface temperature and the correlated D concentration. Little is understood about how these three parameters relate for Be surfaces, despite this being essential for reliable prediction of impurity transport and plasma-facing material lifetime in current (JET) and future (ITER) devices. A multi-scale exercise is presented here to relate Be surface temperatures, concentrations and sputtering yields. The kinetic Monte Carlo (KMC) code MMonCa is used to estimate equilibrium D concentrations in Be at different temperatures. Then, mixed Be-D surfaces - that correspond to the KMC profiles - are generated in MD, to calculate Be-D molecular erosion yields due to D irradiation. With this new database implemented in the 3D MC impurity transport code ERO, modeling scenarios studying wall erosion, such as RF-induced enhanced limiter erosion or main wall surface temperature scans run at JET, can be revisited with higher confidence. Work supported by U.S. DOE under Contract DE-AC05-00OR22725.

  17. Development of a NRSE Spectrometer with the Help of McStas - Application to the Design of Present and Future Instruments

    NASA Astrophysics Data System (ADS)

    Kredler, L.; Häußler, W.; Martin, N.; Böni, P.

    The flux is still a major limiting factor in neutron research. For instruments supplied with cold neutrons via neutron guides, both at present steady-state and at new spallation neutron sources, it is therefore important to optimize the instrumental setup and the neutron guidance. Optimization of neutron guide geometry and of the instrument itself can be performed by numerical ray-tracing simulations using existing open-access codes. In this paper, we discuss how such Monte Carlo simulations have been employed in order to plan improvements of the Neutron Resonant Spin Echo spectrometer RESEDA (FRM II, Germany) as well as the neutron guides before and within the instrument. The essential components have been represented with the help of the McStas ray-tracing package. The expected intensity has been tested by means of several virtual detectors implemented in the simulation code. Comparison between simulations and preliminary measurement results shows good agreement and demonstrates the reliability of the numerical approach. These results will be taken into account in the planning of new components installed in the guide system.

  18. A LAMMPS implementation of volume-temperature replica exchange molecular dynamics

    NASA Astrophysics Data System (ADS)

    Liu, Liang-Chun; Kuo, Jer-Lai

    2015-04-01

    A driver module for executing volume-temperature replica exchange molecular dynamics (VTREMD) was developed for the LAMMPS package. As a patch code, the VTREMD module performs classical molecular dynamics (MD) with Monte Carlo (MC) decisions between MD runs. The goal of inserting the MC step was to increase the breadth of sampled configurational space. In this method, states receive better sampling by making temperature or density swaps with their neighboring states. As an accelerated sampling method, VTREMD is particularly useful to explore states at low temperatures, where systems are easily trapped in local potential wells. As functional examples, TIP4P/Ew and TIP4P/2005 water models were analyzed using VTREMD. The phase diagram in this study covered the deeply supercooled regime, and this test served as a suitable demonstration of the usefulness of VTREMD in overcoming the slow-dynamics problem. To facilitate using the current code, attention was also paid to how to optimize the exchange efficiency by using grid allocation. VTREMD was useful for studying systems with rough energy landscapes, such as those with numerous local minima or multiple characteristic time scales.
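
    The temperature-swap MC decision between MD runs is commonly implemented as a Metropolis criterion. As a hedged illustration (a generic parallel-tempering sketch, not the authors' exact VTREMD implementation; function names are hypothetical):

```python
import math
import random

def swap_accept(E_i, E_j, T_i, T_j, kB=1.0):
    """Metropolis acceptance probability for exchanging the configurations
    of two replicas with energies E_i, E_j at temperatures T_i, T_j:
        p = min(1, exp[(1/kB*T_i - 1/kB*T_j) * (E_i - E_j)])"""
    delta = (1.0 / (kB * T_i) - 1.0 / (kB * T_j)) * (E_i - E_j)
    return min(1.0, math.exp(delta))

def try_swap(E_i, E_j, T_i, T_j, rng=random):
    """MC decision: True means the neighboring replicas exchange states."""
    return rng.random() < swap_accept(E_i, E_j, T_i, T_j)
```

    Note that when the colder replica (smaller T_i) holds the higher energy, the exponent is positive and the swap is always accepted, which is what drives configurations down the temperature ladder.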

  19. Effective field theory of cosmic acceleration: Constraining dark energy with CMB data

    NASA Astrophysics Data System (ADS)

    Raveri, Marco; Hu, Bin; Frusciante, Noemi; Silvestri, Alessandra

    2014-08-01

    We introduce EFTCAMB/EFTCosmoMC as publicly available patches to the commonly used camb/CosmoMC codes. We briefly describe the structure of the codes, their applicability and main features. To illustrate the use of these patches, we obtain constraints on parametrized pure effective field theory and designer f(R) models, both on ΛCDM and wCDM background expansion histories, using data from Planck temperature and lensing potential spectra, WMAP low-ℓ polarization spectra (WP), and baryon acoustic oscillations (BAO). Upon inspecting the theoretical stability of the models on the given background, we find nontrivial parameter spaces that we translate into viability priors. We use different combinations of data sets to show their individual effects on cosmological and model parameters. Our data analysis results show that, depending on the adopted data sets, in the wCDM background case these viability priors could dominate the marginalized posterior distributions. Interestingly, with Planck +WP+BAO+lensing data, in f(R) gravity models, we get very strong constraints on the constant dark energy equation of state, w0∈(-1,-0.9997) (95% C.L.).

  20. Automated Concurrent Blackboard System Generation in C++

    NASA Technical Reports Server (NTRS)

    Kaplan, J. A.; McManus, J. W.; Bynum, W. L.

    1999-01-01

    In his 1992 Ph.D. thesis, "Design and Analysis Techniques for Concurrent Blackboard Systems", John McManus defined several performance metrics for concurrent blackboard systems and developed a suite of tools for creating and analyzing such systems. These tools allow a user to analyze a concurrent blackboard system design and predict the performance of the system before any code is written. The design can be modified until simulated performance is satisfactory. Then, the code generator can be invoked to generate automatically all of the code required for the concurrent blackboard system except for the code implementing the functionality of each knowledge source. We have completed the port of the source code generator and a simulator for a concurrent blackboard system. The source code generator generates the necessary C++ source code to implement the concurrent blackboard system using Parallel Virtual Machine (PVM) running on a heterogeneous network of UNIX(trademark) workstations. The concurrent blackboard simulator uses the blackboard specification file to predict the performance of the concurrent blackboard design. The only part of the source code for the concurrent blackboard system that the user must supply is the code implementing the functionality of the knowledge sources.

  1. A comparative analysis of MC4R gene sequence, polymorphism, and chromosomal localization in Chinese raccoon dog and Arctic fox.

    PubMed

    Skorczyk, Anna; Flisikowski, Krzysztof; Switonski, Marek

    2012-05-01

    Numerous mutations of the human melanocortin receptor type 4 (MC4R) gene are responsible for monogenic obesity, and some of them appear to be associated with predisposition or resistance to polygenic obesity. Thus, this gene is considered a functional candidate for fat tissue accumulation and body weight in domestic mammals. The aim of the study was a comparative analysis of chromosome localization, nucleotide sequence, and polymorphism of the MC4R gene in two farmed species of the Canidae family, namely the Chinese raccoon dog (Nyctereutes procyonoides procyonoides) and the arctic fox (Alopex lagopus). The whole coding sequence, including fragments of 3'UTR and 5'UTR, shows 89% similarity between the arctic fox (1276 bp) and Chinese raccoon dog (1213 bp). Altogether, 30 farmed Chinese raccoon dogs and 30 farmed arctic foxes were searched for polymorphisms. In the Chinese raccoon dog, only one silent substitution in the coding sequence was identified; whereas in the arctic fox, four InDels and two single-nucleotide polymorphisms (SNPs) in the 5'UTR and six silent SNPs in the exon were found. The studied gene was mapped by FISH to the Chinese raccoon dog chromosome 9 (NPP9q1.2) and arctic fox chromosome 24 (ALA24q1.2-1.3). The obtained results are discussed in terms of genome evolution of species belonging to the family Canidae and their potential use in animal breeding.

  2. More on the decoder error probability for Reed-Solomon codes

    NASA Technical Reports Server (NTRS)

    Cheung, K.-M.

    1987-01-01

    The decoder error probability for Reed-Solomon codes (more generally, linear maximum distance separable codes) is examined. McEliece and Swanson offered an upper bound on P sub E (u), the decoder error probability given that u symbol errors occur. This upper bound is slightly greater than Q, the probability that a completely random error pattern will cause decoder error. By using a combinatoric technique, the principle of inclusion and exclusion, an exact formula for P sub E (u) is derived. The P sub E (u) values for the (255, 223) Reed-Solomon code used by NASA, and for the (31, 15) Reed-Solomon code (JTIDS code), are calculated using the exact formula, and the P sub E (u)'s are observed to approach the Q's of the codes rapidly as u gets larger. An upper bound for the expression is derived, and is shown to decrease nearly exponentially as u increases. This proves analytically that P sub E (u) indeed approaches Q as u becomes large, and some laws of large numbers come into play.
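
    The limiting value Q referred to above can be computed directly: for an (n, k) MDS code over GF(q) with error-correcting radius t = (n-k)/2, the probability that a uniformly random word falls within distance t of some incorrect codeword is Q = q^-(n-k) * sum_{s=0}^{t} C(n,s)(q-1)^s. A short sketch (the function name is ours, not from the paper):

```python
from fractions import Fraction
from math import comb

def miscorrection_Q(n, k, q):
    """Q for an (n, k) MDS code over GF(q): the probability that a
    completely random error pattern causes decoder error (miscorrection).
    V is the volume of a decoding sphere of radius t = (n - k) // 2."""
    t = (n - k) // 2
    V = sum(comb(n, s) * (q - 1) ** s for s in range(t + 1))
    return Fraction(V, q ** (n - k))
```

    For the NASA (255, 223) code over GF(256) this gives Q on the order of 10^-14, consistent with the familiar rule of thumb Q ≈ 1/t! for high-rate RS codes.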

  3. OVERFLOW-Interaction with Industry

    NASA Technical Reports Server (NTRS)

    Buning, Pieter G.; George, Michael W. (Technical Monitor)

    1996-01-01

    A Navier-Stokes flow solver, OVERFLOW, has been developed by researchers at NASA Ames Research Center to use overset (Chimera) grids to simulate the flow about complex aerodynamic shapes. Primary customers of the OVERFLOW flow solver and related software include McDonnell Douglas and Boeing, as well as the NASA Focused Programs for Advanced Subsonic Technology (AST) and High Speed Research (HSR). Code development has focused on customer issues, including improving code performance, ability to run on workstation clusters and the NAS SP2, and direct interaction with industry on accuracy assessment and validation. Significant interaction with NAS has produced a capability tailored to the Ames computing environment, and code contributions have come from a wide range of sources, both within and outside Ames.

  4. MCMEG: Simulations of both PDD and TPR for 6 MV LINAC photon beam using different MC codes

    NASA Astrophysics Data System (ADS)

    Fonseca, T. C. F.; Mendes, B. M.; Lacerda, M. A. S.; Silva, L. A. C.; Paixão, L.; Bastos, F. M.; Ramirez, J. V.; Junior, J. P. R.

    2017-11-01

    The Monte Carlo Modelling Expert Group (MCMEG) is an expert network specializing in Monte Carlo radiation transport and in modelling and simulation applied to the radiation protection and dosimetry research field. For the first inter-comparison task the group launched an exercise to model and simulate a 6 MV LINAC photon beam using the Monte Carlo codes available within their laboratories, and to validate their simulated results by comparing them with experimental measurements carried out in the National Cancer Institute (INCA) in Rio de Janeiro, Brazil. The experimental measurements were performed using an ionization chamber with calibration traceable to a Secondary Standard Dosimetry Laboratory (SSDL). The detector was immersed in a water phantom at different depths and was irradiated with a radiation field size of 10×10 cm2. This exposure setup was used to determine the dosimetric parameters Percentage Depth Dose (PDD) and Tissue Phantom Ratio (TPR). The validation process compares the MC calculated results to the experimentally measured PDD20,10 and TPR20,10. Simulations were performed reproducing the experimental TPR20,10 quality index, which provides a satisfactory description of both the PDD curve and the transverse profiles at the two depths measured. This paper reports in detail the modelling process using the MCNPX, MCNP6, EGSnrc and Penelope Monte Carlo codes, the source and tally descriptions, the validation processes and the results.
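
    The two quality indices mentioned above are linked by a standard empirical relation: IAEA TRS-398 estimates TPR20,10 from the ratio of percentage depth doses at 20 cm and 10 cm depth (10×10 cm2 field, SSD = 100 cm) as TPR20,10 = 1.2661·PDD20,10 − 0.0595. A minimal sketch (not code from the paper; the function name is ours):

```python
def tpr20_10_from_pdd(pdd20, pdd10):
    """Beam quality index TPR20,10 estimated from percentage depth doses
    at 20 cm and 10 cm depth, via the TRS-398 empirical relation:
        TPR20,10 = 1.2661 * (PDD20 / PDD10) - 0.0595
    Valid for a 10x10 cm2 field at SSD = 100 cm."""
    return 1.2661 * (pdd20 / pdd10) - 0.0595
```

    For a typical 6 MV beam (illustrative values PDD20 ≈ 38.5%, PDD10 ≈ 66.5%) this yields TPR20,10 ≈ 0.67, in the expected range for that energy.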

  5. Combustor Computations for CO2-Neutral Aviation

    NASA Technical Reports Server (NTRS)

    Hendricks, Robert C.; Brankovic, Andreja; Ryder, Robert C.; Huber, Marcia

    2011-01-01

    Knowing the pure component C(sub p)(sup 0) or mixture C(sub p)(sup 0) as computed by a flexible code such as NIST-STRAPP or McBride-Gordon, one can, within reasonable accuracy, determine the thermophysical properties necessary to predict the combustion characteristics when there are no tabulated or computed data for those fluid mixtures, or only limited results for lower temperatures. (Note: C(sub p)(sup 0) is molar heat capacity at constant pressure.) The method can be used in the determination of synthetic and biological fuels and blends using the NIST code to compute the C(sub p)(sup 0) of the mixture. In this work, the values of the heat capacity were set at zero pressure, which provided the basis for integration to determine the required combustor properties from the injector to the combustor exit plane. The McBride-Gordon code was used to determine the heat capacity at zero pressure over a wide range of temperatures (room temperature to 6,000 K). The selected fluids were Jet-A, 224TMP (octane), and C12. It was found that the heat capacity loci were form-similar. It was then determined that the results [near 400 to 3,000 K] could be represented to within acceptable engineering accuracy with the simplified equation C(sub p)(sup 0) = A/T + B, where A and B are fluid-dependent constants and T is temperature (K).
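
    Since Cp0(T) = A/T + B is linear in 1/T, the fluid-dependent constants can be obtained by ordinary least squares, and the fit integrates in closed form to give the enthalpy change used in the combustor property integration. A hedged sketch with hypothetical helper names (not the authors' code):

```python
import math

def fit_cp(temps, cps):
    """Least-squares fit of Cp0(T) = A/T + B, linear in x = 1/T.
    Returns the fluid-dependent constants (A, B)."""
    xs = [1.0 / T for T in temps]
    n = len(xs)
    sx, sy = sum(xs), sum(cps)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, cps))
    A = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    B = (sy - A * sx) / n
    return A, B

def enthalpy_change(A, B, T1, T2):
    """Closed-form integral of Cp0 = A/T + B from T1 to T2:
    dH = A*ln(T2/T1) + B*(T2 - T1)."""
    return A * math.log(T2 / T1) + B * (T2 - T1)
```

    With exact synthetic data the fit recovers A and B to machine precision, and the enthalpy integral needs no numerical quadrature.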

  6. A medical image-based graphical platform -- features, applications and relevance for brachytherapy.

    PubMed

    Fonseca, Gabriel P; Reniers, Brigitte; Landry, Guillaume; White, Shane; Bellezzo, Murillo; Antunes, Paula C G; de Sales, Camila P; Welteman, Eduardo; Yoriyaz, Hélio; Verhaegen, Frank

    2014-01-01

    Brachytherapy dose calculation is commonly performed using the Task Group-No 43 Report-Updated protocol (TG-43U1) formalism. Recently, a more accurate approach has been proposed that can handle tissue composition, tissue density, body shape, applicator geometry, and dose reporting either in media or water. Some model-based dose calculation algorithms are based on Monte Carlo (MC) simulations. This work presents a software platform capable of processing medical images and treatment plans, and preparing the required input data for MC simulations. The A Medical Image-based Graphical platfOrm-Brachytherapy module (AMIGOBrachy) is a user interface, coupled to the MCNP6 MC code, for absorbed dose calculations. The AMIGOBrachy was first validated in water for a high-dose-rate (192)Ir source. Next, dose distributions were validated in uniform phantoms consisting of different materials. Finally, dose distributions were obtained in patient geometries. Results were compared against a treatment planning system including a linear Boltzmann transport equation (LBTE) solver capable of handling nonwater heterogeneities. The TG-43U1 source parameters are in good agreement with literature with more than 90% of anisotropy values within 1%. No significant dependence on the tissue composition was observed comparing MC results against an LBTE solver. Clinical cases showed differences up to 25%, when comparing MC results against TG-43U1. About 92% of the voxels exhibited dose differences lower than 2% when comparing MC results against an LBTE solver. The AMIGOBrachy can improve the accuracy of the TG-43U1 dose calculation by using a more accurate MC dose calculation algorithm. The AMIGOBrachy can be incorporated in clinical practice via a user-friendly graphical interface. Copyright © 2014 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.

  7. Gene variants and binge eating as predictors of comorbidity and outcome of treatment in severe obesity.

    PubMed

    Potoczna, Natascha; Branson, Ruth; Kral, John G; Piec, Grazyna; Steffen, Rudolf; Ricklin, Thomas; Hoehe, Margret R; Lentes, Klaus-Ulrich; Horber, Fritz F

    2004-12-01

    Melanocortin-4 receptor gene (MC4R) variants are associated with obesity and binge eating disorder (BED), whereas the more prevalent proopiomelanocortin (POMC) and leptin receptor gene (LEPR) mutations are rarely associated with obesity or BED. The complete coding regions of MC4R, POMC, and leptin-binding domain of LEPR were comparatively sequenced in 300 patients (233 women and 67 men; mean +/- SEM age, 42 +/- 1 years; mean +/- SEM body mass index, 43.5 +/- 0.3 kg/m2) undergoing laparoscopic gastric banding. Eating behavior, esophagogastric pathology, metabolic syndrome prevalence, and postoperative weight loss and complications were retrospectively compared between carriers and noncarriers of gene variants with and without BED during 36 +/- 3-month follow-up. Nineteen patients (6.3%) carried 8 MC4R variants, 144 (48.0%) carried 13 POMC variants, and 247 (82.3%) carried 11 LEPR variants. All MC4R variant carriers had BED, compared with 18.1% of noncarriers (P < 0.001). BED rates were similar among POMC and LEPR variant carriers and noncarriers. Gastroscopy revealed more erosive esophagitis in bingers than in nonbingers before and after banding (P < 0.04), regardless of genotype. MC4R variant carriers lost less weight (P=0.003), showed less improvement in metabolic syndrome (P < 0.001), had dilated esophagi (P < 0.001) and more vomiting (P < 0.05), and had fivefold more gastric complications (P < 0.001) than noncarriers. Overall outcome was poorest in MC4R variant carriers, better in noncarriers with BED (P < 0.05), and best in noncarriers without BED (P < 0.001). MC4R variants influence comorbidities and treatment outcomes in severe obesity.

  8. MC1R Genotype and Plumage Colouration in the Zebra Finch (Taeniopygia guttata): Population Structure Generates Artefactual Associations

    PubMed Central

    Hoffman, Joseph I.; Krause, E. Tobias; Lehmann, Katrin; Krüger, Oliver

    2014-01-01

    Polymorphisms at the melanocortin-1 receptor (MC1R) gene have been linked to coloration in many vertebrate species. However, the potentially confounding influence of population structure has rarely been controlled for. We explored the role of the MC1R in a model avian system by sequencing the coding region in 162 zebra finches comprising 79 wild type and 83 white individuals from five stocks. Allelic counts differed significantly between the two plumage morphs at multiple segregating sites, but these were mostly synonymous. To provide a control, the birds were genotyped at eight microsatellites and subjected to Bayesian cluster analysis, revealing two distinct groups. We therefore crossed wild type with white individuals and backcrossed the F1s with white birds. No significant associations were detected in the resulting offspring, suggesting that our original findings were a byproduct of genome-wide divergence. Our results are consistent with a previous study that found no association between MC1R polymorphism and plumage coloration in leaf warblers. They also contribute towards a growing body of evidence suggesting that care should be taken to quantify, and where necessary control for, population structure in association studies. PMID:24489736

  9. Computer Simulation of Electron Thermalization in CsI and CsI(Tl)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Zhiguo; Xie, YuLong; Cannon, Bret D.

    2011-09-15

    A Monte Carlo (MC) model was developed and implemented to simulate the thermalization of electrons in inorganic scintillator materials. The model incorporates electron scattering with both longitudinal optical and acoustic phonons. In this paper, the MC model was applied to simulate electron thermalization in CsI, both pure and doped with a range of thallium concentrations. The inclusion of internal electric fields was shown to increase the fraction of recombined electron-hole pairs and to broaden the thermalization distance and thermalization time distributions. The MC simulations indicate that electron thermalization, following {gamma}-ray excitation, takes place within approximately 10 ps in CsI and that electrons can travel distances up to several hundreds of nanometers. Electron thermalization was studied for a range of incident {gamma}-ray energies using electron-hole pair spatial distributions generated by the MC code NWEGRIM (NorthWest Electron and Gamma Ray Interaction in Matter). These simulations revealed that the partition of thermalized electrons between different species (e.g., recombined with self-trapped holes or trapped at thallium sites) vary with the incident energy. Implications for the phenomenon of nonlinearity in scintillator light yield are discussed.

  10. Evaluation and optimization of sampling errors for the Monte Carlo Independent Column Approximation

    NASA Astrophysics Data System (ADS)

    Räisänen, Petri; Barker, W. Howard

    2004-07-01

    The Monte Carlo Independent Column Approximation (McICA) method for computing domain-average broadband radiative fluxes is unbiased with respect to the full ICA, but its flux estimates contain conditional random noise. McICA's sampling errors are evaluated here using a global climate model (GCM) dataset and a correlated-k distribution (CKD) radiation scheme. Two approaches to reduce McICA's sampling variance are discussed. The first is to simply restrict all of McICA's samples to cloudy regions. This avoids wasting precious few samples on essentially homogeneous clear skies. Clear-sky fluxes need to be computed separately for this approach, but this is usually done in GCMs for diagnostic purposes anyway. Second, accuracy can be improved by repeated sampling, and averaging those CKD terms with large cloud radiative effects. Although this naturally increases computational costs over the standard CKD model, random errors for fluxes and heating rates are reduced by typically 50% to 60%, for the present radiation code, when the total number of samples is increased by 50%. When both variance reduction techniques are applied simultaneously, globally averaged flux and heating rate random errors are reduced by a factor of about 3.
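
    The second variance-reduction idea, drawing extra samples for spectral terms with large cloud radiative effect and averaging them, can be illustrated with a toy estimator. This is a deliberately simplified sketch (binary cloudy/clear subcolumns, made-up weights and fluxes), not the GCM radiation code:

```python
import random

# Toy spectral terms: (quadrature weight, clear-sky flux, cloudy flux).
TERMS = [(0.25, 100.0, 40.0), (0.25, 80.0, 78.0),
         (0.25, 60.0, 20.0), (0.25, 50.0, 49.0)]
CLOUD_FRACTION = 0.5

def exact_ica():
    """Full ICA domain-average flux: exact cloud-fraction weighting."""
    f = CLOUD_FRACTION
    return sum(w * ((1 - f) * c + f * d) for w, c, d in TERMS)

def mcica(rng, extra_samples=1, threshold=float("inf")):
    """One McICA-style estimate: each spectral term samples a random
    subcolumn (cloudy with probability CLOUD_FRACTION).  Terms whose
    cloud radiative effect |d - c| exceeds `threshold` are averaged
    over `extra_samples` draws, reducing their sampling variance."""
    f, total = CLOUD_FRACTION, 0.0
    for w, c, d in TERMS:
        m = extra_samples if abs(d - c) > threshold else 1
        s = sum((d if rng.random() < f else c) for _ in range(m)) / m
        total += w * s
    return total
```

    Both estimators are unbiased; averaging extra draws only where the cloud effect is large cuts the noise roughly where it is generated, mirroring the paper's strategy of spending samples on the high-effect CKD terms.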

  11. The epidemiology of molluscum contagiosum in children.

    PubMed

    Dohil, Magdalene A; Lin, Peggy; Lee, James; Lucky, Anne W; Paller, Amy S; Eichenfield, Lawrence F

    2006-01-01

    Molluscum contagiosum (MC) is a viral disorder of the skin and mucous membranes characterized by discrete single or multiple, flesh-colored papules. Although MC as a clinical entity is well defined and commonly observed, few data regarding its epidemiology in the pediatric population exist. Our purpose was to collect epidemiologic data on children with MC with regard to age, gender, ethnicity, degree of involvement, relation to pre-existing atopic dermatitis (AD), and immune status. A retrospective chart review was conducted. All subjects were seen at 3 tertiary pediatric dermatology referral centers with two of the sites based at a Children's Hospital. A total of 302 patient charts with the Current Procedural Terminology code diagnosis of MC seen over a 6- to 8-month period were reviewed. Approximately 80% of the patients were younger than 8 years old. The majority of patients (63%) had more than 15 lesions. All but one patient were otherwise healthy, as determined by history and clinical examination. Approximately 24% of the patients presented with a history of previous or active coexistent AD. However, children with AD were at risk for an increased number of lesions. These data provide valuable updated information on the demographics and clinical presentation of MC in pediatric patients in the United States. Limitations include that this was a retrospective study with a population limited to tertiary pediatric dermatology referral centers.

  12. Multi-scale modeling of irradiation effects in spallation neutron source materials

    NASA Astrophysics Data System (ADS)

    Yoshiie, T.; Ito, T.; Iwase, H.; Kaneko, Y.; Kawai, M.; Kishida, I.; Kunieda, S.; Sato, K.; Shimakawa, S.; Shimizu, F.; Hashimoto, S.; Hashimoto, N.; Fukahori, T.; Watanabe, Y.; Xu, Q.; Ishino, S.

    2011-07-01

    Changes in the mechanical properties of Ni under irradiation by 3 GeV protons were estimated by multi-scale modeling. The code consisted of four parts. The first part was based on the Particle and Heavy-Ion Transport code System (PHITS) for nuclear reactions, and modeled the interactions between high-energy protons and nuclei in the target. The second part covered atomic collisions by particles without nuclear reactions. Because the energy of the particles was high, subcascade analysis was employed. The direct formation of clusters and the number of mobile defects were estimated using molecular dynamics (MD) and kinetic Monte Carlo (kMC) methods in each subcascade. The third part considered damage structure evolution estimated by reaction kinetic analysis. The fourth part involved the estimation of mechanical property changes using three-dimensional discrete dislocation dynamics (DDD). Using this four-part code, stress-strain curves for high-energy proton-irradiated Ni were obtained.

  13. A Flexible and Non-intrusive Approach for Computing Complex Structural Coverage Metrics

    NASA Technical Reports Server (NTRS)

    Whalen, Michael W.; Person, Suzette J.; Rungta, Neha; Staats, Matt; Grijincu, Daniela

    2015-01-01

    Software analysis tools and techniques often leverage structural code coverage information to reason about the dynamic behavior of software. Existing techniques instrument the code with the required structural obligations and then monitor the execution of the compiled code to report coverage. Instrumentation-based approaches often incur considerable runtime overhead for complex structural coverage metrics such as Modified Condition/Decision Coverage (MC/DC). Code instrumentation, in general, has to be approached with great care to ensure it does not modify the behavior of the original code. Furthermore, instrumented code cannot be used in conjunction with other analyses that reason about the structure and semantics of the code under test. In this work, we introduce a non-intrusive preprocessing approach for computing structural coverage information. It uses a static partial evaluation of the decisions in the source code and a source-to-bytecode mapping to generate the information necessary to efficiently track structural coverage metrics during execution. Our technique is flexible; the results of the preprocessing can be used by a variety of coverage-driven software analysis tasks, including automated analyses that are not possible for instrumented code. Experimental results in the context of symbolic execution show the efficiency and flexibility of our non-intrusive approach for computing code coverage information.
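    For readers unfamiliar with the metric, the MC/DC obligation itself is easy to state: every condition must be shown to independently affect the decision's outcome. A small illustrative sketch of that check (the decision `a and (b or c)` is an invented example, not one from the paper):

```python
# A minimal sketch of the MC/DC obligation itself (not the paper's
# preprocessing tool): for each condition in a decision, find a pair of
# test inputs that differ only in that condition and flip the outcome.
from itertools import product

def mcdc_pairs(decision, n_conditions):
    """Return an independence pair of input vectors for each condition."""
    pairs = {}
    tests = list(product([False, True], repeat=n_conditions))
    for i in range(n_conditions):
        for t in tests:
            flipped = t[:i] + (not t[i],) + t[i + 1:]
            if decision(*t) != decision(*flipped):
                pairs[i] = (t, flipped)   # condition i independently flips outcome
                break
    return pairs

# example decision: a and (b or c)
pairs = mcdc_pairs(lambda a, b, c: a and (b or c), 3)
for cond, (t1, t2) in sorted(pairs.items()):
    print(f"condition {cond}: {t1} vs {t2}")
```

A test suite achieves MC/DC on this decision if, for every condition, it contains at least one such pair; tracking which pairs have been exercised is what the paper's preprocessing makes cheap at runtime.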

  14. Capacity, cutoff rate, and coding for a direct-detection optical channel

    NASA Technical Reports Server (NTRS)

    Massey, J. L.

    1980-01-01

    It is shown that Pierce's pulse position modulation scheme with 2^L pulse positions, used on a self-noise-limited direct-detection optical communication channel, results in a 2^L-ary erasure channel that is equivalent to the parallel combination of L completely correlated binary erasure channels. The capacity of the full channel is the sum of the capacities of the component channels, but the cutoff rate of the full channel is shown to be much smaller than the sum of the cutoff rates. An interpretation of the cutoff rate is given that suggests a complexity advantage in coding separately on the component channels. It is shown that if short-constraint-length convolutional codes with Viterbi decoders are used on the component channels, then the performance and complexity compare favorably with the Reed-Solomon coding system proposed by McEliece for the full channel. The reasons for this unexpectedly fine performance by the convolutional code system are explored in detail, as are various facets of the channel structure.
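    The capacity/cutoff-rate contrast described in the abstract can be checked numerically with the standard formulas for an M-ary erasure channel under uniform inputs (the erasure probability p = 0.5 and L = 8 below are arbitrary example values, not taken from the paper):

```python
# Numeric check for an M-ary erasure channel with M = 2**L (PPM with
# erasure probability p): capacity adds across the L correlated binary
# erasure channels, but the cutoff rate R0 does not.
from math import log2

def capacity_erasure(M, p):
    """Capacity of an M-ary erasure channel, bits per channel use."""
    return (1 - p) * log2(M)

def cutoff_rate_erasure(M, p):
    """R0 = -log2(p + (1 - p)/M) for uniform inputs."""
    return -log2(p + (1 - p) / M)

L, p = 8, 0.5
full_C = capacity_erasure(2 ** L, p)
sum_C = L * capacity_erasure(2, p)
full_R0 = cutoff_rate_erasure(2 ** L, p)
sum_R0 = L * cutoff_rate_erasure(2, p)
print(f"capacity: full {full_C:.3f} = component sum {sum_C:.3f}")
print(f"cutoff rate: full {full_R0:.3f} << component sum {sum_R0:.3f}")
```

For these values the capacities coincide (4 bits/use), while the full channel's cutoff rate stays below 1 bit/use against roughly 3.3 bits/use for the component sum, which is the complexity argument for coding the component channels separately.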

  15. Proceedings of the Conference on the Environmental Effects of Explosives and Explosions (2nd) 13-14 October 1976

    DTIC Science & Technology

    1977-07-25

    of contusions on the lining of the gastrointestinal tract begin to occur along with petechial lung hemorrhages. The incidence and severity of these...Maryland 20640 Attn: LCDR J. W. McConnell Director Naval Research Laboratory Washington, D.C. 20375 Attn: Geoffrey O. Thomas, Code 8410 Kenneth N. Fever

  16. An Assessment of Long-Term Changes in Anthropometric Dimensions: Secular Trends of U.S. Army Males

    DTIC Science & Technology

    1990-12-01

    17. COSATI CODES 18 SUBJECT TERMS (Continue on reverse if necessary and identify by block number) FIELD GROUP SUB-GROUP Anthropometry Demography... Elderly Population. Human Biology 60:917-925. Clauser, C, I Tebbetts, B Bradtmiller, J McConville and CC Gordon (1988) Measurer’s Handbook: U.S. Army

  17. Scieszka's "The Stinky Cheese Man": A Tossed Salad of Parodic Re-Versions

    ERIC Educational Resources Information Center

    Pantaleo, Sylvia

    2007-01-01

    "The Stinky Cheese Man and Other Fairly Stupid Tales" (1992) by Jon Scieszka and Lane Smith was awarded a Randolph Caldecott Honor Medal in 1993. Scieszka and Smith subvert textual authority through playing "with literary and cultural codes and conventions" (McCallum 1996, p. 400) in their metafictive text. In this article, I discuss the…

  18. Supporting the Reflective Practice of Tutors: What Do Tutors Reflect on?

    ERIC Educational Resources Information Center

    Bell, Amani; Mladenovic, Rosina; Segara, Reuben

    2010-01-01

    Effective self-reflection is a key component of excellent teaching. We describe the types of self-reflection identified in tutors' reflective statements following a peer observation of teaching exercise. We used an adapted version of the categories developed by Grushka, McLeod and Reynolds in 2005 to code text from 20 written statements as…

  19. Combined Edition of Family Planning Library Manual and Family Planning Classification.

    ERIC Educational Resources Information Center

    Planned Parenthood--World Population, New York, NY. Katharine Dexter McCormick Library.

    This edition combines two previous publications of the Katharine Dexter McCormick Library into one volume: the Family Planning Library Manual, a guide for starting a family planning and population library or information center, and the Family Planning Classification, a coding system for organizing book and non-book materials so that they can be…

  20. 78 FR 68025 - Membership of the Economic Development Administration Performance Review Board

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-13

    ... CONTACT: Ruthie B. Stewart, U.S. Department of Commerce, Office of Human Resources Management, Office of...) Edith J. McCloud, Associate Director for Management, Career SES 4. Department of Commerce, Office of the... Commerce Human Resources Operations Center. [FR Doc. 2013-27080 Filed 11-12-13; 8:45 am] BILLING CODE 3510...

  1. Multiplexing using synchrony in the zebrafish olfactory bulb.

    PubMed

    Friedrich, Rainer W; Habermann, Christopher J; Laurent, Gilles

    2004-08-01

    In the olfactory bulb (OB) of zebrafish and other species, odors evoke fast oscillatory population activity and specific firing rate patterns across mitral cells (MCs). This activity evolves over a few hundred milliseconds from the onset of the odor stimulus. Action potentials of odor-specific MC subsets phase-lock to the oscillation, defining small and distributed ensembles within the MC population output. We found that oscillatory field potentials in the zebrafish OB propagate across the OB in waves. Phase-locked MC action potentials, however, were synchronized without a time lag. Firing rate patterns across MCs analyzed with low temporal resolution were informative about odor identity. When the sensitivity for phase-locked spiking was increased, activity patterns became progressively more informative about odor category. Hence, information about complementary stimulus features is conveyed simultaneously by the same population of neurons and can be retrieved selectively by biologically plausible mechanisms, indicating that seemingly alternative coding strategies operating on different time scales may coexist.

  2. PHITS simulations of absorbed dose out-of-field and neutron energy spectra for ELEKTA SL25 medical linear accelerator.

    PubMed

    Puchalska, Monika; Sihver, Lembit

    2015-06-21

    Monte Carlo (MC) based calculation methods for modeling photon and particle transport have several potential applications in radiotherapy. An essential requirement for successful radiation therapy is that the discrepancies between dose distributions calculated at the treatment planning stage and those delivered to the patient are minimized. It is also essential to minimize the dose to radiosensitive and critical organs. With the MC technique, the dose distributions from both the primary and scattered photons can be calculated. The out-of-field radiation doses are of particular concern when high-energy photons are used, since neutrons are then produced both in the accelerator head and inside the patient. Using the MC technique, the created photons and particles can be followed, and the transport and energy deposition in all the tissues of the patient can be estimated. This is of great importance during pediatric treatments for minimizing the risk to normal healthy tissue, e.g. secondary cancer. The purpose of this work was to evaluate the efficiency of the 3D general-purpose PHITS MC code as an alternative approach for photon beam specification. In this study, we developed a model of an ELEKTA SL25 accelerator and used the transport code PHITS to calculate the total absorbed dose and the neutron energy spectra in-field and outside the treatment field. This model was validated against measurements performed with bubble detector spectrometers and a Bonner sphere for 18 MV linacs, including both photons and neutrons. The average absolute difference between the calculated and measured absorbed dose for the out-of-field region was around 11%. Taking into account that the simulated geometry was simplified and did not include any potential surrounding scattering materials, this result is very satisfactory. Good agreement between the simulated and measured neutron energy spectra was also observed when comparing with data found in the literature.

  3. PHITS simulations of absorbed dose out-of-field and neutron energy spectra for ELEKTA SL25 medical linear accelerator

    NASA Astrophysics Data System (ADS)

    Puchalska, Monika; Sihver, Lembit

    2015-06-01

    Monte Carlo (MC) based calculation methods for modeling photon and particle transport have several potential applications in radiotherapy. An essential requirement for successful radiation therapy is that the discrepancies between dose distributions calculated at the treatment planning stage and those delivered to the patient are minimized. It is also essential to minimize the dose to radiosensitive and critical organs. With the MC technique, the dose distributions from both the primary and scattered photons can be calculated. The out-of-field radiation doses are of particular concern when high-energy photons are used, since neutrons are then produced both in the accelerator head and inside the patient. Using the MC technique, the created photons and particles can be followed, and the transport and energy deposition in all the tissues of the patient can be estimated. This is of great importance during pediatric treatments for minimizing the risk to normal healthy tissue, e.g. secondary cancer. The purpose of this work was to evaluate the efficiency of the 3D general-purpose PHITS MC code as an alternative approach for photon beam specification. In this study, we developed a model of an ELEKTA SL25 accelerator and used the transport code PHITS to calculate the total absorbed dose and the neutron energy spectra in-field and outside the treatment field. This model was validated against measurements performed with bubble detector spectrometers and a Bonner sphere for 18 MV linacs, including both photons and neutrons. The average absolute difference between the calculated and measured absorbed dose for the out-of-field region was around 11%. Taking into account that the simulated geometry was simplified and did not include any potential surrounding scattering materials, this result is very satisfactory. Good agreement between the simulated and measured neutron energy spectra was also observed when comparing with data found in the literature.

  4. The predominant circular form of avocado sunblotch viroid accumulates in planta as a free RNA adopting a rod-shaped secondary structure unprotected by tightly bound host proteins.

    PubMed

    López-Carrasco, Amparo; Flores, Ricardo

    2017-07-01

    Avocado sunblotch viroid (ASBVd), the type member of the family Avsunviroidae, replicates and accumulates in chloroplasts. Whether this minimal non-protein-coding circular RNA of 246-250 nt exists in vivo as a free nucleic acid or closely associated with host proteins remains unknown. To tackle this issue, the secondary structures of the monomeric circular (mc) (+) and (-) strands of ASBVd have been examined in silico by searching those of minimal free energy, and in vitro at single-nucleotide resolution by selective 2'-hydroxyl acylation analysed by primer extension (SHAPE). Both approaches resulted in predominant rod-like secondary structures without tertiary interactions, with the mc (+) RNA being more compact than its (-) counterpart as revealed by non-denaturing polyacrylamide gel electrophoresis. Moreover, in vivo SHAPE showed that the mc ASBVd (+) form accumulates in avocado leaves as a free RNA adopting a similar rod-shaped conformation unprotected by tightly bound host proteins. Hence, the mc ASBVd (+) RNA behaves in planta like the previously studied mc (+) RNA of potato spindle tuber viroid, the type member of nuclear viroids (family Pospiviroidae), indicating that two different viroids replicating and accumulating in distinct subcellular compartments have converged on a common structural solution. Circularity and compact secondary structures confer on these RNAs, and probably on all viroids, the intrinsic stability needed to survive in their natural habitats. However, in vivo SHAPE has not revealed the (possibly transient or loose) interactions of the mc ASBVd (+) RNA with two host proteins observed previously by UV irradiation of infected avocado leaves.

  5. Monte Carlo-based validation of neutronic methodology for EBR-II analyses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liaw, J.R.; Finck, P.J.

    1993-01-01

    The continuous-energy Monte Carlo code VIM (Ref. 1) has been validated extensively over the years against fast critical experiments and other neutronic analysis codes. A high degree of confidence in VIM for predicting reactor physics parameters has been firmly established. This paper presents a numerical validation of two conventional multigroup neutronic analysis codes, DIF3D (Ref. 4) and VARIANT (Ref. 5), against VIM for two Experimental Breeder Reactor II (EBR-II) core loadings in detailed three-dimensional hexagonal-z geometry. The DIF3D code is based on nodal diffusion theory, and it is used in calculations for day-to-day reactor operations, whereas the VARIANT code is based on nodal transport theory and is used with increasing frequency for specific applications. Both DIF3D and VARIANT rely on multigroup cross sections generated from ENDF/B-V by the ETOE-2/MC²-II/SDX (Ref. 6) code package. Hence, this study also validates the multigroup cross-section processing methodology against the continuous-energy approach used in VIM.
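    The multigroup methodology being validated rests on flux-weighted group collapse of a continuous-energy cross section. A minimal sketch with an invented 1/v-like cross section and a 1/E weighting spectrum (this is the generic idea only, not the actual ETOE-2/MC²-II/SDX processing):

```python
# Sketch of flux-weighted group collapse: a continuous-energy cross section
# sigma(E) is condensed to one value per group by weighting with an assumed
# flux spectrum phi(E). All functions and group bounds are invented.
import math

def collapse(sigma, phi, e_lo, e_hi, n=1000):
    """Flux-weighted average of sigma over [e_lo, e_hi] (midpoint rule)."""
    de = (e_hi - e_lo) / n
    num = den = 0.0
    for i in range(n):
        e = e_lo + (i + 0.5) * de
        num += sigma(e) * phi(e) * de
        den += phi(e) * de
    return num / den

sigma = lambda e: 1.0 + 10.0 / math.sqrt(e)   # invented 1/v-like shape
phi = lambda e: 1.0 / e                       # invented 1/E slowing-down flux

groups = [(1e3, 1e5), (1e5, 1e7)]             # two energy groups (eV), invented
sigma_g = [collapse(sigma, phi, lo, hi) for lo, hi in groups]
print("group cross sections:", sigma_g)
```

The Monte Carlo validation in the paper probes exactly the error introduced here: the collapsed values depend on the assumed weighting spectrum phi, whereas a continuous-energy code like VIM needs no such assumption.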

  6. Monte Carlo simulations of {sup 3}He ion physical characteristics in a water phantom and evaluation of radiobiological effectiveness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taleei, Reza; Guan, Fada; Peeler, Chris

    Purpose: ³He ions may hold great potential for clinical therapy because of both their physical and biological properties. In this study, the authors investigated the physical properties, i.e., the depth-dose curves from primary and secondary particles, and the energy distributions of helium (³He) ions. A relative biological effectiveness (RBE) model was applied to assess the biological effectiveness on survival of multiple cell lines. Methods: In light of the lack of experimental measurements and cross sections, the authors used Monte Carlo methods to study the energy deposition of ³He ions. The transport of ³He ions in water was simulated by using three Monte Carlo codes (FLUKA, GEANT4, and MCNPX) for incident beams with Gaussian energy distributions with average energies of 527 and 699 MeV and a full width at half maximum of 3.3 MeV in both cases. The RBE of each was evaluated by using the repair-misrepair-fixation model. In all of the simulations with each of the three Monte Carlo codes, the same geometry and primary beam parameters were used. Results: Energy deposition as a function of depth and energy spectra with high resolution were calculated on the central axis of the beam. The secondary proton dose from the primary ³He beams was predicted quite differently by the three Monte Carlo systems; the predictions differed by as much as a factor of 2. Microdosimetric parameters such as the dose-mean lineal energy (y_D), frequency-mean lineal energy (y_F), and frequency-mean specific energy (z_F) were used to characterize the radiation beam quality at four depths of the Bragg curve. Calculated RBE values were close to 1 at the entrance and reached on average 1.8 and 1.6 for prostate and head and neck cancer cell lines at the Bragg peak for both energies, but showed some variations between the different Monte Carlo codes. Conclusions: Although the Monte Carlo codes provided different results in energy deposition and especially in secondary particle production (most of the differences between the three codes were observed close to the Bragg peak, where the energy spectrum broadens), the results in terms of RBE were generally similar.
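    The quoted microdosimetric means have simple definitions: y_F is the frequency-weighted mean of the lineal energy, while y_D additionally weights each event by its lineal energy. A sketch with an invented discrete event spectrum (the values below are illustrative, not from the paper):

```python
# Sketch of the microdosimetric averages quoted in the abstract, computed
# from a hypothetical sampled lineal-energy spectrum. Values are invented.
ys = [0.5, 1.0, 1.5, 2.0, 8.0]        # lineal energies y (keV/um), invented
fs = [0.30, 0.30, 0.20, 0.15, 0.05]   # event frequencies f(y), sum to 1

y_F = sum(y * f for y, f in zip(ys, fs))             # frequency-mean
y_D = sum(y**2 * f for y, f in zip(ys, fs)) / y_F    # dose-mean
print(f"y_F = {y_F:.3f} keV/um, y_D = {y_D:.3f} keV/um")
```

Because y_D up-weights rare high-y events, it always exceeds y_F for a non-degenerate spectrum, which is why it is the more sensitive beam-quality indicator near the Bragg peak.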

  7. Partially Key Distribution with Public Key Cryptosystem Based on Error Control Codes

    NASA Astrophysics Data System (ADS)

    Tavallaei, Saeed Ebadi; Falahati, Abolfazl

    Due to the low level of security of public key cryptosystems based on number theory, and to fundamental difficulties such as "key escrow" in Public Key Infrastructure (PKI) and the need for a secure channel in ID-based cryptography, a new key distribution cryptosystem based on Error Control Codes (ECC) is proposed. The idea is realized through modifications of the McEliece cryptosystem. The security of the ECC cryptosystem derives from the NP-completeness of decoding general block codes. Using ECC also provides the capability of generating public keys of variable length, which is suitable for different applications. Given the decreasing security of cryptosystems based on number theory and the increasing lengths of their keys, the use of such code-based cryptosystems may become unavoidable in the future.
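    The McEliece construction that the proposal modifies can be shown at toy scale with the Hamming(7,4) code: the public key is a scrambled, permuted generator matrix, and decryption uses the private code's fast decoder. This is illustration only; a 7-bit code correcting a single error offers no security, and real parameters use much larger codes:

```python
# Toy McEliece-style scheme over the Hamming(7,4) code (insecure; for
# illustrating the construction only).
import numpy as np

rng = np.random.default_rng(42)

# systematic generator G = [I4 | A] and parity-check H = [A^T | I3]
A = np.array([[1, 1, 0], [1, 0, 1], [0, 1, 1], [1, 1, 1]])
G = np.hstack([np.eye(4, dtype=int), A]) % 2
H = np.hstack([A.T, np.eye(3, dtype=int)]) % 2

def gf2_inv(M):
    """Invert a square matrix over GF(2) by Gauss-Jordan elimination."""
    n = len(M)
    aug = np.hstack([M % 2, np.eye(n, dtype=int)])
    for col in range(n):
        pivot = next(r for r in range(col, n) if aug[r, col])
        aug[[col, pivot]] = aug[[pivot, col]]
        for r in range(n):
            if r != col and aug[r, col]:
                aug[r] = (aug[r] + aug[col]) % 2
    return aug[:, n:]

# private key: random invertible scrambler S and permutation P
while True:
    S = rng.integers(0, 2, (4, 4))
    try:
        S_inv = gf2_inv(S)
        break
    except StopIteration:      # singular S, draw again
        continue
perm = rng.permutation(7)
P = np.eye(7, dtype=int)[perm]

G_pub = S @ G @ P % 2          # public key: looks like a random code

def encrypt(m):
    e = np.zeros(7, dtype=int)
    e[rng.integers(7)] = 1     # one random error (t = 1 for Hamming)
    return (m @ G_pub + e) % 2

def decrypt(c):
    c = c @ P.T % 2            # undo the permutation
    syndrome = H @ c % 2
    for i in range(7):         # Hamming decoding: syndrome = column of H
        if np.array_equal(H[:, i], syndrome):
            c[i] ^= 1
            break
    return c[:4] @ S_inv % 2   # first 4 bits are m @ S (G is systematic)

m = np.array([1, 0, 1, 1])
print("recovered:", decrypt(encrypt(m)))
```

The legitimate receiver strips P and S and exploits the fast syndrome decoder; an attacker only sees G_pub and must decode an apparently unstructured code, which is the hard problem the abstract invokes.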

  8. Sub-band/transform compression of video sequences

    NASA Technical Reports Server (NTRS)

    Sauer, Ken; Bauer, Peter

    1992-01-01

    The progress on compression of video sequences is discussed. The overall goal of the research was the development of data compression algorithms for high-definition television (HDTV) sequences, but most of our research is general enough to be applicable to much more general problems. We have concentrated on coding algorithms based on both sub-band and transform approaches. Two very fundamental issues arise in designing a sub-band coder. First, the form of the signal decomposition must be chosen to yield band-pass images with characteristics favorable to efficient coding. A second basic consideration, whether coding is to be done in two or three dimensions, is the form of the coders to be applied to each sub-band. Computational simplicity is of the essence. We review the first portion of the year, during which we improved and extended some of the previous grant period's results. The pyramid nonrectangular sub-band coder limited to intra-frame application is discussed. Perhaps the most critical component of the sub-band structure is the design of band-splitting filters. We apply very simple recursive filters, which operate at alternating levels on rectangularly sampled and quincunx-sampled images. We also cover the techniques we have studied for the coding of the resulting band-pass signals, and discuss adaptive three-dimensional coding which takes advantage of the detection algorithm developed last year. To this point, all the work on this project has been done without the benefit of motion compensation (MC). Motion compensation is included in many proposed codecs, but adds significant computational burden and hardware expense. We have sought a lower-cost alternative featuring a simple adaptation to motion in the form of the codec. In sequences with high spatial detail and zooming or panning, it appears that MC will likely be necessary for the proposed quality and bit rates.
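    A one-level band split can be sketched with the 2-tap Haar filter pair; this is a stand-in illustration of the analysis/synthesis idea, not the report's recursive filters or quincunx sampling:

```python
# Minimal one-level sub-band split and exact reconstruction using the
# Haar filter pair on a 1-D signal (illustrative stand-in only).
def haar_analysis(x):
    """Split x into a half-rate low band (averages) and high band (details)."""
    low = [(x[2 * i] + x[2 * i + 1]) / 2 for i in range(len(x) // 2)]
    high = [(x[2 * i] - x[2 * i + 1]) / 2 for i in range(len(x) // 2)]
    return low, high

def haar_synthesis(low, high):
    """Recombine the two bands; inverts haar_analysis exactly."""
    x = []
    for l, h in zip(low, high):
        x += [l + h, l - h]
    return x

signal = [4, 6, 10, 12, 8, 6, 5, 5]
low, high = haar_analysis(signal)
print("low band:", low, "high band:", high)
print("reconstructed:", haar_synthesis(low, high))
```

After such a split, the coder can spend most of its bit budget on the low band, where the energy concentrates, and quantize the detail band coarsely, which is the premise of the sub-band approach described above.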

  9. PNS calculations for 3-D hypersonic corner flow with two turbulence models

    NASA Technical Reports Server (NTRS)

    Smith, Gregory E.; Liou, May-Fun; Benson, Thomas J.

    1988-01-01

    A three-dimensional parabolized Navier-Stokes code has been used as a testbed to investigate two turbulence models, the McDonald-Camarata and Bushnell-Beckwith models, in the hypersonic regime. The Bushnell-Beckwith form-factor correction to the McDonald-Camarata mixing-length model has been extended to three-dimensional flow by use of an inverse averaging of the resultant length-scale contributions from each wall. Two-dimensional calculations are compared with experiment for Mach 18 helium flow over a 4-deg wedge. Corner-flow calculations have been performed at Mach 11.8 for a Reynolds number of 0.67 x 10^6, based on the duct half-width, and a freestream stagnation temperature of 1750 deg Rankine.

  10. Monte-Carlo simulation of a stochastic differential equation

    NASA Astrophysics Data System (ADS)

    Arif, ULLAH; Majid, KHAN; M, KAMRAN; R, KHAN; Zhengmao, SHENG

    2017-12-01

    For solving higher-dimensional diffusion equations with an inhomogeneous diffusion coefficient, Monte Carlo (MC) techniques are considered to be more effective than other algorithms, such as the finite element method or finite difference method. The inhomogeneity of the diffusion coefficient strongly limits the use of different numerical techniques. For better convergence, higher-order methods have been put forward to allow MC codes to use large step sizes. The main focus of this work is to look for operators that can produce converging results for large step sizes. As a first step, our comparative analysis has been applied to a general stochastic problem. Subsequently, our formulation is applied to the problem of pitch-angle scattering resulting from Coulomb collisions of charged particles in toroidal devices.
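    The step-size issue can be illustrated with the plain Euler-Maruyama scheme on geometric Brownian motion, where the exact mean E[X_T] = X0·exp(μT) is known, so the weak error of a large step is directly visible (the paper's pitch-angle operator is not reproduced here; all parameters are invented):

```python
# Euler-Maruyama for dX = mu*X dt + sigma*X dW, compared against the known
# mean of geometric Brownian motion to expose the weak error at large steps.
import math
import random

random.seed(1)

def em_mean(x0, mu, sigma, T, n_steps, n_paths):
    """Monte-Carlo estimate of E[X_T] with the Euler-Maruyama scheme."""
    dt = T / n_steps
    total = 0.0
    for _ in range(n_paths):
        x = x0
        for _ in range(n_steps):
            dw = random.gauss(0.0, math.sqrt(dt))
            x += mu * x * dt + sigma * x * dw
        total += x
    return total / n_paths

exact = 1.0 * math.exp(0.5 * 1.0)                 # E[X_1] for mu = 0.5
coarse = em_mean(1.0, 0.5, 0.3, 1.0, 2, 20000)    # very large step size
fine = em_mean(1.0, 0.5, 0.3, 1.0, 64, 20000)     # small step size
print(f"exact {exact:.4f}, coarse {coarse:.4f}, fine {fine:.4f}")
```

The coarse run carries a visible bias from the first-order scheme; higher-order operators of the kind the paper investigates aim to keep that bias small while retaining the large step size.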

  11. Radiation environment at LEO orbits: MC simulation and experimental data.

    NASA Astrophysics Data System (ADS)

    Zanini, Alba; Borla, Oscar; Damasso, Mario; Falzetta, Giuseppe

    The evaluation of the different components of the radiation environment in spacecraft, both in LEO orbits and in deep space, is of great importance because the biological effects on humans and the risk to instrumentation strongly depend on the kind of radiation (high or low LET). This is especially important in view of long-term manned or unmanned space missions (missions to Mars, solar system exploration). The study of the space radiation field is extremely complex and not completely solved to date. Given the complexity of the radiation field, an accurate dose evaluation should be considered an indispensable part of any space mission. Two simulation codes (MCNPX and GEANT4) have been used to assess the secondary radiation inside the FOTON-M3 satellite and the ISS. The energy spectra of primary radiation at LEO orbits have been modelled by using various tools (SPENVIS, OMERE, CREME96), considering separately the Van Allen protons, the GCR protons and the GCR alpha particles. These data are used as input for the two MC codes and transported inside the spacecraft. The results of the two calculation methods have been compared. Moreover, some experimental results previously obtained on the FOTON-M3 satellite by using TLD, bubble dosimeter and LIULIN detectors are considered to check the performance of the two codes. Finally, the same experimental devices are at present collecting data on the ISS (ASI experiment BIOKIS-nDOSE), and at the end of the mission the results will be compared with the calculations.

  12. Ion channeling study of defects in compound crystals using Monte Carlo simulations

    NASA Astrophysics Data System (ADS)

    Turos, A.; Jozwik, P.; Nowicki, L.; Sathish, N.

    2014-08-01

    Ion channeling is a well-established technique for determination of the structural properties of crystalline materials. Defect depth profiles have usually been determined based on the two-beam model developed by Bøgh (1968) [1]. As long as the main research interest was focused on single-element crystals, it was considered sufficiently accurate. A new challenge emerged with the growing technological importance of compound single crystals and epitaxial heterostructures. The overlap of partial spectra due to different sublattices and the formation of complicated defect structures make the two-beam method hardly applicable. The solution is provided by Monte Carlo computer simulations. Our paper reviews the principal aspects of this approach and the recent developments in the McChasy simulation code. The latter made it possible to distinguish between randomly displaced atoms (RDA) and extended defects (dislocations, loops, etc.). Hence, complex defect structures can be characterized by the relative content of these two components. The next refinement of the code consists of a detailed parameterization of dislocations and dislocation loops. Defect profiles for a variety of compound crystals (GaN, ZnO, SrTiO3) have been measured and evaluated using the McChasy code. Damage accumulation curves for RDA and extended defects revealed a non-monotonic defect buildup with some characteristic steps. The transition to each stage is governed by a different driving force. As shown by the complementary high-resolution XRD measurements, lattice strain plays the crucial role here and can be correlated with the concentration of extended defects.

  13. Proposals to conserve Botryodiplodia theobromae (Lasiodiplodia theobromae) against Sphaeria glandicola, .....Ramularia brunnea against Sphaerella tussilaginis (Mycosphaerella tussilaginis) (Ascomycota: Dothideomycetes)

    USDA-ARS?s Scientific Manuscript database

    In the course of updating the scientific names of plant-associated fungi in the U. S. National Fungus Collections Fungal Databases to conform with one scientific name for fungi as required by the International Code of Nomenclature for algae, fungi and plants (ICN, McNeill & al. in Regnum Vegetabile 1...

  14. Dress Codes and the Academic Conference: McCulloch's Iron Laws of Conferences

    ERIC Educational Resources Information Center

    McCulloch, Alistair

    2018-01-01

    Despite being a staple of academic life (or perhaps because it is so taken-for-granted), the academic conference has been generally under-utilised as a site for academic research. Using participant observation as its methodology, this article draws on a long career of conference attendance to present two iron laws of conferences which address the…

  15. Channel Diversity in Random Wireless Networks

    DTIC Science & Technology

    2009-01-01

    ...πR² (33), where K is defined in terms of the beta function B(αL + 1/2, αL + 1/2) (34). The maximum contention density is therefore proportional to ε^(1/L), as ε → 0. ...Digital Communications, 5th ed. McGraw-Hill, 2008. [10] M. W. Subbarao and B. L. Hughes, “Optimal transmission ranges and code rates for frequency...

  16. UGV Interoperability Profile (IOP) Communications Profile, Version 0

    DTIC Science & Technology

    2011-12-21

    some UGV systems employ Orthogonal Frequency Division Multiplexing ( OFDM ) or Coded Orthogonal Frequency Division Multiplexing (COFDM) waveforms which...other portions of the IOP. Attribute Paragraph Title Values Waveform 3.3 Air Interface/ Waveform OFDM , COFDM, DDL, CDL, None OCU to Platform...Sight MANET Mobile Ad-hoc Network Mbps Megabits per second MC/PM Master Controller/ Payload Manager MHz Megahertz MIMO Multiple Input Multiple

  17. Improvements to the Sandia CTH Hydro-Code to Support Blast Analysis and Protective Design of Military Vehicles

    DTIC Science & Technology

    2014-04-15

    used for advertising or product endorsement purposes. 6.0 REFERENCES [1] McGlaun, J., Thompson, S. and Elrick, M. “CTH: A Three-Dimensional Shock-Wave...Validation of a Loading Model for Simulating Blast Mine Effects on Armoured Vehicles,” 7th International LS-DYNA Users Conference, Detroit, MI 2002. [14

  18. A Methodology to Assess UrbanSim Scenarios

    DTIC Science & Technology

    2012-09-01

    Education LOE – Line of Effort MMOG – Massively Multiplayer Online Game MC3 – Maneuver Captain’s Career Course MSCCC – Maneuver Support...augmented reality simulations, increased automation and artificial intelligence simulation, and massively multiplayer online games (MMOG), among...distribution is unlimited 12b. DISTRIBUTION CODE 13. ABSTRACT (maximum 200 words) Turn-based strategy games and simulations are vital tools for military

  19. Phytoremediation of Hazardous Wastes

    DTIC Science & Technology

    1995-07-26

    TITLE AND SUBTITLE Phytoremediation of Hazardous Wastes 6. AUTHOR(S) Steven C. McCutcheon, N. Lee Wolfe, Laura H. Carreria and Tse-Yuan Ou 5... phytoremediation (the use of plants to degrade hazardous contaminants) was developed. The new approach to phytoremediation involves rigorous pathway analyses...SUBJECT TERMS phytoremediation , nitroreductase, laccase enzymes, SERDP 15. NUMBER OF PAGES 8 16. PRICE CODE N/A 17. SECURITY CLASSIFICATION OF

  20. MONDO: a neutron tracker for particle therapy secondary emission characterisation

    NASA Astrophysics Data System (ADS)

    Marafini, M.; Gasparini, L.; Mirabelli, R.; Pinci, D.; Patera, V.; Sciubba, A.; Spiriti, E.; Stoppa, D.; Traini, G.; Sarti, A.

    2017-04-01

    Tumour control is performed in particle therapy using particles and ions, whose high irradiation precision enhances the effectiveness of the treatment, while sparing the healthy tissue surrounding the target volume. Dose range monitoring devices using photons and charged particles produced by the beam interacting with the patient’s body have already been proposed, but no attempt has been made yet to exploit the detection of the abundant neutron component. Since neutrons can release a significant dose far away from the tumour region, precise measurements of their flux, production energy and angle distributions are eagerly sought in order to improve the treatment planning system (TPS) software. It will thus be possible to predict not only the normal tissue toxicity in the target region, but also the risk of late complications in the whole body. The aforementioned issues underline the importance of an experimental effort devoted to the precise characterisation of neutron production, aimed at the measurement of their abundance, emission point and production energy. The technical challenges posed by a neutron detector aimed at high detection efficiency and good backtracking precision are addressed within the MONDO (monitor for neutron dose in hadrontherapy) project, whose main goal is to develop a tracking detector that can target fast and ultrafast neutrons. A full reconstruction of two consecutive elastic scattering interactions undergone by the neutrons inside the detector material will be used to measure their energy and direction. The preliminary results of an MC simulation performed using the FLUKA software are presented here, together with the DSiPM (digital SiPM) readout implementation. New detector readout implementations specifically tailored to the MONDO tracker are also discussed, and the neutron detection efficiency attainable with the proposed neutron tracking strategy is reported.
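    The double elastic-scattering reconstruction can be sketched with non-relativistic n-p kinematics, where the recoil proton takes E_p = E_n·cos²θ_p; two measured recoils then determine the incoming neutron energy (the numbers below are invented for illustration and this is a kinematic sketch, not the MONDO reconstruction code):

```python
# Back-of-envelope sketch of neutron energy reconstruction from two
# consecutive n-p elastic scatters, using non-relativistic kinematics.
import math

def neutron_energy(e_p1, e_p2, theta_p2):
    """E_n = E_p1 + E_n', with the scattered-neutron energy recovered from
    the second recoil: E_n' = E_p2 / cos^2(theta_p2)."""
    return e_p1 + e_p2 / math.cos(theta_p2) ** 2

# forward simulation of one event: 100 MeV neutron, invented recoil angles
e_n = 100.0
theta1 = math.radians(30)
e_p1 = e_n * math.cos(theta1) ** 2           # first recoil proton energy
e_n2 = e_n - e_p1                            # scattered neutron energy
theta2 = math.radians(45)
e_p2 = e_n2 * math.cos(theta2) ** 2          # second recoil proton energy

print(f"reconstructed E_n = {neutron_energy(e_p1, e_p2, theta2):.1f} MeV")
```

Measuring both proton tracks and the angle between them is what drives the detector requirements quoted above: high detection efficiency for two scatters in the same volume and good backtracking precision for the angle.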

  1. MONDO: a neutron tracker for particle therapy secondary emission characterisation.

    PubMed

    Marafini, M; Gasparini, L; Mirabelli, R; Pinci, D; Patera, V; Sciubba, A; Spiriti, E; Stoppa, D; Traini, G; Sarti, A

    2017-04-21

    Tumour control is performed in particle therapy using particles and ions, whose high irradiation precision enhances the effectiveness of the treatment, while sparing the healthy tissue surrounding the target volume. Dose range monitoring devices using photons and charged particles produced by the beam interacting with the patient's body have already been proposed, but no attempt has been made yet to exploit the detection of the abundant neutron component. Since neutrons can release a significant dose far away from the tumour region, precise measurements of their flux, production energy and angle distributions are eagerly sought in order to improve the treatment planning system (TPS) software. It will thus be possible to predict not only the normal tissue toxicity in the target region, but also the risk of late complications in the whole body. The aforementioned issues underline the importance of an experimental effort devoted to the precise characterisation of neutron production, aimed at the measurement of their abundance, emission point and production energy. The technical challenges posed by a neutron detector aimed at high detection efficiency and good backtracking precision are addressed within the MONDO (monitor for neutron dose in hadrontherapy) project, whose main goal is to develop a tracking detector that can target fast and ultrafast neutrons. A full reconstruction of two consecutive elastic scattering interactions undergone by the neutrons inside the detector material will be used to measure their energy and direction. The preliminary results of an MC simulation performed using the FLUKA software are presented here, together with the DSiPM (digital SiPM) readout implementation. New detector readout implementations specifically tailored to the MONDO tracker are also discussed, and the neutron detection efficiency attainable with the proposed neutron tracking strategy is reported.
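
    The energy measurement from two consecutive elastic scatters can be illustrated with non-relativistic n-p elastic kinematics. The sketch below is only an illustration of that relation, not the MONDO reconstruction code; the function name and the purely non-relativistic treatment are assumptions. The recoil proton carries E_p = E_n cos^2(theta_p), so the proton energy and its angle with respect to the incoming neutron direction determine E_n.

```python
import math

def incident_neutron_energy(e_proton_mev, theta_p_rad):
    """Invert the non-relativistic n-p elastic relation
    E_p = E_n * cos^2(theta_p) to recover the incident neutron
    energy from the measured recoil-proton energy and its angle
    with respect to the incoming neutron direction."""
    return e_proton_mev / math.cos(theta_p_rad) ** 2
```

    In a double-scatter tracker the incoming neutron direction for the second scatter is given by the line joining the two interaction vertices, which is what makes theta_p measurable.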

  2. Expenditure Distribution Trends with Regard to the Availability of Funds in the DOA and DOAF Budgets.

    DTIC Science & Technology

    1987-06-01

    Keywords: Budget, Outlays, DOAF, DOA, DOD, Expenditures, Incrementalism. Thesis, Naval Postgraduate School, June 1987.

  3. kmos: A lattice kinetic Monte Carlo framework

    NASA Astrophysics Data System (ADS)

    Hoffmann, Max J.; Matera, Sebastian; Reuter, Karsten

    2014-07-01

    Kinetic Monte Carlo (kMC) simulations have emerged as a key tool for microkinetic modeling in heterogeneous catalysis and other materials applications. Systems, where site-specificity of all elementary reactions allows a mapping onto a lattice of discrete active sites, can be addressed within the particularly efficient lattice kMC approach. To this end we describe the versatile kmos software package, which offers a most user-friendly implementation, execution, and evaluation of lattice kMC models of arbitrary complexity in one- to three-dimensional lattice systems, involving multiple active sites in periodic or aperiodic arrangements, as well as site-resolved pairwise and higher-order lateral interactions. Conceptually, kmos achieves a maximum runtime performance which is essentially independent of lattice size by generating code for the efficiency-determining local update of available events that is optimized for a defined kMC model. For this model definition and the control of all runtime and evaluation aspects kmos offers a high-level application programming interface. Usage proceeds interactively, via scripts, or a graphical user interface, which visualizes the model geometry, the lattice occupations and rates of selected elementary reactions, while allowing on-the-fly changes of simulation parameters. We demonstrate the performance and scaling of kmos with the application to kMC models for surface catalytic processes, where for given operation conditions (temperature and partial pressures of all reactants) central simulation outcomes are catalytic activity and selectivities, surface composition, and mechanistic insight into the occurrence of individual elementary processes in the reaction network.
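
    The efficiency-determining step of any rejection-free lattice kMC engine, which kmos generates optimized code for, is selecting one event from the list of currently available events and advancing the clock. The sketch below shows the generic BKL/Gillespie selection step only; the names and the (name, rate) interface are illustrative, not the kmos API.

```python
import math, random

def kmc_step(events, rng=random.random):
    """One rejection-free (BKL) kinetic Monte Carlo step.

    `events` lists the currently available events as (name, rate)
    pairs. An event is picked with probability rate / k_tot, and the
    simulation clock advances by an exponentially distributed waiting
    time with mean 1 / k_tot."""
    k_tot = sum(rate for _, rate in events)
    r = rng() * k_tot
    acc = 0.0
    for name, rate in events:
        acc += rate
        if r <= acc:
            chosen = name
            break
    # rng() is in [0, 1), so 1 - rng() is in (0, 1] and log is safe.
    dt = -math.log(1.0 - rng()) / k_tot
    return chosen, dt
```

    Keeping the per-step cost independent of lattice size then hinges on updating only the rates of events in the neighbourhood of the executed event, which is the local-update code kmos generates for each model.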

  4. Multimedia transmission in MC-CDMA using adaptive subcarrier power allocation and CFO compensation

    NASA Astrophysics Data System (ADS)

    Chitra, S.; Kumaratharan, N.

    2018-02-01

    Multicarrier code division multiple access (MC-CDMA) is one of the most effective techniques in fourth-generation (4G) wireless technology, due to its high data rate, high spectral efficiency and resistance to multipath fading. However, MC-CDMA systems are greatly deteriorated by carrier frequency offset (CFO), which is due to Doppler shift and oscillator instabilities. It leads to loss of orthogonality among the subcarriers and causes intercarrier interference (ICI). The water-filling algorithm (WFA) is an efficient resource allocation algorithm to solve the power utilisation problems among the subcarriers in time-dispersive channels. The conventional WFA fails to consider the effect of CFO. To perform subcarrier power allocation with reduced CFO and to improve the capacity of the MC-CDMA system, a residual-CFO-compensated adaptive subcarrier power allocation algorithm is proposed in this paper. The proposed technique allocates power only to subcarriers with a high channel-to-noise power ratio. The performance of the proposed method is evaluated using random binary data and an image as source inputs. Simulation results show that, in bit error rate performance and ICI reduction, the proposed modified WFA is superior in both power allocation and image compression for high-quality multimedia transmission in the presence of CFO and imperfect channel state information.
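
    The conventional water-filling step that the paper modifies can be sketched as follows. This is a hedged illustration of classic water-filling only; the function name and interface are invented here, and the paper's CFO compensation is not modelled. Power p_i = max(mu - 1/g_i, 0) is poured over the inverse channel-to-noise ratios until the total budget is spent, so subcarriers with low g_i receive nothing.

```python
def water_filling(cnr, total_power):
    """Classic water-filling: given per-subcarrier channel-to-noise
    ratios g_i and a total power budget, find the water level mu with
    sum_i max(mu - 1/g_i, 0) == total_power and return (mu, powers)."""
    inv = sorted(1.0 / g for g in cnr)
    n = len(inv)
    # Try filling the k best subcarriers; drop the worst channel
    # until the water level actually covers all k that remain.
    for k in range(n, 0, -1):
        mu = (total_power + sum(inv[:k])) / k
        if mu > inv[k - 1]:
            return mu, [max(mu - 1.0 / g, 0.0) for g in cnr]
    return 0.0, [0.0] * len(cnr)
```

    For example, with ratios [1.0, 0.5] and unit power the weaker subcarrier falls below the water level and all power goes to the stronger one.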

  5. Complete genome sequence of Geobacillus strain Y4.1MC1, a novel CO-utilizing Geobacillus thermoglucosidasius strain isolated from Bath Hot Spring in Yellowstone National Park

    DOE PAGES

    Brumm, Phillip; Land, Miriam L.; Hauser, Loren John; ...

    2015-02-10

    Geobacillus thermoglucosidasius Y4.1MC1 was isolated from a boiling spring in the lower geyser basin of Yellowstone National Park. This species is of interest because of its metabolic versatility. The genome consists of one circular chromosome of 3,840,330 bp and a circular plasmid of 71,617 bp with an average GC content of 44.01%. The genome is available in the GenBank database (NC_014650.1 and NC_014651.1). In addition to the expected metabolic pathways for sugars and amino acids, the Y4.1MC1 genome codes for two separate carbon monoxide utilization pathways, an aerobic oxidation pathway and an anaerobic reductive acetyl-CoA (Wood-Ljungdahl) pathway. This is the first report of a non-anaerobic organism with the Wood-Ljungdahl pathway. This anaerobic pathway permits the strain to utilize H2 and fix CO2 present in the hot spring environment. Y4.1MC1 and its related species may play a significant role in carbon capture and sequestration in thermophilic ecosystems and may open up new routes to produce biofuels and chemicals from CO, H2, and CO2.

  6. Toward GPGPU accelerated human electromechanical cardiac simulations

    PubMed Central

    Vigueras, Guillermo; Roy, Ishani; Cookson, Andrew; Lee, Jack; Smith, Nicolas; Nordsletten, David

    2014-01-01

    In this paper, we look at the acceleration of weakly coupled electromechanics using the graphics processing unit (GPU). Specifically, we port to the GPU a number of components of Heart, a CPU-based finite element code developed for simulating multi-physics problems. On the basis of a criterion of computational cost, we implemented on the GPU the ODE and PDE solution steps for the electrophysiology problem and the Jacobian and residual evaluation for the mechanics problem. Performance of the GPU implementation is then compared with single core CPU (SC) execution as well as multi-core CPU (MC) computations with equivalent theoretical performance. Results show that for a human scale left ventricle mesh, GPU acceleration of the electrophysiology problem provided speedups of 164x compared with SC and 5.5x compared with MC for the solution of the ODE model. Speedup of up to 72x compared with SC and 2.6x compared with MC was also observed for the PDE solve. Using the same human geometry, the GPU implementation of mechanics residual/Jacobian computation provided speedups of up to 44x compared with SC and 2.0x compared with MC. © 2013 The Authors. International Journal for Numerical Methods in Biomedical Engineering published by John Wiley & Sons, Ltd. PMID:24115492

  7. Complete genome sequence of Geobacillus strain Y4.1MC1, a novel CO-utilizing Geobacillus thermoglucosidasius strain isolated from Bath Hot Spring in Yellowstone National Park

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brumm, Phillip; Land, Miriam L.; Hauser, Loren John

    Geobacillus thermoglucosidasius Y4.1MC1 was isolated from a boiling spring in the lower geyser basin of Yellowstone National Park. This species is of interest because of its metabolic versatility. The genome consists of one circular chromosome of 3,840,330 bp and a circular plasmid of 71,617 bp with an average GC content of 44.01%. The genome is available in the GenBank database (NC_014650.1 and NC_014651.1). In addition to the expected metabolic pathways for sugars and amino acids, the Y4.1MC1 genome codes for two separate carbon monoxide utilization pathways, an aerobic oxidation pathway and an anaerobic reductive acetyl-CoA (Wood-Ljungdahl) pathway. This is the first report of a non-anaerobic organism with the Wood-Ljungdahl pathway. This anaerobic pathway permits the strain to utilize H2 and fix CO2 present in the hot spring environment. Y4.1MC1 and its related species may play a significant role in carbon capture and sequestration in thermophilic ecosystems and may open up new routes to produce biofuels and chemicals from CO, H2, and CO2.

  8. Neutron production from 40 GeV/c mixed proton/pion beam on copper, silver and lead targets in the angular range 30-135°

    NASA Astrophysics Data System (ADS)

    Agosteo, S.; Birattari, C.; Dimovasili, E.; Foglio Para, A.; Silari, M.; Ulrici, L.; Vincke, H.

    2005-02-01

    The neutron emission from 50 mm thick copper, silver and lead targets bombarded by a mixed proton/pion beam with a momentum of 40 GeV/c was measured at the CERN Super Proton Synchrotron. The neutron yield and spectral fluence per incident particle on target were measured with an extended-range Bonner sphere spectrometer in the angular range 30-135° with respect to the beam direction. Monte Carlo simulations with the FLUKA code were performed to provide a priori information for the unfolding of the experimental data. The spectral fluences show two peaks: an isotropic evaporation component centred at 3 MeV and a high-energy peak at around 100-150 MeV. The experimental neutron yields are given in four energy bins: <100 keV, 0.1-20 MeV, 20-500 MeV and 0.5-2 GeV. The total yields show a systematic discrepancy of 30-50% with respect to the results of the Monte Carlo simulations, with a peak of 70% at the largest angles, which is believed to be mainly due to uncertainties in the beam normalization factor. Analytic expressions are given for the variation of the integral yield as a function of emission angle and of target mass number.

  9. Development and reproducibility evaluation of a Monte Carlo-based standard LINAC model for quality assurance of multi-institutional clinical trials.

    PubMed

    Usmani, Muhammad Nauman; Takegawa, Hideki; Takashina, Masaaki; Numasaki, Hodaka; Suga, Masaki; Anetai, Yusuke; Kurosu, Keita; Koizumi, Masahiko; Teshima, Teruki

    2014-11-01

    Technical developments in radiotherapy (RT) have created a need for systematic quality assurance (QA) to ensure that clinical institutions deliver prescribed radiation doses consistent with the requirements of clinical protocols. For QA, an ideal dose verification system should be independent of the treatment-planning system (TPS). This paper describes the development and reproducibility evaluation of a Monte Carlo (MC)-based standard LINAC model as a preliminary requirement for independent verification of dose distributions. The BEAMnrc MC code is used for characterization of the 6-, 10- and 15-MV photon beams for a wide range of field sizes. The modeling of the LINAC head components is based on the specifications provided by the manufacturer. MC dose distributions are tuned to match Varian Golden Beam Data (GBD). For reproducibility evaluation, calculated beam data are compared with beam data measured at individual institutions. For all energies and field sizes, the MC and GBD agreed to within 1.0% for percentage depth doses (PDDs), 1.5% for beam profiles and 1.2% for total scatter factors (Scps). Reproducibility evaluation showed that the maximum average local differences were 1.3% and 2.5% for PDDs and beam profiles, respectively. MC and institutions' mean Scps agreed to within 2.0%. An MC-based standard LINAC model developed to independently verify dose distributions for QA of multi-institutional clinical trials and routine clinical practice has proven to be highly accurate and reproducible and can thus help ensure that prescribed doses delivered are consistent with the requirements of clinical protocols. © The Author 2014. Published by Oxford University Press on behalf of The Japan Radiation Research Society and Japanese Society for Radiation Oncology.
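
    The agreement figures quoted above are local percentage differences between calculated and measured beam data. The point-by-point comparison behind such numbers can be sketched as follows; this is an illustrative helper with an assumed name, not the authors' code.

```python
def local_differences(calc, meas):
    """Local percentage differences between calculated and measured
    dose samples taken at the same positions. Returns the maximum and
    the mean of |calc - meas| / meas * 100 over all sample points."""
    diffs = [abs(c - m) / m * 100.0 for c, m in zip(calc, meas)]
    return max(diffs), sum(diffs) / len(diffs)
```

    Normalising each difference to the local measured value, rather than to the maximum dose, makes the metric sensitive in low-dose regions such as profile tails.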

  10. Children's reaction to depictions of healthy foods in fast-food television advertisements.

    PubMed

    Bernhardt, Amy M; Wilking, Cara; Gottlieb, Mark; Emond, Jennifer; Sargent, James D

    2014-05-01

    Since 2009, quick-service restaurant chains, or fast-food companies, have agreed to depict healthy foods in their advertising targeted at children. To determine how children interpreted depictions of milk and apples in television advertisements for children's meals by McDonald's and Burger King (BK) restaurants. Descriptive qualitative study in a rural pediatric practice setting in Northern New England. A convenience sample of 99 children (age range, 3-7 years) was shown depictions of healthy foods in fast-food advertisements that aired from July 1, 2010, through June 30, 2011. The images from McDonald's and BK showed milk and apples. Children were asked what they saw and not prompted to respond specifically to any aspect of the images. Two still images drawn from advertisements for healthy meals at McDonald's and BK. Children's responses were independently content coded to food category by 2 researchers. Among the 99 children participating, only 51 (52%) and 69 (70%) correctly identified milk from the McDonald's and BK images, respectively, with a significantly greater percentage correct (P = .02 for both) among older children. The children's recall of apples was significantly different by restaurant, with 79 (80%) mentioning apples when describing the McDonald's image and only 10 (10%) for the BK image (P < .001). The percentage correct was not associated with age in either case. Conversely, although french fries were not featured in either image, 80 children (81%) recalled french fries after viewing the BK advertisement. Of the 4 healthy food images, only depiction of apples by McDonald's was communicated adequately to the target audience. Representations of milk were inadequately communicated to preliterate children. Televised depictions of apple slices by BK misled the children in this study, although no action was taken by government or self-regulatory bodies.

  11. A preliminary study of in-house Monte Carlo simulations: an integrated Monte Carlo verification system.

    PubMed

    Mukumoto, Nobutaka; Tsujii, Katsutomo; Saito, Susumu; Yasunaga, Masayoshi; Takegawa, Hideki; Yamamoto, Tokihiro; Numasaki, Hodaka; Teshima, Teruki

    2009-10-01

    To develop an infrastructure for the integrated Monte Carlo verification system (MCVS) to verify the accuracy of conventional dose calculations, which often fail to accurately predict dose distributions, mainly due to inhomogeneities in the patient's anatomy, for example, in lung and bone. The MCVS consists of the graphical user interface (GUI) based on a computational environment for radiotherapy research (CERR) with MATLAB language. The MCVS GUI acts as an interface between the MCVS and a commercial treatment planning system to import the treatment plan, create MC input files, and analyze MC output dose files. The MCVS consists of the EGSnrc MC codes, which include EGSnrc/BEAMnrc to simulate the treatment head and EGSnrc/DOSXYZnrc to calculate the dose distributions in the patient/phantom. In order to improve computation time without approximations, an in-house cluster system was constructed. The phase-space data of a 6-MV photon beam from a Varian Clinac unit was developed and used to establish several benchmarks under homogeneous conditions. The MC results agreed with the ionization chamber measurements to within 1%. The MCVS GUI could import and display the radiotherapy treatment plan created by the MC method and various treatment planning systems, such as RTOG and DICOM-RT formats. Dose distributions could be analyzed by using dose profiles and dose volume histograms and compared on the same platform. With the cluster system, calculation time was improved in line with the increase in the number of central processing units (CPUs) at a computation efficiency of more than 98%. Development of the MCVS was successful for performing MC simulations and analyzing dose distributions.
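
    A cumulative dose-volume histogram of the kind the MCVS GUI analyses is simply the fraction of voxels receiving at least each threshold dose. The sketch below illustrates that definition; the names are assumed and it is unrelated to the CERR/MATLAB implementation.

```python
def cumulative_dvh(doses, thresholds):
    """Cumulative DVH: for each dose threshold, the fraction of
    voxels in `doses` that receive at least that dose."""
    n = len(doses)
    return [sum(1 for d in doses if d >= t) / n for t in thresholds]
```

    Comparing two dose calculations then reduces to evaluating both dose grids against the same threshold axis and overlaying the resulting curves.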

  12. Genetic, comparative genomic, and expression analyses of the Mc1r locus in the polychromatic Midas cichlid fish (Teleostei, Cichlidae Amphilophus sp.) species group.

    PubMed

    Henning, Frederico; Renz, Adina Josepha; Fukamachi, Shoji; Meyer, Axel

    2010-05-01

    Natural populations of the Midas cichlid species in several different crater lakes in Nicaragua exhibit a conspicuous color polymorphism. Most individuals are dark and the remainder have a gold coloration. The color morphs mate assortatively, and sympatric population differentiation has been shown based on neutral molecular data. We investigated the color polymorphism using segregation analysis and a candidate gene approach. The segregation patterns observed in a mapping cross between a gold and a dark individual were consistent with a single dominant gene as a cause of the gold phenotype. This suggests that a simple genetic architecture underlies some of the speciation events in the Midas cichlids. We compared the expression levels of several candidate color genes (Mc1r, Ednrb1, Slc45a2, and Tfap1a) between the color morphs. Mc1r was found to be upregulated in the gold morph. Given its widespread association with color evolution and its role in melanin synthesis, the Mc1r locus was further investigated using sequences derived from a genomic library. Comparative analysis revealed conserved synteny in relation to the majority of teleosts and highlighted several previously unidentified conserved non-coding elements (CNEs) in the upstream and downstream regions in the vicinity of Mc1r. The identification of the CNE regions allowed the comparison of sequences from gold and dark specimens of natural populations. No polymorphisms were found in the population sample, and Mc1r showed no linkage to the gold phenotype in the mapping cross, demonstrating that it is not causally related to the color polymorphism in the Midas cichlid.

  13. Characterization of Bioderived Polyhydroxyalkanoates by Size Exclusion Chromatography

    NASA Astrophysics Data System (ADS)

    Negulescu, Ioan; Cueto, Rafael; Rusch, Kelly; Gutierrez-Wing, Teresa; Stevens, Benjamin

    2008-03-01

    The plant-derived polyesters, better known as polyhydroxyalkanoates, PHAs, are renewable and sustainable: [-O-CH(CH3)-(CH2)x-CO-]n. If x = 0 the PHA is poly(lactic acid), PLA; if x = 1 or 2 it is poly(hydroxybutyrate), PHB, or poly(hydroxyvalerate), PHV. SEC and light scattering have been used before for determination of the absolute molecular mass of PLA dissolved in CHCl3 (Malmgren et al., J. Thermal Anal. Calorim., 2006, 83, 35-40). To the best of our knowledge there is no publication on the determination of the absolute MW of other PHAs. The bioderived polymers analyzed in this work were four catalog PHA samples: PHB Fluka 81329, PHB Natural Aldrich 363502, 95PHB/5PHV Aldrich 403105, and 92PHB/8PHV Aldrich 403113. SEC/LS instrumentation used: three Phenogel (1K-10000K) columns + a guard column, an Agilent pump and Wyatt Heleos MALS, QUELS (DLS), ViscoStar and rEX DRI detectors, all in series. The experimental dn/dc of PHB in CHCl3 (0.0336 ml/g at 658 nm) allowed the determination of the absolute MW of all PHA samples: PHB Fluka Mw 345,100, Mn 218,400; PHB Aldrich Mw 335,700, Mn 185,000; 92PHB/8PHV Mw 144,700, Mn 91,970; 95PHB/5PHV Mw 253,000, Mn 193,800.

  14. 3D-DIVIMP-HC modeling analysis of methane injection into DIII-D using the DiMES porous plug injector

    NASA Astrophysics Data System (ADS)

    Mu, Y.; McLean, A. G.; Elder, J. D.; Stangeby, P. C.; Bray, B. D.; Brooks, N. H.; Davis, J. W.; Fenstermacher, M. E.; Groth, M.; Lasnier, C. J.; Rudakov, D. L.; Watkins, J. G.; West, W. P.; Wong, C. P. C.

    2009-06-01

    A self-contained gas injection system for the Divertor Material Evaluation System (DiMES) on DIII-D, the porous plug injector (PPI), has been employed for in situ study of chemical erosion in the tokamak divertor environment by injection of CH 4 [A.G. McLean et al., these Proceedings]. A new interpretive code, 3D-DIVIMP-HC, has been developed and applied to the interpretation of the CH, CI, and CII emissions. Particular emphasis is placed on the interpretation of 2D filtered-camera (TV) pictures in CH, CI and CII light taken from a view essentially straight down on the PPI. The code replicates sufficient measurements to conclude that most of the basic elements of the controlling physics and chemistry have been identified and incorporated in the code-model.

  15. Proposals to conserve the names Chaetomium piluliferum (Botryotrichum piluliferum) against ……and Gnomonia intermedia (Ophiognomonia intermedia) against Gloeosporium betulae (Discula betulae) (Ascomycota: Sordariomycetes)

    USDA-ARS?s Scientific Manuscript database

    In the course of updating the scientific names of plant-associated fungi in the USDA-ARS U.S. National Fungus Collections Fungal Databases to conform with one scientific name for fungi as required by the International Code of Nomenclature for algae, fungi and plants (ICN, McNeill & al. in Regnum Veg...

  16. Program for Critical Technologies in Breast Oncology

    DTIC Science & Technology

    1999-07-01

    ...the tissues, and in an ethical manner that respects the patients' rights. The Program for Critical Technologies in Breast Oncology helps address all of... Keywords: diagnosis, database. Reference: Adida C, Crotty PL, McGrath J, Berrebi D, Diebold J, Altieri DC, Developmentally regulated...

  17. Perception of the Auditory-Visual Illusion in Speech Perception by Children with Phonological Disorders

    ERIC Educational Resources Information Center

    Dodd, Barbara; McIntosh, Beth; Erdener, Dogu; Burnham, Denis

    2008-01-01

    An example of the auditory-visual illusion in speech perception, first described by McGurk and MacDonald, is the perception of [ta] when listeners hear [pa] in synchrony with the lip movements for [ka]. One account of the illusion is that lip-read and heard speech are combined in an articulatory code since people who mispronounce words respond…

  18. A Logical Design of the Naval Postgraduate School Housing Office.

    DTIC Science & Technology

    1985-03-01

    March 1985. Thesis Advisor: Barry A. Frew. Approved for public release; distribution is unlimited. References: Information Systems Development: Analysis and Design, South-Western, 1984; Pressman, R. S., Software Engineering: A Practitioner's Approach, McGraw... Naval Postgraduate School, Monterey, California 93943.

  19. Wright Research and Development Center Test Facilities Handbook

    DTIC Science & Technology

    1990-01-01

    Variable Temperature (2-400 K) and Field (0-5 Tesla) SQUID Susceptometer; Variable Temperature (10-80 K) and Field (0-10 Tesla) Transport Current... determine products of combustion using extraction-type probes. INSTRUMENTATION: minicomputer/data acquisition system; networking provides access to larger... data recorder, Masscomp MC-500 computer with acquisition digitizer, laser and ink-jet printers, low-pass filters, pulse code modulation. AVAILABILITY

  20. Expanding Human Capabilities through the Adoption and Utilization of Free, Libre, and Open Source Software

    ERIC Educational Resources Information Center

    Simpson, James Daniel

    2014-01-01

    Free, libre, and open source software (FLOSS) is software that is collaboratively developed. FLOSS provides end-users with the source code and the freedom to adapt or modify a piece of software to fit their needs (Deek & McHugh, 2008; Stallman, 2010). FLOSS has a 30 year history that dates to the open hacker community at the Massachusetts…

  1. Spiral computed tomography phase-space source model in the BEAMnrc/EGSnrc Monte Carlo system: implementation and validation.

    PubMed

    Kim, Sangroh; Yoshizumi, Terry T; Yin, Fang-Fang; Chetty, Indrin J

    2013-04-21

    Currently, the BEAMnrc/EGSnrc Monte Carlo (MC) system does not provide a spiral CT source model for the simulation of spiral CT scanning. We developed and validated a spiral CT phase-space source model in the BEAMnrc/EGSnrc system. The spiral phase-space source model was implemented in the DOSXYZnrc user code of the BEAMnrc/EGSnrc system by analyzing the geometry of the spiral CT scan (scan range, initial angle, rotational direction, pitch, slice thickness, etc.). Table movement was simulated by changing the coordinates of the isocenter as a function of beam angles. Some parameters such as pitch, slice thickness and translation per rotation were also incorporated into the model to make the new phase-space source model, designed specifically for spiral CT scan simulations. The source model was hard-coded by modifying the 'ISource = 8: Phase-Space Source Incident from Multiple Directions' in the srcxyznrc.mortran and dosxyznrc.mortran files in the DOSXYZnrc user code. In order to verify the implementation, spiral CT scans were simulated in a CT dose index phantom using the validated x-ray tube model of a commercial CT simulator for both the original multi-direction source (ISOURCE = 8) and the new phase-space source model in the DOSXYZnrc system. Then the acquired 2D and 3D dose distributions were analyzed with respect to the input parameters for various pitch values. In addition, surface-dose profiles were also measured for a patient CT scan protocol using radiochromic film and were compared with the MC simulations. The new phase-space source model was found to accurately simulate spiral CT scanning in a single simulation run. It also produced the equivalent dose distribution of the ISOURCE = 8 model for the same CT scan parameters. The MC-simulated surface profiles were well matched to the film measurement overall within 10%.The new spiral CT phase-space source model was implemented in the BEAMnrc/EGSnrc system.
This work will be beneficial in estimating the spiral CT scan dose in the BEAMnrc/EGSnrc system.
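
    Modelling table movement as an isocenter shift per beam angle can be sketched as follows. This is an illustrative approximation, not the authors' mortran code: the function name is invented, and it assumes the single-slice convention that translation per rotation equals pitch times nominal slice thickness.

```python
import math

def isocenter_z(z0, pitch, slice_thickness, beam_angle_rad):
    """Longitudinal isocenter position for a spiral scan, given the
    cumulative beam angle from the scan start. Assumes the table
    advances pitch * slice_thickness per full gantry rotation."""
    return z0 + pitch * slice_thickness * beam_angle_rad / (2.0 * math.pi)
```

    Sampling this position at each simulated beam angle reproduces the helical source trajectory in a single run, instead of stitching together separate axial simulations.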

  2. Spiral computed tomography phase-space source model in the BEAMnrc/EGSnrc Monte Carlo system: implementation and validation

    NASA Astrophysics Data System (ADS)

    Kim, Sangroh; Yoshizumi, Terry T.; Yin, Fang-Fang; Chetty, Indrin J.

    2013-04-01

    Currently, the BEAMnrc/EGSnrc Monte Carlo (MC) system does not provide a spiral CT source model for the simulation of spiral CT scanning. We developed and validated a spiral CT phase-space source model in the BEAMnrc/EGSnrc system. The spiral phase-space source model was implemented in the DOSXYZnrc user code of the BEAMnrc/EGSnrc system by analyzing the geometry of spiral CT scan—scan range, initial angle, rotational direction, pitch, slice thickness, etc. Table movement was simulated by changing the coordinates of the isocenter as a function of beam angles. Some parameters such as pitch, slice thickness and translation per rotation were also incorporated into the model to make the new phase-space source model, designed specifically for spiral CT scan simulations. The source model was hard-coded by modifying the ‘ISource = 8: Phase-Space Source Incident from Multiple Directions’ in the srcxyznrc.mortran and dosxyznrc.mortran files in the DOSXYZnrc user code. In order to verify the implementation, spiral CT scans were simulated in a CT dose index phantom using the validated x-ray tube model of a commercial CT simulator for both the original multi-direction source (ISOURCE = 8) and the new phase-space source model in the DOSXYZnrc system. Then the acquired 2D and 3D dose distributions were analyzed with respect to the input parameters for various pitch values. In addition, surface-dose profiles were also measured for a patient CT scan protocol using radiochromic film and were compared with the MC simulations. The new phase-space source model was found to simulate the spiral CT scanning in a single simulation run accurately. It also produced the equivalent dose distribution of the ISOURCE = 8 model for the same CT scan parameters. The MC-simulated surface profiles were well matched to the film measurement overall within 10%. The new spiral CT phase-space source model was implemented in the BEAMnrc/EGSnrc system. 
This work will be beneficial in estimating the spiral CT scan dose in the BEAMnrc/EGSnrc system.

  3. Monte Carlo simulation of MOSFET detectors for high-energy photon beams using the PENELOPE code

    NASA Astrophysics Data System (ADS)

    Panettieri, Vanessa; Amor Duch, Maria; Jornet, Núria; Ginjaume, Mercè; Carrasco, Pablo; Badal, Andreu; Ortega, Xavier; Ribas, Montserrat

    2007-01-01

    The aim of this work was the Monte Carlo (MC) simulation of the response of commercially available dosimeters based on metal oxide semiconductor field effect transistors (MOSFETs) for radiotherapeutic photon beams using the PENELOPE code. The studied Thomson & Nielsen TN-502-RD MOSFETs have a very small sensitive area of 0.04 mm² and a thickness of 0.5 µm, which is placed on a flat kapton base and covered by a rounded layer of black epoxy resin. The influence of different metallic and Plastic water™ build-up caps, together with the orientation of the detector, has been investigated for the specific application of MOSFET detectors to entrance in vivo dosimetry. Additionally, the energy dependence of MOSFET detectors for different high-energy photon beams (with energy >1.25 MeV) has been calculated. Calculations were carried out for simulated 6 MV and 18 MV x-ray beams generated by a Varian Clinac 1800 linear accelerator, a Co-60 photon beam from a Theratron 780 unit, and monoenergetic photon beams ranging from 2 MeV to 10 MeV. The results of the validation of the simulated photon beams show that the average difference between MC results and reference data is negligible, within 0.3%. MC-simulated results of the effect of the build-up caps on the MOSFET response are in good agreement with experimental measurements, within the uncertainties. In particular, for the 18 MV photon beam the response of the detectors under a tungsten cap is 48% higher than for a 2 cm Plastic water™ cap and approximately 26% higher when a brass cap is used. This effect is demonstrated to be caused by positron production in the build-up caps of higher atomic number. This work also shows that the MOSFET detectors produce a higher signal when their rounded side is facing the beam (up to 6%) and that there is a significant variation (up to 50%) in the response of the MOSFET for photon energies in the studied energy range. All the results have shown that the PENELOPE code system can successfully reproduce the response of a detector with such a small active area.

  4. Monte Carlo simulation of MOSFET detectors for high-energy photon beams using the PENELOPE code.

    PubMed

    Panettieri, Vanessa; Duch, Maria Amor; Jornet, Núria; Ginjaume, Mercè; Carrasco, Pablo; Badal, Andreu; Ortega, Xavier; Ribas, Montserrat

    2007-01-07

    The aim of this work was the Monte Carlo (MC) simulation of the response of commercially available dosimeters based on metal oxide semiconductor field effect transistors (MOSFETs) for radiotherapeutic photon beams using the PENELOPE code. The studied Thomson & Nielsen TN-502-RD MOSFETs have a very small sensitive area of 0.04 mm² and a thickness of 0.5 µm, which is placed on a flat kapton base and covered by a rounded layer of black epoxy resin. The influence of different metallic and Plastic water build-up caps, together with the orientation of the detector, has been investigated for the specific application of MOSFET detectors to entrance in vivo dosimetry. Additionally, the energy dependence of MOSFET detectors for different high-energy photon beams (with energy >1.25 MeV) has been calculated. Calculations were carried out for simulated 6 MV and 18 MV x-ray beams generated by a Varian Clinac 1800 linear accelerator, a Co-60 photon beam from a Theratron 780 unit, and monoenergetic photon beams ranging from 2 MeV to 10 MeV. The results of the validation of the simulated photon beams show that the average difference between MC results and reference data is negligible, within 0.3%. MC-simulated results of the effect of the build-up caps on the MOSFET response are in good agreement with experimental measurements, within the uncertainties. In particular, for the 18 MV photon beam the response of the detectors under a tungsten cap is 48% higher than for a 2 cm Plastic water cap and approximately 26% higher when a brass cap is used. This effect is demonstrated to be caused by positron production in the build-up caps of higher atomic number. This work also shows that the MOSFET detectors produce a higher signal when their rounded side is facing the beam (up to 6%) and that there is a significant variation (up to 50%) in the response of the MOSFET for photon energies in the studied energy range. All the results have shown that the PENELOPE code system can successfully reproduce the response of a detector with such a small active area.

  5. Implementation of Soft X-ray Tomography on NSTX

    NASA Astrophysics Data System (ADS)

    Tritz, K.; Stutman, D.; Finkenthal, M.; Granetz, R.; Menard, J.; Park, W.

    2003-10-01

    A set of poloidal ultrasoft X-ray arrays is operated by the Johns Hopkins group on NSTX. To enable MHD mode analysis independent of the magnetic reconstruction, the McCormick-Granetz tomography code developed at MIT is being adapted to the NSTX geometry. Tests of the code using synthetic data show that the present X-ray system is adequate for m=1 tomography. In addition, we have found that spline basis functions may be better suited than Bessel functions for the reconstruction of radially localized phenomena in NSTX. The tomography code was also used to determine the necessary array expansion and optimal array placement for the characterization of higher m modes (m=2,3) in the future. Initial reconstruction of experimental soft X-ray data has been performed for m=1 internal modes, which are often encountered in high beta NSTX discharges. The reconstruction of these modes will be compared to predictions from the M3D code and magnetic measurements.
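
The series-expansion approach described in this record (expanding the emissivity in basis functions and fitting line-integrated signals) can be sketched as follows. The Gaussian basis, chord layout, and coefficients are illustrative stand-ins, not the McCormick-Granetz implementation.

```python
import numpy as np

# Series-expansion tomography sketch: expand emissivity eps(r) in radial
# basis functions and invert chord-integrated brightness by least squares.
n_basis, n_chords = 6, 12
centers = np.linspace(0.0, 1.0, n_basis)
width = 0.18  # illustrative Gaussian basis width

def basis(r):
    """All basis functions at radii r -> shape (len(r), n_basis)."""
    r = np.atleast_1d(r)
    return np.exp(-((r[:, None] - centers[None, :]) / width) ** 2)

def chord_matrix(impact_params, n_quad=400):
    """Line integrals of each basis function along chords of a unit plasma."""
    A = np.zeros((len(impact_params), n_basis))
    for i, p in enumerate(impact_params):
        s = np.linspace(0.0, np.sqrt(1.0 - p ** 2), n_quad)  # path coordinate
        ds = s[1] - s[0]
        r = np.sqrt(p ** 2 + s ** 2)                         # radius along chord
        A[i] = 2.0 * basis(r).sum(axis=0) * ds
    return A

# Synthetic brightness data from known coefficients, then invert
impact = np.linspace(0.0, 0.9, n_chords)
A = chord_matrix(impact)
true_coef = np.array([1.0, 0.8, 0.5, 0.25, 0.1, 0.0])
signals = A @ true_coef
coef, *_ = np.linalg.lstsq(A, signals, rcond=None)
```

With noiseless synthetic signals the least-squares inversion recovers the basis coefficients exactly; with real data the choice of basis (Bessel vs spline) controls how well radially localized structures survive the inversion.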

  6. X-Ray, EUV, UV and Optical Emissivities of Astrophysical Plasmas

    NASA Technical Reports Server (NTRS)

    Raymond, John C.; West, Donald (Technical Monitor)

    2000-01-01

    This grant primarily covered the development of the thermal X-ray emission model code called APEC, which is meant to replace the Raymond and Smith (1977) code. The new code contains far more spectral lines and a great deal of updated atomic data. The code is now available (http://hea-www.harvard.edu/APEC), though new atomic data is still being added, particularly at longer wavelengths. While initial development of the code was funded by this grant, current work is carried on by N. Brickhouse, R. Smith and D. Liedahl under separate funding. Over the last five years, the grant has provided salary support for N. Brickhouse, R. Smith, a summer student (L. McAllister), an SAO predoctoral fellow (A. Vasquez), and visits by T. Kallman, D. Liedahl, P. Ghavamian, J.M. Laming, J. Li, P. Okeke, and M. Martos. In addition to the code development, the grant supported investigations into X-ray and UV spectral diagnostics as applied to shock waves in the ISM, accreting black holes and white dwarfs, and stellar coronae. Many of these efforts are continuing. Closely related work on the shock waves and coronal mass ejections in the solar corona has grown out of the efforts supported by the grant.

  7. Introducing MCgrid 2.0: Projecting cross section calculations on grids

    NASA Astrophysics Data System (ADS)

    Bothmann, Enrico; Hartland, Nathan; Schumann, Steffen

    2015-11-01

    MCgrid is a software package that provides access to interpolation tools for Monte Carlo event generator codes, allowing for the fast and flexible variation of scales, coupling parameters and PDFs in cutting-edge leading- and next-to-leading-order QCD calculations. We present the upgrade to version 2.0, which has a broader scope of interfaced interpolation tools, now providing access to fastNLO, and features an approximate treatment for the projection of MC@NLO-type calculations onto interpolation grids. MCgrid 2.0 also now supports the extended information provided through the HepMC event record used in the recent SHERPA version 2.2.0. The additional information provided therein allows for the support of multi-jet merged QCD calculations in a future update of MCgrid.
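
The core idea behind projecting MC weights onto an interpolation grid, so the PDF convolution can be redone after the fact, can be sketched in a toy one-dimensional form. The grid, event weights, and "PDFs" below are invented for illustration and bear no relation to the actual MCgrid/fastNLO interfaces.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy interpolation grid: during event generation, spread each MC weight
# over the bracketing x-grid nodes, so any PDF can be convolved later
# without rerunning the generator.
x_nodes = np.linspace(0.0, 1.0, 11)
grid = np.zeros_like(x_nodes)

events_x = rng.uniform(0.05, 0.95, 10_000)              # sampled momentum fractions
events_w = np.full(events_x.size, 1.0 / events_x.size)  # toy matrix-element weights

# Fill step: linear interpolation weights onto the two bracketing nodes
idx = np.searchsorted(x_nodes, events_x) - 1
frac = (events_x - x_nodes[idx]) / (x_nodes[idx + 1] - x_nodes[idx])
np.add.at(grid, idx, events_w * (1.0 - frac))
np.add.at(grid, idx + 1, events_w * frac)

# A posteriori convolution with two different toy "PDFs"
sigma_flat = float(np.sum(grid * 1.0))              # f(x) = 1
sigma_linear = float(np.sum(grid * 2.0 * x_nodes))  # f(x) = 2x
```

Because the fill is linear in x, convolving the grid with f(x) = 2x reproduces the direct event-by-event sum exactly; real grids additionally store perturbative orders, scales, and flavour channels.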

  8. MC ray-tracing optimization of lobster-eye focusing devices with RESTRAX

    NASA Astrophysics Data System (ADS)

    Šaroun, Jan; Kulda, Jiří

    2006-11-01

    The enhanced functionalities of the latest version of the RESTRAX software, providing a high-speed Monte Carlo (MC) ray-tracing code to represent a virtual three-axis neutron spectrometer, include representation of parabolic and elliptic guide profiles and facilities for numerical optimization of parameter values, characterizing the instrument components. As examples, we present simulations of a doubly focusing monochromator in combination with cold neutron guides and lobster-eye supermirror devices, concentrating a monochromatic beam to small sample volumes. A Levenberg-Marquardt minimization algorithm is used to optimize simultaneously several parameters of the monochromator and lobster-eye guides. We compare the performance of optimized configurations in terms of monochromatic neutron flux and energy spread and demonstrate the effect of lobster-eye optics on beam transformations in real and momentum subspaces.
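
The Levenberg-Marquardt minimization used for the instrument optimization can be illustrated with SciPy's implementation on a toy figure of merit. The Gaussian "detector profile" and its two parameters are assumptions for this sketch, not RESTRAX quantities.

```python
import numpy as np
from scipy.optimize import least_squares

# Levenberg-Marquardt on a toy figure of merit: tune two "instrument"
# parameters (amplitude, width) so a simulated Gaussian detector profile
# matches a target shape.
x = np.linspace(-3.0, 3.0, 61)
target = 2.0 * np.exp(-x ** 2 / (2 * 0.7 ** 2))

def residuals(params):
    amp, sigma = params
    return amp * np.exp(-x ** 2 / (2 * sigma ** 2)) - target

fit = least_squares(residuals, x0=[1.0, 1.0], method="lm")
amp_opt, sigma_opt = fit.x
```

In the ray-tracing context, the residual vector would come from a (noisy) MC simulation of flux and energy spread rather than a closed-form model, which is why several parameters are optimized simultaneously.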

  9. An unbiased Hessian representation for Monte Carlo PDFs.

    PubMed

    Carrazza, Stefano; Forte, Stefano; Kassabov, Zahari; Latorre, José Ignacio; Rojo, Juan

    We develop a methodology for the construction of a Hessian representation of Monte Carlo sets of parton distributions, based on the use of a subset of the Monte Carlo PDF replicas as an unbiased linear basis, and of a genetic algorithm for the determination of the optimal basis. We validate the methodology by first showing that it faithfully reproduces a native Monte Carlo PDF set (NNPDF3.0), and then that, when applied to a Hessian PDF set (MMHT14) transformed into a Monte Carlo set, it gives back the starting PDFs with minimal information loss. We then show that, when applied to a large Monte Carlo PDF set obtained as a combination of several underlying sets, the methodology leads to a Hessian representation in terms of a rather smaller set of parameters (MC-H PDFs), thereby providing an alternative implementation of the recently suggested Meta-PDF idea and a Hessian version of the recently suggested PDF compression algorithm (CMC-PDFs). The mc2hessian conversion code is made publicly available, together with (through LHAPDF6) Hessian representations of the NNPDF3.0 set and the MC-H PDF set.
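
The equivalence that such conversions exploit (deviations of MC replicas around their mean can be rotated into orthogonal, Hessian-style eigenvector directions) can be sketched with an SVD on synthetic replicas. The toy data below stand in for PDF replicas; the actual mc2hessian code additionally selects a replica subset with a genetic algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for MC PDF replicas: 100 replicas of a 20-point "PDF"
# drawn with an exactly rank-5 covariance (illustrative, not NNPDF data).
n_rep, n_x, n_eig = 100, 20, 5
directions = rng.normal(size=(n_eig, n_x))
replicas = rng.normal(size=(n_rep, n_eig)) @ directions

central = replicas.mean(axis=0)
dev = replicas - central

# SVD of the deviation matrix yields orthogonal eigenvector directions;
# keeping the leading ones is the Hessian-style compression of the MC set.
U, s, Vt = np.linalg.svd(dev, full_matrices=False)
eigvecs = (s[:n_eig, None] * Vt[:n_eig]) / np.sqrt(n_rep - 1)

# The symmetric Hessian uncertainty reproduces the MC standard deviation
sigma_hess = np.sqrt((eigvecs ** 2).sum(axis=0))
sigma_mc = dev.std(axis=0, ddof=1)
```

Since the toy covariance is exactly rank 5, five eigenvectors reproduce the MC uncertainty to machine precision; for real replica sets the number of retained directions controls the information loss.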

  10. Validation of a commercial TPS based on the VMC(++) Monte Carlo code for electron beams: commissioning and dosimetric comparison with EGSnrc in homogeneous and heterogeneous phantoms.

    PubMed

    Ferretti, A; Martignano, A; Simonato, F; Paiusco, M

    2014-02-01

    The aim of the present work was the validation of the VMC(++) Monte Carlo (MC) engine implemented in Oncentra Masterplan (OMTPS) and used to calculate the dose distribution produced by the electron beams (energy 5-12 MeV) generated by the linear accelerator (linac) Primus (Siemens), shaped by a digital variable applicator (DEVA). The BEAMnrc/DOSXYZnrc (EGSnrc package) MC model of the linac head was used as a benchmark. Commissioning results for both MC codes were evaluated by means of 1D gamma analysis (2%, 2 mm), calculated with a home-made Matlab (The MathWorks) program, comparing the calculations with the measured profiles. The results of the commissioning of OMTPS were good [average gamma index (γ) > 97%]; some mismatches were found with large beams (size ≥ 15 cm). The optimization of the BEAMnrc model required increasing the beam exit window to match the calculated and measured profiles (final average γ > 98%). Then OMTPS dose distribution maps were compared with DOSXYZnrc with a 2D gamma analysis (3%, 3 mm), in 3 virtual water phantoms: (a) with an air step, (b) with an air insert, and (c) with a bone insert. The OMTPS and EGSnrc dose distributions with the air-water step phantom were in very high agreement (γ ∼ 99%), while for heterogeneous phantoms there were differences of about 9% in the air insert and of about 10-15% in the bone region. This is due to the Masterplan implementation of VMC(++), which reports the dose as "dose to water" instead of "dose to medium". Copyright © 2013 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
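
The 1D gamma analysis used for commissioning can be sketched with a brute-force implementation of the standard gamma formula (an illustration of the concept, not the home-made Matlab program mentioned in the abstract; the profiles are toy data).

```python
import numpy as np

def gamma_1d(x_ref, d_ref, x_eval, d_eval, dose_tol=0.02, dist_tol=2.0):
    """Global 1D gamma index: dose_tol is a fraction of the maximum
    reference dose, dist_tol is in mm (brute-force search over all
    evaluated points)."""
    d_norm = dose_tol * d_ref.max()
    gammas = np.empty_like(d_ref)
    for i, (x, d) in enumerate(zip(x_ref, d_ref)):
        term = ((x_eval - x) / dist_tol) ** 2 + ((d_eval - d) / d_norm) ** 2
        gammas[i] = np.sqrt(term.min())
    return gammas

x = np.linspace(0.0, 100.0, 201)            # positions in mm
dose = np.exp(-((x - 50.0) / 15.0) ** 2)    # toy profile, not measured data

g_same = gamma_1d(x, dose, x, dose)          # identical profiles
g_shift = gamma_1d(x, dose, x + 1.0, dose)   # 1 mm shift, inside 2 mm tolerance
pass_rate = float(np.mean(g_shift <= 1.0) * 100.0)
```

A point passes when gamma ≤ 1; a rigid 1 mm shift stays within the 2 mm distance-to-agreement criterion, so the pass rate remains 100%.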

  11. Monte Carlo Neutrino Transport through Remnant Disks from Neutron Star Mergers

    NASA Astrophysics Data System (ADS)

    Richers, Sherwood; Kasen, Daniel; O'Connor, Evan; Fernández, Rodrigo; Ott, Christian D.

    2015-11-01

    We present Sedonu, a new open source, steady-state, special relativistic Monte Carlo (MC) neutrino transport code, available at bitbucket.org/srichers/sedonu. The code calculates the energy- and angle-dependent neutrino distribution function on fluid backgrounds of any number of spatial dimensions, calculates the rates of change of fluid internal energy and electron fraction, and solves for the equilibrium fluid temperature and electron fraction. We apply this method to snapshots from two-dimensional simulations of accretion disks left behind by binary neutron star mergers, varying the input physics and comparing to the results obtained with a leakage scheme for the cases of a central black hole and a central hypermassive neutron star. Neutrinos are guided away from the densest regions of the disk and escape preferentially around 45° from the equatorial plane. Neutrino heating is strengthened by MC transport a few scale heights above the disk midplane near the innermost stable circular orbit, potentially leading to a stronger neutrino-driven wind. Neutrino cooling in the dense midplane of the disk is stronger when using MC transport, leading to a globally higher cooling rate by a factor of a few and a larger leptonization rate by an order of magnitude. We calculate neutrino pair annihilation rates and estimate that an energy of 2.8 × 1046 erg is deposited within 45° of the symmetry axis over 300 ms when a central BH is present. Similarly, 1.9 × 1048 erg is deposited over 3 s when an HMNS sits at the center, but neither estimate is likely to be sufficient to drive a gamma-ray burst jet.
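
The elementary sampling step shared by MC transport codes, drawing an exponential optical-depth path length and tallying the outcome, can be illustrated for a purely absorbing slab, where the escape fraction has the analytic Beer-Lambert answer. This shows only the sampling core of MC transport, not Sedonu's scheme.

```python
import numpy as np

rng = np.random.default_rng(42)

# Sample the optical depth to the next (here, absorbing) interaction from
# an exponential distribution and count packets that traverse a slab of
# total optical depth tau.
def escape_fraction(tau, n_packets=200_000):
    path = rng.exponential(1.0, size=n_packets)  # optical depth to absorption
    return float(np.mean(path > tau))

tau = 1.5
mc = escape_fraction(tau)
analytic = float(np.exp(-tau))  # Beer-Lambert attenuation for comparison
```

Full codes add scattering, energy and angle dependence, and fluid coupling on top of this step, but the statistical convergence behaves the same way: the error shrinks as the inverse square root of the packet count.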

  12. TU-H-CAMPUS-IeP1-04: Combined Organ Dose for Digital Subtraction Angiography and Computed Tomography Using Monte Carlo Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sakabe, D; Ohno, T; Araki, F

    Purpose: The purpose of this study was to evaluate the combined organ dose of digital subtraction angiography (DSA) and computed tomography (CT) using a Monte Carlo (MC) simulation on the abdominal intervention. Methods: The organ doses for DSA and CT were obtained with MC simulation and actual measurements using fluorescent-glass dosimeters at 7 abdominal portions in an Alderson-Rando phantom. DSA was performed from three directions: posterior anterior (PA), right anterior oblique (RAO), and left anterior oblique (LAO). The organ dose with MC simulation was compared with actual radiation dose measurements. Calculations for the MC simulation were carried out with the GMctdospp (IMPS, Germany) software based on the EGSnrc MC code. Finally, the combined organ dose for DSA and CT was calculated from the MC simulation using the X-ray conditions of a patient with a diagnosis of hepatocellular carcinoma. Results: For DSA from the PA direction, the organ doses for the actual measurements and MC simulation were 2.2 and 2.4 mGy/100 mAs at the liver, respectively, and 3.0 and 3.1 mGy/100 mAs at the spinal cord, while for CT, the organ doses were 15.2 and 15.1 mGy/100 mAs at the liver, and 14.6 and 13.5 mGy/100 mAs at the spinal cord. The maximum difference in organ dose between the actual measurements and the MC simulation was 11.0% of the spleen at PA, 8.2% of the spinal cord at RAO, and 6.1% of left kidney at LAO with DSA and 9.3% of the stomach with CT. The combined organ dose (4 DSAs and 6 CT scans) with the use of actual patient conditions was found to be 197.4 mGy for the liver and 205.1 mGy for the spinal cord. Conclusion: Our method makes it possible to accurately assess the organ dose to patients for abdominal intervention with combined DSA and CT.

  13. Phonological coding during reading.

    PubMed

    Leinenger, Mallorie

    2014-11-01

    The exact role that phonological coding (the recoding of written, orthographic information into a sound based code) plays during silent reading has been extensively studied for more than a century. Despite the large body of research surrounding the topic, varying theories as to the time course and function of this recoding still exist. The present review synthesizes this body of research, addressing the topics of time course and function in tandem. The varying theories surrounding the function of phonological coding (e.g., that phonological codes aid lexical access, that phonological codes aid comprehension and bolster short-term memory, or that phonological codes are largely epiphenomenal in skilled readers) are first outlined, and the time courses that each maps onto (e.g., that phonological codes come online early [prelexical] or that phonological codes come online late [postlexical]) are discussed. Next the research relevant to each of these proposed functions is reviewed, discussing the varying methodologies that have been used to investigate phonological coding (e.g., response time methods, reading while eye-tracking or recording EEG and MEG, concurrent articulation) and highlighting the advantages and limitations of each with respect to the study of phonological coding. In response to the view that phonological coding is largely epiphenomenal in skilled readers, research on the use of phonological codes in prelingually, profoundly deaf readers is reviewed. Finally, implications for current models of word identification (activation-verification model, Van Orden, 1987; dual-route model, e.g., M. Coltheart, Rastle, Perry, Langdon, & Ziegler, 2001; parallel distributed processing model, Seidenberg & McClelland, 1989) are discussed. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  14. Phonological coding during reading

    PubMed Central

    Leinenger, Mallorie

    2014-01-01

    The exact role that phonological coding (the recoding of written, orthographic information into a sound based code) plays during silent reading has been extensively studied for more than a century. Despite the large body of research surrounding the topic, varying theories as to the time course and function of this recoding still exist. The present review synthesizes this body of research, addressing the topics of time course and function in tandem. The varying theories surrounding the function of phonological coding (e.g., that phonological codes aid lexical access, that phonological codes aid comprehension and bolster short-term memory, or that phonological codes are largely epiphenomenal in skilled readers) are first outlined, and the time courses that each maps onto (e.g., that phonological codes come online early (pre-lexical) or that phonological codes come online late (post-lexical)) are discussed. Next the research relevant to each of these proposed functions is reviewed, discussing the varying methodologies that have been used to investigate phonological coding (e.g., response time methods, reading while eyetracking or recording EEG and MEG, concurrent articulation) and highlighting the advantages and limitations of each with respect to the study of phonological coding. In response to the view that phonological coding is largely epiphenomenal in skilled readers, research on the use of phonological codes in prelingually, profoundly deaf readers is reviewed. Finally, implications for current models of word identification (activation-verification model (Van Order, 1987), dual-route model (e.g., Coltheart, Rastle, Perry, Langdon, & Ziegler, 2001), parallel distributed processing model (Seidenberg & McClelland, 1989)) are discussed. PMID:25150679

  15. Spectroscopy and photochemistry of humic acids

    NASA Astrophysics Data System (ADS)

    Sokolova, I. V.; Vershinin, N. O.; Skobczova, K. A.; Tchaikovskaya, O. N.; Mayer, G. V.

    2018-04-01

    Spectroscopy and photochemistry of humic acids are discussed. Samples of HA fractions were obtained from Fluka Chemical Co. and prepared from peat of the Western Siberia region. A comparative analysis of these acids with a sample of humic acids isolated from brown coal is carried out. A specific feature of the reactor is the use of a barrier-discharge excilamp (KrCl) with a radiation wavelength of λ = 222 nm. The influence of the obtained humic acids on the photodegradation of the herbicide 2,4-dichlorophenoxyacetic acid is considered.

  16. Robust Nonlinear Neural Codes

    NASA Astrophysics Data System (ADS)

    Yang, Qianli; Pitkow, Xaq

    2015-03-01

    Most interesting natural sensory stimuli are encoded in the brain in a form that can only be decoded nonlinearly. But despite being a core function of the brain, nonlinear population codes are rarely studied and poorly understood. Interestingly, the few existing models of nonlinear codes are inconsistent with known architectural features of the brain. In particular, these codes have information content that scales with the size of the cortical population, even if that violates the data processing inequality by exceeding the amount of information entering the sensory system. Here we provide a valid theory of nonlinear population codes by generalizing recent work on information-limiting correlations in linear population codes. Although these generalized, nonlinear information-limiting correlations bound the performance of any decoder, they also make decoding more robust to suboptimal computation, allowing many suboptimal decoders to achieve nearly the same efficiency as an optimal decoder. Although these correlations are extremely difficult to measure directly, particularly for nonlinear codes, we provide a simple, practical test by which one can use choice-related activity in small populations of neurons to determine whether decoding is suboptimal or optimal and limited by correlated noise. We conclude by describing an example computation in the vestibular system where this theory applies. QY and XP were supported by a grant from the McNair Foundation.

  17. Blind ICA detection based on second-order cone programming for MC-CDMA systems

    NASA Astrophysics Data System (ADS)

    Jen, Chih-Wei; Jou, Shyh-Jye

    2014-12-01

    The multicarrier code division multiple access (MC-CDMA) technique has received considerable interest for its potential application to future wireless communication systems due to its high data rate. A common problem regarding the blind multiuser detectors used in MC-CDMA systems is that they are extremely sensitive to the complex channel environment. Besides, the perturbation of colored noise may negatively affect the performance of the system. In this paper, a new coherent detection method will be proposed, which utilizes the modified fast independent component analysis (FastICA) algorithm, based on approximate negentropy maximization that is subject to the second-order cone programming (SOCP) constraint. The aim of the proposed coherent detection is to provide robustness against small-to-medium channel estimation mismatch (CEM) that may arise from channel frequency response estimation error in the MC-CDMA system, which is modulated by downlink binary phase-shift keying (BPSK) under colored noise. Noncoherent demodulation schemes are preferable to coherent demodulation schemes, as the latter are difficult to implement over time-varying fading channels. Differential phase-shift keying (DPSK) is therefore the natural choice for an alternative modulation scheme. Furthermore, the new blind differential SOCP-based ICA (SOCP-ICA) detection without channel estimation and compensation will be proposed to combat Doppler spread caused by time-varying fading channels in the DPSK-modulated MC-CDMA system under colored noise. In this paper, numerical simulations are used to illustrate the robustness of the proposed blind coherent SOCP-ICA detector against small-to-medium CEM and to emphasize the advantage of the blind differential SOCP-ICA detector in overcoming Doppler spread.
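
The fixed-point FastICA iteration at the heart of the proposed detector can be sketched for a toy two-user BPSK mixture. This shows only the negentropy-maximizing ICA step; the CDMA spreading, channel model, colored noise, and SOCP constraint of the paper are omitted, and the mixing matrix is an invented example.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two independent BPSK symbol streams, linearly mixed.
n = 5000
S = rng.choice([-1.0, 1.0], size=(2, n))   # source symbols
A = np.array([[0.9, 0.4], [0.3, 0.8]])     # unknown mixing matrix
X = A @ S                                   # observed mixtures

# Whiten the observations
Xc = X - X.mean(axis=1, keepdims=True)
cov = Xc @ Xc.T / n
d, E = np.linalg.eigh(cov)
Z = E @ np.diag(d ** -0.5) @ E.T @ Xc       # Z has identity covariance

# One-unit FastICA fixed-point iteration with the tanh nonlinearity,
# i.e. approximate negentropy maximization
w = rng.normal(size=2)
w /= np.linalg.norm(w)
for _ in range(200):
    g = np.tanh(Z.T @ w)                    # nonlinearity
    g_prime = 1.0 - g ** 2                  # its derivative
    w_new = Z @ g / n - g_prime.mean() * w  # fixed-point update
    w_new /= np.linalg.norm(w_new)
    converged = abs(abs(w_new @ w) - 1.0) < 1e-12
    w = w_new
    if converged:
        break

est = np.sign(w @ Z)  # recovered symbols, up to sign and user permutation
```

ICA recovers sources only up to sign and permutation, which is one reason the paper pairs the coherent detector with a differential (DPSK) variant that sidesteps channel estimation.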

  18. Sequence Characterization of the MC1R Gene in Yak (Poephagus grunniens) Breeds with Different Coat Colors

    PubMed Central

    Chen, Shi-Yi; Huang, Yi; Zhu, Qing; Fontanesi, Luca; Yao, Yong-Gang; Liu, Yi-Ping

    2009-01-01

    The melanocortin 1 receptor (MC1R) gene plays a key role in determining coat color in several species, including cattle. However, up to now there has been no report regarding the MC1R gene and the potential association of its mutations with coat colors in yak (Poephagus grunniens). In this study, we sequenced the coding region of the MC1R gene in three yak breeds with completely white (Tianzhu breed) or black coat color (Jiulong and Maiwa breeds). The predicted coding region of the yak MC1R gene was 954 bp long, the same as that of the wild-type cattle sequence, with >99% identity. None of the mutation events reported in cattle was found. Comparing the obtained yak sequences, five nucleotide substitutions were detected, which defined three haplotypes (EY1, EY2, and EY3). Of the five mutations, two, characterizing the EY1 haplotype, were nonsynonymous substitutions (c.340C>A and c.871G>A) causing amino acid changes located in the first extracellular loop (p.Q114K) and in the seventh transmembrane region (p.A291T). In silico prediction might indicate a functional effect of the latter substitution. However, all three haplotypes were present in the three yak breeds with relatively consistent frequency distribution, despite their distinct coat colors, which suggested that there was no across-breed association between haplotypes or genotypes and black/white phenotypes, at least in the investigated breeds. Other genes may be involved in affecting coat color in the analyzed yaks. PMID:19584942
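
The synonymous/nonsynonymous classification applied to the five substitutions reduces to translating the affected codon. The partial codon table and example codons below are inferred from the standard genetic code, not read from the actual yak MC1R sequence.

```python
# Partial standard genetic code: only the entries needed for this example.
CODON = {
    "CAA": "Q", "CAG": "Q",  # glutamine
    "AAA": "K", "AAG": "K",  # lysine
    "GCC": "A", "ACC": "T",  # alanine, threonine
    "CTG": "L", "CTA": "L",  # leucine
}

def classify(ref_codon, alt_codon):
    """Synonymous if the encoded amino acid is unchanged, else nonsynonymous."""
    return "synonymous" if CODON[ref_codon] == CODON[alt_codon] else "nonsynonymous"

# c.340C>A hits the first base of codon 114: Q -> K (p.Q114K)
q114k = classify("CAA", "AAA")
# c.871G>A hits the first base of codon 291: A -> T (p.A291T)
a291t = classify("GCC", "ACC")
# a third-position change within leucine codons would be synonymous
silent = classify("CTG", "CTA")
```

Note that c.340 = 3 × 113 + 1 lands on the first base of codon 114, consistent with the p.Q114K annotation, and likewise c.871 lands on the first base of codon 291.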

  19. SU-F-19A-05: Experimental and Monte Carlo Characterization of the 1 cm CivaString 103Pd Brachytherapy Source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reed, J; Micka, J; Culberson, W

    Purpose: To determine the in-air azimuthal anisotropy and in-water dose distribution for the 1 cm length of the CivaString 103Pd brachytherapy source through measurements and Monte Carlo (MC) simulations. American Association of Physicists in Medicine Task Group No. 43 (TG-43) dosimetry parameters were also determined for this source. Methods: The in-air azimuthal anisotropy of the source was measured with a NaI scintillation detector and simulated with the MCNP5 radiation transport code. Measured and simulated results were normalized to their respective mean values and compared. The TG-43 dose-rate constant, line-source radial dose function, and 2D anisotropy function for this source were determined from LiF:Mg,Ti thermoluminescent dosimeter (TLD) measurements and MC simulations. The impact of 103Pd well-loading variability on the in-water dose distribution was investigated using MC simulations by comparing the dose distribution for a source model with four wells of equal strength to that for a source model with strengths increased by 1% for two of the four wells. Results: NaI scintillation detector measurements and MC simulations of the in-air azimuthal anisotropy showed that ≥95% of the normalized data were within 1.2% of the mean value. TLD measurements and MC simulations of the TG-43 dose-rate constant, line-source radial dose function, and 2D anisotropy function agreed to within the experimental TLD uncertainties (k=2). MC simulations showed that a 1% variability in 103Pd well-loading resulted in changes of <0.1%, <0.1%, and <0.3% in the TG-43 dose-rate constant, radial dose distribution, and polar dose distribution, respectively. Conclusion: The CivaString source has a high degree of azimuthal symmetry as indicated by the NaI scintillation detector measurements and MC simulations of the in-air azimuthal anisotropy. TG-43 dosimetry parameters for this source were determined from TLD measurements and MC simulations. 103Pd well-loading variability results in minimal variations in the in-water dose distribution according to MC simulations. This work was partially supported by CivaTech Oncology, Inc. through an educational grant for Joshua Reed, John Micka, Wesley Culberson, and Larry DeWerd and through research support for Mark Rivard.
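
The TG-43 line-source formalism referenced throughout this record evaluates dose as a product of a geometry function, radial dose function, and anisotropy function. The sketch below uses hypothetical placeholder values for the dose-rate constant and g_L(r), and a flat F = 1, not measured CivaString data.

```python
import numpy as np

# TG-43 line-source sketch: dose rate = S_k * Lambda *
# G_L(r,theta)/G_L(1 cm, 90 deg) * g_L(r) * F(r,theta).
def G_L(r, theta_deg, L=1.0):
    """Line-source geometry function beta / (L * r * sin(theta)),
    with beta the angle the source subtends at the point."""
    th = np.radians(theta_deg)
    y = r * np.sin(th)
    beta = (np.arctan((L / 2 + r * np.cos(th)) / y)
            + np.arctan((L / 2 - r * np.cos(th)) / y))
    return beta / (L * r * np.sin(th))

def dose_rate(r, theta_deg, S_k=1.0, Lambda=0.68, L=1.0):
    # Placeholder dose-rate constant and radial dose function table
    g_r = np.interp(r, [0.5, 1.0, 2.0, 5.0], [1.1, 1.0, 0.75, 0.30])
    F = 1.0  # placeholder (isotropic) anisotropy function
    return S_k * Lambda * G_L(r, theta_deg, L) / G_L(1.0, 90.0, L) * g_r * F

# Far from the source the geometry function approaches the 1/r^2 point limit
far = G_L(100.0, 90.0) * 100.0 ** 2
```

At the TG-43 reference point (r = 1 cm, theta = 90 deg) all correction factors equal one, so the dose rate reduces to S_k times the dose-rate constant.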

  20. MC1R diversity in Northern Island Melanesia has not been constrained by strong purifying selection and cannot explain pigmentation phenotype variation in the region.

    PubMed

    Norton, Heather L; Werren, Elizabeth; Friedlaender, Jonathan

    2015-10-19

    Variation in human skin pigmentation evolved in response to the selective pressure of ultraviolet radiation (UVR). Selection to maintain darker skin in high-UVR environments is expected to constrain pigmentation phenotype and variation in pigmentation loci. Consistent with this hypothesis, the gene MC1R exhibits reduced diversity in African populations from high-UVR regions compared to low-UVR non-African populations. However, MC1R diversity in non-African populations that have evolved under high-UVR conditions is not well characterized. In order to test the hypothesis that MC1R variation has been constrained in Melanesians, the coding region of the MC1R gene was sequenced in 188 individuals from Northern Island Melanesia. The role of purifying selection was assessed using a modified McDonald-Kreitman test. Pairwise FST was calculated between Melanesian populations and populations from the 1000 Genomes Project. The SNP rs2228479 was genotyped in a larger sample (n = 635) of Melanesians and tested for associations with skin and hair pigmentation. We observe three nonsynonymous and two synonymous mutations. A modified McDonald-Kreitman test failed to detect a significant signal of purifying selection. Pairwise FST values calculated between the four islands sampled here indicate little regional substructure in MC1R. When compared to African, European, East and South Asian populations, Melanesians do not exhibit reduced population divergence (measured as FST) or a high proportion of haplotype sharing with Africans, as one might expect if ancestral haplotypes were conserved across high-UVR populations in and out of Africa. The only common nonsynonymous polymorphism observed, rs2228479, is not significantly associated with skin or hair pigmentation in a larger sample of Melanesians.
    The pattern of sequence diversity here does not support a model of strong selective constraint on MC1R in Northern Island Melanesia. This absence of strong constraint, as well as the recent population history of the region, may explain the observed frequencies of the derived rs2228479 allele. These results emphasize the complex genetic architecture of pigmentation phenotypes, which are controlled by multiple, possibly interacting loci. They also highlight the role that population history can play in influencing phenotypic diversity in the absence of strong natural selection.
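
The McDonald-Kreitman test mentioned above reduces to a 2x2 contingency comparison of nonsynonymous versus synonymous changes within and between species. The polymorphism row below uses the counts reported in the abstract (three nonsynonymous, two synonymous); the divergence row is an invented placeholder, since the abstract gives no such counts.

```python
from scipy.stats import fisher_exact

# McDonald-Kreitman 2x2 table: nonsynonymous vs synonymous changes,
# split into polymorphisms (within species) and fixed differences
# (between species). Divergence counts are illustrative placeholders.
table = [[3, 2],   # polymorphic: nonsynonymous, synonymous
         [4, 3]]   # fixed:       nonsynonymous, synonymous
odds_ratio, p_value = fisher_exact(table)
# Under neutrality the two ratios match; a significant excess of
# nonsynonymous divergence would suggest positive selection, while an
# excess of nonsynonymous polymorphism suggests weak purifying selection.
```

With counts this small and this balanced, Fisher's exact test cannot reject neutrality, mirroring the paper's failure to detect a significant signal of purifying selection.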
