Sample records for ray transport code

  1. Modification and benchmarking of MCNP for low-energy tungsten spectra.

    PubMed

    Mercier, J R; Kopp, D T; McDavid, W D; Dove, S B; Lancaster, J L; Tucker, D M

    2000-12-01

    The MCNP Monte Carlo radiation transport code was modified for diagnostic medical physics applications. In particular, the modified code was thoroughly benchmarked for the production of polychromatic tungsten x-ray spectra in the 30-150 kV range. Validating the modified code for coupled electron-photon transport with benchmark spectra was supplemented with independent electron-only and photon-only transport benchmarks. Major revisions to the code included the proper treatment of characteristic K x-ray production and scoring, new impact ionization cross sections, and new bremsstrahlung cross sections. Minor revisions included updated photon cross sections, electron-electron bremsstrahlung production, and K x-ray yield. The modified MCNP code is benchmarked against electron backscatter factors, x-ray spectra production, and primary and scatter photon transport.

  2. hybridMANTIS: a CPU-GPU Monte Carlo method for modeling indirect x-ray detectors with columnar scintillators

    NASA Astrophysics Data System (ADS)

    Sharma, Diksha; Badal, Andreu; Badano, Aldo

    2012-04-01

    The computational modeling of medical imaging systems often requires obtaining a large number of simulated images with low statistical uncertainty, which translates into prohibitive computing times. We describe a novel hybrid approach for Monte Carlo simulations that maximizes utilization of CPUs and GPUs in modern workstations. We apply the method to the modeling of indirect x-ray detectors using a new and improved version of the code MANTIS, an open source software tool used for the Monte Carlo simulations of indirect x-ray imagers. We first describe a GPU implementation of the physics and geometry models in fastDETECT2 (the optical transport model) and a serial CPU version of the same code. We discuss its new features, such as on-the-fly column geometry and columnar crosstalk, in relation to the MANTIS code, and point out areas where our model provides more flexibility for the modeling of realistic columnar structures in large area detectors. Second, we modify PENELOPE (the open source software package that handles the x-ray and electron transport in MANTIS) to allow direct output of the location and energy deposited during x-ray and electron interactions occurring within the scintillator. This information is then handled by the optical transport routines in fastDETECT2. A load balancer dynamically allocates optical transport showers to the GPU and CPU computing cores. Our hybridMANTIS approach achieves a significant speed-up factor of 627 when compared to MANTIS and of 35 when compared to the same code running only on a CPU instead of a GPU. Using hybridMANTIS, we successfully hide hours of optical transport time by running it in parallel with the x-ray and electron transport, thus shifting the computational bottleneck from optical to x-ray transport. The new code requires much less memory than MANTIS and, as a result, allows us to efficiently simulate large area detectors.
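The dynamic CPU-GPU allocation described above can be sketched with a shared work queue from which each device pulls the next optical shower as soon as it becomes free. This is a minimal illustrative stand-in (the function name, worker list, and thread-per-device model are hypothetical), not the actual hybridMANTIS load balancer:

```python
import queue
import threading

def balance_showers(n_showers, workers):
    """Dynamically allocate optical-transport showers: each worker
    (a stand-in for a GPU or CPU computing core) pulls the next
    shower from a shared queue whenever it becomes free."""
    tasks = queue.Queue()
    for shower_id in range(n_showers):
        tasks.put(shower_id)
    assigned = {name: [] for name in workers}

    def run(name):
        while True:
            try:
                shower = tasks.get_nowait()
            except queue.Empty:
                return  # queue drained, worker done
            # A real worker would transport optical photons here.
            assigned[name].append(shower)

    threads = [threading.Thread(target=run, args=(name,)) for name in workers]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return assigned
```

Every shower is processed exactly once, and a faster device naturally drains more of the queue; this pull-based scheme is one simple way to hide the optical transport time behind the x-ray and electron transport.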

  3. Should One Use the Ray-by-Ray Approximation in Core-Collapse Supernova Simulations?

    DOE PAGES

    Skinner, M. Aaron; Burrows, Adam; Dolence, Joshua C.

    2016-10-28

    We perform the first self-consistent, time-dependent, multi-group calculations in two dimensions (2D) to address the consequences of using the ray-by-ray+ transport simplification in core-collapse supernova simulations. Such a dimensional reduction is employed by many researchers to facilitate their resource-intensive calculations. Our new code (Fornax) implements multi-D transport, and can, by zeroing out transverse flux terms, emulate the ray-by-ray+ scheme. Using the same microphysics, initial models, resolution, and code, we compare the results of simulating 12-, 15-, 20-, and 25-M⊙ progenitor models using these two transport methods. Our findings call into question the wisdom of the pervasive use of the ray-by-ray+ approach. Employing it leads to maximum post-bounce/pre-explosion shock radii that are almost universally larger by tens of kilometers than those derived using the more accurate scheme, typically leaving the post-bounce matter less bound and artificially more “explodable.” In fact, for our 25-M⊙ progenitor, the ray-by-ray+ model explodes, while the corresponding multi-D transport model does not. Therefore, in two dimensions the combination of ray-by-ray+ with the axial sloshing hydrodynamics that is a feature of 2D supernova dynamics can result in quantitatively, and perhaps qualitatively, incorrect results.

  4. Should One Use the Ray-by-Ray Approximation in Core-collapse Supernova Simulations?

    NASA Astrophysics Data System (ADS)

    Skinner, M. Aaron; Burrows, Adam; Dolence, Joshua C.

    2016-11-01

    We perform the first self-consistent, time-dependent, multi-group calculations in two dimensions (2D) to address the consequences of using the ray-by-ray+ transport simplification in core-collapse supernova simulations. Such a dimensional reduction is employed by many researchers to facilitate their resource-intensive calculations. Our new code (Fornax) implements multi-D transport, and can, by zeroing out transverse flux terms, emulate the ray-by-ray+ scheme. Using the same microphysics, initial models, resolution, and code, we compare the results of simulating 12, 15, 20, and 25 M⊙ progenitor models using these two transport methods. Our findings call into question the wisdom of the pervasive use of the ray-by-ray+ approach. Employing it leads to maximum post-bounce/pre-explosion shock radii that are almost universally larger by tens of kilometers than those derived using the more accurate scheme, typically leaving the post-bounce matter less bound and artificially more “explodable.” In fact, for our 25 M⊙ progenitor, the ray-by-ray+ model explodes, while the corresponding multi-D transport model does not. Therefore, in two dimensions, the combination of ray-by-ray+ with the axial sloshing hydrodynamics that is a feature of 2D supernova dynamics can result in quantitatively, and perhaps qualitatively, incorrect results.

  5. Intercomparison of Monte Carlo radiation transport codes to model TEPC response in low-energy neutron and gamma-ray fields.

    PubMed

    Ali, F; Waker, A J; Waller, E J

    2014-10-01

    Tissue-equivalent proportional counters (TEPC) can potentially be used as portable and personal dosemeters in mixed neutron and gamma-ray fields, but what hinders this use is their typically large physical size. To formulate compact TEPC designs, the use of a Monte Carlo transport code is necessary to predict the performance of compact designs in these fields. To perform this modelling, three candidate codes were assessed: MCNPX 2.7.E, FLUKA 2011.2 and PHITS 2.24. In each code, benchmark simulations were performed involving the irradiation of a 5-in. TEPC with monoenergetic neutron fields and a 4-in. wall-less TEPC with monoenergetic gamma-ray fields. The frequency and dose mean lineal energies and dose distributions calculated from each code were compared with experimentally determined data. For the neutron benchmark simulations, PHITS produces data closest to the experimental values; for the gamma-ray benchmark simulations, FLUKA yields data closest to the experimentally determined quantities.
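The frequency- and dose-mean lineal energies compared above follow from a measured single-event spectrum f(y) by the standard microdosimetric definitions y_F = Σ y f / Σ f and y_D = Σ y² f / Σ y f. A minimal sketch (function name hypothetical):

```python
def lineal_energy_means(y, f):
    """Frequency-mean (y_F) and dose-mean (y_D) lineal energies from a
    sampled single-event spectrum f(y); normalization of f is arbitrary."""
    norm = sum(f)
    y_frequency = sum(yi * fi for yi, fi in zip(y, f)) / norm            # y_F
    y_dose = sum(yi**2 * fi for yi, fi in zip(y, f)) / (y_frequency * norm)  # y_D
    return y_frequency, y_dose
```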

  6. Solar Proton Transport Within an ICRU Sphere Surrounded by a Complex Shield: Ray-trace Geometry

    NASA Technical Reports Server (NTRS)

    Slaba, Tony C.; Wilson, John W.; Badavi, Francis F.; Reddell, Brandon D.; Bahadori, Amir A.

    2015-01-01

    A computationally efficient 3DHZETRN code with enhanced neutron and light ion (Z is less than or equal to 2) propagation was recently developed for complex, inhomogeneous shield geometry described by combinatorial objects. Comparisons were made between 3DHZETRN results and Monte Carlo (MC) simulations at locations within the combinatorial geometry, and it was shown that 3DHZETRN agrees with the MC codes to the extent they agree with each other. In the present report, the 3DHZETRN code is extended to enable analysis in ray-trace geometry. This latest extension enables the code to be used within current engineering design practices utilizing fully detailed vehicle and habitat geometries. Through convergence testing, it is shown that fidelity in an actual shield geometry can be maintained in the discrete ray-trace description by systematically increasing the number of discrete rays used. It is also shown that this fidelity is carried into transport procedures and resulting exposure quantities without sacrificing computational efficiency.
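The convergence testing described above, systematically increasing the number of discrete rays until an exposure-related quantity stabilizes, can be sketched as follows. The 2D ray fan and the direction-dependent thickness function are simplified stand-ins for a full 3D ray trace of a vehicle geometry:

```python
import math

def converge_ray_count(thickness_fn, tol=1e-3, n0=8, n_max=4096):
    """Double the number of discrete rays until a ray-averaged quantity
    (here, mean shield thickness over direction) changes by less than tol."""
    def average(n):
        # Uniform azimuthal fan of n rays in a plane.
        return sum(thickness_fn(2 * math.pi * i / n) for i in range(n)) / n

    n, prev = n0, None
    while n <= n_max:
        cur = average(n)
        if prev is not None and abs(cur - prev) <= tol * abs(prev):
            return n, cur  # converged at this ray count
        prev, n = cur, 2 * n
    return n // 2, prev  # best estimate if not converged
```

For example, a shield that is 3 g/cm^2 thicker on one side, `lambda phi: 10 + 3 * math.cos(phi)`, averages to 10 g/cm^2 once enough rays sample the asymmetry.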

  7. SHOULD ONE USE THE RAY-BY-RAY APPROXIMATION IN CORE-COLLAPSE SUPERNOVA SIMULATIONS?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skinner, M. Aaron; Burrows, Adam; Dolence, Joshua C.

    2016-11-01

    We perform the first self-consistent, time-dependent, multi-group calculations in two dimensions (2D) to address the consequences of using the ray-by-ray+ transport simplification in core-collapse supernova simulations. Such a dimensional reduction is employed by many researchers to facilitate their resource-intensive calculations. Our new code (Fornax) implements multi-D transport, and can, by zeroing out transverse flux terms, emulate the ray-by-ray+ scheme. Using the same microphysics, initial models, resolution, and code, we compare the results of simulating 12, 15, 20, and 25 M⊙ progenitor models using these two transport methods. Our findings call into question the wisdom of the pervasive use of the ray-by-ray+ approach. Employing it leads to maximum post-bounce/pre-explosion shock radii that are almost universally larger by tens of kilometers than those derived using the more accurate scheme, typically leaving the post-bounce matter less bound and artificially more “explodable.” In fact, for our 25 M⊙ progenitor, the ray-by-ray+ model explodes, while the corresponding multi-D transport model does not. Therefore, in two dimensions, the combination of ray-by-ray+ with the axial sloshing hydrodynamics that is a feature of 2D supernova dynamics can result in quantitatively, and perhaps qualitatively, incorrect results.

  8. Modelling of an Orthovoltage X-ray Therapy Unit with the EGSnrc Monte Carlo Package

    NASA Astrophysics Data System (ADS)

    Knöös, Tommy; Rosenschöld, Per Munck Af; Wieslander, Elinore

    2007-06-01

    Simulations with the EGSnrc code package of an orthovoltage x-ray machine have been performed. The BEAMnrc code was used to transport electrons, produce x-ray photons in the target, and transport these photons through the treatment machine down to the exit level of the applicator. Further transport in water or CT-based phantoms was facilitated by the DOSXYZnrc code. Phase space files were scored with BEAMnrc and analysed regarding the energy spectra at the end of the applicator. Tuning of simulation parameters was based on the half-value layer quantity for the beams in either Al or Cu. Calculated depth dose and profile curves have been compared against measurements and show good agreement except at shallow depths. The MC model tested in this study can be used for various dosimetric studies as well as for generating a library of typical treatment cases that can serve as both educational material and guidance in clinical practice.

  9. CEM2k and LAQGSM Codes as Event-Generators for Space Radiation Shield and Cosmic Rays Propagation Applications

    NASA Technical Reports Server (NTRS)

    Mashnik, S. G.; Gudima, K. K.; Sierk, A. J.; Moskalenko, I. V.

    2002-01-01

    Space radiation shield applications and studies of cosmic ray propagation in the Galaxy require reliable cross sections to calculate spectra of secondary particles and yields of the isotopes produced in nuclear reactions induced both by particles and nuclei at energies from threshold to hundreds of GeV per nucleon. Since the data often exist in a very limited energy range or sometimes not at all, the only way to obtain an estimate of the production cross sections is to use theoretical models and codes. Recently, we have developed improved versions of the Cascade-Exciton Model (CEM) of nuclear reactions: the codes CEM97 and CEM2k for the description of particle-nucleus reactions at energies up to about 5 GeV. In addition, we have developed a LANL version of the Quark-Gluon String Model (LAQGSM) to describe reactions induced both by particles and nuclei at energies up to hundreds of GeV/nucleon. We have tested and benchmarked the CEM and LAQGSM codes against a large variety of experimental data and have compared their results with predictions by other currently available models and codes. Our benchmarks show that the CEM and LAQGSM codes have predictive power no worse than other currently used codes and describe many reactions better; therefore, both our codes can be used as reliable event-generators for space radiation shield and cosmic ray propagation applications. The CEM2k code is being incorporated into the transport code MCNPX (and several other transport codes), and we plan to incorporate LAQGSM into MCNPX in the near future. Here, we present the current status of the CEM2k and LAQGSM codes, and show results and applications to studies of cosmic ray propagation in the Galaxy.

  10. MMAPDNG: A new, fast code backed by a memory-mapped database for simulating delayed γ-ray emission with MCNPX package

    NASA Astrophysics Data System (ADS)

    Lou, Tak Pui; Ludewigt, Bernhard

    2015-09-01

    The simulation of the emission of beta-delayed gamma rays following nuclear fission and the calculation of time-dependent energy spectra is a computational challenge. The widely used radiation transport code MCNPX includes a delayed gamma-ray routine that is inefficient and not suitable for simulating complex problems. This paper describes the code "MMAPDNG" (Memory-Mapped Delayed Neutron and Gamma), an optimized delayed gamma module written in C, discusses usage and merits of the code, and presents results. The approach is based on storing the required Fission Product Yield (FPY) data, decay data, and delayed particle data in a memory-mapped file. When compared to the original delayed gamma-ray code in MCNPX, memory utilization is reduced by two orders of magnitude and the ray sampling is sped up by three orders of magnitude. Other delayed particles such as neutrons and electrons can be implemented in future versions of the MMAPDNG code using its existing framework.
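The core idea, storing the nuclear data in a memory-mapped file so that a lookup touches only the bytes it needs rather than parsing the whole database, can be illustrated with a toy fixed-size-record store. The record layout and file name here are hypothetical, not the MMAPDNG format:

```python
import mmap
import os
import struct
import tempfile

# One fixed-size record: (record index: int32, gamma energy in MeV: float64)
REC = struct.Struct("<id")

def write_db(path, energies):
    """Write a flat binary database of fixed-size records."""
    with open(path, "wb") as f:
        for i, e in enumerate(energies):
            f.write(REC.pack(i, e))

def lookup(path, index):
    """O(1) random access via mmap: the OS pages in only the bytes
    touched, instead of loading and parsing the whole file."""
    with open(path, "rb") as f, \
         mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as m:
        _, energy = REC.unpack_from(m, index * REC.size)
        return energy

path = os.path.join(tempfile.gettempdir(), "demo_gamma.db")
write_db(path, [0.511, 0.662, 1.173, 1.332])
# lookup(path, 3) -> 1.332
```

Because every record has the same size, the byte offset of record *i* is simply `i * REC.size`; no index structure is needed.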

  11. Approximated transport-of-intensity equation for coded-aperture x-ray phase-contrast imaging.

    PubMed

    Das, Mini; Liang, Zhihua

    2014-09-15

    Transport-of-intensity equations (TIEs) allow better understanding of image formation and assist in simplifying the "phase problem" associated with phase-sensitive x-ray measurements. In this Letter, we present for the first time to our knowledge a simplified form of TIE that models x-ray differential phase-contrast (DPC) imaging with coded-aperture (CA) geometry. The validity of our approximation is demonstrated through comparison with an exact TIE in numerical simulations. The relative contributions of absorption, phase, and differential phase to the acquired phase-sensitive intensity images are made readily apparent with the approximate TIE, which may prove useful for solving the inverse phase-retrieval problem associated with these CA-geometry-based DPC systems.

  12. Comparison of space radiation calculations for deterministic and Monte Carlo transport codes

    NASA Astrophysics Data System (ADS)

    Lin, Zi-Wei; Adams, James; Barghouty, Abdulnasser; Randeniya, Sharmalee; Tripathi, Ram; Watts, John; Yepes, Pablo

    For space radiation protection of astronauts or electronic equipment, it is necessary to develop and use accurate radiation transport codes. Radiation transport codes include deterministic codes, such as HZETRN from NASA and UPROP from the Naval Research Laboratory, and Monte Carlo codes such as FLUKA, the Geant4 toolkit and HETC-HEDS. The deterministic codes and Monte Carlo codes complement each other in that deterministic codes are very fast while Monte Carlo codes are more elaborate. Therefore, it is important to investigate how well the results of deterministic codes compare with those of Monte Carlo transport codes and where they differ. In this study we evaluate these different codes in their space radiation applications by comparing their output results in the same given space radiation environments, shielding geometry and material. Typical space radiation environments such as the 1977 solar minimum galactic cosmic ray environment are used as the well-defined input, and simple geometries made of aluminum, water and/or polyethylene are used to represent the shielding material. We then compare various outputs of these codes, such as the dose-depth curves and the flux spectra of different fragments and other secondary particles. These comparisons enable us to learn more about the main differences between these space radiation transport codes. At the same time, they help us to learn the qualitative and quantitative features that these transport codes have in common.

  13. Simulation the spatial resolution of an X-ray imager based on zinc oxide nanowires in anodic aluminium oxide membrane by using MCNP and OPTICS Codes

    NASA Astrophysics Data System (ADS)

    Samarin, S. N.; Saramad, S.

    2018-05-01

    The spatial resolution of a detector is a very important parameter for x-ray imaging. A bulk scintillation detector does not have good spatial resolution because of the spreading of light inside the scintillator. Nanowire scintillators, because of their waveguiding behavior, can prevent the spreading of light and can improve the spatial resolution of traditional scintillation detectors. The zinc oxide (ZnO) scintillator nanowire, with its simple construction by electrochemical deposition in the regular hexagonal structure of an anodic aluminium oxide membrane, has many advantages. The three-dimensional absorption of x-ray energy in the ZnO scintillator is simulated by a Monte Carlo transport code (MCNP). The transport, attenuation and scattering of the generated photons are simulated by a general-purpose scintillator light response simulation code (OPTICS). The results are compared with a previous publication which used a simulation code of the passage of particles through matter (Geant4). The results verify that this scintillator nanowire structure has a spatial resolution of less than one micrometer.

  14. Design Considerations of a Virtual Laboratory for Advanced X-ray Sources

    NASA Astrophysics Data System (ADS)

    Luginsland, J. W.; Frese, M. H.; Frese, S. D.; Watrous, J. J.; Heileman, G. L.

    2004-11-01

    The field of scientific computation has greatly advanced in the last few years, resulting in the ability to perform complex computer simulations that can predict the performance of real-world experiments in a number of fields of study. Among the forces driving this new computational capability is the advent of parallel algorithms, allowing calculations in three-dimensional space with realistic time scales. Electromagnetic radiation sources driven by high-voltage, high-current electron beams offer an area to further push the state-of-the-art in high fidelity, first-principles simulation tools. The physics of these x-ray sources combines kinetic plasma physics (electron beams) with dense fluid-like plasma physics (anode plasmas) and x-ray generation (bremsstrahlung). There are a number of mature techniques and software packages for dealing with the individual aspects of these sources, such as Particle-In-Cell (PIC), Magneto-Hydrodynamics (MHD), and radiation transport codes. The current effort is focused on developing an object-oriented software environment using the Rational Unified Process and the Unified Modeling Language (UML) to provide a framework where multiple 3D parallel physics packages, such as a PIC code (ICEPIC), an MHD code (MACH), and an x-ray transport code (ITS), can co-exist in a system-of-systems approach to modeling advanced x-ray sources. Initial software design and assessments of the various physics algorithms' fidelity will be presented.

  15. Computation of Cosmic Ray Ionization and Dose at Mars: a Comparison of HZETRN and Planetocosmics for Proton and Alpha Particles

    NASA Technical Reports Server (NTRS)

    Gronoff, Guillaume; Norman, Ryan B.; Mertens, Christopher J.

    2014-01-01

    The ability to evaluate the cosmic ray environment at Mars is of interest for future manned exploration. To support exploration, tools must be developed to accurately assess the radiation environment in both free space and on planetary surfaces. The primary tool NASA uses to quantify radiation exposure behind shielding materials is the space radiation transport code HZETRN. In order to build confidence in HZETRN, code benchmarking against Monte Carlo radiation transport codes is often used. This work compares the dose calculations at Mars by HZETRN and the Geant4 application Planetocosmics. The dose at ground level and the energy deposited in the atmosphere by galactic cosmic ray protons and alpha particles have been calculated for the Curiosity landing conditions. In addition, this work has considered Solar Energetic Particle events, allowing for the comparison of varying input radiation environments. The results for protons and alpha particles show very good agreement between HZETRN and Planetocosmics.

  16. Radiation exposure for manned Mars surface missions

    NASA Technical Reports Server (NTRS)

    Simonsen, Lisa C.; Nealy, John E.; Townsend, Lawrence W.; Wilson, John W.

    1990-01-01

    The Langley cosmic ray transport code and the Langley nucleon transport code (BRYNTRN) are used to quantify the transport and attenuation of galactic cosmic rays (GCR) and solar proton flares through the Martian atmosphere. Surface doses are estimated using both a low density and a high density carbon dioxide model of the atmosphere which, in the vertical direction, provides a total of 16 g/sq cm and 22 g/sq cm of protection, respectively. At the Mars surface during the solar minimum cycle, a blood-forming organ (BFO) dose equivalent of 10.5 to 12 rem/yr due to galactic cosmic ray transport and attenuation is calculated. Estimates of the BFO dose equivalents which would have been incurred from the three large solar flare events of August 1972, November 1960, and February 1956 are also calculated at the surface. Results indicate surface BFO dose equivalents of approximately 2 to 5, 5 to 7, and 8 to 10 rem per event, respectively. Doses are also estimated at altitudes up to 12 km above the Martian surface where the atmosphere will provide less total protection.

  17. Space radiation dose estimates on the surface of Mars

    NASA Technical Reports Server (NTRS)

    Simonsen, Lisa C.; Nealy, John E.; Townsend, Lawrence W.; Wilson, John W.

    1990-01-01

    The Langley cosmic ray transport code and the Langley nucleon transport code (BRYNTRN) are used to quantify the transport and attenuation of galactic cosmic rays (GCR) and solar proton flares through the Martian atmosphere. Surface doses are estimated using both a low density and a high density carbon dioxide model of the atmosphere which, in the vertical direction, provides a total of 16 g/sq cm and 22 g/sq cm of protection, respectively. At the Mars surface during the solar minimum cycle, a blood-forming organ (BFO) dose equivalent of 10.5 to 12 rem/yr due to galactic cosmic ray transport and attenuation is calculated. Estimates of the BFO dose equivalents which would have been incurred from the three large solar flare events of August 1972, November 1960, and February 1956 are also calculated at the surface. Results indicate surface BFO dose equivalents of approximately 2 to 5, 5 to 7, and 8 to 10 rem per event, respectively. Doses are also estimated at altitudes up to 12 km above the Martian surface where the atmosphere will provide less total protection.

  18. Estimates of galactic cosmic ray shielding requirements during solar minimum

    NASA Technical Reports Server (NTRS)

    Townsend, Lawrence W.; Nealy, John E.; Wilson, John W.; Simonsen, Lisa C.

    1990-01-01

    Estimates of radiation risk from galactic cosmic rays are presented for manned interplanetary missions. The calculations use the Naval Research Laboratory cosmic ray spectrum model as input into the Langley Research Center galactic cosmic ray transport code. This transport code, which transports both heavy ions and nucleons, can be used with any number of layers of target material, consisting of up to five different arbitrary constituents per layer. Calculated galactic cosmic ray fluxes, dose and dose equivalents behind various thicknesses of aluminum, water and liquid hydrogen shielding are presented for the solar minimum period. Estimates of risk to the skin and the blood-forming organs (BFO) are made using 0-cm and 5-cm depth dose/dose equivalent values, respectively, for water. These results indicate that at least 3.5 g/sq cm (3.5 cm) of water, or 6.5 g/sq cm (2.4 cm) of aluminum, or 1.0 g/sq cm (14 cm) of liquid hydrogen shielding is required to reduce the annual exposure below the currently recommended BFO limit of 0.5 Sv. Because of large uncertainties in fragmentation parameters and the input cosmic ray spectrum, these exposure estimates may be uncertain by as much as a factor of 2 or more. The effects of these potential exposure uncertainties on shield thickness requirements are analyzed.
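The paired areal-density and thickness figures quoted above are related by division by mass density; as a quick check (densities assumed: water 1.00, aluminum 2.70, liquid hydrogen ~0.07 g/cm^3):

```python
def shield_thickness_cm(areal_density_g_cm2, density_g_cm3):
    """Physical thickness (cm) equivalent to an areal density (g/cm^2)."""
    return areal_density_g_cm2 / density_g_cm3

water_cm = shield_thickness_cm(3.5, 1.00)     # 3.5 cm
aluminum_cm = shield_thickness_cm(6.5, 2.70)  # ~2.4 cm
hydrogen_cm = shield_thickness_cm(1.0, 0.07)  # ~14 cm
```

The same areal density of hydrogen stops radiation with far less mass, which is why liquid hydrogen needs only 1.0 g/sq cm where aluminum needs 6.5 g/sq cm.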

  19. Transport of cosmic ray nuclei in various materials

    NASA Technical Reports Server (NTRS)

    Silberberg, R.; Tsao, C. H.; Letaw, J. R.

    1988-01-01

    Cosmic-ray heavy ions have become a concern in space radiation effects analyses. Heavy ions rapidly deposit energy and create dense ionization trails as they traverse materials. Collection of the free charge disrupts the operation of microelectronic circuits. This effect, called the single-event upset, can cause a loss of digital data. Passage of high linear energy transfer particles through the eyes has been observed by Apollo astronauts. These heavy ions have great radiobiological effectiveness and are the primary risk factor for leukemia induction on a manned Mars mission. Models of the transport of heavy cosmic-ray nuclei through materials depend heavily on our understanding of the cosmic-ray environment, nuclear spallation cross sections, and computer transport codes. Our group has initiated and pursued the development of a full capability for modeling these transport processes. A recent review of this ongoing effort is presented in Ref. 5. In this paper, we discuss transport methods and present new results comparing the attenuation of cosmic rays in various materials.

  20. IPOLE - semi-analytic scheme for relativistic polarized radiative transport

    NASA Astrophysics Data System (ADS)

    Mościbrodzka, M.; Gammie, C. F.

    2018-03-01

    We describe IPOLE, a new public ray-tracing code for covariant, polarized radiative transport. The code extends the IBOTHROS scheme for covariant, unpolarized transport using two representations of the polarized radiation field: In the coordinate frame, it parallel transports the coherency tensor; in the frame of the plasma it evolves the Stokes parameters under emission, absorption, and Faraday conversion. The transport step is implemented to be as spacetime- and coordinate-independent as possible. The emission, absorption, and Faraday conversion step is implemented using an analytic solution to the polarized transport equation with constant coefficients. As a result, IPOLE is stable, efficient, and produces a physically reasonable solution even for a step with high optical depth and Faraday depth. We show that the code matches analytic results in flat space, and that it produces results that converge to those produced by Dexter's GRTRANS polarized transport code on a complicated model problem. We expect IPOLE will mainly find applications in modelling Event Horizon Telescope sources, but it may also be useful in other relativistic transport problems such as modelling for the IXPE mission.
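The analytic constant-coefficient step that keeps IPOLE stable at high optical depth has a simple unpolarized analogue: over a step with constant emissivity j and absorption alpha, the intensity relaxes exponentially toward the source function S = j/alpha. A scalar sketch (the actual code evolves the full Stokes vector with Faraday terms):

```python
import math

def transfer_step(I0, j, alpha, ds):
    """Exact update of specific intensity over one step of length ds
    with constant emissivity j and absorption coefficient alpha
    (scalar, unpolarized analogue of the constant-coefficient step)."""
    if alpha == 0.0:
        return I0 + j * ds          # pure emission
    S = j / alpha                   # source function
    dtau = alpha * ds               # optical depth of the step
    return S + (I0 - S) * math.exp(-dtau)
```

Because the exponential is evaluated exactly, the update never overshoots: even for an enormous optical depth the result simply saturates at S, which is the stability property noted above.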

  21. Space Radiation Transport Codes: A Comparative Study for Galactic Cosmic Rays Environment

    NASA Astrophysics Data System (ADS)

    Tripathi, Ram; Wilson, John W.; Townsend, Lawrence W.; Gabriel, Tony; Pinsky, Lawrence S.; Slaba, Tony

    For long duration and/or deep space human missions, protection from severe space radiation exposure is a challenging design constraint and may be a limiting factor. The space radiation environment consists of galactic cosmic rays (GCR), solar particle events (SPE), trapped radiation, and includes ions of all the known elements over a very broad energy range. These ions penetrate spacecraft materials producing nuclear fragments and secondary particles that damage biological tissues, microelectronic devices, and materials. In deep space missions, where the Earth's magnetic field does not provide protection from space radiation, the GCR environment is significantly enhanced due to the absence of geomagnetic cut-off and is a major component of radiation exposure. Accurate risk assessments critically depend on the accuracy of the input information as well as on the radiation transport codes used, and so systematic verification of codes is necessary. In this study, comparisons are made between the deterministic code HZETRN2006 and the Monte Carlo codes HETC-HEDS and FLUKA for an aluminum shield followed by a water target exposed to the 1977 solar minimum GCR spectrum. Interaction and transport of the high charge ions present in the GCR radiation environment provide a more stringent constraint in the comparison of the codes. Dose, dose equivalent and flux spectra are compared; details of the comparisons will be discussed, and conclusions will be drawn for future directions.

  22. Common Errors in the Calculation of Aircrew Doses from Cosmic Rays

    NASA Astrophysics Data System (ADS)

    O'Brien, Keran; Felsberger, Ernst; Kindl, Peter

    2010-05-01

    Radiation doses to air crew are calculated using flight codes. Flight codes integrate dose rates, calculated by transport codes or obtained from measurements, over the aircraft flight path from takeoff at one airport to landing at another. The dose rates are stored in various ways, such as by latitude and longitude, or in terms of the geomagnetic vertical cutoff. The transport codes are generally quite satisfactory, but the treatment of the boundary conditions is frequently incorrect. Both the treatment of solar modulation and of the effect of the geomagnetic field are often defective, leading to systematic overestimates of the crew doses.
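The integration a flight code performs can be sketched as a trapezoid-rule sum of the dose rate over waypoints along the route; the function name and the sample numbers below are illustrative, not from any particular flight code:

```python
def route_dose(times_h, rates_uSv_per_h):
    """Trapezoid-rule integral of a dose-rate profile along a flight,
    from takeoff (first waypoint) to landing (last waypoint)."""
    total = 0.0
    for i in range(1, len(times_h)):
        dt = times_h[i] - times_h[i - 1]
        total += 0.5 * (rates_uSv_per_h[i] + rates_uSv_per_h[i - 1]) * dt
    return total  # accumulated dose in microsieverts

# e.g. an 8 h flight whose dose rate peaks at high geomagnetic latitude
dose = route_dose([0.0, 4.0, 8.0], [4.0, 6.0, 4.0])  # 40.0 uSv
```

The errors discussed above enter through the stored rate tables, not this integration: if solar modulation or the geomagnetic cutoff is mistreated, every rate sample is biased and the integrated dose is systematically overestimated.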

  23. Neutron Capture Gamma-Ray Libraries for Nuclear Applications

    NASA Astrophysics Data System (ADS)

    Sleaford, B. W.; Firestone, R. B.; Summers, N.; Escher, J.; Hurst, A.; Krticka, M.; Basunia, S.; Molnar, G.; Belgya, T.; Revay, Z.; Choi, H. D.

    2011-06-01

    The neutron capture reaction is useful in identifying and analyzing the gamma-ray spectrum from an unknown assembly, as it gives unambiguous information on its composition. This can be done passively, or actively where an external neutron source is used to probe an unknown assembly. There are known capture gamma-ray data gaps in the ENDF libraries used by transport codes for various nuclear applications. The Evaluated Gamma-ray Activation File (EGAF) is a new thermal neutron capture database of discrete line spectra and cross sections for over 260 isotopes that was developed as part of an IAEA Coordinated Research Project. EGAF is being used to improve the capture gamma production in ENDF libraries. For medium to heavy nuclei the quasi-continuum contribution to the gamma cascades is not experimentally resolved. The continuum contains up to 90% of all the decay energy and is modeled here with the statistical nuclear structure code DICEBOX. This code also provides a consistency check of the level scheme nuclear structure evaluation. The calculated continuum is of sufficient accuracy to include in the ENDF libraries. This analysis also determines new total thermal capture cross sections and provides an improved RIPL database. For higher-energy neutron capture there is less experimental data available, making benchmarking of the modeling codes more difficult. We are investigating the capture spectra from higher-energy neutrons experimentally using surrogate reactions and modeling this with Hauser-Feshbach codes. This can then be used to benchmark CASINO, a version of DICEBOX modified for neutron capture at higher energy, which can simulate spectra from neutron capture at incident neutron energies up to 20 MeV to improve the gamma-ray spectrum in neutron data libraries used for transport modeling of unknown assemblies.

  4. Galactic cosmic ray transport methods and radiation quality issues

    NASA Technical Reports Server (NTRS)

    Townsend, L. W.; Wilson, J. W.; Cucinotta, F. A.; Shinn, J. L.

    1992-01-01

    An overview of galactic cosmic ray (GCR) interaction and transport methods, as implemented in the Langley Research Center GCR transport code, is presented. Representative results for solar minimum, exo-magnetospheric GCR dose equivalents in water are presented on a component-by-component basis for various thicknesses of aluminum shielding. The impact of proposed changes to the currently used quality factors on exposure estimates and shielding requirements is quantified. Using the cellular track model of Katz, estimates of relative biological effectiveness (RBE) for the mixed GCR radiation fields are also made.
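
    The quality-factor changes at issue here can be made concrete with the LET-dependent relationship Q(L) later adopted in ICRP Publication 60. A minimal sketch (the piecewise form below is the published ICRP 60 convention; its application to a single-LET dose is illustrative only):

```python
import math

def quality_factor(let_kev_um):
    """ICRP 60 quality factor as a function of unrestricted LET (keV/um)."""
    if let_kev_um < 10.0:
        return 1.0
    if let_kev_um <= 100.0:
        return 0.32 * let_kev_um - 2.2
    return 300.0 / math.sqrt(let_kev_um)

def dose_equivalent(dose_gy, let_kev_um):
    """Dose equivalent H = Q * D (Sv) for a dose D delivered at one LET."""
    return quality_factor(let_kev_um) * dose_gy
```

For example, 1 mGy delivered at 50 keV/um carries a quality factor of 0.32 x 50 - 2.2 = 13.8, so the dose equivalent is 13.8 mSv; below 10 keV/um the factor is unity and dose and dose equivalent coincide.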

  5. Comparison of Transport Codes, HZETRN, HETC and FLUKA, Using 1977 GCR Solar Minimum Spectra

    NASA Technical Reports Server (NTRS)

    Heinbockel, John H.; Slaba, Tony C.; Tripathi, Ram K.; Blattnig, Steve R.; Norbury, John W.; Badavi, Francis F.; Townsend, Lawrence W.; Handler, Thomas; Gabriel, Tony A.; Pinsky, Lawrence S.; ...

    2009-01-01

    The HZETRN deterministic radiation transport code is one of several tools developed to analyze the effects of harmful galactic cosmic rays (GCR) and solar particle events (SPE) on mission planning, astronaut shielding and instrumentation. This paper is a comparison study involving the two Monte Carlo transport codes, HETC-HEDS and FLUKA, and the deterministic transport code, HZETRN. Each code is used to transport ions from the 1977 solar minimum GCR spectrum impinging upon a 20 g/cm2 aluminum slab followed by a 30 g/cm2 water slab. This research is part of a systematic effort of verification and validation to quantify the accuracy of HZETRN and determine areas where it can be improved. Comparisons of dose and dose equivalent values at various depths in the water slab are presented in this report. This is followed by a comparison of the proton fluxes and the forward, backward and total neutron fluxes at various depths in the water slab. Comparisons of the secondary light ion 2H, 3H, 3He and 4He fluxes are also examined.

  6. Meson Production and Space Radiation

    NASA Astrophysics Data System (ADS)

    Norbury, John; Blattnig, Steve; Norman, Ryan; Aghara, Sukesh

    Protecting astronauts from the harmful effects of space radiation is an important priority for long duration space flight. The National Council on Radiation Protection (NCRP) has recently recommended that pion and other mesons should be included in space radiation transport codes, especially in connection with the Martian atmosphere. In an interesting accident of nature, the galactic cosmic ray spectrum has its peak intensity near the pion production threshold. The Boltzmann transport equation is structured in such a way that particle production cross sections are multiplied by particle flux. Therefore, the peak of the incident flux of the galactic cosmic ray spectrum is more important than other regions of the spectrum and cross sections near the peak are enhanced. This happens with pion cross sections. The MCNPX Monte Carlo transport code now has the capability of transporting heavy ions, and by using a galactic cosmic ray spectrum as input, recent work has shown that pions contribute about twenty percent of the dose from galactic cosmic rays behind a shield of 20 g/cm2 aluminum and 30 g/cm2 water. It is therefore important to include pion and other hadron production in transport codes designed for space radiation studies, such as HZETRN. The status of experimental hadron production data for energies relevant to space radiation will be reviewed, as well as the predictive capabilities of current theoretical hadron production cross section and space radiation transport models. Charged pions decay into muons and neutrinos, and neutral pions decay into photons. An electromagnetic cascade is produced as these particles build up in a material. The cascade and transport of pions, muons, electrons and photons will be discussed as they relate to space radiation. The importance of other hadrons, such as kaons, eta mesons and antiprotons will be considered as well. 
Efficient methods for calculating cross sections for meson production in nucleon-nucleon and nucleus-nucleus reactions will be presented. The NCRP has also recommended that more attention should be paid to neutron and light ion transport. The coupling of neutrons, light ions, mesons and other hadrons will be discussed.
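
    The coincidence between the GCR flux peak and the pion production threshold noted above is easy to check with relativistic kinematics. A minimal sketch for p + p -> p + p + pi0, assuming standard particle masses (the specific channel is chosen for illustration):

```python
# Threshold lab kinetic energy for pion production on a proton at rest,
# from invariant-mass kinematics: sqrt(s) at threshold equals 2*m_p + m_pi.
M_P = 938.272    # proton mass, MeV/c^2
M_PI0 = 134.977  # neutral pion mass, MeV/c^2

def pion_threshold_mev(m_n=M_P, m_pi=M_PI0):
    # s = 2*m^2 + 2*m*(T + m) for beam kinetic energy T on a target at
    # rest; setting s = (2*m + m_pi)^2 and solving for T gives:
    return 2.0 * m_pi + m_pi**2 / (2.0 * m_n)

print(round(pion_threshold_mev(), 1))  # -> 279.7
```

The result, roughly 280 MeV, sits squarely in the hundreds-of-MeV region where the GCR flux peaks, which is the coincidence the abstract describes.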

  7. Solar proton exposure of an ICRU sphere within a complex structure part II: Ray-trace geometry.

    PubMed

    Slaba, Tony C; Wilson, John W; Badavi, Francis F; Reddell, Brandon D; Bahadori, Amir A

    2016-06-01

    A computationally efficient 3DHZETRN code with enhanced neutron and light ion (Z ≤ 2) propagation was recently developed for complex, inhomogeneous shield geometry described by combinatorial objects. Comparisons were made between 3DHZETRN results and Monte Carlo (MC) simulations at locations within the combinatorial geometry, and it was shown that 3DHZETRN agrees with the MC codes to the extent they agree with each other. In the present report, the 3DHZETRN code is extended to enable analysis in ray-trace geometry. This latest extension enables the code to be used within current engineering design practices utilizing fully detailed vehicle and habitat geometries. Through convergence testing, it is shown that fidelity in an actual shield geometry can be maintained in the discrete ray-trace description by systematically increasing the number of discrete rays used. It is also shown that this fidelity is carried into transport procedures and resulting exposure quantities without sacrificing computational efficiency. Published by Elsevier Ltd.
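
    The ray-count convergence testing described above can be mimicked with a toy angular quadrature. A sketch using a Fibonacci sphere lattice (a hypothetical stand-in, not the actual 3DHZETRN ray-trace description), showing that a direction-averaged quantity converges as the number of discrete rays grows:

```python
import math

def fibonacci_directions(n):
    """n roughly uniform unit vectors on the sphere (Fibonacci lattice)."""
    golden = math.pi * (3.0 - math.sqrt(5.0))
    dirs = []
    for i in range(n):
        z = 1.0 - 2.0 * (i + 0.5) / n
        r = math.sqrt(1.0 - z * z)
        phi = golden * i
        dirs.append((r * math.cos(phi), r * math.sin(phi), z))
    return dirs

def angular_average(f, n):
    """Approximate the 4*pi average of f(direction) with n discrete rays."""
    return sum(f(v) for v in fibonacci_directions(n)) / n

# Example: the spherical average of cos^2(theta) is exactly 1/3; the
# discrete-ray estimate approaches it as the ray count is refined.
est = angular_average(lambda v: v[2] ** 2, 1000)
```

Systematically increasing `n` plays the role of refining the discrete ray set: the quadrature error shrinks, which is the kind of convergence check the report describes for actual shield geometries.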

  8. Development of a new version of the Vehicle Protection Factor Code (VPF3)

    NASA Astrophysics Data System (ADS)

    Jamieson, Terrance J.

    1990-10-01

    The Vehicle Protection Factor (VPF) Code is an engineering tool for estimating radiation protection afforded by armoured vehicles and other structures exposed to neutron and gamma ray radiation from fission, thermonuclear, and fusion sources. A number of suggestions for modifications have been offered by users of early versions of the code. These include: implementing some of the more advanced features of the air transport rating code, ATR5, used to perform the air-over-ground radiation transport analyses; adding the ability to study specific vehicle orientations within the free field; implementing an adjoint transport scheme to reduce the number of transport runs required; investigating the possibility of accelerating the transport scheme; and upgrading the computer automated design (CAD) package used by VPF. The generation of radiation free-field fluences for infinite air geometries, as required for aircraft analysis, can be accomplished by using ATR with the air-over-ground correction factors disabled. Analysis of the effects of fallout-bearing debris clouds on aircraft will require additional modelling of VPF.

  9. Benchmarking Geant4 for simulating galactic cosmic ray interactions within planetary bodies

    DOE PAGES

    Mesick, K. E.; Feldman, W. C.; Coupland, D. D. S.; ...

    2018-06-20

    Galactic cosmic rays undergo complex nuclear interactions with nuclei within planetary bodies that have little to no atmosphere. Radiation transport simulations are a key tool for understanding the neutron and gamma-ray albedo coming from these interactions and tracing these signals back to the geochemical composition of the target. In this paper, we study the validity of the code Geant4 for simulating such interactions by comparing simulation results to data from the Apollo 17 Lunar Neutron Probe Experiment. Different assumptions regarding the physics are explored to demonstrate how these impact the Geant4 simulation results. In general, all of the Geant4 results over-predict the data, though certain physics lists perform better than others. Finally, we show that results from the radiation transport code MCNP6 are similar to those obtained using Geant4.

  10. Benchmarking Geant4 for simulating galactic cosmic ray interactions within planetary bodies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mesick, K. E.; Feldman, W. C.; Coupland, D. D. S.

    Galactic cosmic rays undergo complex nuclear interactions with nuclei within planetary bodies that have little to no atmosphere. Radiation transport simulations are a key tool for understanding the neutron and gamma-ray albedo coming from these interactions and tracing these signals back to the geochemical composition of the target. In this paper, we study the validity of the code Geant4 for simulating such interactions by comparing simulation results to data from the Apollo 17 Lunar Neutron Probe Experiment. Different assumptions regarding the physics are explored to demonstrate how these impact the Geant4 simulation results. In general, all of the Geant4 results over-predict the data, though certain physics lists perform better than others. Finally, we show that results from the radiation transport code MCNP6 are similar to those obtained using Geant4.

  11. Morse Monte Carlo Radiation Transport Code System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Emmett, M.B.

    1975-02-01

    The report contains sections describing the MORSE and PICTURE codes, input descriptions, sample problems, derivations of the physical equations, and explanations of the various error messages. The MORSE code is a multipurpose neutron and gamma-ray transport Monte Carlo code. Time dependence for both shielding and criticality problems is provided. General three-dimensional geometry may be used, with an albedo option available at any material surface. The PICTURE code provides aid in preparing correct input data for the combinatorial geometry package CG. It provides a printed view of arbitrary two-dimensional slices through the geometry. By inspecting these pictures one may determine whether the geometry specified by the input cards is indeed the desired geometry. 23 refs. (WRF)
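
    PICTURE's printed-slice idea can be illustrated with a toy version in a few lines. A sketch (the `zone` function is a made-up stand-in for a combinatorial-geometry zone lookup, not the actual MORSE/CG input format):

```python
def zone(x, y, z=0.0):
    """Toy combinatorial geometry: zone 1 inside a sphere of radius 3
    centered at the origin, zone 0 elsewhere inside a 10x10 box."""
    return 1 if x * x + y * y + z * z <= 9.0 else 0

def picture_slice(nx=21, ny=21, half=5.0):
    """Build an ASCII view of the z=0 slice, one character per zone."""
    rows = []
    for j in range(ny):
        y = half - 2.0 * half * j / (ny - 1)
        row = "".join(
            ".#"[zone(-half + 2.0 * half * i / (nx - 1), y)]
            for i in range(nx)
        )
        rows.append(row)
    return "\n".join(rows)

print(picture_slice())
```

Inspecting such a printout against the intended geometry is exactly the error-catching role the abstract assigns to PICTURE: a wrong zone specification shows up immediately as a wrong shape in the slice.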

  12. Benchmark Analysis of Pion Contribution from Galactic Cosmic Rays

    NASA Technical Reports Server (NTRS)

    Aghara, Sukesh K.; Blattnig, Steve R.; Norbury, John W.; Singleterry, Robert C., Jr.

    2008-01-01

    Shielding strategies for extended stays in space must include a comprehensive resolution of the secondary radiation environment inside the spacecraft induced by the primary, external radiation. The distribution of absorbed dose and dose equivalent is a function of the type, energy and population of these secondary products. A systematic verification and validation effort is underway for HZETRN, which is a space radiation transport code currently used by NASA. It performs neutron, proton and heavy ion transport explicitly, but it does not take into account the production and transport of mesons, photons and leptons. The question naturally arises as to the contribution of these particles to space radiation. The pion has a production kinetic energy threshold of about 280 MeV. The galactic cosmic ray (GCR) spectrum, coincidentally, reaches its flux maximum in the hundreds of MeV range, corresponding to the pion production threshold. We present results from the Monte Carlo code MCNPX, showing the effect of lepton and meson physics when produced and transported explicitly in a GCR environment.

  13. A comparative study of space radiation organ doses and associated cancer risks using PHITS and HZETRN.

    PubMed

    Bahadori, Amir A; Sato, Tatsuhiko; Slaba, Tony C; Shavers, Mark R; Semones, Edward J; Van Baalen, Mary; Bolch, Wesley E

    2013-10-21

    NASA currently uses one-dimensional deterministic transport to generate values of the organ dose equivalent needed to calculate stochastic radiation risk following crew space exposures. In this study, organ absorbed doses and dose equivalents are calculated for 50th percentile male and female astronaut phantoms using both the NASA High Charge and Energy Transport Code to perform one-dimensional deterministic transport and the Particle and Heavy Ion Transport Code System to perform three-dimensional Monte Carlo transport. Two measures of radiation risk, effective dose and risk of exposure-induced death (REID), are calculated using the organ dose equivalents resulting from the two methods of radiation transport. For the space radiation environments and simplified shielding configurations considered, small differences (<8%) in the effective dose and REID are found. However, for the galactic cosmic ray (GCR) boundary condition, compensating errors are observed, indicating that comparisons between the integral measurements of complex radiation environments and code calculations can be misleading. Code-to-code benchmarks allow for the comparison of differential quantities, such as secondary particle differential fluence, to provide insight into differences observed in integral quantities for particular components of the GCR spectrum.
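
    The effective dose used above is the tissue-weighted sum of organ dose equivalents, E = sum over tissues T of w_T * H_T. A minimal sketch with the ICRP 103 tissue weighting factors (the uniform 10 mSv organ field is a made-up input for illustration):

```python
# ICRP 103 tissue weighting factors; by construction they sum to 1.0.
W_T = {
    "bone-marrow": 0.12, "colon": 0.12, "lung": 0.12, "stomach": 0.12,
    "breast": 0.12, "remainder": 0.12, "gonads": 0.08,
    "bladder": 0.04, "oesophagus": 0.04, "liver": 0.04, "thyroid": 0.04,
    "bone-surface": 0.01, "brain": 0.01, "salivary-glands": 0.01,
    "skin": 0.01,
}

def effective_dose(organ_h_sv):
    """E = sum_T w_T * H_T over the organ dose equivalents (Sv)."""
    return sum(W_T[t] * h for t, h in organ_h_sv.items())

# Hypothetical uniform organ field of 10 mSv: E = 10 mSv, since the
# weights sum to one. Real GCR exposures give non-uniform organ fields.
uniform = {t: 0.010 for t in W_T}
```

Because the weights sum to one, differences between codes show up in E only through the non-uniformity of the organ dose equivalents, which is why the abstract stresses comparing differential quantities as well.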

  14. A comparative study of space radiation organ doses and associated cancer risks using PHITS and HZETRN

    NASA Astrophysics Data System (ADS)

    Bahadori, Amir A.; Sato, Tatsuhiko; Slaba, Tony C.; Shavers, Mark R.; Semones, Edward J.; Van Baalen, Mary; Bolch, Wesley E.

    2013-10-01

    NASA currently uses one-dimensional deterministic transport to generate values of the organ dose equivalent needed to calculate stochastic radiation risk following crew space exposures. In this study, organ absorbed doses and dose equivalents are calculated for 50th percentile male and female astronaut phantoms using both the NASA High Charge and Energy Transport Code to perform one-dimensional deterministic transport and the Particle and Heavy Ion Transport Code System to perform three-dimensional Monte Carlo transport. Two measures of radiation risk, effective dose and risk of exposure-induced death (REID), are calculated using the organ dose equivalents resulting from the two methods of radiation transport. For the space radiation environments and simplified shielding configurations considered, small differences (<8%) in the effective dose and REID are found. However, for the galactic cosmic ray (GCR) boundary condition, compensating errors are observed, indicating that comparisons between the integral measurements of complex radiation environments and code calculations can be misleading. Code-to-code benchmarks allow for the comparison of differential quantities, such as secondary particle differential fluence, to provide insight into differences observed in integral quantities for particular components of the GCR spectrum.

  15. Radiation protection for human missions to the Moon and Mars

    NASA Technical Reports Server (NTRS)

    Simonsen, Lisa C.; Nealy, John E.

    1991-01-01

    Radiation protection assessments are performed for advanced Lunar and Mars manned missions. The Langley cosmic ray transport code and the nucleon transport code are used to quantify the transport and attenuation of galactic cosmic rays and solar proton flares through various shielding media. Galactic cosmic radiation at solar maximum and minimum, as well as various flare scenarios are considered. Propagation data for water, aluminum, liquid hydrogen, lithium hydride, lead, and lunar and Martian regolith (soil) are included. Shield thickness and shield mass estimates required to maintain incurred doses below 30 day and annual limits (as set for Space Station Freedom and used as a guide for space exploration) are determined for simple geometry transfer vehicles. On the surface of Mars, dose estimates are presented for crews with their only protection being the carbon dioxide atmosphere and for crews protected by shielding provided by Martian regolith for a candidate habitat.

  16. ipole: Semianalytic scheme for relativistic polarized radiative transport

    NASA Astrophysics Data System (ADS)

    Moscibrodzka, Monika; Gammie, Charles F.

    2018-04-01

    ipole is a ray-tracing code for covariant, polarized radiative transport, particularly useful for modeling Event Horizon Telescope sources, though it may also be used for other relativistic transport problems. The code extends the ibothros scheme for covariant, unpolarized transport using two representations of the polarized radiation field: in the coordinate frame, it parallel transports the coherency tensor, and in the frame of the plasma, it evolves the Stokes parameters under emission, absorption, and Faraday conversion. The transport step is as spacetime- and coordinate-independent as possible; the emission, absorption, and Faraday conversion step is implemented using an analytic solution to the polarized transport equation with constant coefficients. As a result, ipole is stable, efficient, and produces a physically reasonable solution even for a step with high optical depth and Faraday depth.
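
    The constant-coefficient analytic step that ipole relies on has a simple unpolarized analog: with constant emissivity j and absorption coefficient alpha over a step, dI/ds = j - alpha*I has an exact solution. A scalar sketch (not the full Stokes-vector treatment in ipole):

```python
import math

def transfer_step(i0, j, alpha, ds):
    """Exact update of dI/ds = j - alpha*I over a step of length ds
    with constant coefficients (source function S = j/alpha)."""
    if alpha == 0.0:
        return i0 + j * ds  # pure emission, no absorption
    s = j / alpha
    return s + (i0 - s) * math.exp(-alpha * ds)

# A single optically thick step stays stable and relaxes to S = j/alpha,
# whereas a naive explicit Euler update would overshoot or go negative.
deep = transfer_step(0.0, j=2.0, alpha=4.0, ds=100.0)  # -> ~0.5
```

This exactness at arbitrary optical depth is what makes the analytic-step approach stable for the high optical and Faraday depths the abstract mentions.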

  17. Cosmic Rays and Their Radiative Processes in Numerical Cosmology

    NASA Technical Reports Server (NTRS)

    Ryu, Dongsu; Miniati, Francesco; Jones, Tom W.; Kang, Hyesung

    2000-01-01

    A cosmological hydrodynamic code is described, which includes a routine to compute cosmic ray acceleration and transport in a simplified way. The routine was designed to follow explicitly diffusive acceleration at shocks, and second-order Fermi acceleration and adiabatic loss in smooth flows. Synchrotron cooling of the electron population can also be followed. The updated code is intended to be used to study the properties of nonthermal synchrotron emission and inverse Compton scattering from electron cosmic rays in clusters of galaxies, in addition to the properties of thermal bremsstrahlung emission from hot gas. The results of a test simulation using a grid of 128³ cells are presented, where cosmic rays and magnetic field have been treated passively and synchrotron cooling of cosmic ray electrons has not been included.

  18. Cosmic Rays and Their Radiative Processes in Numerical Cosmology

    NASA Astrophysics Data System (ADS)

    Ryu, D.; Miniati, F.; Jones, T. W.; Kang, H.

    2000-05-01

    A cosmological hydrodynamic code is described, which includes a routine to compute cosmic ray acceleration and transport in a simplified way. The routine was designed to follow explicitly diffusive acceleration at shocks, and second-order Fermi acceleration and adiabatic loss in smooth flows. Synchrotron cooling of the electron population can also be followed. The updated code is intended to be used to study the properties of nonthermal synchrotron emission and inverse Compton scattering from electron cosmic rays in clusters of galaxies, in addition to the properties of thermal bremsstrahlung emission from hot gas. The results of a test simulation using a grid of 128³ cells are presented, where cosmic rays and magnetic field have been treated passively and synchrotron cooling of cosmic ray electrons has not been included.

  19. Analysis of neutron and gamma-ray streaming along the maze of NRCAM thallium production target room.

    PubMed

    Raisali, G; Hajiloo, N; Hamidi, S; Aslani, G

    2006-08-01

    The shield performance of a thallium-203 production target room has been investigated in this work. Neutron and gamma-ray equivalent dose rates at various points of the maze are calculated by simulating the transport of streaming neutrons and photons using the Monte Carlo method. For determination of the neutron and gamma-ray source intensities and their energy spectra, we have applied the SRIM 2003 and ALICE91 computer codes to the Tl target and its Cu substrate for a 145 microA beam of 28.5 MeV protons. The MCNP/4C code has been applied with the neutron source term in mode n p to consider both prompt neutrons and secondary gamma-rays. The code is then applied with the prompt gamma-rays as the source term. The neutron-flux energy spectrum and the equivalent dose rates for neutrons and gamma-rays at various positions in the maze have been calculated. It has been found that the deviation between calculated and measured dose values along the maze is less than 20%.

  20. Coupled multi-group neutron photon transport for the simulation of high-resolution gamma-ray spectroscopy applications

    NASA Astrophysics Data System (ADS)

    Burns, Kimberly Ann

    The accurate and efficient simulation of coupled neutron-photon problems is necessary for several important radiation detection applications. Examples include the detection of nuclear threats concealed in cargo containers and prompt gamma neutron activation analysis for nondestructive determination of elemental composition of unknown samples. In these applications, high-resolution gamma-ray spectrometers are used to preserve as much information as possible about the emitted photon flux, which consists of both continuum and characteristic gamma rays with discrete energies. Monte Carlo transport is the most commonly used modeling tool for this type of problem, but computational times for many problems can be prohibitive. This work explores the use of coupled Monte Carlo-deterministic methods for the simulation of neutron-induced photons for high-resolution gamma-ray spectroscopy applications. RAdiation Detection Scenario Analysis Toolbox (RADSAT), a code which couples deterministic and Monte Carlo transport to perform radiation detection scenario analysis in three dimensions [1], was used as the building block for the methods derived in this work. RADSAT was capable of performing coupled deterministic-Monte Carlo simulations for gamma-only and neutron-only problems. The purpose of this work was to develop the methodology necessary to perform coupled neutron-photon calculations and add this capability to RADSAT. Performing coupled neutron-photon calculations requires four main steps: the deterministic neutron transport calculation, the neutron-induced photon spectrum calculation, the deterministic photon transport calculation, and the Monte Carlo detector response calculation. The necessary requirements for each of these steps were determined. A major challenge in utilizing multigroup deterministic transport methods for neutron-photon problems was maintaining the discrete neutron-induced photon signatures throughout the simulation. 
Existing coupled neutron-photon cross-section libraries and the methods used to produce neutron-induced photons were unsuitable for high-resolution gamma-ray spectroscopy applications. Central to this work was the development of a method for generating multigroup neutron-photon cross sections in a way that separates the discrete and continuum photon emissions, so that the neutron-induced photon signatures are preserved. The RADSAT-NG cross-section library was developed as a specialized multigroup neutron-photon cross-section set for the simulation of high-resolution gamma-ray spectroscopy applications. The methodology and cross sections were tested using code-to-code comparison with MCNP5 [2] and NJOY [3]. A simple benchmark geometry was used for all cases compared with MCNP: a cubical sample with a 252Cf neutron source on one side and an HPGe gamma-ray spectrometer on the opposing side. Different materials were examined in the cubical sample: polyethylene (C2H4), P, N, O, and Fe. The cross sections for each of the materials were compared to cross sections collapsed using NJOY. Comparisons of the volume-averaged neutron flux within the sample, the volume-averaged photon flux within the detector, and the high-purity gamma-ray spectrometer response (only for polyethylene) were completed using RADSAT and MCNP. The code-to-code comparisons show promising results for the coupled Monte Carlo-deterministic method. The RADSAT-NG cross-section production method showed good agreement with NJOY for all materials considered, although some discrepancies remained in the resonance region and in the first and last energy bins, where the overall shape and magnitude of the two methods nonetheless agreed. For the volume-averaged photon flux within the detector, the five most intense lines typically agree to within approximately 5% of the MCNP-calculated flux for all of the materials considered. 
The agreement in the code-to-code comparison cases demonstrates a proof of concept of the method for use in RADSAT for coupled neutron-photon problems in high-resolution gamma-ray spectroscopy applications. One of the primary motivators for using the coupled method over a pure Monte Carlo method is the potential for significantly lower computational times. For the code-to-code comparison cases, the run times for RADSAT were approximately 25-500 times shorter than for MCNP, as shown in Table 1. This assumed a 40 mCi 252Cf neutron source and 600 seconds of "real-world" measurement time. The only variance reduction technique implemented in the MCNP calculation was forward biasing of the source toward the sample target. Improved MCNP runtimes could be achieved with the addition of more advanced variance reduction techniques.
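
    The group-collapsing step underlying any multigroup library can be sketched as a flux-weighted average, sigma_g = sum_i(sigma_i * phi_i) / sum_i(phi_i) over the fine groups i belonging to coarse group g. A toy illustration (made-up numbers, not the RADSAT-NG data):

```python
def collapse(fine_sigma, fine_flux, group_slices):
    """Flux-weighted collapse of fine-group cross sections into coarse
    groups; group_slices lists (start, stop) fine-group index ranges."""
    coarse = []
    for a, b in group_slices:
        num = sum(s * f for s, f in zip(fine_sigma[a:b], fine_flux[a:b]))
        den = sum(fine_flux[a:b])
        coarse.append(num / den)
    return coarse

# Four fine groups collapsed to two coarse groups (illustrative data).
sigma = [10.0, 8.0, 2.0, 1.0]   # fine-group cross sections
flux = [1.0, 3.0, 4.0, 2.0]     # weighting spectrum
print(collapse(sigma, flux, [(0, 2), (2, 4)]))  # -> [8.5, 1.666...]
```

The choice of weighting spectrum is exactly where problem dependence enters, which is why collapsed sets must be validated against a fine-group or continuous-energy reference such as NJOY/MCNP, as the abstract describes.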

  1. Cosmic-ray propagation with DRAGON2: I. numerical solver and astrophysical ingredients

    NASA Astrophysics Data System (ADS)

    Evoli, Carmelo; Gaggero, Daniele; Vittino, Andrea; Di Bernardo, Giuseppe; Di Mauro, Mattia; Ligorini, Arianna; Ullio, Piero; Grasso, Dario

    2017-02-01

    We present version 2 of the DRAGON code, designed for computing realistic predictions of the CR densities in the Galaxy. The code numerically solves the interstellar CR transport equation (including inhomogeneous and anisotropic diffusion, in both space and momentum, advective transport, and energy losses) under realistic conditions. The new version includes an updated numerical solver and several models for the astrophysical ingredients involved in the transport equation. Improvements in the accuracy of the numerical solution are demonstrated against analytical solutions and in reference diffusion scenarios. The novel features implemented in the code make it possible to simulate the diverse scenarios proposed to reproduce the most recent measurements of local and diffuse CR fluxes, going beyond the limitations of the homogeneous galactic transport paradigm. To this end, several applications using DRAGON2 are presented as well. This new version allows users to include their own physical models by means of a modular C++ structure.

  2. Initial performances of first undulator-based hard x-ray beamlines of NSLS-II compared to simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chubar, Oleg, E-mail: chubar@bnl.gov; Chu, Yong S.; Huang, Xiaojing

    2016-07-27

    Commissioning of the first X-ray beamlines of NSLS-II included detailed measurements of spectral and spatial distributions of the radiation at different locations of the beamlines, from front-ends to sample positions. Comparison of some of these measurement results with high-accuracy calculations of synchrotron (undulator) emission and wavefront propagation through X-ray transport optics, performed using the SRW code, is presented.

  3. Spatiotemporal Monte Carlo transport methods in x-ray semiconductor detectors: application to pulse-height spectroscopy in a-Se.

    PubMed

    Fang, Yuan; Badal, Andreu; Allec, Nicholas; Karim, Karim S; Badano, Aldo

    2012-01-01

    The authors describe a detailed Monte Carlo (MC) method for the coupled transport of ionizing particles and charge carriers in amorphous selenium (a-Se) semiconductor x-ray detectors, and model the effect of statistical variations on the detected signal. A detailed transport code was developed for modeling the signal formation process in semiconductor x-ray detectors. The charge transport routines include three-dimensional spatial and temporal models of electron-hole pair transport, taking into account recombination and trapping. Many electron-hole pairs are created simultaneously in bursts from energy deposition events. Carrier transport processes include drift due to the external field and Coulombic interactions, and diffusion due to Brownian motion. Pulse-height spectra (PHS) have been simulated with different transport conditions for a range of monoenergetic incident x-ray energies and mammography radiation beam qualities. Two methods for calculating Swank factors from simulated PHS are shown, one using the entire PHS distribution and the other using only the photopeak; the latter ignores contributions from Compton scattering and K-fluorescence. Simulations agree with experimental measurements to within approximately 2%. The a-Se x-ray detector PHS responses simulated in this work include three-dimensional spatial and temporal transport of electron-hole pairs. These PHS were used to calculate the Swank factor and compare it with experimental measurements. The Swank factor was shown to be a function of x-ray energy and applied electric field. Trapping and recombination models are all shown to affect the Swank factor.
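
    The moment-based Swank factor mentioned above is A_S = M1^2 / (M0 * M2), where M_k is the k-th moment of the pulse-height spectrum. A minimal sketch (synthetic spectra, not ARTEMIS output):

```python
def swank_factor(counts):
    """Swank information factor A_S = M1^2 / (M0 * M2) for a pulse-height
    spectrum given as counts per (unit-spaced) pulse-height bin."""
    m0 = sum(counts)
    m1 = sum(e * c for e, c in enumerate(counts))
    m2 = sum(e * e * c for e, c in enumerate(counts))
    return m1 * m1 / (m0 * m2)

# A monoenergetic response (all counts in one bin) gives A_S = 1;
# spreading the response over many pulse heights lowers it.
peaked = [0, 0, 0, 100, 0]
broad = [10, 20, 40, 20, 10]
```

Since A_S equals 1 only for a delta-function response, any trapping, recombination, or depth-dependent gain broadens the PHS and shows up directly as a lower Swank factor, consistent with the field and energy dependence reported above.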

  4. Development of a new EMP code at LANL

    NASA Astrophysics Data System (ADS)

    Colman, J. J.; Roussel-Dupré, R. A.; Symbalisty, E. M.; Triplett, L. A.; Travis, B. J.

    2006-05-01

    A new code for modeling the generation of an electromagnetic pulse (EMP) by a nuclear explosion in the atmosphere is being developed. The source of the EMP is the Compton current produced by the prompt radiation (γ-rays, X-rays, and neutrons) of the detonation. As a first step in building a multi-dimensional EMP code we have written three kinetic codes, Plume, Swarm, and Rad. Plume models the transport of energetic electrons in air. The Plume code solves the relativistic Fokker-Planck equation over a specified energy range that can include ~ 3 keV to 50 MeV and computes the resulting electron distribution function at each cell in a two dimensional spatial grid. The energetic electrons are allowed to transport, scatter, and experience Coulombic drag. Swarm models the transport of lower energy electrons in air, spanning 0.005 eV to 30 keV. The Swarm code performs a full 2-D solution to the Boltzmann equation for electrons in the presence of an applied electric field. Over this energy range the relevant processes to be tracked are elastic scattering, three body attachment, two body attachment, rotational excitation, vibrational excitation, electronic excitation, and ionization. All of these occur due to collisions between the electrons and neutral bodies in air. The Rad code solves the full radiation transfer equation in the energy range of 1 keV to 100 MeV. It includes effects of photo-absorption, Compton scattering, and pair-production. All of these codes employ a spherical coordinate system in momentum space and a cylindrical coordinate system in configuration space. The "z" axis of the momentum and configuration spaces is assumed to be parallel and we are currently also assuming complete spatial symmetry around the "z" axis. Benchmarking for each of these codes will be discussed as well as the way forward towards an integrated modern EMP code.

  5. DQE simulation of a-Se x-ray detectors using ARTEMIS

    NASA Astrophysics Data System (ADS)

    Fang, Yuan; Badano, Aldo

    2016-03-01

    Detective Quantum Efficiency (DQE) is one of the most important image quality metrics for evaluating the spatial resolution performance of flat-panel x-ray detectors. In this work, we simulate the DQE of amorphous selenium (a-Se) x-ray detectors with a detailed Monte Carlo transport code (ARTEMIS) for modeling semiconductor-based direct x-ray detectors. The transport of electron-hole pairs is achieved with a spatiotemporal model that accounts for recombination and trapping of carriers and the Coulombic effects of space charge and the external applied electric field. A range of x-ray energies has been simulated, from 10 to 100 keV. The DQE results can be used to study the spatial resolution characteristics of detectors at different energies.

  6. Neutron transport analysis for nuclear reactor design

    DOEpatents

    Vujic, Jasmina L.

    1993-01-01

    Replacing regular mesh-dependent ray tracing modules in a collision/transfer probability (CTP) code with a ray tracing module based upon combinatorial geometry of a modified geometrical module (GMC) provides a general geometry transfer theory code in two dimensions (2D) for analyzing nuclear reactor design and control. The primary modification of the GMC module involves generation of a fixed inner frame and a rotating outer frame, where the inner frame contains all reactor regions of interest, e.g., part of a reactor assembly, an assembly, or several assemblies, and the outer frame, with a set of parallel equidistant rays (lines) attached to it, rotates around the inner frame. The modified GMC module allows for determining, for each parallel ray (line), the intersections with zone boundaries, the path length between the intersections, the total number of zones on a track, the zone and medium numbers, and the intersections with the outer surface, which parameters may be used in the CTP code to calculate collision/transfer probability and cross-section values.
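
    The geometric kernel of such a ray tracing module can be sketched for the simplest case: a horizontal ray crossing nested circular zone boundaries (a toy 2-D pin-cell analogue, not the patented GMC module).

    ```python
    import math

    def chord_lengths(radii, b):
        """Path lengths of a horizontal ray at offset b from the center
        through nested circular zone boundaries (radii ascending).
        Returns one length per zone, innermost first; zones the ray
        misses get 0."""
        half = [math.sqrt(r * r - b * b) if r > abs(b) else 0.0
                for r in sorted(radii)]
        lengths, prev = [], 0.0
        for h in half:
            lengths.append(2.0 * (h - prev))  # both symmetric segments
            prev = h
        return lengths

    print(chord_lengths([1.0, 2.0], b=0.0))   # central ray: [2.0, 2.0]
    ```

    These per-zone path lengths are exactly the quantities a CTP code integrates against cross sections to build collision probabilities.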

  7. Neutron transport analysis for nuclear reactor design

    DOEpatents

    Vujic, J.L.

    1993-11-30

    Replacing regular mesh-dependent ray tracing modules in a collision/transfer probability (CTP) code with a ray tracing module based upon combinatorial geometry of a modified geometrical module (GMC) provides a general geometry transfer theory code in two dimensions (2D) for analyzing nuclear reactor design and control. The primary modification of the GMC module involves generation of a fixed inner frame and a rotating outer frame, where the inner frame contains all reactor regions of interest, e.g., part of a reactor assembly, an assembly, or several assemblies, and the outer frame, with a set of parallel equidistant rays (lines) attached to it, rotates around the inner frame. The modified GMC module allows for determining, for each parallel ray (line), the intersections with zone boundaries, the path length between the intersections, the total number of zones on a track, the zone and medium numbers, and the intersections with the outer surface, which parameters may be used in the CTP code to calculate collision/transfer probability and cross-section values. 28 figures.

  8. Ray Effect Mitigation Through Reference Frame Rotation

    DOE PAGES

    Tencer, John

    2016-05-01

    The discrete ordinates method is a popular and versatile technique for solving the radiative transport equation; a major drawback is the presence of ray effects. Mitigating ray effects can yield significantly more accurate results and enhanced numerical stability for combined-mode codes. Moreover, when ray effects are present, the solution is highly dependent on the relative orientation of the geometry and the global reference frame, which is an undesirable property. A novel ray effect mitigation technique is proposed: averaging the computed solution over various reference frame orientations.
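
    The benefit of averaging over reference frame orientations can be demonstrated with a 2-D toy quadrature: a coarse set of equally spaced directions badly misestimates the angular integral of a peaked function (the analogue of a ray effect), while the average over random rotations of the direction set is close to exact. All numbers are illustrative.

    ```python
    import numpy as np

    def ordinate_estimate(kappa, n_dirs, phase):
        """Approximate the angular integral of f(t) = exp(kappa*cos t)
        using n_dirs equally spaced directions rotated by 'phase'."""
        t = phase + 2.0 * np.pi * np.arange(n_dirs) / n_dirs
        return (2.0 * np.pi / n_dirs) * np.exp(kappa * np.cos(t)).sum()

    kappa, n_dirs = 8.0, 4
    exact = 2.0 * np.pi * float(np.i0(kappa))     # integral = 2*pi*I0(kappa)
    rng = np.random.default_rng(0)
    avg = np.mean([ordinate_estimate(kappa, n_dirs, p)
                   for p in rng.uniform(0.0, 2.0 * np.pi, 2000)])
    # any single fixed frame is badly wrong; the rotation average is close
    ```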

  9. A graphics-card implementation of Monte-Carlo simulations for cosmic-ray transport

    NASA Astrophysics Data System (ADS)

    Tautz, R. C.

    2016-05-01

    A graphics card implementation of a test-particle simulation code is presented that is based on the CUDA extension of the C/C++ programming language. The original CPU version has been developed for the calculation of cosmic-ray diffusion coefficients in artificial Kolmogorov-type turbulence. In the new implementation, the magnetic turbulence generation, which is the most time-consuming part, is separated from the particle transport and is performed on a graphics card. In this article, the modification of the basic approach of integrating test particle trajectories to employ the SIMD (single instruction, multiple data) model is presented and verified. The efficiency of the new code is tested and several language-specific accelerating factors are discussed. For the example of isotropic magnetostatic turbulence, sample results are shown and a comparison to the results of the CPU implementation is performed.
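
    The core of such a test-particle loop is typically a Boris-type integrator. A minimal non-relativistic sketch (the actual code is relativistic and written in CUDA; this Python version only illustrates the magnetic rotation step, with zero electric field):

    ```python
    import numpy as np

    def boris_push(x, v, qm, b_field, dt, n_steps):
        """Non-relativistic Boris integrator with E = 0: the exactly
        energy-conserving magnetic rotation used in many test-particle
        codes. qm is the charge-to-mass ratio."""
        t = 0.5 * qm * dt * b_field
        s = 2.0 * t / (1.0 + t @ t)
        for _ in range(n_steps):
            v_prime = v + np.cross(v, t)
            v = v + np.cross(v_prime, s)
            x = x + dt * v
        return x, v

    x1, v1 = boris_push(np.zeros(3), np.array([1.0, 0.0, 0.0]),
                        qm=1.0, b_field=np.array([0.0, 0.0, 1.0]),
                        dt=0.05, n_steps=1000)
    ```

    Because the Boris step is a pure rotation of the velocity, the particle speed is conserved to machine precision, a useful verification target for any SIMD port.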

  10. Correlated prompt fission data in transport simulations

    NASA Astrophysics Data System (ADS)

    Talou, P.; Vogt, R.; Randrup, J.; Rising, M. E.; Pozzi, S. A.; Verbeke, J.; Andrews, M. T.; Clarke, S. D.; Jaffke, P.; Jandel, M.; Kawano, T.; Marcath, M. J.; Meierbachtol, K.; Nakae, L.; Rusev, G.; Sood, A.; Stetcu, I.; Walker, C.

    2018-01-01

    Detailed information on the fission process can be inferred from the observation, modeling and theoretical understanding of prompt fission neutron and γ-ray observables. Beyond simple average quantities, the study of distributions and correlations in prompt data, e.g., multiplicity-dependent neutron and γ-ray spectra, angular distributions of the emitted particles, n - n, n - γ, and γ - γ correlations, can place stringent constraints on fission models and parameters that would otherwise be free to be tuned separately to represent individual fission observables. The FREYA and CGMF codes have been developed to follow the sequential emissions of prompt neutrons and γ rays from the initial excited fission fragments produced right after scission. Both codes implement Monte Carlo techniques to sample initial fission fragment configurations in mass, charge and kinetic energy and sample probabilities of neutron and γ emission at each stage of the decay. This approach naturally leads to using simple but powerful statistical techniques to infer distributions and correlations among many observables and model parameters. The comparison of model calculations with experimental data provides a rich arena for testing various nuclear physics models such as those related to the nuclear structure and level densities of neutron-rich nuclei, the γ-ray strength functions of dipole and quadrupole transitions, the mechanism for dividing the excitation energy between the two nascent fragments near scission, and the mechanisms behind the production of angular momentum in the fragments, etc. Beyond the obvious interest from a fundamental physics point of view, such studies are also important for addressing data needs in various nuclear applications. 
The inclusion of the FREYA and CGMF codes into the MCNP6.2 and MCNPX - PoliMi transport codes, for instance, provides a new and powerful tool to simulate correlated fission events in neutron transport calculations important in nonproliferation, safeguards, nuclear energy, and defense programs. This review provides an overview of the topic, starting from theoretical considerations of the fission process, with a focus on correlated signatures. It then explores the status of experimental correlated fission data and current efforts to address some of the known shortcomings. Numerical simulations employing the FREYA and CGMF codes are compared to experimental data for a wide range of correlated fission quantities. The inclusion of those codes into the MCNP6.2 and MCNPX - PoliMi transport codes is described and discussed in the context of relevant applications. The accuracy of the model predictions and their sensitivity to model assumptions and input parameters are discussed. Finally, a series of important experimental and theoretical questions that remain unanswered are presented, suggesting a renewed effort to address these shortcomings.
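
    The Watt form commonly used for prompt fission neutron spectra can be sampled exactly through its physical construction: Maxwellian evaporation from a moving fragment. This sketch is illustrative and is not the FREYA or CGMF sampling code; the a, b values are the commonly quoted ones for thermal-neutron-induced fission of 235U.

    ```python
    import numpy as np

    def sample_watt(a, b, size, rng):
        """Sample energies (MeV) from a Watt spectrum
        W(E) ~ exp(-E/a)*sinh(sqrt(b*E)) via its physical construction:
        Maxwellian evaporation (temperature a) in the frame of a fragment
        moving with kinetic energy per nucleon Ef = a*a*b/4."""
        r1, r2, r3 = rng.random(size), rng.random(size), rng.random(size)
        # standard sampling of E_cm from p(E) ~ sqrt(E)*exp(-E/a)
        e_cm = -a * (np.log(r1) + np.log(r2) * np.cos(0.5 * np.pi * r3) ** 2)
        mu = rng.uniform(-1.0, 1.0, size)       # isotropic CM emission
        e_frag = a * a * b / 4.0
        return e_cm + e_frag + 2.0 * np.sqrt(e_cm * e_frag) * mu

    rng = np.random.default_rng(1)
    e = sample_watt(0.988, 2.249, 200_000, rng)
    # mean of a Watt spectrum is 3a/2 + a^2*b/4, about 2.03 MeV here
    ```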

  11. Simulating X-ray bursts with a radiation hydrodynamics code

    NASA Astrophysics Data System (ADS)

    Seong, Gwangeon; Kwak, Kyujin

    2018-04-01

    Previous simulations of X-ray bursts (XRBs), for example those performed with MESA (Modules for Experiments in Stellar Astrophysics), could not address the dynamical effects of strong radiation, which are important for explaining the photospheric radius expansion (PRE) phenomena seen in many XRBs. In order to study the effects of strong radiation, we propose to use SNEC (the SuperNova Explosion Code), a 1D Lagrangian open-source code designed to solve hydrodynamics and equilibrium-diffusion radiation transport together. Because SNEC allows control over its radiation-hydrodynamics modules for properly mapped inputs, the radiation-dominated pressure occurring in PRE XRBs can be handled. Here we present simulation models for PRE XRBs obtained by applying SNEC together with MESA.

  12. Faster Heavy Ion Transport for HZETRN

    NASA Technical Reports Server (NTRS)

    Slaba, Tony C.

    2013-01-01

    The deterministic particle transport code HZETRN was developed to enable fast and accurate space radiation transport through materials. As more complex transport solutions are implemented for neutrons, light ions (Z < 2), mesons, and leptons, it is important to maintain overall computational efficiency. In this work, the heavy ion (Z > 2) transport algorithm in HZETRN is reviewed, and a simple modification is shown to provide an approximate 5x decrease in execution time for galactic cosmic ray transport. Convergence tests and other comparisons are carried out to verify that numerical accuracy is maintained in the new algorithm.
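
    The structure of a straight-ahead marching solution can be illustrated with a toy two-species chain (a primary heavy-ion species producing one fragment species). This is not the HZETRN algorithm, and all cross sections here are invented.

    ```python
    import math

    def march(s1, s2, s12, depth, nx=20_000):
        """Explicit depth-marching of a toy straight-ahead chain:
            dphi1/dx = -s1*phi1                 (primary heavy ions)
            dphi2/dx =  s12*phi1 - s2*phi2      (one fragment species)
        s1, s2 are removal cross sections, s12 a production term."""
        dx = depth / nx
        phi1, phi2 = 1.0, 0.0
        for _ in range(nx):
            phi1, phi2 = (phi1 - dx * s1 * phi1,
                          phi2 + dx * (s12 * phi1 - s2 * phi2))
        return phi1, phi2

    p1, p2 = march(0.1, 0.05, 0.02, 10.0)
    # exact: phi1 = exp(-s1*x); phi2 = s12/(s2-s1)*(exp(-s1*x)-exp(-s2*x))
    ```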

  13. TRANSP: status and planning

    NASA Astrophysics Data System (ADS)

    Andre, R.; Carlsson, J.; Gorelenkova, M.; Jardin, S.; Kaye, S.; Poli, F.; Yuan, X.

    2016-10-01

    TRANSP is an integrated interpretive and predictive transport analysis tool that incorporates state-of-the-art heating/current-drive sources and transport models. The treatments and transport solvers are becoming increasingly sophisticated and comprehensive. For instance, the ISOLVER component provides a free-boundary equilibrium solution, while the PT-SOLVER transport solver is especially suited for stiff transport models such as TGLF. TRANSP incorporates high-fidelity heating and current drive source models, such as NUBEAM for neutral beam injection, the beam tracing code TORBEAM for EC, TORIC for ICRF, and the ray tracing codes TORAY and GENRAY for EC. The implementation of selected components makes efficient use of MPI to speed up code calculations. Recently, the GENRAY-CQL3D solver for modeling LH heating and current drive has been implemented; it is currently being extended to multiple antennas to allow modeling of EAST discharges. GENRAY+CQL3D is also being extended to EC/EBW and HHFW for NSTX-U. This poster will describe present uses of the code worldwide, as well as plans for upgrading the physics modules and code framework. Work supported by the US Department of Energy under DE-AC02-CH0911466.

  14. Transport calculations and accelerator experiments needed for radiation risk assessment in space.

    PubMed

    Sihver, Lembit

    2008-01-01

    The major uncertainties in space radiation risk estimates for humans are associated with the poor knowledge of the biological effects of low- and high-LET radiation, with a smaller contribution coming from the characterization of the space radiation field and its primary interactions with the shielding and the human body. However, to decrease the uncertainties in the biological effects and increase the accuracy of the risk coefficients for charged-particle radiation, the initial charged-particle spectra from the galactic cosmic rays (GCRs) and the solar particle events (SPEs), and the radiation transport through the shielding material of the space vehicle and the human body, must be better estimated. Since it is practically impossible to measure all primary and secondary particles from all possible position-projectile-target-energy combinations needed for a correct risk assessment in space, accurate particle and heavy ion transport codes must be used. These codes are also needed when estimating the risk of radiation-induced failures in advanced microelectronics, such as single-event effects, and the efficiency of different shielding materials. It is therefore important that the models and transport codes be carefully benchmarked and validated to make sure they fulfill preset accuracy criteria, e.g. the ability to predict particle fluence, dose, and energy distributions within a certain accuracy. When validating the accuracy of the transport codes, both space- and ground-based accelerator experiments are needed. The efficiency of passive shielding and protection of electronic devices should also be tested in accelerator experiments and compared to simulations using different transport codes. In this paper, different multipurpose particle and heavy ion transport codes are presented, different concepts of shielding and protection are discussed, and future accelerator experiments needed for testing and validating codes and shielding materials are outlined.

  15. Comparison of calculated beta- and gamma-ray doses after the Fukushima accident with data from single-grain luminescence retrospective dosimetry of quartz inclusions in a brick sample

    PubMed Central

    Endo, Satoru; Fujii, Keisuke; Kajimoto, Tsuyoshi; Tanaka, Kenichi; Stepanenko, Valeriy; Kolyzhenkov, Timofey; Petukhov, Aleksey; Akhmedova, Umukusum; Bogacheva, Viktoriia

    2018-01-01

    To estimate the beta- and gamma-ray doses in a brick sample taken from Odaka, Minami-Soma City, Fukushima Prefecture, Japan, a Monte Carlo calculation was performed with the Particle and Heavy Ion Transport code System (PHITS). The calculated results were compared with data obtained by single-grain retrospective luminescence dosimetry of quartz inclusions in the brick sample. The calculated result agreed well with the measured data. The dose increase measured at the brick surface was explained by the beta-ray contribution, and the slight slope in the dose profile deeper in the brick was due to the gamma-ray contribution. The skin dose at the sampling site was estimated from the calculated result as 164 mGy over 3 years. PMID:29385528

  16. Comparison of calculated beta- and gamma-ray doses after the Fukushima accident with data from single-grain luminescence retrospective dosimetry of quartz inclusions in a brick sample.

    PubMed

    Endo, Satoru; Fujii, Keisuke; Kajimoto, Tsuyoshi; Tanaka, Kenichi; Stepanenko, Valeriy; Kolyzhenkov, Timofey; Petukhov, Aleksey; Akhmedova, Umukusum; Bogacheva, Viktoriia

    2018-05-01

    To estimate the beta- and gamma-ray doses in a brick sample taken from Odaka, Minami-Soma City, Fukushima Prefecture, Japan, a Monte Carlo calculation was performed with the Particle and Heavy Ion Transport code System (PHITS). The calculated results were compared with data obtained by single-grain retrospective luminescence dosimetry of quartz inclusions in the brick sample. The calculated result agreed well with the measured data. The dose increase measured at the brick surface was explained by the beta-ray contribution, and the slight slope in the dose profile deeper in the brick was due to the gamma-ray contribution. The skin dose at the sampling site was estimated from the calculated result as 164 mGy over 3 years.

  17. Physical basis of radiation protection in space travel

    NASA Astrophysics Data System (ADS)

    Durante, Marco; Cucinotta, Francis A.

    2011-10-01

    The health risks of space radiation are arguably the most serious challenge to space exploration, possibly preventing these missions due to safety concerns or increasing their costs beyond acceptable amounts. Radiation in space is substantially different from that on Earth: high-energy (E) and charge (Z) particles (HZE) provide the main contribution to the equivalent dose in deep space, whereas γ rays and low-energy α particles are major contributors on Earth. This difference causes high uncertainty in the estimated radiation health risk (including cancer and noncancer effects) and makes protection extremely difficult. In fact, shielding is very difficult in space: the very high energy of the cosmic rays and the severe mass constraints in spaceflight represent a serious hindrance to effective shielding. Here the physical basis of space radiation protection is described, including the most recent achievements in space radiation transport codes and shielding approaches. Although deterministic and Monte Carlo transport codes can now describe well the interaction of cosmic rays with matter, more accurate double-differential nuclear cross sections are needed to improve the codes. Models of energy deposition in biological molecules and the related effects should also be developed to achieve accurate risk models for long-term exploratory missions. Passive shielding can be effective for solar particle events; however, it is limited for galactic cosmic rays (GCR). Active shielding would have to overcome challenging technical hurdles to protect against GCR. Thus, improved risk assessment and genetic and biomedical approaches are a more likely solution to GCR radiation protection issues.

  18. Cosmic-ray propagation with DRAGON2: I. numerical solver and astrophysical ingredients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evoli, Carmelo; Gaggero, Daniele; Vittino, Andrea

    2017-02-01

    We present version 2 of the DRAGON code, designed for computing realistic predictions of cosmic-ray (CR) densities in the Galaxy. The code numerically solves the interstellar CR transport equation (including inhomogeneous and anisotropic diffusion, both in space and momentum, advective transport, and energy losses) under realistic conditions. The new version includes an updated numerical solver and several models for the astrophysical ingredients involved in the transport equation. Improvements in the accuracy of the numerical solution are demonstrated against analytical solutions and in reference diffusion scenarios. The novel features implemented in the code allow the simulation of the diverse scenarios proposed to reproduce the most recent measurements of local and diffuse CR fluxes, going beyond the limitations of the homogeneous galactic transport paradigm. To this end, several applications using DRAGON2 are presented as well. The new version allows users to include their own physical models by means of a modular C++ structure.
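
    The kind of transport equation DRAGON2 solves can be illustrated in a stripped-down form: 1-D spatial diffusion from a plane source with free-escape boundaries, relaxed to steady state, where the analytic solution is a triangular profile. This toy omits advection, energy losses, and momentum diffusion; all parameter values are illustrative.

    ```python
    import numpy as np

    def steady_state(h=1.0, d_coef=1.0, q_tot=1.0, nz=101, t_end=4.0):
        """Relax dN/dt = D*d2N/dz2 + q(z) on z in [-h, h] with
        free-escape boundaries N(+-h) = 0 and a delta-like plane source
        at z = 0. The analytic steady profile is the triangle
        N(z) = q_tot*(h - |z|)/(2*D)."""
        z = np.linspace(-h, h, nz)
        dz = z[1] - z[0]
        q = np.zeros(nz)
        q[nz // 2] = q_tot / dz             # discretized delta source
        dt = 0.4 * dz * dz / d_coef         # below explicit stability limit
        n = np.zeros(nz)
        for _ in range(int(t_end / dt)):
            lap = np.zeros(nz)
            lap[1:-1] = (n[2:] - 2.0 * n[1:-1] + n[:-2]) / dz ** 2
            n = n + dt * (d_coef * lap + q)
            n[0] = n[-1] = 0.0              # free escape at the halo edge
        return z, n

    z, n = steady_state()   # n at z=0 approaches q_tot*h/(2*D) = 0.5
    ```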

  19. The FLUKA Code: An Overview

    NASA Technical Reports Server (NTRS)

    Ballarini, F.; Battistoni, G.; Campanella, M.; Carboni, M.; Cerutti, F.; Empl, A.; Fasso, A.; Ferrari, A.; Gadioli, E.; Garzelli, M. V.; hide

    2006-01-01

    FLUKA is a multipurpose Monte Carlo code which can transport a variety of particles over a wide energy range in complex geometries. The code is a joint project of INFN and CERN; part of its development is also supported by the University of Houston and NASA. FLUKA is successfully applied in several fields, including, but not limited to, particle physics, cosmic ray physics, dosimetry, radioprotection, hadron therapy, space radiation, accelerator design, and neutronics. The code is the standard tool used at CERN for dosimetry, radioprotection, and beam-machine interaction studies. Here we give a glimpse into the code's physics models, with particular emphasis on the hadronic and nuclear sector.

  20. Implementing displacement damage calculations for electrons and gamma rays in the Particle and Heavy-Ion Transport code System

    NASA Astrophysics Data System (ADS)

    Iwamoto, Yosuke

    2018-03-01

    In this study, the Monte Carlo displacement damage calculation method in the Particle and Heavy-Ion Transport code System (PHITS) was improved to calculate displacements per atom (DPA) values due to irradiation by electrons (or positrons) and gamma rays. For the damage due to electrons and gamma rays, PHITS simulates electromagnetic cascades using the Electron Gamma Shower version 5 (EGS5) algorithm and calculates DPA values using the recoil energies and the McKinley-Feshbach cross section. A comparison of DPA values calculated by PHITS and the Monte Carlo assisted Classical Method (MCCM) reveals that they were in good agreement for gamma-ray irradiations of silicon and iron at energies that were less than 10 MeV. Above 10 MeV, PHITS can calculate DPA values not only for electrons but also for charged particles produced by photonuclear reactions. In DPA depth distributions under electron and gamma-ray irradiations, build-up effects can be observed near the target's surface. For irradiation of 90-cm-thick carbon by protons with energies of more than 30 GeV, the ratio of the secondary electron DPA values to the total DPA values is more than 10% and increases with an increase in incident energy. In summary, PHITS can calculate DPA values for all particles and materials over a wide energy range between 1 keV and 1 TeV for electrons, gamma rays, and charged particles and between 10⁻⁵ eV and 1 TeV for neutrons.
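
    The conversion from a recoil's damage energy to a displacement count is conventionally done with the NRT model; a minimal sketch (the 40 eV threshold is an illustrative, iron-like value, not a PHITS default):

    ```python
    def nrt_displacements(t_dam, e_d=40.0):
        """NRT estimate of stable displacements from a recoil with
        damage energy t_dam (eV); e_d is the displacement threshold (eV).
        Below e_d: no displacement; up to 2*e_d/0.8: one Frenkel pair;
        above that: 0.8*t_dam/(2*e_d)."""
        if t_dam < e_d:
            return 0.0
        if t_dam < 2.0 * e_d / 0.8:
            return 1.0
        return 0.8 * t_dam / (2.0 * e_d)

    print(nrt_displacements(8000.0))   # 0.8*8000/80 = 80 displacements
    ```

    DPA then follows by summing such displacement counts over all simulated recoils and dividing by the number of atoms in the scoring volume.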

  1. Galactic Cosmic Ray Event-Based Risk Model (GERM) Code

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Plante, Ianik; Ponomarev, Artem L.; Kim, Myung-Hee Y.

    2013-01-01

    This software describes the transport and energy deposition of galactic cosmic rays passing through astronaut tissues during space travel, or of heavy ion beams in patients during cancer therapy. Space radiation risk is a probability distribution, and time-dependent biological events must be accounted for in the physical description of space radiation transport in tissues and cells. A stochastic model can calculate the probability density directly, without unverified assumptions about the shape of the probability density function. Prior-art transport codes calculate the average flux and dose of particles behind spacecraft and tissue shielding. Because of the signaling times for activation and relaxation in the cell and tissue, a transport code must describe the temporal and microspatial density functions needed to correlate DNA and oxidative damage with non-targeted effects such as bystander signaling; these effects are ignored by, or impossible to treat with, the prior art. The GERM code provides scientists with data interpretation of experiments; modeling of the beam line, shielding of target samples, and sample holders; and estimation of the basic physical and biological outputs of their experiments. For mono-energetic ion beams, basic physical and biological properties are calculated for a selected ion type, such as kinetic energy, mass, charge number, absorbed dose, or fluence. Evaluated quantities are linear energy transfer (LET), range (R), absorption and fragmentation cross-sections, and the probability of nuclear interactions after 1 or 5 cm of water-equivalent material. In addition, a set of biophysical properties is evaluated, such as the Poisson distribution of hits for a specified cellular area, cell survival curves, and DNA damage yields per cell. The GERM code also calculates the radiation transport of the beam line, either for a fixed number of user-specified depths or at multiple positions along the Bragg curve of the particle in a selected material.
    The GERM code provides numerical estimates of the basic physical and biophysical quantities of high-energy protons and heavy ions that have been studied at the NASA Space Radiation Laboratory (NSRL) for the purpose of simulating space radiation biological effects. In the first option, properties of monoenergetic beams are treated. In the second option, the transport of beams in different materials is treated; biophysical properties similar to those of the first option are evaluated for the primary ion and its secondary particles, along with additional properties related to the nuclear fragmentation of the beam. The GERM code is a computationally efficient Monte-Carlo heavy-ion-beam model. It includes accurate models of LET, range, residual energy, and straggling, and the quantum multiple scattering fragmentation (QMSGRG) nuclear database.
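
    The Poisson distribution of particle traversals for a specified cellular area mentioned above follows directly from the fluence; a minimal sketch with invented fluence and area values:

    ```python
    import math

    def traversal_pmf(fluence_cm2, area_um2, k_max=6):
        """Poisson probabilities of k particle traversals of a cellular
        area: lambda = fluence * area (area converted um^2 -> cm^2)."""
        lam = fluence_cm2 * area_um2 * 1e-8
        return [math.exp(-lam) * lam ** k / math.factorial(k)
                for k in range(k_max + 1)]

    # illustrative: 1e6 ions/cm^2 over a 100 um^2 nucleus -> lambda = 1
    pmf = traversal_pmf(1.0e6, 100.0)
    ```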

  2. Enhancements to the MCNP6 background source

    DOE PAGES

    McMath, Garrett E.; McKinney, Gregg W.

    2015-10-19

    The particle transport code MCNP has been used to produce a background radiation data file on a worldwide grid that can easily be sampled as a source in the code. Location-dependent cosmic showers were modeled by Monte Carlo methods to produce the resulting neutron and photon background flux at 2054 locations around Earth. An improved galactic-cosmic-ray feature was used to model the source term, along with data from multiple sources to model the transport environment through atmosphere, soil, and seawater. A new elevation scaling feature was also added to the code to increase the accuracy of the cosmic neutron background for user locations with off-grid elevations. Furthermore, benchmarking has shown the neutron integral flux values to be within experimental error.

  3. Development of deterministic transport methods for low energy neutrons for shielding in space

    NASA Technical Reports Server (NTRS)

    Ganapol, Barry

    1993-01-01

    Transport of low energy neutrons associated with the galactic cosmic ray cascade is analyzed in this dissertation. A benchmark quality analytical algorithm is demonstrated for use with BRYNTRN, a computer program written by the High Energy Physics Division of NASA Langley Research Center, which is used to design and analyze shielding against the radiation created by the cascade. BRYNTRN uses numerical methods to solve the integral transport equations for baryons with the straight-ahead approximation, and numerical and empirical methods to generate the interaction probabilities. The straight-ahead approximation is adequate for charged particles, but not for neutrons. As NASA Langley improves BRYNTRN to include low energy neutrons, a benchmark quality solution is needed for comparison. The neutron transport algorithm demonstrated in this dissertation uses the closed-form Green's function solution to the galactic cosmic ray cascade transport equations to generate a source of neutrons. A basis function expansion for finite heterogeneous and semi-infinite homogeneous slabs with multiple energy groups and isotropic scattering is used to generate neutron fluxes resulting from the cascade. This method, called the FN method, is used to solve the neutral particle linear Boltzmann transport equation. As a demonstration of the algorithm coded in the programs MGSLAB and MGSEMI, neutron and ion fluxes are shown for a beam of fluorine ions at 1000 MeV per nucleon incident on semi-infinite and finite aluminum slabs. Also, to demonstrate that the shielding effectiveness against the radiation from the galactic cosmic ray cascade is not directly proportional to shield thickness, a graph of transmitted total neutron scalar flux versus slab thickness is shown. A simple model based on the nuclear liquid drop assumption is used to generate cross sections for the galactic cosmic ray cascade. 
The ENDF/B V database is used to generate the total and scattering cross sections for neutrons in aluminum. As an external verification, the results from MGSLAB and MGSEMI were compared to ANISN/PC, a routinely used neutron transport code, showing excellent agreement. In an application to an aluminum shield, the FN method seems to generate reasonable results.
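
    That the transmitted flux is not proportional to shield thickness can be seen in a closed-form toy: secondaries build up from the attenuating primary beam and are themselves attenuated, so the transmitted secondary fluence first rises and then falls with thickness. This is not the FN method; all cross sections here are invented.

    ```python
    import numpy as np

    def transmitted_neutrons(t, s_ion=0.2, s_n=0.05, s_prod=1.0):
        """Closed-form fluence of secondary neutrons behind a slab of
        thickness t: primaries attenuate as exp(-s_ion*x) and produce
        neutrons at rate s_prod, which then attenuate as exp(-s_n*(t-x));
        integrating over x gives the expression below."""
        return s_prod * (np.exp(-s_ion * t) - np.exp(-s_n * t)) / (s_n - s_ion)

    t = np.linspace(0.0, 40.0, 401)
    flux = transmitted_neutrons(t)
    # flux peaks at t* = ln(s_ion/s_n)/(s_ion - s_n), about 9.2 here
    ```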

  4. How Space Radiation Risk from Galactic Cosmic Rays at the International Space Station Relates to Nuclear Cross Sections

    NASA Technical Reports Server (NTRS)

    Lin, Zi-Wei; Adams, J. H., Jr.

    2005-01-01

    Space radiation risk to astronauts is a major obstacle for long term human space explorations. Space radiation transport codes have thus been developed to evaluate radiation effects at the International Space Station (ISS) and in missions to the Moon or Mars. We study how nuclear fragmentation processes in such radiation transport affect predictions on the radiation risk from galactic cosmic rays. Taking into account effects of the geomagnetic field on the cosmic ray spectra, we investigate the effects of fragmentation cross sections at different energies on the radiation risk (represented by dose-equivalent) from galactic cosmic rays behind typical spacecraft materials. These results tell us how the radiation risk at the ISS is related to nuclear cross sections at different energies, and consequently how to most efficiently reduce the physical uncertainty in our predictions on the radiation risk at the ISS.

  5. Characterization of a hybrid target multi-keV x-ray source by a multi-parameter statistical analysis of titanium K-shell emission

    DOE PAGES

    Primout, M.; Babonneau, D.; Jacquet, L.; ...

    2015-11-10

    We studied the titanium K-shell emission spectra from multi-keV x-ray source experiments with hybrid targets on the OMEGA laser facility. Using the collisional-radiative TRANSPEC code, dedicated to K-shell spectroscopy, we reproduced the main features of the detailed spectra measured with the time-resolved MSPEC spectrometer. We developed a general method to infer the Ne, Te, and Ti characteristics of the target plasma from the spectral analysis (ratio of integrated Lyman-α to Helium-α in-band emission and the peak amplitudes of individual line ratios) of the multi-keV x-ray emission. Finally, these thermodynamic conditions are compared to those calculated independently by the radiation-hydrodynamics transport code FCI2.

  6. Studying the response of a plastic scintillator to gamma rays using the Geant4 Monte Carlo code.

    PubMed

    Ghadiri, Rasoul; Khorsandi, Jamshid

    2015-05-01

    To determine the gamma-ray response function of an NE-102 scintillator and to investigate the gamma spectra resulting from the transport of optical photons, we simulated an NE-102 scintillator using the Geant4 code. The results of the simulation were compared with experimental data, and good consistency between the two was observed. In addition, the time, spatial, and energy distributions, along with the surface treatments of scintillation detectors, were calculated. This simulation enables optimization of the photomultiplier tube (or photodiode) position to yield the best coupling to the detector.
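
    A quick sanity check on any simulated gamma response function is the Compton edge position: the maximum energy a photon can transfer to an electron in a single scatter.

    ```python
    def compton_edge(e_gamma, m_e=0.511):
        """Compton edge: maximum kinetic energy (MeV) transferred to an
        electron by a photon of energy e_gamma (MeV),
        T_max = 2*E^2 / (m_e*c^2 + 2*E)."""
        return 2.0 * e_gamma ** 2 / (m_e + 2.0 * e_gamma)

    print(compton_edge(0.662))   # Cs-137: ~0.478 MeV
    ```

    In a plastic scintillator like NE-102, which has negligible photopeak efficiency, the simulated spectrum should cut off near this energy.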

  7. Nuclear radiation analysis

    NASA Technical Reports Server (NTRS)

    Knies, R. J.; Byrn, N. R.; Smith, H. T.

    1972-01-01

    A study program of radiation shielding against the deleterious effects of nuclear radiation on man and equipment is reported. The methods used to analyze the radiation environment from bremsstrahlung photons are discussed, along with the methods employed by transport code users. The theory and numerical methods used to solve the transport of neutrons and gamma rays are described, and the neutron and cosmic-ray fluxes that would be present at the gamma-ray telescope are analyzed.

  8. The FLUKA code for space applications: recent developments

    NASA Technical Reports Server (NTRS)

    Andersen, V.; Ballarini, F.; Battistoni, G.; Campanella, M.; Carboni, M.; Cerutti, F.; Empl, A.; Fasso, A.; Ferrari, A.; Gadioli, E.; hide

    2004-01-01

    The FLUKA Monte Carlo transport code is widely used for fundamental research, radioprotection and dosimetry, hybrid nuclear energy systems, and cosmic ray calculations. The validity of its physical models has been benchmarked against a variety of experimental data over a wide range of energies, from accelerator data to cosmic ray showers in the Earth's atmosphere. The code is presently undergoing several developments to better fit the needs of space applications. The generation of particle spectra according to up-to-date cosmic ray data, as well as the effects of solar and geomagnetic modulation, have been implemented and already successfully applied to a variety of problems. The implementation of suitable models for heavy ion nuclear interactions has reached an operational stage. At medium/high energy, FLUKA uses the DPMJET model. The major task of incorporating heavy ion interactions from a few GeV/n down to the threshold for inelastic collisions is also progressing, and promising results have been obtained using a modified version of the RQMD-2.4 code. This interim solution is now fully operational, while awaiting the development of new models based on the FLUKA hadron-nucleus interaction code, a newly developed QMD code, and the implementation of the Boltzmann master equation theory for low-energy ion interactions.

  9. DXRaySMCS: a user-friendly interface developed for prediction of diagnostic radiology X-ray spectra produced by Monte Carlo (MCNP-4C) simulation.

    PubMed

    Bahreyni Toossi, M T; Moradi, H; Zare, H

    2008-01-01

    In this work, the general purpose Monte Carlo N-particle radiation transport computer code (MCNP-4C) was used for the simulation of X-ray spectra in diagnostic radiology. The electron's path in the target was followed until its energy was reduced to 10 keV. A user-friendly interface named 'diagnostic X-ray spectra by Monte Carlo simulation (DXRaySMCS)' was developed to facilitate the application of the MCNP-4C code for diagnostic radiology spectrum prediction. The program provides a user-friendly interface for: (i) modifying the MCNP input file, (ii) launching the MCNP program to simulate electron and photon transport and (iii) processing the MCNP output file to yield a summary of the results (relative photon number per energy bin). In this article, the development and characteristics of DXRaySMCS are outlined. As part of the validation process, output spectra for 46 diagnostic radiology system settings produced by DXRaySMCS were compared with the corresponding IPEM78 spectra. Generally, there is good agreement between the two sets of spectra. No statistically significant differences were observed between the IPEM78 reported spectra and the simulated spectra generated in this study.
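
    The three-step driver loop described above (edit the input deck, run the transport code, post-process the tally) can be sketched as follows. The template placeholder, tally format, and function names are hypothetical illustrations, not the actual DXRaySMCS or MCNP-4C interfaces; step (ii), the licensed MCNP run itself, is represented only by a comment.

```python
# Hypothetical sketch of a DXRaySMCS-style driver loop.

def build_input(template: str, kvp: float) -> str:
    """Step (i): substitute the tube voltage into an input template."""
    return template.replace("{KVP}", f"{kvp:.1f}")

# Step (ii) would launch the transport run here, e.g. via subprocess.

def relative_spectrum(tally_lines):
    """Step (iii): turn 'energy counts' tally lines into relative
    photon number per energy bin (fractions summing to 1)."""
    pairs = [(float(e), float(c))
             for e, c in (line.split() for line in tally_lines if line.strip())]
    total = sum(c for _, c in pairs)
    return [(e, c / total) for e, c in pairs]

deck = build_input("c tube voltage {KVP} kV", 80.0)
spectrum = relative_spectrum(["20 5.0", "40 15.0", "60 30.0"])
```

    Normalizing to relative photon number per bin, rather than absolute fluence, is what allows the direct comparison against published spectra such as IPEM78.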

  10. Correlated prompt fission data in transport simulations

    DOE PAGES

    Talou, P.; Vogt, R.; Randrup, J.; ...

    2018-01-24

    Detailed information on the fission process can be inferred from the observation, modeling and theoretical understanding of prompt fission neutron and γ-ray observables. Beyond simple average quantities, the study of distributions and correlations in prompt data, e.g., multiplicity-dependent neutron and γ-ray spectra, angular distributions of the emitted particles, and n-n, n-γ, and γ-γ correlations, can place stringent constraints on fission models and parameters that would otherwise be free to be tuned separately to reproduce individual fission observables. The FREYA and CGMF codes have been developed to follow the sequential emissions of prompt neutrons and γ rays from the initial excited fission fragments produced right after scission. Both codes implement Monte Carlo techniques to sample initial fission fragment configurations in mass, charge and kinetic energy and sample probabilities of neutron and γ emission at each stage of the decay. This approach naturally leads to using simple but powerful statistical techniques to infer distributions and correlations among many observables and model parameters. The comparison of model calculations with experimental data provides a rich arena for testing various nuclear physics models, such as those related to the nuclear structure and level densities of neutron-rich nuclei, the γ-ray strength functions of dipole and quadrupole transitions, the mechanism for dividing the excitation energy between the two nascent fragments near scission, and the mechanisms behind the production of angular momentum in the fragments. Beyond the obvious interest from a fundamental physics point of view, such studies are also important for addressing data needs in various nuclear applications.
    The inclusion of the FREYA and CGMF codes into the MCNP6.2 and MCNPX-PoliMi transport codes, for instance, provides a new and powerful tool to simulate correlated fission events in neutron transport calculations important in nonproliferation, safeguards, nuclear energy, and defense programs. This review provides an overview of the topic, starting from theoretical considerations of the fission process, with a focus on correlated signatures. It then explores the status of experimental correlated fission data and current efforts to address some of the known shortcomings. Numerical simulations employing the FREYA and CGMF codes are compared to experimental data for a wide range of correlated fission quantities. The inclusion of those codes into the MCNP6.2 and MCNPX-PoliMi transport codes is described and discussed in the context of relevant applications. The accuracy of the model predictions and their sensitivity to model assumptions and input parameters are discussed. Lastly, a series of important experimental and theoretical questions that remain unanswered are presented, suggesting a renewed effort to address these shortcomings.
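
    The sequential-emission sampling that FREYA and CGMF perform can be caricatured with a deliberately simplified toy model. The separation energy, the Gaussian excitation distribution, and the flat kinetic-energy draw below are placeholders for illustration, not the physics in either code:

```python
import random

random.seed(1234)
SEP_ENERGY = 6.0  # MeV, a single illustrative neutron separation energy

def sample_prompt_neutrons(excitation_mev):
    """Emit neutrons sequentially from an excited fragment until the
    remaining excitation energy falls below the separation energy,
    returning the sampled neutron kinetic energies."""
    energies = []
    u = excitation_mev
    while u > SEP_ENERGY:
        kinetic = random.uniform(0.5, 2.5)   # placeholder spectrum draw
        u -= SEP_ENERGY + kinetic            # energy cost of one emission
        energies.append(kinetic)
    return energies

# Average prompt multiplicity over many sampled fragments:
mults = [len(sample_prompt_neutrons(random.gauss(20.0, 5.0)))
         for _ in range(10000)]
nu_bar = sum(mults) / len(mults)
```

    Because each fission event yields a full list of emitted particles rather than just averages, event-by-event sampling of this kind is what makes multiplicity-gated spectra and n-n or n-γ correlations accessible to the analysis.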

  12. Energy transport in plasmas produced by a high brightness krypton fluoride laser focused to a line

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Al-Hadithi, Y.; Tallents, G.J.; Zhang, J.

    A high brightness krypton fluoride Raman laser (wavelength 0.268 μm) generating 0.3 TW, 12 ps pulses with 20 μrad beam divergence and a prepulse of less than 10⁻¹⁰ has been focused to produce a 10 μm wide line focus (irradiances ~0.8–4×10¹⁵ W cm⁻²) on plastic targets with a diagnostic sodium fluoride (NaF) layer buried within the target. Axial and lateral transport of energy has been measured by analysis of x-ray images of the line focus and from x-ray spectra emitted by the layer of NaF with varying overlay thicknesses. It is shown that the ratio of the distance between the critical density surface and the ablation surface to the laser focal width controls lateral transport in a similar manner as for previous spot focus experiments. The measured axial energy transport is compared to MEDUSA [J. P. Christiansen, D. E. T. F. Ashby, and K. V. Roberts, Comput. Phys. Commun. 7, 271 (1974)] one-dimensional hydrodynamic code simulations with an average-atom post-processor for predicting spectral line intensities. An energy absorption of ~10% in the code gives agreement with the experimental axial penetration. Various measured line ratios of hydrogen- and helium-like Na and F are investigated as temperature diagnostics in the NaF layer using the RATION [R. W. Lee, B. L. Whitten, and R. E. Strout, J. Quant. Spectrosc. Radiat. Transfer 32, 91 (1984)] code.

  13. Monte Carlo Analysis of Pion Contribution to Absorbed Dose from Galactic Cosmic Rays

    NASA Technical Reports Server (NTRS)

    Aghara, S.K.; Blattnig, S.R.; Norbury, J.W.; Singleterry, R.C.

    2009-01-01

    Accurate knowledge of the physics of interaction, particle production and transport is necessary to estimate the radiation damage to equipment used on spacecraft and the biological effects of space radiation. For long duration astronaut missions, both on the International Space Station and the planned manned missions to Moon and Mars, the shielding strategy must include a comprehensive knowledge of the secondary radiation environment. The distribution of absorbed dose and dose equivalent is a function of the type, energy and population of these secondary products. Galactic cosmic rays (GCR) comprised of protons and heavier nuclei have energies from a few MeV per nucleon to the ZeV region, with the spectra reaching flux maxima in the hundreds of MeV range. Therefore, the MeV - GeV region is most important for space radiation. Coincidentally, the pion production energy threshold is about 280 MeV. The question naturally arises as to how important these particles are with respect to space radiation problems. The space radiation transport code, HZETRN (High charge (Z) and Energy TRaNsport), currently used by NASA, performs neutron, proton and heavy ion transport explicitly, but it does not take into account the production and transport of mesons, photons and leptons. In this paper, we present results from the Monte Carlo code MCNPX (Monte Carlo N-Particle eXtended), showing the effect of leptons and mesons when they are produced and transported in a GCR environment.
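
    The ~280 MeV threshold quoted above follows from relativistic kinematics for single neutral-pion production in a fixed-target proton-proton collision, and can be checked directly:

```python
# Lab-frame kinetic energy threshold for p + p -> p + p + pi0.
M_P = 938.272    # proton mass, MeV/c^2
M_PI0 = 134.977  # neutral pion mass, MeV/c^2

def pion_threshold_kinetic():
    """Threshold from invariant mass: s must reach (2*m_p + m_pi)^2,
    so E_lab = (M_final^2 - 2*m_p^2) / (2*m_p) and T = E_lab - m_p."""
    m_final = 2.0 * M_P + M_PI0
    e_lab = (m_final**2 - 2.0 * M_P**2) / (2.0 * M_P)
    return e_lab - M_P

threshold = pion_threshold_kinetic()   # approximately 280 MeV
```

    Since this threshold sits just below the flux maximum of the GCR spectrum, a substantial fraction of GCR protons can produce pions, which is precisely why their omission from a transport code is worth quantifying.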

  14. Radiation Transport and Shielding for Space Exploration and High Speed Flight Transportation

    NASA Technical Reports Server (NTRS)

    Maung, Khin Maung; Tripathi, R. K.

    1997-01-01

    Transport of ions and neutrons in matter is of direct interest in several technologically important and scientific areas, including space radiation, cosmic ray propagation studies in the galactic medium, nuclear power plants, and radiological effects that impact industrial and public health. For the proper assessment of radiation exposure, both reliable transport codes and accurate data are needed. Nuclear cross section data are one of the essential inputs to the transport codes. In order to obtain an accurate parametrization of cross section data, theoretical input is indispensable, especially for processes where there is little or no experimental data available. In this grant period, work was done on the use of relativistic equations and their one-body limits. The results will be useful in choosing an appropriate effective one-body equation for reaction calculations. Work was also done to improve the database needed for the transport codes used in studies of radiation transport and shielding for space exploration and high speed flight transportation. A phenomenological model was developed for total absorption cross sections valid for any system of charged and/or uncharged collision pairs over the entire energy range. The success of the model is gratifying: it is being used by other federal agencies, national labs and universities. A list of publications based on the work during the grant period is given below, and copies are enclosed with this report.

  15. Kinetic Modeling of Ultraintense X-ray Laser-Matter Interactions

    NASA Astrophysics Data System (ADS)

    Royle, Ryan; Sentoku, Yasuhiko; Mancini, Roberto

    2016-10-01

    Hard x-ray free-electron lasers (XFELs) have had a profound impact on the physical, chemical, and biological sciences. They can produce millijoule x-ray laser pulses just tens of femtoseconds in duration with more than 10¹² photons each, making them the brightest laboratory x-ray sources ever produced, by several orders of magnitude. An XFEL pulse can be intensified to 10²⁰ W/cm² when focused to submicron spot sizes, making it possible to isochorically heat solid matter well beyond 100 eV. These characteristics enable XFELs to create and probe well-characterized warm and hot dense plasmas of relevance to HED science, planetary science, laboratory astrophysics, relativistic laser plasmas, and fusion research. Several newly developed atomic physics models, including photoionization, Auger ionization, and continuum lowering, have been implemented in a particle-in-cell code, PICLS, which self-consistently solves the x-ray transport, to enable the simulation of the non-LTE plasmas created by ultraintense x-ray laser interactions with solid density matter. The code is validated against the results of several recent experiments and is used to simulate the maximum-intensity x-ray heating of solid iron targets. This work was supported by DOE/OFES under Contract No. DE-SC0008827.

  16. DIAPHANE: A portable radiation transport library for astrophysical applications

    NASA Astrophysics Data System (ADS)

    Reed, Darren S.; Dykes, Tim; Cabezón, Rubén; Gheller, Claudio; Mayer, Lucio

    2018-05-01

    One of the most computationally demanding aspects of the hydrodynamical modeling of astrophysical phenomena is the transport of energy by radiation or relativistic particles. Physical processes involving energy transport are ubiquitous and of central importance in many scenarios, ranging from planet formation to cosmic structure evolution, including explosive events like core-collapse supernovae or gamma-ray bursts. Moreover, the ability to model, and hence understand, these processes has often been limited by approximations and incompleteness in the treatment of radiation and relativistic particles. The DIAPHANE project has focused on developing a portable and scalable library that handles the transport of radiation and particles (in particular neutrinos) independently of the underlying hydrodynamic code. In this work, we present the computational framework and the functionalities of the first version of the DIAPHANE library, which has been successfully ported to three different smoothed-particle hydrodynamics codes, GADGET2, GASOLINE and SPHYNX. We also present validation of different modules solving the equations of radiation and neutrino transport using different numerical schemes.

  17. Galactic cosmic ray radiation levels in spacecraft on interplanetary missions

    NASA Technical Reports Server (NTRS)

    Shinn, J. L.; Nealy, J. E.; Townsend, L. W.; Wilson, J. W.; Wood, J.S.

    1994-01-01

    Using the Langley Research Center Galactic Cosmic Ray (GCR) transport computer code (HZETRN) and the Computerized Anatomical Man (CAM) model, crew radiation levels inside manned spacecraft on interplanetary missions are estimated. These radiation-level estimates include particle fluxes, linear energy transfer (LET) spectra, absorbed dose, and dose equivalent within various organs of interest in GCR protection studies. Changes in these radiation levels resulting from the use of various types of shield materials are presented.

  18. Effects of nuclear cross sections at different energies on the radiation hazard from galactic cosmic rays.

    PubMed

    Lin, Z W; Adams, J H

    2007-03-01

    The radiation hazard for astronauts from galactic cosmic rays (GCR) is a major obstacle to long-duration human space exploration. Space radiation transport codes have been developed to calculate the radiation environment on missions to the Moon, Mars, and beyond. We have studied how uncertainties in fragmentation cross sections at different energies affect the accuracy of predictions from such radiation transport calculations. We find that, in deep space, cross sections at energies between 0.3 and 0.85 GeV/nucleon have the largest effect in solar maximum GCR environments. At the International Space Station, cross sections at higher energies have the largest effect due to the geomagnetic cutoff.
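
    Why cross-section uncertainties in a particular energy band matter can be seen in a one-dimensional toy calculation: a ±10% change in a fragmentation cross section is amplified exponentially in the primary fluence surviving behind a slab. The cross-section value below is a placeholder, and the number density is roughly that of aluminum; this is a sketch of the sensitivity idea, not the actual transport calculation in the study.

```python
import math

def surviving_fraction(sigma_cm2, n_per_cm3, depth_cm):
    """Primary-particle attenuation exp(-n * sigma * x) in a uniform slab."""
    return math.exp(-n_per_cm3 * sigma_cm2 * depth_cm)

SIGMA = 1.2e-24   # cm^2, placeholder fragmentation cross section
N_AL = 6.0e22     # atoms/cm^3, approximate aluminum number density
DEPTH = 10.0      # cm of shielding

base = surviving_fraction(SIGMA, N_AL, DEPTH)
high = surviving_fraction(1.1 * SIGMA, N_AL, DEPTH)  # sigma raised by 10%
low = surviving_fraction(0.9 * SIGMA, N_AL, DEPTH)   # sigma lowered by 10%
```

    The relative change in transmitted fluence grows with n·σ·x in the exponent, which is why cross-section accuracy matters most in the energy bands that dominate the attenuated spectrum.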

  19. Recent Developments in Three Dimensional Radiation Transport Using the Green's Function Technique

    NASA Technical Reports Server (NTRS)

    Rockell, Candice; Tweed, John; Blattnig, Steve R.; Mertens, Christopher J.

    2010-01-01

    In the future, astronauts will be sent into space for longer durations than on previous missions. The increased risk of exposure to dangerous radiation, such as Galactic Cosmic Rays and Solar Particle Events, is of great concern. Consequently, steps must be taken to ensure astronaut safety by providing adequate shielding. In order to better determine and verify shielding requirements, an accurate and efficient radiation transport code based on a fully three-dimensional radiation transport model using the Green's function technique is being developed.

  20. Simulations of Laboratory Astrophysics Experiments using the CRASH code

    NASA Astrophysics Data System (ADS)

    Trantham, Matthew; Kuranz, Carolyn; Fein, Jeff; Wan, Willow; Young, Rachel; Keiter, Paul; Drake, R. Paul

    2015-11-01

    Computer simulations can assist in the design and analysis of laboratory astrophysics experiments. The Center for Radiative Shock Hydrodynamics (CRASH) at the University of Michigan developed a code that has been used to design and analyze high-energy-density experiments on OMEGA, NIF, and other large laser facilities. This Eulerian code uses block-adaptive mesh refinement (AMR) with implicit multigroup radiation transport, electron heat conduction and laser ray tracing. This poster will demonstrate some of the experiments the CRASH code has helped design or analyze including: Kelvin-Helmholtz, Rayleigh-Taylor, magnetized flows, jets, and laser-produced plasmas. This work is funded by the following grants: DEFC52-08NA28616, DE-NA0001840, and DE-NA0002032.

  1. Simulation of the Mg(Ar) ionization chamber currents by different Monte Carlo codes in benchmark gamma fields

    NASA Astrophysics Data System (ADS)

    Lin, Yi-Chun; Liu, Yuan-Hao; Nievaart, Sander; Chen, Yen-Fu; Wu, Shu-Wei; Chou, Wen-Tsae; Jiang, Shiang-Huei

    2011-10-01

    High energy photon (over 10 MeV) and neutron beams adopted in radiobiology and radiotherapy always produce mixed neutron/gamma-ray fields. Mg(Ar) ionization chambers are commonly applied to determine the gamma-ray dose because of their low neutron sensitivity. Nowadays, many perturbation corrections for accurate dose estimation and many treatment planning systems are based on Monte Carlo techniques. The Monte Carlo codes EGSnrc, FLUKA, GEANT4, MCNP5, and MCNPX were used to evaluate the energy dependent response functions of the Exradin M2 Mg(Ar) ionization chamber to a parallel photon beam with mono-energies from 20 keV to 20 MeV. For validation, measurements were carefully performed in well-defined fields: (a) a primary M-100 X-ray calibration field, (b) a primary ⁶⁰Co calibration beam, and (c) 6-MV and (d) 10-MV therapeutic beams in hospital. In the energy region below 100 keV, MCNP5 and MCNPX both had lower responses than the other codes. For energies above 1 MeV, the MCNP ITS-mode results closely matched the other three codes, and the differences were within 5%. Compared to the measured currents, MCNP5 and MCNPX using the ITS mode agreed very well with the ⁶⁰Co and 10-MV beams, but in the X-ray energy region the deviations reached 17%. This work provides better insight into the performance of different Monte Carlo codes in photon-electron transport calculations. Regarding mixed field dosimetry applications such as BNCT, MCNP with ITS mode is identified by this work as the most suitable tool.
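
    The percent-level comparisons quoted above reduce to a simple figure of merit: the largest relative deviation between matched energy-bin responses of two codes. A minimal helper of this kind (hypothetical, not part of any of the codes named) makes the "within 5%" statement concrete:

```python
def max_relative_deviation(reference, other):
    """Largest |other - reference| / reference over matched energy bins,
    the quantity behind statements like 'differences within 5%'."""
    return max(abs(o - r) / r for r, o in zip(reference, other))

# Example: two response curves that agree to within 5%.
dev = max_relative_deviation([1.0, 2.0, 4.0], [1.05, 1.9, 4.0])
```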

  2. The EGS4 Code System: Solution of Gamma-ray and Electron Transport Problems

    DOE R&D Accomplishments Database

    Nelson, W. R.; Namito, Yoshihito

    1990-03-01

    In this paper we present an overview of the EGS4 Code System -- a general purpose package for the Monte Carlo simulation of the transport of electrons and photons. During the last 10-15 years EGS has been widely used to design accelerators and detectors for high-energy physics. More recently the code has been found to be of tremendous use in medical radiation physics and dosimetry. The problem-solving capabilities of EGS4 will be demonstrated by means of a variety of practical examples. To facilitate this review, we will take advantage of a new add-on package, called SHOWGRAF, to display particle trajectories in complicated geometries. These are shown as 2-D laser pictures in the written paper and as photographic slides of a 3-D high-resolution color monitor during the oral presentation. 11 refs., 15 figs.

  3. Assessment and Requirements of Nuclear Reaction Databases for GCR Transport in the Atmosphere and Structures

    NASA Technical Reports Server (NTRS)

    Cucinotta, F. A.; Wilson, J. W.; Shinn, J. L.; Tripathi, R. K.

    1998-01-01

    The transport properties of galactic cosmic rays (GCR) in the atmosphere, material structures, and human body (self-shielding) are of interest in risk assessment for supersonic and subsonic aircraft and for space travel in low-Earth orbit and on interplanetary missions. Nuclear reactions, such as knockout and fragmentation, produce large modifications of the particle types and energies of the galactic cosmic rays penetrating materials. We assess the current nuclear reaction models and improvements in these models for developing the required transport code databases. A new fragmentation database (QMSFRG) based on microscopic models is compared to the NUCFRG2 model, and implications for shield assessment are made using the HZETRN radiation transport code. For deep penetration problems, the build-up of light particles, such as nucleons, light clusters and mesons from nuclear reactions, in conjunction with the absorption of the heavy ions, leads to the dominance of the charge Z = 0, 1, and 2 hadrons in the exposures at large penetration depths. Light particles are produced through nuclear or cluster knockout and in evaporation events with characteristically distinct spectra which play unique roles in the build-up of secondary radiation in shielding. We describe models of light particle production in nucleon and heavy ion induced reactions and assess the importance of light particle multiplicity and spectral parameters in these exposures.

  4. Synergism of the method of characteristics and CAD technology for neutron transport calculation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Z.; Wang, D.; He, T.

    2013-07-01

    The method of characteristics (MOC) has been a very popular methodology in neutron transport calculation and numerical simulation in recent decades because of its unique advantages. One of the key problems determining whether MOC can be applied in complicated and highly heterogeneous geometry is how to combine an effective geometry processing method with MOC. Most existing MOC codes describe the geometry by lines and arcs with extensive input data, such as circles, ellipses, regular polygons and combinations of them. Thus they have difficulty in geometry modeling, background meshing and ray tracing for complicated geometry domains. In this study, a new approach making use of the CAD solid modeler MCAM, a CAD/image-based automatic modeling program for neutronics and radiation transport developed by the FDS Team in China, was introduced for geometry modeling and ray tracing of particle transport to remove the geometrical limitations mentioned above. The diamond-difference scheme was applied to MOC to reduce the spatial discretization error of the flat flux approximation. Based on MCAM and MOC, a new MOC code was developed and integrated into the SuperMC system, a super multi-function computational system for neutronics and radiation simulation. The numerical testing results demonstrated the feasibility and effectiveness of the new approach for geometry treatment in SuperMC. (authors)
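
    The ray-tracing step MOC depends on can be illustrated on the simplest possible geometry: cutting one characteristic into per-cell segment lengths across a uniform Cartesian mesh. This toy tracer (restricted to rays heading up and to the right, to keep the sketch short) stands in for the CAD-based tracing described above:

```python
import math

def trace_segments(x0, y0, angle, grid_n, cell):
    """Cut a characteristic ray into ((i, j), length) segments across a
    uniform grid_n x grid_n mesh with cell size 'cell'.  Only directions
    with non-negative x and y components are handled here."""
    dx, dy = math.cos(angle), math.sin(angle)
    x, y = x0, y0
    size = grid_n * cell
    segments = []
    while 0.0 <= x < size and 0.0 <= y < size:
        i, j = int(x // cell), int(y // cell)
        # distance along the ray to the next vertical / horizontal grid line
        tx = ((i + 1) * cell - x) / dx if dx > 0 else float("inf")
        ty = ((j + 1) * cell - y) / dy if dy > 0 else float("inf")
        step = min(tx, ty) + 1e-12    # tiny nudge across the cell boundary
        segments.append(((i, j), step))
        x, y = x + step * dx, y + step * dy
    return segments

# A horizontal ray through a 4x4 unit mesh crosses four cells of unit length.
segs = trace_segments(0.0, 0.5, 0.0, 4, 1.0)
```

    In a production MOC solver these segment lengths feed the per-cell flux sweep; the point of a CAD-coupled tracer like MCAM is to produce the same (cell, length) pairs for arbitrary solid-model regions instead of a structured grid.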

  5. Effects of target fragmentation on evaluation of LET spectra from space radiations: implications for space radiation protection studies

    NASA Technical Reports Server (NTRS)

    Cucinotta, F. A.; Wilson, J. W.; Shinn, J. L.; Badavi, F. F.; Badhwar, G. D.

    1996-01-01

    We present calculations of linear energy transfer (LET) spectra in low Earth orbit from galactic cosmic rays and trapped protons using the HZETRN/BRYNTRN computer code. The emphasis of our calculations is on the analysis of the effects of secondary nuclei produced through target fragmentation in the spacecraft shield or detectors. Recent improvements in the HZETRN/BRYNTRN radiation transport computer code are described. Calculations show that at large values of LET (>100 keV/μm) the LET spectra seen in free space and low Earth orbit (LEO) are dominated by target fragments and not the primary nuclei. Although the evaluation of microdosimetric spectra is not considered here, calculations of LET spectra indicate that the large lineal energy (y) events are dominated by target fragments. Finally, we discuss the situation for interplanetary exposures to galactic cosmic rays and show that current radiation transport codes predict that, in the region of high LET values, the LET spectra at significant shield depths (>10 g/cm² of Al) are greatly modified by target fragments. These results suggest that studies of track structure and biological response to space radiation should place emphasis on short tracks of medium-charge fragments produced in the human body by high energy protons and neutrons.

  6. Overview of Particle and Heavy Ion Transport Code System PHITS

    NASA Astrophysics Data System (ADS)

    Sato, Tatsuhiko; Niita, Koji; Matsuda, Norihiro; Hashimoto, Shintaro; Iwamoto, Yosuke; Furuta, Takuya; Noda, Shusaku; Ogawa, Tatsuhiko; Iwase, Hiroshi; Nakashima, Hiroshi; Fukahori, Tokio; Okumura, Keisuke; Kai, Tetsuya; Chiba, Satoshi; Sihver, Lembit

    2014-06-01

    A general purpose Monte Carlo Particle and Heavy Ion Transport code System, PHITS, is being developed through the collaboration of several institutes in Japan and Europe. The Japan Atomic Energy Agency is responsible for managing the entire project. PHITS can deal with the transport of nearly all particles, including neutrons, protons, heavy ions, photons, and electrons, over wide energy ranges using various nuclear reaction models and data libraries. It is written in Fortran and can be executed on almost all computers. All components of PHITS, such as its source, executable and data-library files, are assembled in one package and distributed to many countries via the Research Organization for Information Science and Technology, the Data Bank of the Organization for Economic Co-operation and Development's Nuclear Energy Agency, and the Radiation Safety Information Computational Center. More than 1,000 researchers have been registered as PHITS users, and they apply the code to various research and development fields such as nuclear technology, accelerator design, medical physics, and cosmic-ray research. This paper briefly summarizes the physics models implemented in PHITS and introduces some important functions useful for specific applications, such as an event generator mode and beam transport functions.

  7. Nucleon-Nucleon Total Cross Section

    NASA Technical Reports Server (NTRS)

    Norbury, John W.

    2008-01-01

    The total proton-proton and neutron-proton cross sections currently used in the transport code HZETRN show significant disagreement with experiment in the GeV and EeV energy ranges. The GeV range is near the region of maximum cosmic ray intensity. It is therefore important to correct these cross sections, so that predictions of space radiation environments will be accurate. Parameterizations of nucleon-nucleon total cross sections are developed which are accurate over the entire energy range of the cosmic ray spectrum.
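
    Once such a parameterization (or an evaluated table) exists, a transport code consumes it through a fast lookup. A log-log interpolation sketch of that lookup follows; the table values are illustrative placeholders only, not the HZETRN parameterization or evaluated nucleon-nucleon data:

```python
import bisect
import math

# Placeholder table: lab kinetic energy (GeV) vs. total cross section (mb).
TABLE_E = [0.1, 1.0, 10.0, 100.0]
TABLE_SIGMA = [33.0, 47.0, 40.0, 39.0]

def sigma_total(e_gev):
    """Piecewise log-log interpolation between tabulated points,
    clamped to the outermost intervals of the table."""
    i = bisect.bisect_right(TABLE_E, e_gev) - 1
    i = max(0, min(i, len(TABLE_E) - 2))
    x0, x1 = math.log(TABLE_E[i]), math.log(TABLE_E[i + 1])
    y0, y1 = math.log(TABLE_SIGMA[i]), math.log(TABLE_SIGMA[i + 1])
    t = (math.log(e_gev) - x0) / (x1 - x0)
    return math.exp(y0 + t * (y1 - y0))
```

    Log-log interpolation is the conventional choice for cross sections because they vary smoothly over decades in energy, exactly the range a cosmic-ray spectrum spans.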

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mueller, Bernhard; Janka, Hans-Thomas; Dimmelmeier, Harald, E-mail: bjmuellr@mpa-garching.mpg.d, E-mail: thj@mpa-garching.mpg.d, E-mail: harrydee@mpa-garching.mpg.d

    We present a new general relativistic code for hydrodynamical supernova simulations with neutrino transport in spherical and azimuthal symmetry (one dimension and two dimensions, respectively). The code is a combination of the COCONUT hydro module, which is a Riemann-solver-based, high-resolution shock-capturing method, and the three-flavor, fully energy-dependent VERTEX scheme for the transport of massless neutrinos. VERTEX integrates the coupled neutrino energy and momentum equations with a variable Eddington factor closure computed from a model Boltzmann equation and uses the 'ray-by-ray plus' approximation in two dimensions, assuming the neutrino distribution to be axially symmetric around the radial direction at every point in space, and thus the neutrino flux to be radial. Our spacetime treatment employs the Arnowitt-Deser-Misner 3+1 formalism with the conformal flatness condition for the spatial three metric. This approach is exact for the one-dimensional case and has previously been shown to yield very accurate results for spherical and rotational stellar core collapse. We introduce new formulations of the energy equation to improve total energy conservation in relativistic and Newtonian hydro simulations with grid-based Eulerian finite-volume codes. Moreover, a modified version of the VERTEX scheme is developed that simultaneously conserves energy and lepton number in the neutrino transport with better accuracy and higher numerical stability in the high-energy tail of the spectrum. To verify our code, we conduct a series of tests in spherical symmetry, including a detailed comparison with published results of the collapse, shock formation, shock breakout, and accretion phases. 
    Long-time simulations of proto-neutron star cooling until several seconds after core bounce both demonstrate the robustness of the new COCONUT-VERTEX code and show the approximate treatment of relativistic effects by means of an effective relativistic gravitational potential as in PROMETHEUS-VERTEX to be remarkably accurate in spherical symmetry.

  9. Comparison of distributed acceleration and standard models of cosmic-ray transport

    NASA Technical Reports Server (NTRS)

    Letaw, J. R.; Silberberg, R.; Tsao, C. H.

    1995-01-01

    Recent cosmic-ray abundance measurements for elements in the range 3 ≤ Z ≤ 28 and energies 10 MeV/n ≤ E ≤ 1 TeV/n have been analyzed with computer transport modeling. About 500 elemental and isotopic measurements have been explored in this analysis. The transport code includes the effects of ionization losses, nuclear spallation reactions (including those of secondaries), all nuclear decay modes, stripping and attachment of electrons, escape from the Galaxy, weak reacceleration and solar modulation. Four models of reacceleration (with several submodels of various reacceleration strengths) were explored. A χ² analysis shows that the reacceleration models yield at least equally good fits to the data as the standard propagation model. However, with reacceleration, the ad hoc assumptions of the standard model regarding discontinuities in the energy dependence of the mean path length traversed by cosmic rays, and in the momentum spectrum of the cosmic-ray source, are eliminated. Furthermore, the difficulty of reconciling rigidity-dependent leakage with energy-independent anisotropy below energies of 10¹⁴ eV is alleviated.

  10. Three new extreme ultraviolet spectrometers on NSTX-U for impurity monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weller, M. E., E-mail: weller4@llnl.gov; Beiersdorfer, P.; Soukhanovskii, V. A.

    2016-11-15

Three extreme ultraviolet (EUV) spectrometers have been mounted on the National Spherical Torus Experiment–Upgrade (NSTX-U). All three are flat-field grazing-incidence spectrometers and are dubbed X-ray and Extreme Ultraviolet Spectrometer (XEUS, 8–70 Å), Long-Wavelength Extreme Ultraviolet Spectrometer (LoWEUS, 190–440 Å), and Metal Monitor and Lithium Spectrometer Assembly (MonaLisa, 50–220 Å). XEUS and LoWEUS were previously implemented on NSTX to monitor impurities from low- to high-Z sources and to study impurity transport, while MonaLisa is new and provides the system with increased spectral coverage. The spectrometers will also be a critical diagnostic on the planned laser blow-off system for NSTX-U, which will be used for impurity edge and core ion transport studies, edge-transport code development, and benchmarking atomic physics codes.

  11. Modeling the Production of Beta-Delayed Gamma Rays for the Detection of Special Nuclear Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hall, J M; Pruet, J A; Brown, D A

    2005-02-14

The objective of this LDRD project was to develop one or more models for the production of {beta}-delayed {gamma} rays following neutron-induced fission of a special nuclear material (SNM) and to define a standardized formatting scheme which will allow them to be incorporated into some of the modern, general-purpose Monte Carlo transport codes currently being used to simulate inspection techniques proposed for detecting fissionable material hidden in sea-going cargo containers. In this report, we describe a Monte Carlo model for {beta}-delayed {gamma}-ray emission following the fission of SNM that can accommodate arbitrary time-dependent fission rates and photon collection histories. The model involves direct sampling of the independent fission yield distributions of the system, the branching ratios for decay of individual fission products, and spectral distributions representing photon emission from each fission product and for each decay mode. While computationally intensive, this model is shown to provide reasonably detailed estimates of the spectra that would be recorded by an arbitrary spectrometer, and it may prove quite useful in assessing the quality of evaluated data libraries and identifying gaps in the libraries. The accuracy of the model is illustrated by comparing calculated and experimental spectra from the decay of short-lived fission products following the reactions {sup 235}U(n{sub th}, f) and {sup 239}Pu(n{sub th}, f). For general-purpose transport calculations, where a detailed consideration of the large number of individual {gamma}-ray transitions in a spectrum may not be necessary, it is shown that a simple parameterization of the {gamma}-ray source function can be defined which provides high-quality average spectral distributions that should suffice for calculations describing photons being transported through thick attenuating media. Finally, a proposal for ENDF-compatible formats that describe each of the models and allow for their straightforward use in Monte Carlo codes is presented.
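The direct-sampling chain this record describes (fission product → decay branch → photon energy) can be sketched generically. All nuclide labels, yields, branching ratios, and energies below are invented placeholders, not evaluated nuclear data; the sketch shows only the sampling structure.

```python
import random

# Sketch of a direct-sampling chain: pick a fission product from a tiny
# (invented) independent-yield table, then a decay branch, then a gamma-ray
# energy for that branch. Every number here is a placeholder.

yields = {"A": 0.044, "B": 0.038, "C": 0.033}  # hypothetical independent yields
branches = {
    "A": [(1.0, 1.428)],                 # (branch probability, gamma energy, MeV)
    "B": [(0.6, 0.806), (0.4, 1.414)],
    "C": [(0.5, 0.561), (0.5, 1.194)],
}

def sample_gamma(rng):
    # sample a fission product with probability proportional to its yield
    total = sum(yields.values())
    x = rng.random() * total
    for nuclide, y in yields.items():
        x -= y
        if x <= 0.0:
            break
    # sample a decay branch for that nuclide from its cumulative branching
    u = rng.random()
    acc = 0.0
    for prob, energy in branches[nuclide]:
        acc += prob
        if u <= acc:
            return nuclide, energy
    return nuclide, branches[nuclide][-1][1]

rng = random.Random(1)
spectrum = [sample_gamma(rng)[1] for _ in range(10000)]
```

A real implementation would draw from the full evaluated yield and decay libraries and fold in time-dependent fission rates, but the nested-sampling pattern is the same.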

  12. CRPropa 3.1—a low energy extension based on stochastic differential equations

    NASA Astrophysics Data System (ADS)

    Merten, Lukas; Becker Tjus, Julia; Fichtner, Horst; Eichmann, Björn; Sigl, Günter

    2017-06-01

The propagation of charged cosmic rays through the Galactic environment influences all aspects of the observation at Earth. Energy spectrum, composition and arrival directions are changed due to deflections in magnetic fields and interactions with the interstellar medium. Today the transport is simulated with different simulation methods either based on the solution of a transport equation (multi-particle picture) or a solution of an equation of motion (single-particle picture). We developed a new module for the publicly available propagation software CRPropa 3.1, where we implemented an algorithm to solve the transport equation using stochastic differential equations. This technique allows us to use a diffusion tensor which is anisotropic with respect to an arbitrary magnetic background field. The source code of CRPropa is written in C++ with python steering via SWIG, which makes it easy to use and computationally fast. In this paper, we present the new low-energy propagation code together with validation procedures that are developed to prove the accuracy of the new implementation. Furthermore, we show first examples of the cosmic ray density evolution, which depends strongly on the ratio of the parallel κ∥ and perpendicular κ⊥ diffusion coefficients. This dependency is systematically examined, as is the influence of the particle rigidity on the diffusion process.
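The equivalence this record relies on, between a diffusion-type transport equation and a stochastic differential equation, can be demonstrated with a minimal Euler–Maruyama sketch (this is the general technique, not CRPropa's implementation; the background field is taken along z, and the coefficient values are invented):

```python
import math
import random

# Euler-Maruyama sketch of the SDE equivalent of anisotropic diffusion:
# kappa_par acts along the background field (z), kappa_perp across it (x, y).
# For constant kappa the SDE step is dx = sqrt(2*kappa*dt) * dW.

def propagate(n_steps, dt, kappa_par, kappa_perp, rng):
    x = y = z = 0.0
    for _ in range(n_steps):
        x += math.sqrt(2.0 * kappa_perp * dt) * rng.gauss(0.0, 1.0)
        y += math.sqrt(2.0 * kappa_perp * dt) * rng.gauss(0.0, 1.0)
        z += math.sqrt(2.0 * kappa_par * dt) * rng.gauss(0.0, 1.0)
    return x, y, z

rng = random.Random(42)
ends = [propagate(100, 0.01, kappa_par=10.0, kappa_perp=1.0, rng=rng)
        for _ in range(2000)]

# For pure diffusion over time t, <z^2> -> 2*kappa_par*t and <x^2> -> 2*kappa_perp*t.
t = 100 * 0.01
mean_z2 = sum(e[2] ** 2 for e in ends) / len(ends)
mean_x2 = sum(e[0] ** 2 for e in ends) / len(ends)
```

The anisotropy of the pseudo-particle cloud (spread along z exceeding spread across it by the ratio κ∥/κ⊥) is exactly the density behavior the paper examines.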

  13. Cosmic-Ray Transport in Heliospheric Magnetic Structures. II. Modeling Particle Transport through Corotating Interaction Regions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kopp, Andreas; Wiengarten, Tobias; Fichtner, Horst

The transport of cosmic rays (CRs) in the heliosphere is determined by the properties of the solar wind plasma. The heliospheric plasma environment has been probed by spacecraft for decades and provides a unique opportunity for testing transport theories. Of particular interest for the three-dimensional (3D) heliospheric CR transport are structures such as corotating interaction regions (CIRs), which, owing to the enhanced magnetic field strength and magnetic fluctuations within them, as well as to the associated shocks and stream interfaces, influence CR diffusion and drift. In a three-fold series of papers, we investigate these effects by modeling inner-heliospheric solar wind conditions with the numerical magnetohydrodynamic (MHD) framework Cronos (Wiengarten et al., referred to as Paper I), and the results serve as input to a transport code employing a stochastic differential equation approach (this paper). While, in Paper I, we presented results from 3D simulations with Cronos, the MHD output is now taken as an input to the CR transport modeling. We discuss the diffusion and drift behavior of Galactic cosmic rays using the example of different theories, and study the effects of CIRs on these transport processes. In particular, we point out the wide range of possible particle fluxes at a given point in space resulting from these different theories. The restriction of this variety by fitting the numerical results to spacecraft data will be the subject of the third paper of this series.

  14. Simulations of a Molecular Cloud experiment using CRASH

    NASA Astrophysics Data System (ADS)

    Trantham, Matthew; Keiter, Paul; Vandervort, Robert; Drake, R. Paul; Shvarts, Dov

    2017-10-01

    Recent laboratory experiments explore molecular cloud radiation hydrodynamics. The experiment irradiates a gold foil with a laser producing x-rays to drive the implosion or explosion of a foam ball. The CRASH code, an Eulerian code with block-adaptive mesh refinement, multigroup diffusive radiation transport, and electron heat conduction developed at the University of Michigan to design and analyze high-energy-density experiments, is used to perform a parameter search in order to identify optically thick, optically thin and transition regimes suitable for these experiments. Specific design issues addressed by the simulations are the x-ray drive temperature, foam density, distance from the x-ray source to the ball, as well as other complicating issues such as the positioning of the stalk holding the foam ball. We present the results of this study and show ways the simulations helped improve the quality of the experiment. This work is funded by the LLNL under subcontract B614207 and NNSA-DS and SC-OFES Joint Program in High-Energy-Density Laboratory Plasmas, Grant Number DE-NA0002956.

  15. Study of solid-conversion gaseous detector based on GEM for high energy X-ray industrial CT.

    PubMed

    Zhou, Rifeng; Zhou, Yaling

    2014-01-01

General gaseous ionization detectors are not suitable for high-energy X-ray industrial computed tomography (HEICT) because of their inherent limitations, especially low detection efficiency and large volume. The goal of this study was to investigate a new type of gaseous detector that overcomes these problems. The novel detector uses a metal foil as an X-ray converter to improve the conversion efficiency, and a Gas Electron Multiplier (GEM) as the electron amplifier to reduce the detector volume. The detection mechanism and signal formation of the detector are discussed in detail. The conversion efficiency was calculated with the EGSnrc Monte Carlo code, and the transport of photons and the secondary-electron avalanche in the detector were simulated with the Maxwell and Garfield codes. The results indicate that this detector offers higher conversion efficiency in a smaller volume. In principle, this kind of detector could be a strong candidate to replace conventional detectors in HEICT.
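The solid-converter idea in this record reduces, at the back-of-envelope level, to exponential attenuation: the fraction of incident photons interacting in a foil of thickness t is 1 − exp(−μt). A minimal sketch, with an invented attenuation coefficient standing in for real cross-section data:

```python
import math

# Fraction of incident photons that interact in a converter foil of
# thickness t (cm), given a linear attenuation coefficient mu (1/cm).
# The mu value used below is invented for illustration, not tabulated data.

def conversion_fraction(mu_per_cm, thickness_cm):
    return 1.0 - math.exp(-mu_per_cm * thickness_cm)

for t in (0.01, 0.05, 0.1):
    print(t, conversion_fraction(4.0, t))
```

A real design also has to trade foil thickness against the escape probability of the secondary electrons, which is what makes the Monte Carlo treatment in the paper necessary.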

  16. Physics of the Isotopic Dependence of Galactic Cosmic Ray Fluence Behind Shielding

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Saganti, Premkumar B.; Hu, Xiao-Dong; Kim, Myung-Hee Y.; Cleghorn, Timothy F.; Wilson, John W.; Tripathi, Ram K.; Zeitlin, Cary J.

    2003-01-01

For over 25 years, NASA has supported the development of space radiation transport models for shielding applications. The NASA space radiation transport model now predicts dose and dose equivalent in Earth and Mars orbit to an accuracy of ±20%. However, because larger errors may occur in particle fluence predictions, there is interest in further assessments and improvements in NASA's space radiation transport model. In this paper, we consider the effects of the isotopic composition of the primary galactic cosmic rays (GCR) and the isotopic dependence of nuclear fragmentation cross sections on the solution to transport models used for shielding studies. Satellite measurements are used to describe the isotopic composition of the GCR. Using NASA's quantum multiple-scattering theory of nuclear fragmentation (QMSFRG) and the high-charge-and-energy (HZETRN) transport code, we study the effect of the isotopic dependence of the primary GCR composition and secondary nuclei on shielding calculations. The QMSFRG is shown to accurately describe the isospin dependence of nuclear fragmentation. The principal finding of this study is that large errors (±100%) occur in the mass-fluence spectra when comparing transport models that use a complete isotope grid (approximately 170 ions) to ones that use a reduced grid, such as the 59-ion grid used in the HZETRN code in the past; less significant errors (under 20%) occur in the elemental-fluence spectra. Because a complete isotope grid is readily handled on small computer workstations and is needed for several applications studying GCR propagation and scattering, we recommend its use for future GCR studies.

  17. Delayed photo-emission model for beam optics codes

    DOE PAGES

    Jensen, Kevin L.; Petillo, John J.; Panagos, Dimitrios N.; ...

    2016-11-22

Future advanced light sources and x-ray Free Electron Lasers require fast response from the photocathode to enable short electron pulse durations as well as pulse shaping, and so the ability to model delays in emission is needed for beam optics codes. The development of a time-dependent emission model accounting for delayed photoemission due to transport and scattering is given, and its inclusion in the Particle-in-Cell code MICHELLE results in changes to the pulse shape that are described. Furthermore, the model is applied to pulse elongation of a bunch traversing an rf injector, and to the smoothing of laser jitter on a short pulse.

  18. EBQ code: Transport of space-charge beams in axially symmetric devices

    NASA Astrophysics Data System (ADS)

    Paul, A. C.

    1982-11-01

Such general-purpose space charge codes as EGUN, BATES, WODF, and TRANSPORT do not gracefully accommodate the simulation of relativistic space-charge-dominated beams propagating a long distance in axially symmetric devices where a high degree of cancellation has occurred between the self-magnetic and self-electric forces of the beam. The EBQ code was written specifically to follow high-current beam particles, where space charge is important, in long-distance flight through axially symmetric machines possessing external electric and magnetic fields. EBQ simultaneously tracks all trajectories so as to allow procedures for charge deposition based on inter-ray separations. The orbits are treated in Cartesian geometry (position and momentum) with z as the independent variable. Poisson's equation is solved in cylindrical geometry on an orthogonal rectangular mesh. EBQ can also handle problems involving multiple ion species where the space charge from each must be included. Such problems arise in the design of ion sources where different charge and mass states are present.
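The kind of mesh field solve this record mentions can be sketched in its simplest form: the radial part of Poisson's equation in cylindrical geometry, (1/r) d/dr (r dφ/dr) = −ρ, discretized on a uniform mesh and relaxed by Jacobi iteration. This is a one-dimensional toy of the r–z solve EBQ performs, with units folded into ρ and an invented mesh; the uniform-density case has the analytic solution φ(r) = ρ(R² − r²)/4 against which the sketch can be checked.

```python
# Jacobi relaxation of (1/r) d/dr (r dphi/dr) = -rho on [0, R],
# with phi(R) = 0 and symmetry (dphi/dr = 0) at the axis.
# Mesh size, iteration count, and rho are illustrative choices.

def solve_radial_poisson(rho, R=1.0, n=40, iters=20000):
    h = R / n
    phi = [0.0] * (n + 1)                  # phi[n] = 0 enforces phi(R) = 0
    for _ in range(iters):
        new = phi[:]
        # on-axis point: symmetry gives Laplacian ~= 4*(phi[1] - phi[0])/h^2
        new[0] = phi[1] + rho * h * h / 4.0
        for i in range(1, n):
            r = i * h
            rp, rm = r + 0.5 * h, r - 0.5 * h   # half-node radii
            new[i] = (rp * phi[i + 1] + rm * phi[i - 1]) / (2.0 * r) \
                     + rho * h * h / 2.0
        phi = new
    return phi, h

phi, h = solve_radial_poisson(rho=1.0)
# analytic solution for uniform rho: phi(r) = rho * (R**2 - r**2) / 4
```

A production code would use a faster solver (SOR, multigrid, or a direct method) and the full r–z mesh, but the conservative half-node differencing of the radial operator is the same.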

  19. Simulation of Charge Collection in Diamond Detectors Irradiated with Deuteron-Triton Neutron Sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Milocco, Alberto; Trkov, Andrej; Pillon, Mario

    2011-12-13

Diamond-based neutron spectrometers exhibit outstanding properties such as radiation hardness, low sensitivity to gamma rays, fast response and high-energy resolution. They represent a very promising application of diamonds for plasma diagnostics in fusion devices. The measured pulse height spectrum is obtained from the collection of helium and beryllium ions produced by the reactions on {sup 12}C. An original code is developed to simulate the production and the transport of charged particles inside the diamond detector. The ion transport methodology is based on the well-known TRIM code. The reactions of interest are triggered using the ENDF/B-VII.0 nuclear data for the neutron interactions on carbon. The model is implemented in the TALLYX subroutine of the MCNP5 and MCNPX codes. Measurements with diamond detectors in a {approx}14 MeV neutron field have been performed at the FNG (Rome, Italy) and IRMM (Geel, Belgium) facilities. The comparison of the experimental data with the simulations validates the proposed model.

  20. Reduced discretization error in HZETRN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Slaba, Tony C., E-mail: Tony.C.Slaba@nasa.gov; Blattnig, Steve R., E-mail: Steve.R.Blattnig@nasa.gov; Tweed, John, E-mail: jtweed@odu.edu

    2013-02-01

The deterministic particle transport code HZETRN is an efficient analysis tool for studying the effects of space radiation on humans, electronics, and shielding materials. In a previous work, numerical methods in the code were reviewed, and new methods were developed that further improved efficiency and reduced overall discretization error. It was also shown that the remaining discretization error could be attributed to low energy light ions (A < 4) with residual ranges smaller than the physical step-size taken by the code. Accurately resolving the spectrum of low energy light particles is important in assessing risk associated with astronaut radiation exposure. In this work, modifications to the light particle transport formalism are presented that accurately resolve the spectrum of low energy light ion target fragments. The modified formalism is shown to significantly reduce overall discretization error and allows a physical approximation to be removed. For typical step-sizes and energy grids used in HZETRN, discretization errors for the revised light particle transport algorithms are shown to be less than 4% for aluminum and water shielding thicknesses as large as 100 g/cm{sup 2} exposed to both solar particle event and galactic cosmic ray environments.
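The generic phenomenon behind this record, discretization error in a space-marching scheme, can be shown with a toy problem (this is not HZETRN's formalism; the cross section and depth are invented). Pure attenuation dφ/dx = −σφ marched with explicit Euler converges to the analytic exp(−σx) as the step shrinks:

```python
import math

# Explicit-Euler space marching of dphi/dx = -sigma * phi, compared to the
# analytic solution, at a coarse and a fine step. Values are illustrative.

def march(sigma, depth, step):
    n = round(depth / step)   # integer step count avoids float drift
    phi = 1.0
    for _ in range(n):
        phi -= sigma * phi * step
    return phi

exact = math.exp(-0.1 * 10.0)                   # analytic flux at depth 10
err_coarse = abs(march(0.1, 10.0, 1.0) - exact)
err_fine = abs(march(0.1, 10.0, 0.1) - exact)
# first-order scheme: a tenfold smaller step gives roughly tenfold smaller error
```

The paper's point is subtler, that particles whose residual range is below the step size are mishandled entirely, but the step-size dependence of the error is the common thread.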

  1. An MCNP-based model of a medical linear accelerator x-ray photon beam.

    PubMed

    Ajaj, F A; Ghassal, N M

    2003-09-01

The major components in the x-ray photon beam path of the treatment head of the VARIAN Clinac 2300 EX medical linear accelerator were modeled and simulated using the Monte Carlo N-Particle radiation transport computer code (MCNP). Simulated components include the x-ray target, primary conical collimator, x-ray beam flattening filter, and secondary collimators. X-ray photon energy spectra and angular distributions were calculated using the model. The x-ray beam emerging from the secondary collimators was scored by considering the total x-ray spectra from the target as the source of x-rays at the target position. The depth dose distribution and dose profiles at different depths and field sizes have been calculated at a nominal operating potential of 6 MV and found to be within acceptable limits. It is concluded that accurate specification of the component dimensions, composition, and nominal accelerating potential gives a good assessment of the x-ray energy spectra.

2. Investigation of ion and electron heat transport of high-T_e ECH heated discharges in the Large Helical Device

    DOE PAGES

    Pablant, N. A.; Satake, S.; Yokoyama, M.; ...

    2016-01-28

An analysis of the radial electric field and heat transport, both for ions and electrons, is presented for a high-T_e electron cyclotron heated (ECH) discharge on the Large Helical Device (LHD). Transport analysis is done using the TASK3D transport suite utilizing experimentally measured profiles for both ions and electrons. Ion temperature and perpendicular flow profiles are measured using the recently installed x-ray imaging crystal spectrometer (XICS) diagnostic, while electron temperature and density profiles are measured using Thomson scattering. The analysis also includes calculated ECH power deposition profiles as determined through the TRAVIS ray-tracing code. This is the first time on LHD that this type of integrated transport analysis with measured ion temperature profiles has been performed without NBI, allowing the heat transport properties of plasmas with only ECH heating to be more clearly examined. For this study, a plasma discharge is chosen which develops a high central electron temperature (T_e0 = 9 keV) at moderately low densities (n_e0 = 1.5×10^19 m^-3). The experimentally determined transport properties from TASK3D are compared to neoclassical predictions as calculated by the GSRAKE and FORTEC-3D codes. The predicted electron fluxes are seen to be an order of magnitude less than the measured fluxes, indicating that electron transport is largely anomalous, while the neoclassical and measured ion heat fluxes are of the same magnitude. Neoclassical predictions of a strong positive ambipolar electric field (E_r) in the plasma core are validated through comparisons to perpendicular flow measurements from the XICS diagnostic. Furthermore, this provides confidence that the predictions are producing physically meaningful results for the particle fluxes and radial electric field, which are a key component in correctly predicting plasma confinement.

  3. CRPropa 3.1—a low energy extension based on stochastic differential equations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Merten, Lukas; Tjus, Julia Becker; Eichmann, Björn

The propagation of charged cosmic rays through the Galactic environment influences all aspects of the observation at Earth. Energy spectrum, composition and arrival directions are changed due to deflections in magnetic fields and interactions with the interstellar medium. Today the transport is simulated with different simulation methods either based on the solution of a transport equation (multi-particle picture) or a solution of an equation of motion (single-particle picture). We developed a new module for the publicly available propagation software CRPropa 3.1, where we implemented an algorithm to solve the transport equation using stochastic differential equations. This technique allows us to use a diffusion tensor which is anisotropic with respect to an arbitrary magnetic background field. The source code of CRPropa is written in C++ with python steering via SWIG, which makes it easy to use and computationally fast. In this paper, we present the new low-energy propagation code together with validation procedures that are developed to prove the accuracy of the new implementation. Furthermore, we show first examples of the cosmic ray density evolution, which depends strongly on the ratio of the parallel κ{sub ∥} and perpendicular κ{sub ⊥} diffusion coefficients. This dependency is systematically examined, as is the influence of the particle rigidity on the diffusion process.

  4. Simulations of GCR interactions within planetary bodies using GEANT4

    NASA Astrophysics Data System (ADS)

    Mesick, K.; Feldman, W. C.; Stonehill, L. C.; Coupland, D. D. S.

    2017-12-01

On planetary bodies with little to no atmosphere, Galactic Cosmic Rays (GCRs) can hit the body and produce neutrons, primarily through nuclear spallation within the top few meters of the surface. These neutrons undergo further nuclear interactions with elements near the planetary surface, and some will escape the surface and can be detected by landed or orbiting neutron radiation detector instruments. The neutron leakage signal at fast neutron energies provides a measure of the average atomic mass of the near-surface material, and in the epithermal and thermal energy ranges it is highly sensitive to the presence of hydrogen. Gamma-rays can also escape the surface, produced at characteristic energies depending on surface composition, and can be detected by gamma-ray instruments. The intra-nuclear cascade (INC) that occurs when high-energy GCRs interact with elements within a planetary surface to produce the leakage neutron and gamma-ray signals is highly complex, and therefore Monte Carlo based radiation transport simulations are commonly used for predicting and interpreting measurements from planetary neutron and gamma-ray spectroscopy instruments. In the past, the simulation code widely used for this type of analysis has been MCNPX [1], which was benchmarked against data from the Lunar Neutron Probe Experiment (LNPE) on Apollo 17 [2]. In this work, we consider the validity of the radiation transport code GEANT4 [3], another widely used code that is open source, by benchmarking simulated predictions of the LNPE experiment against the Apollo 17 data. We consider the impact of different physics model options on the results, and show which models best describe the INC based on agreement with the Apollo 17 data. The success of this validation then gives us confidence in using GEANT4 to simulate GCR-induced neutron leakage signals on Mars, relevant to a re-analysis of Mars Odyssey Neutron Spectrometer data. References: [1] D.B. Pelowitz, Los Alamos National Laboratory, LA-CP-05-0369, 2005. [2] G.W. McKinney et al., Journal of Geophysical Research, 111, E06004, 2006. [3] S. Agostinelli et al., Nuclear Instruments and Methods A, 506, 2003.

  5. Monte Carlo Simulations of Background Spectra in Integral Imager Detectors

    NASA Technical Reports Server (NTRS)

    Armstrong, T. W.; Colborn, B. L.; Dietz, K. L.; Ramsey, B. D.; Weisskopf, M. C.

    1998-01-01

Predictions of the expected gamma-ray backgrounds in the ISGRI (CdTe) and PiCsIT (CsI) detectors on INTEGRAL due to cosmic-ray interactions and the diffuse gamma-ray background have been made using a coupled set of Monte Carlo radiation transport codes (HETC, FLUKA, EGS4, and MORSE) and a detailed, 3-D mass model of the spacecraft and detector assemblies. The simulations include both the prompt background component from induced hadronic and electromagnetic cascades and the delayed component due to emissions from induced radioactivity. Background spectra have been obtained with and without the use of active (BGO) shielding and charged particle rejection to evaluate the effectiveness of anticoincidence counting on background rejection.

  6. Accelerating Monte Carlo simulations of photon transport in a voxelized geometry using a massively parallel graphics processing unit.

    PubMed

    Badal, Andreu; Badano, Aldo

    2009-11-01

Monte Carlo simulations of radiation transport are computationally intensive and may require long computing times. The authors introduce a new paradigm for the acceleration of Monte Carlo simulations: the use of a graphics processing unit (GPU) as the main computing device instead of a central processing unit (CPU). A GPU-based Monte Carlo code that simulates photon transport in a voxelized geometry with the accurate physics models from PENELOPE has been developed using the CUDA programming model (NVIDIA Corporation, Santa Clara, CA). An outline of the new code and a sample x-ray imaging simulation with an anthropomorphic phantom are presented. A remarkable 27-fold speedup was obtained using a GPU compared to a single-core CPU. The reported results show that GPUs are currently a good alternative to CPUs for the simulation of radiation transport. Since the performance of GPUs is currently increasing at a faster pace than that of CPUs, the advantages of GPU-based software are likely to be more pronounced in the future.
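The per-photon kernel that such GPU codes parallelize can be illustrated with a deliberately tiny, CPU-only sketch (not PENELOPE physics; μ and slab depth are invented): sample exponential free paths through a homogeneous slab and treat any interaction as absorption, so the surviving fraction estimates the uncollided transmission exp(−μ·depth).

```python
import math
import random

# Analog Monte Carlo estimate of uncollided transmission through a slab.
# Each photon gets one exponentially distributed free path; if it exceeds
# the slab depth the photon is counted as transmitted.

def transmitted_fraction(mu, depth, n_photons, rng):
    passed = 0
    for _ in range(n_photons):
        # inverse-CDF sampling; 1 - random() lies in (0, 1], avoiding log(0)
        path = -math.log(1.0 - rng.random()) / mu
        if path > depth:
            passed += 1
    return passed / n_photons

rng = random.Random(7)
est = transmitted_fraction(mu=0.5, depth=2.0, n_photons=20000, rng=rng)
# should be close to exp(-0.5 * 2.0) ~ 0.368
```

Because each photon history is independent, this loop maps naturally onto one GPU thread per history, which is the essence of the speedup reported in the record.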

  7. Practical Applications of Cosmic Ray Science: Spacecraft, Aircraft, Ground-Based Computation and Control Systems, and Human Health and Safety

    NASA Technical Reports Server (NTRS)

    Atwell, William; Koontz, Steve; Normand, Eugene

    2012-01-01

Three twentieth century technological developments, 1) high altitude commercial and military aircraft; 2) manned and unmanned spacecraft; and 3) increasingly complex and sensitive solid state micro-electronics systems, have driven an ongoing evolution of basic cosmic ray science into a set of practical engineering tools needed to design, test, and verify the safety and reliability of modern complex technological systems. The effects of primary cosmic ray particles, and of secondary particle showers produced by nuclear reactions with the atmosphere, can determine the design and verification processes (as well as the total dollar cost) for manned and unmanned spacecraft avionics systems. Similar considerations apply to commercial and military aircraft operating at high latitudes and altitudes near the atmospheric Pfotzer maximum. Even ground-based computational and control systems can be negatively affected by secondary particle showers at the Earth's surface, especially if the net target area of the sensitive electronic system components is large. Finally, accumulation of both primary cosmic ray and secondary cosmic ray induced particle shower radiation dose is an important health and safety consideration for commercial or military air crews operating at high altitude/latitude and is also one of the most important factors presently limiting manned space flight operations beyond low-Earth orbit (LEO). In this paper we review the discovery of cosmic ray effects on the performance and reliability of microelectronic systems as well as human health, and the development of the engineering and health science tools used to evaluate and mitigate cosmic ray effects in ground-based, atmospheric flight, and space flight environments. Ground test methods applied to microelectronic components and systems are used in combination with radiation transport and reaction codes to predict the performance of microelectronic systems in their operating environments.
Similar radiation transport codes are used to evaluate possible human health effects of cosmic ray exposure; however, the health effects are based on worst-case analysis and extrapolation of a very limited human exposure database combined with some limited experimental animal data. Finally, the limitations on human space operations beyond low-Earth orbit imposed by long-term exposure to galactic cosmic rays are discussed.

  8. Ion absorption of the high harmonic fast wave in the National Spherical Torus Experiment

    NASA Astrophysics Data System (ADS)

    Rosenberg, Adam Lewis

Ion absorption of the high harmonic fast wave in a spherical torus is of critical importance to assessing the viability of the wave as a means of heating and driving current. Analysis of recent NSTX shots has revealed that under some conditions when neutral beam and RF power are injected into the plasma simultaneously, a fast ion population with energy above the beam injection energy is sustained by the wave. In agreement with modeling, these experiments find the RF-induced fast ion tail strength and neutron rate at lower B-fields to be less enhanced, likely due to a larger β profile, which promotes greater off-axis absorption where the fast ion population is small. Ion loss codes find the increased loss fraction with decreased B insufficient to account for the changes in tail strength, providing further evidence that this is an RF interaction effect. Though greater ion absorption is predicted with lower k∥, surprisingly little variation in the tail was observed, along with a neutron rate enhancement with higher k∥. Data from the neutral particle analyzer, neutron detectors, x-ray crystal spectrometer, and Thomson scattering are presented, along with results from the TRANSP transport analysis code, the ray-tracing codes HPRT and CURRAY, the full-wave code AORSA, the quasilinear code CQL3D, and the ion loss codes EIGOL and CONBEAM.

  9. Light ion components of the galactic cosmic rays: Nuclear interactions and transport theory

    NASA Technical Reports Server (NTRS)

    Cucinotta, F. A.; Townsend, L. W.; Wilson, J. W.; Shinn, J. L.; Badhwar, G. D.; Dubey, R. R.

    1996-01-01

Light nuclei are present in the primary galactic cosmic rays (GCR) and are produced in thick targets by projectile or target fragmentation in both nucleon- and heavy-ion-induced reactions. In the primary GCR, He-4 is the most abundant nucleus after H-1; however, there are also substantial fluxes of H-2 and He-3. In this paper we describe theoretical models based on quantum multiple-scattering theory for the description of light-ion nuclear interactions. The energy dependence of the light-ion fragmentation cross section is considered, with comparisons to experiment of inclusive yields and secondary momentum distributions. We also analyze the importance of a fast component of light ions from proton- and neutron-induced target fragmentation. These theoretical models have been incorporated into the cosmic-ray transport code HZETRN and will be used to analyze the role of shielding materials in modulating the production and the energy spectrum of light ions.

  10. An efficient HZETRN (a galactic cosmic ray transport code)

    NASA Technical Reports Server (NTRS)

    Shinn, Judy L.; Wilson, John W.

    1992-01-01

An accurate and efficient engineering code for analyzing the shielding requirements against high-energy galactic heavy ions is needed. HZETRN is a deterministic code developed at Langley Research Center that is constantly under improvement in both physics and numerical computation and is targeted for such use. One problem area connected with the space-marching technique used in this code is the propagation of the local truncation error. By improving the numerical algorithms for interpolation, integration, and the grid distribution formula, the efficiency of the code is increased by a factor of eight as the number of energy grid points is reduced. Numerical accuracy of better than 2 percent for a shield thickness of 150 g/cm^2 is found when a 45-point energy grid is used. The propagation step size, which is related to the perturbation theory, is also reevaluated.

  11. Radiography simulation on single-shot dual-spectrum X-ray for cargo inspection system.

    PubMed

    Gil, Youngmi; Oh, Youngdo; Cho, Moohyun; Namkung, Won

    2011-02-01

    We propose a method to identify materials in a dual-energy X-ray (DeX) inspection system. The method identifies materials by combining the fraction T of high- and low-energy X-rays transmitted through the material with the ratio R of the material's attenuation coefficient for high-energy X-rays to that for low-energy X-rays. In Monte Carlo N-Particle Transport Code (MCNPX) simulations using the same geometry as the real container inspection system, this T vs. R method successfully identified tissue-equivalent plastic and several metals. In further simulations, the single-shot mode of operating the accelerator distinguished materials better than the dual-shot mode.
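
What makes R a material signature is its thickness independence: since T = exp(-μt), the ratio ln(T_high)/ln(T_low) = μ_high/μ_low cancels t. A minimal sketch of this idea (with illustrative attenuation values, not reference data, and not the authors' MCNPX model):

```python
import numpy as np

materials = {                # (mu_low, mu_high) in cm^-1, illustrative only
    "plastic": (0.20, 0.15),
    "steel":   (3.00, 2.40),
    "lead":    (60.0, 30.0),
}

def signature(mu_low, mu_high, thickness_cm):
    T_low = np.exp(-mu_low * thickness_cm)    # transmitted fraction, low energy
    T_high = np.exp(-mu_high * thickness_cm)  # transmitted fraction, high energy
    R = np.log(T_high) / np.log(T_low)        # = mu_high/mu_low, thickness-free
    return T_high, R

for name, (ml, mh) in materials.items():
    for t_cm in (1.0, 2.0):
        T, R = signature(ml, mh, t_cm)
        print(f"{name:8s} t={t_cm:.0f} cm  T_high={T:.3e}  R={R:.3f}")
```

Doubling the thickness changes T but leaves R unchanged, so (T, R) pairs separate material classes.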

  12. Quantitative Kα line spectroscopy for energy transport in ultra-intense laser plasma interaction

    NASA Astrophysics Data System (ADS)

    Zhang, Z.; Nishimura, H.; Namimoto, T.; Fujioka, S.; Arikawa, Y.; Nakai, M.; Koga, M.; Shiraga, H.; Kojima, S.; Azechi, H.; Ozaki, T.; Chen, H.; Park, J.; Williams, G. J.; Nishikino, M.; Kawachi, T.; Sagisaka, A.; Orimo, S.; Ogura, K.; Pirozhkov, A.; Yogo, A.; Kiriyama, H.; Kondo, K.; Okano, Y.

    2012-10-01

    X-ray line spectra ranging from 17 to 77 keV were quantitatively measured with a Laue spectrometer composed of a cylindrically curved crystal and a detector. The absolute sensitivity of the spectrometer system was calibrated using pre-characterized laser-produced x-ray sources and radioisotopes for the detectors and the crystal, respectively. The integrated reflectivity of the crystal is in good agreement with predictions from an open code for x-ray diffraction. The energy transfer efficiency from the incident laser beams to hot electrons, the agents responsible for the Au Kα x-ray line emission, is derived as a consequence of this work. Taking the hot-electron temperature into account, the transfer efficiency from the LFEX laser to the Au plate target is about 8% to 10%.

  13. Anisotropic imaging performance in indirect x-ray imaging detectors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Badano, Aldo; Kyprianou, Iacovos S.; Sempau, Josep

    We report on the variability in imaging system performance due to oblique x-ray incidence, and the associated transport of quanta (both x rays and optical photons) through the phosphor, in columnar indirect digital detectors. The analysis uses MANTIS, a freely available combined x-ray, electron, and optical Monte Carlo transport code. We describe the main features of the simulation method and provide some validation of the phosphor screen models considered in this work. We report x-ray and electron three-dimensional energy deposition distributions and point-response functions (PRFs), including optical spread, in columnar phosphor screens of thickness 100 and 500 {mu}m, for 19, 39, 59, and 79 keV monoenergetic x-ray beams incident at 0 deg., 10 deg., and 15 deg. In addition, we present pulse-height spectra for the same phosphor thicknesses, x-ray energies, and angles of incidence. Our results suggest that the PRF due to phosphor blur is highly nonsymmetrical, and that the resolution properties of a columnar screen in a tomographic or tomosynthetic imaging system vary significantly with the angle of x-ray incidence. Moreover, we find that the noise due to the variability in the number of light photons detected per primary x-ray interaction, summarized in the information (Swank) factor, is somewhat independent of screen thickness and x-ray incidence angle. Our results also suggest that the anisotropy in the PRF is not reduced in screens with absorptive backings, while the noise introduced by variations in the gain and optical transport is larger. Predictions from MANTIS, after additional validation, can provide the needed understanding of the extent of such variations and eventually lead to the incorporation of the changes in imaging performance with incidence angle into reconstruction algorithms for volumetric x-ray imaging systems.
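
The Swank (information) factor mentioned above is computed from the moments of the pulse-height spectrum as I = M1²/(M0·M2). A short sketch on a made-up gain distribution (not MANTIS output; the gamma-distributed gain is purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy pulse-height spectrum: optical photons detected per absorbed x ray.
pulse_heights = rng.gamma(shape=25.0, scale=40.0, size=100_000)

M0 = len(pulse_heights)            # zeroth moment: number of events
M1 = pulse_heights.sum()           # first moment
M2 = (pulse_heights**2).sum()      # second moment
swank = M1**2 / (M0 * M2)          # information (Swank) factor, <= 1
print(f"Swank factor I = {swank:.3f}")
```

A narrower gain distribution pushes I toward 1 (ideal, noise-free gain); broader distributions lower it.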

  14. The potential of detecting intermediate-scale biomass and canopy interception in a coniferous forest using cosmic-ray neutron intensity measurements and neutron transport modeling

    NASA Astrophysics Data System (ADS)

    Andreasen, M.; Looms, M. C.; Bogena, H. R.; Desilets, D.; Zreda, M. G.; Sonnenborg, T. O.; Jensen, K. H.

    2014-12-01

    The water stored in the various compartments of the terrestrial ecosystem (in snow, canopy interception, soil and litter) controls the exchange of water and energy between the land surface and the atmosphere. Therefore, measurements of the water stored within these pools are critical for predicting, e.g., evapotranspiration and groundwater recharge. The detection of cosmic-ray neutron intensity is a novel non-invasive method for continuous quantification of intermediate-scale soil moisture. The footprint of the cosmic-ray neutron probe is a hemisphere with a radius of a few hectometers, with a measurement depth of 10-70 cm depending on wetness. The cosmic-ray neutron method thus offers measurements at a scale between point-scale measurements and large-scale satellite retrievals. The cosmic-ray neutron intensity is inversely correlated with the hydrogen stored within the footprint. Overall, soil moisture represents the largest pool of hydrogen, and changes in soil moisture clearly affect the cosmic-ray neutron signal. However, the neutron intensity is also sensitive to variations of hydrogen in snow, canopy interception and biomass, offering the potential to determine the water content of these pools from the signal. In this study we tested the potential of determining canopy interception and biomass using cosmic-ray neutron intensity measurements within the framework of the Danish Hydrologic Observatory (HOBE) and the Terrestrial Environmental Observatories (TERENO). Continuous measurements at the ground and canopy levels, along with profile measurements, were conducted at towers at forest field sites.
    Field experiments, including shielding the cosmic-ray neutron probes with cadmium foil (to remove lower-energy neutrons) and measuring reference intensity rates at completely water-saturated conditions (on the sea close to the HOBE site), were also conducted to gain a better understanding of the physics controlling cosmic-ray neutron transport and of the equipment used. Additionally, neutron transport modeling using the extended version of the Monte Carlo N-Particle Transport Code was conducted. The responses of the cosmic-ray neutron intensity to the reference condition and to different amounts of biomass, soil moisture and canopy interception were simulated and compared to the measurements.
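
The inverse correlation between neutron intensity and hydrogen is commonly inverted with a Desilets-style calibration function, θ(N) = a0/(N/N0 − a1) − a2. The sketch below uses the standard shape parameters of that function, but the dry-soil count rate N0 is an assumed value for illustration:

```python
# Standard shape parameters of the Desilets-style calibration function.
a0, a1, a2 = 0.0808, 0.372, 0.115
N0 = 1000.0                        # counts/h over dry soil (assumed value)

def soil_moisture(N):
    """Volumetric soil moisture (m^3/m^3) from a neutron count rate N."""
    return a0 / (N / N0 - a1) - a2

# Lower count rates (more hydrogen in the footprint) imply wetter soil.
for N in (900.0, 800.0, 700.0):
    print(f"N = {N:.0f} counts/h  ->  theta = {soil_moisture(N):.3f}")
```

Additional hydrogen pools (snow, interception, biomass) shift N in the same direction as soil moisture, which is why the abstract's shielding and modeling experiments are needed to disentangle them.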

  15. Radiation protection effectiveness of a proposed magnetic shielding concept for manned Mars missions

    NASA Technical Reports Server (NTRS)

    Townsend, Lawrence W.; Wilson, John W.; Shinn, J. L.; Nealy, John E.; Simonsen, Lisa C.

    1990-01-01

    The effectiveness of a proposed concept for shielding a manned Mars vehicle using a confined magnetic field configuration is evaluated by computing estimated crew radiation exposures resulting from galactic cosmic rays and a large solar flare event. In the study the incident radiation spectra are transported through the spacecraft structure/magnetic shield using the deterministic space radiation transport computer codes developed at Langley Research Center. The calculated exposures unequivocally demonstrate that magnetic shielding could provide an effective barrier against solar flare protons but is virtually transparent to the more energetic galactic cosmic rays. It is then demonstrated that through proper selection of materials and shield configuration, adequate and reliable bulk material shielding can be provided for the same total mass as needed to generate and support the more risky magnetic field configuration.

  16. Effects of Nuclear Cross Sections at Different Energies on the Radiation Hazard from Galactic Cosmic Rays

    NASA Technical Reports Server (NTRS)

    Lin, Z. W.; Adams, J. H., Jr.

    2006-01-01

    The radiation hazard for astronauts from galactic cosmic rays is a major obstacle in long duration human space explorations. Space radiation transport codes have been developed to calculate radiation environment on missions to the Moon, Mars or beyond. We have studied how uncertainties in fragmentation cross sections at different energies affect the accuracy of predictions from such radiation transport. We find that, in deep space, cross sections between 0.3 and 0.85 GeV/u usually have the largest effect on dose-equivalent behind shielding in solar minimum GCR environments, and cross sections between 0.85 and 1.2 GeV/u have the largest effect in solar maximum GCR environments. At the International Space Station, cross sections at higher energies have the largest effect due to the geomagnetic cutoff.

  17. An Improved Analytic Model for Microdosimeter Response

    NASA Technical Reports Server (NTRS)

    Shinn, Judy L.; Wilson, John W.; Xapsos, Michael A.

    2001-01-01

    An analytic model used to predict energy deposition fluctuations in a microvolume by ions through direct events is improved to include indirect delta ray events. The new model can now account for the increase in flux at low lineal energy when the ions are of very high energy. Good agreement is obtained between the calculated results and available data for laboratory ion beams. Comparison of GCR (galactic cosmic ray) flux between Shuttle TEPC (tissue equivalent proportional counter) flight data and current calculations draws a different assessment of developmental work required for the GCR transport code (HZETRN) than previously concluded.

  18. Application of computational fluid dynamics and laminar flow technology for improved performance and sonic boom reduction

    NASA Technical Reports Server (NTRS)

    Bobbitt, Percy J.

    1992-01-01

    A discussion is given of the many factors that affect sonic booms with particular emphasis on the application and development of improved computational fluid dynamics (CFD) codes. The benefits that accrue from interference (induced) lift, distributing lift using canard configurations, the use of wings with dihedral or anhedral and hybrid laminar flow control for drag reduction are detailed. The application of the most advanced codes to a wider variety of configurations along with improved ray-tracing codes to arrive at more accurate and, hopefully, lower sonic booms is advocated. Finally, it is speculated that when all of the latest technology is applied to the design of a supersonic transport it will be found environmentally acceptable.

  19. Heat transport in the quasi-single-helicity islands of EXTRAP T2R

    NASA Astrophysics Data System (ADS)

    Frassinetti, L.; Brunsell, P. R.; Drake, J.

    2009-03-01

    The heat transport inside the magnetic island generated in a quasi-single-helicity regime of a reversed-field pinch device is studied by using a numerical code that simulates the electron temperature and the soft x-ray emissivity. The heat diffusivity χe inside the island is determined by matching the simulated signals with the experimental ones. Inside the island, χe turns out to be from one to two orders of magnitude lower than the diffusivity in the surrounding plasma, where the magnetic field is stochastic. Furthermore, the heat transport properties inside the island are studied in correlation with the plasma current and with the amplitude of the magnetic fluctuations.

  20. Numerical Model for Cosmic Rays Species Production and Propagation in the Galaxy

    NASA Technical Reports Server (NTRS)

    Farahat, Ashraf; Zhang, Ming; Rassoul, Hamid; Connell, J. J.

    2005-01-01

    In recent years, considerable progress has been made in studying the propagation and origin of cosmic rays, as new and more accurate data have become available. Many models developed to study cosmic ray interactions and propagation have shown flexibility in representing various astrophysical conditions and good agreement with observational data. However, some astrophysical problems cannot be addressed using these models, such as the stochastic nature of cosmic ray sources, and small-scale structures and inhomogeneities in the interstellar gas that can affect the abundance of radioactive secondaries in cosmic rays. We have developed a new model and a corresponding computer code that address some of these limitations. The model is based on expanding the backward stochastic solution of the general diffusion transport equation (Zhang 1999), starting from an observer position, to solve a group of diffusion transport equations, each of which represents a particular element or isotope of cosmic ray nuclei. In this paper we focus on key abundance ratios such as B/C, sub-Fe/Fe, (10)Be/(9)Be, (26)Al/(27)Al, (36)Cl/(37)Cl and (54)Mn/(55)Mn, which all have well established cross sections, to evaluate our model. The effect of inhomogeneity in the interstellar medium is investigated. The contribution of certain cosmic ray nuclei to the production of other nuclei is addressed. The contribution of various galactic locations to the production of cosmic ray nuclei observed in the solar system is also investigated.
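
The backward stochastic idea can be illustrated on a toy 1D diffusion problem: the solution at an "observer" point equals the expectation of the initial condition sampled along backward Brownian paths, u(x,t) = E[u0(x + √(2Dt)·ξ)] with ξ ~ N(0,1). This is a simplified analogue of the galactic transport model, with all parameters assumed:

```python
import numpy as np

rng = np.random.default_rng(1)
D, t, x_obs = 1.0, 0.5, 0.0        # diffusion coefficient, time, observer point

def u0(x):
    return np.exp(-x**2)           # toy initial particle density

# Backward Monte Carlo: sample Brownian endpoints and average the source.
xi = rng.standard_normal(200_000)
estimate = u0(x_obs + np.sqrt(2.0 * D * t) * xi).mean()
exact = 1.0 / np.sqrt(1.0 + 4.0 * D * t)   # closed-form solution at x_obs = 0
print(f"backward MC: {estimate:.4f}   exact: {exact:.4f}")
```

Starting paths at the observer, rather than at every source, is what makes the backward formulation attractive when only a few observation points are needed.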

  1. The estimation of background production by cosmic rays in high-energy gamma ray telescopes

    NASA Technical Reports Server (NTRS)

    Edwards, H. L.; Nolan, P. L.; Lin, Y. C.; Koch, D. G.; Bertsch, D. L.; Fichtel, C. E.; Hartman, R. C.; Hunter, S. D.; Kniffen, D. A.; Hughes, E. B.

    1991-01-01

    A calculational method of estimating instrumental background in high-energy gamma-ray telescopes, using the hadronic Monte Carlo code FLUKA87, is presented. The method is applied to the SAS-2 and EGRET telescope designs and is also used to explore the level of background to be expected for alternative configurations of the proposed GRITS telescope, which adapts the external fuel tank of a Space Shuttle as a gamma-ray telescope with a very large collecting area. The background produced in proton-beam tests of EGRET is much less than the predicted level; this discrepancy appears to be due to the inability of FLUKA87 to transport evaporation nucleons. It is predicted that the background in EGRET will be no more than 4-10 percent of the extragalactic diffuse gamma radiation.

  2. Control of the Low-energy X-rays by Using MCNP5 and Numerical Analysis for a New Concept Intra-oral X-ray Imaging System

    NASA Astrophysics Data System (ADS)

    Huh, Jangyong; Ji, Yunseo; Lee, Rena

    2018-05-01

    An X-ray control algorithm to modulate the X-ray intensity distribution over the FOV (field of view) has been developed using numerical analysis and MCNP5, a Monte Carlo particle transport simulation code. X-rays, which are widely used in medical diagnostic imaging, should be controlled in order to maximize the performance of the X-ray imaging system; however, X-rays cannot be transported the way a liquid or a gas is conveyed through a physical conduit such as a pipe. In the present study, an X-ray control algorithm and technique to uniformize the X-ray intensity projected on the image sensor were developed using a flattening filter and a collimator, in order to alleviate the anisotropy in the X-ray distribution caused by intrinsic features of the X-ray generator. The proposed method, which combines MCNP5 modeling and numerical analysis, optimizes the flattening filter and the collimator for a uniform distribution of X-rays; their size and shape were estimated with the method. The simulation and the experimental results both showed that the method yielded an intensity distribution over an X-ray field of 6×4 cm2, at an SID (source to image-receptor distance) of 5 cm, with a uniformity of more than 90% when the flattening filter and the collimator were mounted on the system. The proposed algorithm and technique are not confined to flattening-filter development and can also be applied to other X-ray related research and development efforts.
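
One common flatness metric (an assumption here; the record does not spell out its definition) is 1 − (max − min)/(max + min) over the field of view. A toy sketch of a tilted raw beam and a slightly imperfect flattening filter, with all profile parameters invented for illustration:

```python
import numpy as np

def uniformity(intensity_map):
    """Flatness metric in [0, 1]: 1 means perfectly uniform."""
    lo, hi = intensity_map.min(), intensity_map.max()
    return 1.0 - (hi - lo) / (hi + lo)

x = np.linspace(-3.0, 3.0, 60)     # 6 cm axis of the field of view
y = np.linspace(-2.0, 2.0, 40)     # 4 cm axis
X, Y = np.meshgrid(x, y)

raw = 1.0 + 0.15 * X / 3.0                   # tilted (anisotropic) raw beam
filter_tx = 1.0 / (1.0 + 0.14 * X / 3.0)     # slightly imperfect flattening filter
flattened = raw * filter_tx
print(f"raw: {uniformity(raw):.1%}   flattened: {uniformity(flattened):.1%}")
```

Even an imperfect inverse-profile filter lifts the toy beam's uniformity well past the 90% threshold quoted in the abstract.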

  3. Kinetic modeling of x-ray laser-driven solid Al plasmas via particle-in-cell simulation

    NASA Astrophysics Data System (ADS)

    Royle, R.; Sentoku, Y.; Mancini, R. C.; Paraschiv, I.; Johzaki, T.

    2017-06-01

    Solid-density plasmas driven by intense x-ray free-electron laser (XFEL) radiation are seeded by sources of nonthermal photoelectrons and Auger electrons that ionize and heat the target via collisions. Simulation codes that are commonly used to model such plasmas, such as collisional-radiative (CR) codes, typically assume a Maxwellian distribution and thus instantaneous thermalization of the source electrons. In this study, we present a detailed description and initial applications of a collisional particle-in-cell code, picls, that has been extended with a self-consistent radiation transport model and Monte Carlo models for photoionization and KLL Auger ionization, enabling the fully kinetic simulation of XFEL-driven plasmas. The code is used to simulate two experiments previously performed at the Linac Coherent Light Source investigating XFEL-driven solid-density Al plasmas. It is shown that picls-simulated pulse transmissions using the Ecker-Kröll continuum-lowering model agree much better with measurements than do simulations using the Stewart-Pyatt model. Good quantitative agreement is also found between the time-dependent picls results and those of analogous simulations by the CR code scfly, which was used in the analysis of the experiments to accurately reproduce the observed Kα emissions and pulse transmissions. Finally, it is shown that the effects of the nonthermal electrons are negligible for the conditions of the particular experiments under investigation.

  4. Consistency evaluation between EGSnrc and Geant4 charged particle transport in an equilibrium magnetic field.

    PubMed

    Yang, Y M; Bednarz, B

    2013-02-21

    Following the proposal by several groups to integrate magnetic resonance imaging (MRI) with radiation therapy, much attention has been afforded to examining the impact of strong (on the order of a Tesla) transverse magnetic fields on photon dose distributions. The effect of the magnetic field on dose distributions must be considered in order to take full advantage of the benefits of real-time intra-fraction imaging. In this investigation, we compared the handling of particle transport in magnetic fields between two Monte Carlo codes, EGSnrc and Geant4, to analyze various aspects of their electromagnetic transport algorithms; both codes are well-benchmarked for medical physics applications in the absence of magnetic fields. A water-air-water slab phantom and a water-lung-water slab phantom were used to highlight dose perturbations near high- and low-density interfaces. We have implemented a method of calculating the Lorentz force in EGSnrc based on theoretical models in literature, and show very good consistency between the two Monte Carlo codes. This investigation further demonstrates the importance of accurate dosimetry for MRI-guided radiation therapy (MRIgRT), and facilitates the integration of a ViewRay MRIgRT system in the University of Wisconsin-Madison's Radiation Oncology Department.

  5. Consistency evaluation between EGSnrc and Geant4 charged particle transport in an equilibrium magnetic field

    NASA Astrophysics Data System (ADS)

    Yang, Y. M.; Bednarz, B.

    2013-02-01

    Following the proposal by several groups to integrate magnetic resonance imaging (MRI) with radiation therapy, much attention has been afforded to examining the impact of strong (on the order of a Tesla) transverse magnetic fields on photon dose distributions. The effect of the magnetic field on dose distributions must be considered in order to take full advantage of the benefits of real-time intra-fraction imaging. In this investigation, we compared the handling of particle transport in magnetic fields between two Monte Carlo codes, EGSnrc and Geant4, to analyze various aspects of their electromagnetic transport algorithms; both codes are well-benchmarked for medical physics applications in the absence of magnetic fields. A water-air-water slab phantom and a water-lung-water slab phantom were used to highlight dose perturbations near high- and low-density interfaces. We have implemented a method of calculating the Lorentz force in EGSnrc based on theoretical models in literature, and show very good consistency between the two Monte Carlo codes. This investigation further demonstrates the importance of accurate dosimetry for MRI-guided radiation therapy (MRIgRT), and facilitates the integration of a ViewRay MRIgRT system in the University of Wisconsin-Madison's Radiation Oncology Department.
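
A standard way to advance a charged-particle step under the magnetic part of the Lorentz force is the Boris rotation, which preserves speed exactly (the magnetic force does no work). The sketch below is a generic nonrelativistic illustration of that rotation, not the EGSnrc or Geant4 implementation; the field strength and step size are assumed values:

```python
import numpy as np

def boris_step(v, q_over_m, B, dt):
    """Rotate velocity v by the magnetic part of the Lorentz force over dt."""
    t = 0.5 * dt * q_over_m * B               # half-step rotation vector
    s = 2.0 * t / (1.0 + np.dot(t, t))
    v_prime = v + np.cross(v, t)
    return v + np.cross(v_prime, s)

q_over_m = -1.75882e11                        # electron charge-to-mass ratio (C/kg)
B = np.array([0.0, 0.0, 1.5])                 # 1.5 T field, MRI-linac-like (assumed)
v = np.array([1.0e7, 0.0, 0.0])               # nonrelativistic test velocity (m/s)
for _ in range(1000):
    v = boris_step(v, q_over_m, B, 1.0e-13)
print(f"speed after 1000 steps: {np.linalg.norm(v):.6e} m/s")
```

Because the update is a pure rotation, the speed is conserved to rounding error, which is one consistency check applicable to any magnetic-field transport algorithm.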

  6. Developing of a New Atmospheric Ionizing Radiation (AIR) Model

    NASA Technical Reports Server (NTRS)

    Clem, John M.; deAngelis, Giovanni; Goldhagen, Paul; Wilson, John W.

    2003-01-01

    As a result of the research leading to the 1998 AIR workshop and the subsequent analysis, the neutron issues posed by Foelsche et al. and further analyzed by Hajnal have been adequately resolved. We are now engaged in developing a new atmospheric ionizing radiation (AIR) model for use in epidemiological studies and air transportation safety assessment. A team was formed to examine a promising code using the basic FLUKA software but with modifications to allow multiple charged ion breakup effects. A limited dataset of the ER-2 measurements and other cosmic ray data will be used to evaluate the use of this code.

  7. Addendum: ``Hard X-Rays and Gamma Rays from Type Ia Supernovae'' (ApJ, 492, 228 [1998])

    NASA Astrophysics Data System (ADS)

    Höflich, Peter; Wheeler, J. C.

    2004-04-01

    We report a subtle error in the normalization of the absolute flux published in our original article (hereafter HWK98), and some minor updates. The normalization problem is related to the post-processing. As a consequence, the reported line fluxes are too large at early times. Note that Figure 1 of P. Höflich (ApJ, 492, 228 [1998]) has been transferred from HWK98. Results of previous papers are not affected (E. Müller, P. Höflich, A. M. Khokhlov, & E. Müller, ApJ, 492, 228 [1998]; P. Höflich, A. M. Khokhlov, & E. Müller, ApJ, 492, 228 [1998]). For calculating the γ-ray spectra, the γ-ray transport is solved via a Monte Carlo code that produces an output file containing the Eddington flux, the energy input by radioactive decay, and the escape probability ζ of γ-ray photons. In a postprocessing step, the spectrum is renormalized and convolved with the instrumental response function of the γ-ray telescope. A two-step procedure is used to obtain the emergent spectra, separating the CPU-intensive Monte Carlo transport calculation from the ``fast'' second step and allowing us to study the influence of the instrument on the observables (e.g., E. Müller, P. Höflich, A. M. Khokhlov, & E. Müller, ApJ, 492, 228 [1998]).

  8. Isotopic Dependence of GCR Fluence behind Shielding

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Wilson, John W.; Saganti, Premkumar; Kim, Myung-Hee Y.; Cleghorn, Timothy; Zeitlin, Cary; Tripathi, Ram K.

    2006-01-01

    In this paper we consider the effects of the isotopic composition of the primary galactic cosmic rays (GCR), nuclear fragmentation cross sections, and the isotopic grid on the solution to transport models used for shielding studies. Satellite measurements are used to describe the isotopic composition of the GCR. For the nuclear interaction database and transport solution, we use the quantum multiple-scattering theory of nuclear fragmentation (QMSFRG) and the high-charge-and-energy (HZETRN) transport code, respectively. The QMSFRG model is shown to accurately describe existing fragmentation data, including a proper description of the odd-even effects as a function of the isospin of the projectile nucleus. The principal finding of this study is that large errors (+/-100%) will occur in the mass-fluence spectra when comparing transport models that use a complete isotopic grid (approx. 170 ions) to ones that use a reduced isotopic grid, for example the 59-ion grid used in the HZETRN code in the past; however, less significant errors (<+/-20%) occur in the elemental-fluence spectra. Because a complete isotopic grid is readily handled on small computer workstations and is needed for several applications studying GCR propagation and scattering, it is recommended that complete grids be used for future GCR studies.

  9. Simulations of Laboratory Astrophysics Experiments using the CRASH code

    NASA Astrophysics Data System (ADS)

    Trantham, Matthew; Kuranz, Carolyn; Manuel, Mario; Keiter, Paul; Drake, R. P.

    2014-10-01

    Computer simulations can assist in the design and analysis of laboratory astrophysics experiments. The Center for Radiative Shock Hydrodynamics (CRASH) at the University of Michigan developed a code that has been used to design and analyze high-energy-density experiments on OMEGA, NIF, and other large laser facilities. This Eulerian code uses block-adaptive mesh refinement (AMR) with implicit multigroup radiation transport, electron heat conduction and laser ray tracing. This poster/talk will demonstrate some of the experiments the CRASH code has helped design or analyze including: Kelvin-Helmholtz, Rayleigh-Taylor, imploding bubbles, and interacting jet experiments. This work is funded by the Predictive Sciences Academic Alliances Program in NNSA-ASC via Grant DEFC52-08NA28616, by the NNSA-DS and SC-OFES Joint Program in High-Energy-Density Laboratory Plasmas, Grant Number DE-NA0001840, and by the National Laser User Facility Program, Grant Number DE-NA0000850.

  10. MCNPX Cosmic Ray Shielding Calculations with the NORMAN Phantom Model

    NASA Technical Reports Server (NTRS)

    James, Michael R.; Durkee, Joe W.; McKinney, Gregg; Singleterry, Robert

    2008-01-01

    The United States is planning manned lunar and interplanetary missions in the coming years. Shielding from cosmic rays is a critical aspect of manned spaceflight. These ventures will present exposure issues involving the interplanetary Galactic Cosmic Ray (GCR) environment. GCRs are comprised primarily of protons (approx. 84.5%) and alpha particles (approx. 14.7%), while the remainder is comprised of massive, highly energetic nuclei. The National Aeronautics and Space Administration (NASA) Langley Research Center (LaRC) has commissioned a joint study with Los Alamos National Laboratory (LANL) to investigate the interaction of the GCR environment with humans using high-fidelity, state-of-the-art computer simulations. The simulations involve shielding and dose calculations in order to assess radiation effects in various organs. The simulations are being conducted using high-resolution voxel-phantom models and the MCNPX[1] Monte Carlo radiation-transport code. Recent advances in MCNPX physics packages now enable simulated transport of over 2200 ion types with widely varying energies in large, intricate geometries. We report here initial results obtained using a GCR spectrum and a NORMAN[3] phantom.

  11. Faster and More Accurate Transport Procedures for HZETRN

    NASA Technical Reports Server (NTRS)

    Slaba, Tony C.; Blattnig, Steve R.; Badavi, Francis F.

    2010-01-01

    Several aspects of code verification are examined for HZETRN. First, a detailed derivation of the numerical marching algorithms is given. Next, a new numerical method for light-particle transport is presented, and improvements to the heavy-ion transport algorithm are discussed. A summary of various coding errors is also given, and the impact of these errors on exposure quantities is shown. Finally, a coupled convergence study is conducted. From this study, it is shown that past efforts to quantify the numerical error in HZETRN were limited by single-precision arithmetic and available computational resources. It is also determined that almost all of the discretization error in HZETRN is caused by charged target fragments below 50 AMeV. Total discretization errors are given for the old and new algorithms, and the improved accuracy of the new numerical methods is demonstrated. Run time comparisons are given for three applications in which HZETRN is commonly used. The new algorithms are found to be almost 100 times faster for solar particle event simulations and almost 10 times faster for galactic cosmic ray simulations.
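
A convergence study of the kind described can be sketched by refining the marching step, comparing against a known solution, and estimating the observed order of accuracy. The ODE and first-order scheme below are stand-ins for the transport equations, chosen only to show the procedure:

```python
import numpy as np

def march(n_steps):
    """First-order explicit marching of y' = -y, y(0) = 1, over [0, 1]."""
    h, y = 1.0 / n_steps, 1.0
    for _ in range(n_steps):
        y += h * (-y)          # forward-Euler update
    return y

# Errors against the exact solution e^{-1} on successively refined grids.
errors = {n: abs(march(n) - np.exp(-1.0)) for n in (100, 200, 400)}
order = np.log2(errors[100] / errors[200])
print(f"observed order of accuracy ~ {order:.2f}")
```

Halving the step halves the error for a first-order scheme, so the observed order comes out near 1; a stalled or erratic order under refinement is the signature of precision limits like those noted in the abstract.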

  12. Pion and electromagnetic contribution to dose: Comparisons of HZETRN to Monte Carlo results and ISS data

    NASA Astrophysics Data System (ADS)

    Slaba, Tony C.; Blattnig, Steve R.; Reddell, Brandon; Bahadori, Amir; Norman, Ryan B.; Badavi, Francis F.

    2013-07-01

    Recent work has indicated that pion production and the associated electromagnetic (EM) cascade may be an important contribution to the total astronaut exposure in space. Recent extensions to the deterministic space radiation transport code, HZETRN, allow the production and transport of pions, muons, electrons, positrons, and photons. In this paper, the extended code is compared to the Monte Carlo codes, Geant4, PHITS, and FLUKA, in slab geometries exposed to galactic cosmic ray (GCR) boundary conditions. While improvements in the HZETRN transport formalism for the new particles are needed, it is shown that reasonable agreement on dose is found at larger shielding thicknesses commonly found on the International Space Station (ISS). Finally, the extended code is compared to ISS data on a minute-by-minute basis over a seven day period in 2001. The impact of pion/EM production on exposure estimates and validation results is clearly shown. The Badhwar-O'Neill (BO) 2004 and 2010 models are used to generate the GCR boundary condition at each time-step allowing the impact of environmental model improvements on validation results to be quantified as well. It is found that the updated BO2010 model noticeably reduces overall exposure estimates from the BO2004 model, and the additional production mechanisms in HZETRN provide some compensation. It is shown that the overestimates provided by the BO2004 GCR model in previous validation studies led to deflated uncertainty estimates for environmental, physics, and transport models, and allowed an important physical interaction (π/EM) to be overlooked in model development. Despite the additional π/EM production mechanisms in HZETRN, a systematic under-prediction of total dose is observed in comparison to Monte Carlo results and measured data.

  13. ScintSim1: A new Monte Carlo simulation code for transport of optical photons in 2D arrays of scintillation detectors

    PubMed Central

    Mosleh-Shirazi, Mohammad Amin; Zarrini-Monfared, Zinat; Karbasi, Sareh; Zamani, Ali

    2014-01-01

    Two-dimensional (2D) arrays of thick segmented scintillators are of interest as X-ray detectors for both 2D and 3D image-guided radiotherapy (IGRT). Their detection process involves ionizing radiation energy deposition followed by production and transport of optical photons. Only a very limited number of optical Monte Carlo simulation models exist, which has limited the number of modeling studies that have considered both stages of the detection process. We present ScintSim1, an in-house optical Monte Carlo simulation code for 2D arrays of scintillation crystals, developed in the MATLAB programming environment. The code was rewritten and revised based on an existing program for single-element detectors, with the additional capability to model 2D arrays of elements with configurable dimensions, material, etc. The code generates and follows each optical photon history through the detector element (and, in case of cross-talk, the surrounding ones) until it reaches a configurable receptor, or is attenuated. The new model was verified by testing against relevant theoretically known behaviors or quantities and the results of a validated single-element model. For both sets of comparisons, the discrepancies in the calculated quantities were all <1%. The results validate the accuracy of the new code, which is a useful tool in scintillation detector optimization. PMID:24600168

  15. MESTRN: A Deterministic Meson-Muon Transport Code for Space Radiation

    NASA Technical Reports Server (NTRS)

    Blattnig, Steve R.; Norbury, John W.; Norman, Ryan B.; Wilson, John W.; Singleterry, Robert C., Jr.; Tripathi, Ram K.

    2004-01-01

    A safe and efficient exploration of space requires an understanding of space radiations so that human life and sensitive equipment can be protected. On the way to such sensitive sites, the radiation fields are modified in both quality and quantity. Many of these modifications are thought to be due to the production of pions and muons in the interactions between the radiation and intervening matter. A method to predict the effects of these particles on the transport of radiation through materials is developed. This method was then implemented in software, which was used to calculate the fluxes of pions and muons after the transport of a cosmic ray spectrum through aluminum and water. Software descriptions are given in the appendices.

  16. Isotopic Effects in Nuclear Fragmentation and GCR Transport Problems

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.

    2002-01-01

    Improving the accuracy of the galactic cosmic ray (GCR) environment and transport models is an important goal in preparing for studies of the projected risks and the efficiency of potential mitigation methods for space exploration. In this paper we consider the effects of the isotopic composition of the primary cosmic rays and the isotopic dependence of nuclear fragmentation cross sections on GCR transport models. Measurements are used to describe the isotopic composition of the GCR, including their modulation throughout the solar cycle. The quantum multiple-scattering approach to nuclear fragmentation (QMSFRG) is used as the database generator in order to accurately describe the odd-even effect in fragment production. Using the Badhwar and O'Neill GCR model, the QMSFRG model, and the HZETRN transport code, the effects of the isotopic dependence of the primary GCR composition and of fragment production on transport problems are described for a complete GCR isotopic grid. The principal finding of this study is that large errors (~100%) will occur in the mass-flux spectra when comparing the complete isotopic grid (141 ions) to a reduced isotopic grid (59 ions); however, less significant errors (~30%) occur in the elemental-flux spectra. Because the full isotopic grid is readily handled on small computer workstations, it is recommended that it be used for future GCR studies.

  17. Deep Space Test Bed for Radiation Studies

    NASA Technical Reports Server (NTRS)

    Adams, James H.; Christl, Mark; Watts, John; Kuznetsov, Eugene; Lin, Zi-Wei

    2006-01-01

    A key factor affecting the technical feasibility and cost of missions to Mars or the Moon is the need to protect the crew from ionizing radiation in space. Some analyses indicate that large amounts of spacecraft shielding may be necessary for crew safety. The shielding requirements are driven by the need to protect the crew from Galactic cosmic rays (GCR). Recent research activities aimed at enabling manned exploration have included shielding materials studies. A major goal of this research is to develop accurate radiation transport codes to calculate the shielding effectiveness of materials and to develop effective shielding strategies for spacecraft design. Validation of these models and calculations must be addressed in a relevant radiation environment to assure their technical readiness and accuracy. Test data obtained in the deep space radiation environment can provide definitive benchmarks and yield uncertainty estimates of the radiation transport codes. The two approaches presently used for code validation are ground based testing at particle accelerators and flight tests in high-inclination low-earth orbits provided by the shuttle, free-flyer platforms, or polar-orbiting satellites. These approaches have limitations in addressing all the radiation-shielding issues of deep space missions in both technical and practical areas. An approach based on long duration high altitude polar balloon flights provides exposure to the galactic cosmic ray composition and spectra encountered in deep space at a lower cost and with easier and more frequent access than afforded with spaceflight opportunities. This approach also results in shorter development times than spaceflight experiments, which is important for addressing changing program goals and requirements.

  18. High Energy Electron Detectors on Sphinx

    NASA Astrophysics Data System (ADS)

    Thompson, J. R.; Porte, A.; Zucchini, F.; Calamy, H.; Auriel, G.; Coleman, P. L.; Bayol, F.; Lalle, B.; Krishnan, M.; Wilson, K.

    2008-11-01

    Z-pinch plasma radiation sources are used to dose test objects with K-shell (~1-4 keV) x-rays. The implosion physics can produce high energy electrons (>50 keV), which could distort interpretation of the soft x-ray effects. We describe the design and implementation of a diagnostic suite to characterize the electron environment of Al wire and Ar gas puff z-pinches on Sphinx. The design used ITS calculations to model detector response to both soft x-rays and electrons and help set upper bounds to the spurious electron flux. Strategies to discriminate between the known soft x-ray emission and the suspected electron flux will be discussed. References: H. Calamy et al., "Use of microsecond current prepulse for dramatic improvements of wire array Z-pinch implosion," Phys. Plasmas 15, 012701 (2008); J. A. Halbleib et al., "ITS: the integrated TIGER series of electron/photon transport codes - Version 3.0," IEEE Trans. on Nuclear Sci. 39, 1025 (1992).

  19. Probing the Accretion Geometry of Black Holes with X-Ray Polarization

    NASA Technical Reports Server (NTRS)

    Schnitman, Jeremy D.

    2011-01-01

    In the coming years, new space missions will be able to measure X-ray polarization at levels of 1% or better in the approx.1-10 keV energy band. In particular, X-ray polarization is an ideal tool for determining the nature of black hole (BH) accretion disks surrounded by hot coronae. Using a Monte Carlo radiation transport code in full general relativity, we calculate the spectra and polarization features of these BH systems. At low energies, the signal is dominated by the thermal flux coming directly from the optically thick disk. At higher energies, the thermal seed photons have been inverse-Compton scattered by the corona, often reflecting back off the disk before reaching the observer, giving a distinctive polarization signature. By measuring the degree and angle of this X-ray polarization, we can infer the BH inclination, the emission geometry of the accretion flow, and also determine the spin of the black hole.

  20. Consideration of the Protection Curtain's Shielding Ability after Identifying the Source of Scattered Radiation in the Angiography.

    PubMed

    Sato, Naoki; Fujibuchi, Toshioh; Toyoda, Takatoshi; Ishida, Takato; Ohura, Hiroki; Miyajima, Ryuichi; Orita, Shinichi; Sueyoshi, Tomonari

    2017-06-15

    To decrease radiation exposure to medical staff performing angiography, the dose distribution in the angiography room was calculated using the particle and heavy ion transport code system (PHITS), which is based on the Monte Carlo method, and the source of scattered radiation was confirmed using a tungsten sheet by comparing the shielding performance among different sheet placements. Scattered radiation generated from the flat panel detector, X-ray tube and bed was calculated using PHITS. In this experiment, the source of scattered radiation was identified as the phantom or the acrylic window attached to the X-ray tube; thus, a protection curtain was placed on the bed to shield against scattered radiation at low positions. There was an average difference of 20% between the measured and calculated values. The H*(10) value decreased after placing the sheet on the right side of the phantom. Thus, the curtain could decrease scattered radiation. © Crown copyright 2016.

  1. Kinetic Modeling of Ultraintense X-Ray Laser-Matter Interactions

    NASA Astrophysics Data System (ADS)

    Royle, Ryan; Sentoku, Yasuhiko; Mancini, Roberto; Johzaki, Tomoyuki

    2015-11-01

    High-intensity XFELs have become a novel way of creating and studying hot dense plasmas. The LCLS at Stanford can deliver a millijoule of energy with more than 10^12 photons in a ~100 femtosecond pulse. By tightly focusing the beam to a micron-scale spot size, the XFEL can be intensified to more than 10^18 W/cm^2, making it possible to heat solid matter isochorically beyond a million degrees (>100 eV). Such extreme states of matter are of considerable interest due to their relevance to astrophysical plasmas. Additionally, they will allow novel ways of studying equation-of-state and opacity physics under Gbar pressure and strong fields. Photoionization is the dominant x-ray absorption mechanism and triggers the heating processes. A photoionization model that takes into account the subshell cross-sections has been developed in a kinetic plasma simulation code, PICLS, that solves the x-ray transport self-consistently. The XFEL-matter interaction with several elements, including solid carbon, aluminum, and iron, is studied with the code, and the results are compared with recent LCLS experiments. This work was supported by the DOE/OFES under Contract No. DE-SC0008827.
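    The quoted focused intensity follows from a one-line estimate: pulse energy divided by pulse duration and focal-spot area. The pulse energy and duration figures below are taken from the abstract; the circular 1 µm focal spot is our own assumption for illustration.

```python
import math

pulse_energy_J = 1e-3         # ~1 mJ per LCLS pulse (from the abstract)
pulse_duration_s = 100e-15    # ~100 fs pulse (from the abstract)
spot_diameter_cm = 1e-4       # assumed ~1 micron-diameter focal spot

spot_area_cm2 = math.pi * (spot_diameter_cm / 2.0) ** 2
intensity_W_cm2 = pulse_energy_J / pulse_duration_s / spot_area_cm2
print(f"focused intensity ~ {intensity_W_cm2:.2e} W/cm^2")
```

With these inputs the estimate lands at roughly 1.3e18 W/cm^2, consistent with the ">10^18 W/cm^2" figure in the abstract.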

  2. Monte Carlo simulation of x-ray buildup factors of lead and its applications in shielding of diagnostic x-ray facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kharrati, Hedi; Agrebi, Amel; Karaoui, Mohamed-Karim

    2007-04-15

    X-ray buildup factors of lead in broad beam geometry for energies from 15 to 150 keV are determined using the general purpose Monte Carlo N-particle radiation transport computer code (MCNP4C). The obtained buildup factor data are fitted to a modified three-parameter Archer et al. model for ease in calculating the broad beam transmission by computer at any tube potential/filter combination in the diagnostic energy range. An example of their use to compute the broad beam transmission at 70, 100, 120, and 140 kVp is given. The calculated broad beam transmission is compared to data derived from the literature, showing good agreement. Therefore, the combination of the buildup factor data as determined and a mathematical model to generate x-ray spectra provides a computationally based solution to broad beam transmission for lead barriers in shielding x-ray facilities.
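    The fit mentioned above uses the standard three-parameter Archer et al. transmission model. A sketch of that functional form follows; the alpha, beta, gamma values below are illustrative placeholders, not the parameters fitted in this work.

```python
import math

def archer_transmission(x_mm, alpha, beta, gamma):
    """Archer et al. broad-beam transmission through x_mm of shielding:
    T(x) = [(1 + beta/alpha) * exp(alpha*gamma*x) - beta/alpha]**(-1/gamma).
    T(0) = 1 by construction, and T decreases monotonically with thickness."""
    r = beta / alpha
    return ((1.0 + r) * math.exp(alpha * gamma * x_mm) - r) ** (-1.0 / gamma)

# Placeholder parameters (per mm of lead), for illustration only:
alpha, beta, gamma = 2.0, 10.0, 0.5
for x in (0.0, 0.5, 1.0, 2.0):
    print(f"{x:.1f} mm -> T = {archer_transmission(x, alpha, beta, gamma):.4f}")
```

In shielding practice one inverts this relation numerically to find the barrier thickness giving a required transmission at each tube potential/filter combination.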

  3. Application of the MCNPX-McStas interface for shielding calculations and guide design at ESS

    NASA Astrophysics Data System (ADS)

    Klinkby, E. B.; Knudsen, E. B.; Willendrup, P. K.; Lauritzen, B.; Nonbøl, E.; Bentley, P.; Filges, U.

    2014-07-01

    Recently, an interface between the Monte Carlo code MCNPX and the neutron ray-tracing code McStas was developed [1, 2]. Based on the expected neutronic performance and guide geometries relevant for the ESS, the combined MCNPX-McStas code is used to calculate dose rates along neutron beam guides. The generation and moderation of neutrons is simulated using a full scale MCNPX model of the ESS target monolith. Upon entering the neutron beam extraction region, the individual neutron states are handed to McStas via the MCNPX-McStas interface. McStas transports the neutrons through the beam guide, and by using newly developed event-logging capability, the neutron state parameters corresponding to un-reflected neutrons are recorded at each scattering. This information is handed back to MCNPX, where it serves as neutron source input for a second MCNPX simulation. This simulation enables calculation of dose rates in the vicinity of the guide. In addition, the logging mechanism is employed to record the scatterings along the guides, which is exploited to determine the supermirror quality requirements (i.e. m-values) needed at different positions along the beam guide to transport neutrons in the same guide/source setup.
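    The event-logging handoff described above (record the state of every un-reflected neutron at each guide scattering so it can seed a second shielding calculation) can be illustrated with a toy supermirror criterion: a bounce is reflected only if its momentum transfer stays below m times the natural-Ni critical value, otherwise the state is logged as lost. Everything here (straight guide, binary reflectivity, the specific m-value) is a simplifying assumption for illustration, not the MCNPX-McStas implementation.

```python
import math

QC_NI = 0.0219  # critical momentum transfer of natural Ni, 1/Angstrom (approx.)

def trace_guide(wavelength_A, angle_rad, guide_width_m, guide_length_m, m_value):
    """Bounce a neutron down a straight guide; return (transmitted, lost_states).
    lost_states collects the (position, angle) of every un-reflected bounce,
    mimicking the event log handed back to the shielding calculation.
    Assumes the neutron starts at one wall and crosses the full width
    between bounces, with specular (angle-preserving) reflection."""
    k = 2.0 * math.pi / wavelength_A                      # wavevector, 1/Angstrom
    x, lost = 0.0, []
    while x < guide_length_m:
        x_next = x + guide_width_m / math.tan(angle_rad)  # distance to next wall hit
        if x_next >= guide_length_m:
            return True, lost                             # exits without another bounce
        q = 2.0 * k * math.sin(angle_rad)                 # momentum transfer at bounce
        if q > m_value * QC_NI:
            lost.append((x_next, angle_rad))              # not reflected: log for MCNPX
            return False, lost
        x = x_next                                        # reflected, keep going
    return True, lost
```

In the real workflow the logged states carry full position, direction, and energy information, and are re-emitted as a distributed source term along the guide walls.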

  4. Depth dependency of neutron density produced by cosmic rays in the lunar subsurface

    NASA Astrophysics Data System (ADS)

    Ota, S.; Sihver, L.; Kobayashi, S.; Hasebe, N.

    2014-11-01

    Depth dependency of neutrons produced by cosmic rays (CRs) in the lunar subsurface was estimated using the three-dimensional Monte Carlo particle and heavy ion transport simulation code, PHITS, incorporating the latest high energy nuclear data, JENDL/HE-2007. The PHITS simulations of equilibrium neutron density profiles in the lunar subsurface were compared with the measurement by the Apollo 17 Lunar Neutron Probe Experiment (LNPE). Our calculations reproduced the LNPE data except for the 350-400 mg/cm^2 region under the improved condition using the CR spectra model based on the latest observations, well-tested nuclear interaction models with systematic cross section data, and JENDL/HE-2007.

  5. MODA: a new algorithm to compute optical depths in multidimensional hydrodynamic simulations

    NASA Astrophysics Data System (ADS)

    Perego, Albino; Gafton, Emanuel; Cabezón, Rubén; Rosswog, Stephan; Liebendörfer, Matthias

    2014-08-01

    Aims: We introduce the multidimensional optical depth algorithm (MODA) for the calculation of optical depths in approximate multidimensional radiative transport schemes, equally applicable to neutrinos and photons. Motivated by (but not limited to) neutrino transport in three-dimensional simulations of core-collapse supernovae and neutron star mergers, our method makes no assumptions about the geometry of the matter distribution, apart from expecting optically transparent boundaries. Methods: Based on local information about opacities, the algorithm figures out an escape route that tends to minimize the optical depth without assuming any predefined paths for radiation. Its adaptivity makes it suitable for a variety of astrophysical settings with complicated geometry (e.g., core-collapse supernovae, compact binary mergers, tidal disruptions, star formation, etc.). We implement the MODA algorithm into both a Eulerian hydrodynamics code with a fixed, uniform grid and into an SPH code where we use a tree structure that is otherwise used for searching neighbors and calculating gravity. Results: In a series of numerical experiments, we compare the MODA results with analytically known solutions. We also use snapshots from actual 3D simulations and compare the results of MODA with those obtained with other methods, such as the global and local ray-by-ray method. It turns out that MODA achieves excellent accuracy at a moderate computational cost. In an appendix we also discuss implementation details and parallelization strategies.
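    The core idea (use only local opacity information to walk an escape route that tends to minimize the accumulated optical depth) can be sketched on a 2D grid. This is a schematic re-implementation of the idea, not the MODA code: the grid, the greedy 4-neighbor rule, and the absence of any cycle guard are all illustrative simplifications.

```python
def greedy_optical_depth(kappa, start, cell_size=1.0):
    """Walk from `start` to the grid edge, always stepping to the 4-neighbor
    with the lowest opacity (local information only), and accumulate
    tau = sum(kappa * path length) along the way. Purely illustrative:
    no cycle protection, simple first-wins tie-breaking."""
    ny, nx = len(kappa), len(kappa[0])
    i, j = start
    tau = 0.0
    while True:
        tau += kappa[i][j] * cell_size           # pay this cell's optical depth
        neighbors = [(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]
        outside = [(a, b) for a, b in neighbors
                   if not (0 <= a < ny and 0 <= b < nx)]
        if outside:
            return tau                            # next step leaves the grid: escaped
        i, j = min(neighbors, key=lambda ab: kappa[ab[0]][ab[1]])
```

For a uniform medium this reduces to opacity times the straight-line distance to the nearest boundary, while a low-opacity channel in the grid pulls the escape route toward it, which is the behavior the algorithm exploits.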

  6. COMPARISON OF COSMIC-RAY ENVIRONMENTS ON EARTH, MOON, MARS AND IN SPACECRAFT USING PHITS.

    PubMed

    Sato, Tatsuhiko; Nagamatsu, Aiko; Ueno, Haruka; Kataoka, Ryuho; Miyake, Shoko; Takeda, Kazuo; Niita, Koji

    2017-09-29

    Estimation of cosmic-ray doses is of great importance not only in aircrew and astronaut dosimetry but also in evaluation of background radiation exposure to the public. We therefore calculated the cosmic-ray doses on Earth, Moon and Mars as well as inside spacecraft, using the Particle and Heavy Ion Transport code System PHITS. The same cosmic-ray models and dose conversion coefficients were employed in the calculations to properly compare the simulation results for different environments. It is quantitatively confirmed that the thickness of physical shielding, including the atmosphere and soil of the planets, is the most important parameter determining the cosmic-ray doses and their dominant contributors. The comparison also suggests that higher solar activity significantly reduces the astronaut doses, particularly for interplanetary missions. The information obtained from this study is useful in the design of future space missions as well as accelerator-based experiments dedicated to cosmic-ray research. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  7. Rates for neutron-capture reactions on tungsten isotopes in iron meteorites. [Abstract only]

    NASA Technical Reports Server (NTRS)

    Masarik, J.; Reedy, R. C.

    1994-01-01

    High-precision W isotopic analyses by Harper and Jacobsen indicate that the W-182/W-183 ratio in the Toluca iron meteorite is shifted by -(3.0 +/- 0.9) x 10^-4 relative to a terrestrial standard. Possible causes of this shift are neutron-capture reactions on W during Toluca's approximately 600-Ma exposure to cosmic-ray particles, or radiogenic growth of W-182 from 9-Ma Hf-182 in the silicate portion of the Earth after removal of W to the Earth's core. Calculations of the rates of neutron-capture reactions on W isotopes were done to study the first possibility. The LAHET Code System (LCS), which consists of the Los Alamos High Energy Transport (LAHET) code and the Monte Carlo N-Particle (MCNP) transport code, was used to numerically simulate the irradiation of the Toluca iron meteorite by galactic-cosmic-ray (GCR) particles and to calculate the rates of W(n, gamma) reactions. Toluca was modeled as a 3.9-m-radius sphere with the composition of a typical IA iron meteorite. The incident GCR protons and their interactions were modeled with LAHET, which also handled the interactions of neutrons with energies above 20 MeV. The rates for the capture of neutrons by W-182, W-183, and W-186 were calculated using the detailed library of (n, gamma) cross sections in MCNP. For this study of the possible effect of W(n, gamma) reactions on W isotope systematics, we consider the peak rates. The calculated maximum change in the normalized W-182/W-183 ratio due to neutron-capture reactions cannot account for more than 25% of the mass-182 deficit observed in Toluca W.

  8. HARD X-RAY ASYMMETRY LIMITS IN SOLAR FLARE CONJUGATE FOOTPOINTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Daou, Antoun G.; Alexander, David, E-mail: agdaou@rice.edu, E-mail: dalex@rice.edu

    2016-11-20

    The transport of energetic electrons in a solar flare is modeled using a time-dependent one-dimensional Fokker–Planck code that incorporates asymmetric magnetic convergence. We derive the temporal and spectral evolution of the resulting hard X-ray (HXR) emission in the conjugate chromospheric footpoints, assuming thick target photon production, and characterize the time evolution of the numerically simulated footpoint asymmetry and its relationship to the photospheric magnetic configuration. The thick target HXR asymmetry in the conjugate footpoints is found to increase with magnetic field ratio as expected. However, we find that the footpoint HXR asymmetry saturates for conjugate footpoint magnetic field ratios ≥4. This result is borne out in a direct comparison with observations of 44 double-footpoint flares. The presence of such a limit has not been reported before, and may serve as both a theoretical and observational benchmark for testing a range of particle transport and flare morphology constraints, particularly as a means to differentiate between isotropic and anisotropic particle injection.

  9. Monte Carlo simulation of x-ray spectra in diagnostic radiology and mammography using MCNP4C

    NASA Astrophysics Data System (ADS)

    Ay, M. R.; Shahriari, M.; Sarkar, S.; Adib, M.; Zaidi, H.

    2004-11-01

    The general purpose Monte Carlo N-particle radiation transport computer code (MCNP4C) was used for the simulation of x-ray spectra in diagnostic radiology and mammography. The electrons were transported until they slowed down and stopped in the target. Both bremsstrahlung and characteristic x-ray production were considered in this work. We focus on the simulation of various target/filter combinations to investigate the effect of tube voltage, target material and filter thickness on x-ray spectra in the diagnostic radiology and mammography energy ranges. The simulated x-ray spectra were compared with experimental measurements and spectra calculated according to IPEM report number 78. In addition, the anode heel effect and off-axis x-ray spectra were assessed for different anode angles and target materials, and the results were compared with EGS4-based Monte Carlo simulations and measured data. Quantitative evaluation of the differences between our Monte Carlo simulated and comparison spectra was performed using Student's t-test statistical analysis. Generally, there is good agreement between the simulated and comparison spectra, although there are systematic differences, especially in the K-characteristic x-ray intensities. Nevertheless, no statistically significant differences have been observed between the IPEM spectra and the simulated spectra. It has been shown that the difference between the MCNP-simulated spectra and the IPEM spectra in the low energy range is the result of the overestimation of characteristic photons following the normalization procedure. The transmission curves produced by MCNP4C show good agreement with the IPEM report, especially for tube voltages of 50 kV and 80 kV. The systematic discrepancy for higher tube voltages is the result of systematic differences between the corresponding spectra.

  10. Study on efficiency of time computation in x-ray imaging simulation base on Monte Carlo algorithm using graphics processing unit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Setiani, Tia Dwi, E-mail: tiadwisetiani@gmail.com; Suprijadi; Nuclear Physics and Biophysics Research Division, Faculty of Mathematics and Natural Sciences, Institut Teknologi Bandung, Jalan Ganesha 10, Bandung 40132

    Monte Carlo (MC) is one of the powerful techniques for simulation in x-ray imaging. The MC method can simulate radiation transport within matter with high accuracy and provides a natural way to simulate radiation transport in complex systems. One of the MC-based codes widely used for radiographic image simulation is MC-GPU, a code developed by Andreu Badal. This study aimed to investigate the computation time of x-ray imaging simulation on a GPU (Graphics Processing Unit) compared to a standard CPU (Central Processing Unit). Furthermore, the effect of physical parameters on the quality of radiographic images and the comparison of image quality resulting from simulation on the GPU and CPU are evaluated in this paper. The simulations were run serially on a CPU and in parallel on two GPUs with 384 cores and 2304 cores. In the GPU simulations, each core calculates one photon, so a large number of photons are calculated simultaneously. Results show that the simulations on the GPU were significantly accelerated compared to the CPU. The simulations on the 2304-core GPU were performed about 64-114 times faster than on the CPU, while the simulations on the 384-core GPU were performed about 20-31 times faster than on a single core of the CPU. Another result shows that optimum image quality was obtained with histories starting from 10^8 and energies from 60 keV to 90 keV. Analyzed by a statistical approach, the quality of the GPU and CPU images is relatively the same.
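    The "one core calculates one photon" strategy maps naturally onto array programming, where a whole batch of independent photon histories is processed in a single data-parallel operation. The NumPy sketch below illustrates only that pattern for a trivial slab-transmission problem; it is not MC-GPU, and the attenuation coefficient and slab thickness are placeholder values.

```python
import numpy as np

def fraction_transmitted(mu_per_mm, thickness_mm, n_photons, seed=0):
    """Draw one exponential free path per photon in a single vectorized
    call (the data-parallel analogue of 'one core, one photon') and count
    the photons whose free path exceeds the slab thickness."""
    rng = np.random.default_rng(seed)
    free_paths = rng.exponential(scale=1.0 / mu_per_mm, size=n_photons)
    return float(np.mean(free_paths > thickness_mm))

# Analytic transmission through a slab is exp(-mu * t); here mu*t = 1,
# so the estimate should land near exp(-1) ~ 0.368.
est = fraction_transmitted(mu_per_mm=0.2, thickness_mm=5.0, n_photons=200_000)
print(f"estimated transmission: {est:.3f}")
```

Because every photon history is independent, this same batch structure is exactly what a GPU kernel exploits; the vectorized draw plays the role of the per-core photon loop.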

  11. Three-dimensional Monte-Carlo simulation of gamma-ray scattering and production in the atmosphere

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morris, D.J.

    1989-05-15

    Monte Carlo codes have been developed to simulate gamma-ray scattering and production in the atmosphere. The scattering code simulates interactions of low-energy gamma rays (20 to several hundred keV) from an astronomical point source in the atmosphere; a modified code also simulates scattering in a spacecraft. Four incident spectra, typical of gamma-ray bursts, solar flares, and the Crab pulsar, and 511 keV line radiation have been studied. These simulations are consistent with observations of solar flare radiation scattered from the atmosphere. The production code simulates the interactions of cosmic rays which produce high-energy (above 10 MeV) photons and electrons. It has been used to calculate gamma-ray and electron albedo intensities at Palestine, Texas and at the equator; the results agree with observations in most respects. With minor modifications this code can be used to calculate intensities of other high-energy particles. Both codes are fully three-dimensional, incorporating a curved atmosphere; the production code also incorporates the variation with both zenith and azimuth of the incident cosmic-ray intensity due to geomagnetic effects. These effects are clearly reflected in the calculated albedo by intensity contrasts between the horizon and nadir, and between the east and west horizons.

  12. Neutron production by cosmic-ray muons in various materials

    NASA Astrophysics Data System (ADS)

    Manukovsky, K. V.; Ryazhskaya, O. G.; Sobolevsky, N. M.; Yudin, A. V.

    2016-07-01

    The results obtained by studying the background of neutrons produced by cosmic-ray muons in underground experimental facilities intended for rare-event searches and in surrounding rock are presented. This rock may include granite, sedimentary rock, gypsum, and rock salt. Neutron production and transfer were simulated using the Geant4 and SHIELD transport codes. These codes were tuned via a comparison of the results of calculations with experimental data—in particular, with data of the Artemovsk research station of the Institute for Nuclear Research (INR, Moscow, Russia)—as well as via an intercomparison of results of calculations with the Geant4 and SHIELD codes. It turns out that the atomic-number dependence of the production and yield of neutrons has an irregular character and does not allow a description in terms of a universal function of the atomic number. The parameters of this dependence are different for two groups of nuclei—nuclei consisting of alpha particles and all of the remaining nuclei. Moreover, there are manifest exceptions from a power-law dependence—for example, argon. This may entail important consequences both for the existing underground experimental facilities and for those under construction. Investigation of cosmic-ray-induced neutron production in various materials is of paramount importance for the interpretation of experiments conducted at large depths under the Earth's surface.

  13. Mass-invariance of the iron enrichment in the hot haloes of massive ellipticals, groups, and clusters of galaxies

    NASA Astrophysics Data System (ADS)

    Mernier, F.; de Plaa, J.; Werner, N.; Kaastra, J. S.; Raassen, A. J. J.; Gu, L.; Mao, J.; Urdampilleta, I.; Truong, N.; Simionescu, A.

    2018-05-01

    X-ray measurements find systematically lower Fe abundances in the X-ray emitting haloes pervading groups (kT ≲ 1.7 keV) than in clusters of galaxies. These results have been difficult to reconcile with theoretical predictions. However, models using incomplete atomic data or the assumption of isothermal plasmas may have biased the best fit Fe abundance in groups and giant elliptical galaxies low. In this work, we take advantage of a major update of the atomic code in the spectral fitting package SPEX to re-evaluate the Fe abundance in 43 clusters, groups, and elliptical galaxies (the CHEERS sample) in a self-consistent analysis and within a common radius of 0.1r500. For the first time, we report a remarkably similar average Fe enrichment in all these systems. Unlike previous results, this strongly suggests that metals are synthesised and transported in these haloes with the same average efficiency across two orders of magnitude in total mass. We show that the previous metallicity measurements in low temperature systems were biased low due to incomplete atomic data in the spectral fitting codes. The reasons for such a code-related Fe bias, also implying previously unconsidered biases in the emission measure and temperature structure, are discussed.

  14. Comparisons of cross-section predictions for relativistic iron and argon beams with semiempirical fragmentation models

    NASA Technical Reports Server (NTRS)

    Townsend, Lawrence W.; Tripathi, Ram K.; Khan, Ferdous

    1993-01-01

    Cross-section predictions with semi-empirical nuclear fragmentation models from the Langley Research Center and the Naval Research Laboratory are compared with experimental data for the breakup of relativistic iron and argon projectile nuclei in various targets. Both these models are commonly used to provide fragmentation cross-section inputs into galactic cosmic ray transport codes for shielding and exposure analyses. Overall, the Langley model appears to yield better agreement with the experimental data.

  15. Hard X-ray imaging from Explorer

    NASA Technical Reports Server (NTRS)

    Grindlay, J. E.; Murray, S. S.

    1981-01-01

    Coded aperture X-ray detectors were applied to obtain large increases in sensitivity as well as angular resolution. A hard X-ray coded aperture detector concept is described which enables very high sensitivity studies of persistent hard X-ray sources and gamma-ray bursts. Coded aperture imaging is employed so that approx. 2 arcmin source locations can be derived within a 3 deg field of view. Gamma-ray bursts were located initially to within approx. 2 deg, and X-ray/hard X-ray spectra and timing, as well as precise locations, were derived for possible burst afterglow emission. It is suggested that hard X-ray imaging should be conducted from an Explorer mission where long exposure times are possible.

  16. Monte Carlo simulation of X-ray imaging and spectroscopy experiments using quadric geometry and variance reduction techniques

    NASA Astrophysics Data System (ADS)

    Golosio, Bruno; Schoonjans, Tom; Brunetti, Antonio; Oliva, Piernicola; Masala, Giovanni Luca

    2014-03-01

    The simulation of X-ray imaging experiments is often performed using deterministic codes, which can be relatively fast and easy to use. However, such codes are generally not suitable for the simulation of even slightly more complex experimental conditions, involving, for instance, first-order or higher-order scattering, X-ray fluorescence emissions, or more complex geometries, particularly for experiments that combine spatial resolution with spectral information. In such cases, simulations are often performed using codes based on the Monte Carlo method. In a simple Monte Carlo approach, the interaction position of an X-ray photon and the state of the photon after an interaction are obtained simply according to the theoretical probability distributions. This approach may be quite inefficient because the final channels of interest may include only a limited region of space or photons produced by a rare interaction, e.g., fluorescent emission from elements with very low concentrations. In the field of X-ray fluorescence spectroscopy, this problem has been solved by combining the Monte Carlo method with variance reduction techniques, which can reduce the computation time by several orders of magnitude. In this work, we present a C++ code for the general simulation of X-ray imaging and spectroscopy experiments, based on the application of the Monte Carlo method in combination with variance reduction techniques, with a description of sample geometry based on quadric surfaces. We describe the benefits of the object-oriented approach in terms of code maintenance, the flexibility of the program for the simulation of different experimental conditions and the possibility of easily adding new modules. Sample applications in the fields of X-ray imaging and X-ray spectroscopy are discussed. Catalogue identifier: AERO_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AERO_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. 
Ireland Licensing provisions: GNU General Public License version 3 No. of lines in distributed program, including test data, etc.: 83617 No. of bytes in distributed program, including test data, etc.: 1038160 Distribution format: tar.gz Programming language: C++. Computer: Tested on several PCs and on Mac. Operating system: Linux, Mac OS X, Windows (native and cygwin). RAM: It is dependent on the input data but usually between 1 and 10 MB. Classification: 2.5, 21.1. External routines: XrayLib (https://github.com/tschoonj/xraylib/wiki) Nature of problem: Simulation of a wide range of X-ray imaging and spectroscopy experiments using different types of sources and detectors. Solution method: XRMC is a versatile program that is useful for the simulation of a wide range of X-ray imaging and spectroscopy experiments. It enables the simulation of monochromatic and polychromatic X-ray sources, with unpolarised or partially/completely polarised radiation. Single-element detectors as well as two-dimensional pixel detectors can be used in the simulations, with several acquisition options. In the current version of the program, the sample is modelled by combining convex three-dimensional objects demarcated by quadric surfaces, such as planes, ellipsoids and cylinders. The Monte Carlo approach makes XRMC able to accurately simulate X-ray photon transport and interactions with matter up to any order of interaction. The differential cross-sections and all other quantities related to the interaction processes (photoelectric absorption, fluorescence emission, elastic and inelastic scattering) are computed using the xraylib software library, which is currently the most complete and up-to-date software library for X-ray parameters. The use of variance reduction techniques makes XRMC able to reduce the simulation time by several orders of magnitude compared to other general-purpose Monte Carlo simulation programs. Running time: It is dependent on the complexity of the simulation. 
For the examples distributed with the code, it ranges from less than 1 s to a few minutes.
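XRMC's speed-up comes from variance reduction. A minimal, generic illustration (a sketch of the general idea, not XRMC's actual implementation) is interaction forcing: every history is forced to interact inside the region of interest and carries a compensating statistical weight, so no histories are wasted on photons that pass through without interacting.

```python
import math
import random

def analog_estimate(mu, t, n, rng):
    """Analog MC: count photons whose sampled free path falls in [0, t]."""
    hits = 0
    for _ in range(n):
        s = -math.log(rng.random()) / mu   # free path ~ exponential(mu)
        if s < t:
            hits += 1
    return hits / n

def forced_estimate(mu, t, n, rng):
    """Interaction forcing: every history interacts inside [0, t],
    carrying the weight w = 1 - exp(-mu * t)."""
    w = 1.0 - math.exp(-mu * t)
    total = 0.0
    for _ in range(n):
        # Depth drawn from the truncated exponential; this is where a
        # secondary (e.g. fluorescence) photon would be started.
        u = rng.random()
        s = -math.log(1.0 - u * w) / mu
        assert s < t
        total += w          # every history scores the same weight
    return total / n
```

For this scalar score the forced estimator has zero variance, while the analog one needs many histories when `mu * t` is small, which is exactly the thin-sample, rare-interaction regime the abstract describes.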

  17. Measuring and interpreting X-ray fluorescence from planetary surfaces.

    PubMed

    Owens, Alan; Beckhoff, Burkhard; Fraser, George; Kolbe, Michael; Krumrey, Michael; Mantero, Alfonso; Mantler, Michael; Peacock, Anthony; Pia, Maria-Grazia; Pullan, Derek; Schneider, Uwe G; Ulm, Gerhard

    2008-11-15

As part of a comprehensive study of X-ray emission from planetary surfaces and in particular the planet Mercury, we have measured fluorescent radiation from a number of planetary analog rock samples using monochromatized synchrotron radiation provided by the BESSY II electron storage ring. The experiments were carried out using a purpose-built X-ray fluorescence (XRF) spectrometer chamber developed by the Physikalisch-Technische Bundesanstalt, Germany's national metrology institute. The XRF instrumentation is absolutely calibrated and allows for reference-free quantitation of rock sample composition, taking into account secondary photon- and electron-induced enhancement effects. The fluorescence data, in turn, have been used to validate a planetary fluorescence simulation tool based on the GEANT4 transport code. This simulation can be used as a mission analysis tool to predict the time-dependent orbital XRF spectral distributions from planetary surfaces throughout the mapping phase.

  18. Contrast cancellation technique applied to digital x-ray imaging using silicon strip detectors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Avila, C.; Lopez, J.; Sanabria, J. C.

    2005-12-15

Dual-energy mammographic imaging experimental tests have been performed using a compact dichromatic imaging system based on a conventional x-ray tube, a mosaic crystal, and a 384-strip silicon detector equipped with full-custom electronics with single photon counting capability. For simulating mammal tissue, a three-component phantom, made of Plexiglass, polyethylene, and water, has been used. Images have been collected with three different pairs of x-ray energies: 16-32 keV, 18-36 keV, and 20-40 keV. A Monte Carlo simulation of the experiment has also been carried out using the MCNP-4C transport code. The Alvarez-Macovski algorithm has been applied both to experimental and simulated data to remove the contrast between two of the phantom materials so as to enhance the visibility of the third one.
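The contrast-cancellation step can be sketched as a weighted log-subtraction of the low- and high-energy images: choosing the weight as the ratio of one material's attenuation coefficients makes that material's thickness drop out. The coefficients below are illustrative placeholders, not the phantom's measured values.

```python
# Illustrative (made-up) linear attenuation coefficients, cm^-1,
# for two phantom materials at a (low, high) pair of tube energies.
MU = {"plexiglas": (0.55, 0.30), "water": (0.40, 0.25)}

def log_signal(thicknesses, energy):
    """Attenuation line integral sum_m mu_m(E) * t_m (energy: 0=low, 1=high)."""
    return sum(MU[m][energy] * t for m, t in thicknesses.items())

def cancel(thicknesses, material):
    """Weighted log subtraction that cancels `material`'s contrast:
    w * S_low - S_high with w = mu_high(material) / mu_low(material)."""
    w = MU[material][1] / MU[material][0]
    return w * log_signal(thicknesses, 0) - log_signal(thicknesses, 1)
```

With the weight chosen for Plexiglas, changing only the Plexiglas thickness leaves the combined signal unchanged, while changes in the remaining material are still visible.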

  19. Theoretical modeling of a portable x-ray tube based KXRF system to measure lead in bone

    PubMed Central

    Specht, Aaron J; Weisskopf, Marc G; Nie, Linda Huiling

    2017-01-01

    Objective K-shell x-ray fluorescence (KXRF) techniques have been used to identify health effects resulting from exposure to metals for decades, but the equipment is bulky and requires significant maintenance and licensing procedures. A portable x-ray fluorescence (XRF) device was developed to overcome these disadvantages, but introduced a measurement dependency on soft tissue thickness. With recent advances to detector technology, an XRF device utilizing the advantages of both systems should be feasible. Approach In this study, we used Monte Carlo simulations to test the feasibility of an XRF device with a high-energy x-ray tube and detector operable at room temperature. Main Results We first validated the use of Monte Carlo N-particle transport code (MCNP) for x-ray tube simulations, and found good agreement between experimental and simulated results. Then, we optimized x-ray tube settings and found the detection limit of the high-energy x-ray tube based XRF device for bone lead measurements to be 6.91 μg g−1 bone mineral using a cadmium zinc telluride detector. Significance In conclusion, this study validated the use of MCNP in simulations of x-ray tube physics and XRF applications, and demonstrated the feasibility of a high-energy x-ray tube based XRF for metal exposure assessment. PMID:28169835

  20. Theoretical modeling of a portable x-ray tube based KXRF system to measure lead in bone.

    PubMed

    Specht, Aaron J; Weisskopf, Marc G; Nie, Linda Huiling

    2017-03-01

K-shell x-ray fluorescence (KXRF) techniques have been used to identify health effects resulting from exposure to metals for decades, but the equipment is bulky and requires significant maintenance and licensing procedures. A portable x-ray fluorescence (XRF) device was developed to overcome these disadvantages, but introduced a measurement dependency on soft tissue thickness. With recent advances to detector technology, an XRF device utilizing the advantages of both systems should be feasible. In this study, we used Monte Carlo simulations to test the feasibility of an XRF device with a high-energy x-ray tube and detector operable at room temperature. We first validated the use of Monte Carlo N-particle transport code (MCNP) for x-ray tube simulations, and found good agreement between experimental and simulated results. Then, we optimized x-ray tube settings and found the detection limit of the high-energy x-ray tube based XRF device for bone lead measurements to be 6.91 µg g⁻¹ bone mineral using a cadmium zinc telluride detector. In conclusion, this study validated the use of MCNP in simulations of x-ray tube physics and XRF applications, and demonstrated the feasibility of a high-energy x-ray tube based XRF for metal exposure assessment.
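A common 3-sigma form of an XRF detection-limit estimate from a calibration measurement is sketched below; note this is a generic shortcut, while the 6.91 µg g⁻¹ figure in the abstract comes from full MCNP simulations.

```python
import math

def detection_limit(conc, net_counts, bkg_counts):
    """Generic 3-sigma XRF detection-limit estimate:
    DL = 3 * conc * sqrt(background counts) / net signal counts,
    where `conc` is the known concentration of the calibration sample."""
    return 3.0 * conc * math.sqrt(bkg_counts) / net_counts
```

For example, a 10 µg g⁻¹ calibration sample yielding 900 net counts over 100 background counts gives a detection limit of 10/30 ≈ 0.33 µg g⁻¹.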

  1. Coded mask telescopes for X-ray astronomy

    NASA Astrophysics Data System (ADS)

    Skinner, G. K.; Ponman, T. J.

    1987-04-01

The principles of the coded mask technique are discussed together with the methods of image reconstruction. The coded mask telescopes built at the University of Birmingham, including the SL 1501 coded mask X-ray telescope flown on the Skylark rocket and the Coded Mask Imaging Spectrometer (COMIS) projected for the Soviet space station Mir, are described. A diagram of a coded mask telescope and some designs for coded masks are included.
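The image reconstruction step for coded masks can be sketched in 1-D: the detector records the sky convolved with the mask pattern, and cross-correlating with a decoding array recovers the sky. The length-7 quadratic-residue pattern below is a textbook example with flat correlation sidelobes, not one of the Birmingham designs.

```python
import numpy as np

# Length-7 quadratic-residue mask (residues mod 7 are {1, 2, 4});
# correlating against g = 2m - 1 gives a sharp peak with flat sidelobes.
m = np.array([0, 1, 1, 0, 1, 0, 0])
g = 2 * m - 1
N = len(m)

def observe(sky):
    """Detector counts: cyclic convolution of the sky with the mask."""
    return np.array([sum(sky[i] * m[(j - i) % N] for i in range(N))
                     for j in range(N)])

def decode(d):
    """Image reconstruction: cyclic correlation with the decoding array."""
    return np.array([sum(d[j] * g[(j - i) % N] for j in range(N))
                     for i in range(N)])
```

A point source of strength 5 at position 3 reconstructs to a peak of 15 at position 3 with uniform sidelobes of -5, which is the ideal delta-function response of a uniformly redundant array.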

  2. Numerical simulation of the radiation environment on Martian surface

    NASA Astrophysics Data System (ADS)

    Zhao, L.

    2015-12-01

The radiation environment on the Martian surface is significantly different from that on Earth. Existing observations and studies reveal that the radiation environment on the Martian surface is highly variable on both short- and long-term time scales. For example, its dose rate presents diurnal and seasonal variations associated with atmospheric pressure changes. Moreover, the dose rate is also strongly influenced by the solar modulation of the GCR flux. Numerical simulation and theoretical explanations are required to understand the mechanisms behind these features, and to predict the time variation of the radiation environment on the Martian surface if spacecraft are to land on it in the near future. The high-energy galactic cosmic rays (GCRs), which are ubiquitous throughout the solar system, are highly penetrating and extremely difficult to shield against beyond the Earth's protective atmosphere and magnetosphere. The goal of this article is to evaluate the long-term radiation risk on the Martian surface. Therefore, we need to develop a realistic time-dependent GCR model, which will subsequently be integrated with the Geant4 transport code to reproduce the observed variation of surface dose rate associated with the changing heliospheric conditions. In general, the propagation of cosmic rays in the interplanetary medium can be described by a Fokker-Planck equation (or Parker equation). In the last decade, we witnessed a fast development of GCR transport models within the heliosphere based on accurate gas-dynamic and MHD backgrounds from global models of the heliosphere. The global MHD simulation produces a more realistic pattern of the 3-D heliospheric structure, as well as the interface between the solar system and the surrounding interstellar space. As a consequence, by integrating the plasma background obtained from global 3-D MHD simulation with stochastic Parker transport simulation, we expect to produce an accurate, global, physics-based GCR modulation model. 
Combined with the Geant4 transport code, this GCR model will provide valuable insight into the long-term dose rates variation on the Martian surface.

  3. Investigation of the hard x-ray background in backlit pinhole imagers.

    PubMed

    Fein, J R; Peebles, J L; Keiter, P A; Holloway, J P; Klein, S R; Kuranz, C C; Manuel, M J-E; Drake, R P

    2014-11-01

    Hard x-rays from laser-produced hot electrons (>10 keV) in backlit pinhole imagers can give rise to a background signal that decreases signal dynamic range in radiographs. Consequently, significant uncertainties are introduced to the measured optical depth of imaged plasmas. Past experiments have demonstrated that hard x-rays are produced when hot electrons interact with the high-Z pinhole substrate used to collimate the softer He-α x-ray source. Results are presented from recent experiments performed on the OMEGA-60 laser to further study the production of hard x-rays in the pinhole substrate and how these x-rays contribute to the background signal in radiographs. Radiographic image plates measured hard x-rays from pinhole imagers with Mo, Sn, and Ta pinhole substrates. The variation in background signal between pinhole substrates provides evidence that much of this background comes from x-rays produced in the pinhole substrate itself. A Monte Carlo electron transport code was used to model x-ray production from hot electrons interacting in the pinhole substrate, as well as to model measurements of x-rays from the irradiated side of the targets, recorded by a bremsstrahlung x-ray spectrometer. Inconsistencies in inferred hot electron distributions between the different pinhole substrate materials demonstrate that additional sources of hot electrons beyond those modeled may produce hard x-rays in the pinhole substrate.

  4. Investigation of the hard x-ray background in backlit pinhole imagers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fein, J. R., E-mail: jrfein@umich.edu; Holloway, J. P.; Peebles, J. L.

Hard x-rays from laser-produced hot electrons (>10 keV) in backlit pinhole imagers can give rise to a background signal that decreases signal dynamic range in radiographs. Consequently, significant uncertainties are introduced to the measured optical depth of imaged plasmas. Past experiments have demonstrated that hard x-rays are produced when hot electrons interact with the high-Z pinhole substrate used to collimate the softer He-α x-ray source. Results are presented from recent experiments performed on the OMEGA-60 laser to further study the production of hard x-rays in the pinhole substrate and how these x-rays contribute to the background signal in radiographs. Radiographic image plates measured hard x-rays from pinhole imagers with Mo, Sn, and Ta pinhole substrates. The variation in background signal between pinhole substrates provides evidence that much of this background comes from x-rays produced in the pinhole substrate itself. A Monte Carlo electron transport code was used to model x-ray production from hot electrons interacting in the pinhole substrate, as well as to model measurements of x-rays from the irradiated side of the targets, recorded by a bremsstrahlung x-ray spectrometer. Inconsistencies in inferred hot electron distributions between the different pinhole substrate materials demonstrate that additional sources of hot electrons beyond those modeled may produce hard x-rays in the pinhole substrate.

  5. Radiological characterization of the pressure vessel internals of the BNL High Flux Beam Reactor.

    PubMed

    Holden, Norman E; Reciniello, Richard N; Hu, Jih-Perng

    2004-08-01

In preparation for the eventual decommissioning of the High Flux Beam Reactor after the permanent removal of its fuel elements from the Brookhaven National Laboratory, measurements and calculations of the decay gamma-ray dose-rate were performed in the reactor pressure vessel and on vessel internal structures such as the upper and lower thermal shields, the Transition Plate, and the Control Rod blades. Measurements of gamma-ray dose rates were made using Red Perspex polymethyl methacrylate high-dose film, a Radcal "peanut" ion chamber, and Eberline's RO-7 high-range ion chamber. As a comparison, the Monte Carlo MCNP code and MicroShield code were used to model the gamma-ray transport and dose buildup. The gamma-ray dose rate at 8 cm above the center of the Transition Plate was measured to be 160 Gy h⁻¹ (using an RO-7) and 88 Gy h⁻¹ at 8 cm above and about 5 cm lateral to the Transition Plate (using Red Perspex film). This compares with a calculated dose rate of 172 Gy h⁻¹ using MicroShield. The gamma-ray dose rate was 16.2 Gy h⁻¹ measured at 76 cm from the reactor core (using the "peanut" ion chamber) and 16.3 Gy h⁻¹ at 87 cm from the core (using Red Perspex film). The similarity of dose rates measured with different instruments indicates that using different methods and instruments is acceptable if the measurement (and calculation) parameters are well defined. Different measurement techniques may be necessary due to constraints such as size restrictions.

  6. Influence of simulation parameters on the speed and accuracy of Monte Carlo calculations using PENEPMA

    NASA Astrophysics Data System (ADS)

    Llovet, X.; Salvat, F.

    2018-01-01

    The accuracy of Monte Carlo simulations of EPMA measurements is primarily determined by that of the adopted interaction models and atomic relaxation data. The code PENEPMA implements the most reliable general models available, and it is known to provide a realistic description of electron transport and X-ray emission. Nonetheless, efficiency (i.e., the simulation speed) of the code is determined by a number of simulation parameters that define the details of the electron tracking algorithm, which may also have an effect on the accuracy of the results. In addition, to reduce the computer time needed to obtain X-ray spectra with a given statistical accuracy, PENEPMA allows the use of several variance-reduction techniques, defined by a set of specific parameters. In this communication we analyse and discuss the effect of using different values of the simulation and variance-reduction parameters on the speed and accuracy of EPMA simulations. We also discuss the effectiveness of using multi-core computers along with a simple practical strategy implemented in PENEPMA.

  7. Dose estimation for astronauts using dose conversion coefficients calculated with the PHITS code and the ICRP/ICRU adult reference computational phantoms.

    PubMed

    Sato, Tatsuhiko; Endo, Akira; Sihver, Lembit; Niita, Koji

    2011-03-01

Absorbed-dose and dose-equivalent rates for astronauts were estimated by multiplying fluence-to-dose conversion coefficients, in units of Gy cm² and Sv cm² respectively, by cosmic-ray fluxes around spacecraft in units of cm⁻² s⁻¹. The dose conversion coefficients employed in the calculation were evaluated using the general-purpose particle and heavy ion transport code system PHITS coupled to the male and female adult reference computational phantoms, which were released as a common ICRP/ICRU publication. The cosmic-ray fluxes inside and near spacecraft were also calculated by PHITS, using simplified geometries. The accuracy of the obtained absorbed-dose and dose-equivalent rates was verified by various experimental data measured both inside and outside spacecraft. The calculations quantitatively show that the effective doses for astronauts are significantly greater than their corresponding effective dose equivalents, because of the numerical incompatibility between the radiation quality factors and the radiation weighting factors. These results demonstrate the usefulness of dose conversion coefficients in space dosimetry. © Springer-Verlag 2010
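The dose-rate estimate described above is, at its core, a fold of the flux spectrum with the conversion coefficients, bin by bin; a minimal sketch (bin values below are arbitrary placeholders, not PHITS outputs):

```python
def dose_rate(flux, coeff):
    """Fold a binned particle flux spectrum (cm^-2 s^-1 per bin) with
    fluence-to-dose conversion coefficients (e.g. Sv cm^2 per bin) to
    obtain a dose-equivalent rate (Sv s^-1)."""
    assert len(flux) == len(coeff)
    return sum(f * c for f, c in zip(flux, coeff))
```

In practice each particle species (protons, heavy ions, neutrons, ...) contributes its own spectrum-coefficient pair and the species' dose rates are summed.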

  8. The Alba ray tracing code: ART

    NASA Astrophysics Data System (ADS)

    Nicolas, Josep; Barla, Alessandro; Juanhuix, Jordi

    2013-09-01

    The Alba ray tracing code (ART) is a suite of Matlab functions and tools for the ray tracing simulation of x-ray beamlines. The code is structured in different layers, which allow its usage as part of optimization routines as well as an easy control from a graphical user interface. Additional tools for slope error handling and for grating efficiency calculations are also included. Generic characteristics of ART include the accumulation of rays to improve statistics without memory limitations, and still providing normalized values of flux and resolution in physically meaningful units.

  9. Evaluation of the cosmic-ray induced background in coded aperture high energy gamma-ray telescopes

    NASA Technical Reports Server (NTRS)

    Owens, Alan; Barbier, Loius M.; Frye, Glenn M.; Jenkins, Thomas L.

    1991-01-01

While the application of coded-aperture techniques to high-energy gamma-ray astronomy offers potential arc-second angular resolution, concerns were raised about the level of secondary radiation produced in a thick high-Z mask. A series of Monte Carlo calculations was conducted to evaluate and quantify the cosmic-ray induced neutral-particle background produced in a coded-aperture mask. It is shown that this component may be neglected, being at least a factor of 50 lower in intensity than the cosmic diffuse gamma rays.

  10. Combined Modeling of Acceleration, Transport, and Hydrodynamic Response in Solar Flares. 1; The Numerical Model

    NASA Technical Reports Server (NTRS)

    Liu, Wei; Petrosian, Vahe; Mariska, John T.

    2009-01-01

Acceleration and transport of high-energy particles and fluid dynamics of atmospheric plasma are interrelated aspects of solar flares, but for convenience and simplicity they were artificially separated in the past. We present here self-consistently combined Fokker-Planck modeling of particles and hydrodynamic simulation of flare plasma. Energetic electrons are modeled with the Stanford unified code of acceleration, transport, and radiation, while plasma is modeled with the Naval Research Laboratory flux tube code. We calculated the collisional heating rate directly from the particle transport code, which is more accurate than those in previous studies based on approximate analytical solutions. We repeated the simulation of Mariska et al. with an injection of power-law, downward-beamed electrons using the new heating rate. For this case, a difference of about 10% from their earlier result was found. We also used a more realistic spectrum of injected electrons provided by the stochastic acceleration model, which has a smooth transition from a quasi-thermal background at low energies to a nonthermal tail at high energies. The inclusion of low-energy electrons results in relatively more heating in the corona (versus chromosphere) and thus a larger downward heat conduction flux. The interplay of electron heating, conduction, and radiative loss leads to stronger chromospheric evaporation than obtained in previous studies, which had a deficit in low-energy electrons due to an arbitrarily assumed low-energy cutoff. The energy and spatial distributions of energetic electrons and bremsstrahlung photons bear signatures of the changing density distribution caused by chromospheric evaporation. In particular, the density jump at the evaporation front gives rise to enhanced emission, which, in principle, can be imaged by X-ray telescopes. This model can be applied to investigate a variety of high-energy processes in solar, space, and astrophysical plasmas.

  11. A general time-dependent stochastic method for solving Parker's transport equation in spherical coordinates

    NASA Astrophysics Data System (ADS)

    Pei, C.; Bieber, J. W.; Burger, R. A.; Clem, J.

    2010-12-01

    We present a detailed description of our newly developed stochastic approach for solving Parker's transport equation, which we believe is the first attempt to solve it with time dependence in 3-D, evolving from our 3-D steady state stochastic approach. Our formulation of this method is general and is valid for any type of heliospheric magnetic field, although we choose the standard Parker field as an example to illustrate the steps to calculate the transport of galactic cosmic rays. Our 3-D stochastic method is different from other stochastic approaches in the literature in several ways. For example, we employ spherical coordinates to integrate directly, which makes the code much more efficient by reducing coordinate transformations. What is more, the equivalence between our stochastic differential equations and Parker's transport equation is guaranteed by Ito's theorem in contrast to some other approaches. We generalize the technique for calculating particle flux based on the pseudoparticle trajectories for steady state solutions and for time-dependent solutions in 3-D. To validate our code, first we show that good agreement exists between solutions obtained by our steady state stochastic method and a traditional finite difference method. Then we show that good agreement also exists for our time-dependent method for an idealized and simplified heliosphere which has a Parker magnetic field and a simple initial condition for two different inner boundary conditions.
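The pseudoparticle idea behind such stochastic methods can be illustrated in 1-D: integrate the stochastic differential equation dr = sqrt(2κ) dW with Euler-Maruyama steps and check that the ensemble spreads diffusively, ⟨r²⟩ = 2κt. This is a toy analogue for intuition only, not the authors' 3-D spherical-coordinate scheme with drift and adiabatic-loss terms.

```python
import math
import random

def pseudoparticles(kappa, t_end, dt, n, rng):
    """Euler-Maruyama integration of the pure-diffusion SDE
    dr = sqrt(2 * kappa) dW for n pseudoparticle trajectories,
    returning their final positions (all start at r = 0)."""
    steps = int(t_end / dt)
    sigma = math.sqrt(2.0 * kappa * dt)   # per-step standard deviation
    positions = []
    for _ in range(n):
        r = 0.0
        for _ in range(steps):
            r += sigma * rng.gauss(0.0, 1.0)
        positions.append(r)
    return positions
```

For the full Parker equation the same loop gains deterministic drift terms (solar wind convection, drifts) and a momentum-loss equation, but the diffusive kernel is exactly this random walk.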

  12. Description of Transport Codes for Space Radiation Shielding

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Wilson, John W.; Cucinotta, Francis A.

    2011-01-01

This slide presentation describes transport codes and their use for studying and designing space radiation shielding. When combined with risk projection models, radiation transport codes serve as the main tool for studying radiation and designing shielding. There are three criteria for assessing the accuracy of transport codes: (1) ground-based studies with defined beams and material layouts, (2) inter-comparison of transport code results for matched boundary conditions, and (3) comparisons to flight measurements. By these three criteria, NASA's HZETRN/QMSFRG shows a very high degree of accuracy.

  13. Comparison of Radiation Transport Codes, HZETRN, HETC and FLUKA, Using the 1956 Webber SPE Spectrum

    NASA Technical Reports Server (NTRS)

    Heinbockel, John H.; Slaba, Tony C.; Blattnig, Steve R.; Tripathi, Ram K.; Townsend, Lawrence W.; Handler, Thomas; Gabriel, Tony A.; Pinsky, Lawrence S.; Reddell, Brandon; Clowdsley, Martha S.; hide

    2009-01-01

Protection of astronauts and instrumentation from galactic cosmic rays (GCR) and solar particle events (SPE) in the harsh environment of space is of prime importance in the design of personal shielding, spacecraft, and mission planning. Early entry of radiation constraints into the design process enables optimal shielding strategies, but demands efficient and accurate tools that can be used by design engineers in every phase of an evolving space project. The radiation transport code HZETRN is an efficient tool for analyzing the shielding effectiveness of materials exposed to space radiation. In this paper, HZETRN is compared to the Monte Carlo codes HETC-HEDS and FLUKA for a shield/target configuration comprised of a 20 g/cm² aluminum slab in front of a 30 g/cm² slab of water exposed to the February 1956 SPE, as modeled by the Webber spectrum. Neutron and proton fluence spectra, as well as dose and dose equivalent values, are compared at various depths in the water target. This study shows that there are many regions where HZETRN agrees with both HETC-HEDS and FLUKA for this shield/target configuration and the SPE environment. However, there are also regions with appreciable differences between the three computer codes.

  14. Los Alamos radiation transport code system on desktop computing platforms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Briesmeister, J.F.; Brinkley, F.W.; Clark, B.A.

The Los Alamos Radiation Transport Code System (LARTCS) consists of state-of-the-art Monte Carlo and discrete ordinates transport codes and data libraries. These codes were originally developed many years ago and have undergone continual improvement. With a large initial effort and continued vigilance, the codes are easily portable from one type of hardware to another. The performance of scientific workstations (SWS) has evolved to the point that such platforms can be used routinely to perform sophisticated radiation transport calculations. As personal computer (PC) performance approaches that of the SWS, the hardware options for desktop radiation transport calculations expand considerably. The current status of the radiation transport codes within the LARTCS is described: MCNP, SABRINA, LAHET, ONEDANT, TWODANT, TWOHEX, and ONELD. Specifically, the authors discuss hardware systems on which the codes run and present code performance comparisons for various machines.

  15. Evaluating Galactic Cosmic Ray Environment Models Using RaD-X Flight Data

    NASA Technical Reports Server (NTRS)

    Norman, R. B.; Mertens, C. J.; Slaba, T. C.

    2016-01-01

Galactic cosmic rays enter Earth's atmosphere after interacting with the geomagnetic field. The primary galactic cosmic ray spectrum is fundamentally changed as it interacts with Earth's atmosphere through nuclear and atomic interactions. At points deeper in the atmosphere, such as at airline altitudes, the radiation environment is a combination of the primary galactic cosmic rays and the secondary particles produced through nuclear interactions. The RaD-X balloon experiment measured the atmospheric radiation environment above 20 km during 2 days in September 2015. These experimental measurements were used to validate and quantify uncertainty in physics-based models used to calculate exposure levels for commercial aviation. In this paper, the Badhwar-O'Neill 2014, the International Organization for Standardization 15390, and the German Aerospace Center (DLR) galactic cosmic ray environment models are used as input into the same radiation transport code to predict and compare dosimetric quantities to RaD-X measurements. In general, the various model results match the measured tissue equivalent dose well, with results generated by the German Aerospace Center galactic cosmic ray environment model providing the best comparison. For dose equivalent and dose measured in silicon, however, the models compared less favorably with the measurements.

  16. GALACTIC WINDS DRIVEN BY ISOTROPIC AND ANISOTROPIC COSMIC-RAY DIFFUSION IN DISK GALAXIES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pakmor, R.; Pfrommer, C.; Simpson, C. M.

    2016-06-20

The physics of cosmic rays (CRs) is a promising candidate for explaining the driving of galactic winds and outflows. Recent galaxy formation simulations have demonstrated the need for active CR transport either in the form of diffusion or streaming to successfully launch winds in galaxies. However, due to computational limitations, most previous simulations have modeled CR transport isotropically. Here, we discuss high-resolution simulations of isolated disk galaxies in a 10¹¹ M☉ halo with the moving-mesh code Arepo that include injection of CRs from supernovae, advective transport, CR cooling, and CR transport through isotropic or anisotropic diffusion. We show that either mode of diffusion leads to the formation of strong bipolar outflows. However, they develop significantly later in the simulation with anisotropic diffusion compared to the simulation with isotropic diffusion. Moreover, we find that isotropic diffusion allows most of the CRs to quickly diffuse out of the disk, while in the simulation with anisotropic diffusion, most CRs remain in the disk once the magnetic field becomes dominated by its azimuthal component, which occurs after ∼300 Myr. This has important consequences for the gas dynamics in the disk. In particular, we show that isotropic diffusion strongly suppresses the amplification of the magnetic field in the disk compared to anisotropic or no diffusion models. We therefore conclude that reliable simulations which include CR transport inevitably need to account for anisotropic diffusion.
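The distinction between isotropic and anisotropic CR diffusion comes down to the diffusion tensor. A standard field-aligned construction (a generic sketch, not Arepo's implementation) is D = D∥ b bᵀ + D⊥ (I − b bᵀ) for a unit field direction b, with isotropic diffusion as the special case D∥ = D⊥:

```python
import numpy as np

def cr_diffusion_tensor(b, d_par, d_perp):
    """Field-aligned CR diffusion tensor:
    D = d_par * b b^T + d_perp * (I - b b^T), with b normalized.
    d_par/d_perp are the diffusion coefficients along/across the field."""
    b = np.asarray(b, dtype=float)
    b = b / np.linalg.norm(b)
    proj = np.outer(b, b)          # projector onto the field direction
    return d_par * proj + d_perp * (np.eye(3) - proj)
```

Once the disk field becomes azimuthal, an anisotropic tensor of this form confines CRs to diffuse along the field within the disk plane, which is the mechanism the abstract invokes for CR retention after ∼300 Myr.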

  17. Evaluation of an alternative shielding materials for F-127 transport package

    NASA Astrophysics Data System (ADS)

    Gual, Maritza R.; Mesquita, Amir Z.; Pereira, Cláubia

    2018-03-01

Lead is used as the radiation shielding material in Nordion's F-127 source shipping container, which is used for transport and storage of the GammaBeam-127's cobalt-60 source at the Nuclear Technology Development Center (CDTN) in Belo Horizonte, Brazil. As alternatives, Th, Tl, and WC have been evaluated as radiation shielding materials. The goal is to check their behavior with regard to shielding and dose. The Monte Carlo MCNPX code is used for the simulations. In the MCNPX calculation, a cylinder was used as the exclusion surface instead of a sphere. Validation of the MCNPX gamma dose calculations was carried out through comparison with experimental measurements. The results show that tungsten carbide (WC) is a better shielding material for γ-rays than lead.

  18. Coded diffraction system in X-ray crystallography using a boolean phase coded aperture approximation

    NASA Astrophysics Data System (ADS)

    Pinilla, Samuel; Poveda, Juan; Arguello, Henry

    2018-03-01

Phase retrieval is a problem present in many applications such as optics, astronomical imaging, computational biology and X-ray crystallography. Recent work has shown that the phase can be better recovered when the acquisition architecture includes a coded aperture, which modulates the signal before diffraction, such that the underlying signal is recovered from coded diffraction patterns. Moreover, this type of modulation, applied before the diffraction operation, can be obtained with a phase coded aperture placed just after the sample under study. However, a practical implementation of a phase coded aperture in an X-ray application is not feasible, because it is computationally modeled as a matrix with complex entries which requires changing the phase of the diffracted beams. In fact, changing the phase implies finding a material that allows the direction of an X-ray beam to be deviated, which can considerably increase the implementation costs. Hence, this paper describes a low-cost coded X-ray diffraction system based on block-unblock coded apertures that enables phase reconstruction. The proposed system approximates the phase coded aperture with a block-unblock coded aperture by using the detour-phase method. Moreover, the SAXS/WAXS X-ray crystallography software was used to simulate the diffraction patterns of a real crystal structure called Rhombic Dodecahedron. Additionally, several simulations were carried out to analyze the performance of block-unblock approximations in recovering the phase, using the simulated diffraction patterns. Furthermore, the quality of the reconstructions was measured in terms of the Peak Signal to Noise Ratio (PSNR). Results show that the performance of the block-unblock phase coded aperture approximation decreases by at most 12.5% compared with the phase coded apertures. Moreover, the quality of the reconstructions using the boolean approximations is up to 2.5 dB lower in PSNR than that of the phase coded aperture reconstructions.
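The quality metric used above, PSNR, can be computed as follows (a generic textbook definition; the paper's exact peak normalization is not specified here):

```python
import numpy as np

def psnr(ref, rec, peak=1.0):
    """Peak signal-to-noise ratio in dB between a reference image and a
    reconstruction: 10 * log10(peak^2 / MSE)."""
    ref = np.asarray(ref, dtype=float)
    rec = np.asarray(rec, dtype=float)
    mse = np.mean((ref - rec) ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)
```

On this scale, the reported gap of "up to 2.5 dB" corresponds to roughly a 1.8x larger mean squared error for the boolean approximation.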

  19. A velocity-dependent anomalous radial transport model for (2-D, 2-V) kinetic transport codes

    NASA Astrophysics Data System (ADS)

    Bodi, Kowsik; Krasheninnikov, Sergei; Cohen, Ron; Rognlien, Tom

    2008-11-01

    Plasma turbulence constitutes a significant part of radial plasma transport in magnetically confined plasmas. This turbulent transport is modeled in the form of anomalous convection and diffusion coefficients in fluid transport codes. There is a need to model the same in continuum kinetic edge codes [such as the (2-D, 2-V) transport version of TEMPEST, NEO, and the code being developed by the Edge Simulation Laboratory] with non-Maxwellian distributions. We present an anomalous transport model with velocity-dependent convection and diffusion coefficients leading to a diagonal transport matrix similar to that used in contemporary fluid transport models (e.g., UEDGE). Also presented are results of simulations corresponding to radial transport due to long-wavelength ExB turbulence using a velocity-independent diffusion coefficient. A BGK collision model is used to enable comparison with fluid transport codes.
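The model pairs a diffusive and a convective radial flux per velocity bin; a minimal explicit finite-difference sketch of such an update (illustrative only, not the TEMPEST/UEDGE implementation, slab geometry assumed):

```python
import numpy as np

def step_anomalous(f, r, D, V, dt):
    """One explicit step of df/dt = d/dr( D df/dr - V f ) on a uniform
    slab grid, applied independently to each velocity bin.

    f    : (Nv, Nr) array, distribution per velocity bin and radius
    D, V : (Nv,) velocity-dependent diffusion / convection coefficients
    """
    dr = r[1] - r[0]
    fnew = f.copy()
    for i, (Di, Vi) in enumerate(zip(D, V)):
        grad = np.diff(f[i]) / dr                    # df/dr at interior faces
        favg = 0.5 * (f[i][1:] + f[i][:-1])          # f at interior faces
        flux = -Di * grad + Vi * favg                # radial flux Gamma
        flux = np.concatenate(([0.0], flux, [0.0]))  # no flux through the walls
        fnew[i] -= dt * np.diff(flux) / dr
    return fnew
```

With zero-flux boundaries the step conserves particles exactly; the explicit form requires dt ≲ dr²/(2 max D) for stability.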

  20. GRMHD and GRPIC Simulations

    NASA Technical Reports Server (NTRS)

    Nishikawa, K.-I.; Mizuno, Y.; Watson, M.; Fuerst, S.; Wu, K.; Hardee, P.; Fishman, G. J.

    2007-01-01

    We have developed a new three-dimensional general relativistic magnetohydrodynamic (GRMHD) code by using a conservative, high-resolution shock-capturing scheme. The numerical fluxes are calculated using the HLL approximate Riemann solver scheme. The flux-interpolated constrained transport scheme is used to maintain a divergence-free magnetic field. We have performed various 1-dimensional test problems in both special and general relativity by using several reconstruction methods and found that the new 3D GRMHD code shows substantial improvements over our previous code. The simulation results show jet formation from a geometrically thin accretion disk near nonrotating and rotating black holes. We discuss how the jet properties depend on the rotation of the black hole and on the magnetic field configuration, including issues for future research. A General Relativistic Particle-in-Cell Code (GRPIC) has been developed using the Kerr-Schild metric. The code includes kinetic effects and is consistent with the GRMHD code. Since the gravitational force acting on particles is extreme near black holes, there are some difficulties in numerically describing these processes. The preliminary simulation consists of an accretion disk and a free-falling corona. Results indicate that particles are ejected from the black hole. These results are consistent with other GRMHD simulations. The GRPIC simulation results will be presented, along with some remarks and future improvements. The emission from relativistic flows in black hole systems is calculated using a fully general relativistic radiative transfer formulation, with flow structures obtained by GRMHD simulations, considering thermal free-free emission and thermal synchrotron emission. Bright filament-like features protrude (visually) from the accretion disk surface; these are enhancements of synchrotron emission where the magnetic field roughly aligns with the line-of-sight in the co-moving frame. The features move back and forth as the accretion flow evolves, but their visibility and morphology are robust. We would like to extend this research using GRPIC simulations and examine a possible new mechanism for certain X-ray quasi-periodic oscillations (QPOs) observed in black hole X-ray binaries.
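The HLL approximate Riemann solver mentioned above can be illustrated for a 1-D scalar conservation law (here Burgers' equation rather than the full GRMHD system); a minimal sketch:

```python
def hll_flux(uL, uR, f, speed):
    """HLL approximate Riemann flux for a 1-D scalar conservation law,
    with left/right states uL, uR, flux function f, and characteristic
    speed df/du given by `speed`."""
    sL = min(speed(uL), speed(uR))   # fastest left-going wave estimate
    sR = max(speed(uL), speed(uR))   # fastest right-going wave estimate
    if sL >= 0.0:
        return f(uL)                 # all waves move right: upwind from the left
    if sR <= 0.0:
        return f(uR)                 # all waves move left: upwind from the right
    # subsonic case: single intermediate state between sL and sR
    return (sR * f(uL) - sL * f(uR) + sL * sR * (uR - uL)) / (sR - sL)

# Burgers' equation as the scalar model: f(u) = u^2/2, df/du = u
flux = lambda u: 0.5 * u * u
speed = lambda u: u
```

In a GRMHD code the same two-wave construction is applied componentwise to the vector of conserved variables, with sL and sR estimated from the characteristic speeds of the system.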

  1. Hard gamma-ray background from the coding collimator of a gamma-ray telescope under the conditions of a space experiment

    NASA Astrophysics Data System (ADS)

    Aleksandrov, A. P.; Berezovoj, A. N.; Gal'Per, A. M.; Grachev, V. M.; Dmitrenko, V. V.; Kirillov-Ugryumov, V. G.; Lebedev, V. V.; Lyakhov, V. A.; Moiseev, A. A.; Ulin, S. E.; Shchvets, N. I.

    1984-11-01

    Coding collimators are used to improve the angular resolution of gamma-ray telescopes at energies above 50 MeV. However, the interaction of cosmic rays with the collimator material can lead to the appearance of a gamma-ray background flux which can have a deleterious effect on measurement efficiency. An experiment was performed on the Salyut-6-Soyuz spacecraft system with the Elena-F small-scale gamma-ray telescope in order to measure the magnitude of this background. It is shown that, even at a zenith angle of approximately zero degrees (the angle at which the gamma-ray observations are made), the coding collimator has only an insignificant effect on the background conditions.

  2. Mars surface radiation exposure for solar maximum conditions and 1989 solar proton events

    NASA Technical Reports Server (NTRS)

    Simonsen, Lisa C.; Nealy, John E.

    1992-01-01

    The Langley heavy-ion/nucleon transport code, HZETRN, and the high-energy nucleon transport code, BRYNTRN, are used to predict the propagation of galactic cosmic rays (GCR's) and solar flare protons through the carbon dioxide atmosphere of Mars. Particle fluences and the resulting doses are estimated on the surface of Mars for GCR's during solar maximum conditions and the Aug., Sep., and Oct. 1989 solar proton events. These results extend previously calculated surface estimates for GCR's at solar minimum conditions and the Feb. 1956, Nov. 1960, and Aug. 1972 solar proton events. Surface doses are estimated with both a low-density and a high-density carbon dioxide model of the atmosphere for altitudes of 0, 4, 8, and 12 km above the surface. A solar modulation function is incorporated to estimate the GCR dose variation between solar minimum and maximum conditions over the 11-year solar cycle. By using current Mars mission scenarios, doses to the skin, eye, and blood-forming organs are predicted for short- and long-duration stay times on the Martian surface throughout the solar cycle.
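The altitude dependence of surface shielding can be caricatured with a toy isothermal-atmosphere model. All parameter values below are assumed for illustration only (not HZETRN/BRYNTRN inputs), and the simple exponential attenuation ignores the secondary-particle buildup that the transport codes capture:

```python
import math

# Toy parameters, assumed for illustration only (not HZETRN/BRYNTRN inputs)
SURFACE_DEPTH = 16.0   # g/cm^2, vertical CO2 column at 0 km altitude (assumed)
SCALE_HEIGHT = 11.0    # km, atmospheric density scale height (assumed)
ATTEN_LENGTH = 25.0    # g/cm^2, effective dose attenuation length (assumed)

def column_depth(altitude_km):
    """Vertical CO2 column above a given altitude for an isothermal atmosphere."""
    return SURFACE_DEPTH * math.exp(-altitude_km / SCALE_HEIGHT)

def relative_dose(altitude_km):
    """Crude exponential attenuation of a free-space dose by the overlying column."""
    return math.exp(-column_depth(altitude_km) / ATTEN_LENGTH)
```

Note that real GCR doses need not fall monotonically with shielding depth, since nuclear interactions generate secondaries; accounting for that is precisely why coupled transport codes such as HZETRN and BRYNTRN are needed.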

  3. Interfacing MCNPX and McStas for simulation of neutron transport

    NASA Astrophysics Data System (ADS)

    Klinkby, Esben; Lauritzen, Bent; Nonbøl, Erik; Kjær Willendrup, Peter; Filges, Uwe; Wohlmuther, Michael; Gallmeier, Franz X.

    2013-02-01

    Simulations of the target-moderator-reflector system at spallation sources are conventionally carried out using Monte Carlo codes such as MCNPX (Waters et al., 2007 [1]) or FLUKA (Battistoni et al., 2007; Ferrari et al., 2005 [2,3]), whereas simulations of neutron transport from the moderator and of the instrument response are performed by neutron ray tracing codes such as McStas (Lefmann and Nielsen, 1999; Willendrup et al., 2004, 2011a,b [4-7]). The coupling between the two simulation suites typically consists of providing analytical fits of MCNPX neutron spectra to McStas. This method is generally successful but has limitations; for example, it does not allow for re-entry of neutrons into the MCNPX regime. Previous work to resolve such shortcomings includes the introduction of McStas-inspired supermirrors in MCNPX. In the present paper, different approaches to interfacing MCNPX and McStas are presented and applied to a simple test case. The direct coupling between MCNPX and McStas allows for more accurate simulations of, e.g., complex moderator geometries, backgrounds, interference between beam-lines, and shielding requirements along the neutron guides.
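A direct coupling of the kind described requires passing individual neutron states between the two codes; a hypothetical minimal interchange record (not the actual MCNPX or McStas file formats) might look like:

```python
import csv
import io
from dataclasses import dataclass, asdict

@dataclass
class NeutronEvent:
    """Minimal neutron state handed from a Monte Carlo stage to a
    ray tracing stage (hypothetical record layout)."""
    x: float; y: float; z: float       # position [cm]
    ux: float; uy: float; uz: float    # direction cosines
    energy: float                      # kinetic energy [MeV]
    time: float                        # time of flight [s]
    weight: float                      # statistical weight

FIELDS = ["x", "y", "z", "ux", "uy", "uz", "energy", "time", "weight"]

def write_events(events, fh):
    writer = csv.DictWriter(fh, fieldnames=FIELDS)
    writer.writeheader()
    for ev in events:
        writer.writerow(asdict(ev))

def read_events(fh):
    return [NeutronEvent(**{k: float(v) for k, v in row.items()})
            for row in csv.DictReader(fh)]

# round trip through an in-memory buffer: one thermal neutron travelling along +z
buf = io.StringIO()
write_events([NeutronEvent(0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.025e-6, 0.0, 1.0)], buf)
buf.seek(0)
events = read_events(buf)
```

Keeping the statistical weight in the record is what allows neutrons to re-enter the Monte Carlo stage without biasing the estimates.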

  4. SYMTRAN - A Time-dependent Symmetric Tandem Mirror Transport Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hua, D; Fowler, T

    2004-06-15

    A time-dependent version of the steady-state radial transport model in symmetric tandem mirrors in Ref. [1] has been coded up and first tests performed. Our code, named SYMTRAN, is an adaptation of the earlier SPHERE code for spheromaks, now modified for tandem mirror physics. Motivated by Post's new concept of kinetic stabilization of symmetric mirrors, it is an extension of the earlier TAMRAC rate-equation code omitting radial transport [2], which successfully accounted for experimental results in TMX. The SYMTRAN code differs from the earlier tandem mirror radial transport code TMT in that our code is focused on axisymmetric tandem mirrors and classical diffusion, whereas TMT emphasized non-ambipolar transport in TMX and MFTF-B due to yin-yang plugs and non-symmetric transitions between the plugs and axisymmetric center cell. Both codes exhibit interesting but different non-linear behavior.

  5. Diffusive, supersonic x-ray transport in radiatively heated foam cylinders

    NASA Astrophysics Data System (ADS)

    Back, C. A.; Bauer, J. D.; Hammer, J. H.; Lasinski, B. F.; Turner, R. E.; Rambo, P. W.; Landen, O. L.; Suter, L. J.; Rosen, M. D.; Hsing, W. W.

    2000-05-01

    Diffusive supersonic radiation transport, where the ratio of the diffusive radiation front velocity to the material sound speed exceeds 2, has been studied in experiments on low-density (40 mg/cc to 50 mg/cc) foams. Laser-heated Au hohlraums provided a radiation drive that heated SiO2 and Ta2O5 aerogel foams of varying lengths. Face-on emission measurements at 550 eV provided clean signatures of the radiation breakout. The high-quality data provide new, detailed information on the importance of both the fill and wall material opacities and heat capacities in determining the radiation front speed and curvature. The Marshak radiation wave transport is studied in a geometry that allows direct comparisons with analytic models and two-dimensional code simulations. The experiments show important effects that will affect even nondiffusive and transonic radiation transport experiments studied by others in the field. This work is of basic science interest, with applications to inertial confinement fusion and astrophysics.
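The Marshak-wave behaviour studied above can be caricatured by a nonlinear diffusion equation whose diffusivity scales as a power of the temperature, so a steep front propagates into cold material. A toy explicit solver (illustrative only, not the 2-D code used in the paper; the cubic diffusivity is an assumed stand-in for the opacity-driven nonlinearity):

```python
import numpy as np

def marshak_toy(nx=200, L=1.0, t_end=0.02, dt=2e-6):
    """Explicit solver for du/dt = d/dx( u^3 du/dx ), a crude Marshak-wave
    analogue with a heated boundary u(0,t)=1 driving a steep nonlinear
    diffusion front into cold material."""
    dx = L / nx
    u = np.zeros(nx)
    u[0] = 1.0
    for _ in range(int(t_end / dt)):
        um = 0.5 * (u[1:] + u[:-1])           # u at cell faces
        flux = -(um ** 3) * np.diff(u) / dx   # nonlinear diffusive flux
        u[1:-1] -= dt * np.diff(flux) / dx
        u[0] = 1.0                            # fixed drive temperature at the wall
    return u

def front_position(u, L=1.0, level=0.01):
    """x where u first drops below `level` (a crude front locator)."""
    return np.argmax(u < level) * L / len(u)
```

Running to successively later times shows the front advancing roughly as the square root of time, the self-similar behaviour the analytic Marshak models capture.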

  6. Time-dependent transport of energetic particles in magnetic turbulence: computer simulations versus analytical theory

    NASA Astrophysics Data System (ADS)

    Arendt, V.; Shalchi, A.

    2018-06-01

    We explore numerically the transport of energetic particles in a turbulent magnetic field configuration. A test-particle code is employed to compute running diffusion coefficients as well as particle distribution functions in the different directions of space. Our numerical findings are compared with models commonly used in diffusion theory such as Gaussian distribution functions and solutions of the cosmic ray Fokker-Planck equation. Furthermore, we compare the running diffusion coefficients across the mean magnetic field with solutions obtained from the time-dependent version of the unified non-linear transport theory. In most cases we find that particle distribution functions are indeed of Gaussian form as long as a two-component turbulence model is employed. For turbulence setups with reduced dimensionality, however, the Gaussian distribution can no longer be obtained. It is also shown that the unified non-linear transport theory agrees with simulated perpendicular diffusion coefficients as long as the pure two-dimensional model is excluded.
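The running diffusion coefficients computed by such test-particle codes are the time-dependent estimator d_xx(t) = <(Δx)^2>/(2t) evaluated over an ensemble of orbits; a minimal sketch for an unbiased random walk, which plateaus at the Markovian limit:

```python
import numpy as np

def running_dxx(n_particles=5000, n_steps=400, dt=1.0, rng=None):
    """Running diffusion coefficient d_xx(t) = <(dx)^2> / (2 t) for an
    unbiased unit-step random walk, the same estimator a test-particle
    code applies to its simulated orbits."""
    if rng is None:
        rng = np.random.default_rng(42)
    steps = rng.choice([-1.0, 1.0], size=(n_steps, n_particles))
    x = np.cumsum(steps, axis=0)               # particle positions x(t)
    t = dt * np.arange(1, n_steps + 1)
    return t, (x ** 2).mean(axis=1) / (2.0 * t)

t, dxx = running_dxx()
# for a Markovian walk the running coefficient plateaus at step^2 / (2 dt)
```

Subdiffusive or superdiffusive transport shows up in this diagnostic as a d_xx(t) that keeps falling or rising instead of plateauing, which is how deviations from Gaussian statistics are detected.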

  7. Laser-driven x-ray and neutron source development for industrial applications of plasma accelerators

    NASA Astrophysics Data System (ADS)

    Brenner, C. M.; Mirfayzi, S. R.; Rusby, D. R.; Armstrong, C.; Alejo, A.; Wilson, L. A.; Clarke, R.; Ahmed, H.; Butler, N. M. H.; Haddock, D.; Higginson, A.; McClymont, A.; Murphy, C.; Notley, M.; Oliver, P.; Allott, R.; Hernandez-Gomez, C.; Kar, S.; McKenna, P.; Neely, D.

    2016-01-01

    Pulsed beams of energetic x-rays and neutrons from intense laser interactions with solid foils are promising for applications where bright, small-emission-area sources capable of multi-modal delivery are ideal. Possible end users of laser-driven multi-modal sources are those requiring advanced non-destructive inspection techniques in industry sectors of high value commerce such as aerospace, nuclear and advanced manufacturing. We report on experimental work that demonstrates multi-modal operation of high power laser-solid interactions for neutron and x-ray beam generation. Measurements and Monte Carlo radiation transport simulations show that neutron yield is increased by a factor ~2 when a 1 mm copper foil is placed behind a 2 mm lithium foil, compared to using a 2 cm block of lithium only. We explore x-ray generation with a 10 picosecond drive pulse in order to tailor the spectral content for radiography with medium density alloy metals. The impact of using >1 ps pulse duration on laser-accelerated electron beam generation and transport is discussed alongside the optimisation of subsequent bremsstrahlung emission in thin, high atomic number target foils. X-ray spectra are deconvolved from spectrometer measurements and simulation data generated using the GEANT4 Monte Carlo code. We also demonstrate the unique capability of laser-driven x-rays in being able to deliver single pulse high spatial resolution projection imaging of thick metallic objects. Active detector radiographic imaging of industrially relevant sample objects with a 10 ps drive pulse is presented for the first time, demonstrating that features of 200 μm size are resolved when projected at high magnification.

  8. A signature of anisotropic cosmic-ray transport in the gamma-ray sky

    NASA Astrophysics Data System (ADS)

    Cerri, Silvio Sergio; Gaggero, Daniele; Vittino, Andrea; Evoli, Carmelo; Grasso, Dario

    2017-10-01

    A crucial process in Galactic cosmic-ray (CR) transport is the spatial diffusion due to the interaction with the interstellar turbulent magnetic field. Usually, CR diffusion is assumed to be uniform and isotropic all across the Galaxy. However, this picture is clearly inaccurate: several data-driven and theoretical arguments, as well as dedicated numerical simulations, show that diffusion exhibits highly anisotropic properties with respect to the direction of a background (ordered) magnetic field (i.e., parallel or perpendicular to it). In this paper we focus on a recently discovered anomaly in the hadronic CR spectrum inferred by the Fermi-LAT gamma-ray data at different positions in the Galaxy, i.e. the progressive hardening of the proton slope at low Galactocentric radii. We propose the idea that this feature can be interpreted as a signature of anisotropic diffusion in the complex Galactic magnetic field: in particular, the harder slope in the inner Galaxy is due, in our scenario, to the parallel diffusive escape along the poloidal component of the large-scale, regular, magnetic field. We implement this idea in a numerical framework, based on the DRAGON code, and perform detailed numerical tests on the accuracy of our setup. We discuss how the effect proposed depends on the relevant free parameters involved. Based on low-energy extrapolation of the few focused numerical simulations aimed at determining the scalings of the anisotropic diffusion coefficients, we finally present a set of plausible models that reproduce the behavior of the CR proton slopes inferred by gamma-ray data.
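Anisotropic diffusion with respect to an ordered field direction b is commonly encoded in the tensor D_ij = D_perp δ_ij + (D_par − D_perp) b_i b_j; a minimal sketch of its construction (illustrative, not the DRAGON implementation):

```python
import numpy as np

def diffusion_tensor(b, d_par, d_perp):
    """Anisotropic diffusion tensor
    D_ij = d_perp * delta_ij + (d_par - d_perp) * b_i b_j
    for a unit vector b along the ordered magnetic field."""
    b = np.asarray(b, dtype=float)
    b = b / np.linalg.norm(b)    # ensure b is a unit vector
    return d_perp * np.eye(3) + (d_par - d_perp) * np.outer(b, b)

# ordered field along z: fast parallel transport along z, slow in the x-y plane
D = diffusion_tensor([0.0, 0.0, 1.0], d_par=100.0, d_perp=1.0)
```

By construction the tensor has eigenvalue d_par along b and the doubly degenerate eigenvalue d_perp in the plane perpendicular to it, so rotating the field direction rotates the fast-transport axis with it.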

  9. A signature of anisotropic cosmic-ray transport in the gamma-ray sky

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cerri, Silvio Sergio; Grasso, Dario; Gaggero, Daniele

    A crucial process in Galactic cosmic-ray (CR) transport is the spatial diffusion due to the interaction with the interstellar turbulent magnetic field. Usually, CR diffusion is assumed to be uniform and isotropic all across the Galaxy. However, this picture is clearly inaccurate: several data-driven and theoretical arguments, as well as dedicated numerical simulations, show that diffusion exhibits highly anisotropic properties with respect to the direction of a background (ordered) magnetic field (i.e., parallel or perpendicular to it). In this paper we focus on a recently discovered anomaly in the hadronic CR spectrum inferred by the Fermi-LAT gamma-ray data at different positions in the Galaxy, i.e. the progressive hardening of the proton slope at low Galactocentric radii. We propose the idea that this feature can be interpreted as a signature of anisotropic diffusion in the complex Galactic magnetic field: in particular, the harder slope in the inner Galaxy is due, in our scenario, to the parallel diffusive escape along the poloidal component of the large-scale, regular, magnetic field. We implement this idea in a numerical framework, based on the DRAGON code, and perform detailed numerical tests on the accuracy of our setup. We discuss how the effect proposed depends on the relevant free parameters involved. Based on low-energy extrapolation of the few focused numerical simulations aimed at determining the scalings of the anisotropic diffusion coefficients, we finally present a set of plausible models that reproduce the behavior of the CR proton slopes inferred by gamma-ray data.

  10. Electromagnetic Chirps from Neutron Star–Black Hole Mergers

    NASA Astrophysics Data System (ADS)

    Schnittman, Jeremy D.; Dal Canton, Tito; Camp, Jordan; Tsang, David; Kelly, Bernard J.

    2018-02-01

    We calculate the electromagnetic signal of a gamma-ray flare coming from the surface of a neutron star shortly before merger with a black hole companion. Using a new version of the Monte Carlo radiation transport code Pandurata that incorporates dynamic spacetimes, we integrate photon geodesics from the neutron star surface until they reach a distant observer or are captured by the black hole. The gamma-ray light curve is modulated by a number of relativistic effects, including Doppler beaming and gravitational lensing. Because the photons originate from the inspiraling neutron star, the light curve closely resembles the corresponding gravitational waveform: a chirp signal characterized by a steadily increasing frequency and amplitude. We propose to search for these electromagnetic chirps using matched filtering algorithms similar to those used in LIGO data analysis.
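The proposed search strategy, matched filtering of a chirp template against noisy data, can be sketched with a simple time-domain cross-correlation (a toy under white-noise assumptions, not the LIGO pipeline):

```python
import numpy as np

def matched_filter(data, template):
    """Cross-correlate data with a unit-norm, mean-subtracted template:
    the basic time-domain matched filter for white noise."""
    tpl = template - template.mean()
    tpl = tpl / np.linalg.norm(tpl)
    return np.correlate(data, tpl, mode="valid")

# toy chirp: frequency and amplitude both sweep upward toward "merger"
t = np.linspace(0.0, 1.0, 2000)
chirp = (0.5 + t) * np.sin(2.0 * np.pi * (10.0 + 40.0 * t) * t)

rng = np.random.default_rng(1)
data = rng.standard_normal(6000)
data[3000:5000] += chirp               # bury the chirp in unit-variance noise

snr = matched_filter(data, chirp)
peak = int(np.argmax(np.abs(snr)))     # correlation peaks at the injection time
```

Because the template's frequency evolution is known in advance (here from the gravitational waveform), the filter accumulates signal coherently over the whole chirp, which is what makes a weak electromagnetic counterpart detectable.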

  11. Electromagnetic Chirps from Neutron Star-Black Hole Mergers

    NASA Technical Reports Server (NTRS)

    Schnittman, Jeremy D.; Dal Canton, Tito; Camp, Jordan B.; Tsang, David; Kelly, Bernard J.

    2018-01-01

    We calculate the electromagnetic signal of a gamma-ray flare coming from the surface of a neutron star shortly before merger with a black hole companion. Using a new version of the Monte Carlo radiation transport code Pandurata that incorporates dynamic spacetimes, we integrate photon geodesics from the neutron star surface until they reach a distant observer or are captured by the black hole. The gamma-ray light curve is modulated by a number of relativistic effects, including Doppler beaming and gravitational lensing. Because the photons originate from the inspiraling neutron star, the light curve closely resembles the corresponding gravitational waveform: a chirp signal characterized by a steadily increasing frequency and amplitude. We propose to search for these electromagnetic chirps using matched filtering algorithms similar to those used in LIGO data analysis.

  12. Application of quasi-distributions for solving inverse problems of neutron and {gamma}-ray transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pogosbekyan, L.R.; Lysov, D.A.

    The inverse problems considered deal with the calculation of unknown parameters of nuclear installations by means of known (goal) functionals of the neutron/{gamma}-ray distributions. Examples of such problems are the calculation of the automatic control rod position as a function of neutron sensor readings, or the calculation of experimentally corrected values of cross sections, isotope concentrations, and fuel enrichment via the measured functionals. The authors have developed a new method to solve the inverse problem. It finds the flux density as a quasi-solution of the particle-conservation linear system adjoined to the equalities for the functionals. The method is more effective than the one based on classical perturbation theory. It is suitable for vectorization, and it can be used successfully in optimization codes.

  13. High energy radiation from jets and accretion disks near rotating black holes

    NASA Astrophysics Data System (ADS)

    O'Riordan, Michael; Pe'er, Asaf; McKinney, Jonathan C.

    2017-01-01

    We model the low/hard state in X-ray binaries as a magnetically arrested accretion flow and calculate the resulting radiation using a general-relativistic radiative transport code. Firstly, we investigate the origin of the high-energy emission. We find the following indications of a significant jet contribution at high energies: (i) a pronounced γ-ray peak at ~10^23 Hz, (ii) a break in the optical/UV band where the spectrum changes from disk to jet dominated, and (iii) a low-frequency synchrotron peak ≲10^14 Hz, which implies that a significant fraction of any observed X-ray and γ-ray emission originates in the jet. Secondly, we investigate the effects of black hole spin on the high-energy emission. We find that the X-ray and γ-ray power depend strongly on spin and inclination angle. Surprisingly, this dependence is not a result of the Blandford-Znajek mechanism, but can instead be understood as a redshift effect. For rapidly rotating black holes, observers with large inclinations see deeper into the hot, dense, highly magnetized inner regions of the accretion flow. Since the lower-frequency emission originates at larger radii, it is not significantly affected by the spin. Therefore, the ratio of the X-ray to near-infrared power is an observational probe of black hole spin.

  14. SAM-CE; A Three Dimensional Monte Carlo Code for the Solution of the Forward Neutron and Forward and Adjoint Gamma Ray Transport Equations. Revision C

    DTIC Science & Technology

    1974-07-31

    Multiple scoring regions are permitted, and these may be either finite-volume regions or point detectors or both. Other scores of interest, e.g., collision ..., heating, count rates, etc., are calculated as functions of energy, time, and position.

  15. Neutron production by cosmic-ray muons in various materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Manukovsky, K. V.; Ryazhskaya, O. G.; Sobolevsky, N. M.

    The results obtained by studying the background of neutrons produced by cosmic-ray muons in underground experimental facilities intended for rare-event searches and in the surrounding rock are presented. The rock types considered include granite, sedimentary rock, gypsum, and rock salt. Neutron production and transfer were simulated using the Geant4 and SHIELD transport codes. These codes were tuned via a comparison of the results of calculations with experimental data—in particular, with data of the Artemovsk research station of the Institute for Nuclear Research (INR, Moscow, Russia)—as well as via an intercomparison of results of calculations with the Geant4 and SHIELD codes. It turns out that the atomic-number dependence of the production and yield of neutrons has an irregular character and does not admit a description in terms of a universal function of the atomic number. The parameters of this dependence are different for two groups of nuclei—nuclei consisting of alpha particles and all of the remaining nuclei. Moreover, there are manifest exceptions from a power-law dependence—for example, argon. This may entail important consequences both for the existing underground experimental facilities and for those under construction. Investigation of cosmic-ray-induced neutron production in various materials is of paramount importance for the interpretation of experiments conducted at large depths under the Earth's surface.

  16. MCNP6.1 simulations for low-energy atomic relaxation: Code-to-code comparison with GATEv7.2, PENELOPE2014, and EGSnrc

    NASA Astrophysics Data System (ADS)

    Jung, Seongmoon; Sung, Wonmo; Lee, Jaegi; Ye, Sung-Joon

    2018-01-01

    Emerging radiological applications of gold nanoparticles demand low-energy electron/photon transport calculations that include the details of the atomic relaxation process. Recently, MCNP® version 6.1 (MCNP6.1) was released with extended cross-sections for low-energy electrons/photons, subshell photoelectric cross-sections, and more detailed atomic relaxation data than the previous versions. The atomic relaxation process of MCNP6.1 has, however, not yet been fully tested with its new physics library (eprdata12), which is based on the Evaluated Atomic Data Library (EADL). In this study, MCNP6.1 was compared with GATEv7.2, PENELOPE2014, and EGSnrc, which have often been used to simulate low-energy atomic relaxation processes. The simulations were performed to acquire both photon and electron spectra produced by interactions of 15 keV electrons or photons with a 10-nm-thick gold nano-slab. The photon-induced fluorescence X-rays from MCNP6.1 agreed fairly well with those from GATEv7.2 and PENELOPE2014, while the electron-induced fluorescence X-rays of the four codes showed some discrepancies. The photon-induced Auger electron spectra simulated by MCNP6.1 and GATEv7.2 coincided. The recent release of MCNP6.1 with eprdata12 can thus be used to simulate photon-induced atomic relaxation.

  17. Modeling anomalous radial transport in kinetic transport codes

    NASA Astrophysics Data System (ADS)

    Bodi, K.; Krasheninnikov, S. I.; Cohen, R. H.; Rognlien, T. D.

    2009-11-01

    Anomalous transport is typically the dominant component of the radial transport in magnetically confined plasmas, where the physical origin of this transport is believed to be plasma turbulence. A model is presented for anomalous transport that can be used in continuum kinetic edge codes like TEMPEST, NEO and the next-generation code being developed by the Edge Simulation Laboratory. The model can also be adapted to particle-based codes. It is demonstrated that the model, with velocity-dependent diffusion and convection terms, can match a diagonal gradient-driven transport matrix as found in contemporary fluid codes, but can also include off-diagonal effects. The anomalous transport model is also combined with particle drifts and a particle/energy-conserving Krook collision operator to study possible synergistic effects with neoclassical transport. For the latter study, a velocity-independent anomalous diffusion coefficient is used to mimic the effect of long-wavelength ExB turbulence.
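The Krook (BGK) collision operator mentioned above relaxes the distribution toward a Maxwellian at rate ν, df/dt = −ν(f − f_M); a minimal 1-D sketch in which f_M is built from the moments of f itself, so that density (and, approximately, energy) is conserved:

```python
import numpy as np

def krook_step(f, v, dt, nu):
    """One explicit step of the Krook (BGK) operator df/dt = -nu (f - f_M).
    The Maxwellian f_M takes its density and temperature from the moments
    of f itself (1-D, unit mass, zero mean velocity assumed), so the step
    conserves particles and approximately conserves energy."""
    dv = v[1] - v[0]
    n = f.sum() * dv                     # density (0th moment)
    T = (f * v ** 2).sum() * dv / n      # temperature from the 2nd moment
    f_M = n / np.sqrt(2.0 * np.pi * T) * np.exp(-v ** 2 / (2.0 * T))
    return f + dt * nu * (f_M - f)
```

Repeated application relaxes, e.g., a two-beam distribution toward a single Maxwellian carrying the same density and energy, which is the behaviour that makes the operator a convenient bridge to fluid transport codes.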

  18. Simulations of recoiling black holes: adaptive mesh refinement and radiative transfer

    NASA Astrophysics Data System (ADS)

    Meliani, Zakaria; Mizuno, Yosuke; Olivares, Hector; Porth, Oliver; Rezzolla, Luciano; Younsi, Ziri

    2017-02-01

    Context. In many astrophysical phenomena, and especially in those that involve the high-energy regimes that always accompany the astronomical phenomenology of black holes and neutron stars, physical conditions that are achieved are extreme in terms of speeds, temperatures, and gravitational fields. In such relativistic regimes, numerical calculations are the only tool to accurately model the dynamics of the flows and the transport of radiation in the accreting matter. Aims: We here continue our effort of modelling the behaviour of matter when it orbits or is accreted onto a generic black hole by developing a new numerical code that employs advanced techniques geared towards solving the equations of general-relativistic hydrodynamics. Methods: More specifically, the new code employs a number of high-resolution shock-capturing Riemann solvers and reconstruction algorithms, exploiting the enhanced accuracy and the reduced computational cost of adaptive mesh-refinement (AMR) techniques. In addition, the code makes use of sophisticated ray-tracing libraries that, coupled with general-relativistic radiation-transfer calculations, allow us to accurately compute the electromagnetic emissions from such accretion flows. Results: We validate the new code by presenting an extensive series of stationary accretion flows either in spherical or axial symmetry that are performed either in two or three spatial dimensions. In addition, we consider the highly nonlinear scenario of a recoiling black hole produced in the merger of a supermassive black-hole binary interacting with the surrounding circumbinary disc. In this way, we can present for the first time ray-traced images of the shocked fluid and the light curve resulting from consistent general-relativistic radiation-transport calculations from this process. 
Conclusions: The work presented here lays the groundwork for the development of a generic computational infrastructure employing AMR techniques to accurately and self-consistently calculate general-relativistic accretion flows onto compact objects. In addition to the accurate handling of the matter, we provide a self-consistent electromagnetic emission from these scenarios by solving the associated radiative-transfer problem. While magnetic fields are currently excluded from our analysis, the tools presented here can have a number of applications to study accretion flows onto black holes or neutron stars.

  19. Three dimensional ray tracing of the Jovian magnetosphere in the low frequency range

    NASA Technical Reports Server (NTRS)

    Menietti, J. D.

    1984-01-01

    Ray tracing of Jovian low-frequency emissions was studied. A comprehensive three-dimensional ray tracing computer code for the examination of model Jovian decametric (DAM) emission was developed. The improvements to the computer code are outlined and described. The results of the ray tracings of Jovian emissions will be presented in summary form.

  20. Validation of columnar CsI x-ray detector responses obtained with hybridMANTIS, a CPU-GPU Monte Carlo code for coupled x-ray, electron, and optical transport.

    PubMed

    Sharma, Diksha; Badano, Aldo

    2013-03-01

    hybridMANTIS is a Monte Carlo package for modeling indirect x-ray imagers using columnar geometry based on a hybrid concept that maximizes the utilization of available CPU and graphics processing unit processors in a workstation. The authors compare hybridMANTIS x-ray response simulations to previously published MANTIS and experimental data for four cesium iodide scintillator screens. These screens have a variety of reflective and absorptive surfaces with different thicknesses. The authors analyze hybridMANTIS results in terms of modulation transfer function and calculate the root mean square difference and Swank factors from simulated and experimental results. The comparison suggests that hybridMANTIS better matches the experimental data as compared to MANTIS, especially at high spatial frequencies and for the thicker screens. hybridMANTIS simulations are much faster than MANTIS with speed-ups up to 5260. hybridMANTIS is a useful tool for improved description and optimization of image acquisition stages in medical imaging systems and for modeling the forward problem in iterative reconstruction algorithms.
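The Swank factor reported above is the moment ratio I = M1^2/(M0 M2) of the pulse-height distribution (optical photons collected per absorbed x-ray); a minimal sketch of its estimation from per-photon pulse heights (toy data, not the measured screens):

```python
import numpy as np

def swank_factor(pulse_heights):
    """Swank information factor I = M1^2 / (M0 * M2), where M_n is the
    n-th moment of the normalized pulse-height distribution."""
    p = np.asarray(pulse_heights, dtype=float)
    m0 = 1.0              # zeroth moment of a normalized distribution
    m1 = p.mean()         # first moment: mean pulse height
    m2 = (p ** 2).mean()  # second moment
    return m1 ** 2 / (m0 * m2)

# a broader pulse-height distribution lowers the Swank factor
rng = np.random.default_rng(7)
narrow = rng.normal(1000.0, 50.0, 10000)
broad = rng.normal(1000.0, 400.0, 10000)
```

The factor equals 1 for a delta-function pulse-height distribution and decreases as depth-dependent light collection broadens it, which is why it is a standard figure of merit for columnar screens.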

  1. Grid-enhanced X-ray coded aperture microscopy with polycapillary optics

    PubMed Central

    Sowa, Katarzyna M.; Last, Arndt; Korecki, Paweł

    2017-01-01

    Polycapillary devices focus X-rays by means of multiple reflections of X-rays in arrays of bent glass capillaries. The size of the focal spot (typically 10–100 μm) limits the resolution of scanning, absorption and phase-contrast X-ray imaging using these devices. At the expense of a moderate resolution, polycapillary elements provide high intensity and are frequently used for X-ray micro-imaging with both synchrotrons and X-ray tubes. Recent studies have shown that the internal microstructure of such an optics can be used as a coded aperture that encodes high-resolution information about objects located inside the focal spot. However, further improvements to this variant of X-ray microscopy will require the challenging fabrication of tailored devices with a well-defined capillary microstructure. Here, we show that submicron coded aperture microscopy can be realized using a periodic grid that is placed at the output surface of a polycapillary optics. Grid-enhanced X-ray coded aperture microscopy with polycapillary optics does not rely on the specific microstructure of the optics but rather takes advantage only of its focusing properties. Hence, submicron X-ray imaging can be realized with standard polycapillary devices and existing set-ups for micro X-ray fluorescence spectroscopy. PMID:28322316

  2. Grid-enhanced X-ray coded aperture microscopy with polycapillary optics.

    PubMed

    Sowa, Katarzyna M; Last, Arndt; Korecki, Paweł

    2017-03-21

    Polycapillary devices focus X-rays by means of multiple reflections of X-rays in arrays of bent glass capillaries. The size of the focal spot (typically 10-100 μm) limits the resolution of scanning, absorption and phase-contrast X-ray imaging using these devices. At the expense of a moderate resolution, polycapillary elements provide high intensity and are frequently used for X-ray micro-imaging with both synchrotrons and X-ray tubes. Recent studies have shown that the internal microstructure of such an optics can be used as a coded aperture that encodes high-resolution information about objects located inside the focal spot. However, further improvements to this variant of X-ray microscopy will require the challenging fabrication of tailored devices with a well-defined capillary microstructure. Here, we show that submicron coded aperture microscopy can be realized using a periodic grid that is placed at the output surface of a polycapillary optics. Grid-enhanced X-ray coded aperture microscopy with polycapillary optics does not rely on the specific microstructure of the optics but rather takes advantage only of its focusing properties. Hence, submicron X-ray imaging can be realized with standard polycapillary devices and existing set-ups for micro X-ray fluorescence spectroscopy.

  3. RAY-RAMSES: a code for ray tracing on the fly in N-body simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barreira, Alexandre; Llinares, Claudio; Bose, Sownak

    2016-05-01

    We present a ray tracing code to compute integrated cosmological observables on the fly in AMR N-body simulations. Unlike conventional ray tracing techniques, our code takes full advantage of the time and spatial resolution attained by the N-body simulation by computing the integrals along the line of sight on a cell-by-cell basis through the AMR simulation grid. Moreover, since it runs on the fly in the N-body run, our code can produce maps of the desired observables without storing large (or any) amounts of data for post-processing. We implemented our routines in the RAMSES N-body code and tested the implementation using an example weak lensing simulation. We analyse basic statistics of lensing convergence maps and find good agreement with semi-analytical methods. The ray tracing methodology presented here can be used in several cosmological analyses such as Sunyaev-Zel'dovich and integrated Sachs-Wolfe effect studies as well as modified gravity. Our code can also be used in cross-checks of the more conventional methods, which can be important in tests of theory systematics in preparation for upcoming large scale structure surveys.
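The cell-by-cell line-of-sight accumulation described above can be sketched in simplified form (uniform 1D grid, Born approximation, unit prefactor, scale factor a = 1; the names are illustrative, not the RAY-RAMSES API):

```python
def convergence_on_the_fly(delta_cells, chi_edges, chi_s, prefac=1.0):
    """Accumulate the lensing convergence kappa = prefac * sum over cells of
    delta * g(chi) * dchi, with lensing kernel g = chi * (chi_s - chi) / chi_s,
    evaluated cell by cell as the ray crosses the grid, so no map data needs
    to be stored for post-processing."""
    kappa = 0.0
    for i, delta in enumerate(delta_cells):
        chi0, chi1 = chi_edges[i], chi_edges[i + 1]
        chi_mid = 0.5 * (chi0 + chi1)      # midpoint of this cell's crossing
        if chi_mid >= chi_s:               # cells behind the source do not lens
            break
        kernel = chi_mid * (chi_s - chi_mid) / chi_s
        kappa += prefac * delta * kernel * (chi1 - chi0)  # running sum per ray
    return kappa
```

In the actual AMR case the crossing length (chi1 - chi0) differs per cell and per refinement level, but the running sum has the same structure.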

  4. Method for rapid high-frequency seismogram calculation

    NASA Astrophysics Data System (ADS)

    Stabile, Tony Alfredo; De Matteis, Raffaella; Zollo, Aldo

    2009-02-01

    We present a method for rapid, high-frequency seismogram calculation that makes use of an algorithm to automatically generate an exhaustive set of seismic phases with an appreciable amplitude on the seismogram. The method uses a hierarchical order of ray and seismic-phase generation, taking into account some existing constraints for ray paths and some physical constraints. To compute synthetic seismograms, the COMRAD code (from the Italian: "COdice Multifase per il RAy-tracing Dinamico") uses as its core a dynamic ray-tracing code. To validate the code, we computed synthetic seismograms in a layered medium using both COMRAD and a code that computes the complete wave field by the discrete wave number method. The seismograms are compared according to a time-frequency misfit criterion based on the continuous wavelet transform of the signals. Although the number of phases is considerably reduced by the selection criteria, the results show that the loss in amplitude on the whole seismogram is negligible. Moreover, the computation time for the synthetics using the COMRAD code (truncating the ray series at the 10th generation) is 3-4-fold less than that needed for the AXITRA code (up to a frequency of 25 Hz).
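A wavelet-based time-frequency misfit of the kind used to compare the two sets of seismograms can be sketched as follows; this is a minimal envelope-only version with a Morlet wavelet (the published misfit criteria are more elaborate, and all names here are illustrative):

```python
import numpy as np

def morlet_cwt(sig, scales, w0=6.0):
    """Continuous wavelet transform with a Morlet wavelet (minimal sketch)."""
    n = sig.size
    out = np.empty((len(scales), n), dtype=complex)
    t = np.arange(-n // 2, n // 2)
    for k, s in enumerate(scales):
        psi = np.exp(1j * w0 * t / s) * np.exp(-0.5 * (t / s) ** 2) / np.sqrt(s)
        out[k] = np.convolve(sig, np.conj(psi[::-1]), mode="same")
    return out

def tf_misfit(sig, ref, scales):
    """Global time-frequency envelope misfit between a candidate and a
    reference seismogram, normalised by the reference envelope energy."""
    W1, W2 = morlet_cwt(sig, scales), morlet_cwt(ref, scales)
    num = np.sum((np.abs(W1) - np.abs(W2)) ** 2)
    den = np.sum(np.abs(W2) ** 2)
    return np.sqrt(num / den)

t = np.linspace(0.0, 1.0, 128)
sig = np.sin(2 * np.pi * 10 * t)  # toy "seismogram"
```

A misfit of zero means identical envelopes at every time and scale; dropping a low-amplitude phase from the ray series perturbs the envelope only slightly, which is the sense in which the amplitude loss above is "negligible".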

  5. Hard gamma radiation background from coding collimator of gamma telescope under space experiment conditions

    NASA Astrophysics Data System (ADS)

    Aleksandrov, A. P.; Berezovoy, A. N.; Galper, A. M.; Grachev, V. M.; Dmitrenko, V. V.; Kirillov-Ugryumov, V. G.; Lebedev, V. V.; Lyakhov, V. A.; Moiseyev, A. A.; Ulin, S. Y.

    1985-09-01

    Coding collimators are used to improve the angular resolution of gamma-ray telescopes at energies above 50 MeV. However, the interaction of cosmic rays with the collimation material can lead to the appearance of a gamma-ray background flux which can have a deleterious effect on measurement efficiency. An experiment was performed on the Salyut-6-Soyuz spacecraft system with the Elena-F small-scale gamma-ray telescope in order to measure the magnitude of this background. It is shown that, even at a zenith angle of approximately zero degrees (the angle at which the gamma-ray observations are made), the coding collimator has only an insignificant effect on the background conditions.

  6. Evaluation of the Environmental Gamma-ray Dose Rate by Skyshine Analysis During the Maintenance of an Activated TFC in ITER

    NASA Astrophysics Data System (ADS)

    Sato, S.; Takatsu, H.; Maki, K.; Yamada, K.; Mori, S.; Iida, H.; Santoro, R. T.

    1997-09-01

    Gamma-ray exposure dose rates at the ITER site boundary were estimated for the cases of removal of a failed activated Toroidal Field (TF) coil from the torus and removal of a failed activated TF coil together with a sector of the activated Vacuum Vessel (VV). Skyshine analyses were performed using the two-dimensional SN radiation transport code, DOT3.5. The exposure gamma-ray dose rates on the ground at the site boundary (presently assumed to be 1 km from the ITER building) were calculated to be 1.1 and 84 μSv/year for removal of the TF coil without and with a VV sector, respectively. The dose rate level for the latter case is close to the tentative radiation limit of 100 μSv/year, so an additional ~14 cm of concrete is required in the ITER building roof to satisfy the criterion of a safety factor of ten for the site boundary dose rate.
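The roof-thickness estimate can be illustrated with a simple exponential-attenuation model; the tenth-value-layer value and the function name here are assumptions for illustration, not numbers from the analysis:

```python
import math

def extra_shield_thickness(dose, dose_target, tvl_cm):
    """Extra concrete thickness needed to attenuate a dose rate down to a
    target, assuming simple exponential attenuation characterised by a
    tenth-value layer tvl_cm (one TVL per factor-of-ten reduction)."""
    return tvl_cm * math.log10(dose / dose_target)

# Hypothetical: reducing 84 uSv/yr by one decade with a ~14 cm TVL of concrete.
print(extra_shield_thickness(84.0, 8.4, 14.0))  # about one TVL
```

A real skyshine calculation cannot use this slab formula directly, since the dose at the boundary arrives via air scattering above the roof, which is why a transport code such as DOT3.5 is used.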

  7. CRUNCH_PARALLEL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shumaker, Dana E.; Steefel, Carl I.

    The code CRUNCH_PARALLEL is a parallel version of the CRUNCH code. CRUNCH code version 2.0 was previously released by LLNL (UCRL-CODE-200063). CRUNCH is a general-purpose reactive transport code developed by Carl Steefel and Steven Yabusaki (Steefel and Yabusaki, 1996). The code handles non-isothermal transport and reaction in one, two, and three dimensions. The reaction algorithm is generic in form, handling an arbitrary number of aqueous and surface complexation reactions as well as mineral dissolution/precipitation. A standardized database is used containing thermodynamic and kinetic data. The code includes advective, dispersive, and diffusive transport.
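The transport operator that reactive transport codes of this class advance can be sketched with one explicit 1D step; the reaction term is reduced here to linear decay (CRUNCH itself handles full complexation and mineral kinetics), and the function is an illustration, not CRUNCH code:

```python
import numpy as np

def adr_step(c, dx, dt, v, D, k):
    """One explicit step of 1D advective-dispersive-diffusive transport with
    first-order decay: dc/dt = -v dc/dx + D d2c/dx2 - k c
    (upwind advection for v > 0, fixed-concentration boundaries)."""
    cn = c.copy()
    adv = -v * (c[1:-1] - c[:-2]) / dx                 # upwind difference
    disp = D * (c[2:] - 2.0 * c[1:-1] + c[:-2]) / dx**2
    cn[1:-1] = c[1:-1] + dt * (adv + disp - k * c[1:-1])
    return cn

c0 = np.ones(5)
c1 = adr_step(c0, dx=1.0, dt=0.1, v=0.0, D=0.0, k=1.0)  # pure decay interior
```

An explicit step like this is stability-limited; production codes use implicit or operator-split schemes for the stiff geochemical source terms.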

  8. An electron-beam dose deposition experiment: TIGER 1-D simulation code versus thermoluminescent dosimetry

    NASA Astrophysics Data System (ADS)

    Murrill, Steven R.; Tipton, Charles W.; Self, Charles T.

    1991-03-01

    The dose absorbed in an integrated circuit (IC) die exposed to a pulse of low-energy electrons is a strong function of both electron energy and surrounding packaging materials. This report describes an experiment designed to measure how well the Integrated TIGER Series one-dimensional (1-D) electron transport simulation program predicts dose correction factors for a state-of-the-art IC package and package/printed circuit board (PCB) combination. These derived factors are compared with data obtained experimentally using thermoluminescent dosimeters (TLD's) and the FX-45 flash x-ray machine (operated in electron-beam (e-beam) mode). The results of this experiment show that the TIGER 1-D simulation code can be used to accurately predict FX-45 e-beam dose deposition correction factors for reasonably complex IC packaging configurations.

  9. PHITS simulations of the Matroshka experiment

    NASA Astrophysics Data System (ADS)

    Gustafsson, Katarina; Sihver, Lembit; Mancusi, Davide; Sato, Tatsuhiko

    In order to design safer space exploration, radiation exposure estimations are necessary; the radiation environment in space is very different from the one on Earth, and it is harmful to humans and to electronic equipment. The threat originates from two sources: Galactic Cosmic Rays and Solar Particle Events. It is important to understand what happens when these particles strike matter such as space vehicle walls, human organs and electronics. We are therefore developing a tool able to estimate the radiation exposure of both humans and electronics. The tool will be based on PHITS, the Particle and Heavy-Ion Transport code System, a three-dimensional Monte Carlo code which can calculate interactions and transport of particles and heavy ions in matter. PHITS is developed by a collaboration between RIST (Research Organization for Information Science & Technology), JAEA (Japan Atomic Energy Agency), and KEK (High Energy Accelerator Research Organization) in Japan, and Chalmers University of Technology, Sweden. A method for benchmarking and developing the code is to simulate experiments performed in space or on Earth. We have carried out simulations of the Matroshka experiment, which focuses on determining the radiation load on astronauts inside and outside the International Space Station by using the torso of a tissue-equivalent human phantom, filled with active and passive detectors located in the positions of critical tissues and organs. We will present the status and results of our simulations.

  10. Benchmarking atomic physics models for magnetically confined fusion plasma physics experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    May, M.J.; Finkenthal, M.; Soukhanovskii, V.

    In present magnetically confined fusion devices, high and intermediate Z impurities are either puffed into the plasma for divertor radiative cooling experiments or are sputtered from the high Z plasma facing armor. The beneficial cooling of the edge as well as the detrimental radiative losses from the core of these impurities can be properly understood only if the atomic physics used in the modeling of the cooling curves is very accurate. To this end, a comprehensive experimental and theoretical analysis of some relevant impurities is undertaken. Gases (Ne, Ar, Kr, and Xe) are puffed and nongases are introduced through laser ablation into the FTU tokamak plasma. The charge state distributions and total density of these impurities are determined from spatial scans of several photometrically calibrated vacuum ultraviolet and x-ray spectrographs (3–1600 Å), the Multiple Ionization State Transport (MIST) code, and a collisional radiative model. The radiative power losses are measured with bolometry, and the emissivity profiles are measured by a visible bremsstrahlung array. The ionization balance, excitation physics, and radiative cooling curves are computed with the Hebrew University Lawrence Livermore Atomic Code (HULLAC) and are benchmarked by these experiments. (Supported by U.S. DOE Grant No. DE-FG02-86ER53214 at JHU and Contract No. W-7405-ENG-48 at LLNL.) © 1999 American Institute of Physics.

  11. A Finite-Orbit-Width Fokker-Planck solver for modeling of energetic particle interactions with waves, with application to Helicons in ITER

    NASA Astrophysics Data System (ADS)

    Petrov, Yuri V.; Harvey, R. W.

    2017-10-01

    The bounce-average (BA) finite-difference Fokker-Planck (FP) code CQL3D [1,2] now includes the essential physics to describe the RF heating of Finite-Orbit-Width (FOW) ions in tokamaks. The FP equation is reformulated in terms of Constants-Of-Motion coordinates, which we select to be particle speed, pitch angle, and major radius on the equatorial plane, thus obtaining the distribution function directly at this location. Full-orbit, low collisionality neoclassical radial transport emerges from averaging the local friction and diffusion coefficients along guiding center orbits. Similarly, the BA of local quasilinear RF diffusion terms gives rise to additional radial transport. The local RF electric field components needed for the BA operator are usually obtained by a ray-tracing code, such as GENRAY, or in conjunction with full-wave codes. As a new, practical application, the CQL3D-FOW version is used for simulation of alpha-particle heating by high-harmonic waves in ITER. Coupling of high harmonic or helicon fast waves power to electrons is a promising current drive (CD) scenario for high beta plasmas. However, the efficiency of current drive can be diminished by parasitic channeling of RF power into fast ions, such as alphas, through finite Larmor-radius effects. We investigate possibilities to reduce the fast ion heating in CD scenarios.

  12. Comparison of Organ Dosimetry for Astronaut Phantoms: Earth-Based vs. Microgravity-Based Anthropometry and Body Positioning

    NASA Technical Reports Server (NTRS)

    VanBaalen, Mary; Bahadon, Amir; Shavers, Mark; Semones, Edward

    2011-01-01

    The purpose of this study is to use NASA radiation transport codes to compare astronaut organ dose equivalents resulting from solar particle events (SPE), geomagnetically trapped protons, and free-space galactic cosmic rays (GCR) using phantom models representing Earth-based and microgravity-based anthropometry and positioning. Methods: The University of Florida hybrid adult phantoms were scaled to represent male and female astronauts with 5th, 50th, and 95th percentile heights and weights as measured on Earth. Another set of scaled phantoms, incorporating microgravity-induced changes, such as spinal lengthening, leg volume loss, and the assumption of the neutral body position, was also created. A ray-tracer was created and used to generate body self-shielding distributions for dose points within a voxelized phantom under isotropic irradiation conditions, which closely approximates the free-space radiation environment. Simplified external shielding consisting of an aluminum spherical shell was used to consider the influence of a spacesuit or shielding of a hull. These distributions were combined with depth dose distributions generated from the NASA radiation transport codes BRYNTRN (SPE and trapped protons) and HZETRN (GCR) to yield dose equivalent. Many points were sampled per organ. Results: The organ dose equivalent rates were on the order of 1.5-2.5 mSv per day for GCR (1977 solar minimum) and 0.4-0.8 mSv per day for trapped proton irradiation with shielding of 2 g cm-2 aluminum equivalent. The organ dose equivalents for SPE irradiation varied considerably, with the skin and eye lens having the highest organ dose equivalents and deep-seated organs, such as the bladder, liver, and stomach, having the lowest. Conclusions: The greatest differences between the Earth-based and microgravity-based phantoms are observed for smaller ray thicknesses, since the most drastic changes involved limb repositioning and not overall phantom size.
Improved self-shielding models reduce the overall uncertainty in organ dosimetry for mission-risk projections and assessments for astronauts
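The trace-then-fold procedure described above can be sketched as follows; the exponential depth-dose curve and its parameters are purely illustrative assumptions, not output of BRYNTRN or HZETRN:

```python
import numpy as np

def dose_equivalent_at_point(ray_thicknesses, depth_dose):
    """Fold a body self-shielding distribution (areal densities in g/cm2 seen
    along isotropic rays from a dose point) with a depth-dose-equivalent
    curve to estimate the dose equivalent at that point."""
    t = np.asarray(ray_thicknesses, dtype=float)
    return float(np.mean(depth_dose(t)))  # equal solid-angle weight per ray

# Illustrative exponential depth-dose curve (hypothetical parameters):
# ~2 mSv/day unshielded, attenuation length 20 g/cm2.
depth_dose = lambda t: 2.0 * np.exp(-t / 20.0)
```

Per-organ values follow by averaging this point estimate over many sampled dose points inside each organ, as the abstract describes.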

  13. Flowfield computer graphics

    NASA Technical Reports Server (NTRS)

    Desautel, Richard

    1993-01-01

    The objectives of this research include supporting the Aerothermodynamics Branch's research by developing graphical visualization tools for both the branch's adaptive grid code and flow field ray tracing code. The completed research for the reporting period includes development of a graphical user interface (GUI) and its implementation into the NAS Flowfield Analysis Software Tool kit (FAST), for both the adaptive grid code (SAGE) and the flow field ray tracing code (CISS).

  14. Validation of Ray Tracing Code Refraction Effects

    NASA Technical Reports Server (NTRS)

    Heath, Stephanie L.; McAninch, Gerry L.; Smith, Charles D.; Conner, David A.

    2008-01-01

    NASA's current predictive capabilities using the ray tracing program (RTP) are validated using helicopter noise data taken at Eglin Air Force Base in 2007. By including refractive propagation effects due to wind and temperature, the ray tracing code is able to explain large variations in the data observed during the flight test.

  15. The MCNP Simulation of a PGNAA System at TRR-1/M1

    NASA Astrophysics Data System (ADS)

    Sangaroon, S.; Ratanatongchai, W.; Picha, R.; Khaweerat, S.; Channuie, J.

    2017-06-01

    The prompt-gamma neutron activation analysis (PGNAA) system has been installed at Thai Research Reactor-1/Modified 1 (TRR-1/M1) since 1999. The purpose of the system is elemental and isotopic analysis. The system mainly consists of a series of moderators and collimators, neutron and gamma-ray shielding, and an HPGe detector. In this work, the system is characterized both experimentally and with simulations based on the Monte Carlo method using the Monte Carlo N-Particle transport code. The flux ratios (Φthermal/Φepithermal and Φthermal/Φfast) and the thermal neutron flux have been obtained, and the prompt gamma rays of a Portland cement sample have been simulated. The simulation contributes significantly to upgrading the PGNAA station for use in various applications.

  16. The POPOP4 library and codes for preparing secondary gamma-ray production cross sections

    NASA Technical Reports Server (NTRS)

    Ford, W. E., III

    1972-01-01

    The POPOP4 code for converting secondary gamma-ray yield data to multigroup secondary gamma-ray production cross sections and the POPOP4 library of secondary gamma-ray yield data are described. Recent results of the testing of uranium and iron data sets from the POPOP4 library are given. The data sets were tested by comparing calculated secondary gamma-ray pulse-height spectra with spectra measured at the ORNL TSR-II reactor.
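The yield-to-production conversion can be illustrated schematically; this simplified model (photons per interaction multiplied by a neutron-group cross section) only hints at the conversion POPOP4 performs, and the names are illustrative:

```python
import numpy as np

def gamma_production_matrix(sigma_g, yields):
    """Multigroup secondary gamma-ray production cross sections: for each
    neutron group g and gamma group k,
        sigma_prod[g, k] = sigma_g[g] * yields[g, k]
    where sigma_g[g] is the interaction cross section in neutron group g and
    yields[g, k] is photons emitted into gamma group k per interaction."""
    sigma_g = np.asarray(sigma_g, dtype=float)
    yields = np.asarray(yields, dtype=float)
    return sigma_g[:, None] * yields
```

A transport code then uses this matrix as a coupled neutron-to-gamma source term, which is what allows the calculated pulse-height spectra to be compared with reactor measurements.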

  17. A Deterministic Transport Code for Space Environment Electrons

    NASA Technical Reports Server (NTRS)

    Nealy, John E.; Chang, C. K.; Norman, Ryan B.; Blattnig, Steve R.; Badavi, Francis F.; Adamczyk, Anne M.

    2010-01-01

    A deterministic computational procedure has been developed to describe transport of space environment electrons in various shield media. This code is an upgrade and extension of an earlier electron code. Whereas the former code was formulated on the basis of parametric functions derived from limited laboratory data, the present code utilizes well established theoretical representations to describe the relevant interactions and transport processes. The shield material specification has been made more general, as have the pertinent cross sections. A combined mean free path and average trajectory approach has been used in the transport formalism. Comparisons with Monte Carlo calculations are presented.

  18. Novel 3D Approach to Flare Modeling via Interactive IDL Widget Tools

    NASA Astrophysics Data System (ADS)

    Nita, G. M.; Fleishman, G. D.; Gary, D. E.; Kuznetsov, A.; Kontar, E. P.

    2011-12-01

    Current and soon-to-be-available sophisticated 3D models of particle acceleration and transport in solar flares require a new level of user-friendly visualization and analysis tools allowing quick and easy adjustment of the model parameters and computation of realistic radiation patterns (images, spectra, polarization, etc.). We report the current state of these tools, which are in development and have already proved to be highly efficient for direct flare modeling. We present an interactive IDL widget application intended to provide a flexible tool that allows the user to generate spatially resolved radio and X-ray spectra. The object-based architecture of this application provides full interaction with imported 3D magnetic field models (e.g., from an extrapolation) that may be embedded in a global coronal model. Various tools provided allow users to explore the magnetic connectivity of the model by generating magnetic field lines originating in user-specified volume positions. Such lines may serve as reference lines for creating magnetic flux tubes, which are further populated with user-defined analytical thermal/nonthermal particle distribution models. By default, the application integrates IDL-callable DLL and shared libraries containing fast GS emission codes developed in FORTRAN and C++ and soft and hard X-ray codes developed in IDL. However, the interactive interface allows interchanging these default libraries with any user-defined IDL or external callable codes designed to solve the radiation transfer equation in the same or other wavelength ranges of interest. To illustrate the tool's capacity and generality, we present a step-by-step real-time computation of microwave and X-ray images from realistic magnetic structures obtained from a magnetic field extrapolation preceding a real event, and compare them with the actual imaging data obtained by the NORH and RHESSI instruments. 
We discuss further anticipated developments of the tools needed to accommodate temporal evolution of the magnetic field structure and/or fast electron population implied by the electron acceleration and transport. This work was supported in part by NSF grants AGS-0961867, AST-0908344, and NASA grants NNX10AF27G and NNX11AB49G to New Jersey Institute of Technology, by a UK STFC rolling grant, STFC/PPARC Advanced Fellowship, and the Leverhulme Trust, UK. Financial support by the European Commission through the SOLAIRE and HESPE Networks is gratefully acknowledged.

  19. Investigation of HZETRN 2010 as a Tool for Single Event Effect Qualification of Avionics Systems

    NASA Technical Reports Server (NTRS)

    Rojdev, Kristina; Koontz, Steve; Atwell, William; Boeder, Paul

    2014-01-01

    NASA's future missions are focused on long-duration deep space missions for human exploration which offer no options for a quick emergency return to Earth. The combination of long mission duration with no quick emergency return option leads to unprecedented spacecraft system safety and reliability requirements. It is important that spacecraft avionics systems for human deep space missions are not susceptible to Single Event Effect (SEE) failures caused by space radiation (primarily the continuous galactic cosmic ray background and the occasional solar particle event) interactions with electronic components and systems. SEE effects are typically managed during the design, development, and test (DD&T) phase of spacecraft development by using heritage hardware (if possible) and through extensive component level testing, followed by system level failure analysis tasks that are both time consuming and costly. The ultimate product of the SEE DD&T program is a prediction of spacecraft avionics reliability in the flight environment produced using various nuclear reaction and transport codes in combination with the component and subsystem level radiation test data. Previous work by Koontz et al. [1] utilized FLUKA, a Monte Carlo nuclear reaction and transport code, to calculate SEE and single event upset (SEU) rates. This code was then validated against in-flight data for a variety of spacecraft and space flight environments. However, FLUKA has a long run-time (on the order of days). CREME96 [2], an easy-to-use deterministic code offering short run times, was also compared with FLUKA predictions and in-flight data. CREME96, though fast and easy to use, has not been updated in several years and underestimates secondary particle shower effects in spacecraft structural shielding mass. 
    Thus, this paper will investigate the use of HZETRN 2010 [3], a fast and easy-to-use deterministic transport code, similar to CREME96, that was developed at NASA Langley Research Center primarily for flight crew ionizing radiation dose assessments. HZETRN 2010 includes updates to address secondary particle shower effects more accurately, and might be used as another tool to verify spacecraft avionics system reliability in space flight SEE environments.

  20. Comparison of Space Radiation Calculations from Deterministic and Monte Carlo Transport Codes

    NASA Technical Reports Server (NTRS)

    Adams, J. H.; Lin, Z. W.; Nasser, A. F.; Randeniya, S.; Tripathi, r. K.; Watts, J. W.; Yepes, P.

    2010-01-01

    The presentation outline includes motivation, radiation transport codes being considered, space radiation cases being considered, results for slab geometry, results from spherical geometry, and summary.

  1. High-resolution imaging gamma-ray spectroscopy with externally segmented germanium detectors

    NASA Technical Reports Server (NTRS)

    Callas, J. L.; Mahoney, W. A.; Varnell, L. S.; Wheaton, W. A.

    1993-01-01

    Externally segmented germanium detectors promise a breakthrough in gamma-ray imaging capabilities while retaining the superb energy resolution of germanium spectrometers. An angular resolution of 0.2 deg becomes practical by combining position-sensitive germanium detectors having a segment thickness of a few millimeters with a one-dimensional coded aperture located about a meter from the detectors. Correspondingly higher angular resolutions are possible with larger separations between the detectors and the coded aperture. Two-dimensional images can be obtained by rotating the instrument. Although the basic concept is similar to optical or X-ray coded-aperture imaging techniques, several complicating effects arise because of the penetrating nature of gamma rays. The complications include partial transmission through the coded aperture elements, Compton scattering in the germanium detectors, and high background count rates. Extensive electron-photon Monte Carlo modeling of a realistic detector/coded-aperture/collimator system has been performed. Results show that these complicating effects can be characterized and accounted for with no significant loss in instrument sensitivity.
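One-dimensional coded-aperture imaging by correlation decoding, as underlies this concept, can be sketched with a small uniformly redundant array; the 11-element mask below is illustrative, not the instrument's aperture, and the idealized model ignores the partial transmission, Compton scattering, and background effects the abstract discusses:

```python
import numpy as np

def encode(scene, mask):
    """Detector counts from a 1D coded aperture: cyclic correlation of the
    scene with the open (1) / closed (0) mask pattern."""
    n = len(mask)
    return np.array([sum(scene[i] * mask[(j - i) % n] for i in range(n))
                     for j in range(n)])

def decode(detector, mask):
    """Correlation decoding: cross-correlate the detector image with the
    mask; a point source reconstructs to a peak at its true position."""
    n = len(mask)
    return np.array([sum(detector[j] * mask[(j - i) % n] for j in range(n))
                     for i in range(n)])

# 11-element uniformly redundant array from the quadratic residues mod 11.
mask = np.array([1 if i in (1, 3, 4, 5, 9) else 0 for i in range(11)])

# A point source at position 7 reconstructs to a peak at position 7.
scene = np.zeros(11)
scene[7] = 1.0
rec = decode(encode(scene, mask), mask)
```

The quadratic-residue mask has a flat off-peak cyclic autocorrelation (here 2 everywhere off-peak versus 5 at the peak), which is what makes the simple correlation decode artifact-free in the ideal case.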

  2. Implementation of an anomalous radial transport model for continuum kinetic edge codes

    NASA Astrophysics Data System (ADS)

    Bodi, K.; Krasheninnikov, S. I.; Cohen, R. H.; Rognlien, T. D.

    2007-11-01

    Radial plasma transport in magnetic fusion devices is often dominated by plasma turbulence compared to neoclassical collisional transport. Continuum kinetic edge codes [such as the (2d,2v) transport version of TEMPEST and also EGK] compute the collisional transport directly, but there is a need to model the anomalous transport from turbulence for long-time transport simulations. Such a model is presented and results are shown for its implementation in the TEMPEST gyrokinetic edge code. The model includes velocity-dependent convection and diffusion coefficients expressed as Hermite polynomials in velocity. The specification of the Hermite coefficients can be set, e.g., by specifying the ratio of particle and energy transport as in fluid transport codes. The anomalous transport terms preserve the property of no particle flux into unphysical regions of velocity space. TEMPEST simulations are presented showing the separate control of particle and energy anomalous transport, and comparisons are made with neoclassical transport also included.
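The Hermite-expanded, velocity-dependent transport coefficients can be sketched as follows; this is a minimal illustration (function names and the normalisation are assumptions, not the TEMPEST implementation):

```python
import numpy as np
from numpy.polynomial.hermite import hermval

def anomalous_flux(v, f, dfdv, conv_coeffs, diff_coeffs, v_th=1.0):
    """Anomalous radial flux with velocity-dependent convection and diffusion
    coefficients expanded in (physicists') Hermite polynomials of the
    normalised velocity:
        Gamma(v) = V(v) * f - D(v) * df/dv,
        V(v) = sum_n c_n H_n(v / v_th),  D(v) = sum_n d_n H_n(v / v_th)."""
    x = np.asarray(v) / v_th
    V = hermval(x, conv_coeffs)   # convection coefficient at this velocity
    D = hermval(x, diff_coeffs)   # diffusion coefficient at this velocity
    return V * f - D * dfdv
```

Choosing the Hermite coefficients c_n and d_n independently is what allows particle and energy transport to be controlled separately, since higher-order polynomials weight the energetic part of the distribution more strongly.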

  3. Chromaticity calculations and code comparisons for x-ray lithography source XLS and SXLS rings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parsa, Z.

    1988-06-16

    This note presents the chromaticity calculations and code comparison results for the (x-ray lithography source) XLS (Chasman Green, XUV Cosy lattice) and (2 magnet 4T) SXLS lattices, with the standard beam optic codes, including programs SYNCH88.5, MAD6, PATRICIA88.4, PATPET88.2, DIMAD, BETA, and MARYLIE. This analysis is a part of our ongoing accelerator physics code studies. 4 figs., 10 tabs.

  4. Faster and more accurate transport procedures for HZETRN

    NASA Astrophysics Data System (ADS)

    Slaba, T. C.; Blattnig, S. R.; Badavi, F. F.

    2010-12-01

    The deterministic transport code HZETRN was developed for research scientists and design engineers studying the effects of space radiation on astronauts and instrumentation protected by various shielding materials and structures. In this work, several aspects of code verification are examined. First, a detailed derivation of the light particle (A ⩽ 4) and heavy ion (A > 4) numerical marching algorithms used in HZETRN is given. References are given for components of the derivation that already exist in the literature, and discussions are given for details that may have been absent in the past. The present paper provides a complete description of the numerical methods currently used in the code and is identified as a key component of the verification process. Next, a new numerical method for light particle transport is presented, and improvements to the heavy ion transport algorithm are discussed. A summary of round-off error is also given, and the impact of this error on previously predicted exposure quantities is shown. Finally, a coupled convergence study is conducted by refining the discretization parameters (step-size and energy grid-size). From this study, it is shown that past efforts in quantifying the numerical error in HZETRN were hindered by single precision calculations and computational resources. It is determined that almost all of the discretization error in HZETRN is caused by the use of discretization parameters that violate a numerical convergence criterion related to charged target fragments below 50 AMeV. Total discretization errors are given for the old and new algorithms to 100 g/cm² in aluminum and water, and the improved accuracy of the new numerical methods is demonstrated. Run time comparisons between the old and new algorithms are given for one, two, and three layer slabs of 100 g/cm² of aluminum, polyethylene, and water. 
The new algorithms are found to be almost 100 times faster for solar particle event simulations and almost 10 times faster for galactic cosmic ray simulations.
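The flavor of a spatial marching algorithm of this kind can be shown with a one-group, straight-ahead sketch; this is not the HZETRN algorithm itself, which also marches in energy and couples many particle species, and the names are illustrative:

```python
import numpy as np

def march(phi0, sigma, q, dx, nsteps):
    """March the straight-ahead balance equation d(phi)/dx + sigma*phi = q
    through a slab in uniform steps dx, using the exact one-step exponential
    update (constant sigma and source q within each step)."""
    phi = phi0
    att = np.exp(-sigma * dx)         # per-step attenuation factor
    for _ in range(nsteps):
        phi = phi * att + (q / sigma) * (1.0 - att)
    return phi
```

A convergence study of the kind described above would refine dx (and, in the real code, the energy grid) and watch the answer stabilize; here the per-step update is exact, so the result is independent of dx by construction.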

  5. COMBINED MODELING OF ACCELERATION, TRANSPORT, AND HYDRODYNAMIC RESPONSE IN SOLAR FLARES. I. THE NUMERICAL MODEL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu Wei; Petrosian, Vahe; Mariska, John T.

    2009-09-10

    Acceleration and transport of high-energy particles and fluid dynamics of atmospheric plasma are interrelated aspects of solar flares, but for convenience and simplicity they were artificially separated in the past. We present here self-consistently combined Fokker-Planck modeling of particles and hydrodynamic simulation of flare plasma. Energetic electrons are modeled with the Stanford unified code of acceleration, transport, and radiation, while plasma is modeled with the Naval Research Laboratory flux tube code. We calculated the collisional heating rate directly from the particle transport code, which is more accurate than those in previous studies based on approximate analytical solutions. We repeated the simulation of Mariska et al. with an injection of power law, downward-beamed electrons using the new heating rate. For this case, a {approx}10% difference was found from their old result. We also used a more realistic spectrum of injected electrons provided by the stochastic acceleration model, which has a smooth transition from a quasi-thermal background at low energies to a nonthermal tail at high energies. The inclusion of low-energy electrons results in relatively more heating in the corona (versus chromosphere) and thus a larger downward heat conduction flux. The interplay of electron heating, conduction, and radiative loss leads to stronger chromospheric evaporation than obtained in previous studies, which had a deficit in low-energy electrons due to an arbitrarily assumed low-energy cutoff. The energy and spatial distributions of energetic electrons and bremsstrahlung photons bear signatures of the changing density distribution caused by chromospheric evaporation. In particular, the density jump at the evaporation front gives rise to enhanced emission, which, in principle, can be imaged by X-ray telescopes. This model can be applied to investigate a variety of high-energy processes in solar, space, and astrophysical plasmas.

  6. Faster and more accurate transport procedures for HZETRN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Slaba, T.C., E-mail: Tony.C.Slaba@nasa.go; Blattnig, S.R., E-mail: Steve.R.Blattnig@nasa.go; Badavi, F.F., E-mail: Francis.F.Badavi@nasa.go

    The deterministic transport code HZETRN was developed for research scientists and design engineers studying the effects of space radiation on astronauts and instrumentation protected by various shielding materials and structures. In this work, several aspects of code verification are examined. First, a detailed derivation of the light particle (A ≤ 4) and heavy ion (A > 4) numerical marching algorithms used in HZETRN is given. References are given for components of the derivation that already exist in the literature, and discussions are given for details that may have been absent in the past. The present paper provides a complete description of the numerical methods currently used in the code and is identified as a key component of the verification process. Next, a new numerical method for light particle transport is presented, and improvements to the heavy ion transport algorithm are discussed. A summary of round-off error is also given, and the impact of this error on previously predicted exposure quantities is shown. Finally, a coupled convergence study is conducted by refining the discretization parameters (step-size and energy grid-size). From this study, it is shown that past efforts in quantifying the numerical error in HZETRN were hindered by single precision calculations and computational resources. It is determined that almost all of the discretization error in HZETRN is caused by the use of discretization parameters that violate a numerical convergence criterion related to charged target fragments below 50 AMeV. Total discretization errors are given for the old and new algorithms to 100 g/cm² in aluminum and water, and the improved accuracy of the new numerical methods is demonstrated. Run time comparisons between the old and new algorithms are given for one, two, and three layer slabs of 100 g/cm² of aluminum, polyethylene, and water.
The new algorithms are found to be almost 100 times faster for solar particle event simulations and almost 10 times faster for galactic cosmic ray simulations.
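    The kind of step-size convergence study described above can be illustrated with a toy marching scheme; a single exponential-attenuation equation stands in for HZETRN's actual algorithms, and the cross-section value is arbitrary:

```python
import math

def march_attenuation(sigma, depth, steps):
    """Forward-Euler marching of d(phi)/dx = -sigma * phi with phi(0) = 1,
    a toy stand-in for a deterministic transport marching algorithm."""
    h, phi = depth / steps, 1.0
    for _ in range(steps):
        phi -= sigma * phi * h
    return phi

exact = math.exp(-0.1 * 100.0)   # analytic flux after 100 g/cm^2, sigma = 0.1
err_coarse = abs(march_attenuation(0.1, 100.0, 1000) - exact)
err_fine = abs(march_attenuation(0.1, 100.0, 2000) - exact)
# For a first-order method, halving the step size roughly halves the error.
print(round(err_coarse / err_fine, 1))  # 2.0
```

A coupled convergence study refines step size and energy grid together and checks that the observed error ratio matches the method's formal order, exactly as done here in miniature.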

  7. The Development of WARP - A Framework for Continuous Energy Monte Carlo Neutron Transport in General 3D Geometries on GPUs

    NASA Astrophysics Data System (ADS)

    Bergmann, Ryan

    Graphics processing units, or GPUs, have gradually increased in computational power from the small, job-specific boards of the early 1990s to the programmable powerhouses of today. Compared to more common central processing units, or CPUs, GPUs have a higher aggregate memory bandwidth, much higher floating-point operations per second (FLOPS), and lower energy consumption per FLOP. Because one of the main obstacles in exascale computing is power consumption, many new supercomputing platforms are gaining much of their computational capacity by incorporating GPUs into their compute nodes. Since CPU-optimized parallel algorithms are not directly portable to GPU architectures (or at least not without losing substantial performance), transport codes need to be rewritten to execute efficiently on GPUs. Unless this is done, reactor simulations cannot take full advantage of these new supercomputers. WARP, which stands for "Weaving All the Random Particles," is a three-dimensional (3D) continuous energy Monte Carlo neutron transport code developed in this work to efficiently implement such an algorithm on a GPU. WARP accelerates Monte Carlo simulations while preserving the benefits of using the Monte Carlo method, namely, very few physical and geometrical simplifications. WARP is able to calculate multiplication factors, flux tallies, and fission source distributions for time-independent problems, and can run in either criticality or fixed source mode. WARP can transport neutrons in unrestricted arrangements of parallelepipeds, hexagonal prisms, cylinders, and spheres. WARP uses an event-based algorithm, but with some important differences. Moving data is expensive, so WARP uses a remapping vector of pointer/index pairs to direct GPU threads to the data they need to access.
The remapping vector is sorted by reaction type after every transport iteration using a high-efficiency parallel radix sort, which serves to keep the reaction types as contiguous as possible and removes completed histories from the transport cycle. The sort reduces the amount of divergence in GPU "thread blocks," keeps the SIMD units as full as possible, and eliminates using memory bandwidth to check if a neutron in the batch has been terminated or not. Using a remapping vector means the data access pattern is irregular, but this is mitigated by using large batch sizes where the GPU can effectively eliminate the high cost of irregular global memory access. WARP modifies the standard unionized energy grid implementation to reduce memory traffic. Instead of storing a matrix of pointers indexed by reaction type and energy, WARP stores three matrices. The first contains cross section values, the second contains pointers to angular distributions, and a third contains pointers to energy distributions. This linked-list type of layout increases memory usage, but lowers the number of data loads that are needed to determine a reaction by eliminating a pointer load to find a cross section value. Optimized, high-performance GPU code libraries are also used by WARP wherever possible. The CUDA performance primitives (CUDPP) library is used to perform the parallel reductions, sorts and sums, the CURAND library is used to seed the linear congruential random number generators, and the OptiX ray tracing framework is used for geometry representation. OptiX is a highly-optimized library developed by NVIDIA that automatically builds hierarchical acceleration structures around user-input geometry so only surfaces along a ray line need to be queried in ray tracing. WARP also performs material and cell number queries with OptiX by using a point-in-polygon like algorithm.
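    The remapping-vector bookkeeping can be sketched in a few lines; a serial sort stands in for the parallel radix sort, and the reaction keys are illustrative, not WARP's actual encoding:

```python
import random

DONE = 99  # sentinel reaction key marking a completed history

def transport_iteration(keys, remap):
    """One WARP-style bookkeeping step: sort the remapping vector so that
    histories with the same reaction key become contiguous (a serial
    stand-in for the parallel radix sort), then drop completed histories
    from the active set without moving the underlying particle data."""
    remap.sort(key=lambda i: keys[i])
    return [i for i in remap if keys[i] != DONE]

random.seed(1)
# 0 = scatter, 1 = fission, 2 = capture, DONE = terminated (illustrative)
keys = [random.choice([0, 1, 2, DONE]) for _ in range(16)]
active = transport_iteration(keys, list(range(16)))
grouped = [keys[i] for i in active]
print(grouped == sorted(grouped), DONE not in grouped)  # True True
```

Only the small index vector is sorted and compacted; the heavy per-particle data stays in place, which is the point of the scheme.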
WARP has shown that GPUs are an effective platform for performing Monte Carlo neutron transport with continuous energy cross sections. Currently, WARP is the most detailed and feature-rich program in existence for performing continuous energy Monte Carlo neutron transport in general 3D geometries on GPUs, but compared to production codes like Serpent and MCNP, WARP has limited capabilities. Despite WARP's lack of features, its novel algorithm implementations show that high performance can be achieved on a GPU despite the inherently divergent program flow and sparse data access patterns. WARP is not ready for everyday nuclear reactor calculations, but is a good platform for further development of GPU-accelerated Monte Carlo neutron transport. In its current state, it may be a useful tool for multiplication factor searches, i.e. determining reactivity coefficients by perturbing material densities or temperatures, since these types of calculations typically do not require many flux tallies. (Abstract shortened by UMI.)

  8. Microfocusing of the FERMI@Elettra FEL beam with a K-B active optics system: Spot size predictions by application of the WISE code

    NASA Astrophysics Data System (ADS)

    Raimondi, L.; Svetina, C.; Mahne, N.; Cocco, D.; Abrami, A.; De Marco, M.; Fava, C.; Gerusina, S.; Gobessi, R.; Capotondi, F.; Pedersoli, E.; Kiskinova, M.; De Ninno, G.; Zeitoun, P.; Dovillaire, G.; Lambert, G.; Boutu, W.; Merdji, H.; Gonzalez, A. I.; Gauthier, D.; Zangrando, M.

    2013-05-01

    FERMI@Elettra, the first seeded EUV-SXR free electron laser (FEL) facility, located at Elettra Sincrotrone Trieste, has been conceived to provide very short (10-100 fs) pulses with ultrahigh peak brightness and wavelengths from 100 nm to 4 nm. A section fully dedicated to the photon transport and analysis diagnostics, named PADReS, has already been installed and commissioned. Three of the beamlines, EIS-TIMEX, DiProI and LDM, installed after the PADReS section, are in an advanced commissioning state and will accept the first users in December 2012. These beamlines employ active X-ray optics in order to focus the FEL beam as well as to perform controlled beam-shaping at focus. Starting from mirror surface metrology characterization, it is difficult to predict the focal spot shape applying only methods based on geometrical optics, such as ray tracing. Within the geometrical optics approach one cannot take into account the diffraction effect from the optics edges, i.e. the aperture diffraction, or the impact of different surface spatial wavelengths on the spot size degradation. Both these effects are strongly dependent on the photon beam energy and mirror incidence angles. We employed a method based on physical optics, which applies the Huygens-Fresnel principle to reflection (on which the WISE code is based). In this work we report the results of the first measurements of the focal spot in the DiProI beamline end-station and compare them to the predictions computed with the Shadow code and the WISE code, starting from the mirror surface profile characterization.

  9. Impact of Cosmic-Ray Transport on Galactic Winds

    NASA Astrophysics Data System (ADS)

    Farber, R.; Ruszkowski, M.; Yang, H.-Y. K.; Zweibel, E. G.

    2018-04-01

    The role of cosmic rays generated by supernovae and young stars has very recently begun to receive significant attention in studies of galaxy formation and evolution due to the realization that cosmic rays can efficiently accelerate galactic winds. Microscopic cosmic-ray transport processes are fundamental for determining the efficiency of cosmic-ray wind driving. Previous studies modeled cosmic-ray transport either via a constant diffusion coefficient or via streaming proportional to the Alfvén speed. However, in predominantly cold, neutral gas, cosmic rays can propagate faster than in the ionized medium, and the effective transport can be substantially larger; i.e., cosmic rays can decouple from the gas. We perform three-dimensional magnetohydrodynamical simulations of patches of galactic disks including the effects of cosmic rays. Our simulations include the decoupling of cosmic rays in the cold, neutral interstellar medium. We find that, compared to the ordinary diffusive cosmic-ray transport case, accounting for the decoupling leads to significantly different wind properties, such as the gas density and temperature, significantly broader spatial distribution of cosmic rays, and higher wind speed. These results have implications for X-ray, γ-ray, and radio emission, and for the magnetization and pollution of the circumgalactic medium by cosmic rays.
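    The decoupling prescription amounts to a temperature-dependent switch in the effective transport coefficient. A minimal sketch, with all numerical values (ionized-medium diffusivity, boost factor, threshold temperature) purely illustrative rather than the paper's:

```python
def cr_diffusivity(temperature, kappa_ion=3.0e28, boost=30.0, t_decouple=1.0e4):
    """Toy decoupling prescription: below roughly 1e4 K the gas is mostly
    neutral, and the effective cosmic-ray diffusion coefficient (cm^2/s)
    is boosted above the ionized-medium value, mimicking cosmic rays
    decoupling from the cold gas. Numbers are illustrative only."""
    return kappa_ion * boost if temperature < t_decouple else kappa_ion

# Cosmic rays stream through cold clouds far faster than through hot gas.
print(cr_diffusivity(8.0e3) / cr_diffusivity(2.0e6))  # 30.0
```

In a full magnetohydrodynamic simulation this coefficient enters the cosmic-ray energy equation cell by cell, so cold clouds no longer trap cosmic-ray pressure.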

  10. An Overview of the XGAM Code and Related Software for Gamma-ray Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Younes, W.

    2014-11-13

    The XGAM spectrum-fitting code and associated software were developed specifically to analyze the complex gamma-ray spectra that can result from neutron-induced reactions. The XGAM code is designed to fit a spectrum over the entire available gamma-ray energy range as a single entity, in contrast to the more traditional piecewise approaches. This global-fit philosophy enforces background continuity as well as consistency between local and global behavior throughout the spectrum, and in a natural way. This report presents XGAM and the suite of programs built around it with an emphasis on how they fit into an overall analysis methodology for complex gamma-ray data. An application to the analysis of time-dependent delayed gamma-ray yields from 235U fission is shown in order to showcase the codes and how they interact.
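    The global-fit idea, one shared background plus all peaks fitted over the whole range at once, can be sketched as a linear least-squares problem. This is a schematic of the philosophy, not XGAM's actual model or peak shapes:

```python
import math

def fit_global(xs, ys, peak_centers, width):
    """Global linear least-squares fit of a whole spectrum at once: a single
    straight-line background shared across the full range plus fixed-shape
    Gaussian peaks at known centers, solved via the normal equations."""
    basis = [[1.0] * len(xs), list(xs)]
    for c in peak_centers:
        basis.append([math.exp(-0.5 * ((x - c) / width) ** 2) for x in xs])
    n, m = len(basis), len(xs)
    A = [[sum(bi[k] * bj[k] for k in range(m)) for bj in basis] for bi in basis]
    b = [sum(bi[k] * ys[k] for k in range(m)) for bi in basis]
    for i in range(n):                      # Gaussian elimination, no pivoting
        for j in range(i + 1, n):
            f = A[j][i] / A[i][i]
            for k in range(i, n):
                A[j][k] -= f * A[i][k]
            b[j] -= f * b[i]
    coef = [0.0] * n
    for i in range(n - 1, -1, -1):          # back substitution
        coef[i] = (b[i] - sum(A[i][k] * coef[k] for k in range(i + 1, n))) / A[i][i]
    return coef

# Synthetic spectrum: linear background plus one peak at channel 50.
xs = [float(i) for i in range(100)]
ys = [2.0 + 0.01 * x + 5.0 * math.exp(-0.5 * ((x - 50.0) / 3.0) ** 2) for x in xs]
coef = fit_global(xs, ys, [50.0], 3.0)
print([round(c, 4) for c in coef])  # [2.0, 0.01, 5.0]
```

Because the background parameters are shared by every channel, a peak fitted in one region cannot be compensated by a discontinuous background jump elsewhere, which is the continuity property the abstract emphasizes.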

  11. A comparison of models for supernova remnants including cosmic rays

    NASA Astrophysics Data System (ADS)

    Kang, Hyesung; Drury, L. O'C.

    1992-11-01

    A simplified model which can follow the dynamical evolution of a supernova remnant including the acceleration of cosmic rays without carrying out full numerical simulations has been proposed by Drury, Markiewicz, & Voelk in 1989. To explore the accuracy and the merits of using such a model, we have recalculated with the simplified code the evolution of the supernova remnants considered in Jones & Kang, in which more detailed and accurate numerical simulations were done using a full hydrodynamic code based on the two-fluid approximation. For the total energy transferred to cosmic rays the two codes are in good agreement, the acceleration efficiency being the same within a factor of 2 or so. The dependence of the results of the two codes on the closure parameters for the two-fluid approximation is also qualitatively similar. The agreement is somewhat degraded in those cases where the shock is smoothed out by the cosmic rays.

  12. Benchmarking NNWSI flow and transport codes: COVE 1 results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hayden, N.K.

    1985-06-01

    The code verification (COVE) activity of the Nevada Nuclear Waste Storage Investigations (NNWSI) Project is the first step in certification of flow and transport codes used for NNWSI performance assessments of a geologic repository for disposing of high-level radioactive wastes. The goals of the COVE activity are (1) to demonstrate and compare the numerical accuracy and sensitivity of certain codes, (2) to identify and resolve problems in running typical NNWSI performance assessment calculations, and (3) to evaluate computer requirements for running the codes. This report describes the work done for COVE 1, the first step in benchmarking some of themore » codes. Isothermal calculations for the COVE 1 benchmarking have been completed using the hydrologic flow codes SAGUARO, TRUST, and GWVIP; the radionuclide transport codes FEMTRAN and TRUMP; and the coupled flow and transport code TRACR3D. This report presents the results of three cases of the benchmarking problem solved for COVE 1, a comparison of the results, questions raised regarding sensitivities to modeling techniques, and conclusions drawn regarding the status and numerical sensitivities of the codes. 30 refs.« less

  13. X-ray Spectral Formation In High-mass X-ray Binaries: The Case Of Vela X-1

    NASA Astrophysics Data System (ADS)

    Akiyama, Shizuka; Mauche, C. W.; Liedahl, D. A.; Plewa, T.

    2007-05-01

    We are working to develop improved models of radiatively-driven mass flows in the presence of an X-ray source -- such as in X-ray binaries, cataclysmic variables, and active galactic nuclei -- in order to infer the physical properties that determine the X-ray spectra of such systems. The models integrate a three-dimensional time-dependent hydrodynamics capability (FLASH); a comprehensive and uniform set of atomic data, improved calculations of the line force multiplier that account for X-ray photoionization and non-LTE population kinetics, and X-ray emission-line models appropriate to X-ray photoionized plasmas (HULLAC); and a Monte Carlo radiation transport code that simulates Compton scattering and recombination cascades following photoionization. As a test bed, we have simulated a high-mass X-ray binary with parameters appropriate to Vela X-1. While the orbital and stellar parameters of this system are well constrained, the physics of X-ray spectral formation is less well understood because the canonical analytical wind velocity profile of OB stars does not account for the dynamical and radiative feedback effects due to the rotation of the system and to the irradiation of the stellar wind by X-rays from the neutron star. We discuss the dynamical wind structure of Vela X-1 as determined by the FLASH simulation, where in the binary the X-ray emission features originate, and how the spatial and spectral properties of the X-ray emission features are modified by Compton scattering, photoabsorption, and fluorescent emission. This work was performed under the auspices of the U.S. Department of Energy by University of California, Lawrence Livermore National Laboratory under Contract W-7405-Eng-48.

  14. Analytical Model for Estimating the Zenith Angle Dependence of Terrestrial Cosmic Ray Fluxes

    PubMed Central

    Sato, Tatsuhiko

    2016-01-01

    A new model called “PHITS-based Analytical Radiation Model in the Atmosphere (PARMA) version 4.0” was developed to facilitate instantaneous estimation of not only omnidirectional but also angular differential energy spectra of cosmic ray fluxes anywhere in Earth’s atmosphere at nearly any given time. It consists of its previous version, PARMA3.0, for calculating the omnidirectional fluxes and several mathematical functions proposed in this study for expressing their zenith-angle dependences. The numerical values of the parameters used in these functions were fitted to reproduce the results of the extensive air shower simulation performed by Particle and Heavy Ion Transport code System (PHITS). The angular distributions of ground-level muons at large zenith angles were specially determined by introducing an optional function developed on the basis of experimental data. The accuracy of PARMA4.0 was closely verified using multiple sets of experimental data obtained under various global conditions. This extension enlarges the model’s applicability to more areas of research, including design of cosmic-ray detectors, muon radiography, soil moisture monitoring, and cosmic-ray shielding calculation. PARMA4.0 is available freely and is easy to use, as implemented in the open-access EXcel-based Program for Calculating Atmospheric Cosmic-ray Spectrum (EXPACS). PMID:27490175
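    The factorization PARMA uses, an omnidirectional flux multiplied by a normalized zenith-angle shape function, can be sketched as follows. The cosⁿθ form and its parameters are illustrative stand-ins; the actual PARMA4.0 functions are more elaborate:

```python
import math

def angular_flux(phi_omni, cos_theta, n=2.0):
    """Hypothetical PARMA-style factorization: omnidirectional flux times a
    zenith-angle shape function f(cos(theta)) proportional to cos(theta)^n,
    normalized so that f integrates to 1 over the downward hemisphere."""
    norm = 2.0 * math.pi / (n + 1.0)   # integral of cos^n over the hemisphere
    return phi_omni * (cos_theta ** n) / norm

# Check the normalization numerically with a midpoint rule over cos(theta).
steps, integral = 20000, 0.0
for k in range(steps):
    c = (k + 0.5) / steps
    integral += angular_flux(1.0, c) * 2.0 * math.pi / steps
print(round(integral, 3))  # 1.0
```

With this structure, fitting the angular parameters against air-shower simulations leaves the previously validated omnidirectional spectra untouched.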

  16. Modeling UV Radiation Feedback from Massive Stars. I. Implementation of Adaptive Ray-tracing Method and Tests

    NASA Astrophysics Data System (ADS)

    Kim, Jeong-Gyu; Kim, Woong-Tae; Ostriker, Eve C.; Skinner, M. Aaron

    2017-12-01

    We present an implementation of an adaptive ray-tracing (ART) module in the Athena hydrodynamics code that accurately and efficiently handles the radiative transfer involving multiple point sources on a three-dimensional Cartesian grid. We adopt a recently proposed parallel algorithm that uses nonblocking, asynchronous MPI communications to accelerate transport of rays across the computational domain. We validate our implementation through several standard test problems, including the propagation of radiation in vacuum and the expansions of various types of H II regions. Additionally, scaling tests show that the cost of a full ray trace per source remains comparable to that of the hydrodynamics update on up to ~10³ processors. To demonstrate application of our ART implementation, we perform a simulation of star cluster formation in a marginally bound, turbulent cloud, finding that its star formation efficiency is 12% when both radiation pressure forces and photoionization by UV radiation are treated. We directly compare the radiation forces computed from the ART scheme with those from the M1 closure relation. Although the ART and M1 schemes yield similar results on large scales, the latter is unable to resolve the radiation field accurately near individual point sources.
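    The core bookkeeping of adaptive ray tracing is choosing the refinement level so that every cell is crossed by enough rays. A minimal sketch, assuming a HEALPix-like scheme where each refinement splits every ray into four children; all parameter values are illustrative:

```python
import math

def rays_needed(distance, cell_size, rays_per_cell=4.0, base_rays=12):
    """Adaptive ray-tracing refinement: starting from base_rays directions,
    split each ray into four children per level until the expected number of
    rays crossing a cell of area cell_size**2 on a shell at radius `distance`
    meets the sampling criterion. Returns (level, ray count)."""
    level, n_ray = 0, base_rays
    while n_ray * cell_size ** 2 / (4.0 * math.pi * distance ** 2) < rays_per_cell:
        level += 1
        n_ray = base_rays * 4 ** level
    return level, n_ray

# Cells farther from the source need more refinement levels.
print(rays_needed(10.0, 1.0))  # (5, 12288)
```

Because rays are only split where the criterion demands it, the cost per source grows far more slowly than uniform angular sampling would.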

  17. Validation of columnar CsI x-ray detector responses obtained with hybridMANTIS, a CPU-GPU Monte Carlo code for coupled x-ray, electron, and optical transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharma, Diksha; Badano, Aldo

    2013-03-15

    Purpose: hybridMANTIS is a Monte Carlo package for modeling indirect x-ray imagers using columnar geometry based on a hybrid concept that maximizes the utilization of available CPU and graphics processing unit processors in a workstation. Methods: The authors compare hybridMANTIS x-ray response simulations to previously published MANTIS and experimental data for four cesium iodide scintillator screens. These screens have a variety of reflective and absorptive surfaces with different thicknesses. The authors analyze hybridMANTIS results in terms of modulation transfer function and calculate the root mean square difference and Swank factors from simulated and experimental results. Results: The comparison suggests that hybridMANTIS better matches the experimental data as compared to MANTIS, especially at high spatial frequencies and for the thicker screens. hybridMANTIS simulations are much faster than MANTIS, with speed-ups of up to 5260. Conclusions: hybridMANTIS is a useful tool for improved description and optimization of image acquisition stages in medical imaging systems and for modeling the forward problem in iterative reconstruction algorithms.
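    The Swank factor mentioned above is the standard moment ratio I = M₁²/(M₀M₂) of the pulse-height distribution, which can be computed directly from simulated or measured histograms:

```python
def swank_factor(counts, pulse_heights):
    """Swank information factor I = M1^2 / (M0 * M2), computed from a
    pulse-height distribution; it quantifies scintillator gain fluctuations
    (I = 1 means no fluctuation, I < 1 means added noise)."""
    m0 = sum(counts)
    m1 = sum(c * h for c, h in zip(counts, pulse_heights))
    m2 = sum(c * h * h for c, h in zip(counts, pulse_heights))
    return m1 * m1 / (m0 * m2)

# A delta-function response has no gain fluctuation: I = 1.
print(swank_factor([100], [500.0]))          # 1.0
# Broadening the pulse-height distribution lowers I below 1.
print(swank_factor([50, 50], [400.0, 600.0]) < 1.0)  # True
```

The histogram values here are made up for illustration; in practice the distribution comes from the optical transport output of the simulation or from experiment.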

  18. Modelling of aircrew radiation exposure from galactic cosmic rays and solar particle events.

    PubMed

    Takada, M; Lewis, B J; Boudreau, M; Al Anid, H; Bennett, L G I

    2007-01-01

    Correlations have been developed for implementation into the semi-empirical Predictive Code for Aircrew Radiation Exposure (PCAIRE) to account for effects of extremum conditions of solar modulation and low altitude based on transport code calculations. An improved solar modulation model, as proposed by NASA, has been further adopted to interpolate between the bounding correlations for solar modulation. The conversion ratio of effective dose to ambient dose equivalent, as applied to the PCAIRE calculation (based on measurements) for the legal regulation of aircrew exposure, was re-evaluated in this work to take into consideration new ICRP-92 radiation-weighting factors and different possible irradiation geometries of the source cosmic-radiation field. A computational analysis with Monte Carlo N-Particle eXtended Code was further used to estimate additional aircrew exposure that may result from sporadic solar energetic particle events considering real-time monitoring by the Geosynchronous Operational Environmental Satellite. These predictions were compared with the ambient dose equivalent rates measured on-board an aircraft and to count rate data observed at various ground-level neutron monitors.

  19. Cellular track model of biological damage to mammalian cell cultures from galactic cosmic rays

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Katz, Robert; Wilson, John W.; Townsend, Lawrence W.; Nealy, John E.; Shinn, Judy L.

    1991-01-01

    The assessment of biological damage from galactic cosmic rays (GCR) is of current interest for exploratory-class space missions, where highly ionizing, high-energy, high-charge (HZE) particles are the major concern. The relative biological effectiveness (RBE) values determined by ground-based experiments with HZE particles are well described by a parametric track theory of cell inactivation. Using the track model and a deterministic GCR transport code, the biological damage to mammalian cell cultures is considered for 1 year in free space at solar minimum for typical spacecraft shielding. Included are the effects of projectile and target fragmentation. The RBE values for the GCR spectrum, which are fluence-dependent in the track model, are found to be more severe than the quality factors identified by the International Commission on Radiological Protection publication 26 and seem to obey a simple scaling law with the duration period in free space.

  20. Modification and benchmarking of SKYSHINE-III for use with ISFSI cask arrays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hertel, N.E.; Napolitano, D.G.

    1997-12-01

    Dry cask storage arrays are becoming more and more common at nuclear power plants in the United States. Title 10 of the Code of Federal Regulations, Part 72, limits doses at the controlled area boundary of these independent spent-fuel storage installations (ISFSI) to 0.25 mSv (25 mrem)/yr. The minimum controlled area boundaries of such a facility are determined by cask array dose calculations, which include direct radiation and radiation scattered by the atmosphere, also known as skyshine. NAC International (NAC) uses SKYSHINE-III to calculate the gamma-ray and neutron dose rates as a function of distance from ISFSI arrays. In this paper, we present modifications to SKYSHINE-III that more explicitly model cask arrays. In addition, we have benchmarked the radiation transport methods used in SKYSHINE-III against ⁶⁰Co gamma-ray experiments and MCNP neutron calculations.

  1. Geant4 Modifications for Accurate Fission Simulations

    NASA Astrophysics Data System (ADS)

    Tan, Jiawei; Bendahan, Joseph

    Monte Carlo is one of the methods to simulate the generation and transport of radiation through matter. The most widely used radiation simulation codes are MCNP and Geant4. The simulation of fission production and transport by MCNP has been thoroughly benchmarked. There is an increasing number of users that prefer using Geant4 due to the flexibility of adding features. However, it has been found that Geant4 does not have the proper fission-production cross sections and does not produce the correct fission products. To achieve accurate results for studies in fissionable material applications, Geant4 was modified to correct these inaccuracies and to add new capabilities. The fission model developed by the Lawrence Livermore National Laboratory was integrated into the neutron-fission modeling package. The photofission simulation capability was enabled using the same neutron-fission library under the assumption that nuclei fission in the same way, independent of the excitation source. The modified fission code provides the correct multiplicity of prompt neutrons and gamma rays, and produces delayed gamma rays and neutrons with time and energy dependencies that are consistent with ENDF/B-VII. The delayed neutrons are now directly produced by a custom package that bypasses the fragment cascade model. The modifications were made for U-235, U-238 and Pu-239 isotopes; however, the new framework allows adding new isotopes easily. The SLAC nuclear data library is used for simulation of isotopes with an atomic number above 92 because such data are not available in Geant4. Results of the modified Geant4.10.1 package of neutron-fission and photofission for prompt and delayed radiation are compared with ENDF/B-VII and with results produced with the original package.
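    Getting the multiplicity right comes down to sampling the discrete distribution P(ν) per fission event. A minimal sketch of that basic operation; the probabilities below are illustrative, not evaluated ENDF/B-VII data:

```python
import random

def sample_multiplicity(pdf, rng):
    """Sample a prompt-neutron multiplicity nu from a discrete distribution
    P(nu) by inverting the cumulative sum, the basic operation a fission
    package performs for each fission event."""
    u, cum = rng.random(), 0.0
    for nu, p in enumerate(pdf):
        cum += p
        if u < cum:
            return nu
    return len(pdf) - 1

pdf = [0.03, 0.16, 0.33, 0.30, 0.14, 0.04]   # hypothetical P(nu), sums to 1
rng = random.Random(42)
mean = sum(sample_multiplicity(pdf, rng) for _ in range(200000)) / 200000
nubar = sum(n * p for n, p in enumerate(pdf))
print(abs(mean - nubar) < 0.02)  # True: sampled mean reproduces nu-bar
```

A benchmark of the kind the abstract describes checks exactly this: that the sampled multiplicities, timing, and energies reproduce the evaluated distributions.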

  2. Astronaut EVA exposure estimates from CAD model spacesuit geometry.

    PubMed

    De Angelis, Giovanni; Anderson, Brooke M; Atwell, William; Nealy, John E; Qualls, Garry D; Wilson, John W

    2004-03-01

    Ongoing assembly and maintenance activities at the International Space Station (ISS) require much more extravehicular activity (EVA) than did the earlier U.S. Space Shuttle missions. It is thus desirable to determine, analyze, and possibly foresee as accurately as possible the radiation exposures that crew members will experience during EVAs, in order to minimize risks and to establish exposure limits that must not be exceeded. A detailed CAD model of the U.S. Space Shuttle EVA Spacesuit, developed at NASA Langley Research Center (LaRC), is used to represent the directional shielding of an astronaut; it has detailed helmet and backpack structures, hard upper torso, and multilayer space suit fabric material. The NASA Computerized Anatomical Male and Female (CAM and CAF) models are used in conjunction with the space suit CAD model for dose evaluation within the human body. The particle environments are taken from the orbit-averaged NASA AP8 and AE8 models at solar cycle maxima and minima. The transport of energetic particles through space suit materials and body tissue is calculated by using the NASA LaRC HZETRN code for hadrons and a recently developed deterministic transport code, ELTRN, for electrons. The doses within the CAM and CAF models are determined from energy deposition at given target points along 968 directional rays convergent on the points and are evaluated for several points on the skin and within the body. Dosimetric quantities include contributions from primary protons, light ions, and electrons, as well as from secondary bremsstrahlung and target fragments. Directional dose patterns are displayed as rays and on spherical surfaces by the use of a color relative intensity representation.
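    The ray-convergent dose evaluation reduces to averaging a depth-dose response over the shielding thickness seen along each direction. A minimal sketch, assuming an isotropic field; the exponential response and the thickness values are hypothetical:

```python
import math

def point_dose(ray_thicknesses, depth_dose):
    """Directional shielding evaluation in the spirit of the CAD-ray method:
    evaluate a depth-dose response along each ray convergent on the target
    point and average over the directions (isotropic field assumed)."""
    return sum(depth_dose(t) for t in ray_thicknesses) / len(ray_thicknesses)

# Hypothetical exponential depth-dose response and a toy 8-ray shield model.
attn = lambda t: math.exp(-t / 10.0)
thicknesses = [1.0, 1.0, 2.0, 2.0, 5.0, 5.0, 20.0, 20.0]  # g/cm^2 per ray
dose = point_dose(thicknesses, attn)
print(0.0 < dose < 1.0)  # True: shielded dose is below the unshielded value
```

In the actual analysis each of the 968 rays carries the material traversal through suit and body computed from the CAD geometry, and the transport response comes from HZETRN/ELTRN rather than a simple exponential.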

  3. Ray-tracing 3D dust radiative transfer with DART-Ray: code upgrade and public release

    NASA Astrophysics Data System (ADS)

    Natale, Giovanni; Popescu, Cristina C.; Tuffs, Richard J.; Clarke, Adam J.; Debattista, Victor P.; Fischera, Jörg; Pasetto, Stefano; Rushton, Mark; Thirlwall, Jordan J.

    2017-11-01

    We present an extensively updated version of the purely ray-tracing 3D dust radiation transfer code DART-Ray. The new version includes five major upgrades: 1) a series of optimizations for the ray-angular density and the scattered radiation source function; 2) the implementation of several data and task parallelizations using hybrid MPI+OpenMP schemes; 3) the inclusion of dust self-heating; 4) the ability to produce surface brightness maps for observers within the models in HEALPix format; 5) the possibility to set the expected numerical accuracy already at the start of the calculation. We tested the updated code with benchmark models where the dust self-heating is not negligible. Furthermore, we performed a study of the extent of the source influence volumes, using galaxy models, which are critical in determining the efficiency of the DART-Ray algorithm. The new code is publicly available, documented for both users and developers, and accompanied by several programmes to create input grids for different model geometries and to import the results of N-body and SPH simulations. These programmes can be easily adapted to different input geometries, and for different dust models or stellar emission libraries.

  4. Systematic Comparison of Photoionized Plasma Codes with Application to Spectroscopic Studies of AGN in X-Rays

    NASA Technical Reports Server (NTRS)

    Mehdipour, M.; Kaastra, J. S.; Kallman, T.

    2016-01-01

    Atomic data and plasma models play a crucial role in the diagnosis and interpretation of astrophysical spectra, thus influencing our understanding of the Universe. In this investigation we present a systematic comparison of the leading photoionization codes to determine how much their intrinsic differences impact X-ray spectroscopic studies of hot plasmas in photoionization equilibrium. We carry out our computations using the Cloudy, SPEX, and XSTAR photoionization codes, and compare their derived thermal and ionization states for various ionizing spectral energy distributions. We examine the resulting absorption-line spectra from these codes for the case of ionized outflows in active galactic nuclei. By comparing the ionic abundances as a function of ionization parameter ξ, we find that on average there is about 30% deviation between the codes in the value of ξ at which ionic abundances peak. For H-like to B-like sequence ions alone, this deviation in ξ is smaller, at about 10% on average. The comparison of the absorption-line spectra in the X-ray band shows that there is on average about 30% deviation between the codes in the optical depth of the lines produced at log ξ ≈ 1 to 2, reducing to about 20% deviation at log ξ ≈ 3. We also simulate spectra of the ionized outflows with the current and upcoming high-resolution X-ray spectrometers, on board XMM-Newton, Chandra, Hitomi, and Athena. From these simulations we obtain the deviation on the best-fit model parameters, arising from the use of different photoionization codes, which is about 10% to 40%. We compare the modeling uncertainties with the observational uncertainties from the simulations. The results highlight the importance of continuous development and enhancement of photoionization codes for the upcoming era of X-ray astronomy with Athena.
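    The peak-position comparison reduces to locating, for each ion, the log ξ at which its fractional abundance is maximal in each code's output and differencing. A minimal sketch with hypothetical abundance curves (not Cloudy/SPEX/XSTAR output):

```python
def peak_logxi(logxi_grid, abundance):
    """Return the log(xi) value at which an ion's fractional abundance
    peaks, given abundances tabulated on a log(xi) grid."""
    return max(zip(abundance, logxi_grid))[1]

# Hypothetical abundance curves for one ion from two photoionization codes.
grid = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5]
code_a = [0.01, 0.20, 0.55, 0.30, 0.05, 0.01]
code_b = [0.01, 0.10, 0.40, 0.50, 0.10, 0.02]
deviation = abs(peak_logxi(grid, code_a) - peak_logxi(grid, code_b))
print(deviation)  # 0.5: the codes place this ion's peak half a dex apart
```

Repeating this per ion and averaging over the ion set yields the aggregate deviations the abstract quotes.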

  5. Beam-dynamics codes used at DARHT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ekdahl, Jr., Carl August

    Several beam simulation codes are used to help gain a better understanding of beam dynamics in the DARHT LIAs. The most notable of these fall into the following categories: for beam production, the Tricomp Trak orbit-tracking code and the LSP particle-in-cell (PIC) code; for beam transport and acceleration, the XTR static envelope and centroid code, the LAMDA time-resolved envelope and centroid code, and the LSP-Slice PIC code; and for coasting-beam transport to the target, the LAMDA time-resolved envelope code and the LSP-Slice PIC code. These codes are also being used to inform the design of Scorpius.

  6. SKYSHINE-III. Calculating Effects of Structure Design on Neutron Dose Rates in Air

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lampley, C.M.; Andrews, C.M.; Wells, M.B.

    1988-12-01

    SKYSHINE was designed to aid in the evaluation of the effects of structure geometry on the gamma-ray dose rate at given detector positions outside of a building housing gamma-ray sources. The program considers a rectangular structure enclosed by four walls and a roof. Each of the walls and the roof of the building may be subdivided into up to nine different areas, representing different materials or different thicknesses of the same material for those portions of the wall or roof. Basic sets of iron and concrete slab transmission and reflection data for 6.2 MeV gamma-rays are part of the SKYSHINE block data. These data, as well as parametric air transport data for line-beam sources at a number of energies between 0.6 MeV and 6.2 MeV and ranges to 3750 ft, are used to estimate the various components of the gamma-ray dose rate at positions outside of the building. The gamma-ray source is assumed to be a 6.2 MeV point-isotropic source. SKYSHINE-III provides an increase in versatility over the original SKYSHINE code in that it addresses both neutron and gamma-ray point sources. In addition, the emitted radiation may be characterized by an energy emission spectrum defined by the user. A new SKYSHINE data base is also included.

  7. Analytical Model for Estimating Terrestrial Cosmic Ray Fluxes Nearly Anytime and Anywhere in the World: Extension of PARMA/EXPACS.

    PubMed

    Sato, Tatsuhiko

    2015-01-01

    By extending our previously established model, here we present a new model called "PHITS-based Analytical Radiation Model in the Atmosphere (PARMA) version 3.0," which can instantaneously estimate terrestrial cosmic ray fluxes of neutrons, protons, ions with charge up to 28 (Ni), muons, electrons, positrons, and photons nearly anytime and anywhere in the Earth's atmosphere. The model comprises numerous analytical functions with parameters whose numerical values were fitted to reproduce the results of the extensive air shower (EAS) simulation performed by Particle and Heavy Ion Transport code System (PHITS). The accuracy of the EAS simulation was well verified using various experimental data, while that of PARMA3.0 was confirmed by the high R2 values of the fit. The models to be used for estimating radiation doses due to cosmic ray exposure, cosmic ray induced ionization rates, and count rates of neutron monitors were validated by investigating their capability to reproduce those quantities measured under various conditions. PARMA3.0 is available freely and is easy to use, as implemented in an open-access software program EXcel-based Program for Calculating Atmospheric Cosmic ray Spectrum (EXPACS). Because of these features, the new version of PARMA/EXPACS can be an important tool in various research fields such as geosciences, cosmic ray physics, and radiation research.
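
    The core idea of PARMA, analytical functions whose parameters are fitted to reproduce simulation output, can be illustrated with a toy fit. The power-law-with-cutoff form and all numbers below are illustrative assumptions, not the actual PARMA parameterization or PHITS output:

```python
import numpy as np

# Fit a toy analytic flux model, phi(E) = a * E**(-b) * exp(-E/c), to tabulated
# stand-in "simulation" results (all values invented for the sketch).
E = np.logspace(0, 4, 40)                      # energy grid (arbitrary units)
true = 50.0 * E**-1.7 * np.exp(-E / 2000.0)    # stand-in for EAS simulation output

# Linearize: log(phi) = log(a) - b*log(E) - E/c, then solve by linear least squares
A = np.column_stack([np.ones_like(E), -np.log(E), -E])
coef, *_ = np.linalg.lstsq(A, np.log(true), rcond=None)
a, b, c = np.exp(coef[0]), coef[1], 1.0 / coef[2]
print(f"fitted a={a:.1f}, b={b:.2f}, cutoff c={c:.0f}")
```

    With noise-free input the fit recovers the generating parameters exactly; with real simulation output the quality of such fits is what the quoted R2 values measure.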

  8. Analytical Model for Estimating Terrestrial Cosmic Ray Fluxes Nearly Anytime and Anywhere in the World: Extension of PARMA/EXPACS

    PubMed Central

    Sato, Tatsuhiko

    2015-01-01

    By extending our previously established model, here we present a new model called “PHITS-based Analytical Radiation Model in the Atmosphere (PARMA) version 3.0,” which can instantaneously estimate terrestrial cosmic ray fluxes of neutrons, protons, ions with charge up to 28 (Ni), muons, electrons, positrons, and photons nearly anytime and anywhere in the Earth’s atmosphere. The model comprises numerous analytical functions with parameters whose numerical values were fitted to reproduce the results of the extensive air shower (EAS) simulation performed by Particle and Heavy Ion Transport code System (PHITS). The accuracy of the EAS simulation was well verified using various experimental data, while that of PARMA3.0 was confirmed by the high R2 values of the fit. The models to be used for estimating radiation doses due to cosmic ray exposure, cosmic ray induced ionization rates, and count rates of neutron monitors were validated by investigating their capability to reproduce those quantities measured under various conditions. PARMA3.0 is available freely and is easy to use, as implemented in an open-access software program EXcel-based Program for Calculating Atmospheric Cosmic ray Spectrum (EXPACS). Because of these features, the new version of PARMA/EXPACS can be an important tool in various research fields such as geosciences, cosmic ray physics, and radiation research. PMID:26674183

  9. Modeling the Martian neutron and gamma-ray leakage fluxes using Geant4

    NASA Astrophysics Data System (ADS)

    Pirard, Benoit; Desorgher, Laurent; Diez, Benedicte; Gasnault, Olivier

    A new evaluation of the Martian neutron and gamma-ray (continuum and line) leakage fluxes has been performed using the Geant4 code. Although numerous studies have recently been carried out with Monte Carlo methods to characterize planetary radiation environments, only a few have been able to reproduce in detail the neutron and gamma-ray spectra observed in orbit. We report on the efforts performed to adapt and validate the Geant4-based PLANETOCOSMICS code for use in planetary neutron and gamma-ray spectroscopy data analysis. Besides the advantage of high transparency and modularity common to Geant4 applications, the new code uses reviewed nuclear cross section data, realistic atmospheric profiles and soil layering, as well as specific effects such as gravitational acceleration for low-energy neutrons. Results from first simulations are presented for some Martian reference compositions and show good consistency with the corresponding neutron and gamma-ray spectra measured on board Mars Odyssey. Finally we discuss the advantages and perspectives of the improved code for precise simulation of planetary radiation environments.

  10. Radiation transport around Kerr black holes

    NASA Astrophysics Data System (ADS)

    Schnittman, Jeremy David

    This Thesis describes the basic framework of a relativistic ray-tracing code for analyzing accretion processes around Kerr black holes. We begin in Chapter 1 with a brief historical summary of the major advances in black hole astrophysics over the past few decades. In Chapter 2 we present a detailed description of the ray-tracing code, which can be used to calculate the transfer function between the plane of the accretion disk and the detector plane, an important tool for modeling relativistically broadened emission lines. Observations from the Rossi X-Ray Timing Explorer have shown the existence of high frequency quasi-periodic oscillations (HFQPOs) in a number of black hole binary systems. In Chapter 3, we employ a simple "hot spot" model to explain the position and amplitude of these HFQPO peaks. The power spectrum of the periodic X-ray light curve consists of multiple peaks located at integral combinations of the black hole coordinate frequencies, with the relative amplitude of each peak determined by the orbital inclination, eccentricity, and hot spot arc length. In Chapter 4, we introduce additional features to the model to explain the broadening of the QPO peaks as well as the damping of higher frequency harmonics in the power spectrum. The complete model is used to fit the power spectra observed in XTE J1550-564, giving confidence limits on each of the model parameters. In Chapter 5 we present a description of the structure of a relativistic alpha-disk around a Kerr black hole. Given the surface temperature of the disk, the observed spectrum is calculated using the transfer function mentioned above. The features of this modified thermal spectrum may be used to infer the physical properties of the accretion disk and the central black hole. In Chapter 6 we develop a Monte Carlo code to calculate the detailed propagation of photons from a hot spot emitter scattering through a corona surrounding the black hole. The coronal scattering has two major observable effects: the inverse-Compton process alters the photon spectrum by adding a high energy power-law tail, and the random scattering of each photon effectively damps out the highest frequency modulations in the X-ray light curve. (Copies available exclusively from MIT Libraries, Rm. 14-0551, Cambridge, MA 02139-4307. Ph. 617-253-5668; Fax 617-253-1690.)
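
    The statement that the hot-spot power spectrum peaks at integer combinations of the coordinate frequencies can be illustrated with a toy light curve. The frequencies and amplitudes below are invented for the sketch, not fitted values from the thesis:

```python
import numpy as np

# Toy periodic "hot spot" light curve: a carrier at nu_phi plus sidebands at the
# beat combinations nu_phi +/- nu_r (all values in arbitrary frequency units).
nu_phi, nu_r = 2.0, 0.5
t = np.arange(0, 64, 1 / 64.0)                 # 4096 samples, frequency bin 1/64
flux = (1.0
        + 0.6 * np.cos(2 * np.pi * nu_phi * t)
        + 0.3 * np.cos(2 * np.pi * (nu_phi - nu_r) * t)
        + 0.2 * np.cos(2 * np.pi * (nu_phi + nu_r) * t))

power = np.abs(np.fft.rfft(flux - flux.mean())) ** 2
freqs = np.fft.rfftfreq(t.size, d=1 / 64.0)
peaks = freqs[np.argsort(power)[-3:]]          # three strongest spectral peaks
print(sorted(peaks))
```

    The three strongest peaks land at 1.5, 2.0, and 2.5, i.e. at the integer combinations of the two input frequencies, mirroring the structure the thesis attributes to the coordinate frequencies.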

  11. Monte Carlo simulation of photon buildup factors for shielding materials in diagnostic x-ray facilities.

    PubMed

    Kharrati, Hedi; Agrebi, Amel; Karoui, Mohamed Karim

    2012-10-01

    A simulation of buildup factors for ordinary concrete, steel, lead, plate glass, lead glass, and gypsum wallboard in broad beam geometry for photon energies from 10 keV to 150 keV at 5 keV intervals is presented. The Monte Carlo N-particle radiation transport computer code has been used to determine the buildup factors for the studied shielding materials. An example illustrating the use of the obtained buildup factor data in computing the broad beam transmission for tube potentials of 70, 100, 120, and 140 kVp is given. The half value layer, the tenth value layer, and the equilibrium tenth value layer are calculated from the broad beam transmission for these tube potentials. The obtained values, compared with those calculated from published data, show the ability of these data to predict shielding transmission curves. Therefore, the buildup factor data can be combined with primary, scatter, and leakage x-ray spectra to provide a computationally based solution to broad beam transmission for barriers in shielding x-ray facilities.
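
    A minimal sketch of the underlying relation, assuming the usual definition of broad-beam transmission as the buildup factor times the narrow-beam exponential. The attenuation coefficient and linear buildup model are toy values, not the paper's Monte Carlo data:

```python
import math

# Broad-beam transmission T(x) = B(x) * exp(-mu * x), with B the buildup factor.
mu = 0.5                     # linear attenuation coefficient (1/cm), toy value

def buildup(x):
    return 1.0 + mu * x      # toy buildup model; real B comes from MC tables

def transmission(x):
    return buildup(x) * math.exp(-mu * x)

# First half-value layer: the thickness where T falls to 0.5, found by bisection
lo, hi = 0.0, 50.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if transmission(mid) > 0.5 else (lo, mid)
hvl = 0.5 * (lo + hi)
print(f"broad-beam HVL = {hvl:.2f} cm")
```

    The tenth-value layer follows the same way with a 0.1 threshold; because of buildup, the broad-beam HVL here (about 3.36 cm) is notably larger than the narrow-beam value ln(2)/mu ≈ 1.39 cm.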

  12. Using the Monte Carlo technique to calculate dose conversion coefficients for medical professionals in interventional radiology

    NASA Astrophysics Data System (ADS)

    Santos, W. S.; Carvalho, A. B., Jr.; Hunt, J. G.; Maia, A. F.

    2014-02-01

    The objective of this study was to estimate doses to the physician and the nurse assistant at different positions during interventional radiology procedures. In this study, effective doses obtained for the physician and at points occupied by other workers were normalised by the air kerma-area product (KAP). The simulations were performed for two X-ray spectra (70 kVp and 87 kVp) using the radiation transport code MCNPX (version 2.7.0), and a pair of anthropomorphic voxel phantoms (MASH/FASH) used to represent both the patient and the medical professional at positions from 7 cm to 47 cm from the patient. The X-ray tube was represented by a point source positioned in the anterior-posterior (AP) and posterior-anterior (PA) projections. The resulting conversion coefficients (CCs) can be used to calculate effective doses, which in turn are related to stochastic effects. With knowledge of the CC values and the KAP measured on an X-ray unit under similar exposure conditions, medical professionals will be able to estimate their own effective dose.
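
    The final sentence describes a simple multiplication; a minimal sketch with invented numbers (the CC and KAP values below are assumptions for illustration, not results from the study):

```python
# Effective dose from a KAP-normalised conversion coefficient (toy values).
cc_physician = 0.12   # effective dose per unit KAP, uSv per Gy*cm^2 (assumed)
kap_measured = 25.0   # kerma-area product for one procedure, Gy*cm^2 (assumed)

effective_dose = cc_physician * kap_measured   # uSv
print(f"estimated effective dose: {effective_dose:.1f} uSv")
```

    In practice the CC would be selected for the matching tube potential, projection, and worker position from the tabulated simulation results.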

  13. Monte Carlo simulation of photon buildup factors for shielding materials in diagnostic x-ray facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kharrati, Hedi; Agrebi, Amel; Karoui, Mohamed Karim

    2012-10-15

    Purpose: A simulation of buildup factors for ordinary concrete, steel, lead, plate glass, lead glass, and gypsum wallboard in broad beam geometry for photon energies from 10 keV to 150 keV at 5 keV intervals is presented. Methods: The Monte Carlo N-particle radiation transport computer code has been used to determine the buildup factors for the studied shielding materials. Results: An example illustrating the use of the obtained buildup factor data in computing the broad beam transmission for tube potentials of 70, 100, 120, and 140 kVp is given. The half value layer, the tenth value layer, and the equilibrium tenth value layer are calculated from the broad beam transmission for these tube potentials. Conclusions: The obtained values, compared with those calculated from published data, show the ability of these data to predict shielding transmission curves. Therefore, the buildup factor data can be combined with primary, scatter, and leakage x-ray spectra to provide a computationally based solution to broad beam transmission for barriers in shielding x-ray facilities.

  14. Modeling IrisCode and its variants as convex polyhedral cones and its security implications.

    PubMed

    Kong, Adams Wai-Kin

    2013-03-01

    IrisCode, developed by Daugman in 1993, is the most influential iris recognition algorithm. A thorough understanding of IrisCode is essential, because over 100 million persons have been enrolled by this algorithm and many biometric personal identification and template protection methods have been developed based on IrisCode. This paper shows that a template produced by IrisCode or its variants is a convex polyhedral cone in a hyperspace. Its central ray, being a rough representation of the original biometric signal, can be computed by a simple algorithm, which can often be implemented in one Matlab command line. The central ray is an expected ray and also an optimal ray of an objective function on a group of distributions. This algorithm is derived from geometric properties of a convex polyhedral cone but does not rely on any prior knowledge (e.g., iris images). The experimental results show that biometric templates, including iris and palmprint templates, produced by different recognition methods can be matched through the central rays in their convex polyhedral cones, and that templates protected by a method extended from IrisCode can be broken into. These experimental results indicate that, without a thorough security analysis, convex polyhedral cone templates cannot be assumed secure. Additionally, the simplicity of the algorithm implies that even junior hackers without knowledge of advanced image processing and biometric databases can still break into protected templates and reveal relationships among templates produced by different recognition methods.
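
    One plausible reading of the convex-cone geometry, sketched with a hypothetical random filter bank. This is an illustration of the idea that a signed sum of the constraint directions approximates the original signal, not Daugman's actual filters or the paper's algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

# Each template bit is the sign of a linear filter response w_i . x, so the
# template defines a convex polyhedral cone {y : b_i * (w_i . y) >= 0}. A crude
# "central ray" estimate is the bit-signed sum of the filter vectors.
d, n_bits = 64, 256
W = rng.standard_normal((n_bits, d))      # stand-in filter bank (hypothetical)
x = rng.standard_normal(d)                # stand-in biometric signal
bits = np.sign(W @ x)                     # the binary template

central_ray = (bits[:, None] * W).sum(axis=0)   # one-line reconstruction
cos = central_ray @ x / (np.linalg.norm(central_ray) * np.linalg.norm(x))
print(f"cosine similarity to original signal: {cos:.2f}")
```

    Even this crude estimate correlates strongly with the hidden signal, which conveys why the paper treats the central ray as a privacy risk for cone-shaped templates.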

  15. 76 FR 2744 - Disclosure of Code-Share Service by Air Carriers and Sellers of Air Transportation

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-14

    ... DEPARTMENT OF TRANSPORTATION Office of the Secretary Disclosure of Code-Share Service by Air Carriers and Sellers of Air Transportation AGENCY: Office of the Secretary, Department of Transportation..., their agents, and third party sellers of air transportation in view of recent amendments to 49 U.S.C...

  16. Improvements in the MGA Code Provide Flexibility and Better Error Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruhter, W D; Kerr, J

    2005-05-26

    The Multi-Group Analysis (MGA) code is widely used to determine nondestructively the relative isotopic abundances of plutonium by gamma-ray spectrometry. MGA users have expressed concern about the lack of flexibility and transparency in the code. Users often have to ask the code developers for modifications to the code to accommodate new measurement situations, such as additional peaks being present in the plutonium spectrum or expected peaks being absent. We are testing several new improvements to a prototype, general gamma-ray isotopic analysis tool with the intent of either revising or replacing the MGA code. These improvements will give the user the ability to modify, add, or delete the gamma- and x-ray energies and branching intensities used by the code in determining a more precise gain and the relative detection efficiency. We have also fully integrated the determination of the relative isotopic abundances with the determination of the relative detection efficiency to provide a more accurate determination of the errors in the relative isotopic abundances. We provide details in this paper on these improvements and a comparison of results obtained with current versions of the MGA code.
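
    The coupling of relative detection efficiency and relative isotopic abundance rests on a standard gamma-spectrometry relation, sketched here with invented peak data (this shows the generic relation only, not MGA internals):

```python
import math

# Relative atom ratio of two isotopes from peak areas corrected for branching
# intensity and relative detection efficiency, then converted from activity to
# atoms via the decay constants. Peak values below are invented for the sketch.
peak_239 = (1.2e5, 1.0e-3, 0.80)   # (net area, branching, rel. efficiency), toy
peak_240 = (6.0e3, 1.0e-4, 0.75)   # (net area, branching, rel. efficiency), toy
half_life_239 = 24110.0            # Pu-239 half-life, years
half_life_240 = 6561.0             # Pu-240 half-life, years

def rel_activity(area, branching, efficiency):
    return area / (branching * efficiency)    # relative activity, arbitrary units

lam_239 = math.log(2) / half_life_239
lam_240 = math.log(2) / half_life_240
atom_ratio = (rel_activity(*peak_240) / lam_240) / (rel_activity(*peak_239) / lam_239)
print(f"N(Pu-240)/N(Pu-239) = {atom_ratio:.3f}")
```

    Because the efficiency terms enter every peak, an error in the relative efficiency curve propagates directly into the abundance ratios, which is why the abstract stresses integrating the two determinations.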

  17. Computer Code for Transportation Network Design and Analysis

    DOT National Transportation Integrated Search

    1977-01-01

    This document describes the results of research into the application of the mathematical programming technique of decomposition to practical transportation network problems. A computer code called Catnap (for Control Analysis Transportation Network A...

  18. Epp: A C++ EGSnrc user code for x-ray imaging and scattering simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lippuner, Jonas; Elbakri, Idris A.; Cui Congwu

    2011-03-15

    Purpose: Easy particle propagation (Epp) is a user code for the EGSnrc code package based on the C++ class library egspp. A main feature of egspp (and Epp) is the ability to use analytical objects to construct simulation geometries. The authors developed Epp to facilitate the simulation of x-ray imaging geometries, especially in the case of scatter studies. While direct use of egspp requires knowledge of C++, Epp requires no programming experience. Methods: Epp's features include calculation of dose deposited in a voxelized phantom and photon propagation to a user-defined imaging plane. Projection images of primary, single Rayleigh scattered, single Compton scattered, and multiple scattered photons may be generated. Epp input files can be nested, allowing for the construction of complex simulation geometries from more basic components. To demonstrate the imaging features of Epp, the authors simulate 38 keV x rays from a point source propagating through a water cylinder 12 cm in diameter, using both analytical and voxelized representations of the cylinder. The simulation generates projection images of primary and scattered photons at a user-defined imaging plane. The authors also simulate dose scoring in the voxelized version of the phantom in both Epp and DOSXYZnrc and examine the accuracy of Epp using the Kawrakow-Fippel test. Results: The results of the imaging simulations with Epp using voxelized and analytical descriptions of the water cylinder agree within 1%. The results of the Kawrakow-Fippel test suggest good agreement between Epp and DOSXYZnrc. Conclusions: Epp provides the user with useful features, including the ability to build complex geometries from simpler ones and the ability to generate images of scattered and primary photons. There is no inherent computational time saving arising from Epp, except for those arising from egspp's ability to use analytical representations of simulation geometries. Epp agrees with DOSXYZnrc in dose calculation, since they are both based on the well-validated standard EGSnrc radiation transport physics model.
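
    The bookkeeping Epp performs, tallying photons at an imaging plane separately by scatter order, can be sketched with a deliberately crude one-dimensional Monte Carlo. The coefficients are toy values and scattering is forward-only; this is not Epp or EGSnrc physics:

```python
import math, random

random.seed(1)

# Propagate photons through a slab and tally them at the exit plane by the
# number of scatters each underwent (primary / single / multiple).
mu = 0.2            # total attenuation coefficient (1/cm), toy value
scatter_frac = 0.6  # probability an interaction is a scatter (vs absorption)
thickness = 5.0     # slab thickness (cm)

tallies = {"primary": 0, "single": 0, "multiple": 0}
n = 20000
for _ in range(n):
    depth, scatters = 0.0, 0
    while True:
        depth += -math.log(random.random()) / mu    # sample a free path
        if depth >= thickness:                       # photon exits: tally it
            tallies[("primary", "single", "multiple")[min(scatters, 2)]] += 1
            break
        if random.random() < scatter_frac:
            scatters += 1                            # forward scatter (toy model)
        else:
            break                                    # absorbed in the slab
print(tallies)
```

    The primary fraction clusters around exp(-mu * thickness) ≈ 0.37, the narrow-beam attenuation, while the scattered tallies form the "scatter image" a code like Epp reports separately.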

  19. Electromagnetic processes in nucleus-nucleus collisions relating to space radiation research

    NASA Technical Reports Server (NTRS)

    Norbury, John W.

    1992-01-01

    Most of the papers within this report deal with electromagnetic processes in nucleus-nucleus collisions which are of concern in the space radiation program. In particular, the removal of one and two nucleons via both electromagnetic and strong interaction processes has been extensively investigated. The theory of relativistic Coulomb fission has also been developed. Several papers on quark models also appear. Finally, note that the theoretical methods developed in this work have been directly applied to the task of radiation protection of astronauts. This has been done by parameterizing the theoretical formalism in such a fashion that it can be used in cosmic ray transport codes.

  20. User's manual for a material transport code on the Octopus Computer Network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Naymik, T.G.; Mendez, G.D.

    1978-09-15

    A code to simulate material transport through porous media was developed at Oak Ridge National Laboratory. This code has been modified and adapted for use at Lawrence Livermore Laboratory. This manual, in conjunction with report ORNL-4928, explains the input, output, and execution of the code on the Octopus Computer Network.

  1. Filter-fluorescer measurement of low-voltage simulator x-ray energy spectra

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baldwin, G.T.; Craven, R.E.

    X-ray energy spectra of the Maxwell Laboratories MBS and the Physics International Pulserad 737 were measured using an eight-channel filter-fluorescer array. The PHOSCAT computer code was used to calculate channel response functions, and the UFO code to unfold the spectra.

  2. Implementation of Soft X-ray Tomography on NSTX

    NASA Astrophysics Data System (ADS)

    Tritz, K.; Stutman, D.; Finkenthal, M.; Granetz, R.; Menard, J.; Park, W.

    2003-10-01

    A set of poloidal ultrasoft X-ray arrays is operated by the Johns Hopkins group on NSTX. To enable MHD mode analysis independent of the magnetic reconstruction, the McCormick-Granetz tomography code developed at MIT is being adapted to the NSTX geometry. Tests of the code using synthetic data show that the present X-ray system is adequate for m=1 tomography. In addition, we have found that spline basis functions may be better suited than Bessel functions for the reconstruction of radially localized phenomena in NSTX. The tomography code was also used to determine the necessary array expansion and optimal array placement for the characterization of higher m modes (m=2,3) in the future. Initial reconstruction of experimental soft X-ray data has been performed for m=1 internal modes, which are often encountered in high beta NSTX discharges. The reconstruction of these modes will be compared to predictions from the M3D code and magnetic measurements.
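
    The basis-function approach mentioned above (spline versus Bessel bases) reduces tomographic inversion to linear least squares. A minimal sketch with hat-function (linear spline) bases and an invented chord geometry, not the McCormick-Granetz code:

```python
import numpy as np

# Chord-integrated measurements d = G @ c are inverted for basis coefficients c.
r = np.linspace(0, 1, 200)
centers = np.linspace(0, 1, 8)
width = centers[1] - centers[0]
basis = np.maximum(0, 1 - np.abs(r[:, None] - centers[None, :]) / width)  # (200, 8)

true_coeffs = np.array([1.0, 0.9, 0.7, 0.5, 0.3, 0.15, 0.05, 0.0])
emissivity = basis @ true_coeffs               # toy radial emissivity profile

rng = np.random.default_rng(2)
G_chords = rng.random((40, r.size)) / r.size   # 40 toy "line integral" weightings
data = G_chords @ emissivity                   # synthetic chord measurements

G = G_chords @ basis                           # geometry matrix in basis space
recovered, *_ = np.linalg.lstsq(G, data, rcond=None)
print(np.round(recovered, 2))                  # matches true_coeffs (noise-free)
```

    With noisy data the choice of basis matters: localized bases such as splines keep the inversion well conditioned for radially localized features, which is the advantage the abstract reports over Bessel functions.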

  3. Comparison of heavy-ion transport simulations: Collision integral in a box

    NASA Astrophysics Data System (ADS)

    Zhang, Ying-Xun; Wang, Yong-Jia; Colonna, Maria; Danielewicz, Pawel; Ono, Akira; Tsang, Manyee Betty; Wolter, Hermann; Xu, Jun; Chen, Lie-Wen; Cozma, Dan; Feng, Zhao-Qing; Das Gupta, Subal; Ikeno, Natsumi; Ko, Che-Ming; Li, Bao-An; Li, Qing-Feng; Li, Zhu-Xia; Mallik, Swagata; Nara, Yasushi; Ogawa, Tatsuhiko; Ohnishi, Akira; Oliinychenko, Dmytro; Papa, Massimo; Petersen, Hannah; Su, Jun; Song, Taesoo; Weil, Janus; Wang, Ning; Zhang, Feng-Shou; Zhang, Zhen

    2018-03-01

    Simulations by transport codes are indispensable for extracting valuable physical information from heavy-ion collisions. In order to understand the origins of discrepancies among different widely used transport codes, we compare 15 such codes under controlled conditions of a system confined to a box with periodic boundaries, initialized with Fermi-Dirac distributions at saturation density and temperatures of either 0 or 5 MeV. In such calculations, one is able to check separately the different ingredients of a transport code. In this second publication of the code evaluation project, we consider only the two-body collision term; i.e., we perform cascade calculations. When the Pauli blocking is artificially suppressed, the collision rates are found to be consistent for most codes (to within 1% or better) with analytical results, or with completely controlled results of a basic cascade code. In order to reach that goal, it was necessary to eliminate correlations within the same pair of colliding particles that can be present depending on the adopted collision prescription. In calculations with active Pauli blocking, the blocking probability was found to deviate from the expected reference values. The reason is found in substantial phase-space fluctuations and smearing tied to numerical algorithms and model assumptions in the representation of phase space. This results in a reduction of the blocking probability in most transport codes, so that the simulated system gradually evolves away from the Fermi-Dirac toward a Boltzmann distribution. Since the numerical fluctuations are weaker in the Boltzmann-Uehling-Uhlenbeck codes, the Fermi-Dirac statistics is maintained there for a longer time than in the quantum molecular dynamics codes. As a result of this investigation, we are able to make judgements about the most effective strategies in transport simulations for determining the collision probabilities and the Pauli blocking. Investigation in a similar vein of other ingredients in transport calculations, like the mean-field propagation or the production of nucleon resonances and mesons, will be discussed in future publications.
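
    The kind of analytic reference used for the collision-rate check can be illustrated for a nonrelativistic Maxwell-Boltzmann gas, where the mean relative speed between particles is sqrt(2) times the mean speed, fixing the expected cascade collision rate n·sigma·⟨v_rel⟩. This is a toy consistency check, not one of the 15 compared codes:

```python
import numpy as np

rng = np.random.default_rng(0)

# Sample Maxwell-Boltzmann velocities (unit thermal width) and verify the
# analytic relation <v_rel> = sqrt(2) * <v> used in collision-rate references.
n = 200_000
v = rng.normal(0.0, 1.0, size=(n, 3))
speeds = np.linalg.norm(v, axis=1)

pairs = rng.permutation(n)                       # random pairing of particles
v_rel = np.linalg.norm(v - v[pairs], axis=1)     # relative speeds of the pairs

ratio = v_rel.mean() / speeds.mean()
print(f"<v_rel>/<v> = {ratio:.3f} (analytic: {np.sqrt(2):.3f})")
```

    A cascade whose counted collision rate fails this kind of analytic benchmark signals spurious pair correlations of the sort the comparison project had to eliminate.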

  4. Closing Report for NASA Cooperative Agreement NASA-1-242

    NASA Technical Reports Server (NTRS)

    Maung, Khin Maung

    1999-01-01

    Reliable estimates of exposures due to ionizing radiation are of paramount importance in achieving human exploration and development of space, and in several technologically important and scientifically significant areas impacting industrial and public health. For proper assessment of radiation exposures, reliable transport codes are needed. An essential input to the transport codes is information about the interaction of ions and neutrons with matter, most of which is supplied as nuclear cross-section data. In order to obtain an accurate parameterization of cross-section data, theoretical input is indispensable, especially for processes where there is little or no experimental data available. During the grant period, a reliable database was developed, along with a phenomenological model for the total absorption cross sections valid for any charged or uncharged light, medium, and heavy collision pairs over the entire energy range. It is gratifying to note the success of the model. The cross-section model has been adopted and is in use in NASA cosmic ray detector development projects, the radiation protection and shielding programs, and several DoE laboratories and institutions. A list of the publications based on the work done during the grant period is given below, and a sample copy of one of the papers is enclosed with this report.

  5. Verification and benchmark testing of the NUFT computer code

    NASA Astrophysics Data System (ADS)

    Lee, K. H.; Nitao, J. J.; Kulshrestha, A.

    1993-10-01

    This interim report presents results of work completed in the ongoing verification and benchmark testing of the NUFT (Nonisothermal Unsaturated-saturated Flow and Transport) computer code. NUFT is a suite of multiphase, multicomponent models for numerical solution of thermal and isothermal flow and transport in porous media, with application to subsurface contaminant transport problems. The code simulates the coupled transport of heat, fluids, and chemical components, including volatile organic compounds. Grid systems may be Cartesian or cylindrical, with one-, two-, or fully three-dimensional configurations possible. In this initial phase of testing, the NUFT code was used to solve seven one-dimensional unsaturated flow and heat transfer problems. Three verification and four benchmarking problems were solved. In the verification testing, excellent agreement was observed between NUFT results and the analytical or quasianalytical solutions. In the benchmark testing, results of code intercomparison were very satisfactory. From these testing results, it is concluded that the NUFT code is ready for application to field and laboratory problems similar to those addressed here. Multidimensional problems, including those dealing with chemical transport, will be addressed in a subsequent report.
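
    The style of verification testing described, a numerical solution checked against an analytical reference, can be sketched for 1-D diffusion with a fixed-concentration boundary. The parameters are toy values, not one of the NUFT test problems:

```python
import math

# Explicit finite differences for 1-D diffusion with c(0, t) = 1 and c(x, 0) = 0,
# verified against the analytical solution c(x, t) = erfc(x / (2 * sqrt(D * t))).
D, dx, dt = 1.0e-2, 0.05, 0.05        # diffusivity, grid spacing, time step
nx, steps = 200, 400                   # grid size and number of time steps
assert D * dt / dx**2 <= 0.5           # explicit-scheme stability criterion

c = [0.0] * nx
c[0] = 1.0                             # boundary held at c = 1
for _ in range(steps):
    new = c[:]
    for i in range(1, nx - 1):
        new[i] = c[i] + D * dt / dx**2 * (c[i+1] - 2*c[i] + c[i-1])
    c = new
    c[0] = 1.0

t = steps * dt
x_check = 10 * dx                      # compare at x = 0.5
analytic = math.erfc(x_check / (2 * math.sqrt(D * t)))
print(f"numeric {c[10]:.4f} vs analytic {analytic:.4f}")
```

    Agreement to within a few percent of the erfc solution is the pass criterion for this kind of verification problem; benchmarking, by contrast, compares one code against another.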

  6. Surface applicator of a miniature X-ray tube for superficial electronic brachytherapy of skin cancer.

    PubMed

    Kim, Hyun Nam; Lee, Ju Hyuk; Park, Han Beom; Kim, Hyun Jin; Cho, Sung Oh

    2018-01-01

    We designed and fabricated a surface applicator for a novel carbon nanotube (CNT)-based miniature X-ray tube for use in superficial electronic brachytherapy of skin cancer. To investigate the effectiveness of the surface applicator, its performance was numerically and experimentally analyzed. The surface applicator consists of a graphite flattening filter and an X-ray shield. A Monte Carlo radiation transport code, MCNP6, was used to optimize the geometries of both the flattening filter and the shield so that X-rays are generated uniformly over the desired region. The performance of the graphite filter was compared with that of conventional aluminum (Al) filters of different geometries using the numerical simulations. After fabricating the surface applicator, the X-ray spatial distribution was measured to evaluate its performance. The graphite filter shows better spatial dose uniformity and less dose distortion than the Al filters. Moreover, graphite allows easy fabrication of the flattening filter owing to its low X-ray attenuation, which is particularly important for low-energy electronic brachytherapy. The applicator also requires no further X-ray shielding because unwanted X-rays are completely blocked. As a result, a highly uniform X-ray dose distribution was achieved from the miniature X-ray tube mounted with the surface applicator. The measured values of both flatness and symmetry were less than 5%, and the measured penumbra values were less than 1 mm. All these values satisfy the currently accepted tolerance criteria for radiation therapy. The surface applicator thus exhibits sufficient performance for application in electronic brachytherapy of skin cancers. © 2017 American Association of Physicists in Medicine.
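
    The flatness and symmetry figures quoted are standard beam-profile metrics. A sketch of one common way to evaluate them over the central 80% of the field, using an invented dose profile rather than the measured one (the exact definitions in use vary by protocol):

```python
import numpy as np

# Toy, nearly flat dose profile across the field; flatness and symmetry are
# evaluated over the central 80% of an assumed 30 mm field.
x = np.linspace(-20, 20, 401)            # position across the field (mm)
dose = np.exp(-(x / 25.0) ** 4)          # invented profile, peak normalized to 1
field_half_width = 15.0                  # field half-width (mm), assumed

center = x.size // 2                     # index of x = 0
k = round(0.8 * field_half_width / (x[1] - x[0]))   # samples out to 80% of edge

d = dose[center - k: center + k + 1]
flatness = 100 * (d.max() - d.min()) / (d.max() + d.min())

left = dose[center - k: center + 1][::-1]   # x = 0 ... -12 mm, mirrored
right = dose[center: center + k + 1]        # x = 0 ... +12 mm
symmetry = 100 * np.max(np.abs(left - right) / (left + right))
print(f"flatness = {flatness:.2f}%, symmetry = {symmetry:.2f}%")
```

    For this symmetric toy profile the symmetry is essentially zero and the flatness is well under the 5% tolerance the abstract cites.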

  7. 49 CFR Appendix C to Part 229 - FRA Locomotive Standards-Code of Defects

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 4 2013-10-01 2013-10-01 false FRA Locomotive Standards-Code of Defects C Appendix C to Part 229 Transportation Other Regulations Relating to Transportation (Continued) FEDERAL RAILROAD ADMINISTRATION, DEPARTMENT OF TRANSPORTATION RAILROAD LOCOMOTIVE SAFETY STANDARDS Pt. 229, App. C...

  8. 49 CFR Appendix C to Part 229 - FRA Locomotive Standards-Code of Defects

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 4 2014-10-01 2014-10-01 false FRA Locomotive Standards-Code of Defects C Appendix C to Part 229 Transportation Other Regulations Relating to Transportation (Continued) FEDERAL RAILROAD ADMINISTRATION, DEPARTMENT OF TRANSPORTATION RAILROAD LOCOMOTIVE SAFETY STANDARDS Pt. 229, App. C...

  9. 49 CFR Appendix C to Part 229 - FRA Locomotive Standards-Code of Defects

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 4 2010-10-01 2010-10-01 false FRA Locomotive Standards-Code of Defects C Appendix C to Part 229 Transportation Other Regulations Relating to Transportation (Continued) FEDERAL RAILROAD ADMINISTRATION, DEPARTMENT OF TRANSPORTATION RAILROAD LOCOMOTIVE SAFETY STANDARDS Pt. 229, App. C...

  10. Simulating Gamma-Ray Emission in Star-forming Galaxies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pfrommer, Christoph; Pakmor, Rüdiger; Simpson, Christine M.

    Star-forming galaxies emit GeV and TeV gamma-rays that are thought to originate from hadronic interactions of cosmic-ray (CR) nuclei with the interstellar medium. To understand the emission, we have used the moving-mesh code Arepo to perform magnetohydrodynamical galaxy formation simulations with self-consistent CR physics. Our galaxy models exhibit a first burst of star formation that injects CRs at supernovae. Once CRs have sufficiently accumulated in our Milky Way–like galaxy, their buoyancy force overcomes the magnetic tension of the toroidal disk field. As field lines open up, they enable anisotropically diffusing CRs to escape into the halo and to accelerate a bubble-like, CR-dominated outflow. However, these bubbles are invisible in our simulated gamma-ray maps of hadronic pion-decay and secondary inverse-Compton emission because of low gas density in the outflows. By adopting a phenomenological relation between star formation rate (SFR) and far-infrared emission and assuming that gamma-rays mainly originate from decaying pions, our simulated galaxies can reproduce the observed tight relation between far-infrared and gamma-ray emission, independent of whether we account for anisotropic CR diffusion. This demonstrates that uncertainties in modeling active CR transport processes only play a minor role in predicting gamma-ray emission from galaxies. We find that in starbursts, most of the CR energy is “calorimetrically” lost to hadronic interactions. In contrast, the gamma-ray emission deviates from this calorimetric property at low SFRs due to adiabatic losses, which cannot be identified in traditional one-zone models.

  11. Simulating Gamma-Ray Emission in Star-forming Galaxies

    NASA Astrophysics Data System (ADS)

    Pfrommer, Christoph; Pakmor, Rüdiger; Simpson, Christine M.; Springel, Volker

    2017-10-01

    Star-forming galaxies emit GeV and TeV gamma-rays that are thought to originate from hadronic interactions of cosmic-ray (CR) nuclei with the interstellar medium. To understand the emission, we have used the moving-mesh code Arepo to perform magnetohydrodynamical galaxy formation simulations with self-consistent CR physics. Our galaxy models exhibit a first burst of star formation that injects CRs at supernovae. Once CRs have sufficiently accumulated in our Milky Way-like galaxy, their buoyancy force overcomes the magnetic tension of the toroidal disk field. As field lines open up, they enable anisotropically diffusing CRs to escape into the halo and to accelerate a bubble-like, CR-dominated outflow. However, these bubbles are invisible in our simulated gamma-ray maps of hadronic pion-decay and secondary inverse-Compton emission because of low gas density in the outflows. By adopting a phenomenological relation between star formation rate (SFR) and far-infrared emission and assuming that gamma-rays mainly originate from decaying pions, our simulated galaxies can reproduce the observed tight relation between far-infrared and gamma-ray emission, independent of whether we account for anisotropic CR diffusion. This demonstrates that uncertainties in modeling active CR transport processes only play a minor role in predicting gamma-ray emission from galaxies. We find that in starbursts, most of the CR energy is “calorimetrically” lost to hadronic interactions. In contrast, the gamma-ray emission deviates from this calorimetric property at low SFRs due to adiabatic losses, which cannot be identified in traditional one-zone models.
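The calorimetric argument above can be sketched numerically: assuming a linear SFR-to-FIR conversion and a CR injection power proportional to the SFR (both coefficients below are illustrative placeholders, not values from the paper), the pion-decay gamma-ray luminosity scales with the fraction of CR energy lost hadronically.

```python
def fir_luminosity(sfr):
    """Far-infrared luminosity (erg/s) from the star formation rate
    (Msun/yr), via a linear Kennicutt-style scaling; the coefficient is
    an illustrative placeholder, not the paper's calibration."""
    return 5.8e43 * sfr

def gamma_ray_luminosity(sfr, eta_cal=1.0):
    """Pion-decay gamma-ray luminosity: a fraction eta_cal of the CR
    power (assumed proportional to SFR) is lost hadronically, and about
    one third of the pion energy emerges as gamma-rays."""
    l_cr = 1.0e40 * sfr          # assumed CR injection power per unit SFR
    return eta_cal * l_cr / 3.0

# A starburst is nearly calorimetric (eta_cal ~ 1); at low SFR adiabatic
# losses reduce eta_cal, bending the relation away from linearity.
print(gamma_ray_luminosity(10.0, eta_cal=1.0))
print(gamma_ray_luminosity(0.1, eta_cal=0.3))
```

Because both luminosities are linear in SFR when eta_cal is constant, the tight FIR–gamma relation in the abstract follows immediately; deviations at low SFR enter only through eta_cal.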

  12. 49 CFR Appendix C to Part 215 - FRA Freight Car Standards Defect Code

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 4 2013-10-01 2013-10-01 false FRA Freight Car Standards Defect Code C Appendix C... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION RAILROAD FREIGHT CAR SAFETY STANDARDS Pt. 215, App. C Appendix C to Part 215—FRA Freight Car Standards Defect Code The following defect code has been established for use...

  13. 49 CFR Appendix C to Part 215 - FRA Freight Car Standards Defect Code

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 4 2014-10-01 2014-10-01 false FRA Freight Car Standards Defect Code C Appendix C... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION RAILROAD FREIGHT CAR SAFETY STANDARDS Pt. 215, App. C Appendix C to Part 215—FRA Freight Car Standards Defect Code The following defect code has been established for use...

  14. 49 CFR Appendix C to Part 215 - FRA Freight Car Standards Defect Code

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 4 2012-10-01 2012-10-01 false FRA Freight Car Standards Defect Code C Appendix C... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION RAILROAD FREIGHT CAR SAFETY STANDARDS Pt. 215, App. C Appendix C to Part 215—FRA Freight Car Standards Defect Code The following defect code has been established for use...

  15. 49 CFR Appendix C to Part 215 - FRA Freight Car Standards Defect Code

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 4 2011-10-01 2011-10-01 false FRA Freight Car Standards Defect Code C Appendix C... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION RAILROAD FREIGHT CAR SAFETY STANDARDS Pt. 215, App. C Appendix C to Part 215—FRA Freight Car Standards Defect Code The following defect code has been established for use...

  16. Methods of treating complex space vehicle geometry for charged particle radiation transport

    NASA Technical Reports Server (NTRS)

    Hill, C. W.

    1973-01-01

Current methods of treating complex geometry models for space radiation transport calculations are reviewed. The geometric techniques used in three computer codes are outlined, and evaluations of geometric capability and speed are provided for these codes. Although no code development work is included, several suggestions for significantly improving complex geometry codes are offered.

  17. Comparisons of anomalous and collisional radial transport with a continuum kinetic edge code

    NASA Astrophysics Data System (ADS)

    Bodi, K.; Krasheninnikov, S.; Cohen, R.; Rognlien, T.

    2009-05-01

    Modeling of anomalous (turbulence-driven) radial transport in controlled-fusion plasmas is necessary for long-time transport simulations. Here the focus is continuum kinetic edge codes such as the (2-D, 2-V) transport version of TEMPEST, NEO, and the code being developed by the Edge Simulation Laboratory, but the model also has wider application. Our previously developed anomalous diagonal transport matrix model with velocity-dependent convection and diffusion coefficients allows contact with typical fluid transport models (e.g., UEDGE). Results are presented that combine the anomalous transport model and collisional transport owing to ion drift orbits utilizing a Krook collision operator that conserves density and energy. Comparison is made of the relative magnitudes and possible synergistic effects of the two processes for typical tokamak device parameters.

  18. Monte Carlo Neutrino Transport through Remnant Disks from Neutron Star Mergers

    NASA Astrophysics Data System (ADS)

    Richers, Sherwood; Kasen, Daniel; O'Connor, Evan; Fernández, Rodrigo; Ott, Christian D.

    2015-11-01

We present Sedonu, a new open source, steady-state, special relativistic Monte Carlo (MC) neutrino transport code, available at bitbucket.org/srichers/sedonu. The code calculates the energy- and angle-dependent neutrino distribution function on fluid backgrounds of any number of spatial dimensions, calculates the rates of change of fluid internal energy and electron fraction, and solves for the equilibrium fluid temperature and electron fraction. We apply this method to snapshots from two-dimensional simulations of accretion disks left behind by binary neutron star mergers, varying the input physics and comparing to the results obtained with a leakage scheme for the cases of a central black hole and a central hypermassive neutron star. Neutrinos are guided away from the densest regions of the disk and escape preferentially around 45° from the equatorial plane. Neutrino heating is strengthened by MC transport a few scale heights above the disk midplane near the innermost stable circular orbit, potentially leading to a stronger neutrino-driven wind. Neutrino cooling in the dense midplane of the disk is stronger when using MC transport, leading to a globally higher cooling rate by a factor of a few and a larger leptonization rate by an order of magnitude. We calculate neutrino pair annihilation rates and estimate that an energy of 2.8 × 10⁴⁶ erg is deposited within 45° of the symmetry axis over 300 ms when a central BH is present. Similarly, 1.9 × 10⁴⁸ erg is deposited over 3 s when an HMNS sits at the center, but neither estimate is likely to be sufficient to drive a gamma-ray burst jet.
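The core MC step that codes of this kind rely on, sampling an exponential free path and tallying absorption versus escape, can be illustrated with a toy slab problem (a generic sketch, not Sedonu's actual implementation):

```python
import math
import random

def transport_packets(n_packets, tau_slab, albedo=0.0, seed=1):
    """Transport MC packets through a uniform slab of total optical depth
    tau_slab and return the escape fraction. Scattering is isotropic with
    single-scatter albedo `albedo`. Toy sketch, not Sedonu itself."""
    rng = random.Random(seed)
    escaped = 0
    for _ in range(n_packets):
        tau_pos, mu = 0.0, 1.0                   # depth travelled, direction cosine
        while True:
            tau_step = -math.log(1.0 - rng.random())  # sampled free path
            tau_pos += mu * tau_step
            if tau_pos >= tau_slab:              # out the far face
                escaped += 1
                break
            if tau_pos < 0.0:                    # back-scattered out the near face
                escaped += 1
                break
            if rng.random() > albedo:            # interaction was an absorption
                break
            mu = 2.0 * rng.random() - 1.0        # isotropic re-emission
    return escaped / n_packets

print(transport_packets(20000, 0.1))  # thin slab: most packets escape
```

For pure absorption (albedo = 0) the escape fraction converges to exp(-tau), which makes the sketch easy to validate before adding scattering, relativity, or multi-dimensional geometry.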

  19. Study of transport of laser-driven relativistic electrons in solid materials

    NASA Astrophysics Data System (ADS)

    Leblanc, Philippe

With the ultra-intense lasers available today, it is possible to generate very hot electron beams in solid density materials. These intense laser-matter interactions enable many applications, including the generation of ultrashort secondary sources of particles and radiation such as ions, neutrons, positrons, and x-rays, or even laser-driven hadron therapy. For these applications to become reality, a comprehensive understanding of laser-driven energy transport is required, including hot electron generation through the various mechanisms of ionization and the subsequent transport of these electrons in solid density media. This study focuses on the characterization of electron transport effects in solid density targets using the state-of-the-art particle-in-cell code PICLS. A number of simulation results are presented on the topics of ionization propagation in insulator glass targets, non-equilibrium ionization modeling featuring electron impact ionization, and electron beam guiding by the self-generated resistive magnetic field. An empirically derived scaling relation for the resistive magnetic field in terms of the laser parameters and material properties is presented and used to derive a guiding condition. This condition may prove useful for the design of future laser-matter interaction experiments.

  20. Study of transport phenomena in laser-driven, non- equilibrium plasmas in the presence of external magnetic fields

    NASA Astrophysics Data System (ADS)

    Kemp, G. Elijah; Mariscal, D. A.; Williams, G. J.; Blue, B. E.; Colvin, J. D.; Fears, T. M.; Kerr, S. M.; May, M. J.; Moody, J. D.; Strozzi, D. J.; Lefevre, H. J.; Klein, S. R.; Kuranz, C. C.; Manuel, M. J.-E.; Gautier, D. C.; Montgomery, D. S.

    2017-10-01

We present experimental and simulation results from a study of thermal transport inhibition in laser-driven, mid-Z, non-equilibrium plasmas in the presence of external magnetic fields. The experiments were performed at the Jupiter Laser Facility at LLNL, where x-ray spectroscopy, proton radiography, and Brillouin backscatter data were simultaneously acquired from sub-critical-density, Ti-doped silica aerogel foams driven by a 2ω laser at 5 × 10¹⁴ W/cm². External B-field strengths up to 20 T (aligned antiparallel to the laser propagation axis) were provided by a capacitor-bank-driven Helmholtz coil. Pre-shot simulations with Hydra, a radiation-magnetohydrodynamics code, showed increasing electron plasma temperature with increasing B-field strength, the result of thermal transport inhibition perpendicular to the B-field. The influence of this thermal transport inhibition on the experimental observables as a function of external field strength and target density is shown and compared with simulations. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract No. DE-AC52-07NA27344 and funded by LDRD project 17-ERD-027.

  1. Diffusive, Supersonic X-ray Transport in Foam Cylinders

    NASA Astrophysics Data System (ADS)

    Back, Christina A.

    1999-11-01

Diffusive supersonic radiation transport, in which the ratio of the diffusive radiation front velocity to the material sound speed exceeds 2, has been studied in a series of laboratory experiments on low-density foams. This work is of interest for radiation transport in basic science and astrophysics. Marshak radiation wave transport is studied for both low- and high-Z foam materials and for foams of different lengths in a novel hohlraum geometry that allows direct comparisons with 2-dimensional analytic models and code simulations. The radiation wave is created by a ~80 eV near-blackbody 12-ns-long drive or a ~200 eV 1.2-2.4 ns-long drive generated by laser-heated Au hohlraums. The targets are SiO2 and Ta2O5 aerogel foams of varying lengths which span 10 to 50 mg/cc densities. Clean signatures of radiation breakout were observed by radially resolved face-on transmission measurements of the radiation flux at a photon energy of 250 eV or 550 eV. The high-quality data provide new detailed information on the importance of both the fill and wall material opacities and heat capacities in determining the radiation front speed and curvature.
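The supersonic criterion quoted above can be made concrete with the familiar self-similar Marshak scaling x_F ∝ √t. The coefficient and sound speed below are illustrative numbers chosen for demonstration, not values from the experiments:

```python
def marshak_front(t, c_coeff=300.0):
    """Radiation-front position (cm) for a self-similar Marshak wave,
    x_F = C * sqrt(t); C lumps drive temperature, opacity, and heat
    capacity and is an illustrative placeholder here."""
    return c_coeff * t ** 0.5

def front_speed(t, c_coeff=300.0):
    """Front velocity dx_F/dt, the analytic derivative of the scaling."""
    return 0.5 * c_coeff / t ** 0.5

# Supersonic criterion from the abstract: front speed / sound speed > 2.
c_sound = 1.0e6              # cm/s, assumed foam sound speed
t = 1.0e-9                   # 1 ns into the drive
print(front_speed(t) / c_sound > 2)  # True
```

Because the front decelerates as t^(-1/2) while the heated material's sound speed stays roughly fixed, the wave is supersonic early in the drive and the criterion eventually fails, which is why drive duration and foam length matter in the experiment design.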

  2. Light transport feature for SCINFUL.

    PubMed

    Etaati, G R; Ghal-Eh, N

    2008-03-01

    An extended version of the scintillator response function prediction code SCINFUL has been developed by incorporating PHOTRACK, a Monte Carlo light transport code. Comparisons of calculated and experimental results for organic scintillators exposed to neutrons show that the extended code improves the predictive capability of SCINFUL.

  3. First ERO2.0 modeling of Be erosion and non-local transport in JET ITER-like wall

    NASA Astrophysics Data System (ADS)

    Romazanov, J.; Borodin, D.; Kirschner, A.; Brezinsek, S.; Silburn, S.; Huber, A.; Huber, V.; Bufferand, H.; Firdaouss, M.; Brömmel, D.; Steinbusch, B.; Gibbon, P.; Lasa, A.; Borodkina, I.; Eksaeva, A.; Linsmeier, Ch; Contributors, JET

    2017-12-01

    ERO is a Monte-Carlo code for modeling plasma-wall interaction and 3D plasma impurity transport for applications in fusion research. The code has undergone a significant upgrade (ERO2.0) which allows increasing the simulation volume in order to cover the entire plasma edge of a fusion device, allowing a more self-consistent treatment of impurity transport and comparison with a larger number and variety of experimental diagnostics. In this contribution, the physics-relevant technical innovations of the new code version are described and discussed. The new capabilities of the code are demonstrated by modeling of beryllium (Be) erosion of the main wall during JET limiter discharges. Results for erosion patterns along the limiter surfaces and global Be transport including incident particle distributions are presented. A novel synthetic diagnostic, which mimics experimental wide-angle 2D camera images, is presented and used for validating various aspects of the code, including erosion, magnetic shadowing, non-local impurity transport, and light emission simulation.

  4. HZETRN: A heavy ion/nucleon transport code for space radiations

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Chun, Sang Y.; Badavi, Forooz F.; Townsend, Lawrence W.; Lamkin, Stanley L.

    1991-01-01

    The galactic heavy ion transport code (GCRTRN) and the nucleon transport code (BRYNTRN) are integrated into a code package (HZETRN). The code package is computer efficient and capable of operating in an engineering design environment for manned deep space mission studies. The nuclear data set used by the code is discussed including current limitations. Although the heavy ion nuclear cross sections are assumed constant, the nucleon-nuclear cross sections of BRYNTRN with full energy dependence are used. The relation of the final code to the Boltzmann equation is discussed in the context of simplifying assumptions. Error generation and propagation is discussed, and comparison is made with simplified analytic solutions to test numerical accuracy of the final results. A brief discussion of biological issues and their impact on fundamental developments in shielding technology is given.
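In the straight-ahead, continuous-slowing-down spirit of such deterministic codes, the depth evolution of a primary flux with no secondary production reduces to a simple marching scheme. The sketch below is generic (not HZETRN's numerics) and recovers the Beer-Lambert analytic solution, which is exactly the kind of simplified analytic comparison the abstract describes for testing numerical accuracy:

```python
import math

def march_flux(sigma, depth, steps=1000, phi0=1.0):
    """March the straight-ahead Boltzmann equation d(phi)/dx = -sigma*phi
    through a slab (no secondary production), as a toy version of the
    marching schemes used in deterministic transport codes."""
    dx = depth / steps
    phi = phi0
    for _ in range(steps):
        phi *= (1.0 - sigma * dx)   # explicit first-order attenuation step
    return phi

# Compare with the analytic Beer-Lambert solution exp(-sigma * depth).
print(march_flux(0.5, 4.0), math.exp(-2.0))
```

The first-order error shrinks with the step count; comparing the marched result against the closed-form exponential is the standard way to verify error generation and propagation in a transport solver before secondaries are switched on.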

  5. Table-top laser-driven ultrashort electron and X-ray source: the CIBER-X source project

    NASA Astrophysics Data System (ADS)

    Girardeau-Montaut, Jean-Pierre; Kiraly, Bélà; Girardeau-Montaut, Claire; Leboutet, Hubert

    2000-09-01

We report on the development of a new laser-driven table-top ultrashort electron and X-ray source, also called the CIBER-X source (source Compacte d'Impulsions Brèves d'Electrons et de Rayons X). X-ray pulses are produced by a three-step process which starts with photoelectron emission from a thin metallic photocathode illuminated by 16 ps duration laser pulses at 213 nm. The e-gun is a standard Pierce diode electrode type, in which electrons are accelerated by a cw electric field of ~11 MV/m up to a hole made in the anode. The photoinjector produces a train of 70-80 keV electron pulses of ~0.5 nC and 20 A peak current at a repetition rate of 10 Hz. The electrons are then transported outside the diode along a path of 20 cm length, and are focused onto a thulium target by magnetic fields produced by two electromagnetic coils. X-rays are then produced by the impact of the electrons on the target. Simulations of the geometrical, electromagnetic-field, and energetic characteristics of the complete source were performed previously with the assistance of the code PIXEL1, also developed at the laboratory. Finally, the experimental electron and X-ray performance of the CIBER-X source, as well as its application to very-low-dose imagery, is presented and discussed.

  6. ITS version 5.0 : the integrated TIGER series of coupled electron/photon Monte Carlo transport codes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Franke, Brian Claude; Kensek, Ronald Patrick; Laub, Thomas William

ITS is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. Our goal has been to simultaneously maximize operational simplicity and physical accuracy. Through a set of preprocessor directives, the user selects one of the many ITS codes. The ease with which the makefile system is applied combines with an input scheme based on order-independent descriptive keywords that makes maximum use of defaults and internal error checking to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Physical rigor is provided by employing accurate cross sections, sampling distributions, and physical models for describing the production and transport of the electron/photon cascade from 1.0 GeV down to 1.0 keV. The availability of source code permits the more sophisticated user to tailor the codes to specific applications and to extend the capabilities of the codes to more complex applications. Version 5.0, the latest version of ITS, contains (1) improvements to the ITS 3.0 continuous-energy codes, (2) multigroup codes with adjoint transport capabilities, and (3) parallel implementations of all ITS codes. Moreover, the general user friendliness of the software has been enhanced through increased internal error checking and improved code portability.

  7. Design of laboratory experiments to study radiation-driven implosions

    DOE PAGES

    Keiter, P. A.; Trantham, M.; Malamud, G.; ...

    2017-02-03

The interstellar medium is heterogeneous, with dense clouds amid an ambient medium. Radiation from young OB stars asymmetrically irradiates the dense clouds. Bertoldi (1989) developed analytic formulae to describe possible outcomes for these clouds when irradiated by hot, young stars. One of the critical parameters that determines the cloud’s fate is the number of photon mean free paths in the cloud. For the extreme cases where the cloud size is either much greater than or much less than one mean free path, the radiation transport should be well understood. However, as one transitions between these limits, the radiation transport is much more complex and is a challenge to solve with many of the current radiation transport models implemented in codes. In this paper, we present the design of laboratory experiments that use a thermal source of x-rays to asymmetrically irradiate a low-density plastic foam sphere. The experiment will vary the density, and hence the number of mean free paths, of the sphere to study the radiation transport in different regimes. Finally, we have developed dimensionless parameters to relate the laboratory experiment to the astrophysical system, and we show that we can perform the experiment in the same transport regime.

  8. SOPHAEROS code development and its application to falcon tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lajtha, G.; Missirlian, M.; Kissane, M.

    1996-12-31

One of the key issues in source-term evaluation in nuclear reactor severe accidents is determination of the transport behavior of fission products released from the degrading core. The SOPHAEROS computer code is being developed to predict fission product transport in a mechanistic way in light water reactor circuits. These applications of the SOPHAEROS code to the Falcon experiments, among others not presented here, indicate that the numerical scheme of the code is robust, and no convergence problems are encountered. The calculation is also very fast, running only three times longer than real time on a Sun SPARC 5 workstation and typically ~10 times faster than an identical calculation with the VICTORIA code. The study demonstrates that the SOPHAEROS 1.3 code is a suitable tool for prediction of the vapor chemistry and fission product transport with a reasonable level of accuracy. Furthermore, the flexibility of the code material data bank allows improved understanding of fission product transport and deposition in the circuit. Performing sensitivity studies with different chemical species or with different properties (saturation pressure, chemical equilibrium constants) is very straightforward.

  9. Intact coding region of the serotonin transporter gene in obsessive-compulsive disorder

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Altemus, M.; Murphy, D.L.; Greenberg, B.

    1996-07-26

Epidemiologic studies indicate that obsessive-compulsive disorder is genetically transmitted in some families, although no genetic abnormalities have been identified in individuals with this disorder. The selective response of obsessive-compulsive disorder to treatment with agents which block serotonin reuptake suggests the gene coding for the serotonin transporter as a candidate gene. The primary structure of the serotonin-transporter coding region was sequenced in 22 patients with obsessive-compulsive disorder, using direct PCR sequencing of cDNA synthesized from platelet serotonin-transporter mRNA. No variations in amino acid sequence were found among the obsessive-compulsive disorder patients or healthy controls. These results do not support a role for alteration in the primary structure of the coding region of the serotonin-transporter gene in the pathogenesis of obsessive-compulsive disorder.

  10. 950 keV X-Band Linac For Material Recognition Using Two-Fold Scintillator Detector As A Concept Of Dual-Energy X-Ray System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Kiwoo; Natsui, Takuya; Hirai, Shunsuke

    2011-06-01

One of the advantages of applying an X-band linear accelerator (linac) is the compact size of the whole system, which opens the possibility of on-site systems such as a customs inspection system in an airport. As the X-ray source, we have developed an X-band linac and achieved a maximum X-ray energy of 950 keV using a low-power magnetron (250 kW) with a 2 μs pulse length. The whole linac system measures 1 × 1 × 1 m³, a size made possible by the X-band system. In addition, we have designed a two-fold scintillator detector based on the dual-energy X-ray concept. The Monte Carlo N-Particle transport (MCNP) code was used to design the sensor part with two scintillators, CsI and CdWO4. The customs inspection system is composed of two components, the 950 keV X-band linac and the two-fold scintillator detector, operated to simulate real situations such as baggage checks in an airport. We show the results of experiments performed with metal samples, iron and lead, as targets under several conditions.
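The dual-energy idea can be sketched as follows: for monoenergetic beams, the ratio of log-transmissions at the two energies depends only on the material's attenuation coefficients, not on its thickness, which is what allows iron and lead to be separated. The coefficients below are illustrative placeholders, not NIST values.

```python
import math

# Assumed mass attenuation coefficients (cm^2/g) at the low and high beam
# energies; values are illustrative placeholders, not tabulated data.
MU = {"iron": (0.15, 0.08), "lead": (0.40, 0.10)}

def log_ratio(material, areal_density):
    """ln-transmission ratio at the two energies. For monoenergetic beams
    the thickness cancels, so the ratio is a material signature that
    separates iron from lead regardless of how thick the sample is."""
    mu_lo, mu_hi = MU[material]
    t_lo = math.exp(-mu_lo * areal_density)
    t_hi = math.exp(-mu_hi * areal_density)
    return math.log(t_lo) / math.log(t_hi)

print(log_ratio("iron", 5.0), log_ratio("iron", 20.0))   # equal: thickness cancels
print(log_ratio("lead", 5.0))                            # distinct signature
```

In practice the 950 keV bremsstrahlung beam is polychromatic, so the ratio acquires a mild thickness dependence and the discrimination is calibrated against reference samples, which is what the iron and lead measurements provide.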

  11. Investigation of impurity transport using laser blow-off technique in the HL-2A Ohmic and ECRH plasmas

    NASA Astrophysics Data System (ADS)

    Kai, Zhang; Zheng-Ying, Cui; Ping, Sun; Chun-Feng, Dong; Wei, Deng; Yun-Bo, Dong; Shao-Dong, Song; Min, Jiang; Yong-Gao, Li; Ping, Lu; Qing-Wei, Yang

    2016-06-01

Impurity transport in two neighboring discharges with and without electron cyclotron resonance heating (ECRH) is studied in the HL-2A tokamak by the laser blow-off (LBO) technique. The progression of aluminium ions as the trace impurity is monitored by soft x-ray (SXR) and bolometer detector arrays with good temporal and spatial resolution. An obvious difference in the time trace of the signal between the Ohmic and ECRH L-mode discharges is observed. Based on numerical simulation with the one-dimensional (1D) impurity transport code STRAHL, the radial profiles of the impurity diffusion coefficient D and convective velocity V are obtained for each shot. The result shows that the diffusion coefficient D significantly increases throughout the plasma minor radius for the ECRH case with respect to the Ohmic case, and that the convection velocity V changes from negative (inward) for the Ohmic case to partially positive (outward) for the ECRH case. The result on HL-2A confirms the pump-out effect of ECRH on the impurity profile as reported on various other devices.
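The analysis above fits D and V in a cylindrical diffusion-convection equation of the form dn/dt = (1/r) d/dr [ r (D dn/dr - V n) ], the equation STRAHL solves. A minimal explicit solver, a sketch of the equation's form only and not STRAHL itself, looks like:

```python
import numpy as np

def evolve_impurity(n, r, D, V, dt, steps):
    """Explicit update of the cylindrical 1D impurity transport equation
    dn/dt = (1/r) d/dr [ r (D dn/dr - V n) ].  D and V are arrays on the
    radial grid; toy sketch with no source term."""
    dr = r[1] - r[0]
    for _ in range(steps):
        dn = np.gradient(n, dr)
        flux = r * (D * dn - V * n)                  # r * Gamma
        n = n + dt * np.gradient(flux, dr) / np.maximum(r, dr / 2)
        n[0] = n[1]                                  # zero gradient at the axis
        n[-1] = 0.0                                  # impurity sink at the edge
    return n

r = np.linspace(0.0, 1.0, 51)
n0 = np.exp(-((r - 0.2) / 0.1) ** 2)                 # deposited LBO impurity puff
n1 = evolve_impurity(n0, r, D=0.05 * np.ones_like(r),
                     V=np.zeros_like(r), dt=2e-3, steps=200)
```

Fitting experimental SXR time traces then amounts to adjusting the D(r) and V(r) profiles until the simulated puff evolution matches the measured one; an inward (negative) V peaks the impurity profile, while ECRH-enhanced D flattens it, matching the pump-out observation.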

  12. Concepts and strategies for lunar base radiation protection - Prefabricated versus in-situ materials

    NASA Technical Reports Server (NTRS)

    Simonsen, Lisa C.; Nealy, John E.; Townsend, Lawrence W.

    1992-01-01

    The most recently accepted environment data are used as inputs for the Langley nucleon and heavy-ion transport codes, BRYNTRN and HZETRN, to examine the shield effectiveness of lunar regolith in comparison with commercially-used shield materials in nuclear facilities. Several of the fabricated materials categorized as neutron absorbers exhibit favorable characteristics for space radiation protection. In particular, polyethylene with additive boron is analyzed with regard to response to the predicted lunar galactic cosmic ray and solar proton flare environment during the course of a complete solar cycle. Although this effort is not intended to be a definitive trade study for specific shielding recommendations, attention is given to several factors that warrant consideration in such trade studies. For example, the transporting of bulk shield material to the lunar site as opposed to regolith-moving and processing equipment is assessed on the basis of recent scenario studies. The transporting of shield material from Earth may also be a viable alternative to the use of regolith from standpoints of cost-effectiveness, EVA time required, and risk factor.

  13. A New Approach in Coal Mine Exploration Using Cosmic Ray Muons

    NASA Astrophysics Data System (ADS)

    Darijani, Reza; Negarestani, Ali; Rezaie, Mohammad Reza; Fatemi, Syed Jalil; Akhond, Ahmad

    2016-08-01

Muon radiography is a technique that uses cosmic-ray muons to image the interior of large-scale geological structures. Muon absorption in matter is the most important parameter in cosmic-ray muon radiography, which is similar to X-ray radiography. The main aim of this survey is the simulation of muon radiography for the exploration of mines. The production source, tracking, and detection of cosmic-ray muons were therefore simulated with the MCNPX code. For this purpose, the input data for the source card in MCNPX were extracted from the muon energy spectrum at sea level. The other input data, such as the average density and thickness of the layers used in this code, are measured data from the Pabdana (Kerman, Iran) coal mines. The average thickness and density of these layers in the coal mines are 2 to 4 m and 1.3 g/cm³, respectively. To increase the spatial resolution, a detector was placed inside the mountain. The results indicate that, using this approach, layers with a minimum thickness of about 2.5 m can be identified.
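The flux contrast the method relies on can be sketched with a toy opacity model: a power-law integral muon spectrum combined with a continuous energy loss per unit areal density. All coefficients below are illustrative placeholders, not values fitted to the Pabdana data:

```python
def transmitted_flux(thickness_m, density=1.3, a=4.0e-3, gamma=1.7, i0=70.0):
    """Transmitted vertical muon flux (arbitrary units) behind a rock
    layer of the given density (g/cm^3).  Assumes a continuous energy
    loss of `a` GeV per g/cm^2 of rock and a power-law integral spectrum
    I(>E) = i0 * E**-gamma above 1 GeV; all coefficients are
    illustrative placeholders."""
    opacity = density * thickness_m * 100.0   # areal density, g/cm^2
    e_min = max(a * opacity, 1.0)             # GeV needed to cross the layer
    return i0 * e_min ** (-gamma)

# Thicker overburden -> fewer muons: this contrast is what images the layers.
print(transmitted_flux(2.5), transmitted_flux(4.0))
```

Inverting the measured count rate through such a flux-versus-opacity curve is how a 2-4 m layer of known density becomes distinguishable from its surroundings in the detector data.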

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ceglio, N.M.; George, E.V.; Brooks, K.M.

The first successful demonstration of high-resolution, tomographic imaging of a laboratory plasma using coded imaging techniques is reported. ZPCI has been used to image the x-ray emission from laser-compressed, DT-filled microballoons. The zone plate camera viewed an x-ray spectral window extending from below 2 keV to above 6 keV. It exhibited a resolution of approximately 8 μm, a magnification factor of approximately 13, and subtended a radiation collection solid angle at the target of approximately 10⁻² sr. X-ray images using ZPCI were compared with those taken using a grazing-incidence reflection x-ray microscope; the agreement was excellent. In addition, the zone plate camera produced tomographic images. The nominal tomographic resolution was approximately 75 μm. This allowed three-dimensional viewing of target emission from a single shot in planar "slices". In addition to its tomographic capability, the great advantage of the coded imaging technique lies in its applicability to hard (greater than 10 keV) x-ray and charged particle imaging. Experiments involving coded imaging of the suprathermal x-ray and high-energy alpha particle emission from laser-compressed microballoon targets are discussed.

  15. Evolution of Structure and Composition in Saturn's Rings Due to Ballistic Transport of Micrometeoroid Impact Ejecta

    NASA Astrophysics Data System (ADS)

    Estrada, P. R.; Durisen, R. H.; Cuzzi, J. N.

    2014-04-01

    We introduce improved numerical techniques for simulating the structural and compositional evolution of planetary rings due to micrometeoroid bombardment and subsequent ballistic transport of impact ejecta. Our current, robust code, which is based on the original structural code of [1] and on the pollution transport code of [3], is capable of modeling structural changes and pollution transport simultaneously over long times on both local and global scales. We provide demonstrative simulations to compare with, and extend upon previous work, as well as examples of how ballistic transport can maintain the observed structure in Saturn's rings using available Cassini occultation optical depth data.

  16. A Spherical Active Coded Aperture for 4π Gamma-ray Imaging

    DOE PAGES

    Hellfeld, Daniel; Barton, Paul; Gunter, Donald; ...

    2017-09-22

Gamma-ray imaging facilitates the efficient detection, characterization, and localization of compact radioactive sources in cluttered environments. Fieldable detector systems employing active planar coded apertures have demonstrated broad energy sensitivity via both coded aperture and Compton imaging modalities. However, planar configurations suffer from a limited field-of-view, especially in the coded aperture mode. To improve upon this limitation, we introduce a novel design that rearranges the detectors into an active coded spherical configuration, resulting in a 4π isotropic field-of-view for both coded aperture and Compton imaging. This work focuses on the low-energy coded aperture modality and the optimization techniques used to determine the optimal number and configuration of 1 cm³ CdZnTe coplanar grid detectors on a 14 cm diameter sphere with 192 available detector locations.
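Low-energy coded-aperture reconstruction works by cross-correlating the detector shadowgram with a decoding array. A 1D toy with a random mask (standing in for a real MURA-type pattern, and unrelated to the actual instrument's code) shows the principle:

```python
import numpy as np

rng = np.random.default_rng(0)

# 1D toy coded aperture: a random open/closed mask and its decoder
# (the mean-subtracted mask), standing in for a real MURA pattern.
mask = rng.integers(0, 2, 256).astype(float)
decoder = mask - mask.mean()

def shadowgram(source):
    """Detector counts: each source pixel casts a shifted copy of the mask."""
    det = np.zeros_like(mask)
    for shift, strength in source.items():
        det += strength * np.roll(mask, shift)
    return det

def decode(det):
    """Cross-correlate with the decoder; peaks mark source positions."""
    return np.array([np.dot(det, np.roll(decoder, s)) for s in range(det.size)])

image = decode(shadowgram({30: 1.0}))
print(int(image.argmax()))  # peak at the true source position
```

A sphere of active detector elements generalizes this: every direction on the 4π sky sees a different subset of detectors as "mask" and "open", so the same correlation decode applies without the planar field-of-view limit.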

  17. The complete mitochondrial genome of the Giant Manta ray, Manta birostris.

    PubMed

    Hinojosa-Alvarez, Silvia; Díaz-Jaimes, Pindaro; Marcet-Houben, Marina; Gabaldón, Toni

    2015-01-01

The complete mitochondrial genome of the giant manta ray (Manta birostris) consists of 18,075 bp with a rich A + T and low G content. Gene organization and length are similar to those of other ray species. It comprises 13 protein-coding genes, 2 rRNA genes, 23 tRNA genes, 1 non-coding sequence, and the control region. We identified an AT tandem repeat region, similar to that reported in Mobula japanica.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anthony, Stephen

The Sandia hyperspectral upper-bound spectrum algorithm (hyper-UBS) is a cosmic ray despiking algorithm for hyperspectral data sets. When naturally occurring, high-energy (gigaelectronvolt) cosmic rays impact the earth’s atmosphere, they create an avalanche of secondary particles which register as a large, positive spike on any spectroscopic detector they hit. Cosmic ray spikes are therefore an unavoidable spectroscopic contaminant which can interfere with subsequent analysis. A variety of cosmic ray despiking algorithms already exist and can potentially be applied to hyperspectral data matrices, most notably the upper-bound spectrum data matrices (UBS-DM) algorithm by Dongmao Zhang and Dor Ben-Amotz, which served as the basis for the hyper-UBS algorithm. However, the existing algorithms either cannot be applied to hyperspectral data, require information that is not always available, introduce undesired spectral bias, or have otherwise limited effectiveness for some experimentally relevant conditions. Hyper-UBS is more effective at removing a wider variety of cosmic ray spikes from hyperspectral data without introducing undesired spectral bias. In addition to the core algorithm, the Sandia hyper-UBS software package includes additional source code useful in evaluating the effectiveness of the hyper-UBS algorithm. The accompanying source code includes code to generate simulated hyperspectral data contaminated by cosmic ray spikes, several existing despiking algorithms, and code to evaluate the performance of the despiking algorithms on simulated data.
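The despiking problem itself can be illustrated with a minimal replicate-based filter: cosmic ray spikes are large positive outliers in a single acquisition, so a per-channel robust threshold across replicate spectra isolates them. This is a generic sketch in the spirit of upper-bound-spectrum methods, not the Sandia algorithm itself.

```python
import numpy as np

def despike_replicates(spectra, k=5.0):
    """Replace cosmic-ray spikes in a stack of replicate spectra
    (rows = acquisitions) with the per-channel median.  A point counts
    as a spike if it exceeds median + k * MAD for its channel.  A
    minimal filter in the spirit of upper-bound-spectrum despiking,
    not the Sandia hyper-UBS algorithm."""
    spectra = np.asarray(spectra, dtype=float)
    med = np.median(spectra, axis=0)
    mad = np.median(np.abs(spectra - med), axis=0) + 1e-12
    spikes = spectra > med + k * mad                  # positive outliers only
    out = spectra.copy()
    out[spikes] = np.broadcast_to(med, spectra.shape)[spikes]
    return out

clean = np.tile(np.linspace(1, 2, 8), (5, 1))        # 5 replicate spectra
noisy = clean.copy()
noisy[2, 3] += 100.0                                 # inject one cosmic-ray spike
print(np.allclose(despike_replicates(noisy), clean))  # True
```

Filters of this kind need replicate acquisitions and can bias channels where the signal genuinely fluctuates, which is exactly the class of limitation the abstract says hyper-UBS was designed to avoid.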

  19. Comparing Turbulence Simulation with Experiment in DIII-D

    NASA Astrophysics Data System (ADS)

    Ross, D. W.; Bravenec, R. V.; Dorland, W.; Beer, M. A.; Hammett, G. W.; McKee, G. R.; Murakami, M.; Jackson, G. L.

    2000-10-01

    Gyrofluid simulations of DIII-D discharges with the GRYFFIN code [D. W. Ross et al., Transport Task Force Workshop, Burlington, VT (2000)] are compared with transport and fluctuation measurements. The evolution of confinement-improved discharges [G. R. McKee et al., Phys. Plasmas 7, 1870 (2000)] is studied at early times following impurity injection, when EXB rotational shear plays a small role. The ion thermal transport predicted by the code is consistent with the experimental values. Experimentally, changes in density profiles resulting from the injection of neon lead to reductions in fluctuation levels and transport following the injection. This triggers subsequent changes in the shearing rate that further reduce the turbulence [M. Murakami et al., European Physical Society, Budapest (2000); M. Murakami et al., this meeting]. Estimated uncertainties in the plasma profiles, however, make it difficult to simulate these reductions with the code. These cases will also be studied with the GS2 gyrokinetic code.

  20. Neutron Capture gamma ENDF libraries for modeling and identification of neutron sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sleaford, B

    2007-10-29

    There are a number of inaccuracies and data omissions with respect to gammas from neutron capture in the ENDF libraries used as field reference information and by modeling codes used in JTOT. As the use of active neutron interrogation methods is expanded, these shortfalls become more acute. A new, more accurate and complete evaluated experimental database of gamma rays (over 35,000 lines for 262 isotopes up to U so far) from thermal neutron capture has recently become available from the IAEA. To my knowledge, none of this new data has been installed in ENDF libraries and disseminated. I propose to upgrade the libraries of {sup 184,186}W, {sup 56}Fe, {sup 204,206,207}Pb, {sup 104}Pd, and {sup 19}F in the first year. This will involve collaboration with Richard Firestone at LBL in evaluating the data and installing it in the libraries. I will test them with the transport code MCNP5.

  1. Neutron Environment Calculations for Low Earth Orbit

    NASA Technical Reports Server (NTRS)

    Clowdsley, M. S.; Wilson, J. W.; Shinn, J. L.; Badavi, F. F.; Heinbockel, J. H.; Atwell, W.

    2001-01-01

    The long term exposure of astronauts on the developing International Space Station (ISS) requires an accurate knowledge of the internal exposure environment for human risk assessment and other onboard processes. The natural environment is moderated by the solar wind, which varies over the solar cycle. The HZETRN high charge and energy transport code developed at NASA Langley Research Center can be used to evaluate the neutron environment on ISS. A time dependent model for the ambient environment in low earth orbit is used. This model includes GCR radiation moderated by the Earth's magnetic field, trapped protons, and a recently completed model of the albedo neutron environment formed through the interaction of galactic cosmic rays with the Earth's atmosphere. Using this code, the neutron environments for space shuttle missions were calculated and comparisons were made to measurements by the Johnson Space Center with onboard detectors. The models discussed herein are being developed to evaluate the natural and induced environment data for the Intelligence Synthesis Environment Project and eventual use in spacecraft optimization.

  2. Spatially-Dependent Modelling of Pulsar Wind Nebula G0.9+0.1

    NASA Astrophysics Data System (ADS)

    van Rensburg, C.; Krüger, P. P.; Venter, C.

    2018-03-01

    We present results from a leptonic emission code that models the spectral energy distribution of a pulsar wind nebula by solving a Fokker-Planck-type transport equation and calculating inverse Compton and synchrotron emissivities. We have created this time-dependent, multi-zone model to investigate changes in the particle spectrum as particles traverse the pulsar wind nebula, by considering a time- and spatially-dependent B-field, a spatially-dependent bulk particle speed implying convection and adiabatic losses, diffusion, as well as radiative losses. Our code predicts the radiation spectrum at different positions in the nebula, yielding the surface brightness versus radius and the nebular size as a function of energy. We compare our new model against more basic models using the observed spectrum of pulsar wind nebula G0.9+0.1, incorporating data from H.E.S.S. as well as radio and X-ray experiments. We show that simultaneously fitting the spectral energy distribution and the energy-dependent source size leads to more stringent constraints on several model parameters.
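
    The simplest relative of such a Fokker-Planck transport model can be sketched in a few lines: evolve an electron spectrum under continuous energy losses plus steady injection, and watch the spectrum steepen above the cooling break. This is an illustrative toy, not the paper's multi-zone code; all coefficients, units, and the absence of spatial zones, diffusion, convection, and inverse Compton are simplifying assumptions.

```python
import numpy as np

# Toy spectral evolution: dN/dt = d(bN)/dE + Q, with synchrotron-like
# losses b(E) ~ E^2 and power-law injection Q(E) ~ E^-2 (arbitrary units).
E = np.logspace(-2, 2, 200)          # energy grid
dE = np.gradient(E)
b = 1e-3 * E**2                      # loss rate |dE/dt|
Q = E**-2.0                          # injection spectrum
N = np.zeros_like(E)
dt, n_steps = 0.05, 2000
for _ in range(n_steps):
    flux = b * N                     # downward flux in energy space
    dN = (np.roll(flux, -1) - flux) / dE
    dN[-1] = -flux[-1] / dE[-1]      # nothing flows in from above
    N += dt * (dN + Q)

# Cooling steepens the spectrum above the break where loss time ~ age
lo = (E > 0.1) & (E < 1.0)
hi = (E > 20.0) & (E < 80.0)
slope_lo = np.polyfit(np.log(E[lo]), np.log(N[lo]), 1)[0]
slope_hi = np.polyfit(np.log(E[hi]), np.log(N[hi]), 1)[0]
assert slope_hi < slope_lo - 0.5     # cooled part is steeper
```

    The full model additionally tracks each spatial zone's B-field and bulk flow, which is what lets it predict surface brightness versus radius rather than a single volume-integrated spectrum.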

  3. Spatially dependent modelling of pulsar wind nebula G0.9+0.1

    NASA Astrophysics Data System (ADS)

    van Rensburg, C.; Krüger, P. P.; Venter, C.

    2018-07-01

    We present results from a leptonic emission code that models the spectral energy distribution of a pulsar wind nebula by solving a Fokker-Planck-type transport equation and calculating inverse Compton and synchrotron emissivities. We have created this time-dependent, multizone model to investigate changes in the particle spectrum as particles traverse the pulsar wind nebula, by considering a time and spatially dependent B-field, a spatially dependent bulk particle speed implying convection and adiabatic losses, diffusion, as well as radiative losses. Our code predicts the radiation spectrum at different positions in the nebula, yielding the surface brightness versus radius and the nebular size as a function of energy. We compare our new model against more basic models using the observed spectrum of pulsar wind nebula G0.9+0.1, incorporating data from H.E.S.S. as well as radio and X-ray experiments. We show that simultaneously fitting the spectral energy distribution and the energy-dependent source size leads to more stringent constraints on several model parameters.

  4. Gyrokinetic Particle Simulation of Turbulent Transport in Burning Plasmas (GPS - TTBP) Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chame, Jacqueline

    2011-05-27

    The goal of this project is the development of the Gyrokinetic Toroidal Code (GTC) Framework and its applications to problems related to the physics of turbulence and turbulent transport in tokamaks. The project involves physics studies, code development, noise effect mitigation, supporting computer science efforts, diagnostics and advanced visualizations, and verification and validation. Its main scientific themes are mesoscale dynamics and non-locality effects on transport, the physics of secondary structures such as zonal flows, and strongly coherent wave-particle interaction phenomena at magnetic precession resonances. Special emphasis is placed on the implications of these themes for rho-star and current scalings and for the turbulent transport of momentum. GTC-TTBP also explores applications to electron thermal transport, particle transport, ITB formation, and cross-cuts such as edge-core coupling, interaction of energetic particles with turbulence, and neoclassical tearing mode trigger dynamics. Code development focuses on major initiatives in the development of full-f formulations and the capacity to simulate flux-driven transport. In addition to the full-f formulation, the project includes the development of numerical collision models and methods for coarse graining in phase space. Verification is pursued by linear stability comparisons with the FULL and HD7 codes and by benchmarking with the GKV, GYSELA and other gyrokinetic simulation codes. Validation of gyrokinetic models of ion and electron thermal transport is pursued by systematic stressing comparisons with fluctuation and transport data from the DIII-D and NSTX tokamaks. The physics and code development research programs are supported by complementary efforts in computer sciences, high performance computing, and data management.

  5. Effects of a wavy neutral sheet on cosmic ray anisotropies

    NASA Technical Reports Server (NTRS)

    Kota, J.; Jokipii, J. R.

    1985-01-01

    The first results of a three-dimensional numerical code calculating cosmic ray anisotropies are presented. The code includes diffusion, convection, adiabatic cooling, and drift in an interplanetary magnetic field model containing a wavy neutral sheet. The 3-D model can reproduce all the principal observations for a reasonable set of parameters.

  6. A soft X-ray source based on a low divergence, high repetition rate ultraviolet laser

    NASA Astrophysics Data System (ADS)

    Crawford, E. A.; Hoffman, A. L.; Milroy, R. D.; Quimby, D. C.; Albrecht, G. F.

    The CORK code is utilized to evaluate the applicability of low-divergence ultraviolet lasers for efficient production of soft X-rays. The use of the axial hydrodynamic code with a one-zone radial expansion model to estimate radial motion and laser energy deposition is examined. The calculation of plasma ionization levels and radiation rates, employing the atomic physics and radiation model included in the CORK code, is described. Computations using the hydrodynamic code to determine the effect of laser intensity, spot size, and wavelength on plasma electron temperature are provided. The X-ray conversion efficiencies of the lasers are analyzed. It is observed that for 1 GW of laser power the X-ray conversion efficiency is a function of spot size, depends only weakly on pulse length for time scales exceeding 100 psec, and improves at shorter wavelengths. It is concluded that these small lasers, focused to 30 micron spot sizes and 10 to the 14th W/sq cm intensities, are useful sources of 1-2 keV radiation.

  7. A comparison between EGS4 and MCNP computer modeling of an in vivo X-ray fluorescence system.

    PubMed

    Al-Ghorabie, F H; Natto, S S; Al-Lyhiani, S H

    2001-03-01

    The Monte Carlo computer codes EGS4 and MCNP were used to develop a theoretical model of a 180 degree geometry in vivo X-ray fluorescence system for the measurement of platinum concentration in head and neck tumors. The model included specification of the photon source, collimators, phantoms and detector. Theoretical results were compared and evaluated against X-ray fluorescence data obtained experimentally from an existing system developed by the Swansea In Vivo Analysis and Cancer Research Group. The EGS4 results agreed well with the MCNP results. However, agreement between the measured spectral shape obtained with the experimental X-ray fluorescence system and the simulated spectral shape obtained with the two Monte Carlo codes was relatively poor. The main reason for the disagreement arises from a basic assumption shared by the two codes: both assume a "free" electron model for Compton interactions. This assumption underestimates the results and undermines comparison between predicted and experimental spectra.
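
    The "free electron" Compton model referred to here is the Klein-Nishina differential cross section, which ignores electron binding; binding corrections (an incoherent scattering function) would suppress small-angle scattering at low photon energies. A minimal sketch of that free-electron cross section:

```python
import math

R_E = 2.8179403262e-13  # classical electron radius, cm
MEC2 = 0.51099895       # electron rest energy, MeV

def klein_nishina(energy_mev, theta):
    """Free-electron Compton dsigma/dOmega (cm^2/sr) at scattering angle
    theta, per the Klein-Nishina formula (no binding correction)."""
    k = energy_mev / MEC2
    ratio = 1.0 / (1.0 + k * (1.0 - math.cos(theta)))  # E'/E
    return 0.5 * R_E**2 * ratio**2 * (ratio + 1.0 / ratio
                                      - math.sin(theta)**2)

# Small-angle scattering dominates over backscatter at 100 keV
assert klein_nishina(0.1, 0.3) > klein_nishina(0.1, 2.0)
```

    Sampling scattering angles from this distribution, as both EGS4 and MCNP effectively do, overestimates low-energy, small-angle Compton events from bound atomic electrons, which is consistent with the spectral-shape disagreement the record describes.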

  8. Computational techniques in gamma-ray skyshine analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    George, D.L.

    1988-12-01

    Two computer codes were developed to analyze gamma-ray skyshine, the scattering of gamma photons by air molecules. A review of previous gamma-ray skyshine studies discusses several Monte Carlo codes, programs using a single-scatter model, and the MicroSkyshine program for microcomputers. A benchmark gamma-ray skyshine experiment performed at Kansas State University is also described. A single-scatter numerical model was presented which traces photons from the source to their first scatter, then applies a buildup factor along a direct path from the scattering point to a detector. The FORTRAN code SKY, developed with this model before the present study, was modified to use Gauss quadrature, recent photon attenuation data and a more accurate buildup approximation. The resulting code, SILOGP, computes response from a point photon source on the axis of a silo, with and without concrete shielding over the opening. Another program, WALLGP, was developed using the same model to compute response from a point gamma source behind a perfectly absorbing wall, with and without shielding overhead. 29 refs., 48 figs., 13 tabs.
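
    The single-scatter model described above can be sketched numerically: attenuate the photon from the source to a candidate scatter point in air, weight by a scatter probability, then attenuate (with a buildup correction) along the direct path to the detector, summing over scatter points. The coefficients and the linear buildup form below are illustrative assumptions, not the data or quadrature used in SKY/SILOGP/WALLGP.

```python
import numpy as np

MU_AIR = 1.0e-4  # assumed linear attenuation coefficient of air, 1/cm

def buildup(mu_r):
    return 1.0 + mu_r  # crude first-order buildup approximation

def single_scatter_response(source, detector, scatter_points, d_volume):
    """Sum over scatter voxels: source->scatter attenuation, a toy
    isotropic scatter probability, then scatter->detector attenuation
    with a buildup factor along the direct path."""
    total = 0.0
    for p in scatter_points:
        r1 = np.linalg.norm(p - source)
        r2 = np.linalg.norm(p - detector)
        uncollided = np.exp(-MU_AIR * r1) / (4.0 * np.pi * r1**2)
        scatter_prob = MU_AIR * d_volume
        to_detector = (buildup(MU_AIR * r2) * np.exp(-MU_AIR * r2)
                       / (4.0 * np.pi * r2**2))
        total += uncollided * scatter_prob * to_detector
    return total

source = np.array([0.0, 0.0, 0.0])
sky = [np.array([500.0, 0.0, h]) for h in (1000.0, 2000.0, 3000.0)]
near = single_scatter_response(source, np.array([1000.0, 0.0, 0.0]), sky, 1e6)
far = single_scatter_response(source, np.array([5000.0, 0.0, 0.0]), sky, 1e6)
assert near > far > 0.0  # response falls off with detector distance
```

    Real skyshine codes replace the flat sum with Gauss quadrature over the scattering volume and use angle-dependent scattering kernels and measured attenuation data, but the structure of the estimate is the same.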

  9. Overview of Recent Radiation Transport Code Comparisons for Space Applications

    NASA Astrophysics Data System (ADS)

    Townsend, Lawrence

    Recent advances in radiation transport code development for space applications have resulted in various comparisons of code predictions for a variety of scenarios and codes. Comparisons among both Monte Carlo and deterministic codes have been made and published by various groups and collaborations, including, but not limited to, comparisons involving HZETRN, HETC-HEDS, FLUKA, GEANT, PHITS, and MCNPX. In this work, an overview of recent code prediction inter-comparisons, including comparisons to available experimental data, is presented and discussed, with emphasis on areas of agreement and disagreement among the various code predictions and published data.

  10. Cosmic ray transport in astrophysical plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schlickeiser, R.

    2015-09-15

    Since the development of satellite space technology about 50 years ago, the solar heliosphere has been explored almost routinely by several spacecraft carrying detectors for measuring the properties of the interplanetary medium, including energetic charged particles (cosmic rays), solar wind particle densities, and electromagnetic fields. In 2012, the Voyager 1 spacecraft even left what could be described as the heliospheric modulation region, as indicated by the sudden disappearance of low-energy heliospheric cosmic ray particles. With the available in-situ measurements of interplanetary turbulent electromagnetic fields and of the momentum spectra of different cosmic ray species in different interplanetary environments, the heliosphere is the best cosmic laboratory to test our understanding of the transport and acceleration of cosmic rays in space plasmas. I review both the historical development and the current state of various cosmic ray transport equations. Similarities and differences to transport theories for terrestrial fusion plasmas are highlighted. Any progress in cosmic ray transport requires a detailed understanding of the electromagnetic turbulence that is responsible for the scattering and acceleration of these particles.

  11. Imaging Analysis of the Hard X-Ray Telescope ProtoEXIST2 and New Techniques for High-Resolution Coded-Aperture Telescopes

    NASA Technical Reports Server (NTRS)

    Hong, Jaesub; Allen, Branden; Grindlay, Jonathan; Barthelmy, Scott D.

    2016-01-01

    Wide-field (greater than or approximately equal to 100 degrees squared) hard X-ray coded-aperture telescopes with high angular resolution (approximately 2 arcminutes or better) will enable a wide range of time domain astrophysics. For instance, transient sources such as gamma-ray bursts can be precisely localized without the assistance of secondary focusing X-ray telescopes, enabling rapid follow-up studies. On the other hand, high angular resolution in coded-aperture imaging introduces a new challenge in handling the systematic uncertainty: the average photon count per pixel is often too small to establish a proper background pattern or model the systematic uncertainty on a timescale over which the model remains invariant. We introduce two new techniques to improve detection sensitivity, designed for, but not limited to, a high-resolution coded-aperture system: a self-background modeling scheme which utilizes continuous scan or dithering operations, and a Poisson-statistics based probabilistic approach to evaluate the significance of source detection without background subtraction. We illustrate these new imaging analysis techniques using the data acquired by the wide-field hard X-ray telescope ProtoEXIST2 during a high-altitude balloon flight in fall 2012. We review the imaging sensitivity of ProtoEXIST2 during the flight, and demonstrate the performance of the new techniques using our balloon flight data in comparison with a simulated ideal Poisson background.
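
    The Poisson-statistics significance idea can be illustrated generically (this is not the ProtoEXIST2 pipeline): instead of subtracting a noisy background estimate, ask how probable it is that the modeled background alone produces at least the observed count in a source aperture.

```python
import math

def poisson_sf(n, mu):
    """P(X >= n) for X ~ Poisson(mu): the chance that background alone
    yields at least the observed count n."""
    term = math.exp(-mu)   # P(X = 0)
    cdf = 0.0
    for k in range(n):
        cdf += term
        term *= mu / (k + 1)
    return max(0.0, 1.0 - cdf)

# Example: 25 counts observed where the background model predicts 10
p_value = poisson_sf(25, 10.0)
assert p_value < 1e-3          # unlikely to be a background fluctuation
assert poisson_sf(10, 10.0) > 0.3  # counts at the mean are unremarkable
```

    Working directly with the Poisson survival function stays valid at the few-counts-per-pixel regime the record describes, where Gaussian significance estimates based on background subtraction break down.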

  12. Coded-aperture imaging of the Galactic center region at gamma-ray energies

    NASA Technical Reports Server (NTRS)

    Cook, Walter R.; Grunsfeld, John M.; Heindl, William A.; Palmer, David M.; Prince, Thomas A.

    1991-01-01

    The first coded-aperture images of the Galactic center region at energies above 30 keV have revealed two strong gamma-ray sources. One source has been identified with the X-ray source 1E 1740.7-2942, located 0.8 deg away from the nucleus. If this source is at the distance of the Galactic center, it is one of the most luminous objects in the galaxy at energies from 35 to 200 keV. The second source is consistent in location with the X-ray source GX 354+0 (MXB 1728-34). In addition, gamma-ray flux from the location of GX 1+4 was marginally detected at a level consistent with other post-1980 measurements. No significant hard X-ray or gamma-ray flux was detected from the direction of the Galactic nucleus or from the direction of the recently discovered gamma-ray source GRS 1758-258.

  13. MILSTAMP TACs: Military Standard Transportation and Movement Procedures Transportation Account Codes. Volume 2

    DTIC Science & Technology

    1987-02-15

    Excerpt (partially illegible in the source scan): the chapter covers determining whether a shipment qualifies as second-destination transportation and, if not, obtaining a fund cite per the response to question 2; the Transportation Account Codes designated herein are applicable to LOGAIR/QUICKTRANS shipments. A notice states that the document contains missing pages that were unavailable in the original.

  14. Characterization of gamma rays existing in the NMIJ standard neutron field.

    PubMed

    Harano, H; Matsumoto, T; Ito, Y; Uritani, A; Kudo, K

    2004-01-01

    Our laboratory provides national standards on fast neutron fluence. Neutron fields are always accompanied by gamma rays produced in neutron sources and surroundings. We have characterised these gamma rays in the 5.0 MeV standard neutron field. Gamma ray measurement was performed using an NE213 liquid scintillator. Pulse shape discrimination was incorporated to separate the events induced by gamma rays from those by neutrons. The measured gamma ray spectra were unfolded with the HEPRO program package to obtain the spectral fluences using the response matrix prepared with the EGS4 code. Corrections were made for the gamma rays produced by neutrons in the detector assembly using the MCNP4C code. The effective dose equivalents were estimated to be of the order of 25 microSv at a neutron fluence of 10^7 neutrons cm^-2.

  15. Theory and methods for measuring the effective multiplication constant in ADS

    NASA Astrophysics Data System (ADS)

    Rugama Saez, Yolanda

    2001-10-01

    In this thesis an absolute measurement technique for subcriticality determination is presented. The ADS is a hybrid system in which a subcritical assembly is fed by a proton accelerator. There are different proposals for an ADS: one is to use plutonium and minor actinides from power-plant waste as fuel to be transmuted into non-radioactive isotopes (transmuter/burner, ATW); another is to use a 232Th-233U cycle (Energy Amplifier), thorium being an abundant fertile isotope. The development of accelerator-driven systems (ADS) requires methods to monitor and control the subcriticality of this kind of system without interfering with its normal operation mode. To this end, we have applied noise analysis techniques that allow us to characterise the system while it is operating. The method presented in this thesis is based on stochastic neutron and photon transport theory and can be implemented with presently available neutron/photon transport codes. In this work we first analyse the stochastic transport theory, which is applied to define a parameter for subcritical reactivity monitoring measurements. We then give the main limitations of, and recommendations for, this subcriticality monitoring methodology. Building on the theoretical methodology of the first part of the thesis, a monitoring measurement technique was developed and verified using two coupled Monte Carlo programs. The first, LAHET, simulates spallation collisions and high-energy transport; the other, MCNP-DSP, estimates the counting statistics from a neutron/photon counter in a fissile system, as well as the transport of neutrons with energies below 20 MeV. By coupling both codes we developed the LAHET/MCNP-DSP code, which can simulate the total process in the ADS from the proton interaction to detector signal processing.
In these simulations, we compute the cross power spectral densities between pairs of detectors located inside the system, which is defined as the measured parameter. From the comparison of the theoretical predictions with the Monte Carlo simulations, we obtain some practical and simple methods to determine the system multiplication constant. (Abstract shortened by UMI.)
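
    The measured quantity, a cross power spectral density between two detector signals, can be sketched with a bare-bones Welch-style estimate (an illustration of the quantity, not the MCNP-DSP implementation):

```python
import numpy as np

def cross_power_spectral_density(x, y, fs, nseg=8):
    """Averaged cross power spectral density of two detector signals:
    segment, window, FFT, and average the cross-spectra. Coherent
    components shared by both detectors survive the averaging; independent
    noise averages down."""
    n = len(x) // nseg
    segs_x = np.reshape(x[: n * nseg], (nseg, n))
    segs_y = np.reshape(y[: n * nseg], (nseg, n))
    win = np.hanning(n)
    fx = np.fft.rfft(segs_x * win, axis=1)
    fy = np.fft.rfft(segs_y * win, axis=1)
    scale = fs * np.sum(win**2)
    cpsd = np.mean(np.conj(fx) * fy, axis=0) / scale
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return freqs, cpsd

# Two noisy signals sharing a common 50 Hz component
rng = np.random.default_rng(1)
t = np.arange(0.0, 8.0, 1e-3)
common = np.sin(2 * np.pi * 50 * t)
x = common + rng.normal(0, 1, t.size)
y = common + rng.normal(0, 1, t.size)
freqs, cpsd = cross_power_spectral_density(x, y, fs=1000.0)
peak = freqs[np.argmax(np.abs(cpsd))]
assert abs(peak - 50.0) < 2.0  # the shared component dominates the CPSD
```

    In the ADS application the "shared component" is the correlated detection of neutrons from common fission chains, which is why the CPSD between detector pairs carries information about the multiplication constant.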

  16. Secondary gamma-ray production in a coded aperture mask

    NASA Technical Reports Server (NTRS)

    Owens, A.; Frye, G. M., Jr.; Hall, C. J.; Jenkins, T. L.; Pendleton, G. N.; Carter, J. N.; Ramsden, D.; Agrinier, B.; Bonfand, E.; Gouiffes, C.

    1985-01-01

    The application of the coded aperture mask to high energy gamma-ray astronomy will provide the capability of locating a cosmic gamma-ray point source with a precision of a few arc-minutes above 20 MeV. Recent tests using a mask in conjunction with drift chamber detectors have shown that the expected point spread function is achieved over an acceptance cone of 25 deg. A telescope employing this technique differs from a conventional telescope only in that the presence of the mask modifies the radiation field in the vicinity of the detection plane. In addition to reducing the primary photon flux incident on the detector by absorption in the mask elements, the mask will also be a secondary radiator of gamma-rays. The various background components in a CAMTRAC (Coded Aperture Mask Track Chamber) telescope are considered. Monte-Carlo calculations are compared with recent measurements obtained using a prototype instrument in a tagged photon beam line.

  17. DynamiX, numerical tool for design of next-generation x-ray telescopes.

    PubMed

    Chauvin, Maxime; Roques, Jean-Pierre

    2010-07-20

    We present a new code aimed at the simulation of grazing-incidence x-ray telescopes subject to deformations and demonstrate its ability with two test cases: the Simbol-X and the International X-ray Observatory (IXO) missions. The code, based on Monte Carlo ray tracing, computes the full photon trajectories up to the detector plane, accounting for the x-ray interactions and for the telescope motion and deformation. The simulation produces images and spectra for any telescope configuration using Wolter I mirrors and semiconductor detectors. This numerical tool allows us to study the telescope performance in terms of angular resolution, effective area, and detector efficiency, accounting for the telescope behavior. We have implemented an image reconstruction method based on the measurement of the detector drifts by an optical sensor metrology. Using an accurate metrology, this method allows us to recover the loss of angular resolution induced by the telescope instability. In the framework of the Simbol-X mission, this code was used to study the impacts of the parameters on the telescope performance. In this paper we present detailed performance analysis of Simbol-X, taking into account the satellite motions and the image reconstruction. To illustrate the versatility of the code, we present an additional performance analysis with a particular configuration of IXO.

  18. Validating the performance of correlated fission multiplicity implementation in radiation transport codes with subcritical neutron multiplication benchmark experiments

    DOE PAGES

    Arthur, Jennifer; Bahran, Rian; Hutchinson, Jesson; ...

    2018-06-14

    Historically, radiation transport codes have uncorrelated fission emissions. In reality, the particles emitted by both spontaneous and induced fissions are correlated in time, energy, angle, and multiplicity. This work validates the performance of various current Monte Carlo codes that take into account the underlying correlated physics of fission neutrons, specifically neutron multiplicity distributions. The performance of 4 Monte Carlo codes - MCNP®6.2, MCNP®6.2/FREYA, MCNP®6.2/CGMF, and PoliMi - was assessed using neutron multiplicity benchmark experiments. In addition, MCNP®6.2 simulations were run using JEFF-3.2 and JENDL-4.0, rather than ENDF/B-VII.1, data for 239Pu and 240Pu. The sensitive benchmark parameters that in this work represent the performance of each correlated fission multiplicity Monte Carlo code include the singles rate, the doubles rate, leakage multiplication, and Feynman histograms. Although it is difficult to determine which radiation transport code shows the best overall performance in simulating subcritical neutron multiplication inference benchmark measurements, it is clear that correlations exist between the underlying nuclear data utilized by (or generated by) the various codes, and the correlated neutron observables of interest. This could prove useful in nuclear data validation and evaluation applications, in which a particular moment of the neutron multiplicity distribution is of more interest than the other moments. It is also quite clear that, because transport is handled by MCNP®6.2 in 3 of the 4 codes, with the 4th code (PoliMi) being based on an older version of MCNP®, the differences in correlated neutron observables of interest are most likely due to the treatment of fission event generation in each of the different codes, as opposed to the radiation transport.
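
    The Feynman histogram statistic mentioned above reduces to a variance-to-mean measure over counting gates, which is easy to sketch (a bare illustration of the quantity, not the benchmark analysis itself):

```python
import numpy as np

def feynman_y(event_times, gate_width):
    """Feynman-Y (excess variance-to-mean) from neutron detection times:
    Y = Var(c)/Mean(c) - 1 over non-overlapping gates. Uncorrelated
    (Poisson) sources give Y ~ 0; correlated fission chains give Y > 0."""
    t = np.sort(np.asarray(event_times))
    n_gates = int(t[-1] // gate_width)
    counts, _ = np.histogram(t, bins=n_gates,
                             range=(0.0, n_gates * gate_width))
    return counts.var() / counts.mean() - 1.0

# Uncorrelated (Poisson) arrivals should give Y close to zero
rng = np.random.default_rng(2)
poisson_times = np.cumsum(rng.exponential(1e-4, 200_000))
y = feynman_y(poisson_times, gate_width=1e-3)
assert abs(y) < 0.1
```

    For a multiplying assembly the deviation of Y from zero as a function of gate width is what ties the measured histograms back to leakage multiplication, which is why correct correlated fission sampling in the codes matters for these benchmarks.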

  19. Validating the performance of correlated fission multiplicity implementation in radiation transport codes with subcritical neutron multiplication benchmark experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arthur, Jennifer; Bahran, Rian; Hutchinson, Jesson

    Historically, radiation transport codes have uncorrelated fission emissions. In reality, the particles emitted by both spontaneous and induced fissions are correlated in time, energy, angle, and multiplicity. This work validates the performance of various current Monte Carlo codes that take into account the underlying correlated physics of fission neutrons, specifically neutron multiplicity distributions. The performance of 4 Monte Carlo codes - MCNP®6.2, MCNP®6.2/FREYA, MCNP®6.2/CGMF, and PoliMi - was assessed using neutron multiplicity benchmark experiments. In addition, MCNP®6.2 simulations were run using JEFF-3.2 and JENDL-4.0, rather than ENDF/B-VII.1, data for 239Pu and 240Pu. The sensitive benchmark parameters that in this work represent the performance of each correlated fission multiplicity Monte Carlo code include the singles rate, the doubles rate, leakage multiplication, and Feynman histograms. Although it is difficult to determine which radiation transport code shows the best overall performance in simulating subcritical neutron multiplication inference benchmark measurements, it is clear that correlations exist between the underlying nuclear data utilized by (or generated by) the various codes, and the correlated neutron observables of interest. This could prove useful in nuclear data validation and evaluation applications, in which a particular moment of the neutron multiplicity distribution is of more interest than the other moments. It is also quite clear that, because transport is handled by MCNP®6.2 in 3 of the 4 codes, with the 4th code (PoliMi) being based on an older version of MCNP®, the differences in correlated neutron observables of interest are most likely due to the treatment of fission event generation in each of the different codes, as opposed to the radiation transport.

  20. Full 3D visualization tool-kit for Monte Carlo and deterministic transport codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frambati, S.; Frignani, M.

    2012-07-01

    We propose a package of tools capable of translating the geometric inputs and outputs of many Monte Carlo and deterministic radiation transport codes into open source file formats. These tools are aimed at bridging the gap between trusted, widely-used radiation analysis codes and very powerful, more recent and commonly used visualization software, thus supporting the design process and helping with shielding optimization. Three main lines of development were followed: mesh-based analysis of Monte Carlo codes, mesh-based analysis of deterministic codes and Monte Carlo surface meshing. The developed kit is considered a powerful and cost-effective tool in computer-aided design for radiation transport code users of the nuclear world, and in particular in the fields of core design and radiation analysis. (authors)

  1. Radiation Transport Calculation of the UGXR Collimators for the Jules Horowitz Reactor (JHR)

    NASA Astrophysics Data System (ADS)

    Chento, Yelko; Hueso, César; Zamora, Imanol; Fabbri, Marco; Fuente, Cristina De La; Larringan, Asier

    2017-09-01

    The Jules Horowitz Reactor (JHR), a major infrastructure of European interest in the fission domain, will be built and operated in the framework of an international cooperation, including the development and qualification of materials and nuclear fuel used in the nuclear industry. For this purpose the UGXR Collimators, two multi-slit gamma and X-ray collimation mechatronic systems, will be installed at the JHR pool and at the Irradiated Components Storage pool. The expected amounts of radiation produced by the spent fuel and the X-ray accelerator imply that diverse aspects need to be verified to ensure adequate radiological zoning and personnel radiation protection. A computational methodology was devised to validate the collimator design by coupling different engineering codes. In summary, several assessments were performed with MCNP5 v1.60 to fulfil all the radiological requirements in the Nominal scenario (TEDE < 25 µSv/h) and in the Maintenance scenario (TEDE < 2 mSv/h), among others, detailing the methodology, hypotheses and assumptions employed.

  2. The radiation environment on the Moon from galactic cosmic rays in a lunar habitat.

    PubMed

    Jia, Y; Lin, Z W

    2010-02-01

    We calculated how the radiation environment in a habitat on the surface of the Moon would have depended on the thickness of the habitat in the 1977 galactic cosmic-ray environment. The Geant4 Monte Carlo transport code was used, and a hemispherical dome made of lunar regolith was used to simulate the lunar habitat. We investigated the effective dose from primary and secondary particles including nuclei from protons up to nickel, neutrons, charged pions, photons, electrons and positrons. The total effective dose showed a strong decrease with the thickness of the habitat dome. However, the effective dose values from secondary neutrons, charged pions, photons, electrons and positrons all showed a strong increase followed by a gradual decrease with the habitat thickness. The fraction of the summed effective dose from these secondary particles in the total effective dose increased with the habitat thickness, from approximately 5% for the no-habitat case to about 47% for the habitat with an areal thickness of 100 g/cm^2.

  3. Radiation Transport Tools for Space Applications: A Review

    NASA Technical Reports Server (NTRS)

    Jun, Insoo; Evans, Robin; Cherng, Michael; Kang, Shawn

    2008-01-01

    This slide presentation contains a brief discussion of nuclear transport codes widely used in the space radiation community for shielding and scientific analyses. Seven radiation transport codes are addressed, and the two general solution methods (the Monte Carlo method and the deterministic method) are briefly reviewed.

  4. Optimal shielding thickness for galactic cosmic ray environments

    NASA Astrophysics Data System (ADS)

    Slaba, Tony C.; Bahadori, Amir A.; Reddell, Brandon D.; Singleterry, Robert C.; Clowdsley, Martha S.; Blattnig, Steve R.

    2017-02-01

    Models have been extensively used in the past to evaluate and develop material optimization and shield design strategies for astronauts exposed to galactic cosmic rays (GCR) on long duration missions. A persistent conclusion from many of these studies was that passive shielding strategies are inefficient at reducing astronaut exposure levels and the mass required to significantly reduce the exposure is infeasible, given launch and associated cost constraints. An important assumption of this paradigm is that adding shielding mass does not substantially increase astronaut exposure levels. Recent studies with HZETRN have suggested, however, that dose equivalent values actually increase beyond ∼20 g/cm² of aluminum shielding, primarily as a result of neutron build-up in the shielding geometry. In this work, various Monte Carlo (MC) codes and 3DHZETRN are evaluated in slab geometry to verify the existence of a local minimum in the dose equivalent versus aluminum thickness curve near 20 g/cm². The same codes are also evaluated in polyethylene shielding, where no local minimum is observed, to provide a comparison between the two materials. Results are presented so that the physical interactions driving build-up in dose equivalent values can be easily observed and explained. Variation of transport model results for light ions (Z ≤ 2) and neutron-induced target fragments, which contribute significantly to dose equivalent for thick shielding, is also highlighted and indicates that significant uncertainties are still present in the models for some particles. The 3DHZETRN code is then further evaluated over a range of related slab geometries to draw closer connection to more realistic scenarios. Future work will examine these related geometries in more detail.
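    The qualitative shape described above (dose equivalent falling, reaching a local minimum near 20 g/cm², then rising as secondary neutrons build up) can be illustrated with a toy two-term model. This is not HZETRN or Monte Carlo physics; the exponential primary term, the saturating secondary term, and all parameter values below are invented solely to reproduce the reported behavior.

```python
import math

def dose_equivalent(t):
    """Toy dose-equivalent curve (arbitrary units) vs. areal thickness t [g/cm^2].

    A primary term attenuates exponentially while a saturating
    "secondary neutron" term builds up with depth.  Both e-folding
    lengths and the amplitude are illustrative assumptions, chosen only
    so that the curve has its local minimum near 20 g/cm^2.
    """
    primary = math.exp(-t / 15.0)             # attenuation of primaries
    secondary = 1.0 - math.exp(-t / 30.0)     # secondary-neutron build-up
    return primary + secondary

thicknesses = [0.1 * i for i in range(1001)]  # 0 .. 100 g/cm^2
doses = [dose_equivalent(t) for t in thicknesses]
i_min = min(range(len(doses)), key=doses.__getitem__)
t_min = thicknesses[i_min]                    # analytic minimum: 30*ln(2)
```

Setting the derivative of this toy curve to zero gives t_min = 30 ln 2 ≈ 20.8 g/cm², after which dose rises again, mirroring the build-up effect the study verifies with transport codes.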

  5. Optimal shielding thickness for galactic cosmic ray environments.

    PubMed

    Slaba, Tony C; Bahadori, Amir A; Reddell, Brandon D; Singleterry, Robert C; Clowdsley, Martha S; Blattnig, Steve R

    2017-02-01

    Models have been extensively used in the past to evaluate and develop material optimization and shield design strategies for astronauts exposed to galactic cosmic rays (GCR) on long duration missions. A persistent conclusion from many of these studies was that passive shielding strategies are inefficient at reducing astronaut exposure levels and the mass required to significantly reduce the exposure is infeasible, given launch and associated cost constraints. An important assumption of this paradigm is that adding shielding mass does not substantially increase astronaut exposure levels. Recent studies with HZETRN have suggested, however, that dose equivalent values actually increase beyond ∼20 g/cm² of aluminum shielding, primarily as a result of neutron build-up in the shielding geometry. In this work, various Monte Carlo (MC) codes and 3DHZETRN are evaluated in slab geometry to verify the existence of a local minimum in the dose equivalent versus aluminum thickness curve near 20 g/cm². The same codes are also evaluated in polyethylene shielding, where no local minimum is observed, to provide a comparison between the two materials. Results are presented so that the physical interactions driving build-up in dose equivalent values can be easily observed and explained. Variation of transport model results for light ions (Z ≤ 2) and neutron-induced target fragments, which contribute significantly to dose equivalent for thick shielding, is also highlighted and indicates that significant uncertainties are still present in the models for some particles. The 3DHZETRN code is then further evaluated over a range of related slab geometries to draw closer connection to more realistic scenarios. Future work will examine these related geometries in more detail.

  6. Ray-tracing critical-angle transmission gratings for the X-ray Surveyor and Explorer-size missions

    NASA Astrophysics Data System (ADS)

    Günther, Hans M.; Bautz, Marshall W.; Heilmann, Ralf K.; Huenemoerder, David P.; Marshall, Herman L.; Nowak, Michael A.; Schulz, Norbert S.

    2016-07-01

    We study a critical angle transmission (CAT) grating spectrograph that delivers a spectral resolution significantly above any X-ray spectrograph ever flown. This new technology will allow us to resolve kinematic components in absorption and emission lines of galactic and extragalactic matter down to unprecedented dispersion levels. We perform ray-trace simulations to characterize the performance of the spectrograph in the context of an X-ray Surveyor or Arcus-like layout (two mission concepts currently under study). Our newly developed ray-trace code is a tool suite to simulate the performance of X-ray observatories. The simulator code is written in Python, because the use of a high-level scripting language allows modifications of the simulated instrument design in very few lines of code. This is especially important in the early phase of mission development, when the performances of different configurations are contrasted. To reduce the run-time and allow for simulations of a few million photons in a few minutes on a desktop computer, the simulator code uses tabulated input (from theoretical models or laboratory measurements of samples) for grating efficiencies and mirror reflectivities. We find that the grating facet alignment tolerances to maintain at least 90% of the resolving power that the spectrometer has with perfect alignment are (i) translation parallel to the optical axis below 0.5 mm, (ii) rotation around the optical axis or the groove direction below a few arcminutes, and (iii) constancy of the grating period to 1:10⁵. Translations along and rotations around the remaining axes can be significantly larger than this without impacting the performance.
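    A back-of-the-envelope version of one such tolerance, the grating-period constancy, can be sketched without a full ray trace. Since the dispersed position scales as λ/P in first order, a relative period error maps one-to-one onto dλ/λ, and its blur can be added in quadrature to the intrinsic line width. The design resolving power r0 = 5000 below is an assumed value for illustration, not a number from the paper.

```python
import math

FWHM = 2.355  # converts a Gaussian RMS width to FWHM

def retained_resolving_power(r0, sigma_rel):
    """Fraction of a design resolving power r0 that survives a Gaussian
    scatter of the grating period with relative RMS error sigma_rel.

    Toy error budget only: the period-induced wavelength blur
    (d-lambda/lambda = dP/P) is added in quadrature to the intrinsic
    line width 1/r0; the paper's full ray trace handles this properly.
    """
    blur = FWHM * sigma_rel
    total = math.hypot(1.0 / r0, blur)
    return (1.0 / total) / r0

# with an assumed r0 = 5000, a 1:10^5 period scatter costs under 1%
frac = retained_resolving_power(5000, 1e-5)
```

On this crude budget a 1:10⁵ period scatter is comfortably inside the 90% retention requirement, consistent with the quoted tolerance.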

  7. Validation of a Monte Carlo code system for grid evaluation with interference effect on Rayleigh scattering

    NASA Astrophysics Data System (ADS)

    Zhou, Abel; White, Graeme L.; Davidson, Rob

    2018-02-01

    Anti-scatter grids are commonly used in x-ray imaging systems to reduce the scatter radiation reaching the image receptor. Anti-scatter grid performance and validation can be simulated through use of Monte Carlo (MC) methods. Our recently reported work has modified existing MC codes, resulting in improved performance when simulating x-ray imaging. The aim of this work is to validate the transmission of x-ray photons in grids from the recently reported new MC codes against experimental results and results previously reported in other literature. The results of this work show that the scatter-to-primary ratio (SPR) and the transmissions of primary (Tp), scatter (Ts), and total (Tt) radiation determined using this new MC code system have strong agreement with the experimental results and the results reported in the literature. Tp, Ts, Tt, and SPR determined in this new MC simulation code system are valid. These results also show that the interference effect on Rayleigh scattering should not be neglected in the evaluation of both mammographic and general grids. Our new MC simulation code system has been shown to be valid and can be used for analysing and evaluating the designs of grids.
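    The figures of merit named above follow from standard definitions once the primary and scatter fluences with and without the grid are known (from measurement or simulation). A minimal sketch with hypothetical fluence numbers:

```python
def grid_metrics(primary_no, scatter_no, primary_grid, scatter_grid):
    """Standard anti-scatter grid figures of merit from primary/scatter
    fluences with and without the grid (values here are arbitrary)."""
    Tp = primary_grid / primary_no               # primary transmission
    Ts = scatter_grid / scatter_no               # scatter transmission
    Tt = (primary_grid + scatter_grid) / (primary_no + scatter_no)
    spr = scatter_grid / primary_grid            # scatter-to-primary ratio
    return {"Tp": Tp, "Ts": Ts, "Tt": Tt, "SPR": spr,
            "selectivity": Tp / Ts,              # Sigma = Tp/Ts
            "CIF": Tp / Tt,                      # contrast improvement factor
            "Bucky": 1.0 / Tt}                   # dose penalty of the grid

# hypothetical example: a grid passing 70% of primary, 10% of scatter
m = grid_metrics(primary_no=100.0, scatter_no=400.0,
                 primary_grid=70.0, scatter_grid=40.0)
```

High selectivity with acceptable Bucky factor is the usual design trade-off these simulations evaluate.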

  8. X-Ray, EUV, UV and Optical Emissivities of Astrophysical Plasmas

    NASA Technical Reports Server (NTRS)

    Raymond, John C.; West, Donald (Technical Monitor)

    2000-01-01

    This grant primarily covered the development of the thermal X-ray emission model code called APEC, which is meant to replace the Raymond and Smith (1977) code. The new code contains far more spectral lines and a great deal of updated atomic data. The code is now available (http://hea-www.harvard.edu/APEC), though new atomic data is still being added, particularly at longer wavelengths. While initial development of the code was funded by this grant, current work is carried on by N. Brickhouse, R. Smith and D. Liedahl under separate funding. Over the last five years, the grant has provided salary support for N. Brickhouse, R. Smith, a summer student (L. McAllister), an SAO predoctoral fellow (A. Vasquez), and visits by T. Kallman, D. Liedahl, P. Ghavamian, J.M. Laming, J. Li, P. Okeke, and M. Martos. In addition to the code development, the grant supported investigations into X-ray and UV spectral diagnostics as applied to shock waves in the ISM, accreting black holes and white dwarfs, and stellar coronae. Many of these efforts are continuing. Closely related work on the shock waves and coronal mass ejections in the solar corona has grown out of the efforts supported by the grant.

  9. Modeling and design of radiative hydrodynamic experiments with X-ray Thomson Scattering measurements on NIF

    NASA Astrophysics Data System (ADS)

    Ma, K. H.; Lefevre, H. J.; Belancourt, P. X.; MacDonald, M. J.; Doeppner, T.; Keiter, P. A.; Kuranz, C. C.; Johnsen, E.

    2017-10-01

    Recent experiments at the National Ignition Facility studied the effect of radiation on shock-driven hydrodynamic instability growth. X-ray radiography images from these experiments indicate that perturbation growth is lower in highly radiative shocks compared to shocks with negligible radiation flux. The reduction in instability growth is attributed to ablation from higher temperatures in the foam for highly radiative shocks. The proposed design implements the X-ray Thomson Scattering (XRTS) technique in the radiative shock tube platform to measure electron temperatures and densities in the shocked foam. We model these experiments with CRASH, an Eulerian radiation hydrodynamics code with block-adaptive mesh refinement, multi-group radiation transport and electron heat conduction. Simulations are presented with SiO2 and carbon foams for both the high-temperature, radiative shock and the low-temperature, hydrodynamic shock cases. Calculations from CRASH provide estimates of shock speed, electron temperature, effective ionization, and other quantities needed for designing the XRTS diagnostic measurement. This work is funded by LLNL under subcontract B614207, and was performed under the auspices of the U.S. DOE by LLNL under Contract No. DE-AC52-07NA27344.

  10. Monitoring Cosmic Radiation Risk: Comparisons between Observations and Predictive Codes for Naval Aviation

    DTIC Science & Technology

    2009-01-01

    proton PARMA PHITS-based Analytical Radiation Model in the Atmosphere PCAIRE Predictive Code for Aircrew Radiation Exposure PHITS Particle and...radiation transport code utilized is called PARMA (PHITS-based Analytical Radiation Model in the Atmosphere) [36]. The particle fluxes calculated from the...same dose equivalent coefficient regulations from the ICRP-60 regulations. As a result, the transport codes utilized by EXPACS (PHITS) and CARI-6

  11. Monitoring Cosmic Radiation Risk: Comparisons Between Observations and Predictive Codes for Naval Aviation

    DTIC Science & Technology

    2009-07-05

    proton PARMA PHITS-based Analytical Radiation Model in the Atmosphere PCAIRE Predictive Code for Aircrew Radiation Exposure PHITS Particle and Heavy...transport code utilized is called PARMA (PHITS-based Analytical Radiation Model in the Atmosphere) [36]. The particle fluxes calculated from the input...dose equivalent coefficient regulations from the ICRP-60 regulations. As a result, the transport codes utilized by EXPACS (PHITS) and CARI-6 (PARMA

  12. Anisotropic imaging performance in breast tomosynthesis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Badano, Aldo; Kyprianou, Iacovos S.; Jennings, Robert J.

    We describe the anisotropy in imaging performance caused by oblique x-ray incidence in indirect detectors for breast tomosynthesis based on columnar scintillator screens. We use MANTIS, a freely available combined x-ray, electron, and optical Monte Carlo transport package which models the indirect detection processes in columnar screens, interaction by interaction. The code has been previously validated against published optical distributions. In this article, initial validation results are provided concerning the blur for particular designs of phosphor screens for which some details with respect to the columnar geometry are available from scanning electron microscopy. The polyenergetic x-ray spectrum utilized comes from a database of experimental data for three different anode/filter/kVp combinations: Mo/Mo at 28 kVp, Rh/Rh at 28 kVp, and W/Al at 42 kVp. The x-ray spectra were then filtered with breast tissue (3, 4, and 6 cm thickness), compression paddle, and support base, according to the oblique paths determined by the incidence angle. The composition of the breast tissue was a 50%/50% adipose/glandular tissue mass ratio. Results are reported on the pulse-height statistics of the light output and on spatial blur, expressed as the response of the detector to a pencil beam with a certain incidence angle. Results suggest that the response is nonsymmetrical and that the resolution properties of a tomosynthesis system vary significantly with the angle of x-ray incidence. In contrast, it is found that the noise due to the variability in the number of light photons detected per primary x-ray interaction changes by only a few percent. The anisotropy in the response is not less in screens with absorptive backings, while the noise introduced by variations in the depth-dependent light output and optical transport is larger. The results suggest that anisotropic imaging performance across the detector area can be incorporated into reconstruction algorithms for improving the image quality of breast tomosynthesis. This study also demonstrates that the assessment of image quality of breast tomosynthesis systems requires a more complete description of the detector response beyond local, center measurements of resolution and noise that assume some degree of symmetry in the detector performance.

  13. Moving from Batch to Field Using the RT3D Reactive Transport Modeling System

    NASA Astrophysics Data System (ADS)

    Clement, T. P.; Gautam, T. R.

    2002-12-01

    The public domain reactive transport code RT3D (Clement, 1997) is a general-purpose numerical code for solving coupled, multi-species reactive transport in saturated groundwater systems. The code uses MODFLOW to simulate flow and several modules of MT3DMS to simulate the advection and dispersion processes. RT3D employs an operator-split strategy, which allows the code to solve the coupled reactive transport problem in a modular fashion. The coupling between reaction and transport is defined through a separate module where the reaction equations are specified. The code supports a versatile user-defined reaction option that allows users to define their own reaction system through a Fortran-90 subroutine, known as the RT3D reaction package. Further, a utility code known as BATCHRXN allows users to independently test and debug their reaction package. To analyze a new reaction system at a batch scale, users should first run BATCHRXN to test the ability of their reaction package to model the batch data. After testing, the reaction package can simply be ported to the RT3D environment to study the model response under 1-, 2-, or 3-dimensional transport conditions. This paper presents example problems that demonstrate the methods for moving from batch to field-scale simulations using the BATCHRXN and RT3D codes. The first example describes a simple first-order reaction system for simulating the sequential degradation of tetrachloroethene (PCE) and its daughter products. The second example uses a relatively complex reaction system for describing the multiple degradation pathways of tetrachloroethane (PCA) and its daughter products. References: 1) Clement, T.P., RT3D - A modular computer code for simulating reactive multi-species transport in 3-Dimensional groundwater aquifers, Battelle Pacific Northwest National Laboratory Research Report, PNNL-SA-28967, September 1997. Available at: http://bioprocess.pnl.gov/rt3d.htm.
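    The operator-split idea behind such codes can be shown in a few lines: each time step applies a transport operator, then a reaction operator, to every species. The sketch below is not RT3D; it uses pure advection at Courant number 1 (an exact upwind shift) and an explicit-Euler first-order decay chain PCE → TCE → DCE → VC with invented rate constants.

```python
# minimal operator-split sketch: 1-D advection (Courant number 1, exact
# upwind shift) followed by a first-order sequential decay step.
# Rates and grid are illustrative, not site or RT3D data.
k = [0.05, 0.03, 0.02, 0.01]      # 1/day decay rates, PCE..VC
n, dt, steps = 50, 1.0, 30        # cells, step size [day], steps
c = [[0.0] * n for _ in k]        # concentration per species per cell
c[0][0] = 1.0                     # PCE injected at the inlet

for _ in range(steps):
    # transport operator: advect every species one cell downstream
    for s in range(len(k)):
        c[s] = [0.0] + c[s][:-1]
    c[0][0] = 1.0                 # constant-concentration inlet for PCE
    # reaction operator: explicit Euler for the sequential decay chain
    for i in range(n):
        prod = 0.0
        for s in range(len(k)):
            decay = k[s] * c[s][i] * dt
            c[s][i] += prod - decay
            prod = decay          # parent decay feeds the daughter

mass = [sum(row) for row in c]    # per-species mass in the domain
```

Because transport and reaction never see each other's internals, the reaction loop can be swapped for an arbitrary user-defined network, which is exactly the modularity the RT3D reaction-package design exploits.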

  14. Ways with Data: Understanding Coding as Writing

    ERIC Educational Resources Information Center

    Lindgren, Chris

    2017-01-01

    In this dissertation, I report findings from an exploratory case-study about Ray, a web developer, who works on a data-driven news team that finds and tells compelling stories with large sets of data. I implicate this case of Ray's coding on a data team in a writing studies epistemology, which is guided by the following question: "What might…

  15. Generic reactive transport codes as flexible tools to integrate soil organic matter degradation models with water, transport and geochemistry in soils

    NASA Astrophysics Data System (ADS)

    Jacques, Diederik; Gérard, Fréderic; Mayer, Uli; Simunek, Jirka; Leterme, Bertrand

    2016-04-01

    A large number of organic matter degradation, CO2 transport and dissolved organic matter models have been developed during the last decades. However, organic matter degradation models are in many cases strictly hard-coded in terms of organic pools, degradation kinetics and dependency on environmental variables. The scientific input of the model user is typically limited to the adjustment of input parameters. In addition, the coupling with geochemical soil processes, including aqueous speciation, pH-dependent sorption and colloid-facilitated transport, is not incorporated in many of these models, strongly limiting the scope of their application. Furthermore, the most comprehensive organic matter degradation models are combined with simplified representations of flow and transport processes in the soil system. We illustrate the capability of generic reactive transport codes to overcome these shortcomings. The formulations of reactive transport codes include a physics-based continuum representation of flow and transport processes, while biogeochemical reactions can be described as equilibrium processes constrained by thermodynamic principles and/or kinetic reaction networks. The flexibility of these types of codes allows for straightforward extension of reaction networks, permits the inclusion of new model components (e.g. organic matter pools, rate equations, parameter dependency on environmental conditions) and thereby facilitates an application-tailored implementation of organic matter degradation models and related processes. A numerical benchmark involving two reactive transport codes (HPx and MIN3P) demonstrates how the process-based simulation of transient variably saturated water flow (Richards equation), solute transport (advection-dispersion equation), heat transfer and diffusion in the gas phase can be combined with a flexible implementation of a soil organic matter degradation model. 
The benchmark includes the production of leachable organic matter and inorganic carbon in the aqueous and gaseous phases, as well as different decomposition functions with first-order, linear dependence or nonlinear dependence on a biomass pool. In addition, we show how processes such as local bioturbation (bio-diffusion) can be included implicitly through a Fickian formulation of transport of soil organic matter. Coupling soil organic matter models with generic and flexible reactive transport codes offers a valuable tool to enhance insights into coupled physico-chemical processes at different scales within the scope of C-biogeochemical cycles, possibly linked with other chemical elements such as plant nutrients and pollutants.
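    The kind of user-extensible degradation network described above reduces, in its simplest form, to first-order pools with environment-dependent rate modifiers. A toy two-pool sketch with a Q10 temperature factor and CO2 bookkeeping (all pool sizes and rate constants are invented; real implementations sit inside codes such as HPx or MIN3P):

```python
# minimal first-order soil-organic-matter decomposition sketch: two pools
# with different turnover rates, a Q10 temperature modifier, and CO2
# bookkeeping.  All numbers are illustrative assumptions.
def q10_factor(temp_c, q10=2.0, ref_c=20.0):
    """Multiplicative rate modifier: doubles per 10 C above the reference."""
    return q10 ** ((temp_c - ref_c) / 10.0)

def step(pools, rates, temp_c, dt):
    """Advance pool masses one time step; returns (new_pools, co2_flux)."""
    f = q10_factor(temp_c)
    co2 = 0.0
    new = []
    for mass, kk in zip(pools, rates):
        loss = kk * f * mass * dt
        new.append(mass - loss)
        co2 += loss            # all decomposed C respired (no humification)
    return new, co2

pools = [100.0, 900.0]         # labile, stable carbon [g C/m^2]
rates = [0.05, 0.001]          # 1/day base decomposition constants
total_co2 = 0.0
for _ in range(365):           # one year at a constant 10 C
    pools, flux = step(pools, rates, temp_c=10.0, dt=1.0)
    total_co2 += flux
```

Extending the network (extra pools, biomass-dependent kinetics, transfer between pools) only changes the reaction function, which is the flexibility argument the abstract makes for generic reactive transport codes.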

  16. 49 CFR 171.25 - Additional requirements for the use of the IMDG Code.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...) Packages containing primary lithium batteries and cells that are transported in accordance with Special Provision 188 of the IMDG Code must be marked “PRIMARY LITHIUM BATTERIES—FORBIDDEN FOR TRANSPORT ABOARD PASSENGER AIRCRAFT” or “LITHIUM METAL BATTERIES—FORBIDDEN FOR TRANSPORT ABOARD PASSENGER AIRCRAFT.” This...

  17. 77 FR 18716 - Transportation Security Administration Postal Zip Code Change; Technical Amendment

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-28

    ... organizational changes and it has no substantive effect on the public. DATES: Effective March 28, 2012. FOR... No. 1572-9] Transportation Security Administration Postal Zip Code Change; Technical Amendment AGENCY: Transportation Security Administration, DHS. ACTION: Final rule. SUMMARY: This rule is a technical change to...

  18. Fast high-energy X-ray imaging for Severe Accidents experiments on the future PLINIUS-2 platform

    NASA Astrophysics Data System (ADS)

    Berge, L.; Estre, N.; Tisseur, D.; Payan, E.; Eck, D.; Bouyer, V.; Cassiaut-Louis, N.; Journeau, C.; Tellier, R. Le; Pluyette, E.

    2018-01-01

    The future PLINIUS-2 platform of CEA Cadarache will be dedicated to the study of corium interactions in severe nuclear accidents, and will host innovative large-scale experiments. The Nuclear Measurement Laboratory of CEA Cadarache is in charge of real-time high-energy X-ray imaging set-ups for the study of the corium-water and corium-sodium interactions, and of the corium stratification process. Imaging such large and high-density objects requires a 15 MeV linear electron accelerator coupled to a tungsten target creating a high-energy Bremsstrahlung X-ray flux, with a corresponding dose rate of about 100 Gy/min at 1 m. The signal is detected by phosphor screens coupled to high-framerate scientific CMOS cameras. The imaging set-up is designed using an experimentally validated in-house simulation code (MODHERATO). The code computes quantitative radiographic signals from the description of the source, object geometry and composition, detector, and geometrical configuration (magnification factor, etc.). It accounts for several noise sources (photonic and electronic noise, Swank noise and readout noise), and for image blur due to the source spot size and to the detector unsharpness. With a view to PLINIUS-2, the simulation has been improved to account for the scattered flux, which is expected to be significant. The paper presents the scattered-flux calculation using the MCNP transport code and its integration into the MODHERATO simulation. The improved simulation is then validated through comparison with real measurement images taken on a small-scale equivalent set-up on the PLINIUS platform, with excellent agreement. This improved simulation is therefore being used to design the PLINIUS-2 imaging set-ups (source, detectors, cameras, etc.).
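    The simulation chain described (attenuation through the object, blur from spot size and detector unsharpness, then photon noise) can be caricatured in one dimension. This is a toy stand-in for MODHERATO, not its model: the attenuation coefficient, step object, photon counts, and 5-tap blur kernel are all invented, and Poisson noise is approximated as Gaussian at these count levels.

```python
import math, random

random.seed(1)  # reproducible noise realization

mu = 0.06                          # 1/mm effective attenuation (assumed)
thickness = [0.0] * 10 + [40.0] * 20 + [0.0] * 10   # step object [mm]
n0 = 5000                          # mean photons per pixel, open beam

# 1) exponential attenuation through the object
ideal = [n0 * math.exp(-mu * t) for t in thickness]

# 2) blur from source spot size + detector unsharpness (5-tap Gaussian)
kernel = [0.06, 0.24, 0.40, 0.24, 0.06]
blurred = []
for i in range(len(ideal)):
    acc = 0.0
    for j, w in enumerate(kernel):
        kidx = min(max(i + j - 2, 0), len(ideal) - 1)  # clamp at edges
        acc += w * ideal[kidx]
    blurred.append(acc)

# 3) photon noise: Poisson approximated by Gaussian(mean, sqrt(mean))
noisy = [max(0.0, random.gauss(m, math.sqrt(m))) for m in blurred]
```

A scattered-flux term, as added to MODHERATO via MCNP, would enter as an extra low-frequency background on top of `blurred` before the noise step.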

  19. GRAYSKY-A new gamma-ray skyshine code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Witts, D.J.; Twardowski, T.; Watmough, M.H.

    1993-01-01

    This paper describes a new prototype gamma-ray skyshine code GRAYSKY (Gamma-RAY SKYshine) that has been developed at BNFL, as part of an industrially based master of science course, to overcome the problems encountered with SKYSHINEII and RANKERN. GRAYSKY is a point kernel code based on the use of a skyshine response function. The scattering within source or shield materials is accounted for by the use of buildup factors. This is an approximate method of solution but one that has been shown to produce results that are acceptable for dose rate predictions on operating plants. The novel features of GRAYSKY are as follows: 1. The code is fully integrated with a semianalytical point kernel shielding code, currently under development at BNFL, which offers powerful solid-body modeling capabilities. 2. The geometry modeling also allows the skyshine response function to be used in a manner that accounts for the shielding of air-scattered radiation. 3. Skyshine buildup factors calculated using the skyshine response function have been used as well as dose buildup factors.
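    The point-kernel-plus-buildup method at the heart of such codes evaluates, for each source point, an uncollided kernel exp(-μr)/(4πr²) scaled by a buildup factor B(μr). A minimal sketch; the linear form B = 1 + a·μr is an illustrative stand-in for the tabulated Berger/Taylor or geometric-progression fits that production codes use.

```python
import math

def point_kernel_dose(source, mu, r, a=1.0):
    """Point-kernel response at distance r [same length unit as 1/mu]
    from an isotropic source, with a simple linear buildup factor
    B = 1 + a*mu*r (illustrative form, not a fitted compilation)."""
    mfp = mu * r                       # optical thickness in mean free paths
    buildup = 1.0 + a * mfp            # accounts for in-shield scattering
    return source * buildup * math.exp(-mfp) / (4.0 * math.pi * r * r)

bare = point_kernel_dose(1e9, 0.0, 2.0)       # vacuum: pure 1/(4*pi*r^2)
shielded = point_kernel_dose(1e9, 0.05, 200.0)  # 10 mean free paths of air
```

Skyshine codes fold such kernels with a response function for air-scattered radiation rather than tracking the scatter explicitly, which is why the method is fast but approximate.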

  20. Thermodynamic and transport properties of gaseous tetrafluoromethane in chemical equilibrium

    NASA Technical Reports Server (NTRS)

    Hunt, J. L.; Boney, L. R.

    1973-01-01

    Equations and a computer code are presented for the thermodynamic and transport properties of gaseous, undissociated tetrafluoromethane (CF4) in chemical equilibrium. The computer code calculates the thermodynamic and transport properties of CF4 when given any two of five thermodynamic variables (entropy, temperature, volume, pressure, and enthalpy). Equilibrium thermodynamic and transport property data are tabulated and pressure-enthalpy diagrams are presented.

  1. High-fidelity plasma codes for burn physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cooley, James; Graziani, Frank; Marinak, Marty

    Accurate predictions of equation of state (EOS) and of ionic and electronic transport properties are of critical importance for high-energy-density plasma science. Transport coefficients inform radiation-hydrodynamic codes and impact diagnostic interpretation, which in turn impacts our understanding of the development of instabilities, the overall energy balance of burning plasmas, and the efficacy of self-heating from charged-particle stopping. Important processes include thermal and electrical conduction, electron-ion coupling, inter-diffusion, ion viscosity, and charged-particle stopping. However, uncertainties in these coefficients are not well established. Fundamental plasma science codes, also called high-fidelity plasma codes (HFPC), are a relatively recent computational tool that augments both experimental data and the theoretical foundations of transport coefficients. This paper addresses the current status of HFPC codes and their future development, and the potential impact they have on improving the predictive capability of the multi-physics hydrodynamic codes used in HED design.

  2. Development of a 1.5D plasma transport code for coupling to full orbit runaway electron simulations

    NASA Astrophysics Data System (ADS)

    Lore, J. D.; Del Castillo-Negrete, D.; Baylor, L.; Carbajal, L.

    2017-10-01

    A 1.5D (1D radial transport + 2D equilibrium geometry) plasma transport code is being developed to simulate runaway electron generation, mitigation, and avoidance by coupling to the full-orbit kinetic electron transport code KORC. The 1.5D code solves the time-dependent 1D flux surface averaged transport equations with sources for plasma density, pressure, and poloidal magnetic flux, along with the Grad-Shafranov equilibrium equation for the 2D flux surface geometry. Disruption mitigation is simulated by introducing an impurity neutral gas `pellet', with impurity densities and electron cooling calculated from ionization, recombination, and line emission rate coefficients. Rapid cooling of the electrons increases the resistivity, inducing an electric field which can be used as an input to KORC. The runaway electron current is then included in the parallel Ohm's law in the transport equations. The 1.5D solver will act as a driver for coupled simulations to model effects such as timescales for thermal quench, runaway electron generation, and pellet impurity mixtures for runaway avoidance. Current progress on the code and details of the numerical algorithms will be presented. Work supported by the US DOE under DE-AC05-00OR22725.
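    The step in which impurity-driven cooling raises the resistivity and induces the electric field can be quantified with the classical Spitzer scaling η ∝ Te^(-3/2) and Ohm's law E = ηj. The sketch below uses an NRL-formulary-style parallel resistivity; the current density, Coulomb logarithm, and temperatures are illustrative assumptions, not values from the coupled KORC simulations.

```python
# Spitzer parallel resistivity and the induced electric field E = eta*j
# after an impurity-driven thermal quench.  j and lnLambda are assumed.
def spitzer_resistivity(te_ev, z_eff=1.0, coulomb_log=10.0):
    """Parallel Spitzer resistivity [Ohm*m], Te in eV
    (NRL-formulary-style scaling ~5.2e-5 * Z * lnLambda / Te^1.5)."""
    return 5.2e-5 * z_eff * coulomb_log / te_ev ** 1.5

j = 1.0e6                                  # plasma current density [A/m^2]
e_hot = spitzer_resistivity(2000.0) * j    # pre-quench, Te = 2 keV
e_cold = spitzer_resistivity(10.0) * j     # post-quench, Te = 10 eV
ratio = e_cold / e_hot                     # = (2000/10)**1.5, ~2.8e3
```

A cooling from 2 keV to 10 eV raises the induced field by three orders of magnitude, which is why rapid electron cooling is the trigger for runaway-electron generation in such models.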

  3. Benchmarking a Visual-Basic based multi-component one-dimensional reactive transport modeling tool

    NASA Astrophysics Data System (ADS)

    Torlapati, Jagadish; Prabhakar Clement, T.

    2013-01-01

    We present the details of a comprehensive numerical modeling tool, RT1D, which can be used for simulating biochemical and geochemical reactive transport problems. The code can be run within the standard Microsoft EXCEL Visual Basic platform, and it does not require any additional software tools. The code can be easily adapted by others for simulating different types of laboratory-scale reactive transport experiments. We illustrate the capabilities of the tool by solving five benchmark problems with varying levels of reaction complexity. These literature-derived benchmarks are used to highlight the versatility of the code for solving a variety of practical reactive transport problems. The benchmarks are described in detail to provide a comprehensive database, which can be used by model developers to test other numerical codes. The VBA code presented in the study is a practical tool that can be used by laboratory researchers for analyzing both batch and column datasets within an EXCEL platform.

  4. Anomalous Transport of Cosmic Rays in a Nonlinear Diffusion Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Litvinenko, Yuri E.; Fichtner, Horst; Walter, Dominik

    2017-05-20

    We investigate analytically and numerically the transport of cosmic rays following their escape from a shock or another localized acceleration site. Observed cosmic-ray distributions in the vicinity of heliospheric and astrophysical shocks imply that anomalous, superdiffusive transport plays a role in the evolution of the energetic particles. Several authors have quantitatively described the anomalous diffusion scalings implied by the data with solutions of a formal transport equation with fractional derivatives. Yet the physical basis of the fractional diffusion model remains uncertain. We explore an alternative model of cosmic-ray transport: a nonlinear diffusion equation that follows from a self-consistent treatment of the resonantly interacting cosmic-ray particles and their self-generated turbulence. The nonlinear model naturally leads to superdiffusive scalings. In the presence of convection, the model yields a power-law dependence of the particle density on the distance upstream of the shock. Although the results do not refute the use of a fractional advection-diffusion equation, they indicate a viable alternative to explain the anomalous diffusion scalings of cosmic-ray particles.

  5. Evaluating the hydraulic and transport properties of peat soil using pore network modeling and X-ray micro computed tomography

    NASA Astrophysics Data System (ADS)

    Gharedaghloo, Behrad; Price, Jonathan S.; Rezanezhad, Fereidoun; Quinton, William L.

    2018-06-01

    Micro-scale properties of peat pore space and their influence on hydraulic and transport properties of peat soils have been given little attention so far. Characterizing the variation of these properties in a peat profile can increase our knowledge on the processes controlling contaminant transport through peatlands. As opposed to the common macro-scale (or bulk) representation of groundwater flow and transport processes, a pore network model (PNM) simulates flow and transport processes within individual pores. Here, a pore network modeling code capable of simulating advective and diffusive transport processes through a 3D unstructured pore network was developed; its predictive performance was evaluated by comparing its results to empirical values and to the results of computational fluid dynamics (CFD) simulations. This is the first time that peat pore networks have been extracted from X-ray micro-computed tomography (μCT) images of peat deposits and peat pore characteristics evaluated in a 3D approach. Water flow and solute transport were modeled in the unstructured pore networks mapped directly from μCT images. The modeling results were processed to determine the bulk properties of peat deposits. Results portray the commonly observed decrease in hydraulic conductivity with depth, which was attributed to the reduction of pore radius and increase in pore tortuosity. The increase in pore tortuosity with depth was associated with more decomposed peat soil and decreasing pore coordination number with depth, which extended the flow path of fluid particles. Results also revealed that hydraulic conductivity is isotropic locally, but becomes anisotropic after upscaling to core-scale; this suggests the anisotropy of peat hydraulic conductivity observed in core-scale and field-scale is due to the strong heterogeneity in the vertical dimension that is imposed by the layered structure of peat soils. 
Transport simulations revealed that for a given solute, the effective diffusion coefficient decreases with depth due to the corresponding increase in diffusional tortuosity. The longitudinal dispersivity of peat was also computed by analyzing advection-dominated transport simulations; it is similar to empirical values reported for the same peat soil, is not sensitive to soil depth, and does not vary much along the soil profile.
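The pore-to-bulk upscaling described above can be sketched with two standard pore-scale relations: Hagen-Poiseuille conductance for flow through a pore throat, and an Archie-type scaling of effective diffusivity with porosity and tortuosity. This is an illustrative sketch of the general approach, not the authors' PNM code, and all numerical values are invented.

```python
import math

def pore_conductance(radius_m, length_m, mu=1.0e-3):
    """Hagen-Poiseuille hydraulic conductance of a cylindrical pore throat
    (flow Q = g * dP), with mu the dynamic viscosity of water in Pa*s."""
    return math.pi * radius_m**4 / (8.0 * mu * length_m)

def effective_diffusion(d_free, porosity, tortuosity):
    """Bulk effective diffusion coefficient from pore-scale quantities,
    using the common relation D_eff = D_free * porosity / tortuosity**2."""
    return d_free * porosity / tortuosity**2

# Deeper, more decomposed peat: smaller pores and higher tortuosity, so
# both conductance and effective diffusivity drop with depth.
shallow = effective_diffusion(2.0e-9, 0.90, 1.2)
deep = effective_diffusion(2.0e-9, 0.85, 2.0)
assert shallow > deep
```

In a real PNM the bulk hydraulic conductivity comes from solving a linear system over thousands of such pore conductances; the scaling behavior with radius and tortuosity is what the sketch captures.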

  6. The PHITS code for space applications: status and recent developments

    NASA Astrophysics Data System (ADS)

    Sihver, Lembit; Ploc, Ondrej; Sato, Tatsuhiko; Niita, Koji; Hashimoto, Shintaro; El-Jaby, Samy

Since COSPAR 2012, the Particle and Heavy Ion Transport code System, PHITS, has been upgraded and released to the public [1]. The code has been improved, and so have the contents of its package, such as the attached data libraries. In the new version, the intra-nuclear cascade models INCL4.6 and INC-ELF have been implemented, as well as the Kurotama model for the total reaction cross sections. The accuracy of the new reaction models for transporting galactic cosmic rays was investigated by comparison with experimental data. The incorporation of these models has improved the capabilities of PHITS to perform particle transport simulations for different space applications. A methodology for assessing the pre-mission exposure of space crew aboard the ISS has been developed in terms of an effective dose equivalent [2]. PHITS was used to calculate the particle transport of the GCR and trapped radiation through the hull of the ISS. Using the predicted spectra and fluence-to-dose conversion factors, the semi-empirical ISSCREM code [3,4,5] was then scaled to predict the effective dose equivalent. This methodology provides an opportunity for pre-flight predictions of the effective dose equivalent, which can be compared to post-flight estimates, and therefore offers a means to assess the impact of radiation exposure on ISS flight crew. We have also simulated [6] the protective curtain experiment, which was performed to test the efficiency of water-soaked hygienic tissue wipes and towels as simple and cost-effective additional spacecraft shielding. The dose from trapped particles and low-energy GCR was significantly reduced, which shows that protective curtains are effective when applied on spacecraft in LEO. The results of these benchmark calculations, as well as the mentioned applications of PHITS to space dosimetry, will be presented. [1] T. Sato et al. J. Nucl. Sci. Technol. 50, 913-923 (2013). [2] S. El-Jaby, et al. Adv. Space Res. 
doi: http://dx.doi.org/10.1016/j.asr.2013.12.022 (2013). [3] S. El-Jaby, et al. Adv. Space Res. doi: http://dx.doi.org/10.1016/j.asr.2013.10.006 (2013). [4] S. El-Jaby, et al. In proc. to the IEEE Aerospace Conference, Big Sky, MN, USA (2013). [5] S. El-Jaby, PhD Thesis, Royal Military College of Canada (2012). [6] O. Ploc, et al., Adv. Space Res. 52, 1911-1918 (2013).

  7. Multiple component codes based generalized LDPC codes for high-speed optical transport.

    PubMed

    Djordjevic, Ivan B; Wang, Ting

    2014-07-14

A class of generalized low-density parity-check (GLDPC) codes suitable for optical communications is proposed, which consists of multiple local codes. It is shown that Hamming, BCH, and Reed-Muller codes can be used as local codes, and that the maximum a posteriori probability (MAP) decoding of these local codes by the Ashikhmin-Lytsin algorithm is feasible in terms of complexity and performance. We demonstrate that record coding gains can be obtained from properly designed GLDPC codes derived from multiple component codes. We then show that several recently proposed classes of LDPC codes, such as convolutional and spatially coupled codes, can be described using the concept of GLDPC coding, which indicates that GLDPC coding can be used as a unified platform for advanced FEC enabling ultra-high-speed optical transport. The proposed class of GLDPC codes is also suitable for code-rate adaptation, to adjust the error correction strength depending on the optical channel conditions.
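The trade-off between local-code strength and overall rate can be illustrated with the standard lower bound on the rate of a regular GLDPC code. The Hamming(15, 11) component code below is only an example choice, not necessarily one of the designs evaluated in the paper.

```python
def gldpc_rate_lower_bound(n_local, k_local, checks_per_bit=2):
    """Lower bound on the rate of a regular GLDPC code in which every
    variable node participates in `checks_per_bit` local (n, k) block
    codes: R >= 1 - checks_per_bit * (1 - k/n)."""
    return 1.0 - checks_per_bit * (1.0 - k_local / n_local)

# Hamming(15, 11) local codes, two local codes per bit:
rate = gldpc_rate_lower_bound(15, 11)
assert abs(rate - 7.0 / 15.0) < 1e-12  # R >= 7/15, about 0.467
```

Stronger local codes (lower k/n) buy coding gain at the price of rate, which is one way code-rate adaptation can be realized: swap the component codes while keeping the global graph.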

  8. Effect of the diffusion parameters on the observed γ-ray spectrum of sources and their contribution to the local all-electron spectrum: The EDGE code

    NASA Astrophysics Data System (ADS)

    López-Coto, R.; Hahn, J.; BenZvi, S.; Dingus, B.; Hinton, J.; Nisa, M. U.; Parsons, R. D.; Greus, F. Salesa; Zhang, H.; Zhou, H.

    2018-11-01

The positron excess measured by PAMELA and AMS can only be explained if one or more sources are injecting positrons. Moreover, at the highest energies, it requires the presence of nearby (∼hundreds of parsecs) and middle-aged (at most ∼hundreds of kyr) sources. Pulsars, as factories of electrons and positrons, are one of the proposed candidates to explain the origin of this excess. To calculate the contribution of these sources to the electron and positron flux at the Earth, we developed EDGE (Electron Diffusion and Gamma rays to the Earth), a code to treat the propagation of electrons and compute their diffusion from a central source with a flexible injection spectrum. Using this code, we can derive the source's gamma-ray spectrum and spatial extension, the all-electron density in space, the electron and positron flux reaching the Earth, and the positron fraction measured at the Earth. We present in this paper the foundations of the code and study how different parameters affect the gamma-ray spectrum of a source and the electron flux measured at the Earth. We also study the effect of several approximations commonly made in these studies. This code has been used to derive the results on the positron flux measured at the Earth in [1].
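The simplest limit of the propagation such a code must handle is the Green's function of the 3D diffusion equation for an impulsive point source with a constant diffusion coefficient; codes like EDGE add energy-dependent diffusion and radiative losses on top of this. The sketch below is only that textbook limit, with invented parameter values.

```python
import math

def burst_density(r_pc, t_kyr, diff_pc2_per_kyr, n0=1.0):
    """Electron density at distance r from an impulsive point source a
    time t after injection, for a constant diffusion coefficient D:
        n(r, t) = n0 / (pi**1.5 * r_d**3) * exp(-(r/r_d)**2),
    with diffusion radius r_d = 2*sqrt(D*t)."""
    r_d = 2.0 * math.sqrt(diff_pc2_per_kyr * t_kyr)
    return n0 / (math.pi**1.5 * r_d**3) * math.exp(-((r_pc / r_d) ** 2))

# density falls off with distance from a middle-aged source, and the
# central density drops as the diffusion radius grows with time:
assert burst_density(10.0, 100.0, 1.0) > burst_density(100.0, 100.0, 1.0)
assert burst_density(0.0, 100.0, 1.0) > burst_density(0.0, 200.0, 1.0)
```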

  9. Accelerator test of the coded aperture mask technique for gamma-ray astronomy

    NASA Technical Reports Server (NTRS)

    Jenkins, T. L.; Frye, G. M., Jr.; Owens, A.; Carter, J. N.; Ramsden, D.

    1982-01-01

    A prototype gamma-ray telescope employing the coded aperture mask technique has been constructed and its response to a point source of 20 MeV gamma-rays has been measured. The point spread function is approximately a Gaussian with a standard deviation of 12 arc minutes. This resolution is consistent with the cell size of the mask used and the spatial resolution of the detector. In the context of the present experiment, the error radius of the source position (90 percent confidence level) is 6.1 arc minutes.

  10. Noiseless coding for the Gamma Ray spectrometer

    NASA Technical Reports Server (NTRS)

    Rice, R.; Lee, J. J.

    1985-01-01

    The payload of several future unmanned space missions will include a sophisticated gamma ray spectrometer. Severely constrained data rates during certain portions of these missions could limit the possible science return from this instrument. This report investigates the application of universal noiseless coding techniques to represent gamma ray spectrometer data more efficiently without any loss in data integrity. Performance results demonstrate compression factors from 2.5:1 to 20:1 in comparison to a standard representation. Feasibility was also demonstrated by implementing a microprocessor breadboard coder/decoder using an Intel 8086 processor.
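The flavor of universal noiseless coding investigated here can be illustrated with a minimal Golomb-Rice coder, the family of codes Rice's techniques build on. This is a generic textbook sketch, not the flight algorithm, and the sample values are invented.

```python
def rice_encode(values, k):
    """Golomb-Rice encode nonnegative integers with parameter k >= 1:
    each value v is written as q = v >> k in unary (q ones, then a zero)
    followed by the k low-order bits of v. This is efficient when the
    values cluster near zero, as differenced spectrometer counts do."""
    out = []
    for v in values:
        q, r = v >> k, v & ((1 << k) - 1)
        out.append("1" * q + "0" + format(r, f"0{k}b"))
    return "".join(out)

def rice_decode(bits, k, count):
    """Invert rice_encode for `count` values."""
    out, i = [], 0
    for _ in range(count):
        q = 0
        while bits[i] == "1":
            q, i = q + 1, i + 1
        i += 1                        # skip the unary terminator
        out.append((q << k) | int(bits[i:i + k], 2))
        i += k
    return out

deltas = [0, 1, 3, 2, 7, 0, 1]        # e.g. channel-to-channel differences
packed = rice_encode(deltas, k=2)
assert rice_decode(packed, 2, len(deltas)) == deltas
assert len(packed) < 8 * len(deltas)  # shorter than 8 bits per sample
```

Adapting k to the local statistics of the data is what makes such schemes "universal"; the compression is lossless, so data integrity is preserved exactly as the report requires.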

  11. Heavy ion contributions to organ dose equivalent for the 1977 galactic cosmic ray spectrum

    NASA Astrophysics Data System (ADS)

    Walker, Steven A.; Townsend, Lawrence W.; Norbury, John W.

    2013-05-01

Estimates of organ dose equivalents for the skin, eye lens, blood forming organs, central nervous system, and heart of female astronauts from exposures to the 1977 solar minimum galactic cosmic radiation spectrum for various shielding geometries involving simple spheres and locations within the Space Transportation System (space shuttle) and the International Space Station (ISS) are made using the HZETRN 2010 space radiation transport code. The dose equivalent contributions are broken down by charge group in order to better understand the sources of the exposures to these organs. For thin shields, contributions from ions heavier than alpha particles comprise at least half of the organ dose equivalent. For thick shields, such as the ISS locations, heavy ions contribute less than 30% and in some cases less than 10% of the organ dose equivalent. Secondary neutron production contributions in thick shields also tend to be as large as, or larger than, the heavy ion contributions to the organ dose equivalents.

  12. MCNP modelling of the wall effects observed in tissue-equivalent proportional counters.

    PubMed

    Hoff, J L; Townsend, L W

    2002-01-01

Tissue-equivalent proportional counters (TEPCs) utilise tissue-equivalent materials to represent homogeneous microscopic volumes of human tissue. Although both the walls and the gas simulate the same medium, they respond to radiation differently. Density differences between the two materials cause distortions, or wall effects, in measurements, with the dominant effect caused by delta rays. This study uses a Monte Carlo transport code, MCNP, to simulate the transport of secondary electrons within a TEPC. The Rudd model, a singly differential cross section with no dependence on electron direction, is used to describe the energy spectrum obtained by the impact of two iron beams on water. Based on the models used in this study, a wall-less TEPC had a higher lineal energy (keV.micron-1) as a function of impact parameter than a solid-wall TEPC for the iron beams under consideration. An important conclusion of this study is that MCNP has the ability to model the wall effects observed in TEPCs.

  13. Constraining heat-transport models by comparison to experimental data in a NIF hohlraum

    NASA Astrophysics Data System (ADS)

    Farmer, W. A.; Jones, O. S.; Barrios Garcia, M. A.; Koning, J. M.; Kerbel, G. D.; Strozzi, D. J.; Hinkel, D. E.; Moody, J. D.; Suter, L. J.; Liedahl, D. A.; Moore, A. S.; Landen, O. L.

    2017-10-01

The accurate simulation of hohlraum plasma conditions is important for predicting the partition of energy and the symmetry of the x-ray field within a hohlraum. Electron heat transport within the hohlraum plasma is difficult to model due to the complex interaction of kinetic plasma effects, magnetic fields, laser-plasma interactions, and microturbulence. Here, we report simulation results using the radiation-hydrodynamic code HYDRA, utilizing various physics packages (e.g., the nonlocal Schurtz model, MHD, flux limiters), and compare to data from hohlraum plasma experiments that contain a Mn-Co tracer dot. In these experiments, the dot is placed in various positions in the hohlraum in order to assess the spatial variation of plasma conditions. Simulated data are compared to a variety of experimental diagnostics. Conclusions are given concerning how the experimental data do and do not constrain the physics models examined. This work was supported by the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  14. Ford Motor Company NDE facility shielding design.

    PubMed

    Metzger, Robert L; Van Riper, Kenneth A; Jones, Martin H

    2005-01-01

Ford Motor Company proposed the construction of a large non-destructive evaluation laboratory for radiography of automotive power train components. The authors were commissioned to design the shielding and to survey the completed facility for compliance with radiation dose limits for occupationally and non-occupationally exposed personnel. The two X-ray sources are Varian Linatron 3000 accelerators operating at 9-11 MV. One performs computed tomography of automotive transmissions, while the other does real-time radiography of operating engines and transmissions. The shield thicknesses for the primary barrier and all secondary barriers were determined by point-kernel techniques. Point-kernel techniques did not work well for skyshine calculations and for locations where multiple sources (e.g. tube head leakage and various scatter fields) affected doses. Shielding for these areas was determined using transport calculations. A number of MCNP [Briesmeister, J. F. MCNP: a general Monte Carlo N-particle transport code, version 4B. Los Alamos National Laboratory Manual (1997)] calculations focused on skyshine estimates and the office areas. Measurements on the operational facility confirmed the shielding calculations.
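A point-kernel barrier estimate of the kind used here reduces to exponential attenuation with a buildup factor and inverse-square geometry. The sketch below shows that relation in its simplest slab form; the numbers and the constant buildup factor are illustrative, and, as the abstract notes, skyshine and multi-source scatter require full transport calculations instead.

```python
import math

def point_kernel_rate(source, mu_per_cm, thickness_cm, distance_cm, buildup=1.0):
    """Point-kernel estimate behind a slab shield:
        rate = S * B * exp(-mu * t) / (4 * pi * r**2)
    with S the source emission rate, B a buildup factor crudely accounting
    for in-shield scatter, mu the linear attenuation coefficient, t the
    slab thickness, and r the source-to-point distance."""
    return (source * buildup * math.exp(-mu_per_cm * thickness_cm)
            / (4.0 * math.pi * distance_cm**2))

# a thicker barrier lowers the rate; doubling distance quarters it
assert point_kernel_rate(1e10, 0.5, 30.0, 400.0) < point_kernel_rate(1e10, 0.5, 20.0, 400.0)
assert math.isclose(point_kernel_rate(1e10, 0.5, 20.0, 800.0) * 4.0,
                    point_kernel_rate(1e10, 0.5, 20.0, 400.0))
```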

  15. Sensitivity Analysis of Cf-252 (sf) Neutron and Gamma Observables in CGMF

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carter, Austin Lewis; Talou, Patrick; Stetcu, Ionel

CGMF is a Monte Carlo code that simulates the decay of primary fission fragments by emission of neutrons and gamma rays, according to the Hauser-Feshbach equations. As the CGMF code was recently integrated into the MCNP6.2 transport code, great emphasis has been placed on providing optimal parameters to CGMF such that many different observables are accurately represented. Of these observables, the prompt neutron spectrum, prompt neutron multiplicity, prompt gamma spectrum, and prompt gamma multiplicity are crucial for accurate transport simulations of criticality and nonproliferation applications. This contribution to the ongoing efforts to improve CGMF presents a study of the sensitivity of various neutron and gamma observables to several input parameters for Californium-252 spontaneous fission. Among the most influential parameters are those that affect the input yield distributions in fragment mass and total kinetic energy (TKE). A new scheme for representing Y(A,TKE) was implemented in CGMF using three fission modes, S1, S2 and SL. The sensitivity profiles were calculated for 17 total parameters, which show that the neutron multiplicity distribution is strongly affected by the TKE distribution of the fragments. The total excitation energy (TXE) of the fragments is shared according to a parameter RT, which is defined as the ratio of the light to heavy initial temperatures. The sensitivity profile of the neutron multiplicity shows a second order effect of RT on the mean neutron multiplicity. A final sensitivity profile was produced for the parameter alpha, which affects the spin of the fragments. Higher values of alpha lead to higher fragment spins, which inhibit the emission of neutrons. Understanding the sensitivity of the prompt neutron and gamma observables to the many CGMF input parameters provides a platform for the optimization of these parameters.

  16. Dose calculations at high altitudes and in deep space with GEANT4 using BIC and JQMD models for nucleus nucleus reactions

    NASA Astrophysics Data System (ADS)

    Sihver, L.; Matthiä, D.; Koi, T.; Mancusi, D.

    2008-10-01

Radiation exposure of aircrew is increasingly recognized as an occupational hazard. The ionizing environment at standard commercial aircraft flight altitudes consists mainly of secondary particles, of which the neutrons give a major contribution to the dose equivalent. Accurate estimations of neutron spectra in the atmosphere are therefore essential for correct calculations of aircrew doses. Energetic solar particle events (SPE) could also lead to significantly increased dose rates, especially on routes close to the North Pole, e.g. for flights between Europe and the USA. It is also well known that the radiation environment encountered by personnel aboard low Earth orbit (LEO) spacecraft or aboard a spacecraft traveling outside the Earth's protective magnetosphere is much harsher than that within the atmosphere, since the personnel are exposed to radiation from both galactic cosmic rays (GCR) and SPE. The relative contribution to the dose from GCR when traveling outside the Earth's magnetosphere, e.g. to the Moon or Mars, is even greater, and reliable and accurate particle and heavy ion transport codes are essential to calculate the radiation risks for both aircrew and personnel on spacecraft. We have therefore performed calculations of neutron distributions in the atmosphere, total dose equivalents, and quality factors at different depths in a water sphere in an imaginary spacecraft during solar minimum in a geosynchronous orbit. The calculations were performed with the GEANT4 Monte Carlo (MC) code using both the binary cascade (BIC) model, which is part of the standard GEANT4 package, and the JQMD model, which is used in the particle and heavy ion transport code PHITS.

  17. Modelling of the EAST lower-hybrid current drive experiment using GENRAY/CQL3D and TORLH/CQL3D

    NASA Astrophysics Data System (ADS)

    Yang, C.; Bonoli, P. T.; Wright, J. C.; Ding, B. J.; Parker, R.; Shiraiwa, S.; Li, M. H.

    2014-12-01

The coupled GENRAY-CQL3D code has been used to perform systematic ray-tracing and Fokker-Planck analysis for EAST Lower Hybrid wave Current Drive (LHCD) experiments. Despite being in the weak absorption regime, the experimental level of LH current drive is successfully simulated by taking into account the variations in the parallel wavenumber due to the toroidal effect. The effect of radial transport of the fast LH electrons in EAST has also been studied, which shows that a modest amount of radial transport diffusion can redistribute the fast LH current significantly. Taking advantage of the new capability in GENRAY, the actual Scrape Off Layer (SOL) model with magnetic field, density, temperature, and geometry is included in the simulation for both the lower- and higher-density cases, so that the collisional losses of Lower Hybrid Wave (LHW) power in the SOL are accounted for; together with fast electron losses, this reproduces the LHCD experimental observations in different discharges of EAST. We have also analyzed EAST discharges where there is a significant ohmic contribution to the total current, and good agreement with experiment in terms of total current has been obtained. Also, the full-wave code TORLH has been used for the simulation of the LH physics in EAST, including full-wave effects such as diffraction and focusing, which may also play an important role in bridging the spectral gap. Comparisons between the GENRAY and TORLH codes are done for both the Maxwellian and the quasi-linear electron Landau damping cases. These simulations represent an important addition to the validation studies of the GENRAY-CQL3D and TORLH models being used in weak absorption scenarios of tokamaks with large aspect ratio.

  18. THE EFFECT OF INTERMITTENT GYRO-SCALE SLAB TURBULENCE ON PARALLEL AND PERPENDICULAR COSMIC-RAY TRANSPORT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Le Roux, J. A.

Earlier work based on nonlinear guiding center (NLGC) theory suggested that perpendicular cosmic-ray transport is diffusive when cosmic rays encounter random three-dimensional magnetohydrodynamic turbulence dominated by uniform two-dimensional (2D) turbulence with a minor uniform slab turbulence component. In this approach large-scale perpendicular cosmic-ray transport is due to cosmic rays microscopically diffusing along the meandering magnetic field dominated by 2D turbulence because of gyroresonant interactions with slab turbulence. However, turbulence in the solar wind is intermittent, and it has been suggested that intermittent turbulence might be responsible for the observation of 'dropout' events in solar energetic particle fluxes on small scales. In a previous paper Le Roux et al. suggested, using NLGC theory as a basis, that if gyro-scale slab turbulence is intermittent, large-scale perpendicular cosmic-ray transport in weak uniform 2D turbulence will be superdiffusive or subdiffusive depending on the statistical characteristics of the intermittent slab turbulence. In this paper we expand and refine our previous work by investigating how both parallel and perpendicular transport are affected by intermittent slab turbulence for weak as well as strong uniform 2D turbulence. The main new finding is that both parallel and perpendicular transport are the net effect of an interplay between diffusive and nondiffusive (superdiffusive or subdiffusive) transport effects as a consequence of this intermittency.

  19. Design and performance of coded aperture optical elements for the CESR-TA x-ray beam size monitor

    NASA Astrophysics Data System (ADS)

    Alexander, J. P.; Chatterjee, A.; Conolly, C.; Edwards, E.; Ehrlichman, M. P.; Flanagan, J. W.; Fontes, E.; Heltsley, B. K.; Lyndaker, A.; Peterson, D. P.; Rider, N. T.; Rubin, D. L.; Seeley, R.; Shanks, J.

    2014-12-01

We describe the design and performance of optical elements for an x-ray beam size monitor (xBSM), a device measuring e+ and e- beam sizes in the CESR-TA storage ring. The device can measure vertical beam sizes of 10-100 μm on a turn-by-turn, bunch-by-bunch basis at e± beam energies of 2-5 GeV. X-rays produced by a hard-bend magnet pass through a single- or multiple-slit (coded aperture) optical element onto a detector. The coded aperture slit pattern and the thickness of the masking material forming that pattern can both be tuned for optimal resolving power. We describe several such optical elements and show how well predictions of simple models track measured performance.
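The principle behind a multiple-slit (coded aperture) element is that many open cells give photon throughput while the overlapped slit shadows are unfolded by correlation with a decoding array. A one-dimensional sketch with a 7-cell cyclic mask illustrates this; the mask here is a toy difference-set pattern, not the CESR-TA design.

```python
def encode(scene, mask):
    """Detector record: each open mask cell projects a shifted copy of
    the scene (cyclic convolution of the scene with the mask)."""
    n = len(mask)
    return [sum(scene[j] * mask[(i - j) % n] for j in range(n))
            for i in range(n)]

def decode(detector, mask):
    """Unfold the record by correlating with a +1/-1 decoding array; for
    a mask built from a cyclic difference set, a point source comes back
    as a sharp peak at its true position over a flat background."""
    n = len(mask)
    decoder = [1 if m else -1 for m in mask]
    return [sum(detector[i] * decoder[(i - k) % n] for i in range(n))
            for k in range(n)]

mask = [0, 1, 1, 0, 1, 0, 0]     # open cells at the quadratic residues mod 7
scene = [0, 0, 0, 5, 0, 0, 0]    # toy point source at position 3
rec = decode(encode(scene, mask), mask)
assert rec.index(max(rec)) == 3  # source position recovered
```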

  20. A Monte Carlo Code for Relativistic Radiation Transport Around Kerr Black Holes

    NASA Technical Reports Server (NTRS)

    Schnittman, Jeremy David; Krolik, Julian H.

    2013-01-01

    We present a new code for radiation transport around Kerr black holes, including arbitrary emission and absorption mechanisms, as well as electron scattering and polarization. The code is particularly useful for analyzing accretion flows made up of optically thick disks and optically thin coronae. We give a detailed description of the methods employed in the code and also present results from a number of numerical tests to assess its accuracy and convergence.

  1. Viscosity and inertia in cosmic-ray transport - Effects of an average magnetic field

    NASA Technical Reports Server (NTRS)

    Williams, L. L.; Jokipii, J. R.

    1991-01-01

A generalized transport equation is introduced which describes the transport and propagation of cosmic rays in a magnetized, collisionless medium. The equation is valid if the cosmic-ray distribution function is nearly isotropic in momentum, if the ratio of fluid speed to particle speed is small, and if the ratio of the collision time to the timescale for changes in the macroscopic flow is small. Five independent cosmic-ray viscosity coefficients are found, and the relationship of this viscosity to particle orbits in a magnetic field is presented.

  2. Benchmarking of Neutron Production of Heavy-Ion Transport Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Remec, Igor; Ronningen, Reginald M.; Heilbronn, Lawrence

Accurate prediction of radiation fields generated by heavy ion interactions is important in medical applications, space missions, and in the design and operation of rare isotope research facilities. In recent years, several well-established computer codes in widespread use for particle and radiation transport calculations have been equipped with the capability to simulate heavy ion transport and interactions. To assess and validate these capabilities, we performed simulations of a series of benchmark-quality heavy ion experiments with the computer codes FLUKA, MARS15, MCNPX, and PHITS. We focus on the comparisons of secondary neutron production. Results are encouraging; however, further improvements in models and codes and additional benchmarking are required.

  3. Benchmarking of Heavy Ion Transport Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Remec, Igor; Ronningen, Reginald M.; Heilbronn, Lawrence

Accurate prediction of radiation fields generated by heavy ion interactions is important in medical applications, space missions, and in the design and operation of rare isotope research facilities. In recent years, several well-established computer codes in widespread use for particle and radiation transport calculations have been equipped with the capability to simulate heavy ion transport and interactions. To assess and validate these capabilities, we performed simulations of a series of benchmark-quality heavy ion experiments with the computer codes FLUKA, MARS15, MCNPX, and PHITS. We focus on the comparisons of secondary neutron production. Results are encouraging; however, further improvements in models and codes and additional benchmarking are required.

  4. Treating voxel geometries in radiation protection dosimetry with a patched version of the Monte Carlo codes MCNP and MCNPX.

    PubMed

    Burn, K W; Daffara, C; Gualdrini, G; Pierantoni, M; Ferrari, P

    2007-01-01

The question of Monte Carlo simulation of radiation transport in voxel geometries is addressed. Patched versions of the MCNP and MCNPX codes are developed, aimed at transporting radiation both in the standard geometry mode and in a voxel geometry treatment. The patched code reads an unformatted FORTRAN file derived from DICOM format data and uses special subroutines to handle voxel-to-voxel radiation transport. The various phases of the development of the methodology are discussed, together with the new input options. Examples are given of the use of the code in internal and external dosimetry, and comparisons with results from other groups are reported.
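The heart of any voxel-to-voxel transport treatment is a fast mapping from a particle position to voxel indices, plus the distance to the next voxel face along the flight direction. A minimal sketch of those two lookups, assuming a uniform voxel grid and hypothetical helper names (this is not the patched MCNP code):

```python
def voxel_index(position, origin, voxel_size):
    """Map a position to the (i, j, k) indices of the containing voxel,
    the lookup a voxel patch performs whenever a particle moves."""
    return tuple(int((p - o) // s)
                 for p, o, s in zip(position, origin, voxel_size))

def distance_to_next_face(position, direction, origin, voxel_size):
    """Distance along `direction` to the nearest voxel face; axes with a
    zero direction component are skipped (the caller guarantees at least
    one nonzero component)."""
    dists = []
    for p, d, o, s in zip(position, direction, origin, voxel_size):
        if d == 0.0:
            continue
        i = int((p - o) // s)
        face = o + (i + (1 if d > 0 else 0)) * s
        dists.append((face - p) / d)
    return min(dists)

# a particle at (2.5, 0.3, 7.9) in a unit grid sits in voxel (2, 0, 7)
assert voxel_index((2.5, 0.3, 7.9), (0.0, 0.0, 0.0), (1.0, 1.0, 1.0)) == (2, 0, 7)
```

Per-voxel material and density arrays (derived from the DICOM data) are then indexed with the same (i, j, k) tuple at every boundary crossing.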

  5. Development of Parallel Computing Framework to Enhance Radiation Transport Code Capabilities for Rare Isotope Beam Facility Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kostin, Mikhail; Mokhov, Nikolai; Niita, Koji

A parallel computing framework has been developed for use with general-purpose radiation transport codes. The framework was implemented as a C++ module that uses MPI for message passing. It is intended to be used with older radiation transport codes implemented in Fortran 77, Fortran 90, or C. The module is largely independent of the radiation transport codes it is used with, and is connected to the codes by means of a number of interface functions. The framework was developed and tested in conjunction with the MARS15 code. It is possible to use it with other codes, such as PHITS, FLUKA, and MCNP, after certain adjustments. Besides the parallel computing functionality, the framework offers a checkpoint facility that allows restarting calculations from a saved checkpoint file. The checkpoint facility can be used in single-process calculations as well as in the parallel regime. The framework corrects some of the known problems with scheduling and load balancing found in the original implementations of the parallel computing functionality in MARS15 and PHITS. The framework can be used efficiently on homogeneous systems and on networks of workstations, where interference from other users is possible.
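The checkpoint facility can be illustrated with a toy Monte Carlo tally loop that periodically saves its history count, running tally, and RNG state, so a restarted job resumes where it stopped rather than repeating histories. This is a Python sketch of the idea only; the actual framework is a C++/MPI module.

```python
import os
import pickle
import random

def run_histories(total, checkpoint, every=1000):
    """Toy Monte Carlo tally loop with checkpoint/restart: state is
    saved every `every` histories, and an existing checkpoint file is
    loaded on startup so an interrupted job resumes from it.
    Returns the mean score per history."""
    done, tally = 0, 0.0
    if os.path.exists(checkpoint):
        with open(checkpoint, "rb") as f:
            done, tally, rng_state = pickle.load(f)
        random.setstate(rng_state)   # continue the same random stream
    while done < total:
        tally += random.random()     # stand-in for one particle history
        done += 1
        if done % every == 0:
            with open(checkpoint, "wb") as f:
                pickle.dump((done, tally, random.getstate()), f)
    return tally / total
```

Restoring the RNG state along with the tally is the essential point: it keeps the resumed random stream identical to an uninterrupted run, so results are reproducible.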

  6. Practical Applications of Cosmic Ray Science: Spacecraft, Aircraft, Ground Based Computation and Control Systems and Human Health and Safety

    NASA Technical Reports Server (NTRS)

    Atwell, William; Koontz, Steve; Normand, Eugene

    2012-01-01

In this paper we review the discovery of cosmic ray effects on the performance and reliability of microelectronic systems and on human health and safety, as well as the development of the engineering and health science tools used to evaluate and mitigate cosmic ray effects in earth surface, atmospheric flight, and space flight environments. Three twentieth century technological developments, 1) high altitude commercial and military aircraft; 2) manned and unmanned spacecraft; and 3) increasingly complex and sensitive solid state micro-electronics systems, have driven an ongoing evolution of basic cosmic ray science into a set of practical engineering tools (e.g. ground based test methods as well as high energy particle transport and reaction codes) needed to design, test, and verify the safety and reliability of modern complex electronic systems as well as the effects on human health and safety. The effects of primary cosmic ray particles, and of secondary particle showers produced by nuclear reactions with spacecraft materials, can determine the design and verification processes (as well as the total dollar cost) for manned and unmanned spacecraft avionics systems. Similar considerations apply to commercial and military aircraft operating at high latitudes and altitudes near the atmospheric Pfotzer maximum. Even ground based computational and control systems can be negatively affected by secondary particle showers at the Earth's surface, especially if the net target area of the sensitive electronic system components is large. Accumulation of both primary cosmic ray and secondary cosmic ray induced particle shower radiation dose is an important health and safety consideration for commercial or military air crews operating at high altitude/latitude and is also one of the most important factors presently limiting manned space flight operations beyond low-Earth orbit (LEO).

  7. Accurate Modeling of the Terrestrial Gamma-Ray Background for Homeland Security Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sandness, Gerald A.; Schweppe, John E.; Hensley, Walter K.

    2009-10-24

The Pacific Northwest National Laboratory has developed computer models to simulate the use of radiation portal monitors to screen vehicles and cargo for the presence of illicit radioactive material. The gamma radiation emitted by the vehicles or cargo containers must often be measured in the presence of a relatively large gamma-ray background, mainly due to the presence of potassium, uranium, and thorium (and progeny isotopes) in the soil and surrounding building materials. This large background is often a significant limit to the detection sensitivity for items of interest and must be modeled accurately for analyzing homeland security situations. Calculations of the expected gamma-ray emission from a disk of soil and asphalt were made using the Monte Carlo transport code MCNP and were compared to measurements made at a seaport with a high-purity germanium detector. Analysis revealed that the energy spectrum of the measured background could not be reproduced unless the model included gamma rays coming from the ground out to distances of at least 300 m. The contribution from beyond about 50 m was primarily due to gamma rays that scattered in the air before entering the detectors rather than passing directly from the ground to the detectors. These skyshine gamma rays contribute tens of percent to the total gamma-ray spectrum, primarily at energies below a few hundred keV. The techniques that were developed to efficiently calculate the contributions from a large soil disk and a large air volume in a Monte Carlo simulation are described, and the implications of skyshine in portal monitoring applications are discussed.

  8. Modeling study of a proposed field calibration source using K-40 and high-Z targets for sodium iodide detectors

    DOE PAGES

    Rogers, Jeremy; Marianno, Craig; Kallenbach, Gene; ...

    2016-06-01

Calibration sources based on the primordial isotope potassium-40 (40K) have reduced controls on the source's activity due to its terrestrial ubiquity and very low specific activity. Potassium-40's beta emissions and 1,460.8 keV gamma ray can be used to induce K-shell fluorescence x rays in high-Z metals between 60 and 80 keV. A gamma ray calibration source that uses potassium chloride salt and a high-Z metal to create a two-point calibration for a sodium iodide field gamma spectroscopy instrument is thus proposed. The calibration source was designed in collaboration with the Sandia National Laboratory using the Monte Carlo N-Particle eXtended (MCNPX) transport code. Two methods of x-ray production were explored. First, a thin high-Z layer (HZL) was interposed between the detector and the potassium chloride-urethane source matrix. Second, bismuth metal powder was homogeneously mixed with a urethane binding agent to form a potassium chloride-bismuth matrix (KBM). The bismuth-based source was selected as the development model because it is inexpensive, nontoxic, and outperforms the high-Z layer method in simulation. As a result, based on the MCNPX studies, sealing a mixture of bismuth powder and potassium chloride into a thin plastic case could provide a light, inexpensive field calibration source.
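With two well-separated peaks, the field calibration itself is a linear two-point fit from channel number to energy. The channel numbers below are invented, and 77.1 keV is used as a stand-in for a bismuth K x-ray in the 60-80 keV window mentioned above.

```python
def two_point_calibration(ch_lo, e_lo_kev, ch_hi, e_hi_kev):
    """Linear channel-to-energy calibration from two identified peaks;
    returns (gain in keV/channel, offset in keV)."""
    gain = (e_hi_kev - e_lo_kev) / (ch_hi - ch_lo)
    offset = e_lo_kev - gain * ch_lo
    return gain, offset

# invented channel numbers; 77.1 keV stands in for the high-Z K x-ray
# peak and 1,460.8 keV is the 40K gamma-ray line
gain, offset = two_point_calibration(50, 77.1, 975, 1460.8)
assert abs(gain * 50 + offset - 77.1) < 1e-9
assert abs(gain * 975 + offset - 1460.8) < 1e-9
```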

  9. Measurement of Cerenkov radiation induced by the gamma-rays of Co-60 therapy units using wavelength shifting fiber.

    PubMed

    Jang, Kyoung Won; Shin, Sang Hun; Kim, Seon Geun; Kim, Jae Seok; Yoo, Wook Jae; Ji, Young Hoon; Lee, Bongsoo

    2014-04-21

    In this study, a wavelength shifting fiber that shifts ultra-violet and blue light to green light was employed as the sensor probe of a fiber-optic Cerenkov radiation sensor. In order to characterize the Cerenkov radiation generated in the developed wavelength shifting fiber and in a plastic optical fiber, the spectra and intensities of Cerenkov radiation were measured with a spectrometer. The spectral peaks of the light outputs from the wavelength shifting fiber and the plastic optical fiber were measured at wavelengths of 500 and 510 nm, respectively, and the intensity of the transmitted light output of the wavelength shifting fiber was 22.2 times higher than that of the plastic optical fiber. Also, electron fluxes and total energy depositions of the gamma-ray beams from a Co-60 therapy unit were calculated as a function of water depth using the Monte Carlo N-particle transport code. The relationship between the flux of electrons above the Cerenkov threshold energy and the energy deposition of the Co-60 gamma-ray beams is very nearly an identity function. Finally, percentage depth doses for the gamma-ray beams were obtained using the fiber-optic Cerenkov radiation sensor, and the results were compared with those obtained by an ionization chamber. The average dose difference between the results of the fiber-optic Cerenkov radiation sensor and those of the ionization chamber was about 2.09%.
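    The Cerenkov threshold energy referred to above follows from the emission condition β > 1/n. A short sketch, assuming water (n ≈ 1.33) as the medium for the depth-dose comparison; the roughly 0.26 MeV result is the commonly quoted electron threshold in water:

```python
import math

M_E_C2 = 0.511  # electron rest energy, MeV

def cerenkov_threshold_mev(n):
    """Kinetic energy above which an electron emits Cerenkov light.

    Condition: beta > 1/n, i.e. gamma > n / sqrt(n**2 - 1).
    """
    gamma = n / math.sqrt(n * n - 1.0)
    return M_E_C2 * (gamma - 1.0)

t_water = cerenkov_threshold_mev(1.33)  # ~0.26 MeV in water
```

    The threshold drops as the refractive index rises, which is why the plastic and wavelength shifting fibers (n above that of water) also radiate under the Co-60 beam.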

  10. CREME96 and Related Error Rate Prediction Methods

    NASA Technical Reports Server (NTRS)

    Adams, James H., Jr.

    2012-01-01

    Predicting the rate of occurrence of single event effects (SEEs) in space requires knowledge of the radiation environment and the response of electronic devices to that environment. Several analytical models have been developed over the past 36 years to predict SEE rates. The first error rate calculations were performed by Binder, Smith and Holman. Bradford, and Pickel and Blandford in their CRIER (Cosmic-Ray-Induced-Error-Rate) analysis code, introduced the basic Rectangular Parallelepiped (RPP) method for error rate calculations. For the radiation environment at the part, both made use of the cosmic ray LET (Linear Energy Transfer) spectra calculated by Heinrich for various absorber depths. A more detailed model for the space radiation environment within spacecraft was developed by Adams and co-workers. This model, together with a reformulation of the RPP method published by Pickel and Blandford, was used to create the CREME (Cosmic Ray Effects on Micro-Electronics) code. About the same time, Shapiro wrote the CRUP (Cosmic Ray Upset Program) based on the RPP method published by Bradford. It was the first code to specifically take into account charge collection from outside the depletion region due to deformation of the electric field caused by the incident cosmic ray. Other early rate prediction methods and codes include the Single Event Figure of Merit, NOVICE, the Space Radiation code, and the effective flux method of Binder, which is the basis of the SEFA (Scott Effective Flux Approximation) model. By the early 1990s it was becoming clear that CREME and the other early models needed revision. This revision, CREME96, was completed and released as a WWW-based tool, one of the first of its kind. The revisions in CREME96 included improved environmental models and improved models for calculating single event effects.
The need for a revision of CREME also stimulated the development of CHIME (CRRES/SPACERAD Heavy Ion Model of the Environment) and MACREE (Modeling and Analysis of Cosmic Ray Effects in Electronics). The Single Event Figure of Merit method was also revised to use the solar-minimum galactic cosmic ray spectrum and extended to circular orbits down to 200 km at any inclination. More recently, a series of commercial codes has been developed by TRAD (Test & Radiations), including the OMERE code, which calculates single event effects. Other error rate prediction methods use Monte Carlo techniques. In this chapter the analytic methods for estimating the environment within spacecraft will be discussed.
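    The RPP method referenced throughout this entry integrates the LET spectrum against the chord-length distribution of a rectangular-parallelepiped sensitive volume, counting an upset whenever a chord is long enough to deposit the critical charge. A toy Monte Carlo sketch of that idea, with a deliberately simplified chord-sampling scheme (uniform interior point plus isotropic direction, not the flux-weighted distribution the RPP method proper uses) and hypothetical dimensions and critical path length:

```python
import math
import random

def exit_distance(p, d, dims):
    """Distance from interior point p to the box surface along direction d."""
    ts = []
    for i in range(3):
        if d[i] > 1e-12:
            ts.append((dims[i] - p[i]) / d[i])
        elif d[i] < -1e-12:
            ts.append(-p[i] / d[i])
        else:
            ts.append(float("inf"))
    return min(ts)

def random_chord(dims):
    """Toy chord through an RPP: uniform interior point, isotropic direction."""
    p = [random.random() * s for s in dims]
    cos_t = 2.0 * random.random() - 1.0
    sin_t = math.sqrt(1.0 - cos_t * cos_t)
    phi = 2.0 * math.pi * random.random()
    d = [sin_t * math.cos(phi), sin_t * math.sin(phi), cos_t]
    # total chord = forward exit distance + backward exit distance
    return exit_distance(p, d, dims) + exit_distance(p, [-x for x in d], dims)

random.seed(3)
dims = (2.0, 2.0, 0.5)  # hypothetical sensitive volume, micrometres
crit_path = 0.6         # chord needed to collect the critical charge at a given LET
chords = [random_chord(dims) for _ in range(20000)]
upset_fraction = sum(1 for s in chords if s > crit_path) / len(chords)
```

    The real rate calculation then integrates this geometric fraction over the LET spectrum of the environment (where the critical path itself depends on LET); the sketch isolates only the geometric core of the method.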

  11. Systematic design and three-dimensional simulation of X-ray FEL oscillator for Shanghai Coherent Light Facility

    NASA Astrophysics Data System (ADS)

    Li, Kai; Deng, Haixiao

    2018-07-01

    The Shanghai Coherent Light Facility (SCLF) is a quasi-continuous wave hard X-ray free electron laser facility, which is currently under construction. Due to the high repetition rate and high-quality electron beams, it is straightforward to consider X-ray free electron laser oscillator (XFELO) operation for the SCLF. In this paper, the main processes for XFELO design, and parameter optimization of the undulator, X-ray cavity, and electron beam are described. A three-dimensional X-ray crystal Bragg diffraction code, named BRIGHT, was introduced for the first time, which can be combined with the GENESIS and OPC codes for the numerical simulations of the XFELO. The performance of the XFELO of the SCLF is investigated and optimized by theoretical analysis and numerical simulation.

  12. Notes on the husbandry and long-term transportation of Bull ray (Pteromylaeus bovinus) and Dolphinfish (Coryphaena hippurus and Coryphaena equiselis).

    PubMed

    Rodrigues, Nuno; Correia, João; Pinho, Rúben; Graça, José; Rodrigues, Filipe; Hirofumi, Morikawa

    2013-03-01

    Bull rays (Pteromylaeus bovinus) and Dolphinfish (Coryphaena hippurus and Coryphaena equiselis) were collected in Olhão (southern Portugal). These animals hosted multiple parasites, namely Caligus spp., and underwent a variety of treatments to remove them. Of all treatments tested, hydrogen peroxide showed the best results, although only concentrations above 100 ppm were effective in parasite removal. These high concentrations, however, proved to be highly toxic for the fish and led to the loss of some animals, especially those which had been handled before treatment. A total of 14 Bull rays were transported to Boulogne-sur-Mer (France) by road and some animals were lost, which was attributed to excessive time in transit (>45 hr). In another transport, three Bull rays and 10 Dolphinfish were moved to Stralsund (Germany) by road and air. The mechanical wounds suffered by one of the Bull rays during transport led to its death and, consequently, a deterioration of water quality in the tank containing two other conspecifics. This deterioration of water quality caused problems for the other two Bull rays, and one perished approximately 48 hr after arrival. The authors concluded that Dolphinfish can be transported with a low bioload for at least 27 hr, and that Bull rays should not undergo transport longer than 35 hr. Special attention must be paid to injured animals, since injuries can lead to a decrease in water quality and consequently affect other animals in the same transport tank. © 2012 Wiley Periodicals, Inc.

  13. CGRO Guest Investigator Program

    NASA Technical Reports Server (NTRS)

    Begelman, Mitchell C.

    1997-01-01

    The following are highlights from the research supported by this grant: (1) Theory of gamma-ray blazars: We studied the theory of gamma-ray blazars, being among the first investigators to propose that the GeV emission arises from Comptonization of diffuse radiation surrounding the jet, rather than from the synchrotron-self-Compton mechanism. In related work, we uncovered possible connections between the mechanisms of gamma-ray blazars and those of intraday radio variability, and have conducted a general study of the role of Compton radiation drag on the dynamics of relativistic jets. (2) A Nonlinear Monte Carlo code for gamma-ray spectrum formation: We developed, tested, and applied the first Nonlinear Monte Carlo (NLMC) code for simulating gamma-ray production and transfer under much more general (and realistic) conditions than are accessible with other techniques. The present version of the code is designed to simulate conditions thought to be present in active galactic nuclei and certain types of X-ray binaries, and includes the physics needed to model thermal and nonthermal electron-positron pair cascades. Unlike traditional Monte Carlo techniques, our method can accurately handle highly non-linear systems in which the radiation and particle backgrounds must be determined self-consistently and in which the particle energies span many orders of magnitude. Unlike models based on kinetic equations, our code can handle arbitrary source geometries and relativistic kinematic effects. In its first important application following testing, we showed that popular semi-analytic accretion disk corona models for Seyfert spectra are seriously in error, and demonstrated how the spectra can be simulated if the disk is sparsely covered by localized 'flares'.

  14. METHES: A Monte Carlo collision code for the simulation of electron transport in low temperature plasmas

    NASA Astrophysics Data System (ADS)

    Rabie, M.; Franck, C. M.

    2016-06-01

    We present a freely available MATLAB code for the simulation of electron transport in arbitrary gas mixtures in the presence of uniform electric fields. For steady-state electron transport, the program provides the transport coefficients, reaction rates and the electron energy distribution function. The program uses established Monte Carlo techniques and is compatible with the electron scattering cross section files from the open-access Plasma Data Exchange Project LXCat. The code follows an object-oriented design, allowing the tracing and visualization of the spatiotemporal evolution of electron swarms and the temporal development of the mean energy and the electron number due to attachment and/or ionization processes. We benchmark our code with well-known model gases as well as the real gases argon, N2, O2, CF4, SF6, and mixtures of N2 and O2.
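    Electron-swarm Monte Carlo codes of this kind commonly sample free-flight times with the null-collision technique, using a constant trial collision frequency so that flight times follow a simple exponential law; the abstract does not state METHES's exact scheme, so the following is a generic sketch:

```python
import math
import random

def free_flight_time(nu_max):
    """Sample the time to the next (real or null) collision for a
    constant trial collision frequency nu_max (null-collision technique)."""
    return -math.log(1.0 - random.random()) / nu_max

def accept_real_collision(nu_real, nu_max):
    """A collision is 'real' with probability nu_real/nu_max; otherwise
    it is a null collision and the electron simply continues its flight."""
    return random.random() < nu_real / nu_max

random.seed(2)
nu_max = 1.0e12  # s^-1, assumed trial collision frequency
times = [free_flight_time(nu_max) for _ in range(50000)]
mean_t = sum(times) / len(times)  # should approach 1/nu_max
```

    Between collisions the electron is accelerated ballistically by the uniform field; the null-collision trick keeps the flight-time sampling exact even though the real collision frequency varies with electron energy.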

  15. STELLTRANS: A Transport Analysis Suite for Stellarators

    NASA Astrophysics Data System (ADS)

    Mittelstaedt, Joseph; Lazerson, Samuel; Pablant, Novimir; Weir, Gavin; W7-X Team

    2016-10-01

    The stellarator transport code STELLTRANS allows us to better analyze the power balance in W7-X. Although profiles of temperature and density are measured experimentally, geometrical factors are needed in conjunction with these measurements to properly analyze heat flux densities in stellarators. The STELLTRANS code interfaces with VMEC to find an equilibrium flux surface configuration and with TRAVIS to determine the RF heating and current drive in the plasma. The stationary transport equations are then solved using a boundary-value differential equation solver. The equations and quantities considered are averaged over flux surfaces to reduce the system to an essentially one-dimensional problem. We have applied this code to data from W7-X and were able to calculate the heat flux coefficients. We will also present extensions of the code to a predictive capacity which would utilize DKES to find neoclassical transport coefficients to update the temperature and density profiles.

  16. Using computational modeling to compare X-ray tube Practical Peak Voltage for Dental Radiology

    NASA Astrophysics Data System (ADS)

    Holanda Cassiano, Deisemar; Arruda Correa, Samanda Cristine; de Souza, Edmilson Monteiro; da Silva, Ademir Xaxier; Pereira Peixoto, José Guilherme; Tadeu Lopes, Ricardo

    2014-02-01

    The Practical Peak Voltage (PPV) has been adopted to measure the voltage applied to an X-ray tube. The PPV was recommended by the IEC document and accepted and published in the TRS no. 457 code of practice. The PPV is defined for all waveforms and is related to the spectral distribution of X-rays and to the properties of the image. The calibration of X-ray tubes was performed using the MCNPX Monte Carlo code. An X-ray tube for dental radiology (operated from a single-phase power supply) and an X-ray tube used as a reference (supplied from a constant-potential power supply) were used in simulations across the voltage range of interest, 40 kV to 100 kV. The results indicated a linear relationship between the tubes involved.
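    The PPV is a weighted time-average of the tube-voltage waveform, PPV = Σ w(U_i) U_i / Σ w(U_i), where the weights reflect each instantaneous voltage's contribution to image-forming dose. The IEC weighting function is not reproduced here; a placeholder power-law weight is assumed purely for illustration:

```python
import math

def practical_peak_voltage(voltages, weight):
    """Weighted mean of a sampled voltage waveform:
    PPV = sum(w(U_i) * U_i) / sum(w(U_i)).

    `weight` stands in for the dose-based weighting function of the IEC
    definition; a simple power law is assumed here for illustration only.
    """
    ws = [weight(u) for u in voltages]
    return sum(w * u for w, u in zip(ws, voltages)) / sum(ws)

# single-phase (rectified sinusoid) vs constant-potential waveform at 70 kV
single_phase = [70.0 * abs(math.sin(math.pi * i / 100)) for i in range(1, 100)]
constant = [70.0] * 99
w = lambda u: u ** 4  # assumed placeholder weighting, not the IEC function

ppv_sp = practical_peak_voltage(single_phase, w)  # below 70 kV
ppv_cp = practical_peak_voltage(constant, w)      # exactly 70 kV
```

    The sketch makes the qualitative point of the abstract's comparison: for a constant-potential supply the PPV equals the applied voltage, while for a single-phase waveform it sits below the nominal peak, so the two tube types must be related by a calibration curve.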

  17. Deep Space Test Bed for Radiation Studies

    NASA Technical Reports Server (NTRS)

    Adams, James H.; Adcock, Leonard; Apple, Jeffery; Christl, Mark; Cleveand, William; Cox, Mark; Dietz, Kurt; Ferguson, Cynthia; Fountain, Walt; Ghita, Bogdan

    2006-01-01

    The Deep Space Test-Bed (DSTB) Facility is designed to investigate the effects of galactic cosmic rays on crews and systems during missions to the Moon or Mars. To gain access to the interplanetary ionizing radiation environment the DSTB uses high-altitude polar balloon flights. The DSTB provides a platform for measurements to validate the radiation transport codes that are used by NASA to calculate the radiation environment within crewed space systems. It is also designed to support other Exploration related investigations such as measuring the shielding effectiveness of candidate spacecraft and habitat materials, testing new radiation monitoring instrumentation and flight avionics and investigating the biological effects of deep space radiation. We describe the work completed thus far in the development of the DSTB and its current status.

  18. Collapse of magnetized hypermassive neutron stars in general relativity.

    PubMed

    Duez, Matthew D; Liu, Yuk Tung; Shapiro, Stuart L; Shibata, Masaru; Stephens, Branson C

    2006-01-27

    Hypermassive neutron stars (HMNSs)--equilibrium configurations supported against collapse by rapid differential rotation--are possible transient remnants of binary neutron-star mergers. Using newly developed codes for magnetohydrodynamic simulations in dynamical spacetimes, we are able to track the evolution of a magnetized HMNS in full general relativity for the first time. We find that secular angular momentum transport due to magnetic braking and the magnetorotational instability results in the collapse of an HMNS to a rotating black hole, accompanied by a gravitational wave burst. The nascent black hole is surrounded by a hot, massive torus undergoing quasistationary accretion and a collimated magnetic field. This scenario suggests that HMNS collapse is a possible candidate for the central engine of short gamma-ray bursts.

  19. Modeling of Dynamic Behavior of Carbon Fiber-Reinforced Polymer (CFRP) Composite under X-ray Radiation.

    PubMed

    Zhang, Kun; Tang, Wenhui; Fu, Kunkun

    2018-01-16

    Carbon fiber-reinforced polymer (CFRP) composites have been increasingly used in spacecraft applications. Spacecraft may encounter high-energy-density X-ray radiation in outer space that can cause severe damage. To protect spacecraft from such unexpected damage, it is essential to predict the dynamic behavior of CFRP composites under X-ray radiation. In this study, we developed an in-house three-dimensional explicit finite element method (FEM) code to investigate the dynamic responses of CFRP composites under X-ray radiation for the first time, incorporating a modified PUFF equation of state. First, the blow-off impulse (BOI) momentum of an aluminum panel was predicted by our FEM code and compared with an existing radiation experiment, and the numerical result was found to be comparable with the experimental one. Then, the FEM code was utilized to determine the dynamic behavior of a CFRP composite under various radiation conditions. The CFRP composite was more effective than the aluminum panel in reducing radiation-induced pressure and BOI momentum. The numerical results also revealed that a 1 keV X-ray led to vaporization of surface materials and a high-magnitude compressive stress wave, whereas a low-magnitude stress wave was generated with no surface vaporization when a 3 keV X-ray was applied.

  20. Modeling of Dynamic Behavior of Carbon Fiber-Reinforced Polymer (CFRP) Composite under X-ray Radiation

    PubMed Central

    Zhang, Kun; Tang, Wenhui; Fu, Kunkun

    2018-01-01

    Carbon fiber-reinforced polymer (CFRP) composites have been increasingly used in spacecraft applications. Spacecraft may encounter high-energy-density X-ray radiation in outer space that can cause severe damage. To protect spacecraft from such unexpected damage, it is essential to predict the dynamic behavior of CFRP composites under X-ray radiation. In this study, we developed an in-house three-dimensional explicit finite element method (FEM) code to investigate the dynamic responses of CFRP composites under X-ray radiation for the first time, incorporating a modified PUFF equation of state. First, the blow-off impulse (BOI) momentum of an aluminum panel was predicted by our FEM code and compared with an existing radiation experiment, and the numerical result was found to be comparable with the experimental one. Then, the FEM code was utilized to determine the dynamic behavior of a CFRP composite under various radiation conditions. The CFRP composite was more effective than the aluminum panel in reducing radiation-induced pressure and BOI momentum. The numerical results also revealed that a 1 keV X-ray led to vaporization of surface materials and a high-magnitude compressive stress wave, whereas a low-magnitude stress wave was generated with no surface vaporization when a 3 keV X-ray was applied. PMID:29337891

  1. Electron transport model of dielectric charging

    NASA Technical Reports Server (NTRS)

    Beers, B. L.; Hwang, H. C.; Lin, D. L.; Pine, V. W.

    1979-01-01

    A computer code (SCCPOEM) was assembled to describe the charging of dielectrics due to irradiation by electrons. The primary purpose for developing the code was to make available a convenient tool for studying the internal fields and charge densities in electron-irradiated dielectrics. The code, which is based on the primary electron transport code POEM, is applicable to arbitrary dielectrics, source spectra, and current time histories. The code calculations are illustrated by a series of semianalytical solutions. Calculations to date suggest that the front face electric field is insufficient to cause breakdown, but that bulk breakdown fields can easily be exceeded.

  2. Capabilities overview of the MORET 5 Monte Carlo code

    NASA Astrophysics Data System (ADS)

    Cochet, B.; Jinaphanh, A.; Heulers, L.; Jacquet, O.

    2014-06-01

    The MORET code is a simulation tool that solves the transport equation for neutrons using the Monte Carlo method. It allows users to model complex three-dimensional geometrical configurations, describe the materials, and define their own tallies in order to analyse the results. The MORET code was initially designed to perform calculations for criticality safety assessments. New features have been introduced in the MORET 5 code to expand its use to reactor applications. This paper presents an overview of the MORET 5 code capabilities, going through the description of materials, the geometry modelling, the transport simulation and the definition of the outputs.

  3. Earth and Planetary Science Letters

    NASA Technical Reports Server (NTRS)

    Nishiizumi, K.; Klein, J.; Middleton, R.; Masarik, J.; Reedy, R. C.; Arnold, J. R.; Fink, D.

    1997-01-01

    Systematic measurements of the concentrations of cosmogenic Ca-41 (half-life = 1.04 x 10^5 yr) in the Apollo 15 long core 15001-15006 were performed by accelerator mass spectrometry. Earlier measurements of cosmogenic Be-10, C-14, Al-26, Cl-36, and Mn-53 in the same core have provided confirmation and improvement of theoretical models for predicting production profiles of nuclides by cosmic-ray-induced spallation in the Moon and large meteorites. Unlike these nuclides, Ca-41 in the lunar surface is produced mainly by thermal neutron capture reactions on Ca-40. The maximum production of Ca-41, about 1 dpm/g Ca, was observed at a depth in the Moon of about 150 g/sq cm. For depths below about 300 g/sq cm, Ca-41 production falls off exponentially with an e-folding length of 175 g/sq cm. Neutron production in the Moon was modeled with the Los Alamos High Energy Transport Code System, and yields of nuclei produced by low-energy thermal and epithermal neutrons were calculated with the Monte Carlo N-Particle code. The new theoretical calculations using these codes are in good agreement with our measured Ca-41 concentrations as well as with Co-60 and direct neutron fluence measurements in the Moon.
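    The exponential regime quoted above can be written P(d) = P(d0) exp(-(d - d0)/Λ) with Λ = 175 g/cm². A minimal sketch; the normalization at the 300 g/cm² reference depth is assumed purely for illustration:

```python
import math

E_FOLD = 175.0  # g/cm^2, e-folding length from the measured profile

def ca41_production(depth, ref_depth=300.0, p_ref=0.5):
    """Exponential fall-off of Ca-41 production below ~300 g/cm^2.

    p_ref (dpm/g Ca) at ref_depth is an assumed normalization; the
    abstract gives only the peak value (~1 dpm/g Ca near 150 g/cm^2).
    """
    return p_ref * math.exp(-(depth - ref_depth) / E_FOLD)

# one e-folding length (175 g/cm^2) deeper, production drops by 1/e
ratio = ca41_production(475.0) / ca41_production(300.0)
```

    Above the peak the profile instead rises with depth as the thermal-neutron flux builds up, so the exponential form applies only to the deep tail.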

  4. MCNP6 Simulation of Light and Medium Nuclei Fragmentation at Intermediate Energies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mashnik, Stepan Georgievich; Kerby, Leslie Marie

    2015-05-22

    MCNP6, the latest and most advanced LANL Monte Carlo transport code, representing a merger of MCNP5 and MCNPX, is actually much more than the sum of those two computer codes; MCNP6 is available to the public via RSICC at Oak Ridge, TN, USA. In the present work, MCNP6 was validated and verified (V&V) against different experimental data on intermediate-energy fragmentation reactions, and against results by several other codes, using mainly the latest modifications of the Cascade-Exciton Model (CEM) and of the Los Alamos version of the Quark-Gluon String Model (LAQGSM) event generators, CEM03.03 and LAQGSM03.03. It was found that MCNP6 using CEM03.03 and LAQGSM03.03 describes well fragmentation reactions induced on light and medium target nuclei by protons and light nuclei at energies around 1 GeV/nucleon and below, and can serve as a reliable simulation tool for different applications, such as cosmic-ray-induced single event upsets (SEUs), radiation protection, and cancer therapy with proton and ion beams, to name just a few. Future improvements of the predictive capabilities of MCNP6 for such reactions are possible, and are discussed in this work.

  5. Monte carlo simulations of the n_TOF lead spallation target with the Geant4 toolkit: A benchmark study

    NASA Astrophysics Data System (ADS)

    Lerendegui-Marco, J.; Cortés-Giraldo, M. A.; Guerrero, C.; Quesada, J. M.; Meo, S. Lo; Massimi, C.; Barbagallo, M.; Colonna, N.; Mancussi, D.; Mingrone, F.; Sabaté-Gilarte, M.; Vannini, G.; Vlachoudis, V.; Aberle, O.; Andrzejewski, J.; Audouin, L.; Bacak, M.; Balibrea, J.; Bečvář, F.; Berthoumieux, E.; Billowes, J.; Bosnar, D.; Brown, A.; Caamaño, M.; Calviño, F.; Calviani, M.; Cano-Ott, D.; Cardella, R.; Casanovas, A.; Cerutti, F.; Chen, Y. H.; Chiaveri, E.; Cortés, G.; Cosentino, L.; Damone, L. A.; Diakaki, M.; Domingo-Pardo, C.; Dressler, R.; Dupont, E.; Durán, I.; Fernández-Domínguez, B.; Ferrari, A.; Ferreira, P.; Finocchiaro, P.; Göbel, K.; Gómez-Hornillos, M. B.; García, A. R.; Gawlik, A.; Gilardoni, S.; Glodariu, T.; Gonçalves, I. F.; González, E.; Griesmayer, E.; Gunsing, F.; Harada, H.; Heinitz, S.; Heyse, J.; Jenkins, D. G.; Jericha, E.; Käppeler, F.; Kadi, Y.; Kalamara, A.; Kavrigin, P.; Kimura, A.; Kivel, N.; Kokkoris, M.; Krtička, M.; Kurtulgil, D.; Leal-Cidoncha, E.; Lederer, C.; Leeb, H.; Lonsdale, S. J.; Macina, D.; Marganiec, J.; Martínez, T.; Masi, A.; Mastinu, P.; Mastromarco, M.; Maugeri, E. A.; Mazzone, A.; Mendoza, E.; Mengoni, A.; Milazzo, P. M.; Musumarra, A.; Negret, A.; Nolte, R.; Oprea, A.; Patronis, N.; Pavlik, A.; Perkowski, J.; Porras, I.; Praena, J.; Radeck, D.; Rauscher, T.; Reifarth, R.; Rout, P. C.; Rubbia, C.; Ryan, J. A.; Saxena, A.; Schillebeeckx, P.; Schumann, D.; Smith, A. G.; Sosnin, N. V.; Stamatopoulos, A.; Tagliente, G.; Tain, J. L.; Tarifeño-Saldivia, A.; Tassan-Got, L.; Valenta, S.; Variale, V.; Vaz, P.; Ventura, A.; Vlastou, R.; Wallner, A.; Warren, S.; Woods, P. J.; Wright, T.; Žugec, P.

    2017-09-01

    Monte Carlo (MC) simulations are an essential tool for determining fundamental features of a neutron beam, such as the neutron flux or the γ-ray background, that sometimes cannot be measured, or at least not at every position or in every energy range. Until recently, the most widely used MC codes in this field had been MCNPX and FLUKA. However, the Geant4 toolkit has also become a competitive code for the transport of neutrons after the development of the native Geant4 format for neutron data libraries, G4NDL. In this context, we present the Geant4 simulations of the neutron spallation target of the n_TOF facility at CERN, done with version 10.1.1 of the toolkit. The first goal was the validation of the intra-nuclear cascade models implemented in the code using, as benchmark, the characteristics of the neutron beam measured at the first experimental area (EAR1), especially the neutron flux and energy distribution, and the time distribution of neutrons of equal kinetic energy, the so-called Resolution Function. The second goal was the development of a Monte Carlo tool aimed at providing useful calculations for both the analysis and planning of the upcoming measurements at the new experimental area (EAR2) of the facility.

  6. ARES: automated response function code. Users manual. [HPGAM and LSQVM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maung, T.; Reynolds, G.M.

    This ARES user's manual provides detailed instructions for a general understanding of the Automated Response Function Code and gives step-by-step instructions for using the complete code package on an HP-1000 system. This code is designed to calculate response functions of NaI gamma-ray detectors with cylindrical or rectangular geometries.

  7. Learning to Analyze and Code Accounting Transactions in Interactive Mode.

    ERIC Educational Resources Information Center

    Bentz, William F.; Ambler, Eric E.

    An interactive computer-assisted instructional (CAI) system, called CODE, is used to teach transactional analysis, or coding, in elementary accounting. The first major component of CODE is TEACH, a program which controls student input and output. Following the statement of a financial position on a cathode ray tube, TEACH describes an event to…

  8. GAPD: a GPU-accelerated atom-based polychromatic diffraction simulation code.

    PubMed

    E, J C; Wang, L; Chen, S; Zhang, Y Y; Luo, S N

    2018-03-01

    GAPD, a graphics-processing-unit (GPU)-accelerated atom-based polychromatic diffraction simulation code for direct, kinematics-based, simulations of X-ray/electron diffraction of large-scale atomic systems with mono-/polychromatic beams and arbitrary plane detector geometries, is presented. This code implements GPU parallel computation via both real- and reciprocal-space decompositions. With GAPD, direct simulations are performed of the reciprocal lattice node of ultralarge systems (∼5 billion atoms) and diffraction patterns of single-crystal and polycrystalline configurations with mono- and polychromatic X-ray beams (including synchrotron undulator sources), and validation, benchmark and application cases are presented.
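    Kinematics-based (single-scattering) diffraction of the sort GAPD computes reduces to summing scattering amplitudes over atoms, I(q) = |Σ_j f_j exp(iq·r_j)|². A toy one-dimensional example with unit form factors, unrelated to GAPD's actual implementation, showing the Bragg condition for a perfect chain:

```python
import cmath
import math

def kinematic_intensity(positions, q):
    """Kinematic (single-scattering) diffracted intensity for a 1-D system:
    I(q) = |sum_j f_j * exp(i * q * x_j)|**2, with unit form factors f_j = 1."""
    amp = sum(cmath.exp(1j * q * x) for x in positions)
    return abs(amp) ** 2

# chain of N atoms with spacing a; Bragg peaks occur at q = 2*pi*h/a
a, n = 1.0, 64
chain = [a * j for j in range(n)]
i_bragg = kinematic_intensity(chain, 2.0 * math.pi / a)  # all terms in phase: N**2
i_off = kinematic_intensity(chain, math.pi / a)          # terms cancel pairwise
```

    GPU acceleration in a code like GAPD comes from evaluating this sum for billions of atoms and many q-vectors in parallel; the physics per (atom, q) pair is just this phase sum.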

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aleman, S.E.

    This report documents FACT, a transient, three-dimensional finite element code designed to simulate isothermal groundwater flow, moisture movement, and solute transport in variably saturated and fully saturated subsurface porous media.

  10. Adaptive Nodal Transport Methods for Reactor Transient Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomas Downar; E. Lewis

    2005-08-31

    Develop methods for adaptively treating the angular, spatial, and time dependence of the neutron flux in reactor transient analysis. These methods were demonstrated in the DOE transport nodal code VARIANT and the US NRC spatial kinetics code, PARCS.

  11. The Athena Astrophysical MHD Code in Cylindrical Geometry

    NASA Astrophysics Data System (ADS)

    Skinner, M. A.; Ostriker, E. C.

    2011-10-01

    We have developed a method for implementing cylindrical coordinates in the Athena MHD code (Skinner & Ostriker 2010). The extension has been designed to alter the existing Cartesian-coordinates code (Stone et al. 2008) as minimally and transparently as possible. The numerical equations in cylindrical coordinates are formulated to maintain consistency with constrained transport, a central feature of the Athena algorithm, while making use of previously implemented code modules such as the eigensystems and Riemann solvers. Angular-momentum transport, which is critical in astrophysical disk systems dominated by rotation, is treated carefully. We describe modifications for cylindrical coordinates of the higher-order spatial reconstruction and characteristic evolution steps as well as the finite-volume and constrained transport updates. Finally, we have developed a test suite of standard and novel problems in one, two, and three dimensions designed to validate our algorithms and implementation and to be of use to other code developers. The code is suitable for use in a wide variety of astrophysical applications and is freely available for download on the web.

  12. Transport and equilibrium in field-reversed mirrors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boyd, J.K.

    Two plasma models relevant to compact torus research have been developed to study transport and equilibrium in field-reversed mirrors. In the first model, for small Larmor radius and large collision frequency, the plasma is described as an adiabatic hydromagnetic fluid. In the second model, for large Larmor radius and small collision frequency, a kinetic theory description has been developed. Various aspects of the two models have been studied in five computer codes: ADB, AV, NEO, OHK, and RES. The ADB code computes two-dimensional equilibrium and one-dimensional transport in a flux coordinate. The AV code calculates orbit-average integrals in a harmonic oscillator potential. The NEO code follows particle trajectories in a Hill's vortex magnetic field to study stochasticity, invariants of the motion, and orbit-average formulas. The OHK code displays analytic psi(r), B_Z(r), phi(r), and E_r(r) formulas developed for the kinetic theory description. The RES code calculates resonance curves to consider overlap regions relevant to stochastic orbit behavior.

  13. Benchmarking of neutron production of heavy-ion transport codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Remec, I.; Ronningen, R. M.; Heilbronn, L.

    Accurate prediction of radiation fields generated by heavy ion interactions is important in medical applications, space missions, and in the design and operation of rare isotope research facilities. In recent years, several well-established computer codes in widespread use for particle and radiation transport calculations have been equipped with the capability to simulate heavy ion transport and interactions. To assess and validate these capabilities, we performed simulations of a series of benchmark-quality heavy ion experiments with the computer codes FLUKA, MARS15, MCNPX, and PHITS. We focus on the comparisons of secondary neutron production. Results are encouraging; however, further improvements in models and codes and additional benchmarking are required.

  14. Estimation of Effective Doses for Radiation Cancer Risks on ISS, Lunar, and Mars Missions with Space Radiation Measurement

    NASA Technical Reports Server (NTRS)

    Kim, M.Y.; Cucinotta, F.A.

    2005-01-01

    Radiation protection practices define the effective dose as a weighted sum of equivalent doses over the major sites for radiation cancer risks. Since a crew personnel dosimeter does not directly measure effective dose, it has been estimated from skin-dose measurements and radiation transport codes for ISS and STS missions. The Phantom Torso Experiment (PTE) of NASA's Operational Radiation Protection Program has provided actual flight measurements from active and passive dosimeters placed throughout the phantom on the 10-day STS-91 mission and on the ISS Increment 2 mission. For the PTE, the variation in organ doses, which results from absorption and from changes in radiation quality with tissue shielding, was characterized by measuring doses at many tissue sites and at several critical body organs including brain, colon, heart, stomach, thyroid, and skin. These measurements have been compared with the organ dose calculations obtained from the transport models. Active TEPC measurements of lineal energy spectra at the surface of the PTE also provided a direct comparison of galactic cosmic ray (GCR) and trapped proton dose and dose equivalent. It is shown that orienting the phantom body as it actually was on ISS is needed for direct comparison of the transport models to the ISS data. One of the most important observations for organ dose equivalent and effective dose estimates on ISS is the fractional contribution from trapped protons and GCR. We show that for most organs over 80% is from GCR. Improved estimates of effective doses for radiation cancer risks will be made with the resulting tissue weighting factors and the modified codes.
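    The weighted sum defining effective dose is E = Σ_T w_T H_T, where H_T is the equivalent dose to tissue T. A sketch using the ICRP Publication 103 tissue weighting factors (which sum to 1.0) and hypothetical organ doses; a real analysis would cover all tissues, not the handful shown:

```python
# ICRP Publication 103 tissue weighting factors w_T (they sum to 1.0)
W_T = {
    "red_bone_marrow": 0.12, "colon": 0.12, "lung": 0.12, "stomach": 0.12,
    "breast": 0.12, "remainder": 0.12, "gonads": 0.08,
    "bladder": 0.04, "oesophagus": 0.04, "liver": 0.04, "thyroid": 0.04,
    "bone_surface": 0.01, "brain": 0.01, "salivary_glands": 0.01, "skin": 0.01,
}

def effective_dose(organ_equivalent_doses_msv):
    """E = sum over tissues of w_T * H_T, for the tissues supplied.

    Tissues without measured or modelled H_T are simply omitted in this
    sketch, so the result is a partial sum, not a full effective dose.
    """
    return sum(W_T[t] * h for t, h in organ_equivalent_doses_msv.items())

# hypothetical organ equivalent doses (mSv) from a phantom-style experiment
h_t = {"brain": 0.30, "colon": 0.25, "stomach": 0.27, "thyroid": 0.28, "skin": 0.35}
e_partial = effective_dose(h_t)  # mSv, partial sum over the measured organs
```

    The PTE analyses described above effectively fill in the full set of H_T values from measurements and transport calculations before forming this weighted sum.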

  15. An analysis of options available for developing a common laser ray tracing package for Ares and Kull code frameworks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weeratunga, S K

    Ares and Kull are mature code frameworks that support ALE hydrodynamics for a variety of HEDP applications at LLNL, using two widely different meshing approaches. While Ares is based on a 2-D/3-D block-structured mesh data base, Kull is designed to support unstructured, arbitrary polygonal/polyhedral meshes. In addition, both frameworks are capable of running applications on large, distributed-memory parallel machines. Currently, both these frameworks separately support assorted collections of physics packages related to HEDP, including one for the energy deposition by laser/ion-beam ray tracing. This study analyzes the options available for developing a common laser/ion-beam ray tracing package that can be easily shared between these two code frameworks and concludes with a set of recommendations for its development.

  16. Method for calculating internal radiation and ventilation with the ADINAT heat-flow code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Butkovich, T.R.; Montan, D.N.

    1980-04-01

    One objective of the spent fuel test in Climax Stock granite (SFTC) is to correctly model the thermal transport, and the changes in the stress field and accompanying displacements, from the application of the thermal loads. We have chosen the ADINA and ADINAT finite element codes to do these calculations. ADINAT is a heat transfer code compatible with the ADINA displacement and stress analysis code. The heat flow problem encountered at SFTC requires a code with conduction, radiation, and ventilation capabilities, which the present version of ADINAT does not have. We have devised a method for calculating internal radiation and ventilation with the ADINAT code. This method effectively reproduces the results from the TRUMP multi-dimensional finite difference code, which correctly models radiative heat transport between drift surfaces, conductive and convective thermal transport to and through air in the drifts, and mass flow of air in the drifts. The temperature histories for each node in the finite element mesh calculated with ADINAT using this method can be used directly in the ADINA thermal-mechanical calculation.

  17. Monte Carlo Modeling of the Initial Radiation Emitted by a Nuclear Device in the National Capital Region

    DTIC Science & Technology

    2013-07-01

    also simulated in the models. Data was derived from calculations using the three-dimensional Monte Carlo radiation transport code MCNP (Monte Carlo N-Particle). MCNP is a general-purpose code designed to simulate neutron, photon

  18. Considerations of MCNP Monte Carlo code to be used as a radiotherapy treatment planning tool.

    PubMed

    Juste, B; Miro, R; Gallardo, S; Verdu, G; Santos, A

    2005-01-01

    The present work has simulated the photon and electron transport in a Theratron 780® (MDS Nordion) 60Co radiotherapy unit, using the Monte Carlo transport code MCNP (Monte Carlo N-Particle). This project mainly explains the different methodologies carried out to speed up calculations in order to apply this code efficiently to radiotherapy treatment planning.

  19. Ray tracing through a hexahedral mesh in HADES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henderson, G L; Aufderheide, M B

    In this paper we describe a new ray tracing method targeted for inclusion in HADES. The algorithm tracks rays through three-dimensional tetrakis hexahedral mesh objects, like those used by the ARES code to model inertial confinement experiments.
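    The abstract gives no algorithmic detail, but stepping a ray cell-by-cell through a hexahedral mesh is commonly done with an Amanatides-Woo style traversal. The sketch below is illustrative only: it handles a simple regular axis-aligned grid, not the tetrakis hexahedral meshes of HADES, and all names are hypothetical.

```python
def traverse_hex_grid(origin, direction, grid_min, cell_size, shape):
    """Amanatides-Woo style traversal of a regular hexahedral grid.

    Illustrative sketch (not the HADES algorithm): yields the (i, j, k)
    indices of the cells pierced by the ray, in visiting order.
    """
    INF = float("inf")
    # Cell index containing the ray origin
    idx = [int((origin[a] - grid_min[a]) / cell_size[a]) for a in range(3)]
    step, t_max, t_delta = [], [], []
    for a in range(3):
        if direction[a] == 0.0:
            # Ray never crosses a face on this axis
            step.append(0); t_max.append(INF); t_delta.append(INF)
            continue
        step.append(1 if direction[a] > 0 else -1)
        next_face = idx[a] + (1 if direction[a] > 0 else 0)
        boundary = grid_min[a] + next_face * cell_size[a]
        t_max.append((boundary - origin[a]) / direction[a])       # distance to next face
        t_delta.append(abs(cell_size[a] / direction[a]))          # distance between faces
    while all(0 <= idx[a] < shape[a] for a in range(3)):
        yield tuple(idx)
        a = min(range(3), key=lambda k: t_max[k])  # nearest face crossing
        idx[a] += step[a]
        t_max[a] += t_delta[a]
```

Per-cell path lengths (for attenuation integrals) follow directly from consecutive `t_max` values in the same loop.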

  20. CTViz: A tool for the visualization of transport in nanocomposites.

    PubMed

    Beach, Benjamin; Brown, Joshua; Tarlton, Taylor; Derosa, Pedro A

    2016-05-01

    A visualization tool (CTViz) for charge transport processes in 3-D hybrid materials (nanocomposites) was developed, inspired by the need for a graphical application to assist in code debugging and data presentation for an existing in-house code. As the simulation code grew, troubleshooting became increasingly difficult without an effective way to visualize 3-D samples and the charge transport within them. CTViz is able to produce publication- and presentation-quality visuals of the simulation box, as well as static and animated visuals of the paths of individual carriers through the sample. CTViz was designed to provide a high degree of flexibility in the visualization of the data. A feature that characterizes this tool is the use of shade and transparency levels to highlight important details in the morphology or in the transport paths by hiding or dimming elements of little relevance to the current view. This is fundamental for the visualization of 3-D systems with complex structures. The code presented here provides these required capabilities, but has gone beyond the original design and could be used as is or easily adapted for the visualization of other particulate transport where transport occurs on discrete paths.

  1. REgolith X-Ray Imaging Spectrometer (REXIS) Aboard NASA’s OSIRIS-REx Mission

    NASA Astrophysics Data System (ADS)

    Hong, JaeSub; Allen, Branden; Grindlay, Jonathan E.; Binzel, Richard P.; Masterson, Rebecca; Inamdar, Niraj K; Chodas, Mark; Smith, Matthew W; Bautz, Mark W.; Kissel, Steven E; Villasenor, Jesus Noel; Oprescu, Antonia

    2014-06-01

    The REgolith X-Ray Imaging Spectrometer (REXIS) is a student-led instrument being designed, built, and operated as a collaborative effort involving MIT and Harvard. It is part of NASA's OSIRIS-REx mission, which is scheduled for launch in September 2016 for a rendezvous with, and collection of a sample from the surface of, the primitive carbonaceous chondrite-like asteroid 101955 Bennu in 2019. REXIS will determine spatial variations in the elemental composition of Bennu's surface through solar-induced X-ray fluorescence. REXIS consists of four X-ray CCDs in the detector plane and an X-ray mask. It is the first coded-aperture X-ray telescope on a planetary mission, combining the high X-ray throughput of wide-field collimation with the imaging capability of a coded mask, enabling detection of elemental surface distributions at approximately 50-200 m scales. We present an overview of the REXIS instrument and its expected performance.

  2. Path Toward a Unified Geometry for Radiation Transport

    NASA Astrophysics Data System (ADS)

    Lee, Kerry

    The Direct Accelerated Geometry for Radiation Analysis and Design (DAGRAD) element of the RadWorks Project under Advanced Exploration Systems (AES) within the Space Technology Mission Directorate (STMD) of NASA will enable new designs and concepts of operation for radiation risk assessment, mitigation and protection. This element is designed to produce a solution that will allow NASA to calculate the transport of space radiation through complex CAD models using the state-of-the-art analytic and Monte Carlo radiation transport codes. Due to the inherent hazard of astronaut and spacecraft exposure to ionizing radiation in low-Earth orbit (LEO) or in deep space, risk analyses must be performed for all crew vehicles and habitats. Incorporating these analyses into the design process can minimize the mass needed solely for radiation protection. Transport of the radiation fields as they pass through shielding and body materials can be simulated using Monte Carlo techniques or described by the Boltzmann equation, which is obtained by balancing changes in particle fluxes as they traverse a small volume of material with the gains and losses caused by atomic and nuclear collisions. Deterministic codes that solve the Boltzmann transport equation, such as HZETRN (high charge and energy transport code developed by NASA LaRC), are generally computationally faster than Monte Carlo codes such as FLUKA, GEANT4, MCNP(X) or PHITS; however, they are currently limited to transport in one dimension, which poorly represents the secondary light ion and neutron radiation fields. NASA currently uses HZETRN space radiation transport software, both because it is computationally efficient and because proven methods have been developed for using this software to analyze complex geometries. 
Although Monte Carlo codes describe the relevant physics in a fully three-dimensional manner, their computational costs have thus far prevented their widespread use for analysis of complex CAD models, leading to the creation and maintenance of toolkit-specific, simplified geometry models. The work presented here builds on the Direct Accelerated Geometry Monte Carlo (DAGMC) toolkit developed for use with the Monte Carlo N-Particle (MCNP) transport code. The workflow for doing radiation transport on CAD models using MCNP and FLUKA has been demonstrated, and the results of analyses on realistic spacecraft/habitats will be presented. Future work is planned that will further automate this process and enable the use of multiple radiation transport codes on identical geometry models imported from CAD. This effort will enhance the modeling tools used by NASA to accurately evaluate astronaut space radiation risk and to determine the protection provided by as-designed exploration mission vehicles and habitats.
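Schematically, the one-dimensional straight-ahead Boltzmann equation solved by deterministic codes such as HZETRN balances streaming, continuous slowing-down, and collisional losses of the flux φ_j of particle type j against production from collisions of heavier projectiles (coefficients simplified here for illustration; the production code uses additional range-scaling factors):

```latex
\Bigl[\frac{\partial}{\partial x}
      \;-\;\frac{\partial}{\partial E}\,S_j(E)
      \;+\;\sigma_j(E)\Bigr]\,\phi_j(x,E)
   \;=\; \sum_{k}\int_{E}^{\infty}\sigma_{jk}(E,E')\,\phi_k(x,E')\,dE'
```

Here S_j is the stopping power, σ_j the total macroscopic cross section, and σ_jk the cross section for producing species j from species k; the one-dimensional (straight-ahead) form is what makes the marching solution so much cheaper than full 3-D Monte Carlo.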

  3. Identification of Trends into Dose Calculations for Astronauts through Performing Sensitivity Analysis on Calculational Models Used by the Radiation Health Office

    NASA Technical Reports Server (NTRS)

    Adams, Thomas; VanBaalen, Mary

    2009-01-01

    The Radiation Health Office (RHO) determines each astronaut's cancer risk by using models to associate the radiation dose that astronauts receive from spaceflight missions with risk. The baryon transport code (BRYNTRN), the high charge (Z) and energy transport code (HZETRN), and computer risk models are used to determine the effective dose received by astronauts in low Earth orbit (LEO). These codes use an approximation of the Boltzmann transport equation. The purpose of the project is to run this code for various International Space Station (ISS) flight parameters in order to gain a better understanding of how the code responds to different scenarios. The project will determine how variations in one set of parameters, such as the point in the solar cycle and the altitude, can affect the radiation exposure of astronauts during ISS missions. This project will benefit NASA by improving mission dosimetry.

  4. Simulations of neutron transport at low energy: a comparison between GEANT and MCNP.

    PubMed

    Colonna, N; Altieri, S

    2002-06-01

    The use of the simulation tool GEANT for neutron transport at energies below 20 MeV is discussed, in particular with regard to shielding and dose calculations. The reliability of the GEANT/MICAP package for neutron transport has been verified by comparing the results of simulations performed with this package over a wide energy range with the predictions of MCNP-4B, a code commonly used for neutron transport at low energy. A reasonable agreement between the results of the two codes is found for the neutron flux through a slab of material (iron and ordinary concrete), as well as for the dose released in soft tissue by neutrons. These results justify the use of the GEANT/MICAP code for neutron transport in a wide range of applications, including health physics problems.

  5. Galactic Cosmic-ray Transport in the Global Heliosphere: A Four-Dimensional Stochastic Model

    NASA Astrophysics Data System (ADS)

    Florinski, V.

    2009-04-01

    We study galactic cosmic-ray transport in the outer heliosphere and heliosheath using a newly developed transport model based on stochastic integration of the phase-space trajectories of Parker's equation. The model employs backward integration of the diffusion-convection transport equation using Ito calculus and is four-dimensional in space+momentum. We apply the model to the problem of galactic proton transport in the heliosphere during a negative solar minimum. Model results are compared with the Voyager measurements of galactic proton radial gradients and spectra in the heliosheath. We show that the heliosheath is not as efficient in diverting cosmic rays during solar minima as predicted by earlier two-dimensional models.
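    The backward method can be illustrated with a one-dimensional toy analogue of the diffusion-convection equation with constant coefficients. This is a sketch under those simplifying assumptions, not the four-dimensional heliospheric model of the paper, and the names are illustrative: pseudo-particles are integrated backward in time from the observation point with an Euler-Maruyama step, and the initial distribution is averaged over their end points.

```python
import math
import random

def backward_transport_1d(x, t, V, kappa, f0, n_paths=20000, n_steps=40, seed=7):
    """Backward stochastic solution of  df/dt + V df/dx = kappa d2f/dx2.

    Toy 1-D analogue of the backward-Ito scheme (constant V and kappa
    assumed for clarity). Each pseudo-particle is integrated from (x, t)
    back to t = 0, and the initial condition f0 is averaged over the
    trajectory end points.
    """
    rng = random.Random(seed)
    dt = t / n_steps
    total = 0.0
    for _ in range(n_paths):
        X = x
        for _ in range(n_steps):
            # Drift reversed in backward time, plus a diffusive kick
            X += -V * dt + math.sqrt(2.0 * kappa * dt) * rng.gauss(0.0, 1.0)
        total += f0(X)
    return total / n_paths
```

For constant coefficients the estimate converges to the exact advected-and-spread solution, which makes the scheme easy to verify before heliospheric geometry, momentum diffusion, and drifts are added.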

  6. Multimode imaging device

    DOEpatents

    Mihailescu, Lucian; Vetter, Kai M

    2013-08-27

    Apparatus for detecting and locating a source of gamma rays of energies ranging from 10-20 keV to several MeV includes plural gamma ray detectors arranged in a generally closed extended array so as to provide Compton scattering imaging and coded aperture imaging simultaneously. First detectors are arranged in a spaced manner about a surface defining the closed extended array, which may be in the form of a circle, a sphere, a square, a pentagon, or a higher-order polygon. Some of the gamma rays are absorbed by the first detectors closest to the gamma source in Compton scattering, while the photons that go unabsorbed, passing through gaps disposed between adjacent first detectors, are incident upon second detectors disposed on the side farthest from the gamma ray source; the first, spaced detectors thereby form a coded aperture array for two- or three-dimensional gamma ray source detection.

  7. X-ray spectral signatures of photoionized plasmas. [astrophysics

    NASA Technical Reports Server (NTRS)

    Liedahl, Duane A.; Kahn, Steven M.; Osterheld, Albert L.; Goldstein, William H.

    1990-01-01

    Plasma emission codes have become a standard tool for the analysis of spectroscopic data from cosmic X-ray sources. However, the assumption of collisional equilibrium, typically invoked in these codes, renders them inapplicable to many important astrophysical situations, particularly those involving X-ray photoionized nebulae. This point is illustrated by comparing model spectra which have been calculated under conditions appropriate to both coronal plasmas and X-ray photoionized plasmas. It is shown that the (3s-2p)/(3d-2p) line ratios in the Fe L-shell spectrum can be used to effectively discriminate between these two cases. This diagnostic will be especially useful for data analysis associated with AXAF and XMM, which will carry spectroscopic instrumentation with sufficient sensitivity and resolution to identify X-ray photoionized nebulae in a wide range of astrophysical environments.

  8. Accurate Ray-tracing of Realistic Neutron Star Atmospheres for Constraining Their Parameters

    NASA Astrophysics Data System (ADS)

    Vincent, Frederic H.; Bejger, Michał; Różańska, Agata; Straub, Odele; Paumard, Thibaut; Fortin, Morgane; Madej, Jerzy; Majczyna, Agnieszka; Gourgoulhon, Eric; Haensel, Paweł; Zdunik, Leszek; Beldycki, Bartosz

    2018-03-01

    Thermal-dominated X-ray spectra of neutron stars in quiescent, transient X-ray binaries and neutron stars that undergo thermonuclear bursts are sensitive to mass and radius. The mass–radius relation of neutron stars depends on the equation of state (EoS) that governs their interior. Constraining this relation accurately is therefore of fundamental importance to understand the nature of dense matter. In this context, we introduce a pipeline to calculate realistic model spectra of rotating neutron stars with hydrogen and helium atmospheres. An arbitrarily fast-rotating neutron star with a given EoS generates the spacetime in which the atmosphere emits radiation. We use the LORENE/NROTSTAR code to compute the spacetime numerically and the ATM24 code to solve the radiative transfer equations self-consistently. Emerging specific intensity spectra are then ray-traced through the neutron star's spacetime from the atmosphere to a distant observer with the GYOTO code. Here, we present and test our fully relativistic numerical pipeline. To discuss and illustrate the importance of realistic atmosphere models, we compare our model spectra to simpler models like the commonly used isotropic color-corrected blackbody emission. We highlight the importance of considering realistic model-atmosphere spectra together with relativistic ray-tracing to obtain accurate predictions. We also stress the crucial impact of the star's rotation on the observables. Finally, we close a controversy that has been ongoing in the literature in recent years regarding the validity of the ATM24 code.

  9. SHIELD and HZETRN comparisons of pion production cross sections

    NASA Astrophysics Data System (ADS)

    Norbury, John W.; Sobolevsky, Nikolai; Werneth, Charles M.

    2018-03-01

    A program of comparing American (NASA) and Russian (ROSCOSMOS) space radiation transport codes has recently begun, and the first paper directly comparing the NASA and ROSCOSMOS space radiation transport codes, HZETRN and SHIELD respectively has recently appeared. The present work represents the second time that NASA and ROSCOSMOS calculations have been directly compared, and the focus here is on models of pion production cross sections used in the two transport codes mentioned above. It was found that these models are in overall moderate agreement with each other and with experimental data. Disagreements that were found are discussed.

  10. Evaluation and utilization of beam simulation codes for the SNS ion source and low energy beam transport development

    NASA Astrophysics Data System (ADS)

    Han, B. X.; Welton, R. F.; Stockli, M. P.; Luciano, N. P.; Carmichael, J. R.

    2008-02-01

    Beam simulation codes PBGUNS, SIMION, and LORENTZ-3D were evaluated by modeling the well-diagnosed SNS baseline ion source and low energy beam transport (LEBT) system. An investigation was then conducted using these codes to assist our ion source and LEBT development effort, which is directed at meeting the SNS operational goals as well as the power-upgrade project goals. A high-efficiency H- extraction system as well as magnetic and electrostatic LEBT configurations capable of transporting up to 100 mA are studied using these simulation tools.

  11. Radial dependence of lineal energy distribution of 290-MeV/u carbon and 500-MeV/u iron ion beams using a wall-less tissue-equivalent proportional counter

    PubMed Central

    Tsuda, Shuichi; Sato, Tatsuhiko; Watanabe, Ritsuko; Takada, Masashi

    2015-01-01

    Using a wall-less tissue-equivalent proportional counter for a 0.72-μm site in tissue, we measured the radial dependence of the lineal energy distribution, yf(y), of 290-MeV/u carbon ion and 500-MeV/u iron ion beams. The measured yf(y) distributions and the dose-mean lineal energy, ȳ_D, were compared with calculations performed with the track structure simulation code TRACION and the microdosimetric function of the Particle and Heavy Ion Transport code System (PHITS). The measured values of ȳ_D were consistent with the calculated results within an error of 2%, but differences in the shape of yf(y) were observed for iron ion irradiation. This result indicates that further improvement of the calculation model for the yf(y) distribution in PHITS is needed in the analytical function that describes energy deposition by delta rays, particularly for primary ions having linear energy transfer in excess of a few hundred keV μm⁻¹. PMID:25210053

  12. Anisotropic diffusion in mesh-free numerical magnetohydrodynamics

    NASA Astrophysics Data System (ADS)

    Hopkins, Philip F.

    2017-04-01

    We extend recently developed mesh-free Lagrangian methods for numerical magnetohydrodynamics (MHD) to arbitrary anisotropic diffusion equations, including: passive scalar diffusion, Spitzer-Braginskii conduction and viscosity, cosmic ray diffusion/streaming, anisotropic radiation transport, non-ideal MHD (Ohmic resistivity, ambipolar diffusion, the Hall effect) and turbulent 'eddy diffusion'. We study these as implemented in the code GIZMO for both new meshless finite-volume Godunov schemes (MFM/MFV). We show that the MFM/MFV methods are accurate and stable even with noisy fields and irregular particle arrangements, and recover the correct behaviour even in arbitrarily anisotropic cases. They are competitive with state-of-the-art AMR/moving-mesh methods, and can correctly treat anisotropic diffusion-driven instabilities (e.g. the MTI and HBI, Hall MRI). We also develop a new scheme for stabilizing anisotropic tensor-valued fluxes with high-order gradient estimators and non-linear flux limiters, which is trivially generalized to AMR/moving-mesh codes. We also present applications of some of these improvements for SPH, in the form of a new integral-Godunov SPH formulation that adopts a moving-least squares gradient estimator and introduces a flux-limited Riemann problem between particles.

  13. On the Development of a Deterministic Three-Dimensional Radiation Transport Code

    NASA Technical Reports Server (NTRS)

    Rockell, Candice; Tweed, John

    2011-01-01

    Since astronauts on future deep space missions will be exposed to dangerous radiation, there is a need to accurately model the transport of radiation through shielding materials and to estimate the received radiation dose. In response to this need, a three-dimensional deterministic code for space radiation transport is now under development. The new code GRNTRN is based on a Green's function solution of the Boltzmann transport equation that is constructed in the form of a Neumann series. Analytical approximations will be obtained for the first three terms of the Neumann series and the remainder will be estimated by a non-perturbative technique. This work discusses progress made to date and exhibits some computations based on the first two Neumann series terms.
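    In operator form the construction is the standard one: writing the transport equation as φ = S + Kφ, with S the source propagated by the Green's function and K the scattering/production integral operator, repeated substitution gives the Neumann series

```latex
\phi \;=\; S + K S + K^{2} S + \cdots \;=\; \sum_{n=0}^{\infty} K^{n} S ,
```

the first terms of which are evaluated analytically and the remainder estimated non-perturbatively, as the abstract describes.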

  14. Accelerating execution of the integrated TIGER series Monte Carlo radiation transport codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, L.M.; Hochstedler, R.D.

    1997-02-01

    Execution of the integrated TIGER series (ITS) of coupled electron/photon Monte Carlo radiation transport codes has been accelerated by modifying the FORTRAN source code for more efficient computation. Each member code of ITS was benchmarked and profiled with a specific test case that directed the acceleration effort toward the most computationally intensive subroutines. Techniques for accelerating these subroutines included replacing linear search algorithms with binary versions, replacing the pseudo-random number generator, reducing program memory allocation, and proofing the input files for geometrical redundancies. All techniques produced identical or statistically similar results to the original code. Final benchmark timing of the accelerated code resulted in speed-up factors of 2.00 for TIGER (the one-dimensional slab geometry code), 1.74 for CYLTRAN (the two-dimensional cylindrical geometry code), and 1.90 for ACCEPT (the arbitrary three-dimensional geometry code).
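    The linear-to-binary search replacement mentioned above is the classic optimization for table lookups that sit inside a Monte Carlo inner loop. A minimal illustration (function names are hypothetical, not the ITS FORTRAN routines) of locating an energy bin in a sorted cross-section grid:

```python
import bisect

def find_energy_bin_linear(grid, energy):
    """O(n) linear scan over a sorted energy grid -- the access pattern
    the ITS acceleration work replaced."""
    for i in range(len(grid) - 1):
        if grid[i] <= energy < grid[i + 1]:
            return i
    raise ValueError("energy outside tabulated grid")

def find_energy_bin_binary(grid, energy):
    """O(log n) lookup via the standard library's binary search."""
    i = bisect.bisect_right(grid, energy) - 1
    if 0 <= i < len(grid) - 1:
        return i
    raise ValueError("energy outside tabulated grid")
```

For tables with hundreds of energy points interrogated once per collision, this kind of swap can plausibly account for a large share of such speed-up factors.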

  15. AN OPEN-SOURCE NEUTRINO RADIATION HYDRODYNAMICS CODE FOR CORE-COLLAPSE SUPERNOVAE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O’Connor, Evan, E-mail: evanoconnor@ncsu.edu; CITA, Canadian Institute for Theoretical Astrophysics, Toronto, M5S 3H8

    2015-08-15

    We present an open-source update to the spherically symmetric, general-relativistic hydrodynamics, core-collapse supernova (CCSN) code GR1D. The source code is available at http://www.GR1Dcode.org. We extend its capabilities to include a general-relativistic treatment of neutrino transport based on the moment formalisms of Shibata et al. and Cardall et al. We pay special attention to implementing and testing numerical methods and approximations that lessen the computational demand of the transport scheme by removing the need to invert large matrices. This is especially important for the implementation and development of moment-like transport methods in two and three dimensions. A critical component of neutrino transport calculations is the neutrino–matter interaction coefficients that describe the production, absorption, scattering, and annihilation of neutrinos. In this article we also describe our open-source neutrino interaction library NuLib (available at http://www.nulib.org). We believe that an open-source approach to describing these interactions is one of the major steps needed to progress toward robust models of CCSNe and robust predictions of the neutrino signal. We show, via comparisons to full Boltzmann neutrino-transport simulations of CCSNe, that our neutrino transport code performs remarkably well. Furthermore, we show that the methods and approximations we employ to increase efficiency do not decrease the fidelity of our results. We also test the ability of our general-relativistic transport code to model failed CCSNe by evolving a 40-solar-mass progenitor to the onset of collapse to a black hole.

  16. Silicon Drift Detector response function for PIXE spectra fitting

    NASA Astrophysics Data System (ADS)

    Calzolai, G.; Tapinassi, S.; Chiari, M.; Giannoni, M.; Nava, S.; Pazzi, G.; Lucarelli, F.

    2018-02-01

    The correct determination of the X-ray peak areas in PIXE spectra by fitting with a computer program depends crucially on accurate parameterization of the detector peak response function. In GUPIXWin, the Guelph PIXE software package and one of the most widely used PIXE spectrum analysis codes, the response of a semiconductor detector to monochromatic X-ray radiation is described by a linear combination of several analytical functions: a Gaussian profile for the X-ray line itself, and additional tail contributions (exponential tails and step functions) on the low-energy side of the X-ray line to describe incomplete charge collection effects. The literature on the spectral response of silicon X-ray detectors for PIXE applications is rather scarce; in particular, data for Silicon Drift Detectors (SDD) and for a large range of X-ray energies are missing. Using a set of analytical functions, the SDD response functions were satisfactorily reproduced for the X-ray energy range 1-15 keV. The behaviour of the parameters involved in the SDD tailing functions with X-ray energy is described by simple polynomial functions, which permits easy implementation in PIXE spectrum fitting codes.
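    The Gaussian-plus-tail-plus-step combination described above can be sketched as follows. This is an illustrative Hypermet-style line shape under assumed normalizations; the parameter names are hypothetical, not GUPIXWin's actual ones.

```python
import math

def sdd_response(ch, centroid, sigma, g_area, tail_frac, tail_slope, step_frac):
    """Detector line shape: Gaussian peak plus a resolution-convolved
    exponential tail and a step, both on the low-energy side.

    Illustrative sketch of the kind of model described in the abstract;
    normalizations and parameter names are assumptions.
    """
    x = (ch - centroid) / sigma
    # Main Gaussian profile of the X-ray line
    gauss = math.exp(-0.5 * x * x) / (sigma * math.sqrt(2.0 * math.pi))
    # Exponential tail convolved with the Gaussian resolution (Hypermet form)
    tail = (math.exp(x / tail_slope + 1.0 / (2.0 * tail_slope ** 2))
            * math.erfc(x / math.sqrt(2.0) + 1.0 / (tail_slope * math.sqrt(2.0)))
            / (2.0 * tail_slope * sigma))
    # Step: incomplete charge collection plateau below the peak
    step = math.erfc(x / math.sqrt(2.0)) / (2.0 * sigma)
    return g_area * (gauss + tail_frac * tail + step_frac * step)
```

In a fit, `sigma`, `tail_frac`, `tail_slope`, and `step_frac` would be the quantities parameterized as simple polynomial functions of X-ray energy, as the abstract describes.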

  17. Reactive transport modeling in fractured rock: A state-of-the-science review

    NASA Astrophysics Data System (ADS)

    MacQuarrie, Kerry T. B.; Mayer, K. Ulrich

    2005-10-01

    The field of reactive transport modeling has expanded significantly in the past two decades and has assisted in resolving many issues in Earth Sciences. Numerical models allow for detailed examination of coupled transport and reactions, or more general investigation of controlling processes over geologic time scales. Reactive transport models serve to provide guidance in field data collection and, in particular, enable researchers to link modeling and hydrogeochemical studies. In this state-of-the-science review, the key objectives were to examine the applicability of reactive transport codes for exploring issues of redox stability to depths of several hundreds of meters in sparsely fractured crystalline rock, with a focus on the Canadian Shield setting. A conceptual model of oxygen ingress and redox buffering, within a Shield environment at time and space scales relevant to nuclear waste repository performance, is developed through a review of previous research. This conceptual model describes geochemical and biological processes and mechanisms materially important to understanding redox buffering capacity and radionuclide mobility in the far-field. Consistent with this model, reactive transport codes should ideally be capable of simulating the effects of changing recharge water compositions as a result of long-term climate change, and fracture-matrix interactions that may govern water-rock interaction. Other aspects influencing the suitability of reactive transport codes include the treatment of various reaction and transport time scales, the ability to apply equilibrium or kinetic formulations simultaneously, the need to capture feedback between water-rock interactions and porosity-permeability changes, and the representation of fractured crystalline rock environments as discrete fracture or dual continuum media. A review of modern multicomponent reactive transport codes indicates a relatively high level of maturity.
Within the Yucca Mountain nuclear waste disposal program, reactive transport codes of varying complexity have been applied to investigate the migration of radionuclides and the geochemical evolution of host rock around the planned disposal facility. Through appropriate near- and far-field application of dual continuum codes, this example demonstrates how reactive transport models have been applied to assist in constraining historic water infiltration rates, interpreting the sealing of flow paths due to mineral precipitation, and investigating post-closure geochemical monitoring strategies. Natural analogue modeling studies, although few in number, are also of key importance as they allow the comparison of model results with hydrogeochemical and paleohydrogeological data over geologic time scales.

  18. Simulating Coupling Complexity in Space Plasmas: First Results from a new code

    NASA Astrophysics Data System (ADS)

    Kryukov, I.; Zank, G. P.; Pogorelov, N. V.; Raeder, J.; Ciardo, G.; Florinski, V. A.; Heerikhuisen, J.; Li, G.; Petrini, F.; Shematovich, V. I.; Winske, D.; Shaikh, D.; Webb, G. M.; Yee, H. M.

    2005-12-01

    The development of codes that embrace 'coupling complexity' via the self-consistent incorporation of multiple physical scales and multiple physical processes in models has been identified by the NRC Decadal Survey in Solar and Space Physics as a crucial necessary development in simulation/modeling technology for the coming decade. The National Science Foundation, through its Information Technology Research (ITR) Program, is supporting our efforts to develop a new class of computational code for plasmas and neutral gases that integrates multiple scales and multiple physical processes and descriptions. We are developing a highly modular, parallelized, scalable code that incorporates multiple scales by synthesizing 3 simulation technologies: 1) Computational fluid dynamics (hydrodynamics or magnetohydrodynamics, MHD) for the large-scale plasma; 2) direct Monte Carlo simulation of atoms/neutral gas, and 3) transport code solvers to model highly energetic particle distributions. We are constructing the code so that a fourth simulation technology, hybrid simulations for microscale structures and particle distributions, can be incorporated in future work, but for the present, this aspect will be addressed at a test-particle level. This synthesis will provide a computational tool that will advance our understanding of the physics of neutral and charged gases enormously. Besides making major advances in basic plasma physics and neutral gas problems, this project will address 3 Grand Challenge space physics problems that reflect our research interests: 1) To develop a temporal global heliospheric model which includes the interaction of solar and interstellar plasma with neutral populations (hydrogen, helium, etc., and dust), test-particle kinetic pickup ion acceleration at the termination shock, anomalous cosmic ray production, interaction with galactic cosmic rays, while incorporating the time variability of the solar wind and the solar cycle.
2) To develop a coronal mass ejection and interplanetary shock propagation model for the inner and outer heliosphere, including, at a test-particle level, wave-particle interactions and particle acceleration at traveling shock waves and compression regions. 3) To develop an advanced Geospace General Circulation Model (GGCM) capable of realistically modeling space weather events, in particular the interaction with CMEs and geomagnetic storms. Furthermore, by implementing scalable run-time support and sophisticated off- and on-line prediction algorithms, we anticipate important advances in the development of automatic and intelligent system software to optimize a wide variety of 'embedded' computations on parallel computers. Finally, public domain MHD and hydrodynamic codes have had a transforming effect on space physics and astrophysics. We expect that our new-generation, open-source, public-domain multi-scale code will have a similar transformational effect in a variety of disciplines, opening up new classes of problems to physicists and engineers alike.

  19. Reactive transport codes for subsurface environmental simulation

    DOE PAGES

    Steefel, C. I.; Appelo, C. A. J.; Arora, B.; ...

    2014-09-26

A general description of the mathematical and numerical formulations used in modern numerical reactive transport codes relevant for subsurface environmental simulations is presented. The formulations are followed by short descriptions of commonly used and available subsurface simulators that consider continuum representations of flow, transport, and reactions in porous media. These formulations are applicable to most of the subsurface environmental benchmark problems included in this special issue. The list of codes described briefly here includes PHREEQC, HPx, PHT3D, OpenGeoSys (OGS), HYTEC, ORCHESTRA, TOUGHREACT, eSTOMP, HYDROGEOCHEM, CrunchFlow, MIN3P, and PFLOTRAN. The descriptions include a high-level list of capabilities for each of the codes, along with a selective list of applications that highlight their capabilities and historical development.

  20. Development of a Coded Aperture X-Ray Backscatter Imager for Explosive Device Detection

    NASA Astrophysics Data System (ADS)

    Faust, Anthony A.; Rothschild, Richard E.; Leblanc, Philippe; McFee, John Elton

    2009-02-01

    Defence R&D Canada has an active research and development program on detection of explosive devices using nuclear methods. One system under development is a coded aperture-based X-ray backscatter imaging detector designed to provide sufficient speed, contrast and spatial resolution to detect antipersonnel landmines and improvised explosive devices. The successful development of a hand-held imaging detector requires, among other things, a light-weight, ruggedized detector with low power requirements, supplying high spatial resolution. The University of California, San Diego-designed HEXIS detector provides a modern, large area, high-temperature CZT imaging surface, robustly packaged in a light-weight housing with sound mechanical properties. Based on the potential for the HEXIS detector to be incorporated as the detection element of a hand-held imaging detector, the authors initiated a collaborative effort to demonstrate the capability of a coded aperture-based X-ray backscatter imaging detector. This paper will discuss the landmine and IED detection problem and review the coded aperture technique. Results from initial proof-of-principle experiments will then be reported.

  1. Insights into electron and ion acceleration and transport from x-ray and gamma-ray imaging spectroscopy

    NASA Astrophysics Data System (ADS)

    Hurford, Gordon J.; Krucker, Samuel

    The previous solar maximum has featured high resolution imaging/spectroscopy observations at hard x-ray and gamma-ray energies by the Reuven Ramaty High Energy Solar Spectroscopic Imager (RHESSI). Highlights of these observations will be reviewed, along with their implications for our understanding of ion and electron acceleration and transport processes. The results to date have included new insights into the location of the acceleration region and the thick target model, a new appreciation of the significance of x-ray albedo, observation of coronal gamma-ray sources and their implications for electron trapping, and indications of differences in the acceleration and transport between electrons and ions. The role of RHESSI's observational strengths and weaknesses in determining the character of its scientific results will also be discussed and used to identify what aspects of the acceleration and transport processes must await the next generation of instrumentation. The extent to which new instrumentation now under development, such as Solar Orbiter/STIX, GRIPS, and FOXSI, can address these open issues will be outlined.

  2. Aerosol and gamma background measurements at Basic Environmental Observatory Moussala

    NASA Astrophysics Data System (ADS)

    Angelov, Christo; Arsov, Todor; Penev, Ilia; Nikolova, Nina; Kalapov, Ivo; Georgiev, Stefan

    2016-03-01

    Transboundary and local pollution, global climate change, and cosmic rays are the main areas of research performed at the regional Global Atmospheric Watch (GAW) station Moussala BEO (2925 m a.s.l., 42°10'45'' N, 23°35'07'' E). Real-time measurements and observations are performed in the field of atmospheric chemistry and physics. Complex information about the aerosol is obtained by using a three-wavelength integrating nephelometer for measuring the scattering and backscattering coefficients, a continuous light absorption photometer, and a scanning mobility particle sizer. The system for measuring radioactivity and heavy metals in aerosols allows us to monitor large-scale radioactive aerosol transport. The measurements of the gamma background and the gamma-ray spectrum in the air near Moussala peak are carried out in real time. The HYSPLIT back trajectory model is used to determine the origin of the registered data. DREAM code calculations [2] are used to forecast the air mass trajectory. The information obtained, combined with a full set of corresponding meteorological parameters, is transmitted via a high frequency radio telecommunication system to the Internet.

  3. Systematic measurement of lineal energy distributions for proton, He and Si ion beams over a wide energy range using a wall-less tissue equivalent proportional counter.

    PubMed

    Tsuda, Shuichi; Sato, Tatsuhiko; Takahashi, Fumiaki; Satoh, Daiki; Sasaki, Shinichi; Namito, Yoshihito; Iwase, Hiroshi; Ban, Shuichi; Takada, Masashi

    2012-01-01

    The frequency distributions of the lineal energy, y, of 160 MeV proton, 150 MeV/u helium, and 490 MeV/u silicon ion beams were measured using a wall-less tissue equivalent proportional counter (TEPC) with a site size of 0.72 µm. The measured frequency distributions of y, as well as the dose-mean values, y_D, agree with the corresponding data calculated using the microdosimetric function of the particle and heavy ion transport code system PHITS. The values of y_D increase in the range of LET below ~10 keV µm⁻¹ because of discrete energy deposition by delta rays, while the relation is reversed above ~10 keV µm⁻¹ as the amount of energy escaping via delta rays increases. These results indicate that care should be taken with the difference between y_D and LET when estimating the ionization density that usually relates to the relative biological effectiveness (RBE) of energetic heavy ions.
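The dose-mean lineal energy compared above is defined from the measured frequency distribution as y_D = ∫ y² f(y) dy / ∫ y f(y) dy. A minimal numerical sketch of that definition (the bin values below are illustrative, not the measured TEPC or PHITS data):

```python
import numpy as np

# Hypothetical frequency distribution f(y) of lineal energy y (keV/um);
# these numbers are illustrative, not the measured data from the paper.
y = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])      # lineal energy bins
f = np.array([0.30, 0.25, 0.20, 0.15, 0.07, 0.03])  # bin probabilities
f = f / f.sum()                                     # normalize

# Frequency-mean y_F = sum(y f) and dose-mean y_D = sum(y^2 f) / y_F
y_F = np.sum(y * f)
y_D = np.sum(y**2 * f) / y_F

print(f"y_F = {y_F:.3f} keV/um, y_D = {y_D:.3f} keV/um")
```

Because y_D weights each event by its energy deposit, y_D ≥ y_F always, which is why the dose-mean is the quantity compared against LET in the abstract.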

  4. Experimental and modeling uncertainties in the validation of lower hybrid current drive

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poli, F. M.; Bonoli, P. T.; Chilenski, M.

    Our work discusses sources of uncertainty in the validation of lower hybrid wave current drive simulations against experiments, by evolving self-consistently the magnetic equilibrium and the heating and current drive profiles, calculated with a combined toroidal ray tracing code and 3D Fokker–Planck solver. The simulations indicate a complex interplay of elements, where uncertainties in the input plasma parameters, in the models, and in the transport solver combine and, at times, compensate each other. It is concluded that ray-tracing calculations should include a realistic representation of the density and temperature in the region between the confined plasma and the wall, which is especially important in regimes where the LH waves are weakly damped and undergo multiple reflections from the plasma boundary. Uncertainties introduced in the processing of diagnostic data as well as uncertainties introduced by model approximations are assessed. We show that, by comparing the evolution of the plasma parameters in self-consistent simulations with available data, inconsistencies can be identified and limitations in the models or in the experimental data assessed.

  5. Comparison of CREME (cosmic-ray effects on microelectronics) model LET (linear energy transfer) spaceflight dosimetry data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Letaw, J.R.; Adams, J.H.

    The galactic cosmic radiation (GCR) component of space radiation is the dominant cause of single-event phenomena in microelectronic circuits when Earth's magnetic shielding is low. Spaceflights outside the magnetosphere and in high-inclination orbits are examples of such circumstances. In high-inclination orbits, low-energy (high-LET) particles are transmitted through the field only at extreme latitudes, but can dominate the orbit-averaged dose. GCR is an important part of the radiation dose to astronauts under the same conditions. As a test of the CREME environmental model and the particle transport codes used to estimate single-event upsets, we have compiled existing measurements of HZE doses from missions where GCR is expected to be important: Apollo 16 and 17, Skylab, the Apollo-Soyuz Test Project, and Kosmos 782. The LET spectra due to direct ionization from GCR have been estimated for each of these missions. The resulting comparisons with data validate the CREME model predictions of high-LET galactic cosmic-ray fluxes to within a factor of two. Some systematic differences between the model and data are identified.

  6. Experimental and modeling uncertainties in the validation of lower hybrid current drive

    DOE PAGES

    Poli, F. M.; Bonoli, P. T.; Chilenski, M.; ...

    2016-07-28

    Our work discusses sources of uncertainty in the validation of lower hybrid wave current drive simulations against experiments, by evolving self-consistently the magnetic equilibrium and the heating and current drive profiles, calculated with a combined toroidal ray tracing code and 3D Fokker–Planck solver. The simulations indicate a complex interplay of elements, where uncertainties in the input plasma parameters, in the models, and in the transport solver combine and, at times, compensate each other. It is concluded that ray-tracing calculations should include a realistic representation of the density and temperature in the region between the confined plasma and the wall, which is especially important in regimes where the LH waves are weakly damped and undergo multiple reflections from the plasma boundary. Uncertainties introduced in the processing of diagnostic data as well as uncertainties introduced by model approximations are assessed. We show that, by comparing the evolution of the plasma parameters in self-consistent simulations with available data, inconsistencies can be identified and limitations in the models or in the experimental data assessed.

  7. Non-destructive in-situ method and apparatus for determining radionuclide depth in media

    DOEpatents

    Xu, X. George; Naessens, Edward P.

    2003-01-01

    A non-destructive method and apparatus based on in-situ gamma spectroscopy is used to determine the depth of radiological contamination in media such as concrete. An algorithm, the Gamma Penetration Depth Unfolding Algorithm (GPDUA), uses point kernel techniques to predict the depth of contamination based on the uncollided peak information from the in-situ gamma spectroscopy. The invention is better, faster, safer, and cheaper than current practices in the decontamination and decommissioning of facilities, which are slow, rough, and unsafe. The invention uses a priori knowledge of the contaminant source distribution. The applicable radiological contaminants of interest are any isotopes that emit two or more gamma rays per disintegration, or isotopes that emit a single gamma ray but have gamma-emitting progeny in secular equilibrium with the parent (e.g., ⁶⁰Co, ²³⁵U, and ¹³⁷Cs, to name a few). The depths predicted by the GPDUA using Monte Carlo N-Particle Transport Code (MCNP) simulations and laboratory experiments using ⁶⁰Co have consistently been within 20% of the actual or known depth.
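The point-kernel idea behind an algorithm like GPDUA can be illustrated with a two-line attenuation sketch: for a ⁶⁰Co source buried at depth d, the ratio of the two uncollided photopeaks encodes d, because the lower-energy 1.17 MeV line is attenuated slightly more than the 1.33 MeV line. This is a hedged sketch, not the actual GPDUA; the attenuation coefficients and the single-ratio unfolding below are illustrative assumptions:

```python
import math

# Two-line depth unfolding in the spirit of GPDUA (not the patented
# algorithm itself). A buried 60Co source emits 1.17 and 1.33 MeV gammas;
# the measured uncollided-peak ratio encodes the burial depth d.
MU_117 = 0.148   # 1/cm, hypothetical attenuation coeff. of concrete at 1.17 MeV
MU_133 = 0.138   # 1/cm, hypothetical value at 1.33 MeV

def depth_from_ratio(i_117, i_133, unattenuated_ratio=1.0):
    """Unfold depth (cm) from the measured uncollided-peak ratio.

    r = (I_117/I_133)/(Y_117/Y_133) = exp(-(mu_117 - mu_133) * d)
    """
    r = (i_117 / i_133) / unattenuated_ratio
    return math.log(r) / (MU_133 - MU_117)

# Forward check: a source at 5 cm produces ratio exp(-(mu1 - mu2) * 5)
d_true = 5.0
ratio = math.exp(-(MU_117 - MU_133) * d_true)
print(depth_from_ratio(ratio, 1.0))  # recovers 5.0 cm
```

In practice the unfolding would be applied to background-subtracted peak areas and efficiency-corrected intensities; the sketch shows only the attenuation arithmetic.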

  8. Method for measuring the focal spot size of an x-ray tube using a coded aperture mask and a digital detector.

    PubMed

    Russo, Paolo; Mettivier, Giovanni

    2011-04-01

    The goal of this study is to evaluate a new method based on a coded aperture mask combined with a digital x-ray imaging detector for measurements of the focal spot sizes of diagnostic x-ray tubes. Common techniques for focal spot size measurements employ a pinhole camera, a slit camera, or a star resolution pattern. The coded aperture mask is a radiation collimator consisting of a large number of apertures disposed on a predetermined grid in an array, through which the radiation source is imaged onto a digital x-ray detector. The method of the coded mask camera allows one to obtain a one-shot accurate and direct measurement of the two dimensions of the focal spot (like that for a pinhole camera) but at a low tube loading (like that for a slit camera). A large number of small apertures in the coded mask operate as a "multipinhole" with greater efficiency than a single pinhole, but keeping the resolution of a single pinhole. X-ray images result from the multiplexed output on the detector image plane of such a multiple aperture array, and the image of the source is digitally reconstructed with a deconvolution algorithm. Images of the focal spot of a laboratory x-ray tube (W anode: 35-80 kVp; focal spot size of 0.04 mm) were acquired at different geometrical magnifications with two different types of digital detector (a photon counting hybrid silicon pixel detector with 0.055 mm pitch and a flat panel CMOS digital detector with 0.05 mm pitch) using a high resolution coded mask (type no-two-holes-touching modified uniformly redundant array) with 480 0.07 mm apertures, designed for imaging at energies below 35 keV. Measurements with a slit camera were performed for comparison. A test with a pinhole camera and with the coded mask on a computed radiography mammography unit with 0.3 mm focal spot was also carried out. 
The full width at half maximum focal spot sizes were obtained from the line profiles of the decoded images, showing a focal spot of 0.120 mm x 0.105 mm at 35 kVp and M = 6.1, with a detector entrance exposure as low as 1.82 mR (0.125 mA s tube load). The slit camera indicated a focal spot of 0.112 mm x 0.104 mm at 35 kVp and M = 3.15, with an exposure at the detector of 72 mR. Focal spot measurements with the coded mask could be performed up to 80 kVp. Tolerance to angular misalignment with the reference beam of up to 7 degrees in in-plane rotations and 1 degree in out-of-plane rotations was observed. The axial distance of the focal spot from the coded mask could also be determined. It is possible to determine the beam intensity via measurement of the intensity of the decoded image of the focal spot and a calibration procedure. Coded aperture masks coupled to a digital area detector produce precise determinations of the focal spot of an x-ray tube with reduced tube loading and measurement time, together with a large tolerance in the alignment of the mask.
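The decoding step described above (a deconvolution of the multiplexed detector image) can be sketched numerically. Here a random binary mask stands in for the paper's NTHT-MURA, and correlation decoding with the balanced pattern G = 2A − 1 stands in for the authors' deconvolution algorithm; both substitutions are assumptions for illustration only:

```python
import numpy as np

# Coded-aperture imaging sketch: the detector records the circular
# convolution of the source with the mask; decoding cross-correlates
# the record with a balanced pattern so a point source reappears as a peak.
rng = np.random.default_rng(0)
n = 31
mask = (rng.random((n, n)) < 0.5).astype(float)   # aperture A: 1=open, 0=opaque
decoder = 2 * mask - 1                            # balanced decoding pattern G

def record(source):
    """Detector image = circular convolution of source with mask A."""
    return np.real(np.fft.ifft2(np.fft.fft2(source) * np.fft.fft2(mask)))

def decode(image):
    """Reconstruction = circular cross-correlation of image with G."""
    return np.real(np.fft.ifft2(np.fft.fft2(image) * np.conj(np.fft.fft2(decoder))))

# A point source (e.g. an idealized focal spot) at pixel (10, 7)
src = np.zeros((n, n)); src[10, 7] = 1.0
rec = decode(record(src))
peak = np.unravel_index(np.argmax(rec), rec.shape)
print(peak)  # → (10, 7): the decoded peak sits at the source position
```

A true MURA family gives an exactly flat sidelobe after decoding; the random mask used here only approximates that, which is why real instruments use designed arrays.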

  9. NPTFit: A Code Package for Non-Poissonian Template Fitting

    NASA Astrophysics Data System (ADS)

    Mishra-Sharma, Siddharth; Rodd, Nicholas L.; Safdi, Benjamin R.

    2017-06-01

    We present NPTFit, an open-source code package, written in Python and Cython, for performing non-Poissonian template fits (NPTFs). The NPTF is a recently developed statistical procedure for characterizing the contribution of unresolved point sources (PSs) to astrophysical data sets. The NPTF was first applied to Fermi gamma-ray data to provide evidence that the excess of ˜GeV gamma-rays observed in the inner regions of the Milky Way likely arises from a population of sub-threshold point sources, and the NPTF has since found additional applications studying sub-threshold extragalactic sources at high Galactic latitudes. The NPTF generalizes traditional astrophysical template fits to allow for the ability to search for populations of unresolved PSs that may follow a given spatial distribution. NPTFit builds upon the framework of the fluctuation analyses developed in X-ray astronomy, thus it likely has applications beyond those demonstrated with gamma-ray data. The NPTFit package utilizes novel computational methods to perform the NPTF efficiently. The code is available at http://github.com/bsafdi/NPTFit and up-to-date and extensive documentation may be found at http://nptfit.readthedocs.io.
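The statistical signature that an NPTF exploits can be illustrated with a toy generative model (not NPTFit's actual likelihood): pixel counts produced by a population of unresolved point sources are over-dispersed relative to a Poisson map with the same mean. All parameters below are hypothetical:

```python
import numpy as np

# Toy non-Poissonian map: each pixel hosts a Poisson number of unresolved
# sources whose expected counts follow a power law, so the resulting pixel
# counts have variance well above their mean (the signature NPTFs fit).
rng = np.random.default_rng(7)
n_pix = 50_000
mean_sources = 2.0    # mean number of sources per pixel (hypothetical)
flux_scale = 3.0      # counts scale per source (hypothetical)

counts = np.zeros(n_pix)
n_src = rng.poisson(mean_sources, n_pix)
for i, k in enumerate(n_src):
    # Pareto-distributed (power-law) expected counts for the k sources
    mu = flux_scale * rng.pareto(3.0, k).sum()
    counts[i] = rng.poisson(mu)

fano = counts.var() / counts.mean()
print(fano)  # >> 1: over-dispersed, i.e. non-Poissonian
```

A purely diffuse template with the same mean would give a Fano factor of 1; the excess dispersion is what lets the NPTF separate sub-threshold point sources from smooth emission.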

  10. Simulation of image formation in x-ray coded aperture microscopy with polycapillary optics.

    PubMed

    Korecki, P; Roszczynialski, T P; Sowa, K M

    2015-04-06

    In x-ray coded aperture microscopy with polycapillary optics (XCAMPO), the microstructure of focusing polycapillary optics is used as a coded aperture and enables depth-resolved x-ray imaging at a resolution better than the focal spot dimensions. Improvements in the resolution and development of 3D encoding procedures require a simulation model that can predict the outcome of XCAMPO experiments. In this work we introduce a model of image formation in XCAMPO which enables calculation of XCAMPO datasets for arbitrary positions of the object relative to the focal plane as well as to incorporate optics imperfections. In the model, the exit surface of the optics is treated as a micro-structured x-ray source that illuminates a periodic object. This makes it possible to express the intensity of XCAMPO images as a convolution series and to perform simulations by means of fast Fourier transforms. For non-periodic objects, the model can be applied by enforcing artificial periodicity and setting the spatial period larger than the field of view. Simulations are verified by comparison with experimental data.
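The computational core of such a model, expressing image intensity as a convolution series evaluated with fast Fourier transforms, rests on the identity between circular convolution and a pointwise product in Fourier space. A minimal check of that identity for one term of the series (the arrays are random stand-ins, not real XCAMPO data):

```python
import numpy as np

# Verify that the FFT route (fast) reproduces the direct periodic
# (circular) convolution sum (slow) for one convolution term.
rng = np.random.default_rng(1)
n = 64
source = rng.random((n, n))   # micro-structured source intensity (stand-in)
obj = rng.random((n, n))      # one period of the artificially periodic object

# FFT route: what makes large simulations tractable
img_fft = np.real(np.fft.ifft2(np.fft.fft2(source) * np.fft.fft2(obj)))

def direct(ix, iy):
    """Direct circular-convolution sum for a single output pixel."""
    xs, ys = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    return np.sum(source * obj[(ix - xs) % n, (iy - ys) % n])

print(np.allclose(img_fft[3, 5], direct(3, 5)))  # → True
```

Enforcing a spatial period larger than the field of view, as the abstract notes, keeps the circular wrap-around implicit in the FFT from contaminating the imaged region.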

  11. Some issues and subtleties in numerical simulation of X-ray FEL's

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fawley, William M.

    Part of the overall design effort for x-ray FELs such as the LCLS and TESLA projects has involved extensive use of particle simulation codes to predict their output performance and underlying sensitivity to various input parameters (e.g., electron beam emittance). This paper discusses some of the numerical issues that must be addressed by simulation codes in this regime. We first give a brief overview of the standard approximations and simulation methods adopted by time-dependent (i.e., polychromatic) codes such as GINGER, GENESIS, and FAST3D, including the effects of temporal discretization and the resultant limited spectral bandpass, and then discuss the accuracies and inaccuracies of these codes in predicting incoherent spontaneous emission (i.e., the extremely low gain regime).
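The limited spectral bandpass that temporal discretization imposes is, at bottom, the Nyquist limit of the slice grid: a field sampled every Δt can represent frequencies only up to 1/(2Δt). A small sketch with a hypothetical slice spacing (not GINGER's or GENESIS's actual parameters):

```python
import numpy as np

# Frequencies representable on a time-discretized field grid.
dt = 2.0e-15                      # slice spacing in seconds (hypothetical)
n = 1024                          # number of temporal slices
freqs = np.fft.fftfreq(n, d=dt)   # frequency grid of the sampled field

nyquist = 1.0 / (2.0 * dt)
print(freqs.max(), nyquist)       # max representable freq. just below Nyquist
```

Any spectral content outside this bandpass is aliased or simply absent, which is one reason polychromatic codes must choose slice spacing against the bandwidth of the radiation they intend to resolve.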

  12. The Use of Pro/Engineer CAD Software and Fishbowl Tool Kit in Ray-tracing Analysis

    NASA Technical Reports Server (NTRS)

    Nounu, Hatem N.; Kim, Myung-Hee Y.; Ponomarev, Artem L.; Cucinotta, Francis A.

    2009-01-01

    This document is designed as a manual for a user who wants to operate Pro/ENGINEER (ProE) Wildfire 3.0 with the NASA Space Radiation Program's (SRP) custom-designed toolkit, called 'Fishbowl', for the ray tracing of complex spacecraft geometries given by a ProE CAD model. The analysis of spacecraft geometry through ray tracing is a vital part of the calculation of health risks from space radiation. Space radiation poses severe risks of cancer, degenerative diseases, and acute radiation sickness during long-term exploration missions, and shielding optimization is an important component in the application of radiation risk models. Ray tracing is a technique in which 3-dimensional (3D) vehicle geometry can be represented as the input for the space radiation transport code and subsequent risk calculations. In ray tracing, a certain number of rays (on the order of 1000) are used to calculate the equivalent thickness, say of aluminum, of the spacecraft geometry seen at a point of interest called the dose point. The rays originate at the dose point and terminate at a homogeneously distributed set of points lying on a sphere that circumscribes the spacecraft and that has its center at the dose point. The distance a ray traverses in each material is converted to an aluminum or other user-selected equivalent thickness, and the equivalent thicknesses are summed along each ray. Since each ray points in a direction, the aluminum equivalent of each ray represents the shielding that the geometry provides to the dose point from that particular direction. This manual first lists the contact information for help in installing ProE and Fishbowl, in addition to notes on platform support and system requirements. Second, the document shows the user how to ray trace a ProE-designed 3D assembly, and will serve later as a reference for troubleshooting. The user is assumed to have previous knowledge of ProE and CAD modeling.
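The procedure described above (rays from the dose point to homogeneously distributed points on a circumscribing sphere, with path lengths converted to aluminum equivalent by density scaling) can be sketched as follows; the single spherical shell, its material, and all dimensions are hypothetical stand-ins for a real CAD model:

```python
import numpy as np

# Ray-tracing thickness sketch: areal density along each ray is converted
# to an aluminum-equivalent thickness via the density ratio. Geometry is
# a single hypothetical polyethylene shell with the dose point at its center.
RHO_AL = 2.70                 # g/cm^3, aluminum
RHO_POLY = 0.95               # g/cm^3, hypothetical wall material
R_IN, R_OUT = 100.0, 105.0    # shell radii (cm)

def sphere_points(n):
    """Roughly uniform ray directions via the Fibonacci spiral."""
    i = np.arange(n)
    z = 1 - 2 * (i + 0.5) / n
    phi = np.pi * (1 + 5**0.5) * i
    r = np.sqrt(1 - z**2)
    return np.stack([r * np.cos(phi), r * np.sin(phi), z], axis=1)

def al_equivalent(n_rays=1000):
    dirs = sphere_points(n_rays)
    # With the dose point at the center, every ray crosses the shell with
    # the same path; a real model would intersect each ray with the CAD solid.
    path = np.full(len(dirs), R_OUT - R_IN)
    return path * RHO_POLY / RHO_AL   # aluminum-equivalent thickness (cm)

t = al_equivalent()
print(t.mean())   # same value for every ray in this centered geometry
```

The distribution of these per-ray thicknesses (non-trivial for off-center dose points and real geometries) is exactly what the transport code consumes as directional shielding input.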

  13. Dust-Particle Transport in Tokamak Edge Plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pigarov, A Y; Krasheninnikov, S I; Soboleva, T K

    2005-09-12

    Dust particulates in the size range of 10 nm-100 µm are found in all fusion devices. Such dust can be generated during tokamak operation due to strong plasma/material-surface interactions. Some recent experiments and theoretical estimates indicate that dust particles can provide an important source of impurities in the tokamak plasma. Moreover, dust can be a serious threat to the safety of next-step fusion devices. In this paper, recent experimental observations on dust in fusion devices are reviewed. A physical model for dust transport simulation and a newly developed code, DUSTT, are discussed. The DUSTT code incorporates dust dynamics due to comprehensive dust-plasma interactions as well as the effects of dust heating, charging, and evaporation. The code tracks test dust particles in realistic plasma backgrounds as provided by edge-plasma transport codes. Results are presented for dust transport in current and next-step tokamaks. The effect of dust on divertor plasma profiles and core plasma contamination is examined.

  14. Total x-ray power measurements in the Sandia LIGA program.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malinowski, Michael E.; Ting, Aili

    2005-08-01

    Total X-ray power measurements using aluminum block calorimetry and other techniques were made at LIGA X-ray scanner synchrotron beamlines located at both the Advanced Light Source (ALS) and the Advanced Photon Source (APS). This block calorimetry work was initially performed on LIGA beamline 3.3.1 of the ALS to provide experimental checks of predictions of the LEX-D (LIGA Exposure-Development) code for LIGA X-ray exposures, version 7.56, the version of the code in use at the time the calorimetry was done. These experiments showed that it was necessary to use bend magnet field strengths and electron storage ring energies different from the default values originally in the code in order to obtain good agreement between experiment and theory. The results indicated that agreement between LEX-D predictions and experiment could be as good as 5% only if (1) more accurate values of the ring energies, (2) local values of the magnetic field at the beamline source point, and (3) the NIST database for X-ray/materials interactions were used as code inputs. These local magnetic field values and accurate ring energies, together with the NIST database, are now defaults in the newest release of LEX-D, version 7.61. Three-dimensional simulations of the temperature distributions in the aluminum calorimeter block for a typical ALS power measurement were made with the ABAQUS code and found to be in good agreement with the experimental temperature data. As an application of the block calorimetry technique, the X-ray power exiting the mirror in place at a LIGA scanner located at the APS beamline 10 BM was measured with a calorimeter similar to the one used at the ALS. The overall results at the APS demonstrated the utility of calorimetry in helping to characterize the total X-ray power in LIGA beamlines. 
In addition to the block calorimetry work at the ALS and APS, a preliminary comparison of heat flux sensors, photodiodes, and modified beam calorimeters as total X-ray power monitors was made at ALS beamline 3.3.1. This work showed that a modification of a commercially available heat flux sensor could result in a simple, direct-reading beam power meter that could be useful for monitoring total X-ray power in Sandia's LIGA exposure stations at the ALS, APS, and Stanford Synchrotron Radiation Laboratory (SSRL).
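The block-calorimetry principle behind these measurements reduces to P = m·c·dT/dt while the beam heats the block (radiative and conductive losses neglected). A minimal sketch with illustrative numbers, not Sandia's actual calorimeter parameters:

```python
import numpy as np

# Beam power from a block-calorimeter temperature ramp: P = m * c * dT/dt.
# Mass and ramp rate below are hypothetical; c is the handbook value for Al.
M_BLOCK = 0.050   # kg, aluminum block mass (hypothetical)
C_AL = 897.0      # J/(kg K), specific heat of aluminum

def beam_power(temps_K, times_s):
    """Least-squares slope of the temperature ramp, times m*c."""
    slope = np.polyfit(times_s, temps_K, 1)[0]
    return M_BLOCK * C_AL * slope

times = np.linspace(0.0, 60.0, 61)     # one minute of exposure
temps = 295.0 + 0.02 * times           # a clean 0.02 K/s ramp
print(beam_power(temps, times))        # → ~0.897 W
```

Fitting the slope rather than differencing endpoints suppresses readout noise, and in practice a cool-down measurement would be subtracted to correct for the neglected losses.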

  15. Regolith X-Ray Imaging Spectrometer (REXIS) Aboard the OSIRIS-REx Asteroid Sample Return Mission

    NASA Astrophysics Data System (ADS)

    Masterson, R. A.; Chodas, M.; Bayley, L.; Allen, B.; Hong, J.; Biswas, P.; McMenamin, C.; Stout, K.; Bokhour, E.; Bralower, H.; Carte, D.; Chen, S.; Jones, M.; Kissel, S.; Schmidt, F.; Smith, M.; Sondecker, G.; Lim, L. F.; Lauretta, D. S.; Grindlay, J. E.; Binzel, R. P.

    2018-02-01

    The Regolith X-ray Imaging Spectrometer (REXIS) is the student collaboration experiment proposed and built by an MIT-Harvard team, launched aboard NASA's OSIRIS-REx asteroid sample return mission. REXIS complements the scientific investigations of other OSIRIS-REx instruments by determining the relative abundances of key elements present on the asteroid's surface by measuring the X-ray fluorescence spectrum (stimulated by the natural solar X-ray flux) over the range of energies 0.5 to 7 keV. REXIS consists of two components: a main imaging spectrometer with a coded aperture mask and a separate solar X-ray monitor to account for the Sun's variability. In addition to element abundance ratios (relative to Si) pinpointing the asteroid's most likely meteorite association, REXIS also maps elemental abundance variability across the asteroid's surface using the asteroid's rotation as well as the spacecraft's orbital motion. Image reconstruction at the highest resolution is facilitated by the coded aperture mask. Through this operation, REXIS will be the first application of X-ray coded aperture imaging to planetary surface mapping, making this student-built instrument a pathfinder toward future planetary exploration. To date, 60 students at the undergraduate and graduate levels have been involved with the REXIS project, with the hands-on experience translating to a dozen Master's and Ph.D. theses and other student publications.

  16. March 7, 1970 solar eclipse investigation

    NASA Technical Reports Server (NTRS)

    Accardo, C. A.

    1972-01-01

    Studies from rockets directed toward establishing the solar X-ray fluxes during the 7 March 1970 total eclipse over the North American continent are reported. A map of the eclipse path is presented. The measured absorption profiles for the residual X-rays are useful in establishing their contribution to the D- and E-region ionization during the eclipse. The studies were performed with two Nike-Apache payloads launched over Wallops Island, Virginia. In addition to three X-ray detectors in the 1-8 Å, 8-20 Å, and 44-60 Å bands, two additional experiments were included in the payloads: an electric field experiment and an epithermal photoelectron experiment. The X-ray instrumentation, payload description, flight circumstances, and finally the X-ray results obtained are described. The various computer codes employed for the purpose of reducing the telemetered data, as well as the eclipse codes, are included.

  17. Advanced x-ray imaging spectrometer

    NASA Technical Reports Server (NTRS)

    Callas, John L. (Inventor); Soli, George A. (Inventor)

    1998-01-01

    An x-ray spectrometer that also provides images of an x-ray source. Coded aperture imaging techniques are used to provide high resolution images. Imaging position-sensitive x-ray sensors with good energy resolution are utilized to provide excellent spectroscopic performance. The system produces high resolution spectral images of the x-ray source which can be viewed in any one of a number of specific energy bands.

  18. TRUST. I. A 3D externally illuminated slab benchmark for dust radiative transfer

    NASA Astrophysics Data System (ADS)

    Gordon, K. D.; Baes, M.; Bianchi, S.; Camps, P.; Juvela, M.; Kuiper, R.; Lunttila, T.; Misselt, K. A.; Natale, G.; Robitaille, T.; Steinacker, J.

    2017-07-01

    Context. The radiative transport of photons through arbitrary three-dimensional (3D) structures of dust is a challenging problem due to the anisotropic scattering of dust grains and strong coupling between different spatial regions. The radiative transfer problem in 3D is solved using Monte Carlo or Ray Tracing techniques as no full analytic solution exists for the true 3D structures. Aims: We provide the first 3D dust radiative transfer benchmark composed of a slab of dust with uniform density externally illuminated by a star. This simple 3D benchmark is explicitly formulated to provide tests of the different components of the radiative transfer problem including dust absorption, scattering, and emission. Methods: The details of the external star, the slab itself, and the dust properties are provided. This benchmark includes models with a range of dust optical depths fully probing cases that are optically thin at all wavelengths to optically thick at most wavelengths. The dust properties adopted are characteristic of the diffuse Milky Way interstellar medium. This benchmark includes solutions for the full dust emission including single photon (stochastic) heating as well as two simplifying approximations: One where all grains are considered in equilibrium with the radiation field and one where the emission is from a single effective grain with size-distribution-averaged properties. A total of six Monte Carlo codes and one Ray Tracing code provide solutions to this benchmark. Results: The solution to this benchmark is given as global spectral energy distributions (SEDs) and images at select diagnostic wavelengths from the ultraviolet through the infrared. Comparison of the results revealed that the global SEDs are consistent on average to a few percent for all but the scattered stellar flux at very high optical depths. The image results are consistent within 10%, again except for the stellar scattered flux at very high optical depths. 
The lack of agreement between different codes of the scattered flux at high optical depths is quantified for the first time. Convergence tests using one of the Monte Carlo codes illustrate the sensitivity of the solutions to various model parameters. Conclusions: We provide the first 3D dust radiative transfer benchmark and validate the accuracy of this benchmark through comparisons between multiple independent codes and detailed convergence tests.
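The Monte Carlo technique the benchmarked codes share can be reduced to its simplest form for a plane-parallel slab; with scattering switched off, the simulated transmission can be checked against the analytic exp(−τ). This is a toy sketch, not any of the six benchmark codes:

```python
import numpy as np

# Minimal Monte Carlo photon transport through a purely absorbing slab.
# With no scattering, a normally incident photon escapes iff its sampled
# optical path to absorption exceeds the slab optical depth tau, so the
# transmitted fraction should converge to exp(-tau).
def transmit_fraction(tau, n_photons=200_000, seed=42):
    rng = np.random.default_rng(seed)
    paths = rng.exponential(1.0, n_photons)   # optical depth to absorption
    return np.mean(paths > tau)

tau = 1.5
mc = transmit_fraction(tau)
print(mc, np.exp(-tau))   # MC estimate vs analytic transmission
```

The benchmark's hard part, anisotropic scattering and dust re-emission in 3D, layers onto exactly this sampling loop, which is why agreement degrades for the scattered flux at high optical depths where few sampled paths contribute.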

  19. A broad band X-ray imaging spectrophotometer for astrophysical studies

    NASA Technical Reports Server (NTRS)

    Lum, Kenneth S. K.; Lee, Dong Hwan; Ku, William H.-M.

    1988-01-01

    A broadband X-ray imaging spectrophotometer (BBXRIS) has been built for astrophysical studies. The BBXRIS is based on a large imaging gas scintillation proportional counter (LIGSPC), a combination of a gas scintillation proportional counter and a multiwire proportional counter, which achieves 8 percent (FWHM) energy resolution and 1.5-mm (FWHM) spatial resolution at 5.9 keV. The LIGSPC can be integrated with a grazing incidence mirror and a coded aperture mask to provide imaging over a broad range of X-ray energies. The results of tests involving the LIGSPC and a coded aperture mask are presented, and possible applications of the BBXRIS are discussed.

  20. On the effect of the neutral Hydrogen density on the 26 day variations of galactic cosmic rays

    NASA Astrophysics Data System (ADS)

    Engelbrecht, Nicholas; Burger, Renier; Ferreira, Stefan; Hitge, Mariette

    Preliminary results of a 3D, steady-state, ab initio cosmic ray modulation code are presented. This modulation code utilizes analytical expressions for the parallel and perpendicular mean free paths based on the work of Teufel and Schlickeiser (2003) and Shalchi et al. (2004), incorporating the model of Breech et al. (2008) for the 2D variance, correlation scale, and normalized cross helicity. The effects of this turbulence model, coupled with a 3D model for the neutral hydrogen density, on the 26-day variations of cosmic rays are investigated, utilizing a Schwadron-Parker hybrid heliospheric magnetic field.

  1. CMT for transport in porous media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schwartz, L.

    This session comprises an outline of uses for x-ray microtomography in the field of petroleum geology. Calculations, diagrams, and color photomicrographs depict the many applications of synchrotron x-ray microtomography in determining transport properties and fluid flow characteristics of reservoir rocks, micro-porosity in carbonates, and aspects of multi-phase transport.

  2. Study of no-man's land physics in the total-f gyrokinetic code XGC1

    NASA Astrophysics Data System (ADS)

    Ku, Seung Hoe; Chang, C. S.; Lang, J.

    2014-10-01

    While the "transport shortfall" in the "no-man's land" has been observed often in delta-f codes, it has not yet been observed in the global total-f gyrokinetic particle code XGC1. Since the interaction between edge and core transport appears to be a critical element in predicting ITER performance, understanding the no-man's land issue is an important physics research topic. Simulation results using the Holland case will be presented and the physics causing the shortfall phenomenon will be discussed. Nonlinear nonlocal interaction of turbulence, secondary flows, and transport appears to be the key.

  3. Development of Safety Analysis Code System of Beam Transport and Core for Accelerator Driven System

    NASA Astrophysics Data System (ADS)

    Aizawa, Naoto; Iwasaki, Tomohiko

    2014-06-01

    A safety analysis code system of beam transport and core for accelerator driven systems (ADS) is developed for the analysis of beam transients such as changes in the shape and position of the incident beam. The code system consists of a beam transport analysis part and a core analysis part. TRACE 3-D is employed in the beam transport analysis part, and the shape and incident position of the beam at the target are calculated. In the core analysis part, the neutronics, thermo-hydraulics, and cladding failure analyses are performed with the ADS dynamic calculation code ADSE, on the basis of an external source database calculated by PHITS and a cross-section database calculated by SRAC, together with programs for thermoelastic and creep cladding-failure analysis. Using the code system, beam transient analyses are performed for the ADS proposed by the Japan Atomic Energy Agency. As a result, the cladding temperature rises rapidly and plastic deformation occurs within several seconds. In addition, the cladding is evaluated to fail by creep within a hundred seconds. These results show that such beam transients can cause cladding failure.

  4. Diffusive transport of energetic electrons in the solar corona: X-ray and radio diagnostics

    NASA Astrophysics Data System (ADS)

    Musset, S.; Kontar, E. P.; Vilmer, N.

    2018-02-01

    Context. Imaging spectroscopy in X-rays with RHESSI provides the possibility to investigate the spatial evolution of the X-ray emitting electron distribution and therefore to study transport effects on energetic electrons during solar flares. Aims: We study the energy dependence of the scattering mean free path of energetic electrons in the solar corona. Methods: We used imaging spectroscopy with RHESSI to study the evolution of the energetic electron distribution in various parts of the magnetic loop during the 2004 May 21 flare. We compared these observations with radio observations of the gyrosynchrotron radiation of the same flare and with the predictions of a diffusive transport model. Results: X-ray analysis shows trapping of energetic electrons in the corona and a spectral hardening of the energetic electron distribution between the top of the loop and the footpoints. Coronal trapping of electrons is stronger for radio-emitting electrons than for X-ray-emitting electrons. These observations can be explained by a diffusive transport model. Conclusions: We show that the combination of X-ray and radio diagnostics is a powerful tool to study electron transport in the solar corona in different energy domains. We show that the diffusive transport model can explain our observations and that, in the range 25-500 keV, the scattering mean free path of electrons decreases with electron energy. We estimate for the first time the dependence of the scattering mean free path on energy in the corona.

  5. Use of FEC coding to improve statistical multiplexing performance for video transport over ATM networks

    NASA Astrophysics Data System (ADS)

    Kurceren, Ragip; Modestino, James W.

    1998-12-01

    The use of forward error-control (FEC) coding, possibly in conjunction with ARQ techniques, has emerged as a promising approach for video transport over ATM networks for cell-loss recovery and/or bit error correction, such as might be required for wireless links. Although FEC provides cell-loss recovery capabilities, it also introduces transmission overhead which can itself cause additional cell losses. A methodology is described to maximize the number of video sources multiplexed at a given quality of service (QoS), measured in terms of decoded cell loss probability, using interlaced FEC codes. The transport channel is modelled as a block interference channel (BIC) and the multiplexer as a single-server, deterministic-service, finite-buffer queue supporting N users. Based upon an information-theoretic characterization of the BIC and large deviation bounds on the buffer overflow probability, the described methodology provides theoretically achievable upper limits on the number of sources multiplexed. Performance of specific coding techniques using interlaced nonbinary Reed-Solomon (RS) codes and binary rate-compatible punctured convolutional (RCPC) codes is illustrated.
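The cell-loss recovery mechanism described above can be sketched with a toy erasure code. The snippet below is illustrative only: it uses a single XOR parity cell in place of the interlaced nonbinary Reed-Solomon codes the paper analyzes, and the function names are hypothetical.

```python
def add_parity(cells):
    """Append one XOR parity cell; any single lost cell then becomes recoverable."""
    parity = bytes(len(cells[0]))
    for c in cells:
        parity = bytes(a ^ b for a, b in zip(parity, c))
    return cells + [parity]

def recover(cells):
    """cells: coded group in which at most one entry is None (a lost ATM cell)."""
    lost = [i for i, c in enumerate(cells) if c is None]
    if not lost:
        return cells[:-1]                    # nothing lost: strip the parity cell
    size = len(next(c for c in cells if c is not None))
    acc = bytes(size)
    for j, c in enumerate(cells):
        if j != lost[0]:
            acc = bytes(a ^ b for a, b in zip(acc, c))
    cells[lost[0]] = acc                     # XOR of the survivors rebuilds the loss
    return cells[:-1]

group = [bytes([k]) * 4 for k in range(3)]   # three 4-byte payload "cells"
coded = add_parity(group)
coded[1] = None                              # simulate one dropped cell
assert recover(coded) == group
```

A real RS(n, k) code tolerates n - k losses per codeword; interlacing spreads each codeword across cells so a burst of consecutive cell losses is converted into isolated, correctable erasures.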

  6. Clean Energy in City Codes: A Baseline Analysis of Municipal Codification across the United States

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cook, Jeffrey J.; Aznar, Alexandra; Dane, Alexander

    Municipal governments in the United States are well positioned to influence clean energy (energy efficiency and alternative energy) and transportation technology and strategy implementation within their jurisdictions through planning, programs, and codification. Municipal governments are leveraging planning processes and programs to shape their energy futures. There is limited understanding in the literature related to codification, the primary way that municipal governments enact enforceable policies. The authors fill the gap in the literature by documenting the status of municipal codification of clean energy and transportation across the United States. More directly, we leverage online databases of municipal codes to develop national and state-specific representative samples of municipal governments by population size. Our analysis finds that municipal governments with the authority to set residential building energy codes within their jurisdictions frequently do so. In some cases, communities set codes higher than their respective state governments. Examination of codes across the nation indicates that municipal governments are employing their code as a policy mechanism to address clean energy and transportation.

  7. The Parker Instability with Cosmic-Ray Streaming

    NASA Astrophysics Data System (ADS)

    Heintz, Evan; Zweibel, Ellen G.

    2018-06-01

    Recent studies have found that cosmic-ray transport plays an important role in feedback processes such as star formation and the launching of galactic winds. Although cosmic-ray buoyancy is widely held to be a destabilizing force in galactic disks, the effect of cosmic-ray transport on the stability of stratified systems has yet to be analyzed. We perform a stability analysis of a stratified layer for three different cosmic-ray transport models: decoupled (Classic Parker), coupled with γ_c = 4/3 but not streaming (Modified Parker), and finally coupled with streaming at the Alfvén speed. When the compressibility of the cosmic rays is decreased the system becomes much more stable, but the addition of cosmic-ray streaming to the Parker instability severely destabilizes it. Through comparison of these three cases and analysis of the work contributions for the perturbed quantities of each system, we demonstrate that cosmic-ray heating of the gas is responsible for the destabilization of the system. We find that a 3D system is unstable over a larger range of wavelengths than the 2D system. Therefore, the Parker instability with cosmic-ray streaming may play an important role in cosmic-ray feedback.

  8. ITS Version 6 : the integrated TIGER series of coupled electron/photon Monte Carlo transport codes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Franke, Brian Claude; Kensek, Ronald Patrick; Laub, Thomas William

    2008-04-01

    ITS is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. Our goal has been to simultaneously maximize operational simplicity and physical accuracy. Through a set of preprocessor directives, the user selects one of the many ITS codes. The ease with which the makefile system is applied combines with an input scheme based on order-independent descriptive keywords that makes maximum use of defaults and internal error checking to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Physical rigor is provided by employing accurate cross sections, sampling distributions, and physical models for describing the production and transport of the electron/photon cascade from 1.0 GeV down to 1.0 keV. The availability of source code permits the more sophisticated user to tailor the codes to specific applications and to extend the capabilities of the codes to more complex applications. Version 6, the latest version of ITS, contains (1) improvements to the ITS 5.0 codes, and (2) conversion to Fortran 90. The general user friendliness of the software has been enhanced through memory allocation to reduce the need for users to modify and recompile the code.

  9. Poster - 28: Shielding of X-ray Rooms in Ontario in the Absence of Best Practice

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frimeth, Jeff; Richer, Jeff; Nesbitt, James

    This poster will be strictly based on the Healing Arts Radiation Protection (HARP) Act, Regulation 543 under this Act (X-ray Safety Code), and personal communication the presenting author has had. In Ontario, the process of approval of an X-ray machine installation by the Director of the X-ray Inspection Service (XRIS) follows a certain protocol. Initially, the applicant submits a series of forms, including recommended shielding amounts, in order to satisfy the law. This documentation is then transferred to a third-party vendor (i.e. a professional engineer – P.Eng.) outsourced by the Ministry of Health and Long-term Care (MOHLTC). The P.Eng. then evaluates the submitted documentation for appropriate fulfillment of the HARP Act and Reg. 543 requirements. If the P.Eng.’s evaluation of the documentation is to their satisfaction, the XRIS is then notified. Finally, the Director will then issue a letter of approval to install the equipment at the facility. The methodology required to be used by the P.Eng. in order to determine the required amounts of protective barriers, and recommended to be used by the applicant, is contained within Safety Code 20A. However, Safety Code 35 has replaced the obsolete Safety Code 20A document and employs best practices in shielding design. This talk will focus further on specific intentions and limitations of Safety Code 20A. Furthermore, this talk will discuss the definition of the “practice of professional engineering” in Ontario. COMP members who are involved in shielding design are strongly encouraged to attend.

  10. Path Toward a Unified Geometry for Radiation Transport

    NASA Technical Reports Server (NTRS)

    Lee, Kerry; Barzilla, Janet; Davis, Andrew; Zachmann

    2014-01-01

    The Direct Accelerated Geometry for Radiation Analysis and Design (DAGRAD) element of the RadWorks Project under Advanced Exploration Systems (AES) within the Space Technology Mission Directorate (STMD) of NASA will enable new designs and concepts of operation for radiation risk assessment, mitigation and protection. This element is designed to produce a solution that will allow NASA to calculate the transport of space radiation through complex computer-aided design (CAD) models using the state-of-the-art analytic and Monte Carlo radiation transport codes. Due to the inherent hazard of astronaut and spacecraft exposure to ionizing radiation in low-Earth orbit (LEO) or in deep space, risk analyses must be performed for all crew vehicles and habitats. Incorporating these analyses into the design process can minimize the mass needed solely for radiation protection. Transport of the radiation fields as they pass through shielding and body materials can be simulated using Monte Carlo techniques or described by the Boltzmann equation, which is obtained by balancing changes in particle fluxes as they traverse a small volume of material with the gains and losses caused by atomic and nuclear collisions. Deterministic codes that solve the Boltzmann transport equation, such as HZETRN [high charge and energy transport code developed by NASA Langley Research Center (LaRC)], are generally computationally faster than Monte Carlo codes such as FLUKA, GEANT4, MCNP(X) or PHITS; however, they are currently limited to transport in one dimension, which poorly represents the secondary light ion and neutron radiation fields. NASA currently uses HZETRN space radiation transport software, both because it is computationally efficient and because proven methods have been developed for using this software to analyze complex geometries. 
Although Monte Carlo codes describe the relevant physics in a fully three-dimensional manner, their computational costs have thus far prevented their widespread use for analysis of complex CAD models, leading to the creation and maintenance of toolkit-specific simplistic geometry models. The work presented here builds on the Direct Accelerated Geometry Monte Carlo (DAGMC) toolkit developed for use with the Monte Carlo N-Particle (MCNP) transport code. The workflow for achieving radiation transport on CAD models using MCNP and FLUKA has been demonstrated and the results of analyses on realistic spacecraft/habitats will be presented. Future work is planned that will further automate this process and enable the use of multiple radiation transport codes on identical geometry models imported from CAD. This effort will enhance the modeling tools used by NASA to accurately evaluate the astronaut space radiation risk and accurately determine the protection provided by as-designed exploration mission vehicles and habitats.

  11. 3D Laser Imprint Using a Smoother Ray-Traced Power Deposition Method

    NASA Astrophysics Data System (ADS)

    Schmitt, Andrew J.

    2017-10-01

    Imprinting of laser nonuniformities in directly driven ICF targets is a challenging problem to accurately simulate with large radiation-hydro codes. One of the most challenging aspects is the proper construction of the complex and rapidly changing laser interference structure driving the imprint using the reduced laser propagation models (usually ray-tracing) found in these codes. We have upgraded the modelling capability in our massively parallel fastrad3d code by adding a more realistic EM-wave interference structure. This interference model adds an axial laser speckle to the previous transverse-only laser structure, and can be impressed on our improved smoothed 3D raytrace package. This latter package, which connects rays to form bundles and performs power deposition calculations on the bundles, is intended to decrease ray-trace noise (which can mask or add to imprint) while using fewer rays. We apply this improved model to 3D simulations of recent imprint experiments performed on the Omega-EP laser and the Nike laser that examined the reduction of imprinting due to very thin high-Z target coatings. We report on the conditions in which this new model makes a significant impact on the development of laser imprint. Supported by US DoE/NNSA.

  12. Boltzmann Transport Code Update: Parallelization and Integrated Design Updates

    NASA Technical Reports Server (NTRS)

    Heinbockel, J. H.; Nealy, J. E.; DeAngelis, G.; Feldman, G. A.; Chokshi, S.

    2003-01-01

    The ongoing effort to develop a web site for radiation analysis is expected to result in increased usage of the High Charge and Energy Transport Code HZETRN, making it desirable to perform the requested calculations quickly and efficiently. This raised the question: could the implementation of parallel processing speed up the calculations required? To answer this question, two modifications of the HZETRN computer code were created. The first modification used a layered shield of Al(2219), then polyethylene, then Al(2219); this modified Fortran code was labeled 1SSTRN.F. The second modification considered shield materials of CO2 and Martian regolith; this modified Fortran code was labeled MARSTRN.F.

  13. Measurements of the neutral particle spectra on Mars by MSL/RAD from 2015-11-15 to 2016-01-15

    NASA Astrophysics Data System (ADS)

    Guo, Jingnan; Zeitlin, Cary; Wimmer-Schweingruber, Robert; Hassler, Donald M.; Köhler, Jan; Ehresmann, Bent; Böttcher, Stephan; Böhm, Eckart; Brinza, David E.

    2017-08-01

    The Radiation Assessment Detector (RAD), onboard the Mars Science Laboratory (MSL) rover Curiosity, has been measuring the energetic charged and neutral particles and the radiation dose rate on the surface of Mars since the landing of the rover in August 2012. In contrast to charged particles, neutral particles (neutrons and γ-rays) are measured indirectly: the energy deposition spectra produced by neutral particles are complex convolutions of the incident particle spectra with the detector response functions. An inversion technique has been developed and applied to jointly unfold the deposited energy spectra measured in two scintillators of different types (CsI for high γ detection efficiency, and plastic for neutrons) to obtain the neutron and γ-ray spectra. This result is important for determining the biological impact of the Martian surface radiation contributed by neutrons, which interact with materials differently from the charged particles. These first in-situ measurements on Mars provide (1) an important reference for assessing the radiation-associated health risks for future manned missions to the red planet and (2) an experimental input for validating the particle transport codes used to model the radiation environments within spacecraft or on the surface of planets. Here we present neutral particle spectra as well as the corresponding dose and dose equivalent rates derived from RAD measurement during a period (November 15, 2015 to January 15, 2016) for which the surface particle spectra have been simulated via different transport models.
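The unfolding step described above amounts to inverting a detector response matrix. A minimal sketch, assuming a toy two-bin response and using a generic MLEM (Richardson-Lucy) multiplicative update rather than RAD's actual inversion technique; all names here are hypothetical:

```python
def forward(R, phi):
    """Fold an incident spectrum phi through the response matrix R."""
    return [sum(R[i][j] * phi[j] for j in range(len(phi))) for i in range(len(R))]

def mlem_unfold(R, measured, iters=200):
    """MLEM (Richardson-Lucy) iteration: the multiplicative update keeps phi >= 0."""
    n = len(R[0])
    phi = [1.0] * n
    col = [sum(R[i][j] for i in range(len(R))) for j in range(n)]
    for _ in range(iters):
        est = forward(R, phi)
        phi = [phi[j] * sum(R[i][j] * measured[i] / est[i]
                            for i in range(len(R))) / col[j]
               for j in range(n)]
    return phi

# toy response: each incident bin spills 20% into the neighbouring measured bin
R = [[0.8, 0.2],
     [0.2, 0.8]]
true_phi = [10.0, 4.0]
measured = forward(R, true_phi)        # noiseless deposited-energy "spectrum"
phi = mlem_unfold(R, measured)
assert all(abs(a - b) < 0.1 for a, b in zip(phi, true_phi))
```

The multiplicative form is a common choice for spectrum unfolding because it preserves non-negativity of the recovered flux, which a direct matrix inverse does not guarantee in the presence of noise.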

  14. Simulating cosmic ray physics on a moving mesh

    NASA Astrophysics Data System (ADS)

    Pfrommer, C.; Pakmor, R.; Schaal, K.; Simpson, C. M.; Springel, V.

    2017-03-01

    We discuss new methods to integrate the cosmic ray (CR) evolution equations coupled to magnetohydrodynamics on an unstructured moving mesh, as realized in the massively parallel AREPO code for cosmological simulations. We account for diffusive shock acceleration of CRs at resolved shocks and at supernova remnants in the interstellar medium (ISM) and follow the advective CR transport within the magnetized plasma, as well as anisotropic diffusive transport of CRs along the local magnetic field. CR losses are included in terms of Coulomb and hadronic interactions with the thermal plasma. We demonstrate the accuracy of our formalism for CR acceleration at shocks through simulations of plane-parallel shock tubes that are compared to newly derived exact solutions of the Riemann shock-tube problem with CR acceleration. We find that the increased compressibility of the post-shock plasma due to the produced CRs decreases the shock speed. However, CR acceleration at spherically expanding blast waves does not significantly break the self-similarity of the Sedov-Taylor solution; the resulting modifications can be approximated by a suitably adjusted, but constant adiabatic index. In first applications of the new CR formalism to simulations of isolated galaxies and cosmic structure formation, we find that CRs add an important pressure component to the ISM that increases the vertical scaleheight of disc galaxies and thus reduces the star formation rate. Strong external structure formation shocks inject CRs into the gas, but the relative pressure of this component decreases towards halo centres as adiabatic compression favours the thermal over the CR pressure.
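The "suitably adjusted, but constant adiabatic index" mentioned above can be made concrete with the standard pressure-weighted closure for a composite of thermal gas and cosmic rays (an illustrative formula, not the AREPO implementation; the function name is hypothetical):

```python
def effective_gamma(p_th, p_cr, g_th=5.0/3.0, g_cr=4.0/3.0):
    """Pressure-weighted adiabatic index of a thermal + cosmic-ray composite."""
    return (g_th * p_th + g_cr * p_cr) / (p_th + p_cr)

# equal thermal and CR pressures soften the gas from 5/3 toward 4/3
assert abs(effective_gamma(1.0, 1.0) - 1.5) < 1e-12
# a CR-dominated plasma approaches the relativistic value 4/3
assert abs(effective_gamma(0.0, 1.0) - 4.0/3.0) < 1e-12
```

A softer effective index means the post-shock plasma is more compressible, which is the mechanism behind the reduced shock speed reported in the abstract.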

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mueller, Bernhard; Janka, Hans-Thomas; Marek, Andreas, E-mail: bjmuellr@mpa-garching.mpg.de, E-mail: thj@mpa-garching.mpg.de

    We present the first two-dimensional general relativistic (GR) simulations of stellar core collapse and explosion with the COCONUT hydrodynamics code in combination with the VERTEX solver for energy-dependent, three-flavor neutrino transport, using the extended conformal flatness condition for approximating the space-time metric and a ray-by-ray-plus ansatz to tackle the multi-dimensionality of the transport. For both of the investigated 11.2 and 15 M_⊙ progenitors we obtain successful, though seemingly marginal, neutrino-driven supernova explosions. This outcome and the time evolution of the models basically agree with results previously obtained with the PROMETHEUS hydro solver including an approximative treatment of relativistic effects by a modified Newtonian potential. However, GR models exhibit subtle differences in the neutrinospheric conditions compared with Newtonian and pseudo-Newtonian simulations. These differences lead to significantly higher luminosities and mean energies of the radiated electron neutrinos and antineutrinos and therefore to larger energy-deposition rates and heating efficiencies in the gain layer with favorable consequences for strong nonradial mass motions and ultimately for an explosion. Moreover, energy transfer to the stellar medium around the neutrinospheres through nucleon recoil in scattering reactions of heavy-lepton neutrinos also enhances the mentioned effects. Together with previous pseudo-Newtonian models, the presented relativistic calculations suggest that the treatment of gravity and energy-exchanging neutrino interactions can make differences of even 50%-100% in some quantities, and is likely to contribute to a finally successful explosion mechanism at a level no less significant than hydrodynamical differences between different dimensions.

  16. Coupling extended magnetohydrodynamic fluid codes with radiofrequency ray tracing codes for fusion modeling

    NASA Astrophysics Data System (ADS)

    Jenkins, Thomas G.; Held, Eric D.

    2015-09-01

    Neoclassical tearing modes are macroscopic (L ∼ 1 m) instabilities in magnetic fusion experiments; if unchecked, these modes degrade plasma performance and may catastrophically destroy plasma confinement by inducing a disruption. Fortunately, the use of properly tuned and directed radiofrequency waves (λ ∼ 1 mm) can eliminate these modes. Numerical modeling of this difficult multiscale problem requires the integration of separate mathematical models for each length and time scale (Jenkins and Kruger, 2012 [21]); the extended MHD model captures macroscopic plasma evolution while the RF model tracks the flow and deposition of injected RF power through the evolving plasma profiles. The scale separation enables use of the eikonal (ray-tracing) approximation to model the RF wave propagation. In this work we demonstrate a technique, based on methods of computational geometry, for mapping the ensuing RF data (associated with discrete ray trajectories) onto the finite-element/pseudospectral grid that is used to model the extended MHD physics. In the new representation, the RF data can then be used to construct source terms in the equations of the extended MHD model, enabling quantitative modeling of RF-induced tearing mode stabilization. Though our specific implementation uses the NIMROD extended MHD (Sovinec et al., 2004 [22]) and GENRAY RF (Smirnov et al., 1994 [23]) codes, the approach presented can be applied more generally to any code coupling requiring the mapping of ray tracing data onto Eulerian grids.
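A minimal sketch of the kind of ray-to-grid mapping the paper describes, reduced to 2D triangular elements with hypothetical helper names: barycentric coordinates locate a ray's deposition point within an element and spread its power over the element's nodes with linear shape-function weights.

```python
def barycentric(p, a, b, c):
    """Barycentric coordinates of point p with respect to triangle (a, b, c)."""
    (px, py), (ax, ay), (bx, by), (cx, cy) = p, a, b, c
    det = (by - cy) * (ax - cx) + (cx - bx) * (ay - cy)
    l1 = ((by - cy) * (px - cx) + (cx - bx) * (py - cy)) / det
    l2 = ((cy - ay) * (px - cx) + (ax - cx) * (py - cy)) / det
    return l1, l2, 1.0 - l1 - l2

def deposit(point, power, elements, field):
    """Find the element containing point and add power to its nodal field values."""
    for e, (a, b, c) in enumerate(elements):
        w = barycentric(point, a, b, c)
        if all(x >= -1e-12 for x in w):      # inside (or on the edge of) element e
            for node in range(3):
                field[e][node] += w[node] * power   # linear shape-function weights
            return e
    return None                              # point lies outside the mesh

elements = [((0.0, 0.0), (1.0, 0.0), (0.0, 1.0))]
field = [[0.0, 0.0, 0.0]]
e = deposit((0.25, 0.25), 2.0, elements, field)
assert e == 0 and abs(sum(field[0]) - 2.0) < 1e-12   # power is conserved
```

Production code would replace the linear element scan with a spatial search structure, but the weighting step is the essential bridge between discrete ray data and an Eulerian source term.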

  17. BRYNTRN: A baryon transport model

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Townsend, Lawrence W.; Nealy, John E.; Chun, Sang Y.; Hong, B. S.; Buck, Warren W.; Lamkin, S. L.; Ganapol, Barry D.; Khan, Ferdous; Cucinotta, Francis A.

    1989-01-01

    The development of an interaction data base and a numerical solution to the transport of baryons through an arbitrary shield material based on a straight ahead approximation of the Boltzmann equation are described. The code is most accurate for continuous energy boundary values, but gives reasonable results for discrete spectra at the boundary using even a relatively coarse energy grid (30 points) and large spatial increments (1 cm in H2O). The resulting computer code is self-contained, efficient and ready to use. The code requires only a very small fraction of the computer resources required for Monte Carlo codes.
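The straight-ahead approximation reduces the transport problem to a march in a single depth variable. A deliberately minimal sketch (primaries only, no secondary production or slowing down; the function name is hypothetical) shows the marching scheme reproducing the analytic exponential attenuation:

```python
import math

def march(phi0, sigma, dx, steps):
    """Straight-ahead marching: primary flux attenuates over each depth step dx.
    sigma is the macroscopic removal cross section (1/cm), depth in cm."""
    phi = phi0
    for _ in range(steps):
        phi *= math.exp(-sigma * dx)
    return phi

# 30 cm of water in 1 cm steps matches the closed-form exponential
phi = march(1.0, sigma=0.1, dx=1.0, steps=30)
assert abs(phi - math.exp(-0.1 * 30.0)) < 1e-12
```

BRYNTRN itself additionally couples in secondary baryon production at each step via its interaction database; only the primary-attenuation skeleton is shown here.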

  18. ecode - Electron Transport Algorithm Testing v. 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Franke, Brian C.; Olson, Aaron J.; Bruss, Donald Eugene

    2016-10-05

    ecode is a Monte Carlo code used for testing algorithms related to electron transport. The code can read basic physics parameters, such as energy-dependent stopping powers and screening parameters. The code permits simple planar geometries of slabs or cubes. Parallelization consists of domain replication, with work distributed at the start of the calculation and statistical results gathered at the end of the calculation. Some basic routines (such as input parsing, random number generation, and statistics processing) are shared with the Integrated Tiger Series codes. A variety of algorithms for uncertainty propagation are incorporated based on the stochastic collocation and stochastic Galerkin methods. These permit uncertainty only in the total and angular scattering cross sections. The code contains algorithms for simulating stochastic mixtures of two materials. The physics is approximate, ranging from mono-energetic and isotropic scattering to screened Rutherford angular scattering and Rutherford energy-loss scattering (simple electron transport models). No production of secondary particles is implemented, and no photon physics is implemented.
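A bare-bones example in the spirit of the simple transport models listed above: mono-energetic particles undergoing isotropic scattering in a purely scattering slab, with thickness measured in mean free paths. This is an illustrative sketch with a hypothetical function name, not ecode itself.

```python
import math
import random

def slab_transmission(thickness_mfp, n=20000, seed=1):
    """Fraction of normally incident particles transmitted through a slab.
    Purely scattering medium; isotropic rescattering; lengths in mean free paths."""
    rng = random.Random(seed)
    transmitted = 0
    for _ in range(n):
        x, mu = 0.0, 1.0                             # start at surface, heading in
        while True:
            x += mu * -math.log(1.0 - rng.random())  # sample a free flight
            if x >= thickness_mfp:
                transmitted += 1                     # escaped through the far face
                break
            if x < 0.0:
                break                                # backscattered out of the slab
            mu = 2.0 * rng.random() - 1.0            # isotropic direction cosine
    return transmitted / n

thin, thick = slab_transmission(1.0), slab_transmission(4.0)
assert 0.0 < thick < thin < 1.0                      # thicker slabs transmit less
```

Fixing the seed makes each run reproducible, which mirrors the domain-replication parallel model described in the abstract: independent replicas differ only in their random number streams.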

  19. MCNP capabilities for nuclear well logging calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Forster, R.A.; Little, R.C.; Briesmeister, J.F.

    The Los Alamos Radiation Transport Code System (LARTCS) consists of state-of-the-art Monte Carlo and discrete ordinates transport codes and data libraries. This paper discusses how the general-purpose continuous-energy Monte Carlo code MCNP (Monte Carlo neutron photon), part of the LARTCS, provides a computational predictive capability for many applications of interest to the nuclear well logging community. The generalized three-dimensional geometry of MCNP is well suited for borehole-tool models. SABRINA, another component of the LARTCS, is a graphics code that can be used to interactively create a complex MCNP geometry. Users can define many source and tally characteristics with standard MCNP features. The time-dependent capability of the code is essential when modeling pulsed sources. Problems with neutrons, photons, and electrons as either single particles or coupled particles can be calculated with MCNP. The physics of neutron and photon transport and interactions is modeled in detail using the latest available cross-section data.

  20. Differential Cross Section Kinematics for 3-dimensional Transport Codes

    NASA Technical Reports Server (NTRS)

    Norbury, John W.; Dick, Frank

    2008-01-01

    In support of the development of 3-dimensional transport codes, this paper derives the relevant relativistic particle kinematic theory. Formulas are given for invariant, spectral, and angular distributions in both the lab (spacecraft) and center of momentum frames, for collisions involving 2-, 3-, and n-body final states.
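The lab/center-of-momentum bookkeeping such derivations rely on starts from the Lorentz-invariant Mandelstam variable s. A small worked example (notation assumed, not taken from the paper):

```python
import math

def cm_energy(m_proj, t_lab, m_targ):
    """sqrt(s) for a projectile with lab kinetic energy t_lab striking a target
    at rest; masses and energies in GeV, natural units (c = 1)."""
    e_lab = t_lab + m_proj                       # total lab energy of projectile
    s = m_proj**2 + m_targ**2 + 2.0 * m_targ * e_lab
    return math.sqrt(s)

mp = 0.938                                       # proton mass in GeV
# zero kinetic energy: sqrt(s) reduces to the summed rest masses
assert abs(cm_energy(mp, 0.0, mp) - 2.0 * mp) < 1e-12
# at high energy sqrt(s) grows only like sqrt(E_lab): the fixed-target penalty
assert cm_energy(mp, 100.0, mp) < 100.0
```

Because s is frame-invariant, a distribution derived in the center-of-momentum frame can be boosted to the lab (spacecraft) frame without recomputing the collision energy.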

  1. 49 CFR 171.25 - Additional requirements for the use of the IMDG Code.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 176 of this subchapter. (3) Packages containing primary lithium batteries and cells that are transported in accordance with Special Provision 188 of the IMDG Code must be marked “PRIMARY LITHIUM BATTERIES—FORBIDDEN FOR TRANSPORT ABOARD PASSENGER AIRCRAFT” or “LITHIUM METAL BATTERIES—FORBIDDEN FOR...

  2. 49 CFR 171.25 - Additional requirements for the use of the IMDG Code.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 176 of this subchapter. (3) Packages containing primary lithium batteries and cells that are transported in accordance with Special Provision 188 of the IMDG Code must be marked “PRIMARY LITHIUM BATTERIES—FORBIDDEN FOR TRANSPORT ABOARD PASSENGER AIRCRAFT” or “LITHIUM METAL BATTERIES—FORBIDDEN FOR...

  3. 49 CFR 171.25 - Additional requirements for the use of the IMDG Code.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 176 of this subchapter. (3) Packages containing primary lithium batteries and cells that are transported in accordance with Special Provision 188 of the IMDG Code must be marked “PRIMARY LITHIUM BATTERIES—FORBIDDEN FOR TRANSPORT ABOARD PASSENGER AIRCRAFT” or “LITHIUM METAL BATTERIES—FORBIDDEN FOR...

  4. Nonperturbative methods in HZE ion transport

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Badavi, Francis F.; Costen, Robert C.; Shinn, Judy L.

    1993-01-01

    A nonperturbative analytic solution of the high charge and energy (HZE) Green's function is used to implement a computer code for laboratory ion beam transport. The code is established to operate on the Langley Research Center nuclear fragmentation model used in engineering applications. Computational procedures are established to generate linear energy transfer (LET) distributions for a specified ion beam and target for comparison with experimental measurements. The code is highly efficient and compares well with the perturbation approximations.

  5. SU-F-I-53: Coded Aperture Coherent Scatter Spectral Imaging of the Breast: A Monte Carlo Evaluation of Absorbed Dose

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morris, R; Lakshmanan, M; Fong, G

    Purpose: Coherent scatter based imaging has shown improved contrast and molecular specificity over conventional digital mammography; however, the biological risks have not been quantified due to a lack of accurate information on absorbed dose. This study intends to characterize the dose distribution and average glandular dose from coded aperture coherent scatter spectral imaging of the breast. The dose deposited in the breast from this new diagnostic imaging modality has not yet been quantitatively evaluated. Here, various digitized anthropomorphic phantoms are tested in a Monte Carlo simulation to evaluate the absorbed dose distribution and average glandular dose using clinically feasible scan protocols. Methods: Geant4 Monte Carlo radiation transport simulation software is used to replicate the coded aperture coherent scatter spectral imaging system. Energy sensitive, photon counting detectors are used to characterize the x-ray beam spectra for various imaging protocols. These input spectra are cross-validated with the results from XSPECT, a commercially available application that yields x-ray tube specific spectra for the operating parameters employed. XSPECT is also used to determine the appropriate number of photons emitted per mAs of tube current at a given kVp tube potential. With the implementation of the XCAT digital anthropomorphic breast phantom library, a variety of breast sizes with differing anatomical structure are evaluated. Simulations were performed with and without compression of the breast for dose comparison. Results: Through the Monte Carlo evaluation of a diverse population of breast types imaged under real-world scan conditions, a clinically relevant average glandular dose for this new imaging modality is extrapolated. 
Conclusion: With access to the physical coherent scatter imaging system used in the simulation, the results of this Monte Carlo study may be used to directly influence the future development of the modality to keep breast dose to a minimum while still maintaining clinically viable image quality.

  6. Overview of the Graphical User Interface for the GERM Code (GCR Event-Based Risk Model)

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee; Cucinotta, Francis A.

    2010-01-01

    The descriptions of biophysical events from heavy ions are of interest in radiobiology, cancer therapy, and space exploration. The biophysical description of the passage of heavy ions in tissue and shielding materials is best described by a stochastic approach that includes both ion track structure and nuclear interactions. A new computer model called the GCR Event-based Risk Model (GERM) code was developed for the description of biophysical events from heavy ion beams at the NASA Space Radiation Laboratory (NSRL). The GERM code calculates basic physical and biophysical quantities of high-energy protons and heavy ions that have been studied at NSRL for the purpose of simulating space radiobiological effects. For mono-energetic beams, the code evaluates the linear-energy transfer (LET), range (R), and absorption in tissue equivalent material for a given Charge (Z), Mass Number (A) and kinetic energy (E) of an ion. In addition, a set of biophysical properties are evaluated such as the Poisson distribution of ion or delta-ray hits for a specified cellular area, cell survival curves, and mutation and tumor probabilities. The GERM code also calculates the radiation transport of the beam line for either a fixed number of user-specified depths or at multiple positions along the Bragg curve of the particle. The contributions from primary ion and nuclear secondaries are evaluated. The GERM code accounts for the major nuclear interaction processes of importance for describing heavy ion beams, including nuclear fragmentation, elastic scattering, and knockout-cascade processes by using the quantum multiple scattering fragmentation (QMSFRG) model. The QMSFRG model has been shown to be in excellent agreement with available experimental data for nuclear fragmentation cross sections, and has been used by the GERM code for application to thick target experiments. 
The GERM code provides scientists participating in NSRL experiments with the data needed for the interpretation of their experiments, including the ability to model the beam line, the shielding of samples and sample holders, and the estimates of basic physical and biological outputs of the designed experiments. We present an overview of the GERM code GUI, as well as providing training applications.
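    One of the biophysical quantities listed above, the Poisson distribution of ion hits for a specified cellular area, follows directly from the beam fluence. A minimal sketch (the fluence and nucleus area below are hypothetical example inputs, not GERM defaults):

```python
import math

def hit_probability(fluence_per_cm2, area_cm2, k):
    """Poisson probability that a cellular area receives exactly k ion
    traversals: mean hits = fluence (ions/cm^2) x sensitive area (cm^2)."""
    mean_hits = fluence_per_cm2 * area_cm2
    return math.exp(-mean_hits) * mean_hits**k / math.factorial(k)

# Example: a fluence of 1e7 ions/cm^2 on a 100 um^2 (1e-6 cm^2) nucleus
# gives a mean of 10 traversals per nucleus.
p_zero = hit_probability(1e7, 1e-6, 0)  # fraction of cells escaping any hit
```

    The same mean-hits bookkeeping underlies cell-survival and mutation-probability estimates built on per-track action.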

  7. CLUMPY: A code for γ-ray signals from dark matter structures

    NASA Astrophysics Data System (ADS)

    Charbonnier, Aldée; Combet, Céline; Maurin, David

    2012-03-01

    We present the first public code for semi-analytical calculation of the astrophysical J-factor of the γ-ray flux from dark matter annihilation/decay in the Galaxy, including dark matter substructures. The core of the code is the calculation of the line-of-sight integral of the dark matter density squared (for annihilation) or density (for decaying dark matter). The code can be used in three modes: i) to draw skymaps of the Galactic smooth component and/or the substructure contributions, ii) to calculate the flux from a specific halo other than the Galactic halo (e.g. dwarf spheroidal galaxies) or iii) to perform simple statistical operations on a list of allowed DM profiles for a given object. Extragalactic contributions and other tracers of DM annihilation (e.g. positrons, anti-protons) will be included in a second release.
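    The core calculation described, the line-of-sight integral of the density squared, can be sketched numerically. The NFW profile parameters, solar distance, integration range and the simple trapezoidal quadrature below are illustrative assumptions, not CLUMPY's actual defaults or numerics:

```python
import math

def nfw_density(r, rho_s=0.4, r_s=20.0):
    """NFW profile (illustrative normalization: rho_s in GeV/cm^3, r_s in kpc)."""
    x = r / r_s
    return rho_s / (x * (1.0 + x)**2)

def j_factor(psi, R0=8.0, s_max=100.0, n=10000):
    """Line-of-sight integral of rho^2 (annihilation case) at angle psi
    from the Galactic centre, by simple trapezoidal quadrature."""
    ds = s_max / n
    total = 0.0
    for i in range(n + 1):
        s = i * ds
        # galactocentric radius at distance s along the line of sight
        r = math.sqrt(s * s + R0 * R0 - 2.0 * s * R0 * math.cos(psi))
        r = max(r, 1e-3)                 # regularize the central cusp
        w = 0.5 if i in (0, n) else 1.0  # trapezoid end-point weights
        total += w * nfw_density(r)**2 * ds
    return total

# The annihilation signal falls steeply away from the Galactic centre:
j_small = j_factor(math.radians(1.0))
j_large = j_factor(math.radians(90.0))
```

    For the decay case one would integrate the density itself rather than its square.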

  8. Anomalous Transport of High Energy Cosmic Rays in Galactic Superbubbles

    NASA Technical Reports Server (NTRS)

    Barghouty, Nasser F.

    2014-01-01

    High-energy cosmic rays may exhibit anomalous transport as they traverse and are accelerated by a collection of supernova explosions in a galactic superbubble. Signatures of this anomalous transport can show up in the particles' evolution and their spectra. In a continuous-time random-walk (CTRW) model assuming standard diffusive shock acceleration (DSA) theory for each shock encounter, and where the superbubble (an OB star association) is idealized as a heterogeneous region of particle sources and sinks, acceleration and transport in the superbubble can be shown to be sub-diffusive. While the sub-diffusive transport can be attributed to the stochastic nature of the acceleration time according to DSA theory, the spectral break appears to be an artifact of transport in a finite medium. These CTRW simulations point to a new and intriguing phenomenon associated with the statistical nature of collective acceleration of high-energy cosmic rays in galactic superbubbles.
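    The sub-diffusive behavior described can be reproduced with a toy CTRW: heavy-tailed waiting times (Pareto index alpha < 1) make the mean-square displacement grow slower than linearly in observation time. This is a generic illustration under those assumptions, not the authors' simulation:

```python
import random

def ctrw_msd(t_obs, alpha=0.5, n_walkers=2000, seed=1):
    """Mean-square displacement of a continuous-time random walk whose
    waiting times are Pareto-tailed (index alpha < 1 -> sub-diffusion)."""
    rng = random.Random(seed)
    msd = 0.0
    for _ in range(n_walkers):
        t, x = 0.0, 0
        while True:
            # waiting time w >= 1 with P(w > T) = T**(-alpha)
            t += (1.0 - rng.random()) ** (-1.0 / alpha)
            if t > t_obs:
                break
            x += rng.choice((-1, 1))  # unbiased unit jump
        msd += x * x
    return msd / n_walkers

# For alpha = 0.5 the MSD grows roughly as t**0.5, not linearly:
m1 = ctrw_msd(1e2)
m2 = ctrw_msd(1e4)  # observation time 100x longer
```

    For normal diffusion the 100-fold longer observation would multiply the MSD by about 100; here the ratio stays near 10.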

  9. Investigation of Secondary Neutron Production in Large Space Vehicles for Deep Space

    NASA Technical Reports Server (NTRS)

    Rojdev, Kristina; Koontz, Steve; Reddell, Brandon; Atwell, William; Boeder, Paul

    2016-01-01

    Future NASA missions will focus on deep space and Mars surface operations with large structures necessary for transportation of crew and cargo. In addition to the challenges of manufacturing these large structures, there are added challenges from the space radiation environment and its impacts on the crew, electronics, and vehicle materials. Primary radiation from the sun (solar particle events) and from outside the solar system (galactic cosmic rays) interact with materials of the vehicle and the elements inside the vehicle. These interactions lead to the primary radiation being absorbed or producing secondary radiation (primarily neutrons). With all vehicles, the high-energy primary radiation is of most concern. However, with larger vehicles, there is more opportunity for secondary radiation production, which can be significant enough to cause concern. In a previous paper, we embarked upon our first steps toward studying neutron production from large vehicles by validating our radiation transport codes for neutron environments against flight data. The following paper will extend the previous work to focus on the deep space environment and the resulting neutron flux from large vehicles in this deep space environment.

  10. Air kerma calibration factors and chamber correction values for PTW soft x-ray, NACP and Roos ionization chambers at very low x-ray energies.

    PubMed

    Ipe, N E; Rosser, K E; Moretti, C J; Manning, J W; Palmer, M J

    2001-08-01

    This paper evaluates the characteristics of ionization chambers for the measurement of absorbed dose to water using very low-energy x-rays. The values of the chamber correction factor, k(ch), used in the IPEMB 1996 code of practice for the UK secondary standard ionization chambers (PTW type M23342 and PTW type M23344), the Roos (PTW type 34001) and NACP electron chambers are derived. The responses in air of the small and large soft x-ray chambers (PTW type M23342 and PTW type M23344) and the NACP and Roos electron ionization chambers were compared. Besides the soft x-ray chambers, the NACP and Roos chambers can be used for very low-energy x-ray dosimetry provided that they are used in the restricted energy range for which their response does not change by more than 5%. The chamber correction factor was found by comparing the absorbed dose to water determined using the dosimetry protocol recommended for low-energy x-rays with that for very low-energy x-rays. The overlap energy range was extended using data from Grosswendt and Knight. Chamber correction factors given in this paper are chamber dependent, varying from 1.037 to 1.066 for a PTW type M23344 chamber, which is very different from the value of unity given in the IPEMB code. However, the values of k(ch) determined in this paper agree with those given in the DIN standard within experimental uncertainty. The authors recommend that the very low-energy section of the IPEMB code be amended to include the most up-to-date values of k(ch).

  11. Suite of Benchmark Tests to Conduct Mesh-Convergence Analysis of Nonlinear and Non-constant Coefficient Transport Codes

    NASA Astrophysics Data System (ADS)

    Zamani, K.; Bombardelli, F. A.

    2014-12-01

    Verification of geophysics codes is imperative to avoid serious academic as well as practical consequences. When access to a given source code is not possible, the Method of Manufactured Solutions (MMS) cannot be employed in code verification. In contrast, employing the Method of Exact Solutions (MES) has several practical advantages. In this research, we first provide four new one-dimensional analytical solutions designed for code verification; these solutions are able to uncover particular imperfections of solvers for the advection-diffusion-reaction (ADR) equation, such as nonlinear advection, diffusion or source terms, as well as non-constant coefficients. After that, we provide a solution of Burgers' equation in a novel setup. The proposed solutions satisfy continuity of mass for the ambient flow, which is a crucial factor for coupled hydrodynamics-transport solvers. Then, we use the derived analytical solutions for code verification. To clarify gray-literature issues in the verification of transport codes, we designed a comprehensive test suite to uncover any imperfection in transport solvers via a hierarchical increase in the level of test complexity. The test suite includes hundreds of unit tests and system tests that check individual portions of the code. Example checks start with a simple case of unidirectional advection, then bidirectional advection and tidal flow, and build up to nonlinear cases. We design tests to check nonlinearity in velocity, dispersivity and reactions. The concealing effect of scales (Peclet and Damkohler numbers) on mesh-convergence studies, and appropriate remedies, are also discussed. For cases in which appropriate benchmarks for a mesh-convergence study are not available, we utilize symmetry. Auxiliary subroutines for automation of the test suite and report generation are designed. 
All in all, the test package is not only a robust tool for code verification but also provides comprehensive insight into the capabilities of ADR solvers. Such information is essential for any rigorous computational modeling of the ADR equation for surface/subsurface pollution transport. We also convey our experiences in finding several errors that were not detectable with routine verification techniques.
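    A mesh-convergence check against an exact solution, of the kind such a suite automates, can be sketched for the simplest unidirectional-advection case: a first-order upwind scheme should show an observed order of accuracy near one when the grid is refined. This is a generic MES-style illustration, not part of the authors' suite:

```python
import math

def upwind_error(n_cells, a=1.0, t_end=0.5, cfl=0.5):
    """L2 error of first-order upwind for u_t + a u_x = 0 on a periodic
    unit domain, measured against the exact translated solution."""
    dx = 1.0 / n_cells
    dt = cfl * dx / a
    steps = int(round(t_end / dt))
    u = [math.sin(2 * math.pi * i * dx) for i in range(n_cells)]
    nu = a * dt / dx  # Courant number
    for _ in range(steps):
        # u[i-1] wraps to u[-1], giving the periodic boundary for free
        u = [u[i] - nu * (u[i] - u[i - 1]) for i in range(n_cells)]
    t = steps * dt
    err2 = sum((u[i] - math.sin(2 * math.pi * (i * dx - a * t))) ** 2
               for i in range(n_cells))
    return math.sqrt(err2 * dx)

# Halving dx should roughly halve the error for a first-order scheme
e_coarse = upwind_error(50)
e_fine = upwind_error(100)
order = math.log(e_coarse / e_fine, 2)  # observed order of accuracy
```

    An observed order well below the scheme's formal order on such a smooth problem is exactly the kind of imperfection the test hierarchy is designed to expose.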

  12. Response of the first wetted wall of an IFE reactor chamber to the energy release from a direct-drive DT capsule

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Medin, Stanislav A.; Basko, Mikhail M.; Orlov, Yurii N.

    2012-07-11

    Radiation hydrodynamics 1D simulations were performed with two concurrent codes, DEIRA and RAMPHY. The DEIRA code was used for DT capsule implosion and burn, and the RAMPHY code was used for computation of X-ray and fast-ion deposition in the first-wall liquid film of the reactor chamber. The simulations were run for a 740 MJ direct-drive DT capsule and a Pb thin liquid wall reactor chamber of 10 m diameter. Temporal profiles of the DT capsule leaking power in X-rays, neutrons and fast ⁴He ions were obtained, and spatial profiles of the liquid film flow parameters were computed and analyzed.

  13. Integrated modelling framework for short pulse high energy density physics experiments

    NASA Astrophysics Data System (ADS)

    Sircombe, N. J.; Hughes, S. J.; Ramsay, M. G.

    2016-03-01

    Modelling experimental campaigns on the Orion laser at AWE, and developing a viable point-design for fast ignition (FI), calls for a multi-scale approach; a complete description of the problem would require an extensive range of physics which cannot realistically be included in a single code. For modelling the laser-plasma interaction (LPI) we need a fine mesh which can capture the dispersion of electromagnetic waves, and a kinetic model for each plasma species. In the dense material of the bulk target, away from the LPI region, collisional physics dominates. The transport of hot particles generated by the action of the laser is dependent on their slowing and stopping in the dense material and their need to draw a return current. These effects will heat the target, which in turn influences transport. On longer timescales, the hydrodynamic response of the target will begin to play a role as the pressure generated from isochoric heating begins to take effect. Recent effort at AWE [1] has focussed on the development of an integrated code suite based on: the particle-in-cell code EPOCH, to model LPI; the Monte Carlo electron transport code THOR, to model the onward transport of hot electrons; and the radiation hydrodynamics code CORVUS, to model the hydrodynamic response of the target. We outline the methodology adopted, elucidate the advantages of a robustly integrated code suite compared to a single-code approach, demonstrate the integrated code suite's application to modelling the heating of buried layers on Orion, and assess the potential of such experiments for the validation of modelling capability in advance of more ambitious HEDP experiments, as a step towards a predictive modelling capability for FI.

  14. MO-FG-CAMPUS-TeP3-02: Benchmarks of a Proton Relative Biological Effectiveness (RBE) Model for DNA Double Strand Break (DSB) Induction in the FLUKA, MCNP, TOPAS, and RayStation™ Treatment Planning System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, R; Streitmatter, S; Traneus, E

    2016-06-15

    Purpose: Validate the implementation of a published RBE model for DSB induction (RBEDSB) in several general-purpose Monte Carlo (MC) code systems and the RayStation™ treatment planning system (TPS). For protons and other light ions, DSB induction is a critical initiating molecular event that correlates well with the RBE for cell survival. Methods: An efficient algorithm to incorporate information on proton and light-ion RBEDSB from the independently tested Monte Carlo Damage Simulation (MCDS) has now been integrated into MCNP (Stewart et al. PMB 60, 8249–8274, 2015), FLUKA, TOPAS and a research build of the RayStation™ TPS. To cross-validate the RBEDSB model implementation, LET distributions, depth-dose and lateral (dose and RBEDSB) profiles for monodirectional, monoenergetic (100 to 200 MeV) protons incident on a water phantom are compared. The effects of recoil and secondary ion production (²H⁺, ³H⁺, ³He²⁺, ⁴He²⁺), spot size (3 and 10 mm), and transport physics on beam profiles and RBEDSB are examined. Results: Depth-dose and RBEDSB profiles among all of the MC models are in excellent agreement using a 1 mm distance criterion (width of a voxel). For a 100 MeV proton beam (10 mm spot), RBEDSB = 1.2 ± 0.03 (2–3%) at the tip of the Bragg peak and increases to 1.59 ± 0.3 two mm distal to the Bragg peak. RBEDSB tends to decrease as the kinetic energy of the incident proton increases. Conclusion: The model for proton RBEDSB has been accurately implemented in FLUKA, MCNP, TOPAS and the RayStation™ TPS. The transport of secondary light ions (Z > 1) has a significant impact on RBEDSB, especially distal to the Bragg peak, although light ions have a small effect on (dose × RBEDSB) profiles. The ability to incorporate spatial variations in proton RBE within a TPS creates new opportunities to individualize treatment plans and increase the therapeutic ratio. Dr. 
Erik Traneus is employed full-time as a Research Scientist at RaySearch Laboratories. The research build of the RayStation used in the study was made available to the University of Washington free of charge. RaySearch Laboratories did not provide any monetary support for the reported studies.

  15. An Update of Recent Phits Code

    NASA Astrophysics Data System (ADS)

    Sihver, Lembit; Sato, Tatsuhiko; Niita, Koji; Iwase, Hiroshi; Iwamoto, Yosuke; Matsuda, Norihiro; Nakashima, Hiroshi; Sakamoto, Yukio; Gustafsson, Katarina; Mancusi, Davide

    We will first present the current status of the General-Purpose Particle and Heavy-Ion Transport code System (PHITS). In particular, we will describe benchmarking of calculated cross sections against measurements; we will introduce a relativistically covariant version of JQMD, called R-JQMD, that features an improved ground-state initialization algorithm, and we will show heavy-ion charge-changing cross sections simulated with R-JQMD and compare them to experimental data and to results predicted by the JQMD model. We will also show calculations of the dose received by aircrews and personnel in space from cosmic radiation. In recent years, many countries have issued regulations or recommendations setting annual dose limits for aircrews. Since estimation of cosmic-ray spectra in the atmosphere is an essential issue for the evaluation of aviation doses, we have calculated these spectra using PHITS. The accuracy of the simulation, which has been well verified by experimental data taken under various conditions, will be presented together with software called EXPACS-V, which can visualize the cosmic-ray dose rates at ground level or at a given altitude on the map of Google Earth, using the PHITS-based Analytical Radiation Model in the Atmosphere (PARMA). PARMA can instantaneously calculate the cosmic-ray spectra anywhere in the world by specifying the atmospheric depth, the vertical cut-off rigidity and the force-field potential. For the purpose of examining the applicability of PHITS to shielding design in space, the absorbed doses in a tissue-equivalent water phantom inside an imaginary space vessel have been estimated for different shielding materials of different thicknesses. The results confirm previous results indicating that PHITS is a suitable tool when performing shielding design studies of spacecraft. 
Finally, we have used PHITS for calculations of depth-dose distributions in MATROSHKA, an ESA project dedicated to determining the radiation load on astronauts within and outside the International Space Station (ISS).

  16. Realistic radiative MHD simulation of a solar flare

    NASA Astrophysics Data System (ADS)

    Rempel, Matthias D.; Cheung, Mark; Chintzoglou, Georgios; Chen, Feng; Testa, Paola; Martinez-Sykora, Juan; Sainz Dalda, Alberto; DeRosa, Marc L.; Viktorovna Malanushenko, Anna; Hansteen, Viggo H.; De Pontieu, Bart; Carlsson, Mats; Gudiksen, Boris; McIntosh, Scott W.

    2017-08-01

    We present a recently developed version of the MURaM radiative MHD code that includes coronal physics in terms of optically thin radiative losses and field-aligned heat conduction. The code employs the "Boris correction" (semi-relativistic MHD with a reduced speed of light) and a hyperbolic treatment of heat conduction, which allow for efficient simulations of the photosphere/corona system by avoiding the severe time-step constraints arising from Alfven wave propagation and heat conduction. We demonstrate that this approach can be used even in dynamic phases such as a flare. We consider a setup in which a flare is triggered by flux emergence into a pre-existing bipolar active region. After the coronal energy release, efficient transport of energy along field lines leads to the formation of flare ribbons within seconds. In the flare ribbons we find downflows for temperatures lower than ~5 MK and upflows at higher temperatures. The resulting soft X-ray emission shows a fast rise and slow decay, reaching a peak corresponding to a mid C-class flare. The post-reconnection energy release in the corona leads to average particle energies reaching 50 keV (500 MK under the assumption of a thermal plasma). We show that hard X-ray emission from the corona computed under the assumption of thermal bremsstrahlung can produce a power-law spectrum due to the multi-thermal nature of the plasma. The electron energy flux into the flare ribbons (classic heat conduction with a free-streaming limit) is highly inhomogeneous and reaches peak values of about 3×10¹¹ erg/cm²/s in a small fraction of the ribbons, indicating regions that could potentially produce hard X-ray footpoint sources. We demonstrate that these findings are robust by comparing simulations computed with different values of the saturation heat flux as well as the "reduced speed of light".
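    The mechanism noted, a power-law hard X-ray spectrum from purely thermal bremsstrahlung in a multi-thermal plasma, can be illustrated with a toy emission-measure distribution EM(T) ∝ T^-γ: summing exp(-E/T)/sqrt(T) spectra over a wide temperature range yields an approximate E^(1/2-γ) photon power law. The temperature range, index and normalization below are illustrative assumptions, not the MURaM calculation:

```python
import math

def multithermal_spectrum(E_keV, gamma=4.0, T_min=1.0, T_max=500.0, n=400):
    """Bremsstrahlung emissivity ~ exp(-E/T)/sqrt(T) (Gaunt factor dropped),
    summed over a multi-thermal plasma with emission measure EM(T) ~ T^-gamma."""
    total = 0.0
    # integrate in log T so the wide temperature range is well sampled
    dlnT = math.log(T_max / T_min) / n
    for i in range(n + 1):
        T = T_min * math.exp(i * dlnT)
        w = 0.5 if i in (0, n) else 1.0  # trapezoid end-point weights
        total += w * T**(-gamma) * math.exp(-E_keV / T) / math.sqrt(T) * T * dlnT
    return total

# Effective photon power-law index between 10 and 40 keV: expect ~ 1/2 - gamma
s10, s40 = multithermal_spectrum(10.0), multithermal_spectrum(40.0)
slope = math.log(s40 / s10) / math.log(4.0)
```

    A single-temperature spectrum would steepen exponentially over this band; the multi-thermal sum keeps the log-log slope nearly constant, mimicking a non-thermal power law.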

  17. Extension to Higher Mass Numbers of an Improved Knockout-Ablation-Coalescence Model for Secondary Neutron and Light Ion Production in Cosmic Ray Interactions

    NASA Astrophysics Data System (ADS)

    Indi Sriprisan, Sirikul; Townsend, Lawrence; Cucinotta, Francis A.; Miller, Thomas M.

    Purpose: An analytical knockout-ablation-coalescence model capable of making quantitative predictions of the neutron spectra from high-energy nucleon-nucleus and nucleus-nucleus collisions is being developed for use in space radiation protection studies. The FORTRAN computer code that implements this model is called UBERNSPEC. The knockout or abrasion stage of the model is based on Glauber multiple scattering theory. The ablation part of the model uses the classical evaporation model of Weisskopf-Ewing. In earlier work, the knockout-ablation model was extended to incorporate important coalescence effects into the formalism. Recently, alpha coalescence has been incorporated, adding the ability to predict light-ion spectra with the coalescence model. The earlier versions were limited to nuclei with mass numbers less than 69. In this work, the UBERNSPEC code has been extended to make predictions of secondary neutron and light-ion production from the interactions of heavy charged particles with higher mass numbers (as large as 238). The predictions are compared with published measurements of neutron spectra and light-ion energies for a variety of collision pairs. Furthermore, the predicted spectra from this work are compared with predictions from the recently developed heavy-ion event generator incorporated in the Monte Carlo radiation transport code HETC-HEDS.

  18. SOC-DS computer code provides tool for design evaluation of homogeneous two-material nuclear shield

    NASA Technical Reports Server (NTRS)

    Disney, R. K.; Ricks, L. O.

    1967-01-01

    The SOC-DS code (Shield Optimization Code - Direct Search) selects a nuclear shield material of optimum volume, weight, or cost to meet the requirements of a given radiation dose rate or energy transmission constraint. It is applicable to evaluating neutron and gamma-ray shields for all nuclear reactors.

  19. Computing Challenges in Coded Mask Imaging

    NASA Technical Reports Server (NTRS)

    Skinner, Gerald

    2009-01-01

    This slide presentation reviews the complications and challenges in developing computer systems for coded mask imaging telescopes. The coded mask technique is used when there is no other way to build the telescope (i.e., for wide fields of view, energies too high for focusing optics or too low for Compton/tracker techniques, and very good angular resolution). The coded mask telescope is described, and the mask is reviewed. The coded masks for the INTErnational Gamma-Ray Astrophysics Laboratory (INTEGRAL) instruments are shown, and a chart showing the types of position-sensitive detectors used for coded mask telescopes is also reviewed. Slides describe the mechanism of recovering an image from the masked pattern. The correlation with the mask pattern is described. The matrix approach is reviewed, and other approaches to image reconstruction are described. Included in the presentation is a review of the Energetic X-ray Imaging Survey Telescope (EXIST) / High Energy Telescope (HET), with information about the mission, the operation of the telescope, a comparison of the EXIST/HET with the SWIFT/BAT, and details of the design of the EXIST/HET.
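    The correlation-based recovery mentioned in the slides can be sketched in one dimension: the detector shadowgram is correlated with a balanced decoding array (open mask elements mapped to +1, closed to -1), which concentrates a point source's counts at its true position. A toy example with a random mask (real instruments use specially constructed patterns such as URAs to suppress sidelobes):

```python
import random

def decode(shadowgram, mask):
    """Recover the sky by cyclically correlating the detector pattern with
    the balanced decoding array G (open element -> +1, closed -> -1)."""
    n = len(mask)
    g = [2 * m - 1 for m in mask]
    return [sum(shadowgram[(i + j) % n] * g[j] for j in range(n))
            for i in range(n)]

# 1-D demo: a point source at position 17 casts a shifted copy of the mask
rng = random.Random(0)
n = 101
mask = [rng.randint(0, 1) for _ in range(n)]
src = 17
shadowgram = [mask[(i - src) % n] for i in range(n)]
sky = decode(shadowgram, mask)
peak = max(range(n), key=lambda i: sky[i])  # brightest reconstructed pixel
```

    At the true source position the correlation equals the number of open mask elements, while elsewhere positive and negative contributions largely cancel.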

  20. Investigation of HZETRN 2010 as a Tool for Single Event Effect Qualification of Avionics Systems

    NASA Technical Reports Server (NTRS)

    Rojdev, Kristina; Atwell, William; Boeder, Paul; Koontz, Steve

    2014-01-01

    NASA's future missions are focused on deep space human exploration, which does not provide a simple emergency return to Earth. In addition, the deep space environment contains a constant background of Galactic Cosmic Ray (GCR) radiation exposure, as well as periodic Solar Particle Events (SPEs) that can produce intense amounts of radiation in a short amount of time. Given these conditions, it is important that the avionics systems for deep space human missions are not susceptible to Single Event Effects (SEE) that can occur from radiation interactions with electronic components. The typical approach to minimizing SEE is to use heritage hardware and extensive testing programs, which are very costly. Previous work by Koontz, et al. [1] utilized an analysis-based method for investigating electronic component susceptibility. In their paper, FLUKA, a Monte Carlo transport code, was used to calculate SEE and single event upset (SEU) rates. This code was then validated against in-flight data. In addition, CREME-96, a deterministic code, was also compared with FLUKA and in-flight data. However, FLUKA has a long run time (on the order of days), and CREME-96 has not been updated in several years. This paper will investigate the use of HZETRN 2010, a deterministic transport code developed at NASA Langley Research Center, as another tool that can be used to analyze SEE and SEU rates. The benefits of using HZETRN over FLUKA and CREME-96 are that it has a very fast run time (on the order of minutes) and has been shown to be of similar accuracy to other deterministic and Monte Carlo codes when considering dose [2, 3, 4]. The 2010 version of HZETRN has an updated treatment of secondary neutrons and thus improved accuracy over previous versions. In this paper, the Linear Energy Transfer (LET) spectra are of interest rather than the total ionizing dose. 
Therefore, the LET spectra output from HZETRN 2010 will be compared with the FLUKA and in-flight data to validate HZETRN 2010 as a computational tool for SEE qualification by analysis. Furthermore, extrapolation of these data to interplanetary environments at 1 AU will be investigated to determine whether HZETRN 2010 can be used successfully and confidently for deep space mission analyses.

  1. Muon simulation codes MUSIC and MUSUN for underground physics

    NASA Astrophysics Data System (ADS)

    Kudryavtsev, V. A.

    2009-03-01

    The paper describes two Monte Carlo codes dedicated to muon simulations: MUSIC (MUon SImulation Code) and MUSUN (MUon Simulations UNderground). MUSIC is a package for muon transport through matter. It is particularly useful for propagating muons through large thickness of rock or water, for instance from the surface down to underground/underwater laboratory. MUSUN is designed to use the results of muon transport through rock/water to generate muons in or around underground laboratory taking into account their energy spectrum and angular distribution.
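    Muon propagation through large thicknesses of rock is commonly summarized by the continuous-slowing-down relation -dE/dX = a + bE, where a covers ionization and b the radiative losses. MUSIC itself samples stochastic losses, so the closed-form sketch below is only an average-behavior approximation with illustrative standard-rock constants:

```python
import math

# Illustrative standard-rock parameters (energies in GeV, depth in g/cm^2);
# the real codes use energy-dependent cross sections, not constants.
A_ION = 2.0e-3   # ionization loss, GeV per g/cm^2
B_RAD = 4.0e-6   # radiative losses (brems, pair, photonuclear), 1/(g/cm^2)

def muon_energy_after(E0, depth):
    """Mean muon energy after crossing `depth` (g/cm^2), from the
    continuous-slowing-down approximation -dE/dX = a + b*E."""
    eps = A_ION / B_RAD  # critical energy where radiative losses take over
    E = (E0 + eps) * math.exp(-B_RAD * depth) - eps
    return max(E, 0.0)

def muon_range(E0):
    """Depth at which the mean energy reaches zero."""
    return math.log(1.0 + E0 * B_RAD / A_ION) / B_RAD

# A 1 TeV muon crossing ~1 km of standard rock (~2.65e5 g/cm^2)
E_exit = muon_energy_after(1000.0, 2.65e5)
```

    Codes like MUSIC track the fluctuations around this mean, which dominate the tails of the surviving-muon energy spectrum underground.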

  2. Approximate Green's function methods for HZE transport in multilayered materials

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Badavi, Francis F.; Shinn, Judy L.; Costen, Robert C.

    1993-01-01

    A nonperturbative analytic solution of the high charge and energy (HZE) Green's function is used to implement a computer code for laboratory ion beam transport in multilayered materials. The code is established to operate on the Langley nuclear fragmentation model used in engineering applications. Computational procedures are established to generate linear energy transfer (LET) distributions for a specified ion beam and target for comparison with experimental measurements. The code was found to be highly efficient and compared well with the perturbation approximation.

  3. TEMPEST code simulations of hydrogen distribution in reactor containment structures. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trent, D.S.; Eyler, L.L.

    The mass transport version of the TEMPEST computer code was used to simulate hydrogen distribution in geometric configurations relevant to reactor containment structures. Predicted results of Battelle-Frankfurt hydrogen distribution tests 1 to 6, and 12 are presented. Agreement between predictions and experimental data is good. Best agreement is obtained using the k-epsilon turbulence model in TEMPEST in flow cases where turbulent diffusion and stable stratification are dominant mechanisms affecting transport. The code's general analysis capabilities are summarized.

  4. Optical Surface Analysis Code (OSAC). 7.0

    NASA Technical Reports Server (NTRS)

    Glenn, P.

    1998-01-01

    The purpose of this modification to the Optical Surface Analysis Code (OSAC) is to upgrade the PSF program to allow the user to get proper diffracted energy normalization even when deliberately obscuring rays with internal obscurations.

  5. 14 CFR 257.6 - Effective and compliance dates.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... the selling carrier is not the transporting carrier and (ii) Of the transporting carrier's identity... transportation involving a code-share arrangement of the transporting carrier's corporate name and any other name...

  6. 14 CFR 257.6 - Effective and compliance dates.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... the selling carrier is not the transporting carrier and (ii) Of the transporting carrier's identity... transportation involving a code-share arrangement of the transporting carrier's corporate name and any other name...

  7. 14 CFR 257.6 - Effective and compliance dates.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... the selling carrier is not the transporting carrier and (ii) Of the transporting carrier's identity... transportation involving a code-share arrangement of the transporting carrier's corporate name and any other name...

  8. 14 CFR 257.6 - Effective and compliance dates.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... the selling carrier is not the transporting carrier and (ii) Of the transporting carrier's identity... transportation involving a code-share arrangement of the transporting carrier's corporate name and any other name...

  9. PelePhysics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2017-05-17

    PelePhysics is a suite of physics packages that provides functionality of use to reacting hydrodynamics CFD codes. The initial release includes an interface to reaction rate mechanism evaluation, transport coefficient evaluation, and a generalized equation of state (EOS) facility. Both generic evaluators and interfaces to code from externally available tools (Fuego for chemical rates, EGLib for transport coefficients) are provided.

  10. Analysis of JT-60SA operational scenarios

    NASA Astrophysics Data System (ADS)

    Garzotti, L.; Barbato, E.; Garcia, J.; Hayashi, N.; Voitsekhovitch, I.; Giruzzi, G.; Maget, P.; Romanelli, M.; Saarelma, S.; Stankiewitz, R.; Yoshida, M.; Zagórski, R.

    2018-02-01

    Reference scenarios for the JT-60SA tokamak have been simulated with one-dimensional transport codes to assess the stationary state of the flat-top phase and provide a profile database for further physics studies (e.g. MHD stability, gyrokinetic analysis) and diagnostics design. The types of scenario considered vary from pulsed standard H-mode to advanced non-inductive steady-state plasmas. In this paper we present the results obtained with the ASTRA, CRONOS, JINTRAC and TOPICS codes equipped with the Bohm/gyro-Bohm, CDBM and GLF23 transport models. The scenarios analysed here are: a standard ELMy H-mode, a hybrid scenario and a non-inductive steady-state plasma, with operational parameters from the JT-60SA research plan. Several simulations of the scenarios under consideration have been performed with the above-mentioned codes and transport models. The results from the different codes are in broad agreement and the main plasma parameters generally agree well with the zero-dimensional estimates reported previously. The sensitivity of the results to different transport models and, in some cases, to the ELM/pedestal model has been investigated.

  11. Improvements on non-equilibrium and transport Green function techniques: The next-generation TRANSIESTA

    NASA Astrophysics Data System (ADS)

    Papior, Nick; Lorente, Nicolás; Frederiksen, Thomas; García, Alberto; Brandbyge, Mads

    2017-03-01

    We present novel methods implemented within the non-equilibrium Green function (NEGF) code TRANSIESTA, based on density functional theory (DFT). Our flexible, next-generation DFT-NEGF code handles devices with one or multiple electrodes (Ne ≥ 1) with individual chemical potentials and electronic temperatures. We describe its novel methods for electrostatic gating, contour optimizations, and assertion of charge conservation, as well as the newly implemented algorithms for optimized and scalable matrix inversion, performance-critical pivoting, and hybrid parallelization. Additionally, a generic NEGF "post-processing" code (TBTRANS/PHTRANS) for electron and phonon transport is presented with several novelties such as Hamiltonian interpolations, Ne ≥ 1 electrode capability, bond currents, a generalized interface for user-defined tight-binding transport, transmission projection using eigenstates of a projected Hamiltonian, and fast inversion algorithms for large-scale simulations easily exceeding 10^6 atoms on workstation computers. The new features of both codes are demonstrated and benchmarked for relevant test systems.
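    The transmission function at the heart of such NEGF transport codes reduces, for a single-level device coupled to two wide-band leads, to a closed-form scalar expression T(E) = Γ_L Γ_R |G(E)|² with G(E) = 1/(E − ε + i(Γ_L + Γ_R)/2). The sketch below is a minimal scalar analogue of what the full codes compute with matrices; all names and parameter values are assumptions, not the TRANSIESTA/TBTRANS API.

```python
def transmission(E, eps=0.0, gamma_L=0.5, gamma_R=0.5):
    """Landauer transmission T(E) = Gamma_L * Gamma_R * |G(E)|^2 for a
    single resonant level in the wide-band limit, where the retarded
    Green function is G(E) = 1 / (E - eps + i*(Gamma_L + Gamma_R)/2)."""
    G = 1.0 / complex(E - eps, (gamma_L + gamma_R) / 2.0)
    return gamma_L * gamma_R * abs(G) ** 2

# On resonance with symmetric couplings the channel is fully open: T = 1.
T_res = transmission(0.0)
```

    In the full codes the same structure appears as T(E) = Tr[Γ_L G Γ_R G†] with matrix-valued Green functions and electrode self-energies, which is why scalable matrix inversion is performance-critical.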

  12. TReacLab: An object-oriented implementation of non-intrusive splitting methods to couple independent transport and geochemical software

    NASA Astrophysics Data System (ADS)

    Jara, Daniel; de Dreuzy, Jean-Raynald; Cochepin, Benoit

    2017-12-01

    Reactive transport modeling contributes to understanding geophysical and geochemical processes in subsurface environments. Operator splitting methods have been proposed as non-intrusive coupling techniques that optimize the use of existing chemistry and transport codes. In this spirit, we propose a coupler relying on external geochemical and transport codes, with appropriate operator segmentation that enables possible development of additional splitting methods. We provide an object-oriented implementation in TReacLab, developed in the MATLAB environment in a free, open-source frame with an accessible repository. TReacLab contains classical coupling methods, template interfaces and calling functions for two classical transport and geochemical packages (PHREEQC and COMSOL). It is tested on four classical benchmarks with homogeneous and heterogeneous reactions, either at equilibrium or kinetically controlled. We show that full decoupling down to the implementation level has a cost in terms of accuracy compared to more integrated and optimized codes. Use of non-intrusive implementations like TReacLab is still justified for coupling independent transport and chemical software at minimal development effort, but should be systematically and carefully assessed.
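    The non-intrusive splitting idea can be sketched with a sequential non-iterative split: one transport step followed by one chemistry step per time step. In TReacLab the two halves would be calls into the external transport and geochemistry codes; here they are toy stand-ins (upwind advection and exact first-order decay), and all names are illustrative assumptions.

```python
import math

def advect(c, courant):
    """Explicit upwind advection step on a periodic 1D grid,
    with courant = v*dt/dx <= 1 for stability."""
    return [c[i] - courant * (c[i] - c[i - 1]) for i in range(len(c))]

def react(c, k, dt):
    """Exact first-order decay over dt: dc/dt = -k*c."""
    decay = math.exp(-k * dt)
    return [ci * decay for ci in c]

def split_step(c, courant, k, dt):
    """Sequential non-iterative operator splitting: transport, then chemistry."""
    return react(advect(c, courant), k, dt)

# A unit pulse advected 5 cells (courant = 1 makes upwind an exact shift)
# while decaying with rate k = 0.1 per unit time.
c = [1.0] + [0.0] * 9
for _ in range(5):
    c = split_step(c, courant=1.0, k=0.1, dt=1.0)
```

    The splitting error that the abstract's benchmarks quantify comes precisely from applying the two operators one after the other instead of simultaneously; with courant = 1 and linear decay this toy case happens to be exact.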

  13. Delta-ray Production in MCNP 6.2.0

    NASA Astrophysics Data System (ADS)

    Anderson, C.; McKinney, G.; Tutt, J.; James, M.

    Secondary electrons in the form of delta-rays, also referred to as knock-on electrons, have been a feature of MCNP for electron and positron transport for over 20 years. While MCNP6 now includes transport for a suite of heavy-ions and charged particles from its integration with MCNPX, the production of delta-rays was still limited to electron and positron transport. In the newest release of MCNP6, version 6.2.0, delta-ray production has now been extended for all energetic charged particles. The basis of this production is the analytical formulation from Rossi and ICRU Report 37. This paper discusses the MCNP6 heavy charged-particle implementation and provides production results for several benchmark/test problems.
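    The leading behavior of the close-collision (knock-on) cross section underlying delta-ray production is dσ/dT ∝ 1/T², which can be sampled exactly by inverting its cumulative distribution between an energy cutoff and the kinematic maximum. The sketch below shows only this leading term with illustrative bounds; MCNP6's actual formulation (per Rossi and ICRU Report 37) includes additional spin- and kinematics-dependent factors.

```python
import random

def sample_delta_ray_energy(t_min, t_max, rng=random):
    """Sample a knock-on electron kinetic energy from the leading 1/T^2
    term of the close-collision cross section via inverse-CDF sampling:
    1/T is uniform between 1/t_max and 1/t_min."""
    u = rng.random()
    inv_t = (1.0 - u) / t_min + u / t_max
    return 1.0 / inv_t

# Illustrative bounds: 1 keV production cutoff, 1 MeV kinematic maximum
# (expressed here in MeV); seeded for reproducibility.
rng = random.Random(42)
samples = [sample_delta_ray_energy(1e-3, 1.0, rng) for _ in range(10000)]
```

    The 1/T² weighting means the vast majority of delta-rays are produced near the cutoff energy, which is why the choice of production threshold strongly affects secondary-electron tallies.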

  14. Applications of the microdosimetric function implemented in the macroscopic particle transport simulation code PHITS.

    PubMed

    Sato, Tatsuhiko; Watanabe, Ritsuko; Sihver, Lembit; Niita, Koji

    2012-01-01

    Microdosimetric quantities such as lineal energy are generally considered to be better indices than linear energy transfer (LET) for expressing the relative biological effectiveness (RBE) of high charge and energy particles. To calculate their probability densities (PD) in macroscopic matter, it is necessary to integrate microdosimetric tools such as track-structure simulation codes with macroscopic particle transport simulation codes. As an integration approach, the mathematical model for calculating the PD of microdosimetric quantities developed based on track-structure simulations was incorporated into the macroscopic particle transport simulation code PHITS (Particle and Heavy Ion Transport code System). The improved PHITS enables the PD in macroscopic matter to be calculated within a reasonable computation time, while taking their stochastic nature into account. The microdosimetric function of PHITS was applied to biological dose estimation for charged-particle therapy and risk estimation for astronauts. The former application was performed in combination with the microdosimetric kinetic model, while the latter employed the radiation quality factor expressed as a function of lineal energy. Owing to the unique features of the microdosimetric function, the improved PHITS has the potential to establish more sophisticated systems for radiological protection in space as well as for the treatment planning of charged-particle therapy.
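    Lineal energy itself is a simple ratio: the energy imparted in a single event divided by the mean chord length of the sensitive site. A minimal sketch for a spherical site follows, using Cauchy's formula for the mean chord length of a convex body (l̄ = 4V/S, i.e. 2d/3 for a sphere); the function name and units are illustrative assumptions, not the PHITS interface.

```python
def lineal_energy(energy_keV, diameter_um):
    """Lineal energy y = eps / l_bar in keV/um, using the mean chord
    length l_bar = 2*d/3 of a spherical site of diameter d
    (Cauchy's formula l_bar = 4V/S for a convex body)."""
    mean_chord_um = 2.0 * diameter_um / 3.0
    return energy_keV / mean_chord_um

# A 1 keV deposit in a 1 um sphere gives y = 1.5 keV/um.
y = lineal_energy(1.0, 1.0)
```

    The stochastic part that the PHITS function adds is the probability density of the energy imparted per event, which is what requires coupling to track-structure results.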

  15. NPTFit: A Code Package for Non-Poissonian Template Fitting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mishra-Sharma, Siddharth; Rodd, Nicholas L.; Safdi, Benjamin R., E-mail: smsharma@princeton.edu, E-mail: nrodd@mit.edu, E-mail: bsafdi@mit.edu

    We present NPTFit, an open-source code package, written in Python and Cython, for performing non-Poissonian template fits (NPTFs). The NPTF is a recently developed statistical procedure for characterizing the contribution of unresolved point sources (PSs) to astrophysical data sets. The NPTF was first applied to Fermi gamma-ray data to provide evidence that the excess of ∼GeV gamma-rays observed in the inner regions of the Milky Way likely arises from a population of sub-threshold point sources, and the NPTF has since found additional applications studying sub-threshold extragalactic sources at high Galactic latitudes. The NPTF generalizes traditional astrophysical template fits to allow for the ability to search for populations of unresolved PSs that may follow a given spatial distribution. NPTFit builds upon the framework of the fluctuation analyses developed in X-ray astronomy, thus it likely has applications beyond those demonstrated with gamma-ray data. The NPTFit package utilizes novel computational methods to perform the NPTF efficiently. The code is available at http://github.com/bsafdi/NPTFit and up-to-date and extensive documentation may be found at http://nptfit.readthedocs.io.

  16. Analysis of the Effect of Electron Density Perturbations Generated by Gravity Waves on HF Communication Links

    NASA Astrophysics Data System (ADS)

    Fagre, M.; Elias, A. G.; Chum, J.; Cabrera, M. A.

    2017-12-01

    In the present work, ray tracing of high frequency (HF) signals in disturbed ionospheric conditions is analyzed, particularly in the presence of electron density perturbations generated by gravity waves (GWs). The three-dimensional numerical ray tracing code by Jones and Stephenson, based on Hamilton's equations and commonly used to study radio propagation through the ionosphere, is used. An electron density perturbation model is implemented in this code, based upon the consideration of atmospheric GWs generated at a height of 150 km in the thermosphere and propagating up into the ionosphere. The motion of the neutral gas at these altitudes induces disturbances in the background plasma which affect HF signal propagation. To obtain a realistic model of GWs for analyzing their propagation and dispersion characteristics, a GW ray tracing method with kinematic viscosity and thermal diffusivity was applied. The IRI-2012, HWM14 and NRLMSISE-00 models were incorporated to assess the electron density, wind velocities, neutral temperature and total mass density needed for the ray tracing codes. Preliminary results of gravity wave effects on ground range and reflection height are presented for the low- to mid-latitude ionosphere.
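    Hamiltonian ray tracing of the kind used in the Jones-Stephenson code reduces, for an isotropic medium with refractive index n(r), to integrating dr/dτ = k and dk/dτ = n∇n for the eikonal Hamiltonian H = (|k|² − n²)/2. A minimal 2D Euler-step sketch follows; the function names, the step scheme, and the homogeneous test medium are all illustrative assumptions, not the actual code.

```python
def trace_ray(x, z, kx, kz, n_func, grad_n_func, dtau, steps):
    """Integrate Hamilton's ray equations dr/dtau = k, dk/dtau = n*grad(n)
    for the eikonal Hamiltonian H = (|k|^2 - n(r)^2)/2, using plain
    Euler steps. Returns the list of (x, z) positions along the ray."""
    path = [(x, z)]
    for _ in range(steps):
        n = n_func(x, z)
        gx, gz = grad_n_func(x, z)
        x, z = x + dtau * kx, z + dtau * kz
        kx, kz = kx + dtau * n * gx, kz + dtau * n * gz
        path.append((x, z))
    return path

# Sanity case: in a homogeneous medium (n = 1, grad n = 0) the wave
# vector is constant and the ray must be a straight line.
path = trace_ray(0.0, 0.0, 1.0, 0.5, lambda x, z: 1.0,
                 lambda x, z: (0.0, 0.0), dtau=0.1, steps=100)
```

    In the ionospheric problem n depends on the local electron density, so the GW-induced density perturbations enter the trajectory through the ∇n term, bending rays and shifting the ground range and reflection height.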

  17. Main functions, recent updates, and applications of Synchrotron Radiation Workshop code

    NASA Astrophysics Data System (ADS)

    Chubar, Oleg; Rakitin, Maksim; Chen-Wiegart, Yu-Chen Karen; Chu, Yong S.; Fluerasu, Andrei; Hidas, Dean; Wiegart, Lutz

    2017-08-01

    The paper presents an overview of the main functions and new application examples of the "Synchrotron Radiation Workshop" (SRW) code. SRW supports high-accuracy calculations of different types of synchrotron radiation, and simulations of propagation of fully-coherent radiation wavefronts, partially-coherent radiation from a finite-emittance electron beam of a storage ring source, and time-/frequency-dependent radiation pulses of a free-electron laser, through X-ray optical elements of a beamline. An extended library of physical-optics "propagators" for different types of reflective, refractive and diffractive X-ray optics with their typical imperfections, implemented in SRW, enables simulation of practically any X-ray beamline in a modern light source facility. The high accuracy of the calculation methods used in SRW allows for multiple applications of this code, not only in the area of development of instruments and beamlines for new light source facilities, but also in areas such as electron beam diagnostics, commissioning and performance benchmarking of insertion devices and individual X-ray optical elements of beamlines. Applications of SRW in these areas, facilitating development and advanced commissioning of beamlines at the National Synchrotron Light Source II (NSLS-II), are described.

  18. The Enceladus Ionizing Radiation Environment: Implications for Biomolecules

    NASA Astrophysics Data System (ADS)

    Teodoro, L. A.; Elphic, R. C.; Davila, A. F.; McKay, C.; Dartnell, L.

    2016-12-01

    Enceladus' subsurface ocean is a possible abode for life, but it is inaccessible with current technology. However, icy particles and vapor are being expelled into space through surface fractures known as Tiger Stripes, forming a large plume centered in the South Polar Terrains. Direct chemical analyses by Cassini have revealed salts and organic compounds in a significant fraction of plume particles, which suggests that the subsurface ocean is the main source of materials in the plume (i.e. frozen ocean spray). While smaller icy particles in the plume reach escape velocity and feed Saturn's E-ring, larger particles fall back onto the moon's surface, where they accumulate as icy mantling deposits at practically all latitudes. The organic content of these fall-out materials could be of great astrobiological relevance. Galactic Cosmic Rays (GCRs) that strike both Enceladus' surface and the lofted icy particles produce ionizing radiation in the form of high-energy electrons, protons, gamma rays, neutrons and muons. An additional source of ionizing radiation is the population of energetic charged particles in Saturn's magnetosphere. The effects of ionizing radiation in matter always involve the destruction of chemical bonds and the creation of free radicals. Both processes affect organic matter, and can damage or destroy biomarkers over time. Using ionizing radiation transport codes, we recreated the radiation environment on the surface of Enceladus, and evaluated its possible effects on organic matter (including biomarkers) in the icy mantling deposits. Here, we present full Monte-Carlo simulations of the nuclear reactions induced by the GCRs hitting Enceladus' surface, using a code based on the GEANT-4 toolkit for the transport of particles. To model the GCR primary spectra for Z = 1-26 (protons to iron nuclei) we assumed the CREME96 model under solar minimum, modified to take into account Enceladus' location. We considered bulk compositions of: i) pure water ice, ii) water ice and organics (1-10%), and iii) water ice, organics and salts (up to 2%). The computed flux of ionizing radiation is converted into dosage at the molecular level using a "biologically-weighted" scheme, which provides an estimate of the biomarkers' survival time.

  19. 3D unstructured-mesh radiation transport codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morel, J.

    1997-12-31

    Three unstructured-mesh radiation transport codes are currently being developed at Los Alamos National Laboratory. The first code is ATTILA, which uses an unstructured tetrahedral mesh in conjunction with standard Sn (discrete-ordinates) angular discretization, standard multigroup energy discretization, and linear-discontinuous spatial differencing. ATTILA solves the standard first-order form of the transport equation using source iteration in conjunction with diffusion-synthetic acceleration of the within-group source iterations. ATTILA is designed to run primarily on workstations. The second code is DANTE, which uses a hybrid finite-element mesh consisting of arbitrary combinations of hexahedra, wedges, pyramids, and tetrahedra. DANTE solves several second-order self-adjoint forms of the transport equation including the even-parity equation, the odd-parity equation, and a new equation called the self-adjoint angular flux equation. DANTE also offers three angular discretization options: $S_n$ (discrete-ordinates), $P_n$ (spherical harmonics), and $SP_n$ (simplified spherical harmonics). DANTE is designed to run primarily on massively parallel message-passing machines, such as the ASCI-Blue machines at LANL and LLNL. The third code is PERICLES, which uses the same hybrid finite-element mesh as DANTE, but solves the standard first-order form of the transport equation rather than a second-order self-adjoint form. PERICLES uses a standard $S_n$ discretization in angle in conjunction with trilinear-discontinuous spatial differencing, and diffusion-synthetic acceleration of the within-group source iterations. PERICLES was initially designed to run on workstations, but a version for massively parallel message-passing machines will be built. The three codes will be described in detail and computational results will be presented.
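    The within-group source iteration mentioned above converges geometrically at the scattering ratio c = σs/σt, which is why diffusion-synthetic acceleration matters when c is near 1. An infinite-medium sketch makes the rate concrete; the function name and cross-section values are illustrative assumptions, not any of the three codes.

```python
def source_iteration(sigma_t, sigma_s, q, tol=1e-10, max_iter=1000):
    """Unaccelerated within-group source iteration in an infinite medium:
    phi_{l+1} = (q + sigma_s * phi_l) / sigma_t, which converges at rate
    c = sigma_s / sigma_t to the exact answer phi = q / (sigma_t - sigma_s).
    Returns (flux, iterations used)."""
    phi = 0.0
    for it in range(max_iter):
        phi_new = (q + sigma_s * phi) / sigma_t
        if abs(phi_new - phi) < tol:
            return phi_new, it + 1
        phi = phi_new
        
    return phi, max_iter

# Highly scattering medium (c = 0.9): exact answer is 0.1 / 0.1 = 1.0,
# but unaccelerated iteration needs on the order of a hundred sweeps.
phi, iters = source_iteration(sigma_t=1.0, sigma_s=0.9, q=0.1)
```

    The iteration count grows without bound as c approaches 1, which is exactly the regime diffusion-synthetic acceleration is designed to fix by correcting each sweep with a cheap diffusion solve.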

  20. AN EXTENSION OF THE ATHENA++ CODE FRAMEWORK FOR GRMHD BASED ON ADVANCED RIEMANN SOLVERS AND STAGGERED-MESH CONSTRAINED TRANSPORT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, Christopher J.; Stone, James M.; Gammie, Charles F.

    2016-08-01

    We present a new general relativistic magnetohydrodynamics (GRMHD) code integrated into the Athena++ framework. Improving upon the techniques used in most GRMHD codes, ours allows the use of advanced, less diffusive Riemann solvers, in particular HLLC and HLLD. We also employ a staggered-mesh constrained transport algorithm suited for curvilinear coordinate systems in order to maintain the divergence-free constraint of the magnetic field. Our code is designed to work with arbitrary stationary spacetimes in one, two, or three dimensions, and we demonstrate its reliability through a number of tests. We also report on its promising performance and scalability.
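    The divergence-preserving property of staggered-mesh constrained transport follows from updating both face-centered field components from the same corner EMFs, so that the corner contributions cancel exactly in the discrete divergence. The 2D sketch below demonstrates this to machine precision; the array layouts and function names are assumptions for illustration, not the Athena++ implementation.

```python
import random

def divergence(Bx, By, nx, ny, dx, dy):
    """Cell-centered divergence of a face-centered field:
    Bx has shape (nx+1, ny) on x-faces, By has shape (nx, ny+1) on y-faces."""
    return [[(Bx[i + 1][j] - Bx[i][j]) / dx + (By[i][j + 1] - By[i][j]) / dy
             for j in range(ny)] for i in range(nx)]

def ct_update(Bx, By, Ez, nx, ny, dx, dy, dt):
    """Staggered-mesh constrained-transport update in 2D for dB/dt = -curl(E):
    the EMF Ez lives on cell corners (shape (nx+1, ny+1)), and both face
    fields are updated from the same corner EMFs, preserving div(B) exactly."""
    for i in range(nx + 1):
        for j in range(ny):
            Bx[i][j] -= dt / dy * (Ez[i][j + 1] - Ez[i][j])
    for i in range(nx):
        for j in range(ny + 1):
            By[i][j] += dt / dx * (Ez[i + 1][j] - Ez[i][j])

# Start from a divergence-free (zero) field and apply a random corner EMF:
# the discrete divergence must remain zero to round-off.
nx = ny = 8
dx = dy = 1.0
rng = random.Random(0)
Bx = [[0.0] * ny for _ in range(nx + 1)]
By = [[0.0] * (ny + 1) for _ in range(nx)]
Ez = [[rng.uniform(-1.0, 1.0) for _ in range(ny + 1)] for _ in range(nx + 1)]
ct_update(Bx, By, Ez, nx, ny, dx, dy, dt=0.1)
div = divergence(Bx, By, nx, ny, dx, dy)
```

    In a production scheme the corner EMFs are themselves assembled from the Riemann-solver fluxes (HLLC/HLLD in this code), but the cancellation argument, and hence the divergence-free property, is independent of how Ez is computed.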

Top