FLYCHK Collisional-Radiative Code
National Institute of Standards and Technology Data Gateway
SRD 160: FLYCHK Collisional-Radiative Code (Web, free access). FLYCHK provides the capability to generate atomic level populations and charge state distributions for low-Z to mid-Z elements under NLTE conditions.
BART: Bayesian Atmospheric Radiative Transfer fitting code
NASA Astrophysics Data System (ADS)
Cubillos, Patricio; Blecic, Jasmina; Harrington, Joseph; Rojo, Patricio; Lust, Nate; Bowman, Oliver; Stemm, Madison; Foster, Andrew; Loredo, Thomas J.; Fortney, Jonathan; Madhusudhan, Nikku
2016-08-01
BART implements a Bayesian, Monte Carlo-driven, radiative-transfer scheme for extracting parameters from spectra of planetary atmospheres. BART combines a thermochemical-equilibrium code, a one-dimensional line-by-line radiative-transfer code, and the Multi-core Markov-chain Monte Carlo statistical module to constrain the atmospheric temperature and chemical-abundance profiles of exoplanets.
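The retrieval approach described above can be illustrated with a toy sampler. The following is a minimal sketch of Metropolis-style MCMC fitting a single parameter to a synthetic "spectrum"; it is not BART's actual MC3 module, and the model, step size, and values here are all illustrative assumptions:

```python
import math
import random

def log_likelihood(theta, data, sigma=1.0):
    """Gaussian log-likelihood for a toy model: y = theta * x."""
    return -0.5 * sum(((y - theta * x) / sigma) ** 2 for x, y in data)

def metropolis(data, n_steps=5000, step=0.1, seed=42):
    """Minimal Metropolis sampler for the single parameter theta."""
    rng = random.Random(seed)
    theta = 0.0
    logp = log_likelihood(theta, data)
    chain = []
    for _ in range(n_steps):
        proposal = theta + rng.gauss(0.0, step)
        logp_prop = log_likelihood(proposal, data)
        # Accept with probability min(1, exp(logp_prop - logp))
        if math.log(rng.random()) < logp_prop - logp:
            theta, logp = proposal, logp_prop
        chain.append(theta)
    return chain

# Synthetic data generated with a "true" theta of 2.0
data = [(x, 2.0 * x) for x in range(1, 6)]
chain = metropolis(data)
posterior = chain[1000:]  # discard burn-in
mean_theta = sum(posterior) / len(posterior)
```

After burn-in the chain samples the posterior of theta, so its mean and spread constrain the parameter; this is the same logic BART applies, at much larger scale, to atmospheric temperature and abundance profiles.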
MACRAD: A mass analysis code for radiators
Gallup, D.R.
1988-01-01
A computer code to estimate and optimize the mass of heat pipe radiators (MACRAD) is currently under development. A parametric approach is used in MACRAD, which allows the user to optimize radiator mass based on heat pipe length, length to diameter ratio, vapor to wick radius, radiator redundancy, etc. Full consideration of the heat pipe operating parameters, material properties, and shielding requirements is included in the code. Preliminary results obtained with MACRAD are discussed.
TORUS: Radiation transport and hydrodynamics code
NASA Astrophysics Data System (ADS)
Harries, Tim
2014-04-01
TORUS is a flexible radiation transfer and radiation-hydrodynamics code. Its basic infrastructure includes an adaptive mesh refinement (AMR) scheme used by several physics modules, including atomic line transfer in a moving medium, molecular line transfer, photoionization, radiation hydrodynamics, and radiative equilibrium. TORUS is useful for a variety of problems, including magnetospheric accretion onto T Tauri stars, spiral nebulae around Wolf-Rayet stars, discs around Herbig AeBe stars, structured winds of O supergiants, Raman-scattered line formation in symbiotic binaries, and dust emission and molecular line formation in star-forming clusters. The code is written in Fortran 2003 and is compiled using a standard GNU makefile. It is parallelized using both MPI and OpenMP, which can be used separately or in hybrid mode.
Airborne antenna radiation pattern code user's manual
NASA Technical Reports Server (NTRS)
Burnside, Walter D.; Kim, Jacob J.; Grandchamp, Brett; Rojas, Roberto G.; Law, Philip
1985-01-01
The use of a newly developed computer code to analyze the radiation patterns of antennas mounted on an ellipsoid and in the presence of a set of finite flat plates is described. It is shown how the code allows the user to simulate a wide variety of complex electromagnetic radiation problems using the ellipsoid/plates model. The code can calculate radiation patterns around an arbitrary conical cut specified by the user. The organization of the code, definitions of input and output data, and numerous practical examples are also presented. The analysis is based on the Uniform Geometrical Theory of Diffraction (UTD), and most of the computed patterns are compared with experimental results to show the accuracy of this solution.
2-DUST: Dust radiative transfer code
NASA Astrophysics Data System (ADS)
Ueta, Toshiya; Meixner, Margaret
2016-04-01
2-DUST is a general-purpose dust radiative transfer code for axisymmetric systems. It reveals the global energetics of dust grains in the shell and the 2-D projected morphologies of the shell, which depend strongly on the combined effects of the axisymmetric dust distribution and the inclination angle. It can be used to model a variety of axisymmetric astronomical dust systems.
An integrated radiation physics computer code system.
NASA Technical Reports Server (NTRS)
Steyn, J. J.; Harris, D. W.
1972-01-01
An integrated computer code system for the semi-automatic and rapid analysis of experimental and analytic problems in gamma photon and fast neutron radiation physics is presented. Such problems as the design of optimum radiation shields and radioisotope power source configurations may be studied. The system codes allow for the unfolding of complex neutron and gamma photon experimental spectra. Monte Carlo and analytic techniques are used for the theoretical prediction of radiation transport. The system includes a multichannel pulse-height analyzer scintillation and semiconductor spectrometer coupled to an on-line digital computer with appropriate peripheral equipment. The system is geometry generalized as well as self-contained with respect to material nuclear cross sections and the determination of the spectrometer response functions. Input data may be either analytic or experimental.
The Helicopter Antenna Radiation Prediction Code (HARP)
NASA Technical Reports Server (NTRS)
Klevenow, F. T.; Lynch, B. G.; Newman, E. H.; Rojas, R. G.; Scheick, J. T.; Shamansky, H. T.; Sze, K. Y.
1990-01-01
The first nine months' effort in the development of a user-oriented computer code, referred to as the HARP code, for analyzing the radiation from helicopter antennas is described. The HARP code uses modern computer graphics to aid in the description and display of the helicopter geometry. At low frequencies the helicopter is modeled by polygonal plates, and the method of moments is used to compute the desired patterns. At high frequencies the helicopter is modeled by a composite ellipsoid and flat plates, and computations are made using the geometrical theory of diffraction. The HARP code provides a user-friendly interface, employing modern computer graphics, to help the user describe the helicopter geometry, select the method of computation, construct the desired high- or low-frequency model, and display the results.
Radiation hydrodynamics integrated in the PLUTO code
NASA Astrophysics Data System (ADS)
Kolb, Stefan M.; Stute, Matthias; Kley, Wilhelm; Mignone, Andrea
2013-11-01
Aims: The transport of energy through radiation is very important in many astrophysical phenomena. In dynamical problems the time-dependent equations of radiation hydrodynamics have to be solved. We present a newly developed radiation-hydrodynamics module specifically designed for the versatile magnetohydrodynamic (MHD) code PLUTO. Methods: The solver is based on the flux-limited diffusion approximation in the two-temperature approach. All equations are solved in the co-moving frame in the frequency-independent (gray) approximation. The hydrodynamics is solved by the different Godunov schemes implemented in PLUTO, and for the radiation transport we use a fully implicit scheme. The resulting system of linear equations is solved either with the successive over-relaxation (SOR) method (for testing purposes) or with matrix solvers available in the PETSc library. We state the methodology in detail and describe several test cases that verify the correctness of our implementation. The solver works in standard coordinate systems (Cartesian, cylindrical, and spherical) and on non-equidistant grids. Results: The new radiation-hydrodynamics solver coupled to the MHD code PLUTO is a modern, versatile, and efficient module for treating complex radiation-hydrodynamical problems in astrophysics. As test cases, both purely radiative situations and full radiation-hydrodynamical setups (including radiative shocks and convection in accretion disks) were successfully studied. The new module scales very well on parallel computers using MPI. For problems in star or planet formation, we added the possibility of irradiation by a central source.
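For orientation, the two-temperature, gray flux-limited diffusion approach the abstract refers to typically evolves the radiation energy density $E$ and gas temperature $T$ with coupled equations of the following schematic form (a textbook-style sketch, not necessarily the paper's exact notation):

```latex
\frac{\partial E}{\partial t}
  = \nabla \cdot \left( \frac{c\,\lambda}{\kappa \rho}\, \nabla E \right)
  + \kappa \rho\, c \left( a_R T^4 - E \right),
\qquad
\rho c_V \frac{\partial T}{\partial t}
  = - \kappa \rho\, c \left( a_R T^4 - E \right),
```

where $\lambda$ is the flux limiter, $\kappa$ the opacity, $a_R$ the radiation constant, and $c_V$ the specific heat. The stiff coupling term $\kappa \rho c (a_R T^4 - E)$ is what makes the fully implicit scheme mentioned above necessary.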
ASIMUT on line radiative transfer code
NASA Astrophysics Data System (ADS)
Vandaele, A. C.; Neary, L.; Robert, S.; Letocart, V.; Giuranna, M.; Kasaba, Y.
2015-10-01
The CROSS DRIVE project aims to develop an innovative collaborative workspace infrastructure for space missions that will allow distributed scientific and engineering teams to collectively analyse and interpret scientific data as well as execute operations of planetary spacecraft. ASIMUT will be one of the tools that will be made available to the users. Here we describe this radiative transfer code and how it will be integrated into the virtual environment developed within CROSS DRIVE.
Validation of comprehensive space radiation transport code
Shinn, J.L.; Simonsen, L.C.; Cucinotta, F.A.
1998-12-01
The HZETRN code has been developed over the past decade to evaluate the local radiation fields within sensitive materials on spacecraft in the space environment. Most of the more important nuclear and atomic processes are now modeled, and evaluation within a complex spacecraft geometry with differing material components, including transition effects across boundaries of dissimilar materials, is included. The atomic/nuclear database and transport procedures have received limited validation in laboratory testing with high-energy ion beams. The codes have been applied in the design of the SAGE-III instrument, resulting in material changes to control injurious neutron production; in the study of Space Shuttle single-event upsets; and in validation against space measurements (particle telescopes, tissue-equivalent proportional counters, CR-39) on Shuttle and Mir. The present paper reviews the code development and presents recent results in laboratory and space flight validation.
THE MCNPX MONTE CARLO RADIATION TRANSPORT CODE
WATERS, LAURIE S.; MCKINNEY, GREGG W.; DURKEE, JOE W.; FENSIN, MICHAEL L.; JAMES, MICHAEL R.; JOHNS, RUSSELL C.; PELOWITZ, DENISE B.
2007-01-10
MCNPX (Monte Carlo N-Particle eXtended) is a general-purpose Monte Carlo radiation transport code with three-dimensional geometry and continuous-energy transport of 34 particles and light ions. It contains flexible source and tally options, interactive graphics, and support for both sequential and multi-processing computer platforms. MCNPX is based on MCNP4B and has been upgraded to most MCNP5 capabilities. MCNP is a highly stable code that tracks neutrons, photons, and electrons, using evaluated nuclear data libraries for low-energy interaction probabilities. MCNPX has extended this base to a comprehensive set of particles and light ions, with heavy-ion transport in development. Models have been included to calculate interaction probabilities when libraries are not available. Recent additions focus on the time evolution of residual nuclei decay, allowing calculation of transmutation and delayed particle emission. MCNPX is now a code of great dynamic range, and its excellent neutronics capabilities open new opportunities to simulate devices of interest to experimental particle physics, particularly calorimetry. This paper describes the capabilities of the current MCNPX version 2.6.C and discusses ongoing code development.
LPGS. Code System for Calculating Radiation Exposure
White, J.E.; Eckerman, K.F.
1983-01-01
LPGS was developed to calculate the radiological impacts resulting from radioactive releases to the hydrosphere. The name LPGS was derived from the Liquid Pathway Generic Study for which the original code was used primarily as an analytic tool in the assessment process. The hydrosphere is represented by the following types of water bodies: estuary, small river, well, lake, and one-dimensional (1-d) river. LPGS is designed to calculate radiation dose (individual and population) to body organs as a function of time for the various exposure pathways. The radiological consequences to the aquatic biota are estimated. Several simplified radionuclide transport models are employed with built-in formulations to describe the release rate of the radionuclides. A tabulated user-supplied release model can be input, if desired. Printer plots of dose versus time for the various exposure pathways are provided.
Advances in space radiation shielding codes
NASA Technical Reports Server (NTRS)
Wilson, John W.; Tripathi, Ram K.; Qualls, Garry D.; Cucinotta, Francis A.; Prael, Richard E.; Norbury, John W.; Heinbockel, John H.; Tweed, John; De Angelis, Giovanni
2002-01-01
Early space radiation shield code development relied on Monte Carlo methods and made important contributions to the space program. Monte Carlo methods have resorted to restricted one-dimensional problems leading to imperfect representation of appropriate boundary conditions. Even so, intensive computational requirements resulted and shield evaluation was made near the end of the design process. Resolving shielding issues usually had a negative impact on the design. Improved spacecraft shield design requires early entry of radiation constraints into the design process to maximize performance and minimize costs. As a result, we have been investigating high-speed computational procedures to allow shield analysis from the preliminary concept to the final design. For the last few decades, we have pursued deterministic solutions of the Boltzmann equation allowing field mapping within the International Space Station (ISS) in tens of minutes using standard Finite Element Method (FEM) geometry common to engineering design methods. A single ray trace in such geometry requires 14 milliseconds and limits application of Monte Carlo methods to such engineering models. A potential means of improving the Monte Carlo efficiency in coupling to spacecraft geometry is given.
Space Radiation Transport Code Development: 3DHZETRN
NASA Technical Reports Server (NTRS)
Wilson, John W.; Slaba, Tony C.; Badavi, Francis F.; Reddell, Brandon D.; Bahadori, Amir A.
2015-01-01
The space radiation transport code, HZETRN, has been used extensively for research, vehicle design optimization, risk analysis, and related applications. One of the simplifying features of the HZETRN transport formalism is the straight-ahead approximation, wherein all particles are assumed to travel along a common axis. This reduces the governing equation to one spatial dimension allowing enormous simplification and highly efficient computational procedures to be implemented. Despite the physical simplifications, the HZETRN code is widely used for space applications and has been found to agree well with fully 3D Monte Carlo simulations in many circumstances. Recent work has focused on the development of 3D transport corrections for neutrons and light ions (Z ≤ 2) for which the straight-ahead approximation is known to be less accurate. Within the development of 3D corrections, well-defined convergence criteria have been considered, allowing approximation errors at each stage in model development to be quantified. The present level of development assumes the neutron cross sections have an isotropic component treated within N explicit angular directions and a forward component represented by the straight-ahead approximation. The N = 1 solution refers to the straight-ahead treatment, while N = 2 represents the bi-directional model in current use for engineering design. The figure below shows neutrons, protons, and alphas for various values of N at locations in an aluminum sphere exposed to a solar particle event (SPE) spectrum. The neutron fluence converges quickly in simple geometry with N > 14 directions. The improved code, 3DHZETRN, transports neutrons, light ions, and heavy ions under space-like boundary conditions through general geometry while maintaining a high degree of computational efficiency. A brief overview of the 3D transport formalism for neutrons and light ions is given, and extensive benchmarking results with the Monte Carlo codes Geant4, FLUKA, and
Los Alamos radiation transport code system on desktop computing platforms
Briesmeister, J.F.; Brinkley, F.W.; Clark, B.A.; West, J.T.
1990-01-01
The Los Alamos Radiation Transport Code System (LARTCS) consists of state-of-the-art Monte Carlo and discrete ordinates transport codes and data libraries. These codes were originally developed many years ago and have undergone continual improvement. With a large initial effort and continued vigilance, the codes are easily portable from one type of hardware to another. The performance of scientific workstations (SWS) has evolved to the point that such platforms can be used routinely to perform sophisticated radiation transport calculations. As personal computer (PC) performance approaches that of the SWS, the hardware options for desktop radiation transport calculations expand considerably. The current status of the radiation transport codes within the LARTCS is described: MCNP, SABRINA, LAHET, ONEDANT, TWODANT, TWOHEX, and ONELD. Specifically, the authors discuss hardware systems on which the codes run and present code performance comparisons for various machines.
Description of Transport Codes for Space Radiation Shielding
NASA Technical Reports Server (NTRS)
Kim, Myung-Hee Y.; Wilson, John W.; Cucinotta, Francis A.
2011-01-01
This slide presentation describes transport codes and their use for studying and designing space radiation shielding. When combined with risk projection models, radiation transport codes serve as the main tool for studying radiation and designing shielding. There are three criteria for assessing the accuracy of transport codes: (1) ground-based studies with defined beams and material layouts, (2) inter-comparison of transport code results for matched boundary conditions, and (3) comparisons to flight measurements. NASA's HZETRN/QMSFRG meets these three criteria to a very high degree.
Radiation transport phenomena and modeling - part A: Codes
Lorence, L.J.
1997-06-01
The need to understand how particle radiation (high-energy photons and electrons) from a variety of sources affects materials and electronics has motivated the development of sophisticated computer codes that describe how radiation with energies from 1.0 keV to 100.0 GeV propagates through matter. Predicting radiation transport is the necessary first step in predicting radiation effects. The radiation transport codes that are described here are general-purpose codes capable of analyzing a variety of radiation environments including those produced by nuclear weapons (x-rays, gamma rays, and neutrons), by sources in space (electrons and ions) and by accelerators (x-rays, gamma rays, and electrons). Applications of these codes include the study of radiation effects on electronics, nuclear medicine (imaging and cancer treatment), and industrial processes (food disinfestation, waste sterilization, manufacturing). The primary focus will be on coupled electron-photon transport codes, with some brief discussion of proton transport. These codes model a radiation cascade in which electrons produce photons and vice versa. This coupling between particles of different types is important for radiation effects. For instance, in an x-ray environment, electrons are produced that drive the response in electronics. In an electron environment, dose due to bremsstrahlung photons can be significant once the source electrons have been stopped.
Radiation flux tables for ICRCCM using the GLA GCM radiation codes
NASA Technical Reports Server (NTRS)
HARSHVARDHAN
1986-01-01
Tabulated values of longwave and shortwave radiation fluxes, along with cooling and heating rates in the atmosphere for standard atmospheric profiles, are presented. The radiation codes used in the Goddard general circulation model were employed for the computations. These results were obtained for an international intercomparison project called the Intercomparison of Radiation Codes in Climate Models (ICRCCM).
Recent developments in the Los Alamos radiation transport code system
Forster, R.A.; Parsons, K.
1997-06-01
A brief progress report on updates to the Los Alamos Radiation Transport Code System (LARTCS) for solving criticality and fixed-source problems is provided. LARTCS integrates the Diffusion Accelerated Neutral Transport (DANT) discrete ordinates codes with the Monte Carlo N-Particle (MCNP) code. The LARTCS code is being developed with a graphical user interface for problem setup and analysis. Progress in the DANT system for criticality applications includes a two-dimensional module that can be linked to a mesh-generation code and a faster iteration scheme. Updates to MCNP Version 4A allow statistical checks of calculated Monte Carlo results.
Space radiator simulation manual for computer code
NASA Technical Reports Server (NTRS)
Black, W. Z.; Wulff, W.
1972-01-01
A computer program that simulates the performance of a space radiator is presented. The program basically consists of a rigorous analysis that treats a symmetrical fin panel and an approximate analysis that predicts system characteristics for cases of non-symmetrical operation. The rigorous analysis accounts for both transient and steady-state performance, including aerodynamic and radiant heating of the radiator system. The approximate analysis considers only steady-state operation with no aerodynamic heating. A description of the radiator system and instructions to the user for program operation are included. The input required for the execution of all program options is described, and several examples of program output are provided, including the radiator performance during ascent, reentry, and orbit.
Overview of HZETRN and BRNTRN Space Radiation Shielding Codes
NASA Technical Reports Server (NTRS)
Wilson, John W.; Cucinotta, F. A.; Shinn, J. L.; Simonsen, L. C.; Badavi, F. F.
1997-01-01
The NASA Radiation Health Program has supported basic research in radiation physics over the last decade to develop ionizing radiation transport codes and corresponding databases for the protection of astronauts from galactic and solar cosmic rays on future deep space missions. The codes describe the interactions of the incident radiations with shield materials, whose content is modified by atomic and nuclear reactions: high-energy heavy ions are fragmented into less massive reaction products, and secondary radiations are produced by direct knockout of shield constituents or as de-excitation products of the reactions. This defines the radiation fields to which specific devices are subjected onboard a spacecraft. Similar reactions occur in the device itself, which is the initiating event for the device response. An overview of the computational procedures and database, with some applications to photonic and data processing devices, is given.
The Continual Intercomparison of Radiation Codes: Results from Phase I
NASA Technical Reports Server (NTRS)
Oreopoulos, Lazaros; Mlawer, Eli; Delamere, Jennifer; Shippert, Timothy; Cole, Jason; Iacono, Michael; Jin, Zhonghai; Li, Jiangnan; Manners, James; Raisanen, Petri; Rose, Fred; Zhang, Yuanchong; Wilson Michael J.; Rossow, William
2011-01-01
The computer codes that calculate the energy budget of solar and thermal radiation in Global Climate Models (GCMs), our most advanced tools for predicting climate change, have to be computationally efficient in order not to impose an undue computational burden on climate simulations. By using approximations to gain execution speed, these codes sacrifice accuracy compared to more accurate, but also much slower, alternatives. International efforts to evaluate the approximate schemes have taken place in the past, but they have suffered from the drawback that the accurate standards were not themselves validated for performance. The manuscript summarizes the main results of the first phase of an effort called the "Continual Intercomparison of Radiation Codes" (CIRC), in which the cases chosen to evaluate the approximate models are based on observations and in which we have ensured that the accurate models perform well when compared to solar and thermal radiation measurements. The effort is endorsed by international organizations such as the GEWEX Radiation Panel and the International Radiation Commission and has a dedicated website (http://circ.gsfc.nasa.gov) where interested scientists can freely download data and obtain more information about the effort's modus operandi and objectives. A paper published in the March 2010 issue of the Bulletin of the American Meteorological Society provided only a brief overview of CIRC with some sample results; in this paper the analysis of submissions of 11 solar and 13 thermal infrared codes relative to accurate reference calculations obtained by so-called "line-by-line" radiation codes is much more detailed. We demonstrate that, while the performance of the approximate codes continues to improve, significant issues still remain to be addressed for satisfactory performance within GCMs. We hope that by identifying and quantifying shortcomings, the paper will help establish performance standards to objectively assess radiation code quality.
ACCELERATING HIGH-ENERGY PULSAR RADIATION CODES
Venter, C.; De Jager, O. C.
2010-12-20
Curvature radiation (CR) is believed to be a dominant mechanism for creating gamma-ray emission from pulsars and is emitted by relativistic particles that are constrained to move along curved magnetic field lines. Additionally, synchrotron radiation (SR) is expected to be radiated by both relativistic primaries (involving cyclotron resonant absorption of radio photons and re-emission of SR photons), or secondary electron-positron pairs (created by magnetic or photon-photon pair production processes involving CR gamma rays in the pulsar magnetosphere). When calculating these high-energy spectra, especially in the context of pulsar population studies where several millions of CR and SR spectra have to be generated, it is profitable to consider approximations that would save computational time without sacrificing too much accuracy. This paper focuses on one such approximation technique, and we show that one may gain significantly in computational speed while preserving the accuracy of the spectral results.
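As a concrete numerical anchor for the discussion above, the characteristic (critical) photon energy of curvature radiation follows the standard formula E_c = 3ℏcγ³/(2ρ_c). The sketch below evaluates it; the Lorentz factor and curvature radius are purely illustrative values, not numbers taken from this paper:

```python
HBAR = 1.054571817e-34  # reduced Planck constant, J*s
C = 2.99792458e8        # speed of light, m/s
EV = 1.602176634e-19    # joules per electron-volt

def curvature_critical_energy_ev(gamma, rho_c):
    """Critical photon energy (eV) of curvature radiation for a particle
    with Lorentz factor gamma on a field line of curvature radius rho_c (m)."""
    return 3.0 * HBAR * C * gamma**3 / (2.0 * rho_c) / EV

# Illustrative magnetospheric numbers: gamma ~ 1e7, rho_c ~ 1e6 m
e_c_gev = curvature_critical_energy_ev(1e7, 1e6) / 1e9  # ~0.3 GeV
```

The steep gamma^3 scaling is why generating millions of such spectra in population studies is expensive enough to motivate the approximation techniques the paper discusses.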
Acceleration of a Monte Carlo radiation transport code
Hochstedler, R.D.; Smith, L.M.
1996-03-01
Execution time for the Integrated TIGER Series (ITS) Monte Carlo radiation transport code has been reduced by careful re-coding of computationally intensive subroutines. Three test cases for the TIGER (1-D slab geometry), CYLTRAN (2-D cylindrical geometry), and ACCEPT (3-D arbitrary geometry) codes were identified and used to benchmark and profile program execution. Based upon these results, sixteen top time-consuming subroutines were examined and nine of them modified to accelerate computations with equivalent numerical output to the original. The results obtained via this study indicate that speedup factors of 1.90 for the TIGER code, 1.67 for the CYLTRAN code, and 1.11 for the ACCEPT code are achievable. © 1996 American Institute of Physics.
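Speedups of this kind are bounded by how much of the total runtime the modified subroutines account for, i.e. Amdahl's law. A minimal sketch (the fractions below are illustrative, not profiling data from ITS):

```python
def amdahl_speedup(p, s):
    """Overall speedup when a fraction p of the runtime is
    accelerated by a factor s; the remaining (1 - p) is unchanged."""
    return 1.0 / ((1.0 - p) + p / s)

# If, say, 80% of runtime sits in the re-coded subroutines and they
# run 2.5x faster, the whole code speeds up by less than 2x:
overall = amdahl_speedup(0.8, 2.5)  # ~1.92
```

This is consistent with the pattern reported above: codes whose hot spots dominate runtime (TIGER) gain far more from the same re-coding effort than codes with flatter profiles (ACCEPT).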
Stratospheric Relaxation in IMPACT's Radiation Code
Edis, T; Grant, K; Cameron-Smith, P
2006-11-13
While IMPACT incorporates diagnostic radiation routines from our work in previous years, it has not previously included the stratospheric relaxation required for forcing calculations. We have now implemented the necessary changes for stratospheric relaxation, tested its stability, and compared the results with stratospheric temperatures obtained from CAM3 met data. The relaxation results in stable temperature profiles in the stratosphere, which is encouraging for use in forcing calculations. It does, however, produce a cooling bias when compared to CAM3, which appears to be due to differences in radiation calculations rather than the interactive treatment of ozone. The cause of this bias is as yet unclear, but it seems to be systematic and hence cancels out when differences are taken relative to a control simulation.
A Radiation Shielding Code for Spacecraft and Its Validation
NASA Technical Reports Server (NTRS)
Shinn, J. L.; Cucinotta, F. A.; Singleterry, R. C.; Wilson, J. W.; Badavi, F. F.; Badhwar, G. D.; Miller, J.; Zeitlin, C.; Heilbronn, L.; Tripathi, R. K.
2000-01-01
The HZETRN code, which uses a deterministic approach pioneered at NASA Langley Research Center, has been developed over the past decade to evaluate the local radiation fields within sensitive materials (electronic devices and human tissue) on spacecraft in the space environment. The code describes the interactions of shield materials with the incident galactic cosmic rays, trapped protons, or energetic protons from solar particle events in free space and low Earth orbit. The content of incident radiations is modified by atomic and nuclear reactions with the spacecraft and radiation shield materials. High-energy heavy ions are fragmented into less massive reaction products, and reaction products are produced by direct knockout of shield constituents or from de-excitation products. An overview of the computational procedures and database which describe these interactions is given. Validation of the code with recent Monte Carlo benchmarks, and laboratory and flight measurement is also included.
VDSR: Virtual Detector for Synchrotron Radiation
Rykovanov, S. G.; Chen, M.; Geddes, C. G. R.; Schroeder, C. B.; Esarey, E.; Leemans, W. P.
2012-12-21
The Virtual Detector for Synchrotron Radiation (VDSR) is a parallel C++ code developed to calculate the incoherent radiation from a single charged particle or a beam moving in given external electromagnetic fields. In these proceedings, the code structure and features are introduced. An example of radiation generation from the betatron motion of a beam in the focusing fields of the wake in a laser-plasma accelerator is presented.
MORSE Monte Carlo radiation transport code system
Emmett, M.B.
1983-02-01
This report is an addendum to the MORSE report, ORNL-4972, originally published in 1975. This addendum contains descriptions of several modifications to the MORSE Monte Carlo code, replacement pages containing corrections, Part II of the report (previously unpublished), and a new table of contents. The modifications include a Klein-Nishina estimator for gamma rays. Use of such an estimator required changing the cross-section routines to process pair-production and Compton-scattering cross sections directly from ENDF tapes and writing a new version of subroutine RELCOL. Another modification is the use of free-form input for the SAMBO analysis data, which required changing subroutine SCORIN and adding the new subroutine RFRE. References are updated, and errors in the original report have been corrected.
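The Klein-Nishina estimator mentioned above scores gamma rays using the Klein-Nishina differential cross-section for Compton scattering. A self-contained sketch of that standard formula (textbook physics, not MORSE's actual estimator code):

```python
import math

R_E = 2.8179403262e-15  # classical electron radius, m

def klein_nishina(theta, eps):
    """Klein-Nishina differential cross-section dsigma/dOmega (m^2/sr):
    theta is the photon scattering angle, eps = E_photon / (m_e c^2)."""
    ratio = 1.0 / (1.0 + eps * (1.0 - math.cos(theta)))  # E'/E after scattering
    return 0.5 * R_E**2 * ratio**2 * (ratio + 1.0 / ratio - math.sin(theta)**2)

# In the low-energy limit (eps -> 0) this reduces to the Thomson
# cross-section (r_e^2 / 2)(1 + cos^2 theta), while forward scattering
# (theta = 0) gives r_e^2 regardless of photon energy.
forward = klein_nishina(0.0, 1.0)
```

An estimator of this kind lets each Compton collision contribute an analytically weighted score toward a detector, rather than waiting for a photon to scatter there by chance.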
Description of transport codes for space radiation shielding.
Kim, Myung-Hee Y; Wilson, John W; Cucinotta, Francis A
2012-11-01
Exposure to ionizing radiation in the space environment is one of the hazards faced by crews in space missions. As space radiations traverse spacecraft, habitat shielding, or tissues, their energies and compositions are altered by interactions with the shielding. Modifications to the radiation fields arise from atomic interactions of charged particles with orbital electrons and nuclear interactions leading to projectile and target fragmentation, including secondary particles such as neutrons, protons, mesons, and nuclear recoils. The transport of space radiation through shielding can be simulated using Monte Carlo techniques or deterministic solutions of the Boltzmann equation. To determine shielding requirements and to resolve radiation constraints for future human missions, the shielding evaluation of a spacecraft concept is required as an early step in the design process. To do this requires (1) accurate knowledge of space environmental models to define the boundary condition for transport calculations, (2) transport codes with detailed shielding and body geometry models to determine particle transmission into areas of internal shielding and at each critical body organ, and (3) the assessment of organ dosimetric quantities and biological risks by applying the corresponding response models for space radiation against the particle spectra that have been accurately determined from the transport code. This paper reviews current transport codes and analyzes their accuracy through comparison to laboratory and spaceflight data. This paper also introduces a probabilistic risk assessment approach for the evaluation of radiation shielding. PMID:23032892
Comparison of Space Radiation Calculations from Deterministic and Monte Carlo Transport Codes
NASA Technical Reports Server (NTRS)
Adams, J. H.; Lin, Z. W.; Nasser, A. F.; Randeniya, S.; Tripathi, r. K.; Watts, J. W.; Yepes, P.
2010-01-01
The presentation outline includes: motivation; the radiation transport codes considered (the main physics in HZETRN, UPROP, FLUKA, and Geant4); the space radiation cases considered (solar particle events and galactic cosmic rays); results for slab geometry; results for spherical geometry; and a summary.
Code for Analyzing and Designing Spacecraft Power System Radiators
NASA Technical Reports Server (NTRS)
Juhasz, Albert
2005-01-01
GPHRAD is a computer code for analysis and design of disk or circular-sector heat-rejecting radiators for spacecraft power systems. A specific application is for Stirling-cycle/linear-alternator electric-power systems coupled to radioisotope general-purpose heat sources. GPHRAD affords capabilities and options to account for thermophysical properties (thermal conductivity, density) of either metal-alloy or composite radiator materials.
The Continual Intercomparison of Radiation Codes: Results from Phase I
Oreopoulos, L.; Mlawer, Eli J.; Delamere, Jennifer; Shippert, Timothy R.; Cole, Jason; Fomin, Boris; Iacono, Michael J.; Jin, Zhonghai; Li, Jiangning; Manners, James; Raisanen, Petri; Rose, Fred; Zhang, Yuanchong; Wilson, Michael J.; Rossow, William B.
2012-01-01
We present results from Phase I of the Continual Intercomparison of Radiation Codes (CIRC), intended as an evolving and regularly updated reference source for evaluation of radiative transfer (RT) codes used in global climate models. CIRC differs from previous intercomparisons in that it relies on an observationally validated catalogue of cases. The seven CIRC Phase I baseline cases, five cloud-free and two with overcast liquid clouds, are built around observations by the Atmospheric Radiation Measurement (ARM) program that satisfy the goals of Phase I, namely to examine RT model performance in realistic, yet not overly complex, atmospheric conditions. Beyond the seven baseline cases, additional idealized "subcases" are also examined to facilitate interpretation of the causes of model errors. In addition to summarizing individual model performance with respect to reference line-by-line calculations and inter-model differences, we also highlight RT model behavior for conditions of doubled CO2, aspects of utilizing a spectral specification of surface albedo, and the impact of the inclusion of scattering in the thermal infrared. Our analysis suggests that RT models should work towards improving their calculation of diffuse shortwave flux, shortwave absorption, treatment of spectral surface albedo, and shortwave CO2 forcing. On the other hand, longwave calculations appear to be significantly closer to the reference results. By enhancing the range of conditions under which participating codes are tested, future CIRC phases will hopefully allow even more rigorous examination of RT code performance.
Radiative transfer code SHARM for atmospheric and terrestrial applications
NASA Astrophysics Data System (ADS)
Lyapustin, A. I.
2005-12-01
An overview of the publicly available radiative transfer Spherical Harmonics code (SHARM) is presented. SHARM is a rigorous code, as accurate as the Discrete Ordinate Radiative Transfer (DISORT) code, yet faster. It performs simultaneous calculations for different solar zenith angles, view zenith angles, and view azimuths and allows the user to make multiwavelength calculations in one run. The Δ-M method is implemented for calculations with highly anisotropic phase functions. Rayleigh scattering is automatically included as a function of wavelength, surface elevation, and the selected vertical profile of one of the standard atmospheric models. The current version of the SHARM code does not explicitly include atmospheric gaseous absorption, which should be provided by the user. The SHARM code has several built-in models of the bidirectional reflectance of land and wind-ruffled water surfaces that are most widely used in research and satellite data processing. A modification of the SHARM code with the built-in Mie algorithm designed for calculations with spherical aerosols is also described.
Space shuttle rendezvous, radiation and reentry analysis code
NASA Technical Reports Server (NTRS)
Mcglathery, D. M.
1973-01-01
A preliminary space shuttle mission design and analysis tool is reported, emphasizing versatility, flexibility, and user interaction through the use of a relatively small computer (IBM-7044). The Space Shuttle Rendezvous, Radiation and Reentry Analysis Code is used to perform mission and space radiation environmental analyses for four typical space shuttle missions. Also included is a version of the proposed Apollo/Soyuz rendezvous and docking test mission. Tangential-steering, circle-to-circle low-thrust tug orbit raising, and the effects of the trapped radiation environment on trajectory shaping due to solar electric power losses, are also features of this mission analysis code. The computational results include a parametric study of single-impulse versus double-impulse deorbiting for relatively low space shuttle orbits, as well as some definitive data on the magnetically trapped protons and electrons encountered on a particular mission.
Prototype demonstration of radiation therapy planning code system
Little, R.C.; Adams, K.J.; Estes, G.P.; Hughes, L.S. III; Waters, L.S.
1996-09-01
This is the final report of a one-year, Laboratory-Directed Research and Development project at the Los Alamos National Laboratory (LANL). Radiation therapy planning is the process by which a radiation oncologist plans a treatment protocol for a patient preparing to undergo radiation therapy. The objective is to develop a protocol that delivers sufficient radiation dose to the entire tumor volume, while minimizing dose to healthy tissue. Radiation therapy planning, as currently practiced in the field, suffers from inaccuracies in modeling patient anatomy and radiation transport. This project investigated the ability to automatically model patient-specific, three-dimensional (3-D) geometries in advanced Los Alamos radiation transport codes (such as MCNP), and to efficiently generate accurate radiation dose profiles in these geometries via sophisticated physics modeling. Modern scientific visualization techniques were utilized. The long-term goal is that such a system could be used by a non-expert in a distributed computing environment to help plan the treatment protocol for any candidate radiation source. The improved accuracy offered by such a system promises increased efficacy and reduced costs for this important aspect of health care.
The Continuous Intercomparison of Radiation Codes (CIRC): Phase I Cases
NASA Technical Reports Server (NTRS)
Oreopoulos, Lazaros; Mlawer, Eli; Delamere, Jennifer; Shippert, Timothy; Turner, David D.; Miller, Mark A.; Minnis, Patrick; Clough, Shepard; Barker, Howard; Ellingson, Robert
2007-01-01
CIRC aspires to be the successor to ICRCCM (Intercomparison of Radiation Codes in Climate Models). It is envisioned as an evolving and regularly updated reference source for GCM-type radiative transfer (RT) code evaluation, with the principal goal of contributing to the improvement of RT parameterizations. CIRC is jointly endorsed by DOE's Atmospheric Radiation Measurement (ARM) program and the GEWEX Radiation Panel (GRP). CIRC's goal is to provide test cases for which GCM RT algorithms should be performing at their best, i.e., well-characterized clear-sky and homogeneous, overcast cloudy cases. What distinguishes CIRC from previous intercomparisons is that its pool of cases is based on observed datasets. The bulk of the atmospheric and surface input, as well as the radiative fluxes, comes from ARM observations as documented in the Broadband Heating Rate Profile (BBHRP) product. BBHRP also provides reference calculations from AER's RRTM RT algorithms that can be used to select an optimal set of cases and to provide a first-order estimate of our ability to achieve radiative flux closure given the limitations in our knowledge of the atmospheric state.
NERO- a post-maximum supernova radiation transport code
NASA Astrophysics Data System (ADS)
Maurer, I.; Jerkstrand, A.; Mazzali, P. A.; Taubenberger, S.; Hachinger, S.; Kromer, M.; Sim, S.; Hillebrandt, W.
2011-12-01
The interpretation of supernova (SN) spectra is essential for deriving SN ejecta properties such as density and composition, which in turn can tell us about their progenitors and the explosion mechanism. A very large number of atomic processes are important for spectrum formation. Several tools for calculating SN spectra exist, but they mainly focus on the very early or late epochs. The intermediate phase, which requires a non-local thermodynamic equilibrium (NLTE) treatment of radiation transport, has rarely been studied. In this paper, we present a new SN radiation transport code, NERO, designed for those epochs. All the atomic processes are treated in full NLTE, under a steady-state assumption. This is a valid approach between roughly 50 and 500 days after the explosion, depending on SN type. This covers the post-maximum photospheric phase and the early and intermediate nebular phases. As a test, we compare NERO to the radiation transport code of Jerkstrand, Fransson & Kozma and to the nebular code of Mazzali et al. All three codes have been developed independently, and a comparison provides a valuable opportunity to investigate their reliability. Currently, NERO is one-dimensional and can be used for predicting spectra of synthetic explosion models or for deriving SN properties by spectral modelling. To demonstrate this, we study the spectra of the 'normal' Type Ia supernova (SN Ia) 2005cf between 50 and 350 days after the explosion and identify most of the common SN Ia line features at post-maximum epochs.
A Radiation Solver for the National Combustion Code
NASA Technical Reports Server (NTRS)
Sockol, Peter M.
2015-01-01
A methodology is given that converts an existing finite-volume radiative transfer method, which requires input of local absorption coefficients, to one that can treat a mixture of combustion gases and compute the coefficients on the fly from the local mixture properties. The full-spectrum k-distribution method is used to transform the radiative transfer equation (RTE) to an alternate wavenumber variable, g. The coefficients in the transformed equation are calculated at discrete temperatures and participating-species mole fractions that span the values of the problem for each value of g. These results are stored in a table, and interpolation is used to find the coefficients at every cell in the field. Finally, the transformed RTE is solved for each g, and Gaussian quadrature is used to find the radiant heat flux throughout the field. The present implementation is in an existing Cartesian/cylindrical-grid radiative transfer code, and the local mixture properties are given by a solution of the National Combustion Code (NCC) on the same grid. Based on this work, the intention is to apply this method to an existing unstructured-grid radiation code which can then be coupled directly to NCC.
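The reordering into the cumulative variable g and the quadrature step described above can be illustrated with a toy, hypothetical absorption spectrum; this sketches the general full-spectrum k-distribution idea for a homogeneous path, not the NCC implementation itself:

```python
import numpy as np

# Toy, hypothetical absorption-coefficient spectrum across one band
rng = np.random.default_rng(0)
kappa = rng.lognormal(mean=0.0, sigma=2.0, size=4096)

def transmittance_lbl(u):
    """Band-averaged transmittance by brute-force line-by-line averaging."""
    return np.exp(-kappa * u).mean()

def transmittance_kdist(u, n_quad=32):
    """Same quantity via the k-distribution: sort kappa into the smooth
    cumulative variable g, interpolate k(g) at Gauss-Legendre nodes, and
    apply the quadrature weights (exact for a homogeneous path, up to
    quadrature and interpolation error)."""
    k_of_g = np.sort(kappa)                       # k as a function of g
    g = (np.arange(kappa.size) + 0.5) / kappa.size
    x, w = np.polynomial.legendre.leggauss(n_quad)
    g_nodes = 0.5 * (x + 1.0)                     # map [-1, 1] -> (0, 1)
    k_nodes = np.interp(g_nodes, g, k_of_g)
    return 0.5 * np.sum(w * np.exp(-k_nodes * u))
```

The payoff is that a handful of quadrature points in g replaces thousands of spectral points, which is what makes the table lookup per cell affordable.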
Towards a 3D Space Radiation Transport Code
NASA Technical Reports Server (NTRS)
Wilson, J. W.; Tripathi, R. K.; Cucinotta, F. A.; Heinbockel, J. H.; Tweed, J.
2002-01-01
High-speed computational procedures for space radiation shielding have relied on asymptotic expansions in terms of the off-axis scatter and replacement of the general geometry problem by a collection of flat plates. This type of solution was derived for application to human-rated systems in which the radius of the shielded volume is large compared to the off-axis diffusion limiting leakage at lateral boundaries. Over the decades these computational codes have become relatively complete, and lateral diffusion effects are now being added. The analysis for developing a practical full 3D space shielding code is presented.
Development of the 3DHZETRN code for space radiation protection
NASA Astrophysics Data System (ADS)
Wilson, John; Badavi, Francis; Slaba, Tony; Reddell, Brandon; Bahadori, Amir; Singleterry, Robert
Space radiation protection requires computationally efficient shield assessment methods that have been verified and validated. The HZETRN code is the engineering design code used for low Earth orbit dosimetric analysis and astronaut record keeping with end-to-end validation to twenty percent in Space Shuttle and International Space Station operations. HZETRN treated diffusive leakage only at the distal surface limiting its application to systems with a large radius of curvature. A revision of HZETRN that included forward and backward diffusion allowed neutron leakage to be evaluated at both the near and distal surfaces. That revision provided a deterministic code of high computational efficiency that was in substantial agreement with Monte Carlo (MC) codes in flat plates (at least to the degree that MC codes agree among themselves). In the present paper, the 3DHZETRN formalism capable of evaluation in general geometry is described. Benchmarking will help quantify uncertainty with MC codes (Geant4, FLUKA, MCNP6, and PHITS) in simple shapes such as spheres within spherical shells and boxes. Connection of the 3DHZETRN to general geometry will be discussed.
Documentation of the detailed radiation property data for the radiation-ablation code RASLE
NASA Technical Reports Server (NTRS)
Henline, William D.
1991-01-01
This report is a documentation of the necessary radiation property input data for the radiating shock layer simulation code RASLE. The tabulated data are required to simulate systems which are composed of oxygen, nitrogen, carbon, hydrogen, and silicon. These data are needed to compute the flowfield effects of many practical ablative, hypersonic vehicle heat shield materials. A brief outline description is provided for the RASLE code. A more detailed discussion is provided for the RASLE code non-grey gas spectral radiation model. This model is related to the required radiation property data which are tabulated at the end of the report. Other correlations needed for the RASLE simulations are not discussed, since these are automatically included in the program and no input data are required.
A nonlocal electron conduction model for multidimensional radiation hydrodynamics codes
NASA Astrophysics Data System (ADS)
Schurtz, G. P.; Nicolaï, Ph. D.; Busquet, M.
2000-10-01
Numerical simulation of laser-driven Inertial Confinement Fusion (ICF) related experiments requires the use of large multidimensional hydro codes. Though these codes include detailed physics for numerous phenomena, they deal poorly with electron conduction, which is the leading energy transport mechanism of these systems. Electron heat flow has been known, since the work of Luciani, Mora, and Virmont (LMV) [Phys. Rev. Lett. 51, 1664 (1983)], to be a nonlocal process, which the local Spitzer-Harm theory, even flux limited, is unable to account for. The present work aims at extending the original formula of LMV to two or three dimensions of space. This multidimensional extension leads to an equivalent transport equation suitable for easy implementation in a two-dimensional radiation-hydrodynamics code. Simulations are presented and compared to Fokker-Planck simulations in one and two dimensions of space.
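In one dimension, an LMV-type flux can be sketched as a convolution of the local Spitzer-Harm flux with a normalized exponential kernel; the sketch below assumes a constant delocalization length lam for simplicity (in the actual LMV formula the kernel width varies with the local mean free path):

```python
import numpy as np

def nonlocal_flux(x, q_sh, lam):
    """Sketch of an LMV-type nonlocal heat flux in 1-D: the local
    Spitzer-Harm flux q_sh is smeared by a normalized exponential kernel
    whose width lam (assumed constant here) plays the role of the
    delocalization length, proportional to the electron mean free path."""
    dx = x[1] - x[0]
    q_nl = np.empty_like(q_sh)
    for i, xi in enumerate(x):
        w = np.exp(-np.abs(x - xi) / lam) / (2.0 * lam)
        w /= np.sum(w) * dx          # renormalize on the finite grid
        q_nl[i] = np.sum(w * q_sh) * dx
    return q_nl
```

A spatially uniform q_sh is returned unchanged, while a sharply peaked flux is spread over a few lam, which is the qualitative signature of nonlocal transport that a flux limiter cannot reproduce.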
Validation of a comprehensive space radiation transport code.
Shinn, J L; Cucinotta, F A; Simonsen, L C; Wilson, J W; Badavi, F F; Badhwar, G D; Miller, J; Zeitlin, C; Heilbronn, L; Tripathi, R K; Clowdsley, M S; Heinbockel, J H; Xapsos, M A
1998-12-01
The HZETRN code has been developed over the past decade to evaluate the local radiation fields within sensitive materials on spacecraft in the space environment. Most of the more important nuclear and atomic processes are now modeled and evaluation within a complex spacecraft geometry with differing material components, including transition effects across boundaries of dissimilar materials, are included. The atomic/nuclear database and transport procedures have received limited validation in laboratory testing with high energy ion beams. The codes have been applied in design of the SAGE-III instrument resulting in material changes to control injurious neutron production, in the study of the Space Shuttle single event upsets, and in validation with space measurements (particle telescopes, tissue equivalent proportional counters, CR-39) on Shuttle and Mir. The present paper reviews the code development and presents recent results in laboratory and space flight validation. PMID:11542474
New Parallel computing framework for radiation transport codes
Kostin, M.A.; Mokhov, N.V.; Niita, K.; /JAERI, Tokai
2010-09-01
A new parallel computing framework has been developed for use with general-purpose radiation transport codes. The framework was implemented as a C++ module that uses MPI for message passing. The module is largely independent of the radiation transport codes it can be used with, and is connected to the codes by means of a number of interface functions. The framework was integrated with the MARS15 code, and an effort is under way to deploy it in PHITS. Besides the parallel computing functionality, the framework offers a checkpoint facility that allows restarting calculations from a saved checkpoint file. The checkpoint facility can be used in single-process calculations as well as in the parallel regime. Several checkpoint files can be merged into one, thus combining the results of several calculations. The framework also corrects some of the known problems with scheduling and load balancing found in the original implementations of the parallel computing functionality in MARS15 and PHITS. The framework can be used efficiently on homogeneous systems and on networks of workstations, where interference from other users is possible.
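The checkpoint-merging idea can be illustrated schematically. The data layout below is hypothetical (the framework's actual file format is not described in the abstract), but merging independent Monte Carlo runs reduces to summing the history counts and the running sums of each tally and its square:

```python
def merge_checkpoints(checkpoints):
    """Merge Monte Carlo tally checkpoints.  Each checkpoint is assumed
    (hypothetical layout) to store the history count n and the running
    sums of a tally and of its square; the pooled mean and the standard
    error of the mean follow directly from the summed accumulators."""
    n = sum(c["n"] for c in checkpoints)
    s = sum(c["sum"] for c in checkpoints)
    s2 = sum(c["sum_sq"] for c in checkpoints)
    mean = s / n
    var_of_mean = (s2 / n - mean**2) / (n - 1)   # variance of the mean
    return {"n": n, "mean": mean, "std_err": var_of_mean**0.5}
```

Because only sums are stored, merging is associative: combining several checkpoint files in any order yields the same pooled statistics as one long run.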
History of one family of atmospheric radiative transfer codes
NASA Astrophysics Data System (ADS)
Anderson, Gail P.; Wang, Jinxue; Hoke, Michael L.; Kneizys, F. X.; Chetwynd, James H., Jr.; Rothman, Laurence S.; Kimball, L. M.; McClatchey, Robert A.; Shettle, Eric P.; Clough, Shepard; Gallery, William O.; Abreu, Leonard W.; Selby, John E. A.
1994-12-01
Beginning in the early 1970s, the then Air Force Cambridge Research Laboratory initiated a program to develop computer-based atmospheric radiative transfer algorithms. The first attempts were translations of graphical procedures described in a 1970 report on The Optical Properties of the Atmosphere, based on empirical transmission functions and effective absorption coefficients derived primarily from controlled laboratory transmittance measurements. The fact that spectrally averaged atmospheric transmittance T does not obey the Beer-Lambert law (T = exp(-σ·η), where σ is a species absorption cross section, independent of η, the species column amount along the path) at any but the finest spectral resolution was already well known. Band models to describe this gross behavior were developed in the 1950s and 60s. Thus began LOWTRAN, the Low Resolution Transmittance Code, first released in 1972. This limited initial effort has now progressed to a set of codes and related algorithms (including line-of-sight spectral geometry, direct and scattered radiance and irradiance, non-local thermodynamic equilibrium, etc.) that contain thousands of coding lines, hundreds of subroutines, and improved accuracy, efficiency, and, ultimately, accessibility. This review will include LOWTRAN, HITRAN (atlas of high-resolution molecular spectroscopic data), FASCODE (Fast Atmospheric Signature Code), and MODTRAN (Moderate Resolution Transmittance Code), their permutations, validations, and applications, particularly as related to passive remote sensing and energy deposition.
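The cited failure of the Beer-Lambert law for spectrally averaged transmittance is easy to demonstrate: averaging exp(-kappa*u) over a band containing lines of very different strength gives a T(u) for which T(2u) is not T(u)**2. A minimal illustration with toy coefficients (not LOWTRAN band-model data):

```python
import numpy as np

# Two "lines" of very different strength inside one spectral band
kappa = np.array([0.1, 10.0])   # toy absorption coefficients

def band_T(u):
    """Spectrally averaged transmittance for column amount u."""
    return np.exp(-kappa * u).mean()
```

Because the average of exponentials is not the exponential of an average, band_T(2*u) greatly exceeds band_T(u)**2: the weak line keeps transmitting long after the strong line has saturated, which is exactly the behavior the 1950s-60s band models were built to capture.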
A model code for the radiative theta pinch
Lee, S.; Saw, S. H.; Lee, P. C. K.; Akel, M.; Damideh, V.; Khattak, N. A. D.; Mongkolnavin, R.; Paosawatyanyong, B.
2014-07-15
A model for the theta pinch is presented with three modelled phases: a radial inward shock phase, a reflected shock phase, and a final pinch phase. The governing equations for the phases are derived, incorporating thermodynamics, radiation, and radiation-coupled dynamics in the pinch phase. A code is written incorporating corrections for the effects of transit delays of small disturbance speeds and the effects of plasma self-absorption on the radiation. Two model parameters are incorporated into the model: the coupling coefficient f between the primary loop current and the induced plasma current, and the mass swept-up factor f_m. These values are taken from experiments carried out on the Chulalongkorn theta pinch.
3D unstructured-mesh radiation transport codes
Morel, J.
1997-12-31
Three unstructured-mesh radiation transport codes are currently being developed at Los Alamos National Laboratory. The first code is ATTILA, which uses an unstructured tetrahedral mesh in conjunction with standard S_n (discrete-ordinates) angular discretization, standard multigroup energy discretization, and linear-discontinuous spatial differencing. ATTILA solves the standard first-order form of the transport equation using source iteration in conjunction with diffusion-synthetic acceleration of the within-group source iterations. ATTILA is designed to run primarily on workstations. The second code is DANTE, which uses a hybrid finite-element mesh consisting of arbitrary combinations of hexahedra, wedges, pyramids, and tetrahedra. DANTE solves several second-order self-adjoint forms of the transport equation, including the even-parity equation, the odd-parity equation, and a new equation called the self-adjoint angular flux equation. DANTE also offers three angular discretization options: S_n (discrete-ordinates), P_n (spherical harmonics), and SP_n (simplified spherical harmonics). DANTE is designed to run primarily on massively parallel message-passing machines, such as the ASCI-Blue machines at LANL and LLNL. The third code is PERICLES, which uses the same hybrid finite-element mesh as DANTE, but solves the standard first-order form of the transport equation rather than a second-order self-adjoint form. PERICLES uses a standard S_n discretization in angle in conjunction with trilinear-discontinuous spatial differencing, and diffusion-synthetic acceleration of the within-group source iterations. PERICLES was initially designed to run on workstations, but a version for massively parallel message-passing machines will be built. The three codes will be described in detail and computational results will be presented.
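The source-iteration strategy these codes share can be sketched in one dimension. The toy solver below uses a step (upwind) spatial scheme and plain source iteration, omitting the diffusion-synthetic acceleration the production codes use, so it illustrates only the iteration itself:

```python
import numpy as np

def source_iteration(nx=50, width=10.0, sigma_t=1.0, c=0.5, n_mu=8,
                     q=1.0, tol=1e-8, max_it=500):
    """Toy 1-D slab S_n solver: isotropic scattering with scattering
    ratio c, uniform fixed source q, vacuum boundaries, step (upwind)
    spatial scheme.  Plain source iteration, no acceleration."""
    dx = width / nx
    mu, w = np.polynomial.legendre.leggauss(n_mu)   # S_n angles, weights
    phi = np.zeros(nx)                              # scalar flux
    for _ in range(max_it):
        src = 0.5 * (c * sigma_t * phi + q)         # isotropic source
        phi_new = np.zeros(nx)
        for m in range(n_mu):
            a = abs(mu[m]) / dx
            psi_in = 0.0                            # vacuum inflow
            cells = range(nx) if mu[m] > 0 else range(nx - 1, -1, -1)
            for i in cells:                         # sweep along mu
                psi = (src[i] + a * psi_in) / (sigma_t + a)
                psi_in = psi
                phi_new[i] += w[m] * psi
        if np.max(np.abs(phi_new - phi)) < tol:
            return phi_new
        phi = phi_new
    return phi
```

Deep inside the slab the iterate approaches the infinite-medium value q/(sigma_t*(1 - c)), here 2.0, while the vacuum boundaries depress the edge flux; as c approaches 1, unaccelerated source iteration converges ever more slowly, which is exactly why the DSA schemes mentioned above matter.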
Recent radiation damage studies and developments of the Marlowe code
NASA Astrophysics Data System (ADS)
Ortiz, C. J.; Souidi, A.; Becquart, C. S.; Domain, C.; Hou, M.
2014-07-01
Radiation damage in materials relevant to applications evolves over time scales spanning from the femtosecond (the characteristic time for an atomic collision) to decades (the aging time expected for nuclear materials). The relevant kinetic energies of atoms span from thermal motion to the MeV range. The question motivating this contribution is to identify the relationship between elementary atomic displacements triggered by irradiation and the subsequent microstructural evolution of metals in the long term. The Marlowe code, based on the binary collision approximation (BCA), is used to simulate the sequences of atomic displacements generated by energetic primary recoils, and the Object Kinetic Monte Carlo code LAKIMOCA, parameterized on a range of ab initio calculations, is used to predict the subsequent long-term evolution of point defects and clusters thereof. In agreement with full Molecular Dynamics, BCA displacement cascades in body-centered cubic (BCC) Fe and a face-centered cubic (FCC) Fe-Ni-Cr alloy display recursive properties that are found useful for predictions in the long term. The case of defect evolution in W due to external irradiation with energetic H and He is also discussed. To this purpose, it was useful to extend the inelastic energy loss model available in Marlowe up to the Bethe regime. The last version of the Marlowe code (version 15) was delivered before message-passing software standards (such as MPI) were available, but the structure of the code was designed in such a way as to permit parallel execution within a distributed-memory environment. This makes it possible to obtain N different cascades simultaneously using N independent nodes without any communication between processors. The parallelization of the code using MPI was recently achieved by one author of this report (C.J.O.). Typically, the parallelized version of Marlowe allows simulating millions of displacement cascades using a limited number of processors (<64) within only
Status of the MORSE multigroup Monte Carlo radiation transport code
Emmett, M.B.
1993-06-01
There are two versions of the MORSE multigroup Monte Carlo radiation transport computer code system at Oak Ridge National Laboratory. MORSE-CGA is the most well-known and has undergone extensive use for many years. MORSE-SGC was originally developed in about 1980 in order to restructure the cross-section handling and thereby save storage. However, with the advent of new computer systems having much larger storage capacity, that aspect of SGC has become unnecessary. Both versions use data from multigroup cross-section libraries, although in somewhat different formats. MORSE-SGC is the version of MORSE that is part of the SCALE system, but it can also be run stand-alone. Both CGA and SGC use the Multiple Array System (MARS) geometry package. In the last six months the main focus of the work on these two versions has been on making them operational on workstations, in particular, the IBM RISC 6000 family. A new version of SCALE for workstations is being released to the Radiation Shielding Information Center (RSIC). MORSE-CGA, Version 2.0, is also being released to RSIC. Both SGC and CGA have undergone other revisions recently. This paper reports on the current status of the MORSE code system.
Operation of the helicopter antenna radiation prediction code
NASA Technical Reports Server (NTRS)
Braeden, E. W.; Klevenow, F. T.; Newman, E. H.; Rojas, R. G.; Sampath, K. S.; Scheik, J. T.; Shamansky, H. T.
1993-01-01
HARP is a front end as well as a back end for the AMC and NEWAIR computer codes. These codes use the Method of Moments (MM) and the Uniform Geometrical Theory of Diffraction (UTD), respectively, to calculate the electromagnetic radiation patterns for antennas on aircraft. The major difficulty in using these codes is in the creation of proper input files for particular aircraft and in verifying that these files are, in fact, what is intended. HARP creates these input files in a consistent manner and allows the user to verify them for correctness using sophisticated 2D and 3D graphics. After antenna field patterns are calculated using either MM or UTD, HARP can display the results on the user's screen or provide hardcopy output. Because the process of collecting data, building the 3D models, and obtaining the calculated field patterns is completely automated by HARP, the researcher's productivity can be many times what it could be if these operations had to be done by hand. A complete, step-by-step guide is provided so that the researcher can quickly learn to make use of all the capabilities of HARP.
NASA Astrophysics Data System (ADS)
Artyomov, K. P.; Ryzhov, V. V.; Naumenko, G. A.; Shevelev, M. V.
2012-05-01
Different types of polarization radiation generated by a relativistic electron beam are simulated using the fully electromagnetic particle-in-cell (PIC) code KARAT. The simulation results for diffraction radiation, transition radiation, Smith-Purcell radiation, and Vavilov-Cherenkov radiation are in good agreement with experimental data and analytical models. Modern PIC simulation is a good tool to check and predict experimental results.
VISRAD, 3-D Target Design and Radiation Simulation Code
NASA Astrophysics Data System (ADS)
Li, Yingjie; Macfarlane, Joseph; Golovkin, Igor
2015-11-01
The 3-D view factor code VISRAD is widely used in designing HEDP experiments at major laser and pulsed-power facilities, including NIF, OMEGA, OMEGA-EP, ORION, LMJ, Z, and PLX. It simulates target designs by generating a 3-D grid of surface elements, utilizing a variety of 3-D primitives and surface removal algorithms, and can be used to compute the radiation flux throughout the surface element grid by computing element-to-element view factors and solving power balance equations. Target set-up and beam pointing are facilitated by allowing users to specify positions and angular orientations using a variety of coordinate systems (e.g., that of any laser beam, target component, or diagnostic port). Analytic modeling of laser beam spatial profiles for OMEGA DPPs and NIF CPPs is used to compute laser intensity profiles throughout the grid of surface elements. We will discuss recent improvements to the software package and plans for future developments.
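The element-to-element power balance such a view factor code solves can be illustrated for a small gray-diffuse enclosure; here the view-factor matrix is prescribed by hand rather than computed from a 3-D surface grid, so this is only a sketch of the radiosity step:

```python
import numpy as np

SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

def net_flux(F, eps, T):
    """Gray-diffuse enclosure power balance: solve the radiosity system
    J = eps*SIGMA*T^4 + (1 - eps) * F @ J for the surface radiosities J,
    then return the net radiative flux leaving each surface,
    q = J - F @ J (W m^-2).  F[i][j] is the view factor i -> j."""
    F = np.asarray(F, float)
    eps = np.asarray(eps, float)
    T = np.asarray(T, float)
    emitted = eps * SIGMA * T**4
    n = T.size
    J = np.linalg.solve(np.eye(n) - (1.0 - eps)[:, None] * F, emitted)
    return J - F @ J
```

For two infinite parallel plates (F = [[0, 1], [1, 0]]) this reproduces the textbook result q = SIGMA*(T1**4 - T2**4) / (1/eps1 + 1/eps2 - 1), a standard check for any view-factor solver.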
NASA Technical Reports Server (NTRS)
Salazar, Giovanni; Droba, Justin C.; Oliver, Brandon; Amar, Adam J.
2016-01-01
With the recent development of multi-dimensional thermal protection system (TPS) material response codes, the capability to account for radiative heating is a requirement. This paper presents recent efforts to implement such capabilities in the CHarring Ablator Response (CHAR) code developed at NASA's Johnson Space Center. This work also describes the different numerical methods implemented in the code to compute view factors for radiation problems involving multiple surfaces. Furthermore, verification and validation of the code's radiation capabilities are demonstrated by comparing solutions to analytical results, to other codes, and to radiant test data.
HELIOS: A new open-source radiative transfer code
NASA Astrophysics Data System (ADS)
Malik, Matej; Grosheintz, Luc; Grimm, Simon Lukas; Mendonça, João; Kitzmann, Daniel; Heng, Kevin
2015-12-01
We present the new open-source code HELIOS, developed to accurately describe radiative transfer in a wide variety of irradiated atmospheres. We employ a one-dimensional multi-wavelength two-stream approach with scattering. Written in CUDA C++, HELIOS exploits the GPU's potential for massive parallelization and is able to compute the TP-profile of an atmosphere in radiative equilibrium and the subsequent emission spectrum in a few minutes on a single computer (for 60 layers and 1000 wavelength bins). The required molecular opacities are obtained with the recently published code HELIOS-K [1], which calculates the line shapes from an input line list and resamples the numerous line-by-line data into a manageable k-distribution format. Based on simple equilibrium chemistry theory [2] we combine the k-distribution functions of the molecules H2O, CO2, CO & CH4 to generate a k-table, which we then employ in HELIOS. We present our results on the following: (i) various numerical tests, e.g. isothermal vs. non-isothermal treatment of layers; (ii) comparison of iteratively determined TP-profiles with their analytical parametric prescriptions [3] and of the corresponding spectra; (iii) benchmarks of TP-profiles & spectra for various elemental abundances; (iv) benchmarks of averaged TP-profiles & spectra for the exoplanets GJ1214b, HD189733b & HD209458b; (v) comparison with secondary eclipse data for HD189733b, XO-1b & CoRoT-2b. HELIOS is being developed, together with the dynamical core THOR and the chemistry solver VULCAN, in the group of Kevin Heng at the University of Bern as part of the Exoclimes Simulation Platform (ESP) [4], an open-source project aimed at providing community tools to model exoplanetary atmospheres.
[1] Grimm & Heng 2015, ArXiv, 1503.03806
[2] Heng, Lyons & Tsai, ArXiv, 1506.05501; Heng & Lyons, ArXiv, 1507.01944
[3] e.g. Heng, Mendonca & Lee, 2014, ApJS, 215, 4H
[4] exoclime.net
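HELIOS itself solves a two-stream problem with scattering; as a much simpler illustration of how an emission spectrum follows from a layered TP-profile, the sketch below sums the Planck emission of isothermal layers in the pure-absorption, vertical-ray limit (all names are hypothetical and this is not the HELIOS algorithm):

```python
import math

H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck(wavelength_m, T):
    """Planck spectral radiance B_lambda in W m^-3 sr^-1."""
    x = H * C / (wavelength_m * KB * T)
    return 2.0 * H * C ** 2 / wavelength_m ** 5 / math.expm1(x)

def emergent_intensity(wavelength_m, layer_T, layer_dtau):
    """Top-of-atmosphere intensity from a stack of isothermal layers
    (index 0 = top), pure absorption/emission, vertical ray: each layer
    emits B*(1 - e^-dtau) and is attenuated by the optical depth above."""
    tau_above = 0.0
    intensity = 0.0
    for T, dtau in zip(layer_T, layer_dtau):
        intensity += planck(wavelength_m, T) * math.exp(-tau_above) \
                     * (1.0 - math.exp(-dtau))
        tau_above += dtau
    return intensity
```

In the optically thick isothermal limit the emergent intensity approaches the Planck function at the layer temperature, which makes a convenient sanity check.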
CODE's new solar radiation pressure model for GNSS orbit determination
NASA Astrophysics Data System (ADS)
Arnold, D.; Meindl, M.; Beutler, G.; Dach, R.; Schaer, S.; Lutz, S.; Prange, L.; Sośnica, K.; Mervart, L.; Jäggi, A.
2015-08-01
The Empirical CODE Orbit Model (ECOM) of the Center for Orbit Determination in Europe (CODE), which was developed in the early 1990s, is widely used in the International GNSS Service (IGS) community. For a rather long time, spurious spectral lines have been known to exist in geophysical parameters, in particular in the Earth Rotation Parameters (ERPs) and in the estimated geocenter coordinates, and these could recently be attributed to the ECOM. The effects grew gradually with the increasing influence of the GLONASS system in the CODE analysis, which has been based on a rigorous combination of GPS and GLONASS since May 2003. In a first step we show that the problems associated with the ECOM are to the largest extent caused by GLONASS, which reached full deployment by the end of 2011. GPS-only, GLONASS-only, and combined GPS/GLONASS solutions using the observations in the years 2009-2011 of a global network of 92 combined GPS/GLONASS receivers were analyzed for this purpose. In a second step we review direct solar radiation pressure (SRP) models for GNSS satellites. We demonstrate that only even-order short-period harmonic perturbations occur along the Sun-satellite direction for GPS and GLONASS satellites, and only odd-order perturbations along the direction perpendicular to both the Sun-satellite vector and the spacecraft's solar-panel axis. Based on this insight we assess in the third step the performance of four candidate orbit models for the future ECOM. The geocenter coordinates, the ERP differences w.r.t. the IERS 08 C04 series of ERPs, the misclosures for the midnight epochs of the daily orbital arcs, and scale parameters of Helmert transformations for station coordinates serve as quality criteria. The old and updated ECOM are validated in addition with satellite laser ranging (SLR) observations and by comparing the orbits to those of the IGS and other analysis centers. Based on all tests, we present a new extended ECOM which
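The harmonic structure the abstract describes (even-order terms along the Sun-satellite direction D, odd-order terms along the perpendicular direction B) can be written as a small evaluation routine. This is an illustrative sketch, not the actual extended ECOM; the coefficient names, dictionary layout, and magnitudes are assumptions:

```python
import math

def ecom_accel(du, d0, d_even, b0, b_odd):
    """Toy ECOM-style SRP acceleration: a constant plus periodic terms
    in the Sun-satellite argument of latitude du (radians).  Only
    even-order harmonics act along D and only odd-order harmonics
    along B, mirroring the perturbation analysis in the abstract.
    d_even / b_odd: {order: (cos_coef, sin_coef)} in m/s^2."""
    aD = d0 + sum(c * math.cos(n * du) + s * math.sin(n * du)
                  for n, (c, s) in d_even.items())
    aB = b0 + sum(c * math.cos(n * du) + s * math.sin(n * du)
                  for n, (c, s) in b_odd.items())
    return aD, aB

aD, aB = ecom_accel(0.0, 1e-7, {2: (1e-9, 0.0)}, 0.0, {1: (0.0, 2e-9)})
```

In an orbit-determination filter the constant and harmonic coefficients would be the estimated parameters; restricting the orders as above is exactly what keeps the model from absorbing signal into spurious spectral lines.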
Modeling Planet-Building Stellar Disks with Radiative Transfer Code
NASA Astrophysics Data System (ADS)
Swearingen, Jeremy R.; Sitko, Michael L.; Whitney, Barbara; Grady, Carol A.; Wagner, Kevin Robert; Champney, Elizabeth H.; Johnson, Alexa N.; Warren, Chelsea C.; Russell, Ray W.; Hammel, Heidi B.; Lisse, Casey M.; Cure, Michel; Kraus, Stefan; Fukagawa, Misato; Calvet, Nuria; Espaillat, Catherine; Monnier, John D.; Millan-Gabet, Rafael; Wilner, David J.
2015-01-01
Understanding the nature of the many planetary systems found outside of our own solar system cannot be complete without knowledge of the beginnings of these systems. By detecting planets in very young systems and modeling the disks of material around the stars from which they form, we can gain a better understanding of planetary origin and evolution. The efforts presented here have been in modeling two pre-transitional disk systems using a radiative transfer code. For the first of these systems, V1247 Ori, a model has been achieved that fits the spectral energy distribution (SED) well and whose parameters are consistent with existing interferometry data (Kraus et al. 2013). The second of these two systems, SAO 206462, has presented a different set of challenges, but encouraging SED agreement between the model and known data gives hope that the model can produce images usable in future interferometry work. This work was supported by NASA ADAP grant NNX09AC73G, and the IR&D program at The Aerospace Corporation.
NASA Technical Reports Server (NTRS)
Salazar, Giovanni; Droba, Justin C.; Oliver, Brandon; Amar, Adam J.
2016-01-01
With the recent development of multi-dimensional thermal protection system (TPS) material response codes, the capability to account for surface-to-surface radiation exchange in complex geometries is critical. This paper presents recent efforts to implement such capabilities in the CHarring Ablator Response (CHAR) code developed at NASA's Johnson Space Center. This work also describes the different numerical methods implemented in the code to compute geometric view factors for radiation problems involving multiple surfaces. Verification of the code's radiation capabilities and results of a code-to-code comparison are presented. Finally, a demonstration case of a two-dimensional ablating cavity with enclosure radiation accounting for a changing geometry is shown.
Radiation transport phenomena and modeling. Part A: Codes; Part B: Applications with examples
Lorence, L.J. Jr.; Beutler, D.E.
1997-09-01
This report contains the notes from the second session of the 1997 IEEE Nuclear and Space Radiation Effects Conference Short Course on Applying Computer Simulation Tools to Radiation Effects Problems. Part A discusses the physical phenomena modeled in radiation transport codes and various types of algorithmic implementations. Part B gives examples of how these codes can be used to design experiments whose results can be easily analyzed and describes how to calculate quantities of interest for electronic devices.
Development of a Monte-Carlo Radiative Transfer Code for the Juno/JIRAM Limb Measurements
NASA Astrophysics Data System (ADS)
Sindoni, G.; Adriani, A.; Mayorov, B.; Aoki, S.; Grassi, D.; Moriconi, M.; Oliva, F.
2013-09-01
The Juno/JIRAM instrument will acquire limb spectra of the Jupiter atmosphere in the infrared spectral range. The analysis of these spectra requires a radiative transfer code that takes into account the multiple scattering by particles in a spherical-shell atmosphere. Therefore, we are developing a code based on the Monte-Carlo approach to simulate the JIRAM observations. The validation of the code was performed by comparison with DISORT-based codes.
Code system to compute radiation dose in human phantoms
Ryman, J.C.; Cristy, M.; Eckerman, K.F.; Davis, J.L.; Tang, J.S.; Kerr, G.D.
1986-01-01
A Monte Carlo photon transport code and a code using Monte Carlo integration of a point kernel have been revised to incorporate human phantom models for an adult female, juveniles of various ages, and a pregnant female at the end of the first trimester of pregnancy, in addition to the adult male used earlier. An analysis code has been developed for deriving recommended values of specific absorbed fractions of photon energy. The computer code system and calculational method are described, emphasizing recent improvements in methods.
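Monte Carlo integration of a point kernel, as mentioned above, can be sketched in a few lines: the uncollided photon flux at a detector point is estimated by averaging the kernel e^(-μr)/(4πr²) over source points sampled uniformly in a spherical source region. The geometry and parameter names are illustrative, not those of the phantom code:

```python
import math, random

def mc_flux(n_samples, source_radius, det_dist, mu, strength=1.0, seed=1):
    """Monte Carlo integration of the uncollided point kernel
    exp(-mu*r) / (4*pi*r^2) over a uniform spherical source of unit
    total strength, for a detector at distance det_dist from the
    source center (lengths in cm, mu in 1/cm)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        # rejection-sample a point uniformly inside the unit sphere
        while True:
            x, y, z = (rng.uniform(-1, 1) for _ in range(3))
            if x * x + y * y + z * z <= 1.0:
                break
        sx, sy, sz = source_radius * x, source_radius * y, source_radius * z
        r = math.sqrt((det_dist - sx) ** 2 + sy ** 2 + sz ** 2)
        total += math.exp(-mu * r) / (4.0 * math.pi * r * r)
    return strength * total / n_samples
```

As the source radius shrinks, the estimate converges to the point-source kernel itself, which provides a simple verification case.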
International "Intercomparison of 3-Dimensional (3D) Radiation Codes" (I3RC)
NASA Technical Reports Server (NTRS)
Cahalan, Robert F.; Einaudi, Franco (Technical Monitor)
2000-01-01
An international "Intercomparison of 3-Dimensional (3D) Radiation Codes" (I3RC) has been initiated. It is endorsed by the GEWEX Radiation Panel and funded jointly by the United States Department of Energy ARM program and by the National Aeronautics and Space Administration Radiation Sciences program. It is a 3-phase effort whose goals are to: (1) understand the errors and limits of 3D methods; (2) provide 'baseline' cases for future 3D code development; (3) promote sharing of 3D tools; (4) derive guidelines for 3D tool selection; and (5) improve atmospheric science education in 3D radiation.
Accelerating execution of the integrated TIGER series Monte Carlo radiation transport codes
Smith, L.M.; Hochstedler, R.D.
1997-02-01
Execution of the integrated TIGER series (ITS) of coupled electron/photon Monte Carlo radiation transport codes has been accelerated by modifying the FORTRAN source code for more efficient computation. Each member code of ITS was benchmarked and profiled with a specific test case that directed the acceleration effort toward the most computationally intensive subroutines. Techniques for accelerating these subroutines included replacing linear search algorithms with binary versions, replacing the pseudo-random number generator, reducing program memory allocation, and proofing the input files for geometrical redundancies. All techniques produced identical or statistically similar results to the original code. Final benchmark timing of the accelerated code resulted in speed-up factors of 2.00 for TIGER (the one-dimensional slab geometry code), 1.74 for CYLTRAN (the two-dimensional cylindrical geometry code), and 1.90 for ACCEPT (the arbitrary three-dimensional geometry code).
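Replacing a linear search with a binary version, one of the acceleration techniques listed above, can be illustrated with a typical sorted-grid interval lookup (hypothetical function names; the actual ITS codes are FORTRAN):

```python
from bisect import bisect_right

def find_interval_linear(grid, x):
    """Original-style linear scan: return i such that
    grid[i] <= x < grid[i+1].  O(n) per lookup."""
    for i in range(len(grid) - 1):
        if grid[i] <= x < grid[i + 1]:
            return i
    raise ValueError("x outside grid")

def find_interval_binary(grid, x):
    """Binary-search replacement with the same contract, O(log n)."""
    if not grid[0] <= x < grid[-1]:
        raise ValueError("x outside grid")
    return bisect_right(grid, x) - 1
```

In a Monte Carlo transport code this lookup sits inside the innermost loop (e.g. finding the energy group for a cross-section), so the O(n) to O(log n) change compounds into the factor-of-two speed-ups quoted above.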
A Monte Carlo Code for Relativistic Radiation Transport around Kerr Black Holes
NASA Astrophysics Data System (ADS)
Schnittman, Jeremy D.; Krolik, Julian H.
2013-11-01
We present a new code for radiation transport around Kerr black holes, including arbitrary emission and absorption mechanisms, as well as electron scattering and polarization. The code is particularly useful for analyzing accretion flows made up of optically thick disks and optically thin coronae. We give a detailed description of the methods employed in the code and also present results from a number of numerical tests to assess its accuracy and convergence.
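Stripped of all relativistic and Kerr-geometry machinery, the underlying Monte Carlo transport idea can be sketched for a flat-space 1D slab with absorption and isotropic scattering. This is a toy model, not the authors' code:

```python
import math, random

def slab_transmission(tau_total, albedo, n_photons=20000, seed=2):
    """Toy Monte Carlo transport through a 1D slab of optical depth
    tau_total.  Photons enter at tau = 0 moving inward (mu = 1), travel
    exponentially distributed optical paths, scatter isotropically with
    single-scattering albedo `albedo` or are absorbed, and we tally the
    fraction escaping through the far side."""
    rng = random.Random(seed)
    escaped = 0
    for _ in range(n_photons):
        tau, mu = 0.0, 1.0
        while True:
            tau += mu * (-math.log(rng.random()))  # next interaction point
            if tau >= tau_total:
                escaped += 1   # transmitted through the far side
                break
            if tau < 0.0:
                break          # escaped back out the illuminated side
            if rng.random() >= albedo:
                break          # absorbed
            mu = rng.uniform(-1.0, 1.0)  # isotropic re-emission
    return escaped / n_photons
```

With the albedo set to zero no photon scatters, so the transmission should reproduce the analytic direct-beam value e^(-tau), a standard verification test for this class of code.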
A Monte Carlo Code for Relativistic Radiation Transport Around Kerr Black Holes
NASA Technical Reports Server (NTRS)
Schnittman, Jeremy David; Krolik, Julian H.
2013-01-01
We present a new code for radiation transport around Kerr black holes, including arbitrary emission and absorption mechanisms, as well as electron scattering and polarization. The code is particularly useful for analyzing accretion flows made up of optically thick disks and optically thin coronae. We give a detailed description of the methods employed in the code and also present results from a number of numerical tests to assess its accuracy and convergence.
A MONTE CARLO CODE FOR RELATIVISTIC RADIATION TRANSPORT AROUND KERR BLACK HOLES
Schnittman, Jeremy D.; Krolik, Julian H. E-mail: jhk@pha.jhu.edu
2013-11-01
We present a new code for radiation transport around Kerr black holes, including arbitrary emission and absorption mechanisms, as well as electron scattering and polarization. The code is particularly useful for analyzing accretion flows made up of optically thick disks and optically thin coronae. We give a detailed description of the methods employed in the code and also present results from a number of numerical tests to assess its accuracy and convergence.
A simple code for use in shielding and radiation dosage analyses
NASA Technical Reports Server (NTRS)
Wan, C. C.
1972-01-01
A simple code for use in analyses of gamma radiation effects in laminated materials is described. Simple, good geometry is assumed, so that all multiple collision and scattering events are excluded from consideration. The code is capable of handling laminates of up to six layers. However, for laminates of more than six layers, the same code may be used to incorporate two additional layers at a time, making use of punched-tape outputs from previous computations on all preceding layers. Spectra of attenuated radiation are obtained as printed output and, if desired, punched-tape output.
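In the good-geometry limit the abstract assumes, the laminate calculation reduces to multiplying exponential attenuation factors, one per layer. A minimal sketch (the layer data are hypothetical):

```python
import math

def attenuated_intensity(I0, layers):
    """Uncollided (good-geometry) intensity after a laminate.
    layers: list of (mu, thickness) pairs, with mu the linear
    attenuation coefficient in 1/cm and thickness in cm.  Multiple
    collision and scattering events are excluded, as in the code
    described above, so I = I0 * exp(-sum(mu_i * t_i))."""
    total_optical_thickness = sum(mu * t for mu, t in layers)
    return I0 * math.exp(-total_optical_thickness)
```

Because the exponents simply add, processing a laminate "two layers at a time" and feeding the output forward, as the punched-tape workflow does, gives exactly the same answer as a single pass over all layers.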
TRAP/SEE Code Users Manual for Predicting Trapped Radiation Environments
NASA Technical Reports Server (NTRS)
Armstrong, T. W.; Colborn, B. L.
2000-01-01
TRAP/SEE is a PC-based computer code with a user-friendly interface which predicts the ionizing radiation exposure of spacecraft having orbits in the Earth's trapped radiation belts. The code incorporates the standard AP8 and AE8 trapped proton and electron models but also allows application of an improved database interpolation method. The code treats low-Earth as well as highly-elliptical Earth orbits, taking into account trajectory perturbations due to gravitational forces from the Moon and Sun, atmospheric drag, and solar radiation pressure. Orbit-average spectra, peak spectra per orbit, and instantaneous spectra at points along the orbit trajectory are calculated. Described in this report are the features, models, model limitations and uncertainties, input and output descriptions, and example calculations and applications for the TRAP/SEE code.
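Orbit-average and peak spectra, two of the outputs listed above, amount to simple reductions over the instantaneous spectra sampled along the trajectory. A schematic sketch, not TRAP/SEE's implementation, assuming equally spaced trajectory samples:

```python
def orbit_average(spectra):
    """Reduce instantaneous differential spectra sampled at equally
    spaced points along one orbit.  spectra: list of equal-length flux
    lists, one per trajectory point.  Returns (orbit-average spectrum,
    peak spectrum), where 'peak' is the sample with the largest
    integrated flux, mirroring the per-orbit peak output."""
    n = len(spectra)
    n_bins = len(spectra[0])
    avg = [sum(s[i] for s in spectra) / n for i in range(n_bins)]
    peak = max(spectra, key=sum)
    return avg, peak
```

A real trapped-radiation code would weight by time step and interpolate the AP8/AE8 flux maps at each trajectory point before this reduction.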
Quasilinear simulation of auroral kilometric radiation by a relativistic Fokker-Planck code
Matsuda, Y.
1991-01-01
An intense terrestrial radio emission called the auroral kilometric radiation (AKR) is believed to be generated by the cyclotron maser instability. We study the quasilinear evolution of this instability by means of a two-dimensional relativistic Fokker-Planck code which treats waves and distributions self-consistently, including radiation loss and an electron source and sink. We compare the distributions and wave amplitudes with spacecraft observations to elucidate the physical processes involved.
HETC radiation transport code development for cosmic ray shielding applications in space.
Townsend, L W; Miller, T M; Gabriel, Tony A
2005-01-01
In order to facilitate three-dimensional analyses of space radiation shielding scenarios for future space missions, the Monte Carlo radiation transport code HETC is being extended to include transport of energetic heavy ions, such as are found in the galactic cosmic ray spectrum in space. Recently, an event generator capable of providing nuclear interaction data for use in HETC was developed and incorporated into the code. The event generator predicts the interaction product yields and production angles and energies using nuclear models and Monte Carlo techniques. Testing and validation of the extended transport code have begun. In this work, the current status of code modifications that enable energetic heavy ions and their nuclear reaction products to be transported through thick shielding is described. Also, initial results of code testing against available laboratory beam data for energetic heavy ions interacting in thick targets are presented. PMID:16604614
The Performance of Current Atmospheric Radiation Codes in Phase I of CIRC
NASA Technical Reports Server (NTRS)
Oreopoulos, L.; Mlawer, E.; Shippert, T.; Cole, J.; Fomin, B.; Iacono, M.; Jin, Z.; Li, J.; Manners, J.; Raisanen, P.; Rose, F.; Zhang, Y.; Wilson, M.; Rossow, W.
2012-01-01
The Continual Intercomparison of Radiation Codes (CIRC) is intended as an evolving and regularly updated reference source for evaluation of radiative transfer (RT) codes used in Global Climate Models and other atmospheric applications. In our presentation we will discuss our evaluation of the performance of 13 shortwave and 11 longwave RT codes that participated in Phase I of CIRC. CIRC differs from previous intercomparisons in that it relies on an observationally validated catalogue of cases. The seven CIRC Phase I baseline cases, five cloud-free and two with overcast liquid clouds, are built around observations by the Atmospheric Radiation Measurement (ARM) program that satisfy the goals of Phase I, namely to examine RT model performance in realistic, yet not overly complex, atmospheric conditions. Besides the seven baseline cases, additional idealized "subcases" are also examined to facilitate interpretation of model errors. We will quantify individual model performance with respect to reference line-by-line calculations, and will also highlight RT code behavior for conditions of doubled CO2, aspects of utilizing a spectral specification of surface albedo, and the impact of the inclusion of scattering in the thermal infrared. Our analysis suggests that RT codes should work towards improving their calculation of diffuse shortwave flux, shortwave absorption, treatment of spectral surface albedo, and shortwave CO2 forcing. Despite practical difficulties in comparing our results to previous results by the Intercomparison of Radiation Codes in Climate Models (ICRCCM) conducted about 20 years ago, it appears that the current generation of RT codes does indeed perform better than the codes of the ICRCCM era. By enhancing the range of conditions under which participating codes are tested, future CIRC phases will hopefully allow even more rigorous examination of RT code performance.
CRASH: A BLOCK-ADAPTIVE-MESH CODE FOR RADIATIVE SHOCK HYDRODYNAMICS-IMPLEMENTATION AND VERIFICATION
Van der Holst, B.; Toth, G.; Sokolov, I. V.; Myra, E. S.; Fryxell, B.; Drake, R. P.; Powell, K. G.; Holloway, J. P.; Stout, Q.; Adams, M. L.; Morel, J. E.; Karni, S.
2011-06-01
We describe the Center for Radiative Shock Hydrodynamics (CRASH) code, a block-adaptive-mesh code for multi-material radiation hydrodynamics. The implementation solves the radiation diffusion model with a gray or multi-group method and uses a flux-limited diffusion approximation to recover the free-streaming limit. Electrons and ions are allowed to have different temperatures and we include flux-limited electron heat conduction. The radiation hydrodynamic equations are solved in the Eulerian frame by means of a conservative finite-volume discretization in either one-, two-, or three-dimensional slab geometry or in two-dimensional cylindrical symmetry. An operator-split method is used to solve these equations in three substeps: (1) an explicit step of a shock-capturing hydrodynamic solver; (2) a linear advection of the radiation in frequency-logarithm space; and (3) an implicit solution of the stiff radiation diffusion, heat conduction, and energy exchange. We present a suite of verification test problems to demonstrate the accuracy and performance of the algorithms. The applications are for astrophysics and laboratory astrophysics. The CRASH code is an extension of the Block-Adaptive Tree Solarwind Roe Upwind Scheme (BATS-R-US) code with a new radiation transfer and heat conduction library and equation-of-state and multi-group opacity solvers. Both CRASH and BATS-R-US are part of the publicly available Space Weather Modeling Framework.
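The third, implicit substep of the operator splitting described above handles the stiff diffusion terms. As a minimal stand-in for a radiation-diffusion solve (not CRASH's actual solver; the grid, units, and boundary treatment are illustrative assumptions), the sketch below advances a 1D linear diffusion equation one backward-Euler step with a tridiagonal Thomas solve:

```python
def implicit_diffusion_step(u, D, dx, dt):
    """One backward-Euler step of u_t = D u_xx on a uniform 1D grid,
    with the two boundary values held fixed (Dirichlet), solved by
    the Thomas (tridiagonal) algorithm -- the kind of implicit solve
    used for the stiff substep in operator-split schemes."""
    n = len(u)
    r = D * dt / dx ** 2
    # Tridiagonal system: -r*x[i-1] + (1+2r)*x[i] - r*x[i+1] = u[i]
    a = [0.0] + [-r] * (n - 2) + [0.0]          # sub-diagonal
    b = [1.0] + [1.0 + 2.0 * r] * (n - 2) + [1.0]
    c = [0.0] + [-r] * (n - 2) + [0.0]          # super-diagonal
    d = list(u)
    for i in range(1, n):                        # forward elimination
        w = a[i] / b[i - 1]
        b[i] -= w * c[i - 1]
        d[i] -= w * d[i - 1]
    x = [0.0] * n
    x[-1] = d[-1] / b[-1]
    for i in range(n - 2, -1, -1):               # back substitution
        x[i] = (d[i] - c[i] * x[i + 1]) / b[i]
    return x
```

The implicit discretization is what lets the radiation substep take the same time step as the explicit hydrodynamics, however stiff the diffusion coefficient; a linear steady-state profile, which diffusion leaves unchanged, makes a quick correctness check.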
CRASH: A Block-adaptive-mesh Code for Radiative Shock Hydrodynamics—Implementation and Verification
NASA Astrophysics Data System (ADS)
van der Holst, B.; Tóth, G.; Sokolov, I. V.; Powell, K. G.; Holloway, J. P.; Myra, E. S.; Stout, Q.; Adams, M. L.; Morel, J. E.; Karni, S.; Fryxell, B.; Drake, R. P.
2011-06-01
We describe the Center for Radiative Shock Hydrodynamics (CRASH) code, a block-adaptive-mesh code for multi-material radiation hydrodynamics. The implementation solves the radiation diffusion model with a gray or multi-group method and uses a flux-limited diffusion approximation to recover the free-streaming limit. Electrons and ions are allowed to have different temperatures and we include flux-limited electron heat conduction. The radiation hydrodynamic equations are solved in the Eulerian frame by means of a conservative finite-volume discretization in either one-, two-, or three-dimensional slab geometry or in two-dimensional cylindrical symmetry. An operator-split method is used to solve these equations in three substeps: (1) an explicit step of a shock-capturing hydrodynamic solver; (2) a linear advection of the radiation in frequency-logarithm space; and (3) an implicit solution of the stiff radiation diffusion, heat conduction, and energy exchange. We present a suite of verification test problems to demonstrate the accuracy and performance of the algorithms. The applications are for astrophysics and laboratory astrophysics. The CRASH code is an extension of the Block-Adaptive Tree Solarwind Roe Upwind Scheme (BATS-R-US) code with a new radiation transfer and heat conduction library and equation-of-state and multi-group opacity solvers. Both CRASH and BATS-R-US are part of the publicly available Space Weather Modeling Framework.
CRASH: A Block-Adaptive-Mesh Code for Radiative Shock Hydrodynamics
NASA Astrophysics Data System (ADS)
van der Holst, B.; Toth, G.; Sokolov, I. V.; Powell, K. G.; Holloway, J. P.; Myra, E. S.; Stout, Q.; Adams, M. L.; Morel, J. E.; Drake, R. P.
2011-01-01
We describe the CRASH (Center for Radiative Shock Hydrodynamics) code, a block-adaptive-mesh code for multi-material radiation hydrodynamics. The implementation solves the radiation diffusion model with the gray or multigroup method and uses a flux-limited diffusion approximation to recover the free-streaming limit. The electrons and ions are allowed to have different temperatures and we include flux-limited electron heat conduction. The radiation hydrodynamic equations are solved in the Eulerian frame by means of a conservative finite-volume discretization in either one-, two-, or three-dimensional slab geometry or in two-dimensional cylindrical symmetry. An operator-split method is used to solve these equations in three substeps: (1) an explicit solve of the hydrodynamic equations with shock-capturing schemes; (2) a linear advection of the radiation in frequency-logarithm space; and (3) an implicit solve of the stiff radiation diffusion, heat conduction, and energy exchange. We present a suite of verification test problems to demonstrate the accuracy and performance of the algorithms. The CRASH code is an extension of the Block-Adaptive Tree Solarwind Roe Upwind Scheme (BATS-R-US) code with this new radiation transfer and heat conduction library and equation-of-state and multigroup opacity solvers. Both CRASH and BATS-R-US are part of the publicly available Space Weather Modeling Framework (SWMF).
Radiation and confinement in 0D fusion systems codes
NASA Astrophysics Data System (ADS)
Lux, H.; Kemp, R.; Fable, E.; Wenninger, R.
2016-07-01
In systems modelling for fusion power plants, it is essential to robustly predict the performance of a given machine design (including its respective operating scenario). One measure of machine performance is the energy confinement time τ_E, which is typically predicted from experimentally derived confinement scaling laws (e.g. IPB98(y,2)). However, the conventionally used scaling laws have been derived for ITER, which, unlike a fusion power plant, will not have significant radiation inside the separatrix. In the absence of a new confinement scaling relevant to high core radiation, we propose an ad hoc correction to the loss power P_L used in the ITER confinement scaling, and to the calculation of the stored energy W_th, by the radiation losses from the 'core' of the plasma, P_rad,core. Using detailed ASTRA/TGLF simulations, we find that an appropriate definition of P_rad,core is given by 60% of all radiative losses inside a normalised minor radius ρ_core = 0.75. We consider this an improvement for current design predictions, but it is far from an ideal solution. We therefore encourage more detailed experimental and theoretical work on this issue.
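The proposed correction can be applied in a few lines. The sketch below assumes, as one reading of the abstract, that the corrected loss power is P_L minus 60% of the radiative losses inside ρ = 0.75, and that τ_E = W_th / P_L; the function name and the sample numbers are illustrative, not values from the paper:

```python
def corrected_tau_E(W_th, P_heat, P_rad_inside_rho075):
    """Ad hoc radiation correction sketched in the abstract: take
    P_rad,core as 60% of the radiative losses inside the normalised
    minor radius rho = 0.75, subtract it from the loss power, and form
    the confinement time tau_E = W_th / P_L.  All inputs in SI units
    (J for W_th, W for the powers)."""
    P_rad_core = 0.6 * P_rad_inside_rho075
    P_L = P_heat - P_rad_core
    return W_th / P_L

# hypothetical reactor-scale numbers: 350 MJ stored, 150 MW heating,
# 50 MW radiated inside rho = 0.75
tau = corrected_tau_E(350e6, 150e6, 50e6)
```

Treating core radiation this way prevents a highly radiating plasma from appearing to have degraded confinement purely because radiated power inflates the apparent loss power.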
NASA Technical Reports Server (NTRS)
Meyer, H. D.
1993-01-01
The Acoustic Radiation Code (ARC) is a finite element program used on the IBM mainframe to predict far-field acoustic radiation from a turbofan engine inlet. In this report, requirements for developers of internal aerodynamic codes regarding use of their program output as input for the ARC are discussed. More specifically, the particular input needed from the Bolt, Beranek and Newman/Pratt and Whitney (turbofan source noise generation) Code (BBN/PWC) is described. In a separate analysis, a method of coupling the source and radiation models that recognizes waves crossing the interface in both directions has been derived. A preliminary version of the coupled code has been developed and used for initial evaluation of coupling issues. Results thus far have shown that reflection from the inlet is sufficient to indicate that full coupling of the source and radiation fields is needed for accurate noise predictions. Also, for this contract, the ARC has been modified for use on the Sun and Silicon Graphics IRIS UNIX workstations. Changes and additions involved in this effort are described in an appendix.
Radiative transfer codes for atmospheric correction and aerosol retrieval: intercomparison study.
Kotchenova, Svetlana Y; Vermote, Eric F; Levy, Robert; Lyapustin, Alexei
2008-05-01
Results are summarized for a scientific project devoted to the comparison of four atmospheric radiative transfer codes incorporated into different satellite data processing algorithms, namely, 6SV1.1 (second simulation of a satellite signal in the solar spectrum, vector, version 1.1), RT3 (radiative transfer), MODTRAN (moderate resolution atmospheric transmittance and radiance code), and SHARM (spherical harmonics). The performance of the codes is tested against well-known benchmarks, such as Coulson's tabulated values and a Monte Carlo code. The influence of revealed differences on aerosol optical thickness and surface reflectance retrieval is estimated theoretically by using a simple mathematical approach. All information about the project can be found at http://rtcodes.ltdri.org. PMID:18449285
Intercomparison of Shortwave Radiative Transfer Codes and Measurements
Halthore, Rangasayi N.; Crisp, David; Schwartz, Stephen E.; Anderson, Gail; Berk, A.; Bonnel, B.; Boucher, Olivier; Chang, Fu-Lung; Chou, Ming-Dah; Clothiaux, Eugene E.; Dubuisson, P.; Fomin, Boris; Fouquart, Y.; Freidenreich, S.; Gautier, Catherine; Kato, Seiji; Laszlo, Istvan; Li, Zhanqing; Mather, Jim H.; Plana-Fattori, Artemio; Ramaswamy, V.; Ricchiazzi, P.; Shiren, Y.; Trishchenko, A.; Wiscombe, Warren J.
2005-06-03
Computation of components of shortwave (SW) or solar irradiance in the surface-atmospheric system forms the basis of intercomparison between 16 radiative transfer models of varying spectral resolution ranging from line-by-line models to broadband and general circulation models. In order of increasing complexity the components are: direct solar irradiance at the surface, diffuse irradiance at the surface, diffuse upward flux at the surface, and diffuse upward flux at the top of the atmosphere. These components allow computation of the atmospheric absorptance. Four cases are considered from pure molecular atmospheres to atmospheres with aerosols and atmosphere with a simple uniform cloud. The molecular and aerosol cases allow comparison of aerosol forcing calculation among models. A cloud-free case with measured atmospheric and aerosol properties and measured shortwave radiation components provides an absolute basis for evaluating the models. For the aerosol-free and cloud-free dry atmospheres, models agree to within 1% (root mean square deviation as a percentage of mean) in broadband direct solar irradiance at surface; the agreement is relatively poor at 5% for a humid atmosphere. A comparison of atmospheric absorptance, computed from components of SW radiation, shows that agreement among models is understandably much worse at 3% and 10% for dry and humid atmospheres, respectively. Inclusion of aerosols generally makes the agreement among models worse than when no aerosols are present, with some exceptions. Modeled diffuse surface irradiance is higher than measurements for all models for the same model inputs. Inclusion of an optically thick low-cloud in a tropical atmosphere, a stringent test for multiple scattering calculations, produces, in general, better agreement among models for a low solar zenith angle (SZA = 30°) than for a high SZA (75°). All models show about a 30% increase in broadband absorptance for 30° SZA relative to the clear-sky case and almost no
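The agreement metric quoted throughout the abstract, root-mean-square deviation expressed as a percentage of the ensemble mean, can be computed as follows (a plausible reading of the metric; the intercomparison's exact definition may differ in detail):

```python
import math

def rms_percent_of_mean(values):
    """Root-mean-square deviation of an ensemble of model results,
    expressed as a percentage of the ensemble mean."""
    n = len(values)
    mean = sum(values) / n
    rms = math.sqrt(sum((v - mean) ** 2 for v in values) / n)
    return 100.0 * rms / mean
```

Applied to, say, the 16 models' broadband direct surface irradiances, a value of 1% would correspond to the dry-atmosphere agreement quoted above.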
Modelling Radiative Stellar Winds with the SIMECA Code
NASA Astrophysics Data System (ADS)
Stee, Ph.
Using the SIMECA code developed by Stee & Araùjo ([CITE]), we report theoretical HI visible and near-IR line profiles, i.e. Hα (6562 Å), Hβ (4861 Å) and Brγ (21656 Å), and intensity maps for a large set of parameters representative of early to late Be spectral types. We have computed the size of the emitting region in the Brγ line and its nearby continuum, which both originate from a very extended region, i.e. at least 40 stellar radii, twice the size of the Hα emitting region. We predict the relative fluxes from the central star and the envelope contribution in the given lines and in the continuum for a wide range of parameters characterizing the disk models. Finally, we have also studied the effect of changing the spectral type on our results, and we obtain a clear correlation between the luminosity in Hα and in the infrared.
Method for calculating internal radiation and ventilation with the ADINAT heat-flow code
Butkovich, T.R.; Montan, D.N.
1980-04-01
One objective of the spent fuel test in Climax Stock granite (SFTC) is to correctly model the thermal transport and the changes in the stress field and accompanying displacements from the application of the thermal loads. We have chosen the ADINA and ADINAT finite element codes to do these calculations. ADINAT is a heat transfer code compatible with the ADINA displacement and stress analysis code. The heat flow problem encountered at SFTC requires a code with conduction, radiation, and ventilation capabilities, which the present version of ADINAT does not have. We have devised a method for calculating internal radiation and ventilation with the ADINAT code. This method effectively reproduces the results from the TRUMP multi-dimensional finite difference code, which correctly models radiative heat transport between drift surfaces, conductive and convective thermal transport to and through air in the drifts, and mass flow of air in the drifts. The temperature histories for each node in the finite element mesh calculated with ADINAT using this method can be used directly in the ADINA thermal-mechanical calculation.
Simulations of implosions with a 3D, parallel, unstructured-grid, radiation-hydrodynamics code
Kaiser, T B; Milovich, J L; Prasad, M K; Rathkopf, J; Shestakov, A I
1998-12-28
An unstructured-grid, radiation-hydrodynamics code is used to simulate implosions. Although most of the problems are spherically symmetric, they are run on 3D, unstructured grids in order to test the code's ability to maintain spherical symmetry of the converging waves. Three problems, of increasing complexity, are presented. In the first, a cold, spherical, ideal gas bubble is imploded by an enclosing high pressure source. For the second, we add non-linear heat conduction and drive the implosion with twelve laser beams centered on the vertices of an icosahedron. In the third problem, a NIF capsule is driven with a Planckian radiation source.
On the Development of a Deterministic Three-Dimensional Radiation Transport Code
NASA Technical Reports Server (NTRS)
Rockell, Candice; Tweed, John
2011-01-01
Since astronauts on future deep space missions will be exposed to dangerous radiation, there is a need to accurately model the transport of radiation through shielding materials and to estimate the received radiation dose. In response to this need, a three-dimensional deterministic code for space radiation transport is now under development. The new code GRNTRN is based on a Green's function solution of the Boltzmann transport equation that is constructed in the form of a Neumann series. Analytical approximations will be obtained for the first three terms of the Neumann series and the remainder will be estimated by a non-perturbative technique. This work discusses progress made to date and exhibits some computations based on the first two Neumann series terms.
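The Neumann-series construction mentioned above has a standard generic form. Writing the transport problem as $\phi = K\phi + S$, with $K$ the collision integral operator and $S$ the external source (notation assumed here, not taken from the GRNTRN papers), the series solution is

```latex
\phi \;=\; \sum_{n=0}^{\infty} K^{n} S \;=\; S + K S + K^{2} S + \cdots
```

where the $n$-th term collects particles that have undergone $n$ collisions; the work described evaluates the first terms analytically and bounds the remainder non-perturbatively.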
MODTRAN6: a major upgrade of the MODTRAN radiative transfer code
NASA Astrophysics Data System (ADS)
Berk, Alexander; Conforti, Patrick; Kennett, Rosemary; Perkins, Timothy; Hawes, Frederick; van den Bosch, Jeannette
2014-06-01
The MODTRAN6 radiative transfer (RT) code is a major advancement over earlier versions of the MODTRAN atmospheric transmittance and radiance model. This version of the code incorporates modern software architecture including an application programming interface, enhanced physics features including a line-by-line algorithm, a supplementary physics toolkit, and new documentation. The application programming interface has been developed for ease of integration into user applications. The MODTRAN code has been restructured towards a modular, object-oriented architecture to simplify upgrades as well as facilitate integration with other developers' codes. MODTRAN now includes a line-by-line algorithm for high resolution RT calculations as well as coupling to optical scattering codes for easy implementation of custom aerosols and clouds.
A multigroup radiation diffusion test problem: Comparison of code results with analytic solution
Shestakov, A I; Harte, J A; Bolstad, J H; Offner, S R
2006-12-21
We consider a 1D, slab-symmetric test problem for the multigroup radiation diffusion and matter energy balance equations. The test simulates diffusion of energy from a hot central region. Opacities vary with the cube of the frequency and radiation emission is given by a Wien spectrum. We compare results from two LLNL codes, Raptor and Lasnex, with tabular data that define the analytic solution.
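For context, a generic 1D slab form of the coupled multigroup radiation diffusion and matter energy balance equations (notation assumed here, not necessarily the paper's) is

```latex
\frac{\partial E_g}{\partial t}
  = \frac{\partial}{\partial x}\left(\frac{c}{3\kappa_g}\,\frac{\partial E_g}{\partial x}\right)
  + c\,\kappa_g\bigl(B_g(T) - E_g\bigr), \qquad g = 1,\dots,G,
\qquad
c_v\,\frac{\partial T}{\partial t}
  = -\sum_{g=1}^{G} c\,\kappa_g\bigl(B_g(T) - E_g\bigr),
```

where $E_g$ is the group radiation energy density and $B_g(T)$ the group-integrated emission; in the test problem the opacities scale with the cube of the frequency and emission follows a Wien spectrum, $B_\nu \propto \nu^3 e^{-h\nu/kT}$.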
RRTMGP: A fast and accurate radiation code for the next decade
NASA Astrophysics Data System (ADS)
Mlawer, E. J.; Pincus, R.; Wehe, A.; Delamere, J.
2015-12-01
Atmospheric radiative processes are key drivers of the Earth's climate and must be accurately represented in global circulation models (GCMs) to allow faithful simulations of the planet's past, present, and future. The radiation code RRTMG is widely utilized by global modeling centers for both climate and weather predictions, but it has become increasingly out-of-date. The code's structure is not well suited for the current generation of computer architectures and its stored absorption coefficients are not consistent with the most recent spectroscopic information. We are developing a new broadband radiation code for the current generation of computational architectures. This code, called RRTMGP, will be a completely restructured and modern version of RRTMG. The new code preserves the strengths of the existing RRTMG parameterization, especially the high accuracy of the k-distribution treatment of absorption by gases, but the entire code is being rewritten to provide highly efficient computation across a range of architectures. Our redesign includes refactoring the code into discrete kernels corresponding to fundamental computational elements (e.g. gas optics), optimizing the code for operating on multiple columns in parallel, simplifying the subroutine interface, revisiting the existing gas optics interpolation scheme to reduce branching, and adding flexibility with respect to run-time choices of streams, need for consideration of scattering, aerosol and cloud optics, etc. The result of the proposed development will be a single, well-supported and well-validated code amenable to optimization across a wide range of platforms. Our main emphasis is on highly-parallel platforms including Graphical Processing Units (GPUs) and Many-Integrated-Core processors (MICs), which experience shows can accelerate broadband radiation calculations by as much as a factor of fifty. RRTMGP will provide highly efficient and accurate radiative flux calculations for coupled global models.
3D Polarized Radiative Transfer for Solar System Applications Using the public-domain HYPERION Code
NASA Astrophysics Data System (ADS)
Wolff, M. J.; Robitaille, T.; Whitney, B. A.
2012-12-01
We present a public-domain radiative transfer tool that will allow researchers to examine a wide range of interesting solar system applications. Hyperion is a new three-dimensional continuum Monte-Carlo radiative transfer code that is designed to be as general as possible, allowing radiative transfer to be computed through a variety of three-dimensional grids (Robitaille, 2011, Astronomy & Astrophysics 536 A79). The main part of the code is problem-independent, and only requires the user to define the three-dimensional density structure, the opacity, and the illumination properties (as well as a few parameters that control execution and output of the code). Hyperion is written in Fortran 90 and parallelized using the MPI-2 standard. It is bundled with Python libraries that enable very flexible pre- and post-processing options (arbitrary shapes, multiple aerosol components, etc.). These routines are very amenable to user extension. The package is currently distributed at www.hyperion-rt.org. Our presentation will feature 1) a brief overview of the code, including a description of the solar system-specific modifications that we have made beyond the capabilities in the original release; 2) several solar system applications (e.g., the Deep Impact plume, the Martian atmosphere); and 3) discussion of availability and distribution of code components via www.hyperion-rt.org.
Creation and utilization of a World Wide Web based space radiation effects code: SIREST
NASA Technical Reports Server (NTRS)
Singleterry, R. C. Jr; Wilson, J. W.; Shinn, J. L.; Tripathi, R. K.; Thibeault, S. A.; Noor, A. K.; Cucinotta, F. A.; Badavi, F. F.; Chang, C. K.; Qualls, G. D.; Clowdsley, M. S.; Kim, M. H.; Heinbockel, J. H.; Norbury, J.; Blattning, S. R.; Miller, J.; Zeitlin, C.; Heilbronn, L. H.
2001-01-01
In order for humans and electronics to fully and safely operate in the space environment, codes like HZETRN (High Charge and Energy Transport) must be included in any designer's toolbox for design evaluation with respect to radiation damage. Currently, spacecraft designers do not have easy access to accurate radiation codes like HZETRN to evaluate their design for radiation effects on humans and electronics. Today, the World Wide Web is sophisticated enough to support the entire HZETRN code and all of the associated pre and post processing tools. This package is called SIREST (Space Ionizing Radiation Effects and Shielding Tools). There are many advantages to SIREST. The most important advantage is the instant update capability of the web. Another major advantage is the modularity that the web imposes on the code. Right now, the major disadvantage of SIREST will be its modularity inside the designer's system. This mostly comes from the fact that a consistent interface between the designer and the computer system to evaluate the design is incomplete. This, however, is to be solved in the Intelligent Synthesis Environment (ISE) program currently being funded by NASA.
NASA Astrophysics Data System (ADS)
Townsend, L. W.; Porter, J.; Spence, H. E.; Golightly, M. J.; Smith, S. S.; Schwadron, N.; Kasper, J. C.; Case, A. W.; Blake, J. B.; Mazur, J. E.; Looper, M. D.; Zeitlin, C. J.
2014-12-01
The Cosmic Ray Telescope for the Effects of Radiation (CRaTER) instrument on the Lunar Reconnaissance Orbiter (LRO) spacecraft measures the energy depositions by solar and galactic cosmic radiation in its silicon detectors. These energy depositions are converted to linear energy transfer (LET) spectra, which can contribute to benchmarking space radiation transport codes and can also be used to estimate doses for the lunar environment. In this work the Monte Carlo transport code HETC-HEDS (High Energy Transport Code - Human Exploration and Development in Space) and the deterministic NASA space radiation transport code HZETRN2010 are used to estimate LET and dose contributions from the incident primary ions and their charged secondaries produced in nuclear collisions within the components of the CRaTER instrument. Comparisons of the calculated LET spectra with measurements from the CRaTER instrument are made and clearly show the importance of correcting the calculated average energy deposition spectra in the silicon detectors using a Vavilov distribution function.
Multigroup Three-Dimensional Direct Integration Method Radiation Transport Analysis Code System.
Energy Science and Technology Software Center (ESTSC)
1987-09-18
Version 00 TRISTAN solves the three-dimensional, fixed-source, Boltzmann transport equation for neutrons or gamma rays in rectangular geometry. The code can solve an adjoint problem as well as a usual transport problem. TRISTAN is a suitable tool to analyze radiation shielding problems such as streaming and deep penetration problems.
Development of a coupling code for PWR reactor cavity radiation streaming calculation
Zheng, Z.; Wu, H.; Cao, L.; Zheng, Y.; Zhang, H.; Wang, M.
2012-07-01
PWR reactor cavity radiation streaming is important for the safety of personnel and equipment, so calculations must be performed to evaluate the neutron flux distribution around the reactor. For this calculation, deterministic codes have difficulty with fine geometrical modeling and need huge computer resources, while Monte Carlo codes require very long sampling times to obtain results with acceptable precision. Therefore, a coupling method has been developed to eliminate these two problems. In this study, we develop a coupling code named DORT2MCNP to link the Sn code DORT and the Monte Carlo code MCNP. DORT2MCNP is used to produce a combined surface source containing top, bottom and side surfaces simultaneously. Because the SDEF card is unsuitable for the combined surface source, we modify the SOURCE subroutine of MCNP and recompile MCNP for this application. Numerical results demonstrate the correctness of the coupling code DORT2MCNP and show reasonable agreement between the coupling method and the other two codes (DORT and MCNP). (authors)
Code System to Calculate Radiation Dose Rates Relative to Spent Fuel Shipping Casks.
Energy Science and Technology Software Center (ESTSC)
1993-05-20
Version 00 QBF calculates and plots, in a short running time, three-dimensional radiation dose rate distributions in the form of contour maps on specified planes resulting from cylindrical sources loaded into vehicles or ships. Shielding effects of steel walls and shielding material layers are taken into account, in addition to the shadow effect among casks. This code system identifies the critical points on which to focus when designing the radiation shielding structure and where each of the spent fuel shipping casks should be stored. The code GRAPH reads the output data file of QBF and plots it using the HGX graphics library. QBF unifies the functions of the SMART and MANYCASK codes included in CCC-482.
User's manual for the Heat Pipe Space Radiator design and analysis Code (HEPSPARC)
NASA Technical Reports Server (NTRS)
Hainley, Donald C.
1991-01-01
A heat pipe space radiator code, HEPSPARC, was written for the NASA Lewis Research Center and is used for the design and analysis of a radiator constructed from a pumped fluid loop that transfers heat to the evaporative section of heat pipes. This manual is designed to familiarize the user with this new code and to serve as a reference for its use. It documents the completed work and is intended as the first step toward verification of the HEPSPARC code. Details are furnished to describe all the requirements and variables used in the design and analysis of a combined pumped loop/heat pipe radiator system. A description of the subroutines used in the program is furnished for those interested in understanding its detailed workings.
FURN3D: A computer code for radiative heat transfer in pulverized coal furnaces
Ahluwalia, R.K.; Im, K.H.
1992-08-01
A computer code FURN3D has been developed for assessing the impact of burning different coals on the heat absorption pattern in pulverized coal furnaces. The code is unique in its ability to conduct detailed spectral calculations of radiation transport in furnaces, fully accounting for the size distributions of char, soot and ash particles, ash content, and ash composition. The code uses a hybrid technique for solving the three-dimensional radiation transport equation for absorbing, emitting and anisotropically scattering media. The technique achieves an optimal mix of computational speed and accuracy by combining the discrete ordinate method (S4), the modified differential approximation (MDA), and the P1 approximation in different ranges of optical thickness. The code uses spectroscopic data for estimating the absorption coefficients of the participating gases CO2, H2O and CO. It invokes Mie theory for determining the extinction and scattering coefficients of combustion particulates. The optical constants of char, soot and ash are obtained from dispersion relations derived from reflectivity, transmissivity and extinction measurements. A control-volume formulation is adopted for determining the temperature field inside the furnace. A simple char burnout model is employed for estimating heat release and the evolution of the particle size distribution. The code is written in Fortran 77, has a modular form, and is machine-independent. The computer memory required by the code depends on the number of grid points specified and on whether the transport calculations are performed on a spectral or gray basis.
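The hybrid strategy of switching solution methods by optical regime can be sketched as a simple dispatcher. The threshold values below are illustrative assumptions, not FURN3D's actual switch points:

```python
def select_rt_method(tau, thin_limit=0.1, thick_limit=10.0):
    """Pick a transport approximation by spectral optical thickness.

    Illustrative rule of thumb: discrete ordinates (S4) where the medium
    is optically thin, the modified differential approximation (MDA) at
    intermediate depths, and the P1 approximation in the thick limit.
    The thresholds here are hypothetical, not FURN3D's actual values.
    """
    if tau < thin_limit:
        return "S4"
    elif tau < thick_limit:
        return "MDA"
    return "P1"

# Dispatch each spectral band to the cheapest adequate solver.
bands = [0.01, 0.5, 3.0, 50.0]
methods = [select_rt_method(t) for t in bands]
```

The appeal of such a hybrid is that each approximation is used only where it is both cheap and accurate, rather than paying full discrete-ordinates cost at every wavelength.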
ICRCCM Phase 2: Verification and calibration of radiation codes in climate models
Ellingson, R.G.; Wiscombe, W.J.; Murcray, D.; Smith, W.; Strauch, R.
1991-01-01
Following the finding by the InterComparison of Radiation Codes used in Climate Models (ICRCCM) of large differences among fluxes predicted by sophisticated radiation models that could not be sorted out because of the lack of a set of accurate atmospheric spectral radiation data measured simultaneously with the important radiative properties of the atmosphere, our team of scientists proposed to remedy the situation by carrying out a comprehensive program of measurement and analysis called SPECTRE (Spectral Radiance Experiment). SPECTRE will establish an absolute standard against which to compare models, and will aim to remove the "hidden variables" (unknown humidities, aerosols, etc.) which radiation modelers have invoked to excuse disagreements with observation. The data to be collected during SPECTRE will form the test bed for the second phase of ICRCCM, namely verification and calibration of radiation codes used in climate models. This should lead to more accurate radiation models for use in parameterizing climate models, which in turn play a key role in the prediction of trace-gas greenhouse effects.
Kirk, B.L.
1990-01-01
In nuclear applications, the conversion of mainframe software to the personal computer (PC) environment has seen an accelerated pace. Credit has to be extended to the software companies that have made the scientific language FORTRAN available on PCs. Not to be neglected are the scientists who dedicate their time to the conversion of codes and are challenged by limited PC memory and disk space. The Radiation Shielding Information Center (RSIC) at Oak Ridge National Laboratory has encouraged these developments, and the shielding community has cooperated by making these new tools available via RSIC. The PC codes in the shielding and radiation transport area are divided into five categories (these categories are not mutually exclusive): (1) gamma-ray scattering; (2) neutron and gamma-ray transport (also coupled); (3) environmental dose; (4) medical applications; and (5) reactor physics. Each category is discussed.
PEREGRINE: An all-particle Monte Carlo code for radiation therapy
Hartmann Siantar, C.L.; Chandler, W.P.; Rathkopf, J.A.; Svatos, M.M.; White, R.M.
1994-09-01
The goal of radiation therapy is to deliver a lethal dose to the tumor while minimizing the dose to normal tissues. To carry out this task, it is critical to calculate correctly the distribution of dose delivered. Monte Carlo transport methods have the potential to provide more accurate prediction of dose distributions than currently-used methods. PEREGRINE is a new Monte Carlo transport code developed at Lawrence Livermore National Laboratory for the specific purpose of modeling the effects of radiation therapy. PEREGRINE transports neutrons, photons, electrons, positrons, and heavy charged-particles, including protons, deuterons, tritons, helium-3, and alpha particles. This paper describes the PEREGRINE transport code and some preliminary results for clinically relevant materials and radiation sources.
Monte Carlo Code System for High-Energy Radiation Transport Calculations.
Energy Science and Technology Software Center (ESTSC)
2000-02-16
Version 00 HERMES-KFA consists of a set of Monte Carlo codes used to simulate particle radiation and interaction with matter. The main codes are HETC, MORSE, and EGS. They are supported by a common geometry package, common random routines, a command interpreter, and auxiliary codes like NDEM, which is used to generate a gamma-ray source from nuclear de-excitation after spallation processes. The codes have been modified so that any particle history falling outside the domain of the physical theory of one program can be submitted to another program in the suite to complete the work. Also, response data can be submitted by each program, to be collected and combined by a statistics package included within the command interpreter.
Development of a GPU Compatible Version of the Fast Radiation Code RRTMG
NASA Astrophysics Data System (ADS)
Iacono, M. J.; Mlawer, E. J.; Berthiaume, D.; Cady-Pereira, K. E.; Suarez, M.; Oreopoulos, L.; Lee, D.
2012-12-01
The absorption of solar radiation and emission/absorption of thermal radiation are crucial components of the physics that drive Earth's climate and weather. Therefore, accurate radiative transfer calculations are necessary for realistic climate and weather simulations. Efficient radiation codes have been developed for this purpose, but their accuracy requirements still necessitate that as much as 30% of the computational time of a GCM is spent computing radiative fluxes and heating rates. The overall computational expense constitutes a limitation on a GCM's predictive ability if it becomes an impediment to adding new physics to or increasing the spatial and/or vertical resolution of the model. The emergence of Graphics Processing Unit (GPU) technology, which will allow the parallel computation of multiple independent radiative calculations in a GCM, will lead to a fundamental change in the competition between accuracy and speed. Processing time previously consumed by radiative transfer will now be available for the modeling of other processes, such as physics parameterizations, without any sacrifice in the accuracy of the radiative transfer. Furthermore, fast radiation calculations can be performed much more frequently and will allow the modeling of radiative effects of rapid changes in the atmosphere. The fast radiation code RRTMG, developed at Atmospheric and Environmental Research (AER), is utilized operationally in many dynamical models throughout the world. We will present the results from the first stage of an effort to create a version of the RRTMG radiation code designed to run efficiently in a GPU environment. This effort will focus on the RRTMG implementation in GEOS-5. RRTMG has an internal pseudo-spectral vector of length of order 100 that, when combined with the much greater length of the global horizontal grid vector from which the radiation code is called in GEOS-5, makes RRTMG/GEOS-5 particularly suited to achieving a significant speed improvement.
ICRCCM Phase 2: Verification and calibration of radiation codes in climate models
Ellingson, R.G.; Wiscombe, W.J.; Murcray, D.; Smith, W.; Strauch, R.
1992-01-01
Following the finding by the InterComparison of Radiation Codes used in Climate Models (ICRCCM) of large differences among fluxes predicted by sophisticated radiation models that could not be sorted out because of the lack of a set of accurate atmospheric spectral radiation data measured simultaneously with the important radiative properties of the atmosphere, our team of scientists proposed to remedy the situation by carrying out a comprehensive program of measurement and analysis called SPECTRE (Spectral Radiance Experiment). The data collected during SPECTRE form the test bed for the second phase of ICRCCM, namely verification and calibration of radiation codes used in climate models. This should lead to more accurate radiation models for use in parameterizing climate models, which in turn play a key role in the prediction of trace-gas greenhouse effects. This report summarizes the activities of our group during the project's third year to meet our stated objectives. The report is divided into three sections entitled: SPECTRE Activities, ICRCCM Activities, and Summary Information. The section on SPECTRE activities summarizes the field portion of the project during 1991 and the data reduction/analysis performed by the various participants. The section on ICRCCM activities summarizes our initial attempts to select data for distribution to ICRCCM participants and to compare observations with calculations, as will be done by the ICRCCM participants. The Summary Information section lists data concerning publications, presentations, graduate students supported, and post-doctoral appointments during the project.
A Radiation Chemistry Code Based on the Green's Function of the Diffusion Equation
NASA Technical Reports Server (NTRS)
Plante, Ianik; Wu, Honglu
2014-01-01
Stochastic radiation track structure codes are of great interest for space radiation studies and for hadron therapy in medicine. These codes are used for many purposes, notably for microdosimetry and DNA damage studies. In the last two decades, they have also been used with the Independent Reaction Times (IRT) method in the simulation of chemical reactions, to calculate the yield of various radiolytic species produced during the radiolysis of water and in chemical dosimeters. Recently, we have developed a Green's function-based code to simulate reversible chemical reactions with an intermediate state, which yielded results in excellent agreement with those obtained using the IRT method. This code was also used to simulate the interaction of particles with membrane receptors. We are in the process of including this program in the Monte-Carlo track structure code Relativistic Ion Tracks (RITRACKS). This recent addition should greatly expand the capabilities of RITRACKS, notably to simulate DNA damage by both the direct and indirect effects.
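The central object in codes of this kind is the diffusion-equation Green's function; for free diffusion in three dimensions with diffusion coefficient $D$ it takes the standard form

```latex
G(\mathbf{r}, t \mid \mathbf{r}_0, 0)
  = \left(4\pi D t\right)^{-3/2}
    \exp\!\left(-\frac{\lvert\mathbf{r}-\mathbf{r}_0\rvert^{2}}{4 D t}\right),
```

the probability density of finding a particle at $\mathbf{r}$ at time $t$ given that it started at $\mathbf{r}_0$. The code described above builds on kernels of this type, with boundary conditions modified to represent reversible reactions and intermediate states.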
A unified radiative magnetohydrodynamics code for lightning-like discharge simulations
Chen, Qiang; Chen, Bin; Xiong, Run; Cai, Zhaoyang; Chen, P. F.
2014-03-15
A two-dimensional Eulerian finite difference code is developed for solving the non-ideal magnetohydrodynamic (MHD) equations including the effects of self-consistent magnetic field, thermal conduction, resistivity, gravity, and radiation transfer, which, when combined with specified pulse current models and plasma equations of state, can be used as a unified lightning return stroke solver. The differential equations are written in covariant form in cylindrical geometry and kept in conservative form, which enables high-accuracy shock-capturing schemes to be applied to the lightning channel configuration naturally. In this code, the fifth-order weighted essentially non-oscillatory (WENO) scheme combined with the Lax-Friedrichs flux splitting method is introduced for computing the convection terms of the MHD equations. A third-order total variation diminishing (TVD) Runge-Kutta integrator is used to maintain consistent space-time accuracy. The numerical algorithms for non-ideal terms, e.g., artificial viscosity, resistivity, and thermal conduction, are introduced via an operator splitting method. The code assumes the radiation is in local thermodynamic equilibrium with the plasma components, and a flux-limited diffusion algorithm with grey opacities is implemented for computing the radiation transfer. The transport coefficients and equation of state are obtained from detailed particle population distribution calculations, which makes the numerical model self-consistent. The code is validated against the Sedov blast solutions and then used for lightning return stroke simulations with peak currents of 20 kA, 30 kA, and 40 kA. The results show that the numerical model is consistent with observations and previous numerical results. Population distribution evolution and energy conservation are also discussed.
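The third-order TVD Runge-Kutta integrator named above has a standard (Shu-Osher) form, sketched here for an arbitrary right-hand side; the scalar decay test is purely illustrative and unrelated to the MHD system:

```python
import math

def tvd_rk3_step(u, dt, L):
    """One step of the third-order TVD Runge-Kutta scheme (Shu-Osher form).
    L(u) is the spatial operator, e.g. a WENO flux divergence; for this
    sketch any right-hand side works."""
    u1 = u + dt * L(u)
    u2 = 0.75 * u + 0.25 * (u1 + dt * L(u1))
    return u / 3.0 + (2.0 / 3.0) * (u2 + dt * L(u2))

# Illustrative use on the scalar decay ODE du/dt = -u (not an MHD system):
u, dt = 1.0, 0.1
for _ in range(10):
    u = tvd_rk3_step(u, dt, lambda v: -v)
# after t = 1, u is close to exp(-1)
```

Each stage is a convex combination of forward-Euler steps, which is what gives the scheme its total-variation-diminishing property when paired with a TVD spatial discretization.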
T.J. Urbatsch; T.M. Evans
2006-02-15
We have released Version 2 of Milagro, an object-oriented, C++ code that performs radiative transfer using Fleck and Cummings' Implicit Monte Carlo method. Milagro, a part of the Jayenne program, is a stand-alone driver code used as a methods research vehicle and to verify its underlying classes. These underlying classes are used to construct Implicit Monte Carlo packages for external customers. Milagro-2 represents a design overhaul that allows better parallelism and extensibility. New features in Milagro-2 include verified momentum deposition, restart capability, graphics capability, exact energy conservation, and improved load balancing and parallel efficiency. A users' guide also describes how to configure, make, and run Milagro-2.
DOPEX-1D2C: A one-dimensional, two-constraint radiation shield optimization code
NASA Technical Reports Server (NTRS)
Lahti, G. P.
1973-01-01
A one-dimensional, two-constraint radiation shield weight optimization procedure and a computer program, DOPEX-1D2C, are described. DOPEX-1D2C uses the steepest descent method to alter a set of initial (input) thicknesses of a spherical shield configuration to achieve a minimum weight while simultaneously satisfying two dose-rate constraints. The code assumes an exponential dose-shield thickness relation with parameters specified by the user. Code input instructions, a FORTRAN-4 listing, and a sample problem are given. Typical computer time required to optimize a seven-layer shield is less than 1/2 minute on an IBM 7094.
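The steepest-descent idea can be sketched for a deliberately simplified case: one dose constraint instead of DOPEX-1D2C's two, slab weights per unit area instead of spherical shells, and the exponential dose-thickness relation the abstract mentions. All names and values here are illustrative assumptions, not the program's actual algorithm:

```python
import math

def optimize_shield(t, rho, mu, attenuation, lr=0.05, iters=500):
    """Projected steepest-descent sketch of shield weight minimization.

    Simplifications relative to DOPEX-1D2C: one constraint, slab
    geometry. Weight W = sum(rho_i * t_i); exponential dose model gives
    the constraint sum(mu_i * t_i) = attenuation = ln(D0 / D_limit).
    """
    t = list(t)
    nn = sum(m * m for m in mu)
    for _ in range(iters):
        # Project the weight gradient (rho) onto the constraint surface.
        gn = sum(r * m for r, m in zip(rho, mu))
        step = [r - gn / nn * m for r, m in zip(rho, mu)]
        t = [max(0.0, x - lr * s) for x, s in zip(t, step)]
        # Rescale thicknesses back onto the dose constraint.
        scale = attenuation / sum(m * x for m, x in zip(mu, t))
        t = [scale * x for x in t]
    return t

# Two hypothetical materials with equal density; material 0 attenuates
# twice as strongly, so the descent migrates all thickness into it.
t_opt = optimize_shield([1.0, 1.0], rho=[1.0, 1.0], mu=[2.0, 1.0],
                        attenuation=math.log(10.0))
```

The projected step leaves the dose constraint unchanged while reducing weight, which is the essence of constrained steepest descent; DOPEX-1D2C applies the same principle with two simultaneous dose-rate constraints and spherical-shell weights.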
European Code against Cancer 4th Edition: Ionising and non-ionising radiation and cancer.
McColl, Neil; Auvinen, Anssi; Kesminiene, Ausrele; Espina, Carolina; Erdmann, Friederike; de Vries, Esther; Greinert, Rüdiger; Harrison, John; Schüz, Joachim
2015-12-01
Ionising radiation can transfer sufficient energy to ionise molecules, and this can lead to chemical changes, including DNA damage in cells. Key evidence for the carcinogenicity of ionising radiation comes from: follow-up studies of the survivors of the atomic bombings in Japan; other epidemiological studies of groups that have been exposed to radiation from medical, occupational or environmental sources; experimental animal studies; and studies of cellular responses to radiation. Considering exposure to environmental ionising radiation, inhalation of naturally occurring radon is the major source of radiation in the population - in doses orders of magnitude higher than those from nuclear power production or nuclear fallout. Indoor exposure to radon and its decay products is an important cause of lung cancer; radon may cause approximately one in ten lung cancers in Europe. Exposures to radon in buildings can be reduced via a three-step process of identifying those with potentially elevated radon levels, measuring radon levels, and reducing exposure by installation of remediation systems. In the 4th Edition of the European Code against Cancer it is therefore recommended to: "Find out if you are exposed to radiation from naturally high radon levels in your home. Take action to reduce high radon levels". Non-ionising types of radiation (those with insufficient energy to ionise molecules) - including extremely low-frequency electric and magnetic fields as well as radiofrequency electromagnetic fields - are not an established cause of cancer and are therefore not addressed in the recommendations to reduce cancer risk. PMID:26126928
Simulation of high-energy radiation belt electron fluxes using NARMAX-VERB coupled codes
Pakhotin, I P; Drozdov, A Y; Shprits, Y Y; Boynton, R J; Subbotin, D A; Balikhin, M A
2014-01-01
This study presents a fusion of data-driven and physics-driven methodologies of energetic electron flux forecasting in the outer radiation belt. Data-driven NARMAX (Nonlinear AutoRegressive Moving Averages with eXogenous inputs) model predictions for geosynchronous orbit fluxes have been used as an outer boundary condition to drive the physics-based Versatile Electron Radiation Belt (VERB) code, to simulate energetic electron fluxes in the outer radiation belt environment. The coupled system has been tested for three extended time periods totalling several weeks of observations. The time periods involved periods of quiet, moderate, and strong geomagnetic activity and captured a range of dynamics typical of the radiation belts. The model has successfully simulated energetic electron fluxes for various magnetospheric conditions. Physical mechanisms that may be responsible for the discrepancies between the model results and observations are discussed. PMID:26167432
Evaluation of Error-Correcting Codes for Radiation-Tolerant Memory
NASA Astrophysics Data System (ADS)
Jeon, S.; Vijaya Kumar, B. V. K.; Hwang, E.; Cheng, M. K.
2010-05-01
In space, radiation particles can introduce temporary or permanent errors in memory systems. To protect against potential memory faults, either thick shielding or error-correcting codes (ECC) are used by memory modules. Thick shielding translates into increased mass, and conventional ECCs designed for memories are typically capable of correcting only a single error and detecting a double error. Decoding is usually performed through hard decisions where bits are treated as either correct or flipped in polarity. We demonstrate that low-density parity-check (LDPC) codes that are already prevalent in many communication applications can also be used to protect memories in space. Because the achievable code rate monotonically decreases with time due to the accumulation of permanent errors, the achievable rate serves as a useful metric in designing an appropriate ECC. We describe how to compute soft symbol reliabilities on our channel and compare the performance of soft-decision decoding LDPC codes against conventional hard-decision decoding of Reed-Solomon (RS) codes and Bose-Chaudhuri-Hocquenghem (BCH) codes for a specific memory structure.
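For context, the single-error-correcting behavior of a conventional memory ECC that the authors contrast with soft-decision LDPC decoding can be illustrated with a Hamming(7,4) code. This is a generic textbook sketch, not the memory structure or codes studied in the paper:

```python
def hamming74_encode(d):
    """d = [d1, d2, d3, d4] data bits -> 7-bit codeword (parity at 1, 2, 4)."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(c):
    """Hard-decision decode: return (corrected word, error position or 0)."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s3      # syndrome = 1-indexed error position, 0 if clean
    if pos:
        c[pos - 1] ^= 1             # flip the single erroneous bit back
    return c, pos

word = hamming74_encode([1, 0, 1, 1])
hit = list(word)
hit[4] ^= 1                         # a radiation-induced single bit flip
fixed, pos = hamming74_correct(hit)
```

A second flip would defeat this decoder, which is precisely the limitation that motivates the soft-decision LDPC approach in the paper.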
3D and 4D Simulations of the Dynamics of the Radiation Belts using VERB code
NASA Astrophysics Data System (ADS)
Shprits, Yuri; Kellerman, Adam; Drozdov, Alexander; Orlova, Ksenia
2015-04-01
Modeling and understanding of the ring current and higher-energy radiation belts has been a grand challenge since the beginning of the space age. In this study we show long-term simulations of the radiation belts with the 3D VERB code, with boundary conditions derived from observations around geosynchronous orbit. We also present 4D VERB simulations that include convective transport, radial diffusion, pitch-angle scattering and local acceleration. We show that radial transport at lower energies is dominated by convection, while at higher energies it is dominated by diffusive radial transport. We also show that there exists an intermediate range of electron energies for which both processes operate simultaneously.
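The diffusive radial-transport ingredient that dominates at higher energies can be sketched as an explicit finite-difference solve of the radial diffusion equation df/dt = L² ∂/∂L (D_LL/L² ∂f/∂L). The grid, the diffusion coefficient, and the fixed outer-boundary value below are illustrative placeholders (the boundary value plays the role of the geosynchronous data), not VERB's actual inputs:

```python
NL, L_MIN, L_MAX = 41, 2.0, 6.6
dL = (L_MAX - L_MIN) / (NL - 1)
L = [L_MIN + i * dL for i in range(NL)]
D = [1e-3 * (l / 6.6) ** 10 for l in L]   # steeply L-dependent D_LL stand-in

def diffuse(f, dt, steps, f_outer=1.0):
    """Explicit flux-form step of df/dt = L^2 d/dL (D_LL / L^2 df/dL)."""
    f = list(f)
    for _ in range(steps):
        new = list(f)
        for i in range(1, NL - 1):
            dp = 0.5 * (D[i] + D[i + 1]) / (0.5 * (L[i] + L[i + 1])) ** 2
            dm = 0.5 * (D[i] + D[i - 1]) / (0.5 * (L[i] + L[i - 1])) ** 2
            new[i] = f[i] + dt * L[i] ** 2 / dL ** 2 * (
                dp * (f[i + 1] - f[i]) - dm * (f[i] - f[i - 1]))
        new[0], new[-1] = 0.0, f_outer    # loss at inner edge, data at outer edge
        f = new
    return f

# Phase-space density fills inward from the outer boundary over time.
f = diffuse([0.0] * NL, dt=0.01, steps=20000)
```

The steep fall-off of D_LL with decreasing L is what makes inward transport slow at low L, qualitatively reproducing the diffusion-dominated behavior described for high energies.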
Intercomparison of three microwave/infrared high resolution line-by-line radiative transfer codes
NASA Astrophysics Data System (ADS)
Schreier, F.; Garcia, S. Gimeno; Milz, M.; Kottayil, A.; Höpfner, M.; von Clarmann, T.; Stiller, G.
2013-05-01
An intercomparison of three line-by-line (lbl) codes developed independently for atmospheric sounding - ARTS, GARLIC, and KOPRA - has been performed for a thermal infrared nadir sounding application assuming a HIRS-like (High resolution Infrared Radiation Sounder) setup. Radiances for the HIRS infrared channels and a set of 42 atmospheric profiles from the "Garand dataset" have been computed. Results of this intercomparison and a discussion of reasons of the observed differences are presented.
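The per-channel core of such lbl codes is a layer-by-layer integration of the Schwarzschild equation. A minimal non-scattering nadir sketch follows, with illustrative profile values rather than the Garand dataset:

```python
import math

H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck_wn(nu_cm, T):
    """Planck radiance at wavenumber nu_cm [1/cm], W m^-2 sr^-1 (cm^-1)^-1."""
    nu = nu_cm * 100.0   # convert to 1/m
    return 1e2 * 2 * H * C ** 2 * nu ** 3 / math.expm1(H * C * nu / (KB * T))

def nadir_radiance(nu_cm, T_surf, layer_T, layer_tau):
    """March upward from the surface; each layer absorbs and re-emits."""
    rad = planck_wn(nu_cm, T_surf)
    for T, tau in zip(layer_T, layer_tau):   # layers ordered bottom-to-top
        t = math.exp(-tau)
        rad = rad * t + planck_wn(nu_cm, T) * (1.0 - t)
    return rad

# Sanity check: an opaque isothermal atmosphere radiates at its own temperature.
r_toa = nadir_radiance(900.0, 288.0, [250.0] * 40, [0.5] * 40)
```

A real lbl code repeats this per monochromatic grid point with optical depths built from line databases, which is exactly where codes like ARTS, GARLIC and KOPRA can diverge.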
Development of a space radiation Monte Carlo computer simulation based on the FLUKA and ROOT codes.
Pinsky, L S; Wilson, T L; Ferrari, A; Sala, P; Carminati, F; Brun, R
2001-01-01
This NASA funded project is proceeding to develop a Monte Carlo-based computer simulation of the radiation environment in space. With actual funding only initially in place at the end of May 2000, the study is still in the early stage of development. The general tasks have been identified and personnel have been selected. The code to be assembled will be based upon two major existing software packages. The radiation transport simulation will be accomplished by updating the FLUKA Monte Carlo program, and the user interface will employ the ROOT software being developed at CERN. The end-product will be a Monte Carlo-based code which will complement the existing analytic codes such as BRYNTRN/HZETRN presently used by NASA to evaluate the effects of radiation shielding in space. The planned code will possess the ability to evaluate the radiation environment for spacecraft and habitats in Earth orbit, in interplanetary space, on the lunar surface, or on a planetary surface such as Mars. Furthermore, it will be useful in the design and analysis of experiments such as ACCESS (Advanced Cosmic-ray Composition Experiment for Space Station), which is an Office of Space Science payload currently under evaluation for deployment on the International Space Station (ISS). FLUKA will be significantly improved and tailored for use in simulating space radiation in four ways. First, the additional physics not presently within the code that is necessary to simulate the problems of interest, namely the heavy ion inelastic processes, will be incorporated. Second, the internal geometry package will be replaced with one that will substantially increase the calculation speed as well as simplify the data input task. Third, default incident flux packages that include all of the different space radiation sources of interest will be included. Finally, the user interface and internal data structure will be melded together with ROOT, the object-oriented data analysis infrastructure system. Beyond
New Particle-in-Cell Code for Numerical Simulation of Coherent Synchrotron Radiation
Balsa Terzic, Rui Li
2010-05-01
We present a first look at the new code for self-consistent, 2D simulations of beam dynamics affected by the coherent synchrotron radiation. The code is of the particle-in-cell variety: the beam bunch is sampled by point-charge particles, which are deposited on the grid; the corresponding forces on the grid are then computed using retarded potentials according to causality, and interpolated so as to advance the particles in time. The retarded potentials are evaluated by integrating over the 2D path history of the bunch, with the charge and current density at the retarded time obtained from interpolation of the particle distributions recorded at discrete timesteps. The code is benchmarked against analytical results obtained for a rigid-line bunch. We also outline the features and applications which are currently being developed.
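The deposit and gather steps described above are standard particle-in-cell ingredients. A minimal 1D cloud-in-cell sketch, with an illustrative grid and charges (the paper's code is 2D and adds retarded-potential field evaluation):

```python
def deposit(positions, charges, nx, dx):
    """Linear (cloud-in-cell) deposition of point charges onto a uniform grid."""
    rho = [0.0] * nx
    for x, q in zip(positions, charges):
        cell = int(x / dx)
        frac = x / dx - cell
        rho[cell % nx] += q * (1.0 - frac) / dx        # share with the left node
        rho[(cell + 1) % nx] += q * frac / dx          # and the right node
    return rho

def gather(rho, x, dx):
    """Interpolate a grid quantity back to a particle with the same weights."""
    cell = int(x / dx)
    frac = x / dx - cell
    nx = len(rho)
    return rho[cell % nx] * (1.0 - frac) + rho[(cell + 1) % nx] * frac

dx, nx = 0.1, 10
rho = deposit([0.23, 0.71], [1.0, -0.5], nx, dx)
total = sum(r * dx for r in rho)   # CIC deposition conserves total charge
```

Using the same weights for deposit and gather avoids spurious self-forces, which matters for the self-consistency the abstract emphasizes.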
Application of the new MultiTrans SP3 radiation transport code in BNCT dose planning.
Kotiluoto, P; Hiisamäki, P; Savolainen, S
2001-09-01
Dose planning in boron neutron capture therapy (BNCT) is a complex problem and requires sophisticated numerical methods. In the framework of the Finnish BNCT project, a new deterministic three-dimensional radiation transport code, MultiTrans SP3, has been developed at VTT Chemical Technology, based on a novel application of the tree multigrid technique. To test the applicability of this new code on a realistic BNCT dose planning problem, a cylindrical PMMA (polymethyl methacrylate) phantom was chosen as a benchmark case. It is a convenient benchmark, as it has been modeled by several different codes, including the well-known DORT and MCNP, and extensive measured data also exist. In this paper, a comparison of the new MultiTrans SP3 code with other methods is presented for the PMMA phantom case. Results show that the total neutron dose rate to the ICRU adult brain calculated by the MultiTrans SP3 code differs by less than 4% at 2 cm depth in the phantom (at the thermal maximum) from the DORT calculation. Results also show that the calculated 197Au(n,gamma) and 55Mn(n,gamma) reaction rates at 2 cm depth in the phantom differ by less than 4% and 1% from the measured values, respectively. However, the photon dose calculated by the MultiTrans SP3 code appears to be incorrect in this PMMA phantom case, which requires further study. As expected, the deterministic MultiTrans SP3 code is over an order of magnitude faster than stochastic Monte Carlo codes (at similar resolution), thus providing a very efficient tool for BNCT dose planning. PMID:11585221
SKIRT: An advanced dust radiative transfer code with a user-friendly architecture
NASA Astrophysics Data System (ADS)
Camps, P.; Baes, M.
2015-03-01
We discuss the architecture and design principles that underpin the latest version of SKIRT, a state-of-the-art open source code for simulating continuum radiation transfer in dusty astrophysical systems, such as spiral galaxies and accretion disks. SKIRT employs the Monte Carlo technique to emulate the relevant physical processes including scattering, absorption and emission by the dust. The code features a wealth of built-in geometries, radiation source spectra, dust characterizations, dust grids, and detectors, in addition to various mechanisms for importing snapshots generated by hydrodynamical simulations. The configuration for a particular simulation is defined at run-time through a user-friendly interface suitable for both occasional and power users. These capabilities are enabled by careful C++ code design. The programming interfaces between components are well defined and narrow. Adding a new feature is usually as simple as adding another class; the user interface automatically adjusts to allow configuring the new options. We argue that many scientific codes, like SKIRT, can benefit from careful object-oriented design and from a friendly user interface, even if it is not a graphical user interface.
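The Monte Carlo technique SKIRT employs can be reduced to a photon-packet random walk. The following plane-parallel slab sketch is a generic illustration under assumed parameters, not SKIRT's implementation; for pure absorption the escaping fraction should approach exp(-τ):

```python
import math
import random

def escape_fraction(tau_slab, albedo, n_photons=100_000, seed=1):
    """Fraction of photon packets leaving a plane-parallel slab (either face)."""
    rng = random.Random(seed)
    out = 0
    for _ in range(n_photons):
        z, mu = 0.0, 1.0                      # optical-depth position, direction cosine
        while True:
            z += mu * rng.expovariate(1.0)    # free path drawn from exp(-tau)
            if z >= tau_slab or z < 0.0:
                out += 1                      # escaped the slab
                break
            if rng.random() < albedo:
                mu = rng.uniform(-1.0, 1.0)   # survived: isotropic scatter (sketch)
            else:
                break                         # absorbed by a dust grain
    return out / n_photons

frac_abs = escape_fraction(1.0, 0.0)   # pure absorption: close to e^-1
frac_sca = escape_fraction(1.0, 0.9)   # scattering lets more packets escape
```

Everything SKIRT layers on top (3D geometries, anisotropic phase functions, dust emission, peel-off detectors) elaborates this same loop.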
Collisional radiative average atom code based on a relativistic Screened Hydrogenic Model
NASA Astrophysics Data System (ADS)
Benita, A. J.; Mínguez, E.; Mendoza, M. A.; Rubiano, J. G.; Gil, J. M.; Rodríguez, R.; Martel, P.
2015-03-01
A steady-state and time-dependent collisional-radiative ''average-atom'' (AA) model (ATMED CR) is presented for the calculation of atomic and radiative properties of plasmas over a wide range of laboratory and theoretical conditions: coronal, local thermodynamic equilibrium or nonlocal thermodynamic equilibrium, optically thin or thick plasmas, and photoionized plasmas. The radiative and collisional rates are computed from a set of analytical approximations that compare well with more sophisticated quantum treatments of atomic rates while yielding fast calculations. The atomic model is based on a new Relativistic Screened Hydrogenic Model (NRSHM) with a set of universal screening constants, including nlj-splitting, that has been obtained by fitting to a large database of ionization potentials and excitation energies compiled from the National Institute of Standards and Technology (NIST) database and the Flexible Atomic Code (FAC). The NRSHM has been validated by comparing its results with ionization energies, transition energies and wave functions computed using sophisticated self-consistent codes, as well as with experimental data. All the calculations presented in this work were performed using the ATMED CR code.
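The screened-hydrogenic idea underlying the NRSHM can be stated compactly: each shell sees a nuclear charge reduced by a screening constant, and level energies are hydrogen-like in the screened charge. A non-relativistic sketch with illustrative screening constants (the fitted NRSHM set, with nlj-splitting, is more elaborate):

```python
RY_EV = 13.605693  # Rydberg energy in eV

def shell_energy(z, n, sigma_n):
    """Hydrogen-like energy (eV) of shell n seeing screened charge z - sigma_n."""
    return -RY_EV * (z - sigma_n) ** 2 / n ** 2

# With no screening this reduces to the Bohr formula (hydrogen ground state).
e_h = shell_energy(1, 1, 0.0)

# Illustrative iron shells with assumed screening constants (not the NRSHM fit):
e_k = shell_energy(26, 1, 0.30)    # K shell
e_l = shell_energy(26, 2, 4.15)    # L shell, more strongly screened
```

The speed of the AA approach comes from evaluating closed forms like this instead of self-consistent atomic structure calculations.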
NASA Technical Reports Server (NTRS)
Chambers, Lin Hartung
1994-01-01
The theory for radiation emission, absorption, and transfer in a thermochemical nonequilibrium flow is presented. The expressions developed reduce correctly to the limit at equilibrium. To implement the theory in a practical computer code, some approximations are used, particularly the smearing of molecular radiation. Details of these approximations are presented and helpful information is included concerning the use of the computer code. This user's manual should benefit both occasional users of the Langley Optimized Radiative Nonequilibrium (LORAN) code and those who wish to use it to experiment with improved models or properties.
Burn, K W; Daffara, C; Gualdrini, G; Pierantoni, M; Ferrari, P
2007-01-01
The question of Monte Carlo simulation of radiation transport in voxel geometries is addressed. Patched versions of the MCNP and MCNPX codes are developed aimed at transporting radiation both in the standard geometry mode and in the voxel geometry treatment. The patched code reads an unformatted FORTRAN file derived from DICOM format data and uses special subroutines to handle voxel-to-voxel radiation transport. The various phases of the development of the methodology are discussed together with the new input options. Examples are given of employment of the code in internal and external dosimetry and comparisons with results from other groups are reported. PMID:17038404
Radiation Coupling with the FUN3D Unstructured-Grid CFD Code
NASA Technical Reports Server (NTRS)
Wood, William A.
2012-01-01
The HARA radiation code is fully coupled to the FUN3D unstructured-grid CFD code for the purpose of simulating high-energy hypersonic flows. The radiation energy source terms and surface heat transfer, under the tangent slab approximation, are included within the fluid dynamic flow solver. The Fire II flight test, at the Mach-31 1643-second trajectory point, is used as a demonstration case. Comparisons are made with an existing structured-grid capability, the LAURA/HARA coupling. The radiative surface heat transfer rates from the present approach match the benchmark values within 6%. Although radiation coupling is the focus of the present work, convective surface heat transfer rates are also reported, and are seen to vary depending upon the choice of mesh connectivity and FUN3D flux reconstruction algorithm. On a tetrahedral-element mesh the convective heating matches the benchmark at the stagnation point, but under-predicts by 15% on the Fire II shoulder. Conversely, on a mixed-element mesh the convective heating over-predicts at the stagnation point by 20%, but matches the benchmark away from the stagnation region.
EMMA: an adaptive mesh refinement cosmological simulation code with radiative transfer
NASA Astrophysics Data System (ADS)
Aubert, Dominique; Deparis, Nicolas; Ocvirk, Pierre
2015-11-01
EMMA is a cosmological simulation code aimed at investigating the reionization epoch. It handles simultaneously collisionless and gas dynamics, as well as radiative transfer physics, using a moment-based description with the M1 approximation. Field quantities are stored and computed on an adaptive three-dimensional mesh, and the spatial resolution can be dynamically modified based on physically motivated criteria. Physical processes can be coupled at all spatial and temporal scales. We also introduce a new and optional approximation to handle radiation: light is transported at the resolution of the unrefined grid and only once the dynamics has been fully updated, whereas thermo-chemical processes are still tracked on the refined elements. This approximation reduces the overhead induced by the treatment of radiation physics. A suite of standard tests is presented and passed by EMMA, providing a validation for its future use in studies of the reionization epoch. The code is parallel and is able to use graphics processing units (GPUs) to accelerate hydrodynamics and radiative transfer calculations. Depending on the optimizations and the compilers used to generate the CPU reference, global GPU acceleration factors between ×3.9 and ×16.9 can be obtained. Vectorization and transfer operations currently prevent better GPU performance, and we expect that future optimizations and hardware evolution will lead to greater accelerations.
Reanalysis and forecasting killer electrons in Earth's radiation belts using the VERB code
NASA Astrophysics Data System (ADS)
Kellerman, Adam; Kondrashov, Dmitri; Shprits, Yuri; Podladchikova, Tatiana; Drozdov, Alexander
2016-07-01
The Van Allen radiation belts are torus-shaped regions of trapped energetic particles that in recent years have become a principal focus for satellite operators and engineers. During geomagnetic storms, electrons can be accelerated up to relativistic energies, at which they may penetrate spacecraft shielding and damage electrical systems, causing permanent damage or loss of spacecraft. Data assimilation provides an optimal way to combine observations of the radiation belts with a physics-based model in order to more accurately specify the global state of the Earth's radiation belts. We present recent advances in the data-assimilative version of the Versatile Electron Radiation Belt (VERB) code, including more sophisticated error analysis and the incorporation of realistic field models to more accurately specify fluxes at a given MLT or along a spacecraft trajectory. The effect of recent stream-interaction-region (SIR) driven enhancements is investigated using the improved model. We also present a real-time forecast model based on the data-assimilative VERB code, and discuss its forecast performance over the past 12 months.
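At its core, the data-assimilation step blends a model forecast with an observation, weighting each by its error variance. A scalar Kalman-filter sketch with illustrative numbers (VERB's assimilation operates on a multidimensional state):

```python
def kalman_update(x_model, var_model, y_obs, var_obs):
    """Blend a model forecast with an observation; return analysis state/variance."""
    gain = var_model / (var_model + var_obs)     # trust the obs more when the model is uncertain
    x_an = x_model + gain * (y_obs - x_model)    # analysis state
    var_an = (1.0 - gain) * var_model            # analysis variance
    return x_an, var_an

# Model says log-flux 5.0 (variance 0.4); spacecraft measures 6.0 (variance 0.1).
x, v = kalman_update(x_model=5.0, var_model=0.4, y_obs=6.0, var_obs=0.1)
```

The analysis variance is smaller than both input variances, which is why assimilation "more accurately specifies the global state" than either model or data alone.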
MULTI2D - a computer code for two-dimensional radiation hydrodynamics
NASA Astrophysics Data System (ADS)
Ramis, R.; Meyer-ter-Vehn, J.; Ramírez, J.
2009-06-01
Simulation of radiation hydrodynamics in two spatial dimensions is developed, having in mind, in particular, target design for indirectly driven inertial fusion energy (IFE) and the interpretation of related experiments. Intense radiation pulses by laser or particle beams heat high-Z target configurations of different geometries and lead to a regime which is optically thick in some regions and optically thin in others. A diffusion description is inadequate in this situation. A new numerical code has been developed which describes hydrodynamics in two spatial dimensions (cylindrical R-Z geometry) and radiation transport along rays in three dimensions, with the 4π solid angle discretized in direction. Matter moves on a non-structured mesh composed of trilateral and quadrilateral elements. Radiation flux of a given direction enters on two (one) sides of a triangle and leaves on the opposite side(s) in proportion to the viewing angles, depending on the geometry. This scheme allows sharply edged beams to be propagated without ray tracing, though at the price of some lateral diffusion. The algorithm treats correctly both the optically thin and optically thick regimes. A symmetric semi-implicit (SSI) method is used to guarantee numerical stability.
Program summary. Program title: MULTI2D. Catalogue identifier: AECV_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AECV_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 151 098. No. of bytes in distributed program, including test data, etc.: 889 622. Distribution format: tar.gz. Programming language: C. Computer: PC (32-bit architecture). Operating system: Linux/Unix. RAM: 2 Mbytes. Word size: 32 bits. Classification: 19.7. External routines: X-window standard library (libX11.so) and corresponding header files (X11/*.h).
Application of the MASH v1.0 Code System to radiological warfare radiation threats
Johnson, J.O.; Santoro, R.T.; Smith, M.S.
1994-03-01
Nuclear hardening capabilities of US and foreign ground force systems are a primary concern of the Department of Defense (DoD) and US Army. The Monte Carlo Adjoint Shielding Code System, MASH v1.0, was developed at Oak Ridge National Laboratory (ORNL) to analyze these capabilities, i.e. the shielding effectiveness, for prompt radiation from a nuclear weapon detonation. Rapidly changing world events and the proliferation of nuclear-weapons-related technology have broadened the kinds of nuclear threats to include intentionally dispersed radiation sources and fallout from tactical nuclear weapons used in the modern AirLand battlefield scenario. Consequently, a DoD area of increasing interest focuses on determining the shielding effectiveness of foreign and US armored vehicles against radiological warfare and fallout radiation threats. To demonstrate the applicability of MASH for analyzing dispersed radiation source problems, calculations have been completed for two distributed sources: a dispersed radiation environment simulated by a uniformly distributed {sup 60}Co source, and a {sup 235}U fission weapon fallout source. Fluence and dose assessments were performed for the free field, the inside of a steel-walled two-meter box, in a phantom standing in the free field, and in a phantom standing in the two-meter box. The results indicate substantial radiation protection factors for the {sup 60}Co dispersed radiation source and the fallout source compared to the prompt radiation protection factors. The dose protection factors ranged from 40 to 95 for the two-meter box and from 55 to 123 for the mid-gut position of the phantom standing in the box. The results further indicate that a {sup 60}Co source might be a good first-order approximation for a tactical fission weapon fallout protection factor analysis.
Spectral and Structure Modeling of Low and High Mass Young Stars Using a Radiative Transfer Code
NASA Astrophysics Data System (ADS)
Robson Rocha, Will; Pilling, Sergio
Spectroscopic data from space telescopes (ISO, Spitzer, Herschel) show that in addition to dust grains (e.g. silicates), frozen molecular species (astrophysical ices, such as H2O, CO, CO2, CH3OH) are also present in circumstellar environments. In this work we present a study of the modeling of low- and high-mass young stellar objects (YSOs), in which we highlight the importance of using astrophysical ices processed by radiation (UV, cosmic rays) coming from stars in the process of formation. This is important for characterizing the physicochemical evolution of the ices distributed through the protostellar disk and its envelope. To perform this analysis, we gathered (i) observational data from the Infrared Space Observatory (ISO) for the low-mass protostar Elias29 and the high-mass protostar W33A, (ii) experimental absorbance data in the infrared spectral range, used to determine the optical constants of the materials observed around these objects, and (iii) a powerful radiative transfer code to simulate the astrophysical environment (RADMC-3D, Dullemond et al. 2012). Briefly, the radiative transfer calculation of the YSOs was done employing the RADMC-3D code. The model outputs were the spectral energy distribution and theoretical images of the studied objects at different wavelengths. The code is based on the Monte Carlo method, combined with Mie theory for the interaction between radiation and matter. The observational data from different space telescopes were used as references for comparison with the modeled data. The optical constants in the infrared, used as input to the models, were calculated directly from absorbance data obtained in the laboratory for both unprocessed and processed simulated interstellar samples, using the NKABS code (Rocha & Pilling 2014). We show from this study that some absorption bands in the infrared, observed in the spectra of Elias29 and W33A, can arise after the ices
Comparison of Radiation Transport Codes, HZETRN, HETC and FLUKA, Using the 1956 Webber SPE Spectrum
NASA Technical Reports Server (NTRS)
Heinbockel, John H.; Slaba, Tony C.; Blattnig, Steve R.; Tripathi, Ram K.; Townsend, Lawrence W.; Handler, Thomas; Gabriel, Tony A.; Pinsky, Lawrence S.; Reddell, Brandon; Clowdsley, Martha S.; Singleterry, Robert C.; Norbury, John W.; Badavi, Francis F.; Aghara, Sukesh K.
2009-01-01
Protection of astronauts and instrumentation from galactic cosmic rays (GCR) and solar particle events (SPE) in the harsh environment of space is of prime importance in the design of personal shielding, spacecraft, and mission planning. Early entry of radiation constraints into the design process enables optimal shielding strategies, but demands efficient and accurate tools that can be used by design engineers in every phase of an evolving space project. The radiation transport code HZETRN is an efficient tool for analyzing the shielding effectiveness of materials exposed to space radiation. In this paper, HZETRN is compared to the Monte Carlo codes HETC-HEDS and FLUKA for a shield/target configuration comprised of a 20 g/cm^2 aluminum slab in front of a 30 g/cm^2 slab of water, exposed to the February 1956 SPE as modeled by the Webber spectrum. Neutron and proton fluence spectra, as well as dose and dose equivalent values, are compared at various depths in the water target. This study shows that there are many regions where HZETRN agrees with both HETC-HEDS and FLUKA for this shield/target configuration and the SPE environment. However, there are also regions with appreciable differences between the three computer codes.
The Development of the Ducted Fan Noise Propagation and Radiation Code CDUCT-LaRC
NASA Technical Reports Server (NTRS)
Nark, Douglas M.; Farassat, F.; Pope, D. Stuart; Vatsa, Veer
2003-01-01
The development of the ducted fan noise propagation and radiation code CDUCT-LaRC at NASA Langley Research Center is described. This code calculates the propagation and radiation of given acoustic modes ahead of the fan face or aft of the exhaust guide vanes in the inlet or exhaust ducts, respectively. This paper gives a description of the modules comprising CDUCT-LaRC. The grid generation module provides automatic creation of numerical grids for complex (non-axisymmetric) geometries that include single or multiple pylons. Files for performing automatic inviscid mean flow calculations are also generated within this module. The duct propagation is based on the parabolic approximation theory of R. P. Dougherty. This theory allows the handling of complex internal geometries and the ability to study the effect of non-uniform (i.e. circumferentially and axially segmented) liners. Finally, the duct radiation module is based on the Ffowcs Williams-Hawkings (FW-H) equation with a penetrable data surface. Refraction of sound through the shear layer between the external flow and bypass duct flow is included. Results for benchmark annular ducts, as well as other geometries with pylons, are presented and compared with available analytical data.
Blakeman, E.D.
2000-05-07
A software system, GRAVE (Geometry Rendering and Visual Editor), has been developed at the Oak Ridge National Laboratory (ORNL) to perform interactive visualization and development of models used as input to the TORT three-dimensional discrete ordinates radiation transport code. Three-dimensional and two-dimensional visualization displays are included. Display capabilities include image rotation, zoom, translation, wire-frame and translucent display, geometry cuts and slices, and display of individual component bodies and material zones. The geometry can be interactively edited and saved in TORT input file format. This system is an advancement over the current, non-interactive, two-dimensional display software. GRAVE is programmed in the Java programming language and can be implemented on a variety of computer platforms. Three-dimensional visualization is enabled through the Visualization Toolkit (VTK), a freeware C++ software library developed for geometric and data visual display. Future plans include an extension of the system to read inputs using binary zone maps and combinatorial geometry models containing curved surfaces, such as those used for Monte Carlo code inputs. GRAVE will also be extended to geometry visualization/editing for the DORT two-dimensional transport code and will be integrated into a single GUI-based system for all of the ORNL discrete ordinates transport codes.
NASA Astrophysics Data System (ADS)
Havemann, Stephan; Thelen, Jean-Claude; Taylor, Jonathan P.; Keil, Andreas
2009-03-01
The Havemann-Taylor Fast Radiative Transfer Code (HT-FRTC) has been developed for the simulation of highly spectrally resolved measurements from satellite-based (e.g. the Infrared Atmospheric Sounding Interferometer (IASI), the Atmospheric Infrared Sounder (AIRS)) and airborne (e.g. the Atmospheric Research Interferometer Evaluation System (ARIES)) instruments. The use of principal components enables the calculation of a complete spectrum in less than a second. The principal components are derived from a diverse training set of atmospheres and surfaces and contain their spectral characteristics in a highly compressed form. For any given atmosphere/surface, the HT-FRTC calculates the weightings (also called scores) of a few hundred principal components based on selected monochromatic radiative transfer calculations, which is far cheaper than thousands of channel radiance calculations. By intercomparison with line-by-line and other fast models, the HT-FRTC has been shown to be accurate. The HT-FRTC has been successfully applied to simultaneous variational retrievals of atmospheric temperature and humidity profiles, surface temperature and surface emissivity over land; this is the subject of another presentation at this conference. The HT-FRTC has now also been extended to include an exact treatment of scattering by aerosols and clouds. The radiative transfer problem is solved using a discrete ordinate method (DISORT). Modelling results at high spectral resolution for non-clear-sky atmospheres obtained with the HT-FRTC are presented.
Update on the Radiation Code in IMPACT: Clouds, Heating Rates, and Comparisons
Edis, T; Grant, K; Cameron-Smith, P
2005-07-22
This is a summary of work done over two months in the summer of 2005, which was devoted to improving the radiation code of IMPACT, the LLNL 3D global atmospheric chemistry and aerosol model. Most of the work concerned the addition and testing of new cloud optical property routines designed to work with CAM3 meteorological data, and the comparison of CAM3 with the results of IMPACT runs using meteorological data from CAM3 and MACCM3. Additional related work done in the course of these main tasks will be described as necessary.
Pandya, Tara M.; Johnson, Seth R.; Evans, Thomas M.; Davidson, Gregory G.; Hamilton, Steven P.; Godfrey, Andrew T.
2015-12-21
This paper discusses the implementation, capabilities, and validation of Shift, a massively parallel Monte Carlo radiation transport package developed and maintained at Oak Ridge National Laboratory. It has been developed to scale well from laptops to small computing clusters to advanced supercomputers. Special features of Shift include hybrid capabilities for variance reduction such as CADIS and FW-CADIS, and advanced parallel decomposition and tally methods optimized for scalability on supercomputing architectures. Shift has been validated and verified against various reactor physics benchmarks and compares well to other state-of-the-art Monte Carlo radiation transport codes such as MCNP5, CE KENO-VI, and OpenMC. Some specific benchmarks used for verification and validation include the CASL VERA criticality test suite and several Westinghouse AP1000® problems. These benchmark and scaling studies show promising results.
NASA Astrophysics Data System (ADS)
Ioan, M.-R.
2016-08-01
In experiments involving ionizing radiation, precise knowledge of the beam parameters is essential. Some experiments use electromagnetic ionizing radiation such as gamma rays and X-rays; others use energetic charged or neutral particles such as protons, electrons, and neutrons; and in other cases larger accelerated particles such as helium or deuterium nuclei are used. In all these cases, the beam that hits the exposed target must first be collimated and precisely characterized. In this paper, a novel method based on Matlab code is proposed to determine the distribution of the collimated beam. The method was implemented by placing Pyrex glass test samples in the beam whose distribution and dimensions are to be determined, taking high-quality pictures of them, and then digitally processing the resulting images. The method also yields information on the doses absorbed in the volume of the exposed samples.
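The digital-processing step can be sketched in a few lines: threshold a grayscale image of the darkened glass sample and take the extent of the above-threshold region as the beam footprint. The routine below is a hypothetical Python illustration (the paper's Matlab implementation is not reproduced here); the 5x5 "image" and the threshold value are invented for the example.

```python
# Hypothetical sketch of the image-based beam-profiling idea: threshold a
# grayscale image of an irradiated glass sample and report the beam extent.
# Not the paper's Matlab implementation; data and threshold are invented.

def beam_width(image, threshold):
    """Return (n_dark_pixels, bounding_box) of pixels >= threshold."""
    hits = [(r, c) for r, row in enumerate(image)
                   for c, v in enumerate(row) if v >= threshold]
    if not hits:
        return 0, None
    rows = [r for r, _ in hits]
    cols = [c for _, c in hits]
    return len(hits), (min(rows), min(cols), max(rows), max(cols))

# Synthetic 5x5 "image": darkening (0-255) concentrated at the centre.
img = [
    [10,  20,  30,  20, 10],
    [20, 120, 180, 110, 20],
    [30, 190, 250, 185, 25],
    [20, 115, 175, 105, 15],
    [10,  20,  25,  20, 10],
]
n, bbox = beam_width(img, threshold=100)
print(n, bbox)   # 9 (1, 1, 3, 3)
```

A real implementation would also calibrate pixel darkening against absorbed dose to recover the dose distribution mentioned in the abstract.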
Improving the Salammbo code modelling and using it to better predict radiation belts dynamics
NASA Astrophysics Data System (ADS)
Maget, Vincent; Sicard-Piet, Angelica; Grimald, Sandrine Rochel; Boscher, Daniel
2016-07-01
In the framework of the FP7-SPACESTORM project, one objective is to improve the reliability of the model-based predictions of radiation belt dynamics first developed during the FP7-SPACECAST project. To this end we have analyzed and improved the way simulations with the ONERA Salammbô code are performed, in particular by: better controlling the driving parameters of the simulation; improving the initialization of the simulation so that it is more accurate at most energies for L values between 4 and 6; and improving the physics of the model. For the first point, a statistical analysis of the accuracy of the Kp index has been conducted. For the second, we based our method on a long-duration simulation from which typical radiation belt states were extracted as a function of solar wind stress and geomagnetic activity. For the last point, we first improved the modelling of the different processes acting in the radiation belts separately, and then analyzed the global improvement obtained when simulating them together. We discuss all these points, as well as the balance that must be struck between the modelled processes to improve radiation belt modelling globally.
HELIOS-CR: A 1-D radiation-magnetohydrodynamics code with inline atomic kinetics modeling
NASA Astrophysics Data System (ADS)
Macfarlane, J. J.; Golovkin, I. E.; Woodruff, P. R.
2006-05-01
HELIOS-CR is a user-oriented 1D radiation-magnetohydrodynamics code to simulate the dynamic evolution of laser-produced plasmas and z-pinch plasmas. It includes an in-line collisional-radiative (CR) model for computing non-LTE atomic level populations at each time step of the hydrodynamics simulation. HELIOS-CR has been designed for ease of use, and is well-suited for experimentalists, as well as graduate and undergraduate student researchers. The energy equations employed include models for laser energy deposition, radiation from external sources, and high-current discharges. Radiative transport can be calculated using either a multi-frequency flux-limited diffusion model, or a multi-frequency, multi-angle short characteristics model. HELIOS-CR supports the use of SESAME equation of state (EOS) tables, PROPACEOS EOS/multi-group opacity data tables, and non-LTE plasma properties computed using the inline CR modeling. Time-, space-, and frequency-dependent results from HELIOS-CR calculations are readily displayed with the HydroPLOT graphics tool. In addition, the results of HELIOS simulations can be post-processed using the SPECT3D Imaging and Spectral Analysis Suite to generate images and spectra that can be directly compared with experimental measurements. The HELIOS-CR package runs on Windows, Linux, and Mac OSX platforms, and includes online documentation. We will discuss the major features of HELIOS-CR, and present example results from simulations.
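The flux-limited diffusion option interpolates between the optically thick diffusion limit and the free-streaming limit through a flux limiter. The abstract does not say which limiter HELIOS-CR uses; the sketch below shows the common Levermore-Pomraning form purely as an illustration, where R = |grad E| / (sigma_R * E).

```python
import math

def levermore_pomraning_limiter(R):
    """Levermore-Pomraning flux limiter lambda(R) = (coth R - 1/R) / R,
    with R = |grad E| / (sigma_R * E). A common choice in flux-limited
    diffusion schemes; not necessarily the limiter used in HELIOS-CR."""
    if R < 1e-6:                         # series expansion near R = 0
        return 1.0 / 3.0 - R * R / 45.0
    return (1.0 / math.tanh(R) - 1.0 / R) / R

# Limits: diffusive regime lambda -> 1/3; free streaming lambda -> 1/R.
print(round(levermore_pomraning_limiter(1e-8), 6))            # 0.333333
print(round(levermore_pomraning_limiter(100.0) * 100.0, 3))   # 0.99
```

The limiter caps the radiative flux at the free-streaming value cE while recovering Fick's law deep inside optically thick material.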
CASTRO: A New AMR Radiation-Hydrodynamics Code for Compressible Astrophysics
NASA Astrophysics Data System (ADS)
Almgren, Ann; Bell, J.; Day, M.; Howell, L.; Joggerst, C.; Myra, E.; Nordhaus, J.; Singer, M.; Zingale, M.
2010-01-01
CASTRO is a new, multi-dimensional, Eulerian AMR radiation-hydrodynamics code designed for astrophysical simulations. The code includes routines for various equations of state and nuclear reaction networks, and can be used with Cartesian, cylindrical or spherical coordinates. Time integration of the hydrodynamics equations is based on a higher-order, unsplit Godunov scheme. Self-gravity can be calculated on the adaptive hierarchy using a simple monopole approximation or a full Poisson solve for the potential. CASTRO includes gray and multigroup radiation diffusion. Multi-species neutrino diffusion for supernovae is nearing completion. The adaptive framework of CASTRO is based on a time-evolving hierarchy of nested rectangular grids with refinement in both space and time; the entire implementation is designed to run on thousands of processors. We describe in more detail how CASTRO is implemented and can be used for a number of different simulations. Our initial applications of CASTRO include Type Ia and Type II supernovae. This work has been supported by the SciDAC Program of the DOE Office of Mathematics, Information, and Computational Sciences under contracts No. DE-AC02-05CH11231 (LBNL), No. DE-FC02-06ER41438 (UCSC), and No. DE-AC52-07NA27344 (LLNL); and LLNL contracts B582735 and B574691 (Stony Brook). Calculations shown were carried out on Franklin at NERSC.
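The monopole approximation mentioned above reduces self-gravity to g(r) = -G M(<r) / r^2, with the enclosed mass accumulated over spherical shells. A schematic sketch under that assumption (not CASTRO's actual implementation):

```python
import math

G = 6.674e-8  # gravitational constant in cgs units, cm^3 g^-1 s^-2

def monopole_gravity(r_edges, rho):
    """Monopole approximation to self-gravity: g(r) = -G * M(<r) / r^2,
    with M(<r) accumulated over spherical shells of density rho.
    A schematic sketch, not CASTRO's actual implementation."""
    g = []
    m_enc = 0.0
    for i, rho_i in enumerate(rho):
        r_in, r_out = r_edges[i], r_edges[i + 1]
        m_enc += rho_i * 4.0 / 3.0 * math.pi * (r_out**3 - r_in**3)
        g.append(-G * m_enc / r_out**2)  # evaluated at the shell's outer edge
    return g

# Uniform-density sphere: |g| grows linearly with radius.
edges = [0.0, 1.0, 2.0, 3.0]
gs = monopole_gravity(edges, [1.0, 1.0, 1.0])
print([round(x / gs[0], 3) for x in gs])   # [1.0, 2.0, 3.0]
```

The full Poisson solve mentioned in the abstract replaces this spherically averaged potential with the exact potential of the (possibly aspherical) mass distribution.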
A Random Walk on WASP-12b with the Bayesian Atmospheric Radiative Transfer (BART) Code
NASA Astrophysics Data System (ADS)
Harrington, Joseph; Cubillos, Patricio; Blecic, Jasmina; Challener, Ryan; Rojo, Patricio; Lust, Nathaniel B.; Bowman, Oliver; Blumenthal, Sarah D.; Foster, Andrew S. D.; Foster, Austin James; Stemm, Madison; Bruce, Dylan
2016-01-01
We present the Bayesian Atmospheric Radiative Transfer (BART) code for atmospheric property retrievals from transit and eclipse spectra, and apply it to WASP-12b, a hot (~3000 K) exoplanet with a high eclipse signal-to-noise ratio. WASP-12b has been controversial. We (Madhusudhan et al. 2011, Nature) claimed it was the first planet with a high C/O abundance ratio. Line et al. (2014, ApJ) suggested a high CO2 abundance to explain the data. Stevenson et al. (2014, ApJ, atmospheric model by Madhusudhan) add additional data and reaffirm the original result, stating that C2H2 and HCN, not included in the Line et al. models, explain the data. We explore several modeling configurations and include Hubble, Spitzer, and ground-based eclipse data. BART consists of a differential-evolution Markov-Chain Monte Carlo sampler that drives a line-by-line radiative transfer code through the phase space of thermal- and abundance-profile parameters. BART is written in Python and C. Python modules generate atmospheric profiles from sets of MCMC parameters and integrate the resulting spectra over observational bandpasses, allowing high flexibility in modeling the planet without interacting with the fast, C portions that calculate the spectra. BART's shared memory and optimized opacity calculation allow it to run on a laptop, enabling classroom use. Runs can scale constant abundance profiles, profiles of thermochemical equilibrium abundances (TEA) calculated by the included TEA code, or arbitrary curves. Several thermal profile parameterizations are available. BART is an open-source, reproducible-research code. Users must release any code or data modifications if they publish results from it, and we encourage the community to use it and to participate in its development via http://github.com/ExOSPORTS/BART. This work was supported by NASA Planetary Atmospheres grant NNX12AI69G and NASA Astrophysics Data Analysis Program grant NNX13AF38G. J. Blecic holds a NASA Earth and Space Science
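The differential-evolution proposal at the heart of such a sampler moves chain i by a scaled difference of two other randomly chosen chains, x* = x_i + gamma * (x_a - x_b) + jitter. Below is a minimal single-parameter sketch in the style of ter Braak (2006); it is not BART's actual Multi-core Markov-chain Monte Carlo module, and all tuning values are illustrative.

```python
import math, random

def demc(log_post, n_chains=6, n_steps=4000, gamma=None, seed=1):
    """Minimal differential-evolution MCMC (ter Braak 2006) sketch of the
    sampler family the abstract describes; NOT BART's actual MC3 module.
    Proposal for chain i: x* = x_i + gamma * (x_a - x_b) + jitter."""
    rng = random.Random(seed)
    if gamma is None:
        gamma = 2.38 / math.sqrt(2.0)      # common choice for 1 parameter
    chains = [rng.uniform(-3.0, 3.0) for _ in range(n_chains)]
    samples = []
    for _ in range(n_steps):
        for i in range(n_chains):
            a, b = rng.sample([j for j in range(n_chains) if j != i], 2)
            prop = chains[i] + gamma * (chains[a] - chains[b]) \
                   + rng.gauss(0.0, 1e-4)  # small jitter avoids degeneracy
            dlp = log_post(prop) - log_post(chains[i])
            if dlp >= 0.0 or rng.random() < math.exp(dlp):
                chains[i] = prop           # Metropolis accept
            samples.append(chains[i])
    return samples

# Target: standard normal log-posterior; the sample mean should be near 0.
draws = demc(lambda x: -0.5 * x * x)
print(round(sum(draws) / len(draws), 2))
```

The appeal of the DE proposal is that the inter-chain differences automatically adapt the jump scale and orientation to the posterior, which matters in the correlated thermal/abundance parameter spaces BART explores.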
WASP-12b According to the Bayesian Atmospheric Radiative Transfer (BART) Code
NASA Astrophysics Data System (ADS)
Harrington, Joseph; Cubillos, Patricio E.; Blecic, Jasmina; Challener, Ryan C.; Rojo, Patricio M.; Lust, Nate B.; Bowman, M. Oliver; Blumenthal, Sarah D.; Foster, Andrew SD; Foster, A. J.
2015-11-01
We present the Bayesian Atmospheric Radiative Transfer (BART) code for atmospheric property retrievals from transit and eclipse spectra, and apply it to WASP-12b, a hot (~3000 K) exoplanet with a high eclipse signal-to-noise ratio. WASP-12b has been controversial. We (Madhusudhan et al. 2011, Nature) claimed it was the first planet with a high C/O abundance ratio. Line et al. (2014, ApJ) suggested a high CO2 abundance to explain the data. Stevenson et al. (2014, ApJ, atmospheric model by Madhusudhan) add additional data and reaffirm the original result, stating that C2H2 and HCN, not included in the Line et al. models, explain the data. We explore several modeling configurations and include Hubble, Spitzer, and ground-based eclipse data. BART consists of a differential-evolution Markov-Chain Monte Carlo sampler that drives a line-by-line radiative transfer code through the phase space of thermal- and abundance-profile parameters. BART is written in Python and C. Python modules generate atmospheric profiles from sets of MCMC parameters and integrate the resulting spectra over observational bandpasses, allowing high flexibility in modeling the planet without interacting with the fast, C portions that calculate the spectra. BART's shared memory and optimized opacity calculation allow it to run on a laptop, enabling classroom use. Runs can scale constant abundance profiles, profiles of thermochemical equilibrium abundances (TEA) calculated by the included TEA code, or arbitrary curves. Several thermal profile parameterizations are available. BART is an open-source, reproducible-research code. Users must release any code or data modifications if they publish results from it, and we encourage the community to use it and to participate in its development via http://github.com/ExOSPORTS/BART. This work was supported by NASA Planetary Atmospheres grant NNX12AI69G and NASA Astrophysics Data Analysis Program grant NNX13AF38G. J. Blecic holds a NASA Earth and Space Science
A Radiation Chemistry Code Based on the Green's Functions of the Diffusion Equation
NASA Technical Reports Server (NTRS)
Plante, Ianik; Wu, Honglu
2014-01-01
Ionizing radiation produces several radiolytic species, such as •OH, e⁻(aq), and H•, when interacting with biological matter. Following their creation, radiolytic species diffuse and chemically react with biological molecules such as DNA. Despite years of research, many questions about DNA damage by ionizing radiation remain, notably concerning the indirect effect, i.e., the damage resulting from reactions of the radiolytic species with DNA. To simulate DNA damage by ionizing radiation, we are developing a step-by-step radiation chemistry code based on the Green's functions of the diffusion equation (GFDE), which is able to follow the trajectories of all particles and their reactions in time. In recent years, simulations based on the GFDE have been used extensively in biochemistry, notably to simulate biochemical networks in time and space, and are often used as the "gold standard" to validate diffusion-reaction theories. The exact GFDE for partially diffusion-controlled reactions is difficult to use because of its complex form, so the much simpler radial Green's function is often used instead. Hence, much effort has been devoted to sampling the radial Green's functions, for which we have developed a sampling algorithm. This algorithm only yields the length of the inter-particle distance vector after a time step; the sampling of the deviation angle of the inter-particle vector is not taken into consideration. In this work, we show that the radial distribution is predicted by the exact radial Green's function. We also use a technique developed by Clifford et al. to generate the inter-particle vector deviation angles, knowing the inter-particle vector length before and after a time step. The results are compared with those predicted by the exact GFDE and by the analytical angular functions for free diffusion. This first step in the creation of the radiation chemistry code should help the understanding of the contribution of the indirect effect in the
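In the special case of free diffusion (no reaction), the Green's function is Gaussian and the inter-particle distance after a time step can be sampled directly; the mean squared separation grows as <r^2> = r0^2 + 6*D*dt, with D the mutual diffusion coefficient. The sketch below illustrates only this free-diffusion limit, not the exact radial Green's function sampling developed in the paper; all parameter values are arbitrary.

```python
import math, random

def step_distance(r0, D, dt, rng):
    """Sample the inter-particle distance after a time step dt for FREE
    diffusion (no reaction): the relative displacement is Gaussian with
    variance 2*D*dt per axis, D being the mutual diffusion coefficient.
    Illustrative only; the paper samples the exact radial Green's
    function, which also covers partially diffusion-controlled reactions."""
    s = math.sqrt(2.0 * D * dt)
    dx, dy = rng.gauss(0.0, s), rng.gauss(0.0, s)
    dz = r0 + rng.gauss(0.0, s)       # initial separation taken along z
    return math.sqrt(dx * dx + dy * dy + dz * dz)

rng = random.Random(42)
r0, D, dt = 1.0, 0.5, 0.1             # arbitrary units
n = 20000
mean_sq = sum(step_distance(r0, D, dt, rng) ** 2 for _ in range(n)) / n
# Analytic check for free diffusion: <r^2> = r0^2 + 6*D*dt
print(round(mean_sq, 2), round(r0**2 + 6.0 * D * dt, 2))
```

The Clifford et al. angular step mentioned in the abstract then supplies the deviation angle of the new inter-particle vector given the old and new lengths.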
Review of Fast Monte Carlo Codes for Dose Calculation in Radiation Therapy Treatment Planning
Jabbari, Keyvan
2011-01-01
An important requirement in radiation therapy is a fast and accurate treatment planning system. This system, using computed tomography (CT) data and the direction and characteristics of the beam, calculates the dose at all points of the patient's volume. The two main factors in a treatment planning system are accuracy and speed, and successive generations of treatment planning systems have been developed around them. This article is a review of Fast Monte Carlo treatment planning algorithms, which are accurate and fast at the same time. The Monte Carlo techniques are based on the transport of each individual particle (e.g., photon or electron) in the tissue. The transport of the particle is done using the physics of the interaction of the particles with matter. Other techniques transport the particles as a group. For a typical dose calculation in radiation therapy the code has to transport several million particles, which takes a few hours; the Monte Carlo techniques are therefore accurate, but slow for clinical use. In recent years, with the development of ‘fast' Monte Carlo systems, one is able to perform dose calculation in a reasonable time for clinical use. The acceptable time for dose calculation is in the range of one minute. There is currently a growing interest in fast Monte Carlo treatment planning systems, and there are many commercial treatment planning systems that perform dose calculation in radiation therapy based on the Monte Carlo technique. PMID:22606661
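The core of any Monte Carlo dose engine is sampling individual particle histories from the interaction physics. The toy sketch below samples photon free path lengths from the exponential attenuation law and tallies first interactions per depth slab; a clinical code would also transport scattered photons and secondary electrons, and all values here are invented.

```python
import math, random

def depth_dose(mu, slab, n_slabs, n_photons, rng):
    """Toy Monte Carlo depth tally: photons enter a homogeneous medium with
    attenuation coefficient mu (1/cm); each is scored at its sampled
    first-interaction depth. A real treatment-planning Monte Carlo code
    also transports scattering and secondary electrons."""
    dose = [0.0] * n_slabs
    for _ in range(n_photons):
        depth = -math.log(1.0 - rng.random()) / mu   # free path length
        k = int(depth / slab)
        if k < n_slabs:
            dose[k] += 1.0
    return dose

rng = random.Random(0)
d = depth_dose(mu=0.2, slab=1.0, n_slabs=5, n_photons=100000, rng=rng)
# Fraction interacting in the first 1-cm slab should approach 1 - exp(-0.2)
print(round(d[0] / 100000, 3), round(1.0 - math.exp(-0.2), 3))
```

The "fast" Monte Carlo systems reviewed here keep this per-history structure but cut cost with variance reduction, particle recycling, and simplified electron transport.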
NASA Technical Reports Server (NTRS)
Mashnik, S. G.; Gudima, K. K.; Sierk, A. J.; Moskalenko, I. V.
2002-01-01
Space radiation shield applications and studies of cosmic ray propagation in the Galaxy require reliable cross sections to calculate spectra of secondary particles and yields of the isotopes produced in nuclear reactions induced both by particles and nuclei at energies from threshold to hundreds of GeV per nucleon. Since the data often exist in a very limited energy range or sometimes not at all, the only way to obtain an estimate of the production cross sections is to use theoretical models and codes. Recently, we have developed improved versions of the Cascade-Exciton Model (CEM) of nuclear reactions: the codes CEM97 and CEM2k for the description of particle-nucleus reactions at energies up to about 5 GeV. In addition, we have developed a LANL version of the Quark-Gluon String Model (LAQGSM) to describe reactions induced both by particles and nuclei at energies up to hundreds of GeV/nucleon. We have tested and benchmarked the CEM and LAQGSM codes against a large variety of experimental data and have compared their results with predictions by other currently available models and codes. Our benchmarks show that the CEM and LAQGSM codes have predictive powers no worse than other currently used codes and describe many reactions better than other codes; therefore both our codes can be used as reliable event generators for space radiation shield and cosmic ray propagation applications. The CEM2k code is being incorporated into the transport code MCNPX (and several other transport codes), and we plan to incorporate LAQGSM into MCNPX in the near future. Here, we present the current status of the CEM2k and LAQGSM codes, and show results and applications to studies of cosmic ray propagation in the Galaxy.
Energy Science and Technology Software Center (ESTSC)
1982-11-18
Version 00 LPGS was developed to calculate the radiological impacts resulting from radioactive releases to the hydrosphere. The name LPGS was derived from the Liquid Pathway Generic Study for which the original code was used primarily as an analytic tool in the assessment process. The hydrosphere is represented by the following types of water bodies: estuary, small river, well, lake, and one-dimensional (1-D) river. LPGS is designed to calculate radiation dose (individual and population) to body organs as a function of time for the various exposure pathways. The radiological consequences to the aquatic biota are estimated. Several simplified radionuclide transport models are employed with built-in formulations to describe the release rate of the radionuclides. A tabulated user-supplied release model can be input, if desired. Printer plots of dose versus time for the various exposure pathways are provided.
Development and validation of a GEANT4 radiation transport code for CT dosimetry
Carver, DE; Kost, SD; Fernald, MJ; Lewis, KG; Fraser, ND; Pickens, DR; Price, RR; Stabin, MG
2014-01-01
We have created a radiation transport code using the GEANT4 Monte Carlo toolkit to simulate pediatric patients undergoing CT examinations. The focus of this paper is to validate our simulation with real-world physical dosimetry measurements using two independent techniques. Exposure measurements were made with a standard 100-mm CT pencil ionization chamber, and absorbed doses were also measured using optically stimulated luminescent (OSL) dosimeters. Measurements were made in air, a standard 16-cm acrylic head phantom, and a standard 32-cm acrylic body phantom. Physical dose measurements determined from the ionization chamber in air for 100 and 120 kVp beam energies were used to derive photon-fluence calibration factors. Both ion chamber and OSL measurement results provide useful comparisons in the validation of our Monte Carlo simulations. We found that simulated and measured CTDI values were within an overall average of 6% of each other. PMID:25706135
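For context, the CTDI quantities compared in such validations are simple functions of the pencil-chamber readings: CTDI100 is the dose integrated over the 100-mm chamber divided by the nominal beam width n*T, and CTDIw weights the centre and periphery measurements 1/3 and 2/3. The numbers below are invented for illustration and are not the paper's measurements.

```python
def ctdi_100(chamber_reading_mGy_cm, n_slices, slice_mm):
    """CTDI100 (mGy): dose integrated along the 100-mm pencil chamber
    divided by the total nominal beam width n*T (factor 10 converts the
    reading from mGy*cm to mGy*mm)."""
    return chamber_reading_mGy_cm * 10.0 / (n_slices * slice_mm)

def ctdi_w(center, periphery):
    """Weighted CTDI: 1/3 at the phantom centre + 2/3 at the periphery."""
    return center / 3.0 + 2.0 * periphery / 3.0

# Hypothetical head-phantom readings (mGy*cm) for 4 x 5 mm collimation:
c = ctdi_100(60.0, 4, 5.0)   # centre hole
p = ctdi_100(72.0, 4, 5.0)   # peripheral holes (average)
print(round(c, 1), round(p, 1), round(ctdi_w(c, p), 1))   # 30.0 36.0 34.0
```

The simulated-versus-measured comparison in the abstract is then a ratio of CTDI values computed this way from the Monte Carlo tallies and from the ionization chamber.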
An object-oriented implementation of a parallel Monte Carlo code for radiation transport
NASA Astrophysics Data System (ADS)
Santos, Pedro Duarte; Lani, Andrea
2016-05-01
This paper describes the main features of a state-of-the-art Monte Carlo solver for radiation transport which has been implemented within COOLFluiD, a world-class open source object-oriented platform for scientific simulations. The Monte Carlo code makes use of efficient ray tracing algorithms (for 2D, axisymmetric and 3D arbitrary unstructured meshes) which are described in detail. The solver accuracy is first verified in test cases for which analytical solutions are available, then validated for a space re-entry flight experiment (i.e., FIRE II) for which comparisons against both experiments and reference numerical solutions are provided. Through the flexible design of the physical models, ray tracing and parallelization strategy (fully reusing the mesh decomposition inherited from the fluid simulator), the implementation was made efficient and reusable.
NASA Astrophysics Data System (ADS)
Pandya, Tara M.; Johnson, Seth R.; Evans, Thomas M.; Davidson, Gregory G.; Hamilton, Steven P.; Godfrey, Andrew T.
2016-03-01
This work discusses the implementation, capabilities, and validation of Shift, a massively parallel Monte Carlo radiation transport package authored at Oak Ridge National Laboratory. Shift has been developed to scale well from laptops to small computing clusters to advanced supercomputers and includes features such as support for multiple geometry and physics engines, hybrid capabilities for variance reduction methods such as the Consistent Adjoint-Driven Importance Sampling methodology, advanced parallel decompositions, and tally methods optimized for scalability on supercomputing architectures. The scaling studies presented in this paper demonstrate good weak and strong scaling behavior for the implemented algorithms. Shift has also been validated and verified against various reactor physics benchmarks, including the Consortium for Advanced Simulation of Light Water Reactors' Virtual Environment for Reactor Analysis criticality test suite and several Westinghouse AP1000® problems presented in this paper. These benchmark results compare well to those from other contemporary Monte Carlo codes such as MCNP5 and KENO.
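The weak and strong scaling behaviour quoted in such studies reduces to two simple ratios: strong scaling fixes the total problem size and reports speedup per process, while weak scaling fixes the work per process and reports how flat the runtime stays. A sketch with hypothetical timings (not Shift's published numbers):

```python
def strong_scaling_efficiency(t1, tp, p):
    """Strong scaling: total problem fixed; speedup = t1 / tp,
    efficiency = speedup / p."""
    return (t1 / tp) / p

def weak_scaling_efficiency(t1, tp):
    """Weak scaling: work per process fixed; ideal runtime is flat,
    so efficiency = t1 / tp."""
    return t1 / tp

# Hypothetical timings in seconds, purely for illustration:
print(round(strong_scaling_efficiency(1000.0, 140.0, 8), 2))   # 0.89
print(round(weak_scaling_efficiency(100.0, 118.0), 2))         # 0.85
```

Monte Carlo transport weak-scales naturally because particle histories are independent; the harder part, which Shift's decompositions address, is keeping tally and geometry data within per-node memory as the problem grows.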
Development and validation of a GEANT4 radiation transport code for CT dosimetry.
Carver, D E; Kost, S D; Fernald, M J; Lewis, K G; Fraser, N D; Pickens, D R; Price, R R; Stabin, M G
2015-04-01
The authors have created a radiation transport code using the GEANT4 Monte Carlo toolkit to simulate pediatric patients undergoing CT examinations. The focus of this paper is to validate their simulation with real-world physical dosimetry measurements using two independent techniques. Exposure measurements were made with a standard 100-mm CT pencil ionization chamber, and absorbed doses were also measured using optically stimulated luminescent (OSL) dosimeters. Measurements were made in air with a standard 16-cm acrylic head phantom and with a standard 32-cm acrylic body phantom. Physical dose measurements determined from the ionization chamber in air for 100 and 120 kVp beam energies were used to derive photon-fluence calibration factors. Both ion chamber and OSL measurement results provide useful comparisons in the validation of the Monte Carlo simulations. It was found that simulated and measured CTDI values were within an overall average of 6% of each other. PMID:25706135
A study of the earth radiation budget using a 3D Monte-Carlo radiative transfer code
NASA Astrophysics Data System (ADS)
Okata, M.; Nakajima, T.; Sato, Y.; Inoue, T.; Donovan, D. P.
2013-12-01
The purpose of this study is to evaluate the earth's radiation budget when data are available from satellite-borne active sensors, i.e., cloud profiling radar (CPR) and lidar, and a multi-spectral imager (MSI), in the Earth Explorer/EarthCARE mission. For this purpose, we first developed forward and backward 3D Monte Carlo radiative transfer codes that can treat broadband solar flux calculations, including thermal infrared emission, using the k-distribution parameters of Sekiguchi and Nakajima (2008). To construct the 3D cloud field, we tried the following three methods: 1) stochastic clouds generated from a randomized optical-thickness distribution in each layer, together with regularly distributed tilted clouds; 2) numerical simulations with a non-hydrostatic model including a bin cloud microphysics model; and 3) the Minimum cloud Information Deviation Profiling Method (MIDPM), as explained later. For method 2 (the numerical modeling method), we employed numerical simulation results of Californian summer stratus clouds simulated by a non-hydrostatic atmospheric model with a bin-type cloud microphysics model based on the JMA NHM model (Iguchi et al., 2008; Sato et al., 2009, 2012), with horizontal (vertical) grid spacings of 100 m (20 m) and 300 m (20 m) in a domain of 30 km (x), 30 km (y), 1.5 km (z) and with a horizontally periodic lateral boundary condition. Two different cell systems were simulated depending on the cloud condensation nuclei (CCN) concentration. In the case of 100 m horizontal resolution, regionally averaged cloud optical thickness,
NASA Astrophysics Data System (ADS)
Hilmy, N.; Febrida, A.; Basril, A.
2007-11-01
Problems in applying International Standard (ISO) 11137 to validate the radiation sterilization dose (RSD) of tissue allografts are the limited and low numbers of uniform samples per production batch, i.e., products obtained from one donor. An allograft is a graft transplanted between two different individuals of the same species. The minimum number of uniform samples needed per production batch for the verification dose (VD) experiment at the selected sterility assurance level (SAL) according to the IAEA Code is 20, i.e., 10 for bioburden determination and the remaining 10 for the sterility test. Three methods of the IAEA Code have been used for validation of the RSD: method A1, which is a modification of method 1 of ISO 11137:1995; method B (ISO 13409:1996); and method C (AAMI TIR 27:2001). This paper describes VD experiments using uniform products obtained from one cadaver donor, i.e., cancellous bones and demineralized bone powders, and amnion grafts from one living donor. The results of the verification dose experiments show that the RSD is 15.4 kGy for cancellous and demineralized bone grafts and 19.2 kGy for amnion grafts according to method A1, and 25 kGy according to methods B and C.
Park, Jinhyoung; Lee, Jungwoo; Lau, Sien Ting; Lee, Changyang; Huang, Ying; Lien, Ching-Ling; Kirk Shung, K
2012-04-01
Acoustic radiation force impulse (ARFI) imaging has been developed as a non-invasive method for quantitative illustration of tissue stiffness or displacement. Conventional ARFI imaging (2-10 MHz) has been implemented in commercial scanners for illustrating elastic properties of several organs. The image resolution, however, is too coarse to study mechanical properties of micro-sized objects such as cells. This article thus presents a high-frequency coded excitation ARFI technique, with the ultimate goal of displaying elastic characteristics of cellular structures. Tissue mimicking phantoms and zebrafish embryos are imaged with a 100-MHz lithium niobate (LiNbO₃) transducer, by cross-correlating tracked RF echoes with the reference. The phantom results show that the contrast of ARFI image (14 dB) with coded excitation is better than that of the conventional ARFI image (9 dB). The depths of penetration are 2.6 and 2.2 mm, respectively. The stiffness data of the zebrafish demonstrate that the envelope is harder than the embryo region. The temporal displacement change at the embryo and the chorion is as large as 36 and 3.6 μm. Consequently, this high-frequency ARFI approach may serve as a remote palpation imaging tool that reveals viscoelastic properties of small biological samples. PMID:22101757
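The displacement estimate comes from cross-correlating each tracked RF echo against the reference and taking the lag that maximizes the correlation. The sketch below is a schematic integer-lag version; real ARFI processing adds windowing and sub-sample interpolation, and the signals here are invented.

```python
def xcorr_lag(ref, tracked, max_lag):
    """Estimate displacement (in samples) as the lag maximizing the
    cross-correlation between a reference RF echo and a tracked echo.
    Schematic integer-lag version of the tracking step; real ARFI
    processing adds windowing and sub-sample interpolation."""
    best_lag, best_score = 0, float("-inf")
    n = len(ref)
    for lag in range(-max_lag, max_lag + 1):
        score = sum(ref[i] * tracked[i + lag]
                    for i in range(n) if 0 <= i + lag < n)
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

# Invented echoes: the tracked echo is the reference shifted by 3 samples.
ref = [0, 0, 1, 3, 7, 3, 1, 0, 0, 0, 0, 0, 0]
trk = [0, 0, 0, 0, 0, 1, 3, 7, 3, 1, 0, 0, 0]
print(xcorr_lag(ref, trk, max_lag=5))   # 3
```

At 100 MHz the RF sampling interval corresponds to sub-micrometer tissue motion, which is what makes micrometer-scale displacement maps of an embryo feasible.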
Breast Lesions Evaluated by Color-Coded Acoustic Radiation Force Impulse (ARFI) Imaging.
Zhou, JianQiao; Yang, ZhiFang; Zhan, WeiWei; Zhang, JingWen; Hu, Na; Dong, YiJie; Wang, YingYing
2016-07-01
The goal of our study was to investigate the value of color-coded Virtual Touch tissue imaging (VTI) using acoustic radiation force impulse (ARFI) technology in the characterization of breast lesions and to compare it with conventional ultrasound (US). Conventional US and color-coded VTI were performed in 196 solid breast lesions in 196 consecutive women (age range 17-91 y; mean 48.17 ± 14.46 y). A four-point scale VTI score was assigned for each lesion according to the color pattern both in the lesion and in the surrounding breast tissue. The mean VTI score was significantly higher for malignant lesions (3.80 ± 0.66, range 1-4) than for benign ones (2.02 ± 1.20, range 1-4) (p < 0.001), and the optimal cut-off value was between score 3 and score 4. The area under the receiver operating characteristic (ROC) curve for combined conventional US and VTI (0.945) was significantly higher than that for conventional US (0.902) and for VTI (0.871) (p = 0.0021 and p < 0.001, respectively). It was concluded that color-coded VTI with the proposed four-point scale score system combined with conventional US might have the potential to aid in the characterization of benign and malignant breast lesions. PMID:27131841
Ballarini, Francesca; Altieri, Saverio; Bortolussi, Silva; Carante, Mario; Giroletti, Elio; Protti, Nicoletta
2014-08-01
This paper presents a biophysical model of radiation-induced cell death, implemented as a Monte Carlo code called BIophysical ANalysis of Cell death and chromosome Aberrations (BIANCA), based on the assumption that some chromosome aberrations (dicentrics, rings, and large deletions, called "lethal aberrations") lead to clonogenic inactivation. In turn, chromosome aberrations are assumed to derive from clustered, and thus severe, DNA lesions (called "cluster lesions," or CL) interacting at the micrometer scale; the CL yield and the threshold distance governing CL interaction are the only model parameters. After a pilot study on V79 hamster cells exposed to protons and carbon ions, in the present work the model was extended and applied to AG1522 human cells exposed to photons, He ions, and heavier ions including carbon and neon. The agreement with experimental survival data taken from the literature supported the assumptions. In particular, the inactivation of AG1522 cells was explained by lethal aberrations not only for X-rays, as already reported by others, but also for the aforementioned radiation types. Furthermore, the results are consistent with the hypothesis that the critical initial lesions leading to cell death are DNA cluster lesions with yields on the order of ~2 CL Gy⁻¹ cell⁻¹ at low LET and ~20 CL Gy⁻¹ cell⁻¹ at high LET, and that the processing of these lesions is modulated by proximity effects at the micrometer scale related to interphase chromatin organization. The model was then applied to calculate the fraction of inactivated cells, as well as the yields of lethal aberrations and cluster lesions, as a function of LET; the results showed a maximum around 130 keV/μm, and this maximum was much higher for cluster lesions and lethal aberrations than for cell inactivation. PMID:24659413
Kostin, Mikhail; Mokhov, Nikolai; Niita, Koji
2013-09-25
A parallel computing framework has been developed for use with general-purpose radiation transport codes. The framework was implemented as a C++ module that uses MPI for message passing. It is intended to be used with older radiation transport codes implemented in Fortran 77, Fortran 90, or C. The module is largely independent of the radiation transport code it is paired with, and is connected to the codes by means of a number of interface functions. The framework was developed and tested in conjunction with the MARS15 code. It can be used with other codes such as PHITS, FLUKA, and MCNP after certain adjustments. Besides the parallel computing functionality, the framework offers a checkpoint facility that allows restarting calculations from a saved checkpoint file. The checkpoint facility can be used in single-process calculations as well as in the parallel regime. The framework corrects some of the known problems with scheduling and load balancing found in the original implementations of the parallel computing functionality in MARS15 and PHITS. The framework can be used efficiently on homogeneous systems and on networks of workstations, where interference from other users is possible.
MOCRA: a Monte Carlo code for the simulation of radiative transfer in the atmosphere.
Premuda, Margherita; Palazzi, Elisa; Ravegnani, Fabrizio; Bortoli, Daniele; Masieri, Samuele; Giovanelli, Giorgio
2012-03-26
This paper describes the radiative transfer model (RTM) MOCRA (MOnte Carlo Radiance Analysis), developed in the frame of DOAS (Differential Optical Absorption Spectroscopy) to correctly interpret remote sensing measurements of trace gas amounts in the atmosphere through the calculation of the Air Mass Factor. Besides the DOAS-related quantities, the MOCRA code yields: (1) the atmospheric transmittance in the vertical and sun directions; (2) the direct and global irradiance; (3) the single- and multiple-scattered radiance for a detector with assigned position, line of sight, and field of view. Sample calculations of the main radiometric quantities calculated with MOCRA are presented and compared with the output of another RTM (MODTRAN4). A further comparison is presented between the NO2 slant column densities (SCDs) measured with DOAS at Evora (Portugal) and the ones simulated with MOCRA. Both comparisons (MOCRA-MODTRAN4 and MOCRA-observations) gave more than satisfactory results, and overall make MOCRA a versatile tool for atmospheric radiative transfer simulations and interpretation of remote sensing measurements. PMID:22453470
Odyssey: A Public GPU-based Code for General Relativistic Radiative Transfer in Kerr Spacetime
NASA Astrophysics Data System (ADS)
Pu, Hung-Yi; Yun, Kiyun; Younsi, Ziri; Yoon, Suk-Jin
2016-04-01
General relativistic radiative transfer calculations coupled with the calculation of geodesics in the Kerr spacetime are an essential tool for determining the images, spectra, and light curves from matter in the vicinity of black holes. Such studies are especially important for ongoing and upcoming millimeter/submillimeter very long baseline interferometry observations of the supermassive black holes at the center of our Galaxy (Sgr A*) and in M87. To this end we introduce Odyssey, a graphics processing unit (GPU) based code for ray tracing and radiative transfer in the Kerr spacetime. On a single GPU, the performance of Odyssey can exceed 1 ns per photon per Runge-Kutta integration step. Odyssey is publicly available, fast, accurate, and flexible enough to be modified to suit the specific needs of new users. We also present an educational software tool, Odyssey_Edu, with a graphical user interface powered by a video-accelerated display architecture, for showing in real time how null geodesics around a Kerr black hole vary as a function of black hole spin and angle of incidence onto the black hole.
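The "per photon per Runge-Kutta integration step" figure quoted above refers to one classical fourth-order step of an ODE integrator. The sketch below shows such a step for a generic scalar ODE, not the actual geodesic equations Odyssey integrates (those involve the Kerr metric and several coupled components).

```python
def rk4_step(f, t, y, h):
    """One classical fourth-order Runge-Kutta step for y' = f(t, y) —
    the kind of step a ray tracer performs per photon when advancing a
    geodesic (shown here for a generic scalar ODE)."""
    k1 = f(t, y)
    k2 = f(t + 0.5 * h, y + 0.5 * h * k1)
    k3 = f(t + 0.5 * h, y + 0.5 * h * k2)
    k4 = f(t + h, y + h * k3)
    return y + (h / 6.0) * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

# Integrate y' = y from y(0) = 1 to t = 1; the exact answer is e.
y, t, h = 1.0, 0.0, 0.01
for _ in range(100):
    y = rk4_step(lambda t, y: y, t, y, h)
    t += h
```

The global error of the fourth-order method scales as h^4, which is why a modest step count already reproduces e to about ten digits here.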
European Code against Cancer 4th Edition: Ultraviolet radiation and cancer.
Greinert, Rüdiger; de Vries, Esther; Erdmann, Friederike; Espina, Carolina; Auvinen, Anssi; Kesminiene, Ausrele; Schüz, Joachim
2015-12-01
Ultraviolet radiation (UVR) is part of the electromagnetic spectrum emitted naturally from the sun or from artificial sources such as tanning devices. Acute skin reactions induced by UVR exposure are erythema (skin reddening), or sunburn, and the acquisition of a suntan triggered by UVR-induced DNA damage. UVR exposure is the main cause of skin cancer, including cutaneous malignant melanoma, basal-cell carcinoma, and squamous-cell carcinoma. Skin cancer is the most common cancer in fair-skinned populations, and its incidence has increased steeply over recent decades. According to estimates for 2012, about 100,000 new cases of cutaneous melanoma and about 22,000 deaths from it occurred in Europe. The main mechanisms by which UVR causes cancer are well understood. Exposure during childhood appears to be particularly harmful. Exposure to UVR is a risk factor modifiable by individuals' behaviour. Excessive exposure from natural sources can be avoided by seeking shade when the sun is strongest, by wearing appropriate clothing, and by appropriately applying sunscreens if direct sunlight is unavoidable. Exposure from artificial sources can be completely avoided by not using sunbeds. Beneficial effects of sun or UVR exposure, such as for vitamin D production, can be fully achieved while still avoiding too much sun exposure and the use of sunbeds. Taking all the scientific evidence together, the recommendation of the 4th edition of the European Code Against Cancer for ultraviolet radiation is: "Avoid too much sun, especially for children. Use sun protection. Do not use sunbeds." PMID:26096748
Michalsky, J.; Harrison, L.
1992-03-17
Two tasks are included in the second year of this project. One task continues the collection of high quality data sets for the testing of radiation codes within climate models. The other task involves the development of accurate spectral instruments for the measurement of shortwave radiation. A third task was completed in the second half of the first year of the project and will be briefly summarized.
NASA Astrophysics Data System (ADS)
Porter, Jamie A.; Townsend, Lawrence W.; Spence, Harlan; Golightly, Michael; Schwadron, Nathan; Kasper, Justin; Case, Anthony W.; Blake, John B.; Zeitlin, Cary
2014-06-01
The Cosmic Ray Telescope for the Effects of Radiation (CRaTER), an instrument carried on the Lunar Reconnaissance Orbiter spacecraft, directly measures the energy depositions by solar and galactic cosmic radiation in its silicon wafer detectors. These energy depositions are converted to linear energy transfer (LET) spectra. High-LET particles, which are mainly high-energy heavy ions found in the incident cosmic ray spectrum, or target fragments and recoils produced by protons and heavier ions, are of particular importance because of their potential to cause significant damage to human tissue and electronic components. Aside from providing LET data useful for space radiation risk analyses for lunar missions, the observed LET spectra can also be used to help validate space radiation transport codes used for shielding design and risk assessment applications, which is a major thrust of this work. Here the Monte Carlo transport code HETC-HEDS (High-Energy Transport Code-Human Exploration and Development in Space) is used to estimate LET contributions from the incident primary ions and their charged secondaries produced by nuclear collisions as they pass through the three pairs of silicon detectors. The contributions to the LET of the primary ions and their charged secondaries are also analyzed and compared with estimates obtained using the deterministic space radiation code HZETRN 2010, developed at NASA Langley Research Center. LET estimates obtained from the two transport codes are compared with measurements of LET from the CRaTER instrument during the mission. Overall, the LET predictions of the HETC-HEDS code agree well with those of the HZETRN code. The code predictions are also in good agreement with the CRaTER LET measurements above 15 keV/µm but differ from the measurements at smaller values of LET. A possible reason for this disagreement between measured and calculated spectra below 15 keV/µm is an
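The conversion from per-event energy deposition to an LET spectrum, as described in this abstract, is conceptually simple. The sketch below uses invented numbers throughout: the detector thickness, energy deposits, and thresholds are illustrative, not CRaTER calibration data, and real analyses correct for the actual path length through the detector.

```python
# Energy depositions (MeV) recorded in a silicon detector of known thickness
# are converted to LET values and binned into an integral spectrum.
DETECTOR_THICKNESS_UM = 140.0          # assumed silicon thickness, micrometres

def let_kev_per_um(edep_mev, path_um=DETECTOR_THICKNESS_UM):
    """LET = energy deposited / path length traversed, in keV/um."""
    return edep_mev * 1000.0 / path_um

depositions_mev = [0.05, 0.21, 2.8, 8.0, 33.6]   # per-event deposits (made up)
lets = [let_kev_per_um(e) for e in depositions_mev]

# Crude integral LET spectrum: number of events at or above each threshold.
thresholds = [0.1, 1.0, 10.0, 100.0]             # keV/um
spectrum = {th: sum(1 for l in lets if l >= th) for th in thresholds}
```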
NASA Astrophysics Data System (ADS)
Takabe, Hideaki
A brief review is given of the physics of radiation transport, a topic that is important in the study of astrophysics, laser-plasmas, divertor-plasmas, etc. In general, we must solve non-local thermodynamic equilibrium processes using an appropriate atomic model. The resultant data related to the spectral emissivity and opacity of partially ionized plasmas are then used to solve the radiation transfer equation. In this note, I briefly overview a variety of ways to carry out such a calculation. In addition, similarities and differences in the physical process between laser-plasmas and divertor-plasmas are briefly described.
Development of a Space Radiation Monte-Carlo Computer Simulation Based on the FLUKA and ROOT Codes
NASA Technical Reports Server (NTRS)
Pinsky, L. S.; Wilson, T. L.; Ferrari, A.; Sala, Paola; Carminati, F.; Brun, R.
2001-01-01
The radiation environment in space is a complex problem to model. Trying to extrapolate the projections of that environment into all areas of the internal spacecraft geometry is even more daunting. With the support of our CERN colleagues, our research group in Houston is embarking on a project to develop a radiation transport tool that is tailored to the problem of taking the external radiation flux incident on any particular spacecraft and simulating the evolution of that flux through a geometrically accurate model of the spacecraft material. The output will be a prediction of the detailed nature of the resulting internal radiation environment within the spacecraft as well as its secondary albedo. Beyond doing the physics transport of the incident flux, the software tool we are developing will provide a self-contained, stand-alone, object-oriented analysis and visualization infrastructure. It will also include a graphical user interface and a set of input tools to facilitate the simulation of space missions in terms of nominal radiation models and mission trajectory profiles. The goal of this project is to produce a code that is considerably more accurate and user-friendly than existing Monte-Carlo-based tools for the evaluation of the space radiation environment. Furthermore, the code will be an essential complement to the currently existing analytic codes in the BRYNTRN/HZETRN family for the evaluation of radiation shielding. The code will be directly applicable to the simulation of environments in low Earth orbit, on the lunar surface, on planetary surfaces (including the Earth), and in the interplanetary medium, such as on a transit to Mars (and even in the interstellar medium). The software will include modules whose underlying physics base can continue to be enhanced and updated as future data become available beyond the timeframe of the initial development now foreseen. This future maintenance will be available from the authors of FLUKA as
NASA Technical Reports Server (NTRS)
Armstrong, T. W.
1972-01-01
Several Monte Carlo radiation transport computer codes are used to predict quantities of interest in the fields of radiotherapy and radiobiology. The calculational methods are described, and comparisons of calculated and experimental results are presented for dose distributions produced by protons, neutrons, and negatively charged pions. Comparisons of calculated and experimental cell survival probabilities are also presented.
Modeling the physical structure of star-forming regions with LIME, a 3D radiative transfer code
NASA Astrophysics Data System (ADS)
Quénard, D.; Bottinelli, S.; Caux, E.
2016-05-01
The ability to predict line emission is crucial in order to make comparisons with observations. From LTE to full radiative transfer codes, the goal is always to derive the physical properties of the source as accurately as possible. Non-LTE calculations can be very time consuming but are needed in most cases, since many of the studied regions are far from LTE.
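The LTE-versus-non-LTE distinction mentioned above can be illustrated with the simplest possible case: a two-level atom whose upper level is populated by collisions and depopulated by collisions plus spontaneous decay (the radiation field is neglected here, and all rates and densities are invented for the sketch).

```python
import math

def level_ratio(n_collider, k_ul, a_ul, g_u, g_l, delta_e_over_kt):
    """Two-level statistical equilibrium: n_u/n_l = C_lu / (C_ul + A_ul),
    with C = n_collider * k and detailed balance
    C_lu = C_ul * (g_u/g_l) * exp(-dE/kT). Radiation field neglected."""
    c_ul = n_collider * k_ul
    c_lu = c_ul * (g_u / g_l) * math.exp(-delta_e_over_kt)
    return c_lu / (c_ul + a_ul)

# LTE (Boltzmann) limit for g_u/g_l = 3 and dE = kT:
boltzmann = 3.0 * math.exp(-1.0)

# Illustrative rates: k_ul = 1e-10 cm^3/s, A_ul = 1e-4 1/s.
dense = level_ratio(1e12, 1e-10, 1e-4, 3.0, 1.0, 1.0)   # collisions dominate
dilute = level_ratio(1e2, 1e-10, 1e-4, 3.0, 1.0, 1.0)   # decay dominates
```

At high collider density the ratio recovers the Boltzmann value (LTE); at low density spontaneous decay drains the upper level and the populations depart strongly from LTE, which is why codes like LIME must solve the statistical equilibrium explicitly.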
Shapiro, A.B.
1983-08-01
The computer code FACET calculates the radiation geometric view factor (alternatively called shape factor, angle factor, or configuration factor) between surfaces for axisymmetric, two-dimensional planar and three-dimensional geometries with interposed third surface obstructions. FACET was developed to calculate view factors for input to finite-element heat-transfer analysis codes. The first section of this report is a brief review of previous radiation-view-factor computer codes. The second section presents the defining integral equation for the geometric view factor between two surfaces and the assumptions made in its derivation. Also in this section are the numerical algorithms used to integrate this equation for the various geometries. The third section presents the algorithms used to detect self-shadowing and third-surface shadowing between the two surfaces for which a view factor is being calculated. The fourth section provides a user's input guide followed by several example problems.
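The geometric view factor that FACET computes by numerical integration can also be estimated by Monte Carlo ray casting, which gives a compact illustration of the quantity itself. The sketch below is not FACET's algorithm: it handles one fixed geometry (two directly opposed unit squares, no obstructions) with an invented sample count and seed.

```python
import math
import random

def view_factor_mc(n, h=1.0, seed=1):
    """Monte Carlo view factor from a unit square to an identical, directly
    opposed unit square a distance h away. Points are uniform on surface 1;
    directions are cosine-weighted, matching the diffuse (Lambertian)
    assumption behind the view-factor integral."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        x0, y0 = rng.random(), rng.random()      # emission point on square 1
        # Cosine-weighted hemisphere direction: sin^2(theta) uniform in [0,1).
        phi = 2.0 * math.pi * rng.random()
        sin_t = math.sqrt(rng.random())
        cos_t = math.sqrt(1.0 - sin_t * sin_t)
        # Follow the ray to the plane z = h containing square 2.
        t = h / cos_t
        x1 = x0 + t * sin_t * math.cos(phi)
        y1 = y0 + t * sin_t * math.sin(phi)
        if 0.0 <= x1 <= 1.0 and 0.0 <= y1 <= 1.0:
            hits += 1
    return hits / n

f12 = view_factor_mc(200_000)   # analytic value for this geometry: ~0.1998
```

The fraction of cosine-weighted rays that land on the second surface is exactly the view factor, so the estimate converges to the tabulated analytic value (about 0.1998 for equal unit squares at unit separation).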
The Havemann-Taylor Fast Radiative Transfer Code (HT-FRTC) and its applications
NASA Astrophysics Data System (ADS)
Thelen, Jean-Claude; Havemann, Stephan; Lewis, Warren
2015-09-01
The Havemann-Taylor Fast Radiative Transfer Code (HT-FRTC) is a component of the Met Office NEON Tactical Decision Aid (TDA). Within NEON, the HT-FRTC has for a number of years been used to predict the IR apparent thermal contrasts between different surface types as observed by an airborne sensor. To achieve this, the HT-FRTC is supplied with the inherent temperatures and spectral properties of these surfaces (i.e. ground target(s) and background). A key strength of the HT-FRTC is its ability to take into account the detailed properties of the atmosphere, which in the context of NEON tend to be provided by a Numerical Weather Prediction (NWP) forecast model. While water vapour and ozone are generally the most important gases, additional trace gases are now being incorporated into the HT-FRTC. The HT-FRTC also includes an exact treatment of atmospheric scattering based on spherical harmonics. This allows the treatment of several different aerosol species and of liquid and ice clouds. Recent developments can even account for rain and falling snow. The HT-FRTC works in Principal Component (PC) space and is trained on a wide variety of atmospheric and surface conditions, which significantly reduces the computational requirements regarding memory and time. One clear-sky simulation takes approximately one millisecond. Recent developments allow the training to be completely general and sensor independent. This is significant as the user of the code can add new sensors and new surfaces/targets by simply supplying extra files which contain their (possibly classified) spectral properties. The HT-FRTC has been extended to cover the spectral range of Photopic and NVG sensors. One aim here is to give guidance on the expected, directionally resolved sky brightness, especially at night, again taking the actual or forecast atmospheric conditions into account. Recent developments include light level predictions during the period of twilight.
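The "working in principal-component space" idea behind the HT-FRTC's speed can be illustrated with a toy compression example: represent each spectrum by a single PC score instead of all channels. This is only a caricature (the synthetic spectra below are mean shape plus one mode, so one component is exact; the real code predicts PC scores for arbitrary atmospheres), and all numbers are invented.

```python
import math

# Synthetic "spectra": a mean shape plus one variability mode.
n_chan = 64
mean = [1.0 + 0.5 * math.sin(0.1 * j) for j in range(n_chan)]
mode = [math.exp(-((j - 32) / 10.0) ** 2) for j in range(n_chan)]
coeffs = [-1.0, -0.3, 0.4, 1.2]
spectra = [[m + c * p for m, p in zip(mean, mode)] for c in coeffs]

# Centre the data and find the leading principal component by power iteration.
centre = [sum(s[j] for s in spectra) / len(spectra) for j in range(n_chan)]
dev = [[s[j] - centre[j] for j in range(n_chan)] for s in spectra]
v = [1.0] * n_chan
for _ in range(50):
    # Apply the (implicit) covariance matrix: C v = sum_i dev_i (dev_i . v).
    w = [0.0] * n_chan
    for d in dev:
        proj = sum(dj * vj for dj, vj in zip(d, v))
        for j in range(n_chan):
            w[j] += proj * d[j]
    norm = math.sqrt(sum(x * x for x in w))
    v = [x / norm for x in w]

# Compress each spectrum to one scalar score, then reconstruct.
recon_err = 0.0
for s, d in zip(spectra, dev):
    score = sum(dj * vj for dj, vj in zip(d, v))   # PC-space representation
    recon = [c + score * vj for c, vj in zip(centre, v)]
    recon_err = max(recon_err, max(abs(a - b) for a, b in zip(s, recon)))
```

Because the synthetic data have one mode of variability, a single score per spectrum reconstructs every channel essentially exactly; real spectra need a handful of components, which is still far cheaper than carrying thousands of monochromatic channels.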
Pymiedap: a versatile radiative transfer code with polarization for terrestrial (exo)planets.
NASA Astrophysics Data System (ADS)
Rossi, Loïc; Stam, Daphne; Hogenboom, Michael
2016-04-01
Polarimetry promises to be an important method to detect exoplanets: the light of a star is usually unpolarized [1], while scattering by gas and clouds in an atmosphere can generate high levels of polarization. Furthermore, the polarization of scattered light contains information about the properties of the atmosphere and surface of a planet, allowing a possible characterization [2], a method already validated in the solar system with Venus [3,4]. We present here Pymiedap (Python Mie Doubling-Adding Program): a set of Python objects interfaced with Fortran radiative transfer codes that allows the user to define a planetary atmosphere and compute the flux and polarization of the scattered light. Several properties of the planet can be set interactively by the user through the Python interface, such as gravity, distance to the star, surface properties, atmospheric layers, and gaseous and aerosol composition. The radiative transfer calculations are then performed following the doubling-adding method (de Haan et al. 1987). We present some results of the code and show its possible use for different planetary atmospheres for both resolved and disk-integrated measurements. We investigate the effect of gas, cloud, and aerosol composition and of surface properties for horizontally homogeneous and inhomogeneous Earth-like planets. We also study the effect of gaseous absorption on the flux and polarization as a marker for gaseous abundance and cloud-top altitude. [1] Kemp et al., The optical polarization of the sun measured at a sensitivity of parts in ten million, Nature, 1987, 326, 270-273. [2] Stam, D. M., Spectropolarimetric signatures of Earth-like extrasolar planets, A&A, 2008, 482, 989-1007. [3] Hansen, J. E. & Hovenier, J. W., Interpretation of the polarization of Venus, Journal of Atmospheric Sciences, 1974, 31, 1137-1160. [4] Rossi et al., Preliminary study of Venus cloud layers
Dipp, T.M.
1993-12-01
The generation of radiation via photoelectrons induced off a conducting surface was explored using particle-in-cell (PIC) computer simulations. Using the MAGIC PIC code, the simulations were performed in one dimension to handle the diverse scale lengths of the particles and fields in the problem. The simulations involved monoenergetic, nonrelativistic photoelectrons emitted normal to the illuminated conducting surface. A sinusoidal, 100% modulated, 6.3263-ns pulse train, as well as unmodulated emission, was used to explore the behavior of the particles, fields, and generated radiation. A special postprocessor was written to convert the PIC-simulated electron sheath into far-field radiation parameters by means of rigorous retarded-time calculations. The results of the small-spot PIC simulations were used to generate various graphs showing resonance and nonresonance radiation quantities such as radiated lobe patterns, frequency, and power. A database of PIC simulation results was created and, using a nonlinear curve-fitting program, compared with theoretical scaling laws. Overall, the small-spot behavior predicted by the theoretical scaling laws was generally observed in the PIC simulation data, providing confidence in both the scaling laws and the PIC simulations.
Takahashi, F; Shigemori, Y; Seki, A
2009-01-01
A system has been developed to assess the radiation dose distribution inside the body of persons exposed in a radiological accident by utilising the radiation transport calculation codes MCNP and MCNPX. The system consists mainly of two parts: a pre-processor and a post-processor for the radiation transport calculation. Programs in the pre-processor are used to set up a 'problem-dependent' input file, which defines the accident condition and the dosimetric quantities to be estimated. The program developed for the post-processor part can effectively present dose information based upon the output file of the code. All of the programs in the dosimetry system can be executed on an ordinary personal computer and accurately give the dose profile of an exposed person in a radiological accident without complicated procedures. An experiment using a physical phantom was carried out to verify the performance of the dosimetry system with the developed programs in a gamma-ray irradiation field. PMID:19181661
Coronal extension of the MURaM radiative MHD code: From quiet sun to flare simulations
NASA Astrophysics Data System (ADS)
Rempel, Matthias D.; Cheung, Mark
2016-05-01
We present a new version of the MURaM radiative MHD code, which includes a treatment of the solar corona in terms of MHD, optically thin radiative losses, and field-aligned heat conduction. In order to relax the severe time-step constraints imposed by large Alfven velocities and heat conduction, we use a combination of semi-relativistic MHD with a reduced speed of light ("Boris correction") and a hyperbolic formulation of heat conduction. We apply the numerical scheme to four different setups, including a mixed-polarity quiet sun, an open-flux region, an arcade solution, and an active region, and find in all cases an amount of coronal heating sufficient to maintain a corona with temperatures from 1 MK (quiet sun) to 2 MK (active region, arcade). In all our setups the Poynting flux is self-consistently created by photospheric and sub-photospheric magneto-convection in the lower part of the simulation domain. Varying the maximum allowed Alfven velocity (the "reduced speed of light") leads to only minor changes in the coronal structure as long as the limited Alfven velocity remains larger than the sound speed and about 1.5-3 times larger than the peak advection velocity. We also found that varying details of the numerical diffusivities that govern the resistive and viscous energy dissipation does not strongly affect the overall coronal heating, but the ratio of resistive to viscous energy dissipation depends strongly on the effective numerical magnetic Prandtl number. We use our active region setup to simulate a flare triggered by the emergence of a twisted flux rope into a pre-existing bipolar active region. The simulation yields a series of flares, the strongest reaching GOES M1 class, and reproduces many observed properties of eruptions such as flare ribbons, post-flare loops, and a sunquake.
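The hyperbolic formulation of heat conduction mentioned above replaces the instantaneous flux q = -kappa dT/dx by a flux that relaxes toward that value on a timescale tau, so signals propagate at c = sqrt(kappa/tau) and an explicit step scales as dx/c rather than the parabolic dx^2/kappa. A minimal 1D sketch, with periodic boundaries and invented parameters (not MURaM's scheme or units):

```python
import math

n, dx = 100, 1.0
kappa, tau = 50.0, 2.0
c = math.sqrt(kappa / tau)        # signal speed of the hyperbolic system
dt = 0.4 * dx / c                 # hyperbolic CFL step, linear in dx

T = [math.exp(-(((i - 50) * dx) / 5.0) ** 2) for i in range(n)]  # hot spot
q = [0.0] * n                     # flux on the face between T[i] and T[i+1]
total0 = sum(T)

for _ in range(400):
    for i in range(n):
        grad = (T[(i + 1) % n] - T[i]) / dx
        # Backward-Euler relaxation of q toward -kappa*grad (stable even
        # when tau is stiff compared to dt).
        q[i] = (q[i] - (dt / tau) * kappa * grad) / (1.0 + dt / tau)
    for i in range(n):
        # Conservative flux divergence; q[-1] wraps around (periodic).
        T[i] -= dt * (q[i] - q[i - 1]) / dx

energy_drift = abs(sum(T) - total0)
peak = max(T)
```

Because the update is in conservation form, the total "energy" is preserved to rounding error while the hot spot spreads out, and the step count needed scales with dx instead of dx^2.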
C5 Benchmark Problem with Discrete Ordinate Radiation Transport Code DENOVO
Yesilyurt, Gokhan; Clarno, Kevin T; Evans, Thomas M; Davidson, Gregory G; Fox, Patricia B
2011-01-01
The C5 benchmark problem proposed by the Organisation for Economic Co-operation and Development/Nuclear Energy Agency was modeled to examine the capabilities of Denovo, a three-dimensional (3-D) parallel discrete ordinates (S_N) radiation transport code, for problems with no spatial homogenization. Denovo uses state-of-the-art numerical methods to obtain accurate solutions to the Boltzmann transport equation. Problems were run in parallel on Jaguar, a high-performance supercomputer located at Oak Ridge National Laboratory. Both the two-dimensional (2-D) and 3-D configurations were analyzed, and the results were compared with the reference MCNP Monte Carlo calculations. For an additional comparison, SCALE/KENO-V.a Monte Carlo solutions were also included. In addition, a sensitivity analysis was performed for the optimal angular quadrature and mesh resolution for both the 2-D and 3-D infinite lattices of UO2 fuel pin cells. Denovo was verified with the C5 problem. The effective multiplication factors, pin powers, and assembly powers were found to be in good agreement with the reference MCNP and SCALE/KENO-V.a Monte Carlo calculations.
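The discrete ordinates (S_N) method that Denovo implements in 3-D can be illustrated in its simplest 1-D slab form: sweep the angular flux cell by cell along each quadrature direction, then update the isotropic scattering source from the scalar flux and iterate. The sketch below uses S_4 Gauss-Legendre quadrature, diamond differencing, vacuum boundaries, and invented cross sections; it is a pedagogical solver, not Denovo's algorithm.

```python
# Gauss-Legendre S4 quadrature on [-1, 1]; weights sum to 2.
MU = [-0.8611363116, -0.3399810436, 0.3399810436, 0.8611363116]
W = [0.3478548451, 0.6521451549, 0.6521451549, 0.3478548451]

nx, L = 200, 20.0
dx = L / nx
sig_t, sig_s, q = 1.0, 0.5, 1.0     # total, scattering, uniform source (made up)

phi = [0.0] * nx
for _ in range(200):                 # source iteration
    phi_new = [0.0] * nx
    for mu, w in zip(MU, W):
        cells = range(nx) if mu > 0 else range(nx - 1, -1, -1)
        psi_in = 0.0                 # vacuum boundary: no incoming flux
        for i in cells:
            s = 0.5 * (sig_s * phi[i] + q)   # isotropic source per unit mu
            a = abs(mu) / dx
            # Diamond difference: mu (out-in)/dx + sig_t*(in+out)/2 = s.
            psi_out = ((a - 0.5 * sig_t) * psi_in + s) / (a + 0.5 * sig_t)
            phi_new[i] += w * 0.5 * (psi_in + psi_out)   # cell-average flux
            psi_in = psi_out
    if max(abs(p - pn) for p, pn in zip(phi, phi_new)) < 1e-8:
        phi = phi_new
        break
    phi = phi_new

center_flux = phi[nx // 2]   # deep inside, approaches q/(sig_t - sig_s) = 2
```

Far from the vacuum boundaries the scalar flux approaches the infinite-medium value q/(sig_t - sig_s), a standard sanity check for this kind of solver.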
XTAT: A New Multilevel-Multiline Polarized Radiative Transfer Code with PRD
NASA Astrophysics Data System (ADS)
Bommier, V.
2014-10-01
This work is aimed at the interpretation of the so-called "Second Solar Spectrum" (Stenflo 1996), the spectrum of the linear polarization formed by scattering and observed close to the solar limb. The lines are also optically thick, and the problem is to solve, in a coherent manner, the statistical equilibrium of the atomic density matrix and the polarized radiative transfer in the atmosphere. Following Belluzzi & Landi Degl'Innocenti (2009), 30% of the solar visible line linear polarization profiles display the M-type shape typical of coherent scattering effects in the far wings. A new theory including both coherent (Rayleigh) and resonant scattering was developed by Bommier (1997a,b); Raman scattering was later added (Bommier 1999, SPW2). In this theory, which is derived directly from the Schrödinger equation for the atomic density matrix, the radiative line broadening appears as a non-Markovian process of atom-photon interaction. Collisional broadening is included. Rayleigh (Raman) scattering appears as an additional term in the emissivity from the fourth order of the atom-photon interaction perturbation expansion. The expansion is pursued and finally summed, leading to a non-perturbative final result. In this formalism, the use of redistribution functions is avoided. The published formalism was limited to the two-level atom without lower-level alignment, but most solar lines are more complex. We will present how the theory has to be extended for multi-level atom modeling, including lower-level alignment. The role of collisions in balancing coherent and resonant scattering is fully taken into account. A progress report will be given on the development of a new code for the numerical iterative solution of the statistical equilibrium and polarized radiative transfer equations, for multi-level atoms and their multi-line spectrum. Fine and hyperfine structures, and Hanle, Kemp (Kemp et al. 1984), Zeeman
Improvements of the Radiation Code "MstrnX" in AORI/NIES/JAMSTEC Models
NASA Astrophysics Data System (ADS)
Sekiguchi, M.; Suzuki, K.; Takemura, T.; Watanabe, M.; Ogura, T.
2015-12-01
There is a large demand for an accurate yet rapid radiative transfer scheme for general climate models. The broadband radiative transfer code "mstrnX" was developed by the Atmosphere and Ocean Research Institute (AORI) and has been implemented in several global and regional climate models developed cooperatively in the Japanese research community, for example MIROC (the Model for Interdisciplinary Research on Climate) [Watanabe et al., 2010], NICAM (Non-hydrostatic Icosahedral Atmospheric Model) [Satoh et al., 2008], and CReSS (Cloud Resolving Storm Simulator) [Tsuboki and Sakakibara, 2002]. In this study, we improve the gas absorption process and the scattering process of ice particles. For the gas absorption update, the absorption line database is replaced by the latest version from the Harvard-Smithsonian Center for Astrophysics, HITRAN2012. An optimization method is adopted in mstrnX to decrease the number of integration points for the wavenumber integration using the correlated k-distribution method and to increase the computational efficiency in each band. The integration points and weights of the correlated k-distribution are optimized for accurate calculation of the heating rate up to an altitude of 70 km. For this purpose we adopted a new non-linear optimization method for the correlated k-distribution and studied an optimal initial condition and cost function for the non-linear optimization. It is known that mstrnX has a considerable bias in the case of quadrupled carbon dioxide concentrations [Pincus et al., 2015]; this bias is decreased by the improvement. For the update of the ice-particle scattering process, we adopt a solid column as the ice crystal habit [Yang et al., 2013]. The single-scattering properties are calculated and tabulated in advance. The size parameter in this table previously ranged from 0.1 to 1000 in mstrnX; we expand the maximum to 50000 in order to handle large particles, like fog and rain drops. These updates will be introduced to
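The correlated k-distribution idea referred to above rests on the fact that band-mean transmittance depends only on the *distribution* of absorption coefficients within the band, not on their spectral arrangement, so a few quadrature points in cumulative-probability (g) space can replace thousands of spectral points. A self-contained sketch with a synthetic "line-by-line" spectrum (three invented Lorentzian lines; not mstrnX's optimized quadrature):

```python
import math

# Synthetic line-by-line absorption spectrum across one band.
nv = 4000
k_lbl = []
for i in range(nv):
    v = i / nv * 10.0                    # wavenumber across the band (arb. units)
    k = 0.01                             # weak continuum
    for v0 in (2.0, 4.5, 7.0):           # three Lorentzian lines (made up)
        k += 0.5 / ((v - v0) ** 2 + 0.01)
    k_lbl.append(k)

u = 0.3                                  # absorber amount (made up)
t_lbl = sum(math.exp(-k * u) for k in k_lbl) / nv   # line-by-line reference

# k-distribution: sort k, split g in [0, 1] into a few intervals, and use
# the mean k of each interval as a quadrature point with weight 1/n_g.
k_sorted = sorted(k_lbl)
n_g = 16
chunk = nv // n_g
k_g = [sum(k_sorted[j * chunk:(j + 1) * chunk]) / chunk for j in range(n_g)]
t_ckd = sum(math.exp(-k * u) for k in k_g) / n_g
```

Sixteen g-points reproduce the 4000-point line-by-line band transmittance to well under a percent here, which is the efficiency gain the abstract's optimization of integration points is pushing further.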
Napier, B.A.; Peloquin, R.A.; Strenge, D.L.; Ramsdell, J.V.
1988-09-01
The Hanford Environmental Dosimetry Upgrade Project was undertaken to incorporate the internal dosimetry models recommended by the International Commission on Radiological Protection (ICRP) in updated versions of the environmental pathway analysis models used at Hanford. The resulting second generation of Hanford environmental dosimetry computer codes is compiled in the Hanford Environmental Dosimetry System (Generation II, or GENII). This coupled system of computer codes is intended for analysis of environmental contamination resulting from acute or chronic releases to, or initial contamination of, air, water, or soil, on through the calculation of radiation doses to individuals or populations. GENII is described in three volumes of documentation. This volume is a Code Maintenance Manual for the serious user, including code logic diagrams, global dictionary, worksheets to assist with hand calculations, and listings of the code and its associated data libraries. The first volume describes the theoretical considerations of the system. The second volume is a Users' Manual, providing code structure, users' instructions, required system configurations, and QA-related topics. 7 figs., 5 tabs.
Ralchenko, Yu.; Abdallah, J. Jr.; Colgan, J.; Fontes, C. J.; Foster, M.; Zhang, H. L.; Bar-Shalom, A.; Oreg, J.; Bauche, J.; Bauche-Arnoult, C.; Bowen, C.; Faussurier, G.; Chung, H.-K.; Hansen, S. B.; Lee, R. W.; Scott, H.; Gaufridy de Dortan, F. de; Poirier, M.; Golovkin, I.; Novikov, V.
2009-09-10
We present calculations of ionization balance and radiative power losses for tungsten in magnetic fusion plasmas. The simulations were performed within the framework of the Non-Local Thermodynamic Equilibrium (NLTE) Code Comparison Workshops utilizing several independent collisional-radiative models. The calculations generally agree with each other; however, a clear disagreement with experimental ionization distributions is found at low temperatures, below about 2 keV.
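The ionization balance computed by such collisional-radiative models reduces, in steady state, to a detailed-balance chain: the ionization flux out of each charge state equals the recombination flux back, so n[i+1]/n[i] = S[i]/alpha[i]. A minimal sketch with invented rate coefficients (not tungsten data):

```python
# Steady-state charge-state distribution along a chain of charge states.
S = [5.0e-9, 2.0e-9, 4.0e-10]        # ionization rate coefficients i -> i+1 (made up)
alpha = [1.0e-10, 5.0e-10, 8.0e-10]  # recombination coefficients i+1 -> i (made up)

ratios = [s / a for s, a in zip(S, alpha)]   # n[i+1]/n[i] for each link
pops = [1.0]
for r in ratios:
    pops.append(pops[-1] * r)
total = sum(pops)
pops = [p / total for p in pops]             # normalised charge-state distribution
mean_charge = sum(i * p for i, p in enumerate(pops))
```

Full workshop-grade models solve a much larger rate matrix with level-resolved populations, but the normalised distribution and mean charge computed this way are the quantities being compared between codes and with experiment.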
NASA Astrophysics Data System (ADS)
Chubar, Oleg
2014-09-01
Recent updates in the "Synchrotron Radiation Workshop" physical optics computer code, including the transition to the Open Source development format, the results of the on-going collaborative development efforts in the area of X-ray optics, in particular grazing incidence mirrors, gratings and crystal monochromators, and in other areas, as well as some simulation activities for storage ring and X-ray free-electron laser sources are reported. Future development plans are discussed.
Peter Cebull
2004-05-01
The Attila radiation transport code, which solves the Boltzmann neutron transport equation on three-dimensional unstructured tetrahedral meshes, was ported to a Cray SV1. Cray's performance analysis tools pointed to two subroutines that together accounted for 80%-90% of the total CPU time. Source code modifications were performed to enable vectorization of the most significant loops, to correct unfavorable strides through memory, and to replace a conjugate gradient solver subroutine with a call to the Cray Scientific Library. These optimizations resulted in a speedup of 7.79 for the INEEL's largest ATR model. Parallel scalability of the OpenMP version of the code is also discussed, and timing results are given for other non-vector platforms.
HZETRN: A heavy ion/nucleon transport code for space radiations
NASA Technical Reports Server (NTRS)
Wilson, John W.; Chun, Sang Y.; Badavi, Forooz F.; Townsend, Lawrence W.; Lamkin, Stanley L.
1991-01-01
The galactic heavy ion transport code (GCRTRN) and the nucleon transport code (BRYNTRN) are integrated into a code package (HZETRN). The code package is computer efficient and capable of operating in an engineering design environment for manned deep space mission studies. The nuclear data set used by the code is discussed including current limitations. Although the heavy ion nuclear cross sections are assumed constant, the nucleon-nuclear cross sections of BRYNTRN with full energy dependence are used. The relation of the final code to the Boltzmann equation is discussed in the context of simplifying assumptions. Error generation and propagation is discussed, and comparison is made with simplified analytic solutions to test numerical accuracy of the final results. A brief discussion of biological issues and their impact on fundamental developments in shielding technology is given.
HZETRN: A heavy ion/nucleon transport code for space radiations
NASA Astrophysics Data System (ADS)
Wilson, John W.; Chun, Sang Y.; Badavi, Forooz F.; Townsend, Lawrence W.; Lamkin, Stanley L.
1991-12-01
The galactic heavy ion transport code (GCRTRN) and the nucleon transport code (BRYNTRN) are integrated into a code package (HZETRN). The code package is computer efficient and capable of operating in an engineering design environment for manned deep space mission studies. The nuclear data set used by the code is discussed including current limitations. Although the heavy ion nuclear cross sections are assumed constant, the nucleon-nuclear cross sections of BRYNTRN with full energy dependence are used. The relation of the final code to the Boltzmann equation is discussed in the context of simplifying assumptions. Error generation and propagation is discussed, and comparison is made with simplified analytic solutions to test numerical accuracy of the final results. A brief discussion of biological issues and their impact on fundamental developments in shielding technology is given.
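The comparison "with simplified analytic solutions to test numerical accuracy" mentioned in the HZETRN abstracts can be illustrated by the straight-ahead, constant-cross-section limit, where the primary-ion fluence obeys d(phi)/dx = -sigma*phi with the analytic solution phi0*exp(-sigma*x). The sketch below marches a first-order scheme against that solution; the cross section and depth are invented, and this is not HZETRN's actual numerical method.

```python
import math

sigma = 0.12              # macroscopic removal cross section, 1/cm (illustrative)
phi0, depth, n = 1.0, 30.0, 3000
dx = depth / n

phi = phi0
for _ in range(n):
    phi -= sigma * phi * dx        # first-order marching step

analytic = phi0 * math.exp(-sigma * depth)
rel_err = abs(phi - analytic) / analytic
```

The marching error here is O(dx), so halving the step roughly halves the discrepancy against the analytic attenuation curve, which is exactly the kind of convergence check the abstracts describe.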
NASA Astrophysics Data System (ADS)
Blecic, Jasmina; Harrington, Joseph; Bowman, Matthew O.; Cubillos, Patricio E.; Stemm, Madison; Foster, Andrew
2014-11-01
We present a new, open-source, Thermochemical Equilibrium Abundances (TEA) code that calculates the abundances of gaseous molecular species. TEA uses the Gibbs-free-energy minimization method with an iterative Lagrangian optimization scheme. It initializes the radiative-transfer calculation in our Bayesian Atmospheric Radiative Transfer (BART) code. Given elemental abundances, TEA calculates molecular abundances for a particular temperature and pressure or a list of temperature-pressure pairs. The code is tested against the original method developed by White et al. (1958), the analytic method developed by Burrows and Sharp (1999), and the Newton-Raphson method implemented in the open-source Chemical Equilibrium with Applications (CEA) code. TEA is written in Python and is available to the community via the open-source development site GitHub.com. We also present BART applied to eclipse depths of the exoplanet WASP-43b, constraining atmospheric thermal and chemical parameters. This work was supported by NASA Planetary Atmospheres grant NNX12AI69G and NASA Astrophysics Data Analysis Program grant NNX13AF38G. JB holds a NASA Earth and Space Science Fellowship.
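Gibbs-free-energy minimization of the kind TEA performs can be shown in a one-dimensional stand-in: for a single reaction A2 <-> 2A at fixed temperature and unit pressure, minimize the total Gibbs energy over the reaction extent and recover the law of mass action. The standard-state potentials below are invented, and TEA itself works in many dimensions with Lagrange multipliers rather than a line search.

```python
import math

G_A2, G_A = -10.0, -4.0   # dimensionless standard chemical potentials (made up)

def gibbs(x):
    """Total Gibbs energy (in units of RT) for extent x of A2 <-> 2A:
    n_A2 = 1 - x, n_A = 2x, ideal-gas mixing terms at P = 1."""
    n_a2, n_a = 1.0 - x, 2.0 * x
    n_tot = n_a2 + n_a
    g = n_a2 * (G_A2 + math.log(n_a2 / n_tot))
    g += n_a * (G_A + math.log(n_a / n_tot))
    return g

# Ternary search; gibbs(x) is unimodal (convex) on (0, 1) for an ideal gas.
lo, hi = 1e-9, 1.0 - 1e-9
for _ in range(200):
    m1, m2 = lo + (hi - lo) / 3.0, hi - (hi - lo) / 3.0
    if gibbs(m1) < gibbs(m2):
        hi = m2
    else:
        lo = m1
x_eq = 0.5 * (lo + hi)

# At the minimum, the law of mass action holds:
# p_A^2 / p_A2 = exp(-(2*G_A - G_A2)) = exp(-2) for these potentials.
n_tot = 1.0 + x_eq
mass_action = (2.0 * x_eq / n_tot) ** 2 / ((1.0 - x_eq) / n_tot)
```

The minimizer satisfies the equilibrium condition (equal chemical potentials on both sides of the reaction), which is exactly what a Lagrangian scheme enforces simultaneously for every element-balance constraint.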
Roussin, R.W.
1993-03-01
From the very early days of its history, the Radiation Shielding Information Center (RSIC) has been involved with high energy radiation transport. The National Aeronautics and Space Administration was an early sponsor of RSIC until the completion of the Apollo Moon Exploration Program. In addition, the intranuclear cascade work of Bertini at Oak Ridge National Laboratory provided valuable resources which were made available through RSIC. Over the years, RSIC has had interactions with many of the developers of high energy radiation transport computing technology and data libraries and has been able to collect and disseminate this technology. The current status of this technology will be reviewed and prospects for new advancements will be examined.
Chatani, K.
1992-08-01
This report summarizes the calculational results from analyses of a Clinch River Breeder Reactor (CRBR) prototypic coolant pipe chaseway neutron streaming experiment. Comparisons of calculated and measured results are presented, with major emphasis placed on results at bends in the chaseway. Calculations were performed with three three-dimensional radiation transport codes: the discrete ordinates code TORT and the Monte Carlo code MORSE, both developed by the Oak Ridge National Laboratory (ORNL), and the discrete ordinates code ENSEMBLE, developed in Japan. The calculated results from the three codes are compared (1) with previously calculated DOT3.5 two-dimensional results, (2) among themselves, and (3) with measured results. Calculations with TORT used both the weighted-difference and nodal methods. Only the weighted-difference method was used in ENSEMBLE. When the calculated results were compared to measured results, calculation-to-experiment (C/E) ratios were found to be good in the regions of the chaseway where two-dimensional modeling might be difficult and where there were no significant discrete ordinates ray effects. Excellent agreement was observed for responses dominated by thermal neutron contributions. MORSE-calculated results and comparisons are also described, and detailed results are presented in an appendix.
NASA Astrophysics Data System (ADS)
Sijoy, C. D.; Chaturvedi, S.
2016-06-01
Higher-order cell-centered multi-material hydrodynamics (HD) and parallel node-centered radiation transport (RT) schemes are combined self-consistently in the three-temperature (3T) radiation hydrodynamics (RHD) code TRHD (Sijoy and Chaturvedi, 2015), developed for the simulation of intense thermal radiation or high-power laser driven RHD. For RT, a node-centered gray model implemented in the popular RHD code MULTI2D (Ramis et al., 2009) is used. This scheme can, in principle, handle RT in both optically thick and thin materials. The RT module has been parallelized using the message passing interface (MPI). Presently, for multi-material HD, we have used a simple and robust closure model in which a common strain rate is assumed for all materials in a mixed cell. The closure model has been further generalized to allow different temperatures for the electrons and ions. In addition, the electron and radiation temperatures are assumed to be in non-equilibrium, so the thermal relaxation between the electrons and ions and the coupling between the radiation and matter energies must be computed self-consistently. This is achieved using a node-centered symmetric semi-implicit (SSI) integration scheme. The electron thermal conduction is calculated using a cell-centered, monotonic, non-linear finite volume (NLFV) scheme suitable for unstructured meshes. In this paper, we describe the details of the 2D, 3T, non-equilibrium, multi-material RHD code, with special attention to the coupling of the various cell-centered and node-centered formulations, along with a suite of validation test problems that demonstrate the accuracy and performance of the algorithms. We also report the parallel performance of the RT module. Finally, to demonstrate the full capability of the implementation, we present the simulation of laser-driven shock propagation in a layered thin foil. The simulation results are found to be in good agreement.
NASA Astrophysics Data System (ADS)
Schreier, Franz; Gimeno García, Sebastián; Hedelt, Pascal; Hess, Michael; Mendrok, Jana; Vasquez, Mayte; Xu, Jian
2014-04-01
A suite of programs for high resolution infrared-microwave atmospheric radiative transfer modeling has been developed with emphasis on efficient and reliable numerical algorithms and a modular approach appropriate for simulation and/or retrieval in a variety of applications. The Generic Atmospheric Radiation Line-by-line Infrared Code - GARLIC - is suitable for arbitrary observation geometry, instrumental field-of-view, and line shape. The core of GARLIC's subroutines constitutes the basis of forward models used to implement inversion codes to retrieve atmospheric state parameters from limb and nadir sounding instruments. This paper briefly introduces the physical and mathematical basics of GARLIC and its descendants and continues with an in-depth presentation of various implementation aspects: An optimized Voigt function algorithm combined with a two-grid approach is used to accelerate the line-by-line modeling of molecular cross sections; various quadrature methods are implemented to evaluate the Schwarzschild and Beer integrals; and Jacobians, i.e. derivatives with respect to the unknowns of the atmospheric inverse problem, are implemented by means of automatic differentiation. For an assessment of GARLIC's performance, a comparison of the quadrature methods for solution of the path integral is provided. Verification and validation are demonstrated using intercomparisons with other line-by-line codes and comparisons of synthetic spectra with spectra observed on Earth and from Venus.
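The Beer integral that GARLIC evaluates with various quadrature schemes reduces, in its simplest form, to a trapezoidal optical-depth sum followed by an exponential. The path grid and absorption coefficient below are hypothetical, and only the simplest quadrature is shown:

```python
import numpy as np

def transmission(k, z):
    """Beer transmission T = exp(-tau) along a path, with the optical
    depth tau = integral of k(z) dz evaluated by trapezoidal quadrature
    (the simplest of the several quadrature options a line-by-line code
    might offer)."""
    tau = np.sum(0.5 * (k[1:] + k[:-1]) * np.diff(z))
    return np.exp(-tau)

z = np.linspace(0.0, 10.0, 2001)      # path grid [km], hypothetical
k_uniform = np.full_like(z, 0.3)      # absorption coefficient [1/km]
T = transmission(k_uniform, z)        # analytic value here: exp(-3)
```

For a uniform absorber the quadrature is exact, so the result can be checked directly against exp(-k L); for structured profiles the choice of quadrature and grid is exactly the accuracy/cost trade-off the abstract discusses.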
A fast Monte Carlo code for proton transport in radiation therapy based on MCNPX
Jabbari, Keyvan; Seuntjens, Jan
2014-01-01
An important requirement for proton therapy is software for dose calculation. Monte Carlo is the most accurate method for dose calculation, but it is very slow. In this work, a method is developed to improve the speed of dose calculation. The method is based on pre-generated tracks for particle transport; the MCNPX code was used to generate the tracks. A set of data including the particle track was produced for each material (water, air, lung tissue, bone, and soft tissue). The code can transport protons over a wide range of energies (up to 200 MeV). The validity of the fast Monte Carlo (MC) code was evaluated against MCNPX as the reference code. While the analytical pencil-beam algorithm shows large errors (up to 10%) near small high-density heterogeneities, our dose calculations and isodose distributions deviated from the MCNPX results by less than 2%. In terms of speed, the code runs 200 times faster than MCNPX: the fast MC code developed in this work calculates the dose for 10^6 particles in less than 2 minutes on an Intel Core 2 Duo 2.66 GHz desktop computer. PMID:25190994
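The core idea of such track-repeating fast MC codes can be sketched in a few lines: store per-step path lengths and energy deposits from a reference simulation, then replay them in a new medium with the geometric steps rescaled by density. The "pre-generated track" below is random stand-in data, not MCNPX output, and the rescaling rule is the simplest possible version of the technique:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "pre-generated track": per-step path lengths and energy
# deposits for one proton in the reference material (water).
n_steps = 50
E0 = 200.0                                        # MeV, initial energy
deposits = rng.dirichlet(np.ones(n_steps)) * E0   # deposits sum to E0
steps_water = rng.uniform(0.5, 2.0, n_steps)      # step lengths [mm]

def replay_track(density_ratio):
    """Replay the stored track in a new medium: geometric step lengths
    shrink with density while the per-step energy deposits are reused
    unchanged -- the central approximation of track-repeating fast MC."""
    steps = steps_water / density_ratio
    depth = np.cumsum(steps)
    return depth, deposits

depth_bone, dep_bone = replay_track(density_ratio=1.85)  # bone vs water
```

Because the deposits are reused rather than resampled, energy is conserved by construction and the expensive physics sampling is paid only once, which is where the ~200x speedup over full MCNPX transport comes from.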
Code of Practice for the Use of Ionizing Radiations in Secondary Schools.
ERIC Educational Resources Information Center
National Health and Medical Research Council, Canberra (Australia).
The appreciation of the potential hazard of ionizing radiation led to the setting up of national, and later, international commissions for the defining of standards of protection for the occupationally exposed worker in the use of ionizing radiation. However, in the last twenty years, with the large scale development of nuclear energy, the need…
NASA Astrophysics Data System (ADS)
Class, G.
1987-07-01
A program to simulate gas motion and the shine-through of thermal radiation in fusion reactor vacuum flow channels was developed. The inner surface of the flow channel is described by plane areas (triangles, parallelograms) and by surfaces of revolution. By introducing control planes in the flow path, variance reduction and a shortening of the computation are achieved through particle splitting and Russian roulette. The code is written in PL/I and verified using published data. Computer-aided input of model data is performed interactively either under IBM-TSO or on a microprocessor (IBM PC-AT). The data files are exchangeable between the IBM mainframe and IBM-PC computers. Both computers can produce plots of the elaborated channel model. For testing, the simulating computation can likewise be run interactively, whereas the production computation can be run in batch mode. The results of code verification are explained, and examples of channel models and of the interactive mode are given.
NASA Astrophysics Data System (ADS)
Tian, Zhen; Jiang Graves, Yan; Jia, Xun; Jiang, Steve B.
2014-10-01
Monte Carlo (MC) simulation is commonly considered the most accurate method for radiation dose calculations. Commissioning of a beam model in the MC code against a clinical linear accelerator beam is of crucial importance for its clinical implementation. In this paper, we propose an automatic commissioning method for our GPU-based MC dose engine, gDPM. gDPM utilizes a beam model based on the concept of a phase-space-let (PSL). A PSL contains a group of particles that are of the same type and close in space and energy. A set of generic PSLs was generated by splitting a reference phase-space file. Each PSL was associated with a weighting factor, and in dose calculations each particle carried a weight corresponding to the PSL it came from. The dose for each PSL in water was pre-computed, and hence the dose in water for a whole beam under a given set of PSL weighting factors was the weighted sum of the PSL doses. At the commissioning stage, an optimization problem was solved to adjust the PSL weights so as to minimize the difference between the calculated dose and the measured one. Symmetry and smoothness regularizations were utilized to uniquely determine the solution. An augmented Lagrangian method was employed to solve the optimization problem. To validate our method, a phase-space file of a Varian TrueBeam 6 MV beam was used to generate the PSLs for 6 MV beams. In a simulation study, we commissioned a Siemens 6 MV beam for which a set of field-dependent phase-space files was available. The dose data of this desired beam for different open fields and a small off-axis open field were obtained by calculating doses using these phase-space files. The 3D γ-index test passing rate within the regions with dose above 10% of the dmax dose for the open fields tested was improved on average from 70.56% to 99.36% for 2%/2 mm criteria and from 32.22% to 89.65% for 1%/1 mm criteria. We also tested our commissioning method on a six-field head-and-neck cancer IMRT plan. The
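The commissioning step described above — adjusting PSL weights so that a weighted sum of pre-computed doses matches measurement — can be sketched as a linear inverse problem. The sketch below uses plain smoothness-regularized least squares on synthetic data rather than gDPM's augmented Lagrangian with symmetry constraints, and the dose matrix is a random placeholder:

```python
import numpy as np

rng = np.random.default_rng(1)

# Columns of D: pre-computed dose-in-water maps, one per phase-space-let
# (PSL); d_meas: the measured dose. The commissioning sketch solves
#     min_w ||D w - d_meas||^2 + lam * ||L w||^2
# where L penalizes rough weight vectors (a stand-in for gDPM's
# smoothness regularization).
n_vox, n_psl = 200, 20
D = rng.uniform(0.0, 1.0, (n_vox, n_psl))
w_true = np.abs(np.sin(np.linspace(0.0, np.pi, n_psl)))  # smooth "truth"
d_meas = D @ w_true

L = np.diff(np.eye(n_psl), axis=0)   # first-difference smoothness operator
lam = 1e-3

# Stack the data and regularization terms into one linear least squares.
A_aug = np.vstack([D, np.sqrt(lam) * L])
b_aug = np.concatenate([d_meas, np.zeros(n_psl - 1)])
w_fit, *_ = np.linalg.lstsq(A_aug, b_aug, rcond=None)
```

On noiseless synthetic data the fitted weights reproduce the measured dose essentially exactly; in the real commissioning problem the regularization is what keeps the solution unique and physical when the data are noisy and the PSL doses overlap strongly.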
NASA Astrophysics Data System (ADS)
Stone, James M.; Norman, Michael L.
1992-06-01
A detailed description of ZEUS-2D, a numerical code for the simulation of fluid dynamical flows including a self-consistent treatment of the effects of magnetic fields and radiation transfer is presented. Attention is given to the hydrodynamic (HD) algorithms which form the foundation for the more complex MHD and radiation HD algorithms. The effect of self-gravity on the flow dynamics is accounted for by an iterative solution of the sparse-banded matrix resulting from discretizing the Poisson equation in multidimensions. The results of an extensive series of HD test problems are presented. A detailed description of the MHD algorithms in ZEUS-2D is presented. A new method of computing the electromotive force is developed using the method of characteristics (MOC). It is demonstrated through the results of an extensive series of MHD test problems that the resulting hybrid MOC-constrained transport method provides for the accurate evolution of all modes of MHD wave families.
MESTRN: A Deterministic Meson-Muon Transport Code for Space Radiation
NASA Technical Reports Server (NTRS)
Blattnig, Steve R.; Norbury, John W.; Norman, Ryan B.; Wilson, John W.; Singleterry, Robert C., Jr.; Tripathi, Ram K.
2004-01-01
A safe and efficient exploration of space requires an understanding of space radiations, so that human life and sensitive equipment can be protected. On the way to these sensitive sites, the radiation fields are modified in both quality and quantity. Many of these modifications are thought to be due to the production of pions and muons in the interactions between the radiation and intervening matter. A method used to predict the effects of the presence of these particles on the transport of radiation through materials is developed. This method was then used to develop software, which was used to calculate the fluxes of pions and muons after the transport of a cosmic ray spectrum through aluminum and water. Software descriptions are given in the appendices.
Parameterized code SHARM-3D for radiative transfer over inhomogeneous surfaces
NASA Astrophysics Data System (ADS)
Lyapustin, Alexei; Wang, Yujie
2005-12-01
The code SHARM-3D, developed for fast and accurate simulations of the monochromatic radiance at the top of the atmosphere over spatially variable surfaces with Lambertian or anisotropic reflectance, is described. The atmosphere is assumed to be laterally uniform across the image and to consist of two layers with aerosols contained in the bottom layer. The SHARM-3D code performs simultaneous calculations for all specified incidence-view geometries and multiple wavelengths in one run. The numerical efficiency of the current version of code is close to its potential limit and is achieved by means of two innovations. The first is the development of a comprehensive precomputed lookup table of the three-dimensional atmospheric optical transfer function for various atmospheric conditions. The second is the use of a linear kernel model of the land surface bidirectional reflectance factor (BRF) in our algorithm that has led to a fully parameterized solution in terms of the surface BRF parameters. The code is also able to model inland lakes and rivers. The water pixels are described with the Nakajima-Tanaka BRF model of wind-roughened water surface with a Lambertian offset, which is designed to model approximately the reflectance of suspended matter and of a shallow lake or river bottom.
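The linear kernel BRF model mentioned above is what makes the surface part of the SHARM-3D solution fully parameterized: the reflectance is linear in the kernel weights, so retrieving them from multi-angle samples is a small least-squares problem. The kernel values below are random placeholders, not the real geometric/volumetric kernels:

```python
import numpy as np

rng = np.random.default_rng(2)

# Linear kernel BRF model: BRF = f_iso + f_geo*K_geo + f_vol*K_vol.
# Kernel columns here are placeholders standing in for kernel values
# evaluated at n_geom sun/view geometries.
n_geom = 30
K = np.column_stack([np.ones(n_geom),
                     rng.uniform(-1.0, 0.0, n_geom),   # K_geo placeholder
                     rng.uniform(0.0, 1.0, n_geom)])   # K_vol placeholder

f_true = np.array([0.2, 0.05, 0.1])    # iso, geo, vol weights (assumed)
brf = K @ f_true                       # noiseless multi-angle BRF samples

# Retrieval of the kernel weights is ordinary linear least squares.
f_fit, *_ = np.linalg.lstsq(K, brf, rcond=None)
```

Because the model is linear in (f_iso, f_geo, f_vol), noiseless data recover the weights exactly, and the radiative-transfer solution can be pre-tabulated once per kernel rather than once per surface, which is the parameterization the abstract describes.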
Reactor Dosimetry Applications Using RAPTOR-M3G:. a New Parallel 3-D Radiation Transport Code
NASA Astrophysics Data System (ADS)
Longoni, Gianluca; Anderson, Stanwood L.
2009-08-01
The numerical solution of the Linearized Boltzmann Equation (LBE) via the discrete ordinates (SN) method requires extensive computational resources for large 3-D neutron and gamma transport applications, due to the concurrent discretization of the angular, spatial, and energy domains. This paper discusses the development of RAPTOR-M3G (RApid Parallel Transport Of Radiation - Multiple 3D Geometries), a new 3-D parallel radiation transport code, and its application to the calculation of ex-vessel neutron dosimetry responses in the cavity of a commercial 2-loop Pressurized Water Reactor (PWR). RAPTOR-M3G is based on domain decomposition algorithms, where the spatial and angular domains are allocated and processed on multi-processor computer architectures. As compared to traditional single-processor applications, this approach reduces the computational load as well as the memory requirement per processor, yielding an efficient solution methodology for large 3-D problems. Measured neutron dosimetry responses in the reactor cavity air gap will be compared to the RAPTOR-M3G predictions. This paper is organized as follows: Section 1 discusses the RAPTOR-M3G methodology; Section 2 describes the 2-loop PWR model and the numerical results obtained; Section 3 addresses the parallel performance of the code; and Section 4 concludes with final remarks and future work.
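The serial kernel that parallel SN codes decompose across processors is the transport sweep. As a minimal illustration (not RAPTOR-M3G's actual scheme), here is a single 1-D diamond-difference sweep for one direction mu > 0 through a pure absorber, which can be checked against analytic exponential attenuation:

```python
import numpy as np

def sweep(psi_in, sigma_t, dx, mu, n_cells):
    """One spatial sweep of 1-D discrete ordinates for a single direction
    mu > 0, diamond-difference closure, pure absorber (no source)."""
    psi = psi_in
    cell_avg = np.empty(n_cells)
    a = sigma_t * dx / (2.0 * mu)
    for i in range(n_cells):
        # Diamond difference: outgoing edge flux from the incoming one.
        psi_out = psi * (1.0 - a) / (1.0 + a)
        cell_avg[i] = 0.5 * (psi + psi_out)   # cell-average angular flux
        psi = psi_out
    return psi, cell_avg

mu, sigma_t, length, n = 1.0, 1.0, 5.0, 5000
psi_exit, _ = sweep(1.0, sigma_t, length / n, mu, n)
# Analytic attenuation for comparison: exp(-sigma_t * length / mu)
```

The full 3-D multigroup problem repeats such sweeps over every direction and energy group on every mesh cell, which is exactly the workload that domain decomposition distributes across processors.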
NASA Astrophysics Data System (ADS)
Ngo, N. H.; Lisak, D.; Tran, H.; Hartmann, J.-M.
2013-11-01
We demonstrate that a previously proposed model opens the route for the inclusion of refined non-Voigt profiles in spectroscopic databases and atmospheric radiative transfer codes. Indeed, this model fulfills many essential requirements: (i) it takes into account both velocity changes and the speed dependences of the pressure-broadening and -shifting coefficients. (ii) It leads to accurate descriptions of the line shapes of very different molecular systems. Tests made for pure H2, CO2, and O2 and for H2O diluted in N2 show that residuals are down to ≃0.2% of the peak absorption (except for the atypical system of H2, where a maximum residual of ±3% is reached), thus fulfilling the precision requirements of the most demanding remote sensing experiments. (iii) It is based on a limited set of parameters for each absorption line that have known dependences on pressure and can thus be stored in databases. (iv) Its calculation requires a very reasonable computer cost, only a few times higher than that of a usual Voigt profile; its inclusion in radiative transfer codes will thus induce bearable CPU-time increases. (v) It can be extended to take line-mixing effects into account, at least within the so-called first-order approximation.
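The baseline against which such refined profiles are compared is the Voigt profile, which is routinely computed from the complex Faddeeva function. A minimal sketch (the refined speed-dependent/velocity-changing model of the abstract extends this shape and is not reproduced here):

```python
import numpy as np
from scipy.special import wofz

def voigt(x, sigma, gamma):
    """Area-normalized Voigt profile via the Faddeeva function wofz:
    Gaussian standard deviation sigma, Lorentzian half-width gamma."""
    z = (x + 1j * gamma) / (sigma * np.sqrt(2.0))
    return wofz(z).real / (sigma * np.sqrt(2.0 * np.pi))

x = np.linspace(-5.0, 5.0, 1001)      # detuning grid, arbitrary units
profile = voigt(x, sigma=1.0, gamma=0.5)
```

In the gamma -> 0 limit the expression reduces to the pure Gaussian, which makes a convenient correctness check; the "few times higher than a Voigt" cost quoted in the abstract is measured relative to exactly this kind of evaluation.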
NASA Astrophysics Data System (ADS)
Davis, A. B.; Cahalan, R. F.
2001-05-01
The Intercomparison of 3D Radiation Codes (I3RC) is an ongoing initiative involving an international group of over 30 researchers engaged in the numerical modeling of three-dimensional radiative transfer as applied to clouds. Because of their strong variability and extreme opacity, clouds are indeed a major source of uncertainty in the Earth's local radiation budget (at GCM grid scales). Also, 3D effects (at satellite pixel scales) invalidate the standard plane-parallel assumption made routinely in cloud-property remote sensing at NASA and NOAA. Accordingly, the test cases used in I3RC are based on inputs and outputs that relate to cloud effects on atmospheric heating rates and to real-world remote sensing geometries. The main objectives of I3RC are to (1) enable participants to improve their models, (2) publish results as a community, (3) archive source code, and (4) educate. We will survey the status of I3RC and its plans for the near future, with special emphasis on the mathematical models and computational approaches. We will also describe some of the prime applications of I3RC's efforts in climate models, cloud-resolving models, and remote-sensing observations of clouds, or of the surface in their presence. In all these application areas, computational efficiency is the main concern, not accuracy. One of I3RC's main goals is to document the performance of as wide a variety as possible of three-dimensional radiative transfer models for a small but representative number of "cases." However, the initiative is dominated by modelers working at the level of linear transport theory (i.e., they solve the radiative transfer equation), and an overwhelming majority of these participants use slow-but-robust Monte Carlo techniques. This means that only a small portion of the efficiency vs. accuracy vs. flexibility domain is currently populated by I3RC participants. To balance this natural clustering the present authors have organized a systematic outreach towards
Evans, T.E.; Leonard, A.W.; West, W.P.; Finkenthal, D.F.; Fenstermacher, M.E.; Porter, G.D.
1998-08-01
Experimentally measured carbon line emissions and total radiated power distributions from the DIII-D divertor and Scrape-Off Layer (SOL) are compared to those calculated with the Monte Carlo Impurity (MCI) model. A UEDGE background plasma is used in MCI with the Roth and Garcia-Rosales (RG-R) chemical sputtering model and/or one of six physical sputtering models. While results from these simulations do not reproduce all of the features seen in the experimentally measured radiation patterns, the total radiated power calculated in MCI is in relatively good agreement with that measured by the DIII-D bolometric system when the Smith78 physical sputtering model is coupled to RG-R chemical sputtering in an unaltered UEDGE plasma. Alternatively, MCI simulations with UEDGE background ion temperatures along the divertor target plates adjusted to better match those measured in the experiment yielded three physical sputtering models which, when coupled to the RG-R model, gave a total radiated power within 10% of the measured value.
Bayesian Atmospheric Radiative Transfer (BART) Code and Application to WASP-43b
NASA Astrophysics Data System (ADS)
Blecic, Jasmina; Harrington, Joseph; Cubillos, Patricio; Bowman, Oliver; Rojo, Patricio; Stemm, Madison; Lust, Nathaniel B.; Challener, Ryan; Foster, Austin James; Foster, Andrew S.; Blumenthal, Sarah D.; Bruce, Dylan
2016-01-01
We present a new open-source Bayesian radiative-transfer framework, Bayesian Atmospheric Radiative Transfer (BART, https://github.com/exosports/BART), and its application to WASP-43b. BART initializes a model for the atmospheric retrieval calculation, generates thousands of theoretical model spectra using parametrized pressure and temperature profiles and line-by-line radiative-transfer calculations, and employs a statistical package to compare the models with the observations. It consists of three self-sufficient modules available to the community under the reproducible-research license: the Thermochemical Equilibrium Abundances module (TEA, https://github.com/dzesmin/TEA, Blecic et al. 2015), the radiative-transfer module (Transit, https://github.com/exosports/transit), and the Multi-core Markov-chain Monte Carlo statistical module (MCcubed, https://github.com/pcubillos/MCcubed, Cubillos et al. 2015). We applied BART to all available WASP-43b secondary-eclipse data from space- and ground-based observations, constraining the temperature-pressure profile and molecular abundances of the dayside atmosphere of WASP-43b. This work was supported by NASA Planetary Atmospheres grant NNX12AI69G and NASA Astrophysics Data Analysis Program grant NNX13AF38G. JB holds a NASA Earth and Space Science Fellowship.
FY05 LDRD Final Report Molecular Radiation Biodosimetry LDRD Project Tracking Code: 04-ERD-076
Jones, I M; A.Coleman, M; Lehmann, J; Manohar, C F; Marchetti, F; Mariella, R; Miles, R; Nelson, D O; Wyrobek, A J
2006-02-03
In the event of a nuclear or radiological accident or terrorist event, it is important to identify individuals who can benefit from prompt medical care and to reassure those who do not need it. Achieving these goals will maximize the ability to manage the medical consequences of radiation exposure, which unfold over a period of hours, days, weeks, or years, depending on dose. Medical interventions that reduce near-term morbidity and mortality from high but non-lethal exposures require advanced medical support and must be focused on those in need as soon as possible. There are two traditional approaches to radiation dosimetry, physical and biological. Each as currently practiced has strengths and limitations. Physical dosimetry for radiation exposure is routine for selected sites and for individual nuclear workers in certain industries, medical centers, and research institutions. No monitoring of individuals in the general population is currently performed. When physical dosimetry is available at the time of an accident/event or soon thereafter, it can provide valuable information in support of accident/event triage. Lack of data for most individuals is a major limitation, as differences in exposure can be significant due to shielding, atmospherics, etc. A smaller issue in terms of the number of people affected is that the same dose may have more or less biological effect on subsets of the population. Biological dosimetry is the estimation of exposure based on physiological or cellular alterations induced in an individual by radiation. The best-established and most precise biodosimetric methods are measurement of the decline of blood cells over time and measurement of the frequency of chromosome aberrations. In accidents or events affecting small numbers of people, it is practical to allocate the resources and time (days of clinical follow-up or specialists' laboratory time) to conduct these studies. However, if large numbers of people have been exposed, or fear they may have
NASA Astrophysics Data System (ADS)
Plante, Ianik; Devroye, Luc
2015-09-01
Several computer codes simulating chemical reactions in particle systems are based on the Green's functions of the diffusion equation (GFDE). Indeed, many types of chemical systems have been simulated using the exact GFDE, which has also become the gold standard for validating other theoretical models. In this work, a simulation algorithm is presented to sample the interparticle distance for the partially diffusion-controlled reversible ABCD reaction. This algorithm is exact for two-particle systems, is faster than conventional look-up tables, and uses only a few kilobytes of memory. The simulation results obtained with this method are compared with those obtained with the independent reaction times (IRT) method. This work is part of our effort to develop models for understanding the role of chemical reactions in the effects of radiation on cells and tissues, and may eventually be included in event-based models of space radiation risks. Moreover, as many reactions in biological systems are of this type, this algorithm might play a pivotal role in future simulation programs not only in radiation chemistry but also in the simulation of biochemical networks in time and space.
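The simplest member of the GFDE sampling family is free diffusion, where the Green's function is Gaussian and the new interparticle separation can be sampled directly. The sketch below illustrates only this trivial case; the reversible ABCD Green's function sampled in the paper is far more involved:

```python
import numpy as np

rng = np.random.default_rng(3)

def sample_distance(r0, d_rel, t, n, rng):
    """Sample n new interparticle distances after time t for a pair of
    freely diffusing particles: the free-space diffusion Green's function
    is Gaussian, so the relative displacement per axis is
    Normal(0, sqrt(2 * d_rel * t)), with d_rel the relative diffusion
    coefficient."""
    start = np.array([0.0, 0.0, r0])          # initial separation vector
    steps = rng.normal(0.0, np.sqrt(2.0 * d_rel * t), (n, 3))
    return np.linalg.norm(start + steps, axis=1)

r = sample_distance(r0=1.0, d_rel=0.5, t=1.0, n=200_000, rng=rng)
# Check: E[r^2] = r0^2 + 6 * d_rel * t for free relative diffusion.
```

Adding a partially absorbing/reflecting boundary at the encounter radius, and the bound ABCD state, is what turns this two-line sampler into the algorithm of the paper, but the moment identity above already gives a useful sanity check on any such sampler's free-diffusion limit.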
NASA Astrophysics Data System (ADS)
Gillespie, K. M.; Speirs, D. C.; Ronald, K.; McConville, S. L.; Phelps, A. D. R.; Bingham, R.; Cross, A. W.; Robertson, C. W.; Whyte, C. G.; He, W.; Vorgul, I.; Cairns, R. A.; Kellett, B. J.
2008-12-01
Auroral Kilometric Radiation (AKR) occurs naturally in the polar regions of the Earth's magnetosphere, where electrons are accelerated by electric fields into the increasing planetary magnetic dipole. Here, conservation of the magnetic moment converts axial to rotational momentum, forming a horseshoe distribution in velocity phase space. This distribution is unstable to cyclotron emission, with radiation emitted in the X-mode. In a scaled laboratory reproduction of this process, a 75-85 keV electron beam of 5-40 A was magnetically compressed by a system of solenoids, and emissions were observed at cyclotron frequencies of 4.42 GHz and 11.7 GHz, resonating with near-cutoff TE0,1 and TE0,3 modes, respectively. Here we compare these measurements with numerical predictions from the 3D PiC code KARAT. The 3D simulations accurately predicted the radiation modes and frequencies produced by the experiment. The predicted conversion efficiency between electron kinetic and wave-field energy of around 1% is close to the experimental measurements and broadly consistent with quasi-linear theoretical analysis and geophysical observations.
NUSTART: A PC code for NUclear STructure And Radiative Transition analysis and supplementation
Larsen, G.L.; Gardner, D.G.; Gardner, M.A.
1990-10-01
NUSTART is a computer program for the IBM PC/AT. It is designed for use with the nuclear reaction cross-section code STAPLUS, a STAPRE-based CRAY computer code being developed at Lawrence Livermore National Laboratory. The NUSTART code was developed to handle large sets of discrete nuclear levels and the multipole transitions among these levels; it operates in three modes. The Data File Error Analysis mode analyzes an existing STAPLUS input file containing the levels and their multipole transition branches for a number of physics and/or typographical errors. The Interactive Data File Generation mode allows the user to create input files of discrete levels and their branching fractions in the format required by STAPLUS, even though the user enters the information in the (different) format used by many people in the nuclear structure field. In the Branching Fractions Calculations mode, the discrete nuclear level set is read, and the multipole transitions among the levels are computed under one of two possible assumptions: (1) the levels have no collective character, or (2) the levels are all rotational band heads. Only E1, M1, and E2 transitions are considered, and the respective strength functions may be constants or, in the case of E1 transitions, energy dependent. The first option is used for nuclei near closed shells; the bandhead option may be used to vary the E1, M1, and E2 strengths for interband transitions. K-quantum-number selection rules may be invoked if desired. 19 refs.
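To make the branching-fraction mode concrete: with constant strength functions, each transition rate scales as S_XL * E^(2L+1) (E^3 for E1 and M1, E^5 for E2), and branching fractions are the normalized rates. The sketch below uses illustrative placeholder strengths, not STAPLUS defaults.

```python
def branching_fractions(transitions):
    """Relative gamma branching from constant strength functions.
    Each transition is (E_gamma_MeV, multipole); the rate scales as
    S_XL * E**(2L+1).  The strength values below are illustrative
    placeholders only."""
    strength = {"E1": 1e-3, "M1": 3e-4, "E2": 5e-5}
    L = {"E1": 1, "M1": 1, "E2": 2}
    rates = [strength[m] * e ** (2 * L[m] + 1) for e, m in transitions]
    total = sum(rates)
    return [r / total for r in rates]
```

Energy-dependent E1 strengths (e.g. a giant-dipole-resonance tail) would simply replace the constant `strength["E1"]` with a function of the transition energy.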
MagRad: A code to optimize the operation of superconducting magnets in a radiation environment
Yeaw, C.T.
1995-12-31
A powerful computational tool, called MagRad, has been developed which optimizes magnet design for operation in radiation fields. Specifically, MagRad has been used for the analysis and design modification of the cable-in-conduit conductors of the TF magnet systems in fusion reactor designs. Since the TF magnets must operate in a radiation environment which damages the material components of the conductor and degrades their performance, the optimization of conductor design must account not only for start-up magnet performance, but also shut-down performance. The degradation in performance consists primarily of three effects: reduced stability margin of the conductor; a transition out of the well-cooled operating regime; and an increased maximum quench temperature attained in the conductor. Full analysis of the magnet performance over the lifetime of the reactor includes: radiation damage to the conductor, stability, protection, steady state heat removal, shielding effectiveness, optimal annealing schedules, and finally costing of the magnet and reactor. Free variables include primary and secondary conductor geometric and compositional parameters, as well as fusion reactor parameters. A means of dealing with the radiation damage to the conductor, namely high temperature superconductor anneals, is proposed, examined, and demonstrated to be both technically feasible and cost effective. Additionally, two relevant reactor designs (ITER CDA and ARIES-II/IV) have been analyzed. Upon addition of pure copper strands to the cable, the ITER CDA TF magnet design was found to be marginally acceptable, although much room for both performance improvement and cost reduction exists. A cost reduction of 10-15% of the capital cost of the reactor can be achieved by adopting a suitable superconductor annealing schedule. In both of these reactor analyses, the performance predictive capability of MagRad and its associated costing techniques have been demonstrated.
PORTA: A Massively Parallel Code for 3D Non-LTE Polarized Radiative Transfer
NASA Astrophysics Data System (ADS)
Štěpán, J.
2014-10-01
The interpretation of the Stokes profiles of the solar (stellar) spectral line radiation requires solving a non-LTE radiative transfer problem that can be very complex, especially when the main interest lies in modeling the linear polarization signals produced by scattering processes and their modification by the Hanle effect. One of the main difficulties is due to the fact that the plasma of a stellar atmosphere can be highly inhomogeneous and dynamic, which implies the need to solve the non-equilibrium problem of generation and transfer of polarized radiation in realistic three-dimensional stellar atmospheric models. Here we present PORTA, a computer program we have developed for solving, in three-dimensional (3D) models of stellar atmospheres, the problem of the generation and transfer of spectral line polarization taking into account anisotropic radiation pumping and the Hanle and Zeeman effects in multilevel atoms. The numerical method of solution is based on a highly convergent iterative algorithm, whose convergence rate is insensitive to the grid size, and on an accurate short-characteristics formal solver of the Stokes-vector transfer equation which uses monotonic Bézier interpolation. In addition to the iterative method and the 3D formal solver, another important feature of PORTA is a novel parallelization strategy suitable for taking advantage of massively parallel computers. Linear scaling of the solution with the number of processors makes it possible to reduce the solution time by several orders of magnitude. We present useful benchmarks and a few illustrations of applications using a 3D model of the solar chromosphere resulting from MHD simulations. Finally, we present our conclusions with a view to future research. For more details see Štěpán & Trujillo Bueno (2013).
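As a small illustration of the monotonic Bézier idea mentioned above, the sketch below clamps the control point of a quadratic Bézier interpolant so the curve cannot overshoot the endpoint values. The clamping rule is a simplified stand-in for the full monotonicity conditions used in short-characteristics solvers.

```python
def bezier_quadratic(y0, y1, yc, t):
    """Quadratic Bezier curve between y0 (t=0) and y1 (t=1) with
    control value yc."""
    return (1 - t) ** 2 * y0 + 2 * t * (1 - t) * yc + t ** 2 * y1

def monotonic_control(y0, y1, dydx0, h):
    """Build the control point from the upwind derivative, then clamp
    it into [min(y0,y1), max(y0,y1)] so the interpolant stays monotonic,
    suppressing the overshoot that plain parabolic interpolation can
    produce near sharp source-function gradients."""
    yc = y0 + 0.5 * h * dydx0
    lo, hi = min(y0, y1), max(y0, y1)
    return min(max(yc, lo), hi)
```

Because the quadratic Bézier curve lies inside the convex hull of its three defining values, clamping the control point is sufficient to bound the interpolant.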
NASA Technical Reports Server (NTRS)
Staenz, K.; Williams, D. J.; Fedosejevs, G.; Teillet, P. M.
1995-01-01
Surface reflectance retrieval from imaging spectrometer data as acquired with the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) has become important for quantitative analysis. In order to calculate surface reflectance from remotely measured radiance, radiative transfer codes such as 5S and MODTRAN2 play an increasing role for removal of scattering and absorption effects of the atmosphere. Accurate knowledge of the exo-atmospheric solar irradiance (E0) spectrum at the spectral resolution of the sensor is important for this purpose. The present study investigates the impact of differences in the solar irradiance function, as implemented in a modified version of 5S (M5S), 6S, and MODTRAN2, and as proposed by Green and Gao, on the surface reflectance retrieved from AVIRIS data. Reflectance measured in situ is used as a basis of comparison.
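The role of E0 can be seen in the standard apparent-reflectance relation rho = pi * L * d^2 / (E0 * cos(theta_s)): any bias in the assumed solar irradiance propagates inversely into the retrieved reflectance. A minimal sketch, using band-averaged quantities and illustrative units:

```python
import math

def apparent_reflectance(L, E0, sza_deg, d=1.0):
    """Top-of-atmosphere apparent reflectance from at-sensor radiance L
    (W m-2 sr-1 um-1), band exo-atmospheric solar irradiance E0
    (W m-2 um-1), solar zenith angle (degrees), and Earth-Sun distance
    d (AU).  A 5% error in the assumed E0 maps directly into a ~5%
    error in the retrieved reflectance."""
    return math.pi * L * d ** 2 / (E0 * math.cos(math.radians(sza_deg)))

rho = apparent_reflectance(L=100.0, E0=1000.0, sza_deg=0.0)
```

Full surface-reflectance retrieval additionally removes atmospheric path radiance and transmittance terms (the job of 5S/6S/MODTRAN2); this sketch isolates only the E0 dependence discussed in the abstract.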
Mao, S.; Liu, J.; Nelson, W.R.
1992-01-01
The EGS computer code, developed for the Monte Carlo simulation of the transport of electrons and photons, has been used since 1970 in the design of accelerators and detectors for high-energy physics. In this paper we present three examples demonstrating how the current version, EGS4, is used to determine energy-loss patterns and source terms along beam pipes (including flanges, collimators, etc.). This information is useful for further shielding and dosimetry studies. The calculated results are in close agreement with the measured values. To facilitate this review, a new add-on package called SHOWGRAF is used to display shower trajectories for the three examples.
FESTR: Finite-Element Spectral Transfer of Radiation spectroscopic modeling and analysis code
Hakel, Peter
2016-06-16
Here we report on the development of a new spectral postprocessor of hydrodynamic simulations of hot, dense plasmas. Based on given time histories of one-, two-, and three-dimensional spatial distributions of materials, and their local temperature and density conditions, spectroscopically-resolved signals are computed. The effects of radiation emission and absorption by the plasma on the emergent spectra are simultaneously taken into account. This program can also be used independently of hydrodynamic calculations to analyze available experimental data with the goal of inferring plasma conditions.
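For a ray crossing piecewise-uniform zones, the emission/absorption coupling described above reduces to an exact per-zone solution of the transfer equation. A simplified gray, single-frequency sketch (FESTR itself works spectroscopically; the zone data here are illustrative):

```python
import math

def emergent_intensity(zones, I0=0.0):
    """March a ray through plasma zones, each (emissivity, opacity,
    thickness).  Within a uniform zone the transfer equation has the
    exact solution I_out = I_in*exp(-tau) + S*(1 - exp(-tau)), with
    source function S = eps/kappa and optical depth tau = kappa*dx --
    the same emission/absorption coupling a spectral postprocessor
    applies at every frequency point."""
    I = I0
    for eps, kappa, dx in zones:
        tau = kappa * dx
        S = eps / kappa
        I = I * math.exp(-tau) + S * (1.0 - math.exp(-tau))
    return I
```

In the optically thick limit the emergent intensity saturates at the source function of the last zone, while thin zones contribute approximately eps*dx; a postprocessor repeats this march for every frequency and every detector line of sight.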
Energy Science and Technology Software Center (ESTSC)
1980-02-29
Version 00 LADTAP II calculates the radiation exposure to man from potable water, aquatic foods, shoreline deposits, swimming, boating, and irrigated foods, and also the dose to biota. Doses are calculated for both the maximum individual and for the population and are summarized for each pathway by age group and organ. It also calculates the doses to certain representative biota other than man in the aquatic environment such as fish, invertebrates, algae, muskrats, raccoons, herons, and ducks using models presented in WASH-1258.
Modelling of the Global Space Radiation Field at Aircraft Altitudes by the European Code EPCARD
NASA Astrophysics Data System (ADS)
Heinrich, W.; Schraube, H.; Roesler, S.
Supported by the European Commission, the European Program Package for the Calculation of Aviation Route Doses (EPCARD) was developed. For this purpose we combined state-of-the-art models to (i) describe the cosmic radiation field with respect to solar modulation and geomagnetic shielding, (ii) describe particle interaction and production in the atmosphere, and (iii) determine the appropriate dose quantities. Spectral fluence rates of different particles (n, p, π, γ, e, μ) produced in the atmosphere by interactions of primary cosmic rays have been determined by Monte Carlo calculations for different periods of solar modulation, geomagnetic shielding conditions, and depths in the atmosphere. These data are used as the basis of EPCARD. For any chosen flight route and profile, operational and effective doses can be determined in full agreement with the ICRU/ICRP definitions, and the readings of airborne instruments can also be determined. The model predictions generally agree with experimental data within ±30%, and often significantly better. Differences are caused by model uncertainties and also by uncertainties in the fundamental understanding of the response characteristics of the experimental devices employed. Several examples of comparison between model predictions and experimental data are given. Finally, we discuss the capabilities of model predictions for estimating radiation doses due to solar particle events. Large uncertainties arise from the extremely complicated behaviour of the incident solar particles: their non-isotropy, asymptotic arrival directions, time-dependent spectral fluxes, and geomagnetic disturbances, which are known to exist but are not known in detail.
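Conceptually, the route-dose calculation reduces to integrating a tabulated dose rate over the flight profile. The sketch below assumes a user-supplied dose-rate function standing in for EPCARD's precomputed Monte Carlo fluence tables; the toy model and all numbers are illustrative only.

```python
def route_dose(profile, dose_rate):
    """Integrate effective dose along a flight profile.  profile is a
    list of (duration_h, altitude_km, cutoff_rigidity_GV) segments;
    dose_rate(alt, rc) returns uSv/h.  In EPCARD this function is backed
    by Monte Carlo fluence tables indexed also by solar modulation."""
    return sum(dt * dose_rate(alt, rc) for dt, alt, rc in profile)

# toy dose-rate model: rises with altitude, falls with cutoff rigidity
rate = lambda alt, rc: 0.5 * alt / (1.0 + 0.2 * rc)

# climb segment at low geomagnetic latitude, cruise at high latitude
dose = route_dose([(2.0, 10.0, 2.0), (5.0, 11.0, 5.0)], rate)
```

A real implementation would interpolate the tabulated fluence spectra in altitude, cutoff rigidity, and solar-modulation parameter before folding them with fluence-to-dose conversion coefficients.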
Karlykhanov, N. G.; Glazyrin, I. V.; Lykov, V. A.; Politov, V. Yu.; Sofronov, A. A.; Timakova, M. S.
1997-04-15
The 1D ERA code simulates the kinetics of ionization and non-equilibrium radiative transfer in lines and continua. The code has been extended to account for the ionization and excitation of ions in the field of laser radiation; to describe these processes, the wave equation is solved. Results are presented from calculations of the X-ray yield from an Al plate irradiated by a picosecond laser pulse at intensities of 10^16-10^17 W/cm^2.
NASA Astrophysics Data System (ADS)
Yamada, Takayoshi; Kasai, Yasuko; Yoshida, Naohiro
2016-07-01
The Submillimeter Wave Instrument (SWI) is one of the scientific instruments on the JUpiter ICy moons Explorer (JUICE). We plan to observe atmospheric compositions, including water vapor and its isotopomers, in the Galilean moons (Io, Europa, Ganymede, and Callisto). The frequency windows of SWI are 530 to 625 GHz and 1080 to 1275 GHz with 100 kHz spectral resolution. We are developing a line-by-line radiative transfer code in Japan for the Ganymede atmosphere in the THz region (0-3 THz). Molecular line parameters (line intensity and partition function) were taken from the JPL (Jet Propulsion Laboratory) catalogue. A pencil beam was assumed to calculate spectra of H2O and CO rotational transitions in the THz region. We performed comparisons between our model and ARTS (Atmospheric Radiative Transfer Simulator). The differences were less than 10% and 5% for H2O and CO, respectively, under the condition of local thermodynamic equilibrium (LTE). Comparisons with several models under a non-LTE assumption will be presented.
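A minimal line-by-line calculation of the kind described can be sketched as follows: sum line-shape contributions to the absorption coefficient at each frequency, then apply Beer-Lambert attenuation along the pencil beam. A Lorentzian shape and made-up line parameters are assumed here; a real model would use JPL catalogue parameters and appropriate (e.g. Doppler or Voigt) profiles.

```python
import math

def transmittance(freqs_GHz, lines, column):
    """Line-by-line pencil-beam transmittance.  Each line is
    (center_GHz, intensity, halfwidth_GHz); a Lorentzian shape is
    assumed for simplicity.  column is the absorber column amount; the
    product column * k(f) is the optical depth at frequency f."""
    out = []
    for f in freqs_GHz:
        k = sum(S * (g / math.pi) / ((f - f0) ** 2 + g ** 2)
                for f0, S, g in lines)
        out.append(math.exp(-column * k))
    return out

# toy single line near 557 GHz (the strong H2O transition sits there,
# but the intensity and width below are placeholders)
fs = [550.0, 557.0, 560.0]
t = transmittance(fs, [(557.0, 1.0, 0.5)], 1.0)
```

Brightness-temperature spectra then follow by integrating the emission (source function weighted by the local absorption) along the same path, exactly as in the zone-marching formal solution.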
Coupling external radiation transport code results to the GADRAS detector response function.
Mitchell, Dean J; Thoreson, Gregory G.; Horne, Steven M.
2014-01-01
Simulating gamma spectra is useful for analyzing special nuclear materials. Gamma spectra are influenced not only by the source and the detector, but also by the external, and potentially complex, scattering environment, which can make accurate representations of gamma spectra difficult to obtain. By coupling the Monte Carlo N-Particle (MCNP) code with the Gamma Detector Response and Analysis Software (GADRAS) detector response function, gamma-spectrum simulations can be computed with a high degree of fidelity even in the presence of a complex scattering environment. Traditionally, GADRAS represents the external scattering environment with empirically derived scattering parameters; modeling that environment in MCNP instead, and using the results as input to the GADRAS detector response function, yields high-fidelity spectra. This method was verified with experimental data obtained in an environment with a significant amount of scattering material. The experiment used both gamma-emitting sources and moderated and bare neutron-emitting sources. The sources were modeled using GADRAS and MCNP in the presence of the external scattering environment, producing accurate representations of the experimental data.
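The coupling amounts to folding the externally computed incident flux (e.g. from an MCNP tally) with a detector response matrix to obtain the pulse-height spectrum. A schematic sketch; the matrix values are toy numbers, whereas GADRAS derives the actual response from detector physics:

```python
def fold_response(flux, response):
    """Fold an incident flux spectrum with a detector response matrix:
    counts[i] = sum_j R[i][j] * flux[j], mapping incident-energy bins j
    to pulse-height channels i."""
    return [sum(R_ij * f for R_ij, f in zip(row, flux)) for row in response]

# 3 incident energy bins -> 2 pulse-height channels (toy numbers):
# diagonal-ish entries are full-energy response, off-diagonal entries
# stand in for Compton continuum / partial-energy deposition
R = [[0.6, 0.1, 0.0],
     [0.1, 0.5, 0.2]]
counts = fold_response([100.0, 50.0, 25.0], R)
```

The key point of the paper is where `flux` comes from: computed by MCNP in the full scattering geometry rather than approximated by empirical scattering parameters.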
NASA Astrophysics Data System (ADS)
Poirier, M.; de Gaufridy de Dortan, F.
2009-12-01
The behavior of non-local thermal equilibrium (NLTE) plasmas plays a central role in many fields of modern-day physics, such as laser-produced plasmas, astrophysics, inertial or magnetic confinement fusion devices, and X-ray sources. In steady-state cases the proper description of these plasmas may require the solution of thousands of linear rate equations. A possible simplification for this numerical task lies in some form of statistical averaging, such as the averaging over configurations or superconfigurations. However, to assess the validity of such an averaging procedure and to handle cases where isolated lines play an important role, it will be necessary to treat detailed level systems. This involves matrices with potentially billions of elements, which are rather sparse but still involve thousands of diagonals above and below the main one. We propose here a numerical algorithm based on the LU decomposition for such linear systems. It will be shown that this method is orders of magnitude faster than traditional Gauss elimination. Moreover, it exhibits none of the convergence or accuracy issues encountered with methods based on conjugate gradients or minimization. Among cases treated at the last NLTE-kinetics-code meeting, krypton and tungsten plasmas are considered. Furthermore, to assess the validity of configuration averaging, several criteria are discussed. While a criterion based on detailed balance is relevant in cases not too far from LTE, it is found to be insufficient in general. An alternate criterion based on the inspection of the influence of an arbitrary configuration temperature is proposed and tested successfully.
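One advantage of LU over one-shot Gauss elimination is that the factorization of the rate matrix can be reused for many right-hand sides. A dense, no-pivoting Doolittle sketch for a small, diagonally dominant system is shown below; the production solver is sparse and exploits the banded structure described above.

```python
def lu_solve(A, b):
    """Solve A x = b by Doolittle LU decomposition without pivoting.
    Rate matrices made diagonally dominant (e.g. after replacing one
    balance equation with particle-number conservation) need no
    pivoting, and L, U can be reused for repeated solves."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    U = [[0.0] * n for _ in range(n)]
    for i in range(n):
        L[i][i] = 1.0
        for j in range(i, n):
            U[i][j] = A[i][j] - sum(L[i][k] * U[k][j] for k in range(i))
        for j in range(i + 1, n):
            L[j][i] = (A[j][i] - sum(L[j][k] * U[k][i] for k in range(i))) / U[i][i]
    # forward substitution (L y = b), then back substitution (U x = y)
    y = [0.0] * n
    for i in range(n):
        y[i] = b[i] - sum(L[i][k] * y[k] for k in range(i))
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (y[i] - sum(U[i][k] * x[k] for k in range(i + 1, n))) / U[i][i]
    return x
```

For the banded systems the paper describes, the inner sums would be restricted to the bandwidth, turning the O(n^3) dense factorization into something far cheaper.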
NASA Astrophysics Data System (ADS)
Kuroda, Takami; Takiwaki, Tomoya; Kotake, Kei
2016-02-01
We present a new multi-dimensional radiation-hydrodynamics code for massive stellar core-collapse in full general relativity (GR). Employing an M1 analytical closure scheme, we solve spectral neutrino transport of the radiation energy and momentum based on a truncated moment formalism. Regarding neutrino opacities, we take into account the baseline set of state-of-the-art simulations, in which inelastic neutrino-electron scattering, thermal neutrino production via pair annihilation, and nucleon-nucleon bremsstrahlung are included. While the Einstein field equations and the spatial advection terms in the radiation-hydrodynamics equations are evolved explicitly, the source terms due to neutrino-matter interactions and energy shift in the radiation moment equations are integrated implicitly by an iteration method. To verify our code, we first perform a series of standard radiation tests with analytical solutions, including checks of gravitational redshift and Doppler shift. Good agreement in these tests supports the reliability of the GR multi-energy neutrino transport scheme. We then conduct several test simulations of core-collapse, bounce, and shock stall of a 15 M⊙ star in Cartesian coordinates and make a detailed comparison with published results. Our code reproduces the results of full Boltzmann neutrino transport quite well, especially before bounce. In the postbounce phase our code also performs well overall; however, there are several differences that most likely stem from insufficient spatial resolution in our current 3D-GR models. To clarify the resolution dependence and extend the code comparison into the late postbounce phase, we argue that next-generation exaflop-class supercomputers will be needed.
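The M1 closure mentioned above expresses the Eddington factor as a function of the flux factor f = |F|/(cE), interpolating between the diffusion and free-streaming limits. The sketch below uses Levermore's widely used analytic closure as a representative choice; the paper's specific closure may differ in detail.

```python
import math

def eddington_factor(E, F, c=1.0):
    """M1 closure: Eddington factor chi as a function of the flux
    factor f = |F| / (c E).  Levermore's closure gives chi = 1/3 in the
    diffusion limit (f -> 0) and chi = 1 in free streaming (f -> 1)."""
    f = min(abs(F) / (c * E), 1.0)
    return (3.0 + 4.0 * f * f) / (5.0 + 2.0 * math.sqrt(4.0 - 3.0 * f * f))
```

In the moment equations, chi determines the radiation pressure tensor from E and F, closing the truncated hierarchy without solving the full Boltzmann equation.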
Faden, R R; Lederer, S E; Moreno, J D
1996-11-27
The Advisory Committee on Human Radiation Experiments (ACHRE), established to review allegations of abuses of human subjects in federally sponsored radiation research, was charged with identifying appropriate standards to evaluate the ethics of cold war radiation experiments. One central question for ACHRE was to determine what role, if any, the Nuremberg Code played in the norms and practices of US medical researchers. Based on the evidence from ACHRE's Ethics Oral History Project and extensive archival research, we conclude that the Code, at the time it was promulgated, had little effect on mainstream medical researchers engaged in human subjects research. Although some clinical investigators raised questions about the conduct of research involving human beings, the medical profession did not pursue this issue until the 1960s. PMID:8922454
Emery, L.
1995-07-01
The interface program shower for the EGS4 Monte Carlo electromagnetic cascade shower simulation code system was written to facilitate the definition of complicated target and shielding geometries and to simplify the handling of input and output data. The geometry is defined by a series of namelist commands in an input file. The input and output beam data files follow the SDDS (self-describing data set) protocol, which makes the files compatible with other physics codes that follow the same protocol; for instance, the results of a cascade-shower simulation can serve as input data for an accelerator tracking code. The shower code has also been used to calculate the bremsstrahlung component of radiation doses for possible beam-loss scenarios at the Advanced Photon Source (APS) at Argonne National Laboratory.
NASA Astrophysics Data System (ADS)
Stone, James M.; Norman, Michael L.
1992-06-01
In this, the second of a series of three papers, we continue a detailed description of ZEUS-2D, a numerical code for the simulation of fluid dynamical flows in astrophysics including a self-consistent treatment of the effects of magnetic fields and radiation transfer. In this paper, we give a detailed description of the magnetohydrodynamical (MHD) algorithms in ZEUS-2D. The recently developed constrained transport (CT) algorithm is implemented for the numerical evolution of the components of the magnetic field for MHD simulations. This formalism guarantees the numerically evolved field components will satisfy the divergence-free constraint at all times. We find, however, that the method used to compute the electromotive forces must be chosen carefully to propagate accurately all modes of MHD wave families (in particular shear Alfvén waves). A new method of computing the electromotive force is developed using the method of characteristics (MOC). It is demonstrated through the results of an extensive series of MHD test problems that the resulting hybrid MOC-CT method provides for the accurate evolution of all modes of MHD wave families.
Shestakov, Aleksei I.; Offner, Stella S. R.
2008-01-10
We present a scheme to solve the nonlinear multigroup radiation diffusion (MGD) equations. The method is incorporated into a massively parallel, multidimensional, Eulerian radiation-hydrodynamic code with Adaptive Mesh Refinement (AMR). The patch-based AMR algorithm refines in both space and time creating a hierarchy of levels, coarsest to finest. The physics modules are time-advanced using operator splitting. On each level, separate 'level-solve' packages advance the modules. Our multigroup level-solve adapts an implicit procedure which leads to a two-step iterative scheme that alternates between elliptic solves for each group with intra-cell group coupling. For robustness, we introduce pseudo transient continuation (ψtc). We analyze the magnitude of the ψtc parameter to ensure positivity of the resulting linear system, diagonal dominance and convergence of the two-step scheme. For AMR, a level defines a subdomain for refinement. For diffusive processes such as MGD, the refined level uses Dirichlet boundary data at the coarse-fine interface and the data is derived from the coarse level solution. After advancing on the fine level, an additional procedure, the sync-solve (SS), is required in order to enforce conservation. The MGD SS reduces to an elliptic solve on a combined grid for a system of G equations, where G is the number of groups. We adapt the 'partial temperature' scheme for the SS; hence, we reuse the infrastructure developed for scalar equations. Results are presented. We consider a multigroup test problem with a known analytic solution. We demonstrate utility of ψtc by running with increasingly larger timesteps. Lastly, we simulate the sudden release of energy Y inside an Al sphere (r = 15 cm) suspended in air at STP. For Y = 11 kT, we find that gray radiation diffusion and MGD produce similar results. However, if Y = 1 MT, the two packages yield different results. Our large Y simulation contradicts a long-standing theory and demonstrates
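The effect of the ψtc parameter can be illustrated on a scalar model problem: adding 1/Δτ to the Jacobian keeps each linearized step well conditioned far from the solution, while growing Δτ recovers Newton's fast local convergence. A hedged sketch, using a scalar stand-in for the elliptic group solves; parameter values are illustrative:

```python
def psi_tc(f, dfdu, u, dtau=0.1, grow=1.5, tol=1e-10, max_iter=200):
    """Pseudo transient continuation for a scalar model problem.  Each
    step solves (1/dtau + f'(u)) du = -f(u): a small dtau keeps the
    'matrix' positive and diagonally dominant where plain Newton would
    take wild steps, and growing dtau (switched evolution relaxation)
    recovers Newton's quadratic convergence near the solution."""
    for _ in range(max_iter):
        r = f(u)
        if abs(r) < tol:
            return u
        u -= r / (1.0 / dtau + dfdu(u))
        dtau *= grow
    return u

# model problem: find the root of u^3 - 8 starting far from u = 2
root = psi_tc(lambda u: u ** 3 - 8.0, lambda u: 3.0 * u * u, 0.1)
```

In the multigroup setting the same 1/Δτ term is added to the diagonal of each group's elliptic operator, which is what the positivity and diagonal-dominance analysis in the abstract refers to.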
Ellingson, R.G.; Wiscombe, W.J.; Murcray, D.; Smith, W.; Strauch, R.
1992-12-31
The InterComparison of Radiation Codes used in Climate Models (ICRCCM) found large differences among fluxes predicted by sophisticated radiation models, differences that could not be resolved because no set of accurate atmospheric spectral radiation data had been measured simultaneously with the important radiative properties of the atmosphere. To remedy this situation, our team of scientists proposed a comprehensive program of measurement and analysis called SPECTRE (Spectral Radiance Experiment). The data collected during SPECTRE form the test bed for the second phase of ICRCCM, namely verification and calibration of the radiation codes used in climate models. This should lead to more accurate radiation models for use in parameterizing climate models, which in turn play a key role in the prediction of trace-gas greenhouse effects. This report summarizes the activities of our group during the project's third year toward meeting our stated objectives. The report is divided into three sections: SPECTRE Activities, ICRCCM Activities, and Summary Information. The section on SPECTRE activities summarizes the field portion of the project during 1991 and the data reduction/analysis performed by the various participants. The section on ICRCCM activities summarizes our initial attempts to select data for distribution to ICRCCM participants and to compare observations with calculations, as will be done by the ICRCCM participants. The Summary Information section lists publications, presentations, graduate students supported, and postdoctoral appointments during the project.
Kotchenova, Svetlana Y; Vermote, Eric F; Matarrese, Raffaella; Klemm, Frank J
2006-09-10
A vector version of the 6S (Second Simulation of a Satellite Signal in the Solar Spectrum) radiative transfer code (6SV1), which accounts for radiation polarization, has been developed and validated against a Monte Carlo code, Coulson's tabulated values, and MOBY (Marine Optical Buoy System) water-leaving reflectance measurements. The developed code was also tested against the scalar codes SHARM, DISORT, and MODTRAN to evaluate its performance in scalar mode and the influence of polarization. The results show good agreement: within 0.7% of the Monte Carlo code, within 0.2% of Coulson's tabulated values, and within 0.001-0.002 of the MOBY reflectances over the 400-550 nm region. Ignoring the effects of polarization led to large errors in calculated top-of-atmosphere reflectances: more than 10% for a molecular atmosphere and up to 5% for an aerosol atmosphere. This new version of 6S is intended to replace the previous scalar version used for calculation of lookup tables in the MODIS (Moderate Resolution Imaging Spectroradiometer) atmospheric correction algorithm. PMID:16926910
NASA Astrophysics Data System (ADS)
Thelen, Jean-Claude; Havemann, Stephan; Wong, Gerald
2015-10-01
The Havemann-Taylor Fast Radiative Transfer Code (HT-FRTC) is a core component of the Met Office NEON Tactical Decision Aid (TDA). Within NEON, the HT-FRTC has for a number of years been used to predict the infrared apparent thermal contrasts between different surface types as observed by an airborne sensor. To achieve this, the HT-FRTC is supplied with the inherent temperatures and spectral properties of these surfaces (i.e. ground target(s) and backgrounds). A key strength of the HT-FRTC is its ability to take into account the detailed properties of the atmosphere, which in the context of NEON tend to be provided by a Numerical Weather Prediction (NWP) forecast model. While water vapour and ozone are generally the most important gases, additional trace gases are now being incorporated into the HT-FRTC. The HT-FRTC also includes an exact treatment of atmospheric scattering based on spherical harmonics. This allows for the treatment of several different aerosol species and of liquid and ice clouds. Recent developments can even account for rain and falling snow. The HT-FRTC works in Principal Component (PC) space and is trained on a wide variety of atmospheric and surface conditions, which significantly reduces the computational requirements regarding memory and processing time. One clear-sky simulation takes approximately one millisecond at the time of writing. Recent developments allow the training of HT-FRTC to be both completely generalised and sensor independent. This is significant as the user of the code can add new sensors and new surfaces/targets by supplying extra files which contain their (possibly classified) spectral properties. The HT-FRTC has been extended to cover the spectral range of Photopic and NVG sensors. One aim here is to give guidance on the expected, directionally resolved sky brightness, especially at night, again taking the actual or forecast atmospheric conditions into account. Recent developments include light level predictions during
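Working in PC space means the fast model predicts a handful of principal-component scores rather than thousands of spectral points. The sketch below shows only the projection/reconstruction step, assuming an orthonormal basis (as PCA eigenvectors are); the basis and spectra are toy placeholders, not HT-FRTC training data.

```python
def project(spectrum, basis, mean):
    """Compress a high-resolution spectrum to a few principal-component
    scores by projecting the mean-centered spectrum onto orthonormal
    basis rows; the fast model then works only with these scores."""
    centered = [s - m for s, m in zip(spectrum, mean)]
    return [sum(b * c for b, c in zip(row, centered)) for row in basis]

def reconstruct(scores, basis, mean):
    """Invert the projection: mean plus the score-weighted basis rows."""
    out = list(mean)
    for w, row in zip(scores, basis):
        for i in range(len(out)):
            out[i] += w * row[i]
    return out
```

Because the basis is fixed at training time, adding a new sensor or surface only requires projecting its spectral properties into the same PC space, which is how sensor-independent training becomes possible.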
NASA Technical Reports Server (NTRS)
Reddell, Brandon
2015-01-01
Designing hardware to operate in the space radiation environment is a very difficult and costly activity. Ground-based particle accelerators can be used to test for exposure to the radiation environment, one species at a time; however, the actual space environment cannot be duplicated because of the range of energies and the isotropic nature of space radiation. The FLUKA Monte Carlo code is an integrated physics package based at CERN that has been under development for the last 40+ years and includes the most up-to-date fundamental physics theory and particle physics data. This work presents an overview of FLUKA and how it has been used in conjunction with ground-based radiation testing for NASA to improve our understanding of secondary particle environments resulting from the interaction of space radiation with matter.
NASA Astrophysics Data System (ADS)
Sijoy, C. D.; Chaturvedi, S.
2015-05-01
A three-temperature (3T), unstructured-mesh, non-equilibrium radiation hydrodynamics (RHD) code has been developed for the simulation of intense thermal radiation or high-power laser driven radiative shock hydrodynamics in two-dimensional (2D) axisymmetric geometries. The governing hydrodynamics equations are solved using a compatible unstructured Lagrangian method based on a control volume differencing (CVD) scheme. A second-order predictor-corrector (PC) integration scheme is used for the temporal discretization of the hydrodynamics equations. For the radiation energy transport, a frequency-averaged gray model is used in which the flux-limited diffusion (FLD) approximation recovers the free-streaming limit of radiation propagation in optically thin regions. The proposed RHD model allows the electrons and ions to have different temperatures. In addition, the electron and thermal radiation temperatures are assumed to be out of equilibrium, so the thermal relaxation between electrons and ions and the coupling between the radiation and matter energies must be computed self-consistently. For this, the coupled flux-limited electron heat conduction and non-equilibrium radiation diffusion equations are solved simultaneously using an implicit, axisymmetric, cell-centered, monotonic, nonlinear finite volume (NLFV) scheme. In this paper, we describe the details of the 2D, 3T, non-equilibrium RHD code along with a suite of validation test problems that demonstrate the accuracy and performance of the algorithms. We also present a performance analysis of the different linearity-preserving interpolation schemes used for the evaluation of nodal values in the NLFV scheme. Finally, to demonstrate the full capability of the code implementation, we present the simulation of laser-driven thin aluminum (Al) foil acceleration. The simulation results are found to be in good agreement
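The flux-limited diffusion idea can be illustrated compactly. The abstract does not name the limiter used, so the Levermore-Pomraning form below is an assumption for illustration only: the limiter interpolates the diffusion coefficient between the optically thick (classical diffusion) and optically thin (free-streaming) regimes.

```python
import numpy as np

def fld_coefficient(E, grad_E, kappa_R):
    """Flux-limited diffusion coefficient, Levermore-Pomraning limiter (assumed).

    E       : radiation energy density
    grad_E  : magnitude of its spatial gradient
    kappa_R : Rosseland-mean opacity (inverse mean free path), 1/cm
    """
    c = 2.998e10                                # speed of light, cm/s
    R = grad_E / (kappa_R * E)                  # dimensionless gradient measure
    # lambda(R) = (coth R - 1/R)/R:
    #   R -> 0   gives lambda -> 1/3  (optically thick, classical diffusion)
    #   R -> inf gives lambda -> 1/R  (flux limited to c*E, free streaming)
    lam = (1.0 / np.tanh(R) - 1.0 / R) / R if R > 0 else 1.0 / 3.0
    return c * lam / kappa_R                    # D such that F = -D * grad(E)
```

The two limits are exactly the behaviors the abstract describes: diffusion in opaque regions and a flux that never exceeds c·E in transparent ones.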
Lubin, D.; Cutchin, D.; Conant, W.; Grassl, H.; Schmid, U.; Biselli, W.
1995-02-01
Longwave emission by the tropical western Pacific atmosphere has been measured at the ocean surface by a Fourier Transform Infrared (FTIR) spectroradiometer deployed aboard the research vessel John Vickers as part of the Central Equatorial Pacific Experiment. The instrument operated throughout a Pacific Ocean crossing, beginning on 7 March 1993 in Honiara, Solomon Islands, and ending on 29 March 1993 in Los Angeles, and recorded longwave emission spectra under atmospheres associated with sea surface temperatures ranging from 291.0 to 302.8 K. Precipitable water vapor abundances ranged from 1.9 to 5.5 column centimeters. Measured emission spectra (downwelling zenith radiance) covered the middle infrared (5-20 {mu}m) at one inverse centimeter spectral resolution. FTIR measurements made under an entirely clear field of view are compared with spectra generated by LOWTRAN 7 and MODTRAN 2, as well as downwelling flux calculated by the NCAR Community Climate Model (CCM-2) radiation code, using radiosonde profiles as input data for these calculations. In the spectral interval 800-1000 cm{sup -1}, these comparisons show a discrepancy between FTIR data and MODTRAN 2 with an overall variability of 6-7 mW m{sup -2} sr{sup -1} cm and a concave shape that may be related to the representation of water vapor continuum emission in MODTRAN 2. Another discrepancy appears in the spectral interval 1200-1300 cm{sup -1}, where MODTRAN 2 appears to overestimate zenith radiance by 5 mW m{sup -2} sr{sup -1} cm. These discrepancies appear consistently; however, they become only slightly larger at the highest water vapor abundances. Because these radiance discrepancies correspond to broadband (500-2000 cm{sup -1}) flux uncertainties of around 3 W m{sup -2}, there appear to be no serious inadequacies in the performance of MODTRAN 2 or LOWTRAN 7 at high atmospheric temperatures and water vapor abundances. 23 refs., 10 figs.
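To put the reported discrepancies in scale, zenith radiance in these units (mW m⁻² sr⁻¹ per cm⁻¹) can be computed directly from Planck's law; a blackbody near 300 K gives roughly 115-120 such units around 900 cm⁻¹, so a 6-7 unit discrepancy is on the order of 5% of the window radiance. A minimal sketch (CODATA constants; the function name is illustrative):

```python
import math

def planck_radiance(wavenumber_cm, T_kelvin):
    """Planck spectral radiance in mW m^-2 sr^-1 (cm^-1)^-1."""
    h, c, k = 6.62607e-34, 2.99792458e8, 1.380649e-23   # SI
    nu = wavenumber_cm * 100.0                          # cm^-1 -> m^-1
    # B in W m^-2 sr^-1 per m^-1; expm1 keeps small exponents accurate
    B = 2.0 * h * c**2 * nu**3 / math.expm1(h * c * nu / (k * T_kelvin))
    return B * 100.0 * 1000.0                           # per cm^-1, then mW
```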
NASA Astrophysics Data System (ADS)
Kellerman, Adam; Shprits, Yuri; Podladchikova, Tatiana; Kondrashov, Dmitri
2016-04-01
The Versatile Electron Radiation Belt (VERB) code 2.0 models the dynamics of radiation-belt electron phase space density (PSD) in Earth's magnetosphere. Recently, a data-assimilative version of this code has been developed, which utilizes a split-operator Kalman-filtering approach to solve for electron PSD in terms of adiabatic invariants. A new dataset based on the TS07d magnetic field model is presented, which may be utilized for analysis of past geomagnetic storms, and for initial and boundary conditions in running simulations. Further, a data-assimilative forecast model is introduced, which has the capability to forecast electron PSD several days into the future, given a forecast Kp index. The model assimilates output from an empirical model capable of forecasting conditions at geosynchronous orbit. The model currently runs in real time, and a forecast is available online at http://rbm.epss.ucla.edu.
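The Kalman-filtering analysis step at the heart of such data assimilation can be sketched for a single phase-space-density value (a scalar toy version; the VERB implementation operates on the full grid of adiabatic invariants, and the function name here is illustrative):

```python
def kalman_update(x_model, P_model, y_obs, R_obs):
    """One scalar Kalman analysis step.

    x_model : model-forecast PSD at a grid point
    P_model : forecast error variance
    y_obs   : observation mapped to the same invariant coordinates
    R_obs   : observation error variance
    """
    K = P_model / (P_model + R_obs)              # Kalman gain
    x_analysis = x_model + K * (y_obs - x_model) # blend forecast and data
    P_analysis = (1.0 - K) * P_model             # analysis variance shrinks
    return x_analysis, P_analysis
```

With equal forecast and observation variances, the analysis is simply the average of the two, which is the intuition behind weighting data against the model.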
Parks, C.V.; Broadhead, B.L.; Hermann, O.W.; Tang, J.S.; Cramer, S.N.; Gauthey, J.C.; Kirk, B.L.; Roussin, R.W.
1988-07-01
This report provides a preliminary assessment of the computational tools and existing methods used to obtain radiation dose rates from shielded spent nuclear fuel and high-level radioactive waste (HLW). Particular emphasis is placed on analysis tools and techniques applicable to facilities/equipment designed for the transport or storage of spent nuclear fuel or HLW. Applications to cask transport, storage, and facility handling are considered. The report reviews the analytic techniques for generating appropriate radiation sources, evaluating the radiation transport through the shield, and calculating the dose at a desired point or surface exterior to the shield. Discrete ordinates, Monte Carlo, and point kernel methods for evaluating radiation transport are reviewed, along with existing codes and data that utilize these methods. A literature survey was employed to select a cadre of codes and data libraries to be reviewed. The selection process was based on specific criteria presented in the report. Separate summaries were written for several codes (or family of codes) that provided information on the method of solution, limitations and advantages, availability, data access, ease of use, and known accuracy. For each data library, the summary covers the source of the data, applicability of these data, and known verification efforts. Finally, the report discusses the overall status of spent fuel shielding analysis techniques and attempts to illustrate areas where inaccuracy and/or uncertainty exist. The report notes the advantages and limitations of several analysis procedures and illustrates the importance of using adequate cross-section data sets. Additional work is recommended to enable final selection/validation of analysis tools that will best meet the US Department of Energy's requirements for use in developing a viable HLW management system. 188 refs., 16 figs., 27 tabs.
Ellingson, R.G.; Wiscombe, W.J.; Murcray, D.; Smith, W.; Strauch, R.
1993-12-31
Following the finding by the InterComparison of Radiation Codes used in Climate Models (ICRCCM) of large differences among fluxes predicted by sophisticated radiation models that could not be sorted out because of the lack of a set of accurate atmospheric spectral radiation data measured simultaneously with the important radiative properties of the atmosphere, the team of scientists proposed to remedy the situation by carrying out a comprehensive program of measurement and analysis called SPECTRE (Spectral Radiance Experiment). SPECTRE was to establish an absolute standard against which to compare models, and aimed to remove the hidden variables (unknown humidities, aerosols, etc.) which radiation modelers had invoked to excuse disagreements with observation. The data collected during SPECTRE were to form the test bed for the second phase of ICRCCM, namely verification and calibration of radiation codes used in climate models. This should lead to more accurate radiation models for use in parameterizing climate models, which in turn play a key role in the prediction of trace-gas greenhouse effects. This report summarizes the activities during the project's third year to meet stated objectives. The report is divided into three sections entitled: (1) SPECTRE Activities, (2) ICRCCM Activities, and (3) Summary Information. The section on SPECTRE activities summarizes the field portion of the project during 1991, and the data reduction/analysis performed by the various participants. The section on ICRCCM activities summarizes their initial attempts to select data for distribution to ICRCCM participants and at comparison of observations with calculations as will be done by the ICRCCM participants. The Summary Information section lists data concerning publications, presentations, graduate students supported, and post-doctoral appointments during the project.
Ellingson, R.G.; Baer, F.
1992-01-01
Research by the US Department of Energy (DOE) has shown that cloud radiative feedback is the single most important effect determining the magnitude of possible climatic responses to human activity. However, these effects are still not known at the levels needed for climate prediction. Consequently, DOE has launched a major initiative, the Atmospheric Radiation Measurement (ARM) Program, directed at improving the parameterization of the physics governing cloud and radiative processes in general circulation models (GCMs). One specific goal of ARM is to improve the treatment of radiative transfer in GCMs under clear-sky, general overcast and broken cloud conditions. Our approach to developing the radiation model will be to test existing models in an iterative, predictive fashion. We will supply the Clouds and Radiative Testbed (CART) with a set of models to be compared with operationally observed data. The differences we find will lead to the development of new models to be tested with new data. Similarly, our GCM studies will use existing GCMs to study the radiation sensitivity problem. We anticipate that the outcome of this approach will provide both a better longwave radiative forcing algorithm and a better understanding of how longwave radiative forcing influences the equilibrium climate of the atmosphere.
Harrison, L.; Michalsky, J.
1991-03-13
Three separate tasks are included in the first year of the project. Two involve assembling data sets useful for testing radiation models in global climate modeling (GCM) codes, and the third is concerned with the development of advanced instrumentation for performing accurate spectral radiation measurements. Task 1: Three existing data sets have been merged for two locations, one in the wet northeastern US and a second in the dry western US. The data sets are meteorological data from the WBAN network, upper air data from the NCDC, and high quality solar radiation measurements from Albany, New York and Golden, Colorado. These represent test data sets for those modelers developing radiation codes for the GCM models. Task 2: Existing data are not quite adequate from a modeler's perspective without downwelling infrared data and surface albedo, or reflectance, data. Before the deployment of the first CART site in ARM, the authors are establishing this more complete set of radiation measurements at the Albany site, to operate until CART is operational. The authors will have the site running by April 1991, which will provide about one year's data from this location. They will coordinate their measurements with satellite overpasses and, to the extent possible, with radiosonde releases, in order that the data set be coincident in time. Task 3: Work has concentrated on the multiple filter instrument. The mechanical, optical, and software engineering for this instrument is complete, and the first field prototype is running at the Rattlesnake Mountain Observatory (RMO) test site. This instrument is performing well, and is already delivering reliable and useful information.
Giantsoudi, D; Schuemann, J; Dowdell, S; Paganetti, H; Jia, X; Jiang, S
2014-06-15
Purpose: For proton radiation therapy, Monte Carlo simulation (MCS) methods are recognized as the gold-standard dose calculation approach. Although previously impractical due to limitations in available computing power, GPU-based applications now allow MCS of proton treatment fields to be performed in routine clinical use, on time scales comparable to those of conventional pencil-beam algorithms. This study focuses on validating the results of our GPU-based code (gPMC) against a fully implemented proton therapy MCS code (TOPAS) for clinical patient cases. Methods: Two treatment sites were selected to provide clinical cases for this study: head-and-neck cases, whose anatomical and geometrical complexity (air cavities and density heterogeneities) makes dose calculation very challenging, and prostate cases, due to the higher proton energies used and the close proximity of the treatment target to sensitive organs at risk. Both gPMC and TOPAS were used to calculate 3-dimensional dose distributions for all patients in this study. Comparisons were performed based on target coverage indices (mean dose, V90 and D90) and gamma index distributions with criteria of 2% of the prescription dose and 2 mm. Results: For seven out of eight studied cases, mean target dose, V90 and D90 differed by less than 2% between TOPAS and gPMC dose distributions. Gamma index analysis for all prostate patients resulted in a passing rate of more than 99% of voxels in the target. Four out of five head-and-neck cases showed a target gamma index passing rate of more than 99%, the fifth having a passing rate of 93%. Conclusion: Our current work showed excellent agreement between our GPU-based MCS code and a fully implemented proton therapy MC code for a group of dosimetrically challenging patient cases.
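The gamma-index metric used for these comparisons can be sketched in 1D (a simplified, global-dose-tolerance version of the standard Low et al. formulation; the clinical analysis above is 3D, and the function name is illustrative):

```python
import numpy as np

def gamma_index_1d(dose_eval, dose_ref, x, dose_tol, dist_tol):
    """1D gamma index of an evaluated dose profile against a reference.

    dose_tol : absolute dose tolerance (e.g. 2% of the prescription dose)
    dist_tol : distance-to-agreement, same units as x (e.g. 2 mm)
    A point passes when gamma <= 1.
    """
    gamma = np.empty_like(np.asarray(dose_ref, dtype=float))
    for i, (xi, di) in enumerate(zip(x, dose_ref)):
        # Combined dose/distance metric, minimized over all evaluated points
        g2 = ((x - xi) / dist_tol) ** 2 + ((dose_eval - di) / dose_tol) ** 2
        gamma[i] = np.sqrt(g2.min())
    return gamma
```

A small dose error within tolerance at the correct position yields gamma below 1, while a large dose error can still pass if a matching dose occurs within the distance-to-agreement.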
Robert G. Ellingson
2004-09-28
One specific goal of the Atmospheric Radiation Measurement (ARM) program is to improve the treatment of radiative transfer in General Circulation Models (GCMs) under clear-sky, general overcast and broken cloud conditions. Our project was geared to contribute to this goal by attacking major problems associated with one of the dominant components of the problem: longwave radiation. The primary long-term project objectives were to: (1) develop an optimum longwave radiation model for use in GCMs that has been calibrated with state-of-the-art observations for clear and cloudy conditions, and (2) determine how longwave radiative forcing with an improved algorithm contributes in a GCM relative to shortwave radiative forcing, sensible heating, thermal advection and convection. The approach has been to build upon existing models in an iterative, predictive fashion. We focused on comparing calculations from a set of models with operationally observed data for clear, overcast and broken cloud conditions. The differences found through the comparisons, together with physical insights, have been used to develop new models, most of which have been tested with new data. Our initial GCM studies used existing GCMs to study the climate model-radiation sensitivity problem. Although this portion of our initial plans was curtailed midway through the project, we anticipate that the eventual outcome of this approach will provide a better longwave radiative forcing algorithm and, through a better understanding of how longwave radiative forcing influences the model equilibrium climate, improvements in climate prediction using this algorithm.
NASA Technical Reports Server (NTRS)
Egan, Michael P.; Leung, Chun Ming; Spagna, George F., Jr.
1988-01-01
The program solves the radiation transport problem in a dusty medium with one-dimensional planar, spherical or cylindrical geometry. It determines self-consistently the effects of multiple scattering, absorption, and re-emission of photons on the temperature of dust grains and the characteristics of the internal radiation field. The program can treat radiation field anisotropy, linear anisotropic scattering, and multi-grain components. The program output consists of the dust-temperature distribution, flux spectrum, surface brightness at each frequency and the observed intensities (involving a convolution with a telescope beam pattern).
Radiation Oncology Treatment Team
Carver, D; Kost, S; Pickens, D; Price, R; Stabin, M
2014-06-15
Purpose: To assess the utility of optically stimulated luminescent (OSL) dosimeter technology in calibrating and validating a Monte Carlo radiation transport code for computed tomography (CT). Methods: Exposure data were taken using both a standard CT 100-mm pencil ionization chamber and a series of 150-mm OSL CT dosimeters. Measurements were made at system isocenter in air as well as in standard 16-cm (head) and 32-cm (body) CTDI phantoms at isocenter and at the 12 o'clock positions. Scans were performed on a Philips Brilliance 64 CT scanner for 100 and 120 kVp at 300 mAs with a nominal beam width of 40 mm. A radiation transport code to simulate the CT scanner conditions was developed using the GEANT4 physics toolkit. The imaging geometry and associated parameters were simulated for each ionization chamber and phantom combination. Simulated absorbed doses were compared to both CTDI{sub 100} values determined from the ion chamber and to CTDI{sub 100} values reported from the OSLs. The dose profiles from each simulation were also compared to the physical OSL dose profiles. Results: CTDI{sub 100} values reported by the ion chamber and OSLs are generally in good agreement (average percent difference of 9%), and provide a suitable way to calibrate doses obtained from simulation to real absorbed doses. Simulated and real CTDI{sub 100} values agree to within 10% or less, and the simulated dose profiles also predict the physical profiles reported by the OSLs. Conclusion: Ionization chambers are generally considered the standard for absolute dose measurements. However, OSL dosimeters may also serve as a useful tool with the significant benefit of also assessing the radiation dose profile. This may offer an advantage to those developing simulations for assessing radiation dosimetry such as verification of spatial dose distribution and beam width.
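The CTDI{sub 100} values compared above follow the standard definition: the axial dose profile integrated over ±50 mm about isocenter, divided by the nominal collimated beam width N × T (40 mm in this study). A minimal sketch (function name illustrative; trapezoidal integration written out for clarity):

```python
import numpy as np

def ctdi100(z_mm, dose_profile, beam_width_mm):
    """CTDI100 from a measured or simulated axial dose profile D(z).

    z_mm         : positions along the scanner axis, mm
    dose_profile : dose at each position (e.g. mGy)
    beam_width_mm: nominal collimation N*T (40 mm in this study)
    """
    mask = np.abs(z_mm) <= 50.0                 # the 100-mm integration window
    z, d = z_mm[mask], dose_profile[mask]
    integral = float(np.sum(0.5 * (d[1:] + d[:-1]) * np.diff(z)))  # trapezoid
    return integral / beam_width_mm
```

This is why a 150-mm OSL strip is attractive: it captures the full profile D(z), not just the integral that a 100-mm pencil chamber reports.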
Li, Yong Gang; Yang, Yang; Short, Michael P.; Ding, Ze Jun; Zeng, Zhi; Li, Ju
2015-01-01
SRIM-like codes have limitations in describing general 3D geometries when modeling radiation displacements and damage in nanostructured materials. A universal, computationally efficient and massively parallel 3D Monte Carlo code, IM3D, has been developed with excellent parallel scaling performance. IM3D is based on fast indexing of scattering integrals and the SRIM stopping power database, and allows the user a choice of Constructive Solid Geometry (CSG) or Finite Element Triangle Mesh (FETM) method for constructing 3D shapes and microstructures. For 2D films and multilayers, IM3D perfectly reproduces SRIM results, and can be ∼10² times faster in serial execution and >10⁴ times faster using parallel computation. For 3D problems, it provides a fast approach for analyzing the spatial distributions of primary displacements and defect generation under ion irradiation. Herein we also provide a detailed discussion of our open-source collision cascade physics engine, revealing the true meaning and limitations of the “Quick Kinchin-Pease” and “Full Cascades” options. The issues of femtosecond to picosecond timescales in defining displacement versus damage, and the limitations of the displacements per atom (DPA) unit in quantifying radiation damage (such as its inadequacy in quantifying the degree of chemical mixing), are discussed. PMID:26658477
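The "Quick Kinchin-Pease" option discussed above conventionally corresponds to the NRT (Norgett-Robinson-Torrens) estimate of Frenkel pairs from the damage energy. A minimal sketch of that estimate (the threshold displacement energy E_d is material-dependent, e.g. roughly 40 eV for iron):

```python
def nrt_displacements(T_dam_eV, E_d_eV):
    """NRT (modified Kinchin-Pease) estimate of displaced atoms.

    T_dam_eV : damage energy available to elastic collisions, eV
    E_d_eV   : threshold displacement energy of the material, eV
    """
    if T_dam_eV < E_d_eV:
        return 0                        # below threshold: no stable defect
    if T_dam_eV < 2.0 * E_d_eV / 0.8:
        return 1                        # single Frenkel pair regime
    return 0.8 * T_dam_eV / (2.0 * E_d_eV)  # linear cascade regime
```

Summing this quantity over all recoils and dividing by the number of atoms in the irradiated volume yields the DPA figure whose limitations the abstract discusses.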
NASA Astrophysics Data System (ADS)
Dattoli, G.; Migliorati, M.; Schiavi, A.
2007-05-01
Coherent synchrotron radiation (CSR) is one of the main problems limiting the performance of high-intensity electron accelerators. The complexity of the physical mechanisms underlying the onset of CSR-driven instabilities demands accurate descriptions, capable of including the large number of features of an actual accelerating device. A code devoted to the analysis of these problems should be fast and reliable, conditions that are rarely achieved at the same time. In the past, codes based on Lie algebraic techniques have been very efficient for treating transport problems in accelerators. The extension of these methods to the non-linear case is ideally suited to CSR instability problems. We report on the development of a numerical code, based on the solution of the Vlasov equation, with the inclusion of non-linear contributions due to wake-field effects. The proposed solution method exploits an algebraic technique based on exponential operators. We show that the integration procedure is capable of reproducing the onset of instability and the effects associated with the bunching mechanisms that lead to the growth of the instability itself. In addition, considerations on the instability threshold are also developed.
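The exponential-operator idea can be illustrated with a single Lie-split step for macro-particle coordinates (a generic sketch, not the actual code; `kick` here is a stand-in for the wake-field contribution): the evolution operator exp(dt·(A+B)) is approximated by applying exp(dt·A) and exp(dt·B) in sequence, each of which is exactly solvable.

```python
def split_step(z, p, dt, kick):
    """One Lie-split step: exact drift, then the non-linear kick.

    z, p : longitudinal position and momentum deviation of a macro-particle
    kick : callable giving the momentum kick per unit time as a function of z
    """
    z = z + dt * p          # drift: exact action of the free-streaming operator
    p = p + dt * kick(z)    # kick: non-linear (e.g. wake-field) contribution
    return z, p
```

The splitting error per step is O(dt²), and the scheme preserves the phase-space structure far better than a naive explicit integrator of the combined system.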
Morgan C. White
2000-07-01
The fundamental motivation for the research presented in this dissertation was the need to develop a more accurate prediction method for the characterization of mixed radiation fields around medical electron accelerators (MEAs). Specifically, a model is developed for simulation of neutron and other particle production from photonuclear reactions and incorporated in the Monte Carlo N-Particle (MCNP) radiation transport code. This extension of the capability within the MCNP code provides for the more accurate assessment of the mixed radiation fields. The Nuclear Theory and Applications group of the Los Alamos National Laboratory has recently provided first-of-a-kind evaluated photonuclear data for a select group of isotopes. These data provide the reaction probabilities as functions of incident photon energy with angular and energy distribution information for all reaction products. The availability of these data is the cornerstone of the new methodology for state-of-the-art mutually coupled photon-neutron transport simulations. The dissertation includes details of the model development and implementation necessary to use the new photonuclear data within MCNP simulations. A new data format has been developed to include tabular photonuclear data. Data are processed from the Evaluated Nuclear Data Format (ENDF) to the new class ''u'' A Compact ENDF (ACE) format using a standalone processing code. MCNP modifications have been completed to enable Monte Carlo sampling of photonuclear reactions. Note that both neutron and gamma production are included in the present model. The new capability has been subjected to extensive verification and validation (V&V) testing. Verification testing has established the expected basic functionality. Two validation projects were undertaken. First, comparisons were made to benchmark data from literature. These calculations demonstrate the accuracy of the new data and transport routines to better than 25 percent. Second, the ability to
Lee, K. T.
2007-02-12
The long-term human exploration goals that NASA has embraced require an understanding of primary radiation and secondary particle production under a variety of environmental conditions. In order to perform accurate transport simulations for the incident particles found in the space environment, accurate nucleus-nucleus inelastic event generators are needed, and NASA is funding their development. For the first time, NASA is including the radiation problem in the design of the next manned exploration vehicle. The NASA-funded FLUER-S (FLUKA Executing Under ROOT-Space) project has several goals beyond the improvement of the internal nuclear physics simulations, including making FLUKA more user-friendly. Several tools have been developed to simplify the use of FLUKA without compromising its accuracy or versatility. Among these tools are a general source input, the ability to use distributed computing, simplified geometry input, geometry and event visualization, and standard FLUKA scoring output analysis using a ROOT GUI. In addition to describing these tools, we will show how they have been used for space radiation environment data analysis in MARIE, IVCPDS, and EVCPDS. Similar analyses can be performed for future radiation measurement detectors before they are deployed, in order to optimize their design. These tools can also be used in the design of nuclear-based power systems on manned exploration vehicles and planetary surfaces. In addition to these space applications, the simulations are being used to support accelerator-based experiments such as the cross-section measurements being performed at HIMAC and at NSRL at BNL.
Hayes, J C; Norman, M
1999-10-28
This report details an investigation into the efficacy of two approaches to solving the radiation diffusion equation within a radiation hydrodynamic simulation. Because leading-edge scientific computing platforms have evolved from large single-node vector processors to parallel aggregates containing tens to thousands of individual CPU's, the ability of an algorithm to maintain high compute efficiency when distributed over a large array of nodes is critically important. The viability of an algorithm thus hinges upon the tripartite question of numerical accuracy, total time to solution, and parallel efficiency.
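The parallel-efficiency leg of that tripartite question is conventionally quantified as speedup per processor; a minimal sketch (illustrative function name):

```python
def parallel_metrics(t_serial, t_parallel, n_procs):
    """Speedup and parallel efficiency for a distributed solve.

    t_serial   : wall time of the best serial run
    t_parallel : wall time on n_procs processors
    Returns (speedup, efficiency); efficiency of 1.0 is ideal scaling.
    """
    speedup = t_serial / t_parallel
    return speedup, speedup / n_procs
```

An algorithm that halves its per-step work but scales at 30% efficiency can easily lose to a costlier method that scales near-ideally, which is exactly the trade-off the report investigates.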
Validation of a GPU-based Monte Carlo code (gPMC) for proton radiation therapy: clinical cases study
NASA Astrophysics Data System (ADS)
Giantsoudi, Drosoula; Schuemann, Jan; Jia, Xun; Dowdell, Stephen; Jiang, Steve; Paganetti, Harald
2015-03-01
Monte Carlo (MC) methods are recognized as the gold standard for dose calculation; however, they have not yet replaced analytical methods due to their lengthy calculation times. GPU-based applications allow MC dose calculations to be performed on time scales comparable to conventional analytical algorithms. This study focuses on validating our GPU-based MC code for proton dose calculation (gPMC) against an experimentally validated multi-purpose MC code (TOPAS) and comparing their performance for clinical patient cases. Clinical cases from five treatment sites were selected, covering the full range from very homogeneous patient geometries (liver) to patients with high geometrical complexity (air cavities and density heterogeneities in head-and-neck and lung patients) and from short beam range (breast) to large beam range (prostate). Both gPMC and TOPAS were used to calculate 3D dose distributions for all patients. Comparisons were performed based on target coverage indices (mean dose, V95, D98, D50, D02) and gamma index distributions. Dosimetric indices differed by less than 2% between TOPAS and gPMC dose distributions for most cases. Gamma index analysis with a 1%/1 mm criterion resulted in a passing rate of more than 94% of all patient voxels receiving more than 10% of the mean target dose, for all patients except the prostate cases. Although clinically insignificant, gPMC resulted in a systematic underestimation of target dose for prostate cases by 1-2% compared to TOPAS. Correspondingly, the gamma index analysis with the 1%/1 mm criterion failed for most beams for this site, while for the 2%/1 mm criterion passing rates of more than 94.6% of all patient voxels were observed. For the same initial number of simulated particles, calculation time for a single beam for a typical head and neck patient plan decreased from 4 CPU hours per million particles (2.8-2.9 GHz Intel X5600) for TOPAS to 2.4 s per million particles (NVIDIA TESLA C2075) for gPMC. Excellent agreement was
Ellingson, R.G.; Baer, F.
1993-12-31
This report summarizes the activities of our group to meet our stated objectives. The report is divided into sections entitled: Radiation Model Testing Activities, General Circulation Model Testing Activities, Science Team Activities, and Publications, Presentations and Meetings. The section on Science Team Activities summarizes our participation with the science team to further advance the observation and modeling programs. Appendix A lists graduate students supported and post-doctoral appointments during the project. Reports on the activities during each of the first two years are included as Appendix B. Significant progress has been made in: determining the ability of line-by-line radiation models to calculate the downward longwave flux at the surface; determining the uncertainties in calculating the downwelling radiance and flux at the surface associated with the use of different proposed profiling techniques; intercomparing clear-sky radiance and flux observations with calculations from radiation codes of different climate models; determining the uncertainties associated with estimating N* from surface longwave flux observations; and determining the sensitivity of model calculations to different formulations of the effects of finite-sized clouds.
Compressible Astrophysics Simulation Code
Energy Science and Technology Software Center (ESTSC)
2007-07-18
This is an astrophysics simulation code involving a radiation diffusion module developed at LLNL coupled to compressible hydrodynamics and adaptive mesh infrastructure developed at LBNL. One intended application is to neutrino diffusion in core collapse supernovae.
Monte Carlo Simulation of a 6 MV X-Ray Beam for Open and Wedge Radiation Fields, Using GATE Code
Bahreyni-Toosi, Mohammad-Taghi; Nasseri, Shahrokh; Momennezhad, Mahdi; Hasanabadi, Fatemeh; Gholamhosseinian, Hamid
2014-01-01
The aim of this study is to provide a control software system based on Monte Carlo simulation and calculation of the dosimetric parameters of standard and wedge radiation fields. GATE version 6.1 (OpenGATE Collaboration) was used to simulate a compact 6 MV linear accelerator system. In order to accelerate the calculations, the phase-space technique and cluster computing (Condor version 7.2.4, Condor Team, University of Wisconsin–Madison) were used. Dosimetric parameters used in treatment planning systems for the standard and wedge radiation fields (10 cm × 10 cm to 30 cm × 30 cm and a 60° wedge), including percentage depth dose and dose profiles, were determined by both computational and experimental methods. The gamma index with a 3%/3 mm criterion was applied to compare calculated and measured results. Almost all calculated data points satisfied this criterion. Based on the good agreement between calculated and measured results obtained for the various radiation fields in this study, GATE may be used as a useful tool for quality control or pretreatment verification procedures in radiotherapy. PMID:25426430
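The gamma index used here combines a dose-difference criterion with a distance-to-agreement criterion; a point passes when the minimum combined displacement is at most 1. A simplified 1D global-gamma sketch with assumed function names and a toy depth-dose curve (a full implementation would search interpolated 3D dose grids):

```python
import numpy as np

def gamma_1d(x_ref, d_ref, x_eval, d_eval, dta_mm=3.0, dd_frac=0.03):
    """Simplified 1D global gamma index.

    x_ref/d_ref: reference (measured) positions [mm] and doses.
    x_eval/d_eval: evaluated (calculated) positions [mm] and doses.
    The dose criterion is a fraction of the reference maximum (global gamma).
    Returns the gamma value at each reference point.
    """
    dd = dd_frac * d_ref.max()
    gammas = np.empty_like(d_ref)
    for i, (x, d) in enumerate(zip(x_ref, d_ref)):
        dist2 = ((x_eval - x) / dta_mm) ** 2   # distance term, scaled by DTA
        dose2 = ((d_eval - d) / dd) ** 2       # dose term, scaled by dose tolerance
        gammas[i] = np.sqrt(np.min(dist2 + dose2))
    return gammas

# Identical profiles should pass everywhere (gamma = 0)
x = np.linspace(0.0, 100.0, 201)
pdd = np.exp(-x / 120.0)  # toy percentage-depth-dose curve
g = gamma_1d(x, pdd, x, pdd)
pass_rate = np.mean(g <= 1.0)
print(pass_rate)  # 1.0
```

The reported pass rate is simply the fraction of evaluated points with gamma at or below 1 under the stated criterion (3%/3 mm here, 1%/1 mm or 2%/1 mm in the gPMC study above).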
NASA Astrophysics Data System (ADS)
Valdivia, Valeska; Hennebelle, Patrick
2014-11-01
Context. Ultraviolet radiation plays a crucial role in molecular clouds. Radiation and matter are tightly coupled, and their interplay influences the physical and chemical properties of the gas. In particular, modeling the radiation propagation requires calculating column densities, which can be numerically expensive in high-resolution multidimensional simulations. Aims: Developing fast methods for estimating column densities is mandatory if we are interested in the dynamical influence of the radiative transfer. In particular, we focus on the effect of UV screening on the dynamics and on the statistical properties of molecular clouds. Methods: We have developed a tree-based method for fast estimates of column densities, implemented in the adaptive mesh refinement code RAMSES. We performed numerical simulations using this method in order to analyze the influence of the screening on clump formation. Results: We find that the accuracy of the tree-based method for the extinction is better than 10%, while the relative error for the column density can be much larger. We describe the implementation of a method based on precalculating the geometrical terms, which noticeably reduces the calculation time. To study the influence of the screening on the statistical properties of molecular clouds, we present the probability distribution function of the gas, the associated temperature per density bin, and the mass spectra for different density thresholds. Conclusions: The tree-based method is fast and accurate enough to be used during numerical simulations, since no communication is needed between CPUs when using a fully threaded tree. It is therefore suitable for parallel computing. We show that the screening of far-UV radiation mainly affects the dense gas, thereby favoring low temperatures and affecting the fragmentation. We show that when we include the screening, more structures are formed with higher densities, in comparison to the case that does not include this effect. We
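The quantity being estimated along each ray is the column density N = sum of n_i * dl_i over the cells crossed, from which a visual extinction follows via a gas-to-extinction conversion. A minimal single-ray sketch, with an assumed standard Galactic conversion factor, not the tree-based RAMSES implementation itself:

```python
import numpy as np

def column_density(n, dl):
    """Column density N = sum(n_i * dl_i) along a ray through grid cells.

    n: number densities [cm^-3] of the cells the ray crosses.
    dl: path length [cm] through each cell.
    """
    return float(np.sum(np.asarray(n) * np.asarray(dl)))

def visual_extinction(N_H, N_per_Av=1.87e21):
    """A_V from the hydrogen column, using an assumed Galactic
    conversion N_H / A_V of about 1.87e21 cm^-2 mag^-1."""
    return N_H / N_per_Av

# Toy ray: 1 pc of uniform gas at n = 100 cm^-3
PC_CM = 3.086e18  # centimeters per parsec
N = column_density([100.0], [PC_CM])
Av = visual_extinction(N)
print(f"N_H = {N:.2e} cm^-2, A_V = {Av:.3f}")
```

The tree method trades the exact cell-by-cell sum for an approximate directional sum over tree nodes, which is why the extinction (which saturates exponentially) is recovered more accurately than the raw column density.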
NASA Astrophysics Data System (ADS)
Espy, P. J.; Daae, M.; Shprits, Y.
2010-12-01
The correlation between the inner edge of the outer radiation belt phase space density (PSD) and the plasmapause location (Lpp) is investigated using reanalysis. A large data set is used for the statistical analysis, comprising data from 1990-1991 from the CRRES satellite, GEO 1989, GPS-ns18, and Akebono. These data are incorporated into the reanalysis by means of a Kalman filter with the UCLA 1-D VERB code. The result is a continuous radial and temporal distribution of the PSD from L*=3 to L*=7. The innovation vector of the reconstructed PSD provides information about regions where local loss or source processes dominate. We analyze both the PSD and the innovation vector by binning them into slots of Dst and Kp values. This is done by finding the times when the Dst (Kp) falls within each bin of width 20 nT (1), from 10 nT to -130 nT (1 to 8). The PSD and innovation vector were then averaged over each of those times. The result shows a good correlation between the location of the inner edge of the outer radiation belt in the PSD and the location of the plasmapause, consistent with previous observations. The boundary between the inner edge of the radiation belt and the Lpp becomes sharper, and the radiation belt becomes thinner, during times of high geomagnetic activity. The innovation vector shows that the inner edge of the source region also lines up well with the Lpp, further indicating a competition between loss and source processes during active times. This study also illustrates how data assimilation in the radiation belts can be used to understand the underlying processes of acceleration and loss in the inner magnetosphere.
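The binning procedure described above, averaging a reconstructed quantity over all times when Dst falls in a given 20 nT slot, can be sketched as follows (hypothetical function and variable names; the actual study bins a full L*-time reanalysis rather than a single series):

```python
import numpy as np

def bin_average(dst, values, edges):
    """Average a time series (e.g. PSD at fixed L*) in bins of Dst.

    dst: Dst value [nT] at each time step.
    values: quantity to average, same length as dst.
    edges: bin edges in nT (any order; sorted internally).
    Returns the per-bin mean, NaN for empty bins.
    """
    dst, values = np.asarray(dst), np.asarray(values)
    edges = np.sort(np.asarray(edges))       # np.digitize expects ascending edges
    idx = np.digitize(dst, edges)            # bin index for every time step
    means = np.full(len(edges) + 1, np.nan)
    for b in range(len(edges) + 1):
        sel = idx == b
        if sel.any():
            means[b] = values[sel].mean()
    return means

# 20 nT bins from +10 nT down to -130 nT, as in the study
edges = np.arange(-130, 11, 20)
dst = np.array([-5.0, -25.0, -120.0])        # toy Dst series
psd = np.array([1.0, 2.0, 10.0])             # toy PSD series
m = bin_average(dst, psd, edges)
```

Repeating the same average for the innovation vector in each bin gives the activity-dependent loss/source picture described above.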
NASA Technical Reports Server (NTRS)
Soden, B.; Tjemkes, S.; Schmetz, J.; Saunders, R.; Bates, J.; Ellingson, B.; Engelen, R.; Garand, L.; Jackson, D.; Jedlovec, G.
1999-01-01
An intercomparison of radiation codes used in retrieving upper tropospheric humidity (UTH) from observations in the ν2 (6.3 micron) water vapor absorption band was performed. This intercomparison is one part of a coordinated effort within the GEWEX Water Vapor Project (GVaP) to assess our ability to monitor the distribution and variations of upper tropospheric moisture from space-borne sensors. A total of 23 different codes, ranging from detailed line-by-line (LBL) models, to coarser-resolution narrow-band (NB) models, to highly parameterized single-band (SB) models, participated in the study. Forward calculations were performed using a carefully selected set of temperature and moisture profiles chosen to be representative of a wide range of atmospheric conditions. The LBL model calculations exhibited the greatest consistency with each other, typically agreeing to within 0.5 K in terms of the equivalent blackbody brightness temperature (T(sub b)). The majority of NB and SB models agreed to within +/- 1 K of the LBL models, although a few older models exhibited systematic T(sub b) biases in excess of 2 K. A discussion of the discrepancies between the various models, their association with differences in model physics (e.g., continuum absorption), and their implications for UTH retrieval and radiance assimilation is presented.
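The equivalent blackbody brightness temperature T(sub b) used for these comparisons is obtained by inverting Planck's law at the band wavelength. A self-contained monochromatic sketch at 6.3 microns (real instrument comparisons convolve the radiance over the channel spectral response):

```python
import numpy as np

H = 6.62607015e-34   # Planck constant [J s]
C = 2.99792458e8     # speed of light [m/s]
KB = 1.380649e-23    # Boltzmann constant [J/K]

def planck(wl, T):
    """Spectral radiance B_lambda [W m^-2 sr^-1 m^-1] at wavelength wl [m]."""
    return 2 * H * C**2 / wl**5 / (np.exp(H * C / (wl * KB * T)) - 1)

def brightness_temperature(wl, radiance):
    """Equivalent blackbody brightness temperature: Planck's law inverted
    for T at a single wavelength."""
    return H * C / (wl * KB) / np.log(1 + 2 * H * C**2 / (wl**5 * radiance))

wl = 6.3e-6                    # centre of the nu2 water-vapour band [m]
I = planck(wl, 250.0)          # synthetic radiance from a 250 K blackbody
Tb = brightness_temperature(wl, I)
print(round(Tb, 6))  # 250.0
```

A model-to-model T(sub b) difference of 0.5 K, as quoted for the LBL codes, thus corresponds to a small fractional difference in simulated band radiance.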